Sunday, 15 June 2014

machine learning - What is a suitable neural network architecture for predicting the popularity of articles?


I am a newbie in machine learning and in neural networks. I am currently taking a course on neural networks on coursera.org, but I do not understand everything yet. I have a problem with my thesis: I am supposed to use a neural network, but I do not know how to choose the right network architecture for my problem.

I have a lot of data from web portals (mostly online versions of newspapers and magazines). For each article I have information such as its title, its text, and its release date. I also have a large amount of sequential data that captures user behaviour.

My goal is to estimate the popularity of an article (the number of readers, or unique user clicks on the article). I want to create a vector from this data and feed my neural network with these vectors.

I have two questions:

1. How do I create the correct vector?

2. Which neural network architecture is most suitable for this problem?

These are very broad questions:

1. How should the right vector look?

For text data, you usually use a bag-of-words representation with TF-IDF weighting: each article becomes a vector with one dimension per term, weighted by how often the term appears in the article and how rare it is across the whole corpus.
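
A minimal sketch of this step, assuming scikit-learn is available (the three toy articles are purely illustrative, not from the question):

    from sklearn.feature_extraction.text import TfidfVectorizer

    # Toy stand-ins for scraped articles; in practice this would be
    # the title plus body text of each article.
    articles = [
        "election results surprise political analysts",
        "new smartphone model released this week",
        "local team wins the championship final",
    ]

    # Learn the vocabulary and turn each article into a TF-IDF vector.
    vectorizer = TfidfVectorizer(stop_words="english")
    X = vectorizer.fit_transform(articles)

    print(X.shape)  # (number of articles, number of distinct terms)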

2. Which neural network architecture is most suitable for this problem?

It is very difficult to say in general. Begin with a network that has k input neurons, where k is the size of your vector after applying TF-IDF. (You can also use some kind of feature selection method to reduce the number of features.)
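
Continuing the sketch above, one common choice for text (an assumption here, since no specific method is named) is a chi-squared filter such as scikit-learn's SelectKBest:

    from sklearn.feature_selection import SelectKBest, chi2

    # X is the TF-IDF matrix from the previous sketch; y holds one
    # popularity label per article, binarised purely for illustration.
    y = [1, 0, 1]

    # Keep only the k terms most associated with popularity.
    # (k=2 fits the toy corpus; real corpora would keep hundreds or more.)
    selector = SelectKBest(chi2, k=2)
    X_reduced = selector.fit_transform(X, y)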

Then, a standard layout would be one hidden layer with a number of neurons equal to the average of the number of input and output neurons. In your case you only need a single output neuron, which indicates how popular the article will be (this can be either a linear or a sigmoid neuron).
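
As a sketch of that layout, assuming Keras and an illustrative vector size of 2000 (both are assumptions, not part of the question):

    from tensorflow import keras

    n_inputs = 2000                  # size of the TF-IDF vector (illustrative)
    n_hidden = (n_inputs + 1) // 2   # average of input (2000) and output (1) neurons

    model = keras.Sequential([
        keras.Input(shape=(n_inputs,)),
        keras.layers.Dense(n_hidden, activation="sigmoid"),
        # A single output neuron: linear for a raw click count,
        # sigmoid if popularity is rescaled to [0, 1].
        keras.layers.Dense(1, activation="linear"),
    ])
    model.compile(optimizer="sgd", loss="mse")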

For the number of neurons in your hidden layer, you will simply have to experiment; one way to run that search is sketched below.
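
A minimal version of that experiment, using synthetic stand-in data (all sizes and values are made up for illustration):

    import numpy as np
    from tensorflow import keras

    # Synthetic stand-in data: 200 articles, 100 TF-IDF features each.
    rng = np.random.default_rng(0)
    X = rng.random((200, 100)).astype("float32")
    y = rng.random(200).astype("float32")   # popularity rescaled to [0, 1]

    # Try a few hidden-layer sizes and compare validation loss.
    for n_hidden in (10, 50, 100):
        model = keras.Sequential([
            keras.Input(shape=(100,)),
            keras.layers.Dense(n_hidden, activation="sigmoid"),
            keras.layers.Dense(1, activation="sigmoid"),
        ])
        model.compile(optimizer="sgd", loss="mse")
        history = model.fit(X, y, validation_split=0.2, epochs=5, verbose=0)
        print(n_hidden, history.history["val_loss"][-1])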

There are several other things you can also try: weight decay, momentum, networks with multiple hidden layers, recurrent networks, and so on. It is impossible to say what will work best for your problem without experimenting a lot.
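
For instance, weight decay and momentum can both be sketched in Keras as follows (the penalty strength and momentum value are arbitrary placeholders):

    from tensorflow import keras
    from tensorflow.keras import regularizers

    model = keras.Sequential([
        keras.Input(shape=(2000,)),
        # An L2 penalty on the weights acts as weight decay.
        keras.layers.Dense(1000, activation="sigmoid",
                           kernel_regularizer=regularizers.l2(1e-4)),
        keras.layers.Dense(1),
    ])

    # Plain SGD augmented with momentum.
    optimizer = keras.optimizers.SGD(learning_rate=0.01, momentum=0.9)
    model.compile(optimizer=optimizer, loss="mse")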

