Recursive Neural Networks with PyTorch

Originally published at:

From Siri to Google Translate, deep neural networks have enabled breakthroughs in machine understanding of natural language. Most of these models treat language as a flat sequence of words or characters, and use a kind of model called a recurrent neural network (RNN) to process this sequence. But many linguists think that language is best…

Back propagated RL of Pyros - Dante's Inferno was a lovefest.

How about a Turing test for the NLP age: process the top 100 classic novels and summarize the top 10 common themes. 2001 is not a classic.

Summary of Deep Learning from the three godfathers of NNs, prior to recent advances by Socher, Bowman, et al.

Come to the GPU Technology Conference, May 8-11 in San Jose, California, to learn more about deep learning and PyTorch. GTC is the largest and most important event of the year for AI and GPU developers. Use code CMDLIPF to receive 20% off registration!

A degree in linguistics - that's why you can actually write, rather than just engineer garbled machine "natural" language, eh? Well done; we need more native speakers writing in a natural style.

Thanks for the post. Which version of torch are you using? In the data-loading part I'm getting the error:
TypeError: splits() got an unexpected keyword argument 'wv_type'

Thanks for your code. I have a question.
answers.build_vocab(train) gives answers the labels "neutral", "contradiction", "entailment", and "<unk>", so config.d_out = len(answers.vocab) sets config.d_out to 4.

That means you classify the data into 4 classes. Is this a mistake, or does it still work as you intended?
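The behavior the comment describes can be sketched with a simplified stand-in for a torchtext-style label vocabulary. The Vocab class below is a hypothetical illustration, not torchtext's actual implementation; it only mimics the default of prepending an "<unk>" token, which is why three labels yield a vocabulary of size four.

```python
# Simplified stand-in for a torchtext-style label vocabulary (illustrative only).
# Like torchtext's default Field, it prepends an "<unk>" token, so three
# observed labels become a vocabulary of size four.
class Vocab:
    def __init__(self, labels, specials=("<unk>",)):
        # Special tokens come first, then the observed labels.
        self.itos = list(specials) + sorted(set(labels))
        self.stoi = {tok: i for i, tok in enumerate(self.itos)}

train_labels = ["entailment", "neutral", "contradiction", "entailment"]
vocab = Vocab(train_labels)

d_out = len(vocab.itos)  # plays the role of config.d_out in the post's code
print(vocab.itos)        # ['<unk>', 'contradiction', 'entailment', 'neutral']
print(d_out)             # 4
```

In practice the extra "<unk>" class is simply never the correct target for well-formed data, so training still works; if you want exactly three outputs, torchtext's Field accepts unk_token=None to suppress the extra entry (behavior may vary by torchtext version).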