Table of Contents:

Preface
Acknowledgments
Introduction
Learning Basics and Linear Models
From Linear Models to Multi-layer Perceptrons
Feed-forward Neural Networks
Neural Network Training
Features for Textual Data
Case Studies of NLP Features
From Textual Features to Inputs
Language Modeling
Pre-trained Word Representations
Using Word Embeddings
Case Study: A Feed-forward Architecture for Sentence Meaning Inference
Ngram Detectors: Convolutional Neural Networks
Recurrent Neural Networks: Modeling Sequences and Stacks
Concrete Recurrent Neural Network Architectures
Modeling with Recurrent Networks
Conditioned Generation
Modeling Trees with Recursive Neural Networks
Structured Output Prediction
Cascaded, Multi-task and Semi-supervised Learning
Conclusion
Bibliography
Author's Biography