# TensorFlow for Text Classification

Earlier this year, I gave a talk at the London TensorFlow Meetup: an interactive tutorial on how to do text classification using TensorFlow. This is a short post going over the resources I created as part of that talk. The resources can be found on my GitHub.

## Motivation

For those wanting to learn to apply neural networks to text classification, finding practical resources can be challenging. CS224n from Stanford has some excellent theoretical resources. However, if you don't go to Stanford, access to materials for doing this in practice can be somewhat lacking!

All the exercises below have a semi-complete set of notebooks for you to work on, and have full solutions too, in case you want something to compare against or something is unclear. I suggest you invest the time to get a minimal working example before you look at my approach - it will pay dividends!

## Suggested Syllabus

### Lesson 1

- Lecture 8 of CS224n Slides
- First part of the WildML blog on RNNs Blog
- WildML blog on RNNs in TensorFlow Blog - some features/locations may have changed between TensorFlow versions.
- Read the documentation about `dynamic_rnn` and the cell types on the TensorFlow website: dynamic_rnn BasicRNNCell

#### Exercises

**BasicRNN**
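If you get stuck on the exercise, it can help to see the recurrence that `BasicRNNCell` computes spelled out. Here is a minimal NumPy sketch of a vanilla RNN over one sequence; the weight names, sizes, and initialisation are illustrative, not TensorFlow's own:

```python
import numpy as np

def basic_rnn(inputs, hidden_size, seed=0):
    """Vanilla RNN over a sequence: h_t = tanh(x_t @ W + h_{t-1} @ U + b).

    inputs: array of shape (seq_len, input_size)
    returns: all hidden states, shape (seq_len, hidden_size)
    """
    rng = np.random.default_rng(seed)
    seq_len, input_size = inputs.shape
    W = rng.normal(scale=0.1, size=(input_size, hidden_size))  # input weights
    U = rng.normal(scale=0.1, size=(hidden_size, hidden_size))  # recurrent weights
    b = np.zeros(hidden_size)
    h = np.zeros(hidden_size)  # initial state
    states = []
    for t in range(seq_len):
        h = np.tanh(inputs[t] @ W + h @ U + b)
        states.append(h)
    return np.stack(states)
```

For classification, you would typically feed the final state (or a pooled combination of all states, as in the later lessons) into a softmax layer.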

### Lesson 2

- Lecture 9 of CS224n Slides
- Colah's Blog on LSTMs Blog
- Read the documentation about GRUs and LSTMs on the TensorFlow website: LSTM GRU

#### Exercises

**GRURNN**, **LSTMRNN**
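As a companion to Colah's blog, here is a single LSTM step written out in NumPy. The parameter shapes and the input/forget/candidate/output gate ordering are one illustrative convention, not TensorFlow's internal layout:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_step(x, h_prev, c_prev, W, U, b):
    """One LSTM step: gates decide what to forget from the cell state,
    what new information to write, and what to expose as the hidden state.

    Shapes (illustrative): W (input_size, 4*hidden), U (hidden, 4*hidden),
    b (4*hidden,), with gate blocks ordered input, forget, candidate, output.
    """
    z = x @ W + h_prev @ U + b
    i, f, g, o = np.split(z, 4)
    i, f, o = sigmoid(i), sigmoid(f), sigmoid(o)
    c = f * c_prev + i * np.tanh(g)  # updated cell state
    h = o * np.tanh(c)               # new hidden state
    return h, c
```

The GRU exercise is the same shape of code with two gates instead of three and no separate cell state.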

### Lesson 3

- Later part of the WildML blog on RNNs Blog
- Oxford Deep NLP course Lecture 5
- Read the TensorFlow documentation about bidirectional_dynamic_rnn

#### Exercises

**BasicBidirectionalRNN** - Modify this code to run bidirectional LSTM and GRU networks.
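The idea behind `bidirectional_dynamic_rnn` is simply to run one RNN left-to-right and a second, independently parameterised RNN right-to-left, then concatenate their outputs at each time step. A NumPy sketch of that idea (random placeholder weights, for illustration only):

```python
import numpy as np

def rnn_pass(inputs, W, U, b):
    """Vanilla RNN forward pass; returns all hidden states."""
    h = np.zeros(U.shape[0])
    states = []
    for x in inputs:
        h = np.tanh(x @ W + h @ U + b)
        states.append(h)
    return np.stack(states)

def bidirectional_rnn(inputs, hidden_size, seed=0):
    """Run one RNN over the sequence and another over its reverse,
    concatenating the two states at each time step."""
    rng = np.random.default_rng(seed)
    input_size = inputs.shape[1]

    def make_params():
        return (rng.normal(scale=0.1, size=(input_size, hidden_size)),
                rng.normal(scale=0.1, size=(hidden_size, hidden_size)),
                np.zeros(hidden_size))

    fw = rnn_pass(inputs, *make_params())
    bw = rnn_pass(inputs[::-1], *make_params())[::-1]  # re-reverse to align time steps
    return np.concatenate([fw, bw], axis=1)  # (seq_len, 2 * hidden_size)
```

Note the output dimension doubles, which matters when you size the classification layer on top.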

### Lesson 4

- Oxford Deep NLP Conditional Language Modelling with attention Slides

#### Exercises

**BasicBidirectionalRNN-MeanPooling**, **BasicBidirectionalRNN-MaxPooling**, **GRUBidirectionalRNN-MeanPooling**, **GRUBidirectionalRNN-MaxPooling**
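The pooling in these exercises collapses the per-time-step RNN outputs into a single fixed-size vector for the classifier, either by averaging over time or by taking the element-wise maximum. In NumPy terms:

```python
import numpy as np

def pool_states(states, mode="mean"):
    """Collapse RNN outputs of shape (seq_len, hidden) into one
    (hidden,) vector by pooling over the time axis."""
    if mode == "mean":
        return states.mean(axis=0)
    if mode == "max":
        return states.max(axis=0)
    raise ValueError(f"unknown pooling mode: {mode}")
```

Max pooling keeps the strongest activation of each feature anywhere in the sequence, while mean pooling summarises the whole sequence; it is worth comparing both empirically.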

### Lesson 5

- Oxford Deep NLP Conditional Language Modelling with attention Slides
- WildML post on attention Blog
- Hierarchical Attention Networks - Yang et al., 2016 Paper

#### Exercises

**BasicRNNAttention** - Modify this code to run attention over an LSTM network.
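Attention here acts as a learned pooling over the RNN states: score each hidden state against a query vector, normalise the scores with a softmax, and return the weighted sum. The sketch below uses a simplified dot-product score; the reading above describes an additive (MLP-based) scoring function, but the pooling structure is the same:

```python
import numpy as np

def attention_pool(states, query):
    """Attention pooling over RNN outputs.

    states: (seq_len, hidden) hidden states
    query:  (hidden,) learned query/context vector
    returns: (pooled representation, attention weights)
    """
    scores = states @ query          # one score per time step
    scores = scores - scores.max()   # subtract max for numerical stability
    weights = np.exp(scores) / np.exp(scores).sum()  # softmax over time
    return weights @ states, weights
```

Unlike mean pooling (uniform weights), the weights here depend on the content of each state, so the model can learn to focus on the most informative words.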

### Lesson 6+

Apply these techniques to other datasets of your own choosing.


## Summary

This is only a short set of notebooks, and of course there are many other approaches and techniques that one could use. However, if you invest the time and work through these, you will be incredibly well placed to tackle current research papers in the area!