Why Transformers Work

And RNNs Fall Short

Vincent Warmerdam

Algorithms · Deep Learning · Natural Language Processing


This will be a technical talk where I'll explain the inner workings of the machine learning algorithms inside of Rasa. In particular, I'll talk about why the transformer has become part of many of our algorithms and has replaced RNNs. These include use cases in natural language processing as well as in dialogue handling.

You'll see a live demo of a typical error that an LSTM would make but a transformer wouldn't. The algorithms are explained with calm diagrams and very little maths.
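To give a flavour of the contrast the abstract draws: self-attention lets every token in a sequence attend to every other token in a single step, while an RNN/LSTM must carry information forward through a recurrent state one step at a time. The snippet below is a minimal toy sketch of scaled dot-product self-attention (no learned projections, and not Rasa's actual implementation):

```python
# Toy sketch of self-attention: every position mixes information from
# all positions directly, instead of passing it through a recurrent
# state step by step as an LSTM would.
import numpy as np

def self_attention(X):
    """Scaled dot-product self-attention over token vectors.

    X: array of shape (seq_len, d). For illustration, queries, keys
    and values are all X itself (no learned weight matrices).
    """
    d = X.shape[-1]
    scores = X @ X.T / np.sqrt(d)  # (seq_len, seq_len) pairwise similarities
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)  # softmax over keys
    return weights @ X  # each output row is a mixture of all tokens

tokens = np.random.default_rng(0).normal(size=(5, 8))
out = self_attention(tokens)
print(out.shape)  # (5, 8): all positions interact in one step
```

Because the attention weights connect any two positions directly, long-range dependencies don't have to survive a chain of recurrent updates — which is exactly the kind of situation where an LSTM tends to slip up.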

Type: Talk (45 mins); Python level: Intermediate; Domain level: Intermediate

Vincent Warmerdam


My name is Vincent, AskMeAnything[tm]. I have been evangelizing data and open source for the last 5 years. You might know me from tech talks where I attempt to defend common sense over hype in data science. Currently I work as a Research Advocate at Rasa, where I collaborate with the research team to explain and understand conversational systems better.

The future is pretty awesome, all we have to do is build it.