16/09/2020 – 18/09/2020

Transformers and their friends in NLP

This course covers state-of-the-art techniques for building Natural Language Processing models.

Course outline

General concepts

  • Tokenisation, parsing, …
  • Embeddings and transfer learning
  • Labeling data correctly

Deep Learning Architectures

  • LSTM
  • Contextualized word representations with the ELMo model
  • Attention mechanisms
  • Transformer architecture
  • BERT model
  • BERT’s derivatives and when to use them

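To give a flavour of the attention-mechanism topic listed above, here is a minimal NumPy sketch of scaled dot-product attention, the core operation inside the Transformer. The function name, shapes, and random inputs are illustrative only and not taken from the course material.

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Compute softmax(Q K^T / sqrt(d_k)) V for 2-D query/key/value matrices."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)                          # query-key similarities
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights = weights / weights.sum(axis=-1, keepdims=True)  # row-wise softmax
    return weights @ V                                       # weighted sum of values

# Self-attention over three token vectors of dimension 4
x = np.random.rand(3, 4)
out = scaled_dot_product_attention(x, x, x)
print(out.shape)  # (3, 4)
```

Each output row is a convex combination of the value rows, weighted by how strongly the corresponding query matches each key.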
Case studies and exercises

  • E-mail automation
  • Document information extraction
  • NLP for search

Teachers

Kaja Zupanc, PhD

Senior NLP Engineer

Aleksandra Vercauteren, PhD

Head of NLP

Data Scientist with a passion for NLP and a background as a researcher in Theoretical Linguistics. Specialized in formal approaches to typical aspects of linguistic interaction, such as question answering, implicit information, and discourse organization. Aleksandra loves tackling complex problems and finding patterns in unstructured data. A growing fascination with data science and its applications in Language Technology and AI led her to participate in a data science boot camp. She is an advocate of making science accessible to a broad audience and strongly believes that crowdsourcing is a crucial component of innovation.


Course level

Expert

Admission Fee

€ 4500

Prerequisites

  • Python programming at the intermediate level
  • Knowledge of basic TensorFlow concepts (we use the Keras API)
  • Knowledge of basic Machine Learning concepts like data splitting, classification, overfitting, probabilities, ...

You get

  • Course material
  • Drinks, snacks and lunch
  • Cloud servers for use during training
  • 4 hours of support and question answering up to 6 months after the course
