AI & Machine Learning · Advanced

Natural Language Processing

Build text classifiers, summarisers, and conversational AI with modern NLP tools.

35 hours · Dr. Sarah Chen · 4.7 (2,800 learners)

About this course

A practical NLP course covering everything from tokenisation and classical text classification through to fine-tuning BERT-family models, retrieval-augmented generation, and building production chatbots. Projects include a sentiment analyser, a document summariser, and a simple RAG pipeline.
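
At its simplest, a RAG pipeline like the one built in the course retrieves the best-matching document for a query and assembles it into a prompt for an LLM. A minimal sketch using a TF-IDF retriever (scikit-learn assumed; the documents and the `retrieve` helper are illustrative, not course material, and the LLM call itself is omitted):

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

# Hypothetical document store (illustrative, not course data)
docs = [
    "BERT is a transformer encoder pretrained with masked language modelling.",
    "TF-IDF weighs terms by frequency in a document against corpus rarity.",
    "RAG retrieves relevant documents and feeds them to an LLM as context.",
]

def retrieve(query, docs, k=1):
    """Return the k documents most similar to the query (TF-IDF retriever)."""
    vec = TfidfVectorizer().fit(docs + [query])
    sims = cosine_similarity(vec.transform([query]), vec.transform(docs))[0]
    return [docs[i] for i in sims.argsort()[::-1][:k]]

# Retrieve context, then build a prompt for the generation step (LLM call not shown)
context = retrieve("How does RAG retrieve context for an LLM?", docs)[0]
prompt = f"Answer using this context:\n{context}\n\nQuestion: How does RAG work?"
print(context)
```

Production systems typically swap the TF-IDF retriever for dense embeddings and a vector store, but the retrieve-then-prompt structure is the same.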

Target audience: ML engineers, data scientists, developers building AI-powered text applications

What you will learn

  • NLP
  • BERT fine-tuning
  • RAG
  • LLM APIs
  • Text classification

Course syllabus

10 modules · video + projects

  1. Text preprocessing: tokenisation, stemming, and lemmatisation
  2. Bag-of-words, TF-IDF, and classical text classification
  3. Word embeddings: Word2Vec, GloVe, and FastText
  4. Sequence models: LSTMs for NLP
  5. The transformer architecture and attention mechanism
  6. BERT, RoBERTa, and fine-tuning for downstream tasks
  7. Named entity recognition and relation extraction
  8. Retrieval-augmented generation (RAG)
  9. Prompt engineering and LLM API integration
  10. Building and deploying a production chatbot
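
As a taste of module 2, the classical baseline pairs TF-IDF features with a linear classifier. A minimal sketch, assuming scikit-learn (the toy texts below are illustrative, not course data):

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Tiny toy sentiment dataset (hypothetical; 1 = positive, 0 = negative)
texts = [
    "great movie, loved it",
    "terrible plot, boring",
    "fantastic acting",
    "awful and dull",
    "really enjoyable film",
    "worst film ever",
]
labels = [1, 0, 1, 0, 1, 0]

# TF-IDF features + logistic regression: the classical text-classification baseline
clf = make_pipeline(TfidfVectorizer(), LogisticRegression())
clf.fit(texts, labels)

print(clf.predict(["what an enjoyable film"]))
```

Later modules replace this pipeline with fine-tuned BERT-family models, which capture word order and context that bag-of-words features discard.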

Prerequisites

  • Python
  • Machine Learning Fundamentals

Frequently asked questions

Does this course use the OpenAI API?

Yes, the LLM integration sections use the OpenAI API. A small amount of API credit (under $5) is sufficient for all exercises. The course also covers open-source alternatives using the HuggingFace Hub.

Ready to start Natural Language Processing?

Join 2,800+ learners already enrolled. Self-paced, certificate on completion.