Mastering Generative AI: Language Models with Transformers

IBM

Build job-ready skills in NLP-based applications using GPT and BERT in just 2 weeks! Add the sought-after skills employers are looking for to your resume and power up your AI career.

4 hrs/week · 2 weeks · English · 825 enrolled

About this Course

The demand for transformer-based language models is skyrocketing. AI engineers skilled in using transformer-based models for NLP are essential for developing successful gen AI applications. This course builds the job-ready skills employers need.

During the course, you’ll explore the concepts of transformer-based models for natural language processing (NLP). You’ll look at how to apply transformer-based models for text classification, focusing on the encoder component. Plus, you’ll learn about positional encoding, word embedding, and attention mechanisms in language transformers, and their role in capturing contextual information and dependencies.

You’ll learn about multi-head attention and decoder-based language modeling with generative pre-trained transformers (GPT) for language translation, and you’ll consider how to train and implement models using PyTorch. You’ll explore encoder-based models with bidirectional encoder representations from transformers (BERT) and train them using masked language modeling (MLM) and next sentence prediction (NSP). Plus, you’ll learn to apply transformers for translation using the transformer architecture and implement it using PyTorch.

Throughout, you’ll apply your new skills in hands-on activities, and you’ll complete a final project tackling a real-world scenario. If you’re looking to build the job-ready gen AI skills employers are looking for, enroll today and enhance your resume in just 2 weeks!

Prerequisites: To enroll in this course, you need a working knowledge of Python, PyTorch, and machine learning.
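To give a flavor of the attention mechanisms the course covers, here is a minimal sketch of scaled dot-product attention, the core operation inside every transformer layer. The course uses PyTorch; this standalone illustration uses NumPy so it runs with no deep-learning framework installed, and all names here are illustrative, not taken from the course materials.

```python
import numpy as np

def softmax(x, axis=-1):
    # Subtract the row max before exponentiating for numerical stability
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def scaled_dot_product_attention(query, key, value):
    d_k = query.shape[-1]
    # Similarity of each query token to each key token, scaled by sqrt(d_k)
    scores = query @ key.swapaxes(-2, -1) / np.sqrt(d_k)
    weights = softmax(scores)        # each row of weights sums to 1
    return weights @ value           # weighted mix of value vectors

rng = np.random.default_rng(0)
x = rng.standard_normal((4, 8))      # 4 tokens, embedding size 8
out = scaled_dot_product_attention(x, x, x)   # self-attention: Q = K = V
print(out.shape)  # (4, 8)
```

Each output row is a context-aware mixture of all token representations, which is how attention captures the contextual dependencies described above.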

What You'll Learn

  • Job-ready skills employers are looking for in transformer-based models for NLP, in just 2 weeks.
  • A good understanding of attention mechanisms in transformers, including their role in capturing contextual information.
  • A good understanding of language modeling with decoder-based GPT and encoder-based BERT.
  • How to implement positional encoding, masking, attention mechanism, document classification, and LLMs like GPT and BERT.
  • How to use transformer-based models and PyTorch functions for text classification, language translation, and modeling.
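Positional encoding, listed above, can be sketched in a few lines. The snippet below implements the fixed sinusoidal scheme from the original Transformer architecture in NumPy; it is an illustrative sketch, not code from the course.

```python
import numpy as np

def sinusoidal_positional_encoding(seq_len, d_model):
    """Fixed sin/cos positional encodings (assumes d_model is even)."""
    positions = np.arange(seq_len)[:, None]          # shape (seq_len, 1)
    dims = np.arange(0, d_model, 2)[None, :]         # even dimension indices
    angles = positions / (10000 ** (dims / d_model))
    pe = np.zeros((seq_len, d_model))
    pe[:, 0::2] = np.sin(angles)                     # even dims get sine
    pe[:, 1::2] = np.cos(angles)                     # odd dims get cosine
    return pe

pe = sinusoidal_positional_encoding(seq_len=16, d_model=8)
print(pe.shape)  # (16, 8)
```

These vectors are added to the word embeddings so the model, which otherwise treats its input as an unordered set, can distinguish token positions.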

Prerequisites

  • For this course, basic knowledge of Python and familiarity with machine learning and neural network concepts are recommended.

Instructors

Joseph Santarcangelo

PhD, Data Scientist

Course Info

Platform: edX
Level: Intermediate
Pacing: Unknown
Certificate: Available
Price: Free to Audit
