Generative AI and LLMs: Architecture and Data Preparation

This course is part of multiple programs.

Instructors: Joseph Santarcangelo and one other instructor

What you'll learn

  •   Differentiate between generative AI architectures and models, such as RNNs, transformers, VAEs, GANs, and diffusion models
  •   Describe how LLMs, such as GPT, BERT, BART, and T5, are applied in natural language processing tasks
  •   Implement tokenization to preprocess raw text using NLP libraries like NLTK, spaCy, BertTokenizer, and XLNetTokenizer
  •   Create an NLP data loader in PyTorch that handles tokenization, numericalization, and padding for text datasets
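The tokenization objective above can be illustrated without any external libraries. The sketch below uses a simple regular expression to split text into words and punctuation; it is only a stand-in for the far more capable tokenizers the course covers (NLTK's `word_tokenize`, spaCy pipelines, and subword tokenizers such as `BertTokenizer`), and the function name is ours, not from any of those libraries.

```python
import re

def word_tokenize(text):
    # Split into word runs and single punctuation marks.
    # Illustrative only: real tokenizers handle contractions,
    # Unicode, and subword units far more carefully.
    return re.findall(r"\w+|[^\w\s]", text)

tokens = word_tokenize("Transformers power LLMs, don't they?")
# Note that a naive regex splits "don't" into three pieces:
# ['Transformers', 'power', 'LLMs', ',', 'don', "'", 't', 'they', '?']
```

Cases like the contraction above are exactly why production NLP work relies on purpose-built tokenizers rather than hand-rolled regexes.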
Skills you'll gain

  •   Text Mining
  •   Data Processing
  •   Artificial Intelligence and Machine Learning (AI/ML)
  •   Generative AI
  •   Natural Language Processing
  •   Large Language Modeling
  •   Artificial Neural Networks
  •   PyTorch (Machine Learning Library)
There are 2 modules in this course

Designed for data scientists, ML engineers, and AI enthusiasts, this course teaches you to differentiate between generative AI architectures and models, such as recurrent neural networks (RNNs), transformers, generative adversarial networks (GANs), variational autoencoders (VAEs), and diffusion models. You'll also discover how LLMs, such as generative pretrained transformers (GPT) and bidirectional encoder representations from transformers (BERT), power real-world language tasks. Get hands-on with tokenization techniques using NLTK, spaCy, and Hugging Face, and build efficient data pipelines with PyTorch data loaders to prepare models for training. A basic understanding of Python and PyTorch, and familiarity with machine learning and neural networks, are helpful but not mandatory. Enroll today and get ready to launch your journey into generative AI!
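The data-preparation pipeline described above (tokenization, numericalization, padding) can be sketched in plain Python. This is a minimal, library-free illustration of what a PyTorch `DataLoader`'s `collate_fn` typically does for text batches; the function and variable names are ours, and a real pipeline would return padded tensors rather than lists.

```python
from collections import Counter

PAD, UNK = 0, 1  # reserved ids for the padding and unknown tokens

def build_vocab(tokenized_texts, min_freq=1):
    # Map each token to an integer id, reserving 0/1 for <pad>/<unk>.
    counts = Counter(tok for text in tokenized_texts for tok in text)
    vocab = {"<pad>": PAD, "<unk>": UNK}
    for tok, freq in counts.items():
        if freq >= min_freq:
            vocab[tok] = len(vocab)
    return vocab

def collate(batch, vocab):
    # Numericalize each sequence, then pad every sequence to the
    # batch maximum length so the batch is rectangular.
    ids = [[vocab.get(tok, UNK) for tok in text] for text in batch]
    max_len = max(len(seq) for seq in ids)
    return [seq + [PAD] * (max_len - len(seq)) for seq in ids]

texts = [["hello", "world"], ["generative", "ai", "rocks"]]
vocab = build_vocab(texts)
batch = collate(texts, vocab)
# Shorter sequences are right-padded with the <pad> id (0):
# [[2, 3, 0], [4, 5, 6]]
```

In PyTorch itself, you would pass a function like `collate` (returning `torch.tensor(...)`) as the `collate_fn` argument of `torch.utils.data.DataLoader`, so that variable-length texts are batched into fixed-shape tensors for training.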

Data Preparation for LLMs


    ©2025  ementorhub.com. All rights reserved