Generative AI and LLMs: Architecture and Data Preparation
This course is part of multiple programs.
Instructors: Joseph Santarcangelo +1 more
There are 2 modules in this course
Designed for data scientists, ML engineers, and AI enthusiasts, this course teaches you to differentiate between generative AI architectures and models such as recurrent neural networks (RNNs), transformers, generative adversarial networks (GANs), variational autoencoders (VAEs), and diffusion models. You'll also discover how LLMs such as generative pretrained transformers (GPT) and bidirectional encoder representations from transformers (BERT) power real-world language tasks. You'll get hands-on with tokenization techniques using NLTK, spaCy, and Hugging Face, and you'll build efficient data pipelines with PyTorch data loaders to prepare data for model training. A basic understanding of Python and PyTorch, along with familiarity with machine learning and neural networks, is helpful but not mandatory. Enroll today and launch your journey into generative AI!
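As a small taste of the tokenization techniques mentioned above, here is a minimal sketch using NLTK (the example sentence and the choice of `wordpunct_tokenize`, which needs no corpus downloads, are illustrative assumptions, not course material):

```python
# Tokenization sketch with NLTK (illustrative; assumes nltk is installed).
# wordpunct_tokenize is regex-based, so it requires no corpus downloads.
from nltk.tokenize import wordpunct_tokenize

text = "LLMs like GPT and BERT power real-world language tasks."
tokens = wordpunct_tokenize(text)
print(tokens)
# Note how punctuation is split out: "real-world" becomes "real", "-", "world".
```

Different tokenizers (NLTK, spaCy, Hugging Face subword tokenizers) make different splitting choices, which is exactly why the course compares them.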
Data Preparation for LLMs
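To illustrate the kind of data pipeline this module covers, here is a minimal PyTorch sketch (the toy token ids, class names, and padding scheme are invented for illustration, not taken from the course):

```python
# Minimal sketch of a PyTorch data pipeline for LLM training data.
# Token ids and dataset contents here are made up for illustration.
import torch
from torch.utils.data import Dataset, DataLoader

class ToyTokenDataset(Dataset):
    """Wraps pre-tokenized sentences (lists of token ids)."""
    def __init__(self, sequences):
        self.sequences = sequences

    def __len__(self):
        return len(self.sequences)

    def __getitem__(self, idx):
        return torch.tensor(self.sequences[idx], dtype=torch.long)

def pad_collate(batch, pad_id=0):
    """Pad variable-length sequences to the longest one in the batch."""
    return torch.nn.utils.rnn.pad_sequence(
        batch, batch_first=True, padding_value=pad_id
    )

sequences = [[5, 8, 2], [7, 1], [3, 9, 4, 6]]
loader = DataLoader(ToyTokenDataset(sequences), batch_size=3,
                    collate_fn=pad_collate)
batch = next(iter(loader))
print(batch.shape)  # three sequences padded to the longest length, 4
```

A custom `collate_fn` like this is the standard way to batch variable-length token sequences before feeding them to a model.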
©2025 ementorhub.com. All rights reserved