Quinnipiac University

CSC 375/575 Generative AI

About the Course

CSC 375/575 Generative AI is an advanced course that explores transformer architectures, large language models, and generative AI techniques from foundational principles to practical implementation. Students will build and train language models from scratch, learning about tokenization, embeddings, attention mechanisms, and multi-head attention. The course covers modern architectures like GPT and BERT, along with advanced topics including fine-tuning, instruction following, reinforcement learning from human feedback (RLHF), and ethical considerations in AI development. Students will complete hands-on programming projects implementing core LLM components and develop a comprehensive final project demonstrating real-world applications of generative AI technologies.

Course Schedule

Course Materials

📄 Course Syllabus

Available Lectures

Lecture | Topic | Materials
1 | Introduction to Generative AI | Slides, Handout
2 | LLM Foundations & Pre-training | Slides, Handout, Notebook, Google Colab
3 | Tokenization & Data Processing | Slides, Handout, Notebook, Google Colab
4 | Attention Mechanisms & Transformers | Slides, Handout, Notebook

Assignments

Assignment | Topic | Materials
1 | Building Meta's LLaMA Tokenizer | Download Package, Instructions, Submit

Course Topics Overview

Currently Available Lectures:

Lecture 1: Introduction to Generative AI - Historical context, AI evolution, and course overview

Lecture 2: LLM Foundations & Pre-training - Introduction to large language models, architecture, and pre-training concepts

Lecture 3: Tokenization & Data Processing - Text processing pipelines, building tokenizers from scratch, and modern BPE tokenization (a toy BPE sketch appears after this list)

Lecture 4: Attention Mechanisms & Transformers - Self-attention, multi-head attention, transformer architecture, and positional encoding (a minimal attention sketch appears after this list)

More lectures will be added throughout the semester as the course progresses.
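As a quick preview of the Lecture 3 material, the sketch below illustrates the core idea behind byte pair encoding (BPE) on a toy corpus: repeatedly count adjacent symbol pairs and merge the most frequent one. The corpus, the number of merge steps, and the helper names are invented for illustration; this is not the course's (or LLaMA's) actual tokenizer code.

```python
# Toy illustration of BPE training: merge the most frequent adjacent pair.
# Simplified sketch only, with an invented four-word corpus.
from collections import Counter

def most_frequent_pair(words):
    """Count adjacent symbol pairs across all words (each word is a list of symbols)."""
    pairs = Counter()
    for symbols in words:
        for a, b in zip(symbols, symbols[1:]):
            pairs[(a, b)] += 1
    return pairs.most_common(1)[0][0]

def merge_pair(words, pair):
    """Replace every occurrence of the chosen pair with a single merged symbol."""
    merged = []
    for symbols in words:
        out, i = [], 0
        while i < len(symbols):
            if i + 1 < len(symbols) and (symbols[i], symbols[i + 1]) == pair:
                out.append(symbols[i] + symbols[i + 1])
                i += 2
            else:
                out.append(symbols[i])
                i += 1
        merged.append(out)
    return merged

corpus = [list("lower"), list("lowest"), list("newer"), list("wider")]
for step in range(3):                      # perform three merges
    pair = most_frequent_pair(corpus)
    corpus = merge_pair(corpus, pair)
    print(f"merge {step + 1}: {pair} -> {corpus}")
```

Likewise, as a preview of Lecture 4, here is a minimal NumPy sketch of scaled dot-product attention, the building block behind self-attention and multi-head attention. The sequence length, embedding dimension, and random projection matrices are arbitrary choices for illustration, not values used in the course.

```python
# Minimal scaled dot-product self-attention on a toy sequence of 4 tokens.
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)                           # (seq_len, seq_len) similarity scores
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights = weights / weights.sum(axis=-1, keepdims=True)   # row-wise softmax
    return weights @ V                                        # weighted sum of value vectors

rng = np.random.default_rng(0)
x = rng.normal(size=(4, 8))                                   # 4 toy tokens, embedding dim 8
Wq, Wk, Wv = (rng.normal(size=(8, 8)) for _ in range(3))      # random projection matrices
out = scaled_dot_product_attention(x @ Wq, x @ Wk, x @ Wv)
print(out.shape)                                              # (4, 8): one contextual vector per token
```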

Required Textbooks

Build a Large Language Model (From Scratch)

Sebastian Raschka

Manning Publications, 2024

Primary textbook, covering LLM implementation from the fundamentals up

Foundations of Large Language Models

Tong Xiao and Jingbo Zhu

NLP Lab, Northeastern University & NiuTrans Research, 2025

Supplementary material for theoretical foundations

Useful Resources

Resources are organized by category: Papers, Tools, Demos, Historical, Tokenization, and Visualization.