Step inside the black box of Large Language Models. From the mathematics of neurons to the deployment of autonomous AI Agents.
History, Neurons & The Science of Words
From ELIZA (1966) to GPT-4. Understanding the types of AI and why natural language was so hard for computers to handle.
The math of a neuron. Building simple neural networks and understanding how layers process signals.
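To make the "math of a neuron" concrete, here is a minimal sketch in plain Python: a neuron is a weighted sum plus a bias, squashed by an activation function (sigmoid here), and a layer is just several neurons reading the same inputs. The weights and biases below are arbitrary illustrative values, not trained ones.

```python
import math

def neuron(inputs, weights, bias):
    """A single neuron: weighted sum of inputs plus bias, through a sigmoid."""
    z = sum(w * x for w, x in zip(weights, inputs)) + bias
    return 1 / (1 + math.exp(-z))

def layer(inputs, weight_rows, biases):
    """A dense layer: one neuron per (weights, bias) pair, same inputs for all."""
    return [neuron(inputs, w, b) for w, b in zip(weight_rows, biases)]

# Two-layer network: 2 inputs -> 2 hidden neurons -> 1 output neuron.
# Each layer's outputs become the next layer's inputs -- this is how
# signals flow forward through the network.
hidden = layer([1.0, 0.5], [[0.4, -0.6], [0.1, 0.9]], [0.0, -0.2])
output = layer(hidden, [[1.2, -0.8]], [0.3])
```

Stacking more such layers is all it takes to go from one neuron to a deep network; training is then the process of adjusting the weights and biases.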
Tokenization and Word Embeddings. Mapping semantic meaning to mathematical space.
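The idea of mapping meaning to mathematical space can be sketched in a few lines: a toy tokenizer turns words into integer IDs, each ID gets a vector, and cosine similarity measures how "close" two words are. The two-dimensional vectors below are hand-picked for illustration; real embeddings are learned and have hundreds of dimensions.

```python
import math

corpus = ["the cat sat", "the dog sat"]
# Toy tokenizer: one ID per unique whitespace-separated word.
vocab = {tok: i for i, tok in enumerate(sorted({w for s in corpus for w in s.split()}))}

def tokenize(text):
    return [vocab[w] for w in text.split()]

# Toy embedding table: a small, hand-chosen vector per token ID.
embeddings = {
    vocab["cat"]: [0.9, 0.1],
    vocab["dog"]: [0.8, 0.2],
    vocab["sat"]: [0.1, 0.9],
    vocab["the"]: [0.5, 0.5],
}

def cosine(a, b):
    """Cosine similarity: 1.0 means same direction, 0.0 means unrelated."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb)

# "cat" and "dog" land closer together than "cat" and "sat".
sim_cat_dog = cosine(embeddings[vocab["cat"]], embeddings[vocab["dog"]])
sim_cat_sat = cosine(embeddings[vocab["cat"]], embeddings[vocab["sat"]])
```

The key intuition survives the simplification: semantically similar words end up with geometrically similar vectors.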
Transformers, APIs & Deployment
Unpacking "Attention Is All You Need" (2017), the paper that changed everything. Understanding Encoders and the mechanics of Self-Attention.
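The core of that paper, scaled dot-product attention, fits in a short sketch: each query is scored against every key, the scores are turned into weights with a softmax, and the output is the weight-blended mix of the values. The tiny 2-dimensional Q/K/V matrices below are illustrative stand-ins for real learned projections.

```python
import math

def softmax(xs):
    """Turn raw scores into weights that are positive and sum to 1."""
    m = max(xs)  # subtract the max for numerical stability
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

def attention(queries, keys, values):
    """Scaled dot-product attention: softmax(QK^T / sqrt(d)) V, row by row."""
    d = len(keys[0])
    out = []
    for q in queries:
        scores = [sum(qi * ki for qi, ki in zip(q, k)) / math.sqrt(d) for k in keys]
        weights = softmax(scores)
        out.append([sum(w * v[j] for w, v in zip(weights, values))
                    for j in range(len(values[0]))])
    return out

# Three "tokens" with 2-dim vectors; self-attention uses the same matrix
# as queries and keys, so every token attends to every other token.
Q = [[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]]
ctx = attention(Q, Q, [[1.0, 2.0], [3.0, 4.0], [5.0, 6.0]])
```

Each output row is a convex combination of the value rows, which is why attention is often described as a soft, differentiable lookup.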
Mastering system prompts. Using API keys to run models like Gemma-2 and tuning their outputs.
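The shape of such a call can be sketched as below. Note the endpoint URL is a placeholder and the model name, header format, and payload fields are assumptions modelled on common OpenAI-compatible chat APIs; check your provider's documentation for the real values.

```python
import json
import os
import urllib.request

# Placeholder endpoint -- substitute your provider's real chat URL.
API_URL = "https://example.com/v1/chat/completions"

def build_request(user_message, system_prompt, model="gemma-2-9b-it"):
    """Assemble a chat request: the system prompt steers tone and behaviour,
    the user message carries the actual question."""
    payload = {
        "model": model,
        "messages": [
            {"role": "system", "content": system_prompt},
            {"role": "user", "content": user_message},
        ],
        "temperature": 0.7,  # one of the knobs for tuning outputs
    }
    headers = {
        "Content-Type": "application/json",
        # The API key lives in an environment variable, never in source code.
        "Authorization": f"Bearer {os.environ.get('API_KEY', '')}",
    }
    return urllib.request.Request(API_URL, json.dumps(payload).encode(), headers)

req = build_request("Explain tokenization in one sentence.",
                    "You are a concise tutor for beginners.")
# Sending: urllib.request.urlopen(req) -- omitted here, as the URL is a placeholder.
```

The pattern to take away is the separation of roles: the system message sets the rules once, while user messages vary per turn.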
Final Project: Creating a chatbot that uses RAG (Retrieval-Augmented Generation) and designing autonomous agent workflows.
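The RAG loop itself is simple to sketch: retrieve the documents most relevant to the question, then stuff them into the prompt as context for the model. The word-overlap scoring below is a deliberately naive stand-in for real embedding-based search, just to show the two-step structure.

```python
def retrieve(question, documents, top_k=1):
    """Rank documents by naive word overlap with the question (a stand-in
    for embedding similarity search) and return the best matches."""
    q_words = set(question.lower().split())
    scored = sorted(documents,
                    key=lambda d: len(q_words & set(d.lower().split())),
                    reverse=True)
    return scored[:top_k]

def build_prompt(question, documents):
    """Retrieval feeds generation: the retrieved text becomes the context
    the model is told to answer from."""
    context = "\n".join(retrieve(question, documents))
    return f"Answer using only this context:\n{context}\n\nQuestion: {question}"

docs = [
    "Tokenization splits text into units the model can look up.",
    "Self-attention lets every token attend to every other token.",
]
prompt = build_prompt("What does tokenization do?", docs)
```

Swapping the overlap scorer for embedding similarity, and the prompt string for an actual model call, turns this skeleton into a working RAG chatbot.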