
Milestones in AI history


Test for machine intelligence (1950):

Alan Turing proposed a test of whether a machine can think and make decisions as rationally and intelligently as a human being.

In the test, an interrogator has to figure out which answer belongs to a human and which one to a machine. 

If the interrogator cannot reliably distinguish between the two, the machine passes the test: it is indistinguishable from a human being.


The father of AI – John McCarthy (1955)

John McCarthy, an American Computer Scientist, coined the term Artificial Intelligence in his proposal for the Dartmouth Conference, the first-ever AI conference held in 1956. 

The objective was to design a machine capable of thinking and reasoning like a human. He believed this scientific breakthrough would unquestionably happen within 5 to 5,000 years.



The first chatbot – Eliza (1964)

Eliza, the first-ever chatbot, was created in the 1960s by Joseph Weizenbaum at MIT's Artificial Intelligence Laboratory. Eliza was a psychotherapist program that gave pre-scripted responses, so that users felt they were talking to someone who understood their problems.



Deep Blue (1997)

IBM's Deep Blue chess computer defeated reigning world champion Garry Kasparov, heralding the potential for AI to outperform humans in complex strategic games.


Voice recognition on the iPhone (2008) and Siri (2011)

This advancement gave users the power to quite literally *voice* their queries and concerns.


AlphaGo (2016)

Developed by DeepMind, AlphaGo became the first AI program to defeat a world champion Go player, Lee Sedol. This achievement was considered a significant milestone in AI, as Go was long regarded as a game too complex for computers to master.


Generative AI (2020s)

OpenAI launched ChatGPT on November 30, 2022: an advanced AI chatbot that could converse with users like a real human.


Generative AI

Creative Systems & Intelligent Workflows


Overview & Requirements

  1. Intro: Generative AI + LLMs + Ideation (assignment)
  2. LLMs part 2: MCP, Agents, Codex, Hugging Face, and more + Implementation (assignment)
  3. Creative AI: Diffusion Models, 3D and World Models + Design / Marketing (assignment)
  4. Creative AI part 2: TTS & STT + Voice Cloning & Music Generators + Design / Marketing (assignment)
  5. AI Workshop: Final assignment!


AI Landscape

Image Generation | Audio | AI Video | LLMs


COMFYUI


LARGE LANGUAGE MODELS: Part 1


Course Overview and Structure

Key modules and learning outcomes

Learning Objectives

  • Understand LLM fundamentals
  • Apply prompt engineering techniques
  • Identify ethical issues

Key Modules

  • Introduction to LLMs
  • Prompt Engineering
  • Ethical Considerations


Attention is All You Need!

GPT | Generative Pre-Trained Transformer


Understanding Large Language Models

Large Language Models (LLMs) utilize pattern prediction to generate text, relying on tokens for language representation. By analyzing context, they understand and produce coherent responses, facilitating advanced interaction.

The Mechanics of Transformers Explained

1. Attention Mechanism

Enables models to focus on relevant input data.

2. Layer Normalization

Stabilizes training and improves model performance.
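A minimal sketch of that normalization step in plain Python, assuming the standard formulation: normalize to zero mean and unit variance, then apply a learned scale and shift (here `gamma` and `beta` stand in for those learned parameters and are left at their defaults):

```python
import math

def layer_norm(x, gamma=1.0, beta=0.0, eps=1e-5):
    """Normalize a vector to zero mean / unit variance, then scale and shift.

    gamma and beta stand in for the learned per-feature parameters;
    eps guards against division by zero.
    """
    mean = sum(x) / len(x)
    var = sum((v - mean) ** 2 for v in x) / len(x)
    return [gamma * (v - mean) / math.sqrt(var + eps) + beta for v in x]

normed = layer_norm([2.0, 4.0, 6.0, 8.0])
# the normalized activations have zero mean and (approximately) unit variance
```

Keeping activations in this fixed range at every layer is what stabilizes the gradients during training.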


Training Stages

LLM ≈ a ~1 TB lossy, probabilistic "zip file of the internet"

Parameters store world knowledge, though it is usually a few months out of date

Pre-training: ~$10M and ~3 months of training on internet documents

Post-training: cheaper finetuning with RLHF and RL on conversations


LLM Key Concepts

  1. Tokens: subword pieces (~4 characters on average) that LLMs use to process text
  2. Temperature: controls creativity vs. consistency in outputs (higher = more creative, lower = more predictable)
  3. Context Window: the model's working memory capacity
  4. Self-Attention: how models understand relationships between words by weighing their importance in context
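The temperature knob in point 2 can be illustrated with a toy sampler: temperature divides the raw next-token scores (logits) before the softmax, so low values sharpen the distribution and high values flatten it. The logits below are invented purely for illustration:

```python
import math

def apply_temperature(logits, temperature):
    """Turn raw next-token scores into probabilities, scaled by temperature."""
    scaled = [l / temperature for l in logits]
    m = max(scaled)                      # subtract max for numerical stability
    exps = [math.exp(s - m) for s in scaled]
    total = sum(exps)
    return [e / total for e in exps]

logits = [2.0, 1.0, 0.5]                 # invented scores for three candidate tokens
cool = apply_temperature(logits, 0.5)    # low T: the top token dominates
warm = apply_temperature(logits, 2.0)    # high T: probabilities flatten out
```

Sampling from `cool` almost always picks the top token (predictable output); sampling from `warm` spreads probability across all candidates (more creative, less consistent).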


LLM Limitations: The 3 Critical Failures

  1. Hallucinations: confident false information, fabricated citations, made-up API methods
  2. Context window limits: the model forgets earlier conversation beyond the max [x]-K-character window. Solution: start a new chat for each new topic
  3. Calculation errors: math is not an LLM's strength! Solution: use code interpreters for calculations, and always verify numerical outputs
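The code-interpreter remedy for point 3 is simply to recompute any claimed number deterministically. A toy checker, with hypothetical invoice figures:

```python
def verify_total(claimed, numbers, tol=1e-6):
    """Recompute a sum deterministically and compare with the model's claim."""
    return abs(sum(numbers) - claimed) <= tol

# Hypothetical scenario: a model summarizing an invoice claims the total
# is 1043.50 — recomputing in code either confirms or refutes it.
line_items = [199.99, 423.51, 420.00]
print(verify_total(1043.50, line_items))   # True: the claim checks out
print(verify_total(1034.50, line_items))   # False: a digit-swap hallucination
```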


Transformers

"The cat chased the mouse, but it ran away."
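Self-attention is what lets a model decide whether "it" refers to the cat or the mouse: the pronoun's query vector is compared against the other tokens' key vectors, and the softmaxed scores become attention weights. A toy sketch with invented 2-D embeddings (real models learn high-dimensional ones):

```python
import math

def softmax(xs):
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    total = sum(exps)
    return [e / total for e in exps]

def attention_weights(query, keys):
    """Scaled dot-product attention weights for one query over a set of keys."""
    d = len(query)
    scores = [sum(q * k for q, k in zip(query, key)) / math.sqrt(d) for key in keys]
    return softmax(scores)

# Invented 2-D "embeddings" purely for illustration: the query for "it"
# points mostly toward "mouse", so attention assigns it the larger weight.
keys = {"cat": [1.0, 0.0], "mouse": [0.0, 1.0]}
it_query = [0.2, 0.9]
weights = dict(zip(keys, attention_weights(it_query, list(keys.values()))))
print(weights)  # "mouse" receives more attention than "cat"
```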


Prompt Engineering

Prompting is the process of providing specific instructions to a Gen AI tool to receive new information or to achieve a desired outcome on a task (text | images | videos | sound | code).

The 4-Step prompting cycle

1. Be Specific with Context

2. Provide Examples

3. Request Reasoning

4. Iterate & Refine


Prompting Techniques: 2025

  1. Zero-shot prompting: direct instruction without examples. Best for simple, straightforward tasks.
  2. Few-shot prompting: provide 1-3 input/output examples to guide the model. Ideal when a specific format or style is needed.
  3. Chain-of-thought: request step-by-step reasoning with phrases like "Let's think step by step..." Excellent for complex reasoning tasks.
  4. Role prompting: assign an expert role to the LLM (e.g., "You are an expert Python developer"). Most effective for specialized domain knowledge.
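Few-shot prompting is mostly string assembly: an instruction, a handful of worked input/output pairs, then the new query. A small helper, with hypothetical sentiment-classification examples:

```python
def build_few_shot_prompt(instruction, examples, query):
    """Assemble a prompt: instruction, worked input/output pairs, then the query."""
    lines = [instruction, ""]
    for inp, out in examples:
        lines += [f"Input: {inp}", f"Output: {out}", ""]
    lines += [f"Input: {query}", "Output:"]
    return "\n".join(lines)

# Hypothetical examples; the structure (instruction + pairs + query) is the point.
prompt = build_few_shot_prompt(
    "Classify the sentiment of each review as positive or negative.",
    [("Great battery life!", "positive"),
     ("Broke after two days.", "negative")],
    "Exceeded my expectations.",
)
print(prompt)
```

The trailing "Output:" invites the model to complete the pattern in the same format as the examples.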


5-STEP FRAMEWORK [Google]

  1. Task: what you want the AI to do.
  2. Context: the more context you provide, the better the output will be.
  3. References: provide examples.
  4. Evaluate: is this output what I wanted it to be?
  5. Iterate: always keep iterating.

4 ITERATION METHODS

  1. Revisit the prompting framework
  2. Separate the prompt into shorter sentences / tasks
  3. Try different phrasing or switch to an analogous task
  4. Use constraints to narrow the focus


Example: Good vs Poor Prompts

  1. Bad Prompt: "Make a poster for a concert"
  2. Good Prompt: "Design a vibrant A3 poster for the band Solar Bloom's sunset concert in Antwerp. Use a warm color palette (sunset oranges, purples, soft pinks), include minimal typography with the band name and date, and evoke an atmospheric, dreamy feeling inspired by Coachella posters."


Developer Use Cases in LLMs

Code Intelligence & Engineering Support

  • Code Generation, Completion & Refactoring
  • Code Review & Explanation
  • Bug Detection & Debugging
  • Test Generation & QA

Software Design, Architecture, and Documentation

  • Architecture Design
  • API Design & Integration
  • Documentation & Onboarding
  • Legacy System Understanding

🚀 TL;DR: LLMs help developers:

Write, reason, document, debug, test, integrate, research, automate, and design faster! But the skill lies in knowing when to hand off, when to verify, and when to override.


DEMO TIME

  • Models & Pricing
  • Thinking Models and when to use them
  • Tool Use: e.g. Internet Search
  • DeepSearch | Council of Models
  • File Upload & Context
  • Data Analysis
  • ChatGPT Projects & Claude Artifacts
  • Audio Input & Output
  • Advanced Voice Mode (true audio)
  • NotebookLM & OAI Canvas
  • Image Input & OCR
  • Image Output
  • Video Input & Talk on App
  • Video Output
  • Custom Memory & Instructions
  • Custom GPTs