Google's 2017 research paper "Attention Is All You Need" introduced the transformer, a new neural network architecture. From that paper the modern large language model was born, and we are now living in the thick of a new era driven by companies like OpenAI, Mistral, and Anthropic. But where does this cutting-edge technology come from? What are its roots? What are its problems?

This talk explores the history of procedural generation in text and games, from the I Ching to transformer-based language models and beyond. It emphasizes the current state of the art in text-based language models and includes demonstrations of how to run language models locally on your own hardware.
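
To give a flavour of the kind of demonstration planned, here is a minimal sketch of local text generation, assuming the Hugging Face transformers library and the small gpt2 model; both are illustrative choices, not necessarily what the talk itself will use:

    # Minimal local text generation with Hugging Face transformers.
    # The model name is illustrative; any locally cached causal LM works.
    from transformers import pipeline

    generator = pipeline("text-generation", model="gpt2")
    result = generator("The I Ching is", max_new_tokens=40)
    print(result[0]["generated_text"])

The first run downloads the model weights; after that, everything runs entirely on your own machine.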

Level: Introductory

Length: 1.5 Hours

Format: Lecture

Prerequisites: None
