Understand LLMs by seeing them work

Interactive visual simulations that make transformers, attention, and KV cache click. No GPU required. Free.

Live Preview: Self-Attention
Input: "The cat sat on the mat"

Attention weights (rows are query tokens, columns are key tokens; each row sums to 1):

            The    cat    sat    on     the    mat
The         0.35   0.07   0.05   0.10   0.35   0.08
cat         0.06   0.26   0.47   0.06   0.07   0.08
sat         0.05   0.52   0.21   0.06   0.05   0.11
on          0.07   0.05   0.08   0.20   0.08   0.52
the         0.33   0.07   0.05   0.10   0.38   0.07
mat         0.04   0.07   0.10   0.42   0.05   0.32

Hover over tokens to explore attention patterns
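A grid like the one above is produced by scaled dot-product attention: each row is softmax(q·Kᵀ/√d) for one query token, so every row sums to 1. Here is a minimal sketch with NumPy, using random toy query/key vectors rather than the demo's actual values:

```python
import numpy as np

def attention_weights(Q, K):
    """Scaled dot-product attention weights: softmax(Q Kᵀ / sqrt(d))."""
    d = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d)
    # Subtract the row max for numerical stability, then softmax over keys.
    scores -= scores.max(axis=-1, keepdims=True)
    w = np.exp(scores)
    return w / w.sum(axis=-1, keepdims=True)

tokens = ["The", "cat", "sat", "on", "the", "mat"]
rng = np.random.default_rng(0)
d = 8                                       # toy embedding size (assumption)
Q = rng.normal(size=(len(tokens), d))       # toy query vectors
K = rng.normal(size=(len(tokens), d))       # toy key vectors

A = attention_weights(Q, K)
print(np.round(A, 2))  # a 6x6 grid of weights; each row sums to 1.0
```

With trained weights instead of random vectors, the rows concentrate on related tokens, which is exactly the pattern the hover demo visualizes.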

LLM Internals — 9 Interactive Modules

100% Browser-based
9 Modules
Free (all foundational modules)