
models · foundations

Attention

The mechanism inside transformer models that lets the model weigh how relevant each token in the input is to every other token before generating a response.
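This weighing is typically computed as scaled dot-product attention: each token's query vector is compared against every token's key vector, the resulting scores are normalized with a softmax, and the weights mix the value vectors. A minimal sketch (array shapes and variable names here are illustrative, not from any specific library):

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Weigh each query token against every key token, then mix value vectors."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)  # relevance of each token to every other token
    # softmax over each row so the weights for one token sum to 1
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ V  # weighted sum of value vectors

# toy example: 3 tokens with 4-dimensional vectors
rng = np.random.default_rng(0)
Q = rng.normal(size=(3, 4))
K = rng.normal(size=(3, 4))
V = rng.normal(size=(3, 4))
out = scaled_dot_product_attention(Q, K, V)
print(out.shape)  # one mixed representation per token
```

In a real transformer the queries, keys, and values are learned linear projections of the same token embeddings, and many attention "heads" run in parallel.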

Last updated 2026-05-12