1 article tagged with "attention"
The landmark paper that introduced the Transformer architecture, which revolutionized NLP and became the foundation for modern LLMs.