Published: September 30, 2025

8 Google engineers wrote the paper that every AI company now uses as their bible. OpenAI built GPT on it, Anthropic built Claude on it, and Meta built LLaMA on it. Every LLM worth billions uses this paper's transformer architecture as the foundation...

Before 2017, teaching computers human language was torture. AI would read text like a human reading through a keyhole - one word at a time. Models were slow, forgot context, and choked on long passages. Then 8 researchers decided to flip the approach entirely...


They published a short paper titled "Attention Is All You Need." The idea was simple: instead of reading word by word, why not look at everything at once? Like how you can glance at a page and immediately see which words relate to each other. They called it a Transformer.

An example: "The bank by the river bank was full of cash." Old AI would get confused. Two banks? Transformers see everything at once. "Bank" near "river" = riverbank. "Bank" near "cash" = financial institution. One formula makes this work & it's worth more than most countries.


Attention(Q, K, V) = softmax(QK^T/√d_k)V. That's it. This equation alone created trillions in AI market value. Every word calculates relevance with every other word. "Apple" + "stock" = company. "Apple" + "pie" = fruit. But they didn't stop at one attention mechanism.
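The formula fits in a few lines of NumPy. This is a minimal sketch of scaled dot-product attention (the variable names are mine, not from the paper): score every word against every other word, turn the scores into probabilities, and mix the value vectors accordingly.

```python
import numpy as np

def softmax(x, axis=-1):
    # Subtract the row max for numerical stability, then normalize.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def attention(Q, K, V):
    # Q, K, V: (seq_len, d_k) matrices of query/key/value vectors.
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)     # every word scored against every other word
    weights = softmax(scores, axis=-1)  # each row sums to 1: a relevance distribution
    return weights @ V                  # weighted mix of value vectors

# Toy self-attention: 3 "words", 4-dimensional vectors,
# with queries, keys, and values all taken from the same sequence.
rng = np.random.default_rng(0)
Q = rng.standard_normal((3, 4))
out = attention(Q, Q, Q)
print(out.shape)  # (3, 4)
```

The √d_k divisor keeps the dot products from blowing up as vectors get longer, which would otherwise push the softmax into regions with vanishing gradients.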


Eight attention mechanisms ran in parallel. One tracked grammar. Another found subject-verb connections. A third linked pronouns. The other five caught different meaning patterns. All simultaneously. When tested, it broke every record.


Best translation model: 26.3 BLEU, weeks to train. Their Transformer: 28.4 BLEU, just 3.5 days. A 2-point jump is like going from dial-up to broadband. 10x faster training. But OpenAI saw something in those pages that even Google missed.


OpenAI made one surgical change that created ChatGPT. The original Transformer had an encoder (understands text) and a decoder (generates text). OpenAI threw away the encoder entirely. Just kept the decoder. Why would removing half the system make it better?


Encoder-decoder translation needs paired data - an English sentence matched with its German translation. Decoders only need raw text, potentially the entire internet. Just predict the next word - no translations needed. OpenAI turned Google's translation machine into a universal intelligence engine.
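The decoder-only trick is a causal mask: block each word from seeing the future, so "predict the next word" becomes a valid training objective on plain text. A minimal sketch (again with my own toy variable names):

```python
import numpy as np

def causal_self_attention(X):
    # X: (seq_len, d). Mask future positions so each word attends only to
    # itself and earlier words - exactly what next-word prediction requires.
    seq_len, d = X.shape
    scores = X @ X.T / np.sqrt(d)
    future = np.triu(np.ones((seq_len, seq_len), dtype=bool), k=1)
    scores[future] = -np.inf            # softmax turns -inf into zero weight
    e = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights = e / e.sum(axis=-1, keepdims=True)
    return weights @ X

X = np.random.default_rng(2).standard_normal((4, 8))
W = causal_self_attention(X)
print(W.shape)  # (4, 8)
```

Note the first position can only attend to itself, so its output is just its own value vector - the model literally cannot cheat by peeking ahead.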

Anthropic took transformers and made them "safe." First, they had Claude critique its own outputs. "Am I being harmful? Biased? Lying?" The AI argues with itself about ethics before answering you. They called it Constitutional AI. But that wasn't enough.


Then came RLHF: humans rate Claude's responses, and the model is tuned toward the preferred ones. Do this millions of times, and the transformer learns what humans actually want. Same short-paper architecture underneath. But Meta went even further.

Meta spent millions training LLaMA, with supercomputers running 24/7 for months. Then they released the actual AI brain - the model weights themselves. Small (7B), medium (13B), large (70B) versions. You could run AI locally on your laptop. But why give away $100M models?


Zuck's play: let 100,000 developers improve LLaMA. They debug it, optimize it, and build tools. Meta gets all the innovations back. While Google and OpenAI charge fees, Meta built an army of unpaid developers. Genius move? I don't know.

Today, transformers power everything:
ChatGPT: decoder-only transformer
Claude: standard transformer
DALL-E: vision transformer
Copilot: code transformer
Same architecture. Different products.

Thanks for making it to the end! I'm Alex, co-founder at ColdIQ. Built a $6M ARR business in under 2 years. We're a remote team across 10 countries, helping 400+ businesses. Here's how I make $450k+ every month with AI: https://tinyurl.com/5n79rd5w

RT the first tweet if you found this thread valuable. Follow me @itsalexvacca for more threads on outbound and GTM strategy, AI-powered sales systems, and how to build profitable businesses that don't depend on you. I share what worked (and what didn't) in real time.

