Waiting for Amazon Olympus? Check out Google DeepMind Gemini…
An independent report by Alan D. Thompson (18 pages, incl. title page, references, and appendix).
| Attribute | Expectation |
|---|---|
| Dataset size (tokens) | Alan expects: 40T (around 80TB). See also: Gemini |
| Training data end date | Alan expects: Oct/2023 |
| Training start date | Alan expects: Nov/2023 |
| Training end/convergence date | Alan expects: May/2024 |
| Training time (total) | ~25,000 H100s/A100s for ~120 days (compare with GPT-4 @ ~25,000 A100s for ~90 days, and GPT-3 @ ~1,024 A100s for 34 days) |
| Release date (public) | Alan expects: Aug/2024 |
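The dataset-size and training-time figures in the table above can be cross-checked with a quick back-of-envelope calculation. This is a sketch only: the ~2 bytes-per-token conversion and the use of raw GPU-days as a compute proxy (ignoring H100 vs A100 throughput differences) are my assumptions, not figures from the report.

```python
# Back-of-envelope cross-check of the table's figures.
# Assumptions (not from the source): ~2 bytes per token for the
# token-count -> storage conversion, and GPU-days as a crude
# compute proxy (H100 vs A100 throughput differences ignored).

def dataset_size_tb(tokens: float, bytes_per_token: float = 2.0) -> float:
    """Convert a token count to an approximate raw text size in terabytes."""
    return tokens * bytes_per_token / 1e12

def gpu_days(gpus: int, days: int) -> int:
    """Total GPU-days for a training run (count of GPUs x days of training)."""
    return gpus * days

olympus_data_tb = dataset_size_tb(40e12)  # 40T tokens -> ~80 TB
olympus_compute = gpu_days(25_000, 120)   # ~3,000,000 GPU-days
gpt4_compute = gpu_days(25_000, 90)       # ~2,250,000 GPU-days
gpt3_compute = gpu_days(1_024, 34)        # ~34,816 GPU-days

print(olympus_data_tb)                    # 80.0
print(olympus_compute / gpt4_compute)     # ~1.33x GPT-4's GPU-days
```

On these rough assumptions, the expected run would use roughly a third more GPU-days than GPT-4's, and the 40T-token dataset lines up with the ~80TB figure quoted above.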
2nd gen (Titan): the Titans were second-generation deities who lived on Mount Othrys. The Titans were gigantic compared to the Olympians.
3rd gen (Olympus): the Olympians were third-generation gods who occupied Mount Olympus. The Olympians outnumbered the Titans, which resulted in their victory.
2023–2024 optimal language model size highlights
Permissions: Yes, you can use these visualizations anywhere; please leave the citation intact.
Models table
Summary of current models: view the full data (Google Sheets).
Timeline to Olympus
| Date | Event |
|---|---|
| 28/Sep/2023 | Titan embeddings released. |
| 8/Nov/2023 | Olympus plans leaked. |
Read more about Alan’s conservative countdown to AGI…
Dr Alan D. Thompson is an AI expert and consultant, advising Fortune 500s and governments on post-2020 large language models. His work on artificial intelligence has been featured at NYU, with Microsoft AI and Google AI teams, at the University of Oxford’s 2021 debate on AI Ethics, and in the Leta AI (GPT-3) experiments viewed more than 4.5 million times. A contributor to the fields of human intelligence and peak performance, he has held positions as chairman for Mensa International, consultant to GE and Warner Bros, and memberships with the IEEE and IET.
This page last updated: 26/Nov/2023. https://lifearchitect.ai/olympus/