Archived. Due to the fast pace of AI, this page has been superseded. The information below is out of date and is provided for archival purposes only. For the most up-to-date view of post-2020 AI, get:
The Memo.

This page fills in details about Replika: its abilities (or lack thereof), its datasets, and other information.


Replika has an estimated 20 million users.

Available on Android, iOS, web, and Meta/Oculus Quest.

Replika's smarts have gone through many silent iterations, including a change of underlying platform from GPT-3 to a fine-tuned GPT-2.

As of Feb/2022, Replika uses GPT-2 XL (1.5 billion parameters), which is less than 1% of the size of GPT-3 (175 billion parameters).

I have previously asserted that chatbot models need a MINIMUM of 100B parameters (Google LaMDA; Leta/Emerson/GPT-3) to hold effective and informed conversations.
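The size gap can be made concrete with a quick back-of-envelope calculation (a sketch only; it assumes GPT-3's largest public model at 175B parameters and takes the 100B figure as the suggested minimum above):

```python
# Back-of-envelope check of the model-size claims above (assumed figures).
gpt2_xl = 1.5e9      # GPT-2 XL parameters (Replika, as of Feb/2022)
gpt3 = 175e9         # GPT-3 (largest model) parameters
threshold = 100e9    # suggested minimum for effective chatbot conversation

pct = gpt2_xl / gpt3 * 100
print(f"GPT-2 XL is {pct:.2f}% of GPT-3's size")  # ~0.86%, i.e. under 1%
print(f"Gap to the 100B threshold: {threshold / gpt2_xl:.0f}x")  # ~67x
```

In other words, Replika's model would need to be roughly 67 times larger just to reach the suggested 100B-parameter floor.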

For a 20% discount on Emerson AI (which uses GPT-3 and other large models), join The Memo and receive a code in your first email!


GPT-2 was originally trained on popular websites (outbound links from Reddit); see more on model datasets.

Replika is also trained on “dialogue from Twitter”.

Source. Backup.

Get The Memo

by Dr Alan D. Thompson · Be inside the lightning-fast AI revolution.
Bestseller. 10,000+ readers from 142 countries. Microsoft, Tesla, Google...
Artificial intelligence that matters, as it happens, in plain English.
Get The Memo.

Dr Alan D. Thompson is an AI expert and consultant, advising Fortune 500s and governments on post-2020 large language models. His work on artificial intelligence has been featured at NYU, with Microsoft AI and Google AI teams, at the University of Oxford’s 2021 debate on AI Ethics, and in the Leta AI (GPT-3) experiments viewed more than 4.5 million times. A contributor to the fields of human intelligence and peak performance, he has held positions as chairman for Mensa International, consultant to GE and Warner Bros, and memberships with the IEEE and IET. Technical highlights.

This page last updated: 9/Feb/2024.