“From talking to OpenAI, GPT-4 will be about 100 trillion parameters…”
— Cerebras in Wired (24/Aug/2021)

At 100T parameters, GPT-4 would be over 500 times larger than GPT-3 (175B parameters).

GPT-4 will also have roughly the same number of parameters (connections) as there are synapses (connections between neurons) in the human brain. (It is estimated that the human brain has 125T synapses connecting 86B neurons.)
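The scale comparisons above can be checked with a few lines of arithmetic. This is a minimal sketch assuming GPT-3 at 175B parameters, the rumoured 100T figure for GPT-4, and the quoted estimate of 125T synapses in the human brain:

```python
# Rough scale comparison, using the figures quoted above.
gpt3_params = 175e9      # GPT-3 parameter count (175 billion)
gpt4_params = 100e12     # rumoured GPT-4 parameter count (100 trillion)
brain_synapses = 125e12  # estimated synapses in the human brain

# GPT-4 vs GPT-3: roughly 571x larger
ratio_gpt4_to_gpt3 = gpt4_params / gpt3_params

# Brain synapses vs GPT-4 parameters: roughly the same order of magnitude
ratio_brain_to_gpt4 = brain_synapses / gpt4_params

print(f"GPT-4 vs GPT-3: {ratio_gpt4_to_gpt3:.0f}x")       # → 571x
print(f"Brain vs GPT-4: {ratio_brain_to_gpt4:.2f}x")      # → 1.25x
```

The 1.25x figure is why "roughly the same number" is a fair characterisation: 100T parameters sits within the same order of magnitude as the estimated 125T synapses.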


Dr Alan D. Thompson is an AI expert and consultant. With Leta (an AI powered by GPT-3), Alan co-presented a seminar called ‘The new irrelevance of intelligence’ at the World Gifted Conference in August 2021. He has held positions as chairman for Mensa International, consultant to GE and Warner Bros, and memberships with the IEEE and IET. He is open to major AI projects with intergovernmental organisations and impactful companies. Contact.

This page last updated: 26/Aug/2021.