200 words a day = 1 novel/year
- Mar 27, 2020
Stephen Wolfram explores the broader picture of what's going on inside ChatGPT and why it produces meaningful text. He discusses models, training neural nets, embeddings, tokens, transformers, and language syntax.
I got super excited by this article because I finally found a description that I could understand, and one that I think many people will be able to follow. It sent me back to my school days of Markov chains and made the topic relatable. I think it's well worth a read.
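For anyone who wants the school-days flavour of that Markov chain connection, here's a minimal sketch (my own illustration, not from Wolfram's article): a word-level chain that records which words follow which, then generates text by repeatedly picking a random successor.

```python
import random
from collections import defaultdict

def build_chain(text):
    """Map each word to the list of words observed following it."""
    words = text.split()
    chain = defaultdict(list)
    for a, b in zip(words, words[1:]):
        chain[a].append(b)
    return chain

def generate(chain, start, length=10, seed=None):
    """Walk the chain from `start`, choosing each next word at random."""
    rng = random.Random(seed)
    word, out = start, [start]
    for _ in range(length - 1):
        followers = chain.get(word)
        if not followers:
            break  # dead end: no word was ever seen after this one
        word = rng.choice(followers)
        out.append(word)
    return " ".join(out)

corpus = "the cat sat on the mat the cat ate the rat"
chain = build_chain(corpus)
print(generate(chain, "the", length=8, seed=0))
```

ChatGPT's "predict the next token" loop is the same shape, except the lookup table is replaced by a neural net that scores every possible next token from the whole preceding context.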