What OpenAI, and more importantly the beta testers with access to GPT-3 and other models, have been able to achieve continues to amaze and, in many cases, unexpectedly delight us. GPT-3 is a 175-billion-parameter transformer model, the third such model released by OpenAI, trained on hundreds of billions of words of text. OpenAI researchers first published the paper introducing GPT-3 in May 2020, and what started out as a stream of impressive use cases shared on Twitter has rapidly become a hotbed of startup activity. Companies have been built on top of GPT-3, using the model to generate emails and marketing copy, to power an interactive nutrition tracker or chatbot, and more. As early-stage technology investors, we are inspired to see AI broadly, and natural language processing in particular, become more accessible through the next generation of large-scale transformer models like GPT-3.

Article curated by RJ Shara from Source.