The most impressive language generator yet

OpenAI’s GPT-3 language generator is lighting up the internet.

OpenAI’s new machine learning language generator, GPT-3, is currently an internet darling. With a catholic knowledge of the English canon, drawn from the vast corners of the internet, the AI can generate a piece of writing that, at times, reads as well as anything a human could compose.

It’s a significant step beyond AI’s usual toying with language, and perhaps a hint that machine learning could soon assume powerful roles in writing, much like those it is assuming in medicine and robotics.

GPT-3 is the most powerful language generator yet made, according to MIT’s Technology Review. Inside any neural network’s black box are parameters, the adjustable values the model tunes during training; roughly speaking, more parameters means more capacity to capture patterns in language. GPT-2, released last year, had a massive 1.5 billion parameters; GPT-3, in comparison, has 175 billion, one of those hard-to-comprehend numbers.

Sucking its training data, Charybdis-like, into its maw, GPT-3 can now spit out, based on all it has “read,” whatever words should most plausibly follow a given prompt.

Styles and Shortcomings

GPT-3 uses its vast training data to make a mathematical prediction, word by word, of which words will best complete a given prompt. Some of the results have been deeply impressive, despite that ghostly quality AI writing always seems to carry.
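To get a feel for what that prediction loop looks like, here is a minimal Python sketch. GPT-3 itself is reachable only through OpenAI’s API, so the sketch stands in its smaller, freely downloadable predecessor, GPT-2, via Hugging Face’s transformers library; the mechanism, predict a likely next word, append it, and repeat, is the same.

# A rough sketch, not GPT-3 itself: GPT-2 running the same
# next-word prediction loop, via Hugging Face's transformers library.
from transformers import pipeline

generator = pipeline("text-generation", model="gpt2")

# Any prompt works; the model continues whatever text it is given.
prompt = "The new machine learning model is impressive because"

# The model predicts a probable next token, appends it, and repeats;
# that loop is all "completing a prompt" means here.
result = generator(prompt, max_new_tokens=40, num_return_sequences=1)
print(result[0]["generated_text"])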

If given a prompt with a sci-fi bent — say, the opening line to George Orwell’s 1984, from the Guardian’s Alex Hern — it will return a suitably sci-fi result.

“It was a bright cold day in April, and the clocks were striking thirteen,” Orwell begins.

“I was in my car on my way to a new job in Seattle. I put the gas in, put the key in, and then I let it run,” the language generator continued. “I just imagined what the day would be like. A hundred years from now. In 2045, I was a teacher in some school in a poor part of rural China. I started with Chinese history and history of science.”

It’s a hair disjointed, but read as a whole, it does feel like a particularly promising freshman comp assignment.

People playing with GPT-3 have prompted it to produce poetry and prose in the styles of particular authors, compose music, and write code. It’s even written an article about itself. (Your correspondent will fight an AI to keep his job.)

Little wonder that the Twitter cage is rattling.

GPT-3 is not a perfect mimic, or an artificial general intelligence, however. Kevin Lacker’s Turing test of GPT-3 reveals some of its shortcomings.

The language generator has little trouble predicting answers to trivia-style questions; it knows who won the 1995 World Series, how many eyes a giraffe has, and the human life expectancy in the United States.

It proved pretty good at answering common-sense questions, too, knowing that an elephant is heavier than a mouse and that a pop can is heavier than a paperclip. But give it a prompt that’s a bit odd — Lacker asked, is a pencil heavier than a toaster? — and you can trip it up.

(GPT-3’s response: the pencil is heavier. This is likely because the literature on comparing pencils and toasters is probably pretty … light.)

Interestingly, it won’t admit when it is wrong, and it won’t call out your question if it’s nonsensical. When asked how to sporgle a morgle, it replied “with a sporgle” (uh, duh).
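Lacker posed his questions through OpenAI’s API using a plain question-and-answer prompt. As a rough sketch, assuming the 2020-era openai Python library (its Completion endpoint and bare “davinci” engine have since been retired), a call looked something like this:

import os
import openai

# 2020-era SDK usage; the Completion endpoint is now deprecated.
openai.api_key = os.environ["OPENAI_API_KEY"]

# Q&A-style prompt, in the spirit of Lacker's Turing test.
prompt = "Q: How do you sporgle a morgle?\nA:"

response = openai.Completion.create(
    engine="davinci",  # the original GPT-3 engine
    prompt=prompt,
    max_tokens=20,
    temperature=0,     # keep the answer as deterministic as possible
    stop="\n",         # stop at the end of the answer line
)
print(response.choices[0].text.strip())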

But even OpenAI’s co-founder and CEO, Sam Altman, tried to temper things a bit.

“The GPT-3 hype is way too much,” he tweeted. “It’s impressive (thanks for the nice compliments!) but it still has serious weaknesses and sometimes makes very silly mistakes. AI is going to change the world, but GPT-3 is just a very early glimpse. We have a lot still to figure out.”

Still, while it’s true that GPT-3 is not an artificial general intelligence — or even intelligent, in, like, a philosophical sense (“Music is the most advanced form of mathematics” is a sophomoric, one-blunt insight, GPT-3) — that does not mean it’s as simple as a chatbot toy.

With the ability to write, with varying degrees of convincingness, everything from fiction to music, poetry to technical information, journalism to code, GPT-3 could prove to be a powerful tool. Any task that requires the written word could be augmented — or even automated — with a high-quality language generator.

And GPT-3 might just be the lede.
