The most impressive language generator yet

OpenAI’s GPT-3 language generator is lighting up the internet.

OpenAI’s new machine learning language generator, GPT-3, is currently an internet darling. With a catholic knowledge of the English canon, drawn from the vast corners of the internet, the AI can generate writing that, at times, reads as well as anything a human could compose.

It’s a considerable step beyond AI’s usual toying with language, and perhaps a hint that machine learning could soon assume powerful roles in writing, much like those it is assuming in medicine and robotics.

GPT-3 is the most powerful language generator yet made, according to MIT’s Technology Review. Inside any neural network’s black box are parameters, the adjustable values the model tunes during training to encode what it has learned: GPT-2, released last year, had a massive 1.5 billion of them; GPT-3 has 175 billion, one of those hard-to-comprehend numbers.

Sucking its training data, Charybdis-like, into its maw, GPT-3 can now spit out, based on all it has “read,” whatever words should follow a given prompt.

Styles and Shortcomings

GPT-3 uses its vast dataset to make a mathematical prediction about which words, and in what order, will best complete a given prompt. Some of the results have been deeply impressive, despite the ghostly quality that AI writing always seems to carry.
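To make that concrete, here is a minimal toy sketch, in Python, of the autoregressive idea behind such models. The probability table and the complete() helper are invented for illustration; GPT-3 learns statistics like these implicitly, over its entire vocabulary and with 175 billion parameters, rather than from a hand-written lookup table.

    import random

    # Toy next-word model: each short context maps to candidate next
    # words with made-up probabilities. A real language model learns
    # these statistics from its training data instead.
    NEXT_WORD_PROBS = {
        "it was a": [("bright", 0.5), ("dark", 0.3), ("cold", 0.2)],
        "was a bright": [("cold", 0.6), ("morning", 0.4)],
    }

    def complete(prompt, steps=2):
        """Autoregressive generation: repeatedly sample the next word,
        append it, and feed the longer text back in as the new context."""
        words = prompt.lower().split()
        for _ in range(steps):
            context = " ".join(words[-3:])  # this toy model sees only 3 words
            candidates = NEXT_WORD_PROBS.get(context)
            if not candidates:
                break  # context unknown to the toy table; stop generating
            choices, weights = zip(*candidates)
            words.append(random.choices(choices, weights=weights)[0])
        return " ".join(words)

    print(complete("It was a"))  # e.g. "it was a bright cold"

Scale that loop up to a transformer scoring every word in its vocabulary at each step, and you have the basic shape of what GPT-3 does with a prompt.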

If given a prompt with a sci-fi bent (say, the opening line of George Orwell’s 1984, supplied by the Guardian‘s Alex Hern), it will return a suitably sci-fi result.

“It was a bright cold day in April, and the clocks were striking thirteen,” Orwell begins.

“I was in my car on my way to a new job in Seattle. I put the gas in, put the key in, and then I let it run,” the language generator continued. “I just imagined what the day would be like. A hundred years from now. In 2045, I was a teacher in some school in a poor part of rural China. I started with Chinese history and history of science.”

It’s a hair disjointed, but read all together, it does feel like a particularly promising freshman comp assignment.

People playing with GPT-3 have prompted it to produce poetry and prose in the styles of particular authors, compose music, and write code. It’s even written an article about itself. (Your correspondent will fight an AI to keep his job.)

Little wonder that the Twitter cage is rattling.

GPT-3 is not a perfect mimic, however, nor an artificial general intelligence. Kevin Lacker’s Turing test of GPT-3 reveals some of its shortcomings.

The language generator has little issue predicting answers to trivia-style questions; it knows who won the 1995 World Series, how many eyes a giraffe has, and the human life expectancy in the United States.

It proved pretty good at answering common-sense questions, too, knowing that an elephant is heavier than a mouse and that a pop can is heavier than a paperclip. But give it a prompt that’s a bit odd (Lacker asked whether a pencil is heavier than a toaster) and you can trip it up.

(GPT-3’s response: the pencil is heavier. This is likely because the literature on comparing pencils and toasters is probably pretty … light.)

Interestingly, it won’t admit that it is wrong, and it won’t call out a question that’s nonsensical. When asked how to sporgle a morgle, it replied “with a sporgle” (uh, duh).

But even OpenAI co-founder Sam Altman tried to temper things a bit.

“The GPT-3 hype is way too much,” he tweeted. “It’s impressive (thanks for the nice compliments!) but it still has serious weaknesses and sometimes makes very silly mistakes. AI is going to change the world, but GPT-3 is just a very early glimpse. We have a lot still to figure out.”

Still, while it’s true that GPT-3 is not an artificial general intelligence, or even intelligent in any philosophical sense (“Music is the most advanced form of mathematics” is a sophomoric, one-blunt insight, GPT-3), that does not mean it’s as simple as a chatbot toy.

With the ability to write, with varying levels of convincingness, everything from fiction to music, poetry to technical information, journalism to code, GPT-3 could prove to be a powerful tool. Any task that requires the written word could be augmented, or even automated, by a high-quality language generator.

And GPT-3 might just be the lede.
