Microsoft unveils AI Copilot for its 365 apps

Copilot can write a proposal for you in Word and then turn it into a PowerPoint presentation.

Microsoft has just unveiled Copilot, a new AI tool for its Microsoft 365 apps, including Word, Excel, and PowerPoint.

“Copilot marks a new era of computing that will fundamentally transform the way we work,” said Jared Spataro, corporate VP of modern work and business applications at Microsoft, during a virtual event dubbed “The Future of Work With AI.”

Clippy 2.0: Microsoft’s Copilot is based on large language models (LLMs), which are capable of responding to natural language — the kind we humans use to communicate with one another — and generating their own easy-to-understand text in response.

How Copilot uses these abilities depends on the Microsoft 365 application.

If you’re using Word, for example, you can ask Copilot to draft a proposal based on some notes and a list of products — these reference docs can be uploaded right in the prompt. When Copilot is done, you can give it a previous proposal and tell it to format the new one similarly.

In Excel, you might ask Copilot to identify and summarize three trends in a spreadsheet highlighting your company’s sales, and in PowerPoint, you can ask it to create a 10-slide presentation based on the proposal you created in Word.

Microsoft also unveiled Business Chat, a chatbot accessible in all of the 365 apps, during the virtual event. It can access the data in your calendar, emails, chats, and more, which can be useful if you want a summary of everything that happened with a specific client that week.

Microsoft says it plans to announce pricing for Copilot “soon,” and for now, it is strictly limiting who has access to the AI.

“We are currently testing Microsoft 365 Copilot with 20 customers, including 8 Fortune 500 enterprises,” wrote Colette Stallbaumer, General Manager of Microsoft 365 and Future of Work.

Helper bots: A couple of phrases crop up repeatedly in the Copilot and Business Chat demos Microsoft shared: “Remember to check for accuracy” and “AI-generated content may be incorrect.”

That’s because while LLMs are adept at producing human-sounding language, they’re also prone to “hallucinating” — writing things that sound factual, but aren’t.

Microsoft says Copilot is built in part on GPT-4, an LLM developed by OpenAI, the AI research firm Microsoft invested $10 billion in in January 2023. While GPT-4 is less likely to hallucinate than the model currently powering the popular ChatGPT, it’s not immune to the problem.

“Sometimes, Copilot will get it right,” said Spataro. “Other times it will be usefully wrong, giving you an idea that’s not perfect, but still gives you a head start.”

It seems likely that there will also be times Copilot is unhelpfully wrong, though, and dealing with those missteps is going to be the tradeoff of using the AI — yes, Copilot can help you at work, but only with proper supervision, sort of like a green but eager intern.

We’d love to hear from you! If you have a comment about this article or if you have a tip for a future Freethink story, please email us at [email protected].