
The dawn of self-aware software

Windows 11’s new Copilot may not be magical, but the idea of using AI to teach software how to use itself is worth exploring.

[Source image: OsakaWayne Studios/Getty Images]

By Harry McCracken · 4 minute read

Greetings to you and thanks for spending time with Plugged In, Fast Company’s weekly report from the world of tech. I’m global technology editor Harry McCracken, and it’s always nice to see you. If a friend or colleague forwarded this edition to you—or you’re reading it on FastCompany.com—you can check out previous issues and sign up to get it yourself every Wednesday morning. You’ve been sending me some great emails with ideas and feedback lately: Keep them coming to hmccracken@fastcompany.com.


Last week, when I was visiting New York City to help host Fast Company’s Innovation Festival, I played hooky on Thursday morning to attend a media event that Microsoft happened to be holding in town. It was packed with news—everything from details on when new AI-infused versions of apps such as Word and Excel will be available (November 1, at least if you’re an enterprise customer willing to pay extra) to the unveiling of two new Surface laptops.

But among all the elements that CEO Satya Nadella and others covered onstage, one captured my imagination. It was Windows 11’s new Copilot, which is currently rolling out, and the way it can respond to typed commands such as “turn on dark mode” and “play something to help me focus.”

It’s not that Microsoft’s examples were that huge a whoop in themselves. When I gushed a little about the Copilot on Leo Laporte’s This Week in Tech podcast, Laporte was unimpressed. Being able to use a chatbot to turn on dark mode, he said, is not exactly an epoch-shifting moment in tech history. He has a point—though I must confess that I wasn’t sure how to do it on my own, perhaps because the option is buried several layers deep in the settings app.

Even then, it’s less the specifics of what Microsoft showed than what it could portend that got me thinking. The fact that Windows 11 even grasps it has a dark mode and can switch it on is tantalizing. What if all the operating systems and apps in our lives had a sense of their full capabilities? What if using them was less about mastering all of those features on our own, and more about telling the OSes and apps what to do?

In other words, what if software were self-aware?

Talking about self-aware software might bring to mind visions of 2001: A Space Odyssey’s HAL or Her’s Samantha—the sort of stuff that would have felt like fantasy until recently. But I’m not thinking of automatons that border on sentience or even a halfway convincing simulation of it. I just want software to understand its own features so I can spend less time thinking about what’s located where and how to get to it. Nobody even has to stick a talking paper clip on the screen as an avatar of such assistance.
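To make that idea a little more concrete, here's a deliberately simple sketch of what such an arrangement might look like. Every name in it is hypothetical, and the keyword matching is a crude stand-in for whatever language model a real assistant like Copilot uses. The point is only the shape: an app declares its own capabilities in one place, and a thin natural-language layer maps a typed request onto the closest match.

```python
# A toy sketch of "self-aware software" (all names here are hypothetical):
# the app registers what it can do in one place, and a thin natural-language
# layer maps a typed request onto the closest capability. A real assistant
# would use an actual language model instead of keyword overlap.

from dataclasses import dataclass
from typing import Callable

@dataclass
class Capability:
    name: str                  # human-readable label
    keywords: set[str]         # crude stand-in for real intent recognition
    action: Callable[[], str]  # what actually happens when invoked

class CapabilityRegistry:
    def __init__(self) -> None:
        self._capabilities: list[Capability] = []

    def register(self, name: str, keywords: set[str], action: Callable[[], str]) -> None:
        self._capabilities.append(Capability(name, keywords, action))

    def handle(self, request: str) -> str:
        words = set(request.lower().split())
        # Pick the capability whose keywords overlap the request the most.
        best = max(self._capabilities, key=lambda c: len(c.keywords & words), default=None)
        if best is None or not (best.keywords & words):
            return "Sorry, I don't know how to do that yet."
        return best.action()

registry = CapabilityRegistry()
registry.register("dark mode", {"dark", "mode", "theme"}, lambda: "Dark mode is on.")
registry.register("focus sounds", {"focus", "concentrate", "play"}, lambda: "Playing focus sounds.")

print(registry.handle("turn on dark mode"))                # Dark mode is on.
print(registry.handle("play something to help me focus"))  # Playing focus sounds.
```

The interesting design choice isn't the matching; it's that the features are enumerated at all, so the assistant never has to guess what the software can do.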

Using AI in this way would address the software industry’s long-standing failure to adequately explain how to use its own products. Built-in help systems are shallow resources at best, and meaty printed manuals are a thing of the distant past. True, there’s valuable information out there in support forums and other web destinations—but when I use Google to hunt for it, I find that the results are increasingly dominated by links to companies trying to sell me products.

Even if figuring out how to turn on Windows 11 dark mode by myself isn’t exactly an impossible dream, there are plenty of features in all the software I use that straddle the line between being a tad intimidating and completely impenetrable. For example, now that I own an iPhone 15 Pro with an action button, I’m excited about customizing it using iOS’s shortcuts feature—but I’ve never gotten around to mastering the intricacies of building a sophisticated shortcut from scratch. An iOS upgrade that crafted shortcuts based on my instructions would feel pretty magical.

And have I mentioned that my new HP OfficeJet printer, which I complained about in last week’s newsletter, isn’t even visible on my network at the moment? Rather than blocking off time to troubleshoot it myself, I’d much prefer that it solved its own problems.

In part, I’m excited about the potential of self-aware software because of all the things it wouldn’t be. AI-powered search-engine substitutes such as ChatGPT, Bing Chat, and Google Bard have a devious tendency to spout fabrications that sound like they might be true. Wondering whether the email I’m responding to was ghosted by a computer doesn’t sound like fun. I can’t help but feel that the world may be better off without the ability to generate convincing-looking fake photos with a few clicks. But software that takes more of the heavy lifting of using it off your hands sounds like it should be all upside and no dystopia.

Back to the Copilot in Windows 11: It feels like its ChatGPT-esque interface might be an expedient stopgap rather than the ideal way to deliver the sort of help I envision. At Microsoft’s NYC event, I asked Corporate VP Jared Spataro about the possibility of other kinds of AI experiences that haven’t quite been invented yet. “We do think that in the future you’re going to start to have a little bit more of what you might call a rich user experience that’s not just chat-based; that’s just starting to emerge,” he told me.

Color me intrigued—and remind me to spend some time with the Windows 11 Copilot to see just how useful it is right now.



ABOUT THE AUTHOR

Harry McCracken is the global technology editor for Fast Company, based in San Francisco. In past lives, he was editor at large for Time magazine, founder and editor of Technologizer, and editor of PC World.

