Humane’s Ai Pin, which debuts today, is the most hyped piece of hardware in recent memory.
With $240 million in funding from luminaries including Salesforce CEO Marc Benioff and OpenAI CEO Sam Altman, the device attaches to your lapel with magnets, listens to your requests like Siri, and will search the internet, translate your speech, or project an interface right onto your hand.
The device was revealed after five years of stealth development in a dramatic TED talk last May, in which Humane cofounder Imran Chaudhri waxed poetic about the need for “technology that disappears.” Then in October, we got our first full look at the hardware during Coperni’s Paris Fashion Week show—on a garment donned by Naomi Campbell.

Last week, I flew to San Francisco for a meeting ahead of the Ai Pin’s launch. After watching a scripted demo, complete with carefully choreographed jokes, I asked to try the device for myself. At that moment, Chaudhri and his cofounder and spouse, Bethany Bongiorno, glanced at their PR handlers. That wouldn’t be possible, one handler said; Humane planned to let the media go hands-on only after launch.
The Ai Pin is like a tiny smartphone that sits on your lapel instead of in your pocket. It will cost you $699, plus a $24 monthly subscription that gives you a dedicated phone number and unlimited talk, text, data, and cloud storage. The Pin has a camera and an internet connection. Its biggest twist is that you can hold up your hand and it will project an interface onto it. For example, you can see the album you’re listening to on Tidal, with buttons to play and skip. By flicking your wrist and pinching your fingers—on the same hand that serves as the display—you can control the GUI on your skin, toggling through buttons and selecting one with a pinch.

However, the device’s “Laser Ink Display” projector, which Humane says is the smallest and brightest ever built, renders its WarGames-green text as near-illegible—and photos projected onto your hand fare even worse.
That leaves its spoken AI interface, which connects to ChatGPT through Humane’s proprietary onboard AI, as the main way you interact with it. But from what I saw of the demonstration, it honestly didn’t seem much more advanced than using Siri on your iPhone or Apple Watch. A world with no screens might sound lovely, until you actually consider the ramifications of using a modern smartphone solely by talking to it.

