
Most wearable AI devices at MWC 2026 wanted to put information in front of your eyes. Scople went the other direction entirely.
Built by startup Moxiebyte, this pocket-sized gadget uses computer vision to analyze what’s happening around you in real time, from the emotional reactions of people you’re talking to, to whether you’re reaching for junk food again.
The twist, according to Moxiebyte, is what happens after it processes all that visual data: nothing gets saved. Not a single frame.
How the Scople AI wearable reads the room
The hardware is small enough for daily carry, with USB-C and wireless charging that keeps it topped off fast. What it does with its onboard camera is the genuinely interesting part. Scople tracks attention, glances, and emotional responses from people around you during conversations. It can identify dietary patterns by visually analyzing what you eat throughout the day. And it monitors your surroundings to surface data-driven feedback on things like outdoor activity and environmental awareness that most people wouldn’t catch on their own.
Moxiebyte calls this “self-awareness through data,” and the framing clicks once you think about it. The device isn’t trying to augment your vision or replace your phone’s camera. It treats the visual world as a raw data source, pulling behavioral and environmental signals and surfacing them as statistical feedback. Think of it less like a camera and more like a passive sensor that happens to have eyes.
Scople’s privacy play: analyze locally, delete everything
Here’s where Scople gets smart about its own biggest liability. Moxiebyte says it doesn’t transmit original visual data to any server. It doesn’t store the images it captures. Every frame gets analyzed locally and then discarded, with only the statistical output kept. In a product category already dealing with public side-eye over always-on cameras, Moxiebyte built its entire pitch around the idea that the device sees everything but keeps nothing.
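The pattern Moxiebyte describes, processing each frame on-device and keeping only aggregate statistics, can be sketched in a few lines. This is purely illustrative: the class names, the brightness-threshold "analysis," and the counters are all hypothetical stand-ins, not anything Moxiebyte has published about Scople's actual pipeline.

```python
from dataclasses import dataclass


@dataclass
class SessionStats:
    """Aggregate-only output: no pixels are retained, just counters."""
    frames_seen: int = 0
    glances_detected: int = 0


def analyze_frame(frame: list[list[int]]) -> bool:
    """Hypothetical stand-in for an on-device vision model.

    Here a 'glance' is just mean brightness above a threshold;
    a real device would run a trained model instead.
    """
    pixels = [p for row in frame for p in row]
    return sum(pixels) / len(pixels) > 128


def process_stream(frames, stats: SessionStats) -> SessionStats:
    """Consume frames one at a time; only statistics leave this function."""
    for frame in frames:
        if analyze_frame(frame):
            stats.glances_detected += 1
        stats.frames_seen += 1
        del frame  # drop the reference: the raw frame is never stored
    return stats
```

The key property of this shape is that `process_stream` never appends a frame to any collection, so once the loop moves on, the image data is gone and only the counters survive, which is the "sees everything but keeps nothing" claim in miniature.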

That approach won’t satisfy every privacy skeptic, but it’s a meaningfully different starting point from what companies like Humane and Meta have offered with their own wearable camera products. Whether it holds up under real-world scrutiny is a question for later. For now, the architecture is designed with deletion as the default, and that’s worth paying attention to.
Where Scople fits among AI wearables at MWC 2026
MWC 2026 had no shortage of companies trying to figure out what a wearable AI device should actually do. Motorola showed Project Maxwell, a pendant-style device. Qualcomm announced the Snapdragon Wear Elite platform to power the next wave of AI-capable wearables. The category is clearly heating up, but the approaches couldn’t be more different.

Scople sits in a strange, interesting corner of that landscape. It isn’t trying to be a voice assistant you wear around your neck or a notification relay on your wrist. It’s a computer vision device that processes the social and environmental world around you and hands back data you didn’t know you wanted. Whether that use case resonates beyond the early-adopter crowd depends entirely on how well the real-time analysis actually works, and that’s something a Kickstarter campaign and first production run will have to prove.
From MWC 2026 to Kickstarter
Moxiebyte secured $2 million in pre-seed funding by August 2024. Co-founder and COO Anna Melnyk led demonstrations at 4YFN, the startup-focused program running alongside MWC Barcelona, showing the device from their booth. The company is positioning Scople as what it calls the “first true wearable AI device” built for all-day, everyday use.

The next milestone is a Kickstarter campaign in mid-April 2026 to fund initial production. Pricing hasn’t been confirmed yet, so those details should surface closer to launch. What Moxiebyte is really betting on is a simple idea with complicated execution: the next generation of wearable AI shouldn’t sit on your wrist and count steps. It should understand the world happening around you, and it should be able to do that without building an archive of everything it’s seen.
