
Following a trend established by tech companies last year, Samsung's latest Galaxy Unpacked event was heavy on promises of what AI technologies will bring us... sometime later.
Although this presentation revolved around the reveal of new Samsung S25 phones -- with a surprise thin S25 Edge teased at the end -- most of the time was spent on AI technologies that are still out of reach of anyone not working in research labs.
To be sure, not all the announced AI features are hype. The ProScaler screen technology uses AI to improve image quality ("by up to 40%") by giving photos what looks like a live sharpening pass. Best Face can pull from frames captured around a photo to replace things like someone's eyes mid-blink with a version where their eyes are open.
Both features appear to be shipping with the Samsung Galaxy S25 models starting Feb. 5. However, those features were buried deep in the presentation, surfacing only when Samsung finally got around to spotlighting the actual phones.
Here are some technologies designed to convince you to buy a new phone to run them someday (hopefully) soon.
For more on Samsung Galaxy Unpacked, see how Samsung could beat Apple to market with a slim phone and all the other AI features in the S25 series.
A way for the phone to see itself
Currently, the key AI feature everyone is running toward is "agentic systems," an umbrella term that refers to AI knowing more about you and your preferences so it can do things for you. To quote Demis Hassabis, Google DeepMind CEO, in a prerecorded video shown at the event, "In the coming months, you'll be able to ask Gemini to reason about the things you see, whether it's on your phone or the world around you."
One part of that is Screen Sharing, a way for the phone to analyze what's being displayed on the screen. In the example given, a person has asked the phone whether a pair of jeans they're shopping for would be a good fit. It reads the size chart and description from the web page and replies, "Based on the size chart, a medium is equivalent to a size 38. However, since the jeans have a relaxed fit, you might want to consider sizing down to a 36 for a slightly more fitted look."
Good advice, but seemingly something the person could deduce by reading the sizes (which for some reason were all in French).
(Why Samsung chose the generic term "screen sharing" is a mystery. That's already a thing people understand as being able to view someone else's screen, such as during a Zoom presentation or training.)
Understanding what's in a live or recorded video
Live Video Streaming is more impressive, with the AI analyzing what's happening in a video. In this example, someone is working dough and wants to know if it's in good shape. "Your dough looks smooth and elastic, which is a good sign," says the AI, and then suggests a baking technique called the windowpane test to confirm the dough's readiness.
It would have been more impressive if the dough were not ready, though. I want an AI that can look at dough that still needs more work and identify that perhaps five more minutes of kneading or the addition of more flour would be the solution.
Android XR interactions
Android XR is the software powering devices like mixed-reality headsets and glasses, which CNET's Scott Stein sampled in December. The software is still in an early form for developers, so it's no surprise that Jay Kim, Samsung EVP and head of customer experience, said, "Multimodal AI will transform how we interact with all our devices. ... Interactions will be more natural and intuitive, and we can't wait to share more details later this year."
What did get an unannounced physical reveal was Project Moohan, a Samsung headset that looks an awful lot like Apple's Vision Pro. It wasn't mentioned in the presentation and was only on display for attendees. Scott was able to try it out in December, but he couldn't take any photos or video of it at the time. Now it (or at least a prototype) exists publicly.
The digital assistant they wish you'll have
It's one thing for software to identify what's on your phone screen or in a video, but the larger goal is to tie that observed information together with what the AI knows about you. Samsung calls this the Personal Data Engine, and Professor Ian Horrocks of the University of Oxford and Oxford Semantic Technologies, which is partnering with Samsung, explained how it works.
The Personal Data Engine is fueled by Knowledge Graph technology that "understand[s] a user's experience along with the surrounding context and storing it in the user's personal knowledge graph," according to Horrocks. "This level of integration is revolutionary. It will change the way users interact with their devices for an experience that is enhanced overall and, importantly, is more personal."
It will. It doesn't do that now. But that's the goal.
The S25 Edge of tomorrow
If we're looking at future technologies, we can't leave out the sneaky tease of a new ultrathin Samsung S25 Edge phone, which was given a name, a few exploded component views and no release date. Attendees of the event got to see some mockups suspended from wires.
Though the S25 Edge isn't an AI feature -- I'm sure it will be festooned with AI, especially if it's using the same Snapdragon 8 processor as the other S25 models -- it's fitting that the event's surprise reveal is coming at some unspecified future date.

