Honor this week unveiled its flagship Magic 7 Pro to an international audience following its release in China late last year.
It’s a fantastic smartphone in many regards, offering top-tier screen tech, oodles of power, and a camera setup that includes a 200MP telephoto lens – the highest resolution of its kind on the market right now.
As part of Honor’s big gambit for 2025, the Honor Magic 7 Pro uses a suite of GenAI-powered tools, under the AI Honor Image Engine banner, to try to boost the quality of images captured on the device.
Generally, I’ve been very impressed with the Magic 7 Pro’s camera capabilities. The 50MP main camera may not be the highest resolution around, but with specs including OIS, a large 1/1.3-inch sensor and a variable f/1.4-f/2.0 aperture, it takes a pretty stellar photo, and the same can be said for both the 200MP 3x telephoto and 50MP ultrawide lenses – even as light levels begin to drop.
Instead, the controversy comes with one of the Magic 7 Pro’s big new features: AI Super Zoom, available when zooming in at 30x or above.
More specifically, it’s powered by Honor’s proprietary Telephoto Enhancement Large Model, a 12.4-billion-parameter model trained to enhance telephoto shots by identifying objects and boosting light, texture and colour, performing 127 billion calculations per image before the result is sent back to your phone. What’s not obvious, however, is how your AI Super Zoom shots are actually created.
You might assume that this works in a similar way to other AI-powered zoom enhancement tools seen on the likes of the Oppo Find X8 Pro, simply boosting the detail in your otherwise blurry original shots – but that’s not the case here. In fact, it’s doing something completely different from Oppo and others.
For one, you’ll need an active internet connection to use the feature, as it uploads the image you’ve just taken to the cloud, where it’s processed and ‘enhanced’. That’s not a big deal in itself; Samsung, Xiaomi and other phone makers have all landed on this hybrid on-device-and-cloud AI approach, apparently offering the best of both worlds.
The problem, dear reader, is that the image you’re presented with isn’t the photo you took originally. Far from it.
It might look similar on the surface, but what you’re actually seeing is a wholly new image generated entirely by Honor’s cloud-based Telephoto Enhancement Large Model, complete with the same issues around AI hallucinations we see when getting popular GenAI image tools like Midjourney to generate photorealistic images.
Take the below example; shot outside my bedroom window, the AI-enhanced view of the train track and chimney stacks behind my house looks great on the surface, with more detail than the image captured by the phone itself – but look closer and you’ll start to see massive inconsistencies between the two.
That includes, but is not limited to, swapping out the orange leaves on an autumnal tree for red flowers, getting rid of the TV antenna next to one of the chimney stacks, rendering a slightly different tree, and producing chimney stacks that look, let’s be honest, fake, with odd brickwork.
Rather ironically, Honor actually provides built-in functionality that lets you compare the two images side-by-side, highlighting just how different the before and after can be.
I’m pretty sure that the Shard’s glass exterior doesn’t look that way in real life:
This scene looks more like a painting than a real photograph:
This is supposed to be a sheep, if it wasn’t clear:
It can even replace textures, seemingly at will:
Now, to be fair to Honor, I was explicitly told when I received the phone that AI Super Zoom was not designed to take photos of complex architecture, religious buildings and people, exactly because of the issues I’ve mentioned above.
The issue I have is that there’s nothing on the phone informing consumers of this, so they’ll likely use it to take photos of anything they see and get those hit-and-miss results.
When focusing mainly on shots of nature, landscapes and simpler buildings, as Honor recommends, the AI Super Zoom does a markedly better job. There are still differences when viewed side-by-side, and some finer details give away the AI-generated nature, but they’re more than passable for sharing with friends and family, as seen below.
There’s also hope that, as Honor claims, the quality of the AI-generated results will continue to improve as the Telephoto Enhancement Large Model’s training continues, but how much of a difference that will make remains to be seen.
Regardless of the quality of the results, I still feel like it’s one of the most controversial AI camera features we’ve seen on a smartphone yet. Why? Because what you’re left with isn’t the photo you captured, or even a real-life photo; it’s just an AI-generated approximation of your photo, landing itself squarely in uncanny valley territory.
It’s at this point I can’t help but ask: when is a photo no longer a photo? Is an AI-generated image that looks a lot like your original enough, or is a true photo one you’ve captured without the use of AI processing tools? Is it somewhere in between?
That question would’ve seemed a bit odd a few years ago, but I feel we’re fast approaching that point where it needs to be discussed.
Thankfully, this isn’t a feature you’re forced to use once you go beyond the 30x mark – instead, a button appears in the camera UI to toggle the functionality on and off. Better still, it’s disabled by default, so users are unlikely to use the tech unknowingly.
Still, I do feel a little uncomfortable when presented with a photo in my phone’s Gallery app that I, in no way, shape or form, actually took myself.