For the past several months I’ve been using and evaluating Adobe’s Project Indigo camera app. I’ve been an Apple iPhone user for several years, and in that time I’ve tried more than a dozen third-party camera apps, always returning to the native camera app. It always came down to the same thing: the third-party app didn’t really produce a better-quality image, or didn’t offer anything unique or compelling enough to make me want to use it over Apple’s native camera app. I even tried (and paid for) a year of the Leica LUX camera app, which ended up being a disappointment. Then around the middle of 2025 I heard about the Adobe Project Indigo app from James Stacey on The Grey NATO podcast. James mentioned that he’d been using it for “wrist shots” (a common convention of the online luxury wristwatch enthusiast community) with great success, so I decided to give it a try.
Project Indigo is a free product from Adobe Labs that promises “SLR-like” quality from your iPhone. According to Adobe, it accomplishes this using a number of techniques, including underexposing highlights more aggressively and combining more frames (up to 32) than the iPhone’s native camera app. In theory, this should result in fewer blown-out highlights and less noise. In practice, it does something…more. Something I can’t quite explain in purely technical terms. It’s a noticeable shift from modern Apple computational photography — one that affects how shadows are lifted, how colors are interpreted, and how multiple frames are combined into the “final” photo.
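To give a sense of why merging many frames helps, here’s a toy sketch — not Adobe’s actual pipeline, just an illustration of the underlying statistics. Each simulated frame is the same scene value plus random sensor noise; averaging the same pixel across a 32-frame burst shrinks the noise by roughly the square root of the frame count:

```python
import random
import statistics

def capture_frame(scene_value, noise_sigma=8.0, n_pixels=1000):
    # Simulate one frame: the true scene value plus per-pixel sensor noise.
    return [scene_value + random.gauss(0, noise_sigma) for _ in range(n_pixels)]

def merge_frames(frames):
    # Average each pixel position across all frames;
    # uncorrelated noise shrinks roughly as 1/sqrt(N).
    return [sum(px) / len(px) for px in zip(*frames)]

random.seed(42)
single = capture_frame(100)
burst = merge_frames([capture_frame(100) for _ in range(32)])

print(statistics.stdev(single))  # around the original noise level (~8)
print(statistics.stdev(burst))   # much lower, near 8/sqrt(32)
```

This is why a 32-frame merge can afford to underexpose each frame to protect highlights: the lost shadow detail can be lifted later with far less visible noise than a single exposure would allow.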
Let me explain it this way: Remember taking photos with the iPhone X? That’s the nearest analog I can give you to the results you get using Adobe PI on a current iPhone. On the iPhone 11 in late 2019, Apple introduced Deep Fusion, which used machine learning on the Neural Engine to analyze multiple exposures and optimize texture, detail, and noise at a pixel level — a true AI-driven computational step, not just tweaks to exposure blending. To me, that’s when iPhone images stopped looking like DSLR images. Indigo, on the other hand, seems to intentionally:
- Ease off microcontrast
- Preserve natural tonal compression
- Avoid overly aggressive sharpening halos
- Keep color science calmer
Before Apple went all-in with AI-driven computational image rendering, I used to really enjoy taking wrist shots of watches from my collection and sharing the images with forum communities. But I never truly liked how the post–iPhone X native camera app rendered my wrist shots, which put me in a real catch-22: I didn’t like iPhone wrist shots, but neither did I want to set up my pro camera gear just to grab a quick wrist shot. So I slowly stopped. But since James Stacey was so enthusiastic about his experience with Project Indigo, I decided I had to try it. Here are some examples:
This is the “straight out of the camera” version of a shot I took in my car. The shot is fantastic and looks better than the results I typically get with the iPhone 17 Pro. In short, it doesn’t look like an iPhone shot. It looks like it was taken with a “real” camera. Of course, since Project Indigo is an Adobe product, you can easily take the photo into Lightroom Mobile and do a bit of creative editing. Here’s the result:
From photo to result in about 5 to 10 minutes using my phone. And the result looks as good as anything I could’ve gotten with a dedicated camera. Here are a few more examples of what I’ve been able to do with Adobe Project Indigo:
Adobe PI produces more pleasing results that photographers will appreciate for almost any type of photography — here’s another unedited result:
As good as the Project Indigo camera app is, you have to wonder what the future holds for it. Even the name suggests that it’s somewhat experimental. At some point Adobe is going to want to monetize PI in some way — maybe by bundling it with the mobile version of Lightroom, or perhaps as a standalone subscription. Either way, enjoy it for free while you can.
PI was first released in June of 2025, and I think I started using it in August with the iPhone 16 Pro. The experiment was short-lived for me, as PI stopped working when I got the iPhone 17 Pro on September 12th. It would take Adobe until late October of 2025 to finally update PI with iPhone 17 Pro compatibility; a subsequent November 2025 update added support for the iPhone 17 Pro front camera as well. At this point I’m using the PI camera app 3 to 1 over the native camera app. It’s easily the best non-Apple camera app available for the iPhone.