How RAW changes iPhone photography for the better

Ben is an independent designer, developer and photographer based in Dublin, Ireland. He’s the brains and talent behind the popular third-party camera app, Obscura, and can usually be found with a coffee in one hand and a camera in the other.

Photography is changing, and iOS 10 is going to be a major turning point.

There are few things that captivate me like photography and technology.

I grew up spending afternoons in my dad's studio: Watching shoots, learning about cameras and lenses, getting to mess around on Photoshop; the smell of darkroom chemicals still brings me back instantly. No surprise that I picked up a camera myself and started a career doing the very same thing.

The iPhone arrived in the midst of all this, when I was 13. It took seven years for me to start developing apps, but when I did, I knew I wanted to make a camera. I started working on Obscura toward the end of summer 2014, and after a year of working on it in my spare time, released it in summer 2015. I designed and built Obscura to mimic the ease and ergonomics of an SLR, and offered a number of filters for processing images.

iPhone photography has come a long way since its humble beginnings: Obscura wouldn't have been possible without iOS 8's manual camera APIs for developers. But in iOS 10, it truly grows up, offering a major new feature that brings mobile photography into the professional world. And that feature is RAW. 

So... what exactly is RAW?

RAW is a file format that contains the raw data directly from a camera sensor. This raw data is unprocessed and uncompressed; it's the "purest" form of the image. RAW files tend to appear quite flat in coloration and depth as a result, but all that untouched data lets you really bring them to life in post-production.

RAW files tend to be larger than JPEGs: JPEG files are compressed and limited to 8 bits per colour channel, while RAW files store the sensor's full, uncompressed readout, typically 10 to 14 bits. That extra data is why RAW can record significantly more dynamic range, which is vital when it comes to editing.

When you shoot RAW, you can pull down highlights or boost shadows to bring out new detail in the image. It's a huge win for photographing high-contrast scenes, and it's why most professional photographers opt to shoot RAW. You may have to put in more work during post-production, but the benefits are huge.
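
To put rough numbers on that headroom, here's a back-of-the-envelope sketch. It assumes an 8-bit JPEG and a typical 12-bit sensor readout rather than figures from my own tests:

```swift
// Rough illustration only: tonal levels per channel at assumed bit depths.
let jpegLevels = 1 << 8    // 8-bit JPEG: 256 levels per channel
let rawLevels  = 1 << 12   // assumed 12-bit RAW readout: 4,096 levels per channel
print("RAW keeps \(rawLevels / jpegLevels)x more tonal steps to push around in post")
```

All those extra in-between values are what let you drag exposure and shadows around without the image falling apart.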

How iOS 10 uses RAW

Different camera makers use different RAW file formats (Canon uses .CR2 and Nikon uses .NEF, for example). An increasingly common standard, however, is Adobe's .DNG (Digital Negative); this is the format Apple has chosen for iOS 10.

What's especially interesting about iOS 10 and RAW is that Apple hasn't adopted it for its built-in Camera app — the company is instead leaving it up to third-party developers to implement.
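
For fellow developers wondering what that involves, here's a minimal sketch of the iOS 10 capture flow; it assumes an already-configured AVCaptureSession and isn't Obscura's actual code. You ask AVCapturePhotoOutput for a RAW pixel format the camera supports, build AVCapturePhotoSettings from it, and the sensor data comes back through the capture delegate, where it can be turned into a DNG.

```swift
import AVFoundation

// Minimal sketch of iOS 10 RAW capture, assuming the photo output has already been
// added to a running AVCaptureSession with a back-camera input. Not Obscura's code.
final class RAWCapturer: NSObject, AVCapturePhotoCaptureDelegate {
    let photoOutput = AVCapturePhotoOutput()

    func captureRAW() {
        // RAW formats are only listed once the output is attached to a camera,
        // and the front-facing camera doesn't offer any.
        guard let rawFormat = photoOutput.availableRawPhotoPixelFormatTypes.first else {
            return
        }
        let settings = AVCapturePhotoSettings(rawPixelFormatType: rawFormat)
        photoOutput.capturePhoto(with: settings, delegate: self)
        // The RAW sample buffer arrives via the AVCapturePhotoCaptureDelegate callbacks;
        // AVCapturePhotoOutput.dngPhotoDataRepresentation(forRawSampleBuffer:
        // previewPhotoSampleBuffer:) then converts it into .DNG data ready to save.
    }
}
```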

So I did: This summer, I reconfigured Obscura to shoot with Apple's RAW feature. Here's what it looks like.

Initial RAW tests in iOS 10

A few days ago, I shot this image in Obscura. It's a tricky scene, given the dynamic range: bright highlights in the sky, and deep shadows throughout the building. I photographed the building in both RAW and JPEG for my tests.

The initial shot

In all these tests, the JPEG is on the left and the RAW on the right. Directly out of the camera, the JPEG looks a little more interesting: It has more contrast, and there appears to be more detail. The RAW image looks downright drab in comparison.

But as Apple SVP Phil Schiller noted on stage during the iPhone 7 event, Apple does a lot of work to process images behind the scenes using its ISP (image signal processor). It makes the images more vibrant and ready to display on the iPhone's beautiful screen. But it does mean that the image is being altered as you take it — and that can be a detriment when you want to make further changes beyond what the ISP had in mind.

To truly put these images to the test, I brought them over to my Mac and opened up Lightroom.

Changing the exposure

Watch what happens when we try to increase the exposure on both images. Pay attention to the clouds in the top left.

Immediately we see that the RAW file holds detail in the white clouds, while the JPEG has "blown out" in a very unpleasant way. The RAW image also feels smoother. The extra contrast applied to the JPEG is baked in, leaving less room for manipulation.

Blowing up the image, we can see the issue in detail: The combination of extra contrast and the JPEG compression makes the image look jagged in places. 

Shooting RAW also has benefits for white balance: With a JPEG image, adjusting the white balance essentially applies a colour filter over the image. With RAW, changing the white balance changes how the original sensor data is processed, which also means white balance adjustments are fully non-destructive.
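
To make that concrete, here's a hedged sketch using Core Image's RAW filter (the developRAW function and its dngData parameter are illustrative, not part of my Lightroom workflow): the temperature value is an input to how the RAW data is developed, so every adjustment simply re-renders the original capture.

```swift
import CoreImage

// Illustrative only: develop a captured DNG with a chosen white balance temperature.
// Because the temperature feeds into the RAW processing itself, changing it later
// just re-renders the same sensor data; nothing in the original file is overwritten.
func developRAW(dngData: Data, temperature: Float) -> CIImage? {
    guard let rawFilter = CIFilter(imageData: dngData, options: nil) else { return nil }
    rawFilter.setValue(temperature, forKey: kCIInputNeutralTemperatureKey)
    return rawFilter.outputImage
}
```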

Desaturation

Desaturating the images reveals a similar gap in quality. When the phone processed the JPEG, it reduced the dynamic range, and with it, what I could target in Lightroom; the extra data in the RAW image, by contrast, more accurately depicts the gradient of light down the building.

The difference isn't staggering, but one is appropriate for Instagram — the other I'd print and hang on a wall.

Going pro

Editing RAW files feels like a huge leap forward for mobile photography: With iOS 10, the iPhone is evolving from a great camera for casual photos into a capable professional tool. It still has plenty of limitations, but I suspect we've passed a tipping point.

But shooting while out and about is one thing. What about using the iPhone in a studio? I gathered together a couple of friends to do a little impromptu photoshoot to see how the iPhone would hold up. 

For this portrait shoot, I used my iPhone 6s, Obscura 4.0, and Lightroom on the Mac for post-processing. I started off using a tripod but quickly found it easier to go without. We also used a white backdrop in the studio, and three lights: two with umbrellas for the backdrop, and one with a diffuser to light the subject.

Call me crazy, but I think I just did a successful photo shoot, entirely #ShotOniPhone.

I shot about 1300 photos. It's probably more than I would have taken with a DSLR, but shooting with the iPhone and its fixed wide-angle lens took a bit of adjustment. In total, that added up to about 14GB of images — about 11MB per RAW file. Not too bad on a 64GB device, though I did have to pause during the shoot to remove music from my device to free up some space. I imagine this won't be much of an issue on a 128 or 256GB iPhone, however. 

Most of the images were captured at ISO 50, an exposure of 1/100 of a second, and the iPhone 6s's fixed aperture of f/2.2. I think the iPhone sensor proves itself more than capable of capturing great light and colour. The detail becomes a little lacking when zoomed in very close, but it's unlikely most viewers will ever see that (or care to). I did run into a few limitations, however, primarily around focal length and a lack of interoperability with photographic equipment.

The issue with the lens is partially a personal one: I prefer shooting with a tighter focal length — 50mm to 80mm would be much more comfortable for portraiture (and thankfully, we'll soon have that with the dual lens setup on the iPhone 7 Plus). Unfortunately, the iPhone 6s's wide-angle lens often resulted in tripod legs poking into shots — I ended up having to work with the subject, lights, and myself, all very close together.

Interoperability with photographic equipment is a trickier problem to solve. I was shooting with speed lights — large flashes that will synchronise with a camera shutter. The problem is that these speed lights connect to a camera via cable or with proprietary wireless triggers. They can also be triggered by a flash, but the flash on the iPhone isn't nearly powerful enough for their sensors to pick up.

This was a challenge, and it meant I could only shoot with the much dimmer modelling bulbs on these lights. It worked — but only just. Ideally, I'd have been able to connect to these lights via the iPhone 6s's headphone jack (R.I.P.) or Lightning connector. 

Battery life was also a small concern: We started the shoot at 100% and hit 20% about three-quarters of the way through, at which point I plugged the iPhone into a portable battery in my pocket. That might seem cumbersome, but I can't use my SLR at all while its batteries are charging, so depending on the context, being able to shoot while charging could even be an advantage.

None of these issues ended up being showstoppers, but there's definitely still room for progress. I don't expect many pro photographers to throw their cameras away any time soon, but doing real work with the iPhone is no longer a novelty; it's a perfectly valid choice of camera.

The testing continues

I'm looking forward to trying this experiment again with the iPhone 7 Plus: A telephoto lens is something I've been dreaming of since I first laid hands on an iPhone, and it's great to see it coming to a shipping product.

Overall, it was a joy pushing the iPhone camera to its limits and seeing it deliver quality results — and, of course, also very rewarding to use a camera app I created in a semi-professional context. It'll be interesting to see what other photographers, professional and amateur alike, make of these changes, and how they impact their photography going forward.

For those of you eager to try out RAW capture yourself, you'll be able to use it on iOS 10 with any third-party camera app updated to support this feature (including Obscura 4).
