iPhone 7/7+ Raw Capabilities


Would an iPhone 7 raw capture have produced more shadow detail, more highlight detail, and less sky noise than this iPhone 6 standard camera shot?

While the internet is flooded with negative articles about the iPhone 7 series and how little new it has to offer (a great way to get clicks, whether you have anything meaningful to say or not), there are, in fact, a number of very interesting new features, especially in the 7+. I will wait to discuss the dual cameras and what they offer for phone photography, as well as the wide-gamut P3 colorspace of the new iPhones, until I actually have one in hand (the prudent way to write about any product). In the meantime, though, I can’t resist commenting on another feature of the new phones, and for that matter of other recent iPhones running iOS 10.

That would be the ability to shoot raw images. Not that the native camera app which Apple supplies (and which accounts for the vast majority of images shot with iPhones) offers such an option; but it is available for third parties to use. Adobe is making a splash by supporting this capability in their Lightroom camera function. But first, let’s step back and think about what raw really means.

Raw means nothing unless there are more than 8 bits (256 levels) of meaningful data available. So the value of any raw function on an iPhone will depend on how much meaningful raw data is actually captured, and made available for use, by these phones.

Experience with DSLRs and mirrorless cameras has shown that ten bits of data is good, and twelve bits is better. But where does such “extra” data show up, since screens often don’t display more than 256 levels per color channel anyway?

It shows up mostly when you make significant adjustments to the file, to open the shadows or enhance the highlights. And because of the peculiar way bit depth works in files, extra bits allow us to keep much more highlight detail, while leaving more levels for further down the range. However, unless the dynamic range captures meaningful data, not noise, in the deep shadows, the value of that extra depth is questionable.
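To make the arithmetic concrete, here is a small Python sketch (an idealized linear sensor, not any specific iPhone) of how a linear raw encoding distributes its levels: each brighter stop gets twice as many levels as the one below it, so extra bits pay off most in the highlights while still leaving more levels for the shadows.

```python
# Sketch: how many discrete levels a linear raw encoding devotes to each
# stop of dynamic range, for a few common bit depths. Assumes an ideal
# linear sensor; real cameras differ.

def levels_per_stop(bits, stops=8):
    """Return a list of level counts, brightest stop first."""
    top = 2 ** bits
    counts = []
    for _ in range(stops):
        counts.append(top - top // 2)  # levels in this (brighter) half
        top //= 2
    return counts

for bits in (8, 10, 12):
    print(bits, levels_per_stop(bits))
```

With 8 bits, fully half of the 256 levels land in the single brightest stop, and the deepest stops are left with only a handful of levels each; 12 bits multiplies every stop's allotment by sixteen, which is why deep-shadow edits hold up so much better.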

So what we will be looking for from raw capture as we test the iPhone 7 and 7+ (and iOS 10 with phones from the 6s forward) is the ability to produce more highlight and shadow detail, and the ability to make big density shifts in editing software without causing “thinness,” which shows up as posterization in one or more zones after the edit has been made.

How will the iPhone 7 series perform in raw mode? These are tiny sensors, which are therefore prone to much more noise, especially in the shadows and in dim lighting. Perhaps the 7+, with its dual-camera functionality, will be able to reduce that noise a bit, but don’t expect raw capture from the iPhone 7 and 7+ to respond like a recent-generation DSLR when editing. We can hope, though, for at least an incremental improvement on previous iPhone images.

The real question is whether the improvement from shooting raw in Lightroom, over the standard iPhone camera, is large enough and frequent enough for us to make the Lightroom camera our default, go-to choice for shooting.

Copyright C. David Tobie

The Retina Display iMac for Photographers, Part II

Visual color assessment of the Retina iMac’s display shows it to be close to the target values, closer than many off-the-shelf displays. Its color and densities, out of the box, would be better for general consumer use than almost any other solution, short of top-end dedicated graphics displays.

That’s great news for most users, but not quite enough for those doing serious color work, including photography, on the iMac. The display we tested was just a bit denser in the midtones than would be ideal, and the colors are a bit punchy.

More obvious than either of those issues was the lack of neutrality; in fact, on an uncalibrated display the grays were sufficiently off-neutral to make it difficult to judge the other issues.

Analyzing the iMac display with a Datacolor Spyder profile provided some understanding of these visual issues. So let’s dig into the details this tool can provide us about the display.

First come the gamma curves. Let’s start with a chart of the red channel, showing the target of gamma 2.2 in red, and the native curve in black.


See how it’s a bit below the 2.2 line? That would begin to account for part of the midtones being too dark. Next, let’s look at the calibrated results. The black line lying right over the red target line is the corrected result.


The green channel shows a similar story: the uncalibrated line falls below the target, and the calibrated line lands nearly on top of it.

The blue channel looks rather different. Here the uncalibrated curve lies above the target through much of its length, but below it in the highlights. The corrected blue curve is left out for clarity.


It’s the combination of small variations in the channel curves that adds up to a slight inaccuracy in the overall density, as well as in the gray balance.
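The correction that calibration applies can be sketched in a few lines of Python. The gamma values here are made up for illustration, and real profiling measures many patches rather than assuming a pure power curve, but the principle is the same: a per-channel lookup table pre-compensates each requested value so the net response lands on the 2.2 target.

```python
# Sketch of the gamma-correction idea: if a channel's native response
# follows some gamma other than the 2.2 target, calibration builds a
# per-channel lookup table (LUT) so the combined LUT + display response
# follows the target curve. The 2.35 "native gamma" below is hypothetical.

def build_lut(native_gamma, target_gamma=2.2, size=256):
    lut = []
    for i in range(size):
        x = i / (size - 1)                        # normalized input, 0..1
        corrected = x ** (target_gamma / native_gamma)
        lut.append(round(corrected * (size - 1)))
    return lut

lut = build_lut(native_gamma=2.35)                # a "too dark" channel

# The display applies its native gamma after the LUT; the net response
# should sit close to the 2.2 target:
x = 0.5
net = (lut[int(x * 255)] / 255) ** 2.35
print(round(net, 3), round(x ** 2.2, 3))
```

The quantization in an 8-bit LUT is also why heavy corrections can cost a few distinct output levels, one reason a display that starts out close to target calibrates more cleanly.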

Next let’s look at the gamut of the display. Apple has been standardizing on an sRGB gamut, for the simplicity of having sRGB web images, as well as Rec 709 video, look more or less correct on screen.

In the image below you can see that the red triangle representing the display gamut is a good match for the green triangle representing sRGB. In Apple’s laptops and mobile devices, the gamut is very close to sRGB, but a hair inside, generally rating 97 to 99% of sRGB. This iMac’s display gamut, however, extends slightly outside of sRGB, especially at the red and green primaries.


This would account for colors on the display being just a bit punchy, a direction that would please general users, but would bother advanced photographers, especially in skin tones. Correcting this oversaturation in an editing app on the iMac would then result in desaturated colors in color-accurate prints. This is where a user-generated correction profile for the display is key to success!
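For the curious, the triangle comparison can be approximated numerically. The sRGB primaries below are the standard CIE xy chromaticities; the "display" primaries are hypothetical values, chosen only to illustrate a gamut slightly outside sRGB at red and green. (Real coverage figures intersect the two triangles; a plain area ratio is only a rough proxy.)

```python
# Sketch: comparing a display gamut to sRGB by triangle area in CIE xy
# chromaticity space, using the shoelace formula.

def triangle_area(p1, p2, p3):
    (x1, y1), (x2, y2), (x3, y3) = p1, p2, p3
    return abs(x1 * (y2 - y3) + x2 * (y3 - y1) + x3 * (y1 - y2)) / 2

SRGB = [(0.64, 0.33), (0.30, 0.60), (0.15, 0.06)]      # standard primaries
display = [(0.66, 0.33), (0.29, 0.62), (0.15, 0.06)]   # hypothetical display

ratio = triangle_area(*display) / triangle_area(*SRGB)
print(f"display area is {ratio:.0%} of sRGB")
```

A ratio a few percent above 100% matches the behavior described here: most colors land where sRGB expects them, but saturated reds and greens overshoot a bit.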

Other functions of the display rated well, with the exception of the uniformity issue that is the weak point of most white-LED-lit displays. Color accuracy for all colors except the cyan patch (which is outside the sRGB gamut) was in the range of 1 delta-E, or barely discernible. Contrast measured at over 1000:1. Other factors can be seen in the summary chart below.


As can be seen from this table, as long as you don’t depend on the brightness in the corners of the screen, the Retina iMac display is a very good display.
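As a side note on what "in the range of 1 delta-E" means: the original delta-E metric (CIE76) is simply the Euclidean distance between two colors in CIELAB space, with a value near 1 sitting around the threshold of visibility. A minimal sketch, with made-up Lab values (measurement tools typically use refined formulas such as CIEDE2000):

```python
# Sketch: the CIE76 delta-E color difference — straight-line distance
# between two colors in L*a*b* space. The patch values are hypothetical.
import math

def delta_e76(lab1, lab2):
    """CIE76 color difference between two (L, a, b) triples."""
    return math.dist(lab1, lab2)

measured = (54.2, -0.4, 0.8)    # hypothetical measured patch
reference = (54.0, 0.0, 0.0)    # hypothetical target value
print(round(delta_e76(measured, reference), 2))
```

A difference below about 1 is invisible to most viewers in side-by-side patches, which is why results in that range count as excellent for an off-the-shelf display.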

The brightness of the iMac display would be the last issue to consider. This screen is capable of producing 500 candelas per square meter. That would be great for use on a tradeshow floor, but in a normal, dim editing environment it is critical to dim this display to an appropriate level.

We calibrated it to 120 candelas, as we do with our other displays, and it worked very well at that level. If you require other custom settings, such as calibrating to 5800K instead of 6500K, then custom calibration is a necessity.

Pairing the Retina iMac with a wide gamut graphics screen as a second display offers the best of both worlds, including the ability to view nearly the entire image from most DSLR cameras on one screen at 100% scale, making dust busting and sharpening a joy. Calibrating both those displays to make the match between them as close as possible can provide a great image editing experience, saving time and providing a good screen to print match.

Thanks to Datacolor for the tools used to analyze the Retina iMac display!

C. David Tobie

WEBINAR: Exploring Architectural Photography, Today 3PM EDT


This webinar will be an unscripted discussion of the concepts and techniques used in architectural photography, sponsored by Datacolor. David Saffir and I will be discussing a range of sample images, chosen for their value in illustrating architectural concepts and the techniques used to shoot them. Please join us for what we hope will be an interesting discussion on this very challenging theme. And stay tuned, as there will be a Datacolor Spyder product given away to one attendee at the end of the session, plus some excellent discounts for all attendees.

Sign up now at: https://www2.gotomeeting.com/register/268882242

Credits: C. David Tobie, Copyright 2012. Website: CDTobie.com Return to Blog’s Main Page

C David Tobie to Present at International Printing Week, Cal Poly


Anyone attending International Printing Week at Cal Poly, or in the San Luis Obispo, CA area, please join C David Tobie, Datacolor Product Technology Manager, for a one-hour presentation on Display Calibration for All Your Viewing Devices. The session runs from 4PM to 5PM on Tuesday, Jan 28, and will cover calibration for a wide array of display types, multiple-display tuning, projector calibration, and iPhone/iPad calibration. It is restricted to Cal Poly students and those with International Printing Week passes; please contact the Graphic Communication Department at the URL above if you are interested in attending.


FOTOfusion Public Presentation: Video for Photographers, Friday Jan 25, 3:45 PM Hall 616

Photo courtesy of Samyang Lenses

If you are attending FOTOfusion, at the Palm Beach Photo Centre, or if you are in the Palm Beach area, please join me for a one hour presentation on video for photographers. This session runs from 3:45 to 4:45 on Friday, Jan 24, and will cover basic info on cameras, lenses, tripods, lights, software, computers, and displays, for photographers who are considering moving into motion work. Color management for video will also be discussed. This will not be a highly technical session, so feel free to attend even if you don’t know what a CODEC is, or why you might need one. This session is open to the public, so all you need to do is show up at the West Palm Beach Community Center (same building as the library), and ask directions to 616. I hope to see you there!


WEBINAR: Exploring Photographic Composition in Landscape and Still Life, Today 3PM EST

Tuscan Window Image, as originally processed

Tuscan Window

This webinar will be an unscripted discussion of composition, gesture, and other factors that make images work, co-sponsored by Datacolor and Digital Silver Imaging. David Saffir and I will be discussing a range of sample images, chosen for their value in illustrating compositional features, and in some instances, chosen for defying easy analysis. Please join us for what we hope will be an engaging discussion on this very interesting theme. And stay tuned, as there will be a Datacolor Spyder product given away to one attendee at the end of the session, plus some excellent discounts for all attendees.

The recorded version of this webinar is now available here.


WEBINAR: Getting the Most Out of Your Holiday Photography, Wed Dec 12 3-4 EST


Please join David Saffir and myself for a casual chat about improving your holiday photos. This will include tips for photographers at all levels. You can register for this webinar here. This will be at 3pm today (Dec 12) on the East Coast of the US, Noon on the West Coast. Your local time may vary…

There will be a Datacolor Spyder4Pro given away to a participant at each webinar (no reindeer required, we’ll ship it to you), and there are sure to be some excellent specials offered as well. We’ve put a good deal of time and thought into the ideas and examples in this webinar, so we hope you’ll attend and enjoy the holiday spirit, as well as the photographers’ camaraderie.
