More Computational Photography

As an Electrical Engineer and a long-time photographer, I’ve been interested in computational photography for a while.  You can read some of my earlier posts on the subject at this link. Here’s Wikipedia’s definition:

…”digital image capture and processing techniques that use digital computation instead of optical processes.”

I’d change this slightly from “instead of” to “instead of or in addition to”, but that’s a small quibble.

Moore’s law keeps driving the computing capabilities in phones and cameras ever higher and it’s fascinating to see what companies do with the extra potential.

Towalga River below High Falls, Georgia. Composite – iPhone default live view and Long Exposure, merged in Photoshop

I made the image above on our recent trip using an iPhone 8+ and Apple’s “Live Photo” mode. After I made the photo, I edited it on the phone and enabled the “Long Exposure” effect, which blurs the moving water by computing the result from the ~3 seconds’ worth of frames that Live Photo captures.  This is Apple’s web page explaining the feature.  Here’s what it looked like before I changed the mode – it’s not nearly as photogenic:

Towalga River below High Falls. Default iPhone live view

Until iOS 11 added this feature, I wasn’t too interested in Live Photos.  Now, I’m watching for places to use it. You can get a better image with your high-end camera and traditional optical techniques, but this is easier and a lot of fun to play with.
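
If you’re curious about what’s going on under the hood, the basic idea is easy to sketch. Apple hasn’t published its algorithm, but you can approximate the effect yourself by averaging the frames from a Live Photo’s video clip. Here’s a rough Python sketch – the file names are just placeholders, and it assumes you have OpenCV and NumPy installed:

```python
# A minimal sketch of the idea (not Apple's actual algorithm): average the
# frames of a short clip to simulate a long optical exposure. It assumes
# you've exported the Live Photo's video portion to "live_photo.mov"
# (a hypothetical file name) and have OpenCV and NumPy installed.
import cv2
import numpy as np

cap = cv2.VideoCapture("live_photo.mov")
total = None        # running sum of frames, kept in float to avoid overflow
count = 0

while True:
    ok, frame = cap.read()
    if not ok:
        break
    total = frame.astype(np.float64) if total is None else total + frame
    count += 1

cap.release()

if count:
    # The per-pixel mean of all frames: moving water smears into a blur,
    # while static rocks and trees stay sharp.
    long_exposure = (total / count).astype(np.uint8)
    cv2.imwrite("long_exposure.jpg", long_exposure)
```

On a handheld shot the frames also have to be aligned before they’re averaged, which is where the cropping mentioned in the hints below comes from.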

Here are some hints:

  • Pick a suitable subject:  moving water, traffic on a road, blurring people in a crowd, etc.
  • Motion blur with a traditional optical approach requires a slow shutter speed – either low light or filters to cut the light.  Since computational methods work by processing multiple frames, you can use them in bright light without filters.
  • Apple says it works on their newer phones (iPhone 6s and later).  You’ll need to have iOS 11 (or later) installed.
  • The Long Exposure effect has to align individual frames and then crop where there’s no overlap, so you’ll lose pixels around the edges (there’s a rough sketch of this alignment step after the list).  Ideally, use a tripod – but that sort of defeats the idea of pulling your phone out of your pocket, doesn’t it?  Just hold the phone as steady as you can to minimize cropping.
  • Make several exposures and pick the best one later.
  • Long Exposure resolution seems to be lower than that of default iPhone photos.  This isn’t a huge problem for the moving parts of the frame – they’re supposed to be blurry.  For the static portions, you can load both versions into layers in Photoshop and use masking to paint in higher resolution where you want it (the second sketch after this list shows the same idea).  I did this for the first photo above.
  • You can set a Long Exposure photo as your wallpaper.  You’ll see the static Long Exposure version until you press on it from the lock screen.  Then it changes to show the three-second animation – cool!
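
To make the alignment-and-cropping hint more concrete, here’s a rough sketch of one way to register handheld frames before averaging them. It’s only an illustration of why the feature loses edge pixels – it’s not Apple’s code, and the frame file names are placeholders (assumes OpenCV and NumPy):

```python
# Sketch: align handheld frames to a reference frame before averaging.
# Not Apple's implementation - just one common approach (ORB features plus
# a similarity transform). The warped frames pick up black borders, which
# is why the real feature crops the edges. File names are hypothetical.
import cv2
import numpy as np

def align_to_reference(reference, frame):
    """Warp `frame` so it lines up with `reference`."""
    ref_gray = cv2.cvtColor(reference, cv2.COLOR_BGR2GRAY)
    frm_gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)

    orb = cv2.ORB_create(1000)
    kp1, des1 = orb.detectAndCompute(ref_gray, None)
    kp2, des2 = orb.detectAndCompute(frm_gray, None)

    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = sorted(matcher.match(des2, des1), key=lambda m: m.distance)[:200]

    src = np.float32([kp2[m.queryIdx].pt for m in matches])
    dst = np.float32([kp1[m.trainIdx].pt for m in matches])
    matrix, _ = cv2.estimateAffinePartial2D(src, dst)

    h, w = reference.shape[:2]
    return cv2.warpAffine(frame, matrix, (w, h))

frames = [cv2.imread(name) for name in ("frame0.jpg", "frame1.jpg", "frame2.jpg")]
reference = frames[0]
aligned = [reference] + [align_to_reference(reference, f) for f in frames[1:]]

# Average the aligned frames; the black borders introduced by warping are
# the pixels you'd have to crop away.
average = np.mean([f.astype(np.float64) for f in aligned], axis=0).astype(np.uint8)
cv2.imwrite("aligned_long_exposure.jpg", average)
```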
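And here’s the layer-masking trick from the resolution hint, sketched with a simple grayscale mask instead of Photoshop layers. White areas of the mask keep the sharp default photo, black areas keep the blurred Long Exposure version – the file names are placeholders, and the two images and the mask need to be the same size:

```python
# Sketch of the mask-based blend (the Photoshop layer-mask idea, done in
# code). Assumes OpenCV and NumPy; file names are placeholders.
import cv2
import numpy as np

sharp = cv2.imread("default_photo.jpg").astype(np.float64)
blurred = cv2.imread("long_exposure.jpg").astype(np.float64)

# A hand-painted mask: white = keep sharp static detail, black = keep blur.
mask = cv2.imread("mask.png", cv2.IMREAD_GRAYSCALE).astype(np.float64) / 255.0
mask = mask[:, :, np.newaxis]   # broadcast the mask across the color channels

composite = mask * sharp + (1.0 - mask) * blurred
cv2.imwrite("composite.jpg", composite.astype(np.uint8))
```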

I hope Apple enhances this in future updates.  It’d be good to have some control over the blur effect.  3 seconds is nice, but some subjects will look better with less (or more?).

iOS 11 includes other updated computational photography capabilities (e.g. Portrait Lighting) – but that’s a subject for another day.

Photography’s changing fast – it’s a wonderful time to be a photographer, isn’t it?  In today’s digital world, many advances are likely to be computational and not optical.  Keep up – don’t be left behind!

Thanks for stopping by and reading my blog. Now – go compute some photos!

©2016, Ed Rosack. All rights reserved.

I'd love to hear from you – please leave a comment!
