Tag Archives: iPhone

Neighborhood Kites

Swallow-tailed Kites

I was getting a little exercise on a morning walk last Wednesday when I noticed some birds in the distance soaring on a thermal. As I got closer I could tell they were Swallow-tailed Kites.

I always enjoy seeing these birds. They’re very distinctive and watching them use their tails as a rudder to swoop, glide, roll, and zoom through the sky is fascinating. They migrate about 5000 miles from South America and arrive in Florida in the spring, spending several months here to breed and then returning south in the late summer.

It’s not uncommon to see them in my neighborhood, and even over my house. But they always appear when I’m not ready to photograph them. This time was no different. The only camera I had with me was my phone and I was sure they’d be gone by the time I could get home and get my big lens out. As I got closer, the birds circled lower in the sky and I decided to try making some photos anyway.

Swallow-tailed Kites circling overhead in our neighborhood (click on the photo to see a larger version on Flickr)

I used the built-in camera app with the 3x lens (50mm equivalent) and the output set to RAW mode. I made about 30 frames, hoping some would turn out.

I went through them when I got home and picked the best ones to process. Most of the rejects were due to framing, exposure, or chromatic aberration / fringing. Their contrasting black and white plumage makes them hard to expose correctly, and the white feathers were blown out in many of the frames. The frames made at f/1.8 had very distracting blue / purple fringing along the wing edges; the f/2.8 ones didn’t have that issue. This left me with just a handful of images to process.

I ran them through Adobe’s “Enhance / Super Resolution”, then used masks with subject and sky selections to make local adjustments. I also set the sharpening to zero in Lightroom and used Topaz Sharpen AI as a final step.

A couple turned out OK, but I really wish I’d had my big lens with me!

Thanks for stopping by and reading my blog. Stay positive, be kind, take care of yourselves and each other. And if you can – make some photos, even if all you have is your phone camera!

©2022, Ed Rosack. All rights reserved

iPhone Event Photography

MK won two extremely good tickets to a recent Orlando Magic game in a drawing at work and invited me to go with her (Thanks MK!).  I was excited and looked up the camera policy for the Amway Center.  The relevant sentence was:

“Cameras with detachable lenses longer than six (6) inches when extended are prohibited from all Amway Center sporting events.”

I put my Olympus 12 – 100mm lens on my E-M1 Mark II camera, stuck an extra battery in my pocket, and was ready to make some super photos.  When we arrived, I was stopped by the first security guard I saw at the start of the entrance line and told that no interchangeable lens cameras were allowed.  I didn’t think it was worth arguing, so I begrudgingly took my camera back to the car.  After going through the line to get in, we asked about the policy and were told that yes, interchangeable lens cameras are allowed!

Amway – you need to make sure your security people understand your policies!  Anyhow, I was tired of walking back and forth to the car and decided to just make photos with my phone.

Nikola Vucevic grabs a rebound. Orlando Magic vs. Brooklyn Nets. The Magic led for most of the game, but lost by two in the final seconds.

That ended up working well, since the seats were in the first row!  I used the 2x lens and shot in RAW mode, using burst to capture the peak action.

I think a main disadvantage of phones vs. dedicated cameras is the lens selection, especially at the telephoto end.  Phones right now usually have two or three lenses at most.  Standalone cameras have a virtually unlimited lens selection available.  At events, you need to be close to the action or you need to use a long lens.

Last week, Lynn and I went to a concert at the Plaza.  Their camera policies are more restrictive (and vague).  They can also change, depending on the performer, so I left my camera gear at home.  Our seats this time were about eight rows back, which was close enough to get a few iPhone photos of one of my favorite guitar players.

Hot Tuna at the Plaza, Jorma Kaukonen and Jack Casady, Jan. 24, 2019, Downtown Orlando.

I would’ve liked to get closer.  And the resolution isn’t as good as I’d want for prints on the wall.  But phones can work surprisingly well – if your seats are good enough.

I have more Orlando Magic photos in this album on Flickr.  And a few more Jorma Kaukonen photos in this one.

Sometimes my photo plans don’t work out.  But I make photos anyway.  It’s what I do.  Thanks for stopping by and reading my blog.  Now – go make some photos!

©2019, Ed Rosack. All rights reserved

More Computational Photography

As an Electrical Engineer and a long-time photographer, I’ve been interested in computational photography for a while.  You can read some of my earlier posts on the subject at this link. Here’s Wikipedia’s definition:

“…digital image capture and processing techniques that use digital computation instead of optical processes.”

I’d change this slightly from “instead of” to “instead of or in addition to”, but that’s a small quibble.

Moore’s law keeps driving the computing capabilities in phones and cameras ever higher and it’s fascinating to see what companies do with the extra potential.

Towalga River below High Falls, Georgia. Composite – iPhone default live view and Long Exposure, merged in Photoshop

I made the image above on our recent trip using an iPhone 8+ and Apple’s “Live Photo” mode. After I made the photo, I edited it on the phone and enabled the “Long Exposure” effect. The blurring of the moving water is computed from the ~3 seconds’ worth of frames that Live Photo captures.  This is Apple’s web page explaining the feature.  Here’s what it looked like before I changed the mode – it’s not nearly as photogenic:

Towalga River below High Falls, default iPhone live view

Until iOS 11 added this feature, I wasn’t too interested in Live Photos.  Now, I’m watching for places to use it. You can get a better image with your high-end camera and traditional optical techniques, but this is easier and a lot of fun to play with.
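To make the idea a bit more concrete, here’s a rough sketch of the kind of frame averaging that produces this look. It’s definitely not Apple’s actual code – just a toy Python example that assumes you’ve already extracted the burst of Live Photo frames to a folder (the folder and file names are made up for illustration):

```python
# Toy sketch: simulate a "long exposure" by averaging a burst of frames.
# Anything that moves between frames smears together; static areas stay sharp.
import glob
import cv2
import numpy as np

# Hypothetical folder of frames pulled from a Live Photo / short video clip.
frames = [cv2.imread(p).astype(np.float64)
          for p in sorted(glob.glob("live_photo_frames/*.jpg"))]

# Simple per-pixel average across the stack.
long_exposure = np.mean(frames, axis=0)

cv2.imwrite("long_exposure.jpg", long_exposure.astype(np.uint8))
```

Averaging is the simplest possible blend; a real pipeline also has to align the frames first, which is why the edges get cropped (see the hints below).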

Here are some hints:

  • Pick a suitable subject:  moving water, traffic on a road, blurring people in a crowd, etc.
  • Motion blur with a traditional optical approach requires a slow shutter speed – either low light or using filters.  Since the computational method works by processing multiple frames, you can use it in bright light without filters.
  • Apple says it works on their newer phones (6+ and later).  You’ll need to have iOS 11 (or later) installed.
  • The Long Exposure effect has to align individual frames and then crop where there’s no overlap, so you’ll lose pixels around the edges.  Ideally, use a tripod – but that sort of defeats the idea of pulling your phone out of your pocket, doesn’t it?  Just hold the phone as steady as you can to minimize cropping.
  • Make several exposures and pick the best one later.
  • Long Exposure resolution seems to be lower than default iPhone photos.  This isn’t a huge problem for the moving parts of the frame – they’re supposed to be blurry.  For the static portions, you can load both versions into layers in Photoshop and use masking to paint in higher resolution where you want it.  I did this for the first photo above.
  • You can set a Long Exposure photo as your wallpaper.  You’ll see the static Long Exposure version until you press on it from the lock screen.  Then it changes to show the three-second animation – cool!

I hope Apple enhances this in future updates.  It’d be good to have some control over the blur effect.  Three seconds is nice, but some subjects will look better with less (or more?).

iOS 11 includes other updated computational photography capabilities (e.g. portrait lighting) – but that’s a subject for another day.

Photography’s changing fast – it’s a wonderful time to be a photographer, isn’t it?  In today’s digital world, many advances are likely to be computational and not optical.  Keep up – don’t be left behind!

Thanks for stopping by and reading my blog. Now – go compute some photos!

©2016, Ed Rosack. All rights reserved.

Pocket Computational Photography

If you’ve followed this blog for a while, you may have seen my earlier posts on computational photography.  If not, you can review them at this link:  https://edrosack.com/?s=computational+photography.  The term refers to using software algorithms to supplement or replace optical capture processes.  Common examples are multi-frame panoramas, focus stacking, HDR processing, post-capture focus, and other techniques.  You can read more about it at this link on Wikipedia:  https://en.wikipedia.org/wiki/Computational_photography

As phone capabilities increase, their computational photography power is growing.  Camera phones have long been able to do on-the-fly panorama and HDR capture.  And here’s an example of a new capability that arrived on the iPhone 7+.

Bokeh

Apple calls this “Portrait Mode”.  It’s available in beta on the iPhone 7+ in the latest version of iOS.  Since the 7+ has two cameras separated by a small distance, it has the information necessary to compute a “depth map” of the pixels in the frame.  The software uses this to selectively blur pixels based on distance, adding a “bokeh” (shallow depth of field) effect that helps with subject isolation.  For comparison, here is the non-computed version of the image.  You can see that the background looks very different.

Original

It isn’t all perfect, though.  The algorithm has problems around small features at the boundaries.  Look closely at the next frame and you can see blurring issues at the edges of the reed.

Phone output

The processing blurred parts of the reed that we wanted sharp.  For the first photo above, I cheated and used Photoshop to correct the problems.  Maybe in future versions the software will be better.
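If you’re curious about the general idea, here’s a toy sketch of depth-based blur – emphatically not Apple’s algorithm, just an illustration with made-up file names that assumes you already have a grayscale depth map for the photo. Hard, imperfect transitions in the depth map around thin features are exactly where artifacts like the reed problem come from:

```python
# Toy sketch of depth-based background blur (not Apple's implementation).
# Assumes a photo and a matching grayscale depth map where brighter = closer.
import cv2
import numpy as np

photo = cv2.imread("photo.jpg").astype(np.float32)          # hypothetical input
depth = cv2.imread("depth.png", cv2.IMREAD_GRAYSCALE).astype(np.float32) / 255.0

# A single heavy blur of the whole frame stands in for the "background" look.
blurred = cv2.GaussianBlur(photo, (0, 0), 8)

# Blend per pixel: keep the sharp image where the subject is close,
# fade to the blurred version as depth falls off. Thin features with
# abrupt or wrong depth values get blurred when they shouldn't be.
alpha = cv2.merge([depth, depth, depth])
portrait = alpha * photo + (1.0 - alpha) * blurred

cv2.imwrite("portrait_mode_sketch.jpg", portrait.astype(np.uint8))
```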

Here’s one more example.  This is Lynn, rocking an election day t-shirt.  First, the portrait mode version.

Lynn – portrait mode

And finally, the original.  In this case, the software did much better, with no obvious blurring issues.  These two are straight out of the camera with no processing on my part.

Lynn – original

It’s fascinating how photography and computers are merging.  For someone who started out programming a large, room-sized Univac in FORTRAN with punch cards, the power and capability that fits in my pocket is just stunning.  I’m glad to have it with me.

What can they possibly think of next?  Do you use computational photography techniques?  Do you like or hate them?

Thanks for stopping by and reading my blog. Now – go compute some images!

©2016, Ed Rosack. All rights reserved

iPhone vs. "big" cameras?

When I’m traveling, I try to take an iPhone photo when I get to a new place.  Sometimes I forget, but when I remember, the iPhone’s GPS capability records the location for me.  Then when I’m back home, it’s easier to map out exactly where I’ve been.

This is one of the first photos I made on our trip out to Utah a few weeks ago:

Cedar Breaks National Monument amphitheater, iPhone panorama

When I posted it on Flickr, I commented “Straight out of the iPhone’s panorama mode. I’m not sure why I have all these other cameras.”  And I do like the photo.  Phone cameras do pretty well, especially in good light.  So I wondered …

When I got home and processed the rest of my photos, I took a look at some of the other iPhone images compared with similar images from my “big” cameras (interchangeable lens cameras with larger sensors).  Here’s another example:

Sunrise at Point Supreme, iPhone version – Panorama mode

Although the light was very pretty that morning, it was also very challenging for the iPhone sensor and lens.  I’ve tried to adjust the photo to be as similar as possible to the one below, but I can still see major differences.  I made the next photo a minute or so later and very near the same spot with an Olympus E-M5 II micro four-thirds camera and the 12 – 40mm f/2.8 Pro zoom lens.

Sunrise at Point Supreme – Olympus version – multi-image panorama

After looking at several cases where I had similar photos, I think this example shows why we need to keep our big cameras.

  • The exposure latitude and dynamic range of sensors larger than the iPhone’s mean that the dark areas have more detail and less noise, and the bright areas are less likely to blow out.  For high-contrast light (sunrise / sunset) this helps a lot.
  • The lens in the iPhone didn’t handle the flare / glare very well.
  • The resolution capabilities of phone cameras are growing.  But with careful capture, I can create much larger images with the big cameras.  For instance, the last photo above is 58 megapixels. The amount of detail in a file that large is enormous compared to a phone photo.
  • Control:  For me, the big cameras beat phone cameras in flexibility / control and ergonomics.  I can easily control everything from lens choice to aperture, ISO, shutter speed, etc.  You can get apps for your phone that add better controls, but I find them inconvenient and don’t often use the ones I have.
  • Color / white balance:  The default color and white balance on the phone are very good.  But when I use the big cameras, I can shoot in RAW format, which makes adjusting white balance and color much easier in post-processing.  RAW format also allows more adjustment latitude, since I’m working with a 14-bit file instead of an 8-bit JPEG (see the quick arithmetic sketch after this list).  RAW is coming to the iPhone soon, which should help.
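To put rough numbers on that bit-depth point – this is just counting tonal levels per channel, not a measurement of any particular camera or file:

```python
# Quick arithmetic on why bit depth matters for adjustment latitude.
raw_levels = 2 ** 14    # 16,384 tonal levels per channel in a 14-bit RAW
jpeg_levels = 2 ** 8    # 256 levels per channel in an 8-bit JPEG

print(f"14-bit RAW:  {raw_levels:,} levels per channel")
print(f"8-bit JPEG:  {jpeg_levels:,} levels per channel")
print(f"That's {raw_levels // jpeg_levels}x more tonal information to push around in post")
```

More levels to start with means big edits like white balance shifts and shadow pushes are less likely to show banding or posterization.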

So those are some of the reasons why I think big cameras are worth the extra weight / trouble of bringing them along.  I use my phone camera to supplement them.  How about you?

Thanks for stopping by and reading my blog. Now – go make some photos!

©2016, Ed Rosack. All rights reserved.