Computational Photography

My acquisition of images has suffered greatly since the advent of digital photography. I used to build pinhole cameras and spend hours in darkrooms with smelly chemicals. When the kids were smaller I had ample inspiration to take lots of photos, but carrying around my full-size DSLR started to seem like a chore. Everyone and their dog is suddenly taking pictures with their phones. Instagram seems like such a cheat to me, each filter designed to mimic a type of photography from the past. When I get an image like this sent to me

[Photo copyright Kevin Chow]

I want to say, “Hey, look, you smeared Vaseline on your lens, just like the cinematographers of the ’30s.” Or when I see other filters, I start recreating the chemical process in my head: “hey, that’s a bleach bypass!” Yeah, yeah, Jon, tell us more about how analogue photography was soo much better. The processing power and image capture these cameras are capable of is actually an amazing thing. I get that. After his experience filming UNSANE, Steven Soderbergh recently stated that he may only use iPhones to make films from now on.

When I went to film school, I would have killed to have this portable filmmaking studio in my pocket!

The latest digital ‘improvement’ to photography is Apple’s new Portrait Mode on their iPhones. It enables you to mimic some serious studio lighting looks, the kind where someone has spent years fine-tuning how far away, and at what angle, large lights should be placed to create different effects on a face or a model. Well, it is pretty cool. It does a great job of creating the shallow depth of field that was lacking in these tiny cameras. This first image is with the regular camera mode, and you can see the background is very much in focus.


This second image was taken from almost the same position (farther back, actually), and you can see the background has been blurred out.


This blurring is due to a blending of two distinct images taken by the phone. Another cool effect the iPhone creates is those nice portraits against black backgrounds, as seen in the next set of images. Both photos were taken at the same location. The iPhone does the computation and deletes the background based on depth information from the dual cameras. No black backdrop needed.
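Out of curiosity, here is a rough sketch of how a depth-based effect like this might work, assuming you already have a per-pixel depth map (which the phone estimates from its two lenses). The file names and the 0.5 cutoff are made up for illustration, and this uses OpenCV and NumPy rather than anything Apple actually ships:

```python
import cv2
import numpy as np

# Load the photo and its depth map (file names here are placeholders).
image = cv2.imread("portrait.jpg")                     # H x W x 3, uint8
depth = cv2.imread("depth.png", cv2.IMREAD_GRAYSCALE)  # H x W, uint8
depth = depth.astype(np.float32) / 255.0               # normalize to 0..1

FAR = 0.5  # guessed cutoff between subject and background

# Boolean mask: True wherever a pixel is "far away".
background = depth > FAR

# Fake shallow depth of field: blur only the background pixels.
blurred = cv2.GaussianBlur(image, (51, 51), 0)
bokeh = np.where(background[..., None], blurred, image)

# Fake black backdrop: delete the background pixels entirely.
backdrop = image.copy()
backdrop[background] = 0

cv2.imwrite("bokeh.jpg", bokeh)
cv2.imwrite("black_backdrop.jpg", backdrop)
```

The same mask drives both looks: where the depth map says “far,” you either swap in a blurred copy of the pixel or zero it out. Which also hints at why the results fall apart when the depth map is wrong.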

The fun thing about this process is that it’s not perfect. Whatever algorithm it is using does not like glass, presumably because depth estimation struggles to find reliable matches through transparent, refractive surfaces. So you get images like this:

I have been playing around with the tolerance of the algorithm, trying to break it, as in the image above. Check out the way portions of the glassware itself have been totally wiped out. It is like a very bad Photoshop job.

I was able to use the algorithm to create this interesting image after a few tries.  This was taken on the dash of the van as we were driving back from Vancouver.  You should be able to see the highway behind the coffee cup.

Anyway, I will work on pushing the process to see what else I can come up with.