Ever since the iPhone 7 Plus was released, I have been experimenting with the Portrait Mode effect. At first it was in beta and only worked half the time, but after a few updates it got much better, and I started getting truly remarkable results with it.

This summer, Apple released iOS 11, which gives developers access to the dual cameras and the depth data behind the Portrait Mode effect. The first app that stood out to me was the Anamorphic app by BrainFeverMedia. It uses the Depth API to add a truly realistic-looking anamorphic blur to your images. Check out our video review/tutorial below:

After using this app for the last few weeks, I can honestly say the images I am getting are still blowing my mind.

Taken using iPhone 7 Plus and the Anamorphic App
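For the technically curious, here is roughly what that Depth API hands developers. This is a minimal Swift sketch, not BrainFeverMedia's actual code; the helper name and file path are placeholders. It reads the disparity map that iOS 11 saves alongside a Portrait Mode JPEG, which is what an app can then use to drive its own blur.

```swift
import AVFoundation
import ImageIO

// Minimal sketch: pull the disparity (depth) map that iOS 11 embeds in a
// Portrait Mode photo so it can drive a custom blur.
func loadDisparityMap(from url: URL) -> CVPixelBuffer? {
    guard let source = CGImageSourceCreateWithURL(url as CFURL, nil),
          let auxInfo = CGImageSourceCopyAuxiliaryDataInfoAtIndex(
              source, 0, kCGImageAuxiliaryDataTypeDisparity) as? [AnyHashable: Any],
          let depthData = try? AVDepthData(fromDictionaryRepresentation: auxInfo)
    else { return nil }

    // Convert to 32-bit float disparity so every photo is handled the same way.
    return depthData
        .converting(toDepthDataType: kCVPixelFormatType_DisparityFloat32)
        .depthDataMap
}

// Example usage (placeholder path):
// let map = loadDisparityMap(from: URL(fileURLWithPath: "portrait.jpg"))
```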

I recommend shooting everything with the built-in Apple Portrait Mode first and then importing it into the Anamorphic app. That way you give yourself options to choose from; sometimes the built-in Apple effect works better and looks more natural.

The thing that excites me most about this technology is where it will go from here. Will we see this type of computational photography trickle into the pro market anytime soon? The Lytro Cinema camera seems promising, but only for million-dollar Hollywood films.

The Light L16 seems cool too (we hope to review the L16 soon). What excites me even more, though, is the thought that one day pro cameras will use this depth data to apply lens simulations. Imagine taking a photo and then applying a vintage Leica Summilux lens profile to your image inside of Lightroom. For video shooters, imagine adding the blur of a $100,000 Panavision anamorphic lens to your footage in post.
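Nothing in principle stops software from doing a rough version of that with today's depth data. Purely as an illustration (a hedged sketch, not any vendor's actual pipeline), the disparity map from the earlier snippet could drive a Core Image variable blur, with a single radius value standing in for whatever a real lens profile would describe:

```swift
import CoreImage
import CoreVideo

// Hypothetical sketch of "lens simulation in post": blur the background more
// than the subject by using the Portrait Mode disparity map as a mask.
// The radius is only a stand-in for a real lens profile.
func applyDepthBlur(to image: CIImage, disparity: CVPixelBuffer, radius: Double) -> CIImage {
    // Near subjects have high disparity; invert so the background ends up
    // bright (more blur) and the subject dark (stays sharp).
    let scale = image.extent.width / CGFloat(CVPixelBufferGetWidth(disparity))
    let mask = CIImage(cvPixelBuffer: disparity)
        .applyingFilter("CIColorInvert")
        .applyingFilter("CIBicubicScaleTransform", parameters: ["inputScale": scale])

    return image.applyingFilter("CIMaskedVariableBlur",
                                parameters: ["inputMask": mask,
                                             "inputRadius": radius])
}
```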

We have a lot to look forward to as this technology advances! The iPhone 7 Plus, 8 Plus, and X are a great starting point! Be sure to subscribe to our YouTube channel and follow us on Facebook.
