Er, didn't Google introduce this as a post-processing software feature? Where you could selectively blur parts of the image to give the "portrait" effect? Plus, isn't Apple just doing it in software as well?
You can do all sorts of stuff manually through post processing, but the whole point here is not to have to do post processing manually.
I can also do all sorts of other things on my dSLR (and now iPhone I guess) by shooting RAW and post-processing the crap out of it, but most of the time I just want to take a damn picture and have it look decent.
I'm not trying to completely disparage Apple, as this is a first iteration, but it seems like the best thing would have been to put in a larger, higher-megapixel sensor and a better lens.
I respectfully disagree. This is a great, great, great feature to have, if implemented correctly. One of the few things I've always hated about smartphone cameras is the deep depth of field. It almost always makes portrait shots on a smartphone look cheesy and crappy. If they can get this to work consistently and reduce the edge blurring, it will be truly great. If it stays the same as it is now, it will merely be good.
I couldn't give two chits about a higher MP sensor. Actually, I'd be happier if they just made Live Photos smoother instead of the 15 fps or whatever they are now.
How accurate is the depth sensing aspect? Multiple sensors definitely offer plenty of interesting applications, but this portrait mode didn't need them, and the zoom is not that impressive. And as far as I know, they're not really leveraging the other things they could do, if at all.
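For anyone curious what the depth sensing actually buys you here: conceptually, portrait mode just estimates a per-pixel depth map (from the disparity between the two cameras) and blurs only the pixels far from the focal plane. A toy sketch of that second step, in plain Python — the function name, the crude box blur, and the synthetic checkerboard/depth data are all my own illustration, not Apple's actual pipeline:

```python
def portrait_blur(image, depth, focus_depth, tolerance, radius=1):
    """Keep pixels whose depth is near focus_depth sharp; box-blur the rest.

    image, depth: 2D lists (rows of numbers) of the same shape.
    Depth units are arbitrary; in a real phone the map would come
    from stereo disparity between the two rear cameras.
    """
    h, w = len(image), len(image[0])
    out = []
    for y in range(h):
        row = []
        for x in range(w):
            if abs(depth[y][x] - focus_depth) <= tolerance:
                # In-focus subject: pass the pixel through untouched.
                row.append(image[y][x])
            else:
                # Background: average the surrounding window (crude blur).
                vals = [image[j][i]
                        for j in range(max(0, y - radius), min(h, y + radius + 1))
                        for i in range(max(0, x - radius), min(w, x + radius + 1))]
                row.append(sum(vals) / len(vals))
        out.append(row)
    return out

# Tiny synthetic example: 4x4 checkerboard, left half "near", right half "far".
image = [[(x + y) % 2 * 100 for x in range(4)] for y in range(4)]
depth = [[1 if x < 2 else 5 for x in range(4)] for y in range(4)]
out = portrait_blur(image, depth, focus_depth=1, tolerance=0.5)
```

The hard part on a phone isn't this step, it's getting a clean depth map in the first place — which is exactly where the edge-blurring artifacts people complain about come from.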
Another thing I hated most about iPhone cameras is the lack of zoom. Now they've got that covered as well, with a 2X zoom. No, it's a not a 10X optical zoom or anything, but it's a huge step forward for iPhones.
Apple literally addressed two of my biggest issues with iPhone cameras in one single release. Now what I want to see is further improved low-light performance, and optical image stabilization on the "telephoto" lens (because currently only the wide-angle lens has OIS).
Apple isn't the first, but nonetheless, it's a great feature to have on an iPhone.
Also it highlights exactly why I hate this iterative advancement in the whole industry.
People don't design revolutionary products every day. Things need time to progress. Now granted, Apple also tends to "iterate" its releases in order to keep momentum in its sales, but it's not as if Apple actually develops and builds camera sensors. And for the portrait mode, they bought that a few years back by paying $20 million for a company called LinX.
As a side note, has anyone checked out the DxO camera module? I hadn't heard about it at all until I googled it and went to the site (I'd forgotten what it was called and never see it referenced on here any more, but it was pretty popular for a while there). $499 is pretty high (and I'd personally put that towards something like the Nikon 360 camera), but it seems like there's still plenty of room for a company to grab some low-hanging fruit with real camera add-ons for smartphones. For instance, I feel like Apple would have been better off making a case that housed an improved flash and lens: a big lens with an electronic mechanism allowing it to adjust a bit to offer some zoom as well as other focal capabilities, all controlled by the phone. As a bonus it could have provided a 3.5mm jack for an optional headset/mic, and even a battery boost for those that want it; it'd be heralded by everyone, and they could sell it for $200 easy. Of course the placement of the camera itself limits what Apple could offer for the lens (yes, putting it where Apple does limits the camera bump, but it still just seems non-optimal).
DxO would never sell that thing in super high numbers. Basically it makes a simple interface and form factor suck. I'd rather just go whole hog and take my Canon 7D instead, or perhaps a Sony RX100.
I don't see this as low-hanging fruit at all. It's a tiny niche market, and one that can't even compare to Apple Watch.