The history of smartphone photography is a never-ending chase after the features of full-sized cameras while keeping the camera modules slim enough to fit inside a thin phone. Impossibly and amazingly, the smartphone is winning this battle. A modern smartphone still can't replace a professional-grade DSLR at a sporting event, but it can take one hell of a good photo, especially in daylight. Megapixel counts seem to have peaked, and manufacturers have mostly settled on modules ranging from 12 to 20 megapixels. But how do you tackle the physical constraints of the tiny lenses that go into smartphone camera modules?

One solution is to add more lenses. Apple's iPhone 7 Plus was one of the first major smartphones to bring something akin to optical zoom, with two lenses of different focal lengths doing the work. So what's next? China's Oppo hinted at one possibility at Mobile World Congress in February 2017, showing a camera module that offers 5x "lossless" zoom. The details of the tech were somewhat murky, but the idea again revolves around using two lenses: one standard, and another a horizontally positioned telescope lens that receives light through a prism. And Light's L16 camera comes with 16 lenses and takes 10 photos at once, combining them into a stunning 52-megapixel image.

It's fair to assume that Apple is working on (or will license) similar technology, and we wouldn't be too surprised if an iPhone in the near future (say, iPhone 2020) offered optical zoom even better than today's, perhaps as much as 5x.
Oppo's periscope lens allows for 5x "lossless" zoom. Credit: Oppo

Another area where digital photography is advancing rapidly is post-processing. Google showed how much can be done with post-processing in its Pixel phones, which take amazing photos without optical image stabilization. And a Google engineer recently figured out a way to take absolutely stellar photos in near-total darkness by capturing a large number of frames in a short time and then working some computational magic to remove nearly all traces of graininess. Finally, apps like Prisma employ the power of neural networks to turn regular photos into artwork, and I'm sure Apple's code wizards are looking into ways to employ AI to improve smartphone photography. This will not only improve the quality of the photos themselves but do wonders for how they are tagged and organized.

Put all of that together and you get a very bright future for smartphone photography, one in which the physical constraints of smartphone cameras become less relevant as lenses multiply and the work is handed over to incredibly powerful processors.

All this tech is out there for the taking by Apple. Future iPhones might easily run billions of computations on each photo you take in order to churn out that perfect nighttime panorama or club selfie. And with a couple of cleverly positioned lenses working in unison, features such as lossless optical zoom and huge megapixel counts -- currently reserved for standalone cameras -- should become bullet points on the iPhone spec sheet.
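For the curious, the core idea behind that low-light trick is surprisingly simple to sketch. This is not Google's actual pipeline (a real implementation also has to align handheld frames and handle motion); it's just the basic principle, shown here in a few lines of illustrative Python: sensor noise is random and independent from frame to frame, so averaging a burst of exposures cancels much of it out while the scene itself survives.

```python
import numpy as np

def stack_frames(frames):
    """Average a burst of aligned noisy frames.

    Noise is independent between frames, so averaging N exposures
    reduces it by roughly a factor of sqrt(N), while the underlying
    scene signal is preserved.
    """
    stack = np.stack([f.astype(np.float64) for f in frames])
    return stack.mean(axis=0)

# Simulate a dark scene: a faint gradient buried in heavy sensor noise.
rng = np.random.default_rng(0)
scene = np.linspace(10, 30, 64).reshape(1, 64).repeat(64, axis=0)
frames = [scene + rng.normal(0, 25, scene.shape) for _ in range(64)]

# Compare one noisy frame against the 64-frame stack.
single_error = np.abs(frames[0] - scene).mean()
stacked_error = np.abs(stack_frames(frames) - scene).mean()
# The stacked image's average error is a small fraction of a single frame's.
```

With 64 simulated frames the averaged image is dramatically cleaner than any single exposure, which is why burst photography can pull usable shots out of near-total darkness.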