The iPhone is arguably the most transformative device since the automobile. From its debut, it has dominated cell phone sales, at times outselling every Android phone combined. Since the 2012 iPhone 5 and its eight-megapixel camera, the iPhone has also eclipsed every other camera on the planet in popularity. And thanks to the rise of the photo-specific app Instagram, we collectively snapped 1.2 trillion photos in 2017. Suddenly the world's most shared, richest, most emotionally resonant, and arguably most accessible art form is photography.

The sheer popularity of the iPhone has made this photography revolution possible. Unlike traditional camera makers, for whom image quality depends heavily on sensor size (a full-frame DSLR's sensor captures about 45 megapixels), Apple has rather obstinately stuck with relatively small lenses and a mere 12MP sensor.

Here is where the power of selling 200 million phones a year, versus the far smaller volume of digital camera sales or of smartphone competitors, comes in: it lets Apple invest in computational analysis and build better software, nearly rendering sensor size irrelevant. How? In two important ways.

iPhone Xs and Xs Max

First, the dual-lens tech on the 2016 iPhone 7 Plus allowed Apple's proprietary software to capture several shots at once and study the entire image as you shoot. The software looks simultaneously at the intended subject and the available lighting to create the best possible single image.

Now, with the debut of two new iPhones, the Xs and the Xs Max, Apple is taking computational photography even further. Like the original X, the new phones have dual cameras: one for 2x zoom and one that defaults to wide angle. But Apple has also added what it calls Smart HDR. An HDR image is several exposures sandwiched into one. A professional photographer might "bracket" a series of shots taken with a DSLR, then fuse them with software into a single photo that brightens shadows and balances highlights. This works beautifully for landscapes, but non-pros have struggled to strike that balance in a portrait. Until now.
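To see the idea behind bracketing and fusion, here is a toy Python sketch (my own illustration, not Apple's actual pipeline). Each "frame" is a list of grayscale values from 0.0 to 1.0, and each fused pixel is weighted toward whichever exposure rendered it closest to mid-gray, in the spirit of classic exposure-fusion techniques:

```python
import math

def well_exposedness(p, sigma=0.2):
    """Weight a pixel value (0.0-1.0) by its closeness to mid-gray;
    badly under- or over-exposed pixels get small weights."""
    return math.exp(-((p - 0.5) ** 2) / (2 * sigma ** 2))

def fuse_exposures(exposures):
    """Blend aligned grayscale frames pixel by pixel: each output
    pixel is a weighted average, with well-exposed pixels counting
    more than blown-out or crushed ones."""
    fused = []
    for pixels in zip(*exposures):
        weights = [well_exposedness(p) for p in pixels]
        total = sum(weights)
        fused.append(sum(w * p for w, p in zip(weights, pixels)) / total)
    return fused

# Three toy 4-pixel frames: underexposed, normal, overexposed.
under  = [0.05, 0.10, 0.20, 0.30]
normal = [0.30, 0.45, 0.55, 0.70]
over   = [0.70, 0.85, 0.95, 0.98]

fused = fuse_exposures([under, normal, over])
print(fused)
```

The fused result keeps the dark frame's intact highlights and the bright frame's open shadows, which is essentially what the phone does across dozens of frames, millions of pixels at a time.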

Because the tech in the Xs and Xs Max automatically takes multiple shots at once, a single tap of the shutter in essence snaps dozens of shots at different exposures. The technology analyzes all that information in an instant (performing up to one trillion operations) and merges it into one shot with a perfectly lit background, foreground, and skin tone. If you're shooting action, the camera detects the motion; before you even press the shutter, it "pre-shoots" four frames so you don't miss the moment. Naturally, it also compares the focus of all those shots to make sure the result is ultra-sharp. And if you're using Portrait mode, which softly blurs the background to create what's called bokeh (the effect that makes your subject pop forward), you can adjust the background blur after shooting. Too much bokeh, so the subject looks artificially cut out of the scene? No worries; you can dial the effect back.
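Adjusting the blur after the fact is possible because the phone records depth alongside the photo. Here is a toy sketch of the principle (my own illustration, not Apple's actual method): a 1D "image" with a per-pixel depth map, where only pixels beyond the focus plane get blurred, and the blur radius acts as the after-the-fact bokeh slider.

```python
def box_blur(pixels, radius):
    """Simple box blur: average each pixel with its neighbors."""
    if radius == 0:
        return list(pixels)
    n = len(pixels)
    out = []
    for i in range(n):
        lo, hi = max(0, i - radius), min(n, i + radius + 1)
        out.append(sum(pixels[lo:hi]) / (hi - lo))
    return out

def apply_bokeh(pixels, depth, focus_depth, radius):
    """Blur only pixels farther away than the focus plane; `radius`
    plays the role of the adjustable bokeh slider."""
    blurred = box_blur(pixels, radius)
    return [b if d > focus_depth else p
            for p, b, d in zip(pixels, blurred, depth)]

# Toy 1D scene: bright subject in the middle, dark background.
scene = [0.2, 0.2, 0.9, 0.9, 0.2, 0.2]
depth = [5.0, 5.0, 1.0, 1.0, 5.0, 5.0]   # meters from the camera

strong = apply_bokeh(scene, depth, focus_depth=2.0, radius=2)
subtle = apply_bokeh(scene, depth, focus_depth=2.0, radius=1)
```

The subject pixels pass through untouched in both versions; only the background softens, more or less, depending on the slider.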


Three other standout features round out the new phones. First, the Xs and Xs Max screens are larger, yet both phones are physically smaller than Apple's biggest handset, the 8 Plus. That's because the screens dominate the faces of the new phones, while the 8 Plus's screen is comparatively small. So the Xs Max won't feel inordinately large, and if you want more screen in a smaller phone, the Xs is the way to go.

Both phones are also more water resistant, rated for submersion in more than six feet of water (chlorinated pool water included). And both get dual-SIM tech, a fantastic feature for vacation and business travel alike: you can keep two numbers in a single phone with the billing distinct, or, if you frequently jet abroad, use an overseas carrier for your second number.