When it comes to design, the iPhone 11 Pro and 11 Pro Max are identical to the iPhone XS and XS Max, measuring 5.8 and 6.5 inches, respectively, with full-screen OLED displays that stretch from edge to edge and top to bottom with minimal bezels.
A notch at the front houses the TrueDepth camera system, front speakers, and other sensors, but other than the notch and a slim bezel that wraps around the side of each device, the iPhone 11 Pro and 11 Pro Max are all display.
The rounded corners of each display flow into a body made from a new matte glass material that is encased by a durable stainless steel frame, an upgrade from the aluminum frame in the iPhone 11. Apple designed the steel frame from an alloy created to match the body color, with nearly invisible antenna bands at the top and bottom.
There is no Home button, no bottom bezel, and no Touch ID fingerprint sensor, with the two devices using Face ID for biometric authentication purposes. The left side of the iPhone 11 Pro and Pro Max houses a standard mute switch and volume buttons, while the right side features a side button that doubles as a power button.
From the front, the iPhone 11 Pro and Pro Max don’t look different from the XS and XS Max, but Apple made some major changes to the back of the device. There’s now a large square-shaped camera bump that houses three lenses arranged in a triangle, with a flash and microphone nearby.
The camera bump is made from the same glass as the rest of the iPhone and flows right into the body of the device, but the three lenses do protrude, and the bump is noticeably larger than the dual-lens camera bump on the XS and XS Max.
Both the iPhone 11 Pro and the 11 Pro Max are a bit thicker and a bit heavier than their predecessors to account for the new triple-lens camera system.
The iPhone 11 Pro measures in at 144mm tall, 71.4mm wide, and 8.1mm thick. It weighs 188 grams. Comparatively, the iPhone XS was 143.6mm tall, 70.9mm wide, and 7.7mm thick, weighing in at 177 grams.
The iPhone 11 Pro Max measures in at 158mm tall, 77.8mm wide, and 8.1mm thick. It weighs 226 grams. The iPhone XS Max was 157.5mm tall, 77.4mm wide, and 7.7mm thick. It weighed in at 208 grams, so the iPhone 11 Pro Max is the heaviest iPhone Apple has released.
Colors and Finish
The iPhone XS and XS Max had a glossy finish, but for the iPhone 11 Pro and 11 Pro Max, Apple implemented a matte finish that looks more like brushed glass.
There are four colors this year: Silver, Space Gray, Gold, and Midnight Green. Midnight Green is a new color that Apple hasn’t ever used before, and it’s a deep, forest green shade that was made possible by ink techniques created by Apple supplier Seiko Advance.
Durability
Apple says the iPhone 11 Pro is made from the most durable glass ever in a smartphone, so in theory, it should hold up better to accidental bumps, drops, scratches, and other minor damage. It’s still glass, though, so it’s best to use a case or have AppleCare+ in case of accidental damage.
Apple says a “dual ion-exchange process” was used to strengthen the front and back glass to make it more durable than prior models.
Water and Dust Resistance
The iPhone 11 Pro, like the prior-generation iPhone XS, has an IP68 water resistance rating, but it is more water resistant than its predecessor. It is rated to survive a depth of four meters (13 feet) for up to 30 minutes, an improvement over the two-meter rating of both the iPhone XS and the current iPhone 11.
In the IP68 number, the 6 refers to dust resistance (and means the iPhone 11 Pro can hold up to dirt, dust, and other particulates), while the 8 pertains to water resistance. IP6x is the highest dust resistance rating that exists.
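The two digits of an IP code can be decoded independently, as described above. A minimal sketch, with digit descriptions paraphrased from the IEC 60529 standard and a helper name of my own:

```python
# Decode an IP (Ingress Protection) code such as "IP68".
# Digit meanings paraphrased from IEC 60529; illustrative, not exhaustive.

SOLID = {6: "dust tight (highest solid-particle rating)"}
LIQUID = {
    7: "immersion up to 1 m for 30 minutes",
    8: "continuous immersion beyond 1 m (depth set by manufacturer)",
}

def decode_ip(code: str) -> tuple[str, str]:
    """Split an 'IPxy' code into its solid and liquid protection levels."""
    solid, liquid = int(code[2]), int(code[3])
    return (
        SOLID.get(solid, f"solid-particle level {solid}"),
        LIQUID.get(liquid, f"liquid level {liquid}"),
    )

print(decode_ip("IP68"))
```

Note that the 8 only guarantees protection beyond one meter; the four-meter, 30-minute figure is Apple's own specification for the iPhone 11 Pro.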
With an IP68 water resistance rating, the iPhone 11 Pro can withstand splashes, rain, and brief accidental water exposure, but intentional water exposure should be avoided if possible. Apple warns that water and dust resistance are not permanent conditions and deteriorate as a result of normal wear.
Apple’s warranty does not cover liquid damage to iOS devices so it’s best to use caution when exposing the iPhone 11 Pro to liquids.
Spatial Audio and Dolby Atmos
The iPhone 11 Pro is built with a new spatial audio feature that’s designed to simulate surround sound for a more immersive audio experience. It also supports Dolby Atmos sound.
Super Retina XDR Display
The iPhone 11 Pro and iPhone 11 Pro Max use a “Super Retina XDR” display, which Apple says is its best display ever in an iPhone. The Super Retina XDR display supports Dolby Vision, HDR10, and a wide color gamut for unparalleled color accuracy.
The Super Retina display features vivid, true-to-life colors, deeper blacks, and, new this year, a 2,000,000:1 contrast ratio, up from 1,000,000:1.
Compared to a traditional LCD, such as the display in the iPhone 11, the iPhone 11 Pro’s display is noticeably higher quality, especially when it comes to highlights and shadows. Blacks are blacker, whites are whiter, and everything looks closer to real life.
Maximum brightness has been improved in the iPhone 11 Pro models, with 800 nits max brightness in typical use (up from 625 nits) and 1200 nits max brightness for HDR.
True Tone support is included, allowing the iPhone’s ambient light sensor to adjust the white balance of the display to match the ambient lighting in a room, cutting down on eyestrain for a more paper-like reading experience.
The iPhone 11 Pro, which has a 5.8-inch display, features a resolution of 2436 x 1125 at 458 ppi, while the 6.5-inch iPhone 11 Pro Max features a resolution of 2688 x 1242 at 458 ppi. Apple’s newest display is 15 percent more power efficient, which contributes to some impressive battery life gains in the iPhone 11 Pro models.
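As a sanity check on the 458 ppi figure, pixel density can be recomputed from resolution and diagonal. The marketed 5.8- and 6.5-inch diagonals are rounded (Apple quotes ppi from the exact panel diagonals), so this lands within a few ppi of Apple's number:

```python
import math

def ppi(w_px: int, h_px: int, diagonal_in: float) -> float:
    """Pixels per inch: diagonal pixel count divided by diagonal inches."""
    return math.hypot(w_px, h_px) / diagonal_in

# Marketed diagonals are rounded, so these land near Apple's quoted 458 ppi.
print(round(ppi(2436, 1125, 5.8)))   # 11 Pro: 463
print(round(ppi(2688, 1242, 6.5)))   # 11 Pro Max: 456
```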
The iPhone 11 Pro Max has received the highest grade ever for a display from testing and calibration firm DisplayMate. DisplayMate says the iPhone 11 Pro Max offers “considerably better display performance than other competing smartphones.”
Haptic Touch
3D Touch, a feature that had been available in iPhones since the iPhone 6s, has been eliminated from the entire 2019 iPhone lineup. The iPhone 11, iPhone 11 Pro, and iPhone 11 Pro Max all use a new Haptic Touch feature, which was first introduced in the iPhone XR.
Haptic Touch is similar to 3D Touch and offers much of the same functionality, but it is not pressure sensitive so there are not multiple functions for each press. Instead, Haptic Touch is like a long press with haptic feedback.
Haptic Touch can be used in many of the same places that supported 3D Touch, so most users lose nothing except the pressure-based feedback. For more on the difference between Haptic Touch and the previous 3D Touch, make sure to check out our Haptic Touch guide.
A13 Bionic Processor
An updated, next-generation A13 Bionic chip powers the iPhone 11 Pro and Pro Max. The A13 Bionic is faster and more efficient than the A12 Bionic chip in the previous-generation iPhones, and according to Apple, it is the fastest chip ever used in a smartphone.
The two performance cores in the A13’s CPU are up to 20 percent faster and use 30 percent less power than the A12, and the four efficiency cores are up to 20 percent faster and use up to 40 percent less power.
The GPU in the A13 is 20 percent faster than the GPU in the A12 and it uses 40 percent less power.
The A13 chip features a next-generation 8-core Neural Engine that Apple says is faster than ever for real-time photo and video analysis. A pair of Machine Learning Accelerators allows the CPU to perform matrix math up to six times faster, delivering more than 1 trillion operations per second.
The Neural Engine is up to 20 percent faster and uses up to 15 percent less power than the previous-generation Neural Engine. Apple says its Neural Engine powers the camera system, Face ID, AR apps, and more.
Core ML 3 lets developers leverage the power of the A13 Bionic in their apps and games.
RAM and Storage Space
The iPhone 11 Pro and Pro Max appear to have 4GB of RAM available to apps and the iOS system, but it is unclear whether there is extra RAM dedicated to the camera, as rumors have been mixed. Leaked benchmarks have indicated 4GB of RAM in the two devices, but benchmarks can be faked, and a leak from a Chinese source claims 6GB. We’ll need to wait for more information to know for sure.
As for storage space, the iPhone 11 Pro and Pro Max are available in 64GB, 256GB, and 512GB capacities.
TrueDepth Camera and Face ID
The iPhone 11 Pro and Pro Max are equipped with Face ID, the facial recognition biometric authentication system that Apple has been using since 2017. Face ID components are housed in the TrueDepth camera system in the notch on the front of the iPhone.
Apple in the iPhone 11 Pro and Pro Max has introduced an updated TrueDepth camera system that uses new hardware. It’s 30 percent faster than before at unlocking the device and authenticating passwords and purchases, plus it is designed to work from a wider range of angles.
Face ID is used across iOS for tasks like unlocking your iPhone, allowing access to third-party passcode-protected apps, confirming purchases in iTunes and the App Store, and authenticating Apple Pay payments.
Face ID works through a set of sensors and cameras built into the TrueDepth camera system in the iPhone 11 Pro and Pro Max. To create a 3D facial scan that maps the curves and planes of each unique face, a Dot Projector projects more than 30,000 invisible infrared dots onto the surface of the skin, which are then read by an infrared camera.
This facial depth map is then relayed to the A13 Bionic processor where it is transformed into a mathematical model that the iPhone uses to make sure it’s you attempting to access your iPhone.
Face ID uses infrared, so it works in low light and in the dark, with a built-in Flood Illuminator making sure there’s always adequate light to take a facial scan. Face ID works with hats, beards, glasses, sunglasses, scarves, makeup and all other accessories and items that might partially obscure a face, but it does need to see your eyes, nose, and mouth to work.
The A13 Bionic chip with built-in Neural Engine means that Face ID can adjust to minor facial changes over time, so if you grow your hair longer or grow a beard, Face ID adjusts and continues to unlock your iPhone.
Face ID Security and Privacy
Face ID uses a detailed 3D facial scan that’s unable to be fooled by a photo, mask, or other facial imitation. An “Attention Aware” security feature allows Face ID to unlock your device only when you look in the direction of the iPhone 11 Pro with your eyes open, so it does not work when your eyes are closed, when you’re sleeping, when you’re unconscious, or when you’re looking away from your phone.
The attention-aware feature is optional, and there is an accessibility setting to turn it off for those who are unable to focus on the iPhone’s screen, but most people should leave it on for the added layer of security.
With the attention aware feature, the iPhone 11 Pro knows when you’re looking at it. Face ID displays notifications and messages on the Lock screen when you look at the iPhone 11 Pro, it keeps the screen lit, and it automatically lowers the volume of an alarm or ringer when it knows your attention is on the iPhone 11 Pro’s display.
Face ID data is encrypted and stored in the Secure Enclave on the iPhone 11 Pro. Apple can’t access your Face ID data, nor can anyone who has your phone. Authentication happens entirely on your device, with no Face ID data ever stored in the cloud or uploaded to Apple. Third-party developers do not have access to the facial map that Face ID uses to unlock a device, but the TrueDepth camera can be used to scan a user’s face for the purpose of creating more realistic augmented reality apps.
With Face ID, there’s a 1 in 1,000,000 chance that someone else’s face can fool Face ID, but the error rate increases to 1 in 500,000 with an alternate appearance registered in iOS 13. Face ID has been fooled by identical twins, children, and a carefully crafted mask, but it’s still secure enough that the average person shouldn’t worry about their iPhone being unlocked by someone else.
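The halved figure follows from simple probability: enrolling an alternate appearance gives a would-be impostor two chances to match instead of one, roughly doubling the false-match rate. A quick sketch of the arithmetic:

```python
# Rough arithmetic behind the quoted Face ID false-match rates.
# With one enrolled face, Apple quotes ~1 in 1,000,000.
base = 1 / 1_000_000

# P(at least one of two independent matches); for tiny rates this is ~2 * base.
two_faces = 1 - (1 - base) ** 2

print(f"1 in {round(1 / two_faces):,}")  # 1 in 500,000
```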
TrueDepth Camera Specs
The TrueDepth camera system, in addition to powering Face ID, includes a standard front-facing camera that can be used for selfies.
In the iPhone 11 Pro, the front-facing camera has been upgraded to 12 megapixels from 7 megapixels in the iPhone XS, and it supports next-generation Smart HDR for better than ever contrast and color. The updated camera is capable of recording 60 fps video in 4K with support for extended dynamic range video at 30 fps.
When you take a selfie with the iPhone 11 Pro in standard portrait orientation, it uses a zoomed in 7-megapixel version. Turning your iPhone to landscape mode allows more into the frame and results in a 12-megapixel photo, as does tapping the little arrow icon to zoom out when in portrait orientation.
When using the new front-facing camera, you can turn the iPhone 11 Pro from portrait mode to landscape mode to zoom out to automatically capture more in the frame, which is useful for situations like group selfies or when you want to capture more of what’s behind you in a selfie.
The front-facing TrueDepth camera is able to capture 120 fps slo-mo videos for the first time, enabling a new feature that Apple is calling “Slofies.” These are slow motion front-facing camera videos similar to the slo-mo videos available from the rear facing camera in prior iPhones.
Animoji and Memoji
The TrueDepth Camera System supports two features called “Animoji” and “Memoji,” which are animated, 3D emoji characters that you control with your face. Animoji are emoji-style animals, while Memoji are customizable, personalized avatars that you can create.
To enable Animoji and Memoji, the TrueDepth camera analyzes more than 50 muscle movements in different areas of the face, detecting movement of the eyebrows, cheeks, chin, eyes, jaw, lips, and mouth.
All of your facial movements are translated to the Animoji/Memoji characters, letting them reflect your expression and emotion. Animoji and Memoji can be shared with friends and used in the Messages and FaceTime apps.
There are more than two dozen Animoji to choose from, modeled after existing emoji characters: mouse, octopus, cow, giraffe, shark, owl, warthog, monkey, robot, cat, dog, alien, fox, poop, pig, panda, rabbit, chicken, unicorn, lion, dragon, skull, bear, tiger, koala, t-rex, and ghost. An unlimited number of Memoji can be created to look like you and other people.
As of iOS 13, there are also non-animated Animoji and Memoji stickers that can be used in the Messages app and other areas of the operating system.
Triple-Lens Rear Camera
A triple-lens rear camera, a first for an iPhone, is the hallmark feature in the iPhone 11 Pro and Pro Max. There are telephoto and wide-angle lenses like before, along with a new ultra wide-angle camera lens.
All three lenses are 12 megapixels, and the differences between them are detailed below:
Ultra Wide-Angle Camera
13mm focal length
120 degree field of view
Rightmost lens on the camera bump
Wide-Angle Camera
Larger 12-megapixel sensor that lets in more light
Optical image stabilization
100 percent Focus Pixels
Top left lens on the camera bump
Telephoto Camera
Optical image stabilization
40 percent more light capture than the XS telephoto
Bottom left lens on the camera bump
According to Apple, with the new ultra wide-angle lens, iPhone users can capture up to four times more scene, which is ideal for landscape shots, architecture shots, group portraits, and tons more.
Apple recommends using the ultra wide-angle lens for an “artful perspective” when taking a close up shot, as it offers up unique angles thanks to the short focal length.
Using the three cameras, you can zoom from the telephoto all the way out to the ultra wide-angle lens, allowing for a 4x zoom. That’s 2x optical zoom in and 2x optical zoom out, with digital zoom up to 10x also available.
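The 4x figure falls out of the lens focal lengths. The 13mm ultra wide-angle value is given above; the 26mm wide and 52mm telephoto values are Apple's published 35mm-equivalent figures, assumed here:

```python
# Optical zoom range from 35mm-equivalent focal lengths.
# 13mm (ultra wide) is stated above; 26mm (wide) and 52mm (telephoto)
# are Apple's published figures, assumed for this sketch.
ultra_wide, wide, tele = 13, 26, 52

print(tele / wide)        # 2.0 -> "2x optical zoom in" from the wide lens
print(wide / ultra_wide)  # 2.0 -> "2x optical zoom out" from the wide lens
print(tele / ultra_wide)  # 4.0 -> total optical zoom range
```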
The Camera app interface has been improved with an updated look that displays the entire field of view captured by the ultra wide-angle lens, even when you’re taking a telephoto or standard wide-angle shot.
This is designed to let you see what an image could look like if you zoomed out, which you can do with a tap. There is a dedicated button in the camera app for switching between the three available lenses and their different focal lengths so you can get just the shot that you want.
Camera controls for swapping between the three lenses are available no matter what you’re doing in the camera app, from taking a photo, video, time lapse image, or slo-mo video.
To make all three cameras work together and function as one, Apple calibrated each camera individually for white balance, exposure, and other metrics. The three cameras are paired and calibrated for module to module alignment, with those calibrations applied to each image in real time.
Apple says that capturing an image is like taking raw images from three cameras and processing them for a consistent look and color, with that calculation happening in a split second. This process makes sure your photos look the same, whether you take them with the telephoto, wide-angle, or ultra wide-angle camera.
A next-generation Smart HDR feature is included in the iPhone 11 Pro and Pro Max, using advanced algorithms to bring out highlight and shadow detail in images. It’s also able to use machine learning to recognize faces in images, intelligently relighting them for the best possible detail in both the subject and the background.
This is a feature that Apple says even some DSLRs aren’t capable of handling.
The wide-angle camera in the iPhone 11 Pro features a larger sensor with 100 percent Focus Pixels to enable new low light capabilities like Night Mode, which is designed to take much brighter pictures in low lighting conditions. It’s similar to Google’s Night Sight feature, brightening the photo using complex software processing.
Night Mode turns on automatically in low lighting conditions, and there’s no need to use the flash with it. When you’re in an area with poor lighting, the camera takes multiple images while optical image stabilization works to steady the lens.
The A13 chip is then engaged to align images to correct for movement. Sections with too much blur are eliminated, while sharper images are fused together. The contrast is then adjusted, the colors are fine tuned, excess noise is eliminated, and details are enhanced to create a final image that looks much brighter and crisper than the lighting conditions would normally allow for.
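This is not Apple's pipeline, but the steps above (score already-aligned frames, discard the blurriest, fuse the rest to suppress noise) can be sketched in simplified form; the function name and gradient-based sharpness heuristic are illustrative choices of my own:

```python
import numpy as np

def fuse_frames(frames: list[np.ndarray], keep: int = 4) -> np.ndarray:
    """Toy multi-frame fusion in the spirit of Night Mode: rank aligned
    frames by a sharpness proxy, drop the blurriest, average the rest."""
    def sharpness(f: np.ndarray) -> float:
        # Variance of gradient magnitude: blurrier frames score lower.
        gy, gx = np.gradient(f.astype(np.float64))
        return float((gx**2 + gy**2).var())

    ranked = sorted(frames, key=sharpness, reverse=True)[:keep]
    return np.mean(ranked, axis=0)

# Noisy captures of the same scene: averaging N frames cuts noise ~sqrt(N).
rng = np.random.default_rng(0)
scene = rng.random((64, 64))
frames = [scene + rng.normal(0, 0.1, scene.shape) for _ in range(6)]
fused = fuse_frames(frames)
print(np.abs(fused - scene).mean() < np.abs(frames[0] - scene).mean())  # True
```

The real system additionally aligns frames to correct for hand movement, fuses selectively per region rather than averaging whole frames, and tunes contrast and color afterward.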
Apple says that users can experiment with manual controls in Night Mode to get even more detail and less noise if desired, so you can get just the look you’re going for even in situations where the lighting is far from ideal.
Portrait Mode in the iPhone 11 Pro models allows for photos that are focused on a subject in the foreground while the background is blurred.
Portrait Mode has been available since the iPhone 7 Plus, but in this year’s iPhones, Portrait Mode photos can be taken with either the telephoto lens or the wide-angle lens, thanks to the addition of the ultra wide-angle lens, which can be used for depth perception.
In the iPhone X, XS, and XS Max, Portrait Mode was limited to a telephoto focal length. The update means you can take Portrait Mode shots that are more zoomed out and have a wider field of view than before.
The iPhone 11 Pro supports Portrait Lighting, which allows the lighting effects of an image to be shifted using software. There are several different lighting options to choose from, including Natural, Studio, Contour, Stage, Stage Mono, and High-Key Mono.
As of iOS 13, Portrait Lighting effects can also be adjusted using an intensity slider, making them more useful because more subtle looks can be achieved.
Other Camera Features
Other available camera features include a 36 percent brighter True Tone flash, panoramas up to 63 megapixels, wide color capture, Live Photos support, advanced red-eye correction, and burst mode.
In iOS 13.2, Apple added a Deep Fusion feature, which is a new image processing system that uses the A13 Bionic and the Neural Engine. Deep Fusion uses advanced machine learning techniques to do pixel-by-pixel processing of photos, optimizing for texture, details, and noise in each part of the image.
Deep Fusion is aimed at improving indoor photos and photos taken in medium lighting. It’s a feature that activates automatically based on the lens being used and the light level in the room rather than being something that can be manually enabled.