Like the iPhone XR, the iPhone 11 features a precision-machined 7000 series aluminum frame that wraps around an all-glass enclosure. With its 6.1-inch display, the iPhone 11 is between the 5.8-inch iPhone 11 Pro and 6.5-inch iPhone 11 Pro Max in size.
The iPhone 11 has an edge-to-edge display with slim bezels and no Home button, adopting a notch at the top for the TrueDepth camera system. Because it uses an LCD rather than an OLED display, the iPhone 11 has slightly thicker bezels than the iPhone 11 Pro models.
Other than the notch at the top for the Face ID camera, speaker, and ambient light sensor, the iPhone 11 is all display.
The iPhone 11 measures in at 150.9mm tall, 75.7mm wide, and 8.3mm thick, which is identical to the previous-generation iPhone XR. It weighs in at 6.84 ounces, also identical to the XR. It is thicker than the iPhone 11 Pro and iPhone 11 Pro Max, which measure in at 8.1mm.
The back of the iPhone 11 features the most significant design change thanks to the new dual-camera system. Apple is using a new square-shaped camera bump that flows into the rest of the device. The two camera lenses protrude slightly from the back of the iPhone as the camera elements are thicker than the body of the iPhone.
The Apple logo on the iPhone 11 has been relocated relative to previous iPhone models. It’s now in the middle of the device rather than towards the top, a change that may have been implemented for a planned two-way charging feature that was later scrapped. The bilateral wireless charging option would have allowed the iPhone 11 to be used to charge other iPhones, the Apple Watch, the AirPods, and other Qi-based devices.
More Durable Glass
According to Apple, the iPhone 11 is made from the most durable glass ever in a smartphone, so in theory, it should hold up better to accidental bumps and drops. It’s still glass, though, so it’s best to use a case or have AppleCare+ in case of accidental damage.
Apple says it’s using a “dual ion-exchange process” to strengthen the front and back glass of the iPhone 11.
The iPhone 11 continues to be available in six colors, but Apple introduced new colors this year. It comes in black, green, yellow, purple, (PRODUCT)RED, and white, with Apple eliminating the coral and blue shades that the iPhone XR was available in.
Water and Dust Resistance
The iPhone 11 has an IP68 water resistance rating, which is up from IP67 in the previous-generation iPhone XR. It’s rated to survive a depth of up to two meters (6.5 feet) for up to 30 minutes. Two meters is double the depth rating of the iPhone XR, but half that of the iPhone 11 Pro.
The iPhone 11 Pro is able to stand up to submersion in water up to four meters deep (13 feet) for up to 30 minutes.
With an IP68 water resistance rating, the iPhone 11 can withstand splashes, rain, and brief accidental water exposure, but intentional water exposure should be avoided. Apple warns that water and dust resistance are not permanent conditions and deteriorate as a result of normal wear.
Apple’s warranty does not cover liquid damage to iOS devices so it’s best to use caution when exposing the iPhone 11 to liquids.
Spatial Audio and Dolby Atmos
The iPhone 11 is built with a new spatial audio feature that’s designed to simulate surround sound for a more immersive audio experience. It also supports Dolby Atmos sound.
Like the iPhone XR, the iPhone 11 uses an LCD display that Apple calls the "Liquid Retina HD" display. It measures in at 6.1 inches and features a 1792 x 828 resolution at 326 pixels per inch.
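As a quick sanity check, pixel density follows directly from the resolution and diagonal size. The illustrative Python below computes it; the result lands just under the quoted 326 ppi because the marketing "6.1-inch" diagonal is slightly rounded up from the panel's actual size.

```python
import math

def pixels_per_inch(width_px, height_px, diagonal_in):
    """Pixel density from a display's resolution and diagonal size."""
    diagonal_px = math.hypot(width_px, height_px)  # pixels along the diagonal
    return diagonal_px / diagonal_in

# iPhone 11 "Liquid Retina HD" panel
ppi = pixels_per_inch(1792, 828, 6.1)
print(round(ppi))  # 324, within rounding of Apple's quoted 326 ppi
```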
Though the iPhone 11 has an LCD rather than an OLED display, Apple says it is the most advanced LCD that's been introduced in a smartphone, constructed with new engineering techniques. It is identical to the display in the iPhone XR, and is inferior to the iPhone 11 Pro and Pro Max displays.
The iPhone 11 display continues to support Apple's latest technology advancements, including Tap to Wake to activate the display with a single tap, a swipe-based gesture system that replaces the Touch ID Home button, True Tone for matching the white balance of the display to the ambient lighting, and wide color for vivid, true to life colors.
It has a 1400:1 contrast ratio, which is one of the areas where it falls far short of the iPhone 11 Pro models. The iPhone 11 Pro and Pro Max have a 2,000,000:1 contrast ratio, which means their colors are richer, blacks are blacker, and there is HDR support, which is unavailable in the iPhone 11.
Apple eliminated the 3D Touch feature in the iPhone XR and replaced it with a new Haptic Touch option, which has now rolled out to the entire 2019 iPhone lineup.
Haptic Touch is similar to 3D Touch and offers much of the same functionality, but it is not pressure sensitive so there are not multiple functions for each press. Instead, Haptic Touch is like a long press with haptic feedback. For more on the difference between Haptic Touch and the previous 3D Touch, make sure to check out our Haptic Touch guide.
The iPhone 11 is equipped with an A13 Bionic chip that’s faster and more efficient than the A12 Bionic chip in the iPhone XR. Apple says that the A13 Bionic is the fastest chip ever in a smartphone and so advanced that it’s “years ahead of the pack.”
The CPU’s two performance cores in the A13 are up to 20 percent faster and use 30 percent less power than the A12, and the four efficiency cores are up to 20 percent faster and use up to 40 percent less power.
The GPU in the A13 is 20 percent faster than the GPU in the A12 and it uses 40 percent less power.
According to testing by AnandTech, the A13 in the iPhone 11 and 11 Pro offers 50 to 60 percent higher sustained graphics performance than the iPhone XS and 20 percent faster CPU performance.
Neural Engine
The A13 chip features a next-generation 8-core Neural Engine that Apple says is faster than ever for real-time photo and video analysis. A pair of Machine Learning Accelerators allow the CPU to run up to six times faster, delivering more than 1 trillion operations per second.
The Neural Engine is up to 20 percent faster and uses up to 15 percent less power than the previous-generation Neural Engine. Apple says its Neural Engine powers the camera system, Face ID, AR apps, and more.
Core ML 3 lets developers leverage the power of the A13 Bionic in their apps and games.
RAM and Storage Space
While there hasn’t been a teardown yet, rumors and benchmarks have suggested the iPhone 11 is equipped with 4GB RAM, up from 3GB RAM in the iPhone XR.
The iPhone 11 is available in 64, 128, and 256GB capacities.
TrueDepth Camera and Face ID
Face ID, Apple's biometric authentication system introduced in 2017, is used in the iPhone 11, enabled by the TrueDepth camera system housed in the notch.
In the iPhone 11, the TrueDepth camera system has been improved with new hardware. It’s faster and it’s able to work from a wider range of angles, so it’s more efficient and quicker than ever.
Face ID is used across the iOS operating system for tasks like unlocking your iPhone, allowing access to third-party passcode-protected apps, confirming purchases in iTunes and the App Store, and authenticating Apple Pay payments.
Face ID works through a set of sensors and cameras built into the front of the iPhone 11, called the TrueDepth Camera system. To create a facial scan, a Dot Projector projects more than 30,000 invisible infrared dots onto your face, which are then read by an infrared camera.
This depth map of your face is then relayed to the A13 Bionic processor where it is transformed into a mathematical model that the iPhone uses to make sure it’s you attempting to access your iPhone.
Face ID uses infrared, so it works in low light and in the dark, with a built-in Flood Illuminator making sure there’s always adequate light to take a facial scan. Face ID works with hats, beards, glasses, sunglasses, scarves, makeup and all other accessories and items that might partially obscure a face, but it does need to see your eyes, nose, and mouth to work.
The A13 Bionic chip with built-in Neural Engine means that Face ID can adjust to minor facial changes over time, so if you grow your hair longer or grow a beard, Face ID adjusts and continues to unlock your iPhone.
Face ID Security and Privacy
Face ID uses a detailed 3D facial scan that’s unable to be fooled by a photo, mask, or other facial imitation. An “Attention Aware” security feature allows Face ID to unlock your device only when you look in the direction of the iPhone 11 with your eyes open, so it does not work when your eyes are closed, when you’re sleeping, when you’re unconscious, or when you’re looking away from your phone.
Attention aware is optional and there is an accessibility feature to turn it off for those who are unable to focus on the iPhone’s screen, but most people should leave it turned on for the added layer of security.
With the attention aware feature, the iPhone 11 knows when you’re looking at it. Face ID displays notifications and messages on the Lock screen when you look at the iPhone 11, it keeps the screen lit, and it automatically lowers the volume of an alarm or ringer when it knows your attention is on the iPhone 11’s display.
If a thief demands your iPhone, Face ID can be disabled quickly and discreetly by pressing the side button and a volume button at the same time. Do this before handing your phone over, and a thief won't be able to scan your face. Face ID also stops working after five failed facial recognition attempts, and a passcode needs to be entered before it can be used again.
Face ID data is encrypted and stored in the Secure Enclave on the iPhone 11. Apple can’t access your Face ID data, nor can anyone who has your phone. Authentication happens entirely on your device, with no Face ID data ever stored in the cloud or uploaded to Apple. Third-party developers do not have access to the facial map that Face ID uses to unlock a device, but the TrueDepth camera can be used to scan a user’s face for the purpose of creating more realistic augmented reality apps.
With Face ID, there’s a 1 in 1,000,000 chance that someone else’s face can fool Face ID, but the error rate increases to 1 in 1 in 500,000 with an alternate appearance registered in iOS 13. Face ID has been fooled by identical twins, children, and a carefully crafted mask, but it’s still secure enough that the average person shouldn’t worry about their iPhone being unlocked by someone else.
TrueDepth Camera Specs
The TrueDepth camera system, in addition to powering Face ID with the additional biometric components, is also a standard front-facing camera that can be used for selfies.
In the iPhone 11, the front-facing camera has been upgraded to 12 megapixels from 7 megapixels in the iPhone XR, and it supports next-generation Smart HDR. The updated camera is capable of recording 60 fps video in 4K with support for extended dynamic range video at 30 fps.
When using the new front-facing camera, you can turn the iPhone from portrait mode into landscape mode to automatically capture more in the frame, which is useful for situations like group selfies.
When you take a selfie with the iPhone 11 in standard portrait orientation, it uses a zoomed in 7-megapixel version. Turning your iPhone to landscape mode allows more into the frame and results in a 12-megapixel photo, as does tapping the little arrow icon to zoom out when in portrait orientation.
The front-facing TrueDepth camera is able to capture 120 fps slo-mo videos for the first time, enabling a new feature that Apple is calling “Slofies.” These are slow motion front-facing camera videos similar to the slo-mo videos available from the rear facing camera in prior iPhones.
Animoji and Memoji
The TrueDepth Camera System supports two features called “Animoji” and “Memoji,” which are animated, 3D emoji characters that you control with your face. Animoji are emoji-style animals, while Memoji are customizable, personalized avatars that you can create.
To enable Animoji and Memoji, the TrueDepth camera analyzes more than 50 muscle movements in different areas of the face, detecting movement of the eyebrows, cheeks, chin, eyes, jaw, lips, and mouth.
All of your facial movements are translated to the Animoji/Memoji characters, letting them reflect your expression and emotion. Animoji and Memoji can be shared with friends and used in the Messages and FaceTime apps.
There are 20 different Animoji to choose from, modeled after existing emoji characters: monkey, robot, cat, dog, alien, fox, poop, pig, panda, rabbit, chicken, unicorn, lion, dragon, skull, bear, tiger, koala, t-rex, and ghost. There are an unlimited number of Memoji that can be created to look like you and other people.
As of iOS 13, there are also Animoji and Memoji stickers that can be used in the Messages app and other areas of the operating system.
The major new feature in the iPhone 11 is an upgraded dual-lens camera system. It includes an f/1.8 6-element 12-megapixel wide-angle lens (26mm focal length) and an f/2.4 5-element 12-megapixel ultra wide-angle lens (13mm focal length), up from a single 12-megapixel camera lens in the iPhone XR.
The new ultra wide-angle lens has a 120 degree field of view, which is ideal when you want to get a landscape or architecture shot, or fit more in the frame close up. Unlike the iPhone 11 Pro, it does not have a telephoto lens, so while 2x optical zoom out is supported, there’s no optical zoom in feature.
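Those focal lengths explain the "2x optical zoom out" figure, and a back-of-the-envelope field-of-view estimate lands close to Apple's 120-degree claim. The sketch below assumes the quoted numbers are 35mm-equivalent focal lengths and uses the full-frame diagonal of about 43.3mm, so it's a rough check rather than Apple's specification:

```python
import math

FULL_FRAME_DIAGONAL_MM = 43.3  # diagonal of a 35mm full-frame sensor

def diagonal_fov_degrees(focal_length_mm):
    """Diagonal field of view for a 35mm-equivalent focal length."""
    return math.degrees(2 * math.atan(FULL_FRAME_DIAGONAL_MM / (2 * focal_length_mm)))

print(26 / 13)                          # 2.0 -> the "2x zoom out" factor
print(round(diagonal_fov_degrees(13)))  # 118 degrees, near the quoted 120
print(round(diagonal_fov_degrees(26)))  # 80 degrees for the wide-angle lens
```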
The standard wide-angle camera supports Optical Image Stabilization, but the ultra wide-angle lens does not.
With the new ultra wide-angle lens, Apple is introducing an updated camera app interface that displays the entire field of view captured by the ultra wide lens even when you’re taking a picture with the standard wide-angle lens. Toggling between modes can be done with a tap.
The iPhone 11 is equipped with next-generation Smart HDR, which Apple says better recognizes people, treating them differently from the rest of the shot. Faces feature highlights, shadows, and natural-looking skin tones while background elements are preserved.
The wide-angle camera in the iPhone 11 has a new, larger sensor with 100 percent Focus Pixels that enables new low light capabilities such as a Night mode that's designed to take much brighter pictures in low lighting conditions. It's similar to Google's Night Sight mode, brightening up the photo using software.
Night mode turns on automatically in low lighting conditions, and there’s no need to use the flash with it. When you’re in an area with poor lighting, the camera takes multiple images while optical image stabilization works to steady the lens.
The A13 chip is then engaged to align images to correct for movement. Sections with too much blur are eliminated, while sharper images are fused together. The contrast is then adjusted, the colors are fine tuned, excess noise is eliminated, and details are enhanced to create a final image that looks much brighter and crisper than the lighting conditions would normally allow for.
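As a rough illustration of that multi-frame idea (not Apple's actual pipeline), the NumPy sketch below scores each frame with a simple Laplacian-based sharpness proxy, keeps the sharpest frames, and averages them to cut noise. The sharpness metric and the keep ratio are assumptions made for the sketch:

```python
import numpy as np

def sharpness(frame):
    # Variance of a simple Laplacian response as a sharpness proxy
    # (an assumption for this sketch; blurrier frames score lower).
    lap = (-4 * frame
           + np.roll(frame, 1, axis=0) + np.roll(frame, -1, axis=0)
           + np.roll(frame, 1, axis=1) + np.roll(frame, -1, axis=1))
    return lap.var()

def fuse_frames(frames, keep_ratio=0.5):
    """Keep the sharpest frames and average them to reduce noise."""
    scores = [sharpness(f) for f in frames]
    k = max(1, int(len(frames) * keep_ratio))
    best = np.argsort(scores)[-k:]  # indices of the k sharpest frames
    return np.mean([frames[i] for i in best], axis=0)
```

Averaging several noisy exposures of the same scene suppresses random sensor noise, which is why the fused result looks cleaner than any single frame.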
Though the iPhone 11 has no telephoto lens, it is still able to take Portrait mode shots using the other camera lenses, much like the iPhone XR. The two cameras work together to create Portrait Mode photos where the subject of the photo is in focus and the background is blurred, similar to the effect that you get with a DSLR.
Portrait Mode in the iPhone 11 has been improved because it works with people, pets, food, and other objects. With the iPhone XR, Portrait Mode was limited to shots of people.
The iPhone 11 supports Portrait Lighting, which allows the lighting effects of an image to be shifted using software. More lighting modes are supported in the iPhone 11, including Natural, Studio, Contour, Stage, Stage Mono, and High-Key Mono. Stage and Stage Mono were not available in the XR.
As of iOS 13, Portrait Lighting effects can be adjusted using an intensity slider, making them more useful because more subtle looks can be achieved.
Other Camera Features
Other available camera features include a 36 percent brighter True Tone flash, 63-megapixel panoramas, wide color capture, Live Photos support, advanced red-eye correction, and burst mode.
This fall, Apple plans to introduce a Deep Fusion feature, which is a new image processing system that uses the A13 Bionic and the Neural Engine. Deep Fusion uses advanced machine learning techniques to do pixel-by-pixel processing of photos, optimizing for texture, details, and noise in each part of the image.
Deep Fusion is aimed at improving indoor photos and photos taken in medium lighting. It’s a feature that activates automatically based on the lens being used and the light level in the room rather than being something that can be manually enabled.