Selfie Reviews - DXOMARK
The leading source of independent audio, display, battery and image quality measurements and ratings for smartphone, camera, lens, wireless speaker and laptop since 2008.

Honor Magic6 Pro Selfie test

We put the Honor Magic6 Pro through our rigorous DXOMARK Selfie test suite to measure its performance in photo and video from an end-user perspective. This article breaks down how the device fared in a variety of tests and several common use cases and is intended to highlight the most important results of our testing with an extract of the captured data.

Overview

Key front camera specifications:

  • 50MP sensor
  • f/2.0-aperture lens
  • Autofocus
  • 4K video at 30 fps, 1080p at 30/60 fps (4K at 30 fps tested)

Scoring

Sub-scores and attributes included in the calculations of the global score.

Honor Magic6 Pro
Selfie: 151

Photo: 149
  • Exposure: 90 (best: 97)
  • Color: 97 (best: 106)
  • Focus: 105 (best)
  • Texture: 73 (best: 79)
  • Noise: 93 (best: 94)
  • Artifacts: 78 (best: 91)
  • Flash: 90 (best: 93)
  • Bokeh: 80 (best)

Video: 155
  • Exposure: 84 (best: 87)
  • Color: 87 (best: 90)
  • Focus: 88 (best: 92)
  • Texture: 89 (best: 97)
  • Noise: 80 (best: 83)
  • Artifacts: 87 (best: 92)
  • Stabilization: 75 (best: 82)

Pros

  • Accurate target exposure and wide dynamic range
  • Neutral white balance and fairly pleasant color rendering
  • Low noise levels in bright light and indoors
  • Wide depth of field

Cons

  • Slightly strong contrast in backlit scenes
  • Loss of fine detail in low light
  • Sharpness differences between frames in video with motion
  • Artifacts such as hue shift near saturation sometimes visible in outdoor and indoor conditions

The Honor Magic6 Pro delivered an impressive performance in the DXOMARK Selfie test, achieving the highest overall score to date. The new model’s updated front camera hardware undoubtedly contributed to this outstanding result and helped achieve noticeable improvements over its predecessor. While the Magic5 Pro had to make do with a 12MP image sensor and an f/2.4-aperture lens, the new device comes with a 50MP sensor and a wider f/2.0 aperture, offering significantly better light collection capabilities. In addition, the new Snapdragon 8 Gen 3 chipset provides plenty of image processing power, and the addition of an autofocus system allows for precise focusing at all subject distances.
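As a rough back-of-the-envelope illustration (not part of DXOMARK's protocol), light gathering per unit of sensor area scales with the inverse square of the f-number, so the step from f/2.4 to f/2.0 lets in roughly 44% more light:

```python
# Relative light gathering of two apertures (illustrative only; assumes the same
# sensor size, exposure time and lens transmission).
def relative_light(f_new: float, f_old: float) -> float:
    """Ratio of light collected at f_new compared with f_old."""
    return (f_old / f_new) ** 2

print(relative_light(2.0, 2.4))  # ~1.44, i.e. about 44% more light at f/2.0 than at f/2.4
```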

In our tests, the Honor Magic6 Pro delivered the most well-rounded performance to date, achieving excellent results across all sub-attributes. It was even better than the previous No. 1, the Apple iPhone 15 Pro Max, in most test areas, except exposure and color in both photo and video. When comparing the two front-runner devices, we can see that the iPhone generally provides better contrast and more vibrant colors. However, the iPhone images also come with significant amounts of noise. The Honor Magic6 Pro, on the other hand, does particularly well for noise, delivering an impressive trade-off between texture preservation and noise reduction.

Please note that the Magic6 Pro was tested in SDR format for both photo and video as no HDR format is available when shooting with the front camera. The Apple iPhone 15 Pro Max was tested in HDR mode.

Honor Magic 6 Pro Selfie Scores vs Ultra-Premium
This graph compares overall photo and video DXOMARK Selfie scores between tested devices and references. Average and maximum scores of the price segment are also indicated. Average and maximum scores for each price segment are computed based on the DXOMARK database of devices.
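As a minimal sketch of how such segment statistics could be computed from a database of scores (the device names and values below are placeholders, not DXOMARK data):

```python
from collections import defaultdict

# Hypothetical records: (device, price_segment, overall_selfie_score)
devices = [
    ("Device A", "Ultra-Premium", 151),
    ("Device B", "Ultra-Premium", 145),
    ("Device C", "Premium", 128),
]

scores_by_segment = defaultdict(list)
for _, segment, score in devices:
    scores_by_segment[segment].append(score)

for segment, scores in scores_by_segment.items():
    print(segment, "average:", sum(scores) / len(scores), "maximum:", max(scores))
```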

Test summary

About DXOMARK Selfie tests: For scoring and analysis, DXOMARK engineers capture and evaluate more than 1,500 test images both in controlled lab environments and in outdoor, indoor and low-light natural scenes, using the front camera’s default settings. The photo protocol is designed to take into account the user’s needs and is based on typical shooting scenarios, such as close-up and group selfies. The evaluation is performed by visually inspecting images against a reference of natural scenes, and by running objective measurements on images of charts captured in the lab under different lighting conditions from 1 to 1,000+ lux and color temperatures from 2,300K to 6,500K. For more information about the DXOMARK Selfie test protocol, click here. More details on how we score smartphone cameras are available here. The following section gathers key elements of DXOMARK’s exhaustive tests and analyses. Full performance evaluations are available upon request. Please contact us to find out how to receive a full report.

Photo

149

Honor Magic6 Pro

Best

Honor Magic 6 Pro Photo scores vs Ultra-Premium
The photo tests analyze image quality attributes such as exposure, color, texture, and noise in various light conditions. Range of focus and the presence of artifacts on all images captured in controlled lab conditions and in real-life images are also evaluated. All these attributes have a significant impact on the final quality of the images captured with the tested device and can help to understand the camera's main strengths and weaknesses.

In our front camera tests, the Honor Magic6 Pro achieved the best score for photo to date, significantly surpassing its predecessor Magic5 Pro. The wide depth of field and associated ability to keep all subjects in focus were a major strength. In addition, the camera managed to keep noise levels down in most test conditions, delivering a good trade-off between texture and noise. While exposure and color rendering were not quite as nice as on the iPhone 15 Pro Max, the Honor was still among the best in these test areas. The natural bokeh effect was another favorite of our testers, with the Honor achieving the best results we have seen to date for this test. On the downside, some image artifacts could be noticeable, especially in difficult backlit scenes or low light.

Honor Magic6 Pro – Pleasant color rendering, accurate target exposure and high level of details.

Exposure

90

Honor Magic6 Pro

97

Apple iPhone 15 Pro

Exposure is one of the key attributes for technically good pictures. The main attribute evaluated is the brightness of the face(s) in various use cases and light conditions. Other factors evaluated are the contrast and the dynamic range, e.g., the ability to render visible details in both bright and dark areas of the image. Repeatability is also important because it demonstrates the camera's ability to provide the same rendering when shooting consecutive images in a row.
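As a simplified illustration of what evaluating "target exposure" on a face can involve (a sketch with an assumed target level, not DXOMARK's actual measurement), face brightness can be expressed as the deviation, in EV, of the mean face luminance from a chosen target:

```python
import math
import numpy as np

def face_exposure_error_ev(luminance: np.ndarray, face_mask: np.ndarray,
                           target: float = 0.45) -> float:
    """Deviation of mean face luminance from a target level, in EV (log2 ratio).

    luminance: linear-light values in [0, 1]; face_mask: boolean mask of face pixels.
    The 0.45 target is a placeholder, not a DXOMARK tuning value.
    """
    mean_face = float(luminance[face_mask].mean())
    return math.log2(mean_face / target)

# A face rendered at 0.30 mean luminance against a 0.45 target is ~0.58 EV underexposed.
```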

When compared to the predecessor Honor Magic5 Pro, the Magic6 Pro delivered better face contrast, especially in difficult backlit scenes. This said, it was still not quite on the same level as the Apple iPhone 15 Pro Max in this respect. The camera was also capable of capturing a wide dynamic range.

Honor Magic6 Pro – Slightly strong contrast, slight highlight clipping on face
Apple iPhone 15 Pro Max – Pleasant contrast

Color

97

Honor Magic6 Pro

106

Google Pixel 8 Pro

Color is one of the key attributes for technically good pictures. The image quality attributes analyzed are skin-tone rendering, white balance, color shading, and repeatability.

The Magic6 Pro generally captured images with a neutral white balance but slight color casts could be noticeable in some scenes. Skin tones and color rendering were pleasant overall, but in some scenes, colors looked slightly desaturated. Our testers sometimes also found the skin tones on dark-skinned models to be slightly too red.

Honor Magic6 Pro – Slightly desaturated colors, occasional slightly cold cast
Apple iPhone 15 Pro Max – Bright and vivid colors, natural skin tones

Focus

105

Honor Magic6 Pro

Best

Autofocus tests evaluate the accuracy of the focus on the subject’s face, the repeatability of an accurate focus, and the depth of field. While a shallow depth of field can be pleasant for a single-subject selfie or close-up shot, it can be problematic in specific conditions such as group selfies; both situations are tested. Focus accuracy is also evaluated in all the real-life images taken, from 30cm to 150cm, and in low light to outdoor conditions.

Despite the wide f/2.0 aperture, the Honor Magic6 Pro offers a wide depth of field. The addition of an autofocus system allows for precise adjustment of the focus point. As a result, in our tests, the camera was capable of keeping all faces in group scenes in focus.
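To see why a tiny front-camera module keeps so much of the scene sharp even at f/2.0, here is an illustrative depth-of-field calculation using assumed optics (a roughly 2.7 mm focal length and a 0.003 mm circle of confusion are typical orders of magnitude for a selfie camera, not the Magic6 Pro's actual specifications):

```python
def dof_limits(focal_mm: float, f_number: float, focus_mm: float, coc_mm: float):
    """Near and far limits of acceptable sharpness (thin-lens approximation)."""
    hyperfocal = focal_mm ** 2 / (f_number * coc_mm) + focal_mm
    near = hyperfocal * focus_mm / (hyperfocal + (focus_mm - focal_mm))
    far = (hyperfocal * focus_mm / (hyperfocal - (focus_mm - focal_mm))
           if focus_mm < hyperfocal else float("inf"))
    return near, far

# Assumed optics: f = 2.7 mm, f/2.0, CoC = 0.003 mm, focused at 50 cm.
near, far = dof_limits(2.7, 2.0, 500, 0.003)
print(f"sharp from ~{near:.0f} mm to ~{far:.0f} mm")  # roughly 355 mm to 845 mm
# Focusing beyond the ~1.2 m hyperfocal distance keeps everything out to infinity acceptably sharp.
```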

Honor Magic6 Pro - Depth of field
Honor Magic6 Pro - Background face (and all other faces) in focus, wide depth of field
Apple iPhone 15 Pro Max - Depth of field
Apple iPhone 15 Pro Max - Background face is out of focus, slightly limited depth of field

Texture

73

Honor Magic6 Pro

79

Asus ZenFone 7 Pro

Texture tests analyze the level of details and the texture of subjects in the images taken in the lab as well as in real-life scenarios. For natural shots, particular attention is paid to the level of details in facial features, such as the eyes. Objective measurements are performed on chart images taken in various lighting conditions from 1 to 1000 lux and different kinds of dynamic range conditions. The charts used are the proprietary DXOMARK chart (DMC) and the Dead Leaves chart.

The Magic6 Pro front camera captured fairly good detail in daylight and when shooting under indoor conditions. However, a loss of detail was noticeable in low light. Our testers also observed some unnatural detail rendering in some scenes, especially in low light.

Honor Magic6 Pro - Detail
Honor Magic6 Pro - Good detail but unnatural detail rendering on hair
Apple iPhone 15 Pro Max - Detail
Apple iPhone 15 Pro Max - Acceptable detail
Texture acutance evolution with the illuminance level
This graph shows the evolution of texture acutance with the level of lux for two holding conditions. The texture acutance is measured on the Dead Leaves chart in the Close-up Dead Leaves setup.
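For readers unfamiliar with the term, acutance is essentially a contrast-sensitivity-weighted average of the measured MTF; the sketch below illustrates the general idea with a placeholder MTF and a simplified CSF, not DXOMARK's exact implementation:

```python
import numpy as np

def acutance(mtf, csf, freqs: np.ndarray) -> float:
    """CSF-weighted average of an MTF over spatial frequency (illustrative definition)."""
    return float(np.trapz(mtf(freqs) * csf(freqs), freqs) / np.trapz(csf(freqs), freqs))

freqs = np.linspace(0.01, 40.0, 500)        # cycles/degree (assumed viewing conditions)
mtf = lambda f: np.exp(-f / 12.0)           # placeholder camera MTF measured on the chart
csf = lambda f: f * np.exp(-0.2 * f)        # simplified human contrast sensitivity shape
print(acutance(mtf, csf, freqs))            # higher values mean more perceived detail
```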

Noise

93

Honor Magic6 Pro

94

Huawei Mate 50 Pro

Noise tests analyze various attributes of noise such as intensity, chromaticity, grain, and structure on real-life images as well as images of charts taken in the lab. For natural images, particular attention is paid to the noise on faces, but also on dark areas and in high dynamic range conditions. Objective measurements are performed on images of charts taken in various conditions from 1 to 1000 lux and different kinds of dynamic range conditions. The chart used is the DXOMARK Dead Leaves chart, and standardized measurements such as Visual Noise, derived from ISO 15739, are applied.

The Honor Magic6 Pro managed to maintain a quite good trade-off between texture and noise across all light conditions, outperforming the iPhone 15 Pro Max in this respect. The difference between the two cameras was most noticeable in low light, where the Honor managed visibly lower noise levels.

Visual noise evolution with illuminance levels in handheld condition
This graph shows the evolution of visual noise metric with the level of lux in handheld condition. The visual noise metric is the mean of visual noise measurement on all patches of the Dead Leaves chart in the Close-up Dead Leaves setup. DXOMARK visual noise measurement is derived from ISO15739 standard.
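Conceptually, a visual noise metric of this kind converts a uniform patch into a perceptual color space, filters out spatial frequencies the eye cannot resolve at the viewing distance, and combines the remaining channel standard deviations. The snippet below is a loose sketch of that idea; the Gaussian filter and channel weights are placeholders rather than the ISO 15739 or DXOMARK parameters:

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def visual_noise(lab_patch: np.ndarray, sigma_px: float = 2.0,
                 weights=(1.0, 0.5, 0.3)) -> float:
    """Simplified visual-noise estimate on a uniform CIELAB patch (H x W x 3).

    The Gaussian low-pass stands in for a proper viewing-condition filter, and the
    channel weights are illustrative placeholders.
    """
    filtered = gaussian_filter(lab_patch, sigma=(sigma_px, sigma_px, 0))
    channel_std = filtered.reshape(-1, 3).std(axis=0)
    return float(np.dot(weights, channel_std))
```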
Honor Magic6 Pro - Outdoor noise
Honor Magic6 Pro - Well-controlled noise on face
Apple iPhone 15 Pro Max - Outdoor noise
Apple iPhone 15 Pro Max - Fine luminance noise on face
Honor Magic6 Pro - Low light noise
Honor Magic6 Pro - Slight noise
Apple iPhone 15 Pro Max - Low light noise
Apple iPhone 15 Pro Max - Noise

Artifacts

78

Honor Magic6 Pro

91

Apple iPhone 15 Pro

The artifacts evaluation looks at lens shading, chromatic aberrations, distortion measurement on the Dot chart and MTF, and ringing measurements on the SFR chart in the lab. Particular attention is paid to ghosting, quantization, halos, and hue shifts on the face among others. The more severe and the more frequent the artifact, the higher the point deduction on the score. The main artifacts observed and corresponding point loss are listed below.

Our experts found some noticeable artifacts in the Magic6 Pro’s image output, including color quantization and face rendering artifacts. While the camera captured high levels of detail on the eyes and on the edges of facial features, skin texture was sometimes rendered unnaturally. Color quantization could be quite noticeable, especially in low light. The testers also observed a loss of sharpness towards the edges of the frame, as well as a hue shift close to clipped areas.

Main photo artifacts penalties

Bokeh

80

Honor Magic6 Pro

Best

Bokeh is tested in one dedicated mode, usually portrait or aperture mode, and analyzed by visually inspecting all the images captured in the lab and in natural conditions. The goal is to reproduce portrait photography comparable to one taken with a DSLR and a wide aperture. The main image quality attributes paid attention to are depth estimation, artifacts, blur gradient, and the shape of the bokeh blur spotlights. Portrait image quality attributes (exposure, color, texture) are also taken into account.
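For context on the "blur gradient" attribute: computational bokeh typically increases the synthetic blur with each pixel's defocus, i.e. with the difference between the inverse of its estimated depth and the inverse of the focus distance, loosely mimicking a wide-aperture lens. The sketch below shows that relationship only; it is illustrative and not Honor's or DXOMARK's algorithm:

```python
import numpy as np

def blur_radius_map(depth_m: np.ndarray, focus_m: float, strength: float = 3.0) -> np.ndarray:
    """Per-pixel synthetic blur radius (pixels) that grows with defocus.

    depth_m: H x W depth map in meters (assumed to come from a depth-estimation step);
    focus_m: distance of the in-focus subject; strength is an arbitrary scale factor.
    """
    return strength * np.abs(1.0 / depth_m - 1.0 / focus_m)

# A subject at 0.5 m stays sharp (radius 0) while a background at 3 m gets a radius-5 blur.
print(blur_radius_map(np.array([[0.5, 3.0]]), focus_m=0.5))  # [[0. 5.]]
```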

The Honor Magic6 Pro achieved the best results for Selfie bokeh to date. The bokeh simulation was very natural, with a pleasant blur and smooth blur transitions. In addition, spotlights in the background were rendered naturally, with high contrast and a well-defined shape.

Honor Magic6 Pro – Good depth estimation, no artifacts
Apple iPhone 15 Pro Max – Depth estimation artifacts in complex scenes.

Video

155

Honor Magic6 Pro

156

Apple iPhone 15 Pro
About DXOMARK Selfie Video tests

DXOMARK engineers capture and evaluate more than 2 hours of video in controlled lab environments and in natural low-light, indoor and outdoor scenes, using the front camera’s default settings. The evaluation consists of visually inspecting natural videos taken in various conditions and running objective measurements on videos of charts recorded in the lab under different conditions from 1 to 1000+ lux and color temperatures from 2,300K to 6,500K.

Honor Magic 6 Pro Video scores vs Ultra-Premium
Video tests analyze the same image quality attributes as for still images, such as exposure, color, texture, or noise, in addition to temporal aspects such as speed, smoothness, and stability of exposure, white balance, and autofocus transitions.

The Honor Magic6 Pro video mode was tested at 4K resolution and 30 frames per second as this configuration delivered the overall best video quality. At these settings, video performance was great across most sub-tests, with high levels of detail and low noise in most conditions. Texture and noise were also the most improved areas over the Magic5 Pro, and the main advantage over the Apple iPhone 15 Pro Max. In addition, dynamic range was wide in most test conditions, and color rendering was pleasant. Like for still images, a wide depth of field helped keep multiple subjects in a scene in focus. Video stabilization was the Honor’s only weakness when compared to its direct rivals in the Ultra Premium segment.

Exposure

84

Honor Magic6 Pro

87

Apple iPhone 15 Pro

Video target exposure was accurate in most conditions but sometimes our testers observed some underexposure in low-light clips. Dynamic range was generally wide, but highlight clipping could occur in some bright light scenes.

Honor Magic6 Pro – Accurate target exposure, wide dynamic range with pleasant contrast

Apple iPhone 15 Pro Max – Accurate target exposure, slight clipping on face

Color

87

Honor Magic6 Pro

90

Apple iPhone 15 Pro

Exposure tests evaluate the brightness of the face and the dynamic range, e.g., the ability to render visible details in both bright and dark areas of the image. Stability and temporal adaptation of the exposure are also analyzed. Image-quality color analysis looks at skin-tone rendering, white balance, color shading, stability of the white balance and its adaptation when light is changing.

Our testers found the Magic6 Pro’s video color rendering to be nice and pleasant when recording in bright light as well as under indoor lighting. In addition, the camera captured nice skin tones and managed a neutral white balance. However, in more difficult backlit scenes, we observed some color rendering failures and noticeable color casts.

Honor Magic6 Pro – Slight white balance cast, inaccurate skin tones, color fringing on background

Apple iPhone 15 Pro Max – Pleasant color rendering, no cast, natural skin tones

Texture

89

Honor Magic6 Pro

97

Asus ZenFone 6

Texture tests analyze the level of details and texture of the real-life videos as well as the videos of charts recorded in the lab. Natural video recordings are visually evaluated, with particular attention paid to the level of detail on the facial features. Objective measurements are performed on images of charts taken in various conditions from 1 to 1000 lux. The chart used is the Dead Leaves chart.

Video clips recorded on the Magic6 Pro front camera in bright conditions offered high levels of texture. In addition, fine detail was preserved well.

Texture acutance evolution with the illuminance level
This graph shows the evolution of texture acutance with the level of lux for two holding conditions. The texture acutance is measured on the Dead Leaves chart in the Close-up Dead Leaves setup.

Noise

80

Honor Magic6 Pro

83

Xiaomi Mi 11 Ultra

Noise tests analyze various attributes of noise such as intensity, chromaticity, grain, structure, temporal aspects on real-life video recording as well as videos of charts taken in the lab. Natural videos are visually evaluated, with particular attention paid to the noise on faces. Objective measurements are performed on the videos of charts recorded in various conditions from 1 to 1000 lux. The chart used is the DXOMARK visual noise chart.

Both visual noise and temporal noise were fairly well controlled when recording video in bright light. However, noise tended to become noticeable in the background of low-light scenes.

Spatial visual noise evolution with the illuminance level
This graph shows the evolution of spatial visual noise with the level of lux. Spatial visual noise is measured on the visual noise chart in the video noise setup. DXOMARK visual noise measurement is derived from ISO15739 standard.
Temporal visual noise evolution with the illuminance level
This graph shows the evolution of temporal visual noise with the level of lux. Temporal visual noise is measured on the visual noise chart in the video noise setup.

Stabilization

75

Honor Magic6 Pro

82

Apple iPhone 15 Pro

Stabilization evaluation tests the ability of the device to stabilize footage thanks to software or hardware technologies such as OIS, EIS, or any other means. The evaluation looks at overall residual motion on the face and the background, smoothness and jello artifacts during walking and panning use cases in various lighting conditions. The video below is an extract from one of the tested scenes.

The Magic6 Pro’s video stabilization performed on a very similar level as on the iPhone 15 Pro Max, with some camera shake still noticeable and moderate sharpness differences between frames. On occasion, the Honor Magic6 Pro footage also displayed a slight frame shift.
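One simple way to get a feel for the "residual motion" and "frame shift" described here is to estimate the global translation between consecutive frames; the snippet below uses OpenCV's phase correlation as a rough stand-in for DXOMARK's actual stabilization measurements:

```python
import cv2
import numpy as np

def residual_shift(prev_frame: np.ndarray, next_frame: np.ndarray) -> float:
    """Magnitude, in pixels, of the global translation between two consecutive frames."""
    prev_gray = cv2.cvtColor(prev_frame, cv2.COLOR_BGR2GRAY).astype(np.float32)
    next_gray = cv2.cvtColor(next_frame, cv2.COLOR_BGR2GRAY).astype(np.float32)
    (dx, dy), _response = cv2.phaseCorrelate(prev_gray, next_gray)
    return float(np.hypot(dx, dy))

# Usage sketch: read consecutive frames with cv2.VideoCapture and average residual_shift
# over a walking clip; larger averages indicate less effective stabilization.
```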

Honor Magic6 Pro – Camera shake, sharpness differences between frames, slight frame shift

Apple iPhone 15 Pro Max – Sharpness differences between frames

Artifacts

87

Honor Magic6 Pro

92

Apple iPhone 12 mini

Artifacts are evaluated with MTF and ringing measurements on the SFR chart in the lab as well as frame-rate measurements using the LED Universal Timer. Natural videos are visually evaluated by paying particular attention to artifacts such as quantization, hue shift, and face-rendering artifacts among others. The more severe and the more frequent the artifact, the higher the point deduction from the score. The main artifacts and corresponding point loss are listed below

Hue shift close to clipped areas and on faces was a common occurrence in the Honor’s video clips during our testing. In low light, our testers also observed some face rendering artifacts.

Main video artifacts penalties

Apple iPhone 15 Pro Selfie test

The Apple iPhone 15 Pro and iPhone 15 Pro Max share the same front camera specs, as well as the same chipset, so as expected, the results of the Apple iPhone 15 Pro camera were exactly the same as those of the Apple iPhone 15 Pro Max.

For a more in-depth look at the Apple iPhone 15 Pro’s front camera photo and video performance, we direct you to the full test results of the Apple iPhone 15 Pro Max Selfie.

Overview

Key camera specifications:

  • 12MP sensor
  • f/1.9-aperture lens
  • Autofocus
  • 4K video at 24/25/30/60 fps, 1080p at 25/30/60/120 fps (4K at 30 fps tested)

Scoring

Sub-scores and attributes included in the calculations of the global score.

Apple iPhone 15 Pro
Selfie: 149

Photo: 145
  • Exposure: 97 (best)
  • Color: 103 (best: 106)
  • Focus: 103 (best: 105)
  • Texture: 76 (best: 79)
  • Noise: 61 (best: 94)
  • Artifacts: 91 (best)
  • Flash: 78 (best: 93)
  • Bokeh: 80 (best)

Video: 156
  • Exposure: 87 (best)
  • Color: 90 (best)
  • Focus: 91 (best: 92)
  • Texture: 84 (best: 97)
  • Noise: 71 (best: 83)
  • Artifacts: 87 (best: 92)
  • Stabilization: 82 (best)

Pros

  • Accurate exposure and wide dynamic range for photos and videos
  • Reliable and fast autofocus with a wide depth of field
  • High levels of detail in bright light for photos and videos
  • Effective stabilization when holding the device still and when moving while recording
  • Very accurate subject isolation in bokeh mode

Cons

  • Noise across all shooting conditions
  • Occasional color quantization artifacts in photos and videos
  • Occasional sharpness differences between frames when recording video while walking

 

Xiaomi 13 Ultra Selfie test

We put the Xiaomi 13 Ultra through our rigorous DXOMARK Selfie test suite to measure its performance in photo and video from an end-user perspective. This article breaks down how the device fared in a variety of tests and several common use cases and is intended to highlight the most important results of our testing with an extract of the captured data.

Overview

Key front camera specifications:

  • 32MP sensor
  • f/2.0-aperture lens
  • 1080p/30 fps

Scoring

Sub-scores and attributes included in the calculations of the global score.

Xiaomi 13 Ultra
Selfie: 128

Photo: 123
  • Exposure: 75 (best: 97)
  • Color: 87 (best: 106)
  • Focus: 78 (best: 105)
  • Texture: 71 (best: 79)
  • Noise: 72 (best: 94)
  • Artifacts: 82 (best: 91)
  • Flash: 82 (best: 93)
  • Bokeh: 65 (best: 80)

Video: 137
  • Exposure: 79 (best: 87)
  • Color: 80 (best: 90)
  • Focus: 83 (best: 92)
  • Texture: 74 (best: 97)
  • Noise: 71 (best: 83)
  • Artifacts: 76 (best: 92)
  • Stabilization: 78 (best: 82)

Pros

  • Good texture/noise trade-off in photo and video
  • Good video stabilization when holding the device still and when moving while recording
  • Good exposure in most situations

Cons

  • Quite strong tone compression in backlit scenes
  • Narrow dynamic range results in highlight clipping
  • Skin enhancement effect can result in unrealistic skin textures

In the DXOMARK Selfie tests, the Xiaomi 13 Ultra delivered an overall decent performance and showed some significant improvements over its predecessor the 12S Ultra, especially in terms of photo color and exposure. In addition, video stabilization was more effective, resulting in higher levels of detail. On the downside, the shallow depth of field prevented good sharpness on all subjects in group shots, just like on the 12S Ultra, and in some challenging light conditions, we observed strong tone compression, which resulted in a lack of contrast on the face, as well as unnatural colors.

While the Xiaomi 13 Ultra could not match the overall front camera results of some Ultra-Premium rivals such as the Apple iPhone 14 Pro or the Huawei Mate 50 Pro, it was better in some areas. Unwanted image artifacts were better under control than on the Huawei Mate 50 Pro; noise reduction worked more effectively, and flash mode was better than on the iPhone 14 Pro.

Xiaomi 13 Ultra Selfie Scores vs Ultra-Premium
This graph compares overall photo and video DXOMARK Selfie scores between tested devices and references. Average and maximum scores of the price segment are also indicated. Average and maximum scores for each price segment are computed based on the DXOMARK database of devices.

Test summary

About DXOMARK Selfie tests: For scoring and analysis, DXOMARK engineers capture and evaluate more than 1,500 test images both in controlled lab environments and in outdoor, indoor and low-light natural scenes, using the front camera’s default settings. The photo protocol is designed to take into account the user’s needs and is based on typical shooting scenarios, such as close-up and group selfies. The evaluation is performed by visually inspecting images against a reference of natural scenes, and by running objective measurements on images of charts captured in the lab under different lighting conditions from 1 to 1,000+ lux and color temperatures from 2,300K to 6,500K. For more information about the DXOMARK Selfie test protocol, click here. More details on how we score smartphone cameras are available here. The following section gathers key elements of DXOMARK’s exhaustive tests and analyses. Full performance evaluations are available upon request. Please contact us to find out how to receive a full report.

Photo

123

Xiaomi 13 Ultra

149

Honor Magic6 Pro
Xiaomi 13 Ultra Photo scores vs Ultra-Premium
The photo tests analyze image quality attributes such as exposure, color, texture, and noise in various light conditions. Range of focus and the presence of artifacts on all images captured in controlled lab conditions and in real-life images are also evaluated. All these attributes have a significant impact on the final quality of the images captured with the tested device and can help to understand the camera's main strengths and weaknesses.

In photo mode, the Xiaomi 13 Ultra front camera produced selfie images with good detail and accurate exposure. However, the skin-smoothing effect that kicks in automatically could be quite strong. The bokeh mode was not among the best that we have seen. Subject isolation left some room for improvement, and the lack of a blur gradient resulted in an unrealistic effect that did not really provide a sense of depth. When capturing images with the flash enabled, noise was well under control on faces, but vignetting around the edge of the frame was quite strong.

Exposure

75

Xiaomi 13 Ultra

97

Apple iPhone 15 Pro

Color

87

Xiaomi 13 Ultra

106

Google Pixel 8 Pro

Exposure and color are the key attributes for technically good pictures. For exposure, the main attribute evaluated is the brightness of the face(s) in various use cases and light conditions. Other factors evaluated are the contrast and the dynamic range, e.g., the ability to render visible details in both bright and dark areas of the image. Repeatability is also important because it demonstrates the camera's ability to provide the same rendering when shooting consecutive images in a row.
For color, the image quality attributes analyzed are skin-tone rendering, white balance, color shading, and repeatability.

In our tests, the Xiaomi generally exposed subjects well, except in low light, where our testers noticed slight underexposure. Dynamic range was pretty narrow, which resulted in frequent highlight clipping in scenes with high contrast. Color was managed well but we often observed strong tone compression, which resulted in unnatural skin textures and low contrast on the subject’s face, especially when shooting under daylight.

Xiaomi 13 Ultra – Strong tone compression
Huawei Mate 50 Pro – Natural result
Apple iPhone 14 Pro – Slight pink/orange cast

Focus

78

Xiaomi 13 Ultra

105

Honor Magic6 Pro

Our testers found the camera to deliver good focus as long as the subject was at a close distance to the lens. However, a shallow depth of field means that the Xiaomi 13 Ultra is not the best option for capturing group selfies when subjects are at different distances from the camera. While the closest face will be sharp, subjects further back will be out of focus.

Autofocus tests evaluate the accuracy of the focus on the subject’s face, the repeatability of an accurate focus, and the depth of field. While a shallow depth of field can be pleasant for a single-subject selfie or close-up shot, it can be problematic in specific conditions such as group selfies; both situations are tested. Focus accuracy is also evaluated in all the real-life images taken, from 30cm to 150cm, and in low light to outdoor conditions.

Xiaomi 13 Ultra - Depth of field
Xiaomi 13 Ultra - Background subjects out of focus
Huawei Mate 50 Pro - Depth of field
Huawei Mate 50 Pro - Most faces in focus, subject at the back slightly out of focus
Apple iPhone 14 Pro - Depth of field
Apple iPhone 14 Pro - All faces in focus, including in the background

Texture

71

Xiaomi 13 Ultra

79

Asus ZenFone 7 Pro

Texture tests analyze the level of details and the texture of subjects in the images taken in the lab as well as in real-life scenarios. For natural shots, particular attention is paid to the level of details in facial features, such as the eyes. Objective measurements are performed on chart images taken in various lighting conditions from 1 to 1000 lux and different kinds of dynamic range conditions. The charts used are the proprietary DXOMARK chart (DMC) and the Dead Leaves chart.

The Xiaomi 13 Ultra managed to limit noise levels on the face, while also maintaining decent skin texture. In low light, it performed better than the Apple iPhone 14 Pro but could not quite match the low noise levels of the Huawei Mate 50 Pro.

Texture acutance evolution with the illuminance level
This graph shows the evolution of texture acutance with the level of lux for two holding conditions. The texture acutance is measured on the Dead Leaves chart in the Close-up Dead Leaves setup.
Xiaomi 13 Ultra - Detail and noise
Xiaomi 13 Ultra - Loss of fine detail on face
Huawei Mate 50 Pro - Detail and noise
Huawei Mate 50 Pro - Good detail but unnatural textures in the eye area
Apple iPhone 14 Pro - Detail and noise
Apple iPhone 14 Pro - Good detail and natural textures

Noise

72

Xiaomi 13 Ultra

94

Huawei Mate 50 Pro

Noise tests analyze various attributes of noise such as intensity, chromaticity, grain, and structure on real-life images as well as images of charts taken in the lab. For natural images, particular attention is paid to the noise on faces, but also on dark areas and in high dynamic range conditions. Objective measurements are performed on images of charts taken in various conditions from 1 to 1000 lux and different kinds of dynamic range conditions. The chart used is the DXOMARK Dead Leaves chart, and standardized measurements such as Visual Noise, derived from ISO 15739, are applied.

Visual noise evolution with illuminance levels in handheld condition
This graph shows the evolution of visual noise metric with the level of lux in handheld condition. The visual noise metric is the mean of visual noise measurement on all patches of the Dead Leaves chart in the Close-up Dead Leaves setup. DXOMARK visual noise measurement is derived from ISO15739 standard.

Artifacts

82

Xiaomi 13 Ultra

91

Apple iPhone 15 Pro

The artifacts evaluation looks at lens shading, chromatic aberrations, distortion measurement on the Dot chart and MTF, and ringing measurements on the SFR chart in the lab. Particular attention is paid to ghosting, quantization, halos, and hue shifts on the face among others. The more severe and the more frequent the artifact, the higher the point deduction on the score. The main artifacts observed and corresponding point loss are listed below.

In our tests, the Xiaomi 13 Ultra tended to apply a strong skin-smoothing effect to its images, which could result in an unnatural look. In difficult light conditions, such as backlit scenes, our testers observed further artifacts, including halo effects, tone compression, and hue shift close to saturated image areas. In addition, anamorphosis could be slightly noticeable in close-up selfies, with the face close to the edge of the frame.

Main photo artifacts penalties

Video

137

Xiaomi 13 Ultra

156

Apple iPhone 15 Pro
About DXOMARK Selfie Video tests

DXOMARK engineers capture and evaluate more than 2 hours of video in controlled lab environments and in natural low-light, indoor and outdoor scenes, using the front camera’s default settings. The evaluation consists of visually inspecting natural videos taken in various conditions and running objective measurements on videos of charts recorded in the lab under different conditions from 1 to 1000+ lux and color temperatures from 2,300K to 6,500K.

Compared to its predecessor, the 12S Ultra, the Xiaomi 13 Ultra’s video mode showed improvements in various areas. Video noise levels were lower, stabilization was smoother, and the level of recorded detail had increased across all light levels. On the downside, we noticed occasional underexposure in low-light scenes and, just like for photo mode, dynamic range was limited. Rivals like the Apple iPhone 14 Pro or the Huawei Mate 50 Pro performed better in this regard, with a well-managed HDR mode expanding the dynamic range of the video footage.

Xiaomi 13 Ultra Video scores vs Ultra-Premium
Video tests analyze the same image quality attributes as for still images, such as exposure, color, texture, or noise, in addition to temporal aspects such as speed, smoothness, and stability of exposure, white balance, and autofocus transitions.

Exposure

79

Xiaomi 13 Ultra

87

Apple iPhone 15 Pro

Color

80

Xiaomi 13 Ultra

90

Apple iPhone 15 Pro

Exposure tests evaluate the brightness of the face and the dynamic range, e.g., the ability to render visible details in both bright and dark areas of the image. Stability and temporal adaptation of the exposure are also analyzed. Image-quality color analysis looks at skin-tone rendering, white balance, color shading, stability of the white balance and its adaptation when light is changing.

Face exposure remained generally accurate in bright light and under indoor conditions, but the limited dynamic range resulted in highlight clipping in the background in many scenes. In low light, our testers often found the subject to be underexposed.

Xiaomi 13 Ultra – Stable white balance and face exposure, highlight clipping in background

Huawei Mate 50 Pro – Wide dynamic range and accurate face exposure (HDR Video), stable white balance

Apple iPhone 14 Pro – Wide dynamic range and accurate face exposure (HDR Video), stable white balance

Texture

74

Xiaomi 13 Ultra

97

Asus ZenFone 6

Texture tests analyze the level of details and texture of the real-life videos as well as the videos of charts recorded in the lab. Natural video recordings are visually evaluated, with particular attention paid to the level of detail on the facial features. Objective measurements are performed on images of charts taken in various conditions from 1 to 1000 lux. The chart used is the Dead Leaves chart.

Texture acutance evolution with the illuminance level
This graph shows the evolution of texture acutance with the level of lux for two holding conditions. The texture acutance is measured on the Dead Leaves chart in the Close-up Dead Leaves setup.

Noise

71

Xiaomi 13 Ultra

83

Xiaomi Mi 11 Ultra

Noise tests analyze various attributes of noise such as intensity, chromaticity, grain, structure, temporal aspects on real-life video recording as well as videos of charts taken in the lab. Natural videos are visually evaluated, with particular attention paid to the noise on faces. Objective measurements are performed on the videos of charts recorded in various conditions from 1 to 1000 lux. The chart used is the DXOMARK visual noise chart.

Spatial visual noise evolution with the illuminance level
This graph shows the evolution of spatial visual noise with the level of lux. Spatial visual noise is measured on the visual noise chart in the video noise setup. DXOMARK visual noise measurement is derived from ISO15739 standard.
Temporal visual noise evolution with the illuminance level
This graph shows the evolution of temporal visual noise with the level of lux. Temporal visual noise is measured on the visual noise chart in the video noise setup.

Video noise was handled pretty well by the Xiaomi 13 Ultra, especially on faces. Noise reduction was very effective even in low light, removing noise without sacrificing texture.

Stabilization

78

Xiaomi 13 Ultra

82

Apple iPhone 15 Pro

Stabilization evaluation tests the ability of the device to stabilize footage thanks to software or hardware technologies such as OIS, EIS, or any other means. The evaluation looks at overall residual motion on the face and the background, smoothness and jello artifacts during walking and panning use cases in various lighting conditions. The video below is an extract from one of the tested scenes.

The video stabilization of the Xiaomi 13 Ultra is solid; no jello effect was noticeable, and overall performance was on par with class-leading devices, such as the Apple iPhone 14 Pro and Huawei Mate 50 Pro. However, we noticed some camera shake at the start of the video recording. Please note that this impacted the artifacts score rather than the stabilization score.

Xiaomi 13 Ultra – good stabilization, slight camera shake noticeable on background

Huawei Mate 50 Pro – Good stabilization, good background detail

Apple iPhone 14 Pro – Good stabilization, good background detail

Artifacts

76

Xiaomi 13 Ultra

92

Apple iPhone 12 mini

Artifacts are evaluated with MTF and ringing measurements on the SFR chart in the lab as well as frame-rate measurements using the LED Universal Timer. Natural videos are visually evaluated by paying particular attention to artifacts such as quantization, hue shift, and face-rendering artifacts among others. The more severe and the more frequent the artifact, the higher the point deduction from the score. The main artifacts and corresponding point loss are listed below

Main video artifacts penalties

Google Pixel 8 Pro Selfie test

We put the Google Pixel 8 Pro through our rigorous DXOMARK Selfie test suite to measure its performance in photo and video from an end-user perspective. This article breaks down how the device fared in a variety of tests and several common use cases and is intended to highlight the most important results of our testing with an extract of the captured data.

Overview

Key front camera specifications:

  • 10.5MP sensor
  • f/2.2-aperture lens
  • 1.22µm pixels
  • Autofocus
  • 4K video at 24/30/60 fps (4K at 30 fps tested)

Scoring

Sub-scores and attributes included in the calculations of the global score.

Google Pixel 8 Pro
Selfie: 145

Photo: 143
  • Exposure: 90 (best: 97)
  • Color: 106 (best)
  • Focus: 96 (best: 105)
  • Texture: 67 (best: 79)
  • Noise: 78 (best: 94)
  • Artifacts: 90 (best: 91)
  • Flash: 82 (best: 93)
  • Bokeh: 70 (best: 80)

Video: 149
  • Exposure: 84 (best: 87)
  • Color: 85 (best: 90)
  • Focus: 85 (best: 92)
  • Texture: 85 (best: 97)
  • Noise: 68 (best: 83)
  • Artifacts: 87 (best: 92)
  • Stabilization: 82 (best)

Pros

  • Mostly accurate white balance in photo
  • Nice color rendering in indoor videos
  • Generally accurate exposure in photo and video
  • Accurate focus on face
  • Generally good detail in daylight and indoors, for photo and video
  • Smooth video stabilization when panning

Cons

  • Occasional slight color casts
  • Slight fine noise in photo and video
  • Residual motion when holding the camera still while recording, slight focus instabilities in video
  • Depth estimation artifacts in bokeh mode

The Google Pixel 8 Pro delivered an excellent performance in the DXOMARK Selfie tests, approaching the top of our ranking and becoming the best Android device to date. The Pixel 8 Pro uses similar front-camera hardware to the Pixel 7 generation but adds autofocus. Overall, the camera showed significant improvements over the previous models in both photo and video modes. In our tests, the Pixel 8 Pro did particularly well in terms of color. The autofocus helped greatly improve texture levels, especially in close-up shots, and therefore also resulted in a better texture/noise trade-off.

In video mode, the Pixel 8 Pro just trailed the class-leading Apple iPhone 15 Pro Max, thanks to accurate exposure, nice colors and good detail. Video stabilization was smooth on panning shots, making for an overall decent video performance.

Please note that the Google Pixel 8 Pro was tested in SDR mode for Selfie because it provided better results than HDR mode, especially in terms of color.

Test summary

About DXOMARK Selfie tests: For scoring and analysis, DXOMARK engineers capture and evaluate more than 1,500 test images both in controlled lab environments and in outdoor, indoor and low-light natural scenes, using the front camera’s default settings. The photo protocol is designed to take into account the user’s needs and is based on typical shooting scenarios, such as close-up and group selfies. The evaluation is performed by visually inspecting images against a reference of natural scenes, and by running objective measurements on images of charts captured in the lab under different lighting conditions from 1 to 1,000+ lux and color temperatures from 2,300K to 6,500K. For more information about the DXOMARK Selfie test protocol, click here. More details on how we score smartphone cameras are available here. The following section gathers key elements of DXOMARK’s exhaustive tests and analyses. Full performance evaluations are available upon request. Please contact us to find out how to receive a full report.

Google Pixel 8 Pro Selfie Scores vs Ultra-Premium
This graph compares overall photo and video DXOMARK Selfie scores between tested devices and references. Average and maximum scores of the price segment are also indicated. Average and maximum scores for each price segment are computed based on the DXOMARK database of devices.

Photo

143

Google Pixel 8 Pro

149

Honor Magic6 Pro
Google Pixel 8 Pro Photo scores vs Ultra-Premium
The photo tests analyze image quality attributes such as exposure, color, texture, and noise in various light conditions. Range of focus and the presence of artifacts on all images captured in controlled lab conditions and in real-life images are also evaluated. All these attributes have a significant impact on the final quality of the images captured with the tested device and can help to understand the camera's main strengths and weaknesses.

In photo mode, the Google Pixel 8 Pro managed to maintain the pleasant, natural color rendering and the wide dynamic range we had already seen on its predecessor, the Pixel 7 Pro. It also added some significant improvements, particularly in terms of autofocus and texture/noise trade-off.

Please note: For Google Pixel 8 Pro photos, we evaluated SDR files on an SDR display. The comparison photos from the Apple iPhone 15 Pro Max were evaluated using HDR files on an HDR display. HEIC files from the Apple device cannot be displayed on this page, so we are using converted SDR files instead, which do not show the iPhone’s full potential.

Google Pixel 8 Pro – Accurate target exposure and pleasant color rendering

Exposure

90

Google Pixel 8 Pro

97

Apple iPhone 15 Pro

Exposure is one of the key attributes for technically good pictures. The main attribute evaluated is the brightness of the face(s) in various use cases and light conditions. Other factors evaluated are the contrast and the dynamic range, e.g., the ability to render visible details in both bright and dark areas of the image. Repeatability is also important because it demonstrates the camera's ability to provide the same rendering when shooting consecutive images in a row.

Google Pixel 8 Pro – Bright face exposure, wide dynamic range on background
Apple iPhone 15 Pro – Bright face exposure, wide dynamic range on background
Huawei Mate 50 Pro – Bright face exposure, wide dynamic range on background

The Pixel 8 Pro generally delivered accurate target exposure in combination with a very wide dynamic range. However, the camera had a tendency to err on the side of slight underexposure in order to protect the highlight areas of the frame.

Google Pixel 8 Pro – Underexposure on face in backlit scenes
Google Pixel 7 Pro – Underexposure on face in backlit scenes
Huawei Mate 50 Pro – Good face exposure in backlit scenes

Color

106

Google Pixel 8 Pro

Best

Color is one of the key attributes for technically good pictures. The image quality attributes analyzed are skin-tone rendering, white balance, color shading, and repeatability.

Color performance on the Pixel 8 Pro was consistent with its predecessor, predominantly producing a natural white balance. In addition, Google’s RealTone outperformed many comparable features from the competition, offering outstanding color accuracy on skin tones.

Google Pixel 8 Pro – Pleasant skin tones, nice color on sky and trees
Apple iPhone 15 Pro – Pleasant skin tones, nice color on sky and trees
Huawei Mate 50 Pro – Less pleasant skin tone rendering

Focus

96

Google Pixel 8 Pro

105

Honor Magic6 Pro

Autofocus tests evaluate the accuracy of the focus on the subject’s face, the repeatability of an accurate focus, and the depth of field. While a shallow depth of field can be pleasant for a single-subject selfie or close-up shot, it can be problematic in specific conditions such as group selfies; both situations are tested. Focus accuracy is also evaluated in all the real-life images taken, from 30cm to 150cm, and in low light to outdoor conditions.

While the predecessor Pixel 7 Pro used a fixed-focus lens, the Pixel 8 Pro now comes with an autofocus system in the front camera. In our tests, this helped improve the level of subject detail greatly, especially at close and long shooting distances.

Google Pixel 8 Pro – Face and background in focus
Apple iPhone 15 Pro – Face and background in focus
Huawei Mate 50 Pro – slight out of focus on subject and background.

Texture

67

Google Pixel 8 Pro

79

Asus ZenFone 7 Pro

Texture tests analyze the level of details and the texture of subjects in the images taken in the lab as well as in real-life scenarios. For natural shots, particular attention is paid to the level of details in facial features, such as the eyes. Objective measurements are performed on chart images taken in various lighting conditions from 1 to 1000 lux and different kinds of dynamic range conditions. The charts used are the proprietary DXOMARK chart (DMC) and the Dead Leaves chart.

Texture acutance evolution with the illuminance level
This graph shows the evolution of texture acutance with the level of lux for two holding conditions. The texture acutance is measured on the Dead Leaves chart in the Close-up Dead Leaves setup.

On the latest Pixel Pro, the trade-off between noise and texture has been noticeably improved, resulting in better overall rendering in various light conditions and enhanced subject detail. Thanks to the new autofocus system, the Pixel was capable of capturing good detail from very short (30cm) to long (120cm and more) subject distances, providing a more consistent level of detail across subject distances than previous models.

Noise

78

Google Pixel 8 Pro

94

Huawei Mate 50 Pro

Noise tests analyze various attributes of noise such as intensity, chromaticity, grain, and structure on real-life images as well as images of charts taken in the lab. For natural images, particular attention is paid to the noise on faces, but also on dark areas and in high dynamic range conditions. Objective measurements are performed on images of charts taken in various conditions from 1 to 1000 lux and different kinds of dynamic range conditions. The chart used is the DXOMARK Dead Leaves chart, and standardized measurements such as Visual Noise, derived from ISO 15739, are applied.

Visual noise evolution with illuminance levels in handheld condition
This graph shows the evolution of visual noise metric with the level of lux in handheld condition. The visual noise metric is the mean of visual noise measurement on all patches of the Dead Leaves chart in the Close-up Dead Leaves setup. DXOMARK visual noise measurement is derived from ISO15739 standard.

Compared to previous models, image noise was overall fairly well under control (with behavior very close to the Pixel 7 Pro's and better performance than the iPhone 15 Pro Max in this domain), contributing to significant improvements in terms of texture/noise trade-off.

Artifacts

90

Google Pixel 8 Pro

91

Apple iPhone 15 Pro

The artifacts evaluation looks at lens shading, chromatic aberrations, distortion measurement on the Dot chart and MTF, and ringing measurements on the SFR chart in the lab. Particular attention is paid to ghosting, quantization, halos, and hue shifts on the face among others. The more severe and the more frequent the artifact, the higher the point deduction on the score. The main artifacts observed and corresponding point loss are listed below.

In terms of unwanted image artifacts, our testers observed only minor issues, and the Pixel 8 Pro did very well overall in this category. That said, color quantization was noticeable on occasion, as was some slight local loss of texture in several test scenes (represented in the graph by the face rendering artifact score).

Main photo artifacts penalties

Bokeh

70

Google Pixel 8 Pro

80

Honor Magic6 Pro

Bokeh is tested in one dedicated mode, usually portrait or aperture mode, and analyzed by visually inspecting all the images captured in the lab and in natural conditions. The goal is to reproduce portrait photography comparable to one taken with a DSLR and a wide aperture. The main image quality attributes paid attention to are depth estimation, artifacts, blur gradient, and the shape of the bokeh blur spotlights. Portrait image quality attributes (exposure, color, texture) are also taken into account.

The bokeh performance of the Pixel 8 Pro was very close to the predecessor Pixel 7 Pro. The overall results were quite satisfying, despite occasional depth estimation inaccuracies, but achieving a natural DSLR-like blur gradient remained a challenge.

Google Pixel 8 Pro – Slight depth artifacts around the subject
Apple iPhone 15 Pro – Nice subject isolation
Google Pixel 7 Pro – Slight depth artifacts around the subject

Video

149

Google Pixel 8 Pro

156

Apple iPhone 15 Pro
About DXOMARK Selfie Video tests

DXOMARK engineers capture and evaluate more than 2 hours of video in controlled lab environments and in natural low-light, indoor and outdoor scenes, using the front camera’s default settings. The evaluation consists of visually inspecting natural videos taken in various conditions and running objective measurements on videos of charts recorded in the lab under different conditions from 1 to 1000+ lux and color temperatures from 2,300K to 6,500K.

Google Pixel 8 Pro Video scores vs Ultra-Premium
Video tests analyze the same image quality attributes as for still images, such as exposure, color, texture, or noise, in addition to temporal aspects such as speed, smoothness, and stability of exposure, white balance, and autofocus transitions.

In our front camera video tests, the Google Pixel 8 Pro delivered a solid and consistent performance, finding itself just behind the best-in-class Apple iPhone 15 series.

Please note that we tested the Google Pixel 8 Pro video at both HDR and SDR settings and concluded that SDR mode offered slightly better overall quality. It was therefore used for the comparisons below.

Exposure

84

Google Pixel 8 Pro

87

Apple iPhone 15 Pro

Exposure tests evaluate the brightness of the face and the dynamic range, e.g., the ability to render visible details in both bright and dark areas of the image. Stability and temporal adaptation of the exposure are also analyzed.

In our testing, the Pixel 8 Pro exhibited a strong exposure performance, with accurate target exposure, particularly in daylight and under indoor lighting. However, the camera still had difficulties achieving good target exposure in low light. Dynamic range was pretty wide, showing slight improvements over the previous model, especially in low light. Exposure transitions could be slower than on the competitors, but were consistently accurate.

Google Pixel 8 Pro – Accurate face exposure, wide dynamic range

Google Pixel 7 Pro – Accurate face exposure, wide dynamic range

Apple iPhone 15 Pro – Accurate face exposure, wide dynamic range, high contrast (on HDR displays)

Color

85

Google Pixel 8 Pro

90

Apple iPhone 15 Pro

Image-quality color analysis looks at skin-tone rendering, white balance, color shading, stability of the white balance and its adaptation when light is changing.

The camera’s color rendering could sometimes have difficulties in daylight conditions or in very low light. However, it was outstanding when recording indoors, particularly in terms of skin tones, which were remarkably accurate.

Google Pixel 8 Pro – Pleasant skin tones

Google Pixel 7 Pro – Pleasant skin tones

Apple iPhone 15 Pro – Pleasant skin tones

Focus

85

Google Pixel 8 Pro

92

Huawei Mate 40 Pro

The new autofocus system was accurate most of the time but also showed some inconsistencies, particularly when tracking moving subjects. Overall, the addition of autofocus did not increase the focus score compared to the predecessor Pixel 7 Pro. However, it did help improve the texture score significantly, thanks to the new model’s ability to focus accurately across a range of subject distances.

Google Pixel 8 Pro – Wide depth of field, fine detail in background

Google Pixel 7 Pro – Wide depth of field, fine detail in background

Apple iPhone 15 Pro – Wide depth of field, good detail in background

Texture

85

Google Pixel 8 Pro

97

Asus ZenFone 6

Texture tests analyze the level of details and texture of the real-life videos as well as the videos of charts recorded in the lab. Natural video recordings are visually evaluated, with particular attention paid to the level of detail on the facial features. Objective measurements are performed on images of charts taken in various conditions from 1 to 1000 lux. The chart used is the Dead Leaves chart.

Like photo mode, video mode has improved significantly in terms of the texture/noise trade-off, reaching a very good level of detail. The autofocus system, and its ability to achieve good focus across a range of subject distances, played a substantial role in driving these enhancements.

Google Pixel 8 Pro – Good fine detail

Google Pixel 7 Pro – Good fine detail

Apple iPhone 15 Pro – Good detail
Texture acutance evolution with the illuminance level
This graph shows the evolution of texture acutance with the level of lux for two holding conditions. The texture acutance is measured on the Dead Leaves chart in the Close-up Dead Leaves setup.
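
DXOMARK's exact acutance computation is not published, but the idea behind a Dead Leaves texture measurement can be sketched: compare the radial power spectrum of the captured chart with that of a reference pattern, and weight the resulting texture MTF with a contrast sensitivity function. The Python sketch below is a simplified illustration only; the Mannos-Sakrison CSF model, the cycles-per-degree conversion and the synthetic test data are assumptions, not the actual DXOMARK pipeline.

```python
import numpy as np

def radial_psd(img):
    """Radially averaged power spectral density of a grayscale patch."""
    f = np.fft.fftshift(np.fft.fft2(img - img.mean()))
    psd2d = np.abs(f) ** 2
    h, w = img.shape
    y, x = np.indices((h, w))
    r = np.hypot(x - w // 2, y - h // 2).astype(int)
    counts = np.maximum(np.bincount(r.ravel()), 1)
    radial = np.bincount(r.ravel(), psd2d.ravel()) / counts
    freqs = np.arange(radial.size) / max(h, w)          # approx. cycles per pixel
    return freqs, radial

def csf(freq_cpd):
    """Mannos-Sakrison contrast sensitivity function (cycles per degree)."""
    return 2.6 * (0.0192 + 0.114 * freq_cpd) * np.exp(-(0.114 * freq_cpd) ** 1.1)

def acutance(captured, reference, pixels_per_degree=60.0):
    """CSF-weighted average of the captured-vs-reference texture spectrum ratio."""
    f, psd_cap = radial_psd(captured)
    _, psd_ref = radial_psd(reference)
    mtf = np.sqrt(psd_cap / np.maximum(psd_ref, 1e-12))  # texture 'MTF' estimate
    weights = csf(f * pixels_per_degree)
    return float(np.sum(mtf * weights) / np.sum(weights))

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    reference = rng.normal(size=(256, 256))              # stand-in for a texture target
    # Crude low-pass filter to mimic texture loss in a low-light capture.
    blurred = (reference + np.roll(reference, 1, 0) + np.roll(reference, 1, 1)) / 3
    print(f"acutance, sharp capture:   {acutance(reference, reference):.3f}")
    print(f"acutance, blurred capture: {acutance(blurred, reference):.3f}")
```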

Noise

68

Google Pixel 8 Pro

83

Xiaomi Mi 11 Ultra

Noise tests analyze various attributes of noise, such as intensity, chromaticity, grain, structure, and temporal behavior, on real-life video recordings as well as on videos of charts taken in the lab. Natural videos are visually evaluated, with particular attention paid to the noise on faces. Objective measurements are performed on the videos of charts recorded in various conditions from 1 to 1,000 lux. The chart used is the DXOMARK visual noise chart.

Spatial visual noise evolution with the illuminance level
This graph shows the evolution of spatial visual noise with the level of lux. Spatial visual noise is measured on the visual noise chart in the video noise setup. The DXOMARK visual noise measurement is derived from the ISO 15739 standard.
Temporal visual noise evolution with the illuminance level
This graph shows the evolution of temporal visual noise with the level of lux. Temporal visual noise is measured on the visual noise chart in the video noise setup.
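
These two graphs separate spatial noise (variation within a frame) from temporal noise (frame-to-frame flicker). The minimal sketch below, which ignores the perceptual weighting that the ISO 15739-derived visual noise metric applies, simply computes both quantities on a simulated video of a uniform gray patch; the noise levels per lux value are invented for illustration.

```python
import numpy as np

def spatial_noise(frames):
    """Mean within-frame standard deviation on a uniform patch (spatial noise)."""
    return float(np.mean([frame.std() for frame in frames]))

def temporal_noise(frames):
    """Mean per-pixel standard deviation across frames (temporal noise)."""
    stack = np.stack(frames, axis=0)
    return float(stack.std(axis=0).mean())

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    # Simulated clips of a uniform gray patch at two light levels:
    # lower illuminance -> higher gain -> more noise (sigma values are made up).
    for lux, sigma in [(1000, 1.0), (5, 8.0)]:
        frames = [128 + rng.normal(scale=sigma, size=(64, 64)) for _ in range(30)]
        print(f"{lux:>4} lux  spatial={spatial_noise(frames):5.2f}  "
              f"temporal={temporal_noise(frames):5.2f}")
```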

Stabilization

82

Google Pixel 8 Pro

Best

Stabilization evaluation tests the ability of the device to stabilize footage thanks to software or hardware technologies such as OIS, EIS, or any other means. The evaluation looks at overall residual motion on the face and the background, as well as smoothness and jello artifacts, during walking and panning use cases in various lighting conditions. The video below is an extract from one of the tested scenes.

Stabilization performance remained generally effective, with limited residual motion, especially in handheld videos. Stabilization was better than average when panning or walking while recording.
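
As a toy illustration of what a residual-motion measurement can look like, the sketch below estimates frame-to-frame translation with phase correlation and averages the displacement magnitude over a clip. It is an assumption-laden simplification: a real stabilization evaluation would track the face and the background separately and also account for rotation, jello and rolling-shutter effects.

```python
import numpy as np

def translation(a, b):
    """Estimate the (dy, dx) shift between two frames via phase correlation."""
    fa, fb = np.fft.fft2(a), np.fft.fft2(b)
    cross = fa * np.conj(fb)
    corr = np.fft.ifft2(cross / np.maximum(np.abs(cross), 1e-12))
    dy, dx = np.unravel_index(np.argmax(np.abs(corr)), corr.shape)
    h, w = a.shape
    return (dy - h if dy > h // 2 else dy), (dx - w if dx > w // 2 else dx)

def residual_motion(frames):
    """Mean inter-frame displacement magnitude: a crude residual-motion score."""
    shifts = [translation(frames[i + 1], frames[i]) for i in range(len(frames) - 1)]
    return float(np.mean([np.hypot(dy, dx) for dy, dx in shifts]))

if __name__ == "__main__":
    rng = np.random.default_rng(2)
    base = rng.normal(size=(128, 128))
    # Simulate hand shake: each frame is the scene shifted by a random integer offset.
    shaky = [np.roll(base, (int(rng.integers(-4, 5)), int(rng.integers(-4, 5))), axis=(0, 1))
             for _ in range(20)]
    steady = [base] * 20
    print("residual motion, unstabilized: %.2f px" % residual_motion(shaky))
    print("residual motion, stabilized:   %.2f px" % residual_motion(steady))
```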

Google Pixel 8 Pro – Effective stabilization, slight residual motion on subject, slight background deformation

Google Pixel 7 Pro – Effective stabilization, slight residual motion on subject, slight background deformation

Apple iPhone 15 Pro – Effective stabilization, slight residual motion on subject, no background deformation

Artifacts

87

Google Pixel 8 Pro

92

Apple iPhone 12 mini

Artifacts are evaluated with MTF and ringing measurements on the SFR chart in the lab as well as frame-rate measurements using the LED Universal Timer. Natural videos are visually evaluated by paying particular attention to artifacts such as quantization, hue shift, and face-rendering artifacts among others. The more severe and the more frequent the artifact, the higher the point deduction from the score. The main artifacts and corresponding point loss are listed below

Main video artifacts penalties
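
The penalty table itself is not reproduced in this extract, and DXOMARK's penalty values are not public. Purely to illustrate how per-artifact deductions of this kind could roll up into a sub-score, here is a minimal sketch; the artifact names, weights, severities and frequencies are all invented.

```python
# Hypothetical penalty weights: the names and values are invented for illustration.
PENALTIES = {
    "hue shift near saturation": 4.0,
    "color quantization": 3.0,
    "ghosting around moving subjects": 2.0,
}

def artifacts_subscore(base_score, observations):
    """Each observation is (artifact name, severity 0-1, frequency 0-1).
    More severe and more frequent artifacts cost more points."""
    deduction = sum(PENALTIES.get(name, 0.0) * severity * frequency
                    for name, severity, frequency in observations)
    return max(0.0, base_score - deduction)

if __name__ == "__main__":
    observed = [("color quantization", 0.5, 0.3),
                ("ghosting around moving subjects", 0.7, 0.2)]
    print(artifacts_subscore(100.0, observed))   # 100 - 0.45 - 0.28 = 99.27
```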

The post Google Pixel 8 Pro Selfie test appeared first on DXOMARK.

Apple iPhone 15 Pro Max Selfie test https://www.dxomark.com/apple-iphone-15-pro-max-selfie-test/ https://www.dxomark.com/apple-iphone-15-pro-max-selfie-test/#respond Wed, 11 Oct 2023 13:12:52 +0000 https://www.dxomark.com/?p=157989&preview=true&preview_id=157989 We put the Apple iPhone 15 Pro Max through our rigorous DXOMARK Selfie test suite to measure its performance in photo and  video from an end-user perspective. This article breaks down how the device fared in a variety of tests and several common use cases and is intended to highlight the most important results of [...]

The post Apple iPhone 15 Pro Max Selfie test appeared first on DXOMARK.

We put the Apple iPhone 15 Pro Max through our rigorous DXOMARK Selfie test suite to measure its performance in photo and  video from an end-user perspective. This article breaks down how the device fared in a variety of tests and several common use cases and is intended to highlight the most important results of our testing with an extract of the captured data.

Please note that all iPhone 15 Pro Max sample images in this review have been converted to JPG format. Image quality was analyzed using the original HEIC files. For both photo and video, the best visualization pipeline for an optimal HDR experience is described in the article.
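
The conversion workflow is not spelled out in the review; the snippet below shows one plausible way to batch-convert HEIC captures to JPG using the third-party pillow-heif package (pip install pillow pillow-heif). The folder name is a placeholder, and the embedded HDR gain map is discarded in the process, which is exactly why the published JPGs lack the HDR effect.

```python
from pathlib import Path

from PIL import Image
from pillow_heif import register_heif_opener

register_heif_opener()  # lets Pillow open .heic files

# Convert every HEIC file in a (placeholder) samples folder to an SDR JPG.
for heic_path in Path("samples").glob("*.heic"):
    image = Image.open(heic_path)
    image.convert("RGB").save(heic_path.with_suffix(".jpg"), quality=95)
```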

Overview

Key front camera specifications:

  • 12MP sensor
  • f/1.9-aperture lens
  • Autofocus
  • 4K video at 24/25/30/60 fps, 1080p at 25/30/60/120 fps (4K at 30 fps tested)

Scoring

Sub-scores and attributes included in the calculations of the global score.

Apple iPhone 15 Pro Max
Apple iPhone 15 Pro Max
149
selfie
145
Photo
97

Best

103

106

103

105

76

79

61

94

91

Best

78

93

80

Best

156
Video
87

Best

90

Best

91

92

84

97

71

83

87

92

82

Best

Pros

  • Accurate exposure and wide dynamic range for photos and videos
  • Reliable and fast autofocus with a wide depth of field
  • High levels of detail in bright light for photos and videos
  • Effective stabilization when holding the device still and when moving while recording
  • Very accurate subject isolation in bokeh mode

Cons

  • Noise across all shooting conditions
  • Occasional color quantization artifacts in photos and videos
  • Occasional sharpness differences between frames when recording video while walking

The Apple iPhone 15 Pro Max delivered outstanding results in the DXOMARK Selfie tests, earning itself the top spot in our smartphone front camera ranking. The new device uses similar front camera hardware to the previous iPhone 14 generation but still showed notable advancements in both photo and video, thanks to improvements in the software.

The iPhone 15 Pro Max performed particularly well and was the best device tested to date in video mode. This was mainly due to impressive improvements in exposure and noise management. The iPhone 15 Pro Max did not quite take the top spot for still imaging. However, it showed notable advancements over its predecessor and particularly excelled in terms of exposure, color and bokeh, delivering the best results to date in these categories.

Overall, the Apple iPhone 15 Pro Max is an obvious choice for smartphone users who prioritize creating high-quality image and video content using the front camera.

The combination of Apple’s HEIC image files with embedded HDR data produced amazing results when viewed on a dedicated display. Image results were striking both in the Photos app on macOS Sonoma with an XDR display and in the Photos app on the iPhone 15 Pro Max’s own display, with brighter and more vivid rendering than ever. However, users of displays, devices or apps that do not support Apple’s photo HDR format will only see JPG images without the HDR effect.

Test summary

About DXOMARK Selfie tests: For scoring and analysis, DXOMARK engineers capture and evaluate more than 1,500 test images both in controlled lab environments and in outdoor, indoor and low-light natural scenes, using the front camera’s default settings. The photo protocol is designed to take into account the user’s needs and is based on typical shooting scenarios, such as close-up and group selfies. The evaluation is performed by visually inspecting images against a reference of natural scenes, and by running objective measurements on images of charts captured in the lab under different lighting conditions from 1 to 1,000+ lux and color temperatures from 2,300K to 6,500K. For more information about the DXOMARK Selfie test protocol, click here. More details on how we score smartphone cameras are available here. The following section gathers key elements of DXOMARK’s exhaustive tests and analyses. Full performance evaluations are available upon request. Please contact us to find out how to receive a full report.

Apple iPhone 15 Pro Max Selfie Scores vs Ultra-Premium
This graph compares overall photo and video DXOMARK Selfie scores between tested devices and references. Average and maximum scores of the price segment are also indicated. Average and maximum scores for each price segment are computed based on the DXOMARK database of devices.
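
The segment averages and maxima in this graph come from DXOMARK's internal device database. Just to illustrate the aggregation itself, the sketch below computes the same statistics over a tiny, hand-picked subset of overall Selfie scores quoted elsewhere in this article.

```python
from collections import defaultdict

# Miniature stand-in for the device database: (device, price segment, Selfie score).
DEVICES = [
    ("Apple iPhone 15 Pro Max",  "Ultra-Premium", 149),
    ("Samsung Galaxy S23 Ultra", "Ultra-Premium", 141),
    ("Honor Magic5 Pro",         "Ultra-Premium", 123),
    ("Google Pixel 7a",          "High-End",      140),
]

def segment_stats(devices):
    """Average and maximum overall score per price segment."""
    by_segment = defaultdict(list)
    for _, segment, score in devices:
        by_segment[segment].append(score)
    return {seg: (sum(s) / len(s), max(s)) for seg, s in by_segment.items()}

print(segment_stats(DEVICES))
```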

Photo

145

Apple iPhone 15 Pro Max

149

Honor Magic6 Pro
Apple iPhone 15 Pro Max Photo scores vs Ultra-Premium
The photo tests analyze image quality attributes such as exposure, color, texture, and noise in various light conditions. Range of focus and the presence of artifacts on all images captured in controlled lab conditions and in real-life images are also evaluated. All these attributes have a significant impact on the final quality of the images captured with the tested device and can help to understand the camera's main strengths and weaknesses.

In still image mode, the iPhone 15 Pro Max front camera showed important improvements over the previous generation, producing class-leading results in exposure and color, as well as in bokeh, where our testers were impressed with the precise subject isolation and nice bokeh gradient. The autofocus system worked reliably, and a wide depth of field ensured good focus on all subjects in group shots. Detail levels were high in bright light, but image noise was noticeable across all light conditions, like on previous iPhone generations.

Exposure

97

Apple iPhone 15 Pro Max

Best

Exposure is one of the key attributes for technically good pictures. The main attribute evaluated is the brightness of the face(s) in various use cases and light conditions. Other factors evaluated are the contrast and the dynamic range, e.g., the ability to render visible details in both bright and dark areas of the image. Repeatability is also important because it demonstrates the camera's ability to provide the same rendering when shooting consecutive images in a row.
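
As a rough illustration of what face-brightness and dynamic-range measurements can look like, the sketch below computes the deviation of mean face luminance from an assumed 18% target, expressed in EV, plus the fraction of clipped highlight pixels. The 18% target, the clipping threshold and the synthetic data are assumptions for illustration, not DXOMARK's actual thresholds.

```python
import numpy as np

def face_exposure_error_ev(face_pixels, target=0.18):
    """Deviation of mean face luminance from an assumed 18% mid-gray target, in EV."""
    return float(np.log2(np.mean(face_pixels) / target))

def clipped_fraction(image, threshold=0.98):
    """Share of pixels at or above the clipping threshold (blown highlights)."""
    return float(np.mean(image >= threshold))

if __name__ == "__main__":
    rng = np.random.default_rng(3)
    scene = np.clip(rng.normal(0.45, 0.30, size=(200, 200)), 0, 1)  # backlit-ish scene
    face = np.clip(rng.normal(0.22, 0.05, size=(60, 60)), 0, 1)     # the subject's face
    print("face exposure error: %+.2f EV" % face_exposure_error_ev(face))
    print("clipped highlights:  %.1f %%" % (100 * clipped_fraction(scene)))
```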

Compared to the previous iPhone generation, the iPhone 15 Pro Max has significantly improved in terms of dynamic range. Face exposure and contrast were still accurate in most conditions, but highlight clipping was noticeably reduced.

Apple iPhone 15 Pro Max – Bright face exposure, wide dynamic range on background
Apple iPhone 14 Pro – Bright face exposure, highlight clipping on background
Huawei Mate 50 Pro – Good face exposure but stronger highlight clipping than iPhone 15 Pro Max

The iPhone 15 Pro Max also provided excellent contrast on faces. This was especially appreciable when viewing the images on an HDR display, such as the iPhone display itself, the latest-generation iPad Pro, or a Mac running macOS Sonoma with an XDR display.

Color

103

Apple iPhone 15 Pro Max

106

Google Pixel 8 Pro

Color is one of the key attributes for technically good pictures. The image quality attributes analyzed are skin-tone rendering, white balance, color shading, and repeatability.
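
White balance accuracy is often summarized by how far a neutral patch drifts from gray. The minimal sketch below reports R/G and B/G ratios on a gray patch and shows how a simulated orange cast pushes them away from 1.0; DXOMARK's actual color analysis is perceptual (skin tones across illuminants and skin types), so treat this only as an illustration of the underlying idea.

```python
import numpy as np

def white_balance_error(gray_patch_rgb):
    """R/G and B/G ratios on a neutral gray patch; 1.0 means perfectly neutral."""
    r, g, b = (float(np.mean(gray_patch_rgb[..., c])) for c in range(3))
    return r / g, b / g

if __name__ == "__main__":
    rng = np.random.default_rng(4)
    neutral = np.full((32, 32, 3), 0.5) + rng.normal(0, 0.01, (32, 32, 3))
    warm_cast = neutral * np.array([1.12, 1.0, 0.90])   # simulated orange cast
    print("neutral patch R/G, B/G:", [round(x, 3) for x in white_balance_error(neutral)])
    print("orange cast   R/G, B/G:", [round(x, 3) for x in white_balance_error(warm_cast)])
```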

The HDR format also helped produce very natural and pleasant colors, even in challenging light conditions. Skin tones looked very natural, with subtle color variations and authentic hues. This was, again, especially noticeable when viewing images on an HDR display. In addition, the iPhone front camera captured nice colors on background elements, such as the sky or vegetation.

Apple iPhone 15 Pro Max – Pleasant skin tones, nice color on sky and trees
Apple iPhone 14 Pro – Pleasant skin tones, nice color on sky and trees
Huawei Mate 50 Pro – Less pleasant skin tone rendering

Focus

103

Apple iPhone 15 Pro Max

105

Honor Magic6 Pro

Autofocus tests evaluate the accuracy of the focus on the subject’s face, the repeatability of an accurate focus, and the depth of field. While a shallow depth of field can be pleasant for a single-subject selfie or close-up shot, it can be problematic in specific conditions such as group selfies; both situations are tested. Focus accuracy is also evaluated in all the real-life images taken, from 30cm to 150cm, and in low light to outdoor conditions.

Like its predecessor, the iPhone 15 Pro Max uses an autofocus system in the front camera. This allows the camera to optimize focus across a range of subject distances. It also helps increase background detail when shooting at longer subject distances, for example with a selfie stick. Compared to the previous generation, our testers observed even fewer focus failures across all test conditions.
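
To see why autofocus pays off at typical selfie distances, here is a back-of-the-envelope thin-lens depth-of-field calculation. The f/1.9 aperture matches the spec above, but the focal length and circle of confusion are assumed values for a small front-camera sensor; the numbers are illustrative only.

```python
def depth_of_field(f_mm, f_number, coc_mm, subject_mm):
    """Near and far limits of acceptable sharpness (thin-lens approximation)."""
    hyperfocal = f_mm ** 2 / (f_number * coc_mm) + f_mm
    near = subject_mm * (hyperfocal - f_mm) / (hyperfocal + subject_mm - 2 * f_mm)
    far = (subject_mm * (hyperfocal - f_mm) / (hyperfocal - subject_mm)
           if subject_mm < hyperfocal else float("inf"))
    return near, far

# Assumed optics: 2.7 mm focal length, f/1.9, 0.003 mm circle of confusion.
for subject in (300.0, 1500.0):                 # 30 cm and 150 cm, in millimeters
    near, far = depth_of_field(2.7, 1.9, 0.003, subject)
    far_txt = "infinity" if far == float("inf") else "%.0f cm" % (far / 10)
    print("focus at %.0f cm -> sharp from %.0f cm to %s" % (subject / 10, near / 10, far_txt))
```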

Apple iPhone 15 Pro Max – Face and background in focus
Apple iPhone 14 Pro – Face and background in focus
Huawei Mate 50 Pro – Face slightly out of focus, blurry background

Texture

76

Apple iPhone 15 Pro Max

79

Asus ZenFone 7 Pro

Texture tests analyze the level of details and the texture of subjects in the images taken in the lab as well as in real-life scenarios. For natural shots, particular attention is paid to the level of details in facial features, such as the eyes. Objective measurements are performed on chart images taken in various lighting conditions from 1 to 1000 lux and different kinds of dynamic range conditions. The charts used are the proprietary DXOMARK chart (DMC) and the Dead Leaves chart.

Texture acutance evolution with the illuminance level
This graph shows the evolution of texture acutance with the level of lux for two holding conditions. The texture acutance is measured on the Dead Leaves chart in the Close-up Dead Leaves setup.

Noise

61

Apple iPhone 15 Pro Max

94

Huawei Mate 50 Pro

Noise tests analyze various attributes of noise, such as intensity, chromaticity, grain, and structure, on real-life images as well as on images of charts taken in the lab. For natural images, particular attention is paid to the noise on faces, but also in dark areas and in high dynamic range conditions. Objective measurements are performed on images of charts taken in various conditions from 1 to 1,000 lux and different kinds of dynamic range conditions. The chart used is the DXOMARK Dead Leaves chart, and standardized measurements such as Visual Noise, derived from ISO 15739, are also applied.

Visual noise evolution with illuminance levels in handheld conditions
This graph shows the evolution of the visual noise metric with the level of lux in handheld conditions. The visual noise metric is the mean of the visual noise measurements on all patches of the Dead Leaves chart in the Close-up Dead Leaves setup. The DXOMARK visual noise measurement is derived from the ISO 15739 standard.

In low light, noise management left some room for improvement. Image noise was quite noticeable on both subject and background. The observations in our perceptual tests were confirmed by objective measurements in the lab.

Apple iPhone 15 Pro Max - Low light
Apple iPhone 15 Pro Max - Loss of detail, noise
Apple iPhone 14 Pro Max - Low light
Apple iPhone 14 Pro Max - Noise
Huawei Mate 50 Pro - Low light
Huawei Mate 50 Pro - Low noise levels, slightly unnatural detail

Artifacts

91

Apple iPhone 15 Pro Max

Best

The artifacts evaluation looks at lens shading, chromatic aberrations, distortion measurement on the Dot chart and MTF, and ringing measurements on the SFR chart in the lab. Particular attention is paid to ghosting, quantization, halos, and hue shifts on the face among others. The more severe and the more frequent the artifact, the higher the point deduction on the score. The main artifacts observed and corresponding point loss are listed below.

Main photo artifacts penalties

Bokeh

80

Apple iPhone 15 Pro Max

Best

Bokeh is tested in one dedicated mode, usually portrait or aperture mode, and analyzed by visually inspecting all the images captured in the lab and in natural conditions. The goal is to reproduce portrait photography comparable to one taken with a DSLR and a wide aperture. The main image quality attributes paid attention to are depth estimation, artifacts, blur gradient, and the shape of the bokeh blur spotlights. Portrait image quality attributes (exposure, color, texture) are also taken into account.
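
Computational bokeh generally works by estimating a depth map and blurring each pixel in proportion to its distance from the focal plane, which is where the depth-estimation and blur-gradient attributes come from. The toy sketch below uses a hand-made depth map and a plain box blur purely as an illustration; real pipelines rely on segmentation or depth sensors, disc-shaped kernels and occlusion handling.

```python
import numpy as np

def simulate_bokeh(image, depth, focus_depth, max_radius=6):
    """Blur each pixel with a box kernel whose radius grows with depth difference."""
    h, w = image.shape
    out = np.empty_like(image)
    radius = np.clip(np.abs(depth - focus_depth) * max_radius, 0, max_radius).astype(int)
    for y in range(h):
        for x in range(w):
            r = radius[y, x]
            y0, y1 = max(0, y - r), min(h, y + r + 1)
            x0, x1 = max(0, x - r), min(w, x + r + 1)
            out[y, x] = image[y0:y1, x0:x1].mean()
    return out

if __name__ == "__main__":
    rng = np.random.default_rng(5)
    img = rng.random((80, 80))
    depth = np.full((80, 80), 1.0)      # background at depth 1.0 ...
    depth[20:60, 25:55] = 0.2           # ... subject at depth 0.2 in the center
    result = simulate_bokeh(img, depth, focus_depth=0.2)
    print("background blurred:", result[:10, :10].std() < img[:10, :10].std())
    print("subject preserved: ", bool(np.allclose(result[30:50, 30:50], img[30:50, 30:50])))
```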

The previous iPhone generation already delivered a great front camera bokeh simulation and was at the top of this category in our database. The iPhone 15 Pro Max has now raised the bar even higher, capturing the best bokeh images we have seen to date. Our test samples featured a smooth blur gradient, as well as foreground blur, accurate depth estimation, and a natural bokeh shape. Thanks to improved subject isolation, bokeh mode images looked more authentic and natural, with even fine detail, such as hair, nicely preserved.

Apple iPhone 15 Pro Max – Slight depth artifacts around the face, nice blur gradient
Apple iPhone 14 Pro – Depth artifacts around the face, nice blur gradient
Huawei Mate 50 Pro – Depth artifacts, no blur gradient

Video

156

Apple iPhone 15 Pro Max

Best

About DXOMARK Selfie Video tests

DXOMARK engineers capture and evaluate more than 2 hours of video in controlled lab environments and in natural low-light, indoor and outdoor scenes, using the front camera’s default settings. The evaluation consists of visually inspecting natural videos taken in various conditions and running objective measurements on videos of charts recorded in the lab under different conditions from 1 to 1000+ lux and color temperatures from 2,300K to 6,500K.

Apple iPhone 15 Pro Max Video scores vs Ultra-Premium
Video tests analyze the same image quality attributes as for still images, such as exposure, color, texture, or noise, in addition to temporal aspects such as speed, smoothness, and stability of exposure, white balance, and autofocus transitions.

When recording front camera video, the Apple iPhone 15 Pro Max excelled in exposure, color (white balance), texture, and autofocus. The combination of these strengths allowed the device to push the level of front camera video quality to new heights and earn itself the current top score for Selfie Video.

Compared to the predecessor iPhone 14 Pro Max, noise reduction has been improved, and our testers also observed slightly higher levels of detail. However, it is worth noting that the video mode is not without its imperfections. For instance, noise could still be noticeable in low light. On occasion we also observed a slight deviation in skin tone rendering when recording backlit indoor scenes. Overall, video stabilization was highly effective, but there were occasionally noticeable variations in sharpness between frames when recording while in motion.

The front camera video mode of the iPhone 15 Pro Max has been tested at 4K resolution, 30 frames per second, and in the Dolby Vision format.

Exposure

87

Apple iPhone 15 Pro Max

Best

Exposure tests evaluate the brightness of the face and the dynamic range, e.g., the ability to render visible details in both bright and dark areas of the image. Stability and temporal adaptation of the exposure are also analyzed.

In our test, the iPhone 15 Pro Max consistently delivered clear and well-exposed videos across all test conditions. In addition, the improved dynamic range resulted in minimal highlight clipping.

Apple iPhone 15 Pro Max – Accurate face exposure throughout the clip, wide dynamic range

Apple iPhone 14 Pro – Accurate face exposure throughout the clip, wide dynamic range

Google Pixel 7 Pro – Good face exposure, wide dynamic range

Color

90

Apple iPhone 15 Pro Max

Best

Image-quality color analysis looks at skin-tone rendering, white balance, color shading, and the stability of the white balance and its adaptation when the light changes.

Color rendering was natural in most conditions, without any noticeable color casts. Skin tones were rendered accurately, ensuring realistic group shots and preserving the distinct characteristics and natural appearance of all skin tone types.

Apple iPhone 15 Pro Max – Pleasant skin tones

Apple iPhone 14 Pro – Pleasant skin tones

Google Pixel 7 Pro – Pleasant skin tones

Focus

91

Apple iPhone 15 Pro Max

92

Huawei Mate 40 Pro

In our front camera tests, the autofocus was quick and reliable, even when recording on the move. Our testers observed hardly any failures, with sharp subjects and good background detail.

Apple iPhone 15 Pro Max – Wide depth of field, fine detail in background

Apple iPhone 14 Pro – Wide depth of field, fine detail in background

Google Pixel 7 Pro – Wide depth of field, good detail in background

Texture

84

Apple iPhone 15 Pro Max

97

Asus ZenFone 6

Texture tests analyze the level of detail and texture of real-life videos as well as of the videos of charts recorded in the lab. Natural video recordings are visually evaluated, with particular attention paid to the level of detail in the facial features. Objective measurements are performed on videos of charts recorded in various conditions from 1 to 1,000 lux. The chart used is the Dead Leaves chart.

The iPhone 15 Pro Max did not quite beat the very best in terms of video texture, but still maintained impressively high levels of detail, particularly in bright conditions. Fine detail was preserved effectively, with noticeable improvements over last year’s iPhone 14 generation.

Apple iPhone 15 Pro Max – Good fine detail

Apple iPhone 14 Pro – Good fine detail

Google Pixel 7 Pro – Good detail
Texture acutance evolution with the illuminance level
This graph shows the evolution of texture acutance with the level of lux for two holding conditions. The texture acutance is measured on the Dead Leaves chart in the Close-up Dead Leaves setup.

Noise

71

Apple iPhone 15 Pro Max

83

Xiaomi Mi 11 Ultra

Noise tests analyze various attributes of noise, such as intensity, chromaticity, grain, structure, and temporal behavior, on real-life video recordings as well as on videos of charts taken in the lab. Natural videos are visually evaluated, with particular attention paid to the noise on faces. Objective measurements are performed on the videos of charts recorded in various conditions from 1 to 1,000 lux. The chart used is the DXOMARK visual noise chart.

Noise reduction remained a challenge for the iPhone 15 Pro Max, but some improvements over the iPhone 14 Pro were noticeable. When recording in bright light, noise was hardly noticeable, but in low light conditions luminance noise could make an appearance, especially on background elements and in the corners of the frame.

Spatial visual noise evolution with the illuminance level
This graph shows the evolution of spatial visual noise with the level of lux. Spatial visual noise is measured on the visual noise chart in the video noise setup. The DXOMARK visual noise measurement is derived from the ISO 15739 standard.
Temporal visual noise evolution with the illuminance level
This graph shows the evolution of temporal visual noise with the level of lux. Temporal visual noise is measured on the visual noise chart in the video noise setup.

Stabilization

82

Apple iPhone 15 Pro Max

Best

Stabilization evaluation tests the ability of the device to stabilize footage thanks to software or hardware technologies such as OIS, EIS, or any other means. The evaluation looks at overall residual motion on the face and the background, as well as smoothness and jello artifacts, during walking and panning use cases in various lighting conditions. The video below is an extract from one of the tested scenes.

Like on the previous iPhone generation, video stabilization was impressive, counteracting even fairly challenging camera motion effectively. Videos recorded while holding the device still were perfectly steady and walking clips were rendered smoothly, thanks to a subtle combination of face and background stabilization. However, our testers still observed some sharpness differences between frames when recording while walking.

Apple iPhone 15 Pro Max – Effective stabilization, slight sharpness differences between frames

Apple iPhone 14 Pro – Effective stabilization, slight sharpness differences between frames

Google Pixel 7 Pro – Good stabilization, sharpness difference between frames

Artifacts

87

Apple iPhone 15 Pro Max

92

Apple iPhone 12 mini

Artifacts are evaluated with MTF and ringing measurements on the SFR chart in the lab as well as frame-rate measurements using the LED Universal Timer. Natural videos are visually evaluated by paying particular attention to artifacts such as quantization, hue shift, and face-rendering artifacts among others. The more severe and the more frequent the artifact, the higher the point deduction from the score. The main artifacts and corresponding point loss are listed below

In our front camera tests, clips recorded with the iPhone 15 Pro Max were largely free of unwanted artifacts. Only in low light did we observe some color quantization on occasion. Slight ghosting could be noticeable around subjects in backlit scenes, or when capturing moving subjects in low light.

Main video artifacts penalties

The post Apple iPhone 15 Pro Max Selfie test appeared first on DXOMARK.

Honor Magic5 Pro Selfie test https://www.dxomark.com/honor-magic5-pro-selfie-test/ https://www.dxomark.com/honor-magic5-pro-selfie-test/#respond Wed, 04 Oct 2023 14:18:10 +0000 https://www.dxomark.com/?p=156036&preview=true&preview_id=156036 We put the Honor Magic5 Pro through our rigorous DXOMARK Selfie test suite to measure its performance in photo and  video from an end-user perspective. This article breaks down how the device fared in a variety of tests and several common use cases and is intended to highlight the most important results of our testing [...]

The post Honor Magic5 Pro Selfie test appeared first on DXOMARK.

We put the Honor Magic5 Pro through our rigorous DXOMARK Selfie test suite to measure its performance in photo and  video from an end-user perspective. This article breaks down how the device fared in a variety of tests and several common use cases and is intended to highlight the most important results of our testing with an extract of the captured data.

Overview

Key front camera specifications:

  • 12MP sensor, 1.22µm pixels
  • f/2.4-aperture lens with 100º field of view

Scoring

Sub-scores and attributes included in the calculations of the global score.

Honor Magic5 Pro
Honor Magic5 Pro
123
selfie
116
Photo
72

97

84

106

93

105

52

79

74

94

69

91

76

93

65

80

136
Video
75

87

83

90

90

92

79

97

55

83

81

92

77

82

Pros

  • Accurate exposure in bright light and indoors, for photo and video
  • Consistent exposure across consecutive shots
  • Effective video stabilization
  • Fairly good in video
  • Good depth estimation in bokeh mode

Cons

  • Limited dynamic range in difficult backlit photo scenes
  • Low level of detail in photo
  • Occasionally inaccurate white balance and skin tones rendering in photo and video
  • Temporal video noise
  • Noticeable adaptations in video panning shots

The Honor Magic5 Pro delivered a decent performance in the DXOMARK Selfie test but could not quite keep up with the best in its class. In photo mode the camera captured decent still images with good exposure. However, our testers found dynamic range to be limited, color rendering could have been better and the trade-off between detail retention and noise reduction left some room for improvement as well.

Performance in video mode was better than for stills, with good video stabilization making the Honor a good option for vlogging and similar use cases. Exposure was usually good as well but could be slightly unstable on occasion. Our experts also noticed occasionally inaccurate skin tones in the recorded video footage.

Honor Magic 5 Pro Selfie Scores vs Ultra-Premium
This graph compares overall photo and video DXOMARK Selfie scores between tested devices and references. Average and maximum scores of the price segment are also indicated. Average and maximum scores for each price segment are computed based on the DXOMARK database of devices.

Test summary

About DXOMARK Selfie tests: For scoring and analysis, DXOMARK engineers capture and evaluate more than 1,500 test images both in controlled lab environments and in outdoor, indoor and low-light natural scenes, using the front camera’s default settings. The photo protocol is designed to take into account the user’s needs and is based on typical shooting scenarios, such as close-up and group selfies. The evaluation is performed by visually inspecting images against a reference of natural scenes, and by running objective measurements on images of charts captured in the lab under different lighting conditions from 1 to 1,000+ lux and color temperatures from 2,300K to 6,500K. For more information about the DXOMARK Selfie test protocol, click here. More details on how we score smartphone cameras are available here. The following section gathers key elements of DXOMARK’s exhaustive tests and analyses. Full performance evaluations are available upon request. Please contact us to find out how to receive a full report.

Photo

116

Honor Magic5 Pro

149

Honor Magic6 Pro
Honor Magic 5 Pro Photo scores vs Ultra-Premium
The photo tests analyze image quality attributes such as exposure, color, texture, and noise in various light conditions. Range of focus and the presence of artifacts on all images captured in controlled lab conditions and in real-life images are also evaluated. All these attributes have a significant impact on the final quality of the images captured with the tested device and can help to understand the camera's main strengths and weaknesses.

Exposure

72

Honor Magic5 Pro

97

Apple iPhone 15 Pro

Color

84

Honor Magic5 Pro

106

Google Pixel 8 Pro

Exposure and color are the key attributes for technically good pictures. For exposure, the main attribute evaluated is the brightness of the face(s) in various use cases and light conditions. Other factors evaluated are the contrast and the dynamic range, e.g., the ability to render visible details in both bright and dark areas of the image. Repeatability is also important because it demonstrates the camera's ability to provide the same rendering when shooting consecutive images in a row.
For color, the image quality attributes analyzed are skin-tone rendering, white balance, color shading, and repeatability.

Honor Magic5 Pro –  Orange white balance cast, inaccurate skin tones, slight highlight clipping, good contrast
Apple iPhone 14 Pro – Neutral white balance, accurate skin tones, strong highlight clipping, excellent contrast
Huawei Mate 50 Pro – Neutral white balance, desaturated skin tones, very slight highlight clipping, low contrast

Focus

93

Honor Magic5 Pro

105

Honor Magic6 Pro

Autofocus tests evaluate the accuracy of the focus on the subject’s face, the repeatability of an accurate focus, and the depth of field. While a shallow depth of field can be pleasant for a single-subject selfie or close-up shot, it can be problematic in specific conditions such as group selfies; both situations are tested. Focus accuracy is also evaluated in all the real-life images taken, from 30cm to 150cm, and in low light to outdoor conditions.

Honor Magic5 Pro – Close-up portrait – Subject in focus, noise, inaccurate skin tones
Apple iPhone 14 Pro – Close-up portrait – Subject in focus, slight noise, accurate skin tones
Huawei Mate 50 Pro – Close-up portrait – Subject in focus, no noise, accurate skin tones

Texture

52

Honor Magic5 Pro

79

Asus ZenFone 7 Pro

Texture tests analyze the level of details and the texture of subjects in the images taken in the lab as well as in real-life scenarios. For natural shots, particular attention is paid to the level of details in facial features, such as the eyes. Objective measurements are performed on chart images taken in various lighting conditions from 1 to 1000 lux and different kinds of dynamic range conditions. The charts used are the proprietary DXOMARK chart (DMC) and the Dead Leaves chart.

Texture acutance evolution with the illuminance level
This graph shows the evolution of texture acutance with the level of lux for two holding conditions. The texture acutance is measured on the Dead Leaves chart in the Close-up Dead Leaves setup.

Noise

74

Honor Magic5 Pro

94

Huawei Mate 50 Pro

Noise tests analyze various attributes of noise, such as intensity, chromaticity, grain, and structure, on real-life images as well as on images of charts taken in the lab. For natural images, particular attention is paid to the noise on faces, but also in dark areas and in high dynamic range conditions. Objective measurements are performed on images of charts taken in various conditions from 1 to 1,000 lux and different kinds of dynamic range conditions. The chart used is the DXOMARK Dead Leaves chart, and standardized measurements such as Visual Noise, derived from ISO 15739, are also applied.

Visual noise evolution with illuminance levels in handheld conditions
This graph shows the evolution of the visual noise metric with the level of lux in handheld conditions. The visual noise metric is the mean of the visual noise measurements on all patches of the Dead Leaves chart in the Close-up Dead Leaves setup. The DXOMARK visual noise measurement is derived from the ISO 15739 standard.

Artifacts

69

Honor Magic5 Pro

91

Apple iPhone 15 Pro

The artifacts evaluation looks at lens shading, chromatic aberrations, distortion measurement on the Dot chart and MTF, and ringing measurements on the SFR chart in the lab. Particular attention is paid to ghosting, quantization, halos, and hue shifts on the face among others. The more severe and the more frequent the artifact, the higher the point deduction on the score. The main artifacts observed and corresponding point loss are listed below.

Main photo artifacts penalties

Video

136

Honor Magic5 Pro

156

Apple iPhone 15 Pro
About DXOMARK Selfie Video tests

DXOMARK engineers capture and evaluate more than 2 hours of video in controlled lab environments and in natural low-light, indoor and outdoor scenes, using the front camera’s default settings. The evaluation consists of visually inspecting natural videos taken in various conditions and running objective measurements on videos of charts recorded in the lab under different conditions from 1 to 1000+ lux and color temperatures from 2,300K to 6,500K.

Honor Magic 5 Pro Video scores vs Ultra-Premium
Video tests analyze the same image quality attributes as for still images, such as exposure, color, texture, or noise, in addition to temporal aspects such as speed, smoothness, and stability of exposure, white balance, and autofocus transitions.

Video performance was generally solid. Motion was well compensated, and frame shift was barely visible. Target exposure was generally accurate, although some slight exposure adaptations were visible during panning. White balance was usually pleasant, but skin tones sometimes had an orange hue. The level of detail was generally high; however, noise was often visible, especially in lab measurements. Few artifacts were visible.

Exposure

75

Honor Magic5 Pro

87

Apple iPhone 15 Pro

Color

83

Honor Magic5 Pro

90

Apple iPhone 15 Pro

Exposure tests evaluate the brightness of the face and the dynamic range, e.g., the ability to render visible details in both bright and dark areas of the image. Stability and temporal adaptation of the exposure are also analyzed. Image-quality color analysis looks at skin-tone rendering, white balance, color shading, and the stability of the white balance and its adaptation when the light changes.

Honor Magic 5 Pro – Accurate exposure, slightly unpleasant skin tones (orange hue)

Apple iPhone 14 Pro – Accurate target exposure and skin tone rendering

Huawei Mate 50 Pro – Accurate target exposure and skin tone rendering

Texture

79

Honor Magic5 Pro

97

Asus ZenFone 6

Texture tests analyze the level of detail and texture of real-life videos as well as of the videos of charts recorded in the lab. Natural video recordings are visually evaluated, with particular attention paid to the level of detail in the facial features. Objective measurements are performed on videos of charts recorded in various conditions from 1 to 1,000 lux. The chart used is the Dead Leaves chart.

Texture acutance evolution with the illuminance level
This graph shows the evolution of texture acutance with the level of lux for two holding conditions. The texture acutance is measured on the Dead Leaves chart in the Close-up Dead Leaves setup.

Noise

55

Honor Magic5 Pro

83

Xiaomi Mi 11 Ultra

Noise tests analyze various attributes of noise, such as intensity, chromaticity, grain, structure, and temporal behavior, on real-life video recordings as well as on videos of charts taken in the lab. Natural videos are visually evaluated, with particular attention paid to the noise on faces. Objective measurements are performed on the videos of charts recorded in various conditions from 1 to 1,000 lux. The chart used is the DXOMARK visual noise chart.

Spatial visual noise evolution with the illuminance level
This graph shows the evolution of spatial visual noise with the level of lux. Spatial visual noise is measured on the visual noise chart in the video noise setup. The DXOMARK visual noise measurement is derived from the ISO 15739 standard.
Temporal visual noise evolution with the illuminance level
This graph shows the evolution of temporal visual noise with the level of lux. Temporal visual noise is measured on the visual noise chart in the video noise setup.

Stabilization

77

Honor Magic5 Pro

82

Apple iPhone 15 Pro

Stabilization evaluation tests the ability of the device to stabilize footage thanks to software or hardware technologies such as OIS, EIS, or any other means. The evaluation looks at overall residual motion on the face and the background, as well as smoothness and jello artifacts, during walking and panning use cases in various lighting conditions. The video below is an extract from one of the tested scenes.

Honor Magic 5 Pro – Effective stabilization

Apple iPhone 14 Pro – Effective stabilization

Huawei Mate 50 Pro – Effective stabilization

Artifacts

81

Honor Magic5 Pro

92

Apple iPhone 12 mini

Artifacts are evaluated with MTF and ringing measurements on the SFR chart in the lab as well as frame-rate measurements using the LED Universal Timer. Natural videos are visually evaluated by paying particular attention to artifacts such as quantization, hue shift, and face-rendering artifacts among others. The more severe and the more frequent the artifact, the higher the point deduction from the score. The main artifacts and corresponding point loss are listed below

Main video artifacts penalties

The post Honor Magic5 Pro Selfie test appeared first on DXOMARK.

Google Pixel 7a Selfie test https://www.dxomark.com/google-pixel-7a-selfie-test/ https://www.dxomark.com/google-pixel-7a-selfie-test/#respond Tue, 11 Jul 2023 13:08:49 +0000 https://www.dxomark.com/?p=152767&preview=true&preview_id=152767 We put the Google Pixel 7a through our rigorous DXOMARK Selfie test suite to measure its performance in photo and  video from an end-user perspective. This article breaks down how the device fared in a variety of tests and several common use cases and is intended to highlight the most important results of our testing [...]

The post Google Pixel 7a Selfie test appeared first on DXOMARK.

We put the Google Pixel 7a through our rigorous DXOMARK Selfie test suite to measure its performance in photo and  video from an end-user perspective. This article breaks down how the device fared in a variety of tests and several common use cases and is intended to highlight the most important results of our testing with an extract of the captured data.

Overview

Key front camera specifications:

  • 13MP sensor, 1.12μm pixels
  • f/2.2 aperture
  • 94° field of view
  • Fixed focus
  • Video: 4K/30 fps, 1080p/30fps

Scoring

Sub-scores and attributes included in the calculations of the global score.

Google Pixel 7a
Google Pixel 7a
140
selfie
136
Photo
91

97

105

106

93

105

55

79

79

94

90

91

72

93

65

80

146
Video
81

87

85

90

88

92

71

97

70

83

88

92

82

Best

Pros

  • Usually accurate white balance
  • Accurate target exposure and wide dynamic range in most conditions
  • Fairly accurate depth estimation in bokeh mode

Cons

  • Slight loss of fine detail, especially in low light
  • Coarse noise can be visible, especially in low light
  • Strong vignetting in flash shots
  • Faces sometimes slightly out of focus in close distance shots

The Google Pixel 7a offered a very good performance in our DXOMARK front-camera test, with results close to the higher-tier Google Pixel 7. The good Photo score was based on the excellent exposure and vivid color rendering. Skin tones were rendered nicely across all skin types and light conditions. Our experts also liked the depth estimation in bokeh mode but noticed some noise and a loss of fine detail in low light.

The Pixel 7a also did very well in video mode, thanks to a wide dynamic range, as well as generally accurate white balance and skin tones. Video stabilization was excellent, too, effectively counteracting hand motion and stabilizing the subject’s face in the frame. On the downside, our testers noticed a lack of detail in low-light recordings, as well as occasionally high levels of video noise.

Selfie Scores vs High-End
This graph compares overall photo and video DXOMARK Selfie scores between tested devices and references. Average and maximum scores of the price segment are also indicated. Average and maximum scores for each price segment are computed based on the DXOMARK database of devices.

Test summary

About DXOMARK Selfie tests: For scoring and analysis, DXOMARK engineers capture and evaluate more than 1,500 test images both in controlled lab environments and in outdoor, indoor and low-light natural scenes, using the front camera’s default settings. The photo protocol is designed to take into account the user’s needs and is based on typical shooting scenarios, such as close-up and group selfies. The evaluation is performed by visually inspecting images against a reference of natural scenes, and by running objective measurements on images of charts captured in the lab under different lighting conditions from 1 to 1,000+ lux and color temperatures from 2,300K to 6,500K. For more information about the DXOMARK Selfie test protocol, click here. More details on how we score smartphone cameras are available here. The following section gathers key elements of DXOMARK’s exhaustive tests and analyses. Full performance evaluations are available upon request. Please contact us to find out how to receive a full report.

Photo

136

Google Pixel 7a

149

Honor Magic6 Pro
Photo scores vs High-End
The photo tests analyze image quality attributes such as exposure, color, texture, and noise in various light conditions. Range of focus and the presence of artifacts on all images captured in controlled lab conditions and in real-life images are also evaluated. All these attributes have a significant impact on the final quality of the images captured with the tested device and can help to understand the camera's main strengths and weaknesses.

Exposure

91

Google Pixel 7a

97

Apple iPhone 15 Pro

Color

105

Google Pixel 7a

106

Google Pixel 8 Pro

Exposure and color are the key attributes for technically good pictures. For exposure, the main attribute evaluated is the brightness of the face(s) in various use cases and light conditions. Other factors evaluated are the contrast and the dynamic range, e.g., the ability to render visible details in both bright and dark areas of the image. Repeatability is also important because it demonstrates the camera's ability to provide the same rendering when shooting consecutive images in a row.
For color, the image quality attributes analyzed are skin-tone rendering, white balance, color shading, and repeatability.

Google Pixel 7a – Accurate exposure and color
Apple iPhone 14 – Accurate exposure and color

Focus

93

Google Pixel 7a

105

Honor Magic6 Pro

Autofocus tests evaluate the accuracy of the focus on the subject’s face, the repeatability of an accurate focus, and the depth of field. While a shallow depth of field can be pleasant for a single-subject selfie or close-up shot, it can be problematic in specific conditions such as group selfies; both situations are tested. Focus accuracy is also evaluated in all the real-life images taken, from 30cm to 150cm, and in low light to outdoor conditions.

Google Pixel 7a – Face slightly out of focus
Google Pixel 7 – Face slightly out of focus

Texture

55

Google Pixel 7a

79

Asus ZenFone 7 Pro

Texture tests analyze the level of details and the texture of subjects in the images taken in the lab as well as in real-life scenarios. For natural shots, particular attention is paid to the level of details in facial features, such as the eyes. Objective measurements are performed on chart images taken in various lighting conditions from 1 to 1000 lux and different kinds of dynamic range conditions. The charts used are the proprietary DXOMARK chart (DMC) and the Dead Leaves chart.

Texture acutance evolution with the illuminance level
This graph shows the evolution of texture acutance with the level of lux for two holding conditions. The texture acutance is measured on the Dead Leaves chart in the Close-up Dead Leaves setup.
Google Pixel 7a - Texture
Google Pixel 7a - Good level of texture
Apple iPhone 14 - Texture
Apple iPhone 14 - Good level of texture
Google Pixel 7 - Texture
Google Pixel 7 - Good level of texture

Noise

79

Google Pixel 7a

94

Huawei Mate 50 Pro

Noise tests analyze various attributes of noise, such as intensity, chromaticity, grain, and structure, on real-life images as well as on images of charts taken in the lab. For natural images, particular attention is paid to the noise on faces, but also in dark areas and in high dynamic range conditions. Objective measurements are performed on images of charts taken in various conditions from 1 to 1,000 lux and different kinds of dynamic range conditions. The chart used is the DXOMARK Dead Leaves chart, and standardized measurements such as Visual Noise, derived from ISO 15739, are also applied.

Visual noise evolution with illuminance levels in handheld conditions
This graph shows the evolution of the visual noise metric with the level of lux in handheld conditions. The visual noise metric is the mean of the visual noise measurements on all patches of the Dead Leaves chart in the Close-up Dead Leaves setup. The DXOMARK visual noise measurement is derived from the ISO 15739 standard.

Artifacts

90

Google Pixel 7a

91

Apple iPhone 15 Pro

The artifacts evaluation looks at lens shading, chromatic aberrations, distortion measurement on the Dot chart and MTF, and ringing measurements on the SFR chart in the lab. Particular attention is paid to ghosting, quantization, halos, and hue shifts on the face among others. The more severe and the more frequent the artifact, the higher the point deduction on the score. The main artifacts observed and corresponding point loss are listed below.

Bokeh

65

Google Pixel 7a

80

Honor Magic6 Pro

Bokeh is tested in one dedicated mode, usually portrait or aperture mode, and analyzed by visually inspecting all the images captured in the lab and in natural conditions. The goal is to reproduce portrait photography comparable to one taken with a DSLR and a wide aperture. The main image quality attributes paid attention to are depth estimation, artifacts, blur gradient, and the shape of the bokeh blur spotlights. Portrait image quality attributes (exposure, color, texture) are also taken into account.

 

Google Pixel 7a – No blur gradient
Apple iPhone 14 – Blur gradient
Google Pixel 7 – No blur gradient

Video

146

Google Pixel 7a

156

Apple iPhone 15 Pro
About DXOMARK Selfie Video tests

DXOMARK engineers capture and evaluate more than 2 hours of video in controlled lab environments and in natural low-light, indoor and outdoor scenes, using the front camera’s default settings. The evaluation consists of visually inspecting natural videos taken in various conditions and running objective measurements on videos of charts recorded in the lab under different conditions from 1 to 1000+ lux and color temperatures from 2,300K to 6,500K.

Video scores vs High-End
Video tests analyze the same image quality attributes as for still images, such as exposure, color, texture, or noise, in addition to temporal aspects such as speed, smoothness, and stability of exposure, white balance, and autofocus transitions.

Exposure

81

Google Pixel 7a

87

Apple iPhone 15 Pro

Color

85

Google Pixel 7a

90

Apple iPhone 15 Pro

Exposure tests evaluate the brightness of the face and the dynamic range, e.g., the ability to render visible details in both bright and dark areas of the image. Stability and temporal adaptation of the exposure are also analyzed. Image-quality color analysis looks at skin-tone rendering, white balance, color shading, and the stability of the white balance and its adaptation when the light changes.

Texture

71

Google Pixel 7a

97

Asus ZenFone 6

Texture tests analyze the level of detail and texture of real-life videos as well as of the videos of charts recorded in the lab. Natural video recordings are visually evaluated, with particular attention paid to the level of detail in the facial features. Objective measurements are performed on videos of charts recorded in various conditions from 1 to 1,000 lux. The chart used is the Dead Leaves chart.

Texture acutance evolution with the illuminance level
This graph shows the evolution of texture acutance with the level of lux for two holding conditions. The texture acutance is measured on the Dead Leaves chart in the Close-up Dead Leaves setup.

Noise

70

Google Pixel 7a

83

Xiaomi Mi 11 Ultra

Noise tests analyze various attributes of noise, such as intensity, chromaticity, grain, structure, and temporal behavior, on real-life video recordings as well as on videos of charts taken in the lab. Natural videos are visually evaluated, with particular attention paid to the noise on faces. Objective measurements are performed on the videos of charts recorded in various conditions from 1 to 1,000 lux. The chart used is the DXOMARK visual noise chart.

Spatial visual noise evolution with the illuminance level
This graph shows the evolution of spatial visual noise with the level of lux. Spatial visual noise is measured on the visual noise chart in the video noise setup. The DXOMARK visual noise measurement is derived from the ISO 15739 standard.
Temporal visual noise evolution with the illuminance level
This graph shows the evolution of temporal visual noise with the level of lux. Temporal visual noise is measured on the visual noise chart in the video noise setup.

Stabilization

82

Google Pixel 7a

Best

Stabilization evaluation tests the ability of the device to stabilize footage thanks to software or hardware technologies such as OIS, EIS, or any other means. The evaluation looks at overall residual motion on the face and the background, as well as smoothness and jello artifacts, during walking and panning use cases in various lighting conditions. The video below is an extract from one of the tested scenes.

Artifacts

88

Google Pixel 7a

92

Apple iPhone 12 mini

Artifacts are evaluated with MTF and ringing measurements on the SFR chart in the lab as well as frame-rate measurements using the LED Universal Timer. Natural videos are visually evaluated by paying particular attention to artifacts such as quantization, hue shift, and face-rendering artifacts among others. The more severe and the more frequent the artifact, the higher the point deduction from the score. The main artifacts and corresponding point loss are listed below

The post Google Pixel 7a Selfie test appeared first on DXOMARK.

Samsung Galaxy S23 Ultra Selfie test https://www.dxomark.com/samsung-galaxy-s23-ultra-selfie-test/ https://www.dxomark.com/samsung-galaxy-s23-ultra-selfie-test/#respond Mon, 15 May 2023 14:03:35 +0000 https://www.dxomark.com/?p=145661&preview=true&preview_id=145661 We put the Samsung Galaxy S23 Ultra through our rigorous DXOMARK Selfie test suite to measure its performance in photo and  video from an end-user perspective. This article breaks down how the device fared in a variety of tests and several common use cases and is intended to highlight the most important results of our [...]

The post Samsung Galaxy S23 Ultra Selfie test appeared first on DXOMARK.

We put the Samsung Galaxy S23 Ultra through our rigorous DXOMARK Selfie test suite to measure its performance in photo and  video from an end-user perspective. This article breaks down how the device fared in a variety of tests and several common use cases and is intended to highlight the most important results of our testing with an extract of the captured data.

Overview

Key front camera specifications:

  • 12MP sensor
  • f/1.9 aperture lens
  • Up to 4K / 60fps video
  • 3D depth sensing camera

Scoring

Sub-scores and attributes included in the calculations of the global score.

Samsung Galaxy S23 Ultra
Samsung Galaxy S23 Ultra (Snapdragon)
141
selfie
139
Photo
89

97

104

106

95

105

69

79

74

94

79

91

79

93

75

80

145
Video
81

87

83

90

86

92

76

97

74

83

89

92

74

82

Pros

  • Good exposure and rather wide dynamic range in photo and video
  • Accurate white balance and nice skin tones in bright light and indoors
  • Fast and accurate autofocus
  • Noise levels well under control in most conditions
  • Nice bokeh effect with accurate depth estimation
  • Effective video stabilization

Cons

  • Occasional slight underexposure in difficult low light conditions
  • Lack of detail in low light
  • Lack of contrast in photo mode
  • Ghosting, ringing and hue shift artifacts

The Samsung Galaxy S23 Ultra performed very well in the DXOMARK Selfie tests, coming close to the very best devices in the Ultra-Premium segment. Its front camera has also been slightly improved over the predecessor S22 Ultra, with differences most noticeable in terms of color in photo mode as well as noise reduction in both photo and video. That said, our testers found the level of captured detail to be generally lower than on the previous model.

When shooting still images, the camera delivered nice colors and skin tones in most conditions. Dynamic range was pretty wide, capturing good detail in both the highlight and shadow regions of the frame. The autofocus was accurate and quick to lock on, working reliably even in difficult conditions. While not as high as on some direct competitors, the level of captured detail was good, and image noise was kept well under control across all light conditions. On the downside, our testers observed underexposure in some scenes and images often lacked contrast. Unwanted artifacts such as ringing or hue shift near clipped areas could also be found in some sample shots.

Video quality was overall quite similar to the Galaxy S22 Ultra, with a wide dynamic range and generally good exposure, despite the occasional underexposed sequence. Video autofocus was quick and stable and the stabilization system did a good job at keeping camera shake to a minimum. Noise levels were noticeably lower than on the Galaxy S22 Ultra, but on the flip side, video clips showed lower levels of detail, especially when recording in low light.

Samsung Galaxy S23 Ultra Selfie Scores vs Ultra-Premium
This graph compares overall photo and video DXOMARK Selfie scores between tested devices and references. Average and maximum scores of the price segment are also indicated. Average and maximum scores for each price segment are computed based on the DXOMARK database of devices.

Test summary

About DXOMARK Selfie tests: For scoring and analysis, DXOMARK engineers capture and evaluate more than 1,500 test images both in controlled lab environments and in outdoor, indoor and low-light natural scenes, using the front camera’s default settings. The photo protocol is designed to take into account the user’s needs and is based on typical shooting scenarios, such as close-up and group selfies. The evaluation is performed by visually inspecting images against a reference of natural scenes, and by running objective measurements on images of charts captured in the lab under different lighting conditions from 1 to 1,000+ lux and color temperatures from 2,300K to 6,500K. For more information about the DXOMARK Selfie test protocol, click here. More details on how we score smartphone cameras are available here. The following section gathers key elements of DXOMARK’s exhaustive tests and analyses .Full performance evaluations are available upon request. Please contact us on how to receive a full report.

Photo

139

Samsung Galaxy S23 Ultra (Snapdragon)

149

Honor Magic6 Pro
Samsung Galaxy S23 Ultra Photo scores vs Ultra-Premium
The photo tests analyze image quality attributes such as exposure, color, texture, and noise in various light conditions. Range of focus and the presence of artifacts on all images captured in controlled lab conditions and in real-life images are also evaluated. All these attributes have a significant impact on the final quality of the images captured with the tested device and can help to understand the camera's main strengths and weaknesses.
Samsung Galaxy S23 Ultra – Good exposure, nice colors, good detail

Exposure

89

Samsung Galaxy S23 Ultra (Snapdragon)

97

Apple iPhone 15 Pro

Color

104

Samsung Galaxy S23 Ultra (Snapdragon)

106

Google Pixel 8 Pro

Exposure and color are the key attributes for technically good pictures. For exposure, the main attribute evaluated is the brightness of the face(s) in various use cases and light conditions. Other factors evaluated are the contrast and the dynamic range, eg. the ability to render visible details in both bright and dark areas of the image. Repeatability is also important because it demonstrates the camera's ability to provide the same rendering when shooting consecutive images in a row.
For color, the image quality attributes analyzed are skin-tone rendering, white balance, color shading, and repeatability.

Samsung Galaxy S23 Ultra – Neutral white balance, accurate color rendering
Apple iPhone 14 Pro – Warmer white balance, slightly reddish skin tones
Huawei Mate 50 Pro – Warmer white balance, slightly orange skin tones

Focus

95

Samsung Galaxy S23 Ultra (Snapdragon)

105

Honor Magic6 Pro

Autofocus tests evaluate the accuracy of the focus on the subject’s face, the repeatability of an accurate focus, and the depth of field. While a shallow depth of field can be pleasant for a single-subject selfie or close-up shot, it can be problematic in specific conditions such as group selfies; both situations are tested. Focus accuracy is also evaluated in all the real-life images taken, from 30cm to 150cm, and in low light to outdoor conditions.

Samsung Galaxy S23 Ultra - Depth of field
Samsung Galaxy S23 Ultra - Wide depth of field
Apple iPhone 14 Pro - Depth of field
Apple iPhone 14 Pro - Wide depth of field
Huawei Mate 50 Pro - Depth of field
Huawei Mate 50 Pro - Slightly limited depth of field

Texture

69

Samsung Galaxy S23 Ultra (Snapdragon)

79

Asus ZenFone 7 Pro

Texture tests analyze the level of details and the texture of subjects in the images taken in the lab as well as in real-life scenarios. For natural shots, particular attention is paid to the level of details in facial features, such as the eyes. Objective measurements are performed on chart images taken in various lighting conditions from 1 to 1000 lux and different kinds of dynamic range conditions. The charts used are the proprietary DXOMARK chart (DMC) and the Dead Leaves chart.

Texture acutance evolution with the illuminance level
This graph shows the evolution of texture acutance with the level of lux for two holding conditions. The texture acutance is measured on the Dead Leaves chart in the Close-up Dead Leaves setup.

Noise

74

Samsung Galaxy S23 Ultra (Snapdragon)

94

Huawei Mate 50 Pro

Noise tests analyze various attributes of noise such as intensity, chromaticity, grain, and structure on real-life images as well as images of charts taken in the lab. For natural images, particular attention is paid to the noise on faces, but also on dark areas and high dynamic range conditions. Objective measurements are performed on images of charts taken in various conditions from 1 to 1000 lux and different kinds of dynamic range conditions. The chart used is the DXOMARK Dead Leaves chart and the standardized measurement such as Visual Noise derived from ISO 15739.

Visual noise evolution with illuminance levels in handheld condition
This graph shows the evolution of visual noise metric with the level of lux in handheld condition. The visual noise metric is the mean of visual noise measurement on all patches of the Dead Leaves chart in the Close-up Dead Leaves setup. DXOMARK visual noise measurement is derived from ISO15739 standard.

Artifacts

79

Samsung Galaxy S23 Ultra (Snapdragon)

91

Apple iPhone 15 Pro

The artifacts evaluation looks at lens shading, chromatic aberrations, distortion measurement on the Dot chart and MTF, and ringing measurements on the SFR chart in the lab. Particular attention is paid to ghosting, quantization, halos, and hue shifts on the face among others. The more severe and the more frequent the artifact, the higher the point deduction on the score. The main artifacts observed and corresponding point loss are listed below.

Main photo artifacts penalties

Bokeh

75

Samsung Galaxy S23 Ultra (Snapdragon)

80

Honor Magic6 Pro

Bokeh is tested in one dedicated mode, usually portrait or aperture mode, and analyzed by visually inspecting all the images captured in the lab and in natural conditions. The goal is to reproduce portrait photography comparable to one taken with a DSLR and a wide aperture. The main image quality attributes paid attention to are depth estimation, artifacts, blur gradient, and the shape of the bokeh blur spotlights. Portrait image quality attributes (exposure, color, texture) are also taken into account.

Samsung Galaxy S23 Ultra – Accurate depth estimation

Video

145

Samsung Galaxy S23 Ultra (Snapdragon)

156

Apple iPhone 15 Pro
About DXOMARK Selfie Video tests

DXOMARK engineers capture and evaluate more than 2 hours of video in controlled lab environments and in natural low-light, indoor and outdoor scenes, using the front camera’s default settings. The evaluation consists of visually inspecting natural videos taken in various conditions and running objective measurements on videos of charts recorded in the lab under different conditions from 1 to 1000+ lux and color temperatures from 2,300K to 6,500K.

Samsung Galaxy S23 Ultra Video scores vs Ultra-Premium
Video tests analyze the same image quality attributes as for still images, such as exposure, color, texture, or noise, in addition to temporal aspects such as speed, smoothness, and stability of exposure, white balance, and autofocus transitions.

Exposure

81

Samsung Galaxy S23 Ultra (Snapdragon)

87

Apple iPhone 15 Pro

Color

83

Samsung Galaxy S23 Ultra (Snapdragon)

90

Apple iPhone 15 Pro

Exposure tests evaluate the brightness of the face and the dynamic range, eg. the ability to render visible details in both bright and dark areas of the image. Stability and temporal adaption of the exposure are also analyzed. Image-quality color analysis looks at skin-tone rendering, white balance, color shading, stability of the white balance and its adaption when light is changing.

Samsung Galaxy S23 Ultra – Balanced exposure, good dynamic range, slight lack of contrast, pleasant colors

Apple iPhone 14 Pro – Balanced exposure, slight highlight clipping, slight desaturation

Huawei Mate 50 Pro – Balanced exposure, good dynamic range, pleasant colors

Texture

76

Samsung Galaxy S23 Ultra (Snapdragon)

97

Asus ZenFone 6

Texture tests analyze the level of details and texture of the real-life videos as well as the videos of charts recorded in the lab. Natural video recordings are visually evaluated, with particular attention paid to the level of detail on the facial features. Objective measurements are performed of images of charts taken in various conditions from 1 to 1000 lux. The chart used is the Dead Leaves chart.

Texture acutance evolution with the illuminance level
This graph shows the evolution of texture acutance with the level of lux for two holding conditions. The texture acutance is measured on the Dead Leaves chart in the Close-up Dead Leaves setup.

Noise

74

Samsung Galaxy S23 Ultra (Snapdragon)

83

Xiaomi Mi 11 Ultra

Noise tests analyze various attributes of noise such as intensity, chromaticity, grain, structure, temporal aspects on real-life video recording as well as videos of charts taken in the lab. Natural videos are visually evaluated, with particular attention paid to the noise on faces. Objective measurements are performed on the videos of charts recorded in various conditions from 1 to 1000 lux. The chart used is the DXOMARK visual noise chart.

Spatial visual noise evolution with the illuminance level
This graph shows the evolution of spatial visual noise with the level of lux. Spatial visual noise is measured on the visual noise chart in the video noise setup. DXOMARK visual noise measurement is derived from ISO15739 standard.
Temporal visual noise evolution with the illuminance level
This graph shows the evolution of temporal visual noise with the level of lux. Temporal visual noise is measured on the visual noise chart in the video noise setup.

Stabilization

74

Samsung Galaxy S23 Ultra (Snapdragon)

82

Apple iPhone 15 Pro

Stabilization evaluation tests the ability of the device to stabilize footage thanks to software or hardware technologies such as OIS, EIS, or any others means. The evaluation looks at overall residual motion on the face and the background, smoothness and jellow artifacts, during walk and paning use cases in various lighting conditions. The video below is an extract from one of the tested scenes.

Samsung Galaxy S23 Ultra – Some camera shake

Apple iPhone 14 Pro – Some camera shake

Huawei Mate 50 Pro – Some camera shake

Artifacts

89

Samsung Galaxy S23 Ultra (Snapdragon)

92

Apple iPhone 12 mini

Artifacts are evaluated with MTF and ringing measurements on the SFR chart in the lab as well as frame-rate measurements using the LED Universal Timer. Natural videos are visually evaluated by paying particular attention to artifacts such as quantization, hue shift, and face-rendering artifacts among others. The more severe and the more frequent the artifact, the higher the point deduction from the score. The main artifacts and corresponding point loss are listed below

Main video artifacts penalties

The post Samsung Galaxy S23 Ultra Selfie test appeared first on DXOMARK.

]]>
https://www.dxomark.com/samsung-galaxy-s23-ultra-selfie-test/feed/ 0 Samsung Galaxy S23 Ultra SELFIE SELFIE IenaBridge_SamsungGalaxyS23Ultra_DxOMark_Selfie_05-00 BacklitColoredBuilding_SamsungGalaxyS23Ultra_DxOMark_Selfie_05-00 BacklitColoredBuilding_AppleiPhone14Pro_DxOMark_Selfie_05-00 BacklitColoredBuilding_HuaweiMate50Pro_DxOMark_Selfie_05-00 SamsungGalaxyS23Ultra_900
Vivo X90 Pro+ Selfie test https://www.dxomark.com/vivo-x90-pro-plus-selfie-test/ https://www.dxomark.com/vivo-x90-pro-plus-selfie-test/#respond Wed, 03 May 2023 15:40:29 +0000 https://www.dxomark.com/?p=146910&preview=true&preview_id=146910 We put the Vivo X90 Pro+  through our rigorous DXOMARK Selfie test suite to measure its performance in photo and  video from an end-user perspective. This article breaks down how the device fared in a variety of tests and several common use cases and is intended to highlight the most important results of our testing [...]

The post Vivo X90 Pro+ Selfie test appeared first on DXOMARK.

]]>
We put the Vivo X90 Pro+  through our rigorous DXOMARK Selfie test suite to measure its performance in photo and  video from an end-user perspective. This article breaks down how the device fared in a variety of tests and several common use cases and is intended to highlight the most important results of our testing with an extract of the captured data.

Overview

Key front camera specifications:

  • 32MP 1/2.8″ sensor, 0.8µm pixels
  • f/2.5 aperture lens
  • 1080p video at 30fps/60fps

Scoring

Sub-scores and attributes included in the calculations of the global score.

Vivo X90 Pro+
Vivo X90 Pro+
125
selfie
121
Photo
74

97

89

106

86

105

65

79

76

94

76

91

74

93

55

80

131
Video
77

87

78

90

84

92

76

97

59

83

79

92

73

82

Pros

  • Good white balance in bright light and indoors in photo and video
  • Nice skin-tone rendering in photo and video
  • Wide dynamic range in photo
  • Good target exposure in photo and video
  • High levels of detail in photo and video

Cons

  • Lack of contrast on faces in high-contrast scenes
  • Flare, color fringing and hue shift artifacts
  • Noise, especially on backgrounds
  • White balance casts and color rendering issues in low light video
  • Focus stabilization is challenged in low-light conditions.
  • Dynamic range in video mode is not so extended for its price segment.

With a DXOMARK Selfie score of 125, the Vivo X90 Pro+ delivered a decent performance in our front camera tests but could not keep up with the very best devices in the Ultra-Premium segment. Overall, the camera provided pleasant image and video rendering, with good exposure, pleasant colors and nice skin tones, across a large array of shooting conditions.  However, the top Selfie devices in its class are capable of producing better contrast in still images and a wider dynamic range in video mode.

Vivo X90 Pro Plus Selfie Scores vs Ultra-Premium
This graph compares overall photo and video DXOMARK Selfie scores between tested devices and references. Average and maximum scores of the price segment are also indicated. Average and maximum scores for each price segment are computed based on the DXOMARK database of devices.

Test summary

About DXOMARK Selfie tests: For scoring and analysis, DXOMARK engineers capture and evaluate more than 1,500 test images both in controlled lab environments and in outdoor, indoor and low-light natural scenes, using the front camera’s default settings. The photo protocol is designed to take into account the user’s needs and is based on typical shooting scenarios, such as close-up and group selfies. The evaluation is performed by visually inspecting images against a reference of natural scenes, and by running objective measurements on images of charts captured in the lab under different lighting conditions from 1 to 1,000+ lux and color temperatures from 2,300K to 6,500K. For more information about the DXOMARK Selfie test protocol, click here. More details on how we score smartphone cameras are available here. The following section gathers key elements of DXOMARK’s exhaustive tests and analyses .Full performance evaluations are available upon request. Please contact us on how to receive a full report.

Photo

121

Vivo X90 Pro+

149

Honor Magic6 Pro
Vivo X90 Pro Plus Photo scores vs Ultra-Premium
The photo tests analyze image quality attributes such as exposure, color, texture, and noise in various light conditions. Range of focus and the presence of artifacts on all images captured in controlled lab conditions and in real-life images are also evaluated. All these attributes have a significant impact on the final quality of the images captured with the tested device and can help to understand the camera's main strengths and weaknesses.

Exposure

74

Vivo X90 Pro+

97

Apple iPhone 15 Pro

Color

89

Vivo X90 Pro+

106

Google Pixel 8 Pro

Exposure and color are the key attributes for technically good pictures. For exposure, the main attribute evaluated is the brightness of the face(s) in various use cases and light conditions. Other factors evaluated are the contrast and the dynamic range, eg. the ability to render visible details in both bright and dark areas of the image. Repeatability is also important because it demonstrates the camera's ability to provide the same rendering when shooting consecutive images in a row.
For color, the image quality attributes analyzed are skin-tone rendering, white balance, color shading, and repeatability.

Vivo X90 Pro+ – Good exposure, nice color
Apple iPhone 14 Pro – Good exposure, nice color
Huawei Mate 50 Pro – Good exposure, nice color

Focus

86

Vivo X90 Pro+

105

Honor Magic6 Pro

Autofocus tests evaluate the accuracy of the focus on the subject’s face, the repeatability of an accurate focus, and the depth of field. While a shallow depth of field can be pleasant for a single-subject selfie or close-up shot, it can be problematic in specific conditions such as group selfies; both situations are tested. Focus accuracy is also evaluated in all the real-life images taken, from 30cm to 150cm, and in low light to outdoor conditions.

Vivo X90 Pro+ - Depth of field
Vivo X90 Pro+ - Front subject slightly blurred
Apple iPhone 14 Pro - Depth of field
Apple iPhone 14 Pro - Good sharpness on front subject
Huawei Mate 50 Pro - Depth of field
Huawei Mate 50 Pro - Good sharpness on front subject

Texture

65

Vivo X90 Pro+

79

Asus ZenFone 7 Pro

Texture tests analyze the level of details and the texture of subjects in the images taken in the lab as well as in real-life scenarios. For natural shots, particular attention is paid to the level of details in facial features, such as the eyes. Objective measurements are performed on chart images taken in various lighting conditions from 1 to 1000 lux and different kinds of dynamic range conditions. The charts used are the proprietary DXOMARK chart (DMC) and the Dead Leaves chart.

Texture acutance evolution with the illuminance level
This graph shows the evolution of texture acutance with the level of lux for two holding conditions. The texture acutance is measured on the Dead Leaves chart in the Close-up Dead Leaves setup.

Noise

76

Vivo X90 Pro+

94

Huawei Mate 50 Pro

Noise tests analyze various attributes of noise such as intensity, chromaticity, grain, and structure on real-life images as well as images of charts taken in the lab. For natural images, particular attention is paid to the noise on faces, but also on dark areas and high dynamic range conditions. Objective measurements are performed on images of charts taken in various conditions from 1 to 1000 lux and different kinds of dynamic range conditions. The chart used is the DXOMARK Dead Leaves chart and the standardized measurement such as Visual Noise derived from ISO 15739.

Visual noise evolution with illuminance levels in handheld condition
This graph shows the evolution of visual noise metric with the level of lux in handheld condition. The visual noise metric is the mean of visual noise measurement on all patches of the Dead Leaves chart in the Close-up Dead Leaves setup. DXOMARK visual noise measurement is derived from ISO15739 standard.

Artifacts

76

Vivo X90 Pro+

91

Apple iPhone 15 Pro

The artifacts evaluation looks at lens shading, chromatic aberrations, distortion measurement on the Dot chart and MTF, and ringing measurements on the SFR chart in the lab. Particular attention is paid to ghosting, quantization, halos, and hue shifts on the face among others. The more severe and the more frequent the artifact, the higher the point deduction on the score. The main artifacts observed and corresponding point loss are listed below.

Main photo artifacts penalties

Video

131

Vivo X90 Pro+

156

Apple iPhone 15 Pro
About DXOMARK Selfie Video tests

DXOMARK engineers capture and evaluate more than 2 hours of video in controlled lab environments and in natural low-light, indoor and outdoor scenes, using the front camera’s default settings. The evaluation consists of visually inspecting natural videos taken in various conditions and running objective measurements on videos of charts recorded in the lab under different conditions from 1 to 1000+ lux and color temperatures from 2,300K to 6,500K.

Vivo X90 Pro Plus Video scores vs Ultra-Premium
Video tests analyze the same image quality attributes as for still images, such as exposure, color, texture, or noise, in addition to temporal aspects such as speed, smoothness, and stability of exposure, white balance, and autofocus transitions.

Exposure

77

Vivo X90 Pro+

87

Apple iPhone 15 Pro

Color

78

Vivo X90 Pro+

90

Apple iPhone 15 Pro

Exposure tests evaluate the brightness of the face and the dynamic range, eg. the ability to render visible details in both bright and dark areas of the image. Stability and temporal adaption of the exposure are also analyzed. Image-quality color analysis looks at skin-tone rendering, white balance, color shading, stability of the white balance and its adaption when light is changing.

Vivo X90 Pro+ – Limited dynamic range, good exposure, nice color

Apple iPhone 14 Pro – Fairly wide dynamic range, good exposure, nice color

Huawei Mate 50 Pro – Fairly wide dynamic range, good exposure, nice color

Texture

76

Vivo X90 Pro+

97

Asus ZenFone 6

Texture tests analyze the level of details and texture of the real-life videos as well as the videos of charts recorded in the lab. Natural video recordings are visually evaluated, with particular attention paid to the level of detail on the facial features. Objective measurements are performed of images of charts taken in various conditions from 1 to 1000 lux. The chart used is the Dead Leaves chart.

Texture acutance evolution with the illuminance level
This graph shows the evolution of texture acutance with the level of lux for two holding conditions. The texture acutance is measured on the Dead Leaves chart in the Close-up Dead Leaves setup.

Noise

59

Vivo X90 Pro+

83

Xiaomi Mi 11 Ultra

Noise tests analyze various attributes of noise such as intensity, chromaticity, grain, structure, temporal aspects on real-life video recording as well as videos of charts taken in the lab. Natural videos are visually evaluated, with particular attention paid to the noise on faces. Objective measurements are performed on the videos of charts recorded in various conditions from 1 to 1000 lux. The chart used is the DXOMARK visual noise chart.

Spatial visual noise evolution with the illuminance level
This graph shows the evolution of spatial visual noise with the level of lux. Spatial visual noise is measured on the visual noise chart in the video noise setup. DXOMARK visual noise measurement is derived from ISO15739 standard.
Temporal visual noise evolution with the illuminance level
This graph shows the evolution of temporal visual noise with the level of lux. Temporal visual noise is measured on the visual noise chart in the video noise setup.

Stabilization

73

Vivo X90 Pro+

82

Apple iPhone 15 Pro

Stabilization evaluation tests the ability of the device to stabilize footage thanks to software or hardware technologies such as OIS, EIS, or any others means. The evaluation looks at overall residual motion on the face and the background, smoothness and jellow artifacts, during walk and paning use cases in various lighting conditions. The video below is an extract from one of the tested scenes.

Vivo X90 Pro+ – Some camera shake

Apple iPhone 14 Pro – Effective video stabilization

Huawei Mate 50 Pro – Effective video stabilization

Artifacts

79

Vivo X90 Pro+

92

Apple iPhone 12 mini

Artifacts are evaluated with MTF and ringing measurements on the SFR chart in the lab as well as frame-rate measurements using the LED Universal Timer. Natural videos are visually evaluated by paying particular attention to artifacts such as quantization, hue shift, and face-rendering artifacts among others. The more severe and the more frequent the artifact, the higher the point deduction from the score. The main artifacts and corresponding point loss are listed below

Main video artifacts penalties

The post Vivo X90 Pro+ Selfie test appeared first on DXOMARK.

]]>
https://www.dxomark.com/vivo-x90-pro-plus-selfie-test/feed/ 0 Vivo X90 Pro+ SELFIE SELFIE Tree_VivoX90ProPlus_DxOMark_Selfie_05-00 Tree_AppleiPhone14Pro_DXOMARK_Selfie_05-00 Tree_HuaweiMate50Pro_DxOMark_Selfie_05-00