There's nothing worse than posing for a picture and then seeing something that is less than flattering.
For many smartphone users with darker skin tones, this often means barely showing up in pictures or having to heavily edit photos before sharing them with friends or posting to social media.
But even when users with more melanated skin edit pictures to increase highlights, boost saturation, or limit shadows, the result is often a gray, almost "ashy" look: a less-than-accurate depiction of the person in front of the camera.
"There's bias built in whether it's film photography, definitely digital photography. And that bias is favoring people with lighter color skin and isn't favoring people with darker complexions," said Patrick Holland, senior editor at technology and consumer electronics reviewer CNET.
Holland said all smartphones have gotten better at taking pictures under most light settings in recent years, but believes the Pixel 6 is a major improvement.
The Pixel 6, developed by Google, is now being billed as having the most inclusive smartphone camera. But is it?
"In short, yes," said Holland.
The new line of Pixel 6 and Pixel 6 Pro phones released late last month debuts Google's new "Real Tone" technology, developed to eliminate long-standing biases towards lighter skin in photography.
"I think a lot of that shows up in technology, when you sit in a room of very creative, very smart people. And you see a lot of people who look [like] my skin color, you have to wonder how much of that is affecting the things they program," said Holland.
Florian Koenigsberger leads Google's Image Equity initiative and says the tech giant spent four years developing these new tools.
The company partnered with 18 cinematographers and colorists known for accurately capturing the skin tones of people of color.
"The mission of our image equity work is ensuring that we deliver best in class camera and image experiences for people of color, especially with darker skin tones," said Koenigsberger.
But what does that mean for you when it's time to snap a selfie?
With Holland playing photographer and ABC7 News executive producer Mariel and videographer JC joining Race and Culture Reporter Julian Glover as models, we put the latest smartphones to the test under a bunch of tough lighting conditions.
We took a series of photos using the Pixel 6, Samsung Galaxy S21 Ultra, and Apple's latest offering, the iPhone 13 Pro.
First up, how well does the camera juggle dim lighting and balancing two different complexions?
"It's finding Julian's face instantly in this case -- it's pretty impressive," said Holland.
The Pixel 6 took a good-looking photograph featuring even skin tones and few shadows on Mariel and Julian's faces.
The S21's pictures were slightly softer and bluer, while the iPhone 13 Pro's photo showed deeper shadows and a slightly darker overall look.
Julian and Mariel preferred the Pixel 6 here.
"It's the sum total of improvements to our auto exposure and auto white balance models that determine the brightness and color in a portrait as well as changes to models that run on some of the hardware," said Koenigsberger, explaining the tech. "For example, our face detection models that see a face in a picture are now able to see a greater diversity of faces."
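Google has not published Real Tone's internals, but the auto white balance models Koenigsberger mentions build on well-known ideas. A minimal sketch of one classic heuristic, "gray world" white balance, shows the basic mechanism: scale each color channel so its average matches the overall average, removing a color cast. (The function name and the toy image below are illustrative assumptions, not Google's code.)

```python
def gray_world_awb(pixels):
    """Gray-world auto white balance.

    pixels: list of (r, g, b) floats in [0, 1].
    Returns a balanced copy where each channel has the same mean,
    which cancels a uniform color cast.
    """
    n = len(pixels)
    # Per-channel average brightness across the whole image.
    means = [sum(p[c] for p in pixels) / n for c in range(3)]
    gray = sum(means) / 3
    # Gain per channel: boost dim channels, attenuate strong ones.
    gains = [gray / m for m in means]
    return [tuple(min(1.0, max(0.0, p[c] * gains[c])) for c in range(3))
            for p in pixels]

# A uniform image with a strong blue cast (blue channel dominates).
cast = [(0.2, 0.3, 0.6)] * 16
balanced = gray_world_awb(cast)
```

Real camera pipelines go far beyond this: as the quote notes, they use learned models, and the bias issue arises precisely because such models were historically trained and tuned on datasets skewed toward lighter skin.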
Next, we wanted to see how the phones handled a variety of skin tones in one picture while we stood in front of a window for a challenging-to-photograph "backlit" scenario, in which there is more light behind the subjects than in front of them.
(Photo gallery: photos comparing camera quality between Apple, Samsung and Google smartphones)
JC and Mariel showed up fine in the Galaxy S21 picture, but Julian's face was covered in shadow.
The iPhone picture cast a blue hue on all three "models" and washed out Mariel's face in an attempt to properly expose Julian's skin tone.
The Pixel 6 lit all three evenly, though some stray light washed a bit of the saturation out of the photo.
This time, all three preferred different pictures.
"This one looks true, but this one just looks prettier," said Mariel.
None of the smartphones delivered a "perfect" picture under such challenging lighting conditions, but all three are clearly a major improvement over the images phone cameras could capture just five years ago.
"This work is never done. This is a first expression of this mission, but our teams are already back working on the next phone," said Koenigsberger.
He added that Google's Image Equity Initiative isn't just about making sure people accurately show up in photos. The team is working on expanding this tech to an array of Google image products to improve the way users look in video conferencing tools or even edit images in Google Photos.
Koenigsberger describes it as a process centered around improvements in machine learning, and Holland said it will continue to improve in all smartphones over time.
"No matter what phone you have in your pocket, it's taking multiple images and combining different parts," said Holland. "As phone cameras get better, I think the distance between something like a professional camera, and something like a phone camera is going to shrink."
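Holland's point about "taking multiple images and combining different parts" refers to burst, or multi-frame, photography. Real pipelines align and merge frames with far more sophistication, but a toy exposure-fusion sketch conveys the idea: weight each frame's pixel by how well-exposed it is, so dark frames contribute highlight detail and bright frames contribute shadow detail. (The weighting scheme and sample frames below are simplified assumptions for illustration.)

```python
def fuse(frames):
    """Merge grayscale frames (lists of floats in [0, 1]) pixel by pixel.

    Each frame's pixel is weighted by its closeness to mid-gray (0.5),
    so well-exposed pixels dominate the blended result.
    """
    fused = []
    for pixels in zip(*frames):
        # Weight is 1.0 at mid-gray, falling toward 0 at pure black/white.
        weights = [max(1e-6, 1.0 - abs(v - 0.5) * 2.0) for v in pixels]
        total = sum(weights)
        fused.append(sum(v * w for v, w in zip(pixels, weights)) / total)
    return fused

under = [0.05, 0.10, 0.45]  # underexposed frame: preserves highlights
over = [0.40, 0.60, 0.95]   # overexposed frame: preserves shadow detail
result = fuse([under, over])
```

Because every fused pixel is a weighted average of the input frames, the result stays within the range of the originals while favoring whichever frame exposed that region best.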
So which camera is best? The answer to that question lies mostly in personal preference.
As most photographers will admit, the best camera is the one you have with you to capture the moment.
ABC7 News reached out to Apple for comment. The company said it is also using machine learning in the iPhone and working with photographers to capture more life-like pictures.
Samsung told ABC7 News it, too, is employing artificial intelligence to improve the camera in its Galaxy phones, and takes everything from skin tone to hair color into account in making sure pictures look accurate and natural.