Hewlett Packard (HP) recently released face tracking software that allows a webcam to chase you around as you talk to it. The face recognition software appears to recognize lighter-skinned faces with no problem, but has trouble with darker-skinned faces. This probably doesn't mean that HP is anti-black people, but it does suggest that HP didn't sufficiently test its product on all kinds of faces, which means that it didn't value black customers very much during research and development.
Both Kate W. and Lucy P. sent in this YouTube clip, in which Wanda and Desi humorously demonstrate the problem:
We’ve seen this kind of thing before with a Nikon camera that seemed to think that Asian people were always blinking (though there was some confusion as to whether it did that to everyone) and another version of face recognition software.
Nikon responded to complaints about the program by saying:
We are working with our partners to learn more. The technology we use is built on standard algorithms that measure the difference in intensity of contrast between the eyes and the upper cheek and nose. We believe that the camera might have difficulty “seeing” contrast in conditions where there is insufficient foreground lighting.
Which, Kate said, is a nice way of saying: it's not HP's fault that "your face doesn't have enough contrast, and why don't you turn on a light while you're at it."
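For readers curious what the quoted "difference in intensity of contrast between the eyes and the upper cheek and nose" might look like as code, here is a minimal illustrative sketch. The region boxes and the threshold are invented for the example; this is not the vendor's actual algorithm.

```python
import numpy as np

def contrast_face_score(gray, eye_box, cheek_box, threshold=25):
    """Toy version of a contrast-based face check (illustrative only).

    gray      -- 2D numpy array of 8-bit grayscale pixel values
    eye_box   -- (row_start, row_end, col_start, col_end) around the eyes
    cheek_box -- same format, around the upper cheek / nose
    threshold -- minimum brightness gap treated as "a face is here" (invented value)
    """
    eyes = gray[eye_box[0]:eye_box[1], eye_box[2]:eye_box[3]]
    cheek = gray[cheek_box[0]:cheek_box[1], cheek_box[2]:cheek_box[3]]
    # The whole heuristic rests on this gap being large; on a darker face in
    # dim light, both regions compress toward low values and the gap shrinks.
    gap = float(cheek.mean()) - float(eyes.mean())
    return gap > threshold
```

The fragility the video demonstrates falls out directly: the dimmer the scene and the darker the skin, the smaller that brightness gap becomes, until it slips under whatever threshold the feature was tuned against.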
(By the way, the fact that this video has, as of right now, almost 2 million views is a beautiful example of the democratizing power of the youtubes!!!)
Lisa Wade, PhD is an Associate Professor at Tulane University. She is the author of American Hookup, a book about college sexual culture; a textbook about gender; and a forthcoming introductory text: Terrible Magnificent Sociology. You can follow her on Twitter and Instagram.
Comments 125
ibis lynn — January 5, 2010
Not only do their computers suck, but they are racist, too? Good job, HP!
cb — January 5, 2010
It just seems SO BASIC to test technology with different variables, including different people. Don't all decent-sized companies do demographics research? Why wouldn't technology companies? So confusing.
Blue Wizard — January 5, 2010
I think when you write, "Nikon responded to complaints about the system..." you mean HP, not Nikon.
Duran2 — January 5, 2010
BTW, I'm waiting for the outrage here when people realize that most commercial voice recognition systems do really poorly with an "urban" accent...
Sara — January 5, 2010
Anyone see the episode of Better Off Ted where the company makes everything (elevators, lights, drinking fountain) operate hands-free, except to do so the devices rely on "seeing" people to turn things on/off? But, the technology doesn't see black people, so the company hires white people to follow the black people around to turn everything on because that's cheaper than fixing the system (which is racial discrimination, so they'd have to hire more black people to follow the white people, and so on and so forth, until they eventually decide it's more efficient to just replace the system). A clever commentary on how companies often don't even think about trying out new technology with people who fall outside the "norms," whether it's race or sex or height or any other situation, and how the problem only really gets addressed when it's in the company's best interest.
maus — January 5, 2010
"Which Kate said is a nice way of saying: it’s not HPs fault that “your face doesn’t have enough contrast and why don’t you turn on a light while you’re at it.”"
It is true, people aren't generally buying these webcams because they have expensive sensors for use in low light.
You can't use software to put back detail that doesn't exist. You can prevent the detail from being lost with hardware, but that costs $$. Really though, the main "solution" here if people complain enough is to drop the feature entirely for all customers and only leave it on more expensive webcams (which few people outside of business teleconferencing needs are going to purchase).
I can sympathize with the frustration, but from an engineering standpoint, this is about more than being lazy and "not testing enough".
thewhatifgirl — January 5, 2010
For once, I don't think this has anything to do with racism. You can't blame the difficulties of focusing an artificial eye like a camera on racism - blame it on the human eye.
The easiest way to focus a camera manually is to look at something that you know with your own eye is a sharp line and then adjust to make that line sharp in the camera lens as well. Auto-focus simply does the same thing for you. But anyone who has manually focused an SLR camera can tell you that the less contrast you have, the harder that is to do.
Operation White Shadow « Memoirs of a SLACer — January 5, 2010
[...] Sometimes, life imitates [...]
Engineer — January 5, 2010
While it's a serious oversight that this algorithm doesn't work on non-Caucasian faces, as someone who does this kind of work for a living, let me point out that it's much more likely that this is what happened:
1. Marketing says "Face detection should be in the product"
2. Engineers implement a standard face-detection algorithm that requires training on thousands of hand-cropped face photos. They get these photos from
a. Photos of other people in their department (e.g., engineers)
b. Standard collections of such photos collected at universities... who basically took photos of their grad students.
3. The algorithm worked on 95% of the photos in the database and it was called a success. Never mind that, say, 80% of the failures were on the few dark-skinned faces. I doubt anyone tested the success rate by skin color group, just on faces in general.
African Americans are underrepresented among engineers - so the only way they'd have a good sample is if someone went out and intentionally collected thousands of photos of dark skinned people. For the same reason, I'd suspect that women are also underrepresented in the collections. The one I worked with was universally Caucasian men.
Should we generate a new, race-balanced face database? Absolutely. But I doubt HP wanted to pay for it, and it's unclear to me that engineering would have been able to communicate the need for it to anyone with the authority to commission one.
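Engineer's step 3 is worth making concrete: an aggregate success rate can be announced as a win while nearly all of the failures land on one group. A minimal sketch with made-up numbers chosen to match the 95%/80% figures above:

```python
from collections import defaultdict

def accuracy_by_group(results):
    """results: list of (group_label, detected_ok) pairs from a test run."""
    totals, hits = defaultdict(int), defaultdict(int)
    for group, ok in results:
        totals[group] += 1
        hits[group] += ok
    return {g: hits[g] / totals[g] for g in totals}

# Hypothetical test set: 950 light-skinned faces, 50 dark-skinned faces,
# 50 total failures, of which 40 land on the dark-skinned faces.
results = ([("light", True)] * 940 + [("light", False)] * 10 +
           [("dark", True)] * 10 + [("dark", False)] * 40)

overall = sum(ok for _, ok in results) / len(results)
print(f"overall: {overall:.0%}")   # 95%, looks like a success
print(accuracy_by_group(results))  # light: ~99%, dark: 20%
```

Splitting the metric by group is the cheap part; the hard part, as the comment notes, is having a test collection in which the groups are represented at all.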
splack — January 5, 2010
I sometimes think we need new words for unintentional isms. People seem to argue that something isn't racist or sexist if it's unintentional, if the person doesn't mean it that way, etc. regardless of the actual effect in reality.
I was listening to some political radio show months ago, and the host guy was saying that we're all racist in this country and we have to deal with it. A black man called in to say that he'd grown up in a real racist town, and yeah, sometimes white people don't "get it", but there's a huge difference between hate and prejudice and just being oblivious. He said it was important to his safety to be able to gauge hate vs. ignorance.
It happens a lot on this site and others where if someone describes something as racist or sexist, there's a defensive reaction in the comments because it feels like an accusation or there's an assumption about intent. You can do or say a racist or a sexist or a classist thing without it being what you perceive to be your identity, no?
There's institutional classism/racism/sexism/ableism/ageism that just goes on without real active participation by people (it's just the past that built our current reality), but the effects on those targeted or left out are real.
This guy really bought this computer and it really doesn't work because he's black. What do we call that if not racism? Sincere question.
phio gistic — January 5, 2010
I had similar problems several years ago with Microsoft's voice recognition software. You were supposed to read canned paragraphs (about how great Microsoft is!) so it could "learn" your voice. It wasn't very good, and it seemed much worse with female voices than male voices. I don't have any solid proof, but I suspect that most of the testing was done with males.
Victoria — January 5, 2010
What makes this about race is that if the product had NOT worked on white faces, it wouldn't have made it to the market until it did. White being the standard is what makes it racist.
Sue — January 5, 2010
I'm reminded of old arguments by Hollywood that blacks were hard to light.
Tom Allen — January 5, 2010
but it does suggest that HP didn’t sufficiently test its product on all kinds of faces, which means that it didn’t value black customers very much while doing research and development.
I'm not in the software biz, but I do have a company that works with engineers and manufacturers, and I often see mistakes that *appear* to be small, but end up affecting entire processes in unexpected ways.
When I heard about this a few weeks ago, my first guess was that it wasn't tested with black people simply because nobody suspected that it would make any difference. Even if you argue that it's poor testing or oblivious sampling, the point is that test experiments are planned based on certain expectations. I can't tell you how often an engineer calls me *after* a product has been made because something that seemed okay during tests did not work out in the field.
This isn't racism; it's simply humans working on a tight budget.
E — January 5, 2010
What Splack said.
Re-read it, or read it. And think about it instead of just thinking about how to defend HP, engineers, or your argument.
E — January 5, 2010
I am absolutely positive there was no malicious intent when Band-Aid made the flesh-colored Band-Aids for themselves, the people who were making them -- or the people who made the flesh-colored crayon.
Or the algorithm for the stupid web-cam.
There does not need to be racist intent by individual engineers to unintentionally disregard an entire group of people. This is exactly how it happens. This is the definition of perpetuating institutional racism.
Steph — January 5, 2010
Did anyone see the Better Off Ted episode in the first season that made fun of this? I thought they made it up- I can't believe it is a real thing! That is sad.
Anonymous — January 5, 2010
This is unfortunate, but I do appreciate the humor with which it was presented. It's a good video, and an even better observation about privilege and its blindness (intentional or not).
abbie — January 5, 2010
Did anyone notice that all the text is backwards? His name tag, the sign in the background... it's all backwards. Is this just a goof in the video, or did they deliberately fake out the software using a mirror?
Jamie — January 6, 2010
Here's a video I found explaining a remedy to the bug.
http://www.youtube.com/watch?v=Q9HdBA9JvQU
HP also acknowledges the problem and said (according to an article I read) that they're working on fixing it.
Mike — January 6, 2010
...that's because it's *hard*. Math doesn't care about race, color, or your political party. Low-contrast skin tones are hard to find symmetry in. YUV camera data doesn't have a lot of resolution to work with; if your face is really dark, there's not a lot of data there.
It's fixable, but it will take more advanced algorithms and better cameras.
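Mike's point about there not being much data to work with can be put in rough numbers. A tiny sketch with invented brightness values and an assumed cheek-to-eye-socket ratio; only the general shape of the arithmetic matters:

```python
# Illustrative only: same cheek/eye-socket brightness *ratio* on both faces,
# same dim lighting; only the overall skin brightness differs.
def luma_gap(cheek_luma, eye_ratio=0.5):
    eye_socket = cheek_luma * eye_ratio  # assumed: sockets about half as bright as cheek
    return cheek_luma - eye_socket       # the absolute 8-bit gap a contrast check sees

for label, cheek in [("lighter face, dim light", 120),
                     ("darker face, dim light", 40)]:
    print(f"{label}: gap = {luma_gap(cheek):.0f} out of 255")

# lighter face, dim light: gap = 60 out of 255
# darker face, dim light:  gap = 20 out of 255 (easily swamped by noise on a cheap sensor)
```

The relative contrast is identical in both cases; what shrinks is the absolute difference the detector has left to work with after sensor noise and compression.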
Alyssa — January 6, 2010
In response to everyone defending HP:
1: Yes we realize the software itself isn't racist. We also realize that there probably isn't anyone at HP laughing maniacally because they pulled off their plan to make a camera that ignores black people.
2: Yes we realize this software does work for dark skinned people in the right situations. But when one person has to set up lighting to be just right (forward light and enough light) while another can just set up the camera without thinking about lighting and have fairly good results, that isn't exactly equal is it?
3: Yes we realize these are cheap cameras. But if a camera is too cheap to make a feature work properly, that feature should be dropped from the camera and saved for more expensive cameras that will have enough of a profit to make the feature work correctly. This feature ISN'T autofocus. The camera would sell perfectly fine if this feature were dropped.
4: Because it is a cheap camera, we can assume that the consumers of this camera aren't professionals and won't have proper lighting. To put it back on the consumer by saying it's bad lighting is to create an expectation of the consumer that wasn't there when the camera was created.
5: Anyone who works in a for-profit company understands that marketing wants everything, management wants it done for nothing, and the people developing it have to figure out how to please both parties. But if quality control noticed the problem, and it couldn't be fixed on their budget, then engineering and QC need to mention that to management and marketing. Then management and marketing can decide whether to expand the budget to fix the problem or drop the feature. This problem could have been anyone's fault. It doesn't really matter whose fault it is (unless you personally worked on this project). But the fact that it got through says there is a problem somewhere in HP. No one has put the blame on the engineers, so relax and please stop explaining how for-profit companies work.
6: Once the problem became known to HP (if they didn't know about it before), they really didn't do anything about it. They could have recalled the product and either dropped or fixed the feature. But as we all know, this would hurt their profits. Ultimately they decided they would rather keep the offense than spend the money to fix the problem. That is their decision to make. You can argue that money outweighs the offense here, but it makes no sense to argue that there is no offense.
Jenn — January 6, 2010
Whoever decided to base the algorithm on contrast in the first place was making a racist decision. Sure, it works great for light faces with a lot of contrast. But they never stopped to think about people who don't have the contrast they do in their face. There's no way that the algorithm would have been based on contrast if the development team and testing subjects contained a good number of people with darker skin.
It's kind of like the camera that thinks that all people with narrow eyes are blinking: if the programmers had stopped to think for a minute that some people have narrow eyes and some people have much different contrast on their faces, then they'd never have based the software on those values. But instead they chose to because of their lack of foresight and their assumption that the only consumers of their product looked like them.
That ignorance, in and of itself, is racist. Plus, it ultimately makes the software inaccessible to demographics of people accustomed to being excluded by others' ignorance, which probably doesn't feel all that nice for the excluded. Hey, but they're the minority, who cares, right?
Imagine that someone designed a car that doesn't work, or works poorly, for 10% or more of its customers. 1 in 10 cars are total lemons. Would anyone buy that make of car, knowing that the company doesn't care about quality? That's basically what HP and Nikon made: a lemon of a camera, except worse. The car manufacturer is simply lazy. Maybe their production line is poorly run. It's sheer negligence. But because the chance of getting a lemon of a car is random, everyone would avoid the car company.
The camera manufacturer, on the other hand, is not only lazy but ignorant. They cut corners at every step of development so that their product is effectively useless and broken for a significant portion of the demographic. More nefarious than the car producer: the chance of their cameras not working is not random. No, it goes neatly along class lines. But since it replicates the typical exclusion that those people face on a daily (completely unjust) basis, nobody caught the mistake until it was too late. And once it was noticed, people rushed to defend it!
Can anyone really imagine a world in which people are okay with a company making an expensive product that is a total lemon 1 out of 10 times? If it's randomly badly made, everyone would throw a fit and the company would go under or the product would be discontinued. However, if it just excludes the same people that everyone else excludes, it's business as usual.
Well, business as usual is pretty damn racist. I can't rationally grasp how it isn't.
HzlStone — January 6, 2010
Oh this is *so* amusing. Because if they had just focussed on the *colour* of the face, rather than the contrast, they would have got it right.
How so? Well, 10 years ago I was working as a research engineer in a company that makes equipment for the broadcast industry (stuff for TV studios and so on). A colleague of mine was working on skin detection. Being a smart chap (PhD in quantum mechanics), he did a lot of research and experimentation with images from all over the world and found that nearly all (I'll explain that qualifier in a minute) skin tones fall within a tightly defined set of colours - hue and saturation - that tend not to occur elsewhere. We are all pretty much the same hue; it's mostly just the saturation (darkness and intensity of the colour) that's different. At least to a video camera. All except for one tribe on the African continent (long time ago, sorry, can't remember the country or the tribe), who oddly have some blue tones in their skin. You can't see the blue with the naked eye; it's only apparent when you process the image to extract the spectrum. So my colleague added in a bit of code to also look for that particular shade.
Add the skin-colour recognition to feature extraction that recognises the eyes (much easier to do than the nose as the whites give contrast) and bingo, facial recognition that uses colour to ensure it works on skin of every colour.
Of course our equipment was a couple of orders of magnitude more expensive than a home video camera but Moore's Law implies the technology we then used should be cheap enough by now to implement in consumer products.
Oh and every engineer in that research group was Anglo-Saxon pinky-white. But we knew our job.
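For anyone who wants to see the shape of HzlStone's hue/saturation approach, here is a rough sketch. The numeric bounds are ballpark values assumed for illustration, not the ones from that product; colorsys is Python's standard-library HSV conversion.

```python
import colorsys

def looks_like_skin(r, g, b,
                    hue_range=(0.0, 0.14),    # roughly 0-50 degrees; assumed bounds
                    sat_range=(0.15, 0.75)):  # assumed bounds
    """Rough hue/saturation gate for skin pixels (illustrative thresholds).

    r, g, b are floats in [0, 1]. The idea from the comment above: most skin
    tones share a narrow hue band and differ mainly in saturation and
    brightness, so gating on hue and saturation is far less sensitive to how
    dark the skin is than a raw brightness-contrast measure.
    """
    h, s, v = colorsys.rgb_to_hsv(r, g, b)
    return hue_range[0] <= h <= hue_range[1] and sat_range[0] <= s <= sat_range[1]

# e.g. a light skin pixel and a dark skin pixel, both in the same reddish-brown hue band:
print(looks_like_skin(0.90, 0.72, 0.62))  # True
print(looks_like_skin(0.35, 0.22, 0.16))  # True
```

The attraction of this route, as described above, is that skin darkness mostly moves the saturation and brightness axes while the hue stays inside a narrow band, so the gate keeps working across skin tones.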
HzlStone — January 6, 2010
The recognition smarts was not in the sensors, it was in the algorithm, so Moore's law does apply. He prototyped it using a desktop computer and images from the web. And the reference to the PhD was to make the point that the intelligent thing to do was consider all skin tones, not just those of people he knew.
Pants Charming — January 7, 2010
if it were really racist it would follow black people really well. especially around stores.
Bagelsan — January 7, 2010
Now I'm thinking about a science-fiction story where everyone is tracked all the time by cameras and Big Brother and whatnot... except really dark-skinned people! The Terminator walks by 'em, the killbots skip them, the billboards that respond to you looking don't bother them, bank security cameras ignore them...
Creepy and weird, yes. But a cool story idea...if terribly unrealistic. :p
Eli — January 8, 2010
>>The technology we use is built on standard algorithms that measure the difference in intensity of contrast between the eyes and the upper cheek and nose. We believe that the camera might have difficulty “seeing” contrast in conditions where there is insufficient foreground lighting.
I am apparently reading this wrong, but shouldn't that mean that these algorithms would be better at detecting the contrast between the white of a person of color's eyes and their dark skin?
Student — January 8, 2010
Looks like I missed most of the fun. Oh well.
Anyway, what I wanted to say was that there are things both sides can learn, and are learning, from each other. One side is learning about optics and why this happened; the other side is learning that they need to work on better ways to program.
It seems likely to me that it may have been tested on black people in an environment that was well-lit, and that the error was most likely in not thinking to test both skin color and lighting conditions at the same time. If the testers figured that it did work on all people in one set of conditions, they might not have thought to test both conditions at once, figuring there probably wouldn't be a reason to since it worked on all people previously. This may indicate more of a lack of knowledge of camera optics in regard to different skin tones than never having tested other demographics.
I'm just saying there's a chance it worked like this:
Management: Did you get it working within the budget and time constraints?
Engineers: Yes, it works really well. We're pretty excited about it.
Management: Does it work for all skin colors?
Engineers: Yeah, we tested that out.
Management: Does it work in different lighting?
Engineers: Well, we have some issues at low lighting, but with proper lighting being used, it should work just fine.
Management: Good.
There may have been no point in the conversation where someone went "Did you test all skin colors and lighting circumstances simultaneously?" Works for all colors: check. Starts messing up in poor lighting: check. They might not have checked both at the same time. I'm saying that it seems possible that it wasn't tested extensively, that they just covered the bases instead of checking various circumstances at the same time.
Then again, who knows under what constraints or in what scenarios this was done? It's all speculation, so even my post is useless.
I think it would be best for HP and other companies to hire sociologists to think about these circumstances, to reduce the chance of these things happening. However, there's still a chance even a sociologist wouldn't have asked, "Did you make sure it works with people of all different skin tones in low lighting?" Engineers and programmers are by nature problem solvers figuring out how to make something work cheaply. They're problem solvers of technology, not necessarily people knowledgeable of or trained to deal with all kinds of possible outcomes of their product. Now that they have undeniable awareness of the flaws of using only contrast in their programming, I have no doubt they'll solve it.
I guess one last thing would be, if people went "Your product doesn't work with dark skin tones as well as it does with light skin tones. We feel it would be best to address this problem as quickly as possible, and once fixed, offer replacements to people unsatisfied with your product. The camera's poor quality when presented with darker skin colors will reflect on your sales, so we would encourage you to fix this as quickly as possible." I doubt a reasonable company wouldn't start working on fixing the problems right away, even without aggressive words. Then again, if they don't respond with "We're working on it, in the mean time please use good lighting" then by all means, rage away until they fix it, because then they would demonstrate that they legitimately do not care about other races.
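Student's scenario, in which each condition is checked on its own but never in combination, is exactly what a small cross-product test matrix is meant to catch. A hedged sketch in which the harness hook, fixture clips, and pass threshold are all hypothetical:

```python
import itertools

SKIN_TONES = ["light", "medium", "dark"]
LIGHTING = ["bright", "office", "dim"]

def run_detection_test(skin_tone, lighting):
    """Hypothetical hook into a test harness: play back a recorded clip for this
    combination and return the fraction of frames in which the face was tracked."""
    return 1.0  # stand-in value so the sketch runs; a real harness would measure this

def failing_combinations(min_rate=0.9):
    failures = []
    # Walk the cross product of conditions, not each axis on its own, so a
    # combination like dark skin plus dim lighting cannot slip through unnoticed.
    for tone, light in itertools.product(SKIN_TONES, LIGHTING):
        rate = run_detection_test(tone, light)
        if rate < min_rate:
            failures.append((tone, light, rate))
    return failures

print(failing_combinations())  # [] with the stand-in value; real clips would tell the story
```

With three tones and three lighting setups, the cross product is only nine runs instead of six, which is roughly the scale of extra effort the comment is describing.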
anonymoussociologist — January 9, 2010
Don't know if someone already said this but Better Off Ted had an episode in the first season called "Racial Diversity" I believe and it was about sensors that do not see darker skin. So f'd up!
The great white sharks of the reading world | Lectitare — July 10, 2015
[…] People just don’t think about some group of people, and then something gets messed up. (See: face-tracking software can’t see black people.) It’s too bad Scribd handled it by wiping out their romance selection, though. It may have […]
Reginald Coghlan — March 25, 2024
The article highlights the critical importance of thorough testing and inclusivity in software development, as demonstrated by the issues HP's face tracking software had in recognizing darker-skinned faces. It emphasizes the necessity for developers to consider diverse user demographics during both the design and testing phases. Prioritizing comprehensive testing and inclusivity enables developers to mitigate biases and ensure equitable user experiences for all.
Maybe you’ve fallen in love, but your ai partner isn’t yours. – kirstensnd — May 31, 2024
[…] datasets are inevitably inherited by the ai and applied to the works it generates. This includes gender bias, racial bias, and even political […]