Toban B., Elisabeth, and Mark sent us a link to a post at jozjozjoz about the Nikon S630 digital camera. As Joz explains, “As I was taking pictures of my family, it kept asking ‘Did someone blink?’ even though our eyes were always open.”
Apparently the camera perceives “Asian” eyes as closed.
Does anyone know how cameras are programmed to do things like recognize blinking? Does the program look for specific measurements to define an eye as open or closed and then prompt the user with a question about blinking? It would seem the program doesn’t know how to deal with Asian features, which makes me wonder about the “typical” faces or facial features used to write the program: who was used as the “neutral” model?
Anybody know more about how these types of programs are written, and how specifications are chosen to give the camera a baseline for determining that the face in a photo requires “fixing” of some sort?
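For readers curious about the mechanics: Nikon has not published how its blink detection works, but one common approach in face-analysis software is to locate landmark points around each eye and compute an “eye aspect ratio,” the eye’s height relative to its width, which falls toward zero as the lid closes. The sketch below is a minimal, purely illustrative version of that idea, not the S630’s actual algorithm; the landmark layout and the 0.2 cutoff are assumptions.

```python
# Minimal, hypothetical sketch of a common blink test, the "eye aspect
# ratio" (EAR). NOT Nikon's published method; the landmark ordering and
# the 0.2 cutoff are illustrative assumptions.
import math

def dist(a, b):
    return math.hypot(a[0] - b[0], a[1] - b[1])

def eye_aspect_ratio(p1, p2, p3, p4, p5, p6):
    """Six (x, y) landmarks on one eye: p1/p4 are the corners,
    p2/p3 sit on the upper lid, p6/p5 on the lower lid."""
    return (dist(p2, p6) + dist(p3, p5)) / (2.0 * dist(p1, p4))

BLINK_CUTOFF = 0.2  # one fixed threshold, tuned on some set of faces

def someone_blinked(eyes):
    """eyes: a list of six-landmark tuples, one per detected eye."""
    return any(eye_aspect_ratio(*eye) < BLINK_CUTOFF for eye in eyes)
```

A fixed cutoff like that is exactly where a “neutral” model sneaks in: if the threshold was tuned on faces whose open eyes are, on average, taller, then eyes that are naturally narrower when open fall below it and get flagged as blinks.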
UPDATE: Commenter Elizabeth says,
I just got back from vacation with a friend who has this camera (we are three white women) and after every photo, it asked us, “Did someone blink?” It became a running joke because the sensor asked this question whether or not there was a person (or blinking person) in the shot.
Several of our other commenters had some info on how face-recognition programs work and what the problem might be; the short version is that a) they generally suck and b) they might suck slightly more for some groups than others, but overall they’re pretty crappy at this point no matter what.
NEW! Racialicious posted about the Microsoft Natal game, which seems to have some problems recognizing the movements of people with dark skin (and maybe dreadlocks):
Research into the issue resulted in a study concluding that near-infra-red cameras did indeed struggle to read movements from those with darker skin. However, Microsoft has responded to these worries, telling Gamezine that all ethnicities will be able to use the technology.
The post has a really good discussion about race and “neutral” avatars in games, including some in which you have to pay extra to get a non-White character.
Comments
Duran — May 29, 2009
This is, if true, probably just a case of poor software testing caused by an overaggressive drive to market. Complex software is relatively new to cameras, and the industry is likely still figuring out the demands of releasing high-quality global software. The early days of mobile phones had similar problems with software. You have to understand that these guys are driven primarily by the hardware; to a lot of vendors, software is something you slap in just before your CNET review. Same for phones in the past.
Duran — May 29, 2009
Sorry for the typos; written on a phone.
Anonymous — May 30, 2009
Yeah, if they did any sort of rigorous testing, they wouldn't have had a "neutral" model. They would've tested on people of all ages, sizes, ethnicities, etc.
Elena — May 30, 2009
I've just gotten back from my holidays in Tokyo. I have a Nikon Coolpix camera and I was really amused when the same message appeared most times I took a picture of a statue of Buddha.
FWIW, Nikon (full name 株式会社ニコン or Kabushiki-gaisha Nikon) is a Japanese company, so yeah. I don't really think they are discriminating against Asians.
simono — May 30, 2009
I can imagine this is harder to tune for Asian faces, since the visual difference between a blinking and a non-blinking eye is much smaller.
For example, at least in the pic you show, the eyes DO look like they're blinking even to my human processor.
To recognize blinking, the software probably crops out the eyes and determines the relative amount of 'white' that can be seen; if it's below a certain margin for both eyes, then the camera thinks the person is blinking.
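(A toy version of the heuristic described above, just to make the failure mode concrete. This is the commenter's speculation, not any vendor's confirmed method, and the brightness cutoff and margin are made-up illustrative values.)

```python
# Toy sketch of the white-of-the-eye heuristic guessed at above.
# Entirely speculative: no camera vendor has confirmed this method,
# and the 200 / 0.05 thresholds are arbitrary illustrative values.
import numpy as np

def white_ratio(eye_crop, brightness_cutoff=200):
    """eye_crop: an HxWx3 uint8 RGB crop of one eye region."""
    gray = eye_crop.mean(axis=2)                      # crude grayscale
    return float((gray > brightness_cutoff).mean())   # share of bright pixels

def thinks_blinking(left_eye, right_eye, margin=0.05):
    # Flag a blink only when BOTH eyes show too little visible sclera.
    return white_ratio(left_eye) < margin and white_ratio(right_eye) < margin
```

Note everything that lowers the white ratio without an actual blink: dim lighting, naturally narrower palpebral apertures, a squinting smile, or a downward glance. That matches the false positives reported throughout this thread.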
Joshua — May 30, 2009
This may be exacerbated by the natural squinting that occurs during some smiles.
Dubi — May 30, 2009
As Elena noted, this is a Japanese company. Rather than talk about the supposed "neutral" model (i.e., why those %@%^%$% racists!), I think this is actually one of those interesting cases of the cultural fascination the Japanese have with Americans (or, really, people of European origins). This fascination rivals any instance of exoticization of Asians you can find in Western cultures. It actually gives us a good opportunity to get a feel for how they react to our feeble attempts at understanding their culture(s), since it so often turns out quite comical.
At any rate, this isn't about racism against Asians, but merely a misrepresentation of the West as ethnically homogeneous.
T B — May 30, 2009
There are more comments here, with links to various other sites where comments have been posted:
http://www.flickr.com/photos/jozjozjoz/3529106844/
(Finding any substantial analysis will be a challenge, it seems, however.)
Umlud — May 30, 2009
Not to get all meta (again) on a post, but Asia is a continent containing several billion people with widely varying (such as it is within a single species) facial features. If by "Asian features" you meant East Asian (i.e., Japanese, Korean, eastern Chinese), then why not say so? Remember, peoples originating anywhere from Turkey to Japan are all "Asian", and I'm assuming you didn't mean to imply that this particular programming fault affected all those billions of "Asian" eyes, even those that don't have "Asian features".
I also agree with Elena: Nikon, a Japanese company, is unlikely to be singling out those with "Asian features". What I wondered while reading the article was whether Nikon uses a different facial-recognition model in its Japanese-market cameras and, if so, why it chose to use a different one in the example above. Does anyone know the answer? (Elena - was the camera you used from the US?)
Elena — May 30, 2009
No, it was bought in Europe. But it also displays the same message when my Caucasian relatives are looking downward or sideways, and sometimes in images where no humans are present. Image recognition AI software is just not ready for prime time yet.
DaniFae — May 30, 2009
Elena's posts kind of confirm what I was thinking: the AI just isn't that good. Judging from the picture submitted, the camera would ask that of just about anyone who was smiling or laughing, barring someone with extremely large/wide eyes.
Though, personally, I'd have to see a series of pictures, taken of different people, before saying the camera's programming was racist. (One picture is not a good sample.)
Paul — May 30, 2009
It may not be germane to the post, but I'm reminded of a story my ex told about a decade ago, about a vocal prompt her grandfather's new camera would offer. I tried to find a corroborating reference online, but can only come up with the following phrase repeated verbatim on different sites:
"Too crowdy. Use frash." -- Jen's grandfather's very expensive talking Japanese camera on a typical overcast Seattle day.
Elizabeth — May 30, 2009
I just got back from vacation with a friend who has this camera (we are three white women) and after every photo, it asked us, "Did someone blink?" It became a running joke because the sensor asked this question whether or not there was a person (or blinking person) in the shot. We found a way to turn this application off because it was so distracting. I'm curious as to whether you'll discover anything here in relation to race.
Jonathan — May 31, 2009
Okay, speaking from the programming side of things, computers are just really, really bad at interacting with physical reality outside of all but the narrowest, most tightly controlled situations. Pop culture likes to describe brains as being "like computers." They are not. Not in the least. Generally speaking, the brain is good at doing what it evolved to do, interacting with physical reality and adapting to new stimuli. Computers are good at doing what they were originally designed to do, large mathematical calculations performed quickly and with a high degree of accuracy. Getting a computer to do anything else requires herculean efforts that mainly result in novel ways to reduce some activity to a series of mathematical calculations. Now this doesn't work well when faced with messy reality. That's why our cars still can't drive themselves, that's why our computers still sound like robots, and that's why that camera can't tell when people's eyes are shut or when they're open. We just don't have good formulas for that yet.
Sycorax — May 31, 2009
It's not just cameras that have trouble with human features; there's a Flickr pool of "Things iPhoto Thinks Are Faces" here:
http://www.flickr.com/groups/977532@N24/pool/
simono — June 1, 2009
@sycorax: cameras and iPhoto = similar software, same problem
Sociological Images » What We’ve Been Up To Behind Your Back (June 2009) — July 1, 2009
[...] Racialicious had an interesting post about Microsoft’s Natal game initially having trouble recognizing people with “dark skin,” which we added to our post about Nikon’s blink-recognition software problems. [...]
HP Software Doesn’t See Black People » Sociological Images — January 5, 2010
[...] seen this kind of thing before with a Nikon camera that seemed to think that Asian people were always blinking (though there was [...]
user testing fail - usability fail. — January 5, 2010
[...] or their engineers would’ve caught it testing on themselves. Since this kind of problem isn’t new in facial recognition software, lack of full user testing that takes into consideration the full [...]
RickS — January 30, 2010
Applying a human characteristic to an electronic machine: have we become so brain-dead that we're offended when we have to answer a machine's question about how to process something we just did? That we can't just say no and take more pictures? No, we have to turn it into a racial thing and make a global statement claiming racism by a camera manufacturer that creates a wide variety of optical equipment that benefits all mankind. It boggles the mind. I think only a racist could find this racist, and it isn't serious enough for me to give up my NIKON F4s over, because it captures everything, including color, accurately. This camera would probably ask the same thing if I took a picture of a younger brother of mine, because he was always being asked if his eyes were open, both during the pictures and when looking at the prints. I think if the camera asked 'Was there an ugly person in the picture?' or 'Did you get everybody's good side? Do you want to save anyway?' then we might have a problem. Until then, use the other option of turning it off. Here's a project for you: see if it asks that when people are laughing so hard they close their eyes, or when someone is sleeping. Oh, and here's another one: get a friggin' life.
Why Do Asians Take So Many Photos? | Ed Uncovered — September 24, 2013
[...] he sent me this link (cos we were speaking online at the time – no one chats with their mates in real life any [...]
Teaching The Camera To See My Skin — April 3, 2014
[…] a message popped on the screen inquiring whether or not the subject blinked, to which she posted a photo online replying, “No, I’m just Asian.” Even today, in low light, the sensors search for […]
The Arioch — December 4, 2014
https://plus.google.com/116727221167063581708/posts/8CfgVB5yc1q
Latino Films Absent From Hollywood’s 2015 Slate – Flavorwire — January 7, 2015
[…] the transition to digital photography brought the Nikon Coolpix’s tendency to assume Asians were constantly blinking and low-light sensors designed to focus only on light-skinned faces. This is an ethnic […]
For Mon 10/5 | Asian Pacific American Media — October 1, 2015
[…] http://thesocietypages.org/socimages/2009/05/29/nikon-camera-says-asians-are-always-blinking/ […]
Make something from HERE. Wherever here is to you. Thoughts from Interaction16. | Michelle Thorne — March 16, 2016
[…] It falsely identified some people of color as “gorillas.” In another example, Nikon cameras falsely interpreted the eye shapes of many Asian users. The cameras suggested that people in the photo had blinked when their eyes were actually […]
Why Our Conversations on Artificial Intelligence Are Incomplete - Artificial Intelligence Online — February 19, 2017
[…] bias is what led a Google application to tag black people as gorillas or the Nikon camera software to misread Asian people as blinking. Second, if the process being measured through data collection itself reflects long-standing […]
Can We Say Goodbye to Bias in AI? – the Science Bae — April 19, 2019
[…] Nikon Camera thinks Individuals of Asian Descent are Always Blinking […]
“Racist” Government passport system rejects black man’s photo despite meeting all criteria - DIY Photography — September 19, 2019
[…] and Google had the same problem with their products. Nikon cameras from ten years ago would ask “Did someone blink?” when an Asian woman was taking photos of her family. A bit more recently, Google Photos app tagged […]
Online database ImageNet to remove 600,000 images after art project exposes its racist and gender bias - DIY Photography — September 25, 2019
[…] one. Even though artificial intelligence improves over the years, it still turns out to be pretty stupid or offensive in some situations. My assumption is that it will still take time before labeling […]
Mario Gómez – Emotion Research Lab – WORLD WILD WEB — December 14, 2019
[…] 2 The Flickr photo was made public in 2009. Source: The Society Pages. […]
Anonymous — June 4, 2021
lol
Facial Misrecognition - The Collation — July 9, 2021
[…] recognition technology is shaped by and reinforces the priorities and prejudices of its designers: cameras that cannot discern facial features of Asians and algorithms that identify dark-skinned faces as those of gorillas. Buolamwini terms the bias […]
Baked in oppression: the racism of “things” - SSZEE MEDIA — August 15, 2021
[…] This is linked with photography again, as Nikon cameras struggle with the ‘blink recognition’ setting, constantly asking the photographers of Asian subjects “did someone blink?” […]
Will SEPTA’s new AI security system racially profile riders? - RB Webcity — December 25, 2022
[…] at twice the rate of white defendants. When digital cameras became widely available, some detected Asian people as perpetually blinking. Algorithmic bias is common in other fields, including mortgage loans, speech recognition, and our […]
The Revolution Will Be Programmed; addressing ethics issues in artificial intelligence - Brave In The Attempt — July 19, 2023
[…] Nikon cameras say Asian people are always blinking […]
AI Is Racist, Experts Say. Here's Why HD IMAGES — September 11, 2023
[…] 2009, Nikon’s face detection software would ask Asian people if they were blinking. In 2016, an AI tool used by U.S. courts to assess the […]
AI Exhibits Racial Bias Similar To Humans, Says Experts - The Distributed — September 15, 2023
[…] 2009, Nikon’s facial recognition software mistakenly inquired if they were blinking. Then, in 2016, an artificial intelligence […]
Notes from ARLT SIG 7th Feb 2024 LTHE chat participation : #ALTC Blog — February 19, 2024
[…] Nikon Camera Says Asians: People Are Always Blinking […]
Blog Cyberjustice - Dilemmes éthiques et biais dans l'intelligence artificielle et la technologie de reconnaissance faciale — February 29, 2024
[…] https://thesocietypages.org/socimages/2009/05/29/nikon-camera-says-asians-are-always-blinking/ […]
Hétköznapi adatelemzés - kinek és mit higgyünk el? | Motiváció, hatékonyság, siker — May 10, 2024
[…] that Nikon's cameras, which were programmed to retake the shot when they sense that someone […]