{"id":72309,"date":"2018-03-29T09:00:45","date_gmt":"2018-03-29T14:00:45","guid":{"rendered":"https:\/\/thesocietypages.org\/socimages\/?p=72309"},"modified":"2018-03-29T17:16:07","modified_gmt":"2018-03-29T22:16:07","slug":"when-data-cant-dj","status":"publish","type":"post","link":"https:\/\/thesocietypages.org\/socimages\/2018\/03\/29\/when-data-cant-dj\/","title":{"rendered":"When Data Can&#8217;t DJ"},"content":{"rendered":"<p>More social scientists are pointing out that the computer algorithms that run so much of our lives <a href=\"https:\/\/scatter.wordpress.com\/2017\/06\/13\/algorithmic-decisionmaking-replaces-your-biases-with-someone-elses-biases\/\">have our human, social biases<\/a> baked in. This has <a href=\"https:\/\/us.macmillan.com\/automatinginequality\/virginiaeubanks\/9781250074317\/\">serious consequences<\/a> for determining who gets credit, who gets parole, and all kinds of other important life opportunities.<\/p>\n<p>It also has some sillier consequences.<\/p>\n<p>Last week NPR host Sam Sanders tweeted about his Spotify recommendations:<\/p>\n<blockquote class=\"twitter-tweet\" data-lang=\"en\">\n<p dir=\"ltr\" lang=\"en\">Y&#8217;all I think <a href=\"https:\/\/twitter.com\/Spotify?ref_src=twsrc%5Etfw\">@Spotify<\/a> is segregating the music it recommends for Me by the race of the performer, and it is so friggin&#8217; hilarious <a href=\"https:\/\/t.co\/gA2wSWup6i\">pic.twitter.com\/gA2wSWup6i<\/a><\/p>\n<p>\u2014 Sam Sanders (@samsanders) <a href=\"https:\/\/twitter.com\/samsanders\/status\/976504661568991233?ref_src=twsrc%5Etfw\">March 21, 2018<\/a><\/p><\/blockquote>\n<p><script async src=\"https:\/\/platform.twitter.com\/widgets.js\" charset=\"utf-8\"><\/script><\/p>\n<p>Others quickly chimed in with screenshots of their own. 
Here are some of my mixes:<\/p>\n<p><a href=\"https:\/\/thesocietypages.org\/socimages\/files\/2018\/03\/Spot1.png\" data-rel=\"lightbox-image-0\" data-rl_title=\"\" data-rl_caption=\"\" title=\"\"><img loading=\"lazy\" decoding=\"async\" class=\"aligncenter size-medium wp-image-72313\" src=\"https:\/\/thesocietypages.org\/socimages\/files\/2018\/03\/Spot1-500x218.png\" alt=\"\" width=\"500\" height=\"218\" \/><\/a><\/p>\n<p><a href=\"https:\/\/thesocietypages.org\/socimages\/files\/2018\/03\/Spot2.png\" data-rel=\"lightbox-image-1\" data-rl_title=\"\" data-rl_caption=\"\" title=\"\"><img loading=\"lazy\" decoding=\"async\" class=\"aligncenter size-medium wp-image-72314\" src=\"https:\/\/thesocietypages.org\/socimages\/files\/2018\/03\/Spot2-500x229.png\" alt=\"\" width=\"500\" height=\"229\" srcset=\"https:\/\/thesocietypages.org\/socimages\/files\/2018\/03\/Spot2-500x229.png 500w, https:\/\/thesocietypages.org\/socimages\/files\/2018\/03\/Spot2-768x351.png 768w, https:\/\/thesocietypages.org\/socimages\/files\/2018\/03\/Spot2-1024x468.png 1024w, https:\/\/thesocietypages.org\/socimages\/files\/2018\/03\/Spot2.png 1574w\" sizes=\"auto, (max-width: 500px) 100vw, 500px\" \/><\/a><\/p>\n<p>The program has clearly learned to suggest music based on established listening patterns and genre norms.
Sociologists know that music tastes are a way we <a href=\"https:\/\/books.google.com\/books?hl=en&amp;lr=&amp;id=z0jXKP0ZsI0C&amp;oi=fnd&amp;pg=PP1&amp;dq=info:1-8og7nsGgwJ:scholar.google.com&amp;ots=cFymypO7mK&amp;sig=CbzUgrdN-hVXlW-bf-vTPvhWF3Y#v=onepage&amp;q&amp;f=false\">build communities<\/a> and <a href=\"http:\/\/www.jstor.org\/stable\/2096459?seq=1#page_scan_tab_contents\">signal our identities to others<\/a>, and the music industry <a href=\"https:\/\/www.sciencedirect.com\/science\/article\/pii\/S0304422X04000324\">reinforces these boundaries<\/a> in its marketing, especially along racial lines.<\/p>\n<p>These patterns highlight a core sociological point: social boundaries, large and small, emerge from our behavior even when nobody is trying to exclude anyone. Algorithms accelerate this process through the sheer number of interactions they can watch at any given time. It is important to remember the stakes of these design quirks when talking about new technology. After all, if biased results come out, the program probably\u00a0<a href=\"https:\/\/www.youtube.com\/watch?v=Y-Elr5K2Vuo\" data-rel=\"lightbox-video-0\">learned it from watching us<\/a>!<\/p>\n<span class=\"ft_signature\"><i><a href=\"https:\/\/www.evan-stewart.com\/\">Evan Stewart<\/a> is an assistant professor of sociology at the University of Massachusetts Boston. You can follow his work at <a href=\"https:\/\/evan-stewart.com\">his website<\/a>, or on <a href=\"https:\/\/bsky.app\/profile\/evanstewart.bsky.social\">BlueSky<\/a>.<\/i>  <\/span>","protected":false},"excerpt":{"rendered":"<p>More social scientists are pointing out that the computer algorithms that run so much of our lives have our human, social biases baked in. This has serious consequences for determining who gets credit, who gets parole, and all kinds of other important life opportunities. It also has some sillier consequences. 
Last week NPR host Sam [&hellip;]<\/p>\n","protected":false},"author":1893,"featured_media":72324,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[1],"tags":[15,100606,115,285],"class_list":["post-72309","post","type-post","status-publish","format-standard","has-post-thumbnail","hentry","category-uncategorized","tag-culture","tag-methods-big-data","tag-music","tag-raceethnicity"],"jetpack_featured_media_url":"https:\/\/thesocietypages.org\/socimages\/files\/2018\/03\/Spot1-1-e1522254207183.png","_links":{"self":[{"href":"https:\/\/thesocietypages.org\/socimages\/wp-json\/wp\/v2\/posts\/72309","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/thesocietypages.org\/socimages\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/thesocietypages.org\/socimages\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/thesocietypages.org\/socimages\/wp-json\/wp\/v2\/users\/1893"}],"replies":[{"embeddable":true,"href":"https:\/\/thesocietypages.org\/socimages\/wp-json\/wp\/v2\/comments?post=72309"}],"version-history":[{"count":8,"href":"https:\/\/thesocietypages.org\/socimages\/wp-json\/wp\/v2\/posts\/72309\/revisions"}],"predecessor-version":[{"id":72327,"href":"https:\/\/thesocietypages.org\/socimages\/wp-json\/wp\/v2\/posts\/72309\/revisions\/72327"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/thesocietypages.org\/socimages\/wp-json\/wp\/v2\/media\/72324"}],"wp:attachment":[{"href":"https:\/\/thesocietypages.org\/socimages\/wp-json\/wp\/v2\/media?parent=72309"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/thesocietypages.org\/socimages\/wp-json\/wp\/v2\/categories?post=72309"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/thesocietypages.org\/socimages\/wp-json\/wp\/v2\/tags?post=72309"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}