{"id":10475,"date":"2020-03-30T17:44:41","date_gmt":"2020-03-30T17:44:41","guid":{"rendered":"https:\/\/thesocietypages.org\/discoveries\/?p=10475"},"modified":"2020-03-30T17:44:42","modified_gmt":"2020-03-30T17:44:42","slug":"algorithmic-blues-accuracy-versus-morality-in-policy-debates","status":"publish","type":"post","link":"https:\/\/thesocietypages.org\/discoveries\/2020\/03\/30\/algorithmic-blues-accuracy-versus-morality-in-policy-debates\/","title":{"rendered":"Algorithmic Blues: Accuracy Versus Morality in Policy Debates"},"content":{"rendered":"<div class='citation'>\n    <span class='authors'>Barbara Kiviat, <\/span><span class='link'><a href=\"https:\/\/journals.sagepub.com\/doi\/abs\/10.1177\/0003122419884917?casa_token=E3wUhifvxO8AAAAA:clilrw0t24iU8-ZPP-Dno1KNbcFFnAbRNKY-uA6HsCk7RVA1RWuN3zqC5Zf9kTPlWtym-XMn5ebx\">&ldquo;The Moral Limits of Predictive Practices: The Case of Credit-Based Insurance Scores,&rdquo; <em>American Sociological Review<\/em>,<\/a><\/span><span class='year'> 2019<\/span><\/div>\n\n<figure class=\"wp-block-image size-large\"><a href=\"https:\/\/www.flickr.com\/photos\/cafecredit\/27321078025\/\"><img loading=\"lazy\" decoding=\"async\" width=\"600\" height=\"268\" src=\"https:\/\/thesocietypages.org\/discoveries\/files\/2020\/03\/27321078025_99ba1ac7f9_z-600x268.jpg\" alt=\"Picture of a color-coded credit score scale\" class=\"wp-image-10489\" srcset=\"https:\/\/thesocietypages.org\/discoveries\/files\/2020\/03\/27321078025_99ba1ac7f9_z-600x268.jpg 600w, https:\/\/thesocietypages.org\/discoveries\/files\/2020\/03\/27321078025_99ba1ac7f9_z-300x134.jpg 300w, https:\/\/thesocietypages.org\/discoveries\/files\/2020\/03\/27321078025_99ba1ac7f9_z.jpg 640w\" sizes=\"auto, (max-width: 600px) 100vw, 600px\" \/><\/a><figcaption>Photo by <a href=\"http:\/\/cafecredit.com\">CafeCredit.com<\/a>, Flickr CC<\/figcaption><\/figure>\n\n\n\n<p>It seems that algorithms are shaping more and more of our world. 
However, algorithms &#8212; rule- or process-based calculations most often carried out by computers &#8212; have been an important part of society for centuries. In her <a href=\"https:\/\/journals.sagepub.com\/doi\/abs\/10.1177\/0003122419884917?casa_token=E3wUhifvxO8AAAAA:clilrw0t24iU8-ZPP-Dno1KNbcFFnAbRNKY-uA6HsCk7RVA1RWuN3zqC5Zf9kTPlWtym-XMn5ebx\">new research<\/a>, <a href=\"https:\/\/sociology.stanford.edu\/people\/barbara-kiviat\">Barbara Kiviat<\/a> explores how policymakers respond to one not-so-new use of algorithms and the predictions they produce: insurance companies\u2019 use of credit scores to set prices.<\/p>\n\n\n\n<p>Kiviat examines thousands of pages of documents and 28 hours of testimony from state, congressional, and professional debates and investigations over insurance companies&#8217; use of credit scores. Credit scores are the output of algorithms that draw on huge amounts of consumer financial information. Insurance companies use these scores to set prices based on predictions of how often a customer will file insurance claims, so customers with lower credit scores pay higher prices. Within the insurance industry there is widespread agreement that this practice is justified by \u201cactuarial fairness\u201d: using the data to set prices is fair because credit scores really do predict how often someone will use their insurance.<\/p>\n\n\n\n<p>Policymakers, however, do not accept the insurance industry\u2019s argument that credit scores are \u201cactuarially fair.\u201d Instead, they draw on ideas of \u201cmoral deservingness,\u201d trying to determine whether or not people were responsible for the good or bad behaviors that correspond to their current credit score and insurance cost. Policymakers objected to the use of credit scores when the scores did not reflect their understandings of what counted as good or bad behavior. For instance, policymakers sought to add \u201cextraordinary life circumstances\u201d provisions to insurance regulation so that consumers would not be penalized for poor credit scores resulting from, say, the death of a spouse or child.<\/p>\n\n\n\n<p>This research shows that policymakers do not object to predictive practices because they are mysterious or confusing. Rather, they object when algorithmic results clash with existing assumptions about what counts as good or bad behavior. Kiviat\u2019s findings are important to consider as algorithms and the predictions they create spread through more of our social and economic life: identifying students at \u201chigh risk\u201d of poor academic outcomes, informing policing by \u201cpredicting\u201d crime, or showing job ads to some individuals and not others.<\/p>\n\n\n\n<p>Resistance to algorithms based on fairness alone can only go so far. 
Who will be protected from the use of algorithms if we think they are unfair only for \u201cgood\u201d people?<\/p>\n","protected":false},"excerpt":{"rendered":"<p>Barbara Kiviat, &ldquo;The Moral Limits of Predictive Practices: The Case of Credit-Based Insurance Scores,&rdquo; American Sociological Review, 2019 It seems that algorithms are shaping more and more of our world. However, algorithms &#8212; rule or process-based calculations most often done by computers &#8212; have been an important part of society for centuries. In her new [&hellip;]<\/p>\n","protected":false},"author":2020,"featured_media":0,"comment_status":"closed","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[15,13,85],"tags":[125983,125984,33095,125982,904,125976,14907,586,18824,37332,1842,3107,371,125980,125979,37336,764,125985,1877],"class_list":["post-10475","post","type-post","status-publish","format-standard","hentry","category-culture","category-inequality","category-politics","tag-actuarial","tag-actuary","tag-algorithm","tag-algorithmic-risk","tag-credit","tag-credit-score","tag-sociology-of-culture","tag-economic","tag-economic-inequality","tag-inequality","tag-insurance","tag-morality","tag-policy","tag-policy-maker","tag-policymaker","tag-politics","tag-prediction","tag-predictive-practies","tag-risk"],"jetpack_featured_media_url":"","_links":{"self":[{"href":"https:\/\/thesocietypages.org\/discoveries\/wp-json\/wp\/v2\/posts\/10475","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/thesocietypages.org\/discoveries\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/thesocietypages.org\/discoveries\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/thesocietypages.org\/discoveries\/wp-json\/wp\/v2\/users\/2020"}],"replies":[{"embeddable":true,"href":"https:\/\/thesocietypages.org\/discoveries\/wp-json\/wp\/v2\/comments?post=10475"}],"version-history":[{"count":16,"href":"htt
ps:\/\/thesocietypages.org\/discoveries\/wp-json\/wp\/v2\/posts\/10475\/revisions"}],"predecessor-version":[{"id":10493,"href":"https:\/\/thesocietypages.org\/discoveries\/wp-json\/wp\/v2\/posts\/10475\/revisions\/10493"}],"wp:attachment":[{"href":"https:\/\/thesocietypages.org\/discoveries\/wp-json\/wp\/v2\/media?parent=10475"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/thesocietypages.org\/discoveries\/wp-json\/wp\/v2\/categories?post=10475"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/thesocietypages.org\/discoveries\/wp-json\/wp\/v2\/tags?post=10475"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}