Photo by Kandukuru Nagarjun, Flickr CC

This post was created in collaboration with the Minnesota Journalism Center.

Technology has its share of perks and benefits. Past articles on The Society Pages demonstrate how artificial intelligence and technology can help enhance journalism and curb trafficking and trolling online, but scholars have also found that technology has a dark side. Meredith Broussard calls this "technochauvinism," the belief that tech is always the solution to the world's problems. It is a red flag, she says, because "there has never been, nor will there ever be, a technological innovation that moves us away from the essential problems of human nature."

One of these problems is unequal access to the internet. On The Society Pages, we highlighted how access to the internet influences activism. Other research shows how internet access shapes a range of societal practices, including predictive policing, real estate markets, affordable housing, social services, and medical care. Predictive policing, for example, is a developing area of inquiry. The practice has come under scrutiny for its lack of transparency and for its potential to assign inaccurate risk scores to individuals who may become victims of, or offenders in, violent crime, which can lead to the overpolicing of already marginalized areas.
Scholars have also found that blue-chip companies, including Google, produce search results that marginalize underrepresented populations. There is also fear that algorithms are writing people out of jobs, though different fields will feel this to different degrees. Paralegals are among the most exposed: up to 69 percent of their work time could be automated. Reporters and editors are in better shape because they can turn algorithms to their advantage: embedded in a human-centered process, algorithms can increase reporting output with less human effort. But algorithms aren't neutral. They are produced by people, and they have the potential to reproduce marginalization.