So, when you talk about DNA with respect to music, THIS is the first thing that comes to MY mind.
This is cross-posted at Its Her Factory.
There are a lot of reasons to headdesk over this 538 video about Pandora’s Music Genome Project and its application to music therapy. There’s the video itself: some people in my twitter TL found its cinematography too precious. There’s the project it details: a big data project that uncritically draws assumptions about music from 18th century European music theory (CPE Bach actually wrote the book on tonal harmony), and assumptions about structure, organization, and relationships from genetics.
In addition to the technical problems with the project (that is, its uncritical reliance on Western music theory…whiiiiiich is also racist, in the sense of normatively white supremacist), the Music Genome Project is, I want to suggest, racist. Its genetics-based approach is too too resonant with 19th century race science, and its therapeutic application (the second half of the video is entirely about this) is pretty clearly biopolitically racist.
“I just wanted to hear your voice and tell you how much I love you” –Samantha, Her
“Is it strange to have made something that hates you?” –Ava, Ex Machina
2001: A Space Odyssey is memorable not only for its depiction of artificial intelligence but also for its tranquil pacing and sterile modernism. Ex Machina plays the same way, taking place somewhere almost as deeply isolated as space. The remote IKEA-castle of a compound is itself mostly empty, with soft piano notes echoing off lonely opulence. The mansion is cold and modern but incorporates the lush nature outside. The film moves from windowless labs to trees and waterfalls to a living room that’s half house, half nature. The techno-bio juxtaposition and enmeshment clearly echo the film’s techno-human subject matter. But the wilderness reminds us of death as much as life.
The nature here is more than natural but is isolation, is the constant implication that there is no escape, is vulnerable dependence, and ultimately is a reminder that you are under the control of a violent, clever, scheming drunk. (more…)
Fox has decided to renew The X-Files, a series that aired its last episode over thirteen years ago, with a “six-episode event series” that begins this January. I don’t know what an “event series” is, but I’m pretty excited. Of course, there are a lot of new things to distrust the government about, so one has to wonder: from the burning temperature of jet fuel to the Facebook algorithm, what will the writers decide to focus on? I couldn’t help myself and made a listicle. (more…)
Dear Cyborgology Readers, we want you to write for us!! In our first ever thematic CFP, we invite guest posts about Cameras and Justice. This theme is broad in scope and we encourage you to put your own spin on it.
If you have an idea, pitch us. If you have a full post, send it our way. We will be taking submissions on this theme until mid June.
Posts generally run between 500 and 1,500 words. Authors should write in a clear and accessible style (think upper-level undergraduate or well-read non-academic). We welcome traditional text-based essays, image-based essays, and art pieces.
To get the brain juices flowing, here are a few pieces on Cameras and Justice from the Cyborgology team:
Cameras on Cops Isn’t the Same as Cops on Camera
ACLU Mobile Justice App: Channeling Citizen Voices
Sousveillance and Justice: A Panopticon in the Crowds
Surveillance from the Clouds to the Fog
Other riffs on this theme could include children’s privacy, tourism, unsolicited dick pics, structural oppression aided by the rhetoric of authenticity, and much, much more.
For submissions, questions, and proposals, email co-editors David Banks (email@example.com) and Jenny Davis (firstname.lastname@example.org) using the subject line “Cameras and Justice.”
Remember that Cyborgology (for better or worse) is an all-volunteer effort and we cannot pay for writing.
Headline Pic: Source
There has been a lot of talk about magic lately in critical, cultural, and technological spaces: what it does, who it is for, and who gets to control or enact it. As a way of unpacking a few elements of this thinking, this essay follows on from the conversations that Tobias Revell and I, along with a whole host of great participants, had at Haunted Machines, a conference held as part of FutureEverything 2015 that examined the proliferation of magical narratives in technology. With our speakers we discussed where these stories and mythologies reveal our anxieties about technology, who is casting the spells, and where – if possible – these narratives can be applied in useful ways.
As an ex-literature student, I’m quite interested in ghost stories as analogy, because they can reveal, or be an interesting way of exploring, these anxieties; where the voices in the static are coming from, where the pipes are creaking, and what they tell us about what our technology is doing or can potentially do to us.
I’m going to use a load of slightly ham-fisted contemporary narratives to signpost the anxieties that emerge from two personal and increasingly algorithmically mediated spaces: the social network and the home. Where does the role of narrative in magic, the supernatural, and the unknown allow us to get a better grasp of technology’s power over us? Where are the uncertain terrains of our technologies creating the capacity for hauntings, and where can techniques used to imagine future scenarios better equip us for the ghosts to come? When we think of a haunting, we think of unseen forces acting upon our domestic space, and when considering technology, a reappropriation of Clarke’s third law that Tobias Revell summoned with his work on Haunted Machines applies: any sufficiently advanced hacking is indistinguishable from a haunting. But where else are we haunted? (more…)
What does it mean to have access to the internet? It’s an apparently simple question that gets complicated when we consider the wide variety of ways people access the web and products from the web. Indeed, the question is wrapped up in recent debates about zero rating, net neutrality, “the next billion” and numerous initiatives designed to bring people from the developing world online.
At Theorizing the Web this year, I presented research that combined my fieldwork and personal observations in developing-world internet contexts like rural northern Uganda, urban China, and the rural Philippines with emergent research and journalism on the use of sneakernets – the physical transfer of data using devices like USB sticks or Bluetooth-enabled mobile phones – in places like Mali, North Korea, and Cuba. The latter formed the basis for my talk and a recent paper in The New Inquiry, in which I draw on Jan Chipchase’s writing on binary thinking about connectivity and how it ultimately overlooks the vast diversity of ways that people do access the web and its products. (more…)
At the beginning of this month, the ACLU in California released a free mobile app that monitors police violence. The app, called Mobile Justice CA, preserves users’ footage of police encounters. Available on both Apple and Android devices, the app lets users push a large “Record” button to document their own and others’ interactions with police. The content automatically transmits to ACLU servers. The point is to preserve recorded content even if police destroy the recording device and/or delete the video. For instance, the ACLU would have maintained documentation of police detaining residents in an LA neighborhood, even after an officer smashed the cellphone of a witness recording the events.
The ACLU treats transmissions through the app as legal communications and protects the anonymity of the sender. Legal action is only taken upon the sender’s request, but the ACLU maintains the rights to the footage, meaning they can distribute it to media outlets as evidence of injustice. Branches of the ACLU in New York, New Jersey, Oregon, and Missouri have released similar apps.
These apps are significant in their reflection of an increasingly central mode of activism: Sousveillance. They are also reflective of the structural embeddedness of the sousveilling citizen. (more…)
Science, to borrow a phrase from Steven Shapin, is a social process that is “produced by people with bodies, situated in time, space, culture, and society, and struggling for credibility and authority.” This simple fact is difficult to remember in the face of intricate computer generated images and declarative statements in credible publications. Science may produce some of the most accurate and useful descriptions of the world but that does not make it an unmediated window onto reality.
Facebook’s latest published study, which claims that personal choice is more to blame for filter bubbles than the company’s own algorithm, is a stark reminder that science is a deeply human enterprise. Not only does the study contain significant methodological problems, its conclusions run counter to its actual findings. The study and its media coverage have already been expertly critiqued by Zeynep Tufekci, Nathan Jurgenson, and Christian Sandvig, and I won’t repeat their points. Instead I’d like to do a quick review of what the social sciences know about the practice of science, how the institutions of science behave, and how both intersect with social power, class, race, and gender. After reviewing the literature we might also be able to ask how the study of science could have improved Facebook’s research. (more…)
The most crucial thing people forget about social media, and about all technologies, is that certain people with certain politics, insecurities, and financial interests structure them. On an abstract level, yeah, we may all know that these sites are shaped, designed, and controlled by specific humans. But so much of the rhetoric around code, “big” data, and data science research continues to promote the fallacy that the way sites operate is almost natural, that they are simply giving users what they want, which downplays the owners’ interests, role, and responsibility in structuring what happens. The greatest success of “big” data so far has been for those who hold that data to sell their interests as neutral.
Today, Facebook researchers released a report in Science on the flow of ideological news content on their site. “Exposure to ideologically diverse news and opinion on Facebook” by Eytan Bakshy, Solomon Messing, and Lada Adamic (all Facebook researchers) enters the debate over whether social media in general, and Facebook in particular, locks users into a so-called “filter bubble”: seeing only what one wants and is predisposed to agree with, and limiting exposure to outside and conflicting voices, information, and opinions. And just as Facebook’s director of news recently ignored the company’s journalistic role in shaping our news ecosystem, Facebook’s researchers use this paper to minimize their role in structuring what a user sees and posts. I’ve only just read the study, but since the journalism event I’ve been thinking about this larger ideological push as it relates to my ongoing project describing contemporary data science as a sort of neo-positivism. I’d like to put some of my thoughts connecting it all here.
Okay, so. Apple’s iOS8 Health app is an issue, at least potentially.
To recap, it’s an issue in significant part – and for the purposes of this post – in terms of its effect on people who experience disordered eating and/or obsessive-compulsive behaviors and thoughts. Health trackers in general have the potential to do this, and in fact to be quite harmful, primarily because they are highly quantitative in nature and extremely oriented toward the monitoring of details, and obsessive-compulsive tracking is one of the primary symptoms of an eating disorder. The Health app is a focal point for this kind of monitoring. Though it allows for the manual entry of data, its primary purpose is to enable better curation of data from other health apps. But it still exists. In fact, it not only exists; it can’t be removed. It can be hidden, but you – the user – still know it’s there. It will be difficult to ignore even if it can’t be seen. It gnaws. Trust me, things like that do.