A couple of weeks ago, in my Social Issues in Qualitative Methodology course, I was assigned to give a presentation on the “technologies of interviewing.” At first, older cohort members told me I was lucky because I had the easiest topic: “Just do the history of the recorder.” When I googled the topic, expecting some cool history and development, I found that my predecessors had simply assembled a timeline of photos showing how the recorder has changed over time. How boring! Who would want to sit through a 20-minute lecture, slide after slide, about the recorder, especially when we’re supposed to be talking about the social issues involved in qualitative methods?
My advice to you today, graduate students, is to avoid this common pitfall in your methods classes (as both student and instructor): revamp your lessons so they are of some actual use! Below I offer an example of how I revamped this “simple and easy topic” into something that students can actually use and learn from.
As a sociologist of technology, when ‘A’ claims they invented ‘X’, my default position is often scepticism – as it was when I saw the picture above. I understand why Sir Tim Berners-Lee and Vint Cerf would claim they invented the World Wide Web and the Internet respectively. The status of inventor gives each of them a public plinth from which to discuss how they think digital technology should be mobilised for the benefit of society. Examining their claims to fame is not an attempt to debunk these men’s status: it is an exercise to show that technology never emerges in isolation. The sociology of technology tells us that the invention and development of the Web and the Internet, like all technology, has to be understood within a broader social context involving networks of people and technology as well as cultural values. I can’t do this statement justice here. By applying this logic to the t-shirts in the picture above, however, I can begin to show the value of the sociology of technology. (more…)
‘They’, we are told, are prime movers we can observe to spot future trends, like rejecting Facebook. ‘They’ are doing something problematic or exotic, different to ‘us’: sexting, for example, or hacking. Or ‘they’ are being brainwashed and radicalised by the Internet. ‘They’ are teenagers. We are not fixated on other social groups, such as pensioners, in this way. What lies behind our obsession with teenagers online? (more…)
By Iconshock [CC-BY-SA-3.0 (http://creativecommons.org/licenses/by-sa/3.0)], via Wikimedia Commons
Over the past few months, numerous publications have discussed – and mostly dismissed – the trend of incorporating so-called trigger warnings into the college classroom and syllabus. Trigger warnings have become standard practice for articles in feminist blogs and other online media that discuss incidences of violence, sexual assault, or other potentially ‘triggering’ material, with the purpose of giving readers a way to opt out of exposing themselves to said material. As some college professors have started to incorporate this practice into their classrooms in order to warn students of potentially ‘triggering’ material – and some colleges have even discussed adopting trigger warning policies – the public reaction has been mostly negative. However, it is my position that most of these commentators have it backward and misunderstand what trigger warnings are about and what they can do. Granted, there are examples of very poorly done trigger warnings out there that can easily be taken as evidence for some of the critics’ fears. Still, I believe trigger warnings can and should have a place in the sociology classroom, where they can play a positive and productive pedagogical role.
Police departments across the country are rapidly increasing their technological capacity in order to become a more efficient and effective force. These technologies range from new weapons and wide-area surveillance to facial recognition software, closed-circuit television cameras, and crime mapping software. Each of these technologies is oriented towards identifying offenders and preventing or intervening in crime incidents. Police technology has become a multi-billion dollar industry, with vendors regularly contacting departments to sell them the next great tool. One technology becoming increasingly popular with the police is the body camera. Officers are beginning to wear small cameras on their shirts, hats, or sunglasses in order to capture interactions with citizens. Body cameras are one of the few technologies adopted by the police that focus on limiting police behavior: they are thought to reduce police deviance and increase police professionalism by monitoring officers’ actions (Ariel & Farrar, 2014). Wearing a body camera makes officers more aware of their own behavior and acts as a deterrent against police misconduct. Multiple research studies indicate that video technology alters the behavior of offenders (Chartrand & Bargh, 1999; IACP, 2004). (more…)
This week BBC News asked “can wearable tech make us more productive?” The news package covered a research project whose broader purpose is to investigate the impact of wearable connected tech on every aspect of our lives. The umbrella term that (albeit loosely) confederates connected technology is the ‘Internet of Things’. Its advocates believe the Internet of Things is one of the most compelling ideas of the twenty-first century. The original definition of the Internet of Things referred to inanimate objects that carried an electronic product code so they could be inventoried. Now, thanks to IPv6 (which provides 3.4×10³⁸ addresses on the Internet), as utility (or the market) demands it, all our everyday objects such as TVs, microwave ovens and cars can be allocated an address on the Internet, offering the potential to transmit and receive digital data. However, an IP address is not a prerequisite of the Internet of Things. The term can also refer to devices that have the potential to produce digital data for the Internet. This includes technologies of the ‘quantified self’, such as the GPS-enabled sports watch I use. (more…)
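The IPv6 figure above is easy to verify: it comes directly from the protocol’s 128-bit addresses. A minimal Python sketch, using only the standard library’s `ipaddress` module, confirms the arithmetic and contrasts it with IPv4’s 32-bit space:

```python
import ipaddress

# IPv4 uses 32-bit addresses; IPv6 uses 128-bit addresses.
# num_addresses counts every address in the given network.
ipv4_space = ipaddress.ip_network("0.0.0.0/0").num_addresses  # 2**32
ipv6_space = ipaddress.ip_network("::/0").num_addresses       # 2**128

print(f"IPv4: {ipv4_space:.2e} addresses")  # ~4.29e+09 (about 4.3 billion)
print(f"IPv6: {ipv6_space:.2e} addresses")  # ~3.40e+38, the figure cited above
```

The jump from roughly four billion to 3.4×10³⁸ addresses is what makes it practical, at least in principle, to give every TV, oven and car its own place on the Internet.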
By movie studio [Public domain], via Wikimedia Commons.
In January, President Obama became the latest in a long list of politicians and high-profile public figures to take a shot at academic disciplines perceived to be ‘useless’ from a labor market perspective. Talking about manufacturing and job training, Obama (who has since apologized for his remarks) said: “I promise you, folks can make a lot more, potentially, with skilled manufacturing or the trades than they might with an art history degree.”
Such attacks on disciplines, fields and degrees that do not tie in directly to what is perceived to be the workplace of today and tomorrow are nothing new. North Carolina Governor Pat McCrory made similar, albeit much more explicit and vicious, remarks about higher education just last year, lashing out against the (inter)discipline of women’s and gender studies: “If you want to take gender studies that’s fine. Go to a private school, and take it. But I don’t want to subsidize that if that’s not going to get someone a job.”
These and similar remarks point to two related notions that dominate the debate about (higher) education: 1. the idea of a “skills gap” – that is, the idea that workers and college graduates do not possess the right skills to fill vacant jobs in growing economic sectors; and 2. the idea that some academic disciplines are simply useless pursuits, as they do not help graduates secure employment. But do these ideas have any empirical grounding?
Yesterday, I read a disturbing article in Adweek. “Powerful Ads Use Real Google Searches to Show the Scope of Sexism Worldwide” by David Griner explores a new campaign from UN Women, which used real suggested search terms from Google’s autocomplete feature. The ads, designed by art director and graphic designer Christopher Hunt, illustrate how gender inequality continues to be so problematic that even Google has come to expect it.
Being a sociologist, I was interested in the reliability of the Google experiment, so I conducted my own Google search of the phrases used in the project. My findings were not exactly the same as the UN Women campaign’s, but I was saddened to see so many misogynistic phrases pop up on my screen. I found that “women shouldn’t…” work, be in combat, or be cops. Meanwhile, “women should…” know their place, not preach, not speak in church, and not be in combat. Google autocomplete told me that “women need to…” shut up, grow up, feel wanted, and feel safe. And finally, “women cannot…” be trusted, be pastors, have it all, or teach men.
Source: Above the Law
Last week, a survey of 1,300 incoming freshmen at Harvard University found that 42 percent of respondents had cheated on a homework assignment or problem set before starting at Harvard. The study led to countless editorial pieces with provocative titles such as “Welcome to Harvard, Cheaters of 2017” and “More Incoming Harvard Students Have Cheated On Their Homework Than Had Sex.” However, what the study and related articles did not discuss was how “cheating” was defined, both by those conducting the survey and by those responding; how technology has complicated the definition of “cheating”; and whether academic institutions, as well as the larger social structure, need to rethink academic ethics in today’s changing and advancing digital society.
Last week, NPR ran a series on American public libraries. The series immediately filled me with a sentimental wave of nostalgia. One of my earliest memories is packing up my teddy bear and a sack lunch and driving to the Lincoln Township Public Library for their annual Teddy Bear Picnic. I also remember taking a writing workshop at the library in second or third grade, which led to my first publication, “Friends of the Sea.” As an undergraduate, I spent countless hours at the Harold Washington Library in Chicago studying or perusing the library’s extensive collection of foreign films. And a staff member at the Boston Public Library in Copley Square helped me navigate the IRS website when income tax forms first transitioned from a paper to an online format.
The NPR series was more than just interesting. It led me to think about the ways the public library has been important in my life, and to consider what libraries teach all of us about functioning in society. Are public libraries a hidden social institution?