technology

In my research on the Dutch banking system, it became clear that banks are seriously worried about social engineering. Techniques such as phishing and identity theft have become increasingly common. No reason for concern, right? Surely a system upgrade, some stronger passwords, or new forms of encryption, and all will be well again. Wrong! When it comes to social engineering, trust in technology is deadly. The solution, in fact, cannot be technological; it must be social.

The term social engineering has been around for decades, but in the last couple of years it has been popularized by the famous social engineer Kevin Mitnick. In the book Social Engineering: The Art of Human Hacking, another famous social engineer, Christopher Hadnagy, defines social engineering as “the act of manipulating a person to take an action that may or may not be in the ‘target’s’ best interest.” This may include obtaining information, gaining computer system access, or getting the target to take a certain action. Mitnick pointed out that instead of hacking into a computer system, it is easier to “hack the human.” While cracking the code is nearly impossible, tricking someone into giving it to you is often relatively easy.

Cyborgology editors Nathan Jurgenson and PJ Rey appeared on WYPR (Baltimore’s NPR affiliate) to discuss technology and the Occupy movement.

The Human Microphone was created by Occupy Wall Street as a way to get around New York City’s ban on amplified sound in Zuccotti Park. In other words, it is a tool–and a form of non-digital technology–designed to facilitate communication and discussion in large crowds. But like any form of technology, its use isn’t confined to what it was originally created to do.

This is Karl Rove being “mic-checked” while delivering a speech at Johns Hopkins on November 14th. It starts about 1:48 in (be aware, there’s a huge jump in volume at that point).


Everybody knows the story: Computers—which, a half century ago, were expensive, room-hogging behemoths—have developed into a broad range of portable devices that we now rely on constantly throughout the day.  Futurist Ray Kurzweil famously observed:

progress in information technology is exponential, not linear. My cell phone is a billion times more powerful per dollar than the computer we all shared when I was an undergrad at MIT. And we will do it again in 25 years. What used to take up a building now fits in my pocket, and what now fits in my pocket will fit inside a blood cell in 25 years.
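Kurzweil’s claim is easy to sanity-check with arithmetic: a billion-fold improvement corresponds to roughly thirty doublings, since 2^30 ≈ 1.07 billion. A minimal Python sketch, assuming a hypothetical one-year doubling period purely for illustration:

```python
# A sketch of Kurzweil's point: if price-performance doubles every
# `doubling_period_years` (an assumed figure for illustration), growth
# compounds multiplicatively rather than adding a fixed amount per year.

def price_performance(years, doubling_period_years=1.0):
    """Relative price-performance after `years`, starting from 1.0."""
    return 2 ** (years / doubling_period_years)

# Thirty doublings already exceed a factor of one billion, the scale of
# the "billion times more powerful per dollar" claim.
print(price_performance(30))  # 1073741824.0
```

The same function with a linear rule (say, adding 2 units per year) would reach only 61 after thirty years, which is the contrast Kurzweil is drawing.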

Beyond advances in miniaturization and processing, computers have become more versatile and, most importantly, more accessible. In the early days of computing, mainframes were owned and controlled by various public and private institutions (e.g., the US Census Bureau drove the development of punch card readers from the 1890s onward). When universities began to develop and house mainframes, users had to submit proposals to justify their access to the machine. They were given a short period in which to complete their task; then the machine was turned over to the next person. In short, computers were scarce, so access was limited.

All photos in this post by Nathan Jurgenson.

The role of new social media in the Occupy protests near Wall Street, around the country, and even around the globe is something I’ve written about before. I spent some time at Occupy Wall Street last week and talked to many folks there about technology. The story that emerged is much more complicated than expected. OWS has a more complicated, perhaps even “ironic,” relationship with technology than I previously thought, and than is often portrayed in the news and in everyday discussions.

It is easy to think of the Occupy protests as a bunch of young people who all blindly utilize Facebook, Twitter, SMS, digital photography, and so on. And this is partially true. However, (1) not everyone at Occupy Wall Street is young; and (2) the role of technology is certainly not centered on the new, the high-tech, or social media. At OWS, there is a focus on retro and analogue technologies; moving past a cultural fixation on the high-tech, OWS has opened a space for the low-tech.

What I want to think about here is the general Occupy Wall Street culture, which has mixed feelings about new technologies, even electricity itself. I will give examples of the embrace of retro technology at OWS and consider three overlapping explanations for why this might be the case. I will also make use of some photographs I took while there.

One of the things I find most striking about discussion of technology’s “place” in schools is that adults treat technology as if it were a hot-potato bomb tossed around among young people. In some senses, I think it is a bit of a ticking bomb: when used in schools, new technologies show that society’s norms about their “appropriate” use are still being formalized. Moreover, when new technologies are used in the classroom, they reveal how both teacher authority and the construction of childhood are themselves unstable – schools are charged not only with enforcing appropriate use of technologies, but also with maintaining that they offer an ideal learning environment for children. In classical sociologist Max Weber’s terms, schools’ current use of technology reveals cracks in teacher legitimacy, fueling a panic whereby parents and teachers suggest these technologically infused settings are contrary to the needs of young people.

In a recent series of op-eds in the New York Times, Greg Simon argues that a Silicon Valley Waldorf School, one of a number of esteemed and very expensive K-12 schools here in the U.S., is a model for education because it privileges creativity and imagination over the infusion of technology into classroom instruction. For Simon, technology and childhood are dichotomous entities: technology serves only to debase kids’ need for free-spirited play. Moreover, because computer images, games, and ubiquitous technology dominate the adult world, they serve as distractions to children and cannot “fit” in schools.

This is the second of a two-part series dedicated to answering the question “Do we need a new World’s Fair?” It is an honest question that I do not have an answer to. What I aim to do here is share my thoughts on the subject and present historical data on what these sorts of events have done in the past. The first part explored what previous World’s Fairs accomplished and what we must certainly avoid. This second part investigates what a new 21st-century fair might look like, and how it would help our economy.

Our Generation's Only Exposure to the Concept of the World's Fair. (Copyright Paramount Pictures and Marvel)

Yesterday we looked at the last few World’s Fairs held in the United States. Those 20th-century fairs demonstrated technologies that today we take for granted as commonplace. Everything from Juicy Fruit gum to fluorescent lighting was introduced to the world through these massive fairs. World Expos still take place, but are now found in China, Japan, and South Korea. The 2012 expo will be held in Yeosu, South Korea. The latest World’s Fair, Expo 2010, was held in Shanghai, China, and set historic records as the largest and best-attended expo. But the success of the Shanghai Expo doesn’t quite translate to America’s shores. As The Atlantic’s Adam Minter wrote last year:

To American ears, the concept of a World’s Fair sounds archaic, and when applied to Shanghai, a contemporary symbol of all that is new, vibrant, and even threatening, it’s disconcerting. But in Shanghai, where the future is an obsession, this reported $46 billion hat-tip to the past makes perfect sense: just as New York once announced its global pre-eminence via World’s Fairs in 1939 and, again, in 1964, the organizers of Expo 2010 view the six-month event as nothing less than Shanghai’s coronation as the next great world city.

This is the first of a two-part series dedicated to answering the question “Do we need a new World’s Fair?” It is an honest question that I do not have an answer to. What I aim to do here is share my thoughts on the subject and present historical data on what these sorts of events have done in the past. In this first part, I explore what previous World’s Fairs have accomplished and what we must certainly avoid. The second part will investigate what a new 21st-century fair might look like, and how it would help our economy.

Electrical Building
By Charles S. Graham (1852–1911). Printed by Winters Art Litho. Co. (Public domain c/o Wikipedia.)

A World’s Fair is, first and foremost, a grand gesture. Fairs typically run for months, if not a few years. Think of them as temporary theme parks, or the Olympics of technological innovation. They are extravagant, optimistic, and brash. But let’s be clear here: all of the World’s Fairs held in Paris, Chicago, New York, and Seattle had sections that are deeply troubling. The 19th-century fairs had human zoos and “freak shows.” The 20th-century fairs were, in many ways, launchpads for the corporate takeover of the public realm and the plundering of the very cities that hosted them (more on that later). But that does not mean the form is totally useless or inherently bad. In fact, a new American World’s Fair might be just what we need.

Scientists in London are working on an oral (rather than topical) form of sunscreen. Specifically, they are synthesizing sun-shielding properties from coral to work in the body as a defense for human skin against the sun’s damaging rays. This truly is a cyborg technology in Donna Haraway’s use of the term: a melding of human, animal, and machine. By ingesting the pill, human biology is altered. This alteration is caused by the body’s interaction with, and enmeshment in, the biology of coral, an animal that was itself altered and synthesized using machine technology in a lab. Beyond being a nice illustration of Haraway, this coral-based, human-altering technology reminds us that “nature” and “technology” are not mutually exclusive.

QR codes line the bulletin boards of many college campuses.

Lisa Wade over at our sister blog Sociological Images sent us an email from one of her readers, Steve Grimes, who shared this image and some interesting thoughts about how Quick Response (QR) codes can contribute to inequality. That is, QR codes such as these serve to make certain content and information “exclusive” to those who have smartphones. He states,

There is a general thinking that technology can create a level playing field (an example of this can be seen with popular feelings about the internet). However, technology also has a great ability to create and widen gaps of inequality.

In a practical sense, the company may be looking for students who are tech savvy. They may also want to save on ink toner (which might be a stretch), so using the matrix barcode may serve that purpose. However, the ad also shows how technology can exclude individuals; primarily, in this case, students without smartphones. One might think that on a college campus every student would have a smartphone. However, when you look at the prices of most smartphones, along with the prices of carrier plans (usually somewhere between $75 and $150 per month), one can see that not every student may have one, especially considering the other, more pressing things they may have to pay for (books, food, etc.).
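Grimes’s cost point can be made concrete with quick arithmetic. Using the post’s own $75–150 per month figures (and an assumed twelve-month plan, for illustration), the recurring cost of a smartphone alone spans roughly $900 to $1,800 a year, before the price of the handset:

```python
# Back-of-the-envelope yearly cost of a smartphone plan, using the
# $75-150/month range quoted in the post (12 months is assumed).

def annual_plan_cost(monthly_cost, months=12):
    """Total recurring cost of a phone plan over `months`."""
    return monthly_cost * months

print(annual_plan_cost(75), annual_plan_cost(150))  # 900 1800
```

Set against a student budget that also has to cover books and food, that range helps explain why content published only behind a QR code is not equally available to everyone on campus.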