On this blog, I typically discuss the intersection of social theory and the changing nature of the Internet (e.g., using Marx, Bourdieu, Goffman, Bauman, Debord and so on). In a chapter of the new third edition of the McDonaldization Reader edited by George Ritzer, I argue that what we are seeing is a general trend towards the deMcDonaldization of the Internet.
The shift from a top-down, centrally conceived and controlled “Web 1.0” to a more user-generated and social “Web 2.0” is a shift away from the dimensions of McDonaldization as Ritzer defines the concept. For example, a corporate-generated website that does not allow user-generated content is paradigmatic of Web 1.0. The site is produced efficiently by a few individuals, making it predictable, controllable, and relatively devoid of outside human input. Web 2.0, alternatively, is not centered on the efficient production of content [I’ve made this argument previously]. User-generated content is, instead, produced by many individuals, making it much less predictable, as evidenced by the random videos we come across on YouTube, the articles on Wikipedia, or, perhaps the best example, the downright capricious and aleatory experience of Chatroulette. The personalization and community surrounding social networking sites are hard to quantify and make the web far more humanized. Thus, Web 2.0 marks a general deMcDonaldization of the web. These points are further illustrated in the chapter.
Finally, further consideration needs to be given to the various ways in which Web 2.0 remains McDonaldized, rationalized, and standardized. Many of the sites that allow for unpredictable user-generated content do so precisely because of their rationalized and standardized (and thus McDonaldized) underlying structure. In many ways, our Facebook profiles all look and behave similarly. The rationalized and standardized structures of Web 2.0 seem to coexist comfortably with the irrational and unpredictable content they facilitate. ~nathanjurgenson.com