Recently there’s been heightened attention to calling out microaggressions and giving trigger warnings. I recently speculated that the loudest voices making these demands come from people in categories that have gained in power but are still not dominant, notably women at elite universities. What they’re saying in part is, “We don’t have to take this shit anymore.” Or as Bradley Campbell and Jason Manning put it in a recent article in The Chronicle:

…offenses against historically disadvantaged social groups have become more taboo precisely because different groups are now more equal than in the past.

It’s nice to have one’s hunches seconded by scholars who have given the issue much more thought.

Campbell and Manning make the context even broader. The new “plague of hypersensitivity” (as sociologist Todd Gitlin called it) isn’t just about a shift in power, but about a wider cultural transformation from a “culture of dignity” to a “culture of victimhood.” More specifically, the aspect of culture they are talking about is social control. How do you get other people to stop doing things you don’t want them to do – or not do them in the first place?

In a “culture of honor,” you take direct action against the offender.  Where you stand in society – the rights and privileges that others accord you – is all about personal reputation (at least for men). “One must respond aggressively to insults, aggressions, and challenges or lose honor.” The culture of honor arises where the state is weak or is concerned with justice only for some (the elite). So the person whose reputation and honor are at stake must rely on his own devices (devices like duelling pistols).  Or in his pursuit of personal justice, he may enlist the aid of kin or a personalized state-substitute like Don Corleone.

In more evolved societies with a more extensive state, honor gives way to “dignity.”

The prevailing culture in the modern West is one whose moral code is nearly the exact opposite of that of an honor culture. Rather than honor, a status based primarily on public opinion, people are said to have dignity, a kind of inherent worth that cannot be alienated by others. Dignity exists independently of what others think, so a culture of dignity is one in which public reputation is less important. Insults might provoke offense, but they no longer have the same importance as a way of establishing or destroying a reputation for bravery. It is even commendable to have “thick skin” that allows one to shrug off slights and even serious insults, and in a dignity-based society parents might teach children some version of “sticks and stones may break my bones, but words will never hurt me” – an idea that would be alien in a culture of honor.

The new “culture of victimhood” has a different goal – cultural change. Culture is, after all, a set of ideas that is shared, usually so widely shared as to be taken for granted. The microaggression debate is about insult, and one of the crucial cultural ideas at stake is how the insulted person should react. In the culture of honor, he must seek personal retribution. In doing so, of course, he is admitting that the insult did in fact sting. The culture of dignity also focuses on the character of offended people, but here they must pretend that the insult had no personal impact. They must maintain a Jackie-Robinson-like stoicism even in the face of gross insults and hope that others will rise to their defense. For smaller insults, say Campbell and Manning, the dignity culture “would likely counsel either confronting the offender directly to discuss the issue,” which still keeps things at a personal level, “or better yet, ignoring the remarks altogether.”

In the culture of victimhood, the victim’s goal is to make the personal political.  “It’s not just about me…”  Victims and their supporters are moral entrepreneurs. They want to change the norms so that insults and injustices once deemed minor are now seen as deviant. They want to define deviance up.  That, for example, is the primary point of efforts like the Microaggressions Project, which describes microaggressions in exactly these terms, saying that microaggression “reminds us of the ways in which we and people like us continue to be excluded and oppressed” (my emphasis).

So, what we are seeing may be a conflict between two cultures of social control: dignity and victimhood. It’s not clear how it will develop. I would expect that those who enjoy the benefits of the status quo and none of its drawbacks will be most likely to resist the change demanded by a culture of victimhood. It may depend on whether shifts in the distribution of social power continue to give previously more marginalized groups a louder and louder voice.

Cross-posted at Montclair SocioBlog.

Jay Livingston is the chair of the Sociology Department at Montclair State University. You can follow him at Montclair SocioBlog or on Twitter.

I was on jury duty this week, and the greatest challenge for me was the “David Brooks temptation” to use the experience to expound on the differences in generations and the great changes in culture and character that technology and history have brought.

I did my first tour of duty in the 1970s. Back then you were called for two weeks. Even if you served on a jury, after that trial ended, you went back to the main jury room. If you were lucky, you might be released after a week and a half. Now it’s two days.

What struck me most this time was the atmosphere in the main room. Now, nobody talks. You’re in a large room with maybe two hundred people, and it’s quieter than a library. Some are reading newspapers or books, but most are on their laptops, tablets, and phones. In the 1970s, it wasn’t just that there was no wi-fi; there was no air conditioning. Remember “12 Angry Men”? We’re in the same building. Then, you tried to find others to talk to. Now you try to find a seat near an electric outlet to connect your charger.

I started to feel nostalgic for the old system. People nowadays – all in their own narrow, solipsistic worlds, nearly incapable of ordinary face-to-face sociability. And so on.

But the explanation was much simpler. It was the two-day hitch. In the old system, social ties didn’t grow from strangers seeking out others in the main jury room. It happened when you went to a courtroom for voir dire. You were called down in groups of forty. The judge sketched out the case, and the lawyers interviewed the prospective jurors. From their questions, you learned more about the case, and you learned about your fellow jurors – neighborhood, occupation, family, education, hobbies. You heard what crimes they’d been a victim of. When the judge called a break for bathroom or lunch or some legal matter, you could find the people you had something in common with. And you could talk with anyone about the case, trying to guess what the trial would bring. If you weren’t selected for the jury, you went back to the main jury room, and you continued the conversations there. You formed a social circle that others could join.

This time, on my first day, there were only two calls for voir dire, the clerk as bingo-master spinning the drum with the name cards and calling out the names one by one. My second day, there were no calls. And that was it. I went home having had no conversations at all with any of my fellow jurors. (A woman seated behind me did say, “Can you watch my laptop for a second?” when she went to the bathroom, but I don’t count that as a conversation.)

I would love to have written 800 words here on how New York character had changed since the 1970s.  No more schmoozing. Instead we have iPads and iPhones and MacBooks destroying New York jury room culture – Apple taking over the Apple. People unable or afraid to talk to one another because of some subtle shift in our morals and manners. Maybe I’d even go for the full Brooks and add a few paragraphs telling you what’s really important in life.

But it was really a change in the structure. New York expanded the jury pool by eliminating most exemptions. Doctors, lawyers, politicians, judges – they all have to show up. As a result, jury service is two days instead of two weeks, and if you actually are called to a trial, once you are rejected for the jury or after the trial is over, you go home.

The old system was sort of like the pre-all-volunteer army. You get called up, and you’re thrown together with many kinds of people you’d never otherwise meet. It takes a chunk of time out of your life, but you wind up with some good stories to tell. Maybe we’ve lost something. But if we have lost valuable experiences, it’s because of a change in the rules, in the structure of how the institution is run, not because of a change in our culture and character.

Cross-posted at Montclair SocioBlog.

Jay Livingston is the chair of the Sociology Department at Montclair State University. You can follow him at Montclair SocioBlog or on Twitter.

In the aftermath of Dylann Roof’s racist murder, some cities in the South are reconsidering their relationship to the Confederate Flag. Should it fly? Be in a museum? Burn? The discussion raises larger questions of how to move forward from ugly histories without simultaneously whitewashing a city’s past. And, as well, how do we know when something is truly in our past?

I was thinking about just these questions a couple weeks ago when a friend of mine walked me by the monument to the Crescent City White League in New Orleans. The conical stone was erected to commemorate the return of white supremacist government two years after a lethal insurrection against the Reconstruction state government in 1874. In that insurrection, thousands of former Confederate soldiers attacked the city police and state military. They killed 11 members of the NOPD and held city government buildings for three days before federal troops arrived and they fled.

Two years later, the white supremacist politicians were back in power and they placed the monument in a prominent place where Canal St. meets the Mississippi. The monument, to be clear, is in honor of cop-killing white supremacists.

Here it is in 1906 (source, photographer unknown):

So, what to do with the thing?

In 1974 — one hundred years after the insurrection and 98 years after its erection — the city added a marker nearby distancing itself from the message of white supremacy. It read:

Although the “battle of Liberty Place” and this monument are important parts of the New Orleans history, the sentiments in favor of white supremacy expressed thereon are contrary to the philosophy and beliefs of present-day New Orleans.

In 1993, some of the original inscriptions were removed and replaced with this slightly more politically correct comment:

In honor of those Americans on both sides who died in the Battle of Liberty Place. … A conflict of the past that should teach us lessons for the future.

It was also moved to a new location. Today it sits between a flood wall, a parking lot, and an electrical substation. If you wanted to give a monument the finger, this is one way to do it. Here’s how it looks on Google Maps streetview:

So, the question is: What to do with these things?

I’ll admit that seeing the monument tucked into an unpleasant corner of New Orleans was somehow satisfying. But I was also uneasy about its displacement. Is this an example of New Orleans trying to repress knowledge of its racist history? (And present?) Or is it a sign that the city actively rejects the values represented by the monument? Conversely, if the city had left the monument at the foot of Canal St. would this be a sign that it took history seriously? And, thus, responsibility for its past? Or a sign that it didn’t take an anti-racist stance seriously enough?

This seems like an obviously difficult call to make, but I’m glad that we’re using the horror of Roof’s massacre to begin a discussion about how to handle symbols like these and, maybe, truly make them a part of our past.

Cross-posted at A Nerd’s Guide to New Orleans.

Lisa Wade is a professor of sociology at Occidental College and the co-author of Gender: Ideas, Interactions, Institutions. You can follow her on Twitter and Facebook.

One of the important conversations that has begun in the wake of Dylann Roof’s racist murder in South Carolina has to do with racism among members of the Millennial generation. We’ve placed a lot of faith in this generation to pull us off our racist path, but Roof’s actions may help remind us that racism will not go away simply by the passing of time.

In fact, data from the General Social Survey — one of the most trusted social science data sets — suggests that Millennials are failing to make dramatic strides toward a non-racist utopia. Scott Clement, at the Washington Post, shows us the data. Attitudes among white millennials are statistically identical to those of whites in Generation X and hardly different from those of Baby Boomers on most measures. White millennials are about as likely as whites in Generation X:

  • to think that blacks are lazier or less hardworking than whites
  • to think that blacks have less motivation than whites to do well
  • to oppose living in a neighborhood that is 50% or more black
  • to object if a relative marries a black person

And they’re slightly more likely than white members of Generation X to think that blacks are less intelligent than whites. So much for a Millennial rescue from racism.

All in all, white millennial attitudes are much more similar to those of older whites than they are to those of their peers of color.

***

At PBS, Mychal Denzel Smith argues that we are reaping the colorblindness lessons that we’ve sowed. Millennials today may think of themselves as “post-racial,” but they’ve learned none of the skills that would allow them to get there. Smith writes:

Millennials are fluent in colorblindness and diversity, while remaining illiterate in the language of anti-racism.

They know how to claim that they’re not racist, but they don’t know how to recognize when they are and they’re clueless as to how to actually change our society for the better.

So, thanks to the colorblindness discourse, white Millennials are quick to see racism as race-neutral. In one study, for example, 58% of white millennials said they thought that “reverse racism” was as big a problem as racism.

Smith summarizes the problem:

For Millennials, racism is a relic of the past, but what vestiges may still exist are only obstacles if the people affected decide they are. Everyone is equal, they’ve been taught, and therefore everyone has equal opportunity for success. This is the deficiency found in the language of diversity. … Armed with this impotent analysis, Millennials perpetuate false equivalencies, such as affirmative action as a form of discrimination on par with Jim Crow segregation. And they can do so while not believing themselves racist or supportive of racism.

Lisa Wade is a professor of sociology at Occidental College and the co-author of Gender: Ideas, Interactions, Institutions. You can follow her on Twitter and Facebook.

Saturday night, I went to the 7:30 showing of “Me and Earl and the Dying Girl.” The movie had just opened, so I went early. I didn’t want the local teens to grab all the good seats – you know, that thing where maybe four people from the group are in the theater but they’ve put coats, backpacks, and other place markers over two dozen seats for their friends, who eventually come in five minutes after the feature has started.

That didn’t happen. The theater (the AMC on Broadway at 68th St.) was two-thirds empty (one-third full if you’re an optimist), and there were no teenagers. Fox Searchlight, I thought, is going to have to do a lot of searching to find a big enough audience to cover the $6 million they paid for the film at Sundance. The box office for the first weekend was $196,000, which put it behind 19 other movies.

But don’t write off “Me and Earl” as a bad investment. Not yet. According to a story in Variety, Searchlight is looking for “Me and Earl” to be to the summer of 2015 what “Napoleon Dynamite” was to the summer of 2004. Like “Napoleon Dynamite,” “Me and Earl” was a festival hit but with no established stars and a debut director (though Gomez-Rejon has done television – several “Glees” and “American Horror Storys”). “Napoleon” grossed only $210,000 its first week, but its popularity kept growing – slowly at first, then more rapidly as word spread – eventually becoming a cult classic. Searchlight is hoping that “Me and Earl” follows a similar path.

The other important similarity between “Napoleon” and “Earl” is that both were released in the same week as a Very Big Movie – “Harry Potter and the Prisoner of Azkaban” in 2004, “Jurassic World” last weekend. That too plays a part in how a film catches on (or doesn’t).

In an earlier post I graphed the growth in cumulative box office receipts for two movies – “My Big Fat Greek Wedding” and “Twilight.”  The shapes of the curves illustrated two different models of the diffusion of ideas.  In one (“Greek Wedding”), the influence came from within the audience of potential moviegoers, spreading by word of mouth. In the other (“Twilight”), impetus came from outside – highly publicized news of the film’s release hitting everyone at the same time. I was working from a description of these models in sociologist Gabriel Rossman’s Climbing the Charts.

You can see these patterns again in the box office charts for the two movies from the summer of 2004 – “Harry Potter/Azkaban” and “Napoleon Dynamite.” (I had to use separate Y-axes in order to get David and Goliath on the same chart; data from BoxOfficeMojo.)

“Harry Potter” starts huge, but after the fifth week the increase in total box office tapers off quickly. “Napoleon Dynamite” starts slowly. But in its fifth or sixth week, its box office numbers are still growing, and they continue to increase for another two months before finally dissipating. The convex curve for “Harry Potter” is typical where the forces of influence are “exogenous.” The more S-shaped curve of “Napoleon Dynamite” usually indicates that an idea is spreading within the system.
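
To make the contrast concrete, here is a minimal sketch (my own toy illustration, not Rossman’s analysis and not the films’ actual numbers) of how the two shapes arise in a simple Bass-style diffusion model: a constant external hazard p stands in for exogenous publicity, while a word-of-mouth coefficient q stands in for endogenous, person-to-person spread.

```python
# Toy illustration only: cumulative adoption under purely exogenous influence
# (constant external hazard p, no word of mouth) versus purely endogenous
# influence (word-of-mouth coefficient q), in the spirit of the Bass model.
import numpy as np

def bass_cumulative(p, q, m=1.0, weeks=30):
    """Cumulative share of adopters F(t) in a discrete-time Bass-style model."""
    F = np.zeros(weeks)
    for t in range(1, weeks):
        # Hazard of adopting this week: external pull (p) plus
        # internal word of mouth (q times the share who already adopted).
        hazard = p + q * F[t - 1] / m
        F[t] = F[t - 1] + hazard * (m - F[t - 1])
    return F

exogenous = bass_cumulative(p=0.25, q=0.0)    # starts fast, tapers off quickly
endogenous = bass_cumulative(p=0.005, q=0.5)  # starts slowly, then accelerates

for wk in (1, 5, 10, 15, 20, 25):
    print(f"week {wk:2d}  exogenous {exogenous[wk]:.2f}  endogenous {endogenous[wk]:.2f}")
```

With q set to zero, the cumulative curve jumps early and then levels off, the “Harry Potter” pattern; with p near zero and q large, it creeps along before accelerating, the S-shaped “Napoleon Dynamite” pattern.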

But the Napoleon curve is not purely the work of the internal dynamics of word-of-mouth diffusion. The movie distributor plays an important part through its decisions about how to market the film – especially when and where to release it. The same is true of “Harry Potter.”

The Warner Bros. strategy for “Harry Potter” was to open big – in theaters all over the country. In some places, two or more of the screens at the multi-plex would be running the film. After three weeks, the movie began to disappear from theaters, going from 3,855 screens in week #3 to 605 screens in week #9.

“Napoleon Dynamite” opened in only a small number of theaters – six to be exact.  But that number increased steadily until by week #17, it was showing in more than 1,000 theaters.

It’s hard to separate the exogenous forces of the movie business from the endogenous word-of-mouth – the biz from the buzz.  Were the distributor and theater owners responding to an increased interest in the movie generated from person to person? Or were they, through their strategic timing of advertising and distribution, helping to create the film’s popularity? We can’t know for sure, but probably both kinds of influence were happening. It might be clearer when the economic desires of the business side and the preferences of the audience don’t match up, for example, when a distributor tries to nudge a small film along, getting it into more theaters and spending more money on advertising, but nobody’s going to see it. This discrepancy would clearly show the influence of word-of-mouth; it’s just that the word would be, “don’t bother.”

Cross-posted at Montclair SocioBlog.

Jay Livingston is the chair of the Sociology Department at Montclair State University. You can follow him at Montclair SocioBlog or on Twitter.

Flashback Friday.

Back when I was in high school and college, I learned that one of the major things that separated humans from other species was culture. The ability to develop distinct ways of living that include an understanding of symbols, language, and customs unique to the group was a specifically human trait.

And, ok, so it turned out that other species had more complex communication systems than we thought they did, but still, other animals were assumed to behave according to instinct, not community-specific cultures.

But as with so many things humans have been convinced we alone possess, it’s turning out that other species have cultures, too. One of the clearest examples is the division of orcas into two groups with distinct customs and eating habits; one eats mammals while the other is pescetarian, eating only fish. Though the two groups regularly come in contact with each other in the wild, they do not choose to intermingle or mate with one another. Here’s a video:

 

Aside from the obvious implications for our understanding of culture, this brings up an issue in terms of conservation. Take the case of orcas. Some are suggesting that they should be on the endangered species list because the population has declined. What do we do if it turns out at some point that, while the overall orca population is not fully endangered, one of the distinct orca cultural groups is? Is it enough that killer whales still exist, or do we need to think of the cultures separately and try to preserve sufficient numbers of each? In addition to being culturally different, they are functionally non-interchangeable: each group has a different effect on food chains and ecosystems.

Should conservation efforts address not just keeping the overall population alive and functioning, but ensure that the range of cultural diversity within a species is protected? If this situation occurred, should we declare one orca culture as endangered but not the other? Are both ecological niches important?

I love these questions. If we recognize that creatures can have cultures, it challenges our sense of self, but also brings significantly more complexity to the idea of wildlife preservation.

Originally posted in 2010.

Gwen Sharp is an associate professor of sociology at Nevada State College. You can follow her on Twitter at @gwensharpnv.

Mr. Draper, I don’t know what it is you really believe in but I do know what it feels like to be out of place, to be disconnected, to see the whole world laid out in front of you the way other people live it. There’s something about you that tells me you know it too.

Mad Men, Season 1, Episode 1

The ending of Mad Men was brilliant. It was like a good mystery novel: once you know the solution – Don Draper creating one of the greatest ads in Madison Avenue history – you see that the clues were there all along.  You just didn’t realize what was important and what wasn’t. Neither did the characters. This was a game played between Matt Weiner and the audience.

The ending, like the entire series, was also a sociological commentary on American culture. Or rather, it was an illustration of such a commentary. The particular sociological commentary I have in mind is Philip Slater’s Pursuit of Loneliness, published in 1970, the same year that this episode takes place. It’s almost as if Slater had Don Draper in mind when he wrote the book, or as if Matt Weiner had the book in mind when he wrote this episode.

In the first chapter, “I Only Work Here,” Slater outlines “three human desires that are deeply and uniquely frustrated by American culture”:

(1) the desire for community – the wish to live in trust, cooperation, and friendship with those around one.

(2) the desire for engagement – the wish to come to grips directly with one’s social and physical environment.

(3) the desire for dependence – the wish to share responsibility for the control of one’s impulses and the direction of one’s life.

The fundamental principle that gives rise to these frustrations is, of course, individualism.

Individualism is rooted in the attempt to deny the reality of human interdependence. One of the major goals of technology in America is to “free” us from the necessity of relating to, submitting to, depending upon, or controlling other people. Unfortunately, the more we have succeeded in doing this, the more we have felt disconnected, bored, lonely, unprotected, unnecessary, and unsafe.

Most of those adjectives could apply to Don Draper at this point. In earlier episodes, we have seen Don, without explanation, walk out of an important meeting at work and, like other American heroes, light out for the territory, albeit in a new Cadillac. He is estranged from his family. He is searching for something – at first a woman, who turns out to be unattainable, and then for… he doesn’t really know what. He winds up at Esalen, where revelation comes from an unlikely source, a nebbishy man named Leonard. In a group session, Leonard says:

I’ve never been interesting to anybody. I, um –  I work in an office. People walk right by me. I know they don’t see me. And I go home and I watch my wife and my kids. They don’t look up when I sit down…

I had a dream. I was on a shelf in the refrigerator. Someone closes the door and the light goes off. And I know everybody’s out there eating. And then they open the door and you see them smiling. They’re happy to see you but maybe they don’t look right at you and maybe they don’t pick you. Then the door closes again. The light goes off.

People are silent, but Don gets up, slowly moves towards Leonard and tearfully, silently, embraces him.

On the surface, the two men could not be more different. Don is interesting. And successful. People notice him. But he shares Leonard’s sense that his pursuit – of a new identity, of career success, of unattainable women – has left him feeling inauthentic, disconnected, and alone. “I’ve messed everything up,” he tells his sometime co-worker Peggy in a phone conversation. “I’m not the man you think I am.”

The next time we see him, he is watching from a distance as people do tai-chi on a hilltop.

And then he himself is sitting on a hilltop, chanting “om” in unison with a group of people. At last he is sharing something with others rather than searching for ego gratifications.

And then the punch line. We cut to the Coke hilltop ad with its steadily expanding group of happy people singing in perfect harmony. A simple product brings universal community (“I’d like to buy the world a Coke and keep it company”). It also brings authenticity. “It’s the real thing.” Esalen and Coca-Cola. Both are offering solutions to the frustrated needs Slater identifies. But both solutions suffer from the same flaw – they are personal rather than social. A few days of spiritual healing and hot springs brings no more social change than does a bottle of sugar water.

It’s not that real change is impossible, Slater says, and in the final chapter of the book, he hopes that the strands in the fabric of American culture can be rewoven. But optimism is difficult.

So many healthy new growths in our society are at some point blocked by the overwhelming force and rigidity of economic inequality… There’s a… ceiling of concentrated economic power that holds us back, frustrates change, locks in flexibility.

The Mad Men finale makes the same point, though with greater irony (the episode title is “Person to Person”). When we see the Coke mountaintop ad, we realize that Don Draper has bundled up his Esalen epiphany, brought it back to a huge ad agency in New York, and turned it into a commercial for one of the largest corporations in the world.

Cross-posted at Montclair SocioBlog and Pacific Standard.

Jay Livingston is the chair of the Sociology Department at Montclair State University. You can follow him at Montclair SocioBlog or on Twitter.

A substantial body of literature suggests that women change what they eat when they eat with men. Specifically, women opt for smaller amounts and lower-calorie foods associated with femininity. So, some scholars argue that women change what they eat to appear more feminine when dining with male companions.

For my senior thesis, I explored whether women change the way they eat alongside what they eat when dining with a male vs. female companion. To examine this phenomenon, I conducted 42 hours of non-participant observation in two four-star American restaurants in a large West Coast city in the United States. I observed 76 Euro-American women (37 dining with a male companion and 39 dining with a female companion), aged approximately 18 to 40, to identify differences in their eating behaviors.

I found that women did change the way they ate depending on the gender of their dining companion. Overall, when dining with a male companion, women typically constructed their bites carefully, took small bites, ate slowly, used their napkins precisely and frequently, and maintained good posture and limited body movement throughout their meals. In contrast, women dining with a female companion generally constructed their bites more haphazardly, took larger bites, used their napkins more loosely and sparingly, and moved their bodies more throughout their meals.

On the size of bites, here’s an excerpt from my field notes:

Though her plate is filled, each bite she labors onto her fork barely fills the utensil. Perhaps she’s getting full because each bite seems smaller than the last… and still she’s taking tiny bites. Somehow she has made a single vegetable last for more than five bites.

I also observed many women who were about to take a large bite but stopped themselves. Another excerpt:

She spreads a cracker generously and brings it to her mouth. Then she pauses for a moment as though she’s sizing up the cracker to decide if she can manage it in one bite. After thinking for a minute, she bites off half and gently places the rest of the cracker back down on her individual plate.

Stopping to reconstruct large bites into smaller ones is a feminine eating behavior that implies a conscious monitoring of bite size. It indicates that women may deliberately change their behavior to appear more feminine.

I also observed changes in the ways women used their napkins when dining with a male vs. female companion. When their companion was a man, women used their napkins more precisely and frequently than when their companion was another woman. In some cases, the woman would fold her napkin into fourths before using it so that she could press the straight edge of the napkin to the corners of her mouth. Other times, the woman would wrap the napkin around her finger to create a point, then dab it across her mouth or use the point to press into the corners of her mouth. Women who used their napkins precisely also tended to use them quite frequently:

Using her napkin to dab the edges of her mouth – finger in it to make a tiny point, she is using her napkin constantly… using the point of the napkin to specifically dab each corner of her mouth. She is using the napkin again even though she has not taken a single bite since the last time she used it… using napkin after literally every bite as if she is constantly scared she has food on her mouth. Using and refolding her napkin every two minutes, always dabbing the corners of her mouth lightly.

In contrast, women dining with a female companion generally used their napkins more loosely and sparingly. These women did not carefully designate a specific area of the napkin to use, and instead bunched up a portion of it in one hand and rubbed the napkin across their mouths indiscriminately.

Each of the behaviors observed more frequently among women dining with a male companion versus a female one was stereotypically feminine. Many of the behaviors that emerged as significant among women dining with a female companion, on the other hand, are considered non-feminine, i.e. behaviors that women are instructed to avoid. Behavioral differences between the two groups of women suggest two things. First, women eat in a manner more consistent with normative femininity when in the presence of a male versus a female companion. And, second, gender is something that people perform when cued to do so, not necessarily something people internalize and express all the time.

Cross-posted at Pacific Standard.

Kate Handley graduated from Occidental College this month. This post is based on her senior thesis. After gaining some experience in the tech industry, she hopes to pursue a PhD in Sociology.