Flashback Friday.

Is it true that Spanish-speaking immigrants to the United States resist assimilation?

Not if you judge by language acquisition and compare them to earlier European immigrants. The sociologist Claude S. Fischer, at Made in America, offers this data:

The bottom line represents the percentage of English speakers among the wave of immigrants counted in the 1900, 1910, and 1920 censuses. It shows that fewer than half of those who had been in the country five years or less could speak English. This jumped to almost 75% among those who had been here six to ten years, and the numbers kept rising slowly after that.

Fast forward 80 years. Immigrants counted in the 1980, 1990, and 2000 censuses (the top line) outpaced earlier immigrants by more than 25 percentage points. Among those who have just arrived, almost as many can speak English as earlier immigrants who’d been here between 11 and 15 years.

If you look just at Spanish speakers (the middle line), you’ll see that the numbers are slightly lower than for recent immigrants overall, but still significantly better than for the previous wave. Remember that some of the other immigrants come from English-speaking countries.

Fischer suggests that the ethnic enclave is one of the reasons that the wave of immigrants at the turn of the 20th century learned English more slowly:

When we think back to that earlier wave of immigration, we picture neighborhoods like Little Italy, Greektown, the Lower East Side, and Little Warsaw – neighborhoods where as late as 1940, immigrants could lead their lives speaking only the language of the old country.

Today, however, immigrants learn to speak with those outside of their own group more quickly, suggesting that all of the flag waving to the contrary is missing the big picture.

Originally posted in 2010.

Lisa Wade, PhD is a professor at Occidental College. She is the author of American Hookup, a book about college sexual culture, and a textbook about gender. You can follow her on Twitter, Facebook, and Instagram.

Will Davies, a politics professor and economic sociologist at Goldsmiths, University of London, summarized his thoughts on Brexit for the Political Economy Research Centre, arguing that the split wasn’t one of left and right, young and old, or racist and not racist, but of center and periphery. You can read it in full there, or scroll down for my summary.

——————————–

As many have noted, some of the strongest advocates for Leave were actually among the beneficiaries of the UK’s relationship with the EU. Small towns and rural areas receive quite a bit of financial support. The regions that voted for Leave in the greatest numbers, then, will also suffer some of the worst consequences of leaving. What motivated them to vote for a change that will in all likelihood make their lives worse?

Davies argues that the economic support they received from their relationship with the EU was paired with a cultural invisibility or active denigration by those in the center. Those in the periphery lived in a “shadow welfare state” alongside “a political culture which heaped scorn on dependency.”

Davies uses philosopher Nancy Fraser’s complementary ideas of recognition and redistribution: people need economic security (redistribution), but they need dignity, too (recognition). Misrecognition can be so psychically painful that even those who knew they would suffer economically may have been motivated to vote Leave. “Knowing that your business, farm, family or region is dependent on the beneficence of wealthy liberals,” writes Davies, “is unlikely to be a recipe for satisfaction.”

It was in this context that the political campaign for Leave penned the slogan: “Take back control.” In sociology we call this framing, a way of directing people to think about a situation not just as a problem, but a particular kind of problem. “Take back control” invokes the indignity of oppression. Davies explains:

It worked on every level between the macroeconomic and the psychoanalytic. Think of what it means on an individual level to rediscover control. To be a person without control (for instance to suffer incontinence or a facial tic) is to be the butt of cruel jokes, to be potentially embarrassed in public. It potentially reduces one’s independence. What was so clever about the language of the Leave campaign was that it spoke directly to this feeling of inadequacy and embarrassment, then promised to eradicate it. The promise had nothing to do with economics or policy, but everything to do with the psychological allure of autonomy and self-respect.

Consider the cover of the Daily Mail praising the decision and calling politicians “out-of-touch” and the EU “elite” and “contemptuous”:

From this point of view, Davies thinks, the reward wasn’t the Leave itself, but the vote: a veritable middle finger to the UK center and the EU “eurocrats.” Leavers know their lives won’t get better after Brexit, but they don’t see their lives getting better under any circumstances, so they seized the chance to make that symbolic gesture. It’s all they think they have.

And that’s where Davies thinks the victory of the Leave vote strongly parallels Donald Trump’s rise in the US:

Amongst people who have utterly given up on the future, political movements don’t need to promise any desirable and realistic change. If anything, they are more comforting and trustworthy if predicated on the notion that the future is beyond rescue, for that chimes more closely with people’s private experiences.

Some people believe that voting for Trump might in fact make things worse, but the pleasure of doing so — of popping a middle finger to the Republican party and political elites more generally — would be satisfaction enough. In this sense, they may be quite a lot like the Leavers. For the disenfranchised, a vote against pragmatism and solidarity may be the only satisfaction that this election, or others, is likely to get them.

Lisa Wade, PhD is a professor at Occidental College. She is the author of American Hookup, a book about college sexual culture, and a textbook about gender. You can follow her on Twitter, Facebook, and Instagram.

Nowadays, wives are much more likely to out-earn their husbands than they used to be. But this is a shift, not a revolution, because very few women are the kind of breadwinner that some men used to be.

Using data on married wives ages 18-64 and their spouses (95.5% of whom were men) from Decennial Censuses and the 2014 American Community Survey, here are some facts from 2014:

  • 25% of wives earn more than their spouses (up from 15% in 1990 and 7% in 1970).
  • The average wife-who-earns-more takes home 68% of the couple’s earnings. The average for higher-earning men is 82%.
  • In 40% of the wife-earns-more couples, she earns less than 60% of the total, compared with 18% for higher earning men.
  • It is almost nine times as common for a husband to earn all of the couple’s money as for a wife (19.6% versus 2.3%).
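The gap in that last bullet is simple arithmetic; here is a quick check using only the percentages quoted above:

```python
# Quick check of the sole-earner gap, using the figures from the bullets above.
husband_sole_earner = 19.6  # % of couples in which he earns all the income
wife_sole_earner = 2.3      # % of couples in which she earns all the income

ratio = husband_sole_earner / wife_sole_earner
print(round(ratio, 1))  # 8.5 -- "almost nine times" as common
```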

Here is the distribution of income in married couples (wife ages 18-64; the bars add to 100%):


Male and female breadwinners are not equivalent; making $.01 more than your spouse doesn’t make you a 1950s breadwinner, or the “primary earner” of the family.

Philip N. Cohen is a professor of sociology at the University of Maryland, College Park, and writes the blog Family Inequality, where this post originally appeared. You can follow him on Twitter or Facebook.

Rose Eveleth’s piece for Fusion on gender and bodyhacking was something I didn’t know I needed in my life until it was there. You know how you’ve always known something or felt something, but it isn’t until someone else articulates it for you that you truly understand it, can explain it to yourself, think you might be able to explain it to others – or, even better, shove the articulation at them and be all THAT RIGHT THERE, THAT’S WHAT I’M TALKING ABOUT. You know that kind of thing?

Yeah, that.

Eveleth’s overall thesis is that “bodyhacking” isn’t new at all, that it’s been around forever in how women – to get oversimplified and gender-essentialist in a way I try to avoid, so caveat there – alter and control and manage their bodies (not always to positive or uncoercive ends), but that it’s not recognized as such because we still gender the concept of “technology” as profoundly masculine:

Men invent Soylent, and it’s considered technology. Women have been drinking SlimFast and Ensure for decades but it was just considered a weight loss aid. Quantified self is an exciting technology sector that led tech giants such as Apple to make health tracking a part of the iPhone. But though women have been keeping records of their menstrual cycles for thousands of years, Apple belatedly added period tracking to its Health Kit. Women have been dieting for centuries, but when men do it and call it “intermittent fasting,” it gets news coverage as a tech trend. Men alter their bodies with implants and it’s considered extreme bodyhacking, and cutting edge technology. Women bound their feet for thousands of years, wore corsets that altered their rib cages, got breast implants, and that was all considered shallow narcissism.

As a central personal example, Eveleth uses her IUD, and this is what especially resonated with me, because I also have one. I’ve had one for about seven years. I love it. And getting it was moderately life-changing, not just because of its practical benefits but because it altered how I think about me.

The insertion process was not comfortable (not to scare off anyone thinking of getting one, TRUST ME IT IS GREAT TO HAVE) and more than a little anxiety-inducing ahead of time, but I walked out of the doctor’s office feeling kind of cool. I had an implant. I had a piece of technology in my uterus that was enabling me to control my reproductive process. I don’t want children – at least not right now – and my reproductive organs have never been significantly important to me as far as my gender identity goes (probably not least because I don’t identify as a woman), but managing my bits and what they do and how they do it has naturally been a part of my life since I became sexually active.

And what matters for this conversation is that the constant task of managing them isn’t something I chose. Trying to find a method that worked best for me and (mildly) stressing about how well it was working was a part of my identity inasmuch as it took up space in my brain, and I wasn’t thrilled about that. I didn’t want it to be part of my identity – though I didn’t want to go as far as permanently foreclosing on the possibility of pregnancy – and it irked me that it had to be.

Then it didn’t have to be anymore.

And it wasn’t just about a little copper implant being cool on a pure nerd level. I felt cool because the power dynamic between my self and my body had changed. My relationship between me and this set of organs had become voluntary in a way entirely new to me.

I feel like I might not be explaining this very well.

Here: Over thirty years ago, Donna Haraway presented an image of a new form of self and its creation – not creation, in fact, but construction. Something pieced together with intentionality, the result of choices – something “encoded.” She offered a criticism of the woman-as-Earth-Mother vision that then-contemporary feminists were making use of, and pointed the way forward toward something far stranger and more wonderfully monstrous.

The power of an enmeshing between the organic and the technological lies not only in what it allows one to do but in what it allows one to be – and often there’s no real distinction to be made between the two. We can talk about identity in terms of smartphones, but when we come to things like technologies of reproductive control, I think the conversation often slips into the purely utilitarian – if these things are recognized as technologies at all.

Eveleth notes that “technology is a thing men do,” and I think the dismissal of female bodyhacking goes beyond dismissal of the utilitarian aspects of these technologies. It’s also the dismissal of many of the things that make it possible to construct a cyborg self, to weave a powerful connection to the body that’s about the emotional and psychological just as much as the physical.

I walked out of that doctor’s office with my little copper implant, and the fact that I no longer had to angst about accidental pregnancy was in many respects a minor component of what I was feeling. I was a little less of a goddess, and a little more of a cyborg.

Sunny Moraine is a doctoral candidate in sociology at the University of Maryland and a fiction author whose work has appeared in Clarkesworld, Lightspeed, Shimmer, Nightmare, and Strange Horizons, as well as multiple Year’s Best anthologies; they are also responsible for both the Root Code and Casting the Bones novel trilogies. Their current dissertation work concerns narrative, temporality, and genocidal violence. They blog at Cyborgology, where this post originally appeared, and can be followed on Twitter at @dynamicsymmetry.

Until as late as the 1950s, there was no widely accepted set of terms that referred to whether people were attracted to the same or the other sex. Same-sex sexual activity happened, and people knew that, but it was thought of as a behavior, not an identity. It was believed that people had sex with same-sex others not because they were constitutionally different, but because they gave in to an urge they were supposed to resist. People who never indulged homosexual desires weren’t considered straight; they were simply morally upright.

Today our sexual object choices are generally believed to reflect more than a feeling; they are part of who we are: a static, essential identity, one that is inborn and unchanging. And we have a plethora of language to describe one’s “sexual orientation”: asexual, heterosexual, homosexual, bisexual, pansexual, polysexual, demisexual, and more. It has been, as Michel Foucault put it, “a multiplication of sexualities.”

Undoubtedly, this has value. These words, for example, give a name to feelings that have in recent history been difficult to understand. They also enable sexual minorities to find community and organize. If they can come together under the same label, they can join together for self-care and the promotion of social change.

These labels, though — and the belief in sexual orientation as an identity instead of just a behavior — also create their own voids of possibility. It’s significantly less possible today, for example, for a person to feel sexual urges for someone unexpected and dismiss them as irrelevant to their essential self. Because sexual orientation is an identity, those feelings jump-start an identity crisis. If a person has those feelings, it’s difficult these days to shrug them off (but see Not Gay: Sex Between Straight White Men). Once one comes to embrace an identity, then all sexual urges that conflict with it must be repressed or explained away, lest the person undergo yet another identity crisis that results in yet another label.

This train of thought was inspired by these anonymous secrets sent into the Post Secret project:

[Two anonymous PostSecret postcards, quoted below.]

“Even though I’m a gay man,” the first confessor says, “I still sometimes think about women’s breasts.” I AM, he says, a GAY MAN. It is something he is, essential and unchanging. Yet he has a feeling that doesn’t obey his identity: an interest in women’s breasts. So, “even though” he is gay, he finds himself distracted by something about the female body. It is a conundrum, an identity problem, even a secret that he perhaps confesses only anonymously. To be open about it would be to call into question who he and others think he is, to embark on a crisis. “I’m trying not to think about what that might mean,” says the other.

But none of this is at all necessary. It is only because we’ve decided that our sexual urges should be translated into an identity that thinking about women’s breasts seems incompatible with a primary orientation toward men. In a world of no labels at all, one in which sexual orientation is not an idea that we acknowledge, people’s sexual urges would be nothing more than that. And if that world were free of homophobia and heterocentrism, then we would act or not act on whichever urges we felt as we wished. It wouldn’t be a thing.

Most people think that the multiplication of sexualities is a good thing. From this point of view, language that can describe our urges, however imperfectly, makes those urges more visible and normalized, especially if we can make a case that they are inborn and unchanging, just a part of who we are. I don’t disagree.

But I see advantages, too, to a different system in which we don’t use any labels at all, where the object of one’s sexual attraction is an irrelevant detail or, at least, just one of the many, many, many things that come together to make someone sexy to us. In this world, we would be no more surprised to find ourselves attracted to a man one day and a woman the next than a construction worker one day and a lawyer the next, or a tall person one day and a short one the next, or an extrovert one day and an introvert the next. It would be just part of the messy, complicated, ever-shifting, works in mysterious ways thing that is the chemistry of sexual attraction. Nobody would have to have angst about it, seek support for it, defend it, or confess it as a secret. We would just… be.

Maybe the idea of sexual orientation was critical to the Gay Liberation movement’s goals of normalizing same-sex love and attraction, but I wonder if sexual liberation in the long run would be better served by abandoning the concept altogether. Perhaps a real sexual utopia doesn’t privilege genitals as the one true determinant of our sexualities. Maybe it simply puts them in their rightful place as tools for pleasure and reproduction, but not the be-all and end-all of who we are.

Lisa Wade, PhD is a professor at Occidental College. She is the author of American Hookup, a book about college sexual culture, and a textbook about gender. You can follow her on Twitter, Facebook, and Instagram.

Historian Molly Worthen is fighting tyranny, specifically the “tyranny of feelings” and the muddle it creates. We don’t realize that our thinking has been enslaved by this tyranny, but alas, we now speak its language. Case in point:

“Personally, I feel like Bernie Sanders is too idealistic,” a Yale student explained to a reporter in Florida.

Why the “linguistic hedging” as Worthen calls it? Why couldn’t the kid just say, “Sanders is too idealistic”? You might think the difference is minor, or perhaps the speaker is reluctant to assert an opinion as though it were fact. Worthen disagrees.

“I feel like” is not a harmless tic. . . . The phrase says a great deal about our muddled ideas about reason, emotion and argument — a muddle that has political consequences.

The phrase “I feel like” is part of a more general evolution in American culture. We think less in terms of morality – society’s standards of right and wrong – and more in terms of individual psychological well-being. The shift from “I think” to “I feel like” echoes an earlier linguistic trend, when we gave up terms like “should” or “ought to” in favor of “needs to.” To say, “Kayden, you should be quiet and settle down,” invokes external social rules of morality. But “Kayden, you need to settle down,” refers to his internal, psychological needs. Be quiet not because it’s good for others but because it’s good for you.


Both “needs to” and “I feel like” began their rise in the late 1970s, but Worthen finds the latter more insidious. “I feel like” defeats rational discussion. You can argue with what someone says about the facts. You can’t argue with what they say about how they feel. Worthen is asserting a clear cause and effect. She quotes Orwell: “If thought corrupts language, language can also corrupt thought.” She has no evidence of this causal relationship, but she cites some linguists who agree. She also quotes Mark Liberman, who is calmer about the whole thing. People know what you mean despite the hedging, just as they know that when you say, “I feel,” it means “I think,” and that you are not speaking about your actual emotions.

The more common “I feel like” becomes, the less importance we may attach to its literal meaning. “I feel like the emotions have long since been mostly bleached out of ‘feel that,’ ” …

Worthen disagrees.  “When new verbal vices become old habits, their power to shape our thought does not diminish.”

“Vices” indeed. Her entire op-ed piece is a good example of the style of moral discourse that she says we have lost. Her stylistic preferences may have something to do with her scholarly ones – she studies conservative Christianity. No “needs to” for her. She closes her sermon with shoulds:

We should not “feel like.” We should argue rationally, feel deeply and take full responsibility for our interaction with the world.

——————————-

Originally posted at Montclair SocioBlog. Graph updated 5/11/16.

Jay Livingston is the chair of the Sociology Department at Montclair State University. You can follow him at Montclair SocioBlog or on Twitter.

Pregnancy wasn’t always something women did in public. In her new book, Pregnant with the Stars, Renée Ann Cramer puts public pregnancies under the sociological microscope, but she notes that it is only recently that being publicly pregnant became socially acceptable. Even as recently as the 1950s, pregnancy was supposed to be a private matter, hidden behind closed doors. That big round belly was, she argues, “an indicator that sex had taken place, [which] was simply considered too risqué for polite company.”

Lucille Ball was the first person on television to acknowledge a pregnancy, real or fictional. It was 1952, but it was considered lewd to actually say the word “pregnant,” so the episode used euphemisms like “blessed event” or simply referred to having a baby or becoming a father.

Almost 20 years later, in 1970, a junior high school teacher was forced out of the classroom in her third trimester on the argument that her visible pregnancy would, as Cramer puts it, “alternately disgust, concern, fascinate, and embarrass her students.” So, when Demi Moore posed naked and pregnant on the cover of Vanity Fair just 21 years after that, it was a truly groundbreaking thing to do.


Today being pregnant in public is unremarkable. Visibly pregnant women are free to run errands, go to restaurants, attend events, even dress up their “baby bump” to make it look cute. All of this is part of the entrance of women into the public sphere more generally and the pressing of men to accept female bodies in those spaces. The next frontier may be breastfeeding, an activity related to female-embodied parenting that many still want to relegate to behind closed doors. We may look back in 20 years and be as surprised by intolerance of breastfeeding as we are today by the idea that pregnant women weren’t supposed to leave the house. Time will tell.

Lisa Wade, PhD is a professor at Occidental College. She is the author of American Hookup, a book about college sexual culture, and a textbook about gender. You can follow her on Twitter, Facebook, and Instagram.

One word in the headlines last week seemed like a throwback to an earlier era:

As Trump moves to soften his image, Democrats seek to harden it

The Washington Post

Donald Trump to reshape image, new campaign chief tells G.O.P.

The New York Times

Trump surrogates say GOP front-runner “projecting an image” during primaries

— Fox News

It was in the 1960s that politicians, their handlers, and the people who write about them discovered image. The word carries the cynical implication that voters, like shoppers, respond to the surface image rather than the substance – the picture on the box rather than what’s inside. A presidential campaign was based on the same thing as an advertising campaign – image. You sold a candidate the same way you sold cigarettes, at least according to the title and book jacket of Joe McGinniss’s book.

Then, sometime around 1980, image began to fade. In its place we now have brand. I went to Google N-grams and looked at the ratio of image to brand in both the corporate and the political realm. The pattern is nearly identical.


The ratio rises steeply from 1960 to 1980 – lots more talk about image, no increase in brand. Then the trend reverses. Sightings of image were still rising, but nowhere near as rapidly as brand, which doubled from 1980 to 2000 in politics and quadrupled in the corporate world.
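That image-to-brand ratio is easy to recompute from Ngram Viewer data; here is a minimal sketch in Python. The yearly frequencies below are illustrative placeholders, not the real Ngram export — substitute values downloaded from the Ngram Viewer to reproduce the actual graph:

```python
# Sketch: per-year ratio of "image" mentions to "brand" mentions.
# Frequencies are illustrative placeholders, NOT real Ngram data.

def term_ratio(numerator, denominator):
    """Return {year: numerator/denominator} for years present in both series."""
    return {year: numerator[year] / denominator[year]
            for year in numerator
            if year in denominator and denominator[year] > 0}

image = {1960: 2.0e-6, 1980: 6.0e-6, 2000: 7.0e-6}  # placeholder frequencies
brand = {1960: 2.0e-6, 1980: 2.0e-6, 2000: 8.0e-6}  # placeholder frequencies

ratios = term_ratio(image, brand)
# With these placeholder values the ratio rises from 1960 to 1980,
# then falls by 2000 as "brand" overtakes "image" -- the shape of
# the trend described above.
```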

Image sounds too deceptive and manipulative; you can change it quickly according to the needs of the moment. Brand implies permanence and substance (not to mention Marlboro-man-like rugged independence and integrity). No wonder people in the biz prefer brand.

Decades ago, when my son was in grade school, I met another parent who worked in the general area of public relations. On seeing him at the next school function a few weeks later, I said, “Oh right, you work in corporate image-mongering.” I thought I said it jokingly, but he seemed offended. He was, I quickly learned, a brand consultant. Image bad; brand good.

In later communications, he also said that a company’s attempt to brand itself as something it’s not will inevitably fail.  The same thing supposedly goes for politics:

“One thing you learn very quickly in political consulting is the fruitlessness of trying to get a candidate to change who he or she fundamentally is at their core,” said Republican strategist Whit Ayres, who did polling for Rubio’s presidential campaign before he dropped out of the race. “So, is the snide, insulting, misogynistic guy we’ve seen really who Donald Trump is? Or is it the disciplined, respectful, unifying Trump we saw for seven minutes after the New York primary?”

These consultants are saying what another Republican said a century and a half ago: “You can fool all the people some of the time, and some of the people all the time, but you cannot fool all the people all the time.”

This seems to argue that political image-mongers have to be honest about who their candidate really is. But there’s another way of reading Lincoln’s famous line: You only need to fool half the people every four years.

Originally posted at Montclair SocioBlog.

———————

Jay Livingston is the chair of the Sociology Department at Montclair State University. You can follow him at Montclair SocioBlog or on Twitter.