Media have tended to depict childfree people negatively, likening the decision not to have children to choosing “whether to have pizza or Indian for dinner.” Misperceptions about those who do not have children carry serious weight, given that between 2006 and 2010, 15% of women and 24% of men had not had children by age 40, and that nearly half of women aged 40-44 in 2002 were what Amy Blackstone and Mahala Dyer Stewart refer to as “childfree,” that is, purposefully not intending to have children.

[Chart: trends in childlessness and childfreeness, from the Pew Research Center.]

Blackstone and Stewart’s forthcoming 2016 article in The Family Journal, “‘There’s More Thinking to Decide’: How the Childfree Decide Not to Parent,” engages the topic and extends Blackstone’s scholarly and public work, including the blog she co-writes, We’re Not Having a Baby.

When researchers explore why people do not have children, they find that the reasons are strikingly similar to the reasons why people do have children. For example, “motivation to develop or maintain meaningful relationships” is a reason that some people have children – and a reason that others do not. Scholars are less certain about how people come to the decision to be childfree. In their new article, Blackstone and Stewart find that, as is often the case with media portrayals of contemporary families, descriptions of how people come to the decision to be childfree have been oversimplified. By their own reports, childfree people put a significant amount of thought into the formation of their families.

Blackstone and Stewart conducted semi-structured interviews with 21 women and 10 men, with an average age of 34, who are intentionally childfree. After several coding sessions, Blackstone and Stewart identified 18 distinct themes that described some aspect of decision-making with regard to living childfree. Ultimately, the authors concluded that being childfree was a conscious decision that arose through a process. These patterns were reported by both men and women respondents, but in slightly different ways.

Childfree as a conscious decision

All but two of the participants emphasized that their decision to be childfree was made consciously. One respondent captured the overarching message:

People who have decided not to have kids arguably have been more thoughtful than those who decided to have kids. It’s deliberate, it’s respectful, ethical, and it’s a real honest, good, fair, and, for many people, right decision.

There were gender differences in the motives for these decisions. Women were more likely to make the decision based on concern for others: some thought that the world was a tough place for children today, and some did not want to contribute to overpopulation and environmental degradation. In contrast, men more often made the decision to live childfree “after giving careful and deliberate thought to the potential consequences of parenting for their own, everyday lives, habits, and activities and what they would be giving up were they to become parents.”

Childfree as a process

Contrary to misconceptions that the decision to be childfree is a “snap” decision, Blackstone and Stewart note that respondents conceptualized their childfree lifestyle as “a working decision” that developed over time. Many respondents had desired to live childfree since they were young; others began the process of deciding to be childfree when they witnessed their siblings and peers raising children. Despite some concrete milestones in the process of deciding to be childfree, respondents emphasized that it was not one experience alone that sustained the decision. One respondent said, “I did sort of take my temperature every five, six, years to make sure I didn’t want them.” Though both women and men described their childfree lifestyle as a “working decision,” women were more likely to include their partners in that decision-making process by talking about the decision, while men were more likely to make the decision independently.

Blackstone and Stewart conclude by asking, “What might childfree families teach us about alternative approaches to ‘doing’ marriage and family?” The present research suggests that childfree people challenge what is often an unquestioned life sequence by consistently considering the impact that children would have on their own lives as well as the lives of their family, friends, and communities. One respondent reflected positively on childfree people’s thought process: “I wish more people thought about thinking about it… I mean I wish it were normal to decide whether or not you were going to have children.”

Braxton Jones is a graduate student in sociology at the University of New Hampshire, and serves as a Graduate Research and Public Affairs Scholar for the Council on Contemporary Families, where this post originally appeared.

We often think that religion helps to build a strong society, in part because it gives people a shared set of beliefs that fosters trust. When you know what your neighbors think about right and wrong, it is easier to assume they are trustworthy people. The problem is that this logic focuses on trustworthy individuals, while social scientists often think about the relationship between religion and trust in terms of social structure and context.

New research from David Olson and Miao Li examines trust levels among 77,405 individuals in 69 countries, using World Values Survey data collected between 1999 and 2010. The authors’ analysis focuses on a simple survey question about whether respondents felt they could, in general, trust other people. The authors were especially interested in how religiosity at the national level affected this trust, measuring it in two ways: the percentage of the population that regularly attended religious services and the level of religious diversity in the nation.

These two measures of religious strength and diversity in the social context brought out a surprising pattern. Nations with high religious diversity and high religious attendance had respondents who were significantly less likely to say they could generally trust other people. Conversely, nations with high religious diversity, but relatively low levels of participation, had respondents who were more likely to say they could generally trust other people.


One possible explanation for these two findings is that competing claims about truth and moral authority are harder to navigate when the stakes are high and everyone cares deeply about the answers, but that it is much easier to learn to trust others in a diverse society where the stakes of that difference are low. The most important lesson from this work, however, may be that the positive effects we usually attribute to cultural systems like religion are not guaranteed; things can turn out quite differently depending on the way religion is embedded in social context.

Evan Stewart is a PhD candidate at the University of Minnesota studying political culture. He is also a member of The Society Pages’ graduate student board. There, he writes for the blog Discoveries, where this post originally appeared. You can follow him on Twitter.

Flashback Friday.

Russ Ruggles, who blogs for Online Dating Matchmaker, makes an argument for lying in your online dating profile. He notes, first, that lying is common and, second, that people lie in the direction that we would expect, given social desirability: men, for example, tend to exaggerate their height, while women tend to exaggerate their thinness.

Since people also tend to restrict their searches according to social desirability (looking for taller men and thinner women), these lies will result in your being included in a greater proportion of searches. So, if you lie, you are more likely to actually go on a date.

Provided your lie was small — small enough, that is, to not be too obvious upon first meeting — Ruggles explains that things are unlikely to fall to pieces on the first date. It turns out that people’s stated preferences have a weak relationship to who they actually like. Stated preferences, one study found, “seemed to vanish when it came time to choose a partner in physical space.”

“It turns out,” Ruggles writes, that “we have pretty much no clue what we actually want in a partner.”

So lie! A little! Lie away! And, also, don’t be so picky. You never know!

Originally posted in 2010. Crossposted at Jezebel.

Lisa Wade, PhD is a professor at Occidental College. She is the author of American Hookup, a book about college sexual culture, and a textbook about gender. You can follow her on Twitter, Facebook, and Instagram.

Historian Molly Worthen is fighting tyranny, specifically the “tyranny of feelings” and the muddle it creates. We don’t realize that our thinking has been enslaved by this tyranny, but alas, we now speak its language. Case in point:

“Personally, I feel like Bernie Sanders is too idealistic,” a Yale student explained to a reporter in Florida.

Why the “linguistic hedging” as Worthen calls it? Why couldn’t the kid just say, “Sanders is too idealistic”? You might think the difference is minor, or perhaps the speaker is reluctant to assert an opinion as though it were fact. Worthen disagrees.

“I feel like” is not a harmless tic. . . . The phrase says a great deal about our muddled ideas about reason, emotion and argument — a muddle that has political consequences.

The phrase “I feel like” is part of a more general evolution in American culture. We think less in terms of morality – society’s standards of right and wrong – and more in terms of individual psychological well-being. The shift from “I think” to “I feel like” echoes an earlier linguistic trend in which we gave up terms like “should” or “ought to” in favor of “needs to.” To say, “Kayden, you should be quiet and settle down,” invokes external social rules of morality. But “Kayden, you need to settle down” refers to his internal, psychological needs. Be quiet not because it’s good for others but because it’s good for you.


Both “needs to” and “I feel like” began their rise in the late 1970s, but Worthen finds the latter more insidious. “I feel like” defeats rational discussion: you can argue with what someone says about the facts, but you can’t argue with what they say about how they feel. Worthen is asserting a clear cause and effect. She quotes Orwell: “If thought corrupts language, language can also corrupt thought.” She has no evidence of this causal relationship, but she cites some linguists who agree. She also quotes Mark Liberman, who is calmer about the whole thing: people know what you mean despite the hedging, just as they know that when you say “I feel,” it means “I think,” and that you are not speaking about your actual emotions.

The more common “I feel like” becomes, the less importance we may attach to its literal meaning. “I feel like the emotions have long since been mostly bleached out of ‘feel that,’ ” …

Worthen disagrees. “When new verbal vices become old habits, their power to shape our thought does not diminish.”

“Vices” indeed. Her entire op-ed piece is a good example of the style of moral discourse that she says we have lost. Her stylistic preferences may have something to do with her scholarly ones – she studies conservative Christianity. No “needs to” for her. She closes her sermon with shoulds:

We should not “feel like.” We should argue rationally, feel deeply and take full responsibility for our interaction with the world.

——————————-

Originally posted at Montclair SocioBlog. Graph updated 5/11/16.

Jay Livingston is the chair of the Sociology Department at Montclair State University. You can follow him at Montclair SocioBlog or on Twitter.

Way back in 1996 sociologist Susan Walzer published a research article pointing to one of the more insidious gender gaps in household labor: thinking. It was called “Thinking about the Baby.”

In it, Walzer argued that women do more of the intellectual and emotional work of childcare and household maintenance. They do more of the learning and information processing (like buying and reading “how-to” books about parenting or researching pediatricians). They do more worrying (like wondering if their child is hitting his developmental milestones or has enough friends at school). And they do more organizing and delegating (like deciding when towels need washing or what needs to be purchased at the grocery store), even when their partner “helps out” by accepting assigned chores.

For Mother’s Day, a parenting blogger named Ellen Seidman powerfully describes this exhausting and almost entirely invisible job. I am compelled to share. Her essay centers on the phrase “I am the person who notices…” It starts with the toilet paper running out and it goes on… and on… and on… and on. Read it.

She doesn’t politicize what she calls an “uncanny ability to see things… [that enable] our family to basically exist.” She defends her husband (which is fine) and instead relies on a “reduction to personality,” that technique of dismissing unequal workloads first described in the canonical book The Second Shift: somehow it just so happens that it’s her and not her husband that notices all these things.

But I’ll politicize it. The data suggest that it is not an accident that it is she and not her husband who does this vital and brain-engrossing job. Nor is it an accident that it is a job that gets almost no recognition and entirely no pay. It’s work women disproportionately do all over America. So, read it. Read it and remember to be thankful for whoever it is in your life who does these things. Or, if it is you, feel righteous and demand a little more recognition and burden sharing. Not just on Mother’s Day. That’s only one day. Every day.

Cross-posted and in print at Money.

Lisa Wade, PhD is a professor at Occidental College. She is the author of American Hookup, a book about college sexual culture, and a textbook about gender. You can follow her on Twitter, Facebook, and Instagram.

Despite the maxim about familiarity breeding contempt, we usually like what’s familiar. With music, for example, familiarity breeds hits in the short run and nostalgia in the long run. The trouble is that it’s tempting to attribute our liking to the inherent quality of the thing rather than to its familiarity. With movies, film buffs may make this same conflation between what they like and what they easily recognize.

That’s one of the points of Scott Lemieux’s takedown of Peter Suderman’s Vox article about Michael Bay.

Suderman hails Bay as “an auteur — the author of a film — whose movies reflect a distinctive, personal sensibility. Few filmmakers are as stylistically consistent as Bay, who recycles many of the same shots, editing patterns, and color schemes in nearly all of his films.”

But what’s so great about being an auteur with a recognizable style? For Lemieux, Michael Bay is a hack. His movies aren’t good, they’re just familiar. Bay’s supporters like them because of that familiarity but then attribute their liking to some imagined cinematic quality of the films.

My students, I discovered last week, harbor no such delusions about themselves and the songs they like. As a prologue to my summary of the Salganik-Watts MusicLab studies, I asked them to discuss what it is about a song that makes it a hit. “Think about hit songs you like and about hit songs that make you wonder, ‘How did that song get to be #1?’” The most frequent answers were all about familiarity and social influence. “You hear the song a lot, and everyone you know likes it, and you sort of just go along, and then you like it too.” I had to probe in order to come up with anything about the songs themselves – the beat, the rhymes, even the performer.

Lemieux cites Pauline Kael’s famous essay “Circles and Squares” (1963), a response to auteur-loving critics like Andrew Sarris. She makes the same point – that these critics conflate quality with familiarity, or as she terms it “distinguishability.”

That the distinguishability of personality should in itself be a criterion of value completely confuses normal judgment. The smell of a skunk is more distinguishable than the perfume of a rose; does that make it better?

Often the works in which we are most aware of the personality of the director are his worst films – when he falls back on the devices he has already done to death. When a famous director makes a good movie, we look at the movie, we don’t think about the director’s personality; when he makes a stinker we notice his familiar touches because there’s not much else to watch.

Assessing quality in art is difficult if not impossible. Maybe it’s a hopeless task, one that my students, in their wisdom, refused to be drawn into. They said nothing about why one song was better than another. They readily acknowledged that they liked songs because they were familiar and popular, criteria that producers, promoters, and payola-people have long been well aware of.

“In the summer of 1957,” an older friend once told me, “My family was on vacation at Lake Erie. There was this recreation hall – a big open room where teenagers hung out. You could get ice cream and snacks, and there was music, and some of the kids danced. One afternoon, they played the same song – ‘Honeycomb’ by Jimmie Rodgers – about twenty times in a row, maybe more. They just kept playing that song over and over again. Maybe it was the only song they played the whole afternoon.”

It wasn’t just that one rec hall. The people at Roulette Records must have been doing similar promotions all around the country and doing whatever they had to do to get air play for the record. By the end of September, “Honeycomb” was at the top of the Billboard charts. Was it a great song? Assessment of quality was irrelevant, or it was limited to the stereotypical critique offered by the kids on American Bandstand: “It’s got a good beat. You can dance to it.” Of course, this was before the 1960s and the rise of the auteur, a.k.a. the singer-songwriter.

Hollywood uses the same principle when it churns out sequels and prequels – Rocky, Saw, Batman. They call it a “franchise,” acknowledging the films had the similarity of Burger Kings. The audience fills the theaters not because the movie is good but because it’s Star Wars. Kael and the other anti-auteurists argue that auteur exponents are no different in their admiration for all Hitchcock. Or Michael Bay. It’s just that their cinema sophistication allows them to fool themselves.

Originally posted at Montclair SocioBlog. Big hat tip to Mark at West Coast Stat Views.

Jay Livingston is the chair of the Sociology Department at Montclair State University. You can follow him at Montclair SocioBlog or on Twitter.

Over at Politics Outdoors, sociologist and political scientist David Meyer has argued that Trump is a charismatic leader. The idea comes from Max Weber, widely seen as a founding father of sociology, who argued that there are three types of authority: traditional, legal, and charismatic. Traditional authority derives its power from custom, legal from bureaucracy, and charismatic from cult of personality.

Weber argued that charismatic leaders are seen as somehow superhuman, exemplary, or ordained: set apart from the average person by exceptional qualities that followers trust to ensure that everything they do will be right. It is because Trump is a charismatic leader that he can say “trust me” and give few details as to his priorities or policies, even on something as serious as foreign conflict. It’s why he can say, when asked whom he’s consulting: “I’m speaking with myself, number one, because I have a very good brain and I’ve said a lot of things.”

His followers don’t need to know what he might do or who he might listen to because they believe in him, not what he stands for. That’s why it makes sense to them to pledge allegiance to Trump instead of the flag.


Meyer adds that charismatic leaders are especially attractive during “turbulent times.” “[F]ew people would be willing [to] throw in with someone who obviously lacks all of the qualities for the job he seeks,” Meyer writes, “unless times were truly desperate.” This is part of why Trump’s constant emphasis on inept politicians, broken policies, and the threat of terror and immigration works in his favor. Even his slogan, “Make America Great Again,” ominously implies that we are no longer great.

Charismatic authority is also, paradoxically, unstable. Followers tend to believe their leader is infallible, but the moment they no longer believe so, his power vanishes. At that point, movements either fall apart or find a charismatic successor. If Trump stumbles enough to reveal a weakness, and his supporters are willing to see it, this particular anti-establishment movement could disappear, and more quickly than one might think. Unless, of course, they find someone who can step into Trump’s shoes.

Cross-posted at Pacific Standard.

Lisa Wade, PhD is a professor at Occidental College. She is the author of American Hookup, a book about college sexual culture, and a textbook about gender. You can follow her on Twitter, Facebook, and Instagram.

“It is fair to say,” writes historian Heather Williams about the Antebellum period in America, “that most white people had been so acculturated to view black people as different from them that they… barely noticed the pain that they experienced.”

She describes, for example, a white woman who, while wrenching enslaved people from their families to found a distant plantation, described them as “cheerful,” in “high spirits,” and “play[ful] like children.” It simply never occurred to her or to many other white people that black people had the same emotions they did; the reigning belief among whites was that black people were incapable of any complex or deep feeling at all.

It must have created such cognitive dissonance, then — such confusion on the part of the white population — when after the end of slavery, black people tried desperately to reunite with their parents, cousins, aunties and uncles, nieces and nephews, spouses, lovers, children, and friends.

And try they did. For decades newly freed black people sought out their loved ones. One strategy was to put ads in the paper. The “Lost Friends” column was one such resource. It ran in the Southwestern Christian Advocate from 1879 until the early 1900s and a collection of those ads — more than 330 from just one year — has been released by the Historic New Orleans Collection. Here is an example:

[Image: a “Lost Friends” advertisement from the Southwestern Christian Advocate.]

The ads would have been a serious investment: they cost 50 cents, which at the time was more than a day’s income for most recently freed people.

Williams reports that reunions were rare. She excerpted this success story from the Southwestern in her book, Help Me To Find My People, about enslaved families torn asunder, their desperate search for one another, and the rare stories of reunification.

A FAMILY RE-UNITED

In the SOUTHWESTERN of March 1st, we published in this column a letter from Charity Thompson, of Hawkins, Texas, making inquiry about her family. She last heard of them in Alabama years ago. The letter, as printed in the paper was read in the First church Houston, and as the reading proceeded a well-known member of the church — Mrs. Dibble — burst into tears and cried out “That is my sister and I have not seen her for thirty three years.” The mother is still living and in a few days the happy family will once more re-united.

I worry that white America still does not see black people as their emotional equals. Psychologists continue to document what is now called a racial empathy gap: both blacks and whites show less empathy when they see darker-skinned people experiencing physical or emotional pain. When white people are reminded that black people are disproportionately imprisoned, for example, it increases their support for tougher policing and harsher sentencing. Black prisoners receive presidential pardons at much lower rates than whites. And we think that black people have a higher physical pain threshold than whites.

How many of us tolerate the systematic deprivation and oppression of black people in America today — a people whose families are being torn asunder by death and imprisonment — by simply failing to notice the depths of their pain?

Cross-posted at A Nerd’s Guide to New Orleans.

Lisa Wade, PhD is a professor at Occidental College. She is the author of American Hookup, a book about college sexual culture, and a textbook about gender. You can follow her on Twitter, Facebook, and Instagram.