Flashback Friday.

A study by Dr. Ruchi Gupta and colleagues mapped rates of asthma among children in Chicago, revealing that those rates are closely correlated with race and income. The overall U.S. rate of childhood asthma is about 10%, but evidence indicates that asthma is very unevenly distributed. Their visuals show huge variations in the rates of childhood asthma among different neighborhoods:

The researchers looked at how the racial/ethnic composition of neighborhoods is associated with childhood asthma. They defined a neighborhood’s racial make-up by looking at those that were over 67% White, Black, or Hispanic. This graph shows the percent of such neighborhoods that fall into three categories of asthma rates: low (less than 10% of children have asthma), medium (10-20% of children have it), and high (over 20% of kids are affected). While 95% of White neighborhoods have low or medium rates, 56% of Hispanic neighborhoods have medium or high rates. The really striking finding, however, is for Black neighborhoods: 94% have medium or high prevalence. And the racial clustering is even more pronounced if we look only at the high category, where only a tiny proportion (6%) of White neighborhoods fall but nearly half of Black ones do… nearly a mirror image of what we see for the low category:

It’s hard to know exactly what causes higher rates of asthma in Black and Hispanic neighborhoods than in White ones. It could be differences in access to medical care. The researchers found that asthma rates are also higher in neighborhoods that have high rates of violence. Perhaps stress from living in neighborhoods with a lot of violence is leading to more asthma. The authors of the study suggest that parents might keep their children inside more to protect them from violence, leading to more exposure to second-hand smoke and other indoor pollutants (off-gassing from certain types of paints or construction materials, for instance).

Other studies suggest that poorer neighborhoods have worse outdoor environmental conditions, particularly exposure to industries that release toxic air pollutants or store toxic waste, which increase the risk of asthma. Having a parent with asthma increases the chances of having it as well, though the connection there is equally unclear: is there a genetic factor, or does it simply indicate that parents and children are likely to grow up in neighborhoods with similar conditions?

Regardless, it’s clear that some communities — often those with the fewest resources to deal with it — are bearing the brunt of whatever conditions cause childhood asthma.

Originally posted in 2010.

Gwen Sharp is an associate professor of sociology at Nevada State College. You can follow her on Twitter at @gwensharpnv.

“That’s private equity for you,” said Steve Jenkins. He was standing outside the uptown Fairway grocery at 125th St., about to go to breakfast at a diner across the street. He no longer works at Fairway.

Steve was one of the early forces shaping Fairway back when it was just one store at 74th and Broadway. He hired on as their cheese guy. “What do you want that for?” he growled at me one day long ago when he saw me with a large wedge of inexpensive brie. “That’s the most boring cheese in the store.” He was often abrasive, rarely tactful. I tried to explain that it was for a party and most of the people wouldn’t care. He would have none of it. He cared. He cared deeply – about cheese, about food generally.

He helped Fairway expand from one store to two, then four. He still selected the cheeses. He wrote the irreverent text for their signs, including the huge electric marquee that drivers on the West Side Highway read. And then in 2007 Fairway got bought out by a private equity firm. The three original founders cashed out handsomely. Steve and others stayed on. Much of their share of the deal was in Fairway stock, but with restrictions that prevented them from selling.

Fairway kept expanding – stores in more places around New York – and they aimed more at the median shopper. Gradually, the store lost its edge, its quirkiness. With great size comes great McDonaldization – predictability, calculability. “Like no other market,” says every Fairway sign and every Fairway plastic bag. But it became like lots of other markets, with “specials” and coupons. Coupons! Fairway never had coupons. Or specials.

The people who decided to introduce coupons and specials were probably MBAs who knew about business and management and maybe even research on the retail food business. They knew about costs and profits. Knowing about food was for the people below them, people whose decisions they could override.

“I gotta get permission from corporate if I want to use my cell phone,” said Peter Romano, the wonderful produce manager at 74th St. – another guy who’d been there almost from the start. He knew produce like Steve knew cheese. Peter, too, left Fairway a few months ago.

Maybe this is what happens when a relatively small business gets taken over by ambitious suits. Things are rationalized, bureaucratized. And bureaucracy carries an implicit message of basic mistrust:

If we trusted you, we wouldn’t make you get approval. We wouldn’t make you fill out these papers about what you’re doing; we’d just let you do it. These procedures are our way of telling you that we don’t trust you to do what you say you’re doing.

The need for predictability, efficiency, and calculability leaves little room for improvisation. The food business becomes less about food, more about business. It stops being fun. The trade-off should be that you get more money. But there too, Fairway’s new management disappointed. They expanded rapidly, putting new stores in questionable locations. In the first months after the private equity firm took Fairway public in 2013, the stock price was as high as $26 a share. Yesterday, it closed at $1.04. The shares that Steve Jenkins and others received as their part of the private equity buyout are practically worthless.


Steve Jenkins will be all right. He’s well known in food circles. He’s been on television with Rachael Ray and Jacques Pépin. Still, there he was yesterday morning outside the store whose cheeses and olive oils had been his dominion. “I’m sixty-five years old, and I’m looking for a job.”

Originally posted at Montclair SocioBlog; re-posted at Pacific Standard.

Jay Livingston is the chair of the Sociology Department at Montclair State University. You can follow him at Montclair SocioBlog or on Twitter.

I recently moved to a neighborhood that people routinely describe as “bad.” It’s my first time living in such a place. I’ve lived in working class neighborhoods, but never poor ones. I’ve been lucky.

This neighborhood — one, to be clear, that I had the privilege to choose to live in — is genuinely dangerous. There have been 42 shootings within one mile of my house in the last year. Often in broad daylight. Once, the murderers fled down my street, careening by my front door in an SUV. There were six rapes by strangers — in the street and after home invasions — in seven days. People are robbed, which makes sense to me because people have to eat, but with a level of violence that I find confusing. An 11-year-old was recently arrested for pulling a gun on someone. A man was beaten until he was a quadriplegic. One day 16 people were shot in a park nearby after a parade.

I’ve lived here for a short time and — being white, middle-aged, middle class, and female — I am on the margins of the violence in my streets, and yet I have never been so constantly and excruciatingly aware of my mortality. I feel less of a hold on life itself. It feels so much more fragile, like it could be taken away from me at any time. I am acutely aware that my skin is but paper, my bones brittle, my skull just a shell ripe for bashing. I imagine a bullet shearing through me like I am nothing. That robustness that life used to have, the feeling that it is resilient and that I can count on it to be there for me, that feeling is going away.

So, when I saw the results of a new study showing that only 50% of African American teenagers believe that they will reach 35 years of age, I understood better than I have understood before. Just a tiny — a teeny, teeny, tiny — bit better.


I have heard this idea before. A friend who grew up the child of Mexican immigrants in a sketchy urban neighborhood told me that he, as a teenager, didn’t believe he’d make it to 18. I nodded my head and thought “wow,” but I did not understand even a little bit. He would be between the first and second column from the right: 54% of 2nd generation Mexican immigrants expect that they may very well die before 35. I understand him now a tiny — a teeny, teeny tiny — bit better.

Sociologists Tara Warner and Raymond Swisher, the authors of the study, make clear that the consequences of this fatalism are far-reaching. If a child does not believe that they will live to see another day, what motivation can there possibly be for investing in the future, for caring for one’s body, for avoiding harmful habits or dangerous activities? Why study? Why bother to see a doctor? Why not do drugs? Why avoid breaking the law?

Why wouldn’t a person put their future at risk — indeed, their very life — if they do not believe in that future, that life, at all?

If we really want to improve the lives of the most vulnerable people in our country, we cannot allow them to live in neighborhoods where desperation is so high that people turn to violence. Dangerous environments breed fatalism, rationally so. And once our children have given up on their own futures, no teachers’ encouragement, no promise that things will get better if they are good, no “up by your bootstraps” rhetoric will make a difference. They think they’re going to be dead, literally.

We need to boost these families with generous economic help, real opportunities, and investment in neighborhood infrastructure and schools. I think we don’t because the people with the power to do so don’t understand — even a teeny, teeny tiny bit — what it feels like to grow up thinking you’ll never grow up. Until they do, and until we decide that this is a form of cruelty that we cannot tolerate, I am sad to say that I feel pretty fatalistic about these children’s futures, too.

Re-posted at Pacific Standard.

Lisa Wade, PhD is an Associate Professor at Tulane University. She is the author of American Hookup, a book about college sexual culture; a textbook about gender; and a forthcoming introductory text: Terrible Magnificent Sociology. You can follow her on Twitter and Instagram.

So, Star Wars is out with a new movie and instead of pretending female fans don’t exist, Disney has decided to license the Star Wars brand to Covergirl. A reader named David, intrigued, sent in a two-page ad from Cosmopolitan for analysis.

What I find interesting about this ad campaign — or, more accurately, boring — is its invitation to women to choose whether they are good or bad. “Light side or dark side. Which side are you on?” it asks. Your makeup purchases, apparently, follow.


This is the old — and by “old” I mean ooooooooold — tradition of dividing women into good and bad. The Madonna and the whore. The woman on the pedestal and her fallen counterpart. Except Covergirl, like many cosmetics companies before it that have used exactly the same gimmick, is offering each woman the opportunity to choose which she wants to be. Is this some sort of feminist twist? Now we get to choose whether men want to marry us or just fuck us? Great.

But that part’s just boring. What’s obnoxious about the ad campaign is the idea that, for women, what really matters about the ultimate battle between good and evil is whether it goes with her complexion. It affirms the stereotype that women are deeply trivial, shallow, and vapid. What interests us about Star Wars? Why, makeup, of course!

If David — who also noted the inclusion of a single Asian model as part of the Dark Side — hadn’t asked me to write about this, I probably wouldn’t have. It feels like low hanging fruit because it’s just makeup advertising and who cares. But this constant message that women are genuinely excited at the idea of getting to choose which color packet to use as some sort of idiotic contribution to a battle of good versus evil is corrosive.

Moreover, the constant reiteration of the idea that we are thrilled to paint our faces actually obscures the fact that we are essentially required to do so if we want to be taken seriously as professionals, potential partners or, really, valuable human beings. So, not only does this kind of message teach us not to take women seriously at all, it hides the very serious way in which we are actively forced to capitulate to the male gaze — every. damn. day. — and feed capitalism while we’re at it.

This ad isn’t asking us if we want to be on the dark side or the light side. It’s asking us if we want to wear makeup or wear makeup. It’s not a choice at all. But it sure does make subordination seem fun.

Lisa Wade, PhD is an Associate Professor at Tulane University. She is the author of American Hookup, a book about college sexual culture; a textbook about gender; and a forthcoming introductory text: Terrible Magnificent Sociology. You can follow her on Twitter and Instagram.

When it comes to rule-breakers and rule enforcers, which side you are on seems to depend on the rule-breaker and the rule. National Review had a predictable response to the video of a school officer throwing a seated girl to the floor. Watch with caution; disturbing imagery:

https://www.youtube.com/watch?v=JAD15m6wqJI

Most of the response when the video went viral was revulsion. But not at National Review. David French said it clearly:

I keep coming to the same conclusion: This is what happens when a person resists a lawful order from a police officer to move.

The arrested student at Spring Valley High School should have left her seat when her teacher demanded that she leave. She should have left when the administrator made the same demand. She should have left when Fields made his first, polite requests. She had no right to stay. She had no right to end classroom instruction with her defiance. Fields was right to move her, and he did so without hurting her. The fact that the incident didn’t look good on camera doesn’t make his actions wrong.

This has been the general response on the right to nearly all the recently publicized incidents of police use of force. If law enforcement tells you to do something and you don’t do it, it’s OK for the officer to use force; and if you get hurt or killed, it’s your fault for not complying, even if you haven’t committed an offense.

That’s the general response. There are exceptions, notably Cliven Bundy. In case you’d forgotten, Bundy is the Nevada cattle rancher who was basically stealing – using federal lands for grazing his cattle and refusing to pay the fees. He’d been stiffing the United States this way for many years. When the Federales finally moved in and rounded up his cattle, a group of his well-armed supporters challenged the feds. Rather than do what law enforcers in other publicized accounts do when challenged by someone with a gun – shoot to kill – the Federal rangers negotiated.

Bundy was clearly breaking the law. Legally, as even his supporters acknowledged, he didn’t have a leg to stand on. So the view from the right must have been that he should do what law enforcement said. But no.

Here is National Review’s Kevin Williamson:

This is best understood not as a legal proceeding but as an act of civil disobedience… As a legal question Mr. Bundy is legless. But that is largely beside the point.

What happened to “This is what happens when a person resists a lawful order”? The law is now “beside the point.” To Williamson, Bundy is a “dissident,” one in the tradition of Gandhi, Thoreau, and fugitive slaves.

Not all dissidents are content to submit to what we, in the Age of Obama, still insist on quaintly calling “the rule of law.”

Every fugitive slave, and every one of the sainted men and women who harbored and enabled them, was a law-breaker, and who can blame them if none was content to submit to what passed for justice among the slavers?

(The equation with fugitive slaves became something of an embarrassment later when Bundy opined that those slaves were better off as slaves than are Black people today who get government subsidies. Needless to say, Bundy did not notice that the very thing he was demanding was a government handout – free grazing on government lands.)

The high school girl refused the teacher’s request that she give up her cell phone and then defied an order from the teacher and an administrator to leave the classroom.  Cliven Bundy’s supporters “threatened government employees and officials, pointed firearms at law enforcement officers, harassed the press, called in bomb scares to local businesses, set up roadblocks on public roads, and formed lists (complete with photos and home addresses) of their perceived enemies” (Forbes).

A Black schoolgirl thrown to the floor by a weightlifting cop twice her size — cop right, rule-breaker wrong. A rural White man with White male supporters threatening Federal law enforcers — cops wrong, rule-breakers right.

Originally posted at Montclair SocioBlog.

Jay Livingston is the chair of the Sociology Department at Montclair State University. You can follow him at Montclair SocioBlog or on Twitter.

Daniel Drezner once wrote about how international relations scholars would react to a zombie epidemic. Aside from the sheer fun of talking about something as silly as zombies, it had much the same illuminating satiric purpose as “how many X does it take to screw in a lightbulb” jokes. If you have even a cursory familiarity with the field, it is well worth reading.

Here’s my humble attempt to do the same for several schools within sociology.

Public Opinion. Consider the statement that “Zombies are a growing problem in society.” Would you:

  1. Strongly disagree
  2. Somewhat disagree
  3. Neither agree nor disagree
  4. Somewhat agree
  5. Strongly agree
  6. Um, how do I know you’re really with NORC and not just here to eat my brain?

Criminology. In some areas (e.g., Pittsburgh, Raccoon City), zombification is now more common than attending college or serving in the military and must be understood as a modal life course event. Furthermore, as seen in audit studies, employers are unwilling to hire zombies, and so the mark of zombification has persistent and reverberating effects throughout undeath (at least until complete decomposition and putrefaction). However, race trumps humanity, as most employers prefer to hire a white zombie over a black human.

Cultural toolkit. Being mindless, zombies have no cultural toolkit. Rather, the great interest is in understanding how the cultural toolkits of the living develop and are invoked during unsettled times of uncertainty, such as an onslaught of walking corpses. The human being besieged by zombies is not constrained by culture, but draws upon it. Actors can draw upon such culturally informed tools as boarding up the windows of a farmhouse, shotgunning the undead, or simply falling into panicked blubbering.

Categorization. There’s a kind of categorical legitimacy problem with zombies. Initially, zombies were supernaturally animated dead: they were sluggish but relentless, and they sought to eat human brains. In contrast, more recent zombies tend to be infected with a virus that leaves them still living in a biological sense but alters their behavior so as to be savage, oblivious to pain, and nimble. Furthermore, even supernatural zombies are not a homogeneous set but encompass varying degrees of decomposition. Thus the first issue with zombies is defining what a zombie is and whether it is commensurable with similar categories (like an inferius in Harry Potter). This categorical uncertainty has effects in that insurance underwriters systematically undervalue life insurance policies against monsters that are ambiguous to categorize (zombies) as compared to those that fall into a clearly delineated category (vampires).

Neo-institutionalism. Saving humanity from the hordes of the undead is a broad goal that is easily decoupled from the means used to achieve it. Especially given that human survivors need legitimacy in order to command access to scarce resources (e.g., shotgun shells, gasoline), it is more important to use strategies that are perceived as legitimate by trading partners (i.e., other terrified humans you’re trying to recruit into your improvised human survival cooperative) than to develop technically efficient means of dispatching the living dead. Although early on strategies for dealing with the undead (panic, “hole up here until help arrives,” “we have to get out of the city,” developing a vaccine, etc.) are practiced where they are most technically efficient, once a strategy achieves legitimacy it spreads via isomorphism to technically inappropriate contexts.

Population ecology. Improvised human survival cooperatives (IHSC) demonstrate the liability of newness in that many are overwhelmed and devoured immediately after formation. Furthermore, IHSC demonstrate the essentially fixed nature of organizations, as those IHSC that attempt to change core strategy (e.g., from “let’s hole up here until help arrives” to “we have to get out of the city”) show a greatly increased hazard of being overwhelmed and devoured.

Diffusion. Viral zombieism (e.g., Resident Evil, 28 Days Later) tends to start with a single patient zero, whereas supernatural zombieism (e.g., Night of the Living Dead, the “Thriller” video) tends to start with all recently deceased bodies rising from the grave. By seeing whether the diffusion curve for zombieism more closely approximates a Bass mixed-influence model or a classic s-curve, we can estimate whether zombieism is supernatural or viral, and therefore whether policy-makers should direct grants towards biomedical labs to develop a zombie vaccine or the Catholic Church to give priests a crash course in the neglected art of exorcism. Furthermore, marketers can plug plausible assumptions into the Bass model so as to make projections of the size of the zombie market over time, and thus how quickly to start manufacturing such products as brain-flavored Doritos.
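For anyone who wants to see the distinction the Diffusion entry is riffing on, here is a minimal sketch (mine, not from the original post) of the Bass model’s closed-form adoption curve in Python; the zombie parameter values are invented purely for illustration.

```python
# A rough sketch (not from the original post) of the Bass mixed-influence model.
# p = external influence (the dead rising on their own), q = internal influence
# (imitation/contagion, i.e., bites). Parameter values below are made up.
import math

def bass_cumulative(t, p, q):
    """Closed-form cumulative fraction of the population zombified by time t."""
    return (1 - math.exp(-(p + q) * t)) / (1 + (q / p) * math.exp(-(p + q) * t))

# "Supernatural" scenario: external influence dominates, so zombification jumps
# immediately. "Viral" scenario: imitation dominates, producing the slow-start,
# classic s-shaped contagion curve.
for label, p, q in [("supernatural", 0.30, 0.05), ("viral", 0.01, 0.50)]:
    curve = [round(bass_cumulative(t, p, q), 2) for t in range(0, 31, 5)]
    print(label, curve)
```

Comparing which of the two curves better fits the observed zombification counts is, in principle, how one would back out whether the epidemic looks supernatural or viral.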

Social movements. The dominant debate is the extent to which anti-zombie mobilization represents changes in the political opportunity structure brought on by complete societal collapse as compared to an essentially expressive act related to cultural dislocation and contested space. Supporting the latter interpretation is that zombie hunting militias are especially likely to form in counties that have seen recent increases in immigration. (The finding holds even when controlling for such variables as gun registrations, log distance to the nearest army administered “safe zone,” etc.).

Family. Zombieism doesn’t just affect individuals, but families. Having a zombie in the family involves an average of 25 hours of care work per week, including such tasks as going to the butcher to buy pig brains, repairing the boarding that keeps the zombie securely in the basement and away from the rest of the family, and washing a variety of stains out of the zombie’s tattered clothing. Almost all of this care work is performed by women and very little of it is done by paid care workers as no care worker in her right mind is willing to be in a house with a zombie.

Applied micro-economics. We combine two unique datasets, the first being military satellite imagery of zombie mobs and the second records salvaged from the wreckage of Exxon/Mobil headquarters showing which gas stations were due to be refueled just before the start of the zombie epidemic. Since humans can use salvaged gasoline either to set the undead on fire or to power vehicles, chainsaws, etc., we have a source of plausibly exogenous heterogeneity in showing which neighborhoods were more or less hospitable environments for zombies. We show that zombies tended to shuffle towards neighborhoods with low stocks of gasoline. Hence, we find that zombies respond to incentives (just like school teachers, and sumo wrestlers, and crack dealers, and realtors, and hookers, …).

Grounded theory. One cannot fully appreciate zombies by imposing a pre-existing theoretical framework on zombies. Only participant observation can allow one to provide a thick description of the mindless zombie perspective. Unfortunately scientistic institutions tend to be unsupportive of this kind of research. Major research funders reject as “too vague and insufficiently theory-driven” proposals that describe the intention to see what findings emerge from roaming about feasting on the living. Likewise IRB panels raise issues about whether a zombie can give informed consent and whether it is ethical to kill the living and eat their brains.

Ethnomethodology. Zombieism is not so much a state of being as a set of practices and cultural scripts. It is not that one is a zombie but that one does being a zombie such that zombieism is created and enacted through interaction. Even if one is “objectively” a mindless animated corpse, one cannot really be said to be fulfilling one’s cultural role as a zombie unless one shuffles across the landscape in search of brains.

Conversation Analysis.

Cross-posted at Code and Culture.

Gabriel Rossman is a professor of sociology at UCLA. His research addresses culture and mass media, especially pop music radio and Hollywood films, with the aim of understanding diffusion processes. You can follow him at Code and Culture.

Flashback Friday.

Two of my favorite podcasts, Radio Lab and Quirks and Quarks, have stories about how inertia and reliance on technology can inhibit our ability to find easy, cheap solutions to problems.

Story One

The first story, at Radio Lab, was about a nursing home in Düsseldorf, Germany. As patients age, there is a risk that they will become disoriented and “escape” the nursing home. Often they are trying to return to homes in which they lived previously, desperate because their children, partners, or even parents are worried and waiting for them.

When they catch the escapee in time, the patient is often extremely upset and an altercation ensues. If they don’t catch them in time, the patient often hops onto public transportation and is eventually discovered by police. The first outcome is unpleasant for everyone involved and the second outcome is very dangerous for the patient. Most nursing homes fix this problem by confining patients who’ve begun to wander off to a locked ward.

An employee at the Benrath Senior Center came up with an alternative solution: a fake bus stop placed right outside of the front doors of the nursing home.  The fake bus stop does two wonderful things:

(1)  The first thing a potential escapee does when they decide to “go home” is find a bus stop.  So, patients who take off usually get no further than the first bus stop that they see.  “Where did Mrs. Schmidt go?”  “Oh, she’s at the bus stop.”  In practice, it worked tremendously.  This meant that many disoriented patients no longer needed to be kept in locked wards.

(2)  The bus stop diffuses the sense of panic. If a delusional patient decided that she needed to go home immediately because her children were all alone and waiting for her, the attendant didn’t need to restrain her or talk her out of it; she simply said, “Oh, well… there’s the bus stop.” The patient would go sit and wait. Knowing that she was on her way home, she would relax and, given her diminished cognition, she would eventually forget why she was there. A little while later the attendant could go out and ask her if she wanted to come in for tea. And she would say, “Ok.”

Listening to this, I thought it was just about the most brilliant thing I’d ever heard.

Story Two

The second story, from Quirks and Quarks, was about whether it is true that dogs can smell cancer. It turns out that they can. Dogs appear to be able to smell many types of cancer, but people have been working specifically on training them to detect melanomas, or skin cancers. A dog can be trained, in about three to six weeks, to detect melanomas (even some invisible to the naked eye) with an 80-90% accuracy rate. If we could build a machine able to detect the same chemical that the dogs are reacting to (and we don’t know, at this time, what that is), it would have to be the size of a refrigerator to match the sensitivity of a dog’s nose. When it comes to detecting melanomas, dogs are better diagnosticians than our best humans and our most advanced machines.

Doggy doctors offer some really wonderful possibilities, such as delivering low cost cancer detection to communities who may not have access to clinical care.  A mobile cancer detection puppy bus, anyone?

Both these stories — about these talented animals and the pretend bus stop — are fantastic examples of what we can do without advanced technology. I fear that we fetishize the latter, turning first to technology and forgetting to be creative about how to solve problems without it.

This post originally appeared in 2010.

Lisa Wade, PhD is an Associate Professor at Tulane University. She is the author of American Hookup, a book about college sexual culture; a textbook about gender; and a forthcoming introductory text: Terrible Magnificent Sociology. You can follow her on Twitter and Instagram.