
Dmitriy T.M. sent in a link to a 13-minute video in which Van Jones discusses the problems with patting ourselves on the back too much every time we put a plastic bottle in the recycle bin instead of the trash, and the need to recognize the link between environmental concerns and other social issues:

Also see our posts on the race between energy efficiency and consumption, exposure to environmental toxins and social class, race and exposure to toxic-release facilities, reframing the environmental movement, tracking garbage in the ocean, mountains of waste waiting to be recycled, framing anti-immigration as pro-environment, and conspicuous environmentalism.

Full transcript after the jump, thanks to thewhatifgirl.


Nils G. drew my attention to a fascinating, now-abandoned American educational practice that nicely illustrates how ideas about ideal parenting shift over time.  Between 1919 and 1969, the Home Economics departments of about 50 colleges and universities served as foster homes for orphans. Writes Emily Anthes at Wonderland:

During this time, homemaking… was considered to be something that could be conquered by science. Running a home based on instinct was considered to be woefully old-fashioned; the idea that raising a child and maintaining a home could be optimized by following a set of scientific rules was gaining currency.

Accordingly, getting a degree in Home Economics included a laboratory set up exactly like a home: “practice apartments.”  And what better to fill these homes with than “practice babies”!  Students would practice applying the latest science-endorsed parenting techniques on orphans.  An article published in the Journal of Home Economics in 1920, by Elizabeth Vermilye, explained the rotation of care:

Each girl, in rotation, carried the work of “baby manager” for one week… The “baby manager” assumed the entire responsibility for the care of the child during her period. She herself did the actual work of caring for him between the hours of 6.00 to 8.00 a.m. and from 4.30 to 6.00 p.m. During the day the child was in the care of three or four other students during the time they were not in class, the manager making the program for this care, giving instructions regarding food and other matters needing attention. The baby manager did the baby’s laundry work.

A student taking care of a practice baby:

Far from being exploited, these babies, it was believed, would get not just excellent, attentive care, but the best, most scientifically valid care.  Vermilye claims that the examining physician was highly impressed with the children’s development during their stay with the students.  She quotes him saying, “The improvement in the condition of these children speaks highly for your cooperative motherhood.”

These pictures of orphan and practice baby Bobby Domecon (surnamed after his role in the Domestic Economics department) reveal his chubbification.

A skinny 6 pounds at 2 months old:

Perking up at age 10 months:

Nice and chubby 5 months later:

Because these children were believed to be benefiting from the latest science of parenting, they were highly adoptable; many couples were eager to get their hands on a child that had such a good start in life (source).

Eventually, however, ideas about mothering began to change.  In particular, scholars began to talk about Attachment Disorder and to argue that a child’s development required a strong bond with one unique person.  In 1954, a short Time magazine article on the subject included experts suggesting that the program was harmful.  Starting with the Superintendent of the Illinois State Child Welfare Division, the author writes:

“It is not a normal family setting,” said he. “There are just too many persons involved in the handling of that child.”  Heaven only knows, added the superintendent, how many neuroses little David might develop. Other officials seemed to agree. “Imagine,” cried Mrs. Babette Penner, director of the Women’s Services Division of United Charities, “what anxieties there are in a child who is given a bottle in twelve or more pairs of arms.”

The scientific consensus eventually changed and, as a result, by 1969 “practice babies” were a thing of the past.

In this video from ABC, Doris Mitchell, a Cornell University graduate and Home Economics major, sweetly remembers her experience helping raise a practice baby at Cornell:

For another fantastic example of historic management of children without parents, see our post on the Orphan Trains.

Lisa Wade, PhD is an Associate Professor at Tulane University. She is the author of American Hookup, a book about college sexual culture; a textbook about gender; and a forthcoming introductory text: Terrible Magnificent Sociology. You can follow her on Twitter and Instagram.

Mab R. sent in a nice example of how children are socialized into gendered expectations. Chunky Monkey Mind has a post about the cut-out trading cards that appeared on the back of Cap’n Crunch cereal boxes a while back. Each card features a Cap’n Crunch character. Here’s the card for Smedley:

Ok, so for the male character we get basic stats, and he’s clearly an active guy who has thrilling adventures.

On the same box that featured the Smedley card was a card for Magnolia Bulkhead, who is shown with hearts hovering around her face as she clasps her hands together in rapture:

But of course, being female, she isn’t going to give us all of her vital statistics — in particular, age and weight are secrets women should guard carefully. Also notice the reinforcement of the idea that women are obsessed with romance. While Smedley’s hobbies involve action, Magnolia’s only listed hobby is daydreaming about a man (and his cereal). And her greatest adventure? Why, almost getting married, of course. Yes, the most amazing adventure of her life is something she failed at, but since it held out at least the possibility of romance, and she’s female, it was still the highlight of her life.

Ah, gender stereotypes! Fun for kids of all ages!

“It was kind of unreal,” the Steamboat Springs, Colorado native said, describing his recent 34th birthday fete at Kandahar Airfield, better known as KAF. “At least for a few minutes, you could pretend you were somewhere else. It was like going back home” (source).

“I was expecting to arrive in a warzone but instead here I am wearing sunglasses in the sun and eating a baguette,” said Dimitra Kokkali, a NATO contractor newly arrived from Brussels. “On my first night I surprised my family by calling them from an outdoor rock concert” (source).

A Time magazine slideshow, titled “R&R at Kandahar Airfield,” uses images to describe how this busy airfield “tries to re-create the comforts of home for the coalition forces in Afghanistan.” Kandahar Airfield is one of the busiest airports in the world because supplies and troops pass through it on their way to and from the war in Afghanistan. At any given time there are about 25,000 service members and civilian contractors at the airfield.

These images of Kandahar’s “Boardwalk” recreation area are striking for a few reasons. First, they show a blurring of the line dividing the homefront and the warfront. The slideshow includes images of service members using Facebook in computer labs and eating meals in their fatigues at TGI Fridays.

Second, these images reflect an increasing emphasis on how service members are supported and cared for by the military during wartime. These photos show a side of war that is not about fighting and danger; instead, they are about comfort, and about making the foreign land where service members are fighting as “homelike” as possible.

Third, these images show the blurring of the boundary between the military and privately owned businesses. Civilian contractors are augmenting military personnel during the wars in Iraq and Afghanistan, and their inclusion in war zones has raised issues about the safety of civilian workers and the costs of hiring corporations (Contexts).

Finally, as a consequence of the blurring of the boundaries of homefront and warfront, the division between the country of Afghanistan and the military is sharpened. Afghans (except for the few with security clearance) are not allowed to shop or enjoy the free entertainment on the Boardwalk at Kandahar.  Meanwhile, service members can safely buy souvenirs on the Boardwalk itself.  Afghan culture is commodified as a tourist attraction in this theme-park-like Boardwalk setting.

All of these images speak to the changing boundary between the homefront and the warfront and, as a result, to changes in how we, as a country, view war. Instead of the images of brutality, death, and chaos that Americans saw in their living rooms on TV during Vietnam, for example, these images show the military taking care of service members who are being entertained, keeping in touch with loved ones, and having fun.

But as this service member describes, walking the Kandahar “Boardwalk” in a warzone is still a jarring experience:

“I couldn’t believe I was in Kandahar eating a double-dipped chocolate ice cream at sunset on a Saturday afternoon,” said Coleman, who was downing a strawberry smoothie from the French bakery behind him, where an Eiffel Tower climbs a wall above picnic tables with fake potted plants.

“It was a surreal experience,” he said, as a jet fighter roared across the sky, letting loose a stream of defensive white flares. “I remember thinking, ‘We’re in the heart of the war-zone. The bad guys are 10 miles away. And here we are eating soft-serve ice cream'” (source).

Wendy Christensen is a Visiting Assistant Professor at Bowdoin College whose specialties include the intersection of gender, war, and the media.

Mark Greif wrote a fantastic analysis of the “hipster” in the New York Times.  Drawing on a book he edited with Kathleen Ross and Dayna Tortorici, “What Was the Hipster?”, Greif offers an analysis based on the work of Pierre Bourdieu.

Bourdieu observed that the rich justified and naturalized their economic advantage over others not only by pointing to their bank accounts, but by being the arbiters of taste.  Bourdieu shows us that taste…

…is not stable and peaceful, but a means of strategy and competition. Those superior in wealth use it to pretend they are superior in spirit.

Style, in other words, is not just arbitrary; it is about establishing that you are better than other people.

Those below us economically, the reasoning goes, don’t appreciate what we do; similarly, they couldn’t fill our jobs, handle our wealth or survive our difficulties.

But the rich aren’t the only ones who attempt to use taste and style to gain and preserve status.  Indeed, hipsters may be the purest example of this phenomenon.

“Once you take the Bourdieuian view,” Greif explains, “you can see how hipster neighborhoods are crossroads where young people from different origins, all crammed together, jockey for social gain” by liking cool things first.

I will quote Greif liberally because he does such a fantastic job of describing the field:

One hipster subgroup’s strategy is to disparage others as “liberal arts college grads with too much time on their hands”; the attack is leveled at the children of the upper middle class who move to cities after college with hopes of working in the “creative professions.” These hipsters are instantly declassed, reservoired in abject internships and ignored in the urban hierarchy — but able to use college-taught skills of classification, collection and appreciation to generate a superior body of cultural “cool.”

They, in turn, may malign the “trust fund hipsters.” This challenges the philistine wealthy who, possessed of money but not the nose for culture, convert real capital into “cultural capital” (Bourdieu’s most famous coinage), acquiring subculture as if it were ready-to-wear. (Think of Paris Hilton in her trucker hat.)

Both groups, meanwhile, look down on the couch-surfing, old-clothes-wearing hipsters who seem most authentic but are also often the most socially precarious — the lower-middle-class young, moving up through style, but with no backstop of parental culture or family capital. They are the bartenders and boutique clerks who wait on their well-to-do peers and wealthy tourists. Only on the basis of their cool clothes can they be “superior”: hipster knowledge compensates for economic immobility.

All hipsters play at being the inventors or first adopters of novelties: pride comes from knowing, and deciding, what’s cool in advance of the rest of the world.

This, Greif concludes, is why everyone, especially hipsters, hates to be called a hipster.  The whole idea is to have authentically superior tastes.  Once you are revealed as someone who cares about having the right tastes, you are disqualified as a person who has good taste effortlessly.  Likewise, if you suddenly have the same tastes as everyone else, you are just one of the masses.  Being a hipster, it turns out, is a perilous identity that must be constantly re-worked and re-authenticated.

Lisa Wade, PhD is an Associate Professor at Tulane University. She is the author of American Hookup, a book about college sexual culture; a textbook about gender; and a forthcoming introductory text: Terrible Magnificent Sociology. You can follow her on Twitter and Instagram.

As a member of a cattle-raising family, I hear a pretty steady stream of complaints about people eating less beef, which is variously attributed to a conspiracy against the American rancher (possibly by terrorists), the result of stupid city people who get all terrified over every little health concern (Mad Cow Disease is a myth! Unless it’s a terrorist plot to ruin ranching), environmentalists, animal rights activists, and me (I’ve been a vegetarian since 1996 and thus single-handedly nearly destroyed the beef industry).

The National Cattlemen’s Beef Association is similarly concerned about reduced beef consumption. And given that we frequently hear about the connections between red meat consumption and health concerns such as heart disease, and are advised to substitute white meat for red meat (to the point that the pork industry began branding pork as “the other white meat”), you’d probably expect to see a dramatic decline in consumption of beef.

And we do see a decline, but not as much as you might expect, as this graph from the Freakonomics blog, sent in by Dmitriy T.M. and Bryce M. (a student at Rensselaer Polytechnic Institute), illustrates:

Beef consumption has clearly declined since its peak in the late 1970s, when people in the U.S. ate nearly 90 pounds of beef each per year, to closer to 60 lbs. each today. On the other hand, all those health warnings, disease scares, and environmentalist-vegetarian terrorist plots haven’t yet knocked beef out of its position as the most-eaten meat in the U.S. Chicken seems poised to take over that position, but beef doesn’t exactly appear to be falling off the charts.

So how do we compare to other countries in terms of overall meat consumption? In a 2003 article in the Journal of Nutrition, Andrew Speedy provided data on global meat consumption (defined as “beef and buffalo, sheep and goat, pig meat and poultry”) — note it’s in kilograms, not pounds, and the legend should be read across, not down (so the first bar is for the U.S., the second is for France, and so on):

So insofar as there has been a decrease in beef consumption in the U.S. and a more dramatic increase in chicken consumption, what’s going on? The Freakonomics article presents an explanation:

A study by the agricultural economists James Mintert, Glynn Tonsor, and Ted Schroeder found that for every 1 percent increase in female employment, beef consumption sank by .6 percent while chicken consumption rose by .6 percent. Why? Probably because beef takes longer than chicken to prepare, and because poultry producers did a good job marketing cheap and ready-to-cook chicken products. Furthermore, all those working women meant more household income, which meant more families eating in restaurants — where meals are less likely to contain beef than meals at home.

Health concerns do play a part; the authors found that negative media coverage of beef (either recalls due to contamination or general links to heart disease, etc.) reduced consumption, while positive coverage that linked eating meat to getting iron, zinc, and other minerals increased it. But they found that health effects were small compared to the effects of changing family dynamics — that is, women working outside the home and families eating fewer meals at home.
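Just to make the size of that effect concrete, here is a quick back-of-the-envelope sketch in Python. Only the 0.6-percent-per-1-percent relationship comes from the study as quoted above; the baseline consumption figures and the 5% employment change are hypothetical numbers chosen for illustration.

```python
# Back-of-the-envelope illustration of the elasticity quoted above.
# Only the 0.6%-per-1% relationship comes from the study as quoted;
# the baselines and the 5% employment change are made-up examples.

beef_baseline = 60.0     # lbs per person per year (rough figure from the graph)
chicken_baseline = 60.0  # hypothetical baseline

employment_increase_pct = 5.0  # suppose female employment rises 5%

# Beef falls 0.6% and chicken rises 0.6% for each 1% rise in employment.
beef_new = beef_baseline * (1 - 0.006 * employment_increase_pct)
chicken_new = chicken_baseline * (1 + 0.006 * employment_increase_pct)

print(f"Beef:    {beef_baseline:.1f} -> {beef_new:.1f} lbs/person/year")
print(f"Chicken: {chicken_baseline:.1f} -> {chicken_new:.1f} lbs/person/year")
```

On these made-up numbers, a 5% rise in female employment would shave beef from 60 to about 58.2 lbs per person per year while nudging chicken up to about 61.8, a small shift in any one year that compounds over decades.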

It’s a nice example of how the factors driving social changes are often much more complex than we’d expect. Common sense explanations of changes in beef consumption would, I think, a) overestimate how much less beef Americans eat than in the past and b) assume the major driving factors to be health-related concerns, whether about chronic disease or recalls. Yet it turns out a major aspect of the story is a structural change that doesn’t seem clearly connected at all.

I guess if I were a health advocate hoping people in the U.S. were starting to listen to messages about healthy eating, that might depress me. But I guess I can tell my grandma that the terrorists’ evil plans to infect U.S. cattle herds with Mad Cow or some other disease might not be as catastrophic as they might imagine.

UPDATE: As a couple of readers point out, the increase in chicken consumption can’t be explained just as a result of people eating chicken when they otherwise would have eaten beef; the drop in beef consumption is way overshadowed by the increase in how much chicken people eat. The total amount of all meat eaten each year has increased dramatically.

I don’t know what is driving all of that change, but I suspect a lot of it is marketing campaigns — not just directly to consumers, but efforts by industry groups and the USDA to get more meat into a wide variety of items at grocery stores and on restaurant menus, as they have done with cheese.


Anita Sarkeesian, at Feminist Frequency, starts from the beginning.  How is contemporary advertising to children gendered today?  And why does it matter?  With a special discussion of girls and technology.  Enjoy:

(Transcript after the jump.)


Katelyn G. sent in a link to a story at The Economist about a new study that attempted to measure the harmful effects, to both the user and to the U.K. more broadly, of a number of legal and illegal drugs. The methodology:

Members of the Independent Scientific Committee on Drugs, including two invited specialists, met in a 1-day interactive workshop to score 20 drugs on 16 criteria: nine related to the harms that a drug produces in the individual and seven to the harms to others. Drugs were scored out of 100 points, and the criteria were weighted to indicate their relative importance.

Harm to others included factors such as health care costs, family disruptions, social services, and the cost of criminal justice programs to regulate drugs.
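For readers curious about the mechanics, here is a minimal sketch in Python of how a weighted multi-criteria score like this might be computed. The criterion names, weights, and scores below are hypothetical placeholders, not the actual criteria or values from the study.

```python
# Minimal sketch of a weighted multi-criteria score, in the spirit of
# the methodology quoted above. All names and numbers are hypothetical.

# Relative-importance weights for each criterion (hypothetical).
weights = {
    "mortality": 0.15,          # harm to the user
    "dependence": 0.10,         # harm to the user
    "family_disruption": 0.08,  # harm to others
    "economic_cost": 0.12,      # harm to others
}

# Each drug is scored out of 100 on every criterion (hypothetical).
scores = {
    "drug_a": {"mortality": 80, "dependence": 60,
               "family_disruption": 70, "economic_cost": 90},
    "drug_b": {"mortality": 40, "dependence": 55,
               "family_disruption": 20, "economic_cost": 30},
}

def overall_harm(drug):
    """Weighted sum of one drug's criterion scores."""
    return sum(weights[c] * scores[drug][c] for c in weights)

# Rank the drugs from most to least harmful overall.
for drug in sorted(scores, key=overall_harm, reverse=True):
    print(f"{drug}: overall harm {overall_harm(drug):.1f}")
```

The point of weighting is that a drug scoring high on heavily weighted criteria can rank as more harmful overall than one scoring high only on lightly weighted ones, which is how alcohol ends up at the top despite being legal.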

The results? Alcohol outranked all illegal substances they considered by a significant margin, particularly in terms of the harm caused to others:

Will this lead to major changes in drug policy in the U.K.? Unlikely. Here’s a tidbit from an NPR story:

…last year in Britain, the government increased its penalties for the possession of marijuana. One of its senior advisers, David Nutt — the lead author on the Lancet study — was fired after he criticized the British decision.

“What governments decide is illegal is not always based on science,” said van den Brink. He said considerations about revenue and taxation, like those garnered from the alcohol and tobacco industries, may influence decisions about which substances to regulate or outlaw.