Image description: Mohammed, a Somali exile, sits in a chair on the right-hand side of the image. His children sit on the floor around him as they discuss art. Art covers the wall. Creating cultural products is one way that communities process trauma. Image courtesy of UNHCR, CC BY-NC 2.0.

Originally published April 12, 2021

Scientific developments in the field of epigenetics have called attention to intergenerational transfers of trauma. We now know that traumatic experiences can be passed down through the genes to the children, and even grandchildren, of survivors of horrific experiences like the Holocaust or American slavery. Sociology can help show how past trauma is passed down through social ties and how it affects current health and wellbeing. These social consequences of trauma could be even more powerful than the genetic impacts, affecting group dynamics, identity, history, and culture. In addition to what is passed down, sociological research also provides examples of how groups are managing these social effects, in both helpful and harmful ways.

Cultural Trauma and Group Identity
Cultural sociologists assert that in addition to individual bodily and psychiatric trauma, there is also collective “cultural trauma” when groups experience horrific events. This collective trauma compounds and complicates individual effects. For the process of cultural trauma to occur, the group must first recognize that a great evil has been done to them and construct a cohesive, shared narrative of perpetration and victimhood. This narrative then becomes incorporated into the group’s collective memory as an enduring aspect of its identity, as the Holocaust has been for Jews or the collective memory of slavery has been for Black Americans.
Both perpetrators and victims of violence must contend with the horrific event in some way, as it is now permanently associated with their group. This can occur either through avoidance of the difficult past or through stigma management practices like acknowledgment, denial, and silencing.

Cultural Trauma and Group Conflict: Violence Begets Violence

Sometimes, this cultural trauma process results in further violence. As the group comes to understand the harms it has suffered and assigns responsibility, it may seek violent retaliation against the perpetrators. Examples include the bombing of Pearl Harbor (and the subsequent Japanese internment and Hiroshima/Nagasaki bombings), and the 9/11 attacks leading to the U.S. War on Terror. In the former Yugoslavia, ancient collective memories were stoked and reconstructed by elites to provoke inter-ethnic violence that led to ten years of war, genocide, and ethnic cleansing. In Hawai’i, Irwin and Umemoto trace the emotional and psychological effects of violent colonial subjugation, such as distress, outrage, and depression, to contemporary violence among Pacific Islander youth.

Memory Work: Social Solidarity and Empowerment

Sociological research also provides examples of people “making sense” of difficult pasts by doing “memory work,” which can include art, music, and other cultural production. For example, second-generation Sikhs in the U.S. are using internet spaces to challenge dominant narratives of the 1984 anti-Sikh violence in India, contributing to group solidarity, resilience, and identity within their communities. Similarly, the children of Vietnamese refugees are using graphic novels and hip-hop music to articulate how the Vietnam War contributes to current struggles in the Vietnamese community. This shared understanding and validation then empower communities to fight for recognition and social justice.

When a group experiences a horrific event, the social effects live on to future generations. Understanding these effects is crucial for developing solutions to group suffering moving forward. Going through the cultural trauma process is necessary to overcome difficult pasts, but it is critical that this process occurs in a way that promotes justice and peace rather than further violence.

Video imagery courtesy of Canva, Canva free media usage.

Originally posted on March 16, 2017

The United States and the United Nations have had a closely intertwined relationship since the organization’s founding in 1945. The UN deals with a broad range of issues around the globe, and its widespread influence is often controversial. However, the United Nations continues to be instrumental in promoting crucial human rights causes, and the reach of its aid is arguably beyond compare. Despite its numerous shortcomings, the UN plays a crucial role in promoting human rights norms across the globe.

Throughout the 1990s in particular, the United Nations took on a central role in the global justice process. It organized and funded international courts following episodes of mass violence, such as the International Criminal Tribunal for Rwanda, and it made indictments for egregious crimes possible for the first time (including the crime of genocide). Sociologists find that the existence of these courts has a global impact in providing justice, and the trials seem to have a positive effect in reducing human rights violations in the long run.
The judicial process alone cannot adequately address global human rights issues — humanitarianism and diplomacy also play key roles. The United Nations arguably plays the most dominant global role in these initiatives, with monumental campaigns addressing topics like hunger, refugee needs, and climate change. The UN has been criticized for showcasing Western ideals and not taking into account cultural contexts, such as in early endeavors to reduce female genital cutting. However, the UN has made improvements, and when programs are approached as an opportunity for partnership rather than dominance, the outcomes can be quite positive. For example, the agency has taken great strides in promoting gender equality and access to education.
Video courtesy of Canva, Canva free media usage.

Originally posted November 14, 2019

As we prepare for Thanksgiving, many people look forward to sharing a warm meal with their family and friends. Others dread the holiday, gearing up to argue with their relatives or answer nosy questions. TSP has written in the past about the political minefield that holiday meals can be. This year we want to point out that the roots of difficult dinners actually run deep in everyday family mealtime. Thanksgiving, like any family mealtime, has the potential for conflict.

Scholars have documented how important mealtime can be for families in terms of cultivating relationships and family intimacy. However, they also show that despite the widespread belief that families should share “happy meals” together, meals can be emotionally painful and difficult for some families and family members.
Disagreements between parents and children arise at mealtime, in part, because of the meal itself. Some caregivers go to battle with “picky eaters.” Migrant parents struggle to pass cultural food traditions to children born in the United States. Low-income parents worry that their children will not like or eat the food they can afford.
Family meals also reproduce conflict between heterosexual partners. Buying, preparing, and serving food are important ways that women fulfill gendered expectations. At family mealtimes, men continue to do less work but hold more power over how and when dinner is served.
Thanksgiving, or any big holiday meal, can involve disagreements. However, that is not altogether surprising considering that everyday family meals are full of conflicts and tension.

Image: A little white girl sits on an adult’s lap in front of a laptop, her small hands covering the adult’s as they use the computer. Image courtesy of Nenad Stojkovic, CC BY 2.0.

Democrats in Congress continue to push toward passing sweeping infrastructure legislation. Part of the infrastructure package would provide funding for childcare, including universal pre-K for three- and four-year-olds, aid for working families to pay the costs of daycare, and paid family leave. Social science research helps place this current debate in perspective, connecting it to larger conversations about who is responsible for paying the costs of raising kids, the consequences for families of the private responsibility for childcare, and what international comparisons can show us about alternatives.

Part of the question concerns whether we should think of raising children as a social, rather than individual, responsibility. Public investments in childcare, whether through public assistance to cover the cost of childcare or a public system of universal childcare, are one way that countries communicate who is responsible for reproductive labor: the work of having and caring for children. In the United States, this is often thought of as the responsibility of individual families and, historically, mothers. Feminist scholars, in particular, have critiqued the individualization of responsibility for raising children, emphasizing that the work of having and raising children benefits society across the board. Having kids creates the next generation of workers and taxpayers, carrying on both practical and cultural legacies. Scholars argue that because we all benefit from the work of reproducing the population, we should all share its costs and responsibilities.

Other wealthy Western nations handle childcare differently. For instance, in Sweden there is subsidized childcare available for all children that is considered high quality and is widely utilized. In Germany, there is greater availability of well-paying part-time jobs that can enable two-parent households to better balance the responsibilities of work with the demands of raising kids. In the United States, there is now virtually no public support for childcare. Parents are left to their own devices to figure out how to cover the time before the start of public school at age five, as well as childcare before and after school and during school vacations. The U.S. is not alone in expecting families to provide childcare; Italy, for instance, has a culture of “familialism” that expects extended family and, in particular, grandparents to provide free childcare for working families. However, as Caitlyn Collins writes, the combination of little support for families and cultural expectations that workers be fully devoted to their jobs makes raising a child particularly challenging in America.

There are two important consequences of the lack of public support for childcare in the United States. The first is economic. Mothers experience a “motherhood penalty” in overall lifetime wages when they exit the labor force to provide childcare, or they may be placed on “mommy tracks” in their professions, with lower-paying and less prestigious jobs that can better accommodate their caring responsibilities. Scholarship shows much smaller motherhood penalties in countries with more cultural and institutional support for childcare.

A second consequence of little support for caring responsibilities is emotional. As Caitlyn Collins writes, mothers in many nations feel guilt and struggle to balance the responsibility to care for their children and their jobs. However, in the United States this guilt and emotional burden is particularly acute because mothers are left almost totally on their own to bear both the practical and moral responsibility for raising children. The guilt parents feel, as well as the stress of balancing childcare responsibilities and full-time work, may be one reason that there is a larger “happiness gap” between parents and non-parents in the United States when compared to other wealthy nations that provide better public support for raising children.

The pandemic has brought a number of social realities into stark relief, including the fact that individual families have to navigate childcare on their own, never clearer than when school closings kept kids at home. As we imagine a post-pandemic future and the potential to “build back better,” we should consider what social research tells us about who should be responsible for caring for kids, the weight of that responsibility, and how public policy changes might provide better care for the nation’s youngest citizens.

Members of Mecklenburg County’s Crisis Intervention Team demonstrate their response to a call; image courtesy of Mecklenburg County, CC BY-NC 2.0. Image: A young Black man sits at a picnic table, his hood up, speaking to a Black woman who is taking notes. Two white police officers are in the foreground, one squatting and one standing, looking on.

Since George Floyd’s murder in Minneapolis over one year ago, police reforms across the country continue to make headlines and shift the meaning of public safety. One important reform area involves responding to community members with mental health crises. Police officers have sometimes been described as “street corner psychiatrists” because 10%–40% of their total emergency calls involve persons with mental health concerns.

As communities increasingly recognize that police are not mental health professionals, they have begun partnering police with mental health professionals to form Crisis Intervention Teams (CITs), sometimes known as Crisis Response Teams or Co-Response Teams.

‘CIT’ Programs and Effectiveness

CITs are joint responses to mental health crises by multidisciplinary teams including police, mental health providers, social workers, and hospital emergency services. These teams have three key features: 1) community collaboration, 2) training for police, and 3) access to mental health services.

Social scientists are now evaluating the effectiveness and benefits of these programs. The National Alliance on Mental Illness reports over 2,700 CIT programs in different communities across the United States. Research has shown that these programs increase diversion from jails and prisons to mental health services by 11%–22%, relieve police workloads by 27%, and reduce the likelihood that people with mental illness will be arrested by 11%–12%. While these figures are promising for CITs as a short-term intervention, future investments in long-term stabilization programs are needed to sustainably address mental health crises.

Promising Practice

As a promising practice, CIT has evolved over recent decades and has been successful in promoting improvements in mental health responses, increasing officer confidence in working with people experiencing a mental health crisis, and reducing the frequency and length of detention. However, implementation of program elements is inconsistent across CITs, and more exploration is needed. Future evaluation, standardization, and regulation of CITs are necessary.

Societal responses to mental health impact every person in the US – whether it is a neighbor across the street, a colleague, a friend, or a family member. Social science research is playing an important part in evaluating and refining policies and programs such as CITs. Rather than punishing mental health crises, CITs view them through a treatment lens – fostering healing and restoration. While early research shows the promise of CITs as a short-term “first response” intervention, it also suggests that “second response” investments in long-term mental health care are needed to equitably and sustainably address mental health crises.

An illustrated image of men’s faces in striped shirts and hats. All of the men except one, who is orange, are yellow. Image via Pixabay, Pixabay License.

For many, the start of the school year brings a mixed bag of emotions, from budding anticipation to feelings of unease and anxiety about self-worth and competence, otherwise known as imposter syndrome. Imposter syndrome exists well beyond academia, disproportionately affecting minorities and women, groups underrepresented in fields like business and medicine. What does social science research tell us about what imposter syndrome is, how it works, and how it can be addressed?

What it Is

Imposter syndrome, first described as the “impostor phenomenon,” refers to individuals’ perceived fraudulence and unworthiness within high-pressure environments and workplaces–the feeling that they don’t fit or aren’t really supposed to be there. It appears to be most prevalent among systematically marginalized groups like women, first-generation students, and BIPOC and queer people. Imposter syndrome flourishes in spite of, or perhaps even because of, increased diversity and representation. Individuals with imposter syndrome doubt the validity of their achievements and fear being exposed as frauds. These feelings of self-doubt and unworthiness are often compounded by social anxiety and depression, which can lead to self-sabotage. Imposter syndrome may partially explain higher drop-out rates among undergraduates from these groups in fields historically dominated by white men, like medicine, mathematics, and science.

Impression Management

To manage feelings of inadequacy, individuals rely on what Erving Goffman called impression management. Impression management is the practice of keeping up appearances and matching one’s identity and behavior with societal expectations for social roles, positions, and identities. Imposter syndrome can emerge in settings with conflicting roles or expectations, or when someone’s background, identity, and interaction style do not fit well with what is expected. This can lead people to use perfectionism and workaholism to exhibit competence. For instance, research on female facilities managers shows that performing competence often leads to higher performance outcomes despite persistent feelings of inadequacy. Displaying competency despite feelings of inadequacy can exacerbate the role conflict individuals experience, or the tension between self-doubt and high achievement.

The Challenges of Diversifying

Efforts to “diversify” high-status fields like academia, law, and medicine sometimes fail to address the subtle cultural factors that can marginalize and exclude underrepresented groups. Lack of familiarity with field-specific concepts like peer review and the tenure track, or norms like networking and mentoring, can leave individuals feeling alienated. This unfamiliarity is often at the root of the unease associated with imposter syndrome. To address imposter syndrome, schools and workplaces have proposed a range of solutions, including targeted mentorship programs and additional support for nontraditional students and employees. Scholars emphasize that addressing imposter syndrome should involve solutions that emphasize flourishing and well-being over identity-based inclusion efforts.
Image description: A blonde woman sits in a church pew, facing away from the camera. Image courtesy of Pixabay, Pixabay license.

This October, Pope Francis is kicking off a three-year synod, assembling leaders and laypeople to discuss issues of church doctrine and practices. One big question on the table: should the Catholic Church ordain women as deacons? Driven in part by a growing movement of women around the world who feel called to ordination, the case for ordaining women will likely be one of the most hotly debated issues among Catholics worldwide.

After hundreds of years restricting the role of women in church leadership, how did the Church even get to this point? The story begins with changes in mainstream culture. Historically, changing norms around sex and gender have encouraged church leaders to reexamine their existing doctrines, particularly if church participation is declining. As mainstream culture changes, religious institutions face the challenge of “retraditioning” themselves for the future: adjusting their doctrines and practices to better align with changing mainstream culture. Religious leaders then debate proposed changes to church doctrines and practices–exactly the point that the Catholic Church is at today with the upcoming three-year synod.

What will happen at the end of the synod in 2024? Historical research suggests that after church leaders begin debating ideologies, changes to church policy often come through sheer luck, force, or the influence of powerful personalities. Only time will tell whether church leaders will respond to the calls of these Catholic women.

Image: A drag queen, dressed in a rainbow-sequined dress and pink wig, stands with arms raised, smiling. Image courtesy of Dany Sternfeld, CC BY-NC-ND 2.0.

In June, in celebration of Pride Month, members of LGBTQ+ communities and allies honored and reflected on hard-fought advancements for queer people, from civil rights like marriage equality and employment protections, to representation in positions of political prominence and mainstream culture. One area of change is the rising popularity of drag — an art form pioneered by queer people of color in clandestine ballrooms, now occupying prominent positions in gay bars, television competition programs, and mainstream films.

Drag began, like many parts of queer life, underground in urban nightlife spaces. In queer havens like San Francisco and New York City, drag performers have graced nightclub stages for over a century. As homosexuality grew more visible in the late twentieth century, drag performers were at the forefront of battles for liberation, political rights, and, later, medical treatment during the years of the AIDS pandemic. In the 1960s, drag queens in Los Angeles and San Francisco pushed back against run-ins with law enforcement. Some in the queer community believe a drag queen, Marsha P. Johnson, threw the first brick at the Stonewall Inn in 1969, kicking off the now-infamous confrontation with the NYPD. In the years that followed, drag queens blended campy performance with activism, protesting governments unresponsive to those dying from AIDS, drug manufacturers, and anti-same-sex-marriage advocates.

Today, in light of the LGBTQ+ community’s social, political, and legal advances, drag enjoys an unprecedented prominence in mainstream culture. Drag Story Hours, where performers, or queens, read storybooks to children, are commonplace in schools and libraries in twenty-six states and Puerto Rico. Performing drag, once a marginalized profession, is now a viable, if precarious, job prospect. The RuPaul’s Drag Race franchise boasts thirteen seasons (with six All-Stars seasons to boot) as well as international spin-offs in seven countries. The cultural significance and prominence of drag today raise questions: How does drag celebrate queerness and resist normative sexuality? How did drag find its footing in both pop culture and political circles? Significant research in the humanities and social sciences sheds light on these questions.

Performing Gender

Gender theorists have argued that gender is a performed identity, reproduced in daily social interactions. Like other social categories, gender is shaped by, and reshapes, relations of power. Drag involves stylistic and exaggerated gender performance. Drag queens were initially male-identifying performers who, unlike “crossdressers,” relied on exaggerated and parodied gender performance to both entertain and draw attention to political causes. Given the queer community’s social exclusion, drag performers have for decades formed closely knit communities, or “houses”, of mutual support and solidarity for performers often cut off from traditional familial networks. 

Subverting Gender: Transgressive Tactics

Drag’s most enduring social impact has been calling into question popular conceptions of gender. Drag draws attention to the important differences between sexual orientation and gender, as well as internal gender identity, external performance, and biological sex, topics of ongoing discussion in academic and activist communities.
Drag queens use cultural tools like language and physical appearance to subvert and perform gender identities. Through parodying and imitating mainstream gender norms, drag queens reveal the arbitrariness, cultural origins, and performance inherent to all gender identities. In these communities, queens have developed coherent group identities by creating particular speech patterns and unique cultural cues. Values like not being too competitive or “hungry,” maintaining “sisterhood,” and exuding professionalism and humility are reinforced through language and cultural norms. Drag entertainers draw on appearances and practices situationally, in some cases displaying feminine sides in interactions with men while reverting to their masculinity in situations that call for it. Lesbian drag kings – female-identifying performers presenting as men – similarly subvert gender roles by drawing on masculine practices in performance and, in some cases, more feminine practices in intimate settings.
While drag’s prominence today has prompted debates on gender norms in the mainstream, it has, at the same time, led to criticism of some harmful aspects of drag performance, including caricaturing racial minorities and marginalized groups.

Art Form as Resistance

Since drag first became a commonplace – if clandestine – staple at gay bars and clubs, its performances have involved an inherent critique of dominant gender norms, presentation, and behavior. This resistance owes much to the repression and marginalization queer performers have faced in many aspects of their lives. Homophobic views often forced performers out of their homes, leading them to build bonds and kinship networks with other queer people in more accepting urban locales. Under precarious conditions, performers built community with other marginalized queer individuals and crafted a transgressive art form now seen as a cultural staple. Drag’s rise would not have been possible without changing gender norms and styles of self-expression. Birth control became publicly available in 1960, opening new possibilities for women beyond the home. In the postwar decades, artists like Esquerita, Little Richard, and Sylvester pushed the limits of accepted gender presentation, normalizing new portrayals of gender in their revolutionary performances. Cultural change was already well underway by the time the Stonewall riots kickstarted the national queer liberation movement. While evolving gender norms and the cultural movements of the 1960s did help the cause of queer liberation, fractures among LGBTQ+ activists kept drag in a marginal position within the movement.
Image description: Two college football players make contact on the field. Sports like football and basketball bring in huge amounts of money to colleges, but the student-athletes do not receive pay, raising questions about amateurism and exploitation that are now in front of the Supreme Court. Image via pixabay, pixabay license.

The NCAA was back in the Supreme Court last month, in the middle of its fabled March Madness basketball tournament. In NCAA v. Alston, the association argued that the NCAA, not the courts, should decide the definition of amateurism. Or in other words, the NCAA should be in charge of deciding what types of compensation college athletes are able to receive–or not receive, as the case may be.

The NCAA’s arguments built on its last Supreme Court case, in 1984, when the court ruled that the association’s actions in monopolizing TV contracts for college football violated antitrust regulations. But in his decision ruling against the NCAA, Justice Stevens also included the line, “The NCAA plays a critical role in the maintenance of a revered tradition of amateurism in college sports.” The NCAA has leaned on this language over the past 35 years to maintain control over what benefits college athletes may receive, with the stated purpose of maintaining a separation between college and professional sports.

The NCAA contends that the “product” of college sports will be devalued if college athletes are allowed to be paid, as fewer consumers will be interested in watching college sports if players are perceived as professional. Given that this argument amounts to an argument about consumer demand, what does the social science research say on whether the public believes that college athletes should be paid? 

A recent Ohio State survey (the National Sports and Society Survey, NSASS) finds that a majority of Americans do support paying college athletes. This is a change from past research and potentially an important finding for the current legal challenges.
What has not changed are the racial dynamics of who is more likely to support paying college athletes, with Black Americans more likely than white Americans to support it.
The main reason that public opinion has shifted is a growing sense that college athletes are being exploited, especially in football and men’s basketball. Exploitation is primarily a moral issue, with college athletes (and increasingly others) questioning whether the exchange relationship between themselves and the university is fair. Black athletes have long reported feelings of exploitation, and a recent NBER working paper illustrated how revenue brought in by Black men is being used to fund educational opportunities for other, white athletes.
Even aside from questions of exploitation, the concept of amateurism itself is suspect. Scholars have long argued that amateurism is a fundamentally classist concept based on nineteenth-century ideas of aristocracy, morality, and a “purity” of sport based on exclusion of the working class. This flawed ideal has never described big-time American college sport, which has been commercialized and professionalized since its founding. It has certainly not described college athletics since the NCAA instituted one-year, renewable athletic scholarships in 1973, which in practice look and act like employment contracts.

Taylor Branch’s now classic piece in The Atlantic, “The Shame of College Sports,” as well as the publicity around Ed O’Bannon’s court case to allow college athletes to profit off of their likeness, illustrate how issues of amateurism and exploitation in college sports have firmly entered popular discourse. The decision in Alston won’t answer many questions about the future of college sports, but it does represent an effort by the NCAA to assert control over a rapidly-changing situation. Court watchers feel that the Court is unlikely to rule for the NCAA after seeing oral arguments, which will make name/image/likeness and potentially pay-for-play legislation all the more important over the next few years.
