When Selena Gomez launched Rare Beauty in 2020, the message was simple: challenge the notion that everyone must be perfect, and shine a light on mental health issues.
While this may sound like every budding makeup brand's dream, brands like Fenty Beauty had already shared similarly groundbreaking mission statements: bolstering inclusivity in the makeup industry and forcing other brands to do the same in the process.
Inspired by her 2020 album, Rare, Rare Beauty began with the basics: 48 foundation shades, lip balms and matte lip creams, eyebrow definers, and the now-iconic liquid blush. Four years later, it's hard to imagine a more viral, innovative celebrity makeup brand, one that remains in stride with Fenty.
The Rare Beauty Soft Pinch Liquid Blush quickly became TikTok's go-to staple product, and few would deny that no blush on the market is as pigmented, easily blendable, and long-lasting. Selena Gomez has proven herself a bona fide content creator with her charismatic social media posts for fun Rare Beauty launches like an under-eye brightener, an SPF-laden tinted moisturizer, and lip combos.
Not only is Rare Beauty inclusive in its shade range, but the rounded, spherical caps on its products are also disability-friendly, designed to be easier to grip and open.
As of 2024, Rare Beauty is a $2 billion company. But what sets the company apart is its attention to detail and genuine dedication to bettering the world. The same year Rare Beauty was founded, the Rare Impact Fund was created.
What Is The Rare Impact Fund?
In a statement on the Rare Impact Fund's website, Gomez writes:
“The Rare Impact Fund is committed to expanding access to mental health services and education for young people everywhere. We work with a strong network of supporters and experts to bring mental health resources into educational settings to reach young people.
Because no one – regardless of age, race, gender, sexual orientation, or background – should struggle alone.”
From the start, the Rare Impact Fund committed to raising $100 million by 2030. Along with corporate sponsorships and individual donations, 1% of proceeds from all Rare Beauty sales goes to the charity. By 2021, the fund had donated over $1.2 million in grants to eight mental health institutions, including the Yale Center for Emotional Intelligence.
In 2021, the Rare Impact Fund launched a GoFundMe for their new Mental Health 101 initiative. According to the GoFundMe,
“Mental Health 101 advocates for more mental health in education, empowers our community, and encourages financial support for more mental health services in educational settings through the Rare Impact Fund.”
Rare Beauty promised to match up to $200,000 in donations; to date, the GoFundMe has raised over $500,000, with contributions as recent as six months ago.
How The Rare Impact Fund Works
By leveraging both Selena Gomez's millions of social media followers and the four million people who follow Rare Beauty on Instagram, the Rare Impact Fund quickly gains visibility. Fans of the brand and of Gomez alike can help make a difference by donating even a few dollars in honor of their favorite actress-singer extraordinaire.
As of 2023, the Rare Impact Fund has supported grantees such as UCLA Friends of Semel Institute, Batyr, La Familia, Mindful Life Project, Black Teacher Project, and Trans Lifeline. According to its website, the fund has raised $6 million in contributions and distributed $3 million in grant support so far.
Rare Beauty and the Rare Impact Fund are blazing a trail for all brands: you can make a change while still delivering high-quality products — and it pays off.
China has forced at least 1,000,000 Uighur Muslims to undergo "re-education" training.
Remote buildings fenced in by barbed wire, governmental slogans urging citizens to declare their loyalty, and armed guards preventing entry and exit: history has highlighted these as familiar omens of totalitarian oppression. Now the international community is condemning the Chinese government's "re-education camps," in which approximately one million Uighur Muslims have been detained, as the latest government machination violating human rights.
Under claims of combating "religious radicalism," Chinese authorities have revised a law to condone the use of detention centers "to carry out the educational transformation of those affected by extremism." However, witness testimony and government documents have exposed a litany of human rights violations taking place in the camps under the guise of "vocational training" for the Uighur and other Muslim minority populations.
Chinese security in Xinjiang (The New York Times)
Within the camps, "re-education" programs not only restrict Muslims from practicing their religion, but impose a militant regimen of psychological indoctrination, including studying communist propaganda, reciting hymns to praise the Chinese Communist Party, writing "self-criticism" essays, and ritually giving thanks to Chinese President Xi Jinping. In what The New York Times calls "the country's most sweeping internment program since the Mao era," detainees are disciplined by thousands of guards armed with police batons, electric cattle prods, and pepper spray.
The camps are located in Xinjiang, an arid, autonomous region in China's northwest. It is the country's largest region and home to about 10 million Uighur Muslims among China's 1.4 billion people. Gay McDougall of the U.N. Committee on the Elimination of Racial Discrimination condemned the Chinese authorities' treatment of Muslims "as enemies of the state solely on the basis of their ethno-religious identity." Despite the Chinese government's initial claims that the camps' "students" were treated to amenities from ping-pong and TV to air conditioning and free dining, McDougall makes clear that Xinjiang has become "something resembling a massive internment camp, shrouded in secrecy, a sort of no-rights zone."
Most concerning are the reports of torture methods like waterboarding, sleep deprivation, and beatings for those who deviate from the program. A former detainee named Omir told the BBC in September, "They have a chair called the 'tiger.' My ankles were shackled, my hands locked to the chair. I couldn't move. They wouldn't let me sleep. They also hung me up for hours, and they beat me. They had thick wooden and rubber batons, whips made from twisted wire, needles to pierce the skin, pliers for pulling out your nails."
Abdusalam Muhemet and his three children in their Istanbul home (The New York Times)
Abdusalam Muhemet, a 41-year-old former restaurant owner, recited a verse from the Quran at a funeral in 2015 and was subsequently detained in a prison cell for seven months before being relocated to a Xinjiang camp. "That was not a place for getting rid of extremism," he recalled to The New York Times. "That was a place that will breed vengeful feelings and erase Uighur identity." Muhemet was released after two months of detainment; he was never charged with a crime.
700,000 Rohingya Muslims were forced to flee to neighboring Bangladesh in 2017.
On Monday, Facebook said it removed 13 pages and 10 accounts controlled by the Myanmar military in connection with the Rohingya refugee crisis.
The accounts were masquerading as independent entertainment, beauty, and information pages, such as Burmese popstars, wounded war heroes, and "Young Female Teachers." Fake postings reached 1.35 million followers, spreading anti-Muslim messages to social media users across the Buddhist-majority country.
Facebook's move comes a year after 700,000 Rohingya, a Muslim minority group in Myanmar, were forced to flee to neighboring Bangladesh amid widely-documented acts of mob violence and rape perpetrated by Myanmar soldiers and Buddhist mobs. The United Nations Human Rights Council denounced the crisis as "a textbook case of ethnic cleansing and possibly even genocide."
Rohingya children rummaging through the ruins of a village market that was set on fire (Reuters)
Last month, the social media giant announced a similar purge, removing Facebook and Instagram accounts followed by a whopping 12 million users. Senior General Min Aung Hlaing, commander-in-chief of the Myanmar armed forces, was banned from the platform, as was the military's Myawady television network.
Over the last few years, Facebook has been in the hot seat for its tendency to spread misinformation. During the 2016 U.S. presidential election, inauthentic Facebook accounts run by Russian operatives created 80,000 posts that reached 126 million Americans through likes, shares, and follows. The problem persisted into the 2018 midterm elections, ahead of which Facebook removed 559 pages that broke the company's policies against spreading spam and coordinating influence efforts. Recent campaigns originating in Iran and Russia have targeted not only the U.S. but also Latin America, the U.K., and the Middle East.
The situation in Myanmar is particularly troubling—it is not an effort by foreign powers to stoke hate and prejudice in a rival, but rather an authoritarian government using social media to control its own people. According to The New York Times, the military's Facebook operation began several years ago, with as many as 700 people working on the project.
Screenshots from the account of Myanmar's Senior General Min Aung Hlaing, whose pages were removed in August.
Claiming to show evidence of conflict in Myanmar's Rakhine State in the 1940s, the images are in fact from Bangladesh's war for independence from Pakistan in 1971.
Fake pages of pop stars and national heroes would be used to distribute shocking photos, false stories, and provocative posts aimed at the country's Muslim population. They often posted photos of corpses from made-up massacres committed by the Rohingya, or spread rumors about people who were potential threats to the government, such as Nobel laureate Daw Aung San Suu Kyi, to hurt their credibility. On the anniversary of September 11, 2001, fake news sites and celebrity fan pages sent warnings through Facebook Messenger to both Muslim and Buddhist groups that an attack from the other side was impending.
Facebook admitted to being "too slow to prevent misinformation and hate" on its sites. To prevent misuse in the future, the company plans to invest heavily in artificial intelligence to proactively flag abusive posts, make reporting tools easier and more intuitive for users, and continue education campaigns in Myanmar that share tips on recognizing false news.
The company described its work identifying and removing the misleading network of accounts in the country as "some of the most important work being done [here]."