Dall-E Mini, the AI-powered text-to-image generator, has taken over the internet. With its ability to render nearly anything your meme-loving heart desires, anyone can make their dreams come true.
DALL-E 2, whose name is a portmanteau of Salvador Dalí, the surrealist, and WALL-E, the Pixar robot, was created by OpenAI and is not widely available; it creates far cleaner imagery and was recently used to produce Cosmopolitan's first AI-generated cover. The art world has been one of the first industries to truly embrace AI.
The open-sourced miniature version is what’s responsible for the memes. Programmer Boris Dayma wants to make AI more accessible; he built the Dall-E Mini program as part of a competition held by Google and an AI community called Hugging Face.
And with great technology comes great memes. Typing a short phrase into Dall-E Mini will manifest nine different amalgamations, theoretically shaping into reality the strange images you've conjured. Its popularity often brings heavy traffic, resulting in an error that can usually be fixed by refreshing the page or trying again later.
If you want to be a part of the creation of AI-powered engines, it all starts with code. Codecademy explains that Dall-E Mini is a seq2seq model, "typically used in natural language processing (NLP) for things like translation and conversational modeling." Codecademy's Text Generation course will teach you how to use seq2seq, and the platform also offers opportunities to learn 14+ coding languages at your own pace.
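To make that description concrete, here is a toy sketch of the seq2seq shape: an encoder that folds an input sequence into a fixed-size context vector, and a decoder that emits output tokens from it one at a time. Every name, weight, and number below is illustrative and untrained; a real seq2seq model (like the one behind Dall-E Mini) learns its weights from data, so this shows only the data flow, not meaningful output.

```python
# Structural sketch of a seq2seq (encoder-decoder) model.
# All weights are random and untrained -- this illustrates the
# architecture's data flow, not a working translator or generator.
import random

VOCAB = ["<eos>", "a", "cat", "riding", "skateboard"]
HIDDEN = 8
rng = random.Random(0)

# Tiny fixed "embedding" and "output" weight tables (random, untrained).
embed = {tok: [rng.uniform(-1, 1) for _ in range(HIDDEN)] for tok in VOCAB}
out_w = [[rng.uniform(-1, 1) for _ in range(HIDDEN)] for _ in VOCAB]

def encode(tokens):
    """Encoder: fold the whole input sequence into one context vector."""
    state = [0.0] * HIDDEN
    for tok in tokens:
        vec = embed[tok]
        # Simple recurrent update: mix previous state with current embedding.
        state = [0.5 * s + 0.5 * v for s, v in zip(state, vec)]
    return state

def decode(state, max_len=6):
    """Decoder: emit output tokens one at a time until <eos> or max_len."""
    output = []
    for _ in range(max_len):
        # Score every vocabulary token against the current state.
        scores = [sum(w * s for w, s in zip(row, state)) for row in out_w]
        tok = VOCAB[max(range(len(VOCAB)), key=scores.__getitem__)]
        if tok == "<eos>":
            break
        output.append(tok)
        # Feed the emitted token back in, as a real decoder would.
        state = [0.5 * s + 0.5 * v for s, v in zip(state, embed[tok])]
    return output

prompt = ["a", "cat", "riding", "skateboard"]
print(decode(encode(prompt)))
```

The key idea the sketch preserves is the two-stage pipeline: the encoder's output is the only thing the decoder ever sees, which is what lets seq2seq map a sequence of one kind (a text prompt) to a sequence of another (translated text, or, in image models, image tokens).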
You can choose the Machine Learning Specialist career path if you want to become a data scientist who develops these types of programs, but you can also choose courses by language, by subject (what is cybersecurity?), or even by skill: build a website with HTML, CSS, and more.
Codecademy offers many classes for free, as well as a free trial; it's an invaluable resource that gives people of all experience levels the fundamentals they need to build the world they want to see.
As for Dall-E Mini, while some have opted to create beauty, most have opted for memes. Here are some of the internet’s favorites:
no fuck every other dall-e image ive made this one is the best yet pic.twitter.com/iuFNm4UTUM
— bri (@takoyamas) June 10, 2022
There’s no looking back now, not once you’ve seen Pugachu; artificial intelligence is here to stay.
Countless vets were underpaid after a software glitch at the Department of Veterans Affairs.
On Wednesday, the Department of Veterans Affairs told congressional staffers that it will not compensate veterans who were underpaid in their recent GI Bill benefits, despite the error originating in the department's own computer system.
For weeks, student veterans have reported missing or incorrect payments, either too high or too low, due to a problem in the department's software. The glitch stems from system changes under the new Forever GI Act, designed to afford veterans more financial stability to pursue their education. VA spokesman Terrence Hayes stated that "severe critical errors" occurred when the department implemented new standards for calculating stipends owed to veterans.
In response to these errors, the VA postponed using the new system until December 2019. Until then, they're deferring to the rates used in 2017, denying veterans a 1% increase in payments included in the Forever GI Act. In addition to delaying benefits, the VA also miscalculated housing allowances.
To those who were underpaid, the VA initially promised that it would issue retroactive payments. However, on Wednesday, officials told congressional staffers that they have no plans to issue payments, because doing so would require an all-encompassing audit of every education claim prior to December 2019, as many as 2 million claims, according to an aide.
Another aide told NBC News, "They are essentially going to ignore the law and say that that change only goes forward from December 2019."
Amidst the VA's refusal to comment and spokespersons' vague responses on the matter, it is unclear how many students have been underpaid or how much money is owed, but hundreds of thousands of veterans are thought to be affected. The department defends its actions with the claim that the audit required to reissue payments would only delay processing future claims, causing more veterans to suffer.
One of those veterans already feeling severe strain is Jane Wiley, 31, a former Marine who now serves as a reservist in the Air Force. Her husband is also a former Marine, and they support two children while she attends Texas A&M San Antonio. In October, she told NBC News that they had yet to receive their housing allowance through the GI Bill, despite filing all necessary paperwork. They were facing food and housing insecurity as a direct result.
Wiley lamented, "People are homeless and starving because they can't rely on getting their benefits. If it means making [VA] employees stay all night, then get it done because it's better than putting families in crisis." She added, "You can count on us to serve, but we can't count on the VA to make a deadline."
Under Secretary for Benefits Paul Lawrence is due to testify before the House Committee on Veterans' Affairs. Another key witness slated to appear resigned from the VA after news of underpaid veterans broke.
On Thursday, the VA denounced NBC News' original report as "misleading." Press Secretary Curt Cashour stated in an email sent to student veterans, "By the end of 2018, VA will install the current year uncapped DoD [basic allowance for housing] rates, and subsequently [monthly housing allowance] payments will follow this rate. For many students, this rate will be equal or higher than their current payments. Shortly after this update, VA will issue an additional payment to students who were underpaid for applicable terms."
How they'll define "applicable terms" in the new year is unclear, as it remains unspecified how long payments have been backed up or incorrect for how many veterans.
700,000 Muslims were forced to flee to neighboring Bangladesh in 2017.
On Monday, Facebook said it removed 13 pages and 10 accounts controlled by the Myanmar military in connection with the Rohingya refugee crisis.
The accounts were masquerading as independent entertainment, beauty, and information pages, such as Burmese popstars, wounded war heroes, and "Young Female Teachers." Fake postings reached 1.35 million followers, spreading anti-Muslim messages to social media users across the Buddhist-majority country.
Facebook's move comes a year after 700,000 Rohingya, a Muslim minority group in Myanmar, were forced to flee to neighboring Bangladesh amid widely-documented acts of mob violence and rape perpetrated by Myanmar soldiers and Buddhist mobs. The United Nations Human Rights Council denounced the crisis as "a textbook case of ethnic cleansing and possibly even genocide."
Rohingya children rummaging through the ruins of a village market that was set on fire. (Reuters)
Last month, the social media giant announced a similar purge, removing Facebook and Instagram accounts followed by a whopping 12 million users. Senior General Min Aung Hlaing, commander-in-chief of the Myanmar armed forces, was banned from the platform, as was the military's Myawady television network.
Over the last few years, Facebook has been in the hot seat for its role in spreading misinformation. In the 2016 U.S. presidential election, inauthentic Facebook accounts run by Russian operatives created 80,000 posts that reached 126 million Americans through likes, shares, and follows. The problem persisted into the 2018 midterm elections, ahead of which Facebook removed 559 pages that broke the company's policies against spam and coordinated influence efforts. Recent campaigns originating in Iran and Russia target not only the U.S. but also Latin America, the U.K., and the Middle East.
The situation in Myanmar is particularly troubling—it's not an effort by foreign powers to stoke hate and prejudice in a rival, but rather an authoritarian government using social media to control its own people. According to the New York Times, the military Facebook operation began several years ago with as many as 700 people working on the project.
Screenshots from the account of Myanmar's Senior General Min Aung Hlaing, whose pages were removed in August. Though claiming to show evidence of conflict in Myanmar's Rakhine State in the 1940s, the images are in fact from Bangladesh's war for independence from Pakistan in 1971.
Fake pages of pop stars and national heroes would be used to distribute shocking photos, false stories, and provocative posts aimed at the country's Muslim population. They often posted photos of corpses from made-up massacres committed by the Rohingya, or spread rumors about people seen as potential threats to the government, such as Nobel laureate Daw Aung San Suu Kyi, to hurt their credibility. On the anniversary of the September 11, 2001 attacks, fake news sites and celebrity fan pages sent warnings through Facebook Messenger to both Muslim and Buddhist groups that an attack from the other side was impending.
Facebook admitted to being "too slow to prevent misinformation and hate" on its sites. To prevent misuse in the future, they plan on investing heavily in artificial intelligence to proactively flag abusive posts, making reporting tools easier and more intuitive for users, and continuing education campaigns in Myanmar to introduce tips on recognizing false news.
The company described its work identifying and removing the misleading network of accounts in the country as "some of the most important work being done [here]."