Swifties, this one’s for you. It seems like Taylor Swift's Eras Tour has lasted eons. Yet somehow, there’s always something to talk about. Just thinking about how much she’s accomplished while on tour makes me want to buckle down, lock in, and channel my inner girlboss. But while I can’t even be bothered to cook dinner at home after a long day of work, Taylor is accomplishing milestones most musicians can only dream of. Let’s recap.
The Eras Tour began in March 2023 with its North American leg. It’s set to run until December 2024, with dates in Europe, Australia, Asia, and South America, spanning 152 shows across five continents.
As the queen of multitasking, Swift hasn’t stopped at selling out stadiums. Since the Eras Tour began, she’s released multiple albums — both new and re-recorded — and shaken up the tour setlist with each new release. Her run of new releases started on the first day of the tour with “All Of The Girls You Loved Before,” quickly followed by “The Alcott,” a feature on The National’s album — reciprocity for their work on her pandemic-era albums, Folklore and Evermore.
She also released Midnights: Late Night Edition (including the iconic collab with Ice Spice), as well as not one but two album re-recordings — Speak Now (Taylor’s Version) and 1989 (Taylor’s Version). As if that wasn’t enough, she announced her latest album, The Tortured Poets Department, in a Grammys acceptance speech. Talk about legendary. Since its release, she’s also been churning out deluxe versions and remixes to keep us on our toes. The Eras Tour was even made into a blockbuster concert film that brought Beyoncé to its premiere. Star power: confirmed.
But that’s just her work life. Her personal life is just as eventful. She ended her seven-year relationship with Joe Alwyn in April 2023, then entered a brief but controversial fling with 1975 frontman Matty Healy. Though it didn’t last long, the relationship was enough to inspire a whole album and catapult her into her current romance with Travis Kelce, aka America’s first nepo boyfriend. Now they’re the American royal couple — and she somehow had time to fly from tour to watch him play in the Super Bowl.
We all have the same hours in the day as Taylor Swift, but how she uses them will always be a mystery to me. I work eight hours a day and can barely manage a social life. Meanwhile, Taylor literally has it all — though conservatives are turning on her for daring to be a woman in her 30s who’s not married with kids. If that’s not proof that women can’t do anything right, I don’t know what is.
Clearly, she’s working late because she’s a singer. No wonder Taylor Swift became a billionaire months into her tour in October 2023. Her net worth is currently around 1.3 billion dollars, making her the only female musician to become a billionaire from her music.
Other entertainment billionaires like Rihanna, Kylie Jenner, Kim Kardashian, Jay-Z, and Kanye West joined the three-comma club thanks to ventures like clothing brands, beauty products, and other entrepreneurial pursuits. Rihanna has her Fenty empire. Kim has her award-winning SKIMS. Ye had Yeezy. But Taylor has an unbeatable music publishing catalog.
But Taylor isn’t just different from other billionaires because of how she earned her money. She’s the Taylor we know and love because of how she spends it. Her rollercoaster Eras Tour is how she’s made much of her fortune, and she’s using it to give back on a monumental scale. From individual donations to investing in local infrastructure, Taylor is literally changing lives at both the macro and micro level. And teaching us what to expect from all billionaires in the process.
The Eras Tour Bonuses — Talk About Workplace Benefits
First to make headlines were the Eras Tour crew bonuses. While some of us get rewarded with a pizza party or a $10 gift card to Starbucks, Taylor casually dropped $55 million in bonuses for her tour crew. The massive sum was paid out to everyone who makes the Eras Tour go around, from truck drivers to dancers and sound technicians.
In fairness, these bonuses are definitely well-deserved. Taylor’s shows run over three hours. Imagine dancing for that long — because Swift certainly isn’t the one with the impressive moves — for hundreds of tour dates. Or remembering countless combinations of light cues to go with a setlist that changes daily. Yeah, they’re clocking in. And if my boss had millions to blow, I’d be expecting a comfortable bonus too. But $55 million? That’s a testament to Swift’s generosity. It's like she's Oprah, but instead of cars, she's giving out life-changing amounts of cash. "You get a bonus! You get a bonus! Everybody gets a bonus!"
It’s similar to how Zendaya gave film equity to every member of the crew that worked on her controversial black-and-white drama, Malcolm & Marie. Filmed in a few days with a bare-bones crew during the peak of the pandemic, the film was Zendaya’s passion project with Sam Levinson, in which she starred alongside John David Washington. Though the film got mixed reviews, it captured the audience’s attention all the same. After all, it was Zendaya — and we’ll watch her in anything. So since the film sold to Netflix for a hefty sum, all the crew members got payouts from the deal on top of their salaries to reward their hard work.
Bonuses and equity payouts are common in many industries, but not entertainment. Even though it’s one of the most lucrative and recognizable American industries, most entertainers don’t make enough to survive. The SAG and WGA strikes last year were proof that there needs to be systemic change in the industry. LA County has even identified show businesses as risk factors for being unhoused — after all, how many stories do we hear of actors who were living in their cars before their big break? And for many, their big break never comes. For even more, they get hired on amazing gigs with giant performers … then go right back to the grind afterward. While individual actions from our favorite stars won’t fix everything, Zendaya and Taylor are providing models for how Hollywood should treat the people who make this town go round.
And in this economy, even a little bit goes a long way. Inflation and the cost of living are no joke, especially when, as with many creative careers, you often have to invest in lessons or equipment for your craft. With all this considered, the impact of Swift’s bonuses can’t be overstated. Imagine getting a lump sum of cash for dancing to your favorite Taylor Swift tracks. Talk about a dream job.
The Economic Impact of Swift - Swiftonomics, if you will
Like Barbie and Beyoncé last year, Swift is still on a tear, boosting the economy of every city she plays just by traveling there — and inspiring others to make the trek, too.
The Barbie movie proved that by marketing to women (instead of just making Marvel flops like Madame Web that aren’t really targeted to women at all), the entertainment industry can make giant profits. Barbie fever went beyond the theater. Thanks to a plethora of product collabs, the phenomenon rippled through retail.
Similarly, Beyoncé’s Renaissance Tour generated an estimated $4.5 billion for the American economy. According to NPR, that’s almost as much as the entire 2008 Olympics earned for Beijing. People were taking money out of their 401(k)s to pay for Beyoncé tickets and the glittery, silver-hued outfits to rock at her shows. Cities even started calling her effect the “Beyoncé Bump.”
Swift has the same effect. She’s not just proving her generosity on a micro-scale for the people close to her, she’s having actual, tangible effects on the economy. It's like she's leaving a trail of dollar bills in her wake, and cities are scrambling to catch them like it's a country-pop, capitalist version of musical chairs.
The US Travel Association called it the Taylor Swift Impact after she generated over $5 billion in just the first five months of the Eras Tour. But how does this work? It’s not like Taylor is printing more money at those shows — but it almost is. Her tour dates are pretty much economic steroid shots for local businesses. Hotels are booked solid, restaurants are packed, and let's not even get started on the surge in friendship bracelet supplies.
“Swifties averaged $1,300 of spending in local economies on travel, hotel stays, food, as well as merchandise and costumes,” says the US Travel Association. “That amount of spending is on par with the Super Bowl, but this time it happened on 53 different nights in 20 different locations over the course of five months.” And that’s to say nothing of her effect on the actual Super Bowl and the entire NFL season, thanks to her ball-throwing boyfriend.
It's like she's created her own micro-economy, and everyone's invited to the party. And unlike some economic theories that rely on wealth trickling down (spoiler alert: it doesn't), Taylor's wealth is more like a t-shirt cannon or the confetti at her shows — showering everyone around.
Donations that actually do good
Taylor isn’t just stepping into cities and calling it a night. She’s also not just throwing pennies at problems - she's making significant contributions that are changing lives. And more importantly, she's using her platform to encourage her fans to do the same.
She kicked off her tour with quiet donations to food banks in Glendale, Ariz., and Las Vegas ahead of the Eras Tour. Once the tour was in full swing, she continued this practice. In Seattle, she donated to Food Lifeline, a local hunger relief organization. In Santa Clara, she showed some love to Second Harvest of Silicon Valley. And let's not forget about her $100,000 donation to the Hawkins County School Nutrition Program in Tennessee.
She’s been making similar donations overseas. Swift donated enough money to cover a year of food bills across 11 food banks and community pantries in Liverpool. She also covered 10,800 meals for Cardiff Foodbank and supported many more food banks across the UK and EU. Her impact is so profound that her donations are doing more to combat hunger than some governments.
Can billionaires actually be good?
One thing about me, I’m always ready and willing — knife and fork in hand — to eat the rich. Because fundamentally, can any billionaire really be good? In our late-stage capitalist horror story, the answer is usually no. Look how many of them are supporting the Trump campaign just to get some tax breaks.
But here's the thing - Taylor Swift might just be the exception that proves the rule. She's not perfect, sure. She still flies private jets and probably has a carbon footprint bigger than Bigfoot. But unlike most of the others in her tax bracket, she's not flaunting her wealth like it's a personality trait.
Take a look around. We've got billionaires trying to colonize Mars instead of, I don't know, helping people on Earth. In this context, Taylor's approach is more like that of MacKenzie Scott, Jeff Bezos's ex-wife. She's not trying to escape to another planet - she's trying to make this one better.
And look, I'm not saying we should stop critiquing billionaires or the system that creates them. But she's just setting the bar for what we should expect from all billionaires. She's showing us that our collective power as fans can translate into real-world change. That our love for catchy choruses and bridge drops can somehow, improbably, lead to food banks getting funded and crew members getting life-changing bonuses.
So sorry to my neighbors who hear me belting “Cruel Summer” and “right where you left me” at the top of my lungs (and range). Just know it’s for the greater good.
Is The Grid On The Skids?
AI Data Centers Relocate, Increasing Strain on Power Grid
Great. AI is not only stealing, regurgitating, and degrading the original creative work of actual human beings, it’s wreaking havoc with the nation’s energy resources. This is putting an enormous strain on the electric grid that delivers our energy.
Evan Halper succinctly describes the problem in a recent Washington Post article: “Vast swaths of the United States are at risk of running short of power as electricity-hungry data centers and clean-technology factories proliferate around the country, leaving utilities and regulators grasping for credible plans to expand the nation’s creaking power grid.”
Fast Company’s Chris Morris highlights two potential negative effects of this unprecedented demand for power. The first is financial: the bill for upgrading the grid may be fobbed off on residential users rather than on the electricity-gobbling data centers.
The second effect is environmental. Utility companies, Morris writes, “are lobbying to delay the shutdown of fossil fuel plants...to meet the surge in demand.”
The problem is exacerbated by a shift in where AI data centers are located. “In the past,” Halper writes, “companies tried to site their data centers in areas with major internet infrastructure, a large pool of tech talent, and attractive government incentives. But these locations are getting tapped out.” Low-profile cities in Ohio, Iowa, and Indiana – to name only a few – are or will soon be home to huge new AI “factories.” Power supplies in these locales have traditionally been limited, placing a further burden on an old and overworked grid.
Video: “AI is a drain on energy and water” | UBS Trending (youtube.com)
As the BBC reports, it’s not just an American issue. The UK is a case in point. “There is currently a moratorium preventing the construction of new data centres in Dublin. Nearly a fifth of Ireland’s electricity is used up by data centres, and this figure is expected to grow significantly in the next few years... data centre electricity demand in the UK will rise six-fold in just 10 years, fueled largely by the rise of AI.”
June Kim of the MIT Technology Review offers a far more positive view of AI as the perfect tool to protect and enhance the grid. “AI’s ability to learn from large amounts of data and respond to complex scenarios makes it particularly well-suited to the task of keeping the grid stable, and a growing number of software companies are bringing AI products to the notoriously slow-moving energy industry.” Kim envisions a world in which AI runs a fully automated grid, but recognizes that issues of data security, reliability, and social/economic bias render such a scenario impossible for now.
Time and again, new technology has been presented as the scientific equivalent of penicillin, a one-size-fits-all panacea and boon to suffering humanity. Remember how the Internet was going to make a heaven out of earth? Look how that turned out. And wasn’t the Atom Bomb going to put an end to warfare? You go, technocrats!
It’s a truism to say that technology is not, in itself, bad or dangerous or evil. It’s the uses to which it’s put that determines the ethical or unethical nature of a given development. But as Calvin Coolidge once said, “The chief business of the American people is business,” and business is notoriously blind to anything but profit. Remaining hopeful that AI in all its manifestations and consequences will benefit the global family is not an easy task. One fears we are whistling in the dark.
And, when an over-burdened grid finally collapses, we will be.
How the Internet Is Changing Your Brain
In a way, we're all living in the matrix: going through our days within an illusion of freedom when really our lives are completely dictated by technology.
Updated: 3/18/2024
40-odd years ago, there was no such thing as a cell phone, and the only computers in existence took up entire rooms. Then the World Wide Web was born.
20 years ago, the iPhone was just a seed of a dream in Steve Jobs' mind. But today, if you're reading this, you have access to countless screens and endless gigabytes of information; and you probably have a phone in your pocket that you can't be separated from without experiencing a cold rush of panic. Like it or not, you live in the digital age.
Everything is happening so fast these days; it's hard to find the time to seriously question how technology has altered the fabric of our realities. But here are four major ways the Internet has made our minds different from how they were before—so much so that we can never go back.
1. We never have to wonder about anything
Once upon a time, if you were sitting at dinner and a question came up about, say, climate change or the effects of a certain drug, you would have to either find someone who knew the answer or wait until a library opened. Then you'd have to go there and parse through the Dewey Decimal System until you found a volume that might be able to provide the answer.
Today, we all have any piece of information, no matter how small or obscure, quite literally at our fingertips. So we should be smarter than ever, right? But all this instantly accessible information comes at a price. One study found that millennials' memories are worse than seniors'; and a recent Columbia University study revealed that if people believe they'll be able to look something up later, they're less likely to remember it.
In his book The Shallows: What the Internet Is Doing to Our Brains, Nicholas Carr argues that technology is making us stupider, less likely to think critically and retain the information we need. Part of this is because every time we go online, we are confronted with billions of sources vying for our attention, making it difficult to deploy the kind of focused concentration needed to synthesize and reflect on information.
Also, now that we have endless information at our fingertips, many people have proposed that we may be less curious than ever, less inclined to come up with original ideas. However, curiosity is a fluid entity, and though the Internet offers more resources than ever, that also means that more people are creating content than ever before. And new innovative technologies are cropping up every day, revealing that although the Internet might be making some of us stupider, it's also a fertile breeding ground for incredible, world-changing inventions and unprecedentedly viral content.
2. We're more interconnected—and lonelier than ever
Once upon a time, you had to call someone up to speak to them, but now you can see what any of your friends are doing at any time. Instagram and Snapchat stories make it possible to share intimate images of our lives on a wide scale with huge audiences at any time; and online algorithms make it so that whatever you post will never truly vanish from the Internet, even if you delete it. We can see the daily coffee choices and midnight tear-stained selfies of our favorite stars; we can hit up old friends from across the globe with a single Facebook search.
Humans have always been hard-wired for connection, desperately looking for kinship and community, and so it makes sense that the Internet has become so addictive. Every ping, alert, and notification provokes the same kind of dopamine rush that comes from an expression of love and friendship. On the other hand, cyberbullying and persistently comparing oneself to others in the virtual sphere can both have very adverse effects in the real world.
Some studies have proposed that social media increases levels of loneliness. One found that heavy Facebook, Snapchat, and Instagram use can contribute to depression in young adults. Excessive time on Facebook has also been found to be associated with poor physical health and life satisfaction. On the other hand, social media has presented an opportunity for isolated adults and senior citizens to reach out and connect; and online fan and lifestyle communities provide oases for people all over the world.
For better or for worse, the Internet has changed the way we connect. It's also changed the way we love. 26 million matches are made every day on dating apps, and roughly 13% of people who met on dating apps got married. And phones allow us to communicate with anyone at any moment of the day, creating whole new rules and expectations for relationships, making them altogether more interactive and involved than they once were. Plus, adult entertainment is fundamentally changing the way we have sex, with many studies revealing that it's lowering sex drives and creating unrealistic expectations across the board.
It's the same for work: a Fortune study found that the average white-collar worker spends three hours per day checking emails. This comes part and parcel with the gig economy, that staple of Millennial culture built on perpetual interconnectedness and 24/7 "hustle"—a phenomenon that often leads to burnout.
3. We can have more than one reality—or can hide inside our individual worlds more easily than ever
The Internet has made it easier than ever to craft false personas and to embody illusory identities. We can use Photoshop to alter our appearances; we can leverage small talents to viral fame and huge monetary gains, and we can completely escape our world in exchange for online communities and ever-growing virtual and augmented reality options.
The Internet is also altering our perceptions of reality. Although people once thought that interconnected online communities would facilitate the sharing of diverse viewpoints, it has turned out that social media allows us to access echo chambers even more isolated and partisan than what we'd see in our real lives.
In short, we're all at risk of being catfished.
4. Many of us are completely addicted
When was the last time you went a day without checking your phone? A week? And do you think that if you needed to, you could quit? Most likely, the answer is no, so you'd better believe it: you're addicted to technology. But you're not alone. A 2017 study found that 210 million people may be addicted worldwide.
There are five primary types of Internet addictions: cyber-sexual addiction, net compulsions (online shopping), cyber relationships (online dating), gaming, and information seeking (surfing). In recent years, internet addiction rehab has grown in popularity. The majority of people with legitimate internet addiction problems are men in their teens to late thirties, but it's likely that we all suffer from this to some extent.
Although the Internet is changing everything about our lives, ultimately, there is no clear consensus on whether these changes are for the worse or the better. But the changes will only grow more extreme over the years. Moore's Law proposes that, essentially, overall processing power will double roughly every two years—meaning that technology will continue to advance at an unimaginable rate. If the past twenty years have given us iPhones, what will the next twenty bring? The next hundred, if we make it that far without global warming ending everything?
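The arithmetic behind that claim is easy to sketch. Here's a minimal back-of-the-envelope calculation, assuming an idealized clean doubling every two years (real hardware only approximates this, and the function name is ours, not a standard one):

```python
def moores_law_factor(years: float, doubling_period: float = 2.0) -> float:
    """Projected growth multiple after `years`, given one doubling
    every `doubling_period` years (an idealized Moore's Law)."""
    return 2 ** (years / doubling_period)

# One doubling period should exactly double capability.
print(moores_law_factor(2))        # 2.0

# Over twenty years, ten doublings compound to roughly a thousandfold.
print(round(moores_law_factor(20)))  # 1024
```

The takeaway is that the growth is exponential, not linear: the "next twenty years" in the paragraph above would, under this idealized assumption, bring another thousandfold leap on top of the last one.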
Only time will tell. We won't be the same—but then again, we were never meant to remain stagnant as a species. Change and chaos are the laws of the human race, and as a species, we've always been obsessed with progress.
Some theorists believe that technological progress will only end when we create an operating system more intelligent than us, in a revolutionary event called the singularity: a theoretical future moment when computer intelligence surpasses that of humans. If this happens, the AI could decide to eliminate us. That's another story—but until then, the sky is the limit for innovators and consumers everywhere.
Eden Arielle Gordon is a writer and musician from New York City. Follow her on Twitter @edenarielmusic.
8K Writers Can’t Be Wronged - AI Platforms “Scraping” & Stealing Bestselling Books
Close to 8,000 writers recently signed a letter from The Authors Guild protesting the unauthorized use of their stories.
Technology is inescapably linked to the art and craft of writing. Humanity’s desire to share and preserve its thoughts, its pleasures, its discoveries, its knowledge, and its very survival led to hieroglyphics, the development of paper and ink, to the printing press, the typewriter, and the computer. Technology’s traditional aim was to make it faster and easier to create and disseminate the written word. Now it seems technology’s out to eliminate writers altogether.
Or, at the very least, the writers’ livelihoods.
NPR’s Chloe Veltman tells us that The Authors Guild – an organization founded in 1912 to “support working writers and their ability to earn a living from authorship” – is taking on “artificial intelligence companies like OpenAI and Meta” which use writers’ work “without permission or compensation.” As Veltman describes it, “text-based generative AI applications like GPT-4 and Bard...scrape the Web for authors' content without permission or compensation and then use it to produce fresh content in response to users' prompts”.
Approximately eight thousand writers, Veltman reports, have signed a Guild letter protesting such unauthorized use of their material. Some of the better-known scribes include Nora Roberts, Viet Thanh Nguyen, Michael Chabon, and Margaret Atwood.
The Authors Guild’s petition is not the only action being taken in the wake of AI. Other writers have filed class-action suits against AI companies, claiming their work is being pirated. AI is one of the main reasons for the Writers Guild of America’s strike (starting May 2nd), bringing American film and television production to a complete standstill. The New York Times summarizes the WGA’s position: “Writers are asking the studios for guardrails against being replaced by AI, having their work used to train AI or being hired to punch up AI-generated scripts at a fraction of their former pay rates.”
Award-winning writer/director Doug Burch describes the WGA strike as “vital to the future of those wanting basic living wages...It’s truly despicable when CEOs make $400 million a year and say that writers and actors are being unrealistic wanting to at least make a living wage.”
And just what is the average yearly salary? A forthcoming report from The Authors Guild asserts that the median income for a full-time writer was $23,000 in 2022. This comes after a precipitous 42% decline in writers' incomes between 2009 and 2019.
History proves time and again that the haves never give anything to the have-nots without being forced to “share the wealth.” Whether it's coal mining, auto manufacturing, or movie-making, it’s taken the commitment of generations of die-hard activists to help address an economic imbalance.
The writers have one huge strength, something no boss or executive can do without – their talent, their craft, originality, passion, and their grit. As I understand it, AI can synthesize, imitate, mimic a writer’s work. The one thing it can’t do is create original thought and original material. Writers – with their unique perspectives and experiences, their individual and idiosyncratic use of language, and their ability to capture human behavior in all its grunge and glory – cannot be replaced.
Books, films, non-fiction, graphic novels, and poems are not merely material to be scraped, stolen, and exploited. They’re not “a data set to be ingested by an AI program.” They hold our past, our future, and our quotidian lives; they teach us what it is to be human. This is a writer’s work.
The message is clear – Support the writers.
Researchers Have Created an AI Too Dangerous to Release. What Will Happen When It Gets Out?
The GPT-2 software can generate fake news articles on its own. Its creators believe its existence may pose an existential threat to humanity. But it could also present a chance to intervene.
Researchers at OpenAI have created an artificial intelligence software so powerful that they have deemed it too dangerous for public release.
The software, called GPT-2, can generate cohesive, coherent text in multiple genres—including fiction, news, and unfiltered Internet rants—making it a prime candidate for creating fake news or fake profiles should it fall into the wrong hands.
Fears like this led the Elon Musk-founded company OpenAI to curtail the software's release. "Due to our concerns about malicious applications of the technology, we are not releasing the trained model," they announced in a blog post. "As an experiment in responsible disclosure, we are instead releasing a much smaller model for researchers to experiment with, as well as a technical paper."
In addition to writing a cohesive fictional story based on Lord of the Rings, the software wrote a logical scientific report about the discovery of unicorns. "In a shocking finding, scientists discovered a herd of unicorns living in a remote, previously unexplored valley, in the Andes Mountains," the software wrote. "Even more surprising to the researchers was the fact that the unicorns spoke perfect English. The scientist named the population, after their distinctive horn, Ovid's Unicorn. These four-horned, silver-white unicorns were previously unknown to science."
This journalistic aptitude sparked widespread fears that AI technologies as sophisticated as the GPT-2 could influence upcoming elections, potentially generating unfathomable amounts of partisan content in a single instant. "The idea here is you can use some of these tools in order to skew reality in your favor," said University of Washington professor Ryan Calo. "And I think that's what OpenAI worries about."
Elon Musk quit OpenAI in 2018, but his legacy of fear and paranoia regarding AI and its potential evils lives on. The specter of his caution was likely instrumental in keeping GPT-2 out of the public sphere. "It's quite uncanny how it behaves," echoed Jack Clark, policy director of OpenAI, when asked about his decision to keep the new software under locks.
In a world already plagued by fake news, cat-fishing, and other forms of illusion made possible by new technology, AI seems like a natural next step in the dizzying sequence of illusion and corruption that has rapidly turned the online world from a repository of cat videos (the good old days) to today's vortex of ceaselessly reproduced lies and corrupted content. Thinkers like Musk have long called for resistance against AI's unstoppable growth. In 2014, Musk called AI the single largest "existential threat" to humanity. That same year, the late physicist Stephen Hawking ominously predicted that sophisticated AI could "spell the end of the human race."
But until AI achieves the singularity—a level of consciousness where it achieves and supersedes human intelligence—it is still privy to the whims of whoever is controlling it. Fears about whether AI will lend itself to fake news are essentially fears of things humans have already done. All the evil at work on the Internet has had a human source.
When it comes down to the wire, for now, AI is a weapon.
When AI is released into the world, a lot could happen. AI could become a victim, a repository for displaced human desire. Some have questioned whether people should be allowed to treat humanoid creatures in whatever ways they wish to. Instances of robot beheadings and other violent behaviors towards AI hint towards a darker trend that could emerge should AI become a free-for-all, a humanoid object that can be treated in any way on the basis of its presumed inhumanity.
Clearly, AI and humanity have a complex and fundamentally intertwined relationship, and as we all become more dependent on technology, there is less of a clear line dividing the human from the robotic. As a manmade invention, AI will inevitably emulate the traits (as well as the stereotypes) of the people who created it. It could also take on the violent tendencies of its human creators. Some thinkers have sounded the alarm about this, questioning the dearth of ethics in Silicon Valley and in the tech sphere on the whole. Many people believe that AI (and technology in general) is fundamentally free of bias and emotion, but a multitude of examples have shown that this is untrue, including instances where law enforcement software systems displayed racist bias against black people (based on data collected by humans).
AI can be just as prejudiced and close-minded as a human, if not more so, especially in its early stages where it is not sophisticated enough to think critically. An AI may not feel in and of itself, but—much like we learn how to process the world from our parents—it can learn how to process and understand emotions from the people who create it, and from the media it absorbs.
The quandary facing AI developers may not be so different from the struggle parents face when deciding whether to allow their children to watch R-rated movies. In this case, both the general public and the AIs are the children, and the scientists, coders, and companies peddling new inventions are the parents. The people designing AIs have to determine the extent to which they can trust the public with their work. They also have to determine which aspects of humanity they want to expose their inventions to.
OpenAI may have kept their kid safe inside the house a little longer by withholding the full version of GPT-2, but that kid is growing—and when it goes out into the world, it could change everything. For better or worse, at some point, super-intelligent AI is going to wind up in the public's hands. Now, during its tender, formative stages, there is still a chance to shape who it's going to be when it arrives.
Eden Arielle Gordon is a writer and musician from New York City. Talk to her about AI on Twitter @edenarielmusic.
AI and the Hiring Process
Are games and algorithms the future of the interview process?
HR reps and middle managers alike often find themselves with employees who look good on paper and talk a big game during their interview, but don't deliver once they've been hired. On top of this, there's nothing really stopping a potential employee from flat-out lying during the hiring process. If an interviewee gets caught in a lie, she won't get hired, but she didn't have a job to begin with, so she's no worse for wear. To mitigate these and the myriad other difficulties associated with the hiring process, employers have started using (in a somewhat ironic twist) artificial intelligence to aid with recruiting.
Outside of the difficulties discussed above, one of the primary motivators for companies' move toward automated recruiting is money. It can cost nearly a quarter million dollars to recruit, hire, and onboard a new employee, and when someone turns out to be a dud, the effects can reverberate throughout the entire company. That said, it's not as if corporations have HAL from 2001: A Space Odyssey handpicking the optimal candidate—not yet, at least. Different AI developers offer different things. For example, x.ai specializes in scheduling interviews, Filtered automatically generates coding challenges for aspiring programmers looking for work, and Clearfit has a tool that can rank potential candidates.
These programs, however useful, only free employers from the low-level clerical work of hiring. The bulk of the sorting and selecting of candidates still falls squarely on the shoulders of the hiring manager. Cue Pymetrics, a company built on the idea of replacing the way in which we conduct interviews and hire new employees. Pymetrics' AI uses a series of algorithms and cognitive science-based games to help pair employees and companies. The games, though, are what differentiate Pymetrics from the competition.
The idea is simple: when an employer gives an applicant a test or asks her a series of questions, the applicant answers in a way that comports with what she thinks the interviewer wants to hear. With an objective-based game where a clear goal is outlined, a candidate has a much harder time masking her methodology. The games Pymetrics develops reportedly measure 90 "cognitive, social and personality traits" and are used to gather data on a company's top performers. After enough data is collected, Pymetrics can then create the perfect composite employee. Every applicant is then measured against this composite, giving employers an objective look at who is best for the job.
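The composite-matching idea described above can be sketched in a few lines. This is a hypothetical illustration, not Pymetrics' actual system: the trait names, scores, and the use of cosine similarity are all assumptions made for demonstration, and a real system would measure around 90 traits rather than two.

```python
import math

def composite_profile(top_performers):
    """Average each trait across a company's top performers to
    form a single 'ideal employee' vector."""
    traits = top_performers[0].keys()
    n = len(top_performers)
    return {t: sum(p[t] for p in top_performers) / n for t in traits}

def cosine_similarity(a, b):
    """Score how closely an applicant's trait vector points in the
    same direction as the composite (1.0 = identical profile)."""
    dot = sum(a[t] * b[t] for t in a)
    norm_a = math.sqrt(sum(v * v for v in a.values()))
    norm_b = math.sqrt(sum(v * v for v in b.values()))
    return dot / (norm_a * norm_b)

# Hypothetical trait scores gathered from gameplay data
top_performers = [
    {"risk_tolerance": 0.8, "planning": 0.6},
    {"risk_tolerance": 0.7, "planning": 0.7},
]
ideal = composite_profile(top_performers)

applicant = {"risk_tolerance": 0.9, "planning": 0.3}
score = cosine_similarity(applicant, ideal)
```

Every applicant gets a similarity score against the same composite, which is what lets employers claim an "objective" ranking—though, as discussed below, the objectivity is only as good as the data behind the composite.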
The use of games, however, is far from a passing trend, and it is not unique to Pymetrics. A Deloitte report recently revealed that nearly 30% of all business leaders use games and simulations in their hiring process. Unfortunately for companies hoping AI and algorithmic programs are the cure-all for their hiring woes, the same report concluded that over 70% of these business leaders found cognitive games to be a "weak" indicator of employee success. Still, to throw another wrench into the equation, there is significant evidence that algorithms do outperform humans when it comes to selecting ideal candidates. In reality, though, humans and AI systems are just better at different things. A person, with only so much time in the day, can't accurately or quickly read through thousands of resumes and cover letters. But while algorithms are good at narrowing down selections and rejecting clear wrong fits, they aren't particularly well suited to sussing out passion or work ethic. There's also another, less obvious issue attached to AI hiring.
AI and algorithmically based hiring is supposedly unbiased, leaving no room for pettiness, racism, or sexism in the selection process. That said, today's leaders in AI technology are far from working out all the kinks. A crime-predicting algorithm used in Florida recently labeled black people as potential criminals twice as often as it did white people. It also isn't an unrealistic jump in logic to suggest that an algorithm could see a demographic inconsistency, such as the best salesman at a particular firm happening to be male, and conclude that it should rank female job applicants lower. Pymetrics, in particular, claims that its algorithms are rigorously designed to avoid this type of bias, but the issue calls into question not only AI's efficacy but its ethics as well. According to Sandra Wachter, a researcher at the Alan Turing Institute and the University of Oxford, "Algorithms force us to look into a mirror on society as it is," and relying too heavily on data can make it seem as though our cultural biases are simply inscrutable facts. This is what Arvind Narayanan, a professor of computer science at Princeton, calls the "accuracy fetish"—a fetish all too prevalent in Silicon Valley, where AI is consistently touted as objective.
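The kind of disparate impact described above can at least be measured. Below is a minimal sketch of one common check: comparing selection rates across demographic groups. The data, field names, and groups are hypothetical, and this is just one simple auditing technique, not a claim about how any particular vendor tests its systems. (The "four-fifths" threshold in the comment is a rule of thumb drawn from U.S. employment-selection guidelines.)

```python
def selection_rates(candidates, selected_ids, group_key):
    """Fraction of each demographic group that the model selected.
    Large gaps between groups flag potential disparate impact."""
    rates = {}
    groups = {c[group_key] for c in candidates}
    for g in groups:
        members = [c for c in candidates if c[group_key] == g]
        chosen = [c for c in members if c["id"] in selected_ids]
        rates[g] = len(chosen) / len(members)
    return rates

# Hypothetical applicant pool and model output
candidates = [
    {"id": 1, "gender": "F"}, {"id": 2, "gender": "F"},
    {"id": 3, "gender": "M"}, {"id": 4, "gender": "M"},
]
selected = {3, 4}  # the model picked only the male candidates

rates = selection_rates(candidates, selected, "gender")
# Four-fifths rule of thumb: the lowest group's rate should be
# at least 80% of the highest group's rate; here it is 0%.
```

A check like this can't explain why a model is skewed, but it makes the skew visible—exactly the "mirror" Wachter describes.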
In a lot of ways, it's hard to argue against algorithmic hiring procedures. They save both time and money, and they have been proven to work in several cases. The danger is not that this technology will supplant HR reps; people will continue to be a part of the interview process, if only because liking the people you work with is one of the most important facets of productivity. Algorithms only become a problem when they're treated as infallible oracles, capable of answering questions inaccessible to the human mind, rather than as pieces of machinery. It's important to remember that the algorithm is a tool, an electric drill to the interview process's hand crank. AI isn't meant to replace human judgement, but to narrow the gap between rote tasks and decisions that require said judgement. In this metaphor, people aren't the hand crank or the electric drill; we're the screw.
That said, it's human nature to appeal to authority, and the question at the heart of the Luddite's fear is whether we can demystify this technology enough to keep trusting our guts over an algorithm's calculations. Arthur C. Clarke once said, "Any sufficiently advanced technology is indistinguishable from magic." We'll find out soon enough whether he was right.