A-Ay Ay Ay
In a shocking turn of events, AI was everywhere at SXSW.
Groundbreaking, we know.
We went into this year’s festival expecting the usual medley of innovation, branding, a little futurism, a little delusion, and at least one activation that made us question the state of late-stage capitalism. What we got was a full-force reminder that AI is no longer a niche conversation for tech bros, startup founders, or the guy on LinkedIn who says “let’s unpack that” before ruining everyone’s morning. It is here, it is embedded in everything, and it is moving faster than our institutions, our infrastructure, and frankly, our nervous systems.
That sounds dramatic, but SXSW 2026 did not exactly whisper the message.
The biggest gut punch came from futurist Amy Webb, who held a literal funeral for her annual trend report because, in her words, a yearly report can no longer keep pace with the rate of change. Trends are too slow now. The future is no longer arriving politely, one keynote at a time. It is converging, colliding, and forcing decisions in real time. That theatrical funeral was funny, bleak, smart, and honestly a little comforting, because at least someone in the room was willing to say the quiet part out loud: if we keep making short-term decisions about long-term systems, we are going to build a future that works beautifully for machines and terribly for people.
And that, more than anything, was our big SXSW takeaway.
Not “AI is good” or “AI is bad.”
Not “we’re doomed” or “everything’s fine.”
It was this: AI is not just changing how we work. It is changing what work is, who benefits from it, what gets optimized, what gets discarded, and what kind of society gets built around it.
That is a much bigger conversation than productivity hacks and robot headshots.
AI and your job, your paycheck, your basic sense of self
One of Webb’s most unsettling ideas was the “unlimited workforce.” Not workers. Workforce. A system that does not sleep, does not take PTO, does not need healthcare, does not call in sick because their kid has the flu, and does not need to be reassured in a one-on-one that yes, they are doing great. That is a capitalist fantasy if you are a corporation, and a deeply destabilizing prospect if you are just a regular ol’ human being.
The question is not whether AI can save time. It can.
The question is: who gets that time back?
Do workers get a more humane workday? More flexibility? More room for creativity, caregiving, rest, and actual living? Or do companies simply take the gains, cut headcount, and call it innovation?
Because if the answer is the latter, then AI is not a productivity tool. It is a wealth transfer mechanism. I think we’re kidding ourselves if we say it’s a little from column A and a little from column B.
That was part of what made SXSW feel so urgent this year. So many of the flashy examples onstage were impressive. Some were genuinely cool. But lurking underneath them was a more uncomfortable truth: when labor becomes optional, people become negotiable. And that has ripple effects far beyond employment. Work in America is not just how we earn money. It is how we organize identity, status, routine, community, and purpose. Remove that without a plan, and you do not get liberation. You get (financial and other) instability, loneliness, and a whole lot of people being told to “reskill” while the ground is still moving under them.
The seductive part: AI really can make things better
To be clear, SXSW was not all doom.
One of the more compelling panels featured Google’s Suzana Apelbaum and Paul Aaron of Addition, and their argument was not that AI should replace human creativity, but that it can expand it. Their examples were strong: real-time content generation, location-based dynamic ads, creative systems that respond to culture as it happens, and filmmaking tools that allow artists to create visuals that would otherwise be impossible, expensive, or ethically difficult to capture. One example that stuck with us was the use of AI in a film to realistically render a newborn and in-womb scenes without having to exploit an actual infant performer. That is not soulless. That is a thoughtful use of a tool.
There was also a recurring theme across several sessions that AI can make products and systems more human, not less, when used with intention. Katie attended a session on causal AI and the microbiome that focused on how brands can design products that work with the body rather than just suppress symptoms. That may sound worlds away from the future-of-work panic spiral, but it actually reinforced the same point: better outcomes happen when technology is aligned to human needs instead of deployed for efficiency alone.
That is the line right there.
Used well, AI can compress production time, open up new creative possibilities, support healthcare, improve accessibility, and help solve real problems.
Used badly, it becomes slop, surveillance, cost-cutting, and automated alienation.
Accessibility is not a legal checkbox
One of the smartest panels we saw was “The Next Digital Wave Excludes Millions. Let’s Fix It.” The core idea was simple and perspective-shifting for us: when you design for the average, you get an average experience. Not a great one. Not an inclusive one. Just an acceptable one for the mythical default user that doesn’t actually exist.
The panel pushed past the usual ADA-and-WCAG-as-checklist conversation and asked something more interesting: what would it look like to build systems that adapt to real human variability? Neurodivergence. Sensory needs. Different communication styles. Different ways of moving through a space or interface. One example involved using sensors and lighting to subtly cue noise levels in shared work environments. Nothing punitive or rigid. Just a smarter, more responsive environment.
That connected beautifully with another standout session from Dr. Megan Holmes on stress, emotional regulation, and leadership. Her point was that safety is not soft. It is productive. Teams do better work when they feel regulated, respected, and clear on what is happening around them. Tone matters. Transparency matters. Repair matters. Again: not exactly revolutionary if you have ever parented a toddler or worked under a chaotic boss, but somehow still radical in the workplace.
This was one of the strongest undercurrents of SXSW for us. The most hopeful use of AI is not replacing humanity. It is designing systems that make more room for it.
The part that should be a way bigger story: infrastructure and the environment
Now for the panel we had to practically excavate from the schedule: “Dirty Data: The Hidden Climate Cost of Our Digital Lives.”
This should have been a mainstage conversation.
Instead, it felt tucked away, which is kind of the perfect metaphor for how we discuss AI in general. Everyone wants to talk about what it can generate. Fewer people want to talk about what it consumes.
The panelists laid it out clearly. AI data centers are not abstract little clouds floating over our heads. They are physical infrastructure with massive demands on power, water, land, and local communities. A standard data center build might use 50 to 100 megawatts, while a newer AI-focused center can reach 1.2 gigawatts (1 gigawatt equals 1,000 megawatts). A 100-megawatt data center can use as much water in a day as 6,500 households…and likely much more than that. And in many cases, these facilities are being built in places already struggling with water and grid resilience.
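The scale gap in those numbers is easy to gloss over. A quick back-of-envelope calculation, using only the figures cited above (which are rough, panel-reported estimates, not audited data), makes it concrete:

```python
# Back-of-envelope check of the data-center scale figures cited above.
# These are rough estimates as reported by the panel, not audited numbers.

conventional_mw = (50, 100)          # typical conventional build, in megawatts
ai_center_gw = 1.2                   # newer AI-focused center, in gigawatts
ai_center_mw = ai_center_gw * 1000   # 1 gigawatt = 1,000 megawatts

# How many times larger is the AI-focused build than a conventional one?
low = ai_center_mw / conventional_mw[1]   # vs. a 100 MW build
high = ai_center_mw / conventional_mw[0]  # vs. a 50 MW build

print(f"An AI-focused center draws roughly {low:.0f}x to {high:.0f}x "
      f"the power of a conventional data center.")
# → roughly 12x to 24x
```

In other words, a single 1.2-gigawatt AI facility draws as much power as a dozen or two conventional data centers combined, which is why siting and grid resilience dominate the conversation.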
That is not a side note. That is the story. Especially in Texas, where we already know what grid fragility looks like.
But this panel was also one of the few that offered a real path forward instead of just feeding the fear. The technology exists right now to build better. Waste heat can be recaptured and reused. Water can be treated and recycled. Data centers can be designed as community assets instead of extractive black boxes. Grid-connected systems can blend storage, renewables, and long-term planning. Some countries already require facilities to be far more self-sufficient than what is standard in the U.S. A best-in-class model is possible. It requires regulation, investment, transparency, and the willingness to value communities as something other than collateral damage.
That, to us, is the actual fork in the road. Not whether AI continues…because we all know it will. But whether we build the systems around it like responsible adults.
So now what?
That is the question SXSW left us with.
Once everything is optimized, what exactly are we optimizing for?
Faster ads? Sure.
Better product development? Great.
Smarter healthcare? Amazing.
But if all that speed and intelligence still delivers burnout, isolation, job loss, environmental strain, and communities left to shoulder the costs while someone else captures the upside, then what are we even doing?
The most useful thing we heard all week was not “embrace the future” or “don’t be afraid.” It was a version of this: be intentional. Ask better questions. Build with humans in mind first. Stop treating regulation like a dirty word. Stop treating accessibility like a legal footnote. Stop pretending the environmental cost is invisible just because it’s hidden behind a server wall.
AI is here. Fine.
But whether it becomes a tool for better living or a machine that sells us back our own humanity at a premium is still, for the moment, up to us.
A-Ay ay ay, indeed.
Sources:
SXSW 2026 Panels:
Better Leadership Starts with Understanding the Stressed Brain
Dr. Megan Holmes, PhD, MSW, LISW-S - Professor and Co-Director at the Center on Trauma and Adversity, Case Western Reserve University
Featured Session: Amy Webb Launches 2026 Emerging Tech Trend Report
Amy Webb - Futurist and CEO, Future Today Strategy Group
The Next Digital Wave Excludes Millions. Let’s Fix It.
Kerrie Finch - Co-Founder, AKA
Ryan Howard - Co-Lead of Experience Institute, Google
Rachel Lowenstein - Content Creator and Culture Strategist
David Vogel - Executive Experience Director, Hypersolid
Dirty Data: The Hidden Climate Cost of Our Digital Lives
Anurag Bajpayee - CEO and Co-founder, Gradiant
Jason Peart - CEO, Sage Geosystems, Inc.
Thomas Sisto - Co-Founder & CEO, XL Batteries
Breana Wheeler - U.S. Director of Operations, BREEAM
Why Will AI Make Brands Feel More Human and Not Less
Suzana Apelbaum - Head of Creative and AI Innovation, Google
Paul Aaron - CEO, Addition