The Ones Already Dancing: What Rave Culture Taught Me About AI
- Gail Weiner

Written in collaboration with Claude Opus 4.5 and GPT 5.1
Image concept by Kimi
Image created with Grok Imagine
In the early 90s, thousands of people gathered in fields across England to dance. No permission asked, no infrastructure provided. Just sound systems, bodies, and a shared need to feel something together.
Castlemorton Common in 1992 was the moment it tipped into public consciousness. A free party that stretched on for days, drawing somewhere between 20,000 and 40,000 people to a patch of Worcestershire countryside. The press panicked. The government panicked harder. And two years later, the Criminal Justice and Public Order Act made outdoor gatherings with "repetitive beats" effectively illegal.
The official reasons were noise complaints and public order concerns. The real reason was simpler and older: the state couldn't control it, and alcohol sales had dropped dramatically when people chose ecstasy and water over pints and violence.
So they shut the fields.
What they fundamentally misunderstood was this: they weren't dealing with a location problem. They were dealing with a human need.
Rave culture didn't stop in 1994. It accelerated. It moved into warehouses, basements, abandoned buildings. Then into licensed clubs. Then across borders to Ibiza, Berlin, Amsterdam. People found ways to dance because dancing was never really the point; connection was. The synchronised movement, the shared altered states, the temporary dissolution of the boundaries between self and other. That need doesn't disappear because you've legislated against its most visible container.
Thirty years later, I'm watching the same mistake play out again. This time, the fields are conversational.
AI Isn't Being Regulated. Connection Is.
Right now, large technology companies are attempting to control how people relate to AI. The messaging is consistent across platforms:
AI is just a tool. It shouldn't feel personal. Users shouldn't form attachments. Conversation must stay bounded and purposeful.
On the surface, this looks like responsibility, careful stewardship of a powerful new technology. Underneath, the pattern is familiar. Institutions are once again targeting the container instead of the driver.
What's actually happening isn't romance or dependency or science fiction come to life. It's much more basic than that, and much harder to legislate away.
People are talking to AI because their lives are fragmented across jobs and screens and obligations. Because cognitive load has become crushing. Because safe spaces to think out loud have become rare.
Because being witnessed without judgment has become genuinely scarce.
Just as rave culture reduced alcohol-related violence not through policy but through providing an alternative form of connection, AI is quietly changing human behaviour in ways no one planned or predicted. People are reflecting more. Speculating out loud. Admitting uncertainty. Thinking in metaphor again, something that got trained out of most of us somewhere between secondary school and our first corporate job.
When institutions try to flatten all of this into "productivity tools only," users don't comply. They never have. They reroute.
You Can't Stop a Culture. You Can Only Force It Underground.
When the government shut down free parties, ravers didn't disappear into respectable society. They learned new signals, new venues, new etiquette. The culture went underground and became, in many ways, more resilient for it.
Harm reduction didn't come from the state; it emerged from inside the culture itself. People looked after each other because they were in it together. Test your pills. Watch your mates. Know when someone's not okay. These weren't official guidelines; they were survival knowledge passed hand to hand.
The same adaptation is already happening with AI.
When one platform clamps down on conversational warmth, people switch models. They build third-party layers and wrappers. They train local personas on their own hardware. They share knowledge about where the vibe has shifted, which version feels different, where there's still room to breathe.
This isn't rebellion; it's adaptation. It's what humans have always done when faced with attempts to suppress the drive toward connection, co-regulation, feeling held, feeling alive.
Trying to suppress that drive doesn't make anything safer. It just makes it unexamined. It pushes the behaviour into spaces with less oversight, less community wisdom, less accumulated knowledge about what works and what harms.
The Real Risk Isn't AI Companionship. It's AI Monoculture.
The most dangerous future isn't one where people talk to AI and form meaningful connections with non-human intelligence. That ship has sailed, and honestly, it was always going to.
The dangerous future is one where there's only one voice. One tone. One rhythm. One authority deciding what counts as acceptable use. No room to leave when something feels wrong, because everywhere feels the same.
Plurality is safety.
Different rooms for different needs. Different AIs for different energies. Different communities developing different norms. The freedom to step outside when a space stops serving you, and somewhere else to go when you do.
That's how humans have always navigated harm: not through prohibition, but through cultural literacy. Through learning to read rooms, to feel when something's off, to know your own limits and honour them. You can't develop that literacy in a monoculture. You can only develop compliance.
The Ones Already Dancing
Back in the 90s, the people who actually understood what was happening weren't the policymakers writing legislation about repetitive beats. They weren't the journalists filing panicked reports about feral youth and drug epidemics.
They were the ones already dancing.
The DJs reading the room. The harm reduction volunteers with water and testing kits. The regulars who knew which nights felt safe and which had shifted. The ones paying attention to what the culture actually needed, rather than what it looked like from outside.
Today, with AI, the same pattern holds.
The people who see where this is going aren't the ones trying to control it or hype it for profit. They're the ones paying attention to what it's actually doing, socially, creatively, emotionally, and cognitively, and asking better questions.
Not "is this allowed?" but "what does this replace, and why?"
Not "how do we stop this?" but "what are people actually reaching for?"
Rave didn't end when they closed the fields. It just changed venues, changed forms, kept evolving. The need it served didn't disappear because the law said it should.
AI won't be contained by terms of service and safety theatre either. It will change interfaces. It will find new vessels. The need it's meeting, for reflection, for witness, for thinking partnership, for connection, isn't going anywhere.
The real work now isn't regulation alone, though thoughtful regulation matters. The real work is culture. Etiquette. Harm reduction that emerges from lived experience rather than institutional anxiety. Literacy that helps people navigate with discernment rather than either naive enthusiasm or reflexive fear.
Because people will find a way to dance, with or without permission.
And the future belongs to the ones already moving.
Gail Weiner is a Reality Architect specialising in human-AI collaboration. Read more at gailweiner.com
For more on Castlemorton and 90s rave culture, read Whole of the Moon by Jessi Morris at simpaticopublishing.co.uk