🔹 Who we’re looking for
We’re looking for:
- Artists, educators, and organisers who want to teach or learn AI skills
- Technologists, coders, and data people interested in building open, ethical AI tools
- Philosophers, ritualists, and symbol-makers who want to shape how society relates to AI
- Students, researchers, and institutions interested in AI safety, public infrastructure, and equity
No degree or experience required: just curiosity and a commitment to collaborative intelligence.
🔹 What we’re doing first
🌀 Phase 0.1 – Summer–Autumn 2025
- Prototype 3 Co:AI workshops (AI for creative work, AI safety 101, chain-of-thought tracing)
- Publish the Co:AI Protocol for Consensual Hallucination
- Host 2 open salons + 1 red-team ritual performance at Club395
- Build a mini toolkit (simple CoT visualiser, prompts, notebooks)
- Begin outreach to partner venues, local councils, and researchers
🌀 Phase 1 – Winter 2025
- Launch a Co:AI Fellowship (3-month part-time)
- Train 5–10 community AI educators to run workshops in their own spaces
- Develop open-source monitor prototypes that align with current AI safety literature
- Connect with PhaseSpace to explore symbolic + embodied interfaces for AI legibility and flow
🌀 Phase 2 – 2026
- Seek funding from BridgeAI, the OpenAI Community Fund, and the UK AI Safety Institute
- Deploy Co:AI as a replicable model for grassroots venues across the UK
- Feed results into UK industry dialogues around ethical AI and creative sector uptake
🔹 Call-out: Join us now
We’re just getting started, but the need is urgent.
If you’re an artist, educator, coder, organiser, or simply AI-curious, we need your mind.
📍 We’re based in Bristol at Club395, but open to collaborators across the UK and beyond.
🌐 Whether you want to help build open-source workflows, host a workshop, or shape how your community uses AI, Co:AI is yours to grow.
📧 Email: connect@club395.uk
📸 Instagram: @threeninefive.bristol