VR Ecology 🎓 IEEE VR 2026 · Doctoral Consortium · Daegu, Korea
↗ HELLO IEEEVR 2026! NEW OSS TOOL · pip install bhaptics-http · HTTP REST bridge for bHaptics hardware · Raw dot patterns, no Studio required

S.C. Vollmer · York University · IEEE VR 2026 DC

Relational Haptic Co-Creation
for Kinaesthetic Creativity in XR ✨🧤

How can an AI partner stay legible and negotiable when the primary interface is the body — not the text box?

VR Ecology is a research-creation project exploring relational haptic interaction, kinaesthetic creativity, and adaptive AI attunement within extended reality.

🧤 haptic togetherness 🎚️ intensity-based feedback 🎛️ gesture + sound 🧠🤝 adaptive AI attunement 📼 replay + research dashboard
🦦 haptic co-regulation 🤝 consentful turn-taking 🧤 full-body haptic suit ✋ gesture + MediaPipe 🧠 AI scene co-creation 📼 trace-memory logs · 50+ sessions

A system that runs on the body's logic

The system integrates gesture, sound, spatial modulation, and intensity-based feedback to support interability-driven co-creation in immersive environments.

(preview images ×4)
🍃 calm by default 🧩 modular + hot-swap friendly 🧭 tracking → signals → response 🗃️ structured logging for research
🖥️ DC Presentation Slides 13 slides · ← → to navigate · N for notes
First presenter of the day. 10-minute talk covering the research framing, the prototype, and three open questions for the consortium. Built with the same template as sparks2026.headsetparties.com.

What is VR Ecology?

VR Ecology is a dissertation research-creation project at York University. It asks what AI collaboration looks like when the primary interface is the body — not the text box. Generative AI keeps pulling creative practice toward language: stop moving, stop listening, start explaining. XR runs on a different logic. You lean, reach, hesitate, feel a pull. That somatic logic deserves an AI that can operate within it.

The system uses a small, learnable haptic vocabulary — drawn from consentful design principles, ProTactile phatic grounding, and interspecies coordination research — to let an AI partner signal intent, offer turns, and regulate pace through the body. No text prompts. No interruption. The cue is an invitation: confirmable, ignorable, repairable in the moment.

50+ live sessions captured. The open questions I'm bringing to the DC are products of that accumulated work — not its starting point.

* Consentful — design practice for consent-based interaction; not a checkbox but an ongoing, renegotiable relationship. See consentfultech.io

The "engineering backbone" cast of characters

Not animals (yet) — more like tiny helpful gremlins that keep the scene alive.

🧤 The Haptic Sprite tactile events · intensity envelopes · comfort-aware pulses
  (•‿•)ノ  *tap*
  "i translate touch"
🎛️ The Sound Goblin gesture→sound · rhythmic mapping · spatial cues
  (ง'̀-'́)ง  ♫
  "i turn motion into music"
🪩 The Scene Weaver visual modulation · overlays/trails · live updates
  ⟡⟡⟡
  "i rearrange the light"
🫧 The Tracker Moth hands/controllers/sensors · stream fusion · normalized signals
  (˘▾˘)~
  "i follow the shimmer"
🧠🤝 The Attunement Buddy adaptive AI scene updates · constraints · co-creative modulation
  (•ᴗ•)  "we adjust together"
📼 The Timeline Owl research dashboard · replay · tagging · exports
  (o_o)📝
  "i remember politely"

A small, learnable haptic language

Six signals. Three coordination functions. Runs on full-body add-on haptics — 9 devices, 40+ motor points of contact. bHaptics 🇰🇷 (IEEE VR 2026 sponsor) confirmed working end-to-end.

Function · Signal · Means
ORIENT · bilateral shoulder tap, 200 ms · "look here"
OFFER · soft double-pulse, ~500 ms · "may I?"
HANDOFF · firm bilateral, 500 ms sustained · "your turn"
CONFIRM · quick bilateral tap, 100 ms · "got it"
SLOW · sustained low bilateral, 800 ms · "ease the pace"
SETTLE · descending intensity, 1 s · "rest here"

The AI taps — the maker moves — that movement returns as context for the next move.
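As data, the vocabulary is small enough to write out in full. The sketch below encodes the six signals and pushes one to the suit as a raw dot pattern over HTTP. Only the durations come from the table above; the intensities, the endpoint, and the payload shape are placeholders for illustration, so check the bhaptics-http README for the tool's actual routes:

```python
import requests  # pip install requests

# The six-signal vocabulary. Durations come from the table above;
# intensities are illustrative placeholders.
VOCAB = {
    "ORIENT":  {"duration_ms": 200,  "intensity": 0.6},  # bilateral shoulder tap
    "OFFER":   {"duration_ms": 500,  "intensity": 0.4},  # soft double-pulse
    "HANDOFF": {"duration_ms": 500,  "intensity": 0.8},  # firm bilateral, sustained
    "CONFIRM": {"duration_ms": 100,  "intensity": 0.6},  # quick bilateral tap
    "SLOW":    {"duration_ms": 800,  "intensity": 0.3},  # sustained low bilateral
    "SETTLE":  {"duration_ms": 1000, "intensity": 0.5},  # descending intensity
}

def send_signal(name: str, motor_indices=(0, 19)) -> None:
    """POST one vocabulary signal to a local bhaptics-http bridge.

    The URL and JSON shape here are hypothetical stand-ins, not the tool's
    documented API; see the bhaptics-http repo for the real interface.
    """
    sig = VOCAB[name]
    dots = [{"index": i, "intensity": int(sig["intensity"] * 100)} for i in motor_indices]
    requests.post(
        "http://localhost:5000/vest/dots",  # placeholder endpoint
        json={"dots": dots, "durationMillis": sig["duration_ms"]},
        timeout=1.0,
    )

send_signal("OFFER")  # "may I?"
```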

On-Body Haptics — open-source

The custom haptic hardware developed for this research is fully open-sourced — multiple variants, step-by-step build guides, written for students, artists, and budget-conscious labs.

v1 · Arduino belt

5 motors · Bluetooth · ~$40 in parts · beginner-friendly

v2 · Raspberry Pi

8 motors per device · I2C · DRV2605L · 120+ waveform effects · custom PCBs

Schematics, code, wiring diagrams, and build guides all documented. v3 in progress — contributions welcome. If you need haptics and can't afford a commercial kit: please use it, fork it, contribute back.
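For a feel of what one v2 motor channel looks like in code, here is a minimal single-driver sketch using Adafruit's CircuitPython DRV2605 library; treating that library as the stack is an assumption on my part (the build guides document the real firmware), and a full 8-motor device would need one driver or an I2C mux per motor.

```python
# Single-motor DRV2605L sketch over I2C (assumes a CircuitPython-compatible
# board, or a Raspberry Pi running Blinka, with adafruit-circuitpython-drv2605
# installed; not necessarily the firmware the v2 build guides ship).
import time
import board
import busio
import adafruit_drv2605

i2c = busio.I2C(board.SCL, board.SDA)
drv = adafruit_drv2605.DRV2605(i2c)

# Queue one of the chip's ~120 licensed waveform effects (IDs 1–123),
# fire it, then stop.
drv.sequence[0] = adafruit_drv2605.Effect(47)  # a buzz-style effect
drv.play()
time.sleep(0.5)
drv.stop()
```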

↗ misscrispencakes.github.io/On-body-haptics/ ↗ bhaptics-http — HTTP REST bridge for bHaptics · pip install bhaptics-http

The questions arrive late, not early

The open questions I'm bringing to the DC are products of accumulated work —

On-body haptics · distributed creativity
Arduino belt → Raspberry Pi · Hackaday Supercon · Mitacs e-Accelerate grant · open-sourced
Intuitive gestures in VR painting
empirical user study · 14 participants · HCI paper · York University
Curious Creatures
living VR research-creation lab · gesture + haptics first fused · ACM MOCO
AI + procedural VFX + WebXR
co-creative human-computer worldmaking · HCCC paper · EVA London
Comps
embodied interaction · kinaesthetic expression · ProTactile · interspecies coordination
Relational co-creation
AI & Society — under review · SIGGRAPH Sparks 2026 talk
→ now: 50+ sessions · prototype at capability levels L2/L3 · open questions
IEEE VR 2026 · Daegu · Doctoral Consortium

Three open questions for the DC

Where the accumulated work has run into genuine open problems —

Q1 · Measurement
How do you measure grounding without breaking flow?
Standard psychometrics are session-interrupting — stop someone to fill out a trust scale and you've broken the flow state you came to study. Trace-memory logs give coordination data continuously, but that's not trust or grounding directly. Working plan: logs + retrospective video-stimulated recall. (A sketch of one log record follows the three questions.)
Q2 · Generalization
What contributions travel from a living, evolving system?
The system legitimately keeps changing — that's the method. Capability staging indexes findings to levels, not builds. But it doesn't specify what kind of knowledge is produced. Working framing: empirically-grounded design knowledge indexed to capability levels.
Q3 · Ethics of borrowing
Habituation, sensory diversity, and responsible use of ProTactile
The haptics literature assumes cues that habituate — but ProTactile is a living language made for touch. Do the same dynamics apply? And: ProTactile is a communication system developed by and for the DeafBlind community. Citing it is not the same as community consultation. What does ethical borrowing actually require?
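To make Q1's logging claim concrete, here is a guess at the shape of a single trace-memory record. Every field name is invented for illustration; the point is that coordination facts (which cue was sent, whether it was taken up, how quickly) can be written in-flow, while trust and grounding cannot.

```python
import json
import time

# Hypothetical shape of one trace-memory record; field names are illustrative,
# not the dashboard's actual schema.
record = {
    "t": time.time(),               # wall-clock timestamp
    "session": "2026-03-14-a",      # session identifier
    "ai_signal": "OFFER",           # which vocabulary cue the AI sent
    "maker_response": "accepted",   # accepted / ignored / repaired
    "latency_ms": 740,              # cue-to-movement delay
    "gesture": "reach_forward",     # normalized gesture label from tracking
    "haptic_intensity": 0.4,        # what was actually delivered on-body
}

# One JSON line per event keeps the trace replayable, taggable, and exportable.
with open("trace_memory.jsonl", "a") as f:
    f.write(json.dumps(record) + "\n")
```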

System walkthrough video

Teaser and walkthrough coming soon — drop back after the conference.

🎥 Video coming soon

Contact

S.C. Vollmer · York University · The Alice Lab
Contact: TBA
