Lumeto
Designing a scalable XR training platform from concept to enterprise deployment.
0 → 1 enterprise XR training platform deployed to 15,000+ users.
- Role
- UX/UI Designer
- Tenure
- Jan 2021 – Nov 2021
- Scope
- Involve XR — Multiplayer training platform for healthcare and public safety
- Environment
- Enterprise XR · Research-led design · Unity collaboration
- Technologies
- Figma · Unity (collaborative) · XR interaction systems
Context
Traditional actor-led crisis simulation training is expensive, difficult to scale, and logistically complex. As a result, trainees often receive limited practice before facing real-world, high-stakes scenarios.
Lumeto set out to build a scalable, data-informed XR training ecosystem that could support both synchronous multiplayer simulations and asynchronous practice environments.

Jay Street XR training environment used for scenario simulations.
Environment design supported spatial navigation and scenario immersion.
Discovery
Translating scenario-based training into XR
Before designing anything, I mapped the existing in-person training flows for both user groups separately. Facilitators would operate a desktop interface to run and monitor scenarios, while trainees would be inside the VR environment: two completely different surfaces with different needs and different failure modes.
What the mapping made clear was that the core problem with in-person training wasn't just cost. It was consistency. Scenarios varied depending on who was facilitating, how often sessions could run, and how many people could participate at once. That insight reframed the design goal. The job wasn't just to recreate the training in XR. It was to standardize it, so the experience could be repeated reliably across institutions without depending on a specific facilitator or a specific day.
That became the foundation for the three design principles that guided the platform: Immersion, Interactivity, and Standardization.

User flow used to understand the structure of in-person scenario-based training before translating it into XR.
The Challenge
Design immersive training experiences that:
- Preserve realism
- Support measurable learning outcomes
- Scale across institutions
- Operate within XR performance and interaction constraints
All while transitioning from concept to deployable enterprise platform.
What I Did
- Led UX across end-to-end VR and web-based training flows
- Translated SME-driven curriculum into structured interaction systems
- Conducted stakeholder interviews with training facilitators and domain experts
- Synthesized research into journey maps and scenario logic
- Defined core design principles: Immersion, Interactivity, Standardization
- Delivered a proof of concept featuring 4 distinct scenarios within a 3-month timeline
- Designed information architecture across both VR and web interfaces
- Collaborated with Unity engineers to refine interaction behavior and performance constraints
- Contributed to narrative design, environment planning, and avatar direction

Controller interaction model designed for intuitive VR navigation and object interaction.
Mappings were tested in simulation scenarios to ensure actions such as teleportation, view rotation, and object manipulation remained discoverable under stress.
System Architecture
The platform functioned as:
- A synchronous multi-user training hub
- An asynchronous practice and assessment environment
- A cross-surface experience (VR + Web)
Design decisions accounted for:
- Spatial interaction logic
- Motion capture integration
- NPC behavioral design
- Data-informed evaluation potential

Scenario architecture linking planning tools with immersive VR training environments.
Impact
15,000+
users across Ontario
Adopted within healthcare and public safety training contexts.
User scale reflects real-world deployment following platform launch.
Reflection
Designing for XR amplifies ambiguity. Interaction friction and unclear information architecture become significantly more disruptive in spatial environments.
What this project reinforced for me is that immersive systems still depend on clear structural logic. Scenarios, roles, and interactions must be modeled explicitly before they can become believable experiences.
This work strengthened my ability to translate complex learning objectives into structured systems that balance immersion, usability, and technical constraints.