Blended Learning
Digital Workshop
An immersive, gamified team workshop bridging foundational instruction and certification-level performance.
Project Overview
This immersive digital workshop served as the practical application component of a multi-phase report-building certification program. Learners entered a themed, story-driven environment — a space station analytics hub — where they tackled reporting challenges modeled on real client scenarios.
The experience blended scenario-based eLearning, collaborative team problem-solving, and expert facilitation. Participants navigated a cohesive narrative, explored interactive rooms to uncover clues, answered team knowledge checks, and applied their skills to authentic data analysis tasks. Trainers supported live sessions, guiding learners through the analytic process and assessing performance against challenge criteria.
The result was an engaging, high-stakes learning environment that connected foundational analytics training to real-world application — earning strong learner engagement and facilitator endorsement during the pilot phase.
Audience
Client teams participating in a multi-phase analytics certification program
Tools Used
Articulate Storyline 360, Microsoft Word, PowerPoint, Asana
My Role
I served as instructional design lead on this project, driving all content strategy, scenario design, gamification architecture, visual direction, and learner flow. Working directly in Storyline 360 alongside a developer throughout the build, I directed interaction design, made content-level edits within the file, and collaborated closely on the technical execution of the experience.
Note: Original materials are confidential; all visuals shown are representative mock-ups.
Responsibilities
Led end-to-end instructional design including learning objectives, scenario narrative, gamification strategy, and challenge structure
Designed the room-based exploration mechanic, clue-finding sequence, and star-based progress tracking system
Wrote all content including knowledge check questions, feedback layers, challenge criteria, participation expectations, and facilitator-facing instructions
Directed visual design and interaction decisions throughout the Storyline build, working directly in the file alongside the developer
Created storyboards and interaction flows for all slides and modal sequences
Designed the report-building challenge checklist, example report reference, and helpful resources panel
Partnered with trainers to align facilitation strategy and assessment approach
Supported pilot facilitation and documented recommendations for full rollout
The Challenge
The organization needed a way to bridge foundational analytics instruction with real-world performance. Existing training covered concepts, but learners lacked a structured opportunity to apply those concepts under realistic conditions — the kind of hands-on practice that prepares people to perform, not just recall.
The program also needed to support a live, facilitated environment with multiple learners working simultaneously, requiring interaction design that could accommodate team collaboration, facilitator oversight, and individual accountability at the same time.
The Solution
I designed the practical application component of a three-part certification program:
Instructor-Led Training (ILT): Foundational, facilitator-led sessions to build baseline knowledge (designed by the broader team)
Practical Application Component (my role): An immersive, gamified digital workshop where learners applied analytics skills to real-world reporting challenges inside a cohesive themed environment
Competency Assessments: Formal assessments to validate mastery and award certification
The workshop was structured as a mission narrative: learners were stationed at a lunar analytics hub tasked with uncovering the competitive landscape of the galactic market. Three report-building challenges stood between them and mission completion — each requiring team collaboration, facilitator verification, and a code entry in the Control Room to advance.
My Process
Gamification Architecture
Within the mission narrative, progress was earned room by room: teams explored three distinct rooms, found hidden clues, answered knowledge checks collaboratively, and earned stars toward uncovering each report-building challenge.
The Control Room served as the final destination, where teams entered facilitator-issued codes after completing each report-building task, creating a real-world consequence tied to in-module performance.
Learning Theory and Model Integration
Merrill's First Principles: All tasks anchored in authentic report-building scenarios modeled after real client deliverables. Learners do, not just read.
Cognitive Load Theory: Chunked interactions, scaffolded challenge criteria, and embedded resource panels reduce overwhelm at moments of highest complexity.
Constructivism and Social Learning: Team-based clue discovery and collaborative knowledge checks reinforce peer reasoning and shared accountability.
Backward Design: All interactions mapped to certification-level competencies from the outset — every game element serves a learning objective.
Design Highlights
Several design decisions distinguish this experience from standard eLearning:
Hotspot-Based Navigation: Rather than a traditional menu or button interface, learners navigate by clicking numbered doors in a hallway scene. This maintains immersion while providing intuitive wayfinding — the space station environment stays intact without breaking to a menu screen.
Layered Modal Architecture: Room interactions use a two-step modal sequence: an entry instruction modal primes learners for the collaborative task before exposing them to the clue discovery interaction. This reduces cognitive load at the moment of highest engagement.
Team Collaboration Built Into the Interaction: Both the clue discovery modals and knowledge check questions include explicit team instructions embedded in the interaction itself — not relying on facilitator prompting to trigger collaboration. Every learner responds individually, but the discussion is built in as a required step.
Gamification as a Learning Signal: The star tracker and badge system are not decorative — each star earned corresponds to a knowledge check answered correctly, and the badge icons update state as challenges are completed. Progress is always visible and always meaningful.
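In Storyline 360, this kind of tracker is typically wired with number variables and triggers; the same counting logic can also live in an "Execute JavaScript" trigger via the player's GetVar/SetVar API. The sketch below is a hypothetical illustration of the logic only: the variable names (StarCount, ChallengeBadgeEarned) and the inline player stub standing in for Storyline's GetPlayer() object are assumptions, not the production build.

```javascript
// Hypothetical star-tracker logic. In a live Storyline 360 module, an
// "Execute JavaScript" trigger would call GetPlayer() to read and write
// course variables; here a minimal stub stands in so the logic runs anywhere.
const player = (() => {
  const vars = { StarCount: 0, ChallengeBadgeEarned: false };
  return {
    GetVar: (name) => vars[name],
    SetVar: (name, value) => { vars[name] = value; },
  };
})();

// Called whenever a knowledge check is answered correctly:
// one correct answer earns one star.
function awardStar(starsPerChallenge) {
  const stars = player.GetVar("StarCount") + 1;
  player.SetVar("StarCount", stars);
  // Once a full challenge's worth of stars is earned, flip the badge
  // state so the badge icon on the tracker updates.
  if (stars % starsPerChallenge === 0) {
    player.SetVar("ChallengeBadgeEarned", true);
  }
  return stars;
}
```

In a real module, the stub would be replaced by `var player = GetPlayer();` inside the trigger, and a variable-change trigger on ChallengeBadgeEarned would swap the badge icon's state on the tracker.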
Performance Support Within the Task: The challenge criteria modal includes a checklist, an example report view, participation expectations, and a helpful resources panel — all accessible from within the same interaction without navigating away. Scaffolding is embedded where the learner needs it, not on a separate reference slide.
Analysis
Identified the need for consistent, advanced report‑building skills and formal recognition of expertise.
Analyzed learner roles and workplace demands using andragogical principles to ensure relevance.
Identified performance gaps in engagement, skill transfer, and operational impact.
Used Backward Design to clarify certification‑level expectations and map required competencies.
Design
Defined measurable learning objectives aligned to certification outcomes.
Designed an immersive, themed, scenario‑based workshop grounded in Constructivism and Social Learning Theory.
Applied Cognitive Load Theory through chunking, scaffolding, and clear guidance.
Used Merrill’s First Principles to anchor activities in authentic report‑building tasks.
Created storyboards and interaction flows using Multimedia Learning Principles.
Integrated gamification elements (badges, progress tracking, instant feedback).
Development
Built scenario‑based activities, knowledge checks, and embedded resources aligned to real business reporting challenges.
Directed instructional design while collaborating with a developer on the Storyline build.
Conducted QA for usability, accessibility, and alignment to objectives.
Developed facilitator guides using Cognitive Apprenticeship to support expert modeling.
Implementation
Led the pilot launch to validate flow, engagement, and scenario clarity.
Partnered with trainers to support facilitation and learner questions.
Monitored engagement through collaborative activities and gamification.
Documented recommendations for full rollout.
Evaluation
Conducted formative evaluation during the pilot to refine content and interactions.
Ensured assessments aligned with Backward Design and certification competencies.
Analyzed early indicators of engagement and skill application.
Outcome Indicators: Pilot results showed strong learner competency gains and positive feedback from facilitators, supporting recommendations for full rollout.
Provided recommendations for continuous improvement and broader rollout.
Reflection: Project Takeaways
Key Learnings
Designing and piloting the practical application component reinforced the value of grounding learning in authentic, real‑world tasks. The immersive, themed environment — combined with scenario‑based challenges, cognitive scaffolding, and collaborative problem‑solving — created a meaningful bridge between foundational instruction and certification‑level performance. Strong competency gains and facilitator endorsement supported the effectiveness of the design and its readiness for broader rollout.
Enhancement Opportunities
1. Build in an earlier, structured user-testing cycle prior to the pilot.
While the pilot offered valuable insights, future projects would benefit from earlier, iterative user testing during development. Short usability checks with a small learner group — before the full pilot — would help validate interaction patterns, cognitive load, and scenario clarity earlier in the process. This would accelerate refinement and reduce rework later in the cycle.
2. Formalize a more frequent cross-functional review cycle.
Collaboration with trainers and the developer was strong, but future projects could be strengthened by establishing a more formalized review cadence across SMEs, trainers, and technical partners. A recurring touchpoint (e.g., weekly design reviews) would ensure alignment, surface questions sooner, and create a shared sense of ownership throughout the build. This structure would also support smoother handoff if team transitions occur, as they did in this project.

