From Analysis to Action:
The Analytic Process

Turn data into decisions through a structured analytic framework.


Project Overview

This program was designed to equip learners with a structured analytic framework that empowers them to generate actionable insights and solve complex business challenges. Organized into three integrated components — Learn, Practice, and Grow — the program builds foundational knowledge, reinforces skills through applied scenarios, and validates competency through formal assessment. While it is currently being piloted internally for feedback and refinement, it is intended for external client audiences.

Audience

Customers (Manufacturers and Retailers)

Tools Used

Articulate Rise, Storyline, Microsoft Word, PowerPoint, Workday, Asana, Copilot

Responsibilities

I led the instructional design and development process, including:

  • Project Planning and Management: Planned tasks in Asana across ADDIE phases, maintained updates, and shared weekly progress to ensure transparency and stakeholder alignment.

  • Analysis and Content Review: Reviewed stakeholder insights, evaluated existing ILT and asynchronous training, and collaborated with SMEs to identify gaps and constraints.

  • Design and Development: Created learning objectives, mapped them to real-world actions, adapted and created end-to-end report-based case studies, storyboarded all modules, and partnered with a developer to build the program in Articulate Rise.

  • Collaboration and Resource Creation: Worked with SMEs on case study generation and with fellow designers on materials and assessments; developed job aids, participant guides, and downloadable resources for learners.

  • Implementation and Evaluation: Structured end-to-end LMS program documentation and organized SCORM files in coordination with the LMS administrator; managed timelines, stakeholder communication, and evaluation strategy.

Note: Due to company confidentiality and ownership of project materials, the original build and documentation are not available for public display, nor can any sensitive company/client information be shared.

Project Origin Details

Identifying the Problem and the Solution


The Problem

Previous ILT sessions on the analytic process were shortened to fit time constraints, reducing depth and limiting practice. While the original design wasn’t flawed, the condensed format left learners without enough support to apply the process effectively.

This program was created to restore essential content, reinforce each step with structured guidance and practice, and reconnect the dots for learner success. It was also designed with a dual purpose: to serve as a scalable, revenue-generating offering for external customers.

The Solution

I designed an asynchronous eLearning program that taught the analytic process as a sequential four-phase framework.

The Framework

  • Phase 1: Project Scoping (gather and organize all necessary information, define the core issue, generate hypotheses and research objectives, validate scope with stakeholders)

  • Phase 2: Analysis Design (map analytical flow, set data parameters, confirm design feasibility)

  • Phase 3: Data Analysis (identify key findings and implications, build presentation slides and story)

  • Phase 4: Recommendation Development and Delivery (generate actionable recommendations, quantify opportunities, deliver analysis, outline next steps)

The program featured scenario-based modules, interactive case studies, and competency-based assessments. I streamlined existing ILT content, retaining what was effective and revising or rebuilding what left gaps. The final solution focused on what new learners needed to succeed, balancing clarity and efficiency while restoring the depth and support lost in earlier iterations.

My Process

Learning Model & Theory Integration

Review below for insight into my program creation process for this specific project, aligned to each phase of the ADDIE model, as well as how I integrated Bloom’s Taxonomy, Gagné’s Nine Events of Instruction, and the Kirkpatrick Model of Evaluation into my design.

Analysis

During the analysis phase, I reviewed stakeholder insights provided by my team manager and set up a comprehensive spreadsheet of project timelines and assignments, along with an end-to-end project plan in Asana. I aligned closely with the manager, led the kickoff meeting with the team, and communicated weekly status updates on progress to leadership.

Additionally, I evaluated current and past instructor-led training (ILT) and asynchronous materials, collaborated with subject matter experts (SMEs) to identify gaps caused by time constraints, and determined where content could be reused or needed to be rebuilt. To maintain organization throughout the project, I established an asset tracker to keep all deliverables well-managed and accessible.



Design

During the design stage of the instructional program, the analytic process structure and sub-steps were refined to emphasize essential knowledge for new learners. Learning objectives were drafted in alignment with Bloom’s Taxonomy and mapped to real-world actions learners would need to perform (see corresponding section for more context).

Solution overview documentation was created to share with the team and leadership, outlining all learning goals and essential competencies. An end-to-end program content outline was established for text-based storyboarding, and the team (myself, two other designers and a developer) collaborated to align on the design process and platform for delivery (Articulate Rise).

For case study design in Learn, I collaborated with SMEs to update an existing data story and its reports, incorporating the changes we’d aligned on in the analysis phase to improve the learner experience. The Practice case study design was more complex: I put together a report series focused on a new business challenge, working closely with SMEs to ensure the data accurately aligned with the narrative I created, giving learners a real-world context in which to practice applying their analytic process skills.

Instructional strategies included scenario-based learning, interactive activities, and realistic business case studies, with content sequenced to ensure progressive complexity and relevance. I storyboarded the Learn and Practice components and collaborated with designers to develop participant guides, job aids, and competency assessment questions, ensuring a cohesive and comprehensive learning experience. I used Copilot frequently throughout the design process to adjust phrasing of complex or repetitive data-related language, to quality-check content ahead of peer review and developer handoff, and to brainstorm ideas based on uploaded context.

Gagné’s Nine Events of Instruction

Review below how Gagné’s Nine Events of Instruction were integrated throughout the design to enhance learner engagement and effectiveness.

  • Attention was gained through realistic business scenarios in the Learn and Practice components and a high-stakes context in the Grow assessment. Learners were informed of clear objectives at the program start, and those objectives were reinforced in each component.

  • Recall of prior learning was stimulated by referencing personal experiences in Learn, recalling Learn strategies during Practice module application, and building assessments on both.

  • Content was delivered through structured frameworks and case studies presented progressively across components, while learning guidance was provided through participant guides, job aids, checklists, and embedded tips within the modules.

  • Performance was elicited via interactive activities, real-world case study application opportunities, and competency tasks, with immediate feedback in Learn and Practice and final assessment scoring in Grow.

  • Performance assessment included knowledge checks and formal competency tests.

  • Finally, retention and transfer were enhanced through the learning guidance materials mentioned above, real-world application guidance, and downloadable summaries, ensuring learners could apply skills effectively on the job.

Bloom’s Taxonomy

Phase 1: Project Scoping
Focus: Define and frame the challenge; Learners clarify problems, generate hypotheses and define research objectives.
Bloom’s Levels: Understand > Analyze > Create

Phase 2: Analysis Design
Focus: Plan the analytical approach; Learners specify deliverables, map analytical flow, and validate design.
Bloom’s Levels: Apply > Analyze > Evaluate

Phase 3: Data Analysis
Focus: Extract insights and build narrative for analysis presentation; Learners identify findings, build storyboards, and generate implications.
Bloom’s Levels: Apply > Analyze > Create > Evaluate

Phase 4: Recommendation Development
Focus: Deliver strategic solutions; Learners propose actions, quantify impact, and present executive summaries.
Bloom’s Levels: Create > Evaluate > Apply


Development

During the development stage, I built Rise modules and partnered with a developer to create embedded Storyline blocks, ensuring seamless integration of interactive elements. I also supported fellow designers in developing program materials and assessment components, acting as a reviewer to ensure alignment with learning objectives and consistency across the Learn and Practice components, which maintained instructional integrity and quality throughout the program.

I leveraged Copilot throughout this stage as an additional quality check before sharing modules with fellow designers in Review 360 for final review ahead of the implementation phase. I also turned to Copilot in some situations while developing Learn and Practice in Rise, refining language or content choices I’d made in the text-based storyboards once I could see the cohesive program come to life.


Implementation

During the implementation phase, I created a structured LMS setup document and organized SCORM files to ensure clear access and efficient management. I coordinated closely with the LMS administrator to facilitate smooth deployment within Workday, aligning the LMS structure with learner flow and stakeholder expectations. The program was delivered via the LMS as self-paced eLearning, with sequencing that required learners to complete the Learn component before progressing to Practice, culminating in the Grow component for certification.
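For context on the SCORM organization work above: a SCORM package exported from an authoring tool like Rise is a zip whose root contains an imsmanifest.xml file describing the course structure and launch file to the LMS. A minimal SCORM 1.2 manifest looks roughly like the sketch below — the identifiers, titles, and file names are illustrative placeholders, not taken from the actual program.

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!-- Minimal SCORM 1.2 manifest sketch; identifiers/titles are hypothetical -->
<manifest identifier="com.example.analytic-process.learn" version="1.0"
          xmlns="http://www.imsproject.org/xsd/imscp_rootv1p1p2"
          xmlns:adlcp="http://www.adlnet.org/xsd/adlcp_rootv1p2">
  <organizations default="ORG-1">
    <organization identifier="ORG-1">
      <title>From Analysis to Action: Learn</title>
      <!-- One item per launchable unit, pointing at a resource below -->
      <item identifier="ITEM-1" identifierref="RES-1">
        <title>Learn Module 1: Project Scoping</title>
      </item>
    </organization>
  </organizations>
  <resources>
    <!-- scormtype="sco" marks a trackable, launchable unit -->
    <resource identifier="RES-1" type="webcontent"
              adlcp:scormtype="sco" href="index.html">
      <file href="index.html"/>
    </resource>
  </resources>
</manifest>
```

Keeping these manifests and their packaged files consistently named is what makes the LMS administrator's deployment and sequencing setup straightforward.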

To support learners throughout the program, participant guides and support resources were provided both at the start and embedded throughout each Rise module within the Learn and Practice components, ensuring accessible learner support and guidance at every stage.

Evaluation

Evaluation during the learning program was conducted through formative, summative, and continuous improvement methods.

Formative evaluation included knowledge checks and feedback loops embedded within the Learn and Practice components. Summative evaluation was carried out through competency tests in the Grow component. Continuous improvement efforts involve monitoring completion metrics, assessment scores, learner survey input, and stakeholder feedback to inform ongoing enhancements.

Review below how Kirkpatrick’s Four Levels of Evaluation were applied to comprehensively measure the program’s effectiveness.

The Kirkpatrick Model

  • At Level 1, 95% of internal learners rated the training as relevant and clear.

  • Level 2 results showed an average assessment score of 88%, exceeding the 80% proficiency target.

  • Level 3 post-training surveys indicated self-reported improvement in applying the analytic process.

  • Level 4 evaluation is ongoing, with plans to collect data post-release to client audiences to assess long-term behavior change and business impact.

Key Learnings

How did I grow in this process as a learning designer and developer?

While it’s tough to distill everything I’ve learned throughout the course of this project into a short list, I’ve attempted to home in on three key things below that I feel greatly impacted (and will continue to impact) my processes as a Learning Designer and Developer moving forward.

Collaboration

Throughout the course of this project, I had the opportunity to work with colleagues I hadn’t previously been partnered with. Not only did it allow me to get to know my teammates better, but it also gave me a clear understanding of their working and learning styles, which was essential in figuring out how best to approach this project end-to-end. Having this shared experience and familiarity will make future collaboration that much more seamless, benefiting not only me but our entire team.

AI Integration

This was my first experience leveraging AI within an instructional designer context via Copilot — and it was awesome.

I learned so many strategies — from efficiency improvement hacks, to condensing complex language, to suggesting initial drafts that I could then refine with my ID expertise — and more. I’m excited to continue finding new ways to incorporate AI in my learning design and development process.

Custom Blocks (Rise)

Partway through the development of our Learn modules, Articulate Rise released its custom blocks feature. This was a game-changer for me and the developer I worked closely with to build out all of the Learn and Practice eLearning.

We met to discuss visual design changes we could make to improve the learner experience, and aligned on specific styles and instances in which to use custom blocks with company branding throughout all modules. It modernized the look and feel of the program, and I’ve continued to use custom blocks in Rise builds since.