Data Source Onboarding:
Quick Start for New Users

Get up to speed quickly on the essential data knowledge needed to build a basic report.


A blue-themed webpage mock-up centered on how data-driven insights fuel growth. The page features a main title, a subtitle about decoding sales dynamics, a call-to-action button for a quick start guide, and a step-by-step process with four numbered sections: Overview, Core Metrics, Get Started, and What's Next.

Note: Due to company confidentiality and ownership of project materials, the original build and documentation are not available for public display, nor can any sensitive company/client information be shared. Visuals included are masked mock-ups.

Project Overview

I designed and developed a Quick Start Learning Program for new users of a specific data source. The program was created to onboard customers who were unfamiliar with this type of data, helping them quickly understand how to access, interpret, and apply insights in a reporting platform. The initiative combined an interactive eLearning module, a how-to video, job aids, and supporting documentation to deliver a scalable, digital-first learning experience.

Audience

Customers

Tools Used

Articulate Rise, ScreenPal, Contentful, Microsoft Word, PowerPoint, Asana

Responsibilities

I led the instructional design and development process, including:

  • Project Planning and Management: I scoped and sequenced tasks across ADDIE phases in Asana, maintained weekly updates, and aligned stakeholders to ensure timely delivery and a coordinated launch.

  • Analysis and Content Review: I leveraged prior knowledge of users and existing documentation, reviewed stakeholder insights, and collaborated with SMEs to identify gaps, scoping the Quick Start build to focus on foundational learner needs.

  • Design and Development: I created clear, scoped learning objectives, organized module progression into a logical flow, authored and built content in Articulate Rise, and recorded and edited a how‑to video in ScreenPal, embedding knowledge checks to reinforce comprehension.

  • Collaboration and Resource Creation: I created job aids and reference articles in Word, partnered with a developer to translate them into Contentful Markdown, and coordinated publishing workflows to align LMS and Contentful launches.

  • Implementation and Evaluation: I published the Rise program as a SCORM file, coordinated the simultaneous launch of the reference articles, and am now monitoring engagement and learner confidence to assess program effectiveness and uncover insights that support further program scalability.

Project Origin

Identifying the Challenge and the Solution

The Challenge

New users struggled to understand core measurement concepts and apply them effectively within the reporting platform. Existing documentation was dense and fragmented, which made onboarding slow and overwhelming.

Learners lacked confidence in interpreting foundational metrics and building reports, and there was no streamlined entry‑level learning path to bridge the gap between technical documentation and more advanced analytics training.

The Solution

I scoped and delivered a Quick Start Learning Program designed to give new users immediate confidence with the reporting platform. The plan focused on creating a foundational learning path that introduced key concepts, emphasized the most critical metrics, and guided learners through building a basic report step‑by‑step.

I combined interactive eLearning, a concise video tutorial, and supporting job aids into a cohesive experience, ensuring multiple reinforcement points. The program was intentionally positioned as the first stage in a larger learning journey, providing quick wins while laying the groundwork for future modules that address more advanced analysis and strategic application.

My Process

Learning Model and Theory Integration

The breakdown below outlines my approach to this project, aligned to each phase of the ADDIE model and detailing how Bloom’s Taxonomy and Gagné’s Nine Events of Instruction were integrated into the design.

  • In the analysis phase, I leveraged my extensive knowledge of the user base and the documentation I had contributed to over several years. Having previously designed and maintained job aids and training courses on this data source, I had a deep understanding of user needs, pain points, and workflows. I reviewed stakeholder insights and existing training materials to identify redundancies and gaps, and collaborated with subject matter experts to clarify business use cases and reporting requirements. This ensured the program was grounded in real-world challenges and aligned with the expectations of both customers and stakeholders.

    🔗 Integration — Gagné’s Nine Events of Instruction:

    • Stimulate Recall of Prior Learning — Connected new content to familiar reporting challenges and prior resources.

  • During the design phase, I translated these insights into a learner-centered blueprint and created specific learning objectives aligned to Bloom’s Taxonomy.

    • Remembering: Define [data source title] data; recall the [2] core metrics.

    • Understanding: Explain the difference between each core metric; describe the role of [data source title] data.

    • Applying: Build a basic report in the report-building platform using guided steps.

      Note: Data source title and specific core metric titles removed. Bloom’s objectives for Analyzing/Evaluating/Creating are not within scope for this Quick Start module; these higher‑order objectives are reserved for future, more advanced modules as the program scales.

    I structured the program into modular segments — Overview, Core Metrics, Get Started, and What’s Next — to create a logical progression of learning. Each module was storyboarded with interactive elements, knowledge checks, and visual scaffolding to reinforce comprehension. I applied Gagné’s Nine Events of Instruction to guide the flow, ensuring learners were engaged, supported, and able to transfer knowledge effectively.

    🔗 Integration — Gagné’s Nine Events of Instruction:

    • Inform Learners of Objectives — Stated goals upfront.

    • Gain Attention — Used bold visuals and real-world scenarios.

    • Provide Learning Guidance — Embedded tips, prompts, and walkthroughs.

  • In the development phase, I personally built the program in Articulate Rise, authoring all instructional content including tutorials, metric definitions, and scenario-based examples. I also recorded and edited the how-to video on report building using ScreenPal, providing learners with a practical demonstration.

    For the job aids and reference articles, I developed the content in Word and then collaborated with a developer to translate these into Contentful format (Markdown) for consistency and accessibility. Unlike traditional participant guides, the supporting resources were designed as concise, searchable reference articles and job aids to reinforce learning outside the course.

    🔗 Integration — Gagné’s Nine Events of Instruction:

    • Present the Content — Delivered concise, visual explanations with layered interactivity and examples.

    • Elicit Performance — Designed a guided practice activity in which learners created a report in the report-building platform.

    • Provide Feedback — Integrated knowledge checks with immediate feedback.

  • Implementation focused on ensuring a coordinated launch. I created SCORM packages for the Rise program and published them to the LMS, while the developer published the reference articles to Contentful. By managing timelines and aligning deliverables, I ensured that the eLearning module and job aids launched simultaneously, providing learners with a seamless experience across platforms. I facilitated stakeholder communication and quality assurance reviews to confirm readiness and support a smooth deployment.

    🔗 Integration — Gagné’s Nine Events of Instruction:

    • Enhance Retention and Transfer — Ensured learners had ongoing access to job aids and reference articles alongside the course.

  • In the evaluation phase, I embedded mechanisms to measure effectiveness and inform future iterations. Knowledge checks were integrated within the module to provide immediate feedback and track comprehension. I designed a post-launch evaluation strategy that captured learner satisfaction, completion rates, and competency gains.

    Engagement metrics are currently being monitored to identify areas of strength and opportunities for improvement. As participant data continues to be collected via program completions and surveys, I plan to assess and document key takeaways and share recommendations with leadership for scaling the program, ensuring that future enhancements continue to meet learner and business needs.

    🔗 Integration — Gagné’s Nine Events of Instruction:

    • Assess Performance — Used knowledge check questions within the program build to confirm understanding of metrics and reporting steps, along with learner self-assessments of competency in program surveys.