When developing elearning as part of a team or even as a team of one, it is critical to think about how you’re going to test it. At Endurance Learning, we have evolved our processes over many years.
When we started building elearning as Endurance Learning, we were a small, nimble team where everyone did every role. There was no real structure to quality assurance (QA), and everyone took their turn at testing elearning. We often tested live, with one person taking the course while another watched on a Zoom call. Over time, we have evolved and created more effective processes.
1. Create Traceability
An early lesson was that we often didn’t know why we had changed something. Did the customer ask? Did we find something ourselves? Did we open the wrong file to make edits? We had been tracking feedback in the storyboards, but it became very hard to tell whether we had addressed every concern. Some of us had used Trello before, and it seemed like we could log issues and move them from one column to the next as we addressed them. Over time, we’ve continued to evolve this practice and have gotten better as a team at commenting on issues, so that we not only know an issue was addressed but can quickly tell who addressed it and how.

2. Be Consistent
Early in our elearning journey, our internal team was responsible for storyboards while external contractors were responsible for development. We worked with a ton of really smart people, and they each had their own way of working. With each person, we adapted to their style. Suddenly, any process we thought we had was out the window. Some developers liked feedback in a Word document. Some wanted it in Slack. Some liked the Trello board but didn’t move the tickets. We acquiesced. Eventually, we learned that we were doing ourselves and our customers a disservice. We couldn’t remember from project to project what process we were following, and issues would get lost. Verifying that every issue had been addressed took additional time and energy. Once we started requiring the exact same process from every developer, we found ourselves being more efficient and more confident in the final product.
3. Unit Test
Our consistency had improved, and we were producing enough elearning that we needed help with quality assurance. We hired a contractor who would regularly do QA on modules; she eventually became an employee. One day during a check-in, it became clear that our team of internal and external developers was handing off elearning for testing that wasn’t what we expected. When you receive a module where the “Start” button on the first page doesn’t work, you know you have a problem.
We quickly talked to every developer about their process for testing their own work. In software development, this is called unit testing. While the concept in elearning isn’t identical, the message was the same: developers had to go through what they built just as a learner would, while keeping the authoring software open so they could make edits on the spot. Our team saw immediately that we had to document fewer issues and go through fewer rounds of QA. Even if a developer’s unit test added two hours to an eight-hour development effort, it proved to save time and money.
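For anyone less familiar with the software term, here is a minimal sketch of what a unit test looks like in code. The function and checks below are hypothetical, purely to illustrate the idea of a developer verifying their own work before handing it off:

```python
import unittest

def start_button_enabled(page):
    """Hypothetical helper: the Start button should only be active once the title page has loaded."""
    return page.get("title_loaded", False)

class TitlePageTests(unittest.TestCase):
    # The developer runs these checks on their own work before QA ever sees it,
    # much like an elearning developer clicking through a module as a learner would.
    def test_start_button_works_after_load(self):
        self.assertTrue(start_button_enabled({"title_loaded": True}))

    def test_start_button_inactive_before_load(self):
        self.assertFalse(start_button_enabled({"title_loaded": False}))

if __name__ == "__main__":
    unittest.main()
```

The habit, not the code, is what carries over to elearning: click every button, play every interaction, and fix what you find before the module ever reaches QA.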
4. QA Before You Build
Our QA team and developers reported that we were fixing issues in the elearning that could have been caught in the storyboard. Software industry studies suggest that catching a bug early can save 15 hours of effort for every 1 hour spent. Our own internal study showed this to be true for elearning work, especially for minor but important issues. If we left the trademark symbol off a product name or didn’t use consistent terminology to describe a process, identifying and fixing these problems in a developed module was far more expensive than catching them in the storyboard.
5. Be Agile
Our team builds everything from 3-5 minute microlearning to extensive onboarding programs. In almost every case, we’ve learned that we shouldn’t wait to test. A large course tested only as the deadline approaches is very hard to test well. We build a lot of custom scenarios (what, you thought we build page turners?), and these need to be tested. While our team is still getting used to not seeing a course all at once, we’ve started to publish sections of a module to be tested as they’re finished. This has a few wonderful benefits. First, we can start testing long before we’re near the deadline. Second, it really reduces the overload our QA team feels when testing a full module at once. Finally, feedback on one section can inform changes to the rest of the module.
Bonus: Add New Tools for Efficiency
One of our latest changes also pulls from the world of software development. For project leads and the QA team alike, reviewing elearning modules was proving too time-intensive. We did some research on tools that integrate with Trello and allow our team to capture feedback more quickly. While we are still only a month into our implementation, our team reports that Marker.io has been a big boost to their productivity and accuracy. We’ll talk more about this in the future if it continues to show the benefits we expect.