The quest for quality

by Clive Shepherd
In the rush to become players in the e-learning industry, many content providers have been guilty of focusing on quantity rather than quality. A large library of e-learning content is not much of an asset if in practice it is found to be bug-ridden, inaccurate, unusable and impossible to learn from. Trainers and learners are telling us that e-learning content is not always living up to the exaggerated claims that are made for it. While you would not expect to see great products so early in the lifetime of a new medium, you do expect to see good ones that perform reliably and achieve their objectives. In this article, Clive Shepherd explores the many dimensions of e-learning quality and assesses what developers can do to ensure their products are fit for purpose.

Contents
Why quality matters
A testing problem
Real-world testing
Piloting the path to quality
Case study: Royal Mail and TATA
Quality standards and processes

Why quality matters

It is reasonable to assume that, in the rush to enter a promising new market and make a name for themselves, enterprising new suppliers will cut corners in order to get products on the shelf; everyone believes they can carve out a niche in the market by getting in quick, regardless of whether they have the necessary capital, resources or talent. Fine, unless you’re a consumer in that new market, in which case you have to suffer the fallout of working with poorly performing products and companies that are here today and gone tomorrow.

E-learning has suffered from these phenomena, just as other industries have before. However, there’s a price to pay in terms of market credibility and, sooner or later, steps must be taken to reassure the consumer that you can deliver on your promises with products and services that meet customer expectations in full. For e-learning, quality is now the most pressing issue.

According to a survey by the European Training Village, in July 2002, almost two thirds of the trainers who responded rated e-learning as either 'fair' or 'poor' – not many classroom trainers would survive with happy sheet ratings this low. Now, you could argue that trainers are not real end users and that many e-learners are reporting an excellent e-learning experience, but the perceptions of buyers are important and the e-learning industry must take note of these or die. The quest for quality is under way.


A testing problem
In his book The Art of Software Testing, Glenford Myers points to a simple program containing 20 lines of code, including a loop and three IF statements. He calculated that there were 100 trillion execution paths, which would take a billion years to test! Given that a typical e-learning application, with its requirement for sophisticated interaction, will be many times more complex, you would be forgiven for concluding that quality control in e-learning is a hopeless task.
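Myers’ arithmetic is easy to reproduce. Assuming the structure usually attributed to his example (a loop that may repeat up to 20 times, with five distinct paths through its body on each iteration), a few lines of Python recover both figures:

```python
# Back-of-the-envelope reconstruction of Myers' figures. The loop/path
# structure here is an assumption about his example, not a quotation.
paths = sum(5 ** k for k in range(1, 21))  # ~1.2e14, i.e. ~100 trillion

# At one test case written, executed and checked every five minutes:
minutes = paths * 5
years = minutes / (60 * 24 * 365)          # ~1.1 billion years

print(f"{paths:.2e} paths, {years:.2e} years to test")
```

Exhaustive path testing is out of the question; the point of professional testing is to choose a small, high-yield subset of those paths.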

Not so, according to Mark Aberdour, Manager of EpicCentre, the UK’s largest e-learning testing service. “In the hands of well-trained testing professionals, it is possible to deliver a robust product that delivers to specification. Yet, despite overwhelming evidence regarding the cost-effectiveness of testing, a large proportion of the e-learning industry remains either sceptical or unaware of the need to test rigorously.”

The effects of inadequately tested e-learning courses can be devastating. According to a landmark study by IBM and Rockwell, a bug found post-release costs on average 100 times more to fix than a bug found early on in development. Says Aberdour: “In e-learning terms that could mean help-desk support, learners unable to complete training, negotiating rework with developers, an additional re-work cycle and another roll-out of the product to the organisation. For a high profile project, this could easily run to millions. In spite of this, testing continues to be done late in the product cycle, corners are cut to save time and the work is often carried out by junior, untrained staff.”

According to Aberdour, a content developer should allow between 10 and 15% of project time for testing. Even when this is allowed for, project delays often mean that testing time is cut back. To avoid this, Aberdour recommends that the test team be involved early in the development process, as bugs found early on are cheaper to fix.

Quality control is not just a process of eliminating bugs. E-learning projects have many facets, requiring different quality control measures. First and foremost there is a need for ‘functional testing’, to ensure that the product conforms to requirements, typically expressed in a technical specification or design document. Functional testing includes a check on functionality (to make sure it does what it says on the tin), on content (to remove spelling and grammatical errors) and on compatibility (to ensure it works on all the required platforms, operating systems and browsers).

Functional testing requires a mix of skills, resources and methods. Testers not only have to check the product against its spec, they also need to look for the unexpected. Says Aberdour: “We allow a couple of days for destructive testing, when our testers look for all those things that the designers didn’t expect learners to do. This always helps us to identify important defects that could cause a product to fall over in the field.”

To test for compatibility, EpicCentre have an enormous range of different hardware and software at their disposal. Although the software industry would prefer that everyone used the latest version of Internet Explorer, running under Windows XP, with a full range of the latest plug-ins, on a Pentium 4 with a broadband connection, the reality is sadly rather different. Reaching a wide audience means supporting a range of platforms, and that isn’t easy, whether to develop or to test. EpicCentre track all the problems that they find in their bug database, an off-the-shelf product called Problem Tracker, which helps them to monitor who is accountable for resolving a problem and where it stands in terms of resolution.
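The scale of the compatibility problem is easy to illustrate. A minimal sketch of a test matrix follows; the platform, browser and plug-in lists are illustrative of the period, not EpicCentre’s actual coverage:

```python
# Hypothetical compatibility matrix: every configuration is a candidate
# test environment. The lists below are illustrative assumptions only.
from itertools import product

operating_systems = ["Windows 98", "Windows 2000", "Windows XP", "Mac OS 9"]
browsers = ["IE 5.5", "IE 6", "Netscape 4.7", "Netscape 7"]
connections = ["dial-up", "broadband"]
plugins = ["Flash 5", "Flash 6", "no Flash"]

matrix = list(product(operating_systems, browsers, connections, plugins))
print(len(matrix), "configurations to cover")  # 4 * 4 * 2 * 3 = 96
```

Even this modest, hypothetical matrix yields 96 configurations, and every additional browser or plug-in version multiplies the total, which is why compatibility testing is such a resource-hungry activity.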

Interest is increasing in what EpicCentre call ‘non-functional testing’, which covers a myriad of important issues, including accessibility (for the disabled), usability, conformance to industry standards such as SCORM, and security. Each of these is an important consideration for an organisation publishing or distributing e-learning and each requires a different approach from the test team.
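To give a flavour of what one strand of non-functional testing involves, here is a minimal sketch of a single automated accessibility check, verifying that every image carries the alternative text called for by the W3C guidelines. A real accessibility audit covers far more than this one rule.

```python
# Sketch of one automated accessibility check: images without alt text
# are invisible to screen-reader users. Illustrative only; a genuine
# audit against the W3C guidelines involves many more checks.
from html.parser import HTMLParser

class AltTextChecker(HTMLParser):
    def __init__(self):
        super().__init__()
        self.missing = []  # src values of images lacking alt text

    def handle_starttag(self, tag, attrs):
        attributes = dict(attrs)
        if tag == "img" and "alt" not in attributes:
            self.missing.append(attributes.get("src", "?"))

page = '<img src="logo.gif" alt="Company logo"><img src="chart.gif">'
checker = AltTextChecker()
checker.feed(page)
print("Images missing alt text:", checker.missing)  # ['chart.gif']
```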


Real-world testing
Probably the UK’s largest e-learning development project is at learndirect, with more than 40 developers contributing to the product catalogue. Colin Buckley is Operations Manager for the Learning Team: “Before e-learning, technology-based training products were relatively self-contained. They were designed for a particular platform and worked in isolation from other systems. Now, it is impossible to really see how a product performs until it goes live, when it is running within the learner support environment and additional services, such as synchronous and asynchronous tutor support, are being provided.”

Learndirect is implementing a three-stage testing process to ensure that what it delivers meets expectations. Buckley: “At the very beginning of any contract we work with the developer to define very thoroughly what the build will contain and the technical specification that it must conform to. This is easier said than done, because all 40-odd developers seem to read the specification in different ways! At beta stage we then perform a comprehensive functional test, across many combinations of browser and platform. Finally, we upload the software to the real-world system, but not live, for a four-week period. In this time, our hubs and learning centres provide us with valuable feedback on how the product is performing. On top of that, we perform a final bug fix three months after going live. Only then can we be satisfied that the job is done.”

Of course, none of these tests tell you whether the learner is actually learning anything. Encouragingly, learndirect are addressing this issue with a new evaluation project. Says Buckley: “Using software from University College, Northampton, we will be asking selected learners to complete an online questionnaire, to provide us with feedback on pedagogical issues related to the courses they have completed.”

Another content developer that is concerned about the quality of the learning experience is Line Communications Group. Says CEO, Piers Lea: “The developer has an important responsibility to ensure that content is instructionally sound when it comes to screen. There is still a shortage of skill in this area and many customers do not fully appreciate its importance.”

The same can be said for usability testing. Says Lea: “We do usability testing early on at the prototype stage, using members of the target audience. This has been hugely useful in bringing out practical issues relating to how things are labelled, the navigational system and the look and feel. Typically we will leave learners alone with the software and video them, following this up with some formal questions.”

Testing for usability and ‘learnability’ does not come cheap and, in the end, customers have to assess their priorities. It is often said that, given the choice of quality, price or speed of delivery, you can pick any two of the three, but not the lot. Buyers of e-learning content development would be advised to ensure that quality is one of the two that they pick.


Piloting the path to quality
All of the quality control measures that we have discussed so far depend on one simple assumption: that the designer of the e-learning content knows how their product will be used in the real world. With that knowledge, all they have to do is ensure that the product matches up to spec and release it to an eager public. According to Microsoft Partner Readiness Manager, Mark Buckley, this assumption is a dangerous one: “In our experience, it’s more important that a learning product matches what the learner is looking for from the experience than that it conforms to a specification. The process should start with an understanding of the learner’s goals with regard to the learning experience and work from there.”

Brian Sutton is Chief Educator with Microsoft’s training partner, QA: “When you look at how e-learning is used in practice you can be surprised. Not only did we find that many managers failed to deliver on their promises to learners of sufficient time and space to do the learning, we also found that learners used the products in unexpected ways. For example, in many cases learners would gather in small groups to go through learning materials or participate in virtual classroom sessions. This isn’t a problem, it’s actually a bonus; it’s just that the materials need to be optimised to support this process.”

According to Sutton, “developers need to get out of the mindset of ‘what am I going to tell them’ and think instead of ‘how can I best facilitate the required learning’. That’s how trainers would work in the classroom, so why not online? To make sure you’re really delivering on what the learner requires, there’s only one answer and that’s piloting, with real users in the real job environment.”

Piloting, usability testing, content testing, conformance testing – readers would be forgiven for thinking that life’s too short. In practice, these activities are a much less onerous commitment than they sound, when compared with the time spent on research, scripting, graphic design and authoring. The reality is that every hour spent on quality control can cut many more hours from the overall schedule, and improves the chance that, once your e-learning product is finally launched, you can sit back and bask in the glory rather than running for cover.



Case study: Royal Mail and TATA
Speaking at this year’s Learning Technologies conference, held in London’s Olympia, Nigel Marsh, e-Learning Lead Project Manager at Royal Mail plc, and his colleague, Jane Deed, outlined the key issues and factors that any organisation must take into consideration if they are to deliver a quality e-learning experience to their employees.

Deed explained: “Our experience with e-learning materials has taught us that the processes that are put in place at the very beginning of the project dictate whether or not the project will be a success. You have to decide on the criteria for success – and how these are to be measured – as well as determining how the deployment of the learning materials will be managed and tracked.”

“Any e-learning materials have to be able to be run on any of the delivery technology available within the organisation – and, within the Royal Mail, that is a wide range indeed!” she smiled. “If this is not the case, the learner will have a ‘bad experience’ with the e-learning materials and will never want to attempt them again.

“You also have to be aware of the cost of technical development,” she said. “The cost of re-working e-learning materials can be expensive – so you can’t afford to ‘tinker’ with the materials up to the point at which they are rolled out. It’s vital that you ensure that the initial specification is correct.”

The Royal Mail work with one supplier – TATA Interactive Systems (TIS), who have produced some 30 e-learning courses for them over the last two years. Sambit Mohapatra is TIS’s Vice President for UK, Europe and Middle East: “One of the key reasons why we have been able to enjoy such a long-lasting relationship with Royal Mail, and why we have attracted as many as 30 new customers over the past year, in difficult trading conditions, is our commitment to quality.”

“We observed several years ago that re-work was a big component of total project time and that we had to do something if we were to provide a highly reliable and cost-effective service.” Not content with achieving ISO 9001 quality certification, TIS became – last year – the first company providing custom-built e-learning solutions to be assessed at level five on the Capability Maturity Model (CMM) scale. The CMM is the most rigorous quality standard worldwide and encompasses leading companies such as Boeing, Raytheon, IBM, NASA and Motorola. It was a short step for TIS to investigate Six Sigma – a management philosophy originally developed by Motorola. The central idea behind Six Sigma is that if you can measure how many ‘defects’ you have in a process, you can work out how to eliminate them and get as close to ‘zero defects’ as possible.

Applying these techniques helped TIS to cut product defects from 44 per thousand to just 18 per thousand – a reduction of around 60 per cent – within the first quarter of 2002. Said Mohapatra: “Customers are looking for people who can deliver a reliable product on time. A recent KPMG audit demonstrated that we are completing projects on schedule in 97% of cases, something that we couldn’t have dreamed of without our major commitment to quality processes.”

Interestingly, Marsh revealed that, “What has proved to be most beneficial for our business is not necessarily the quality of what TATA produce – although that is high – but the quality of the relationship we have developed with them.” Before developers disregard this article to concentrate instead on customer service, they should reflect that they are unlikely to be given the chance to develop this form of long term relationship if their early work does not fully meet customer quality requirements!


Quality standards and processes
Total Quality Management
The first major quality assurance movement, based principally on the work of W. Edwards Deming and applied to great effect in Japanese industry. TQM stressed the need for a ‘right first time’ approach by all participants in a work process, the use of ‘quality circles’ to ensure continuous improvement, and a way of looking at quality objectively in terms of ‘fitness for purpose’.

ISO 9001
ISO 9000 is a group of standards and guidelines for quality management and quality assurance. ISO 9001:2000, the most recent standard, applies to manufacturing and service firms and public agencies. To conform to ISO 9001, organisations must adopt work processes that conform to the standards and be certified as compliant by an awarding body.

Carnegie Mellon Software Engineering Institute: Capability Maturity Model
http://www.sei.cmu.edu/cmm/cmms/cmms.html
A wide-ranging software quality model employed by over 60 major organisations worldwide. Covers areas such as the management of technological change, the reduction of errors and process change management. Emphasises that ‘what can’t be measured can’t be managed’ and ‘what can’t be measured and managed, can’t be improved’.

Six Sigma
A customer-based approach that acknowledges that defects are expensive and that fewer defects mean lower costs and improved customer loyalty. Six Sigma aims for a defect level of less than 3.4 per million opportunities. The underlying process consists of five steps: define, measure, analyse, improve and control, or DMAIC.
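The 3.4-per-million figure can be derived from the normal distribution: a ‘six sigma’ process is assessed at 4.5 standard deviations once the conventional 1.5-sigma long-term shift is subtracted. A quick sketch of that calculation:

```python
# Derivation of the Six Sigma defect target: the one-sided tail area of
# the normal distribution at (sigma_level - 1.5) standard deviations,
# using the conventional 1.5-sigma long-term shift.
import math

def defects_per_million(sigma_level, shift=1.5):
    z = sigma_level - shift
    tail = 0.5 * math.erfc(z / math.sqrt(2))  # one-sided tail probability
    return tail * 1_000_000

print(round(defects_per_million(6.0), 1))  # ~3.4 defects per million
```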

Institute of IT Training: e-Learning Standards
http://www.iitt.org.uk
Comprehensive standards for the design of e-learning materials, endorsed by major industry bodies such as the e-Learning Network.

The World Wide Web Consortium (W3C) Web Content Accessibility Guidelines
http://www.w3.org/WAI
Guidelines for the development of web content that can be accessed by the disabled.

SCORM (Sharable Content Object Reference Model)
http://www.adlnet.org
Technical standards to ensure the interoperability of e-learning content with e-learning platforms, such as Virtual Learning Environments and Learning Management Systems.

Institute of IT Training: Website Usability Standards
http://www.iitt.org.uk
A set of 200 guidelines to help website designers create more usable websites.


E-learning's Greatest Hits
by Clive Shepherd
Available now from Above and Beyond


© 2003 Fastrak Consulting Ltd All rights reserved