
September 16, 2020



Rod Ward

Your figures for calculating LOE are actually very close to my own experience.

I develop the kind of Level 2 e-learning (with Adobe Captivate) that you describe. It has voiceover, closed captioning, interactivity and some animated graphics (often created using Adobe Animate and brought into the Captivate project as OAM files).

For projects that require only a little animation (maybe a couple of OAMs per module), I would normally tell clients to expect about one 8-hour day of developer time for every 1.5-1.75 minutes of completed content. If the level of animation complexity ramps up, output can drop to about one minute of finished content per day.

I generally use 1.5 minutes per day as a ballpark figure when clients want a rough early estimate of how long it would take me to create an e-learning course of a given length. So for a 30-minute module I would usually quote around 20-25 developer days (giving myself a one-week buffer at 25 days, because the first week is usually spent in meetings with SMEs and stakeholders, and in creating new templates that comply with their branding, etc.).
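The ballpark arithmetic above can be sketched as a small helper. This is just an illustration of the figures stated in the comment (1.5 minutes of finished content per developer day, plus roughly a one-week buffer for kickoff meetings and template setup); the function and parameter names are hypothetical, not anything from a real tool.

```python
def estimate_developer_days(content_minutes, minutes_per_day=1.5, buffer_days=5):
    """Rough LOE estimate: days of pure development plus an up-front buffer.

    content_minutes  -- length of the finished module in minutes
    minutes_per_day  -- finished content produced per 8-hour developer day
    buffer_days      -- allowance for SME/stakeholder meetings and templates
    """
    base_days = content_minutes / minutes_per_day   # pure development time
    quoted_days = base_days + buffer_days           # what you quote the client
    return base_days, quoted_days

# A 30-minute module at 1.5 minutes/day: 20 base days, quoted as 25.
base, quoted = estimate_developer_days(30)
print(base, quoted)  # 20.0 25.0
```

Dropping `minutes_per_day` toward 1.0 models the heavier-animation case described above, which pushes the same 30-minute module toward 30 base days.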

Obviously each developer varies in skill level and style of content (i.e. how complex they like to build things), so these calculations will vary from one developer to the next. But over the past 10 years or more I have found this method of estimating project costs and timelines to be very usable.

The one downside is that it hinges on having a fairly accurate idea of how many minutes of content the client wants. Technically this is NOT the best way to work out how long a course should be. For certain types of learning where actual skills are being taught, the duration required to achieve mastery should be governed by the difficulty of the tasks being taught, not by what the client wants to spend. But for compliance courses and other content that is purely information-based rather than targeted at building a skill, these calculations work fairly well.

I have saved your post for reference in case I have a client that disputes my calculations.


I agree with your assessment of the levels of interactivity in eLearning. When I'm quoting on work, I find the current method of classifying how interactive a proposed elearning project will be ineffective. For example, if you suggest Level 2 to a client, they may form an expectation that doesn't match what you have in mind. It's hard to imagine that all the elearning ever created fits within just three very broadly defined buckets.
