Evaluating online video

Whilst I was Multimedia Producer at the Wellcome Trust I began a programme of evaluation for our online videos. I left before I was able to see most of the results, but in this post I’ll share the plan of action I had for the evaluation, which used a number of different approaches. I was building upon work that Danny Birchall and I did in games evaluation, using some of the same methods, but evaluating video presents a somewhat different set of challenges. If you have any thoughts about other techniques one could use, or ways of improving on my methodology, I’d love to hear them.

Wellcome Trust YouTube Channel

Why evaluate videos?

At the Trust we (me and the other Multimedia Producer, Barry J Gibb) were creating short documentaries about the research funded by the Trust, educational films, and so on. The audience varied, but was usually considered to be an interested lay audience or a GCSE/A-level student. Our video dissemination strategy was fairly straightforward: we posted them on our site (where they frankly got a bit lost) and on YouTube (which I always considered far more important, both because of the much greater chance someone would find them there and because of the ability to share).

I set the YouTube channels up in early 2009, but had no formal plan of evaluation (oh, to turn the clock back!) and simply kept an ad hoc eye on analytics and feedback. The analytics were sometimes interesting, but commentary on YouTube, you may have noticed, is rarely that worthwhile. When we got lots of it, it was generally polarised and sometimes unpleasant. When we got none, which was usually the case, it was like putting the videos up to the sound of tumbleweeds and crickets. We had no real idea who was watching our videos, whether or not they liked them, or how they were using them, so it was hard to know how to improve them.

That was the key, for me. How could we develop a strategy for the future when we didn’t really know what had worked or not in the past? But all was not lost. A lot of information was just sitting there, in the form of analytics, and to find out more, we just had to ask.

My approach

I split the project up into a number of different areas:

1. Analytics
2. Staff feedback
3. Audience feedback
4. External organisations

Analytics

My plan was to track several parameters for the lifetime of each video, ideally broken down into years. I had decided to go for calendar years rather than years since posting, so that videos could be compared across the same period, but was somewhat in two minds about it. YouTube Analytics offers information about views, traffic sources, and playback locations, which I had planned to include, and demographics and audience retention, which I hadn’t. Demographics on YouTube are only tracked for people who are signed in and have added those details to their profile, so are largely worthless. I guess they may be less so now that it’s owned by Google and more people are likely to be signed in, but I still wouldn’t place much stock in them. The audience retention graphs are interesting, but hard to compare.
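
To give a sense of the sort of record I had in mind, here’s a minimal sketch of a per-video, per-calendar-year row; the field names are purely illustrative and don’t come from any real reporting system:

```typescript
// Illustrative only: the shape of one video's metrics for one calendar year,
// mirroring the parameters mentioned above. None of these names come from a real system.
interface VideoYearRecord {
  videoId: string;                            // YouTube video ID
  title: string;
  year: number;                               // calendar year, e.g. 2012
  views: number;
  trafficSources: Record<string, number>;     // e.g. { "YouTube search": 420, "Related video": 310 }
  playbackLocations: Record<string, number>;  // e.g. { "Watch page": 600, "Embedded player": 130 }
}
```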

The analytics presented the biggest challenge for the project, though. The first problem was that the analytics were split between Google Analytics for the videos on our website and YouTube’s own analytics. The latter are much more restrictive than the former, though they give more detail on, for example, attention patterns. What GA can tell you depends very much on how it has been set up for video; thankfully, when GA was set up for the Wellcome Trust site, someone had the sense to track actual video plays as events rather than relying on pageviews. But still, marrying GA and YouTube data together wasn’t going to be easy.
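
If you’re setting this up yourself, tracking plays as events just means firing an event when the player actually starts. A minimal sketch along those lines, using the analytics.js event syntax (the category and label here are made up, and this is not how the Trust’s site was actually wired up):

```typescript
// Hypothetical sketch: count real plays as GA events, separately from pageviews.
// Assumes the classic analytics.js ga() function is already loaded on the page.
declare function ga(command: string, hitType: string, category: string, action: string, label?: string): void;

const player = document.querySelector<HTMLVideoElement>('#video-player');
if (player) {
  player.addEventListener('play', () => {
    // Category/action/label are illustrative names, not the Trust's actual setup.
    ga('send', 'event', 'Videos', 'play', document.title);
  });
}
```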

The second issue was that getting data out of YouTube isn’t easy. You can download spreadsheet reports for one parameter over one time period at a time, but with well over 50 videos and several parameters to track over several years, I didn’t fancy doing that kind of epic legwork. There is a solution, but it’s not ideal either: Google have an experimental API for YouTube Analytics, and as I left our IT department were trying to write an application that would use it to build reports.
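
For anyone looking at this now, the query itself is fairly simple; here’s a rough sketch against the current v2 reports endpoint (which has since replaced that experimental API), assuming you already have an OAuth access token with the analytics read-only scope:

```typescript
// Rough sketch: views and watch time per video for one calendar year,
// via the YouTube Analytics API (reports.query). Assumes a valid OAuth 2.0
// access token with the yt-analytics.readonly scope; error handling omitted.
async function viewsByVideo(accessToken: string, year: number) {
  const params = new URLSearchParams({
    ids: 'channel==MINE',
    startDate: `${year}-01-01`,
    endDate: `${year}-12-31`,
    metrics: 'views,estimatedMinutesWatched',
    dimensions: 'video',
    sort: '-views',
    maxResults: '200',
  });
  const response = await fetch(
    `https://youtubeanalytics.googleapis.com/v2/reports?${params}`,
    { headers: { Authorization: `Bearer ${accessToken}` } },
  );
  return response.json(); // rows of [videoId, views, estimatedMinutesWatched]
}
```

Looping something like that over each calendar year and writing the rows out would give you the per-video, per-year table described above.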

I did have a couple of initial findings from my more ad hoc monitoring of stats over the years, though these aren’t hugely robust and I had hoped to interrogate them further with more data.

  1. On YouTube, most people were arriving after searching for relevant keywords *in YouTube* (i.e. not through Google organic search), or from related video links. The latter were definitely responsible for hits on our most popular videos, which were coming up as related links alongside other high-profile videos.
  2. Many more people were arriving directly at video pages than at the channel page.

Staff Feedback

The Wellcome Trust has several hundred employees and, as with our external audience, we didn’t know what they all thought of our video output. We didn’t even know if they were all aware of it, that they could use it, or that they could commission their own films. Their opinions and needs were also very important to developing our video strategy. I put together a survey in SurveyMonkey, had it checked by our UX expert (always worth doing), and posted it on our intranet. I wasn’t around for the results of this one.

Audience Feedback

This was the bit I was most interested in: qualitative feedback from our audience. For our games evaluation work we had put up a survey at the end of the game, followed up with telephone interviews, and also looked at community commentary. As we’ve established, for YouTube the latter is not particularly useful, and we also didn’t have comments open on our site. But the survey was still an approach we could take, except that instead of one game we had ninety-three videos, and instead of several hundred thousand plays over the launch weekend, most videos would have a few hundred views over their lifetimes.

This meant that we would need a much longer data-gathering period, as only a small fraction of people will answer a survey. As always, we ran a prize draw with this to help the numbers, but even so, it seemed wise to assume it would be six months (at least) before we had enough responses to come to any conclusions.
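
To show why the timescale stretches out like this, here’s a purely illustrative back-of-envelope calculation; every figure in it is an assumption, not a number from the Trust’s actual stats:

```typescript
// Back-of-envelope only: all figures below are assumptions for illustration.
const monthlyPlaysAcrossChannel = 5000; // assumed total plays per month across ~93 videos
const surveyResponseRate = 0.005;       // assumed 0.5% of viewers complete the survey
const targetResponses = 150;            // assumed sample size before slicing the results

const responsesPerMonth = monthlyPlaysAcrossChannel * surveyResponseRate; // 25
const monthsNeeded = targetResponses / responsesPerMonth;                 // 6 months
console.log(`Roughly ${monthsNeeded} months to reach ${targetResponses} responses`);
```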

I drew up some survey questions with the help of colleagues and again had them checked over by Nancy Willacy, our UX expert, to make sure they were understandable and that we were asking the right questions. There’s a lot out there on survey design, including some points from our own experience in this paper on games evaluations that I wrote with colleagues at Wellcome, the Science Museum and the National Space Centre. Being very clear about what you are trying to find out is very important; it’s easy to creep beyond that and try to find out everything, but nobody wants to answer a huge survey online.

My intention was to add the survey as a link on our site next to all videos, and also to put it up as a linked annotation on the YouTube videos. I did the YouTube links myself before I left, which was a pretty tedious job. You can see an example at the end of this video. I have a feeling that you may have fewer options in the annotations if you aren’t a YouTube partner, and that adding links like this might not be possible. I guess you’d have to add one in the text below, but that’s not ideal.

External organisations

Finally, I wanted to know what other people were doing with video at similar organisations (and also what they thought of our output, if they’d seen it). This was to be much less formal: simply identifying other organisations and writing to them with a list of questions, at least in the first instance.

And you?

I hope some of that is interesting or useful, but it would also be great if other people shared their experiences of evaluating video. What do you do in your organisation? What could we have done better here? Even better, does anyone have any results they can share?