Videos on the internet

A bit of press coverage for Recoplace had me monitoring the spike in our site hits. This got me thinking of the following incident from my days at Yahoo Videos. That was when I was wallowing in viewership data.

As someone who used to work with broadcast video and also embedded (hardware) video, I used to claim that much of internet video gets something fundamental wrong. The image below is a screen grab of a YouTube video of the 2017 Australian Open. Looks bad? This is a manifestation of what I claim internet video is getting wrong.

Poor quality video

A few years after I started believing that internet video had evolved a bit wrong, I became an external consultant to Yahoo (2011-13). I made my pet claim at Yahoo too. Every time I said it, someone would tell me, “But YouTube also does the exact same thing”. YouTube was #1 by far and Yahoo was #2 most months (per ComScore). Yet, in spite of my going against the grain, the culture at Yahoo gave me the space and access to attempt to prove my claims.

Over a period of about a year, I conducted various experiments and showed demos to folks in the company, worldwide. I also conducted blind comparison tests with content owners and editors. Slowly, folks came around, and I was cleared to make changes on small Yahoo properties, where viewership was in the thousands per day (as against the tens of millions overall; source: ComScore). The results were good, and suddenly there was pressure on me to stop analyzing and just implement the changes Yahoo-wide. The acceptance was wonderful, but the prospect of a Yahoo-wide rollout made me nervous. Finally, we rolled out the changes I recommended on Yahoo News, one of the biggest Yahoo video properties.

With inevitable apprehension, I monitor the changes being deployed. On Day 1 the metrics do not look good, but I control my panic by reminding myself that changes take time to percolate. Day 2 looks worse and I raise a red flag internally. I can barely control my distress. By Day 3 the quality stats (the stuff I was trying to improve) have fallen considerably, and I request a rollback of my changes. I wanted to bury myself someplace, but soon the urge to debug, using the data from millions of views, was overpowering.

Folks at Yahoo Bangalore, the work’s sponsors really, were totally supportive but I couldn’t look them in the eye. Folks at Sunnyvale were also absolutely focused on why things went south rather than how I screwed up. No wonder people at Yahoo loved working there.

Day 4 allowed us to confirm that the folks responsible for the roll out had done everything perfectly right.

Days 5 & 6 did not surface any flaws in my earlier experiments or calculations.

Day 7 and I had nothing more to do or analyze. Feeling like an idiot was a given. Wishing one had not started this whole “Internet video missed a trick during its evolution” business was a constant. Luckily, that part of my brain where humans continue to outperform machines at using big data found another gear. And, like most engineering solutions, the solution to my mystery was extremely simple – an anticlimax really.

Somewhere around Day 5, the data and charts I was looking at started triggering a sense of déjà vu. By Day 7, this was all I had left to work on. I brought out data from a month or so prior, where I thought I had noticed (and ignored) a small dip in quality. I waded through the data, found the dip and noted the videos that had trended. And suddenly I KNEW where I would find the same big dip: it had to have appeared a few months earlier, when the Boston Marathon bombing happened. By then I knew what was going on, and checking was a mere formality. I no longer needed to search the video metrics for the dip. I googled (yes Sir, at Yahoo, that is what I did) the dates of the incident and went straight to that day’s video metrics in our repository. The dip was present after the Boston bombing – months before my changes were implemented.

The answer was simple. I had implemented my changes for Video on Demand (VoD) and not for Live video. When massive news events happen, Live video dominates consumption, and Live video often had lower quality stats. That is what caused the dip whenever there was a lot of Live content; it had nothing to do with my changes.
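The mix-shift effect is worth spelling out: neither segment’s quality changed, only the share of Live traffic did, yet the aggregate number dipped. A minimal sketch, with entirely hypothetical view counts and quality scores, shows how this happens:

```python
# Hypothetical illustration of a traffic mix shift dragging down an
# aggregate quality metric while per-segment quality stays constant.

def aggregate_quality(segments):
    """Views-weighted average quality across traffic segments.

    segments maps a segment name to a (views, quality_score) pair.
    """
    total_views = sum(views for views, _ in segments.values())
    return sum(views * q for views, q in segments.values()) / total_views

# Normal day: VoD dominates. (views, quality score out of 100 - made-up numbers)
normal = {"vod": (9_000_000, 80.0), "live": (1_000_000, 60.0)}

# Big news event: Live dominates. Per-segment quality is unchanged.
news_day = {"vod": (3_000_000, 80.0), "live": (7_000_000, 60.0)}

print(aggregate_quality(normal))    # 78.0
print(aggregate_quality(news_day))  # 66.0 - a "dip" caused purely by the mix
```

The aggregate falls from 78 to 66 even though VoD and Live each score exactly what they always did – which is why segmenting the metric by content type, not just watching the headline number, was the key.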

The rollout of our changes had coincided exactly with the commencement of the Zimmerman trial – a few days when a lot of people watched live video from the court.

Day 9 or so. I explained what was going on and asked for the changes we had rolled back to be reintroduced. This time, the trend lines behaved like what ISRO calls a textbook launch: they matched what was expected. I could even trade off between bandwidth and quality as I had planned, and I could see the consequences of this trade-off in the gross metrics from millions (and, in time, billions) of video views.

And the icing on the cake: a few days later, the Zimmerman trial verdict made Live video dominate again, and the same behavior we had seen earlier manifested.

A month or so later, we implemented the changes for all Video on Demand at Yahoo. It was a perfect launch. A few months later, Yahoo scaled down its Bangalore operations, and we consultants were the first out the door. I never got a chance to test the same changes on Live content.

Now, of course, Yahoo no longer exists.

I continue to analyze videos from different providers. I see that YouTube still carries the problem, though they have been optimizing around it. I see the same problem on LinkedIn, Vimeo, CNN and many other providers. I have continued working on end-user video quality and wish I were still at Yahoo, because I am sure Yahoo would have given me a chance to prove out and implement the new improvement opportunities I have found.


https://www.comscore.com/Insights/Rankings/comScore-Releases-February-2016-US-Desktop-Online-Video-Rankings