Freeing the Video Industry’s Data from Its Black Box
What it means to unbox the black box, release video data from its silos, and improve the online video experience.
The impact of these unforeseen times has sharpened the focus on video analytics dramatically. People have embraced video streaming not only as the main way they consume entertainment, but also as something that is rapidly taking on a major role in other industries.
We’re starting to see video streaming adopted for work conferences, telehealth appointments, education, and more during this pandemic. Video in general has had a huge boost, not just because of our yearning for the connectivity that keeps us sane and entertained, but because organizations are realizing they can still survive and thrive by taking their business execution to the screen. With a somewhat “no hands on deck” mentality, organizations are using virtual technologies, including video streaming, to replace in-person touchpoints and conduct business as usual.
Where the Industry Is Now
Even before recent months, the industry saw big changes as platforms shuffled under a few major conglomerates. Disney swallowed up ESPN and a majority stake in Hulu, while Viacom scooped up Pluto TV and merged with CBS. Comcast acquired NBC and Sky, and AT&T followed suit with DirecTV, Turner, and Otter Media. All the big players are scrambling to stake their claim in the new streaming world order.
With video now making up roughly 80% of internet traffic, companies can’t dispute that their consumers prefer to consume content via video. With that in mind, those who value the end-user experience are constantly looking for answers, and specifically for more data to help them better understand and control the streaming video experience.
Today’s Black Box
A common pain point we hear is that analysts, marketers, and other decision-makers are frustrated by the walled gardens of information they’re forced to operate in. Their patchwork of single-platform tools scatters data across the video stack and fails to generate the insights that drive actions or business outcomes. The result is a black box of information without context or visibility.
For product, operations, marketing, advertising, and business teams at a content publisher, insights come from several systems and technologies that perform analysis without anyone fully understanding their inner workings. These black boxes lack transparency because they are composed of very complex systems, conflicting inputs, and complicated algorithms.
When it comes to video streaming, improving the experience of consuming video content requires real-time data. For this to happen, the data must be unified in real time to power observability and adaptability and to optimize solutions. For the video stack to truly become actionable, you must keep a constant pulse on the health of the data in your system. Identifying and evaluating data quality and discoverability issues leads to healthier pipelines, more productive teams, and happier customers.
Why It’s Taken So Long
Streaming is complicated. It’s unique in that it requires an uninterrupted experience for its entire duration. Customers expect their viewing to be seamless, without endless buffering spinners. Almost every other use of the internet is more forgiving: it’s adaptable enough to deliver a file here and a text there, and users’ expectations for file, text, and photo sharing are far less sensitive to microsecond-level changes across the end-to-end environment. Video is not that forgiving. It’s highly susceptible, down to the very millisecond, to impacts anywhere along the delivery chain.
The unique challenge with video is that it’s not a system under a single entity’s control; its control is spread across many entities. From the content owner, to the vendors who support that owner, to the internet providers who balance traffic and connectivity, it’s a distributed system that must come together and work synergistically for the final outcome to meet expectations. That requires the ability to observe, trace between, and influence the interactions among multiple back-end systems.
The industry hasn’t been able to apply consistent measurement across the end-to-end process, which prevents businesses from toggling variables to change the output. Many systems (encoders, origins, CDNs, transit networks, ISPs) are used to prepare, deliver, and play content, but each is monitored independently. Moreover, the data and metric outputs from those systems are inconsistent and unstandardized, preventing true apples-to-apples analysis. And without a common understanding of end-to-end performance, we can’t pinpoint operational breakages or areas to improve. This leaves us with the black box: we’ve never agreed on precise, consistent measurement, standardized it as an industry, or pulled together a framework the way we have for user experience. It’s a fully distributed system that needs to come together.
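To make the inconsistency concrete, here is a minimal sketch of what unifying metrics from independently monitored systems can look like. All field names (a CDN’s "ttfb_ms", a player’s "startupTime") are hypothetical stand-ins, not any real platform’s schema; the point is only that each source needs an explicit mapping into one shared schema with agreed units before apples-to-apples analysis is possible.

```python
def normalize_cdn_record(record: dict) -> dict:
    """Map a hypothetical CDN log record onto a common schema (ms units)."""
    return {
        "session_id": record["req_id"],
        "time_to_first_byte_ms": record["ttfb_ms"],  # already in ms
        "source": "cdn",
    }

def normalize_player_record(record: dict) -> dict:
    """Map a hypothetical player beacon onto the same schema."""
    return {
        "session_id": record["sessionId"],
        "time_to_first_byte_ms": record["startupTime"] * 1000,  # seconds -> ms
        "source": "player",
    }

# Two views of the same session, reported with different names and units
cdn_event = {"req_id": "abc123", "ttfb_ms": 180}
player_event = {"sessionId": "abc123", "startupTime": 0.21}

# After normalization the records are directly comparable
unified = [normalize_cdn_record(cdn_event), normalize_player_record(player_event)]
```

Once every system’s output passes through a mapping like this, discrepancies between sources become visible and measurable instead of hidden behind mismatched names and units.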
Generally speaking, when it comes to data sharing, how nice would it be to share some non-private telemetry data about how certain services are performing? Sharing this technical feedback with multiple outside vendors would in turn help them work cohesively and serve you better. Are there ways to collect the right data about system performance and its effects on video quality, and to share that data today?
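One simple pattern for that kind of non-private sharing is an allowlist: only fields both sides have agreed are performance telemetry ever leave your systems. The sketch below assumes illustrative field names; it is not a real platform’s schema.

```python
# Fields agreed with the vendor as shareable, non-private telemetry
VENDOR_ALLOWLIST = {"cdn_pop", "rebuffer_count", "avg_bitrate_kbps", "error_code"}

def redact_for_vendor(event: dict) -> dict:
    """Return a copy of the event containing only allowlisted fields."""
    return {k: v for k, v in event.items() if k in VENDOR_ALLOWLIST}

raw_event = {
    "user_id": "u-42",       # private: never leaves our systems
    "ip": "203.0.113.7",     # private
    "cdn_pop": "sea-01",
    "rebuffer_count": 3,
    "avg_bitrate_kbps": 4200,
}

shared = redact_for_vendor(raw_event)  # safe to forward as vendor feedback
```

An allowlist (rather than a blocklist) fails safe: a new private field added upstream is excluded by default instead of leaking until someone notices.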
How To Unbox the Black Box
Breaking the seal around what’s happening for the end-user is the first order of business. However, the tools available today for monitoring the end-user experience produce results with great variability in measurement, which has led to an inability to interpret the current state. Even when these insights are combined with those from other back-end platforms, the inconsistency of the results makes it difficult to align all stakeholders and take action efficiently. Today, manual re-interpretation of metrics is often required, and this prevents any scalable, automated, or real-time improvements from being deployed.
So how do we make our data and metrics reliable and insightful for everyone? Investments need to be made to ensure consistent data collection and measurement at every stage. Establishing a single methodology for what is monitored and how it is measured creates a common understanding of system performance. Then, when we tie insights together, we can easily deduce which variables impact our end-to-end workflows, and thus actually control the outcome.
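A "single methodology" can be as literal as one shared function that every team uses to compute a given metric. As an illustration (not an industry standard), here rebuffer ratio is defined once, as stall time over watch time, and applied identically to data from two different monitoring systems:

```python
def rebuffer_ratio(stall_seconds: float, watch_seconds: float) -> float:
    """Fraction of the session spent stalled; 0.0 when nothing was watched."""
    if watch_seconds <= 0:
        return 0.0
    return stall_seconds / watch_seconds

# The same definition applied to sessions measured by different systems
sessions = [
    {"stall_s": 2.0, "watch_s": 100.0},  # from player analytics
    {"stall_s": 5.0, "watch_s": 200.0},  # from a CDN-side estimate
]
ratios = [rebuffer_ratio(s["stall_s"], s["watch_s"]) for s in sessions]
```

Because every number came out of the same definition, differences between the two sources now reflect real behavior rather than differences in how each team happened to compute the metric.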
Agreeing not only to share insights but also to use shared data collection and measurement methodologies will allow all stakeholders, including external vendors, to align and take the actions that best support the end-user experience.
A quality video data platform helps every stakeholder in the end-to-end video pipeline do their job more efficiently, and thus levels the playing field so customers of any size can take advantage of the internet to deliver content. You don’t have to be Comcast or Disney to create a great experience for your users if you can efficiently and effectively align all parties involved to make it happen. You can start by creating data pipes that let you customize which datasets are shared internally and which are provided as feedback to vendors.
Essentially, optimizing an end-to-end workflow requires that everyone be able to optimize their own system, and thus do their part. If we do that, we can raise the bar for all video delivery and deliver flawless video experiences.
A Glimpse into the Future
A business operates best when everyone’s on the same page, and your video systems should run the same way. If you can tap into the power of raw data to align your technologies around a single source of truth, you’ll create a vast, connected ecosystem.
The future surely holds more creation and adoption of data standards, and more technical data sharing between entities at different stages of the end-to-end workflow; together we can eliminate the black box. If all parties can be more transparent, practices can be improved and opportunity costs reduced. The more controlled and standardized data sharing becomes, the more likely end-users are to get a premium experience.