Archives

Best Practices For Using A Multi-CDN Strategy: How To Balance, Prioritize and Optimize Traffic

The debate surrounding the use of a multi-CDN strategy has been gaining momentum over the past few months, with more case studies showing how it can be done. For a while now, multiple vendors have provided CDN load-balancing as a service, and in that time customers have learned a lot about the process of configuring CDNs to improve quality and match business goals. When used correctly, a multi-CDN strategy provides great advantages to content owners, including the ability to better control quality, prevent overage charges, ensure bandwidth commitments are met, and select delivery based on additional requirements. A multi-CDN strategy requires two decisions: the first is the criteria used to select the CDN, and the second is the process by which the switch between CDNs is carried out.

There are many selection strategies that can be used when discussing CDN balancing, and solutions provider NPAW recently shared with me how they explain the process to customers. There are three types of strategies: balanced, prioritized, and optimized. A balanced strategy simply distributes traffic using thresholds (like traffic served or concurrent streams), spilling over into secondary CDNs once a specified limit is reached. A prioritized schema defines a criteria hierarchy, which may include the platform, ISP, device, or protocol in use, with each tier serving traffic up to a certain level. For example, you can better control concurrencies on your own delivery network by diverting overflow to a regional or global CDN, depending on the amount of incoming traffic or the number of concurrent users. Finally, at the most granular level, an optimized strategy leverages performance metrics in the decision-making process. This means the chosen CDN is the “best” performing CDN, the one that has received the highest score across a number of factors, including recent measurements of QoE metrics within a specific region, for a specific piece of content, considering the ISP and device of the end-user intending to access the video.
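As a rough illustration, the three strategies might be sketched like this (all names, thresholds, and scores here are hypothetical, not NPAW's actual logic):

```python
# Illustrative sketch of the three CDN-selection strategies.
# CDN names, limits, and scores are made up, not NPAW's implementation.

def balanced(cdns, concurrent_streams, limit=10_000):
    """Spill over to the secondary CDN once the primary hits a threshold."""
    return cdns[0] if concurrent_streams < limit else cdns[1]

def prioritized(rules, request):
    """Walk a criteria hierarchy (device, ISP, ...) until a rule matches."""
    for matches, cdn in rules:
        if matches(request):
            return cdn
    return rules[-1][1]  # fall through to the last entry

def optimized(cdns, qoe_scores):
    """Pick the best-scoring CDN from recent QoE measurements."""
    return max(cdns, key=lambda c: qoe_scores[c])

# Example: the optimized strategy with fabricated QoE scores
scores = {"cdn-a": 0.91, "cdn-b": 0.86, "cdn-c": 0.95}
print(optimized(list(scores), scores))  # -> cdn-c
```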

By choosing the best performing CDN for each user/view, OTT platforms and content distributors can significantly reduce the buffering rates, play failures, and join times of their services, which drives more consumption, reduces churn, lengthens play times, and maximizes each user's quality of experience. The second part of the process is deciding how to actually perform the switch once the CDN has been selected. There are three main ways to execute a switch: based on DNS routing, through the sole use of client-side plug-ins, and based on client- or server-side communication between APIs.

DNS: A CDN switching technique that works at the DNS level can be integrated without modifying the app, as it is independent of the application layer. This is a big advantage as far as integration is concerned, although it makes CDN traffic analysis more difficult afterwards. The main benefit (which can also turn into the worst drawback) is that the application knows nothing about the CDN being used and therefore cannot influence the DNS routing.

A DNS routing switch points at the URL of the service, and this URL is divided into two parts: the base, which changes every time there is a CDN switch, and the content, which specifies the video to be delivered. For VOD streaming, DNS routing poses little risk with such URL modifications, but switching live streams might not be possible given the specific URLs used by some CDNs, where not only the base route changes but also parameters throughout the entire URL.
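To make the base/content split concrete, a DNS-level switch effectively amounts to swapping the base of the URL while leaving the content part untouched (the hostnames and path below are made up for illustration):

```python
from urllib.parse import urlsplit, urlunsplit

# Illustrative only: swap the "base" (host) of a stream URL while keeping
# the "content" path intact, as a DNS/URL-level CDN switch effectively does.
def switch_base(url, new_base):
    parts = urlsplit(url)
    return urlunsplit((parts.scheme, new_base, parts.path,
                       parts.query, parts.fragment))

url = "https://cdn-a.example.com/vod/movie123/master.m3u8"
print(switch_base(url, "cdn-b.example.com"))
# -> https://cdn-b.example.com/vod/movie123/master.m3u8
```

As the live-streaming caveat above suggests, this simple host swap only works when the rest of the URL is CDN-agnostic; when a CDN embeds its own parameters throughout the URL, changing the base alone is not enough.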

PLUGINS: CDN-switching based on plugins is basically a third-party software component inside the player that makes the decision to switch between CDNs. When switching between CDNs (or even renditions), this allows certain QoE metrics and performance issues affecting the user's device (for example, CPU performance or memory usage) to be taken into account. NPAW says this degree of autonomy, although it may seem tempting, is very dangerous because these systems are making very important decisions without any knowledge of the whole business context.

Plugin-based switching may make automatic adjustments to account for preset QoE parameters, but since the plugin is unaware of the context of that adjustment, the chosen CDN might not match the business and strategic goals the distributor desired. However, the main risk of having an autonomous system in your player that makes decisions purely based on performance is introducing another “middleman” that can fail along the critical path of your video delivery plan.

API: While a bit more complicated to implement than a DNS solution, one of the most important differences between an API-based model and the other strategies is that it is completely scalable. With plugin-based switching, for example, the switching happens in the players, so each new player a customer adopts requires a new implementation, with the attendant cost in time and development. By contrast, an API-based solution is totally scalable because it runs on the server, so the integration of new players is fast, effective, and carries no additional cost.

Also, with an API-based switching method, the communication can be extended from client-server to server-server; in fact, this is the communication method the majority of industry leaders use. The client or server side sends a request to an API, such as NPAW's, asking which CDN is best for a specific IP and device. NPAW's API runs its algorithm against the configurations the customer has previously set and returns the CDNs ordered, in real-time, according to the configured switching method. The client's API then makes the final CDN choice and redirects the data flow. Here's a diagram from NPAW that shows how their Youbora solution works:

[Diagram: client-server vs. server-server communication]
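A minimal sketch of the request/response flow described above (the query fields and JSON response shape are assumptions for illustration, not NPAW's actual API):

```python
import json

# Hypothetical sketch of the decisioning flow: the endpoint, query fields,
# and response shape are invented for illustration, not NPAW's real API.

def build_query(api_url, client_ip, device):
    """Compose the decisioning request for a specific viewer IP and device."""
    return f"{api_url}?ip={client_ip}&device={device}"

def choose_cdn(api_response_body):
    """The API returns CDNs ordered best-first; the caller picks the top
    entry and redirects the data flow to it."""
    ranked = json.loads(api_response_body)["cdns"]
    return ranked[0]

# Example with a canned response (in production this body would come from
# an HTTP call to the decisioning service):
body = '{"cdns": ["cdn-b", "cdn-a", "cdn-c"]}'
print(choose_cdn(body))  # -> cdn-b
```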

Last but not least, NPAW's solution sits outside the “critical path” because the platform operates server-side, not client-side. The player plugins only collect useful information; they do not execute actions, which means they can never cause a total blackout of the service, with the economic costs that would mean for customers.

Content owners I have spoken to have tried and tested many alternatives to a CDN-switching method based on APIs. Yet the industry appears to be settling on the view that an API-based switching technique offers far more benefits than the other solutions explored above, including minimal client-side customization, low client-side cost, and greater flexibility. Multi-CDN deployments aren't new in the industry, but they are getting a lot of traction of late, with solutions like NPAW's and others in the market that let you do it easily, cost-effectively and, most importantly, based on real video QoE data.

AT&T’s Streaming Service DirecTV Now Peaking At 35,000 Simultaneous Users

In speaking with third-party suppliers responsible for delivering the video for AT&T’s DirecTV Now live streaming service, I can confirm that to date, AT&T has peaked at around 35,000 simultaneous viewers. That doesn’t say how many subscribers AT&T has in total, but if we use the industry average that at any given time about 25% of users are streaming from the service, AT&T would have about 140,000 total subscribers. [Updated 1/20/17: AT&T says they have 200,000 subscribers] But that number would also include those that are testing the service for free for the first 30 days. While I don’t know what percentage of total users are paid versus non-paid, I would estimate AT&T has less than 100,000 paying subs for their new service since it launched.
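That back-of-envelope estimate works out as follows (simply spelling out the article's own arithmetic):

```python
# Back-of-envelope from the article: peak concurrent viewers divided by
# the assumed 25% concurrency ratio gives an estimate of total subscribers.
peak_concurrent = 35_000
concurrency_ratio = 0.25  # industry-average share of subs streaming at once

total_subs = peak_concurrent / concurrency_ratio
print(int(total_subs))  # -> 140000
```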

I would also expect that, due to all the technical problems AT&T has had with the service and the volume of negative press the company continues to get, the rate of sign-ups has slowed since the service launched in the market more than 30 days ago. When a service like AT&T's struggles with reliability, video quality and functionality, and has non-existent support for consumers, it's not going to fare well. Add in the fact that it is not available on Roku, Xbox and PlayStation devices, and you can't expect the offering to do very well. Even though some want to suggest otherwise, AT&T's DirecTV Now service won't have any material impact on cable TV subscriber numbers and isn't a catalyst for cord cutting.

Calling All Vendors: Looking For VR Help/Demos at Streaming Conference

At the next Streaming Media East show in NYC, taking place May 16-17th, we are looking to add a big hands-on VR component to the show floor. We already let attendees get hands on with all of the latest video devices and platforms in the market and now we want to add VR to the mix. If you are involved in VR and want to help provide the ability for attendees to see VR demos, please reach out to me. We are looking for companies that can bring gear, do demos and showcase the latest that is taking place with VR technology. You can email me or call me at 917-523-4562.

Wall Street Needs To Run The Numbers: OTT Growth Not A Big Driver Of Revenue For CDNs

Over the past week or so I've seen a few reports regarding Akamai's valuation in the market, with some suggesting that the “growth in OTT business models” will be a “catalyst” for Akamai's revenue. Let's take Akamai out of the picture, as the point I'm making isn't about what Akamai's valuation should or should not be, but rather the incorrect argument some are making about the impact of OTT services on CDN revenue.

The term OTT is used to describe just about any kind of video these days, but in reality most on Wall Street define it as a premium video service. The problem is that none of the services with a large subscriber base, other than Hulu, rely on third-party CDNs to deliver their content. And the ones that do, like CBS All Access, HBO Now, Showtime Anytime, WWE, Sling TV, DirecTV Now, PlayStation Vue, etc., all have small numbers. Most are around 1M total subs (WWE has almost 2M), with many, like PlayStation Vue and DirecTV Now, having far fewer.

It’s ok to suggest that as OTT services grow over the years there will be a positive impact on the CDNs, but the problem is that many are suggesting that positive impact will come in the next few quarters. It won’t. All of these OTT services that don’t have their own CDN use more than one third-party CDN to deliver their content, with many using three. So even if traffic grows, the percentage of traffic any one CDN gets is small. For instance, DirecTV Now uses Akamai, Limelight and Level 3, as did MLBAM, which did the NFL streams for Twitter. So any growth is tempered by the fact that traffic is usually not exclusive to any one CDN.

And none of this is a secret; it’s just math and it’s easy to figure out. So why is it that many on Wall Street still don’t know what the real impact of OTT growth is, with real numbers? If one subscriber to any of these OTT services watched 2 hours of video a day, for a total of 60 hours a month, with 50% viewing on a small screen and the other 50% on a large screen, the total number of bits delivered would be about 54GB per month. And with the average price per GB delivered on these sized deals being around $0.005 per GB, the revenue to a CDN would be $0.27 per user, per month. And if the service has 1M subs, which again most don’t, the total value to a CDN would be $270,000 per month. Except that they aren’t getting 100% of that traffic and typically are getting 50% or less. Total revenue to a CDN would be $90,000 if the OTT provider was using three CDNs. [Note the revenue per user to a CDN would be higher/lower depending on the number of hours they watch, but the average high/low in the industry, outside of Netflix, is around 40/80 hours per month]
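The arithmetic above can be reproduced in a few lines, using the article's own assumptions:

```python
# Reproducing the article's back-of-envelope CDN revenue math.
gb_per_month = 54        # data delivered per user (60 hrs, mixed screens)
price_per_gb = 0.005     # typical $/GB at this deal size
subs = 1_000_000
cdn_share = 1 / 3        # traffic split across three CDNs

revenue_per_user = gb_per_month * price_per_gb  # $/user/month
total = revenue_per_user * subs                 # $/month to all CDNs
per_cdn = total * cdn_share                     # $/month to one CDN

print(round(revenue_per_user, 2), round(total), round(per_cdn))
# -> 0.27 270000 90000
```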

Another data point no one wants to acknowledge is how slowly these OTT services are actually growing. CBS said they want to have 4M subs to CBS All Access by 2020. If they make that target, they will have gone from 0-4M subs in seven years. That’s no “catalyst” on any third-party CDN’s revenue, that’s slow organic growth. I’ve also seen a few reports that suggest that content providers “risk subpar subscriber/viewer experience by utilizing in-house CDNs”. That could not be further from the truth, and anyone who writes that sounds like they are listening to someone who works in the marketing department of a CDN provider. Just look at how many live events and OTT services have had quality issues and failures in 2016, all while being delivered from third-party CDNs. Some do say that OTT services won’t move the needle for Akamai or other CDNs with regard to revenue, but then also say in the same sentence that it provides “growth opportunities”. Without defining the size of the opportunity, or how fast it is growing, “growth opportunities” means nothing.

Updated 4:28pm ET: Here’s a statement from another report I saw today that said “AT&T’s DirecTV Now, targeting +20M cord never households is a positive catalyst.” It’s not a “catalyst”. AT&T hopes to have 1M simultaneous streams a year from now. If they get that many, which I don’t think they will, and Akamai gets 1/3 of that traffic, it’s only worth about $1M per year to each CDN. For a company like Akamai that’s doing over $2B a year in revenue in 2016, an extra $1M or even $2M a year is not a “catalyst” at all.

The growth of OTT and all video services is having a positive impact on the CDNs, but that growth is tempered by the fact that the actual rate of growth is small, pricing for the services is low, traffic is split amongst multiple CDNs, and the number of bits needed to view on mobile and tablets is 2/3 smaller than what’s needed for a TV screen. I wish Wall Street would rely on actual data, numbers we have in the market that show what people are consuming, how often, at what bitrate, and what that value is, in revenue, to the CDNs.

Open Letter To Streaming Media Vendors: Stay Focused In The New Year

While many have asked me if I plan to do a year in review of the streaming media industry, vendors in this space already know what did and did not work well for them by talking to customers, reviewing their product adoption rates and looking at their balance sheet. To me, that’s the best year in review any company can do.

But I will say that by my calculations, more than 90% of all vendors in the streaming and online video market have less than $100M in revenue. Only a small handful of companies can take huge chances and make big bets on the future. As the Dow approaches 20,000 and the markets are doing well, many companies may try to over-extend themselves in the New Year. Don’t. Have the discipline to stay focused. If you look back at the twenty years of this industry, the worst time for vendors wasn’t when the market was bad; it was when vendors lost focus, trying to be everything to everybody, selling their products and services into every vertical that exists, and losing their way.

OTT is growing, it is expanding, but the business side of OTT is still unproven for most. The number of live streaming events in 2016 was probably a record, but most of them lost money and could only afford to be streamed due to a big financial backer. We are still a long way from being able to monetize live streaming WITH profitability, and we have more work to do. 4K is coming, but it’s many, many years off. So be excited, but be realistic. Set proper expectations with your employees and your investors.

Realize that the best technology and platforms are not always what gets adopted. Services that are easy to understand, buy, deploy, manage and track the ROI of will always beat out the newest and greatest technology. Some say we need more “adoption” of online video, but we don’t. The adoption is here and has been for some time. Now we need consumer-facing services with a proven business model, and the technology behind those services to work seamlessly. What this industry needs to grow is NOT more adoption or more consumption of video or 4K, etc.; it’s making all the back-end pieces of these platforms work with one another. The ability to track, measure and analyze the entire video ecosystem, along with the quality of the experience, is crucial.

My personal thanks to all the content owners, distributors, and vendors who shared so much information, data, and real-world use cases with me in 2016. The only reason I can act as the focal point for disseminating a lot of industry information is because vendors and others share it with me. That data, those use cases, allow me to have an insight into the bigger picture of the industry. They allow me to do my job, which I see as one simple thing: Inform, Educate and Empower others.

I also appreciate all of the vendors (over 35 of them) who sponsored my blog throughout the year. I’m still amazed that, as someone who has no training in writing, no background in journalism, and who still struggles with grammar at times in my posts, people still have an interest in what I have to say and what I report on. I’ve never looked at my blog as being “mine”. As far as I am concerned, it’s a blog for the industry, by the industry, as the information on it comes from those who build, sell and deliver these video services for a living.

There are many in this industry who have been, and will continue to be, working each day to help this market grow. Passion breeds success. True success equals profitability. And profitability guarantees a stable, realistic and prosperous future. We’ve only just scratched the surface of what this industry is going to evolve into down the road. So keep up the fight. There are more good things to come.

Wishing everyone a peaceful and healthy holiday season.

Accessing The Cloud Providers In South Korea For The Next Olympics

With the Rio Olympics now behind us, we can start to look at what to anticipate for the next event, the Winter Olympics in PyeongChang, South Korea in 2018. The ability of broadcasters and OTT providers to successfully provide streams from these locations depends heavily on the infrastructure in the region. While South Korea does not have as many large cloud installations as Brazil does (see this report), what we can do is look at the likely clouds from major providers in the immediate area around South Korea. These are most likely the partners that OTT providers will choose to assist with their live streams in two years.

Cedexis recently pulled some data for me that evaluated cloud deployments including: AWS APAC Tokyo, Azure Cloud Asia East, Azure Cloud Asia Southeast, Azure Cloud Japan East, Azure Cloud Japan West, Softlayer Tokyo, and Ecritel E2C in Shanghai. These were selected based on proximity and performance from a larger set of clouds, from which Cedexis excluded the poorest performers.

As you can see below from this seven-day look at latency from every network within South Korea, there is a lot of variability.

[Chart: seven-day latency to the clouds from networks within South Korea]

In particular, Ecritel E2C Shanghai goes from best in the early part of the week to fluctuating between worst and best in the latter half. You can also see fluctuations in performance within both the best-performing and worst-performing packs of cloud providers.

Taking a deeper look at this seven-day period, the following chart shows the statistical distribution of latency to the 7 clouds from measurements taken within South Korea.

[Chart: statistical distribution of latency to the 7 clouds from within South Korea]

As you can see, at 36ms Azure Cloud Japan East has excellent latency at the low end, and it also has one of the tightest statistical distributions. This means that the outliers, or worst performers, are actually not all that bad, although 119ms is pretty slow. Softlayer Tokyo also has very good low-end latency in its distribution, but note that its 153ms top end is considerably slower. Also note that over the seven-day period, Ecritel and AWS Tokyo both had very good latency at the 10th percentile, but the outliers for both were very painful.
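A distribution like the one above is just percentiles computed over raw RTT samples. A minimal sketch using the nearest-rank method and fabricated sample values (not Cedexis data):

```python
# Sketch of summarizing RTT measurements into a percentile distribution.
# The samples below are fabricated for illustration, not Cedexis data.

def percentile(samples, p):
    """Nearest-rank percentile of a list of RTT measurements (ms)."""
    s = sorted(samples)
    k = max(0, min(len(s) - 1, round(p / 100 * (len(s) - 1))))
    return s[k]

rtts = [36, 41, 44, 52, 60, 75, 90, 104, 110, 119]  # made-up ms samples
print(percentile(rtts, 10), percentile(rtts, 50), percentile(rtts, 90))
```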

It's important to note that this seven-day period is unique; the best and worst performers will change as peering relationships and congestion affect the overall performance landscape. To get an even deeper understanding of how network performance can affect the overall cloud performance rating, Cedexis took 5 of the top networks within South Korea to show how they perform relative to the 7 clouds under consideration.

[Heatmap: network latency to the various clouds from within South Korea]

The greener a box, the faster that network's round-trip time (RTT) to the cloud; the redder a box, the slower the RTT. As you can see, on average Azure Cloud Japan East is the winner, but it's also important to note that, as we saw in the first chart, averages can deceive. The best and worst can change constantly over the course of the day. In fact, here is the latency to these 7 clouds over the last 60 minutes of testing, to give some sense of how often these clouds trade positions.

[Chart: latency to the 7 clouds over the last 60 minutes of testing]

While Azure gets the gold for best performing cloud, the race has just begun. With two years to go before the Winter Olympics, the race is bound to change hands many times in the interim. Stay tuned to see who will be the best performer by then. It’s bound to be exciting.

PacketZoom’s New Mobile Benchmarking Study Analyzes Mobile Network and App Performance

Last month, PacketZoom launched a new mobile benchmarking study that analyzes mobile network and app performance on a global scale. The company created this ongoing standard by measuring millions of data points gathered from mobile app usage around the world, to give app developers deeper insights for creating better-performing apps. This is the first time, that I am aware of, that performance insights have been presented at the mobile app level instead of merely showing carrier signal strength.

PacketZoom focuses on improving user experience on mobile apps by eliminating performance roadblocks in the mobile last mile. Mobile publishers and developers boost app performance worldwide by accelerating and improving reliability of content delivery through the integration of PacketZoom’s SDK. In the benchmark, PacketZoom measured the performance of mobile apps on live cellular and Wi-Fi networks throughout October 2016 to determine countries and networks that performed the best.

The initial results are pretty interesting:

  • AT&T has the lowest number of TCP drops in the US (2.94% vs. Verizon with 3.66%)
  • Sprint is far behind with over 5% of disconnects
  • Verizon response time is almost 30% faster than AT&T

Countries Leading in Adoption of Advanced Cellular Technology
(*End user adoption of 4G cellular technology by country based on mobile application usage)

  1. Canada: 96%
  2. South Korea: 95%
  3. Japan: 93%
  4. Indonesia: 91%
  5. US: 83%

Best Response Times for Mobile Apps
(*Average round trip time from mobile app to content server as experienced by end users over cellular and WiFi networks – ms)

  1. France: 276 ms
  2. Spain: 338 ms
  3. Netherlands: 355 ms
  4. UK: 357ms
  5. Germany: 389 ms

Every smartphone user expects the same customer experience on mobile as they get on the desktop. Some mobile reports show that the accepted wait time for mobile website and app performance is about 5-6 seconds. However, many factors play into the speed of a mobile website or app, and with PacketZoom's new benchmarking, hopefully the data will help developers looking to create a better mobile experience.