Archives

Amazon’s Pre-Season NFL Stream: Audio Syncing Issues, Errors on iPad, Low-Bitrate for TV

Amazon’s pre-season NFL game looked okay on some devices for me, but had some major issues on others. The TNF page wouldn’t load at all on iPad, giving me a “Problem Occurred. Please try again in a few minutes” error (screenshot). I could navigate to the browse page, but the NFL game wasn’t listed anywhere, there was no way to get to it, and there was no “sports” section at the top. Some of the other tabs at the top also returned “problem occurred” errors, which is the same message I got when I searched for “NFL football”.

Around 40 minutes into the game, Amazon support told me they were aware of a “technical issue” involving “some specific issues with prime video streaming.” I’m not sure what exactly that means and they didn’t have more details. On a 55″ TV and up, the video stutters with fast movement and at times is fuzzy up close. To my eyes, it looks like the bitrate isn’t high enough. Amazon’s support page suggests consumers only need 5Mbps to get the HD stream, but I don’t know what their bitrate ladders are for this specific stream. The video quality looks much better on smaller screens.
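For reference, a live sports ladder typically tops out in the mid-to-high single-digit Mbps range for 1080p. The ladder below is purely illustrative (Amazon hasn’t published the renditions for this stream); it just shows how little headroom a 5 Mbps connection leaves for a typical adaptive-bitrate player to reach the top rung.

```python
# Illustrative only: a hypothetical 1080p live-sports bitrate ladder.
# Amazon has not published the actual renditions for this stream.
LADDER_KBPS = {
    "1080p60": 6500,
    "720p60": 4500,
    "720p30": 3000,
    "540p30": 1800,
    "432p30": 1100,
}

def pick_rendition(measured_kbps: float, safety: float = 0.8) -> str:
    """Return the highest rendition that fits within a conservative share
    of the measured throughput, as a typical ABR player would."""
    budget = measured_kbps * safety
    for name, kbps in LADDER_KBPS.items():  # dict is ordered high -> low
        if kbps <= budget:
            return name
    return min(LADDER_KBPS, key=LADDER_KBPS.get)

# With Amazon's stated 5 Mbps minimum, a player with a typical safety
# margin would already be forced below the top rung of this ladder.
print(pick_rendition(5000))   # -> "720p30"
print(pick_rendition(25000))  # -> "1080p60"
```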

On the iPhone, the stream worked well for me, with about a one-second startup time on WiFi and just slightly longer on 4G. On desktop (tested 10x), startup times averaged 9 seconds. There are some major issues, however, with audio syncing across my multiple Fire TV Sticks, with other users reporting the same problem. The stream is also not synced between devices, with as much as a 15-second delay between the Fire TV Stick, iPhone and desktop. This is odd since Amazon purchased the Sye tech from Net Insight specifically to reduce end-to-end latency and keep video in sync across multiple devices. With regards to advertising, all ad triggering worked perfectly for me, every time and for every break.
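Roughly speaking, Sye-style delivery keeps devices in sync by having every client chase the same wall-clock playout target instead of buffering independently. The sketch below is a minimal illustration of that idea only; the epoch, target latency and function names are my own assumptions, not a description of Amazon’s actual implementation.

```python
import time
from typing import Optional

# Minimal sketch of wall-clock-based playback sync (the general idea behind
# Sye-style delivery). All names and numbers are illustrative assumptions.

STREAM_EPOCH = 1_662_000_000.0   # shared broadcast start time (e.g. via NTP)
TARGET_LATENCY = 8.0             # seconds behind live every device should sit

def target_position(now: Optional[float] = None) -> float:
    """Media position (in seconds) every device should currently be playing."""
    if now is None:
        now = time.time()
    return (now - STREAM_EPOCH) - TARGET_LATENCY

def correction(current_position: float) -> float:
    """Positive -> the device is behind and should seek forward or speed up;
    negative -> it is ahead and should slow down."""
    return target_position() - current_position

# A device reporting playback 15 seconds behind the target would get a +15s
# correction, which a player applies by seeking or nudging playback rate.
```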

Some comments from users on Twitter seeing similar issues with video quality and audio syncing:

  •  I looove watching lagging frame by frame video on TNFonPrime. Unreal how bad this video quality is!
  • Picture quality is just okay. Audio mix is just okay too. Side note: too many clicks to get to the game
  • The picture is terrible on TNF ugh
  • Can you up the bandwidth so we can see better than a 480i picture?
  • Why is this stream of this game so choppy? None of my other apps do this on my firestick
  • Can you fix the lag between the video and audio. Audio is way a head of the video.
  • Only three minutes into game before the stream froze and said “something went wrong. Please try again later”.
  • Already have problems with glitches and now no volume.
  • Audio on my prime stream is about 3 seconds ahead of video. Hear the whistle before the play is visibly over.
  • Already annoying with the fact theres 5 steps to just get to the game anytime I change back to normal TV from the app.
  • The streaming quality for TNFonPrime is straight garbage. Can’t keep a stable resolution
  • Anyone else’s TNFonPrime stream super laggy and freezing constantly?
  • Is it just me or is the TNFonPrime stream super choppy and stutters?
  • Running like trash on my 55” Roku TV. Awful resolution drops, freezing, stuttering, audio desync. Really no excuse for this when Prime Video has done plenty of live TV in the past.

To be clear, I have no way to know what percentage of viewers had problems like I did or what percentage had no problems. We also don’t know how large the audience was, the platforms and devices they watched the game on, or how much viewing was on small screens versus TVs. This game was basically a live “rehearsal” for Amazon and I would expect them to improve upon the experience for their next game on September 15th. It’s possible they will make changes to their bitrate ladders and other settings, but the audio being so badly out of sync and the bitrate seemingly being too low are big issues that need to be addressed before the NFL season kicks off next month. Streaming is not a perfect technology and never will be; as a broadcast medium, it will never deliver the same experience to 100% of viewers. That’s just reality.

Notes: I tested on four different Fire TV Sticks/Cube (Fire TV Stick 4K Max, Fire TV Stick 4K, Fire TV Stick Lite, 2nd-Gen Fire TV Cube), a MacBook, iPad, iPhone and smart TVs, all with the latest OS and app versions installed. All devices were also restarted before testing and, for some, during testing as well. All Fire TV Sticks and smart TVs were connected via ethernet, with the tablets on WiFi and the phone on both WiFi and 4G. My internet connections are 300Mbps with Verizon FiOS and 300Mbps with Optimum.


Rate of Video Traffic Growth Declining Across CDNs and ISPs As OTT Services Optimize Encoding Bitrates, See Little Demand for 4K Quality

During Akamai’s Q2 earnings call, the company noted that revenue attributed to their delivery business was down 11% year-over-year. These numbers led some to speculate that Akamai had lost a large percentage of market share, which isn’t the case, or that CDN pricing was falling faster than usual. While there has been some accelerated pricing pressure in the CDN space, it has only affected a handful of the largest CDN customers across the industry; the reality is that something bigger is going on.

Since the start of the year, many streaming services have focused on doing a better job of optimizing their bitrates and, in some cases, reducing their bitrate ladders in an effort to save money. Given all the cost cutting everyone is doing in anticipation of a recession-like macro environment, many streaming services have cut back on the number of bits they are delivering. And it’s not just Akamai that is seeing the impact of this across their network. Other CDNs I have spoken to have seen customers take the same approach, and some ISPs I have spoken to have seen a drastic reduction in the number of video bits delivered this year as a result of the bitrate optimization.
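To put rough numbers on why trimming a ladder shows up so quickly in CDN and ISP traffic, consider the bits saved per viewing hour when a top rung is lowered. The ladder values, viewing hours and savings below are illustrative assumptions, not any service’s actual data:

```python
# Back-of-envelope: bits saved by capping the top of a bitrate ladder.
# All numbers are illustrative assumptions, not any service's real data.

def gb_per_hour(kbps: float) -> float:
    """Convert a video bitrate to data delivered per viewing hour (GB)."""
    return kbps * 1000 * 3600 / 8 / 1e9  # bits/s over an hour -> bytes -> GB

old_top_kbps = 8000        # hypothetical former top rung
new_top_kbps = 5800        # hypothetical optimized top rung
hours_at_top = 50_000_000  # hypothetical monthly hours served at the top rung

saved_gb = (gb_per_hour(old_top_kbps) - gb_per_hour(new_top_kbps)) * hours_at_top
print(f"~{saved_gb / 1e6:.1f} PB less delivered per month")  # ~49.5 PB
```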

One ISP in the US that didn’t want to be named told me, “Since the beginning of the year our combined peak ingress internet traffic (IP Transit, Public and Private Peering, CDN and Caching) has been down more than has been in other years. It is not unusual to be down or flat till this time of the year, but even with subscriber growth we are down this year. I am thinking our year end traffic growth will be lower this year as we are still negative for the year.” Checking with a few other ISPs, they are seeing the same thing: a lower overall volume of video bits delivered across their networks this year.

Another key data point is that 4K/UHD still makes up a very small volume of the overall amount of video bits delivered across CDNs and ISPs and is not seeing much in the way of year-over-year growth. Multiple CDNs have told me that of all the bits they will deliver this year, 4K/UHD will make up “less than” 10% of those bits, which is nearly identical to last year. One OTT service I spoke with, one of the largest OTT platforms in the world, told me that less than 2% of total viewing time across their platform this year was for 4K/UHD content. For all the talk of 4K in the streaming industry, we have yet to see much in the way of real adoption based on the total volume of hours viewed or total bits delivered.
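Those two data points are roughly consistent with each other once you account for 4K’s higher bitrate: a small share of viewing hours at roughly three times the HD bitrate still works out to a single-digit share of bits. The bitrates below are my own illustrative assumptions:

```python
# Rough check: how 2% of viewing hours in 4K maps to a share of total bits.
# Bitrates are illustrative assumptions (typical HD vs. 4K/UHD averages).

hd_kbps, uhd_kbps = 5000, 15000
uhd_share_of_hours = 0.02

uhd_bits = uhd_share_of_hours * uhd_kbps
hd_bits = (1 - uhd_share_of_hours) * hd_kbps
print(f"4K share of bits: {uhd_bits / (uhd_bits + hd_bits):.1%}")  # ~5.8%
```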

When some OTT services first came to the market, they pitched 4K/UHD to consumers as a way to differentiate their service. But to date, consumers have not seen 4K/UHD as the value proposition some expected. Some streaming services, including fuboTV and YouTube TV, charge more each month for 4K quality, as does Netflix. None of these services will break out for the industry what percentage of their users are paying extra each month for 4K/UHD quality, because the rate of adoption is so small.

Part of the reason consumers are not demanding 4K/UHD content is that HD streams already look great, and the industry started applying HDR to HD streams instead of reserving it for 4K, which is how it was originally rolled out. Today, HD streams with HDR look so good that many consumers would not be able to tell the difference between HD/HDR and 4K/UHD. And those who can tell simply don’t see it as a big enough feature to pay for. Also, when talking about the total volume of bits delivered, a lot of viewing takes place on desktops, tablets and mobile devices where 4K can’t even be used. We’ve also seen very few tentpole streaming events or pieces of content in 4K/UHD, including sports. As of now, Amazon’s Thursday Night Football games won’t be in 4K/UHD and Apple’s Friday Night Baseball games aren’t in 4K/UHD either. Peacock has announced plans to stream Premier League games in 4K, but not until 2023. Very few major sporting events being streamed are in 4K/UHD, and we all know that sports is where a lot of viewing comes from.

We also see services like Netflix set the default streaming quality to “Auto” in their settings, meaning the best possible mix of quality and data usage; I’ve never seen 4K on by default. While the recent premiere of House of the Dragon was available in 4K/UHD on HBO Max, you won’t see Warner Bros. Discovery talking about the percentage of viewers who watched it in that quality. Without going into things I can’t discuss, I can tell you the numbers were much lower than many in the streaming industry would expect.

Not all streaming and OTT services are being more aggressive in optimizing their encoding or removing higher-tier bitrate ladders. Comparing how Disney is encoding their D2C services today versus the start of the year, I haven’t seen any changes in the streams on my devices. But in some cases, the Disney app limits video streaming quality to 1080p depending on the platform. The PS5 version of the Disney app doesn’t allow for 4K streaming and caps playback at 1080p, although an updated Disney app for PS5, codenamed “Vader,” is rumored to be in the works.

From my point of view, the lack of any large-scale 4K/UHD adoption by consumers is not a bad thing. It doesn’t help third-party CDNs that are in the business of delivering video bits and run an economies-of-scale business, but any streaming tech should only be used when it is driven by a clear business purpose. For some consumers, 4K/UHD matters and streaming services can charge them more money. But for the vast majority of viewers, based on data from streaming services, CDNs and ISPs, 4K/UHD simply isn’t something consumers are demanding or willing to pay more for.

Episode 32: The Problem With Third-Party Viewer Measurement Platforms; Previewing NFL on Amazon Prime

Podcast Episode 32 is live! This week we discuss the recent news of Amazon’s plans to use Nielsen for TV measurement of Thursday Night Football on Prime Video and highlight problems the streaming industry faces in defining what success looks like. With no standards or agreed-upon methodology, definitions or user metrics, the streaming industry is struggling to measure and define viewership from one service to another. We also recap the Walmart+ and Paramount deal and new sports licensing deals with the Big Ten Conference and the UEFA Champions League, and discuss what Netflix’s AVOD offering could look like. Thanks to this week’s podcast sponsor, Agora.

Companies and services mentioned: Amazon Prime Video, NFL, Netflix, Walmart, Nielsen, FOX, CBS, Disney, Peacock, Lionsgate, Big Ten Conference, ESPN, Paramount+, UEFA Champions League, DAZN, fuboTV.

Why Video Engineering Teams Are Taking A Video QoE-First Approach To Playback Testing

Recently, I wrote about how device fragmentation and testing have become two of the most significant issues facing streaming services, specifically in ensuring audiences have access to high-quality playback that provides the best viewer experience on every device. Frustrated users are more than happy to give feedback on poor experiences through app ratings, social media, and other forums, influencing how future potential customers view streaming services. This matters as quality of experience has become one of the deciding factors in user retention, and difficulty in providing it can seriously affect growth and adoption.

At the root of ensuring quality is the Q/A testing done by the engineering teams, which generally happens through manual and automated methods. As I mentioned in my other post, both come at a high cost in terms of time and budget and require a high level of streaming workflow knowledge to identify and create the tests for every relevant use case. In this post, I will briefly touch on some of the current options on the market, their limitations, and how providers are jumping into the testing mix.

I often get asked, what tools are streaming Q/A teams currently using? Many Q/A teams start with a small list of manual tests being tracked in a spreadsheet or internal wiki, but this approach doesn’t scale and leaves too much room for error as the number of test cases grows. With the complexity around playback, that is just not an option for most, if not all, streaming providers. This isn’t the case for automated testing, as multiple open-source frameworks and vendors offer services covering browsers and different devices.

When it comes to free open-source testing frameworks, some of the names I hear most often are Selenium, Cypress and Playwright; on the SaaS platform side, Browserstack, LambdaTest and AWS Device Farm. Some of these platforms are excellent at providing or enabling streaming services to build testing structures for website and application performance, but they miss the mark when it comes to video. If you are focused on streaming media, these options aren’t dedicated to testing streaming playback and won’t always cover every device you need to support. This is important because even with access to automated testing frameworks, development teams will need to identify and take the time to build up most of the use cases to fit their needs.

Additionally, even though these options are good at what they do, they either come barebones (no pre-set test cases) or have general use cases for multiple industries that can be implemented across testing structures. The significant limitations of general-purpose frameworks become apparent when engineering teams need to get more granular and focus on performance, functionality, and playback quality. The heavy lifting will still be up to them.
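To make that heavy lifting concrete, below is the kind of basic playback smoke test a team would still have to write itself on top of a general-purpose framework. It uses Playwright’s Python API to check that a page’s video element actually advances; the URL and thresholds are placeholders, and it only covers browsers, not the smart TVs and sticks where much of the fragmentation lives.

```python
# A minimal playback smoke test a team might write themselves on top of a
# general-purpose framework (Playwright). The URL and thresholds are
# placeholders; this only covers browsers, not smart TVs or sticks.
from playwright.sync_api import sync_playwright

TEST_PAGE = "https://example.com/player-test"  # hypothetical test page

def test_playback_starts() -> None:
    with sync_playwright() as p:
        browser = p.chromium.launch()
        page = browser.new_page()
        page.goto(TEST_PAGE)
        page.wait_for_selector("video")
        t0 = page.evaluate("document.querySelector('video').currentTime")
        page.wait_for_timeout(5000)  # let it play for 5 seconds
        t1 = page.evaluate("document.querySelector('video').currentTime")
        browser.close()
        # Playback should have advanced by at least a few seconds.
        assert t1 - t0 > 3, f"video barely advanced: {t1 - t0:.2f}s"

if __name__ == "__main__":
    test_playback_starts()
```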

Some vendors focusing on OTT and streaming are Suitest, Eurofins, Applause, and Bitmovin. The first three do it well, but with certain limitations: restricted automation control, not being self-service, requiring dedicated test devices to be purchased, or focusing on guaranteeing quality of experience within applications. Bitmovin is the latest to join this group, known in the industry for their encoding, player and analytics capabilities and their streaming workflow expertise. Bitmovin added playback testing to the mix back in April by making their extensive internal playback quality and performance test automation publicly available, creating a unique client-facing solution.

Just before the NAB Show in April, Bitmovin released their latest Player feature, Stream Lab, which is currently in beta and open for anyone to test. I got the chance to see it firsthand at the show and learn more about this ambitious attempt to address the issue of device fragmentation. With their expertise in streaming and investment in building their own internal automated testing solution, stepping into playback testing made sense.

As their solutions are an essential part of a range of end-to-end video workflows, they have a full view of how the different pieces interact with each other and have developed over 1,000 use cases they currently test on their Player. They have also built up their testing center with multiple generations of real devices, which you can see in my LinkedIn post from when Stream Lab was first announced. That is where it is advantageous: it is the first automated playback testing solution that provides access to pre-generated use cases built for the streaming community to test on major browsers and physical devices such as Samsung Tizen and LG Smart TVs. This makes it possible for teams to ensure high-quality playback for their streams with no Q/A or development experience necessary, giving them confidence and peace of mind around version updates or even potentially supporting new devices.

Even though Stream Lab is innovative in its mission to address device fragmentation, it is currently available in open beta as Bitmovin looks to add more functionality and use cases. The company started developing their own internal automated infrastructure about five years ago and has been adding use cases and functionality from the workflows they’ve implemented ever since. Stream Lab still has a good amount to work on and add, but it stands to be a significant plus for development teams in the streaming sector and will be an essential piece of Bitmovin’s playback services.

How the industry tackles device fragmentation will be interesting to continue to analyze. It’s definitely something to pay attention to, as device makers show no signs of slowing down or converging on a standard. Because of this, video engineering teams will struggle to cover every use case viewers might experience during playback, especially as new AVOD services from Netflix and Disney come to the market.

As Viewership To Disney’s Streaming Services Grows, Their D2C Losses Widen

Nielsen’s latest data says Disney’s D2C streaming services gained 5.4% of TV viewing “share” last month, yet Disney’s D2C business lost $1.1B in their fiscal Q3, their biggest loss ever. The metric of “share,” or viewing time, by itself is not what the industry should be looking at to determine success. That one data point, on its own, does not tell a complete story and does not determine a “winner,” as the media continues to say.

Also, suggesting there will only be a “few” winners or “3-4 winners” in the streaming wars, like I keep hearing execs say on CNBC each day, is not accurate. Define what a “winner” means. Apple, Netflix, Disney, Warner Bros. Discovery, Paramount, Comcast, Amazon and Roku will all be highly competitive for years to come, and most have different business models. None of them are going to exit the market, and that’s eight companies right there. Suggesting there will only be a “few” winners simply isn’t accurate.

Disney’s D2C losses broken out by quarter:

  • Disney’s Q3 2022: D2C lost $1.1 billion
  • Disney’s Q2 2022: D2C lost $887 million
  • Disney’s Q1 2022: D2C lost $593 million
  • Disney’s Q4 2021: D2C lost $630 million
  • Disney’s Q3 2021: D2C lost $293 million
  • Disney’s Q2 2021: D2C lost $290 million
  • Disney’s Q1 2021: D2C lost $466 million
  • Disney’s Q4 2020: D2C lost $580 million
  • Disney’s Q3 2020: D2C lost $706 million
  • Disney’s Q2 2020: D2C lost $812 million
  • Disney’s Q1 2020: D2C lost $693 million
  • Disney’s Q4 2019: D2C lost $740 million
  • Disney’s Q3 2019: D2C lost $553 million
  • Disney’s Q2 2019: D2C lost $393 million
  • Disney’s Q1 2019: D2C lost $136 million

*Note: For some of these quarters, Disney included D2C revenue and losses with “International” before they restructured their business.
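As a quick aggregation of the figures listed above, Disney’s trailing-four-quarter D2C losses (fiscal Q4 2021 through Q3 2022) add up to roughly $3.2 billion:

```python
# Trailing-four-quarter D2C losses, summed from the figures listed above ($M).
losses_m = {"Q4 2021": 630, "Q1 2022": 593, "Q2 2022": 887, "Q3 2022": 1100}
print(f"Trailing four-quarter D2C loss: ${sum(losses_m.values()):,}M")  # $3,210M
```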

Episode 31: Q2 Earnings Recap From Disney, Akamai, Edgio, Vizio, Trade Desk

Podcast Episode 31 is live! This week we recap all the news from Disney’s April-June earnings, including their D2C business losing over $1B in the quarter; their addition of 14.4M Disney+ subs globally; Hulu and Disney+ price increases; Hulu losing 3.4M SVOD subs and 100,000 Live TV subs; and their launch of Disney+ with ads in December. We also cover the numbers you need to know from Akamai (revenue up 6% y/o/y), Edgio (initial 2023 revenue outlook of $550-$560 million), Vizio (Platform+ net revenue up 69% y/o/y), and The Trade Desk (revenue grew 35% y/o/y). Thanks to this week’s podcast sponsor, Agora.

Companies and services mentioned: Disney, Netflix, Hulu, Tubi, Fox, ESPN+, Paramount+, Disney + Hotstar, Indian Premium League, Vizio, The Trade Desk, Edgio, Akamai, Roblox, Coinbase.

Qwilt and NCTC Members To Offer Better Video QoE by Deploying Caches Inside Rural ISPs

Last month, Qwilt and the National Content and Technology Cooperative (NCTC) announced significant progress on their initiative to upgrade NCTC member networks across the United States, helping ISPs deploy an edge CDN that provides high-quality content delivery and better digital experiences. Combined, NCTC members reach 34 million households in the US, but many of these ISPs are very small and don’t have caches from commercial CDNs within their networks.

NCTC’s program with Qwilt will give their members more control over content flows while catering to the needs of global and regional content providers for more capacity, consistency and performance in video delivery. A single API gives content publishers access to a national federation of NCTC member networks and lets NCTC ISPs monetize content delivery through revenue sharing with content providers. More than 100 NCTC members have signed up to deploy the Qwilt CDN inside their networks and are expected to have it in production by the end of Q3. While Qwilt won’t discuss the total combined network capacity across all the ISPs, I would expect it to be in the range of 25 Tbps of egress capacity, using the math of 15-20 Gbps of capacity per deployed server.
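For what it’s worth, here is the back-of-envelope math behind that estimate. The ~25 Tbps target and the 15-20 Gbps per-server range come from above; the per-ISP breakdown assumes the roughly 100 members that have signed up so far and is my own illustration, not Qwilt’s data.

```python
# Back-of-envelope behind the ~25 Tbps estimate, using the 15-20 Gbps
# per-server figure above. The per-ISP breakdown assumes ~100 deployed
# member ISPs; it is an illustration, not Qwilt's data.
target_tbps = 25
members_deployed = 100

for per_server_gbps in (15, 20):
    servers = target_tbps * 1000 / per_server_gbps
    print(f"{per_server_gbps} Gbps/server -> ~{servers:,.0f} servers "
          f"(~{servers / members_deployed:.0f} caches per member ISP)")
# 15 Gbps -> ~1,667 servers (~17 per ISP); 20 Gbps -> ~1,250 servers (~12 per ISP)
```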

These deployments will help deliver video to areas where QoE could be a larger problem, and while not mentioned in the release, I would expect Disney+ to be one of the first OTT services to go across this distribution network, since Qwilt already has a commercial relationship with Disney. In addition, the NCTC negotiates agreements with OTT content providers to give their members economical access to streaming content. For NCTC members that participate in these OTT agreements, independent operators gain access to a portfolio of entertainment options with the goal of attracting and retaining cord-cutters and cord-nevers. The NCTC currently has agreements in place with Cheddar, CuriosityStream, Disney+, ESPN+, HBO Max, Hulu, Peacock, fuboTV and Philo.

If you look at the NCTC as a collective ISP, it is larger than Comcast in terms of the number of households it serves. The new caches from Qwilt will allow smaller ISPs to deliver the type of video experience that most viewers expect and get within larger ISPs. This lets regional service providers compete with the same level of QoE, using the exact same technology Verizon gets in its deal with Qwilt, without having to be the size of Verizon to get access to the caches.

While on-demand video will be the first use case for the deployment, I expect that live streaming will also come to the platform, and that is where ISPs will really see some immediate benefits on the QoE side. Large-scale live events, with unknown traffic spikes, are where we see the majority of QoE problems, so I would expect we’ll see first-hand accounts of ISPs delivering live events via the caches in the near future. Qwilt hasn’t given a timeline for the other types of content the platform will support, but they did tell me they expect the caches to be able to handle software downloads and other content shortly.

Over the years we have seen a lot of CDN models come to the carrier and service provider market, with little success. But this deal with the NCTC is by far the largest deployment to date, based on the number of households it covers. With about 132 million households in the US, the Qwilt caches will cover roughly 26% of them and provide an open caching platform that federates otherwise isolated carrier CDN models into a unified global CDN.