New Report Reveals that Ad Blocking is Pervasive Amongst Millennials Who Choose Illegal Streaming Over Linear Television

Millennials are watching video content, but in most cases it’s on-demand video, not live TV. This is one of the reasons why, when content aggregators build television channels for them, they don’t show up. That truism eventually shuttered Pivot, and it explains why Vice’s average viewer is 40 years old and its ratings aren’t as strong as those of H2, the channel it replaced.

While many studies have explored millennials’ clear preference for streaming content over linear TV, Anatomy just released a report that takes a closer look at how young millennials (18-24) view their video content, and specifically at whether they pay for what they stream with their data, dollars or demographics. Anatomy surveyed over 2,500 young millennials to get some hard data around their behaviors and opinions. The firm looked at this subset of the millennial population because, as Anatomy’s CEO Gabriella Mirabelli told me, “we feel that this cohort is the engine behind the disruptive behaviors that will be rocking the media landscape down the road. As this population ages they don’t adopt regressive technology, but rather propagate their behaviors up and down the demographic spectrum.”

Anatomy’s Millennials at the Gate report found that young millennials represent the bloodiest cutting edge of ad block adoption. In fact, two out of three young millennials use an ad blocker. Why do they do it? 64% say it’s to avoid intrusive video ads. They also want to speed up their browsing and increase their privacy. In a nutshell, they do it to improve their viewing experience.

Mirabelli notes, “This isn’t particularly surprising because the mantra for those looking to monetize content is create a premium user experience, or lose viewers. While much of this is common knowledge, it’s surprising that the most established video publishers aren’t doing anything about it.” An ad block wall is a website feature that detects ad blocking software and prevents a user from accessing site content until that software is disabled. Of the 17 broadcast networks Anatomy surveyed, only one (CBS) employed an ad block wall. This failure by content owners means lost revenue, plain and simple.
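
For readers curious how these walls work mechanically, below is a minimal sketch of the common “bait element” detection technique, assuming a standard browser environment; the class names, timing, and callback are illustrative assumptions, not any network’s actual implementation.

```typescript
// Minimal sketch of the "bait element" technique many ad block walls use.
// The class names "adsbox ad-banner" and the 100ms delay are illustrative
// assumptions; real implementations test several bait styles and also watch
// for blocked ad-network requests.
function detectAdBlocker(onDetected: () => void): void {
  const bait = document.createElement("div");
  bait.className = "adsbox ad-banner"; // classes ad blockers typically hide
  bait.style.height = "1px";
  document.body.appendChild(bait);

  // Give the blocker's cosmetic filters a moment to apply, then check
  // whether the bait was hidden or removed from the layout.
  window.setTimeout(() => {
    const blocked = bait.offsetHeight === 0 || !bait.offsetParent;
    bait.remove();
    if (blocked) onDetected(); // e.g. replace the article with a wall overlay
  }, 100);
}

detectAdBlocker(() => {
  console.log("Ad blocker detected: show the ad block wall here.");
});
```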

After the Olympics, some articles suggested that NBC’s online viewership may have cut into its linear ad revenue. Other articles laid the blame squarely at the door of millennials who, they complained, didn’t show up. Well, maybe they did and maybe they didn’t. But of the young millennials who did show up, we know that two-thirds had their ad blockers on, so the ad content was stripped out and the potential revenue was never realized.

Of course, NBC is not alone. Other than CBS, none of the networks tested had an ad block wall in place. And, it’s worth noting, CBS affiliates didn’t benefit from their network’s best practice, and there’s really no excuse for that. It might be that the networks are aware that if they put in an ad block wall, they will need to simultaneously monitor and manage the viewer’s streaming ad experience. Nothing is more damaging to a brand or detrimental to the user experience than force-feeding viewers ads that are irrelevant and repetitive, and I have been complaining about that exact problem for years.

Streaming Meetup Dates Announced: Oct. 25th / Dec. 13th

Save the dates! The next meetup of streaming media professionals in NYC will take place on Tuesday October 25th, starting at 6pm. Due to the holiday, November’s meetup is cancelled. The location of the December meetup will be announced shortly.

Tuesday October 25th (sponsored by Cedexis and Varnish)
Tuesday December 13th (sponsored by Bitmovin and 3Q SDN)

If you would like to sponsor a meetup by covering $500 of the bar tab, please let me know.

There is no RSVP needed or list at the door. Just show up with a business card and ID and you are in! You will need a wristband to drink, so introduce yourself to me when you show up.

I’ll keep organizing these every month, so if you want to be notified via email when the next one is taking place, send me an email and I’ll add you to the list.

Akamai Rolls Out New “Fast Purge” Solution; Questions Remain About Speed and Scale

For well over a year, Akamai customers have been complaining about Akamai’s cache invalidation times, which have impacted manifest caching, real-time news feeds, and other time-sensitive content. Historically, Akamai purges took at least 15 minutes, and in a couple of really bad cases I’ve heard of purges taking hours. Competitors like Fastly have been quick to jump on Akamai’s purging limitations and have been winning deals in the market based on their ability to purge content within hundreds of milliseconds.

Some CDNs’ caching strategies are old school, built around the idea that you have little to no real-time control of your caches. You set a TTL, and if you need to invalidate, it can’t be mission critical; you just have to wait minutes or hours. So you set your caching strategy by identifying what you can afford to behave that way and build against it, which means your home page, news feeds, APIs, dynamic elements, HLS manifests and the like can’t be cached. Customers tell me that with Fastly, they have turned that caching strategy on its head. They cache everything (except truly uncacheable content, like PII or content specific to a single user) and then invalidate it as needed. It’s the difference between a 90% cache hit rate and 99.999%. The New York Times is a classic example of an Akamai customer that serves all of its HTML from origin and caches only images on Akamai, because it doesn’t have this capability.
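
To make the pattern concrete, here is a minimal sketch of the “cache everything, purge on change” approach customers describe, using Fastly’s documented Surrogate-Control and Surrogate-Key conventions; the service ID, API key handling, and article naming are assumptions for illustration.

```typescript
// Sketch of the "cache everything, hold til told" pattern described above.
// Header names follow Fastly's documented Surrogate-Control / Surrogate-Key
// conventions; the service ID and key scheme are hypothetical.
import fetch from "node-fetch";

const FASTLY_SERVICE = "SERVICE_ID"; // hypothetical service ID
const FASTLY_KEY = process.env.FASTLY_KEY ?? "";

// 1) Serve even "dynamic" HTML as cacheable, with a long edge TTL and a
//    surrogate key tying the response to the article it renders.
function cacheHeaders(articleId: string): Record<string, string> {
  return {
    "Surrogate-Control": "max-age=31536000", // cache at the edge ~1 year
    "Surrogate-Key": `article-${articleId}`, // handle for targeted purges
    "Cache-Control": "no-store",             // browsers still fetch fresh
  };
}

// 2) When the article changes, invalidate by key instead of waiting on TTLs.
async function purgeArticle(articleId: string): Promise<void> {
  const res = await fetch(
    `https://api.fastly.com/service/${FASTLY_SERVICE}/purge/article-${articleId}`,
    { method: "POST", headers: { "Fastly-Key": FASTLY_KEY } }
  );
  if (!res.ok) throw new Error(`Purge failed: ${res.status}`);
}
```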

Akamai has been aware of the major limitations of its platform when it comes to purging content (see its blog post from January) and has been building out a new system that allows purging in as little as a few seconds. That is a dramatic improvement, but content owners have been asking me how widespread Akamai’s new system is, whether it is available across the entire platform, and whether there is any rate limiting. Some Akamai customers tell me they are still in the 15-minute purge range. By comparison, customers tell me that Fastly’s entire platform supports instant purge, doesn’t rate limit, lets you purge the entire cache if you want, and is all API driven. Fastly customers report purge times of 150ms or less.

So with all these questions out in the market, I had a chance to speak to Akamai about how they are addressing the purging issue and got details from the company on their new platform, made available this week, which looks to address some of their customers’ purging complaints. Akamai’s new “Fast Purge” solution enables Akamai customers to invalidate or delete their content from all of Akamai’s edge servers via API and UI in “approximately 5 seconds”. With Akamai’s Fast Purge API, the company said its customers “can automate their publishing flow to maximize performance and offload without compromising on freshness.” With this “Hold Til Told” methodology, Akamai customers can now cache semi-dynamic content with long TTLs and refresh it near instantly as soon as it changes.
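
As a rough illustration of what a “Hold Til Told” publishing hook might look like, here is a sketch against the Fast Purge (CCU v3) URL invalidation endpoint; the hostname is a placeholder and the required EdgeGrid request signing is elided, so treat the details as assumptions rather than production code.

```typescript
// Sketch of a publish hook against Akamai's Fast Purge (CCU v3) API.
// The endpoint shape follows Akamai's public docs; the host is a placeholder
// and the EdgeGrid signing is elided for brevity.
import fetch from "node-fetch";

const AKAMAI_HOST = "akab-XXXXXXXX.purge.akamaiapis.net"; // hypothetical

async function fastPurge(urls: string[]): Promise<void> {
  const res = await fetch(`https://${AKAMAI_HOST}/ccu/v3/invalidate/url/production`, {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      // Real requests must carry an EdgeGrid Authorization header, normally
      // generated by Akamai's edgegrid client libraries.
      Authorization: "EG1-HMAC-SHA256 ...",
    },
    body: JSON.stringify({ objects: urls }),
  });
  if (!res.ok) throw new Error(`Fast Purge failed: ${res.status}`);
  // On success, Akamai says completion takes approximately 5 seconds.
}

// e.g. fired by a CMS "publish" event:
fastPurge(["https://www.example.com/news/top-stories.json"]).catch(console.error);
```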

Akamai says the Fast Purge UI will complete its rollout to all customers this week, and is already available to 85% of them. The Fast Purge API has been adopted by almost 100 Akamai customers so far, and the company said it supports a “virtually unlimited throughput of over 100x that of our legacy purge APIs per customer.” Early adopters include major retailers caching their entire product catalogs and major media companies caching news stories and live API feeds for day-to-day operations. In Q1 2017, Akamai says Fast Purge will support purge-by-cpcode and purge-by-content-tag. With Fast Purge by content tag, customers will be able to apply tags to content and then, with one purge-by-tag request, refresh all content carrying a specific tag. For example, eCommerce customers will be able to tag search result pages with the SKUs on them, and then, when a SKU is out of stock, remove all pages referencing it with one request.
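
Since purge-by-tag isn’t shipping until next year, the sketch below is purely illustrative of the workflow Akamai describes; the Edge-Cache-Tag header name and the tag endpoint are assumptions based on the announcement, not a released API.

```typescript
// Purely illustrative sketch of the purge-by-tag workflow Akamai describes
// for Q1 2017. The Edge-Cache-Tag header and the tag endpoint are assumptions
// based on the announcement, not a shipping API.
import fetch from "node-fetch";

const AKAMAI_HOST = "akab-XXXXXXXX.purge.akamaiapis.net"; // hypothetical

// 1) When rendering a search results page, tag it with every SKU it shows.
function tagHeaders(skus: string[]): Record<string, string> {
  return { "Edge-Cache-Tag": skus.map((s) => `sku-${s}`).join(",") };
}

// 2) When SKU 12345 sells out, one request refreshes every page that shows it.
async function purgeByTag(tag: string): Promise<void> {
  await fetch(`https://${AKAMAI_HOST}/ccu/v3/invalidate/tag/production`, {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      Authorization: "EG1-HMAC-SHA256 ...", // EdgeGrid signing elided
    },
    body: JSON.stringify({ objects: [tag] }),
  });
}

purgeByTag("sku-12345").catch(console.error);
```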

It’s good to see Akamai finally offering a better purging solution in the market, but only customers will determine whether what Akamai now offers fits the bill. The keys to instant purge are speed and reliability at scale. Customers say their experience with the “Hold Til Told” approach is that you need to trust that purges will happen, reliably, across the world and at scale. If your site depends on real-time updates to ensure you don’t sell something you don’t have, or don’t serve outdated information, users need to trust that purging will work. If purges do not happen reliably, it creates mistrust and damages the entire premise of “hold til told”. So customers of any CDN should test purge times under many different conditions and in various regions on the production network to ensure it actually works as advertised. That goes doubly for Akamai customers, since we don’t know what scale or reliability the new Fast Purge solution has. While Akamai said it now has a 100x increase in throughput over the legacy system, the old system was so limited that a 100x increase simply may not be enough to meet the needs of many large customers.
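
One simple way to run that kind of test: stamp a version marker into a cached object, issue a purge, and poll the edge until the new version appears. The sketch below, with an illustrative URL and polling cadence, shows the idea; run it from clients in several regions against the production network.

```typescript
// Sketch of measuring real-world purge latency: publish new content with a
// version marker, purge, then poll the edge until the new version shows up.
// The URL, marker, timeout, and 50ms polling cadence are illustrative.
import fetch from "node-fetch";

async function measurePurge(url: string, expectedVersion: string): Promise<number> {
  const start = Date.now();
  for (;;) {
    const res = await fetch(url);
    const body = await res.text();
    if (body.includes(expectedVersion)) return Date.now() - start;
    if (Date.now() - start > 60_000) throw new Error("Purge not visible after 60s");
    await new Promise((resolve) => setTimeout(resolve, 50)); // poll every 50ms
  }
}

measurePurge("https://www.example.com/version.txt", "v42").then((ms) =>
  console.log(`New version visible after ${ms}ms`)
);
```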

Another unanswered question is what Akamai has done to integrate its underlying purging system into major CMS vendors and platforms, so that customers get this feature as part of a basic CMS install. Akamai has not traditionally worked well with the partner ecosystem, and it will be interesting to see how the company plans to be on by default in the key CMSs and platforms. Competitive CDNs have historically been developer friendly, with well-documented APIs for integrating with other platforms, and that has traditionally been a challenge for Akamai.

On the speed front, it’s good to see Akamai improving, but many businesses would not function with 5-second purge times; for example, customers with real-time inventory that cannot be oversold. I see this 5-second limitation, and the unknown scale and reliability of the system, being a huge challenge for Akamai in a market that is truly milliseconds based. It is great that they went from minutes to seconds, but the performance game is now measured in milliseconds. Scale, reliability and speed are words everyone uses when it comes to delivering content on the web, but for purging, customers use real-world methodology to measure the impact, positive or negative, on their business. Customers are the ultimate judge of any new service or feature in the market, and at some point, as more of them adopt Akamai’s Fast Purge solution, we’ll find out if 5 seconds is fast enough or not.

If you are a customer of Akamai or any other CDN, I’d be interested to hear from you in the comments section on how fast you need purging to take place.

As Pay TV Flocks To Devices, Multi-DRM Can Make Or Break Service Success

[This is a guest post by my Frost & Sullivan colleague Avni Rambhia]

It’s no longer news that TVE and OTT have graduated from experimental endeavors to full-fledged service delivery channels. On metrics such as subscriber growth, growth in hours viewed, and growth in advertising revenue, OTT services are surpassing traditional Pay TV services. That is not to say that OTT services are fully monetized today. Revenue generation, whether ad-based, transactional or subscription, remains an ongoing challenge for TVE/OTT services despite growing uptake and aggressive infrastructure investments.

The quest to bring a consistent, managed-quality experience to an unruly stable of unmanaged devices is a formidable challenge. Maintaining support across all past and present devices in the face of changing streaming and delivery standards is an undertaking in its own right. Nonetheless, secure multimedia delivery holds the key to delivering premium content to subscribers. With competing services only a few clicks away, ongoing growth relies heavily on the ability to deliver a service transparently across the many underlying DRM systems, device platforms, browsers and streaming protocols currently in use and on the horizon.

HTML5, with its related EME (Encrypted Media Extensions) and CDM standards, was envisioned as a way to unify cross-platform fragmentation and simplify cross-platform app development and content delivery. Things didn’t quite materialize that way, with the result that content companies need to manage secure content delivery and handle back-end license and subscriber management across all major DRM platforms. While there is a perception that “DRM is free”, stemming primarily from the royalty-free nature of Widevine and the falling costs of PlayReady licensing, in reality the total cost of ownership is quite high. Moreover, DRM needs to be treated as a program rather than a project, subject to often-unexpected spikes in R&D and testing overhead when a new operating system is released, a new device surges in popularity, old technology is deprecated, the DRM core itself is revised, or a new standard takes hold. While the client side of the problem is often the first concern, server-side components play an important role in service agility and scalability over the longer run.
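
To make the fragmentation concrete, here is a minimal sketch of how a player might probe which key systems a browser’s EME stack exposes; the key system strings are the commonly used identifiers, and note the assumption that Safari’s FairPlay support has typically required prefixed WebKit variants rather than standard EME.

```typescript
// Minimal sketch of probing which DRM key systems a browser's EME stack
// exposes. The key system strings are the commonly used identifiers; the
// FairPlay entry is an assumption, since Safari has historically used
// prefixed WebKit APIs rather than standard EME.
const KEY_SYSTEMS = [
  "com.widevine.alpha",      // Widevine (Chrome, Firefox, Android)
  "com.microsoft.playready", // PlayReady (Edge/IE, many smart TVs)
  "com.apple.fps.1_0",       // FairPlay (Safari)
];

const config: MediaKeySystemConfiguration[] = [{
  initDataTypes: ["cenc"],
  videoCapabilities: [{ contentType: 'video/mp4; codecs="avc1.42E01E"' }],
}];

async function probeKeySystems(): Promise<string[]> {
  const supported: string[] = [];
  for (const ks of KEY_SYSTEMS) {
    try {
      await navigator.requestMediaKeySystemAccess(ks, config);
      supported.push(ks); // the player can route license requests to this DRM
    } catch {
      // not supported on this device/browser; fall through to the next system
    }
  }
  return supported;
}

probeKeySystems().then((s) => console.log("Usable DRM systems:", s));
```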

As part of our content protection research coverage at Frost & Sullivan, we took an in-depth look at factors affecting the total cost of ownership both for content companies (including broadcasters, new media services and video service operators) and for the OVPs that are increasingly running OTT workflows on behalf of content companies. The findings from this research are reported in a new white paper sponsored by Verimatrix. We’ll be discussing many of these factors, and their real-life impact on customers, during a live webinar on Wednesday September 28th at 10am ET. Divitel will be joining to discuss their experiences firsthand.

As we’ll talk about in the webinar, agility and scalability are crucial to OTT services as TV by appointment fades away and customers continue to trend towards device-first viewing behavior. While some companies may have the engineering talent and budget capacity to build and maintain their own multi-DRM infrastructure, our best practice recommendation in the majority of cases is to work with a security specialist vendor instead of going DIY. If you would like to share your own stories, whether as a vendor or a customer, or if you have any questions about DRM and available options, feel free to comment here or reach out to Avni Rambhia, Industry Principal, ICT/Digital Transformation at Frost & Sullivan.

Tuesday Webinar: Accelerating Your OTT Service

Tuesday at 2pm ET, I’ll be moderating a StreamingMedia.com webinar on the topic of “Accelerating Your OTT Service”. The OTT market is expected to generate an estimated $15.6 billion in revenue globally through 2018. Join Brightcove’s Vice President of OTT Solutions Luke Gaydon for an interactive discussion about the state of the OTT landscape and the key opportunities and challenges facing media companies looking to capitalize on this thriving market.

Luke will review the latest data on the growth of OTT and discuss complexities including device fragmentation and how to address them. Then, he will showcase Brightcove OTT Flow – powered by Accedo, including key product features, and share how this innovative turnkey solution enables the seamless, rapid deployment of OTT services across multiple platforms. Join this webinar to learn:

  • The latest data on the growth of OTT across devices, platforms and audiences
  • The growing challenges, including device fragmentation and technical scope
  • Strategies for taking your content OTT
  • Key features, analytics and how OTT Flow provides a consistent user experience across devices

REGISTER NOW to attend this free live web event.

Correcting The Hype: Twitter’s NFL Stream Lacks Engagement and A Profitable Business Model

With two NFL games under Twitter’s belt now, I’m reading far too many headlines hyping what it means for Twitter to be in the live streaming business. Phrases like “game changing,” “milestone,” and “make-or-break moment” have all been used to describe Twitter’s NFL stream. Many commenting on the stream act as if live video were some new technology breakthrough, with some even suggesting that “soon all NFL games will be broadcast this way.” While many want to talk about the quality of the stream, no one is talking about the user interface experience or the fact that Twitter can’t cover its licensing costs via any advertising model. What Twitter is doing with the NFL games is interesting, but it lacks engagement and is a loss leader for the company. There is nothing “game changing” about it.

The first NFL game on Twitter reached 243,000 simultaneous users and 2.3M total viewers. But looking further at the data, the average viewing time was only 22 minutes. Most who tuned into Twitter didn’t stick around. Many, like myself, tuned in only to see what the game would look like and how Twitter would handle live video within its platform. For week two, Twitter reached 2.2M total viewers and had 347,000 simultaneous users, but the NFL isn’t saying what the average viewing time was. Twitter and the NFL are also counting as a viewer anyone who watched a “minimum of three seconds with that video being 100% in view”, which is a very low bar to measure against.

Unfortunately, the whole NFL experience on Twitter was a failure at what Twitter is supposed to be about: engagement. Watching the stream in full screen, on any device, felt like watching the game via any other app. Twitter didn’t overlay tweets in any way, some commercial breaks had no ads shown at all, and the tweets weren’t curated. Far too many tweets added nothing to the game, with comments like “Jets Stink.”

Streaming live content on the web has been around for 21 years now, and it’s a sad statement about the industry that the most exciting part of the event was that people could not believe the video didn’t buffer or have widespread quality issues. It’s not hard to deliver a video stream to 243,000-347,000 simultaneous users, especially when Twitter hired MLBAM, which then used Limelight, Level 3 and Akamai to deliver the stream. Some suggested that the “NFL on Twitter opens an enticing lucrative new revenue stream,” which of course isn’t the case at all. We don’t know for sure what Twitter paid for the NFL games, but if the rumors of $1M per game are right, Twitter can’t make that back on advertising. It doesn’t have a large enough audience tuning into the stream and would never get a CPM rate high enough to cover its costs. Some have even suggested that the NFL stream on Twitter is a “model for other revenue-generating live stream events,” but of course that’s not the case. One-off live events can’t generate any substantial revenue, as the audience size is too small and the length of the event too short.

There is nothing wrong with Twitter using NFL content to try to attract more users to the service and grow its audience, but the NFL content itself isn’t a revenue generator for the company. Some, including NFL players, have suggested that soon all NFL games will be broadcast online and that what Twitter and the NFL are doing is the future. That idea isn’t in touch with reality. The NFL is getting paid $28B by FOX, CBS and NBC over the course of their contracts, which end in 2022. That averages out to $3.1B per year from just those three broadcasters. The NFL has no financial incentive to put more games online, without restrictions, or it risks losing its big payday from the broadcasters. It’s not about what consumers want; it’s about what’s best for the NFL’s bottom line. Economics drives this business, not technology or platforms.

If Twitter has a game plan for all the live video it is licensing, it’s not clear what that is. In a recent interview, Twitter’s CFO commented that Twitter’s goal with the NFL games is to “help the NFL reach an audience it was not otherwise reaching.” How does Twitter know it is doing that? There were plenty of people like me who were watching the game on TV and Twitter at the same time. The NFL didn’t need Twitter to reach me. And when the CFO uses the term “high fidelity” to describe the stream, what does that mean? Twitter keeps saying it has the “mobile audience,” but it won’t break out numbers on what the usage was on mobile versus the web, or any data on average viewing time on mobile. Twitter also said, “there was evidence that people who had not watched the NFL in a while were back engaged with it.” What kind of evidence is that exactly? Twitter can’t tell if I was on NFL.com the day before, or watching the game on TV today, so what kind of data are they referencing?

Twitter also says it was “incredibly pleased with how easy it was for people to find the Thursday night game on Twitter.” Making a live link available isn’t hard. A WSJ article said there are other “live sports streaming technologies out there” but Twitter’s was “easy to use.” All live linear video on the web uses the same underlying technology; Twitter isn’t doing anything special. They are using the same platform that MLB, ESPN, PlayStation Vue, WWE and others use, as they are all MLBAM customers. Many in the media made it sound like Twitter did something revolutionary with the video technology, which wasn’t the case at all.

Someone commented online that the reason Twitter’s NFL stream is so “successful” is that “advertisers love the mass that live sports delivers.” But it doesn’t deliver that mass audience online, only on TV. And that’s the rub. The NFL, like every other major sports league, isn’t going to do anything to mess with its core revenue stream. So for at least the next six years, until the contracts with the broadcasters come up for renewal, the NFL isn’t going to do anything more than experiment with live streaming. And there will always be another Twitter, Yahoo, Verizon, Google, or someone else willing to give the NFL money, more for the publicity of the deal than for anything that actually increases the league’s bottom line.

Moderating NYME Session On Future Of Live Streaming Thursday, Looking For Speaker

Thursday at 12:20pm I am moderating a session on “The Future Of Live Streaming: Sports, Linear TV & Social” at the New York Media Festival in NYC. It’s a short 30-minute round table panel, with lots of Q&A from the audience. I am looking for one more speaker to join the panel, preferably a content owner, broadcaster or aggregator. Email me if interested.

The Future Of Live Streaming: Sports, Linear TV & Social
From NFL games on Twitter, to upcoming live linear services from Hulu and AT&T joining Sling TV, live streaming is exploding on the web. With rumors of Amazon wanting to license live sports content, Disney’s investment in MLBAM, and Twitch pumping out millions of live streams daily, consumers now have more live content choices than ever before. Attendees of this session will have the opportunity to participate in the discussion about the most important obstacles being faced when it comes to live streaming. Topics to be covered include content licensing costs for premium content, monetization business models, what consumers want to watch and the impact social services could have on live video. This will be an interactive session with plenty of Q&A from the audience. http://mefest.com/session/the-future-of-live-streaming/