The Business Benefits Of Using A Hybrid DIY CDN Approach To Content Delivery

No CDN provider can guarantee best performance for all kinds of content and across all geographies 100% of the time. For some use cases, this can result in a lack of control over content delivery when and where it is most critical for customers. To counter this, I’m starting to see a move by some companies toward hybrid CDN delivery, defined as building a private DIY CDN on top of a third-party CDN. For some this might seem like overkill, but in business-critical operations, slow performance, downtime and giving up too much control of content can dull a company’s competitive edge.

The hybrid CDN approach enables business advantages that ultimately boil down to optimizing performance within a content delivery ecosystem that the company creates, controls and oversees to ensure availability, reliability, flexibility and user experience. At the CDN Summit last month, there was a lot of talk about the business benefits of doing a hybrid CDN. The four most important are: availability and risk distribution; performance improvement; flexibility and control; and security.

Availability and risk distribution
With online services at the heart of doing business, downtime or outages equal losses. As such, the goal is availability that is as close to 100% as possible. Coupling a private, DIY CDN with a third-party CDN helps ensure this uptime by avoiding a single point of failure. When a third-party CDN goes down, websites that depend solely on it go down with it – the "all eggs in one basket" problem. Assembling your own private CDN and adopting a hybrid approach distributes that risk more effectively and gives you better real-time insight into, and control over, what is happening. The result is less avoidable downtime and, beyond seeing fewer problems overall, the ability to spot issues before or as they hit and to troubleshoot and mitigate them early and fast.
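
To make the failover idea concrete, here is a minimal sketch (not any vendor's implementation) of a routing layer that probes both the private CDN and the third-party CDN and sends traffic to whichever is healthy. The hostnames and health-check path are hypothetical, and a real deployment would do this in DNS or a global traffic manager rather than in application code.

    import requests  # any HTTP client works; requests is assumed for brevity

    # Hypothetical endpoints, for illustration only
    PRIVATE_CDN = "https://edge.private-cdn.example.com"
    THIRD_PARTY_CDN = "https://cdn.vendor.example.net"

    def is_healthy(base_url, timeout=2.0):
        """Probe a lightweight health endpoint; any error counts as unhealthy."""
        try:
            return requests.get(f"{base_url}/healthz", timeout=timeout).status_code == 200
        except requests.RequestException:
            return False

    def pick_endpoint():
        """Prefer the private CDN, fall back to the third party, never to nothing."""
        for endpoint in (PRIVATE_CDN, THIRD_PARTY_CDN):
            if is_healthy(endpoint):
                return endpoint
        # Both probes failed: keep serving via the third party and alert operations.
        return THIRD_PARTY_CDN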

Performance improvements
As usual, one size does not fit all, nor does one tool solve every complex problem. This is why, when looking for performance gains, layering a private DIY CDN on top of a third-party CDN ensures that content is delivered by the best-performing CDN components in real time, under both regular and surging traffic. It also allows content to be delivered from the PoP closest to your primary region(s), ensuring the fastest possible delivery.
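
As a crude sketch of "best-performing component in real time", the snippet below times a small probe object on each candidate endpoint and routes the next batch of requests to the fastest one. Real traffic managers such as Cedexis use far richer community and synthetic measurements; the endpoints and probe URL here are hypothetical.

    import time
    import requests

    # Hypothetical candidates: two private PoPs plus the third-party CDN
    CANDIDATES = {
        "private-pop-nyc": "https://nyc.private-cdn.example.com/probe.gif",
        "private-pop-fra": "https://fra.private-cdn.example.com/probe.gif",
        "third-party-cdn": "https://cdn.vendor.example.net/probe.gif",
    }

    def measure(url, timeout=3.0):
        """Return the time to fetch a tiny probe object, or infinity on failure."""
        start = time.monotonic()
        try:
            requests.get(url, timeout=timeout)
        except requests.RequestException:
            return float("inf")
        return time.monotonic() - start

    def fastest_candidate():
        timings = {name: measure(url) for name, url in CANDIDATES.items()}
        return min(timings, key=timings.get)  # name of the quickest endpoint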

Flexibility and control
Implementing a hybrid CDN approach lets you gain better oversight of traffic management and finer control over how content is routed. By creating a purpose-built CDN ecosystem for your particular business and traffic patterns, you can direct traffic to your own private CDN nodes or to third-party nodes to get maximum performance and efficiency where the demand is, and run your business and application logic as close to your users as possible. This flexibility also gives you more control over your costs. Private CDNs enable greater scalability, as costs rise with the number of nodes and requests rather than with capacity, which makes a big difference when heavier content (e.g. high-res video) requires more bandwidth. You still pay for bandwidth, which you would do anyway, but with a private CDN layer you gain oversight into the costs you incur. Commercial CDNs are essentially rented resources, which can become expensive and remain outside your control.
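
To make the cost argument concrete, here is a back-of-the-envelope comparison; all prices are invented placeholders, not quotes from any provider. A commercial CDN billed per GB grows linearly with traffic, while a private CDN layer is dominated by a roughly fixed per-node cost plus the transit you would be paying for anyway.

    # Hypothetical monthly figures, for illustration only
    COMMERCIAL_PRICE_PER_GB = 0.04   # $/GB delivered by a commercial CDN
    PRIVATE_NODE_COST = 400.0        # $/month per private CDN node (hardware/VM + ops)
    PRIVATE_NODES = 5
    TRANSIT_PRICE_PER_GB = 0.005     # $/GB transit you would already be paying

    def monthly_cost(traffic_gb):
        commercial = traffic_gb * COMMERCIAL_PRICE_PER_GB
        private = PRIVATE_NODES * PRIVATE_NODE_COST + traffic_gb * TRANSIT_PRICE_PER_GB
        return commercial, private

    for tb in (10, 50, 100, 500):
        commercial, private = monthly_cost(tb * 1000)
        print(f"{tb} TB/month: commercial ${commercial:,.0f} vs private ${private:,.0f}")

With these made-up numbers the private layer costs more at low volumes and pulls well ahead as traffic grows; the actual crossover depends entirely on your own contracts and node costs.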

Security
Part of “taking back control” of content includes ensuring its security. For many content owners, having valuable content on a multi-tenant, third-party CDN is not optimal for security. In addition to the aforementioned limitations of “renting resources”, letting your proprietary content travel across someone else’s infrastructure removes the layer of privacy and security some companies need. The private CDN layer can also act as a WAF, providing protection against DDoS and other attacks, since with a hybrid CDN solution your private content is not sitting on an open, public network.

Sometimes the adage “less is more” doesn’t apply. In the case of business-critical content delivery, more is more. A hybrid CDN approach (your own DIY CDN combined with your third-party CDN) lets you address your content delivery challenges on a global scale to get the best of what the hybrid solution offers for tangible, positive business results.

Summing it all up, a DIY CDN solution allows you to:

  • Mitigate outages and downtime.
  • Eliminate the “single point of failure” problem.
  • Set up private nodes that act as an “origin shield” to optimize content delivery and cache-hit ratio, protecting the origin from sudden floods of calls from the CDN (see the sketch after this list).
  • Focus on your performance: with a DIY CDN, you are not sharing resources with the thousands of other customers of a third-party CDN.
  • Direct traffic to your own private CDN nodes or third-party nodes to gain maximum performance and efficiency where demand is and run your business and application logic as close to your user as possible while controlling your costs.
  • Secure your proprietary content by delivering it over your own private CDN.
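
The “origin shield” bullet above is worth a concrete illustration. The core idea is request coalescing: when many edge nodes miss on the same object at once, only one request is forwarded to the origin and the rest wait for that result. Varnish provides this behavior natively; the Python below is only a minimal sketch of the concept under simplified assumptions, not the product’s implementation.

    import threading
    import time

    class OriginShield:
        """Minimal request-coalescing cache: one concurrent miss per key reaches
        the origin; all other requests for that key wait and reuse the result."""

        def __init__(self, fetch_from_origin, ttl=60):
            self.fetch = fetch_from_origin   # caller-supplied function key -> bytes
            self.ttl = ttl
            self.cache = {}                  # key -> (expires_at, value)
            self.locks = {}                  # key -> lock guarding the origin fetch
            self.guard = threading.Lock()

        def get(self, key):
            entry = self.cache.get(key)
            if entry and entry[0] > time.time():
                return entry[1]              # cache hit, origin never sees it
            with self.guard:
                lock = self.locks.setdefault(key, threading.Lock())
            with lock:                       # only one thread per key goes to origin
                entry = self.cache.get(key)
                if entry and entry[0] > time.time():
                    return entry[1]          # another thread already refilled it
                value = self.fetch(key)
                self.cache[key] = (time.time() + self.ttl, value)
                return value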

Recognizing the need for the kind of flexibility and control a DIY CDN offers, Varnish Software and Cedexis partnered last year to create a solution delivering all of these benefits: Varnish Extend. Underpinned by high-performance caching from Varnish Plus and global traffic management from Cedexis, Varnish Extend is a DIY content delivery solution designed specifically to provide tailored, individualized content delivery infrastructure options. [See my post entitled: Varnish Software And Cedexis Announce A New Private Content Delivery Offering]

For more in-depth details on a hybrid CDN approach, check out the video of Varnish Software’s presentation entitled, “Combining Your Existing CDN With a Private Content Delivery Solution” from the CDN Summit last month.

Wednesday Webinar: Using Data To Enhance QoE and QoS

Wednesday at 2pm ET, we’ll have a StreamingMedia.com webinar on the topic of “Using Data: Enhancing Quality of Experience and Quality of Service.” Online video viewers now expect a broadcast-quality viewing experience. The only way to do that is with highly granular analytics that help assess and improve Quality of Experience and Quality of Service. Join this roundtable to hear insights into how to get the most out of your data, and how to make sure your viewers get the quality they expect. Topics we’ll cover include the following:

  • The four key indicators for analyzing video quality of experience for an enterprise webcast
  • How to utilize system, stream, network, and user level data
  • How to use analytics to improve the customer experience of your OTT service
  • The ways prescriptive analytics can ensure webcasting success
  • Encoding considerations for the various types of networks
  • Measuring QoE in DASH-based adaptive streaming—start-up delay, buffering, quality switches and media throughput
  • Optimizing adaptive streaming with the QoE data—pick the most appropriate quality level based on measured parameters
  • Real-time insights with real-time monitoring—use cases and customer examples

REGISTER NOW to join this FREE Live web event.

Best Practices For CDN Origin Storage

Because a CDN is a highly scalable network, it handles most requests from edge cache without impacting the content distributor’s origin and application infrastructure. However, the content must be available for retrieval on cache miss or when the request has to pass to the application. Whether the assets are videos, images, files, software binaries, or other objects, they must reside in an origin storage system that is accessible by the CDN. Thus, the origin storage system becomes a critical performance component on cache miss or application request.

Most CDNs have historically offered file-based solutions designed and architected to permanently store content on disk and act as the origin server and location for CDN cache-fill. Other alternatives include object stores, general-purpose cloud storage services including Amazon S3, and CDN-based cloud storage solutions with object-based architectures. What’s interesting about origin storage services within CDNs is that they should be able to offer an advantage over monolithic cloud environments. CDNs are essentially large distributed application platforms, and one of those applications is storage.

To be clear, origin storage is fundamentally different from the caching applications that CDNs also provide. Storage implies some permanence and durability, whereas with caching, the objects are ultimately evicted when they become less popular or expire. CDNs all operate some form of distributed architecture that is connected with last mile telco providers. If storage is distributed throughout the CDN in multiple locations, requests from end-users for content that is not already in cache can be delivered significantly faster from a nearby storage location. However, performance suffers if a request has to traverse the CDN network and potentially the open Internet to access remote origin storage.

Some content distributors have accepted the risk that the availability and durability metrics for single storage locations offered by cloud storage providers are good enough and that their applications will continue to work even if their cloud provider experiences issues. With hindsight and experience, though, it is clear that this is not always the case, as the fallout from the Eastern USA S3 outage in early 2017 showed. The solution Amazon offers is to use its tools to architect and build your own high-availability and redundancy setup, using multiple storage locations and versioning your objects. That is a complex change in operations from simply uploading content to a single location, and the operational overhead and cost of doing it for multi-terabyte or petabyte libraries is significant.
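
As a rough sketch of what “building your own HA” with Amazon’s tools involves, the boto3 snippet below enables versioning on a source and a replica bucket and sets up cross-region replication between them. Bucket names, regions and the IAM role ARN are placeholders, and the replication role’s permissions are not shown; the point is simply the extra operational surface compared with uploading to a single location.

    import boto3

    # Placeholder buckets and regions; each bucket is addressed via a client
    # for the region it lives in.
    BUCKETS = {
        "my-origin-library": "us-east-1",
        "my-origin-library-replica": "eu-west-1",
    }

    # Versioning must be enabled on both buckets before replication will work.
    for bucket, region in BUCKETS.items():
        boto3.client("s3", region_name=region).put_bucket_versioning(
            Bucket=bucket,
            VersioningConfiguration={"Status": "Enabled"},
        )

    # Replicate every new object in the source bucket to the replica bucket.
    boto3.client("s3", region_name="us-east-1").put_bucket_replication(
        Bucket="my-origin-library",
        ReplicationConfiguration={
            "Role": "arn:aws:iam::123456789012:role/s3-replication-role",  # placeholder
            "Rules": [
                {
                    "ID": "replicate-everything",
                    "Prefix": "",   # empty prefix = all objects
                    "Status": "Enabled",
                    "Destination": {"Bucket": "arn:aws:s3:::my-origin-library-replica"},
                }
            ],
        },
    )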

I also hear from a lot of customers who focus on the cost of storage at rest but don’t consider the additional costs of replicating content or the storage access fees. For traditional cloud storage workflows, the cost of accessing content can be even higher than the cost of storing it. Content owners should pick a CDN that charges a flat fee to store multiple copies of a customer’s content, without additional charges for moving content into storage or accessing it when it is requested by users. In many cases, storage from a CDN is actually more cost effective than a traditional cloud storage provider for customers who need to access content from storage frequently.
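
A quick back-of-the-envelope example of why access fees matter; the prices are invented placeholders, not any provider’s rate card. For a frequently accessed library, per-GB egress and per-request charges can dwarf the at-rest storage line item.

    # Hypothetical monthly figures for a 100 TB library, illustration only
    LIBRARY_GB = 100_000
    STORAGE_PRICE_PER_GB = 0.023     # $/GB-month at rest
    EGRESS_PRICE_PER_GB = 0.05       # $/GB pulled out to the CDN on cache miss
    REQUEST_PRICE_PER_10K = 0.004    # $ per 10,000 GET requests

    monthly_egress_gb = 300_000      # cache misses / refresh checks hitting origin
    monthly_requests = 500_000_000

    at_rest = LIBRARY_GB * STORAGE_PRICE_PER_GB
    egress = monthly_egress_gb * EGRESS_PRICE_PER_GB
    request_fees = monthly_requests / 10_000 * REQUEST_PRICE_PER_10K

    print(f"at rest: ${at_rest:,.0f}  egress: ${egress:,.0f}  requests: ${request_fees:,.0f}")
    # With these placeholder numbers, access costs are several times the at-rest bill.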

Storing content in multiple locations also allows faster delivery of content that is not already in the cache. While it can be difficult to assign a specific value to delivery performance, the improved customer satisfaction of faster delivery can potentially outweigh any additional cost of replicating content closer to users. Storage costs can potentially be relatively small compared to the benefits of customer satisfaction and retention from the improved performance.

At the Content Delivery Summit last month, I had a conversation with Limelight Networks about what customers are asking for when it comes to origin storage and what a CDN should be doing to provide better performance than cloud storage providers. Limelight said they looked at why and how a company would choose object storage integrated with a CDN, as well as the challenges of migrating content, and architected a solution around that. The result is something they purpose-built called Intelligent Ingest, which automates the movement of objects into integrated origin storage based on either audience demand or a manifest of files. In load-on-demand mode, audience requests that require retrieval from origin storage deliver the content and load it into edge cache, and the content is also automatically stored in Limelight’s origin storage service. In manifest mode, content distributors provide a list of content to migrate and parameters to control the rate of migration.
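
Intelligent Ingest is Limelight’s own product, so the sketch below is not their API. It is just a generic illustration of what a manifest-driven migration loop looks like: read a list of source URLs, fetch each object at a throttled rate, and hand it to whatever writes into the destination origin storage (the store_object callback here is a hypothetical, caller-supplied function).

    import time
    import requests

    def migrate_from_manifest(manifest_path, store_object, max_per_second=5):
        """Fetch each URL listed in the manifest and pass the bytes to
        store_object(key, data), pacing requests to control the migration rate."""
        interval = 1.0 / max_per_second
        with open(manifest_path) as manifest:
            for line in manifest:
                url = line.strip()
                if not url:
                    continue
                resp = requests.get(url, timeout=30)
                resp.raise_for_status()
                # Derive the destination key from the URL path.
                key = url.split("://", 1)[-1].split("/", 1)[-1]
                store_object(key, resp.content)
                time.sleep(interval)   # crude stand-in for "rate of migration" controls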

In choosing the best origin storage solution, content owners should look for one that automatically replicates content to multiple locations based on regional policies. Customers can choose policies based on audience location, such as a single region like the Americas, EMEA or APAC, weighted policies across geographies, or fully global policies. Content is then automatically positioned close to the audience, and future origin storage calls, whether due to cache misses or refresh checks, are automatically served from the best origin storage location available for that request.
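
As an illustration of what such regional policies might look like in configuration terms (the policy names, locations and mapping below are invented, not any vendor’s actual options), a request is served from the closest storage location that the chosen policy actually replicates to:

    # Hypothetical policy table: policy name -> replicated storage locations
    REPLICATION_POLICIES = {
        "americas": ["us-east", "us-west"],
        "emea": ["frankfurt", "london"],
        "apac": ["singapore", "tokyo"],
        "global": ["us-east", "frankfurt", "singapore"],
    }

    # Hypothetical mapping from an audience region to its nearest storage location
    NEAREST = {"NA": "us-east", "SA": "us-east", "EU": "frankfurt", "APAC": "singapore"}

    def origin_for_request(audience_region, policy="global"):
        """Serve a cache miss or refresh check from the closest location the
        chosen policy replicates to, falling back to any replicated copy."""
        locations = REPLICATION_POLICIES[policy]
        preferred = NEAREST.get(audience_region)
        if preferred in locations:
            return preferred
        return locations[0]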

There are a number of workflows and use cases where these features could be useful. For new content production, the movement of new content into the CDN storage environment can be automated as it is published and end users start requesting it. It is also useful when migrating a library from an existing solution to a CDN’s origin storage, which Limelight said is a frequent use case: enabling load-on-demand lets audience requests determine which assets to migrate, while providing a manifest of files automates migration of those assets. Another use case is pre-positioning, sometimes known as pre-caching or cache warming. In advance of a launch, the CDN distributes all the necessary files across its origin storage, and when the launch goes live, the CDN handles the subsequent traffic spikes, offloading demand from the customer’s infrastructure.
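
Here is a minimal sketch of the pre-positioning idea: before a launch, walk the asset list and request each object through the CDN hostname so that the caches (and, in a setup like the one above, origin storage) are populated ahead of real traffic. The hostname and asset paths are hypothetical.

    import requests

    CDN_HOST = "https://cdn.launch.example.com"   # hypothetical CDN hostname
    ASSETS = ["/launch/video_1080p.mp4", "/launch/video_720p.mp4", "/launch/player.js"]

    def warm_cache(paths):
        """Issue one GET per asset so the CDN pulls and caches it before launch."""
        for path in paths:
            resp = requests.get(f"{CDN_HOST}{path}", stream=True, timeout=30)
            # Read and discard the body so the edge actually fetches the full object.
            for _ in resp.iter_content(chunk_size=1 << 20):
                pass
            print(path, resp.status_code)

    warm_cache(ASSETS)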

Looking at the CDN market today, it is clear the emphasis is not only on high quality and highly efficient delivery solutions but also on the range of services provided to help manage production workflows and improve the end-user experience. Moving more logic to the CDN edge to incorporate smart solutions for request handling — and as discussed above, automating content asset migration and distribution to improve performance and QoE — are areas where CDNs can make a clear difference.

Third HEVC Patent Pool Launches With Ericsson, Panasonic, Qualcomm, Sharp & Sony

For content owners and broadcasters looking to adopt HEVC, two patent pools, from MPEG LA and HEVC Advance, have been offering licensing terms around HEVC patents for some time. But if dealing with two pools wasn’t already confusing enough, a third pool has now entered the market. Launched in April, Velos Media is a new licensing platform that includes patents from Ericsson, Panasonic, Qualcomm Incorporated, Sharp and Sony.

As all of these patent pools claim, Velos Media says it is making it “easy and efficient” to license from the “innovators that have made many of the most important contributions to the HEVC standard.” Of course, they also say that their pool “reduces transaction costs and time to market”, that their terms are “reasonable”, with “rates that will encourage further adoption of the technology.” For a pool that tries to sound professional, their website is a joke. It contains no real details of any kind, such as which patents are covered in the pool and by which companies. Nor does it give any details on what the licensing terms are or whom exactly they cover.

In their press release they make a mention of “device manufacturers” but give no other context as to whom they are targeting. To make matters more confusing, Velos Media is being run by the Marconi Group, “an entity formed to create, launch and support new patent licensing platforms.” It’s clear Velos Media has no understanding of the market, doesn’t know the use cases and doesn’t realize the importance of transparency when it comes to patent pools.

Recovering From The Flu, Back To Blogging Shortly

I’m recovering from the flu and will be back online shortly. I’m catching up on emails this week and will be responding to all follow up from the Streaming Media East and CDN Summit events.

MLBAM CTO To Kick Off Day Two Of Streaming Show With Fireside Chat

I’m excited to announce that Joe Inzerillo, Executive VP and CTO of Major League Baseball Advanced Media will sit down with me for a fireside chat to kick off day two of the Streaming Media East show, on Wednesday May 17th at 9am. From BAMTech’s recent billion dollar investment by Disney to their new CEO and European office, it’s going to be a great discussion with lots of topics covered. If you have a question you want me to ask Joe, feel free to email it to me. The keynote is free to attend if you register online using code 200DR17 and select a discovery pass. #smeast

Learn How To Increase Viewership and Video Revenue with Syndication & Social

Social networks have experienced an explosion in video consumption. As a result, they have become increasingly more important to publishers and content owners for discoverability, audience reach and monetization. At the Streaming Media East show in NYC, taking place May 16-17, you will hear about the landscape for video across social platforms and how to take advantage of the opportunities at scale. Learn about how content owners can delight audiences and generate revenue while maintaining brand visibility on social, and the importance of viewing metrics in social distribution during this interactive discussion. Confirmed speakers include:

  • Moderator: Anil Jain, EVP & GM, Brightcove
  • Michael Philippe, Co-Founder, Keli Network
  • Jonathan Moffie, Director, Video Product and Distribution, Time
  • Rob Dillon, Digital Operations Manager, Tribune Media
  • Claudia Page, VP, Partner Products, Dailymotion

You can register online using code 200DR17 and get $200 off your registration ticket. #smeast