With Automated Data Pipes and Data Fuel, the Broadcast TV Ad Sales Engine is Ready to Roar


In this BIA Vantage Points invited post, Premium Media 360, a TV data automation firm that synchronizes data among media and agency partners, argues that having data and automating TV advertising platforms are critical success factors. However, there remains the urgent need to synchronize islands of data across vendor and application stacks to achieve the full benefits of workflow automation.

The Vantage Point series taps the perspectives of various lookout points from around the local media and tech sectors. The views expressed do not necessarily reflect that of BIA Advisory Services. Please contact Rick Ducey if you have insights to share.


By Joan FitzGerald, SVP – Advanced TV Global Partnerships, PremiumMedia360

At this year’s NAB Show, one of the most compelling expert panels was about a new focus on reducing friction between buyers and sellers of local broadcast television advertising. The panel discussed the importance of automation and progress on standards for APIs. Although APIs – the data “pipes” – are an important starting point to solve the automation puzzle, there’s still a missing piece: the data synchronization “fuel”. Think about it this way: even if the data pipes are in place, if the agency uses gasoline and the broadcaster uses diesel fuel, the ad sales “engine” still won’t work.
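To make the gasoline-versus-diesel analogy concrete, here is a minimal, hypothetical sketch of the synchronization problem: the same spot order held in an agency system and a broadcaster system, with different date formats, casing, and rate units. The field names and formats are illustrative assumptions, not drawn from any real traffic or buying platform or from the TIP standards.

```python
from datetime import datetime

# The same order, as each side's system stores it (hypothetical formats).
agency_order = {
    "advertiser": "ACME MOTORS",     # upper case
    "air_date": "06/03/2024",        # MM/DD/YYYY
    "rate": "1500.00",               # dollars, as a string
}
broadcaster_order = {
    "advertiser": "Acme Motors",     # mixed case
    "air_date": "2024-06-03",        # ISO 8601
    "rate_cents": 150000,            # cents, as an integer
}

def normalize_agency(rec):
    """Translate the agency's 'gasoline' into a common format."""
    return {
        "advertiser": rec["advertiser"].strip().lower(),
        "air_date": datetime.strptime(rec["air_date"], "%m/%d/%Y").date().isoformat(),
        "rate_cents": int(float(rec["rate"]) * 100),
    }

def normalize_broadcaster(rec):
    """Translate the broadcaster's 'diesel' into the same common format."""
    return {
        "advertiser": rec["advertiser"].strip().lower(),
        "air_date": rec["air_date"],
        "rate_cents": rec["rate_cents"],
    }

def synchronized(agency_rec, broadcaster_rec):
    """Records agree only once both sides speak one data language."""
    return normalize_agency(agency_rec) == normalize_broadcaster(broadcaster_rec)

print(synchronized(agency_order, broadcaster_order))  # True
```

Without the normalization step, a field-by-field comparison of the raw records would report a discrepancy on every field even though both sides describe the same spot; that reconciliation work is the "fuel" the APIs alone don't supply.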

Why all the fuss? Agency leaders from GTB and Publicis Media put it best: local linear TV is losing revenue from advertisers who need to be on air in hours, not weeks. Digital advertising, already automated, now captures the bulk of every new dollar spent on advertising.

Broadcast television leaders are taking notice of these market realities and acting to reverse the trend. Initiatives like Television Interface Practices (TIP), founded by Hearst, Nexstar, Sinclair and TEGNA, have published the TIP 2.0 API standards to accelerate improvements in broadcaster-agency data communications. PremiumMedia360, an advertising data automation company, announced it is the first to align with the TIP 2.0 API standards and to combine APIs with data synchronization.

It’s a tribute to broadcast market leaders that automation is moving so quickly. The combination of APIs and data synchronization is the key to enabling buyers and sellers to “speak” the same data language, with complete, accurate data and true two-way, real-time communication.

With the right pipes in place and the right fuel in the engine, the broadcast ad sales engine is primed and ready to roar with new revenue growth. Automated, frictionless transactions between buyers and sellers can be a reality. Moreover, with automation in place, new capabilities for both consumers and advertisers that accelerate broadcaster revenue growth are now on the horizon.

About the Author

Joan FitzGerald leads advanced TV and global partnerships at PremiumMedia360, a leader in advertising automation solutions focused on creating frictionless transactions between buyers and sellers of TV advertising and programmatic linear TV. Joan has held significant business and product leadership roles, including vice president at TiVo, where she launched an advanced audience-targeting platform for national TV advertising, and SVP at comScore, where she pioneered comScore’s first cross-platform measurement solution. Her professional achievements include a David Ogilvy Award, nominated by Walmart, and a patent on advertising measurement techniques.
