
Media Monitoring with Big Data: Opportunities and Challenges

What would marketing strategists do if every second of all broadcast television, broadcast radio, and social media posts were available in real time for analysis? How would advertisers react if the transcripts of all the TV and radio broadcast information were available in addition to video and audio for every second anywhere in the nation in real time? Big Data analytics provides a means to digest this copious amount of information and to make ad buying decisions on the fly.

Big Data analytics means not only processing very large amounts of data but also coherently connecting data of very different, unstructured kinds. For example, the data could be a combination of radio and TV audio converted into text by speech recognition, closed-caption text of TV programs, Twitter and Facebook posts, financial data and spreadsheets, and news articles. Another advantage of Big Data analytics is that the system is self-learning; once the initial analytical model has been created, new data and information can be easily integrated into the knowledge base. Big Data analytics has been creating a buzz in the industry for a number of years, and many companies already offer tools and services to reap its benefits.

An example of a company that archives broadcast content for media monitoring is Critical Mention. Critical Mention captures and indexes 40 hours of broadcast content every 60 seconds from more than 2,000 unique broadcast sources. Users thus have access to a database of more than 16 million searchable segments, all available in broadcast quality and in near-real time. Users can search TV, radio, and online news; watch video; edit and share coverage; receive real-time alerts; and create reports. The system also offers analytic tools to gain insight into the data, benchmark against competitors, analyze viewer sentiment, and visualize density of coverage on maps. The following picture is an example of the analysis tool output.


Figure: Critical Mention Analytics

Integration of Big Data analytics with archived media information would open up many new possibilities. For example, the single string-based search provided by Critical Mention could be augmented to search on different combinations of groups of words. Groupings of related words, and inclusion/exclusion combinations of many such groups, are known as dictionaries and rules in Big Data jargon; these well-known techniques are used to isolate relevant information and validate hypotheses. The natural language processing (NLP) capabilities of Big Data can also help infer the context in which the keywords are used, making the results more reliable and relevant. One can further envision image processing and pattern matching algorithms being used to detect and quantify product placement in broadcast video. For example, the Coke logo could be identified in time-stamped video frames across all the broadcasters in all the DMAs in real time. These capabilities are computationally intensive, but the algorithms are fairly well understood, and cloud services, which offer scalable computational resources, can make this kind of analysis available in the future.
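The dictionaries-and-rules idea described above can be sketched in a few lines of code. The dictionary names, word lists, and rule structure below are purely illustrative assumptions, not Critical Mention's actual API:

```python
# Hypothetical sketch of dictionary-and-rules filtering over a transcript.
# "Dictionaries" group related words; "rules" combine dictionaries with
# include/exclude logic to isolate relevant broadcast segments.

DICTIONARIES = {
    "beverage": {"coke", "cola", "soda"},
    "sports": {"game", "stadium", "halftime", "playoffs"},
    "negative": {"recall", "lawsuit", "boycott"},
}

def matches(transcript, include, exclude):
    """True if the transcript hits every included dictionary
    and no excluded dictionary."""
    words = set(transcript.lower().split())
    hit = lambda group: bool(DICTIONARIES[group] & words)
    return all(hit(g) for g in include) and not any(hit(g) for g in exclude)

segment = "coke ad aired during the halftime show at the stadium"
print(matches(segment, include=["beverage", "sports"], exclude=["negative"]))  # True
```

A real system would layer stemming, phrase matching, and NLP-based context analysis on top of this word-level matching, but the include/exclude rule logic is the same.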

Recognizing some of these potential business opportunities, broadcasters and consumer electronics manufacturers are developing a next-generation television system with dynamic ad insertion and targeted advertising capabilities. If ad buyers can identify a need (in real time, if they wish) to maintain their brands’ share of voice (SOV) in certain geographically segmented markets, broadcasters will have the technical means to deliver the ads to the intended audience. As a consequence, programmatic advertising and ad exchanges might become more dynamic in balancing demand and supply.

This kind of individualized targeting might also raise challenges to traditional advertising models in the broadcast industry. With access to all the real-time and personalized information of the audience, an ad buyer will have the option to use different means to reach the audience, even when the audience is consuming broadcast content. For example, armed with the knowledge of a consumer’s state of mind at a certain location at a certain instant, the ad buyer might reach the targeted customer through a smartphone, thus totally bypassing the broadcasters. Proliferation of the Internet of Things (IoT) will provide more outlets for targeted advertising, and the situation might become even more challenging for the traditional broadcasting model.

Media monitoring along with Big Data analytics has tremendous potential for targeted and contextual advertising. The advertising industry is changing rapidly, revealing both opportunities and challenges for broadcasters. Broadcasters will need to be vigilant of this fundamental paradigm shift in advertising and adapt to new market challenges.


Is Broadcast Part of the Solution to Internet Traffic Congestion?

Akamai’s Chief Architect Will Law discussed trends in Internet connectivity at the ATSC Broadcast Television Conference held on May 14, 2015 in Washington, DC. He noted that delivering high-quality video will challenge the Internet infrastructure in the future, and that integrating broadcast with the Internet could be a solution to that problem.

Akamai’s Intelligent Platform™ is a leading cloud platform that delivers secure, high-performing user experiences. The platform is a globally distributed network of servers and intelligent software, and it handles over two trillion interactions daily. Through this platform Akamai has gathered insightful metrics such as connection speed, network availability, traffic patterns, etc. over different geographical regions. Akamai’s Chief Architect shared some of the data and statistics with the ATSC Broadcast Television Conference audience.

Although average connection speed is increasing across the US, there are significant disparities among the states. It is generally accepted among Internet providers that OTT delivery of a 4K UHD signal would require an Internet connection of about 15 Mbps. The following figure shows that many states are not yet ready to support a widespread 4K UHD OTT service.
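As a back-of-the-envelope illustration of what that 15 Mbps figure implies for traffic volume (a rough calculation of our own, not Akamai's methodology):

```python
# Data volume of a single 4K UHD OTT stream at the ~15 Mbps rate cited above.
# Decimal units (1 Mbps = 1e6 bits/s, 1 GB = 1e9 bytes), as ISPs use them.
bitrate_mbps = 15
hours = 2  # roughly a feature-length movie
gigabytes = bitrate_mbps * 1e6 * hours * 3600 / 8 / 1e9
print(f"{gigabytes:.1f} GB per {hours}-hour stream")  # 13.5 GB
```

Multiply that 13.5 GB by millions of simultaneous viewers during a big live event, and the scale of the delivery challenge, and the appeal of moving that traffic off the unicast Internet, becomes clear.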

Figure: Average connection speed by state

Akamai also showed that peak bandwidth demand varies with time and coincides with big events. Moreover, the peak bandwidth demand also increases year by year. The following figure shows that the peak bandwidth demand is spiky and that the envelope of the peaks is growing.

Figure: Peak bandwidth demand over time

Factors that are driving the demand for higher Internet speed are 4K/UHD video, OTT services, catch-up TV, device-based video playback outside the living room, and increasing user expectations of quality. There are a few potential solutions to the bandwidth challenge: better compression (HEVC), growth in average throughput, backbone fiber improvements, hybrid UDP protocols, possibility of IP multicast, P2P, HTTP2, standards such as MPEG-DASH, decreasing cost of storage, and broadcast integration.

Akamai’s solution to the bandwidth problem is to place edge servers near the end user, as shown in the following figure. The edge server, marked as an orange box, is physically near the end user; this kind of placement reduces the burden of fetching the content from distant sources and thus reduces overall internet traffic.

Figure: Akamai edge-server placement

Broadcasters have several opportunities in this environment. First, they could form partnerships with CDN providers and collaborate on caching content on the edge servers. Second, using new technologies such as ATSC 3.0, broadcasters could reach many end users directly over the air, especially during the large bandwidth demand spikes associated with big events. If terrestrial broadcasts are integrated with broadband networks, service to consumers can be optimized by combining the inherent flexibility of the Internet with the high spectrum efficiency of broadcast delivery.
