The ability to generate thoughts and build relationships within the stream of consciousness characterizes the human mind. We think in concepts. By filing each memory not just as a picture but as a series of associated values, ideas and themes, the mind gains the ability to ‘think’ in a meaningful way; to see patterns, distinguish differences, identify commonalities, find common threads and join up memories and ideas in a way that creates meaning and fuels creativity.
These ‘common threads’ are, in essence, metadata – and have as much of a place within computing as in the human brain. They give us the ability to turn disparate data into something meaningful – to ‘see’ the content of the picture, to recognize and join up the concepts associated with a clip or file. And their application within the field of broadcast is continually growing.
Real-world metadata applications in the field of broadcast
Metadata potential is almost limitless. There’s a good chance you’ll already be familiar with some of the central applications. But it doesn’t hurt to recap. Metadata can:
- Help with technical production, allowing video specs to be matched when selecting clips for composition, or dictating how clips are distributed.
- Speed up the search process, allowing clips to be identified by theme, visual elements, ‘emotion’ (through AI sentiment analysis) and even spoken content (through AI-based speech detection).
- Automate scheduling, filling slots with assets based on series, chronology, theme, subject, genre, demographic or any other metric that works for you.
- Match advertising with broadcast content, creating congruence and improving ROI for advertisers.
- Assist with the categorization of media assets according to rights permissions and access certificates.
- Quickly filter out sensitive and adult content which is not suitable for certain audiences through sentiment and language analysis.
- Make your system intuitive and accessible to third-party users who need access to your media library.
- Deliver automated speech-to-text and subtitling, along with associated time codes.
- Allow you to engage in more sophisticated reporting processes, giving you strategic insight into your media asset performance while also providing greater accountability to external stakeholders.
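Concretely, most of these applications reduce to attaching a structured record of descriptive, technical and rights fields to each asset, and then querying those fields. A minimal sketch in Python (the field names and the `find_by_theme` helper are illustrative, not a real broadcast schema):

```python
from dataclasses import dataclass, field

@dataclass
class ClipMetadata:
    """Illustrative metadata record for a single media asset."""
    asset_id: str
    title: str
    themes: list[str] = field(default_factory=list)    # descriptive: search by theme
    resolution: str = "1920x1080"                       # technical: match video spec
    frame_rate: float = 25.0
    genre: str = ""                                     # editorial: scheduling by genre
    rating: str = "G"                                   # compliance: filter adult content
    subtitles: list[str] = field(default_factory=list)  # accessibility: available languages

def find_by_theme(library: list[ClipMetadata], theme: str) -> list[ClipMetadata]:
    """Theme-based search: one of the simplest metadata-driven operations."""
    return [clip for clip in library if theme in clip.themes]

library = [
    ClipMetadata("a1", "Harbour at dawn", themes=["sea", "morning"]),
    ClipMetadata("a2", "City traffic", themes=["urban", "morning"]),
]
print([c.asset_id for c in find_by_theme(library, "morning")])  # ['a1', 'a2']
```

The same record can be filtered on any combination of fields, which is what makes a single well-populated schema serve search, scheduling, compliance and distribution at once.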
Major usage trends of metadata in the broadcast and media industry
A few of these areas are seeing particular interest in the broadcast market at the moment. Take, for instance, the broad field of rights management. For broadcasters, content exchangers and anybody managing a huge library of film and TV assets (or media in general), metadata tags can be associated with assets covering publication/depublication dates, geo-blocking policies, view counts and syndication. This is vitally important for ensuring that the right content goes to the right markets or, perhaps more importantly, doesn’t go to the wrong ones, providing an added level of security and checking, and ensuring that expensive legal mistakes aren’t made.
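As a sketch of how such rights metadata might gate delivery, consider the check below. The asset shape and the `is_deliverable` function are hypothetical illustrations of the idea, not a real rights engine:

```python
from datetime import date

def is_deliverable(asset: dict, market: str, today: date) -> bool:
    """Check an asset's rights metadata before releasing it to a market.

    The asset dict holds an illustrative publication window plus a
    geo-blocking list of market codes where delivery is forbidden.
    """
    if not (asset["publish_from"] <= today <= asset["depublish_after"]):
        return False  # outside the licensed publication window
    if market in asset["geo_blocked"]:
        return False  # market is explicitly blocked
    return True

asset = {
    "publish_from": date(2023, 1, 1),
    "depublish_after": date(2023, 12, 31),
    "geo_blocked": ["US", "CA"],
}
print(is_deliverable(asset, "UK", date(2023, 6, 1)))  # True
print(is_deliverable(asset, "US", date(2023, 6, 1)))  # False: geo-blocked
```

Because the check runs on metadata alone, it can be applied automatically at every point of exit, which is where the “added level of security” comes from.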
Metadata is also helping broadcasters to make their content libraries more accessible and appealing to audiences; localizing content through the provision of languages and subtitles, or attaching themes and synopses. These latter elements don’t merely allow a user – be they industry or consumer – to assess whether a given asset is right for them, but also feed better predictive algorithms, making for more efficient workflows and keeping audiences happy, loyal and tuned in.
Metadata isn’t only useful for the input part of the broadcast process though. It’s also key to understanding how your content library as a whole has been used and compiling reports based on the overarching themes and nature of that use. What parts of your library have generated the most value? Be it financial, administrative or project management-based, the ability to gain an overview of your media performance based on its metadata properties gives you a much more sophisticated and usable business-level understanding – allowing you to make strategic decisions for the future.
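One way to picture that kind of metadata-driven reporting: roll usage figures up by a descriptive field such as genre. A toy sketch, with invented records and field names:

```python
from collections import defaultdict

def views_by_genre(usage_records: list[dict]) -> dict[str, int]:
    """Aggregate per-asset view counts by the genre tag in their metadata."""
    totals: dict[str, int] = defaultdict(int)
    for record in usage_records:
        totals[record["genre"]] += record["views"]
    return dict(totals)

# Hypothetical usage log: each entry pairs an asset's metadata with its views.
usage = [
    {"asset_id": "a1", "genre": "news",  "views": 1200},
    {"asset_id": "a2", "genre": "sport", "views": 800},
    {"asset_id": "a3", "genre": "news",  "views": 300},
]
print(views_by_genre(usage))  # {'news': 1500, 'sport': 800}
```

Swap `genre` for any other metadata field (rights holder, project, theme) and the same aggregation answers a different business question.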
VSN and metadata
The significance of metadata in the management of media has been apparent to VSN right from the beginning – and as such, we’ve always worked to push the boundaries of how it can be used, and how we can innovate to make it faster, more accurate, and more usable.
This means metadata isn’t merely a hastily applied afterthought in our media management system (VSNExplorer) – it’s built in from the core up. It’s key that metadata elements are present and utilized throughout the entire workflow, and we make sure of this by integrating metadata into every one of our systems, and by ensuring that as content passes through each of them, its metadata flows alongside it. All of the metadata applications discussed above are built right into the core of our system: fully customizable, user-generated metadata input that, where possible, integrates predictive or automated workflows to make the metadata categorization process smoother and more efficient.
A crucial component of this is the use of AI; a key element that sets VSN apart from many lower-level competitors. We achieve this by partnering with the best in the field – the so-called ‘Big Three’ (Google Cloud, Microsoft Azure and IBM Watson) to harness the power of the millions of inputs they receive every day.
This ability to categorize and learn is a very big deal. Traditionally, because metadata is essentially about ascribing ‘human understanding’ to a video, the process of generating it was necessarily human in nature. Pity the poor person, locked away in a basement, watching thousands of hours of video, and attempting to generate as many conceptual tags as their poor, addled brain could think of. It’s not a job that we would want…
Now, applying AI to the process can literally (and we don’t use the word lightly) turn months’ worth of cataloging time into the work of a few minutes: key for harnessing the value both of legacy libraries – unlocking the reuse of decades’ worth of footage by making it relevant and accessible – and of new content.
Our AI can process incredibly sophisticated elements: visual features (facial recognition, objects, places, organizations and logos), audio layers (music, voice, speakers and audio effects), speech-to-text and automatic translation, Optical Character Recognition (OCR) and – this is pretty incredible – sentiment analysis, which automatically detects the emotion being conveyed within a video. Perfect for finding the ideal tear-jerking moment to accompany your ‘lost cat’ news segment…
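Once an AI engine has attached such tags, finding that moment becomes an ordinary metadata query over time-coded segments. A hedged sketch – the segment structure below is an assumed shape for illustration, not the actual output format of any of the engines named above:

```python
def find_moments(segments: list[dict], emotion: str,
                 min_confidence: float = 0.8) -> list[tuple]:
    """Return (start, end) time codes of segments tagged with the wanted emotion."""
    return [
        (s["start"], s["end"])
        for s in segments
        if s["emotion"] == emotion and s["confidence"] >= min_confidence
    ]

# Illustrative AI output: per-segment sentiment tags with time codes (seconds).
segments = [
    {"start": 0.0,  "end": 12.5, "emotion": "neutral", "confidence": 0.91},
    {"start": 12.5, "end": 30.0, "emotion": "sad",     "confidence": 0.87},
    {"start": 30.0, "end": 41.0, "emotion": "sad",     "confidence": 0.55},
]
print(find_moments(segments, "sad"))  # [(12.5, 30.0)]
```

The confidence threshold matters: AI-generated tags are probabilistic, so low-confidence matches (like the third segment above) are typically excluded or flagged for human review.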
Because ultimately, metadata is about making processes more efficient in the long run; maximizing the ways you can use your media assets and improving the ROI achieved by them with relatively little investment – particularly when AI and automation are doing much of the leg work for you. Put simply, the sophisticated metadata elements that VSN has integrated across the entire VSNExplorer suite allow for quicker, more efficient, more cost-effective and more creative use of media – whatever the application. Why not find out how you can make metadata work for you?