In July, New York City is hazy and muggy, but the SVG Sports Content Management Forum was anything but: it offered a clear set of talks and findings that reflect the state of the sports video industry today.
I’d like to start with a big thank you to Jason Dachman and Andrew Gabel for organising the event. Attendance was back to pre-pandemic levels, and the programme of talks was, as always, excellent.
To start off with a cliché: the main thread I picked up on throughout the day was AI. Are image recognition, AIOps, and ChatGPT really AI? I still can’t quite accept the term – but nonetheless, you know what I mean. And strikingly, this was AI not in some woolly futuristic sense, but AI in daily use today.
For instance, we had talks focused on multiple MAMs (Media Asset Management systems) that use AI to find semantic scenes in video archives. Fox Sports gave a great discussion of this with its custom MAM, followed by Newsbridge presenting the AI-driven MAM it runs for multiple customers.
By semantic AI, I mean searching for scenes such as “a home run being scored” or “someone smiling”. In this sense, AI has now moved well beyond mere facial recognition.
This form of AI clearly allows archives to be mined far more deeply than was previously possible, enabling better monetisation and the discovery of a more diverse set of usable clips.
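The idea behind semantic scene search can be sketched in a few lines. In practice a vision-language model would embed each clip and the text query into a shared vector space; here the embeddings are hand-made stand-in vectors, and the clip names and query are purely illustrative assumptions, not any vendor's actual API.

```python
import math

# Toy archive: each clip maps to an embedding vector. In a real MAM these
# vectors would come from a vision-language model run over the footage.
clips = {
    "home_run.mp4":  [0.9, 0.1, 0.0],
    "smile.mp4":     [0.1, 0.9, 0.1],
    "interview.mp4": [0.2, 0.3, 0.9],
}

def cosine(a, b):
    """Cosine similarity between two vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

def search(query_vec, top_k=1):
    """Rank archive clips by similarity to the query embedding."""
    ranked = sorted(clips, key=lambda name: cosine(clips[name], query_vec),
                    reverse=True)
    return ranked[:top_k]

# Stand-in embedding for the query "a home run being scored".
print(search([1.0, 0.0, 0.1]))  # → ['home_run.mp4']
```

The point is that the query is matched on meaning rather than on filenames or manually entered keywords, which is what lets these archives be searched so much more deeply.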
And, in reference to the writers’ strike, it was strongly argued that this use of AI is not about replacing jobs; it is a better search engine that lets us make better programming.
During the afternoon, my presentation focused on edge to core to cloud: data gravity will drive us to run applications where the data is, rather than moving data around to where the applications are.
At the edge, this brings economic benefits: video is processed locally, and AIOps moves only the required footage to on-prem or cloud systems. Of course, this needs to be managed without vendor lock-in, using open self-describing metadata, open APIs, and open IDs.
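A minimal sketch of that edge-first filtering idea, assuming a hypothetical local analysis step has already tagged each clip; the clip names, tags, and the relevance rule are illustrative assumptions, not a real product interface.

```python
# Edge-first triage: analyse clips locally, then move only the footage
# the local AI flags as relevant to core or cloud storage. Everything
# else stays (or ages out) at the edge.

# Tags assumed to have been produced by a local analysis pass (hypothetical).
clips = [
    {"name": "cam1_0900.mp4", "tags": ["crowd"]},
    {"name": "cam1_0905.mp4", "tags": ["home_run", "celebration"]},
    {"name": "cam2_0905.mp4", "tags": ["empty_stand"]},
]

# Illustrative policy: only action moments are worth moving off the edge.
RELEVANT = {"home_run", "goal", "celebration"}

def to_upload(clips):
    """Return the names of clips whose tags match the relevance policy."""
    return [c["name"] for c in clips if RELEVANT & set(c["tags"])]

print(to_upload(clips))  # → ['cam1_0905.mp4']
```

Only one of the three clips crosses the wire, which is where the economic benefit comes from; the policy itself would live in open, self-describing metadata rather than in any one vendor's system.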
A big shift is coming in where and how we keep and process our data, both to handle the explosion of video content being filmed and to take advantage of edge-first and cross-tier data management.
To conclude: it is always great to listen and learn, to network with friends old and new, and to be inspired to work on products that truly take things forward. We now eagerly await the shared insights and networking at IBC, NAB New York, and the next SVG event!
Click here to find out which upcoming events the Perifery team will be exhibiting at.