With AI technology transforming industries across the board, the music industry is no exception. Markus and his team at Cyanite have been working on an AI-powered music transformer model that promises to change the way we listen to and search for music, delivering the right music content for any use case. In this interview, we dive deep into the fascinating world of AI in music, discussing everything from the technology behind Cyanite’s music transformer model to the impact it will have on the music industry.
So, without further ado, let’s dive right in!
Markus, can you tell us about your background and how you got involved in the music tech industry?
I played in bands pretty much throughout my life and tried to make it as a musician while still in high school. Since I was clearly not good enough, I decided to go into the business side of things, where I worked at labels and in PR/promotion.
What inspired you to start Cyanite, and what problem does it aim to solve?
We saw big tech companies developing all these crazy recommendation algorithms for music consumers. But music selection in advertising and marketing was still rather archaic, with people clicking through track after track in search of a suitable one. When we looked into the market, we discovered that no technology was available to solve that sufficiently, so we built it ourselves.
Furthermore, there is a lot of uncertainty involved in music decisions when so many opinions on music taste are thrown around. You rarely end up with the best possible option for your purpose, and it takes ages to get there. We bring clarity and objectivity into that process by delivering easy-to-interpret data on, among other things, the emotional effect and brand fit of music.
How do you ensure that Cyanite’s AI models are unbiased and free from algorithmic biases?
That is a really important question for us. In fact, our Chief of Data, Roman, is driving the conversation internally on how to best mitigate biases both from the data and from the creators (us). The algorithms are only as good as the input data, so we try to get data from around the globe and not just the West. The fact that we have customers from Korea, Japan, China, Dubai, South Africa, Colombia, Kazakhstan, and Brazil speaks to this work and is a big success for us.
How does Cyanite partner with music industry stakeholders such as record labels, streaming services, and artists?
As mentioned, most recommendation algorithms were built for music consumption in B2C. They’re not necessarily great for B2B use cases, as they’re not targeted enough and don’t deliver any data insights on the music, such as its emotional effect. Our algorithms are specifically tailored to the language of advertisers, marketers, and music supervisors. When a music company or rights holder wants to make it easier for the demand side to find their music and make an informed music decision, they come to us. We run their entire catalog through our engine and turn it into a kind of B2B Spotify. Afterward, they can answer music briefs more easily and back up their decisions with our detailed data insights.
Read More @ https://ai-techpark.com/aitech-interview-with-markus-schwarzer/