AIMS and the AI-Powered Quest to Solve 'Search for Music'
A closer look at the company transforming how we find and explore music
AIMS is a global company specializing in AI-powered music search, discovery and curation solutions.
Founded in 2019, it was built by people who spent years in the industry searching, tagging and licensing, and who got tired of wasting time.
Designed for seamless integration by music libraries, publishers, record labels, and TV, film and media companies, AIMS has grown rapidly since its launch and now works with some of the world’s biggest music and media companies, including APM Music, Universal Production Music, Warner Chappell Production Music, Partisan Records, Hipgnosis, Extreme Music, BMG Production Music, The Nerve, SATV and many others.
AI and music are forming an increasingly potent alliance within the industry. The talented AIMS team is positioned smack dab in the middle of all of the excitement and innovation — benefiting the music and creative industries with an impressive array of AI-powered solutions.
We had the pleasure of connecting with Einar Helde, AIMS Co-founder and CCO, for a delightful conversation about all things music and technology, specifically AI.
What was the jumping-off point to build this platform for AI-powered search tools? There are a lot of exciting and interesting components of the music business… Why the focus on search and discovery?
Martin [Nedved] and I come from the music industry, specifically sync. We’ve done the hands-on work of searching, licensing and even tagging, and we’ve led companies that relied on doing it well.
So we literally built AIMS for ourselves.
Really, AIMS was born out of frustration with menial work taking up the majority of our time. Most briefs include references or very specific instructions, so we (well, mainly our co-founder Viktor Parma and his team, whose AI expertise is the real genius here) solved those two problems with our Similarity Search, which works with any link or file reference, and with both Prompt Search and Lyrics Search, which understand a user’s own words and require zero keywords or musical expertise.
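For readers curious how reference-based similarity search of this kind generally works, here is a minimal, purely illustrative sketch in Python: it assumes each track has already been turned into a fixed-length audio embedding by some model, and it simply ranks a toy catalog by cosine similarity to a reference embedding. The catalog, embedding size and function names are hypothetical, not AIMS’s actual implementation.

```python
# Illustrative sketch only: reference-based similarity search over a catalog.
# Assumes every track has already been converted into a fixed-length audio
# embedding by some model; the embeddings and track names below are made up.
import numpy as np


def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity between two embedding vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))


# Toy catalog: track title -> embedding. A real system would hold thousands
# of tracks and keep the embeddings in a vector index, not a Python dict.
rng = np.random.default_rng(seed=42)
catalog = {f"track_{i:03d}": rng.normal(size=128) for i in range(500)}


def find_similar(reference_embedding: np.ndarray, top_k: int = 5):
    """Rank catalog tracks by cosine similarity to the reference embedding."""
    scores = [
        (title, cosine_similarity(reference_embedding, emb))
        for title, emb in catalog.items()
    ]
    return sorted(scores, key=lambda pair: pair[1], reverse=True)[:top_k]


# The user drops in a reference link or file; its embedding stands in here.
reference = rng.normal(size=128)
for title, score in find_similar(reference):
    print(f"{title}\t{score:.3f}")
```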
The goal was speed, quality and ease of use. Getting it exactly right took years. But once it was ready, it worked so well that we saw interest from our friends and partners in our industry, and AIMS made its way to all the biggest production music libraries in the world.
Now we’re also working with record labels, publishers, broadcasters… everyone really.
Our focus has always been to help music professionals find the right track. We keep uncovering new ways to do that, but we stay focused on what we’re experts in — and that’s search. We want to be THE search for music, which is why we allow anyone to try our technology with their own catalog so they can actually feel the difference.
The use of artificial intelligence to power music search is now a vital element for playlist curators and music supervisors. You already have a terrific collection of tools and technology — are you working on updates to existing solutions, perhaps looking at new products to develop and launch in 2025?
First of all, appreciate the compliment!
We always have a lot of ideas and POCs, but we try to focus on what brings the most value for our clients. That’s why we released the natural language-based Prompt Search, followed by advanced Lyrics Search last year; people who know how reliable AIMS technology is were asking us for these solutions.
Whether it’s music supervisors, playlist curators, sync professionals, labels or anyone in-between, we listen to feedback very carefully because they’re important end users of our products.
Right now, we’re working on new methods of search; for example, searching through video. There are also UI and research developments around new applications and improvements of Prompt Search, because we believe it can be used not just for better search but also for less obvious use cases, such as curation, as you mention.
Whether it’s an update or a new product, we only release it to the market once we think it’s amazing. We have some really interesting experiments, but it’s too early to talk about them. Although, with our research team, even we’re surprised at what they can achieve, and how quickly.
Last year you partnered with AudioShake to launch AI-powered lyric transcription and search for sync licensing. How has that partnership developed as a powerful opportunity for music companies to search entire catalogs using sound and lyrics simultaneously to find tracks in minutes instead of hours?
We created Lyrics Search because a lot of high-value syncs are based on lyrical themes and topics, and it’s very hard to search for that by just using actual words in the lyrics. So our search looks at the sentiment of songs and understands the real meaning of them, in addition to finding exact words.
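To make that distinction concrete, here is a small, purely illustrative sketch of exact keyword matching versus semantic, embedding-based matching over lyrics. The lyric snippets, query and model name are hypothetical examples of the general technique, not AIMS’s or AudioShake’s actual pipeline.

```python
# Illustrative sketch only: keyword matching vs. semantic lyric matching.
# Uses a generic sentence-embedding model; the lyrics and query are made up.
from sentence_transformers import SentenceTransformer, util

model = SentenceTransformer("all-MiniLM-L6-v2")

lyrics = {
    "track_a": "packing up the car, leaving this town behind for good",
    "track_b": "dancing under neon lights until the morning comes",
    "track_c": "miles of open highway and nothing but the radio",
}

query = "songs about starting a new life somewhere else"

# Exact keyword matching finds nothing, because the phrase never appears...
keyword_hits = [title for title, text in lyrics.items() if "new life" in text]
print("keyword hits:", keyword_hits)

# ...while embedding similarity can score lyrics by how close their meaning
# is to the brief, even when they share no words with it.
query_emb = model.encode(query, convert_to_tensor=True)
for title, text in lyrics.items():
    score = util.cos_sim(query_emb, model.encode(text, convert_to_tensor=True))
    print(f"{title}: {float(score):.3f}")
```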
Once we had that, we realized that a lot of clients for this product don’t have all their lyrics in the right format — or don’t have lyrics at all. We partnered with AudioShake because of their award-winning AI lyric transcription, which is the perfect match for what we’re doing with search.
We’d been really impressed with their technology for a long time, so it was great to finally have an opportunity to work with Jessica (CEO) and her team.
Since its release, Lyrics Search has been well-received for enabling something that was literally impossible before. We really think this is the way creatives want to search for music in the future.
How do you see the music search & discovery space changing in the next five years? It would seem like we’re still in the early stages of what can be achieved as the music business and music technology both change rapidly.
In the past, people had to spend most of their time figuring out how to find tracks in their catalog rather than being creative. Instead of exploring what kind of music would really enhance a scene, they were trying to guess what tags to use to find something, almost randomly browsing their catalog.
With the tools we have and are building, music professionals are able to experiment more and more, really fast, with many different ideas — and ultimately offer diverse selections of music.
We’re going from, “What do the system and metadata allow me to find?” to the search being an invisible enabler. I think in the next five years, our search will become so good that people will actually forget it even exists.
How has AIMS already played a role in user behavior, from sync professionals to clients exploring various catalogs and libraries?
Since we launched in 2019, our core products have become the standard in several corners of the music industry. When you go to various music platforms now, you see a huge improvement compared to a few years ago; AIMS is often the technology powering those searches.
In this way, our products have effectively changed user behavior. Expectations are higher, and users have a lot less patience with searches that aren’t as fast or don’t immediately give relevant results.
We’ve managed to stay on the forefront of how this technology has evolved because we specialize in search. It’s the only thing we do, so we have all the resources to create the best search possible, much more than companies doing it on the side.
That’s not a criticism, it’s just a fact. If you focus solely on one issue, you can move forward significantly faster.
The merging of music and technology continues to move swiftly. Where do you see this powerful and important connection moving forward? Will AI have fully engulfed the nexus of music-making and music-consuming, or will AI become simply one of many key tools a professional or fan can access to help navigate the music+tech ecosystem?
Technology is already an important factor in music, and I definitely see the two merging more and more. The industry has to use tech to move forward; it’s always been like that. But for me, AI is a supporting tool for human creativity — and it should stay that way.
Using technology to reach your audiences is a great thing because that’s very difficult to do right now. Instead of trying to create something that fits an audience, the artist can be their own authentic self — and tech will allow them to find out who’s actually interested in their music. That’s a strong positive.
I don’t see generative music as bringing much value to the music industry.
For ideation, it can be interesting… as a supplement to some parts of creativity, maybe. But for me, music should be made by humans, for humans.
I hope AI will become a tool for fans and professionals to engage with music they will love.
What’s on the horizon for AIMS as we embark on a new year?
We want to solve search for music. It sounds broad and maybe even vague, but that’s our goal: to help anyone who needs to find music discover the right music.
That’s on our horizon this year but will probably be ongoing for a longer time.
To be a bit more specific, we’re growing our current partnerships and starting some really interesting new ones across music and tech. In terms of products, we’re looking into more recommendation and discovery tools this year — which is where our clients and team see room for simplifying clunky processes.
Again, we’re not ones to publicly introduce new things before they’re absolutely ready. But if you reach out, I can tell you more about what’s almost at the finish line.
Or keep an eye out via LinkedIn and our monthly newsletter for official announcements.