Google wants to legitimize AI music. It could change the world of media.

Google is said to be seeking to develop a business model for AI-generated music based on the work of existing musicians. If successful, it could disrupt the media industry by paving the way for the widespread use of AI content.

Alphabet (ticker: GOOGL), Google’s parent company, is in discussions with Universal Music Group (UMG.Netherlands) to develop a tool that lets users create AI-generated music. The Financial Times, citing people familiar with the talks, reported that the tool would generate music based on existing melodies and sounds while also paying the copyright holders of the original songs.

Google and Universal Music did not immediately respond to requests for comment from Barron’s early Wednesday.

The talks underscore the potential need for AI companies to find partners in the media industry, given artists’ mistrust of their intentions. The ongoing strike by Hollywood actors and writers revolves in part around demands for guarantees that they will not lose control of their digital likenesses or be replaced by content generated by artificial intelligence.

“We believe that UMG, as the world’s largest music group, is well placed to protect the rights of its artists and to act promptly in adapting to artificial intelligence,” analysts at Deutsche Bank wrote in a research note.


Analysts expect generative AI to reach a wide audience in 2024, but note that it will take much longer for any meaningful cost offsets to materialize. Other industries, however, could be affected sooner.

“New technology is not constrained by resources in the same way that actors and musicians are. This will likely affect a range of sectors, such as gaming, telecom, music and news,” the Deutsche Bank analysts wrote.


Google was one of seven companies that met with the Biden administration in July and pledged to control AI risks, including by developing mechanisms, such as a system of watermarks, that would let people know when content has been generated by AI. However, the companies did not commit to disclosing the data used to train their AI systems, a major point of contention for artists who say their work has been used to develop such models.


Write to Adam Clark at [email protected]
