
How YACHT fed their old music to the machine and got a killer new album - Ars Technica


The dance punk band YACHT has always felt like a somewhat techy act since debuting in the early 2000s. They famously recorded instrumental versions of two earlier albums and made them available for artists under a Creative Commons license at the Free Music Archive. Post-Snowden, they wrote a song called “Party at the NSA” and donated proceeds to the EFF. One album cover of theirs could only be accessed via fax initially (sent through a Web app YACHT developed to ID the nearest fax to groups of fans; OfficeMax must’ve loved it). Singer Claire L. Evans literally wrote the book (Broad Band) on female pioneers of the Internet.

So when Evans showed up at Google I/O this summer, we knew she wasn’t merely making a marketing appearance à la Drake or the Foo Fighters. In a talk titled “Music and Machine Learning,” Evans instead walked a room full of developers through a pretty cool open secret that wouldn’t be public until this weekend: YACHT had spent the last three years writing a new album called Chain Tripping (out yesterday, August 30). And the process took a while because the band wanted to do it with what Evans called “a machine-learning generated composition process.”

“I know this isn’t the technical way to explain it, but this allowed us to find melodies hidden in between songs from our back catalog,” she said during her I/O talk. “Here’s what the user-facing side of the model looked like when we recorded the album last May—it’s a Colab Notebook, not the kind of thing musicians usually bring into the studio.”

A look at YACHT's work with the MusicVAE Colab Notebook.
YACHT / Google I/O 2019

YACHT had long been interested in AI and its potential applications in music. But the band tells Ars it wasn’t until recently, around 2016, that the concept of doing a full album this way seemed feasible. While research groups had long been experimenting with AI and machine learning to let computers generate music autonomously, the results felt more like science projects than albums suitable for DFA Records (home to labelmates like Hot Chip and LCD Soundsystem). Ultimately, a slow trickle of simplified consumer apps leveraging AI—face-swap apps felt huge around then, and Snapchat’s dynamic filters rose to prominence—finally gave the band the sense that now could be the time.

“We may be a very techy band, but none of us are coders,” Evans tells Ars. “We tend to approach stuff from the outside looking in and try to figure out how to manipulate and bend tools to our strange, specific purposes. AI seemed like an almost impossible thing; it was so much more advanced than anything we had dealt with… And we wanted to use this to not just technically achieve the goal of making music—so we could say, ‘Hey, an AI wrote this pop song’—rather, we wanted to use this tech to make YACHT music, to make music we identify with and feel comes from us.”

Bringing a Colab Notebook to a rock studio

Having the idea to use artificial intelligence to somehow make music was one thing; actually doing it proved to be something else entirely. The band started by surveying everything available: “We messed around with everything that was publicly available, some tools that were only privately available—we cold emailed every single person or entity or company working with AI and creativity,” as YACHT founder Jona Bechtolt puts it. But no single existing tool offered the combination of quality and ease of use the band had hoped for. So they ultimately decided to build their own system by borrowing bits and pieces from all over, leveraging their entire back catalog in the process.

“We knew we’d have to base everything on some kind of dataset, so early on, we thought, ‘What if we used our back catalog?’” Bechtolt says. “We naively thought it’d be something like Shazam, where we could throw raw audio at an algorithm. That isn’t really possible…”

“Or, at least, not within the realm of our computing capacity,” Evans interjects.

“So we had to notate all our songs in MIDI, which is a laborious process,” Bechtolt continues. “We have 82 songs in our back catalog, which is still not really enough to train a full model, but it was enough to work with the tools we had.”
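Notating songs this way reduces each track to plain symbolic note data, which is what a model like MusicVAE consumes instead of raw audio. Here is a minimal sketch of that representation (illustrative only; the band's actual MIDI files would also carry velocity, tempo, and channel information):

```python
import math

# A rough sketch (not YACHT's actual files) of what hand-notating a
# song in MIDI yields: symbolic note events instead of raw audio.
# Each note here is a (midi_pitch, start_beat, duration_beats) tuple.
riff = [
    (60, 0.0, 1.0),  # C4 on beat 1
    (62, 1.0, 1.0),  # D4 on beat 2
    (64, 2.0, 1.0),  # E4 on beat 3
    (67, 3.0, 1.0),  # G4 on beat 4
]

def length_in_bars(notes, beats_per_bar=4):
    """Total length of a note list, rounded up to whole bars."""
    end = max(start + dur for _, start, dur in notes)
    return math.ceil(end / beats_per_bar)

def transpose(notes, semitones):
    """Shift every pitch by a number of semitones; timing is untouched."""
    return [(pitch + semitones, start, dur) for pitch, start, dur in notes]

print(length_in_bars(riff))    # 1 (one bar of 4/4)
print(transpose(riff, 12)[0])  # (72, 0.0, 1.0): the riff an octave up
```

Working at this symbolic level is why the transcription chore was unavoidable: the models manipulate notes, not recordings.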

With that MIDI data, Bechtolt and longtime collaborator (bass and keyboards player) Rob Kieswetter started by identifying small segments—a particular guitar riff, a vocal melody, a drum pattern, anywhere from two bars to 16 bars—that could be looped, combined, and ultimately run through the band’s machine-learning workflow. The band relied heavily on Colab Notebooks in a Web browser—specifically, the MusicVAE model from Google’s Magenta team—manually inputting the data and then waiting (and waiting) for a fragment of output. And that generated fragment, of course, was nothing more than data: more MIDI information. Evans told the I/O crowd the band ran pairs of those loops through the Colab Notebook at different temperatures “dozens, if not hundreds of times to generate this massive body of melodic information” as source material for new songs. From there, it became the humans’ turn.
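The core trick in MusicVAE is interpolation in a learned latent space: encode two loops as vectors, sample points along the line between them, and decode each point back into notes, with a temperature setting controlling how adventurous the sampling gets. The toy sketch below shows only the shape of that workflow; the `encode` and `decode` functions here are stand-ins, not Magenta's trained neural network.

```python
import numpy as np

def encode(melody):
    # Stand-in "encoder": deterministically map a pitch sequence to a
    # 4-dimensional latent vector. (MusicVAE learns this mapping.)
    seed = abs(hash(tuple(melody))) % (2**32)
    return np.random.default_rng(seed).standard_normal(4)

def decode(z, temperature, rng):
    # Stand-in "decoder": turn a latent vector back into 8 MIDI pitches.
    # Higher temperature means noisier, more surprising output.
    logits = np.tile(z, 2)                      # stretch latent to 8 slots
    noise = rng.standard_normal(8) * temperature
    return (60 + np.round(4 * (logits + noise))).astype(int).tolist()

def interpolate(melody_a, melody_b, num_steps, temperature=0.5, seed=0):
    # Walk the straight line between the two latent vectors and decode
    # each intermediate point into a new melodic fragment.
    rng = np.random.default_rng(seed)
    za, zb = encode(melody_a), encode(melody_b)
    return [decode((1 - t) * za + t * zb, temperature, rng)
            for t in np.linspace(0.0, 1.0, num_steps)]

# Two 8-note loops standing in for fragments of two back-catalog songs.
loop_a = [60, 62, 64, 65, 67, 65, 64, 62]
loop_b = [72, 71, 69, 67, 69, 71, 72, 74]
new_ideas = interpolate(loop_a, loop_b, num_steps=5)
print(len(new_ideas), len(new_ideas[0]))  # 5 fragments of 8 notes each
```

Repeating a walk like this across many song pairs and temperatures is, in spirit, how a few dozen loops ballooned into the band's "massive body of melodic information."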

“It still couldn’t make a song just by pushing a button; it was not at all an easy or fun flow to work through,” Bechtolt says. “So after three days, we were like, ‘OK, I think we have enough stuff.’ By that point we had a few thousand clips between two and 16 bars, and we just had to call it quits at some point.”

“It wasn’t something where we fed something into a model, hit print, and had songs,” Evans adds. “We’d have to be involved. There’d have to be a human involved at every step of the process to ultimately make music… The larger structure, lyrics, the relationship between lyrics and structure—all of these other things are beyond the technology’s capacity, which is good.”


Evans demonstrates how, with MusicVAE, YACHT could take two old songs and generate some new ideas. Here, the band’s tracks “Holograms” and “I Wanna Fuck You ‘Til I’m Dead” generate a new melody that ends up on Chain Tripping.

So with their troves of data, the band hit the studio and started the process of interpreting the computer-generated information with instruments in hand. They also adopted one rule from the start of studio sessions: “We were very strict about not adding anything ourselves,” Bechtolt says. “We weren’t going to improvise or jam on top of algorithmic output; we were only to use things it generated.” In case of emergency, they could revisit their Colab Notebooks—if, say, none of the files had the right melody, the band could take parts of old songs to “finagle the algorithm to make this happen”—but artificial intelligence had to be a constant collaborator on the compositions.

“We usually go into the studio as human beings with a notebook full of lyrics, ideas, and a few riffs we’ve been thinking about. But to build up the repository of information we were working with... in two seconds we could have 10,000 words of lyrics. In a few minutes, we had hundreds of four-bar segments of MIDI data,” Evans says. “There’s so much stuff on the cutting room floor, cutting room hard drive if you will, that may go into future albums or just simply disappear into the multidimensional mathematical obscurity from which it emerged.”

In this light, Chain Tripping challenged YACHT to be less a traditional band at times and more a DJ or mash-up artist, taking existing bits of sound (er, computer-generated instructions for sounds) and combining them in creative ways. “It’s not unlike making hip hop music or DJing,” Bechtolt admits. “But instead of crate digging and finding samples on different records, it was all found in the latent space of our own music and output, which is really trippy to think about.”

The band also utilized AI when making the video for the album's first single, "Downtown Dancing."

But does it sound good or math-y?

AI-generated music has been done before: from researchers teaming up with orchestras a decade ago to YouTubers and startups more recently. But YACHT's Chain Tripping may represent the most high-profile, traditional album to be released after being fully composed with AI/ML.

Yet the first time anyone listens to Chain Tripping, especially fans of the band’s prior work, I think they’d be hard-pressed to tell anything unusual happened behind the scenes. Bass riffs on “Downtown Dancing” or “DEATH” will make you move like the best of the DFA Records catalog. You leave singing to yourself “California Dali, the mirror is melting” or “I’m… I’m… I’m… I’m gonna have… sa-ad money” (on a song called “Sad Money.” Nice, computers). It frankly sounds like it belongs in YACHT’s discography, and you could be fooled into thinking some of these tracks sneaked into live sets back in Hamilton, New York, circa 2010.

The new record even surprises the band at times. Bechtolt points to an instrumental interlude in the song “SCATTERHEAD” that starts around the 1:30 mark. A bass line pounds, but then some glitchy synth notes interject and linger. The passage culminates with what sounds like a ’90s software version of a marimba ripping a laid-back solo that eschews the rest of the arrangement’s pace in a cool, syncopated fashion. Bechtolt says when the band first listened to that idea as a MIDI file, they just left it on a loop for a half hour. “It was such a cool thing, something all three of us definitely loved,” Bechtolt says. “But I don’t know if we could’ve written it ourselves. It took a risk maybe we aren’t willing to take when we’re writing a pop song, and it ended up in a place that’s really interesting and beautiful.”

“The same thing happened a lot with the lyrics, too,” Evans says. At I/O, she laid out the AI/ML process here as well. Technologist Ross Goodwin worked with the band to create a lyric model similar to the melodic one, trained on a corpus of about two million words drawn from the band’s own lyrics and from acts YACHT admires, draws inspiration from, or considers peers.
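Goodwin's model was a neural text generator, but the underlying idea—emit new text that only recombines patterns observed in a training corpus—can be sketched with something as simple as a bigram Markov chain. The tiny corpus below is invented for illustration:

```python
import random

# A deliberately tiny stand-in for the band's lyric model (which was a
# neural network, not a Markov chain): record which word follows which
# in a corpus, then walk those transitions to generate new lines.
corpus = (
    "the mirror is melting and the light is loud "
    "the light is a mirror and the party is loud"
).split()

chain = {}
for a, b in zip(corpus, corpus[1:]):
    chain.setdefault(a, []).append(b)

def generate(start, n_words, seed=0):
    """Walk the bigram chain from a start word, up to n_words long."""
    rng = random.Random(seed)
    words = [start]
    for _ in range(n_words - 1):
        options = chain.get(words[-1])
        if not options:  # dead end: no observed successor
            break
        words.append(rng.choice(options))
    return " ".join(words)

line = generate("the", 8)
print(line)
```

Every word in the output comes from the corpus, yet the sequences are new—sitting, as Evans puts it later, just outside what a human would write.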

YACHT wanted to visualize the AI's lyric output, so they printed it all out on a dot matrix printer. Evans brought the printout to the studio with her and highlighted particularly interesting phrases to use.
YACHT / Google I/O 2019

Vocally, YACHT has always been playful—the band’s past lyrics have been built around the old “see a penny, pick it up” kids’ rhyme, and crowds routinely shout back “Ai-ai-ai-a” at them—and Chain Tripping wouldn’t seem totally out of character to unsuspecting listeners. It has the shoutable, hooky choruses and lyrics that seem thoughtful despite their explicit simplicity and vagueness. Then you read the album’s lyric sheet, and, well, consider the beginning of “Loud Light.”

“These tools, because they’re failing at doing exactly what they’re supposed to do, they tend to fall just outside the boundary of what’s an acceptable, normal, or traditional melody,” Evans says. “It’s just on the other side of the fence, close enough that we can see it and say, ‘That’s a sentence, that’s a melody.’ It’s not completely formless, it’s just outside of what humans would do. And that space between meaning and meaninglessness and what’s expected and what’s out of bounds is such an interesting space.”

Despite all the extra effort, the band definitely seems happy with how Chain Tripping turned out. Evans says she hopes the album can demystify the AI/ML concepts many people have grown leery of thanks to sci-fi stories or just real news (“Tech doesn’t have to be used for terrifying purposes; it can also be used to make beauty—that’s a really meaningful thing”). And musically, the press notes for the album quote the group calling this music “more YACHT than YACHT.”

At the very least, the process has made the band more interested in experimenting with AI and ML going forward, though doing everything this exact way again may not be possible. After all, AI/ML as an industry continues to advance rapidly, becoming both more user friendly and more polished. For data scientists and engineers, that’s a great thing. But as Chain Tripping shows, there’s something special about the organic imperfection of AI/ML at this particular point in time.

“This tech is evolving so quickly that by the time this record is maybe a year old, a lot of the stuff we appreciated for its wonkiness and strangeness will be so polished. These tools will produce melodies and lyrics that are unrecognizable as machine generated; they’ll be ‘perfect,’” Evans says. “Something will be really lost, because when it’s not perfect there’s this specialness unique to our moment in time. We’re going to be nostalgic. People will come back to this slightly undefinable wonky aesthetic that AI has—not just in music, but in images and text—and it’ll become the new analogue. In the way musicians fetishize lo-fi, writers use typewriters, or we buy vinyl instead of just streaming, all these nostalgic feelings will be associated with this moment in time for AI.”




2019-08-31 12:00:00Z
