Play Public Instance
Information about plays
list: List of plays
retrieve: Information about a specific play by ID
GET /v2/plays/350503/?format=api
{
    "id": 350503,
    "uri": "https://api.kexp.org/v2/plays/350503/?format=api",
    "airdate": "2019-07-05T15:43:36-07:00",
    "show": 5836,
    "show_uri": "https://api.kexp.org/v2/shows/5836/?format=api",
    "image_uri": "",
    "thumbnail_uri": "",
    "song": "(Downtown) Dancing",
    "track_id": "44f94d50-d5ed-4f61-86f0-37a8e4e36d57",
    "recording_id": null,
    "artist": "YACHT",
    "artist_ids": [
        "f6a0a03f-3b2f-4d3a-b4b4-34ca4ce7f3cb"
    ],
    "album": "(Downtown) Dancing",
    "release_id": "d44a3934-e547-40f1-8cb9-95af4659fdac",
    "release_group_id": null,
    "labels": [
        "DFA Records"
    ],
    "label_ids": [
        "32d02635-98fc-4405-94e1-e5b06f9d2025"
    ],
    "release_date": "2019-06-21",
    "rotation_status": "Light",
    "is_local": true,
    "is_request": false,
    "is_live": false,
    "comment": "For “(Downtown) Dancing,” the three-person group—with vocalist Claire Evans and now including former touring member Rob “Bobby Birdman” Kieswetter—used machine learning tools like Google Magenta’s Music VAE model and NSynth hardware synthesizer to write the lyrics, compose the melodies, and assist with the sound design. https://bit.ly/2xy0cxS<br/><br/>\n\n “For us, we decided that every single song that we were going to create with this process had to be interpolated from existing melodies from our back catalog,” Evans said in the talk. “We hoped that this would result in songs that had that indefinable YACHT feeling, which we don’t know how to quantify and I don’t think the model can, either.” Converting their entire 82-song catalog into short MIDI segments—files that function as a sort of digital sheet music, readable by synthesizers and computers—the trio started with Magenta’s MusicVAE model to generate new melodies based on patterns found in the input data. Evans and her YACHT bandmates sculpted the raw output into proper songs, aiming to flesh out and refine the neural network’s idiosyncratic output without masking its involvement entirely. https://bit.ly/2xy0cxS\n\n“It was about really trying to create a sort of structure around this process that would keep us from being overwhelmed, and would also allow us to enact our will and our taste on top of all that,” Evans tells me after the conference. “There’s a lot of tools that can help you generate fragmentary things like melodies or text, but we’re not quite at a point now with machine learning where it can help us make structured pop songs.” https://bit.ly/2EQdQ3i",
    "location": 1,
    "location_name": "Default",
    "play_type": "trackplay"
}
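As a minimal sketch of calling the retrieve endpoint above, the snippet below builds the play URL and fetches the record with only the Python standard library. The helper names (`play_url`, `fetch_play`) are illustrative, not part of the KEXP API; the `?format=api` query string shown above selects the browsable HTML view, so it is omitted here on the assumption that the bare URL returns JSON.

```python
import json
from urllib.request import urlopen

BASE = "https://api.kexp.org/v2"

def play_url(play_id: int) -> str:
    # Build the retrieve URL for a single play record,
    # e.g. https://api.kexp.org/v2/plays/350503/
    return f"{BASE}/plays/{play_id}/"

def fetch_play(play_id: int) -> dict:
    # Retrieve and decode one play record; requires network access.
    with urlopen(play_url(play_id)) as resp:
        return json.load(resp)
```

For the record shown above, `fetch_play(350503)` would yield a dict with fields such as `"artist"` and `"song"`; the list endpoint (`/v2/plays/`) returns a collection of the same objects.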