Autonomous Tidal Code Generation

Hi all!

I'm going to be working on a project on autonomous Tidal code generation over the coming months, and thought I'd start up a discussion in case anyone's been working on anything similar recently.

Some other exciting work I've been pointed towards recently:

  1. Simon Hickinbotham and Susan Stepney from the University of York created an evolutionary algorithm for the extramuros platform.

http://eprints.whiterose.ac.uk/113747/1/evomusart16.pdf

As part of this, they’ve written a grammar for Tidal patterns, which seems to work quite well: https://github.com/franticspider/antlr4tidal

  2. Jeremy Stewart, Shawn Lawson, Mike Hodnick and Ben Gold have been working on “Cibo”, a neural network autonomous code generator.

https://iclc.toplap.org/2019/papers/paper101.pdf

Another paper of theirs also came out in the most recent ICLC proceedings.

They’ve also released downloadable packages for writing code to JSON to train their networks, available for Atom and Sublime Text (the package is jensaarai 0.2.1).

If anyone else has been working on something similar, feel free to share here!

Will hopefully be able to share some of the outcomes of the project here over the next few weeks.

18 Likes

Also if anyone is interested, here’s a draft of a context-free grammar for Tidal generation that I started writing as a side project recently.

https://gist.github.com/lwlsn/c57a2a1c2d77430d1e02b305d18b66c6

Hopefully I can share some code for this soon, once it’s a bit more functional.
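For anyone curious about the basic idea, here's a minimal sketch of random expansion over a toy context-free grammar. The grammar below is an illustrative stand-in, not the one in the linked gist; the non-terminal names and alternatives are made up for the example:

```python
import random

# Toy grammar: each non-terminal maps to a list of alternatives;
# anything not in the table is treated as a terminal string.
# The LAST alternative for each symbol is always terminating, so we
# can fall back to it when the expansion gets too deep.
GRAMMAR = {
    "pattern":   [["transform", " $ ", "pattern"], ["source"]],
    "source":    [['s "', "samples", '"']],
    "samples":   [["sample", " ", "samples"], ["sample"]],
    "sample":    [["bd"], ["sn"], ["hh"], ["arpy"]],
    "transform": [["rev"], ["fast 2"], ["slow 2"]],
}

def expand(symbol, rng, depth=0, max_depth=6):
    """Recursively expand a symbol, forcing termination past max_depth."""
    if symbol not in GRAMMAR:
        return symbol  # terminal: emit as-is
    alts = GRAMMAR[symbol]
    if depth >= max_depth:
        alts = [alts[-1]]  # only the terminating alternative
    alt = rng.choice(alts)
    return "".join(expand(s, rng, depth + 1, max_depth) for s in alt)

rng = random.Random(0)
print(expand("pattern", rng))  # prints a random Tidal-ish pattern string
```

Every expansion bottoms out in a quoted sample sequence, so the output is always a syntactically plausible `d1`-ready expression fragment.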

3 Likes

This version of the Daem0n piece had crude Tidal generation (modulated by equally crude machine listening and a mapping layer pasted together with string and glue...): https://www.youtube.com/watch?v=EcRtmWUdnx4

2 Likes

Also, the tidal-parse module (aka MiniTidal in Estuary) has a detailed Tidal grammar (expressed in the form of parser combinators).

1 Like

Very exciting @lwlsn!

Probably worth mentioning Charlie and co's Tidal PEG: https://github.com/gibber-cc/tidal.pegjs

1 Like

Check out the work done by https://twitter.com/kakuya_sris

He has had a system doing this for Tidal for a few years now.

3 Likes

Oh this is v cool, thanks!

will check it out, thanks renick!

1 Like

Maybe not Tidal per se, but possibly of interest still:

William Fields and Tom Hall live stream and breakdown tomorrow

https://club.tidalcycles.org/t/live-stream-announcements/317/12?u=cleary

1 Like

Hi, if you are looking for ways to generate expressions in a formal grammar using a neural network, I can recommend looking into the transition system introduced by the TranX semantic parsing framework, https://github.com/pcyin/tranX. I am using this approach currently to predict SQL from natural language questions.

In TranX, grammars are declared using ASDL. Term expressions are converted to sequences of grammar production actions that build the recursive term structure from root to leaves and from left to right. This conversion is necessary so that generic sequential decoding models can be used. In the original TranX approach, a recurrent LSTM stack is used, but in principle this method can also be used with transformer decoders or language models.
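To make the linearisation step concrete, here's a tiny sketch of turning a nested term into a left-to-right action sequence, in the spirit of TranX's ApplyRule/GenToken actions. The constructor names and the term representing a Tidal expression are illustrative, not the real TranX API or grammar:

```python
# A term is either a tuple (constructor, child, child, ...) or a leaf token.
# Depth-first, left-to-right traversal emits one action per node, so a
# sequential decoder can predict the structure one action at a time.
def term_to_actions(term):
    if isinstance(term, tuple):
        head, *children = term
        actions = [("ApplyRule", head)]
        for child in children:
            actions += term_to_actions(child)
        return actions
    return [("GenToken", term)]  # leaf: emit a token action

# e.g. the Tidal expression  fast 2 $ s "bd sn"  as an (invented) term:
term = ("Apply", ("Apply", "fast", "2"), ("Sound", "bd sn"))
for action in term_to_actions(term):
    print(action)
```

Decoding runs the process in reverse: the model predicts the action sequence, and the actions deterministically rebuild the term from root to leaves.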

2 Likes

super interesting, thanks :)

A while back I created an Atom package that generates (and plays) Tidal code from Tracery grammars. This package hasn't been maintained in some time and I believe some users are having trouble getting it to run, but perhaps the implementation is valuable:

And here is a video recording of it in action:

I don't really have plans to enhance it. License is MIT, so feel free to fork/use/enhance!

7 Likes

Paper on reinforcement learning, with a Python module and SuperCollider quark, from the Tidal Club discussion: "RaveForce: A Deep Reinforcement Learning Environment for Music"

https://www.duo.uio.no/handle/10852/73776

2 Likes

Definitely worth trying: it works as an SC class. It's still a proof of concept, but there's a lot of room for improvement.

The RaveForce code seems super interesting. Definitely worth trying out - I think it'll likely get you some interesting timbres and synthdefs as it works its way through the gradient.

As an adjacent topic that I brought up on another thread: has anyone had any luck getting Haskell code sent to Tidal other than by passing it to GHCi via a text buffer (e.g. as text in an OSC message)? From my investigation, it seems this is going to require some work in Haskell, but the "hint" library (https://hackage.haskell.org/package/hint) may be the right jumping-off point?
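On the sending side, the OSC wire format is simple enough to hand-roll. Here's a minimal sketch that encodes a Tidal expression as a single-string OSC message and fires it at a hypothetical listener; the `/code` address and port 6010 are assumptions for illustration (Tidal itself doesn't expose this endpoint, so a hint-based receiver would still need to be written in Haskell):

```python
import socket

def osc_string(s):
    """Encode a string per OSC: UTF-8, null-terminated, padded to 4 bytes."""
    b = s.encode("utf-8") + b"\x00"
    return b + b"\x00" * (-len(b) % 4)

def osc_message(address, text):
    """Build an OSC message with a single string argument (type tag ',s')."""
    return osc_string(address) + osc_string(",s") + osc_string(text)

# Hypothetical receiver address/port -- not a real Tidal endpoint.
msg = osc_message("/code", 'd1 $ s "bd sn"')
sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.sendto(msg, ("127.0.0.1", 6010))
```

The receiving end would parse the string argument out of the datagram and hand it to hint's interpreter to evaluate in the Tidal context.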

Had a go at running Tidal through GPT-2 today and got some cool new patterns to try.

3 Likes

Very cool! How did you use the pretrained model? Do you have code to share for your fine-tuning process?