I'm going to be working on a project on autonomous Tidal code generation over the coming months, and thought I'd start up a discussion in case anyone's been working on anything similar recently.
Some other exciting works that I’ve been pointed towards recently are:
Simon Hickinbotham and Susan Stepney from the University of York created an evolutionary algorithm for the extramuros platform.
Another relevant paper appeared in the most recent ICLC proceedings.
They've also released downloadable packages that write code to JSON for training their networks, available for Atom and Sublime Text (the package is jensaarai 0.2.1).
If anyone else has been working on something similar, feel free to share here!
Will hopefully be able to share some of the outcomes of the project here over the next few weeks.
This version of the Daem0n piece had crude Tidal generation (modulated by equally crude machine listening and a mapping layer pasted together with string and glue...): https://www.youtube.com/watch?v=EcRtmWUdnx4
Hi, if you are looking for ways to generate expressions in a formal grammar using a neural network, I can recommend looking into the transition system introduced by the TranX semantic parsing framework, https://github.com/pcyin/tranX. I am using this approach currently to predict SQL from natural language questions.
In TranX, grammars are declared using ASDL. Term expressions are converted to sequences of grammar production actions that build the recursive term structure from root to leaves and from left to right. This conversion is necessary so that generic sequential decoding models can be used. In the original TranX approach, a recurrent LSTM stack is used, but in principle this method can also be used with transformer decoders or language models.
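As a toy illustration of that root-to-leaves, left-to-right linearisation (the tuple encoding and action names here are my own sketch, not TranX's actual API):

```python
# Sketch of the TranX idea: an AST is flattened into a sequence of grammar
# actions (ApplyRule for constructors, GenToken for terminal leaves) by a
# pre-order, left-to-right traversal. A decoder then only has to predict
# this flat action sequence.

def linearise(node):
    """Turn a nested (constructor, children...) tuple into an action list."""
    if isinstance(node, tuple):            # interior node: apply a production
        head, *children = node
        actions = [("ApplyRule", head)]
        for child in children:
            actions.extend(linearise(child))
        return actions
    return [("GenToken", node)]            # leaf: emit a terminal token

# A toy Tidal-like expression: fast 2 (sound "bd sn")
ast = ("App", ("App", ("Var", "fast"), ("Lit", "2")),
              ("App", ("Var", "sound"), ("Lit", '"bd sn"')))
print(linearise(ast))
```

Decoding runs the same process in reverse: a model predicts one action per step, and the action sequence deterministically reconstructs the term.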
A while back I created an Atom package that generates (and plays) Tidal code from Tracery grammars. This package hasn't been maintained in some time and I believe some users are having trouble getting it to run, but perhaps the implementation is valuable:
And here is a video recording of it in action:
I don't really have plans to enhance it. License is MIT, so feel free to fork/use/enhance!
Paper on reinforcement learning, with an accompanying Python module and SuperCollider quark, from the Tidal Club discussion: "RaveForce: A Deep Reinforcement Learning Environment for Music"
The RaveForce code seems super interesting. Definitely worth trying out - I think it'll likely get you some interesting timbres and SynthDefs as it works its way through the gradient.
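The training loop such an environment plugs into is the standard reset/step cycle. A generic sketch, with a dummy environment standing in for RaveForce's actual SuperCollider-backed one (the exact gym-style interface is an assumption here, not RaveForce's documented API):

```python
# Generic episode loop for a gym-style environment. RaveForce's real
# environment renders audio in SuperCollider and scores it; DummySynthEnv
# stands in so the loop is runnable: reward peaks when the action is 0.5.

class DummySynthEnv:
    def reset(self):
        self.t = 0
        return self.t                       # observation: current step index

    def step(self, action):
        self.t += 1
        reward = -abs(action - 0.5)         # closer to 0.5 => better "sound"
        done = self.t >= 8                  # fixed-length episode
        return self.t, reward, done

def run_episode(env, policy):
    obs, total, done = env.reset(), 0.0, False
    while not done:
        obs, reward, done = env.step(policy(obs))
        total += reward
    return total

env = DummySynthEnv()
print(run_episode(env, lambda obs: 0.5))    # the optimal policy scores 0.0
```

A learning agent would replace the fixed lambda with a policy updated from the collected rewards; the audio-rendering environment itself doesn't change.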
As an adjacent topic that I brought up on another thread - has anyone had any luck getting Haskell code sent to Tidal outside of passing it to GHCi via a text buffer (e.g. as text in an OSC message)? From my investigation, it seems like this is going to require some work in Haskell, but the "hint" library (https://hackage.haskell.org/package/hint) may be the right jumping-off point?
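The sending side of that idea is straightforward to sketch: pack a line of Tidal code as the single string argument of an OSC message. The address "/eval" and the existence of a hint-based Haskell listener on the other end are assumptions - Tidal itself doesn't ship one - but the wire format is just standard OSC:

```python
import socket

# Hand-rolled OSC message (stdlib only) whose payload is a line of Tidal
# code. A hypothetical hint-based listener would receive this and hand the
# string to the Haskell interpreter.

def osc_pad(b: bytes) -> bytes:
    """OSC strings are NUL-terminated and padded to a 4-byte boundary."""
    b += b"\x00"
    return b + b"\x00" * (-len(b) % 4)

def osc_message(address: str, text: str) -> bytes:
    return (osc_pad(address.encode())      # address pattern
            + osc_pad(b",s")               # type tag string: one string arg
            + osc_pad(text.encode()))      # the code itself, as the payload

packet = osc_message("/eval", 'd1 $ sound "bd sn"')
# To actually send it (port 57120 is just an example):
# sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
# sock.sendto(packet, ("127.0.0.1", 57120))
print(len(packet))
```

The hard part remains the receiving side in Haskell, which is where hint's ability to interpret strings at runtime would come in.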