Hi, if you are looking for ways to generate expressions in a formal grammar using a neural network, I can recommend looking into the transition system introduced by the TranX semantic parsing framework, https://github.com/pcyin/tranX. I am currently using this approach to predict SQL from natural language questions.
In TranX, grammars are declared using ASDL. Term expressions are converted to sequences of grammar production actions that build the recursive term structure from root to leaves and from left to right. This conversion is necessary so that generic sequential decoding models can be used. In the original TranX approach, a recurrent LSTM stack is used, but in principle this method can also be used with transformer decoders or language models.
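To give a feel for the idea (this is a minimal sketch, not TranX's actual API), the root-to-leaves, left-to-right conversion amounts to a pre-order traversal that emits an "apply rule" action for each constructor and a "generate token" action for each primitive leaf. The `Term` class and the mini-SQL example below are made up for illustration:

```python
# Minimal sketch of a TranX-style linearization (NOT the actual TranX API):
# each constructor application becomes an ApplyRule action, and each
# primitive leaf becomes a GenToken action, in pre-order / left-to-right order.

class Term:
    def __init__(self, constructor, children=()):
        self.constructor = constructor  # e.g. "Select", "Column" (hypothetical)
        self.children = list(children)  # sub-terms or primitive strings

def to_actions(term):
    """Flatten a term into a root-to-leaves, left-to-right action sequence."""
    if isinstance(term, str):
        return [("GenToken", term)]
    actions = [("ApplyRule", term.constructor)]
    for child in term.children:
        actions.extend(to_actions(child))
    return actions

# Hypothetical mini-SQL term for: SELECT name FROM users
query = Term("Select", [Term("Column", ["name"]), Term("Table", ["users"])])
print(to_actions(query))
```

The inverse mapping simply replays the actions to rebuild the term, which is how a sequential decoder's predicted action sequence gets turned back into a grammatical expression.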
A while back I created an Atom package that generates (and plays) Tidal code from Tracery grammars. This package hasn't been maintained for some time and I believe some users are having trouble getting it to run, but perhaps the implementation is valuable:
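For anyone curious how the grammar side of this works: a Tracery grammar is essentially a map from symbol names to lists of expansion rules, and generation repeatedly rewrites `#symbol#` references until only terminals remain. Here is a rough Python sketch of that expansion loop; the rule names and Tidal snippets are made up for illustration and are not taken from the actual package:

```python
import random
import re

# Toy Tracery-style expander (illustrative only; the rule names and Tidal
# snippets are invented, not taken from the Atom package).
grammar = {
    "origin": ['d1 $ sound "#pattern#" # gain #amp#'],
    "pattern": ["bd sn", "bd*2 [sn cp]", "hh*8"],
    "amp": ["0.8", "1", "1.2"],
}

def expand(symbol, rng):
    """Pick a rule for `symbol` and recursively expand every #name# in it."""
    rule = rng.choice(grammar[symbol])
    return re.sub(r"#(\w+)#", lambda m: expand(m.group(1), rng), rule)

rng = random.Random(0)  # seeded for repeatable output
print(expand("origin", rng))
```

Each call produces a complete Tidal one-liner such as `d1 $ sound "bd sn" # gain 0.8`, which can then be sent to the interpreter.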
And here is a video recording of it in action:
I don't really have plans to enhance it. License is MIT, so feel free to fork/use/enhance!
The RaveForce code seems super interesting. Definitely worth trying out - I think it'll likely get you some interesting timbres and synthdefs as it works its way through the gradient.
As an adjacent topic that I brought up on another thread - has anyone had any luck with getting Haskell code sent to Tidal outside of passing it to GHCi via a text buffer (e.g. through an OSC message as text)? From my investigation, it seems like this is going to require some work in Haskell, but the "hint" library (https://hackage.haskell.org/package/hint) may be the right jumping-off point?