AbletonOSC x Tidal?

Daniel Jones presented this project at NIME 2023:

AbletonOSC: A unified control API for Ableton Live
This paper describes AbletonOSC, an Open Sound Control API whose objective is to expose the complete Ableton Live Object Model via OSC. Embedded within Live by harnessing its internal Python scripting interface, AbletonOSC allows external processes to exert real-time control over any element of a Live set, ranging from generating new melodic sequences to modulating deeply-nested synthesis parameters. We describe the motivations and historical precedents behind AbletonOSC, provide an overview of its OSC namespace and the classes of functionality that are exposed by the API, and discuss new types of musical interaction that AbletonOSC enables.

PDF: 28.pdf
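To make the namespace from the abstract concrete: since OSC is just UDP datagrams, a stdlib-only Python sketch can talk to AbletonOSC with no dependencies. The port (11000) and the `/live/song/set/tempo` address are what I remember from the AbletonOSC README, so treat them as assumptions and double-check against the current docs:

```python
import socket
import struct

def osc_pad(b: bytes) -> bytes:
    """Null-terminate and pad to a multiple of 4 bytes, per the OSC 1.0 spec."""
    b += b"\x00"
    return b + b"\x00" * (-len(b) % 4)

def osc_message(address: str, *args) -> bytes:
    """Encode a simple OSC message (int / float / string arguments only)."""
    tags, payload = ",", b""
    for a in args:
        if isinstance(a, float):
            tags += "f"
            payload += struct.pack(">f", a)   # big-endian float32
        elif isinstance(a, int):
            tags += "i"
            payload += struct.pack(">i", a)   # big-endian int32
        else:
            tags += "s"
            payload += osc_pad(str(a).encode())
    return osc_pad(address.encode()) + osc_pad(tags.encode()) + payload

if __name__ == "__main__":
    # Assumption: AbletonOSC listens on UDP 11000 (and replies on 11001).
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.sendto(osc_message("/live/song/set/tempo", 120.0), ("127.0.0.1", 11000))
```

In practice you'd probably reach for python-osc instead of hand-encoding, but this shows how little is actually on the wire.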

Previously, I attempted an integration between Tidal and Bitwig via BitwigOSC.

My memory is a little weak, but I remember some limitations/frustrations being:

  • Unable to achieve sample-level sync when recording Tidal-driven MIDI/audio, despite messing around with nudge
  • Lack of bidirectional OSC comms
  • BitwigOSC is written in Java against the official API, and it was not possible (for me) to alter it as needed

I am now evaluating whether to embark on a new effort with AbletonOSC.

In comparison to before:

A few questions for folks here:

  • Would you be interested in using such a project, if so what for?
  • Would you be interested in contributing development time to such a project, and if so what aspect(s)?
  • Would you be more interested in a Strudel / Tidal / Vortex integration?

I have my own sense of how I would use this, but I believe it could be quite flexible for different approaches. And if we went the Vortex route, being in the Python ecosystem opens up many possibilities for creative AI/ML.


I’ve not deeply considered the details of what I would be looking for, and am still very inexperienced with Tidal in comparison to others here. My workflow utilizes SuperCollider lightly compared to many folks, as my sound sources are primarily in hardware (though I am considering moving some amount of that into plugins, M4L devices, etc.) This workflow largely relegates SC to control-rate duties like clocking and sequencing of triggers, pitch, and modulation. That’s probably about enough context for the moment.

For a while I’ve been considering how much more, and more easily, I could support and expand my workflow by leaning on Live more than SC, perhaps even entirely eliminating SC.

One of the obvious areas of workflow expansion might be in terms of capturing patterns as they are played, for both visualization as well as later alteration, arrangement, etc.

In my mind this all starts with an ability to send n-cycles-worth of a pattern (or the results of a transition between patterns, etc.) to a specified new or existing clip in Live.

If the support also included the ability to launch one or more specified clips then that would be most excellent.

A notable area I definitely haven’t yet considered is the sending of automation, which I expect would prove quite valuable.
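Roughly what I imagine the note-capture step looking like, as a sketch only: the `/live/clip_slot/create_clip`, `/live/clip/add/notes` and `/live/clip/fire` addresses are from the AbletonOSC namespace as I remember it, and the argument orders (plus 4 beats per Tidal cycle) are assumptions worth verifying against the README:

```python
# Sketch: dump n cycles of captured pattern events into a Live clip, then launch it.
BEATS_PER_CYCLE = 4  # assumption: one Tidal cycle = one 4/4 bar in Live

def events_to_notes(events):
    """Map (pitch, onset_cycles, duration_cycles, velocity) events onto the flat
    (pitch, start_beats, duration_beats, velocity, mute) tuples Live clips expect."""
    notes = []
    for pitch, onset, dur, vel in events:
        notes.extend([pitch, onset * BEATS_PER_CYCLE, dur * BEATS_PER_CYCLE, vel, 0])
    return notes

def capture_to_clip(client, track, scene, events, cycles):
    """Create a clip `cycles` long, fill it with notes, then fire it.
    `client` is anything with send_message(address, args), e.g. python-osc's SimpleUDPClient."""
    client.send_message("/live/clip_slot/create_clip",
                        [track, scene, cycles * BEATS_PER_CYCLE])
    client.send_message("/live/clip/add/notes",
                        [track, scene] + events_to_notes(events))
    client.send_message("/live/clip/fire", [track, scene])
```

The interesting open question is upstream of this: where in Tidal's pipeline you'd tap off the n-cycles-worth of events to feed it.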

To some of your questions:

  • I might be able to contribute to the effort, but my time availability is low and my Python experience is very minimal and rather stale.
  • I would personally be using Tidal, as Haskell is my jam.

I'm interested in learning more and staying in touch with any efforts to use OSC from Tidal to Ableton Live. The direction here is intriguing but I don't have a current use case for it. If some work materializes I may be able to contribute thoughts and ideas, and eventually do some testing/validation and documentation.

In addition to Tidal and Sardine, I use Ableton Live as well as Max/MSP and Max4Live. Max4Live has an integration with Live and connects to the LOM. Some time ago, I investigated this and tested a simple connection to observe some parameters. I assume that AbletonOSC has the same access to the LOM as Max4Live, but it would be good to have that validated by someone who knows.

My personal interest would be to use Python/Vortex. Bubo has some thoughts on what is needed to make Vortex more usable with improved packaging. But I'm not sure what that would mean for a general Tidal integration, which might have a wider potential user base.

Thanks for posting.


I have not thought about this in any depth, but at a first pass, I can see this as being very useful if one could pattern parameters of Ableton devices and third-party plugins. I am still a Tidal lightweight, but hoping to have time soon to dive more deeply into the system. Much of my usage is currently sending MIDI to hardware and software instruments, so I do not have much use for the tie to SuperCollider at this point. I am interested in learning more SC in the future as well, but just in terms of what I typically do with Tidal at the moment, it is MIDI-centric.

If I did not need SC and could instead host software instruments and effects in Ableton, and then could both send MIDI notes and control their parameters (perhaps via automation), that would be excellent. If that is a place this could go, I may be able to make some contribution. Python is a good language for this.
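For the parameter-control side, a minimal sketch of what that might look like. The `/live/device/set/parameter/value` address and its (track, device, parameter, value) argument order are my guess at the AbletonOSC device namespace, not something I've verified, so check the README before relying on it:

```python
# Sketch: drive a Live device parameter from a normalized 0..1 pattern value.
def scale(value, lo, hi):
    """Clamp a 0..1 pattern value and map it onto a parameter's [lo, hi] range."""
    v = min(1.0, max(0.0, value))
    return lo + v * (hi - lo)

def set_param(client, track, device, param, value, lo=0.0, hi=1.0):
    """`client` is anything with send_message(address, args);
    the address below is an assumption about AbletonOSC's device namespace."""
    client.send_message("/live/device/set/parameter/value",
                        [track, device, param, scale(value, lo, hi)])
```

With something like this in place, patterning a filter cutoff from Tidal would reduce to streaming normalized values at the right times, with Live handling the parameter ranges.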

Seeing the mention of Bitwig in the first post, I will say that I would be even more excited about this if the integration was with Bitwig instead of Ableton. But I have little facility with Java so that could be a barrier.

Will keep an eye on this as it develops. Interested to hear what others might do with this integration.


i'd be very interested in a deeper Tidal x Ableton integration in general, as i do believe that while mostly used in a studio production context nowadays, Ableton still is (or can be) an amazing if not the best tool for live performance of electronic music.

The fact that Tidal uses Link for scheduling now makes the pairing even more of a no-brainer.

I'm currently returning to Tidal after a long-ish break, working on a hybrid workflow with ableton but currently limited to sending MIDI from Supercollider.

Right now i'm mostly concerned with getting Supercollider out of the equation (it seems overkill just to send MIDI to ableton). Since Tidal-MIDI is dead, GitHub - isyuck/tidalrytm (device-agnosticism branch) seems promising (and rather straightforward to me, as i have to deal with C++ a lot at work).

Another dream of mine would be to launch predefined Tidal patterns from named ableton clips. i know this is possible in theory, but i haven't managed to get tidal-listener to build on my M1 mac so far. i had some success last winter by wrapping ghc inside a node.js instance within max4live and sending strings directly to stdin that way, but the whole thing ended up ridiculously niche, overly complex, and hard to use or maintain.

as for AbletonOSC - this looks extremely promising! i have no experience with python and the Live API seemed kinda intimidating the last time i looked at it, but i'd be willing to give python a go for this :+1:
