Optics (and semantics) of "part" and "whole"

I was explaining pattern combination to students, with some examples:

  • structure from both
tidal> flip queryArc (Arc 0 1) $ run 2 |+| run 3

fine, 4 events.

  • structure from left
tidal> flip queryArc (Arc 0 1) $ run 2 |+ run 3

... wat? this shows 4 events, but we only hear two (when we use this inside note)

  • structure from right
tidal> flip queryArc (Arc 0 1) $ run 2 +| run 3

again, wat?

I was reading What are the part and the whole in an Event? - #7 by mindofmatthew, which motivates part/whole semantics via small query windows - but in the examples above, the query window is large.

So, my questions -

  • why are these extra events there at all (what processing step would need them)?
  • what is the easiest/quickest way to tell (looking at the printout, i.e., the result of show) whether an event will be audible

(Note: I'm writing in part to better understand this myself, so apologies for any gaps or misunderstandings I have)

Second question first: The section of the event in parentheses is the span of the "part", so an event listed as (0>1)-3 has a whole from 0-3, of which 0-1 is the active part. Only events where the beginning of the part aligns with the beginning of the whole are audible, because Tidal only cares about dispatching events with onsets. (0>1)-3 is audible, 0-(1>2)-3 isn't.
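To make the onset check concrete, here's a tiny sketch. The types are hypothetical simplifications (the field names mirror Tidal's Event and Arc, but this isn't the real implementation):

```haskell
-- Toy model of a Tidal event: two rational-time spans plus a value.
data Arc = Arc { start :: Rational, stop :: Rational }
  deriving (Eq, Show)

data Event a = Event { whole :: Arc, part :: Arc, value :: a }
  deriving (Eq, Show)

-- An event is audible iff its part starts where its whole starts,
-- i.e. the part captures the onset.
hasOnset :: Event a -> Bool
hasOnset e = start (whole e) == start (part e)
```

So an event with whole 0-3 and part 0-1 passes the check, while one with whole 0-3 and part 1-2 doesn't.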

Now back to the extra events: you say "the query window is large", but your (Arc 0 1) isn't the only query - it's just the last one. A stack of operations on patterns is really a stack of queries with manipulations in between. I'm not completely sure of the code, but my sense is that combining patterns (as with +) works by using the events of one pattern to define a set of queries to run on the other pattern. In each case the parts are the places where the events overlap (note that all three result patterns have the same parts); it's only the wholes that are determined by which direction you do the querying. So...
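Here's a sketch of that idea, using toy types and names (not Tidal's actual code, though Tidal's Pattern really is essentially a query function wrapped in a newtype): model a pattern as a function from a query arc to events, and build structure-from-left combination by using each left event's part as the query window on the right pattern:

```haskell
data Arc = Arc { start :: Rational, stop :: Rational }
  deriving (Eq, Show)

data Event a = Event { whole :: Arc, part :: Arc, value :: a }
  deriving (Eq, Show)

-- Toy model: a pattern is just a query function.
type Pattern a = Arc -> [Event a]

-- Overlap of two arcs, if any.
sect :: Arc -> Arc -> Maybe Arc
sect (Arc b e) (Arc b' e') =
  let s = max b b'; t = min e e'
  in if s < t then Just (Arc s t) else Nothing

-- Structure from the left: query the left pattern, then use each left
-- event's part as a query on the right. The combined part is the
-- overlap of the two parts; the whole is kept from the left event.
combineLeft :: (a -> b -> c) -> Pattern a -> Pattern b -> Pattern c
combineLeft f pl pr arc =
  [ Event (whole el) p (f (value el) (value er))
  | el <- pl arc
  , er <- pr (part el)
  , Just p <- [sect (part el) (part er)]
  ]
```

Swap which whole you keep and you get structure-from-right; intersect the two wholes and you get something like |+|. And this shows where the extra events come from: a right event that only overlaps the tail of a left event still produces a combined event, but that event's part no longer starts at its whole's start.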

The extra events are the ones where the beginning is chopped off (where the "part" only captures the end of the event). They're kind of ghosts: they no longer matter to the audio engine, but they still get passed around. Why aren't they filtered out at this step? I don't know, but I suspect it has a lot to do with what makes the implementation easier to write or understand. These events aren't strictly necessary, but they carry information about how the pattern was derived, which can be useful for things like rejoining two adjacent events that were previously split up.

When we say 'where the structure comes from' we really mean 'where the wholes come from'. That is, when two overlapping events are combined, the resulting event's part is the intersection of the two. Do we treat that intersection as part of the whole on the left or the right, or do we say the intersection is a whole new event? So in the example output you give, the parts are the same; only the wholes differ.

The confusion comes when this ends up with an event without an onset, i.e. one where the 'whole' timespan (or arc) begins before the 'part' we end up with. As @mindofmatthew says, the scheduler generally ignores these events, because superdirt only deals in trigger messages. So indeed the question is: why not discard these 'phantom' events straight away?

Here is such a phantom event, with a 'whole' of the whole cycle, but an active 'part' of the second half of the cycle:

"1" |+ "~ 2"

The biggest reason for keeping them is that these events can come back into play when combined with another pattern.

("1" |+ "~ 2") +| "10*4"

There our half-event is combined with two others, to create two whole events.
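Here's a sketch of that revival with a toy event model (hypothetical types and names, not Tidal's actual code): the phantom keeps its part, so combining it structure-from-right with a finer-grained event restores an onset:

```haskell
data Arc = Arc { start :: Rational, stop :: Rational }
  deriving (Eq, Show)

data Event a = Event { whole :: Arc, part :: Arc, value :: a }
  deriving (Eq, Show)

-- Overlap of two arcs, if any.
sect :: Arc -> Arc -> Maybe Arc
sect (Arc b e) (Arc b' e') =
  let s = max b b'; t = min e e'
  in if s < t then Just (Arc s t) else Nothing

-- Audible iff the part captures the onset of the whole.
hasOnset :: Event a -> Bool
hasOnset ev = start (whole ev) == start (part ev)

-- Combine two overlapping events, keeping the whole from the right
-- (structure from the right, as in +|).
combineRight :: (a -> a -> a) -> Event a -> Event a -> Maybe (Event a)
combineRight f el er = do
  p <- sect (part el) (part er)
  pure (Event (whole er) p (f (value el) (value er)))
```

The phantom has a whole of 0-1 but a part of 1/2-1, so it fails the onset check on its own. Combined structure-from-right with an event whose whole is 3/4-1 (like one of the "10*4" events), the result has both whole and part equal to 3/4-1: the onset is back.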

Another reason is that these parts could be used by a synthesiser. It would be fun to have a superdirt mode that automatically chops up samples into parts so that d1 $ sound "arpy" # hpf "2000 3000 1000" does the same as d1 $ chop 3 $ sound "arpy" # hpf "2000 3000 1000" now. I've done this sort of thing when generating graphics from tidal at least, but it's an under-explored area. (Unfortunately due to the complications of polyphony, there isn't an obvious way to make this example work by modulating the effect without chopping up the sample, but maybe it's possible.)

Just a thought: could mininotation context be useful here as a way of identifying events that "go together"?

It would be insufficient in plenty of cases, and besides, the distinction between functions that operate on parts of events and functions that create "new" events is tricky and probably subjective. But it might be a starting point for exploring ways that the history of events can be tracked through pattern manipulations.