Making Decisions in Time

Rodrigo Constanzo


Shortly after conceiving the idea for my series of compositions Everything. Everything at once. Once., I realized that my compositional thinking was drifting away from composition-specific gestures and further into pure improvisation. I decided that I would need a framework to think about, practice, and analyze this type of working process, particularly since it makes up a large part of my PhD research. I came up with the idea of analyzing decisions just before filming the first Everything. Everything at once. Once. performance. The day after filming, I rewatched the videos over and over and started writing down what I was thinking from moment to moment. After analyzing each performance I began to break the decisions into discrete streams, which I felt encompassed all of my improvisational thinking. These are Material, Formal, Interface, and Interaction, defined as follows:

Material – Decisions dealing with manipulations of local, sonic materials. This can come in the form of instrumental behaviours or general development, but is open to context and interpretation.
Formal – Decisions dealing with form and transitions.
Interface – Decisions dealing with instrument, ergonomics, technology, and performance modalities.
Interaction – Decisions dealing with how materials interact. These primarily concern simultaneous materials (as opposed to Formal decisions), but not exclusively so.

Each of the first three ‘Everything’ videos was analyzed in this manner. I felt that for the analysis to be meaningful, I would have to be brutally honest about my thinking, even when it was not flattering. Here is a segment of the raw, unedited analysis for Everything. Everything at once. Once. (1b):

0:05 – decide on ‘soft’ entry
0:08 – begin cymbal rubbing pattern/rhythm (in contrast to attack-based playing method of previous piece)
0:10 – pattern to regular – alter steady rhythmic pattern to create variety (slower/faster)
0:15 – formal brain calls for further breaking of pattern. adding interruption/pauses
0:20 – decide to use cymbal press/pauses to coax electronic changes
0:25 – shift to change of playing surface to engage electronic sounds – it does not happen
0:28 – formal brain calls for a gesture to return to rubbing pattern, now fused with rubbing gesture
0:35 – electronic sounds start changing and becoming more interesting – pause to listen
0:37 – return to playing with even more erratic gestures
0:39 – incorporate acoustic friction sound (from previous explorations/pieces)
0:42 – formal brain calls for shift. fade out amp
0:45 – return to rubbing gesture, but more erratically/quickly
0:47 – decide overall sound is too weak for energy level, decide to bring amp back in on smaller gesture

After coming up with the moment-by-moment list of decisions, I separated them into the individual streams. This was occasionally difficult, as some decisions could potentially fall into multiple categories; when that was the case, I went with the most pertinent stream. Here is the same section of the analysis with the streams attached and the wording cleaned up slightly:

0:05 – Material : Decide on ‘soft’ entry.
0:08 – Material : Begin cymbal rubbing pattern/rhythm (in contrast to attack-based playing method of previous piece).
0:10 – Material : Pattern too regular – alter steady rhythmic pattern to create variety (slower/faster).
0:15 – Formal : Formal brain calls for further breaking of pattern. Adding interruption/pauses.
0:20 – Interface : Decide to use cymbal press/pauses to coax electronic changes.
0:25 – Interface : Shift to change of playing surface to engage electronic sounds – it does not happen.
0:28 – Formal : Formal brain calls for a gesture to return to rubbing pattern, now fused with rubbing gesture.
0:35 – Interaction : Electronic sounds start changing and becoming more interesting – pause to listen.
0:37 – Material : Return to playing with even more erratic gestures.
0:39 – Material : Incorporate acoustic friction sound (from previous explorations/pieces).
0:42 – Formal : Formal brain calls for shift. Fade out amp.
0:45 – Material : Return to rubbing gesture, but more erratically/quickly.
0:47 – Interface : Decide overall sound is too weak for energy level, decide to bring amp back in on smaller gesture.
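Each tagged entry above reduces naturally to a (time, stream, comment) row, which is the shape all of the later number crunching works from. As a minimal illustration (in Python, purely as a sketch — the project itself used a spreadsheet), here is how the M:SS timestamps might be parsed into seconds for that kind of row:

```python
# Hypothetical sketch: a few of the tagged entries above as
# (time, stream, comment) rows, with M:SS timestamps parsed into
# seconds so they can be graphed and crunched later.

def parse_timestamp(ts):
    """Convert an 'M:SS' string (e.g. '0:47') into seconds."""
    minutes, seconds = ts.split(":")
    return int(minutes) * 60 + int(seconds)

rows = [
    ("0:05", "Material", "Decide on 'soft' entry."),
    ("0:20", "Interface", "Decide to use cymbal press/pauses to coax electronic changes."),
    ("0:35", "Interaction", "Electronic sounds start changing - pause to listen."),
    ("0:42", "Formal", "Formal brain calls for shift. Fade out amp."),
]

events = [(parse_timestamp(t), stream, comment) for t, stream, comment in rows]
print(events[0])  # (5, 'Material', "Decide on 'soft' entry.")
```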

Feel free to follow along with the video. Everything. Everything at once. Once. (1b):

The decisions range from the treatment of local material to practical instrumental housekeeping, and this is the case for all of the analyses I have done so far. The general activity level is quite high as well. I have not yet generated enough analysis data to confirm this in my own playing, but I have a working theory that Material decisions happen roughly at the rate of language and speech.

Generating these analyses has given me some insight into how my decision-making apparatus works in time. I can see explicit patterns and tendencies in the way decisions are structured, but more importantly, tuning in to that decision framework has let me draw a conceptual circle around that creative plane and articulate on it. This is very similar to how focusing on the curation of instruments let me articulate on that creative plane in the Everything. Everything at once. Once. pieces.

Additionally, this kind of analytical thinking led to the composition an amplifier a mirror an explosion an intention, though in that composition it is used as the conceptual and compositional framework, rather than as an after-the-fact analysis tool.

Everything. Everything at once. Once. (2a):

After conceiving the analytical framework, I decided that a visualisation of some kind would be helpful in understanding how my improvisatory thinking operated. I initially experimented with the SubRip file format (.srt), with the idea of using video subtitles as the main way to view the analyses. A fellow PhD candidate, Braxton Sherouse, helped me create a Ruby script that would take my text files and convert them into suitably formatted .srt files. This approach proved problematic, as anything beyond a low density of simultaneous events quickly becomes unreadable. It also did not allow for any statistical analysis of the analyses.
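The conversion itself is straightforward: each decision becomes a subtitle that stays on screen until the next decision arrives. Here is a hypothetical Python sketch of that idea (the actual script was written in Ruby, and its details may well have differed):

```python
# Hypothetical sketch of the decisions-to-subtitles conversion.
# The real script (by Braxton Sherouse) was in Ruby; this only
# illustrates the .srt format the output needs to follow.

def to_srt_time(seconds):
    """Format seconds as the HH:MM:SS,mmm timestamp .srt requires."""
    h, rem = divmod(seconds, 3600)
    m, s = divmod(rem, 60)
    return f"{h:02d}:{m:02d}:{s:02d},000"

def decisions_to_srt(decisions):
    """decisions: time-ordered list of (seconds, stream, comment)."""
    blocks = []
    for i, (start, stream, comment) in enumerate(decisions):
        # Hold each subtitle until the next decision (or 3 s for the last).
        end = decisions[i + 1][0] if i + 1 < len(decisions) else start + 3
        blocks.append(f"{i + 1}\n{to_srt_time(start)} --> {to_srt_time(end)}\n"
                      f"{stream}: {comment}\n")
    return "\n".join(blocks)

srt = decisions_to_srt([(5, "Material", "Decide on 'soft' entry."),
                        (8, "Material", "Begin cymbal rubbing pattern.")])
print(srt)
```

The unreadability problem is visible even in this toy form: simultaneous decisions would have to share a single subtitle slot.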

I created a spreadsheet using the analysis data shown above (time, stream, comment), and then began creating all kinds of graphs and charts from it. The most useful was the static “constellation” view, showing the streams on one axis and time on the other. Here is the analysis for Everything. Everything at once. Once. (1a):


There are some remarkable things in this analysis. The first is that Material decisions stop halfway through the piece, when the sonic material shifts towards being more electronic. This comes along with an increase in activity in the Interface/Interaction streams. More insights like this emerge from being able to visualize the analysis data in this way.

I also produced activity rates within each stream, along with trend lines showing the overall trajectory of activity in the piece (or within each stream).
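The trend-line math underneath is just an ordinary least-squares fit over binned event counts. A minimal sketch (in Python; my spreadsheet used its built-in trend-line feature, and the bin size here is an assumption for illustration):

```python
# Hypothetical sketch of an activity trend line: bin the decision
# times, count events per bin, and least-squares fit a line through
# the counts. A positive slope means activity is increasing.

def trend_line(times, bin_size=10):
    """Return (slope, intercept) of events-per-bin vs. bin index."""
    n_bins = int(max(times) // bin_size) + 1
    counts = [0] * n_bins
    for t in times:
        counts[int(t // bin_size)] += 1
    mean_x = (n_bins - 1) / 2
    mean_y = sum(counts) / n_bins
    num = sum((x - mean_x) * (y - mean_y) for x, y in enumerate(counts))
    den = sum((x - mean_x) ** 2 for x in range(n_bins))
    if den == 0:  # a single bin has no meaningful slope
        return 0.0, mean_y
    slope = num / den
    return slope, mean_y - slope * mean_x

# Six decisions whose density tails off over three 10-second bins:
print(trend_line([1, 2, 3, 12, 13, 22]))  # (-1.0, 3.0)
```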


In addition to the static, graph-based analyses, I did some generic number crunching, calculating the minimum, maximum, and mean for each stream, as well as a co-occurrence matrix showing how often each stream went to each other stream (e.g. Material going to Formal six times in the piece). These static analyses proved insightful, and I imagine they will provide even more insight once I have enough analysis data to correlate between individual analyses. This will allow me to notice tendencies that I may have on a subconscious, or even physiological, level, if my theory about the language-based rate of Material decisions is correct.
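The co-occurrence matrix is simple to compute from the time-ordered stream labels: count how often each stream is immediately followed by each other stream. A minimal Python sketch of that counting (the spreadsheet version was done by hand; the example sequence below is invented):

```python
# Hypothetical sketch of the co-occurrence (transition) matrix:
# for a time-ordered list of stream labels, count how often each
# stream is followed by each other stream.

from collections import Counter

def co_occurrence(streams):
    """Count stream-to-stream transitions in a time-ordered list."""
    return Counter(zip(streams, streams[1:]))

# Invented example sequence of decisions:
seq = ["Material", "Material", "Formal", "Interface", "Material", "Formal"]
matrix = co_occurrence(seq)
print(matrix[("Material", "Formal")])  # 2
```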

So with these static, spreadsheet-based analyses in hand, I decided I wanted something more interactive and musically useful, not to mention easier to produce: many of the metrics (such as co-occurrence) had to be calculated manually for each analysis, which is both tedious and error-prone.

Earlier in the year, I had spoken with Braxton about how to visualize this kind of data, and he recommended the D3 library for JavaScript. I was also very much inspired by Jack Schaedler’s amazingly well-put-together introduction to digital signal processing. I looked into the D3 library, and though I had used JavaScript in the past and had taken a JavaScript course, putting together something complex in D3 was a bit beyond my technical chops. Enter Tom Ward.

Everything. Everything at once. Once. (3c):

I contacted Tom and asked him if he would be interested in putting together a better version of what I had built. Luckily, he was. Tom, in addition to being a programmer, is an improvising saxophonist, so he was able not only to understand the motivations and functionality of this kind of framework, but also to contribute to how it could best be displayed.

Working back and forth over a few weeks, Tom put together something interactive, compact, and significantly better than what I had cobbled together in a spreadsheet. On top of all that, it uses only open-source technologies, fitting in well with my sharing ethos.

It is largely built around the D3 library, but uses some additional web technologies to allow linked audio playback and dynamic recalculation of zoomed-in data.

Here is the playback/zoom section, along with the no-longer-static “constellation” view.


You can hover over each point to see the comment. You can click on any point (or the waveform) to begin playback from there. You can zoom in to a specific section of the piece using the selection bars at the top. Everything dynamically adjusts when you resize the viewable area.

Click here to view and interact with the Everything. Everything at once. Once. (1a) analysis.

There is a large trend chart view which allows you to see the trends for the overall piece, or within any given stream. It looks like this.


Everything, including the trend lines, recalculates when a new selection is made in the top part of the window.

Finally, there are the static metrics: minimum, maximum, mean, standard deviation, and co-occurrence. All of these are calculated automatically, and they dynamically recalculate whenever a new selection is made.


So far I have analyzed three performances: all three videos from my initial Everything (1) set. In analyzing these videos I quickly learned that the ability to empathize with one’s own decision making, while watching a video, dissolves very quickly. I analyzed (1a) the day after the performance, (1b) the day after that, and (1c) a day later still. By the third day, I found that I could only infer what I was thinking by observing the results of that thinking (i.e. what I was doing). As a result, the data for the third analysis is quite different from the first two. I have kept it, and will likely use it as a control, showing what ‘bad analysis’ looks like.

Click here to view and interact with the Everything. Everything at once. Once. (1b) analysis.

I plan on analyzing all of my upcoming videos/performances and feeding them into this system, and have asked some other performers to contribute their own analyses. Once I have enough data, I will, with Tom’s help, come up with more graphs/metrics to correlate between the data sets, showing tendencies over time and between different performers. Eventually, this will expand to include a page where anyone can upload their own analysis (as a .csv file, an mp3, and an optional video file), and the system will add it to a database of existing analyses. The user will then be able to view their own analysis, or any analysis in the database, and then view correlated data across a selection of these analyses.

In addition to the analysis and correlation of solo performances, I plan on analyzing duo and trio improvisations, with each performer analyzing their own playing ‘blindly’, so that the data from each performer can then be correlated within the same performance. This will undoubtedly provide tremendous insight into the group dynamic and the interplay happening between performers. It will, of course, require some completely different visualization tools.

You can follow the developments of this improvisation analysis framework on its static page here. As I add more analyses, and come up with new approaches to visualizing the data, I will update the static page and add any relevant links there.