Algo-Jazz Series

 

Algorithmically generated art from a Processing program of mine.

The MUSICA project I’m involved in includes a component focused on interactive jazz. In exploring approaches to algorithmic jazz (or algo-jazz), I started working with pattern-based generation for melodies and for other instruments’ parts, such as walking bass. Some of my early strategies are documented in my post about an Infinite Jazz Music Generator in Haskell. The pieces on this page were created as part of my exploration of those topics, and development of this series is ongoing. My use of the computer ranges from letting it be fairly autonomous in writing the score to treating it more as a compositional partner.
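The pattern-based idea behind a walking bass can be sketched roughly as follows. This is a hypothetical Python analogue for illustration, not the actual MUSICA or Haskell/Euterpea code: strong beats land on chord roots, in-between beats pick chord tones, and the last beat of a bar approaches the next chord's root chromatically.

```python
import random

# Chord tones as semitone offsets from the root (a simplified, hypothetical vocabulary).
CHORD_TONES = {
    "maj7": [0, 4, 7, 11],
    "min7": [0, 3, 7, 10],
    "dom7": [0, 4, 7, 10],
}

def walking_bass(progression, beats_per_chord=4, seed=None):
    """Generate MIDI note numbers for a simple walking bass line.

    progression: list of (root_midi_note, quality) pairs, e.g. (50, "min7").
    Beat 1 of each bar lands on the root; the last beat uses a chromatic
    approach tone toward the next chord's root; other beats pick chord tones.
    """
    rng = random.Random(seed)
    line = []
    for i, (root, quality) in enumerate(progression):
        tones = [root + t for t in CHORD_TONES[quality]]
        for beat in range(beats_per_chord):
            if beat == 0:
                line.append(root)  # ground each bar on the root
            elif beat == beats_per_chord - 1 and i + 1 < len(progression):
                next_root = progression[i + 1][0]
                line.append(next_root + rng.choice([-1, 1]))  # chromatic approach
            else:
                line.append(rng.choice(tones))  # fill with chord tones
    return line

# A ii-V-I in C: Dmin7 - G7 - Cmaj7, four beats per chord.
bass = walking_bass([(50, "min7"), (55, "dom7"), (48, "maj7")], seed=1)
```

A real generator would also handle rhythm, register limits, and voice-leading preferences; this only shows the beat-by-beat pattern rules.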

The original inspiration for this series was actually a Processing program for creating neon-looking algorithmic art. A family member commented that it looked like an avant-garde jazz album cover generator. The MUSICA project needed jazz algorithms and the art demanded music! If you want to hear how this series started, check out this video on YouTube.

Compositions

Compositions are listed in reverse chronological order.

Lobophyllia II (December 2018): an algo-jazz adaptation of an old piece of mine. This one is about 50% algorithmic. It uses output from a Python-based interactive bossa nova program I made, which was given a lead sheet for the main progressions in the original Lobophyllia. Certain melodies and bass riffs were taken from the original piece and mingled with the algorithmic components. Unlike 33rd & 5th (further down on this page), this piece uses the algorithms as a compositional tool rather than showcasing them in a more isolated sense.

Phyllorhyza (December 2018): algorithmically generated parts for everything except the drums. Some parts were printed out as sheet music and then played in through a ROLI Seaboard Block. The algorithms used were implemented in Haskell/Euterpea.

33rd & 5th (September 2018): 100% algorithmic material from an infinite jazz generator implemented in Haskell/Euterpea. The parts were output to MIDI files and assigned virtual instruments in a DAW. There are no human-added or human-edited notes in this piece; it’s essentially one long algorithmic run.
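The "one long algorithmic run" idea can be illustrated with a lazy, never-ending stream of notes. This is a hypothetical Python sketch, not the Haskell/Euterpea implementation: the generator yields notes forever, and the caller decides how much of the infinite stream to render.

```python
import itertools
import random

def infinite_melody(scale, seed=None):
    """Yield an endless stream of MIDI notes via a random walk over a scale.

    scale: sorted list of allowed MIDI note numbers. Each step moves a small
    interval up or down while staying inside the scale (a hypothetical
    stand-in for the pattern-based rules in a real generator).
    """
    rng = random.Random(seed)
    i = len(scale) // 2  # start in the middle of the register
    while True:
        yield scale[i]
        step = rng.choice([-2, -1, 1, 2])
        i = min(max(i + step, 0), len(scale) - 1)  # clamp to the scale

# C major across one octave; take 16 notes from the infinite stream.
c_major = [60, 62, 64, 65, 67, 69, 71, 72]
phrase = list(itertools.islice(infinite_melody(c_major, seed=7), 16))
```

In Haskell the same structure falls out naturally from lazy lists; the Python version makes the laziness explicit with a generator and `itertools.islice`.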