Thursday, February 29, 2024

KaOzBrD Introduced

Higher still and higher, From the earth thou springest, Like a cloud of fire

The blue deep thou wingest, and singing still dost soar, and soaring ever singest

- Percy Shelley

By Chance a Wavestation Vector Synthesis Performance.

The music project will be continued on the SoundCloud KaOzBrD account. Note the different SoundCloud links throughout the journal & blog: of the two spinning SoundCloud links, one points to the KicKRaTT account and the other to the KaOzBrD account.

* KicKRaTT; MUSIC, ALGORITHMS, DOCUMENTS, GRAPHICS & LOGOS: ISNI# 0000000514284662 *

Wednesday, February 28, 2024

Generating Music in the Key of G

"2 DAY" a pure data performance in the key of G

A continuation of DAY, with an adjustment to the bass guitar. Rooting the bass guitar notes on the rhythm chords of the piano's left hand greatly tightens the rhythm of the generated performance. In running this particular patch, there were two very humanistic moments in the performances: the first is in the "2 DAY" video and the second at the beginning of the studio recording of "Keys to the Lock" posted at the bottom of this blog. Check out the moment in the "2 DAY" video between 2:08-2:13, where the patch's random choice generates a truly jazzy break in the rhythm. The YouTube videos "DAY" & "2 DAY" focus primarily on the development of the piano's right-hand style, which employs a more calculated [expr] method of choosing the next note played by the right hand. I go into much more detail about this scheme of the pure data patch in the previous blog post.

The piano rhythm-chord sub patch with attached bass guitar notes. At the bottom of this patch image, below the rhythm chords generated during the performance, are the four selected bass guitar notes (directed out to midi channel 9 via [noteout 9]), which are in harmony with the 20 available chords. Associating these bass guitar notes with the chords gives the patch's performance a tighter rhythm and a more harmonious end result than the random approach to generating the bass guitar notes previously heard in DAY.

The studio recording for this patch in the key of G. I ran the patch a total of 7 times, generating full midi scores between 7 & 10 minutes in length. Of the 7, this particular midi composition stood out due to the beginning's rhythm chords. There are moments in the randomness, which is how the chords are chosen, when even the random generation selects chords that are almost human in choice. This was the second occurrence of the humanistic moment I described above. As I continue to develop the pure data patches, when moments like this happen I will be sure to present them. The "DAY" videos and related patches represent a very human-like approach to generating, in this case, the piano. This idea of creating a pure data patch that generates a human-like score is the whole effort. Hopefully, through a future algorithmic approach, I will achieve a patch that is indistinguishable from a real human band reaching for a new groove off of which to develop a song.

KicKRaTT Keys to The Lock Midi Score can be purchased here!

* KicKRaTT; MUSIC, ALGORITHMS, DOCUMENTS, GRAPHICS & LOGOS: ISNI# 0000000514284662 *

Tuesday, February 27, 2024

generating piano in the key of E

"DAY" a pure data generated performance in the key of E.

* This page is a work in progress and might be repetitive.

Random vs expression calculations in generating composition.

When constructing code that generates music, the go-to choice for the bulk of the compositions heard across the net is to use random-number-generating objects or code. To the musician's ear this creates a rather dull & non-harmonious score, or as I call it, "bad next note choice". Example: [random 10] generates an output like 8, 2, 9, 4, 10, 5, 1... etc. If we assigned these generated numbers to notes, the end result is a non-harmonious fumbling around within an octave: C D D F E G C. Whereas if we instead use expressions, [expr x+1] or [expr x-1] (where variable x is a chosen first note, let's say x=6=A), we generate a pattern that stays more central to this first chosen note, resulting in an output like 6, 7, 8, 7, 6, 5, 4, 5. Using the aforementioned x=6=A, this group of generated numbers mapped to notes creates the more harmonious pattern A B C B A G F G.

Consider "Twinkle Twinkle Little Star", famously set in variations by Mozart, or as the notes go: C C G G A A G. The chosen notes are in harmony with one another; they climb up the scale and then conclude by falling one note back. Code can be written to expand these expression choices from [expr x+1] to [expr x+?] & [expr x-1] to [expr x-?], which would give the calculation the ability to jump from C to G, step up to A, & then step back down to G. Producing this outcome with random code or objects, as observed in the previous random set (C D D F E G C), is next to impossible.
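As a sketch of the difference (in Python rather than pure data, with the scale, seed and walk lengths as my own assumptions), the two strategies above might look like this:

```python
import random

# Hypothetical sketch: [random]-style vs [expr x+1]/[expr x-1]-style
# next-note choice. Scale, seed and lengths are assumptions for the demo.
SCALE = ["C", "D", "E", "F", "G", "A", "B"]  # indices 0-6, one octave

def random_notes(length=7, seed=1):
    """Every note chosen independently, like a bare [random] object."""
    rng = random.Random(seed)
    return [SCALE[rng.randrange(len(SCALE))] for _ in range(length)]

def expr_notes(start=5, length=8, seed=1):
    """Each note is a +/-1 step off the previous one, like the [expr] pair."""
    rng = random.Random(seed)
    x, notes = start, []
    for _ in range(length):
        notes.append(SCALE[x])
        step = rng.choice([+1, -1])              # calculated move off the last note
        x = max(0, min(len(SCALE) - 1, x + step))  # stay inside the octave
    return notes
```

The random walk jumps anywhere in the octave; the expression walk wanders stepwise around its starting note, which is exactly the "stays more central" behaviour described above.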

The video "DAY" that I have posted displays this difference in employing [expr] code over random. The result is a more humanistic & harmonious approach to generating a musical score.

All notes and chords for my piano performance "DAY" are generated in the key of E. The algorithm can run indefinitely and does so in my home. Enjoy!

AI technology employs a process of frequency-analyzing the audio samples of other musicians' work in order to create melody, chords & harmonies. An ethically questionable procedure. In order for AI to compete with human musicians it will require samples of talented musicians. No samples of other musicians' work were used to generate the piano performance heard in this video. This code directs its output to the MS GS Wavetable synth found on every Windows desktop.

-----------------------------------------------------------

Determining midi notes using the [expr] object instead of the [random] object.

In the pure data alternative purr data there is an object called [drunk]. The [drunk] object outputs random numbers in a moving range. Rather than using, say, [random 100], where the output can jump from 1 to 100, then 75 to 25, the purr data object [drunk] staggers its random output up and down, centering each random result close to the previous value rather than jumping anywhere in the range.

  • Example... pure data [random 10] = output = 8,2,9,4,10,5,1...etc
  • Example... purr data [drunk 10] = output = 5,6,4,5,3,4,6,7,5...etc
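For readers without purr data, a minimal Python model of [drunk] behaves like the second example (the step size and seed here are my assumptions, not [drunk]'s defaults):

```python
import random

# Minimal model of purr data's [drunk]: a bounded random walk whose next
# value stays within `step` of the previous one. Step and seed are
# assumptions for the sketch.
def drunk(max_n=10, step=2, start=5, count=9, seed=0):
    rng = random.Random(seed)
    x, out = start, []
    for _ in range(count):
        out.append(x)
        x = max(0, min(max_n - 1, x + rng.randint(-step, step)))
    return out
```

Compare `drunk(10)` with nine calls to `random.randrange(10)`: the drunk walk never leaps across the range, which is what makes its note choices sound more consolidated.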

Building patches that use one over the other creates two different outcomes. I was very excited at first with the purr data [drunk] object, as the resulting midi score was a more central and consolidated random choice of the next note. [random], even in its randomosity, does create a predictable outcome that is not really very musical and becomes repetitive when composing score after score. This statement is arguable, for what is musical is determined by the listener. Maybe a better word is harmonious. When a human plays an instrument, the human moves from one note to the next in a pattern that is harmonious to the musician's ear.

When Mozart penned his variations on "Twinkle Twinkle Little Star" (TTLS), or as the notes go, C C G G A A G, the chosen notes were in harmony with one another. The notes climb up the scale and then conclude by falling one note back. The result is a melody that has been cherished, whistled & played for centuries. Attempting to produce this outcome with a [random] object outcome like A G C A C G A G is next to impossible. With the [drunk] object you might get a bit closer: C G C G A G A. But you will never achieve the memorable & melodic C C G G A A G through randomosity. Solving this in pure data will be quite the challenge, as the TTLS melody is way too human & certainly not a melody that was achieved through random means (Mozart's Dice Game?).

Remember, random numbers are not calculated, nor are they chosen sequentially. Random numbers are chosen randomly from a set that has a max & min value. So when using random objects in pure data or purr data, you can map the results to values & functions which apply to a specific group of notes, octave(s), or note properties... but in the end there will be no calculated move or mathematically related approach to the choice of the note that is to be played next. It will be determined randomly.
As a musician I seldom write, play or perform my music by jumping randomly around the instrument's available octaves. Melodies & chords develop through a more centralized exercise of the notes within a particular octave and expand outward from this location on the instrument.

The ease of using the [random] object, a boolean scheme or a fractal equation in the [expr] object is probably why so many programmers of pure data who are not musicians employ it as the go-to choice for determining the next note. We have so many pure data examples that are of a random determination. I can only say this: there are just too few musicians who are programmers, or programmers who are musicians, focused on constructing patches that attempt to create a harmonious musical composition over a randomly created experimental one. This is one of the reasons I have chosen to work in pure data in a totally midi-only format rather than manipulating the [dac~] object output: I want to stay focused on the note rather than on processing audio. There is a bigger conundrum here than any to be found in the frequency-analysis procedures of the highly talked about music AI programs in the news: to find the human in the algorithm's structure. The other route that could be taken is to recreate the human form from the sounds we make or the songs we sing. Both may be fruitless, as both endeavors attempt to make something humanistic from a device that humans created. These conversations remind me of a drinking game built on the Drake Equation.

After a week or so of tearing through paper designs, attempting to arrive at a simple pure data construction that calculates the next note rather than randomly choosing one, I arrived at this piano performance patch. I haven't in any way solved this problem, but what I have stumbled upon is a hybrid structure that creates a more harmonious outcome. Hybrid in that the patch, presented in this journal entry, employs both [random] & [expr] elements. I arrived at the construction by thinking about the left & right hand movement of a pianist. The left hand (bottom right half of the patch) plays chords triggered off the rhythm of the 8-step sequencer of the [metro] tree; the chord played is randomly determined from the 20 available 3-note chords. The right hand (top right half of the patch) then randomly chooses from a group of [expr] objects, which is a calculated move to the next midi note to be played: [expr x + 1], [expr x - 1], [expr x + 2], [expr x - 2]... etc., up to [expr x +/- 4]. At this stage I have limited the expressions to a max of 4, even though the human hand can reach a full octave up or down. These [expr] objects calculate changes off a variable [value x], whose starting value is selected by the user before the patch is performed. In the patch displayed here you will notice that I have set the selection of [value x] to numbers 9-15. These values correspond to the center of the row of [select] object values, which in turn play the predetermined midi notes of the right hand in the key of E across 3 octaves.
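A rough Python analogue of this hybrid left-hand/right-hand scheme (the chord table and note table below are placeholders of my own invention, not the patch's actual 20 chords or [select] values):

```python
import random

# Rough analogue of the hybrid patch: on each sequencer bang the "left
# hand" picks a random chord, while the "right hand" picks a random
# *step* of +/-1..4 (the [expr] group) applied to the running index x
# into a fixed note table. Tables here are placeholder assumptions.
NOTE_TABLE = list(range(52, 77))                      # 25 slots, like x = 0..24
CHORDS = [(52, 56, 59), (57, 61, 64), (59, 63, 66)]   # 3 of the 20 triads

def beat(x, rng):
    chord = rng.choice(CHORDS)                          # left hand: random chord
    step = rng.choice([s for s in range(-4, 5) if s])   # right hand: [expr] step
    x = max(0, min(len(NOTE_TABLE) - 1, x + step))      # keep the index in range
    return chord, NOTE_TABLE[x], x
```

The key point the patch makes is visible here: the randomness picks *which calculation* to apply, not which note to play, so the melody always moves relative to its previous note.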

A necessary patch within this patch is the use of [moses 0] & [moses 24] objects as a conditional statement. If [value x] is < 0 or > 24, then [expr x + 4] or [expr x - 4] is calculated and the value of variable x is kept within the range of notes I have determined to be allowed for the right hand. If this expression were not in the patch, the value of x would wander out of range of the [select] object values. Rather than just nudge the note back into the [select] object range, I throw the note back into the range using a +4 addend or a -4 subtrahend. This use of [moses] as a conditional keeps the calculated note from banging its head against the limits of the [value x] range.
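The [moses]-pair conditional reduces to a few lines (the 0-24 range and the +/-4 throw mirror the text; the function name is mine):

```python
# Sketch of the [moses 0] / [moses 24] conditional: when x strays outside
# 0..24 it is thrown back INTO the range by 4 ([expr x + 4] / [expr x - 4])
# instead of being pinned at the edge.
def keep_in_range(x, low=0, high=24, throw=4):
    if x < low:
        return x + throw   # [expr x + 4]: push back up into range
    if x > high:
        return x - throw   # [expr x - 4]: push back down into range
    return x
```

Because the throw is 4 rather than 1, an out-of-range note lands well inside the range instead of hovering at the boundary, which is the "banging its head" problem the text describes.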

The final result of this patch is a more humanistic approach to playing a piano improv in the key of E than anything I have achieved using random objects. I have run the patch for almost 3 hours continuously, & what I found interesting were the comments by listeners inside my house. Those hearing the patch asked, "Who is that playing the piano?" Now, that's what I'm hoping to hear! The performance of the patch is not unlike the improv you might hear from a piano player contemplating a new song. This response inspired me to create the YouTube video "DAY a pure data performance with visuals". The gamma visuals were created with a separate software app and are not generated through pure data GEM. I like to think of this patch as a small achievement toward my goal of creating a pure data patch that generates a midi composition with a more humanistic approach to the music it performs.

Most of the songs that we enjoy in life are performed using a scale in a specific key, but a musician seldom uses all of the notes of that scale during the performance. At this stage of my pure data patches, I am using all of the notes found within the scale of a specific key. I'm thinking, as a next step in this development, to take a recognizable melody, determine the notes of that melody and limit the [select] objects to only those notes, even repeating them as notes to be selected. It will be interesting to hear during the patch's performance if, or how often, the patch arrives at the recognized melody that inspired the patch's notes through its [expr] calculations. Or a better thought might be: how will I have to construct the [expr] objects so that they might reach that recognizable melody? There is this other idea I have, and that is to found the chords and the notes off of each other: a chord in the key of E initiating a melody in the key of E, or the notes in the key of E building off a chord in the key of E.

For the historical record: in developing a piano composition patch I studied Philippe Manoury's groundbreaking Max composition, Pluton.

While I have found no original patch for Philippe Manoury's Pluton available on the internet to study, a description of his composition can be found in this article. From what I have read, Manoury's patch was more expressive than generative: it followed his playing and then altered the midi characteristics to manipulate the synthesis processing.

* KicKRaTT; MUSIC, ALGORITHMS, DOCUMENTS, GRAPHICS & LOGOS: ISNI# 0000000514284662 *

sub patching & triggered by rhythm

The YouTube video that I have posted presents a walkthrough on sub patching. Sub patching is a way to break up and organize a much larger patch. When you add the pure data object [pd sub-patch-name], it opens a blank pure data patching window. The first object to add in the new sub patch window is [inlet]. This [inlet] object receives the inbound signals & messages from the [pd] object you just added on the main page of your patch. From this point, continue with your patch configuration in the new sub patch window. In this YouTube video I circle the [pd] object of the sub patch window I open next. Hopefully you can easily follow along in the video, pausing it if you need to, to understand how I have broken up my larger single-screen patch into smaller ones. This post to my journal contains screenshots of all the patch & sub patch windows found in this YouTube presentation.

* All YouTube videos are performed direct to desktop and use the MS GS Wavetable for sounds. This is NOT the case for the tracks uploaded to SoundCloud, which are performed on my studio synthesizers and midi components. You will see references in these patch images to the MS GS Wavetable and to [pgmout] objects initializing GM instruments on specific midi channels. Be advised that if you are constructing these patches for your own use, you will need to adjust these settings to your own instruments or midi gear.

The polyrhythmic core that I have defined in my previous journal entries, SoundCloud tracks & YouTube videos is also used for this midi composition. In this image I explode the PITCH branch of the polyrhythm [metro] tree. The biggest difference in this pic of my core polyrhythm patch is the sub patch objects: [pd kick], [pd hh snare], [pd bass] & [pd keys]. These [pd] objects attach the [metro] core patch to their defined sub patch windows using the [inlet] object. This isolates the core patch on the primary patch window, making room to expand, develop controls for the patch & help clean up the overall patch construction.

The kick drum or [pd kick] sub patch, labelled here as the KICK DRUM PATCH, randomly creates the on/off pattern of the kick drum across an eight-step sequencer & then repeats. The [inlet] object sends the [bang] from the [pd kick] object of the polyrhythmic core patch to the [random 100] to create the generated kick pattern for the following 8 steps in the measure. The predefined percentages can be set to determine how many or how few kicks will be played. The last [random 100] & [moses 50] objects determine which kick drum sound will be heard. I like to have different kick drum sounds in the mix for variation in tone.
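In Python, the kick logic described above might be sketched like this (the per-step thresholds and MIDI note numbers are assumptions, not the patch's actual values):

```python
import random

# Sketch of the KICK DRUM PATCH: each of 8 steps fires only when
# [random 100] lands under that step's percentage threshold; a second
# [random 100] -> [moses 50] split then picks one of two kick sounds.
# Thresholds and GM kick notes (35/36) are assumptions for the sketch.
def kick_pattern(thresholds=(90, 20, 40, 20, 70, 20, 40, 30), seed=7):
    rng = random.Random(seed)
    steps = []
    for t in thresholds:
        if rng.randrange(100) < t:                        # does this step fire?
            steps.append(35 if rng.randrange(100) < 50 else 36)  # which kick sound
        else:
            steps.append(None)                            # silent step
    return steps
```

Raising a step's threshold makes a kick on that step more likely, which is how the "how many or how few kicks" control works.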

The Hi Hat & Snare patch or [pd hh snare] sub patch, labelled here as the PERCUSSION PATCH, randomly creates on/off patterns for the Hi Hat. The [inlet] object sends the [bang] from the [pd hh snare] object of the polyrhythmic core to the [random 100] to create the generated Hi Hat pattern for the following 8 steps in the measure. In my previous generating patch in F#, I started playing with a configuration for the bass guitar that, instead of randomly generating notes across the 8 steps, is triggered by the different instruments in rhythm. This approach created a more humanistic feel to the performance. I moved this idea to the HH patch and branched the snare off of the Hi Hat's rhythm. This configuration created a much more realistic approach to the snare in the rhythm of the patch, and quickly led to branches off the Hi Hat patch for other percussion instruments like toms & cymbals. The [delay 250] objects, when adjusted by the horizontal sliders to 504.3, elevated the percussion playing to a VERY humanistic feel.

The bass patch or [pd bass] sub patch, labelled here as the BASS PATCH, randomly creates on/off patterns for the bass notes. The [inlet] object sends the [bang] from the [pd bass] object of the polyrhythmic core to the [random 100] to create the generated bass pattern for the following 8 steps in the measure. This bass patch manipulates two groups of identical midi notes in key, [38] [42] [45] [47] [49], doubling midi notes [45] & [49] respectively. The 8-step sequence of the patch randomly shifts back and forth between these two groups. The design of the bass patch was arrived at more through feeling than calculated structure.

The keys patch or [pd keys] sub patch, labelled here as the KEYBOARD CHORDS PATCH, randomly determines whether or not a keyboard chord should be played and then randomly determines which of the 20 chords is to be played. The chords in this sub patch are three-note chords; four-note chords could easily be used by adding another triggered note to each of the chords defined here. Instead of triggering the chords from an 8-note sequencer, the chords are triggered off of the [spigot], which is randomly opened and closed by the polyrhythmic core. I have found this keeps the chords from being played too often and gives a more rhythmical approach to the keyboard's playing. Moving forward, my desire is to somehow bind the bass, sequence & chords so they build off the same note??? More to come on this.
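The [spigot]-style gating can be sketched like this (the flip probability, seed and chord table are my assumptions, not the patch's values):

```python
import random

# Sketch of the [spigot]-gated chord trigger: the rhythmic core randomly
# flips the gate open or closed; a bang only reaches the random chord
# chooser while the gate is open. All numbers here are assumptions.
class ChordGate:
    def __init__(self, chords, seed=1, flip_chance=0.25):
        self.chords = list(chords)
        self.rng = random.Random(seed)
        self.flip_chance = flip_chance
        self.is_open = False

    def bang(self):
        """One core bang: maybe flip the spigot, then play only if open."""
        if self.rng.random() < self.flip_chance:   # core flips the spigot
            self.is_open = not self.is_open
        if self.is_open:
            return self.rng.choice(self.chords)    # gate open: play a chord
        return None                                # gate closed: silence
```

Because the gate stays closed for stretches of bangs at a time, the chords arrive in rhythmic clusters rather than on every step, which is the effect described above.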

The "strands together" SoundCloud track. This track uploaded to SoundCloud was made with this patch/sub patch configuration. I accompany the track with a performance on my Korg Wavestation EX & layer it with the Korg Wavestation SR rackmount. I use the factory Metropolitan sound found on both of these Korg devices. "strands together" is in the key of F#, just like my previously uploaded track, Island. I think? There were rapid changes to the HH Snare & Bass sub patches that moved me quickly to a new song idea. The new configurations defined above in these two sub patches rapidly produced "strands together" from the original Island patch. To accompany the track with my own performance I quickly whittled down the available chords in the sub patch to chords that fit my playing. The final recorded performance uploaded to SoundCloud was composed, performed & recorded in a day. There are still a few moments at the end of the track where my playing is just a bit off the percussion's rhythm.

* KicKRaTT; MUSIC, ALGORITHMS, DOCUMENTS, GRAPHICS & LOGOS: ISNI# 0000000514284662 *

Monday, February 26, 2024

Generating Music in A major

This presented pure data patch generates the midi notes for kick, hi hat, snare, bass guitar & keyboard, in the randomly chosen scale of A major. I have run my completed patch a number of times and have a number of interesting final midi compositions that I will be uploading to SoundCloud. Island is the first of these new music uploads that employ this polyrhythmic patch.

A generating Pure Data patch using an A major scale.

I am using a system that chooses a scale from a database I am constructing, which produces all the midi note numbers (0-127) belonging to the randomly chosen scale. The midi note numbers are manually entered into the [number] objects seen throughout the instruments' patches before their respective [makenote] objects. It's important to keep all the instruments in key; this also works as a reference for bad or out-of-key midi notes that show up in the patch's performance. As I develop this generating-band process, randomizing the choice of scale in the patch is the first part. The notes in the A major scale: A, B, C#, D, E, F#, G#. Pure data, through probability & conditional statements, randomly chooses whether A goes to B next or whether it jumps to C#. Here is a real AI question in my algorithm: what do we replace the [random] object with when choosing to go from A to B? What is the alternative to a random choice? And why is it made?
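The scale-to-midi-note expansion described above can be sketched directly (A major shown; the database itself is still the author's work in progress, so this is only an illustration of the expansion step):

```python
# Sketch of the scale-database step: expand one scale's pitch classes to
# every in-key MIDI note number (0-127), ready to be typed into the
# [number] objects, or used to spot out-of-key notes in a performance.
A_MAJOR_PCS = {9, 11, 1, 2, 4, 6, 8}   # pitch classes of A B C# D E F# G#

def scale_midi_notes(pitch_classes=A_MAJOR_PCS):
    return [n for n in range(128) if n % 12 in pitch_classes]
```

For example, MIDI 69 (A4) is in the list, while MIDI 70 (A#4/Bb4) is not, so any 70 appearing in a generated score is immediately flagged as out of key.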

The core polyrhythmic patch, presented earlier in this blog, is the core for the current tracks being uploaded to SoundCloud. The first track of this type, Island, has been uploaded to SoundCloud. I have exploded the polyrhythmic patch for all to see. If you are actually reconstructing these presented pure data patches and are having trouble deciphering them, please refer to the previous posts or the YouTube channel videos to aid in your patching. Comments & questions on any of my pages are always welcome.

The final patch used in the performance of Island is larger than the patch I have presented in this entry. My final performance patches expand upon the instruments' diversity & direction. I use a lot more [makenote] midi objects. The jury is still out on whether multiple [makenote] objects are required or all you need is one per channel??? In the future I want to expand the drum kit performed in the patch to include more of a full jazz-rock kit. Currently I layer analog & digital drums to create my final drum kit sound. In all of my final patch performances I add an instrument that is a generated sequence apart from the primary instruments: drums, bass & keys. This sequence track will be heard in these next uploads (heard on Island) & plays a big part in the composition. Sequence tracks help to create instruments. The bass guitar instrument is a generated sequence confined to midi notes 35-45. Creating that bass guitar sound in your mix is all about the gear, synths, PCMs, VSTs and more that you employ in your home studio. I am striving to develop a very unique-sounding bass guitar & will elaborate on this in some blog entry in the future once I feel I have succeeded.

Presented in this patch & YouTube video are three different ways I am generating midi notes.

The Kick & Hi Hat are identical in their construction. The Snare was once its own instrument on midi channel 10, but I found it was played too much in the performance. I wanted the snare's performance reduced further than the conditional percentages of the instrument's generation allowed, so I branched the Snare off of the Hi-hat. Which makes sense if you think about it: when the two hands of a drummer are in rhythm, the Hi-hat is always dominant, and the Snare becomes more of an accent. Considering how the two hands of the percussionist move throughout a performance will be very important as we consider expanding the drum kit to include more percussion sounds.

The Keyboard Chords. There are two ways to generate chords within a fixed scale: there are some great websites, software apps & VSTs on the net that can generate for you all the chords in a particular scale, or you can take all the notes of a scale and predetermine how many 3-5 note combinations there are using those notes. Both directions show their differences in the final performance. What I have found with the latter of the two is having to spend more time chasing down disharmony in the final performance, whereas using known scale chords produces a much more pleasant, harmonious result. My final patch uses double the chords available for the key in this patch; if you were to see my final performance patch, it's obvious my chord patch is twice as large. The single conditional object after the randomization in this patch determines whether a chord is triggered or not. I greatly expand on this single point in my final patches, which not only increase the trigger to play 2-3 chords in routine but also allow me to turn the chords on or off in the performance &, lastly, alter the length of time a chord is played. This chord control will be demonstrated in my next SoundCloud upload, where I will control the chords' playing in the first half of the track & then let them loose in the second half of the performance. I'll point this out once I have uploaded the tracks to SoundCloud.
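The second approach, predetermining the note combinations from the scale itself, is easy to sketch (one octave of A major assumed; the filtering-by-ear step is left to the musician):

```python
from itertools import combinations

# Sketch of the "predetermine the combinations" approach: enumerate every
# 3-note combination of one octave of a scale, then audition/filter them
# for harmony by ear. One octave of A major is assumed here.
A_MAJOR = [57, 59, 61, 62, 64, 66, 68]   # MIDI: A3 B3 C#4 D4 E4 F#4 G#4

def all_triads(scale=A_MAJOR):
    return list(combinations(scale, 3))
```

Seven scale notes yield 35 three-note combinations; only some of them are the familiar diatonic triads (e.g. 57-61-64, the A major triad), which is why this route requires more time chasing down disharmony than starting from a known chord list.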

The Bass Guitar. The bass guitar instrument, as I patch it, is always growing & changing. What used to be an 8-step instrument like the other percussion instruments is now a group of midi notes that are triggered through the rhythm of the performance. In the patch you can see that the bass notes are triggered off the kick and the receiving [r ] objects seven, nine & fourteen, founding all bass movements off the kick drum. I have found that by changing the receiving [r ] objects you can create numerous different bass patterns. This patch idea produces a more real feel to the bass guitar's final performance than the Band Theory bass, which is a standard 8-step, conditionally determined sequencer type. I will add more to the bass in the future as I get a grip on how I'm constructing the generating bass guitar within these pure data patches moving forward.

KicKRaTT Island Midi Score can be purchased here!

Island is a totally generated composition. The midi notes for all the instruments of the track were generated from a pure data patch. Each performance of the patch produces a different score. Island employs a polyrhythmic core, different from the Band Theory tracks, which use a more linear 8-step sequencer for the performance's rhythm.

* KicKRaTT; MUSIC, ALGORITHMS, DOCUMENTS, GRAPHICS & LOGOS: ISNI# 0000000514284662 *

Sunday, February 25, 2024

polyrhythm sequencer patch

Here is my polyrhythm sequencer patch. There are three [bang] objects at the top: RESET, OFF & ON. There is a PITCH CONTROL. The horizontal PITCH slider does not adjust the PITCH in real time; after you increase or decrease the pitch you must push the ON [bang] object to initiate your change in PITCH. The basics of running this patch, once constructed correctly in pure data: select these objects in order; RESET, select the MEASURE (1-32), adjust the PITCH & last, ON. This should initiate the patch to run correctly no matter what the pitch. I have set the min-max values in the properties of the pitch slider to 1-400. As I close in on a performance I often adjust these values, with the max set about 20 clicks above the pitch that I feel best suits the patch. This creates a finer adjustment on the slider. A pitch of 0 will create anomalies in your performance & can create uncontrollable sounds with your outboard gear. When the patch is performing, pushing the RESET button either starts the pattern from the beginning or creates different patterns without disturbing the pitch. At this moment in my work with pure data, this is either an error in my patching or an interesting result of when I push the RESET [bang] object in the timing of the performance.

Interesting things could happen here; patch a [random] or [moses] object here to alter the pattern during its performance.

This video presents the polyrhythmic sequencer in action. Please note that in the patch the "Rim Shot" is a "crazy train" shaker (I wish) instrument, not a rim shot. It was a rim shot, but the sound overtook the kick drum & I wanted the sounds on each track to be distinguishable for this presentation.

This polyrhythmic patch is built off of the linear patch presented in my earlier blog post. Polyrhythm is essentially different clocks, or in pure data's case [metro] objects, beating at different rates, governed by a master clock, or the central PITCH CONTROL in this patch. This polyrhythmic patch yields three [metro] objects (different tempo branches) that sequence through 8 send [s ] & 8 receive [r ] objects. The three red bangs (valued 1-3) for each of the groups give the conductor the ability to easily alter multipliers or divisors, multiplying or dividing the master pitch. In each of the three groups the pitch can be set to equal the master pitch (value = 1), a multiple/division of the master pitch (value = 2), or a triplet (value = 3). In this patch I have two metro groups that multiply & only one that divides. This can be changed by altering the math objects, [* 2] to [/ 2] for example, in any of the groups. Or an entirely new metro group could be added to the overall patch, so that the patch would have two [* 2] & two [/ 2] metro groups. Any number can be predetermined in the pure data math objects, as seen in the YouTube video patch version, or made available as a selection into these math objects, as displayed in the patch image. Playing around with these multiples or divisions of the central PITCH will produce many different tempo outcomes for each specific [metro] branch. Depending on your midi gear & other hardware, you may find excessive values wreak havoc in the production; keeping the values in the math objects strictly to 1-3 is suggested. With the master pitch slider set from 1-1000 you'll be able to increase or decrease the various [metro]s and cover all the various time signatures out there, from quarter to half to triplet, and still find that your midi gear can handle the tempos and not run off the rails.

I have created both gigantic sounds & sounds that won't stop unless you power down your equipment, because of bad math in the patch, whether your midi panic buttons work or not. As you construct & test these pure data patches, the math employed can & will produce anomalous midi issues, reminiscent of midi flatulence. These audio anomalies can be devastatingly different depending on the mathematical outcome of your patch. Dividing by 0, or values that create a tempo that is too fast or too slow, can produce undesirable results depending on how a [metro] object's tempo feeds into your midi components.
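The branch-tempo arithmetic above reduces to a small calculation ([metro]'s argument is a period in milliseconds; the factor list here is just an example matching the two-multiply/one-divide layout described):

```python
# Sketch of the [metro] tree's timing maths: each branch's period is the
# master pitch multiplied or divided by its red-bang value. [metro] takes
# a period in milliseconds, so a larger result means a slower branch.
def branch_periods(master_ms, factors=(("*", 2), ("*", 3), ("/", 2))):
    return [master_ms * n if op == "*" else master_ms / n
            for op, n in factors]
```

So with a master pitch of 200 ms, the three branches tick at 400, 600 and 100 ms. This also shows where the danger lies: a divisor near 0, or a very small master value, yields a period so short that downstream midi gear can't keep up.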

* KicKRaTT; MUSIC, ALGORITHMS, DOCUMENTS, GRAPHICS & LOGOS: ISNI# 0000000514284662 *

pure data & other wares

I will be producing my Pure Data patches on a Windows 10 PC: an AMD Ryzen 9 5950X, 16 cores, with 64 GB RAM. I often run my patches on a Windows 7 Intel Celeron, 2 cores, 4 GB RAM. All my PCs are rack-mount industrial types. I use both new and old PCs to run my Pure Data midi configurations, and even the larger patches I put together execute on both. So if you want to start working with the Pure Data program, know that it will run fine as a midi compositional sequencer on whatever PC you have. I use both Pure Data 0.54 & Purr Data (Pd-l2ork) 2.14 to create my patches. There are objects that I find only in Purr Data, & I have observed object functionality issues when moving a pd file between the two versions. The vanilla version of Pure Data is the core of all these other distributions, with 0.54 being the most current download available at puredata.info. At purrdata.net you will find a starting point for obtaining the correct distribution of Purr Data for your system. There are reasons to have both, and both are respected versions in the pd community. Just be aware that you might run across functional issues with specific objects if you create a patch in Purr Data & then try to load & run it in the original Pure Data. A disclaimer, if you will. I have also come across some insightful conversation threads on the net arguing that these cross incompatibilities aren't issues, more like "just something done differently between the two apps". Which seems logical. A Pure Data patch can often be simplified, & there seem to be almost a dozen ways to do everything. It's a nice programming environment to learn. I am almost exclusively working in midi with my Pure Data patches; harnessing randomized composition, manipulating midi note properties & mixing the outcome will be my endeavor with Pure Data at the helm.

I use Reaper to record & play the midi compositions generated by Pure Data & Purr Data. I use Steinberg WaveLab & Audacity to edit & master the .wav file to .mp3. For the moment I am only uploading mp3s to SoundCloud. I spend very little time editing & mastering .wav files & I'm not much of an engineer. I reach for strong initial recordings that will only require a light amount of improvement before burning to .mp3.

It's a two-PC system, where one PC executes the midi-out patch & the other records the inbound midi data. I usually run/record the patches 3-5 times. There is always one that sticks out amongst the group & that's the one I go with & will probably eventually upload to SoundCloud. From this point, synthesis, mixing & accompaniment take over. If accompaniment reveals something from the generated midi dump, the track goes into multi-track recording. More on this as it develops in the journal. Using Pure Data / Purr Data to create a band with its own style for me to accompany as a member is really what I hope to accomplish.

The goal is to create a band that generates its own music at my direction, and to explain how it was done so that others can build on it. All of my patches employ midi out objects [noteout] to send control messages to external hardware synthesizers; therefore, in the configuration settings of Pure Data, make sure to set up the desktop wavetable or other external midi components accordingly.

I started working with Pure Data in 2005, and at the time building a modular synthesizer captured more of my attention. In the last 10 years so much technology has come together, not only commercially but for me personally, in my studio. What we can do in our house in 2023 feels like a light-year from what we could do in our house in 1995. A truth about technology: your end creative result is only as good as the sum of its parts. There is always room for improvements & upgrades.

You can download the pure data programming environment here https://puredata.info/

You can download the purr data programming environment here https://www.purrdata.net/

Anyone reading this & seeking pure data ideas, assistance or communication should join the pure data forum at https://forum.pdpatchrepo.info/ and register.

For the historical record: Pure Data has its roots in "Max", begun in 1985 by Miller Puckette at the Institut de Recherche et Coordination Acoustique / Musique (IRCAM) in Paris. Puckette later released Pure Data as an open-source successor in the mid-1990s, while the commercial Max is currently owned & developed by Cycling '74 (https://cycling74.com/products/max) in San Francisco.

Soundcloud Band Theory Tracks

The central metronome core of the tracks Band Theory & Band Theory After, found on my SoundCloud page, is demonstrated in my YouTube video "pure data pd linear rhythm". It works as labeled: an 8-step sequencer with pitch control. The measure control alters the [%] modulus objects, which in turn trigger changes to the values of the [spigot] objects. The patch image displayed is an 8-step sequencer that plays for 16 measures and then repeats until you turn it OFF. Altering the conditional objects determines what percussion instrument is played. Branches off of the 8 steps of the sequencer trigger chords in key at random intervals.

A breakdown of the labelled sections in the image above will help to explain this patch.

Section B : The ON/OFF switch, or "bang" as Pure Data calls it, not only initiates & halts the sequencer patch, it kicks the trigger bang [t b b b] object that starts the metro, looks for INPUT from the pitch control, sets the select [sel 0] object, calculates the tempo based off of the INPUT pitch control & feeds it to the metronome.

Section A : sets the number of measures to run the eight-step sequencer until it repeats or initiates a generative change. Lots more on this connection later. This sequencer works with two modulus [% ] objects. The first, seen in the center of the patch, [% 8], controls the 8-step sequencer; the second, [% 32], controls how many times that 8-step sequencer should play a routine. The modulus object is very important to the patch & will play a bigger role in my polyrhythmic patch.

In sections A & B, what you create in this Pure Data patch is an 8-step sequencer that will play its 8 steps a set number of times, over & over, until you turn it OFF. The vertical radio slider's control, OUTPUT #2, will be used throughout the greater patch for the composition, as it triggers the bangs in section E that initiate the generative changes before the following measures are played. At this stage of development I am using a pitch control and vertical radio slider to manually enter these choices. You could randomize these two inputs; the resulting midi score is chaotic.

OUTPUT #1 : These are the bang branches of your 8-step sequencer. From the black bang found at each of these 8 [sel] points you attach the Pure Data patches that occur at each of these steps. They can lead anywhere & pretty much do anything.

The patch displayed in this post is at the heart of the midi score produced for the Band Theory tracks. It's a very efficient manual sequencer patch that can easily be expanded to any number of steps. Just change the modulus [% ] to the desired number of steps & add or delete the selection [sel] bang branches. You can easily produce a 32-step sequencer that plays out for 32 measures & repeats.
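The two-modulus scheme above can be sketched as a few lines of code. This is a hedged illustration, not a transcription of the patch: a single counter feeds both modulus operations, [% 8] picking the current step and a second modulus marking when the set number of measures has elapsed so the generative-change bang can fire. The `MEASURES` value stands in for the vertical radio slider and is an assumption here.

```python
STEPS = 8
MEASURES = 4          # stand-in for the vertical radio slider setting

def tick(count):
    """Process one metro tick; return (step, change_bang)."""
    step = count % STEPS                        # the [% 8] object
    change = (count % (STEPS * MEASURES) == 0)  # the measure boundary, like [% 32]
    return step, change

# Walk one full cycle plus the first tick of the next:
for n in range(STEPS * MEASURES + 1):
    step, change = tick(n)
    if change:
        print(f"tick {n}: step {step} -> generative change bang")
```

Expanding the sequencer, as described above, is just a matter of raising `STEPS`, exactly like changing the [% ] arguments and adding [sel] branches in the patch.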

There are two types of bang objects being triggered; all but one of the bangs follow through the steps of each of the [metro] object branches. These bangs are always being fired in sequence at the step tempo of their [metro] object. The one different bang object in the whole patch is triggering the generative change that will take place at the conclusion of the set number of measures.

Having the measure control is preferred over not having it. If constant change or chaotic generative music is desired, set the measures to one on the radio slider and the track's generative process will occur at the conclusion of every single measure. I have found that without the measure control the end performance loses its foundation, as the instruments wander off to a no-conclusion state. It's a matter of opinion, my directive & a choice I have made in this musical direction: I want to shape the generative process into a musical form rather than just let chaos reign.

The directive for the Band Theory tracks is to create a generating sequence of the central instruments found in a traditional band: drums (kick drum, hi-hat & snare), bass guitar & keyboard chords. This is a beginning step towards creating a greater band performance with variation. During the course of this track the drums, bass & keyboard go through random changes, initiated every 8 measures. The bass guitar roams at random through its eight-note scale. The keyboard chords are chosen randomly from a set of twenty 3-note chords. The bass guitar & keyboard chords are initiated at particular events in the drum sequence, giving the bass & keyboard chords a closer relationship to the rhythm. These types of connections I make within a Pure Data patch will be revealed in following patch images, videos & tracks produced.
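The random choices described above can be sketched as follows. This is an illustration under stated assumptions, not the patch itself: the scale and chord tables are invented stand-ins (the actual eight-note scale and twenty chords live in the patch), and `on_drum_event` represents the drum-sequence event that triggers the bass & keyboard.

```python
import random

G_SCALE = [55, 57, 59, 60, 62, 64, 66, 67]              # stand-in eight-note scale (MIDI numbers)
CHORDS = [[55 + i, 59 + i, 62 + i] for i in range(20)]  # twenty stand-in 3-note chords

def on_drum_event():
    """Fired at particular events in the drum sequence."""
    bass_note = random.choice(G_SCALE)   # the bass roams its scale at random
    chord = random.choice(CHORDS)        # one of the twenty chords at random
    return bass_note, chord
```

Tying both choices to the same drum event is what gives the bass & chords their closer relationship to the rhythm, rather than letting each instrument free-run on its own clock.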

As I develop a Pure Data patch for its musical composition, & one that I find particular in organization, I will upload versions of these songs that have me improvising along on guitar or synthesizer. Whether I play chords or lead is to be determined by a best fit to the song, after I have had a chance to play around with it on my multi-track. The track "Band Theory After" is almost the same as its predecessor. The most notable difference is the hi-hat. In the original track "Band Theory" the HH was almost hidden in the performance. By altering the conditional percentages of the [spigot] object, the HH jumps right into the performance as a prominent instrument in the mix.

The Band Theory tracks mark the beginning of my project to create a Pure Data patch that produces a conventional track in a definable genre of music. We will see where it gets me. You can always come back to chaos, as Pure Data patch structures do that very well.

* KicKRaTT; MUSIC, ALGORITHMS, DOCUMENTS, GRAPHICS & LOGOS: ISNI# 0000000514284662 *

Thursday, February 22, 2024

ROGUE 1989

My introduction to generating music 1988-1992

ROGUE was a program I worked on between 1988 - 1991 on my Apple IIe. The ROGUE BASIC program generated a text file that I converted into midi. Other programs that contributed to the development were RPG character-generating code, a program from BYTE magazine that attempted to create weather patterns, and code for a Traveller game add-on that generated planets. From time to time, programs in BASIC would appear in magazines like BYTE, Creative Computing & Dragon. When the Lotto first showed up in Florida, I remember writing a BASIC program that would generate numerous random lottery choices based on previous drawings. These types of BASIC programs all employed RND (random) functions, which conjured up numerous ideas as to how to generate notes.

If interested, select my KicKRaTT Journal link, found at the bottom of this post. On the KicKRaTT Journal, in the post ROGUE, you will find a segment of this code that I have held onto for years. I have gone through numerous hard drives and disks, moved around a lot & thrown a zillion disks away since. This portion of the program's listing is all that remains of my early attempts to generate music using an Apple IIe. The generative music process started with a BASIC program (not shown here) which would randomly choose a scale from the 10 different scales that I entered into its data strings. The second BASIC program, ROGUE, would generate the midi notes based off the notes of the scale chosen by the first program, and would create one large text file or printout. After parsing the file in Lotus 123 on my Mac SE/30, and adding a velocity & length to each of the generated notes, I would then convert the text file to a midi file with a C program I wrote in the Macintosh Programmer's Workshop environment. Passport's MasterTrax, the sequencer software package I used back on the SE/30, could open the resulting 4-track midi file created by the ROGUE program. This rather cumbersome 3-program system of mine worked very well in generating a midi score & removed the long hours of having to recompose or re-enter masses of midi notes manually from a text file or line printer output, which I periodically did while testing the modules of my program. The versions of ROGUE that actually created the midi notes for the ROGUE tracks employed many GOTO & GOSUB routines that branched off of this portion of the program I have included in the KicKRaTT Journal post "ROGUE": subroutine branches that would create melodies, store drum patterns, and calculate chords into the data strings, to be used routinely in the composition process.
Once in MasterTrax, I made numerous edits to carve the song out of the mass midi note blitz and remove out of key notes that had gone rogue. The end result was 4 tracks of midi in key with one another. Whether it created a moving piece of music is best left to the listener. I was proud that as a completely generated composition, the four tracks of midi notes stayed in the key of the chosen scale for the duration of the song and didn't somehow shift in some part of the conversion process. That was all the effort back then.
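The ROGUE pipeline described above can be sketched in modern terms. This is an illustrative reconstruction, not the surviving BASIC listing: the scale table, the text-file field layout (tick, note, velocity, length), and the values are all assumptions made for illustration, standing in for the data strings and RND-driven routines of the original.

```python
import random

# Stand-in for the first program's 10 scales held in data strings:
SCALES = {
    "C major": [60, 62, 64, 65, 67, 69, 71, 72],
    "A minor": [57, 59, 60, 62, 64, 65, 67, 69],
    "G major": [55, 57, 59, 60, 62, 64, 66, 67],
}

scale_name = random.choice(list(SCALES))   # program one: pick the scale
scale = SCALES[scale_name]

# Program two (ROGUE's role): generate notes from the chosen scale as text,
# one line per note; velocity & length stand in for the Lotus 123 step.
lines = []
for t in range(16):
    note = random.choice(scale)            # RND-driven next-note choice
    velocity = random.randint(64, 112)
    length = random.choice([120, 240, 480])
    lines.append(f"{t}\t{note}\t{velocity}\t{length}")

print(f"scale: {scale_name}")
print("\n".join(lines))
```

Because every note is drawn from the one chosen scale, the output stays in key for the duration, which is exactly the property the four ROGUE tracks were built to guarantee.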

PROGRAM: ROGUE

  • Description: Random Music Generating Composition Program
  • Date: November 22, 1990
  • Language: Apple BASIC
  • Location: Clearwater Florida

The ROGUE tracks that I have uploaded to SoundCloud

Ducks Pond, 1989.

Ducks Pond was a generated 4-track midi sequence created on an Apple IIe, using a program I developed called ROGUE. The midi file was run using Passport's MasterTrax. The sound modules used: a Roland Sound Canvas & Yamaha FB-01. Ducks Pond was recorded on an 80s Yamaha recording console with TEAC reels, mastered to cassette. Ducks Pond was my first completed generated composition. While the FM tones fall dead in comparison to what was available back then, I do feel that my composition contains a rather bombastic imagery & some rather intriguing melodies created in the generation. The chaos exists in some of the audio-carpet moments where all four of the tracks are in key, yet wandering off to their own conclusions. Like four different bugs forced together in a box. It also shows the uncontrollable side of a generated score, which IMO is the watermark of a genuinely created generative score: where the played scale of notes just wanders off into routines or next-note choices that evoke questions in the listener.

Swamp Gas, 1991.

Swamp Gas was a generated 4-track midi sequence created with ROGUE. The sound modules used: a Roland Sound Canvas, Yamaha FB-01, E-mu Procussion module & the Korg Wavestation. Recorded on a TASCAM 1508 mixer & TSR8 reel-to-reel, with an Alesis reverb.

The Beginning

* KicKRaTT; MUSIC, ALGORITHMS, DOCUMENTS, GRAPHICS & LOGOS: ISNI# 0000000514284662 *

Wednesday, February 21, 2024

Frank Zappa

"Without music to decorate it, time is just a bunch of boring deadlines or dates by which bills must be paid." - Frank Zappa

What started my interest in creating generated music? I was very influenced at the time by Frank Zappa's Jazz From Hell (JFH), released in 1986. The album is not only a dramatic presentation of Zappa's compositional scoring talent, but an introductory blitz into computer music & his use of the Synclavier music system. The Synclavier tracks on JFH were composed by Frank himself & are not randomly generated, though a first listen would lead you to consider a massive computer construction. FZ hand-wrote most of the album's tracks 3-4 years before JFH. His written scores were entered directly into the Synclavier & played through its FM / additive synth engine, employing the Synclavier not only as a scribe & conductor but as a single 16-track, 64-voice / 32-output synth module. No other synth was used on this album. The whole process from written work to final performance was carried out on this single Digital Audio Workstation (DAW). Zappa's Jazz From Hell was the inspiration; getting introduced to the idea of generating music was the outcome. The rest is contained in the creative pursuit.

I did not, in 1986, have a Synclavier at my disposal, nor would I have a music system with those same specs for another 15 years. I would instead be programming my score on an Apple IIe, sequencing on a Mac SE/30 & using Roland's Sound Canvas & Yamaha's FB-01 as my synth engines. A rather BIG difference when it comes to DAW power. But no matter: just as FZ transcribed his written score into his machine, I would be doing the same, but with code, & into a hell of lesser machines. Frank's Synclavier system vs my Apple II system; my programmed code vs Frank's musical genius. When constructing code that is going to generate a musical midi composition, should I not reach for a style or an approach to playing the score? How does one even go about composing like Frank Zappa if one is not Frank Zappa? How can you even conceive a code based on the stylistic approach of Zappa?? Quite the dilemma to ponder??? The ongoing quest for new gear & pondering thoughts like this are the stuff musicians are made of. JFH is a fantastic journey into electronic music composition. Certainly not the European sound of the new wave & way more compositional than anything in the progressive genres of rock or jazz. The process initiated long nights of thinking, pondering over equations & how to use them musically. IF THEN ELSE routines, random procedures & predictive percentages all went to work in the code that I manipulated in order to humanize & create a style in my BASIC & PASCAL produced midi "text" note scores to file.

With the music technology of 1988, the midi equipment and the commercial hardware you bought often had problems performing your midi sequences. Quite often you hit either the limitations of your PC's processor or the maximum polyphony of your synth module. No matter how much money you poured into your gear at that time, all midi performances dealt with midi issues. The way the executed midi sequence sounded was odd: a very sound-on / sound-off, music-box result. The musical compositions would sound like a typewriter & move like a robot. Accessing the many properties of a midi note these days is easy; back then you were satisfied if the note played, stayed in key & ended. There were many system shortcomings that were difficult to program around. The number of voices & polyphony, always a factor. This was what running large midi compositions was like just a few years after midi was introduced in 1983. In the midi generating system that I developed, it would take numerous calculations to produce the final midi file for my Jazz From Hell like performances. It would take numerous runs of the final sequence in MasterTrax to record it without midi error. Constant repetition in programming, sound design & recording to achieve a final outcome that I, eh... sort of desired? The system I developed over four years produced around 5 incomplete works & 4 completed to a point. To a point at which I gave up on an idea and just stopped calculating, ending the song with a long mixer fade-out or abrupt ending. Jazz From Hell influenced me greatly back then & still resonates in the current compositions that I am exploring on this journal & SoundCloud.

KicKRaTT Studio

Welcome to my studio. This post is just a manifest of the synthesizers, gear & software that I employ. I won't go into a lengthy description of any of the individual components, though how I have used them in a specific track may come up in the future. With the exception of a few items in my studio, anyone can find information, videos & sales listings for any of the components listed here. My sound is created through the use of my instruments, analog & digital midi hardware, and the electronic projects & synthesizer modules I have built.

MODULAR SYNTHESIZER

CAB1 : DOTCOM QCP22

DOTCOM Q106 Oscillator | DOTCOM Q141 Oscillator Aid | DOTCOM Q106 Oscillator | DOTCOM Q106 Oscillator | STG Mankato Filter | DOTCOM Q107 State Variable Filter | DOTCOM Q104 Midi Interface | DOTCOM Q117 Sample & Hold | DOTCOM Q105 Slew Limiter | DOTCOM Q124 Multiples | DOTCOM Q109 Envelope Generator | DOTCOM Q110 Noise | DOTCOM Q118 Instrument Interface | DOTCOM Q130 Clipper | DOTCOM Q109 Envelope Generator | STG Sea Devils Filter

CAB2 : DOTCOM QCP22

DOTCOM Q150 Transistor Ladder Filter | OAKLEY Cota VCF | OAKLEY VC Phaser | MOTM 910 Multiples | KLEE Sequencer v2 Scott Stites | MOTM 190 VCA | DOTCOM Q109 Envelope Generator | DOTCOM Q127 Fixed Filter Bank | DOTCOM Q113 8 Channel Mixer | DOTCOM Q109 Envelope Generator | DOTCOM Q108 Amplifier | DOTCOM Q111 Pan / Fade

CAB3 : GREY Homemade cab based on the QCE22

DOTCOM Q960 Moog Sequencer Controller | DOTCOM Q962 Sequential Switch | MOON 565E Quantizer Controller | MOON 565 Quad Quantizer | MOTM 910 Multiples | MOON 553 Voltage to Midi Converter | MODCAN 70B Triple VCO | MOTM 310 micro VCO | CGS Dual Real Ring Modulator | ENCORE ELECTRONICS FREQUENCY SHIFTER | STG WAVE FOLDER | MODCAN 69B SCANNER | TELLUN CORP VEEBLEFETZER | DJB-019 David Brown's JYE-TECH OSCILLOSCOPE | MOTM 101 Noise SnH | DOTCOM Q149 Signal Switch

SYNTHESIZERS

EMU Ultra Proteus | EMU Xtreme Lead | Korg Poly 61 | Korg Wavestation | Korg Wavestation SR | Korg Wavestation AD | Novation Supernova | Roland JX-10 | Roland JV-1080 | Roland JV-1010 | Roland RS-09 | Sequential Circuits Pro-One | Sequential Circuits Prophet 600 | Waldorf Micro Q

RECORDING GEAR

FOSTEX 3180 Spring Reverb | Korg D888 hard disk recorder | Lexicon MPX 1 | Soundtracs Topaz 32 channel mixer | TASCAM 1508 8 channel mixer | TASCAM TSR8 Reel to Reel | Tascam DR-680 8-Track Portable Digital Field Recorder | t.c. electronic D-TWO delay | TELLUN 156 Neural Agonizer - a custom spring reverb

DRUM MACHINES

BOSS Dr-670 | BOSS Dr-202 | BOSS DC33 Clone | EMU PRO/CUSSION | ARTURIA DrumBRUTE | PROJECT 9090

GUITAR GEAR

Gibson Les Paul & Ibanez Fretless Bass Guitars

DigiTech Legend II Guitar Preamp | DigiTech 2112 Guitar Preamp | ART TPS II Tube Preamp System | BOSS CE3 chorus, BF2 flanger, DS1 Distortion | Electro Harmonix EH4800 small stone phase | Yamaha UD-Stomp Delay Pedal | Fender PRO185 amp (2)

SOFTWARE : Reaper | Wavelab | Pure Data [pd~]| AudioMulch | EMU Proteus XV & Emulator X | MasterTRAX | synthedit & VSTs for wave file editing & mastering only.

PC's (3) : Windows 10 (recording), Windows 7 (pd & sequencing), Windows XP (odd stuff)

* KicKRaTT; MUSIC, ALGORITHMS, DOCUMENTS, GRAPHICS & LOGOS: ISNI# 0000000514284662 *

early pure data tracks

There are tracks that I will upload representing a transitional phase of integrating Pure Data into my music-creating process over the years. These tracks employ a single Pure Data patch that generates music for maybe just one instrument, triggers samples or orchestrates the drum machines. These past tracks are relevant to my current endeavor and I have uploaded a few of them to SoundCloud. To get things started, these tracks were the first songs that I uploaded to my SoundCloud KicKRaTT account.

Chrysalis

This skin, it cages me. I long to be free. To escape this puzzle of bone and flesh. Time and space. Birth to death. Death in chrysalis will be to live once again.

Chrysalis, a metaphorical personification in audio. To paint a picture with sound.

Chrysalis is an algorithm performance generating midi notes converted to voltages for the high frequency modular synth oscillator noise (representing the cocoon). The algorithm also generates the notes for the low-end piano tremors, a Roland MKS piano module (the butterfly wrestling about inside). Generated notes play the Roland JX10 (the improvisational butterfly leaving the cocoon). There is a lot of live mixing here that directs the high frequency audio but otherwise a totally generated performance.

This 2004 recording is one of my first completely generated algorithm compositions.

COIL 2012 : A Pure Data patch that generates the drums, percussion rhythms & sequenced samples of our parrot (African Grey). I accompany the patch with a performance using a Sequential Circuits Pro-One (drone performance), a layered Novation Supernova & Waldorf Micro Q, and bass guitar.

* KicKRaTT; MUSIC, ALGORITHMS, DOCUMENTS, GRAPHICS & LOGOS: ISNI# 0000000514284662 *

KicKRaTT Mobile

What is KicKRaTT? In the 80s I found this steel medical cart that I could pile my gear & effects on, roll out onto a stage & quickly roll off. The cart was so durable that it could get kicked off the stage & not fall over. It was heavy & had these real wide casters. "Kick that cart over here!" became a moment at rehearsals & gigs. It supported my first drum machine, a Yamaha RX11, used only for the kick drum & as a metronome for the drummer. The kick drum-cart was now an item; it was part of the music & a member of the band.

M - CHANGING CHANNELS was the first project I produced from beginning to end. Recorded the music, made the cassette inserts, mass produced the tapes. Sold few. Handed out many.

Sometimes we forgot to stop the drum machine after songs. Sometimes the drum machine kept going after we thought we had stopped it. Alongside being a click track & a kick drum machine, and the angst our real drummer had against drum machines, the "kick drum-cart" took on a life of its own. Over the next couple of years the cart was called all sorts of things. The cart dealt with many a pummeling, kicking, throwing... paint, beer, you name it. That cart went through the hell of every gig. By the early 90s, the slang name for the cart, "KicKRaTT", was in the conversation. In '91, I used the name on a cassette project that I produced in our local scene. Regulars that found their way to gigs would often say that they only came to see the DrumRaTT. Lots of ridiculous stories surround this music equipment support cart. By '98, I had retired the cart & moved all the gear into a rack enclosure. The original rack enclosure was worse than the commercial enclosure seen in these pictures from 2002, but the name lived on. KicKRaTT now resides as a rack in my home studio & I still have the old cart in my garage. All of the original gear that was housed on the support cart, except for the guitar pedals, is long gone, but the cart & the part it plays in my musical stories still lingers on. The cart has stayed with me forever, carrying my gear from one stage to the next, from one project to another. "KicKRaTT" is the name of my studio. You will find that I have used the name on numerous promotional logos and graphics for my musical projects. It's a name that has found its way into all of my music.

& now it's the lead name I have given to this project. KicKRaTT

As this project develops and you find yourself reading through this blog, you will find the namesakes KicKRaTT and KaOzBrD given to the algorithms as well. Both of these names have also been used, and can be found, as our team name in the international AI community. The KicKRaTT name has always been identified with the drums. The KaOzBrD name is everything notes, chords & harmonizing. A brand personification.

* KicKRaTT; MUSIC, ALGORITHMS, DOCUMENTS, GRAPHICS & LOGOS: ISNI# 0000000514284662 *

Hello & Welcome

Welcome to the KicKRaTT web on electronic music composition: generating midi notes using algorithms, commercial devices, noise and conversion data. This journal presents the different ways I generate midi music, the compositions & the online activity of KicKRaTT. Structuring algorithms to generate musically harmonized composition. Developing a process to compose intelligent music & bring structure to chaotic RANDOM / CONDITIONAL music generation. At present, I am generating MIDI with algorithms structured in the Pure Data (pd) environment, aka open-source Max. Generating MIDI in this way produces solos for individual instruments & multi-instrument synthetic bands, infinite improvisation, and datasets. I use the generated midi to orchestrate the audio components in my studio, construct synthetic datasets for large language model (LLM) training, hold improvisational live jam sessions, and pursue genre-specific music generation. I hope to cover numerous midi generating schemes that produce humanistic instrumentals, explore LLM technology to predict composition variations, and continue my work on my "evolving musical algorithm": an algorithm that generates music and changes its conditional statements in real time to improve the musicality of the composition output.

In the audio department it is all about guitar & bass sounds. Part two, titled KaOzBrD, will be a dramatic shift towards a guitar-centric improvisational band. I have made serious progress on the guitar & bass sound that will change the KicKRaTT sound completely. One might even say the genre! Stay tuned!

This Welcome Page serves as a gateway to the different online presentations related to the KicKRaTT music project and the links found on the LinkTree page. If you have arrived here via platforms such as Linktree, SoundCloud, SoundClick, YouTube, or Vimeo, you will find all the music associated with this project uploaded to the KicKRaTT and KaOzBrD SoundCloud, SoundClick & ReverbNation accounts. At these accounts you can hear all of the music discussed on this journal, featuring both new works and previous relevant compositions. On YouTube, Vimeo, and Odysee, you will find videos of the generated, predicted & performed music at various developmental stages. KicKRaTT music can be purchased on SoundClick and Bandcamp.

This Google Blogger is the straight-forward version of the KicKRaTT Journal.

The current project involves generating midi from algorithms structured in the Pure Data (pd) programming environment. Pure Data, an alternative to Max/MSP, can be downloaded online. The focus is on designing algorithms that generate midi through random, conditional and expression-based mathematics. The structured algorithms create endless midi music. The project's objectives: to craft a musical style into the algorithm's design that evolves during the generative process; to generate a definable form of music that easily fits into mainstream music genre categories; and to produce computer-generated compositions that exhibit a distinctly human-like quality. I will be documenting this journey here on the Dreamwidth journal (covering the process and conversations) and on Google Blogger (streamlined) to provide insight, conversation and variety.

Thank you for enjoying the music.

KicKRaTT has taken part in the International AI Song Contests of 2024 & 2025. The 2024 AI song submission "ARBOREAL" was composed with Pure Data algorithms generating enough midi input to build synthetic learning datasets for the large language models (LLMs) used to predict. The Pure Data structure essentially generates its own learning dataset and then generates the midi input for the LLMs to predict from; the midi compositional process blooms in a self-generative way in this procedure. It's a unique procedure and has been recognized as a first in auto-generated music composition by the AI community. The process was elaborated on for the 2025 AI competition entry SOLARIUM: generating learning datasets with algorithms & various commercial midi devices, utilizing LLMs, specifically GPT and LSTM models, trained on the generated midi input datasets, and composing a final score through predicted variations. The goal was for an algorithm to generate the LLM training datasets and for the LLMs to predict a composition based on the continued generated midi input. The algorithm or device that created the dataset also created the midi input that the LLM used to predict. While there were minor adjustments to the conditional statements of the algorithm during the process, the statement holds true that the same algorithm or device that created the training dataset is the same one that wrote the song: an original score from a freshly generated source of data, avoiding the use of commercial or historical audio samples or MIDI files to train the AI models.

Constructing the AI workstation for the GitHub LLMs to reside on, developing the Pure Data structures to generate the MIDI input datasets, and creating an AI training program that evolved the algorithmic composition through predicted variations and managed the datasets: that was the endeavor. The process and competition participations are presented and discussed in detail in these journal entries, on Google Blogger, and in YouTube videos. This developed process has been presented, examined & judged in the two AI competitions of 2024 & 2025.

If it is the competition tracks ARBOREAL or SOLARIUM that have brought your interest here, then proceed to one of the links below or the YouTube & SoundCloud KaOzBrD icons at the bottom of this entry. The song ARBOREAL and the AI Song Contest experience are pretty well documented here.

A brief reporting on the International Ai Song Contest of 2024.

Ai Song Contest 2024 ARBOREAL Process Document.

A brief reporting on the International Ai Song Contest of 2025.

Ai Song Contest 2025 SOLARIUM Process Document.

Thoughts about constructing Ai datasets.

* KicKRaTT; MUSIC, ALGORITHMS, DOCUMENTS, GRAPHICS & LOGOS: ISNI# 0000000514284662 *
