

Archive for October 2011


Pace makes perfect

ahhhh...that's better. finding a new pace does wonders for the sanity.

night before last, the overcooked pace of the project had started to take a toll on my sanity. I couldn't focus. I was getting depressed. ideas were blocked. I took a few hours away from it all and determined that it may have been because I was doing everything from my 3-monitor control center, made up of one borrowed laptop (mine blew up last month) and a donated desktop and secondary monitor, which has been invaluable for developing the visual interface. I think it was just too much LCD for too long. so yesterday I took a step back and decided to just use my smartphone for blogging and Internet duties and let the big computers be for development only. whoa! what a difference! whereas my blog posts were sporadic and taking upwards of 4 hours to write, now I sit here on the couch and do a nice concise stream of consciousness for about 45min-1hour and have the whole rest of the day left for working. yaaaay! this is a habit I think I will hold onto.

another piece of good news is that paypal didn't have a vendetta against me after all. it would seem that my problems were down to one lone, really really shitty customer service rep, whose "service" the rep I spoke to last night apologized for. that was stressing me OUT, as I depend on that account for my crowdfunding efforts and there is literally no alternative to paypal. that thought still scares me but at least we're back on good terms. a few people had problems contributing to the campaign. if you were one of them, you can now do so (thanks in advance ;-))

I made a loop last night! it pops and loops in odd places but it's a good first step. currently, it's more about learning the grammatical syntax of the audio part of this language. it's much much more involved than the "control" parts which I am used to.

this part deals with very fast data. CD quality is 44100 samples per second, which means that the computer is taking amplitude "samples" of incoming audio 44100 times a second and playing those recorded samples back at the rate they were captured (the samplerate). so for something like looping, there are many things that must be looked at, like making sure that the audio you've recorded is the right length, so there is no gap of silence at the beginning or end. also making sure that when it wraps back around, you've looped at a position that doesn't pop, called a zero crossing point. I have no clue about that part yet. that's today's gig. but once these things are sorted, the loops must be numbered and stacked so that they can be played together. that should be...wait, let me not assume anything is e-a-s-y yet. I'll stick with one foot in front of the other for right now.
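the zero-crossing idea can be sketched outside of pure data. here is a hedged illustration in python (the function names are mine, not Pd objects): walk from each end of the recording to the nearest point where the waveform crosses zero, and cut there so the loop boundary doesn't pop.

```python
import math

def find_zero_crossing(samples, start, step):
    """walk from `start` in `step` direction to the nearest zero crossing."""
    i = max(0, min(start, len(samples) - 2))
    while 0 <= i <= len(samples) - 2:
        if samples[i] == 0.0 or (samples[i] < 0) != (samples[i + 1] < 0):
            return i
        i += step
    return start  # no crossing found; fall back to the original point

def trim_loop(samples):
    """cut a recorded loop at zero crossings near each end to avoid clicks."""
    head = find_zero_crossing(samples, 0, step=1)
    tail = find_zero_crossing(samples, len(samples) - 1, step=-1)
    return samples[head:tail + 1]

SR = 44100  # CD-quality sample rate
# one full cycle of a 100 Hz sine, with sloppy non-zero edges around it
cycle = [math.sin(2 * math.pi * 100 * n / SR) for n in range(SR // 100)]
loop = trim_loop([0.3] * 5 + cycle + [0.3] * 5)
# loop now begins and ends at (or very near) zero, so wrapping won't pop
```

this is only the "cut at a crossing" half of the problem; the real patch also has to handle loop length and playback scheduling.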

from the small amount of looping I did last night I can attest that it will not sound anything slightly like any other music I have done. it goes a little something like this:

-"transform" into a particular sound
-loop either the sound or a phrase played with the sound
-transform a sound that "fits" with the previous sound
-that's all I hit so far because I only have one crappy loop.

The feel is visceral. what goes with what and how and why will be engaging, to say the least.

I got my accelerometer-oscillators to work last night as well. this, when it works properly, should allow me to do the thing I thought I was going to do primarily with these controllers when I envisioned them: glowsticking with sound, where the speed of my hands making light-trail patterns would control aspects of the basic sound elements. in my mind, it should sound like lightsaber swooshing noises, but pitched so I can play them musically. currently it sounds like a broken car radio in a car that just hit a speed bump, causing it to short out every so often. it's all good though. I still got time.
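as a hedged illustration of that goal (the constants and function names here are made up for the sketch, not my actual patch): take the accelerometer magnitude as "hand speed" and clamp it onto a playable pitch range.

```python
import math

def magnitude(ax, ay, az):
    """overall motion intensity from the three accelerometer axes."""
    return math.sqrt(ax * ax + ay * ay + az * az)

def motion_to_midi(mag, low=48, high=84):
    """clamp a 0..1 intensity into a midi note range so swooshes stay musical."""
    mag = max(0.0, min(1.0, mag))
    return round(low + mag * (high - low))

def midi_to_hz(note):
    """standard equal-temperament conversion, A4 = 440 Hz."""
    return 440.0 * 2 ** ((note - 69) / 12)

# still hands sit at the bottom of the range; a fast swoosh pitches up
low_note = motion_to_midi(0.0)
fast_note = motion_to_midi(5.0)  # intensity past 1.0 just clamps to the top
```

quantizing to midi notes (rather than raw frequency) is what would make the swooshes playable as melody instead of siren noise.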

thank you to those cats that contributed to the campaign yesterday. every bit will help create something mind-blowing. I am shooting for nothing less than mind-blowing! believe. if you would like to be part of creating a new way of creating and performing music, go over to and check out the perks. I think you'll dig what's on offer.

back to work. holla.

looper dev and burnout management

spent the last 24 hours working on the looping system. my idea is that each sound element will have its own looper, which should allow me to control each part more intimately than one looper for the whole system. the design is such that it will have the feel of a single looper, but when I go into edit mode, it should allow detailed manipulation.

my experiments yesterday led to me crashing my machine and almost destroying my patch, because I multiplied the sample rate by the length of the recording and overloaded a buffer I didn't know existed (one of many, I am coming to see). that was frustrating, as I had to rebuild parts of it from scratch and lost quite a bit of momentum.
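for what it's worth, the kind of guard that would have saved me is simple: compute the buffer size from samplerate times length up front, and refuse anything absurd before allocating. a hedged python sketch (the ceiling value is my own assumption):

```python
SR = 44100          # samples per second (CD quality)
MAX_SECONDS = 60    # assumed sanity ceiling; pick whatever fits your machine

def buffer_len(seconds, sr=SR, max_seconds=MAX_SECONDS):
    """samples needed for a recording, with a guard against runaway sizes."""
    if seconds <= 0:
        raise ValueError("recording length must be positive")
    if seconds > max_seconds:
        raise ValueError("refusing to allocate %s seconds of buffer" % seconds)
    return int(seconds * sr)

# a 2-second loop needs 88200 samples; asking for 10 minutes raises instead
two_second_loop = buffer_len(2)
```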

momentum is so important right now. I have, for better or for worse, placed myself in the precarious situation of having no fallback plan; either this system works or I have screwed myself. it's great when everything is working and energy is high, but a month of 18-hour days in one room in front of a computer is starting to take its toll.

for instance, I realize that right now, being in the middle of a crowdfunding campaign on which everything hinges, I can't take another week away from blogging, but I can't stare at 3 monitors today. my brain is fried. so I am blogging this from my phone and may go somewhere and code from there this afternoon. I phone-blogged all last summer and it was great. it didn't take as much time and felt more like entries in a diary. I may continue this way. since there is no "team" that can take over when I am tired, I must create new ways to keep these plates spinning.

the looping part of this project is interesting. I must initiate recording with one button, stop recording with the same button, give it a small amount of time to take the length of what I recorded and apply it to the place where the loop is to be stored and looped, called an array, then make it start looping without glitching on the loop boundaries like a skipping record. my plan today is to create a recorder separate from the looper, whose job is to record and measure what I play into it. that will be shuttled to a second recorder which will take over from the first. its job will be to record everything from loop 2 forward at the proper length and send it to a "play" array. this seems like the most workable solution for my project, because then it will have the recorder up front, which doesn't have to worry about looping, just recording. and once one of the loopers has recorded a first loop, it is turned off until the stop or reset button is pressed.
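the two-recorder plan above could be sketched like this in python (names are illustrative; the real thing is a pure data patch): the first recorder only measures the master loop, and every later recorder is held to that length.

```python
class MeasuringRecorder:
    """records the first loop freely and reports its length."""
    def __init__(self):
        self.samples = []
    def record(self, chunk):
        self.samples.extend(chunk)
    def stop(self):
        return len(self.samples)  # the master loop length, in samples

class FixedLengthRecorder:
    """records loop 2 onward, truncated or silence-padded to the master length."""
    def __init__(self, loop_len):
        self.loop_len = loop_len
        self.samples = []
    def record(self, chunk):
        self.samples.extend(chunk)
    def to_play_array(self):
        out = self.samples[:self.loop_len]
        out += [0.0] * (self.loop_len - len(out))  # pad short takes with silence
        return out

first = MeasuringRecorder()
first.record([0.1] * 100)            # play the first phrase
master_len = first.stop()            # 100 samples becomes the master length
second = FixedLengthRecorder(master_len)
second.record([0.2] * 80)            # a shorter second phrase...
play_array = second.to_play_array()  # ...comes out at exactly 100 samples
```

keeping the front recorder ignorant of looping is the point: it has one job, and the fixed-length stage enforces alignment.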

efficient? I have no idea. this is my first looper. at the very least it will lead me to a more efficient design after I get it working. I will take my time on it. in the meantime, although I will be blogging, the next few days will be just my phone, pure data and my paper notebook.

contributing to my campaign will allow me to focus on building this system. it means that all of this insane amount of development will be placed into a proper construct and shared with the world in myriad ways, from workshops to busking systems to concert systems that reimagine the interaction between music, lights and audience. I've never heard the word "bombastic" more than I have this year, but that's the goal. check it out at

Loops and Drums

31 days left to help make the beatjazz system a reality. your contributions will allow us to realize this concept fully, and the full concept will blow your mind! contributing today means that i can start acquiring the parts and resources necessary to start working on the hardware, TODAY! head over to and help make this a reality!


The gestural grid works. it's time to move on to the next stages of the system, with the goal of having a basic complete construct by week's end. loops and drums, in that order.

Now that i have access to these mental sound creation possibilities, i just want to buffer them against themselves in the form of looping. after spending the last 3 years using looping as the exclusive means of creating music and expressing myself, playing lines that don't keep playing after the initial idea is expressed just seems wrong and alien. i am finding myself playing these little looping structures using the new sound capabilities, but then they just dissipate as soon as i stop playing. it feels so incomplete.

Luckily, as is becoming commonplace in this project, the previous step greatly influences the next. now that the sound is gesturally enabled, looping will be an integral function of the synthesis. each sound element (drums, bass, and 3 synths of varying architecture) will have looping integrated into its design. i just started working on the looping buffer for the first synth. it's very, very rudimentary; it records and it loops, but there is no volume control or fx and it pops at the end of the loop. still, it works, somewhat. i will be working to make it dynamically resize and create copies of itself so i don't have to predetermine the number of buffers. it should just create as many as i need.
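the dynamic-resize idea might look something like this in python (purely a sketch of the concept; pure data arrays behave differently): a pool that creates a fresh buffer whenever one is asked for, rather than preallocating a fixed count.

```python
class LoopBufferPool:
    """hands out loop buffers on demand instead of preallocating them."""
    def __init__(self):
        self.buffers = []
    def new_buffer(self):
        buf = []
        self.buffers.append(buf)
        return buf
    def mixdown(self, length):
        """sum every buffer sample-by-sample into one output of `length`."""
        out = [0.0] * length
        for buf in self.buffers:
            for i in range(min(length, len(buf))):
                out[i] += buf[i]
        return out

pool = LoopBufferPool()
pool.new_buffer().extend([1.0, 1.0])
pool.new_buffer().extend([0.5])   # a second buffer appears only when needed
mix = pool.mixdown(2)             # [1.5, 1.0]
```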

The drums currently are just envelopes and white noise.  i just need placeholders so i can get it all strapped in.  once they are responding to gestural input, the likes of which i am still working on, i will build some proper beat beasts to trigger in there.

Once this is all done, i'm gonna have to put the sound elements and looping in a separate patch. there is an object called pd~ that lets patches speak to each other across different instances of pd, which has the side benefit of allowing pd to work with multiple cpu cores. so far, i've just been lazy in that regard, but i will get on it early Saturday.

I am giving myself a soft deadline of Sunday to make a functional gesturally controlled system. if it works by Saturday night, i will, if it's not too cold, go test the whole thing, including visualization (i'll bring my second monitor), at Tacheles in my favorite mezzanine area. i really would enjoy doing that, as i have not played a single continuous piece of music in over a month and all this programming is making me stir crazy.


MAKE Mag vol28 in stores today! official release of the Beatjazz controller open source project.

OK kids, hot off the presses. here is the official release of the beatjazz controller open source project by way of MAKE magazine volume 28, which should hit stores today. i promised during my first crowdfunding campaign that i would make the controller an open source project, and i am happy to have had MAKE magazine help me keep that promise in a bigger way than simply posting it on a page on my website (which i will be doing in 2 months, during the newsstand run of this issue of the magazine).

a few notes for those of you ready to try your hand at building a beatjazz controller.  first, go get MAKE mag vol28.  then go over to the makemag beatjazz controller wiki at where you will find

  • the pure data patches that the controller is built upon
  • the schematics in fritzing format
  • the config files for the xbees
  • the firmata firmware for the arduinos
  • the printable hand unit case templates, to make the cardboard controller shells.
  • a full fingering chart (easily modifiable)
  • the midi controller assignment chart


This version is the "controller" version, meaning that it is set up as a midi controller. i put A LOT of midi controllers in there. you don't have to use all of them; in fact, you don't really need most of them, but they are there just in case. actually, i would suggest against assigning all of them, because the flood of midi data will probably choke your music software.
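one hedged way to tame that flood on the sending side (this is a generic sketch of mine, not part of the published patches): only forward a controller value when it actually changes.

```python
class CCDeduper:
    """forwards a midi CC only when its value differs from the last one sent."""
    def __init__(self, send):
        self.send = send   # callback taking (cc_number, value)
        self.last = {}
    def update(self, cc, value):
        if self.last.get(cc) != value:
            self.last[cc] = value
            self.send(cc, value)

sent = []
dedupe = CCDeduper(lambda cc, v: sent.append((cc, v)))
for v in (64, 64, 64, 65, 65, 64):
    dedupe.update(1, v)
# only the three actual changes go out: (1, 64), (1, 65), (1, 64)
```

sensors that stream at a fixed rate mostly repeat themselves, so a change-only filter like this cuts the traffic dramatically without losing information.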

I want to especially thank Tomas Henriques, Hans Christoph Steiner, Gordon Good and Robert Guyser for their feedback and for fielding many of my questions while building it, and much thanks to "maelstrom" and "obiwannabe" for giving me the exact right answer so many times on the pure data forum.

Speaking of forums, I have a forum specifically for this project where i will post most of the super geeky research into beatjazz from now on, at  Currently, because i haven't been making use of it, it has become one massive spam trap, so i should be finished cleaning it in the next hour or so, at which point feel free to post any build questions you may have there.

I have entertained the idea of putting together a kit.  if enough of you let me know that this would be something you would dig, i will try and get one put together as part of my current crowdfunding campaign.  holla.


Introducing Gestural synthesis!

It is a big no-no to take a whole week off from a crowdfunding campaign. one must maintain constant momentum if one wants one's crowdfunding efforts to succeed. but i felt like i was having to split my time too much between blogging and programming, so i decided to take a week to jump into the project with both feet so as to have something to show you.

Demonstration of the gestural synthesis system

well kids, here it is as promised. it took a while to figure out the basics, but i present to you: gestural synthesis (more accurately, gesturally controlled modular synthesis). gestural synthesis is a means of designing sound by moving the beatjazz controllers spatially. this allows the artist (at this point, myself and the two contributors that have already contributed to my crowdfunding campaign) to create the sound that is in his/her head.

Circa 1999 - busking in San Francisco

Over the years, beatjazz has evolved from merely a stylistic concept, which i would consider when i would play wind midi controller live over my own pre-recorded beats or over a DJ set. by 2007, it had evolved to playing live over beats and arrangements that were also played live, using a technique called live looping. but i felt that the limitation was that i would need to preprogram every possible type of sound i would need to express myself, beforehand. i had quite a large number of sounds but was restricted to altering those sounds rather than being able to create wholly new textures. so at the beginning of this year, i created a new type of controller that allowed me to manipulate the sounds and effects more intimately and narratively, and this has been very close to what i envisioned, but i was still limited by my choice of sound generation: vst software synthesizers. these synths are amazing and sound great, but creating a way to interface with up to 20 different synths in a way that was cohesive was a pain in the ass. i had to create a different gestural interface for each one, and even then it didn't allow the freedom i needed. so in September, after giving my presentation at the World Maker Faire in NYC, i deleted my entire software-synth-based studio and switched to linux, where my only musical means of expression is Pure Data, a dataflow programming language.

from here, and from this pseudo-blank slate (my beatjazz controller was already based on pure data), i worked backward from the controller interface, inward toward the goal of using spatial hand positioning to control sound parameters in a way that wouldn't confuse me in my go-to hypothetical situation (playing in a hot dark club, late night, possibly inebriated). my eventual solution was a gestural grid that allowed me to place data points at very specific places in the gesture "bubble" (the complete depth, width, length and height of the ranges each accelerometer is capable of). from this point, the synthesis aspects became a simple matter of mapping sound functions to spatial points and ranges that i felt i could remember. this is the result of the first stage: a simple monophonic subtractive synthesizer with full gestural control of all parameters.
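the grid idea can be illustrated with a toy sketch (the grid resolution and the 0..1 normalization are my assumptions for the example): quantize each accelerometer axis into a coarse cell, so nearby hand positions select the same data point.

```python
def to_grid(x, y, z, steps=4):
    """snap a point in the gesture 'bubble' to one of steps**3 grid cells."""
    def cell(v):
        v = max(0.0, min(1.0, v))              # clamp into the bubble
        return min(int(v * steps), steps - 1)  # keep 1.0 inside the top cell
    return (cell(x), cell(y), cell(z))

# slightly different gestures land in the same cell, so a remembered
# hand position reliably selects the same mapped sound function
corner_cell = to_grid(0.10, 0.12, 0.08)  # all near the origin corner
top_cell = to_grid(1.0, 0.5, 0.99)
```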

i do this by placing the system into an edit mode where i can operate on the sound, then return to the default "play" mode to play it. in edit mode, all of the playing keys have a different function. new combinations of key and hand positions "latch" functions on and off, like a 3d version of the mechanism inside a door lock. by inserting your key, the ridges on the key push up a series of tiny piston-like pins that, when aligned, allow you to turn the key and open the door. it's like that. by pressing a key, then a modifier key, then a specific spatial position for your hand, you are able to manipulate the function assigned to that position, and once any one of these criteria ceases, you effectively exit the editing mode for that function. yaaaaay!!! long way of saying IT WORKS!!!!

NOW, the fun part begins (who am i kidding...it's all fun!). functionally, presets are no longer necessary. this is jazz, baby!! don't need no stinking presets. i can keep them joints right here!! (points to head). now, the journey back to trying to find a sound that was made in a previous set will be half the fun of playing. i have begun calling the method of moving from sound to sound "transforming" because, unlike "morphing", there is not the same sort of smooth transitioning between sounds. each must go thru a series of steps to get to the next, hence the term.

also, i discovered something last night as i added the first "buffers" (looping...more on that in a moment). in this context, "key", the relative collective pitches of a series of notes, grouped by the pitch of the root note, no longer plays any significant role in the musical structure, because one can create a harmonic structure that falls outside of western 12-tone tuning, but then create a "next" sonic element that adheres to THIS structure rather than an arbitrary, predefined one. THIS is the aspect i am most excited by. now that the system is "purely" pure data based, and sound generation and interfacing are completely integrated, looping has taken on a new significance. no longer is it strictly "loops" but more like "buffers" of sonic ideas to be parsed as a sort of sonic database. i realized this last night. the looping buffers i created with my simplistic looper are much more part of the sound than a separate element.

This was a hard part of the project. taking this from concept to playable reality bodes well for the rest of the software aspect of this project. the next truly exciting parts are creating accelerometer-based oscillators, which means that i will finally have realized what "glowsticking" will "sound" like (anyone else feeling the P.L.U.R.!?), and a vocal oscillator that will turn all of my sounds into pseudo-vocoders. THEN, to make 6 copies of this monster so that i can play chords!! these won't take that long, which will allow me to begin on the rhythm synthesizer, which no one will ever mistake for a drum machine, believe! after which, i will have about a week to refine the looping/buffering system and FX before i debut the whole, gig-worthy beta version of the system at the C.A.T. conference on Nov 5th at the new Theatre in the new Modulor art-plex at MoritzPlatz.

whew!! there is excitement in the air! I have no idea what this system will sound like and i can't wait to find out. but this is literally just the beginning phase of SOFTWARE development, which also includes diversifying the code so the controller can be used as a "beatbox" controller or a trumpet controller, and i have been contemplating a "tabla-like" controller that's part beatjazz controller, part mpc drumpad (i have the basic code, but i need to see more interest from the crowdfunding campaign before i will dedicate any time to developing them). in the hardware department, i plan to reduce the number of necessary wireless transmitters from 6 to 4, and if the campaign starts to pick up, acquire the 3d printer i will use to build beatjazz controllers for my campaign contributors. (i must note at this point that after the campaign, and after i build the controllers for the contributors, i will be working on my beatjazz system project, not taking orders for further controllers. these controllers that i design will be for contributors only during this campaign. outside of this scope, you could always build your own from the newly released MAKE magazine (vol28) buildnotes article i wrote, describing how to build the version i use now for prototyping.)

THEN, after all the perks have been shipped and i take a week off to party, i have 2-3 months of beatjazz system development, including the helmet systems (augmented heads-up display, transcranial direct current stimulation system, and integrated computing system, to name but a few bits), the busking rig, controller refinement, performance clothing design, and research into parameterizing gestures into narrative sonic constructs. so the next 6-8 months are pretty much etched in stone. this is just the start of what is to come.

I sincerely ask that if you are at all interested in this project, you do one of two things:

1. contribute. i am a very driven person, but drive is limited without resources. check out the perks i have on offer and see if one of them is something you would really dig, knowing that by grabbing it, you are an integral part of birthing an entirely new artform. and/or...

2. SHARE THIS CAMPAIGN. if you really dig what's going on here, please take a moment, or a few even, and send this post directly and personally to those persons you feel would really be interested in this project. we all know a few cats that would really be up for investing in a new artform, or buying a new type of instrument, even if it's just to buy it and put it on ebay in a few years, and have the resources to do it. share this link with those people. let them know that you believe in it.

like i have stated in previous posts, it is just "you" and "me". we are the only ones that are going to make this happen. i want "us" to make something special together.

ok, back to work...


Logging off to concentrate on the sound construct

"Do, or do not...There is no try"- Yoda

Every day is explosive with new insights and solutions. each day, things that were just ideas or theories become functions which newer ideas depend on. each idea births a new idea, which births a new function. i am in HEAVEN! creating a new form of synthesis is tediously geeky. if you are into that stuff, then hey hey hey!! i got a few new insights. but after this post i will be taking a break from any posts until i've grafted the sound constructs into the system. unless of course, some new profound insights pop up ;-)

Over the last week or so, i have been working on integrating sound elements into the interface i have been building, but i have realized a couple of important things. first, it's just not time to integrate sound yet. it's close, but the concept is to have a gestural interface that can mold the sound, intimately. i need to feel like i am holding it...pushing and pulling parameters. so getting the gestural interface right is of supreme significance.

it was at this point that something occurred to me. i realized that the interface "is" the synth. i had believed that the "sound" was going to be the synth, but now i see that the interface is the synth, and the oscillators (of which i have designed some mental ones!!) are simply there to relate the gestures audibly. this may sound like semantics, but it is an important distinction at this phase of development. there is no interface and no synth. there is only one cohesive construct for "wearing" sound.

the sound elements themselves will be simple but will react dynamically because of the network of data that is being fed to them constantly. they are the nucleus. there will be an inherent instability. there should be no way to produce the same sound exactly, twice. this excites me greatly.

since last week, the gestural grid has taken shape. i have now parameterized specific points in space and will spend the next few days relating this to the visual interface in a manner that is easily understandable by not just me, as this will be the heads-up display built into my helmet system, but will also be projected during my performances.

that's all i got. my head is in the code, not so much the blogging. this will be my last post until this thing is making sounds. i am on a little bit of a deadline, as i have the C.A.T. conference coming up on nov 5th, and between now and then i need to make sound, then make GOOD sound, then make effects, then make a looping system, all of which i expect to be as unpredictably insightful as the last few weeks have been, which is scary and exciting.

if it isn't apparent, my hope is that some of you reading feel inspired to CONTRIBUTE to the crowdfunding campaign i am in the midst of, so i can build this system properly. it's funny...on a group i belong to, someone posted a lament about how they wanted to support new, futuristic forms of music, so i posted the link to the campaign and they promptly did nothing. i would like to think that i can present such a project to YOU...yes, you. not the vague general "you" but YOU, and that you would support it by either contributing or passing it along to someone(s) that would really be into such a project. make no mistake though...with or without you, this is my dream...this isn't "going to" happen. this is "happening"...believe! bombastic or not, i want to change the world (there, i said it). i want everyone to create beatjazz one day. my idealism leads me to believe that presenting such an idea directly to people is the way to do it. that with open source and crowdfunding, beatjazz will retain a sense of artistic purity going forward that i don't think would be possible with a primarily commercial project. it is because of the support of those few of you who contributed to this campaign and the last one that this project exists at all! support this campaign and i promise you will be part of making the craziest live music system ever, possible. word!

So now i've got to get back to work.  see ya in a week.


Feature and interview on MakeZine live: Ep 18



here is the video of my interview with Becky and Matt of Makezine live


I’ll be online live tonight with MAKEZINE-live!

As part of the run-up to the release of vol28 of MAKE Magazine (the official debut of the beatjazz controller open source project), I will be video conferencing with the MAKEzine-live hosts Becky Stern and Matt Richardson about all things beatjazz. they will be showing some of the footage from the Maker Faire in NYC last month as well as previewing the article to come. I will share some of the new aspects of the beatjazz system as well.


you have 2 ways to tune in.  either or at 9pm ET/6pm PT (US), 3am Berlin time.   see you there!


Functional Gestural control parameterization! 47 days left to fund the Beatjazz System!

There are 48 days left to help fund the creation of the beatjazz system. right now, i am using this time of limited funding to work on the software systems. eventually this system will have to move into more physical constructs like hardware development, 3d printing, part fabrication, electronics, etc. picking up one of the perks on offer will ensure that the project succeeds. check me out over at

I must take a moment to explain why i keep going into such a geeky bag with my posts. firstly, because i am geeky. but also because there are cats out there that read these posts for pointers as to how i built my project, in the same way as i read many geeky posts about others' experiments on their projects. this is part of the open source ideal: freely sharing information. also because if you know where it came from and somewhat about how it works, you may dig it in a different way when i start gigging the new system. and lastly, for posterity. these notes in my notebook at home aren't doing anyone any good. i have found things i have written online that i don't have in my possession anymore (if it weren't for russian mp3 sites, my old albums from the 90's wouldn't even exist anymore, although i don't know if that is a blessing or a curse…).

The last few days have been marked by my ignorance of linux, but this time, as opposed to my dalliances last winter, there is no going back (for the music system; my internet computer is still xp). as well as getting reacquainted with the OS in terms of key commands, file structure and basic system stuff, i updated many of the apps that came with Pure:dyne, the linux distribution i am using. i didn't realize at the time that updating their heavily modified version of Pure data would break it. i wanted to update it because, to be honest, it was hideous. the windows version, while not "beautiful" in comparison to Max/msp's candy-inspired interface, was pleasing enough on the eyes to not be distracting; white background, grey boxes with black text and blue highlights when connecting wires onscreen. simple. but the version that came with puredyne had no contrast. black on white. period. it got the job done, but i wanted something a bit more aesthetically pleasing, since i have to look at it for 10+ hours a day, as well as wanting a few of the updated functions of the windows version so that, for instance, the library of objects, called 'externals', would be more compatible with the ones i had used to create my patches in windows.

I won't complain about spending 2 full days hunched in front of my computer installing/reinstalling stuff, because it gave me a crash refresher on the details of using linux, compiling one's own software, and generally being able to maneuver in a new system. i can change directory permissions like a champ now!! although i had wanted to use the new 0.43.1 pre-release PD version, it was taxing my patience a bit too much considering i am on a bit of a deadline, so i was finally able to appease the gods of binary and install a much more recent version, the 0.42.5 pd-extended. i edited the config file (.pdrc) by hand to load the necessary object libraries. and although it still says those libraries aren't loading, they work, and that's good enough for me right now.

just before the updating saga started, i was at a bit of a crossroads as to how to implement the gestural parameterization in a real, usable way. i could attach a parameter to an accelerometer direction and call it a day, but i need a precise way to access functions, say filter type, alter it, freeze the alteration, and still manipulate it further in a sort of "post-alteration" mode. last night, it came to me as if it had just been sitting there waiting to be discovered (PD flashcards each morning and night don't hurt either). i created a set of cascading "latches" that allow me to use 2 or more physical controls to manipulate the data streams. so, for instance, pressing key one will tell the system "send parameter (x) to the accelerometer, but wait for...", then pressing key 8 allows the function to go thru. once key 8 is lifted, the function freezes at the point my hand was at at that moment. pressing key 8 multiple times allows me to scroll thru a subset of functions that can be assigned to key one (per our filter example: filter envelope, cutoff amount, etc.) and on and on. basically, multiply this by 6 keys (the last two are dedicated to function scrolling), and multiply that by, say, 4 functions per key, each with 2-3 axis control vectors, and you have access to approximately 40-60 gestural control vectors!
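in python pseudo-form, a single latch in that cascade might look like this (the names are mine; the actual implementation is pure data objects): the accelerometer stream only writes to a parameter while both keys are down, and the value freezes the moment key 8 lifts.

```python
class GestureLatch:
    """a parameter that is live only while key 1 + key 8 are both held."""
    def __init__(self):
        self.key1 = False
        self.key8 = False
        self.value = None   # last frozen parameter value
    def set_keys(self, key1, key8):
        self.key1, self.key8 = key1, key8
    def feed(self, accel):
        """stream accelerometer data through; otherwise hold the freeze."""
        if self.key1 and self.key8:
            self.value = accel
        return self.value

latch = GestureLatch()
latch.set_keys(True, True)   # key 1 arms the parameter, key 8 opens the gate
latch.feed(0.2)
latch.feed(0.7)              # hand keeps moving while both keys are down
latch.set_keys(True, False)  # lifting key 8 freezes the function...
frozen = latch.feed(0.1)     # ...so further motion is ignored
# frozen == 0.7
```

one of these per assignable function, selected by the scrolling keys, gives the multiplied-out control-vector count described above.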

sound complicated? i thought so too, so i got rid of the floating 3d square thingy in the visualization system, which was, truth be told, only there as a placeholder to be able to "see" the accelerometer data, and replaced it with parameterized "crosshairs". each set of crosshairs will respond to hand movement, and each function will have its own named crosshair that shows up only when said function is instantiated. there will be WAY more text readouts than i had conceived of a few weeks ago, but the functionality is on par with what i imagined before i could envision the interface as it is developing currently.

This was the breakthrough I have been looking for for the last few months!  The synth building, which I am most excited about, could not progress because I needed an idea of how I was going to control the synths.  I didn't want to build some crazy synth system, then have to rebuild it because it didn't match up with the control system.  So today will be spent editing the details of the new gestural control construct: parameter readouts, and smoothing the function switching so it doesn't skip (which it is doing right now), then adding this functionality to each "key", after which the first "mode" will be functional.  In addition to this "Instrument edit" mode, I can cut and paste this construct to create the Looper edit mode, the FX edit mode and the System edit mode, all in addition to the "Play" mode, which is the existing "basic" controller system.  yaaaaay!!!
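The skipping mentioned above, a parameter jumping abruptly when a function switches, is commonly tamed by ramping toward the new value instead of jumping to it. A minimal one-pole (exponential) smoothing sketch, offered as one plausible approach rather than what the patch actually does (inside Pure Data this kind of ramp is usually done with a [line] or [line~] object):

```python
def smooth(stream, alpha=0.2):
    """One-pole exponential smoothing: each output moves only a fraction
    `alpha` of the way toward the new input, so a sudden jump when
    switching functions becomes a ramp instead of an audible skip."""
    value = None
    for x in stream:
        value = x if value is None else value + alpha * (x - value)
        yield value

# a parameter that jumps from 0.0 to 1.0 mid-stream gets ramped
ramped = list(smooth([0.0, 0.0, 1.0, 1.0, 1.0], alpha=0.5))
```

With `alpha=0.5` the jump to 1.0 arrives as 0.5, 0.75, 0.875..., converging on the target instead of snapping to it.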

THEN--and by "then" I mean maybe even tonight--I can begin to create sound constructs anew!!  In my research, the options for sound creation and manipulation in Pure Data are MIND BOGGLING! I predict years of sonic experimentation with this system.  But first things first... functional gestural controls. l8r.


Introducing the Beatjazz visualization system (alpha)


It is Day 51 of the Beatjazz system crowdfunding campaign.  It is my sincere hope that you vibe on the project enough to become a patron.  Currently, I am working on the aspects of the system that I can do without much financial investment, i.e., software development and conceptualization, which I hope gives you enough of a sense of the direction, depth and breadth of the project to grab a perk, which will fund its creation.  Check it out over at

So, this is the alpha version of the Beatjazz visualization system.  It has a bit of an old school “cyber” vibe in its current incarnation.  The first thing my photographer said when he saw it was “Lawnmower Man”.  Indeed.  It does have that mid-90s virtual reality, cyberspace vibe, which I am not opposed to one bit.

As I get used to working in Pure:dyne Linux, there are a few hiccups.  As far as I can see, I can only have one comport instance per patch.  I could be wrong, but it seems that opening each of the 3 comport nodes (left hand, right hand, and mouth interface) individually allows them to work properly, whereas before, they were very jumpy and not sending sensor data to the correct port.  I also had to rename a few objects to account for them being located in different folders within the PD installation, but after I did this, everything, surprisingly, worked.  And worked WELL!
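The one-port-per-device arrangement (comport is Pure Data's serial object) can be illustrated outside PD. This Python sketch is an analogy only: the device names, port paths, and "one reading per line" framing are all assumptions, not the controller's actual protocol, and a file-like object stands in for the serial port so nothing here needs hardware:

```python
import io

# assumed port assignments; with real hardware each stream would be
# something like serial.Serial("/dev/ttyUSB0", 115200) via pyserial
DEVICES = {
    "left_hand":  "/dev/ttyUSB0",
    "right_hand": "/dev/ttyUSB1",
    "mouth":      "/dev/ttyUSB2",
}

def parse_reading(line):
    """Parse one hypothetical 'x y z' accelerometer line into floats."""
    x, y, z = line.split()
    return float(x), float(y), float(z)

def poll(streams):
    """Read one line from each named stream. Keeping one stream per
    device means a reading can never land on the wrong port's handler."""
    return {name: parse_reading(s.readline()) for name, s in streams.items()}

# a StringIO stands in for each open port in this sketch
fake = {name: io.StringIO("0.1 0.2 0.3\n") for name in DEVICES}
readings = poll(fake)
```

The point is structural: each physical device gets its own dedicated connection, mirroring the three individually opened comport nodes.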

I cannot comment on the speed enhancement of this switch from Windows to Linux enough!  Wow!  Everything, when it works, works so fast and clean, with so much processing headroom.  The graphics respond immediately to the slightest movement, and whereas in Windows the number boxes would either run really slowly or simply freeze up after a few minutes (although the patch itself worked fine), the same objects now sing with rapidly updating digits!  It's beautiful.  I am in love.

So here is the rundown of the viz system.  This will form the basis of how we see beatjazz.  Every musically relevant parameter will have a graphical representation.  Currently, those representations are simple geometric shapes, which is fine for now, but the visualizations will diverge into two distinct categories: the heads-up display visuals and the relational performance visuals.

As you may know, I have a crowdfunding campaign going to build the Beatjazz system, and one aspect of the system is a pair of augmented goggles I can use to get a full realtime overview of every relevant aspect of the system.  The visualizations will appear to float in space in front of me.  In the video above there is a black background, but this is the alpha channel, which means that when projected into the goggles, the black area will be completely clear, leaving only the graphical objects, so it is important to define their relevance now.  As I begin the gestural synthesis part of the project, it is very important to have feedback as to which of the 4 modes I have defined (performance, instrument edit, FX edit, looper edit) I am in.  Now I can attach graphical variations to onscreen objects and see exactly where I am in this increasingly complex system.

The other category is the relational visuals that will be projected during my performances.  I have read many comments over the last few months from people who have no idea what exactly it is that I am doing when I play.  I have to concede that even with the addition of the lights on the units, there is not much to convey the amount of things that are happening, so the visuals will “relate” every musically relevant activity that is being played or modified into a projected graphical representation.  Even the edit modes will be represented graphically.  Eventually, this version will evolve into something “prettier” than the heads-up version, which only really needs to be functional rather than aesthetically pleasing, and it will come to share the conceptual morphability of the synthesis system.

Once the first of the synths gets added, you will see a sharp increase in the number of little readout elements, which will be useful as well as reinforce the “cyber” vibe I am increasingly fond of.  GEM, the graphical portion of Pure Data, makes this easy with its cascading data-flow style of programming.  Data that affects an object flows down into it; then that object can be flowed into another object.  So, for instance, I can flow video effects onto a video clip object, but then I can flow the processed video clip onto a cube or a sphere, which can be flowed onto some other object.  It's very deep, powerful and logical.
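That cascading style can be mimicked in miniature. This Python sketch is an analogy to a GEM rendering chain (something like [gemhead] → [effect] → [geometry]), not GEM itself; the node names and the numeric "data" are invented for illustration:

```python
# Each node transforms whatever flows into it and passes the result
# downstream, like objects chained in a GEM rendering branch.
class Node:
    def __init__(self, name, transform, child=None):
        self.name = name
        self.transform = transform   # function applied to incoming data
        self.child = child           # next object in the chain, if any

    def flow(self, data):
        data = self.transform(data)
        return self.child.flow(data) if self.child else data

# a "brightness" effect flows onto a "cube", which scales the result
cube = Node("cube", lambda d: [v * 2 for v in d])
effect = Node("brightness", lambda d: [v + 0.1 for v in d], child=cube)
frame = effect.flow([0.0, 0.5, 1.0])
```

Rechaining is just repointing `child`, which is roughly why flowing an effect onto a clip and then the clip onto a sphere is so natural in this style.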

There are a couple of minor elemental additions I must do this weekend, then all of my efforts will go into the gestural synthesis system!  Funnily enough, working through the visualization system informed quite a bit about how the synthesis system should operate.  It will be much easier to progress now, knowing how and where parameters should be.  I feel confident that I will have the system ready to throw down at the C.A.T. (CreateArt and Technology) conference on Nov 4th here in Berlin, where I will be performing and presenting the new system for the first time.  This will be the first C.A.T. and it looks to be really fresh.  It will be held at the amphitheatre in the new Modulor building at Moritzplatz.  Not to mention that one of the organizers is my friend Anton, whose site supplied 90% of the parts (and advice) that went into the beatjazz controller.  He's really good at putting together things like this, and I believe this event will make a significant creative impact on those who attend.  Mark your calendar for that weekend. Until next time... l8r.

R.I.P. Derrick Bell and Steve Jobs. Our lives are better because you lived. #Inspired

