
Author: onyx


Coming to America and The New Normal

it has been a while since my last post.  this wasn't a lapse.  i needed to order my thoughts fully to express stuff properly.

The New Normal (?)

this "thing" or state or project or whatever it is, is changing, and it is changing me with it. (“the new normal” is a reference from a kevin kelly article I read)  it is a very interesting feeling to feel as if you are being mentally manipulated by your own expression, in realtime.  manipulated may be the wrong term;  more like "programmed by your own will".  this didn't just start.  i "could" say that it began when i started playing around with the idea of live-looping with synthesizers back in 2007.  but that was an origin point, like birth.  this is something markedly different.  it incorporates all of the previous experiences, and i do mean ALL of them; from insignificant habits as an 8 year old and everything else, all the way up to a generalized point in October of 2012, just after a couple of shows in London where i was profoundly not happy with how i sounded.  the following week, just before a couple of shows in berlin, i changed a couple of key parameter trajectories that resulted in a sound and concept so alien to me, yet so obvious as the next step, that it completely wiped my "imagination slate" clean.  and from that point, mutations, which before this fateful week were mostly hypothetical conjecture, became as clear as map points on google maps.

The big change?  simply gesturally controlled granular time-stretching.  the same timestretching that has been a staple of audio recording and performance software since almost the beginning of software synthesis as a democratic musical concept.  it is a technique where the "play head" is looping the audio right underneath it but is also moving at normal speed, so it sounds like normally playing audio.  but if you slow it down, the little loop is still playing at normal speed, so it sounds like the audio has slowed down without changing the pitch.  in addition, the little loop can be slowed down and even reversed, so a whole range of manipulations can be had from this simple technique.  mine isn't even that complex.  it is based on the same one that comes in the help files for pure data, with a couple of gestural tweaks that allow me to control these variables with my hand position.
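for anyone who wants to poke at the idea outside of pure data, here is a minimal python sketch of the grain-looping principle described above.  the function name, grain size and numbers are my own illustration, not the actual pd patch:

```python
def timestretch(samples, stretch, grain=256):
    """Granular time-stretch sketch: a 'play head' crawls through the
    input at `stretch` speed while a small grain plays at normal speed
    underneath it, so duration changes but pitch does not.
    stretch < 1 slows the audio down (longer output)."""
    out = []
    head = 0.0
    while head < len(samples) - grain:
        start = int(head)
        out.extend(samples[start:start + grain])  # grain plays at normal speed
        head += grain * stretch                   # head advances at stretch rate
    return out
```

with stretch at 0.5 the output is roughly twice as long as the input; negative or gesturally modulated head movement is what gives the "scratching" behavior described later.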

My system works by having 8 "elements", or sound sources.  i can use finger pressure to select one of these elements as the focused one, then play it into a looping audio buffer.  each element has its own buffer, and they are all time-synchronized with each other.  i simply move from element to element, building my arrangement.  i can mute and unmute any or all of them any time i want.  In september of 2012 i decided to add a "resampling" buffer to element 8 that would take the entire mix of all the other elements as a single audio file and allow me to manipulate it.  sonically, this was very interesting because now i could, for instance, play chords as a two step process: play a basic chord, then resample it and play new interval relationships with it.
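the element/mix/resample architecture can be sketched in a few lines of python.  this is purely my illustration of the structure described above (names and buffer format are assumptions, not the beatjazz patch):

```python
class Elements:
    """Toy sketch of the 8-element looper: each element owns its own loop
    buffer, all the same length so they stay time-synchronized, and the
    last element can 'resample' the mix of the other seven."""
    def __init__(self, loop_len=16, n=8):
        self.buffers = [[0.0] * loop_len for _ in range(n)]
        self.muted = [False] * n

    def record(self, idx, step, value):
        self.buffers[idx][step] = value          # play into one element's buffer

    def mix(self, step):
        # mix of elements 1-7 (everything except the resample element)
        return sum(b[step] for b, m in zip(self.buffers[:-1], self.muted) if not m)

    def resample(self):
        # element 8 captures the whole mix as one new manipulable buffer
        self.buffers[-1] = [self.mix(s) for s in range(len(self.buffers[0]))]
```

the two-step chord trick above is then just: record intervals into a couple of elements, call resample(), and manipulate the last buffer as a single sound.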

But (and there is always a but) i had a problem i had had since the beginning of this idea of continuous live looping: i could not change the tempo.  i would do all of these changes, but if i needed to change the tempo or loop length, i needed to start over from scratch.  i found the monotony of the predictability of the loop length became incredibly taxing on me, creatively.  like performing to a metronome, which i have done in the past and was happy to stop doing.

so after said gig in London, i was frustrated enough with this issue to go into one of the many trance-like programming states that seem to be more and more common.  By the end of the 3rd week of October 2012 i had added the aforementioned timestretch algorithm in hopes of being able to change the tempo.  i did not expect the sound that came out of that hypothesis.  the sound could not only be stretched, it could be scratched and ripped apart, IN STEREO -yes, i can scratch the left side with the left hand and the right side with the right hand, at the same time, like some kind of scratch-drummer.  THAT was the point;  what was this sound?  why was this sound so alien yet so obvious?  i had completely baffled myself.  so few parameters, yet so much scope for narrative and syntax.  it was then, like being in a video game where you can only progress to the next level after you have solved all the puzzles on this level, that the "transform" element showed me the next stage: better, more timbrally complex elements to feed into the transformer.  with the transform element, there were no more mistakes.  mistakes became timbrally interesting building blocks that sound almost sweet when digested in this new matrix.  and the tempo is completely untethered from the previous musical ideas.  the transform process resets the system loop length for all elements, so it can be a brief stutter or long and flowing.

I can no longer predict what i am going to play in a performance.  the beginning of a "thought" no longer has any bearing on how it will end or how it will sound after its first transform.  this confused me as an aesthetic concept, as i didn't know how to contextualize this sound concept, musically, so i started looking to mathematics for answers, which was easier than i thought it would be.  maybe by this point, i was mentally ready for a more mathematical answer than an aesthetic one.  the one that impressed itself on my psyche most readily was that of fractals: a basic construction algorithm of nature based on self similarity, where a phenomenon is expressed, then divides and splits and repeats, in simplest terms (I am still studying this, so don't shoot me for my oversimplification).  I began to look at the system not as a "looping" system-looping is an incidental aspect of the way the sonic ideas are expressed-but as an expression evolution engine.  the expression vectors must evolve along a different type of route after being completely and unpredictably transformed in every way.  this is not restricted to sound.  in late december i cleared a data bottleneck by ridding my system of the serial communications bus, which was slowing down everything i wanted to do, and replaced it with much higher bandwidth 802.11g, or wifi.  this has enabled highly controllable expression vectors which now consist of light (color, tempo, intensity, etc), and robotics, for which pure data is extremely well suited.  i don't understand why the linkage between pure data and robotics isn't spoken about more, but i digress.

So yes, this is as much of a head fuck as it sounds like it is.  i have begun designing gestural fractal elements-sounds of varying timbral complexity that will translate a certain way-to feed this process, and also started redesigning the transform matrix for more dramatic types of transformations.  and all of this has changed me.  i don't watch movies anymore.  they distract me from this crazy algorithm that is trying to parse itself in my head.  the only things i can really listen to are post-bebop/pre-70's jazz from artists like sun ra, archie shepp, ornette coleman and eric dolphy, none of whom i knew about 2 years ago, and when i would hear their musics randomly, i recoiled from them for being too “self-indulgent” (yeah, pots talking shit about kettles…I know).  now i can't get enough of it.  and every day, all day, is streams of the strangest series of questions about everything from stop signs to wall paint composition, to geographical locations of salt mines and what the first humans did before language or tools, and 1000 other questions.  it was driving me a little mad but now it's more comfortable.  it is the new normal.

this was the last 2-3 months, much of which was spent perched on my crappy office chair, rambling to either myself or my neighbor, but i knew that i was going to be in the US for the first time since 2009.  technically, i had been in the US twice in 2011-once for the TED thing and once for the Maker Faire in NYC, but those were both in NYC and i didn't see any family, nor did i really have a defined concept as complete as this "mental algorithm" or "pattern".  i was acutely aware, as i am writing this blog post, that maybe they, the people in savannah georgia and in memphis tennessee, would think i was completely nuts.  maybe i had sat in my flat a month or two too long, exploring my own world in too much detail.  i knew those were possibilities, but the algorithm felt so complete that i was more curious than apprehensive (airport TSA shenanigans notwithstanding).

Coming to America

So I arrived in Savannah georgia to the completely surreal experience of seeing my own picture on the cover of the local weekly zine.  whoa!  I've never been cover material, or so i told myself.  that, and it was HOT, as in weather.  so i took advantage of the opportunity to revel in some much missed sweetened iced tea and a whole list of unhealthy former addictive food-stuffs which usually populate the shelves near the counter at gas stations.  all that is beside the point though...I was in Savannah to do a series of presentations and performances for the Telfair Museums Pulse festival.  there was a whole digital arts exhibit with digital video arts, video gaming arts and more.  it was beautifully put together.

there were two days of full auditorium presentations for my project.  i was happy with the diversity of the people in attendance, especially the age ranges.  there seemed to be just as many older people as young ones.  so i just decided to drop it how i see it now, without any consideration that i might sound kooky, and, surprisingly to me, everyone got it!  i just described the esoterics and how they tied to the mechanics of the system and idea, and everyone, older folks and the kids as well, got it.  i could tell because some of the questions were very specific.  one kid, about 8 years old, asked if i had thought about putting the sensors in my body, and i explained, in detail, my apprehensions about sub-dermal sensors but that i was open to EEG scanning, and he was completely cool with that answer!  cool!  there were questions about integration of this modality into the school system, its place in the continuity of jazz from a local older jazz cat (really happy that he saw the connections i was trying to present musically, without my saying anything about them first), communal music creation, and on and on.  the Q&A sessions lasted from 30 minutes the first day to about an hour on the second and last day.

After the talks, the people that came up covered a wide range, and their questions were very informed and interesting.  i felt like something had connected, and felt much more confident that i wasn't crazy.  that doesn't mean that everyone liked it.  there were swaths of people who got up and left as soon as i started playing, which was encouraging, since i don't want this idea to be middle of the road;  “dig it or don't, or build your own!” should be the motto.

I was also given some beautiful artwork by local artist PE Walker, created from reclaimed materials.  I was so happy to get these items home without destroying them in my luggage.

i had a week or so with family to chill out and revel in how old we're all getting and how grey we're becoming, and even talk about this "algorithm" (and by talk, i mean ramble endlessly), and it still kinda held strong.  nothing that fell out of my face was glaringly wrong or complete bullshit, because if it were, i would have heard about it right then and there.  this was heartening as i prepared for Memphis.

but i should say that Memphis was prepared for me!  they went all out!  and by “they” I mean Eric Swartz and Sarah Bolton, who went to a lot of trouble getting me there (I met them while they were in Berlin filming a documentary called “Children of the Wall”, which you can check out here), and the guys at the Five in One Social Club, who dusted out their wonderful subterranean bunker and filled it with custom stages and screen printed posters-yes, screen printed, like on t-shirts but on cardboard-beautifully designed, and they even made me a t-shirt version.  and there was an amazing 3d projection mapping co-performance by Christopher Reyes (this and more photos from the night here).  Memphis is special because my first time playing there, in 2008, was literally my last stop on my way to new york to fly to berlin for the first time, to begin a new life.  and that party was heavy!  Jeremy Bustillo and I did an all EWI based party night, and held it down all night long.  the crowd kept it going!  and then again in 2009 at a space called Odessa, which was so warm and comfortable that i couldn't wait to play there again.

I was so chilled that i sat, onstage, for a good portion of my talk.  i rambled about all of the above mental stuff and interspersed it with bits of playing, and then dropped into a full set.  people were sitting on the floor and generally relaxed from the barbeque, cole slaw and iced tea that was being served, so i took the opportunity to investigate new ways of transforming.  i am having to come up with new terms for this stuff, but i found "stutter rips", "diving deep" and my new and barely explored favorite so far, "scratch drumming", where i am learning to scratch each side of the audio in relation to the other.  i finally stopped when i felt like i had explored all the permutations i could think of that night.  i was tired and the crowd seemed satisfied.  and the rest of the evening was super chilled out.  it was a great opportunity to talk interesting heavy stuff with many representatives of Memphis' artistic community.  I even met Larry Heard-house music legend-there!


After all of this, i spent a few nights in a motel 6 near Hartsfield airport to decompress a bit, and even in the solitude of a hotel room, it was all a flurry of drawings, and patch programming and design bits.  i think this is a permanent state now, which is ok with me.  the next stages are clear now, and in the coming week i will be doing another crowdfunding campaign to unveil what "next" means.  it will be weird-trust!-but i think it will be as obvious to you as it is to me now.


Nothing like your own show poster t-shirt!

The guys at Five in One gave me this wonderful t-shirt and I thought I would show off their amazing design work here.  If you live in Memphis, be sure to stop by later!

Notes on switching from xbee to RN-XV for low latency applications using Pure Data

Happy new year!!  It's been a crazy crazy month!  I won't even go into all the mental intensities emerging from digging deeper into the mathematical underpinnings of sound design (which resulted in a couple of weeks of burnout).  although a reprap breakdown has slowed down the pace of printing perks for crowdfunding campaign fulfillment, I have been working on personal mixes for my beatjazz supporter pack contributors and managed to mail off another controller!  One more to go!!  it still amazes me how much nicer they are than my own, which is constantly being added to and tweaked.  this particular contributor received his system with my own personal xbee wireless data system, the one I have been using since the beginning of this journey.

As crazy as it sounds, one of the things I have learned about myself is that if I have a hard “future” thing to do but I have a comfortable “now” thing laying around, I will go for the “now” thing, especially when there is a deadline or it's a bit too abstract to fiddle with.  so what I do is get rid of the “now” thing and make the “future” thing the “now” thing.  I have done this for years: when switching from saxophone to wind midi controller, switching from playing cover tunes to playing originals, from hardware to laptop, from wind controller to beatjazz controller and, last summer, from commercial audio software to using pure data as my central means of expression.

the RN-XV is marketed as “a drop in replacement” for the Digi-made, extremely popular Xbee series 1 and series 1 pro's (the pro's are the same, but with 63mW broadcast output as compared to the regular series 1's 1mW), which excited me.  the xbee's weren't my first dalliance with wirelessness.  I had a kenton midistream wireless midi system when they first came out in 2004.  I also used my iphone's accelerometer as a gestural controller for most of 2010-the summer before I began working on the beatjazz controller.  the Xbee was my first interaction with actual networking protocols, although with an older, slower yet more reliable one: serial.

Serial was created a long time ago and is very slow in comparison to today's protocols, but it's stable and good for less bandwidth-hungry data transmission, such as that coming from the low powered arduino platform.  there are multiple ways of integrating xbees with arduinos, using snap-on “shields” and, in my case with the arduino fio, an xbee “shoe” built directly onto the board.  this has worked for almost 2 years, with some caveats that are worth mentioning:

  • if the computer loses the connection, the software must be restarted, which sucks in a show.
  • of the two directions, the data coming from the arduino is mostly speedy enough for low latency performance, but data transmitted BACK to the arduino from the computer, using the aforementioned xbee pro's, was useless for anything time sensitive, so I only used it to change the light colors infrequently.
  • if I am in the vicinity of a “hipster forcefield” (a phenomenon whereby the sheer number of smartphones, with their numerous wireless capabilities, drowns out my xbees' broadcast capacity and the system becomes unresponsive), I end up huddled over my laptop-situated xbee basestation, gasping for bandwidth.

the RN-XV's, made by Roving Networks, use wifi and are designed for serial communication, although not limited to it.  I purchased them but didn't really use them much until a recent robotics project (which I will be writing about soon).  I found them to be remarkably faster, by a wide margin, out of the box, but had less success when using them with the beatjazz prosthetics.  the data was very shaky and garbled and I had no idea where to even begin fixing it.  I put them away, mentally, until just recently, when I became interested in refining the system, which had seemed a bit labored lately.  so I just said fuck it, mailed the xbees off with the controller and stranded myself in 802.11 land.

Along the way, I learned a lot of very valuable stuff, but there was one thing in particular that I wanted to share: if you use firmata, as I do, you can use it (mostly) unedited.  you can use the rn_xv to create a wireless arduino network with no serial port virtualization!


How to.

the Goals:

  • go pure wifi instead of serial for wireless communication
  • give each “unit” (headset, left hand, right hand) its own ip address, directed at a small dedicated wireless router (d-link dir-300)
  • speed up bidirectional communication significantly enough to make better narrative use of the lights and haptic feedback, and reduce processing overhead.


  • first thing is to go into the [arduino] object in PD and open it, where you will find this structure.

  • the inlet is where settings bound for the arduino are input into the [pd command processing] abstraction, which then sends them to the [comport] object, which is PD's serial interface.  the comport inlet (at the top of the object) sends the serially formatted data thru a serial connection to the arduino.  in my system, the xbee's were the virtual (wireless) serial cable.

the outlet at the bottom of the comport object is the data that comes “from” the arduino.  this data is a firmata-formatted stream of numbers:
224 17 0 (repeating, as a continuous stream)

this stream is sent to the [pd make lists] abstraction and from there to the outlet, where it is routed to digital and analog route objects, where you can assign them to things in pd.
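to make the "224 17 0" stream above concrete, here is a minimal python sketch of how a firmata analog message decodes (the [arduino] abstraction does this for you in pd; this is only an illustration, and real firmata has many more message types):

```python
def parse_firmata(stream):
    """Decode firmata-style analog messages from a flat number stream.
    An analog message is three bytes: 0xE0|channel, value LSB (7 bits),
    then value MSB.  So '224 17 0' means channel 0, value 17."""
    readings = {}
    for i in range(0, len(stream) - 2, 3):
        status, lsb, msb = stream[i], stream[i + 1], stream[i + 2]
        if 0xE0 <= status <= 0xEF:               # analog I/O message
            readings[status & 0x0F] = lsb | (msb << 7)  # 14-bit value
    return readings
```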

I needed to get sensor data from the arduinos as fast as they could send it, then process it in pd and send control data back to the arduino, wirelessly.  it also needed to be as blazingly fast as possible.  My first idea was to emulate a serial port using one of the many virtual serial port programs on the internet, like Serial/IP, socat, HW-VCP and about 6-7 others.  in all cases, the data I was getting was garbled and would stop after a while (more on this later).  so after a long time of beating my head against this wall, I decided to simply get rid of the serial communications system and try this with networking objects.

I tried all of the networking objects I could find in pd, with most people telling me to try UDP, which is a connectionless format that, unlike serial and TCP, doesn't need to “hand-shake” to allow data to pass back and forth.  both ends of the link simply “scream” data at the other.  in this way it is great for things like skype and video streaming, but not so much for things that have a low tolerance for errors.  so after another week of fiddling, I was able to place the [udpsend] object underneath the command processing abstraction, where the comport was.  [udpreceive] receives data from the arduino and sends it to the rest of the pd-system.  [udpsend] accepts data at its inlet and also needs the ip address and port to send the data to the right arduino.
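the [udpsend]/[udpreceive] pair behaves like plain UDP sockets, which you can sketch in a few lines of python (addresses and the payload here are arbitrary placeholders; this is the general mechanism, not my pd patch):

```python
import socket

# UDP is connectionless: the sender just fires datagrams at an address
# with no handshake and no delivery guarantee.

receiver = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
receiver.bind(("127.0.0.1", 0))          # like [udpreceive <port>] in pd
receiver.settimeout(5)
port = receiver.getsockname()[1]

sender = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sender.sendto(b"224 17 0", ("127.0.0.1", port))   # like [udpsend] after connect

data, addr = receiver.recvfrom(1024)     # whatever arrives, arrives
```

note there is no accept/connect step on the receiving side at all, which is exactly why it is fast, and exactly why delivery is not guaranteed.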

the [udpreceive] object receives its data from a port on the router that I set on the RN-XV itself, negating the need for arduino libraries or rewriting the firmata firmware.  at this point I will jump to the RN-XV setup.  it will make sense when I come back to this abstraction.


The RN-XV is a powerful 802.11b/g transceiver marketed as a drop-in xbee replacement, meaning that it shares the same small pin spacing that xbee is based on, but it is a bit more tedious to program.  for the purposes of this tutorial we will stick to the features that will enable us to link this directly to pd.

first, we need to give each RN-XV its own static ip address and port.  this is where the sparkfun USB explorer, or an equivalent usb-serial-to-xbee board, comes in handy.  with it you can program the module using a terminal like Putty or, in my case on win7, Coolterm.

in coolterm, go to Connection->Options, make sure the serial port your explorer is on is selected and the initial baudrate is 9600, and click OK.

in the main window, click the “connect” button then in the terminal window type

$$$ (but DO NOT press enter afterward.)

this puts the RN-XV into command mode and the green light blinks more quickly.
if you type “get everything” (no quotes) you should see all of the settings for the transceiver.  these are the ones I found most important:

  • “set ip address” <ip address> –the ip address of the rn-xv
  • “set ip local port” <port number> –the port that [udpsend] will send data thru to the arduino
  • “set ip host” <ip address of the computer>
  • “set ip remote” <port number> –the port that [udpreceive] will use to receive sensor data and send it to the system
  • “set ip protocol” <1> –sets the rn-xv to use the UDP protocol instead of TCP
  • “set wlan ssid” <ssid> –the name of the wireless network you are connecting to
  • “set wlan channel” <ch.number> –set your channel number here.  I set mine to 0 so it associates with the AP on whatever channel is there
  • “set wlan join” <1> –joins the access point saved in memory (the one we input above)
  • “set wlan linkmon” <0> –I turned this off
  • “set wlan rate” <15> –I am playing with it at this rate, which is the fastest, at 54mb/sec.  1-2mb/sec would be fine (0 or 1).  roving states that lower transmit rates increase the range, so play with this one as you need
  • “set sys autoconn” <0> –turns off the autoconnect timer
  • “set uart baudrate” <baudrate> –effectively sets the baudrate of the connection.  I have mine set to 115200 and will experiment with higher rates, but this seems to be fast enough for right now.  one line in the arduino firmata firmware will need to be changed to allow this to work properly.  if you need to use Coolterm with this rn-xv in the future, it must be at this baudrate in the option settings
  • “set uart flow” <0>
  • “set uart mode” <1>
  • and the last, and possibly most important, two settings:
  • “set comm size” <1420> –the “packet” size, or the number of bytes sent over udp as a block of numbers
  • “set comm time” <5> –the number of milliseconds between flushes of data blasted out to the computer from the rn-xv.  every 5 ms, it sends a packet of data that “should be” 1420 bytes.  I say should be because it is udp and there is no guarantee of delivery
  • after inputting all of this into the terminal window, type “save”, which saves it all to the module's own local memory (which is why the arduino WiFly libraries are unnecessary in this instance), then type “reboot”, which puts it back into regular mode, ready to be placed back into the arduino fio.
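pulled together, a full coolterm session looks something like the following.  the addresses, ports and ssid are placeholder values of my own choosing, so substitute your own, and double-check the command spellings against the roving networks manual for your firmware version:

```
$$$
set ip address 192.168.0.101
set ip local port 9750
set ip host 192.168.0.100
set ip remote 9751
set ip protocol 1
set wlan ssid mynetwork
set wlan channel 0
set wlan join 1
set wlan linkmon 0
set wlan rate 15
set sys autoconn 0
set uart baudrate 115200
set uart flow 0
set uart mode 1
set comm size 1420
set comm time 5
save
reboot
```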


Arduino firmata edit

next thing is to use the arduino software to change the baudrate of the firmata firmware from 57600 to 115200.  go to File->Examples->Firmata->StandardFirmata.  scroll to the end of the sketch and then scroll up until you find the line that sets the baudrate (in my copy of StandardFirmata it reads “Firmata.begin(57600);”), change the 57600 to 115200 so it matches the rn-xv's uart setting, then upload it to the arduino.  (you cannot have the rn-xv attached to the arduino while uploading a sketch.  put it on after uploading.)

Setting up the new network arduino object


ok, so back to our edited [arduino] object, which I have named [myarduino-udp].  we need to tell [udpsend] where to send its data, so we create a message with [connect xxxx <, containing the ip address we put into the rn-xv earlier and its port number.  when you connect it and click on it, it “should” work and output a 1 at its outlet, signifying that it works.  if it does not, make sure that the router is working properly and that everything is connected.  if there is a red light blinking on the rn-xv, then it is not associated with the router.

Next, in the [udpreceive], enter the port number that you entered as the remote port in the rn-xv settings.  if this is working, it should show the received ip address and bytes from the second outlet.  (I took this from the help files.)


once it works, the “bytes received” number box should show some activity if you have sensors attached, so send some data at this point.  for reference, I attached a print object to the output of [udpreceive] so I could see what was coming out.  it is blocks of numbers starting with a 2xx number, then a 0-x number, then another 0 to about 17 or so.  these are the right numbers in the wrong format: they arrive as whole lists, but they must come as a stream, one number per message.
so I used a list object called [list drip], which outputs a list as a stream.  this formatted the data properly and everything worked!  it was at this point I realized that either [comport] or the serial port had been the previous bottleneck in my system.  the system was now noticeably quicker than with the xbees.  so I decided to try to eke out a bit more speed by toying around with the [list drip], since it was now the new bottleneck, and that is when I came across this thread with test abstractions that sped up the [list drip] object, significantly!  I ended up using an undocumented version called [list-drip-scheid], which is itself a faster variation of one called [list-drip-quick2]!!!  (the thread is a great read as well)
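the [list drip] idea is simple enough to sketch outside of pd.  this is only an illustration of the concept, not the actual abstraction:

```python
def drip(packets):
    """Mimic pd's [list drip]: each udp packet arrives as one list of
    numbers, but the downstream firmata parsing expects them one at a
    time, as a stream.  Dripping flattens the packets back into that
    stream, even when a 3-byte message straddles a packet boundary."""
    for packet in packets:
        for atom in packet:
            yield atom

packets = [[224, 17, 0, 224], [18, 0]]   # a message split across packets
stream = list(drip(packets))             # back to 224 17 0 224 18 0
```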

one last thing I did was to place a bang object on the [loadbang] chain, because the patch doesn't send the settings to the arduino on the first [loadbang], so I trigger it manually after the patch has loaded.  and that should be it!  I say “should be” because there is always something that falls over somewhere, but this works for me and has been stable for about a week since I figured it out.  my operating latency is about 4ms bidirectionally, and now the system is fast enough to control the lights with gestures and breath pressure in real time while allowing much more expandability.  as well, there are no weird comport, serial port or ftdi driver issues; everything will work on any operating system with minimal fuss.  I now need only add ip based components to my system for my router to recognize and pd to utilize.

this is my first official tutorial, so let me know what you think, if it is confusing or too long or whatever.  I ain't gonna change it, but I will incorporate the feedback for future posts.  l8r.

EDIT. here is a demo of the complete system, working.

Me and my bot “Boogie”, working out in the lab.

the “twins”

I was going to try not to post this for a while, until i got caught up with my crowdfunding fulfillment and some other stuff, but things are advancing so fast right now that if i don't put this in the conversation, it will just get left behind.  Introducing the first iteration of my beatjazz parameter feedback bots.  I call them "the twins".  there is one for each hand, and they give axial spatial feedback without the need for a projection.  i will be sharing more about it soon, but i REALLY have to wrap up the previous campaign first.

Happy Holidays! the Beatjazz controller is ready for download for makers everywhere!

so finally, I have begun the process of fulfilling my promise of turning this into an open source project.  you can see the files for building your own controller here.  it is now in the wild.

I don't really know what this will mean to me or to others or to the project.  it might be ignored completely.  it might be jacked, with someone else taking credit for it.  it might be used for something i can't imagine, or just one piece might be relevant to more than just me.  I don't know...which is exciting.  i don't have experience here, so i am interested to see what happens.

what i hope is that people see that this is not an instrument, but an idea.  an idea that one can create one's own personal prosthetic computer interface and not have to be content with what the corporations think we want based on their little demographic surveys.

it is a seed.  it is not meant to stay in this configuration persistently, but to be modified, changed and even discarded once said person has found themselves through the build process.  i have recently seen 3 different systems inspired by the beatjazz system and none of them look or operate like mine.  that is the point.  it is also the point of the construction.  i designed it with a million little pieces so that each of those pieces can be iterated to each person's perfection.

I knew that i had to upload it now, because my own personal system iterates so much that i do not consider it to actually exist as a tangible thing.  it only exists as a tangible "state".  no part of my system has remained unchanged for more than 6 weeks, and there is no reason for it to.  it can change based on my whims, creative urges, geographical location, events.  it is a tangible thought...a corporeal idea.  and as such, i knew that if i didn't upload it now, it would iterate away from anything that anyone would understand.

so Happy holidays and happy building.  and thank you for helping me build (one of) my dreams.

Transform presentation and talk from DDB-Tribal event

This was an unexpectedly interesting performance from a few weeks ago, where i had just changed a couple of fundamental function positions and it became infinitely more comfortable to take the sounds thru the process currently still called Beatjazz.  I hope you dig it.


equations relative to each other are beginning to synergize.  aesthetic based on function is proving to be a decent guiding principle for not just design, but sonic interval interpretation, formerly called music.  the rabbit hole has hieroglyphs on the walls of its interior.  the myth construct has become self-aware...


The Function Aesthetic

I have seen a truth.  It's all I have been able to babble about for the last two weeks.  I knew this truth already but i had not given it a name.  it was an invisible guiding principle that slowly revealed itself as the unseen force that governed progress.  Function.  Function, or, specifically, focusing the attention in this project on function over any aesthetic consideration.  the logic? imagination has a finite range, but by parameterizing what you think you want, you build a bridge to what you really want but didnt know how to put into mental form.  you build a construct around the idea so it can express itself.

Function itself is an aesthetic: the idea of making a design that is perfect for the task at hand.  so by creating a system of pure function, all possible states are within a graspable range of expression vectors.  and when the idea of function becomes a guiding principle for not just the hardware design, but the software, the musical concept and the movements, an aesthetic creates itself from those variables.  this function aesthetic is now an all-encompassing goal.

if all of the sounds become functions, meaning that an "element" can cover all utterances of a common phenomenon (like a hit, a scrape, or air turbulence like a horn), then the way they are assembled becomes more varied, giving voice to the mind that does not speak with words. function wires the ideas directly into your nervous system, especially considering the use of sensors.  what it is becoming has outstripped my ability to guide it by imagining future sonic outcomes.  I cant imagine what i am playing TODAY. 3 months from now may as well be 1000 years.  so by becoming a sort of "function engineer", i can work to create the best iteration of a function and learn to operate that function over time, then repeat.

Over the last couple of weeks I have been creating music from a more function-focused perspective, methodically investigating new capabilities like timestretching and pitchshifting in performance, and i have found that once the mind knows where things are, it can construct its own statements without having to try too much.  so by focusing on things like controllable timestretching and pitchshifting, repositioning the accelerometers just a few degrees for a more natural hand position, and even adding a small robot made of servos i had lying around and using it as a parameter feedback system (more on that in the next post), i enhance the function, even if i dont understand how this enhancement will manifest itself.  the results have yielded a few recordings that i consider to be the new me based on these criteria.  my sound wont be able to go back from this place.  the "function aesthetic" is Beatjazz v3.0 (beta).

And, it is the EP that i can finally send to all of the contributors of my now year old crowdfunding campaign.  I just couldnt do it unless it was representative of something different than the last one.  The sound is very different from anything i've done before, in really interesting ways.  for instance, nothing is a mistake now because i can "transform" it into something else, endlessly.  and the current sounds work well together.  they are mostly gesturally active, with 2 or 3 fully gesturally programmable.  so the "thoughts" end up in some strange places but can be resolved easily at any time, in multiple ways.

the album is in three parts: the introduction and 2 "thoughts", each of which can be thought of as a complete musical expression vector.  not quite a song or a mix or a track or a groove.  a bit of all that, but something else. I split them at the point in the "transform" where the idea had shifted from one to the next, definitively.  so there is a lot of interesting stuff at the end of the tracks as they transition from one to the next.

the first "thought" is free to download for everyone, but the album is $5 because that was the contribution amount that contributors gave for this album.

after i get everybody sorted out with download codes (and updated shipping info), i will write a little post on some experiments with robotic parameter feedback and a beatjazz controller for painters.  holla...


Accelerometer Planes

What a difference a small tweak can make.

earlier this week, i was watching some vids from the previous weekend's Black Dynomite party.  I do this so i can improve things that may not be obvious.  there is still much to do sonically, but within the realm of the hardware design i was a bit irked by the hand positions i subconsciously end up with.  when i need to alter parameters i assume a hand position that looks as if i am climbing into a window or trying to scare small children.  that position is that way because i use accelerometers to vary an increasing number of overlapping parameters. accelerometers measure the force of acceleration along one of 3 axes: left to right (X), forward to back (Y) and the pull against gravity, up and down (Z).

when i built the first prototype of the hand units, which i constructed from cardboard, i determined that placing the x axis under the keypad would allow me to tilt the device left to right to, say, vary volume, in a manner that was like turning a volume knob.  and i could vary the y axis by simply moving my hand up and down at the wrist.  that worked then because the cardboard version was ergonomically less precise than the 3d printed version.  the "boogieman" pose was an issue back then as well, but gestures were less important to the system than they are now.  with the "controller" becoming a "prosthesis", and with a software system designed around gestural axial accuracy at its lowest level, the need to reorient the hand positioning became glaringly obvious.
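for anyone toying with the same idea, the knob-style tilt mapping works out to a couple of lines of trig.  my version lives in my pd patches, so this is just a minimal python sketch of the concept; the function names and the 60-degree range are mine, purely illustrative:

```python
import math

def tilt_angles(ax, ay, az):
    """convert raw 3-axis accelerometer readings (in g) to tilt angles.

    roll  = left-to-right tilt (the X axis under the keypad)
    pitch = wrist up-and-down (the Y axis)
    the Z reading is the pull against gravity."""
    roll = math.degrees(math.atan2(ax, math.sqrt(ay * ay + az * az)))
    pitch = math.degrees(math.atan2(ay, math.sqrt(ax * ax + az * az)))
    return roll, pitch

def angle_to_param(angle, lo=-60.0, hi=60.0):
    """map a tilt angle onto a 0..1 parameter, like turning a volume knob."""
    return min(1.0, max(0.0, (angle - lo) / (hi - lo)))

# hand flat: gravity sits entirely on Z, so roll is 0 and volume sits mid-range
roll, pitch = tilt_angles(0.0, 0.0, 1.0)
volume = angle_to_param(roll)   # -> 0.5
```

tilting the hand 60 degrees either way sweeps the parameter over its full range, which is why the gesture feels like a physical knob.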

So i started looking for what would be the most natural way to express precise x-y-z gestures.  what i discovered was "point position".  if you hold your arm out in front of you and point, the point just under the lowest 1/3 of your forefinger is absolutely the most memorable position for natural x-y control.  it is unconscious because the ranges are constrained by the range of the hand at the wrist.  it is always perpendicular to the thing you point at, and no one points sideways or upside down.  the x-y are perfectly in the center of their respective ranges, and if you, say, point down, the y axis moves down to that position.  it is freakish how accurate it is now, without thinking.  so much so that i can draw circles or waves or letters, although i havent written any code (yet) to take advantage of that.
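one way to exploit that natural center is to calibrate each axis around the rest pose, so pointing straight ahead always reads as zero.  again, a hypothetical python sketch, not my actual patch; the names and the 45-degree span are made up for illustration:

```python
def calibrate_center(samples):
    """average a second or so of readings held in 'point position'
    to find the natural center of an axis."""
    return sum(samples) / len(samples)

def centered_axis(reading, center, span=45.0):
    """map a raw angle reading to -1..1 around the calibrated rest pose."""
    return max(-1.0, min(1.0, (reading - center) / span))

rest = calibrate_center([2.1, 1.8, 2.0, 2.1])  # slight natural offset of the wrist
ahead = centered_axis(2.0, rest)               # pointing ahead: ~0.0
down = centered_axis(-43.0, rest)              # pointing down: ~-1.0
```

the calibration step is what makes the gesture "unconscious": the zero point is wherever your hand naturally rests, not wherever the sensor happens to be mounted.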

i reprinted the accelerometer mount pieces to incorporate the new angle. my newly created scratching technique is much more predictable and my hands now assume more of a Bruce Lee stance than before.  the gestures are making way for "katas" now that keeping the keypad on a flat plane is no longer necessary.

so saturday nov. 10 at Backlit Lounge #5 i will get to play with how these new forms combine with some new vocal processing concepts i am dying to work out.  just thought i'd share this tidbit for those toying with accelerometers.  if youre in Berlin, feel free to bring em to the thing. l8r


Of Beatjazz Pilots and vortexualization

Beatjazz "controllers" for everyone!!  ok, maybe not.  actually about 6 people total, so far.  but that may change soon.

so far, i have delivered 3 systems in varying states of operational ability.  the first 2 were delivered earlier in the year and were functional in a very basic way, but took way too much non-specific software to make them work: lots of vsts and sample collections. this was a big inspiration for digging into pd and building a complete system from the ground up, which has been my sole focus for a few months now.  so it was nice to have "something" that i would consider to be an embryonic version of the eventual beatjazz system (which is what i am currently playing) to give to one of my contributors, who came to berlin for a few replacement plastic parts and some software/firmware upgrades.  THAT was surreal!  to see someone take out a beatjazz controller that isnt mine, and put it on and play it!  !?!?  it wasnt a concert or anything, but he had become acquainted with the controls and the modes.  i was very impressed.

A friend of mine has taken up the task of purchasing his own printer and has printed enough parts for 3 full systems!  all in white abs plastic, which should look sick with the LEDs in it.  he has gotten really good at it too, with attention to detail like bolt lengths and even choosing brass parts over the stainless steel that i chose.  and he can even build the circuit boards!  which arent hard or anything, but still, it is amazingly cool that he supports the project to this degree!!

before i headed to London for 3 shows at the 3d print show, I completed the hardware for a system that was going to the first contributor of my last campaign. he lives in the UK so i figured itd be nice to just meet up and give it to him in person.  it was really cool to meet him and spend the afternoon with him, his wife and their friend, configuring sensors and tweaking software.  that aspect isnt finished, but we got it together well enough that i felt comfortable handing the system over to him.  very cool afternoon.

that particular lovely afternoon almost didnt happen.  as i stated, i was in london to perform for a fashion show of 3d printed stuff.  i flew into Gatwick airport and at the border told them i was in the UK to play a show, but apparently there was confusion over the documentation i needed to have (it is called a certificate of sponsorship, for those who want to know) but didnt, so i spent 10 hours in a room with no windows, being told periodically that i was being sent back to berlin, as if that was a bad thing (!?)  i stopped caring one way or the other, i just wanted out of that little room, but eventually everything was sorted (the fault was bad wording on the government website) and i proceeded into the UK.

the 3D Printshow was an expo of all things 3d printing.  many of the cool innovations you may have seen in 3d printing in the last few months were represented there.  I got a chance to meet some real superstars of the 3d printing world.  it was awesome, but i was there to perform with my system in a fashion show of various 3d printed artistic expressions.  i had not seen many 3d printed fashions before this show.  i was really impressed with the intricacy and beauty of some of the works on display, so i wanted to represent!

for the first show, all of my equipment was just off stage and i was to go out, play for 10 minutes and come back, but there was a variable that i knew about but didnt properly prepare for: radio interference.  i have been battling with it for months.  it seems that when a large convergence of smartphones is in my immediate vicinity, my low latency radio range drops to a meter or two, as opposed to the 20+ "low latency" meters i get otherwise.  i have dubbed the phenomenon a "hipster forcefield".  and it was in full effect that night.  in addition, there was a mist machine that allowed visuals to be projected in midair, but arduinos dont like mist machines...who knew?!  and in spectacular and, in hindsight, predictable manner, my system freaked out, the graphics were blinking....all that shit.  and i had to scurry off stage in silence and self loathing.  i was not happy.

for the second show, we strapped all my shit to a table and walked the whole rig out onstage, JUST to make sure, and it was cool.  i was happy.  for the 3rd and final show i was cautious but confident.  we did everything the same; i was building it up over a nice groove i had in my head.  i was tweaking filters, dropping beats, just getting ready to ride into the groove with a fury reserv....(the stage manager taps me on the shoulder while i am onstage) "No one can hear you..." (scurries offstage...) i had, in my overconfidence, not turned up the volume. everyone had a laugh, i turned it up, played, and exited the stage.  it was fun, as were the show and its organizers.  im glad I got to do it.



I have been working on my resampling channel for about a month (which from now on shall be referred to as the "vortex").  i had a vague idea of being able to twist my sound in a dynamic manner with controls that were properly controllable, and i have had some success with it.  a month ago i introduced a pitch shifter, but it had no gestural parameterization attached, so it changed the pitch in a manner that could be called questionably musical, and it was unstable.  so i decided to go more granular and control pitch from the granular timestretch patch rather than a separate pitch patch.  the first outing was at a show i had on thursday here in berlin, which, now that i have an integrated file recorder that records everything i play, was recorded and posted online. for the first time in a long time, i could not predict what the next transform would result in, and it was exciting.

from this session i could see more clearly what needed to happen next: the creation of sonic elements that took advantage of the eccentricities of granular timestretching, namely the ability to retain its tonal and harmonic qualities, or to at least understand how the process could produce new tonalities. so i created a few, and for the ReTune party on saturday night it all came together in a brief ripping set that felt GREAT! i wasnt trapped within one tempo or one key.  both were freely malleable and more controllable than i would have imagined.
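for the curious: the core trick is that the playhead crawls through the audio at one speed while the little grain loops are read out at another, so time and pitch decouple.  my version is a pure data patch built off the help-file examples; this is just a numpy sketch of the same idea, with made-up parameter names:

```python
import numpy as np

def granular_stretch(audio, sr=44100, stretch=1.0, pitch=1.0, grain_ms=50):
    """granular timestretch with independent pitch control (stretch > 0).

    stretch: playhead speed (1.0 = normal, 0.5 = half speed, same pitch)
    pitch:   grain read rate (1.0 = original pitch, 2.0 = up an octave)
    """
    grain = int(sr * grain_ms / 1000)      # samples per grain
    hop = grain // 2                       # 50% overlap between grains
    window = np.hanning(grain)             # fade each grain in and out

    n_out = int(len(audio) / stretch)
    out = np.zeros(n_out + grain)

    pos = 0.0                              # the slow-moving playhead
    write = 0
    limit = len(audio) - grain * max(pitch, 1.0) - 1
    while 0 <= pos < limit and write + grain <= len(out):
        # read one grain at the playhead, resampled for pitch
        idx = pos + np.arange(grain) * pitch
        g = np.interp(idx, np.arange(len(audio)), audio)
        out[write:write + grain] += g * window
        pos += hop * stretch               # playhead advances at the stretch rate
        write += hop
    return out[:n_out]
```

pulling `stretch` toward 0 stretches time without touching pitch, while `pitch` alone transposes without changing duration; driving both from hand gestures is the whole game.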

so now, i gotta throw some sick sounds at this new blender.  i have been designing some morphing constructs that should make the timestretching phase less gimmicky and more integral, going so far as to design the sounds "within" the timestretch buffer to get more predictable outcome qualities. ANNNND, I'm gonna be ready to rock that shit this friday at mindpirates.  me, JR and Loganic are doing our first (of many i hope) outing together, so i gotta make sure i got my algorithms on lock to be in the company of these professors of groove.  the feet dont care if its live or a record, so i gotta handle my biz to be in these gentlemen's company.  it is my sincere hope that i can freak some sounds, shift everything over to linux AND possibly transition to wifi transceivers, which "should" fix my interference problems (or not...i have no idea). so i gotta get back to work.  holla...

