How Beatjazz works

So, I finally got around to creating the first of a number of videos on the beatjazz system.  These two videos are a synopsis of all the hardware and software components and a demonstration of “transforming”.

 


No, I haven’t lost my mind…I’m designing a new one!

 

 

So here it is.  My next project phase: a Sonic Fractal Matrix. TA DAAAAA!!!!  A network of wirelessly connected nodes, some of which are robotic, that enables any number of beatjazz-system-enabled (musicians/dancers/performers/scientists) to synchronize and become a…thing.  A shimmering sonic pattern.  yaaaaaay!!  I’ve completely lost my mind! But along the way, I’ve picked up a new one…

I have been alluding to this shift in mind state over the last few months, but there are no words for how much of a shift all of this has represented to me as a person and as an artist.  WE LIVE IN THE FUTURE!  It’s like a fantasy, like the dude that created Pinocchio (no time to Google it, I’m flowing…).  It is possible to go from idea, in your head, to “thing” on your desk or body!  In one to three steps!  I can not iterate that enough (no pun intended)…you think something…then you make it exactly how you see it in your head! #mindblown

I honestly thought I was freaking out a couple of months ago.  The possibilities were too numerous. I got heavily into studying fractals.  I reduced the number of “possibilities” down to a set of “probabilities” and it was still too much.  The saving concept, which I have mentioned numerous times, is the “function aesthetic”, meaning that “function”, executed properly, has its own aesthetic.  So by focusing on function, an aesthetic emerges and it tells its own, very honest, story.  That story, abstract though it may be, calms me and directs my attention toward an ever clearer field of probabilities.

 

You see, you don’t go from horn sounds to “transforming” in one smooth curve, but in loosely related surges.  When I began the first campaign, I just wanted to play without having to hold my hands together on a horn, or what I had begun referring to as a “musical neck brace”.  I had no frame of reference for what it has become, just as the next stage is completely murky.  I have no imagination construct to envision the stage that I propose.  I can only design and construct the idea of the sonic fractal matrix concept.  But as with the exo-voice (what I call my beatjazz controller now), I can see very clearly how to make it work.  Designing the logic and hierarchical systems will be reasonably easy (I like logic systems…go figure…).  I will have to learn a lot more about networking over the coming months, but that’s the point: to learn and create and share.

-----

As I begin my 3rd campaign, I think about what the last 2 have taught me.  Amanda Palmer hit the nail on the head in her alignment of crowdfunding with busking.  Maybe that was why it felt so comfortable as a concept to me: I share my expression and if you dig it, you help me keep sharing it by contributing. But to go one step further, I would say that crowdfunding has become “part” of this widening artistic expression, enabled by the internet.

Not long ago, performance was a “thing” with rules and stages, promoting was something “else”, and life was another thing that was kept separate.  But as I started preparing for this campaign, I began to look at the narrative potential of every single element: how the video looks…how to present these two “Onyxes”, the one that creates and the one that performs…what is the natural progression from the 3d printed exo-voice campaign?  All of these things became elements of a new (for me) type of performance that is social media, 3d printing, robots, network system design, film, and crowdfunding.  A multi-dimensional performance across time and space.

A crowdfunding campaign feels like it should be gripping somehow.  Like between the time it begins and ends, something interesting should emerge from the intensity of presenting a new concept, daily, for 30+ days, and then from trying to shoot past that stated goal as the project moves into “creation phase”.  When I began the first campaign, I envisioned DIY with lots of wire and plumbing pipe, and in the second one it was carbon fiber molds and synthesis.  In both instances, neither turned out the way I intended…they came out better, and the whole idea evolved in wonderfully unexpected ways.  I think this time will be even more interesting because the goals themselves have evolved.

-------

In the previous campaigns, I created beatjazz controllers as perks for contributors.  They all work, because I played each one of them extensively before delivery.  BUT, I underestimated the differences in the computer systems they would be connected to.  The controller was created for my needs and optimized for my computer.  So for me, it’s good and getting better, but for the ones that made it into the hands of others, the mileage varied wildly based on their existing hardware.  So, going into this phase of the project, I wanted to make the exo-voice playable “out of the box”.  It occurred to me that the only way to do this was to ship it with its own computer.  My experiments with using the Raspberry Pi as the computer have been very encouraging, but I want to incorporate a second computing board just for audio processing.  The point being that I want these systems to be playable instantly.

Now, this doesn’t mean that the new “pilot” (cause it’s like flying thru sound…geddit?!) will be able to play it “out of the box”. First off, it’s a very logical, but unconventional, musical input construct. And I have been playing for 30 years and still have to practice daily.  Practice is the key.  Practicing with another person makes it even easier, so I figured that since I’m building the systems, I can make a simple way for 2 or more beatjazz systems to form a network, with a master tempo coming from the first person on the network.  Then two or more people can play together and build a construct made of sound that just hangs in the air.  I call it a pattern.  I cribbed the whole pattern/patternmaster thing from an Octavia Butler book series called the “Patternist” series, which is about a race of telepaths that are all connected together into a web, called a pattern, which is controlled by a Patternmaster.  In my implementation, the role of pattern master is to provide a master tempo to everyone in the pattern, and also the ability to transform the audio of every “node”, in a manner similar to this (fast forward to 3:46):

The best part is that once the pattern master transforms the audio and then creates the new tempo, the role is passed on to the next node in the network list; everyone is important.  The pattern will have a wide array of different interpretations because there is no single “leader”; everyone is the leader.

The network is made up of nodes, and the nodes I am creating now are “baby nodes”: primordial nodes (like “Boogie”) in a network that will evolve very quickly.  My first node is robotic because there is something incredibly exciting about creating a bio-mimetic robot avatar; when I move, it moves.  By watching it move, I can refine my own movements.  It is me.  There are no autonomous functions where it does things on its own.  This avatar will be a robot expression vector (it also keeps the wireless router as close to the exo-voice as possible).  I find this extremely exciting. Where does this lead?  I have no idea.  I am the function.

  

------

So, think about it…weird new instrument.  You buy it and can hack it, by design.  But then you can also “sync” with others who have the system (actually, it would only need the wireless router and the node patches in Pd; it could theoretically run on a smartphone).  Maybe that would mean beautiful music.  Maybe horrible noise.  Maybe visuals. My vision is one where any space where one “patternist” creates is a sonic fractal matrix consisting of one node, and any number of patternists can join them and create an evolving pattern of increasing self-organizing complexity.

THIS is the performance now; “creating” this sonic fractal matrix is as much a part of the expression as playing it.  The idea is exciting to me because I know that I can build it, even though I have never built it before.  24/7 goosebumps!  And since I am creating it for my purposes, even if no one gets into the idea, “I” can still play it!  This campaign is structured so that I can fulfill some aspect of the concept no matter how well the campaign does.  It will be another 1/2 generation before this takes hold, so it’s a long-haul iterative project.  In other words, it’s gonna happen no matter what.

So, come with me on this ride.  I promise that I am not even slightly exaggerating when I say that if you let me, I will blow your mind.  Let’s see where this goes…


How a Sonic Fractal Matrix works

Soooo… here is the logic and the methodology for how beatjazz evolves into a networked sonic construction system.

[image: stick man] This is a person….kinda.  It’s a stickman.  But it is a representation of a person, by a person who sucks at illustration.

A person can generate expression in many ways; talking, writing, jumping, walking, surfing, singing, etc.

The original beatjazz “controller” was created to provide a buffer through which expression could be channeled to become something more.

 

The beatjazz controller idea became the origin point for what is now called the “Exo-Voice”: a refinement of the beatjazz controller into a fully integrated “sonic prosthesis”, complete with its own software.

 

For the sake of continuity we shall present it like this:

 

 

 

[image: interface controls]

There are prosthetic attachments for each hand, one for the head (dealing with breath pressure in and out, monitoring, lip pressure, and head position), and one for each foot.

The person, now wearing the full prosthesis, is the “pilot” of a metaphorical sonic-spaceship construct. This construct is a sonic fractal matrix, level:0, the singular level.

 

The matrix is composed of 8 elements relating to the 8 “keys” used to input performance data (i.e., playing).  Each element is a different sound.  In addition to its sound component, each element also has:

 

 

  • A looping buffer that records played sound from that element and loops it according to the system tempo.
  • An edit mode that allows for sound design while in the midst of playing.  This methodology is called Gestural Trajectory Synthesis.
  • The ability to add effects, mute, change the volume of, and pan the sound.

 

The 8th element is called the “transform” element.  It twists all the other elements into a singular trippy, vortex-y sound with no limitations in tempo or pitch.  When a loop is made within this element, all other elements sync to its new tempo.  It sounds like this:

 

 

 

All together, this system is capable of sound concepts I am only just beginning to explore.
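To make the element idea concrete, here is a rough sketch of the state each element carries, written as C++ for illustration only; the real system is a Pure Data patch, so every name here is mine, not the patch's:

#include <vector>

// Illustrative per-element state (hypothetical names).
struct Element {
    std::vector<float> loopBuffer; // records played sound, loops at the system tempo
    bool  editMode = false;        // gestural sound design while playing
    bool  muted    = false;
    float volume   = 1.0f;
    float pan      = 0.0f;         // -1 = left, +1 = right
};

// The matrix is 8 of these; element 8 is the "transform" element, which
// resamples the mix of the other seven, and whose new loop length resets
// the tempo that every other loopBuffer syncs to.
struct SonicFractalMatrix {
    Element elements[8];
    double  systemTempo = 120.0;
};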

Along the way I accidentally discovered an interesting solution to a few problems: parameter feedback, laptop interaction, and wireless latency.

A couple of months ago I fixed most of my latency issues by switching from XBee wireless over to Wi-Fi.  The increase in bi-directional data speed allowed me to use the LEDs for more than just element color indication.  They were fast enough for parameter feedback! So I took the most important data (mute mode, record mode, panning and volume) and used the LEDs to provide that information, and it worked beautifully.  So much so that I no longer needed to interact with the “laptop” computer at all.  I also found that being “near” the wireless router helped to keep the latency (the time it takes for the computer to make something happen after you push a button) down. And the laptop takes me out of the ”altered state” that wearing this system drops me into.
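As a rough illustration of what that feedback amounts to (assumption: the actual mapping lives in the Pd patch and is surely different), each frame the system packs state like mute, record and volume into an LED color:

// Hypothetical sketch: pack element state into an RGB value for the
// prosthesis LEDs. The real colors/mapping in the patch differ.
struct Rgb { unsigned char r, g, b; };

Rgb feedbackColor(bool muted, bool recording, float volume /* 0..1 */) {
    unsigned char level = (unsigned char)(volume * 255.0f);
    if (recording) return Rgb{ level, 0, 0 };  // record mode: red, brightness tracks volume
    if (muted)     return Rgb{ 0, 0, 32 };     // muted: constant dim blue
    return Rgb{ 0, level, 0 };                 // playing: green, brightness tracks volume
}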

Around this time, I built a small robot and discovered that robots are amazing for parameter feedback.  It was much more engaging and intimate than projections because it was a physical, moving “thing”.  I began using it for parameter feedback and it became my first robot avatar.

It eventually occurred to me that giving the node legs would solve a number of issues, like having to set up computers at a show and having everything on a table or in a booth somewhere rather than having an avatar right next to you, linked wirelessly. Also, I have finally begun working on a foot interface to complement the hand interface, and will use the robot legs to provide sensor information from them. So I designed a small, simple system: a wireless router and a couple of inexpensive computer boards that will handle synthesis and everything else, in a form factor that feels something like “intimate”. This is Node:0.

[image: robot avatar v1]

Node:0-Level:0 is a small wirelessly controlled hexapod robot that houses everything necessary to convey expression and provides parameter feedback concerning the system.  Node:0-Level:1 is a larger robot that uses the Level:0 avatar as its brain.  This larger bot has a full speaker system as well as various digital projection and video recording capabilities.

Sonic fractal matrix-level:1 is a network of nodes.

[image: sonic fractal matrix network config]

As soon as 2 or more pilots (called patternists when networked) join the same network, a hierarchy is established.  The first patternist is node:0, the pattern master.  The second node is node:1, then node:2, etc.

The pattern master establishes the tempo that everyone is synced to.  All patternists are fully audible, all the time.  When the pattern master switches to their transform element, they can transform the transform element of every node on the network, as much or as little or for as long as they want, but when they set the new loop length after their transform “solo”, they become node:1 and node:1 becomes node:0 and thus pattern master.  Then the process repeats and cycles thru every node in the network.
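A minimal sketch of that hand-off, as C++ for illustration (the real system is Pd patches talking over Wi-Fi; the list rotation below is my reading of the renumbering, and with two nodes it matches the "master becomes node:1" wording exactly):

#include <algorithm>
#include <string>
#include <vector>

// Illustrative pattern-master rotation. nodes[0] is always the current
// pattern master; after its transform "solo" it rotates to the back and
// the next node is promoted, so the role cycles through every node.
struct Node { std::string name; };

struct Pattern {
    std::vector<Node> nodes;
    double tempo = 120.0;       // the master tempo everyone syncs to

    void join(const Node& n) { nodes.push_back(n); }  // node:0, node:1, node:2...

    // Called when the master sets the new loop length after transforming.
    void finishTransform(double newTempo) {
        tempo = newTempo;       // the whole pattern resyncs to the new tempo
        if (nodes.size() > 1)
            std::rotate(nodes.begin(), nodes.begin() + 1, nodes.end());
    }
};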

I envision 3 rooms: one for beginners to build things for the first time; a second room for intermediate pilots to “sync” or “mesh” with each other and trade ideas and even parts and techniques; and a main room where there are at least 1 or more patternists who can guide the pattern more fully.  These would be considered “pros”, each of whom is capable of expressing full patterns when solo.  The goal is to create a type of social construct where there is no leader, no band, no DJ; just a network of people who create the sound-space improvisationally.  A space where one can learn about it, learn how to do it, and actually do it, all in the same space at the same time.  And where the collective will is expressed, because every person becomes pattern master eventually, and if the network is too large for each patternist to become pattern master, it can be broken up into smaller patterns.

The most exciting part about this project is that I can do, or already have done, everything I described above, so at this point I just want to see it happen.  The exhilaration of hearing a room full of loosely related grooves being twisted together into a vortex of sound boggles my mind.  Let’s do this thing!!!


Coming to America and The New Normal

it has been a while since my last post.  this wasn't a lapse.  i needed to order my thoughts fully to express stuff properly.

The New Normal (?)

this "thing" or state or project or whatever it is, is changing, and it is changing me with it. (“the new normal” is a reference from a kevin kelly article I read) its is a very interesting feeling to feel as if you are being mentally manipulated by your own expression, in realtime.  manipulated may be the wrong term;  more like "programmed by your own will".  this didnt just start.  i "could" say that it began when i started playing around with the idea of live-looping with synthesizers back in 2007. vlcsnap-2013-01-16-16h31m41s195 but that was an origin point, like birth.  this is something markedly different.  it incorporates all of the previous experiences, and i do mean, ALL of them; from insignificant habits as an 8 year old and everything else, all the way up to a generalized point in October of 2012, just after a couple of shows in London where i was profoundly not happy with how i sounded.  the following week, just before a couple of shows in berlin, i changed a couple of key parameter trajectories that resulted in a sound and concept so alien to me, yet so obvious as the next step that it completely wiped my "imagination slate" clean.  And from that point, mutations, which before this fateful week were mostly hypothetical conjecture, became as clear as map points on google maps.

the big change?  simply, gesturally controlled granular time-stretching.  the same timestretching that has been a staple of audio recording and performance software since almost the beginning of software synthesis as a democratic musical concept.  it is a technique where the "play head" is looping the audio right underneath it but is also moving at normal speed, so it sounds like normally playing audio.  but if you slow it down, the little loop is still playing at normal speed, so it sounds like the audio has slowed down without changing the pitch.  in addition, the little loop can be slowed down and even reversed, so a whole range of manipulations can be had from this simple technique.  mine isn't even that complex.  it is based on the same one that comes in the help files for pure data, with a couple of gestural tweaks that allow me to control these variables with my hand position.
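for the curious, the technique reduces to a few lines of DSP.  this is a minimal single-grain sketch in C++, illustrative only: the actual implementation is the Pd help-file patch, real versions crossfade two overlapping grains, and the variable names are invented here, with hand position imagined as mapped onto stretch and grainRate:

#include <cmath>
#include <vector>

// Single-grain time-stretch sketch (hypothetical names; production versions,
// including the Pd help patch, crossfade two overlapping grains to hide the
// gaps that one windowed grain leaves).
struct TimeStretcher {
    const std::vector<float>& buf;  // the recorded audio
    double readPos    = 0.0;        // slow-moving play head: where in the audio we are
    double grainPhase = 0.0;        // fast phase inside the "little loop" (grain)
    double stretch    = 1.0;        // play-head speed: 1 = normal, 0.25 = slow, -1 = reverse
    double grainRate  = 1.0;        // speed of the little loop itself (pitch)
    double grainSize  = 2048.0;     // grain length in samples

    explicit TimeStretcher(const std::vector<float>& b) : buf(b) {}

    float tick() {
        // the little loop plays the audio right under the play head...
        const double kTwoPi = 6.283185307179586;
        double idx = readPos + grainPhase * grainRate;
        long n = (long)buf.size();
        long i = (long)idx % n; if (i < 0) i += n;
        // raised-cosine window so each grain fades in and out (no clicks)
        float w = 0.5f * (1.0f - (float)std::cos(kTwoPi * grainPhase / grainSize));
        float out = buf[(size_t)i] * w;
        if (++grainPhase >= grainSize) grainPhase = 0.0;
        // ...while the play head moves independently: that is the stretch
        readPos += stretch;
        return out;
    }
};

slowing stretch toward zero freezes the sound without changing pitch; running one of these per stereo channel, each mapped to a hand, is the "scratch drumming" idea that shows up later in this post.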

my system works by having 8 "elements", or sound sources. i can use finger pressure to select one of these elements as the focused one, then play it into a looping audio buffer.  each element has its own.  they are all time-synchronized with each other.  i simply move from element to element, building my arrangement.  i can mute and unmute any or all of them any time i want.  in September of 2012 i decided to add a "resampling" buffer to element 8 that would take the entire mix of all the other elements as a single audio file and allow me to manipulate it.  sonically, this was very interesting because now i could, for instance, play chords as a two-step process of playing a basic chord, then resampling it and playing new interval relationships with it.

but (and there is always a but) i had a problem i had had since the beginning of this idea of continuous live looping: i could not change the tempo.  i would do all of these changes, but if i needed to change the tempo or loop length, i needed to start over from scratch.  i found that the monotony of the predictability of the loop length became incredibly taxing on me, creatively.  like performing to a metronome, which i have done in the past, and was happy to stop doing.

so after said gig in London, i was frustrated enough with this issue to go into one of the many trance-like programming states that seem to be more and more common.  by the end of the 3rd week of October 2012 i had added the aforementioned timestretch algorithm in hopes of being able to change the tempo.  i did not expect the sound that came out of that hypothesis.  the sound could not only be stretched, it could be scratched, and ripped apart, IN STEREO.  yes, i can scratch the left side with the left hand and the right side with the right hand, at the same time, like some kind of scratch-drummer. THAT was the point; what was this sound?  why was this sound so alien yet so obvious?  i had completely baffled myself.  so few parameters, yet so much scope for narrative and syntax.  it was then, like being in a video game where you can only progress to the next level after you have solved all the puzzles on this level, that the "transform" element showed me the next stage: better, more timbrally complex elements to feed into the transformer.  with the transform element, there were no more mistakes.  mistakes become timbrally interesting building blocks that sound almost sweet when digested in this new matrix.  and the tempo is completely untethered from the previous musical ideas.  the transform process resets the system loop length for all elements, so it can be a brief stutter or long and flowing.

i can no longer predict what i am going to play in a performance.  the beginning of a "thought" no longer has any bearing on how it will end or how it will sound after its first transform.  this confused me as an aesthetic concept, as i didn't know how to contextualize this sound concept musically, so i started looking to mathematics for answers, which was easier than i thought it would be.  maybe by this point i was mentally ready for a more mathematical answer than an aesthetic one.  the one that impressed itself on my psyche most readily was that of fractals: in simplest terms, a basic construction algorithm of nature based on self-similarity, where phenomena are expressed, then divide and split and repeat (i am still studying this, so don't shoot me for my oversimplification).  i began to look at the system not as a "looping" system (looping is an incidental aspect of the way the sonic ideas are expressed) but as an expression evolution engine.  the expression vectors must evolve along a different type of route after being completely and unpredictably transformed in every way.  this is not restricted to sound.  in late December i cleared a data bottleneck by ridding my system of the serial communications bus, which was slowing down everything i wanted to do, and replaced it with much higher bandwidth 802.11g, or wifi.  this has enabled highly controllable expression vectors, which now consist of light (color, tempo, intensity, etc.), and robotics, for which pure data is extremely well suited.  i don't understand why the linkage between pure data and robotics isn't spoken about more, but i digress.

so yes, this is as much of a head fuck as it sounds like it is.  i have begun designing gestural fractal elements (sounds of varying timbral complexity that will translate a certain way) to feed this process, and have also started redesigning the transform matrix for more dramatic types of transformations.  and all of this has changed me.  i don't watch movies anymore.  they distract me from this crazy algorithm that is trying to parse itself in my head.  the only things i can really listen to are post-bebop/pre-70's jazz from artists like Sun Ra, Archie Shepp, Ornette Coleman and Eric Dolphy, none of whom i knew about 2 years ago, and when i would hear their musics randomly, i recoiled from them for being too “self-indulgent” (yeah, pots talking shit about kettles…i know).  now i can't get enough of it.  and every day, all day, is streams of the strangest series of questions about everything from stop signs to wall paint composition, to geographical locations of salt mines, and what the first humans did before language or tools, and 1000 other questions.  it was driving me a little mad, but now it's more comfortable.  it is the new normal.

this was the last 2-3 months, much of which was spent perched on my crappy office chair, rambling to either myself or my neighbor, but i knew that i was going to be in the US for the first time since 2009.  technically, i had been in the US twice in 2011, once for the TED thing and once for the Maker Faire in NYC, but those were both in NYC, and i didn't see any family, nor did i really have a defined concept as complete as this "mental algorithm" or "pattern".  i was acutely aware, as i am writing this blog post, that maybe they, the people in Savannah, Georgia and in Memphis, Tennessee, would think i was completely nuts.  maybe i had sat in my flat a month or two too long, exploring my own world in too much detail.  i knew those were possibilities, but the algorithm felt so right...so complete, that i was more curious than apprehensive (airport TSA shenanigans notwithstanding).

Coming to America

So I arrived in Savannah, Georgia to the completely surreal experience of seeing my own picture on the cover of the local weekly zine.  whoa!  I've never been cover material, or so i told myself.  that, and it was HOT, as in weather.  so i took advantage of the opportunity to revel in some much-missed sweetened iced tea and a whole list of unhealthy former addictive food-stuffs which usually populate the shelves near the counter at gas stations.  all that is beside the point though...I was in Savannah to do a series of presentations and performances for the Telfair Museums Pulse festival.  there was a whole digital arts exhibit with digital video arts, video gaming arts and more.  it was beautifully put together.

there were two days of full auditorium presentations for my project.  i was happy with the diversity of the people in attendance, especially the age ranges.  there seemed to be just as many older people as young ones.  so i just decided to drop it how i see it now, without any consideration that i might sound kooky, and surprisingly to me, everyone got it!  i just described the esoterics and how they tied to the mechanics of the system and idea, and everyone, the older folks and the kids as well, got it.  i could tell because some of the questions were very specific.  one kid, about 8 years old, asked if i had thought about putting the sensors in my body, and i explained, in detail, my apprehensions about sub-dermal sensors but said that i was open to EEG scanning, and he was completely cool with that answer!  cool!  questions about integration of this modality into the school system, its place in the continuity of jazz from a local older jazz cat (really happy that he saw the connections i was trying to present musically, without saying anything about them first), communal music creation, and on and on.  the Q&A sessions lasted from 30 minutes the first day to about an hour on the second and last day.

After the talks, the range of people that came up and the range of questions were very informed and interesting.  i felt like something had connected, and felt much more confident that i wasn't crazy.  that doesn't mean that everyone liked it.  there were swaths of people who got up and left as soon as i started playing, which was encouraging, since i don't want this idea to be middle of the road;  “dig it or don't, or build your own!” should be the motto.

I was also given some beautiful artwork by local artist PE Walker, created from reclaimed materials.  I was so happy to get these items home without destroying them in my luggage.

i had a week or so with family to chill out and revel in how old we're all getting and how grey we're becoming, and even talk about this "algorithm" (and by talk, i mean ramble endlessly), and it still kinda held strong.  nothing that fell out of my face was glaringly wrong or complete bullshit, because if it were, i would have heard about it right then and there.  this was heartening as i prepared for Memphis.

but i should say that Memphis was prepared for me!  they went all out! and by “they” I mean Eric Swartz and Sarah Bolton, who went to a lot of trouble getting me there (I met them while they were in Berlin filming a documentary called “Children of the Wall”, which you can check out here), and the guys at the Five in One Social Club, who dusted out their wonderful subterranean bunker and filled it with custom stages and screen-printed posters (yes, screen printed, like on t-shirts, but on cardboard, beautifully designed; they even made me a t-shirt version), and an amazing 3d projection mapping co-performance by Christopher Reyes (this and more photos from the night here). Memphis is special because my first time playing there, in 2008, was literally my last stop on my way to New York to fly to Berlin for the first time, to begin a new life.  and that party was heavy!  Jeremy Bustillo and I did an all-EWI-based party night, and held it down all night long.  the crowd kept it going!  and then again in 2009 at a space called Odessa, which was so warm and comfortable that i couldn't wait to play there again.

I was so chilled that i sat, onstage, for a good portion of my talk.  i rambled about all of the above mental stuff and interspersed it with bits of playing, and then dropped into a full set.  people were sitting on the floor and generally relaxed from the barbeque, cole slaw and iced tea that was being served, so i took the opportunity to investigate new ways of transforming.  i am having to come up with new terms for this stuff, but i found "stutter rips" and "diving deep" and my new and barely explored favorite so far, "scratch drumming", where i am learning to scratch each side of the audio in relation to the other.  i finally stopped when i felt like i had explored all the permutations i could think of that night.  i was tired and the crowd seemed satisfied. and the rest of the evening was super chilled out.  it was a great opportunity to talk interesting heavy stuff with many representatives of Memphis' artistic community.  i even met Larry Heard (house music legend) there!

 

After all of this, i spent a few nights in a Motel 6 near Hartsfield airport to decompress a bit, and even in the solitude of a hotel room, it was all a flurry of drawings and patch programming and design bits.  i think this is a permanent state now, which is ok with me.  the next stages are clear now, and in the coming week i will be doing another crowdfunding campaign to unveil what "next" means.  it will be weird (Trust!) but i think it will be as obvious to you as it is to me now.


Nothing like your own show poster t-shirt!

The guys at Five in One gave me this wonderful t-shirt and I thought I would show off their amazing design work here. If you live in Memphis, be sure to stop by later!

Notes on switching from XBee to RN-XV for low-latency applications using Pure Data

Happy new year!!  It's been a crazy, crazy month!  I won't even go into all the mental intensities emerging from digging deeper into the mathematical underpinnings of sound design (which resulted in a couple of weeks of burnout).  Although a RepRap breakdown has slowed down the pace of printing perks for crowdfunding campaign fulfillment, I have been working on personal mixes for my beatjazz supporter pack contributors and managed to mail off another controller! One more to go!!  It still amazes me how much nicer they are than my own, which is constantly being added to and tweaked.  This particular contributor received his system with my own personal XBee wireless data system, the one I have been using since the beginning of this journey.

As crazy as it sounds, one of the things I have learned about myself is that if I have a hard “future” thing to do but a comfortable “now” thing lying around, I will go for the “now” thing, especially when there is a deadline or the future thing is a bit too abstract to fiddle with.  So what I do is get rid of the “now” thing and make the “future” thing the “now” thing.  I have done this for years: when switching from saxophone to wind MIDI controller, from playing cover tunes to playing originals, from hardware to laptop, from wind controller to beatjazz controller and, last summer, from commercial audio software to using Pure Data as my central means of expression.

The RN-XV is marketed as “a drop-in XBee replacement” for the extremely popular, Digi-made XBee Series 1 and Series 1 Pro (the Pros are the same, but with 63 mW broadcast output compared to the regular Series 1's 1 mW), which excited me.  The XBees weren't my first dalliance with wirelessness.  I had a Kenton Midistream wireless MIDI system when they first came out in 2004.  I also used my iPhone's accelerometer as a gestural controller for most of 2010, the summer before I began working on the beatjazz controller.  The XBee was my first interaction with actual networking protocols, although it used an older, slower yet more reliable one: serial.

Serial was created a long time ago and it's very slow in comparison to today's protocols, but it's stable and good for less bandwidth-hungry data transmission, such as that coming from the low-powered Arduino platform.  There are multiple ways of integrating XBees with Arduinos, using snap-on “shields” and, in my case with the Arduino Fio, having an XBee “shoe” built directly onto the board.  This has worked for almost 2 years, with some caveats that are worth mentioning:

  • If the computer loses the connection, the software must be restarted, which sucks in a show.
  • Bidirectionally, the data coming from the Arduino is mostly speedy enough for low latency performance, but data transmitted BACK to the Arduino from the computer, using the aforementioned XBee Pros, was useless for anything time sensitive, so I only used it to change the light colors, infrequently.
  • If I am in the vicinity of a “hipster forcefield” (a phenomenon whereby the sheer number of smartphones, with their numerous wireless capabilities, drowns out my XBee's broadcast capacity and the system becomes unresponsive), I end up huddled over my laptop-situated XBee basestation, gasping for bandwidth.

The RN-XVs, made by Roving Networks, use wifi and are designed for serial communication, although not limited to it.  I purchased them but didn't really use them much until a recent robotics project (which I will be writing about soon).  I found them to be remarkably faster, by a wide margin, out of the box, but had less success when using them with the beatjazz prosthetics.  The data was very shaky and garbled and I had no idea where to even begin fixing it.  I put them away, mentally, until just recently, when I became interested in refining the system, which had seemed a bit labored lately.  So I just said fuck it, mailed the XBees off with the controller, and stranded myself in 802.11 land.

Along the way, I learned a lot of very valuable stuff, but there was one thing in particular that I wanted to share: if you use Firmata, as I do, you can use it (mostly) unedited.  You can use the RN-XV to create a wireless Arduino network with no serial port virtualization!

 

How to.

The goals:

  • Go pure wifi instead of serial for wireless communication.
  • Give each “unit” (headset, left hand, right hand) its own IP address, directed at a small dedicated wireless router (D-Link DIR-300).
  • Speed up bidirectional communication significantly enough to make better narrative use of the lights and haptic feedback, and reduce processing overhead.

 

  • [image: arduino object internals] First thing is to go into the [arduino] object in Pd and open it, where you will find this structure.

  • The inlet is where settings bound for the Arduino are input into the [pd command processing] abstraction, which then sends them to the [comport] object, Pd's serial interface.  The comport inlet (at the top of the object) sends the serially formatted data thru a serial connection to the Arduino.  In my system, the XBees were the virtual (wireless) serial cable.


The outlet at the bottom of the [comport] object is the data that comes “from” the Arduino.  This data is a Firmata-formatted stream of numbers:
224 17 0 (repeated, in a stream)

This stream is sent to the [pd make lists] abstraction and from there to the outlet, where it is routed to digital and analog [route] objects, from which you can assign values to things in Pd.
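Those numbers are Firmata's wire format: 224 (0xE0) marks an analog message for channel 0, followed by the low 7 bits and then the high 7 bits of the sensor value, so "224 17 0" decodes to "analog channel 0 = 17".  A tiny parser sketch in C++ (illustrative only; inside the patch, [pd make lists] and the [route] objects do the equivalent):

#include <cstdint>
#include <cstdio>

// Minimal Firmata analog-message parser. Analog messages are three bytes:
// 0xE0|channel, value LSB (7 bits), value MSB (7 bits).
struct FirmataParser {
    uint8_t msg[3];
    int     have = 0;

    void feed(uint8_t byte) {
        if (byte >= 0xE0 && byte <= 0xEF) { msg[0] = byte; have = 1; return; }
        if (have >= 1 && have < 3) {
            msg[have++] = byte;
            if (have == 3) {
                int channel = msg[0] & 0x0F;
                int value   = msg[1] | (msg[2] << 7);  // 17 | (0 << 7) = 17
                std::printf("analog %d = %d\n", channel, value);
                have = 0;
            }
        }
    }
};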

I needed to get sensor data from the Arduinos as fast as they could send it, then process it in Pd and send control data back to the Arduino, wirelessly.  It also needed to be as blazingly fast as possible.  My first idea was to emulate a serial port using one of the many virtual serial port programs on the internet, like Serial/IP, socat, HW-VCP and about 6-7 others.  In all cases, the data I was getting was garbled and would stop after a while (more on this later).  So after a long time of beating my head against this wall, I decided to simply get rid of the serial communications system and try this with networking objects.

I tried all of the networking objects I could find in Pd, with most people telling me to try UDP, which is a connectionless format that, unlike serial and TCP, doesn't need to “hand-shake” to allow data to pass back and forth.  Both ends of the link simply “scream” data at each other.  In this way it is great for things like Skype and video streaming, but not so much for things that have a low tolerance for errors.  So after another week of fiddling, I was able to place the [udpsend] object underneath the command processing abstraction, where the [comport] was.  [udpreceive] receives data from the Arduino and sends it to the rest of the Pd system.  [udpsend] accepts data at its inlet and also needs the IP address and port to send the data to the right Arduino.
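Outside of Pd, the same “screaming” pair looks like this (a POSIX socket sketch in C++; [udpsend]/[udpreceive] wrap this kind of socket, and the IP and port numbers here are hypothetical placeholders):

#include <arpa/inet.h>
#include <netinet/in.h>
#include <sys/socket.h>
#include <unistd.h>
#include <cstdio>

// Hypothetical addresses: RN-XV at 192.168.0.101:2390, we listen on 2391.
int main() {
    int sock = socket(AF_INET, SOCK_DGRAM, 0);   // connectionless: no handshake

    // bind our receive port (the [udpreceive] side)
    sockaddr_in me{}; me.sin_family = AF_INET;
    me.sin_addr.s_addr = htonl(INADDR_ANY); me.sin_port = htons(2391);
    bind(sock, (sockaddr*)&me, sizeof me);

    // where to "scream" control data (what [udpsend]'s connect message sets)
    sockaddr_in node{}; node.sin_family = AF_INET;
    inet_pton(AF_INET, "192.168.0.101", &node.sin_addr);
    node.sin_port = htons(2390);

    unsigned char bytes[] = { 0xE0, 0x7F, 0x01 };  // example Firmata-style bytes
    sendto(sock, bytes, sizeof bytes, 0, (sockaddr*)&node, sizeof node);

    unsigned char packet[1420];                    // matches "set comm size 1420"
    ssize_t n = recvfrom(sock, packet, sizeof packet, 0, nullptr, nullptr);
    std::printf("got %zd bytes\n", n);             // no delivery guarantee: it's UDP
    close(sock);
    return 0;
}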

The [udpreceive] object receives its data from a port on the router that I set on the RN-XV itself, negating the need for Arduino libraries or rewriting the Firmata firmware.  At this point I will jump to the RN-XV setup.  It will make sense when I come back to this abstraction.

RN-XV

The RN-XV is a powerful 802.11b/g transceiver marketed as a drop-in XBee replacement, meaning that it shares the same small pin spacing that the XBee is based on, but it is a bit more tedious to program.  For the purposes of this tutorial, we will stick to the features that will enable us to link it directly to Pd.

We need to give each RN-XV its own static IP address and port first.  This is where the SparkFun USB Explorer, or an equivalent USB-serial-to-XBee board, comes in handy.  With it you can program the module using a terminal like PuTTY or, in my case on Win7, CoolTerm.

In CoolTerm, go to Connection->Options, make sure the serial port your Explorer is on is selected and the initial baudrate is 9600, and click OK.

In the main window, click the “Connect” button, then in the terminal window type

$$$ (but DO NOT press enter afterward.)

This puts the RN-XV into command mode and the green light blinks more quickly.
If you type “get everything” (no quotes) you should see all of the settings for the transceiver. These are the ones I found most important:

  • “set ip address” <ip address> – the IP address of the RN-XV.
  • “set ip local port” <port number> – the port that [udpsend] will send data thru to the Arduino.
  • “set ip host” <ip address of the computer>
  • “set ip remote” <port number> – the port that [udpreceive] will use to receive sensor data and send it to the system.
  • “set ip protocol” <1> – sets the RN-XV to use the UDP protocol instead of TCP.
  • “set wlan ssid” <ssid> – the name of the wireless network you are connecting to.
  • “set wlan channel” <ch. number> – set your channel number here. I set mine to 0 so it associates with the AP on whatever channel is there.
  • “set wlan join” <1> – joins the access point saved in memory (the one we input above).
  • “set wlan linkmon” <0> – I turned this off.
  • “set wlan rate” <15> – I am running at this rate, which is the fastest, 54 Mbit/sec.  1-2 Mbit/sec would be fine (0 or 1).  Roving states that lower transmit rates increase the range, so play with this one as you need.
  • “set sys autoconn” <0> – turns off the autoconnect timer.
  • “set uart baudrate” <baudrate> – effectively sets the baudrate of the connection.  I have mine set to 115200 baud and will experiment with higher rates, but this seems fast enough for right now.  One line in the Arduino Firmata firmware will need to be changed to allow this to work properly.  If you need to use CoolTerm with this RN-XV in the future, it must be set to this baudrate in the option settings.
  • “set uart flow” <0>
  • “set uart mode” <1>
  • And the last, and possibly most important, two settings:
  • “set comm size” <1420> – the “packet” size, i.e. the number of bytes sent over UDP as a block.
  • “set comm time” <5> – the number of milliseconds between flushes of data blasted out to the computer from the RN-XV.  Every 5 ms, it sends a packet of data that “should be” 1420 bytes.  I say should be because it is UDP and there is no guarantee of delivery.

After inputting all of this into the terminal window, type “save”, which saves it all to the module's own local memory (which is why the Arduino WiFly libraries are unnecessary in this instance), then type “reboot”, which puts it back into regular mode, ready to be placed back into the Arduino Fio.
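Put together, a full CoolTerm session looks something like this (the IP addresses, ports and SSID here are hypothetical placeholders; substitute your own network's values):

$$$
set ip address 192.168.0.101
set ip local port 2390
set ip host 192.168.0.2
set ip remote 2391
set ip protocol 1
set wlan ssid mynetwork
set wlan channel 0
set wlan join 1
set wlan linkmon 0
set wlan rate 15
set sys autoconn 0
set uart baudrate 115200
set uart flow 0
set uart mode 1
set comm size 1420
set comm time 5
save
reboot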

 

Arduino Firmata edit

Next thing is to use the Arduino software to change the baudrate of the Firmata firmware from 57600 to 115200.  Go to File->Examples->Firmata->StandardFirmata.  Scroll to the end of the sketch and then scroll up until you see


  Firmata.begin(57600);

and change it to

Firmata.begin(115200);

then upload it to the Arduino.  (You cannot have the RN-XV attached to the Arduino while uploading a sketch; put it on after uploading.)

Setting up the new network [arduino] object

[image: myarduino-udp]

OK, so back to our edited [arduino] object, which I have named [myarduino-udp].  We need to tell [udpsend] where to send its data, so we create a message with [connect xxx.xxx.xxx.xxx xxxx(, which is the IP address we put into the RN-XV earlier, plus its port number.  When you connect it and click on it, it “should” work and output a 1 at its outlet, signifying that it works.  If it does not, make sure that the router is working properly and that everything is connected.  If there is a red light blinking on the RN-XV, then it is not associated with the router.

Next, in [udpreceive], enter the port number that you entered as the remote port in the RN-XV settings.  If this is working, it should show the received IP address and bytes from the second outlet.  (I took this from the help files.)

[image: packet blocks]

Once it works, the “bytes received” number box should show some activity if you have sensors attached, so you should send some data at this point. For reference, I attached a [print] object to the output of [udpreceive] so I could see what was coming out (below). It is blocks of numbers starting with a 2xx number, then a 0-x number, and another 0 to about 17 or so.  These are the right numbers in the wrong format.  They must come as a stream, like:

200
0
2
224
0
0
230
1
5
etc…

So I used a list object called [list drip], which outputs a list as a stream.  This formatted the data properly and everything worked!  It was at this point I realized that either [comport] or the serial port had been the previous bottleneck in my system.  The system was now noticeably quicker than with the XBees.  So I decided to try to eke out a bit more speed by toying around with [list drip], since it was now the new bottleneck, when I came across this thread with test abstractions that sped up the [list drip] object significantly!  I ended up using an undocumented version called [list-drip-scheid], which is itself a faster variation of one called [list-drip-quick2]!!! (The thread is a great read as well.)
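In C++ terms, what [list drip] does is trivial; it is just the glue between UDP's block delivery and a byte-at-a-time parser (illustrative, reusing the FirmataParser sketch from earlier in this post):

#include <cstdint>
#include <vector>

// A UDP packet arrives as one block of up to 1420 bytes ("set comm size"),
// but the downstream objects want a stream: output the list one element at a time.
void drip(const std::vector<uint8_t>& packet, FirmataParser& parser) {
    for (uint8_t byte : packet)
        parser.feed(byte);
}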

One last thing I did was to place a [bang] object on the inlet of the [loadbang], because it doesn't send the data to the Arduino on the first [loadbang], so I do it manually after the patch has loaded.  And that should be it!  I say “should be” because there is always something that falls over somewhere, but this works for me and has been stable for about a week since I figured it out.  My operating latency is about 4 ms bidirectionally, and now the system is fast enough to control the lights with gestures and breath pressure in real time while allowing much more expandability.  As well, there are no weird comport, serial port, or FTDI driver issues; everything will work on any operating system with minimal fuss.  I now need only add IP-based components to my system for my router to recognize and Pd to utilize.

This is my first official tutorial, so let me know what you think: if it is confusing or too long or whatever.  I ain't gonna change it, but I will incorporate the feedback for future posts.  l8r.

EDIT: here is a demo of the complete system, working.

Me and my bot “Boogie”, working out in the lab.

the “twins”

I was going to hold off posting this for a while until I got caught up with my crowdfunding fulfillment and some other stuff, but things are advancing so fast right now that if I don't put this in the conversation, it will just get left behind. Introducing the first iteration of my beatjazz parameter feedback bots. I call them "the twins". There is one for each hand, and they give axial spatial feedback without the need for a projection. I will be sharing more about them soon, but I REALLY have to wrap up the previous campaign first.

Happy Holidays! The Beatjazz controller is ready for download for makers everywhere!

So finally, I have begun the process of fulfilling my promise of turning this into an open source project.  You can see the files for building your own controller here: http://www.thingiverse.com/thing:37472.  It is now in the wild.

I don't really know what this will mean to me or to others or to the project.  It might be ignored completely.  It might be jacked, with someone else taking credit for it.  It might be used for something I can't imagine, or just one piece might be relevant to more than just me.  I don't know...which is exciting.  I don't have experience here, so I am interested to see what happens.

What I hope is that people see that this is not an instrument, but an idea: an idea that one can create their own personal prosthetic computer interface and not have to be content with what the corporations think we want based on their little demographic surveys.

It is a seed.  It is not meant to stay in this configuration persistently, but to be modified, changed and even discarded once said person has found themselves through going through the build process.  I have recently seen 3 different systems inspired by the beatjazz system, and none of them look or operate like mine.  That is the point.  It is also the point of the construction: I designed it with a million little pieces so that each of those pieces can be iterated to each person's perfection.

I knew that I had to upload it now, because my own personal system iterates so much that I do not consider it to actually exist as a tangible thing.  It only exists as a tangible "state".  No part of my system has remained unchanged for more than 6 weeks, and there is no reason for it to.  It can change based on my whims, creative urges, geographical location, event occasion...it is a tangible thought...a corporeal idea.  And as such, I knew that if I didn't upload it now, it would iterate away from anything that anyone would understand.

So happy holidays and happy building.  And thank you for helping me build (one of) my dreams.

Transform presentation and talk from DDB-Tribal event


This was an unexpectedly interesting performance from a few weeks ago, where I had just changed a couple of fundamental function positions and everything became infinitely more comfortable for taking the sounds thru the process currently still called Beatjazz.  I hope you dig it.

Birth…

equations relative to each other are beginning to synergize. aesthetic based on function is proving to be a decent guiding principle for not just design, but sonic interval interpretation, formerly called music. the rabbit hole has hieroglyphs on the walls of its interior. the myth construct has become self-aware...
