Sunday 20 May 2007

“Keyboard based glove video controller” (artifact)




At the beginning, the aim of the project was to build a glove that controls a VJ-ing programme called Arkaos. I wanted to do this because I use a lot of VJ-ing programmes and I have found that none of those I have used offers the possibility of being controlled while standing up, away from the computer. I looked into a few different ways in which it might be possible to control a piece of VJ-ing software while standing up and performing. At first I thought that it would be nice to have some kind of control that interacts with a projection screen, e.g. a Bluetooth controller, which would interact with the software. For the software that I wanted to use this was not possible, as there is no way of telling it to react to the movements of a Bluetooth controller. Another idea was to try and get the software to work with a MIDI keyboard and a computer keyboard, so I thought that maybe I should make something that would work with those two devices. Using Arkaos it is easy to map any computer keyboard, but I could not find any way of using a MIDI signal or keyboard to my advantage. As it is so easy to map any key on a normal computer keyboard to any action in Arkaos, I looked into creating a wearable control. As you have to use your fingers to press keys, the control had to be based around the wrist and hand.
At first I thought it would be nice to have an entire keyboard sewn into the arm of a jumper. I made several attempts at this using a flexible keyboard, but they did not turn out so well, so I turned to my backup plan of embedding the keyboard into a glove.


Development


Using the USB flexible keyboard, I pre-mapped specific keys in Arkaos, and then set about cutting open the board so that I could use the keys. I mapped the four keys that were most comfortable to use with one hand: 4, F5, 7, and I. If you put your hand on these keys, you will see that they are perfect for an outstretched hand. I then had the problem of finding the materials to make the glove, and first of all the problem of cutting open the keyboard. My original idea was to place a key under each finger, but this turned out to be impossible because the inside of the keyboard is a single flexible plastic circuit on which all the keys are connected to each other. It was impossible for me to cut out a single key, so I decided to use the entire circuit.
The problem appeared when I realised that the circuit was too big to fit into a normal glove.
The only solution was to make a custom glove big enough for the circuit. Thanks to the flexibility of the keyboard I was able to bend the circuit and wrap my hand inside it.
Unfortunately I couldn't make a normal glove with separate fingers, so I made one that wraps around the whole hand.
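
Since the glove is, electrically, just a normal USB keyboard, any program that listens for key events can be used to double-check which keystrokes the cut-down circuit still sends before mapping them in Arkaos. The small Processing test sketch below is my own quick check (it is not part of Arkaos): it simply displays and logs the last key or key code received, which is useful because function keys such as F5 arrive as key codes rather than characters.

String lastKey = "nothing pressed yet";

void setup()
{
  size(300, 150);
  textAlign(CENTER, CENTER);
  textSize(24);
}

void draw()
{
  background(0);
  fill(255);
  text(lastKey, width/2, height/2);  // show the most recent key sent by the glove
}

void keyPressed()
{
  if (key == CODED) {
    // special keys (F5, arrows, etc.) only have a key code
    lastKey = "keyCode " + keyCode;
  } else {
    lastKey = "key '" + key + "'";
  }
  println(lastKey);  // also log it to the console
}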


Technical preparation


• From the keyboard I cut out the selected keys and stuck them back over the circuit with some superglue.

• I fixed the circuit around my hand with some sellotape, but it was really uncomfortable because the bottom part of the hand wasn't stable and some of the keys couldn't make contact.

• I stuck a piece of cardboard on the bottom part to make it more stable on surfaces and to let the keys work easily.

• To build the glove I cut an arm off a jumper, to cover the outside and the inside of the circuit.

• Wrapping the circuit with the sleeve was quite tricky, but in the end I found a nice way to fold it properly.

• I didn't know how to sew, so after a few attempts I found out that the superglue could hold even this kind of material properly.



The glove is quite comfortable and the keys are really sensitive. Even if it is limited to only four keys, it has turned out to be a useful object for controlling Arkaos.
It is possible to connect more than one keyboard, so while one hand controls the software, the other can move around inside the glove, still sending inputs to the video. Obviously there are a lot of devices like this, but none is built specifically for VJ-ing software. Building this one made me realise that I should probably try to improve this device.
At the beginning my aim was to build something that could control a video performance not from the stage but from the front side of the projection.
But this would have involved building a similar device connected by Bluetooth.
Unfortunately I couldn't find any flexible Bluetooth keyboard.
The idea is to give control even to the people attending the show and let them interact with the installation.
The possibilities are very wide thanks to the broad connectivity of this kind of software, which is what most VJs use.

Monday 26 March 2007

ETUDE 4: SOUND ATMOSPHERE PLAYGROUND

Your name

MATTEO FONTANA

Your Pathway Combination

DIGITAL ARTS & MUSIC TECH

The title of your etude

SOUND ATMOSPHERE PLAYGROUND

A short statement of intentions

I AM TRYING TO CREATE A SOUND ENVIRONMENT THAT EVOLVES OR CHANGES THANKS TO THE MOVEMENT CAPTURED BY SENSORS.


What you initially wanted the piece to be about / for - its conceptions
How you initially intend to create the etude

I WOULD LIKE TO BUILD A SORT OF INTERACTIVE AUDIO SPACE USING SOME PRESET LIBRARIES.
THE SOUNDS RESPOND THANKS TO THE SENSORS PLACED IN THE INSTALLATION AREA.

Conceptualization of the work

I WOULD LIKE TO RECREATE A PARTICULAR ATMOSPHERE FOR THE PLACE I WILL CHOOSE.
WHAT IS IMPORTANT IS WHERE I AM GOING TO SET UP THE INSTALLATION: THE SURROUNDINGS NEED TO BE LINKED WITH THE SOUND IN ORDER TO REFLECT WHAT THE SPECTATOR IS SEEING.
THE IDEA IS TO GIVE THE SORT OF SENSATION THAT IT IS SOMETIMES POSSIBLE TO FEEL WHEN YOU STAND IN FRONT OF A LANDSCAPE; OBVIOUSLY EVERYONE USES THEIR OWN JUDGMENT TO INTERPRET WHAT THEY SEE.
LEADING PEOPLE THROUGH A SENSOR-BASED PATH WILL HOPEFULLY GIVE THEM THE SENSATION OF INTERACTING WITH THE PLACE AND, IN A CERTAIN WAY, THE POSSIBILITY OF CONTROLLING THE ATMOSPHERE ITSELF.


I HAVE ALWAYS BEEN INTERESTED IN IMAGES AND VIDEO, BUT I REALISE THAT A SOUND OR A TRACK IS STILL VERY IMPORTANT IN A VISUAL INSTALLATION. SOUNDS GIVE US A SORT OF PERCEPTION THAT LEADS US INTO A PARTICULAR MOOD AND MAKES US PERCEIVE WHAT IS AROUND US IN A MORE INTENSE WAY. OBVIOUSLY EVERY ONE OF US HAS THEIR OWN PERCEPTION OF SOUND AND IMAGES. TAKING CONTROL OF THE SOUNDS CAN REALLY CHANGE THE WAY WE SEE WHAT WE ARE LOOKING AT.
DEVELOPMENT:

FIRST I STARTED TO THINK ABOUT A FEW FEELINGS THAT ARE EXPERIENCED ALMOST EVERY DAY
BY ALMOST EVERYBODY.
I USED LOGIC AND A FEW RECORDINGS I HAVE MADE TO COMPILE SOME TRACKS.
THE CONCEPT IS REALLY SIMPLE.
I HAVE DRAWN A PATH THROUGH A NORMAL PLAYGROUND, AND THE IDEA IS TO PLACE SENSORS THAT ACTIVATE THE SOUNDS WHEN A PERSON PASSES NEARBY.
THE SENSORS WILL NOT ONLY ACTIVATE THE SOUNDS BUT WILL ALSO AFFECT THEIR CONTENT. IT IS POSSIBLE TO CONNECT THE IMPULSE TO AN EFFECT USING THE AUTOMATION TOOLS PRESENT IN THE MOST COMMON MUSIC PROGRAMS SUCH AS ABLETON LIVE, LOGIC, CUBASE ETC. WITH THIS PROCESS THE SENSOR WILL READ THE MOVEMENT WITHIN ITS SUPPORTED RANGE AND AFFECT IN REAL TIME THE SOUND PLACED AT THAT PARTICULAR POINT (SEE THE SKETCH BELOW).
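
As a rough illustration of how one sensor impulse could drive automation in the DAW, here is a minimal Processing sketch. It assumes The MidiBus library is installed and that a virtual MIDI port named "Bus 1" is visible to Ableton Live / Logic; both the library choice and the port name are my own placeholders, and the mouse position stands in for the sensor reading until the MIDIsense board arrives. In the DAW, the only extra step would be to MIDI-learn that controller onto the effect that should follow the movement.

import themidibus.*;

MidiBus midiOut;

void setup()
{
  size(400, 100);
  MidiBus.list();                            // print the available MIDI ports to the console
  midiOut = new MidiBus(this, -1, "Bus 1");  // no MIDI input, output to the virtual port "Bus 1"
}

void draw()
{
  background(0);
  // Stand-in for a proximity/force sensor reading, scaled to the 0-127
  // range that a MIDI controller value expects.
  int sensorValue = int(map(mouseX, 0, width, 0, 127));

  // Send it as controller change number 1 on channel 0; in the DAW this CC
  // can then be MIDI-learned onto any effect parameter or volume automation.
  midiOut.sendControllerChange(0, 1, sensorValue);

  // Simple visual feedback of the value being sent.
  fill(255);
  rect(0, 0, sensorValue * width / 127.0, height);
}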

CONNECTING THE SENSORS:

I FOUND ON THE INTERNET THE MIDIsense KIT (WWW.adafruit.com), WHICH ALLOWS YOU QUITE EASILY TO CREATE A BOARD ALREADY EQUIPPED FOR SENSORS. UNFORTUNATELY THE SITE DOESN'T SELL THE BOARD READY-MADE BUT ONLY THE KIT, WHICH MEANS YOU NEED TO HAVE THE PROPER TOOLS AND YOU NEED TO BE REALLY CAREFUL NOT TO DAMAGE THE BOARD WITH THE STATIC ELECTRICITY YOU CAN TRANSMIT WITH YOUR HANDS.


I HAVEN'T BOUGHT THE BOARD YET, BUT I THINK THAT THIS IS THE BEST SOLUTION.
"there are 3-4 different boards. Each board is customized to handle a particular kind of sensor. For example, one board handles 'resistive' type sensors: force, bend, and photosensors all act like variable resistors, and require a certain kind of conditioning circuitry. Another board will handle 'capacitive touch' button sensors: this type of sensor requires a specific chip to function. And so on. "

ETUDE 3: VIDEO INTERACTIVE INSTALLATIONS

Your name

MATTEO FONTANA

Your Pathway Combination

DIGITAL ARTS AND MUSIC TECH

The title of your etude

VIDEO INTERACTION AND CAMERA TRACKING ART

A short statement of intentions

I WOULD LIKE TO CREATE A VIDEO INSTALLATION THAT INTERACTS WITH THE SHADOWS OF THE PEOPLE STANDING IN FRONT OF THE SCREEN.
I HAVE SEEN A LOT OF INSTALLATIONS LIKE THIS, BUT NO ONE EXPLAINS HOW TO REALISE THIS KIND OF PROJECT.


CAMILLE UTTERBACK is one of the first artists I have seen doing an installation like this. The picture above is from her project "Text Rain", which involves participants "using the familiar instrument of their bodies, to do what seems magical - to lift and play with falling letters that do not really exist. In the Text Rain installation participants stand or move in front of a large projection screen. On the screen they see a mirrored video projection of themselves in black and white, combined with a color animation of falling text. Like rain or snow, the text appears to land on participants' heads and arms. The text responds to the participants' motions and can be caught, lifted, and then let fall again. The falling text will land on anything darker than a certain threshold, and 'fall' whenever that obstacle is removed." (http://www.camilleutterback.com/)

Later on I found other interesting pieces that use the same technique.

Zack Booth Simpson & Ken Demarest created a similar installation to "Text Rain", but this time using sand:



"A stream of liquid sand flows from above and reacts with your shadow as if it were solid. Its hypnotic motion conjures childhood feelings of playing with water or building wet sandcastles. Like making shadow puppets, you can easily construct concave structures with your hands to catch the sand and then you can pour it from hand to hand or maybe into your friend's mouth. Play with it long enough and you might discover some of its many secrets."(http://www.mine-control.com/sand.html)



WHAT I HAVE UNDERSTOOD SO FAR: THE MATERIALS NEEDED ARE A PROJECTOR AND A CAMERA THAT STREAMS THE IMAGE OF THE PROJECTION AREA STRAIGHT INTO A PIECE OF SOFTWARE. THE PROBLEM IS THAT MOST OF THE SOFTWARE BEHIND THESE PROJECTS IS CUSTOM BUILT.
ALL THE APPLICATIONS AND CODE THAT I HAVE FOUND TEND TO BE CLOSED SOURCE,

SO I DECIDED TO START CODING WITH "PROCESSING", WHICH IS AN OPEN SOURCE APPLICATION
THAT EXPLAINS WITH SOME EXAMPLES HOW TO CONNECT A CAMERA FEED TO A COMPUTER
AND INTERACT WITH IT.
I STARTED TO BUILD SOME SMALL VIDEOS WITH ANOTHER OPEN SOURCE APPLICATION, "FLXER", BUT THE COLOUR-TRACKING CODE I TRIED DIDN'T WORK STRAIGHT AWAY:

import processing.video.*;

// Variable for the capture device and the colour we are tracking
Capture video;
color trackColor;

void setup()
{
  size(200, 200);
  frameRate(30);
  colorMode(RGB, 255, 255, 255, 100);
  // Using the default capture device
  video = new Capture(this, 200, 200, 12);
  trackColor = color(0); // start off tracking black (e.g. a shadow)
  noFill();
  smooth();
  strokeWeight(4.0);
  stroke(0);
}

void captureEvent(Capture camera)
{
  camera.read();
}

void draw()
{
  video.loadPixels();

  // Draw the video image on the background
  image(video, 0, 0);

  // Local variables to track the closest match to the tracked colour
  float closestDiff = 100.0f;
  int closestX = 0;
  int closestY = 100;      // only row 100 of the image is scanned below
  int[] Xs = new int[2];   // the last two x positions that matched
  Xs[0] = 0;
  Xs[1] = 0;

  // Begin loop to walk through every pixel of row 100
  for (int x = 0; x < video.width; x++) {
    int loc = x + (video.width * 100);

    // What is the current colour?
    color currentColor = video.pixels[loc];
    float r1 = red(currentColor);  float g1 = green(currentColor);  float b1 = blue(currentColor);
    float r2 = red(trackColor);    float g2 = green(trackColor);    float b2 = blue(trackColor);

    // Using euclidean distance to compare colours
    float d = dist(r1, g1, b1, r2, g2, b2);

    // If the current colour is more similar to the tracked colour than the
    // closest one found so far, save the current location and difference
    if (d < closestDiff) {
      closestDiff = d;
      closestX = x;
      Xs[0] = Xs[1];
      Xs[1] = x;
    }
  }

  // Mark the closest match found in this frame
  ellipse(closestX, closestY, 16, 16);
}


I AM GOING TO KEEP TRYING TO LEARN THE CODE FOR THIS APPLICATION, AND I THINK I AM ALSO WILLING TO BUY AN ARDUINO BOARD.
AS I UNDERSTAND IT, THAT WILL GIVE ME MORE POSSIBILITIES.

conceptualization of the work

I THINK THAT THE CONCEPT BEHIND THIS KIND OF INSTALLATION IS THE RELATIONSHIP BETWEEN THE HUMAN MESSAGE AND THE MACHINE.
CONSIDERING THAT THE SOFTWARE IS AN APPLICATION BUILT BY SOMEONE, IT IS EASY TO INTERACT
WITH A VIDEO THAT RESPONDS TO THE MESSAGE THE ARTIST WANTS TO GIVE.
USING YOUR BODY IT IS POSSIBLE TO INTERACT WITH THIS MESSAGE AND, JUST BY FOLLOWING YOUR SENSES, GIVE A UNIQUE RESPONSE. IN THIS CASE WHAT IS CONSIDERED ART IS NOT THE PIECE ITSELF, BUT THE MESSAGE THAT COMES OUT OF THIS HI-TECH TOOL.

write about

I THINK THAT THE MAIN POINT OF COMMUNICATION IS THE RESPONSE YOU CAN GET FROM SOMEONE.
THAT WAY YOU CAN UNDERSTAND WHETHER YOUR STATEMENTS ARE CORRECT, OR IT IS SIMPLY SATISFYING TO HAVE SOME SORT OF RESPONSE.
WITH AN INTERACTIVE VIDEO INSTALLATION THIS RESPONSE CAN BE GIVEN IN A DIFFERENT AND MORE ARTISTIC WAY, LEADING THE ARTIST INTO A MORE INSPIRED STATE, BECAUSE THE DIALOGUE IS SIMPLY DIFFERENT.

how successful is the Etude

THIS RESEARCH BROUGHT ME INTO A FIELD I HAD NEVER CONSIDERED.
IN THE END I COULDN'T GET THE CODE WORKING, BUT SEEING OTHER PEOPLE'S PROJECTS KEPT ME GOING. I THINK I'LL COME UP WITH SOMETHING, MAYBE BY ASKING FOR HELP FROM THE PEOPLE WHO HAVE ALREADY BUILT THIS KIND OF INSTALLATION.

Sunday 25 March 2007

ETUDE 2: UV LIGHT AND SMALL PROJECTS

Your name

MATTEO FONTANA

Your Pathway Combination
DIGITAL ARTS and MUSIC TECH

The title of your etude
UV LIGHT IN A COMPUTER BASED ENVIRONMENT

A short statement of intentions
I WANT TO RESEARCH THE VARIOUS FUNCTIONS OF UV LIGHT AND HOW THEY CAN BE USED IN AN ARTISTIC WAY,
EXPERIMENTING AS MUCH AS POSSIBLE TO FIND OUT WHAT THE VARIOUS POSSIBLE CONNECTIONS ARE.

What you initially wanted the piece to be about / for - its conceptions
How you initially intend to create the etude
I WOULD LIKE TO GAIN KNOWLEDGE OF THE USE OF THESE LIGHTS IN ORDER TO EXPRESS, WITHOUT LIMITATION, EVERY POSSIBLE EFFECT I CAN GET.
Conceptualisation of the work

WHAT I LIKE ABOUT THESE LIGHTS IS THE FACT THAT THEY ARE NOT REALLY 'REAL': YOU CAN SEE THEIR EFFECT ONLY UNDER PARTICULAR CONDITIONS. UV LIGHT IS MORE VISIBLE ON WHITE SURFACES AND IN A DARK ENVIRONMENT, ETC.
THESE LIMITATIONS CAN LEAD ARTISTS TO DISCOVER MORE VARIED TECHNIQUES AND GIVE THEM ONE MORE TOOL OF EXPRESSION.

Write about:

BY RESEARCHING THESE LIGHTS I HOPE TO GAIN A COMPLETE VIEW OF THEIR FUNCTIONALITY IN ORDER TO SOLVE ANY KIND OF LIMITATION I COULD ENCOUNTER. MY INTENTION IS TO CARRY ON WITH IT AND USE ALL THE PATHS POSSIBLE TO FIND AN EASY WAY OF EXPRESSING MY INTENTIONS.

Documentation of Technical and Artistic Process

LOOKING AT VARIOUS WEBSITES I HAVE DISCOVERED MANY PROJECTS THAT USE UV LIGHT.
A REALLY NICE ONE IS HOW TO BUILD A UV LAMP FROM AN OLD SCANNER.


THIS IS A REAL PROJECT; I'M USING PICTURES AND MATERIAL FOUND AT THIS LINK (http://www.instructables.com/id/EAQTE8M7FIEXCFHAL0/). THE PROJECT YOU'LL FIND IN THIS ETUDE REPRESENTS ONE OF THE EXPERIMENTS I HAVE TRIED IN ORDER TO UNDERSTAND THE FUNCTIONALITY OF UV LIGHT.
I AM NOT GOING TO POST ALL THE PROJECTS I HAVE DONE, BUT ONLY THE ONES THAT I THINK HAVE SOME KIND OF RELATION TO A COMPUTER-BASED ENVIRONMENT.

1)

First of all, safety. Notice that this is a mains-powered device, so maximum care must be taken in the design to ensure that under no circumstances can someone come into contact with live parts. If you are not sure about mains-voltage electrical wiring practice, ask a friend or someone else who is.



2)

The first thing I did was to disembowel the scanner; after all, what I needed was the case with the glass and cover. No electrical parts were reused. Of course you may want to save the motors, belts, screws, head parts... you know, almost everything.

Then I opted for 3 tubes (around 8 Euro each), with ballasts and starters (6/7 Euro for each set). The switch, fuse holder and mains socket I scavenged from somewhere.
For the bottom plane I used a tin sheet. This acts as a sort of mirror/diffuser for the UV.
I also used some scrap aluminium bars from kitchen furniture, as their colour in the pictures shows, plus spacers and screws as required.

Now, the pictures show the electrical diagram and the interior of the UV bed.
The circuit is based on three TL5 8W Wood's light (UV) tubes. Each tube is powered by its own ballast and starter. The number of tubes can be increased at will. The circuit is provided with a safety fuse and a power switch. A power socket (taken from a PC power supply) complements the circuit.

All metallic parts inside that have an external metallic counterpart electrically connected to them must be earthed, i.e. connected to the ground line. This is an important safety measure: if something goes wrong and the metal inside becomes 'live', the safety switch or fuse you have in your home trips and no one is injured. Otherwise the external metallic part may become live without anyone noticing until it is touched. In my case, since I used a metallic bottom and metallic spacers and screws to hold it to the case, I bolted the metallic bottom to the ground line.


3)


Grommets must be used where the wires cross the metallic plane, so that the plastic insulation of the electric wire does not wear out against the metal plane.

The fuse must be rated for the lamps used. In my case 3 x 8W at 220Vac requires a 0.5A fuse.
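
(A quick sanity check of that rating, by my own arithmetic: three 8W tubes draw about 24W in total, and 24W / 220V ≈ 0.11A of running current, so a 0.5A fuse leaves comfortable headroom for ballast losses and start-up current.)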

The tubes, ballasts and starters must be rated together. If the ballast is rated too high, the tubes
get burnt; if it is rated too low, the ballast burns. Ballasts are typically rated with ranges, e.g. 4-20W.
With different wiring and rating, one ballast can serve two tubes. Check with the ballast supplier.

On second thoughts, I think I shouldn't have removed the scanner's head. I should have mounted a single tube on the moving head of the scanner and used the stepper motor and belt to move the head back and forth.
To provide uniform lighting the head would have to be moved at a non-uniform speed (arcsinusoidal, i.e. an inverse sine function, I am guessing). Exposure time would be set by the head-lamp scanning faster or slower as required. But that's another story.

One final important notice: UV light is dangerous to the eyes, so do not stare at the tubes when they are lit.

MY IDEA

THE PROJECT I INTEND TO REALISE, THANKS TO THE KNOWLEDGE I HAVE GAINED FROM INSTRUCTABLES:

I WANT TO BUILD A BOARD AND SPRAY SOME UV-SENSITIVE PAINT OVER IT.
PLACING A UV LIGHT IN FRONT OF THE BOARD, I WOULD LIKE TO BE ABLE TO DRAW SHAPES WITH SHADOWS, USING RANDOM OBJECTS OR EVEN MY HANDS.

PLACING AN OBJECT BETWEEN THE LIGHT AND THE BOARD WILL CAPTURE THE OBJECT'S SHAPE.
BY EXPERIMENTING WITH MORE THAN ONE COMBINATION I WILL BE ABLE TO CREATE AN ACTUAL DRAWING, WHICH OF COURSE WILL DISAPPEAR WHEN THE LIGHT HITS THE BOARD AGAIN.



THE CONCEPT OF THIS WORK IS TO MAKE A REVERSE USE OF UV LIGHT.
IN FACT, WHAT WE NORMALLY NOTICE IN FRONT OF A UV INSTALLATION IS THE DRAWING MADE BY THE UV LIGHT.
WITH THIS METHOD I WANT TO HIGHLIGHT THE EFFECT CAUSED BY THE SHADOW.

I HOPE THAT THIS TECHNIQUE WILL OFFER ANOTHER SOURCE OF INSPIRATION FOR THOSE WHO ENJOY USING THIS KIND OF INSTALLATION.

THE MEANING I CAN FIND CAN START FROM DIFFERENT POINTS OF VIEW: FOR EXAMPLE, HOW LONG THE DRAWING IS GOING TO STAY ON THE BOARD, OR HOW THE NEXT OBJECT THAT APPEARS CAN BE RELATED TO THE ONE BEFORE.

Friday 16 February 2007

ETUDE 1: MIDI signals and their functions

Etude Number 1

Basic Details
RESEARCH INTO MIDI FUNCTIONS AND HOW MIDI WAS BORN
Your name
MATTEO FONTANA
Your Pathway Combination
DIGITAL ART AND MUSIC TECH
The title of your etude
MIDI SIGNALS AND THEIR FUNCTIONS
A short statement of intentions

To get a comprehensive knowledge of the connectivity between various sorts of devices and video clips, I have done some research through a few web sites and books.


Write about:
I discovered that there is more than one way to make this possible; one of them is the MIDI format.
This signal is only one of the possible ways, but it is also the most common.
MIDI data is digital, meaning that the information sent is in the form of multiple on/off signals. It is not in the form of an analog audio signal. MIDI data can only travel in one direction through a single MIDI cable. Most MIDI devices are equipped with both MIDI input and MIDI output. This means that electronic instruments with MIDI capabilities are able to transmit and also sometimes play MIDI data. MIDI specifies 16 separate channels, allowing for the control of up to 16 different instruments simultaneously. Communication between MIDI devices is done through the passing of messages, which are composed of 3-byte (24-bit) strings. These messages are transmitted at a rate of 31.25 kbaud, or 31,250 bits of information per second. When a computer, sequencer, controller, or other MIDI device receives the MIDI data, the data is decoded and interpreted. MIDI-capable instruments respond to messages according to their current mode.
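
To make those numbers a bit more concrete, here is a tiny Processing sketch (my own illustration, not taken from any of the sources above) that builds a standard 3-byte note-on message out of its status byte and two data bytes, and estimates how long it takes to travel down a MIDI cable at 31,250 bits per second.

// Building one 3-byte MIDI channel message by hand.
int channel  = 0;    // channels are numbered 0-15 internally (shown to users as 1-16)
int pitch    = 60;   // middle C
int velocity = 100;  // how hard the note is struck (0-127)

// Byte 1: the status byte combines the message type (0x90 = note on) with the channel.
int statusByte = 0x90 | channel;
// Bytes 2 and 3: data bytes, always in the 0-127 range.
int dataByte1 = pitch;
int dataByte2 = velocity;

println("Note-on message: " + hex(statusByte, 2) + " " + hex(dataByte1, 2) + " " + hex(dataByte2, 2));

// On the wire each byte is framed with a start and a stop bit (10 bits in total),
// so a full 3-byte message takes about 3 * 10 / 31250 seconds, i.e. roughly 1 ms.
println("Transmission time: " + (3 * 10 * 1000.0 / 31250.0) + " ms");
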
MIDI was born in 1983, but at the beginning it was used only to connect music devices such as keyboards and synthesizers; more recently MIDI has been used to control software instruments, drum machines, and so on. Without MIDI you couldn't play them as instruments. But MIDI support isn't limited to these applications: many VJ and live video applications, such as motion dive. tokyo (distributed by Edirol), Resolume, Arkaos VJ, and Vidvox Grid also support MIDI input for control and live visuals. MIDI has even found its way into some 3D applications, including Maya.
But MIDI is also a technology that represents music in digital form. Unlike other digital music technologies such as MP3 and CDs, MIDI messages contain individual instructions for playing each individual note of each individual instrument. So with MIDI it is actually possible to change just one note in a song, or to orchestrate an entire song with entirely different instruments. And since each instrument in a MIDI performance is separate from the rest, it's easy to "solo" (listen to just one) individual instruments and study them for educational purposes, or to mute individual instruments in a song so that you can play that part yourself.
All these functions have the same application for video-based installations, but this is often ignored. In fact,
MIDI controllers are even more application-agnostic than MIDI software, because as long as you ignore the labels on the knobs and faders, you can use MIDI devices for whatever you want. Some have standard MIDI ports, some have USB or FireWire connections for computers, and many these days have both. Keyboards were among the first MIDI hardware and still make up a big chunk of the market, but the need to control computer software with something more useful than a keyboard has led to lots of knob- and fader-covered control surfaces, hardware designed for use with live visual apps and VJ performances, virtual turntables for scratching, MIDI wind instruments, MIDI guitars, MIDI drums, foot pedals, wireless controllers, and even MIDI accordions. MIDI can also be found on some light boards, and other control specifications used for lighting, like DMX, can be interfaced with MIDI using an adapter. (LanBox, http://www.lanbox.com, is a particularly useful vendor for MIDI-to-DMX products suitable for installations as well as theatrical applications.)
MIDI hardware is generally oriented toward performance, but what that performance is can be entirely up to you. If you enjoy playing a MIDI wind instrument but would rather control video than make sound, you can; just select the port for the device you want to use in any software that supports MIDI input.