Networked_Music_Review

Interview: Amit Pitaru

Amit Pitaru is an artist, designer and researcher of Human-Computer Interaction (HCI). As an artist, he develops custom-made musical and animation instruments, and has recently exhibited/performed at the London Design Museum, the Pompidou Center in Paris, the Sundance Film Festival and the ICC Museum in Tokyo. He is also a designer with a particular interest in Assistive Technologies and Universal Design. He was recently commissioned by the MacArthur Foundation to write a chapter for an upcoming book on his recent work – creating toys and software that are inclusively accessible to people with various disabilities. As an educator, Amit develops curricula that focus on the coupling of technology and the creative thought process. He regularly teaches at New York University’s ITP and Cooper Union’s Arts department.

Helen Thorington: Welcome Amit. I know you’re originally from Israel, but tell us something more. Where did you grow up, what is your educational background and how did you become a musician?

Amit Pitaru: I was born in Jerusalem and later raised in Tel-Aviv. I also spent a couple of years in Canada as a child. My parents had a friend named Yael Bernstein who invented a successful method of teaching children music through colors and shapes. It turned out that I was her first guinea pig, at the age of 5. I’ve been playing one instrument or another ever since.

I later studied with Arik Shapira, a modern composer. He influenced my musical approach in ways I only realized years later. For example, I remember that he once composed a piece for a chamber orchestra, recorded the performance, chopped it up into itty-bitty little pieces, and then pieced it back together into a new composition. But the way he glued those pieces back together wasn’t random or purely algorithmic; it was all notated on paper to the last note! This was long before samplers and computer editing, so he did it with scissors and glue. It sounded awful – like shattered glass – but I loved it. I was his worst student but probably understood his wacky compositions better than many others. The odd thing is that he refused to teach me his personal take on modern composition. Instead, he pounded solfège, Bach canons and all of that foundation material into my head. He used to throw stuff at me when I didn’t get it right. Man, it was love though, or maybe just tough.

Helen: When did you start thinking about building your own instruments, and why? When did you first use one in concert?

Amit: At 23, I moved to NY and started playing in clubs – Funk, Reggae, Hip hop – anything really – but mostly I became obsessed with playing Blues on a Hammond organ. This instrument really opened my eyes to how a tool can influence musical expression and how a sound can affect the soul – you need to see it in church on a Sunday morning in order to understand. I guess this is true for the piano, violin and especially the electric guitar, but I happened to really internalize it while playing the Hammond organ. One of my favorite things to do with the Hammond is to play it upside down – meaning to place a heavy object on one of the keys and then play the drawbars (which affect the sound’s harmonics). Needless to say, this was pretty annoying for everyone else in the room. But still, I wanted to find a way to create a composition based on this method, and that made me think of Arik and his work. In my mind I imagined an instrument, or rather a new interface for a Hammond, that would allow me to compose what I had in mind. But I had no idea how to open up a Hammond organ and build my own knobs into it. A couple of years later, I became very proficient at using computers in recording studios, and at some point someone installed a Hammond-organ emulator on the computer – it even looked like a Hammond on the screen. The moment I saw it, things clicked into place: ‘If I could just build a graphical interface around that software sound engine…’ I don’t know what made me think that I could learn to program a computer – perhaps I would not have started if I knew how hard it would be. I’ll spare you the details, but a year later I made the Hammond Flower Organ.
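The “upside down” playing Amit describes amounts to holding one note and shaping its timbre by moving the drawbars, each of which sets the level of one harmonic of that note. A minimal additive-synthesis sketch of that idea follows; this is not the Hammond Flower Organ’s code (which is not published here), and the fundamental, the drawbar “performance” and the file name are made up for illustration.

```python
# Illustrative sketch only (not Pitaru's Hammond Flower Organ): one held
# note whose timbre is shaped over time by "drawbars", i.e. additive
# synthesis where each drawbar sets the level of one harmonic of the note.
import wave
import numpy as np

SR = 44100                # sample rate (Hz)
DURATION = 4.0            # seconds
FUNDAMENTAL = 220.0       # the key held down by the heavy object (assumed)

# Classic Hammond drawbar pitch ratios relative to the fundamental
# (sub-octave, quint, unison, octave, and so on up to the 8th harmonic).
DRAWBAR_RATIOS = [0.5, 1.5, 1.0, 2.0, 3.0, 4.0, 5.0, 6.0, 8.0]

t = np.linspace(0.0, DURATION, int(SR * DURATION), endpoint=False)

# A made-up "performance": each drawbar sweeps slowly between 0 and 1 at
# its own rate, standing in for a hand moving the drawbars.
levels = [0.5 * (1.0 + np.sin(2.0 * np.pi * (0.1 + 0.07 * i) * t))
          for i in range(len(DRAWBAR_RATIOS))]

# Sum the harmonics, each weighted by its drawbar level at that instant.
signal = sum(lvl * np.sin(2.0 * np.pi * FUNDAMENTAL * ratio * t)
             for lvl, ratio in zip(levels, DRAWBAR_RATIOS))
signal /= np.max(np.abs(signal))  # normalize to avoid clipping

# Write a 16-bit mono WAV file.
with wave.open("drawbar_sketch.wav", "wb") as f:
    f.setnchannels(1)
    f.setsampwidth(2)
    f.setframerate(SR)
    f.writeframes((signal * 32767.0).astype(np.int16).tobytes())
```

The result is a single sustained pitch whose color changes continuously, which is roughly the effect of parking a weight on a key and playing the drawbars.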

Helen: When did you start thinking about making your instruments available to others?

Amit: After the Hammond Organ, I started making more instruments and also became close friends with James Paterson – the guy behind http://presstube.com. We started collaborating, and I found myself building instruments for both music and animation production. At the time I was performing on my music tools as well as presenting at new-media conferences together with James. In both cases, people often approached me after the show and asked to try out the instruments. At some point we started getting requests to place these tools in art exhibits. For the first time, I had to think about how to clean up these tools so they could be used by others.

Helen: What have you learned from watching the way others use them, particularly children?

Amit: It’s a privilege to have so many people engage these instruments; it allows me to test new ideas and improve the interfaces – under the notion that by improving the interface for others I’m also making it a better tool for myself. It’s often surprising and always insightful.

As most people cannot spend more than an hour with an instrument in an exhibit, I cannot expect them to become experts and create accomplished music with it. But still, it’s a great way to test the interface – whether it’s intuitively engaging, self-explanatory, and suggestive of its own potential. These are all attributes that are often missing from new instruments, if looked at through the lens of traditional tools. For example, I remember observing children engage a piano for the first time; it’s fun to bang on those keys, and it’s fun to discover that each key produces a separate note, and it’s fun to try to play a simple tune. Many end up playing a tune in just a few minutes. It’s a game – a puzzle. And most incredible is that this same instrument, with its simple interface that has not changed in centuries, also has the depth for a Leonard Bernstein or Thelonious Monk to express their genius. In contrast to the piano, my instruments have not had centuries to evolve to this level of perfection (and probably never will). But looking at children engage them for the first time is as close as I get to the truth about their nature.
I can also learn a lot about people from how they interact with my instruments. For example, the Sonic Wire Sculptor has been exhibited in Japan, Korea, France, Germany, Israel, Spain and other places, and I enable the audience to save their work. I am always amazed to see the differences in how each culture approaches the tool, and what people eventually save. I could write an entire essay about this stuff. For example, in France, people will allow themselves to play with it for up to 15 minutes while others are waiting, and at the end the audience will clap! In Austria, many users used the tool to reproduce their national anthem or other well-known tunes. In Japan, they will grasp the nature of the interface in seconds, will not spend more than 5 minutes out of courtesy to others, but will remain in the room and try it again several times (each time for 5 minutes). In Spain, they use it with passion, sometimes to the point of breaking it, and will also produce some of the most interesting work.

Helen: Have digital technologies altered your conception of and approach to composing and performing? How?

Amit: In the past few years I’ve been looking hard at the line that separates music from sound. As you sit in a room in front of the computer, there are a few background noises, like the computer fan or cars outside, that your brain is tuning out (until now, that is). You obviously do not consider these sounds music, but rather background noise. But composers today are asking listeners to treat these sounds as music – to process them in the areas of the brain that usually process music. Musicians are also asking the audience to accept the fact that these sounds are detached from a physical body; for example, when you hear a guitar playing, you automatically envision someone strumming a guitar. But what do you envision when you are sitting in a concert hall, listening to screeches and blips? We are asking a lot of our audience. I’d like to bridge some of these gaps – to explore new forms of musical expression without discarding the notions, revealed over centuries, of how and why music is listened to.

Helen: Thanks Amit!


May 18, 2007
