Brief

5th Feb 2010

Abstract
The aim of this project is to create an interactive environment that lends itself to audience participation. There will be a gallery of interactive installations (consisting of industrial devices and sensors) which, when used, create and modify sounds. Audio supplied by the user interfaces will combine with the performed element to form the piece, which will use hardware controllers and computer software.

The software used will be Max/MSP, Max for Live and Ableton Live. From a laptop, the progression of the sound will be monitored and performed alongside the audience.

Concept
The analogue technologies that the users encounter are rooted in oil and gas industry electronics. This is reflected in the sonic output of the environment. By taking seemingly innocuous day-to-day sounds (factory ambiences, welding, drilling, static charges etc.), the installation will show that by re-working analogue technologies into the digital realm we can muster an interesting and unique sonic experience. Indeed, the sonic environment of daily operations in this industry is rich in musicality if applied in a suitable manner.

The visual feedback echoes the same concept: the interfaces are placed in metallic casings (reminiscent of oil rig platforms), which are located in an industrial-looking room. The rig becomes the instrument.

Group musicality and group dynamics are also explored. The users learn the instrument together; however, as it evolves over time they have to re-learn the new sonic meanings of their actions.

The Interactive Installations
Proposals for the interfaces include:
  • Floor Mats
    • Five or so mats are placed on the floor. Each mat triggers a sound/parameter when stood on, and these combine to make 8- or 16-bar loops of sound. The samples evolve depending on the number of times each mat is triggered or the pressure/velocity on it, and are then altered so the sound progresses. This can be realised using Arduino circuit boards and pressure sensors (a rough sketch follows below).
  • Glasses
    • A tilt switch is used to flick the sound on and off. Ideally this object would be haptic, perhaps using an accelerometer to provide the data: the X-axis may control pitch and the Y-axis speed (a possible mapping is sketched below).
  • Apple Remote
    • This would work in a similar fashion. I would like it to work with a visual interface, so the screen at InSpace would give a visual representation of the users’ movements.
  • Wii Remote
    • Potentially used inside a beach ball. As the beach ball is tossed around the room, the sound it triggers (from the laptop) would be carried spatially from speaker to speaker with the movement of the ball. The plan would be to set up a quadraphonic speaker system (a panning sketch follows below).
  • Camera
    • Live video feed which is analysed using Jitter to detect certain variables: colour of clothes, number of people in the room, number of people outside. This data will be used without the knowledge of the audience, so this parameter would be subtle, with the projected image of themselves as the only indicator of the variable.
  • Wall Sensors
    • A fundamental interface is a wall sequencer built using either infrared or light sensors. This would be linked up to the system using the matrixctrl Max object. Six sensors, with audience members interacting with the metallic wall (different distances and different numbers of people affect the sound), would create matrix combinations. As members moved back and forth, or away from a sensor completely, different combinations would be activated. Each combination can be used to trigger sounds or control parameters of existing sounds. LEDs would be projected to represent the current matrix layout (and hence audience position). This installation would be a highly interactive, and possibly energetic, experience (a sketch of the sensor-to-matrix mapping follows below).
The hope is to realise as many of these installations as time permits. However, I don’t want to clutter the performance unnecessarily. Therefore, a combination of only three data sources (two haptic objects and one data stream) seems a realistic goal at this stage.
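
The sketches that follow are purely illustrative; none of the pin numbers, thresholds, ranges or helper names are decided yet, and the final versions would likely live partly on the Arduino and partly in Max/MSP. First, one way a single floor mat could be read by an Arduino and reported to the laptop over serial (which Max/MSP can pick up with its serial object):

```cpp
// Hypothetical Arduino sketch for one pressure mat.
// Pin number and threshold are assumptions, not measured values.
const int MAT_PIN = A0;       // force-sensitive resistor under the mat
const int THRESHOLD = 200;    // analog reading above this counts as a step
bool pressed = false;

void setup() {
  Serial.begin(9600);         // Max/MSP reads this stream via its serial object
}

void loop() {
  int reading = analogRead(MAT_PIN);            // 0-1023
  if (!pressed && reading > THRESHOLD) {
    pressed = true;
    Serial.write(1);                                       // mat id
    Serial.write(map(reading, THRESHOLD, 1023, 1, 127));   // pressure as 1-127
  } else if (pressed && reading < THRESHOLD) {
    pressed = false;                            // mat released, re-arm trigger
  }
  delay(10);
}
```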
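
A possible mapping for the glasses, assuming the accelerometer readings arrive already normalised to the range -1 to 1; the pitch and speed ranges are placeholders:

```cpp
#include <algorithm>

// Hypothetical mapping for the glasses: accelerometer X chooses a pitch,
// accelerometer Y chooses a playback speed. Ranges are placeholders.
struct GlassesControl {
  int   midiPitch;      // 48-72, roughly two octaves around middle C
  float playbackSpeed;  // 0.5x to 2.0x
};

// ax, ay: accelerometer readings normalised to -1..1
GlassesControl mapGlasses(float ax, float ay) {
  ax = std::clamp(ax, -1.0f, 1.0f);
  ay = std::clamp(ay, -1.0f, 1.0f);
  GlassesControl c;
  c.midiPitch     = 60 + static_cast<int>(ax * 12.0f);   // X-axis -> pitch
  c.playbackSpeed = 1.0f + ay * (ay > 0 ? 1.0f : 0.5f);  // Y-axis -> speed
  return c;
}
```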
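
For the beach ball, the Wii Remote data would first have to be resolved into a rough position in the room. However that is done, the quadraphonic movement itself could use a simple equal-power pan across the four speakers, as in this sketch:

```cpp
#include <array>
#include <cmath>

// Equal-power quadraphonic panning sketch for the beach ball.
// x and y are the ball's estimated position in the room, normalised to 0..1
// (0,0 = front-left corner, 1,1 = rear-right corner). Estimating that
// position from the Wii Remote is a separate, unsolved step.
std::array<float, 4> quadGains(float x, float y) {
  return {
    std::sqrt((1.0f - x) * (1.0f - y)),  // front-left speaker
    std::sqrt(x * (1.0f - y)),           // front-right speaker
    std::sqrt((1.0f - x) * y),           // rear-left speaker
    std::sqrt(x * y)                     // rear-right speaker
  };
}
// The squared gains sum to 1, so loudness stays constant as the ball moves.
```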
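
Finally, a sketch of how six wall-sensor readings could be reduced to the on/off cells of a step matrix before being passed to matrixctrl; the 30 cm distance bands and eight steps are arbitrary choices for illustration:

```cpp
#include <algorithm>
#include <array>

// Sketch of turning six wall-sensor readings into the on/off cells of a
// step matrix (the data a matrixctrl-style sequencer would display).
constexpr int SENSORS = 6;   // one matrix row per sensor
constexpr int STEPS   = 8;   // sequencer columns

// readingCm: distance in cm reported by each sensor (negative = no reading)
std::array<std::array<bool, STEPS>, SENSORS>
buildMatrix(const std::array<int, SENSORS>& readingCm) {
  std::array<std::array<bool, STEPS>, SENSORS> matrix{};   // all cells off
  for (int s = 0; s < SENSORS; ++s) {
    if (readingCm[s] < 0 || readingCm[s] > 240) continue;  // nobody in range
    int step = std::min(readingCm[s] / 30, STEPS - 1);     // 30 cm per band
    matrix[s][step] = true;  // each occupied sensor lights one cell in its row
  }
  return matrix;
}
```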

Interactive Digital Performance (original brief)

Project Supervisors: Gavin Fort & Yann Seznec

Abstract

This project emerges from last year’s very successful digital orchestra project, which concentrated on the creation of digital instruments and the composition and performance of works using those instruments. Although unique and contemporary in many respects, the performances by the SuperD’Orch group nevertheless followed the traditional and static performer/audience dynamic. This project seeks to blur the boundaries between installation and concert through the introduction of interactivity and audience participation into digital performance.
“Haptic and Gestural interfaces offer new and novel ways of interacting with and creating new musical forms. Increasingly it is the integration of these interfaces with more complex adaptive systems or dynamically variable social contexts that provide significant opportunities for socially mediated composition through conscious and subconscious interaction” (Livingstone & Miranda, 2005).
The group may choose to look at motion sensors, mobile technology, video capture and analysis, room mics and sound analysis etc. The starting point for performance is sound/music but could incorporate movement/dance or a more theatrical element. Some examples include the Bangarama project where the act of headbanging is used as a control input, Ian McCreedy’s Kugel device or Kevin Baird’s piece No Clergy.
An important factor of this project is the audience input: the group must consider how much control is relinquished to the audience (or how much it appears is relinquished!) and the nature of the interaction. Will the audience’s input take the form of an additional instrument, an effect, or data to be processed in some other way? In doing so the group will establish a new relationship with the public and must therefore consider how they would define and convey the rules and boundaries of that new relationship.

Aims and Objectives

  • Define design strategies for an Interactive Audio/Visual ‘installation’ system to complement/augment live performance.
  • Create the situation for, and carry out, a unique and interesting performance which has a significant element of audience input.
  • Create interesting sound elements through a variety of means including recording practices and synthesis.
  • Tackle concepts of virtuosity and digital technology as a tool of democratisation.
  • Explore alternative or experimental approaches to composition.
  • Engage in cross-disciplinary collaboration in the context of audio-visual practice.

Learning outcomes

  • An appreciation and understanding of alternative approaches to composition
  • Performance skills
  • Approaches to the design and construction of installation elements including hardware hacking
  • Team work
  • Realtime sound generation and processing
  • Max/MSP
  • SuperCollider

Presentation suggestions for both submissions

1) It is suggested that for the first submission the group produce a website that compiles their research to date, clearly states their intentions with regards to the design of an interactive system, and if possible, includes video footage of prototypes and/or the team at work.
2) During presentation week the group are expected to hold a performance with a significant element of interaction/input from the audience. This could be through a device or piece of equipment that can be used by an audience to interact with a performance.
3) The final submission should consist of well-edited documentary footage of the presentation as well as personal accounts by the individual team members of their role in the group.
Again it is suggested, but not necessary, that this takes the form of a website. Remember:
“it is difficult to specify exactly what you should submit here but we definitely do NOT want personal diaries about how well or badly the project went, we want context and critique and no more than around 750 words. These can be illustrated with images, visualisations, sounds, video.”

Bibliography/Further Resources/Recommended and related websites

  • Behrman, D. (1991). ‘Designing interactive computer-based music installation’, Contemporary Music Review, 6(1), 139–142.
  • Berghaus, G. (2005). Avant-garde Performance: Live Events and Electronic Technologies. Palgrave Macmillan.
  • Livingstone, D. & Miranda, E. R. (2005). ‘Orb3 – Adaptive Interface Design for Real time Sound Synthesis & Diffusion within Socially Mediated Spaces’, Proceedings of the 2005 International Conference on New Interfaces for Musical Expression (NIME05), Vancouver, BC, Canada.
  • Miranda, E. R. (2001). Composing Music with Computers. Focal Press.
  • Miranda, E. R. (2002). Computer Sound Design. Focal Press.
  • Roads, C. (1996). The Computer Music Tutorial. Cambridge, Mass.: MIT Press.
  • Winkler, T. (2001). Composing Interactive Music: Techniques and Ideas Using Max. Cambridge, Mass.: MIT Press.
  • Wishart, T. & Emmerson, S. (1996). On Sonic Art. Amsterdam: Harwood Academic Publishers.