Public Presentations

IRCAM @ BU CNM April 25-29, 2016

All presentations are free and open to the public

IRCAM’s team will host and present a series of lectures, demonstrations, discussions, and question-and-answer sessions on music technology and aesthetics. These presentations will cover historical and aesthetic topics as well as technological and scientific ones, with invited participants from the Massachusetts Institute of Technology and the Rensselaer Polytechnic Institute. The April 27 workshop will conclude with a round-table discussion on the issues raised throughout the day, moderated by B. Lee Roberts, Boston University.
These events will all take place in the BU GSU Auditorium (775 Commonwealth Ave., Boston, MA 02215).

April 25 10am-1pm

Musical Computing and Technology
Gérard Assayag & Andrew Gerzso
Some of IRCAM’s leading figures, including Andrew Gerzso and Gérard Assayag, will present both a historical overview and the state of the art in musical computing and research as seen from the IRCAM perspective. This fast-moving lecture-demonstration will include many examples from pieces created at IRCAM.

April 26 10am-1pm

Spatialization and Computer-Assisted Composition
Immersive 3-D Audio Rendering using IRCAM Spat

Markus Noisternig, IRCAM STMS Lab

IRCAM’s Spatialisateur (Spat, French for “spatializer”) provides tools for immersive 3-D audio and room-reverberation rendering. Artistic play with spatial relationships adds a further dimension of expressivity to musical performance and redefines the understanding of sound in space.
This lecture will focus on applications of the Spat software to music composition, performance practice, and media arts.

Computer-Aided Composition using OpenMusic
Jean Bresson, IRCAM STMS Lab

OpenMusic is a visual programming environment dedicated to musical data processing and generation, used by a large community of composers to carry out varied aspects of their compositional processes.
We will present the main features and characteristics of this environment, as well as a number of applications in contemporary music production.

April 27 10am-1pm, 2:30pm-5pm

The State of the Art: Workshop on Musical Sound Spaces

These presentations will go into greater technological and scientific detail and highlight recent research and developments, especially in the domains of virtual sources, Ambisonics and spatialization, and instrument radiation. They should be of interest both to scientists and to a broader public, including musicians. We will conclude with a round-table discussion on the issues raised throughout the day, moderated by B. Lee Roberts, Boston University.

Recent Advances in Immersive 3-D Audio Technologies
Markus Noisternig, IRCAM STMS Lab
10:15am-11am

Recent advances in 3-D audio technologies give rise to new ways of creating spatial experiences with sound; along with rhythm, melody, harmony, and the color of sound, space has become an essential element of expression in music composition and performance.

This talk gives an introduction to modern 3-D audio technologies that use high-density loudspeaker arrays, such as Wave Field Synthesis (WFS) and Higher-Order Ambisonics (HOA), and discusses their respective advantages and limits. It also offers some insights into 3-D audio recording technologies, such as spherical microphone arrays, and discusses how these new sound projection and recording technologies help create novel musical effects.
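As a rough, textbook-level sketch of the HOA principle (not a description of any specific IRCAM implementation), a source signal $s(t)$ arriving from direction $(\theta,\varphi)$ is encoded by weighting it with spherical harmonics up to a chosen order $N$:

\[ B_{lm}(t) = s(t)\, Y_{lm}(\theta,\varphi), \qquad 0 \le l \le N,\ -l \le m \le l, \]

which yields $(N+1)^2$ ambisonic channels. A decoder then maps these channels to loudspeaker feeds, for example by evaluating (and pseudo-inverting) the same spherical harmonics at the loudspeaker directions; higher orders sharpen spatial resolution at the cost of more channels and loudspeakers.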

From Symbolic Music Processing to Spatial Audio: Research Directions in Computer-Aided Composition
Jean Bresson, IRCAM STMS Lab
11am-11:45am

Computer-aided composition processes traditionally deal with symbolic musical material, manipulated algorithmically and rendered using classical score representations and parameters (pitches, rhythms, etc.). Sound processing and spatialization, on the other hand, generally run in real-time interactive environments.
Research and development carried out over the past 15 years in computer-aided composition systems has aimed at bridging these fields. This idea will be illustrated through musical research projects carried out in the OpenMusic environment, with a particular focus on recent applications that integrate the control of sound spatialization into compositional processes.

Playing with (Helmholtz) Blocks: Interactive Analysis and Design of Instrument Bores
Prof. Anthony T. Patera, Massachusetts Institute of Technology
Work in collaboration with D. B. P. Huynh and L. Nguyen (Akselos, S.A.) and M. Yano (University of Toronto).
12:15pm-1pm

Accurate and detailed solution of the Helmholtz equation of acoustics in complicated three-dimensional configurations is typically a time-consuming and resource-intensive task. In this talk we describe a new computational environment that provides real-time, interactive analysis and design of acoustic systems, in particular wind-instrument bores.

In the first, offline stage, we construct a library of pre-computed components – in the case of instrument bores, circular duct segments with side holes, bends, branches, and expansions. In the second stage, we define any desired model – an instrument bore or, more generally, an acoustic duct – in terms of parameters that map to an assembly, or system, of components. In the third stage, we query our model for different values of the parameters – related to bend radii, placement of holes (open or closed), shape of the bell, and frequency – to evaluate outputs such as impedance and to visualize the three-dimensional pressure field. The second and, in particular, the third stages can be performed very rapidly: in minutes and seconds, respectively. We demonstrate the system through several user interfaces to a cloud server.
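To make the offline/online idea concrete, here is a deliberately simplified sketch (in Python, with illustrative names) that uses the classical one-dimensional transfer-matrix model of a bore rather than the reduced-basis 3-D Helmholtz components described in the talk: a small library of parameter-dependent segment matrices, an assembly step that chains them, and fast impedance queries at arbitrary frequencies.

```python
import numpy as np

# Simplified 1-D transfer-matrix analogue of the offline/online workflow
# (not the talk's reduced-basis 3-D Helmholtz solver): each duct segment
# is a pre-computed, parameter-dependent 2x2 acoustic transfer matrix; a
# bore is an assembly (product) of such matrices, and new parameter values
# are queried without re-building the components.

RHO, C = 1.2, 343.0  # air density [kg/m^3] and speed of sound [m/s]

def cylinder(length, radius):
    """Offline library component: lossless cylindrical segment (plane-wave model)."""
    zc = RHO * C / (np.pi * radius**2)          # characteristic impedance
    def tm(freq):
        k = 2.0 * np.pi * freq / C              # wavenumber
        kl = k * length
        return np.array([[np.cos(kl), 1j * zc * np.sin(kl)],
                         [1j * np.sin(kl) / zc, np.cos(kl)]])
    return tm

def assemble(components):
    """Second stage: chain component matrices into one system matrix."""
    def tm(freq):
        total = np.eye(2, dtype=complex)
        for comp in components:
            total = total @ comp(freq)
        return total
    return tm

def input_impedance(system, freq, z_load=0.0):
    """Third stage (query): input impedance, ideally open far end (z_load ~ 0)."""
    (a, b), (c, d) = system(freq)
    return (a * z_load + b) / (c * z_load + d)

# Example query: a two-segment stepped bore evaluated at a few frequencies.
bore = assemble([cylinder(0.30, 0.008), cylinder(0.30, 0.012)])
for f in (220.0, 440.0, 880.0):
    print(f, abs(input_impedance(bore, f)))
```

In the talk’s setting, each library entry would instead encode a pre-computed, parameterized three-dimensional Helmholtz component, but the offline-library / parametric-assembly / rapid-query structure is the same.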

Building Performance Spaces and Infrastructure for Spatialized Music
Differentiation in Spatialization in Relation to Other Parameters in Composition and Listening
Johannes Goebel, EMPAC, Rensselaer Polytechnic Institute
2:30pm-3:15pm

There are not many performance spaces that are defined and constructed with the spatialization of sound and music as a fundamental design requirement. At the Curtis R. Priem Experimental Media and Performing Arts Center (EMPAC), Rensselaer Polytechnic Institute, we built a concert hall and two studios specifically to this requirement, and we appear to have achieved what we aspired to, which is never certain with acoustics. This talk will present the design considerations, the spaces themselves, and descriptions of a few works realized there.

Equally important as architecture, room acoustics, and technology are considerations of how hearing, listening, composition, and aesthetic-cultural context are correlated. How do compositional ideas and strategies relate to perception, memory, and musical context? Or, the other way around: how do perception and musical context support or contradict compositional ideas? How does the compositional shaping of sound in space relate to listening and experience during a performance?

Round-Table

B. Lee Roberts, Boston University, Moderator
3:30pm-5pm


April 28 10am-1pm

Tribute to Pierre Boulez and composer round-table

In a special tribute to Pierre Boulez, the recently deceased founder of IRCAM, Andrew Gerzso, his longtime collaborator for music technology, will talk about their work together and answer questions about Boulez’s approach to technology. This will be followed by a composer round-table featuring Beat Furrer, Chaya Czernowin, Joshua Fineberg, Andrew Gerzso, and others, moderated by Allen Speight.
10am-11am Andrew Gerzso
11am-1pm Round-table