
Scores, Programs, and Time Representation: The Sheet Object in OpenMusic

Jean Bresson and Carlos Agon
Music Representations Team
Institut de Recherche et Coordination Acoustique/Musique (IRCAM)
1 Place Igor Stravinsky, 75004 Paris, France
{jean.bresson, carlos.agon}@ircam.fr

Computer Music Journal, 32:4, pp. 31–47, Winter 2008. © 2008 Massachusetts Institute of Technology.

Current computer-music systems deal with complex processes related to heterogeneous musical contents and information. In particular, the objects involved in music composition correspond to various temporal paradigms and representations. For instance, contemporary music composers frequently deal with time durations, or even directly with the continuous aspects of sound structures, instead of (or in relation to) a traditional pulsed-time representation. In these cases, however, continuous musical material must generally be associated with discrete and symbolic structures to allow for its meaningful representation (Boulez 1987). The score is a general framework for music writing in which this integration of musical material and the related issues of time representation should be carefully considered. Regardless of whether it concerns instrumental or electro-acoustic music, this document is of major importance as a formal support and a preferred place for musicians to write, read, and think about music.

In this article, we focus on this notion of the score in computer-aided composition (CAC) through a presentation of related work in the visual programming environment OpenMusic (Agon 1998; Assayag et al. 1999). We introduce a new object recently created in OpenMusic: the sheet. This new object gathers results of CAC experiments and research concerning the relations between programs and score representations, the integration of sound and signal processing in compositional processes, the mixing of heterogeneous temporal systems, and the general visual representation of time structures in music notation. The sheet editor allows for accurate rhythmic notation while integrating specific features such as support for new types of musical objects and programming tools.

After a brief discussion of the question of the score in computer music composition, we present previous attempts at combining programs and scores in OpenMusic. The sheet editor is then presented. Its main features, which are detailed next, concern the common representation of musical objects related to heterogeneous time systems and the capability to develop programs from within the score. We then examine the possible function of this new object in the general CAC framework, and we conclude with a concrete example: the reconstitution of an excerpt from a score by Karlheinz Stockhausen.

Computer-Music Scores

Traditional musical categories are being deconstructed by contemporary musical practices and computer-composition tools. The nature and the role of the score thus have to be considered from a renewed point of view.

The concept of the score usually refers to a music representation built on a notation system inherited from the Western music tradition, which is basically a compound graphical structure used to organize musical events (notes) in time. More generally, it can be seen as a place where the composer writes music in terms of a standard and universal language describing musical parameters or instructions to perform in order to produce sound. In this sense, computer-composition tools commonly call any document containing a list or sequence of such parameters or commands a score. In principle, this kind of score is not always intended to be read by human performers and thus does not systematically meet high-level symbolic requirements. It can be represented with simple numerical data, then converted to synthesis parameters or to symbolic notation for performance.

Our position, however, is to reaffirm the need for symbolic musical representations in computer-system scores. As mentioned previously, a score is not simply a sequence of instructions but also a fundamental medium for composers to think about their music, to describe and communicate musical forms and structures according to a symbolic representation system. It supports and informs the underlying compositional intentions.

This notion of score can therefore be a problematic point, particularly in the context of electro-acoustic music (Ebbeke 1990). The questions of how an electro-acoustic score should be notated and interpreted, which are the objects of various research projects in the musicological and signal-processing communities, are not the object of our current discussion. It is important to comment, however, that these notions are getting closer to, and even sometimes tightly integrated into, the compositional process: Composers now create their own instruments using sound-synthesis languages and invent their own notational systems. The role of the score is therefore reinforced as a central support federating these different domains, for which adequate flexibility is required. Our objective, for now, is thus to think about how to represent and implement a score in a computer-composition system, given some notational conventions and possible sound-processing systems at hand.

A first related issue is that of the new objects used to represent and organize the score. Indeed, as much contemporary music precisely attempts to emancipate itself from traditional systems, note-based representations should be replaced by, or completed with, other kinds of sound-description structures and representations. Musical examples are widespread where traditional musical objects (chords, voices, etc.) are used in addition to other types of heterogeneous data (sound waveforms, envelopes, or other extra objects and symbols) meant to be interpreted by performers or by sound-synthesis programs.

Current score editors are generally restricted to traditional music notation and do not allow for the representation of these new types of objects and structures. On the other hand, advanced sequencers allow one to schedule musical events and to control different kinds of musical or sonic parameters, but without real symbolic notation. Often, composers use external graphical tools to integrate extra components into their scores. Some score-engraving and notation systems allow for the extension of common notation with user-defined symbols and graphics (e.g., Smith 1972; Nienhuys and Nieuwenhuizen 2003), but usually this extra score editing is done after the composition is completed. As a result, no interaction can take place within the score framework during the actual compositional process. The ENP system (Kuuskankare and Laurson 2006) proposes interesting solutions for working with personalized systems within programmable score editors. The ability to define new graphical objects, possibly related to specific synthesis parameters, constitutes an important and successful attempt in this direction.

Another example of a common problem is the cross-representation and manipulation of rhythmic and proportional-duration representations. Computer-composition systems generally provide tools for working with one or the other of these representations by using distinct objects and structures, but they do not allow for their common and simultaneous use. The work of Hamel (1998) is a notable exception: it proposes a flexible score editor allowing for the use and graphical representation of rhythmic notation within a linear time-representation system.

A second major issue concerning the notion of score in computer-composition systems is the reconsideration of the traditional static design of the score in the context of new compositional frameworks consisting of networks of personal conventions, representations, and relations. This is one of the main points addressed by computer-aided composition environments like those derived from the PatchWork visual programming language (Laurson 1996), OpenMusic (Agon 1998), and PWGL (Laurson and Kuuskankare 2002). We now discuss the ways in which the aforementioned issues guided the main lines of the early developments of the OpenMusic environment.

Score Programming in OpenMusic

Programs as Potential Scores

The creative and subjective aspects of music composition led to the evolution of CAC systems toward programming environments rather than ready-made programs. Programming languages are indeed best suited to allow composers to express their musical ideas using all the possibilities provided by the computer.


The concept of potential score introduced in Assayag (1998) relates notation to the languages that allow for the expression and the creation of music, as a materialization of information flowing in the system but also as a medium where formal inventiveness lives. Through this concept, the merging of notation with language (and in particular, with programming languages) is suggested as the core purpose of CAC systems: The computer program that leads to the creation of a musical structure becomes a potential score representation of a piece of music. This representation emphasizes the semantics of the musical processes and reflects the corresponding compositional intentions.

The principal current CAC environments started from such considerations and enriched programming environments with tools, features, and interfaces allowing for the development of specific musical models and processes. In particular, with visual programming languages such as OpenMusic, the musical structures can be represented, created, and manipulated graphically in relation to the processes from which they are derived.

Scores in Compositional Processes

In addition to its fundamental visual programming features, the OpenMusic environment was designed to allow for a constant interaction between the musical data and processes by means of a variety of interfaces and editors. In particular, the score editors are major elements in the compositional processes created in OpenMusic visual programs.

At the origin of this design stands the idea of programs that would contain applications related to each other by functional means. This idea was inspired by the OpenDoc project by Apple Computer (Orfali, Harkey, and Edwards 1996), the purpose of which was to integrate heterogeneous components inside a federating document. The role of the application (which usually produces documents) was reversed by centralizing multiple applications in a document, considered as a more intuitive working space for a computer user. This project was never actually completed, but it inspired the architecture of OpenMusic, which extended this basis by considering the program as the federating document. In this way, the communication, relations, and interdependencies between the different components (i.e., applications) could be better and more easily dealt with than if they were integrated in a mere document.

In OpenMusic, a program is represented in a patch editor and constitutes a framework for the functional integration of (musical) data and processes. In the patch editor, all these components are represented by graphical boxes linked to each other by graphical connections. Some special boxes called factories represent the musical data structures (classes) created and processed by the program (see Figure 1). These boxes make it possible to generate and store musical objects and to manipulate them by programming. The factories also represent applications in the program: They are associated with special editors, allowing for manual interaction with the musical data (Agon and Assayag 2002). In Figure 1, for instance, the breakpoint-function (BPF) editor allows for the manual shaping of the input values of the process represented, while the score editor allows for the editing and modification of the result of this process: We see that manual changes (additional notes, ties, etc.) have been performed on the voice object.

By integrating the editors/factories among the different musical objects and processes in a common symbolic description, the patches therefore constitute powerful musical representations maintaining scores as a central part of compositional processes. However, except within the different musical object editors, the patch does not actually form a consistent temporal structure and thus cannot be considered as a real score representation. A relevant representation should at least include a temporal dimension, so that the music represented can be mentally or concretely unfolded in time.
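
To make the factory mechanism of Figure 1 more concrete, the following sketch transposes the patch into Python pseudocode. OpenMusic is a visual language built on Common Lisp, and none of these names belong to its actual API; the sketch only mirrors the data flow of the figure: a ratio list standing for the rhythm-tree leaves, and pitches sampled from a breakpoint function.

    # Rough textual analogue of the Figure 1 patch (hypothetical names;
    # OpenMusic itself is a visual language built on Common Lisp).

    def bpf_sample(points, n):
        """Sample a breakpoint function, given as (x, y) pairs, at n
        equally spaced x positions, with linear interpolation."""
        xs = [x for x, _ in points]
        x_min, x_max = xs[0], xs[-1]
        samples = []
        for i in range(n):
            x = x_min + (x_max - x_min) * i / (n - 1)
            for (x0, y0), (x1, y1) in zip(points, points[1:]):
                if x0 <= x <= x1:
                    t = (x - x0) / (x1 - x0) if x1 > x0 else 0.0
                    samples.append(y0 + t * (y1 - y0))
                    break
        return samples

    contour = [(0, 60), (50, 72), (100, 64)]   # hand-drawn BPF (x, MIDI pitch)
    ratios = [1, 1, 2, 3, 1]                   # leaves of a rhythm tree

    pitches = [round(p) for p in bpf_sample(contour, len(ratios))]
    voice = list(zip(ratios, pitches))         # crude stand-in for a voice factory
    print(voice)   # [(1, 60), (1, 66), (2, 72), (3, 68), (1, 64)]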


Figure 1. Programs and editors in an OpenMusic patch. A voice object is created from a ratio list converted into a rhythm tree and from pitch values sampled from a breakpoint function (BPF).

Musical Data and Temporal Processes

The maquette is another type of visual program, extending the notion of patch with temporal properties. A maquette editor has a horizontal ruler that represents time and a vertical ruler whose function may be set freely (or just used as a topological landmark). Figure 2 shows an example of a maquette editor.

The maquette contains rectangular boxes, each representing a program (i.e., a sub-patch) intended to produce a particular musical result. The horizontal axis gives a temporal meaning to the graphical properties of these boxes: The position and size of a box on this axis determine the beginning and ending of the musical objects created by the corresponding programs in relation to the global time line. From this point of view, the maquette editor can be considered as a sequencing tool gathering all these objects in a temporally organized structure. Moreover, the programs in the maquette boxes can access and modify the graphical and temporal properties of the box to which they are attached (e.g., their position and extents regarding the horizontal or vertical axes), so that the overall context organization and the internal object calculus may be affected by one another.

The boxes can also be connected to each other by functional relations, which turns the maquette itself into a program. Examples of such relations can involve the construction of the musical objects. For example, in Figure 2, the chords in boxes labeled C and D may be computed starting from some data coming respectively from boxes A and B, or the sound in box E synthesized from data coming from box B. They can also concern the temporal organization of the containing boxes (e.g., the offset of box F is computed to make it meet with the beginning of box D). Musical objects are thus related in two possible ways: causal relations expressed in the program, and temporal relations resulting from the graphical layout.
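
The double status of maquette boxes just described (data-producing programs that can also read and write their own temporal properties) can be sketched as follows. This is a hypothetical reduction, not OpenMusic's implementation; it only illustrates the two kinds of relations: a functional connection from box A into box D, and a program in box F that repositions F relative to D.

    # Hypothetical reduction of maquette boxes: a program coupled with
    # a position and extent on the time axis.

    class TemporalBox:
        def __init__(self, name, offset, duration, program=None, inputs=()):
            self.name = name
            self.offset = offset      # onset on the horizontal time ruler (ms)
            self.duration = duration  # graphical extent of the box (ms)
            self.program = program    # the embedded sub-patch
            self.inputs = inputs      # functional connections to other boxes
            self.value = None

        def evaluate(self):
            # "computing time": follow the functional composition first
            args = [box.evaluate() for box in self.inputs]
            if self.program is not None:
                self.value = self.program(self, *args)
            return self.value

    a = TemporalBox("A", 0, 1000, program=lambda box: [60, 64, 67])
    d = TemporalBox("D", 2000, 500,
                    program=lambda box, notes: [n + 12 for n in notes],
                    inputs=(a,))
    # box F's program adjusts F's own offset so that F ends where D begins
    f = TemporalBox("F", 0, 800)
    f.program = lambda box: setattr(box, "offset", d.offset - box.duration)

    for box in (a, d, f):
        box.evaluate()
    print(d.value, f.offset)   # [72, 76, 79] 1200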


Figure 2. A maquette editor. The boxes are embedded programs (patches) that compute musical objects, integrated in a common time reference on the horizontal axis. (The vertical axis consists of arbitrary, dimensionless units for spatial convenience.) Functional connections are visible between the inputs/outputs of some of these programs.

Hence, there exists a computing time, independent from the time ruler but following the functional composition of the maquette, and an execution time, depending on the resulting position and size of the components of the maquette. These two types of organization can actually be related, because the programs in these boxes both produce musical data and access and/or modify their own temporal and graphical properties. Finally, a maquette itself can be contained in a box, which leads to the creation of hierarchical temporal structures.

The maquette thus integrates formal musical constructions in an intermediate form between the visual program and the score. Various types of temporal organization can be implemented in this document: Linear and hierarchical time organizations naturally incorporate its structure, and other functional or logical organizations are available through the programming aspects. In addition, heterogeneous objects and their possibly diverse time paradigms can be brought together in a common time reference. (Figure 2 shows a sound file together with rhythmic (e.g., box F) and proportional-time (e.g., boxes A and B) musical objects.)

Additional details about the maquette features and implementation are given in Agon (1998) and Agon et al. (1998). Recent work has also been carried out on the maquette for the specific design of sound-synthesis processes, making it possible, through a special visual program attached to the maquette, to define the semantics of the overall structure, namely, to determine how the musical data organized in it are to be converted into a global musical result. Temporal structures can thus be created with sound-synthesis parameters, to be converted afterward into sounds via this user-defined process (Bresson and Agon 2006). Such musical forms, including the functional/temporal organization of sound-description data, constitute a global sound representation that one could also compare to a sound-synthesis score (see Figure 3).
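
The "semantics" program attached to a maquette can be imagined as a function that flattens the temporally organized box data into one global list of synthesis events. A minimal sketch, under the invented assumption that each box simply carries (delta, frequency, amplitude) triples local to its own onset:

    # Minimal sketch of a user-defined maquette semantics (hypothetical
    # data layout): flatten box-local events into one global event list.

    def synthesize(maquette):
        """maquette: list of (offset_ms, events), each event being a
        (delta_ms, freq_hz, amplitude) triple local to its box."""
        out = []
        for offset, events in maquette:
            out.extend((offset + d, f, a) for (d, f, a) in events)
        return sorted(out)   # time-ordered events handed to the synthesizer

    maquette = [(0,   [(0, 440.0, 0.5), (250, 660.0, 0.3)]),
                (500, [(0, 550.0, 0.4)])]
    print(synthesize(maquette))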


Figure 3. The maquette as an alternative score representation for sound synthesis: reconstitution of an excerpt from Traiettoria...deviata (1982–1984) by M. Stroppa. Left: top-level structure including two sub-maquettes, each producing a sound component; middle: one of the sub-maquettes of the intermediate level, organizing synthesis parameters; right: contents of a box from the intermediate-level sub-maquette, a program generating sound-synthesis parameters.

Music Notation

In addition to the function of a potential score that can be given to a compositional process represented by a visual program or a maquette, we now offer some additional comments concerning our particular interest in traditional music notation and in scores in their traditional form. A first point is that musicians and composers are generally still willing to adopt this conventional representation to notate works using newly created or extended systems. It constitutes a common landmark and means of communication for the increasing number of musical systems. Hence, some kind of relation with traditional score notation is generally maintained, even by composers working with sound synthesis or electro-acoustic means. For instance, the events or other types of data controlling sound-synthesis processes are frequently computed starting from traditional score-representation data, especially for the specification of onsets and pitch-related information. (Figure 4 shows an example in OpenMusic.)

It might also be worth mentioning that, even if the OpenMusic score editor is far from equaling existing dedicated score-editing software, it includes some notable features, such as rhythmic structures (Agon, Haddad, and Assayag 2002) that provide unique possibilities for the computation and representation of rhythmic hierarchical constructs, as well as an accurate rendering of polyphonic scores, including polymetric information and individual tempo/meter variations for each voice. The microtonal notation available in the editor also facilitates working with microtones (Bancquart, Andreatta, and Agon 2008).
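
Figure 4 suggests one simple form such a relation can take: deriving time-stretching parameters for sound processing from a notated rhythm. The following is a guess at that kind of computation, not a reproduction of the patch shown in the figure, and its helper names are invented:

    # Invented sketch of the Figure 4 idea: stretch factors for
    # fixed-length sound segments, derived from a notated rhythm.

    def beat_durations(ratios, tempo=60.0):
        """Convert rhythm ratios (1 = one beat) into durations in seconds."""
        beat = 60.0 / tempo
        return [r * beat for r in ratios]

    def stretch_factors(durations, segment_length):
        """Factor mapping each source segment onto its notated duration."""
        return [d / segment_length for d in durations]

    rhythm = [1, 0.5, 0.5, 2]          # quarter, two eighths, half note
    durs = beat_durations(rhythm, tempo=90)
    print(stretch_factors(durs, segment_length=0.5))
    # [1.333..., 0.666..., 0.666..., 2.666...]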


Figure 4. Maintaining relations among sound-synthesis processes and traditional score representation in OpenMusic: time-stretching parameters computed from a rhythmic structure.

However, the combination of this notational aspect with advanced temporal structures remains an unsolved problem. When using a maquette, for example, the boxes must start and end precisely at specific points: One of the main potentials of this interface is to integrate such temporal information tightly and precisely in the musical processes. As a consequence, it is clearly observable in Figure 2 that the notation inside the different boxes does not accurately match the time axis. (For example, the beginning of a voice requires some graphical space for the clef, so that the first note graphically starts after the actual beginning of the corresponding box.) Therefore, the maquette cannot stand as a real support for music notation and still looks more like a temporal programming interface integrating out-of-time notated musical objects.

Sheet: Formal Structure and General Description

The sheet is a new object created in OpenMusic that can be seen both as a document and as a musical container. Its editor resembles a traditional score but follows the CAC principles discussed herein regarding integration and common representation of musical materials and programs. The objective is to gather in one document the characteristics of a score as a readable support (which implies providing a coherent, symbolic temporal representation) and the (visual) programming features enabled by the computer-aided composition environment.

A sheet is composed of a variable number of tracks, each containing one or more sequenced objects. These different tracks' contents can be created in programs (i.e., in OpenMusic patches) and gathered and inserted into a sheet using drag-and-drop in the editor, or directly by programming in the same patches. Figure 5 shows a sheet constructed in a patch starting from various musical objects.

The basic functionalities of the sheet editor are thus the creation and removal of tracks and the addition and removal of musical objects to or from these tracks. The musical objects can then be moved, copied, and so on, within tracks or from one track to another. The tracks' display can be edited (using operations such as resize, reposition, respace, show/hide background, etc.) to reach the desired graphical setup for the score.


Figure 5. The sheet factory in a patch and its associated editor.

In principle, any type of object coming from the OpenMusic framework can be inserted in a sheet, provided it has a duration: chords and sequences, voices, MIDI files, and even curves and envelopes, sound files, etc. Each one of the inner objects is individually accessible through its own editor (e.g., score editor, BPF editor, or sound-file editor).

Time Systems and Graphical Representation

One of the main features of the sheet editor is the consistent common graphical representation of the different objects it contains.

Nonlinearity in the Time Representation

Music notation is not linear with respect to the representation of time: There is no linear relation between time (durations) and the graphical space occupied by the symbols in the score. A typical example of this nonlinearity is the graphical representation of score elements that do not take any time at all (e.g., clefs, measure bars, metric information, accidentals). As their size cannot be adapted to the timeline (their dimension would be zero), the time axis is distorted to let them fit in the representation.

In Figure 6, a temporal grid with a 500-msec step is displayed on the score, which underlines the constant variation of the space occupied by this duration along the voice representation. (The display of a regular temporal grid available in the editor, as well as the optional display of the absolute starting times of the internal objects, will help to assess this space/time distortion in the different example figures given in this article.)

This nonlinearity can be problematic when various simultaneous objects are represented together. In such a joint representation, the simultaneous events must be at the exact same position on the horizontal time axis. In addition to the display constraints of individual objects, the representation of each one must therefore take into account the nonlinearity brought by the others.


Figure 6. A voice in a sheet editor with a regular temporal grid display.

Figure 7. Two voices aligned in a sheet editor: mutual influence of the different objects' representations.

In Figure 7, two identical voices are represented in canon, with the second one shifted in time (by one second in this case). Therefore, the first sixteenth note of the second voice starts with the fifth sixteenth note of the first voice. As shown in the figure, the two voices are identical in content, but the graphical representation has been modified to respect the temporal alignment of the musical events. For example, the second note of the second voice uses almost twice the space of the second note of the first voice: Some space was required there for displaying the first bar line of the first voice.

Temporal/Visual Alignment

The consistency of the time representation in the sheet editor is ensured by an algorithm that calculates a global function relating time to space, given the different objects contained in the score. All possible landmarks in these objects are considered and used to compute ratios between time (duration between two successive landmarks) and space (dimension of the corresponding graphical components). A method is defined for each class of objects to determine these temporal landmarks whenever possible. With traditional score objects, for instance, they correspond to the beginnings and endings of notes, rests, and other symbols. Determining landmarks in linear objects (such as sound files, curves, or other continuous functions) also adds the corresponding intervals to the time/space function. As will be shown subsequently, this may therefore make their graphical representation better correspond to the global score time flow.

The resulting ratios are gathered in a global discrete function x = f(t) in which every landmark of the score finds a position depending on the possible display constraints of the musical object of which it is a part (as in Figure 6), and on those of the other objects in the score (as in Figure 7). By applying this function when building the graphical representation (i.e., each time a modification occurs in the score contents), the relations between time and space are therefore modified for every time interval and for each individual object in the score, including those that could commonly be represented in a linear way.

This approach deliberately privileges the rhythmic notation and rendering, which will generally be the main source of time/space distortions. This somehow goes against the usual way to proceed in such mixed cases, where rhythmic objects submit to linear time-line proportions. The algorithm, in that case, could simply have consisted of finding the minimal space required for the minimum time interval contained in the score and applying the corresponding ratio to the whole representation, hence possibly allocating large and useless spaces to long notes or rests when they are used together with short time divisions (e.g., sixteenth notes or shorter). With the sheet-alignment algorithm, in contrast, all objects are locally stretched or compressed following the distortions of the nonlinear notation symbols in the score and the intervals determined by their own temporal landmarks. Inside each single interval, if no specific constraint exists, the representation remains linear, but each interval has a specific ratio depending on the overall context.
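
The article describes this algorithm in prose only; the sketch below is one possible reading of it, under simplifying assumptions of our own. Every object contributes its landmarks; each inter-landmark interval receives at least its linear width, enlarged where an object's incompressible symbols (clefs, accidentals, bar lines) demand more room; the accumulated widths yield the discrete function x = f(t). The data layout and the proportional distribution of minimum widths are our simplifications, not OpenMusic's code.

    # One possible reading of the temporal/visual alignment (simplified;
    # not OpenMusic's implementation). Objects report landmarks in ms
    # and, optionally, minimum pixel widths over spans of their timeline.

    from bisect import bisect_right

    def time_to_space(objects, px_per_ms=0.05):
        # 1. collect every landmark of every object
        marks = sorted({t for obj in objects for t in obj["landmarks"]})
        # 2. width of each global interval: never less than its linear
        #    width, enlarged by a proportional share of any constraint
        widths = []
        for t0, t1 in zip(marks, marks[1:]):
            need = (t1 - t0) * px_per_ms
            for obj in objects:
                for (c0, c1, px) in obj.get("min_width", ()):
                    if c0 <= t0 and t1 <= c1:
                        need = max(need, px * (t1 - t0) / (c1 - c0))
            widths.append(need)
        # 3. accumulate into the discrete function x = f(t),
        #    linear inside each interval
        xs = [0.0]
        for w in widths:
            xs.append(xs[-1] + w)
        def f(t):
            i = min(max(bisect_right(marks, t) - 1, 0), len(marks) - 2)
            t0, t1 = marks[i], marks[i + 1]
            return xs[i] + (t - t0) / (t1 - t0) * (xs[i + 1] - xs[i])
        return f

    voice = {"landmarks": [0, 1000, 2000],
             "min_width": [(0, 1000, 120)]}   # clef + first note: 120 px
    curve = {"landmarks": [0, 500, 2000]}
    f = time_to_space([voice, curve])
    print([round(f(t)) for t in (0, 500, 1000, 1500, 2000)])
    # [0, 60, 120, 145, 170]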


Figure 8. Mixing temporal systems in the sheet editor: two traditional voices and a MIDI file.

Figure 9. Mixing rhythmic and proportional score notations: each note in the upper (rhythmic notation) track lasts 1 sec. The second beat of 1 sec is stretched in the representation to make the chord-seq notes of track 2 (in proportional notation) fit between the second and third beats of track 1.

Time Systems

The different musical objects that may exist in the sheet editor can correspond to various time systems. Among the objects existing in the OpenMusic environment, three main types are identified: pulsed time (divisions of regular pulses, as in traditional music notation), proportional time (e.g., sequences of musical events specified in milliseconds), and (pseudo-)continuous time (e.g., signals and curves defined by mathematical functions). Pulsed-time objects were discussed previously. They mostly correspond to the voice objects represented using a traditional music-notation system and are the main source of space/time distortions.

Proportional-Time Objects

Figure 8 shows another example of a sheet editor where the tracks now correspond to different time systems: The first two are voices in traditional music notation, as in the previous examples, and the third is a MIDI file (actually a MIDI transcription of the voices) where events and durations are specified in milliseconds (proportional time). The beginnings and endings of the MIDI events in this file are considered as landmarks: Following our temporal-alignment algorithm, the onsets and durations of the MIDI events are therefore graphically adapted according to the score-notation constraints, in order to coincide with the notes and symbols of this score. Two successive MIDI notes of the same duration will thus no longer necessarily have the same graphical size.

In OpenMusic, the chord-seq object is another proportional-time object, but one represented with music notation: Although it resembles a score representation, each chord or note is specified by an onset given in milliseconds, with no other consideration for meter or pulse units. However, unlike the notes in MIDI files (Figure 8), the notes in chord-seq objects have non-linear properties, because they are displayed on a score system in the traditional way (with note heads, stems, possible accidentals, etc.). The sheet editor, in this case, will thus also allow for the representation of chord-seq objects together with other types of objects, and in particular with rhythmic notation (implemented in the voice object), as shown in Figure 9. Even though chord-seq is a proportional-time object, it is responsible for the main time/space distortion in this example.
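
The per-class landmark methods that feed the alignment function can be imagined as follows (hypothetical classes and slot names; the real objects are classes inside the OpenMusic environment). Pulsed objects derive landmarks from their rhythm and tempo, proportional objects from their millisecond onsets, and continuous objects from their boundaries and markers, as discussed next.

    # Sketch of per-class landmark methods (hypothetical classes):
    # pulsed, proportional, and continuous objects each decide which of
    # their time points take part in the global time/space function.

    class Voice:                       # pulsed time: rhythm ratios + tempo
        def __init__(self, ratios, tempo=60):
            self.ratios, self.tempo = ratios, tempo
        def landmarks(self):
            t, beat = 0.0, 60000.0 / self.tempo
            marks = [0.0]
            for r in self.ratios:      # note/rest beginnings and endings
                t += r * beat
                marks.append(t)
            return marks

    class ChordSeq:                    # proportional time: onsets in ms
        def __init__(self, onsets, duration):
            self.onsets, self.duration = onsets, duration
        def landmarks(self):
            return sorted(set(self.onsets) | {0.0, self.duration})

    class SoundFile:                   # continuous: start, end, markers
        def __init__(self, duration, markers=()):
            self.duration, self.markers = duration, markers
        def landmarks(self):
            return sorted({0.0, self.duration, *self.markers})

    objs = [Voice([1, 1, 2]), ChordSeq([0.0, 250.0, 1500.0], 3000.0),
            SoundFile(4000.0, markers=(1000.0, 2000.0, 3000.0))]
    print(sorted({t for o in objs for t in o.landmarks()}))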


Figure 10. Mixing time systems in the sheet editor: a traditional voice and a sound file. The markers in the sound allow the user to align its segments with the score representation. Note the linear representation of the same sound file in the audio-file editor.

Continuous-Time Objects

In addition to the previous time systems, other objects exist that we call continuous, because they are not composed of well-marked discrete events (e.g., notes). This category mainly includes curves and sound signals. With these objects, only two important time points are known: the beginning and the end. These points, however, allow one to position the objects and to deduce their size in the graphical representation.

Additional points of interest can be defined or deduced inside these objects to refine their representation: sub-segments, particular events not explicitly represented in the objects, etc. Making these points participate in the time/space function and in the graphical-display algorithm will correctly position them in the global time-line representation by independently applying possible distortions to the subsequent intervals.

In curves and functions, the inflection points (or control points) can be used as landmarks. In the case of sound files, we use markers that can be set manually or algorithmically (i.e., in visual programs) along the time axis.

Figure 10 shows an audio file in which some markers were regularly placed every second, along with a voice in traditional music notation. The beginning of the audio track corresponds to the actual beginning of the voice (the time of the first note of this voice), and the segments highlighted in the audio file with the markers (in this case, of one second each) are stretched or compressed to respect the time spacing of the score (here with a tempo of 60 beats per minute).

Programming in the Sheet Editor

The second main feature of the sheet editor results from its embedded programming possibilities, which make it possible to build the objects it is made of algorithmically from within the score and/or to set functional relations among them. When switched to the patch mode, the editor resembles the usual patches of OpenMusic (i.e., a visual program editor), in which every object is displayed with an input and an output and can be connected to functional call boxes and/or to other objects of the score. The sheet editor can therefore be a framework for the development of programs in which objects of the score are created and processed or linked by local functional relations. This provides the user with the possibility of bringing to light another part of the musical semantics of the score, on both functional and generative levels. In Figure 11, for example, two voices are connected through a program that makes one the retrograde of the other.
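
Reduced to a toy, the relation of Figure 11 amounts to a dependency that is re-run whenever the score is evaluated. The list-of-pairs representation of a voice below is our own shorthand, not OpenMusic's data structure:

    # The retrograde relation of Figure 11 as a toy dependency
    # (hypothetical representation: a voice as (ratio, pitch) pairs).

    def retrograde(voice):
        return list(reversed(voice))

    score = {"track1": [(1, 60), (1, 64), (2, 67)]}
    # the functional connection, re-run at each evaluation of the editor
    score["track2"] = retrograde(score["track1"])
    print(score["track2"])            # [(2, 67), (1, 64), (1, 60)]

    score["track1"].append((1, 72))   # manual edit in the score editor...
    score["track2"] = retrograde(score["track1"])   # ...then re-evaluate
    print(score["track2"][0])         # (1, 72)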


Figure 11. Functional relations among components of the sheet by programming in the editor (patch mode).

Figure 12. A sheet editor containing various tracks and sequenced musical objects.

The ability to evaluate objects or the whole score allows one to build or update the contents of the score according to these types of possible dependencies. Whatever modification is made to the first voice of Figure 11, evaluating the editor will make the second one its retrograde.

Considering again an example such as the one in Figure 10, one might also wish that the electroacoustic part of the score (the sound file) be related to the upper (assumed instrumental) voice. Figures 12 and 13 show such a sheet in normal and patch modes, respectively. The sound file in this example is computed by a program embedded within the sheet editor, using data coming from the notes of the first track and from the envelope of the third one.

The representation can thus be enriched with different objects taking part in the musical and/or sound-synthesis processes, or simply allowing a better reading and understanding of the score. By switching back to the score mode, the programming components are hidden for a clearer score display (Figure 12).

Figure 14 is another example, where the process represented in Figure 4 is reconstituted in a sheet, hence emphasizing again, from within the score, the functional and temporal relations between the voice and the synthesized sound.
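
The Csound abstraction of Figure 13 is not reproduced here, but the shape of such a program can be guessed at: read the notes of one track and an envelope from another, and emit synthesis score statements. In this invented sketch, instrument 1 and its p-fields (amplitude, frequency) are assumptions, not the actual design of the patch:

    # Loose sketch of the Figure 13 idea: derive Csound score statements
    # from the notes of one track and a global envelope from another
    # (the instrument number and p-fields are hypothetical).

    def env_at(env, t):
        """Linear interpolation in a breakpoint envelope [(t, value), ...]."""
        for (t0, v0), (t1, v1) in zip(env, env[1:]):
            if t0 <= t <= t1:
                return v0 + (v1 - v0) * (t - t0) / (t1 - t0)
        return env[-1][1]

    def csound_score(notes, env):
        """notes: (onset_s, dur_s, midi_pitch) triples; yields 'i' statements."""
        for onset, dur, pitch in notes:
            freq = 440.0 * 2 ** ((pitch - 69) / 12)
            amp = env_at(env, onset)
            yield f"i1 {onset:.3f} {dur:.3f} {amp:.2f} {freq:.2f}"

    notes = [(0.0, 0.5, 60), (0.5, 0.5, 64), (1.0, 1.0, 67)]
    envelope = [(0.0, 0.2), (1.0, 0.8), (2.0, 0.1)]
    print("\n".join(csound_score(notes, envelope)))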


Figure 13. Sound-synthesis process in the sheet editor from Figure 12 (patch mode). The Csound box is an abstraction of a patch (visible at the right side of the figure) that performs sound synthesis starting from a musical sequence and two dynamic envelopes (one global, coming from the third track of the score, and the other local, applied to each individual note).

The Sheet in the Composition Environment

The sheet has been presented as a new musical container in OpenMusic, associated with its specific editor. It can be created (or inspected, etc.) as a part of a program via the corresponding factory box in the patch editor (Figure 5). Conversely, we have shown in the previous section how patches could be integrated in the sheet editor.

The integration of maquettes is also an interesting issue: Considering their dual aspect of programs and musical objects, they can be embedded in the sheet editor and act either as a process producing a musical output, or as a more or less complex musical form, made of temporal and functional structures, inserted in one of the sheet tracks (see Figure 15). In this case, the maquette corresponds to what we have called a proportional-time object (with events positioned in a time axis, specified in milliseconds). The boxes it contains (whose onsets and endings are considered as temporal landmarks in the alignment algorithm) are therefore stretched and compressed in the representation to keep alignment with the music notation.

Like patches and maquettes, the sheet also exists in OpenMusic as a standalone persistent document. Therefore, patches, maquettes, and sheets are now complementary documents: The patch emphasizes the programming aspects, the maquette emphasizes the time-processing aspects, and the sheet emphasizes the score and notation aspects. These different supports should then naturally interact with one another in the compositional processes and models created in the CAC environment.


Figure 14. Dynamic computing of a time-stretched sound in the sheet editor. The resulting stretched sound segments are displayed in accordance with the initial rhythmic notation.

Figure 15. A maquette in the sheet editor.

A Concluding Example

Before concluding, we here present a current work in progress making use of the sheet editor, in which we have tried to transcribe a part of the score of Stockhausen's Mikrophonie I (1965), for tam-tam and six players.

This score was written using personalized instruction graphics for the performers: Two players are to excite the tam-tam, while two other players handle microphones around the tam-tam following precisely notated rhythmic and gesture instructions. In the meantime, the two remaining players handle filters with potentiometers to control the rendering of the microphones' outputs. Figure 16 shows a part of this score reconstituted in the sheet.

The instructions for the tam-tam players (rhythms, dynamics, positions, etc.) were created here using OpenMusic voice objects with empty staves and special note heads (tracks 1 and 5). The movements and distances of the microphones from the tam-tam and from the points of excitation are represented by special curves with variable thicknesses, created in OpenMusic using modified BPF objects (tracks 2 and 6). Finally, the different potentiometers are controlled according to the visible break-point or B-spline curves (on tracks 4 and 8) and to the gray figures representing the evolutions of the filters' high and low limits (tracks 3 and 7). These figures correspond to a special class of musical object (BPF-bande) specifically designed for this example. Figure 17 shows a patch implementing part of the sheet-construction process.

In the future, we are considering pursuing this experiment by implementing a sound-synthesis process (probably using the OpenMusic interface with Modalys's physical models; see Ellis, Bensoam, and Caussé 2005), which would allow us to roughly simulate and experiment with a performance of the piece, starting from the data contained in our score reproduction (following the principles of the examples in Figures 13 and 14). Aside from the compositional aspect, this example also stresses possible musicological applications; such an implementation can indeed offer a computational model of compositional and performing processes that would enable musicologists to simulate and analyze particular aspects of a piece.

Conclusion and Future Works

This article introduced a new editor in the OpenMusic environment, addressing various questions related to the general notion of the score in computer music composition.


Figure 16. A part of Stockhausen's Mikrophonie I score, reconstituted in the sheet editor.

The sheet editor allows for the creation of scores integrating instrumental parts (written in traditional music notation) together with sound signals and other data or instructions possibly related to the electronic parts of a piece. Regarding instrumental notation, it can also be used for mixing rhythmic notation (voices in OpenMusic) with proportional-time notation (like chord-seq objects) in a single score, which addresses a longstanding problem in music-editing tools. Moreover, the possible functional relations set between the different objects and the general programming features make score-representation issues, and their implications in terms of temporal representation, interact with computing issues, which we believe to be of major importance in computer-music composition systems. The potential of this new type of score can therefore be fully achieved in a CAC environment such as OpenMusic.

The sheet extends the previous types of documents, the patch and the maquette, and may be used for various purposes in complement with them, particularly when there is a special interest in music notation. The many possibilities of embedding these different documents inside each other are yet another essential characteristic of the system and should help composers to freely combine the various abstractions and structural levels of their musical works.

Future developments will probably concern the setting of more advanced time relations between the objects in the sheet editor. The temporal landmarks present or defined by the user in the musical objects indeed allow one to envisage possibilities for their mutual synchronization (Stroppa and Duthen 1990). An object could be positioned on the time axis using one of its internal temporal landmarks, possibly synchronized to a landmark of another object from another track of the sheet. In the future, persistent temporal relations between the objects could hence be implemented using, for instance, the interval logic operators from Allen (1983), or other temporal logic formalisms as described, for example, by Marsden (2000).
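
For reference, the thirteen interval relations of Allen (1983) mentioned above can be written as a small classifier over (start, end) pairs; a persistent implementation in the sheet would maintain such constraints between objects and re-enforce them when the score is edited. This is sketch code using the standard definitions, not a feature of the current editor:

    # The thirteen interval relations of Allen (1983) as a classifier
    # over (start, end) pairs.

    def allen(a, b):
        (s1, e1), (s2, e2) = a, b
        if e1 < s2:  return "before"
        if e2 < s1:  return "after"
        if e1 == s2: return "meets"
        if e2 == s1: return "met-by"
        if (s1, e1) == (s2, e2):       return "equals"
        if s1 == s2: return "starts" if e1 < e2 else "started-by"
        if e1 == e2: return "finishes" if s1 > s2 else "finished-by"
        if s2 < s1 and e1 < e2:        return "during"
        if s1 < s2 and e2 < e1:        return "contains"
        return "overlaps" if s1 < s2 else "overlapped-by"

    print(allen((0, 1000), (1000, 3000)))   # meets
    print(allen((500, 900), (0, 2000)))     # during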


Figure 17. Building a model of Mikrophonie I in OpenMusic. This patch shows the creation, by programming and using their graphical editors, of some of the components of the sheet from Figure 16. Left: a part of the curve of track 4; right: some filter settings visible on track 3. These objects are then collected in a sheet object factory.

Another important related issue to be addressed concerns the pagination of score editors, to allow for the display and printing of larger scores and musical works.

Acknowledgments

The authors would like to thank Catherine Durand, Michael Fingerhut, and Georges Bloch for their help in proofreading this manuscript.

References

Agon, C. 1998. "OpenMusic: un langage de programmation visuelle pour la composition musicale." Ph.D. thesis, Université Pierre et Marie Curie (Paris 6), France.

Agon, C., and G. Assayag. 2002. "Programmation visuelle et éditeurs musicaux pour la composition assistée par ordinateur." 14ème Conférence Francophone sur l'Interaction Homme-Machine (IHM'02). New York: Association for Computing Machinery, pp. 205–206.

Agon, C., et al. 1998. "Objects, Time and Constraints in OpenMusic." Proceedings of the 1998 International Computer Music Conference. San Francisco, California: International Computer Music Association, pp. 406–415.

Agon, C., K. Haddad, and G. Assayag. 2002. "Representation and Rendering of Rhythmic Structures." International Conference on Web Delivering of Music (WedelMusic). Piscataway, New Jersey: IEEE Computer Society Press, pp. 109–113.

Allen, J. F. 1983. "Maintaining Knowledge About Temporal Intervals." Communications of the ACM 26(11):832–843.


Assayag, G. 1998. "Computer Assisted Composition Today." First Symposium on Computer and Music. Corfu, Greece: Ionian University, pp. 9–17.

Assayag, G., et al. 1999. "Computer Assisted Composition at IRCAM: From PatchWork to OpenMusic." Computer Music Journal 23(3):59–72.

Bancquart, A., M. Andreatta, and C. Agon. 2008. "Microtonal Composition." In J. Bresson, C. Agon, and G. Assayag, eds. The OM Composer's Book, Volume 2. Paris: IRCAM/Éditions Delatour, pp. 279–302.

Boulez, P. 1987. "Timbre and Composition – Timbre and Language." Contemporary Music Review 2(1):161–171.

Bresson, J., and C. Agon. 2006. "Temporal Control over Sound Synthesis Processes." Proceedings of the Sound and Music Computing Conference. Marseille, France: Centre National de Création Musicale, pp. 67–76.

Ebbeke, K. 1990. "La vue et l'ouïe: Problématique des partitions dans la musique électro-acoustique." Contrechamps 11:70–79.

Ellis, N., J. Bensoam, and R. Caussé. 2005. "Modalys Demonstration." Proceedings of the 2005 International Computer Music Conference. San Francisco, California: International Computer Music Association, pp. 101–102.

Hamel, K. 1998. "NoteAbility, A Comprehensive Music Notation System." Proceedings of the 1998 International Computer Music Conference. San Francisco, California: International Computer Music Association, pp. 506–509.

Kuuskankare, M., and M. Laurson. 2006. "Expressive Notation Package." Computer Music Journal 30(4):67–79.

Laurson, M. 1996. "PatchWork: A Visual Programming Language and Some Musical Applications." Doctoral dissertation, Sibelius Academy, Finland.

Laurson, M., and M. Kuuskankare. 2002. "PWGL: A Novel Visual Language Based on Common Lisp, CLOS and OpenGL." Proceedings of the International Computer Music Conference. San Francisco, California: International Computer Music Association, pp. 142–145.

Marsden, A. 2000. Representing Musical Time: A Temporal-Logic Approach. Lisse, Netherlands: Swets & Zeitlinger.

Nienhuys, H.-W., and J. Nieuwenhuizen. 2003. "LilyPond: A System for Automated Music Engraving." Proceedings of the XIV Colloquium on Musical Informatics (XIV CIM 2003). Florence, Italy: Tempo Reale/AIMI, pp. 167–172.

Orfali, R., D. Harkey, and J. Edwards. 1996. The Essential Distributed Objects Survival Guide. New York: Wiley.

Smith, L. 1972. "Score: A Musician's Approach to Computer Music." Journal of the Audio Engineering Society 20(1):7–14.

Stroppa, M., and J. Duthen. 1990. "Une représentation de structures temporelles par synchronisation de pivots." Colloque Musique et Assistance Informatique. Marseille, France: Laboratoire Musique et Informatique de Marseille (MIM), pp. 305–322.
