There are certain advantages to being old. I can collect the awe of the multitude for having seen Rachmaninoff play the piano, and for having shaken hands with Béla Bartók. Still, there are drawbacks. I have made the terrible blunder, for example, of letting myself, at this advanced age, become interested in the whole complex of computer technology as it relates to music. I hear people talk about the wonderful gadgetry that will be in every home by the year 2025, and I shudder to think what difficulty I will have, at age 101, in mastering all these wonders.

A few weeks ago the California Institute of the Arts held a demonstration, at Santa Monica's Electronic Cafe, of some of the devices being worked on at the school's newly formed Center for Art, Information and Technology. That center, funded by AT&T and the Peter Norton Family Foundation, is the umbrella under which the beardless techies and bearded electronic eminences at CalArts (and, eventually, we) can command the creation, the shape and the sound of music in a manner as yet undreamed of.

At this demonstration, Morton Subotnick, charter electronic guru at CalArts since its founding in 1969, put something called the Gesture Piano through its paces. It isn't a piano at all, of course, but a software program that enables a keyboardist to access whatever music the program has stored and make it respond to the user's whim. Take a Beethoven Sonata, as Subotnick did; lay it into the machine, and a performer at the keyboard can transfigure a performance of that work according to his own vision. A child learning that Beethoven Sonata can command the way his machine performs the work; he can, in learning it, summon up repeats until a phrase becomes familiar.
I wondered to myself, just for a moment: how is this matter of recreating a Beethoven Sonata through electronic means all that different from recreating a famous painting from a sheet of paper with numbered spaces to fill in with the right color? The answer lies in the magic word "interactive." You don't just recreate the Beethoven Sonata; you recreate it along the lines of your own personal vision of the way the music works.

David Rosenboom, newly anointed Dean of Music at CalArts, came on with a program called "Hierarchical Music Specification Language," which also involved interactions whereby the whole process of artificial intelligence somehow conspires to create virtual new intelligent instruments within the computer.

And the morning ended with a spectacular dance demonstration, called by its inventor, Mark Coniglio, MIDI-Dancer. That, to an outsider, was both alluring and understandable. A dancer, both arms wired to small receptors that sent information to a computer via a wireless transmitter strapped to her back, moved through a series of steps. Her movements, picked up by the arm receptors, controlled the music, and also the lighting of the improvised stage area in which she worked. This, of course, was stunning, if only for the elementary reason that the wired dancer and the music and lighting she created were locked into perfect synchrony.

That, of course, is also something of a drawback. The greatest hangup about the role of computers in the creation of art, it seems to me, is the dehumanization process, the lack of randomness and surprise. The dynamic of a live performance is the risk factor, the real chance that human factors will intercede in a performance, that no two will be exactly alike. Subotnick talked about the shared concern over this problem, and about the development of such things as a metronome that can allow for human variations in the rhythm and meter of a piece.
A random metronome sounds self-contradictory to me; but who am I, at this advanced age, to say? I have seen the interactive future, and it is user-friendly. This is Alan Rich with Notes on Music.

This entry was posted in Daily News.