
A Cybernetic Glove to Control a Real Time Granular Sound Synthesis Process

Giovanni COSTANTINI 1,2, Massimiliano TODISCO 1, Giovanni SAGGIO 1

1 Department of Electronic Engineering, University of Rome "Tor Vergata", Rome, Italy
2 Institute of Acoustics "O. M. Corbino", Via del Fosso del Cavaliere, 100, Rome, Italy

ABSTRACT

We propose a cybernetic glove to control a sound synthesis process. The glove measures static and dynamic postures of the hand by means of piezoresistive sensors, which change their electrical resistivity when deformed. The piezoresistive coefficient is defined as the ratio of the relative change in resistivity to the relative change in the resistor's length. In addition to the piezoresistive sensors, we also use kinematic transducers, namely three mono-axial accelerometers, to capture hand motion acceleration. The musical synthesis process is realized by means of a sound synthesizer based on a granular additive synthesis algorithm.

Keywords: Wearable sensor; Virtual musical instrument; Control; Musical synthesis process.

1. INTRODUCTION

Traditional musical sound is the direct result of the interaction between a performer and a musical instrument. It rests on complex phenomena such as creativity, feeling, skill, muscular and nervous system actions, and movement of the limbs, all of which underlie musical expressivity. In essence, musical instruments transduce the movements of a performer into sound. Moreover, they typically require two or more control inputs to generate a single sound. For example, the loudness of the sound can be controlled by means of a bow, a mouthpiece, or by plucking a string, while the pitch is controlled separately, for example by fingering that changes the length of an air column or of a string.
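As a concrete illustration, the piezoresistive coefficient defined above can be computed from a pair of before/after measurements. The values in the sketch below are hypothetical, not measurements from this paper:

```python
# Sketch of the piezoresistive relation described above (illustrative values only).
# The coefficient K is the ratio of relative resistance change to relative
# length change (strain): K = (dR/R) / (dL/L).

def gauge_factor(r0, r1, l0, l1):
    """Return K = (dR/R0) / (dL/L0) for a deformed resistive sensor."""
    relative_resistance_change = (r1 - r0) / r0
    strain = (l1 - l0) / l0
    return relative_resistance_change / strain

# Hypothetical bend-sensor readings: resistance rises from 10 kOhm to 10.5 kOhm
# when the sensor is stretched from 50.0 mm to 50.25 mm.
k = gauge_factor(10_000, 10_500, 50.0, 50.25)
print(round(k, 2))  # 10.0
```

A larger coefficient means a more sensitive sensor: the same bending produces a bigger, easier-to-measure resistance change.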
The sound produced is characteristic of the musical instrument itself and depends on a multitude of time-varying physical quantities, such as the frequencies, amplitudes, and phases of its sinusoidal partials [1]. The way music is composed and performed changes dramatically [2] when the synthesis parameters of a sound generator are controlled through human-computer interfaces such as a mouse, keyboard, or touch screen, through input devices such as kinematic and electromagnetic sensors, or through gestural control interfaces [3, 4].

In this paper, we discuss a gestural sensor interface that measures signals related to hand movements by means of piezoresistive sensors and kinematic transducers. The paper is organized as follows: first we describe the glove, the gestural transducers, and the system architecture; then we describe the sound synthesizer; finally, we present our glove-based musical instrument.

2. TRANSDUCERS

A lightweight Lycra-based glove was chosen as the support for carbon-ink bend sensors. The sensors were placed on the glove over the dorsal side of all the finger joints, in order to trace finger movements, as illustrated in Figure 1. The sensors assume an arched form when the finger joints are flat and follow the finger profiles when the fist is closed, as schematized in Figures 3a and 3b. The arched form is also exploited to measure adduction/abduction movements, as in Figures 3c and 3d.

We adopted bend sensors as transducers because of their light weight and low cost, and because we devised a new way of mounting them which, unlike the usual approach in the literature [5, 6, 7], prevents the sensors from being stressed by elongation and torsion forces.
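A bend sensor's resistance can be turned into an estimated joint angle with a simple two-point calibration. The following is a minimal sketch; the calibration values and the linear model are assumptions for illustration, not the procedure used with the glove described here:

```python
# Minimal sketch: mapping a bend sensor's resistance to an estimated joint
# angle by linear interpolation between two calibration points.
# The calibration values below are hypothetical.

FLAT_RESISTANCE = 10_000   # ohms at 0 degrees (finger extended, sensor arched)
BENT_RESISTANCE = 20_000   # ohms at 90 degrees (closed fist)

def joint_angle(resistance):
    """Linearly interpolate joint flexion angle (degrees) from resistance."""
    span = BENT_RESISTANCE - FLAT_RESISTANCE
    angle = 90.0 * (resistance - FLAT_RESISTANCE) / span
    return max(0.0, min(90.0, angle))  # clamp to the calibrated range

print(joint_angle(15_000))  # 45.0
```

In practice each joint would get its own calibration pair, since sensor placement and finger geometry differ from joint to joint.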
This mounting potentially avoids the problem of unwanted forces preventing the sensor from returning to its initial state because of its imperfect elasticity, which would otherwise cause a drift in the sensor characteristics over its lifetime. Moreover, it potentially prevents the "force of the hand grip" from becoming what has been called a "source of noise in the sensor output" [8].

The resistance values recorded from the sensors were converted into voltage signals and then fed into an Arduino Mega board [9]. The Arduino Mega (Figure 2) is a microcontroller board based on the ATmega1280 processor. It has 54 digital input/output pins (of which 14 can be used as PWM outputs), 16 analog inputs, 4 UARTs (hardware serial ports), a 16 MHz crystal oscillator, a USB connection, a power jack, an ICSP header, and a reset button.

3. TEXTURE SOUND SYNTHESIZER

The synthesis process was realized by means of the sound synthesizer "Textures 2.1" [10], which is based on a granular additive synthesis algorithm. It exposes seventeen sound synthesis parameters [10], shown as knobs in Figure 4, through which we can shape the sound waveform.

The extensive use of sound textures is one of the most characteristic aspects of contemporary music. Textures are particularly interesting when the polyphony is so dense that the individual voices become almost indistinguishable; at that point, the texture becomes part of the sound timbre. The spectral control of "Textures" is based on:

- spectrum expansion-contraction, by means of a simple model derived from the Stephen McAdams formula;
- stochastic deviation of the partials' frequencies, by means of a probability distribution;
- decay of the amplitudes as a function of partial order.

Figure 5 shows the "Textures 2.1" standard VST [11] (Virtual Studio Technology, from Steinberg) audio plug-in.
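The three spectral controls above can be sketched in a toy grain generator. Since the actual "Textures 2.1" formulas are not reproduced in the paper, the stretching rule, parameter names, and grain envelope below are assumptions chosen only to make the three controls concrete:

```python
# Illustrative sketch of one grain of granular additive synthesis, following
# the three spectral controls listed above: spectrum expansion-contraction,
# stochastic frequency deviation, and order-dependent amplitude decay.
# This is NOT the "Textures 2.1" implementation.
import math
import random

def grain(f0=220.0, n_partials=8, stretch=1.0, jitter=0.0, decay=1.0,
          duration=0.05, sr=44100, rng=None):
    """Return one grain as a list of audio samples.

    stretch -- expansion-contraction: partial k sits at f0 * k**stretch
    jitter  -- stochastic deviation of partial frequencies (fraction of f)
    decay   -- amplitude roll-off with partial order: a_k = 1 / k**decay
    """
    rng = rng or random.Random(0)
    n = int(duration * sr)
    partials = []
    for k in range(1, n_partials + 1):
        f = f0 * k ** stretch                      # expansion-contraction
        f *= 1.0 + rng.uniform(-jitter, jitter)    # stochastic deviation
        partials.append((f, 1.0 / k ** decay))     # order-dependent decay
    samples = []
    for i in range(n):
        t = i / sr
        env = 0.5 * (1 - math.cos(2 * math.pi * i / n))  # Hann grain envelope
        s = sum(a * math.sin(2 * math.pi * f * t) for f, a in partials)
        samples.append(env * s)
    return samples

g = grain()
print(len(g))  # 2205 samples for a 50 ms grain at 44.1 kHz
```

A dense texture would be built by overlapping many such grains with slightly different parameters, which is what makes the individual voices blend into a single timbre.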
Figure 1: Front view of the sensors mounted onto the glove.
Figure 2: Arduino Mega board.
Figure 3: Sensors in arched configuration to measure (a)(b) flexion/extension and (c)(d) adduction/abduction movements.
Figure 4.
Figure 5: Sound synthesizer "Textures 2.1".

4. THE GLOVE-BASED MUSICAL INSTRUMENT

The glove-based musical instrument that we propose was developed in the Max/MSP [12] environment. It consists of two components: the control unit and the synthesizer unit. The control unit allows the performer to control the seventeen synthesis parameters. The signals supplied by the glove sensor unit realize the interface between the performer and the system. In particular, the signals supplied by the glove (finger movements and three hand rotations) and the choice of sensor through which each synthesis parameter is controlled all influence the way the musician approaches the composition process. The structure of the virtual musical instrument is shown in Figure 4.

5. CONCLUSIONS

We have developed a cybernetic glove for composing and performing expressive musical sound. We direct our attention to common musical aesthetics as a determining factor in musical expressivity. The sensor interface we have presented is built around a sensor unit that supplies kinematic physical parameters, in particular intrinsic and extrinsic hand movements. Our experience with this wireless glove sensor interface has shown that the mapping strategy is a key element in endowing musical sounds with expressivity.

6. REFERENCES

[1] N. H. Fletcher and T. D. Rossing, The Physics of Musical Instruments, Springer, 2nd edition, 2005.
[2] C. Roads, The Computer Music Tutorial, The MIT Press, 1996.
[3] B. Bongers, "Physical Interfaces in the Electronic Arts. Interaction Theory and Interfacing Techniques for Real-time Performance," in M. Wanderley and M. Battier, eds.,
Trends in Gestural Control of Music, Ircam - Centre Pompidou, 2000.
[4] N. Orio, "A Model for Human-Computer Interaction Based on the Recognition of Musical Gestures," Proceedings of the 1999 IEEE International Conference on Systems, Man and Cybernetics, 1999, pp. 333-338.
[5] L. Dipietro, A. M. Sabatini, and P. Dario, "A survey of glove-based systems and their applications," IEEE Transactions on Systems, Man, and Cybernetics - Part C: Applications and Reviews, vol. 38, no. 4, 2008, pp. 461-482.
[6] N. W. Williams, J. M. T. Penrose, C. M. Caddy, E. Barnes, D. R. Hose, and P. Harley, "A goniometric glove for clinical hand assessment," Journal of Hand Surgery (British and European Volume), vol. 25B, no. 2, 2000, pp. 200-207.
[7] L. K. Simone, E. Elovic, U. Kalambur, and D. Kamper, "A low cost method to measure finger flexion in individuals with reduced hand and finger range of motion," Proc. of the 26th Annual International Conference of the IEEE EMBS, San Francisco, CA, USA, Sept. 1-5, 2004, pp. 4791-4794.
[8] S. Wise, W. Gardner, E. Sabelman, E. Valainis, Y. Wong, K. Glass, J. Drace, and J. Rosen, "Evaluation of a fiber optic glove for semiautomated goniometric measurements," J. Rehabil. Res. Dev., vol. 27, no. 4, 1990, pp. 411-424.
[9] Arduino, documentation available on the web at http://arduino.cc/
[10] G. Nottoli, "A sound texture synthesizer based on algorithmic generation of micro-polyphonies," Proc. of the 3rd International Conference "Understanding and Creating Music," Caserta, December 11-15, 2003.
[11] Steinberg VST Audio Plug-Ins SDK, third-party developer support site at http://www.steinberg.net/324_1.html
[12] Cycling '74 Max/MSP, documentation available on the web at http://www.cycling74.com/products/maxmsp/