Distributed Mobile Music Applications & Social Music Making IAN GIBSON BA(Hons), MSc (York), DPhil (York) Founding Executive Committee: Sonic Arts Forum Member of the ICMA Member of the Interdisciplinary Centre for Scientific Research in Music, University of Leeds Visiting Professor, University of British Columbia Former Member of Worcester College, University of Oxford @musictechguy
Introduction • Background in a number of institutions, teaching in a number of subject areas, research in both music and computing • Singing analysis and synthesis • Sonic Art • Social Music Making • System development • Enabling music • Performance spaces • Interfaces for commercial systems • Conclusions
Singing Analysis & Synthesis • Interest in novel interfaces for music synthesis and performance • Voice controlled sound synthesis
Sonic Art • Decade Installation, Round Foundry • Call Waiting project • “Testing The Line” Performance
Sonic Arts Forum – Next Conference: July 7th, University of Leeds.
cellMusic: A real-time performance system for mobile devices
Introducing cellMusic • cellMusic is a real-time, wireless, distributed composition and performance system designed for domestic mobile devices. • It distinguishes itself from other wireless performance environments in that it targets ad hoc performances in a variety of locations, with services and performances adapting dynamically to the number of devices available. • Users are intended to perform in the same way they already use mobile phones to interact socially.
The Performer & Performance • Anyone! • Sonic artists • Appeals to and capitalizes on the lo-fi sound output, with strength in numbers providing natural diffusion. • Exploration of a variety of physical environments, e.g. parks, outdoor public performance spaces, impromptu performances at conferences etc. • Collect sonic material while performing!
Initial Systems: • CLDC-1.1 (Connected Limited Device Configuration) • at least 192 kB of total memory budget available to Java • a 16-bit or 32-bit processor • low power consumption, often operating on battery power • connectivity to some kind of network, often a wireless, intermittent connection with limited bandwidth • MIDP-2.1 (Mobile Information Device Profile) • User Interface package • Application Life Cycle package (how apps run in their environment) • Networking (e.g. HTTPS) • Public Key package (secure connections) • Persistence package (ability to read and write record stores, although no reading/writing to a file system)
Bluetooth • Bluetooth is designed to run at distances of up to 10 metres, though high-powered devices can function at up to 100 metres. A Bluetooth device can be discoverable: it may respond to an inquiry from another device with the following information: • Device name • Device class (e.g. headphones, computer) • List of services available on the device • Technical information (device features, manufacturer, Bluetooth specification implemented, etc.) • Some devices are limited in the number of simultaneous connections they can maintain (typically up to 7). In some cases, one device must pair with another in order to access its services. • Every device has a unique 48-bit address. These are generally hidden, with user-defined names appearing in response to scans instead.
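As context for the unique 48-bit address mentioned above: JSR-82 (the Java ME Bluetooth API that CLDC/MIDP devices use) exposes addresses as 12-digit hex strings via `RemoteDevice.getBluetoothAddress()`. The standalone sketch below simply illustrates how a raw 48-bit value maps onto the familiar colon-separated form; it is not taken from the cellMusic code.

```java
// Illustration only: rendering a 48-bit Bluetooth device address.
public class BtAddress {
    // Render a 48-bit address held in a long in the usual
    // colon-separated hex form, e.g. "00:1A:7D:DA:71:13".
    public static String format(long addr) {
        StringBuilder sb = new StringBuilder();
        for (int shift = 40; shift >= 0; shift -= 8) {
            if (sb.length() > 0) sb.append(':');
            sb.append(String.format("%02X", (addr >> shift) & 0xFF));
        }
        return sb.toString();
    }

    public static void main(String[] args) {
        System.out.println(format(0x001A7DDA7113L)); // 00:1A:7D:DA:71:13
    }
}
```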
Client & Server Architecture • Threads are used to allow each node to be both a client and a server. • Device names, addresses and services are used to enable devices to connect with each other. Sometimes permission has to be given to pair one device with another. • A UUID (universally unique identifier) is used to identify the cellMusic service. • A thread is used to monitor input from connections and store incoming control data in a list.
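The last two bullets can be sketched in plain Java. This is an illustration under stated assumptions, not the cellMusic source: the `BlockingQueue` stands in for a Bluetooth connection's input stream, the service UUID is a made-up placeholder, and the monitor thread simply drains incoming control messages into a list, as the slide describes.

```java
import java.util.ArrayList;
import java.util.Collections;
import java.util.List;
import java.util.UUID;
import java.util.concurrent.BlockingQueue;
import java.util.concurrent.LinkedBlockingQueue;

// Illustration only: a node's listener side. A real node would also run
// client threads that open outgoing connections to other devices.
public class ControlListener {
    // Placeholder service identifier; the actual cellMusic UUID is not public.
    static final UUID SERVICE_ID = UUID.nameUUIDFromBytes("cellMusic-demo".getBytes());

    final List<String> received = Collections.synchronizedList(new ArrayList<String>());
    final BlockingQueue<String> input = new LinkedBlockingQueue<String>();
    private final Thread monitor;

    ControlListener() {
        monitor = new Thread(() -> {
            try {
                while (true) {
                    String msg = input.take();     // block until data arrives
                    if (msg.equals("EOF")) break;  // sentinel: connection closed
                    received.add(msg);             // store control data in a list
                }
            } catch (InterruptedException ignored) { }
        });
        monitor.start();
    }

    void close() throws InterruptedException {
        input.put("EOF");
        monitor.join();
    }

    public static void main(String[] args) throws Exception {
        ControlListener node = new ControlListener();
        node.input.put("PLAY tone 440");
        node.input.put("WAIT");
        node.close();
        System.out.println(node.received); // [PLAY tone 440, WAIT]
    }
}
```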
cellMusic Events • Four playback options • Simple tone • WAV / MP3 • Sequence of tones • MIDI • Extra data included to specify the playback device/devices (Current, ID, ‘next device’, ‘all devices’ etc) • Control instructions • Execute instruction from another device • Wait for instruction • Change tone in a note sequence • Re-assign WAV/MP3 ID • Iteration and selection control
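One possible shape for an event record covering the four playback options and the target-device field listed above might look like the following. The type and field names here are assumptions for illustration, not the actual cellMusic data format.

```java
// Illustration only: a minimal cellMusic-style event.
public class CellEvent {
    enum Playback { TONE, SAMPLE, TONE_SEQUENCE, MIDI }      // SAMPLE = WAV/MP3
    enum Target { CURRENT, NEXT_DEVICE, ALL_DEVICES, BY_ID }

    final Playback playback;
    final Target target;
    final int deviceId;   // only meaningful when target == BY_ID
    final int[] payload;  // e.g. tone frequencies, or a sample/MIDI resource ID

    CellEvent(Playback playback, Target target, int deviceId, int[] payload) {
        this.playback = playback;
        this.target = target;
        this.deviceId = deviceId;
        this.payload = payload;
    }

    public static void main(String[] args) {
        // A sequence of three tones to be played on every device in the group.
        CellEvent e = new CellEvent(Playback.TONE_SEQUENCE, Target.ALL_DEVICES,
                                    -1, new int[] {440, 554, 659});
        System.out.println(e.playback + " -> " + e.target);
    }
}
```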
Demonstration of Bluetooth Discovery And Delay Offset Between Devices
cellMusic Performance Data • Performance data is used to give structure • One node will invite others to join it • Partly to overcome the problem of constantly searching for other nodes • Future explorations of sub networks • Live interaction • Manual zoning of instruments • Attempt to allocate another part • Deallocate a part • Execute an event from a piece • Execute an event live from user input
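The "allocate another part / deallocate a part" interactions above suggest a small bookkeeping component. The sketch below is a generic illustration of that idea, assuming a piece with a fixed number of parts and devices keyed by address; it is not the cellMusic implementation.

```java
import java.util.LinkedHashMap;
import java.util.Map;

// Illustration only: joining devices take the lowest free part of a piece;
// leaving devices free their part for reuse.
public class PartAllocator {
    private final int numParts;
    private final Map<String, Integer> assigned = new LinkedHashMap<String, Integer>();

    PartAllocator(int numParts) { this.numParts = numParts; }

    // Give the device the lowest-numbered free part, or -1 if none remain.
    int allocate(String deviceAddr) {
        for (int part = 0; part < numParts; part++) {
            if (!assigned.containsValue(part)) {
                assigned.put(deviceAddr, part);
                return part;
            }
        }
        return -1;
    }

    void deallocate(String deviceAddr) { assigned.remove(deviceAddr); }

    public static void main(String[] args) {
        PartAllocator alloc = new PartAllocator(2);
        System.out.println(alloc.allocate("dev-A")); // 0
        System.out.println(alloc.allocate("dev-B")); // 1
        System.out.println(alloc.allocate("dev-C")); // -1, piece is full
        alloc.deallocate("dev-A");
        System.out.println(alloc.allocate("dev-C")); // 0, freed part reused
    }
}
```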
Current Project • Porting to iOS 5.1 (Upgrading to latest OS)
Future Developments • User interface • Evolution of performance data (toward a meaningful language for composers) • Evolution of supported network topologies (and sub-networks) • More on synchronization (including master/slave clock sync) • More on live data collection (images, sound) • More on service discovery • Explore linking to larger sound systems / lighting control • Port to iPhone et al
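Since master/slave clock sync is listed above as future work, the standard round-trip estimate (Cristian's algorithm) is the usual starting point: the slave records its clock when it sends a request and when the reply arrives, and the master stamps its own clock into the reply. This is a generic sketch, not the cellMusic design.

```java
// Illustration only: estimating the slave-to-master clock offset from one
// request/reply round trip, assuming a roughly symmetric link.
public class ClockSync {
    // t0: slave clock when request sent; t1: slave clock when reply received;
    // masterTime: master clock stamped into the reply. Returns the estimated
    // amount the slave clock must be advanced to match the master.
    static long estimateOffset(long t0, long t1, long masterTime) {
        long oneWayDelay = (t1 - t0) / 2;         // half the round trip
        long masterNow = masterTime + oneWayDelay; // master clock at time t1
        return masterNow - t1;
    }

    public static void main(String[] args) {
        // Slave sent at 1000 ms, got the reply at 1040 ms; master stamped 2015 ms.
        System.out.println(estimateOffset(1000, 1040, 2015)); // 995
    }
}
```

In a piece distributed over several phones, each slave would apply this offset before scheduling events so that tones fire together; the delay-offset demonstration between devices mentioned earlier addresses the same problem.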
Enterprise & Informing Teaching With Research • Postgraduate projects (MSc, PhD) • Programming • Composition • Live diffusion of a piece • Commercial marketing of an app • Towards a language for sound design • Industry link program: development of software for control of the Apollo system (iPad/iPhone) • Successful pilot • Configuring surround-sound installations • Diffusion of surround-sound components • Student placements • Redevelopment of the undergraduate programming module
Adaptive (Assistive) Music Technology Research Group • A research group about to launch • Huddersfield, MMU, Leeds, Royal Northern College of Music etc. • Apollo, YAMSEN, Inclusive Music, SKUG (Norway), Invention Education, Sensory Software, Sound Sculpture • Adapting modular interactive audio for enabling music projects
Shared themes with UBC’s Digital Ventriloquized Actors (DIVAs) • Visiting Associate Professor (2011) • Visiting Professor (2012) • Exploring singing synthesis • Developing mobile applications for iPhone/iPad/iPod Touch • Investigating the composer/performer/conductor relationship • Establishing appropriate protocols • Investigating the ‘learning curve’ relating to performance techniques
Thank you! (WAKE UP NOW!) Dr Ian Gibson University of Huddersfield, United Kingdom