Immersive virtual musical instrument

An immersive virtual musical instrument, or immersive virtual environment for music and sound, represents sound processes and their parameters as 3D entities in a virtual reality, so that they can be perceived not only through auditory feedback but also visually in 3D, and possibly through tactile and haptic feedback as well. Interaction relies on 3D interface metaphors built from techniques such as navigation, selection and manipulation (NSM).[1] The concept builds on the trend in electronic musical instruments toward new ways of controlling sound and performing music, as explored at conferences such as NIME.
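
As a minimal, purely illustrative sketch of this idea (not taken from any of the systems described below), the following Python code maps the manipulated position of a hypothetical 3D widget to the pitch and loudness of a sine oscillator and renders the result to a WAV file; the coordinate ranges and the mapping itself are assumptions chosen only for demonstration.

```python
import wave
import numpy as np

SAMPLE_RATE = 44100

def widget_to_sound_params(y, z):
    """Map two coordinates of a hypothetical 3D widget to synthesis
    parameters: height (y) controls pitch, depth (z) controls loudness.
    The ranges are arbitrary assumptions chosen for illustration."""
    frequency = 220.0 + 660.0 * np.clip(y, 0.0, 1.0)    # 220-880 Hz
    amplitude = 0.1 + 0.7 * np.clip(1.0 - z, 0.0, 1.0)  # nearer = louder
    return frequency, amplitude

def render_gesture(path, duration=2.0):
    """Render a simulated 'grab and lift' manipulation of the widget."""
    n = int(SAMPLE_RATE * duration)
    t = np.linspace(0.0, duration, n, endpoint=False)
    y = t / duration       # widget lifted from y = 0 to y = 1
    z = np.full(n, 0.5)    # widget held at a constant depth
    frequency, amplitude = widget_to_sound_params(y, z)
    # Integrate frequency to obtain a continuous phase (smooth glissando).
    phase = 2.0 * np.pi * np.cumsum(frequency) / SAMPLE_RATE
    samples = (amplitude * np.sin(phase) * 32767).astype(np.int16)
    with wave.open(path, "wb") as f:
        f.setnchannels(1)
        f.setsampwidth(2)  # 16-bit PCM
        f.setframerate(SAMPLE_RATE)
        f.writeframes(samples.tobytes())

render_gesture("gesture.wav")
```

In an actual immersive instrument the widget coordinates would be updated continuously from a tracked 3D input device and the audio synthesized in real time, rather than simulated and written to a file as in this sketch.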

Development

Florent Berthaut created a variety of 3D reactive widgets featuring novel representations of musical events and sound; interacting with them required a dedicated 3D input device and adapted 3D interaction techniques.[2]

Jared Bott created an environment that used 3D spatial control techniques drawn from familiar musical instruments, with symbolic 2D visual representations of musical events.[3]

Richard Polfreman created a 3D virtual environment for musical composition, with visual representations of musical and sound data similar to those of 2D composition environments but placed in 3D space.[4]

Leonel Valbom created an immersive 3D virtual environment with 3D visual representations of musical events and audio spatialization, which could be interacted with using NSM interaction techniques.[5]

Teemu Mäki-Patola explored interaction metaphors based on existing musical instruments, as seen in his Virtual Xylophone, Virtual Membrane, and Virtual Air Guitar implementations.[6]

Sutoolz, from su-Studio Barcelona, used real-time 3D video-game technology to let a live performer construct and play a fully audiovisual immersive environment.[7]

Axel Mulder explored the sculpting interaction metaphor by creating a 3D virtual environment that allowed interaction with abstract deformable shapes, such as a sheet and a sphere, whose parameters were mapped to sound effects in novel ways. The work focused on demonstrating the technical feasibility of 3D virtual musical instruments. Gestural control was based on 3D object manipulation, such as a subset of prehension.[8]
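
The following sketch is hypothetical and does not reproduce Mulder's actual mappings; it only illustrates, under assumed parameter ranges, how a single shape parameter (the radial deformation of a sphere) could be routed to an audio effect (the cutoff of a simple one-pole low-pass filter).

```python
import numpy as np

SAMPLE_RATE = 44100

def deformation_to_cutoff(deformation):
    """Map the normalised radial deformation of a hypothetical sphere
    (0 = undeformed, 1 = fully squashed) to a low-pass cutoff frequency.
    The 200 Hz - 8 kHz range is an arbitrary illustrative choice."""
    d = np.clip(deformation, 0.0, 1.0)
    return 200.0 * (40.0 ** d)  # exponential sweep from 200 Hz to 8 kHz

def one_pole_lowpass(signal, cutoff_per_sample):
    """Time-varying one-pole low-pass filter; the cutoff may change per sample."""
    out = np.zeros_like(signal)
    state = 0.0
    for i, (x, fc) in enumerate(zip(signal, cutoff_per_sample)):
        alpha = 1.0 - np.exp(-2.0 * np.pi * fc / SAMPLE_RATE)
        state += alpha * (x - state)
        out[i] = state
    return out

# Example: a noisy source filtered by gradually squeezing the sphere.
duration = 1.0
n = int(SAMPLE_RATE * duration)
source = np.random.uniform(-0.5, 0.5, n)    # white-noise "raw" sound
deformation = np.linspace(0.0, 1.0, n)      # sphere squeezed over one second
shaped = one_pole_lowpass(source, deformation_to_cutoff(deformation))
```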

Early work was done by Jaron Lanier with his band Chromatophoria, and separately by Niko Bolas, who developed the Soundsculpt Toolkit, a software interface that allows the world of music to communicate with the graphical elements of virtual reality.[citation needed]

References

