MR4MR: Mixed Reality for Melody Reincarnation

There is a long history of efforts to explore musical elements in the entities and spaces around us, as in musique concrète and ambient music. In the context of computer music and digital art, interactive experiences centered on surrounding objects and physical spaces have also been designed. In recent years, with the development and popularization of devices, an increasing number of works have been designed in Extended Reality to create such musical experiences. In this paper, we describe MR4MR, a sound installation that allows users to experience melodies produced from interactions with their surrounding space in Mixed Reality (MR). Using HoloLens, an MR head-mounted display, users can bump virtual sound-emitting objects against real objects in their surroundings. By continuously creating a melody that follows the sound made by the object, and regenerating that melody so it changes randomly and gradually using music generation machine learning models, users can feel their ambient melody "reincarnating".
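The "reincarnation" loop described above can be illustrated in miniature. The sketch below is not the installation's actual pipeline (the work uses music generation machine learning models); it is a minimal stand-in in which each generation of the melody is a slightly mutated copy of the previous one, with hypothetical names (`reincarnate`, `SCALE`) chosen for illustration.

```python
import random

# C major scale as MIDI note numbers; a mutated note stays on this scale.
SCALE = [60, 62, 64, 65, 67, 69, 71, 72]

def reincarnate(melody, mutation_rate=0.25, rng=None):
    """Return a gradually changed copy of the melody: each note has a small
    chance of being replaced by an adjacent scale tone."""
    rng = rng or random.Random()
    out = []
    for note in melody:
        if rng.random() < mutation_rate:
            idx = SCALE.index(note)
            idx = min(max(idx + rng.choice([-1, 1]), 0), len(SCALE) - 1)
            out.append(SCALE[idx])
        else:
            out.append(note)
    return out

# Each "generation" drifts a little further from the seed melody,
# the way the installation's melody gradually changes over time.
seed = [60, 64, 67, 72, 67, 64]
melody = seed
for generation in range(8):
    melody = reincarnate(melody, rng=random.Random(generation))
```

In the actual work this mutation step would be a learned model's melody continuation rather than a random scale-step walk; the loop structure, in which each output seeds the next generation, is the point being shown.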

This work has been exhibited at the NTT InterCommunication Center [ICC]:

https://www.ntticc.or.jp/ja/archive/works/mr4mr-mixed-reality-for-melody-reincarnation/

Paper

MR4MR: Mixed Reality for Melody Reincarnation (arXiv:2209.07023)

https://arxiv.org/abs/2209.07023

Please cite as

Kobayashi, Atsuya, Ishino, Ryogo, Nobusue, Ryuku, Inoue, Takumi, Okazaki, Keisuke, Sawa, Shoma, & Tokui, Nao. (2022, September 17). MR4MR: Mixed Reality for Melody Reincarnation. Proceedings of the 3rd Conference on AI Music Creativity. The 3rd Conference on AI Music Creativity (AIMC 2022). https://doi.org/10.5281/zenodo.7088357

Bibtex

@inproceedings{kobayashi_mr4mr2022,
  title        = {MR4MR: Mixed Reality for Melody Reincarnation},
  author       = {
    Kobayashi, Atsuya and Ishino, Ryogo and Nobusue, Ryuku and Inoue, Takumi
    and Okazaki, Keisuke and Sawa, Shoma and Tokui, Nao
  },
  year         = 2022,
  month        = {Sep},
  booktitle    = {Proceedings of the 3rd Conference on AI Music Creativity},
  publisher    = {AIMC},
  doi          = {10.5281/zenodo.7088357},
  abstractnote = {
    There is a long history of an effort made to explore musical elements
    with the entities and spaces around us, such as musique concr{\`e}te and
    ambient music. In the context of computer music and digital art,
    interactive experiences that concentrate on the surrounding objects and
    physical spaces have also been designed. In recent years, with the
    development and popularization of devices, an increasing number of works
    have been designed in Extended Reality to create such musical experiences.
    In this paper, we describe MR4MR, a sound installation work that allows
    users to experience melodies produced from interactions with their
    surrounding space in the context of Mixed Reality (MR). Using HoloLens, an
    MR head-mounted display, users can bump virtual objects that emit sound
    against real objects in their surroundings. Then, by continuously creating
    a melody following the sound made by the object and re-generating randomly
    and gradually changing melody using music generation machine learning
    models, users can feel their ambient melody ``reincarnating''.
  }
}