Phase 1

For this first phase, four interdisciplinary teams of students in playwriting, composition, design, scenography, theatre, dance, and film and media arts each developed a five-minute media opera in augmented and virtual reality on the theme of the river (fleuve), scored for two singers and three instruments (accordion, clarinet and cello), as part of a one-year interdisciplinary and interfaculty seminar (fall 2021 - winter 2022). The teams were supervised by six professors: Ana Sokolović (Faculty of Music), Olivier Asselin (Department of Art History and Film Studies, Faculty of Arts and Science), Marie-Josèphe Vallée (School of Design, Faculty of Environmental Design), Diane Pavlović and Andrea Romaldi (National Theatre School), and choreographer Sarah Bild (École de danse contemporaine de Montréal). All the students were involved in the creative process from the outset, allowing the different disciplines to influence one another, in contrast to the classic operatic scheme in which creation follows a well-defined order: libretto, then music, then staging and set design. Here, the visual concept developed by the directors and set designers, or the movement of a staging, could inspire the text or the music, for example, and vice versa. The four resulting prototypes were:

Quiet Night Thoughts

Une maison dans la main

Following comprehensive analysis, cause of death of Montreal's humpback remains undetermined...

Jusqu'au prochain printemps

This co-creation process was also informed by preliminary research carried out by an auxiliary team of students in composition, musicology, film, playwriting and design. After analyzing works for the stage, exploring augmented and virtual reality, and conducting a literature review on screen opera and new technologies, this auxiliary team developed research questions to guide the creation:

What defines opera as a genre?

What constitutes "presence", particularly in the context of virtual and augmented reality?

How can technology serve artistic creation?

How can opera be democratized?

From a technological point of view, we chose volumetric video captured with five depth-sensing cameras. Sound was recorded live during the capture. All the elements were then integrated in Unity: the volumetric video, set design, 3D animations, visual effects, and sound. The resulting prototypes were packaged into an application that can be viewed on a tablet or smartphone. Our partners Normal Studio and INEDI supported the post-production stages.
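As a rough illustration of the multi-camera volumetric principle described above (not the project's actual toolchain, which relied on Unity and our partners' post-production pipelines), the sketch below shows how per-camera depth frames might be back-projected into point clouds and merged into a single volumetric frame once each camera's pose is known. The camera count, intrinsics, poses and depth data here are all hypothetical placeholders.

```python
import numpy as np

def depth_to_points(depth, fx, fy, cx, cy):
    """Back-project a depth image (in meters) into camera-space 3D points."""
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    z = depth.reshape(-1)
    x = (u.reshape(-1) - cx) * z / fx
    y = (v.reshape(-1) - cy) * z / fy
    pts = np.stack([x, y, z], axis=1)
    return pts[z > 0]  # drop pixels with no depth reading

def fuse_cameras(clouds, extrinsics):
    """Merge per-camera point clouds into one cloud in a shared world space.

    clouds     -- list of (N_i, 3) camera-space point arrays
    extrinsics -- list of 4x4 camera-to-world rigid transforms
    """
    world_points = []
    for pts, T in zip(clouds, extrinsics):
        homog = np.c_[pts, np.ones(len(pts))]      # (N, 4) homogeneous coords
        world_points.append((homog @ T.T)[:, :3])  # apply the rigid transform
    return np.vstack(world_points)

# Hypothetical example: five depth cameras ringed around the performers.
rng = np.random.default_rng(0)
fake_depth = [rng.uniform(1.0, 3.0, (480, 640)) for _ in range(5)]
clouds = [depth_to_points(d, fx=525, fy=525, cx=320, cy=240) for d in fake_depth]
poses = [np.eye(4) for _ in range(5)]  # identity poses stand in for calibrated extrinsics
volumetric_frame = fuse_cameras(clouds, poses)
print(volumetric_frame.shape)  # one merged point cloud per captured frame
```

In a real capture pipeline, the identity poses would be replaced by calibrated camera extrinsics, and the merged frames would feed the asset that is ultimately placed in the Unity scene alongside the set design, animations and recorded sound.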

This first phase of exploration enabled us to develop four opera prototypes in virtual and augmented reality, to identify the technical challenges of volumetric capture and live sound recording, and to gain a better understanding of the issues involved in interdisciplinary co-creation. These findings will feed into Phase 2, in which the prototypes will be developed into 15-minute operas to be broadcast by the Opéra de Montréal in autumn 2024.

We have disseminated the results of this first phase at symposia and study days, including Watershed and the Journée d'étude at the Centre Phi, and we plan to collect them in a thematic issue of the Revue musicale OICRM.