Virtual Reality Showcase

Experience virtual reality as perceived and created by artists Gille de Bast, Allison Moore, Lisa Jackson, jadda tsui, and John Kyle Varley + Ed Renzi.

Viewing Hours:

Thursday Nov 21, 5pm – 9pm
Friday Nov 22, 12pm – 9pm
Saturday Nov 23, 12pm – 9pm
Sunday Nov 24, 12pm – 5pm

This exhibition is open to the public.

VR Works

Gille de Bast – Projet Lévitation

The Levitation Project is an experimental artistic device that aims to modify certain human cognitive and sensory capacities by immersing the audience in a hyper-realistic virtual space, where they can practice levitating. Equipped with a VR headset and a neural headset, viewers find themselves in a 3D world that hyper-realistically recreates the real space in which they are located. They then gain access to the levitation program, which offers concentration exercises for levitating their virtual body, their “avatar”, up to the aerial limits of the exhibition space. The assumption is that, through practice, they will create or modify certain neural patterns in the brain, so that eventually they can levitate in the real world.


Allison Moore – Cloud Bodies

CLOUD BODIES is a 6DoF VR experience that uses volumetric capture of two dancers, merging them with point clouds of landscapes photo-scanned by the artist. The virtual bodies of the dancers morph with topographical scans from the natural world. Geometric metadata gathered from real environments is processed into point clouds, which are then reconstructed into virtual environments. A similar process of volumetric capture records the movement of bodies in x-y-z space. The use of photo-scanning to generate 3D virtual spaces points to a hyper-archive of the future. What is the body’s relationship to these virtual and physical spaces?

Artist Concept & Direction by Allison Moore.
Dancers: Fia Grogono & Alex Jolicoeur
Created at the Post Image VR Lab
Milieux Institute for Arts, Culture, Technology
Concordia University

Lisa Jackson / NFB – Biidaaban

Toronto’s Nathan Phillips Square is flooded. Its infrastructure has merged with the local fauna; mature trees grow through cracks in the sidewalks and vines cover south-facing walls. People commute via canoe and grow vegetables on skyscraper roofs. Urban life is thriving.

Rooted in the realm of Indigenous futurism, Biidaaban: First Light is an interactive VR time-jump into a highly realistic—and radically different—Toronto of tomorrow. As users explore this altered city now reclaimed by nature, they must think about their place in history and ultimately their role in the future.

By Lisa Jackson, Mathew Borrett, Jam3 and the National Film Board of Canada.

jadda tsui – room

The curve of a wall, the smoothness of a tile, the starkness of a room. How can immersion evoke a certain feeling or emotion? room is styled as a choose-your-own-adventure meditation, investigating the ways in which we perceive space, whether through memory, suggestion, sensory input, or habits of experience.

By jadda tsui (2019).

Banff Centre Audio – Victorian K-pop

Spatialized 3D Audio and Generative Music for VR

Music composition in VR environments presents interesting challenges and exciting opportunities. How should music behave in this new medium? What tools are available to storytellers and other content creators who wish to enrich their experiences with music? How can composers write, record, and produce musical environments for interactive immersive experiences? Frameworks Productions was created to answer the broader question: how should we create musical scores specifically for immersive audio and VR?

These experiments in spatial music were developed by Frameworks Productions with the assistance of the Banff Centre. The three environments represent different approaches to spatial music in virtual worlds. The "Piano Cave" breaks a piano into 88 individual parts, surrounding the user with a colossal instrument. "Victorian K-pop" blends dynamic visuals with traditional musical instruments to fill a virtual space with sound. The spatial composition demo showcases Frameworks' proprietary generative music system, which creates musical scores that are unique to each user and appropriate to a changing narrative. The user's movement, gaze direction, and interaction with the environment change the musical harmony in real time.

By John Kyle Varley & Ed Renzi.