IAST 2018 Schedule & Program

Admin, iast.ca / December 11, 2017 / Schedule

2018 IAST – Schedule
(25 to 27 October, 2018)

2018 IAST consists of a three-day symposium at the University of Lethbridge from 25 to 27 October, 2018, and exploratory regional meetings. A general schedule is presented first, followed by detailed schedules and information for each day. These schedules pertain only to the three-day symposium. A brief explanation of each component in the general schedule can be found on the Activities webpage. Click the live links in the general schedule below to skip to the applicable information.

| Time | Thurs., 25 Oct. | Fri., 26 Oct. | Sat., 27 Oct. |
|---|---|---|---|
| 8:00AM | | | Registration & morning refreshments (University Recital Hall Lobby, W570 in the University Centre for the Arts [UCA]) |
| 8:30AM | Registration (University Recital Hall Lobby, W570) | Registration (University Recital Hall Lobby, W570) | |
| 9:00AM | Introduction & Orientation (University Recital Hall [RH]) | Virtual Reality Science Building Tour (University Theatre Lobby) | Keynote 1 (RH) |
| 10:30AM | Session 1: Mediation (RH) | Session 4: Animals Humans Machines Improvise (W480, UCA) | Student Exhibition (Level 4, UCA) & Cluster Meeting 3 for IAST participants (RH) |
| 12:30PM | Break | Break | Break |
| 1:30PM | Session 2: Interactive or Responsive (RH) | Session 5: Reception: Made for Us (W480) | Keynote 2 (RH) |
| 3:00PM | | | Student Exhibition (Level 4, UCA) |
| 3:30PM | Session 3: Emerging Artworks and Their Collaborators (RH) | Cluster Meeting 2 for IAST participants (W857 & breakout rooms W646, W812, W870, UCA) | |
| 5:00PM | | | Mixer & reception (University Hall Atrium) |
| 5:30PM | Break and IAST function (University Theatre Lobby) | Break | |
| 6:00PM | | | Event 2: Installation Presentations (University Hall Atrium) |
| 7:30PM | Cluster Meeting 1 for IAST participants (W857 & breakout rooms W646, W812, W870, UCA) | Event 1: Concert (University Recital Hall) | |

Thursday, 25 October

Session no. 1 (Thursday, 25 Oct., 10:30AM to 12:30PM)

Mediation

10:30 Patrick Pennefather CDM Using Emerging Technologies to Mediate the Performative
10:55 Scott Smallwood & Sydney Lancaster UAlberta Macromareal: Data Sonification of the Bay of Fundy
11:20 Dana Cooley ULethbridge To Hear a Shadow: Natural Energies and a Latent Magical World
11:45 Discussion


Patrick Pennefather Using Emerging Technologies to Mediate the Performative
Pennefather will report on several experiments that mediated performer and presenter using emerging technologies such as virtual reality, live streaming and live 3D immersive applications. Contexts included two contrasting dance cabarets and a conference paper that he presented simultaneously in VR and in person. Drawing on Haraway and other cyborg/scholars, he will integrate a virtual representation of himself into the presentation.


Scott Smallwood & Sydney Lancaster Macromareal: Data Sonification of the Bay of Fundy
Macromareal is a collaboration between visual artist Sydney Lancaster and composer/sound artist Scott Smallwood, developed over two artist’s residencies (2016 and 2017). The installation approaches the tidal range in the Bay of Fundy, its documentation, and related environmental data through a series of interrelated works that explore the cyclic and durational aspect of natural processes, and the relationship between those processes, human activities, and conceptions of time and memory. In this talk, Smallwood will discuss the work in general, focusing on the four-channel audio component, its composition, and how data sonification is used to articulate the Bay’s profound tidal rhythms.


Dana Cooley To Hear a Shadow: Natural Energies and a Latent Magical World
Artists, from Alvin Lucier to Rafael Lozano-Hemmer, have played a large role in exploring the natural energies and frequencies that circulate in our worlds. Emergent technologies, such as recently commercially available EEG devices, allow artists to continue this tradition of work that routes the intangible into our perceptual register, helping us tune into those faint reverberations that flutter and fluctuate around and through us, reminding us of the interconnectedness of the world and of what is at stake if we ignore this interlaced existence.

Session no. 2 (Thursday, 25 Oct., 1:30PM to 3:30PM)

Interactive or Responsive

1:30 Örjan Sandred UManitoba Computer systems for constraint-based music formalization
1:55 Mark Hannesson UAlberta The Wandelweiser Collective and electronic music
2:20 Megumi Masaki BrandonU MUSIC4EYES+EARS
2:45 Discussion


Örjan Sandred Computer systems for constraint-based music formalization
The constraint-based programming paradigm has proven very useful for music. Instead of designing a sequence of operations to execute, you describe the characteristics of the solution you want. Musical style is often explained as a set of descriptions of what is characteristic of its structure. In my presentation, I will give a brief overview of the algorithms I have developed for constraint-based music computing since 1997. I will start by outlining the purely rhythm-focused “OMRC” system, then show some characteristics of the “PWMC”, which can constrain relations between pitch, rhythm and metric structure. Finally, I will discuss the “ClusterEngine”, which (inspired by parallel computing) implements a more efficient algorithm that allows more complex musical structures to be generated. I will end by showing a close-to-real-time implementation of the ClusterEngine inside Max. It will also serve as an example of how to integrate Lisp code inside the Max environment.
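For readers unfamiliar with the paradigm, the declarative approach Sandred describes can be sketched in a few lines: rather than computing a rhythm step by step, one states constraints and searches for sequences that satisfy them. The following is a hypothetical Python toy, not the OMRC, PWMC or ClusterEngine (which are Lisp-based and run inside Max); the bar length and the no-triple-repeat rule are invented for illustration.

```python
from itertools import product

# Hypothetical toy: declare what a valid rhythm looks like, then search.
DURATIONS = [0.25, 0.5, 1.0]  # candidate note lengths in beats

def satisfies(seq):
    """Constraints: the bar must sum to 2 beats, and no duration may
    occur three times in a row."""
    if abs(sum(seq) - 2.0) > 1e-9:
        return False
    return all(not (a == b == c) for a, b, c in zip(seq, seq[1:], seq[2:]))

def solve(max_len=8):
    """Exhaustively search all duration sequences up to max_len notes."""
    return [list(seq)
            for n in range(1, max_len + 1)
            for seq in product(DURATIONS, repeat=n)
            if satisfies(seq)]

rhythms = solve()
print(len(rhythms), "valid rhythms; first:", rhythms[0])
```

A real constraint engine replaces the exhaustive search with backtracking and propagation, which is what makes a system like the ClusterEngine efficient enough for complex musical structures.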


Mark Hannesson The Wandelweiser Collective and electronic music
The Wandelweiser Collective is an international network of composers. Although mainly focused on instrumental composition, several members have created work with electronic media. Wandelweiser composers eschew a defining aesthetic, but most take John Cage, and especially his piece 4’33”, as a creative point of departure. Certain characteristics have nonetheless become associated with the group: open forms, an acceptance of non-musical sounds (especially silence), and a focus on performer interaction. This paper will look at the work of several Wandelweiser composers who have utilized electronic media: Michael Pisaro’s Transparent City; Antoine Beuger’s silent harmonies in discrete continuity; Pisaro and Beuger’s co-composed piece this place is love; Jürg Frey’s six-hour l’âme est sans retenue I; Stefan Thut’s series of “box” pieces; my own piece hoarfrost; and an electronic realization of my piece undeclared.


Megumi Masaki MUSIC4EYES+EARS
MUSIC4EYES+EARS: INTERACTIVE GESTURES presents new solo piano + electronics + video Canadian works written especially for Megumi Masaki. These interactive works are designed to explore innovative, diverse concepts, performance techniques and technologies in live piano + multimedia performance. Central to this project is how the interaction of image, movement, text and sound can create new expressive potentials as a whole.

Session no. 3 (Thursday, 25 Oct., 3:30PM to 5:30PM)

Emerging Artworks and Their Collaborators

3:30 Miles Thorogood UBC Audio Metaphor: A Framework for Artificial Soundscape Generation
3:55 Leanne Elias ULethbridge Visualizing Agriculture
4:20 Philippe Pasquier SFU Creative AI at the Metacreation Lab
4:45 Discussion


Miles Thorogood Audio Metaphor: A Framework for Artificial Soundscape Generation
The ability of artificial soundscapes to evoke a person’s memories and associations of a place and time is essential to the experience of computer games, virtual reality, film, and animation. Creating these soundscapes requires decisions about combining and processing sounds based on both the types of sounds and their affective qualities. Building on metacreation and affective computing, this talk presents a formal composition model for automatically creating such soundscapes, named Audio Metaphor.


Leanne Elias Visualizing Agriculture
This presentation will focus on a recent project undertaken by the Fine Arts Data Visualization/Physicalization Lab at the University of Lethbridge. Visualizing Agriculture was an 18-month investigation into collaboration between artists and agricultural scientists, and resulted in an exhibition and documentary film. Questions about data collection and analysis, sustainability, artistic practice versus scientific inquiry, and the role of technology are explored.


Philippe Pasquier Creative AI at the Metacreation Lab
Computational creativity and creative AI are new and expanding fields that bring together scientists and artists who design and deploy generative systems tackling creative tasks at human-competitive levels. We will introduce and motivate these new developments of artificial intelligence and machine learning towards computer-assisted creativity, illustrating the discussion with examples of systems and artworks designed and developed at the Metacreation Laboratory that compose music, produce 3D character animation, or create video mashups.



Friday, 26 October

Session no. 4 (Friday, 26 Oct., 10:30AM to 12:30PM)

Animals Humans Machines Improvise

10:30 Julie Andreyev ECU More-Than-Human Creativity
10:55 Gordon Fitzell UManitoba Performative Live Electronics
11:20 Arne Eigenfeldt SFU Collaborating with Creative Systems: Musebots
11:45 Discussion


Julie Andreyev More-Than-Human Creativity
With the ecological challenges of today, brought about by anthropocentric forces, how may art contribute ecologically empathetic forms of relating? This presentation proposes interspecies creativity as a path to expanded ways of knowing. Andreyev’s research-creation project, Animal Lover, uses methods of interspecies collaboration, combined with an ethics of care and new technologies, to draw attention to more-than-human creativity. Her recent performance projects—with birds, dogs, and forest communities—are informed by methods from acoustic ecology, indeterminacy, biology and naturalist approaches, combined with technologies such as generative systems, granular synthesis, computer vision, and field recording. The presentation discusses how the other lifeforms were invited to participate and how this negotiation shaped the outcomes of the projects. The technological-aesthetic methods afford a shift in thought, feeling and action—towards connection with other lifeforms and the ecologies we share.


Gordon Fitzell Performative Live Electronics
Composer Gordon Fitzell outlines performative approaches to live electronics in improvisation, sound installation and concert music, stressing the role that open and directed improvisation can play in bringing spontaneity and contemporary sensibilities to the art form. Through scores, photographs, videos and audio excerpts, he surveys the use of live electronics in his own compositions and in projects presented by the University of Manitoba’s eXperimental Improv Ensemble (XIE), which he directs. Fitzell asserts the importance of performer freedom and autonomy in live electronics. He touches upon technology ranging from DIY electronics to computer software and hardware, and explores the promise of aesthetic renewal through the perpetual compatibility of current-day technology. Projects investigated include solo and chamber works with live electronics, a sound installation featuring sonified weather balloons, and a smart table to which users can bring original sounds.


Arne Eigenfeldt Collaborating with Creative Systems: Musebots
Musebots are pieces of software that autonomously create music, collaboratively with other musebots. The aim of the Musebot project is to establish a playful and experimental platform for research, education and making that will stimulate interest and advance innovation in musical metacreation (MuMe). Above all, the Musebot project is a collaborative, creative experiment: we invite others in the generative music community to join us in making autonomous software agents that work together to make original music. These software agents run on either a single computer or a network of computers, creating music together in a “musebot ensemble” for a public audience.
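The agent idea can be sketched minimally: independent agents share parameters over a message bus and each contributes events to a common timeline. The sketch below is a hypothetical Python toy, not the actual Musebot protocol (real musebots coordinate over a network); the Bus and Musebot classes, the pitch sets, and the tempo parameter are all invented for illustration.

```python
import random

class Bus:
    """Shared state the agents can read and write (invented for this sketch)."""
    def __init__(self):
        self.state = {"tempo": 120}
    def broadcast(self, key, value):
        self.state[key] = value

class Musebot:
    """A toy agent: reads shared parameters and emits note events."""
    def __init__(self, name, pitches, bus):
        self.name, self.pitches, self.bus = name, pitches, bus
    def step(self, beat):
        # Each event: (beat, agent name, MIDI pitch, tempo in effect).
        return (beat, self.name, random.choice(self.pitches), self.bus.state["tempo"])

bus = Bus()
bots = [Musebot("bass", [36, 40, 43], bus), Musebot("lead", [60, 64, 67], bus)]
bus.broadcast("tempo", 90)  # any agent (or a human) can change a shared parameter

# Interleave the agents' contributions into one ensemble timeline.
timeline = [bot.step(beat) for beat in range(4) for bot in bots]
print(timeline[:2])
```

The point of the design is that no agent owns the piece: each responds to the shared state, so the music emerges from the ensemble rather than from a central score.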

Session no. 5 (Friday, 26 Oct., 1:30PM to 3:30PM)

Reception: Made for Us

1:30 Georg Boenn ULethbridge Musical Rhythm and Models of Creativity
1:55 Jim Bizzocchi SFU Building Art that Builds Art: The DadaProcessor Generative Video System
2:20 Maria Lantin ECU Liveness, Patience, and Attention: A delightful conversation
2:45 Discussion


Georg Boenn Musical Rhythm and Models of Creativity
In this presentation, a new computational model of musical rhythm and composition will be explored. I will show how the computational analysis of musical rhythms can help us to understand their different qualities, and how these rhythms relate to our human perception and experience. The analysis includes music from different cultures, as well as examples by Igor Stravinsky and Olivier Messiaen. A model of composition is presented that focuses on specific perceptual qualities of rhythm and meter. Research in music psychology and cognition has shown us important limits of musical timing and limits of human information processing. This research confirms certain practical and philosophical views on music. My talk will show how to incorporate parts of this knowledge into computer programs that can assist composers, musicians and music researchers alike. The system is based on mathematical models of rhythm and meter that make use of number theory, pattern matching and combinatorics on words.


Jim Bizzocchi Building Art that Builds Art: The DadaProcessor Generative Video System
My generative video system, the “DadaProcessor”, creates a real-time stream of edited video. The system works on a simple computational design. It includes a database of video clips that are tagged for content. The system’s rule set selects and orders the clips into an ongoing video flow. It is a form of ‘expert system’ – the rule set is based on sequencing heuristics used by human editors to construct a cinematic montage. I have produced a series of ‘ambient video’ artworks based on this system. My ambient video art is nature based, slow-paced, and non-narrative. The system’s sequencing incorporates a sense of gradual flow and progression, enhanced through the use of rich visual transitions. My next work, currently in progress, will produce a series of remixes drawn from the classic documentary “Berlin: Symphony of a Great City”. In all these generative works, my “DadaProcessor” video system works collaboratively with Arne Eigenfeldt’s “Musebot” generative music system.
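The expert-system flavour of such a rule set can be illustrated with a small sketch: clips are tagged for content, and each next clip is chosen to balance continuity (a shared tag) against variation (no immediate repeat). This is a hypothetical Python toy, not the DadaProcessor’s actual rule set; the clip database and the two heuristics are invented for illustration.

```python
import random

# Hypothetical clip database; ids and tags are invented for this sketch.
CLIPS = [
    {"id": "c1", "tags": {"water", "wide"}},
    {"id": "c2", "tags": {"water", "close"}},
    {"id": "c3", "tags": {"sky", "wide"}},
    {"id": "c4", "tags": {"forest", "close"}},
]

def next_clip(prev, clips):
    """Continuity heuristic: prefer a clip sharing a tag with the previous
    one; variation heuristic: never repeat the same clip twice in a row."""
    candidates = [c for c in clips if c is not prev]
    related = [c for c in candidates if prev and c["tags"] & prev["tags"]]
    return random.choice(related or candidates)

def sequence(clips, length=6):
    """Build an ongoing flow by applying the heuristics clip by clip."""
    out, prev = [], None
    for _ in range(length):
        prev = next_clip(prev, clips)
        out.append(prev["id"])
    return out

print(sequence(CLIPS))
```

A human editor’s montage heuristics are of course richer than two rules, but the structure — tagged database plus sequencing rules — is the same shape of system the abstract describes.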


Maria Lantin Liveness, Patience, and Attention: A delightful conversation
Delight can always be found in the act of giving our attention to detail — allowing ourselves the time to appreciate the liveness of this moment with an impulse toward exploration rather than manipulation. This feeling of “just being with” is more often rehearsed in naked reality, without headsets, screens, phones, etc. We have had time in unmediated reality (or what I like to call the un-dampened propagation field) to discover depth and interest. We expect reality to keep giving the closer we look, at any scale. This expectation is somehow lost in mixed reality, because most experiences are egocentric and present us with a world “made for us”. Can we design experiences in mixed reality that promote a sense of being and an awareness of a bigger web of relations? How do we integrate details, pauses, and the unexpected into experiences where spectacle has been promised so often and where time feels compressed? How do we re-introduce birth, ageing, and death into the virtual, and why should this be important to a full experience? To seed this conversation, I will present some of the work my lab has been involved with, as well as other work that points to some delightful exercises in immersive and performative media. It is my hope to prompt a discussion about time, space, impermanence, data, patience, liveness, and attention as they relate to our desire to be with the digital.


Concert (Friday, 26 Oct., 7:30PM to 9:30PM)


Megumi Masaki performer and Keith Hamel composer Corona for piano, interactive electronics and interactive video
Andrew Schloss performer & composer sonicpair #2
Megumi Masaki performer and Keith Hamel composer Touch for piano and interactive electronics
Tommy Davis performer and Gordon Fitzell composer r/evolution
D. Andrew Stewart performer & composer Ritual for Karlax
Kathryn Ricketts performer, Ian Campbell film-maker, and Loscil composer Land(ing)
Miles Thorogood and Aleksandra Dulic Journey of a pod


Megumi Masaki and Keith Hamel Corona for piano, interactive electronics and interactive video
Corona is a work for piano, interactive electronics, interactive video and gesture tracking. The piano creates a continuous texture that is enriched using a variety of spectral processing techniques. As well, the movement of the pianist’s hands triggers electroacoustic sounds at certain times during the composition. Finally, all the sound generated by the piano and the electronics is analysed and converted to image to create an interactive video. In essence, all the sounds heard and images seen are directly or indirectly produced by the piano. From a compositional perspective, the work is minimalist in character, with the harmonies slowly unfolding and overlapping one another to create a rich and colourful wall of sound.


Andrew Schloss sonicpair #2
sonicpair #n is a series of duos involving robotic percussion. sonicpair #1 is for viola and radiodrum-controlled robotic percussion. This new piece, sonicpair #2, is unique in that the robotic player is a “mirror” of the human percussionist. Using a new 3D capacitive sensor called the SDRdrum, the full range of the percussionist’s gestures is exactly mimicked by the MechDrum, including gestures above the surface. This mirroring creates a unique visual and auditory experience that differs from all other percussive interfaces because the robotic percussion is not “waiting” for a note-on to be detected; rather, it continuously follows all the gestures of the percussionist.


Megumi Masaki and Keith Hamel Touch for piano and interactive electronics
Touch is a composition for piano, interactive computer processing and gesture tracking of the pianist’s hands. The work explores bells of all shapes, sizes and colors to create an evocative soundscape of real and imaginary bells. As with many of my recent compositions, Touch is inspired by spectral analysis of bell sounds, and it is these re-constituted bell timbres that form much of the harmonic content of the composition. I wanted to create a colorful and imaginative sound world without an overtly dramatic sensibility. Rather, Touch is delicate and expressive, and explores a wide range of subtle colors and textures.


Tommy Davis and Gordon Fitzell r/evolution
r/evolution, for adapted tenor saxophone and interactive spatial audio, is a moto perpetuo work featuring uncommonly long phrases, trajectories and transformations. The piece requires extended circular breathing and the virtuosic execution of multiple simultaneous performance techniques, including multiphonics and other false fingerings, whisper tones, vocalising/growl, pitched key clicks, and a blended teeth-on-reed embouchure. Fluidity of co-presentation and transformation across these techniques is essential to the continually evolving character of the work. The work also involves live performative electronic audio in which contact and clip microphone signals are processed in real time to create an expanded and immersive listening environment.


D. Andrew Stewart Ritual for Karlax
RITUAL FOR KARLAX showcases parameter mappings that were influenced by “A Notation System for the Karlax Controller”, by Mays and Faber (2014) [In Proceedings of NIME 2014], and my experiences at a Karlax Workshop at CIRMMT, McGill University (Montreal, Canada), in May, 2015. RITUAL is my first fully-notated musical composition for the instrument. The notational style resembles traditional music, with the addition of custom-designed symbols for conveying both the sound result, which I refer to as “musical gestures”, and the manoeuvring of the instrument, which I refer to as “playing techniques” or “performance gestures”. Performing RITUAL FOR KARLAX entails controlling purely synthesised sounds in real time (via physical modelling). In this composition, I explore the concepts of ritual – or being ritualistic, creating moments of ecstasy (silence, solitude), and trance (noise, movement, crisis). Through these states of experience, I hope to find elucidation – or even transcendence.


Kathryn Ricketts, Ian Campbell, and Loscil Land(ing)
Land(ing) is centred on Remington, an anthropomorphised bird inhabiting an austere landscape of sound and video that illuminates an even more austere prairie landscape. These disciplines interface in both live and captured modalities, playing with time, duration, and cause and effect on both formal and content-driven levels. Land(ing) attends to the implications of honouring and preserving the astonishingly beautiful land of central Canada, asking how we have lived, and how we live now, in attending to the urgencies of preservation and sustainability. The three artists harness their experience with both improvisation and composition to construct refined and well-crafted material, shaping it into thematic modules and thereby allowing spontaneous reconfigurations with each performance. These prairie-soaked evocations become a poetic call to the wonder of landscapes, both wild and tamed, with a haunting invitation to dwell in places of wonder, longing and anticipation.


Miles Thorogood and Aleksandra Dulic Journey of a pod
Journey of a pod engages with topics surrounding the waters of Okanagan Creek systems in BC, Canada. This sensitive A/V performance speaks to the body of water as a carrier of environmental and cultural significance. The ephemeral beauty of this space is disrupted through disparities in the visual space and soundscape produced by industrial agriculture and business practices. By performing video and audio documentation of the creek through generative and expressive soundscapes and visuals, Aeon brings the enchantment of the water to the foreground. The performance presents the unfolding of an immersive inner world, drawing upon the interface of the forest and the flow of water. As an act of care for the creek, the work reveals disappearing landscapes made manifest in the contrast between nature and development.


Saturday, 27 October

Keynote no. 1 (Saturday, 27 Oct., 9:00AM to 10:30AM)

9:00 Aleksandra Dulic Centre for Culture and Technology, UBC-Okanagan Crossings: Cultivating sustainability practices through immersive media art
This presentation explores the concept of crossings among art and science, research, and public, indigenous and non-indigenous perspectives in the design of poetic media experiences for imagining and practicing new ways of living sustainably. Sustainability, as an essentially contested concept and emergent property of local and global negotiations, places emphasis on ways of thinking and the imagination. In its role as catalyst, art and media can significantly contribute to social transformation by providing places for community reflection, visioning, and imagination. Artistic inquiry is not bound to the same limitations required of research based on the scientific method, because art tackles a different challenge, which is to communicate experientially. Within imagined, virtual or poetic media spaces community participants can expand their current perspectives while simultaneously making connections across extended time and space – into possible futures.

Student Exhibition (Saturday, 27 Oct., 10:30AM to 5:00PM)

Shaun Bellamy CueTrack – A Self-Contained Performance System for Live Electronic Music
Yujie Gao Sensory Overload
Tisha Gilbert Wearable Tech: Using Technology in Costuming
Tyler Heaton Space. Time. Story.
Rodrigo Henriquez A Stellar Arsenal
Kimira Bhikum Using Virtual Reality as a visual communication tool for set design
Chantelle Ko TRAVIS: Touch Responsive Augmented Violin Interface System
Autumn Read BBOP-CE Lenses
Phillip Rockerbie Hive
Robert Van Rooyen MechDrum
Abdullah Safa Soydan Sonic Matter: A Real-time Interactive Music Performance System
Mark Segger Sonic Windows
Lindzi W Spackman Technology and its Function in Theatrical Vision.

Shaun Bellamy CueTrack – A Self-Contained Performance System for Live Electronic Music
An interactive display of the CueTrack, a self-contained performance system for live electronic music funded by a Social Sciences and Humanities Research Council (SSHRC) CGS-M grant. CueTrack is designed to streamline the rehearsal and performance of new electronic music by using a responsive control interface that is programmed specifically for each composition. The device controls pre-recorded audio files, live manipulation, and the balance between the electronics and the performer’s microphone. During the installation, audience members can approach, rehearse and perform an improvisatory composition for kalimba and live electronics using the CueTrack. Performers of all skill levels and technological backgrounds are encouraged to participate and experience an accessible performance system that is easy to learn.
Yujie Gao Sensory Overload
Sensory Overload is an intermedia art installation that uses elements from nature. The system comprises three elements: a propeller acting as a fan, water, and light. The propeller is suspended above the middle water cabinet, and an LED panel is hidden at the bottom of each water cabinet. The author uses water as the “interface” that receives and conveys information from the fan and the light, resulting in different ripples and visual effects. Sensory overload occurs when one or more of the body’s senses experience over-stimulation from the environment. This work calls on the viewer to reconsider our relationship with our state of living, and to confront the growing number of environmental elements that impact the individual.
Tisha Gilbert Wearable Tech: Using Technology in Costuming
Wearable Tech: Using Technology in Costuming is an in-depth look at the use of EL wire in the creation of a costume. This exhibit explores the processes used to create the costume and integrate the technology into it. Additionally, the exhibit demonstrates how this use of technology aids the actor in their development of character.
Tyler Heaton Space. Time. Story.
Space. Time. Story. is an interactive “still life” that explores the narrative potential of virtual reality by inviting participants to experience a moment plucked from a story. Using consumer virtual reality hardware, the audience enters a computer-generated world where they may discover the story through interaction. The installation asks how immersion, accomplished through the careful design and use of spatialized sound, visual detail, and interaction design, can enrich a story experience. It explores embodiment as a narrative device by making participation fundamental to the storytelling process. The work creates an open dialogue about the role of technology in the development of a medium and speculates on the future of immersive, interactive, embodied media.
Rodrigo Henriquez A Stellar Arsenal
This is a type of planetarium that uses the same principles as a planetarium projector, projecting stars and deconstructing the projection into its simplest parts. It casts light and stars around the room using angled circular and diamond-shaped mirrors. The projector shows a time lapse of the stars, visible both on the projector itself and all around the room in the lights; a fog machine adds atmosphere and allows the audience to see the light beams. I composed the music to convey the feeling and wonder that I feel when I look at the stars. The work is very personal to me: the stars hold high value in my spirituality and are a source of connection to the earth and to the stars.
Kimira Bhikum Using Virtual Reality as a visual communication tool for set design
Virtual reality is a fast-growing industry, used in many fields of work to support research, education, visual communication and entertainment. Kimira Bhikum, an emerging artist and master’s student at the University of British Columbia, explores the use of Virtual Reality in the theatre industry and how it can serve as a visual tool for communication between a set designer and director. Her research involves executing a workflow, implemented in a case study, that evaluates the visual communication between a director and designer. She compares traditional ways of presenting set design to the ways in which Virtual Reality can be used, and outlines the benefits of Virtual Reality for the theatre industry.
Chantelle Ko TRAVIS: Touch Responsive Augmented Violin Interface System
The Touch Responsive Augmented Violin Interface System (TRAVIS) utilizes two SoftPots on the fingerboard, as well as two Force Sensitive Resistors (FSRs) on the body of the instrument, to trigger presets. The wired prototype sends data to Max MSP/Jitter patches via an Arduino LilyPad USB. In collaboration with the engineers who developed the wireless version of the Responsive User Bodysuits (RUBS), TRAVIS now uses an Arduino MKR1000, which sends data to Max MSP wirelessly via a dedicated IP address. Future research will include adding more FSRs, as well as building a fingerboard with the necessary sensor components built in. A custom fingerboard will allow all four strings to be tracked without altering or compromising the physical dimensions of the violin. Researchers: Chantelle Ko, Bob Pritchard. Wireless RUBS Environment Development (WiRED): Carol Fu, Jin Han, Lily Shao, and Esther Mutinda.
Autumn Read BBOP-CE Lenses
Using an Arduino and custom 3D-printed bone-conduction glasses lenses, the BBOP-CE (Bone-Conduction Broca’s Occipito Parieto Circuit Equilibrium) lenses are designed for individuals with dyslexia. The lenses aim to aid the wearer in speech and thought articulation, reading comprehension, and general focus in low- and high-stimulus environments, via bone-conducted rhythms that relieve over-activity in Broca’s area (articulation and word analysis) and stimulate under-activity in both the parieto-temporal (word analysis) and occipito-temporal (word formation) sections of the brain. The lenses are intended to be unobtrusive to the user and their surroundings in daily life: the bone-conducted sound is sent straight to the cochlea through the temples, bypassing the eardrum entirely, so that the wearer can still hear the world around them.

Phillip Rockerbie Hive
Biomimesis involves imitating traits of an organism for our own purposes. My research creates discourse on the advantages and disadvantages of designing with biomimesis, while my body of work expresses the ability to imitate various traits of an organism. My installation, Hive, explores the fascination of the non-human, emphasizing societal perspectives on organic traits observed in nature. Using the honeybee as an example, I have created an interactive work that mimics biological traits in a design context. Along with my physical work, my research addresses the urgency of assessing alternative types of design thinking in modern times. My exploration is not explicitly trying to solve issues related to human-centric design; instead it discusses the ability or inability of biomimesis to serve as an alternative to human-centric design. Since organisms have long adapted to survive on Earth, it makes sense to consider borrowing principles from them to aid our own practices.
Robert Van Rooyen MechDrum
There are a number of interesting use cases for the MechDrum™, including playability for musicians with disabilities, live/playback performance, and compositions. By using parameter-driven stochastic models of common rudiments with tempo scaling, a wide variety of MIDI controllers such as keyboards, buttons, sliders, air-pressure sensors, gesture sensors, and blink detection can be mapped to compelling virtuoso-class renderings. This adaptability makes a real percussion instrument accessible to people who may otherwise not be able to participate in a solo or ensemble setting. With regard to live performance, 3D motion tracking enables play-by-wire over the network using OSC messages, offering an extension to the percussionist in terms of instrument location and augmentation. Finally, compositions can be written to deliver high-quality renderings on traditional instruments that push the boundaries of musicianship, stamina, and speed.
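The parameter-driven stochastic rudiment models with tempo scaling mentioned above might be sketched as follows; the rudiment table, the `humanize` jitter parameter, and the function names are illustrative assumptions, not the MechDrum's actual code:

```python
import random

# Illustrative rudiment patterns: each entry is (hand, duration in beats).
RUDIMENTS = {
    "single_stroke": [("R", 0.25), ("L", 0.25), ("R", 0.25), ("L", 0.25)],
    "double_stroke": [("R", 0.25), ("R", 0.25), ("L", 0.25), ("L", 0.25)],
}

def render_rudiment(name, tempo_bpm, humanize=0.0, seed=None):
    """Expand a rudiment into (onset_seconds, hand) events at a given tempo.

    `humanize` adds bounded stochastic timing jitter (as a fraction of a
    beat), standing in for a parameter-driven stochastic model.
    """
    rng = random.Random(seed)
    beat = 60.0 / tempo_bpm          # seconds per beat (tempo scaling)
    t, events = 0.0, []
    for hand, dur in RUDIMENTS[name]:
        jitter = rng.uniform(-humanize, humanize) * beat
        events.append((max(0.0, t + jitter), hand))
        t += dur * beat
    return events
```

A mapped MIDI controller value could drive `tempo_bpm` or `humanize` in real time, scaling the same rudiment from a metronomic rendering to a looser, stochastic one.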
Abdullah Safa Soydan Sonic Matter: A Real-time Interactive Music Performance System
Sonic Matter is an interactive music performance system that streamlines the creation of music, especially for electroacoustic music composers. It gives users control over sonic qualities and can transform their sound material into textures ranging from simple to complex. Users can position the sound source at any point in 3D space, record trajectories, and convincingly move the sound through the listening environment. The system consists of the computer software, a KORG nanoKONTROL2 MIDI interface, a Leap Motion sensor, and, optionally, a MIDI sustain pedal. The research aims to explore ways of creating original and engaging sounds and sonic textures with newly developed computer technologies. It also investigates how the design of an interactive performance system affects the performance and its artistic potential, both in terms of ease of use and of musical expressivity.
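The description does not say how recorded trajectories are represented internally; one simple sketch, assuming timestamped (x, y, z) points with linear interpolation on playback (the function names are hypothetical):

```python
def record_point(trajectory, t, x, y, z):
    """Append a timestamped 3D position (e.g. from a Leap Motion frame)."""
    trajectory.append((t, (x, y, z)))

def position_at(trajectory, t):
    """Linearly interpolate the recorded trajectory at time t (seconds)."""
    if t <= trajectory[0][0]:
        return trajectory[0][1]          # clamp before the first point
    if t >= trajectory[-1][0]:
        return trajectory[-1][1]         # clamp after the last point
    for (t0, p0), (t1, p1) in zip(trajectory, trajectory[1:]):
        if t0 <= t <= t1:
            a = (t - t0) / (t1 - t0)     # blend factor between the two points
            return tuple(c0 + a * (c1 - c0) for c0, c1 in zip(p0, p1))
```

During playback, `position_at` would be polled at the control rate and the result fed to the spatialization engine to move the sound through the listening environment.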
Mark Segger Sonic Windows
Sonic Windows is an interactive installation exploring the experience of space and place; it allows observers to open doors into the live sound of locations in other cities. The IAST version of this project includes the five universities involved in the conference, with a collaborator at each institution live-streaming a campus soundscape or soundwalk of their choice. Collaborators include Brian Topp (University of British Columbia), Carmen Winther (UBC Okanagan), Chris Chraca (University of Victoria), Nico Arnaez (University of Alberta), and Chris Love (University of Manitoba).

Lindzi W Spackman Technology and its Function in Theatrical Vision
This exhibit demonstrates, through the use of technology, the theatre director’s initial creative concept and how that concept is communicated to the actors and design team. By using photographic and sound examples found online, the director can lead the actors and production team down an accurate creative path.


Keynote no. 2 (Saturday, 27 Oct., 1:30PM to 3:00PM)

1:30 Sid Fels Electrical and Computer Engineering in the Faculty of Applied Science, UBC-Vancouver Design for Human Experience and Expression
Research at the Human Communications Technology (HCT) laboratory (hct.ece.ubc.ca) has been targeting design for human experience and expression. In this presentation, I’ll start with a discussion of gesture-to-speech and voice explorations, including Glove-TalkII and the Digital Ventriloquized Actors (DIVAs). I’ll connect these to other explorations of the new interfaces for musical and visual expression that we have created. I will discuss our work on modelling human anatomy (www.parametrichuman.org) and function, such as speaking, chewing, swallowing and breathing (www.magic.ubc.ca/OPAL.htm), with biomechanical models using our toolkit Artisynth (www.artisynth.org). This work is motivated by our quest to make a new vocal instrument that can be controlled by gesture. Based on a theory of designing for intimacy, I hope to stimulate discussion within this framework on the role that digital, chemical, and biological technologies play in the next wave of personalized artistic expression.

Installation presentations (Saturday, 27 Oct., 6:00PM to 7:00PM)

Jim Bizzocchi & Arne Eigenfeldt SFU Ambient Landscapes
Henry Daniel SFU nómadas
Sydney Lancaster & Scott Smallwood UAlberta Macromareal: a rising tide lifts all boats
Megan L. Smith URegina Riding through Lethbridge


Jim Bizzocchi & Arne Eigenfeldt Ambient Landscapes
Ambient Landscapes is generative video art in the genre of “ambient video”. It is produced by my generative video sequencing system, the DadaProcessor. The sound for this artwork is produced by Arne Eigenfeldt’s Musebot system. The two systems collaborate in real time to create a unified audio-visual experience. Ambient video is video art meant to play on the big screens that increasingly permeate our homes, offices, and public spaces. In the spirit of Brian Eno’s ambient music, ambient video can be ignored, or it can be noticed. If my ambient video display is noticed, it must always provide the viewer with visual pleasure. My ambient video art is nature-based. It is slow-paced and non-narrative, but also incorporates a sense of gradual flow and progression. This flow is enhanced through the constant use of rich visual transitions to move from one shot to the next.


Henry Daniel nómadas
An audio/video installation, nómadas takes its inspiration from the current large-scale movements of bodies across international spaces as a type of chaotic transnational choreography that speaks to what cultural theorist Stuart Hall calls a “contemporary travelling, voyaging and return as fate, as destiny […] as the prototype of the modern or postmodern New World nomad, continually moving between centre and periphery” (Hall in Rutherford, J. 234:1990). The work is part of the larger, long-term Contemporary Nomads research project, which explores “the deep fragmentation that exists between communities within as well as outside national borders, between nationalised and personalised bodies, and between social and political institutions and the ordinary people they were meant to serve” (Daniel, 2017).


Sydney Lancaster & Scott Smallwood Macromareal: a rising tide lifts all boats
Macromareal approaches the tidal range in the Bay of Fundy, its documentation, and related environmental data through a series of interrelated works that explore the cyclic and durational aspects of natural processes, and the relationship between those processes, human activities, and conceptions of time and memory. While the project itself has its roots firmly in the specificity of place, Macromareal combines video, sound, images, and sculpture to create an opportunity to consider our relationship to the earth, its resources, and the forces that shape it (and human history).


Megan L. Smith Riding through Lethbridge
Pierce the internet with your body. Take a ride through Google Street View from behind the handlebars of a stationary bike. This networked bike allows you to cycle across Lethbridge from the comfort of your home theatre. Designed by artist Megan Smith, the project stems from her SSHRC-funded research-creation project Riding Through Walls, in which she rode across Canada over two years from within her studio in Regina. The piece was part of the Project Anywhere Global Exhibition in 2016 and URL:IRL at the Dunlop Art Gallery in 2018.

