Project: Drexel University Bio-Inspired Design – with teammates Raja Schaar and Ann Gerondelis. Together we’ll be working to expand our K-12 and Higher-Ed Biologically Inspired Design and Citizen Science pedagogy by studying indigenous animals and plants. We’ll analyze their structural, behavioral, and functional features and adaptations to look for ways people might use them to solve problems in the conservation and sustainability space. I don’t work at Drexel (just excited to be on their team), but I am a service designer in Atlanta.
I am a design generalist for human and nonhuman great apes. I focus on design in complex, dynamic, and unfamiliar environments with emerging technology. With more than a decade working as a designer at a zoo (in exhibit, web, and graphic design) and an education in design and digital media, I hope that my work supports improving the lives of humans and animals.
My favorite animals are orangutans and red pandas, but I am really excited to see sloths, coatis, and Panamanian golden frogs!
Project: Drexel University Bio-Inspired Design – with teammates Raja Schaar and Becky Scheel. Together we’ll be working to expand our K-12 and Higher-Ed Biologically Inspired Design and Citizen Science pedagogy by studying indigenous animals and plants. We’ll analyze their structural, behavioral, and functional features and adaptations to look for ways people might use them to solve problems in the conservation and sustainability space.
Bio: I’m an architect, writer, designer, and university administrator. I recently moved from Atlanta to Philadelphia to lead the multi-faceted Design Department at Drexel University. That’s Fashion Design, Product Design, Graphic Design, Merchandising and Photography. Whew! I love the potential for designing human experiences at multiple scales in ways that activate our sensing bodies. I’m a design evangelist, often inviting STEM-strong students into my world through courses and workshops in bio-inspired design. I’m most happy exploring my environs by drawing them, and can’t wait to see what awaits in the forests of Gamboa!
As an ornithologist and conservation biologist, I track bird migrations around the globe with miniaturized GPS-GSM tracking devices that reveal novel migration routes and habitat use of endangered bird species. When mapped in 2 or 3 dimensions, the movement patterns of birds across continents are aesthetically stunning. I plan to collaborate with my sister (Leah Buechley), an expert artist, computer programmer, and tinkerer; my wife (Mara Burstein), an acrylic painter; and my nephew (Elan Solowej), a vivacious 5-year-old, to co-design and create a project that fuses science, art, programming, and nature.
Project: I’m thinking about doing a layered project with Leah Buechley (coder/designer), Evan Buechley (wildlife conservation biologist), and Elan Solowej (5-year-old mastermind). For years, Evan has collected data on where vultures migrate. This information can be presented beautifully with our diverse skills. We thought we might print it on wood (Evan?), add some electronics (Leah), paint (me), and local specimens (Elan).
Generative Dance in the Wild and EEG Sonification.
Part One:
How do movements couple to sounds in the natural environment, and can paired dance communication be improvised in both the movement and musical-composition realms? I use Sonic Pi to generatively sample sounds recorded from nature and turn them into musical beats and rhythms. These beats will be coupled to pair-dance paradigms in salsa and zouk, which are popular dances in Panama. Specifically, the project consists of the following phases.
Record sounds in the natural environment of Panama and use them to construct simple phrases in Sonic Pi, choosing the right envelopes to synthesize beat sounds which, when live-looped together, produce Latin-like rhythms.
Begin recruiting conference attendees for a performance that involves dancing in sync to the collected beats. I will train those who are not familiar with the simple steps of salsa and bachata Latin dancing so that all can practice together even without formal training.
We will construct a wearable interface for switching between different Sonic Pi sketches that generate different sounds. We will prototype a Teensy-based device that uses accelerometer data to switch between beats; the choice will depend on the leader in the dance pair (a rough sketch of this switching idea appears after the project summary below).
We will user-test a pair of dancers, one of whom (the leader) can switch between rhythms and music that inspire different dance forms and speeds. The leader can choose both her steps and the musical rhythms being generated. For example, she can choose to dance bachata rather than salsa, or to add a dip in the salsa, and can choose musical motifs appropriate to these specific actions.
If time permits, we will organize a Casino Rueda performance using pairs of dancers who can all control the music in different ways. If the technology does not permit it, we can prototype the process using calls much like in Casino Rueda, giving our DJ a cue to change the music.
The project investigates whether improvisation in dance can also be coupled to improvisation in music. Can we create a system for changing both the musicality and the movements in dance? We aim to investigate this in a natural context where Latin rhythms and natural sounds can be used as samples to create a performance of higher-order improvisation.
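To make the switching phase concrete, here is a minimal Python sketch of one way the wearable could drive the music. It assumes the Teensy simply streams an accelerometer magnitude over USB serial (one number per line) and that Sonic Pi is listening for incoming OSC cues (port 4560 in recent versions); the serial port, the threshold, and the /switch_beat cue name are placeholders for illustration, not part of the actual build.

# Minimal sketch (not the actual build): read accelerometer magnitudes
# streamed by the Teensy over USB serial and cue Sonic Pi over OSC to
# switch between live-looped rhythms.
import time
import serial                                      # pip install pyserial
from pythonosc.udp_client import SimpleUDPClient   # pip install python-osc

SERIAL_PORT = "/dev/ttyACM0"   # wherever the Teensy enumerates
THRESHOLD = 2.0                # acceleration spike treated as a "switch" gesture
NUM_BEATS = 3                  # e.g. salsa, bachata, and zouk loops in Sonic Pi

ser = serial.Serial(SERIAL_PORT, 115200, timeout=1)
sonic_pi = SimpleUDPClient("127.0.0.1", 4560)      # Sonic Pi's incoming OSC port

current_beat = 0
while True:
    line = ser.readline().decode(errors="ignore").strip()
    if not line:
        continue
    try:
        magnitude = float(line)                    # Teensy prints one value per line
    except ValueError:
        continue
    if magnitude > THRESHOLD:                      # the leader's sharp gesture
        current_beat = (current_beat + 1) % NUM_BEATS
        sonic_pi.send_message("/switch_beat", current_beat)
        time.sleep(0.5)                            # crude debounce: one gesture, one switch

On the Sonic Pi side, a live_loop could sync on the matching "/osc*/switch_beat" cue and choose which sampled-nature rhythm to play based on the value received.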
Part Two:
Can EEG be used as a source of sound, and can this sound be used to harmonize with the environment? This project generates a work of symphonic sound using human EEG attention data and EEG data gathered in the wild. I use a MindWave Mobile headset to get attention data from humans and translate that scale to pitch for the melody. I use plant electrical data, recorded with plant electrodes (thanks to Seamus), to generate the tonic portion of the work. Combining the phasic EEG music with the tonic plant and environmental music gives a voice to the way we operate in the universe. We humans make a lot of phasic noise, but the plants and environment of the world embody the tone and mood that form the substance of a work. We co-create with electrical recordings from the brain and the plant to make a symphony of Gamboa.
MindWave Mobile data is piped to the BrainWaveOSC app, which sends the data to Unity. Unity uses an AudioSource to generate the pitch as mapped from the attention data (a rough sketch of this mapping appears below). On the plant side, an Arduino is used to record and log plant electrical values. These two sources of EEG are part of the environment we exist in. Human EEG, as you can see in the video demo, is used to generate pitch, making directed musical phrases from attention, so humans can exert some control (but not total control). Plant EEG will be used to generate the subtext of the symphony, forming the chords that the human EEG will play on top of. Each has a life of its own, so that the final form of the work owes as much to the environment of Gamboa as to conscious control by any party.
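As an illustration of the attention-to-pitch mapping, here is a minimal Python sketch independent of the Unity setup. It assumes BrainWaveOSC is configured to forward the MindWave Mobile attention value (0-100) as an OSC message; the /attention address, the port, and the scale used here are placeholder assumptions, not the project's actual settings.

# Minimal sketch of the attention-to-pitch idea, outside Unity.
from pythonosc.dispatcher import Dispatcher            # pip install python-osc
from pythonosc.osc_server import BlockingOSCUDPServer

SCALE = [60, 62, 64, 67, 69, 72, 74, 76]   # example MIDI notes for the melody

def midi_to_hz(note):
    # Standard equal-temperament conversion (A4 = MIDI 69 = 440 Hz).
    return 440.0 * 2 ** ((note - 69) / 12)

def on_attention(address, attention):
    # Higher attention -> higher note; the resulting frequency is what an
    # AudioSource (or any synth) would be told to play.
    idx = min(int(float(attention) / 100 * len(SCALE)), len(SCALE) - 1)
    print(f"attention={float(attention):5.1f} -> {midi_to_hz(SCALE[idx]):.1f} Hz")

dispatcher = Dispatcher()
dispatcher.map("/attention", on_attention)
BlockingOSCUDPServer(("127.0.0.1", 7771), dispatcher).serve_forever()

The plant side would supply a slower stream of values in the same spirit, shaping the underlying chords rather than the melody.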
Ray LC’s artistic practice incorporates cutting-edge neuroscience research to build bonds between humans, and between humans and machines. He studied AI (Cal) and neuroscience (UCLA), building interactive art in Tokyo while publishing papers on PTSD. He is a Visiting Professor at the Northeastern University College of Arts, Media and Design. He has been artist-in-residence at BankArt, 1_Wall_Tokyo, Brooklyn Fashion BFDA, Process Space LMCC, NYSCI, and the Saari Residence. He has exhibited at the Kiyoshi Saito Museum, Tokyo GoldenEgg, Columbia University Macy Gallery, Java Studios, CUHK, Elektra, NYSCI, and Happieee Place ArtLab. He has received awards from the Japan Society for the Promotion of Science (JSPS), the National Science Foundation, the National Institutes of Health, the Microsoft Imagine Cup, and the Adobe Design Achievement Awards. http://www.raylc.org/
Hello world! My name is Josh Michaels; I’m a creative polymath from Portland, Oregon, with a bias toward technology. My primary interest with regard to naturalism is the power of nature immersion as a form of therapy. With an increasing global focus on mental health and mindfulness, regular immersion in nature is often overlooked as one of the simplest and least expensive ways to improve one’s mental health. However, not everyone has the time or access to receive the benefits of nature immersion, which is difficult to reflect on given our origins as humans.
Along those lines, my interest is rooted in a desire to create experiences using images and video of nature that offer the power of nature immersion to people who are unable to experience it in person. A variety of research has shown that viewing images and video of nature can provide up to 90% of the therapeutic benefit of actually going into nature. This is remarkable given how low-fi synthetic nature is compared to the real thing.
The reality of modern life is that most people don’t have and can’t make the time to really immerse themselves in nature on a regular basis. So from a modern lifestyle point-of-view, nature imagery and video offers a practical way to squeeze some of the benefits of nature into a nature-starved life.
That said, the people who may benefit most from nature imagery and video are those who are locked away from actual nature for health, legal, or other reasons. Whether it’s a hospital room, submarine, or jail cell, physical limitations that cut people off from nature deprive them of benefits which should seemingly be available and accessible to every human all the time.
Nature imagery and video offers a great way to combat limited access to nature by bringing nature inside for those who can’t go outside. The possibilities for low-cost and cost-saving ways to use nature imagery to help those who lack access to nature are endless.
I’ll be using my time at Dinacon to continue my investigations into this subject using portable EEG/EKG monitoring to compare actual and recorded nature experiences.
Hello everyone! I’m Irene and I will be at Dinacon from the 4th till the 10th. I am an Explorer of Interfaces: my interest starts with objects and, by shaping them, seeing how they can affect our perceptions and senses.
With a background in Product Design and Mixed Media Art, I use technology to enhance our encounters and interactions with the objects I’m building. I have a strong knowledge of, and interest in, digital fabrication technologies, and I try to use them as tools for exploration and development.
I’m a creative coder and a pseudo-new-media artist based in Barcelona, interested in experimentation with interactive communication. I focus on generative algorithms, creative code, and interactive technology as means of communication and as generators of experience for creating new narratives.
Project: Chris Manzione and I will be making 3D scans of the jungle for an AR piece.
Bio: I am an interdisciplinary artist focused on the exchange between the body, the built environment, and the natural world. I create participatory platforms, images, and objects that invite movement and other forms of physical engagement that bring the world into the body to engender a sense of solidity and agency in an increasingly uncertain world. I have received residencies and fellowships from Eyebeam, the Marie Walsh Sharpe Studio Program and the Lower Manhattan Cultural Council. I have shown in Canada, the Bay Area, New York City, and South America. I am an assistant professor in Visual Art & Technology at The Stevens Institute of Technology in Hoboken, NJ. You can see some of my work at nancynowacek.com. I prefer laughter.
I’m a digital interactive media (“videogame”) artist with a special interest in our relationship with nature. I’ll be at Dinacon for 2 weeks and will probably make a videogame or something!