Welcome back to another episode of syllabus sharing here at Implicit Art!
This class focuses on interactive technologies and aesthetics in contemporary art. Students learn basic software development and real-time computational methods while making projects with MIDI sound and drawing, digital audio, human interface devices (USB game controllers, Bluetooth phones, and more), and recorded and live video for mixing and computer vision (body- and motion-tracking, for example). Assignments include many small projects with varying technical goals, as well as a mid-term and final artwork that are more focused on conceptual and material aesthetic themes.
Most of my students have little or no background in coding, so, like my Electronics and Sculpture class, this syllabus works as an introduction to interactive art. That said, I offer it at the 300 level so that my digital art students will understand bits and bytes, audio and video, and how computers “think,” and my other artists will be able to bring their skills in crafting images or objects (etc.) into the mix. I also “stack” it with a 400-level class, so grad students, or advanced students who want to take it a second time, can add another dimension of creativity and criticality.
I teach this in Cycling ’74’s Max: a visual, object-oriented programming environment. What does that mean? You build a flow chart for your data (whether that’s sensor input from a phone, a video feed, sound, etc.), and that input is transcoded and turned into something else. Come again? OK. For example (an example I give on the first day, and that I remade in my PJs while typing this – shown left): plug a microphone object into a meter object to see how loud real-time sound is. Take a video grabber and plug that into a screen (“world”) object to see your live webcam. Use a multiply (“*”) object with one stream on either side, and you get live video that fades in and out based on how loudly your subject speaks into the microphone. (Kitty, from the kitchen: What are you yelling about in there? Me: Just blogging! Kitty: ???) It’s relatively easy, super cool, and completely visual. (Processing, which is more direct coding in Java, is actually taught in the music department at UWM, and I often recommend my students take that, too.)
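If you think better in text than in flow charts, that first-day patch can be sketched in ordinary code. Here is a hedged Python analogy (not Max, and every name below is invented for illustration): a microphone level is measured, then multiplied against video pixel values, exactly the “meter times frame” idea described above.

```python
# A plain-Python analogy of the Max patch described above:
# microphone level (the meter) multiplied against a video frame.
# All names here are illustrative, not real Max or library APIs.

def audio_level(samples):
    """Peak amplitude of one buffer of audio samples, 0.0-1.0."""
    return max(abs(s) for s in samples)

def fade_frame(frame, level):
    """Scale every pixel by the current audio level (the '*' object)."""
    return [[int(pixel * level) for pixel in row] for row in frame]

# One step of the flow chart: mic -> meter, camera -> frame, then multiply.
samples = [0.1, -0.5, 0.25]          # pretend microphone buffer
frame = [[200, 100], [50, 255]]      # pretend 2x2 grayscale webcam frame

level = audio_level(samples)         # 0.5: the loudest sample in the buffer
faded = fade_frame(frame, level)     # video fades with loudness
print(faded)                         # [[100, 50], [25, 127]]
```

In Max this whole sketch is three boxes and two patch cords, which is exactly why it works so well for artists who have never coded.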
I’m gladly sharing last year’s syllabus and calendar online. It is under a CC-by license (Creative Commons Attribution), meaning, you can do whatever you want with it (use, distribute, remix, etc), so long as you credit me and acknowledge the license I used, link back to this page, and do not prohibit anyone else from doing said same.
The semester arc is project-based, and I teach “objects” (in the flow chart) and data dynamics (etc.) as we go along with make, make, making. This is the order:
A generative “doodle” of software-based sound (which often sounds like R2D2), using MIDI and/or digital signal processing, and any combination of buttons, toggles, metronomes, randomizers, counters, and/or other learned objects.
A small, generative drawing project using jit.lcd or jit.gl.sketch, math, decision trees, gates, switches and/or the keyboard or mouse.
“Vizzie Visualizer and/or BEAP beater”
A generative or interactive project that uses randomness, feeds, and/or live input towards somewhat interesting or provocative ends. Students will be required to use both video (live and/or pre-recorded) and digital audio (live and/or pre-recorded) as part of this project – and pre-made patchers from the Vizzie and BEAP libraries are most welcome.
“Stupid pet trick” (mid-term)
An interactive artwork with some form of external input (human interface device, computer vision, Arduino, etc.) that uses pre-recorded video and/or live or pre-recorded sound along with some other form of input/output. Students will write a brief statement about their work (less than 300 words), and their technical abilities and use of inventive juxtaposition will be judged against this text’s framing of concept, creativity, and both interactive and visual aesthetics.
A large-scale interactive and/or generative and/or networked installation, performance, tool or art object. Again, students will be graded against their artist statements, on technical abilities, conceptual frames, creativity and both interactive and visual aesthetics. Undergraduates will show complete and working software, budget, and sketches for the full installation. Graduate students must set up the full installation somewhere in Kenilworth as part of their final critique.
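To give a flavor of the very first project in that list, here is a minimal Python sketch of the metronome-plus-randomizer idea. The Max objects (metro, counter, random, gates) have rough analogies below; this is plain illustrative Python with invented names, not Max, and the specific note choices are my own assumptions.

```python
import random

def doodle(steps, seed=0):
    """Generate a short R2D2-ish sequence of MIDI note numbers:
    a counter drives the rhythm, a randomizer picks the pitches."""
    rng = random.Random(seed)        # seeded so the doodle is repeatable
    notes = []
    for beat in range(steps):        # the 'metro'/'counter' pair
        if beat % 4 == 0:            # a simple gate: accent every 4th beat
            notes.append(72)         # fixed high-C accent (my assumption)
        else:
            notes.append(rng.randint(48, 84))  # the 'random' object
    return notes

print(doodle(8))  # eight MIDI note numbers, starting on the accent
```

Students build the same behavior by patching a metro into a counter and a random object, then sending the result to a MIDI note-out; no typed code required.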
Of course, as with all my classes, there are consistent discussions around the aesthetics and ethics of our work. The readings for undergrads are:
- “Action, Reaction and Phenomenon,” Rhizome.org (2008) (free online)
- N. Katherine Hayles, “Flesh and Metal: Reconfiguring the Mindbody in Virtual Environments” (2002) (available via Project Muse)
- Philip Galanter, “What is Generative Art? Complexity Theory as a Context for Art Theory” (2003) (available from CiteSeer)
- Nathaniel Stern, Interactive Art and Embodiment (introduction) (2013), made available by the instructor
- Katja Kwastek, “The Aesthetics of Play,” from The Aesthetics of Interaction in Digital Art
Grad students do additional readings and context-based work, and are also required to read (and we discuss):
- Beryl Graham and Sarah Cook, Rethinking Curating, MIT Press
- Nathaniel Stern, Interactive Art and Embodiment: The Implicit Body as Performance, Gylphi (the whole book, not just the intro)
- Kate Mondloch, Screens: Viewing Media Installation Art, University of Minnesota Press
It is SUCH a fun class, with great work, and a high satisfaction factor as I watch my students learn to think differently: about technology and data, about art and aesthetics, about interaction, relationality, and ethics. AND, while I’m on parental leave, I’m very excited to see what new dimensions Jessica Fenlon can add to the class and program. I’m working on getting her in at UWM – and look out for a feature on her work on this blog in the coming weeks…
Here’s the Interactive and Multimedia Art syllabus, in Word format. Enjoy art, teaching, and learning!