User Interfaces - Spring 2007

Room 109, Warren Weaver Hall
7-9pm Mondays

Each of you will be maintaining a page with your work on-line. Your web pages are linked to from here.

The email address for the course TA (with suitable substitutions) is: elif AT cs DOT nyu DOT edu

The mailing list for this course is at:



We'll be using this website as our primary "text book" for the course. The course will be structured into (i) an introductory part, in which you learn the technical basics of constructing user interface elements, and (ii) the fun part, in which we use that knowledge to do exciting, collaborative, exploratory projects that will make the world a better place and probably change your life forever (or something like that).

The basics

In the first several weeks of this course you will learn the technical basics of how to make user interface elements: buttons, sliders, menus, pull-down menus, panels, and their various cousins and relatives.

To do this, I'll be giving you a simple set of Java classes to build on. The reason we are using Java is that it will let us run everything over the Web, and (unlike, say, Flash), will allow our programs to scale up to handle computationally challenging simulations as needed. We are going to use only the most basic elements of Java for this stuff: getting a mouse event, drawing a simple graphic or text element.
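The starter classes for the course aren't shown on this page, but the flavor of "most basic elements of Java" is easy to sketch: take a mouse position, hit-test it against a rectangle, and draw accordingly. Here is an illustrative sketch using only plain `java.awt` types; the class name `SimpleButton` and its methods are my own invention for this example, not part of the course library.

```java
import java.awt.Color;
import java.awt.Graphics;

// A minimal button model: hit-testing and pressed state driven by
// mouse events, plus a paint method that draws it. (An illustrative
// sketch only; the course's own starter classes may differ.)
public class SimpleButton {
    private final int x, y, w, h;
    private boolean pressed = false;

    public SimpleButton(int x, int y, int w, int h) {
        this.x = x; this.y = y; this.w = w; this.h = h;
    }

    // True if the point (px, py) lies inside the button's rectangle.
    public boolean contains(int px, int py) {
        return px >= x && px < x + w && py >= y && py < y + h;
    }

    // Call this from your mousePressed handler.
    public void mousePressed(int px, int py) {
        if (contains(px, py)) pressed = true;
    }

    // Call this from your mouseReleased handler.
    public void mouseReleased() {
        pressed = false;
    }

    public boolean isPressed() { return pressed; }

    // Draw the button: darker while pressed, for visual feedback.
    public void paint(Graphics g) {
        g.setColor(pressed ? Color.GRAY : Color.LIGHT_GRAY);
        g.fillRect(x, y, w, h);
        g.setColor(Color.BLACK);
        g.drawRect(x, y, w, h);
    }
}
```

Everything else — sliders, menus, panels — is a variation on this loop: track where the mouse is, update a little bit of state, redraw.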

You will also learn how technology and design work together in constructing effective user interfaces. For example, take a look at this simulation of a piano keyboard, which I made recently using only the most primitive 2D Java components. Note the way that 3D structure and movement are merely suggested by combining 2D visual components. Note also how the keys are slightly "sticky" - when your mouse moves off a key it stays down for just an extra instant, much like a real piano key.
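That sticky behavior can be modeled independently of any drawing code: record when the mouse left the key, and keep reporting the key as "down" for a short grace period afterward. A sketch of the idea — the class name, the explicit-time API, and the length of the grace period are all assumptions for illustration, not how the piano demo is actually written:

```java
// Models a piano key that stays "down" briefly after the mouse leaves
// it, mimicking the feel of a real key. Times are passed in explicitly
// (e.g. from System.currentTimeMillis()) so the logic is easy to test.
public class StickyKey {
    private final long stickyMillis;   // grace period after mouse exit
    private boolean mouseOver = false;
    private boolean exited = false;    // has the mouse ever left the key?
    private long exitTime = 0;

    public StickyKey(long stickyMillis) {
        this.stickyMillis = stickyMillis;
    }

    public void mouseEntered(long now) {
        mouseOver = true;
    }

    public void mouseExited(long now) {
        mouseOver = false;
        exited = true;
        exitTime = now;
    }

    // The key reads as down while the mouse is over it, and for an
    // extra instant after the mouse has moved off.
    public boolean isDown(long now) {
        return mouseOver || (exited && now - exitTime < stickyMillis);
    }
}
```

The drawing code then just asks `isDown(now)` each frame and renders the key in its pressed or released state.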

The fun stuff

After several weeks, the class will branch out into two larger projects. You can actually start to think about these projects right away, and put notes up on your web site about them. But we won't be diving into them until sometime in mid-February.

Kid-authorable Museum Projection Walls

In one project we will be teaming up with folks at the New York Hall of Science to work out user interfaces to revolutionize the future of how kids experience science museums. When you were a kid, did you ever notice that science museums were cool, but that there was really a limit to how much you could influence the experience?

We are cooking up a plan, together with our friends Eric Siegel and Steve Uzzo at the New York Hall of Science, to create interactive experiences (such as simulations and experiments in genetics, evolution, color mixing, electricity, crazy virtual musical instruments, animal flocking behavior, game theory, boolean logic, optics, dinosaur movement, acoustics, you name it...) that groups of kids can play with and interact with on big projection floors and walls at the science museum. We will also be maintaining a parallel environment here in the lab.

But then (and here is where it gets interesting) we will be designing authoring interfaces that kids can use to create their own versions of these experiences on their computers at home, versions which will then run on the big projection stage at the museum. The kids with the best designs can then show off and talk about their work in public events at the museum. In effect, kids can use the museum as a laboratory to conduct their own science experiments. Periodically there will be live Web-streaming video of scheduled performances.

User interfaces for the Robotic Garden

My colleague Ali Mazalek down at the Graphics and Visualization Lab at Georgia Tech is teaching a class this semester in Experimental Media. The theme will be "The Garden as Performance Space". It's going to be a physical robotic garden, featuring reactive "inhabitants" equipped with sensors and actuators, in which interactive performances can be staged and broadcast. In her own words:

  1. Remote-controlled physical performance space [ grad students ] Project teams of graduate students will work together to develop the various parts of a real-time tangible performance space, equipped with sensors and actuators that can be controlled by remote GUIs designed by students in Ken Perlin's UI class at NYU. The goal is to create a playful garden-like space in which spontaneous performances consisting of behavioral/reactive elements can unfold, and be visually interesting to watch and interact with both locally and remotely. For example, the garden might contain plants that dance to music, converse with each other, and react to light and movement in the space.

  2. Remote-controlled real-time video capture of performances [ undergrad students ] Undergraduate students will work in sub-teams to equip the physical performance space with multiple steerable cameras that can be controlled by GUIs designed by students in Ken Perlin's UI class at NYU. The goal is to enable the capture and sequencing of an unfolding performance in real-time, and stream the captured performance to a theatre in the SecondLife 3D virtual world.

Our job will be to create the interfaces that will allow people from anywhere in the world to control and interact with this garden. If all goes well, we are going to collectively show off this work at the annual Machinima festival next fall. How cool would that be?
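One way to start thinking about those remote interfaces: each control gesture in a GUI becomes a small message sent over the network to the garden. The sketch below shows a hypothetical text command format (e.g. `"pan 30"` or `"tilt -15"`, one command per line over a socket) with encoding and parsing; none of these command names or this format is an agreed protocol with the Georgia Tech team — it's just an illustration of the shape of the problem.

```java
// A sketch of one way a remote GUI could talk to the garden: each
// control gesture becomes a small text command such as "pan 30" or
// "tilt -15", sent over a socket and parsed on the receiving side.
// The command names and wire format here are hypothetical.
public class GardenCommand {
    public final String name;   // e.g. "pan", "tilt", "zoom"
    public final int value;     // e.g. degrees or a percentage

    public GardenCommand(String name, int value) {
        this.name = name;
        this.value = value;
    }

    // Encode for transmission, one command per line.
    public String encode() {
        return name + " " + value;
    }

    // Parse a received line; returns null on malformed input,
    // so a noisy or buggy client can't crash the garden.
    public static GardenCommand parse(String line) {
        String[] parts = line.trim().split("\\s+");
        if (parts.length != 2) return null;
        try {
            return new GardenCommand(parts[0], Integer.parseInt(parts[1]));
        } catch (NumberFormatException e) {
            return null;
        }
    }
}
```

Keeping the protocol this simple means the same GUI code can drive the dancing plants, the steerable cameras, or anything else the Georgia Tech students build, just by agreeing on command names.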