On the altar of a former cathedral in Duluth, Minn., an ensemble of musicians begins to play.
Their notes are piercing and sometimes dissonant. It’s not your typical cathedral music—but then again, these aren’t your typical musicians.
That's because these musicians are robots. None of them look like robots, though. They look more like futuristic instruments.
Troy Rogers is their creator, and he introduces them one by one.
First there’s AMI, short for “automatic monochord instrument.” It’s a sheet of clear plastic stood on end with a guitar string stretched across it. Electromagnetically driven levers press down on the string to change the pitch. In effect, it’s a one-string electric guitar that plays itself.
AMI represents one kind of musical robot, where the robot and the instrument are the same thing.
The other kind is a mechanical device that plays a traditional instrument, like Rogers’ snare drum. About a dozen robotic arms reach across the top of the drum, poised to tap or pound the drumhead.
Rogers built most of these robots with two other composers. Rogers, Steven Kemper and Scott Barton started making robots together while they were in grad school, back in 2007. Their collective is called Expressive Machines Musical Instruments, or EMMI.
“Now we live in different places and have joint custody of the robots,” Rogers says.
He introduces a clarinet-like robot next. This one’s a clear plastic tube that stands about 3 feet high. It has valves along its length, and Rogers’ computer operates those to change the pitch.
Rogers demonstrates the robots’ capabilities by having them play a piece based on Bulgarian folk music.
The composition lives on Rogers’ laptop as computer code. A wire runs to each robot, and each robot performs what the composer has written.
Although the music is written on a computer and triggered by a computer, it’s not “computer music.”
The sound is created by real, physical instruments.
Real instruments are what got Rogers started in music in the first place. He says he grew up playing guitar in bands.
“I was grounded for some youthful transgression and I read Frank Zappa’s autobiography,” he says. “He talked about being a composer, and I sort of glommed onto that.”
The term “composer” was new to Rogers, since he didn’t grow up with classical music. He liked Zappa’s take on it.
Rogers’ summary: “You can do whatever you want, and as long as you put a frame around it and call it music, then that’s what it is.”
Once you accept that broad definition of music, Rogers says it’s natural to embrace music made by machines. He says mechanical instruments got their start long before computers.
“From carillons in the Low Countries of Europe in the 13th century, to orchestrions in the 19th century and player pianos,” Rogers says, “electronic music is just a little side stream in that current of music technology.”
Rogers is finishing his Ph.D. in Music Composition and Computer Technology at the University of Virginia. He didn’t start out interested in building robots. He says he wasn’t one of those kids who likes to build stuff.
“No, I broke stuff, but I didn’t really build anything,” he says with a laugh. “I was more into destruction. I was into explosive devices.”
He lived through that phase, thankfully, and eventually found himself in graduate school.
A couple of years in, he wrote some music for a digital player piano. But he was frustrated by the instrument’s limitations. So he started taking electronic devices apart and figuring out how they worked.
Rogers’ first baby step toward building robots was to make an LED blink.
“Once you make an LED blink, it’s a short step to say, ‘Well, let’s make a motor spin or a solenoid trigger,’” he says. “Suddenly I found myself with a pencil and a paperclip and a rubber band and a steel bowl with a balloon stretched over it, making a little mechanical drummer. And from there, that’s how everything kind of started.”
Rogers kept learning from the Internet. Then he found other people who wanted to build musical robots, and they learned together. That’s when they formed EMMI.
Rogers says he’s a musician who just happens to have taught himself a lot of computer programming and engineering, but he’s not an engineer.
“There’s engineering that goes into building these things, but I don’t think like an engineer,” he says. “We’re not trying to build something that solves a particular specified problem in the most efficient way.”
In fact, sometimes Rogers and his friends do the opposite.
“Sometimes the inefficient solution is the most musically interesting,” he says.
Rogers still writes occasional pieces for voice or solo piano, but he says he’s drawn to robots because they can do things that humans simply can’t.
“It might be complex rhythmic relationships and structures, polyrhythms that are outside the realm of what humans can perform,” he says. “Rather than just a 3 against 2 feel, it might be 82 against 85 against 89 notes in a given time period.”
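To see why a polyrhythm like that is beyond human performers, consider the onset times involved. The sketch below (illustrative only, not EMMI's actual software) spaces 82, 85 and 89 notes evenly across one hypothetical shared time window and checks where the three voices line up:

```python
from fractions import Fraction

def onsets(n_notes: int, window: Fraction) -> list[Fraction]:
    """Exact, evenly spaced note-start times across a shared window."""
    return [Fraction(i, n_notes) * window for i in range(n_notes)]

window = Fraction(10)  # a made-up 10-second passage
voices = {n: onsets(n, window) for n in (82, 85, 89)}

# 82, 85 and 89 are pairwise coprime, so the three voices coincide
# only on the very first beat -- every later onset falls between
# the other voices' onsets, which is what makes it unplayable by hand.
shared = set(voices[82]) & set(voices[85]) & set(voices[89])
```

Exact fractions are used rather than floating point so the "do any onsets coincide?" test is precise.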
Robots can play faster than humans, too.
“Some of those trills or tremolos are happening at 40 times or 50 times a second,” Rogers says as one of his robots demonstrates.
That precision is great, but Rogers says it’s only part of why he likes to work with robots.
“On the other hand, sometimes you want to be surprised,” he says. “So you find the voice of the instrument.”
Rogers says experimenting with robots is no different from experimenting with acoustic instruments.
“Whether it’s multiphonics on a wind instrument or whether it’s a robotic instrument trilling at a rate that’s just below the absolute top threshold, you get into these sounds that you can’t produce another way,” he says. “And sometimes that’s what we’re looking for as musicians.”
Still, Rogers says robots have their own set of limitations. Or maybe it makes more sense to say that trained human musicians bring things to music that robots don’t. Expression, for instance, and vibrato.
“One dot on a piece of paper in a certain spot and you get all of that with a human performer,” he says. “It takes a lot more information and a lot more specification to get something as interesting out of a computer or a robot. A human performer just does it out of the box.”
Sometimes Rogers emulates a more human sound with a robot.
Enter the fourth member of Rogers’ quartet.
This one roughly simulates the human voice. Rogers built it with a friend in Belgium. He spent a year there on a Fulbright Scholarship working with what’s billed as the “world’s largest robot orchestra.”
Rogers says this robot generates sound from two “tunable Helmholtz resonators.”
“If you think of a beer bottle, if there’s less liquid, you have a lower pitch,” he explains. “And if there’s more liquid, you have a higher pitch. They’re a lot like that.”
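The beer-bottle analogy matches the standard Helmholtz resonance formula, in which the pitch rises as the air cavity shrinks. A rough sketch (with made-up bottle dimensions, purely for illustration):

```python
import math

def helmholtz_hz(neck_area_m2: float, neck_len_m: float,
                 cavity_m3: float, c: float = 343.0) -> float:
    """Resonant frequency f = (c / 2*pi) * sqrt(A / (V * L)),
    where A is the neck cross-section, L the neck length,
    V the air-cavity volume and c the speed of sound."""
    return (c / (2 * math.pi)) * math.sqrt(
        neck_area_m2 / (cavity_m3 * neck_len_m))

# Hypothetical bottle: 2 cm^2 neck, 5 cm long.
empty_bottle = helmholtz_hz(2e-4, 0.05, 5e-4)  # big air cavity -> lower pitch
nearly_full = helmholtz_hz(2e-4, 0.05, 1e-4)   # small air cavity -> higher pitch
```

More liquid means a smaller air cavity V, and the formula's 1/sqrt(V) dependence is why the nearly full bottle sounds higher, just as Rogers describes.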
Rogers can change the vowel sound the robot makes, from “ah” to “oh” to “ee” to “oo.”
Once the full robot quartet is plugged in and ready to go, Rogers cues up a piece based on Ugandan folk music.
His instruments begin to play—just as they do every time he asks them to.
“Where I used to be almost exclusively a composer, now I’ve become an instrument builder as well,” Rogers says. “Robots end up taking a lot of my time. They’re time-consuming devices.”
The robots might require a lot of time and energy, but on the plus side, Troy Rogers has a house orchestra that’s always ready to play.
And they hardly eat anything.