Whose drone is this?
On a cold gray afternoon in April, about 50 students flanked by police officers gathered on Carnegie Mellon University’s campus to protest. Their message: End Gun Violence, End State Violence.
Partway through the demonstration, from behind the chemical engineering building, a six-armed drone appeared in the sky. It caught the attention of Calvin Pollak, a Ph.D. student focused on rhetoric and one of the leading voices of the Students for a Democratic Society, a CMU student group inspired by the 1960s student activist group of the same name.
“Was that a police drone? Was it a CMU drone? Like, we don’t know,” Pollak said to those assembled before launching into a critique of the military-industrial complex on campus.
A single drone hovering over a university known worldwide for robotics is hardly sinister, but its sudden appearance at a protest against state violence made it a readymade symbol for the point organizers were making:
The future of warfare is being developed at CMU, and what do we really know about it?
Since forming shortly after the election of Donald Trump, the Students for a Democratic Society [SDS] has sponsored a teach-in on mass surveillance and a walkout on International Women’s Day. Now, they’re asking pointed questions about the nature of research happening at CMU.
A CMU spokesperson declined multiple requests to interview administration officials for this story.
At an SDS meeting in March, Pollak and others conveyed that the culture at CMU is such that the administration isn’t used to being questioned by students.
“The culture is just not to engage with political issues on campus, especially issues that originate from the campus,” Pollak said. “The culture is to matriculate through your program, get a good job and that’s it.”
Millions in military research
Imagine a swarm of miniature drones, flying in unison with the power to kill.
We’re not at self-governing killer robots quite yet, but weaponized drone swarms are very real and they’re becoming more common and more autonomous — thanks, in part, to research being conducted at CMU.
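To give a rough sense of what “flying in unison” means in software, the sketch below implements the classic decentralized flocking rules (separation, alignment, cohesion) popularized by Craig Reynolds’ “boids.” It is a generic toy model, not CMU’s or the DoD’s code, and every parameter in it is invented for illustration:

```python
import numpy as np

N = 25                               # drones in the toy swarm
rng = np.random.default_rng(0)
pos = rng.uniform(0, 100, (N, 2))    # 2-D positions
vel = rng.uniform(-1, 1, (N, 2))     # 2-D velocities

def step(pos, vel, radius=15.0, w_sep=0.05, w_ali=0.05, w_coh=0.01):
    """One tick of Reynolds-style flocking. Each drone reacts only to
    neighbors within `radius` -- there is no central controller, which
    is part of what makes a swarm hard to stop by taking out one node."""
    new_vel = vel.copy()
    for i in range(N):
        dist = np.linalg.norm(pos - pos[i], axis=1)
        nbrs = (dist > 0) & (dist < radius)
        if not nbrs.any():
            continue
        sep = (pos[i] - pos[nbrs]).sum(axis=0)   # steer away from crowding
        ali = vel[nbrs].mean(axis=0) - vel[i]    # match neighbors' heading
        coh = pos[nbrs].mean(axis=0) - pos[i]    # drift toward local center
        new_vel[i] += w_sep * sep + w_ali * ali + w_coh * coh
    speed = np.maximum(np.linalg.norm(new_vel, axis=1, keepdims=True), 1e-9)
    new_vel = np.where(speed > 2.0, new_vel * (2.0 / speed), new_vel)  # cap speed
    return pos + new_vel, new_vel

for _ in range(100):                 # after ~100 ticks the drones move as a flock
    pos, vel = step(pos, vel)
```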
The Department of Defense’s [DoD] spending on defense research and development has nearly doubled in the past two decades. According to data from the National Science Foundation, annual dollars spent by the DoD on research and development (that were disclosed by the federal government) have increased from $35.5 billion in 1996 to $68.3 billion in 2017.
According to a CMU report on expenditures of federal awards, the university declared that it spent $172 million in direct funding from the DoD for the fiscal year ending June 30, 2017, plus an additional $22 million in pass-through funding (money that other grant-receiving institutions received from the DoD and then passed to CMU). The university declared $182 million the previous fiscal year, plus $17 million in pass-through funding.

Subrata Ghoshroy, a research affiliate at the Massachusetts Institute of Technology’s program in science, technology and society, said a portion of the DoD’s research and development money finds its way to universities in all 50 states, mostly for fundamental research that isn’t specific to any particular weapons system. But, in reality, he said “there is no such thing as free lunch, and the Pentagon is not handing out money just to do good science.”
Drones can and do enable innovation for the common good. For example, drone swarms have the potential to benefit society by monitoring forest fires or scouting disaster areas.
But the DoD appears to be more focused on what artificial intelligence [AI] and intelligent machines can do to transform warfare. Some experts believe that an “AI arms race” with Russia and China is near, if not already underway, and it has been reported that the DoD is collaborating with U.S. intelligence agencies to establish a Joint Artificial Intelligence Center.
With all that in mind, CMU is seen as a key player in this transformation: U.S. News and World Report ranks CMU as a top graduate school in the nation for both AI and computer science. CMU’s importance to the federal government was further underscored in a January 2017 article by CyberScoop, an online publication covering cybersecurity, which stated “Carnegie Mellon University is to the NSA what the University of Alabama is to the NFL…”
As it pertains to drone swarms specifically:
- CMU is one of five partners in the defense industry and academia developing swarm tactics for a program that envisions small infantry units working in conjunction with a swarm of up to 250 unmanned aircraft systems and/or unmanned ground systems in support of a mission “to isolate an urban objective.”
- Last year, three CMU researchers published a paper detailing how adversarial “mole” drones can be used to infiltrate a drone swarm and subvert it (a simplified sketch of the idea follows this list). “This problem has significant military applications,” according to the paper, funded partially by a grant from the Air Force Office of Scientific Research.
- CMU has also conducted research for the U.S. Marines and, in 2015, hosted a proof-of-concept demonstration for technology to increase the autonomy of unmanned ground and aerial systems working together.
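The researchers’ actual methods are laid out in their paper; purely as a toy illustration of why a “mole” matters, the hypothetical simulation below shows a naive average-consensus swarm being dragged off course by a single drone that lies about its estimate. Every name and number here is invented:

```python
import numpy as np

N, MOLE = 10, 0                      # ten drones; index 0 is the "mole"
honest = np.arange(N) != MOLE
rng = np.random.default_rng(1)
est = rng.uniform(0.0, 100.0, N)     # each drone's estimate of a rendezvous point
TARGET = 500.0                       # where the mole wants to pull the swarm

# Naive average-consensus: each round, every honest drone nudges its
# estimate toward the mean of all broadcasts it hears. Because the mole
# repeats a constant lie, the swarm's agreement drifts toward TARGET --
# unless the protocol filters outliers, which this toy version doesn't.
for _ in range(200):
    broadcast = est.copy()
    broadcast[MOLE] = TARGET
    est[honest] += 0.5 * (broadcast.mean() - est[honest])

print(f"honest drones now agree near {est[honest].mean():.1f}")  # ~500.0
```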
A white paper from the Center for a New American Security said drone swarms could allow “disruptive change to military operations…bring[ing] greater mass, coordination, intelligence and speed to the battlefield.” The same report envisions a “smart cloud” of billions of cheaply produced 3-D printed drones that could “flood a building, locating and identifying enemy combatants and civilians.”
Dialogue and disappointment
In early April, two SDS representatives met with CMU’s Vice Provost for Research Gary Fedder to express their concerns about the military research happening on campus.
“We seek greater transparency and dialogue in the CMU community about how our university’s work is connected to that of the U.S. military and what the implications are of these connections,” wrote SDS member Rosemary Haynes in the letter requesting Fedder meet with group members.
Neither Fedder nor his office responded to repeated requests for comment from PublicSource.
SDS members Pollak and freshman Shori Sims said Fedder shared with them information on how the university manages research and funding. However, the students said they were disappointed to learn that, as they understood it, the provost’s office only checks compliance with the rules on where classified research can be performed, not the morality of the research itself.

Under CMU’s research policy, classified research is prohibited “except when confined to the semi-autonomous units” — such as the Software Engineering Institute, National Robotics Engineering Center and Mellon Institute. CMU professors are able to work on classified projects as consultants, and students are permitted to participate so long as it “does not interfere substantially with progress toward a degree.”
“Based on this information, we need to work on changing the culture of research here,” Pollak said.
According to Ghoshroy of MIT, U.S. research universities leave it up to individual professors to decide what projects and fields of study are worthy of their time. He noted that many universities in Germany have begun to implement policies that scrutinize defense research, though state funding for military research there is lower than in the United States (and German universities have performed research for the Pentagon); Ghoshroy said the peace and student movements also carry more power there.
War research
The bulk of DoD funding funneled to CMU goes to the Software Engineering Institute [SEI], the nation’s only Federally Funded Research and Development Center dedicated to software.
In 2015, the DoD renewed SEI’s contract for five years at $732 million, with the option to extend the contract for another five years at an additional $1 billion.
Operated by CMU and sponsored by the DoD, SEI is home to CERT, the first Computer Emergency Response Team and self-described “birthplace of cybersecurity.” There, experts also conduct cutting-edge research on system verification, data modeling, mission assurance, autonomy and human-machine interaction.
Much of what happens at SEI is classified, but elements of its work sometimes become public, as in 2014, when researchers there made headlines for unmasking crime suspects who were using the anonymizing web browser Tor.

SEI also performs research that uses machine learning to break down troves of surveillance video from unmanned aircraft systems, allowing teams that actively monitor drone feeds to access a searchable summary of meaningful highlights. Materials posted online by SEI note that the work is similar to Project Maven, a DoD project that, in April, was the subject of protests and petitions from thousands of Google employees opposed to it.
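SEI has not published the system’s code; as a loose sketch of the underlying idea — turning hours of footage into a searchable index of labeled moments — consider the hypothetical pipeline below, where a stand-in function plays the role of a trained recognition model:

```python
from collections import defaultdict

def label_frame(frame):
    """Stand-in for a trained detector. A real system would run an
    object-recognition model on the pixels; this toy version just
    reads pre-attached tags."""
    return frame["tags"]

def build_index(frames):
    """Map each label to the timestamps where it appears, so an analyst
    can query footage instead of watching it end to end."""
    index = defaultdict(list)
    for frame in frames:
        for tag in label_frame(frame):
            index[tag].append(frame["t"])
    return index

# Toy feed: three "frames" with timestamps (seconds) and invented tags.
feed = [
    {"t": 0.0, "tags": {"vehicle"}},
    {"t": 1.5, "tags": {"vehicle", "person"}},
    {"t": 3.0, "tags": set()},
]
index = build_index(feed)
print(index["person"])   # -> [1.5]: jump straight to the relevant moment
```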
A DoD spokesperson, U.S. Army Maj. Audricia Harris, confirmed to PublicSource that the university has ties to Project Maven. “Project Maven regularly seeks advice from the nation’s top academic AI institutions on a wide variety of AI-related topics. This includes Carnegie Mellon University,” Harris wrote in an email.
SEI’s annual report also discusses research on real-time extraction of biometric data from video, work that “holds potential in a wide range of scenarios, including security, surveillance, counter-terrorism, and identification.” Marios Savvides, founder and director of the biometrics center at CMU’s CyLab, is a key collaborator in that work.
In May 2016, Savvides was awarded $8.9 million from the DoD for improvements to biometric surveillance and identification technology for the Naval Air Warfare Center Aircraft Division’s Special Surveillance Program. He has also conducted research into long-range iris scanning.
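To give a sense of the first stage of any biometric video pipeline — finding faces in a frame before anyone is identified — here is a minimal sketch using OpenCV’s bundled Haar-cascade detector. It illustrates the general technique only; it is not Savvides’ or SEI’s system, and the file name is a placeholder:

```python
# Requires: pip install opencv-python
import cv2

# OpenCV ships a pretrained frontal-face Haar cascade with the package.
cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

def faces_in(image_path):
    """Return bounding boxes (x, y, w, h) for faces found in one frame.
    A production biometric system would follow detection with feature
    extraction and matching against a gallery -- omitted here."""
    image = cv2.imread(image_path)          # placeholder path below
    if image is None:
        raise FileNotFoundError(image_path)
    gray = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)
    return cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)

for (x, y, w, h) in faces_in("frame_0001.png"):
    print(f"face at x={x}, y={y}, size {w}x{h}")
```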
“A world that uses facial recognition does not look like Hollywood’s Minority Report,” reads a quote from Savvides on the CyLab homepage. “It looks like a smarter, more pleasant experience interacting with complex computer security systems to help make a safer world for our friends, our families and our children.”
Notable projects
The Robotics Institute [RI] at CMU is the crown jewel of Pittsburgh’s technological resurgence. Founded in 1979, RI was the first in the nation to offer a Ph.D. in robotics and is consistently regarded as one of the marquee robotics institutions in the world.
There, pioneering researchers have developed miniature robots for minimally invasive therapy to the surface of a beating heart.
It has received NASA grants for interplanetary rover technology, and its robots have been used to search for victims trapped in rubble after earthquakes.
Yet, at the RI’s National Robotics Engineering Center [NREC] in Lawrenceville, experts also work with government and industry clients on projects that aren’t as publicized.
Some of those projects include:
- NREC collaborated with Lockheed Martin subsidiary Sikorsky on a project to coordinate joint autonomous activities between an unmanned Black Hawk helicopter and a CMU Land Tamer, an unmanned ground vehicle.
- Defense contractor Leidos contracted NREC to develop a path planner for Sea Hunter, an anti-submarine warfare continuous trail autonomous vessel (a toy planner in the same spirit is sketched after this list).
- British aerospace and defense firm BAE Systems partnered with NREC on sensing, teleoperation and autonomy packages for Black Knight, a 12-ton unmanned ground combat vehicle.
- NREC designed a sensor system to allow Boston Dynamics’ Legged Squad Support System to perceive its surroundings and autonomously track and follow a human leader.
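NREC’s planner for Sea Hunter is proprietary; as a generic taste of the problem, the sketch below runs textbook A* search over a small grid “chart,” finding a shortest route around obstacles. The grid, costs and chart are invented for illustration:

```python
import heapq

def astar(grid, start, goal):
    """Textbook grid A*: expand the frontier in order of path cost plus a
    Manhattan-distance heuristic until the goal is reached."""
    def h(a, b):
        return abs(a[0] - b[0]) + abs(a[1] - b[1])
    frontier = [(h(start, goal), 0, start, [start])]
    seen = set()
    while frontier:
        _, cost, node, path = heapq.heappop(frontier)
        if node == goal:
            return path
        if node in seen:
            continue
        seen.add(node)
        r, c = node
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if (0 <= nr < len(grid) and 0 <= nc < len(grid[0])
                    and grid[nr][nc] == 0 and (nr, nc) not in seen):
                heapq.heappush(frontier, (cost + 1 + h((nr, nc), goal),
                                          cost + 1, (nr, nc), path + [(nr, nc)]))
    return None  # no route around the obstacles

# 0 = open water, 1 = an obstacle the vessel must avoid (all invented).
chart = [[0, 0, 0, 0],
         [1, 1, 0, 1],
         [0, 0, 0, 0],
         [0, 1, 1, 0]]
print(astar(chart, (0, 0), (3, 3)))  # shortest route, cell by cell
```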
‘Downstream importance’
A different philosophy toward research can be found at the RI’s Community Robotics, Education and Technology Empowerment Lab, known as the CREATE Lab*. There, researchers design tools to benefit society, and military funding is forbidden.
CREATE Lab’s founder and director, Illah Nourbakhsh, is a world-renowned robot ethicist and author of “Robot Futures,” an exploration of the coming societal implications of advanced robotics. He teaches an introductory, cross-disciplinary course on ethics and robotics and challenges engineers to become active thinkers, mindful of the consequences of their work.
“Technologists often underestimate their own downstream importance,” he said, referencing the social scientist Thomas Gieryn. “We are at the very beginning of the process of social change.”

It’s no surprise to hear Nourbakhsh echo well-established concerns about giving robots the power to make life-and-death decisions, but he is more concerned with the potential for increasingly powerful AI that could be used to “tip the scales toward a concentration of power in the hands of corporations and governments, away from the populace.”
Faced with these looming threats to society, Nourbakhsh said he’s not optimistic about the future, except in one sense:
“The student body that I see today is much more socially aware and much more upset about the hegemonic power structures in society — and about the role data and surveillance and lack of privacy play in that — than any other student body I’ve seen for 22 years.”
*CREATE Lab and PublicSource receive funding from The Heinz Endowments.
This story was fact-checked by Jeffrey Benzing.
Brian Conway is a Pittsburgh-based freelance writer. He can be reached at brian.conway@gmail.com. You can follow him on Twitter @BrianConwayyyyy.