Computing for the Common Good
"Computing for the Common Good is about integrating values into your work, research, and education. That's the heart of it."
After a year of brainstorming with faculty and staff, the Computing for the Common Good initiative, or C4CG, was born. “C4CG is about integrating values into your work, research, and education. That’s the heart of it,” says Haas. “We spent a year talking about who we are and what we want to do and what our aspirations are. This whole notion of working ‘for the common good’ emerged from these discussions. It’s about education and research, the two pillars of any college that are going to make lives better not only for the citizens of Massachusetts, but also for the world.”
Haas says there are three prongs to C4CG: education; research to improve computer technology itself, making it better and safer for all; and research that applies technology to do good in the world.
What distinguishes the campus’s C4CG, says Haas, is the three-pronged approach to using technology for good. “You will find a lot of places that are working on the third prong, and that’s wonderful. There are now also quite a few places, though fewer, that are starting to worry about ethics and how we educate our students. And there are a very few that are thinking about preparing technology to be better, making systems better. There’s been a little more of that in the last year or so. I like to say we can take some credit as trendsetters there,” says Haas.
As she elucidates the prongs, Haas talks about education—giving students skills to be the upright citizens they need to be, as well as good computer scientists. She talks about ethics, not as a concept out there, but as a factor that will impact students in their careers, as they confront tough ethical situations and decision-making.
“Today, students have a couple of optional seminars they can take, one of which brings in industry people to talk about real-world problems,” says Haas. “Another is more reading based. What we want to do going forward, and what we have a group of faculty working on at present, is an Ethics Simulation Game.” Think of it as online training. The idea is to use role-playing simulation technologies to grab students’ interest and then challenge them with real-world scenarios, the sorts of things they might find when they take their first job or might come across in the application of technology, says Haas. “A lot of what we are trying to do is to help students see the ethical implications of their technical work, and to empower students to say ‘no’ to something that doesn’t seem right or to go and ask other people for help. Even if they have the best hearts in the world, they may be terrified to question something their boss is doing,” says Haas. Though still a work in progress, Haas expects to see a demo of the software in the fall, and she’s already been approached by other schools that are interested in it.
C4CG’s second thrust—improving computing technology so that systems can enhance safety, ensure privacy and security, and embody some notion of fairness—also aims to flag issues that might come up in technology development. “Tall order, but really cool. This includes basic systems research that tries to make things more robust,” says Haas. She rattles off a number of projects, including one in robotics focused on autonomous vehicle safety. “How do you ensure vehicles avoid situations that could harm someone? Those sorts of changes in computer technology will make systems better,” says Haas.
Haas is also a big fan of the privacy, security, and fairness work being conducted by several groups in the college, as she believes “it will have an impact on every single one of us.” CICS faculty members have formed a research initiative around these concepts called EQUATE (Equity, Accountability, Trust, and Explainability). Faculty engaged in research and education related to equitable algorithms and systems form the core of EQUATE. Educational efforts include the aforementioned coursework in ethics, as well as algorithm design that respects the values of fairness and transparency. Research efforts explore EQUATE topics within software systems and programming languages, machine learning and vision, theory, and data management systems.
As an example, Haas talks about growing concerns regarding using algorithms for decision-making. EQUATE faculty Alexandra Meliou and Yuriy Brun study how software systems can exhibit bias and how software engineers can develop fairer, more equitable systems. They recently received a four-year, $1.05 million grant from the National Science Foundation to support their research.
“Software makes decisions about what products we are led to buy, who gets a loan, self-driving car actions that may lead to property damage or human injury, medical diagnoses and treatment, and every stage of the criminal justice system, including arraignment and sentencing decisions that determine who goes to jail and who is set free,” says Meliou. “And yet, examples of discrimination have shown up in many software applications, including advertising, hotel bookings, facial recognition, and image search,” she adds.
Brun says they plan to create a theoretical foundation of software fairness, including defining a suite of fairness metrics that can be used to describe desired properties of software. “We also plan to create algorithms for testing and verifying the software fairness, for identifying bias causes, and for debugging discrimination bugs,” says Brun.
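One widely used fairness metric of the kind Brun describes is demographic parity: whether a system's positive-decision rate is similar across groups. The sketch below is an illustrative example of that general idea, not the specific metrics Brun and Meliou are developing; the function name and the two-group setup are assumptions for illustration.

```python
def demographic_parity_difference(outcomes, groups):
    """Absolute difference in positive-outcome rates between two groups.

    outcomes: list of 0/1 decisions (e.g., loan approved = 1)
    groups:   parallel list of group labels, "A" or "B"
    A value near 0 suggests the decision rate is similar across groups;
    a large value flags a potential fairness bug worth debugging.
    """
    rate = {}
    for g in ("A", "B"):
        decisions = [o for o, grp in zip(outcomes, groups) if grp == g]
        rate[g] = sum(decisions) / len(decisions)
    return abs(rate["A"] - rate["B"])

# Both groups are approved at the same 50% rate, so the difference is 0.
outcomes = [1, 0, 1, 0, 1, 0, 1, 0]
groups = ["A", "A", "A", "A", "B", "B", "B", "B"]
print(demographic_parity_difference(outcomes, groups))  # → 0.0
```

Testing and verification tools of the sort Brun describes would check properties like this automatically over many generated inputs, rather than on a single hand-built example.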
A project that addresses all components of EQUATE, says Haas, is work conducted by privacy technologist Gerome Miklau for the 2020 Census. Funded by the National Science Foundation and the Defense Advanced Research Projects Agency (DARPA), Miklau has been working with the U.S. Census Bureau to develop technology that will better protect the privacy of citizens answering the census while still providing public access to aggregate census data important to federal, state and local decision-making.
“U.S. citizens are required to respond to the census and the Census Bureau is mandated by law to protect that data while at the same time making certain public releases,” says Miklau.
Recently, the Census Bureau decided that the current methods that they have been using to protect privacy may not be sufficient, says Miklau. “The Census Bureau actually attacked its own disclosure limitation methods and they found some vulnerabilities.”
Miklau and other scientists have been working with Census Bureau staff to develop new tools that will help boost privacy while ensuring access to aggregate census data. He and others are working on new methods of disclosure limitation using a model called “differential privacy.” Though the model was invented in 2006, Miklau says it had rarely found major practical application until very recently.
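The canonical differentially private primitive is the Laplace mechanism: answer a numeric query, such as a population count, with calibrated random noise added, so that no individual's presence or absence changes the answer's distribution by much. The following is a minimal sketch of that textbook mechanism, not the Census Bureau's actual system.

```python
import math
import random

def laplace_mechanism(true_value, sensitivity, epsilon):
    """Return a differentially private estimate of a numeric query.

    Noise is drawn from a Laplace distribution with scale
    sensitivity / epsilon: a smaller epsilon means stronger privacy
    but a noisier answer. For a simple counting query (how many
    people have some property), sensitivity is 1, because one
    person can change the count by at most 1.
    """
    scale = sensitivity / epsilon
    # Sample Laplace noise by inverse transform on a uniform draw.
    u = random.random() - 0.5
    sign = 1 if u >= 0 else -1
    noise = -scale * sign * math.log(1 - 2 * abs(u))
    return true_value + noise

# A private answer to "how many residents in this block?" (true count 100).
print(laplace_mechanism(100, sensitivity=1, epsilon=1.0))
```

Note the point Miklau makes about transparency: nothing here is secret. Anyone can read the algorithm, and the privacy guarantee still holds, because it rests on the randomness of the noise, not on hiding the method.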
One advantage of differential privacy is the way the algorithms work, says Miklau. “The guarantee of privacy does not rely on keeping the algorithms themselves secret. It might seem like a subtle thing, but that has a lot of implications for security and transparency,” says Miklau. Another strength of differential privacy, he says, is the ability to accurately analyze the consequences of two subsequent data disclosures. “One of the real limitations of the past techniques is that although you believe each release is private by itself, what happens if somebody can put them together? You can’t always control what people will do with the data you are going to release, so it’s really important to analyze what we call ‘composition,’ or subsequent, sequential releases. Many of the well-known privacy breaches that have made news result from these types of combinations of data,” says Miklau.
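The composition property Miklau describes has a simple basic form: under sequential composition, the privacy losses (epsilons) of repeated differentially private releases on the same data add up, so an organization can treat epsilon as a budget. The accountant class below is a hypothetical illustration of that bookkeeping, not a real Census Bureau tool.

```python
class PrivacyAccountant:
    """Track cumulative privacy loss under basic sequential composition.

    Each differentially private release spends part of a fixed epsilon
    budget; once the budget is exhausted, further releases are refused.
    This is what lets an agency reason soundly about what an adversary
    could learn by combining all of its releases.
    """

    def __init__(self, budget):
        self.budget = budget
        self.spent = 0.0

    def spend(self, epsilon):
        """Record a release at the given epsilon; return remaining budget."""
        if self.spent + epsilon > self.budget:
            raise RuntimeError("privacy budget exhausted")
        self.spent += epsilon
        return self.budget - self.spent

acct = PrivacyAccountant(budget=1.0)
acct.spend(0.4)  # first release: 0.6 of the budget remains
acct.spend(0.4)  # second release: about 0.2 remains
# A third release at 0.4 would exceed the budget and be refused.
```

Past breaches of the kind Miklau mentions happened precisely because releases that each seemed harmless were never accounted for jointly.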
Ultimately, the Census Bureau has to decide how to trade off between two social goods: personal privacy and accurate statistics about the U.S. population. “There is an inherent tension between those two things and, as a society, we simply cannot have as much as we want of both of them,” says Miklau. “The use of our algorithms, and differentially private methods generally, will allow census directors to make better-informed decisions about this tradeoff.”
The third thrust completes the C4CG picture—deploying computing technology for good. Again, Haas quickly rattles off a list of projects, each intriguing. There’s the machine vision project that analyzes radar images to better understand bird migration routes so that zoning decisions can be made with wildlife safety in mind. Another group of faculty is creating new devices to help people break bad habits—really bad habits—like addiction to drugs or smoking, or overeating driven by lapses in self-control.
There’s a service aspect to it, too, notes Haas. The campus participates in CyberCorps™ Scholarship for Service, funded by the National Science Foundation. The program supports undergraduate and graduate students who are interested in cybersecurity with a full scholarship and internship opportunities, on the condition that they work for a federal agency for an equivalent number of years after graduation. “It’s very popular with the students and the faculty who administer the program because it’s a real opportunity to do something good for your country. Some students stay with a government agency and others will take those skills and work in private industry. We can’t produce students fast enough in the cybersecurity space,” says Haas.
“People talk about the new industrial revolution. We are in another wave of great change. Yes, there will be losers and winners, but we have better tools for thinking about what’s happening, or going to happen, than we did before, and maybe we can get ahead on things like policy decisions,” says Haas. “This is our revolutionary idea for the next decade. I think we have a unique perspective in CICS. This came from our core, and I think we are going to change the world through this.”
Karen J. Hayes '85