
A Revolution in Computer Hardware

UMass Amherst electrical and computer engineer Qiangfei Xia is developing memristor-based hardware technology with the potential to advance AI while dramatically reducing energy usage and costs.

Artificial intelligence (AI) is rapidly emerging as a force in nearly every sector of society. Yet the computer hardware in use today rests on a foundation laid more than 75 years ago: the transistor, invented in 1947.

Thus far, computer engineers have largely kept up with the increased demands of advancing technology through various “brute force” methods, such as decreasing device size and increasing bandwidth, explains Qiangfei Xia, the Dev and Linda Gupta Professor of Electrical and Computer Engineering at UMass Amherst. The result, though, is that AI programs are enormously expensive to run, both in terms of monetary cost—limiting access to major corporations and the super wealthy—and environmental impacts, including carbon emissions and freshwater use. For example, training a large language model that powers the popular chatbot, ChatGPT, could cost over $10 million and consume more than 700,000 liters of freshwater.

Qiangfei Xia, the Dev and Linda Gupta Professor of Electrical and Computer Engineering at UMass Amherst.

“AI is incredible. In 2016, we saw its potential when AlphaGo, an AI-based computer system created by a Google subsidiary, beat the reigning human champion at the ancient Chinese board game Go,” Xia says. “But what most people don’t know is that the computers running AlphaGo filled nearly an entire room, and playing one game cost thousands of dollars in electricity.”

“Currently, computing uses around 10 to 15 percent of all electricity globally, and the U.S. Department of Energy projects that by 2040, we’ll no longer be able to produce enough electricity to meet this demand,” he adds. “This demand for power is actually driving interest by big tech companies like Amazon, Google, Microsoft, and Meta in recommissioning defunct nuclear power plants, which holds its own risks.”

For all these reasons, he says, “We need a revolution in computing hardware.”

For the past decade, Xia and his collaborators—including several former colleagues from Hewlett Packard (HP) Laboratories in Palo Alto, California, where he worked before joining the UMass faculty—have been developing a “memristor” device to build new computers. They have demonstrated that this new analog computing device can complete complex computing tasks while bypassing the limitations of digital computing and using far less energy. And they believe that memristor technology holds the potential to advance AI and address many of today’s most pressing scientific questions, from nanoscale material modeling to large-scale climate science.

The Nanodevices and Integrated System Lab, headed by Xia, has been addressing pressing issues in AI hardware and has had a significant impact on emerging hardware based on transition metal oxide memristive devices. Xia has been recognized for his contributions to this field: he is an elected Fellow of the Institute of Electrical and Electronics Engineers (IEEE) and has been named one of the world’s “Highly Cited Researchers” by Clarivate. He has been selected to deliver UMass Amherst's Distinguished Faculty Lecture, which acknowledges the work of the university's most esteemed and accomplished faculty members, and to receive the Chancellor's Medal—the highest recognition bestowed upon faculty by the campus. Xia will deliver his lecture, "Future Computers Will Be Like the Human Brain," on March 27 at 4 p.m. in Old Chapel. In 2023, Xia was also named a recipient of the UMass College of Engineering’s Outstanding Senior Faculty Award.

The Promise of Memristive Computing

According to Xia, the concept of memristor devices dates back to 1971, when it was first proposed by Leon Chua, a professor at the University of California, Berkeley. Work on the memristor remained mostly theoretical for decades, but in the late 2000s, HP—where Xia was working at the time—made a breakthrough by connecting the memristor concept to a physical device they built in the lab.

While transistors rely on the movement of electrons, the memristor takes inspiration from the human brain, in which ions move through hundreds of trillions of synapses to carry information. The memristor controls the flow of electrical current in a circuit, while also “remembering” the prior state, even when the power is turned off, unlike today’s transistor-based computer chips, which can only hold information while there is power. In the memristor, computing is performed at the site where data is stored, rather than moving data between the computer’s memory and processing modules.

Xia draws an analogy between this form of “in-memory computing” and the empty roads during the early days of the COVID-19 pandemic. “Everyone was working from home, so that reduced traffic on the roads substantially.” For a computer built with memristive technology, these empty “roads” mean a huge boost in energy efficiency and computing throughput. This opens many doors for creating low-power AI hardware, especially for edge computing, where data is processed in the devices that collect it, rather than being sent to a centralized cloud server. Potential applications include consumer electronics—such as lightweight AR/VR goggles or wireless earbuds—scientific research, and military technology.
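To make the idea of in-memory computing more concrete, here is a minimal sketch in Python with NumPy (this is not code from Xia’s lab, and the conductance and voltage values are made up purely for illustration). It models a small memristor crossbar performing an analog matrix-vector multiplication: each device’s conductance stores a weight, input voltages are applied to the rows, and by Ohm’s law and Kirchhoff’s current law the currents summed along each column yield the result right where the weights are stored.

import numpy as np

# 3x2 crossbar: each entry is a memristor's conductance in siemens
# (hypothetical values chosen only for illustration)
G = np.array([[1.0e-6, 2.0e-6],
              [0.5e-6, 1.5e-6],
              [2.0e-6, 0.5e-6]])

# Input voltages applied to the three rows, in volts
V = np.array([0.2, 0.1, 0.3])

# Ohm's law gives each device's current V[i] * G[i, j]; Kirchhoff's current
# law sums those currents down each column, so the readout is the
# matrix-vector product computed inside the array that stores the weights.
I = V @ G
print(I)  # [8.5e-07 7.0e-07] amperes

Because the multiply-and-accumulate happens inside the array that holds the weights, no data is shuttled between separate memory and processing units, which is where the energy savings Xia describes come from.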

Xia and his collaborators see great potential for commercializing this technology. In 2018, Xia co-founded TetraMem, a Silicon Valley-based startup, where he serves as an advisor.

“We want to work together and transfer this technology to mainstream foundries so it can benefit users more broadly in the future,” says Xia.

To help support that goal, in 2024, a team led by UMass Amherst was awarded a large grant from the Northeast Microelectronics Coalition (NEMC) Hub through U.S. CHIPS and Science Act funding under the Microelectronics Commons program. The first-year budget is $7.9 million and the total four-year budget for the project is $23.8 million, with yearly renewal contingent on satisfactory delivery of milestones. The funding is aimed at efforts to accelerate domestic prototyping and expand the nation’s global leadership in microelectronics. According to Xia, the researchers are partnering with GlobalFoundries in upstate New York to manufacture analog memristors at volume.

“The memristor technology is now far enough along in development that this is an ideal time for industry to take over,” Xia says. “In university research, we’ll continue fine-tuning the technology and exploring novel applications, such as in 6G cellular network technology and language processing, to name a few.”
 

Computing Inspired by the Human Brain

Going forward, Xia aspires to build computer circuits that are even more inspired by neuroscience and work as efficiently as the human brain. “We want to design a memristor device that’s a step closer to how our biological neurons work. This is known as neuromorphic computing,” he says.

Returning to the story of AlphaGo’s victory, Xia says the computer was estimated to draw around 150 kilowatts of power to play the game, while the highly efficient brain of its human competitor ran on only about 20 to 25 watts, several thousand times less. The human brain is also incredibly powerful in its capabilities. For example, we can recognize another person instantaneously based on only partial information, while digital computers must still do pixel-by-pixel matching.

“We have a long way to go. Literally, we do not understand our brains well enough yet. We want to use what we have already learned from the brain to build the next-generation computer. This will require collaboration with not just electrical engineers and computer scientists but also neuroscientists and psychologists,” Xia says. “These are very, very exciting and interesting frontiers we want to push.”

“I feel so lucky to be working in this field during this time when AI is booming,” he adds. “I’m very hopeful for the next phase of truly brain-inspired computer hardware.”

This story was originally published in February 2025.