
The spring 2025 issue of Rice Engineering Magazine is here!

At Rice Engineering, we are driven by a passion for innovation and a commitment to responsible engineering practices. It’s with great excitement that we unveil the new design of Rice Engineering magazine, which underscores our dedication to excellence in research, education, and service. The 2024-25 issue is full of news about how Rice Engineering is solving for greater good.


BUILDING COMPUTING SYSTEMS OF THE FUTURE


Computing, much like electricity in the days of yore, is now tightly woven into the fabric of modern society. Industries from healthcare to energy now rely on computers to process information and solve problems, large and small. Even our homes are not immune from the power of computing. This goes beyond personal computers and smartphones – computing is now just as deeply integrated into our appliances, vehicles, entertainment, and home management systems.

“Today, almost no aspect of human life is untouched by computing,” said Ashok Veeraraghavan, chair of the Department of Electrical and Computer Engineering at Rice. “But it is important to understand that we are also in the midst of a phase transition in computing.”

Much of this phase transition is fueled by advances in artificial intelligence (AI). Such algorithms promise to significantly improve – and quicken – the complex computations needed to support natural language processing, computer vision, data analytics, and informed decision-making tasks. However, the more complicated the task, the more power-intensive the computation it requires. To support the future of computing and, by extension, AI, the industry will need new computing systems – complementary hardware and software components that work together to deliver several orders of magnitude of improvement in computing efficiency, portability, and accessibility.

Chris Jermaine, the Victor E. Cameron Professor of Computer Science and chair of the Department of Computer Science, said that the latest graphics processing units (GPUs) take about 700 watts of power to run – approximately the same amount of power a microwave oven draws. A large-scale AI computation might have 1,000 or more GPUs working together for a month. At a certain point, as AI algorithms evolve, traditional computer systems won’t be able to keep up with their computation and energy demands.
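Those figures imply a striking back-of-the-envelope total. The short calculation below uses only the numbers cited above – 700 watts per GPU, 1,000 GPUs, roughly a month of continuous runtime – and treats them as rough averages; it is an illustration, not a measurement of any particular training run.

    # Back-of-the-envelope energy estimate for a large AI computation,
    # using the rough figures quoted in the article.
    gpu_power_watts = 700        # approximate draw of one modern GPU
    num_gpus = 1_000             # "1,000 or more GPUs working together"
    run_hours = 30 * 24          # roughly one month

    sustained_kw = gpu_power_watts * num_gpus / 1_000    # 700 kW of continuous draw
    total_mwh = sustained_kw * run_hours / 1_000         # about 504 MWh over the run

    print(f"Sustained draw: {sustained_kw:.0f} kW")
    print(f"Energy for the month: roughly {total_mwh:.0f} MWh")

That is on the order of the monthly electricity use of several hundred typical U.S. households – for a single computation.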

“The power requirements are huge – and they are only getting bigger,” said Jermaine. “Unless we see some major advances in computing systems, we aren’t going to be able to support the future of AI. And that means AI won’t continue to evolve like it has been.”

That’s one of the reasons why the George R. Brown School of Engineering and Computing has not only a new name but also a new focus. To continue its mission of “solving for greater good,” the school will work on designing and building innovative computing system solutions to sustain progress – and help accelerate the computing phase transition that Veeraraghavan described.

Legacy as foundation

Rice University has a long and storied history in computing system design and development. Modern supercomputing systems can thank Ken Kennedy, a Rice alum and beloved computer science professor, for his pioneering work on parallelizing compilers. This type of software allows computing systems to harness hundreds, or even thousands, of hardware processors, so they can work in concert to perform advanced computing calculations.
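The core idea is easy to sketch: when the iterations of a loop do not depend on one another, they can be spread across many processors and run simultaneously. The toy Python example below illustrates that pattern by hand; a parallelizing compiler’s job is to find such opportunities automatically in ordinary sequential code (the example here is an illustration, not drawn from Kennedy’s compilers).

    # Illustration of loop-level parallelism: each iteration is independent,
    # so the work can be distributed across every available processor core.
    from multiprocessing import Pool

    def simulate_cell(seed):
        # Stand-in for an expensive, independent calculation.
        return sum((seed * i) % 7919 for i in range(50_000))

    if __name__ == "__main__":
        inputs = range(256)                  # 256 independent pieces of work
        with Pool() as pool:                 # one worker per available core by default
            results = pool.map(simulate_cell, inputs)
        print(f"Computed {len(results)} independent results in parallel")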
“In the 1980s, Rice was well known for its work in high-performance computing – and that’s the legacy of Ken Kennedy,” said Jermaine. “And while, at that time, no one would have thought to pair high-performance computing with AI, advances in these types of algorithms have been driven by the amount of available data as well as an increase in available computing power. As both continue to grow, there are big implications for what kind of computing systems we will need in the future.”

With the world relying ever more heavily on computing to help solve its most pressing challenges, Jermaine added, computer scientists and engineers must start to think beyond classical systems architecture – and acknowledge the necessary, reciprocal relationship between the traditional computer science and engineering disciplines in order to develop new, more resilient systems for future computing needs. Veeraraghavan agreed – and said the George R. Brown School of Engineering and Computing is the perfect place to work on emerging hardware/software solutions to support the continuing evolution of AI algorithms.

“For decades, both the computer science and electrical and computer engineering departments have had a broad focus on computing systems,” he said. “It traces its origin story all the way back to the late 1950s and early ’60s, when the Rice Computer Project, or R1, was conceived, designed and built. Over time, this and other advances established Rice as a preeminent resource in both the hardware and the software of computing systems. This legacy was carried forward by Ken Kennedy and the many advances in software systems and parallel computing that he and his team pioneered. Around the same time, Sidney Burrus, Don Johnson and others established Rice University as a steeple of excellence in signal processing and computing, helping place Rice and Texas at the heart of the digital signal processing revolution that has had a lasting impact on our lives.

“It’s also the legacy of Willy Zwaenepoel and Peter Druschel, whose work in distributed computing shaped modern cloud and internet systems. Zwaenepoel’s innovations in networked computing and fault tolerance laid the groundwork for today’s reliable, large-scale data systems. Druschel’s breakthroughs in peer-to-peer networking transformed how data is shared, influencing everything from content distribution to cloud storage. When you consider the history of both these departments, their top-notch legacy and the cutting-edge research that they continue to do, it makes perfect sense for the two departments to collaborate and co-design hardware and software that will have the power to optimize computing systems that can function more efficiently to meet the challenges of AI.”

New ways to process data

To support this bold, new strategy, the George R. Brown School of Engineering and Computing hopes to bring on 10 new faculty members with joint appointments in computer science and electrical and computer engineering. The goal is to develop emerging applications in computer architecture and chip design, computer communication and networking, sensing and perception systems, and next-generation operating systems, compilers, and programming languages. Jermaine added that the school already has several exciting and innovative projects underway that have the potential to reduce power consumption and increase computing efficiency.

“On the hardware side, one of our associate professors in electrical and computer engineering, Kaiyuan Yang, is working on in-memory computing systems,” he explained. “This approach can give you a massive win both in terms of the capabilities of hardware in doing computations, as well as how much power is required.”

The traditional von Neumann computer architecture, which is the basis for the computers we use today, keeps the system’s memory and computational processing separate.

“You have to have a powerful general-purpose processor that is always bringing data and instructions from the computer’s memory, modifying it in the central processing unit (CPU), and then writing the data back out to the memory,” Jermaine said. “Moving the data back and forth between the memory and the CPU is a very power-hungry process.”

An in-memory computing architecture, in contrast, builds simple circuits into the random-access memory (RAM) itself so that basic computations run where the data already lives – saving significant time and power.
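A toy cost model makes the contrast concrete. The per-operation energy values below are placeholders chosen only to show the shape of the argument – the well-established pattern is that moving an operand to and from memory costs far more energy than computing on it – and are not figures from Yang’s research.

    # Toy energy model: von Neumann-style processing, where every operand is
    # shuttled between memory and the CPU, versus an in-memory design that
    # computes next to the data. All per-operation costs are illustrative.
    pj_per_dram_transfer = 100.0   # assumed cost to move one operand to or from memory
    pj_per_alu_op = 1.0            # assumed cost of one arithmetic operation in the CPU
    pj_per_in_memory_op = 5.0      # assumed cost of a simple compute-in-RAM operation

    num_values = 1_000_000

    # von Neumann: read each operand, compute in the CPU, write the result back.
    von_neumann_uj = num_values * (2 * pj_per_dram_transfer + pj_per_alu_op) / 1e6
    # In-memory: the simple circuit on the RAM operates on the data in place.
    in_memory_uj = num_values * pj_per_in_memory_op / 1e6

    print(f"von Neumann-style: {von_neumann_uj:.0f} microjoules")
    print(f"in-memory-style:   {in_memory_uj:.0f} microjoules")

Even with made-up numbers, the point stands out: most of the energy goes to moving data rather than computing on it.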

Another project from an interdisciplinary team of researchers will explore new technologies, including ferroelectric materials, with the potential to improve computing efficiency. The efforts are part of the Defense Advanced Research Projects Agency (DARPA) Next Generation Microelectronics program. The team, comprising materials scientists, electrical and computer engineers (including Veeraraghavan), and nanoengineers from within the George R. Brown School of Engineering and Computing, will collaborate with other Rice University research institutes.

“The basic effort is to create a new type of computer and memory architecture with the potential to be 100 times more energy efficient,” said Veeraraghavan. “Given just how much power computing takes today, this is an absolute necessity. If we continue on the path of exponential growth developing new power-hungry computing technologies, within a decade or two, it is expected that the power needs of computing systems will be greater than the power generation capacity of the world. That’s not a sustainable trajectory for computing, and this is a pressing problem – what we can do with computing in the future will be limited if we can’t solve it.”

Rice’s computer scientists are also working on innovative solutions at the intersection of quantum and high-performance computing. Tirthak Patel, an assistant professor of computer science, focuses on developing system software, compilers, and architectures that enhance the efficiency and reliability of quantum programs, making them more practical for real-world applications. Quantum computing holds the potential to impact drug discovery significantly, offering accelerated computational power to solve complex problems in molecular modeling and simulation.

“Tirthak’s work is critical in bridging the gap between the theoretical promise of quantum computing and its real-world feasibility,” said Jermaine. “By building robust software and systems, he’s helping unlock the potential of quantum computing to tackle problems that are currently beyond the reach of classical computing.”

High-risk, high-reward

Part of Ken Kennedy’s enduring legacy – as a researcher and a pioneer in the computing industry – is the willingness to take risks on audacious and game-changing ideas.

“Kennedy was all about asking big and difficult questions. He’s known for his successes with parallel compiling, but he did fail sometimes. And, over time, as a field, we learned just as much from those failures,” said Jermaine. “We are facing big problems in computing, and we are going to need new solutions. We can no longer rely on a one-size-fits-all approach to software and hardware.”

Increasingly, Jermaine said, he believes we will need specialized hardware and software solutions designed to address specific needs – and expects to see “a tremendous amount of variability” in computing system design over the next few decades. And many of those new systems, he expects, will come from the groundbreaking research being pursued at the George R. Brown School of Engineering and Computing at Rice.

“By developing computing systems in a holistic manner – and not continuing to let hardware and software evolve in their siloes – Rice researchers will have both the knowledge and engagement to drive innovation, ensuring that advances in AI, as well as any other new computing paradigms, can be supported,” Veeraraghavan said.

“We are facing systems problems, and they are going to require systems solutions,” he added. “Rice is uniquely poised to be a leader in this space because of our history and talent. We will help drive the next phase transition in computing, not just in developing new technologies, but also in thinking about how those technologies will shape our future.”