research

image of prof Shanhui Fan
April 2021

Professor Shanhui Fan presented his latest advances in radiative cooling at CERAWeek, the annual energy sector conference. Shanhui's work on radiative cooling taps the coldness of the universe, which in turn can be harnessed on Earth for several renewable energy applications. For millennia, humans in regions where the ambient temperature never falls below freezing have used the effect to make ice by exposing water to the clear night sky.

Radiative cooling could have a significant impact on lowering electricity use and boosting the output of renewables, but it will require advances in blackbody emitters: materials that absorb heat and re-radiate it at wavelengths that pass through the atmosphere and escape into space.

"This requires a good blackbody emitter," said Shanhui, "but we can cool objects to a temperature 13 degrees Celsius (55 degrees Fahrenheit) below the ambient temperature with no electricity; it's purely passive cooling."

Radiative cooling systems could, for example, reduce the electricity required for air conditioning by 10 to 15 percent, he said. At night, such systems could also generate enough electricity for LED lighting in homes, a significant development for the roughly one billion people who live without electricity.

Other Stanford faculty research presented at the conference included:

  • Professor Yi Cui discussed new horizons for energy and climate research as part of a panel. To Cui, the big issue is energy storage to enable greater use of intermittent solar and wind power.
  • Professor Reinhold Dauskardt presented his work on spray-on solar cells.
  • Professor Arun Majumdar discussed gigaton-scale solutions for bringing global greenhouse gas emissions from human activity to zero.

 

Excerpted from the Precourt Institute article "Stanford at CERAWeek: energy storage, net-zero GHG, radiative cooling and perovskite solar cells."

 

Related News

image of prof Kunle Olukotun
April 2021

Professor Kunle Olukotun has built a career out of designing computer chips for the world.

These days his attention is focused on new-age chips that will broaden the reach of artificial intelligence to new uses and new audiences — making AI more democratic.

The future will be dominated by AI, he says, and one key to that change rests in the hardware that makes it all possible — faster, smaller, more powerful computer chips. He imagines a world filled with highly efficient, specialized chips built for specific purposes, versus the relatively inefficient but broadly applicable chips of today.

Making that vision a reality will require hardware that focuses less on computation and more on streamlining the movement of data back and forth, a function that now claims 90 percent of computing power, as Kunle tells host Russ Altman on this episode of Stanford Engineering's The Future of Everything podcast.

Source: The Future of Everything Series, "Kunle Olukotun: How to make AI more democratic"

image of prof Nick McKeown
March 2021

Later this year, in a lab in the Durand Building, a team of researchers will demonstrate how a tight formation of computer-controlled drones can be managed with precision even when the 5G network controlling it is under continual cyberattack. The demo's ultimate success or failure will depend on the ability of an experimental network control technology to detect the hacks and defeat them within a second to safeguard the navigation systems.

On hand to observe this demonstration will be officials from DARPA, the Defense Advanced Research Projects Agency, the government agency that's underwriting Project Pronto. The $30 million effort, led by Professor Nick McKeown, is largely funded and technically supported through the nonprofit Open Networking Foundation (ONF), with help from Princeton and Cornell universities. Their goal: to make sure that the wireless world – namely, 5G networks that will support the autonomous planes, trains and automobiles of the future – remains as secure and reliable as the wired networks we rely on today.

This is no small task and the consequences could not be greater. The transition to 5G will affect every device connected to the internet and, by extension, the lives of every person who relies on such networks for safe transportation. But, as recent intrusions into wired networks have shown, serious vulnerabilities exist.

The pending Pronto demo is designed to solve that problem by way of a fix that McKeown and colleagues have devised that wraps a virtually instantaneous shield around wirelessly accessible computers using a technology known as software-defined networking (SDN).

 

Excerpted from "A new Stanford initiative aims to ensure 5G networks are reliable and secure", Stanford News, March 24, 2021.

 

image of PRONTO members

image of prof. David Miller
February 2021

Professor David Miller joins Professor Russ Altman on The Future of Everything podcast. David's research interests include the use of optics in switching, interconnection, communications, computing, and sensing systems, physics and applications of quantum well optics and optoelectronics, and fundamental features and limits for optics and nanophotonics in communications and information processing. 

In this podcast he explains the remarkable potential of using light instead of electricity in computation.

 

"A silicon chip these days looks like six Manhattan grids stacked atop one another," Miller says of the challenge facing today's technology. Photonics holds the promise of more powerful computing by beaming tiny packets of photons through light-bearing conduits that carry 100,000 times more data than today's comparable wires, and it can do it using far less energy, too.

Before that day can arrive, however, Miller says photonic components need to become much smaller and less expensive to compete with the sheer scale advantages silicon enjoys, and that will require investment. But, for once, a way forward is there for the asking, as Miller tells bioengineer Russ Altman, host of Stanford Engineering's The Future of Everything podcast.

image of prof James Zou and PhD student Amirata Ghorbani
February 2021

Each of us continuously generates a stream of data. When we buy a coffee, watch a romcom or action movie, or visit the gym or the doctor's office (tracked by our phones), we hand over our data to companies that hope to make money from that information – either by using it to train an AI system to predict our future behavior or by selling it to others.

But what is that data worth?

"There's a lot of interest in thinking about the value of data," says Professor James Zou, member of the Stanford Institute for Human-Centered Artificial Intelligence (HAI), and faculty lead of a new HAI executive education program on the subject. How should companies set prices for data they buy and sell? How much does any given dataset contribute to a company's bottom line? Should each of us receive a data dividend when companies use our data?

Motivated by these questions, James and graduate student Amirata Ghorbani have developed a new and principled approach to calculating the value of data used to train AI models. Their approach, detailed in a paper presented at the International Conference on Machine Learning and summarized for a slightly less technical audience on arXiv, is based on the Shapley value, a Nobel Prize-winning economics concept, and improves upon existing methods for determining the worth of individual datapoints or datasets. In addition, it can help AI system designers identify low-value data that should be excluded from AI training sets as well as high-value data worth acquiring. It can even be used to reduce bias in AI systems.
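The data Shapley idea can be illustrated with a small Monte Carlo sketch: a training point's value is its average marginal contribution to a model's validation performance over random orderings of the training set. The code below is a simplified illustration only, not the authors' released implementation; the synthetic dataset, logistic-regression model, and 0.5 baseline score are assumptions made for the example.

```python
# Simplified Monte Carlo estimate of data Shapley values. Illustrative sketch only:
# the paper's Truncated Monte Carlo Shapley adds truncation and convergence checks,
# and the dataset, model, and baseline score below are arbitrary stand-ins.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

def utility(idx, X_tr, y_tr, X_val, y_val):
    """Validation accuracy of a model trained on the subset of points in idx."""
    if len(idx) == 0 or len(np.unique(y_tr[idx])) < 2:
        return 0.5                                  # assumed baseline for an untrainable subset
    model = LogisticRegression(max_iter=1000).fit(X_tr[idx], y_tr[idx])
    return model.score(X_val, y_val)

def monte_carlo_data_shapley(X_tr, y_tr, X_val, y_val, n_permutations=20, seed=0):
    rng = np.random.default_rng(seed)
    n = len(y_tr)
    values = np.zeros(n)
    for _ in range(n_permutations):
        perm = rng.permutation(n)
        prev = utility(perm[:0], X_tr, y_tr, X_val, y_val)   # empty-set baseline score
        for k in range(1, n + 1):
            score = utility(perm[:k], X_tr, y_tr, X_val, y_val)
            values[perm[k - 1]] += score - prev              # marginal contribution of the k-th point
            prev = score
    return values / n_permutations

X, y = make_classification(n_samples=60, n_features=5, random_state=0)
X_tr, X_val, y_tr, y_val = train_test_split(X, y, test_size=20, random_state=0)
shapley_values = monte_carlo_data_shapley(X_tr, y_tr, X_val, y_val)
print("lowest-value training points:", np.argsort(shapley_values)[:5])
```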

[...]

The data Shapley value can even be used to reduce the existing biases in datasets. For example, many facial recognition systems are trained on datasets that have more images of white males than minorities or women. When these systems are deployed in the real world, their performance suffers because they see more diverse populations. To address this problem, James and Amirata ran an experiment: After a facial recognition system had been deployed in a real setting, they calculated how much each image in the training set contributed to the model's performance in the wild. They found that the images of minorities and women had the highest Shapley values and the images of white males had the lowest Shapley values. They then used this information to fix the problem – weighting the training process in favor of the more valuable images. "By giving those images higher value and giving them more weight in the training process, the data Shapley value will actually make the algorithm work better in deployment – especially for minority populations," James says.
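As a hedged illustration of that reweighting step (not the authors' code), one common way to favor high-value examples is to pass per-sample weights derived from estimated Shapley values into training:

```python
# Hypothetical reweighting sketch: favor training points with high estimated data
# Shapley values by passing per-sample weights to the learner. The Shapley values
# here are random placeholders; in practice they would come from a valuation run
# like the sketch above.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression

X_train, y_train = make_classification(n_samples=200, n_features=8, random_state=1)
shapley_values = np.random.default_rng(1).uniform(-0.01, 0.05, size=len(y_train))  # placeholder values

weights = np.clip(shapley_values, 0.0, None)   # give negative-value points zero weight
weights = weights / weights.mean()             # normalize to a mean weight of 1

model = LogisticRegression(max_iter=1000)
model.fit(X_train, y_train, sample_weight=weights)  # many scikit-learn estimators accept sample_weight
```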

 

Excerpted from the HAI article "Quantifying the Value of Data."

image of prof Shanhui Fan
February 2021

Professor Shanhui Fan and his team have developed a wireless charging system that can transmit electricity even as the distance to the receiver changes. They have incorporated an amplifier and feedback resistor that allow the system to automatically adjust its operating frequency as the distance between the charger and the moving object changes.
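The tuning challenge can be seen from a generic coupled-resonator picture (a textbook approximation, not the team's specific circuit model): two magnetically coupled resonators with natural frequency \(\omega_0\) and a distance-dependent coupling coefficient \(k\) support split modes near

\[ \omega_{\pm} \approx \frac{\omega_0}{\sqrt{1 \pm k}}, \]

so the frequency at which power transfers efficiently drifts as the receiver moves. A source locked to a fixed frequency loses efficiency with motion, while a self-adjusting source that tracks \(\omega_{\pm}\) keeps the transfer efficient automatically.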

By replacing their original amplifier with a far more efficient switch-mode amplifier, they boosted the system's overall efficiency. The latest iteration can wirelessly transmit 10 watts of electricity over a distance of 2 to 3 feet.

Shanhui says there aren't any fundamental obstacles to scaling up a system to transmit the tens or hundreds of kilowatts that a car would need. In fact, he claims the system is more than fast enough to resupply a speeding automobile.

 

Excerpted from "Engineers Race to Develop Wireless Charging Technology"

image of 3 EE faculty: Subhasish Mitra, Mary Wootters, and H.S. Philip Wong
January 2021

Professors Subhasish Mitra, H.S. Philip Wong, Mary Wootters, and their students recently published "Illusion of large on-chip memory by networked computing chips for neural network inference", in Nature.

Smartwatches and other battery-powered electronics would be even smarter if they could run AI algorithms. But efforts to build AI-capable chips for mobile devices have so far hit a wall – the so-called "memory wall" separating the data-processing and memory chips that must work together to meet the massive and continually growing computational demands imposed by AI.

"Transactions between processors and memory can consume 95 percent of the energy needed to do machine learning and AI, and that severely limits battery life," said Professor Subhasish Mitra.

The team has designed a system that can run AI tasks faster, and with less energy, by harnessing eight hybrid chips, each with its own data processor built right next to its own memory storage.

This paper builds on their prior development of a new memory technology, called RRAM (resistive random-access memory), that stores data even when power is switched off – like flash memory – only faster and more energy efficiently. Their RRAM advance enabled the Stanford researchers to develop an earlier generation of hybrid chips that worked alone. Their latest design incorporates a critical new element: algorithms that meld the eight separate hybrid chips into one energy-efficient AI-processing engine.
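As a purely conceptual sketch of that melding idea (hypothetical, and far simpler than the chip-level design in the paper), one can picture a layer's weights split across eight chips, each computing against the slice held in its own local memory so that only small partial results ever cross chip boundaries:

```python
# Toy illustration: split one fully connected layer's weights across 8 "chips",
# each of which multiplies against only its locally stored slice (compute near memory),
# so only the small partial outputs are exchanged. Hypothetical sketch, not the paper's system.
import numpy as np

N_CHIPS = 8
in_dim, out_dim = 1024, 256
rng = np.random.default_rng(0)

W = rng.standard_normal((in_dim, out_dim)).astype(np.float32)   # full layer weights
x = rng.standard_normal(in_dim).astype(np.float32)              # one input activation vector

# "Store" a contiguous slice of the weight rows in each chip's local memory.
row_slices = np.array_split(np.arange(in_dim), N_CHIPS)
chip_weights = [W[rows] for rows in row_slices]

# Each chip computes a partial matrix-vector product against its own slice...
partials = [x[rows] @ Wi for rows, Wi in zip(row_slices, chip_weights)]

# ...and only the small (out_dim-sized) partial sums cross chip boundaries.
y_distributed = np.sum(partials, axis=0)
assert np.allclose(y_distributed, x @ W, atol=1e-3)
print("distributed result matches the single-chip result")
```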

Additional authors are Robert M. Radway, Andrew Bartolo, Paul C. Jolly, Zainab F. Khan, Binh Q. Le, Pulkit Tandon, Tony F. Wu, Yunfeng Xin, Elisa Vianello, Pascal Vivet, Etienne Nowak, Mohamed M. Sabry Aly, and Edith Beigne.

Excerpted from "Stanford researchers combine processors and memory on multiple hybrid chips to run AI on battery-powered smart devices."

image of prof H. Tom Soh
January 2021

EE Professor Tom Soh, in collaboration with Professor Eric Appel and colleagues, has developed a technology that can provide real-time diagnostic information. Their device, which they've dubbed the "Real-time ELISA," is able to perform many blood tests very quickly and then stitch the individual results together to enable continuous, real-time monitoring of a patient's blood chemistry. Instead of a snapshot, the researchers end up with something more like a movie.

"A blood test is great, but it can't tell you, for example, whether insulin or glucose levels are increasing or decreasing in a patient," said Professor Tom Soh. "Knowing the direction of change is important."

In their recent study, "A fluorescence sandwich immunoassay for the real-time continuous detection of glucose and insulin in live animals", published in the journal Nature Biomedical Engineering, the researchers used the device to simultaneously detect insulin and glucose levels in living diabetic laboratory rats. But the researchers say their tool is capable of so much more because it can be easily modified to monitor virtually any protein or disease biomarker of interest.

Authors are PhD candidates Mahla Poudineh, Caitlin L. Maikawa, Eric Yue Ma, Jing Pan, Dan Mamerow, Yan Hang, Sam W. Baker, Ahmad Beirami, Alex Yoshikawa, researcher Michael Eisenstein, Professor Seung Kim, and Professor Jelena Vučković.

The system relies on an existing technology called the enzyme-linked immunosorbent assay – ELISA ("ee-LYZ-ah") for short. ELISA has been the "gold standard" of biomolecular detection since the early 1970s and can identify virtually any peptide, protein, antibody or hormone in the blood. An ELISA test is good at identifying allergies, for instance. It is also used to spot viruses like HIV, West Nile and the SARS-CoV-2 coronavirus that causes COVID-19.

The Real-time ELISA is essentially an entire lab on a chip, with tiny pipes and valves no wider than a human hair. An intravenous needle directs blood from the patient into the device's tiny circuits, where ELISA is performed over and over.

 Excerpted from "Stanford researchers develop lab-on-a-chip that turns blood test snapshots into continuous movies", December 21, 2020.

Related News

image of prof Stephen P. Boyd
January 2021

The Boyd group's CVXGEN software has been used in all SpaceX Falcon 9 first-stage landings.
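CVXGEN generates fast, embeddable C solvers from high-level descriptions of small convex optimization problems. As a rough illustration of the flavor of problem involved in propulsive landing, here is a toy convex soft-landing sketch written in CVXPY (not CVXGEN, and not SpaceX's actual formulation); the dynamics, limits, and numbers are all made up:

```python
# Toy convex soft-landing problem solved with CVXPY, for illustration only. CVXGEN's role
# is to generate custom C solvers for fixed convex problem families of roughly this flavor.
import cvxpy as cp
import numpy as np

T, dt, g = 40, 0.5, 9.8                        # horizon steps, step length [s], gravity [m/s^2]
p = cp.Variable((T + 1, 3))                    # position [m]
v = cp.Variable((T + 1, 3))                    # velocity [m/s]
u = cp.Variable((T, 3))                        # commanded thrust acceleration [m/s^2]

constraints = [p[0] == np.array([200.0, 100.0, 1500.0]),   # hypothetical initial position
               v[0] == np.array([10.0, -5.0, -60.0]),      # hypothetical initial velocity
               p[T] == 0, v[T] == 0]                        # touch down at the origin, at rest
for t in range(T):
    constraints += [v[t + 1] == v[t] + dt * (u[t] - np.array([0.0, 0.0, g])),  # simple Euler dynamics
                    p[t + 1] == p[t] + dt * v[t],
                    cp.norm(u[t]) <= 30.0]                  # thrust-acceleration limit (made up)

# Minimize total commanded thrust, a convex stand-in for fuel use.
problem = cp.Problem(cp.Minimize(sum(cp.norm(u[t]) for t in range(T))), constraints)
problem.solve()
print("status:", problem.status, "| fuel proxy:", round(problem.value, 1))
```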

From spacex.com: Falcon 9 is a reusable, two-stage rocket designed and manufactured by SpaceX for the reliable and safe transport of people and payloads into Earth orbit and beyond. Falcon 9 is the world's first orbital class reusable rocket. Reusability allows SpaceX to refly the most expensive parts of the rocket, which in turn drives down the cost of space access.

On December 9, Starship serial number 8 (SN8) lifted off from a Cameron County launch pad and successfully ascended, transitioned propellant, and performed its landing flip maneuver with precise flap control to reach its landing point. Low pressure in the fuel header tank during the landing burn led to high touchdown velocity, resulting in a hard (and exciting!) landing.

 

Although Stephen doesn't plan to travel to Mars, he's thrilled that one day, some of his and his students' work will.

image of profs Wetzstein, Fan, Miller
December 2020

Professors Gordon Wetzstein, Shanhui Fan, and David A. B. Miller collaborated with faculty at several other institutions to publish "Inference in artificial intelligence with deep optics and photonics."

Abstract: Artificial intelligence tasks across numerous applications require accelerators for fast and low-power execution. Optical computing systems may be able to meet these domain-specific needs but, despite half a century of research, general-purpose optical computing systems have yet to mature into a practical technology. Artificial intelligence inference, however, especially for visual computing applications, may offer opportunities for inference based on optical and photonic systems. In this Perspective, we review recent work on optical computing for artificial intelligence applications and discuss its promise and challenges.

Additional authors are Aydogan Ozcan, Sylvain Gigan, Dirk Englund, Marin Soljačić, Cornelia Denz, and Demetri Psaltis.

 
