Power electronics can be found in everything from cellphones and laptops to hybrid and electric vehicles, industrial motor drives, and the inverters that connect solar panels to the electric grid. With close to 80% of US electrical energy consumption expected to flow through a power converter by 2030, innovative solutions are required to tackle key issues in conversion efficiency, power density, and cost. This talk will examine trends in power electronics across different application spaces, describe ongoing research efforts, and highlight the challenges ahead.
Immersive visual and experiential computing systems, i.e. virtual and augmented reality (VR/AR), are entering the consumer market and have the potential to profoundly impact our society. Applications of these systems range from communication, entertainment, education, collaborative work, simulation and training to telesurgery, phobia treatment, and basic vision research. In every immersive experience, the primary interface between the user and the digital world is the near-eye display. Thus, developing near-eye display systems that provide a high-quality user experience is of the utmost importance. Many characteristics of near-eye displays that define the quality of an experience, such as resolution, refresh rate, contrast, and field of view, have improved significantly in recent years. However, a significant source of visual discomfort prevails: the vergence-accommodation conflict (VAC). Moreover, no existing near-eye display supports natural focus cues. In this talk, we discuss frontiers of engineering next-generation opto-computational near-eye display systems that increase visual comfort and provide realistic and effective visual experiences.
Neuromorphic computing has recently emerged as one of the most promising options for reducing the power consumption of big data analysis, paving the way for artificial intelligence systems with power efficiency approaching that of the human brain. The key device in a neuromorphic computing system is the artificial two-terminal synapse, which controls signal processing and transmission. Its conductivity must be changed in an analog/continuous way depending on neural signal strength. In addition, synaptic devices must offer: symmetric/linear conductivity potentiation and depression; a high number of conductance levels (~32, depending on the application and algorithm requirements); long data retention (>10 years) and cycling endurance (>10⁹ cycles); ultra-low power consumption (<10 fJ); low variability; high scalability (<10 nm); and the possibility of 3D integration.
A variety of device technologies have been explored, such as phase change memories, ferroelectric random-access memory, and resistive random-access memory (RRAM). In each case, matching the desired specifications is a complex multivariable problem requiring a deep quantitative understanding of the link between material properties at the atomic scale and electrical device performance. We have used the multiscale modeling platform Ginestra™ to illustrate this for the cases of RRAM and ferroelectric tunnel junctions (FTJs).
In the case of RRAM, modeling of the key mechanisms shows that a dielectric stack composed of two appropriately chosen dielectrics provides the best solution, in agreement with experimental data. In the case of FTJs, the hysteretic ferroelectric behavior of dielectric stacks fabricated from the orthorhombic phase of doped HfO2 is nicely captured by the simulations. These show that the ferroelectric HfO2 stack can readily be used for analog switching simply by tuning the set/reset voltage amplitudes. An added advantage of the simulations is that they point out ways to improve the performance, reduce the variability, and extend the endurance of these devices in order to meet industrial requirements.
Join the SystemX Alliance for their SPRING Workshop Week: Apr 30 - May 3, 2018.
Details available on SystemX SPRING workshop page.
SystemX Alliance research broadly encompasses ubiquitous sensing, computing, and communications in various application areas. Currently affiliated SystemX faculty are found in departments across Stanford's School of Engineering and in some areas of the natural sciences and medicine. Their research agenda is continually evolving in accordance with the interests of Stanford faculty and industry affiliates.
Person-millennia are spent each year seeking useful molecules for medicine, food, agriculture, and other uses. For biomolecules, the near-infinite universe of possibilities is staggering and humbling. As an example, antibodies, which make up the majority of the top-grossing medicines today, are composed of about 1,100 amino acids chosen from the twenty used by living things. The binding part (variable region) that allows the antibody to recognize and bind pathogens is about 110 amino acids, giving rise to 10¹⁴³ possible combinations. There are only about 10⁸⁰ atoms in the universe, illustrating the intractability of exploring the entire space of possibility. This is just one example…
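The arithmetic behind these figures is straightforward to check: 110 positions, each drawn from 20 amino acids, gives 20¹¹⁰ sequences, which works out to roughly 10¹⁴³. A quick sketch in Python:

```python
import math

# 110-residue variable region, 20 standard amino acids per position:
# the sequence space is 20^110. Work in log10 to avoid huge integers.
n_positions = 110
n_amino_acids = 20

log10_sequences = n_positions * math.log10(n_amino_acids)
print(f"20^110 ≈ 10^{log10_sequences:.0f} sequences")  # ≈ 10^143

# Compare against the oft-quoted ~10^80 atoms in the observable universe.
print(log10_sequences > 80)  # True: the space vastly exceeds the atom count
```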
Presently, machine learning (ML), artificial intelligence (AI), quantum computing, and “big data” are often put forth as the solutions to all problems, particularly by pontificating TED presenters and in Sand Hill pitches dripping with hyperbole. Expecting these methods to provide intelligent prediction of molecular structure and function within our lifetimes is unrealistic. For example, a neural network trained on daily weather patterns in Palo Alto cannot develop an internal model for global weather. In a similar way, finite and reasonable molecular training sets will not magically cause a generalizable model of molecular quantum mechanics to arise within a neural network, no matter how many layers it is endowed with.
With that provocative preface, we turn to the notion of letting matter compute itself. Massive combinatorial libraries can now be intelligently and efficiently mined with appropriate molecular readouts (a.k.a. "the question vector") at ever-increasing throughputs, presently surpassing 10¹² unique molecules in a few hours. Once "matter-in-the-loop" exploration is embraced, AI, ML, and other methods can be brought to bear usefully in closed-loop methods to follow veins of opportunity in molecular space. Several examples of mining massive molecular spaces will be presented, including drug discovery, digital pathology, and AI-guided continuous-flow chemical synthesis – all real, all working today.
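The closed-loop idea described above can be sketched as a simple screen-select-diversify cycle. The toy below is purely illustrative and entirely an assumption on our part, not the speaker's pipeline: `score_molecule` stands in for the experimental "question vector" (in reality a physical assay on real molecules, not a computed function), and the alphabet/target are hypothetical.

```python
import random

def score_molecule(molecule, target="GATTACA"):
    # Stand-in for the experimental readout ("the question vector"):
    # in a real matter-in-the-loop system this would be a binding or
    # activity assay performed on physical molecules, not a simulation.
    return sum(a == b for a, b in zip(molecule, target))

def mutate(molecule, alphabet="ACGT"):
    # Propose a variant near a promising hit, biasing the next library
    # toward its neighborhood ("following veins of opportunity").
    i = random.randrange(len(molecule))
    return molecule[:i] + random.choice(alphabet) + molecule[i + 1:]

def closed_loop(rounds=50, pool_size=20, length=7):
    # Screen a library, keep the best hit, synthesize a new library
    # around it, and repeat -- the essence of the closed loop.
    pool = ["".join(random.choice("ACGT") for _ in range(length))
            for _ in range(pool_size)]
    best = max(pool, key=score_molecule)
    for _ in range(rounds):
        pool = [mutate(best) for _ in range(pool_size)]
        candidate = max(pool, key=score_molecule)
        if score_molecule(candidate) > score_molecule(best):
            best = candidate
    return best

random.seed(0)
print(closed_loop())
```

The design point the abstract makes is that the expensive, generalization-hungry part (modeling molecular physics) is replaced by direct measurement, while ML only has to steer *where* to sample next.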