EE380 Computer Systems Colloquium: Towards theories of single-trial high dimensional neural data analysis

Topic: 
Towards theories of single-trial high dimensional neural data analysis
Abstract / Description: 

Neuroscience has entered a golden age in which experimental technologies now allow us to record thousands of neurons, over many trials during complex behaviors, yielding large-scale, high dimensional datasets. However, while we can record thousands of neurons, mammalian circuits controlling complex behaviors can contain tens of millions of behaviorally relevant neurons. Thus, despite significant experimental advances, neuroscience remains in a vastly undersampled measurement regime. Nevertheless, a wide array of statistical procedures for dimensionality reduction of multineuronal recordings uncover remarkably insightful, low dimensional neural state space dynamics whose geometry reveals how behavior and cognition emerge from neural circuits. What theoretical principles explain this remarkable success; in essence, how is it that we can understand anything about the brain while recording an infinitesimal fraction of its degrees of freedom?

We present a theory that addresses this question, and test it using neural data recorded from monkeys performing reaching movements. Overall, this theory yields a picture of the neural measurement process as a random projection of neural dynamics, conceptual insights into how we can reliably recover neural state space dynamics in such under-sampled measurement regimes, and quantitative guidelines for the design of future experiments. Moreover, it reveals the existence of phase transition boundaries in our ability to successfully decode cognition and behavior on single trials as a function of the number of recorded neurons, the complexity of the task, and the smoothness of neural dynamics. We will also discuss non-negative tensor analysis methods that perform multi-timescale dimensionality reduction and demixing of neural dynamics, revealing how rapid neural dynamics within single trials mediate perception, cognition, and action, and how slow changes in these dynamics mediate learning.
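
As a rough illustration of the random-projection picture (the sketch below is ours, with purely illustrative sizes, not the talk's actual analysis), one can embed smooth low-dimensional latent dynamics in a large population, "record" a small random subset of neurons, and check that PCA on the undersampled recording still recovers the low-dimensional geometry:

```python
# Minimal NumPy sketch: recording M of N neurons as a projection of
# low-dimensional latent dynamics. All sizes are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)
N, M, K, T = 10_000, 100, 3, 500   # total neurons, recorded neurons, latent dims, time steps

# Smooth K-dimensional latent trajectory (random sinusoidal components).
t = np.linspace(0, 1, T)
latent = np.stack([np.sin(2 * np.pi * (k + 1) * t + rng.uniform(0, 2 * np.pi))
                   for k in range(K)])                    # (K, T)

# Embed into the full N-neuron population, then record M random neurons.
embedding = rng.standard_normal((N, K)) / np.sqrt(K)
population = embedding @ latent                           # (N, T) full circuit activity
recorded = population[rng.choice(N, M, replace=False)]    # (M, T) what we measure

# PCA on the undersampled recording still finds K-dimensional structure.
_, s, _ = np.linalg.svd(recorded - recorded.mean(axis=1, keepdims=True),
                        full_matrices=False)
print("variance captured by top K components:",
      (s[:K] ** 2).sum() / (s ** 2).sum())                # close to 1.0
```

Because the recorded submatrix is (up to noise, omitted here) still a linear image of the same K-dimensional trajectory, almost all of its variance lies in K components, which is the essence of why heavily undersampled recordings remain interpretable.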

Date and Time: 
Wednesday, May 2, 2018 - 4:30pm
Venue: 
Gates B03

EE380 Computer Systems Colloquium: The End of Privacy

Topic: 
The End of Privacy
Abstract / Description: 

A growing proportion of human activities such as social interactions, entertainment, shopping, and gathering information are now mediated by digital devices and services. Such digitally mediated activities can be easily recorded, offering an unprecedented opportunity to study and measure intimate psycho-demographic traits using actual--rather than self-reported--behavior. Our research shows that digital records of behavior, such as samples of text, Tweets, Facebook Likes, web-browsing logs, or even facial images can be used to accurately measure a wide range of traits including personality, intelligence, and political views. Such Big Data assessment has a number of advantages: it does not require participants' active involvement; it can be easily and inexpensively applied to large populations; and it is relatively immune to cheating or misrepresentation. If used ethically, it could revolutionize psychological assessment, marketing, recruitment, insurance, and many other industries. In the wrong hands, however, such methods pose significant privacy risks. In this talk, we will discuss how to reap the benefits of Big Data assessment while avoiding the pitfalls.
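
As a minimal sketch of the kind of pipeline such studies use (the data and model below are synthetic stand-ins, not the speaker's datasets), one can reduce a sparse user-by-Like matrix with SVD and fit a linear model from the components to a trait score:

```python
# Toy trait-prediction pipeline: SVD of a binary Like matrix + ridge regression.
# Users, Likes, and the "trait" are all simulated for illustration.
import numpy as np
from sklearn.decomposition import TruncatedSVD
from sklearn.linear_model import RidgeCV
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n_users, n_likes, n_factors = 1_000, 2_000, 5

# Synthetic users with latent dispositions driving both Likes and a trait.
user_factors = rng.normal(size=(n_users, n_factors))
item_loadings = rng.normal(size=(n_factors, n_likes))
p_like = 1 / (1 + np.exp(3 - user_factors @ item_loadings))   # sparse Like probabilities
likes = (rng.random((n_users, n_likes)) < p_like).astype(float)
trait = user_factors[:, 0] + rng.normal(0, 0.5, n_users)      # e.g. an "openness" score

# Reduce the Like matrix, then map components to the trait.
components = TruncatedSVD(n_components=20, random_state=0).fit_transform(likes)
X_tr, X_te, y_tr, y_te = train_test_split(components, trait, random_state=0)
print("held-out R^2:", RidgeCV().fit(X_tr, y_tr).score(X_te, y_te))
```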

Date and Time: 
Wednesday, April 11, 2018 - 4:30pm
Venue: 
Gates B03

EE380 Computer Systems Colloquium: The Future of Wireless Communications Hint: It's not a linear amplifier

Topic: 
The Future of Wireless Communications Hint: It's not a linear amplifier
Abstract / Description: 

Wireless communications are ubiquitous in the 21st century--we use them to read the newspaper, talk to our colleagues or children, watch sporting events and other entertainment, and monitor and control the environments we live in, to name just a few uses. This explosion of demand for wireless capacity has driven a new era of innovation, because spectrum and energy are expensive and constrained resources.

The future of wireless communications will demand leaps in spectrum efficiency, bandwidth efficiency, and power efficiency for successful technology deployments. Key applications that will fundamentally change how we interact with wireless systems and the demands we place on wireless technologies include Dynamic Spectrum Access Networks, massive MIMO, and the elusive unicorn of the "universal handset". While each of these breakthrough "system" capabilities makes simultaneous demands on spectrum efficiency, bandwidth efficiency, and power efficiency, the current suite of legacy technologies forces system designers into undesirable trade-offs because of the limitations of linear amplifier technology.

Eridan's solution is the antithesis of "linear". The Switch Mode Mixer Modulator (SM3) technology emphasizes precision and flexibility, and simultaneously delivers spectrum efficiency, bandwidth efficiency, and power efficiency. The resulting capabilities dramatically increase total wireless capacity with minimal need to expand operations into extended regions of the wireless spectrum.

This presentation will discuss the driving forces behind wireless system performance, the physics of linear amplifiers and SM3, measured performance of SM3 systems, and the implications for wireless system capabilities in the near future.
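
To make the contrast with linear amplification concrete, here is a minimal sketch, assuming nothing about Eridan's actual SM3 circuit, of the polar (envelope/phase) decomposition that switch-mode transmit architectures generally exploit:

```python
# Polar view of an I/Q baseband signal: a constant-envelope switching stage
# can carry the phase while the amplitude is restored separately.
# All rates and the toy modulation are illustrative assumptions.
import numpy as np

fs, f_sym = 1e6, 50e3                      # sample rate, symbol rate
t = np.arange(0, 1e-3, 1 / fs)
rng = np.random.default_rng(0)

# Toy QPSK-like baseband: random I/Q levels held for each symbol period.
n_sym = int(len(t) * f_sym / fs)
symbols = rng.choice([-1.0, 1.0], size=(2, n_sym))
i = np.repeat(symbols[0], len(t) // n_sym)[: len(t)]
q = np.repeat(symbols[1], len(t) // n_sym)[: len(t)]

envelope = np.hypot(i, q)                  # amplitude path (supply / mixer side)
phase = np.unwrap(np.arctan2(q, i))        # phase path (drives the switching stage)

# Recombining the two paths recovers the original I/Q signal exactly.
assert np.allclose(envelope * np.cos(phase), i)
assert np.allclose(envelope * np.sin(phase), q)
```

The point of the decomposition is that the phase path has constant envelope, which is exactly what a switching stage handles efficiently, while amplitude is reintroduced elsewhere in the chain.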

Date and Time: 
Wednesday, May 16, 2018 - 4:30pm
Venue: 
Gates B03

EE380 Computer Systems Colloquium: Cryptocurrencies

Topic: 
Cryptocurrencies
Abstract / Description: 

I will give an introduction to blockchain technology, the current state of the industry and its challenges, and the Einsteinium Foundation, which is embarking on a truly ambitious path likely to change how cryptocurrency is viewed and used in everyday life. Scientific research is a long-term investment in our future and the future of our planet, yet funding for "big ideas" has fallen dramatically around the world in recent years. The defining characteristic of Einsteinium is its ongoing commitment to research and charitable missions. The Einsteinium coin is a Bitcoin-like currency with the philanthropic objective of funding scientific, cutting-edge IT and cryptocurrency projects.
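
Since the talk opens with an introduction to blockchain technology, a minimal sketch of the hash-chain idea underlying Bitcoin-like coins such as Einsteinium may help (a bare illustration; real chains add proof-of-work, Merkle trees, and peer-to-peer consensus):

```python
# Each block commits to its predecessor's hash, so altering history
# invalidates every later block.
import hashlib, json

def block_hash(block: dict) -> str:
    return hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()

chain = [{"index": 0, "prev": "0" * 64, "txs": ["genesis"]}]
for i, txs in enumerate([["alice->bob 5"], ["bob->carol 2"]], start=1):
    chain.append({"index": i, "prev": block_hash(chain[-1]), "txs": txs})

# Tampering with an early block breaks every link that follows it.
chain[1]["txs"] = ["alice->bob 500"]
valid = all(chain[i]["prev"] == block_hash(chain[i - 1]) for i in range(1, len(chain)))
print("chain valid after tampering:", valid)   # False
```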

Date and Time: 
Wednesday, May 9, 2018 - 4:30pm
Venue: 
Gates B03

EE380 Computer Systems Colloquium: Computer Accessibility

Topic: 
Exploring the implications of machine learning for people with cognitive disabilities
Abstract / Description: 

Advances in information technology have provided many benefits for people with disabilities, including wide availability of textual content via text-to-speech, flexible control of motorized wheelchairs, captioned video, and much more. People with cognitive disabilities benefit from easier communication and better tools for scheduling and reminders. Will advances in machine learning enhance this impact? Progress in natural language processing, autonomous vehicles, and emotion detection, all driven by machine learning, may deliver important benefits soon. Further out, can we look for systems that help people with cognitive challenges understand our complex world more easily, work more effectively, stay safe, and interact more comfortably in social situations? What are the technical barriers to overcome in pursuing these goals, and what theoretical developments in machine learning may overcome them?

Date and Time: 
Wednesday, April 18, 2018 - 4:30pm
Venue: 
Gates B03

EE380 Computer Systems Colloquium: Information Theory of Deep Learning

Topic: 
Information Theory of Deep Learning
Abstract / Description: 

I will present a novel comprehensive theory of large scale learning with Deep Neural Networks, based on the correspondence between Deep Learning and the Information Bottleneck framework. The new theory has the following components:

  1. Rethinking learning theory: I will prove a new generalization bound, the input-compression bound, which shows that compression of the representation of the input variable is far more important for good generalization than the dimension of the network's hypothesis class, an ill-defined notion for deep learning.
  2. I will prove that for large-scale Deep Neural Networks, the mutual information of the last hidden layer with the input and output variables provides a complete characterization of the sample complexity and accuracy of the network. This makes the Information Bottleneck bound the optimal trade-off between sample complexity and accuracy achievable with ANY learning algorithm (a toy estimate of these information quantities is sketched after this list).
  3. I will show how Stochastic Gradient Descent, as used in Deep Learning, achieves this optimal bound. In that sense, Deep Learning is a method for solving the Information Bottleneck problem for large-scale supervised learning problems. The theory provides a new computational understanding of the benefit of the hidden layers, and gives concrete predictions for the structure of the layers of Deep Neural Networks and their design principles. These turn out to depend solely on the joint distribution of the input and output and on the sample size.
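
A toy, binning-based sketch of those information-plane quantities, I(X;T) and I(T;Y) for a single hidden layer T, in the spirit of the cited works; the "network" here is a random projection, and the sizes and bin counts are illustrative assumptions:

```python
# Estimate I(X;T) and I(T;Y) by discretizing activations and counting rows.
import numpy as np

rng = np.random.default_rng(0)
n, d, h = 4_000, 12, 6
X = rng.choice([-1.0, 1.0], size=(n, d))                 # discrete inputs
Y = (X[:, :3].sum(axis=1) > 0).astype(int).reshape(-1, 1)  # binary label
T = np.tanh(X @ rng.normal(size=(d, h)))                 # one hidden layer (random weights)
T_bins = np.digitize(T, np.linspace(-1, 1, 10))          # discretize activations

def entropy(rows):
    """Entropy (bits) of the empirical distribution over the rows of `rows`."""
    _, counts = np.unique(rows, axis=0, return_counts=True)
    p = counts / counts.sum()
    return -(p * np.log2(p)).sum()

def mutual_info(a, b):
    return entropy(a) + entropy(b) - entropy(np.hstack([a, b]))

print(f"I(X;T) ~ {mutual_info(X, T_bins):.2f} bits")     # compression of the input
print(f"I(T;Y) ~ {mutual_info(T_bins, Y):.2f} bits")     # information about the label
```

With a real network one recomputes these two numbers per layer and per training epoch to trace the trajectory in the information plane.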

Based partly on works with Ravid Shwartz-Ziv and Noga Zaslavsky.

Date and Time: 
Wednesday, April 4, 2018 - 4:30pm
Venue: 
Gates B03

EE380 Computer Systems Colloquium: News Diffusion - fighting misinformation

Topic: 
News Diffusion: Scoring (automatically) news articles to fight misinformation
Abstract / Description: 

Deepnews.ai wants to make a decisive contribution to the sustainability of the journalistic information ecosystem by addressing two problems:
1. The lack of correlation between the cost of producing great editorial content and its economic value.
2. The vast untapped potential for news editorial products.

Deepnews.ai will have a simple and accessible scoring system: the online platform receives a batch of news stories and scores each on a scale of 1 to 5 based on its journalistic quality. This is done automatically and in real time. This scoring system has multiple applications.

On the business side, the greatest potential is the ability to adjust the price of an advertisement to the quality of the editorial context. There is room for improvement. Today, a story that required months of work and cost hundreds of thousands of dollars carries the same unitary value (a few dollars per thousand page views) as a short, gossipy article. But times are changing. In the digital ad business, indicators are blinking red: CPMs, click-through rates, and viewability are in steady decline. We believe that inevitably, advertisers and marketers will seek high-quality content--as long as they can rely on a credible indicator of quality. Deepnews.ai will interface with ad servers to assess the value of a story and price and serve ads accordingly. The higher a story's quality score, the pricier the ad space adjacent to it can be. This adjustment will substantially raise the revenue per page to match the quality of news.
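
As a purely hypothetical illustration of that ad-server interface (Deepnews.ai has not published a pricing rule; the multipliers below are invented), the adjustment could be as simple as scaling a base CPM by the article's score:

```python
# Hypothetical quality-to-price mapping; the multipliers are assumptions,
# not Deepnews.ai's published pricing rule.
def adjusted_cpm(base_cpm: float, quality_score: int) -> float:
    """Scale a base CPM by an article's 1-5 quality score (1 = commodity)."""
    multipliers = {1: 0.5, 2: 0.8, 3: 1.0, 4: 1.5, 5: 2.5}
    return base_cpm * multipliers[quality_score]

print(adjusted_cpm(base_cpm=2.00, quality_score=5))   # 5.0 dollars per 1,000 views
```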

On the editorial side: The ability to assess the quality of news will open up opportunities for new products and services such as:
• Recommendation engine improvement: instead of relying on keywords or frequency, Deepnews.ai will surface stories based on substantive quality, which will increase the number of articles read per visit. (Currently, visitors to many news sites read fewer than two articles per visit.)
• Personalization: We believe a reader's profile should not be limited to consumption analytics but should reflect his or her editorial preferences. Deepnews.ai is considering a dedicated "tag" which will be able to connect stories' metadata with a reader's affinity.
• Curation: Publishers will be able to use Deepnews.ai to offer curation services, a business currently left to players like Google and Apple. By providing technology that can automatically surface the best stories from trusted websites (even small ones), Deepnews.ai can help publishers expand their footprint.

The platform will be based on two ML approaches: a feature-based model and a text-content analytic model.

Using traditional ML methods, the first model takes as input two sets of "signals" to assess the quality of journalistic work: Quantifiable Signals and Subjective Signals. Quantifiable Signals include the structure and patterns of the HTML page, advertising density, use of visual elements, bylines, word count, readability of the text, and information density (number of quotes and named entities); these are data processed directly from the news content. Subjective Signals are human scores of quality based on criteria such as writing style, thoroughness, balance and fairness, and timeliness; these measures are produced by editors and experienced journalists.
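
A minimal sketch of such a feature-based model, with invented signal extractors and a toy training set (the real system's features and labels are far richer):

```python
# Extract a few "Quantifiable Signals" and fit a standard regressor against
# editor-assigned 1-5 scores. Features and data are illustrative assumptions.
import re
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor

def quantifiable_signals(html: str, text: str) -> list[float]:
    return [
        len(text.split()),                              # word count
        html.count("<img"),                             # visual elements
        html.count("<script"),                          # crude advertising density
        len(re.findall(r'"[^"]{20,}"', text)),          # long quotes (info density)
        1.0 if re.search(r"(?i)^by ", text) else 0.0,   # byline present
    ]

# Toy training set: short "articles" with editor scores.
articles = []
for n_words, n_quotes in [(1200, 6), (300, 0), (2000, 9), (150, 0)]:
    text = ("By A. Reporter. " + "word " * n_words
            + '"a long direct quotation from a named source" ' * n_quotes)
    articles.append(("<html><p>" + text + "</p></html>", text))
scores = [4.5, 2.0, 5.0, 1.5]

X = np.array([quantifiable_signals(h, t) for h, t in articles])
model = GradientBoostingRegressor().fit(X, scores)
print(model.predict(X[:1]))   # predicted quality score for the first article
```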

The second approach is based on deep learning methods. Here, the goal is to build models that can accurately classify an unseen incoming article purely on the quality of the reporting, distinct from the metadata or the topic of discussion. The main challenge in many such deep learning approaches is the availability of labeled data. Nearly four million contemporary articles have been processed. They come from sources deemed either "good" or "commodity" (with no journalistic value added). For the bulk of our data, the reputation and consistency of the news brand carried significant weight, but the objective is also to classify quality at a finer-grained level, detached from the name of the source. To this end, various models are used to capture differences in writing that are agnostic to topical differences.
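
As a hedged illustration of this second approach (the architecture, tokenization, and data below are invented stand-ins, not Deepnews.ai's model), a minimal deep text classifier over token ids might look like:

```python
# Tiny PyTorch "good vs commodity" classifier: average-pooled word embeddings
# with a linear head, trained on fake token ids and labels for illustration.
import numpy as np
import torch
from torch import nn

class QualityClassifier(nn.Module):
    def __init__(self, vocab: int, dim: int = 64):
        super().__init__()
        self.embed = nn.Embedding(vocab, dim)
        self.head = nn.Linear(dim, 1)

    def forward(self, tokens):                       # tokens: (batch, seq) int ids
        pooled = self.embed(tokens).mean(dim=1)      # average-pool word vectors
        return self.head(pooled).squeeze(-1)         # logit: "good" vs "commodity"

rng = np.random.default_rng(0)
vocab, seq = 5_000, 400
x = torch.tensor(rng.integers(1, vocab, size=(64, seq)))               # fake token ids
y = torch.tensor(rng.integers(0, 2, size=(64,)), dtype=torch.float32)  # fake labels

model = QualityClassifier(vocab)
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.BCEWithLogitsLoss()
for _ in range(5):                                   # a few illustrative steps
    opt.zero_grad()
    loss = loss_fn(model(x), y)
    loss.backward()
    opt.step()
print("training loss:", loss.item())
```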

Date and Time: 
Wednesday, March 14, 2018 - 4:30pm
Venue: 
Gates B03

EE380 Computer Systems Colloquium: The Evolution of Public Key Cryptography

Topic: 
The Evolution of Public Key Cryptography
Abstract / Description: 

While public key cryptography is seen as revolutionary, after this talk you might wonder why it took Whit Diffie, Ralph Merkle, and Martin Hellman so long to discover it. This talk also highlights the contributions of some unsung (or "under-sung") heroes: Ralph Merkle, John Gill, Stephen Pohlig, Richard Schroeppel, Loren Kohnfelder, and researchers at GCHQ (Ellis, Cocks, and Williamson).

Date and Time: 
Wednesday, February 28, 2018 - 4:30pm
Venue: 
Gates B03
