Statistics and Probability Seminars

Statistics Seminar: Statistical analysis in the study of fine art paintings and drawings

Topic: 
Statistical analysis in the study of fine art paintings and drawings
Abstract / Description: 

Held in person or online: Zoom details for online events will be sent to the subscribed list.


Computer vision and statistics, including deep networks, have led to numerous successes in the analysis of natural photographs, medical images, remotely sensed images, and other images that conform to the optics, physics, and statistics of natural scenes.  Fine art paintings and drawings, however, differ in important ways from such images: they can be highly stylized; depict non-existent scenes, objects, or even no objects at all; violate traditional physical constraints on images, such as perspective; express an artist's intention or meaning; and are far fewer in number than the photographs used to train traditional image analysis algorithms.  For these and additional reasons, the computational analysis of fine art requires modification of prior analysis techniques, and even entirely new technical approaches.  As such, fine art presents a grand challenge to artificial intelligence research.

This talk will present some of the recent successes of rigorous statistical and computer vision methods applied to problems in the history and interpretation of fine art paintings and drawings, including the exposure of fakes and forgeries.  It will also discuss initial steps toward using statistical methods in the analysis of both images and associated text to compute simple interpretations of the meanings of some artworks.

This profusely illustrated talk will end with a list of several open problems in statistical image analysis of our cultural patrimony, including some of the most important images ever created.

Date and Time: 
Tuesday, September 21, 2021 - 4:30pm

Probability Seminar: Local limits for permutations and generating trees

Topic: 
Local limits for permutations and generating trees
Abstract / Description: 

For large combinatorial structures, two main notions of convergence can be defined: scaling limits and local limits. For graphs, in particular, both notions are well studied and well understood. For permutations, only a notion of scaling limit, the permuton, has been investigated over the last decade.

In the first part of the talk, we introduce a new notion of local convergence for permutations and prove several characterizations of it in terms of the proportions of consecutive pattern occurrences.
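
For context, a consecutive occurrence of a pattern pi of length k in a permutation sigma is a window of k adjacent entries of sigma whose relative order matches pi; the proportions in question are the fractions of windows realizing each pattern. A minimal Python sketch of this statistic (written for this listing, not code from the talk):

    def pattern_of(window):
        """Relative order of a sequence, as a tuple of ranks 1..k."""
        by_value = sorted(range(len(window)), key=lambda i: window[i])
        ranks = [0] * len(window)
        for rank, i in enumerate(by_value, start=1):
            ranks[i] = rank
        return tuple(ranks)

    def consecutive_pattern_proportion(sigma, pi):
        """Fraction of length-len(pi) windows of sigma order-isomorphic to pi."""
        k = len(pi)
        windows = [sigma[i:i + k] for i in range(len(sigma) - k + 1)]
        return sum(pattern_of(w) == tuple(pi) for w in windows) / len(windows)

    # Two of the four length-2 windows of 31254 are ascents (pattern 12):
    print(consecutive_pattern_proportion((3, 1, 2, 5, 4), (1, 2)))  # 0.5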

In the second part of the talk, we investigate a new method to establish local limits for pattern-avoiding permutations using generating trees. The theory of generating trees has been widely used to enumerate families of combinatorial objects, in particular permutations. The goal of this talk is to introduce a new facet of generating trees encoding families of permutations, in order to establish probabilistic results instead of enumerative ones.
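
As a concrete illustration of the generating-tree idea (a minimal sketch prepared for this listing, not the construction from the talk): a pattern-avoiding class is closed under removing the maximum entry, so every avoider of size n + 1 arises uniquely from an avoider of size n by inserting n + 1 at an "active site". Growing the tree level by level recovers the enumeration; for 132-avoiding permutations, the level sizes are the Catalan numbers.

    from itertools import combinations

    def contains(sigma, pi):
        """Does sigma contain pi as a classical pattern?"""
        k = len(pi)
        return any(
            all((sub[a] < sub[b]) == (pi[a] < pi[b])
                for a in range(k) for b in range(a + 1, k))
            for sub in combinations(sigma, k))

    def children(sigma, pi):
        """Children of sigma in the generating tree of pi-avoiders: insert
        the new maximum at every position that keeps the pattern avoided."""
        n = len(sigma)
        for pos in range(n + 1):
            child = sigma[:pos] + (n + 1,) + sigma[pos:]
            if not contains(child, pi):
                yield child

    # Level sizes of the 132-avoiding tree: 2, 5, 14, 42 (Catalan numbers).
    level = [(1,)]
    for _ in range(4):
        level = [c for sigma in level for c in children(sigma, (1, 3, 2))]
        print(len(level))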

Date and Time: 
Monday, September 20, 2021 - 2:00pm
Venue: 
Sequoia Hall Room 200

Statistics Seminar: How much data is sufficient to learn high-performing algorithms?

Topic: 
How much data is sufficient to learn high-performing algorithms?
Abstract / Description: 

Algorithms often have tunable parameters that have a considerable impact on their runtime and solution quality, and a growing body of research has demonstrated that data-driven algorithm design can lead to significant gains on both fronts. Data-driven algorithm design uses a training set of problem instances sampled from an unknown, application-specific distribution and returns a parameter setting with strong average performance on the training set. We provide a broadly applicable theory for deriving generalization guarantees for data-driven algorithm design, which bound the difference between the algorithm's expected performance and its average performance over the training set.
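
In its simplest form, data-driven algorithm design is empirical risk minimization over the parameter space: sample training instances, score each candidate parameter by its average performance, and return the best. A schematic Python sketch (all names hypothetical, not an API from the work described):

    def tune_parameter(candidates, sample_instance, performance, m=1000):
        """Empirical risk minimization over a parameter grid: return the
        candidate with the best average performance on m training
        instances drawn from the unknown application distribution."""
        training_set = [sample_instance() for _ in range(m)]

        def average_performance(rho):
            return sum(performance(rho, x) for x in training_set) / m

        return max(candidates, key=average_performance)

The generalization guarantees then bound, uniformly over the candidates, how far this empirical average can drift from the true expected performance.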

The challenge is that for many combinatorial algorithms, performance is a volatile function of the parameters: slightly perturbing the parameters can cause a cascade of changes in the algorithm's behavior. Prior research has proved generalization bounds through case-by-case analyses of parameterized greedy algorithms, clustering algorithms, integer programming algorithms, and selling mechanisms. We uncover a unifying structure that we use to prove very general guarantees, which nonetheless recover the bounds from prior research. Our guarantees apply whenever an algorithm's performance is a piecewise-constant, piecewise-linear, or, more generally, piecewise-structured function of its parameters. As we demonstrate, our theory also implies novel bounds for dynamic programming algorithms used in computational biology and for voting mechanisms from economics.
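
To make the piecewise structure concrete, consider a parameterized greedy heuristic for knapsack that scores item i by value_i / weight_i^rho (a scoring rule studied in this line of work). On a fixed instance, the induced item ordering, and hence the heuristic's output, changes only at finitely many breakpoints in rho, so performance is a piecewise-constant function of the parameter. A toy illustration with a hypothetical instance:

    def greedy_knapsack_value(rho, items, capacity):
        """Take items greedily in decreasing value / weight**rho order.
        On a fixed instance this is piecewise constant in rho: the sort
        order, and thus the output, changes at finitely many breakpoints."""
        order = sorted(items, key=lambda vw: vw[0] / vw[1] ** rho, reverse=True)
        value = weight = 0.0
        for v, w in order:
            if weight + w <= capacity:
                value, weight = value + v, weight + w
        return value

    items = [(10, 8), (6, 4), (5, 3), (1, 1)]  # hypothetical (value, weight) pairs
    for rho in (0.0, 0.5, 1.0, 2.0):
        print(rho, greedy_knapsack_value(rho, items, capacity=10))

The returned value jumps as rho crosses a breakpoint and is constant in between, which is exactly the structure the guarantees exploit.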

This talk is based on joint work with Nina Balcan, Dan DeBlasio, Travis Dick, Carl Kingsford, and Tuomas Sandholm.

Date and Time: 
Tuesday, July 20, 2021 - 4:30pm

Statistics Department Seminar: High-dimensional and nonparametric estimation under 'local' information constraints

Topic: 
High-dimensional and nonparametric estimation under 'local' information constraints
Abstract / Description: 

Details unavailable at time of publishing. Please check Statistics pages for updated information.

Date and Time: 
Tuesday, June 29, 2021 - 4:30pm
