MSRI-UP 2018 Colloquia

MSRI-UP 2018: The Mathematics of Data Science


Date: Friday, June 22nd, 2018

Speaker: Dr. Antonio Montalban (University of California, Berkeley)

"Abstract Infinite Games"

It is known that for some abstract infinite games, one needs to use large uncountable objects to build winning strategies. The question we will look at is: "How far can one go without the need for uncountable objects?"
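As a point of reference, here is the standard setup of such games (a sketch, not taken from the talk). Two players, I and II, alternately choose natural numbers,

\[ \mathrm{I}:\; n_0,\ n_2,\ n_4,\dots \qquad \mathrm{II}:\; n_1,\ n_3,\ n_5,\dots \]

producing an infinite sequence \(x = (n_0, n_1, n_2, \dots) \in \mathbb{N}^{\mathbb{N}}\). Player I wins the run if \(x\) lies in a fixed payoff set \(A \subseteq \mathbb{N}^{\mathbb{N}}\); otherwise II wins. A strategy for a player is a rule assigning a next move to every position at which it is that player's turn, and it is a winning strategy if every run played according to it is a win for that player. The game is determined if one of the two players has a winning strategy.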


Date: Friday, June 29th, 2018

Speaker: Dr. Johnny Guzman (Brown University)

"Polynomial Exact Sequences in Numerical Analysis"

A basic result in calculus is that if a smooth vector field is curl free, then it is the gradient of a scalar function. What if the vector field has polynomial components? Could the corresponding scalar function be polynomial? What if the vector field is piecewise polynomial? Could the scalar function be chosen as a piecewise polynomial?

We will discuss the answers to these questions and some generalizations. This is the basis of finite element exterior calculus: an approach to approximating the Hodge Laplacian and other related partial differential equations.
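For orientation, here is the smooth statement being generalized and its polynomial analogue (a sketch, not taken from the talk). On a contractible domain \(\Omega \subset \mathbb{R}^3\), the sequence

\[ \mathbb{R} \hookrightarrow C^\infty(\Omega) \xrightarrow{\operatorname{grad}} C^\infty(\Omega;\mathbb{R}^3) \xrightarrow{\operatorname{curl}} C^\infty(\Omega;\mathbb{R}^3) \xrightarrow{\operatorname{div}} C^\infty(\Omega) \to 0 \]

is exact: curl-free fields are gradients, and divergence-free fields are curls. Restricting to polynomials of bounded degree gives

\[ \mathbb{R} \hookrightarrow \mathcal{P}_r \xrightarrow{\operatorname{grad}} (\mathcal{P}_{r-1})^3 \xrightarrow{\operatorname{curl}} (\mathcal{P}_{r-2})^3 \xrightarrow{\operatorname{div}} \mathcal{P}_{r-3} \to 0, \]

which is again exact, so a curl-free polynomial vector field is the gradient of a polynomial of one degree higher. Finite element exterior calculus builds piecewise-polynomial versions of such sequences on meshes.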

Johnny Guzman is an Associate Professor of Applied Mathematics at Brown University. Before joining Brown, he was an NSF postdoc at the University of Minnesota, where his postdoc mentor was Bernardo Cockburn. He earned his Ph.D. at Cornell University under the guidance of Lars Wahlbin. He attended California State University-Long Beach, near his hometown, as an undergraduate.


Date: Tuesday, July 3rd, 2018

Speaker: Dr. Edray Goins (Pomona College)

"Yes, Even You Can Bend It Like Beckham"

In Gurinder Chadha's 2002 film, the character Jesminder ‘Jess’ Bhamra states, “No one can cross a ball or bend it like Beckham,” referring to the international soccer star's ability to make the ball swerve. In 2010, French researchers Guillaume Dupeux, Anne Le Goff, David Quéré, and Christophe Clanet published a paper in the New Journal of Physics detailing both experimental and mathematical analyses of a spinning ball in a fluid, showing that it must follow a spiral.

In this talk, we give an overview of their discussion by reviewing the Navier-Stokes equation in a Serret-Frenet coordinate system. This talk is dedicated to the memory of Angela Grant and her love of mathematics in sports.
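For reference, the two ingredients named above are the incompressible Navier-Stokes equations and the Serret-Frenet frame along the ball's trajectory (recorded here for orientation, not quoted from the paper under discussion):

\[ \rho\Big(\frac{\partial \mathbf{u}}{\partial t} + (\mathbf{u}\cdot\nabla)\mathbf{u}\Big) = -\nabla p + \mu\,\Delta\mathbf{u}, \qquad \nabla\cdot\mathbf{u} = 0, \]

and, writing \(\mathbf{T}, \mathbf{N}, \mathbf{B}\) for the unit tangent, normal, and binormal along the trajectory, \(s\) for arc length, and \(\kappa, \tau\) for curvature and torsion,

\[ \frac{d\mathbf{T}}{ds} = \kappa\,\mathbf{N}, \qquad \frac{d\mathbf{N}}{ds} = -\kappa\,\mathbf{T} + \tau\,\mathbf{B}, \qquad \frac{d\mathbf{B}}{ds} = -\tau\,\mathbf{N}. \]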


Date: Friday, July 6th, 2018

Speaker: Dr. Terrence Blackman (Medgar Evers College, CUNY)

"Mathematics, mathematicians, mathematics education, excellence and equity"

Terrence Richard Blackman is Dean of the School of Science, Health and Technology at Medgar Evers College, CUNY, where he is an Associate Professor and the former Chair of the Department of Mathematics. A former Dr. Martin Luther King Jr. Assistant Professor in the Department of Mathematics at the Massachusetts Institute of Technology (MIT), he has also served as an Assistant Professor in the Department of Education Research Policy and Practice in the Morgridge College of Education at the University of Denver.

In this talk he shared his journey in the world of professional mathematics and mathematics education with a view to motivating and inspiring the next generation of mathematicians and scientists from underrepresented communities.


Date: Friday, July 13th, 2018

Speaker: Dr. Yannet Interian (University of San Francisco)

"Introduction to Deep Learning"

Deep learning is the state-of-the-art machine learning technique in areas such as object recognition, image segmentation, speech recognition and machine translation. In this lecture, Interian introduces problems that are solved using deep learning and discusses some of the core concepts underlying this technology.
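As a concrete illustration of those core concepts (layers, nonlinearities, a loss function, and gradient-based training), here is a minimal sketch of a network with one hidden layer, written with NumPy and trained by gradient descent on a toy classification task. It is an illustration only, not material from the lecture.

# A minimal sketch (not from the talk): a small neural network trained with
# full-batch gradient descent on a toy binary-classification task, using NumPy.
import numpy as np

rng = np.random.default_rng(0)

# Toy data: label is 1 when the two inputs have the same sign (XOR-like task).
X = rng.normal(size=(200, 2))
y = ((X[:, 0] * X[:, 1]) > 0).astype(float).reshape(-1, 1)

# Parameters of a network with one hidden layer of 8 ReLU units.
W1 = rng.normal(scale=0.5, size=(2, 8))
b1 = np.zeros((1, 8))
W2 = rng.normal(scale=0.5, size=(8, 1))
b2 = np.zeros((1, 1))

lr = 0.1
for step in range(2000):
    # Forward pass.
    h = np.maximum(0.0, X @ W1 + b1)          # ReLU hidden layer
    p = 1.0 / (1.0 + np.exp(-(h @ W2 + b2)))  # sigmoid output probability

    # Backward pass for the cross-entropy loss (chain rule by hand).
    dlogits = (p - y) / len(X)
    dW2 = h.T @ dlogits
    db2 = dlogits.sum(axis=0, keepdims=True)
    dh = dlogits @ W2.T
    dh[h <= 0] = 0.0                          # ReLU derivative
    dW1 = X.T @ dh
    db1 = dh.sum(axis=0, keepdims=True)

    # Gradient-descent update.
    W1 -= lr * dW1; b1 -= lr * db1
    W2 -= lr * dW2; b2 -= lr * db2

print("training accuracy:", ((p > 0.5) == (y > 0.5)).mean())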


Date: Friday, July 20th, 2018

Speaker: Dr. Rebecca Garcia (Sam Houston State University)

"Gröbner Basis of Neural Ideals"

A major area in neuroscience is the study of how the brain processes spatial information. Neurons in the brain represent external stimuli via neural codes. These codes often arise from regions of space called receptive fields: each neuron fires at a high rate precisely when the animal is in the corresponding receptive field. Much research in this area has focused on understanding what features of receptive fields can be extracted directly from a neural code. In particular, Curto, Itskov, Veliz-Cuba, and Youngs recently introduced the concept of a neural ideal, a polynomial ideal. We will discuss the Gröbner bases of these neural ideals and the receptive field information we can glean from them. This is joint work with Luis David Garcia Puente, Ryan Kruse, Jessica Liu, Dane Miyata, Ethan Petersen, Kaitlyn Phillipson, and Anne Shiu.
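To fix ideas, here is a small illustration of these objects (our example, not drawn from the joint work). For the code C = {00, 10, 01} on two neurons, the only non-codeword is v = 11, whose indicator polynomial is \(\rho_{11} = x_1 x_2\), so the neural ideal is

\[ J_C = \langle x_1 x_2 \rangle \subseteq \mathbb{F}_2[x_1, x_2]. \]

The generator already encodes receptive field information: since the codeword 11 never occurs, in any realization of C by receptive fields \(U_1, U_2\) the intersection \(U_1 \cap U_2\) is empty. Here \(J_C\) is principal, so its single generator is itself a Gröbner basis; for larger codes, a reduced Gröbner basis provides a compact canonical generating set from which such relations can be read off.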