
Mathematical Sciences Research Institute


Workshop

Nonlinear Estimation and Classification March 19, 2001 - March 29, 2001
Registration Deadline: March 29, 2001
To apply for funding you must register by: December 19, 2000
Organizers David Denison, Mark Hansen, Chris Holmes, Robert Kohn, Bani Mallick, Martin Tanner and Bin Yu

Description
Nonlinear Estimation and Classification Schedule

For a tentative list of invited speakers and instructions on submitting a talk to the contributed portion of the program, please consult the main page of the conference: http://cm.bell-labs.com/who/cocteau/nec/index.html

Contributed presentations (talks and posters) are invited for original work related to the theme of the workshop. Submissions are to be in the form of an extended abstract of no more than 3 pages. Graduate students, new researchers, minorities and women are strongly encouraged to apply. Extended abstracts will be accepted in PostScript or PDF format and should be mailed to nec@research.bell-labs.com. When submitting papers, please also indicate whether travel funds are necessary for attendance. The deadline for extended abstracts is November 27, 2000. Notification of acceptance will be sent out on January 15, 2001.

Overview

Researchers in many disciplines face the formidable task of analyzing massive amounts of high-dimensional and highly structured data, due in part to recent advances in data collection and computing technologies. As a result, fundamental statistical research is being undertaken in a variety of different fields. Driven by the complexity of these new problems, and fueled by the explosion of available computing power, highly adaptive, nonlinear procedures are now essential components of modern "data analysis," a term that we interpret liberally to include speech and pattern recognition, classification, data compression and signal processing. The development of new, flexible methods combines advances from many sources, including approximation theory, numerical analysis, machine learning, signal processing and statistics. The proposed workshop intends to bring together eminent experts from these fields in order to exchange ideas and forge directions for the future.
It also intends to introduce the research topics to graduate students by providing travel support and by requiring the last speaker of each session to give an overview of the field.

A Brief Survey of the Area

Three ingredients are common to the new class of nonlinear methods:

1. Approximation spaces (wavelets, splines, neural networks, trees) or other simple probabilistic structures as building blocks for representations of statistical objects;
2. Algorithms for identifying the "best" representation or for combining several "promising" candidates; and
3. Statistical frameworks for judging between competing representations.

A key component in recent nonlinear modeling efforts is the design of efficient representations for (complex) objects such as curves, surfaces and images. Here, efficiency often relates to the number of descriptors (i.e., basis functions or general predictors). This definition has parallels with recent results in approximation theory on n-term expansions and best-basis algorithms. Similarly, in machine learning, the focus is on repeatedly applying weak learners: relatively simple representations like the sigmoid functions used in (artificial) neural networks or very shallow trees, known as stumps. New tools for image and texture analysis, on the other hand, draw their representations from the human visual system, where efficiency might be evaluated in biological terms. Finally, graphical and hidden Markov models represent a distribution involving complex dependencies by connecting simple (conjugate) distributions in a hierarchical manner.

In statistical problems, the search for the single "best" representation among a (potentially vast) number of competitors is really a problem in model selection. Applying these ideas in the context of approximation spaces, for example, has led to new wavelet shrinkage schemes and new methods for free-knot splines.
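The wavelet shrinkage schemes mentioned above rest on a simple operation: transform the data, shrink small coefficients toward zero, and invert the transform. As a rough illustration of the shrinkage step only (applied directly to a sparse sequence rather than to actual wavelet coefficients, and using the familiar universal threshold sigma * sqrt(2 log n); the function name is invented for this sketch, not a method from the workshop):

```python
import numpy as np

def soft_threshold(coeffs, lam):
    """Soft-threshold coefficients: shrink magnitudes by lam, zeroing the small ones."""
    return np.sign(coeffs) * np.maximum(np.abs(coeffs) - lam, 0.0)

# Toy example: a sparse signal observed in Gaussian noise.
rng = np.random.default_rng(0)
true = np.zeros(100)
true[[10, 40, 70]] = [5.0, -4.0, 6.0]
noisy = true + rng.normal(scale=0.5, size=100)

# Universal threshold lam = sigma * sqrt(2 log n).
lam = 0.5 * np.sqrt(2 * np.log(100))
denoised = soft_threshold(noisy, lam)
```

Nearly all of the pure-noise coordinates fall below the threshold and are set exactly to zero, while the three large spikes survive (slightly shrunk), which is the sense in which the representation is "efficient": few descriptors carry the signal.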
Various selection criteria have been suggested for these problems, each with differing strengths and weaknesses, but none dominating the others over a wide range of problems. Even with an agreed selection criterion, difficulties can occur in identifying the "best" representation, as it is often a single element in an extremely dense candidate set that cannot be searched exhaustively. Efficient optimization methods need to be devised to overcome this problem.

Rather than focusing on a single model, recent empirical and theoretical results have shown that we can achieve improved predictive performance by combining several competing representations. The Bayesian paradigm naturally leads to a framework for averaging models, and has been applied extensively in the context of approximation spaces. From a theoretical standpoint, these techniques depend quite strongly on the underlying prior assumptions, and understanding their performance is an area of active research. Other techniques for combining models have arisen in the machine learning literature, most notably boosting and bagging, and have proved extremely effective in practice.

When working with massive amounts of data, a test or hold-out data set has commonly been used for judging competing procedures. The question of which evaluation criterion or loss function to use still remains. Some loss functions are more sensible than others for a particular problem, although squared loss has been the most conventional. With current computing power, it is believed possible to use more sensible loss functions, for example the absolute loss for image processing. This, however, requires a concerted effort by the research community to deviate from the norm. The proposed workshop is an ideal place to reach such a consensus.

Organizing Committee

David Denison (Imperial College)
Mark Hansen (Bell Labs)
Chris Holmes (Imperial College)
Robert Kohn (Univ. of New South Wales)
Bani Mallick (Texas A&M)
Martin Tanner (Northwestern)
Bin Yu (UC Berkeley)

FUNDING APPLICATION DEADLINE: February 5, 2001

Funded in part by the National Security Agency and the Department of the Army Research Office.
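To make the model-combination idea from the survey concrete: bagging fits many copies of a simple representation (here, one-split regression stumps, the "weak learners" mentioned above) to bootstrap resamples of the data and averages their predictions. The sketch below is a generic illustration in Python with NumPy, not any specific method presented at the workshop; the function names are invented for this example:

```python
import numpy as np

def fit_stump(x, y):
    """Fit a one-split regression stump by minimizing total squared error."""
    order = np.argsort(x)
    xs, ys = x[order], y[order]
    best = None
    for i in range(1, len(xs)):
        left, right = ys[:i].mean(), ys[i:].mean()
        err = ((ys[:i] - left) ** 2).sum() + ((ys[i:] - right) ** 2).sum()
        if best is None or err < best[0]:
            best = (err, xs[i - 1], left, right)
    _, split, left, right = best
    return lambda t: np.where(t <= split, left, right)

def bag(x, y, n_models=50, rng=None):
    """Bagging: average stumps fit to bootstrap resamples of the data."""
    if rng is None:
        rng = np.random.default_rng(0)
    models = []
    for _ in range(n_models):
        idx = rng.integers(0, len(x), len(x))  # sample n points with replacement
        models.append(fit_stump(x[idx], y[idx]))
    return lambda t: np.mean([m(t) for m in models], axis=0)

# Toy data: a noisy step function, which a single stump fits crudely
# but an average of bootstrapped stumps fits more smoothly.
rng = np.random.default_rng(1)
x = np.linspace(0, 1, 200)
y = (x > 0.5).astype(float) + rng.normal(scale=0.3, size=200)
predict = bag(x, y, rng=rng)
```

Because each bootstrap sample places the split at a slightly different point, the averaged predictor varies smoothly near the jump instead of committing to one hard threshold, which is the variance-reduction effect that makes bagging effective in practice.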
Keywords and Mathematics Subject Classification (MSC)
Primary Mathematics Subject Classification No Primary AMS MSC
Secondary Mathematics Subject Classification No Secondary AMS MSC
Funding & Logistics

Funding

To apply for funding, you must register by the funding application deadline displayed above.

Students, recent Ph.D.'s, women, and members of underrepresented minorities are particularly encouraged to apply. Funding awards are typically made 6 weeks before the workshop begins. Requests received after the funding deadline are considered only if additional funds become available.

Lodging

MSRI does not hire an outside company to make hotel reservations for our workshop participants, or share the names and email addresses of our participants with an outside party. If you are contacted by a business that claims to represent MSRI and offers to book a hotel room for you, it is likely a scam. Please do not accept their services.

MSRI has preferred rates at the Hotel Shattuck Plaza, depending on room availability. Guests can call the hotel's main line at 510-845-7300 and ask for the MSRI (Mathematical Sciences Research Institute) discount. To book online, visit this page (the MSRI rate will be applied automatically).

MSRI has preferred rates at the Graduate Berkeley, depending on room availability. Reservations may be made by calling 510-845-8981. When making reservations, guests must request the MSRI preferred rate. Enter in the Promo Code MSRI123 (this code is not case sensitive).

MSRI has preferred rates at the Berkeley Lab Guest House, depending on room availability. Reservations may be made by calling 510-495-8000 or directly on their website. Select "Affiliated with the Space Sciences Lab, Lawrence Hall of Science or MSRI." When prompted for your UC Contact/Host, please list Chris Marshall (coord@msri.org).

MSRI has preferred rates at Easton Hall and Gibbs Hall, depending on room availability. Guests can call the Reservations line at 510-204-0732 and ask for the MSRI (Mathematical Sciences Research Institute) rate. To book online, visit this page, select "Request a Reservation," choose the dates you would like to stay, and enter the code MSRI (this code is not case sensitive).

Additional lodging options may be found on our short term housing page.

Directions to Venue

Visa/Immigration

Schedule, Notes/Handouts & Videos
Mar 19, 2001
Monday
08:00 AM - 05:00 PM
  Bayesian prediction using adaptive ridge estimators
David Denison
09:30 AM - 10:15 AM
  Gauss mixture vector quantization: Clustering Gauss mixtures with the minimum discrimination information distortion for modeling, compression, and classification
Robert Gray
10:15 AM - 11:15 AM
  Some experiments with ensemble methods for classification and conditional density estimation
Thomas Dietterich
11:15 AM - 02:00 PM
  Dilution priors for model uncertainty
Edward George
02:00 PM - 02:30 PM
  An ANOVA model for dependent random measures with an application to population models
Maria DeIorio
02:30 PM - 03:15 PM
  Aspects of stochastic simulation based inference for some mixtures of latent structures
Ernest Fokoue
03:15 PM - 03:45 PM
  Combining regression estimators under a fixed design
Yuhong Yang
03:45 PM - 04:15 PM
  Clustering by partial mixture estimation
David Scott (University of Puget Sound)
04:15 PM - 05:15 PM
  Model complexity and model priors
Angelika van der Linde
Mar 20, 2001
Tuesday
09:30 AM - 10:15 AM
  Optimal properties and adaptive tuning of support vector machines (SVM's)
Grace Wahba
10:15 AM - 11:15 AM
  Detecting a menstrual cycle signal in fluorescence spectroscopy of the cervix
Dennis Cox
11:15 AM - 11:45 AM
  Adaptive kernels for SV classification
Robert Burbidge
11:45 AM - 12:15 PM
  Unsupervised sparse regression
Mario Figueiredo
12:15 PM - 01:15 PM
  Instability in nonlinear estimation and classification
Steven Ellis
Mar 21, 2001
Wednesday
09:30 AM - 10:15 AM
  Extended linear modeling with splines
Charles Stone (University of California, Berkeley)
10:15 AM - 11:15 AM
  Bayesian approaches in nonparametric estimation problems
Linda Zhao
11:15 AM - 02:00 PM
  Information theoretic bounds for mixture modeling and model selection
Andrew Barron
02:00 PM - 02:30 PM
  Logspline density estimation with free-knot splines
Charles Kooperberg
02:30 PM - 03:15 PM
  Adaptive splines and genetic algorithms with an application to classification
Jennifer Pittman
03:15 PM - 03:45 PM
  Mixed-effects multivariate adaptive splines models: An automated procedure for fitting longitudinal data and growth curves
Heping Zhang
03:45 PM - 04:45 PM
  Bayesian mixture splines for spatially adaptive nonparametric regression
Sally Wood
Mar 22, 2001
Thursday
09:30 AM - 10:15 AM
  Variational inference for clustering and classification
Michael Jordan (University of California)
10:15 AM - 11:00 AM
  Modeling internet traffic
William Cleveland (Purdue University)
11:00 AM - 12:00 PM
  A simple model for a complex system: Predicting travel times on freeways
John Rice
02:00 PM - 02:30 PM
  Denoising deterministic time series
Andrew Nobel
02:30 PM - 03:00 PM
  A statistical approach to quantifying the predictability of noisy nonlinear systems
Barbara Bailey
03:15 PM - 03:45 PM
  Looking for nonlinearities in the large scale dynamics of the atmosphere
Claudia Tebaldi
03:45 PM - 04:15 PM
  Non-stationary, nonlinear classification and model selection
Nando de Freitas
04:15 PM - 04:45 PM
  Semiparametric estimation of the long-memory parameter in FARIMA models
Ashis Gangopadhyay
Mar 23, 2001
Friday
09:30 AM - 10:15 AM
  Nonlinear estimation and classification: Challenges from genomics
Terence Speed (University of California, Berkeley)
10:15 AM - 11:15 AM
  The complex statistics of images
David Mumford (Brown University)
11:15 AM - 02:00 PM
  Interactions between data analysis of natural images, biological vision and mathematical analysis
David Donoho (Stanford University)
02:00 PM - 02:30 PM
  Bounding generalization error of aggregate classifiers through empirical margin distributions
Gilles Blanchard
02:30 PM - 03:15 PM
  Model selection for CART regression trees
Servane Gey
03:15 PM - 03:45 PM
  Nonlinear function learning and classification using optimal radial basis function networks
Adam Krzyzak
03:45 PM - 04:15 PM
  On adaptive estimation by neural nets type estimators
Ludger Ruschendorf
04:15 PM - 05:15 PM
  Fast rates and adaptation in nonparametric classification
Alexandre Tsybakov
Mar 26, 2001
Monday
09:30 AM - 10:15 AM
  Besov vs. Plato: Multiscale image modeling
Richard Baraniuk
10:15 AM - 11:15 AM
  Wavelets: Approximation, Compression and Sampling
Martin Vetterli
11:15 AM - 01:30 PM
  Harmonic Analysis of BV
Ronald DeVore (Texas A&M International University)
01:30 PM - 02:00 PM
  Texture characterization and retrieval using steerable hidden Markov models
Minh Do
02:00 PM - 02:45 PM
  Thresholding second generation wavelet coefficients for noisy, non-equispaced data
Maarten Jansen
02:45 PM - 03:45 PM
  A multiresolution analysis for statistical likelihoods: Theory and methods
Eric Kolaczyk
Mar 27, 2001
Tuesday
09:30 AM - 10:15 AM
  Coarse-to-fine classification and visual indexing
Donald Geman (Johns Hopkins University)
10:15 AM - 11:15 AM
  From bits to information: Theory and applications of learning machines
Tomaso Poggio
11:15 AM - 01:30 PM
  Gradient-based learning in heterogeneous structures
Yann LeCun
01:30 PM - 02:00 PM
  Data compression and statistical inference
Jorg Rahnenfuhrer
02:00 PM - 03:00 PM
  Local curved Gaussian models
Juan Lin
03:00 PM - 03:30 PM
  Compressing and analyzing massive geophysical data sets by Monte Carlo extended ECVQ
Amy Braverman
03:30 PM - 04:15 PM
  Adaptive Bayesian Shrinkage Estimators
David Denison (University of Manchester)
04:15 PM - 04:45 PM
  Best adaptive tiling in a rate-distortion sense
Rahul Shukla
04:45 PM - 05:45 PM
  Multi-resolution properties of semi-parametric volatility models
Enrico Capobianco
Mar 28, 2001
Wednesday
09:00 AM - 09:45 AM
  Environmental monitoring using a time series of satellite images and other spatial data sets
Harri Kiiveri
09:45 AM - 10:45 AM
  Bayesian multidimensional scaling and choice of dimension
Adrian Raftery
10:45 AM - 11:15 AM
  Computationally intensive statistical methods for microarray based drug discovery
Katherine Pollard
11:15 AM - 11:45 AM
  Active learning of Bayes net structure with applications to gene knockout experiment design
Kevin Murphy
11:45 AM - 12:15 PM
  Using robust and resistant regression analysis (MM-estimator) to find differentially expressed genes in microarray data
Alexandre Loguinov
12:15 PM - 01:15 PM
  Logic Regression
Ingo Ruczinski
Mar 29, 2001
Thursday
09:30 AM - 10:15 AM
  Predictive data mining with multiple additive regression trees
Jerome Friedman
10:15 AM - 11:00 AM
  Logistic regression, AdaBoost and Bregman distances
Robert Schapire (Microsoft Research)
11:15 AM - 12:00 PM
  Where does mainstream optimization fit?
Margaret H. Wright (New York University, Courant Institute)
02:00 PM - 02:30 PM
  Random forests: An algorithm for prediction and understanding
Leo Breiman
02:30 PM - 03:00 PM
  High dimension - low sample size data analysis
James Marron
03:30 PM - 04:00 PM
  A global geometric framework for nonlinear dimensionality reduction
Vin de Silva