Bill Freeman - MIT

The moon as a camera, and motion minification
Aug 12, 2012 14:00 - 15:15

I'll describe the goal of photographing the earth from space using ground-based equipment, by photographing the moon. On the moon, the edges of cast shadows of earthshine reveal integrals over parts of the image of the earth as seen from the moon. By measuring the intensity profiles of cast shadow edges over a number of different positions along the moon's limb, one should have the data needed to compute a (fuzzy) picture of the earth as seen from the moon. I'll go through why I think this is feasible, and why one would want to do it.
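The shadow-edge principle can be illustrated with a toy 1D sketch (all quantities below are synthetic and purely illustrative): if the intensity profile measured across a cast-shadow edge is the running integral of the source's brightness projected perpendicular to the edge, then numerically differentiating that measured profile recovers the projection.

```python
import numpy as np

# Toy 1D illustration: the brightness profile across a cast-shadow edge
# is (ideally) the running integral of the source's 1D brightness profile
# along the direction perpendicular to the occluding edge, so
# differentiating the measured edge profile recovers that projection.

t = np.linspace(-5, 5, 1001)
dt = t[1] - t[0]

# Hypothetical 1D brightness profile of the source (the earth as seen
# from the moon), projected perpendicular to the shadow edge.
source = np.exp(-0.5 * (t / 1.5) ** 2)
source /= source.sum() * dt              # normalize to unit total flux

# "Measured" edge profile = cumulative integral of the source profile.
edge_profile = np.cumsum(source) * dt

# Recover the projection by numerical differentiation.
recovered = np.gradient(edge_profile, dt)
```

Combining such 1D projections from many edge orientations along the limb is then a tomography-style problem; this sketch only shows the single-edge step.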

A second brief topic, attached to this talk for later discussion, is a set of tools we've developed for motion analysis and redisplay, which may have applications in astronomical image processing.

Rob Fergus - NYU

Exoplanet detection and spectral characterization
Aug 14, 2012 14:00 - 15:15

Exoplanet detection and characterization is currently a hot topic in astronomy. Within the last 17 years the number of known exoplanets around nearby stars has gone from zero to nearly 800. These have been discovered with a variety of detection methods, of which the most scientifically informative is direct imaging, since the planet's spectrum can then be measured. However, detecting the planet against the glare of the star requires a formidable contrast ratio of ~10^9. To meet this challenge, astronomers have built coronagraphs, which block out much of the starlight but produce severe diffraction artifacts that overwhelm the weak planet signal. We have recently developed computer vision techniques that can detect planets whose brightness is 1-2% of the diffraction artifacts, roughly 3x better than the current state-of-the-art algorithms used by astronomers. These methods can also be used to precisely estimate the spectrum of the planet, thus revealing its elemental composition.
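A generic form of the underlying idea, not the authors' algorithm, can be sketched with synthetic data: speckle patterns are strongly correlated from exposure to exposure, so a low-rank model fitted to planet-free reference frames can be subtracted from the science frame, leaving a faint point source detectable in the residual. All numbers below are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy speckle-subtraction sketch (illustrative only): the diffraction
# pattern repeats across exposures up to a per-frame gain, so PCA of a
# reference stack captures it as a few modes.
npix, nref = 400, 50
pattern = rng.normal(size=npix)                  # fixed diffraction pattern
gains = rng.normal(1.0, 0.1, size=nref)          # per-frame speckle strength
refs = np.outer(gains, pattern) + 0.005 * rng.normal(size=(nref, npix))

science = 1.05 * pattern + 0.005 * rng.normal(size=npix)
science[123] += 0.05                             # faint planet signal

# Principal components of the reference stack capture the speckle modes.
mean_ref = refs.mean(axis=0)
_, _, Vt = np.linalg.svd(refs - mean_ref, full_matrices=False)
modes = Vt[:3]

resid = science - mean_ref
resid -= modes.T @ (modes @ resid)               # project out speckle modes

planet_pixel = int(np.argmax(np.abs(resid)))     # planet now dominates
```

The planet survives the projection because a single-pixel source is nearly orthogonal to the extended speckle modes.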

Joint work with David Hogg (NYU Physics), Ben Oppenheimer & Doug Brenner (American Museum of Natural History) and the rest of the P1640 team.

Dilip Krishnan - NYU

Removing Localized Corruption from Natural Images
Aug 13, 2012 14:00 - 14:30

Traditional approaches to removing image corruption such as blur or noise combine a natural image prior with a reconstruction term. The latter relies on a good generative model of the corruption, which may not exist for many distortions encountered in the real world. In this paper we explore approaches for learning a direct mapping from the corrupt input image to the clean image, obviating the need for any kind of generative model. We evaluate the approaches on several types of synthetic corruption, finding that neural-network-based models perform best. Our techniques can be used for many types of localized corruption. We demonstrate this using photographs of real-world scenes taken behind a pane of glass with water droplets, akin to a rainy window. Our model removes most of the raindrops without significant blur, the first such demonstration of this application.
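The "direct mapping" idea can be made concrete with a deliberately simplified stand-in: learn a function from corrupt patches to clean patches purely from example pairs, with no model of the corruption. The paper uses neural networks; a ridge-regression linear map is used here only to keep the sketch short, and all data are synthetic.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic "patches": correlated clean signals plus an unknown corruption.
d = 16                                              # flattened patch size
A = rng.normal(size=(d, d))
clean = rng.normal(size=(5200, d)) @ A * 0.1        # correlated clean patches
corrupt = clean + 0.5 * rng.normal(size=clean.shape)

# Train/test split of example pairs.
Xc_tr, Xc_te = corrupt[:5000], corrupt[5000:]
Y_tr, Y_te = clean[:5000], clean[5000:]

# Ridge regression: W = argmin ||X W - Y||^2 + lam ||W||^2,
# learned directly from (corrupt, clean) pairs.
lam = 1e-2
W = np.linalg.solve(Xc_tr.T @ Xc_tr + lam * np.eye(d), Xc_tr.T @ Y_tr)

mse_identity = np.mean((Xc_te - Y_te) ** 2)         # do nothing
mse_mapped = np.mean((Xc_te @ W - Y_te) ** 2)       # learned direct mapping
```

On held-out pairs the learned map beats the trivial identity map; replacing the linear map with a neural network, as in the paper, extends the same recipe to corruptions a linear map cannot capture.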

Felix Hormuth - MPI for Astronomy

AstraLux Norte - the Calar Alto Lucky Imaging Camera
Aug 13, 2012 14:45 - 15:15

The AstraLux camera at the Calar Alto 2.2-m telescope is a "classical" Lucky Imaging instrument, i.e. it produces images with high angular resolution by selecting and combining the best few percent of several thousand short-exposure images. I will present the instrument design and the current online data reduction pipeline, and give an overview of the observational programmes conducted so far. I will also take a short detour into observing not with high spatial but with high temporal resolution, a somewhat neglected capability of the AstraLux instrument.
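The select-and-combine step can be sketched in a few lines (illustrative only, not the AstraLux pipeline): score each short exposure with a sharpness metric, keep the best few percent, recentre each survivor on its brightest pixel, and average ("shift-and-add"). The metric and test data below are simplistic placeholders.

```python
import numpy as np

rng = np.random.default_rng(2)

def sharpness(frame):
    # Simple quality metric: peak intensity relative to total flux.
    return frame.max() / frame.sum()

def lucky_stack(frames, keep_fraction=0.05):
    scores = np.array([sharpness(f) for f in frames])
    n_keep = max(1, int(len(frames) * keep_fraction))
    best = np.argsort(scores)[-n_keep:]          # indices of sharpest frames
    h, w = frames[0].shape
    out = np.zeros((h, w))
    for i in best:
        # Recentre the brightest pixel to the frame centre before adding.
        y, x = np.unravel_index(np.argmax(frames[i]), (h, w))
        out += np.roll(frames[i], (h // 2 - y, w // 2 - x), axis=(0, 1))
    return out / n_keep

# Synthetic short exposures: a point source with random tip/tilt jitter,
# with a varying peak intensity standing in for frame-to-frame seeing.
frames = []
for _ in range(200):
    img = 0.001 * rng.random((33, 33))           # background noise
    dy, dx = rng.integers(-4, 5, size=2)         # atmospheric jitter
    img[16 + dy, 16 + dx] = rng.uniform(0.2, 1.0)
    frames.append(img)

stacked = lucky_stack(frames)
y, x = np.unravel_index(np.argmax(stacked), stacked.shape)
```

After stacking, the jittered source is recentred at the frame centre; real pipelines use subtler quality metrics and sub-pixel registration.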

As there are certainly data reduction techniques other than Lucky Imaging that can be applied to AstraLux data to improve image resolution, I will provide some raw data to anyone interested in devising and testing better solutions.

Krikamol Muandet - MPI for Intelligent Systems

Support Measure Machine for Quasar Target Selection
Aug 13, 2012 17:15 - 17:45

In this talk I will discuss the problem of quasar target selection. Object attributes in astronomy, such as fluxes, are often subject to substantial and heterogeneous measurement uncertainties, especially for medium-redshift quasars (redshift between 2.2 and 3.5), which are relatively rare and must be targeted down to g ~ 22 mag. Most previous approaches to quasar target selection, including UV-excess, kernel density estimation, a likelihood approach, and artificial neural networks, cannot directly deal with heterogeneous input uncertainties. Recently, extreme deconvolution (XD) has been used to tackle this problem in a well-posed manner. In this work, we present a discriminative approach to quasar target selection that can deal with input uncertainties directly. To do so, we represent each object as a Gaussian distribution whose mean is the object's attribute vector and whose covariance is the given flux measurement uncertainty. Given a training set of Gaussian distributions, the support measure machine (SMM) algorithm is trained and used to build the quasar targeting catalog. Preliminary results will also be presented.
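The Gaussian representation pairs naturally with a kernel between distributions: for the Gaussian RBF kernel, the expected kernel value over two independent Gaussian inputs has a closed form. The sketch below assumes diagonal covariances and a particular bandwidth parameterization; the paper's actual kernel choices may differ.

```python
import numpy as np

def expected_rbf(mu1, var1, mu2, var2, theta=1.0):
    """Closed-form E[k(x, y)] for independent x ~ N(mu1, diag(var1)) and
    y ~ N(mu2, diag(var2)), with k(x, y) = exp(-||x - y||^2 / (2 theta^2)).

    Convolving the kernel with both Gaussians gives
        theta^d / sqrt(prod(s)) * exp(-0.5 * sum((mu1 - mu2)^2 / s)),
    where s is the diagonal of Sigma_1 + Sigma_2 + theta^2 I.
    """
    s = var1 + var2 + theta ** 2
    d = mu1 - mu2
    return theta ** len(mu1) / np.sqrt(np.prod(s)) * np.exp(-0.5 * np.sum(d ** 2 / s))

# With zero measurement uncertainty the expected kernel reduces to the
# plain RBF kernel between the attribute vectors themselves.
mu1, mu2 = np.array([1.0, 2.0]), np.array([0.0, 0.0])
k0 = expected_rbf(mu1, np.zeros(2), mu2, np.zeros(2))   # equals exp(-5/2)
```

Filling a Gram matrix with such expected-kernel values over the training objects and handing it to a standard kernel SVM solver yields an SMM-style classifier in which each input's flux uncertainty directly broadens its kernel.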

Joint work with Jo Bovy and Bernhard Schölkopf.

Tamas Budavari - Johns Hopkins University

Aug 12, 2012 15:15 - 16:15

Christian Schuler - MPI for Intelligent Systems

Aug 12, 2012 17:00 - 17:45

Sam Hasinoff - Google

Aug 13, 2012 16:30 - 17:00

David Hogg - NYU

Open Problems in Astro Imaging
Aug 14, 2012 16:30 - 17:45

Hans-Walter Rix - MPI for Astronomy

Beyond CLEAN: Imaging challenges in radio-astronomy
Aug 15, 2012 14:00 - 15:30

Hans-Walter Rix - MPI for Astronomy

Optimal combination of multi-spectral images?
Aug 16, 2012 14:00 - 15:30