Weekly Seminar – 04/06/2018 – Solving Linear Inverse Problems using GAN priors

This week, Viraj will be presenting the paper "Solving Linear Inverse Problems Using GAN Priors: An Algorithm with Provable Guarantees," available at: https://arxiv.org/abs/1802.08406

Abstract:

In recent works, both sparsity-based methods as well as learning-based methods have proven to be successful in solving several challenging linear inverse problems. However, sparsity priors for natural signals and images suffer from poor discriminative capability, while learning-based methods seldom provide concrete theoretical guarantees. In this work, we advocate the idea of replacing hand-crafted priors, such as sparsity, with a Generative Adversarial Network (GAN) to solve linear inverse problems such as compressive sensing. In particular, we propose a projected gradient descent (PGD) algorithm for effective use of GAN priors for linear inverse problems, and also provide theoretical guarantees on the rate of convergence of this algorithm. Moreover, we show empirically that our algorithm demonstrates superior performance over an existing method of leveraging GANs for compressive sensing.
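For a rough picture of the PGD idea described in the abstract, here is a minimal sketch of such a loop; the measurement matrix A, the step size, and the projection routine are placeholders rather than the authors' implementation (in the paper, the projection onto the range of the generator is itself computed by an inner optimization over the latent code):

```python
import numpy as np

def pgd_with_gan_prior(y, A, project_to_range, step_size=0.1, n_iters=50):
    """Sketch of projected gradient descent with a generative (GAN) prior.

    y: observed measurements, y ~= A @ x_true (m-dimensional).
    A: measurement matrix of shape (m, n), with m << n in compressive sensing.
    project_to_range: callable that maps a signal w to an (approximate)
        projection onto the range of the generator G, e.g. G(argmin_z ||G(z) - w||).
    """
    n = A.shape[1]
    x = np.zeros(n)                       # initial estimate
    for _ in range(n_iters):
        grad = A.T @ (A @ x - y)          # gradient of 0.5 * ||A x - y||^2
        w = x - step_size * grad          # gradient step in signal space
        x = project_to_range(w)           # project back onto the generator's range
    return x
```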

Time: 12.10 – 1.00 PM, Friday, April 06.

Venue: 2222 Coover Hall
Slides are available here.
Code and Poster are available on Viraj’s website here: http://virajshah.me/


Weekly Seminar – 03/23/2018 and 03/30/2018 – Robust Subspace Clustering

For these two weeks, Praneeth will be presenting the paper "Robust Subspace Clustering," which is available at https://arxiv.org/abs/1301.2603

This chalk-on-blackboard talk will mostly focus on the algorithm and give an overview of the theoretical results presented in the paper.
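For context, the lasso-based self-expression step followed by spectral clustering, which is the style of algorithm the paper analyzes, can be sketched as follows; the regularization weight and the scikit-learn routines are illustrative choices, not the paper's exact setup:

```python
import numpy as np
from sklearn.linear_model import Lasso
from sklearn.cluster import SpectralClustering

def lasso_ssc(X, n_clusters, lam=0.05):
    """Sketch of lasso-based sparse subspace clustering.

    X: (d x n) data matrix, one point per column.
    Each point is written as a sparse combination of the other points;
    the resulting coefficients define an affinity for spectral clustering.
    """
    d, n = X.shape
    C = np.zeros((n, n))
    for i in range(n):
        others = np.delete(np.arange(n), i)
        # scikit-learn objective: (1 / (2 d)) ||x_i - X_{-i} c||^2 + lam ||c||_1
        lasso = Lasso(alpha=lam, fit_intercept=False, max_iter=10000)
        lasso.fit(X[:, others], X[:, i])
        C[i, others] = lasso.coef_
    W = np.abs(C) + np.abs(C.T)           # symmetric affinity matrix
    labels = SpectralClustering(n_clusters=n_clusters,
                                affinity='precomputed').fit_predict(W)
    return labels
```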

Time: 12.10 – 1.00 PM, Friday, March 23.
Venue: 2222 Coover Hall
Notes for the first session are available here.
Notes for the second session are here.

Weekly Seminar – 03/02/2018 – Graph Convolutional Neural Networks

This week’s speaker for the DSRG seminar series is Rahul Singh. He will talk about Graph Convolutional Neural Networks. The abstract and references are as follows.
——————————————————
Abstract:

While classical CNNs have been very successful with signals such as speech, images, or video, in which there is an underlying Euclidean structure (regular grids), there has recently been growing interest in applying CNNs to non-Euclidean geometric data. Some examples of such data include:

– In social networks, the characteristics of users can be modeled as signals on the vertices of the social graph
– In sensor networks, the sensor readings are modeled as time-dependent signals on the vertices
– In genetics, gene expression data are modeled as signals defined on the regulatory network

The non-Euclidean nature of such data implies that there are no familiar properties such as a common system of coordinates, vector-space structure, or shift-invariance. Consequently, basic operations like convolution and shifting, which are taken for granted in the Euclidean case, are not even well defined on non-Euclidean domains.

The talk will be about recent efforts toward generalizing CNNs from low-dimensional regular grids to high-dimensional irregular domains such as graphs.
——————————————————
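To make the graph-convolution idea from the abstract concrete, here is a minimal numpy sketch of one layer in the spirit of the propagation rule from the Kipf and Welling reference below, H' = ReLU(D^{-1/2}(A + I)D^{-1/2} H W); the toy adjacency matrix, features, and weights are placeholders for illustration only:

```python
import numpy as np

def graph_conv_layer(A, H, W):
    """One graph-convolution layer: H' = ReLU(D^{-1/2} (A + I) D^{-1/2} H W).

    A: (n x n) adjacency matrix of the graph.
    H: (n x d_in) node features, i.e. signals on the vertices.
    W: (d_in x d_out) weight matrix (here just a fixed array).
    """
    A_hat = A + np.eye(A.shape[0])             # add self-loops
    deg = A_hat.sum(axis=1)                    # degrees of A_hat (always >= 1)
    D_inv_sqrt = np.diag(1.0 / np.sqrt(deg))   # D^{-1/2}
    A_norm = D_inv_sqrt @ A_hat @ D_inv_sqrt   # symmetric normalization
    return np.maximum(A_norm @ H @ W, 0.0)     # ReLU nonlinearity

# Toy example: a 3-node path graph with 2-dimensional node features.
A = np.array([[0, 1, 0],
              [1, 0, 1],
              [0, 1, 0]], dtype=float)
H = np.random.randn(3, 2)
W = np.random.randn(2, 4)
H_next = graph_conv_layer(A, H, W)             # shape (3, 4)
```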

References:

Michaël Defferrard et al., "Convolutional neural networks on graphs with fast localized spectral filtering." In Advances in Neural Information Processing Systems, pp. 3844-3852, 2016.

Thomas Kipf and Max Welling, "Semi-supervised classification with graph convolutional networks." In Proceedings of the International Conference on Learning Representations, 2017.

Michael Bronstein et al., "Geometric deep learning: going beyond Euclidean data." IEEE Signal Processing Magazine, vol. 34, no. 4, pp. 18-42, 2017.
——————————————————

Venue: 2222 Coover Hall
Time: 12.00 – 1.00 PM, Friday, March 2.