Weekly Seminar – 3/31/2017 – Low Rank and Sparse Signal Processing #3

Davood Hajinezhad from Dr. Hong’s research group will be giving the talk tomorrow. The title is “Nonconvex Low Rank Matrix Factorization via Inexact First Order Oracle”. The details are as follows:
 
Date: March 31st, Friday
Time: 3:00 – 4:00 pm
Venue: 2222 Coover Hall
 
Davood’s abstract: We study the low rank matrix factorization problem via nonconvex optimization. Compared with the convex relaxation approach, nonconvex optimization exhibits superior empirical performance for large scale low rank matrix estimation. However, the understanding of its theoretical guarantees is limited. To bridge this gap, we exploit the notion of an inexact first order oracle, which naturally appears in low rank matrix factorization problems such as matrix sensing and completion. In particular, our analysis shows that a broad class of nonconvex optimization algorithms, including alternating minimization and gradient-type methods, can be treated as solving two sequences of convex optimization problems using an inexact first order oracle. Thus we show that these algorithms converge geometrically to the global optimum and recover the true low rank matrices under suitable conditions. Numerical results are provided to support our theory.
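To make the setup concrete, here is a minimal numpy sketch of alternating minimization for matrix completion, one of the nonconvex algorithms covered by this style of analysis. The factorization M ≈ UVᵀ is fit to the observed entries by alternately solving small regularized least-squares problems for the rows of U and of V. This is our own illustration under generic assumptions, not the exact algorithm or conditions from the talk.

```python
import numpy as np

def alt_min_completion(M_obs, mask, rank, iters=50, reg=1e-3):
    """Alternating minimization for matrix completion: fit M ~ U @ V.T
    to the observed entries (mask == True) of M_obs."""
    m, n = M_obs.shape
    rng = np.random.default_rng(0)
    U = rng.standard_normal((m, rank))
    V = rng.standard_normal((n, rank))
    for _ in range(iters):
        # With V fixed, each row of U solves a ridge least-squares
        # problem over the columns observed in that row.
        for i in range(m):
            A = V[mask[i]]
            U[i] = np.linalg.solve(A.T @ A + reg * np.eye(rank),
                                   A.T @ M_obs[i, mask[i]])
        # With U fixed, update the rows of V symmetrically.
        for j in range(n):
            A = U[mask[:, j]]
            V[j] = np.linalg.solve(A.T @ A + reg * np.eye(rank),
                                   A.T @ M_obs[mask[:, j], j])
    return U, V
```

Each inner step is a convex problem solved with slightly stale information about the other factor, which is the kind of structure the inexact first order oracle viewpoint in the talk formalizes.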

Weekly Seminar – 3/24/2017 – Low Rank and Sparse Signal Processing #2

The weekly seminar series resumes after the spring break with a talk on Graph Signal Processing by Rahul Singh from Dr. Dogandzic’s research group. The details are as follows:
 
Date: March 24th, Friday
Time: 3:00 – 4:00 pm
Venue: 2222 Coover Hall
Rahul’s abstract: Graph Signal Processing (GSP) is concerned with the modeling, representation, and processing of signals defined on irregular structures known as graphs. In this setting, we deal with graph signals, which are collections of data values lying on the vertices of arbitrary graphs. Examples include temperatures within a geographical area, traffic capacities at hubs in a transportation network, or human behaviors in a social network. In the talk, we will discuss existing graph signal processing tools and concepts such as the graph Fourier transform and spectral graph wavelets. The following references are a good starting point for graph signal processing; a small numerical sketch of the graph Fourier transform follows them.

1. David I. Shuman et al. “The emerging field of signal processing on graphs: Extending high-dimensional data analysis to networks and other irregular domains”. In: IEEE Signal Processing Magazine 30.3 (2013), pp. 83–98.

2. A. Sandryhaila and J. M. F. Moura. “Discrete Signal Processing on Graphs: Frequency Analysis”. In: IEEE Transactions on Signal Processing 62.12 (2014), pp. 3042–3054.

3. David I. Shuman, Benjamin Ricaud, and Pierre Vandergheynst. “Vertex-frequency analysis on graphs”. In: Applied and Computational Harmonic Analysis 40.2 (2016), pp. 260–291.
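
As a hands-on complement to the references, here is a minimal numpy sketch of the graph Fourier transform: a graph signal is expanded in the eigenbasis of the graph Laplacian L = D − W, whose eigenvalues play the role of frequencies. The function and the toy cycle graph below are our own illustration, not material from the talk.

```python
import numpy as np

def graph_fourier_transform(W, x):
    """Graph Fourier transform of a signal x living on the vertices of a
    graph with symmetric weighted adjacency matrix W."""
    D = np.diag(W.sum(axis=1))     # degree matrix
    L = D - W                      # combinatorial graph Laplacian
    lam, U = np.linalg.eigh(L)     # eigenvalues act as graph frequencies
    x_hat = U.T @ x                # spectral coefficients of x
    return lam, x_hat

# Toy example: a cycle graph on 4 vertices with an alternating signal.
W = np.array([[0., 1., 0., 1.],
              [1., 0., 1., 0.],
              [0., 1., 0., 1.],
              [1., 0., 1., 0.]])
x = np.array([1., -1., 1., -1.])
lam, x_hat = graph_fourier_transform(W, x)
```

On the cycle, the alternating signal is the graph analogue of the highest frequency, so its spectral energy concentrates at the largest Laplacian eigenvalue.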

Slides: GSP

Weekly Seminar – 3/3/2017 – Low Rank and Sparse Signal Processing #1

Songtao Lu from Dr. Wang’s research group will start the series of talks on low rank and sparse signal processing in our weekly meetings. The details are as follows:
Date: March 3rd, Friday
Time: 3:00 – 4:00 pm
Venue: 2222 Coover Hall
Songtao’s abstract: This Friday I will mainly talk about low rank matrix factorization with applications to machine learning, including spectral clustering and nonnegative matrix factorization. If time permits, I plan to introduce some recent related work on deep neural networks for dimensionality reduction.
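As a small preview, here is a minimal numpy sketch of the classical Lee–Seung multiplicative updates for nonnegative matrix factorization, one of the factorizations mentioned above. It is our own illustration, not material from Songtao’s slides, and it assumes the data matrix X has nonnegative entries.

```python
import numpy as np

def nmf_multiplicative(X, rank, iters=200, eps=1e-9):
    """Lee-Seung multiplicative updates for X ~ W @ H with W, H >= 0,
    reducing the Frobenius reconstruction error at every step."""
    m, n = X.shape
    rng = np.random.default_rng(0)
    W = rng.random((m, rank)) + eps
    H = rng.random((rank, n)) + eps
    for _ in range(iters):
        H *= (W.T @ X) / (W.T @ W @ H + eps)   # update H with W fixed
        W *= (X @ H.T) / (W @ H @ H.T + eps)   # update W with H fixed
    return W, H
```

Because the updates multiply each entry by a nonnegative ratio, W and H stay nonnegative automatically, which is what makes this scheme such a popular baseline.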
Slides: LMF

Weekly Seminar – 2/24/2017 – Deep Learning #6

Shiyang Li from Dr. Ajjarapu’s group will be giving the final talk of the Deep Learning series, on Generative Adversarial Networks. Please find the slides and reference material below.
Date: Friday, 24th February
Time: 3:00pm to 4:00pm
Venue: 2222, Coover Hall
References:
We will also be starting the next series of lectures, on low rank and sparse signal processing. The schedule has been updated on the List of Talks page.

Weekly Seminar – 2/17/2017 – Deep Learning #5

We’re inching towards the end of our Deep Learning (DL) series: just two talks are left, after which we move on to our next batch of topics on Sparse and Low Rank Signal Processing. Having gone through four talks on varied topics under the Deep Learning umbrella, from an introduction and optimization to models such as CNNs and RNNs, we thought it would be apt to look at how these models are actually implemented.

Manaswi Podduturi from Dr. Hegde’s group will be speaking this Friday on applications of deep learning. She will give a brief talk on how to get started with implementing deep learning and introduce some popular resources. She will also compare the performance of various DL algorithms on a fixed dataset.

If you’re itching to utilize the concepts discussed in the last few talks, but don’t know how to get started, do come for this session! For others, there’s coffee as usual!

Date: Friday, 17th February

Venue: 2222, Coover Hall

Time: 3:00pm to 4:00pm

Slides: Applications of DL

 

Weekly Seminar – 2/10/2017 – Deep Learning #4

Thanh Nguyen from Dr. Hegde’s group will be giving the fourth talk in the Deep Learning series this Friday. He will introduce recurrent neural networks for modeling sequences and discuss some related background ideas. He will also talk about how to train recurrent networks with back-propagation, with language modeling as the specific application.
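
To fix ideas before the talk, here is a minimal numpy sketch of a vanilla recurrent cell unrolled over a sequence. It is our own illustration (the talk follows Chapter 10), with arbitrary example dimensions.

```python
import numpy as np

def rnn_forward(x_seq, W_xh, W_hh, b_h):
    """Run a vanilla RNN over x_seq of shape (T, input_dim) and return
    the hidden states, shape (T, hidden_dim)."""
    h = np.zeros(W_hh.shape[0])
    states = []
    for x_t in x_seq:
        # The same weights are reused at every time step; training
        # back-propagates through this loop (back-propagation through
        # time) to accumulate their gradients across steps.
        h = np.tanh(W_xh @ x_t + W_hh @ h + b_h)
        states.append(h)
    return np.stack(states)

# Example: a length-5 sequence of 3-dimensional inputs, 4 hidden units.
rng = np.random.default_rng(0)
states = rnn_forward(rng.standard_normal((5, 3)),
                     0.1 * rng.standard_normal((4, 3)),
                     0.1 * rng.standard_normal((4, 4)),
                     np.zeros(4))
```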

 

The references for the talk are Chapter 10 of the Deep Learning book and lecture notes from the Oxford Deep NLP course.

There is a slight shift in the timing; we will convene at 3:15 pm instead of the usual 3:00 pm. The venue remains the same.

 

Date: Friday, 10th February

Venue: 2222, Coover Hall

Time: 3:15pm to 4:15pm

You can find the slides for the talk here: Recurrent Neural Networks

Weekly Seminar – 2/3/2017 – Deep Learning #3

Continuing with the current theme of talks on Deep Learning, Jason Saporta will be talking on Convolutional Neural Networks. Jason is jointly advised by Dr. Heike Hofmann and Dr. Alicia Carriquiry from the Department of Statistics. He will be discussing Chapter 9 of the Deep Learning book.
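If you would like to experiment before Friday, the core operation of Chapter 9 is just a sliding dot product. Below is a minimal numpy sketch of a “valid” 2D convolution (strictly, the cross-correlation that deep learning libraries implement); the code is our own toy example, not Jason’s material.

```python
import numpy as np

def conv2d_valid(image, kernel):
    """Slide the kernel over the image and take dot products ('valid'
    mode: no padding, so the output shrinks by kernel size minus one)."""
    H, W = image.shape
    kH, kW = kernel.shape
    out = np.empty((H - kH + 1, W - kW + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(image[i:i + kH, j:j + kW] * kernel)
    return out

# Example: a vertical-edge detector on a random 8x8 "image".
image = np.random.rand(8, 8)
kernel = np.array([[-1., 0., 1.],
                   [-1., 0., 1.],
                   [-1., 0., 1.]])
out = conv2d_valid(image, kernel)   # shape (6, 6)
```

In a CNN the kernel entries are learned, and the same small kernel is applied at every spatial location, which is where the parameter sharing discussed in Chapter 9 comes from.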
You can find the slides here.
You can also find additional references for his talk: a free online book and a reference paper.
Details of the talk are as follows:
Date: Friday, 3rd February
Time: 3pm to 4pm
Venue: 2222, Coover Hall
Hoping to see you all there!

Weekly Seminar – 1/27/2017 – Deep Learning #2

After a great introductory talk on deep learning last week by Qi, Xiangyi Chen from Dr. Hong’s group will be presenting this week on optimization for neural networks (Chapter 8 of the Deep Learning book). The main focus of his talk will be the challenges in NN optimization, gradient-based algorithms, and the differences between training neural networks and pure optimization.
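As a small taste of Chapter 8, here is a minimal numpy sketch of gradient descent with momentum, one of the gradient-based methods the chapter covers. The quadratic test problem is our own example, not from Xiangyi’s slides.

```python
import numpy as np

def sgd_momentum(grad_fn, w0, lr=0.01, beta=0.9, steps=1000):
    """Minimize a loss given its gradient function grad_fn(w). The
    velocity v is a decaying average of past gradients, which damps
    oscillations across narrow valleys of the loss surface."""
    w = np.array(w0, dtype=float)
    v = np.zeros_like(w)
    for _ in range(steps):
        v = beta * v - lr * grad_fn(w)   # update velocity
        w = w + v                        # take the step
    return w

# Example: minimize f(w) = 0.5 * ||A w - b||^2, gradient A.T (A w - b).
A = np.array([[3., 1.], [1., 2.]])
b = np.array([1., 0.])
w_star = sgd_momentum(lambda w: A.T @ (A @ w - b), np.zeros(2))
```

Swapping grad_fn for a minibatch gradient of a network loss gives the stochastic version actually used to train neural networks.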
Details for the seminar are as follows:
Date: Friday, 27th January 2017
Time: 3pm to 4pm
Venue: 2222, Coover Hall

Weekly Seminar – 1/20/2017 – Deep Learning #1

We’re starting our new format of talks for the semester. The first mini-series of talks will be on Deep Learning (by popular vote), and will be initiated by Qi Xiao from Dr. Wang’s group.

She will be giving an introduction to deep learning, including typical models like feedforward neural networks, as well as the back-propagation algorithm. She will also briefly talk about deep learning applications.
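For anyone who wants to follow along at home, here is a minimal numpy sketch of a one-hidden-layer feedforward network trained by back-propagation on the XOR toy problem. It is our own illustration, not material from Qi’s slides.

```python
import numpy as np

rng = np.random.default_rng(0)

# XOR: not linearly separable, so a hidden layer is required.
X = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
y = np.array([[0.], [1.], [1.], [0.]])

W1 = rng.standard_normal((2, 8)); b1 = np.zeros(8)   # hidden layer
W2 = rng.standard_normal((8, 1)); b2 = np.zeros(1)   # output layer
lr = 0.5

for _ in range(2000):
    # Forward pass.
    h = np.tanh(X @ W1 + b1)
    p = 1.0 / (1.0 + np.exp(-(h @ W2 + b2)))   # sigmoid output
    # Backward pass: for sigmoid output with cross-entropy loss, the
    # gradient at the output pre-activation is simply (p - y).
    d_out = (p - y) / len(X)
    dW2 = h.T @ d_out;  db2 = d_out.sum(axis=0)
    d_h = (d_out @ W2.T) * (1.0 - h**2)        # back-prop through tanh
    dW1 = X.T @ d_h;    db1 = d_h.sum(axis=0)
    # Gradient descent step.
    W2 -= lr * dW2;  b2 -= lr * db2
    W1 -= lr * dW1;  b1 -= lr * db1

print(p.round(2).ravel())   # approaches [0, 1, 1, 0]
```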

Slides: Introduction to Deep Learning

References:
1. Chapter 6, Deep Learning, by Ian Goodfellow, Yoshua Bengio, and Aaron Courville
2. UFLDL Tutorial

The tentative plan is to cover chapters 6, 8, 9, 10, and 11 of the book over the coming weeks. The schedule for speakers has been updated on the List of Talks page.

Date: 1/20/2017

Time: 3pm to 4pm

Venue: 3043 Coover Hall

Join us for a deep learning session on a topic that has been buzzing in our community for quite a while now! We also have medium roast coffee from Caribou Coffee to keep that buzz up. We’re pretty excited to see how this new format of seminars shapes up, so your participation will be highly appreciated!