Summer Seminar

With the ILAS 2017 conference taking place at Iowa State University, we had the privilege of inviting two young researchers to give talks to our audience.

Details of the first talk are as follows:

Date: 28 Jul 2017
Time: 2:00 PM – 3:30 PM
Location: 3043 ECpE Building Addition

Speaker: Ju Sun, Postdoctoral Research Fellow at Stanford University

Title: “When Are Nonconvex Optimization Problems Not Scary?”

For more details, check the department website.

Details for the second talk are as follows:

Date: 28 Jul 2017
Time: 3:45 PM – 5:00 PM
Location: 3043 ECpE Building Addition

Speaker: Ludwig Schmidt, PhD student at MIT

Title: “Faster Constrained Optimization via Approximate Projections” (tentative)

Refreshments (food and coffee) will be provided! Join us!

Weekly Seminar – 4/14/2017 – Low Rank and Sparse Signal Processing #4

Charlie Hubbard from Dr. Hegde’s research group will be giving a talk on “Parallel Methods for Matrix Completion”. Please note the venue: it is NOT our usual place.
Date: April 14th, 2017
Time: 3:00 – 4:00 pm
Venue: 3043 Coover hall
Charlie’s abstract: As a graduate student, you don’t have time to search through the entire Netflix library for a movie you’ll like… you barely have time to watch a movie in the first place! Thankfully, Netflix excels at content recommendation: it can present you with twenty or so movies from its entire library that it knows you’ll enjoy watching (while you do homework). In recent years it has been shown that matrix completion can be a useful tool for content recommendation: given a sparse matrix of user-item ratings, matrix completion can be used to predict the unseen ratings. The problem for large-scale content providers, like Amazon and Netflix, is that the size of their user-item matrices (easily 100,000 x 10,000) makes most matrix completion approaches infeasible. In this talk I will discuss two scalable methods (Jellyfish and Hogwild!) for parallel matrix completion, a GPU-based implementation of Jellyfish, and preliminary results from an unnamed algorithm for parallel inductive matrix completion.
References:
Slides: MC
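As a rough companion to the abstract above, here is a minimal sequential sketch of the entry-wise stochastic-gradient update on a factored model that methods like Jellyfish and Hogwild! parallelize. The matrix sizes, rank, step size, and regularization below are illustrative assumptions, not values from the talk.

```python
import numpy as np

# Minimal sequential sketch of the entry-wise SGD update for matrix
# completion that parallel methods such as Jellyfish and Hogwild! build on.
# All sizes, the rank, and the step size are illustrative choices.

rng = np.random.default_rng(0)
n_users, n_items, rank = 200, 100, 5

# Synthetic low-rank ratings matrix, observed only on a sparse set of entries.
M = rng.standard_normal((n_users, rank)) @ rng.standard_normal((rank, n_items))
mask = rng.random((n_users, n_items)) < 0.2          # ~20% of entries observed
observed = list(zip(*np.nonzero(mask)))

# Factored model M ~ U @ V.T, fit by SGD over the observed entries only.
U = 0.1 * rng.standard_normal((n_users, rank))
V = 0.1 * rng.standard_normal((n_items, rank))
step, lam = 0.02, 0.01                               # step size, regularization

for epoch in range(30):
    rng.shuffle(observed)
    for i, j in observed:
        err = M[i, j] - U[i] @ V[j]                  # residual on one rating
        U[i], V[j] = (U[i] + step * (err * V[j] - lam * U[i]),
                      V[j] + step * (err * U[i] - lam * V[j]))

rmse = np.sqrt(np.mean((U @ V.T - M)[mask] ** 2))
print(f"RMSE on observed entries: {rmse:.3f}")
```

Each update touches only one row of U and one row of V, which is what makes lock-free or partitioned parallel schemes like those discussed in the talk attractive at scale.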

Weekly Seminar – 3/31/2017 – Low Rank and Sparse Signal Processing #3

Davood Hajinezhad from Dr. Hong’s research group will be giving the talk tomorrow. The title is “Nonconvex Low Rank Matrix Factorization via Inexact First Order Oracle”. The details are as follows:
 
Date: March 31st, Friday
Time: 3:00 – 4:00 pm
Venue: 2222 Coover hall
 
Davood’s abstract: We study the low rank matrix factorization problem via nonconvex optimization. Compared with the convex relaxation approach, nonconvex optimization exhibits superior empirical performance for large scale low rank matrix estimation. However, the understanding of its theoretical guarantees is limited. To bridge this gap, we exploit the notion of an inexact first order oracle, which naturally appears in low rank matrix factorization problems such as matrix sensing and completion. In particular, our analysis shows that a broad class of nonconvex optimization algorithms, including alternating minimization and gradient-type methods, can be treated as solving two sequences of convex optimization problems using an inexact first order oracle. Thus we can show that these algorithms converge geometrically to the global optima and recover the true low rank matrices under suitable conditions. Numerical results are provided to support our theory.

References:
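To make the alternating structure mentioned in the abstract concrete, here is a minimal sketch: with one factor fixed, the subproblem in the other factor is a convex least-squares problem, so the iteration interleaves two sequences of convex problems. The sizes, rank, and fully observed matrix below are illustrative assumptions, not the setting analyzed in the talk.

```python
import numpy as np

# Minimal sketch of alternating minimization for the factorization M ~ U V^T:
# each step fixes one factor and solves a convex least-squares problem in the
# other. Sizes and rank are illustrative, not from the talk.

rng = np.random.default_rng(1)
n, m, r = 80, 60, 4
M = rng.standard_normal((n, r)) @ rng.standard_normal((r, m))   # exact rank-r

U = rng.standard_normal((n, r))
V = rng.standard_normal((m, r))

for it in range(20):
    # Fix V, solve min_U ||M - U V^T||_F^2  (convex least squares in U).
    U = np.linalg.lstsq(V, M.T, rcond=None)[0].T
    # Fix U, solve min_V ||M - U V^T||_F^2  (convex least squares in V).
    V = np.linalg.lstsq(U, M, rcond=None)[0].T

print("relative error:", np.linalg.norm(M - U @ V.T) / np.linalg.norm(M))
```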


Weekly Seminar – 3/24/2017 – Low Rank and Sparse Signal Processing #2

The weekly seminar series resumes after the spring break with a talk on Graph Signal Processing by Rahul Singh from Dr. Dogandzic’s research group. The details are as follows:
 
Date: March 24th, Friday
Time: 3:00 – 4:00 pm
Venue: 2222 Coover hall
Rahul’s abstract: Graph Signal Processing (GSP) is concerned with the modeling, representation, and processing of signals defined on irregular structures known as graphs. In this setting, we deal with graph signals, which are collections of data values lying on the vertices of arbitrary graphs. Graph signals arise, for example, as temperatures within a geographical area, traffic capacities at hubs in a transportation network, or human behaviors in a social network. In the talk, we will discuss existing graph signal processing tools and concepts such as the graph Fourier transform and spectral graph wavelets. The following references are a good starting point for graph signal processing.

1. David I. Shuman et al. “The emerging field of signal processing on graphs: Extending high-dimensional data analysis to networks and other irregular domains”. In: IEEE Signal Processing Magazine 30.3 (2013), pp. 83–98.

2. A. Sandryhaila and J. M. F. Moura. “Discrete Signal Processing on Graphs: Frequency Analysis”. In: IEEE Transactions on Signal Processing 62.12 (2014), pp. 3042–3054.

3. David I. Shuman, Benjamin Ricaud, and Pierre Vandergheynst. “Vertex-frequency analysis on graphs”. In: Applied and Computational Harmonic Analysis 40.2 (2016), pp. 260–291.

Slides: GSP
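For a concrete feel of the graph Fourier transform discussed in the abstract above, here is a minimal numerical sketch; the path graph and random signal are illustrative assumptions, not material from the talk.

```python
import numpy as np

# Minimal sketch of the graph Fourier transform (GFT): eigenvectors of the
# graph Laplacian play the role of Fourier modes, and the GFT of a graph
# signal is its expansion in that basis. The path graph is illustrative.

n = 6
# Adjacency matrix of an undirected path graph on n vertices.
A = np.zeros((n, n))
for i in range(n - 1):
    A[i, i + 1] = A[i + 1, i] = 1.0

D = np.diag(A.sum(axis=1))          # degree matrix
L = D - A                           # combinatorial graph Laplacian

# Eigenvalues act as graph frequencies, eigenvectors as the Fourier basis.
eigvals, eigvecs = np.linalg.eigh(L)

x = np.random.default_rng(2).standard_normal(n)   # a signal on the vertices
x_hat = eigvecs.T @ x               # forward graph Fourier transform
x_rec = eigvecs @ x_hat             # inverse transform recovers the signal

print("graph frequencies:", np.round(eigvals, 3))
print("reconstruction error:", np.linalg.norm(x - x_rec))
```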

Weekly Seminar – 3/3/2017 – Low Rank and Sparse Signal Processing #1

Songtao Lu from Dr. Wang’s research group will start the series of talks on low rank and sparse signal processing in our weekly meetings. The details are as follows:
Date: March 3rd, Friday
Time: 3:00 – 4:00 PM
Venue: 2222 Coover hall
Songtao’s abstract: This Friday I will mainly talk about low rank matrix factorization with applications to machine learning, including spectral clustering and nonnegative matrix factorization. If time permits, I plan to introduce some recent related work on deep neural networks for dimensionality reduction.
Selected references:
Slides: LMF
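As a rough illustration of the low-rank idea behind spectral clustering mentioned in the abstract above, here is a minimal two-cluster sketch; the planted two-block graph and its parameters are illustrative assumptions, not material from the talk.

```python
import numpy as np

# Minimal sketch of spectral clustering: the cluster structure of a graph is
# read off a low-rank eigenspace of its Laplacian. Here two planted clusters
# are recovered from the sign pattern of the Fiedler vector.

rng = np.random.default_rng(3)
n = 40                                    # 20 nodes per planted cluster
labels_true = np.repeat([0, 1], n // 2)

# Random graph: dense connections inside clusters, sparse across them.
p_in, p_out = 0.6, 0.05
prob = np.where(labels_true[:, None] == labels_true[None, :], p_in, p_out)
A = (rng.random((n, n)) < prob).astype(float)
A = np.triu(A, 1)                         # keep upper triangle, no self-loops
A = A + A.T                               # symmetrize

L = np.diag(A.sum(axis=1)) - A            # combinatorial graph Laplacian
eigvals, eigvecs = np.linalg.eigh(L)

# The Fiedler vector (second-smallest eigenvector) splits the two clusters.
fiedler = eigvecs[:, 1]
labels_pred = (fiedler > 0).astype(int)

agreement = max(np.mean(labels_pred == labels_true),
                np.mean(labels_pred != labels_true))
print(f"clustering agreement with planted labels: {agreement:.2f}")
```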