Talk titled "A Principled and Gradual Strategy to Machine Learning" by Prof. Hamid Krim

Dear colleagues:

Please find below some information about the next lecture to be held within the iTEAM Communications and Multimedia Lectures programme:


A Principled and Gradual Strategy to Machine Learning

Prof. Hamid Krim

VISSTA Laboratory, North Carolina State University, Raleigh, NC, USA

  • Date: Wednesday, Dec. 22, 2021
  • Time: 15:00 h.
  • Location: iTEAM Meeting Room, Building 8G, access D, 4th Floor, Universitat Politècnica de València, Camino de Vera s/n, 46022 Valencia.

Abstract:

A good representation of measured data is almost always key to their successful exploitation. We will argue in this presentation that there is a unifying representation theme, mediating between the variations necessary to extract information adapted to the task at hand. Capturing the structure of the information at different scales is sought by different techniques which, by also distilling the data in creative ways, yield adapted models that unveil a solution. Building on classical statistical and non-parametric PCA as well as on canonical basis representations, non-linear properties are invoked to construct decomposition criteria for the data, resulting in increasingly complex and informative structures.

Starting with the fact that information typically enjoys a limited number of degrees of freedom relative to the ambient space, we propose a lower-rank structure for the information space relative to its embedding space. Exploiting the natural self-representativity of the data strongly suggests the flexible union-of-subspaces (UoS) model, a piecewise-linear (PWL) generalization of a linear subspace model. This proposed structure preserves the simplicity of linear subspace models, with the additional capacity of a PWL approximation of nonlinear data. We show a sufficient condition for using ℓ1 minimization to reveal the underlying UoS structure, and further propose a bi-sparsity model (RoSure) as an effective strategy to recover the UoS characterization of the given data from non-conforming errors/corruptions.
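
(For intuition, one plausible way to write such a bi-sparse self-representation program, offered here only as an illustrative sketch and not necessarily the exact RoSure formulation presented in the talk, is

    % illustrative sketch only; notation is ours, not the speaker's
    \min_{X,\,C,\,E} \; \|C\|_{1} + \lambda \|E\|_{1}
    \quad \text{s.t.} \quad Y = X + E, \quad X = XC, \quad \operatorname{diag}(C) = 0,

where Y is the observed data matrix, X is its UoS-conforming part whose columns sparsely represent one another through the coefficient matrix C, E collects the sparse non-conforming corruptions, and λ is a trade-off parameter.)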
This structural characterization, albeit powerful for many applications, can be shown to be limited for large-scale data (e.g., images) with commonly shared features. We make a case for further refinement by invoking a joint and principled scale-structure atomic characterization, which is demonstrated to improve performance. The resulting Deep Dictionary Learning approach is based on symbiotically formulating a classification problem regularized by a reconstruction problem. A theoretical rationale is also provided to contrast this work with Convolutional Neural Networks, with demonstrably competitive performance, and to pursue additional nonlinear perspectives using Volterra kernels. Substantiating examples are provided, and the application and performance of these approaches are shown for a wide range of problems such as video segmentation and object classification.
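
(Again only as an illustrative sketch, and not the specific Deep Dictionary Learning objective of the talk, a classification problem regularized by a reconstruction problem can generically be pictured as

    % generic sketch of classification regularized by reconstruction; all symbols are assumptions
    \min_{D,\,W,\,\{\alpha_i\}} \; \sum_i \mathcal{L}\!\big(y_i,\, W\alpha_i\big)
    \;+\; \lambda \sum_i \Big( \|x_i - D\alpha_i\|_2^2 + \mu \|\alpha_i\|_1 \Big),

where x_i are data samples with labels y_i, D is a (possibly multi-layer) dictionary, α_i are sparse codes, W is a classifier acting on the codes, \mathcal{L} is a classification loss, and λ, μ are trade-off parameters.)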

Biography:

Hamid Krim (ahk@ncsu.edu) received his B.Sc., M.Sc., and Ph.D. degrees in Electrical Engineering. He was a Member of Technical Staff at AT&T Bell Labs, where he conducted R&D in the areas of telephony and digital communication systems/subsystems. Following an NSF postdoctoral fellowship at Foreign Centers of Excellence, LSS/University of Orsay, Paris, France, he joined the Laboratory for Information and Decision Systems, MIT, Cambridge, MA, as a Research Scientist, performing and supervising research. He is presently Professor of Electrical Engineering in the ECE Department, North Carolina State University, Raleigh, where he leads the Vision, Information and Statistical Signal Theories and Applications (VISSTA) Laboratory. His research interests are in statistical signal and image analysis and mathematical modeling, with a keen emphasis on applied problems in classification and recognition using geometric and topological tools. His research has been funded by many federal and industrial agencies, including an NSF CAREER award. He has served on the IEEE SP editorial board and on the SPTM and Big Data Initiative technical committees, as well as an Associate Editor of the IEEE Transactions on Signal and Information Processing over Networks and of the IEEE Signal Processing Magazine. He was also one of the 2015-2016 Distinguished Lecturers of the IEEE Signal Processing Society.