Research Interests and Biography
I have recently joined Microsoft Research Cambridge as a Research Machine Learning Engineer. Please contact me on LinkedIn.
Previously, I was a postdoctoral fellow at Harvard University, where I worked on computational machine learning and representation learning. My research focused on finding interpretable signal representations via signal processing, optimization, and machine learning methods. I was part of CRCS and the SEAS department, collaborated with the LINKSLAB and CRISP research groups, and contributed to IACS courses. A few of the projects I worked on include:

Ground metric learning using the Earth Mover's Distance (EMD): The EMD defines a metric between discrete probability distributions (i.e., histograms) as the minimum cost of transporting probability mass from one distribution to the other. The name comes from the analogy of moving a pile of dirt into another, a problem formulated by Gaspard Monge in 1781. We focus on learning a ground metric (i.e., the pairwise transportation costs) from a set of reference distances and corresponding histograms, and we consider both regression and classification models under a unifying framework. An application of interest is characterizing the space of flavors and fragrances to aid in the creation of products.
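To make the role of the ground metric concrete, here is a minimal sketch (not the project's learning method) of computing the EMD between two histograms for a *given* cost matrix, by solving the underlying transportation linear program with SciPy; the histograms and the |i - j| ground metric are illustrative:

```python
import numpy as np
from scipy.optimize import linprog

def emd(p, q, C):
    """EMD between histograms p, q under ground metric C,
    solved as a transportation linear program over the plan T."""
    n, m = len(p), len(q)
    c = C.ravel()  # objective: minimize sum_ij C[i,j] * T[i,j]
    A_eq = np.zeros((n + m, n * m))
    for i in range(n):
        A_eq[i, i * m:(i + 1) * m] = 1.0   # row marginals: sum_j T[i,j] = p[i]
    for j in range(m):
        A_eq[n + j, j::m] = 1.0            # column marginals: sum_i T[i,j] = q[j]
    b_eq = np.concatenate([p, q])
    res = linprog(c, A_eq=A_eq, b_eq=b_eq, bounds=(0, None))
    return res.fun

# Two histograms on 3 bins; ground metric = |i - j| (bin distance)
p = np.array([0.5, 0.5, 0.0])
q = np.array([0.0, 0.5, 0.5])
C = np.abs(np.subtract.outer(np.arange(3), np.arange(3))).astype(float)
cost = emd(p, q, C)  # each half of the mass moves one bin: total cost 1.0
```

In ground metric learning the entries of C become the unknowns, fit so that the resulting EMD values match the reference distances.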

Multiscale Hierarchical Sparse Coding (SC): Convolutional Neural Networks (CNNs) produce sparse activations across layers (feature maps), which can be associated with high filter responses. In this project we propose a hierarchical deep generative model that attains sparse representations across layers while learning the filters (dictionaries), inspired by neural network formulations. Our model decomposes each signal into a smooth component (a scale signal) and a detail component (a sparse representation), recursively across scales, similar to standard wavelet analysis. The resulting representation is interpretable (each layer selects a few learned filters) and uses a reduced number of parameters.
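As a point of reference for what a sparse code is, here is a minimal single-layer sketch (not the hierarchical model above): given a fixed dictionary D, ISTA recovers a code z minimizing 0.5·||x − Dz||² + λ·||z||₁; the dictionary and signal below are synthetic:

```python
import numpy as np

def ista(x, D, lam=0.01, n_iter=200):
    """Sparse code z for signal x over dictionary D via ISTA
    (gradient step on the quadratic term + soft-thresholding)."""
    L = np.linalg.norm(D, 2) ** 2              # Lipschitz constant of the gradient
    z = np.zeros(D.shape[1])
    for _ in range(n_iter):
        z = z - D.T @ (D @ z - x) / L          # gradient step
        z = np.sign(z) * np.maximum(np.abs(z) - lam / L, 0.0)  # soft-threshold
    return z

rng = np.random.default_rng(0)
D = rng.standard_normal((15, 30))
D /= np.linalg.norm(D, axis=0)                 # unit-norm atoms
z_true = np.zeros(30)
z_true[[3, 17]] = [1.0, -0.5]                  # signal uses only 2 of 30 atoms
x = D @ z_true
z = ista(x, D)                                 # z is sparse: most entries exactly 0
```

The hierarchical model repeats this idea across scales, coding the detail at each layer while passing the smooth component upward.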

Offline Recommender System for Music Songs: As part of our capstone course at IACS, in collaboration with Spotify, we developed an offline recommender system for the Music Streaming Sessions Dataset using reinforcement learning tools. Here is the published report.
Differentiable Neural Architecture Search (DARTS): As part of our capstone course at IACS, in collaboration with Google, we studied the performance of CNN architectures on scientific datasets (graphene structures, astronomical images, medical data, etc.) rather than on the standard, widely used benchmarks (CIFAR, SVHN, ImageNet, etc.). In this project we compare popular architectures with those found by DARTS on scientific datasets. Here is the published report.
I received my Ph.D. from Universidad Politecnica de Madrid (UPM), working on nonconvex optimization, game theory, and related applications. I received my Telecommunications Engineering degree from UPM and Technische Universitat Darmstadt (TUD), and my M.Sc. from the National University of Ireland Maynooth (NUIM). In 2015 I visited the University at Buffalo (UB) for a research collaboration with Prof. Gesualdo Scutari on nonconvex optimization. My research interests include optimal transport, optimization methods, and deep learning.
News
202011: I joined Microsoft Research Cambridge to work on the Antigen Map Project in collaboration with Adaptive Biotechnologies.
202001: I will be giving a talk at CentraleSupélec on Wednesday 15th on nonconvex quadratic programming and metric learning.
202001: I will be giving a talk at Universidad Autónoma de Madrid on Wednesday 8th about metric learning in EMD spaces.
201912: I will be attending CAMSAP 2019 in Guadeloupe (West Indies) to present our work on Dictionary Learning in Hierarchical Networks.
201911: I will be attending the Asian Conference on Machine Learning (ACML) in Nagoya and participating in the ACMLTAB workshop.
201910: I will be giving a talk at the Center for Research on Computation and Society (CRCS), entitled "Ground Metric Learning for Discrete Optimal Transport".
201906: We will be teaching a new master's course, "Introduction to Machine Learning and Statistics", and a Workshop on Data Science in Kigali, Rwanda. Thanks to the African Center of Excellence in Data Science (ACEDS) for the opportunity!