15-16 November 2021 - LORIA, Villers-lès-Nancy (France)

Invited speakers

Julie Digne

CNRS Researcher: Université Claude Bernard Lyon 1 - LIRIS

Talk: New tools for surface analysis

        In this talk we explore new tools for shape analysis. We consider surfaces and show how a local analysis of the angular oscillations and of the polynomial radial behavior around surface points leads to accurate normal estimation and to new integral invariants. A direct application of these integral invariants is geometric detail exaggeration. In the second part, we tackle the problem of finding relevant principal directions related to high-order differential properties. We link these directions to the eigendecomposition of symmetric tensors and show that they can be computed efficiently using the angular/radial polynomial decomposition introduced earlier. This leads to new applications in rigid registration.
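The angular/radial decomposition described above is specific to the talk. As a minimal, self-contained illustration of the general idea of local surface analysis around a point, here is the classic PCA-based normal estimate (a plane fit over a radius neighborhood), which is a simpler baseline than the speaker's method; the function name and radius are illustrative choices:

```python
import numpy as np

def estimate_normal(points, center, radius):
    """Estimate a surface normal at `center` from the neighbors within
    `radius`: the normal is taken as the eigenvector of the local
    covariance matrix with the smallest eigenvalue (classic PCA plane
    fit, not the angular/radial method of the talk)."""
    nbrs = points[np.linalg.norm(points - center, axis=1) < radius]
    cov = np.cov((nbrs - nbrs.mean(axis=0)).T)
    eigvals, eigvecs = np.linalg.eigh(cov)   # eigenvalues in ascending order
    return eigvecs[:, 0]                     # direction of least variance

# Usage: points sampled on the plane z = 0 with small noise;
# the estimated normal should be close to the z axis.
rng = np.random.default_rng(0)
pts = np.column_stack([rng.uniform(-1, 1, (200, 2)),
                       0.01 * rng.normal(size=200)])
n = estimate_normal(pts, np.array([0.0, 0.0, 0.0]), 0.5)
print(abs(n[2]))  # close to 1
```

The plane fit only recovers first-order geometry; the higher-order angular/radial analysis in the talk is precisely what allows going beyond it, to integral invariants and principal directions of higher-order tensors.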


Benjamin Perret

Associate Professor (HDR): Université Gustave Eiffel - LIGM

Talk: Hierarchical data analysis: definitions and optimization

        Hierarchical representations are used in many areas where the structures observable in the data depend on the chosen scale of observation. For example, in an image seen from a distance, one usually sees only the main objects, while at a closer distance one sees that these objects are themselves composed of sub-parts. In practice, most of the hierarchical representations used today are defined algorithmically, without optimizing a well-defined cost criterion. In this presentation, we will see some recent definitions and methods for hierarchical representations that fit into combinatorial or continuous optimization frameworks, leading to a better understanding of these structures and opening up the possibility of interaction with modern machine learning methods.

      The presentation will be divided into three parts. We will first see how the notion of watershed cuts naturally leads to hierarchies that are optimal at each scale of observation: such hierarchies are fast to compute and perform well in practice. Second, we will study a formalism for optimizing hierarchical clustering cost functions by gradient descent, which brings great flexibility and can be used in combination with deep networks. Finally, we will see how certain hierarchies, called component trees, can be used to define versatile topological regularization terms in continuous optimization, and how this approach relates to topological persistence.
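The watershed, gradient-descent, and component-tree constructions above are the talk's own contributions. As a minimal sketch of the underlying multiscale idea only, here is a standard agglomerative hierarchy built with SciPy (average linkage, an illustrative choice): cutting the same tree at different heights exposes different scales of observation, coarse clusters first, then their sub-parts:

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

# Two well-separated groups of 2-D points; an agglomerative hierarchy
# merges them bottom-up into a single tree.
rng = np.random.default_rng(1)
pts = np.vstack([rng.normal(0, 0.1, (20, 2)),
                 rng.normal(5, 0.1, (20, 2))])
Z = linkage(pts, method="average")

# Cutting the tree at different levels yields coarse or fine partitions.
coarse = fcluster(Z, t=2, criterion="maxclust")  # coarse scale: the 2 groups
fine = fcluster(Z, t=8, criterion="maxclust")    # finer scale: their sub-parts
print(len(set(coarse)))  # 2
print(len(set(fine)))
```

Note that this generic linkage hierarchy is exactly the kind of algorithmically defined representation the abstract contrasts with: it does not optimize an explicit cost criterion, which is what the optimization frameworks presented in the talk address.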
