
Rodrigo Maulen (Laboratoire de Probabilités, Statistique et Modélisation), "Attention-based clustering"
19 November 2025, 10:30–11:30
Transformers have emerged as a powerful neural network architecture capable of tackling a wide range of learning tasks. In this work, we provide a theoretical analysis of their ability to automatically extract structure from data in an unsupervised setting. In particular, we demonstrate their suitability for clustering when the input data is generated from a Gaussian mixture model. To this end, we study a simplified two-head attention layer and define a population risk whose minimization with unlabeled data drives the head parameters to align with the true mixture centroids. This phenomenon highlights the ability of attention-based layers to capture underlying distributional structure…
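Since the abstract describes a concrete setup (a simplified two-head attention layer trained on unlabeled Gaussian-mixture data by minimizing a population risk), a small toy sketch may help fix ideas. The parameterization, the reconstruction-style risk, the inverse-temperature `beta`, and all numbers below are illustrative assumptions, not the construction studied in the talk.

```python
# Illustrative sketch only: a toy two-head attention-style layer trained on
# unlabeled Gaussian-mixture data. The risk and parameterization are assumed
# for illustration; they are not the exact objects from the talk.
import torch

torch.manual_seed(0)

d, n = 2, 4000
centroids = torch.tensor([[3.0, 0.0], [-3.0, 0.0]])   # true mixture centroids
labels = torch.randint(0, 2, (n,))                    # hidden component labels (never used in training)
x = centroids[labels] + 0.5 * torch.randn(n, d)       # unlabeled samples from the mixture

# Two "heads": one trainable vector per head, playing the role of a centroid candidate.
W = torch.randn(2, d, requires_grad=True)
beta = 2.0                                            # softmax inverse temperature (assumed)

def forward(x):
    # Each sample attends over the two head vectors; the output is the
    # attention-weighted combination of the heads.
    scores = beta * x @ W.T                           # (n, 2)
    attn = torch.softmax(scores, dim=1)               # (n, 2)
    return attn @ W                                   # (n, d)

opt = torch.optim.Adam([W], lr=0.05)
for step in range(500):
    opt.zero_grad()
    # Empirical proxy for a population risk: mean squared reconstruction error.
    loss = ((x - forward(x)) ** 2).sum(dim=1).mean()
    loss.backward()
    opt.step()

print("learned head vectors:\n", W.detach())
print("true centroids:\n", centroids)
```

Under reasonable separation and initialization, the two learned head vectors typically end up close to the two mixture means, which mirrors the alignment phenomenon described in the abstract.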
Event link: https://indico.math.cnrs.fr/event/15163/