Wednesday, February 4, 2026

Self-Supervised Learning with Gaussian Processes


Self-supervised learning (SSL) is a machine learning paradigm in which models learn the underlying structure of data without explicit supervision from labeled samples. Representations acquired through SSL have proven useful for many downstream tasks, including clustering and linear classification. To ensure smoothness of the representation space, most SSL methods rely on the ability to generate pairs of observations that are similar to a given instance. However, generating such pairs can be difficult for many types of data. Moreover, these methods do not account for uncertainty quantification and can perform poorly in out-of-sample prediction settings. To address these limitations, we propose Gaussian process self-supervised learning (GPSSL), a novel approach that applies Gaussian process (GP) models to representation learning. GP priors are placed on the representations, and a generalized Bayesian posterior is obtained by minimizing a loss function that encourages informative representations. The covariance function inherent in GPs naturally pulls the representations of similar units together, serving as an alternative to explicitly defined positive samples. We show that GPSSL is closely related to both kernel PCA and VICReg, a popular neural-network-based SSL method, but unlike both it provides posterior uncertainties that can be propagated to downstream tasks. Experiments on several datasets, covering classification and regression tasks, demonstrate that GPSSL outperforms traditional methods in terms of accuracy, uncertainty quantification, and error control.
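The abstract notes that GPSSL is closely related to kernel PCA, where the kernel's covariance structure pulls similar inputs toward similar representations. The following is a minimal, hypothetical sketch of that kernel PCA connection only, not the authors' GPSSL posterior; the RBF kernel, lengthscale, and number of components are all assumptions made for illustration:

```python
import numpy as np

def rbf_kernel(X, lengthscale=1.0):
    # Gaussian (RBF) covariance: nearby inputs get high covariance,
    # which is what "pulls similar units together" in a GP model.
    sq = np.sum(X**2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    return np.exp(-0.5 * d2 / lengthscale**2)

def kernel_pca_representations(X, n_components=2, lengthscale=1.0):
    """Toy kernel PCA stand-in for GPSSL-style representations:
    top eigenvectors of the centred kernel matrix."""
    n = X.shape[0]
    K = rbf_kernel(X, lengthscale)
    H = np.eye(n) - np.ones((n, n)) / n      # centring matrix
    Kc = H @ K @ H
    vals, vecs = np.linalg.eigh(Kc)          # eigenvalues in ascending order
    idx = np.argsort(vals)[::-1][:n_components]
    # Scale eigenvectors by the root eigenvalues (standard kernel PCA scores).
    return vecs[:, idx] * np.sqrt(np.maximum(vals[idx], 0.0))

rng = np.random.default_rng(0)
X = rng.normal(size=(50, 5))
Z = kernel_pca_representations(X, n_components=2)
print(Z.shape)  # (50, 2)
```

Unlike this deterministic sketch, GPSSL as described above yields a full posterior over representations, so downstream classifiers or regressors can consume uncertainties rather than point estimates.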
