SSL-EY


File list:
LICENSE (1,078 bytes, 2023-12-08)
augmentations.py (3,692 bytes, 2023-12-08)
distributed.py (3,145 bytes, 2023-12-08)
main_ssley.py (10,374 bytes, 2023-12-08)
resnet.py (10,625 bytes, 2023-12-08)
schematic.pdf (49,813 bytes, 2023-12-08)
schematic.svg (89,567 bytes, 2023-12-08)
ssley.py (1,132 bytes, 2023-12-08)

# SSL-EY: Maximizing Correlation in Self-Supervised Learning

**It's. pronounced. "Slay".**

[![arXiv](https://img.shields.io/badge/Arxiv-2310.01012-red?logo=arxiv&logoColor=red)](https://arxiv.org/abs/2310.01012)
SSL-EY (Self-Supervised Learning with an Eckart-Young characterization) is a novel approach to self-supervised learning based on Canonical Correlation Analysis; the objective is sketched below. This repository hosts the official PyTorch implementation, featuring a simplified design inspired by the [VICReg](https://github.com/facebookresearch/vicreg/blob/main/README.md) repository. The work is featured in:

- [CCA with Shared Weights for Self-Supervised Learning](https://openreview.net/forum?id=7rYseRZ7Z3), presented at the [NeurIPS 2023 Workshop: Self-Supervised Learning - Theory and Practice](https://neurips.cc/virtual/2023/80864)
- [Efficient Algorithms for the CCA Family: Unconstrained Objectives with Unbiased Gradients](https://arxiv.org/abs/2310.01012)
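For orientation, the Eckart-Young characterization trades the usual constrained CCA problem for an unconstrained objective that can be optimized with stochastic gradients. A plausible form of the resulting loss, up to scaling and centering conventions (the papers above give the exact statement), is

$$
\mathcal{L}_{\mathrm{EY}} \;=\; -\,2\,\operatorname{tr}\!\left(C_{AB} + C_{BA}\right) \;+\; \bigl\lVert C_{AA} + C_{BB} \bigr\rVert_F^{2},
$$

where $Z_A = f_\theta(X_A)$ and $Z_B = f_\theta(X_B)$ are the embeddings of two augmented views of a batch, $C_{AB}$ is their empirical cross-covariance, and $C_{AA}$, $C_{BB}$ are the within-view covariances. The trace term rewards correlation between the views, while the Frobenius-norm term controls the size and redundancy of the within-view covariances.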

*Schematic: see `schematic.svg` / `schematic.pdf` in this repository.*

## Training

Install [PyTorch](http://pytorch.org) and plug our loss into your pipeline (a sketch is given after the implementation notes below). Download [ImageNet](https://imagenet.stanford.edu/) and follow the instructions in [VICReg](https://github.com/facebookresearch/vicreg) for the distributed training scripts.

## Other Implementations

Our loss function also slots into public SSL software pipelines.

### solo-learn

The results in our papers were produced from our [public fork of solo-learn](https://github.com/jameschapman19/solo-learn).

### lightly

We have also set up SSL-EY in our [lightly fork](https://github.com/jameschapman19/lightly).
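To make the "plug our loss in" step from the Training section concrete, here is a minimal sketch of an EY-style loss for a joint-embedding setup in plain PyTorch. It follows the objective sketched above, up to constants and centering conventions; the names `ssl_ey_loss`, `backbone`, and `projector` are illustrative and may differ from the repository's `ssley.py` and `main_ssley.py`.

```python
import torch


def ssl_ey_loss(z_a: torch.Tensor, z_b: torch.Tensor) -> torch.Tensor:
    """Sketch of an EY-style SSL loss for two batches of embeddings (N x D).

    Minimizes -2*tr(C_AB + C_BA) + ||C_AA + C_BB||_F^2, where the C's are
    empirical (cross-)covariance matrices of the two views' embeddings.
    """
    n = z_a.size(0)
    # Centre each view across the batch.
    z_a = z_a - z_a.mean(dim=0)
    z_b = z_b - z_b.mean(dim=0)

    # Empirical cross- and within-view covariance matrices (D x D).
    c_ab = z_a.T @ z_b / (n - 1)
    c_aa = z_a.T @ z_a / (n - 1)
    c_bb = z_b.T @ z_b / (n - 1)

    # Negate the correlation term so the objective can be minimized with SGD/LARS.
    correlation = 2 * torch.trace(c_ab + c_ab.T)
    redundancy = torch.linalg.norm(c_aa + c_bb, ord="fro") ** 2
    return -correlation + redundancy


def training_step(backbone, projector, x_a, x_b):
    """Illustrative step: encode two augmented views and apply the loss."""
    z_a = projector(backbone(x_a))
    z_b = projector(backbone(x_b))
    return ssl_ey_loss(z_a, z_b)
```

In a multi-GPU run, the covariance terms would normally be computed from embeddings gathered across devices (presumably what `distributed.py` handles); the sketch keeps everything on a single device for clarity.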
## Pre-trained Models

You can choose to download only the weights of the pretrained backbone used for downstream tasks, or the full checkpoint, which contains both backbone and projection head weights.

| arch | params | accuracy | download |
| --- | --- | --- | --- |
| ResNet-50 | 23M | Work in Progress | Work in Progress |
| ResNet-50 (x2) | 93M | 75.5% | Work in Progress |
| ResNet-200 (x2) | 250M | 77.3% | Work in Progress |
## Pretrained models on PyTorch Hub

Work in progress.

## License

SSL-EY is released under the MIT License, allowing commercial use. See [LICENSE](LICENSE) for details.

## Citation

If you find this repository useful, please consider giving it a star and citing:

    @misc{chapman2023efficient,
      title={Efficient Algorithms for the CCA Family: Unconstrained Objectives with Unbiased Gradients},
      author={James Chapman and Lennie Wells and Ana Lawry Aguila},
      year={2023},
      eprint={2310.01012},
      archivePrefix={arXiv},
      primaryClass={cs.LG}
    }

    @inproceedings{chapman2023cca,
      title={{CCA} with Shared Weights for Self-Supervised Learning},
      author={James Chapman and Lennie Wells},
      booktitle={NeurIPS 2023 Workshop: Self-Supervised Learning - Theory and Practice},
      year={2023},
      url={https://openreview.net/forum?id=7rYseRZ7Z3}
    }
