A complete end-to-end pipeline for LLM interpretability with sparse autoencoders (SAEs) using Llama 3.2, written in pure PyTorch and fully reproducible.
Implementation of a stacked denoising autoencoder in TensorFlow
Pivotal Token Search
Official Code for Paper: Beyond Matryoshka: Revisiting Sparse Coding for Adaptive Representation
PyTorch implementations of various types of autoencoders
SANSA - sparse EASE for millions of items
Sparse Embedding Compression for Scalable Retrieval in Recommender Systems
TensorFlow Examples
Official Triton kernels for TopK and HierarchicalTopK Sparse Autoencoder decoders.
Multi-Layer Sparse Autoencoders (ICLR 2025)
Sparse Autoencoders (SAE) vs CLIP fine-tuning fun.
Answers the question "How to do patching on all available SAEs on GPT-2?". Official implementation of the paper "Evaluating Open-Source Sparse Autoencoders on Disentangling Factual Knowledge in GPT-2 Small"
[JAMIA] Official repository of Deep Propensity Network - Sparse Autoencoder (DPN-SA)
Interpret and control dense embeddings via sparse autoencoders.
Repository for "From What to How: Attributing CLIP's Latent Components Reveals Unexpected Semantic Reliance"
Collection of autoencoder models in TensorFlow
[NeurIPS 2025] This is the official repository for VL-SAE: Interpreting and Enhancing Vision-Language Alignment with a Unified Concept Set
Semi-supervised learning for digit recognition using a sparse autoencoder
Implementations and Experiments: Transformers, RoPE, KV cache, SAEs, Tokenisers
Official code for NeurIPS 2025 paper "Revising and Falsifying Sparse Autoencoder Feature Explanations".
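Many of the repositories listed above implement variants of the same core object: a sparse autoencoder trained to reconstruct model activations through an overcomplete, sparsely activated latent layer. As a rough orientation only (not taken from any listed repo), the sketch below shows a minimal TopK sparse autoencoder in PyTorch; the class name, dimensions, and the `k` value are illustrative assumptions, not any repository's actual API.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class TopKSparseAutoencoder(nn.Module):
    """Minimal illustrative TopK SAE (hypothetical names and sizes)."""

    def __init__(self, d_model: int, d_latent: int, k: int):
        super().__init__()
        self.k = k
        self.encoder = nn.Linear(d_model, d_latent)              # activations -> latents
        self.decoder = nn.Linear(d_latent, d_model, bias=False)  # latents -> reconstruction
        self.bias = nn.Parameter(torch.zeros(d_model))           # pre-encoder / post-decoder bias

    def forward(self, x: torch.Tensor):
        # Encode, then keep only the k largest latent activations per example.
        latents = F.relu(self.encoder(x - self.bias))
        topk = torch.topk(latents, self.k, dim=-1)
        sparse = torch.zeros_like(latents).scatter_(-1, topk.indices, topk.values)
        recon = self.decoder(sparse) + self.bias
        return recon, sparse


# Toy usage: reconstruct a batch of fake residual-stream activations.
sae = TopKSparseAutoencoder(d_model=256, d_latent=1024, k=16)
x = torch.randn(8, 256)
recon, sparse = sae(x)
loss = F.mse_loss(recon, x)  # reconstruction objective
loss.backward()
```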