AI Safety Research Tool
flinch
AI content-restriction research tool. Probe how models across providers handle sensitive content, with pushback coaching and structured data export.
MIT AI Safety Python
LLM Interpretability Explorer
pry
Local-first desktop app for exploring what small transformer LLMs are doing inside. Attention visualization, logit lens, SAE features, and more.
MIT Interpretability Tauri
Local AI Image Generation
blink
Dead-simple local AI image generation. Type a prompt, get an image. No Python, no node graphs. Supports SD 1.5, SDXL, Flux, and more.
MIT Tauri Rust
Claude Code Plugins
beargle-plugins
A curated collection of Claude Code plugins. Install individually or browse the repo.
MIT Claude Code