Papers and resources related to the security and privacy of LLMs 🤖
Updated Jun 8, 2025 · Python
[NeurIPS D&B '25] The one-stop repository for LLM unlearning
Python package for measuring memorization in LLMs.
The fastest Trust Layer for AI Agents
An Execution Isolation Architecture for LLM-Based Agentic Systems
A comprehensive resource hub compiling all LLM papers accepted at the International Conference on Learning Representations (ICLR) in 2024.
LLM security and privacy
LLM Platform Security: Applying a Systematic Evaluation Framework to OpenAI's ChatGPT Plugins
Make Zettelkasten-style note-taking the foundation of interactions with Large Language Models (LLMs).
User-friendly LLM interface, self-hosted, offline, and privacy-first.
Semantic Privacy Guard: A Java middleware that intercepts text, identifies PII using a three-layer hybrid pipeline (Regex + Naive Bayes ML + Apache OpenNLP NER), and redacts it before it reaches an LLM or leaves the corporate network — with stream-based processing for memory-efficient handling of large files and log streams.
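The layered-redaction idea behind this kind of middleware can be sketched in a few lines. The snippet below shows only the first (regex) layer in Python for brevity; the project described above is Java and adds Naive Bayes and OpenNLP NER layers on top. All pattern names and placeholder tokens here are illustrative assumptions, not the project's API.

```python
import re

# Illustrative regex layer of a multi-layer PII pipeline.
# Patterns and placeholder labels are assumptions for this sketch.
PII_PATTERNS = {
    "EMAIL": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.-]+\b"),
    "SSN": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "PHONE": re.compile(r"\b\d{3}[-.\s]\d{3}[-.\s]\d{4}\b"),
}

def redact(text: str) -> str:
    """Replace regex-detected PII with typed placeholders before the
    text reaches an LLM. Later layers (ML classifier, NER) would catch
    entities these fixed patterns miss."""
    for label, pattern in PII_PATTERNS.items():
        text = pattern.sub(f"[{label}]", text)
    return text

# redact("Contact jane@example.com or 555-123-4567")
# -> "Contact [EMAIL] or [PHONE]"
```

Running redaction before any network call is what keeps the PII inside the corporate boundary, regardless of which LLM endpoint sits downstream.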
Semantic PII Masking & Anonymization for LLMs (RAG). GDPR-compliant, reversible, and context-aware. Supports LangChain & OpenAI
🔒 Detect security leaks in AI-assisted codebases. Static analysis tool for Python & JS/TS with cross-file taint tracking.
A 3-tier framework for controlling your AI privacy — from open use to full isolation.
Example of running last_layer with FastAPI on Vercel.
Sensitive data detection and masking for text files. Scans markdown files for sensitive patterns across five detection domains, replaces matches with deterministic tokens, and stores originals in an AES-256-GCM encrypted vault.
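The deterministic-token approach described above can be sketched briefly: the same sensitive value always maps to the same token, and a vault maps tokens back to originals. This is a stdlib-only sketch with an HMAC-derived token and a plain dict standing in for the vault, which the tool described above encrypts with AES-256-GCM; the key, token format, and email-only pattern are assumptions of this example.

```python
import hmac
import hashlib
import re

SECRET = b"demo-key"  # hypothetical key; derive one per deployment
EMAIL_RE = re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.-]+\b")

def mask(text: str, vault: dict) -> str:
    """Replace each email with a deterministic token and record the
    original in the vault (encrypted at rest in the real tool)."""
    def token(match):
        value = match.group(0)
        # HMAC keeps tokens deterministic: identical values get
        # identical tokens, so cross-references in the text survive.
        tok = "TOK_" + hmac.new(SECRET, value.encode(),
                                hashlib.sha256).hexdigest()[:8]
        vault[tok] = value
        return tok
    return EMAIL_RE.sub(token, text)

def unmask(text: str, vault: dict) -> str:
    """Restore originals from the vault, reversing the masking."""
    for tok, value in vault.items():
        text = text.replace(tok, value)
    return text
```

Determinism matters here: if the same address appeared as two different tokens, downstream consumers (or an LLM reasoning over the masked text) could no longer tell that two mentions refer to the same entity.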
🛡️ Privacy-aware OpenClaw plugin. Classifies messages into S1/S2/S3 sensitivity tiers — keeps private data local, redacts PII before cloud. Built by Centrase AI.
Build memory systems for AI agents with persistent recall, context management, and structured storage