
vigil-llm

Public

Vigil: Detect prompt injections, jailbreaks, and other potentially risky Large Language Model (LLM) inputs

Created: 2023-09-05T01:02:21
Updated: 2025-03-26T08:24:10
https://vigil.deadbits.ai/
Stars: 370 (+1)
