How to Add Guardrails to LLM Apps with NeMo Guardrails
Protect your LLM application from jailbreaks, off-topic use, and harmful outputs in under 50 lines