Observability

When applications interact with LLMs or MCP servers, every request and response is a potential attack surface. One way to add protection is to put a proxy at the edge, where you can inspect traffic and enforce security rules. Just as firewalls and WAFs shield web apps from SQL injection or XSS, a proxy can serve as an "AI firewall" to defend against risks like those in the OWASP Top 10 for LLMs. In this article, I will walk through how to build such a firewall using Nginx, OpenResty, and Lua.
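To make the idea concrete before diving in, here is a minimal sketch of the pattern: an OpenResty `location` that intercepts requests bound for an LLM backend and runs a Lua access-phase check before proxying. The upstream name `llm_backend`, the path `/v1/chat/completions`, and the denylist phrases are all placeholders for illustration, not part of any real deployment.

```nginx
http {
    # Hypothetical upstream; point this at your actual LLM API or MCP server.
    upstream llm_backend {
        server 127.0.0.1:9000;
    }

    server {
        listen 8080;

        location /v1/chat/completions {
            # access_by_lua_block runs in the access phase, before proxying,
            # so we can inspect and reject requests here.
            access_by_lua_block {
                ngx.req.read_body()
                local body = ngx.req.get_body_data() or ""

                -- Toy denylist: block requests containing obvious
                -- prompt-injection phrases (plain substring match).
                local denylist = {
                    "ignore previous instructions",
                    "reveal your system prompt",
                }
                local lowered = body:lower()
                for _, phrase in ipairs(denylist) do
                    if lowered:find(phrase, 1, true) then
                        ngx.log(ngx.WARN, "AI firewall: blocked suspicious prompt")
                        return ngx.exit(ngx.HTTP_FORBIDDEN)
                    end
                end
            }

            proxy_pass http://llm_backend;
        }
    }
}
```

A substring denylist is far too crude for production; the point is only to show where in the request lifecycle the inspection hook lives. The rest of the article builds on this skeleton with more realistic rules.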