At cloudsineAI, we believe that as GenAI continues to evolve, so must our approach to securing it. On 2nd April, our CEO and Founder, Matthias Chin, joined an expert panel at the inaugural AI Summit at Black Hat Asia 2025 to discuss a rising concern in the cybersecurity space: Are LLM firewalls the future of AI security?
The session brought together cybersecurity leaders and AI practitioners to explore the growing need to secure Large Language Models (LLMs), which are rapidly being adopted across sectors from finance and healthcare to government and critical infrastructure.

Why LLM Firewalls Are Becoming Essential
LLMs are powerful, but they’re also vulnerable. From prompt injection attacks to sensitive data leakage and hallucinations, organisations are beginning to see real-world risks tied to these systems.
During the panel discussion, Matthias pointed out that LLM firewalls are emerging as a critical solution to mitigate these threats, much like how traditional firewalls were deployed to secure enterprise networks.
These firewalls act as real-time protection layers that control how LLMs process and respond to user prompts. By intercepting malicious inputs and filtering risky outputs, they help keep AI systems safe, compliant, and aligned with business intent.
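To make the intercept-and-filter idea concrete, here is a minimal sketch in Python. This is an illustration of the general pattern only, not cloudsineAI's implementation: the deny-list patterns, function names, and redaction rule are all hypothetical, and a real LLM firewall would rely on far more sophisticated detection than simple regular expressions.

```python
import re

# Hypothetical deny-list of prompt-injection phrases (illustrative only).
INJECTION_PATTERNS = [
    re.compile(r"ignore (all )?previous instructions", re.IGNORECASE),
    re.compile(r"reveal your system prompt", re.IGNORECASE),
]

# Simple PII pattern used to redact risky output (illustrative only).
EMAIL_PATTERN = re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+")

def screen_prompt(prompt: str) -> bool:
    """Return True if the prompt looks safe to pass to the LLM."""
    return not any(p.search(prompt) for p in INJECTION_PATTERNS)

def filter_output(text: str) -> str:
    """Redact sensitive data from the model's response before returning it."""
    return EMAIL_PATTERN.sub("[REDACTED]", text)
```

In this pattern, `screen_prompt` sits in front of the model to block malicious inputs, while `filter_output` sits behind it to catch risky outputs, so the application code between them never handles unscreened traffic.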
The cloudsineAI Approach
At cloudsineAI, we are actively developing an LLM firewall, WebOrion® Protector Plus, designed to secure modern GenAI applications. Our vision includes:
✅ Runtime prompt filtering to block unsafe or unauthorised queries
✅ Customisable guardrails based on role, industry, or policy
✅ Low-latency protection suitable for real-time use cases
We believe that AI safety starts with visibility and control, and LLM firewalls are a key part of this layered security strategy.
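Customisable guardrails by role or policy, as in the list above, could be expressed as a simple policy mapping. The sketch below is an assumption about how such a configuration might look, not actual product configuration; the role names and rules are invented for illustration.

```python
# Hypothetical role-based guardrail policy: each role gets its own
# set of blocked topics and an output length cap (illustrative only).
GUARDRAILS = {
    "support_agent": {"blocked_topics": {"pricing", "legal"}, "max_output_tokens": 512},
    "analyst":       {"blocked_topics": {"pii"},              "max_output_tokens": 2048},
}

def is_allowed(role: str, topic: str) -> bool:
    """Check whether a given role may query the LLM about a topic."""
    policy = GUARDRAILS.get(role)
    if policy is None:
        return False  # unknown roles are denied by default
    return topic not in policy["blocked_topics"]
```

Keeping the policy as data rather than code lets security teams adjust guardrails per role or industry without redeploying the application.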
Leading the Conversation at Black Hat Asia 2025
We’re proud that our CEO represented cloudsineAI at the inaugural AI Summit at Black Hat Asia 2025, contributing to an important conversation shaping the future of AI security.
The strong turnout and engagement at the panel showed that LLM firewall technology is more than a buzzword — it’s a necessity.
Want to learn more about our LLM firewall? Check it out here.