
Month: January 2023

jaikrishnan Publications 0

Confidential Computing for AI: Hardening Model Secrets with SGX and Nitro Enclaves

A January 2024 white paper from Microsoft’s Office of the Chief Economist reported a 22% drop in task duration for experienced SOC analysts using Security Copilot. Jai, who advises a Fortune 500 security operations center, says that integrating retrieval-augmented LLMs into their triage workflow produced even sharper results. “We cut more than half the minutes out of every triage,” Jai shares. “The average alert dropped from eleven minutes to under five.” These results, he says, came not from generative chat, but from disciplined engineering decisions that gave the model access only to what it needed, nothing more.

Jai’s Background in Large-Scale Cyber Analytics

In this space Jai is recognized for turning research into production platforms that pass enterprise audit. Over the past decade he has built log pipelines that handle tens of petabytes each month, introduced zero-trust controls across multi-cloud SOCs, and authored reference blueprints on retrieval-augmented detection cited by industry working groups on AI for cyber defence. Colleagues respect his blend of data-engineering rigor and focus on measurable analyst productivity, qualities that underpin the results described here.

Retrieval Comes Before Reasoning

The real bottleneck in threat hunting, Jai explains, is narrowing down petabytes of logs into the few kilobytes that matter. “You don’t want the model guessing. You want it reading the right five lines.” His team implemented three core retrieval strategies:

- Chunking logs into ~300-token blocks for better recall
- Embedding those chunks with metadata like timestamps and MITRE tags
- Enforcing a refresh cadence of under five seconds for high-velocity sources like auth logs

Two Calls, Not One

Instead of direct prompting, the architecture separates retrieval from reasoning. A gRPC service first fetches the top-k relevant events, which are then passed into a tightly scoped prompt. “The model only sees curated context.
It’s cheaper, faster, and audit-safe,” Jai notes. That setup ensures flat costs per query, evidence-cited output, and a cacheable retrieval layer that keeps end-to-end latency under 300 milliseconds.

A Prompt That Refuses to Wander

Open chat is banned. The template exposes four short fields: Indicator, Context, Hypothesis, and Recommended Action. Temperature sits at 0.1. A post-run checker discards any reply lacking a quoted evidence line. “If the model cannot ground its claim, we never see it,” Jai notes.

Scoring That Integrates Seamlessly

The model outputs a triage score between 0 and 100. Alerts scoring above 80 are promoted into a fast lane already trusted by human analysts. After eight weeks, the SOC reported 70% agreement between model scores and analyst decisions, while false escalations remained under 3%.

Hardware Footprint Remains Modest

In the pilot, a global manufacturer indexed thirty days of Sentinel, CrowdStrike, and Zeek telemetry, around 1.2 billion vectors in total. The system ran on four NVIDIA A10G nodes for vector search and a single L4 cluster for prompt inference. No other infrastructure was modified. Across the same window:

- Mean triage time dropped from 11.4 to 4.6 minutes
- Daily analyst throughput rose from 170 to 390 alerts
- The false positive rate remained unchanged

Governance Keeps Trust Intact

- Evidence retention. Every retrieved snippet and generated answer is stored with the incident ticket.
- Version freeze. The model stays fixed for ninety days; upgrades rerun calibration tests before release.
- Role boundary. Only tier-two analysts may convert model advice into automated remediation steps.

“These gates satisfy audit without slowing the flow,” Jai says.

The Leadership Perspective

Retrieval-augmented language models remove roughly sixty percent of manual triage time when search, prompt, and governance are engineered together.
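The two-call pattern behind that figure, a retrieval call that narrows context followed by a scoped scoring call with the 80-point fast-lane gate, can be sketched roughly as follows. The `retrieve` and `score_model` callables are hypothetical stand-ins, not the team's actual interfaces; the real system uses a gRPC retrieval service and a GPU inference cluster.

```python
from typing import Callable

# Assumption drawn from the article: scores above 80 enter the
# analyst-trusted fast lane.
FAST_LANE_THRESHOLD = 80


def triage(alert: str,
           retrieve: Callable[[str, int], list[str]],
           score_model: Callable[[str, list[str]], int],
           k: int = 5) -> dict:
    """Two calls, not one: first narrow the logs to the top-k relevant
    events, then let the model reason over only that curated context
    and emit a 0-100 triage score."""
    events = retrieve(alert, k)          # call 1: retrieval (gRPC in the real system)
    score = score_model(alert, events)   # call 2: scoped, evidence-grounded scoring
    return {
        "score": score,
        "evidence": events,              # stored with the incident ticket for audit
        "fast_lane": score > FAST_LANE_THRESHOLD,
    }
```

Because the model call only ever sees the curated top-k events, per-query cost stays flat and the retrieval layer can be cached independently of inference.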
Gains depend on three design choices: event-level chunking with rich metadata, a clear two-step search-then-reason pattern, and a prompt that enforces evidence citation. Hardware cost stays low because the system uses commodity GPU nodes for vector search and a small inference cluster. “We did not chase artificial chat magic,” Jai concludes. “We treated the model as a microservice, fed it hard context, and tied every suggestion to a line of log. The speed gain is measurable and the audit trail is airtight.” For CTOs seeking more coverage from the same headcount, Jai’s data shows that retrieval-augmented LLMs are ready for production testing today.
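The evidence-citation requirement, the third of those design choices, can be sketched as a fixed-field prompt plus a post-run grounding check. The field layout and function names below are illustrative assumptions, not the team's actual implementation.

```python
def build_triage_prompt(indicator: str, context_events: list[str]) -> str:
    """Render the four fixed fields; no open-ended chat is exposed.
    The exact field layout here is an illustrative guess."""
    context = "\n".join(context_events)
    return (
        f"Indicator: {indicator}\n"
        f"Context:\n{context}\n"
        "Hypothesis: <one sentence, must quote an evidence line verbatim>\n"
        "Recommended Action: <one imperative sentence>\n"
    )


def is_grounded(reply: str, context_events: list[str]) -> bool:
    """Post-run checker: accept the reply only if it quotes at least
    one retrieved evidence line verbatim; anything else is discarded
    before an analyst ever sees it."""
    return any(event and event in reply for event in context_events)
```

A reply like "Likely credential stuffing, per '4625 logon failure from 10.0.0.8'" passes only because the quoted line appears in the retrieved context; an ungrounded claim is silently dropped.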


LLM-Powered Threat Hunting: Building Retrieval-Augmented Workflows that Cut Triage Time 60%

In line with the Bahrain Economic Vision 2030, leading Bahraini bank enhances AI experiences for its clients

Al Salam Bank has signed a strategic deal with Denodo, a global leader in data management; AWS; and NAIB IT, a Bahrain-based systems integrator known for delivering high-impact technology solutions across banking, government, public sector, and enterprise organizations. The agreement aims to adopt the Denodo Platform to amplify the Bank’s data and AI infrastructure, in line with Bahrain’s Vision 2030 and the national direction toward digital transformation.

The signing ceremony was attended by Shaikha Dr. Dheya Bint Ebrahim Al Khalifa, Managing Director at NAIB IT; Mr. Anwar Murad, Deputy CEO – Banking at Al Salam Bank; Mr. Hemantha Wijesinghe, CTO at Al Salam Bank; and Mr. Gabriele Obino, Denodo Regional Vice President for Southern Europe and the Middle East and General Manager of Denodo Arabian Limited.

Through the Denodo Platform, Al Salam Bank will be able to unify its enterprise data across various systems, enabling faster decision-making and driving innovation. This step also reflects the Bank’s commitment to leading innovation in digital banking, in line with the Kingdom of Bahrain’s long-term economic vision.

Shaikha Dr. Dheya Bint Ebrahim Al Khalifa stated, “This strategic collaboration represents a significant milestone in Bahrain’s digital transformation journey. We are happy to facilitate partnerships that advance our nation’s technological capabilities and strengthen our position as a regional fintech hub. Through initiatives like this, we are building the foundation for a knowledge-based economy that aligns with Bahrain’s Vision 2030.”

“At Al Salam Bank, we are committed to remaining at the forefront of digital transformation within the financial sector,” said Anwar Murad, Deputy CEO – Banking at Al Salam Bank.
“This strategic partnership with Denodo and NAIB IT marks a significant step in advancing our digital maturity and optimizing the use of data and AI to better serve our clients. By harnessing real-time data integration and AI-powered analytics, we aim to enhance responsiveness, strengthen operational agility, and deliver a more personalized and seamless banking experience. This initiative goes beyond technology adoption; it represents our dedication to embedding intelligence into core operations, enabling informed decision-making and positioning Al Salam Bank as a forward-looking institution aligned with the aspirations of Bahrain’s Vision 2030.”

“This partnership reflects our vision to build a smarter, more agile bank powered by advanced data and AI capabilities. We believe this initiative will not only enhance the client experience but also set a benchmark for innovation in the region,” said Hemantha Wijesinghe, CTO at Al Salam Bank.

Under the agreement, Al Salam Bank will advance its data management and AI initiatives through AWS Marketplace, enabling faster procurement, cloud-native scalability, and real-time access to data products to accelerate innovation. The agreement forms a key pillar in Al Salam Bank’s broader digital transformation roadmap, reinforcing its position at the forefront of smart banking in the region.

With the Denodo Platform’s logical data management capabilities, including a universal semantic layer, Al Salam Bank can connect and manage data from its core systems, cloud-based services, and fintech partners within minutes instead of weeks. The interoperability among these systems will enable AI-powered analytics and reporting, supporting faster, data-driven decisions at the executive and operational levels.
Commenting on the partnership, Gabriele Obino, Regional Vice President and General Manager, Southern Europe and Middle East at Denodo, stated, “We are proud to support Al Salam Bank in its digital transformation journey. Our platform enables real-time data access, governance, and agility, critical components for AI success. This partnership showcases how modern data management can empower financial institutions to lead in a rapidly evolving digital economy.”

“As a local integrator, our mission is to ensure that global innovation translates into local success,” said Ebrahim Sonde, COO at NAIB IT. “Collaborating with Al Salam Bank and Denodo, we are committed to delivering a robust, secure, and scalable data architecture that drives meaningful transformation.”

By adopting the Denodo Platform’s logical data management layer and leveraging NAIB IT’s deployment expertise, the Bank expects further enhancements in operational efficiency, regulatory compliance, and service agility. Real-time access to data will not only empower teams with faster insights but also elevate the end-user experience. In embracing this transformation, Al Salam Bank reinforces its position as a technology-forward institution, aligned with the aspirations of Bahrain’s Vision 2030 and prepared to lead in a future defined by intelligent financial services.