The Challenges of Legacy SIEM Models | Ali Ghodsi at RSAC 2026

Databricks · Advanced · 🧠 Large Language Models · 3w ago
Databricks CEO Ali Ghodsi outlines why traditional SIEM architectures are struggling to keep pace with modern cybersecurity needs. Key takeaways:
- Cost & Ingestion: Volume-based pricing and proprietary formats make comprehensive data ingestion slow and expensive.
- Retention Issues: High costs lead to short retention cycles, preventing longitudinal analysis of long-term threats.
- Data Gaps: Legacy systems often exclude multimodal data, such as audio, video, and LLM transcripts.
- Manual Operations: Detection and investigation remain highly manual, leaving SOC teams inundated.

Watch the full keynote: https://www.databricks.com/resources/webinar/its-time-leave-legacy-siem-behind?utm_source=youtube&utm_medium=organic-social

Related AI Lessons

PagedAttention: vLLM’s Solution to GPU Memory Waste
Learn how PagedAttention solves GPU memory waste for large language models (LLMs) and how to improve your LLM serving efficiency
Medium · ChatGPT
From 30 to 60 Tokens/Second: How I Got vLLM Running on 2x RTX 3090
Learn how to install and run vLLM on 2x RTX 3090 to achieve 60 tokens/second, a significant performance boost for LLM applications
Medium · LLM
Running an Offline LLM in React Native (2026): Building Privacy-First AI That Works Without the…
Learn to build a privacy-first offline LLM in React Native, enabling AI functionality without internet connectivity
Medium · LLM
Google Chrome is Now Automatically Downloading 4GB AI Models to User Computers: What You Need to…
Google Chrome now downloads 4GB AI models to user computers; understand the implications and how this affects your device
Medium · LLM
Up next
5 Levels of AI Agents - From Simple LLM Calls to Multi-Agent Systems
Dave Ebbelaar (LLM Eng)