Holmes Processing: Distributed Large-Scale Analysis


Holmes Processing was born out of the need to rapidly process and analyse large volumes of data in the computer security community. At its core, Holmes Processing is a catalyst for extracting useful information and generating meaningful intelligence. Furthermore, its robust distributed architecture allows the system to scale while also providing the flexibility needed to evolve.

The Holmes Processing architecture is based on three core pillars:

  • Resilient: Failures should be gracefully handled and not affect other parts of the system.
  • Scalable: The system should be able to scale easily, both vertically and horizontally.
  • Flexible: Components should be interchangeable and new features should be easy to add.