New draft standards
Kevin.Dienst at usbank.com
Mon Dec 14 17:01:18 UTC 2015
ELK
Splunk
We use a proprietary vendor product that migrates data into an HDFS store
via RabbitMQ-based collectors and dumps it in raw form. From there I
have access to all the usual "big data" tools, although I'm not using Flume
just yet. We're still trying to get a handle on operationalizing all the
various big data components so that data science developers can focus on
development instead of on operating and supporting the hardware/software
ecosystem.
Kevin D Dienst
From: Joe Wulf <joe_wulf at yahoo.com>
To: "linux-audit at redhat.com" <linux-audit at redhat.com>
Date: 12/14/2015 10:51 AM
Subject: Re: New draft standards
Sent by: linux-audit-bounces at redhat.com
Steve,
The last place I was at heavily used Splunk and then transitioned to
dual-routing a substantial portion of the logs from across the
infrastructure to ELK, as well.
-Joe
From: Steve Grubb <sgrubb at redhat.com>
To: F Rafi <farhanible at gmail.com>; "linux-audit at redhat.com"
<linux-audit at redhat.com>
Sent: Monday, December 14, 2015 10:34 AM
Subject: Re: New draft standards
But I guess this gives me an opportunity to ask the community what tools
they are using for audit log collection and viewing? It's been a couple of
years since we had this discussion on the mailing list, and I think some
things have changed.
Do people use ELK?
Apache Flume?
Something else?
It might be possible to write a plugin to translate the audit logs into
the native format of these tools.
-Steve
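[A minimal sketch of the translation Steve suggests, not an official audit
plugin: auditd records are key=value pairs, and ELK (via Logstash) ingests
JSON natively, so a converter can be quite small. The regex-based parsing
below is simplified and assumed; a real plugin would need to harden it for
hex-encoded fields and multi-record events.]

```python
# Sketch: convert one auditd key=value record line into JSON for ELK ingestion.
import json
import re

# Matches the standard auditd record header:
#   type=SYSCALL msg=audit(<epoch.millis>:<serial>): <key=value pairs...>
AUDIT_MSG = re.compile(
    r"^type=(?P<type>\S+)\s+msg=audit\((?P<ts>[\d.]+):(?P<serial>\d+)\):\s*(?P<rest>.*)$"
)

def audit_to_json(line):
    """Return a JSON string for one audit record, or None if it doesn't parse."""
    m = AUDIT_MSG.match(line.strip())
    if m is None:
        return None
    event = {
        "type": m.group("type"),
        "timestamp": float(m.group("ts")),
        "serial": int(m.group("serial")),
    }
    # Remaining fields are key=value; quoted values may contain spaces.
    for key, quoted, bare in re.findall(r'(\w+)=(?:"([^"]*)"|(\S+))', m.group("rest")):
        event[key] = quoted if quoted else bare
    return json.dumps(event)

if __name__ == "__main__":
    sample = ('type=SYSCALL msg=audit(1450112478.123:456): '
              'arch=c000003e syscall=59 success=yes exe="/usr/bin/id"')
    print(audit_to_json(sample))
```

In practice this could sit behind an audispd plugin reading records on
stdin and writing JSON to a file or socket that Logstash tails.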
--
Linux-audit mailing list
Linux-audit at redhat.com
https://www.redhat.com/mailman/listinfo/linux-audit