LimaCharlie provides multiple output options, referred to as streams, for sending data from LimaCharlie to other destinations. We provide native support for output to multiple different providers, or "destinations". The diagram below provides some basic examples of where data is sourced from and where data can be sent to.
Outputs should be thought of in terms of two concepts: streams and destinations. A stream is what you are sending, whereas a destination is where you are sending it. We will look at both in detail.
Check out the following YouTube video for a walkthrough of configuring an output.
Streams
Streams define what data points will be sent to an output destination.
Available streams include:
- event: The bulk of data events coming from sensors. Note: this stream will be very verbose.
- detect: Alerts, as generated by the report action in detection and response rules.
- audit: Events generated by the LimaCharlie platform, such as access control.
- deployment: Events about your deployment, like sensor enrollments or cloned sensors.
- artifact: Meta-events reporting on newly-ingested files through the Artifact Collection mechanism.
- tailored: Only events specifically flagged for outputs sent to this stream.
Destinations
LimaCharlie integrates with several providers, such as S3, Google Cloud, or Slack, as destinations.
Looking to add LimaCharlie outputs to an allow list (aka "whitelist")? See more details here.
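As a rough sketch of what an output definition can look like when managed as configuration, here is an S3 destination receiving the event stream. The field names shown (module, type, bucket, key_id, secret_key, is_compression) are assumptions based on common LimaCharlie output options and should be verified against the current Outputs reference before use:

```yaml
# Hedged sketch: an output named "archive-to-s3" shipping the
# "event" stream to an S3 bucket. Field names are illustrative
# assumptions; check the Outputs reference for the exact schema.
outputs:
  archive-to-s3:
    module: s3            # destination provider
    type: event           # which stream to send
    bucket: my-telemetry-bucket
    key_id: AKIA________________    # AWS access key ID (placeholder)
    secret_key: ________________    # AWS secret key (placeholder)
    is_compression: true            # batch and compress uploads
```

The key idea is the pairing: `type` selects the stream, while `module` (plus its provider-specific options) selects the destination.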
Use Cases
There are multiple use cases or integration strategies for shipping telemetry to and from the LimaCharlie platform. Some common approaches we have seen:
All data is sent as batched files via SFTP; Splunk or ELK consumes the received files for ingestion.
Sensor ---> LCC (All Streams) ---> SFTP ---> ( Splunk | ELK )
All data is streamed in real time via Syslog; Splunk or ELK receives it directly via an open Syslog socket.
Sensor ---> LCC (All Streams) ---> Syslog (TCP+SSL) ---> ( Splunk | ELK )
All data is sent as batched files stored on Amazon S3; Splunk or ELK consumes the received files remotely for ingestion.
Sensor ---> LCC (All Streams) ---> Amazon S3 ---> ( Splunk | ELK )
Bulk events are uploaded to Amazon S3 for archiving, while alerts and auditing events are sent in real-time to Splunk via Syslog. Note: This has the added benefit of reducing Splunk license cost while keeping the raw events available for analysis at a lower cost.
Sensor ---> LCC (Event Stream) ---> Amazon S3
+--> LCC (Alert+Audit Streams) ---> Syslog (TCP+SSL) ---> Splunk
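The split architecture above could be expressed as multiple outputs, one per stream. This is again a hedged sketch with assumed option names (dest_host, is_tls, and the rest); confirm them against the Outputs documentation:

```yaml
outputs:
  archive-events:
    module: s3
    type: event              # bulk events to low-cost storage
    bucket: my-archive-bucket
    key_id: AKIA________________
    secret_key: ________________
  alerts-to-splunk:
    module: syslog
    type: detect             # real-time alerts only
    dest_host: splunk.example.com:6514   # hypothetical Splunk syslog listener
    is_tls: true
```

Since each output definition carries a single stream, the audit stream in the diagram would be a third output definition with the same syslog destination.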
IP Sources
Outputs from the LimaCharlie cloud do not come from a single predictable IP address due to the highly distributed nature of the cloud.
An approximation can be made using the blocks of IP addresses published by Google Cloud Platform here.
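To approximate an allow list from those published ranges, you can filter Google Cloud's IP range data (published as JSON with a per-prefix `scope` field) down to the regions LimaCharlie uses. The sketch below uses a small inline sample instead of fetching the live file, so the structure shown is an assumption to verify against the actual published JSON:

```python
# Hedged sketch: narrow Google Cloud's published IP ranges to the
# GCP regions LimaCharlie runs in. In practice you would fetch the
# live JSON over HTTPS; an inline sample keeps this self-contained.
LC_REGIONS = {
    "us-central1",              # USA
    "northamerica-northeast1",  # Canada
    "europe-west4",             # Europe
    "europe-west2",             # UK
    "asia-south1",              # India
}

# Assumed shape of the published data: a "prefixes" list whose
# entries carry an "ipv4Prefix" and a region "scope".
sample_cloud_ranges = {
    "prefixes": [
        {"ipv4Prefix": "8.34.208.0/20", "scope": "us-central1"},
        {"ipv4Prefix": "34.89.0.0/16", "scope": "europe-west3"},
    ]
}

def limacharlie_prefixes(cloud_ranges: dict) -> list[str]:
    """Return IPv4 prefixes whose scope is a LimaCharlie region."""
    return [
        p["ipv4Prefix"]
        for p in cloud_ranges.get("prefixes", [])
        if p.get("scope") in LC_REGIONS and "ipv4Prefix" in p
    ]

print(limacharlie_prefixes(sample_cloud_ranges))  # only the us-central1 block matches
```

Because the ranges are an approximation and Google updates them over time, re-fetch and re-filter them periodically rather than hardcoding the result.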
The following LimaCharlie datacenters map to the following GCP regions:
- USA: us-central1
- Canada: northamerica-northeast1
- Europe: europe-west4
- UK: europe-west2
- India: asia-south1