This guide covers configuring log sources, setting up FluentBit collection, and monitoring data ingestion in Sentinel Kit.
Sentinel Kit uses FluentBit as the primary log collection engine, feeding data into Elasticsearch for storage and analysis:
Log Sources → FluentBit → Elasticsearch → Sentinel-Kit frontend and Kibana
This document outlines the various methods available for ingesting logs into the Sentinel-Kit Elastic Stack.
Logs can be funneled into the Elastic Stack using one of the following methods:

1. Direct indexing: place files in the ingest directory.
2. SFTP upload: transfer files over SFTP, then move them into the ingest directory.
3. JSON forwarder: send logs over HTTPS to a dedicated ingestion endpoint.
Method 1, direct indexing, is the fastest way to index data. To use it, simply place the data you wish to index into the following directory: ./data/log_ingest_data
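For example (a minimal sketch; the source path and file name are placeholders):

```bash
# Copy an exported Windows event log into the ingest directory;
# the stack picks it up and indexes it automatically.
cp /tmp/Security.evtx ./data/log_ingest_data/
```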
By default, the stack is configured for rapid indexing of the following log types:

- .evtx format

All indexed data is placed into Elasticsearch indices following the format: sentinelkit-ingest-<TYPE>-<YY>-<MM>-<DD>.
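To confirm that data has landed, you can list the matching indices directly against Elasticsearch. A sketch, assuming the cluster is reachable at https://elasticsearch.sentinel-kit.local (adjust to your deployment) and reusing the elastic credentials from your .env file:

```bash
# List all Sentinel-Kit ingest indices; the host below is an assumption,
# substitute your deployment's Elasticsearch endpoint.
curl -k -u "elastic:${ELASTICSEARCH_PASSWORD}" \
  "https://elasticsearch.sentinel-kit.local/_cat/indices/sentinelkit-ingest-*?v"
```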
You can access and visualize this data via Kibana:
https://kibana.sentinel-kit.local

Log in with the elastic user account and the password defined in your .env file (refer to the ELASTICSEARCH_PASSWORD variable).

Method 2 uses SFTP: Sentinel-Kit exposes an SFTP service for data transfer.
⚠️ Note on Accessibility: To make the SFTP service accessible over the internet, you must configure your network equipment (e.g., firewall rules).
The SFTP credentials are found in your local .env file:
SFTP_USER=sentinel-kit_sftp_user
SFTP_PASSWORD=sentinel-kit_sftp_passwd
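For example, connecting with these credentials (a sketch; the SFTP host name is an assumption, substitute your deployment's address):

```bash
# Connect to the Sentinel-Kit SFTP service with the credentials from .env.
# The host name below is an assumption; use your actual endpoint.
sftp sentinel-kit_sftp_user@sftp.sentinel-kit.local
# Then, at the sftp> prompt, upload a file:
#   put Security.evtx
```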
Once uploaded, the files will be available in:
./data/ftp_data
Important: Data placed here is NOT automatically indexed. You must manually move or copy the desired logs from this location into the ./data/log_ingest_data directory to trigger direct indexing (Method 1).
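For example (the file name is a placeholder):

```bash
# Move an uploaded log from the SFTP drop zone into the ingest
# directory so that it gets indexed via Method 1.
mv ./data/ftp_data/Security.evtx ./data/log_ingest_data/
```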
Method 3 sends logs, primarily in JSON format, through a dedicated forwarder. This requires setting up a data source via the backend console.
Execute the following command to enter the backend console application:
./launcher console
Once inside the container, run the following command to create a new ingestion endpoint:
sentinel-kit> app:datasource:create <name> <index> [<validFrom> [<validTo>]]
This command takes 4 arguments:

- <name>: a unique name for the data source
- <index>: the Elasticsearch index the logs will be written to
- <validFrom> (optional): the date from which the data source is valid
- <validTo> (optional): the date after which the data source expires
Example:
sentinel-kit> app:datasource:create MyIngestName temp_index 2020-01-01 2030-01-01
MyIngestName - temp_index
[OK] Datasource "MyIngestName" created successfully
Valid from 2020-01-01
Valid to 2030-01-01
Ingest key (header X-Ingest-Key): M2VmYjRiZTMtYThmNi00ZDhlLTliZTQtMGFjYWNhZDVjY2Mw
Forwarder URL: https://backend.sentinel-kit.local/ingest/json
Once the source is created, logs must be sent to the Forwarder URL displayed in the console output.
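A minimal sketch of forwarding a JSON event with curl, using the URL and ingest key header shown in the console output above (the payload fields are placeholders; the expected event schema may differ in your deployment):

```bash
# POST a JSON log event to the forwarder endpoint, authenticating
# with the X-Ingest-Key header returned at datasource creation.
curl -k -X POST "https://backend.sentinel-kit.local/ingest/json" \
  -H "X-Ingest-Key: <your-ingest-key>" \
  -H "Content-Type: application/json" \
  -d '{"timestamp":"2024-01-01T00:00:00Z","message":"test event"}'
```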
Use these commands to manage your data sources:
app:datasource:list (list existing data sources)
app:datasource:delete <name> (delete a data source)

With data ingestion configured: