
Evidence Collector

This project includes modules for collecting evidence regarding Wazuh and VAT.

Wazuh evidence collector

The Wazuh evidence collector uses Wazuh's API to access information about the system configuration of the manager and its agents. As an additional measure to ensure the correct configuration of ClamAV (if installed on a machine), we also use Elasticsearch's API to directly access the collected logs - the Elastic Stack is one of Wazuh's required components (usually installed on the same machine as the Wazuh server, but it can run standalone as well).

Installation & use

Using docker

Note: The Docker image is not yet complete and might not work due to recent changes (scheduler, etc.).

  1. Set up your Wazuh development environment. Use the Security Monitoring repository to create and deploy a Vagrant box with all the required components.

  2. Clone this repository.

  3. Build Docker image:

docker build -t evidence-collector .
  4. Run the image:
docker run evidence-collector

Note: The current simple image runs code from test.py. If you wish to test anything else, change this file or edit the Dockerfile.

Local environment

  1. Set up your Wazuh development environment. Use the Security Monitoring repository to create and deploy a Vagrant box with all the required components.

  2. Clone this repository.

  3. Install dependencies:

pip install -r requirements.txt

sudo apt-get install jq
  4. a) Install Redis server locally:
sudo apt-get install redis-server

Note: To stop the Redis server, use /etc/init.d/redis-server stop.

  4. b) Run Redis server in a Docker container:
docker run --name my-redis-server -p 6379:6379 -d redis

In this case, also comment out the server start command in entrypoint.sh:

#redis-server &
  5. Run entrypoint.sh:
./entrypoint.sh

Note: This repository consists of multiple Python modules. When running Python code manually, the -m flag might be necessary.

Component configuration

API User authentication

The current implementation has SSL certificate verification disabled and uses simple username/password authentication (defined inside /constants/constants.py). The production version should replace this with certificate verification.
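The authentication flow described above can be sketched as follows. This is a minimal illustration, not the repository's actual code: the host, port, credentials, and the Wazuh 4.x token endpoint (/security/user/authenticate) are assumptions, and the network call itself is left commented out.

```python
# Sketch: basic-auth request against the Wazuh API with certificate
# verification disabled, mirroring the approach described above.
import base64
import ssl
import urllib.request

WAZUH_API = "https://192.168.33.10:55000"  # assumed host and default API port
USER, PASSWORD = "wazuh", "wazuh"          # illustrative placeholders


def basic_auth_header(user: str, password: str) -> dict:
    """Build the Authorization header for username/password authentication."""
    token = base64.b64encode(f"{user}:{password}".encode()).decode()
    return {"Authorization": f"Basic {token}"}


def insecure_context() -> ssl.SSLContext:
    """SSL context with certificate verification disabled (development only)."""
    ctx = ssl.create_default_context()
    ctx.check_hostname = False
    ctx.verify_mode = ssl.CERT_NONE
    return ctx


if __name__ == "__main__":
    req = urllib.request.Request(
        f"{WAZUH_API}/security/user/authenticate",  # assumed Wazuh 4.x endpoint
        headers=basic_auth_header(USER, PASSWORD),
    )
    # urllib.request.urlopen(req, context=insecure_context())  # returns a JWT
```

A production setup would drop insecure_context() and instead load the server's CA certificate into the SSL context.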

Manual Elasticsearch API testing with cURL

Example command for testing the API via CLI:

curl --user admin:changeme --insecure -X GET "https://192.168.33.10:9200/wazuh-alerts*/_search?pretty" -H 'Content-Type: application/json' -d'
{"query": {
  "bool": {
    "must": [{"match": {"predecoder.program_name": "clamd"}},
             {"match": {"rule.description": "Clamd restarted"}},
             {"match": {"agent.id": "001"}}]
    }
  }
}'
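The same query can be issued from Python. This is a hedged sketch, not code from this repository: it reuses the host, credentials, and index pattern from the curl example above, and leaves the actual request commented out so nothing is sent unless you uncomment it.

```python
# Sketch: the ClamAV "Clamd restarted" query from the curl example,
# built and sent with the standard library only.
import base64
import json
import ssl
import urllib.request


def clamd_restart_query(agent_id: str) -> dict:
    """Build the bool query matching 'Clamd restarted' alerts for one agent."""
    return {
        "query": {
            "bool": {
                "must": [
                    {"match": {"predecoder.program_name": "clamd"}},
                    {"match": {"rule.description": "Clamd restarted"}},
                    {"match": {"agent.id": agent_id}},
                ]
            }
        }
    }


if __name__ == "__main__":
    body = json.dumps(clamd_restart_query("001")).encode()
    req = urllib.request.Request(
        "https://192.168.33.10:9200/wazuh-alerts*/_search?pretty",  # from the curl example
        data=body,
        headers={
            "Content-Type": "application/json",
            "Authorization": "Basic " + base64.b64encode(b"admin:changeme").decode(),
        },
    )
    ctx = ssl.create_default_context()  # equivalent of curl's --insecure below
    ctx.check_hostname = False
    ctx.verify_mode = ssl.CERT_NONE
    # urllib.request.urlopen(req, context=ctx)
```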

Running RQ and RQ-scheduler locally

  1. Install (if needed) and run redis-server:
sudo apt-get install redis-server

redis-server

Note: By default, server listens on port 6379. Take this into consideration when starting other components.

  2. Install RQ and RQ-scheduler:
pip install rq

pip install rq-scheduler
  3. Run both components in two separate terminals:
rqworker low

rqscheduler --host localhost --port 6379

Note: low in the first command references the task queue the worker will use.

  4. Run a Python script containing RQ commands as usual:
python3 ...
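Such a script might look like the sketch below. The task function collect_evidence is a hypothetical placeholder (real tasks live in this repository's modules); enqueue_collection requires the rq package and a running redis-server on the default port 6379, matching the worker started with `rqworker low` above.

```python
# Sketch: enqueueing a task on the 'low' queue consumed by `rqworker low`.
def collect_evidence(agent_id: str) -> dict:
    """Placeholder task: pretend to collect evidence for one agent."""
    return {"agent_id": agent_id, "status": "collected"}


def enqueue_collection(agent_id: str):
    """Enqueue the task on the 'low' queue (needs rq and a running redis-server)."""
    from redis import Redis
    from rq import Queue

    q = Queue("low", connection=Redis(host="localhost", port=6379))
    return q.enqueue(collect_evidence, agent_id)  # picked up by `rqworker low`
```

The queue name passed to Queue() must match the one given to rqworker, otherwise the job is never picked up.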

Known issues

Python Elasticsearch library problems with ODFE

The latest versions (7.14.0 and 7.15.0) of the Python Elasticsearch library have problems connecting to Open Distro for Elasticsearch and produce the following error when trying to do so:

elasticsearch.exceptions.UnsupportedProductError: The client noticed that the server is not a supported distribution of Elasticsearch

To resolve this, downgrade to an older package version:

pip install 'elasticsearch<7.14.0'