    Use AssessEvidence instead of StoreEvidence RPC on Clouditor. · 8904133d
    Zitnik, Anze authored
    Some refactoring.
    
    Squashed commit of the following:
    
    commit 11ae9a48f6b41c2dc5b3e00de1b808b75cc39013
    Author: Anže Žitnik <anze.zitnik@xlab.si>
    Date:   Tue Dec 14 11:40:01 2021 +0100
    
        Change CI script: build and test all branches
    
    commit fe84541d50ffc6b25d5fff94b1781345ec2b548d
    Author: Anže Žitnik <anze.zitnik@xlab.si>
    Date:   Tue Dec 14 11:33:07 2021 +0100
    
        Version up
    
    commit b99df078408ea2649ce59cd2d17c247c04c6a992
    Author: Anže Žitnik <anze.zitnik@xlab.si>
    Date:   Tue Dec 14 11:25:12 2021 +0100
    
        Update dockerignore: add (v)env folder.
    
    commit 20944e6743ce112d558fb0205a9347a46c17da8b
    Author: Anže Žitnik <anze.zitnik@xlab.si>
    Date:   Tue Dec 14 10:56:41 2021 +0100
    
        Refactoring: move all gRPC-generated files to their own package (grpc_gen).
    
    commit f4dce9c9076a1336dc7d0b5e15759b24c4f9bea7
    Author: Anže Žitnik <anze.zitnik@xlab.si>
    Date:   Tue Dec 14 10:34:05 2021 +0100
    
        Use AssessEvidence instead of StoreEvidence RPC on Clouditor.
    
        Added necessary proto files (removed unneeded) and regenerated python code from them. Also added Google APIs dependencies for grpc code generation. Accordingly updated README. Also updated requirements.txt to include some packages for grpc python code generation and for usage of Google APIs.
    
        Note that with the call to AssessEvidence, we need to provide an AssessEvidenceRequest object (instead of simply Evidence as before). Most changes of existing code are because of this.

Evidence Collector

This project includes modules for collecting evidence regarding Wazuh and VAT and sending it to Clouditor for further processing.

Wazuh evidence collector

Wazuh evidence collector uses Wazuh's API to access information about the manager's and agents' system configurations. As an additional measure to ensure correct configuration of ClamAV (if installed on the machine), we also make use of Elasticsearch's API to directly access collected logs. The Elastic Stack is one of Wazuh's required components (usually installed on the same machine as the Wazuh server, but it can be standalone as well).
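
A collector query against the Wazuh API might look like the following sketch. The endpoint paths and port follow the Wazuh 4.x REST API, but the host and credentials are placeholders, and certificate verification is disabled only to mirror this repository's current development setup:

```python
# Illustrative sketch of querying the Wazuh API (host/credentials are
# placeholders; endpoints follow the Wazuh 4.x REST API).
import requests

WAZUH_URL = "https://192.168.33.10:55000"  # placeholder manager address

def auth_header(token: str) -> dict:
    """Build the Authorization header from a JWT token."""
    return {"Authorization": f"Bearer {token}"}

def get_agents(user: str, password: str) -> dict:
    # Obtain a JWT token; verify=False mirrors the current dev setup
    # (enable certificate verification in production).
    resp = requests.post(
        f"{WAZUH_URL}/security/user/authenticate",
        auth=(user, password),
        verify=False,
    )
    token = resp.json()["data"]["token"]
    # List agents and their basic system information.
    return requests.get(
        f"{WAZUH_URL}/agents", headers=auth_header(token), verify=False
    ).json()
```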

Installation & use

Using docker

Note: The Docker image is not yet complete and might not work due to recent changes around the scheduler.

  1. Set up your Wazuh development environment. Use the Security Monitoring repository to create and deploy a Vagrant box with all the required components.

  2. Clone this repository.

  3. Build Docker image:

$ docker build -t evidence-collector .
  4. Run the image:
$ docker run evidence-collector

Note: The current simple image runs code from test.py. If you wish to test anything else, change this file or edit the Dockerfile.

Local environment

  1. Set up your Wazuh development environment. Use the Security Monitoring repository to create and deploy a Vagrant box with all the required components.

  2. Clone this repository.

  3. Install dependencies:

$ pip install -r requirements.txt

$ sudo apt-get install jq
  4. a) Install Redis server locally:
$ sudo apt-get install redis-server

Note: To stop the Redis server, use /etc/init.d/redis-server stop.

  4. b) Run Redis server in a Docker container:
$ docker run --name my-redis-server -p 6379:6379 -d redis

In this case, also comment out the server start command in entrypoint.sh:

#redis-server &
  5. Run entrypoint.sh:
$ ./entrypoint.sh

Note: This repository consists of multiple Python modules. When running Python code manually, use of the -m flag might be necessary.

Component configuration

Generate gRPC code from .proto files

pip3 install grpcio-tools # (included in requirements.txt)
python3 -m grpc_tools.protoc --proto_path=proto evidence.proto --python_out=grpc_gen --grpc_python_out=grpc_gen
python3 -m grpc_tools.protoc --proto_path=proto assessment.proto --python_out=grpc_gen --grpc_python_out=grpc_gen
python3 -m grpc_tools.protoc --proto_path=proto metric.proto --python_out=grpc_gen --grpc_python_out=grpc_gen

As we are interacting with Clouditor, .proto files are taken from there.
Because of dependencies on Google APIs, .proto files in proto/google are taken from here.

Note: Since we are running the code as a package, we have to modify the imports in the newly generated code: import evidence_pb2 as evidence__pb2 --> import grpc_gen.evidence_pb2 as evidence__pb2
(check all generated files)
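
The rewrite can also be scripted. This is a sketch; the file glob and regex are assumptions based on the usual names of protoc-generated files, so check the result against your grpc_gen contents:

```python
# Patch bare *_pb2 imports in generated files so they resolve when the
# repository is run as a package, e.g.
#   import evidence_pb2 as evidence__pb2
# becomes
#   import grpc_gen.evidence_pb2 as evidence__pb2
import re
from pathlib import Path

PATTERN = re.compile(r"^import (\w+_pb2(?:_grpc)?) as", flags=re.MULTILINE)

def patch_imports(text: str) -> str:
    """Prefix bare *_pb2 / *_pb2_grpc imports with the grpc_gen package."""
    return PATTERN.sub(r"import grpc_gen.\1 as", text)

def patch_directory(gen_dir: str = "grpc_gen") -> None:
    # Assumed layout: generated modules live directly under grpc_gen/.
    for path in Path(gen_dir).glob("*_pb2*.py"):
        path.write_text(patch_imports(path.read_text()))
```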

API User authentication

The current implementation disables SSL certificate verification and uses simple username/password authentication (defined inside /constants/constants.py). A production version should replace this with certificate verification.
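
For illustration, enabling verification could look like the sketch below. The CA bundle path and credential handling are assumptions, not part of the current code:

```python
# Sketch of request options with certificate verification enabled
# (the CA bundle path is a placeholder).
def request_kwargs(ca_bundle: str = "/etc/ssl/certs/clouditor-ca.pem") -> dict:
    """Request options that verify the server cert against a CA bundle."""
    return {
        "verify": ca_bundle,  # path to CA bundle instead of verify=False
        "timeout": 10,
    }

# Usage: requests.get(url, auth=(user, password), **request_kwargs())
```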

Manual Elasticsearch API testing with cURL

Example command for testing the API via CLI:

$ curl --user admin:changeme --insecure -X GET "https://192.168.33.10:9200/wazuh-alerts*/_search?pretty" -H 'Content-Type: application/json' -d'
  {"query": {
    "bool": {
      "must": [{"match": {"predecoder.program_name": "clamd"}},
              {"match": {"rule.description": "Clamd restarted"}},
              {"match": {"agent.id": "001"}}]
      }
    }
  }'
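
The same query expressed in Python might look as follows. The query body mirrors the cURL example above; the host, credentials, and index pattern are likewise taken from that example and should be adjusted to your deployment:

```python
# Python equivalent of the cURL example above (host/credentials mirror it).
import requests

def clamd_restart_query(agent_id: str = "001") -> dict:
    """Match ClamAV restart alerts for a given agent."""
    return {
        "query": {
            "bool": {
                "must": [
                    {"match": {"predecoder.program_name": "clamd"}},
                    {"match": {"rule.description": "Clamd restarted"}},
                    {"match": {"agent.id": agent_id}},
                ]
            }
        }
    }

def search_alerts(agent_id: str = "001") -> dict:
    # verify=False mirrors the --insecure flag in the cURL example.
    return requests.get(
        "https://192.168.33.10:9200/wazuh-alerts*/_search",
        auth=("admin", "changeme"),
        verify=False,
        json=clamd_restart_query(agent_id),
    ).json()
```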

Running RQ and RQ-scheduler locally

  1. Install (if needed) and run redis-server:
$ sudo apt-get install redis-server

$ redis-server

Note: By default, the server listens on port 6379. Take this into consideration when starting other components.

  2. Install RQ and RQ-scheduler:
$ pip install rq

$ pip install rq-scheduler
  3. Run both components in two separate terminals:
$ rqworker low

$ rqscheduler --host localhost --port 6379

Note: low in the first command references the task queue the worker will use.

  4. Run the Python script containing RQ commands as usual:
$ python3 -m wazuh_evidence_collector.wazuh_evidence_collector
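
The steps above can be sketched in code. The job function and interval here are illustrative placeholders, not the real entry point (which is wazuh_evidence_collector.wazuh_evidence_collector); the scheduler setup follows the rq-scheduler API under the assumption of a local redis-server on port 6379:

```python
# Minimal sketch of scheduling a recurring collection job with rq-scheduler.
from datetime import datetime, timezone

def collect_evidence() -> str:
    """Placeholder job: the real module gathers Wazuh evidence instead."""
    return f"collected at {datetime.now(timezone.utc).isoformat()}"

if __name__ == "__main__":
    # Requires a running redis-server on localhost:6379 (see step 1).
    from redis import Redis
    from rq_scheduler import Scheduler

    scheduler = Scheduler(queue_name="low", connection=Redis())
    scheduler.schedule(
        scheduled_time=datetime.now(timezone.utc),  # first run: now
        func=collect_evidence,
        interval=60,   # seconds between runs (placeholder value)
        repeat=None,   # repeat indefinitely
    )
```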

Known issues

Python Elasticsearch library problems with ODFE

The latest versions (7.14.0 & 7.15.0) of the Python Elasticsearch library have problems connecting to Open Distro for Elasticsearch and produce the following error when trying to do so:

elasticsearch.exceptions.UnsupportedProductError: The client noticed that the server is not a supported distribution of Elasticsearch

To resolve this, downgrade to an older package version:

$ pip install 'elasticsearch<7.14.0'
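
A quick way to check which client version is installed before connecting; the version-comparison helper is illustrative:

```python
# Illustrative check that the installed elasticsearch client predates the
# problematic 7.14.0/7.15.0 releases.
def is_compatible(version: str) -> bool:
    """True if the elasticsearch package version is below 7.14.0."""
    major, minor = (int(p) for p in version.split(".")[:2])
    return (major, minor) < (7, 14)

if __name__ == "__main__":
    from importlib.metadata import version
    v = version("elasticsearch")
    print(v, "is ODFE-compatible" if is_compatible(v) else "-> downgrade needed")
```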