# Docker Dev Guide
## Docker Setup

### Configure Collectors

Before you start, if you haven't already done so, please make a copy of docker-compose.override_example.yml; this file is used to set up your collectors. The defaults should work out of the box with the provided env.example. If you wish to add customizations, please see the Docker Advanced Install Guide.
## Selecting a Version

We currently release a development version and tagged versions. The first supported version will be v1.2.5, once released.
### Stable Release Version

To select a released version, please do the following.
Example:
If you run the script without specifying a version, it will list all the current tags and prompt you to select one.
Once that's complete, all the instructions below are still applicable.
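As an illustration, version selection might look like the following; the script path `scripts/docker_select_version.sh` is an assumption, so check your checkout for the actual location:

```shell
# List available tags and choose one interactively (script path is an assumption)
./scripts/docker_select_version.sh

# Or select a specific released version directly
./scripts/docker_select_version.sh v1.2.5
```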
### Development Version

If you wish to use the development version, you are free to do so; it is the default behavior on any git checkout. Simply follow the directions below and set up your pipeline as instructed.
## Build Base Images

This step is optional. The images are published on Docker Hub, but if you'd like to incorporate local changes, please follow the process below.
### Build Using Source Code

If you would like to build the importer container using the version of the pipeline scripts found in this GitHub repo, then run the following:
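A local build might look like the sketch below; the service name `importer` is an assumption based on the surrounding text, so match it to the service defined in your docker-compose files:

```shell
# Rebuild the importer image from the local source tree
# (service name "importer" is an assumption)
docker-compose build importer
```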
## Configuring the Containers

### Environment File

If you haven't done so already, copy env.example and update it to match your own settings:
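The copy step can be sketched as a single shell command, assuming the settings are read from a `.env` file in the project root (docker-compose's default env file location):

```shell
# Copy the example environment file, then edit .env to match your settings
cp env.example .env
```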
### Rabbit

This portion primarily sets up the RabbitMQ server. Most of the default settings work, but whatever values you set here should be consistent with the configuration for the logstash and importer containers.

Note that the hostname will follow the docker-compose label. You can rename it if you like, but by default it's set to rabbit.
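For illustration, the relevant settings in the environment file might look like the fragment below; the variable names here are placeholders, so match them to whatever env.example actually defines:

```shell
# Hypothetical variable names -- check env.example for the real ones
RABBIT_HOST=rabbit      # matches the docker-compose service label by default
RABBIT_USERNAME=guest
RABBIT_PASSWORD=guest
```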
### Importer

The importer config is defined in compose/netsage_shared.xml. If you use values different from the defaults, you may want to change them. NOTE: Changes will require you to rebuild the container.
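Because importer config changes are baked into the image, editing the file calls for a rebuild-and-restart sequence along these lines (the service name `importer` is an assumption):

```shell
# Rebuild the importer image and recreate its container
# (service name "importer" is an assumption)
docker-compose build importer
docker-compose up -d importer
```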
### Logstash

Define the input rabbit queue; this should match the importer output queue.

Define the output rabbit queue; this can be the docker container or any valid RabbitMQ server.
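As a sketch, the two queues are wired up in the logstash pipeline config roughly as below. This is illustrative only, not a complete configuration: the hostnames and queue names are placeholders and must match your importer and environment settings.

```
# Hypothetical input -- the queue name must match the importer output queue
input {
  rabbitmq {
    host  => "rabbit"
    queue => "raw_flows"    # placeholder name
  }
}

# Hypothetical output -- any valid RabbitMQ server works here
output {
  rabbitmq {
    host => "rabbit"
    key  => "processed_flows"    # placeholder routing key
  }
}
```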
### Optional: ElasticSearch and Kibana

You can optionally store flow data locally in an ElasticSearch container and view the data with Kibana. Local storage can be enabled with the following steps:
- Uncomment the following lines in conf-logstash/99-outputs.conf:
- Comment out the `rabbitmq {...}` block in conf-logstash/99-outputs.conf if you do not want to also send logstash output to RabbitMQ.

Run the containers using the following line:
```
docker-compose -f docker-compose.yml -f docker-compose.develop.yml up -d
```
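For reference, the ElasticSearch output block you uncomment in conf-logstash/99-outputs.conf generally resembles the sketch below; the host and index shown are placeholders, so use whatever the file actually contains rather than copying this verbatim:

```
# Illustrative only -- see conf-logstash/99-outputs.conf for the real block
output {
  elasticsearch {
    hosts => ["elasticsearch:9200"]        # placeholder container hostname
    index => "flow-%{+YYYY.MM.dd}"         # placeholder index pattern
  }
}
```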