Deploy the Air-Gapped Everactive Appliance on Linux

The Everactive Appliance provides an on-prem version of our services for sensor data acquisition and processing.

Hardware Requirements

• Linux kernel 5.15.x (tested on Ubuntu Server 22.04 and Debian 12)
• 4 GB of RAM, 2 CPUs
• 25 GB of disk space
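A quick pre-flight sketch to confirm the host meets these requirements (the expected values mirror the list above; adjust to your needs):

```shell
# Check kernel, CPU, memory, and disk against the requirements above.
uname -r                                       # expect 5.15.x
nproc                                          # expect >= 2
free -g | awk '/^Mem:/ {print $2 " GB RAM"}'   # expect >= 4 GB
df -BG --output=avail / | tail -n 1            # expect >= 25G available
```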

Installation and Configuration


Install Docker by following the instructions in the official Docker documentation.

Download and Extract the Everactive Appliance release

$ wget '{PRESIGNED URL}' -O export-v1.0.0.tar.gz
$ tar -zxvf export-v1.0.0.tar.gz
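Optionally, list the archive contents before extracting. The sketch below builds a small throwaway tarball mimicking the release layout so the command is easy to try anywhere; for the real release, run tar -tzf export-v1.0.0.tar.gz instead:

```shell
# Build a throwaway archive that mimics the release layout, then list it.
tmpdir=$(mktemp -d)
mkdir -p "$tmpdir/export/images"
touch "$tmpdir/export/images/airgapped-api.tar"
tar -C "$tmpdir" -czf "$tmpdir/export.tar.gz" export
tar -tzf "$tmpdir/export.tar.gz"
# For the real release:  tar -tzf export-v1.0.0.tar.gz
```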

Load the Docker Images

The tarball contains all the container images the appliance needs; load them into Docker:

$ cd export  
export$ sudo bin/task load



We include a copy of the Taskfile tool (bin/task) and helper scripts to simplify the configuration steps.

Example Output

export$ sudo bin/task load
task: [run:_load] docker load -i images/airgapped-database.tar
task: [run:_load] docker load -i images/airgapped-portainer.tar
task: [run:_load] docker load -i images/airgapped-api.tar
task: [run:_load] docker load -i images/airgapped-broker.tar
task: [run:_load] docker load -i images/airgapped-grafana.tar
task: [run:_load] docker load -i images/airgapped-proxy.tar
task: [run:_load] docker load -i images/airgapped-prometheus.tar
task: [run:_load] docker load -i images/airgapped-cdc.tar
task: [run:_load] docker load -i images/airgapped-sink.tar
task: [run:_load] docker load -i images/airgapped-tools.tar
42ebc17ba669: Loading layer [=>                                                 ]  32.77kB/1.47MB
42ebc17ba669: Loading layer [==================================================>]   1.47MB/1.47MB
42ebc17ba669: Loading layer [==================================================>]   1.47MB/1.47MB
14009dfb3416: Loading layer [===============>                                   ]  26.18MB/83.86MB
0620092853f5: Loading layer [==================================================>]  807.4kB/807.4kB
14009dfb3416: Loading layer [==================================================>]  83.86MB/83.86MB
b2139c584a93: Loading layer [==================================================>]  3.584kB/3.584kB
3cc6e2e29424: Loading layer [==================================================>]  4.096kB/4.096kB
05ecc919987a: Loading layer [==================================================>]   2.56kB/2.56kB

Customize the Environment

We need to configure the virtual machine's external hostname or IP address and the passwords for the different services.
Set the following environment variables using values appropriate for your deployment:

export PASSWORD=airgap_password
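A sketch of what this step might look like. PROXY_EXTERNAL_HOST is the variable the validate task reads later in this guide; the IP address below is a placeholder for your appliance's real address:

```shell
# Placeholders: substitute your appliance's real address and a strong password.
export PROXY_EXTERNAL_HOST=192.168.1.50   # external hostname or IP (placeholder value)
export PASSWORD=airgap_password           # change this to a strong password
echo "host=$PROXY_EXTERNAL_HOST"
```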


Now run the customization task to prepare the configuration files. Make sure to use the -E flag to preserve the environment variables you just set.

export$ sudo -E bin/task customize

You can take a look at the .env file in the current folder to see how the different settings will be applied to the services; it is also a good place to double-check the values before starting the stack.

Example Output

export$ sudo bin/task customize
task: [run:customize] sed -i "s/POSTGRES_PASSWORD=.*/POSTGRES_PASSWORD=${POSTGRES_PASSWORD}/g" .env
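The customize task applies your variables with in-place sed substitutions like the one shown in the output. A standalone sketch of the same substitution on a throwaway file:

```shell
# Demonstrate the .env rewrite on a temporary file.
demo=$(mktemp)
echo 'POSTGRES_PASSWORD=changeme' > "$demo"
POSTGRES_PASSWORD=s3cret
sed -i "s/POSTGRES_PASSWORD=.*/POSTGRES_PASSWORD=${POSTGRES_PASSWORD}/g" "$demo"
cat "$demo"   # POSTGRES_PASSWORD=s3cret
```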

Bootstrap the self-signed certificates

To enable TLS in the appliance services, we need to create self-signed certificates and keys. With this command, we bootstrap a CA and the derivative certificates inside certificates/certs:

export$ sudo bin/task certs


Useful certificates

To connect to the MQTT broker over TLS from outside the appliance, we recommend downloading the following files:
• certificates/certs/airgap-broker.key
• certificates/certs/airgap-broker.crt
• certificates/certs/EveractiveCA.crt

Example Output

export$ sudo bin/task certs
task: [run:certs] bash certificates/
Create Certificates
Initializing CA
Created CA/certs/EveractiveCA.key
Created CA/certs/EveractiveCA.crt
Created CA/certs/EveractiveCA.crl
        Version: 3 (0x2)
        Serial Number: 1 (0x1)
        Signature Algorithm: sha256WithRSAEncryption
        Issuer: CN = EveractiveCA
            Not Before: Jul 21 14:49:21 2023 GMT
            Not After : Jul 21 14:59:20 2123 GMT
        Subject: CN = EveractiveCA
        Subject Public Key Info:
            Public Key Algorithm: rsaEncryption
Create Broker Certificates: airgap-broker,
Created CA/certs/airgap-broker.key
Created CA/certs/airgap-broker.csr
Created CA/certs/airgap-broker.crt from CA/certs/airgap-broker.csr signed by CA/certs/EveractiveCA.key
Create Proxy Certificates: airgap-proxy,
Created CA/certs/airgap-proxy.key
Created CA/certs/airgap-proxy.csr
Created CA/certs/airgap-proxy.crt from CA/certs/airgap-proxy.csr signed by CA/certs/EveractiveCA.key
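You can confirm that a generated certificate chains to the CA with openssl verify. The sketch below builds a throwaway CA and leaf certificate so the commands are easy to try anywhere; for the appliance, point the same commands at certificates/certs/EveractiveCA.crt and certificates/certs/airgap-broker.crt:

```shell
set -e
workdir=$(mktemp -d); cd "$workdir"
# Throwaway CA (stands in for EveractiveCA).
openssl req -x509 -newkey rsa:2048 -nodes -subj "/CN=DemoCA" \
  -keyout ca.key -out ca.crt -days 1
# Leaf key + CSR (stands in for airgap-broker), signed by the CA.
openssl req -newkey rsa:2048 -nodes -subj "/CN=demo-broker" \
  -keyout leaf.key -out leaf.csr
openssl x509 -req -in leaf.csr -CA ca.crt -CAkey ca.key \
  -CAcreateserial -out leaf.crt -days 1
openssl verify -CAfile ca.crt leaf.crt       # prints: leaf.crt: OK
openssl x509 -noout -enddate -in leaf.crt    # check expiry
```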

Initial Gateway and Sensor Configuration

When the Data API container starts up, it will attempt to read a file in the root of the export directory called config.yml. This file contains the configuration for one or more sensors and gateways. Everactive support will provide you with this file in a separate download.

Example config.yml file

  - enableWakeup: true
    enabled: true
    syncTime: 0
    wakeupCode: "7C6EA12C"
    wakeupOffset: 60
    wakeupPeriod: 60
    beaconOffset: 400
    broadcastPeriod: 60
    panId: 25
    name: "3AG32B5"
    serialNumber: "3AG32B5"
  - encryptionKey: a-secret-value
    macAddress: 01:02:03:43:AA:BC:DE:FF
    manufacturingFwVersion: 1.0.1
    manufacturingPartNumber: evr-01-23456
    manufacturingSerialNumber: evrac01020343AABCDEFF
    type: VibrationalElectrical

Start the Docker Stack

You are now ready to start the Docker Compose stack. Run the following command.

export$ sudo bin/task up

Example Output

export$ sudo bin/task up
task: [run:up] docker-compose -f docker-compose.yml up -d
[+] Running 12/12
 ✔ Network export_everactivenet            Created                                                              0.1s 
 ✔ Volume "export_airgap-prometheus-data"  Created                                                              0.0s 
 ✔ Volume "export_airgap-postgres-data"    Created                                                              0.0s 
 ✔ Volume "export_airgap-rabbitmq-data"    Created                                                              0.0s 
 ✔ Volume "export_airgap-cdc-data"         Created                                                              0.0s 
 ✔ Container airgap-database               Started                                                              1.1s 
 ✔ Container airgap-broker                 Started                                                              1.4s 
 ✔ Container airgap-proxy                  Started                                                              1.3s 
 ✔ Container airgap-prometheus             Started                                                              1.1s 
 ✔ Container airgap-cdc                    Started                                                              2.2s 
 ✔ Container airgap-api                    Started                                                              2.3s 
 ✔ Container airgap-sink                   Started                                                              2.4s

Validate the Install

Once started, you can run the built-in validation report. This runs a suite of tests to make sure the stack is functioning correctly. Each test reports one of:
✅ - Success: Test passed
🟠 - Warning: The test did not pass, but nothing is broken - See the troubleshooting guide
🛑 - Error: The test failed; something is wrong - See the troubleshooting guide

export$ sudo bin/task validate

Example Output

export$ sudo bin/task validate
task: [run:validate] docker run --rm -v $PWD/validate:/opt/validate -w /opt/validate /bin/bash -c "ansible-playbook playbook.yaml --extra-vars 'api_url=https://$PROXY_EXTERNAL_HOST'"
[WARNING]: No inventory was parsed, only implicit localhost is available
[WARNING]: provided hosts list is empty, only localhost is available. Note that
the implicit localhost does not match 'all'

PLAY [Diagnostics] *************************************************************

TASK [Get API Health] **********************************************************
ok: [localhost]

TASK [Assert API is health] ****************************************************
ok: [localhost] => {
    "changed": false,
    "msg": "✅ API is healthy"
}

TASK [Get API Sensor Data] *****************************************************
ok: [localhost]

TASK [Assert sensors API response success] *************************************
ok: [localhost] => {
    "changed": false,
    "msg": "✅ API has returned a successful response"
}

TASK [Assert sensors API has data] *********************************************
ok: [localhost] => {
    "assertion": " | length > 0",
    "changed": false,
    "evaluated_to": false,
    "failed_when_result": false,
    "msg": "🟠 API no sensors data returned"
}

TASK [Get API Gateway Data] ****************************************************
ok: [localhost]

TASK [Assert gateways API response success] ************************************
ok: [localhost] => {
    "changed": false,
    "msg": "✅ API has returned a successful response"
}

TASK [Assert gateways API returns data] ****************************************
ok: [localhost] => {
    "assertion": " | length > 0",
    "changed": false,
    "evaluated_to": false,
    "failed_when_result": false,
    "msg": "🟠 API no gateways data returned"
}

PLAY RECAP *********************************************************************
localhost                  : ok=8    changed=0    unreachable=0    failed=0    skipped=0    rescued=0    ignored=0

Services

Data API

The Everactive API can be accessed with any HTTP client via REST calls using basic authentication (ask Everactive support for credentials). Test the API service by calling the health endpoint (the -k flag skips verification of the self-signed certificate):

curl -k --request GET \
     --url https://HOST/data/v1/health \
     --header 'accept: application/json'
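For authenticated calls, curl's -u flag adds the basic-auth header. The sketch below shows what that header contains, using placeholder credentials; passing the bootstrapped CA with --cacert lets curl verify the appliance's self-signed certificate:

```shell
# Basic auth is just a base64-encoded "user:password" header.
token=$(printf '%s' "api_user:api_password" | base64)
echo "Authorization: Basic $token"
# Equivalent authenticated call against the appliance (placeholder credentials):
#   curl --cacert certificates/certs/EveractiveCA.crt \
#        -u "api_user:api_password" \
#        https://HOST/data/v1/health
```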


API Documentation



Prometheus

The Everactive appliance includes a Prometheus server that collects metrics from the different services. You can access the Prometheus UI at https://HOST/prometheus/graph with the following credentials:

• username: airgap_user
• password: airgap_password

All metrics are exposed through the Prometheus Federation endpoint to integrate other systems like Grafana or DataDog.

curl -k -G --data-urlencode 'match[]={job=~".+"}' \
  -u airgap_user:airgap_password \
  https://HOST/prometheus/federate

Troubleshooting

ISSUE: sudo bin/task validate report returned 🛑 API is not healthy
SOLUTION: The API is usually unhealthy because it cannot connect to the database. Run docker ps | grep airgap-database to check the status of the database service, and docker logs airgap-database to look for errors. If you are not sure how to proceed, send a full dump of the logs (docker logs airgap-database > dump.txt 2>&1) to Everactive support.

ISSUE: sudo bin/task validate report returned 🟠 API no gateways data returned
SOLUTION: This is just a warning that no gateways are reporting yet. Once one or more gateways are correctly configured, rerun sudo bin/task validate; you should then see ✅ API has returned gateway data. If you still do not get gateway data, contact Everactive support.

ISSUE: sudo bin/task validate report returned 🟠 API no sensors data returned
SOLUTION: This is just a warning that no sensors are reporting yet. Once gateways and sensors are configured, rerun sudo bin/task validate; you should then see ✅ API has returned sensor data. If you still do not get sensor data, contact Everactive support.


Upgrade

Before beginning the upgrade process, always review the release notes included with the new release.

When a new release tar.gz is available, download it to your machine. Unpack it in a location other than the current install location.

wget '{PRESIGNED URL}' -O export-new-version.tar.gz  
tar -zxvf export-new-version.tar.gz  
cd export-new-version

Review the release notes for any instructions specific to the version you are upgrading to.

Run the upgrade task and set INSTALL_DIR to the path where the current version is installed. This operation installs the new Docker images and any necessary files.

sudo bin/task upgrade INSTALL_DIR=/path/to/current/export

Finally, run the following commands from /path/to/current/export to reload the services.

cd /path/to/current/export
sudo bin/task reload