To deploy the pipeline on a system, do the following:

- Clone the GitHub repo `STFC-ICD-Research-and-Design/supermusr-data-pipeline` and `cd` into the directory `supermusr-data-pipeline`.
- Run `nix develop --command cargo build --release`.
- Clone the GitHub repo `Modularius/pipeline-test` and `cd` into `pipeline-test`.
- In the file `Settings/PipelineSetup.sh`:
  - Ensure `APPLICATION_PREFIX` points to the directory containing the component executables.
  - Ensure `OTEL_ENDPOINT` points to the correct OpenTelemetry connector.
- Duplicate folders in `Settings/` for each broker you wish to connect to. By default `Settings/Local` is included for the case where a broker is installed locally. For a broker in "location", modify the file `Settings/"location"/PipelineConfig.sh`:
  - Set `BROKER` to point to the Kafka broker.
  - Set `TRACE_TOPIC`, `DAT_EVENT_TOPIC`, etc. to the names of the appropriate topics on the broker.
  - Set `DIGITISERS` to the list of `digitiser_id`s that are handled by the broker (make use of `build_digitiser_argument` if possible).
  - Set `NEXUS_OUTPUT_PATH` to point to the desired location of the Nexus Writer output. Each broker should have its own folder in `Output`.
- In the file `Settings/EventFormationConfig.sh`, set the [TODO]
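For reference, a broker configuration following the steps above might look like the sketch below. The variable names come from the list above, but every value shown (broker address, topic names, digitiser ids, output folder) is illustrative, and `build_digitiser_argument` is assumed to be the helper provided by the `pipeline-test` repo.

```shell
# Hypothetical sketch of Settings/Local/PipelineConfig.sh — all values are
# illustrative, not the actual deployment settings.
BROKER="localhost:19092"          # address of the Kafka broker
TRACE_TOPIC="Traces"              # topic names as configured on the broker
DAT_EVENT_TOPIC="Events"
# Digitiser ids handled by this broker; build_digitiser_argument (assumed to
# be provided by pipeline-test) formats the list for the components.
DIGITISERS="$(build_digitiser_argument 1 2 3)"
NEXUS_OUTPUT_PATH="Output/Local"  # one folder per broker under Output/
```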
Having configured the existing shell files, the following directories can be created:
| Directory | Contents |
| --- | --- |
| Docker | `.yaml` and `.env` files for docker-compose |
| Docs | Documentation specific to this deployment |
| Jupyter | Python/Jupyter scripts specific to this deployment |
| Scripts | Shell scripts, each of which runs a specific set of instructions |
| Simulations | `.json` files for use by the simulator |
| Tests | Shell scripts which perform specific tasks that may be shared by multiple `Scripts/` shells |
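For illustration, the skeleton from the table can be created as follows. The scratch location is hypothetical; in practice these folders live in the `pipeline-test` checkout.

```shell
#!/bin/sh
# Create the deployment folders listed in the table above, here inside a
# temporary scratch directory for illustration.
DEPLOY_DIR="$(mktemp -d)"
for d in Docker Docs Jupyter Scripts Simulations Tests; do
    mkdir -p "$DEPLOY_DIR/$d"
done
ls "$DEPLOY_DIR"
```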
To run the pipeline, call `./run_pipeline.sh`. To kill the pipeline (though not the simulator), call `./kill.sh`.
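The run/kill pair above suggests the usual PID-file pattern. The sketch below is a hypothetical illustration of that pattern, not the actual contents of `run_pipeline.sh` or `kill.sh`; a background `sleep` stands in for a pipeline component.

```shell
#!/bin/sh
# Hypothetical run/kill pattern: record each component's PID at launch so a
# kill.sh-style teardown can stop them all later.
PID_FILE="$(mktemp)"

# Stand-in for launching one pipeline component in the background.
sleep 60 &
echo $! >> "$PID_FILE"

# kill.sh-style teardown: signal and reap every recorded PID.
while read -r pid; do
    kill "$pid" 2>/dev/null
    wait "$pid" 2>/dev/null
done < "$PID_FILE"
rm -f "$PID_FILE"
echo "pipeline stopped"
```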
To mount the archive, run:

```shell
mount -t cifs \
    -o username=SuperMusr_mgr -o password=******** \
    -o domain=ISIS -o vers=2.1 -o noserverino -o _netdev \
    //ISISARVR55.isis.cclrc.ac.uk/SuperMusrTestDataBackup$ \
    /mnt/archive
```

altering any parameters as required.
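Before writing output or backups to the archive, it can be worth confirming the share is actually mounted. This is a hypothetical helper, not part of the deployment scripts, and assumes a Linux host with `/proc/mounts`.

```shell
#!/bin/sh
# Hypothetical check: report whether a path appears in /proc/mounts (Linux).
is_mounted() {
    grep -qs " $1 " /proc/mounts
}
if is_mounted /mnt/archive; then
    STATUS="mounted"
else
    STATUS="not mounted"
fi
echo "archive $STATUS"
```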
The deployment follows the pattern:

```mermaid
erDiagram
    RUN_PIPELINE["run_pipeline"] {}
    LIBS["libs/*"] {}
    RUN_PIPELINE ||--|| LIBS : calls
    SETUP["Settings/PipelineSetup.sh"] {}
    RUN_PIPELINE ||--|| SETUP : calls
    CONFIG["Settings/*/PipelineConfig.sh"] {}
    RUN_PIPELINE ||--|| CONFIG : calls
    EVCONFIG["Settings/EventFormationPipelineConfig.sh"] {}
    RUN_PIPELINE ||--|| EVCONFIG : calls
    SCRIPTS["Scripts/*.sh"] {}
    RUN_PIPELINE ||--|{ SCRIPTS : calls
    TESTS["Tests/*.sh"] {}
    SCRIPTS }|--|{ TESTS : calls
```