
HTS Bioinf - Release and deployment of tsd-import

Scope

This procedure explains how to properly make new releases of the tsd-import package and how to deploy the software into the production environment.

Responsibility

Responsible person: A bioinformatician responsible for pipeline release.


Platforms

The software is located on different platforms; see the table below:

| Host | Directory | Location | Servers |
|------|-----------|----------|---------|
| TSD | production | /ess/p22/data/durable/production | p22-cluster-sync or p22-submit-dev |
| NSC | transfer | /boston/diag/transfer | sleipnir and beta |
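
The commands in the rest of this procedure use {transfer} and {production} as placeholders for the directory locations in the table above. As an optional convenience sketch (the variable names are illustrative and not set up for you), they can be mapped to shell variables before pasting commands:

```bash
# Optional convenience: map the placeholders to the locations from the table above
transfer=/boston/diag/transfer                 # NSC
production=/ess/p22/data/durable/production    # TSD
```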

1. Tag the repositories for the new release

Merge all changes into master and tag the new release accordingly (from HEAD of master). The version name is vX.X.X and should follow semantic versioning. The tag MUST end with -rel:

```bash
git tag -a vX.X.X-rel -m "Tag release vX.X.X"
```

Immediately push the tag to origin:

```bash
git push origin vX.X.X-rel
```
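
To double-check that the tag points at the intended commit, the following optional commands can be run before or after pushing; they are a convenience, not part of the required procedure:

```bash
# Optional sanity checks (run on HEAD of master)
git describe --tags --exact-match   # should print vX.X.X-rel
git show -s --oneline vX.X.X-rel    # shows the commit the tag points to
```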

2. Export the repository

Open the pipelines view of GitLab CI at https://gitlab.com/DPIPE/labautomation/tsd-import and find a pipeline execution that was run on the relevant release tag.

Start the `release-artifact` job.

Once finished, download the tar archive from the web page.

Transfer to TSD and NSC.
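
As an optional alternative to downloading from the web page, the artifact can usually be fetched through the GitLab jobs-artifacts API; the access token and the downloaded file name below are assumptions, and the download is typically a zip wrapping the job artifacts, so the exact layout depends on the job definition:

```bash
# Optional: fetch the release-artifact job output via the GitLab API
# (requires a personal access token with API read access)
curl --fail --location \
  --header "PRIVATE-TOKEN: <your-token>" \
  --output artifacts.zip \
  "https://gitlab.com/api/v4/projects/DPIPE%2Flabautomation%2Ftsd-import/jobs/artifacts/vX.X.X-rel/download?job=release-artifact"
```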

3. Deploy

1. Deploy on NSC

lims-exporter-api runs on beta and nsc-exporter runs on sleipnir. To simplify the deployment process, we deploy on NSC only when one of these has changed.

Note! The services running on beta and sleipnir must be stopped and started on their respective servers. The removal and unpacking of the deploy package is done on beta.

Copy and archive the deployable package

Copy the exported tarball into the {transfer}/sw/archive directory on beta. This ensures that an archived version of the software always exists before deployment.
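
A minimal sketch, assuming the tarball is on a host with SSH access to beta (the file name and source location are assumptions):

```bash
# Copy the exported tarball into the archive directory on beta
# (file name is illustrative; substitute {transfer} with the actual path)
scp tsd-import-vX.X.X-rel.tar beta:{transfer}/sw/archive/
```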

Stop nsc-exporter and lims-exporter-api

Refer to Execution and monitoring of pipeline section "Finish production" for instructions on how to stop nsc-exporter and lims-exporter-api.

Extract the package

Log into the server: `ssh beta`

Remove the old version:

```bash
rm -rf {transfer}/sw/tsd-import
```

Extract the package and verify that the version is as expected:

```bash
cd {transfer}/sw/
tar -xvf archive/<package-file>
cat tsd-import/version
```

Start nsc-exporter and lims-exporter-api

Refer to Execution and monitoring of pipeline section "Start production" for instructions on how to start nsc-exporter and lims-exporter-api as {serviceuser}.

2. Deploy on TSD

Since filelock-exporter-api only runs on TSD, to simplify the deployment process, we go through this step only when filelock-exporter-api has changed.

Copy and archive the deployable packages

Copy the exported tarball into the {production}/sw/archive directory on TSD.
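
A minimal sketch, assuming the tarball has already been transferred into TSD (the source path and file name are assumptions):

```bash
# Copy the exported tarball into the archive directory on TSD
# (source path and file name are illustrative)
cp /path/to/tsd-import-vX.X.X-rel.tar {production}/sw/archive/
```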

Stop filelock-exporter-api

Refer to Execution and monitoring of pipeline section "Finish production" for instructions on how to stop filelock-exporter-api.

Extract the package

Remove the current module:

```bash
rm -rf {production}/sw/tsd-import
```

Extract the package and confirm that the version is correct:

On `cluster`:

```bash
cd {production}/sw
./slurm-wrapper-untar-tsd-import.sh <package-file> 
# Wait until the job has finished on the cluster.
cat tsd-import/version
```

On durable:

```bash
cd {production}/sw
tar xvf <package-file>
cat tsd-import/version
```

Note! The <package-file> above must be located inside the sw/archive directory, but do not include the directory name when referring to the package file.
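
For example, with a hypothetical package file tsd-import-v1.2.3-rel.tar placed in {production}/sw/archive:

```bash
# Correct: refer to the package file without the archive/ prefix (file name is illustrative)
./slurm-wrapper-untar-tsd-import.sh tsd-import-v1.2.3-rel.tar

# Incorrect: do not include the directory name
# ./slurm-wrapper-untar-tsd-import.sh archive/tsd-import-v1.2.3-rel.tar
```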

Start filelock-exporter-api

Refer to Execution and monitoring of pipeline section "Start production" for instructions on how to start filelock-exporter-api as {serviceuser}.

Background

The tsd-import package consists of three programs, described below: two run on the NSC side and one on the TSD side. Its sole purpose is to package samples from NSC and upload them safely into the production area on TSD.

lims-exporter-api

The lims-exporter-api script monitors the lims exporter queue in Clarity for samples and analyses ready to be exported. It creates sample and analysis packages in the format expected by our automation system.

nsc-exporter

This script assists the user by automatically uploading new samples/analyses and backup pipeline results to the S3 API endpoint on TSD.

filelock-exporter-api

This script watches the S3 API endpoint on TSD for incoming samples and backup pipeline results that have not already been copied to cluster, durable, or durable3. When new samples or backup pipeline results are detected, they are copied to the production area or to durable. Data integrity checks are performed; if a check passes, the samples/analyses/ella-incoming are marked with a READY file for pickup by the automation system. If nsc-exporter is running on the NSC side, filelock-exporter-api must be running on the TSD side.
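
As a purely conceptual illustration of the READY-marker idea (this is not the actual implementation; directory names and the checksum convention are assumptions):

```bash
# Conceptual sketch only: detect new packages, verify integrity, mark them READY.
INCOMING_DIR=/path/to/incoming          # assumption: where uploaded packages appear
TARGET_DIR=/path/to/production-area     # assumption: where packages are unpacked

for pkg in "$INCOMING_DIR"/*.tar.gz; do
    [ -e "$pkg" ] || continue                         # no packages waiting
    name=$(basename "$pkg" .tar.gz)
    [ -e "$TARGET_DIR/$name/READY" ] && continue      # already processed
    # Integrity check against a sidecar checksum file (convention is an assumption)
    if (cd "$INCOMING_DIR" && md5sum -c "$name.tar.gz.md5" > /dev/null 2>&1); then
        mkdir -p "$TARGET_DIR/$name"
        tar -xf "$pkg" -C "$TARGET_DIR/$name"
        touch "$TARGET_DIR/$name/READY"               # signal pickup by the automation system
    fi
done
```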