HTS Bioinf - Scheduled recurring jobs

Scope

This procedure describes how to operate the system we use for scheduled, recurring jobs, including but not limited to:

  • ELLA database exports
  • ELLA maintenance
  • Sample repo update

Responsibility

Responsible person: Production bioinformatician.

Definitions

Dagu: Software used to schedule and execute the recurring jobs


How and where recurring jobs run

The jobs are specified and run using Dagu. See the Dagu documentation for details.

The folder /ess/p22/data/durable/production/dev-ops contains the source for all the scripts that run. The folder is split in two:

  • ./tasks is where the Dagu jobs are specified, using YAML. See the Dagu repository for the specification of this format. A minimal example is sketched below.
  • ./src is where the scripts are stored.
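
For illustration, a minimal task specification could look like the sketch below. The file name, schedule, and script path are hypothetical and should be adapted to the actual job.

    # Hypothetical file: ./tasks/ella-export.yaml
    schedule: "0 2 * * *"   # run every night at 02:00
    steps:
      - name: export
        command: /ess/p22/data/durable/production/dev-ops/src/ella_export.sh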

Updating Dagu

Dagu provides a stand-alone binary at https://github.com/dagu-dev/dagu/releases.

Put the updated binary under ./bin and update the ./bin/dagu symlink.
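
As a sketch, updating to a new release could look like the following; the version number and the versioned-filename convention are assumptions, not a prescribed layout.

    cd /ess/p22/data/durable/production/dev-ops
    # Hypothetical version; copy the downloaded binary next to the old ones
    cp /path/to/downloaded/dagu ./bin/dagu-1.14.3
    chmod +x ./bin/dagu-1.14.3
    # Point the symlink at the new binary
    ln -sfn dagu-1.14.3 ./bin/dagu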

Monitoring

Dagu is a web service run by serviceuser on http://p22-app-01:8180.

Opening this page shows all specified jobs, their schedules, and their status (success/failure).
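
A quick liveness check from the command line can confirm that the web service is answering on the address above; this is only an HTTP probe, not a Dagu feature.

    # Prints "up" if the Dagu web UI answers, "down" otherwise
    curl -sf http://p22-app-01:8180/ > /dev/null && echo up || echo down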

Adding/modifying/removing a job

Any modification to a YAML file under ./tasks is picked up automatically by Dagu; there is no need to stop or start the service. Changes only affect the task specified in the modified YAML file.

Use the web interface to test the job; an execution can be triggered manually from there.
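
If a command-line trigger is preferred, Dagu can also start a DAG manually with its start subcommand; the task file name below is hypothetical.

    cd /ess/p22/data/durable/production/dev-ops
    # Run one job immediately, outside its schedule (hypothetical task file)
    ./bin/dagu start ./tasks/ella-export.yaml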

Stopping/starting service

The two top-level scripts run-dagu.sh and stop-dagu.sh start and stop Dagu, respectively. You must be logged in as serviceuser to start or stop the service.
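
As a sketch, assuming sudo access to the service account (how you become serviceuser may differ):

    # Become serviceuser (assumption: sudo is available; su - serviceuser also works)
    sudo -u serviceuser -i
    cd /ess/p22/data/durable/production/dev-ops
    ./stop-dagu.sh   # stop the Dagu service
    ./run-dagu.sh    # start it again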

Recurring jobs

This procedure does not aim to list all scheduled jobs, as these are documented within the folder itself.