Orchestration of processing tasks 🤖

Creating a task on the orchestrator

In this tutorial you will learn how to create a task, submit it to the orchestrator and schedule its execution.

Presentation of a predefined chain

An instance of Spring Cloud Data Flow, a data processing service, is available on the platform. This tool lets you define processing chains (🔗 streams), each built as a succession of unitary steps (🔗 applications).
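
For illustration, a stream in Spring Cloud Data Flow is written in a pipe-based DSL that chains applications together. The definition below uses http, transform and log, which are stock Spring Cloud Data Flow sample applications, not the DassFlow2D chain:

http | transform | log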

Here is the description page of the predefined 🔗 DassFlow2D chain.

Demonstration of entity creation

Chain creation

A new chain can be created 🔗 graphically. The same action can also be performed programmatically, as sketched below.
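
Spring Cloud Data Flow exposes a REST API for stream creation. As a minimal sketch, assuming the server is reachable at the placeholder URL http://<scdf-server>, a new stream definition could be posted with curl:

# <scdf-server> is a placeholder for the platform's SCDF endpoint
curl -X POST "http://<scdf-server>/streams/definitions" \
  --data-urlencode "name=my-chain" \
  --data-urlencode "definition=http | transform | log" \
  --data-urlencode "deploy=false"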

Application creation

To create an application of type processor, open the 🔗 integrated development environment and navigate to scdf-df2d-to-raster/processor/Dockerfile to see how a new application is defined.
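
As a rough sketch of what such a Dockerfile can look like (the base image, file name and entrypoint below are illustrative assumptions, not the actual contents of scdf-df2d-to-raster/processor/Dockerfile), the image simply packages the processing code behind an entrypoint:

# Hypothetical example, not the project's actual Dockerfile
FROM python:3.10-slim
WORKDIR /app
# Copy the processing script into the image (illustrative name)
COPY process.py .
# Each container run executes one processing step
ENTRYPOINT ["python", "process.py"]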

Publishing this application is done through a command-line utility available on the platform. In a terminal of the development environment, enter the following commands:

docker login 643vlk6z.gra7.container-registry.ovh.net
# Enter the Harbor password
cd ~/projects/metis-demo-hydro/code/scdf-df2d-to-raster/processor

then publish a new version:

scdf_publish processor df2d-to-raster 0.x  # Increment the version!

then check that the application appears in 🔗 the list of available applications. The new version should be listed.
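
The same check can be made from the terminal through the Spring Cloud Data Flow REST API. A minimal sketch, again assuming the placeholder URL http://<scdf-server>:

# <scdf-server> is a placeholder; shows the registered processor's details
curl "http://<scdf-server>/apps/processor/df2d-to-raster"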

Launching a processing

DassFlow-2D is a flood forecasting model. It takes as input RGE-Alti IGN and OSO data, topographic surveys, and flow measurements, and it produces raster images of flood predictions. The execution of this model is integrated into the processing execution engine.

The launch of this execution is data-driven: depositing new input data on the S3 storage triggers the processing. Example input data is available in the directory $HOME/projects/metis-demo-hydro/code/df2d_to_raster/example/input.

From a terminal, run:

cd $HOME/projects/metis-demo-hydro/code/df2d_to_raster/example
aws s3 cp ./input s3://metis-demo-hydro/ingest/ --recursive
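
To confirm that the files landed in the ingest prefix before the chain picks them up, list it with the same bucket path:

aws s3 ls s3://metis-demo-hydro/ingest/ --recursive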

The system detects this new input and starts the processing. The progress logs are available on the 🔗 DassFlow2D chain definition page, in the RUNTIME frame, via the VIEW LOG button of the df2d-to-raster application. At the end of the processing, the output is written to the S3 storage. To check that it has been created correctly, list the contents of the output directory:

aws s3 ls s3://metis-demo-hydro/output --recursive

It is then possible to access the product directly or to download it for further processing.
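
For example, a product can be copied locally with the same AWS CLI; <product-key> below is a hypothetical placeholder for one of the keys returned by the listing above:

# <product-key> is a placeholder taken from the aws s3 ls output
aws s3 cp s3://metis-demo-hydro/output/<product-key> .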

Monitoring on Metis launcher

The resources consumed by this processing can be monitored in the 🔗 monitoring system (a command-line alternative is sketched after the steps below):

  • Go to Browse, then Kubernetes / Compute Resources / Pods
  • Choose spring-cloud-dataflow in the second drop-down list at the top.
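
Assuming kubectl access to the cluster, and assuming the drop-down value above corresponds to a namespace named spring-cloud-dataflow, current consumption can also be read from a terminal:

# Namespace name is an assumption based on the Grafana drop-down above
kubectl top pods -n spring-cloud-dataflow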

Go to the next step: analysis of the results...