Use a bash script

from pathlib import Path
import time
import subprocess
from remote_run.run import (
    ExecutionContext,
    SlurmSchedulingEngine,
    GuixRunner,
    GitProject,
    remote,
    is_finished,
    SshExecution,
)

Define an execution context with a scheduling engine. For Slurm, you can pass Slurm parameters directly here.

execution_context = ExecutionContext(
    execution=SshExecution(
        machine="shpc0003.ost.ch",
        working_directory=Path("/cluster/raid/home/reza.housseini"),
    ),
    project=GitProject(),
    runner=GuixRunner(channels=Path("channels.scm").read_text()),
    scheduling_engine=SlurmSchedulingEngine(
        job_name="run_bash_script",
        mail_type="ALL",
        mail_user="reza.housseini@ost.ch",
    ),
)

Decorate the functions you want to run in the specified execution context.

@remote(execution_context)
def execute_bash_commands(message):
    # Run through a shell so ";" chains the two commands, and strip the
    # trailing newline so the comparison below holds.
    return subprocess.run(
        f"echo {message}; hostname", shell=True, capture_output=True, encoding="utf-8"
    ).stdout.strip()

This call runs on the remote machine specified in execution_context. Because scheduling engines are asynchronous, it does not return the result directly; instead it returns the job id and a function to retrieve the result later.

job_id, result_func = execute_bash_commands("hello")

Now we wait for the remote execution to finish before retrieving the result.

time.sleep(20)

We should check whether the job has finished before retrieving the result.

if is_finished(execution_context, job_id):
    assert result_func() == "hello\nshpc0003"
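
Rather than sleeping for a fixed 20 seconds, you can poll until the job reports finished. A minimal sketch, assuming is_finished keeps returning False while the job is still queued or running; the poll interval is an arbitrary choice for illustration:

poll_interval = 5  # seconds between checks; chosen arbitrarily for this sketch
while not is_finished(execution_context, job_id):
    time.sleep(poll_interval)

# The job has finished, so the result can now be retrieved.
print(result_func())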
