# Parameter Sweeps

See code in GitLab.
Author: Christoph Augustin christoph.augustin@medunigraz.at

## Parameter sweeps

Transmembrane voltage can be altered by applying an electric field in the extracellular space. As outlined in Sec. electrical-stimulation, such a field can be set up either by injecting/withdrawing currents in the extracellular space or by imposing the extracellular potential with voltage sources.

To test the influence of the instant of stimulation, we generate a thin strand of tissue of 1 cm length. Electrodes are located at both caps of the strand, and we vary the onset times of the stimuli applied at these electrodes.

## Experiment Options

Several types of stimulation setups are predefined. To run these experiments, execute

```
cd ${TUTORIALS}/02_EP_tissue/20_parameter_sweep
```

and run

```
./run.py --help
```

to see all experimental parameters. In this example we will use the carputils options

```
--polling-param POLLING_PARAM [POLLING_PARAM ...]
                      Polling parameter
--polling-range min:max:num [min:max:num ...]
                      Define polling parameter range
--polling-file POLLING_FILE
                      File including polling data for parameter sweeps
--sampling-type {linear,lhs}
                      Sampling type for parameter sweeps. Choose between
                      "linear" and latin hypercube ("lhs") sampling
```

## Generation of a polling file

Generate a polling file in which the start times of stimulus[0] and stimulus[1] are sampled linearly in the intervals [20 ms, 60 ms] and [50 ms, 70 ms], respectively. A total of 25 sampling points are used. Note that the number of sampling points has to be the same for each polling parameter. The result is stored in the polling file stimulus.poll.

```
./run.py --polling-file stimulus.poll \
         --polling-param stimulus[0].start stimulus[1].start \
         --polling-range 20:60:25 50:70:25
```
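For illustration, each `min:max:num` range spec corresponds to `num` evenly spaced values over the interval. A minimal sketch of this expansion (the helper name `polling_range` is hypothetical, not part of carputils):

```python
def polling_range(spec):
    """Expand a 'min:max:num' spec into evenly spaced sample values.

    Mirrors the linear interpolation of --polling-range; the parsing
    here is an illustrative assumption, not carputils code.
    """
    lo, hi, num = spec.split(":")
    lo, hi, num = float(lo), float(hi), int(num)
    step = (hi - lo) / (num - 1)
    return [lo + i * step for i in range(num)]

# stimulus[0].start sampled linearly in [20 ms, 60 ms] with 25 points
starts = polling_range("20:60:25")
```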

To use [Latin hypercube sampling](https://en.wikipedia.org/wiki/Latin_hypercube_sampling) instead of linear interpolation, run the following command. Note that this requires the pyDOE Python package, which is not part of the standard library but can be installed using pip.

```
./run.py --polling-file stimulus.poll \
         --polling-param stimulus[0].start stimulus[1].start \
         --polling-range 20:60:25 50:70:25 \
         --sampling-type lhs
```
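Conceptually, Latin hypercube sampling divides each parameter range into `num` equal strata and draws exactly one sample per stratum, with the strata order shuffled independently per parameter. A minimal stdlib sketch of the idea (carputils itself delegates to pyDOE; this function is only an illustration):

```python
import random

def latin_hypercube(ranges, num, rng=None):
    """Draw `num` LHS samples for each (lo, hi) range in `ranges`.

    Each parameter's interval is split into `num` equal strata; one
    point is drawn uniformly inside each stratum, and the strata order
    is shuffled independently per parameter.
    """
    rng = rng or random.Random()
    columns = []
    for lo, hi in ranges:
        width = (hi - lo) / num
        strata = list(range(num))
        rng.shuffle(strata)
        columns.append([lo + (s + rng.random()) * width for s in strata])
    # transpose: one row (tuple) per sample point
    return list(zip(*columns))

# 25 samples for stimulus[0].start in [20, 60] and stimulus[1].start in [50, 70]
samples = latin_hypercube([(20, 60), (50, 70)], 25)
```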

Remarks:

• An arbitrary number of parameters can be passed to --polling-param. Each parameter requires a matching polling range, and the number of samples has to be the same for all parameters.
• carputils automatically checks whether each polling parameter is actually set by the user in the run script or in any .par file. If it is not set, an error is thrown.
• You can write your own Python function to generate a polling file, e.g., for parameters that cannot easily be iterated or that are not set via the openCARP simulator (such as dynamically generated meshes). As long as the layout of the file matches the files generated by carputils, all of the following simulation functionality, including job submission on clusters, will still work!
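As a starting point for such a custom generator, a sketch like the following could write a polling file. The column layout assumed here (a header line with parameter names, one whitespace-separated row per sample) is an assumption for illustration and should be checked against a file actually produced by carputils:

```python
def write_polling_file(path, params, samples):
    """Write a polling file: header of parameter names, one row per sample.

    `params`  - list of parameter names, e.g. ['stimulus[0].start', ...]
    `samples` - list of rows, each holding one value per parameter.
    NOTE: the exact file layout is an assumption; compare with a
    carputils-generated stimulus.poll before relying on it.
    """
    with open(path, "w") as fh:
        fh.write(" ".join(params) + "\n")
        for row in samples:
            fh.write(" ".join(str(v) for v in row) + "\n")

write_polling_file("stimulus.poll",
                   ["stimulus[0].start", "stimulus[1].start"],
                   [(20.0, 50.0), (40.0, 60.0), (60.0, 70.0)])
```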

## Run a simulation with the generated polling file

To run the simulations serially with the parameters set in the file stimulus.poll, execute

```
./run.py --polling-file stimulus.poll
```

Usually, we want to run parameter sweeps in parallel. On a desktop machine, to run four simulations in parallel with 2 processes each, use

```
./run.py --polling-file stimulus.poll --np 8 --np-job 2
```

Remarks:

• --np has to be a multiple of --np-job; otherwise an error is thrown.
• A subdirectory for each simulation is generated automatically. To prevent this behavior, you can include the parameter --simID <some-simulation-id> in the polling file; the system automatically checks whether --simID is set there and, if so, uses the simulation ID from the polling file.
• Alternatively, you can set job.carp(cmd, polling_subdirs=False) in the run script to prevent the generation of subdirectories in the simulation directory. Be aware that output files may be overwritten in this mode.
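The relationship between --np and --np-job amounts to the following check and partitioning (the function name is hypothetical, not part of carputils):

```python
def concurrent_jobs(np_total, np_job):
    """Number of simulations run side by side: --np is split into
    chunks of --np-job processes, so --np must be a multiple of
    --np-job; otherwise an error is raised, mirroring the carputils
    check described above."""
    if np_total % np_job != 0:
        raise ValueError("--np must be a multiple of --np-job")
    return np_total // np_job

# --np 8 --np-job 2 -> four simulations in parallel, 2 processes each
jobs = concurrent_jobs(8, 2)
```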

## Run parameter sweeps using carputils on clusters

Most clusters allow running parameter sweeps through their job submission system.

To run a parameter sweep on VSC3 with a total of 512 cores and 2 cores per simulation, run the following command. The --dry flag gives you the opportunity to check the job file before submission.

```
./run.py --polling-file stimulus.poll --np 512 --np-job 2 \
         --runtime 24:00:00 --platform vsc3 --dry
```

To immediately submit a parameter sweep job on Archer with a total of 1152 cores and 24 cores per simulation, run without --dry:

```
./run.py --polling-file stimulus.poll --np 1152 --np-job 24 \
         --runtime 24:00:00 --platform archer
```

Note

• You cannot use aprun (the launcher used, e.g., on Archer) to run more than one application on a single node (= 24 cores) at the same time.
• If the polling functionality is not yet included on the cluster you want to use, I can help to set that up.

© Copyright 2020 openCARP project. Supported by DFG and EuroHPC.