ePIC Detector Benchmarks
========================

[![pipeline status](https://eicweb.phy.anl.gov/EIC/benchmarks/detector_benchmarks/badges/master/pipeline.svg)](https://eicweb.phy.anl.gov/EIC/benchmarks/detector_benchmarks/-/commits/master)

## Overview

Detector benchmarks are meant to provide a maintained set of performance plots for individual detector subsystems.
## Documentation

- See the [tutorial](https://eic.github.io/tutorial-developing-benchmarks/).
- See [common_bench](https://eicweb.phy.anl.gov/EIC/benchmarks/common_bench/).
## Adding new benchmarks

To get an idea of what to do, look at an existing benchmark in the
[`benchmarks` directory](https://github.com/eic/detector_benchmarks/tree/master/benchmarks).
Currently a good reference for Snakemake instrumentation is available in the `tracking_performances` benchmark.
It relies on single-particle simulations that can either be produced on eicweb or downloaded from official campaigns.
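For orientation, here is a minimal sketch of what a rule in a benchmark `Snakefile` could look like. The benchmark name `my_benchmark`, the input/output paths, and the `analysis.C` macro are hypothetical placeholders, not files from this repository.

```
rule my_benchmark_analysis:
    # hypothetical single-particle simulation file used as input
    input:
        "sim_output/my_benchmark/e-_10GeV.edm4hep.root",
    # plot that the rule promises to produce under results/
    output:
        "results/my_benchmark/momentum_resolution.pdf",
    shell:
        """
root -l -b -q 'benchmarks/my_benchmark/analysis.C("{input}", "{output}")'
        """
```

Keeping inputs under `sim_output/` and outputs under `results/` matches the directory layout created in the local setup below.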
### Local development example

Here we set up to use our local build of the `juggler` library.
Note that juggler is not actually needed for `detector_benchmarks`, since it is not used here, but the setup is the same as for
`reconstruction_benchmarks` and `physics_benchmarks`.
First, set some environment variables.
```
export DETECTOR=epic # epic is the default
```
```
# get the benchmarks and the common_bench helper scripts (cloned into ./setup)
git clone https://eicweb.phy.anl.gov/EIC/benchmarks/detector_benchmarks.git && cd detector_benchmarks
git clone https://eicweb.phy.anl.gov/EIC/benchmarks/common_bench.git setup
# set up the common environment and install the shared tooling into .local
source setup/bin/env.sh && ./setup/bin/install_common.sh
# load the local environment and build the detector
source .local/bin/env.sh && build_detector.sh
# create the working directories (sim_output is a link to local data storage)
mkdir_local_data_link sim_output
mkdir -p results
mkdir -p config
```
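With the environment in place, you can request a benchmark output directly from Snakemake. This is only a sketch: the target path is a hypothetical example and should be replaced by an output that is actually defined in the benchmark's `Snakefile`.

```
# request a (hypothetical) benchmark output, running up to 2 jobs in parallel
snakemake --cores 2 results/my_benchmark/momentum_resolution.pdf
```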
### File organization

For a minimal benchmark you'll need to add
`benchmarks/<benchmark_name_here>/config.yml` and
`benchmarks/<benchmark_name_here>/Snakefile`, plus the analysis script/macro.
The `Snakefile` has to be included in the root `./Snakefile` of the repository.
That common entry point is needed so that common simulation samples can be
defined once and re-used by several benchmarks at a time.
The `config.yml` will require an include from the `./.gitlab-ci.yml`.
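As a rough illustration of that wiring (the benchmark name is a hypothetical placeholder), the include in `./.gitlab-ci.yml` would look something like this:

```
# in ./.gitlab-ci.yml
include:
  - local: 'benchmarks/my_benchmark/config.yml'
```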
### Pass/Fail tests

- Create a script that returns exit status 0 for success.
- Any non-zero value will be considered failure.
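For example, a pass/fail check can be a small shell script along these lines; the plot-count criterion and the `results/my_benchmark` path are illustrative assumptions, not part of the repository:

```
#!/bin/bash
# hypothetical criterion: fail if the benchmark produced no plots
nplots=$(find results/my_benchmark -name '*.pdf' | wc -l)
if [ "$nplots" -lt 1 ]; then
  echo "no plots found in results/my_benchmark"
  exit 1  # non-zero exit status marks the benchmark as failed
fi
exit 0  # success
```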