ePIC Detector Benchmarks
========================
[![pipeline status](https://eicweb.phy.anl.gov/EIC/benchmarks/detector_benchmarks/badges/master/pipeline.svg)](https://eicweb.phy.anl.gov/EIC/benchmarks/detector_benchmarks/-/commits/master)
## Overview
Detector benchmarks are meant to provide a maintained set of performance plots for individual detector subsystems and to catch regressions in them.
The analyses are designed to avoid a reconstruction step, which precludes using [juggler](https://eicweb.phy.anl.gov/EIC/juggler) for processing the events.
## Documentation
- See [tutorial](https://eic.github.io/tutorial-developing-benchmarks/)
- See [common_bench](https://eicweb.phy.anl.gov/EIC/benchmarks/common_bench/).
## Running Locally
### Local development example
Here we set up to use our local build of the `juggler` library.
Note that `juggler` is not actually needed for `detector_benchmarks`, since it is not used here, but this is the same setup as for
`reconstruction_benchmarks` and `physics_benchmarks`.
First set some environment variables.
```
export DETECTOR=epic # epic is the default
```
```
# clone the benchmarks and the common setup scripts (common_bench goes into ./setup)
git clone https://eicweb.phy.anl.gov/EIC/benchmarks/detector_benchmarks.git && cd detector_benchmarks
git clone https://eicweb.phy.anl.gov/EIC/benchmarks/common_bench.git setup
# install the common tools into .local
source setup/bin/env.sh && ./setup/bin/install_common.sh
# load the local environment and build the detector
source .local/bin/env.sh && build_detector.sh
# create the output directories (sim_output may be linked to local data storage)
mkdir_local_data_link sim_output
mkdir -p results
mkdir -p config
```
## Adding new benchmarks
To get an idea of what to do look at an existing benchmark in the
[`benchmarks` directory](https://github.com/eic/detector_benchmarks/tree/master/benchmarks).
Currently a good reference for Snakemake instrumentation is available in the `tracking_performances` benchmark.
It relies on single particle simulations that can be either produced on eicweb or downloaded from official campaigns.
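Once the local environment from the previous section is set up, individual outputs can be requested from Snakemake at the repository root. A minimal sketch, assuming a hypothetical target path (the real targets are defined in each benchmark's `Snakefile`):
```
# dry-run to see which rules would be executed (the target path is a made-up example)
snakemake --cores 2 --dry-run results/tracking_performances/example_plot.pdf
# then produce the target for real
snakemake --cores 2 results/tracking_performances/example_plot.pdf
```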
### File organization
For a minimal benchmark you'll need to add
`benchmarks/<benchmark_name_here>/config.yml` and
`benchmarks/<benchmark_name_here>/Snakefile`, plus the analysis script/macro.
The `Snakefile` has to be included in the root `./Snakefile` of the repository.
That common entry point is needed so that common simulation samples can be
defined once and re-used by several benchmarks.
The `config.yml` has to be included from the top-level `./.gitlab-ci.yml`.
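As a concrete sketch of that layout, assuming a hypothetical benchmark named `my_benchmark` (the analysis script name is also made up):
```
# skeleton for a hypothetical benchmark called "my_benchmark"
mkdir -p benchmarks/my_benchmark
touch benchmarks/my_benchmark/config.yml   # GitLab CI jobs, included from ./.gitlab-ci.yml
touch benchmarks/my_benchmark/Snakefile    # Snakemake rules, included from the root ./Snakefile
touch benchmarks/my_benchmark/analysis.py  # analysis script/macro that produces the plots
# hook the new rules into the common Snakemake entry point
echo 'include: "benchmarks/my_benchmark/Snakefile"' >> Snakefile
```
The matching entry for `config.yml` is added by hand to `./.gitlab-ci.yml`, following the existing includes there.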
### Pass/Fail tests
- Create a script that returns exit status 0 for success (see the sketch below).
- Any non-zero value will be considered failure.
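A minimal sketch of such a check, assuming a hypothetical output file produced by the benchmark:
```
#!/bin/bash
# Pass/Fail check: exit 0 on success, non-zero on failure.
# The file path below is a made-up example of a benchmark output.
if [[ -s "results/my_benchmark/performance_plot.pdf" ]]; then
    echo "benchmark output found: PASS"
    exit 0
else
    echo "benchmark output missing or empty: FAIL"
    exit 1
fi
```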