Guide to Tracking Development in Juggler for ATHENA Collaboration
Overview
This guide assumes the reader has some basic knowledge of tracking and computing. Notably:
- How to effectively use `git`, and has an eicweb account.
- Some basic shell scripting and Python.
- How to use ROOT with modern C++ (`RDataFrame`s are particularly important).
- Some familiarity with `juggler`.
What this guide is not
- This is not a demonstration of ultimate track reconstruction (nowhere near it).
- This is not an analysis tutorial.
What this guide is
- A summary and overview of the basic framework components.
- A resource for algorithm developers.
- A guide to the development workflow driven by CI/CD.
Data Model
The data model is called `eicd`.
Track finding and fitting is based around using ACTS with the detector geometry constructed via DD4hep.
Repositories and Workflow
Repositories
The collaboration uses the EIC group on eicweb which contains the subgroups detectors and benchmarks.
The main software components locally developed are:
- `juggler` (documentation) - Event processing framework (i.e. where algorithms live).
- `eicd` (documentation) - EIC data model.
- `npdet` - Collection of DD4hep simulation plugins and tools.
The key collaboration/user code repositories are:
- detectors/ip6 - IP6 specifics (forward and backward beamline and detectors).
- detectors/athena - ATHENA detector
- Detector benchmarks - Set of analysis scripts run on the Geant4 output before any digitization or reconstruction. Also contains some detector calibrations.
- Reconstruction benchmarks - Analysis of the many aspects of reconstruction. This is where the tracking performance benchmarks and plots live. Also a good place for developing new algorithms.
- Physics benchmarks - Analysis of reconstructed data for physics performance. The goal is to provide metrics for optimizing detector design and reconstruction.
Pipelines and Artifacts
The SWG leverages GitLab's CI/CD features heavily in our workflow. Here are some simplified explanations of these features.
Pipeline
A pipeline is an automated set of jobs/scripts that are triggered by certain actions, such as pushing a merge request or merging into the master/main branch of a repository. Typically there is one pipeline per repository, but there can be multiple, and a pipeline can trigger downstream pipelines ("child" pipelines) or be triggered by an upstream pipeline. They can also be triggered manually.
The graph below shows some of the downstream pipeline triggers (arrows) between different repositories.
```mermaid
graph TD;
  ip[IP6<br>detectors/ip6] --> athena[ATHENA<br>detectors/athena]
  athena-->db[Detector Benchmarks<br>benchmarks/detector_benchmarks];
  db-->rb[Reconstruction Benchmarks<br>benchmarks/reconstruction_benchmarks];
  db-->pb[Physics Benchmarks<br>benchmarks/physics_benchmarks];
  juggler[juggler<br>algorithms]-->rb;
  juggler-->pb;
```
Note that any change to the detectors will cause all the benchmarks to be run.
"OK, pipelines run automatically. What is the big deal?"
Artifacts
All pipeline jobs have "artifacts" which are just selected files that are saved and can be downloaded individually or as a zip file.
Note that artifacts are not the output data, which is far too big. Artifacts are small files such as images, plots, text files, and reports.
Below is an image and a link to a PDF of the latest ATHENA detector version, generated as a job artifact from the `master` branch pipeline.
Artifacts can be browsed via the web interface; for example, the latest reconstruction benchmark results in the `final_report` job can be browsed.
Software Toolkit
ACTS
"Acts is an experiment-independent toolkit for (charged) particle track reconstruction in (high energy) physics experiments implemented in modern C++."
ACTS References:
The ACTS way of tracking
First, the geometry has to be constructed. Currently the ACTS geometry is constructed and tested as part of the athena pipeline.
The tracking surfaces can be viewed in a CAD program using the `tracking_geometry.obj` file output by the reconstruction_benchmarks pipeline.

Assuming the ACTS geometry is already constructed, we move on to describe the data processing.
Source Links and Measurements
TrackerHit object + Geometry information --> Measurement
Hit CellID (channel) + Surface object --> Source Link
A `SourceLinker` is an algorithm that looks up the surface corresponding to each hit's `cellID` and maps it to the hit.
Naturally, the same algorithm also outputs measurements, which are likewise mapped to the hits and contain the position and sensor size information (via the covariance matrix).
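The source-linking step can be sketched in Python (all class and function names here are illustrative, not the actual juggler/ACTS API): each hit's `cellID` is looked up in a cellID-to-surface map, producing a source link, while the measurement carries the position and a covariance derived from the sensor size.

```python
from dataclasses import dataclass

@dataclass
class TrackerHit:
    cell_id: int
    x: float
    y: float

@dataclass
class SourceLink:
    hit: TrackerHit
    surface_id: int

@dataclass
class Measurement:
    x: float
    y: float
    cov_xx: float
    cov_yy: float

def source_link(hits, surface_map, pitch=0.05):
    """Pair each hit with its surface and build the measurement.

    The covariance uses the variance of a uniform pixel/strip response,
    pitch^2 / 12 (the pitch value here is purely illustrative).
    """
    var = pitch**2 / 12.0
    links, measurements = [], []
    for hit in hits:
        links.append(SourceLink(hit, surface_map[hit.cell_id]))
        measurements.append(Measurement(hit.x, hit.y, var, var))
    return links, measurements
```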
Proto tracks
Both track finding and fitting will use the information contained in the source links and measurements.
Track finding produces proto tracks, or groupings of hits. Each proto track is simply a `std::vector<int>` storing the indices of the hits associated with a track seed.
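The same index-based structure can be sketched in Python (the grouping logic and the `seed_labels` input are illustrative, not how juggler's track finders work internally):

```python
def make_proto_tracks(seed_labels):
    """Group hit indices into proto tracks.

    seed_labels[i] is the seed id that hit i was assigned to during
    track finding, or None if the hit is unassigned. Each proto track
    is just a list of hit indices, mirroring std::vector<int>.
    """
    proto = {}
    for hit_index, label in enumerate(seed_labels):
        if label is not None:
            proto.setdefault(label, []).append(hit_index)
    return list(proto.values())

# e.g. make_proto_tracks([0, 1, 0, None, 1]) -> [[0, 2], [1, 4]]
```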
Initial Track parameters and Seeding
A Kalman filter needs a starting point, and that is what the initial track parameters provide. These can be determined in many different ways.
Conceptually, the process of determining these parameters begins with track seeding.
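One common way to estimate initial parameters from a three-hit seed (a generic sketch, not necessarily the estimator juggler or ACTS uses): fit a circle through the hits in the bending plane and convert its radius to a transverse momentum via pT [GeV] ≈ 0.3 · B [T] · R [m].

```python
import math

def circle_through(p1, p2, p3):
    """Center and radius of the circle through three (x, y) hits."""
    (ax, ay), (bx, by), (cx, cy) = p1, p2, p3
    d = 2.0 * (ax * (by - cy) + bx * (cy - ay) + cx * (ay - by))
    ux = ((ax**2 + ay**2) * (by - cy) + (bx**2 + by**2) * (cy - ay)
          + (cx**2 + cy**2) * (ay - by)) / d
    uy = ((ax**2 + ay**2) * (cx - bx) + (bx**2 + by**2) * (ax - cx)
          + (cx**2 + cy**2) * (bx - ax)) / d
    return (ux, uy), math.hypot(ax - ux, ay - uy)

def seed_pt_gev(p1, p2, p3, b_field_tesla=3.0):
    """Estimate pT from the seed curvature (field value is illustrative)."""
    _, radius_m = circle_through(p1, p2, p3)
    return 0.3 * b_field_tesla * radius_m
```

The direction at the innermost hit plus this momentum estimate form the initial track parameters handed to the Kalman filter.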
ACTS CKF
Currently the default is to use the Combinatorial Kalman Filter (CKF), which performs both track finding and fitting.
GenFit2
GenFit is a generic track fitting toolkit built on ROOT and the TGeo library. The geometry is constructed, and geometry handles are available via the geometry service.
We skip this for now.
Adding a New Algorithm to Juggler
Algorithm implementations must belong to the `juggler` library so they can be used in the event processing framework.
Because `juggler` was built on top of the Gaudi framework, the machinery for building a new algorithm and adding it to an existing event processing sequence is quite easy.
The basic tasks for adding a new algorithm are:
- Identify the data inputs and outputs. Use existing structures in the data model or add new ones if needed.
- Implement the algorithm in C++. Create a new merge request (or, preferably, start with a new issue and then click the blue "create merge request" button) for the juggler repository. Mark it as `Draft:` if needed.
- Merge the algorithm into the master branch. Bug the maintainers or the SWG to help with this.
- Add to an existing benchmark, or create a new one, in the `reconstruction_benchmarks` repository.
ConformalXYPeakProtoTracks
Example: Builds proto-tracks based on the conformal mapping in the XY plane of circles to lines. See the code in this commit for details.
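The transform the algorithm's name refers to can be sketched as follows (illustrative only; see the linked commit for the actual implementation): the conformal map (x, y) → (u, v) = (x, y)/(x² + y²) sends circles through the origin to straight lines, so circular tracks from the beamline become lines (peaks in a histogram of the line parameters) in (u, v).

```python
def conformal_map(x, y):
    """Conformal XY transform: circles through the origin -> lines."""
    r2 = x * x + y * y
    return x / r2, y / r2
```

For example, every point on the circle of radius 1 centered at (1, 0), which passes through the origin, maps onto the vertical line u = 1/2.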

Add parameter
Here is an issue requesting a new parameter for the algorithm: EIC/juggler#43
demonstrate how to do this...
Use new parameter
New issue in reconstruction_benchmarks: EIC/benchmarks/reconstruction_benchmarks#63 (closed)
We essentially need to add a single line here: https://eicweb.phy.anl.gov/EIC/benchmarks/reconstruction_benchmarks/-/blob/master/benchmarks/track_finding/options/track_reconstruction.py#L133
demonstrate how to add this
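A hedged sketch of what such a line might look like in a Gaudi-style options file (the algorithm instantiation mirrors the `ConformalXYPeakProtoTracks` example above, but the property and I/O names here are illustrative, not the actual ones from the merge request):

```python
# Hypothetical excerpt from a juggler options file: instantiate the
# algorithm and set the newly added property on it.
from Configurables import ConformalXYPeakProtoTracks

proto_tracks = ConformalXYPeakProtoTracks(
    "proto_tracks",
    inputTrackerHits="trackerHits",      # illustrative collection name
    outputProtoTracks="protoTracks",     # illustrative collection name
    nPhiBins=100,                        # the new, configurable bin count
)
```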
Looking closer
The options file above is triggered by the CI via:
The last script is worth looking at more closely, as it does everything:
- First, it generates some events with a script.
- Then it runs Geant4.
- Then it runs juggler.
Note that if you were doing local development, you would run the script from the top level directory.
bash benchmarks/track_finding/multiple_tracks.sh
"Options" files
The curious reader will have already looked at the options file and found that it orchestrates all the algorithms.
Note that, in the future, we anticipate moving away from "options" files in favor of a pure Python solution. The files will look essentially the same aside from some initial boilerplate.
Comparison with 100 and 90 bins

