This page provides information about the use cases that the ESFRI communities would like to demonstrate in DAC21. Each use case contains a detailed plan of the operations the community would like to demonstrate, written up more or less as a single document. These plans describe which Rucio instance(s) will be used, which storage hardware is planned, and which data will be transferred.
The plans also cover WP5 activity, particularly where running analysis code through WP5 has QoS implications. Each document also lists the resources needed to achieve the demonstration; for example, storage hardware, Rucio instances, and compute resources. Any requirements on these resources are also documented.
This would also explore how caching might be used at "heterogeneous resources", for example at HPC centres.
The table below summarizes the different QoS labels available in the ESCAPE testbed; a sketch of how a label can be targeted with a Rucio replication rule follows the table.
| QoS Label | Description | List of RSEs |
| --- | --- | --- |
| SAFE | Only for tape RSEs; useful for long-term archiving of data. | DESY-DCACHE-TAPE, PIC-DCACHE-TAPE, SARA-DCACHE-TAPE, CNAF-STORM-TAPE |
| FAST | For computation that requires reduced CPU usage and so is more likely to be IO-bound; most likely streaming access. (At the moment this is just a label.) | LAPP-WEBDAV, PIC-DCACHE |
| CHEAP-ANALYSIS | Data that has not been used for some time; most likely random IO. (At the moment this is just a label.) | FAIR-ROOT, ALPAMED-DPM, INFN-NA-DPM, IN2P3-CC-DCACHE, SARA-DCACHE, DESY-DCACHE, INFN-ROMA1 |
| OPPORTUNISTIC | A file that is unlikely to be of interest; for example, a log file for a job whose output has been validated. (At the moment this is just a label.) | LAPP-DCACHE, PIC-INJECT, EULAKE-1, CNAF-STORM, GSI-ROOT |
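As an illustration of how these labels could be exercised, the sketch below creates a Rucio replication rule against an RSE expression that selects a QoS class. It assumes the testbed RSEs carry a QOS attribute matching the labels above; the scope, dataset name and lifetime are hypothetical placeholders, so this is a minimal sketch rather than part of any community's plan.

```python
# Minimal sketch: request one replica of a dataset on any RSE advertising
# the SAFE QoS class. The scope and dataset name are hypothetical, and the
# "QOS=SAFE" expression assumes the RSEs are tagged with a QOS attribute.
from rucio.client.ruleclient import RuleClient

rule_client = RuleClient()

rule_ids = rule_client.add_replication_rule(
    dids=[{"scope": "testing", "name": "dac21_example_dataset"}],  # hypothetical DID
    copies=1,                      # a single replica is enough for the illustration
    rse_expression="QOS=SAFE",     # any RSE whose QOS attribute is SAFE
    lifetime=7 * 24 * 3600,        # let the rule expire after a week
    comment="DAC21 QoS illustration",
)
print(rule_ids)  # list containing the ID of the newly created rule
```

Changing `rse_expression` to, for example, `"QOS=FAST"` or `"QOS=CHEAP-ANALYSIS"` would steer the replica towards the corresponding group of RSEs instead.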
These docs are in progress and subject to change.
CTA
The CTA Google Doc can be found here.
The following tables summarize the different QoS requirements; a sketch of how one long-haul ingestion and replication step might be driven through the Rucio client follows the tables.
QoS ... Requirements:
| Workflow | Capacity | Avg. file size | Number of files |
| --- | --- | --- | --- |
| Long haul ingestion and replication (Test 1) | 300 GB | 1.5 GB | 190 |
| Long haul ingestion and replication (Test 2) | 300 GB | 1.5 GB | 190 |
| Long haul ingestion and replication (Test 3) | 10 TB | 1.5 GB | 1000 |
| Long haul ingestion and replication (Test 4) | 40 TB | 1.5 GB | 32000 |
| Long haul ingestion and replication (Test 5) | 10 TB | 1.5 GB | 8000 |
| Long haul ingestion and replication (Test 6) | 10 TB | 1.5 GB | 8000 |
QoS ... Requirements:
| Workflow | Capacity | Avg. file size | Number of files |
| --- | --- | --- | --- |
| Data Reprocessing | 100 TB | 2 GB | |
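For orientation, the sketch below shows how a single long-haul ingestion and replication step could be driven through the Rucio Python client: files are uploaded into a dataset on a source RSE, and a rule then asks for a replica of that dataset under a second QoS class. The scope, dataset name, source RSE, destination expression and local path are placeholder assumptions, not the values CTA will actually use.

```python
# Sketch of one ingestion-and-replication step, assuming a Rucio client
# configured for the ESCAPE testbed. Scope, dataset, RSE names and the
# local path are placeholders, not CTA production values.
import glob

from rucio.client.ruleclient import RuleClient
from rucio.client.uploadclient import UploadClient

SCOPE = "cta_test"                 # hypothetical scope
DATASET = "longhaul_test1"         # hypothetical dataset name
SOURCE_RSE = "CNAF-STORM"          # arbitrary choice of source RSE from the testbed
DEST_EXPRESSION = "QOS=SAFE"       # replicate to a tape-backed QoS class

# Upload each local file into the dataset on the source RSE.
items = [
    {
        "path": path,
        "rse": SOURCE_RSE,
        "did_scope": SCOPE,
        "dataset_scope": SCOPE,
        "dataset_name": DATASET,
    }
    for path in glob.glob("/data/dac21/test1/*")  # placeholder local path
]
UploadClient().upload(items)

# Ask Rucio for one additional replica of the whole dataset at the
# destination; the transfers then run asynchronously.
RuleClient().add_replication_rule(
    dids=[{"scope": SCOPE, "name": DATASET}],
    copies=1,
    rse_expression=DEST_EXPRESSION,
)
```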
FAIR
The FAIR document can be found [here].
The following tables summarize different QoS requirements.
QoS SAFE Requirements:
| Workflow | Capacity | Avg. file size | Number of files |
| --- | --- | --- | --- |
| Ingestion of R3B Monte-Carlo simulated data | 1.5 TB | ~ 5 GB | ~ 300 |
| Ingestion of raw data for PANDA | ~ 1 TB | 1 GB | 1000 |
QoS OPPORTUNISTIC Requirements:
| Workflow | Capacity | Avg. file size | Number of files |
| --- | --- | --- | --- |
| R3B data analysis | 100 GB | | |
| Reconstruction of raw PANDA data | ~ 1 TB | | |
QoS FAST Requirements:
| Workflow | Capacity | Avg. file size | Number of files |
| --- | --- | --- | --- |
| Ingestion of raw data for PANDA | ~ 1 TB | 1 GB | 1000 |
LOFAR
The LOFAR Google Doc can be found here.
The following tables summarize different QoS requirements.
QoS ... Requirements:
| Workflow | Capacity | Avg. file size | Number of files |
| --- | --- | --- | --- |
| Long haul ingestion and replication | ~ 15 TB | | |
QoS ... Requirements:
| Workflow | Capacity | Avg. file size | Number of files |
| --- | --- | --- | --- |
| Data Processing | ~ 15 TB | | |
ATLAS
The ATLAS Google Doc can be found here and in Redmine.
The following tables summarize different QoS requirements.
QoS SAFE Requirements:
| Workflow | Capacity | Avg. file size | Number of files |
| --- | --- | --- | --- |
| ATLAS open data replication, augmentation, bookkeeping and validation | 300 GB | ~ 2.5 GB | |
QoS CHEAP-ANALYSIS Requirements:
| Workflow | Capacity | Avg. file size | Number of files |
| --- | --- | --- | --- |
| ATLAS open data replication, augmentation, bookkeeping and validation | 6 TB | ~ 2.5 GB | |
| ATLAS user analysis pipeline tests on experimental particle physics using augmented open data | 6 TB | ~ 2.5 GB | |
QoS FAST Requirements:
| Workflow | Capacity | Avg. file size | Number of files |
| --- | --- | --- | --- |
| ATLAS user analysis pipeline tests on experimental particle physics using augmented open data | ~ 500 MB | | |
SKAO
The SKAO Google Doc can be found here.
CMS
The CMS use cases are described in the CMS use case wiki.
KM3NeT
The description of the KM3NeT Use Cases can be found here.
The following tables summarize the different QoS requirements; a short consistency check of the capacity figures follows the tables.
QoS SAFE Requirements:
| Workflow | Use Case ID | Capacity | Avg. file size | Number of files |
| --- | --- | --- | --- | --- |
| Ingestion from the shore station and replication | KM3NET001 | 60 GB | ~ 3 GB | 20 |
QoS CHEAP-ANALYSIS Requirements:
| Workflow | Use Case ID | Capacity | Avg. file size | Number of files |
| --- | --- | --- | --- | --- |
| ORCA data conversion | KM3NET003 | 90 GB | ~ 1.5 GB | ~ 60 |
QoS FAST Requirements:
| Workflow | Use Case ID | Capacity | Avg. file size | Number of files |
| --- | --- | --- | --- | --- |
| Ingestion from the shore station and replication | KM3NET001 | 120 GB | ~ 3 GB | 40 |
| PMT calibration | KM3NET002 | 60 GB | ~ 3 GB | 20 |
| ORCA data conversion | KM3NET003 | 3 GB | ~ 3 GB | 1 |
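For the KM3NeT workflows, the capacity column is simply the average file size multiplied by the number of files, which gives a quick sanity check of the figures. The snippet below reproduces the capacities in the FAST table from the other two columns (values in GB).

```python
# Consistency check: capacity ~= average file size x number of files.
# Figures are taken from the KM3NeT FAST table above (sizes in GB).
rows = [
    ("Ingestion from the shore station and replication", 3.0, 40),  # expect 120 GB
    ("PMT calibration", 3.0, 20),                                    # expect 60 GB
    ("ORCA data conversion", 3.0, 1),                                # expect 3 GB
]

for workflow, avg_file_size_gb, n_files in rows:
    print(f"{workflow}: {avg_file_size_gb * n_files:.0f} GB")
```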