| Code/KPI | WP | Description | Methodology | Estimated Targets | Q5 | Q6 | Q7 | Q8 | P2 status |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| KSA2.1 - Services Reliability | SA2 | % uptime dependent only on the SA2 services | Participating sites' monitoring tools | 99% | | | | | |
| KSA2.1.1 - ETICS | | | | 99% | 98.7% | 96.9% | 97.3% | 98.1% | 97.75% |
| KSA2.1.2 - CERN Testbed | | | | 99% | 99.8% | 100.0% | 100.0% | 99.8% | 99.9% |
| KSA2.1.3 - KOSICE Testbed | | | | 99% | 97.0% | 100.0% | 100.0% | 100.0% | 99.2% |
| KSA2.1.4 - INFN Testbed | | | | 99% | 99.3% | 100.0% | 100.0% | 100.0% | 99.8% |
| KSA2.1.5 - JUELICH Testbed | | | | 99% | 100.0% | 100.0% | 100.0% | 100.0% | 100.0% |
| KSA2.1.6 - CESNET Testbed | | | | 99% | 100.0% | 96.5% | 99.9% | 99.8% | 99.0% |
| KSA2.2 - Services Availability | SA2 | Total % uptime including the underlying suppliers | Participating sites' monitoring tools | 97% | | | | | |
| KSA2.2.1 - ETICS | | | | 97% | 98.0% | 95.8% | 96.8% | 97.9% | 97.1% |
| KSA2.2.2 - CERN Testbed | | | | 97% | 99.8% | 96.0% | 98.8% | 99.2% | 98.4% |
| KSA2.2.3 - KOSICE Testbed | | | | 97% | 97.0% | 100.0% | 99.9% | 100.0% | 99.2% |
| KSA2.2.4 - INFN Testbed | | | | 97% | 99.3% | 96.7% | 98.6% | 99.2% | 98.45% |
| KSA2.2.5 - JUELICH Testbed | | | | 97% | 100.0% | 99.5% | 100.0% | 100.0% | 100.0% |
| KSA2.2.6 - CESNET Testbed | | | | 97% | 100.0% | 100.0% | 99.6% | 99.0% | 99.7% |
| KSA2.3 - Distributed Testbed Size | SA2 | Number of CPUs available for distributed testing through collaborations with external providers | Participating sites' monitoring tools (1 CPU = 1 Virtual Machine) | Year 1: 50 CPUs; Year 2: 200 CPUs; Year 3: 500 CPUs | 97 | 112 | 154 | 204 | 204 |
| KSA2.4 - Number of key process assessments | SA2 | A process assessment is a periodic exercise to evaluate the efficiency of a process and identify weaknesses and areas for improvement | Periodic reports | One per year for the major processes (Release, Change, Problem); results to be reported in the QA report submitted at the end of every year | N/A | N/A | N/A | N/A | 4 QA processes assessed: release, change, packaging, testing |
| KSA2.5 - Number of weaknesses detected and addressed (related to the assessments) | SA2 | A measure of how many of the weaknesses identified in the periodic assessments are addressed, and their impact on the process efficiency | Periodic reports | One per year for the major processes (Release, Change, Problem); results to be reported in the QA report submitted at the end of every year, covering the preceding year's assessment | N/A | N/A | N/A | N/A | Separation of the QA and QC activities; 2 major weaknesses identified and corrected: multi-platform support and EPEL/Lintian compliance |
| KSA2.6 - Number of Support Requests | SA2 | Number of user requests/tickets per quarter for the SA2 services | GGUS report or query, internal support tracker | Within the QA Plan and the agreed Operational Level Agreements with the other WPs (in working hours) | | | | | |
| KSA2.6.1 - ETICS | | | | | 33 | 36 | 47 | 46 | 162 |
| KSA2.6.2 - EMI Testbed | | | | | 14 | 8 | 11 | 5 | 38 |
| KSA2.7 - Average Support Response Time | SA2 | Average time to respond to a request/ticket: time to the first reply to the user | GGUS report or query, internal support tracker | Within the QA Plan and the agreed Operational Level Agreements with the other WPs | | | | | |
| KSA2.7.1 - ETICS | | | | | 3.22 | 2.45 | 0.94 | 0.73 | 1.68 |
| KSA2.7.2 - EMI Testbed | | | | 8.09 | 2.65 | 1.08 | 4.71 | 1.30 | 5.07 |
| KSA2.8 - Average Support Request Life Time | SA2 | Average lifetime of a request/ticket: time from opening to closure of a ticket (time needed to close tickets, categorized by ticket type) | GGUS report or query, internal support tracker | Within the QA Plan and the agreed Operational Level Agreements with the other WPs (in working hours) | | | | | |
| KSA2.8.1 - ETICS | | | | | 96.29 | 133.02 | 84.20 | 57.45 | 89.97 |
| KSA2.8.2 - EMI Testbed | | | | | 45.55 | 46.58 | 18.55 | 39.54 | 37.16 |
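
The table does not state how the "P2 status" column is derived from the quarterly figures. The reported values are consistent with a simple average for the reliability and availability percentages (KSA2.1.x, KSA2.2.x), a sum for the request counts (KSA2.6.x), and a request-count-weighted average for the response and lifetime values (KSA2.7.x, KSA2.8.x). The sketch below is an assumption-based illustration of that reading, not documented methodology; the helper names are illustrative and all input numbers come from the table above.

```python
# Sketch (inferred, not taken from the deliverable): reproducing the "P2 status"
# column from the quarterly figures under these assumed rules:
#   - reliability/availability (KSA2.1.x, KSA2.2.x): simple mean of Q5-Q8
#   - support request counts (KSA2.6.x): sum of Q5-Q8
#   - response/lifetime in working hours (KSA2.7.x, KSA2.8.x): mean of Q5-Q8
#     weighted by the same service's quarterly request counts (KSA2.6.x)

def simple_mean(quarters):
    """Arithmetic mean of the four quarterly values."""
    return sum(quarters) / len(quarters)

def weighted_mean(quarters, weights):
    """Mean of quarterly values weighted by quarterly ticket counts."""
    return sum(q * w for q, w in zip(quarters, weights)) / sum(weights)

if __name__ == "__main__":
    # KSA2.1.1 - ETICS reliability: reported P2 status 97.75%
    print(round(simple_mean([98.7, 96.9, 97.3, 98.1]), 2))        # 97.75

    # KSA2.6.1 - ETICS support requests: reported P2 status 162
    print(sum([33, 36, 47, 46]))                                  # 162

    # KSA2.7.1 - ETICS response time, weighted by KSA2.6.1 counts: reported 1.68
    print(round(weighted_mean([3.22, 2.45, 0.94, 0.73],
                              [33, 36, 47, 46]), 2))              # 1.68

    # KSA2.8.2 - EMI Testbed lifetime, weighted by KSA2.6.2 counts: reported 37.16
    print(round(weighted_mean([45.55, 46.58, 18.55, 39.54],
                              [14, 8, 11, 5]), 2))                # 37.16
```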