EMI Key Performance Indicators

Legend:
T - Too early to measure, e.g. an EMI release is not yet available
U - Unavailable. This may be due to a number of technical issues, e.g. no agreement yet on the schema/model to use, or tool unavailability.

Metrics collection started in Q2.

Year 1

Code/KPI WP Description Methodology Estimated Targets Q1 Q2 Q3 Q4 P1 status
KNA1.1
Cost efficiency
NA1 A measure of the cost of providing software maintenance and support services in EMI Unit cost of effort per kSLOC of change or addition to the software base Should decrease compared to the initial baseline of running ARC, gLite and UNICORE as separate projects         The overall effort consumed to perform technical activities in EMI (SA1+SA2+JRA1) is 64 FTE, compared to the estimated 93 consumed by the previous projects together for equivalent activities. In terms of cost, the software and services have been provided at a total cost of about 6 M€, compared to the estimated 12.8 M€ of the previous projects combined. Since no degradation of service has been reported and all technical objectives have been largely achieved, we consider that EMI has increased the efficiency of the software engineering activities by about 30%. The measurement is rather empirical, the effect has not been uniform across all partners, and it has been observed over a limited period of time; it must therefore be further validated in the coming years before we can state whether this is a sustainable trend. (A worked sketch of this comparison follows the table.)
KNA1.2
MoUs with commercial companies
NA1 The number of formal collaborations with commercial companies in support of the EMI sustainability and exploitation plans Periodic reports Year 1: 3
Year 2: 3
Year 3: 3
      1 (in preparation) 1
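The KNA1.1 figures above can be reproduced with simple arithmetic. Below is a minimal sketch using only the numbers quoted in the P1 status (64 vs. 93 FTE, 6 vs. 12.8 M€); the quoted ~30% efficiency gain corresponds to the effort comparison.

```python
# Minimal sketch of the KNA1.1 comparison reported above. Figures are taken
# verbatim from the P1 status text; the quoted "30%" gain matches the effort ratio.

emi_effort_fte = 64        # SA1 + SA2 + JRA1 technical effort in EMI
baseline_effort_fte = 93   # estimated equivalent effort of the predecessor projects

emi_cost_meur = 6.0        # total cost of EMI software/services (M EUR)
baseline_cost_meur = 12.8  # estimated combined cost of the predecessor projects (M EUR)

effort_reduction = 1 - emi_effort_fte / baseline_effort_fte
cost_reduction = 1 - emi_cost_meur / baseline_cost_meur

print(f"Effort reduction: {effort_reduction:.0%}")  # ~31%, the quoted ~30% gain
print(f"Cost reduction:   {cost_reduction:.0%}")    # ~53%
```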

Code/KPI WP Description Methodology Estimated Targets Q1 Q2 Q3 Q4 P1 status
KNA2.1
Number and quality of events organised
NA2 Number of events organized or co-organized by EMI Follow-up metrics by means of real time online polls and other tools. 2 per year   0 0 1 1
KNA2.2
Number and quality of published material
NA2 Journal papers or articles and presentations at relevant conferences produced from EMI research activities Periodic reports 4 per year 3 11 1 2 17 (11 papers)
KNA2.3
Number and quality of training events
NA2 Number of training events organized by EMI and number of trained people Follow-up metrics by means of real time online polls and other tools. 4 per year   1 0 3 4
KNA2.4
Number of EMI products included in standard repositories, Linux distributions, etc
NA2 This is the number of EMI packages that become part of standard OS distributions like Fedora or Ubuntu Periodic reports 80% of the client components, selected services based on requirements   T T T -

Code/KPI WP Description Methodology Estimated Targets Q1 Q2 Q3 Q4 P1 status
KSA1.1
Number of incidents
SA1 Number and trends of incidents registered by the Service Desk (in total and per category) GGUS report or query The trend should follow a standard Rayleigh curve (a sketch of this model follows the table) Q1: U; Q2: 82 (graph); Q3: 115 (graph); Q4: 111 (graph); P1: graph
KSA1.2
Incident Resolution Time
SA1 Average time for resolving an incident by the 3rd-level support (possibly per category) GGUS report or query Within the SLA specifications Q1: Avg 190 days, Med 136; Q2: Avg 40 days, Med 37 (graph); Q3: Avg 55 days, Med 25 (graph); Q4: Avg 52 days, Med 36 (graph); P1: Avg 70 days, Med 31 (graph)
KSA1.3
Number of Problems
SA1 Number and trends of problems (defects) submitted in the Defect Tracker(s) (in total and per category) as absolute value and as density over kSLOC Defect Tracker report or query The trend should follow a standard Rayleigh curve   U U U graph
KSA1.4
Number of Urgent Changes
SA1 Number of changes (defects or enhancements) with priority Immediate Defect Tracker report or query A precise target cannot be estimated, but too frequent Immediate changes are a symptom of poor Quality Control. It is tentatively set at < 1 per month   U U U -
KSA1.5
Change Application Time
SA1 Average time, from incident submission to release, for applying a change (possibly per category and priority) Tracker report or query Within SLA specifications   U U U -
KSA1.6
Number of Releases
SA1 Number of releases grouped into Major, Minor, Revision and Emergency Periodic report by the Release Manager According to Release Plan   T T 1 1
KSA1.7
Number of Release Rollbacks
SA1 Number of releases which had to be reversed (rolledback) Periodic report by the Release Manager < 4 releases per year   T T T -
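KSA1.1 and KSA1.3 state that incident and defect arrivals should follow a standard Rayleigh curve, the shape commonly used in Putnam-style software-reliability models. The sketch below shows how expected per-quarter arrivals would be derived from that curve; the total volume and peak quarter are purely hypothetical parameters, not EMI data.

```python
import math

def rayleigh_cdf(t, sigma):
    """Cumulative fraction of defects expected by time t under a Rayleigh model."""
    return 1.0 - math.exp(-t * t / (2.0 * sigma * sigma))

def expected_arrivals(total_defects, sigma, quarters):
    """Expected defect/incident arrivals per quarter if the trend is Rayleigh-shaped."""
    return [
        total_defects * (rayleigh_cdf(q, sigma) - rayleigh_cdf(q - 1, sigma))
        for q in range(1, quarters + 1)
    ]

# Hypothetical, illustrative parameters: 600 total incidents, arrival rate
# peaking around quarter 4 (the peak occurs at t = sigma).
print(expected_arrivals(total_defects=600, sigma=4, quarters=8))
```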

Code/KPI WP Description Methodology Estimated Targets Q1 Q2 Q3 Q4 P1 status
KSA2.1 - Services Reliability SA2 % uptime dependent only on the SA2 services Participating sites monitoring tools 99% - - - - -

KSA2.1.1 - ETICS       99%   100% 98% 100% 99.2%
KSA2.1.2 - CERN Testbed       99%   99.5% (21 hosts) 100% 100% 100%
KSA2.1.3 - KOSICE Testbed       99%   100% (9 hosts) 100% 99% 100%
KSA2.1.4 - INFN Testbed       99%   100% (8 hosts) 100% 99% 100%
KSA2.1.5 - JUELICH Testbed       99%   100% (4 hosts) 100% 99% 100%
KSA2.1.6 - CESNET Testbed       99%   100% (1 host) 100% 100% 100%
KSA2.2 - Services Availability SA2 Total % uptime including the underlying suppliers Participating sites monitoring tools 97% - - - - -
KSA2.2.1 - ETICS       97%   99.7% (21 hosts) 96% 99% 98%
KSA2.2.2 - CERN Testbed       97%   99.5% (9 hosts) 97% 100% 99%
KSA2.2.3 - KOSICE Testbed       97%   90% (9 hosts) 100% 99% 96%
KSA2.2.4 - INFN Testbed       97%   100% (8 hosts) 100% 99% 100%
KSA2.2.5 - JUELICH Testbed       97%   100% (4 hosts) 99% 99% 99%
KSA2.2.6 - CESNET Testbed       97%   100% (1 host) 100% 100% 100%
KSA2.3 - Distributed Testbed Size SA2 Number of CPUs available for distributed testing through collaborations with external providers Participating sites monitoring tools
1 CPU = 1 Virtual Machine
Year 1: 50 CPUs
Year 2: 200 CPUs
Year 3: 500 CPUs
  70 60 60 73
KSA2.4 - Number of key process assessments SA2 A process assessment is a periodic exercise to evaluate the efficiency of a process and identify weaknesses and areas for improvements Periodic reports One per year for the major processes (Release, Change, Problem), results to be reported in the QA report to be submitted at the end of every year.   T T T -
KSA2.5 - Number of weaknesses detected and addressed: related to the assessment SA2 A measure of how many of the weaknesses identified in the periodic assessments are addressed and their impact on the process efficiency Periodic reports One per year for the major processes (Release, Change, Problem), results to be reported in the QA report to be submitted at the end of every year for the preceding year assessment.   T T T -
KSA2.6 - Number of Support Requests SA2 Number of user request/tickets per quarter for the SA2 services GGUS report or query, internal support tracker Within QA Plan and agreed Operational level Agreements with the other WPs - - - - -
KSA2.6.1 - ETICS           39 27 75 47
KSA2.6.2 - EMI Testbed           T 8 22 15
KSA2.7 - Average Support Response Time SA2 Average time to respond to a request/ticket: time to the first reply to the user GGUS report or query, internal support tracker Within QA Plan and agreed Operational level Agreements with the other WPs - - - - -
KSA2.7.1 - ETICS           7.7 working hours (71.7 in Aug) 5.2 working hours 5.9 working hours 6.3
KSA2.7.2 - EMI Testbed           U 6 working hours 4.3 working hours 5.2
KSA2.8 - Average Support Request Life Time SA2 Average life time of a request/ticket: time from start to end of a ticket (to see time needed to close the tickets, categorized by tickets types) GGUS report or query, internal support tracker Within QA Plan and agreed Operational level Agreements with the other WPs - - - - -
KSA2.8.1 - ETICS           16.7 working hours (145.7 in Aug) 10.8 working hours 37.6 working hours 21.2
KSA2.8.2 - EMI Testbed           U 11.2 working hours 29.5 working hours 20.4
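KSA2.1 (reliability: uptime attributable only to the SA2 services) and KSA2.2 (availability: total uptime including the underlying suppliers) are both uptime percentages that differ only in which outages are counted. The exact formulas used by the participating sites' monitoring tools are not given here; the following is one plausible sketch, assuming each downtime interval has been classified as caused by the SA2 service itself or by an underlying supplier (network, power, hosting site).

```python
# Sketch of one plausible way to compute KSA2.1/KSA2.2; the real monitoring
# formulas may differ. Durations are in hours.

def availability(total_hours, downtime_hours):
    """KSA2.2-style total % uptime, counting supplier-caused outages as well."""
    return 100.0 * (total_hours - sum(downtime_hours)) / total_hours

def reliability(total_hours, service_downtime_hours, supplier_downtime_hours):
    """KSA2.1-style % uptime counting only outages attributable to the SA2 service."""
    in_scope = total_hours - sum(supplier_downtime_hours)
    return 100.0 * (in_scope - sum(service_downtime_hours)) / in_scope

# Illustrative quarter: 2184 hours, 4 h of service faults, 18 h of supplier outages.
print(round(reliability(2184, [4], [18]), 1))  # ~99.8%
print(round(availability(2184, [4, 18]), 1))   # ~99.0%
```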

Code/KPI WP Description Methodology Estimated Targets Q1 Q2 Q3 Q4 P1 status
KJRA1.1
Number of Adopted Open Standard Interfaces
JRA1 This metric provides a measurable indicator of whether the EMI product suite continuously adopts (emerging) open standards, thus achieving increasing standard compliance throughout the delivered products. It will thus indicate the adoption rate of the EMI product suite In general this metric should have one overall numeric value that increases during the course of the project for each standard of an EMI product. Each standard-based interface per product will be summarized, enabling a thorough evaluation of the number of adopted open standards for the whole EMI project in general and for each product in particular. Estimated targets will be defined in a matrix notation along with the standardization roadmap and its updates   43 (matrix) 47 (matrix)
KJRA1.2
Number of Interoperable Interface Usage
JRA1 This metric provides a measurable indicator of whether the EMI product suite itself can benefit from the adoption of open standards by using interoperable interfaces of products with dedicated other standard-based technologies. It will thus indicate the standard usage within the EMI product suite in general and measure the interoperable interface usage in particular. In general this metric should have an indicator as one general numeric value that increases during the course of the project. For each of the standard-based interfaces in the EMI product suite, each use of this interface should increment the value per technology. This illustrates the number of interoperable interface usages. Over time, it is expected that the number grows with the number of adopted standards. The setup of this KPI will be a matrix that defines the number of interface usages between the different EMI products. The initial target of the KPI in the matrix will be precisely defined for each relevant open standard interface per used product as part of the ‘Standardization Roadmap Document’ Estimated targets will be defined in a matrix notation along with the standardization roadmap and its updates   15 (matrix) 19 (matrix)
KJRA1.3
Number of reduced lines of code
JRA1 This metric provides a measurable indicator of whether the EMI product suite can reduce its overall lines of code in order to reduce its maintenance effort. The aim of this measure is twofold. First, it proves that the actual lines of code that have to be maintained are reduced during the course of the project. Second, it indicates code reuse and the harmonization of products, which includes avoiding duplicate developments where possible when comparing one product to another one with the same functionality (i.e. slightly increasing SLOC vs. significantly reduced SLOC). This metric can be retrieved via the ETICS build and test system and its AQCM plug-in, which in turn is able to provide the Source Lines of Code (SLOC) value for each product of the EMI product suite (a counting sketch follows this table). It is expected that the sum of all SLOC will decrease over the project runtime, even when new developments are foreseen that in turn aim to reduce duplicate functionalities and thus the overall number of SLOC. The current situation of this KPI in terms of SLOC per product will be initially defined starting with the beginning of the project and finalized once all product teams have been defined and their products are available within ETICS. > 33% (1/3) reduction over the three-year activity. The reduction can be a consequence of removing components or replacing them with commercial or community alternatives 3382920 (EMI 0 SLOC)     3327721 (EMI 1 SLOC) 3563831 (EMI 1 SLOC)
KJRA1.4
Number of reduced released products
JRA1 This metric provides a measurable indicator of whether the EMI product suite is decreasing the overall maintenance effort in terms of the number of supported products, while keeping the same functionality or reusing functionality provided by other vendors or technology providers. This metric is a numeric value that indicates the number of different products within an EMI product release. It is expected that this value will decrease during the course of the project. >= 2 products per year on average over three years   T T 58 58
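KJRA1.3 is bookkeeping over per-product SLOC counts reported by the ETICS AQCM plug-in: sum the counts per release and compare the total against the EMI 0 baseline. The sketch below uses the EMI 0 baseline and the later EMI 1 total quoted above; a real run would aggregate per-product values rather than release totals.

```python
# Sketch of the KJRA1.3 bookkeeping: per-release SLOC totals are compared to
# the EMI 0 baseline. The totals below are the values quoted in the table;
# normally a per-product breakdown from the ETICS AQCM plug-in would feed this dict.

sloc_per_release = {
    "EMI 0": 3_382_920,
    "EMI 1": 3_563_831,
}

baseline = sloc_per_release["EMI 0"]
for release, sloc in sloc_per_release.items():
    change = (sloc - baseline) / baseline
    print(f"{release}: {sloc:>9,} SLOC ({change:+.1%} vs EMI 0)")
```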

Year 2

Code/KPI WP Description Methodology Estimated Targets Q5 Q6 Q7 Q8 P2 status
KNA1.1
Cost efficiency
NA1 A measure of the cost of providing software maintenance and support services in EMI Unit cost of effort per kSLOC of change or addition to the software base Should decrease compared to the initial baseline of running ARC, gLite and UNICORE as separate projects calculated yearly (P2) calculated yearly (P2) calculated yearly (P2) calculated yearly (P2) The cost efficiency of the EMI 2 stack is calculated by comparing the actual cost and effort to the values predicted by the standard COCOMO model (a sketch of the basic COCOMO formula follows the table). The figures for EMI 2 are:
SLOC = 1.92 (-20% compared to EMI 1)
Effort (p/y): 47 (EMI 2), 57 (COCOMO)
Cost: 5.3 (EMI 2), 7.7 (COCOMO)
The EMI 2 figures are closer to the COCOMO figures than in EMI 1, which is a good indicator of the improvement in the application of software engineering processes.
KNA1.2
MoUs with commercial companies
NA1 The number of formal collaborations with commercial companies in support of the EMI sustainability and exploitation plans Periodic reports Year 1: 3
Year 2: 3
Year 3: 3
1 (in preparation) 1 (in preparation) 1 (in preparation) 1 (in preparation) During Y2 no MoUs were signed. However, the close collaboration with DCore has allowed a number of EMI partners to enter into specific collaborations. An EMI-DCore MoU is also under preparation.
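The Year 2 KNA1.1 status compares actual EMI 2 effort and cost with "the standard COCOMO model" without stating which COCOMO variant or parameters were used. For reference, here is a sketch of the basic organic-mode COCOMO effort formula, E = 2.4 · kSLOC^1.05 person-months; the kSLOC value below is hypothetical, and the actual EMI calculation very likely used a maintenance/changed-code variant with its own cost drivers.

```python
# Basic (organic-mode) COCOMO effort estimate: E = a * KSLOC**b person-months,
# with the classic coefficients a = 2.4, b = 1.05. This only illustrates the
# kind of model referenced by KNA1.1; the SLOC scope and parameters actually
# used for the EMI 2 comparison are not given in the table.

def cocomo_basic_effort(ksloc, a=2.4, b=1.05):
    """Estimated effort in person-months for `ksloc` thousand lines of code."""
    return a * ksloc ** b

person_months = cocomo_basic_effort(ksloc=150)   # hypothetical 150 kSLOC of change
person_years = person_months / 12
print(f"{person_months:.0f} person-months (~{person_years:.0f} person-years)")
```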

Code/KPI WP Description Methodology Estimated Targets Q5 Q6 Q7 Q8 P2 status
KNA2.1
Number and quality of events organised
NA2 Number of events organized or co-organized by EMI Follow-up metrics by means of real time online polls and other tools. 2 per year 0 0 0 1 1
KNA2.2
Number and quality of published material
NA2 Journal papers or articles and presentations at relevant conferences produced from EMI research activities Periodic reports 4 per year 7 25 4 27 (1) 63 (1)
KNA2.3
Number and quality of training events
NA2 Number of training events organized by EMI and number of trained people Follow-up metrics by means of real time online polls and other tools. 4 per year 1 2 2 2 7

Code/KPI WP Description Methodology Estimated Targets Q5 Q6 Q7 Q8 P2 status
KNA3.1
Number of EMI products included in standard repositories, Linux distributions, etc.
NA3 This is the number of EMI packages that become part of standard OS distributions like Fedora or Ubuntu Periodic reports 80% of the client components, selected services based on requirements - - - - -
KNA3.2
New Contacts
NA3 This is the number of new contacts that have been evaluated to be a user community or open source foundation contributor Each contact we interact with will be listed, interviewed, documented, and evaluated At least 100 at the end of the project - 5 20 30 30
KNA3.3
Open Standard Specification Contributions
NA3 This is the number of open standard contributions to specific specifications The number of specifications with substantial EMI input is counted At least 3 at the end of the project - - (2) (3) (3)
KNA3.4
EMI Use Cases
NA3 This is the number of documented use cases that rely on EMI Products The number of use cases that rely on one or more EMI products will be documented and counted At least 20 at the end of the project - 1 2 4 4

Code/KPI WP Description Methodology Estimated Targets Q5 Q6 Q7 Q8 P2 status
KSA1.1
Number of incidents
SA1 Number and trends of incidents registered by the Service Desk (in total and per category) GGUS report or query The trend should follow a standard Rayleigh curve Q5: 172 (graph); Q6: 114 (graph); Q7: 172 (graph); Q8: 134 (graph); P2: 626 (graph)
KSA1.2
Incident Resolution Time
SA1 Average time for resolving an incident by the 3rd-level support (possibly per category) GGUS report or query Within the SLA specifications Q5: Avg 57.6 days, Med 53.5 (graph); Q6: Avg 65.4 days, Med 60.2 (graph); Q7: Avg 110.1 days, Med 104.5 (graph); Q8: Avg 94.8 days, Med 87.8 (graph); P2: Avg 84.08 days, Med N/A (graph)
KSA1.3
Number of Problems
SA1 Number and trends of problems (defects) submitted in the Defect Tracker(s) (in total and per category) as absolute value and as density over kSLOC Defect Tracker report or query The trend should follow a standard Rayleigh curve Q5: 387 (graph); Q6: 385 (graph); Q7: 185 (graph); Q8: 229 (graph); P2: 1383 (graph)
KSA1.4
Number of Urgent Changes
SA1 Number of changes (defects or enhancements) with priority Immediate Defect Tracker report or query A precise target cannot be estimated, but too frequent Immediate changes are a symptom of poor Quality Control. It is tentatively set at < 1 per month. Q5: 5 (graph); Q6: 5 (graph); Q7: 5 (graph); Q8: 1 (graph); P2: 29 (graph)
KSA1.5
Change Application Time
SA1 Average time, from incident submission to release, for applying a change (possibly per category and priority) Tracker report or query Within SLA specifications U graph graph graph graph
KSA1.6
Number of Releases
SA1 Number of releases grouped into Major, Minor, Revision and Emergency Periodic report by the Release Manager According to Release Plan graph graph graph graph graph
KSA1.7
Number of Release Rollbacks
SA1 Number of releases which had to be reversed (rolledback) Periodic report by the Release Manager < 4 releases per year 0 0 0 0 0

Code/KPI WP Description Methodology Estimated Targets Q5 Q6 Q7 Q8 P2 status
KSA2.1 - Services Reliability SA2 % uptime dependent only on the SA2 services Participating sites monitoring tools 99%

KSA2.1.1 - ETICS       99% 98.7% 96.9% 97.3% 98.1% 97.75%
KSA2.1.2 - CERN Testbed       99% 99.8% 100.0% 100.0% 99.8% 99.9%
KSA2.1.3 - KOSICE Testbed       99% 97.0% 100% 100.0% 100.0% 99.2%
KSA2.1.4 - INFN Testbed       99% 99.3% 100.0% 100.0% 100.0% 99.8%
KSA2.1.5 - JUELICH Testbed       99% 100.0% 100.0% 100.0% 100.0% 100.0%
KSA2.1.6 - CESNET Testbed       99% 100.0% 96.5% 99.9% 99.8% 99.0%
KSA2.2 - Services Availability SA2 Total % uptime including the underlying suppliers Participating sites monitoring tools 97%          
KSA2.2.1 - ETICS       97% 98.0 % 95.8% 96.8% 97.9% 97.1%
KSA2.2.2 - CERN Testbed       97% 99.8% 96.0% 98.8% 99.2% 98.4%
KSA2.2.3 - KOSICE Testbed       97% 97.0% 100% 99.9% 100.0% 99.2%
KSA2.2.4 - INFN Testbed       97% 99.3% 96.7% 98.6% 99.2% 98.45%
KSA2.2.5 - JUELICH Testbed       97% 100.0% 99.5% 100.0% 100% 100%
KSA2.2.6 - CESNET Testbed       97% 100.0% 100.0% 99.6% 99.0% 99.7%
KSA2.3 - Distributed Testbed Size SA2 Number of CPUs available for distributed testing through collaborations with external providers Participating sites monitoring tools
1 CPU = 1 Virtual Machine
Year 1: 50 CPUs
Year 2: 200 CPUs
Year 3: 500 CPUs
97 112 154 204 204
KSA2.4 - Number of key process assessments SA2 A process assessment is a periodic exercise to evaluate the efficiency of a process and identify weaknesses and areas for improvements Periodic reports One per year for the major processes (Release, Change, Problem), results to be reported in the QA report to be submitted at the end of every year. N/A 4 QA processes assessed: release, change, packaging, testing
KSA2.5 - Number of weaknesses detected and addressed: related to the assessment SA2 A measure of how many of the weaknesses identified in the periodic assessments are addressed and their impact on the process efficiency Periodic reports One per year for the major processes (Release, Change, Problem), results to be reported in the QA report to be submitted at the end of every year for the preceding year assessment. N/A - Separation of the QA and QC activities
- 2 major weaknesses identified and corrected: multi-platform support and EPEL/Lintian compliance
KSA2.6 - Number of Support Requests SA2 Number of user request/tickets per quarter for the SA2 services GGUS report or query, internal support tracker Within QA Plan and agreed Operational level Agreements with the other WPs (number of tickets)
KSA2.6.1 - ETICS         33 36 47 46 162
KSA2.6.2 - EMI Testbed         14 8 11 5 38
KSA2.7 - Average Support Response Time SA2 Average time to respond to a request/ticket: time to the first reply to the user GGUS report or query, internal support tracker Within QA Plan and agreed Operational level Agreements with the other WPs (in Working Hours; a working-hours sketch follows this table)
KSA2.7.1 - ETICS         3.22 2.45 0.94 0.73 1.68
KSA2.7.2 - EMI Testbed       8.09 2.65 1.08 4.71 1.30 5.07
KSA2.8 - Average Support Request Life Time SA2 Average life time of a request/ticket: time from start to end of a ticket (to see time needed to close the tickets, categorized by tickets types) GGUS report or query, internal support tracker Within QA Plan and agreed Operational level Agreements with the other WPs (in Working Hours)
KSA2.8.1 - ETICS         96.29 133.02 84.20 57.45 89.97
KSA2.8.2 - EMI Testbed         45.55 46.58 18.55 39.54 37.16
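KSA2.7 and KSA2.8 are averages over ticket timestamps expressed in working hours. The exact working-hours convention of GGUS and the internal tracker is not specified in the table; the sketch below assumes a Mon-Fri, 09:00-17:00 working day and hypothetical tickets, purely to show how the two KPIs relate (response time = submission to first reply, lifetime = submission to closure).

```python
from datetime import datetime, timedelta
from statistics import mean

WORK_START, WORK_END = 9, 17  # assumed 8-hour working day, Mon-Fri

def working_hours_between(start, end, step_minutes=15):
    """Approximate working hours between two timestamps (Mon-Fri, 09:00-17:00)."""
    total = timedelta()
    t = start
    step = timedelta(minutes=step_minutes)
    while t < end:
        if t.weekday() < 5 and WORK_START <= t.hour < WORK_END:
            total += min(step, end - t)
        t += step
    return total.total_seconds() / 3600.0

# Hypothetical tickets: (submitted, first reply, closed)
tickets = [
    (datetime(2012, 5, 2, 10, 0), datetime(2012, 5, 2, 11, 30), datetime(2012, 5, 4, 15, 0)),
    (datetime(2012, 5, 7, 16, 0), datetime(2012, 5, 8, 9, 20), datetime(2012, 5, 9, 12, 0)),
]

response_times = [working_hours_between(s, r) for s, r, _ in tickets]
life_times = [working_hours_between(s, c) for s, _, c in tickets]
print(f"KSA2.7-style average response time:  {mean(response_times):.1f} working hours")
print(f"KSA2.8-style average ticket lifetime: {mean(life_times):.1f} working hours")
```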

Code/KPI WP Description Methodology Estimated Targets Q5 Q6 Q7 Q8 P2 status
KJRA1.1
Number of EMI service interfaces and libraries passing standard compliance tests
JRA1 The metric measures how many EMI service interfaces and libraries are successfully tested for standard compliance. Standard compliance is defined broadly and also includes compliance with EMI internal agreements. The number is taken by checking the available test reports generated during the quarter by the Product Teams. JRA1 aims to increase the number of successful tests by at least two per reporting quarter.     1 1 1
KJRA1.2
Number of passed inter-product tests
JRA1 This metric shows how EMI products can be used together by passing inter-product tests based on real-life use case scenarios. The number should be taken from the available test reports produced during the quarter. Each passed test is assigned a numeric value corresponding to the number of products involved in the test; the metric itself is the sum of the assigned test values (a sketch of this scoring rule follows the table). JRA1 aims to increase the metric value by at least four per reporting quarter     0 13 13
KJRA1.3
Number of EMI products implementing EMI agreements
JRA1 This metric shows how the EMI harmonization is progressing by measuring the commitment and actual work to implement EMI agreements. The number of ongoing or completed development tasks in the EMI development tracker targeting EMI agreement implementation will be counted. Based on the currently known agreements, approximately 20 by the end of the project.     16 16 16
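The KJRA1.2 scoring rule described above (each passed inter-product test contributes the number of products it exercises, and the KPI is the sum) can be written down directly. The test names and product combinations below are illustrative, not taken from the actual test reports.

```python
# Sketch of the KJRA1.2 scoring rule: each passed inter-product test
# contributes the number of EMI products it exercises; the KPI is the sum.
# Test names and product combinations are purely illustrative.

passed_tests = {
    "job-submission-chain": ["CREAM", "ARC-CE", "WMS"],    # 3 products
    "storage-access": ["dCache", "StoRM", "DPM", "FTS"],   # 4 products
    "security-handshake": ["VOMS", "ARGUS"],               # 2 products
}

kpi_value = sum(len(products) for products in passed_tests.values())
print(kpi_value)  # 9 for this illustrative set
```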

Year 3

Code/KPI WP Description Methodology Estimated Targets Q9 Q10 Q11 Q12 P3 status
KNA1.1
Cost efficiency
NA1 A measure of the cost of providing software maintenance and support services in EMI Unit cost of effort for kSLOC of change or addition to the software base Should decrease compared to the initial baseline of running ARC, gLite and UNICORE as separate projects calculated yearly (P3) calculated yearly (P3) calculated yearly (P3) calculated yearly (P3)  
KNA1.2
MoUs with commercial companies
NA1 The number of formal collaborations with commercial companies in support of the EMI sustainability and exploitation plans Periodic reports Year 1: 3
Year 2: 3
Year 3: 3
         

Code/KPI WP Description Methodology Estimated Targets Q9 Q10 Q11 Q12 P3 status
KNA2.1
Number and quality of events organised
NA2 Number of events organized or co-organized by EMI Follow-up metrics by means of real time online polls and other tools. 2 per year 0 0 2 4  
KNA2.2
Number and quality of published material
NA2 Journal papers or articles and presentations at relevant conferences produced from EMI research activities Periodic reports 4 per year 16 27 8 26  
KNA2.3
Number and quality of training events
NA2 Number of training events organized by EMI and number of trained people Follow-up metrics by means of real time online polls and other tools. 4 per year 1 2 2 2  

Code/KPI WP Description Methodology Estimated Targets Q9 Q10 Q11 Q12 P3 status
KNA3.1
Number of EMI products included in standard repositories, Linux distributions, etc.
NA3 This is the number of EMI packages that become part of standard OS distributions like Fedora or Ubuntu Periodic reports 80% of the client components, selected services based on requirements - - - -  
KNA3.2
New Contacts
NA3 This is the number of new contacts that have been evaluated to be a user community or open source foundation contributor Each contact we interact with will be listed, interviewed, documented, and evaluated At least 100 at the end of the project 50 70 90 105  
KNA3.3
Open Standard Specification Contributions
NA3 This is the number of open standard contributions to specific specifications The number of specifications with substantial EMI input is counted At least 3 at the end of the project 2 3 3 3  
KNA3.4
EMI Use Cases
NA3 This is the number of documented use cases that rely on EMI Products The number of use cases that rely on one or more EMI products will be documented and counted At least 20 at the end of the project 10 15 18 22  

Code/KPI WP Description Methodology Estimated Targets Q9 Q10 Q11 Q12 P3 status
KSA1.1
Number of incidents
SA1 Number and trends of incidents registered by the Service Desk (in total and per category) GGUS report or query The trend should follow a standard Rayleigh curve 93 99 125 109 426
KSA1.2
Incident Resolution Time
SA1 Average time for resolving an incident by the 3rd-level support (possibly per category) GGUS report or query Within the SLA specifications 82.87 75.97 66.39 86.1 78.27
KSA1.3
Number of Problems
SA1 Number and trends of problems (defects) submitted in the Defect Tracker(s) (in total and per category) as absolute value and as density over kSLOC Defect Tracker report or query The trend should follow a standard Rayleigh curve 239 223 205 191 858
KSA1.4
Number of Urgent Changes
SA1 Number of changes (defects or enhancements) with priority Immediate Defect Tracker report or query A precise target cannot be estimated, but too frequent Immediate changes are a symptom of poor Quality Control. It is tentatively set at < 1 per month 2 2 2 2 8
KSA1.5
Change Application Time
SA1 Average time, from incident submission to release, for applying a change (possibly per category and priority) Tracker report or query Within SLA specifications 189.2 191.7 213.3 134.7 180.68
KSA1.6
Number of Releases
SA1 Number of releases grouped into Major, Minor, Revision and Emergency Periodic report by the Release Manager According to Release Plan 10 32 29 111 182
KSA1.7
Number of Release Rollbacks
SA1 Number of releases which had to be reversed (rolledback) Periodic report by the Release Manager < 4 releases per year 0 0 0 0 0

Code/KPI WP Description Methodology Estimated Targets Q9 Q10 Q11 Q12 P3
KSA2.1 - Services Reliability SA2 % uptime dependent only on the SA2 services Participating sites monitoring tools 99%

KSA2.1.1 - ETICS       99% 99.8% 99.3% 99.3% 99.5% 99.5%
KSA2.1.2 - CERN Testbed       99% 100% 100% 100% 100% 100%
KSA2.1.3 - KOSICE Testbed       99% 99.8% 100% 100% 100% 100%
KSA2.1.4 - INFN Testbed       99% 100% 100% 100% 100% 100%
KSA2.1.5 - JUELICH Testbed       99% 100% 100% 100% 100% 100%
KSA2.1.6 - CESNET Testbed       99% 100% 100% 100% 100% 100%
KSA2.1.7 - DESY Testbed       99% 100% 100% 100% 100% 100%
KSA2.2 - Services Availability SA2 Total % uptime including the underlying suppliers Participating sites monitoring tools 97%
KSA2.2.1 - ETICS       97% 98.4% 99.2% 99.2% 99.1% 99.2%
KSA2.2.2 - CERN Testbed       97% 100% 100% 99.8% 99.5% 99.8%
KSA2.2.3 - KOSICE Testbed       97% 99.8% 100% 100% 99.7% 99.9%
KSA2.2.4 - INFN Testbed       97% 100% 100% 100% 99.7% 99.9%
KSA2.2.5 - JUELICH Testbed       97% 97.8% 99.9% 99.9% 99.3% 99.2%
KSA2.2.6 - CESNET Testbed       97% 100% 99.9% 98.9% 100% 99.7%
KSA2.2.7 - DESY Testbed       97% 99.4% 100% 99.7% 100% 99.8%
KSA2.3 - Distributed Testbed Size SA2 Number of CPUs available for distributed testing through collaborations with external providers Participating sites monitoring tools
1 CPU = 1 Virtual Machine
Year 1: 50 CPUs
Year 2: 200 CPUs
Year 3: 500 CPUs
204 225 241 245  
KSA2.4 - Number of key process assessments SA2 A process assessment is a periodic exercise to evaluate the efficiency of a process and identify weaknesses and areas for improvements Periodic reports One per year for the major processes (Release, Change, Problem), results to be reported in the QA report to be submitted at the end of every year.
KSA2.5 - Number of weaknesses detected and addressed: related to the assessment SA2 A measure of how many of the weaknesses identified in the periodic assessments are addressed and their impact on the process efficiency Periodic reports One per year for the major processes (Release, Change, Problem), results to be reported in the QA report to be submitted at the end of every year for the preceding year assessment.
KSA2.6 - Number of Support Requests SA2 Number of user request/tickets per quarter for the SA2 services GGUS report or query, internal support tracker Within QA Plan and agreed Operational level Agreements with the other WPs (number of tickets)
KSA2.6.1 - ETICS         20 9 19 8  
KSA2.6.2 - EMI Testbed         5 (now fully integrated in release process)
KSA2.7 - Average Support Response Time SA2 Average time to respond to a request/ticket: time to the first reply to the user GGUS report or query, internal support tracker Within QA Plan and agreed Operational level Agreements with the other WPs (in Working Hours)
KSA2.7.1 - ETICS         0.70 0.87 1.31 1.02  
KSA2.7.2 - EMI Testbed         4.49 (now fully integrated in release process)
KSA2.8 - Average Support Request Life Time SA2 Average life time of a request/ticket: time from start to end of a ticket (to see time needed to close the tickets, categorized by tickets types) GGUS report or query, internal support tracker Within QA Plan and agreed Operational level Agreements with the other WPs (in Working Hours)
KSA2.8.1 - ETICS         41.65 29.89 119.88 7.92  
KSA2.8.2 - EMI Testbed         12.26 (now fully integrated in release process)

Code/KPI WP Description Methodology Estimated Targets Q9 Q10 Q11 Q12 P3 status
KJRA1.1
Number of EMI service interfaces and libraries passing standard compliance tests
JRA1 The metric measures how many EMI service interfaces and libraries are successfully tested for standard compliance. Standard compliance is defined broadly and also includes compliance with EMI internal agreements. The number is taken by checking the available test reports generated during the quarter by the Product Teams. JRA1 aims to increase the number of successful tests by at least two per reporting quarter. 1 1 18 25 25
KJRA1.2
Number of passed inter-product tests
JRA1 This metric shows how EMI products can be used together by passing inter-product tests based on real-life use case scenarios. The number should be taken from the available test reports produced during the quarter. Each passed test is assigned a numeric value corresponding to the number of products involved in the test; the metric itself is the sum of the assigned test values. JRA1 aims to increase the metric value by at least four per reporting quarter 13 13 65 75 75
KJRA1.3
Number of EMI products implementing EMI agreements
JRA1 This metric shows how the EMI harmonization is progressing by measuring the commitment and actual work to implement EMI agreements. The number of ongoing or completed development tasks in the EMI development tracker targeting EMI agreement implementation will be counted. Based on the currently known agreements, approximately 20 by the end of the project. 16 16 16 21 21
