This twiki page describes the setup used as of 2016 to scan the LHCb software using the Coverity static code analyzer.
See also JIRA issue LBCORE-470.
Coverity is a commercial tool and can only be run on specific (licensed) machines.
The previous setup used by LHCb in 2011, described here, was based on a private LHCb installation of a Coverity server instance.
This is no longer needed, as the 2016 setup is based on the existing infrastructure that is maintained outside LHCb, thanks to the EP-SFT group.
This consists of a Coverity Connect server on coverity.cern.ch and a build node on buildcoverity.cern.ch.
Note that Coverity itself hosts and offers a free Coverity Scan service for open source software.
Using a custom privately maintained service, however, offers more flexibility in the build environment and in the triage of defects.
This page contains four sections, describing:
- how to prepare the Coverity Connect server, prior to Coverity scans (admin operations);
- how to prepare Coverity scans while building the software, and commit the results to the Coverity Connect server (librarian operations);
- how to interact with the Coverity Connect server to triage defects (developer operations);
- the current status of Coverity defects for the LHCb software projects.
1. Prepare the Coverity Connect server, prior to Coverity scans (admin operations)
The results of the Coverity scans for the LHCb software can be analyzed by logging in to the Coverity Connect instance on coverity.cern.ch.
Before using Coverity, your admin must have configured several server-side entities for you, including users and user groups, projects, streams, (optionally) triage stores and (optionally) component maps.
For help on how to configure the Coverity Connect server, refer to the Coverity Usage and Administration Guide (authentication required).
The following sub-sections describe the setup that was adopted for the LHCb software.
Users and user groups
Only users belonging to the ph-sft-coverity-users egroup are allowed to use the coverity.cern.ch server.
Users in this egroup are automatically imported (every night?) to the server via LDAP. They can authenticate with their CERN SSO credentials.
At the moment, around 20 LHCb accounts have been authorized to use the server by including them in this egroup.
The list includes all application managers and a few power developers.
A local group lhcb-coverity-users-local has been created on coverity.cern.ch, including all LHCb users so far existing on the server.
To simplify email communication with these users, a CERN egroup lhcb-coverity-users including the same users has also been created.
The local group on the Coverity server and the CERN egroup are kept in sync manually for the moment.
Eventually, the local group on the Coverity server may be replaced by the CERN egroup, imported via LDAP (see LBCORE-1063 - although this does not seem to be technically possible through the present interface).
For the moment, only a few users have been individually granted admin/owner privileges over the relevant LHCb project, streams and triage stores.
Unlike projects, streams and triage stores, component maps are all owned by the admin user and their ownership may not be transferred; managing component maps requires the global (not per-project) role "project admin" (see "managing custom roles").
A few "user views" have also been created in the user space of one user and have been shared with the lhcb-coverity-users-local group. This is described in more detail in section 3 below.
Projects
A single Coverity "project" LHCb has been created to hold all defects from all relevant LHCb software projects.
From a user perspective, a Coverity project provides the top-level context that can be changed with a drop-down menu.
This means, for instance, that defects from the builds of GAUDI and BOOLE will normally both appear in the same defect list (unless one selects them by stream and/or component, as described below).
Streams
Twelve separate Coverity "streams" have been created for LHCb, one for each of the LHCb software projects that are so far included in Coverity scans: GAUDI, LHCB, LBCOM, REC, BOOLE, PHYS, HLT, ANALYSIS, STRIPPING, DAVINCI, GEANT4, GAUSS.
A Coverity stream belongs to a single project.
When defects are committed to the database of the Coverity Connect server from a software build (see section 2), they are sent to a single stream; in turn, this automatically defines the project they belong to.
From a user perspective, it is possible to select and analyze only the defects committed to a given stream.
This is what has been done in the "user views" described above, created by one user and shared with all LHCb users. For instance, the LHCb-05-Boole view shows all outstanding defects in BOOLE.
Triage stores
Coverity "triage stores" describe the storage space on the Coverity Connect server where defects are stored.
One stream uses one and only one triage store, but the same triage store may be used by different streams.
Creating triage stores is optional: the Default triage store is used by all streams that have no specific triage store defined.
A single triage store has been created for all LHCb defects. It is used by all twelve LHCb streams.
Component maps
A Coverity "component map" is an optional entity allowing the categorization of defects by component within each project.
One stream uses one and only one component map, but the same component map may be used by different streams (e.g. the Default component map is used by all streams for which no dedicated component map was created).
A single component map has been created for all LHCb defects. It is used by all twelve LHCb streams.
The LHCb component map defines several components, listed below.
- One component for each of the twelve LHCb software projects: GAUDI, LHCB, LBCOM, REC, BOOLE, PHYS, HLT, ANALYSIS, STRIPPING, DAVINCI, GEANT4, GAUSS.
- Several components for defects coming from external software, "external (<pkg>)" where <pkg> is one of Boost, clhep, fastjet, gcc, HepMC, MCGenerators, Python, ROOT, system, tbb.
- A default component, Other, for defects coming from a not-yet-identified source. Currently no defects are assigned to the Other component.
The LHCb component map also defines several file rules, allowing the automatic categorization of a defect by component, depending on the name of the file where that defect was found.
The "system" component refers to headers in /usr, for instance.
Note that selecting defects by component or by stream is not the same thing.
A defect can belong to the BOOLE stream and to the GAUDI component, if it was found in a Gaudi header used by a Boole class.
2. Prepare Coverity scans and commit results to the Coverity Connect server (librarian operations)
Coverity scans for the LHCb software are prepared on a specific SLC6 node buildcoverity.cern.ch, maintained by EP-SFT, where the Coverity client software is installed.
The new infrastructure allows software builds with c++11, which could not be used on the previous EP-SFT infrastructure (before 2015).
To access buildcoverity.cern.ch and use the Coverity CLI, log in using your CERN SSO credentials:
ssh buildcoverity.cern.ch [-l username]
For the moment, only a few one-off manual scans of the full LHCb stacks have been performed, using a private account (avalassi).
Eventually, the plan is to configure automatic nightly (or weekly?) scans using LHCb service accounts (see LBCORE-1058).
The scans of the LHCb software are being performed in directory /builda/LHCb.
Extended attribute ACLs (via getfacl and setfacl) are used to make sure that all relevant users can read and write the same files there.
You can use getfacl to check the existing ACLs (note that ls -l lists a + at the end of the standard Unix permissions when extended ACLs are set).
This is the command that was last used to prepare the directory:
setfacl -b -R -m u:avalassi:rwx -m u:lhcbsoft:rwx -m u:marcocle:rwx -m g:z5:rwx -dm u:avalassi:rwx -dm u:lhcbsoft:rwx -dm u:marcocle:rwx -dm g:z5:rwx /builda/LHCb
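For example, the result can be verified as follows (a minimal sketch, using the same directory as above):
ls -ld /builda/LHCb      # a "+" after the standard permission bits indicates extended ACLs
getfacl /builda/LHCb     # list the user and group ACL entries
getfacl -d /builda/LHCb  # show only the default ACL entries (inherited by newly created files)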
The full process of preparing Coverity scans includes three separate steps, involving three different commands from the Coverity client CLI (a bare-bones sketch of the sequence is shown after the list):
- first, check out the code and build it through cov-build;
- then, analyze the build results using cov-analyze;
- finally, commit the analysis results to the Coverity Connect server using cov-commit-defects.
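In bare-bones form, the sequence looks as follows (a minimal sketch using placeholders; the actual options, paths and stream names used for LHCb appear in the wrapper scripts documented in the next subsections):
# 1. build the code under cov-build, which records all compiler invocations
cov-build --dir cov-out/<Project> <build command>
# 2. run the static analysis on the recorded build data
cov-analyze --dir cov-out/<Project> --all
# 3. send the analysis results to a stream on the Coverity Connect server
cov-commit-defects --dir cov-out/<Project> --host <server> --port <port> --user <user> --stream <stream>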
All scripts used to perform Coverity scans are stored locally on buildcoverity for the moment (and documented in this twiki).
Eventually, it may be useful to store them in a GitLab repository (see LBCORE-1059).
A full Coverity scan for the LHCb software is triggered using the following script (download-build-analyze-commit.sh), including the three steps described above. These steps are described in more detail in the following subsections:
#!/bin/bash -e
cd /builda/LHCb
date=`date`
echo "Full Coverity scan started at "$date
export CMTCONFIG=x86_64-slc6-gcc49-dbg # Geant4 now supports c++1y
unset BINARY_TAG
. /cvmfs/lhcb.cern.ch/lib/LbLogin.sh
\rm -rf LbNightlyTools
git clone https://gitlab.cern.ch/lhcb-core/LbNightlyTools.git
cd LbNightlyTools
mkdir logs
touch logs/log-download-started
. setup.sh
lbn-get-configs
lbn-checkout lhcb-gaudi-head
touch logs/log-build-started
for project in Gaudi LHCb Lbcom Rec Boole Phys Hlt Analysis Stripping DaVinci Geant4 Gauss ; do ../scripts/build.sh $project; done
touch logs/log-analyze-started
for project in Gaudi LHCb Lbcom Rec Boole Phys Hlt Analysis Stripping DaVinci Geant4 Gauss ; do ../scripts/analyze.sh $project; done
touch logs/log-commit-started
for project in Gaudi LHCb Lbcom Rec Boole Phys Hlt Analysis Stripping DaVinci Geant4 Gauss; do ../scripts/commit.sh $project; done
touch logs/log-all-done
echo "Full Coverity scan started at "$date
echo "Full Coverity scan ended at "`date`
arch=../archive/`date +%Y""%m""%d""_%H""h%""M`
mkdir -p $arch/scripts
\cp -dpr logs/* $arch/
\cp -dpr ../scripts/*sh $arch/scripts/
\mv $arch/scripts/setup_dummy.sh $arch/scripts/setup.sh
For reference, the first full scan performed using this script took approximately 10h50m
(approximately 20m for checkouts, 4h40m for builds, 2h00m for analysis, 3h50m for commits).
Full Coverity scan started at Sun Feb 28 19:15:57 CET 2016
Full Coverity scan ended at Mon Feb 29 06:03:09 CET 2016
-rw-r--r--. 1 avalassi zg 0 Feb 28 19:16 log-download-started
-rw-r--r--. 1 avalassi zg 0 Feb 28 19:31 log-build-started
drwxr-xr-x. 18 avalassi zg 4096 Feb 28 19:31 ../
-rw-r--r--. 1 avalassi zg 546 Feb 28 19:51 log-Gaudi-build.txt
-rw-r--r--. 1 avalassi zg 546 Feb 28 20:31 log-LHCb-build.txt
-rw-r--r--. 1 avalassi zg 545 Feb 28 20:40 log-Lbcom-build.txt
-rw-r--r--. 1 avalassi zg 599 Feb 28 21:22 log-Rec-build.txt
-rw-r--r--. 1 avalassi zg 602 Feb 28 21:27 log-Boole-build.txt
-rw-r--r--. 1 avalassi zg 545 Feb 28 22:08 log-Phys-build.txt
-rw-r--r--. 1 avalassi zg 541 Feb 28 22:25 log-Hlt-build.txt
-rw-r--r--. 1 avalassi zg 557 Feb 28 22:50 log-Analysis-build.txt
-rw-r--r--. 1 avalassi zg 556 Feb 28 22:58 log-Stripping-build.txt
-rw-r--r--. 1 avalassi zg 555 Feb 28 23:29 log-DaVinci-build.txt
-rw-r--r--. 1 avalassi zg 552 Feb 28 23:55 log-Geant4-build.txt
-rw-r--r--. 1 avalassi zg 606 Feb 29 00:07 log-Gauss-build.txt
-rw-r--r--. 1 avalassi zg 0 Feb 29 00:07 log-analyze-started
-rw-r--r--. 1 avalassi zg 4877 Feb 29 00:28 log-Gaudi-analyze.txt
-rw-r--r--. 1 avalassi zg 5273 Feb 29 00:59 log-LHCb-analyze.txt
-rw-r--r--. 1 avalassi zg 3288 Feb 29 01:02 log-Lbcom-analyze.txt
-rw-r--r--. 1 avalassi zg 4338 Feb 29 01:19 log-Rec-analyze.txt
-rw-r--r--. 1 avalassi zg 3128 Feb 29 01:21 log-Boole-analyze.txt
-rw-r--r--. 1 avalassi zg 4252 Feb 29 01:32 log-Phys-analyze.txt
-rw-r--r--. 1 avalassi zg 3853 Feb 29 01:39 log-Hlt-analyze.txt
-rw-r--r--. 1 avalassi zg 3645 Feb 29 01:45 log-Analysis-analyze.txt
-rw-r--r--. 1 avalassi zg 3167 Feb 29 01:47 log-Stripping-analyze.txt
-rw-r--r--. 1 avalassi zg 2931 Feb 29 01:53 log-DaVinci-analyze.txt
-rw-r--r--. 1 avalassi zg 4078 Feb 29 02:06 log-Geant4-analyze.txt
-rw-r--r--. 1 avalassi zg 4337 Feb 29 02:10 log-Gauss-analyze.txt
-rw-r--r--. 1 avalassi zg 0 Feb 29 02:10 log-commit-started
-rw-r--r--. 1 avalassi zg 2550 Feb 29 02:30 log-Gaudi-commit.txt
-rw-r--r--. 1 avalassi zg 2548 Feb 29 02:51 log-LHCb-commit.txt
-rw-r--r--. 1 avalassi zg 2262 Feb 29 03:02 log-Lbcom-commit.txt
-rw-r--r--. 1 avalassi zg 2258 Feb 29 03:13 log-Rec-commit.txt
-rw-r--r--. 1 avalassi zg 2259 Feb 29 03:16 log-Boole-commit.txt
-rw-r--r--. 1 avalassi zg 2257 Feb 29 03:22 log-Phys-commit.txt
-rw-r--r--. 1 avalassi zg 2254 Feb 29 03:28 log-Hlt-commit.txt
-rw-r--r--. 1 avalassi zg 2267 Feb 29 03:33 log-Analysis-commit.txt
-rw-r--r--. 1 avalassi zg 2102 Feb 29 05:40 log-Stripping-commit.txt
-rw-r--r--. 1 avalassi zg 2266 Feb 29 05:49 log-DaVinci-commit.txt
-rw-r--r--. 1 avalassi zg 2091 Feb 29 05:56 log-Geant4-commit.txt
-rw-r--r--. 1 avalassi zg 0 Feb 29 06:03 log-all-done
drwxr-xr-x. 2 avalassi zg 4096 Feb 29 06:03 ./
-rw-r--r--. 1 avalassi zg 2088 Feb 29 06:03 log-Gauss-commit.txt
Download and build the software (cov-build)
As described in the script above, LHCb software builds for Coverity scans are executed within the LbNightlyTools framework, the same framework that is used for the LHCb nightly builds.
The x86_64-slc6-gcc49-dbg platform is chosen because gcc 4.9 is the most recent compiler supported by LHCb on SLC6 platforms.
To avoid internal inconsistencies, all LHCb software projects are downloaded in one go: first they are checked out from SVN into tmp/checkout, then tar.bz2 archives are created in artifacts.
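As a quick sanity check after this step, one can verify that both directories have been populated (a minimal sketch, run from the LbNightlyTools directory as in the scripts below):
ls tmp/checkout                   # checked-out project sources
find artifacts -name "*.tar.bz2"  # source archives created from the checkout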
The code is then built using the build.sh script, a wrapper over the lbn-build command, invoked through cov-build:
#!/bin/bash -e
dir=`basename $PWD`
if [ "$dir" != "LbNightlyTools" ]; then
  echo "ERROR! Please run this script from LbNightlyTools"; exit 1
fi
if [ "$1" == "" ] || [ "$2" != "" ]; then
  echo "Usage: $0 <project>"
  echo "Example: $0 Gaudi"
  exit 1
fi
proj=$1
mkdir -p cov-out
\rm -rf cov-out/$proj
mkdir -p logs
out=logs/log-${proj}-build.txt
\rm -rf $out
if [ "$proj" == "Gaudi" ]; then
  nounpack=
else
  nounpack="--no-unpack"
fi
echo "======================================================================"
# See http://stackoverflow.com/questions/3173131
echo "Output copied to logfile $out"
exec > >(tee -a $out)
exec 2> >(tee -a $out >&2)
date=`date`
echo "Build of $proj started at "$date
time /coverity/cov-analysis-linux64-7.6.1/bin/cov-build --dir cov-out/$proj \
  lbn-build $nounpack -j32 --projects $proj lhcb-gaudi-head
echo "Build of $proj started at "$date
echo "Build of $proj ended at "`date`
The lbn-build command unpacks all tar.bz2 archives into the build directory. To avoid unpacking more than once, all builds after the first one (that of Gaudi) are started with the --no-unpack option.
The output of the cov-build command is stored in per-project subdirectories of cov-out.
Note that ccache should NOT be enabled in the build procedure for Coverity scans.
If ccache is used and a source code file is not rebuilt because it is already cached, Coverity will handle this as if the source code had been fully removed or, equivalently, as if any defects in that file had been fixed.
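A simple pre-flight check before launching cov-build could look as follows (a minimal sketch, assuming ccache would only be picked up through the PATH, which may not cover all configurations):
if command -v ccache >/dev/null 2>&1; then
  echo "WARNING: ccache found on PATH - make sure it is not used in the Coverity build"
fi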
Note also that you will need an AFS token to execute the above script, otherwise you will see several errors at various points.
This issue should be kept in mind when implementing automatic nightly Coverity scans (see LBCORE-1058); one possibility could be to make sure that the script above does not require AFS tokens.
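For an interactive run, a valid AFS token can be obtained and checked as follows (a minimal sketch; replace username with your CERN account):
kinit username@CERN.CH   # obtain a Kerberos ticket
aklog                    # derive an AFS token from the Kerberos ticket
tokens                   # verify that a valid AFS token is present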
Note finally that the cov-build step does not perform any static code analysis of defects: it only performs the full build of the software and keeps track of all build commands executed, so that the relevant build environment and options can be used in the following cov-analyze step.
As the builds of the LHCb software are performed using CMake, it would be useful to investigate whether the cov-build step could be bypassed and replaced by an equivalent step based on CMAKE_EXPORT_COMPILE_COMMANDS (see LBCORE-1057).
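For reference, this is how a compilation database can be produced by a plain CMake configuration step (a minimal sketch with illustrative paths; how the resulting file could then be fed into the Coverity analysis is exactly what remains to be investigated in LBCORE-1057):
mkdir build.compiledb && cd build.compiledb
cmake -DCMAKE_EXPORT_COMPILE_COMMANDS=ON /path/to/project/source
# CMake writes compile_commands.json, listing every compiler invocation of the build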
Analyze build results (cov-analyze)
The results of the builds are then analyzed using the analyze.sh script, a wrapper over cov-analyze:
#!/bin/bash -e
dir=`basename $PWD`
if [ "$dir" != "LbNightlyTools" ]; then
  echo "ERROR! Please run this script from LbNightlyTools"; exit 1
fi
if [ "$1" == "" ] || [ "$2" != "" ]; then
  echo "Usage: $0 <project>"
  echo "Example: $0 Gaudi"
  exit 1
fi
proj=$1
if [ ! -d $PWD/build ]; then
  echo "ERROR! Directory build not found"; exit 1
fi
if [ ! -d cov-out/$proj ]; then
  echo "ERROR! Please build the code first!"; exit 1
  exit 1
fi
\rm -rf cov-out/$proj/output;
mkdir -p logs
out=logs/log-${proj}-analyze.txt
\rm -rf $out
echo "======================================================================"
# See http://stackoverflow.com/questions/3173131
echo "Output copied to logfile $out"
exec > >(tee -a $out)
exec 2> >(tee -a $out >&2)
date=`date`
echo "Analyze of $proj started at "$date
time /coverity/cov-analysis-linux64-7.6.1/bin/cov-analyze --dir cov-out/$proj \
  --all --enable-constraint-fpp --enable-fnptr --enable-single-virtual --force
echo "Analyze of $proj started at "$date
echo "Analyze of $proj ended at "`date`
Note that, to avoid analyzing code repeatedly, Coverity can build models of already analyzed code (using the cov-collect-models command), i.e. dump the information for already analyzed code fragments into xmldb files.
Eventually, it may be useful to implement analysis models in the LHCb Coverity scans, to speed them up further (see LBCORE-1060).
Commit analysis results to the Coverity Connect server (cov-commit-defects)
Finally, the results of the analysis are committed to the Coverity Connect server using the commit.sh script, a wrapper over cov-commit-defects:
#!/bin/bash -e
dir=`basename $PWD`
if [ "$dir" != "LbNightlyTools" ]; then
  echo "ERROR! Please run this script from LbNightlyTools"; exit 1
fi
if [ "$1" == "" ] || [ "$2" != "" ]; then
  echo "Usage: $0 <project>"
  echo "Example: $0 Gaudi"
  exit 1
fi
proj=$1
stream=LHCb-${proj}-Stream # eg LHCb-Gaudi-Stream
stripall=
for proj1 in Gaudi LHCb Lbcom Rec Boole Phys Hlt Analysis Stripping DaVinci Geant4 Gauss; do
  PROJ1=`echo $proj1 | awk '{print toupper($0)}'`
  # NB stripped paths will start by "/" even if strip prefixes end in "/"
  # e.g. add /builda/LHCb/LbNightlyTools/build/GAUDI
  stripall="$stripall --strip-path $PWD/build/$PROJ1"
done
if [ ! -d cov-out/$proj ]; then
  echo "ERROR! Please build the code first!"; exit 1
  exit 1
fi
if [ ! -d cov-out/$proj/output ]; then
  echo "ERROR! Please analyze the code first!"; exit 1
  exit 1
fi
# Set environment variable COVERITY_PASSPHRASE to authenticate
# See /coverity/cov-analysis-linux64-7.6.1/bin/cov-commit-defects --help
if [ -f ../scripts/setup.sh ]; then . ../scripts/setup.sh; fi
mkdir -p logs
out=logs/log-${proj}-commit.txt
\rm -rf $out
echo "======================================================================"
# See http://stackoverflow.com/questions/3173131
echo "Output copied to logfile $out"
exec > >(tee -a $out)
exec 2> >(tee -a $out >&2)
date=`date`
echo "Commit of $proj started at "$date
time /coverity/cov-analysis-linux64-7.6.1/bin/cov-commit-defects \
  --dir cov-out/$proj --host lcgapp10.cern.ch --port 8080 --user admin \
  --stream $stream $stripall --strip-path /afs/cern.ch/sw/lcg/releases \
  --strip-path /cvmfs/sft.cern.ch/lcg/releases \
  --strip-path /cvmfs/lhcb.cern.ch/lib/lcg/releases
date
echo "Commit of $proj started at "$date
echo "Commit of $proj ended at "`date`
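Note that the ../scripts/setup.sh file sourced above is replaced by a setup_dummy.sh placeholder in the archive created by the main script, presumably to avoid storing credentials; it most likely just exports the Coverity Connect passphrase needed by cov-commit-defects, along the lines of the following sketch:
#!/bin/bash
# sketch of the non-archived setup.sh (the actual value is of course not shown here)
export COVERITY_PASSPHRASE='xxxxxx'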
3. Interact with the Coverity Connect server to triage defects (developer operations)
From a user (application manager and/or power developer) perspective, interacting with the Coverity Connect server to analyze and triage defects is the most interesting aspect of the Coverity infrastructure.
Select which defects to view
To use the Coverity Connect server, simply connect to https://coverity.cern.ch.
In the red project pulldown menu in the top left corner of the GUI, select LHCb as the Coverity project you want to work with.
Click on the three horizontal white lines in the top left corner of the GUI, just below the Coverity logo and the project pulldown menu: the view panel will now slide into the GUI from the left side.
For LHCb users belonging to the lhcb-coverity-users-local group, the view panel will look like the following (in the meantime, Geant4 and Gauss views have also been added):
The view panel allows you to choose the specific view you want to work with, within the selected LHCb project.
The view that is currently used (in the example above, the "LHCb-05-Boole" view within the "Issues: By snapshot" view type)
is displayed in white bold characters in the view panel and is also printed on the top left corner, below the project pulldown menu.
For LHCb users, the view panel will normally display both standard views and LHCb-specific views: the latter are displayed in italics, because they are private views of one user, shared with the LHCb group.
The view panel is largely configurable by each user.
If the LHCb specific shared views do not appear in your interface, click on "Layout Preferences" in the menu next to "Issues by Snapshot", and in the "Layout" tab select "Display all views shared with you".
Some of the views you may find most relevant for your work include:
- the standard "High impact outstanding" view, showing only defects classified as High-impact (and not yet fixed or dismissed), from all streams in the LHCb project
- the shared "All issues in last snapshot" view, showing all defects from all LHCb streams (irrespective of impact and including also defects triaged and dismissed as fake or irrelevant)
- the shared "LHCb-05-Boole" view and its siblings, showing all defects from one of the twelve LHCb streams (again, irrespective of impact and including also dismissed defects)
You can also add your own view. For instance, you may be interested in defects in the Boole stream, but only want to list outstanding ones, excluding those that have been dismissed as irrelevant.
To create a new view, click on "Add New View" in the menu next to "Issues by Snapshot";
alternatively, you may also click on "Edit Settings" for an existing view, and then select "Save as a Copy" to create a clone of that view, which you may later modify.
As mentioned above, note that streams and components are not the same thing.
A defect appearing in the "LHCb-05-Boole" view belongs to the BOOLE stream, but it can belong to the GAUDI component, if it was found in a Gaudi header used by a Boole class.
The shared "LHCb-OtherComponent" view is a control view, which should normally be empty if every defect is assigned to an LHCb or external component.
If it is not empty, please report it, as this means that the LHCb component map must be modified.
Triage defects
Selecting the appropriate view will allow you to list and start working on individual defects in the LHCb software.
Clicking on an individual defect in the top left window will pop up the source code for that defect in the bottom left window and the metadata for that defect in the window to the right of the GUI.
The following is an example from the Gaudi stream:
In this example, defect 64252 in the build of Gaudi originates from the gcc hashtable.h header, and has therefore been assigned to the "external (gcc)" component during the Coverity scan.
As this defect cannot be fixed in the Gaudi source code, a Gaudi developer has dismissed this defect as "Ignore".
For another defect that points to a real bug in the Gaudi code, the workflow could for instance be the following:
triage the defect as "Fix required" initially; later on, commit a bug fix and triage the defect as "Fix submitted";
finally, cross check that the defect has disappeared from the Coverity server after the next scan.
4. Status of Coverity defects for the LHCb software projects
For reference, the total number of defects found during the first full scan on 29 Feb 2016 was the following:
grep Defect `ls -tr log-*-analyze.txt`
log-Gaudi-analyze.txt:Defect occurrences found : 1352 Total
log-LHCb-analyze.txt:Defect occurrences found : 2272 Total
log-Lbcom-analyze.txt:Defect occurrences found : 626 Total
log-Rec-analyze.txt:Defect occurrences found : 1604 Total
log-Boole-analyze.txt:Defect occurrences found : 520 Total
log-Phys-analyze.txt:Defect occurrences found : 895 Total
log-Hlt-analyze.txt:Defect occurrences found : 973 Total
log-Analysis-analyze.txt:Defect occurrences found : 920 Total
log-Stripping-analyze.txt:Defect occurrences found : 518 Total
log-DaVinci-analyze.txt:Defect occurrences found : 701 Total
log-Geant4-analyze.txt:Defect occurrences found : 1132 Total
log-Gauss-analyze.txt:Defect occurrences found : 1487 Total
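For a quick total over all twelve projects, the per-project counts can be summed directly from the same log files (a minimal sketch; for the scan above this gives 13000 defect occurrences in total):
grep Defect log-*-analyze.txt | awk '{sum += $(NF-1)} END {print sum" Total"}'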
--
AndreaValassi - 2016-02-18