XUUDB Verification and Validation Plan
Service/Component Description
See the reference card
XUUDB Service
Deployment scenarios
XUUDB is deployed as a domain-central administrative service, which may serve the servers of one or more administrative domains. XUUDB is not considered a good choice as a grid-central service.
This paragraph is quite similar for all components.
According to a definition of "Smoke Testing":
Smoke Testing: A quick-and-dirty test that the major functions of a piece of software work.
Originated in the hardware testing practice of turning on a new piece of hardware for the first time and considering it a success if it does not catch on fire.
Smoke Testing Steps:
Software components that have been translated into code are integrated into a "build".
A build includes all data files, libraries, reusable modules and engineering components required to implement one or more product functions.
A series of tests is designed to expose errors that will keep the build from properly performing its function.
The intent should be to uncover "show stopper" errors that have the highest likelihood of throwing the software project behind schedule.
The build is integrated with other builds and the entire product (in its current form) is smoke tested daily.
The integration approach may be top down or bottom up.
Following our experience with UNICORE, we have planned the steps needed to verify the basic installation and the basic functionality of the software.
The self-installing jar version of the UNICORE server requires little work to complete the installation, so we can perform a very elementary test without much effort; obviously only a few properties of the UNICORE configuration files can be set in this way.
First of all we have to test the packages that will be built and delivered within and for EMI, i.e. the .deb or .rpm packages.
So the current server bundle (which is outside EMI and released separately) is not relevant here.
In general I'd try to define a deployment testing procedure in multiple stages.
Everything should work with the default configuration, which will have to be designed so that this kind of testing is possible. It should work out of the box, i.e. directly after running the installation procedure (yum install ...) with no changes to the config files necessary.
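A minimal sketch of such an out-of-the-box check could look like the following; the package name (unicore-xuudb) and the init script name are assumptions and have to be replaced by the names used in the final EMI packaging:

    #!/bin/bash
    # Out-of-the-box deployment smoke test (assumed package and service names).
    set -e

    # Install the EMI package, leaving the default configuration untouched.
    yum install -y unicore-xuudb          # assumed package name

    # Start the service and check that it comes up.
    /etc/init.d/unicore-xuudb start       # assumed init script name
    sleep 5
    if /etc/init.d/unicore-xuudb status ; then
        echo "SMOKE TEST OK: xuudb is running with the default configuration"
    else
        echo "SMOKE TEST FAILED: xuudb did not start" >&2
        exit 1
    fi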
Some questions/answers to define the steps:
1) Which services should be started at the end of the procedure?
xuudb
2) How many hosts do we have to use?
One VM host is enough.
3) Where? On the UNICORE Juelich testbed.
4) Which platform?
For now UNICORE components are built on SLC5-EPEL and tested on openSUSE.
5) DB entries? We can test them by trying to add some entries to the XUUDB using the admin.sh script (see the sketch after this list).
6) The user to start the service is unicore.
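The following sketch illustrates steps 1), 5) and 6); the installation path and the exact admin.sh sub-commands and argument order are assumptions and must be checked against the XUUDB manual:

    # 1)/6) Check that the xuudb service runs under the unicore user.
    ps -u unicore -f | grep -i xuudb

    # 5) Add a test entry and list the database content (assumed syntax).
    cd /opt/unicore/xuudb                 # assumed installation path
    bin/admin.sh add DEMO-SITE /tmp/testuser.pem testlogin user
    bin/admin.sh list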
Functionality tests
The following tests use the term "entry specification", which can be any mix of the following properties: siteID, user credential, role, xlogin, gid.
Removing a user test (NOT implemented: xxxxxx)
It is possible to remove a user by providing any site ID and credential as arguments (a command sketch is given below).
Normal workflow - correct input
After removing an existing entry, it is missing from the output of the list operation.
Error workflow - erroneous input
Removing a non-existing user does nothing.
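A possible sketch of both workflows is given below; as above, the admin.sh sub-commands and the concrete form of the entry specification are assumptions to be verified against the XUUDB manual:

    # Normal workflow: after removing an existing entry it must no longer
    # appear in the output of the list operation.
    bin/admin.sh add DEMO-SITE /tmp/testuser.pem testlogin user
    bin/admin.sh remove DEMO-SITE /tmp/testuser.pem
    bin/admin.sh list | grep testlogin && echo "FAILED: entry still present" >&2

    # Error workflow: removing a non-existing user must do nothing and must
    # not corrupt the database.
    bin/admin.sh remove DEMO-SITE /tmp/unknownuser.pem
    bin/admin.sh list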
TBD: more shall be added
Integration tests
TBD
Performance tests
TBD
Scalability tests
TBD
Standard Compliance/Conformance tests
N/A
Regression tests and unit tests
Unit tests coverage must be included in the test report.
All reported bugs should have an automated regression test attached, if possible. Otherwise a manual bug-checking procedure should be added to this section. Note that this applies to bugs reported from 1.11.2010 onwards.
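A possible skeleton for such an automated regression test, assuming a simple shell-based harness, is sketched below; the bug number and the reproduction steps are placeholders to be filled in per bug:

    #!/bin/bash
    # Regression test skeleton for one reported bug (placeholder number).
    # 1. Reproduce the scenario described in the bug report against a
    #    running XUUDB instance.
    # 2. Verify that the previously wrong behaviour no longer occurs.
    # 3. Exit with a non-zero status if the bug re-appears, so that the
    #    test harness can flag the regression.
    echo "Running regression test for bug #<number>"
    exit 0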
Regression tests to be performed manually:
Deployment tests
TBD
--
ClaudioCacciari - 17-Dec-2010