Please note, this is the description of the specific gLite UI tests; when you certify the gLite UI you need to test other subcomponents too. Please check SA3Testing for details.

User Interface (UI)

Location: org.glite.testsuites.ctb/UI

Developer: Dmitry Zaborov, Laurence Field (CERN), Kalle Happonen

These are intended as dedicated UI tests, not GRID infrastructure tests run via the UI. Hopefully this approach will help keep the test suite reasonably compact. The driving idea is to check whether all the commands listed in the Users Guide really work. An additional concern is that everything should work regardless of which shell (bash or tcsh) the user is using. Testing of both shells is enabled in three ways:

(1) Some of the most "system-oriented" tests, like the shell environment tests, are implemented in two variants: one for bash and one for tcsh.

(2) Some other tests, such as the version tests, are written so that both kinds of shell can execute them (we call these "multi-shell scripts").

(3) Other tests are designed to run in plain Bourne shell ('sh'). This solution relies on the fact that a non-interactive, non-login 'sh' shell does not read any start-up files or manipulate the shell environment when starting (except for very special variables like SHLVL). For instance, the value of PATH will be the same as in the parent shell or process (it is interesting to note that tcsh, as well as interactive bash, may remove duplicate entries from PATH, affecting the order in which the directories are listed). Thus running an sh script containing some commands is a good equivalent of running the same commands directly from an interactive shell.
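As a small illustration of point (3), the following generic sketch (not part of the suite) shows that a non-interactive, non-login sh child reproduces the parent's PATH verbatim, duplicates included:

```shell
# Not part of the test suite: a generic demonstration that a non-interactive,
# non-login sh reads no start-up files and so inherits PATH verbatim.
export PATH="/usr/bin:/bin:/usr/bin"      # deliberate duplicate entry
child_path=$(sh -c 'printf %s "$PATH"')   # ask a child sh for its PATH
[ "$child_path" = "$PATH" ] && echo "PATH inherited unchanged"
```

Running the same probe with tcsh as the child would typically show the duplicate entry removed.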

The run methods can be summarized as follows:

For case (1) one runs both the tcsh and bash versions of the test; the bash version can also be run with sh. I.e. [ sh | bash ] scriptname.sh and [ csh | tcsh ] scriptname.csh. Some of the scripts can also be "sourced" from the current shell (please consult the comments in the script of interest to find out).

For case (2) one runs the same script using different shells, i.e. [ sh | bash | csh | tcsh ] scriptname.

For case (3) one can launch the sh script from the current shell, or start a new shell of a different type and run the test from there. I.e. [ sh | bash ] scriptname.sh.

Note that, since the shell environment is exported from a shell to its daughter processes, running a bash test from a login bash is not the same as running it from a login tcsh, or from a non-login bash started from tcsh, etc. So it does make sense to try every test with different login shells.

Since no gLite services run continuously on the UI, there is no point in running pure UI tests permanently. Instead, such tests can be used to validate a new UI installation, configuration or update. However, a major part of the UI tests actually implement functional testing features, e.g. a test of a job submission command will fail if the service concerned does not work. Some of the tests might also be useful on machines other than the UI, e.g. on Worker Nodes.

Running the tests

It is highly recommended to have a valid VOMS proxy when running the tests, e.g. run 'voms-proxy-init -voms dteam' to create a dteam proxy.

All the tests (with some exceptions) can be run from the test driver script. This is the recommended way of running the tests. Some of the tests are platform specific, so using the test driver script will execute the correct subset of tests. The platforms currently supported are gLite 3.1 on SLC4 32-bit (64-bit should work, but the tests might report errors with libraries) and gLite 3.2 on SLC5 64-bit. The main difference between these is the migration from gLite 3.1 to 3.2.

The test driver needs some flags, depending on the tests run; see its --help output for details. The test driver can run all tests or a subset of tests. Some security tests require you to enter certificate and MyProxy passwords during the run; otherwise the tests should be fully automatic.

Description of the different tests

Shell environment tests

  • - bash version
  • UI-environment.csh - tcsh version
Searches for incorrect entries in shell variables. Currently checked are PATH, LD_LIBRARY_PATH, PYTHONPATH, PERLLIB, PERL5LIB, MANPATH, JAVA_HOME, JAVA_INSTALL_PATH.
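A minimal sketch (not the actual test script, helper name made up) of the kind of check involved: flag entries of a PATH-like variable that do not point to existing directories.

```shell
# Hypothetical helper illustrating the idea behind the environment tests:
# report entries of a colon-separated list that are not existing directories.
check_path_var() {
    _status=0
    _old_ifs=$IFS; IFS=:
    for _dir in $1; do
        [ -n "$_dir" ] && [ ! -d "$_dir" ] && { echo "missing: $_dir"; _status=1; }
    done
    IFS=$_old_ifs
    return $_status
}
check_path_var "/bin:/definitely/not/there" || echo "found bad entries"
```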

gLite-specific shell environment tests

  • - bash version
  • UI-glite-environment.csh - tcsh version
Tests whether gLite/LCG-specific shell variables, such as GLITE_LOCATION for example, point to existing directories.
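The check boils down to something like the following sketch (the helper name and the stand-in variable value are illustrative only):

```shell
# Illustrative only: verify that a named variable is set and points to an
# existing directory, as the gLite environment tests do for GLITE_LOCATION etc.
points_to_dir() {
    eval "_val=\${$1}"
    [ -n "$_val" ] && [ -d "$_val" ]
}
GLITE_LOCATION=/tmp   # stand-in value for this example
points_to_dir GLITE_LOCATION && echo "GLITE_LOCATION ok"
```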

All-in-one command existence test

This test checks whether the UI commands listed in the User Guide (to be precise, in a text file supplied with the test suite) can be found on the system. The test should fail if, for example, the shell environment is not configured properly. Currently only a (ba)sh version of this test is provided. However, it effectively tests the environment of the parent shell, which is exported to the daughter sh shell, regardless of the parent shell's type.
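In spirit, the check is a loop over the command list with 'command -v' (the commands below are stand-ins; the real list comes from the text file shipped with the suite):

```shell
# Sketch of an existence test: 'command -v' succeeds only if the command
# can be resolved in the current environment.
check_cmds() {
    _missing=0
    for _cmd in "$@"; do
        command -v "$_cmd" > /dev/null 2>&1 || { echo "not found: $_cmd"; _missing=1; }
    done
    return $_missing
}
check_cmds ls sed && echo "all commands found"
```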

Library existence test

  • - Checks whether liblcg_util, libgfal and libglobus_gass_copy_gcc32 can be found in the usual place.
Also looks for broken symbolic links in $LCG_LOCATION/lib, $GLOBUS_LOCATION/lib and $GLITE_LOCATION/lib. This test also works on SLC5 64-bit.
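The broken-link scan can be sketched with 'find -L', which reports a symlink as type 'l' only when its target is missing (a temporary directory is used here instead of the real lib directories):

```shell
# Sketch of a broken-symlink scan. With -L, find follows links, so anything
# still reported as type 'l' is a link whose target does not exist.
scan_dir=$(mktemp -d)
ln -s /no/such/target "$scan_dir/libbroken.so"
broken=$(find -L "$scan_dir" -type l)
[ -n "$broken" ] && echo "broken links: $broken"
rm -rf "$scan_dir"
```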

Man page existence test

  • - Tests whether man pages are provided for the commands from the User Guide.
This test scans the list of gLite commands and reports which commands come without man pages.

NTP test

  • - Tests whether the system time is being synchronized with NTP (uses ntpstat, or ps piped to grep, to find out).

Command execution tests


The version tests do nothing but run the commands under consideration with a --version or --help option (or whatever the equivalent is called). The interesting feature of these tests is that they fail when a tested command fails to load, e.g. if a needed dynamic library is missing; thus they can be used as a quick but effective installation test. These tests are provided in the form of "multi-shell" scripts (see above). Note that the test results may depend on the shell used (due to possibly different shell environments), so it makes sense to try the same tests with different shells. Unfortunately, some commands of interest could not be covered by these tests because they either have no help/version option (lfc-*) or return a non-zero exit status after displaying the help (edg-gridftp-*, the lcg-* data management tools).
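A version test then amounts to little more than the following (the command name is an arbitrary example, not one of the gLite commands):

```shell
# Sketch of a version test: if the command cannot start (e.g. the dynamic
# linker aborts on a missing shared library), the non-zero exit status
# makes the test fail.
version_test() {
    "$1" --version > /dev/null 2>&1
}
version_test ls && echo "ls loads and reports its version"
```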

Security subsystem

Basic functionality tests of various commands related to grid security.
  • - grid-cert-info with various options.
  • - grid-proxy-info with various options.
  • - grid-proxy-init/info/destroy chain.
  • - voms-proxy-info with various options.
  • - voms-proxy-init/info/destroy chain.
  • - ensure that voms-proxy-init really uses the file given with the -userconf.
  • - myproxy-init/info/destroy command chain.

Job management

Basic functionality tests of the most popular job submission commands.
  • - job-list-match test for the EDG job submission system (only gLite 3.1)
  • - submit/get status/get output test for the EDG job submission system (only gLite 3.1)
  • - job-list-match test for the gLite job submission system (only gLite 3.1)
  • - submit/get status/get output test for the gLite job submission system (only gLite 3.1)
  • - job-list-match test for the gLite WMS job submission system
  • - submit/get status/get output test for the gLite WMS job submission system
  • - delegate proxy, submit/get status/get output test for the gLite WMS job submission system

Information system

Currently only MDS-related commands are tested (no R-GMA).
  • - Get info about CEs using lcg-info
  • - Get info about SEs using lcg-info
  • - Get info about sites using lcg-infosites
  • - Get info from local GIIS using ldapsearch. (Use -H to choose GIIS hostname.)

Data management: Lcg File Catalog

Tests of the LFC catalog commands. In order to avoid the need for file transfer commands, all tests operate on directories.
  • - Basic test of lfc-ls.
  • - Create a directory in LFC, list it and remove.
  • - Create a directory in LFC, make a symbolic link to it and clean up.
  • - Create a directory in LFC, set its comment, list, delete comment, delete the directory.
  • - Create a directory in LFC, list ACL, modify ACL, list ACL, delete directory.

Data management: Lcg data management tools

Basic tests of the command line tools known as lcg_utils.
  • - Create and register, and then remove, a GRID file.
  • - Create and register, copy back and remove a GRID file.
  • - Create and register, list replica (SURL), get GUID and TURL for the replica, and then delete a GRID file.
  • - Create and register, list aliases, create new alias, list again, remove alias, list once again and remove a GRID file.
  • - Needs two SEs and executes lcg-cr, lcg-lg, lcg-cp, lcg-lr, lcg-lg, lcg-ls, lcg-aa, lcg-la, lcg-ra, lcg-rep, lcg-uf, lcg-sd. Also contains tests for bugs 32808, 32923 and 32999.

Tag management tests

These tests try to manage tags for a given computing element. They require the computing element and VO as options. By default the tests assume you do not have tag management rights, so they do not try to add or remove tags. To enable the tag add/remove tests, use the --extended flag.

Known issues
  • fails due to bug 36573.
  • fails in the part devoted to the -confile option, which, though deprecated, should be identical to -vomses (but apparently is not).
  • fails (unless launched with -voms) due to bug 33459.


Developer: Sophie Lemaitre (CERN)

Location: LCG-DM/test/python/lfc and LCG-DM/test/perl/lfc

  • basic operations (delete, create,...) via the LFC Python API
  • basic operations (delete, create,...) via the LFC Perl API

-- KalleHapponen - 29 Apr 2009
