DILIGENT Pre-production

Infrastructure

Description

DILIGENT (http://www.diligentproject.org) pre-production infrastructure is composed of 6 sites:

  • CNR: Pisa, Italy
  • ENG: Rome, Italy
  • ESA: Rome, Italy
  • SNS: Pisa, Italy
  • UNIBAS: Basel, Switzerland
  • UoA: Athens, Greece
Of these 6 sites, 3 are already part of the EGEE PPS and 3 plan to join in the coming weeks.

The DILIGENT pre-production infrastructure supports the following VOs:

  • diligent: DILIGENT VO used by the two DILIGENT user communities and the DILIGENT developers
  • dteam: EGEE PPS monitoring VO
  • ops: EGEE PPS monitoring VO

The "diligent" VO is also supported by 10 other EGEE PPS sites.

This infrastructure is mainly used to:

  • store the DILIGENT user communities' data
  • store the DILIGENT service archives used in the deployment of the DILIGENT services
  • execute the DILIGENT watermarking application
  • execute feature extraction applications
  • execute applications defined as part of a compound service
  • manage the DILIGENT users and their groups/roles
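User and group/role management relies on VOMS: a user's proxy certificate carries the diligent groups and roles granted by the VOMS server. As a hedged sketch (it assumes a valid grid certificate and a gLite UI with the standard VOMS client tools installed; it cannot run outside such an environment), a DILIGENT user would authenticate like this:

```shell
# Obtain a VOMS proxy carrying the user's diligent groups/roles.
# Only works from a configured gLite UI with a valid grid certificate.
voms-proxy-init --voms diligent

# Inspect the VO attributes (groups/roles) embedded in the proxy.
voms-proxy-info --all
```

The resulting proxy is what the DILIGENT services and the WMS check when deciding what a given user may store or execute.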


Service distribution

(Figure: glite30_preprod.png — gLite 3.0 pre-production service distribution)


Core Services

  • BDII: lxn1187.cern.ch
  • LFC: grid-lfc.esrin.esa.int
  • VOMS: grids13.eng.it
  • MyProxy: grids02.eng.it


VOs Supported by the DILIGENT sites

The following VOs are currently supported by the DILIGENT sites:

| # | VO Name | Mandatory | Details |
| 01 | diligent | yes | https://cic.gridops.org/index.php?section=vo&page=homepage&vo=diligent |
| 02 | ops | yes | https://cic.gridops.org/index.php?section=vo&page=homepage&vo=OPS |
| 03 | dteam | yes | https://cic.gridops.org/index.php?section=vo&page=homepage&vo=dteam |
| 04 | switch | no | no CIC VO card available |
| 05 | biomed | no | https://cic.gridops.org/index.php?section=vo&page=homepage&vo=biomed |
| 06 | geant4 | no | https://cic.gridops.org/index.php?section=vo&page=homepage&vo=geant4 |

Configuration Files:


To be supported in the future, when the VOMS server information is provided (this can be done very easily in one of the regular site updates):

| # | VO Name | Mandatory | Details |
| 07 | eela | no | incomplete CIC VO card available |
| 08 | unosat | no | no CIC VO card available |


Sites Supporting the DILIGENT VO

| # | Site Name | Country | CE | SE |
| 01 | CERN-PPS | Switzerland | yes | yes |
| 02 | CERN-PROD | Switzerland | yes | yes |
| 03 | CESGA-PPS | Spain | yes | yes |
| 04 | PPS-CNR | Italy | yes | yes |
| 05 | PPS-CYFRONET | Poland | yes | yes |
| 06 | PPS-ESRIN | Italy | yes | yes |
| 07 | PPS-LIP | Portugal | yes | yes |
| 08 | PPS-PADOVA | Italy | yes | yes |
| 09 | PPS-RO-01-UPB | Romania | yes | yes |
| 10 | PPS-SNS | Italy | yes | yes |
| 11 | PRAGUE-CESNET-PPS | Czech Republic | yes | yes |
| 12 | PreGR-01-UoM | Greece | yes | yes |
| 13 | PreGR-02-UPATRAS | Greece | yes | yes |
| 14 | UKI-LT2-IC-HEP-PPS | UK | no | yes |

To get more information about the number of CEs and SEs made available by these sites, you can query the BDII using the CIC portal.
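Besides the CIC portal, the BDII can also be queried directly over LDAP. The sketch below is illustrative only: it assumes the standard gLite BDII port (2170), the `o=grid` base, and the GLUE 1.x schema, and it requires network access to lxn1187.cern.ch plus the OpenLDAP client tools.

```shell
# List the CEs that advertise support for the diligent VO,
# together with their published CPU counts (GLUE 1.x attributes).
ldapsearch -x -LLL -H ldap://lxn1187.cern.ch:2170 -b o=grid \
  '(&(objectClass=GlueCE)(GlueCEAccessControlBaseRule=VO:diligent))' \
  GlueCEUniqueID GlueCEInfoTotalCPUs
```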

Data Challenges

Description

The goal of this data challenge is to execute feature extraction on images. The Image Feature Extraction tool is composed of a Java application, some Perl scripts, and a C application. The Java code implements a client that contacts the Flickr database (http://www.flickr.com/) and downloads a set of users (limited to 5 per iteration) and the images these users are sharing over the Web. The Perl scripts and the C application are the core of the feature extraction process: they extract features from the images, create thumbnails, and (using a Java client) store the results on a cluster located at CNR.
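The real tool is Java + Perl + C, but its control flow can be sketched in a few lines. The sketch below is a hypothetical Python stand-in: the stage names and the 5-user cap come from the description above, while the function bodies are stubs replacing the real Flickr, feature-extraction, and CNR-storage calls.

```python
# Illustrative stand-in for the Image Feature Extraction pipeline.
# All bodies are stubs; only the control flow mirrors the description.
USER_LIMIT = 5  # users fetched per iteration, as described above

def fetch_users(limit=USER_LIMIT):
    # Stand-in for the Java Flickr client: returns at most `limit` user ids.
    return [f"user{i}" for i in range(limit)]

def fetch_images(user):
    # Stand-in for downloading the images a user shares on the Web.
    return [f"{user}_img{j}.jpg" for j in range(3)]

def extract_features(image):
    # Stand-in for the Perl/C feature-extraction core: features + thumbnail.
    return {"image": image, "features": [0.0], "thumbnail": image + ".thumb"}

def store_results(results):
    # Stand-in for the Java client that ships results to the CNR cluster.
    return len(results)

def run_pipeline():
    results = []
    for user in fetch_users():
        for image in fetch_images(user):
            results.append(extract_features(image))
    return store_results(results)
```

With 5 users and 3 stub images each, `run_pipeline()` stores 15 results.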

Since the application requires Java 1.5 (not supported on gLite nodes), the Java binaries have to be downloaded before the application is launched. This overhead is not too penalising, however, and the throughput of the feature extraction process has been significantly improved. The input files are stored on our servers, so the PPS support is limited to the computing power required to process the collection.

The characteristics of the data challenge are:

  • Around 500 jobs submitted per day (through 2 WMSs), although this can be increased or decreased as needed
  • Each job requires at most 50 MB of disk space and at least 512 MB of RAM
  • Jobs will consume between 20 minutes and 1 hour of CPU time (depending on the CPU)
  • Sites do not need to install any particular libraries or other software
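These requirements can be expressed directly in the job description submitted to the WMS. A hypothetical JDL sketch for one such job follows; the executable and file names are illustrative, not the real ones, and only the 512 MB memory constraint is taken from the figures above (mapped onto the corresponding GLUE 1.x attribute).

```
Executable    = "run_extraction.sh";
Arguments     = "userlist.txt";
StdOutput     = "extraction.out";
StdError      = "extraction.err";
InputSandbox  = {"run_extraction.sh", "userlist.txt"};
OutputSandbox = {"extraction.out", "extraction.err"};
// Only match worker nodes with at least 512 MB of RAM:
Requirements  = other.GlueHostMainMemoryRAMSize >= 512;
// Prefer CEs with the shortest estimated queue time:
Rank          = -other.GlueCEStateEstimatedResponseTime;
```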

Schedule

  • 1st Part (2 weeks)
    • Start: Monday 16 July
    • End: Friday 27 July
  • 2nd Part (2 weeks)
    • Start: Monday 20 August
    • End: Friday 31 August



-- PedroAndrade - 10 Jul 2007
