My German vocabulary (German | Italian)
ablehnen | rinunciare
aufpumpen | gonfiare
sich ablenken | distrarsi
bestehen auf | insistere
einladen | invitare, offrire
neigen | tendere
sperren | chiudere, sbarrare
vermeiden | evitare
vermuten | sospettare
Bedingung | condizione
Entscheidung | decisione
-s Geheimnis | mistero
Genehmigung | autorizzazione
-r Hintergrund | sfondo, background
-r Reifen | pneumatico
-r Schlauch | camera d'aria
Verzeihung | scuse
-r Zettel | foglio
knapp | appena
solange | finché
Ich entschuldige mich für die Störung (I apologise for the disturbance)
Ich rede mal mit ihm (I'll have a word with him)
Ich schaffe es (nicht) (I (don't) manage it)
Das wäre es von meiner Seite (that would be all from my side)
TAN
https://www.reiner-sct.com/ccsdata/documentDownload.pdf?documentId=83708640
My notes
CERN P1 proxy
https://pc-atlas-www.cern.ch/twiki/bin/view/Main/BeamConditionsOperationManualExpertRun3
CERN radiation facilities
JLU
Eduroam
https://cat.eduroam.de
Zoom
https://cern.zoom.us/j/9920384618?pwd=ZG45L2xTUGlobGg5U3kzd1hQL3ZoZz09
Certificates
https://eduroamdocs.web.cern.ch/eduroamdocs/cern_users/linux/ubuntu.html
Firefox
Settings -> Privacy & Security -> Cookies -> Manage Data
login.cern.ch
(cern.ch)
Selected...
VPN
Packages:
sudo apt install network-manager-openconnect network-manager-openconnect-gnome
VPN Protocol: Cisco AnyConnect
Gateway: vpn.uni-giessen.de
CA Certificate: attached
Proxy: (none)
(CSD Wrapper Script?)
(User Certificate?)
(Private key?)...
VPN host: vpn.uni-giessen.de
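As an alternative to the NetworkManager GUI, the connection can also be opened from the command line; a minimal sketch, assuming the openconnect package is installed and the same gateway as above:
sudo apt install openconnect
sudo openconnect --protocol=anyconnect vpn.uni-giessen.de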
ssh -X <username>@alfa3.physik.uni-giessen.de
Right-click on "Continue"
Click on "Copy link"
Command
nordvpn login --callback "link"
(double quotes required)
Linux
Firefox does not open websites
Close Firefox, then:
cd .mozilla/
mv firefox/ firefox.bak
Restart Firefox.
400 Bad request
Settings -> Privacy & Security -> Cached Web Content -> Clear Data
Get IP of a network printer
avahi-browse --all -t -r
Search for devices connected to the network
(sudo) nmap -sn 192.168.x.x/24
Nohup
nohup <process> > list.out 2> list.err &
Both out and err to the same file:
nohup <process> > log.txt 2>&1 &
Root:
nohup root -l -b -q "elal.C+(0,354944)" > ../Logs/r354944.txt 2>&1 &
The system is stuck
Hit Alt+F2, type xkill, and press Enter. Your mouse cursor will then turn into an X. Hover over the offending window and left-click to kill it. Right-clicking will cancel and return your mouse to normal.
Reboot:
While holding Alt and the SysRq (Print Screen) keys, type REISUB:
R: Switch to XLATE mode
E: Send Terminate signal to all processes except for init
I: Send Kill signal to all processes except for init
S: Sync all mounted file-systems
U: Remount file-systems as read-only
B: Reboot
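If the keyboard shortcut is not available (e.g. over SSH), the same SysRq commands can be written to /proc instead; a minimal sketch, assuming the sysrq interface is enabled in the kernel:
echo 1 | sudo tee /proc/sys/kernel/sysrq      # enable all SysRq functions
echo s | sudo tee /proc/sysrq-trigger         # Sync
echo u | sudo tee /proc/sysrq-trigger         # Remount read-only
echo b | sudo tee /proc/sysrq-trigger         # Reboot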
Find the code of a process
You can add filters to top while it is running: press the o key and then type in a filter expression. For example, to monitor all java processes use the filter expression COMMAND=java.
Add multiple filters by pressing the key again, filter by user with the u key, and clear all filters with the = key.
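A non-interactive alternative sketch, assuming the processes of interest can be matched by name with pgrep (here java): restrict top to their PIDs.
top -p "$(pgrep -d, java)"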
Find the ID of a process
pgrep -a <process>
Find a word in a file
In this example the file extension is .h and the word to search for is Hreco:
find .. -iname '*.h' -exec grep -H Hreco {} \;
Search for files modified in the last n days
find ../.. -name "fit*.C" -mtime -n
Converting images
convert <image.jpg> <image.png>
convert <image*.png> <filename.pdf>
Merging images
Vertically:
convert image1.png image2.png -append result.png
Horizontally:
convert image1.png image2.png +append result.png
Merge pdf files
pdfunite file1.pdf file2.pdf merged_output.pdf
Extract pages from a pdf file
pdftk full-pdf.pdf cat 12-15 output outfile_p12-15.pdf
qpdf input.pdf --pages input.pdf 5-6 -- output.pdf
Printers
Web CUPS
http://localhost:631/printers/Dell_C3765dnf_Color_MFP
Audio
arecord -f cdr /tmp/test
arecord /tmp/test
aplay /tmp/test
speaker-test
speaker-test -twav
pgrep -a speaker
Hostname
nslookup <ip-address>
View last lines of a file updating "live"
tail -f <filename>
Search for text in files
grep -rli 'some_text' .
r = recursive, i = case insensitive, l = list file names
grep -r "text" <path>
Bash scripts
Read the content of a file
#!/bin/bash
filename='filename.txt'
while read -r item1 item2 item3; do
  echo "$item1 $item2"
done < "$filename"
Assign the result of a floating point operation
c=$(echo "3 4" | awk '{a=$1; b=$2; printf "%0.2f \n", (a / b) }')
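An equivalent sketch using bc instead of awk (assuming bc is installed; note that bc prints .75 without a leading zero):
c=$(echo "scale=2; 3/4" | bc)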
Avoid big vertical space before the last item of a list:
\item some text\par
Remove bullet from a list of items:
\item[]
Custom font size:
\usepackage{anyfontsize}
...
{\fontsize{50}{60}\selectfont Foo}{\fontsize{5}{6}\selectfont bar!}
Installing biblatex and biber packages:
sudo apt-get install texlive-bibtex-extra
sudo apt-get install texlive-bibtex-extra biber
Picture environment
https://www.overleaf.com/learn/latex/Picture_environment
\setlength{\unitlength}{0.20mm}
\begin{picture}(400,250)
\thicklines
\put(75,10){\line(1,0){130}}
\put(75,50){\line(1,0){130}}
\put(75,200){\line(1,0){130}}
\put(2,2.2){\circle{2}}
\put(6,2.2){\color{red}\oval(4,2)[r]}
\put(120,200){\vector(0,-1){150}}
\put(190,200){\vector(0,-1){190}}
\put(97,120){$\alpha$}
\put(170,120){$\beta$}
\put(220,195){upper state}
\put(220,45){lower state 1}
\put(220,5){lower state 2}
\end{picture}
Tables
\begin{center}
\renewcommand{\arraystretch}{2.5}
\begin{tabular}{| p{3.2cm} | p{6.3cm} | p{1.4cm} |}
\hline
\scriptsize{\textit{\textbf{item}}} & \scriptsize{\textit{\textbf{action}}} & \scriptsize{\textit{\textbf{responsibles}}} \\
& \multicolumn{2}{c|}{B7L1} \\
\cline{2-3}
\hline
\hline
\scriptsize{merge request} & to be performed in the next days $\Rightarrow$ \textbf{participation to the next DQ large scale tests} & N. \DJ iki\'c\newline D. C. \\
\hline
\scriptsize{large scale tests on a number of 2017 and 2018 runs} & already available \textbf{337176} (2017), \textbf{361795} (2018), waiting for more runs (at least two more per year) from AFP experts & N. \DJ iki\'c\newline K. Korcyl\newline L. Adamczyk\newline D. C. \\
\hline
\scriptsize{documentation} & AFP DQ TWiki being updated, more histograms (e.g. correlation with ID etc.) to be added \textbf{once agreed} & R. Staszewski\newline P. Newman\newline D. C. \\
\hline
\scriptsize{webdisplay} & automatic checks under implementation/test & N. \DJ iki\'c\newline D. C. \\
\hline
\scriptsize{defects} & to be defined: \textit{name (e.g. \textbf{AFPnotSynchronized}), definition, tolerable/intolerable} & D. C.\newline ... \\
\hline
\end{tabular}
\end{center}
Columns
\vspace*{-0.28cm}
\begin{columns}[c c]
\column{.51\textwidth}
\begin{center}
\hspace*{-2.08cm}\includegraphics[width=0.65\linewidth]{pics/BRIC_demo03.jpg}
\end{center}
\column{.51\textwidth}
\begin{center}
\hspace*{-2.08cm}\includegraphics[width=1.25\linewidth]{pics/BRIC_demo03_zoom.jpg}
\end{center}
\end{columns}
Weblinks
Webpage: {\color{blue}\href{https://www.caen.it/products/easy-bric1/}{bric1}}
\url{https://www.caen.it/products/easy-bric1/}
Workspaces at CERN
AFS
AFS
EOS (CERNbox)
/eos/home-c/caforio
Setup:
setupATLAS
voms-proxy-init -voms atlas
lsetup rucio
Copy a file from GRID:
rucio download user.<someuser>.<somefile>
Qualification tasks
ATHENA
ATLAS data scheme
Releases
https://gitlab.cern.ch/atlas/athena/-/releases
Builds
https://atlassoftwaredocs.web.cern.ch/athena/athena-nightly-builds/
To see which nightly releases are installed, just browse CVMFS:
ls /cvmfs/atlas-nightlies.cern.ch/repo/sw/master...
Tutorial
Release setup
Basic Account and Software Setup
Basic Account and Software Setup
Certificates
cert-convert.sh
#!/bin/bash
##################################################################################
#
# $HeadURL: svn+ssh://svn.cern.ch/reps/dirac/DIRAC/trunk/DIRAC/Core/scripts/dirac-cert-convert.sh $
# $Id: DavideCaforioSandbox.txt,v 1.207 2023/03/09 14:10:51 caforio Exp $
#
# dirac-cert-convert.sh script converts the user certificate in the p12 format
# into a standard .globus usercert.pem and userkey.pem files
#
# Author: Vanessa Hamar
# Last modified: 25.04.2010
#
##################################################################################
function usage {
echo Usage:
echo " "cert-convert.sh CERT_FILE_NAME.p12
exit 1
}
if [ $# = 0 ]; then
echo User Certificate P12 file is not given.
usage
fi
GLOBUS=$HOME/.globus
USERCERT_P12_ORIG=$1
USERCERT_P12=$GLOBUS/`basename $USERCERT_P12_ORIG`
USERCERT_PEM=$GLOBUS/usercert.pem
USERKEY_PEM=$GLOBUS/userkey.pem
OPENSSL=`which openssl`
DATE=`/bin/date +%F-%H:%M`
if [ ! -f "$USERCERT_P12_ORIG" ]; then
echo file $USERCERT_P12_ORIG does not exist
usage
fi
if [ ! -d $GLOBUS ]; then
echo "Creating globus directory"
mkdir $GLOBUS
fi
if [ -f $USERCERT_P12 ]; then
echo "Back up $USERCERT_P12 file"
cp $USERCERT_P12 $USERCERT_P12.$DATE
fi
cp $USERCERT_P12_ORIG $USERCERT_P12
echo "Converting p12 key to pem format"
if [ -f $USERKEY_PEM ]; then
echo "Back up $USERKEY_PEM file"
mv $USERKEY_PEM $USERKEY_PEM.$DATE
fi
while [ ! -s $USERKEY_PEM ]; do
$OPENSSL pkcs12 -nocerts -in $USERCERT_P12 -out $USERKEY_PEM
done
echo "Converting p12 certificate to pem format"
if [ -f $USERCERT_PEM ]; then
echo "Back up $USERCERT_PEM file"
mv $USERCERT_PEM $USERCERT_PEM.$DATE
fi
while [ ! -s $USERCERT_PEM ]; do
$OPENSSL pkcs12 -clcerts -nokeys -in $USERCERT_P12 -out $USERCERT_PEM
done
chmod 400 $USERKEY_PEM
chmod 644 $USERCERT_PEM
echo "Information about your certificate: "
$OPENSSL x509 -in $USERCERT_PEM -noout -subject
$OPENSSL x509 -in $USERCERT_PEM -noout -issuer
echo "Done"
GRID
Submitting a DQ job for AFP
cd /afs/cern.ch/work/c/caforio/testAFP/athena ; setupATLAS ; asetup Athena,master,r2021-10-05T2101 ; voms-proxy-init -voms atlas ; lsetup panda
pathena --trf "Run3DQTestingDriver.py --dqOffByDefault DQ.Steering.doAFPMon=True --inputFiles=%IN" --inDS=data18_13TeV.00350160.physics_Main.merge.AOD.f934_m1960 --extOutFile=ExampleMonitorOutput.root --outDS=user.caforio.test100.root
Athena releases
https://atlassoftwaredocs.web.cern.ch/athena/athena-nightly-builds/
ls -lrt /cvmfs/atlas-nightlies.cern.ch/repo/sw/master*
Checking the output
cd /eos/home-c/caforio
setupATLAS
asetup Athena,master,r2021-03-15T2101
voms-proxy-init -voms atlas
lsetup rucio
rucio get ...
VOMS
VOMS
VOMS2
VOMS3
voms-proxy-init -voms atlas
AMI
AMI
PANDA
Panda
PandaAthena
PandaRun
https://twiki.cern.ch/twiki/bin/view/PanDA/PandaAthena
https://panda-wms.readthedocs.io/en/latest/client/pathena.html#running-joboptions
Rucio
Rucio tutorial
Analysis stuff
AFPRun2DataReconstruction
RecoTf
setupATLAS
lsetup rucio
rucio list-files <filename>
rucio get <filename>
asetup Athena,21.0.97
lsetup panda
pathena --trf "Reco_tf.py ..." inputfile outputfile
Example:
pathena --trf "Reco_tf.py --inputEVNTFile \"%IN\" --outputDAODFile \"%OUT.DAOD.root\" --reductionConf TRUTH0" --inDS data18_13TeV.00360309.calibration_AFP.daq.RAW --outDS user.caforio.testRun360309
INFO : using CMTCONFIG=x86_64-centos7-gcc8-opt
INFO : extracting run configuration
INFO : archiving source files
INFO : The build directory is /cvmfs/atlas-nightlies.cern.ch/repo/sw/master_Athena_x86_64-centos7-gcc8-opt/2020-12-03T2101/Athena/22.0.22/InstallArea
INFO : archiving source files
INFO : archiving InstallArea
INFO : checking symbolic links
INFO : uploading source/jobO files
INFO : submit user.caforio.testRun360309/
INFO : succeeded. new jediTaskID=23451144
Monitoring the status of the job
BigPanda
https://bigpanda.cern.ch/task/
Bookkeeping:
pbook
https://panda-wms.readthedocs.io/en/latest/client/pbook.html
Examples:
>>> show()
Showing only max 1000 tasks in last 14 days. One can set days=N to see tasks in last N days, and limit=M to see at most M latest tasks
JediTaskID ReqID Status Fin% TaskName
________________________________________________________________
23451144 1 running 0.0% user.caforio.testRun360309/
>>> show(<JediTaskID> or <ReqID>)
Long format:
>>> showl()
Filters:
>>> show(username='Some Name', limit=7, days=30)
Kill job:
>>> kill(3)
Retry:
>>> retry(2)
Setup
setupATLAS
asetup Athena,master,latest
bash
mkdir workArea
setupATLAS
export ATLAS_LOCAL_ROOT_BASE=/cvmfs/atlas.cern.ch/repo/ATLASLocalRootBase
alias setupATLAS='source ${ATLAS_LOCAL_ROOT_BASE}/user/atlasLocalSetup.sh'
mkdir source
mkdir build
mkdir run
cd source
Checking xAOD files
setupATLAS
asetup Athena,master,latest
checkxAOD.py <filename>
Checking-out AFP DQ code
mkdir testAFP
cd testAFP
setupATLAS
lsetup git
git atlas init-workdir https://:@gitlab.cern.ch:8443/atlas/athena.git
cd athena
git atlas addpkg WorkDir
git atlas addpkg Run3AFPMonitoring
git atlas addpkg AthenaMonitoring
(
git remote add ndikic https://:@gitlab.cern.ch:8443/ndikic/athena.git
git fetch ndikic
)
git checkout master-AFP-DQM-Nikola
mkdir build
cd build
cmake ../athena/Projects/WorkDir; source x86_64-centos7-gcc8-opt/setup.sh; make
COOL
setupATLAS
asetup 21.0.39,Athena
(asetup 21.2.93,AthAnalysis)
COOLR MANAGER UI
https://coolr.web.cern.ch/ext/web/coolrui/index.html
ACE COOL Data Viewer
https://twiki.cern.ch/twiki/pub/Atlas/AfpDcs/ACE_CoolDataViewer_02Feb22.pdf
# setup TDAQ environment for runtime only
source /cvmfs/atlas.cern.ch/repo/sw/tdaq/tools/cmake_tdaq/bin/cm_setup.sh
# starting ACE
source /afs/cern.ch/atlas/project/tdaq/level1/calo/bin/ace.sh
COOL data from online
https://twiki.cern.ch/twiki/bin/view/AtlasComputing/CoolOnlineData
GIT
General
Git global setup
https://atlassoftwaredocs.web.cern.ch/gittutorial/env-setup/#configure-git
git config --global user.name "Name Surname"
git config --global user.email "mail@somewhere.de"
git config --list
To avoid entering credentials anytime
git config credential.helper store
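Note that the store helper keeps the credentials unencrypted in ~/.git-credentials; a sketch of a less permanent alternative is to cache them in memory for a limited time:
git config --global credential.helper 'cache --timeout=3600'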
Manage remote origins
git remote remove origin
git remote add origin https://gitlab.cern.ch/<project-name>
Create a new repository
git clone https://:@gitlab.cern.ch:8443/<user>/<project-name>.git
cd <project-name>
touch README.md
git add README.md
git commit -m "add README"
git push -u origin master
Push an existing folder
cd existing_folder
git init
git remote add origin https://gitlab.cern.ch/<user>/<project-name>.git
git add .
git commit -m "Initial commit"
git push -u origin master
Push an existing Git repository
cd existing_repo
git remote rename origin old-origin
git remote add origin https://gitlab.cern.ch/<user>/<project-name>.git
git push -u origin --all
git push -u origin --tags
View the branches in a Git repository
git branch [-a]
Delete a local branch
git branch -d <the_local_branch>
Checkout the latest version on the remote repository
(
git fetch origin
)
git checkout -b <branch_name> origin/master
GIT commit
git commit -[a]m "My message"
(use the -a option in your commit command to automatically stage modified and deleted tracked files; new untracked files still require git add).
Push a new local branch to a remote git repository link
- create a new branch:
git checkout -b feature_branch_name
- edit, add and commit your files
- push your branch to the remote repository:
git push -u origin feature_branch_name
Creating a fork
Click on "Fork" (upper right)
Click on "Submit"
Ignoring-files
https://linuxize.com/post/gitignore-ignoring-files-in-git/
GITlab CERN "13 TeV 90m" package
git status
git add <filename>
git commit -m "some text"
git push
Data quality
Useful links
Reference runs
- The place where they should be put is
/eos/atlas/atlascerngroupdisk/data-dqm/references/Other/
- Aaron Webb,
aaron.f.webb@gmail.com
Good Runs List
Online DQ
Starting the DQMD remotely (a minimal sketch of the commands follows below), where <partition> could be ATLAS, AFP, etc.
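A minimal sketch of the corresponding commands, assuming a machine with the TDAQ release available (the same setup script and dqm_display command are listed in the TDAQ section below):
source /sw/tdaq/setup/setup_tdaq-09-03-00.sh
dqm_display -p <partition>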
Online DQ algorithms on GITLAB:
Luminosity etc.
https://acode-browser.usatlas.bnl.gov/lxr/source/athena/Control/AthenaMonitoring/AthenaMonitoring/ManagedMonitorToolBase.h
Talks
https://indico.cern.ch/event/831761/contributions/3484227/
->
https://indico.cern.ch/event/831761/contributions/3484227/attachments/1929826/3195970/inductionday_SWC.pdf
Dead/LEff/Noisy pixels
https://indico.cern.ch/event/937750/contributions/3939673/attachments/2072398/3479333/prez.pdf
DQ Run3 Tutorials
DQ algorithms
https://gitlab.cern.ch/atlas/athena/-/tree/master/DataQuality/dqm_algorithms
Release 22 - MT
A collection of the basic DQ algorithms that have already been implemented, and where they reside:
https://twiki.cern.ch/twiki/bin/view/Atlas/Run3DQCodeInGit.
AFP repository on GIT:
https://gitlab.cern.ch/atlas/athena/tree/master/ForwardDetectors/AFP/AFP_Monitoring
Merge requests:
https://gitlab.cern.ch/atlas/athena/-/merge_requests?scope=all&utf8=%E2%9C%93&state=merged&author_username=ndikic
AFP outputs (standalone tests):
CERNbox
AFP DQ code:
mkdir testAFP
cd testAFP/
setupATLAS
lsetup git
git atlas init-config --apply
git atlas init-workdir https://:@gitlab.cern.ch:8443/atlas/athena.git [:@gitlab.cern.ch:8443] -b 21.0
cd athena/
git checkout -b 21.0-AFP-DQM upstream/21.0 --no-track
git atlas addpkg AFP_Monitoring
ls ForwardDetectors/AFP/AFP_Monitoring/
cd ..
mkdir Build
cd Build/
asetup Athena,21.0,latest,slc6
cmake ../athena/Projects/WorkDir/
make
source ./x86_64-centos7-gcc62-opt/setup.sh
AFP DQ strategy (as of 13.08.2019):
- Get some first code running. Song-Ming pointed you to https://twiki.cern.ch/twiki/bin/view/Atlas/DataQuality, where this is also roughly described (right-hand side). Suggested workflow:
- read through the instructions in the example job option
- write an algorithm (subclass AthMonitorAlgorithm) that follows the form of the ExampleMonitorAlgorithm to retrieve a quantity (just port over a calculation from your subsystem’s Run II monitoring code) related to your subsystem from a data file (AOD/ESD/etc)
- write a configuration file that defines a histogram for this quantity and writes it to the output histogram file
- run athena with the JobOption you have created and check the output file for your histogram
- test your ability to change the python configuration (e.g. try changing the output TDirectory structure)
- test your ability to calculate other quantities and use filters in the algorithm
- record what percentage of your histograms is migrated
- for how to get this template code running, the best reference is https://twiki.cern.ch/twiki/bin/view/Atlas/DQRun3FrameworkTutorial (which, I understand, you have already looked at; you tried to follow all the references, which is very admirable, but hardly possible without much practical experience of the ATLAS computing framework).
Inside the section:
https://twiki.cern.ch/twiki/bin/view/Atlas/DQRun3FrameworkTutorial#New_Monitoring_Framework (which roughly explains some of the algorithms) you find a reference to
https://gitlab.cern.ch/atlas/athena/tree/master/Control/AthenaMonitoring
where, in the folders src / AthenaMonitoring / python, you find the files that you would want to rename for AFP DQMonitoring and include in a relevant package in your project folder:
https://gitlab.cern.ch/atlas/athena/tree/master/ForwardDetectors/AFP
(Betty's old code there?).
Instructions to run:
https://twiki.cern.ch/twiki/bin/view/Atlas/DQRun3FrameworkTutorial#Getting_started_with_the_new_mon.
JIRA tickets:
https://its.cern.ch/jira/browse/ATLASDQ-662
.
Release 20.7
Run the ATLAS test web display
Get the DQM tools:
mkdir dq_devel
cd dq_devel
setupATLAS
lsetup git
git atlas init-workdir https://:@gitlab.cern.ch:8443/atlas/athena.git
cd athena
git atlas addpkg DataQualityConfigurations
git fetch upstream
git checkout -b 21.0-my-dq-development upstream/21.0 --no-track
TWiki
AFP
M. Trzebinski (12.06.2019): list of defects and descriptions created, to be uploaded to the DQDefects DB; a specific GRL (AFP special runs) will be created
AFP test runs
2017: 333487, 334350, 335177, 336852,
337176, 337371, 338263, 339037
2018: 350013, 350184, 350682, 350923, 351364, 351894, 352340, 355389, 356124, 357409, 357539, 357821, 358541, 359010, 359191, 359472, 359823, 360026, 360309, 361975
Open Call for Tasks in the Data Quality area
https://twiki.cern.ch/twiki/bin/view/Atlas/DQOpenCalls#Finalization_of_the_AFP_DQ_histo
https://its.cern.ch/jira/browse/ATLASDQ-674
Talks
2021
2020
26/03/2020 AFP soft & sim + DQ Weekly Meeting
12/03/2020 ARP Technical Meeting
20/02/2020 ARP Technical Meeting
06/02/2020 ARP Technical Meeting
23/01/2020 ARP General Meeting
23/01/2020 ARP General Meeting
2017
25/10/2017 Data Quality weekly
24/10/2017 technical AFP meeting (AFP detector, test AFP in ATLAS web display, which histograms to use and how)
10/10/2017 technical AFP meeting (grouping and convention name in COOL, BPM, AFP in ATLAS test web display)
03/10/2017 technical AFP meeting (AFP in general DQ meeting, AFP into ATLAS test web display, COOL update)
26/09/2017 technical AFP meeting (Monitoring status, COOL parameters, AFP runs list, analysis of runs)
25/09/2017 AFP Condition Database meeting (A: variables for COOL, B: data structure)
13/09/2017 ATLAS ALFA/AFP meeting (A: Online and Offline tools, defects, histo and ref histo, remaining steps. B: TPX-3)
29/08/2017 technical AFP meeting (Analysis of run 332303, SiT hit versus luminosity block)
22/08/2017 technical AFP meeting (Online and offline monitoring histograms, analysis of run 332303)
15/08/2017 technical AFP meeting (Online and offline monitoring histograms and analysis status)
08/08/2017 technical AFP meeting (ATLAS monitoring system, Online Histogram Presenter)
01/08/2017 technical AFP meeting (Data Quality Monitoring importance, strategy, new email list)
Test setup
- create a root file with AFP reconstructed objects (G. Gach)
setupATLAS
mkdir ../build && cd ../build
asetup 21.0.39,Athena
mkdir ../source && mv CMakeLists.txt ../source/
cmake ../source
make -j
source x*/setup.sh
mkdir ../run && cd ../run
Reco_tf.py --inputBSFile 'raw_AFP_file' --outputAODFile 'output.root' --outputHISTFile 'output_HIST.root' \
  --AFPOn True --autoConfiguration 'everything' \
  --preExec 'all:rec.doTrigger.set_Value_and_Lock(True);rec.doAlfa.set_Value_and_Lock(False);rec.doForwardDet.set_Value_and_Lock(True);rec.doAFP.set_Value_and_Lock(True);DQMonFlags.doAFPMon=True;from InDetRecExample.InDetJobProperties import InDetFlags;InDetFlags.checkDeadElementsOnTrack.set_Value_and_Lock(True);' 'r2a:from InDetRecExample.InDetJobProperties import InDetFlags; InDetFlags.useDynamicAlignFolders.set_Value_and_Lock(True);from InDetPrepRawDataToxAOD.SCTxAODJobProperties import SCTxAODFlags;SCTxAODFlags.Prescale.set_Value_and_Lock(50);from TrigHLTMonitoring.HLTMonFlags import HLTMonFlags;HLTMonFlags.doGeneral=False;TriggerFlags.AODEDMSet="AODFULL";' \
  --maxEvents 5000 --conditionsTag 'CONDBR2-BLKPA-2017-11' --geometryVersion 'all:ATLAS-R2-2016-01-00-01' --steering 'doRAWtoALL'
- run the ATLAS test web display (R. Narayan, P. Onyisi)
mkdir dq_devel ; cd dq_devel
setupATLAS ; lsetup git
git atlas init-workdir https://:@gitlab.cern.ch:8443/atlas/athena.git
cd athena
git atlas addpkg DataQualityConfigurations
git fetch upstream
git checkout -b 21.0-my-dq-development upstream/21.0 --no-track
mkdir ../build ; cd ../build
asetup 21.0.38
- get the AFP configuration file
cd ../athena/DataQuality/DataQualityConfigurations/config/ ; mkdir AFP
cp /afs/cern.ch/user/c/calpas/www/collisions_run.config AFP
- get the AFP reconstructed root file
cp /PATH/TO/YOUR/NEW/output.HIST.root .
cd ../../../../build
cmake ../athena/Projects/WorkDir ; make -j ; source x*/setup.sh
DQWebDisplay.py ../athena/DataQuality/DataQualityConfigurations/config/output.HIST.root TestDisplay 1
AFP DQ GIT stuff
Merge requests by Nikola
Config
AFP Monitoring
DCSCalculator2
DCSCalculator2
https://twiki.cern.ch/twiki/bin/view/Atlas/DcsCalculator2
COOL folders of AFP DCS
https://twiki.cern.ch/twiki/pub/Atlas/AfpDcs/AFP_COOL_Folders_twiki.pdf
DCSCalculator2 code
https://acode-browser1.usatlas.bnl.gov/lxr/source/athena/DataQuality/DCSCalculator2/python/subdetectors/
TDAQ
General
https://atlasop.cern.ch/twiki/bin/view/Main/Run2Preparation
Setup environment
source /sw/tdaq/setup/setup_tdaq-09-03-00.sh
Start IGUI
setup_daq -p <partition_name> /det/<some_subdet>/<some_path>
Check resources (useful to kill a partition)
start_rm_gui
DQMD display
dqm_display -p ATLAS
To-do list
- events in the xAOD files are not chronologically ordered (the LB number can go backwards from one event to the next)
- efficiency plots: currently ex = px (px = hits in plane x); it should be ex = px/(p0+p1+p2+p3)
- correlation between AFP and central ATLAS: (horizontal) track multiplicity in AFP vs. total energy or total multiplicity
Luminosity with AFP
https://indico.cern.ch/event/954049/contributions/4008725/attachments/2099700/3529788/AFPlum.pdf
Beamspot
Beamspot main page (last 200 runs, otherwise 50 runs by default):
https://atlas-beamspot.cern.ch/webapp/t0Summary/?type=DB_BEAMSPOT&limit=200
"&limit=0" should in principle give all runs, but will likely time out since the resulting page will be too long.
Ntuples:
https://atlas-beamspot.cern.ch/webapp/files/?u=data18_13TeV.00354944.physics_Main/REPRO_BEAMSPOT.c1330_c1154
LHC
Schedule
https://docs.google.com/spreadsheets/d/1NSeaM0Vu9k-vwitbJWg7OuOav3PCriwwbcE92pxvHdY/edit#gid=866593016
LHC Logbook
https://be-op-logbook.web.cern.ch/elogbook-server/#/logbook?logbookId=322&dateFrom=2018-07-04T07%3A00%3A00&dateTo=2018-07-04T15%3A00%3A00
Logbook
Timber
https://timber.cern.ch/query?tab=Variables&name=&description=&extractionTimeZone=UTC&selectedOutput=0&timeRange%5Btype%5D=0&timeRange%5Bbefore%5D=24&timeRange%5BbeforeUnit%5D=hours&timeRange%5Breference%5D%5Btype%5D=now&timeRange%5Breference%5D%5Boffset%5D=Now&autoLoad=false
Wildcard characters are allowed:
LHC.BPM.7%1_B%_DOROS:POS_V
BPM
Example of search string for ALFA station 7L1, beam 1, vertical position:
LHC.BPM.7L1_B1_DOROS:POS_V
ROOT
ROOT tutorial
https://root.cern.ch/root/htmldoc/guides/primer/ROOTPrimer.html#root-macros
Sources
The source code from the ROOT download webpages does not contain the CMakeLists.txt file; the GIT repository works fine instead:
https://root.cern/releases/release-62200/#git
Installation
https://root.cern.ch/building-root
http://tylern4.github.io/InstallRoot/
Seems to work on Lubuntu 16.04 i686 and 18.04 amd64:
Download the source from GIT:
git clone https://github.com/root-project/root.git
cd root
git checkout v6-10-00-patches
or the most recent release:
git clone https://github.com/root-project/root.git
cd root
git checkout -b v6-22-00 v6-22-00
Seems to work on Lubuntu 18.04 amd64:
Required packages:
sudo apt-get install git dpkg-dev cmake g++ gcc binutils libx11-dev libxpm-dev \
libxft-dev libxext-dev
Optional packages:
sudo apt-get install gfortran libssl-dev libpcre3-dev \
xlibmesa-glu-dev libglew1.5-dev libftgl-dev \
libmysqlclient-dev libfftw3-dev libcfitsio-dev \
graphviz-dev libavahi-compat-libdnssd-dev \
libldap2-dev python-dev libxml2-dev libkrb5-dev \
libgsl0-dev libqt4-dev
mkdir build
cd build
cmake path/to/source
cmake --build . -- -jN
where N is the number of available cores.
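For example, to use all available cores (nproc from GNU coreutils reports their number):
cmake --build . -- -j"$(nproc)"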
Setup the environment to run:
$ source /path/to/install-or-build/dir/bin/thisroot.sh
Start ROOT interactive application:
$ root
Python3
(Lubuntu 18.04 64)
sudo apt-get install python3-dev
ROOT setup on lxplus
setupATLAS
localSetupROOT [options] [version]
localSetupROOT --help
Input/output
ROOT input/output
hadd
hadd result.root file1.root file2.root
root -l <file_name>
tree_name->MakeClass("<class_name>")
Open ROOT file, read Tree, make histograms, save in another ROOT file
void readTree(Int_t run = 354826) {
  Char_t sdata[100], sout[100];
  Int_t lbStart, status;
  Double_t sigmaX;
  TBranch *statusBranch = 0, *lbStartBranch = 0, *sigmaXBranch = 0;
  sprintf(sdata, "../Root/data18_13TeV.00%d.physics_Main-REPRO_BEAMSPOT.c1330_c1154.MergeNt-nt.root", run);
  cout << "Input file = " << sdata << "\n";
  TFile *f = new TFile(sdata);
  sprintf(sout, "../Root/beamspot_%d.root", run);
  TFile *fout = new TFile(sout, "RECREATE");
  TH1 *hx = new TH1F("hx", "hx", 2000, 1, 2000);
  TTree *t = (TTree *)f->Get("BeamSpotNt");
  t->SetMakeClass(1);
  t->SetBranchAddress("status", &status, &statusBranch);
  t->SetBranchAddress("lbStart", &lbStart, &lbStartBranch);
  t->SetBranchAddress("sigmaX", &sigmaX, &sigmaXBranch);
  Long64_t nentries = t->GetEntries();
  for ( Long64_t i = 0 ; i < nentries ; i++ ) {
    statusBranch->GetEntry(i);
    if ( status == 59 ) {
      lbStartBranch->GetEntry(i);
      sigmaXBranch->GetEntry(i);
      hx->Fill(lbStart, sigmaX);
    }
  }
  TCanvas *c = new TCanvas("c", "c", 1200, 800);
  hx->SetMarkerStyle(8);
  hx->SetLineWidth(0);
  hx->GetXaxis()->SetTitle("LB");
  hx->GetYaxis()->SetTitle("#sigmax");
  hx->Draw();
  fout->Write();
}
Peaks in histograms
TSpectrum: search by setting the sigma and the amplitude threshold of the peaks:
Int_t Search(const TH1* hist, Double_t sigma = 2, Option_t* option = "", Double_t threshold = 0.05)
Example
#include "TSpectrum.h"
TSpectrum *s = new TSpectrum();
Int_t peak;
...
peak = s->Search(hist, 2.2, "nobackground", 0.12);
cout << "peak: " << peak << "\n";
TChain
TChain *chain = new TChain("ntuple");
chain->Add("//data/scratch/atlas_root_v3/mpx04_20120305_Background_2012_LT_100s_1000.root");
chain->Add("/data/scratch/atlas_root_v3/mpx04_20120327_Background_2012_LT_50s_1000.root");
chain->Add("/data/scratch/atlas_root_v3/mpx04_20120412_VdM_scan_LT_8s_1000.root");
dir
TDirectory *topDir;
m_outputFile = TFile::Open("out_file.root", "RECREATE");
topDir = m_outputFile->mkdir("test");
topDir->cd();
...Save_Hists...
m_outputFile->Write();
m_outputFile->Close();
Root dir
Draw
In an interactive ROOT session:
tree->Draw("variable_name");
where tree is a TTree and variable_name is a leaf.
Conditions:
tree->Draw("variable1:variable2","condition==true");
Vectors
std::vector<double> *myvector = new std::vector<double>;
Loop over vectors
vector<int> vi;
...
for ( int i : vi ) cout << "i = " << i << endl;
External variables
In the header global.h:
// Global variables
#ifndef GLOBAL_H
#define GLOBAL_H
extern Bool_t ext_var;
#endif
In a .C macro or .cpp source file:
#include "global.h"
Bool_t ext_var;
void macro()
{
...
cout << ext_var << "\n";
...
}
Animated GIF
c1->SaveAs("MSet.gif+NN");
Each frame is delayed by NN*10ms.
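An alternative sketch from the shell with ImageMagick, assuming the frames were saved as separate PNG files (frame_*.png and animation.gif are placeholder names; -delay is also in units of 10 ms):
convert -delay NN -loop 0 frame_*.png animation.gif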
Reading histograms from a ROOT file
// adjust path to input analysis rootfile
sprintf(sdata, "file.root");
cout << "Input file = " << sdata << "\n";
TFile *f = new TFile(sdata);
TH1 *h;
f->GetObject("dir1/dir2/histo_name;1",h);
h->Draw();
Clone Tree
copytree example
Fits
Ellipse (for TGraph)
https://root.cern.ch/doc/v606/fitEllipseTGraphRMM_8cxx_source.html
Bigaus
Numerical minimization
https://root.cern.ch/doc/master/NumericalMinimization_8C.html
SSH and SCP stuff
ssh and scp with port specification:
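A minimal sketch with <port>, <user> and <host> as placeholders (ssh uses lowercase -p, scp uses uppercase -P):
ssh -p <port> <user>@<host>
scp -P <port> <file> <user>@<host>:<path>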
Vidyo stuff
sudo dpkg -i ./VidyoDesktopInstaller-ubuntu64-TAG_VD_3_6_3_017.deb
sudo apt install libqt4-designer libqt4-opengl libqt4-svg libqtgui4 libqtwebkit4
mkdir ../prg
gvim ../prg/videbcontrol
chmod 755 ../prg/videbcontrol
../prg/videbcontrol VidyoDesktopInstaller-ubuntu64-TAG_VD_3_6_3_017.deb
sudo dpkg -i ./VidyoDesktopInstaller-ubuntu64-TAG_VD_3_6_3_017.modfied.deb
../prg/videbcontrol VidyoDesktopInstaller-ubuntu64-TAG_VD_3_6_3_017.deb
rm VidyoDesktopInstaller-ubuntu64-TAG_VD_3_6_3_017.modfied.deb
../prg/videbcontrol VidyoDesktopInstaller-ubuntu64-TAG_VD_3_6_3_017.deb
sudo apt install libqt4-designer libqt4-opengl libqt4-svg libqtgui4 libqtwebkit4
sudo apt install libqtgui4 libqtwebkit4
rm VidyoDesktopInstaller-ubuntu64-TAG_VD_3_6_3_017.modfied.deb
../prg/videbcontrol VidyoDesktopInstaller-ubuntu64-TAG_VD_3_6_3_017.deb
sudo apt install libqt4-opengl
sudo apt install libqt4-svg
sudo apt install libqt4gui
sudo apt install libqtgui4
sudo apt install libqtwebkit4
sudo apt install libqt4-network
sudo dpkg -i ./VidyoDesktopInstaller-ubuntu64-TAG_VD_3_6_3_017.modfied.deb
LUMI stuff
Good runs
- 2015: 251103-286474, 286475-287983
- 2016: 289496-314199
- 2017: 324320-341649, 341692-342182
- 2018: 348197-367384
VdM scans - Run-2
- 2015: 277025,277089,280231,280500,280520,286282,287224,287594
- 2016: 299390,300287,301915,301918,309311,309375,310781,312796,313067,313285,313878,313935
- 2017: 324832,324839,325020,329484,330875,335302,336506,339197,340453,340634,340644
- 2018: 354494,365218,365763,365768
LUMI DQ defects
- 2017
- 338377 LUMI_ONL_DET_ERROR_SEVERE 1-528; LUMI_ONL_OLC2HLT_SEVERE 1-575
AFP
AFP Figures
ALFA
Motherboard test
ALFA MB tests TWiki
General documentation
Spreadsheet
https://docs.google.com/spreadsheets/d/1PCt3J78sD6SlAaZ4itmRfV5u7bfJwEjsWS4hXHFASeQ/edit?usp=sharing
DCS
A common user has been created: lumidaq. Please ask the DCS expert for the password.
ssh -Y lumidaq@pcatlrpolab3
alfaDAQ2021
WCCOAui -proj ATLRPOLCS -m gedi &
Detector used:
A7R1U (RP5).
Configure
Switch on the HV
TDAQ
The TDAQ version is relevant: this example is for version 09-03-00; please check with the TDAQ expert which version is currently installed on the testbed (tbed).
ssh -Y <user>@pc-tbed-pub
source /det/rpo/setup/setup_tdaq930_251.sh
setup_daq -p ALFA_251 /det/rpo/databases/tdaq-09-03-00/rpo/partitions/ALFA_251.data.xml &
LED:
- mode: PULSE
- Frequency: 3.0 kHz
- Amplitude: 4.2 V
Trigger (discriminator in NIM):
- mode: PULSE
- Period: 322.580 65 us
- High: 0 V
- Low: -0.5 V
- Width: 322.55 us
- Leading: 2.50 ns
- Trailing: 2.5 ns
Select the leftmost discriminator whose output is connected to the TestTrigger3 input of the LTP.
There are several outputs within the same section. The top-right output should be free: connect it to the scope, together with both outputs from the generator.
Discriminator:
NIM Model 821 Quad 100 MHz Discriminator
At START
- "ERROR RCD-ALFA-251 ers::Message LTP-ALFA@LTPModule::connect() No valid clock detected. Settings will be undetermined.": harmless, can be ignored
Gnam:
- ALFA_251 -> gnam ALFA -> RP_5
Documents
GIT repositories
DCS server at P1
pcatlfwd01
source /det/dcs/linuxScripts/set_env.sh
WinCC OA project: pcatlrpolcs.
Radmons
Scripts on the ALFA LCS (pcatlfwd01):
/det/dcs/Production/ATLAS_DCS_RPO/scripts/radMon.ctl
/det/dcs/Production/ATLAS_DCS_RPO/scripts/libs/RadMonLib.ctl
https://indico.cern.ch/event/472136/contributions/2165711/attachments/1276882/1894886/Radiation_status_2016.pdf
https://indico.cern.ch/event/484369/contributions/1993893/attachments/1226159/1795122/ALFA_radmons_20160211.pdf
https://indico.cern.ch/event/574904/contributions/2335039/attachments/1353890/2045310/Radiation_Oct_2016.pdf
https://indico.cern.ch/event/612336/contributions/2497687/attachments/1423742/2183127/Radiation_Feb_2017.pdf
https://indico.cern.ch/event/574904/contributions/2335039/attachments/1353890/2045310/Radiation_Oct_2016.pdf
https://indico.cern.ch/event/645856/contributions/2644078/attachments/1487452/2310647/Radiation_Radmons_2017.pdf
https://indico.cern.ch/event/645856/contributions/2644079/attachments/1488568/2313303/new_RadMons.pdf
https://indico.cern.ch/event/644125/contributions/2713542/attachments/1522330/2378815/Radiation_Sept_2017.pdf
https://indico.cern.ch/event/710304/contributions/3007162/attachments/1653516/2645869/Radiation_Jan_2018.pptx
https://atlasop.cern.ch/elisa/display/334932
https://atlasop.cern.ch/elisa/display/348971
Analysis
TWiki
ALFA 900 GeV
MC samples (Per-Oleg)
/data4tb/perdata/SpezData/354826
Presentations
General
My presentations
HGTD
Links
Schedule
Risk Register
LV
Off-the-shelf AC/DC converter:
https://aisdb.cern.ch/pls/htmldb_aisdb_prod/f?p=189:14:5220126381618::::P14_PDF:41
PEBs
ATLAS Technical Coordination
Tatiana Klioutchnikova
G&S
Reviews
bPOL12V_V6
Radiation simulation
https://atlas-service-radsim.web.cern.ch/
- 420.00 < r < 470.00 cm, 570.00 < |z| < 600.00 cm
- integrated luminosity of 4000 fb^-1
- Phase 2 ITk Step3.1Q6 geometry model
- Use 10 cm x 10 cm r x z binning
- Include results from FLUGG and GCALOR
Open questions
- water cooling (from LAr?) only for DC-DC converters?
- LV: any (hardware) protection against overvoltage?
HV
Temperature
Pressure
Humidity
Presentations about FOS-LPG:
Demonstrator
Interlock
Modules
Pad size: 1.3 x 1.3 mm^2
Module size: 40 x 20 mm^2
450 pads/module
ELMB
Documents:
ELMB2
ELMB2 radiation tests
ELMB++
- Pros:
- optical fibres, no CANbus, no electrical ground
- (possibly) more radiation hard
- Cons:
- not available at the moment
- ELMB++ monthly meeting
EMCI
Racks
TDR
Comments (DCS-related)
- Markus Elsing: L 358 FLEX cables: number of connected modules limited by FLEX bandwidth?
- Maurice Garcia-Sciveres: calibrations
- Philippe Farthouat:
- sec. 6.4.2 time needed to upload the 1024 configuration registers of an ASIC
- MUX: who is developing; time scale
- ELMB++: availability
- Michel Raymond: L 4832 one PT100 per heater or per cover/vessel?
- Richard Teuscher:
- chap 8, humidity sensors: type, location, read-out software
- sec. 8.3 ELMB2 as backup if ELMB++ not available
- Kevin Einsweiler:
- sec. 6.6: voltage monitoring: automated action in case of high currents or voltage drops or relying only on Interlock triggered by large temperature variations?
- MUX: who is developing; time scale
My remarks:
- HV ramp up/down according to SB condition
- L 2882 total monitoring range is 90°C
- L 2889 proposed->provided
- L 2915 mux->MUX's or multiplexers
DCS
Documentation
https://readthedocs.web.cern.ch/display/ICKB/PVSS+Service+Training+PVSS-JCOPFw+Course+Downloads/
Role administration
AMRM
Server replacement
DCS Server Replacement Procedure
Project administration
source /det/dcs/linuxScripts/set_env.sh
startPA
startConsole
WCCOAui -proj ATLZDC01 -m gedi &
WCCOAui -config /localdisk/winccoa/PROJECTNAME/config/config -p fwDeviceEditorNavigator/fwDeviceEditorNavigator.pnl -iconBar -menuBar
ASCII manager
From the gedi panel, click on SysMgm, then Database > ASCII Manager.
Virtual Machine
xfreerdp -g 800x600
Documents
OPC client
/det/dcs/tools/unifiedautomation/uaexpert.sh
Server -> Add... -> Custom Discovery
The items must be moved from Address Space to Data Access View.
The server address can be found in (Gedi) SysMgm -> OPC Driver -> OPC UA Client, Connection URL/URI, e.g. opc.tcp://pcatlzdc01.cern.ch:4901.
Online PDF editors
- In the search field write the name of the area/access point
- click on the (i) icon
Solar powered Stirling engine
Motors for parabolic antenna
https://www.eurosat-online.it/i-principali-motori-per-parabola-in-commercio/
How to make Stirling engine
https://duckduckgo.com/?q=build+diy+solar+stirling+generators+solar+stirling+power+freedom&t=newext&atb=v1-1&iax=videos&ia=videos&pn=1&iai=https%3A%2F%2Fwww.youtube.com%2Fwatch%3Fv%3D7gQuD3sjcrA
http://www.c-turbines.ch/frameset.html
https://diystirlingengine.com/stirling-engine-generator/
CERN Print Service
--
DavideCaforio - 2019-04-10