asc:tutorials:2014october [2014/10/29 12:58] asc [Lesson 5: Running a job on multiple cores]
asc:tutorials:2014october [2014/10/29 21:17] (current) asc [xAOD tutorial at ANL (October 28-29, 2014)]
However, the lessons given below are simplified for a faster start. Also, the last two lessons are designed for the ANL cluster, which uses Condor for job submission.
In addition, we will test US ATLAS connect as explained at the bottom of this page.

The agenda of this tutorial is at [[https://
<code bash>
mkdir lesson_3; cd lesson_3
source setup.sh
rcSetup -u; rcSetup Base,2.0.12
rc find_packages
</code>
<note tip>If the program fails saying that some shared library has a wrong format, clean ROOTCORE with "rc clean" and recompile.</note>
====== Lesson 4: Filling histograms ======
Now we will make a number of changes to the above program.
<code bash>
mkdir lesson_4; cd lesson_4
</code>
Then set up the ATLAS environment:

<code bash>
source setup.sh
rcSetup -u; rcSetup Base,2.0.12
rc find_packages
</code>
<code bash>
cd MyAnalysis/
</code>
<note warning>
Now prepare an input file with data from a directory with xAOD files:
<code bash>
python Make_input <
</code>
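The Make_input script itself is not shown on this page; conceptually it writes the paths of the xAOD ROOT files from a chosen directory into a plain-text list, one file per line. A hypothetical shell equivalent (all directory and file names below are invented for illustration) could look like:

<code bash>
# Hypothetical stand-in for Make_input; every name here is illustrative.
XAOD_DIR=/tmp/xaod_demo                               # assumed data directory
mkdir -p "$XAOD_DIR"
touch "$XAOD_DIR/file1.root" "$XAOD_DIR/file2.root"   # stand-in xAOD files
ls "$XAOD_DIR"/*.root > inputs.txt                    # one path per line
cat inputs.txt
</code>

The resulting text file is what the analysis job reads as its list of input datasets.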
How does this work? Your analysis code is testRun.cxx. We pass "
The actual analysis should be put in "
1) We linked several ATLAS packages.

<code>
</code>

in "

2) Then we modified 2 places to put histograms:

<code python>
MyAnalysis/
Root/
</code>
The output of this example is in "
Now we run the above job on multiple cores.
The execution of this program does not use anything from ATLAS. It uses basic Linux commands.
Prepare a fresh directory:
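The commands themselves are cut off in this revision view. By analogy with lessons 3 and 4, they presumably start like the following (the directory name lesson_5 is an assumption, not taken from this page):

<code bash>
mkdir lesson_5; cd lesson_5   # name assumed, by analogy with lesson_3/lesson_4
</code>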
We have made a few changes in this lesson compared to lesson 4.
For example, we changed '
small script "
The output will go to the directory "
Now let us launch 2 jobs using 2 cores. Each job will read a portion of the original "

<code bash>
./A_RUN
</code>
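The A_RUN script is not reproduced on this page. As a rough sketch of what such a launcher does — split the input list into two chunks and start one background process per chunk — consider the following, where every file name is invented and a simple wc -l stands in for the real testRun job:

<code bash>
# Illustrative 2-core launcher in the spirit of A_RUN; names are invented.
printf '%s\n' f1.root f2.root f3.root f4.root > inputs.txt  # demo input list
mkdir -p inputs outputs
split -n l/2 -d inputs.txt inputs/list.  # -> inputs/list.00, inputs/list.01
for f in inputs/list.*; do
    id=${f##*.}
    # the real script would run something like: testRun $id
    ( wc -l < "$f" > "outputs/out.$id" ) &
done
wait   # block until both background jobs finish
</code>

GNU split's -n l/2 option divides the file into two pieces without breaking lines, so each job sees a whole number of input files.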
You can monitor jobs with this command (launched in a separate terminal):
<code bash>
</code>
When the jobs are done, you can merge the output files to "hist.root":
<code bash>
hadd -f hist.root outputs/
</code>
If it does not work, debug it as:

<code bash>
testRun 00
</code>

The command runs one job using the input list inputs/
(Typically, a failure is due to a wrong location of the goodrunlist.)

If you run this program a second time, clean the output directory first:

<code bash>
rm -rf outputs/*
</code>

(ROOTCORE does not like existing output directories.)
**Attention:**
<code bash>
cd submit
arc_ls -s /
arc_ls
</code>
When the jobs are done, the output files will be inside the "Jobs" directory. Merge the ROOT outputs into one file as:
<code bash>
</code>
(This may fail on certain Tier3s.)
====== US ATLAS connect ======
All lessons discussed on this wiki have been adapted by Ilija Vukotic for use with ATLAS connect.
See the instructions: {{:
====== Using Eclipse to develop ATLAS code ======

[[asc: