ArCond job submission to the ANL ATLAS PC farm
Please read the ArCond manual here: asc_arcond
Checking available data
It is a good idea to check the availability of your data in the ArCond static database. As an example, you can list all files on all boxes with:
arc_ls DATASET
Example:
arc_ls mc08.106379.PythiaPhotonJet_AsymJetFilter.recon.AOD.e347_s462_r541
You can check all the data available on a specific disk (“/data1”) with:
arc_ls /data1
or the data in the subdirectory “/data1/MonteCarlo”:
arc_ls /data1/MonteCarlo/
If you want to list only directories, use this syntax:
arc_ls -d /data1/
This is especially useful if you want to check which runs are available.
To generate a summary of all files on all nodes, use this:
arc_ls -s /data1/
You can also use pattern matching, similar to the Linux “grep”. For example, to show only AOD files:
arc_ls -e "AOD"
Examples running ArCond
The directory:
/users/chakanau/public/Arcond
contains examples showing how to run C++ code without input data (example_nodata), how to run Pythia8 on many cores (example_pythia8), and a simple example running over D3PD (example). There are also examples of how to run Athena jobs (which is the main goal of the ArCond package).
Arcond Example running over D3PD jobs
Look at the example
/users/chakanau/public/Arcond/example
It has two directories: “arc” (the submission directory) and “package” (the actual C++ code which reads the D3PD).
Simple C++ code
Go to “package” and look at the example “main.cxx”. This simple program reads an input list of ROOT files located in “data.in” and produces an “Analysis.root” file. Run this example with “make; ./main”. You will see the output “Analysis.root” with a debug histogram.
Submitting data to the farm
Now we will submit this program to the farm, where “data.in” will be rebuilt on each node (using the script Analysis.py located in this directory). First, you will need to set up ArCond: “cd ../; source s_asc”.
Then go to “arc” and look at “arcond.conf”. At a minimum, it should specify the input data on the farm and the location of the “package” directory. The default configuration points to /users/chakanau/public/Arcond/example/package.
Now submit the “package” to the farm: type “arcond” and answer “s”, “y”, “y”, “y”. The jobs will be sent. Check their status with “condor_q”.
Getting the output files back
When the jobs are done, collect the outputs with “arc_add”. This program checks all “Job/*” directories for the presence of “Analysis.root” and merges these ROOT files into “Analysis_all.root”. The sequence of execution can be seen inside “user/ShellScript_BASIC.sh”.
— Sergei Chekanov 2011/03/09 17:44