Please read the manual here: asc_arcond.
source /share/grid/app/asc_app/asc_rel/1.0/setup-script/set_asc.sh
cp -r /users/chakanau/public/Arcond/example .
cd example/arc
It is a good idea to check the availability of your data in the ArCond static database. As an example, you can list all files on all boxes as:
arc_ls [path to your dataset]
All data are arranged on several disks (/data1, /data2, /data3, /data4, /data5, /dataxrd) of the farm. You can check all data available on a specific disk (“/data1”) as
arc_ls -s /data1
(“-s” means “summary”). This prints a summary of all files on the /data1 disk. You can list all files as
arc_ls /data1
or data located in the subdirectory “/data1/mc12_8TeV”:
arc_ls /data1/mc12_8TeV
If you are looking for a specific MC, pipe the output to “grep”:
arc_ls -d /data1/mc12_8TeV | grep 147915.Pythia8_AU2CT10
(here we are looking for the MC set “147915.Pythia8_AU2CT10”).
If you want to list only directories, use this syntax:
arc_ls -d /data1/mc12_8TeV
This is especially useful if you want to check which runs are available.
Similarly, you can search for data:
arc_ls -d /data1/data12_8TeV
One can also use pattern matching, similar to the Linux “grep”. For example, to show only AOD files, use:
arc_ls -e "AOD"
The directory:
/users/chakanau/public/Arcond
contains several examples showing how to run C++ code without input data (example_nodata), how to run Pythia8 on many cores (example_pythia8), and a simple example running over D3PD (example). There are also examples showing how to run Athena jobs (which is the main goal of the ArCond package).
Look at the example
/users/chakanau/public/Arcond/example
It has two directories: “arc” (the submission directory) and “package” (the actual C++ code which reads D3PD).
Go to “package” and look at the example “main.cxx”. This is a simple program which produces an “Analysis.root” file and reads an input list of ROOT files located in “data.in”. Run this example as: “make; ./main”. You will see the output “Analysis.root” with a debug histogram.
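A minimal sketch of this local test, written out line by line (the last command is an assumption and only works if a ROOT environment is set up):
cd package
make                     # build ./main from main.cxx
./main                   # reads the ROOT files listed in data.in and writes Analysis.root
root -l Analysis.root    # optional: inspect the debug histogram (assumes ROOT is available)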
Now we will submit this program to the farm, where “data.in” will be rebuilt on each node (it uses the script Analysis.py located in this directory). First, you will need to set up ArCond as “cd ../; source s_asc”.
Then go to “arc” and look at “arcond.conf”. It should at least specify the input data on the farm and the location of the “package” directory. For the default configuration, it points to /users/chakanau/public/Arcond/example/package.
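A minimal sketch of what such a configuration might look like; the key names below are illustrative assumptions (check the comments in the shipped arcond.conf for the exact syntax), only the two settings themselves are taken from this page:
# illustrative fragment of arcond.conf (key names are assumptions)
input_data  = /data1/mc12_8TeV                                # data to process on the farm (example path from above)
package_dir = /users/chakanau/public/Arcond/example/package   # location of the “package” directory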
Now, submit the “package” to the farm. Type “arcond” and then answer “s”, “y”, “y”, “y” at the prompts. The jobs will be sent. Check their status with “condor_q”.
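The submission step, written out as a sketch (the prompt answers are exactly those listed above):
cd arc        # the submission directory
arcond        # answer “s” to submit, then “y”, “y”, “y” at the prompts
condor_q      # check the status of the submitted Condor jobs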
When the jobs are done, collect the outputs with “arc_add”. This program will check all “Job/*” directories for the presence of “Analysis.root”. All these ROOT files will then be merged into “Analysis_all.root”. Check the sequence of the execution inside “user/ShellScript_BASIC.sh”.
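A short sketch of collecting and inspecting the merged output (the last command is an assumption and requires a ROOT environment):
arc_add                     # scans Job/*/ for Analysis.root and merges the files
ls -l Analysis_all.root     # the merged output file
root -l Analysis_all.root   # optional: inspect the combined histograms (assumes ROOT is available)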
— Sergei Chekanov 2011/03/09 17:44