community:hepsim:usage_fast [2016/04/28 18:33] asc
====== Fast detector simulation ======
====== Downloading ROOT files ======

Files after fast detector simulation can be downloaded from HepSim. Assuming that you have installed either **hs-tools** or **Jas4pp**, you can download such files directly.
Here is how you can find all samples that have a given reconstruction tag. For a tag, say "rfast004", use this command:

<code bash>
hs-find tev14_mg5%rfast004
</code>

which searches for all samples with the dataset name "tev14_mg5" and the reconstruction tag "rfast004". Then download a selected sample as:

<code bash>
hs-get tev14_mg5_nlo_httbar%rfast004
</code>

===== On-the-fly reconstruction =====

HepSim can be used to create ROOT files after fast detector simulation. Alternatively, one can analyse events after the Delphes fast simulation program on the fly. The latter approach allows end-users to make changes to the detector geometry and, at the same time, perform an analysis. The output ROOT file only includes histograms defined by the user (but you can also add a custom ROOT tree). To do this, use the [[http://...|FastHepSim]] package, which includes the Delphes fast simulation. You can find a description of the [[https://...]].
Follow these steps:

<code bash>
wget http://.../FastHepSim.tgz
tar -zvxf FastHepSim.tgz
cd FastHepSim/
./...
</code>

This installs the package. Then set up the environment:

<code bash>
cd ..           # go to the root directory
source setup.sh
</code>
Next, go to the analysis example:

<code bash>
cd analysis
make
</code>
- | |||
- | This compiles the analysis program (analysis.cc) that fills jetPT and muonPT histograms. | ||
- | Now we need to bring data from [[http:// | ||
- | we will copy data to the " | ||
- | |||
- | <code bash> | ||
- | hs-get http:// | ||
- | </ | ||
- | This copies 3 files in 2 threads and put them to the directory " | ||
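The hs-get step above fetches several files concurrently. A minimal sketch of that thread-pool download pattern, using only the Python standard library, is shown below; the file names are hypothetical and the network fetch is stubbed out, so this illustrates the pattern rather than the actual hs-tools code:

```python
from concurrent.futures import ThreadPoolExecutor

# Hypothetical file names standing in for the files of a HepSim sample.
FILES = ["sample_0001.root", "sample_0002.root", "sample_0003.root"]

def fetch(name):
    # A real downloader would issue an HTTP GET here and write the file
    # to the output directory; we only simulate a completed transfer.
    return (name, "done")

# Two worker threads, matching the "3 files in 2 threads" example above.
with ThreadPoolExecutor(max_workers=2) as pool:
    results = list(pool.map(fetch, FILES))

print(len(results))  # 3
```

Note that pool.map preserves the input order, so the results line up with the input list even when the transfers finish out of order.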
<code bash>
./...
./analysis delphes_card_FCC_basic_notau.tcl histo.root inputdata.txt
</code>

The first command creates the file "inputdata.txt" with the list of input data files. The second command runs the analysis and writes "histo.root"
with the output histograms. This example uses the "delphes_card_FCC_basic_notau.tcl" detector card.
Note that we have removed tau reconstruction in this card.
If you want to access other objects (photons, electrons, b-jets),
use the [[https://...]] documentation.
You can put external files into the src/ directory, where they will be found by the Makefile.
If you still want to look at the event structure in the form of a ROOT tree, run the usual Delphes command:
<code bash>
../...
</code>
where output.root will contain all reconstructed objects. In this case, add "..." to the tcl file.
If the input file contains the complete (non-slimmed) record, one can add "..." as well.
Try also the more sophisticated detector-geometry cards:

  * examples/...
  * examples/...

Note that "..."
====== A note for ANL cluster ======

For the ANL cluster, you do not need to install Delphes. Simply run the reconstruction as:
<code bash>
source /...
$DELPHES/...
</code>
The cards are located in the $DELPHES/... directory.
If you want to run over multiple ProMC files without manual download, use this command:

<code bash>
java -cp hepsim.jar hepsim.Exec DelphesProMC delphes.tcl output.root [URL] [Nfiles]
</code>

where [URL] is the HepSim location of the files and [Nfiles] is the number of files to process.
The output ROOT file will be located inside the "..." directory.
Here is a small example:
<code bash>
java -cp hepsim.jar hepsim.Exec DelphesProMC delphes.tcl output.root http://... 5
</code>
which processes 5 files from [[http://...]].
Skip "[Nfiles]" to process all files.
====== Converting to LCIO ======

ProMC files can be converted to LCIO files for full detector simulation:

<code bash>
wget http://.../ProMC.tgz
tar -zvxf ProMC.tgz
cd examples/...
source setup.sh
javac promc2lcio.java
java promc2lcio file.promc file.slcio
</code>

Look at the other directories in "examples/" for further conversion examples.
====== Record slimming ======

Particle records from generators based on LO/NLO matrix elements
(PYTHIA, HERWIG, MADGRAPH) are often "slimmed" to reduce file sizes. When the
records are slimmed, the following selection is used:

<code python>
(status == 1 && pT > 0.3 GeV) ||   # keep final states with pT > 0.3 GeV
(PID == 5 || PID == 6)        ||   # keep b and top quarks
(PID > 22 && PID < ...)       ||
(PID > 10 && PID < ...)
</code>

where PID is the absolute value of the particle code. Leptons and neutrinos are also affected by the slimming pT cut.
Note: for 100 TeV collisions, the pT cut is increased from 0.3 to 0.4 GeV.
For NLO calculations with a few partons + PDF weights, the complete event records are stored.
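The slimming selection can be written out as a small Python predicate. This is only an illustration, not HepSim code: the two PID windows used below (22 < PID < 38 and 10 < PID < 17) are assumed values for the ranges that are truncated on this page, and the lepton window is combined with the pT cut because the text states that leptons and neutrinos are also affected by it:

```python
def keep_particle(status, pt, pid, pt_cut=0.3):
    """Return True if a generator particle survives slimming (sketch).

    The PID windows are assumptions; the exact ranges are truncated
    in the source text.
    """
    apid = abs(pid)                                   # PID is the absolute particle code
    return ((status == 1 and pt > pt_cut)             # final states above the pT cut
            or apid in (5, 6)                         # b and top quarks
            or 22 < apid < 38                         # assumed boson window
            or (10 < apid < 17 and pt > pt_cut))      # leptons, also subject to the pT cut

# A soft final-state pion (PID 211) is dropped; a Z boson (PID 23) is kept.
print(keep_particle(status=1, pt=0.1, pid=211))  # False
print(keep_particle(status=2, pt=0.0, pid=23))   # True
```

For 100 TeV samples one would call the predicate with pt_cut=0.4, following the note above.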
When slimming is applied, file sizes are reduced by a factor of 2-3. In some situations, slimming
can affect the detector simulation. For example, you should turn off tau reconstruction in Delphes when slimming is used.

====== Analysing ROOT files ======

Read the [[https://...]].