hepsim:usage_truth
====== Analysis of truth (EVGEN) files ======

HepSim can be used for plotting any distribution or differential cross section using truth-level (EVGEN) files.
Many HepSim MC samples include *.py scripts to calculate differential cross sections. You can run
them using downloaded ProMC files (in which case you pass the directory with *.promc files as an argument).
The second approach is to run *.py scripts on a local computer without downloading ProMC files beforehand.
In this case, data will be streamed to the computer's memory.
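Whatever language you choose, the last step of such a script is always the same: the raw histogram counts are normalized to a differential cross section. Here is a minimal sketch in plain Python; the bin counts, bin width, and cross-section value are invented for illustration only:

```python
# Minimal sketch: turn raw histogram counts into a differential cross
# section dsigma/dpT. All numbers below are invented for illustration.

counts = [120, 80, 40, 10]   # events per pT bin
bin_width = 25.0             # GeV per bin (uniform binning assumed)
n_events = 10000             # total number of generated events
sigma_pb = 500.0             # total cross section of the sample, in pb

# dsigma/dpT [pb/GeV] per bin: N_bin * sigma / (N_total * delta_pT)
dsigma = [c * sigma_pb / (n_events * bin_width) for c in counts]
print(dsigma)
```

The same normalization is what the HepSim *.py scripts apply before drawing the final plot.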
+ | |||
+ | You can create plots using a number of programming languages, Java, Python, C++, Ruby, Groovy etc. Plots can be done on any platform, without modifying your system. C++ analysis programs require ROOT and Linux. | ||
+ | |||
+ | Below we will discuss how to analyse EVGEN data using Java, since this approach works on any | ||
+ | platform (Linux, Mac, Windows) and does not require installation of any platform-specific program. | ||
+ | As before, make sure that [[http:// | ||
+ | |||
====== Running scripts in a batch mode ======

You can run validation scripts in a batch mode as:

<code bash>
wget https://
hs-run ttbar_mg5.py
</code>

Another approach is to use the Jas4pp or DMelt programs, which
give more flexibility and more libraries for analysis.
In this example, we will run a Python script that downloads data from a URL into the computer's memory.

Here is how to process the analysis using Jas4pp:

<code bash>
wget https://
tar -zvxf jas4pp.tgz
cd jas4pp
source ./setup.sh   # takes 5 sec for first-time optimization
wget https://
fpad ttbar_mg5.py   # process it in a batch mode
</code>

Look at the output files created by this script.
+ | |||
+ | Similarly, you can use a more complex [[https:// | ||
+ | <code bash> | ||
+ | wget -O dmelt.zip http:// | ||
+ | wget https:// | ||
+ | unzip dmelt.zip | ||
+ | ./ | ||
+ | </ | ||
(The first time it will run slower, since it needs to rescan jar files.) In this example, the data at the URL (given inside ttbar_mg5.py) will be downloaded to the computer's memory.
You can also pass the URL with data as an argument and limit the calculation to 10000 events:
+ | |||
+ | <code bash> | ||
+ | ./ | ||
+ | </ | ||
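Internally, limiting the calculation to a fixed number of events simply means stopping the event loop early. A hypothetical sketch of this pattern in plain Python (the event source is simulated by a range; a real script would iterate over ProMC events instead):

```python
# Hypothetical sketch: cap an analysis loop at max_events.
# 'events' stands in for the stream of MC events read from a URL or file.

def analyze(events, max_events=10000):
    processed = 0
    for _ in events:
        if processed >= max_events:
            break
        processed += 1   # ... fill histograms here ...
    return processed

print(analyze(range(50000)))   # stops after max_events
```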
+ | |||
+ | If you want to see a pop-up canvas with the output histogram on your screen, change the line " | ||
+ | |||
+ | |||
====== Using Java WebStart ======

Many validation scripts can also be launched using Java Web Start.

To use Java Web Start, you should configure Java permissions:

====== Running over downloaded files ======

The above approach depends on network availability at the time when you do the analysis.
It is more convenient to download data files first and then run over the data.
If all your ProMC files are in a local directory, process the
example code from the DataMelt directory:
+ | |||
+ | <code bash> | ||
+ | ./ | ||
+ | </ | ||
+ | |||
+ | |||
Here is a complete example: we download data to a local directory,
then we download the analysis script, and then we run this script over the local data using 10000 events:

<code bash>
wget https://
hs-get https://
./dmelt.sh ttbar_mg5.py <directory with data> 10000
</code>
+ | |||
+ | Similarly, | ||
+ | |||
====== Using an editor ======

You can perform a short validation analysis using an editor as:

<code bash>
hs-ide ttbar_mg5.py
</code>
Run this code by pressing the run button of the editor.
+ | |||
+ | For Jas4pp, you can start an editor, correct the script, and run it: | ||
+ | |||
+ | <code bash> | ||
+ | ./jaspp ttbar_mg5.py # Open the script in the editor | ||
+ | </ | ||
+ | Then, use the right mouse button and select "Run Python" | ||
+ | |||
+ | |||
+ | For the DMelt IDE, you can also bring up a full-featured GUI editor as this: | ||
+ | |||
+ | <code bash> | ||
+ | ./ | ||
+ | </ | ||
+ | |||
+ | It will open the Python script for editing. Next, run this script by clicking the image of green running man on the status bar (or press [F8]). | ||
+ | |||
====== Using GUI URL dialogue ======

If you use DMelt, you can run this code using a more conventional editor:

<code bash>
./dmelt.sh   # start the DMelt IDE
</code>

On Windows, click "dmelt.bat" to start the IDE.
Then copy the URL link of the *.py file using the right mouse button ("Copy URL Location") and read it via the [File]-[Read script from URL] menu.
When processing is done, you will see a pop-up window with the distribution.

If you already have the analysis file, you can load it into the editor as:

<code bash>
./dmelt.sh ttbar_mg5.py
</code>

and run it using the run button (or [F8]).

====== Looking at separate events ======

You can look at a separate event and create a 2D lego plot for it using Python and Java.
First, download any ProMC file, i.e.:
<code bash>
wget https://
</code>

Then, execute this script in DMelt, which fills a 2D histogram with final-state particles:

<hidden Click here to view the Python code>
<code python etaphi.py>
from java.lang import *
from proto import FileMC
from jhplot import *                    # import DMelt graphics
from hephysics.particle import LParticle

EventToLook=10                          # event to look at

# eta-phi histogram (the binning below is illustrative)
h1= H2D("Eta-Phi",40,-4.0,4.0,40,0.0,6.3)
flist=["file.promc"]                    # name of the downloaded ProMC file

file=FileMC(flist[0])
header = file.getHeader()
un=float(header.getMomentumUnit())      # conversion units
lunit=float(header.getLengthUnit())
eve = file.read(EventToLook)
pa = eve.getParticles()
pi2=2*3.14
for j in range(pa.getPxCount()):
  if (pa.getStatus(j)==1):              # final-state particles only
     p=LParticle(pa.getPx(j)/un,pa.getPy(j)/un,pa.getPz(j)/un,pa.getE(j)/un)
     pt=p.perp()
     phi=p.phi()
     eta=p.getEta()
     e=p.e()
     if (phi<0): phi=pi2+phi
     if (pt>0.2): h1.fill(eta,phi,e)    # illustrative pT cut; weight by energy

c1 = HPlot3D("Canvas")
c1.setColorMode(1)
c1.visible(1)
c1.setBars()
c1.setBoxed(0)
c1.setNameX("&eta;")
c1.setNameY("&phi;")
c1.setNameZ("Energy")
c1.draw(h1)
</code>
</hidden>
+ | |||
+ | The execution of this script will bring-up a window with the lego plot. | ||
+ | |||
(Screenshot: the 2D eta-phi lego plot of the selected event.)
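For reference, the kinematic quantities used by the script above (transverse momentum, pseudorapidity, and the azimuthal angle mapped to [0, 2 pi)) can be computed in plain Python without any DMelt classes. This is a sketch of the standard formulas, not the DMelt implementation:

```python
import math

def kinematics(px, py, pz):
    """Return (pt, eta, phi) with phi mapped to [0, 2*pi).

    Valid for massive particles with |pz| < |p| (pseudorapidity
    diverges for momenta exactly along the beam axis).
    """
    pt = math.hypot(px, py)                      # transverse momentum
    p = math.sqrt(px * px + py * py + pz * pz)   # momentum magnitude
    eta = 0.5 * math.log((p + pz) / (p - pz))    # pseudorapidity
    phi = math.atan2(py, px)                     # azimuthal angle
    if phi < 0:
        phi += 2 * math.pi                       # map to [0, 2*pi)
    return pt, eta, phi

print(kinematics(1.0, 1.0, 0.0))   # pt=sqrt(2), eta=0, phi=pi/4
```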
+ | |||
+ | You can also look at a simple " | ||
+ | |||
+ | < | ||
+ | {{: | ||
+ | </ | ||
+ | |||
+ | The code written in Python is attached: | ||
+ | |||
+ | < | ||
+ | {{: | ||
+ | </ | ||
+ | |||
+ | ====== Reading NLO predictions ====== | ||
+ | |||
+ | |||
+ | NLO records are different from showered MC. There is much less information available on particles, | ||
+ | and events have weights. In many cases, PDF uncertainties are included. | ||
+ | Here is an example how to read outputs from the MCFM program generated on BlueGene/Q at ANL. | ||
+ | |||
+ | |||
<code bash>
hs-view https://
</code>

or:
<code bash>
wget https://
hs-view gamma100tev_0000000.promc
</code>
+ | |||
+ | Click the event number ('left pannel" | ||
+ | |||
+ | < | ||
+ | {{: | ||
+ | </ | ||
+ | |||
+ | The scripts that reconstruct cross sections are attached to the HepSim event repository. | ||
+ | Here is how to run a script to reconstruct a direct-photon cross section at NLO QCD using DatMelt: | ||
+ | |||
+ | |||
+ | You can also open a script as: | ||
+ | <code bash> | ||
+ | wget https:// | ||
+ | ./dmelt.sh gamma_jetphox.py | ||
+ | </ | ||
+ | |||
+ | Alternatively, | ||
+ | |||
+ | NLO event record includes 4-momenta of particles and event weights (double values). In addition, deviations form central weights are included as an array of integer values as: | ||
+ | |||
+ | {{: | ||
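As an illustration of how such a record can be used, the sketch below rebuilds varied event weights from a central weight plus integer deviations. This is only a sketch: the actual ProMC encoding, in particular the unit of the stored integers, must be taken from the file metadata; the value 1e-6 below is an assumption made for illustration.

```python
# Sketch only: reconstruct PDF-variation weights from a central weight
# plus integer deviations. The 1e-6 encoding unit is an ASSUMED value;
# the real unit must be read from the ProMC file header.

SCALE = 1e-6   # assumed unit of the stored integer deviations

def variation_weights(central, deviations, scale=SCALE):
    """Return the list of varied event weights."""
    return [central * (1.0 + d * scale) for d in deviations]

# central weight 2.5 with no shift, +12% and -8% relative deviations
print(variation_weights(2.5, [0, 120000, -80000]))
```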
+ | |||
+ | You can calculate differential cross sections using online files using this example: | ||
+ | |||
+ | <code bash> | ||
+ | mkdir Higgs; cd Higgs; | ||
+ | wget https:// | ||
+ | wget -O dmelt.zip http:// | ||
+ | unzip dmelt.zip | ||
+ | ./ | ||
+ | </ | ||
+ | |||
+ | This example runs " | ||
+ | |||
+ | |||
+ | |||
+ | |||
====== Using C++/ROOT ======

To read downloaded ProMC files using C++/ROOT, you need to install the ProMC package. Check the environment variables:

<code bash>
echo $ROOTSYS
echo $PROMC
</code>
They should point to the installation paths. If you use CERN's lxplus or AFS, one can simply set up ProMC as:
<code bash>
source /
</code>
which is built for x86_64-slc6-gcc48-opt.
+ | |||
+ | Then, look at examples: | ||
+ | |||
+ | <code bash> | ||
+ | $PROMC/ | ||
+ | $PROMC/ | ||
+ | $PROMC/ | ||
+ | </ | ||
+ | |||
+ | The same example directory shows how to write ProMC writes and convert to other formats. | ||
+ | |||
You can generate analysis code in C++, Java and CPython from a ProMC file with an unknown data layout.
Here is an example for an NLO file:

<code bash>
wget https://
promc_proto ggd_mu1_45_2000_run0_atlas50.promc
promc_code
make
</code>
+ | |||
+ | This creates directories with the C++/ | ||
+ | For a longer description, | ||
+ | |||
+ | For C++/ROOT, you can use {{: | ||
+ | Untar it and compile using " | ||
+ | all ProMC files in a given directory and fills ROOT histograms with cross sections. | ||
+ | |||
+ | Use this Doxygen description to work with C++: | ||
+ | |||
+ | * [[https:// | ||
+ | * [[https:// | ||
+ | | ||
+ | |||
Please look at other HepSim pages for more examples, and refer to the ProMC documentation for details.

Also, there is a simple example showing how to read Monte Carlo files from HepSim in a loop,
build anti-kT jets using FastJet, and fill ROOT histograms. Download it as:

<code bash>
wget https://
cd hepsim-cpp/
make
</code>
Read the "README" file inside this package for more details.

====== Conversion to ROOT files ======

The ProMC files can be converted to ROOT files using the "promc2root" example from the ProMC
directory. ROOT files will be about 30-50% larger (and their processing takes more CPU time). Please refer to
the ProMC documentation for details.

<code bash>
cp -rf $PROMC/examples/promc2root .
cd promc2root
make
./promc2root input.promc output.root   # argument order may differ; see the example source
</code>
+ | |||
+ | ====== Converting to LCIO ====== | ||
+ | |||
+ | ProMC files can be converted to LCIO or STDHEP | ||
+ | files for full detector simulations. | ||
+ | |||
+ | Note that the converters are included inside the ProMC package (see the directory " | ||
+ | |||
+ | < | ||
+ | <code bash> | ||
+ | wget https:// | ||
+ | tar -zvxf ProMC.tgz | ||
+ | cd examples/ | ||
+ | source setup.sh | ||
+ | javac promc2lcio.java | ||
+ | java promc2lcio file.promc file.slcio | ||
+ | </ | ||
+ | </ | ||
+ | |||
+ | The last command creates | ||
+ | |||
+ | |||
====== Converting to other formats ======
Look at other directories in "examples" of the ProMC package.
====== Extracting events ======
+ | |||
+ | A file can be reduced in size by extracting N events as this: | ||
+ | |||
+ | < | ||
+ | hs-extract signal.promc N | ||
+ | </ | ||
+ | where signal.promc is the original file, and N is the number of events to extract. | ||
+ | |||
+ | ====== Comparing MC and data ====== | ||
+ | |||
+ | HepSim maintains analysis scripts that can be used for comparing Monte Carlo simulations with data from [[http:// | ||
+ | For example, click the link with [[http:// | ||
+ | * Navigate to " | ||
+ | * Start DMelt if you did not yet, and select [File]-[Read script from URL]. Copy and paster the URL link from the HepData database | ||
+ | * Click " | ||
+ | |||
+ | HepData maintain Jython scripts that use the same syntax as HepSim. You can start from a HepSim validation script, and before the " | ||
+ | |||
+ | |||
+ | |||
+ | ====== XML output format ====== | ||
+ | |||
+ | Many scripts of HepSim create SVG images and a cross platform | ||
+ | [[https:// | ||
+ | |||
<hidden Click here to view the Python code>
<code python>
# Convert a "jdat" XML file to standard Python lists.
# This can be used for converting data to PyROOT.
# NOTE: the file name and the XML tag names below are assumptions;
# adjust them to the actual content of your jdat file.

from xml.dom import minidom

xmldoc = minidom.parse('output.jdat')             # name of the jdat file
itemlist = xmldoc.getElementsByTagName('object')  # assumed tag of data records
print(len(itemlist))

def child_text(node, tag):
    """Text content of the first child element with the given tag."""
    return node.getElementsByTagName(tag)[0].childNodes[0].data

alldata = {}
for staff in itemlist:
    sid = child_text(staff, "id")
    title = child_text(staff, "title")
    size = child_text(staff, "size")
    dimension = child_text(staff, "dimension")
    values = child_text(staff, "values")
    print("Read: id=", sid, " title=", title, " size=", size, " dimension=", dimension)
    ary = []
    for line in values.splitlines():
        line = line.strip()
        if not line:
            continue
        ary.append([float(x) for x in line.split()])
    alldata[sid] = [title, size, dimension, ary]

# print all attributes of one record (use an id that exists in your file)
# print(alldata["some_id"])
</code>
</hidden>
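Since the layout of a given jdat file may differ, here is a self-contained variant of the same parsing logic applied to an inline XML fragment. The tag names are illustrative, not the actual HepSim schema:

```python
# Parse a tiny XML fragment in an assumed jdat-like layout.
# Tag names ("object", "id", "title", "values") are ILLUSTRATIVE only.
from xml.dom import minidom

XML = """<jhplot>
  <object>
    <id>h1</id><title>pT spectrum</title>
    <values>
      1.0 10
      2.0 20
    </values>
  </object>
</jhplot>"""

doc = minidom.parseString(XML)
data = {}
for obj in doc.getElementsByTagName("object"):
    def text(tag):
        # text content of the first child element with this tag
        return obj.getElementsByTagName(tag)[0].firstChild.data
    rows = [[float(v) for v in line.split()]
            for line in text("values").splitlines() if line.strip()]
    data[text("id")] = (text("title"), rows)

print(data["h1"])
```

The resulting plain lists can then be filled into PyROOT or matplotlib objects.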
+ | |||
+ | ====== Programming with HepSim ====== | ||
+ | |||
+ | |||
+ | |||
+ | Please look at the [[: | ||
+ | |||
+ | Send comments to: --- // | ||
hepsim/usage_truth.txt · Last modified: 2024/07/01 21:25 by 127.0.0.1