====== Quick start ======

<code bash>
bash   #  set bash if you haven't done this before
wget https://atlaswww.hep.anl.gov/hepsim/soft/hs-toolkit.tgz -O - | tar -xz;
source hs-toolkit/setup.sh
</code>

This creates the directory "hs-toolkit" with the HepSim commands. You can also download it as [[https://atlaswww.hep.anl.gov/hepsim/soft/hs-toolkit.tgz | hs-toolkit.tgz]]. Note that
[[https://www.java.com/en/download/|Java 8]] or later must be installed.
You can view the available commands from the bash shell by typing:
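For example, a minimal sketch (assuming the toolkit was unpacked into the "hs-toolkit" directory created above; the scripts may also sit in a subdirectory such as "bin"):

<code bash>
# list the HepSim scripts shipped with the toolkit
ls hs-toolkit/
</code>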
  
The directory contains several bash scripts for Linux/Mac, as well as Windows batch (BAT) files for processing events on Windows.
The package is used to download, view and analyze truth-level events in the [[https://atlaswww.hep.anl.gov/asc/promc/ | PROMC]] or [[https://github.com/proio-org/ | PROIO]] format.
  
<note tip>
Use the [[https://atlaswww.hep.anl.gov/asc/jas4pp | JAS4PP program]] for analyzing LCIO
(*.lcio) files with Geant4 simulations.
This program can also be used for truth-level records in the [[https://atlaswww.hep.anl.gov/asc/promc/|PROMC]], [[https://github.com/proio-org/| PROIO]] and [[https://root.cern/|ROOT]] file formats. You can also use the Delphes/ROOT framework for ROOT files.
</note>
  
  
Let us show how to find the files associated with a given Monte Carlo event sample.
Go to the [[https://atlaswww.hep.anl.gov/hepsim/|HepSim database]] and find the "Files" column.
It shows the URLs of truth-level files ("EVGEN"), i.e. files directly created by event generators.
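From the command line, the same files can be listed with "hs-ls"; a short sketch, using the dataset name "tev100pp_higgs_ttbar_mg5" that appears later in this guide:

<code bash>
# list the files that belong to a dataset, given its HepSim name
hs-ls tev100pp_higgs_ttbar_mg5
</code>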
  
Similarly, one can use the download URL:
<code bash>
hs-ls https://mc.hep.anl.gov/asc/hepsim/events/pp/100tev/higgs_ttbar_mg5/
</code>
Note that in this approach, you can use a URL mirror close to your geographical location.
  
<code bash>
hs-ls [name] simple     > input.list     # make a list of ProMC files (without the URL path)
hs-ls [name] simple-url > input_url.list # make a list with URLs from the main server
</code>
where [name] is the name of the dataset. You can also use a URL if you want to create a list of files from a particular (mirror) server.
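For example, a sketch combining the two options above (assuming "hs-ls" accepts the "simple" keyword together with a mirror URL):

<code bash>
# build a file list from a specific (mirror) server instead of the main one
hs-ls https://mc.hep.anl.gov/asc/hepsim/events/pp/100tev/higgs_ttbar_mg5/ simple > input.list
</code>
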
====== Searching for datasets ======
  
  
The best method to find the needed sample is to use the [[https://atlaswww.hep.anl.gov/hepsim/search.php|database search]] page.

Enter "rfull" in the search field, and you will see all samples with full-simulation tags. Enter "rfast", and you will see the samples with fast-simulation tags.
  
  
  * [[https://atlaswww.hep.anl.gov/hepsim/list.php?find=rfull]] - lists MC samples after full simulation
  * [[https://atlaswww.hep.anl.gov/hepsim/list.php?find=rfast]] - lists MC samples after fast simulation
  * [[https://atlaswww.hep.anl.gov/hepsim/list.php?find=mg5]] - lists all Madgraph5 samples
  * [[https://atlaswww.hep.anl.gov/hepsim/list.php?find=higgs%rfast]] - lists Higgs samples after fast simulation
  
If you prefer the command-line approach, you can find the URL that corresponds to a dataset using this command:
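A hedged sketch of such a search (assuming the toolkit ships a command named "hs-find"; check the scripts inside "hs-toolkit" for the exact name and options):

<code bash>
# search the HepSim database for datasets matching a keyword
hs-find higgs
</code>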
  
<code>
hs-get https://atlaswww.hep.anl.gov/hepsim/info.php?item=2 data
</code>
  
  
<code>
hs-get https://mc.hep.anl.gov/asc/hepsim/events/pp/100tev/higgs_ttbar_mg5 data
</code>
All these examples will download all files from the "tev100pp_higgs_ttbar_mg5" event sample.
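Once the download has finished, the files can be checked locally; a small sketch (the file name below is hypothetical, and it is assumed that "hs-info", described later in this guide, also accepts local paths):

<code bash>
ls -lh data/                                   # list the downloaded ProMC files
hs-info data/tev100pp_higgs_ttbar_mg5_1.promc  # inspect one of them (hypothetical file name)
</code>
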
<code>
hs-get tev13pp_higgs_pythia8_ptbins  data 3 10 pt100_
</code>
where the name is [[https://atlaswww.hep.anl.gov/hepsim/info.php?item=92|tev13pp_higgs_pythia8_ptbins]].

The command downloads files to the "data" directory in 3 threads. The maximum number of downloaded files is 10, and only files whose names contain the string "pt100_" (i.e. pT>100 GeV) are taken.
 where "NNN" is a version number.  where "NNN" is a version number. 
 You can identify detector geometries that correspond to the tags using  [[http://atlaswww.hep.anl.gov/hepsim/detectors.php|detector description page]]. You can identify detector geometries that correspond to the tags using  [[http://atlaswww.hep.anl.gov/hepsim/detectors.php|detector description page]].
-For example, [[http://atlaswww.hep.anl.gov/hepsim/info.php?item=15|tev100pp_ttbar_mg5]] +For example, [[https://atlaswww.hep.anl.gov/hepsim/info.php?item=15|tev100pp_ttbar_mg5]] 
 sample includes the link "rfast001" (Delphes fast simulation, version 001). To download the reconstructed events for the reconstruction tag "rfast001", use this syntax:  sample includes the link "rfast001" (Delphes fast simulation, version 001). To download the reconstructed events for the reconstruction tag "rfast001", use this syntax: 
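A sketch of the expected form (assuming the reconstruction tag is appended to the dataset name with "%", as in the search patterns shown above; the dataset page shows the exact command):

<code bash>
# download the rfast001 (Delphes) reconstruction of the tev100pp_ttbar_mg5 sample
hs-get tev100pp_ttbar_mg5%rfast001 data
</code>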
  
As before, one can also download the files using the URL:
<code bash>
hs-ls https://mc.hep.anl.gov/asc/hepsim/events/pp/100tev/ttbar_mg5/rfast001/ # list all files
hs-get https://mc.hep.anl.gov/asc/hepsim/events/pp/100tev/ttbar_mg5/rfast001/ data
</code>
  
  
<code bash>
hs-info https://mc.hep.anl.gov/asc/hepsim/events/pp/14tev/pythia8_higgs2mumu/tev14_pythia8_h2mm_1.promc
</code>
  
  
<code>
hs-info https://mc.hep.anl.gov/asc/hepsim/events/pp/14tev/pythia8_higgs2mumu/tev14_pythia8_h2mm_1.promc 100
</code>
  
<code>
hs-view [promc file]
</code>
This command brings up a GUI window to look at separate events. You should forward X11 to see the GUI. For Windows: download the file [[https://atlaswww.hep.anl.gov/asc/hepsim/hepsim.jar|hepsim.jar]] and click on it. Then open the file via [File]-[Open file].
  
  
  
<code>
hs-view https://mc.hep.anl.gov/asc/hepsim/events/pp/14tev/pythia8_higgs2mumu/tev14_pythia8_h2mm_1.promc
</code>
Here we looked at one file of the [[https://mc.hep.anl.gov/asc/hepsim/events/pp/14tev/pythia8_higgs2mumu/|Pythia8 Higgs→μμ sample]].
====== Monte Carlo logfile ======
  
Each ProMC/ProIO file includes a logfile from the Monte Carlo generator. You can show this file on the screen with:

<code>
hs-log [file]
</code>
where [file] is either a ProMC or ProIO file (a URL can be used instead of a path on the local computer).

In the case of ProMC files, one can also use standard Linux commands, such as "unzip":
  
<code>
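# a ProMC file is a zip archive, so standard zip tools work on it directly
# (hypothetical sketch: "file.promc" stands for any downloaded ProMC file)
unzip -l file.promc     # list the records and metadata stored inside the file
</code>
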
In this example, we will run a Python (more exactly, Jython) script and, at the same time, stream data from the web.
Find a HepSim event sample by clicking the "Info" column.
For example, look at a ttbar sample from Madgraph: [[https://atlaswww.hep.anl.gov/hepsim/info.php?item=15|ttbar_mg5]].
Find the URL of the analysis script ("ttbar_mg5.py") located at the bottom. Copy it to some folder, or use "wget":
  
<code bash>
wget https://mc.hep.anl.gov/asc/hepsim/events/pp/100tev/ttbar_mg5/macros/ttbar_mg5.py
</code>
  
Then we download the analysis script, and run this script over the local data using 10000 events:
<code>
hs-get https://mc.hep.anl.gov/asc/hepsim/events/pp/100tev/ttbar_mg5 ttbar_mg5
hs-run ttbar_mg5 10000
</code>
The above example has some limitations, since it uses a rather simple editor.
Another approach is to use the full-featured [[http://atlaswww.hep.anl.gov/asc/jas4pp/|Jas4pp]] or
[[https://datamelt.org|DataMelt]] programs, which give more flexibility.
<code bash>
wget -O dmelt.zip http://jwork.org/dmelt/download/current.php;
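# hypothetical completion of the setup: unpack the archive
# (the commands below assume it creates the ./dmelt directory)
unzip -o dmelt.zip
</code>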
You can also pass a URL with data as an argument and limit the calculation to 10000 events:
<code bash>
./dmelt/dmelt_batch.sh ttbar_mg5.py https://mc.hep.anl.gov/asc/hepsim/events/pp/100tev/ttbar_mg5/ 10000
</code>
  
As before, you can use the batch mode with downloaded ProMC files.
Let us assume that we put all ProMC files into the directory "data". Then run [[https://datamelt.org/|DataMelt]] over the data as:

<code bash>
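# a sketch mirroring the URL example above, but pointing to the local "data" directory;
# an event limit (e.g. 10000) can be appended as before
./dmelt/dmelt_batch.sh ttbar_mg5.py data
</code>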
Then we download the analysis script, and run this script over the local data using 10000 events:
<code>
hs-get https://mc.hep.anl.gov/asc/hepsim/events/pp/100tev/ttbar_mg5 ttbar_mg5
./dmelt/dmelt_batch.sh ttbar_mg5.py ttbar_mg5 10000
</code>
  
 Then click "run" (or [F8]).  Then click "run" (or [F8]). 
-One can also start  [[http://jwork.org/dmelt/|DataMelt]] without input files: +One can also start  [[https://datamelt.org|DataMelt]] without input files: 
 <code> <code>
 ./dmelt/dmelt.sh ./dmelt/dmelt.sh
 </code> </code>
 on Linux/Mac. On Windows, run "dmelt.bat" instead. You will see the DatMelt IDE. on Linux/Mac. On Windows, run "dmelt.bat" instead. You will see the DatMelt IDE.
-Locate  an URL location of the analysis script, such as [[http://atlaswww.hep.anl.gov/hepsim/info.php?item=15|ttbar_mg5]] (can be found under "Info" link). Then copy this link using the right mouse button ("Copy URL Location"). +Locate  an URL location of the analysis script, such as [[https://atlaswww.hep.anl.gov/hepsim/info.php?item=15|ttbar_mg5]] (can be found under "Info" link). Then copy this link using the right mouse button ("Copy URL Location"). 
 Next, in the DMelt menu, go to "File"→"Read script from URL". Copy the URL link of the *.py file to the pop-up DataMelt  Next, in the DMelt menu, go to "File"→"Read script from URL". Copy the URL link of the *.py file to the pop-up DataMelt 
URL dialog and click "run". The program will start reading data from the Web. At the end of the run, you will see a pop-up window with the result.
There is a simple example showing how to read multiple Monte Carlo files from HepSim,
build anti-kT jets using FastJet, and fill ROOT histograms. Download the
[[https://atlaswww.hep.anl.gov/asc/hepsim/soft/hepsim-cpp.tgz|hepsim-cpp package]] and compile it:
  
<code>
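# hypothetical sketch of the build steps: fetch the package, unpack and compile
# (assumes the archive unpacks into ./hepsim-cpp and that ROOT and FastJet are set up)
wget https://atlaswww.hep.anl.gov/asc/hepsim/soft/hepsim-cpp.tgz -O - | tar -xz
cd hepsim-cpp
make
</code>
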
See the Delphes documentation about how to read Delphes ROOT files.
  
You can find all samples that contain fast simulations using [[https://atlaswww.hep.anl.gov/hepsim/list.php?find=rfast|this link]].
  
===== Full simulation: LCIO files =====
      
  
You can find all samples that contain full simulations using [[https://atlaswww.hep.anl.gov/hepsim/list.php?find=rfull|this link]].
  
  
  
The Durham [[http://durpdg.dur.ac.uk/|HepData]] database maintains "DMelt" scripts compatible
with [[https://atlaswww.hep.anl.gov/hepsim/|HepSim]] analysis scripts, so it is relatively easy to overlay Monte Carlo predictions and data from published articles.
For example, look at the link [[http://durpdg.dur.ac.uk/view/ins1253852|AAD 2013]] from [[http://durpdg.dur.ac.uk/|HepData]] and
download a "DMelt" Jython script with published data.