Simulations

From EIC
The EIC task force has a large number of simulation tools available for investigating different types of physics processes. Unless noted otherwise, these can be accessed from /afs/rhic.bnl.gov/eic/PACKAGES.
 
== Event Generators ==
  
 
The following event generators are available:
  
*ep
**[[DJANGOH]]: (un)polarised DIS generator with QED and QCD radiative effects for NC and CC events.
**[[Gmc_trans|gmc_trans]]: a generator for semi-inclusive DIS with transverse-spin- and transverse-momentum-dependent distributions.
**[[LEPTO]]: a leptoproduction generator, used as the basis for PEPSI and DJANGOH.
**[[LEPTO-PHI]]: a version of LEPTO with the "Cahn effect" (azimuthal asymmetry) implemented.
**[[MILOU]]: a generator for deeply virtual Compton scattering (DVCS), the Bethe-Heitler process and their interference.
**[[PYTHIA]]: a general-purpose high-energy physics event generator.
**[[PEPSI]]: a generator for polarised leptoproduction.
**[[RAPGAP]]: a generator for deeply inelastic scattering (DIS) and diffractive ''e + p'' events.
  
*eA
**[[BeAGLE]]: Benchmark eA Generator for LEptoproduction (under construction), a generator to simulate ep/eA DIS events including nuclear shadowing effects (based on DPMJetHybrid).
**[[DPMJet]]: a generator for very low Q^2/real-photon physics in eA.
**[[DpmjetHybrid|DPMJetHybrid]]: a generator to simulate ep/eA DIS events by employing PYTHIA within DPMJet.
**[http://code.google.com/p/sartre-mc/ Sartre]: an event generator for exclusive diffractive vector meson production and DVCS in ep and eA collisions, based on the dipole model.
  
There is code provided to convert the output from most of these generators into a ROOT format. It is distributed as part of [[Monte_Carlo_and_Smearing|eic-smear]], the Monte Carlo smearing package.
 
  
== Detector simulations ==
  
 
The following programmes are available for simulating detector geometry and response:
  
*[[Monte_Carlo_and_Smearing|eic-smear]]: a package for applying very fast detector smearing to Monte Carlo events.
More details on detector simulations can be found [https://wiki.bnl.gov/eic/index.php/Detector here].
  
== Manuals ==
  
See the pages of the programmes listed above for their documentation. Other useful references are:
 
  
 
*[http://www.phenix.bnl.gov/WWW/publish/elke/EIC/Files-for-Wiki/Manuals/basesSpring.pdf BASES/SPRING v1] and [http://www.phenix.bnl.gov/WWW/publish/elke/EIC/Files-for-Wiki/Manuals/basesSpring5.1.pdf v5.1]: Cross section integration and Monte Carlo event generation. Used in Rapgap and MILOU.
  
== Helpful/Important Links ==
 
 
The following pages provide useful general information for Monte Carlo simulations:
  
*MC programs:
 
**A [http://www.desy.de/~heramc/mclist.html list] of Monte Carlo programmes
 
**[http://www.hepforge.org/ HepForge], a high-energy physics development environment, which includes many Monte Carlo generators.
 
**Lecture slides from a course on [http://www.desy.de/~jung/qcd_and_mc_2009/ QCD and Monte Carlos]
*Radiative Correction Codes:
 
**A discussion of [http://www.jlab.org/RC/ Radiative corrections]
*Parton Distribution Function Interfaces:
 
**[http://projects.hepforge.org/lhapdf/ LHAPDF], the Les Houches Accord PDF Interface. Currently installed version 5.9.1.
 
***The 64-bit libraries are at /afs/rhic.bnl.gov/eic//lib
***Alternatively, 32-bit libraries are at /afs/rhic.bnl.gov/eic/lib32
***The PDF-Grids can be accessed via /afs/rhic.bnl.gov/eic/share/lhapdf/PDFsets. These should be found automatically if using the system-default version of LHAPDF.
 
**The users' manual of the [http://www.phenix.bnl.gov/WWW/publish/elke/EIC/Files-for-Wiki/Manuals/pdflib.pdf CERN PDFLIB]
  
== MC Analysis Techniques ==

===== How to get a cross section =====
 
 
To normalize your counts to a cross section you need two pieces of information:

*the total number of trials: all of our MCs print it to the screen/logfile when they finish
*the total integrated cross section (in general in microbarn): likewise printed to the screen/logfile at the end of the run
'''Counts = Luminosity x Cross Section'''

To convert counts to a cross section:

  ==> counts * total integrated cross section / total number of trials
To calculate the corresponding MC luminosity:

  ==> total number of trials / total integrated cross section
There are some handy ROOT functions available to get the total number of trials, the total integrated MC cross section and the total number of events in the tree. These work on the Pythia, Pepsi, Djangoh and Milou [https://wiki.bnl.gov/eic/index.php/BuildTree event-wise ROOT trees].
*total number of trials:

  TObjString* nEventsString(NULL);
  file.GetObject("nEvents", nEventsString);
*total integrated MC cross section:

  TObjString* crossSectionString(NULL);
  file.GetObject("crossSection", crossSectionString);
*total number of events in the tree:

  TTree* tree(NULL);
  file.GetObject("EICTree", tree);
  
===== How to scale the MC luminosity to the luminosity we want for the measurement =====

Very often it is impossible to generate so many events that the MC luminosity would correspond to one month of eRHIC running. In that case we generate enough MC events that all distributions are smooth, and scale the uncertainties. The factor needed is the ratio '''lumi-scale-factor = eRHIC-luminosity / generated MC luminosity'''. Given this factor there are two ways to scale:

*scaling the counts in the histogram:
 
 
  h11->Scale(lumi-scale-factor);

This scales the number of counts in each bin of the histogram to what you would get for the eRHIC luminosity; statistical uncertainties can then be calculated simply as sqrt(counts).
 
  
*scaling the statistical uncertainties only:

  sqrt(counts)/sqrt(lumi-scale-factor)
  
 
===== Example: reduced cross section =====

This example shows how to calculate the reduced cross section needed to extract F_2 and F_L, and how to scale the statistical uncertainties to a certain integrated luminosity.
 
  sigma_reduced = prefactor * dsigma/dx/dQ2 with prefactor = Q^4 * x / (2*pi*alpha_em^2*(1+(1-y)^2))

This cross section would have the unit barn * GeV^2; to make it dimensionless you need a conversion factor from barn to 1/GeV^2 (hbar^2*c^2 = 0.3894 millibarn * GeV^2).
  
  sigma_reduced = counts(x,Q^2) * prefactor * total integrated MC cross section / total number of trials / conversion-barn-to-GeV / x-binsize / Q2-binsize

If the ROOT function Scale was used, the statistical uncertainty is

  delta sigma_reduced = sqrt(counts(x,Q^2)) * prefactor * total integrated MC cross section / total number of trials / conversion-barn-to-GeV / x-binsize / Q2-binsize

in the other case it is

  delta sigma_reduced = sqrt(counts(x,Q^2)) * prefactor * total integrated MC cross section / total number of trials / conversion-barn-to-GeV / x-binsize / Q2-binsize / sqrt(lumi-scale-factor)

'''Attention:''' all luminosities and cross sections must be in the same unit (pb or fb or ...).

Revision as of 14:50, 17 April 2019
