Simulating the Full Chain of Many Events with CMS Software from Scratch

These are instructions for setting up your work environment and simulating many physics events by submitting jobs to the grid. Make sure you have followed the instructions on getting grid credentials first!


Setting up a version of CMS Software

This document uses the version 3_5_4, but substitute whichever version you want. To see which versions are available to check out into your directory:

scramv1 list CMSSW 
  1. Put this line into your .bash_profile (if it’s not already there):

    source /afs/hep.wisc.edu/cms/setup/bashrc 
    
  2. Open a new terminal (to load your edited .bash_profile)

  3. Create a work area for your version of CMS software:

     scramv1 project CMSSW CMSSW_3_5_4 
    
  4. Make an Analysis directory:

     mkdir ~/CMSSW_3_5_4/src/Analysis 
    
  5. Go to that directory and set up your runtime environment:

     cd ~/CMSSW_3_5_4/src/Analysis/
     cmsenv 
    

Creating Events

Out of date - unless you really want to create events, most kinds of physics have already been simulated; you just need to find the right dataset in DAS. Skip this section and see Creating Analyzer below to learn how to analyze events.
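For instance, DAS can be searched from the command line as well as the web interface. A hedged sketch, assuming the dasgoclient tool is available on your machine (the dataset pattern here is only a made-up illustration, and the actual query is shown commented out because it needs a grid proxy and network access):

```shell
# Example DAS query: list AODSIM datasets matching a (hypothetical) pattern.
QUERY='dataset=/QCD_Pt*/*/AODSIM'
# dasgoclient --query="$QUERY"
echo "$QUERY"
```

Adjust the pattern to the physics process you are after, then copy the dataset name it returns into your analysis configuration.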

  1. Your configuration file describing what events to create should be in a directory such as:

     ~/CMSSW_3_5_4/src/Analysis/exampleConfig.cfg 
    

    You can use this exampleConfig.cfg as a template. Change the pythiaExample section, at least.

  2. Be in the correct folder and set up the environment:

     cd ~/CMSSW_3_5_4/src/Analysis/
     cmsenv 
    
  3. Get a valid grid proxy (make it valid for some number of hours):

     voms-proxy-init -valid 96:00
    
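As an optional sanity check (an aside, not part of the original recipe), voms-proxy-info reports how much lifetime your proxy has left. A sketch, with the grid commands shown commented out since they need the grid environment:

```shell
# Request a 96-hour proxy, then check its remaining lifetime.
# voms-proxy-init -valid 96:00
# voms-proxy-info --timeleft    # prints the remaining lifetime in seconds

# 96 hours expressed in seconds, for comparison with the --timeleft output:
REQUESTED_SECONDS=$((96 * 3600))
echo "$REQUESTED_SECONDS"
```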
  4. Now, run a script to farm out the jobs to Condor. This script needs 5 parameters, for example:

     farmoutRandomSeedJobs AnyName 100000 500 ~/CMSSW_3_5_4/ ~/CMSSW_3_5_4/src/Analysis/exampleConfig.cfg
    

    This will create 100,000 events with 500 events in each root file. You can find out more about this farmout script here.
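To spell out the five parameters, here is a sketch annotated with the example values from the command above (the submission itself is shown commented out since it only works on the site's login machines):

```shell
# The five farmoutRandomSeedJobs arguments, annotated:
JOB_NAME=AnyName                                      # label; output lands under this name in HDFS
TOTAL_EVENTS=100000                                   # total number of events to generate
EVENTS_PER_JOB=500                                    # events per job (one root file each)
CMSSW_AREA=~/CMSSW_3_5_4/                             # your CMSSW work area
CONFIG=~/CMSSW_3_5_4/src/Analysis/exampleConfig.cfg   # config describing the events

# Number of condor jobs this implies:
NUM_JOBS=$((TOTAL_EVENTS / EVENTS_PER_JOB))
echo "$NUM_JOBS"   # 200 jobs, each writing one 500-event root file

# The actual submission:
# farmoutRandomSeedJobs $JOB_NAME $TOTAL_EVENTS $EVENTS_PER_JOB $CMSSW_AREA $CONFIG
```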

  5. Type condor_q to see your jobs in the queue safe and sound. You can keep tabs on them that way.

  6. When your jobs finish, your files will all be located in a directory in what is known as HDFS,

     /hdfs/store/user/YourUserName/AnyName/ 
    

    To delete, rename, or move files in HDFS you must first type:

     ssh yourLogin@login.hep.wisc.edu
     voms-proxy-init
     gsido
     cd /hdfs/store/user/
    

    The gsido command gives you a shell running as the same Unix account that owns your files in HDFS. You can then cd to your directory and do what you wish with your files.
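For example, a sketch of a typical cleanup session; YourUserName and AnyName are the placeholders used above, and the commented commands only work inside a gsido shell at the site:

```shell
# Build the output path from the placeholders used in this document.
USER_NAME=YourUserName   # placeholder: your HDFS username
JOB_NAME=AnyName         # placeholder: the name passed to farmoutRandomSeedJobs
OUTPUT_DIR=/hdfs/store/user/$USER_NAME/$JOB_NAME/
echo "$OUTPUT_DIR"

# Inside the gsido shell the usual commands apply, e.g.:
# cd /hdfs/store/user/$USER_NAME/
# mv $JOB_NAME ${JOB_NAME}-v1    # rename a finished batch
# rm -r ${JOB_NAME}-old          # delete an obsolete one
# ls "$OUTPUT_DIR" | wc -l       # count the output root files
```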


Creating Analyzer

  1. Create a place for an analyzer program:

     cd ~/CMSSW_3_5_4/src/Analysis/
     cmsenv
     mkedanlzr SimpleAnalyzer 
    

    That creates an “Analyzer” directory with a few subdirectories.

  2. Files were created in the above step; you need to replace/edit these files:

     ~/CMSSW_3_5_4/src/Analysis/Analyzer/src/SimpleAnalyzer.cc
     ~/CMSSW_3_5_4/src/Analysis/Analyzer/BuildFile 
    

    Replace them with whatever you wish. Note that SimpleAnalyzer.cc will have code that is typically (in CMSSW) split into a “.h” file. For an example of some analyzers, see QcdPhotonsDQM.cc and QcdPhotonsDQM.h. An example BuildFile is also available.

    So, just open each file with pico and copy and paste in the contents of the new files I just listed.

  3. Go to your Analysis directory and set up the environment:

     cd ~/CMSSW_3_5_4/src/Analysis/
     cmsenv
    
  4. Go into the Analyzer subdirectory and compile your analyzer:

     cd ~/CMSSW_3_5_4/src/Analysis/Analyzer/
     scramv1 b 
    
  5. Put simpleanalyzer.cfg into:

     ~/CMSSW_3_5_4/src/Analysis/ 
    

    See the next section on running your Analyzer.