CRAB

CRAB at UW-HEP CMS

Here is the main CRAB documentation: CRAB.

Quick Recipe for users

We assume you have already gone through getting a UW-HEP account, a grid certificate, etc.

1) Typical setup:
ssh weinberg@login01.hep.wisc.edu
mkdir /scratch/weinberg
cd /scratch/weinberg     # must be in /scratch!
scramv1 project CMSSW CMSSW_3_5_8_patch3
cd CMSSW_3_5_8_patch3/src/
cmsenv
kinit weinberg@CERN.CH
cvs co your/analyzer  # (or create your analyzer)
scramv1 b
cd your/analyzer

2) Create a crab.cfg here.  See the example in the section below.

3) Load CRAB at Wisconsin:
source /cms/sw/CRAB_2_10_2p3/crab.sh

4) Create the jobs:
crab -create

5) Submit the jobs:
crab -submit

6) Wait, then check on your jobs:
crab -status

7) If they're working, the output ROOT files should be in your HDFS directory:
 /hdfs/store/user/weinberg/
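# (Quick check; assumes HDFS is FUSE-mounted at /hdfs as shown above)
ls -lh /hdfs/store/user/$USER/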

8) Optional: to retrieve the log and error files, run
crab -getoutput

9) Optional: to get the integrated luminosity of the data you ran over
crab -report
# CRAB returns something like:
# Luminosity section summary file: /scratch/mbanderson/.../res/lumiSummary.json
# Give the full path to that file to lumiCalc.py:
lumiCalc.py -c frontier://LumiProd/CMS_LUMI_PROD -i res/lumiSummary.json recorded

For information on creating an analyzer, see Creating Analyzer.

Figure out which dataset you want to run over using DAS.
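
For example, a wildcard dataset search can be run from the command line (a sketch: the das_client.py script and its --query/--limit options are assumed to be available in your environment; the same query string works in the DAS web interface):

das_client.py --query="dataset=/MinBias/*900GeV*/GEN-SIM-RECO" --limit=0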

# Note: to continue submitting CRAB jobs for a particular workflow to Condor, use:
crab -submit 1 -continue crab_0_$DATE_$TIME

We recommend using the remoteGlidein scheduler. Other options are the glidein server and gLite.
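
In crab.cfg this is controlled by the scheduler key in the [CRAB] section; for example, to switch from remoteGlidein to gLite you would change one line (see the full example below):

scheduler              = glite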

Example CRAB cfg file

This is a typical crab.cfg file.

[CRAB]
jobtype                = cmssw
scheduler              = remoteGlidein
use_server             = 0

[CMSSW]
datasetpath            = /MinBias/Summer09-STARTUP3X_V8K_900GeV-v1/GEN-SIM-RECO
#datasetpath            = /MinimumBias/BeamCommissioning09-Dec19thReReco_336p3_v2/RECO
pset                   = test/SinglePhotonAnalyzer_cfg.py
total_number_of_events = -1
events_per_job         = 40000
output_file            = SinglePhotonAnalyzer.root

[USER]
copy_data              = 1
return_data            = 0
storage_element        = T2_US_Wisconsin
user_remote_dir        = ./
check_user_remote_dir  = 0

[GRID]
rb                     = CERN
proxy_server           = myproxy.cern.ch
virtual_organization   = cms
retry_count            = 0

#se_white_list          = T2_US_Caltech, T2_US_UCSD, T2_US_Wisconsin
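
With these settings, total_number_of_events = -1 tells CRAB to read the entire dataset, and events_per_job = 40000 controls the splitting; for example, a dataset with 2,000,000 events would be split into 50 jobs.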

Advanced Set-up

Put the following anywhere in your .bash_profile:

# CMSSW & CRAB Version
export CMSSW_VER=CMSSW_3_3_1
export CRAB_VER=CRAB_2_10_2p3

# Load CMSSW environment settings
# (assumes the CMSSW release area is in your home directory;
#  adjust the path if yours lives in /scratch)
cd ${CMSSW_VER}/src
cmsenv
cd ~

# Load CRAB environment
source /cms/sw/${CRAB_VER}/crab.sh
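
A slightly safer variant (a sketch of the same commands, guarded so that a missing release area doesn't break your login):

# Only set up CMSSW if the release area actually exists
if [ -d ~/${CMSSW_VER}/src ]; then
    cd ~/${CMSSW_VER}/src
    cmsenv
    cd ~
fi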

Now, whenever you log in, you can just do:

cd /path/to/your/analysis/workdir

# Check Jobs status in local Condor queue.
condor_q

# Check Job status in CRAB
crab -status

# Retrieve the output
crab -getoutput

CRAB at CERN

# Login to CERN
ssh user@lxplus.cern.ch

# Load the CMS Environment
source /afs/cern.ch/cms/LCG/LCG-2/UI/cms_ui_env.csh

# Start your grid proxy
voms-proxy-init -voms cms

# Load up CRAB
source /afs/cern.ch/cms/ccs/wm/scripts/Crab/CRAB_2_1_1/crab.csh
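
These are the csh versions; if your login shell is bash, the corresponding .sh scripts should exist alongside them (an assumption worth verifying; it follows the usual CMS convention):

# Load the CMS environment (bash)
source /afs/cern.ch/cms/LCG/LCG-2/UI/cms_ui_env.sh

# Load up CRAB (bash)
source /afs/cern.ch/cms/ccs/wm/scripts/Crab/CRAB_2_1_1/crab.sh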

If your grid certificate isn't installed at CERN but it is at Wisconsin, copy it this way:

# Login to CERN
ssh user@lxplus.cern.ch

# Copy your grid proxy
scp -r user@login.hep.wisc.edu:~/.globus .globus

# Set the permissions on these files
chmod 700 ~/.globus/private/
fs setacl -dir ~/.globus/private -acl system:anyuser none
chmod 600 ~/.globus/private/userkey.pem
chmod 640 ~/.globus/usercert.pem
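
To check that the copied certificate works, request a proxy with the same command used above:

voms-proxy-init -voms cms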

Data Location

If you would like to see where your data is located, use the Data Aggregation System (DAS).
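
For example, a DAS query of the form site dataset=... lists the sites hosting a dataset (a sketch using the dataset from the example cfg above; the das_client.py syntax carries the same assumption as before):

das_client.py --query="site dataset=/MinBias/Summer09-STARTUP3X_V8K_900GeV-v1/GEN-SIM-RECO"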

Troubleshooting

# To see a list of all your jobs
condor_q $USER

# To watch your jobs
jobWatch

# If some jobs go on hold, see the hold reason
jobWhyHold
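
# If jobWhyHold is unavailable, plain Condor can show held jobs
# and their hold reasons directly (standard condor_q option):
condor_q -hold $USER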

# Find out more info about a job
#  jobID = the number on the left when you type condor_q
grid-job-remote-info <jobID>

# Grab the stdout file from a job that is still running
#  (run grid-job-remote-info above to find the location of stdout)
globus-job-run <site-you-submitted-to> /bin/cat /location/of/stdout

# For example:
globus-job-run osg-gw-4.t2.ucsd.edu /bin/cat /osglocal/users/cms/uscms1370/.globus/job/osg-gw-4.t2.ucsd.edu/22350.1211409217/stdout