Ntuple-based analysis package for long-lived displaced-jet analyses
# Fermilab uses tcsh by default even though it has bash!
# This framework is based in bash and
# technically maybe you don't need this,
# but tcshers be warned
bash --login
# Set up the area
export SCRAM_ARCH=slc6_amd64_gcc530;
scram pro -n LLDJ_slc6_530_CMSSW_8_0_26_patch1 CMSSW CMSSW_8_0_26_patch1;
cd LLDJ_slc6_530_CMSSW_8_0_26_patch1/src;
cmsenv;
## CMSSW imports and customizations
git cms-merge-topic ikrav:egm_id_80X_v3_photons
scramv1 build -j 10;
## LLDJstandalones Framework checkout
# first fork the repository to make your own workspace
git clone https://github.com/<mygithubusername>/LLDJstandalones.git;
pushd LLDJstandalones;
# If you want to check out a specific branch
# git fetch origin
# git branch -v -a # list branches available, find yours
# git checkout -b NAMEOFBRANCH origin/NAMEOFBRANCH
# add DisplacedHiggs as upstream
git remote add upstream https://github.com/DisplacedHiggs/LLDJstandalones.git
popd;
# compile a clean area
scramv1 build -j 10;
## Every time you log in
# set up some environment variables (bash)
cd ${CMSSW_BASE}/src/LLDJstandalones
source setup.sh
Make sure to run `source setup.sh` from the LLDJstandalones directory first, to set up the environment variables used in the scripts. In particular, this sets `$nversion` and `$aversion`.
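The contents of setup.sh are not reproduced in this README; as a rough sketch, it follows the usual export pattern below. The version strings here are placeholders, not the real production tags.

```shell
# Hypothetical sketch of what a setup.sh like this typically does; the
# version strings below are placeholders, NOT the real production tags.
export nversion="lldj_ntuples_v0"   # tags the ntuple production version
export aversion="lldj_analyzer_v0"  # tags the analyzer output version
echo "ntuple version:   ${nversion}"
echo "analyzer version: ${aversion}"
```

Because the scripts below read these variables, sourcing (not executing) the file is what makes them visible in your login shell.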
## Making ntuples
cd ${CMSSW_BASE}/src/LLDJstandalones/ntuples/config
To run local test jobs, do `cmsRun run_mc_80XAOD.py` or `cmsRun run_data_80XAOD.py`. Then submit CRAB jobs using `bash submitcrab.sh`, which uses `crab_template.py`. The CRAB directories are created in `config/gitignore/$nversion`, and finished jobs appear at FNAL in `/store/group/lpchbb/LLDJntuples/$nversion`.
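The actual submitcrab.sh is not shown here; scripts of this kind usually stamp the template once per sample. The sketch below illustrates that pattern only: the template file name, the `SAMPLENAME`/`NVERSION` placeholders, and the sample names are all assumptions, not the repository's real ones.

```shell
# Hypothetical sketch of a submitcrab.sh-style loop: stamp a CRAB template
# once per sample. Placeholder strings and sample names are invented.
cat > crab_template_example.py <<'EOF'
config.General.requestName = 'SAMPLENAME'
config.Data.outLFNDirBase = '/store/group/lpchbb/LLDJntuples/NVERSION'
EOF
for sample in QCD_HT500to700 DYJetsToLL; do
  sed -e "s/SAMPLENAME/${sample}/" -e "s/NVERSION/${nversion:-test}/" \
      crab_template_example.py > "crab_${sample}.py"
  # a real script would then run: crab submit "crab_${sample}.py"
done
```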
To run the analyzer, first run a few preparatory scripts from `LLDJstandalones/commontools`:
- Lists
  - `bash makemasterlist.sh` makes master lists of ntuples from which the other lists are derived
  - `bash makelists.sh` makes lists of files, split by sample, and puts them in the `lists` folder
  - `bash countevents.sh` calls `countevents.cxx` and makes `.info` files in the `lists` folder
  - `bash findTTavgweights.sh` runs over the TTbar samples and calculates the average TTbar weight
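As a toy illustration of what the list-making step produces (every path and sample name below is invented), splitting a master list into per-sample lists amounts to:

```shell
# Toy sketch of the makelists.sh idea: split a master list of ntuple files
# into one list per sample. All paths and sample names are placeholders.
mkdir -p lists
printf '%s\n' \
  /store/user/lldj/SampleA/ntuple_1.root \
  /store/user/lldj/SampleA/ntuple_2.root \
  /store/user/lldj/SampleB/ntuple_1.root > lists/masterlist.txt
for sample in SampleA SampleB; do
  grep "/${sample}/" lists/masterlist.txt > "lists/${sample}.list"
done
```

The analyzer then reads one `.list` file per sample rather than globbing EOS directly.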
- Scale factors
  - EGamma SFs in `commontools/elesf`
    - scale factors are provided by the POG as a TH2F histogram, from https://twiki.cern.ch/twiki/bin/view/CMS/EgammaIDRecipesRun2
  - Pileup reweighting in `commontools/pileup`, see its `README`
    - `bash collectjsons.sh` gets the JSONs from finished CRAB jobs and compares them to the golden JSON
    - `bash makeinputhistos.sh` makes the input SF histograms using the database (run on lxplus)
    - `bash runMakePUweights.sh` makes the weight histograms
  - Make sure the SF files are copied into the `LLDJstandalones/analyzers` directory
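Conceptually a pileup weight is just the bin-by-bin ratio of the normalized data and MC pileup distributions; the three toy bins below are purely illustrative numbers, not real pileup profiles.

```shell
# Toy illustration of pileup reweighting:
#   weight_i = (data_i / sum_data) / (mc_i / sum_mc)
# Each input line is "data_count mc_count" for one pileup bin (invented).
printf '10 20\n30 40\n60 40\n' |
  awk '{d[NR]=$1; m[NR]=$2; sd+=$1; sm+=$2}
       END {for (i=1; i<=NR; i++) printf "%.2f\n", (d[i]/sd)/(m[i]/sm)}'
# prints 0.50, 0.75, 1.50 (one weight per bin)
```

runMakePUweights.sh does the same thing with TH1 histograms instead of text columns.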
Run local jobs from the `LLDJstandalones/analyzers` folder:
- `make` compiles `main.C` into the executable `runanalyzer.exe`
- `./runanalyzer.exe --<flags>` calls the analyzer by hand (you must specify the flags)
- or edit and run `bash runAnalyzers.sh`, which loops through different options for calling `runanalyzer.exe`
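A runAnalyzers.sh-style wrapper is essentially a nested loop over samples and options. The `--sample` and `--isMC` flag names below are invented placeholders (check `runanalyzer.exe` itself for the real flags), and the commands are only echoed as a dry run:

```shell
# Hypothetical sketch of a runAnalyzers.sh-style loop. The --sample and
# --isMC flag names are invented; they are NOT the executable's real flags.
for sample in SampleA SampleB; do
  for ismc in 0 1; do
    echo ./runanalyzer.exe --sample "${sample}" --isMC "${ismc}"
  done
done
```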
From the `submitters` folder:
- in `submitjobs.sh`, set `doSubmit=false` to be safe while testing
- `bash submitjobs.sh` creates the submit area in `gitignore`; the job that actually runs on the condor nodes is `runsubmitter.sh`
- `voms-proxy-init --voms cms --valid 100:00` sets up your grid proxy
- set `doSubmit=true` and run `bash submitjobs.sh`
- then optionally add info about the submission to the autogenerated txt file
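The `doSubmit` toggle described above is a simple dry-run guard. A minimal sketch of the pattern, assuming a placeholder job file name (`job.jdl` is invented, and `condor_submit` is only echoed here):

```shell
# Minimal sketch of the doSubmit guard used by submitjobs.sh-style scripts.
# "job.jdl" is a placeholder; with doSubmit=false nothing is submitted.
doSubmit=false
cmd="condor_submit job.jdl"
if [ "${doSubmit}" = "true" ]; then
  ${cmd}
else
  echo "[dry run] would have run: ${cmd}"
fi
```

Flipping one variable then switches the same script from printing the commands to actually submitting them.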
## While jobs are running / finished
- `bash checkjobs.sh` checks whether the condor jobs are done
- `bash haddjobs.sh` merges the analyzed output locally into `LLDJstandalones/roots/$aversion`
- `bash cpeos.sh` copies the hadded analyzer output to EOS
- afterwards, delete the files that were copied over; there is no script for this
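haddjobs.sh boils down to one `hadd` call per sample, merging the per-job condor outputs into `roots/$aversion`. The sketch below only echoes the commands, and the version tag, sample names, and per-job paths are all placeholders:

```shell
# Dry-run sketch of the haddjobs.sh idea: one hadd call per sample, merging
# per-job output into roots/${aversion}. Names and paths are placeholders.
aversion="test_v0"   # normally exported by setup.sh
mkdir -p "roots/${aversion}"
for sample in SampleA SampleB; do
  echo hadd "roots/${aversion}/${sample}.root" "gitignore/${sample}/job_*.root"
done
```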
## Plotting
The first thing to do is merge the histograms from the analyzer; do this with `bash runPlotterStackedRegion.sh`, then take it from there.
- run `bash runPlotterStackedRegion.sh` over the unshifted analyzer output
- run `bash runPlotterTagvarUnc.sh` to get the plots whose names start with `tvu`, and the values of the shifted cuts based on the integral
- put the new shift cut values into `analyzers/analyzer_config.C` as the variables like `tag_shiftXXXX`, and rerun the analyzer