Difference: PhysicsAndMCproduction (31 vs. 32)

Revision 32 - 2013/05/02 - Main.JamesLetts

Line: 1 to 1
 

General Support

Line: 64 to 64
 As a reminder, you can submit vanilla jobs to use the glidein system which is in place. Condor-G jobs are of course still supported, but are not recommended.
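For reference, a minimal vanilla-universe submission might look like the sketch below; the executable and file names are placeholders, not site-specific requirements.

# Sketch of a minimal vanilla-universe submit file (all names are placeholders)
cat > myjob.submit <<'EOF'
universe   = vanilla
executable = myjob.sh
output     = myjob.$(Cluster).out
error      = myjob.$(Cluster).err
log        = myjob.$(Cluster).log
queue 1
EOF

# Submit to the local Condor pool
condor_submit myjob.submit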

CRAB

Changed:
<
<
TBW
>
>
To run the CRAB client, after setting up your CMSSW environment, you need only source the CRAB setup file (crab.sh for bash-like shells, crab.csh for (t)csh):
source /code/osgcode/ucsdt2/Crab/etc/crab.[c]sh 
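Putting it together, a typical session might look like the sketch below; the release area and proxy options are examples only, taken from the older instructions further down this page.

cd CMSSW_X_Y_Z/src                            # an existing CMSSW release area (version is a placeholder)
cmsenv                                        # set up the CMSSW environment
voms-proxy-init -valid 120:00 --voms cms      # obtain a grid proxy
source /code/osgcode/ucsdt2/Crab/etc/crab.sh  # or crab.csh for (t)csh shells
# crab commands (crab -create, crab -submit, ...) are now available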
 

Old instructions

The following instructions are old and should be discarded. They are left here temporarily, just as a reminder of the past.

Line: 107 to 111
 To run analysis against the local DBS (rather than the global CMS DBS), the following configuration needs to be added in the [USER] section of crab.cfg:
      dbs_url = http://ming.ucsd.edu:8080/DBS/servlet/DBSServlet 
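For context, a [USER] section with this line added might look like the sketch below; the other values are just the examples used later on this page.

      [USER]
      return_data     = 0
      copy_data       = 1
      storage_element = T2_US_UCSD
      dbs_url         = http://ming.ucsd.edu:8080/DBS/servlet/DBSServlet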
Deleted:
<
<
  • CRAB Server

A CRAB server is deployed at UCSD, which you can use for your CRAB submissions. The following configuration in the [CRAB] section of the crab.cfg file specifies the CRAB server and scheduler:

      scheduler = glite
      use_server = 1

A late-binding (glidein-based) CRAB server can be used as defined here. Essentially:

      scheduler = glidein
      use_server = 1
 

Job Monitoring

Jobs submitted locally to the T2 Condor batch system can be monitored using:
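As a generic sketch, the standard Condor client tools can be used for this, assuming they are on your path; <username> is a placeholder.

      condor_q <username>        # jobs currently idle or running for a given user
      condor_history <username>  # recently completed jobs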

Line: 159 to 155
 To just do an ls via the SRM would look like:
 lcg-ls -l -b -D srmv2 srm://bsrm-1.t2.ucsd.edu:8443/srm/v2/server?SFN=/hadoop/cms/store/user/tmartin/

or

 srmls -2 -delegate=false srm://bsrm-1.t2.ucsd.edu:8443/srm/v2/server?SFN=/hadoop/cms/store/user/tmartin
Deleted:
<
<

Validation of Local CRAB Client Installation

Instructions on this page have been updated using the CMS Workbook.

Setup of the environment:

# CMSSW environment and local release area
export CMS_PATH=/code/osgcode/cmssoft/cms
export SCRAM_ARCH=slc5_ia32_gcc434
source ${CMS_PATH}/cmsset_default.sh
cmsrel CMSSW_3_8_4
cd CMSSW_3_8_4/src
cmsenv
scram b

# Grid (gLite) environment and VOMS proxy
source /code/osgcode/ucsdt2/gLite31/etc/profile.d/grid_env.sh
export LCG_GFAL_INFOSYS=lcg-bdii.cern.ch:2170
export GLOBUS_TCP_PORT_RANGE=20000,25000
voms-proxy-init -valid 120:00 --voms cms:/cms/uscms/Role=cmsuser

# CRAB client
source /code/osgcode/ucsdt2/Crab/CRAB_2_7_5/crab.sh

crab.cfg:

[CMSSW]
total_number_of_events=1
number_of_jobs=1
pset=tutorial.py
datasetpath=/Wmunu/Summer09-MC_31X_V3_7TeV-v1/GEN-SIM-RECO
output_file=out_test1.root

[USER]
return_data=0
email=jletts@ucsd.edu

copy_data = 1
storage_element = T2_US_UCSD

publish_data = 0
publish_data_name = jletts_Data
dbs_url_for_publication = https://cmsdbsprod.cern.ch:8443/cms_dbs_prod_local_09_writer/servlet/DBSServlet

[GRID]
SE_white_list=ucsd

[CRAB]
scheduler=glidein
jobtype=cmssw
server_name=ucsd

tutorial.py:

import FWCore.ParameterSet.Config as cms
process = cms.Process('Tutorial')
process.source = cms.Source("PoolSource", fileNames = cms.untracked.vstring())
process.maxEvents = cms.untracked.PSet( input       = cms.untracked.int32(10) )
process.options   = cms.untracked.PSet( wantSummary = cms.untracked.bool(True) )
process.output = cms.OutputModule("PoolOutputModule",
    outputCommands = cms.untracked.vstring("drop *", "keep recoTracks_*_*_*"),
    fileName = cms.untracked.string('out_test1.root'),
)
process.out_step = cms.EndPath(process.output)

Create, validate, and submit the CRAB job:

crab -create
crab -validateCfg
crab -submit 1
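
After submission, progress and output can be checked with the usual CRAB client commands, a sketch of which follows (add -c <task_dir> to point at the task directory created by crab -create if needed).

crab -status       # check the status of the submitted jobs
crab -getoutput    # retrieve the output of finished jobs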
 -- HaifengPi - 02 Sep 2008

-- SanjayPadhi - 2009/03/08

-- FkW - 2009/09/07

Changed:
<
<
-- JamesLetts - 2010/11/05
>
>
-- JamesLetts - 2013/05/02
 