
Page with online instructions for data decoding for the FIRST experiment shifters

This page contains the data decoding instructions and details for running and analyzing online the data acquired by the FIRST experiment.

Phone Book

Electronic Logbook

All the actions must be recorded in the electronic logbook that can be reached here.

An old logbook, covering the first runs taken, can be found here (*old elog*).

Summary Page

Page with the Run Summary.

The status of data reprocessing is documented here

Data access

Data is stored by gStore both on tape and on the /lustre file system. In order to access the data on /lustre you can:
  • Find a machine in the lx-pool cluster on which /lustre is mounted. The list can be found here
  • Log in to the counting room machine: lxg0126
Data will show up under the /lustre/bio/first folder. Details on the acquired data can be found in the electronic logbook.
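To get a quick look at which runs have arrived on /lustre most recently, a small helper like the following can be handy (a generic sketch, not part of the FIRST tools; point it at the folder quoted above):

```shell
# Sketch: show the n most recently modified .lmd files in a data folder,
# e.g.  newest_lmd /lustre/bio/first/DATA/PRODUCTION 5
newest_lmd() {
    dir=$1
    n=${2:-10}
    # sort by modification time, newest first, and keep the first n entries
    ls -t "$dir"/*.lmd 2>/dev/null | head -n "$n"
}
```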

Data Copy after File Closing in mbs

To check the files available you can use the command gstore query.

  • The following command queries all the lmd files (*lmd) kept under the DATA/PRODUCTION folder of the first_r archive:
gstore query '*lmd' first_r DATA/PRODUCTION

  • The following command will transfer ALL the lmd files that are under the DATA/PRODUCTION folder of the first_r archive to the /SAT file system, into the /SAT/s/s371/PRODUCTION folder:
gstore ret '/SAT/s/s371/PRODUCTION/*' first_r DATA/PRODUCTION

  • The following command will transfer ALL the lmd files that are under the DATA/PRODUCTION folder of the first_r archive to the /lustre file system, into the /lustre/bio/first/DATA/PRODUCTION folder:
gstore ret '/lustre/bio/first/DATA/PRODUCTION/*' first_r DATA/PRODUCTION
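After a retrieval it is worth checking that no run in the range was skipped. A minimal sketch, assuming the files are named <string>_<run>.lmd without zero padding (verify the actual names with ls or gstore query first):

```shell
# Sketch: report which runs in a range are missing from a retrieval folder.
# ASSUMPTION: files are named <string>_<run>.lmd with no zero padding --
# check the real names before relying on this.
check_runs() {
    dir=$1; str=$2; first=$3; last=$4
    missing=0
    run=$first
    while [ "$run" -le "$last" ]; do
        if [ ! -f "$dir/${str}_${run}.lmd" ]; then
            echo "missing: ${str}_${run}.lmd"
            missing=$((missing + 1))
        fi
        run=$((run + 1))
    done
    echo "$missing file(s) missing"
}
```

For example, `check_runs /lustre/bio/first/DATA/PRODUCTION production 45 50` lists any run between 45 and 50 whose file did not arrive.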

Data Decoding shifter tools (under s371 account)

In order to decode the acquired data a script has been set up and is available to the shifter.
  • Log in to a machine that can access /lustre (e.g. ssh lennylust32 from an lx-pool machine)
  • Execute the setup script for Go4:
. go4login 404-02
  • Then go to the l0reco directory and set up the running environment:
cd online/first/rec/trunk/l0reco/
source myLogin.sh 

Batch system decoding: meant for full production decoding

BATCH JOBS should be run from a 32-BIT MACHINE. BEFORE launching jobs, please check that you are on a machine from the lx-pool (uname -m should return a 32-bit architecture such as i686, without any 64 in it), and that you have executed the . go4login 404-02 and source myLogin.sh scripts.

BEFORE launching a job you need to decide which queue to run on. Production jobs (500 MB) without any vertex reconstruction fit nicely in the research queue, while if you run the vertex reconstruction you need to use the default queue, batch.

You can launch the batch decoding of a few given production runs (e.g. runs 45 -- 50) using the command:
./SubBatchDecode.pl -path /lustre/bio/first/DATA/PRODUCTION -outdir /SAT/s/s371/PRODUCTION_root -str production -runi 45 -runf 50
from the online/first/rec/trunk/l0reco/ folder. This will submit the jobs running the vertex reconstruction on the batch queue. Otherwise you can use the command:
./SubBatchDecode.pl -path /lustre/bio/first/DATA/PRODUCTION -outdir /SAT/s/s371/PRODUCTION_root -str production -runi 45 -runf 50 -novtx -queue research
that will submit the jobs to the research batch queue WITHOUT running the vertex reconstruction. In order to actually submit the batch processing you need to add the -exe flag at the end of the command line:
./SubBatchDecode.pl -path /lustre/bio/first/DATA/PRODUCTION -outdir /SAT/s/s371/PRODUCTION_root -str production -runi 45 -runf 50 -exe
ATTENTION: before issuing this command with the -exe flag you need to check the script output without it, and CHECK CAREFULLY the PATH and the STRING given; otherwise you just waste CPU time and batch priority on non-existing files. By default the vertex reconstruction is done; you can disable it with the -novtx option. A help function is available and can be executed by running:
./SubBatchDecode.pl -h

Usage: ./SubBatchDecode.pl <options> 
    Script that submits monitoring jobs.

Options:
  -path   :     Path of lmd input file
  -outdir :     Path for redirecting root files. 
  -str    :     String that identifies the file. Format: string_run-number.lmd
  -runi   :     First Run
  -runf   :     Last Run
  -nev    :     Number of events to be processed. Default: 10000
  -flag   :     Flagging the output. Default: prod
  -novtx  :     Disables VTX reconstruction 
  -exe    :     Executes the commands (to be used AFTER checking)


Examples:
./SubBatchDecode.pl -path /lustre/bio/first/DATA/TESTS/global_tests -str global_t -runi 40 -runf 50


Comments to <asarti@lnf.infn.it>.

To check the status of the batch queue processing you can issue the command:
bjobs
And to check what a specific job is doing you can issue the command
bjobs -l #jobid
where the jobid of a particular job can be seen in the bjobs output.
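The check-first-then-submit discipline that SubBatchDecode.pl enforces with its -exe flag can be reused in your own wrappers. A generic sketch (not part of the FIRST tools): the command is always printed, and only executed when EXE=1 is set in the environment.

```shell
# Sketch of the "dry run first" pattern behind the -exe flag:
# print the command, execute it only when EXE=1.
maybe_run() {
    echo "would run: $*"
    if [ "${EXE:-0}" = "1" ]; then
        "$@"
    fi
}
```

For example, `maybe_run ./SubBatchDecode.pl -path ... -str production -runi 45 -runf 50` shows what would be submitted; rerun with `EXE=1` once the paths look right.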

Interactive decoding: meant for quasi-online analysis

You can launch the interactive decoding for a given production run (e.g. run 45) using the command:
./SubInteractiveDecode.pl -path /lustre/bio/first/DATA/TESTS/global_tests -str global_t -run 45
from the online/first/rec/trunk/l0reco/ folder. In order to actually submit the interactive processing you need to add the -exe flag at the end of the command line:
./SubInteractiveDecode.pl -path /lustre/bio/first/DATA/TESTS/global_tests -str global_t -run 45 -exe
ATTENTION: before issuing this command with the -exe flag you need to check the script output without it, and CHECK CAREFULLY the PATH and the STRING given. By default the vertex reconstruction is NOT done; you can enable it with the -vtx option. A help function is available and can be executed by running:
./SubInteractiveDecode.pl -h

Usage: ./SubInteractiveDecode.pl <options> 
    Script that submits monitoring jobs.

Options:
  -path   :       Path of lmd input file
  -outdir :       Path for redirecting root files. 
  -str    :       String that identifies the file. Format: string_run-number.lmd
  -run    :       Run to be processed
  -nev    :       Number of events to be processed. Default: 10000
  -flag   :       Flagging the output. Default: prod
  -vtx    :       Enables VTX reconstruction 
  -exe    :       Executes the commands (to be used AFTER checking)

Examples:
./SubInteractiveDecode.pl -path /lustre/bio/first/DATA/TESTS/global_tests -str global_t -run 40 -vtx

Comments to <asarti@lnf.infn.it>.

Decoding details

Anyone can access the data under the s371 account. Here are the details of the decoding.

  • First of all you need to log in to the lx-pool cluster with the s371 login.
  • Then you need to configure the environment for the online analysis:
. go4login 404-02
  • Then go to the l0reco directory and set up the running environment:
cd online/first/rec/trunk/l0reco/
source myLogin.sh 
  • Then you can run the DecodeRaw program, which produces the ntuples, histograms and event displays:
./DecodeRaw -in input.lmd -nev 100 -out out.root -pl_fr 100 -run 10 -cam 20
Since the VTX decoding is really time consuming, a special flag needs to be passed to the executable in order to enable it. The flag is -vtx. The command line should look like:
./DecodeRaw -in input.lmd -nev 100 -out out.root -pl_fr 100 -run 10 -cam 20 -vtx
The options for the decoding executable are:
  • -out path/file : [def=dumb.root] Root output file
  • -in path/file : [def=../data/test.txt] Unformatted input file
  • -run value : [def=0] Run number to be processed
  • -cam value : [def=0] Campaign number : identifies run type
  • -deb value : [def=0] Enables debugging. Values 1,2 are allowed
  • -vtx : [def= no] Enables the vtx decoding/reco
  • -nev value : [def=10^7] Number of events to process
  • -pl_fr value : [def = 100] plots creation frequency

To see the help you can use the -help option:
./DecodeRaw -help
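When decoding several runs by hand, the DecodeRaw command lines can be generated and inspected before running anything, in the same spirit as the -exe check of the submission scripts. A sketch, assuming input files named <string>_<run>.lmd (verify with ls first) and the option values from the example above:

```shell
# Sketch: print one DecodeRaw command line per run in a range, without
# executing anything. Pipe the output to `sh` once it looks right.
# ASSUMPTION: input files are named <string>_<run>.lmd.
decode_cmds() {
    path=$1; str=$2; first=$3; last=$4
    run=$first
    while [ "$run" -le "$last" ]; do
        echo "./DecodeRaw -in $path/${str}_${run}.lmd -nev 100 -out ${str}_${run}.root -pl_fr 100 -run $run -cam 20"
        run=$((run + 1))
    done
}
```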

Online monitoring

A set of macros is being set up under the first/rec/trunk/macro project. You will find several directories with macros needed to produce online plots for data quality monitoring.

Start counter

Macros for Start Counter monitoring can be found in the macro/startcounter folder under the first/rec/trunk project. Before running the macros you need to set up the environment through:
. go4login 404-02
and
source myLogin.sh (or the csh version, depending on your favourite shell)
Then you can call a macro from the command line:
  root -b -q loadlibs.C MonitorSC.C\(\"input.root\",\"output.root\"\)
where input.root must be a root file produced with DecodeRaw, as documented in the macro itself.

The macros kept in the folder are:
  • MonitorSC.C : to be used to display the raw times and charges from the four channels of the Start Counter.
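The quoting of the macro arguments is easy to get wrong on the command line; a small helper that builds the ROOT invocation keeps the escaping in one place (a hypothetical convenience, not part of the repository; it only prints the command, it does not run ROOT):

```shell
# Sketch: print the ROOT command line for a monitoring macro, with the
# backslash-escaped quotes and parentheses already in place.
macro_cmd() {
    # $1 = macro file, $2 = input root file, $3 = output root file
    printf 'root -b -q loadlibs.C %s\\(\\"%s\\",\\"%s\\"\\)\n' "$1" "$2" "$3"
}
```

The same helper works for the other monitoring macros, e.g. `macro_cmd DisplayTimes.C input.root output.root` for the Beam Monitor.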

Beam Monitor

Macros for Beam Monitor monitoring can be found in the macro/beam_monitor folder under the first/rec/trunk project. Before running the macros you need to set up the environment through:
. go4login 404-02
and
source myLogin.sh (or the csh version, depending on your favourite shell)
Then you can call a macro from the command line:
  root -b -q loadlibs.C DisplayTimes.C\(\"input.root\",\"output.root\"\)
where input.root must be a root file produced with DecodeRaw or DecodeMC, as documented in the macro itself.

The macros kept in the folder are:
  • DisplayTimes.C : to be used to display the raw times from all the TDCs connected to the chambers, the chamber occupancy and the number of wires firing per event.

Machinery for the t0 evaluation
  • DisplayNtuTimes.C : creates the histos to be used for the t0 evaluation. Takes as input a file produced with DecodeRaw.
root -b -q loadlibs.C DisplayNtuTimes.C\(\"../../l0reco/Run_0028_1k.root\",\"Run_0028_1k.root\"\)
  • T0dch_new.C : evaluates the t0s.
root -b -q T0dch_new.C\(\"Run_0028_1k.root\"\)
The t0s produced have to be copied into the l0reco/config directory to be applied at runtime.
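The two-step t0 chain above can be sketched as a dry run that just prints the commands (file names follow the example above; the name of the t0 file that T0dch_new.C writes is not stated here, so the final copy into l0reco/config is left as a comment):

```shell
# Sketch: print the two ROOT commands of the t0 evaluation chain.
# $1 = DecodeRaw output file, $2 = histogram file for the t0 evaluation
t0_steps() {
    printf 'root -b -q loadlibs.C DisplayNtuTimes.C\\(\\"%s\\",\\"%s\\"\\)\n' "$1" "$2"
    printf 'root -b -q T0dch_new.C\\(\\"%s\\"\\)\n' "$2"
    # afterwards, copy the t0 file produced by the macro into ../../l0reco/config/
}
```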

Kentros

Macros for Kentros monitoring can be found in the macro/kentros folder under the first/rec/trunk project. Before running the macros you need to set up the environment through:
. go4login 404-02
and
source myLogin.sh (or the csh version, depending on your favourite shell)
Then you can call a macro from the command line:
  root -b -q loadlibs.C MonitorKentros.C\(\"input.root\",\"output.root\"\)
where input.root must be a root file produced with DecodeRaw, as documented in the macro itself.

The macros kept in the folder are:

Vertex running information

  • The alignment procedure is now working. The alignment parameters are in the geomaps folder, in the file TAVTdetector_align126.map; by default the DecodeRaw executable will load this file. The alignment was performed on global run 126.
  • Vertex-for-shifters.pdf: Vertex for shifters

Useful information

  • To print from the control room:
    • log in to the lx-pool cluster
    • lpr -P P152 filename.ps (to print a PostScript file)

-- AlessioSarti - 19 Jul 2011
Topic revision: r27 - 2011-11-15, ChristianFinck