Web page of the FIRST experiment reconstruction software

Before starting

The code is developed in C++. It has been tested on several different Linux platforms. You need to have ROOT installed. You can have a look at the Documentation page or the Download page on the ROOT site to get basic instructions on how to compile and install ROOT. After installing ROOT you should also set the environment variables that make the installation available in the following steps. If you use a bash shell you can use:
export ROOTSYS=/path/to/your/root/installation
export LD_LIBRARY_PATH=$ROOTSYS/lib
export PATH=$PATH:$ROOTSYS/bin
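Alternatively, recent ROOT installations ship a setup script that defines these variables for you; a minimal sketch, assuming the standard installation layout:
source /path/to/your/root/installation/bin/thisroot.sh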

FIRST software web interface

The full software project can be browsed through a web interface (Trac based).

Debugging Tools

  • Valgrind is a nice tool to spot memory leaks. Instructions on how to run it can be found here.
  • Alleyoop can help you understand the output of Valgrind.
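As a minimal sketch, Valgrind can be run directly on one of the reconstruction executables described below (file names and options are just placeholders):
valgrind --leak-check=full --log-file=valgrind.log ./DecodeRaw -in input.lmd -out out.root -nev 100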

How to get the code

The code is available through svn (see the GSI SVN page for details).

The project is organized in the following way:

  • sim (documented in the Simulation software web page)
  • rec

The reconstruction project lives under the rec directory, where all the reconstruction software is kept. The code is organized in several directories, hosting the different pieces of the project:
  • l0reco: the low level Data reconstruction
  • l0mcreco: the low level MC reconstruction
  • hlreco: the high level reconstruction
There is a directory containing all the libraries developed to decode the 'raw' or MC information:
  • libs
There is also a directory that keeps some reference data used to train the reconstruction algorithms:
  • data

To get the code you need to have a valid GSI account. The code can be downloaded using svn:
svn co https://subversion.gsi.de/first 

Common libs compilation

First of all, to set up the compilation phase, at GSI you need to run
. go4login 404-02
If you are running with a local setup you need to set the library paths yourself in order to enable ROOT and gsievt...
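In a bash shell such a local setup could look like the following sketch (the gsievt installation path is only a placeholder, adapt it to your machine):
export LD_LIBRARY_PATH=$ROOTSYS/lib:/path/to/gsievt/lib:$LD_LIBRARY_PATH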

Compiling the code is a multi-step procedure:
  • First of all you need to compile the W. Mueller libraries. Go to the libs directory, set up the environment using the myLogin.sh script (bash environment) and then go into the CTB and src directories, issuing the make commands shown below
  • To compile the W. M. libs (use myLogin.csh if you use (t)csh shells):
cd first/rec/trunk/libs
source myLogin.sh
cd CTB
make
cd ..
cd src
make realclean
make

Depending on what you plan to do next, you can then compile the code in the hlreco, l0reco or l0mcreco directories.

Data decode :: l0reco

In order to decode and display the lmd files you can use the code in the l0reco directory.

To set up the compilation phase, at GSI you need to run
. go4login [latest version that you can specify]
If you are running with a local setup you need to set the library paths in order to enable ROOT.

Then you can set up the local environment to build the W. Mueller classes.
cd first/rec/trunk/libs
source myLogin.sh

Then you can go into the src directory and issue a 'make' command.

Before compiling the l0reco executable, you also need to compile the l0mcreco libraries (at the moment there is a dependency of the MC on the data decoding that will be solved at some point).
cd l0mcreco
source myLogin.sh
make clean
make DecodeMC

Then you can compile the l0reco executable (use myLogin.sh to set up the environment):
cd l0reco
source myLogin.sh
make clean
make DecodeRaw

Running the Interactive decoding

You can launch the interactive decoding for a given production run (e.g. run 45) using the command:
./SubInteractiveDecode.pl -path /lustre/bio/first/DATA/TESTS/global_tests -str global_t -run 45
from the /path_to_your_code_installation/first/rec/trunk/l0reco/ folder. You need to issue this command from a machine that has Lustre mounted. Otherwise you need to ask for a copy of the file you want to process to be placed in the common scratch area /s/s371/PRODUCTION.

In order to actually submit the interactive processing you need to add the -exe flag at the end of the command line:
./SubInteractiveDecode.pl -path /lustre/bio/first/DATA/TESTS/global_tests -str global_t -run 45 -exe
ATTENTION: before issuing this command you need to check the script output without the -exe flag and CAREFULLY CHECK the given PATH and STRING... By default the VTX reconstruction is NOT done; you can enable it with the -vtx option. A help function is available and can be executed by running:
./SubInteractiveDecode.pl -h

Usage: ./SubInteractiveDecode.pl <options> 
    Script that submits monitoring jobs.

Options:
  -path   :       Path of lmd input file
  -outdir :       Path for redirecting root files. 
  -str    :       String that identifies the file. Format: string_run-number.lmd
  -run    :       Run to be processed
  -nev    :       Number of events to be processed. Default: 10000
  -flag   :       Flagging the output. Default: prod
  -vtx    :       Enables VTX reconstruction 
  -exe    :       Executes the commands (to be used AFTER checking)

Examples:
./SubInteractiveDecode.pl -path /lustre/bio/first/DATA/TESTS/global_tests -str global_t -run 40 -vtx

Comments to <asarti@lnf.infn.it>.

An example can be found here:
./SubInteractiveDecode.pl -path /s/s371/PRODUCTION -str production -run 476 -outdir /s/s371/PRODUCTION_root. -flag MyTest -vtx -exe
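If you need to process several runs in a row you can loop over the run numbers in the shell; a minimal sketch reusing the path and string from the examples above (as always, check the generated commands without -exe first):
for run in 40 41 42; do
  ./SubInteractiveDecode.pl -path /lustre/bio/first/DATA/TESTS/global_tests -str global_t -run $run -vtx
done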

Decoding details

Anyone can access the data under their own account. Here are the details on the decoding.

  • First of all you need to log into the lx-pool cluster with your own account.
  • Then you need to configure the environment for the online software:
. go4login 404-02
  • Then you go to the l0reco directory and you set up the running environment.
cd /your_path_to_the_repository/first/rec/trunk/l0reco/
source myLogin.sh 
  • Then you can run the DecodeRaw program that produces the ntuple, histograms and event displays for the events:
./DecodeRaw -in input.lmd -nev 100 -out out.root -pl_fr 100 -run 10 -cam 20
Since the VTX decoding is really time consuming, a special flag needs to be passed to the executable in order to enable the VTX decoding. The flag is -vtx. The command line then looks like:
./DecodeRaw -in input.lmd -nev 100 -out out.root -pl_fr 100 -run 10 -cam 20 -vtx
The options of the decoding executable are:
  • -out path/file : [def=dumb.root] Root output file
  • -in path/file : [def=../data/test.txt] Unformatted input file
  • -run value : [def=0] Run number to be processed
  • -cam value : [def=0] Campaign number : identifies run type
  • -deb value : [def=0] Enables debugging. Values 1,2 are allowed
  • -vtx : [def= no] Enables the vtx decoding/reco
  • -nev value : [def=10^7] Numbers of events to process
  • -pl_fr value : [def = 100] plots creation frequency

To see the help you can use the -help option:
./DecodeRaw -help
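Since most options have sensible defaults, a quick debugging pass on a limited number of events could look like the following sketch (the file names are placeholders):
./DecodeRaw -in input.lmd -out test.root -run 10 -cam 20 -nev 1000 -deb 1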

MC decode :: l0mcreco

Takes care of decoding/displaying the MC information and writes out an ntuple to be fed to the HL reconstruction.

You can compile it AFTER HAVING compiled the libraries (see the steps above; use myLogin.csh if you use (t)csh shells):
cd l0mcreco
source myLogin.sh
make clean
make DecodeMC

In order to play with the MC you should take a look at the Simulation page first. Here are some 'good things to know' that you may find helpful.
  • The declarations of all the variables in the tuple are kept in sim/trunk/Evento.h

To run the program you need to provide the input MC file you want to process and the output root file that you want to produce:
./DecodeMC -in path_to_file/<MCfile>.root -out path_to_output/myoutput.root
The default configuration of the decoder produces some output that you can use to debug/check your program: in the plots dir you'll find some event displays, and the verbosity of the output can be controlled with the -deb flag from the command line and with the gErrorIgnoreLevel = kError; value hardcoded in RecoTools.cc.
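A quick way to check the produced ntuple is to open the output file interactively in ROOT and list its contents (a sketch; the file name follows the example above):
root -l path_to_output/myoutput.root
root [1] .ls
root [2] .q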

hlreco

Please look at the High Level Reconstruction Page for instructions and details.

-- AlessioSarti - 01 Mar 2011