gen_vec_ps generator


Mariana Khachatryan

Jan 13, 2023, 10:25:15 AM
to GlueX Software Help Email List
Dear GlueX software help members,


I’m trying to generate some signal MC for omega eta data using the gen_vec_ps generator, but the jobs fail with the following error message:

Missing name for redirect.
Warning from GlueXDetectorConstruction::ConstructSDandField - unsupported sensitive volume TAC1 found in geometry definition.
G4WT0 > Warning from GlueXDetectorConstruction::ConstructSDandField - unsupported sensitive volume TAC1 found in geometry definition.
src/JANA/JGeometryXML.cc:347 Node or attribute not found for xpath "//section/composition/posXYZ[@volume='ForwardMWPC']/@X_Y_Z".
src/JANA/JGeometryXML.cc:347 Node or attribute not found for xpath "//section[@name='ForwardMWPC']/box[@name='CPPF']/@X_Y_Z".
src/JANA/JGeometryXML.cc:347 Node or attribute not found for xpath "//section/composition/posXYZ[@volume='CppScint']/@X_Y_Z".
libraries/HDGEOMETRY/DGeometry.cc:1796 Unable to retrieve CPP scintillator position.
src/JANA/JGeometryXML.cc:348 Node or attribute not found for xpath "//section/composition/posXYZ[@volume='DIRC']/@X_Y_Z".
libraries/HDGEOMETRY/DGeometry.cc:1741 Unable to retrieve DIRC position.
src/JANA/JGeometryXML.cc:348 Node or attribute not found for xpath "//composition[@name='forwardTOF_bottom3']/mposY[@volume='FTOL']/@ncopy".
src/JANA/JGeometryXML.cc:348 Node or attribute not found for xpath "//composition[@name='forwardTOF_top3']/mposY[@volume='FTOL']/@ncopy".
src/JANA/JGeometryXML.cc:348 Node or attribute not found for xpath "//composition[@name='forwardTOF_bottom3']/mposY[@volume='FTOL']/@ncopy".
src/JANA/JGeometryXML.cc:348 Node or attribute not found for xpath "//composition[@name='forwardTOF_top3']/mposY[@volume='FTOL']/@ncopy".
rm: No match.
rm: No match.


You can find all the necessary files in
/w/halld-scshelf2101/Mariana/MC_gen/Gen_omegaeta


To run the MC I use the following command:
python $MCWRAPPER_CENTRAL/gluex_MC.py MC.config 50685-51768 20000000 cleangenerate=0 batch=2



Thank you,
Mariana.

Alexander Austregesilo

Jan 13, 2023, 10:46:13 AM
to gluex-s...@googlegroups.com
Hi Mariana,

I think there is something wrong with how you set up your environment.
You can use a csh script as ENVIRONMENT_FILE, but you have to properly
set up the environment in the script. The line

source Mariana_recon-2018-08-ver02_25.xml

in mpi_setup.csh does not make sense. You would have to use gxenv or
gluex_env_jlab.csh to interpret the xml file.
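
For example (just a sketch; adjust the path to wherever your xml file actually lives), that line in mpi_setup.csh could instead read

source /group/halld/Software/build_scripts/gluex_env_jlab.csh /path/to/Mariana_recon-2018-08-ver02_25.xml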

Cheers,

Alex


On 1/13/2023 10:25 AM, Mariana Khachatryan wrote:
> python $MCWRAPPER_CENTRAL/gluex_MC.py MC.config 50685-51768 20000000 cleangenerate=0 batch=2

--
Alexander Austregesilo

Staff Scientist - Experimental Nuclear Physics
Thomas Jefferson National Accelerator Facility
Newport News, VA
aaus...@jlab.org
(757) 269-6982

Mariana Khachatryan

Jan 14, 2023, 10:11:09 PM
to Alexander Austregesilo, gluex-s...@googlegroups.com
Hi Alex,

thank you for the suggestion. I modified the corresponding line in mpi_setup.csh to be
source /group/halld/Software/build_scripts/gluex_env_jlab.csh /w/halld-scshelf2101/Mariana/MC_gen/Gen_omegaeta/Mariana_recon-2018_08-ver02_25.xml

but the jobs failed without any log output, so it’s hard to see what the problem is.

Do you know what else could be the problem?

Thank you,
Mariana.

Peter Hurck

Jan 16, 2023, 3:25:20 AM
to Mariana Khachatryan, Alexander Austregesilo, gluex-s...@googlegroups.com
Hi Mariana,

Have you tried running the job locally? Just pick a single run number, then change the number of events to 100 and leave off the ‘batch=2’ at the end of the MCWrapper command. Then you can watch the job being executed in your shell. This might give you a hint as to what is going on.
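
For example, something along these lines (a sketch using one run number from your range):

python $MCWRAPPER_CENTRAL/gluex_MC.py MC.config 50685 100 cleangenerate=0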

Cheers,
Peter

Mariana Khachatryan

Jan 17, 2023, 2:28:49 PM
to Peter Hurck, Alexander Austregesilo, gluex-s...@googlegroups.com
Hi Peter,

I ran the MC generation locally, but it is not clear to me what the problem is. It ends with the following lines:

Closed ROOT file
JANA >>Closing shared object handle 0 ...
JANA >>Closing shared object handle 1 ...
JANA >>Closing shared object handle 2 ...
JANA >>Closing shared object handle 3 ...
gen_vec_ps_050685_000_gen_vec_ps_050685_000.root
gen
gen_vec_ps_diagnostic_gen_vec_ps_050685_000.root
gen
hd_root_ana_gen_vec_ps_050685_000_gen_vec_ps_050685_000.root
hdroot
hd_root_gen_vec_ps_050685_000.root
hdroot
tree_pi0pippimeta__B4_M17_gen_vec_ps_050685_000.root
reaction
tree_thrown_gen_vec_ps_050685_000.root
thrown
rm: No match.
rm: No match.
MOVING AND/OR CLEANUP FAILED
beam.config BHgen_stats.astate BHgen_thread_1.astate
Tue Jan 17 13:48:28 EST 2023
Successfully completed
ending gluex_MC.py

It creates the hddm and runnumber_0 directories, but doesn’t store the generated and reconstructed MC.

I have saved the shell output in the attached file. Can you please take a look and tell
me if you see what the problem is?



MC_output

Justin Stevens

Jan 17, 2023, 4:17:47 PM
to Mariana Khachatryan, Peter Hurck, Alexander Austregesilo, gluex-s...@googlegroups.com
Hi Mariana,

I looked at the log file you attached to the last message and it seems you successfully generated 100 events with the output going to the directory you specified in your MCWrapper config file.  

ifarm1802.jlab.org> ls /w/halld-scshelf2101/Mariana/MC_gen/Gen_omegaeta/root/*/ -hl | grep "Jan 17"
-rw-r--r-- 1 marianak halld  16K Jan 17 15:10 gen_vec_ps_050685_000_gen_vec_ps_050685_000.root
-rw-r--r-- 1 marianak halld  21K Jan 17 15:10 gen_vec_ps_diagnostic_gen_vec_ps_050685_000.root
-rw-r--r-- 1 marianak halld 3.5M Jan 17 15:14 hd_root_ana_gen_vec_ps_050685_000_gen_vec_ps_050685_000.root
-rw-r--r-- 1 marianak halld 3.8M Jan 17 15:12 hd_root_gen_vec_ps_050685_000.root
-rw-r--r-- 1 marianak halld   54K Jan 17 15:14 tree_thrown_gen_vec_ps_050685_000.root
-rw-r--r-- 1 marianak halld  101K Jan 17 15:14 tree_pi0etapr__etapr_pippimeta__pi0_gg__eta_gg__B4_M35_gen_vec_ps_050685_000.root
-rw-r--r-- 1 marianak halld  112K Jan 17 13:48 tree_pi0pippimeta__B4_M17_gen_vec_ps_050685_000.root

Is there some output you were expecting that is missing?

-Justin


Mariana Khachatryan

Jan 17, 2023, 4:24:25 PM
to Justin Stevens, Peter Hurck, Alexander Austregesilo, gluex-s...@googlegroups.com
Hi Justin,

yes, the problem is that the generated and reconstructed ROOT files are missing. They were supposed to be
saved in the directory where I run the generation, which is the following:
/w/halld-scshelf2101/Mariana/MC_gen/Gen_omegaeta


Thank you,
Mariana.

Justin Stevens

Jan 17, 2023, 4:33:46 PM
to Mariana Khachatryan, Peter Hurck, Alexander Austregesilo, gluex-s...@googlegroups.com
Hi Mariana,

In the MCWrapper config file you specify a “base” directory for the output file locations, which it appears you’ve set to

DATA_OUTPUT_BASE_DIR=/w/halld-scshelf2101/Mariana/MC_gen/Gen_omegaeta

Then, when the jobs are finished (or your interactive command completes), the files will be copied to directories within that path named hddm/ and root/.  So for the ROOT thrown and analysis trees you would look in these directories:

/w/halld-scshelf2101/Mariana/MC_gen/Gen_omegaeta/root/thrown/
/w/halld-scshelf2101/Mariana/MC_gen/Gen_omegaeta/root/trees/
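
For example, a quick way to check what is there (just an illustrative ls on those two directories):

ls -lht /w/halld-scshelf2101/Mariana/MC_gen/Gen_omegaeta/root/thrown/ | head
ls -lht /w/halld-scshelf2101/Mariana/MC_gen/Gen_omegaeta/root/trees/ | head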

It looks like many files were successfully created there last week, and some additional files (probably from your interactive run for 50685) were made today.  Is this what you’re looking for?

-Justin

Mariana Khachatryan

Jan 17, 2023, 5:08:26 PM
to Justin Stevens, Peter Hurck, Alexander Austregesilo, gluex-s...@googlegroups.com
Hi Justin,

thanks for checking, that’s right, I was able to find the files. Because the hddm directory is always renewed after MC generation, I expected the same for the root directory and was looking for a newly created root directory.

Thank you,
Mariana.

Mariana Khachatryan

Jan 19, 2023, 2:12:18 PM
to Justin Stevens, Peter Hurck, Alexander Austregesilo, gluex-s...@googlegroups.com
Dear all,

I still get an error when submitting my jobs to the batch farm. I’m trying to generate some MC using the gen_vec_ps generator version from Edmundo (I use his version of halld_sim).
When I run a job on the ifarm (python $MCWRAPPER_CENTRAL/gluex_MC.py MC.config 50685 100 cleangenerate=0) it works.
When I submit a job to run on the batch farm (python $MCWRAPPER_CENTRAL/gluex_MC.py MC.config 51768 100 cleangenerate=0 batch=2), the job fails with the following error:

gen_vec_ps: error while loading shared libraries: libImt.so: cannot open shared object file: No such file or directory

The directory where I have all the files is 
/w/halld-scshelf2101/Mariana/MC_gen/Gen_omegaeta

It looks like the problem is the environment setup, since it complains about a missing library, but I don’t know how to fix this.
I set up the environment here (/w/halld-scshelf2101/Mariana/MC_gen/Gen_omegaeta/mpi_setup.csh).
Please let me know if you have any ideas on how to fix this.

With regards,
Mariana.

Alexander Austregesilo

Jan 19, 2023, 2:37:27 PM
to Mariana Khachatryan, Justin Stevens, Peter Hurck, gluex-s...@googlegroups.com

Hi Mariana,

When you run MCWrapper interactively, it just uses your shell environment and ignores the ENVIRONMENT tags in the config file.

When I source the environment as specified in the Mariana_recon-2018_08-ver02_25.xml and try to execute gen_vec_ps, I get the same error. Are you sure the private version of halld_sim that you use in the xml file was compiled in this environment?
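
One way to check (a sketch, assuming gen_vec_ps ends up on your PATH after sourcing that environment) is to look at which shared libraries the binary actually resolves:

source /group/halld/Software/build_scripts/gluex_env_jlab.csh /path/to/Mariana_recon-2018_08-ver02_25.xml
ldd `which gen_vec_ps` | grep libImt

If libImt.so is reported as "not found", the binary was linked against a different ROOT than the one this environment provides.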

Cheers,

Alex

Mariana Khachatryan

Jan 19, 2023, 3:05:14 PM
to Alexander Austregesilo, Justin Stevens, Peter Hurck, gluex-s...@googlegroups.com
Edmundo is the one who was originally able to run this MC, and he uses his custom .xml file.
In order to have the MC processed with the same software version as the GlueX data I’m analyzing, I
use a different .xml file, where I take the halld_sim, hd_utilities, and amptools versions from Edmundo’s .xml so I can use his gen_vec_ps version.
Is this the wrong thing to do? I thought only these would be relevant for the generator.

Thank you,
Mariana.

Alexander Austregesilo

Jan 19, 2023, 3:16:21 PM
to Mariana Khachatryan, Justin Stevens, Peter Hurck, gluex-s...@googlegroups.com

Edmundo may have compiled amptools and halld_sim with a different ROOT version, which will definitely cause problems. You should try to recompile them in this custom environment.
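
A minimal sketch of what that could look like, assuming the standard scons build for halld_sim and that the custom environment points HALLD_SIM_HOME at the private checkout (amptools would need to be rebuilt with its own build system first):

# after sourcing the same environment that MCWrapper will use for the batch jobs
cd $HALLD_SIM_HOME/src
scons install -j4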

Alexander Austregesilo

Jan 19, 2023, 3:19:38 PM
to Mariana Khachatryan, GlueX Software Help Email List

On 1/19/2023 3:16 PM, Alexander Austregesilo wrote:
> Is this the wrong thing to do? I thought only these would be relevant for
> the generator.

Yes, that could cause problems if his halld_sim version is older. The
mcsmear step is also part of halld_sim, and we recommend using the
latest version.

Mariana Khachatryan

Jan 19, 2023, 7:13:18 PM
to Alexander Austregesilo, GlueX Software Help Email List
I modified the software versions and now the jobs work on the batch farm.

Thank you!