Sims_maf build error on Scons

Hi all, I installed MAF on one computer in Summer 2017 and, using the documentation out there + this forum, eventually made a set of foolproof “instructions” on how to build MAF on any computer. So I’m on a new computer, months later, and attempted to follow exactly what worked for me a few months ago… and I should have known that version updates and such would make it not so easy. I’ve had some success updating anything necessary as it comes up, but most recently I ran into this problem at step [54/88] (below are the last few lines of the error, which seemed to contain the most details):

++ unset SCONSUTILS_DIR_EXTRA

  • ./ups/eupspkg VERBOSE=0 FLAVOR=Linux64 config
  • ./ups/eupspkg VERBOSE=0 FLAVOR=Linux64 build
    ./ups/eupspkg: line 761: scons: command not found
  • exit -4
    eups distrib: Failed to build sconsUtils-14.0-15-g79b7e05.eupspkg: Command:
    source /supernova/localhome/mnewsome/lsst4/eups/bin/setups.sh; export EUPS_PATH=/supernova/localhome/mnewsome/lsst4; (/supernova/localhome/mnewsome/lsst4/EupsBuildDir/Linux64/sconsUtils-14.0-15-g79b7e05/build.sh) >> /supernova/localhome/mnewsome/lsst4/EupsBuildDir/Linux64/sconsUtils-14.0-15-g79b7e05/build.log 2>&1 4>/supernova/localhome/mnewsome/lsst4/EupsBuildDir/Linux64/sconsUtils-14.0-15-g79b7e05/build.msg
    exited with code 252

I’ve checked that scons is installed, I’ve updated it via eups anyway, and I’ve tried updating it in a virtual environment and installing sims_maf from there. I get this same error every time. I’ve searched this forum for similar problems, but the only thing I could find was from 2016 and seemed out of date and inapplicable to this problem. If it is something related to the GCC version, as those older threads suggest, then I’ll try that; I’m just not familiar with GCC.
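(For anyone hitting the same “scons: command not found”, one way to check whether eups can actually see scons is roughly the sketch below. The setups.sh path is the one from the build log above, and it assumes a scons product has been declared to eups.)

# Sketch only: confirm that eups knows about scons and that it lands on PATH.
source /supernova/localhome/mnewsome/lsst4/eups/bin/setups.sh
eups list scons                  # is a scons product declared to eups?
setup scons                      # if so, put it on the PATH for this shell
which scons && scons --version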

Let me know if I can provide any other info. Thanks so much for any help.

Your stdout should include a path pointing you to a build.log somewhere in your LSST directory structure. Could you please send along that build log?

The log is below. It appears that the -std=c++14 flag does not work with the g++ 4.8.5 that ships with CentOS 7; it looks like -std=c++1y does work with GCC 4.8, though. We are currently trying to install version 13. Perhaps 14 has fixed this?

more ./sconsUtils-14.0-15-g79b7e05/config.log
file python/lsst/sconsUtils/state.py,line 343:
Configure(confdir = .sconf_temp)
scons: Configure: Checking who built the CC compiler…
cc --version > .sconf_temp/conftest_0
scons: Configure: yes

file python/lsst/sconsUtils/state.py,line 377:
Configure(confdir = .sconf_temp)
scons: Configure: Checking whether the C++ compiler works…
.sconf_temp/conftest_1.cpp <-
|
|int main()
|{
| return 0;
|}
|
c++ -o .sconf_temp/conftest_1.o -c -std=c++14 -g .sconf_temp/conftest_1.cpp
c++: error: unrecognized command line option ‘-std=c++14’
scons: Configure: no
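(A quick way to reproduce that configure check by hand, outside of scons, is something like the sketch below; on stock CentOS 7 with GCC 4.8.5 the c++14 compile fails and the c++1y one succeeds, matching the log.)

# Minimal by-hand version of the failing configure test (sketch, not part of the build):
printf 'int main() { return 0; }\n' > conftest.cpp
g++ --version | head -n 1                          # 4.8.5 on stock CentOS 7
g++ -std=c++14 -c conftest.cpp && echo 'c++14 accepted'
g++ -std=c++1y -c conftest.cpp && echo 'c++1y accepted'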

The current development version of the DM stack, which includes sconsUtils 14.0-15-g79b7e05, isn’t supported on vanilla CentOS 7: it requires at least GCC 6.3.1. We suggest (and test against) using devtoolset-6 to provide this.

I’m pretty certain that the version of sims_maf that you’re trying to install is based on a recent weekly snapshot of the DM stack, which is why you’re picking up this dependency. The latest DM stack release, version 14, doesn’t have this GCC requirement; whether you can find a version of sims_maf compatible with that version of the DM codebase is really a question for @danielsf.

You can probably find a version of MAF that will build against DM v14.0, but it would involve installing version 2.3.6 of the sims stack using

eups distrib install sims_maf -t sims_2_3_6

We are currently on version 2.5.1, so that would be a significant reversion, and you would be missing out on a lot of new MAF development.
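(If you do go that route, a rough sketch of checking what you ended up with, assuming the tagged install above completes; eups list and setup with a tag are standard eups commands, but exactly which tags are available is something to verify.)

# Sketch, assuming the sims_2_3_6 tagged install completed:
eups list sims_maf               # confirm which sims_maf versions/tags are declared
setup sims_maf -t sims_2_3_6     # set up the older tagged version in the current shell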

I think installing and enabling devtoolset-6 is the best option. Here are the instructions I used:

https://www.softwarecollections.org/en/scls/rhscl/devtoolset-6/

Is that not an option?
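(For reference, the recipe on that page boils down to roughly the following on CentOS 7; package names may differ slightly on other RHEL variants.)

# Sketch of the devtoolset-6 setup described at the link above:
sudo yum install centos-release-scl      # enable the Software Collections repo
sudo yum install devtoolset-6            # GCC 6.x toolchain
scl enable devtoolset-6 bash             # start a shell with the newer compiler on PATH
g++ --version                            # should now report 6.x rather than 4.8.5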

Thanks @swinbank and @danielsf. Our IT department is not excited about installing that on our production machine (where this needs to live for now). I think I will build a quick Dockerfile and run it there.
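(In case it helps anyone following along, a rough, untested sketch of that container approach; the host path is a placeholder and the package list is an assumption.)

# Rough sketch only: run the build inside a CentOS 7 container with devtoolset-6,
# keeping the stack on a mounted host directory (placeholder path).
docker run -it -v /path/on/host/lsst4:/opt/lsst centos:7 bash
# ...then, inside the container:
yum install -y centos-release-scl
yum install -y devtoolset-6 bzip2 curl git patch which
scl enable devtoolset-6 bash
# ...and follow the usual newinstall / eups distrib install steps under /opt/lsst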