Problems with tutorial: "calibrating single frames with processCcd.py"

Hi everyone :relaxed:
I’m just starting to use the LSST Science Pipelines, and I’ve been following the Getting Started tutorial (Getting started with the LSST Science Pipelines — LSST Science Pipelines). In part 2, the processCcd.py command is used to calibrate the data; I can do the dry run to see what data will be processed in the Butler repository:

(lsst) jahumada@leftraru2:/mnt/flock/jahumada/Paula$ processCcd.py DATA --rerun processCcdOutputs --id --show data
root INFO: Loading config overrride file '/home/jahumada/.conda/envs/lsst/opt/lsst/obs_subaru/config/processCcd.py'
Cannot import lsst.meas.extensions.convolved (No module named convolved): disabling convolved flux measurements
root INFO: Loading config overrride file '/home/jahumada/.conda/envs/lsst/opt/lsst/obs_subaru/config/hsc/processCcd.py'
id dataRef.dataId = {'taiObs': '2013-06-17', 'pointing': 533, 'visit': 903334, 'dateObs': '2013-06-17', 'filter': 'HSC-R', 'field': 'STRIPE82L', 'ccd': 23, 'expTime': 30.0}
id dataRef.dataId = {'taiObs': '2013-06-17', 'pointing': 533, 'visit': 903334, 'dateObs': '2013-06-17', 'filter': 'HSC-R', 'field': 'STRIPE82L', 'ccd': 22, 'expTime': 30.0}
id dataRef.dataId = {'taiObs': '2013-06-17', 'pointing': 533, 'visit': 903334, 'dateObs': '2013-06-17', 'filter': 'HSC-R', 'field': 'STRIPE82L', 'ccd': 16, 'expTime': 30.0}
id dataRef.dataId = {'taiObs': '2013-06-17', 'pointing': 533, 'visit': 903334, 'dateObs': '2013-06-17', 'filter': 'HSC-R', 'field': 'STRIPE82L', 'ccd': 100, 'expTime': 30.0}
id dataRef.dataId = {'taiObs': '2013-06-17', 'pointing': 533, 'visit': 903336, 'dateObs': '2013-06-17', 'filter': 'HSC-R', 'field': 'STRIPE82L', 'ccd': 24, 'expTime': 30.0}
id dataRef.dataId = {'taiObs': '2013-06-17', 'pointing': 533, 'visit': 903336, 'dateObs': '2013-06-17', 'filter': 'HSC-R', 'field': 'STRIPE82L', 'ccd': 17, 'expTime': 30.0}
id dataRef.dataId = {'taiObs': '2013-06-17', 'pointing': 533, 'visit': 903338, 'dateObs': '2013-06-17', 'filter': 'HSC-R', 'field': 'STRIPE82L', 'ccd': 25, 'expTime': 30.0}
id dataRef.dataId = {'taiObs': '2013-06-17', 'pointing': 533, 'visit': 903338, 'dateObs': '2013-06-17', 'filter': 'HSC-R', 'field': 'STRIPE82L', 'ccd': 18, 'expTime': 30.0}
id dataRef.dataId = {'taiObs': '2013-06-17', 'pointing': 533, 'visit': 903342, 'dateObs': '2013-06-17', 'filter': 'HSC-R', 'field': 'STRIPE82L', 'ccd': 100, 'expTime': 30.0}
id dataRef.dataId = {'taiObs': '2013-06-17', 'pointing': 533, 'visit': 903342, 'dateObs': '2013-06-17', 'filter': 'HSC-R', 'field': 'STRIPE82L', 'ccd': 10, 'expTime': 30.0}
id dataRef.dataId = {'taiObs': '2013-06-17', 'pointing': 533, 'visit': 903342, 'dateObs': '2013-06-17', 'filter': 'HSC-R', 'field': 'STRIPE82L', 'ccd': 4, 'expTime': 30.0}
id dataRef.dataId = {'taiObs': '2013-06-17', 'pointing': 533, 'visit': 903344, 'dateObs': '2013-06-17', 'filter': 'HSC-R', 'field': 'STRIPE82L', 'ccd': 11, 'expTime': 30.0}
id dataRef.dataId = {'taiObs': '2013-06-17', 'pointing': 533, 'visit': 903344, 'dateObs': '2013-06-17', 'filter': 'HSC-R', 'field': 'STRIPE82L', 'ccd': 5, 'expTime': 30.0}
id dataRef.dataId = {'taiObs': '2013-06-17', 'pointing': 533, 'visit': 903344, 'dateObs': '2013-06-17', 'filter': 'HSC-R', 'field': 'STRIPE82L', 'ccd': 0, 'expTime': 30.0}
id dataRef.dataId = {'taiObs': '2013-06-17', 'pointing': 533, 'visit': 903346, 'dateObs': '2013-06-17', 'filter': 'HSC-R', 'field': 'STRIPE82L', 'ccd': 12, 'expTime': 30.0}
id dataRef.dataId = {'taiObs': '2013-06-17', 'pointing': 533, 'visit': 903346, 'dateObs': '2013-06-17', 'filter': 'HSC-R', 'field': 'STRIPE82L', 'ccd': 6, 'expTime': 30.0}
id dataRef.dataId = {'taiObs': '2013-06-17', 'pointing': 533, 'visit': 903346, 'dateObs': '2013-06-17', 'filter': 'HSC-R', 'field': 'STRIPE82L', 'ccd': 1, 'expTime': 30.0}
id dataRef.dataId = {'taiObs': '2013-11-02', 'pointing': 671, 'visit': 903986, 'dateObs': '2013-11-02', 'filter': 'HSC-I', 'field': 'STRIPE82L', 'ccd': 23, 'expTime': 30.0}
id dataRef.dataId = {'taiObs': '2013-11-02', 'pointing': 671, 'visit': 903986, 'dateObs': '2013-11-02', 'filter': 'HSC-I', 'field': 'STRIPE82L', 'ccd': 22, 'expTime': 30.0}
id dataRef.dataId = {'taiObs': '2013-11-02', 'pointing': 671, 'visit': 903986, 'dateObs': '2013-11-02', 'filter': 'HSC-I', 'field': 'STRIPE82L', 'ccd': 16, 'expTime': 30.0}
id dataRef.dataId = {'taiObs': '2013-11-02', 'pointing': 671, 'visit': 903986, 'dateObs': '2013-11-02', 'filter': 'HSC-I', 'field': 'STRIPE82L', 'ccd': 100, 'expTime': 30.0}
id dataRef.dataId = {'taiObs': '2013-11-02', 'pointing': 671, 'visit': 903988, 'dateObs': '2013-11-02', 'filter': 'HSC-I', 'field': 'STRIPE82L', 'ccd': 24, 'expTime': 30.0}
id dataRef.dataId = {'taiObs': '2013-11-02', 'pointing': 671, 'visit': 903988, 'dateObs': '2013-11-02', 'filter': 'HSC-I', 'field': 'STRIPE82L', 'ccd': 23, 'expTime': 30.0}
id dataRef.dataId = {'taiObs': '2013-11-02', 'pointing': 671, 'visit': 903988, 'dateObs': '2013-11-02', 'filter': 'HSC-I', 'field': 'STRIPE82L', 'ccd': 17, 'expTime': 30.0}
id dataRef.dataId = {'taiObs': '2013-11-02', 'pointing': 671, 'visit': 903988, 'dateObs': '2013-11-02', 'filter': 'HSC-I', 'field': 'STRIPE82L', 'ccd': 16, 'expTime': 30.0}
id dataRef.dataId = {'taiObs': '2013-11-02', 'pointing': 671, 'visit': 903990, 'dateObs': '2013-11-02', 'filter': 'HSC-I', 'field': 'STRIPE82L', 'ccd': 25, 'expTime': 30.0}
id dataRef.dataId = {'taiObs': '2013-11-02', 'pointing': 671, 'visit': 903990, 'dateObs': '2013-11-02', 'filter': 'HSC-I', 'field': 'STRIPE82L', 'ccd': 18, 'expTime': 30.0}
id dataRef.dataId = {'taiObs': '2013-11-02', 'pointing': 671, 'visit': 904010, 'dateObs': '2013-11-02', 'filter': 'HSC-I', 'field': 'STRIPE82L', 'ccd': 100, 'expTime': 30.0}
id dataRef.dataId = {'taiObs': '2013-11-02', 'pointing': 671, 'visit': 904010, 'dateObs': '2013-11-02', 'filter': 'HSC-I', 'field': 'STRIPE82L', 'ccd': 10, 'expTime': 30.0}
id dataRef.dataId = {'taiObs': '2013-11-02', 'pointing': 671, 'visit': 904010, 'dateObs': '2013-11-02', 'filter': 'HSC-I', 'field': 'STRIPE82L', 'ccd': 4, 'expTime': 30.0}
id dataRef.dataId = {'taiObs': '2013-11-02', 'pointing': 671, 'visit': 904014, 'dateObs': '2013-11-02', 'filter': 'HSC-I', 'field': 'STRIPE82L', 'ccd': 12, 'expTime': 30.0}
id dataRef.dataId = {'taiObs': '2013-11-02', 'pointing': 671, 'visit': 904014, 'dateObs': '2013-11-02', 'filter': 'HSC-I', 'field': 'STRIPE82L', 'ccd': 6, 'expTime': 30.0}
id dataRef.dataId = {'taiObs': '2013-11-02', 'pointing': 671, 'visit': 904014, 'dateObs': '2013-11-02', 'filter': 'HSC-I', 'field': 'STRIPE82L', 'ccd': 1, 'expTime': 30.0}

But once I try to actually run it, I get the following error:

(lsst) jahumada@leftraru2:/mnt/flock/jahumada/Paula$ processCcd.py DATA --rerun processCcdOutputs --id
root INFO: Loading config overrride file '/home/jahumada/.conda/envs/lsst/opt/lsst/obs_subaru/config/processCcd.py'
Cannot import lsst.meas.extensions.convolved (No module named convolved): disabling convolved flux measurements
root INFO: Loading config overrride file '/home/jahumada/.conda/envs/lsst/opt/lsst/obs_subaru/config/hsc/processCcd.py'
root INFO: Running: /home/jahumada/.conda/envs/lsst/opt/lsst/pipe_tasks/bin/processCcd.py DATA --rerun processCcdOutputs --id
processCcd INFO: Processing {'taiObs': '2013-06-17', 'pointing': 533, 'visit': 903334, 'dateObs': '2013-06-17', 'filter': 'HSC-R', 'field': 'STRIPE82L', 'ccd': 23, 'expTime': 30.0}
processCcd.isr INFO: Performing ISR on sensor {'taiObs': '2013-06-17', 'pointing': 533, 'visit': 903334, 'dateObs': '2013-06-17', 'filter': 'HSC-R', 'field': 'STRIPE82L', 'ccd': 23, 'expTime': 30.0}
processCcd FATAL: Failed on dataId={'taiObs': '2013-06-17', 'pointing': 533, 'visit': 903334, 'dateObs': '2013-06-17', 'filter': 'HSC-R', 'field': 'STRIPE82L', 'ccd': 23, 'expTime': 30.0}: disk I/O error
Traceback (most recent call last):
  File "/home/jahumada/.conda/envs/lsst/opt/lsst/pipe_base/python/lsst/pipe/base/cmdLineTask.py", line 347, in __call__
    result = task.run(dataRef, **kwargs)
  File "/home/jahumada/.conda/envs/lsst/opt/lsst/pipe_base/python/lsst/pipe/base/timer.py", line 121, in wrapper
    res = func(self, *args, **keyArgs)
  File "/home/jahumada/.conda/envs/lsst/opt/lsst/pipe_tasks/python/lsst/pipe/tasks/processCcd.py", line 181, in run
    exposure = self.isr.runDataRef(sensorRef).exposure
  File "/home/jahumada/.conda/envs/lsst/opt/lsst/obs_subaru/python/lsst/obs/subaru/isr.py", line 230, in runDataRef
    defects = sensorRef.get("defects", immediate=True)
  File "/home/jahumada/.conda/envs/lsst/opt/lsst/daf_persistence/python/lsst/daf/persistence/butlerSubset.py", line 198, in get
    return self.butlerSubset.butler.get(datasetType, self.dataId, **rest)
  File "/home/jahumada/.conda/envs/lsst/opt/lsst/daf_persistence/python/lsst/daf/persistence/butler.py", line 699, in get
    location = self._locate(datasetType, dataId, write=False)
  File "/home/jahumada/.conda/envs/lsst/opt/lsst/daf_persistence/python/lsst/daf/persistence/butler.py", line 649, in _locate
    location = repoData.repo.map(datasetType, dataId, write=write)
  File "/home/jahumada/.conda/envs/lsst/opt/lsst/daf_persistence/python/lsst/daf/persistence/repository.py", line 183, in map
    loc = self._mapper.map(*args, **kwargs)
  File "/home/jahumada/.conda/envs/lsst/opt/lsst/obs_subaru/python/lsst/obs/hsc/hscMapper.py", line 156, in map
    location = super(HscMapper, self).map(datasetType, copyId, write=write)
  File "/home/jahumada/.conda/envs/lsst/opt/lsst/daf_persistence/python/lsst/daf/persistence/mapper.py", line 144, in map
    return func(self.validate(dataId), write)
  File "/home/jahumada/.conda/envs/lsst/opt/lsst/obs_base/python/lsst/obs/base/cameraMapper.py", line 688, in map_defects
    defectFitsPath = self._defectLookup(dataId=dataId)
  File "/home/jahumada/.conda/envs/lsst/opt/lsst/obs_base/python/lsst/obs/base/cameraMapper.py", line 985, in _defectLookup
    (ccdVal, taiObs))
  File "/home/jahumada/.conda/envs/lsst/opt/lsst/daf_persistence/python/lsst/daf/persistence/registries.py", line 365, in executeQuery
    c = self.conn.execute(cmd, values)
OperationalError: disk I/O error

It’s an input/output error, but I have no idea what is wrong or how to solve it; maybe I missed something along the way!
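For what it’s worth, `OperationalError: disk I/O error` is raised by SQLite itself, and it often means the registry file sits on a filesystem where SQLite’s file locking doesn’t work (a common issue on cluster/network scratch mounts). A minimal sketch to check whether SQLite can read a registry file at all, outside the pipeline (the path in the comment is just an example, not a path from this thread):

```python
import sqlite3


def registry_is_readable(path):
    """Try a trivial query against an SQLite registry file.

    Returns True if SQLite can open and read it, False on an
    OperationalError (e.g. locking problems on network filesystems).
    """
    try:
        conn = sqlite3.connect(path)
        conn.execute("SELECT name FROM sqlite_master LIMIT 1")
        conn.close()
        return True
    except sqlite3.OperationalError:
        return False


# Example usage (hypothetical path; point it at your repo's registry):
# registry_is_readable("DATA/registry.sqlite3")
```

If this returns False on the cluster filesystem but True when the file is copied to local disk, the problem is the filesystem rather than the pipeline.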

Just in case, these are the commands I run to get to the point of using processCcd.py:

jahumada@leftraru2:/mnt/flock/jahumada/Paula$ source activate lsst
(lsst) jahumada@leftraru2:/mnt/flock/jahumada/Paula$ source eups-setups.sh
(lsst) jahumada@leftraru2:/mnt/flock/jahumada/Paula$ setup lsst_distrib
(lsst) jahumada@leftraru2:/mnt/flock/jahumada/Paula$ ml git-lfs
(lsst) jahumada@leftraru2:/mnt/flock/jahumada/Paula$ git lfs install
Git LFS initialized.

I run the git-lfs commands every time I log in, since I’m using the LSST pipeline on a cluster called ‘leftraru’ at NLHPC.
Thanks a lot! :slight_smile:

What version of the Science Pipelines are you using? It looks like the problem is occurring when trying to access the defectRegistry.sqlite3 file, which went away a number of releases ago.

Hi :slight_smile:
The Science Pipelines were installed following the instructions in Installing and Using LSST Science Pipelines as a Conda Package — LSST Science Pipelines DM-10013 documentation, which ultimately uses the channel http://conda.lsst.codes/stack/0.13.0. I imagine 0.13.0 is the version?

Wow, that’s old (2017, I think). Thanks for pointing it out, but in general you should not be using development versions of the documentation (indicated by “Edition: DM-NNNNN” in the sidebar) as a reference.

If you would like to use a conda installation of the Science Pipelines rather than one of our other mechanisms documented at The LSST Science Pipelines — LSST Science Pipelines, I would suggest installing stackvana from conda-forge (courtesy of @beckermr ). I believe it will give you a recent weekly release, rather than a stable official release, however.


Thanks a lot for the suggestion :slight_smile: . I will try installing stackvana and see how it goes.

Hi :slight_smile:
I installed stackvana from conda-forge and now obtain the following:

(lsst) jahumada@leftraru3:/mnt/flock/jahumada/Paula$ processCcd.py DATA --rerun processCcdOutputs --id
Could not import lsstcppimport; please ensure the base package has been built (not just setup).

Traceback (most recent call last):
  File "/home/jahumada/.conda/envs/lsst/opt/lsst/ip_isr/python/lsst/ip/isr/isrLib.py", line 24, in swig_import_helper
    return importlib.import_module(mname)
  File "/home/lmod/software/Anaconda3/2020.02/lib/python3.7/importlib/__init__.py", line 127, in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
  File "<frozen importlib._bootstrap>", line 1006, in _gcd_import
  File "<frozen importlib._bootstrap>", line 983, in _find_and_load
  File "<frozen importlib._bootstrap>", line 967, in _find_and_load_unlocked
  File "<frozen importlib._bootstrap>", line 670, in _load_unlocked
  File "<frozen importlib._bootstrap>", line 583, in module_from_spec
  File "<frozen importlib._bootstrap_external>", line 1043, in create_module
  File "<frozen importlib._bootstrap>", line 219, in _call_with_frames_removed
ImportError: dynamic module does not define module export function (PyInit__isrLib)

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/home/jahumada/.conda/envs/lsst/opt/lsst/pipe_tasks/bin/processCcd.py", line 23, in <module>
    from lsst.pipe.tasks.processCcd import ProcessCcdTask
  File "/home/jahumada/.conda/envs/lsst/opt/lsst/pipe_tasks/python/lsst/pipe/tasks/processCcd.py", line 22, in <module>
    from lsst.ip.isr import IsrTask
  File "/home/jahumada/.conda/envs/lsst/opt/lsst/ip_isr/python/lsst/ip/isr/__init__.py", line 24, in <module>
    from .isrLib import *
  File "/home/jahumada/.conda/envs/lsst/opt/lsst/ip_isr/python/lsst/ip/isr/isrLib.py", line 27, in <module>
    _isrLib = swig_import_helper()
  File "/home/jahumada/.conda/envs/lsst/opt/lsst/ip_isr/python/lsst/ip/isr/isrLib.py", line 26, in swig_import_helper
    return importlib.import_module('_isrLib')
  File "/home/lmod/software/Anaconda3/2020.02/lib/python3.7/importlib/__init__.py", line 127, in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
ModuleNotFoundError: No module named '_isrLib'

I suspect this is most probably because it’s not a stable release, considering that the ‘base’ version of the LSST pipeline I have is from 2017. Do you think there is any way to complete the tutorials without updating the pipeline?

Stackvana (which we do not officially support) comes with all the software necessary; there’s no need for any “base” version.
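One quick sanity check in situations like this is to ask Python where it actually resolves a module from: the `ImportError` above is what a stale SWIG-era build (e.g. the 2017 install under `~/.conda`) produces when it shadows a newer package on `sys.path`. A minimal sketch, shown here with a stdlib module since the module name to inspect on your machine (e.g. `lsst.ip.isr`) depends on your install:

```python
import importlib.util


def module_origin(name):
    """Report the file Python would load a module from, or None if
    the module cannot be found.

    Comparing this path against your active environment reveals
    whether an old install is shadowing the one you expect.
    """
    spec = importlib.util.find_spec(name)
    return spec.origin if spec else None


# Example with a stdlib module; substitute e.g. "lsst.ip.isr" in an
# environment with the Science Pipelines installed:
print(module_origin("json"))
```

If the reported path points into the old `/home/jahumada/.conda/envs/lsst/opt/lsst` tree rather than the stackvana environment, the old environment needs to be deactivated or removed from `PYTHONPATH` first.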

I did the following successfully, following the instructions in Getting started with the LSST Science Pipelines — LSST Science Pipelines exactly, after the first two commands and making sure to start in a clean shell with no LSST-related environment variables set:

$ conda create -n sv stackvana
$ conda activate sv
$ git clone https://github.com/lsst/testdata_ci_hsc
$ mkdir DATA
$ setup -j -r testdata_ci_hsc
$ echo "lsst.obs.hsc.HscMapper" > DATA/_mapper
$ ingestImages.py DATA $TESTDATA_CI_HSC_DIR/raw/*.fits --mode=link
$ installTransmissionCurves.py DATA
$ ln -s $TESTDATA_CI_HSC_DIR/CALIB/ DATA/CALIB
$ mkdir -p DATA/ref_cats
$ ln -s $TESTDATA_CI_HSC_DIR/ps1_pv3_3pi_20170110 DATA/ref_cats/ps1_pv3_3pi_20170110
$ processCcd.py DATA --rerun processCcdOutputs --id --show data
$ processCcd.py DATA --rerun processCcdOutputs --id

Hi :slight_smile: thanks a lot for your help. I’m still unable to install stackvana completely, but parallel to this I could ultimately get the version 21 of the lsst pipeline installed. Now re-doing the tutorial, the processCcd.py command line gives me the following:

(lsst-scipipe-cb4e2dc) jahumada@leftraru4:/mnt/flock/jahumada/Paula$ processCcd.py DATA --rerun processCcdOutputs --id
root INFO: Loading config overrride file '/home/jahumada/lsst_stack-v21/stack/miniconda3-py37_4.8.2-cb4e2dc/Linux64/obs_subaru/21.0.0+d401af1dcd/config/processCcd.py'
root INFO: Loading config overrride file '/home/jahumada/lsst_stack-v21/stack/miniconda3-py37_4.8.2-cb4e2dc/Linux64/obs_subaru/21.0.0+d401af1dcd/config/hsc/processCcd.py'
CameraMapper INFO: Loading exposure registry from /mnt/flock/jahumada/Paula/DATA/registry.sqlite3
CameraMapper INFO: Loading calib registry from /mnt/flock/jahumada/Paula/DATA/CALIB/calibRegistry.sqlite3
CameraMapper INFO: Loading calib registry from /mnt/flock/jahumada/Paula/DATA/CALIB/calibRegistry.sqlite3
root INFO: Running: /home/jahumada/lsst_stack-v21/stack/miniconda3-py37_4.8.2-cb4e2dc/Linux64/pipe_tasks/21.0.0+44ca056b81/bin/processCcd.py DATA --rerun processCcdOutputs --id

Caught signal 11, backtrace follows:
/home/jahumada/lsst_stack-v21/stack/miniconda3-py37_4.8.2-cb4e2dc/Linux64/utils/21.0.0+e0676b0dc8/lib/libutils.so(+0x17364) [0x2ad1ac1ec364]
/lib64/libc.so.6(+0x36280) [0x2ad1a1ae4280]
/home/lmod/software/MPI/intel/2019.2.187-GCC-8.2.0-2.31.1/impi/2019.2.187/imkl/2019.2.187/lib/intel64/libiomp5.so(+0x96c09) [0x2ad1b419bc09]
/home/lmod/software/MPI/intel/2019.2.187-GCC-8.2.0-2.31.1/impi/2019.2.187/imkl/2019.2.187/lib/intel64/libiomp5.so(+0xcde9a) [0x2ad1b41d2e9a]
/home/lmod/software/MPI/intel/2019.2.187-GCC-8.2.0-2.31.1/impi/2019.2.187/imkl/2019.2.187/lib/intel64/libiomp5.so(+0xb6c8d) [0x2ad1b41bbc8d]
/home/lmod/software/MPI/intel/2019.2.187-GCC-8.2.0-2.31.1/impi/2019.2.187/imkl/2019.2.187/lib/intel64/libiomp5.so(omp_get_max_threads+0x35) [0x2ad1b419ce45]
/home/jahumada/lsst_stack-v21/stack/miniconda3-py37_4.8.2-cb4e2dc/Linux64/base/21.0.0+d529cf1a41/lib/libbase.so(+0x4fb3) [0x2ad1ac226fb3]
/home/jahumada/lsst_stack-v21/stack/miniconda3-py37_4.8.2-cb4e2dc/Linux64/base/21.0.0+d529cf1a41/lib/libbase.so(lsst::base::disableImplicitThreading()+0x170) [0x2ad1ac227400]
/home/jahumada/lsst_stack-v21/stack/miniconda3-py37_4.8.2-cb4e2dc/Linux64/base/21.0.0+d529cf1a41/python/lsst/base/threads.so(+0x62ea) [0x2ad24fc662ea]
/home/jahumada/lsst_stack-v21/stack/miniconda3-py37_4.8.2-cb4e2dc/Linux64/base/21.0.0+d529cf1a41/python/lsst/base/threads.so(+0xeffd) [0x2ad24fc6effd]
python(_PyMethodDef_RawFastCallKeywords+0x264) [0x5583eac3e9a4]
python(_PyEval_EvalFrameDefault+0x4509) [0x5583eacbc639]
python(_PyFunction_FastCallKeywords+0x187) [0x5583eac2dfa7]
python(+0x17f995) [0x5583eac74995]
python(_PyEval_EvalFrameDefault+0x681) [0x5583eacb87b1]
python(_PyEval_EvalCodeWithName+0x255) [0x5583eac0e735]
python(_PyFunction_FastCallKeywords+0x521) [0x5583eac2e341]
python(+0x17f995) [0x5583eac74995]
python(_PyEval_EvalFrameDefault+0x48b2) [0x5583eacbc9e2]
python(_PyEval_EvalCodeWithName+0x255) [0x5583eac0e735]
python(PyEval_EvalCode+0x23) [0x5583eac0fb33]
python(+0x227962) [0x5583ead1c962]
python(PyRun_FileExFlags+0x9e) [0x5583ead26a1e]
python(PyRun_SimpleFileExFlags+0x1bb) [0x5583ead26c0b]
python(+0x232cf1) [0x5583ead27cf1]
python(_Py_UnixMain+0x3c) [0x5583ead27d7c]
/lib64/libc.so.6(__libc_start_main+0xf5) [0x2ad1a1ad03d5]
python(+0x1d7a25) [0x5583eaccca25]

Segmentation fault

I’m not sure if this is okay; it seems the data couldn’t be processed for some reason, and I’m not sure why.

Thanks!

I think you will have to disable your use of the Intel MPI/MKL library that seems to be inserted in place of the conda-installed libraries.

It’s possible you could build from source with that library in place, but I’m not sure how easy it would be, and the configuration would be unsupported.
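For anyone hitting the same segfault: the backtrace shows an Intel `libiomp5.so` from an environment module being used in place of the conda-provided OpenMP runtime. The exact module name to unload is site-specific, but you can first spot the offending entries on the library search path. A small sketch (the `demo_path` value below is a made-up example resembling the cluster’s; on a real system, inspect `$LD_LIBRARY_PATH` itself):

```shell
# Hypothetical LD_LIBRARY_PATH for demonstration; on the cluster, use:
#   echo "$LD_LIBRARY_PATH" | tr ':' '\n'
demo_path="/home/lmod/software/MPI/intel/2019.2.187/impi/lib:/home/user/.conda/envs/lsst/lib"

# Print any Intel MPI/MKL entries that would be searched ahead of the
# conda runtime (these are what the backtrace shows being loaded).
echo "$demo_path" | tr ':' '\n' | grep -i -E 'intel|imkl|impi' || echo "no Intel entries found"
```

Once the offending entries are identified, unloading the environment module that provides them (with your site’s `module unload`/`ml` command) before activating the conda environment should keep them off the path.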

Thanks a lot!!! Disabling the Intel MPI/MKL library made it work :slight_smile: