#0  0x00007ffff75df249 in libMesh::TypeVector<double>::operator*<double> (this=0x7fffffff9aa0, factor=@0x7fffffff9ab8: inf) at /home/shane/projects/moose/scripts/../libmesh/installed/include/libmesh/type_vector.h:790
#1  0x00007ffff75def4e in libMesh::operator*<double, double> (factor=@0x7fffffff9ab8: inf, v=...) at /home/shane/projects/moose/scripts/../libmesh/installed/include/libmesh/type_vector.h:804
#2  0x00007ffff76dc6be in ADEFieldAdvection<(ComputeStage)0>::computeQpResidual (this=0x2dad410) at /home/shane/projects/zapdos/src/kernels/ADEFieldAdvection.C:44
#3  0x00007ffff557f195 in ADKernelTempl<double, (ComputeStage)0>::computeResidual (this=0x2dad410) at /home/shane/projects/moose/framework/src/kernels/ADKernel.C:126
#4  0x00007ffff533a775 in ComputeResidualThread::onElement (this=0x7fffffffa080, elem=0x1f67ef0) at /home/shane/projects/moose/framework/src/loops/ComputeResidualThread.C:120
#5  0x00007ffff53c824c in ThreadedElementLoopBase<libMesh::StoredRange<libMesh::MeshBase::const_element_iterator, libMesh::Elem const*> >::operator() (this=0x7fffffffa080, range=..., bypass_threading=false) at /home/shane/projects/moose/framework/build/header_symlinks/ThreadedElementLoopBase.h:209
#6  0x00007ffff4cfa120 in libMesh::Threads::parallel_reduce<libMesh::StoredRange<libMesh::MeshBase::const_element_iterator, libMesh::Elem const*>, ComputeResidualThread> (range=..., body=...) at /home/shane/projects/moose/scripts/../libmesh/installed/include/libmesh/threads_pthread.h:380
#7  0x00007ffff4cbf640 in NonlinearSystemBase::computeResidualInternal (this=0xf47010, tags=...) at /home/shane/projects/moose/framework/src/systems/NonlinearSystemBase.C:1396
#8  0x00007ffff4cbb0ce in NonlinearSystemBase::computeResidualTags (this=0xf47010, tags=...) at /home/shane/projects/moose/framework/src/systems/NonlinearSystemBase.C:692
#9  0x00007ffff57f804f in FEProblemBase::computeResidualTags (this=0x1f96010, tags=...) at /home/shane/projects/moose/framework/src/problems/FEProblemBase.C:5251
--
You received this message because you are subscribed to the Google Groups "zapdos-users" group.
To unsubscribe from this group and stop receiving emails from it, send an email to zapdos-users...@googlegroups.com.
To view this discussion on the web visit https://groups.google.com/d/msgid/zapdos-users/17af7528-bfc5-4f31-80e4-53a311d47cc9%40googlegroups.com.
Floating point exception signaled(invalid floating point operation)!
To track this down, compile in debug mode, then in gdb do:
break libmesh_handleFPE
run ...
bt
[1] /home/dev/projects/moose/scripts/../libmesh/src/base/libmesh.C, line 138, compiled Feb 13 2020 at 15:51:15
application called MPI_Abort(MPI_COMM_WORLD, 1) - process 1
===================================================================================
application called MPI_Abort(MPI_COMM_WORLD, 1) - process 0
[unset]: write_line error; fd=-1 buf=:cmd=abort exitcode=1:
system msg for write_line failure : Bad file descriptor
[Inferior 1 (process 19413) exited with code 01]
On Apr 7, 2020, at 7:57 PM, Shane Keniley <keni...@illinois.edu> wrote:
<zapdos_test.sh>
ERROR: LU factorization failed with info=18
ERROR: LU factorization failed with info=1
On Apr 10, 2020, at 3:28 PM, Shane Keniley <keni...@illinois.edu> wrote:
I hope this weekend I can do more than send emails from my phone, but something quick you could try is -pc_factor_mat_solver_type superlu_dist if you haven't tried it yet. I believe the default parallel LU type is MUMPS, which is relatively memory intensive (with the trade-off of being faster). Without first digging deeper I don't know whether those errors you're seeing might be memory errors...
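In MOOSE input-file terms, that suggestion would look something like the following (a sketch only; `-pc_type lu` is an assumption about the current setup, mirroring the `petsc_options_iname`/`petsc_options_value` style used later in this thread):

```
petsc_options_iname = '-pc_type -pc_factor_mat_solver_type'
petsc_options_value = 'lu superlu_dist'
```

SuperLU_DIST trades some speed for a smaller memory footprint than MUMPS, which is why it is worth trying when a parallel LU factorization dies on a large problem.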
Time Step 101, time = 1.41303e-06, dt = 5.04911e-08
 0 Nonlinear |R| = 1.103421e+03
      0 Linear |R| = 1.103421e+03
      1 Linear |R| = 3.000056e-08
 1 Nonlinear |R| = 2.764981e+03
      0 Linear |R| = 2.764981e+03
      1 Linear |R| = 6.464790e-09
 2 Nonlinear |R| = 3.735332e+75
====================================================================================
=   BAD TERMINATION OF ONE OF YOUR APPLICATION PROCESSES
=   PID 14846 RUNNING AT r2i3n0.ib0.ice.inl.gov
=   EXIT CODE: 1
=   CLEANING UP REMAINING PROCESSES
=   YOU CAN IGNORE THE BELOW CLEANUP MESSAGES
====================================================================================
On Apr 11, 2020, at 9:14 AM, Shane Keniley <keni...@illinois.edu> wrote:
petsc_options_iname = '-pc_type -pc_asm_overlap -sub_pc_type -sub_pc_factor_shift_type -sub_pc_factor_shift_amount'
petsc_options_value = 'asm 4 ilu NONZERO 1e-10'
On Apr 11, 2020, at 12:01 PM, Shane Keniley <keni...@illinois.edu> wrote:
application called MPI_Abort(MPI_COMM_WORLD, 1) - process 0
application called MPI_Abort(MPI_COMM_WORLD, 1) - process 1
application called MPI_Abort(MPI_COMM_WORLD, 1) - process 2
_d_diffem_d_actual_mean_en[_qp] =
_diff_interpolation.sampleDerivative(std::exp(_mean_en[_qp] - _em[_qp])) * _time_units;
_diffem[_qp].value() = _diff_interpolation2->sample(std::exp(_mean_en[_qp].value() - _em[_qp].value())) * _time_units;
On Apr 18, 2020, at 8:16 PM, Shane Keniley <keni...@illinois.edu> wrote:
On Apr 19, 2020, at 1:09 PM, Shane Keniley <keni...@illinois.edu> wrote:
#0  0x00007fa873e19ca0 in PMPI_Abort () from /opt/moose/mpich-3.3/gcc-9.2.0/lib/libmpi.so.12
#1  0x00007fa87a1a93f5 in libMesh::libmesh_terminate_handler () at /home/shane/projects/moose/scripts/../libmesh/src/base/libmesh.C:321
#2  0x00007fa872bcccf6 in __cxxabiv1::__terminate(void (*)()) () at ../../../../gcc-9.2.0/libstdc++-v3/libsupc++/eh_terminate.cc:47
#3  0x00007fa872bccd41 in std::terminate () at ../../../../gcc-9.2.0/libstdc++-v3/libsupc++/eh_terminate.cc:57
#4  0x00007fa872bccf74 in __cxxabiv1::__cxa_throw (obj=<optimized out>, tinfo=0x7fa872ef1308 <typeinfo for std::out_of_range>, dest=0x7fa872be2150 <std::out_of_range::~out_of_range()>) at ../../../../gcc-9.2.0/libstdc++-v3/libsupc++/eh_throw.cc:95
#5  0x00007fa881119bc0 in LinearInterpolationTempl<double>::sampleDerivative (this=0x2f35298, x=@0x7ffff9eff298: -nan(0x8000000000000)) at /home/shane/projects/moose/framework/src/utils/LinearInterpolation.C:82
#6  0x00007fa883e7ea14 in ADGasElectronMoments::computeQpProperties (this=0x2f34010) at /home/shane/projects/zapdos/src/materials/ADGasElectronMoments.C:106
#7  0x00007fa880d21951 in Material::computeProperties (this=0x2f34010) at /home/shane/projects/moose/framework/src/materials/Material.C:106
#8  0x00007fa880d23dc1 in MaterialData::reinit (this=0x2b70b50, mats=std::__debug::vector of length 12, capacity 16 = {...}) at /home/shane/projects/moose/framework/src/materials/MaterialData.C:73
#9  0x00007fa880a8ff04 in FEProblemBase::reinitMaterials (this=0x2514010, blk_id=1, tid=0, swap_stateful=true) at /home/shane/projects/moose/framework/src/problems/FEProblemBase.C:2985
#10 0x00007fa88060b7c0 in ComputeElemAuxVarsThread<AuxKernelTempl<double> >::onElement (this=0x7ffff9f01120, elem=0x2bf1760) at /home/shane/projects/moose/framework/src/loops/ComputeElemAuxVarsThread.C:109
#11 0x00007fa88068e1b8 in ThreadedElementLoopBase<libMesh::StoredRange<libMesh::MeshBase::const_element_iterator, libMesh::Elem const*> >::operator() (this=0x7ffff9f01120, range=..., bypass_threading=false) at /home/shane/projects/moose/framework/build/header_symlinks/ThreadedElementLoopBase.h:209
#12 0x00007fa87ffdddd0 in libMesh::Threads::parallel_reduce<libMesh::StoredRange<libMesh::MeshBase::const_element_iterator, libMesh::Elem const*>, ComputeElemAuxVarsThread<AuxKernelTempl<double> > > (range=..., body=...) at /home/shane/projects/moose/scripts/../libmesh/installed/include/libmesh/threads_pthread.h:380
#13 0x00007fa87ff9c423 in AuxiliarySystem::computeElementalVarsHelper<AuxKernelTempl<double> > (this=0x2b86010, warehouse=..., vars=std::__debug::vector of length 1, capacity 1 = {...}, timer=269) at /home/shane/projects/moose/framework/src/systems/AuxiliarySystem.C:658
#14 0x00007fa87ff73ea6 in AuxiliarySystem::computeElementalVars (this=0x2b86010, type=...) at /home/shane/projects/moose/framework/src/systems/AuxiliarySystem.C:590
#15 0x00007fa87ff724d6 in AuxiliarySystem::compute (this=0x2b86010, type=...) at /home/shane/projects/moose/framework/src/systems/AuxiliarySystem.C:376
#16 0x00007fa880a9e10b in FEProblemBase::computeResidualTags (this=0x2514010, tags=std::__debug::set with 3 elements = {...}) at /home/shane/projects/moose/framework/src/problems/FEProblemBase.C:5125
#17 0x00007fa880a9d7f3 in FEProblemBase::computeResidualInternal (this=0x2514010, soln=..., residual=..., tags=std::__debug::set with 3 elements = {...}) at /home/shane/projects/moose/framework/src/problems/FEProblemBase.C:4988
#18 0x00007fa880a9d504 in FEProblemBase::computeResidual (this=0x2514010, soln=..., residual=...) at /home/shane/projects/moose/framework/src/problems/FEProblemBase.C:4943
#19 0x00007fa880a9d31d in FEProblemBase::computeResidualSys (this=0x2514010, soln=..., residual=...) at /home/shane/projects/moose/framework/src/problems/FEProblemBase.C:4918
#20 0x00007fa87ff7434e in ComputeResidualFunctor::residual (this=0x1f73c10, soln=..., residual=..., sys=...) at /home/shane/projects/moose/framework/src/systems/ComputeResidualFunctor.C:23
any comparison with NaN is false." So different compilers could probably implement different behaviors. So in summary, I think the best thing for you to do is to wrap those material property calculations in a try-catch block and catch std::out_of_range exceptions when they happen.