deal.II User Group
Conversations
Tags
basics
boundary_conditions
bug
complex_number
cpp
cuda
development
dg_methods
eclipse
eigen_problem
electro_magnetics
fe_spaces
feature_request
fluid_mechanics
fluid_structure_interation
h-refinement
hp_adaptivity
installation
laplace_poisson
linear_algebra
mac
manifold
matrix-free
mesh_generator
meshworker
mpi
multigrid
multithreading
news
p4est
parameter_handler
petsc
post-processing
pre-processing
slepc
solid_mechanics
suggestion
thermo_mechanics
time_integration
trilinos
tutorials
windows
1–30 of 5077
Welcome to the deal.II mailing list. If you are new to the mailing list, please take the time to read these posts: "Getting started and posting guidelines for new users" and "deal.II discussion group: Feedback and guidelines".

deal.II website: http://dealii.org
GitHub: https://github.com/dealii/dealii

vachan potluri, …, Bruno Turcksin · 13 posts · 2020-02-14
Q&A: Installation on cray XC50 | linking to petsc, lapack and blas libraries with different names
"Here is a summary of the installation process on Cray XC50. I have configured deal.II with MPI, …"
Tags: installation, petsc

Ihar Suvorau, …, David Wells · 12 posts · 2020-01-30
Q&A: "PETSc installation does not include a copy of the hypre package" while running the step-40 program
"Yup - deal.II did not pick up HYPRE or MUMPS even though you configured PETSc with both (which can be …"
Tags: installation, mac, petsc, tutorials

Juan Carlos Araujo Cabarcas, Jean-Paul Pelteret · 5 posts · 2020-01-20
Q&A: Different shape representations with manifolds on the same triangulation
"Dear Jean-Paul, thanks again for your support and kind suggestions. I have worked with …"
Tags: hp_adaptivity, manifold, petsc

Zhidong Brian Zhang, David Wells · 5 posts · 2019-12-10
Q&A: Vector conversion problem: between dealii::PETScWrappers::MPI::Vector and a PETSc Vec
"It makes much sense! Right now, it works by using the deprecated function (generating MPI::Vector …"
Tags: basics, mpi, petsc

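An illustrative sketch of the easy direction of this conversion, assuming the implicit conversion to the underlying Vec that PETScWrappers::VectorBase provides in recent deal.II releases; the reverse direction (building an MPI::Vector around an existing Vec) is the deprecated path the reply mentions, so it is not shown here:

```cpp
#include <deal.II/lac/petsc_vector.h>
#include <petscvec.h>

// deal.II -> PETSc: PETScWrappers vectors convert implicitly to the
// underlying Vec handle, so they can be passed to plain PETSc calls.
void scale_with_petsc(dealii::PETScWrappers::MPI::Vector &v)
{
  const Vec raw = static_cast<const Vec &>(v); // borrows the handle, no copy
  VecScale(raw, 2.0);                          // modifies v in place
}
```
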
richard....@gmx.at, Daniel Arndt · 3 posts · 2019-07-31
Q&A: petsc & trilinos blocksparsematrix reinit with zero locally owned components
"Dear Daniel, thank you very much for your quick and concise answer! Just for the record & other …"
Tags: basics, fluid_mechanics, mpi, petsc, trilinos

Ramprasad R, …, Bruno Turcksin · 9 posts · 2019-07-19
Q&A: Compatibility of Petsc with step 18
"Hi Daniel, The problem is now solved. The issue was that, the bash rc did not have the location of …"
Tags: cpp, mpi, petsc

Franco Milicchio, …, Daniel Arndt · 11 posts · 2019-07-30
Q&A: Porting tutorials to PETSc from Trilinos
"Thanks Daniel, now it runs. Of course it won't converge, lacking preconditiones, but this is for …"
Tags: petsc, trilinos, tutorials

Vivek Kumar, Daniel Arndt · 3 posts · 2019-07-11
Q&A: Equivalent option for local_range() for Trilinos vectors
"Thanks Daniel, it worked. On Wednesday, July 10, 2019 at 9:59:08 PM UTC-4, Daniel Arndt wrote: Vivek, …"
Tags: boundary_conditions, petsc, trilinos

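A hedged sketch of the portable answer to this kind of question: locally_owned_elements() returns an IndexSet on both the PETSc and Trilinos vector wrappers and also covers the case where the owned range is not contiguous, which a [begin, end) pair from local_range() cannot express. The helper name is ours:

```cpp
#include <deal.II/base/index_set.h>
#include <deal.II/lac/trilinos_vector.h>

#include <iostream>

// Iterate over the locally owned entries of a Trilinos vector via the
// IndexSet interface instead of a contiguous row range.
void print_owned(const dealii::TrilinosWrappers::MPI::Vector &v)
{
  const dealii::IndexSet owned = v.locally_owned_elements();
  for (const auto i : owned)                 // global indices owned here
    std::cout << i << " -> " << v[i] << '\n';
}
```
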
Pai Liu, …, Wolfgang Bangerth · 7 posts · 2019-04-04
Q&A: How to manually create sparsity pattern for PETSc sparsity matrix in parallel
"Hi Wolfgang, Thank you so much for your kind help. I tried the dynamic sparsity pattern, and with the …"
Tags: mpi, petsc

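The dynamic sparsity pattern the reply refers to is the step-40 idiom. A minimal sketch, assuming the IndexSet-based overloads available in recent deal.II releases:

```cpp
#include <deal.II/dofs/dof_handler.h>
#include <deal.II/dofs/dof_tools.h>
#include <deal.II/lac/affine_constraints.h>
#include <deal.II/lac/dynamic_sparsity_pattern.h>
#include <deal.II/lac/petsc_sparse_matrix.h>
#include <deal.II/lac/sparsity_tools.h>

using namespace dealii;

// step-40-style setup: build the dynamic pattern on the locally relevant
// rows, exchange the off-process parts between ranks, then preallocate
// the PETSc matrix over the locally owned rows and columns.
template <int dim>
void init_matrix(const DoFHandler<dim>            &dof_handler,
                 const AffineConstraints<double>  &constraints,
                 const IndexSet                   &locally_owned_dofs,
                 const IndexSet                   &locally_relevant_dofs,
                 const MPI_Comm                    mpi_communicator,
                 PETScWrappers::MPI::SparseMatrix &system_matrix)
{
  DynamicSparsityPattern dsp(locally_relevant_dofs);
  DoFTools::make_sparsity_pattern(dof_handler, dsp, constraints,
                                  /*keep_constrained_dofs=*/false);
  SparsityTools::distribute_sparsity_pattern(dsp,
                                             locally_owned_dofs,
                                             mpi_communicator,
                                             locally_relevant_dofs);
  system_matrix.reinit(locally_owned_dofs,
                       locally_owned_dofs,
                       dsp,
                       mpi_communicator);
}
```
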
gabriel...@koeln.de, …, Gabriel Peters · 17 posts · 2019-04-05
Q&A: Applying boundary values in parll::distr:triang setting for two dof_handler Sparsematrux
"Gabriel Peters Endenicher Str. 310 53121 Bonn 00491525/5478185 Gabriel...@koeln.de Am 05.04.19 um …"
Tags: boundary_conditions, p4est, petsc

Pai Liu, …, David F · 16 posts · 2019-03-26
Q&A: Is parallel direct solver extremely slow?
"Dear Pai, I'm very interested in solving a problem with characteristics very similar to yours. …"
Tags: mpi, petsc, solid_mechanics

RAJAT ARORA, …, Giorgos Kourakos · 10 posts · 2019-09-12
Q&A: Getting RHS values at nodes with DBC
"As always there is a "deal.ii" way of doing the calculations. The FEValues:: …"
Tags: mpi, petsc

Eva Lilje, …, 张嘉宁 · 3 posts · 2019-08-05
Q&A: Dealii Installtion fails because it uses the wrong MPI Version of the intel compiler. It uses debug_mt instead of release_mt.
"Hi, recently, I have the same problem. When I complie the PETsc, it link the debug_mt/linmpi.so. So I …"
Tags: petsc

mrjonm...@gmail.com, Daniel Arndt · 3 posts · 2018-06-28
Q&A: KellyErrorEstimator failure when running multiple processes
"Thank you. I don't know how I missed item 1. That's a bit embarrassing. Your first suggestion …"
Tags: basics, mpi, petsc

Feimi Yu, …, Wolfgang Bangerth · 7 posts · 2018-05-15
Q&A: Deprecated function PETScWrappers::VectorBase::ratio()
"Oh, yes. Sorry I did not say it clearly. What I did is using an identity vector whose elements are …"
Tags: petsc

mrjonm...@gmail.com, Denis Davydov · 2 posts · 2018-05-11
Q&A: Mac OS X 10.13.4 Installation problem
"Hi Jon, Try this .dmg https://github.com/luca-heltai/dealii/releases/tag/v9.0.0-rc1 If that won't …"
Tags: installation, mac, p4est, petsc

Feimi Yu, …, Weixiong Zheng · 8 posts · 2018-04-08
Q&A: Reason for SolverGMRES being slower in parallel?
"Hi Weixiong, I did consider this problem so I wanted to avoid using a fake ILU like BlockJacobi. As I …"
Tags: linear_algebra, petsc

Alexander Knieps, Wolfgang Bangerth · 4 posts · 2018-03-20
Contributing back a small feature enhancement (MGTransferPrebuilt parallel PETSc support)
"On 03/20/2018 10:43 AM, Alexander Knieps wrote: > > I think that makes sense. Are the other MG …"
Tags: development, multigrid, petsc

Feimi Yu, Wolfgang Bangerth · 13 posts · 2018-03-21
Q&A: Iterating over all the entries in a PETScWrapper::MPI::SparseMatrix in parallel
"Got it. Thank you so much! Thanks, Feimi On Wednesday, March 21, 2018 at 10:51:24 AM UTC-4, Wolfgang …"
Tags: linear_algebra, mpi, petsc

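A sketch of the usual pattern behind this question: each rank can only walk its locally owned rows, obtained from local_range(). This assumes the row-iterator interface of PETScWrappers::MatrixBase; treat it as an illustration, not the thread's exact answer:

```cpp
#include <deal.II/lac/petsc_sparse_matrix.h>

#include <iostream>

// Walk the locally owned rows of a parallel PETSc matrix; off-process
// rows are not accessible, so each rank prints only its own block.
void print_local_entries(const dealii::PETScWrappers::MPI::SparseMatrix &m)
{
  const auto range = m.local_range(); // [first, one-past-last) owned rows
  for (auto row = range.first; row < range.second; ++row)
    for (auto entry = m.begin(row); entry != m.end(row); ++entry)
      std::cout << entry->row() << ',' << entry->column()
                << " = " << entry->value() << '\n';
}
```
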
Roberto Porcù, …, Timo Heister · 7 posts · 2018-04-17
Q&A: Problem with parallelization when using hyper_cube_slit
"Dear Timo. thank you very much. I removed the check on the cell when setting the boundary indicators …"
Tags: boundary_conditions, petsc

Sukhminder Singh, Denis Davydov · 2 posts · 2018-01-24
Q&A: PetSc Hybrid MPI-OPENMP Parallelization with Spack Dealii
"Hi, On Wednesday, January 24, 2018 at 9:13:06 PM UTC+1, Sukhminder Singh wrote: I installed Spack …"
Tags: installation, petsc

Marek Čapek, Denis Davydov · 2 posts · 2017-12-16
Q&A: surprising results from DoFHandler.locally_owned_dofs() calls in the fully distributed triangulation
"Hi, On Friday, December 15, 2017 at 11:14:24 PM UTC+1, Marek Čapek wrote: Hello, I have downloaded …"
Tags: bug, petsc, trilinos

Jie Cheng, Wolfgang Bangerth · 11 posts · 2017-12-18
Q&A: Temporary distributed vectors
"On 12/17/2017 10:10 PM, Jie Cheng wrote: > > The way to deal with the sparsity pattern is to …"
Tags: mpi, petsc

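For context, the split this thread revolves around, as a minimal sketch (the function name is ours; the constructor signatures are the ones recent deal.II releases document for PETScWrappers::MPI::Vector):

```cpp
#include <deal.II/base/index_set.h>
#include <deal.II/lac/petsc_vector.h>

using namespace dealii;

// Writable vectors hold only the locally owned entries; ghosted vectors
// additionally import the locally relevant ones and are read-only.
// Temporaries used inside solvers are created in the writable layout.
void make_temporaries(const IndexSet &locally_owned_dofs,
                      const IndexSet &locally_relevant_dofs,
                      const MPI_Comm  mpi_communicator)
{
  PETScWrappers::MPI::Vector tmp(locally_owned_dofs, mpi_communicator);

  PETScWrappers::MPI::Vector ghosted(locally_owned_dofs,
                                     locally_relevant_dofs,
                                     mpi_communicator);
  ghosted = tmp; // copies values and updates the ghost entries
}
```
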
Lucas Campos, …, Jie Cheng · 10 posts · 2017-12-17
Q&A: PETSc Sparse LU Preallocation
"Hi Lucas and Wolfgang I have something to say on this issue because I think it might be helpful to …"
Tags: linear_algebra, mpi, petsc

Jie Cheng, Wolfgang Bangerth · 3 posts · 2017-12-06
Q&A: General questions in distributed parallelization
"Hi Wolfgang Thank you so much for the clear answer! Jie On Wednesday, December 6, 2017 at 3:14:46 PM …"
Tags: mpi, p4est, petsc

Lucas Campos, …, Timo Heister · 9 posts · 2017-11-30
Q&A: Errors when using MUMPS/PETSc LU
">> Then you have to simplify your problem as much as possible until we >> can reproduce …"
Tags: linear_algebra, mpi, multithreading, petsc

Lucas Campos, …, Timo Heister · 4 posts · 2017-11-13
Q&A: LU Decomposition on multiple processors
"> However I still do not understand what that line in the documentation means. > Maybe it is a …"
Tags: mpi, multithreading, petsc

Frederik S., …, Jean-Paul Pelteret · 9 posts · 2017-11-10
Q&A: CellDataStorage with mesh refinement
"Hey Jean-Paul! Sorry it took so long to answer, I was on a conference this week and didn't come …"
Tags: h-refinement, mpi, multithreading, petsc

Carlo Marcati, Bruno Turcksin · 5 posts · 2017-10-18
Q&A: SolutionTranfer with PETScWrappers::MPI::Vector
"Dear Bruno, thank you. I ended up using prepare_for_pure_refinement() and refine_interpolate(), and …"
Tags: mpi, petsc

RAJAT ARORA, Wolfgang Bangerth · 4 posts · 2017-10-03
Q&A: adaptive mesh refinement doubts
"Rajat, > 1. My problem involves the mesh movement in every time step. But with > adaptive mesh …"
Tags: boundary_conditions, cpp, petsc, tutorials