Views: 2194 | Replies: 6
maoxinxina

[Help] Error during a GPAW calculation
The following error comes up during a GPAW calculation:

ImportError: numpy.core.multiarray failed to import
--------------------------------------------------------------------------
mpirun has exited due to process rank 9 with PID 25960 on node a530 exiting
improperly. There are two reasons this could occur:

1. this process did not call "init" before exiting, but others in
the job did. This can cause a job to hang indefinitely while it waits
for all processes to call "init". By rule, if one process calls "init",
then ALL processes must call "init" prior to termination.

2. this process called "init", but exited without calling "finalize".
By rule, all processes that call "init" MUST call "finalize" prior to
exiting or it will be considered an "abnormal termination"

This may have caused other processes in the application to be
terminated by signals sent by mpirun (as reported here).

How should I go about fixing this?
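A common cause of "numpy.core.multiarray failed to import" under mpirun is that the numpy found at run time on the compute nodes is not the one gpaw-python was built against (a different environment module, a stale PYTHONPATH, or mismatched numpy versions). Below is a minimal sketch of a sanity check, submitted through the same launcher and gpaw-python binary as the failing job; the script name check_env.py is only a placeholder:

# check_env.py -- run with the same launcher as the failing job, e.g.
#   mpirun -np 4 gpaw-python check_env.py
# Compare the printed interpreter and numpy paths across ranks and against
# the numpy that gpaw-python was compiled with.
import sys
import numpy

print('executable:', sys.executable)
print('python    :', sys.version.split()[0])
print('numpy     :', numpy.__version__, numpy.__file__)

If different ranks (or the login node versus the compute nodes) report different numpy paths or versions, aligning the environment in the job script and rebuilding gpaw-python against that numpy is the usual remedy.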
maoxinxina
#6 · 2017-03-23 11:20:41
charleslian
#2 · 2017-03-16 11:06:45
maoxinxina
I installed what you suggested, but it still fails; the error now looks like this:

An MPI process has executed an operation involving a call to the
"fork()" system call to create a child process. Open MPI is currently
operating in a condition that could result in memory corruption or
other system errors; your MPI job may hang, crash, or produce silent
data corruption. The use of fork() (or system() or other calls that
create child processes) is strongly discouraged.

The process that invoked fork was:

  Local host:          a264 (PID 28715)
  MPI_COMM_WORLD rank: 4

If you are *absolutely sure* that your application will successfully
and correctly survive a call to fork(), you may disable this warning
by setting the mpi_warn_on_fork MCA parameter to 0.
--------------------------------------------------------------------------
[a264:28710] 23 more processes have sent help message help-mpi-runtime.txt / mpi_init:warn-fork
[a264:28710] Set MCA parameter "orte_base_help_aggregate" to 0 to see all help / error messages
rank=20 L00: Traceback (most recent call last):
rank=20 L01:   File "C6N7-C3N3.py", line 64, in <module>
rank=20 L02:     atom.get_potential_energy()
rank=20 L03:   File "/public/home/users/zkchu/program/python-tools/ase-3.9.1/lib/python2.7/site-packages/ase/atoms.py", line 640, in get_potential_energy
rank=20 L04:     energy = self._calc.get_potential_energy(self)
rank=20 L05:   File "/public/home/users/dicp004/gpaw/lib/python2.7/site-packages/gpaw/aseinterface.py", line 50, in get_potential_energy
rank=20 L06:     self.calculate(atoms, converge=True)
rank=20 L07:   File "/public/home/users/dicp004/gpaw/lib/python2.7/site-packages/gpaw/paw.py", line 251, in calculate
rank=20 L08:     self.set_positions(atoms)
rank=20 L09:   File "/public/home/users/dicp004/gpaw/lib/python2.7/site-packages/gpaw/paw.py", line 329, in set_positions
rank=20 L10:     self.wfs.initialize(self.density, self.hamiltonian, spos_ac)
rank=20 L11:   File "/public/home/users/dicp004/gpaw/lib/python2.7/site-packages/gpaw/wavefunctions/fdpw.py", line 71, in initialize
rank=20 L12:     basis_functions, density, hamiltonian, spos_ac)
rank=20 L13:   File "/public/home/users/dicp004/gpaw/lib/python2.7/site-packages/gpaw/wavefunctions/fdpw.py", line 108, in initialize_wave_functions_from_basis_functions
rank=20 L14:     lcaobd.mynbands)
rank=20 L15:   File "/public/home/users/dicp004/gpaw/lib/python2.7/site-packages/gpaw/wavefunctions/fd.py", line 250, in initialize_from_lcao_coefficients
rank=20 L16:     kpt.psit_nG = self.gd.zeros(self.bd.mynbands, self.dtype)
rank=20 L17:   File "/public/home/users/dicp004/gpaw/lib/python2.7/site-packages/gpaw/grid_descriptor.py", line 199, in zeros
rank=20 L18:     return self._new_array(n, dtype, True, global_array, pad)
rank=20 L19:   File "/public/home/users/dicp004/gpaw/lib/python2.7/site-packages/gpaw/grid_descriptor.py", line 224, in _new_array
rank=20 L20:     return np.zeros(shape, dtype)
rank=20 L21: MemoryError
GPAW CLEANUP (node 20): <type 'exceptions.MemoryError'> occurred.  Calling MPI_Abort!
--------------------------------------------------------------------------
MPI_ABORT was invoked on rank 20 in communicator MPI_COMM_WORLD
with errorcode 42.

NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.
You may or may not see output from other processes, depending on
exactly when Open MPI kills them.
--------------------------------------------------------------------------
gpaw-python: c/extensions.h:36: gpaw_malloc: Assertion `p != ((void *)0)' failed.
gpaw-python: c/extensions.h:36: gpaw_malloc: Assertion `p != ((void *)0)' failed.
gpaw-python: c/extensions.h:36: gpaw_malloc: Assertion `p != ((void *)0)' failed.
gpaw-python: c/extensions.h:36: gpaw_malloc: Assertion `p != ((void *)0)' failed.
--------------------------------------------------------------------------
mpirun has exited due to process rank 20 with PID 28731 on node a264 exiting
improperly. There are two reasons this could occur:

1. this process did not call "init" before exiting, but others in
the job did. This can cause a job to hang indefinitely while it waits
for all processes to call "init". By rule, if one process calls "init",
then ALL processes must call "init" prior to termination.

2. this process called "init", but exited without calling "finalize".
By rule, all processes that call "init" MUST call "finalize" prior to
exiting or it will be considered an "abnormal termination"

This may have caused other processes in the application to be
terminated by signals sent by mpirun (as reported here).
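The traceback points to a plain out-of-memory condition: each rank fails inside np.zeros while allocating its share of the wave-function array (kpt.psit_nG), and the gpaw_malloc assertion is the C-side symptom of the same shortage. Below is a minimal sketch of the usual memory-reducing knobs; the structure file, k-point mesh, and rank counts are placeholders, and the parallel-keyword syntax is for GPAW versions of roughly this era, so check your installation's documentation:

# Hypothetical sketch: spread the wave functions over more ranks and keep
# the number of bands modest so each rank's psit_nG array fits in memory.
from ase.io import read
from gpaw import GPAW

atoms = read('C6N7-C3N3.xyz')        # placeholder structure file

calc = GPAW(mode='fd',
            h=0.20,                  # coarser real-space grid => smaller psit_nG arrays
            xc='PBE',
            kpts=(4, 4, 1),          # placeholder k-point mesh
            nbands=-20,              # only 20 empty bands instead of the default
            parallel={'domain': 8,   # split the real-space grid over 8 ranks
                      'band': 2},    # and the bands into 2 groups
            txt='relax.txt')

atoms.set_calculator(calc)           # ASE 3.9-style; newer ASE: atoms.calc = calc
atoms.get_potential_energy()

Requesting more nodes (so the same grid and band count is divided among more ranks) or using mode='lcao' for an initial relaxation are the other common ways out when the allocation fails at this point.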
#3 · 2017-03-20 16:20:48
#4 · 2017-03-22 04:04:54