Views: 3727 | Replies: 7
gjh123 (金虫 / 正式写手)
[Help] VASP parallel run error
My VASP build, compiled for parallel execution with OpenMPI, fails with the error below every time I run it. Could any VASP expert help me figure this out?

Error: mpirun noticed that process rank 1 with PID 3716 on node localhost.localdomain exited on signal 11 (Segmentation fault).

I first assumed the machine was short of memory, so I bought another 2 GB stick (4 GB total). While running, VASP shows only about 60% memory usage, yet the same error still appears and the run cannot continue. Very frustrating! Is there any way to fix this?
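Since rank 1 segfaults immediately while plenty of RAM is free, the usual suspect for a Fortran code like VASP is a stack overflow from large automatic work arrays, not a shortage of physical memory. A minimal sketch of a common first check before recompiling (the -np value is only an example; ulimit must be raised in the same shell that launches mpirun, which covers all ranks on this single node):

    # Raise the per-process stack limit; the default (often 8 MB) is easily
    # exceeded by VASP's automatic work arrays and then shows up as signal 11.
    ulimit -s unlimited
    mpirun -np 4 ./vasp    # example rank count; the failing run had at least 2 ranks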
Floor 2 | 2011-09-07 16:28:10
Floor 3 (gjh123) | 2011-09-07 16:39:02
Floor 4 (后天一) | 2011-09-07 16:39:57
Thank you very much! VASP recompiled with the assignment FFLAGS= -heap-arrays 64 now runs successfully. I also just found this fix in this thread: http://muchong.com/bbs/viewthread.php?tid=2493388&fpage=1. Thanks again for everyone's help.
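For later readers: with Intel Fortran, the -heap-arrays n option places automatic and temporary arrays larger than n KB (and those whose size is unknown at compile time) on the heap instead of the stack, which removes exactly the stack-overflow segfaults described above. A minimal sketch of where the setting goes, assuming an ifort + OpenMPI build (the compiler wrapper name and any other flags depend on your installation):

    # Hypothetical excerpt from the VASP makefile
    FC     = mpif90              # MPI wrapper around ifort (installation-dependent)
    FFLAGS = -heap-arrays 64     # automatic/temporary arrays > 64 KB go on the heap

After editing the makefile, rebuild from scratch (make clean, then make) so every object file is compiled with the new flag.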
Floor 5 (gjh123) | 2011-09-07 17:36:40
After building a single-point CLM4.5 case, I submit it with the yhbatch command, but the cesm.log file always ends with the error output below. Has anyone run into a similar problem, and how can it be solved? Many thanks!

    (seq_comm_setcomm) initialize ID ( 1 GLOBAL ) pelist = 0 0 1 ( npes = 1) ( nthreads = 1)
    (seq_comm_setcomm) initialize ID ( 2 CPL ) pelist = 0 0 1 ( npes = 1) ( nthreads = 1)
    (seq_comm_setcomm) initialize ID ( 17 ATM ) pelist = 0 0 1 ( npes = 1) ( nthreads = 1)
    (seq_comm_joincomm) initialize ID ( 18 CPLATM ) join IDs = 2 17 ( npes = 1) ( nthreads = 1)
    (seq_comm_jcommarr) initialize ID ( 3 ALLATMID ) join multiple comp IDs ( npes = 1) ( nthreads = 1)
    (seq_comm_joincomm) initialize ID ( 10 CPLALLATMID ) join IDs = 2 3 ( npes = 1) ( nthreads = 1)
    (seq_comm_setcomm) initialize ID ( 19 LND ) pelist = 0 0 1 ( npes = 1) ( nthreads = 1)
    (seq_comm_joincomm) initialize ID ( 20 CPLLND ) join IDs = 2 19 ( npes = 1) ( nthreads = 1)
    (seq_comm_jcommarr) initialize ID ( 4 ALLLNDID ) join multiple comp IDs ( npes = 1) ( nthreads = 1)
    (seq_comm_joincomm) initialize ID ( 11 CPLALLLNDID ) join IDs = 2 4 ( npes = 1) ( nthreads = 1)
    (seq_comm_setcomm) initialize ID ( 21 OCN ) pelist = 0 0 1 ( npes = 1) ( nthreads = 1)
    (seq_comm_joincomm) initialize ID ( 22 CPLOCN ) join IDs = 2 21 ( npes = 1) ( nthreads = 1)
    (seq_comm_jcommarr) initialize ID ( 5 ALLOCNID ) join multiple comp IDs ( npes = 1) ( nthreads = 1)
    (seq_comm_joincomm) initialize ID ( 12 CPLALLOCNID ) join IDs = 2 5 ( npes = 1) ( nthreads = 1)
    (seq_comm_setcomm) initialize ID ( 23 ICE ) pelist = 0 0 1 ( npes = 1) ( nthreads = 1)
    (seq_comm_joincomm) initialize ID ( 24 CPLICE ) join IDs = 2 23 ( npes = 1) ( nthreads = 1)
    (seq_comm_jcommarr) initialize ID ( 6 ALLICEID ) join multiple comp IDs ( npes = 1) ( nthreads = 1)
    (seq_comm_joincomm) initialize ID ( 13 CPLALLICEID ) join IDs = 2 6 ( npes = 1) ( nthreads = 1)
    (seq_comm_setcomm) initialize ID ( 25 GLC ) pelist = 0 0 1 ( npes = 1) ( nthreads = 1)
    (seq_comm_joincomm) initialize ID ( 26 CPLGLC ) join IDs = 2 25 ( npes = 1) ( nthreads = 1)
    (seq_comm_jcommarr) initialize ID ( 7 ALLGLCID ) join multiple comp IDs ( npes = 1) ( nthreads = 1)
    (seq_comm_joincomm) initialize ID ( 14 CPLALLGLCID ) join IDs = 2 7 ( npes = 1) ( nthreads = 1)
    (seq_comm_setcomm) initialize ID ( 27 ROF ) pelist = 0 0 1 ( npes = 1) ( nthreads = 1)
    (seq_comm_joincomm) initialize ID ( 28 CPLROF ) join IDs = 2 27 ( npes = 1) ( nthreads = 1)
    (seq_comm_jcommarr) initialize ID ( 8 ALLROFID ) join multiple comp IDs ( npes = 1) ( nthreads = 1)
    (seq_comm_joincomm) initialize ID ( 15 CPLALLROFID ) join IDs = 2 8 ( npes = 1) ( nthreads = 1)
    (seq_comm_setcomm) initialize ID ( 29 WAV ) pelist = 0 0 1 ( npes = 1) ( nthreads = 1)
    (seq_comm_joincomm) initialize ID ( 30 CPLWAV ) join IDs = 2 29 ( npes = 1) ( nthreads = 1)
    (seq_comm_jcommarr) initialize ID ( 9 ALLWAVID ) join multiple comp IDs ( npes = 1) ( nthreads = 1)
    (seq_comm_joincomm) initialize ID ( 16 CPLALLWAVID ) join IDs = 2 9 ( npes = 1) ( nthreads = 1)
    (seq_comm_printcomms) 1 0 1 1 GLOBAL:
    (seq_comm_printcomms) 2 0 1 1 CPL:
    (seq_comm_printcomms) 3 0 1 1 ALLATMID:
    (seq_comm_printcomms) 4 0 1 1 ALLLNDID:
    (seq_comm_printcomms) 5 0 1 1 ALLOCNID:
    (seq_comm_printcomms) 6 0 1 1 ALLICEID:
    (seq_comm_printcomms) 7 0 1 1 ALLGLCID:
    (seq_comm_printcomms) 8 0 1 1 ALLROFID:
    (seq_comm_printcomms) 9 0 1 1 ALLWAVID:
    (seq_comm_printcomms) 10 0 1 1 CPLALLATMID:
    (seq_comm_printcomms) 11 0 1 1 CPLALLLNDID:
    (seq_comm_printcomms) 12 0 1 1 CPLALLOCNID:
    (seq_comm_printcomms) 13 0 1 1 CPLALLICEID:
    (seq_comm_printcomms) 14 0 1 1 CPLALLGLCID:
    (seq_comm_printcomms) 15 0 1 1 CPLALLROFID:
    (seq_comm_printcomms) 16 0 1 1 CPLALLWAVID:
    (seq_comm_printcomms) 17 0 1 1 ATM:
    (seq_comm_printcomms) 18 0 1 1 CPLATM:
    (seq_comm_printcomms) 19 0 1 1 LND:
    (seq_comm_printcomms) 20 0 1 1 CPLLND:
    (seq_comm_printcomms) 21 0 1 1 OCN:
    (seq_comm_printcomms) 22 0 1 1 CPLOCN:
    (seq_comm_printcomms) 23 0 1 1 ICE:
    (seq_comm_printcomms) 24 0 1 1 CPLICE:
    (seq_comm_printcomms) 25 0 1 1 GLC:
    (seq_comm_printcomms) 26 0 1 1 CPLGLC:
    (seq_comm_printcomms) 27 0 1 1 ROF:
    (seq_comm_printcomms) 28 0 1 1 CPLROF:
    (seq_comm_printcomms) 29 0 1 1 WAV:
    (seq_comm_printcomms) 30 0 1 1 CPLWAV:
    (t_initf) Read in prof_inparm namelist from: drv_in
    seq_flds_mod: read seq_cplflds_inparm namelist from: drv_in
    seq_flds_mod: read seq_cplflds_userspec namelist from: drv_in
    seq_flds_mod: seq_flds_a2x_states= Sa_z:Sa_u:Sa_v:Sa_tbot:Sa_ptem:Sa_shum:Sa_pbot:Sa_dens:Sa_pslv
    seq_flds_mod: seq_flds_a2x_fluxes= Faxa_rainc:Faxa_rainl:Faxa_snowc:Faxa_snowl:Faxa_lwdn:Faxa_swndr:Faxa_swvdr:Faxa_swndf:Faxa_swvdf:Faxa_swnet:Faxa_bcphidry:Faxa_bcphodry:Faxa_bcphiwet:Faxa_ocphidry:Faxa_ocphodry:Faxa_ocphiwet:Faxa_dstwet1:Faxa_dstwet2:Faxa_dstwet3:Faxa_dstwet4:Faxa_dstdry1:Faxa_dstdry2:Faxa_dstdry3:Faxa_dstdry4
    seq_flds_mod: seq_flds_x2a_states= Sf_lfrac:Sf_ifrac:Sf_ofrac:Sx_avsdr:Sx_anidr:Sx_avsdf:Sx_anidf:Sx_tref:Sx_qref:So_t:Sx_t:Sl_fv:Sl_ram1:Sl_snowh:Si_snowh:So_ssq:So_re:Sx_u10:So_ustar
    seq_flds_mod: seq_flds_x2a_fluxes= Faxx_taux:Faxx_tauy:Faxx_lat:Faxx_sen:Faxx_lwup:Faxx_evap:Fall_flxdst1:Fall_flxdst2:Fall_flxdst3:Fall_flxdst4:Fall_voc001:Fall_voc002:Fall_voc003:Fall_voc004:Fall_voc005:Fall_voc006:Fall_voc007:Fall_voc008
    seq_flds_mod: seq_flds_l2x_states= Sl_avsdr:Sl_anidr:Sl_avsdf:Sl_anidf:Sl_tref:Sl_qref:Sl_t:Sl_fv:Sl_ram1:Sl_snowh:Sl_u10
    seq_flds_mod: seq_flds_l2x_fluxes= Fall_swnet:Fall_taux:Fall_tauy:Fall_lat:Fall_sen:Fall_lwup:Fall_evap:Fall_flxdst1:Fall_flxdst2:Fall_flxdst3:Fall_flxdst4:Flrl_rofliq:Flrl_rofice:Fall_voc001:Fall_voc002:Fall_voc003:Fall_voc004:Fall_voc005:Fall_voc006:Fall_voc007:Fall_voc008
    seq_flds_mod: seq_flds_x2l_states= Sa_z:Sa_u:Sa_v:Sa_tbot:Sa_ptem:Sa_shum:Sa_pbot:Slrr_volr
    seq_flds_mod: seq_flds_x2l_fluxes= Faxa_rainc:Faxa_rainl:Faxa_snowc:Faxa_snowl:Faxa_lwdn:Faxa_swndr:Faxa_swvdr:Faxa_swndf:Faxa_swvdf:Faxa_bcphidry:Faxa_bcphodry:Faxa_bcphiwet:Faxa_ocphidry:Faxa_ocphodry:Faxa_ocphiwet:Faxa_dstwet1:Faxa_dstwet2:Faxa_dstwet3:Faxa_dstwet4:Faxa_dstdry1:Faxa_dstdry2:Faxa_dstdry3:Faxa_dstdry4:Flrr_flood
    seq_flds_mod: seq_flds_i2x_states= Si_avsdr:Si_anidr:Si_avsdf:Si_anidf:Si_tref:Si_qref:Si_t:Si_snowh:Si_u10:Si_ifrac
    seq_flds_mod: seq_flds_i2x_fluxes= Faii_swnet:Fioi_swpen:Faii_taux:Fioi_taux:Faii_tauy:Fioi_tauy:Faii_lat:Faii_sen:Faii_lwup:Faii_evap:Fioi_melth:Fioi_meltw:Fioi_salt
    seq_flds_mod: seq_flds_x2i_states= Sa_z:Sa_u:Sa_v:Sa_tbot:Sa_ptem:Sa_shum:Sa_pbot:Sa_dens:So_t:So_s:So_u:So_v:So_dhdx:So_dhdy
    seq_flds_mod: seq_flds_x2i_fluxes= Faxa_rain:Faxa_snow:Faxa_lwdn:Faxa_swndr:Faxa_swvdr:Faxa_swndf:Faxa_swvdf:Faxa_bcphidry:Faxa_bcphodry:Faxa_bcphiwet:Faxa_ocphidry:Faxa_ocphodry:Faxa_ocphiwet:Faxa_dstwet1:Faxa_dstwet2:Faxa_dstwet3:Faxa_dstwet4:Faxa_dstdry1:Faxa_dstdry2:Faxa_dstdry3:Faxa_dstdry4:Fioo_q
    seq_flds_mod: seq_flds_o2x_states= So_t:So_s:So_u:So_v:So_dhdx:So_dhdy:So_bldepth
    seq_flds_mod: seq_flds_o2x_fluxes= Fioo_q
    seq_flds_mod: seq_flds_x2o_states= Sa_pslv:So_duu10n:Si_ifrac:Sw_lamult:Sw_ustokes:Sw_vstokes:Sw_hstokes
    seq_flds_mod: seq_flds_x2o_fluxes= Faxa_rain:Faxa_snow:Faxa_prec:Faxa_lwdn:Foxx_swnet:Faxa_bcphidry:Faxa_bcphodry:Faxa_bcphiwet:Faxa_ocphidry:Faxa_ocphodry:Faxa_ocphiwet:Faxa_dstwet1:Faxa_dstwet2:Faxa_dstwet3:Faxa_dstwet4:Faxa_dstdry1:Faxa_dstdry2:Faxa_dstdry3:Faxa_dstdry4:Foxx_taux:Foxx_tauy:Foxx_lat:Foxx_sen:Foxx_lwup:Foxx_evap:Fioi_melth:Fioi_meltw:Fioi_salt:Forr_roff:Forr_ioff
    seq_flds_mod: seq_flds_s2x_states=
    seq_flds_mod: seq_flds_s2x_fluxes=
    seq_flds_mod: seq_flds_x2s_states=
    seq_flds_mod: seq_flds_x2s_fluxes=
    seq_flds_mod: seq_flds_g2x_states=
    seq_flds_mod: seq_flds_g2x_fluxes=
    seq_flds_mod: seq_flds_x2g_states=
    seq_flds_mod: seq_flds_x2g_fluxes=
    seq_flds_mod: seq_flds_xao_states= So_tref:So_qref:So_ssq:So_re:So_u10:So_duu10n:So_ustar
    seq_flds_mod: seq_flds_xao_albedo= So_avsdr:So_anidr:So_avsdf:So_anidf
    seq_flds_mod: seq_flds_r2x_states= Slrr_volr
    seq_flds_mod: seq_flds_r2x_fluxes= Forr_roff:Forr_ioff:Flrr_flood
    seq_flds_mod: seq_flds_x2r_states=
    seq_flds_mod: seq_flds_x2r_fluxes= Flrl_rofliq:Flrl_rofice
    seq_flds_mod: seq_flds_w2x_states= Sw_lamult:Sw_ustokes:Sw_vstokes:Sw_hstokes
    seq_flds_mod: seq_flds_w2x_fluxes=
    seq_flds_mod: seq_flds_x2w_states= Sa_u:Sa_v:Sa_tbot:Si_ifrac:So_t:So_u:So_v:So_bldepth
    seq_flds_mod: seq_flds_x2w_fluxes=
    1 pes participating in computation for CLM
    -----------------------------------
    NODE# NAME
    ( 0) cn4031
    application called MPI_Abort(comm=0x84000006, 1) - process 0
    yhrun: error: cn4031: task 0: Exited with exit code 1
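Note that every component in this log is initialized with npes = 1 and nthreads = 1, and the abort happens right after the drv_in field lists are read, before CLM takes a time step. That pattern usually points to a runtime or input problem rather than the MPI launch itself, and the concrete error message often lands in the component logs (e.g. lnd.log.*) in the run directory rather than in cesm.log. For reference, a minimal single-task submission sketch, assuming yhbatch/yhrun on this Tianhe-style system accept the usual Slurm-like options (the partition name and paths are placeholders):

    #!/bin/bash
    # run_clm.sh: hypothetical single-task CESM/CLM launch, matching npes = 1 above.
    # Submit with:  yhbatch -N 1 -n 1 -p work run_clm.sh   ("work" is a placeholder partition)
    cd /path/to/case/run                     # hypothetical case run directory
    yhrun -n 1 ./cesm.exe >> cesm.log 2>&1   # one MPI task; stdout/stderr appended to cesm.log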
Floor 6 | 2014-12-02 17:01:05
Floor 7 | 2018-06-21 17:55:38
Floor 8 | 2021-07-24 23:04:52