Views: 4634 | Replies: 18
[Help] Error in pwscf band-structure calculation
The Si example that ships with QE runs fine, but my own MoS2 input keeps aborting with the following error:

########################################
# WARNING: there are pending errors
# PENDING ERROR (ierr=1)
# ERROR IN: iotk_open_write (iotk_files.f90:341)
# CVS Revision: 1.20
# Error opening file unit=90000 file=/home/luyaosong/espresso/pseudo/.x binary=F new=F iostat=2
# FROM IOTK LIBRARY, VERSION 1.2.0
# UNRECOVERABLE ERROR (ierr=3)
# ERROR IN: iotk_getline (iotk_scan.f90:947)
# CVS Revision: 1.23
# iostat=5001
# ERROR IN: iotk_scan_tag (iotk_scan.f90:593)
# CVS Revision: 1.23
# ERROR IN: iotk_scan (iotk_scan.f90:821)
# CVS Revision: 1.23
# ERROR IN: iotk_scan_end (iotk_scan.f90:241)
# CVS Revision: 1.23
# foundl
########################################

For the first error (iotk_open_write) I cannot figure out where the problem lies; a hint would be much appreciated! As for the two errors below it, other forum members have suggested they occur when only IFORT is installed without ICC, so I am downloading ICC now; I don't know whether that will fix it. Any help would be greatly appreciated!!
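A note on the log: iostat=2 from a Fortran OPEN often corresponds to "No such file or directory" (the exact mapping is compiler-dependent), and the failing directory /home/luyaosong/espresso/pseudo/ looks like QE's fallback pseudo_dir ($ESPRESSO_PSEUDO, or $HOME/espresso/pseudo/ when unset) rather than the pseudo_dir given in the input posted below. That pattern usually hints at a path problem, not a compiler problem. A minimal shell sketch for checking, using paths taken from the log and the input:

ls -ld /home/luyaosong/espresso/pseudo     # directory named in the error message
ls /home/luyaosong/espresso/pseudo/*.UPF   # any pseudopotential files actually there?
echo $ESPRESSO_PSEUDO                      # overrides the default pseudo_dir if set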
The input file is as follows:

&CONTROL
    calculation  = 'scf' ,
    restart_mode = 'from_scratch' ,
    outdir       = '/home/luyaosong/QE/qe-6.0/bin/mos2/' ,
    pseudo_dir   = '/home/luyaosong/QE/qe-6.0/bin/mos2/' ,
    prefix       = 'mos2' ,
/
&SYSTEM
    ibrav     = 4,
    celldm(1) = 5.9716,
    celldm(3) = 12,
    nat       = 3,
    ntyp      = 2,
    ecutwfc   = 50,
    ecutrho   = 410,
    nbnd      = 8,
    exxdiv_treatment = 'gygi-baldereschi' ,
/
&ELECTRONS
    conv_thr = 1.0d-10 ,
/
ATOMIC_SPECIES
Mo  95.94   Mo.pz-spn-rrkjus_psl.0.2.UPF
S   32.066  S.pz-n-rrkjus_psl.0.1.UPF
ATOMIC_POSITIONS (alat)
S    0.500000000   0.288675130   1.974192764
Mo   0.000000000   0.577350270   2.462038339
S    0.000000000  -0.577350270   2.950837559
K_POINTS automatic
16 16 1 0 0 0

Any pointers would be appreciated!
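For reference, a typical way to launch an input like this; the file name mos2.scf.in and the process count are illustrative, not from the original post:

mpirun -np 4 pw.x -in mos2.scf.in > mos2.scf.out
pw.x -in mos2.scf.in > mos2.serial.out   # a serial run helps separate input/path errors from MPI errors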
Floor 3, 2016-12-09 16:03:46
My earlier build may have been incomplete; I had simply used whatever compilers QE picked up on its own. After reinstalling parallel_studio_xe_2015 and OpenMPI, the run now fails with a new error:

--------------------------------------------------------------------------
It looks like opal_init failed for some reason; your parallel process is likely to abort. There are many reasons that a parallel process can fail during opal_init; some of which are due to configuration or environment problems. This failure appears to be an internal failure; here's some additional information (which may only be relevant to an Open MPI developer):

  opal_shmem_base_select failed
  --> Returned value -1 instead of OPAL_SUCCESS
--------------------------------------------------------------------------
It looks like orte_init failed for some reason; your parallel process is likely to abort. There are many reasons that a parallel process can fail during orte_init; some of which are due to configuration or environment problems. This failure appears to be an internal failure; here's some additional information (which may only be relevant to an Open MPI developer):

  opal_init failed
  --> Returned value Error (-1) instead of ORTE_SUCCESS
--------------------------------------------------------------------------
(the orte_init block is printed once per process; four processes in this run)
--------------------------------------------------------------------------
It looks like MPI_INIT failed for some reason; your parallel process is likely to abort. There are many reasons that a parallel process can fail during MPI_INIT; some of which are due to configuration or environment problems. This failure appears to be an internal failure; here's some additional information (which may only be relevant to an Open MPI developer):

  ompi_mpi_init: ompi_rte_init failed
  --> Returned "Error" (-1) instead of "Success" (0)
--------------------------------------------------------------------------
(the MPI_INIT block is likewise printed once per process)
*** An error occurred in MPI_Init
*** on a NULL communicator
*** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
*** and potentially your MPI job)
[Amer:9201] Local abort before MPI_INIT completed successfully; not able to aggregate error messages, and not able to guarantee that all other processes were killed!
(equivalent MPI_Init and local-abort messages are printed by processes 9202, 9203, and 9204)
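An opal_init/opal_shmem_base_select failure at startup is usually an environment or mixed-installation problem (for example, mpirun from one MPI build launching binaries linked against another) rather than a QE bug. A minimal sketch for checking that everything resolves to one OpenMPI tree; it assumes pw.x is on PATH:

which mpirun                       # should live under the OpenMPI prefix you installed
mpirun --version
ldd $(which pw.x) | grep -i mpi    # MPI libraries should resolve under that same prefix
echo $LD_LIBRARY_PATH              # check for stale paths to other MPI installs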
Floor 7, 2016-12-10 14:18:51
I reinstalled OpenMPI once more; the error above is gone, but the parallel build still produces the original error:

########################################
# WARNING: there are pending errors
# PENDING ERROR (ierr=1)
# ERROR IN: iotk_open_write (iotk_files.f90:341)
# CVS Revision: 1.20
# Error opening file unit=90000 file=/home/luyaosong/espresso/pseudo/.x binary=F new=F iostat=2
# FROM IOTK LIBRARY, VERSION 1.2.0
# UNRECOVERABLE ERROR (ierr=3)
# ERROR IN: iotk_getline (iotk_scan.f90:947)
# CVS Revision: 1.23
# iostat=5001
# ERROR IN: iotk_scan_tag (iotk_scan.f90:593)
# CVS Revision: 1.23
# ERROR IN: iotk_scan (iotk_scan.f90:821)
# CVS Revision: 1.23
# ERROR IN: iotk_scan_end (iotk_scan.f90:241)
# CVS Revision: 1.23
# foundl
########################################
(the same block is printed by each MPI process, with the copies interleaved in the output)

Could it be that my QE installation is incomplete?
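Since the failure is an I/O open error rather than a numerical one, it may be worth ruling out missing directories or permissions before rebuilding again. A minimal sketch using the outdir/pseudo_dir from the input posted above (whether outdir must pre-exist depends on the QE version, so creating it is a harmless precaution):

mkdir -p /home/luyaosong/QE/qe-6.0/bin/mos2          # outdir must be writable
ls /home/luyaosong/QE/qe-6.0/bin/mos2/*.UPF          # both UPF files from ATOMIC_SPECIES should be here
touch /home/luyaosong/QE/qe-6.0/bin/mos2/.wtest && echo "outdir writable"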
Floor 8, 2016-12-10 15:03:49
OpenMPI installation:

1. Download the tarball to the download directory, then unpack it under ~/ into the openmpi-1.10.2 folder: tar zxvf openmpi-1.10.2.tar.gz
2. Enter the source directory: cd openmpi-1.10.2
3. Configure; --prefix sets the install directory, and CC/CXX/FC/F77 select the Intel compilers as the interface: ./configure --prefix=/opt/openmpi-165 CC=icc CXX=icpc FC=ifort F77=ifort
4. Build; -j followed by a number enables multi-core compilation and shortens the build: make -j4
5. Once the build finishes, install into the chosen directory: make install
6. Set the environment variables; I appended the following to the end of ~/.bashrc:

#setting for intel compiler and mpi
source /opt/intel/bin/compilervars.sh intel64
#source /opt/intel/impi_5.0.1/bin64/mpivars.sh
source /opt/intel/composer_xe_2015.0.090/mkl/bin/mklvars.sh intel64 ilp64
#setting for openmpi
export MPI_HOME=/opt/openmpi-1.10.2
export PATH=$MPI_HOME/bin:$PATH
export LD_LIBRARY_PATH=$MPI_HOME/lib:$LD_LIBRARY_PATH

I had set up parallel_studio_xe_2015 before installing OpenMPI; it bundles Intel MPI (impi), so I added its environment variables as well, but commented out the impi line when using OpenMPI.

Is MPICH2 simpler? If it is easy to get working I will give it a try.
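One thing stands out in the steps above: step 3 installs with --prefix=/opt/openmpi-165, while step 6 exports MPI_HOME=/opt/openmpi-1.10.2. If that is not just a transcription slip, PATH and LD_LIBRARY_PATH point at a different tree than the one actually installed, which would fit the opal_init/opal_shmem failures earlier in the thread. A minimal consolidated sketch with one consistent prefix (/opt/openmpi-1.10.2 is an assumption here; either path works as long as the same one is used in both places):

tar zxvf openmpi-1.10.2.tar.gz && cd openmpi-1.10.2
./configure --prefix=/opt/openmpi-1.10.2 CC=icc CXX=icpc FC=ifort F77=ifort
make -j4
sudo make install                          # installing under /opt usually needs root
export MPI_HOME=/opt/openmpi-1.10.2
export PATH=$MPI_HOME/bin:$PATH
export LD_LIBRARY_PATH=$MPI_HOME/lib:$LD_LIBRARY_PATH
which mpirun                               # should print /opt/openmpi-1.10.2/bin/mpirun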
Floor 10, 2016-12-10 15:37:42