
Amersly


[Help] Error when calculating band structure with pwscf

The Si example that ships with QE runs fine, but the MoS2 input I built myself keeps failing with the following error:
########################################
# WARNING: there are pending errors
# PENDING ERROR (ierr=1)
# ERROR IN: iotk_open_write (iotk_files.f90:341)
# CVS Revision: 1.20
# Error opening file
unit=90000
file=/home/luyaosong/espresso/pseudo/.x
binary=F
new=F
iostat=2
# FROM IOTK LIBRARY, VERSION 1.2.0
# UNRECOVERABLE ERROR (ierr=3)
# ERROR IN: iotk_getline (iotk_scan.f90:947)
# CVS Revision: 1.23
#
iostat=5001
# ERROR IN: iotk_scan_tag (iotk_scan.f90:593)
# CVS Revision: 1.23
# ERROR IN: iotk_scan (iotk_scan.f90:821)
# CVS Revision: 1.23
# ERROR IN: iotk_scan_end (iotk_scan.f90:241)
# CVS Revision: 1.23
# foundl
########################################################################################################################
For the first error (iotk_open) I still cannot figure out where the problem is; I would appreciate any hints!
For the two errors below it, I saw other forum members say this can happen when only IFORT is installed without ICC, so I am downloading ICC now; I don't know whether that will fix it.
Any help would be much appreciated!

wode147


[Answer] Helpful reply
I have done this system too. Here is my relaxation input:
&control
    calculation = 'relax',
    prefix='MoS2',
    pseudo_dir='/home/ g/Code/espresso-5.1.1/pseudo'
    outdir = './tmp',
    nstep = 200,
    tstress = .true.,
    tprnfor = .true.,
    etot_conv_thr = 1.0D-4,
    forc_conv_thr = 1.0D-3,
/
&system   
    ibrav=  4, celldm(1) = 5.97153455363,  celldm(3)= 10,
    nat=  3, ntyp= 2,
    ecutwfc = 80.0,
    ecutrho = 600,
    occupations = 'smearing',
    smearing = 'gaussian',
    degauss = 1.d-8,
/
&electrons
    conv_thr = 1.d-10
/
&ions
    ion_dynamics = 'bfgs'
/
&cell
    cell_dynamics = 'bfgs'
/
ATOMIC_SPECIES
Mo  95.94   Mo.pbe-spn-kjpaw_psl.0.3.0.UPF
S   32.065  S.pbe-n-kjpaw_psl.0.1.UPF
ATOMIC_POSITIONS
S  0.00 0.57735026919  3.4982879746835  
Mo 0.00 0.00000000000  4.00000000000
S  0.00 0.57735026919  4.5017120253165  
K_POINTS automatic
15  15  1  0  0  0
This is my relaxation (relax) input file; take a look. My scf input is on another machine and not handy right now, but the relax and scf inputs are nearly the same, so you can compare against this one.
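For reference, a rough sketch of what the matching scf input could look like (this is my illustration, not wode147's actual file: it simply reuses the cell, cutoffs, pseudopotential names, and starting positions from the relax input above, changes calculation to 'scf', and drops the &ions and &cell blocks, which are not read for an scf run):

&control
    calculation = 'scf',
    prefix = 'MoS2',
    pseudo_dir = '/home/ g/Code/espresso-5.1.1/pseudo',
    outdir = './tmp',
/
&system
    ibrav = 4, celldm(1) = 5.97153455363, celldm(3) = 10,
    nat = 3, ntyp = 2,
    ecutwfc = 80.0,
    ecutrho = 600,
    occupations = 'smearing',
    smearing = 'gaussian',
    degauss = 1.d-8,
/
&electrons
    conv_thr = 1.d-10
/
ATOMIC_SPECIES
Mo  95.94   Mo.pbe-spn-kjpaw_psl.0.3.0.UPF
S   32.065  S.pbe-n-kjpaw_psl.0.1.UPF
ATOMIC_POSITIONS
S  0.00 0.57735026919  3.4982879746835
Mo 0.00 0.00000000000  4.00000000000
S  0.00 0.57735026919  4.5017120253165
K_POINTS automatic
15  15  1  0  0  0

After the relax run finishes, the relaxed coordinates from its output would normally replace the starting positions above.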
Floor 4, 2016-12-09 17:47:39

Amersly


Quoted reply:
Floor 6: Originally posted by souledge at 2016-12-10 00:23:45
Don't put outdir and pseudo_dir in the same directory: during the calculation the program automatically copies the pseudopotential files from pseudo_dir into outdir, and if the two point to the same directory the original pseudopotentials are deleted and left as empty files. ...

My earlier build was probably incomplete; I had only been using the compilers that QE picked up on its own.
After installing parallel_studio_xe_2015 and OpenMPI and rerunning, I now get a new error:
It looks like opal_init failed for some reason; your parallel process is
likely to abort.  There are many reasons that a parallel process can
fail during opal_init; some of which are due to configuration or
environment problems.  This failure appears to be an internal failure;
here's some additional information (which may only be relevant to an
Open MPI developer):

  opal_shmem_base_select failed
  --> Returned value -1 instead of OPAL_SUCCESS
--------------------------------------------------------------------------
--------------------------------------------------------------------------
It looks like orte_init failed for some reason; your parallel process is
likely to abort.  There are many reasons that a parallel process can
fail during orte_init; some of which are due to configuration or
environment problems.  This failure appears to be an internal failure;
here's some additional information (which may only be relevant to an
Open MPI developer):

  opal_init failed
  --> Returned value Error (-1) instead of ORTE_SUCCESS
--------------------------------------------------------------------------
--------------------------------------------------------------------------
It looks like orte_init failed for some reason; your parallel process is
likely to abort.  There are many reasons that a parallel process can
fail during orte_init; some of which are due to configuration or
environment problems.  This failure appears to be an internal failure;
here's some additional information (which may only be relevant to an
Open MPI developer):

  opal_init failed
  --> Returned value Error (-1) instead of ORTE_SUCCESS
--------------------------------------------------------------------------
--------------------------------------------------------------------------
It looks like orte_init failed for some reason; your parallel process is
likely to abort.  There are many reasons that a parallel process can
fail during orte_init; some of which are due to configuration or
environment problems.  This failure appears to be an internal failure;
here's some additional information (which may only be relevant to an
Open MPI developer):

  opal_init failed
  --> Returned value Error (-1) instead of ORTE_SUCCESS
--------------------------------------------------------------------------
--------------------------------------------------------------------------
It looks like orte_init failed for some reason; your parallel process is
likely to abort.  There are many reasons that a parallel process can
fail during orte_init; some of which are due to configuration or
environment problems.  This failure appears to be an internal failure;
here's some additional information (which may only be relevant to an
Open MPI developer):

  opal_init failed
  --> Returned value Error (-1) instead of ORTE_SUCCESS
--------------------------------------------------------------------------
--------------------------------------------------------------------------
It looks like MPI_INIT failed for some reason; your parallel process is
likely to abort.  There are many reasons that a parallel process can
fail during MPI_INIT; some of which are due to configuration or environment
problems.  This failure appears to be an internal failure; here's some
additional information (which may only be relevant to an Open MPI
developer):

  ompi_mpi_init: ompi_rte_init failed
  --> Returned "Error" (-1) instead of "Success" (0)
--------------------------------------------------------------------------
--------------------------------------------------------------------------
It looks like MPI_INIT failed for some reason; your parallel process is
likely to abort.  There are many reasons that a parallel process can
fail during MPI_INIT; some of which are due to configuration or environment
problems.  This failure appears to be an internal failure; here's some
additional information (which may only be relevant to an Open MPI
developer):

  ompi_mpi_init: ompi_rte_init failed
  --> Returned "Error" (-1) instead of "Success" (0)
--------------------------------------------------------------------------
--------------------------------------------------------------------------
It looks like MPI_INIT failed for some reason; your parallel process is
likely to abort.  There are many reasons that a parallel process can
fail during MPI_INIT; some of which are due to configuration or environment
problems.  This failure appears to be an internal failure; here's some
additional information (which may only be relevant to an Open MPI
developer):

  ompi_mpi_init: ompi_rte_init failed
  --> Returned "Error" (-1) instead of "Success" (0)
--------------------------------------------------------------------------
*** An error occurred in MPI_Init
--------------------------------------------------------------------------
It looks like MPI_INIT failed for some reason; your parallel process is
likely to abort.  There are many reasons that a parallel process can
fail during MPI_INIT; some of which are due to configuration or environment
problems.  This failure appears to be an internal failure; here's some
additional information (which may only be relevant to an Open MPI
developer):

  ompi_mpi_init: ompi_rte_init failed
  --> Returned "Error" (-1) instead of "Success" (0)
--------------------------------------------------------------------------
*** An error occurred in MPI_Init
*** An error occurred in MPI_Init
*** on a NULL communicator
*** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
***    and potentially your MPI job)
*** on a NULL communicator
*** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
***    and potentially your MPI job)
*** on a NULL communicator
*** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
***    and potentially your MPI job)
*** An error occurred in MPI_Init
*** on a NULL communicator
*** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
***    and potentially your MPI job)
[Amer:9203] Local abort before MPI_INIT completed successfully; not able to aggregate error messages, and not able to guarantee that all other processes were killed!
[Amer:9204] Local abort before MPI_INIT completed successfully; not able to aggregate error messages, and not able to guarantee that all other processes were killed!
[Amer:9201] Local abort before MPI_INIT completed successfully; not able to aggregate error messages, and not able to guarantee that all other processes were killed!
[Amer:9202] Local abort before MPI_INIT completed successfully; not able to aggregate error messages, and not able to guarantee that all other processes were killed!
Floor 7, 2016-12-10 14:18:51

Regular replies

souledge


Quoted reply:
# Error opening file
unit=90000
file=/home/luyaosong/espresso/pseudo/.x
binary=F
new=F
iostat=2

Are you sure your input file is correct?
Floor 2, 2016-12-08 15:24:30

Amersly


Quoted reply:
Floor 2: Originally posted by souledge at 2016-12-08 15:24:30
Are you sure your input file is correct?

My input file is as follows:
&CONTROL
    calculation  = 'scf' ,
    restart_mode = 'from_scratch',
    outdir       = '/home/luyaosong/QE/qe-6.0/bin/mos2/' ,
    pseudo_dir   = '/home/luyaosong/QE/qe-6.0/bin/mos2/' ,
    prefix       = 'mos2' ,
/
&SYSTEM
    ibrav            = 4,
    celldm(1)        = 5.9716,
    celldm(3)        = 12,
    nat              = 3,
    ntyp             = 2,
    ecutwfc          = 50,
    ecutrho          = 410,
    nbnd             = 8,
    exxdiv_treatment = 'gygi-baldereschi' ,
/
&ELECTRONS
    conv_thr = 1.0d-10 ,
/
ATOMIC_SPECIES
Mo  95.94    Mo.pz-spn-rrkjus_psl.0.2.UPF
S   32.066   S.pz-n-rrkjus_psl.0.1.UPF
ATOMIC_POSITIONS (alat)
S        0.500000000   0.288675130   1.974192764
Mo       0.000000000   0.577350270   2.462038339
S        0.000000000  -0.577350270   2.950837559
K_POINTS automatic
16 16 1 0 0 0

Any advice would be appreciated!
Floor 3, 2016-12-09 16:03:46

Amersly


Quoted reply:
Floor 4: Originally posted by wode147 at 2016-12-09 17:47:39
I have done this system too. &control
    calculation = 'relax',
    prefix='MoS2',
    pseudo_dir='/home/ g/Code/espresso-5.1.1/pseudo'
    outdir = './tmp',
    nstep = 200,
    tstress = .true.,
    ...

OK! Thanks a lot, I'll give it a try.
Floor 5, 2016-12-09 18:12:30

souledge


[Answer] Helpful reply
Quoted reply:
Floor 3: Originally posted by Amersly at 2016-12-09 16:03:46
My input file is as follows:
&CONTROL
                 calculation = 'scf' ,
                 restart_mode  = 'from_scratch',
                      outdir = '/home/luyaosong/QE/qe-6.0/bin/mos2/' ,
      ...

Don't put outdir and pseudo_dir in the same directory: during the calculation the program automatically copies the pseudopotential files from pseudo_dir into outdir, and if the two point to the same directory the original pseudopotential files are deleted and replaced by empty files.
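As a minimal sketch of what that means in practice (the pseudo_dir below is the directory that already appears in the error message; the outdir value is just a placeholder, not from the original post), the &CONTROL namelist with the two directories separated would look like:

&CONTROL
    calculation  = 'scf' ,
    restart_mode = 'from_scratch',
    prefix       = 'mos2' ,
    pseudo_dir   = '/home/luyaosong/espresso/pseudo/' ,   ! only the .UPF files live here
    outdir       = '/home/luyaosong/tmp/mos2/' ,          ! placeholder scratch directory
/

With this layout pw.x reads the .UPF files from pseudo_dir and writes its wavefunctions, restart data, and the copied pseudopotentials under outdir (in mos2.save), so the files in pseudo_dir are never overwritten.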
Floor 6, 2016-12-10 00:23:45

Amersly


Quoted reply:
Floor 6: Originally posted by souledge at 2016-12-10 00:23:45
Don't put outdir and pseudo_dir in the same directory: during the calculation the program automatically copies the pseudopotential files from pseudo_dir into outdir, and if the two point to the same directory the original pseudopotentials are deleted and left as empty files. ...

I reinstalled OpenMPI; the error from just now is gone, but the parallel build still fails with the original error:
#############################################################################################################################
# WARNING: there are pending errors
# PENDING ERROR (ierr=1)
# ERROR IN: iotk_open_write (iotk_files.f90:341)
# CVS Revision: 1.20
# Error opening file
unit=90000
file=/home/luyaosong/espresso/pseudo/.x
binary=F
new=F
iostat=2
# FROM IOTK LIBRARY, VERSION 1.2.0
# UNRECOVERABLE ERROR (ierr=3)
# ERROR IN: iotk_getline (iotk_scan.f90:947)
# CVS Revision: 1.23
#
iostat=5001
# ERROR IN: iotk_scan_tag (iotk_scan.f90:593)
# CVS Revision: 1.23
# ERROR IN: iotk_scan (iotk_scan.f90:821)
# CVS Revision: 1.23
# ERROR IN: iotk_scan_end (iotk_scan.f90:241)
# CVS Revision: 1.23
# foundl
########################################################################################################################
########################################################################################################################
# WARNING: there are pending errors
# PENDING ERROR (ierr=1)
# ERROR IN: iotk_open_write (iotk_files.f90:341)
# CVS Revision: 1.20
# Error opening file
unit=90000
file=/home/luyaosong/espresso/pseudo/.x
binary=F
new=F
iostat=2
# FROM IOTK LIBRARY, VERSION 1.2.0
# UNRECOVERABLE ERROR (ierr=3)
# ERROR IN: i###################################################################################################################
# WARNING: there are pending errors
# PENDING ERROR (ierr=1)
# ERROR IN: iotk_open_write (iotk_files.f90:341)
# CVS Revision: 1.20
# Error opening file
unit=90000
file=/home/luyaosong/espresso/pseudo/.x
binary=F
new=F
iostat=2
# FROM IOTK LIBRARY, VERSION 1.2.0
# UNRECOVERABLE ERROR (ierr=3)
# ERROR IN: iotk_getline (iotk_scan.f90:947)
# CVS Revision: 1.23
#
iostat=5001
# ERROR IN: iotk_scan_tag (iotk_scan.f90tk_getline (iotk_scan.f90:947)
# CVS Revision: 1.23
#
iostat=5001
# ERROR IN: iotk_scan_tag (iotk_scan.f90:593)
# CVS Revision: 1.23
# ERROR IN: iotk_scan (iotk_scan.f90:821)
# CVS Revision: 1.23
# ERROR IN: iotk_scan_end (iotk_scan.f90:241)
# CVS Revision: 1.23
# foundl
Could it be that my QE installation is incomplete?
Floor 8, 2016-12-10 15:03:49

souledge


Quoted reply:
Floor 8: Originally posted by Amersly at 2016-12-10 15:03:49
I reinstalled OpenMPI; the error from just now is gone, but the parallel build still fails with the original error:
############################################################################################# ...

Please describe the OpenMPI installation in detail, including the install directory: the version number, the configure command, the exact make and make install commands and options, and any post-install setup (for example, whether any directories were added to environment variables or to ~/.bashrc).
Alternatively, consider switching to MPICH2, which is a bit simpler to configure, and then recompile QE.
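For illustration, a build along those lines might look roughly like this (the MPICH2 version, the install prefix, and the QE source path are placeholders; the QE path is only guessed from the outdir used earlier in this thread):

# build and install MPICH2 into its own prefix with the Intel compilers
tar zxvf mpich2-1.5.tar.gz
cd mpich2-1.5
./configure --prefix=/opt/mpich2-1.5 CC=icc CXX=icpc F77=ifort FC=ifort
make
make install

# put the new MPI wrappers first on PATH
export PATH=/opt/mpich2-1.5/bin:$PATH
export LD_LIBRARY_PATH=/opt/mpich2-1.5/lib:$LD_LIBRARY_PATH

# reconfigure and rebuild QE so that pw.x is linked against this MPI
cd /home/luyaosong/QE/qe-6.0
make veryclean
./configure
make pw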
Floor 9, 2016-12-10 15:16:55

Amersly


Quoted reply:
Floor 9: Originally posted by souledge at 2016-12-10 15:16:55
Please describe the OpenMPI installation in detail, including the install directory: the version number, the configure command, the exact make and make install commands and options, and any post-install setup (for example, whether any directories were added to environment variables or to ~/.bashrc).
Alternatively, consider ...

My OpenMPI installation:
1. Downloaded the tarball into the download directory, then extracted it under ~/ into the openmpi-1.10.2 folder: tar zxvf openmpi-1.10.2.tar.gz
2. Entered the source directory:
      cd openmpi-1.10.2
3. Ran configure; --prefix sets the install directory, and CC/CXX/FC/F77 select the Intel compilers as the interface:
      ./configure --prefix=/opt/openmpi-165 CC=icc CXX=icpc FC=ifort F77=ifort
4. Compiled; -j followed by a number enables multi-core compilation and shortens the build:
       make -j4
5. After compiling, installed into the chosen directory:
       make install
6. Set the environment variables.
At the end of ~/.bashrc I added:
#setting for intel compiler and mpi
source /opt/intel/bin/compilervars.sh intel64
#source /opt/intel/impi_5.0.1/bin64/mpivars.sh
source /opt/intel/composer_xe_2015.0.090/mkl/bin/mklvars.sh intel64 ilp64

#setting for openmpi
export MPI_HOME=/opt/openmpi-1.10.2
export PATH=$MPI_HOME/bin:$PATH
export LD_LIBRARY_PATH=$MPI_HOME/lib:$LD_LIBRARY_PATH

Before installing OpenMPI I had set up parallel_studio_xe_2015, which bundles Intel MPI (impi), so I had also set its environment variables; when using OpenMPI I commented out the impi line.

Is MPICH2 really simpler to configure? If it is easy to get working I will give it a try.
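One extra check that might be worth doing with a setup like this (my suggestion, not from the thread): the configure prefix above (/opt/openmpi-165) does not match MPI_HOME (/opt/openmpi-1.10.2), and Intel MPI is installed as well, so it is easy to end up running pw.x against a different MPI than the one it was compiled with. A few quick commands can show which MPI is actually being used (the pw.x path is only assumed from the earlier posts):

# which MPI wrappers come first on PATH, and which MPI they belong to
which mpirun mpif90
mpirun --version
mpif90 --showme            # Open MPI wrappers print the underlying compile/link line

# which MPI library the QE executable is actually linked against
ldd /home/luyaosong/QE/qe-6.0/bin/pw.x | grep -i mpi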
Floor 10, 2016-12-10 15:37:42