
Views: 4634  |  Replies: 18

Amersly


[Help] pwscf band-structure calculation fails

The Si example that ships with QE runs fine, but the MoS2 job I set up myself keeps failing with the following error:
########################################
# WARNING: there are pending errors
# PENDING ERROR (ierr=1)
# ERROR IN: iotk_open_write (iotk_files.f90:341)
# CVS Revision: 1.20
# Error opening file
unit=90000
file=/home/luyaosong/espresso/pseudo/.x
binary=F
new=F
iostat=2
# FROM IOTK LIBRARY, VERSION 1.2.0
# UNRECOVERABLE ERROR (ierr=3)
# ERROR IN: iotk_getline (iotk_scan.f90:947)
# CVS Revision: 1.23
#
iostat=5001
# ERROR IN: iotk_scan_tag (iotk_scan.f90:593)
# CVS Revision: 1.23
# ERROR IN: iotk_scan (iotk_scan.f90:821)
# CVS Revision: 1.23
# ERROR IN: iotk_scan_end (iotk_scan.f90:241)
# CVS Revision: 1.23
# foundl
########################################################################################################################
For the first error, in iotk_open, I still can't figure out where the problem is; I'd be grateful for any hints!
As for the two errors below it, I saw other forum members say they come from having only IFORT installed without ICC, so I'm downloading ICC now; no idea whether that will fix it!
Any help would be hugely appreciated!!

Amersly


Quoted reply:
Post #2: Originally posted by souledge at 2016-12-08 15:24:30
Are you sure the input file is correct?

The input file is as follows:
&CONTROL
    calculation  = 'scf' ,
    restart_mode = 'from_scratch' ,
    outdir       = '/home/luyaosong/QE/qe-6.0/bin/mos2/' ,
    pseudo_dir   = '/home/luyaosong/QE/qe-6.0/bin/mos2/' ,
    prefix       = 'mos2' ,
/
&SYSTEM
    ibrav            = 4,
    celldm(1)        = 5.9716,
    celldm(3)        = 12,
    nat              = 3,
    ntyp             = 2,
    ecutwfc          = 50,
    ecutrho          = 410,
    nbnd             = 8,
    exxdiv_treatment = 'gygi-baldereschi' ,
/
&ELECTRONS
    conv_thr = 1.0d-10 ,
/
ATOMIC_SPECIES
Mo  95.94    Mo.pz-spn-rrkjus_psl.0.2.UPF
S   32.066   S.pz-n-rrkjus_psl.0.1.UPF
ATOMIC_POSITIONS (alat)
S        0.500000000   0.288675130   1.974192764
Mo       0.000000000   0.577350270   2.462038339
S        0.000000000  -0.577350270   2.950837559
K_POINTS automatic
16 16 1 0 0 0

Any pointers appreciated!
Post #3, 2016-12-09 16:03:46

Amersly


Quoted reply:
Post #4: Originally posted by wode147 at 2016-12-09 17:47:39
I've run this system too:
&control
    calculation = 'relax',
    prefix='MoS2',
    pseudo_dir='/home/ g/Code/espresso-5.1.1/pseudo'
    outdir = './tmp',
    nstep = 200,
    tstress = .true.,
    ...

OK! Thanks a lot, I'll give it a try.
Post #5, 2016-12-09 18:12:30

Amersly


Quoted reply:
Post #6: Originally posted by souledge at 2016-12-10 00:23:45
Don't point outdir and pseudo_dir at the same directory. During the calculation the program automatically copies the pseudopotential files from pseudo_dir into outdir; if the two coincide, it deletes the original pseudopotentials and leaves empty files behind. ...
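
[Editor's note: a minimal sketch of what the corrected &CONTROL block would look like; the directory names here are illustrative placeholders, not from the original post — the only point is that pseudo_dir and outdir must differ:]

&CONTROL
    calculation  = 'scf' ,
    restart_mode = 'from_scratch' ,
    prefix       = 'mos2' ,
    pseudo_dir   = '/home/luyaosong/QE/pseudo/' ,    ! pseudopotential files only
    outdir       = '/home/luyaosong/QE/tmp/mos2/' ,  ! scratch/output in a separate directory
/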

My earlier build was probably incomplete; I had been using the compilers QE picked up by default.
After reinstalling parallel_studio_xe_2015 and OpenMPI, running it now produces a new error:
It looks like opal_init failed for some reason; your parallel process is
likely to abort.  There are many reasons that a parallel process can
fail during opal_init; some of which are due to configuration or
environment problems.  This failure appears to be an internal failure;
here's some additional information (which may only be relevant to an
Open MPI developer):

  opal_shmem_base_select failed
  --> Returned value -1 instead of OPAL_SUCCESS
--------------------------------------------------------------------------
--------------------------------------------------------------------------
It looks like orte_init failed for some reason; your parallel process is
likely to abort.  There are many reasons that a parallel process can
fail during orte_init; some of which are due to configuration or
environment problems.  This failure appears to be an internal failure;
here's some additional information (which may only be relevant to an
Open MPI developer):

  opal_init failed
  --> Returned value Error (-1) instead of ORTE_SUCCESS
--------------------------------------------------------------------------
--------------------------------------------------------------------------
[the orte_init failure block above is printed again, verbatim, by each of the remaining MPI ranks]
--------------------------------------------------------------------------
It looks like MPI_INIT failed for some reason; your parallel process is
likely to abort.  There are many reasons that a parallel process can
fail during MPI_INIT; some of which are due to configuration or environment
problems.  This failure appears to be an internal failure; here's some
additional information (which may only be relevant to an Open MPI
developer):

  ompi_mpi_init: ompi_rte_init failed
  --> Returned "Error" (-1) instead of "Success" (0)
--------------------------------------------------------------------------
--------------------------------------------------------------------------
[the MPI_INIT failure block above likewise repeats once per rank]
*** An error occurred in MPI_Init
*** on a NULL communicator
*** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
***    and potentially your MPI job)
[the MPI_Init abort message above is also printed once per rank]
[Amer:9203] Local abort before MPI_INIT completed successfully; not able to aggregate error messages, and not able to guarantee that all other processes were killed!
[Amer:9204] Local abort before MPI_INIT completed successfully; not able to aggregate error messages, and not able to guarantee that all other processes were killed!
[Amer:9201] Local abort before MPI_INIT completed successfully; not able to aggregate error messages, and not able to guarantee that all other processes were killed!
[Amer:9202] Local abort before MPI_INIT completed successfully; not able to aggregate error messages, and not able to guarantee that all other processes were killed!
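
[Editor's note: these opal/orte failures suggest the MPI runtime itself is broken or mismatched, rather than anything in QE. A quick sanity check — a sketch, assuming the pw.x binary is in the current directory — is to verify that pw.x and the mpirun launching it come from the same MPI installation:]

which mpirun                             # expect it under the OpenMPI install prefix
mpirun --version
ldd ./pw.x | grep -i -E 'mpi|open-pal'   # the MPI shared libraries should resolve under that same prefix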
Post #7, 2016-12-10 14:18:51

Amersly


Quoted reply:
Post #6: Originally posted by souledge at 2016-12-10 00:23:45
Don't point outdir and pseudo_dir at the same directory. During the calculation the program automatically copies the pseudopotential files from pseudo_dir into outdir; if the two coincide, it deletes the original pseudopotentials and leaves empty files behind. ...

I reinstalled OpenMPI once more. That error is gone now, but the parallel-compiled QE still fails with the original error:
#############################################################################################################################
# WARNING: there are pending errors
# PENDING ERROR (ierr=1)
# ERROR IN: iotk_open_write (iotk_files.f90:341)
# CVS Revision: 1.20
# Error opening file
unit=90000
file=/home/luyaosong/espresso/pseudo/.x
binary=F
new=F
iostat=2
# FROM IOTK LIBRARY, VERSION 1.2.0
# UNRECOVERABLE ERROR (ierr=3)
# ERROR IN: iotk_getline (iotk_scan.f90:947)
# CVS Revision: 1.23
#
iostat=5001
# ERROR IN: iotk_scan_tag (iotk_scan.f90:593)
# CVS Revision: 1.23
# ERROR IN: iotk_scan (iotk_scan.f90:821)
# CVS Revision: 1.23
# ERROR IN: iotk_scan_end (iotk_scan.f90:241)
# CVS Revision: 1.23
# foundl
########################################################################################################################
########################################################################################################################
[the same IOTK error block is printed again by the other MPI ranks; the copies interleave in the output and garble one another]
Could it be that QE wasn't installed completely?
Post #8, 2016-12-10 15:03:49

Amersly


Quoted reply:
Post #9: Originally posted by souledge at 2016-12-10 15:16:55
Please describe the OpenMPI installation process and install directory in detail (including the version number, the configure command, the exact make and make install commands and their arguments, plus any post-install setup for OpenMPI, e.g. whether you added any directories to environment variables or to ~/.bashrc).
Alternatively, you could ...

OpenMPI installation:
1. Downloaded the tarball to the download directory, then extracted it under ~/ into the openmpi-1.10.2 folder:
       tar zxvf openmpi-1.10.2.tar.gz
2. Entered the source directory:
       cd openmpi-1.10.2
3. Ran configure; prefix sets the install directory, and CC, CXX, FC, F77 select the Intel compilers as the backends:
       ./configure --prefix=/opt/openmpi-165 CC=icc CXX=icpc FC=ifort F77=ifort
4. Built it; -j followed by a number enables a multi-core build and shortens compile time:
       make -j4
5. After the build finished, installed into the chosen directory:
       make install
6. Set the environment variables by appending to the end of ~/.bashrc:
#setting for intel compiler and mpi
source /opt/intel/bin/compilervars.sh intel64
#source /opt/intel/impi_5.0.1/bin64/mpivars.sh
source /opt/intel/composer_xe_2015.0.090/mkl/bin/mklvars.sh intel64 ilp64

#setting for openmpi
export MPI_HOME=/opt/openmpi-1.10.2
export PATH=$MPI_HOME/bin:$PATH
export LD_LIBRARY_PATH=$MPI_HOME/lib:$LD_LIBRARY_PATH

Before installing OpenMPI I had already set up parallel_studio_xe_2015, which bundles IMPI, so I had set its environment variables as well; when using OpenMPI I commented the IMPI line out.
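
[Editor's note: a quick way to verify the OpenMPI installation itself, independent of QE — a sketch, with paths assuming the setup above:]

which mpirun           # should resolve under the OpenMPI prefix, not under Intel IMPI
mpirun --version
mpicc --showme         # prints the underlying compiler and flags the wrapper will use
mpirun -np 4 hostname  # trivial 4-process run; if this fails, the problem is MPI, not QE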

Is MPICH2 simpler to deal with? If it's easy to get working I'll give that a try too.
Post #10, 2016-12-10 15:37:42

Amersly


What I pasted there came from the original web page (the guide I followed); my actual version is 1.10.2.

Post #12, 2016-12-10 18:15:39

Amersly


Quoted reply:
Post #11: Originally posted by souledge at 2016-12-10 15:47:44
./configure --prefix=/opt/openmpi-165 CC=icc CXX=icpc FC=ifort F77=ifort
export MPI_HOME=/opt/openmpi-1.10.2
Are you sure this is only a copy-paste slip, and that what you actually ran wasn't wrong? ...

In that line, openmpi-165 should have read openmpi-1.10.2; I pasted it incorrectly.
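
[Editor's note: even so, it is worth checking on disk which prefix actually exists, since an mpirun picked up from a wrong or half-installed prefix could produce exactly these opal_init failures — a sketch:]

ls /opt | grep -i openmpi   # shows which install prefixes really exist
ls $MPI_HOME/bin/mpirun     # must exist if MPI_HOME points at the real installation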

Post #13, 2016-12-10 18:17:14

Amersly


Quoted reply:
Post #16: Originally posted by souledge at 2016-12-10 20:19:58
The first error is raised at this point in iotk_files.f90:

open(unit=unit,file=file,status=status,form=form,position="rewind",iostat=iostat,action="write")
    if(iostat/=0) then
      cal ...

Now even the bundled examples won't run. I plan to reinstall the operating system and the software together from scratch. May I ask what software needs to be in place beforehand to compile QE correctly? If there is a website with a fairly complete walkthrough, please recommend it! Many thanks.
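
[Editor's note: for reference, a minimal sketch of the toolchain usually needed before building QE from source; the package names below are assumptions for a Debian/Ubuntu-style system, and a serial build needs no MPI at all:]

# compilers and build tools
sudo apt-get install build-essential gfortran
# linear-algebra and FFT libraries (QE can fall back to its bundled copies if these are absent)
sudo apt-get install libblas-dev liblapack-dev libfftw3-dev
# MPI, only needed for a parallel build
sudo apt-get install openmpi-bin libopenmpi-dev
# then, in the QE source tree:
./configure
make pw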

Post #17, 2016-12-10 21:22:28