Views: 4800  |  Replies: 18

Amersly

New member (somewhat known)

[Help] pwscf band calculation fails (2 participants so far)

The Si example that ships with QE runs fine, but the MoS2 input I built myself keeps failing with the following error:
########################################
# WARNING: there are pending errors
# PENDING ERROR (ierr=1)
# ERROR IN: iotk_open_write (iotk_files.f90:341)
# CVS Revision: 1.20
# Error opening file
unit=90000
file=/home/luyaosong/espresso/pseudo/.x
binary=F
new=F
iostat=2
# FROM IOTK LIBRARY, VERSION 1.2.0
# UNRECOVERABLE ERROR (ierr=3)
# ERROR IN: iotk_getline (iotk_scan.f90:947)
# CVS Revision: 1.23
#
iostat=5001
# ERROR IN: iotk_scan_tag (iotk_scan.f90:593)
# CVS Revision: 1.23
# ERROR IN: iotk_scan (iotk_scan.f90:821)
# CVS Revision: 1.23
# ERROR IN: iotk_scan_end (iotk_scan.f90:241)
# CVS Revision: 1.23
# foundl
########################################
For the first error (iotk_open) I still cannot figure out where the problem lies; I would appreciate any hints!
As for the two errors below it, other forum members have said they come from installing only IFORT without ICC, so I am downloading ICC now; I do not know whether that will solve it.
Please help!!
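The file the first error trips over, /home/luyaosong/espresso/pseudo/.x, has the bare name ".x", which hints at an empty or mangled pseudopotential filename, or at a clobbered pseudo directory. One quick check, as a sketch using the path straight from the log:

ls -la /home/luyaosong/espresso/pseudo/
# Zero-length .UPF files, or a stray file literally named '.x', would point
# to a damaged pseudo directory rather than to the compiler.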

Amersly

New member (somewhat known)

ÒýÓûØÌû:
6Â¥: Originally posted by souledge at 2016-12-10 00:23:45
²»Òª°ÑoutdirºÍpseudo_dir·ÅÔÚÒ»Æð£¬¼ÆËãʱ³ÌÐò»á×Ô¶¯½«pseudo_dirÖÐµÄØÍÊÆÎļþ¿½±´µ½outdirÖУ¬Èç¹ûÔÚÒ»Æð£¬»á½«Ô­ÓÐµÄØÍÊÆÉ¾³ý²¢ÇÒÁô¸ö¿ÕÎļþ¡£...
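A minimal sketch of that separation in the &CONTROL namelist (both directory paths here are hypothetical, for illustration only):

&CONTROL
    calculation = 'scf' ,
    prefix      = 'mos2' ,
    outdir      = '/home/luyaosong/QE/tmp/' ,      ! scratch directory (hypothetical path)
    pseudo_dir  = '/home/luyaosong/QE/pseudo/' ,   ! pseudopotentials only (hypothetical path)
/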

ÎÒ֮ǰ±àÒë¿ÉÄܰ²×°²»È« ¶¼ÊÇÓõÄQE×Ô´øµÄ±àÒëÆ÷¡£
ÖØÐ°²×°ÁËparallel_studio_xe_2015ºÍOpenmpi ºóÔËÐгöÏÖдíÎó£º
It looks like opal_init failed for some reason; your parallel process is
likely to abort.  There are many reasons that a parallel process can
fail during opal_init; some of which are due to configuration or
environment problems.  This failure appears to be an internal failure;
here's some additional information (which may only be relevant to an
Open MPI developer):

  opal_shmem_base_select failed
  --> Returned value -1 instead of OPAL_SUCCESS
--------------------------------------------------------------------------
It looks like orte_init failed for some reason; your parallel process is
likely to abort.  There are many reasons that a parallel process can
fail during orte_init; some of which are due to configuration or
environment problems.  This failure appears to be an internal failure;
here's some additional information (which may only be relevant to an
Open MPI developer):

  opal_init failed
  --> Returned value Error (-1) instead of ORTE_SUCCESS
--------------------------------------------------------------------------
It looks like MPI_INIT failed for some reason; your parallel process is
likely to abort.  There are many reasons that a parallel process can
fail during MPI_INIT; some of which are due to configuration or environment
problems.  This failure appears to be an internal failure; here's some
additional information (which may only be relevant to an Open MPI
developer):

  ompi_mpi_init: ompi_rte_init failed
  --> Returned "Error" (-1) instead of "Success" (0)
--------------------------------------------------------------------------
*** An error occurred in MPI_Init
*** on a NULL communicator
*** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
***    and potentially your MPI job)
[Amer:9201] Local abort before MPI_INIT completed successfully; not able to aggregate error messages, and not able to guarantee that all other processes were killed!
[the orte_init, MPI_INIT, and abort messages repeat once per MPI process; duplicates omitted]
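These opal_init/orte_init failures generally indicate a broken or mixed MPI installation (for instance, a pw.x built against one MPI but launched with another's mpirun) rather than anything in the QE input. A few sanity checks, as a sketch (the pw.x path is an assumption, guessed from the outdir used above):

# Which mpirun is first on PATH, and which Open MPI it belongs to
which mpirun
mpirun --version
# Confirm pw.x links against that same Open MPI (binary path is an assumption)
ldd /home/luyaosong/QE/qe-6.0/bin/pw.x | grep -i mpi
# The runtime linker must resolve to the matching Open MPI libraries
echo $LD_LIBRARY_PATH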
Post #7, 2016-12-10 14:18:51

souledge

Expert advisor (noted writer)

ÒýÓûØÌû:
# Error opening file
unit=90000
file=/home/luyaosong/espresso/pseudo/.x
binary=F
new=F
iostat=2

Are you sure the input file is correct?
Ideas outweigh technique; substance outweighs appearance.
Post #2, 2016-12-08 15:24:30

Amersly

New member (somewhat known)

ÒýÓûØÌû:
2Â¥: Originally posted by souledge at 2016-12-08 15:24:30
È·¶¨ÊäÈëÎļþ¶¼Ã»ÓÐ´í£¿

My input file is as follows:
&CONTROL
    calculation  = 'scf' ,
    restart_mode = 'from_scratch' ,
    outdir       = '/home/luyaosong/QE/qe-6.0/bin/mos2/' ,
    pseudo_dir   = '/home/luyaosong/QE/qe-6.0/bin/mos2/' ,
    prefix       = 'mos2' ,
/
&SYSTEM
    ibrav            = 4,
    celldm(1)        = 5.9716,
    celldm(3)        = 12,
    nat              = 3,
    ntyp             = 2,
    ecutwfc          = 50,
    ecutrho          = 410,
    nbnd             = 8,
    exxdiv_treatment = 'gygi-baldereschi' ,
/
&ELECTRONS
    conv_thr = 1.0d-10 ,
/
ATOMIC_SPECIES
Mo  95.94    Mo.pz-spn-rrkjus_psl.0.2.UPF
S   32.066   S.pz-n-rrkjus_psl.0.1.UPF
ATOMIC_POSITIONS (alat)
S        0.500000000   0.288675130   1.974192764
Mo       0.000000000   0.577350270   2.462038339
S        0.000000000  -0.577350270   2.950837559
K_POINTS automatic
16 16 1 0 0 0

Please advise!
Post #3, 2016-12-09 16:03:46

wode147

New member (new to the forum)

[Answer] Helpful reply

I have worked on this system too. Here is my input file:
&control
    calculation = 'relax',
    prefix='MoS2',
    pseudo_dir='/home/ g/Code/espresso-5.1.1/pseudo',
    outdir = './tmp',
    nstep = 200,
    tstress = .true.,
    tprnfor = .true.,
    etot_conv_thr = 1.0D-4,
    forc_conv_thr = 1.0D-3,
/
&system   
    ibrav=  4, celldm(1) = 5.97153455363,  celldm(3)= 10,
    nat=  3, ntyp= 2,
    ecutwfc = 80.0,
    ecutrho = 600,
    occupations = 'smearing',
    smearing = 'gaussian',
    degauss = 1.d-8,
/
&electrons
    conv_thr = 1.d-10
/
&ions
    ion_dynamics = 'bfgs'
/
&cell
    cell_dynamics = 'bfgs'
/
ATOMIC_SPECIES
Mo  95.94   Mo.pbe-spn-kjpaw_psl.0.3.0.UPF
S   32.065  S.pbe-n-kjpaw_psl.0.1.UPF
ATOMIC_POSITIONS
S  0.00 0.57735026919  3.4982879746835  
Mo 0.00 0.00000000000  4.00000000000
S  0.00 0.57735026919  4.5017120253165  
K_POINTS automatic
15  15  1  0  0  0
This is my geometry-optimization input file; have a look. My scf input is on another computer and hard to get at, but the relax and scf inputs are nearly the same, so compare against this one.
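For completeness, an input like this would typically be launched along these lines (the file name, binary path, and process count are assumptions, not taken from the thread):

# run the relax calculation on 4 processes; redirect output to a log file
mpirun -np 4 pw.x -in mos2.relax.in > mos2.relax.out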
Post #4, 2016-12-09 17:47:39