Views: 4800 | Replies: 18
Amersly — New Member (Somewhat Known)

[Help] pwscf band calculation fails — 2 participants so far
The Si example bundled with QE runs fine, but the MoS2 job I set up myself keeps failing with the errors below:

########################################
# WARNING: there are pending errors
# PENDING ERROR (ierr=1)
# ERROR IN: iotk_open_write (iotk_files.f90:341)
# CVS Revision: 1.20
# Error opening file unit=90000 file=/home/luyaosong/espresso/pseudo/.x binary=F new=F iostat=2
# FROM IOTK LIBRARY, VERSION 1.2.0
# UNRECOVERABLE ERROR (ierr=3)
# ERROR IN: iotk_getline (iotk_scan.f90:947)
# CVS Revision: 1.23
# iostat=5001
# ERROR IN: iotk_scan_tag (iotk_scan.f90:593)
# CVS Revision: 1.23
# ERROR IN: iotk_scan (iotk_scan.f90:821)
# CVS Revision: 1.23
# ERROR IN: iotk_scan_end (iotk_scan.f90:241)
# CVS Revision: 1.23
# foundl
########################################

For the first error (iotk_open_write) I still can't work out where the problem is; I'd appreciate any pointers! As for the two errors below it, I've seen other forum members say they come from having only IFORT installed without ICC, so I'm downloading ICC now, though I don't know whether that will fix it. Please help!!
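One detail in the first error may be worth ruling out before reinstalling compilers: the failing open is for file=/home/luyaosong/espresso/pseudo/.x, i.e. the filename before the ".x" suffix is empty, and /home/luyaosong/espresso/pseudo/ is exactly QE's built-in default pseudo_dir ($HOME/espresso/pseudo/). That hints the pseudo_dir and prefix values from the input were not actually in effect for this run (for example a malformed &CONTROL namelist, or a different input file being read). A minimal shell check, with paths taken from this thread (treat them as assumptions):

# Does the default pseudo directory QE fell back to even exist?
ls -l /home/luyaosong/espresso/pseudo/
# Is an environment override set?
echo "$ESPRESSO_PSEUDO"
# Are the UPF files named in ATOMIC_SPECIES where pseudo_dir points?
ls -l /home/luyaosong/QE/qe-6.0/bin/mos2/*.UPF

If the binary was built with gfortran, iostat=2 from a Fortran open() is usually the OS errno 2, "No such file or directory", which would fit a missing directory or file.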
Amersly (original poster) — New Member; major: Semiconductor Physics
My earlier build may have been incomplete; I had used only the compiler setup QE picked up on its own. After reinstalling parallel_studio_xe_2015 and OpenMPI, the run now fails with the errors below (the same messages are printed once per MPI process; duplicates trimmed):

--------------------------------------------------------------------------
It looks like opal_init failed for some reason; your parallel process is
likely to abort. There are many reasons that a parallel process can fail
during opal_init; some of which are due to configuration or environment
problems. This failure appears to be an internal failure; here's some
additional information (which may only be relevant to an Open MPI
developer):

  opal_shmem_base_select failed
  --> Returned value -1 instead of OPAL_SUCCESS
--------------------------------------------------------------------------
It looks like orte_init failed for some reason; your parallel process is
likely to abort. [same preamble as above]

  opal_init failed
  --> Returned value Error (-1) instead of ORTE_SUCCESS
--------------------------------------------------------------------------
It looks like MPI_INIT failed for some reason; your parallel process is
likely to abort. [same preamble as above]

  ompi_mpi_init: ompi_rte_init failed
  --> Returned "Error" (-1) instead of "Success" (0)
--------------------------------------------------------------------------
*** An error occurred in MPI_Init
*** on a NULL communicator
*** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
*** and potentially your MPI job)
[Amer:9201] Local abort before MPI_INIT completed successfully; not able
to aggregate error messages, and not able to guarantee that all other
processes were killed! (likewise for PIDs 9202, 9203, 9204)
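A common cause of opal_init / opal_shmem_base_select failures like this is mixing MPI stacks: pw.x built against one MPI (for example the Intel MPI shipped with parallel_studio_xe_2015) but launched with another installation's mpirun. A quick consistency check, where the pw.x path is an assumption:

# Which launcher and compiler wrappers are first in PATH?
which mpirun mpif90
mpirun --version
# Which MPI libraries is pw.x actually linked against?
ldd /home/luyaosong/QE/qe-6.0/bin/pw.x | grep -i -e mpi -e open-pal

If these disagree, reconfiguring QE against the MPI you intend to launch with, from a clean tree, should resolve it:

make veryclean
./configure MPIF90=mpif90
make pw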
Floor 7 | 2016-12-10 14:18:51
souledge — Expert Advisor; moderator of the First-Principles board; major: Structural Ceramics

Floor 2 | 2016-12-08 15:24:30
Amersly (original poster)
My input file is as follows:

&CONTROL
   calculation  = 'scf' ,
   restart_mode = 'from_scratch' ,
   outdir       = '/home/luyaosong/QE/qe-6.0/bin/mos2/' ,
   pseudo_dir   = '/home/luyaosong/QE/qe-6.0/bin/mos2/' ,
   prefix       = 'mos2' ,
/
&SYSTEM
   ibrav     = 4,
   celldm(1) = 5.9716,
   celldm(3) = 12,
   nat       = 3,
   ntyp      = 2,
   ecutwfc   = 50,
   ecutrho   = 410,
   nbnd      = 8,
   exxdiv_treatment = 'gygi-baldereschi' ,
/
&ELECTRONS
   conv_thr = 1.0d-10 ,
/
ATOMIC_SPECIES
   Mo  95.94   Mo.pz-spn-rrkjus_psl.0.2.UPF
   S   32.066  S.pz-n-rrkjus_psl.0.1.UPF
ATOMIC_POSITIONS (alat)
   S   0.500000000   0.288675130   1.974192764
   Mo  0.000000000   0.577350270   2.462038339
   S   0.000000000  -0.577350270   2.950837559
K_POINTS automatic
   16 16 1 0 0 0

Please advise!
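Independent of the iotk error, two things in this input may deserve a look. The Mo.pz-spn pseudopotential named here is a semicore one (14 valence electrons), so with two S atoms (6 electrons each) the cell would have 26 electrons and 13 occupied bands; if so, nbnd = 8 is below the occupied count and pw.x will stop on it. Also, pw.x wraps its own fatal messages in a banner that is often more informative than the iotk trace; a sketch of a run-and-check, with file names as assumptions:

mkdir -p /home/luyaosong/QE/qe-6.0/bin/mos2/     # make sure outdir exists first
pw.x -in mos2.scf.in > mos2.scf.out
grep -A3 'Error in routine' mos2.scf.out         # QE's own error banner, if any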
Floor 3 | 2016-12-09 16:03:46
wode147 — New Member; major: Condensed Matter Physics II: Electronic Structure

[Answer] (helpful reply)
I've done this system too. Here is my relaxation (optimization) input; take a look. My SCF input is on another computer and not handy right now, but the relax and SCF inputs are nearly the same, so compare against this one:

&control
   calculation   = 'relax',
   prefix        = 'MoS2',
   pseudo_dir    = '/home/g/Code/espresso-5.1.1/pseudo',
   outdir        = './tmp',
   nstep         = 200,
   tstress       = .true.,
   tprnfor       = .true.,
   etot_conv_thr = 1.0D-4,
   forc_conv_thr = 1.0D-3,
/
&system
   ibrav       = 4,
   celldm(1)   = 5.97153455363,
   celldm(3)   = 10,
   nat         = 3,
   ntyp        = 2,
   ecutwfc     = 80.0,
   ecutrho     = 600,
   occupations = 'smearing',
   smearing    = 'gaussian',
   degauss     = 1.d-8,
/
&electrons
   conv_thr = 1.d-10
/
&ions
   ion_dynamics = 'bfgs'
/
&cell
   cell_dynamics = 'bfgs'
/
ATOMIC_SPECIES
   Mo  95.94   Mo.pbe-spn-kjpaw_psl.0.3.0.UPF
   S   32.065  S.pbe-n-kjpaw_psl.0.1.UPF
ATOMIC_POSITIONS
   S   0.00  0.57735026919  3.4982879746835
   Mo  0.00  0.00000000000  4.00000000000
   S   0.00  0.57735026919  4.5017120253165
K_POINTS automatic
   15 15 1 0 0 0
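Since the reply notes the relax and SCF inputs are nearly identical, converting the file above is a one-keyword change; a sketch, with file names as assumptions (the &ions and &cell namelists should simply be ignored for an 'scf' run):

# Switch the calculation type; everything else can stay as-is:
sed "s/calculation   = 'relax'/calculation   = 'scf'/" MoS2.relax.in > MoS2.scf.in
pw.x -in MoS2.scf.in > MoS2.scf.out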
Floor 4 | 2016-12-09 17:47:39