Views: 723  |  Replies: 0

罗艺青

New member (new to the forum)

[Discussion] Ran into some problems building a system, hoping someone can help

I built a system with CHARMM-GUI and then expanded it. During energy minimization I got the following error: There is no domain decomposition for 20 ranks that is compatible with the given box and a minimum cell size of 6.2556 nm. Change the number of ranks or mdrun option -rdd.
I found a similar error described in the GROMACS manual: "There is no domain decomposition for n nodes that is compatible with the given box and a minimum cell size of x nm"

This means you tried to run a parallel calculation, and when mdrun tried to partition your simulation cell into chunks for each processor, it couldn't. The minimum cell size is controlled by the size of the largest charge group or bonded interaction and the largest of rvdw, rlist and rcoulomb, some other effects of bond constraints, and a safety margin. Thus it is not possible to run a small simulation with large numbers of processors. So, if grompp warned you about a large charge group, pay attention and reconsider its size. mdrun prints a breakdown of how it computed this minimum size in the .log file, so you can perhaps find a cause there.

If you didn't think you were running a parallel calculation, be aware that from 4.5, GROMACS uses thread-based parallelism by default. To prevent this, you can either give mdrun the "-nt 1" command line option, or build GROMACS so that it will not use threads. Otherwise, you might be using an MPI-enabled GROMACS and not be aware of the fact.
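Not from the thread, but for context, the workarounds the manual hints at (fewer ranks, or the -rdd option) can be sketched as shell commands. This assumes an MPI-enabled build launched through mpirun; the run name "minimization" and the -rdd value are placeholders, not taken from the original post:

```shell
# Workaround 1: use fewer MPI ranks, so each domain-decomposition
# cell can be larger than the reported 6.2556 nm minimum
mpirun -np 4 gmx_mpi mdrun -deffnm minimization

# Workaround 2: keep the rank count but set the distance mdrun
# reserves for bonded interactions explicitly, instead of letting it
# be estimated from the initial coordinates (see gmx mdrun -h)
mpirun -np 20 gmx_mpi mdrun -deffnm minimization -rdd 1.4

# Workaround 3: skip domain decomposition entirely for the short
# minimization by running on a single rank
mpirun -np 1 gmx_mpi mdrun -deffnm minimization
```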
Following that advice, I added "-nt 1" to the energy-minimization command line, but that produced a new error: Setting the total number of threads is only supported with thread-MPI and GROMACS was compiled without thread-MPI
ÔÚûÀ©³äǰÊÇ¿ÉÒÔ½øÐÐÄÜÁ¿×îС»¯µÄ£¬À©³äÖ®ºó¾Í²»¿ÉÒÔÁË£¬ÎÒ¶ÔCharmm-GUI²»Ì«Á˽⣬GROMACSÒ²²Å½Ó´¥£¬Ò²²»Ì«Çå³þÎҵĴíÎórankºÍÊÖ²áÉϵÄnodesÓÐʲô²î±ð£¬Ï£Íû¶®µÄÈ˸øÎÒһЩ½¨Òé