
Views: 889  |  Replies: 5
[Coin bounty] Answer the question in this thread and the author agong will give you 5 coins

agong

New member (new to the forum)

[Help] Help understanding the review comments

I understand the literal meaning, but I would still like people with experience to take a look and offer some criticism and suggestions, as well as advice on how to revise the paper and where to submit it next. Thanks, everyone.

Sent from the Xiaomuchong iOS client


agong

New member (new to the forum)

Review 1
Relevance and Timeliness: Acceptable. (3)
Technical Content and Scientific Rigour: Valid work but limited contribution. (3)
Novelty and Originality: Some interesting ideas and results on a subject well investigated. (3)
Quality of Presentation: Readable, but revision is needed in some parts. (3)
Strong Aspects (Comments to the author: What are the strong aspects of the paper?)
This paper presents a new algorithm to offload edge tasks to edge servers within the MEC environment. The authors present an improved reinforcement learning framework adapted to dynamic environments, which selects samples from an improved experience pool. Simulation experiments reveal improved performance.
Weak Aspects (Comments to the author: What are the weak aspects of the paper?)
First, the distinction between the proposal and state-of-the-art reinforcement learning seems limited, as the selection of experience samples appears straightforward. Second, the simulation results are not discussed in detail to explain the novelty of the proposal.
Recommended Changes (Recommended changes. Please indicate any changes that should be made to the paper if accepted.)
First, the authors should explain the improvements: why experiences are vital to improving the performance of reinforcement learning, given that the selection of experience samples seems straightforward. Second, an example is favored to illustrate the workflow of the proposed algorithm. Third, the experiments do not present the detailed setup of the MEC environment or the performance metrics.
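To make the reviewer's point about sample selection concrete, here is a minimal, purely illustrative Python sketch of priority-based sampling from an experience pool. The class name, parameters, and the use of TD error as the priority signal are assumptions for illustration only, not the paper's actual method:

import random
from collections import namedtuple

# A transition is one interaction step stored in the experience pool.
Transition = namedtuple("Transition", ["state", "action", "reward", "next_state"])

class PrioritizedReplayBuffer:
    def __init__(self, capacity=10000, alpha=0.6):
        self.capacity = capacity      # maximum number of stored transitions
        self.alpha = alpha            # how strongly TD error biases sampling
        self.buffer = []              # stored transitions
        self.priorities = []          # one priority per stored transition
        self.pos = 0                  # next slot to overwrite once full

    def push(self, transition, td_error=1.0):
        # "Surprising" experiences (large TD error) get higher replay priority.
        priority = (abs(td_error) + 1e-6) ** self.alpha
        if len(self.buffer) < self.capacity:
            self.buffer.append(transition)
            self.priorities.append(priority)
        else:
            self.buffer[self.pos] = transition
            self.priorities[self.pos] = priority
        self.pos = (self.pos + 1) % self.capacity

    def sample(self, batch_size):
        # Draw a batch with probability proportional to priority.
        return random.choices(self.buffer, weights=self.priorities, k=batch_size)

The kind of justification the reviewer asks for would explain why biasing replay toward such samples, rather than sampling uniformly, matters in a dynamic MEC environment.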
Floor 6 · 2021-12-15 18:23:27

偃月梨落

New member (somewhat well-known)

Luckyguys
Floor 2 · 2021-12-15 17:44:11

agong

New member (new to the forum)

Link: https://pan.baidu.com/s/1jOUKi7Hp2Y76vfRuQMJ1Qg  Extraction code: euty. Copy this content and open the Baidu Netdisk mobile app for easier access.
Floor 3 · 2021-12-15 18:15:41

agong

New member (new to the forum)

Strong Aspects (Comments to the author: What are the strong aspects of the paper?)
In this paper, the authors propose experience-based computational offloading with reinforcement learning in an MEC network.
Weak Aspects (Comments to the author: What are the weak aspects of the paper?)
1. In (11), it seems that the discount factor is 1, while the discount factor is defined on [0, 1] in (12). It is not very clear.
2. Some symbols are undefined, e.g., the immediate reward r_t and the symbol \wedge in (15).
3. There are some flaws in the presentation, e.g., the doubled "the task" in Section II-B; the action should be defined in lowercase.
4. In Algorithm 1, the meaning of "undated" is not clear.
5. It is better to compare the proposed algorithm with DQN rather than DDPG.
Recommended Changes (Recommended changes. Please indicate any changes that should be made to the paper if accepted.)
In this paper, the authors propose experience-based computational offloading with reinforcement learning in an MEC network. The reviewer has the following comments.
1. In (11), it seems that the discount factor is 1, while the discount factor is defined on [0, 1] in (12). It is not very clear.
2. Some symbols are undefined, e.g., the immediate reward r_t and the symbol \wedge in (15).
3. There are some flaws in the presentation, e.g., the doubled "the task" in Section II-B; the action should be defined in lowercase.
4. In Algorithm 1, the meaning of "undated" is not clear.
5. It is better to compare the proposed algorithm with DQN rather than DDPG.
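On the reviewer's first point, a minimal reference formula may help: in generic MDP notation (not the paper's own equations (11) and (12), which are not reproduced in this thread), the standard discounted return is

G_t = \sum_{k=0}^{\infty} \gamma^{k} r_{t+k+1}, \qquad \gamma \in [0, 1].

Taking \gamma = 1 gives an undiscounted return, which is only guaranteed to be finite for episodic (finite-horizon) tasks; writing the return with an implicit \gamma = 1 in one equation while defining \gamma \in [0, 1] in the next is the apparent inconsistency the reviewer wants clarified.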
Floor 4 · 2021-12-15 18:22:55