| Views: 1282 | Replies: 9 |
zhseda
Gold Bug (Somewhat Known)
[Discussion]
September 2, 2010: ICICTA 2010 has been indexed by EI!!! (7 participants)
wb1318
Silver Bug (Somewhat Known)
- Assists: 0 (Kindergarten)
- Coins: 204.6
- Posts: 115
- Online: 13 hours
- Member ID: 892675
- Registered: 2009-11-03
- Gender: male
Floor 2, 2010-09-02 10:00:04
★ Xiaomuchong (Coins +0.5): Here's a red packet, thanks for joining the discussion!
Congratulations!
Floor 3, 2010-09-02 11:08:09
我心永恒1567
Wood Bug (Official Writer)
- Assists: 0 (Kindergarten)
- Coins: 3975.3
- Scattered Gold: 5
- Red Flowers: 1
- Posts: 359
- Online: 255.2 hours
- Member ID: 601710
- Registered: 2008-09-13
- Gender: male
- Field: Plant Chemical Protection
Floor 4, 2010-09-02 13:30:21
pplain
Wood Bug (Famous Writer)
- Assists: 1 (Kindergarten)
- Coins: 2264.4
- Scattered Gold: 61
- Red Flowers: 6
- Posts: 1469
- Online: 187.4 hours
- Member ID: 974162
- Registered: 2010-03-17
- Field: Enterprise Information Management
Floor 5, 2010-09-02 13:46:29
hdlyh
Iron Bug (Somewhat Known)
- Assists: 0 (Kindergarten)
- Coins: 1.8
- Posts: 131
- Online: 55.8 hours
- Member ID: 137360
- Registered: 2005-12-17
- Field: Natural Language Understanding and Machine Translation
★ Xiaomuchong (Coins +0.5): Here's a red packet, thanks for joining the discussion!
Fellow researchers, could anyone help me look up the EI indexing record for this paper from the same conference: "Feature Selection through Optimization of k-Nearest Neighbor Matching Gain"? Many thanks!
Floor 6, 2010-09-02 16:12:39
我心永恒1567
Wood Bug (Official Writer)
- Assists: 0 (Kindergarten)
- Coins: 3975.3
- Scattered Gold: 5
- Red Flowers: 1
- Posts: 359
- Online: 255.2 hours
- Member ID: 601710
- Registered: 2008-09-13
- Gender: male
- Field: Plant Chemical Protection
★ Xiaomuchong (Coins +0.5): Here's a red packet, thanks for joining the discussion!
Accession number: 20103413169767
Title: Feature selection through optimization of k-nearest neighbor matching gain
Authors: Luo, Yihui (1); Xiong, Shuchu (1)
Author affiliation: (1) Department of Information, Hunan University of Commerce, Changsha, China
Corresponding author: Luo, Y. (yihuiluo@yahoo.com.cn)
Source title: 2010 International Conference on Intelligent Computation Technology and Automation, ICICTA 2010
Abbreviated source title: Int. Conf. Intelligent Comput. Technol. Autom., ICICTA
Volume: 2
Monograph title: 2010 International Conference on Intelligent Computation Technology and Automation, ICICTA 2010
Issue date: 2010
Publication year: 2010
Pages: 309-312
Article number: 5522419
Language: English
ISBN-13: 9780769540771
Document type: Conference article (CA)
Conference name: 2010 International Conference on Intelligent Computation Technology and Automation, ICICTA 2010
Conference date: May 11, 2010 - May 12, 2010
Conference location: Changsha, China
Conference code: 81471
Sponsor: IEEE Intelligent Computation Society; Res. Assoc. Intelligent Comput. Technol. Autom.; Hunan University; Changsha University of Science and Technology; Hunan University of Science and Technology
Publisher: IEEE Computer Society, 445 Hoes Lane - P.O. Box 1331, Piscataway, NJ 08855-1331, United States
Abstract: Many problems in information processing involve some form of dimensionality reduction. In this paper, we propose a new model for feature evaluation and selection in unsupervised learning scenarios. The model makes no special assumptions about the nature of the data set. For each item in the data set, the original features induce a ranked list of its k nearest neighbors. The evaluation criterion favors reduced feature sets that are most consistent with these ranked lists, and an efficient local descent search based on the model is adopted to select the reduced features. Our experiments with several data sets demonstrate that the proposed algorithm is able to detect completely irrelevant features and to remove some additional features without significantly hurting the performance of the clustering algorithm. © 2010 IEEE.
Number of references: 11
Main heading: Feature extraction
Controlled terms: Clustering algorithms - Data processing - Unsupervised learning
Uncontrolled terms: Data sets - Dimensionality reduction - Evaluation criteria - Feature evaluation and selection - Feature selection - Information processing - K-nearest neighbors - New model - Search-based
Classification code: 716 Telecommunication; Radar, Radio and Television - 721 Computer Circuits and Logic Elements - 723 Computer Software, Data Handling and Applications
DOI: 10.1109/ICICTA.2010.608
Database: Compendex
Compilation and indexing terms, © 2010 Elsevier Inc.
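The abstract only sketches the method, but its core idea (score a reduced feature set by how well it preserves each item's k-nearest-neighbor list under the full feature set, then drop features by greedy local descent) can be illustrated in code. The sketch below is a hypothetical reconstruction, not the authors' implementation: it substitutes simple set overlap for the paper's ranked-list consistency criterion, and `min_gain` is an invented stopping threshold.

```python
import numpy as np

def knn_indices(X, k):
    """Indices of each point's k nearest neighbors (self excluded)."""
    # Pairwise squared Euclidean distances
    d = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    np.fill_diagonal(d, np.inf)
    return np.argsort(d, axis=1)[:, :k]

def matching_gain(X, features, ref_nn, k):
    """Average overlap between each point's k-NN set under the reduced
    features and its reference k-NN set under all features."""
    nn = knn_indices(X[:, features], k)
    return np.mean([len(set(nn[i]) & set(ref_nn[i])) / k
                    for i in range(len(X))])

def select_features(X, k=5, min_gain=0.9):
    """Greedy local descent: repeatedly drop the feature whose removal
    best preserves the reference k-NN lists, while the gain stays high."""
    ref_nn = knn_indices(X, k)
    selected = list(range(X.shape[1]))
    while len(selected) > 1:
        best_gain, best_f = -1.0, None
        for f in selected:
            trial = [g for g in selected if g != f]
            gain = matching_gain(X, trial, ref_nn, k)
            if gain > best_gain:
                best_gain, best_f = gain, f
        if best_gain < min_gain:   # no removal keeps neighborhoods intact
            break
        selected.remove(best_f)
    return selected
```

On a toy data set with two informative clustered features and one near-zero noise feature, this sketch removes the noise feature (its removal barely perturbs any neighbor list) and then stops, since dropping either informative feature reshuffles the neighborhoods.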
Floor 7, 2010-09-02 16:42:14
hdlyh
Iron Bug (Somewhat Known)
- Assists: 0 (Kindergarten)
- Coins: 1.8
- Posts: 131
- Online: 55.8 hours
- Member ID: 137360
- Registered: 2005-12-17
- Field: Natural Language Understanding and Machine Translation
Floor 8, 2010-09-02 16:43:27
★ Xiaomuchong (Coins +0.5): Here's a red packet, thanks for replying!
[This post's content has been blocked.]
Floor 9, 2017-05-31 19:39:53
wjbt232394
Banned Bug (Forum Newcomer)
★ Xiaomuchong (Coins +0.5): Here's a red packet, thanks for replying!
[This post's content has been blocked.]
Floor 10, 2017-06-08 14:59:22