            Research on Domain Word-Sense Relevance
            Authors: Wang Zhongzhen, Wang Tao, Du Xiaoli
            Source: original to this site
            Updated: 2014/1/14 11:04:00
            Full text:

                                           (1. College of Computer, National University of Defense Technology, Changsha, Hunan 410000, China;
                                                  2. College of Computer, National University of Defense Technology, Changsha, Hunan 410000, China;
                                                  3. College of Computer, National University of Defense Technology, Changsha, Hunan 410000, China)

            [Abstract]: Domain word-sense relevance computes the semantic relatedness of basic language units within a specific domain and, as the foundation of relevance computation, plays a very important role in measuring relations between texts at other levels. Research on domain word-sense relevance draws on distinctive linguistic features: Wikipedia and Baidu Baike not only explain each entry in detail but also link the related attribute entries mentioned in the explanation, so the effective linguistic features of feature words can be inferred accurately, and deep-learning algorithms can fully mine and exploit the relevance among word attributes.
            [Keywords]: word-sense relevance, deep learning, Wikipedia
            CLC number: TP391.1     Document code: A        Article ID:
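The link structure the abstract describes can be made concrete with a toy sketch (not the authors' implementation): reduce each encyclopedia entry to the set of entries its explanation links to, and score relevance between two words as the Jaccard overlap of those sets. The entry names and link sets below are hypothetical.

```python
def link_relevance(links_a, links_b):
    """Jaccard overlap of two entries' outgoing-link sets."""
    a, b = set(links_a), set(links_b)
    if not a or not b:
        return 0.0
    return len(a & b) / len(a | b)

# Hypothetical outgoing links of two encyclopedia entries.
router = {"network", "packet", "gateway", "firmware"}
switch = {"network", "packet", "ethernet", "port"}

print(link_relevance(router, switch))  # 2 shared / 6 total ≈ 0.333
```

A production system would harvest these link sets from article markup; the overlap score then serves as one feature alongside the learned representations discussed below.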

                                          A Research Based on Domain Meaning Relevance

                                               WANG Zhongzhen1  WANG Tao2  DU Xiaoli3 
             (1. National University of Defense Technology, Changsha 410000, China. WANG Zhongzhen, 54696661@qq.com;
              2. National University of Defense Technology, Changsha 410000, China. WANG Tao, 631570216@qq.com;
              3. National University of Defense Technology, Changsha 410000, China. DU Xiaoli, 821979047@qq.com)

            Abstract: Domain Meaning Relevance (DMR) refers to computing the semantic relevance of basic language units in a specific domain. As the foundation of relevance computation, DMR plays a significant role in measuring the relevance between texts at other levels. Research on DMR draws on distinctive linguistic characteristics: Wikipedia and Baidu Baike give not only a detailed explanation for each entry but also links to entries with related attributes. Exploiting this, deep learning algorithms can mine and utilize the relevance between attributes and infer the linguistic features of feature words.
            Key words: semantic relevance, deep learning, Wikipedia
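The deep-learning side of the abstract can be sketched in the usual way (a simplified illustration, not the paper's model): a language model maps each word to a dense vector, and relevance is the cosine of the angle between two vectors. The tiny 4-dimensional vectors below are made up for illustration; real systems learn them from corpus co-occurrence, word2vec-style.

```python
import math

def cosine(u, v):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(u, v))
    nu = math.sqrt(sum(x * x for x in u))
    nv = math.sqrt(sum(x * x for x in v))
    return dot / (nu * nv) if nu and nv else 0.0

# Made-up embeddings for three words (real ones are learned).
vec = {
    "modem":  [0.9, 0.1, 0.3, 0.0],
    "router": [0.8, 0.2, 0.4, 0.1],
    "apple":  [0.0, 0.9, 0.1, 0.8],
}

print(cosine(vec["modem"], vec["router"]))  # high: related domain terms
print(cosine(vec["modem"], vec["apple"]))   # low: unrelated terms
```

Combining such embedding similarity with the encyclopedia link structure is the kind of attribute-level relevance mining the abstract refers to.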

             

             

             

            About the authors:
              Wang Zhongzhen, male, born June 1980 in Changping, Beijing; Master of Engineering in Computer Science and Technology, College of Computer, National University of Defense Technology. His research interests are data mining, natural language processing, and information security.
              Wang Tao, male, born September 1979 in Changge, Henan; Master of Engineering in Computer Science and Technology, College of Computer, National University of Defense Technology. His research interests are data mining, microblog opinion leaders, and public-opinion monitoring.
              Du Xiaoli, female, born December 1989 in Shijiazhuang, Hebei; National Key Laboratory for Parallel and Distributed Processing, National University of Defense Technology. Her research interests are social networks and mobile computing, and mobile wireless communication.
              

             
             
               
            Communication Market (《通信市場》), 49 Fuxing Road, Beijing, China (100036)