讀書網-DuShu.com

基于人工神經網絡的機器翻譯 (Machine Translation Based on Artificial Neural Networks)

Price: ¥25.00

Author: 許羅邁
Publisher: 科學出版社 (Science Press)
Series: (none)
Tags: 自動化基礎理論 (fundamental theory of automation)

ISBN: 9787030189813    Publication date: 2007-06-01    Binding: paperback
Pages: 216

Synopsis

  Following the corpus-based statistical paradigm, Machine Translation Based on Artificial Neural Networks divides machine translation into two processing stages: a translation model and a language model. The author attempts to apply artificial neural network techniques to both stages, covering the entire translation pipeline, which is a creative undertaking. Using a neuron self-learning method that starts from a small number of examples, the system builds a machine lexicon and corresponding translations through self-learning. The experiments reported in this study show that, within a fixed domain, the system can produce reasonably fluent target-language output. Training the translation model with a distributed neural-network architecture largely overcomes the limited learning capacity of a single network, opening a new line of thinking in connectionist language processing that is of considerable significance. For the language model, the author also proposes a novel solution: departing from the common practice of training networks on complex syntactic and semantic features, the network is instead trained on part-of-speech tags, with a self-devised base of word-movement symbols as the training target. This is a distinctive approach; although the author notes that it did not achieve the expected results, if, as the author suggests, the distributed network architecture were also applied to training the language model, the method's ultimate success or failure remains an open question.
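The language-model idea above, that is training a network on part-of-speech tag sequences to predict word-movement symbols that reorder the output, can be illustrated with a minimal sketch. This is not the author's actual system: the tag inventory, the movement symbols ("KEEP", "SWAP", "SHIFT"), and the tiny network are all invented here purely to show the shape of the task.

```python
import numpy as np

# Hypothetical tag set and word-movement symbol base (illustration only).
TAGS = ["DET", "ADJ", "NOUN", "VERB", "PREP"]
MOVES = ["KEEP", "SWAP", "SHIFT"]

def one_hot(i, n):
    v = np.zeros(n)
    v[i] = 1.0
    return v

# Toy training set: a 3-tag window maps to one movement symbol,
# e.g. a NOUN-ADJ pair after a determiner is swapped into ADJ-NOUN order.
samples = [
    (["DET", "ADJ", "NOUN"], "KEEP"),
    (["DET", "NOUN", "ADJ"], "SWAP"),
    (["PREP", "DET", "NOUN"], "KEEP"),
    (["VERB", "NOUN", "DET"], "SHIFT"),
]

def encode(window):
    # Concatenate one-hot vectors for each tag in the window.
    return np.concatenate([one_hot(TAGS.index(t), len(TAGS)) for t in window])

X = np.array([encode(w) for w, _ in samples])
Y = np.array([one_hot(MOVES.index(m), len(MOVES)) for _, m in samples])

rng = np.random.default_rng(0)
W1 = rng.normal(0.0, 0.5, (X.shape[1], 8))   # input -> hidden weights
W2 = rng.normal(0.0, 0.5, (8, len(MOVES)))   # hidden -> output weights

def forward(x):
    h = np.tanh(x @ W1)
    z = h @ W2
    p = np.exp(z - z.max(axis=-1, keepdims=True))
    return h, p / p.sum(axis=-1, keepdims=True)

# Plain gradient descent on cross-entropy loss.
for _ in range(500):
    h, p = forward(X)
    dz = (p - Y) / len(X)
    dW2 = h.T @ dz
    dh = dz @ W2.T * (1.0 - h ** 2)
    dW1 = X.T @ dh
    W2 -= 0.5 * dW2
    W1 -= 0.5 * dW1

_, p = forward(X)
preds = [MOVES[i] for i in p.argmax(axis=-1)]
print(preds)
```

A single small network like this memorizes a handful of patterns easily; the book's point is precisely that one network's learning capacity is limited, which motivates the distributed (multi-network) architecture the author proposes.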

About the Author

No author biography is currently available for Machine Translation Based on Artificial Neural Networks.

Table of Contents

Preface
Acknowledgements
Chapter One Prologue
Chapter Two MT state of the art
 2.1 MT as symbolic systems
 2.2 Practical MT
 2.3 Alternative technique of MT
  2.3.1 Theoretical foundation
  2.3.2 Translation model
  2.3.3 Language model
 2.4 Discussion
Chapter Three Connectionist solutions
 3.1 NLP models
 3.2 Representation
 3.3 Phonological processing
 3.4 Learning verb past tense
 3.5 Part of speech tagging
 3.6 Chinese collocation learning
 3.7 Syntactic parsing
  3.7.1 Learning active/passive transformation
  3.7.2 Confluent preorder parsing
  3.7.3 Parsing with flat structures
  3.7.4 Parsing embedded clauses
  3.7.5 Parsing with deeper structures
 3.8 Discourse analysis
  3.8.1 Story gestalt and text understanding
  3.8.2 Processing stories with scriptural knowledge
 3.9 Machine translation
 3.10 Conclusion
Chapter Four NeuroTrans design considerations
 4.1 Scalability and extensibility
 4.2 Transfer or interlingual
 4.3 Hybrid or fully connectionist
 4.4 The use of linguistic knowledge
 4.5 Translation as a two-stage process
 4.6 Selection of network models
 4.7 Connectionist implementation
 4.8 Connectionist representation issues
 4.9 Conclusion
Chapter Five A neural lexicon model
 5.1 Language data
 5.2 Knowledge representation
  5.2.1 Symbolic approach
  5.2.2 The statistical approach
  5.2.3 Connectionist approach
  5.2.4 NeuroTrans' input/output representation
  5.2.5 NeuroTrans' lexicon representation
 5.3 Implementing the neural lexicon
  5.3.1 Words in context
  5.3.2 Context with weights
  5.3.3 Details of algorithm
  5.3.4 The Neural Lexicon Builder
 5.4 Training
  5.4.1 Sample preparation
  5.4.2 Training results
  5.4.3 Generalization test
 5.5 Discussion
  5.5.1 Adequacy
  5.5.2 Scalability and Extensibility
  5.5.3 Efficiency
  5.5.4 Weaknesses
Chapter Six Implementing the language model
 6.1 Overview
 6.2 Design
  6.2.1 Redefining the generation problem
  6.2.2 Defining jumble activity
  6.2.3 Language model structure
 6.3 Implementation
  6.3.1 Network structure, sampling, training and results
  6.3.2 Generalization test
 6.4 Discussion
  6.4.1 Insufficient data
  6.4.2 Information richness
  6.4.3 Insufficient contextual information
  6.4.4 Distributed language model
Chapter Seven Conclusion
Chapter Eight References
Index
