
基于多模態數據的行為和手勢識別 (Behavior and Gesture Recognition Based on Multimodal Data)

Price: ¥40.00

Author: 张亮
Publisher: 西安電子科技大學出版社 (Xidian University Press)
Series: none listed
Tags: none listed

ISBN: 9787560665399    Publication date: 2022-08-01

About the Book

This book presents a series of gesture and behavior recognition methods based on multimodal data representations. The data modalities include image data and skeleton data, and the modeling methods include traditional codebooks, topological graphs, and LSTM architectures. The tasks span a range of complexity, from isolated gesture classification and isolated action classification to continuous gesture classification and the classification of complex human-interaction behaviors. The book focuses on the data processing methods for each modality and the modeling methods for the different tasks. We hope readers can learn basic gesture and action recognition methods from this book and, on that basis, develop model systems suited to their own needs. The book can be used as a textbook for graduate and PhD students majoring in computer science, automation, and related disciplines. It can also serve as a reference for readers who are interested in gesture recognition, human action interaction, sequence data processing, and deep neural network design, and who hope to contribute to these fields.
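The summary mentions LSTM architectures applied to skeleton data. As a rough illustration only (not code from the book), the minimal PyTorch sketch below shows one common way to structure a skeleton-sequence gesture classifier; the joint count, hidden size, and number of classes are arbitrary assumptions.

# Illustrative sketch, not from the book: an LSTM classifier for
# skeleton-sequence gesture recognition. Input is assumed to be
# (batch, frames, joints * coords); joint count, hidden size, and
# class count are placeholder values.
import torch
import torch.nn as nn

class SkeletonLSTMClassifier(nn.Module):
    def __init__(self, num_joints=25, coords=3, hidden_size=128, num_classes=10):
        super().__init__()
        self.lstm = nn.LSTM(
            input_size=num_joints * coords,  # flattened joint coordinates per frame
            hidden_size=hidden_size,
            num_layers=2,
            batch_first=True,
        )
        self.fc = nn.Linear(hidden_size, num_classes)

    def forward(self, x):
        # x: (batch, frames, num_joints * coords)
        out, _ = self.lstm(x)
        # Use the hidden state at the last frame as the sequence summary.
        return self.fc(out[:, -1, :])

# Example: classify a batch of 4 sequences, 30 frames each.
model = SkeletonLSTMClassifier()
logits = model(torch.randn(4, 30, 25 * 3))
print(logits.shape)  # torch.Size([4, 10])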

About the Author

No author biography is currently available for 《基于多模態數據的行為和手勢識別》.

Table of Contents

The table of contents for 《基于多模態數據的行為和手勢識別》 is not yet available.
