Syntactic Networks—Kernel Memory Approach / by Tetsuya Hoya.

Springer eBooks EBA - Intelligent Technologies and Robotics Collection 2024 Available online
Format:
Book
Author/Creator:
Hoya, Tetsuya.
Series:
Studies in Computational Intelligence, 1860-9503 ; 1157
Language:
English
Subjects (All):
Engineering--Data processing.
Engineering.
Computational intelligence.
Data Engineering.
Computational Intelligence.
Local Subjects:
Data Engineering.
Computational Intelligence.
Physical Description:
1 online resource (138 pages)
Edition:
1st ed. 2024.
Place of Publication:
Cham : Springer Nature Switzerland : Imprint: Springer, 2024.
Summary:
This book proposes a novel connectionist approach to the challenging topic of language modeling within the context of kernel memory and the artificial mind system, both proposed previously by the author in the very first volume of the series, Artificial Mind System—Kernel Memory Approach: Studies in Computational Intelligence, Vol. 1. The present volume focuses on how syntactic structures of language are modeled in terms of composite connectionist architectures, each embracing both a nonsymbolic and a symbolic part. These two parts are developed via inter-module processes within the artificial mind system and eventually integrated under a unified framework of kernel memory. The data representation of the networks embodied within the kernel memory principle is essentially local, unlike conventional artificial neural network models such as the pervasive multilayer perceptron-based networks. With this locality principle, kernel memory inherently bears many attractive features: topologically unconstrained network formation; straightforward network growing, shrinking, and reconfiguration; no requirement for arduous iterative parameter tuning; construction of transparent and hierarchical data structures; and multimodal and temporal data processing via the network representation. Exploiting these multifaceted properties of kernel memory, interwoven with the notion of inter-module processing within the artificial mind system, provides coherent accounts of concept formation and of how various linguistic phenomena, viz. word compounding, morphology, and multiword constructions, are modeled. The description is then extended to more intricate network models of a context-dependent lexical network and syntactic-oriented processing, the latter being the central theme of the present study, and further to models representing a hybrid of nonverbal and verbal thinking and the semantic and pragmatic aspects of sentential meaning.
The book is intended for general readers engaging in various areas of study in cognitive science, computer science, engineering, linguistics, philosophy, psycholinguistics, and psychology.
Contents:
Review of the Two Existing Artificial Neural Network Models – Multilayer Perceptron and Probabilistic Neural Networks
Beyond the Original PNN Model – Kernel Memory for Modeling Various Neural Pattern Processing Mechanisms
Modules within the Artificial Mind System and Their Interactions Relevant to Language Pattern Processing
Concept Formation
ISBN:
9783031573125
3031573129
OCLC:
1435754011