In this paper, the nervous system is modeled as a combination of common structural components, and it is argued that this hierarchical structure made possible a great evolution of cognitive function, extending outward from eating behavior. The common component (the basic unit) is a set of neural circuits combined into a cylindrical form, and it has an affinity with the columns, barrels, and blobs of neuroscience. It uses multiple inputs and outputs to realize a multivalued function. By connecting basic units hierarchically, more complex time-series data can be processed, and units can be added almost without limit. Time-series data is stored by repeating judgment and action. The stored data includes not only food and other objects but also the surrounding situation. If a new situation resembles an element of an already recorded time series, the resembling element becomes active and the activation spreads throughout the past time-series data. Time-series data is constantly updated during processing: for example, reality turns into a new cause, and a prediction turns into a new reality. If two areas are often active at the same time, then even when only one area becomes active, the corresponding area also becomes active. This learning function corresponds to Hebb's law in neuroscience. It is the basis of imitation and conditioned learning, enabling individuals to communicate with their fellows and to act collectively. This ability is indispensable for building a society and is the beginning of language. Language can express not only past and future events but also events in places that the sense organs cannot reach, and it is inherited across generations as culture. In this paper, a new layer is proposed at the top of the neural network that evolved from eating behavior. The processing of the new layer is asynchronous with the lower layers, but it performs complementary processing and enables events to be communicated to fellows.
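The co-activation learning described above can be sketched in a few lines. This is a hypothetical illustration only, not the paper's implementation; the names `Area`, `hebbian_update`, and `spread`, and the rate and threshold values, are assumptions chosen for the sketch:

```python
# Hypothetical sketch of Hebbian co-activation: when two areas are
# repeatedly active together, the link between them strengthens, so
# that later activating one area alone activates its partner.

class Area:
    def __init__(self, name):
        self.name = name
        self.active = False

def hebbian_update(weights, a, b, rate=0.1):
    """Strengthen the link between two areas when both are active."""
    if a.active and b.active:
        key = (a.name, b.name)
        weights[key] = weights.get(key, 0.0) + rate
    return weights

def spread(weights, areas, threshold=0.3):
    """If one area is active and its link is strong enough,
    the corresponding area also becomes active."""
    for (src, dst), w in weights.items():
        if w >= threshold and areas[src].active:
            areas[dst].active = True

areas = {"sight": Area("sight"), "taste": Area("taste")}
weights = {}

# Repeated co-activation (e.g. seeing and tasting food together).
for _ in range(4):
    areas["sight"].active = True
    areas["taste"].active = True
    hebbian_update(weights, areas["sight"], areas["taste"])

# Later, only one area becomes active; activation spreads to its partner.
areas["sight"].active = True
areas["taste"].active = False
spread(weights, areas)
print(areas["taste"].active)  # True once the link exceeds the threshold
```

The point of the sketch is that the association is learned purely from co-occurrence, with no supervision, which is what makes it a plausible substrate for imitation and conditioned learning.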
This ability is the basis of language. The process of forming common knowledge by exchanging questions and teachings between two neural networks is shown with a simple example.
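The question-and-teaching exchange between two networks can be sketched as follows. This is a minimal hypothetical illustration, assuming a toy `Agent` class with `ask`, `teach`, and `learn` methods; it is not the paper's model, only a sketch of the idea that dialogue turns private knowledge into common knowledge:

```python
# Hypothetical sketch: two agents form common knowledge by exchanging
# questions (about objects unknown to them) and teachings (answers
# about objects they already know).

class Agent:
    def __init__(self, name, knowledge):
        self.name = name
        self.knowledge = dict(knowledge)  # object -> property

    def ask(self, obj):
        """Pose a question about an object not yet known."""
        return obj if obj not in self.knowledge else None

    def teach(self, obj):
        """Answer a question if the object is known, else None."""
        return self.knowledge.get(obj)

    def learn(self, obj, prop):
        """Record a taught property."""
        if obj is not None and prop is not None:
            self.knowledge[obj] = prop

a = Agent("A", {"apple": "edible"})
b = Agent("B", {"stone": "inedible"})

# A asks about "stone"; B teaches; A learns.
question = a.ask("stone")
a.learn(question, b.teach(question))

# B asks about "apple"; A teaches; B learns.
question = b.ask("apple")
b.learn(question, a.teach(question))

print(a.knowledge == b.knowledge)  # True: shared knowledge after the dialogue
```

After one round of dialogue in each direction, both agents hold the same knowledge, which is the "common knowledge" state the abstract refers to.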
Published in: American Journal of Neural Networks and Applications (Volume 10, Issue 2)
DOI: 10.11648/j.ajnna.20241002.12
Page(s): 36-43
Creative Commons: This is an Open Access article, distributed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution and reproduction in any medium or format, provided the original work is properly cited.
Copyright: Copyright © The Author(s), 2024. Published by Science Publishing Group
Keywords: Context Corresponding Neuron Layers, Getting Knowledge by Dialogue, Basic Unit, Dynamically Recognize the Object, Dialog Between Fellows, Descriptive World in Brain, Real World in Brain, Modal Logic
APA Style
Yanagawa, S. (2024). Knowledge Exchange Between Neural Network Toward Dawn of Language. American Journal of Neural Networks and Applications, 10(2), 36-43. https://doi.org/10.11648/j.ajnna.20241002.12