Computers are used to simulate hurricanes both to understand them and to predict their behavior, and many other phenomena are simulated for similar purposes.
In the effort to use computers as controllers for robots and to simulate human cognition, we have accumulated a great deal of empirical evidence about human functioning. To write the algorithms needed to build robots of all kinds, whether they simulate human intellect or human motion, software engineers must develop a detailed understanding of bodies in motion and minds in action.
AI (artificial intelligence) research began shortly after World War II. Alan Turing was among the key figures who decided that the effort should focus not on building new machines but on programming computers.
Some of the achievements of AI have, in the last few decades, been turned toward neural modeling. This effort has been taken up by the NTL (Neural Theory of Language) research group at Berkeley, headed by Jerome Feldman and George Lakoff.
The task the NTL group set for itself in the 1990s was to discover how the brain does what it does and where in the brain these tasks take place. By 1999, NTL had decided to undertake three major modeling tasks:
1) The Spatial-Relations Learning Task
2) The Verbs of Hand Motion Learning Task
3) The Motor Control and Abstract Aspectual Reasoning Task
“In each case, it has been shown that neural structures modeling aspects of the perceptual and motor systems can carry out the given task for concepts, and that, so far as anyone can tell thus far, those perceptual and motor models are required to carry out the task.”
That is, the body's sensorimotor system can perform the functions required to conceptualize and to draw inferences from those conceptions, in the manner human cognition requires. The logical conclusion is that these same sensorimotor neural networks are the ones the body uses to conceptualize during cognition.
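To make the idea of a neural structure "carrying out a task for a concept" concrete, here is a minimal sketch, far simpler than the actual NTL models: a single artificial neuron that learns a spatial-relations concept ("above") from example coordinate pairs, using the classic perceptron update rule. All names and parameters here are illustrative assumptions, not anything from the NTL work itself.

```python
# Toy sketch (NOT the NTL models): one artificial neuron learns the spatial
# relation "point 1 is above point 2" from examples, using the perceptron rule.
import random

random.seed(0)  # fixed seed so the sketch is reproducible

def train_above_detector(steps=1000, lr=0.1):
    """Train one neuron whose input is the height difference (y1 - y2)."""
    w, b = random.uniform(-1, 1), 0.0
    for _ in range(steps):
        y1, y2 = random.uniform(0, 10), random.uniform(0, 10)
        target = 1 if y1 > y2 else 0            # ground truth: is point 1 above?
        out = 1 if w * (y1 - y2) + b > 0 else 0  # neuron's current answer
        err = target - out
        w += lr * err * (y1 - y2)                # perceptron weight update
        b += lr * err                            # perceptron bias update
    return w, b

w, b = train_above_detector()

def is_above(p1, p2):
    """Apply the trained neuron to two (x, y) points."""
    return w * (p1[1] - p2[1]) + b > 0

print(is_above((0, 5), (0, 2)))  # a point at height 5 vs. height 2
print(is_above((0, 1), (0, 4)))  # a point at height 1 vs. height 4
```

The point of the sketch is only that a simple neural structure, trained on sensory-style input (coordinates), can come to carry out a conceptual task (judging "above"), which is the kind of claim the NTL modeling tasks investigate at far greater biological and computational depth.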
Quotes from “Philosophy in the Flesh” by Lakoff and Johnson