Hellbound
Merchant of Doom
A neural net isn’t programmed to produce a specific result. I believe that’s the misunderstanding here.
A neural net is trained on a certain problem set (e.g., aerodynamics, or space optimization, or map matching, or whatever): it adjusts certain parameters and weights its results according to criteria the programmers specified.
So in these cases, the AI isn’t doing something it wasn’t programmed to do. It’s more a matter of the programmers not anticipating all the potential outcomes of the manipulations they programmed the AI to perform.
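To make that concrete, here’s a minimal sketch (my own toy example, not any particular AI library) of the kind of training loop described above. The programmer specifies only the update rule and the scoring criterion; the final weight value is discovered by the process, not written anywhere in the code:

```python
def train(samples, lr=0.1, steps=100):
    """Fit y = w * x by gradient descent on squared error."""
    w = 0.0
    for _ in range(steps):
        # The programmed manipulation: nudge w against the error gradient.
        grad = sum(2 * (w * x - y) * x for x, y in samples) / len(samples)
        w -= lr * grad
    return w

# The data implies w ≈ 3, but the number 3 appears nowhere in the program.
data = [(1, 3), (2, 6), (3, 9)]
print(round(train(data), 2))  # → 3.0
```

The answer "emerges" from the programmed rules plus the inputs, which is exactly the point: nothing happened that wasn’t programmed, but the outcome wasn’t spelled out in advance either.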
The “Life” computer app (Conway’s Game of Life) is a good, simple example. That’s the one where squares light up based on how many squares around them are lit. It is programmed to execute a few simple rules, yet it can produce stable and dynamic structures. It’s not programmed to do that; that emerges from the input parameters (the initial lit squares and the result of each previous iteration) and the variables being manipulated (the adjacent squares).
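Those few simple rules fit in a few lines. Here’s a sketch assuming the standard rules (a dead square with exactly 3 lit neighbors lights up; a lit square with 2 or 3 lit neighbors stays lit; everything else goes dark):

```python
from collections import Counter

def step(live):
    """Advance one generation. `live` is a set of (x, y) lit squares."""
    # Count how many lit neighbors each square has.
    counts = Counter((x + dx, y + dy)
                     for (x, y) in live
                     for dx in (-1, 0, 1)
                     for dy in (-1, 0, 1)
                     if (dx, dy) != (0, 0))
    return {cell for cell, n in counts.items()
            if n == 3 or (n == 2 and cell in live)}

# A "blinker": three squares in a row flip between horizontal and vertical
# forever -- a stable dynamic structure nobody explicitly programmed in.
blinker = {(0, 1), (1, 1), (2, 1)}
gen1 = step(blinker)   # vertical column
gen2 = step(gen1)      # back to the original horizontal row
```

Nothing in `step` mentions blinkers, gliders, or any other structure; they all fall out of the neighbor-counting rule and the starting pattern.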
Sent from my iPhone using Tapatalk