
Super Artificial Intelligence, a naive approach

So you didn't find it.
You made it up because that would make your estimations correct. That's not evidence, that's circular reasoning.

You fail to understand.


Simply, the 10 impulses per second process was found first.

...and only afterwards did I search for later estimations. (None were found.)

Then, I made the original post.
 
Whether or not you can potentially understand it is moot. You didn't understand it because you didn't pay attention while skim-reading it. How can anybody be expected to take you at all seriously if you formulate theories and come to conclusions based on papers that you admit you haven't even read properly?

I didn't formulate some grand theory.

Simply, that value was quickly used to compute 2020 as the year by which human-level, brain-like machines would arrive.


Other sequences of mine, such as the works below, are runnable and testable, providing accurate results:

That's odd:

(1) basic neural net (technically a deep net of 3 layers as is, but can be extended):

https://github.com/JordanMicahBennett/SYNTHETIC-SENTIENCE



(2) deep neural network for heart irregularity detection, using residual neural networks:

https://github.com/JordanMicahBennett/EJECTION-FRACTION-IRREGULARITY-DETECTION-MODEL



(3) an experiment for enhancing deep neural nets:

https://github.com/JordanMicahBenne...HOMEOMORPHIC-LATTICE-SPACE-BLOCH-OSCILLATIONS

etc
 
The fact that it's running 1542 times slower than real time :confused:

I had long mentioned that the artificial synapses by IBM were crude approximations. (So they are not as efficient.)

They still provided unprecedented efficiency, and achieved state of the art in machine learning and cognitive tasks.
 
I had long mentioned that the artificial synapses by IBM were crude approximations. (So they are not as efficient.)

They still provided unprecedented efficiency, and achieved state of the art in machine learning and cognitive tasks.

Is it your estimation that they will run in real time by 2020?
Or do you count 'same number of operations as humans, but 1000 times slower' as human-level?
 
Is it your estimation that they will run in real time by 2020?
Or do you count 'same number of operations as humans, but 1000 times slower' as human-level?

I referred to IBM's current 10^14 as human-level numbers (that is, a minimum), but not efficiency.

Using public estimations (such as the links from the original post), we will have human-level numbers plus efficiency by 2020 at the latest.
 
Simply, that value was quickly used to compute 2020 as the year by which human-level, brain-like machines would arrive.

Which is the subject of this thread, and which was based on incorrect data - data which was incorrect because you were basing it off sources that you didn't actually take the time to read properly.

So, again, why should anybody give any credence to your methodology when you admit that you get your data from sources that you don't even bother to read properly? This is a sceptic's board.
 
Which is the subject of this thread, and which was based on incorrect data - data which was incorrect because you were basing it off sources that you didn't actually take the time to read properly.

So, again, why should anybody give any credence to your methodology when you admit that you get your data from sources that you don't even bother to read properly? This is a sceptic's board.

To make things even clearer, the subject of this thread is not the 2020 date.

The subject of this thread is 'thought curvature', which uses my mathematical understanding of quantum computing and modern machine learning.


Once again:


(A)
Other sequences of mine, such as the works below, are runnable and testable, providing accurate results:



(1) basic neural net (technically a deep net of 3 layers as is, but can be extended):

https://github.com/JordanMicahBennett/SYNTHETIC-SENTIENCE



(2) deep neural network for heart irregularity detection, using residual neural networks:

https://github.com/JordanMicahBennett/EJECTION-FRACTION-IRREGULARITY-DETECTION-MODEL



(3) an experiment for enhancing deep neural nets:

https://github.com/JordanMicahBenne...HOMEOMORPHIC-LATTICE-SPACE-BLOCH-OSCILLATIONS

etc

In the above testable, runnable sequences, which provide accurate results as they relate to neural systems, it is clear that detailed readings were performed.



(B)


You need to consider that I am human, and I can't validate every string of Wikipedia/IBM data, especially when that data does not influence the computation of my works, as you see above.
In situations that don't concern the coding of my works, I do simplistic probabilistic weightings.
 
Worked for me. To quote the most relevant sentence, "At about 1 billion chemical synapses per cubic millimeter of cortical grey matter, there are lots of synapses in the nervous system (on the order of 10^15 for a human brain)."


It may be the case that the source could still contain 2.4 × 10^14 as indicated by IBM, as the above values appear vague.
However, I have already spent too much time on that matter, as it does not concern the coding of my works.
 
It may be the case that the source could still contain 2.4 × 10^14 as indicated by IBM, as the above values appear vague.

It could also simply be out of date or a rough calculation. The 10^14 area seems to be better substantiated at present, either way, by the looks of it.

However, I have already spent too much time on that matter, as it does not concern the coding of my works.

And that is true enough. I apologize that I don't have anything in the way of insights into how to improve your coding, though, given that I'm not a programmer. Thank you for belatedly, if seemingly unintentionally, answering the one important question that I posed regarding the OP, either way.
 
It could also simply be out of date or a rough calculation. The 10^14 area seems to be better substantiated at present, either way, by the looks of it.



And that is true enough. I apologize that I don't have anything in the way of insights into how to improve your coding, though, given that I'm not a programmer. Thank you for belatedly, if seemingly unintentionally, answering the one important question that I posed regarding the OP, either way.

No problem.
 
To make things even clearer, the subject of this thread is not the 2020 date.

It's certainly one of the main things you've been banging on about. And, again, is demonstrative of your admittedly poor methodology. You can't admit how poor your methodology is and then just ask everybody to pretend that you haven't.
 
It's certainly one of the main things you've been banging on about. And, again, is demonstrative of your admittedly poor methodology. You can't admit how poor your methodology is and then just ask everybody to pretend that you haven't.

I don't ask anyone to do anything.

I still observe my prior reply in post #307.

It's time to focus on the paper in the original post.

Edited by jsfisher: 
<snip> Edited for compliance with Rule 12 of the Membership Agreement.
 
…I have updated wikipedia to reflect this truth.

I'm horrified that someone who thinks that figures differing by an order of magnitude are "roughly the same" could be let loose anywhere near Wiki. Could you do us all a huge favour and stay away from it, please. Leave it to people who have some regard for accuracy and truth.
 
[qimg]http://i.imgur.com/eoWtEuR.jpg[/qimg]


I spent 2 more minutes to find evidence that IBM's 2.4 × 10^14 reference was proper:
Koch's text makes reference to 240 trillion synapses in human cortex on page 87. (This falls UNDER the quadrillion boundary of 10^15, and PARTICULARLY in the trillion range of 10^12+.)

I have updated wikipedia to reflect this truth.
I don't believe you because...
See the original post source for 10^14.


They did achieve 10^14 synapses. (As mentioned in original post)

http://www.modha.org/blog/SC12/RJ10...742514894988565&cm_mc_sid_50200000=1489498856
And this...
That's odd.

I gathered that they had already achieved 10^14 in a relatively recent paper:

http://www.modha.org/blog/SC12/RJ10...122214894244438&cm_mc_sid_50200000=1489431888
And this...
(A)
That does not appear to be the case.

(1) I asked you (in reply #11) why you criticized the numbers.

(2) You responded (in reply #12) regarding my misuse of the word "pertinent".

How did your response in reply #12, sensibly answer my query in reply #11?



(B)
There was no claim of mine, about "technology being wonderful".

My statement was that models have already achieved 10^14 artificial synaptic operations per second:
SOURCE: http://www.modha.org/blog/SC12/RJ10...926914893946751&cm_mc_sid_50200000=1489394675

That 10^14 figure was not a claim.

You didn't look it up, you have been spamming the same paper throughout this thread.
 
I don't believe you because...

And this...

And this...


You didn't look it up, you have been spamming the same paper throughout this thread.

No.

Read carefully.
I expressed that IBM achieved 10^14, and they did just that.

IBM: "We have simulated an unprecedented 2.084 billion neurosynaptic cores containing 53 × 10^10 neurons, and 1.37 × 10^14 synapses."

http://www.modha.org/blog/SC12/RJ10...685714895492773&cm_mc_sid_50200000=1489540048
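As a quick sanity check on the quoted figures (a minimal sketch using only the two numbers in the quote above; the truncated link is left as-is):

```python
import math

# Figures from the quoted IBM simulation summary.
neurons = 53e10     # 53 x 10^10 simulated neurons
synapses = 1.37e14  # 1.37 x 10^14 simulated synapses

# The simulation does reach the 10^14 synapse scale discussed in this thread.
assert synapses >= 1e14
assert math.floor(math.log10(synapses)) == 14

# Implied connectivity of the simulated network: a few hundred synapses per neuron.
print(round(synapses / neurons))  # 258
```

So the "10^14 synapses" statement matches the quote; note it says nothing about speed or efficiency, which is the separate point being argued here.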

Your writings show blindness in more ways than one.
 
I'm horrified that someone who thinks that figures differing by an order of magnitude are "roughly the same" could be let loose anywhere near Wiki. Could you do us all a huge favour and stay away from it, please. Leave it to people who have some regard for accuracy and truth.

Why do the writings of a majority of the beings here display slow understanding?

Anyway,

(1)
The rough value I was referring to involved, e.g., 10^15 times some multiplier.
So, where I write 10^15 somewhere, that is a rough sketch of 2 × 10^15.



(2)

You may have missed the post above:

ProgrammingGodJordan said:
I spent 2 more minutes to find evidence that IBM's 2.4 × 10^14 reference was proper:
Koch's text makes reference to 240 trillion synapses in human cortex on page 87. (This falls UNDER the quadrillion boundary of 10^15, and PARTICULARLY in the trillion range of 10^12+.)

I have updated wikipedia to reflect this truth.
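The order-of-magnitude claim in the quote above can be checked in a few lines (a minimal sketch; "trillion" is taken as the short-scale 10^12):

```python
import math

koch_synapses = 240e12  # Koch: 240 trillion synapses in human cortex
ibm_reference = 2.4e14  # IBM's 2.4 x 10^14 figure

# The two figures are the same number, written differently.
assert koch_synapses == ibm_reference

# It sits in the 10^14 range: above the trillions (10^12), below a quadrillion (10^15).
assert 1e12 < koch_synapses < 1e15
print(math.floor(math.log10(koch_synapses)))  # 14
```

This only confirms the two sources agree with each other; it does not settle which count is correct for the whole brain rather than the cortex.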
 
Why do the writings of a majority of the beings here display slow understanding? …

If you think that slurring your interlocutors in this conversation is a good way forward then you will soon fall foul of the rules. We aren't "beings", and I am not slow. It was you who wrote "10^15 is roughly 10^16". We've had long conversations about it… have you forgotten already?
 
