bokonon: your post does not answer my point.

I am not claiming that a sufficiently complex system other than the human brain must (or is likely to) give rise to mind. You are claiming that it cannot (or almost certainly won't). Given the incalculably large number of non-biological complex systems in the universe, the claim that there are probably no non-biological minds amounts to saying that the probability for any individual system is essentially zero. I am saying that there is no justification for that opinion.
I don't assume that mere complexity gives rise to thought; I don't think anyone would. There are numerous natural and human-created complex systems on our planet, for instance, that plainly don't have minds. I think human brains evolved in response to survival pressures which don't apply in the case of "the complex arrangement of atoms in my shoe" or the universe as a whole.
...
If we ever reach the point where we are surrounded by intelligent machines, a lot of design trial and error will have gone into making them intelligent. I don't expect to see intelligence arise in machines as an emergent property, as in "Skynet became self-aware at 2:14am EDT August 29, 1997." More powerful adding machines are still only more powerful adding machines. They may become faster, but they won't become intelligent.
So far, you have said that the only minds we know of depend on vehicles that are the product of natural selection (indirectly in the case of human-designed thinking machines, if these turn out to be possible). Obviously I completely agree with this – but it misses my argument. My hypothesis depends on three postulates:
1. Sufficiently complex (whatever that means) entities other than evolved, living, reproducing creatures can possess the property of mind.
2. Sufficiently complex systems/entities of the right type (whatever that means) can arise in the absence of biological/evolutionary processes.
3. For any sufficiently complex system of the right type, the probability of emergent mind is non-negligible.
Postulate 1 will be proved when we succeed in creating thinking machines. That was my reason for introducing them into the discussion – I'm not saying they address the other two requirements. I don't know how we could go about testing postulates 2 and 3, but I believe (I have faith) that they are in principle testable, and we will discover how to do so.
The crux of my argument is that, whilst there has to be some mechanism whereby a sufficiently complex system can arise – and natural selection is certainly such a mechanism – the emergence itself is a completely separate phenomenon. It's physics (or possibly maths), not biology. Our brains evolved by natural selection, but our minds most certainly did not. Minds emerged automatically when our brains reached the necessary threshold of complexity – and this complexity was selected for by pressures completely unrelated to the requirement (or ability) to prove Fermat's Last Theorem, construct syllogisms, compose symphonies and sonnets, invent the internet, design non-biological thinking machines, etc.
If you are saying that natural selection is in principle the only possible mechanism for producing mind-supporting complexity, then be aware that your argument is dangerously close to Paley's watchmaker: it is impossible for us to envisage any way that entities sufficiently complex to support minds could have come into existence other than by biological natural selection (us) or divine creation (Paley); therefore we can assume that there is no other way. I can't refute the argument (at least, not without demonstrating an alternative mechanism), but I maintain that it is logically unsound.
What kind of complexity is required for emergent mind to be possible is a most interesting question. For instance, the substance – does it have to be biological? The size of the system, in terms of the number of elements and connections, obviously matters. Is the precise pattern and mode of interaction crucial? Opinions, anyone?