The problem is that you're putting the cart way before the horse. We have not yet identified what physically constitutes our consciousness, and computational descriptions are not a sufficient substitute.
That's quite an assertion. What is your definition of 'consciousness' and 'conscious', and which parts of it do you claim to know about? What neurology have you actually studied to make this claim, I mean.
I have not said that computation is sufficient. What I might argue is that we can model consciousness and its subprocesses.
At this point, computation has just become a "god of the gaps" explanation. Computationalism isn't science; it's a placating ideology that distracts AI researchers from the fact that they really have no idea what consciousness is. They're Columbuses who fervently want to believe they've found the western route to the East Indies when, in actuality, they've been completely sidetracked.
Not really relevant; you are talking about things I have not addressed.
It's entirely unjustified to assume that merely emulating the computational functions of our neurons is sufficient to produce consciousness -- especially when we have not yet discovered what consciousness is or how it is produced in the first place.
Maybe you should start with the definition?
It’s a flat fact that we do not know what consciousness is or understand how the chemistry/physics of the brain produces subjective experience.
Now see, this is just a bald assertion; it shows that you have committed what I would call a definitional category error. You have not presented a definition of consciousness, you have no grounding in the study of it, and yet you make this sweeping claim anyway.
So let’s discuss two things: the definition itself, and what you know about the study of each part of that definition.
BTW, many posters are wrong on this point: we do know how neurons work, we are refining that knowledge, and the processes in some areas are now well understood (see the sketch below).
So this categorical statement is just strange.
Just because you do not understand something does not mean it doesn't exist.
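To be concrete about the neuron point: below is a minimal sketch of a leaky integrate-and-fire neuron, one of the standard simplified computational models of neural firing. The parameter values are illustrative defaults, not taken from any particular study, and this is a toy sketch rather than a claim about how any specific brain region works.

```python
import numpy as np

def simulate_lif(input_current, dt=0.1, tau=10.0, v_rest=-65.0,
                 v_reset=-70.0, v_threshold=-50.0, r_m=10.0):
    """Leaky integrate-and-fire: integrate membrane voltage, spike at threshold.

    input_current : injected current per time step (nA)
    dt            : time step (ms)
    tau           : membrane time constant (ms)
    r_m           : membrane resistance (MOhm)
    """
    v = v_rest
    voltages, spikes = [], []
    for i_t in input_current:
        # Euler step of dV/dt = (-(V - V_rest) + R*I) / tau
        v += dt * (-(v - v_rest) + r_m * i_t) / tau
        if v >= v_threshold:
            spikes.append(True)
            v = v_reset  # reset the membrane after a spike
        else:
            spikes.append(False)
        voltages.append(v)
    return np.array(voltages), np.array(spikes)

# Example: a constant 2 nA input for 100 ms produces a regular spike train.
v_trace, spike_train = simulate_lif(np.full(1000, 2.0))
print(f"{spike_train.sum()} spikes in 100 ms")
```

The point is only that neuron-level dynamics are the kind of thing we can and do model quantitatively; it says nothing by itself about consciousness.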
The question science can't answer isn't about how brains store information; it is about why consciousness should be needed at all as part of the process. Why couldn't brains store information without generating any "consciousness"?
Now you are just conflating concepts and making vague arguments. How much of memory are we talking about here? What kind of retrieval, and over what time frames? Memory is also not modeled as 'storage' like a hard disk in many accounts; in many models it is fragmentary and reconstructed at recall.
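To make that contrast concrete, here is a deliberately toy illustration of the difference between reading back an intact record and rebuilding one from cue-tagged fragments. It is not a neuroscience model; the episode and cue names are entirely made up for the example.

```python
class ReconstructiveMemory:
    """Toy contrast with 'hard disk' storage: recall is stitched from fragments."""

    def __init__(self):
        self.fragments = {}            # cue -> fragment of some episode

    def encode(self, episode):
        # An episode is stored as separate cue-tagged fragments,
        # not as one intact record.
        self.fragments.update(episode)

    def recall(self, cues):
        # Reconstruction: stitch together whatever fragments the cues reach.
        found = [self.fragments[c] for c in cues if c in self.fragments]
        return " / ".join(found) if found else "(nothing retrieved)"

memory = ReconstructiveMemory()
memory.encode({"where": "at the beach", "when": "summer 2019", "who": "with Sam"})

print(memory.recall(["where", "who"]))          # partial, cue-dependent recall
print(memory.recall(["where", "when", "who"]))  # fuller reconstruction
print(memory.recall(["why"]))                   # missing cue -> nothing comes back
```

What you get out depends on which cues are available at retrieval, which is the sense in which recall is reconstruction rather than playback.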
So have I demonstrated the ‘uniqueness of consciousness’? Is it still just a ‘magical association’?
Your magical assertion was that 'consciousness' and 'QED' were linked, by nothing more than vague allusion.
Answer whenever…
As for me, I don’t have the background to explore the chemistry/physics of how subjective experience occurs (except in general terms).