Pretty sure you mean 'deifies' humans, or something like that.
Though thanks for the chuckle (of the sort I provide on a regular basis).
Well, I was partially serious.
I don't think it is valid to assume that, just because some other human is similar to you, they probably experience the same conscious states as you. People who think that don't understand the nature of neural networks. It just doesn't work that way.
I don't assume my consciousness is anything like yours, at all. Yeah, we both see in color, and we might have similar basic top-level sensory perception networks, but everything else in our brains is 100% unique to us. The topology of our networks might be similar, but only in the way fingerprints are similar.
If I hooked your hearing up to my brain, the result would likely be incomprehensible noise. If I loaded your memories, they would be nonsense. If I tried to see with your visual cortex, it would be mayhem.
So I say people anthropomorphize humans because they make assumptions about how similar another human's experience is to their own.
As an example, think about language. French people think in French. That is crazy to me: they don't just replace the nouns and verbs with French equivalents, their entire sentence structure is different, and their thoughts are consequently structured in a different order. And French isn't even that different from English. Think about the Asian languages! The conscious experience of a Japanese speaker is likely very different from mine. Similar in some ways, yes, but not entirely the same either.
It isn't that computationalists assume too much. We assume only what is valid. Everyone else takes it too far in the other direction.