Are you really saying that we don't have heads? That seems very poetic.
I think that your and my sensation of having a head is a strong indication that we do, in fact, have one, but it’s not something we can be sure about.
To clarify - when I say that eyes focus inside the head, I mean that very literally, just like saying a camera lens focuses onto the film or sensor.
Which, as I say, is the ‘homunculus fallacy’. It isn’t true, and unless you’ve never, ever seen an optical illusion or had a sensory hallucination of any sort in your life you know perfectly well that it isn’t true.
I'm not sure if you are claiming here that there is no external reality, in which case I am not necessarily disagreeing, but that's quite another topic.
There is an external reality. And somewhere in that external reality there’s an organism, descended from generations of organisms who have lived long enough to reproduce, and therefore must have mechanisms that allow them to relate to other objects in the world in a way that facilitates this.
The random exigencies of evolution have equipped this organism with a specific mechanism for doing this, by means of which it integrates the data it’s getting from itself and the data it’s getting from the external environment into a best-guess simulation, tuned to the key tasks of survival and reproduction, of where it is in relation to everything else. To represent ‘where it is’, it has to differentiate between those bits of the world that belong to it and those that don’t.
So it generates ‘you’ – or at least those bits of you about which it’s got reliable neural data (how does your spleen feel right now?) – so that it has something against which to represent its rather less reliable data on anything that isn’t itself but which it might need to do something about.* In its simulation it gives ‘you’ a ‘head’ with eyes and ears etc. because that’s what it needs to do to make the whole thing work. It seems logical that this corresponds with whatever’s ‘actually happening’, because if your physical boundaries in relation to the world weren’t being represented as at least roughly what they actually are, you’d bump into things a lot more than you do. But equally (if less plausibly), you could ‘actually’ be some kind of, say, vast, bubbling algal mat flopping through the atmosphere of a gas giant, and the whole thing could be a convenient fiction that just happens to be the best way to make sure ‘you’ get soft fruit and gazelle (or whatever ‘soft fruit and gazelle’ actually are in the objective world).
*IIRC, there’s some sketchy neurological evidence that what it actually does, bizarrely, is make a whole series of up-front guesses about what might be ‘in here’ and ‘out there’ and then do a ‘plausibility check’ against the incoming sensory data to select the one it’s going to run. Which sounds like exactly the sort of crazy, topsy-turvy result of evolution one would expect. And it also provides a neat explanation for dreams as what happens when this guessing carries on without the checking step.
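The footnote’s ‘guess first, then plausibility-check’ loop can be caricatured in a few lines of Python. Everything here is invented for illustration – the candidate world-models, the sensory cue, and the likelihood numbers are all made-up stand-ins, not a claim about actual neuroscience:

```python
import random

# Toy sketch of the 'guess first, check against the senses' idea: the system
# proposes several candidate world-states up front, then scores each against
# incoming sensory data and runs the most plausible one.

CANDIDATES = ["head in front of a screen", "head in a forest", "vast algal mat"]

# Assumed (invented) likelihoods: how well each candidate predicts a sensory cue.
LIKELIHOOD = {
    ("head in front of a screen", "glowing rectangle"): 0.90,
    ("head in a forest", "glowing rectangle"): 0.05,
    ("vast algal mat", "glowing rectangle"): 0.01,
}

def select_model(sensory_cue, candidates=CANDIDATES):
    """Plausibility check: pick the up-front guess that best fits the data."""
    return max(candidates, key=lambda c: LIKELIHOOD.get((c, sensory_cue), 0.0))

def dream(candidates=CANDIDATES):
    """Guessing without the checking step: any candidate may get run."""
    return random.choice(candidates)

print(select_model("glowing rectangle"))  # → head in front of a screen
```

With the checking step, the guess that best predicts the incoming data wins; strip the check out, as in `dream()`, and the system simply runs whatever it happens to generate – which is the footnote’s gloss on dreaming.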