I do not see the point of this exercise. To me it seems like an attempt to rule out any consciousness other than the biological kind we already know.
Our eyes register a light wave with a wavelength of 505 nm. This is stored internally as "green light". If it were stored in a computer brain as hex 1F9, or as the ASCII string '505', it would still be the same thing: an internal representation of the wavelength 505 nm. For convenience we call it "green", and by that we mean "one of the wavelengths that are stored internally as green".
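To make the representation point concrete, here is a minimal sketch (the variable names are just for illustration) showing that decimal 505, hex 1F9, and the text "505" are all encodings of the same number:

```python
# Minimal sketch: the same wavelength (505 nm) under different encodings.
wavelength_nm = 505            # decimal integer
as_hex = 0x1F9                 # hexadecimal literal for the same number
as_ascii = "505"               # ASCII/text representation

assert wavelength_nm == as_hex          # 0x1F9 equals 505
assert int(as_ascii) == wavelength_nm   # the text decodes to the same value

print(hex(wavelength_nm))   # '0x1f9'
print(as_ascii.encode())    # b'505' -- the ASCII bytes 0x35 0x30 0x35
```

Whatever the encoding, what is stored is a label for the wavelength, nothing more.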
All of this is obvious, but why point it out all the time?
How would this make it impossible to have non-biological consciousness? I don't see any barrier there at all.
And phrases like "stored internally as green light" appear hopelessly vague to me.
The glaring problem with your post here, though, is that you equate something like "ASCII 505" with phenomenology. The two are completely different.
A label for the thing, and the thing itself, aren't the same.
You can build a machine that responds, in any number of ways, to the light that makes our brains perform green. It could do all sorts of things in response.
But it won't perform green.
First, there's no green in the light, so it can't get green from there.
You can't just make it respond to the light and say "There, it sees green", because, as we saw earlier, different brains do different things in response to such light (nothing, performing gray, performing green, performing other colors, performing smells, etc.). So the "green" label is (a) totally arbitrary, and (b) not inherent in the light in any way.
All of which means: if you want a machine to perform green, you have to BUILD it so that it performs that action.
You cannot simply build a machine that responds to the light our brains react to by performing green, and expect it to magically perform green as well when it was never designed and built to.
You can't say, "But it saw the green light" because there's no such thing as "green light".