• Quick note - the problem with Youtube videos not embedding on the forum appears to have been fixed, thanks to ZiprHead. If you do still see problems let me know.

Resolution of Transporter Problem

No. Of course not. But it's a reminder that we don't yet actually have a clear definition for what it is we're investigating. Because what happens for me with you and Pixy is you go charging off into wild claims that can't be substantiated. The "hard problem" is not yet adequately understood to be disregarded in the manner that you guys do. This fact is appreciated by professionals working in this field. But you two think it's all over and done with simply because you subscribe to one definition.

No. No to every statement in that paragraph.

I'm just stating the simple fact that there is currently no well accepted philosophical definition for the word "consciousness." This is true.

An apple is not an orange. This is true. It also, like your above statement, has nothing to do with this discussion.

I am pointing out that charging ahead to what you perceive as the finishing post in the manner you do is just dwelling in fantasy. As far as I can tell the majority of serious researchers still perceive big issues. They may be resolved over time or they may disappear as more and more of the "easy problems" are resolved, but there are still big issues here and now.

Are these "serious researchers" by any chance people such as Daniel Dennet or Susan Blackmore?

I don't think it's magical. But I do think there is a gulf between AI and human visual awareness.

What does that have to do with formal rules and formal definitions?

Do you think human consciousness can be described and modeled using the existing formal rules we have learned about the universe, i.e. mathematics? Yes or No?
 
Are these "serious researchers" by any chance people such as Daniel Dennet or Susan Blackmore?

No. It's more neuroscientists like Ramachandran or Baars, people actually working in this field. Read about GWT, RD. Read Dehaene and Naccache. Read Gazzaniga and Damasio. Read the people who are actually working in this area, trying to work out how the brain does it. Read them and then quote me back where and when they say there's no HPC. We have basic theoretical models that allow us to overcome the HPC, but they're not proven and there are issues, as I've repeatedly stated to you.

Why anyway do you pick on Dennet [sic] and Blackmore? Dennett would never agree there's a "hard problem" in a thousand years. He's more likely to eat his own beard than say the HPC is real, and I doubt Blackmore believes it's as hard as most other researchers.


What does that have to do with formal rules and formal definitions?

Do you think human consciousness can be described and modeled using the existing formal rules we have learned about the universe, i.e. mathematics? Yes or No?

It's possible. I think there will always be issues around selfhood and boundaries. Objectivity is a brain state, not an inherent condition of the universe. But, accepting a few provisos, I don't see why consciousness shouldn't be broadly understood in the future.

Nick
 
No. It's more neuroscientists like Ramachandran or Baars, people actually working in this field. Read about GWT, RD. Read Dehaene and Naccache. Read Gazzaniga and Damasio. Read the people who are actually working in this area, trying to work out how the brain does it. Read them and then quote me back where and when they say there's no HPC.

Um, why don't you cite instances of them saying there is an HPC, hmm? Why should I have to comb through journal articles when you apparently have such a handle on them?

We have basic theoretical models that allow us to overcome the HPC, but they're not proven and there are issues, as I've repeatedly stated to you.

Ahh, I was wondering when you would start backtracking.

Why anyway do you pick on Dennet

If I had to guess, I would say because you are obsessed with quoting him. Look at your sig line for God's sake.

It's possible. I think there will always be issues around selfhood and boundaries. Objectivity is a brain state, not an inherent condition of the universe. But, accepting a few provisos, I don't see why consciousness shouldn't be broadly understood in the future.

That is an excellent dualist non-answer.

I ask you whether you think consciousness can be mathematically described and you answer with "it's possible."

You are a dualist Nick.
 
Nick227 said:
Objectivity is a brain state, not an inherent condition of the universe.


That's an interesting statement, and it might be correct in a very broad sense. On the other hand, those two claims of certainty in and of themselves – that it is a brain state, and that it is not an inherent condition of the universe – are statements which seem to be "objective" if they are meant to be empirical ones (or maybe you mean them to be ontological?). At least they mimic those of what we ordinarily call objective statements about the world in general.

Ultimately there's no way of verifying them however, thus we're back at defining objectivity in the more traditional sense, which puts it back on the epistemological plane; mainly being a formal tool, a method, for acquiring data about the world. Hence we use objectivity as a principle; a verb rather than a noun; a description of a method rather than a state.
 
Um, why don't you cite instances of them saying there is an HPC, hmm? Why should I have to comb through journal articles when you apparently have such a handle on them?

I've already quoted both Baars and Ramachandran in this thread. When asked a direct HPC question Baars agreed it's an explanatory gap. Ramachandran I quoted as stating clearly that, when it comes to selfhood, neurology is only just scratching the surface.

So, as far as I'm concerned, I consider the case proven unless you can come up with some evidence to the contrary.


If I had to guess, I would say because you are obsessed with quoting him. Look at your sig line for God's sake.

I like Dennett. He takes a strong position. I like Dennett quotes too. The sig line one is good. Another fave of mine is where he says that if your theory of consciousness isn't counter-intuitive then it's just plain wrong. Great quotes.


That is an excellent dualist non-answer.

No. It's a realistic assessment of the situation.

Leading scientists in this field like Baars and Rama admit the HPC is still an issue. Would you call them dualists? They're realists, RD. They are making honest statements about the current state of the research.

I ask you whether you think consciousness can be mathematically described and you answer with "it's possible."

I said it's possible because to me there are still issues around the phenomenon of objectivity, which I then explained.

Nick
 
That's an interesting statement, and it might be correct in a very broad sense. On the other hand, those two claims of certainty in and of themselves – that it is a brain state, and that it is not an inherent condition of the universe – are statements which seem to be "objective" if they are meant to be empirical ones (or maybe you mean them to be ontological?). At least they mimic those of what we ordinarily call objective statements about the world in general.

Ultimately there's no way of verifying them however, thus we're back at defining objectivity in the more traditional sense, which puts it back on the epistemological plane; mainly being a formal tool, a method, for acquiring data about the world. Hence we use objectivity as a principle; a verb rather than a noun; a description of a method rather than a state.

Yes, I'm sure there are problems aplenty when one tries to create any kind of factor by which objectivity may itself be assessed. I just came to the position through being basically a non-dualist. For me visual phenomenology and thinking are distinct and it is the latter, primarily, that creates the sensation of there being subject-object relationships in the world around us. Such relationships are not inherent in consciousness in general.

ETA: visual phenomenology is not inherently objective, though what we look at is to a degree determined by brain processes that relate to selfhood. Thinking, as inner dialogue, renders visual phenomenology as being objective.

Nick
 
Nick227 said:
Thinking, as inner dialogue, renders visual phenomenology as being objective.

That is one way to look at it, provided we go by your way of dealing with objectivity. However, I don't think inner dialog is the necessary deciding factor here, although inner dialog and reasoning certainly elevate the distinction (which is why I don't think I'm the keyboard I'm typing on, unless I'm in a kind of state where no such perceived distinction is available). Btw. this is why I often link to Jill Bolte Taylor's TED-talk because she had a stroke in the left hemisphere, near the language area, which rendered her incapable of making such distinctions. Only when areas in the left hemisphere came "on-line" – as short bursts during the stroke – could she reason that she was having a stroke (she was/is a brain scientist after all). So that experience at least seems to come close to your point.

But we don't really know how much of a conceptual capacity there has to be in order to make the distinction you speak of. In fact, we might not even know if that is a fruitful line of questioning or reasoning about the issue. I would however refer to Gazzaniga and, for example, this interview: Plasticity, the Interpreter, and the Self (Journal of Consciousness Studies: No. 5—6, 1998).

It's pretty clear that inner dialog is one of the major factors, but there might also be other areas in the brain which could contribute to the issue of "separation". It might even be the case that inner dialog kicks in as a sort of post hoc commentary after the distinction is already made at a more "rudimentary" level, but not yet made available as a fully fledged narrative distinction. Thus, in this scenario, inner dialog would elevate the distinction and give it meaning (including the meaning of distinction as, well... distinction).
 
When asked a direct HPC question Baars agreed it's an explanatory gap.

No. It was not a "direct HPC question." It was the kind of question that shows we don't need to work in the context of the HPC to research consciousness.

Look up HPC if you don't believe me. Your problem is that you interpret any "difficult" question to be part of the HPC. This is just wrong.

Ramachandran I quoted as stating clearly that, when it comes to selfhood, neurology is only just scratching the surface.

Lol, see?

So, as far as I'm concerned, I consider the case proven unless you can come up with some evidence to the contrary.

Likewise.

No. It's a realistic assessment of the situation.

Leading scientists in this field like Baars and Rama admit the HPC is still an issue. Would you call them dualists? They're realists, RD. They are making honest statements about the current state of the research.

Really? You can quote them as stating that the "hard problem of consciousness" is still an issue?

Because it is quite different to think there are still difficult issues than it is to think that the "hard problem of consciousness" is a real problem.

I said it's possible because to me there are still issues around the phenomenon of objectivity, which I then explained.

So you are not sure whether mathematics can describe objectivity?

Guess what, Nick -- you are not a materialist. The single tenet that all types of materialism agree upon is that every aspect of the universe can be mathematically described.
 
No. It was not a "direct HPC question." It was the kind of question that shows we don't need to work in the context of the HPC to research consciousness.

Look up HPC if you don't believe me. Your problem is that you interpret any "difficult" question to be part of the HPC. This is just wrong.

OK, let's look at it again.

Blackmore vs Baars said:
Blackmore: But there still seems to be a mystery here to me, that what you're saying is that the difference between a perception that's unconscious and one that's conscious is a matter of which bit of the brain the processing is going on in. How can one bit of the brain with neurons firing in it be conscious, where another bit of the brain with very similar neurons firing in a very similar way is not? Don't we still have this explanatory gap?

Baars: There are a lot of explanatory gaps. We are in the study of consciousness where Benjamin Franklin was in the study of electricity around 1800: he knew of a number of basic phenomena, and he might have known about the flow of electricity, and the usefulness of the stream metaphor - that things go from one place to the other, a little like the flow of water; that you can put resistors into the circuit, which are a little bit like dams. You have a useful analogy at that point in understanding electricity, which actually turns out to be not bad; but you have to improve it. So we're at a very primitive stage, but there are a few things that we can say. (Blackmore 2005)

Blackmore's question is a direct hard problem question. She's asking what creates the qualitative difference between brain processing which is conscious and that which is unconscious. That's clearly a hard problem question. If you don't agree then I can only assume that you simply do not understand the HPC. It's about subjectivity, RD. Blackmore is asking: what is the material difference between conscious and unconscious processing? That is what the HPC actually is. It's to provide a materialist account of subjective phenomena.

Lol, see?

RD, if you'd ever actually read anything by Rama you would know that he repeatedly asserts that the HPC is fundamentally an issue of self.

Frankly, all that this dialogue is now proving to me is that it's not possible for you to understand the HPC. If you want to dispute this then please point out to me just how you feel Blackmore is not asking an HPC question. Please just do one thing to demonstrate you do actually have even a vague grasp of the subject matter, otherwise I'm left feeling like I'm trying to communicate with a particularly antagonistic chatbot.

ETA: to be honest, it wouldn't surprise me at this juncture to learn that you don't actually exist as a person but are only some prototype discussion programme Pixy has created in his AI endeavours.

Really? You can quote them as stating that the "hard problem of consciousness" is still an issue?

Well, I don't have to hand quotes from Baars or Rama stating exactly "I believe in the HPC." I use the fact that I understand the material to draw conclusions from what they write. If you read Baars' main book, he starts off by emphasising that we still don't have a functional neuroanatomical interpretation for differentiating conscious processing from unconscious processing. To me this is identical with saying "We haven't overcome the HPC," but to you apparently it's quite different.

For me, you just don't understand the material and you don't understand the perspectives of those who write professionally on the subject. Possibly you understand something about AI. I don't know enough about it myself to assess this.

Nick
 
That is one way to look at it, provided we go by your way of dealing with objectivity. However, I don't think inner dialog is the necessary deciding factor here, although inner dialog and reasoning certainly elevate the distinction (which is why I don't think I'm the keyboard I'm typing on, unless I'm in a kind of state where no such perceived distinction is available). Btw. this is why I often link to Jill Bolte Taylor's TED-talk because she had a stroke in the left hemisphere, near the language area, which rendered her incapable of making such distinctions. Only when areas in the left hemisphere came "on-line" – as short bursts during the stroke – could she reason that she was having a stroke (she was/is a brain scientist after all). So that experience at least seems to come close to your point.

I haven't seen her TED presentation but I read about what happened to her.

I mean, coming more from a background of meditation, it's clear to me that a lot of selfhood is wrapped up in inner dialogue, but not all of it. For a start there is a sense of body map and this persists whether thinking is taking place or not. Secondly, there are reactions. The body will still react to threats, sexual stimuli or food stimuli whether thinking is taking place or not. Thirdly, and connected to #2, what we are consciously aware of still seems to be dictated by selfhood-related processing. That's to say, the cortico-thalamic loop, or whatever, directs visual conscious awareness according to its programming, which is acquired genetically and reflects the needs of self, of the organism. Thus, even if you have no conscious sense of self for a period, you will still respond to certain situations as though you have.

These aspects aside, it's all inner dialogue and mentation from inner dialogue if you ask me. Dennett, when facing these issues, takes care to distinguish biological selfhood from narrative or psychological selfhood.

But we don't really know how much of a conceptual capacity there has to be in order to make the distinction you speak of. In fact, we might not even know if that is a fruitful line of questioning or reasoning about the issue. I would however refer to Gazzaniga and, for example, this interview: Plasticity, the Interpreter, and the Self (Journal of Consciousness Studies: No. 5—6, 1998).

It's pretty clear that inner dialog is one of the major factors, but there might also be other areas in the brain which could contribute to the issue of "separation". It might even be the case that inner dialog kicks in as a sort of post hoc commentary after the distinction is already made at a more "rudimentary" level, but not yet made available as a fully fledged narrative distinction. Thus, in this scenario, inner dialog would elevate the distinction and give it meaning (including the meaning of distinction as, well... distinction).

Yes, that sounds right to me. Thanks for the link, I'll check it out. Personally, I think Blackmore is right with her memetic perspective. We have a biological self, which has needs, but with the development of these vast brains this slowly transformed into an immense, top-heavy "user illusion" with all its fantasies, foibles, avoidances and other follies. To create an authentic Self from all this dross I think is quite a challenge, and largely a return to the needs of the biological self.

Nick
 
Frankly, all that this dialogue is now proving to me is that it's not possible for you to understand the HPC. If you want to dispute this then please point out to me just how you feel Blackmore is not asking an HPC question. Please just do one thing to demonstrate you do actually have even a vague grasp of the subject matter, otherwise I'm left feeling like I'm trying to communicate with a particularly antagonistic chatbot.

From wikipedia ( http://en.wikipedia.org/wiki/Hard_problem_of_consciousness ):

"Why should physical processing give rise to a rich inner life at all?"
"How is it that some organisms are subjects of experience?"
"Why does awareness of sensory information exist at all?"
"Why do qualia exist?"
"Why is there a subjective component to experience?"
"Why aren't we philosophical zombies?"
"Phenomenal Natures are categorically different from behavior"

Blackmore was not asking an "HPC question." HPC questions do not have material answers. That is why so many people do not recognize the HPC as a valid problem -- as soon as you try to provide a material answer, proponents take a step back and turn it into another question. It is about as productive as going on a snipe hunt.

You yourself have demonstrated this perfectly. Pixy and I gave you very understandable and very plausible explanations for why you "see" a "monitor." What did you do? You insisted that we weren't talking about what you were talking about and you refused to provide any operational definitions so we could modify or further explain. That is the HPC, sir. The HPC manifests itself when a person like you discounts an explanation on the basis of "intuition" or "gut feeling" rather than reason.

Now, do you think Susan Blackmore would act the same way? Or would she give rational counter-arguments for why she didn't think our explanation was satisfactory to explain observed behavior? I suppose she might act like you, in which case I would have to admit she was asking an "HPC question," but I am giving her the benefit of the doubt.
 
Blackmore vs Baars said:
Blackmore: But there still seems to be a mystery here to me, that what you're saying is that the difference between a perception that's unconscious and one that's conscious is a matter of which bit of the brain the processing is going on in. How can one bit of the brain with neurons firing in it be conscious, where another bit of the brain with very similar neurons firing in a very similar way is not? Don't we still have this explanatory gap?

Baars: There are a lot of explanatory gaps. We are in the study of consciousness where Benjamin Franklin was in the study of electricity around 1800: he knew of a number of basic phenomena, and he might have known about the flow of electricity, and the usefulness of the stream metaphor - that things go from one place to the other, a little like the flow of water; that you can put resistors into the circuit, which are a little bit like dams. You have a useful analogy at that point in understanding electricity, which actually turns out to be not bad; but you have to improve it. So we're at a very primitive stage, but there are a few things that we can say. (Blackmore 2005)

This might be a subtle point, but I also don't think the aforementioned exchange is exactly representative of what Chalmers coined as the "hard" problem. What Blackmore and Baars seem to be talking about is more about explaining the difference between unconscious and conscious, not the extra step of subjective experience when already conscious of something. I think it's within the extra step where Chalmers and his "hard" problem comes in (or so he at least thinks).


In Chalmers' own words from the Journal of Consciousness Studies, 1995:
Chalmers said:
The really hard problem of consciousness is the problem of experience. When we think and perceive, there is a whir of information-processing, but there is also a subjective aspect. As Nagel (1974) has put it, there is something it is like to be a conscious organism. This subjective aspect is experience. When we see, for example, we experience visual sensations: the felt quality of redness, the experience of dark and light, the quality of depth in a visual field. Other experiences go along with perception in different modalities: the sound of a clarinet, the smell of mothballs. Then there are bodily sensations, from pains to orgasms; mental images that are conjured up internally; the felt quality of emotion, and the experience of a stream of conscious thought. What unites all of these states is that there is something it is like to be in them. All of them are states of experience.
 
From wikipedia ( http://en.wikipedia.org/wiki/Hard_problem_of_consciousness ):

"Why should physical processing give rise to a rich inner life at all?"
"How is it that some organisms are subjects of experience?"
"Why does awareness of sensory information exist at all?"
"Why do qualia exist?"
"Why is there a subjective component to experience?"
"Why aren't we philosophical zombies?"
"Phenomenal Natures are categorically different from behavior"

Blackmore was not asking an "HPC question." HPC questions do not have material answers. That is why so many people do not recognize the HPC as a valid problem -- as soon as you try to provide a material answer, proponents take a step back and turn it into another question. It is about as productive as going on a snipe hunt.

I am a materialist. I do not believe in the HPC, but I recognise the validity of the line of questioning which gives rise to it. It is natural. It is natural because someone who is actually trying to work out how the brain creates conscious experience is also human, and naturally wants to know the answers themselves. This viewpoint is shared by many that I've read who research consciousness, certainly including Baars, Ramachandran and Blackmore.

Now, if you're just doing AI then you're coming from a different place. You don't need to demonstrate how the brain achieves all the effects that it does. But brain researchers do.

The last few pages of this thread have been me objecting to statements made by Pixy in particular where he claims that virtually all researchers see no hard problem. This is patent nonsense as even a cursory examination of people's published writing will confirm.

You and Pixy seem to believe that it is all a question of semantics and language, that if we stop looking at the problem a certain way it will go away. Actually, I agree, and have said the same many times, but only to a degree. There are still issues. We simply do not know if what goes on inside a machine is experientially analogous to human consciousness. And the only real way to find out is to continue examining the brain as though there is an HPC. In actual research it is not so meaningful to get around the problem simply by rephrasing it.

Nick

ETA: why do you think HPC questions don't have material answers?
 
This might be a subtle point, but I also don't think the aforementioned exchange is exactly representative of what Chalmers coined as the "hard" problem. What Blackmore and Baars seem to be talking about is more about explaining the difference between unconscious and conscious, not the extra step of subjective experience when already conscious of something.

How do you see a step here? Are you saying you can be conscious without subjectivity?

Nick
 
So you are not sure whether mathematics can describe objectivity?

It's not completely clear for me, but I figure it could be possible.

Guess what, Nick -- you are not a materialist. The single tenent that all types of materialism agree upon is that every aspect of the universe can be mathematically described.

Well, it's more about mathematically describing consciousness, though this could amount to the same, I guess. But basically I agree that it can be done in theory, given certain provisos around incompleteness and possibly selfhood, like I mentioned.

Nick
 
How do you see a step here? Are you saying you can be conscious without subjectivity?

Well, if you read the quote from Chalmers and compare it to what Blackmore and Baars are talking about there seems to be a distinction. B&B seem to be discussing why certain actions in the brain do not register as conscious (A) whereas others do (B). What Chalmers seems to be talking about is the subjective experience of being conscious of something (perhaps the "redness of the red" when conscious of a red rose), i.e. in conjunction with B. Or perhaps more like B+1, B+2... B+n (where 1 is experiencing red, 2 is experiencing blue etc.)?

Calling it an "extra step" might be badly phrased though (it's not necessarily linear like that), but a distinction of what kind of explanatory gap they're talking about seems to be there.
 
You yourself have demonstrated this perfectly. Pixy and I gave you very understandable and very plausible explanations for why you "see" a "monitor." What did you do? You insisted that we weren't talking about what you were talking about and you refused to provide any operational definitions so we could modify or further explain. That is the HPC, sir. The HPC manifests itself when a person like you discounts an explanation on the basis of "intuition" or "gut feeling" rather than reason.

The answer you gave was not to the question I was asking, as I tried to point out. Blackmore posed the same question to Baars and he agreed it was an explanatory gap.

I guess what bugs me is that you actually seem to believe I don't understand your answer, that somehow I have failed to grasp it. Maybe this is my trip. But, look, please please try and understand something here. No one has a problem understanding that attentional loops can be switched between differing input streams. It's not rocket science and there is already a basic neurological model created by Gerald Edelman. But this is not what I was asking, which you don't appreciate. When Baars is asked an identical question he answers that, yes, it is an explanatory gap. He doesn't repeatedly ask for definitions or try and wriggle out of the question. He's honest and he has the background to give a definitive answer. Of course, he also knows Blackmore won't let him off the hook if he comes out with a bunch of blather - a skill I evidently haven't developed yet.



Nick
 
Well, if you read the quote from Chalmers and compare it to what Blackmore and Baars are talking about there seems to be a distinction. B&B seem to be discussing why certain actions in the brain do not register as conscious (A) whereas others do (B).

Yes. They're discussing how near-identical neuronal processes can occur in parallel and yet only one is conscious.

What Chalmers seems to be talking about is the subjective experience of being conscious of something (perhaps the "redness of the red" when conscious of a red rose), i.e. in conjunction with B. Or perhaps more like B+1, B+2... B+n (where 1 is experiencing red, 2 is experiencing blue etc.)?

Calling it an "extra step" might be badly phrased though (it's not necessarily linear like that), but a distinction of what kind of explanatory gap they're talking about seems to be there.

I would agree that, examining just the words themselves, one might feel an extra step was being alluded to. But to me your statement "subjective experience of being conscious of something" is not the HPC, but something else. It is to recognise the state of being aware.

For me, Chalmers' HPC in actuality really is subjective experience itself. It is that there are colours around us in the first place. And this is what Blackmore is asking Baars - What material process makes this data stream conscious whilst its neighbour is not?

The HPC deniers (GWT school) will respond that it is simply that an attentional loop, mediated by thalamic activity, is selecting one stream to be "broadcast" whilst an adjacent stream remains unconscious. But this will not satisfy the HPC upholders as they don't believe that such an apparently minor action can fully account for all the wonders of visual consciousness.

Nick
 
Nick227 said:
For me, Chalmers' HPC in actuality really is subjective experience itself. It is that there are colours around us in the first place. And this is what Blackmore is asking Baars - What material process makes this data stream conscious, when it's clear that it need not be.


But that's not really in accordance with how Chalmers coined it, is it? But sure, if you redefine the HPC as you have, you might be right.
 
But that's not really in accordance with how Chalmers coined it, is it? But sure, if you redefine the HPC as you have, you might be right.

I don't think I'm redefining here. Can we look closely...

Chalmers said:
The really hard problem of consciousness is the problem of experience. When we think and perceive, there is a whir of information-processing, but there is also a subjective aspect. As Nagel (1974) has put it, there is something it is like to be a conscious organism. This subjective aspect is experience. When we see, for example, we experience visual sensations: the felt quality of redness, the experience of dark and light, the quality of depth in a visual field. Other experiences go along with perception in different modalities: the sound of a clarinet, the smell of mothballs. Then there are bodily sensations, from pains to orgasms; mental images that are conjured up internally; the felt quality of emotion, and the experience of a stream of conscious thought. What unites all of these states is that there is something it is like to be in them. All of them are states of experience.

These states are all simple conscious experience. They surround us constantly. I don't see Chalmers saying that it's necessary to introspect upon them or to be aware that they are taking place in order for them to exist. Introspection is needed only to notice that these things that constantly surround us might seem not to be created by simple material processing.

Nick
 
