• Quick note - the problem with YouTube videos not embedding on the forum appears to have been fixed, thanks to ZiprHead. If you do still see problems, let me know.

Computer software that knows what you are thinking

On the other hand, this is one more technology that would make it harder to revolt against a despotic government. As long as there's no way to consolidate its power in a small group of people, I'd support it. We already have nukes in the hands of a small group of people, don't we? So start from square one if you don't like this sort of deal.

Our government *is* becoming increasingly despotic. And our President, under the guise of "executive privilege", is consolidating more power in himself. And even if this technology were used outside the government, the government, with its far greater reserves of money, would have a LOT more of these things than would be available elsewhere, and would almost always use them for law enforcement purposes.

Even if a few existed outside of the government, the government would be using them to violate the privacy of individuals. And in my opinion freedom of thought, and basic privacy are sacrosanct.


INRM
 
INRM - did you read the study? (Not the article, the actual study linked to by 69dodge.) We're not talking about some sort of "thought ray gun" that some gov't operative can station outside your house to covertly read your every thought; we're talking about an fMRI. From the Wikipedia article:
Subjects participating in an fMRI experiment are asked to lie still and are usually restrained with soft pads to prevent small motions from disturbing measurements. Some labs also employ bite bars to reduce motion, although these are unpopular as they can cause some discomfort to subjects. It is possible to correct for some amount of head movement with post-processing of the data, but large transient motion can render these attempts futile. Generally motion in excess of 3 millimeters will result in unusable data.
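To make that 3 mm figure concrete, here is a toy sketch (not a real fMRI pipeline; the motion data below is simulated) of how one might flag scan volumes whose frame-to-frame head motion exceeds the threshold:

```python
import numpy as np

# Simulated head-motion parameters: translations in mm (x, y, z) for 100
# scan volumes. Real pipelines estimate these during motion correction.
rng = np.random.default_rng(0)
translations = rng.normal(0.0, 1.5, size=(100, 3))  # hypothetical data

# Framewise displacement: summed absolute translation between consecutive
# volumes (a crude version of the metric real pipelines compute).
fd = np.abs(np.diff(translations, axis=0)).sum(axis=1)

# Flag volume transitions under the ~3 mm limit quoted above.
usable = fd < 3.0
print(f"{usable.sum()} of {len(fd)} volume transitions under 3 mm")
```

The point is just how tight the tolerance is: a subject who moves a few millimeters, awake or asleep, wipes out the data.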

Warrantless wiretapping is a terrible analogy for this technology. If the government wanted to use this to spy on citizens, it would have to be in the context of a prolonged detention and interrogation. Furthermore, Mythbusters showed that interpreting fMRI data gathered from a non-cooperative subject (i.e., trying to use an fMRI as a lie detector) can be tricky at best.

Furthermore, I would have to disagree with your statement that there is "good science" and "bad science" or "good tech" and "bad tech." Science or technology developed with good intentions can certainly be used for unethical purposes, and technology developed with bad intentions can certainly be used for worthy purposes. If an fMRI machine is used unethically to interrogate illegal detainees, does this mean the fMRI is "bad technology"? I think the thousands of doctors who have used fMRIs to diagnose epilepsy, Alzheimer's, etc., would disagree. On the other side of the coin, Oppenheimer is still criticized for his work on the nuclear bomb, but the same physics is being used today to reveal the inner workings of quantum particles. Does that exonerate Oppenheimer? Probably not, but it's undeniable that his work towards creating stable environments for nuclear reactions is priceless.
 
The reason I used the warrantless wiretapping analogy was largely to illustrate that we already live under a government that has little regard for human rights or privacy. And there is a risk it would be used as an interrogation tool.

When I say bad science, I mean that even if it has some good uses, it has such potential for misuse that it is more "bad" than it is "good". Hence, bad science.


INRM
 
sthompson, I am guessing you don't realize that tech usually doesn't stay where it starts; if that were the case, we wouldn't have a lot of the things we do now. The results of this could very well be taken and turned into a "thought ray gun," as you put it. Simply because at this time the tech is large and undeveloped does not mean it will stay that way. Given the state the world is in, I feel that most governments will attempt to utilize something like this and even attempt to explain its use as beneficial, such as talking to coma patients, while at the same time developing it militarily, as well as for possible police uses to "stop crime before it starts."

Also, how long before you are able to buy a computer with it already built in, like most are with modems, thereby taking most of the "choice" out of it? Kinda scary when you really think about it. Of course, if you believe the conspiracy people, then it is already out there, you know, because the government is always one step ahead of everyone else technologically.
 
This argument seems really pointless, as neither INRM nor Cold one has provided a) any evidence that this technology can ever be used to "read minds against our will" (meaning, I expect one of you to do some preliminary research into how fMRIs work) or b) any rational method to replace this technology with other technology equally beneficial yet with less potential for abuse.
 
There might be a possibility that such a piece of technology could evolve in leaps and bounds, to the point that it goes well beyond its original intention and perhaps could be done from a distance.

Cold one actually makes good points.

The technology has basically 3 good uses and 15 bad ones.


INRM
 
There might be a possibility that such a piece of technology could evolve in leaps and bounds, to the point that it goes well beyond its original intention and perhaps could be done from a distance.

Cold one actually makes good points.

The technology has basically 3 good uses and 15 bad ones.

Well, if someone with your qualifications and technical knowledge says it, then it must be true...

...um, wait, what ARE your qualifications to make such an announcement? Because it seems to me that there are inherent limitations in fMRI technology that would prevent usage at a distance.

1) Since fMRI studies blood flow in the brain, the reaction time is slow - it is limited by how fast blood can move through capillaries in the brain. If some eeevil organization wanted to use fMRI at a distance, they would have to devise some means of precisely targeting an individual over a long period of time.

2) Remember that 3mm movement resolution? That means the person's brain would have to be targeted with an error of less than 3 mm, all while that person is completely ignorant of the scan. Even while we're asleep, we move much more than this.

3) Furthermore, while the radio waves used for fMRI are non-ionizing, they are by necessity strong enough to heat up metal objects. I don't know how you would explain the resulting injury, equipment destruction, and fires that would be caused.
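Point 1 can be made concrete with the commonly cited "double-gamma" model of the hemodynamic response: the blood-flow signal fMRI measures develops over seconds after the underlying neural activity. The parameter values below are textbook defaults used purely as an illustration, not taken from any study in this thread:

```python
import math

# Rough double-gamma hemodynamic response function. peak/under control the
# shapes of the main response and the later undershoot (common defaults).
def hrf(t, peak=6.0, under=16.0, ratio=1.0 / 6.0):
    if t <= 0:
        return 0.0
    main = (t ** (peak - 1) * math.exp(-t)) / math.gamma(peak)
    undershoot = (t ** (under - 1) * math.exp(-t)) / math.gamma(under)
    return main - ratio * undershoot

# Sample the response to an instantaneous neural event at t = 0.
samples = [(t, hrf(t)) for t in range(0, 21)]
peak_time = max(samples, key=lambda p: p[1])[0]
print(f"signal peaks about {peak_time} s after the neural event")
```

A measurement channel that lags the thought it reflects by roughly five seconds is a poor candidate for any covert real-time "thought ray."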

What are you advocating, INRM? A ban on fMRIs because you don't know how they work?
 
sthompson,

"Don't use it" is not an acceptable solution. I don't plan to. But what if the government does?

Contrary to your view that *all* technology should be allowed to progress whatever the cost, I think some technology is so dangerous that it should not be allowed to progress. Not that I'm anti-science, but there is good science and bad science. This, I believe, would be bad science. It has such potential for abuse, especially under an administration like ours that has no regard for personal freedoms (warrantless wiretapping, anyone?).

This is not like a privacy issue where the government knows what you're watching on your computer or TV, what sites you've been on, or whom you've talked to and what you've talked about. This is about your very thoughts. The violation of privacy would be so much more extreme than any of the above issues that it would, by definition, be a violation of international law (freedom of thought is one of the freedoms listed in international law). Not to mention: if you can't even keep your own thoughts secret, then what freedom do you have?

Creating public awareness of the situation is, of course, a wise idea; I would agree. But there is no way to address the problem short of altering the entire wiring of the human brain, which is impractical. The only solution is to prevent this technology from being used, on the grounds that it is too dangerous.

INRM

There is a saying about developing technology like the atomic bomb: once the genie is out of the bottle, there's no putting it back.

You see this happening with human cloning. Clinton pushed to pass a law making human cloning research illegal. The problem with that is that the rest of the world is going forward with the research. That means those countries will reap any benefits from the research, as well as any problems.
There are many things we could learn from the research that would be beneficial to medicine.

The laws we should have passed concerning cloning would ensure that a human clone is considered a human, with all the rights incumbent on being human.
I mean, is there any true difference between creating a human the usual way, by in vitro fertilization, or by cloning?

That being said, I wonder if this technology could be used to accurately extract thoughts, or be some form of accurate lie detector. Human memory is so plastic.

I wonder how you would know whether you were extracting a memory that was real, accurate, and undistorted, rather than one made up from a dream, from imagination, or from a book you read or a movie you saw?

I wonder how accurately sensory information is stored in the brain?

A lie detector made from this technology may wind up being not that much more accurate than a polygraph, which is pretty much not accurate at all.

I think at worst this technology may someday be able to read whatever internal dialog is going on in your head, or whatever you are seeing or looking at in real time, but I think anything extracted from memory may be questionable.

Of course there may be the possibility to erase memory or create false memory.
 
That's not what I'm talking about, sthompson. There are ways to track blood flow without an fMRI machine.

To uruk,

Actually, even if new technology is created, regulations can be put into effect to make sure it is not used, or to restrict its use to limited applications.

There have been cases where international treaties have banned the use of certain technology (dum-dum bullets, weapons that maim and have no other use, etc.).

And there have been unwritten rules about not using certain technology, cobalt bombs for example, as the results of using them would be too dangerous.

In regards to privacy, it would be too dangerous.
 
Well, from what I can tell, sthompson, you do not believe that cell phone technology came from the same tech that the landline telephone came from...
When in reality, although they might not use the exact same methods, they are in actuality from the same original tech.
In other words, technology evolves, and sometimes it evolves extremely fast to suit the "needs" at the time.
Therefore the idea that this "MRI" cannot be quickly evolved into, say, an electrically based brain scan or some other medium is, IMO, a very uneducated one.
 
Well, from what I can tell, sthompson, you do not believe that cell phone technology came from the same tech that the landline telephone came from...
When in reality, although they might not use the exact same methods, they are in actuality from the same original tech.
In other words, technology evolves, and sometimes it evolves extremely fast to suit the "needs" at the time.
Therefore the idea that this "MRI" cannot be quickly evolved into, say, an electrically based brain scan or some other medium is, IMO, a very uneducated one.

Look, I see your point - I've already admitted that - but that still doesn't mean we should throw the baby out with the bathwater. It is uneducated to argue that we should completely halt research on fMRI technology because of paranoia over theoretical applications in the far distant future. We could have done the same thing with telegraphs, radios, telephones, x-rays, computers, the internet, etc.

I dunno, maybe we SHOULD have stopped developing the internet back in the 80s. Then there'd be no one here to argue.
 
I never said we should throw all of fMRI technology out...

I just said that specific technology, such as algorithms and deliberate use and research to figure out what a person is thinking or what their intent is (a Julian Haynes developed an algorithm that identified activity in the medial prefrontal cortex to determine whether a person was going to add, subtract, or what have you), or developing the fMRI into some kind of lie-detector/interrogation tool, should be halted for the danger it could pose to civil liberties.
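For what it's worth, the kind of decoding described above can be caricatured in a few lines. The sketch below is NOT Haynes' actual method or data; it is a simulated nearest-centroid classifier, with made-up numbers throughout, that only illustrates what "an algorithm that identifies activity to determine intent" means in practice:

```python
import numpy as np

rng = np.random.default_rng(42)
n_voxels = 50  # hypothetical number of voxels in the region of interest

# Hypothetical mean activity patterns for two covert intentions.
pattern_add = rng.normal(0, 1, n_voxels)
pattern_sub = rng.normal(0, 1, n_voxels)

def simulate_trials(pattern, n=40, noise=1.0):
    # Each trial is the intention's pattern plus measurement noise.
    return pattern + rng.normal(0, noise, size=(n, n_voxels))

train_add, train_sub = simulate_trials(pattern_add), simulate_trials(pattern_sub)
centroid_add, centroid_sub = train_add.mean(axis=0), train_sub.mean(axis=0)

def decode(trial):
    # Assign the trial to whichever training centroid it is closer to.
    d_add = np.linalg.norm(trial - centroid_add)
    d_sub = np.linalg.norm(trial - centroid_sub)
    return "add" if d_add < d_sub else "subtract"

test_trials = simulate_trials(pattern_add, n=20)
accuracy = np.mean([decode(t) == "add" for t in test_trials])
print(f"decoding accuracy on held-out 'add' trials: {accuracy:.0%}")
```

Note that even this toy version needs many labeled training trials from a cooperating subject before it can classify anything.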

As I said before, Cold one is absolutely correct that technology sometimes evolves, and in some cases very rapidly, well beyond any initial expectation. And what he said earlier, that it would be pitched as a way to help people who can't communicate (say, autistics) while being developed to "prevent crime before it happens" and as an interrogation tool, is not impossible.

INRM
 
I don't think we'll ever come to agreement on this issue. I have no problem with using an fMRI as a lie-detector tool, as we've got other lie-detector tools that are used in the exact same way. The fMRI does not read your thoughts, it reads your physiological reaction to questions, just like a normal lie detector does. You have not convinced me that this technology can ever be made portable or be used over long distances, even with unknown future technological innovations, because of the inherent limitations in our physiology and in the technology.

I have one question, INRM: do you agree with conservative politicians who attempt to limit and regulate research on new stem cell lines and/or on cloning?
 
I never said we should throw all of fMRI technology out...

I just said that specific technology, such as algorithms and deliberate use and research to figure out what a person is thinking or what their intent is (a Julian Haynes developed an algorithm that identified activity in the medial prefrontal cortex to determine whether a person was going to add, subtract, or what have you), or developing the fMRI into some kind of lie-detector/interrogation tool, should be halted for the danger it could pose to civil liberties.

As I said before, Cold one is absolutely correct that technology sometimes evolves, and in some cases very rapidly, well beyond any initial expectation. And what he said earlier, that it would be pitched as a way to help people who can't communicate (say, autistics) while being developed to "prevent crime before it happens" and as an interrogation tool, is not impossible.

INRM

I think I would have to disagree here. I think technology like this has more potential for benefit than detriment. You can always regulate a particular use of the technology. There is a multitude of technology present now that can be used for mischief, but it is being regulated. There will always be breaches.

It is also kind of pointless to say that you can prevent any technology from developing. Once the knowledge is known, it just goes on from there, UN sanctions or not.
 
You can regulate ordinary civilians in the use of this technology... but how do you regulate the government, especially when it operates under great secrecy and has no regard for human rights?

Regarding any benefit it may have, I would think more about figuring out ways to cure or treat autism than about developing a computer that can "read" a person's mind.

If most people were made aware of how dangerous such technology could be, that in itself would help dramatically (nobody's detonated a cobalt bomb...). But regarding laws: if the laws are uniform, applied at the state, federal, and international level, and if punishments for violating them are certain and unwavering, it should work.

INRM
 
INRM, will you please answer my question: Do you support government limitations of potentially life-saving research, like the US gov't does for new stem cell lines?
 
I think the government should not have placed limits on stem cell research. Of course, there is no violation of civil liberties in the use of stem cells.

INRM
 
I think the government should not have placed limits on stem cell research. Of course, there is no violation of civil liberties in the use of stem cells.

But you want to limit fMRI research because of potential violations of civil liberties. In contrast, conservatives want to limit stem cell research because of potential violations of an ethical or moral code. Pretty similar motivations, IMO.

By your logic, we should also stop research on high-resolution cameras and image processing, because they could be used to violate your right to privacy. We should stop research into genetics, because people could use your genetic information to discriminate against you. And so on.
 
Okay sthompson,

You are comparing apples and oranges. In the case of stem cells you're talking about an egg, or a blastula, barely developed. It has no consciousness; it isn't alive. It doesn't even have a brain yet, and you can't have consciousness without a brain. And you are using eggs that have mostly been aborted by mothers who didn't want a baby; they had a medical procedure to remove the egg from their body. At least an egg which would normally have been discarded as garbage can be used for fantastic medical benefits.

In the second case you are talking about human beings who are very much completely alive. We live in an evolved society that, for the past 220 or so years, has entitled its people to certain liberties. America, for example, historically entitles its citizens to be free from unreasonable searches and seizures. Technology that can determine what a person is thinking would almost inevitably be sought by the government, particularly by law enforcement and intelligence. And since the technology ultimately works by tracking blood flow through sections of the brain, which could potentially be done from a distance, it could be used to violate the most basic freedom: the freedom to keep one's thoughts to oneself. Given that our government in the past six to seven years has shown a sudden but profound and likely long-lasting disregard for human rights, this is a likely possibility. I think it should be stopped because it poses such a danger.

Regarding genetic research, this isn't exactly of the same caliber as being able to determine exactly what somebody is thinking. You do present a good point, however; I think genetic research should have some regulation incorporated to prevent abuse, but I do not think it should be halted. Keep in mind, though, that I don't even want to halt fMRI research altogether, just certain research which can endanger civil liberties.


INRM
 
Well not quite yet. http://www.cmu.edu/news/archive/2008/January/jan3_justmitchell.shtml
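The linked CMU result can be caricatured as a linear model from a word's semantic features to its predicted voxel activations. Everything in this sketch is simulated (the feature count, voxel count, words, and weights are all made up); it only illustrates the idea, not the study's actual method:

```python
import numpy as np

rng = np.random.default_rng(1)
n_features, n_voxels = 25, 500  # hypothetical sizes

# Each semantic feature contributes a fixed "basis image" over voxels.
true_weights = rng.normal(0, 1, size=(n_features, n_voxels))

def activation(word_features):
    # Predicted brain image = weighted sum of feature-specific basis images.
    return word_features @ true_weights

# Two made-up words with different semantic feature vectors.
celery = rng.normal(0, 1, n_features)
airplane = rng.normal(0, 1, n_features)
img_celery, img_airplane = activation(celery), activation(airplane)

# Decoding test: match a new noisy image to the closer predicted image.
observed = img_celery + rng.normal(0, 0.5, n_voxels)
guess = ("celery"
         if np.linalg.norm(observed - img_celery)
            < np.linalg.norm(observed - img_airplane)
         else "airplane")
print(guess)
```

The interesting part of the real study was that the model generalized to words it was never trained on; the toy above only shows the matching step.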

I find this very interesting on several levels.
I can foresee a web browser that throws up search results based on what you are thinking.

Thought/neural controlled interfaces for computers or devices.

And eventually, extracting thoughts from your brain.

Maybe even direct neural telecommunication a la "Ghost in the shell."

It also implies some interesting things about the mind. I think I'll post this in the religion and philosophy forum. Just to stir things up.
This kind of technology is old hat. Nothing new there as I read it.

*Yawn*
 
