
Where should technology research be focused?

Dorian Gray
My belief? Computing power.

If computing power is doubled multiple times, then researchers in practically every field of science, medicine and technology can use massive simulations instead of real-time research. Will a particular medicine have negative effects on people? Take the characteristics of the drug, feed in all the data from all the people who contributed DNA to the human genome project, factor in a few lifestyles, and run a few million simulations. Rather than putting it out there after 10 years of testing and finding out later that it affects a certain segment of the population lethally, the simulations would catch it without risking lives.
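To make that concrete, here is a minimal sketch of what such a screening loop might look like. Everything in it is invented for illustration (the risk model, the genome and lifestyle factors, the thresholds); the point is only that a rare harmful interaction reliably surfaces once the trial count gets large enough:

```python
import random

# Toy screening loop. The "patient" model and all numbers are made up:
# each simulated patient gets a genetic marker score and a lifestyle
# score, and the hypothetical drug harms a small genetic subgroup.
def simulate_patient(rng):
    genome_factor = rng.gauss(0.0, 1.0)             # stand-in for genomic data
    lifestyle_factor = rng.choice([0.0, 0.5, 1.0])  # sedentary / moderate / active
    return genome_factor, lifestyle_factor

def adverse_event(genome_factor, lifestyle_factor, rng):
    base_risk = 0.001                # background adverse-event rate (assumed)
    if genome_factor > 2.0:          # the vulnerable ~2% of the population
        base_risk = 0.20
    risk = base_risk * (1.0 + 0.5 * lifestyle_factor)
    return rng.random() < risk

def adverse_rate(n_trials, seed=42):
    rng = random.Random(seed)
    events = sum(
        adverse_event(*simulate_patient(rng), rng) for _ in range(n_trials)
    )
    return events / n_trials

for n in (1_000, 100_000, 1_000_000):
    print(f"{n:>9,} simulated patients -> adverse rate {adverse_rate(n):.5f}")
```

With a thousand simulated patients the rare-subgroup risk is mostly lost in the noise; with a million it stabilizes into an unmistakable signal, which is the argument for throwing compute at the problem.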
 
I reject the premise (that someone or some agency or something should be directing technology research).

I'm not ignoring the NSF et al., whose grants, in effect, focus research. But I am suggesting that such agencies and private industry should each go their own way. It may not maximize efficiency, but I would argue (without evidence) that such an approach would maximize innovation.
 
The demand for computer power is enough that it will be funded privately. Sure, a few government dollars get thrown into the mix when someone comes up with an idea and needs start-up funds, but private funding comes in as soon as feasibility is shown (maybe before).

Money doesn't need to be directed there. It already is.

And I think you are over-estimating the power of simulations. The accuracy of simulations comes from the "preprocessing" (researching what you want to simulate, developing a model, and programming the model). To "factor in a few lifestyles" would require huge amounts of research. True, whether a simulation is feasible may be determined by computing power, but whether or not it is accurate is determined by the knowledge base that goes into creating the model you simulate.
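A toy illustration of that point (my own, with made-up numbers): a Monte Carlo run converges beautifully either way, but if the model's knowledge base omits an interaction, it converges to the wrong answer no matter how many trials you buy:

```python
import random

# Two models of the same hypothetical drug. The "true" risk involves an
# interaction between a genetic marker and a lifestyle variable; the
# naive model was built without researching that interaction.
def true_risk(has_marker, risky_lifestyle):
    return 0.01 + (0.30 if (has_marker and risky_lifestyle) else 0.0)

def naive_risk(has_marker, risky_lifestyle):
    return 0.01  # baseline only; the interaction was never studied

def estimated_rate(model, n, seed=1):
    rng = random.Random(seed)
    hits = 0
    for _ in range(n):
        marker = rng.random() < 0.10      # 10% carry the marker (assumed)
        lifestyle = rng.random() < 0.50   # 50% risky lifestyle (assumed)
        hits += rng.random() < model(marker, lifestyle)
    return hits / n

n = 1_000_000
print("well-researched model:", estimated_rate(true_risk, n))   # ~0.025
print("naive model          :", estimated_rate(naive_risk, n))  # ~0.010
```

A million trials of the naive model give a tight, confident, wrong number; the extra compute sharpened the precision but not the accuracy.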
 
And I think you are over-estimating the power of simulations. The accuracy of simulations comes from the "preprocessing" (researching what you want to simulate, developing a model, and programming the model). To "factor in a few lifestyles" would require huge amounts of research. True, whether a simulation is feasible may be determined by computing power, but whether or not it is accurate is determined by the knowledge base that goes into creating the model you simulate.

So very true....
 
Technology research should be focused exactly where it always has been and always will be: where people think they have something that will work.
 
I reject the premise (that someone or some agency or something should be directing technology research).
I reject that premise as well. However, I am pretty sure no one asserted that premise in this thread - I certainly didn't. I asserted that technology research should be focused in a certain direction, not that some entity should direct that research.
 
And I think you are over-estimating the power of simulations. The accuracy of simulations comes from the "preprocessing" (researching what you want to simulate, developing a model, and programming the model). To "factor in a few lifestyles" would require huge amounts of research. True, whether a simulation is feasible may be determined by computing power, but whether or not it is accurate is determined by the knowledge base that goes into creating the model you simulate.
Data is collected on millions of people every single second. It's passive. If a large-scale voluntary effort similar to the Human Genome Project were undertaken, the data would be more meaningful.

Besides, that's just it. Increased power would make data collection faster, and bad or questionable results would be detected and ruled out faster. In other words, creating the model would be easier.
 
Seed many fields and jump on the ones that produce results.

Nassim Nicholas Taleb does a better job of summing up my view:
http://edge.org/q2007/q07_5.html#taleb

" The only bad news is that we can't really tell where the good news are going to be about, except that we can locate it in specific locations, those with a high number of trials. More tinkering equals more Black Swans. Go look for the tinkerers."

Where can you find a high number of trials? On a super-powerful computer that can run millions of trials in whatever timeframe we strive for. All I'm saying is that increasing computing power would be 'the rising tide that lifts all boats.'
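As a back-of-the-envelope sketch of why trial volume matters (my own arithmetic, not Taleb's): if some surprising outcome occurs with small probability p per independent trial, the chance of seeing it at least once in n trials is 1 - (1 - p)^n, which climbs fast once n gets into the millions:

```python
# Chance of observing at least one rare outcome (per-trial probability p)
# somewhere in n independent trials: 1 - (1 - p)^n.
def p_at_least_once(p, n):
    return 1.0 - (1.0 - p) ** n

p = 1e-6  # an assumed one-in-a-million outcome, purely illustrative
for n in (1_000, 1_000_000, 100_000_000):
    print(f"n = {n:>11,}: P(seen at least once) = {p_at_least_once(p, n):.4f}")
```

At a thousand trials you will almost never see it; at a hundred million it is nearly certain.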
 
My belief? Computing power.

If computing power is doubled multiple times, then researchers in practically every field of science, medicine and technology can use massive simulations instead of real-time research. Will a particular medicine have negative effects on people?


Actually, as far as I can tell, that's exactly where too much research has been focused.

We've done that experiment. Computing power has doubled multiple times. The problem is that the increase in computing power has not resulted in increased power delivered to the user; when the power of the CPU doubles, Microsoft simply relaxes its standards for efficiency of code, with the result that the end user sees no significant increase on a typical desktop.

These trends are well-documented in the literature, if you want to dig up the references. And unfortunately Microsoft has such a dominant position in the computing industry that it's difficult even for non-Microsoft operating systems (such as Linux) to see any effective performance increase (basically, the chip manufacturers are moving the MS OS closer and closer to the silicon).

If you want to see improvements in biosimulation, I suggest that focusing research directly on what you want (biosimulation improvements) would be more productive. At least it doesn't have a fifteen-year history of corporate-related failure behind it.
 
" The only bad news is that we can't really tell where the good news are going to be about, except that we can locate it in specific locations, those with a high number of trials. More tinkering equals more Black Swans. Go look for the tinkerers."

Where can you find a high number of trials? On a super-powerful computer that can run millions of trials in whatever timeframe we strive for. All I'm saying is that increasing computing power would be 'the rising tide that lifts all boats.'

That's assuming you can create accurate complex models for the things you want to test. Sometimes you can; many times you only think you can (see financial models).
 
My belief? Computing power.

If computing power is doubled multiple times, then researchers in practically every field of science, medicine and technology can use massive simulations instead of real-time research. Will a particular medicine have negative effects on people? Take the characteristics of the drug, feed in all the data from all the people who contributed DNA to the human genome project, factor in a few lifestyles, and run a few million simulations. Rather than putting it out there after 10 years of testing and finding out later that it affects a certain segment of the population lethally, the simulations would catch it without risking lives.

Although it may not be your main point, as evidenced by the thread title, the simulation angle seems to be the most salient for people (and for me). I think inevitably we're going to rapidly reduce drug approval processes, and we're probably going to have to use simulations to do that. As immortality becomes more salient to baby boomers, they're not going to accept dying while drugs go through decade-long development cycles.
 
The demand for computer power is enough that it will be funded privately. Sure, a few government dollars get thrown into the mix when someone comes up with an idea and needs start-up funds, but private funding comes in as soon as feasibility is shown (maybe before).
The US government actually funds this research to a great degree.

For example.

I'm not disagreeing with your post, just pointing out a governmental funding source.
 
Actually, as far as I can tell, that's exactly where too much research has been focused.

We've done that experiment. Computing power has doubled multiple times. The problem is that the increase in computing power has not resulted in increased power delivered to the user; when the power of the CPU doubles, Microsoft simply relaxes its standards for efficiency of code, with the result that the end user sees no significant increase on a typical desktop.

These trends are well-documented in the literature, if you want to dig up the references. And unfortunately Microsoft has such a dominant position in the computing industry that it's difficult even for non-Microsoft operating systems (such as Linux) to see any effective performance increase (basically, the chip manufacturers are moving the MS OS closer and closer to the silicon).

If you want to see improvements in biosimulation, I suggest that focusing research directly on what you want (biosimulation improvements) would be more productive. At least it doesn't have a fifteen-year history of corporate-related failure behind it.

When I say 'computing power', I am talking about the whole thing, not just the processor. I mean, as you implied, what good is improving processing power when other components undercut that improvement?
 
Although it may not be your main point, as evidenced by the thread title, the simulation angle seems to be the most salient for people (and for me). I think inevitably we're going to rapidly reduce drug approval processes, and we're probably going to have to use simulations to do that. As immortality becomes more salient to baby boomers, they're not going to accept dying while drugs go through decade-long development cycles.

Right! That is part of my point. But simulations are not limited to medicine (as I'm sure you know).
 
Piling on: this is exactly correct.

Create a program that accurately creates complex models. Test the program on several previous simulations that are known to be successful to see if the same results are achieved within an acceptable variance. Voila.
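In code, that kind of regression check might look something like the sketch below; `simulate`, the historical cases, and the tolerance are all placeholders for whatever model and past results are actually available:

```python
# Sketch of validating a simulator against cases with known outcomes.
# `simulate`, KNOWN_CASES, and the tolerance are stand-ins.
def simulate(inputs):
    return 2.0 * inputs["a"] + 0.5 * inputs["b"]  # pretend model

KNOWN_CASES = [  # (inputs, outcome actually observed in the past)
    ({"a": 1.0, "b": 2.0}, 3.1),
    ({"a": 0.5, "b": 4.0}, 2.9),
    ({"a": 2.0, "b": 0.0}, 4.2),
]

def validate(tolerance=0.25):
    failures = []
    for inputs, observed in KNOWN_CASES:
        predicted = simulate(inputs)
        if abs(predicted - observed) > tolerance:
            failures.append((inputs, predicted, observed))
    return failures

bad = validate()
print("PASS" if not bad else f"FAIL on {len(bad)} case(s): {bad}")
```

Passing on everything you have already seen, of course, only certifies the model against the past.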
 
Create a program that accurately creates complex models. Test the program on several previous simulations that are known to be successful to see if the same results are achieved within an acceptable variance. Voila.
This is exactly the oversimplification we are writing about. A program can only tell you what you know, not what you don't know. If a simulation leads to an unexpected result, it must be physically verified.

Your approach has failed dramatically before. About ten years ago some economists (including Nobel laureates) wrote a program that simulated all known international influences on equities, currencies and commodities; they formed a hedge fund called Long-Term Capital Management (LTCM) based on the program. They were remarkably successful, until Russia decided not to pay the interest on its national debt. That had never happened before, and LTCM tanked.
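The general failure mode is easy to demonstrate with a toy of my own (nothing like LTCM's actual model): calibrate a distribution to a history that contains no crashes, and the model will assign essentially zero probability to one:

```python
import math
import random

# Calibrate a normal distribution to "history" with no crash in it,
# then ask that model about an unprecedented 8-sigma down day.
rng = random.Random(0)
history = [rng.gauss(0.0, 0.01) for _ in range(2500)]  # ~10 calm years of daily returns

mu = sum(history) / len(history)
sigma = math.sqrt(sum((x - mu) ** 2 for x in history) / len(history))

def prob_below(x):
    # P(return < x) under the fitted normal, via the complementary error function
    return 0.5 * math.erfc((mu - x) / (sigma * math.sqrt(2)))

crash = -0.08  # a kind of day that never appears in the calibration data
print(f"Model's probability of a {crash:.0%} day: {prob_below(crash):.1e}")
# Prints an essentially-zero probability: the model says "never",
# because nothing remotely like it exists in the data it was built from.
```

The simulation is internally flawless; it is the knowledge base (the history) that is missing the event.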
 
