
Computers in 2020

Hi

I'm picturing cellular computing.

~snip~

(This isn't all mine: I added some bells, whistles, and optimization, but the basic idea is from a friend and teacher's Master's thesis. Hiyas, Bruce!)

This sounds exactly like cloud computing. There is a company (can't think of their name) that is going to implement a gaming website where you can play any game regardless of the computing power of your home computer and without downloading the game software. The software runs on their servers and they just stream the video to your home computer.

Some day all your software, including your OS, will be running on a server somewhere and your home computer will be just a dumb terminal receiving a data stream.
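
Very roughly, the "dumb terminal" idea looks something like the sketch below; the server address, port, and one-line protocol are made up purely for illustration, not taken from any real streaming service:

    # Hypothetical thin client: all real computing happens on the remote server,
    # the local box only forwards input and displays whatever comes back.
    import socket

    SERVER = ("thin-client.example.com", 9000)   # made-up address for illustration

    def run_terminal():
        with socket.create_connection(SERVER) as conn:
            while True:
                keystroke = input("> ").encode()          # local input only
                conn.sendall(keystroke + b"\n")           # the server does the work
                frame = conn.recv(65536)                  # server streams back the result
                if not frame:
                    break
                print(frame.decode(errors="replace"))     # "display" the frame

    if __name__ == "__main__":
        run_terminal()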

Nowadays technologies tend to converge. The mobile computing scenario I mentioned will all converge into one multi-function device. The cellphone device you carry around will be your laptop, home computer, media server, entertainment system, phone, even credit/debit card.

And no worries if your device gets stolen: there is no data on it. It's all stored on a server located with your service provider. The device detects your DNA signature and only accesses your account if you've authenticated by physical contact with the device.
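
If you want to picture how the "authenticate by contact, data lives with the provider" part could work, here's a very loose sketch; the provider records, the SHA-256 check, and the idea of hashing a contact reading are all my own invention, just to show the shape of it:

    # Loose sketch: the device holds nothing permanent, it just hashes a contact
    # reading and asks the provider's server whether that matches the account.
    import hashlib

    PROVIDER_RECORDS = {                 # lives with the service provider, not on the device
        "alice": hashlib.sha256(b"alice-contact-sample").hexdigest(),
    }

    def authenticate(user: str, contact_reading: bytes) -> bool:
        # Only a digest ever leaves the device, never the raw reading or any user data.
        return hashlib.sha256(contact_reading).hexdigest() == PROVIDER_RECORDS.get(user)

    print(authenticate("alice", b"alice-contact-sample"))   # True: account unlocked
    print(authenticate("alice", b"someone-else"))           # False: a stolen device is useless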
 




There's already a guy... in Japan, I think... who's experimenting with running bundles of microwires through the arteries and individual microwires through the capillaries to individual brain cells, so... maybe a scale not measurable in hundreds-of-years without applying some odd little fraction.

Once nanotechnology gets to the point where we can integrate technology directly into our biology, humanity will change significantly. The line between what is biological and what is technological, and between what is mental and what is computational, will blur. Telepathy may one day become possible via nanotechnology, using radio waves rather than brainwaves though.
 
In 2020 governments had already run out of new ideas for making cars, refrigerators, etc. more energy-efficient and ecological, so they turned to the IT sector and made a new law banning computer operating systems that consume unreasonably much CPU power or memory compared to the tasks the computer user actually needs.

This new law made Windows XP, Vista and later Microsoft operating systems illegal. Now computers are running again with featherweight programming techniques, and old hardware from the 1990s is again enough to run an operating system that meets all the practical needs of a typical home user, gamers excluded.



And the gamers would vote out of office any government that would do that.
Something tells me we are dealing with one of those old-time IT guys who still considers the triumph of the visual interface over the command line to be the Work Of The Devil.
 

Not to mention that there are huge efforts by chip manufacturers to lower the power consumption of CPUs and GPUs. Lower power means less heat generation.

Operating systems with GUIs fit easily on thumb drives and the like. Check out portable Linux installations like Ubuntu, etc.
 
All books and newspapers on paper would be obsolete. Instead you download them and read them on special machines that look like a piece of paper (that you can actually fold up) with a few attachments. It would probably also take photos and double as a mobile phone. The keyboard would be a virtual-reality keyboard.

The same device would communicate with your PC via infrared. This is how you would communicate with your PC, which would not have its own keyboard or screen. If you have a family, everyone could then use the PC at the same time.

You watch TV for news and current affairs. For movies and documentaries you hire the show and download it via the ultra high speed Internet. You are connected to the Internet via fiber optics.
 
That's because all the heavy computing was done by massive mainframes back on Earth, and only the solutions to all those hairy math problems were transmitted to Viking. The Viking computer was more like a simple device driver than anything else.

Um, like the celestial navigation module?
 
In homes, there's a Home Cycle Server (about what I have on my desk right now: four dual-core CPUs, 16 GB of RAM, and a 160 GB private local hard drive backed up by a 1 TB NAS drive) to which the entire household's more difficult tasks are referred.
(NOTE: Apply ridiculous 11-year size inflation to hard drive specs for proper scale.)

...

Any tasks that need even more power are referred up to a few large regional cycle servers on a cost-per-cycle basis.
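
Very roughly, the referral logic could look like the sketch below; the thresholds and the per-cycle rate are invented numbers, just to show the idea of pushing work up the hierarchy:

    # Rough sketch of the cycle-server idea: cheap jobs run locally, bigger ones go
    # to the Home Cycle Server, and only the really heavy ones get billed per cycle
    # on a regional server. All thresholds and rates are made up.
    def dispatch(job_cycles, regional_rate_per_gigacycle=0.01):
        if job_cycles < 1e9:
            return "run locally on the personal device"
        elif job_cycles < 1e12:
            return "refer to the Home Cycle Server"
        else:
            cost = (job_cycles / 1e9) * regional_rate_per_gigacycle
            return f"refer to a regional cycle server, est. cost ${cost:.2f}"

    print(dispatch(5e8))    # trivial task
    print(dispatch(3e10))   # household-scale task
    print(dispatch(2e13))   # heavyweight task -> regional server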

How many homes have a need for this kind of computing though? Except for gamers, the vast majority of users today aren't doing much more with their computers (computationally speaking) than they were doing 15 years ago, when Win 95 came out... They word-process (or use its net analogues), they play with pictures, they download and consume media. As previously mentioned, the average user's computer is far more powerful than they need already. Perhaps, as you mention, a costume designer might wind up using more intuitive CAD-type software than currently exists, but that kind of software already runs well on existing computers.

It will take some new application that consumes huge computational resources to force a switch back to a mainframe situation. The gaming company mentioned earlier is an interesting example of where this definitely could work in a limited fashion. The only other computationally intense software that is commonly used by most people is their operating system, and if Microsoft doesn't clean up its act, they're going to start losing large numbers of customers to those OSes that aren't so bloated.

ETA:
It's kind of funny, I was reading The Puppet Masters by Robert Heinlein last night and he made 56 year predictions into the future (1951 to 2007, when the alien invasion occurred.) He's got cell phones, distributed computing, and orbital communications satellites (which are manned and don't have enough bandwidth to transmit audio and video at the same time.) He's also got cars that fly and run on a nationally controlled robotic system much of the time. Oddly enough, he still has the hero using a slide rule and the women making 'lewd' comments about having sex before marriage. I guess the reason I'm including this is that I think it's unlikely that any of us are going to be able to do a very good job of predicting the future much farther out than 10-20 years.
 
I think by 2020 we'll have stopped seeing anything truly interesting happening in home and mobile computing. With things like memristors and terabytes of storage becoming available on even the least of devices, even gamers will have more computer power than they need.

My prediction is that 2020 is when consumer robotics will really start to take off.
 
Very likely, but what I think will have a bigger impact is augmented reality (someone already mentioned it here). And real-time augmented reality DOES require more computing power than most home users have today -- or perhaps it needs about as much computing power, but in a much smaller package.
 
Normal people today use "Porsche" computers, because nothing less is enough to run any OS that Microsoft delivers to the market. And the next step will be that nothing less than a "Ferrari" PC is enough.

Well, not everyone (or every company) can afford to buy new computers all the time. Even if new CPU technology consumes somewhat less electricity, dumping every computer in the world every 3 or 5 years is not a very ecological idea. An excellent business plan though, nicely played by Microsoft.
 
My guess is that computers will be pretty much the same as they are today when it comes to computing power and RAM, but they will be smaller and require less power. Hard-disks and other storage will see another ten-fold increase in size, maybe more for cellphones and other small, portable devices. The latter might also see a more significant increase to computing power and RAM. Typical network speeds will be five to ten times better than today, three to five times for wireless.

The most pronounced change will be e-ink, which will open up new kinds of appliances and gadgets. Not just e-book readers and e-notebooks, but also stuff like "smart" shelf tags in stores, debit cards and electronic passes that display your last known balance, reloadable posters and picture frames, and a lot of weird and occasionally clever stuff we wouldn't expect.

RFID will be widespread, but mostly for "work", little for home and personal use, although RFID + e-ink might give rise to new lines of products we haven't foreseen.

We still won't have reliable voice recognition, face recognition, auto translation, self-driving cars or biometrics. Google won't be Google anymore, Apple might no longer be cool among the digirati, but IBM will still be IBM.
 
We will have two OS lines, one for limited computing (office, websurfing) and the other for high-end users (gamers, CAD, servers, etc). The latter will also have an option to run a limited tan-style AI.
 
Um, like the celestial navigation module?

Celestial navigation, if it was done at all (I'm sure there had to be one, but I only worked on the lander), was a function of the orbiter, not the lander. The lander got the landing profile (altitude, attitude for re-entry vs time, initial IRU settings) from the ground. Because of the 35-minute speed-of-light delay from Earth to orbiter, the Viking had to manage power, the aeroshell engines to control attitude for aerobraking, the parachute commands, and finally the terminal descent rockets, plus the first fifteen minutes of landed science, with no human intervention.

It was a tad more than a device driver.
 
Barring an unlikely switch to a totally different technology base, computers will mature over the next decade and start to become much more static. The technology treadmill, and the effects that has had on the world will gradually slow down and virtually grind to a halt and the seemingly endless string of new computer related products will slow to a trickle.

Single-processor throughput has largely peaked. While some small improvements will be made, the trend of smaller transistors = faster and lower power has already ended. Over the next decade, new processor technology will revolve around multiple cores performing parallelizable tasks. After 2020, all bets are off, since further reducing the size of transistors will be all but impossible.
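
As a toy illustration of that multi-core direction, something like this already spreads a parallelizable task across whatever cores are available (the work function is just a stand-in):

    # Toy example of a parallelizable task spread across cores.
    from multiprocessing import Pool

    def heavy(n):
        return sum(i * i for i in range(n))      # stand-in for real work

    if __name__ == "__main__":
        with Pool() as pool:                      # one worker per core by default
            results = pool.map(heavy, [10**6] * 8)
        print(len(results), "chunks computed in parallel")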
 
Celestial navigation, if it was done at all (I'm sure there had to be one, but I only worked on the lander), was a function of the orbiter, not the lander. The lander got the landing profile (altitude, attitude for re-entry vs time, initial IRU settings) from the ground. Because of the 35-minute speed-of-light delay from Earth to orbiter, the Viking had to manage power, the aeroshell engines to control attitude for aerobraking, the parachute commands, and finally the terminal descent rockets, plus the first fifteen minutes of landed science, with no human intervention.

You clearly know more about the mission than me. But I recall reading (I think) a RISKS digest about the dangers of implicit dependencies between modules.

The story as I remember it was an effort to free up some memory on the lander to load some more software. Someone noticed the navigation module was still installed and asked the perfectly sensible question of why something stuck on the surface would still need a navigation module. So it was deleted. And then the lander no longer knew where Earth was, and so no longer knew where to point the antenna. oops.

Ah, http://en.wikipedia.org/wiki/Viking_1 says 'The command was intended to uplink new battery charging software to improve the lander's deteriorating battery capacity, but it inadvertently overwrote data used by the antenna pointing software.', which may be the event I am thinking of.
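
The lesson generalizes pretty well; here's a toy illustration of that kind of implicit dependency (obviously not Viking's actual software, just an analogy in Python):

    # Toy analogy: one "module" reuses storage that another module silently depends on.
    shared_memory = {
        "antenna_pointing": (112.5, 47.3),   # quietly relied on by the comms code
    }

    def upload_battery_software():
        # Frees up space by overwriting a region assumed to be unused...
        shared_memory["antenna_pointing"] = None

    def point_antenna():
        pointing = shared_memory["antenna_pointing"]
        if pointing is None:
            raise RuntimeError("lost Earth: antenna pointing data overwritten")
        return pointing

    upload_battery_software()
    point_antenna()   # fails, because the dependency was never written down anywhere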
 
I don't know about the wasted horsepower. I routinely run an application that will decode about 30 separate digital amateur radio or shortwave streams at the same time. My Vic20 wouldn't have been able to do that. Some of the computer uses we take for granted today wouldn't be possible with less than multigig hard drives and processors.

Of course, mine is the next-to-the-bottom-of-the-line Dell E310, but I use all the processing power on a regular basis. Plus, I have 320 GB of online storage and it is ALL full.
 
My computer has already replaced the TV. I only watch live television for a) really important news events or b) a few sports events.

Everything else I watch on the PC (sometimes connected to a plasma TV screen).

There is 100 times more material available than I will ever watch, so I'll never need regular content to entertain me again.

With faster internet the quality of live broadcasts on the internet will increase. I think by 2020 regular TV will be in big trouble, at least in Europe and Japan. Dunno about the USA; they are a little behind right now, but in these things that often leads to the opposite, being in front a few years later...
 
There are a few TED talks that bring up some interesting ideas...

In biomimicry they discuss being able to GROW biodegradable computer parts...

In a discussion regarding the evolution of mankind (Homo Evolutus being the next step), humans and technology will merge to create a new species, the first ever capable of redirecting its own evolutionary path, using advances in nanotech...

There's also another TED talk (I forget the name or person involved at the moment) where they introduce something they're already working on...

It's a little necklace that you wear that projects onto anything you put in front of it and tells you whatever you want to know about it.

It's like an iPhone that you can project onto your own hand to dial phone numbers or do calculations, project onto cereal boxes to get the nutritional info, and even project someone's entire history of online blogging onto the person you're talking to, in order to get a better idea of who someone is before they even open their mouth...

It's weird, but it's out there!!!
 
The next big wave of changes in computers has already begun, with solid-state data storage. Compared to magnetic and optical drives, they're not only smaller and lighter, but also less energy-demanding. Open up the case of a standard desktop computer and look at what takes up most of the space: drives, a power supply unit, and cables to and from those parts.

The remainder is mostly the motherboard, CPU, CPU cooler, and RAM, but various handheld gizmos have been using smaller, simpler versions of those (minus the cooler, which isn't needed when things are that tiny) for quite a while, as long as the programming is simple enough to be done on this more limited hardware. So magnetic and optical disk drives, and the power supply units to feed them, were the only things left that justified a desktop computer's bulk.

Replace them with flash memory, and accept not getting the very fastest performance that can possibly be achieved this month, and you have no reason for the computer (minus keyboard and monitor, at least) to be significantly larger than a PDA or OQO. So the big separate "tower" part of computers will no longer be made and will gradually disappear as the old ones get replaced with tiny computers like the OQO, bought not just for portability but also because, even if all you do is park it at home as the home computer, you can put it practically anywhere and hide it in unobtrusive little corners, instead of having to arrange rooms around it and buy special furniture for it. The most space-consuming parts will then be the keyboard and monitor (which might themselves be good things behind which to "hide" the computer).
 
