Will Computers Ever Be Fast Enough?

WoodenPupa

Member
Feb 22, 2005
35
0
0
I've been thinking about this for some time. I tend not to upgrade my computer until there are significant gains to be had from doing so. For example, I went from a Pentium 90, to a Pentium II 333, to a Pentium 4 2.5 GHz over the last ten years, upgrading significantly in each case. As I first used each new computer, I would think, "Damn, this thing is ridiculously fast!" And then gradually I would overwhelm it with multitasking and CPU-intensive tasks.

Even using programs like Cool Edit Pro, Photoshop, etc., I find myself able to consistently overwhelm my computer. I can't imagine what people using 3-D workstations and other developers go through in their speed requirements. I get the feeling that, no matter how long I live, I'll never own a computer that will be able to keep up with how fast I want to use it.

Will there ever come a time where computers will perform 99% of the requested operations instantly? For example, if I have a 1 GB .wav file, will there be a time when performing complex transforms/edits will be done instantly, with no waiting?
 

jagec

Lifer
Apr 30, 2004
24,442
6
81
Nope. As computing power increases, not only will programs and OSes get more bloated, they will start doing things to a much greater level of detail. They're already way past what would have been considered "reasonable" a decade ago.

But to put it into perspective, right now I'm working on an old 333MHz Celeron with 128MB of RAM...and it's actually doing really well. It surfs the web, does Office, and is even able to keep up with my 54Mbit wireless card (granted, the connection isn't 100%).
 

WoodenPupa

Member
Feb 22, 2005
35
0
0
Well, won't hardware advancements overtake the complexity of software at some arbitrary point in the future? Especially if quantum computing manifests, it would render a lot of dense calculations trivial, or so I've been led to believe.
 

harrkev

Senior member
May 10, 2004
659
0
71
Some would argue that we are already approaching a point of having more power than we need. The average Joe could certainly get by with a 2GHz box. The only people who need 3GHz are gamers and people involved in very un-home-user-like tasks (mathematical and engineering work). But there are always people who want more for certain classes of tasks. Take, for example, finite element analysis. If you had 10x the processing horsepower, it means that you could have a 10x better model in the same amount of time.

But, in any event, processors have been speeding up at a slower pace lately. Intel was already supposed to have hit 4GHz months ago, but no 4GHz is in sight yet. So we may be hitting the wall as far as speed goes. I certainly hope something revolutionary like quantum computers DOES happen. But even if it does, it will take time. In the labs, they have the equivalent of ONE quantum gate. I have read about storing the equivalent of four bits of information in one quantum state. So, right now, any sort of quantum computing is still in the "toy" stage. I would guess that if it happens, it will be around 20 years out or so.
 

ribbon13

Diamond Member
Feb 1, 2005
9,343
0
0
Moore's law is dying because of the fundamental issue of heat...

Most of the 'un-home-user-like tasks' are SMP friendly as it is, so dual-core will be very useful. I'm looking forward to a quad cpu system with my K8WE. I should be set for 6 years or more, especially if the 'Italy' Opterons reach 2.6Ghz.
 

WoodenPupa

Member
Feb 22, 2005
35
0
0
Originally posted by: harrkev
Some would argue that we are already approaching a point of having more power than we need. The average Joe could certainly get by with a 2GHz box. The only people who need 3GHz are gamers and people involved in very un-home-user-like tasks (mathematical and engineering work).

Definitely, the average user has only very basic needs; as you say, 2 GHz is sufficient. I would argue the last part, though. I'm not an engineer, but I do a lot of audio work, and tasks like noise removal and 32-bit conversions take FOREVER when you're working with humongous files. Of course, my definition of "forever" keeps shrinking as computers evolve. But damn it, when the CPU takes ten minutes to do something, I get antsy as hell. That's gotta be doubly so for people who do professional work---how much more productive they could be if their CPU-intensive tasks could all be shrunk to a matter of seconds instead of minutes, hours, or days.

I guess it'll never end.
 

walla

Senior member
Jun 2, 2001
987
0
0
"Will there ever come a time where computers will perform 99% of the requested operations instantly? For example, if I have a 1 GB .wav file, will there be a time when performing complex transforms/edits will be done instantly, with no waiting? "

"Instantly" is a relative term. A millisecond may be instant for a human, but can be millions of instructions for a CPU.

And again, "all operations" is a very broad application. I am confident that humans will always find operations for computers that are extremely complex and take quite a while to compute. Today's supercomputers could probably do your 1 GB .wav file transforms "instantly", but they are more likely to be running complex scientific calculations such as weather, planet, or nuclear explosion modeling that can more efficiently utilize the full performance.

As far as the desktop market goes, software developers will always find ways to utilize more RAM, processing power, and graphics capability. That is what the customer demands, and what the customer identifies as "progress".

In short, computers will never be fast enough. Nor do I think they'll ever be "too slow" for the general consumer. While performance increases are slowing, so will the software development that consumes those increases.
 

imported_jb

Member
Sep 10, 2004
171
0
0
What about some kind of price point? Let's say that ~4GHz is as fast as we can go. All of these chips have cost hundreds of dollars when initially released, but prices fall, whether to cover research costs or whatever. Shouldn't a 4GHz chip end up costing $10, like a 400MHz chip does on eBay? How impossible is a motherboard w/ 25 processors? (I have no idea.)
I hope we start to see motherboards that can adapt to numerous parts, allowing you to plug in ALL of your old computer equipment.
 

ribbon13

Diamond Member
Feb 1, 2005
9,343
0
0
Hahaha. I doubt there will be much demand for that. It would be better to recycle the metal, plastic, and silicon.

Plug in my 8086 and 8087?
 

walla

Senior member
Jun 2, 2001
987
0
0
Originally posted by: jb
What about some kind of price point? Let's say that ~4GHz is as fast as we can go. All of these chips have cost hundreds of dollars when initially released, but prices fall, whether to cover research costs or whatever. Shouldn't a 4GHz chip end up costing $10, like a 400MHz chip does on eBay? How impossible is a motherboard w/ 25 processors? (I have no idea.)
I hope we start to see motherboards that can adapt to numerous parts, allowing you to plug in ALL of your old computer equipment.

Prices of chips fall because consumer demand for those chips falls. If 4 GHz were "as fast as we can go", then 4GHz chips would keep selling at top prices. That is, until a new 4GHz chip came out with lower power consumption or better performance in terms of instructions per second, what have you. However, if a theoretically "optimal" chip were sold, and no new innovation came along, eventually the CPU market would be saturated with them and prices naturally would bottom out.

A motherboard with 25 processors? For the consumer market, that's impractical since no one can afford 25 processors. But with know-how, I'm sure you could network 12~25 motherboards in a cluster configuration.

The problem with making motherboards that can adapt to numerous parts is that numerous parts implies numerous different technologies, most of which are not necessarily compatible. We have PCI, PCI-X, HyperTransport, AGP, ISA, etc., all these different technological protocols that are implemented in different ways. It is in the interest of the manufacturer to make motherboards that support only modern technologies in order to keep the board simple and useful for modern hardware. The only hope for backward compatibility is if new technology naturally supports it. That is often impractical because new technology is geared toward higher bandwidth, lower voltage, and faster speeds that old hardware cannot operate at.

 

f95toli

Golden Member
Nov 21, 2002
1,547
0
0
The answer is no. An example is FEM calculations in 3D, which are used to simulate everything from electromagnetic waves to bridges.
The calculation is done by solving the equations at discrete points. If, for example, I want to solve a 1D problem over one meter, I can use ten points, one every 10 cm.
The same problem in 2D with the same precision would require 10x10=100 points, and in 3D 10x10x10=1000 points.

So what happens if I want to increase my resolution by a factor of 10 (100 points in each dimension)? Well, then I need 100x100x100=1 million points in 3D, which means that I need a computer that is 1000 times as fast in order to solve the problem in the same time as before.

There are lots of problems like this. There are several important problems which we simply cannot handle; computers need to be orders of magnitude faster, and we can always increase the resolution even further.
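The grid-scaling arithmetic above is easy to verify with a few lines of Python (an illustrative sketch of the point counts, not anything from the original post):

```python
# Points needed for a regular grid in d dimensions at n points per axis: n**d.
# A 10x finer grid in 3D therefore needs 10**3 = 1000x the points, so roughly
# 1000x the compute for the same wall-clock time, as the post describes.

def grid_points(points_per_axis, dimensions):
    """Number of sample points on a regular d-dimensional grid."""
    return points_per_axis ** dimensions

coarse = grid_points(10, 3)    # 1,000 points (10 per axis in 3D)
fine = grid_points(100, 3)     # 1,000,000 points (100 per axis in 3D)
print(fine // coarse)          # 1000x more work for 10x the resolution
```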

 

jagec

Lifer
Apr 30, 2004
24,442
6
81
Originally posted by: WoodenPupa
Well, won't hardware advancements overtake the complexity of software at some arbitrary point in the future? Especially if quantum computing manifests, it would render a lot of dense calculations trivial, or so I've been led to believe.

No, because a modeling system can increase in complexity a hundredfold just by moving a decimal point. Hardware CANNOT advance as fast as software.

Obviously few people need that much precision in a model, but it's a question of "if you've got it, why not use it?".
 

eigen

Diamond Member
Nov 19, 2003
4,000
1
0
Assuming P != NP, there will always exist problems for which no polynomial-time algorithms exist. They require either exponential time or space.
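To make the point concrete, here is a minimal sketch (my own illustration, not from the post) of brute-force subset sum, a classic NP-complete problem: the obvious algorithm examines up to 2^n subsets, so each extra element doubles the worst-case work.

```python
from itertools import combinations

# Brute-force subset sum: try every subset of nums and look for one that
# hits the target. There are 2**n subsets, so adding one element to the
# input doubles the worst-case running time -- exponential, not polynomial.
def subset_sum(nums, target):
    for r in range(len(nums) + 1):
        for combo in combinations(nums, r):
            if sum(combo) == target:
                return combo
    return None

print(subset_sum([3, 7, 1, 8], 9))   # finds a subset summing to 9
```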
 

Howard

Lifer
Oct 14, 1999
47,982
10
81
I hope for a day when computers will be able to simulate (atom by atom) a virtual world exactly like this one. And have people plugged into them. And have safety devices to suppress people who cause disturbances...

 

ddviper

Golden Member
Dec 15, 2004
1,411
0
0
Another question: when do you think games will stop having these huge advances in graphics technology? There's only so real you can get using graphics; there's nothing more realistic than what a human can see (obviously).
 

Stas

Senior member
Dec 31, 2004
664
0
71
Howard

|I hope for a day when computers will be able to simulate (atom by atom) a virtual world exactly like this one. And have people plugged into them. And have safety devices to suppress people who cause disturbances...|

Or (!) you can go outside right now and enjoy the same thing in REALITY!

I've heard an interesting thought. Somebody said that as the GHz race slows down, so will the performance requirements for software. WOW! Maybe the programmers will finally BEGIN TO OPTIMIZE their programs!
 

Schmeh

Member
Jun 25, 2004
29
0
0
Originally posted by: ribbon13
Moore's law is dying because of the fundamental issue of heat...

Actually, Moore's Law is alive and well. You are making the common mistake of believing that Moore's Law says CPU speed will double every 18 months. In fact, Moore's Law says that the number of transistors on a CPU will double every 18 months. For example, Intel will go from roughly 450 million transistors (I believe, but am not positive, it may be less) on the current Itanium 2 to 1.8 billion transistors when they release Montecito.

As long as manufacturers are able to continue to shrink the process size that CPUs are made at, they should be able to keep Moore's Law alive for probably the next 5-10 years.
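The doubling arithmetic checks out; a quick sketch in Python (illustrative only, using the ~450 million figure quoted above):

```python
# Moore's Law as stated above: transistor counts double roughly every
# 18 months. Project forward from a given starting count.
def projected_transistors(start, months, doubling_period=18.0):
    return start * 2 ** (months / doubling_period)

# From ~450 million transistors, two doublings (36 months) gives ~1.8 billion,
# matching the Itanium 2 -> Montecito jump mentioned in the post.
print(projected_transistors(450e6, 36))
```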
 

WiseOldDude

Senior member
Feb 13, 2005
702
0
0
Not if we continue to run Micro$oft maloperating system on them.

Or after we have had them for a couple of months.
 

complacent

Banned
Dec 22, 2004
191
0
0
Originally posted by: eigen
Assuming P != NP, there will always exist problems for which no polynomial-time algorithms exist. They require either exponential time or space.

I fail to see what that has to do with the topic. NP problems do have solutions, and given enough processing power, they can be solved. Quantum computers will have the ability to factor large numbers (a classic hard problem). It doesn't mean that those problems become polynomial, but simply that we have enough power to solve NP problems in a reasonable amount of time.
 

complacent

Banned
Dec 22, 2004
191
0
0
Originally posted by: ddviper
another question, when do u think games will stop having these huge advances in graphic technologies? there is only so real u can get using graphics, theres no more realistic than what a human can see (obviously)

No. We aren't even close to this threshold. As long as graphics are represented by finite data, there will always be improvements to be made.
 

complacent

Banned
Dec 22, 2004
191
0
0
We are coming to a glass ceiling where we are unable to feasibly increase the clock speed of our processors. Even then, we can only make the process so small, and the speed of light is constant. There is definitely a barrier that we will never pass, even though it is far off in the future. Once we hit transistors that are one atom (theoretically possible, but never going to happen), that is it. We can't get smaller than that.

The next major boost in computing power will come from three sources: increased cache size/speed, multiple cores, and multithreading. Increasing cache size will help all programs. However, a multicore/multithreaded processor will only help applications that are written for concurrency. I'm sorry to say it, but Hyper-Threading on the Pentium 4 is nearly useless for day-to-day operations. Most programs we use today are not written with threads. There must be a paradigm shift in ALL of our programs (OS, word processor, games, etc.) for us to get any advantage out of parallel processing. Another problem is that not every problem lends itself to concurrent programming...
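As a toy illustration of the concurrency point (my own sketch, not anything from the post): extra cores do nothing for a program until its work is explicitly partitioned, e.g. splitting a sum across threads.

```python
import threading

# The "paradigm shift" in miniature: the work must be explicitly split
# into independent chunks before multiple cores can help at all.
# (In CPython the GIL limits real speedup for pure-Python math, but the
# structure is the same in languages where threads run truly in parallel.)
def parallel_sum(data, workers=4):
    results = [0] * workers
    chunk = (len(data) + workers - 1) // workers  # ceiling division

    def work(i):
        results[i] = sum(data[i * chunk:(i + 1) * chunk])

    threads = [threading.Thread(target=work, args=(i,)) for i in range(workers)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    return sum(results)

print(parallel_sum(list(range(1000))))  # same answer as sum(range(1000)): 499500
```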
 

complacent

Banned
Dec 22, 2004
191
0
0
Originally posted by: 0roo0roo
no, they are still retarded slow. things like voice commands? forget about it

I disagree. There is a product on the market called Powerscribe. It is made for physicians to use for digital dictation. It does real-time voice recognition as well as responds to commands. Granted, it is expensive, and the processing hardware comes in a handheld microphone for $500, but it is still available if you want it.
 

RelaxTheMind

Platinum Member
Oct 15, 2002
2,245
0
76
This stuff has been around for years, which backs up what complacent stated. It's mainly for disabled people, given that it costs around $3k for training and customizing alone.

On a more realistic note:

We could easily reach great speeds, as in the 10GHz range, as I remember Intel stating in an article I read about 3 years ago. But cooling that beast cost-effectively and keeping everything stable, as well as getting all the other components to catch up, is a long way down the road. Unless you want to steal a multimillion-dollar fiber optic processor from a weather station... hehe

For your audio question: ever use a dual G5 Mac? If you have ever used a Mac you will see why just about every music/film studio has one. We use a dual G4 in our studio and I can't even begin to compare the conversion speeds using QT. If you want instantaneous, get a solid state hard drive. Or try running 98 on a huge RAM drive. All of this to make up for our "retarded slow" hard drives.

I am not too savvy on Mac architecture, but I do know that they are coming to a unity in the bandwidth department; a lot of the CPU bottlenecks of PCs aren't as present in the Mac. Everything runs in unison, geared toward multimedia (not so in the gaming dept). But what do you expect when most of the components are made and designed by the same company? Most all the Mac components are handpicked PC components slapped with a cutesy sticker and branded with an outrageous price tag.

But this is where multicore cpus come in....
 