Okay, so it sounds probable that the noise I heard was from clipping.
Now I'm curious: how likely is it that I have damaged the speakers? They have only clipped for a total of about two minutes since I got them, as I always turn them down very shortly after hearing the clipping.
Is...
A week ago, I received my Audioengine A5+'s, and while I'm very pleased with them 99.9% of the time, I have a small problem with them.
When playing certain types of songs at an especially high volume, the sound becomes distorted. The distortion is a sort of crackling noise that...
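For anyone wondering what clipping actually does to the signal: when the amplifier is driven past its limit, the peaks of the waveform get flattened, and those flat tops are what you hear as crackling. A minimal sketch (the 440 Hz tone and amplitude values are just illustrative, not anything specific to the A5+):

```python
import numpy as np

# One second of a 440 Hz sine at an 8 kHz sample rate, driven too hot:
# its amplitude (1.5) exceeds the amplifier's output range of +/-1.
sample_rate = 8000
t = np.arange(sample_rate) / sample_rate
signal = 1.5 * np.sin(2 * np.pi * 440 * t)

# Hard clipping: anything beyond the limit is flattened to +/-1.
# The flattened peaks add harsh harmonics -- the crackling distortion.
clipped = np.clip(signal, -1.0, 1.0)

print("samples clipped:", int(np.sum(np.abs(signal) >= 1.0)))
```

Turning the volume down so the signal stays inside the +/-1 range is exactly what stops the flattening, which matches turning the speakers down when the crackle appears.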
I'm just wondering if, in 2+ years, games will come out that make use of more than 4 cores, in which case HT would give large gains.
If that happens, I'll be kind of disappointed that I got the 3570k.
Thanks for all the replies.
So how likely is it games will make use of 8 threads within the next couple of years?
edit: yes, the 2600k/3770k have 2 MB more L3 cache. Not sure if this will have any effect on the majority of games.
So it looks like the 3770k will be £80 more than the 3570k.
As far as I know, the only difference between the chips is Hyper-Threading. So basically I'm asking myself whether I should pay £80 for Hyper-Threading.
Now, I know it's currently useless for gaming, but what about in the future? In 2...
Every report I've seen suggests you can get up to 4.5 GHz on stock volts, but beyond that you have to accept relatively large jumps in voltage and heat to get higher clocks.
I'm interested to see if the above is wrong, though.
Did you ignore the second graph?
I will ignore the first question, you're obviously not looking for a serious answer.
Because the 2500k does not support PCIe 3.0.
680 SLI for 5760x1080
680 x4 SLI
Neither of the graphs are mine, but I know where I found them. If you want me to direct you where I found them I can.
The first graph was not well presented, but it still shows PCIe 3.0 being incredibly good for multi-GPU, multi-monitor setups.
Although I agree that someone buying a new build always gets more for their money than someone upgrading from previous-generation components, there is a case to be made for upgrading.
But only for a specific type of person.
People who have multi-monitor, multi-GPU setups (say...
PCIe 3.0 is looking to be really important for multi-monitor, multi-GPU setups, so if an SB chip will block me from using the PCIe 3.0 functionality on my Z77 board, I will definitely not be buying SB.
Thanks.
edit: found answer, no need for replies unless you think you could provide helpful additional...
Thanks for pointing that out to me, Akantus.
Yes, that's why I said what I did. A 3570k is more comparable to a 2500k, as neither has HT, whereas a 2600k does. But as Akantus pointed out to me, HT is turned off.
Does the 2600k have hyperthreading turned off? I thought with the original i7s at least, hyperthreading added considerable heat.
If hyperthreading isn't turned off, aren't we comparing two different classes of CPU? It's an i5 vs an i7, after all.
A Sandy Bridge CPU in a Z77 board won't support PCIe 3.0?
If that's true, I won't be buying a Sandy Bridge CPU after all.
EDIT: Question solved. I found that SB does not support PCIe 3.0, and I would be stupid to deny myself PCIe 3.0 given the graphs I've seen of its impact on multi-monitor, multi-GPU setups...
I'm in the middle of building a computer and have all my parts besides the CPU, which I'm now carefully considering.
Since this is a gaming PC and I don't use many productivity apps, I won't really benefit from Hyper-Threading, so an i7 is a waste for me, and I will be going with an i5...
I haven't read reviews; most of this is from hearing an Nvidia representative talk about it in a video. From what he was saying, it seemed like GPU Boost would activate mostly in less demanding scenes.
Please correct me if I am wrong.
Nvidia GPU Boost seems to give you a dynamic overclock when you LEAST need it.
For example: in a game averaging 60 FPS, the card will struggle in some scenes (30 FPS) and thrive in others (90 FPS); it seems the dynamic overclock will...
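The way I understand the behaviour, the boost is power-limited: the card only raises its clock when there is power headroom, which tends to be in the light scenes where it's already fast. A toy sketch of that logic (not Nvidia's actual algorithm; the clock and power numbers are illustrative, loosely based on the GTX 680's 1006 MHz base clock):

```python
# Toy model of a power-limited dynamic boost.
BASE_CLOCK = 1006   # MHz (GTX 680 base clock)
MAX_BOOST = 1110    # MHz, illustrative ceiling
POWER_LIMIT = 195   # watts, illustrative board power limit

def boost_clock(scene_power_draw):
    """Return the clock the card would run at for a given power draw."""
    if scene_power_draw >= POWER_LIMIT:
        return BASE_CLOCK  # demanding scene: no headroom, no boost
    # Light scene: spend the spare watts on extra clock (linear toy model).
    headroom = (POWER_LIMIT - scene_power_draw) / POWER_LIMIT
    return min(MAX_BOOST, int(BASE_CLOCK + headroom * (MAX_BOOST - BASE_CLOCK)))

print(boost_clock(200))  # heavy 30 FPS scene: stays at base clock
print(boost_clock(120))  # light 90 FPS scene: boosts, when you least need it
```

Under this model the 30 FPS scenes, where you'd actually want the extra clock, are exactly the ones already sitting at the power limit, which is the complaint above.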
1. Nvidia
2. Performance, price is important too though.
3. No, it's a waste of money. Only upgrade if you'll see a big enough improvement over what you have.
4. Yes, all the time. Anandtech, Techpowerup, Hexus are probably my favorite sites for this.
If I told you I'd like to know in regards to Oblivion, would that help?
And are you saying x2 SSAA is better than x8/x16 MSAA in almost all cases, depending on the scene content?
So it's not as perfect as vsync is for single GPU setups? What's the compromise?
Is the stutter for multi-GPU setups with vsync on as bad as for single-GPU setups with vsync off?
I'll start reading that article now.