- Mar 16, 2005
Wait, these things turn off?
I've heard it's so! But any real user knows that night-time is Torrent-time/virus & spyware scan time/update time/defrag time/folding time/XviD conversion time...
It's not like the process just magically goes away. Windows isn't that efficient. While it may be minimal, there's still going to be some memory being taken up. Many PCs, like many built in 2001, that only have 128MB of RAM would greatly benefit from any increase in available memory.
Originally posted by: Fresh Daemon
Granted, the available memory is increased. What I've seen in all these tests is that there isn't any increase in performance because of this. The reasons for this have been gone over in this thread, but basically, Windows will page out an unused service so although it appears that more RAM is freed up, you aren't actually gaining anything since Windows would have freed the RAM anyway.
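Fresh Daemon's paging argument can be sketched with a toy model. This is a hypothetical least-recently-used eviction scheme, not how Windows' actual memory manager works, but it illustrates the claim: pages an idle service never touches are the first to be evicted once something else needs the RAM, so disabling the service "frees" memory the OS would have reclaimed anyway.

```python
# Toy model (hypothetical LRU eviction, NOT the real Windows memory
# manager): untouched pages get evicted first, so an idle service's
# pages leave RAM on their own once an active program needs the space.
from collections import OrderedDict

class ToyRAM:
    def __init__(self, capacity_pages):
        self.capacity = capacity_pages
        self.pages = OrderedDict()  # (owner, page) -> owner, oldest-touched first

    def touch(self, owner, page):
        key = (owner, page)
        if key in self.pages:
            self.pages.move_to_end(key)         # recently used: move to back
        elif len(self.pages) >= self.capacity:
            self.pages.popitem(last=False)      # full: evict least recently used
            self.pages[key] = owner
        else:
            self.pages[key] = owner

    def resident(self, owner):
        # Count this owner's pages still in physical memory
        return sum(1 for (o, _) in self.pages if o == owner)

ram = ToyRAM(capacity_pages=8)
for p in range(4):              # an idle service loads 4 pages...
    ram.touch("idle_service", p)
for p in range(8):              # ...then an active app touches 8 pages
    ram.touch("game", p)

print(ram.resident("idle_service"))  # 0: the idle pages were evicted anyway
print(ram.resident("game"))          # 8
```

In this toy run the idle service ends up with zero resident pages whether or not it was ever "disabled," which is the shape of the argument being made above.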
I doubt these synthetic tests would be accurate, especially since they aren't geared toward this specific variable.
This is why the benchmarks before and after tweaking, even when all available RAM is used up, don't show any real difference.
Sometimes that's the point; you don't want the app to run. Some of the functionality you're enabling would only be given to some remote attacker.
All that these tweaks do is remove the flexibility from the system; for instance, if you ever needed one of those services, Windows could load it into memory and it'd work. OTOH, if you'd disabled it, then whatever app needed that service just wouldn't run.
Now, I'm not too sure about how this is affected, but what if it was a multi-user environment, where people are logging on and off? Whatever processes are loaded then would certainly make a difference there, too.
That may well be true, but I don't find that to be any useful metric for computer performance anyway. Macs still sell despite the fact that MacOS X takes forever to start compared to XP. So does Linux. Most people here (and most computer users in general) boot and shut down their computers once a day, if that.
I don't know, I'm not sure of the details. But I've never seen a string of processes only taking up merely 1K each.
Originally posted by: BikeDude
For what? Given a process where all threads are blocked waiting for an OS function to return (be it WaitForSingleObject() waiting for a semaphore or a Read() from a socket, etc...) -- just exactly which parts of that process need to be kept in physical memory?
Okay, so you can't see the numerical difference between 28 and 60? I don't care about 160MB. Most don't have that amount (i.e., it's non-standard), which is incredibly better for XP than 128.
Originally posted by: Fresh Daemon
I tried this on a PC from 2000 with 160MB of memory and saw no improvement.
Uhh, what? I don't see how any of this relates to the topic. It isn't an argument to talk about things that are totally unrelated, and your pseudo-hypothesis is exactly that. Unless you've seen all of those seemingly unused processes take up only a few kilobytes collectively, you are simply wrong to think it will end up that way.
Why? Do you know this or are you just guessing that that's how it would work? Computers are very counter-intuitive, probably because the science they rest upon (quantum mechanics) is very counter-intuitive.
But the thing is, those benchmarks are only testing a few variables; not all of them, or even most. It could be completely overlooking the variables we really would like to know, or could be not extensive enough to show anything definitive.
I don't believe in running artificial benchmarks designed to test only one aspect of your system. If you use your computer to run artificial memory/disk benchmarks then I can't help you.
Since services usually don't take up much of anything when unused, and little paging happens when the program is already in use, it just doesn't make sense to bother with a single application, or one that merely does a limited action.
BV's site talks a lot about gaming, so I tested it mostly with games, and saw no improvement.
Uhh, you mentioned security yourself in the opening post. And part of your argument involves functionality. But it's rare that you would suddenly need to enable a service, and you can always enable it by going to the Services MMC. We aren't using 95 anymore; we don't need to reboot for simple changes. Plus, there's no reason to ignore security, it's an inherent topic for things like services.
But this isn't called Security-oriented Windows Tweaking, it's called Performance-oriented Windows Tweaking. If you want to discuss how services affect security, go run your own tests and start your own thread.
It doesn't matter, I was responding to your recent comments. That would mean either you yourself were off-topic, or are just avoiding the issue. And I don't really see how it's "beyond the scope", when it's directly related to performance (yes, booting is part of it, it's a fundamental factor).
Multi-user servers are beyond the scope of this article, and neither BV nor myself claim that this is applicable to networked environments.
No one even mentioned the article. Just what you've been saying in this thread.
Your criticisms seem to be aimed at the fact that this article does not cover things it never claimed to.
Also depends on the system setup and what program is being used. If I had more than the average amount of junk installed, and the OS was paging itself simply to open the Start menu, I'm not going to be properly testing things like that with some other program that's already in memory.
Originally posted by: Nothinman
The point is that the less memory the machine has, the more effect tweaking should have on performance.
Who said anything about noticing virtually nothing? And frankly, if you don't see the difference in a 128MB system by making some of these changes, then you really aren't fit and are too slow to be testing this stuff.
If you notice virtually nothing with 128M you're going to see even less if you have more memory.
But the thing is, I just said that.
But the thing is, benchmarks by their very nature only test a few specific variables.
But the thing is, those benchmarks are only testing a few variables; not all of them, or even most. It could be completely overlooking the variables we really would like to know, or could be not extensive enough to show anything definitive.
Well there certainly could be other tests that could get at least slightly more definitive results. Most of these benchmarks are trying to get the most out of the system, and it doesn't look like they would deliberately perform actions that would hinder performance, even though those things may be encountered in a real-life workload.
There's no real way to gauge how responsive a system 'feels' without just sitting down and using it for a few days or weeks.
I don't care about what BV's site is for. You can't accurately test something in that way because it could be totally overlooking the variable. Hell, even something like load times would be a much more realistic test for a game. Unless you're not meeting the minimum requirements for the game, something like page swapping probably won't be noticed when it's in use. And if that were the case, then maybe disabling a few services would actually make a difference.
BV's tweak site specifically targets tweaks for gamers, so it makes sense to test with common games to see if the tweaks have any real effect on gaming.
WOW!! All this time I kept playing with services. Sometimes I ended up crashing my computer or things not working because of it. Now I know not to even bother. Thanks!!!!
Very nice work there. I never had time to try these out, but I should someday.
what about using a ramdisk mapped as a drive letter as a pagefile? i did that for awhile and it seemed to work alright
Not a savings. In the DOS days it was, but now it removes memory that would more efficiently be used by the OS.
Originally posted by: karstenanderson
what about using a ramdisk mapped as a drive letter as a pagefile? i did that for awhile and it seemed to work alright
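The objection to a ramdisk pagefile can be put in rough numbers. The sizes below are hypothetical, just to show the circularity: the ramdisk is carved out of the same physical RAM the pagefile was supposed to supplement.

```python
# Back-of-the-envelope arithmetic (hypothetical sizes) for why a
# ramdisk pagefile is circular: reserving RAM for the ramdisk "drive"
# shrinks the pool of RAM the OS can actually use, and pages "swapped
# out" to it still occupy physical memory.
physical_ram_mb = 256
ramdisk_mb = 64                 # reserved up front for the ramdisk

usable_ram_mb = physical_ram_mb - ramdisk_mb
print(usable_ram_mb)            # 192: less RAM for the OS than before
```

It may "seem to work alright" simply because paging to RAM is fast, but the OS would have used that same RAM more flexibly as cache and working memory.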
Man, you're just a stupid fucking idiot. I don't know what's taking up memory, so that means it's not taking up memory? Hmm, must be more of your "counter-intuitive" logic. Either that or you're just a dumbass.
Originally posted by: Fresh Daemon
Right, well, when you know something you'll be qualified to talk about it, won't you?
I don't know, I'm not sure of the details. But I've never seen a string of processes only taking up merely 1K each.
In the meantime, you'll have to forgive me if I take my own hard evidence in preference to the conjecture and speculation of a self-confessed ignoramus.
Are you really that inept? Either you've responded without reading the whole post, or you're just that ignorant. Hmm, maybe because 128 is less than the recommended amount of memory you should have. Like I said before, at that point, the memory is the main bottleneck.
160MB is "incredibly better" than 128MB? Why?
I don't care about your faulty tests.
Let me ask you this: if I pulled a couple of 16MB SIMMs out of System I, do you think my results would be very different between tweak levels, bearing in mind that the tests maxed out well beyond the physical RAM capacity anyway?
And now you bring up enthusiasts. Well if you're only talking about "enthusiasts", why even mention the benchmarks you put up? Seems like it's rather off topic then, or "out of the scope" of the subject at hand.
And if you think yes, then let me ask you what the possible use is of a set of tweaks for enthusiasts that lose all value somewhere between 128MB and 160MB of physical RAM, in an age when any enthusiast has at least 512MB?
No you haven't. You just said that basically the whole process will be swapped out and never used again, when that clearly isn't true. You saying shit isn't evidence.
I've given you my evidence and I've explained why it works that way.
First of all, I didn't tell YOU anything on that matter. And I never said I made any guesses, nor have I made any.
You've told me that you're taking guesses without any evidence whatsoever.
Maybe just counter-intuitive. Ever think of that? Well if you did, you would be wrong, because it's counter-intuitive.
You are being illogical,
LOL. Saying that something (an unrelated something at that) is sometimes counter-intuitive doesn't argue anything. You have no backing for it, and are giving unrelated conjectures.
and it is that, rather than your failure to accept that things can be counter-intuitive, which makes you appear incorrect.
Seriously, what the fuck? Tests of variables? You've degraded into retardation.
I think you don't know what computers are used for! We don't run tests of variables on them,
God damn you are so slow. You totally miss my point. Sure, you run software, but it's not as simple as just running it.
we run software which we use to get our work done or play games. It does not matter how greatly something produces an effect at a very low level if it doesn't affect actual programs.
Test it? Do you even know what I was saying? Why would I test it when I just said it's something that most tests don't consider?
Again, I'm not particularly interested in your guesswork. Test it yourself, you'll find that your results don't confirm your preconceptions.
Since services usually don't take up much of anything when unused, and little paging happens when the program is already in use, it just doesn't make sense to bother with a single application, or one that merely does a limited action.
Serious breach of security? LOL. Yeah, ok.
I mentioned security in passing because one of BV's tweaks represented a serious breach of security.
That wasn't "in passing". It was your way of making a rebuttal. Hardly "passing". And what "serious impact on functionality" was there? If you don't need the service, the only "functionality" you would be losing is, at most, not having to re-enable a service. And normally you don't even have to do anything when it's set to manual.
I mentioned functionality in passing because one of his tweaks produced a serious impact on functionality.
Scientific ones? What in the hell are you talking about?
There are loads of articles and guides on Windows security and services. There are no scientific ones on performance-oriented Windows tweaking. Bearing that in mind, I tested the latter, not the former.
Huh? LOL, why the hell would you say that? I'm the one saying that you're overlooking it and don't know what you're talking about. And you don't. I'm not the one in need of some learning on that subject.
If you want information on security and Windows services, just Google it.
Wow. Way to make an absurd analogy that is completely dissimilar to what was being discussed. While it may not be about gaming, logging on and off, or restarting the computer, is part of the overall performance of the PC, and may matter to people. How is it so absolutely off point to mention that like you're trying to make it out to be?
It's beyond the scope in the same way that strapping rocket boosters to a car is beyond the scope of a performance tuning magazine, although rockets are technically a performance-increasing modification like new heads or spray.
And I don't really see how it's "beyond the scope", when it's directly related to performance (yes, booting is part of it, it's a fundamental factor).
This thread is you trying to counter his advice, and give your own.
The thread is about the article. :roll:
Wow, talk about stupid things to say. Other than the retarded Superman comment, YOU DIDN'T TEST ON A 128MB SYSTEM. I doubt you even have the wits to notice the small differences anyway. It's like unless there's some benchmark showing some number, you can't tell how a system is running.
This is a really stupid thing to say. Are you telling me you can see a 0.3fps difference, and that people who don't are unfit and too slow? Who are you, Spiderman?
And frankly, if you don't see the difference in a 128MB system by making some of these changes, then you really aren't fit and are too slow to be testing this stuff.
So you resort to a test that doesn't even test the change you are making?
I dislike things that I would have to time with a stopwatch.
Hell, even something like load times would be much more realistic of a test for a game.
That's for sure.
My own human eyes and hands are too fallible to produce reliable readings.
Everyone should be doing that anyway.
I may do it later anyway, but take my results with a huge pinch of salt.
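For what it's worth, the stopwatch objection is easy to work around with a short script. The sketch below is a rough illustration, not a proper benchmark; the command being timed is a stand-in to be replaced with the launcher for whatever game or app you actually care about.

```python
# A crude launch-time "stopwatch": run a command several times and
# report the spread, so nobody has to rely on fallible eyes and hands.
# The command below is a placeholder; substitute the real app to test.
import statistics
import subprocess
import sys
import time

def time_launch(cmd, runs=5):
    samples = []
    for _ in range(runs):
        start = time.perf_counter()
        subprocess.run(cmd, check=True)          # wait for the process to exit
        samples.append(time.perf_counter() - start)
    return min(samples), statistics.median(samples), max(samples)

# Placeholder workload: launching the Python interpreter itself.
lo, med, hi = time_launch([sys.executable, "-c", "pass"])
print(f"min {lo:.3f}s  median {med:.3f}s  max {hi:.3f}s")
```

Run it once with services enabled and once with them disabled; if the before/after spreads overlap, the tweak made no measurable difference to load times.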
People who don't know what they're doing shouldn't be tampering with things like services anyway, unless it's for learning purposes.
Well, you just made all of this work worth it! I managed to save one person from instability in the name of unrealizable performance gains. Kudos to you, mate.
WOW!!. All this time, I keep playing with services. Sometimes, I end up crashing my computer or things not working because of it. Now I know, not to even bother. Thanks!!!!
Originally posted by: Fresh Daemon
Actually, it is. Not running any kind of firewall or antivirus will get you the Sasser worm, MyDoom or any of the other horrible things that are out there.
First of all, antivirus products don't really do much. Since they (usually) only protect against known threats, they are little more than a safety net in case you actively try to run infected executables (protection by blacklists). The performance degradation when running certain AV products can easily outweigh the danger of infection -- at least for experienced users.
Originally posted by: Nothinman
First of all, antivirus products don't really do much. Since they (usually) only protect against known threats, they are little more than a safety net in case you actively try to run infected executables (protection by blacklists). The performance degradation when running certain AV products can easily outweigh the danger of infection -- at least for experienced users.
Considering that most of the worms in the wild exploit holes that have had patches out for months, it makes sense to run an antivirus unless you're absolutely sure all of the email you're receiving and webpages you're visiting are clean.
Originally posted by: gsellis
Bump - I thought the alarm clock was a sticky.
Originally posted by: BikeDude
Last, but not least: although a firewall is nice and all, if you're patched up with SP2 + hotfixes, the known security holes are eliminated, so both Sasser and MyDoom will fail to affect you.
I guess what you meant to say is that a firewall will protect against hitherto undiscovered security holes. (as well as weak passwords in case you enable file sharing -- although it has to be said that being able to access one's own rig using the Internet would be neat...)
What have you been reading? God damn you are so effing slow. Please, point out where that was said.
Originally posted by: Fresh Daemon
No, if you don't know whether something is taking up memory that means you can't say that it is.
Ugh. Who mentioned Microsoft? So you would actually recommend 128 as the minimum? Man, you are just out of your mind. 64MB of memory would absolutely CRIPPLE a system.
Actually, no, 128MB is the recommended amount. 64MB is the minimum. Microsoft
Nice argument. You're making up crap and then asking me for proof. How can you prove that EVERY SINGLE LAST MEGABYTE is being swapped out? Hmm, Task Manager doesn't reflect that, and neither do any informational sources.
Why isn't it true? What proof do you have?
You just said that basically the whole process will be swapped out and never used again, when that clearly isn't true.
!!! NEWS FLASH !!! YOU AREN'T TESTING MORE THAN A FEW SELECT THINGS IN YOUR TESTS.
No, that was you who wanted to test variables rather than the performance of the system as a whole.
LOL. Scientific method? LOLOLOLOL. The scientific method IS SUPPOSED TO TEST THE VARIABLE. That's how the scientific method works, genius. Unless you're specifically seeing if it alters these unrelated things, you aren't going to get good results.
If we do X, and it produces no results in our tests, let's modify and alter the test conditions until we do see some results.
That is not scientific method, that is the fallacy of assuming your conclusion.
Uhh, actually no, it's not a "security breach". And after you're done looking up "breach" in the dictionary, please tell me who said anything about disabling firewalls or anti-virus software.
Actually, it is. Not running any kind of firewall or antivirus will get you the Sasser worm, MyDoom or any of the other horrible things that are out there.
Uhh, again, no. That's mainly because of the vulnerability in SP1. It's not like it's guaranteed that you'll get some kind of virus "within minutes", especially if you have your system set up right and know exactly what is running.
Here's a test for you: disable all firewalls and all antivirus software, and time how long it takes before your computer begins to reset itself without your permission. That's a worm. This was why, before they added the firewall to WinXP, it was recommended to install with the network cable unplugged until you had installed your firewall and antivirus software. Usually your system would be compromised within minutes.
Your memory is absolutely terrible. And I just mentioned it again above, so I won't repeat that part.
How?
And by the way, what you mentioned is also bad for security, so I commented on it.
That's "petty".
That's "rebuttal".
You did? Where? I don't see it anywhere in this thread. Try again.
I noted that Speedfan, which is a very popular program amongst enthusiasts, would no longer install. This is one example.
LOL. So "scientific articles" means the scientific method? Hmm, I don't quite get that.
Scientific method. Look it up.
Scientific ones? What in the hell are you talking about?
You don't read textbooks? Many have mentioned these things, by many authors. I have mainly Course/Thomson Learning books, and several Microsoft- and CompTIA-certified ones mention these things. Off the top of my head: the A+ Software/Hardware ones, the various Windows ones, and the Security+ one.
What are their names, who are their authors and who published them?
It's relative, jackass. If DOS didn't have anything I needed, performance wouldn't matter. But if I had a server that took 15 minutes to start up, that could be a problem of availability. So I would want good boot performance, or at least the best I could make it. There's no reason to just completely ignore it as a factor.
No, it isn't. If you think it is, go run DOS. That'll start up, log on and off and restart faster than you could ever believe. Ergo, DOS must have great performance, right? And MacOS X and Linux must suck, because they are slow to boot and restart, right?
Holy shit. I covered this a few times already. I don't care about your faulty tests, and I don't care about gaming performance, because it doesn't really apply. The differences are only noticeable under certain circumstances, which could be avoided by some tweaking.
It needs superhuman wits to notice 0.3fps. For all your blustering, even you, Spiderman (or Superman, since you have trouble reading), couldn't spot that in a double-blind test.
I'm sorry, I'm paying more attention to the real discussion than your petty offhanded remarks that only make you look foolish.
For all your blustering, even you, Spiderman (or Superman, since you have trouble reading),
I highly doubt you never use anything but games and benchmarking programs, and never boot your PC.
I resort to a test that measures what I actually want to use the computer for.
Wow. Terrible analogy. The only thing remotely close to this situation is how these tweaks won't make much, if any, difference in something like a game (one that's already open and loaded in memory). The rest of what you said is just garbage. These are simple tweaks for OVERALL performance. If it means a faster boot, that's a plus; if it means hard-paging is slightly faster, that's a plus; if it means I won't get a slowdown because a networking service is waiting for a response it won't get, that's a plus. Freeing a few megabytes of RAM and lowering the number of items in the process list is also a plus.
Here's another car analogy. I want to make a car handle really well, so you say, "Hey, go install a bigger engine, and a turbo, and soften the rear suspension." I try it out, and I say, "This handles worse than before!" To which you reply, "No, it accelerates better. Try accelerating."
Maybe it will - but it's not relevant to me!
Maybe if you weren't such a whiny little bitch, that wouldn't happen.
Everyone is invited to run their own tests. So far no competing figures have been posted and the only people who disagree with me are foul-mouthed people like yourself who can't seem to make three posts without getting banned.