NVidia CUDA video application roundup

Dec 30, 2004
Yeah, conclusion says things like "pretty good video quality" and "video quality lacking"

We need to be able to set the exact same settings in the gpu-based encoding application.
In all likelihood there are lots of pixels being cut in the gpu encoded video that aren't being cut in the cpu-encoded one.
In all likelihood, the performance improvement wouldn't be nearly as great if it were transcoding a 1920x1080p video at the highest settings, i.e. you wouldn't want to use this for anything but your iPod. But when you're working at that low a resolution, any dual core is enough anyway.

At this point it's a cool gimmick-- here's to DX11 and what it can bring :beer:.
 

thilanliyan

Lifer
Jun 21, 2005
Those video artifacts are why I never use any GPU transcoding. I tried it a while back with the AVIVO transcoder and quality was not great...and I've seen the same sort of things recently.
 

Keysplayr

Elite Member
Jan 16, 2003
Loiloscope:
"Transcoding to a PlayStation 3 really throws CUDA's capabilities when coupled with a fast GPU into the limelight, with LoiLoScope capable of transcoding as much as 200% faster than a Core i7 940 in this scenario, making a huge difference to transcoding times without affecting the final video output quality at all (with LoiLoScope giving decent end results throughout)."

Badaboom:
"If you're an advanced user who wants more control over the application's video output, then Badaboom gives you a fair number of options to control how the video is encoded and what level of output quality you are looking for (although this is controlled by a slider, which might not be as precise as some would like it to be). It has to be said that the video quality output by Badaboom at its default settings is pretty terrible most of the time, so chances are you'll want to at least move that slider up a fair way before you start transcoding anything."

Nero Move It:
"If none of those device profiles suit your needs, or you want more control over the application's output, then it's simple to create user defined profiles with plenty of options to choose from to match your requirements. Overall, we found video quality to be pretty decent; better than Badaboom at its default settings and more or less on a par with Cyberlink's MediaShow Espresso."

Expresso:
"If none of that fits your needs, you can also choose your own custom transcoding requirements, while CUDA support is featured via a simple checkbox when you're choosing your encoding parameters. Out of all of the applications we've tested for video transcoding here, MediaShow Espresso is perhaps the easiest to use while also providing the best out of the box video quality, making it a worthy consideration for purchase in our eyes."

Reveal:
No comment on quality

Power Director:
No comment on quality

ArcSoft:
"We can see such a comparison mode in action above while playing back Final Fantasy VII: Advent Children, while we've also switched on the application's statistics to show playback framerate (to help gauge whether the system is keeping up with upscaling the picture). Unfortunately, it's difficult to really appreciate how well the upscaling works without the ability to take screen grabs, so you'll just have to take my word for it that the upscaling does a great job on standard DVDs, leaving you with a cleaner, sharper and generally better looking image."

It would seem that it is assumed that GPU encoding is only faster because it produces lower quality video than a CPU? Not rendering all pixels? I'd like to see on what basis this idea has manifested itself. Are all CPU encoded videos perfect? Or do they themselves require tweaking of various options in the application?

What is good for one is also good for the other. And besides, the CPU cycles spared by using the GPU in some of those apps are astounding. A few utilize high CPU cycles, but I suppose those would be the apps we would stay away from.
 

thilanliyan

Lifer
Jun 21, 2005
Originally posted by: Keysplayr
It would seem that it is assumed that GPU encoding is only faster because it produces lower quality video than a CPU? Not rendering all pixels? I'd like to see on what basis this idea has manifested itself. Are all CPU encoded videos perfect? Or do they themselves require tweaking of various options in the application?

What is good for one is also good for the other. And besides, the CPU cycles spared by using the GPU in some of those apps are astounding. A few utilize high CPU cycles, but I suppose those would be the apps we would stay away from.

There are usually lots of things you can tweak with CPU encoding...I usually leave everything at default for something like Ripbot264 if I'm encoding something for PS3 and I don't get any artifacts when starting with a decent quality source.
 

Forumpanda

Member
Apr 8, 2009
I don't consider CPU usage a metric to measure GPU encoding applications by. I want my application to use all system resources to encode as fast as possible, so if it can use the CPU to improve encoding speed then I am all for it.

The real problem I think is that there isn't really a standard speed test for video encoding, there are so many different ways of tweaking the encoding that every application gives a different output.

If only there was a way to specify the output and then compare the encoding applications by speed then I would actually read one of these comparison articles.

What is the value in comparing applications by quality? That only makes sense if you set each application to its maximum possible quality; otherwise you are just comparing which developer decided to use the more resource intensive default settings, which doesn't really show anything.

Without fixing either encoding speed or output quality and comparing the other, all that is really being compared is different applications doing different work.
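To make the "fix the output, compare the speed" idea concrete: one way to pin encoders to the same output quality is a distortion target like PSNR. Here's a rough pure-Python sketch (frames as flat pixel lists; everything here is illustrative, not any reviewer's actual methodology):

```python
import math

def psnr(original, encoded, max_val=255):
    """Peak signal-to-noise ratio between two frames, given as flat
    lists of 8-bit pixel values. Higher means closer to the source."""
    mse = sum((a - b) ** 2 for a, b in zip(original, encoded)) / len(original)
    if mse == 0:
        return float("inf")  # identical frames
    return 10 * math.log10(max_val ** 2 / mse)

# Lock every application to the same target PSNR on a reference clip,
# and the only remaining variable is how fast each one gets there.
```

With quality held constant like this, a speed chart would actually mean something.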
 

Keysplayr

Elite Member
Jan 16, 2003
Originally posted by: Forumpanda
I don't consider CPU usage a metric to measure GPU encoding applications by. I want my application to use all system resources to encode as fast as possible, so if it can use the CPU to improve encoding speed then I am all for it.

I would. I use my PC for multiple tasks at the same time. So, I would consider CPU usage as a metric. Some click "encode" and go out to dinner, or see a movie. Others do not, and wish to engage in other tasks. So, if you want those extra CPU cycles, they are available. If you don't, you're not losing out when encoding is so fast anyway.
 

Keysplayr

Elite Member
Jan 16, 2003
Originally posted by: thilan29
Originally posted by: Keysplayr
It would seem that it is assumed that GPU encoding is only faster because it produces lower quality video than a CPU? Not rendering all pixels? I'd like to see on what basis this idea has manifested itself. Are all CPU encoded videos perfect? Or do they themselves require tweaking of various options in the application?

What is good for one is also good for the other. And besides, the CPU cycles spared by using the GPU in some of those apps are astounding. A few utilize high CPU cycles, but I suppose those would be the apps we would stay away from.

There are usually lots of things you can tweak with CPU "and GPU encoding"...I usually leave everything at default for something like Ripbot264 if I'm encoding something for PS3 and I don't get any artifacts when starting with a decent quality source.

Fixed ^ . When default settings produce less than desirable results, what do you do? Just leave it? Or try and make it better? I'm pretty sure that all these applications will be WELL messed with until a user finds a particular set of tweaks they find best for them.

 

akugami

Diamond Member
Feb 14, 2005
Originally posted by: Keysplayr
Originally posted by: Forumpanda
I don't consider CPU usage a metric to measure GPU encoding applications by, I want my application to use all system resources to encode as fast as possible, so if it can use the CPU to improve encoding speed then I am all for it.

I would. I use my PC for multiple tasks at the same time. So, I would consider CPU usage as a metric. Some click "encode" and go out to dinner, or see a movie. Others do not, and wish to engage in other tasks. So, if you want those extra CPU cycles, they are available. If you don't, you're not losing out when encoding is so fast anyway.

I'm with Forumpanda in that if I am encoding/transcoding a video, and provided GPU assisted encoding does a good job (which I'm sure it will in the future), I'd like to have it use as much of my system resources as possible so the task gets completed ASAP. Especially if I'm running a batch of video clips off my AVCHD camcorder.

Most users who care purely about speed and don't worry about a computer being bogged down usually have a secondary computer. I'm also of the opinion that anyone who really cares about GPU assisted video encoding and transcoding likely cares more about speed than low CPU usage.

I think the key is in flexible settings. I know not everyone has multiple computers so give the user the option of having the app limit itself so that it doesn't hog all the system resources or the option to go all out in an attempt to finish the task in the shortest time possible. Best of both worlds.
 

Keysplayr

Elite Member
Jan 16, 2003
Really? How long have you guys been together?

Seriously though, there will be tens of thousands just like you. Just as there will be tens of thousands who aren't.
 

Forumpanda

Member
Apr 8, 2009
But multithreading is a very well implemented part of any OS; it is not hard to let all other applications take priority over the encoding and not feel it running at all, regardless of how much 'CPU time' it actually utilizes.
Especially with basically everyone running at least a dual core CPU, if it's purely a choice between fast and using the CPU or slower and not, then I don't see any reason to make it slower.

I really can't see how you can have a different stance, it is trivial to let programs take CPU priority over encoding.

The much bigger system clog (for application response time) when it comes to encoding is disk access, which neither approach does much for, but flash drives eventually will.
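For what it's worth, the priority trick doesn't even need Task Manager. A wrapper can launch the encoder pre-niced, something like this sketch (the ffmpeg command is just a stand-in example, not what any of these apps actually do):

```python
import subprocess

def low_priority(cmd, niceness=19):
    """Prefix a command with POSIX `nice` so the scheduler only gives it
    leftover cycles; foreground apps preempt it freely."""
    return ["nice", "-n", str(niceness)] + list(cmd)

job = low_priority(["ffmpeg", "-i", "input.avi", "output.mp4"])
# subprocess.run(job)  # uncomment to actually launch the encode
```

The encode then soaks up idle cycles at full tilt but gets out of the way the moment you do anything interactive.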
 

Hacp

Lifer
Jun 8, 2005
Originally posted by: Keysplayr
Really? How long have you guys been together?

Seriously though, there will be tens of thousands just like you. Just as there will be tens of thousands who aren't.

So I'm guessing GPGPU, or at least Nvidia's implementation, isn't ready for the masses.
 

Keysplayr

Elite Member
Jan 16, 2003
Originally posted by: Forumpanda
But multithreading is a very well implemented part of any OS; it is not hard to let all other applications take priority over the encoding and not feel it running at all, regardless of how much 'CPU time' it actually utilizes.
Especially with basically everyone running at least a dual core CPU, if it's purely a choice between fast and using the CPU or slower and not, then I don't see any reason to make it slower.

I really can't see how you can have a different stance, it is trivial to let programs take CPU priority over encoding.

The much bigger system clog (for application response time) when it comes to encoding is disk access, which neither approach does much for, but flash drives eventually will.

You can't see how I have a different stance? Hookay, here goes.

Since the dawn of multicore processors, how many people have enjoyed much smoother multitasking? Since the dawn of GPU assisted video playback (AVIVO/PureVideo), how many people have enjoyed lower CPU utilization when the workload was offloaded to the GPU, freeing up CPU time for other tasks? How many people watch a movie and browse the web at the same time? Plenty, as indicated by numerous threads on this very subject. And now, finally, we have REAL TIME video encoding apps that reduce CPU utilization to almost NIL. What is not to like?

So you can see why I cannot understand your stance on this. For the past few years, people have been complaining that with AVIVO and PureVideo, CPU cycles were still too high. Everyone wanted to see less and less CPU usage for these everyday tasks. And now, they are getting it in droves.

And, if it didn't matter to anyone, why show the charts in this review comparing CPU utilization with and without offloading to the GPU? It obviously matters to someone, because they made the graphs. And saying that people have more than one computer at their disposal is totally unrealistic. The average user has just one computer, not a farm to pick and choose from when one rig is busy. Only us folks in these types of forums usually have multiple rigs.
 

Keysplayr

Elite Member
Jan 16, 2003
Originally posted by: Hacp
Originally posted by: Keysplayr
Really? How long have you guys been together?

Seriously though, there will be tens of thousands just like you. Just as there will be tens of thousands who aren't.

So I'm guessing GPGPU, or at least Nvidia's implementation, isn't ready for the masses.

Interesting. How did you come to that guess? Whether users are ready for it or not, it's available to them. And it will be thrust upon them with Windows 7, I can assure you.
 

vj8usa

Senior member
Dec 19, 2005
Originally posted by: Keysplayr
Originally posted by: Forumpanda
But multithreading is a very well implemented part of any OS; it is not hard to let all other applications take priority over the encoding and not feel it running at all, regardless of how much 'CPU time' it actually utilizes.
Especially with basically everyone running at least a dual core CPU, if it's purely a choice between fast and using the CPU or slower and not, then I don't see any reason to make it slower.

I really can't see how you can have a different stance, it is trivial to let programs take CPU priority over encoding.

The much bigger system clog (for application response time) when it comes to encoding is disk access, which neither approach does much for, but flash drives eventually will.

You can't see how I have a different stance? Hookay, here goes.

Since the dawn of multicore processors, how many people have enjoyed much smoother multitasking? Since the dawn of GPU assisted video playback (AVIVO/PureVideo), how many people have enjoyed lower CPU utilization when the workload was offloaded to the GPU, freeing up CPU time for other tasks? How many people watch a movie and browse the web at the same time? Plenty, as indicated by numerous threads on this very subject. And now, finally, we have REAL TIME video encoding apps that reduce CPU utilization to almost NIL. What is not to like?

So you can see why I cannot understand your stance on this. For the past few years, people have been complaining that with AVIVO and PureVideo, CPU cycles were still too high. Everyone wanted to see less and less CPU usage for these everyday tasks. And now, they are getting it in droves.

And, if it didn't matter to anyone, why show the charts in this review comparing CPU utilization with and without offloading to the GPU? It obviously matters to someone, because they made the graphs. And saying that people have more than one computer at their disposal is totally unrealistic. The average user has just one computer, not a farm to pick and choose from when one rig is busy. Only us folks in these types of forums usually have multiple rigs.

But why wouldn't you at least want the option to let the CPU speed things up? If you're encoding with a GPU and it's using no CPU but still going fast, wouldn't you want it to use the CPU and go even faster? Forumpanda was just saying that multitasking isn't really a concern, as you can set the encoding at lower CPU priority so that your multitasking won't get slowed down, but the encoding will still benefit from CPU cycles that'd otherwise go unused. It's a win-win. You mention "browsing the web" - how often does browsing the web use up 100% of a modern CPU? Realistically, never. I'd want whatever was "left over" in terms of CPU power to go towards speeding up the encoding, instead of going to waste.
 

Keysplayr

Elite Member
Jan 16, 2003
Do you want to micro-manage your PC? Live in task manager to set priorities and affinities? And as for "browsing the web", pick one or two of another 10,000 things you can do with a PC. I'd thought that everyone would just pick up on it as a single example. Thought wrong. hehe. How about encoding video while watching a movie. Ripping MP3's from your CD's. Running Mat-Lab, running compilers, etc. etc. If none of these things used much CPU utilization, why then did so many users want the cycles offloaded to the GPU?
Having the CPU contribute its power to encoding is really a good thing, but this just shows that nothing can ever make everyone happy. For the past few years, people have been on the lookout for the GPUs that offload the largest percentage of CPU utilization. Now that's here, and wah. Why can't the CPU contribute! It's always a lose-lose for devs, isn't it.
 

Forumpanda

Member
Apr 8, 2009
But it wouldn't really be all that difficult to program the encoding app to set itself to a lower priority; we are talking about what we *want* here.

You are just being silly, grumpy and stubborn, good day to you.
 

TheRyuu

Diamond Member
Dec 3, 2005
Originally posted by: wolf2009
flawed, what about the output quality of the video CPU vs GPU ?

CPU encoding (with x264) is superior, even with relatively fast settings, to any GPU encoding available right now.

Most GPU encoders that are out pretty much suck.
But we have to keep in mind that GPUs were not designed to encode video. Video encoding is integer math; GPUs were designed for floats. Which is why your CPU is going to be your main friend in video encoding for a while still.

Just have to wait and see how gpu encoding evolves.

I suppose one of the better approaches would be to offload parts of the encoder to the GPU (i.e. motion estimation), thereby freeing up time on the CPU to do more of the other things.
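The integer-math point is easy to see in motion estimation's inner loop: the cost metric encoders evaluate over and over is a sum of absolute differences between pixel blocks, all integer ops. A toy sketch (block size and values made up for illustration):

```python
def sad(block_a, block_b):
    """Sum of absolute differences: the integer cost metric an encoder
    evaluates thousands of times per frame while searching for motion."""
    return sum(abs(a - b) for a, b in zip(block_a, block_b))

reference = [10, 12, 14, 16]   # pixels from the reference frame
candidate = [11, 12, 13, 18]   # pixels at one candidate motion vector
cost = sad(reference, candidate)  # 1 + 0 + 1 + 2 = 4; lowest SAD wins
```

That search is also highly parallel across candidate vectors, which is exactly why it's the part people talk about offloading to the GPU.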
 

Keysplayr

Elite Member
Jan 16, 2003
Originally posted by: TheRyuu
Originally posted by: wolf2009
flawed, what about the output quality of the video CPU vs GPU ?

CPU encoding (with x264) is superior, even with relatively fast settings, to any GPU encoding available right now.

Most GPU encoders that are out pretty much suck.
But we have to keep in mind that GPUs were not designed to encode video. Video encoding is integer math; GPUs were designed for floats. Which is why your CPU is going to be your main friend in video encoding for a while still.

Just have to wait and see how gpu encoding evolves.

I suppose one of the better approaches would be to offload parts of the encoder to the GPU (i.e. motion estimation), thereby freeing up time on the CPU to do more of the other things.

Because this is what LoRd MuldeR says? You almost got him word for word. Unless this is you?
Forum post and link to Wiki.

I thought this was only true for DX9 and older specs. DX10 hardware allows for float2int and int2float ops. Incorrect? And I thought it was also true that 8800 and greater GPUs could do integer math. At least 8-bit, which would slow it down, but it doesn't mean it cannot run integer math. Am I wrong here?
 

WelshBloke

Lifer
Jan 12, 2005
Do you want to micro-manage your PC? Live in task manager to set priorities and affinities?

Why would you need to? Plenty of apps set their priority low if wanted.

How about encoding video while watching a movie. Ripping MP3's from your CD's. Running Mat-Lab, running compilers, etc. etc.

If you're doing all that concurrently then you would, surely, want your computer to use all its resources.

If none of these things used much CPU utilization, why then did so many users want the cycles offloaded to the GPU?

Do users want that? Reviewers do, because it's an easy way to see if the GPU is doing any work. I'd think that users wouldn't care what was doing the work if it was done quickly and with quality.

 

Keysplayr

Elite Member
Jan 16, 2003
Originally posted by: WelshBloke
Do you want to micro-manage your PC? Live in task manager to set priorities and affinities?

Why would you need to? Plenty of apps set their priority low if wanted.

How about encoding video while watching a movie. Ripping MP3's from your CD's. Running Mat-Lab, running compilers, etc. etc.

If you're doing all that concurrently then you would, surely, want your computer to use all its resources.

If none of these things used much CPU utilization, why then did so many users want the cycles offloaded to the GPU?

Do users want that? Reviewers do, because it's an easy way to see if the GPU is doing any work. I'd think that users wouldn't care what was doing the work if it was done quickly and with quality.

If wanted. Key word.

Exactly.

They surely USED to. And there are several hundred threads to prove that going back years on these forums. You know this, and I know this. Why even ask if that's what they want? You've been on these forums long enough to know that there were many, many discussions on how well the GPU removes a workload from a CPU and how important it was to have this happen for people who multitask.

Have all of you forgotten? Selective memories? LOL. :thumbsup:
 

WelshBloke

Lifer
Jan 12, 2005
Originally posted by: Keysplayr
Originally posted by: WelshBloke
Do you want to micro-manage your PC? Live in task manager to set priorities and affinities?

Why would you need to? Plenty of apps set their priority low if wanted.

How about encoding video while watching a movie. Ripping MP3's from your CD's. Running Mat-Lab, running compilers, etc. etc.

If you're doing all that concurrently then you would, surely, want your computer to use all its resources.

If none of these things used much CPU utilization, why then did so many users want the cycles offloaded to the GPU?

Do users want that? Reviewers do, because it's an easy way to see if the GPU is doing any work. I'd think that users wouldn't care what was doing the work if it was done quickly and with quality.

If wanted. Key word.

Exactly.

They surely USED to. And there are several hundred threads to prove that going back years on these forums. You know this, and I know this. Why even ask if that's what they want? You've been on these forums long enough to know that there were many, many discussions on how well the GPU removes a workload from a CPU and how important it was to have this happen for people who multitask.

Have all of you forgotten? Selective memories? LOL. :thumbsup:

But most of those threads were the same pissing contests about 'my GPU can beat up your GPU'. It's easy to pick a 'winner' when all you are looking for is the lowest CPU utilization.

What possible advantage would you have leaving out a large proportion of your computing power when doing intensive tasks?
 

Keysplayr

Elite Member
Jan 16, 2003
Originally posted by: WelshBloke
Originally posted by: Keysplayr
Originally posted by: WelshBloke
Do you want to micro-manage your PC? Live in task manager to set priorities and affinities?

Why would you need to? Plenty of apps set their priority low if wanted.

How about encoding video while watching a movie. Ripping MP3's from your CD's. Running Mat-Lab, running compilers, etc. etc.

If you're doing all that concurrently then you would, surely, want your computer to use all its resources.

If none of these things used much CPU utilization, why then did so many users want the cycles offloaded to the GPU?

Do users want that? Reviewers do, because it's an easy way to see if the GPU is doing any work. I'd think that users wouldn't care what was doing the work if it was done quickly and with quality.

If wanted. Key word.

Exactly.

They surely USED to. And there are several hundred threads to prove that going back years on these forums. You know this, and I know this. Why even ask if that's what they want? You've been on these forums long enough to know that there were many, many discussions on how well the GPU removes a workload from a CPU and how important it was to have this happen for people who multitask.

Have all of you forgotten? Selective memories? LOL. :thumbsup:

But most of those threads were the same pissing contests about 'my GPU can beat up your GPU'. It's easy to pick a 'winner' when all you are looking for is the lowest CPU utilization.

What possible advantage would you have leaving out a large proportion of your computing power when doing intensive tasks?

You mean, you can't think of a single thing? I've listed quite a few above, and that took 9 seconds. C'mon, you could do it too, if you wanted to. hehe.
 