I hate to ask this with a zero track record here. I used to post here a looooooong time ago, long enough that I forgot my old credentials, not that I really care about them.
So here's the backstory: I'm heading a project at work to process hundreds of thousands of images using PhotoZoom Pro, and I have a budget to put together a few machines to run it on. I'm just looking for the best way to spend the money for the best performance per dollar, so going all-out top shelf might not be prudent.
The tool itself is quite niche, so there's not much sample data out there across different CPUs and GPUs. It operates in two modes, by the way: CPU-only or GPU+CPU, with GPU+CPU being the fastest. I'm thinking of running two instances per machine so I can take advantage of all the resources; a rough sketch of that is below.
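Here's roughly what I mean by running two instances, as a minimal Python sketch. The install path is a placeholder, passing an image path on the command line is an assumption on my part, and I haven't confirmed the trial even allows two concurrent instances:

```python
import subprocess
from pathlib import Path

# Placeholder install path -- adjust to your machine (assumption, not verified).
APP = r"C:\Program Files\BenVista\PhotoZoom Pro\PhotoZoomPro.exe"

def launch_instances(inputs):
    """Start one app instance per input and wait for all of them to exit.

    Passing the input file on the command line is an assumption; the app
    may need to be driven through its GUI instead.
    """
    procs = [subprocess.Popen([APP, str(Path(p))]) for p in inputs]
    for proc in procs:
        proc.wait()

# Split the workload in two so the CPU and GPU both stay busy.
launch_instances([r"D:\images\batch_a\img0001.jpg",
                  r"D:\images\batch_b\img0001.jpg"])
```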
The trial should work for testing, since it has everything enabled except that it watermarks the output image. It also comes with a sample image, which can serve as the test input.
http://www.benvista.com/downloads
Here's what I'm getting with my really old personal machine:
CPU: Q6600 @ 2.83GHz - 1m:17s (77 seconds)
GPU: GTX460 - 0m:17s (17 seconds)
That's roughly a 4.5x speedup with the GPU on.
Here's how I'm running it:
1) Start the app.
2) Choose 'Later' at the registration/serial prompt.
3) The sample image should load automatically.
4) On the left panel, set 'Width' to 400%; 'Height' should update automatically to match.
5) On the left panel, set 'Resize Method' to 'S-Spline Max'.
6) On the left panel, set 'Preset' to 'Photo - Extra Detailed'.
7) Hit the 'Save' button in the upper left.
8) Choose 'No' at the registration prompt again.
9) Choose TIFF as the format and use any filename.
10) The TIFF options I'm using are 'LZW' for compression and 'IBM PC' for byte order (not that these should affect performance).
11) Hit 'OK'; the elapsed time shown should be the GPU score, since the app uses the GPU by default. (If you want to confirm which mode is active, see the utilization check after this list.)
12) Go to Options > Preferences > Processing, uncheck 'Use GPU acceleration', and hit 'OK'.
13) Hit the 'Save' button in the upper left again; the previous settings will still be applied.
14) Choose 'No' at the registration prompt again.
15) Choose TIFF as the format and use any filename; overwriting is fine.
16) The TIFF options are the same as before: 'LZW' compression and 'IBM PC' byte order.
17) Hit 'OK'; the elapsed time shown will now be the CPU score.
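A quick way to confirm the GPU is actually being used for step 11 (NVIDIA cards only, since nvidia-smi ships with the NVIDIA driver). This just polls utilization once a second while you hit 'Save':

```python
import subprocess
import time

# Poll GPU utilization once a second for 30 seconds; run this in the
# background while the app is saving. NVIDIA-only sanity check.
for _ in range(30):
    out = subprocess.check_output(
        ["nvidia-smi", "--query-gpu=utilization.gpu",
         "--format=csv,noheader,nounits"])
    print(f"GPU utilization: {out.decode().strip()}%")
    time.sleep(1)
```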
I would be really grateful if anyone here could test with stock and overclocked CPUs, as well as GTX680 or faster GPUs. I will consolidate the data in this first post for anyone who posts results; a rough sketch of how I'll tabulate it is below. Thanks!
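For the consolidation, here's roughly how I plan to tabulate results and rank by performance per dollar. The prices are placeholders I'd fill in from retail listings, and the two rows are just my own numbers from above:

```python
# Consolidate posted results and rank by performance per dollar.
# Prices are placeholders to be replaced with real retail numbers.
results = [
    # (label, seconds for the benchmark image, approx. price in USD)
    ("Q6600 @ 2.83GHz (CPU only)", 77, 40),
    ("GTX460 (GPU+CPU)", 17, 120),
]

def perf_per_dollar(seconds, price_usd):
    """Benchmark images per hour, per dollar of hardware."""
    return (3600.0 / seconds) / price_usd

ranked = sorted(results,
                key=lambda r: perf_per_dollar(r[1], r[2]),
                reverse=True)
for label, secs, price in ranked:
    print(f"{label:30s} {secs:4d}s  ${price:4d}  "
          f"{perf_per_dollar(secs, price):.3f} img/hr/$")
```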
edit: A correction on how the app works: even with GPU acceleration on, it still pegs the CPU pretty hard (I've seen 90% across 8 cores), so two machines with the same GPU but different CPUs should give different results.
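If anyone wants to double-check the CPU load during a GPU run, here's a quick sketch using the third-party psutil package (pip install psutil):

```python
import psutil  # third-party: pip install psutil

# Print per-core CPU utilization once a second for 30 seconds; run it
# while the app is saving with GPU acceleration on.
for _ in range(30):
    per_core = psutil.cpu_percent(interval=1, percpu=True)
    avg = sum(per_core) / len(per_core)
    print(" ".join(f"{c:5.1f}" for c in per_core) + f"  | avg {avg:5.1f}%")
```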
edit: Results chart with data from 2 forums