A GTX 1070 may get about 700,000 PPD according to this:
http://www.overclock.net/t/475163/gpu-projects-ppd-database
Unfortunately, this database is strongly biased towards optimistic PPD estimates. Also, disregard any database entries with fewer than ~20 samples; PPD varies considerably between different work units (WUs). Furthermore, the same GPU earns more credit under Linux than under Windows. How much more depends on the GPU, on the host CPU which feeds the GPU, and on other load on the host.
Regarding folding on CPUs vs. GPUs:
What @Ken g6 said. Folding@Home rewards not just the amount of work done, but also how quickly the results were returned. Meaning, a few fast folding slots get more credit than several slow folding slots, even if the latter perform the same absolute amount of work during the same period of time. This bias towards faster slots is because the scientists want to reward those from whom they get input for their research sooner. (Also, new work issued to volunteers may be based on work previously completed by volunteers.)
@ao_ika_red posted a link to the credit formula in #446.
And because GPUs are so much better at folding than CPUs, it is no longer worthwhile to use CPUs with fewer than about 32 threads per slot. My rather energy-efficient Broadwell-EPs with 44, 56, or 64 threads per slot achieve only about 1/2.5 to 1/3 the credit per Joule (PPD/Watt) of my NVidia Pascal GPUs. With smaller CPUs, PPD/Watt is much worse because of the nonlinear credit system.
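To illustrate the nonlinearity: the published Quick Return Bonus formula multiplies a WU's base points by max(1, sqrt(k · deadline / elapsed)), where k is a per-project bonus factor. A rough sketch (function name and all numeric values here are illustrative, not taken from any real project):

```python
import math

def fah_credit(base_points, k_factor, deadline_days, elapsed_days):
    """Estimate Folding@Home credit with the Quick Return Bonus (QRB).

    credit = base_points * max(1, sqrt(k * deadline / elapsed))
    Halving the turnaround time raises the per-WU credit by sqrt(2),
    and the faster slot also completes twice as many WUs per day,
    so its PPD is ~2.8x that of a slot folding at half the speed.
    """
    bonus = math.sqrt(k_factor * deadline_days / elapsed_days)
    return base_points * max(1.0, bonus)

# Illustrative numbers: 1000 base points, k = 0.75, 10-day deadline.
fast = fah_credit(1000, 0.75, 10, elapsed_days=1)   # bonus applies
slow = fah_credit(1000, 0.75, 10, elapsed_days=2)   # fast / slow = sqrt(2)
late = fah_credit(1000, 0.75, 10, elapsed_days=20)  # bonus clamped to 1
```

This is why splitting the same hardware into many slow slots loses credit compared to fewer fast slots, and why small CPU slots fare so poorly.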
Regarding the noise:
Over time, I have seen mentions of various ways to cope with the cooling requirements of folding hosts, and the associated noise:
- leaving the side panel open, thus having only the noise of the card fans but not from case fans
- putting the hosts into a separate room, the basement, or the garage
And of course what @crashtech said: Use Afterburner or similar tools to modify the GPU fan profile (and/or to cap the power draw of the GPU).
I chose a luxury variant myself: custom water loops. That's because I have no spare rooms for my DC hosts. However, besides being an expensive approach to the cooling and noise problem, it is only a partial solution: while large enough radiators allow fan noise to be reduced a lot, there is still noise from GPU coil whine, and larger loops which require adequate pump performance add some pump noise as well. Regarding coil whine, Folding@Home is the worst offender among the GPU DC projects I have tried so far. Coil whine is a bit higher pitched but more even under Linux than under Windows, because GPU utilization from F@H under Linux is higher and steadier than under Windows.
Some people build cabinets for their DC gear, with extra large fans for ventilation, and perhaps even mufflers.