ML-based upscaling certainly has potential. Final Fantasy 7's pre-rendered backgrounds have recently been upscaled from their original resolution to HD quality. "Using state of the art AI neural networks, this upscaling tries to emulate the detail the original renders would have had," writes the...
I have the same CPU and mobo with W10. Asus AI broke for me as well after the update. However, there is a beta Asus AI that I found on the Asus ROG forum, and it works.
I have not experienced any slowdown in general computing performance other than the broken Asus AI. You may want to reinstall...
I would just get the $499 Newegg bundle and sell the games. It takes a bit of work, but I believe you will be able to net more than $50-$75 for both games.
I have an old Dell monitor that I want to sell locally. I put it on Letgo and set the price as negotiable. I got no offers. I typically do not sell locally, but I simply don't have a box to ship the monitor in.
Is Letgo the "go to" app for selling local goods, or should I also double post on Craigslist?
That would be me. I got a 1060 3GB for HTPC (madvr) and light gaming duties. I plan to enjoy Netflix 4K on 4K TV. I certainly did not pay $200 or anywhere close to that for the card.
Using a C2Q as the main machine in 2017 is pretty rough. I have it paired with a GTX 1060 for HTPC and light gaming duties (light as in Assassin's Creed-type third-person action, emulators, and other third-person adventure games). The machine works great with madVR, and I have no problem playing most...
Agreed that folks shouldn't spend money in 2017 to buy a C2Q or spend money upgrading their existing C2Q - but if you are already rocking an overclocked C2Q, I would use it as a second PC rather than tossing it out. I use mine for HTPC duties and games.
Not sure why the C2Q is obsolete. I just ran CB R15 and got 360 CB points, which is what a Haswell i3 would get.
If a Haswell i3 is good enough for 2017, so is an overclocked C2Q.
My Q6600 is clocked at 3.6GHz with DDR2 800.
Looks like I've got no reason to upgrade. Damn.
How? The program and catalog are already running on an SSD. Memory is DDR3 2400. I hate the fact I can't just throw computing power at Lightroom...
I guess I am grasping at straws trying to find a reason to buy Ryzen.
I have read that review, but I am not sure they are correct. With heavy editing (many adjustments with filters, tools, and sliders) while driving two monitors, I often see all 8 threads of my 4790K being used, and once in a while it peaks at 100%. On a fresh raw file, I do not see more than 2 cores...
I am a fairly heavy Lightroom user, and it is currently running on a 4790K system overclocked to 4.7GHz. I have noticed that with heavily processed photos having multiple adjustment brushes, heal/spot removals, sharpening, perspective correction, and gradient filters, my 4790K would often get to 80-90%...