I just posted this in my
Phenom thread, but I thought it may be appropriate here too.
It seems even an 8-year-old Phenom does reasonably well with VP9 higher-than-HD video. For the YouTube videos I tried, this CPU could usually handle 2160p VP9 playback if I left the machine undisturbed, but if I tried accessing various interface features it sometimes dropped frames. However, 1440p VP9 was fine, with utilization peaking at just over 50% CPU. That matches the GPU well, since the video output won't handle 4K monitors anyway. AFAIK, this NVIDIA GeForce 9200 only supports up to 2560x1600, which is just over 1440p.
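If anyone wants to reproduce this kind of test outside the browser, here's a minimal sketch that times a pure software VP9 decode via ffmpeg's null output. It assumes ffmpeg is on your PATH, and clip_1440p.webm is just a placeholder for whatever VP9 clip you grab. This takes the browser UI out of the equation, so it measures raw decode throughput rather than full playback:

    import subprocess, time

    clip = "clip_1440p.webm"  # placeholder: any local VP9 test clip
    start = time.time()
    # Decode only and discard the output; -benchmark also makes
    # ffmpeg print its own CPU time and memory stats at the end
    subprocess.run(["ffmpeg", "-benchmark", "-i", clip, "-f", "null", "-"], check=True)
    print(f"Decoded {clip} in {time.time() - start:.1f} s")

If the wall-clock time comes in under the clip's duration, the CPU can keep up in real time with headroom to spare.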
I'm sure AV1 will be a total disaster on this machine, but it seems
some people don't think AV1 will be a viable competitor until around 2020 or later, so I'm not concerned about it at this point. In fact, 2020 will only be the infancy of AV1 adoption, so some of us mere mortals may not need to worry about it until about five years from now.
Personally, I'm rooting for AV1 to push the HEVC licensors to simplify their royalty structure and reduce their royalty pricing. I tend to keep my machines a long time, and I bought a new desktop and laptop in 2017, built around compatibility with 10-bit 4K HEVC (H.265) HDR via Kaby Lake and Intel Quick Sync. I understand Google will force AV1 onto the world through YouTube, but I'm fine using "legacy" VP9 and even H.264 for YouTube up to 1440p, and using HEVC for other stuff, including Netflix 4K. I know Netflix will be introducing AV1 support sooner rather than later, but given that there is almost no hardware support, Netflix's HEVC support will be around for a long time.
BTW,
Netflix has stated that they won't implement AV1 until it is 20% more efficient than HEVC, but some recent testing indicates that this early version of AV1 is actually less efficient than the much more mature HEVC on the most complex critical scenes, despite being at least 10X to 100X harder to encode. They did suggest that they could implement AV1 in the next fiscal quarter, and some third-party testing suggests that, on average, AV1 right now might be about 20% more efficient than HEVC. But again, with no hardware to decode it, it's effectively just for testing purposes. Furthermore, to achieve that 20% higher efficiency, the encode speeds were 200X to 250X slower than HEVC. Ouch.
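For anyone curious how those speed comparisons get made, here's a rough sketch that times HEVC vs. AV1 encodes of the same source. It assumes an ffmpeg build that includes both libx265 and libaom-av1 (older builds may need -strict experimental for the AV1 encoder), and source.y4m is a placeholder for any raw test clip:

    import subprocess, time

    source = "source.y4m"  # placeholder: any raw test clip
    for name, codec in [("HEVC", "libx265"), ("AV1", "libaom-av1")]:
        start = time.time()
        # Encode the same source with each codec at default settings and time it
        subprocess.run(["ffmpeg", "-y", "-i", source, "-c:v", codec,
                        f"out_{name.lower()}.mkv"], check=True)
        print(f"{name} encode took {time.time() - start:.1f} s")

Default settings won't match the carefully tuned encodes in published tests, but the order-of-magnitude gap in encode time shows up even in a crude run like this.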