I know what more cores are used for in work applications/settings.
My intent was finding out what people use them for at home. I am betting that you don't have 300 Skylake cores at home.
The overarching takeaway, then, is that people at home are using compute, and lots of it, in much the same manner as work applications and enterprise settings: video editing, virtualization, HPC, AI, simulations, compilation.
That's essentially what corporations are doing, just at similar or smaller scales. Tbqh, the hardware isn't all that different, and neither are the architectures by and large. It all centers on a many-core CPU with a bunch of PCIe lanes. Desktops now have NVMe SSDs just like the enterprise. Even networking gear is the same for some people, and even more capable in some cases. Computing hardware is now heavily commoditized and affordable. Entrepreneurship is soaring. I have never used cloud computing for my work and likely never will if I have a say in it. I have more capable hardware, I control and update the configuration, and I don't have to deal w/ bozo security issues. There are a number of cases where building out your own compute stack is far cheaper, especially at the higher end. There are tons of open source packages for management, virtualization, etc. Enterprise is built on the same software stacks available to everyone via the open source community.
Computing goes through phases and shifts of focus. What people call cloud computing was the mainframe/thin-client age of yesteryear. Hilariously, the bus architectures and many of the tidbits are still the same and borrow from what those pioneers did back in the 70s/80s. Cray is still around and does a lot of HPC work. What happened next was that computing power got packed into an affordable desktop, and thus came the PC age. The cloud computing era is long in the tooth and dead from where I sit. As always, the old guard and mainstream are late to the game. They try to milk an era for as long as they can (the cloud computing meme), and enterprise consumers don't like change and stick with things until they're dead.
My Threadripper rig can outperform a number of enterprise servers at various tasks. The game has changed. There's not much distinguishing enterprise from high-end desktop beyond redundant power supplies, PCIe switches (they've been milking this forever) to allow for increased scaling, hot-swap capability, and some other features that are hardly necessary unless you need 99.999% uptime. Shell out some cash and you can run high-speed fiber networking and a good chunk of all the other 'enterprise' gimmicks. We're in the next phase, where the mainframe got packaged into the desktop. This is where you get a software and application explosion. Some got the memo. Some haven't.
I see the innovation in people taking the hardware and applying it in new ways. I don't see it in doing the same ol' crusty things they've been doing since the start of the cloud computing era. Cloud computing, btw, was a big scheme to achieve the holy grail of recurring revenue (aka leasing the same stuff to you over and over, beyond the price it would have cost to purchase outright). I hear people say: "But I can run my software on 500 cores in the blink of an eye." Yeah, you also could write better software so you only need 10 cores. The innovation is going to occur with less, not more, IMO. We have some really crappy software out there due to how much compute power we have. We have some really crappy algorithms dominating computing due to insane compute power (meme learning).
Time for a change. The cloud era is over.