Running with the --cache=10240 option has been a real eye-opener.
The downside is that, even with Superfetch disabled, I've only got maybe 800 MB of RAM free after accounting for what's used by Waterfox, Ethereum Wallet, HWiNFO64, and suchlike. That's out of the 15.9 GB Windows says is usable.
Now for the real data:
Within the first 27 minutes, the MX200's SMART data showed 18 GB of host writes; at the 213-minute mark, the total stands at 19 GB. Apparently there was a surge of drive activity while geth initialized and downloaded the few outstanding blocks that had accumulated on the chain between the end of my previous trial and the start of this one, but in the roughly three hours since, the drive has only picked up 1 GB of host writes. That's astonishing. I've chosen not to continue the experiment on this machine because there's not much I can do with it given how little RAM remains. Regardless, the data seems to indicate that setting up a very large cache via the --cache option eliminates most of the drive activity associated with block syncing/verification.
Also, as of 213 minutes into the run, the average write rate reported for the drive in HWiNFO64 is sitting around 1.5 MB/s. That's easily within the capabilities of a platter-based HDD, even if all the writes are random. The reported average read rate is ~4.6 MB/s, which should also be doable by a 7200 rpm HDD, even if the reads are all random.
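For context, those measured averages work out to the following daily volumes. This is nothing more than a back-of-envelope unit conversion (MB/s to GB/day) using the rates quoted above:

```shell
# Convert the measured averages to daily totals.
# 1.5 MB/s writes and 4.6 MB/s reads, sustained over 24 h (86400 s).
awk 'BEGIN {
  w = 1.5 * 86400 / 1024;   # MB/s -> GB/day
  r = 4.6 * 86400 / 1024;
  printf "writes: %.0f GB/day, reads: %.0f GB/day\n", w, r
}'
```

So with the big cache, write volume lands in the low-hundreds of GB per day territory instead of 400+.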
If you're going to set up a dedicated node box, you should be able to use a spinner. You just need to set a large cache value; the bigger, the better. Some folks have been trying 8192, but I would go larger if possible. I can't say for sure that large --cache values really work with spinners until I've tried it myself. I might have to install my old 640 GB WD Black and see if it's doable on that thing.
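For anyone wanting to try this, the launch line is just geth with the cache flag bumped up. As I understand it the value is in MB; 8192 is the figure folks have been trying, and I'd go larger if the RAM is there (I used 10240 for this run):

```shell
# Launch geth with an 8 GB database cache instead of the default 128 MB.
# Value is in megabytes; size it to your free RAM.
geth --cache=8192
```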
Simply relying on an SSD to absorb the problem does NOT seem wise, since the default cache value of 128 leads to an insane number of host writes to the drive, even when handling a relatively small amount of data. Over 400 GB in writes per day makes no damn sense.
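For what it's worth, that 400 GB/day figure implies a sustained write rate you can sanity-check with the same sort of conversion:

```shell
# What sustained write rate does 400 GB/day imply?
awk 'BEGIN { printf "%.1f MB/s\n", 400 * 1024 / 86400 }'
```

A modest-sounding throughput, but as constant flash wear it adds up fast.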
People hoping to casually use Ethereum Wallet for basic exchange of tokens, viewing wallet totals, etc. will probably want/need an SSD since the default settings preclude anything else. Using large --cache values requires at least a tiny bit of "expert" knowledge, as well as a fair amount of free RAM.
I am going to talk to a data recovery service to see what they can do with my SSD. They said if recovery isn't possible, there's no cost.
Good of them to take a look for free. Hope it isn't a total loss for you.