The issue is actually twofold, because a wider color space also decreases the resolution of the colors that can be displayed: the same number of code values has to cover a larger gamut, so each step between adjacent colors gets bigger.
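Rough back-of-the-envelope in Python, if anyone wants to see why. The coverage percentages are the commonly cited ballpark figures, and dividing coverage by codes per channel is only a crude proxy, not a real colorimetric calculation:

```python
# Crude illustration: the same number of code values per channel has to cover
# a much larger gamut, so the average step between adjacent displayable
# colors grows with it.

# Approximate CIE 1931 xy coverage of the visible gamut (commonly cited figures)
REC709_COVERAGE = 0.36    # ~36%
REC2020_COVERAGE = 0.76   # ~76%

def relative_step(bits, coverage):
    # rough proxy: gamut coverage divided by code values per channel
    return coverage / (2 ** bits)

for bits in (8, 10):
    for name, cov in (("Rec.709", REC709_COVERAGE), ("Rec.2020", REC2020_COVERAGE)):
        print(f"{bits}-bit {name}: relative color step ~ {relative_step(bits, cov):.5f}")
```

The point being: 8-bit over Rec.2020 gives steps roughly twice as coarse as 8-bit over Rec.709, so the wider gamut eats up a good chunk of the extra resolution that 10 bits buys you.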
If you look at the Rtings reviews you'll find that a lot of 10-bit panels still show banding too. For me there's no middle ground on color banding: either you see it or you don't. So why the hell spend more money on some fancy 10-bit display if you still see banding?
The ultimate display will be 12-bit OLED with Rec.2020 coverage, but it's going to be a while until we get there. Currently, I think the absolute best bang for the buck is a good-quality VA panel with local dimming; bit depth doesn't really matter. Love my Vizio M series. Contrast isn't as good as OLED, but damn does it still look good, especially given the price.
The banding present in 10-bit displays ultimately comes down to white-balance and gamma accuracy. The white-balance charts posted on Rtings, for example, are just 10-point readings. That seems reasonable, since most sets only have 10-point WB controls, but it doesn't mean a television's WB controls can perfectly tune the set's greyscale. If you take 20-point readings, in most cases the dE will be higher and the chart won't be as smooth; that residual error is what shows up in 10-bit banding tests. This can of course be perfected with a 3D LUT or just better WB controls.
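If anyone wants to see how an error the 10-point controls can't reach turns a smooth ramp into uneven steps, here's a toy simulation in Python. The panel error curve, the +/-2% amplitude, and the 10 anchor points are all made up purely for illustration:

```python
import numpy as np

# Toy simulation: a greyscale ramp through a panel whose underlying gain
# error varies more finely than 10-point white-balance controls can correct.

N = 1024                      # 10-bit code values
ramp = np.linspace(0, 1, N)   # ideal normalized greyscale

# Hypothetical smoothly varying panel gain error (+/-2%)
panel_gain = 1 + 0.02 * np.sin(6 * np.pi * ramp)

# A 10-point WB control can only cancel the error at 10 anchor levels;
# between anchors it just interpolates, leaving residual error behind.
anchors = np.linspace(0, 1, 10)
gain_at_anchors = np.interp(anchors, ramp, panel_gain)
correction = np.interp(ramp, anchors, 1 / gain_at_anchors)

displayed = ramp * panel_gain * correction

# An ideal ramp has perfectly uniform steps; residual error makes some
# adjacent steps noticeably bigger than others, which a gradient test exposes.
steps = np.diff(displayed)
print("ideal step size :", 1 / (N - 1))
print("largest step    :", steps.max())
print("smallest step   :", steps.min())
```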
Several of the Sony sets actually exhibit perfect gradients (x930D for example), since Sony is generally known for superior video processing.
10-bit processing is pretty new for televisions, and I imagine it will improve in the coming years.
Make no mistake though, the banding on 10-bit is not nearly as bad as with 8-bit, and that difference becomes much more apparent when you're talking about the much larger luminance range of HDR.
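To put rough numbers on that, here's a quick Python sketch comparing the luminance jump between adjacent code values around mid-brightness for a simple 2.4-gamma SDR curve (100-nit peak) and the ST 2084 PQ curve used for HDR. The PQ constants are the standard ones from the spec; treat the exact percentages as ballpark, since where banding actually becomes visible depends on the viewer and the content:

```python
import numpy as np

# SMPTE ST 2084 (PQ) EOTF constants
M1 = 2610 / 16384
M2 = 2523 / 4096 * 128
C1 = 3424 / 4096
C2 = 2413 / 4096 * 32
C3 = 2392 / 4096 * 32

def pq_eotf(code_norm):
    """Normalized code value [0,1] -> absolute luminance in nits (cd/m^2)."""
    e = np.power(code_norm, 1 / M2)
    return 10000 * np.power(np.maximum(e - C1, 0) / (C2 - C3 * e), 1 / M1)

def sdr_eotf(code_norm, peak_nits=100, gamma=2.4):
    """Simple power-law SDR EOTF with a 100-nit peak."""
    return peak_nits * np.power(code_norm, gamma)

for bits in (8, 10):
    n = 2 ** bits
    codes = np.arange(n) / (n - 1)
    pq = pq_eotf(codes)
    sdr = sdr_eotf(codes)
    # relative luminance jump between adjacent code values (as a percent),
    # taken around mid-brightness where banding tends to be most noticeable
    mid = n // 2
    pq_step = 100 * (pq[mid + 1] - pq[mid]) / pq[mid]
    sdr_step = 100 * (sdr[mid + 1] - sdr[mid]) / sdr[mid]
    print(f"{bits}-bit: SDR step ~{sdr_step:.2f}% of level, PQ step ~{pq_step:.2f}% of level")
```

The takeaway is that the PQ curve spreads its codes over a far bigger luminance range, so at the same bit depth the step between adjacent levels is several times larger than with SDR gamma. That's why 8-bit HDR bands so obviously, while 10-bit keeps the steps small enough that they're much harder to see.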