Much more is a bit of a stretch. They might be a bit less efficient, but a purpose-built 48V->1V converter will be closer in efficiency to a 12V->1V converter than to a wide-input-range converter operating at the top end of its range. I'd be surprised if it was even one percentage point less efficient, considering you can buy complete 48V->12V solutions that are 98% efficient. That's just the VRM on the GPU, though; the flip side is lower losses in cabling and connectors.

Everything I read said the opposite: it's more difficult to do the DC-DC conversion from 48V down to the ~1V range of ICs than it is to go from 12V.
In the DC downconverter spec sheets I have seen, the larger the delta between Vin and Vout, the lower the efficiency.
So if you ran 48V DC to a GPU, you would end up with much more waste heat. That would make the GPU power problem worse, not better.
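For what it's worth, the trade-off is easy to sketch with back-of-the-envelope numbers. This little script compares cable (I²R) loss against VRM loss for a 12V vs 48V bus; every figure in it (GPU power, cable resistance, the VRM efficiencies, including the assumed one-point penalty for 48V->1V) is a made-up assumption for illustration, not a measured value:

```python
# Rough comparison of 12V vs 48V power delivery to a GPU.
# All efficiency and resistance figures are assumptions, not measurements.

GPU_POWER = 600.0   # W delivered to the GPU core (assumed)
CABLE_R = 0.01      # ohms, assumed round-trip cable + connector resistance

def losses(bus_voltage, vrm_efficiency):
    """Return (cable_loss_w, vrm_loss_w) for a given bus voltage."""
    vrm_input = GPU_POWER / vrm_efficiency  # power the VRM draws from the bus
    current = vrm_input / bus_voltage       # current through the cable
    cable_loss = current ** 2 * CABLE_R     # I^2 * R conduction loss
    vrm_loss = vrm_input - GPU_POWER        # heat dissipated in the VRM
    return cable_loss, vrm_loss

# Assumed: 12V->1V VRM at 91%, 48V->1V VRM one point worse at 90%.
for volts, eff in [(12, 0.91), (48, 0.90)]:
    cable, vrm = losses(volts, eff)
    print(f"{volts:>2}V bus: cable {cable:5.1f} W + VRM {vrm:5.1f} W "
          f"= {cable + vrm:5.1f} W total loss")
```

Under these assumptions the 48V bus cuts cable loss by roughly 16x (current drops 4x, loss goes with the square), so even a one-point-worse VRM can come out ahead on total waste heat; whether that penalty is really only one point is exactly the point in dispute above.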
The transition problems would be a pain, though. It's not just needing a new power supply: there's a huge range of components used in the VRMs themselves, tailored for maximum efficiency at 12V->~1V conversion in both CPU and GPU VRMs. It'd probably only happen if direct 48V conversion takes off in the datacenter first and that whole supply chain builds up.