There is no way to answer this question, because a number of promising but unproven technologies are still being developed; the production of hydrocarbon fuels directly by E. coli, for example.
Personally, I think one of the worst things we can do is get caught in a one-size-fits-all solution, and we need to consider the total cost of ownership (TCO) of any system.
For example, thorium reactors could be an ideal solution for energy-intensive purposes, but we should be moving away from a grid-based system even if the initial cost were higher. Why? A couple of reasons, none of which relate to the source or destination of electricity, but to the means by which it is transmitted. As the grid grows, it necessarily becomes more complex. We have "smart" grids now to lessen the likelihood of a large outage. That's great, but the grid then becomes increasingly expensive and more vulnerable to the elephant in the room: solar activity.

This is not sci-fi. A solar storm caused real damage not many years ago (the 1989 geomagnetic storm that blacked out Quebec), and one in the mid-1800s was so powerful that telegraph operators received severe shocks. At this time there isn't a good solution to this. That means anything that uses modern electronics and is plugged in will be cooked. Computers, mostly gone. The internet: history. Power? Forget it. And that's the best case; it gets a lot worse than that. Imagine communications gone, data fried, and more. Back to snail mail and typewriters, while the logistics of running a business or a government suddenly become impossible. What can be done to avert this is beyond the scope of my post, but this is not Y2K. It's a matter of physics.
Part of the solution would be to minimize exposure to the grid. It's easier to unplug a single home or cluster of dwellings, and easier to limit the damage when something does happen. Spending more money than exists to repair society isn't exactly economical.
There's a whole lot more to it than finding the shortest and cheapest route from concept to reality.