He said shrink-wrapped programs, so I'm assuming desktop OS X apps also count. If I were to invest my time today in Objective-C or C#, I'd pick Objective-C. Programming native OS X and iOS apps is too attractive, especially when HTML5+JS and C++ are equal with C# in every way now.
You can reuse skills gained elsewhere for native Windows 8 apps, but on Apple's platforms, going native means using Objective-C. Until Apple's march stops, I'd go where the money is (Apple) rather than duplicating his existing skill set (C++/HTML5/JS) on Windows for no net gain.
For the costs, I'm assuming you'd also want a Windows PC and a test device at some point. I'm not sure that requirement ever goes away with hardcore mobile development. I certainly would want to do testing on a real device, whether it be Android, iOS, or Windows 8.
I have no opinion on which is a better bet; I would just follow the money, which is Apple.
I would argue that the money is not in writing software for Apple's operating systems. Apple generates more revenue than Microsoft (mostly because it sells a lot more hardware and charges a premium for it), but that does not mean there is more money to be made writing Apple software. In the business world, the overwhelming majority of corporations run Windows, and they generally want web-based or Windows software.

I see very few job openings looking for iOS or Objective-C developers, and I definitely see a lot more looking for either Java or .NET. This is in the South, Northeast, and Midwest USA, so basically everywhere but the West Coast (which I have not looked at). I know that Objective-C, Java, and open source are much bigger on the West Coast than elsewhere in the US, but across the majority of the United States, .NET is king in the business world. I don't know that there's a ton of money to be made in iOS development at this point, although it would appear that there is even less to be made in Android development.
This article would indicate that iOS development has already started to go stale, while Android is catching up (but still has a ways to go).
In terms of the .NET stack, could you explain your basis for this statement:
HTML5+JS and C++ are equal with C# in every way now.
Have you actually opened and used the Visual Studio IDE? In terms of project types, Visual C++ is nowhere near C#. Visual C++ lacks support for every major modern framework in VS, while C# has MVC, WPF, Silverlight, WebForms, etc., with designers for all of them. You could argue that none of this is technically necessary, because you could just use Notepad, but get real. In reality, you would spend weeks writing HTML5/JS and C++ code for what can be done in a matter of hours with VS designers, MVC scaffolding, and so on. People use Visual Studio for Windows development because it is, hands down, the best IDE on the market today. You will have a tough time convincing most people that anything else even comes close. It would be foolish to have this wonderful tool and then tell somebody not to use it because, technically, the same work can be done with HTML5/JS or C++. "Technically" I could dig a ditch with a spoon, but I'd rather use an excavator.
Furthermore, what gives you the idea that the OP has any desire to write Windows 8 apps? From his post, it sounded like he is tired of web development and wants to do some very basic Windows development (no strange widgets, oddball interfaces, etc.). So what do we do? We immediately point him toward web development, Windows 8 (with its oddball interface), and, worse yet, OS X/iOS apps.
Very few (if any) businesses are using Windows 8 at this point. Most of them are still on 7, Vista, or even XP. To me, a Windows 7 desktop application seems like the logical place for the OP to get started and learn the Visual Studio IDE. He never said he wanted to do anything bleeding-edge. I quoted Obsoleet for this reply, but I think quite a few others in this thread have given the OP either bad or totally irrelevant advice on how to get started with Windows programming.
And Obsoleet, just so I'm not disagreeing with literally everything you said, I agree that having at least one real iOS/Android device to test against (instead of just an emulator) is basically a necessity. A variety of devices of different sizes would be even better.