I would have thought that Unix/Linux lend themselves to distributing processing over a number of processors; this can be useful in several ways:
1. Security checks on a system lend themselves to massive parallelisation; this is useful when handling data from secure financial websites.
2. Filtering of, for example, share prices lends itself to parallelisation. In general, the processing of financial data is massively parallel; for those of us watching the markets, the data flows 24 hours a day, five days a week.
3. Processing of sensor data involves multiple sources; for example, cars may in future sport a dozen cameras and other sensors, and the same technology may lend itself to home security.
4. Developments in games may involve exo-skeletons with sensors and actuators, plus virtual reality; the sky is the limit in terms of the processing power required. Imagine a virtual battle with an 'enemy', where the exo-skeleton lets you feel the force of a blow - and deliver a forceful blow in return.
I would think that there will be many such possibilities in the future, and that they will use SoCs designed to be fit for purpose, with many cores and heterogeneous computing.