It sounds like the question is really about the effective speed of electronic devices, rather than the speed of the radio waves themselves. Everyone has pointed out that the speed of light is relatively constant (ha, Einstein joke) in a given medium... but the real problem with electronics has been Moore's Law vs. the laws of the universe.
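Just to put numbers on the "constant in a given medium" part: signal speed scales as c over the square root of the relative permittivity. A quick Python sketch (material values are typical textbook figures, not from any particular datasheet):

```python
# Signal speed in a medium: v = c / sqrt(relative permittivity).
C = 299_792_458  # speed of light in vacuum, m/s

materials = {
    "vacuum": 1.0,
    "PCB trace (FR4)": 4.4,    # typical relative permittivity
    "silicon dioxide": 3.9,    # common on-chip insulator
}

for name, eps_r in materials.items():
    v = C / eps_r ** 0.5
    print(f"{name:16s} ~{v / C:.2f} c  ({v / 1e8:.2f} x 10^8 m/s)")
```

So on-chip and on-board, signals already travel at roughly half of c, and nothing you do to the transistors changes that.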
Back when I was in device physics... the fastest CPU ran at 300 MHz, and everyone was worried that silicon just couldn't be pushed any faster at those voltages and currents. The little transistors would never be able to switch that fast, or recover afterward before the next signal arrived. Clearly we have gotten around that, but more by "smoke & mirrors" than by actual MHz.
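For a feel of where that ~300 MHz wall came from, here's a back-of-the-envelope RC model: each logic gate has to charge its load capacitance through the driving transistor's on-resistance before the next clock edge. Every number below is an illustrative assumption I picked, not a figure for any real process:

```python
# Toy model: max clock rate limited by RC charging delay per gate.
R_on = 10e3           # driving transistor on-resistance, ohms (assumed)
C_load = 50e-15       # load capacitance per gate, farads (assumed)
gates_per_stage = 10  # logic depth between clocked registers (assumed)

gate_delay = 0.69 * R_on * C_load   # ~RC * ln(2) to swing past threshold
stage_delay = gates_per_stage * gate_delay
f_max = 1 / stage_delay
print(f"gate delay ~{gate_delay * 1e12:.0f} ps, "
      f"max clock ~{f_max / 1e6:.0f} MHz")
```

With those (assumed) values you land right around 300 MHz, which is the kind of arithmetic that had everyone worried back then.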
We use multiple pipelines and staged instruction execution to act as a group of assembly lines, rather than a single line doing all of the work. Shrinking the CPU die has also had a big effect, by effectively shortening the distance an electrical signal has to travel. There are all kinds of little tricks they have used to eke out more "MHz" every year... but there is a limit to silicon. That's why AMD uses the PR rating: their chips running at 2600 MHz with a 9-stage pipeline are just as fast as Intel's running at 3400 MHz with a 22-stage pipeline (see the toy model below). It's more mechanical design than actual science.
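Here's a toy Python model of that tradeoff: effective throughput is clock rate divided by cycles-per-instruction, and every mispredicted branch flushes the pipeline, so a deeper pipe pays a bigger penalty. The branch statistics are round numbers I picked to make the comparison visible, not measured data:

```python
# Toy model of the "PR rating" argument: deeper pipelines lose more
# cycles on every branch misprediction, eating into the clock advantage.
def effective_mips(clock_mhz, pipeline_stages,
                   branch_rate=0.2, mispredict_rate=0.15):
    # A mispredicted branch flushes roughly the whole pipeline,
    # costing ~pipeline_stages cycles of lost work.
    flush_cycles = branch_rate * mispredict_rate * pipeline_stages
    cycles_per_instruction = 1 + flush_cycles
    return clock_mhz / cycles_per_instruction

amd = effective_mips(2600, 9)
intel = effective_mips(3400, 22)
print(f"AMD   2600 MHz, 9-stage pipeline : ~{amd:.0f} effective MIPS")
print(f"Intel 3400 MHz, 22-stage pipeline: ~{intel:.0f} effective MIPS")
```

With those assumed branch numbers the two come out nearly identical, which is exactly the point of the PR rating: raw MHz stopped being the whole story.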
Drawing this trainwreck to a close... No, scientists have not made waves move faster through silicon. However, the extremely-nasty-to-produce gallium arsenide used in LEDs is a whole different story: signals do move (a little) faster in it, and GaAs transistors have been demonstrated well into the hundreds of gigahertz. Could be the way of the future, but I doubt it. We'll invent something totally new by then.
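The physical reason is electron mobility: it's several times higher in GaAs than in silicon, so carriers drift faster under the same electric field. A quick comparison (mobilities are typical room-temperature values; the field strength is an assumed low-field figure):

```python
# Low-field drift velocity: v = mobility * electric field.
MU_SI = 1400    # electron mobility in silicon, cm^2/(V*s)
MU_GAAS = 8500  # electron mobility in GaAs, cm^2/(V*s)
E_FIELD = 1e3   # applied field, V/cm (assumed; low-field regime only)

for name, mu in [("Si", MU_SI), ("GaAs", MU_GAAS)]:
    v_drift = mu * E_FIELD  # cm/s; velocity saturates at high fields
    print(f"{name:5s} drift velocity ~{v_drift:.1e} cm/s")

print(f"GaAs advantage at low field: ~{MU_GAAS / MU_SI:.0f}x")
```

That ~6x low-field head start is why GaAs shows up in RF and optoelectronics, even though it never displaced silicon for general-purpose logic.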