I am quite familiar with the idea of machine epsilon from my work with linear algebra libraries etc., and understand why such situations arise... In some of my more recent work I have started to measure execution time (relative to my architecture, of course). I wonder whether the smallest unit of time measurable is a hardware limitation (I assume it is), but if not, how can I set up my machine to make it more precise? i.e., are there modules I can use to do this?
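For context, here is a minimal sketch in Python (assuming that is your environment) of how you might inspect the advertised resolution of the available clocks and empirically probe the smallest tick your machine actually reports. The `get_clock_info` and `perf_counter_ns` calls are standard library; the `smallest_tick_ns` helper is just an illustrative name:

```python
import time

# Advertised resolution of Python's clocks (values vary by platform/OS).
for name in ("time", "monotonic", "perf_counter", "process_time"):
    info = time.get_clock_info(name)
    print(f"{name}: resolution={info.resolution}s, monotonic={info.monotonic}")

def smallest_tick_ns(samples=1000):
    """Empirically estimate the smallest observable increment of
    perf_counter_ns by spinning until the counter changes."""
    ticks = []
    for _ in range(samples):
        t0 = time.perf_counter_ns()
        t1 = time.perf_counter_ns()
        while t1 == t0:          # busy-wait until the clock advances
            t1 = time.perf_counter_ns()
        ticks.append(t1 - t0)
    return min(ticks)

print("smallest observed tick:", smallest_tick_ns(), "ns")
```

The advertised resolution and the empirically observed tick often differ: the former is what the OS claims, the latter also reflects the overhead of the call itself, which is usually the practical floor for timing measurements.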
If possible, provide a good link for a non-EE (I am a mathematician, so a few equations are OK) on how computers measure time.
All of this of course leads to questions about the discreteness or continuity of the physical universe given the limits of computation, but that's another thread.