OK, I know this is a stupid question, and I'm sure I could figure it out if I stared at it long enough, but so far I haven't managed to.
I'm trying to figure out how Internet bandwidth speed testers get their figures. For example, I used www.toast.com/performance and got this result back:
"Loaded 754,928 bytes in 2.022 seconds from 4web-space server," and it calculated my throughput as 2987 Kb. How did it arrive at those figures?
When I try to calculate it myself, I get bigger numbers, so I think I'm missing a division step somewhere.
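Here's the closest I can get to their number. This is just my guess at their method: convert bytes to bits (×8), divide by the elapsed time, and use 1000 bits per kilobit.

```python
bytes_loaded = 754_928  # figure reported by the speed test
seconds = 2.022         # elapsed time reported by the speed test

# bytes -> bits, then divide by time to get bits per second
bits_per_second = bytes_loaded * 8 / seconds

# 1000 bits per kilobit (decimal, not 1024)
kilobits_per_second = bits_per_second / 1000

print(round(kilobits_per_second))  # 2987 -- matches the reported figure
```

So the "2987 Kb" is kilobits per second, not kilobytes, which would explain why a byte-based calculation comes out looking wrong.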
Also, if I have a 3 megaBIT/s connection, what is the highest possible download rate in kiloBYTES/s? When I'm downloading something from the Internet, the rate never goes above about 350 KB/s, and I've always wondered how that's calculated.
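For what it's worth, here's my own back-of-the-envelope conversion, assuming 1 byte = 8 bits and decimal units throughout; the gap to ~350 KB/s would then be protocol overhead, though I'm not certain that's the whole story:

```python
link_mbps = 3  # advertised connection speed in megabits/s

# Theoretical maximum: megabits -> bits -> bytes -> kilobytes
max_kBps = link_mbps * 1_000_000 / 8 / 1000
print(max_kBps)  # 375.0 KB/s

# The observed ~350 KB/s is about 6.7% below that maximum, which is in
# the ballpark of typical TCP/IP and link-layer framing overhead
overhead = 1 - 350 / max_kBps
print(f"{overhead:.1%}")  # 6.7%
```

So 3 Mbit/s caps out at 375 KB/s in theory, and ~350 KB/s in practice seems plausible once headers and acknowledgements eat their share.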
TIA