- Feb 1, 2006
- 44
- 0
- 0
Hi all, I've been doing some reading about networking and physical media, specifically the different categories of UTP cable (Cat 1-6). One thing I'm having a hard time understanding, which the book didn't explain too well, is the difference between MHz and Mbps.
For example, the book I'm reading says that Cat 5 is rated for 100MHz, and then it says that Cat5e is also rated for 100MHz but can handle gigabit Ethernet. So Cat 5 and Cat 5e are each 100MHz, but 5e can handle 1000Mbps while Cat 5 only does 10/100Mbps, correct?
How so if both are 100MHz?
Also, one other question. The book says to use Cat5e as the minimum when installing cable, because some cables are now certified for 350MHz or beyond, which allows the cable to exceed 1Gbps. So I'm wondering: is the author referring to Cat5e cable that can do 350MHz (because earlier, as I mentioned, he said it was rated at 100MHz)?
Any help is greatly appreciated!