So in Micron's datasheets, "bps" actually means bits per second per pin. It just so happens that the value in b/s/pin is numerically the same as the transfer rate in T/s.
For me: to get memory bandwidth, multiply bits per transfer (bus width) by the transfer rate. For the GeForce GTX 1080 Ti: (352 b/T)(11 GT/s) = 3872 Gb/s = 484 GB/s.
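That arithmetic can be sketched as a one-liner (the function name and signature are my own, just for illustration):

```python
def memory_bandwidth_gbytes(bus_width_bits: int, rate_gtps: float) -> float:
    """Peak bandwidth in GB/s: bits per transfer * transfers per second, / 8 bits per byte."""
    return bus_width_bits * rate_gtps / 8

# GeForce GTX 1080 Ti: 352-bit bus at 11 GT/s
print(memory_bandwidth_gbytes(352, 11.0))  # 484.0
```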
Sorry, just some off-topic rambling... Websites are all over the place with units for memory rate: bps, T/s, and Hz. Sometimes they give the actual clock frequency in Hz (1250 MHz for 10 GT/s GDDR5X), and sometimes not ("5000" MHz).