Correct me if I'm wrong, but to get Gb/s, shouldn't bits per second be divided by 1,073,741,824 (i.e. 1024 × 1024 × 1024), not 1,000,000,000 (i.e. 1000 × 1000 × 1000)?
For instance: 3840x2160 120Hz 8bpc RGB (None) (None) gives a value of 23.89 Gb/s, but shouldn't it be 22.25 Gb/s? That's a whole 1.64 Gb/s difference! Possibly enough to matter when choosing a standard!
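For reference, here's a quick sanity check of the arithmetic behind those two figures (assuming the 23.89 Gb/s value is just the active-pixel rate, 3840 × 2160 × 120 × 24 bits, with no blanking included):

```python
# Quick check of the two divisors (active pixels only, no blanking assumed).
width, height, refresh_hz, bits_per_pixel = 3840, 2160, 120, 8 * 3  # 8bpc RGB

bits_per_second = width * height * refresh_hz * bits_per_pixel
print(f"{bits_per_second:,} bit/s")                    # 23,887,872,000 bit/s

print(f"{bits_per_second / 1_000_000_000:.2f} Gb/s")   # 23.89 (decimal, giga)
print(f"{bits_per_second / 1_073_741_824:.2f} Gib/s")  # 22.25 (binary, gibi)
```

Either way the raw bit count is the same; the choice of divisor is just whether the result is expressed in decimal gigabits (Gb/s) or binary gibibits (Gib/s).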
Is there an exception for some reason when calculating uncompressed video data rates, like there seems to be when calculating hard drive space?