RedFox
YTTalk's Fox
Bytes are used in data transfer. In gaming, data within the game's code is transferred through the game into the system to be read. Bits, however, are the multimedia of said game. All games use rendering for their images, audio, and video, same as with videos for YouTube and such. In order to render every bit of the image in a game's frame, there are bits that contain part of the image. The bytes are the transfer of that and other data through the system. Think of it as a shipping barge carrying a load of shipping containers. This is all knowledge I have collected through my time taking classes in video game development and other such research, so I am sorry that I do not have a source.
The bits per second of a compressed video and the bytes per second of a graphics card's memory bandwidth are two completely different things; you're talking complete b******t without some form of a source.
Yes, we know games have to be rendered on the GPU to be displayed on screen, but how the GPU renders and outputs frames is completely different from the bit rate of a rendered AVI, MP4, etc. video file.
Games and graphics cards don't compress the images displayed on screen. The amount of data a game is processing at once depends on the GPU, and that does not affect the quality of the game on screen or when recorded to a file. It only affects the performance of the game if the GPU's memory bandwidth is too low, the interface is not wide enough (64-bit, 128-bit, etc.), and the GPU cannot process the data fast enough.
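To put rough numbers on how different these two things are, here's a back-of-envelope comparison (the GPU figures are example values I picked for a mid-range 128-bit card, not something from this thread): a compressed 24 Mbps video stream versus a GPU's raw memory bandwidth.

```python
# Compressed video stream vs. GPU memory bandwidth -- example figures only.
video_bitrate_bps = 24_000_000        # a 24 Mbps compressed video stream
video_Bps = video_bitrate_bps / 8     # convert bits/s to bytes/s

bus_width_bits = 128                  # memory interface width (example)
mem_clock_hz = 1_750_000_000          # effective memory clock (example)
gpu_bandwidth_Bps = (bus_width_bits / 8) * mem_clock_hz  # bytes per second

print(f"Video stream:  {video_Bps / 1e6:.0f} MB/s")       # 3 MB/s
print(f"GPU bandwidth: {gpu_bandwidth_Bps / 1e9:.0f} GB/s")  # 28 GB/s
```

Even with these made-up but realistic numbers, the GPU is moving thousands of times more data per second than a 24 Mbps file delivers, which is why the two rates have nothing to do with each other.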
Please don't reply unless you actually have something to back up your statements; giving bad advice is worse than giving no advice at all. You're suggesting the OP render at 24 Mbps when YouTube compresses it below that anyway.