Bit = a binary digit, the smallest unit of data a computer can store or manipulate. It can hold only one of two values: zero or one, nothing else.
Byte = 8 bits. The left-most bit is the MSB (Most Significant Bit) and the right-most bit is the LSB (Least Significant Bit). One byte can represent one alpha-numeric character (a-z, A-Z, 0-9) in encodings such as ASCII.
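A minimal sketch of these ideas, using the ASCII code for the letter 'A' as the example byte:

```python
# One byte: the ASCII code for 'A' is 65, i.e. binary 01000001.
byte = ord("A")

# Format as 8 binary digits, MSB first (left-most).
bits = format(byte, "08b")

print(bits)        # "01000001"
print(bits[0])     # MSB, the left-most bit:  "0"
print(bits[-1])    # LSB, the right-most bit: "1"
```

Note how the single character 'A' fits in exactly one byte, as described above.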
K, M, and G are used to designate the prefixes Kilo (thousand), Mega (million), and Giga (billion). Used with the term 'byte', they represent the following values:
Kilobyte (KB) = 2 to the power of 10 (2^10) = 1,024 bytes
Megabyte (MB) = 2 to the power of 20 (2^20) = 1,048,576 bytes
Gigabyte (GB) = 2 to the power of 30 (2^30) = 1,073,741,824 bytes
6 MB = 6 x 2^20 = 6,291,456 bytes
Notice that KB, MB, and GB do *not* mean bytes in multiples of one thousand, one million, or one billion.
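The calculations above can be checked directly; a short sketch:

```python
# Storage prefixes are powers of two, not powers of ten.
KB = 2 ** 10
MB = 2 ** 20
GB = 2 ** 30

print(KB)      # 1024
print(MB)      # 1048576
print(GB)      # 1073741824
print(6 * MB)  # 6291456  -- the 6 MB example from the text
```

Each value is slightly larger than the "round" decimal number the prefix suggests, which is exactly the point made above.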
But it's a little different when you're talking about data communications over a network:
Kilobits = 1,000 bits
Megabits = 1,000,000 bits
So 56 kbps is really 56,000 bits per second (not 57,344 bits, which is what 56 K would mean if we used the power-of-two convention from storage).
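The difference between the two conventions can be sketched like this:

```python
# Data-communications prefixes are decimal; storage prefixes are binary.
decimal_kilo = 1000     # kilo as used for network speeds
binary_kilo = 2 ** 10   # kilo as used for storage (1024)

line_speed = 56 * decimal_kilo   # what "56 kbps" actually means
storage_style = 56 * binary_kilo # what 56 K would be in storage terms

print(line_speed)      # 56000 bits per second
print(storage_style)   # 57344 bits
print(line_speed // 8) # 7000 bytes per second (8 bits per byte)
```

Dividing by 8 converts bits to bytes, so a 56 kbps line moves at most 7,000 bytes of raw data per second.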