Bitrate is the term used for the amount of data recorded to the media (in most cases a memory card) per second. This is often confused with the figure printed on the media itself, which is usually given in MB/s, meaning megabytes per second.
There are 8 bits in one byte, and the difference in the way the two units are written is subtle: 1 Mb/s is 1 million bits per second, while 1 MB/s is actually 8 million bits per second. Hence the confusion!
Long story short, the more data you record per second, the greater the "quality" of your end results. There are many techniques for lowering bitrates, collectively referred to as compression; these tricks give you a usable video while using less data than recording exactly what the camera sees.
The bit rate is quantified using the bits per second unit (symbol: "bit/s"), often in conjunction with an SI prefix such as "kilo" (1 kbit/s = 1000 bit/s), "mega" (1 Mbit/s = 1000 kbit/s), "giga" (1 Gbit/s = 1000 Mbit/s) or "tera" (1 Tbit/s = 1000 Gbit/s). The non-standard abbreviation "bps" is often used to replace the standard symbol "bit/s", so that, for example, "1 Mbps" is used to mean one million bits per second.
One byte per second (1 B/s) corresponds to 8 bit/s.
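To make the byte-to-bit conversion concrete, here is a minimal Python sketch (the function name and the 95 MB/s example rating are illustrative, not from any specific card):

```python
def mb_per_s_to_mbit_per_s(mb_per_s):
    """Convert megabytes per second to megabits per second (1 byte = 8 bits)."""
    return mb_per_s * 8

# A hypothetical card rated at 95 MB/s sustains 760 Mbit/s.
print(mb_per_s_to_mbit_per_s(95))  # -> 760
```

So a camera bitrate quoted in Mbit/s can be divided by 8 to compare it against the MB/s rating printed on the card.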