Who invented binary code? And how do you count in base 2? Or is it even base 2?
The first time I ever heard of computers was in 5th grade "new math" (circa 1962), when they taught us to count in base 2:
1=1, 10=2, 11=3, 100=4, 101=5 (there are only 2 digits: a zero, and a one).
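That same counting exercise can be reproduced in a few lines of Python (a minimal sketch using the built-in bin() function):

```python
# Count from 1 to 5, printing each number's base-2 form.
for n in range(1, 6):
    # bin(5) returns '0b101'; slice off the '0b' prefix.
    print(n, "=", bin(n)[2:])
# Prints:
# 1 = 1
# 2 = 10
# 3 = 11
# 4 = 100
# 5 = 101
```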
I googled my questions, and received the following responses:
"The binary number system was first invented back around 1679 by Gottfried Leibniz. The modern, or present day binary code was developed by Claude Shannon."
and http://en.wikipedia.org/wiki/Binary_numeral_system
So, that is how computers compute?
Most people count in base 10.
At least that is what the math teacher told me in 5th grade in 1962.
I'm glad I am not required to fully understand it in order to use it! (My computer, that is.)
Computers these days (can't say for earlier than 1995) work in base 2, with the bits grouped into 8-bit bytes. Even then, the storage units are changing to "simplify" (or dumb-down) so that the basic user can understand, drifting toward base 10. For example, 1 gig used to mean 1024 MB, but to "simplify", it's often counted as a straight 1000 MB now.
In the end, I don't think it matters so long as it works and people know how to use it ;) But it's typical that I learn the 1024-based units and then they decide to change them :D
Like you, I'm glad I don't have to understand it to use mine, as I wouldn't be here if I did. Math was never a strong suit of mine. In any form.....
I suck at maths too. The units, though, are basically (and VERY basically!) built on powers of two:
8 bits is a byte
1024 bytes is a kilobyte
1024 kilobytes is a megabyte
1024 megabytes is a gigabyte
1024 gigabytes is a terabyte.
Except now that's changing to "simplify", to base 10, which is just multiples of ten (the easiest multiplication). Bits and bytes themselves aren't changing (a byte is still 8 bits), but under the decimal convention 1000 megabytes is a gigabyte, and 1000 gigabytes is a terabyte.
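The gap between the two conventions is easy to see with a little arithmetic (a quick Python sketch, nothing more):

```python
# A "gigabyte" under the two conventions discussed above.
decimal_gb = 1000 ** 3   # SI convention: 1 GB = 1,000,000,000 bytes
binary_gib = 1024 ** 3   # binary convention: 1 GiB = 1,073,741,824 bytes

print(decimal_gb)               # 1000000000
print(binary_gib)               # 1073741824
print(binary_gib - decimal_gb)  # 73741824 bytes difference (about 7%)
```

This is why a drive sold as "1 TB" shows up as roughly 931 "gigabytes" in an operating system that counts in 1024s.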
I don't know who decides these things though ...
oh, I also have vague memories of learning this at school - counting in base 2, base 8, base 16... lots of fun.
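Those school exercises in bases 2, 8, and 16 map directly onto Python's format() built-in (a small sketch for illustration):

```python
# The same number written in bases 2, 8, and 16.
n = 42
print(format(n, 'b'))  # '101010'  (base 2:  32 + 8 + 2)
print(format(n, 'o'))  # '52'      (base 8:  5*8 + 2)
print(format(n, 'x'))  # '2a'      (base 16: 2*16 + 10)
```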
My understanding was that computers computed in binary code (at least in the old days) because it was a question of switches being either on or off, so on is 1 and off is 0 and that's all there is.
That sounds quite plausible ... I'll have to research that, I'm intrigued now :) (It was all before my time, my "expertise" such as it is only dates back as far as about 1998)
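For the curious, the on/off idea can be sketched in a few lines of Python (a hypothetical illustration, not how any particular machine was wired): a "half adder" builds one-bit addition out of nothing but on/off signals and two logic gates.

```python
# A half adder: one-bit addition built from only on/off (1/0) signals.
def half_adder(a, b):
    s = a ^ b      # XOR gate gives the sum bit
    carry = a & b  # AND gate gives the carry bit
    return s, carry

print(half_adder(0, 1))  # (1, 0): 0 + 1 = 1
print(half_adder(1, 1))  # (0, 1): 1 + 1 = 10 in binary
```

Chain enough of these together and you can add numbers of any size, which is essentially what the arithmetic unit of a computer does.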
Ah, well, I was exposed to this stuff second-hand back in my childhood and teens. My big brother trained as a computer programmer. He really tried to get me into it too; I remember when I was 14 he introduced me to a self-study book about it. We're talking 1970s here - big machines in their own air-conditioned rooms... /nostalgiafest