The world of IT is full of acronyms and abbreviations. So is the world of measurements. That's pretty much a guarantee that everyone working with IT measurements is going to get confused. For example, what is the difference between an MB and an Mbit? What does ERP or CMS stand for? The above are examples of standard acronyms we hear all the time. Inside every organization there is likely to be a whole range of industry-specific acronyms as well. For instance, in my company we often talk of LMS, CAT, SEO, SL20, and many, many more.
If someone mentions an unfamiliar acronym in your presence, you should ask the person to clarify and explain it. Most likely you will not be the only person in the room who feels lost. Asking questions is not seen as a sign of weakness in a modern IT department. It is a sign of curiosity and seeking to better oneself and one's knowledge about a subject. But let's not kid ourselves... in your IT career, you will have acronyms for everything. Soon it will be you confusing newbies with acronyms of your own.
IT needs acronyms because information is constantly expanding out of control
Here are some examples of information explosion:
· The CPU at the heart of Apple's iPhone 4 is more powerful than a Cray supercomputer from the 1970s that cost millions of dollars.
· The software that powers the space shuttle runs on a single MB of RAM. The minimum RAM requirement for Windows 7 is 1 GB (or 1,000 times as much).
· If automobiles had improved their performance at the same rate as computers over the past 50 years, we might all be driving $500 cars that go 5,000 kilometers per hour, and get 500 kilometers per liter of fuel. They would probably fly too. Cool.
· In 1980, a 10MB hard drive cost over $4,000 and was considered to be a lot of storage. Today, you can't even buy a 10MB hard drive. It would be worthless and pointless. Even Windows 95, released way back in April 1995, required a 55MB minimum hard drive. Now you can get a 1TB drive for under a hundred dollars.
· Did you know that in 2010 YouTube transferred more data in one day than the entire Internet did in the year 2000?
Acronyms and abbreviations are essential to IT. Learning them is easier than you think.
Each generation of computers tends to be an order of magnitude more powerful than the last. Some people have even tried to quantify this phenomenon. One example is Moore's Law, which observes that the number of transistors on a chip doubles roughly every two years. A simple way to put it: a brand new computer is completely outdated within five years.
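You can see how fast that doubling adds up with a quick sketch. This is just illustrative arithmetic (the function name and the two-year doubling figure are assumptions based on the commonly cited version of the observation, not anything official):

```python
def transistors_after(years, start=1, doubling_period=2):
    """How many times the starting transistor count has multiplied,
    assuming a doubling every `doubling_period` years (Moore's Law)."""
    return start * 2 ** (years / doubling_period)

# After 5 years, a chip has roughly 5.7x the transistors it started with:
print(round(transistors_after(5), 1))  # → 5.7
```

After ten years that multiplier is 32x, which is why a decade-old machine feels ancient.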
How do you remember the difference between a kilobyte, megabyte, gigabyte, and a terabyte without going crazy?
This is how. Break the terms down into their components. The word "megabyte" consists of two different words... "mega" and "byte".
"Mega" is a multiplier prefix found at the start of measurement words which means 1,000,000 (or a million times).
One "byte" is equal (normally) to 8 "bits".
A "bit" is equal to 1 switch "on" or "off" (either a 1 or 0).
It takes between 1 and 3 bytes to hold a letter, depending on what kind of character encoding you are using.
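You can check this yourself. In UTF-8 (the most common encoding on the web), a plain English letter takes 1 byte, while accented letters and symbols take more:

```python
# Count how many bytes each character needs in UTF-8.
for ch in ["a", "é", "€"]:
    print(ch, len(ch.encode("utf-8")), "byte(s)")
# a → 1 byte, é → 2 bytes, € → 3 bytes
```

This is why the "bytes per letter" figure is a range rather than a single number.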
So a megabyte is equal to 8,000,000 bits... or roughly equivalent to 500,000 letters of the alphabet (at about 2 bytes per letter). This is about the length of a short novel.
The prefix "giga" means 1,000,000,000 (or one billion times).
So how many bits are in a gigabyte? That's right... 8,000,000,000. Enough to store 1,000 novels.
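The arithmetic above is simple enough to write down once and reuse. Here is a minimal sketch (the constant names are my own, not a standard library):

```python
BITS_PER_BYTE = 8
MEGA = 10**6
GIGA = 10**9

megabyte_in_bits = MEGA * BITS_PER_BYTE   # 8,000,000 bits
gigabyte_in_bits = GIGA * BITS_PER_BYTE   # 8,000,000,000 bits

# Roughly 500,000 letters per megabyte, assuming ~2 bytes per letter:
letters_per_megabyte = MEGA // 2

print(megabyte_in_bits, gigabyte_in_bits, letters_per_megabyte)
```

Swap in any other prefix (kilo, tera, and so on) and the same pattern works.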
Once you learn the prefixes and basic units of measure, it's all very easy.
There are bigger units of measurement than an exabyte, such as a zettabyte and a yottabyte, but we will not discuss them in this course at this time. But your children and grandchildren will have to know them!
How big is an exabyte? For instance, every word ever spoken by every member of mankind would fill about 5 exabytes of storage space. The Internet transfers around 21 exabytes of data every month.
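To get a feel for that monthly figure, we can break it down to a per-second rate. This is back-of-the-envelope arithmetic using the 21-exabyte figure from the text and an assumed 30-day month:

```python
EXA = 10**18
TERA = 10**12

monthly_bytes = 21 * EXA                    # ~21 exabytes per month
seconds_per_month = 30 * 24 * 3600          # assuming a 30-day month

bytes_per_second = monthly_bytes / seconds_per_month
print(f"roughly {bytes_per_second / TERA:.0f} TB every second")
```

That works out to several terabytes moving across the Internet every single second.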
Now those are the "big number" multipliers. In IT however, things are not always big. They can also be very small. Unfortunately the "small multipliers" are often used erroneously. For example the words "microprocessor" and "nanotechnology" use mathematical prefixes ("micro" properly means one millionth, "nano" one billionth), but not accurately. What they really mean in general is "really damn small".
Now that we know the prefixes... it's time to learn the units. We have already discussed the byte and the bit. These are the basic units of measurement in computing. Although computers are much faster now than they were in 1940 (around the time they were invented), they are still digital, meaning that they can only understand zeros and ones (0 and 1).
You can put all the prefixes you learned (especially the big ones) onto any of the following units of measurement related to IT:
What can you do with these measurements? Almost everything! Compute bandwidth, measure processor speed in TFLOPs with a benchmarking program, compare quality of digital cameras or monitors, and much, much more.
Have fun, and don't fear IT acronyms and measurements. They are only here to help us.