Cloud Basics Part 4: Virtualization – One of the Really Big Things

by Greg Dixon on May 11, 2012

When the moon is in the seventh house  

and Jupiter aligns with Mars.

Then peace will guide the planets

and love will steer the stars.

This is the dawning of the Age of… uh… Cloud Computing.

People like to ask me questions about the Cloud, as if I were the one with answers. One question I get a lot is, “Where did the Cloud come from? It seems to have happened all of a sudden.”

Actually, the Hype Machine for the Cloud has been at full throttle for some time now.  How does that work?  Who decides when to turn that thing on? I suppose it has to be some secret consortium that watches and measures and predicts and plans for such things.  They publish a massive report that they charge big bucks for and the machine sputters to life.

I think the Cloud has developed because of an alignment of several technology evolutions. For several years now, some really big things have been progressing, and now… peace will guide the planets and love will steer the stars.

Really big thing #1:  Broadband Internet at home

I really do remember when the Internet got started. I remember Bulletin Board Systems (BBS), and I remember the awful screech of my 2400-baud Hayes Smartmodem coming to life. I remember having to decide whether I wanted to be on the Internet or talk on the phone. And I remember it being slow. Really, really slow.

E-mail, it turned out, was a pretty good idea. And so were chat rooms and photo sharing and graphical user interfaces. Browsers were a really big hit, and the more people used the Internet, the more they liked it. Demand is king in business, so some smart and profit-minded people decided to speed the thing up. The faster it got, the more you could do with it, and the more you did with it, the faster it had to be. Speed is addicting, so if it’s speed you want, then speed you’ll get.

Today, we are streaming music and movies, making telephone calls, and having two-way interactive video conferences. All of this is made possible by something called broadband. Broadband is a term that is misused and misunderstood; it has come to mean a fast Internet connection for the home and small-business user. Of course, fast circuits have been available to businesses for years, at a cost. But for the sake of simplicity, let’s just say that the Internet nowadays is screamin’ fast compared to the old days.

My Hayes modem ran at 2.4 Kbps. That’s two thousand four hundred bits per second. Today, for not much more money than I paid for my dial-up circuit, I can get 2.4 Mbps or more. That’s one thousand times faster. So, what can you do with all that speed? A lot. One thing you can do is use your browser to run a pretty sophisticated application that is not even loaded on your hard drive. It lives in the Cloud. But that brings us to the next really big thing.
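
If you’d like to see just how big that jump is, here’s a quick back-of-the-envelope calculation. (A toy Python sketch using the round numbers above; real connections have overhead, so your mileage will vary.)

```python
# Toy comparison: time to move a 5 MB file (roughly one MP3 song) over
# a 2,400 bit/s dial-up modem vs. a 2.4 Mbit/s broadband line.
# Round numbers from the paragraph above; real links add overhead.

FILE_SIZE_BITS = 5 * 1024 * 1024 * 8  # 5 megabytes, expressed in bits

for name, bits_per_second in [("dial-up at 2.4 Kbps", 2_400),
                              ("broadband at 2.4 Mbps", 2_400_000)]:
    seconds = FILE_SIZE_BITS / bits_per_second
    print(f"{name}: about {seconds:,.0f} seconds")
    # dial-up: ~17,476 seconds (almost five hours); broadband: ~17 seconds
```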

Really big thing #2:  Thin Client Applications & Software as a Service – SaaS

Software makes your computer think happy thoughts.  Without software, your computer is just a warm brick.  You can buy software and load it on your computer and accomplish all sorts of things.  And that’s the way it’s been done for a long time. You buy a licensed copy of a software package and it is yours to keep. You use it for as long as you like and when you need a new one, you throw the old one away and start over. This kind of license is called a perpetual license, because you own it forever.

Most software has data associated with it. That data might reside in a central database, shared by many users. Your email server is an example of this. All the email “data” resides on the host server, and your PC software is merely a client of the host. Since the client software is loaded on your computer and runs there, it is called a “thick client,” thick referring to the fact that it takes up a lot of room in memory and on your local hard drive, and that it takes a thick wallet to afford it.

So, what is the alternative? It’s called a “thin client” application. (Imagine that.) It’s thin because it is a relatively small piece of software, and it is generally free. Your Internet browser is a thin client. With the faster speeds of broadband, it is now possible to put not only your database server somewhere out in the cloud, but also the software that accesses it. All you have on your local computer is your “thin client” browser, with an account that gives you access to the application and its data. Now, instead of owning your software, you are using someone else’s software and database server. Your software is now provided to you as a service, instead of as a license. Thus, Software as a Service, or SaaS. So, speaking of good ideas, this is a really good one, it seems. Now I can access my very own server, with my very own data, and I don’t have to own anything? That’s right… well, sort of. But this brings us to our final really big thing. Read more on SaaS here: Cloud Basics Part 3
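
To make “thin” concrete, here’s what a thin client boils down to, sketched in a few lines of Python instead of a browser. (The URL and token below are made-up placeholders, purely for illustration, not a real service.)

```python
# A thin client in miniature: the only thing on the local machine is a
# tiny program that speaks HTTP. The application and the data both live
# on the provider's server. NOTE: the URL and token are hypothetical.

import json
import urllib.request

url = "https://app.example-saas.com/api/v1/invoices?status=open"  # hypothetical endpoint
req = urllib.request.Request(url, headers={"Authorization": "Bearer <token>"})

with urllib.request.urlopen(req) as resp:
    invoices = json.load(resp)  # the "application logic" all ran server-side

print(f"{len(invoices)} open invoices, and nothing installed locally")
```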

Really big thing #3: Server Virtualization

So, if a lot of people really liked the idea of thin clients and Software as a Service, then someone would have to build a really big building to hold all those servers.

True, but this is not just a problem for SaaS providers. Every I.T. department of any size has run into the same problem. As you grow, you add more and more servers. Each one has a specific and important purpose. They all require power and cooling and maintenance. And, if you looked at each one pretty closely, you might find that the server is really pretty bored. It runs its program and does its thing and never really breaks a sweat. In fact, most servers run at about 5 to 7 percent of their processing capacity. It looks like we need another really good idea here. So, someone said, “Hey! Let’s go back and do things the way IBM did it back in the ’60s!! Yeah!!!… er… WHAT?!?”

No kidding. The answer to having too many servers lies in a very old way of doing computing. In the 1960s and ’70s, IBM made really big computers called mainframes. They were very expensive, so everybody had to share just one computer. You accessed the computer from a dumb terminal. (This was before PCs.) A dumb terminal had no software at all, which makes it the ultimate thin client. Each user acted as a client of the shared software and database that ran on the central processing unit (CPU). The CPU might be down the hall, or across the country. If it was remote, you used a very fast data circuit to access the remote software. (Is this all sounding strangely familiar?) Since there was only one computer, each user was given a tiny slot of time. In that slot, you could run a tiny portion of your software’s tasks. But if you were given enough time slots and you connected them together, it seemed that you were the only user of this multi-million-dollar machine. In actuality, you only had a “virtual machine,” not a real one. Hundreds of users could “time-share” on these big computers, each with their own virtual mainframe, each accomplishing the tasks they required.
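
If the time-slot trick sounds abstract, here’s a toy sketch of the idea. (A simplistic round-robin in Python with made-up jobs; real time-sharing systems were vastly more sophisticated.)

```python
# Toy round-robin time-sharing: one shared "CPU" hands out fixed time
# slots to several users' jobs in turn. Each user sees steady progress,
# as if the machine were theirs alone. Illustrative only.

from collections import deque

# (user, slots of work remaining) -- made-up jobs for the demo
jobs = deque([("alice", 3), ("bob", 2), ("carol", 4)])

tick = 0
while jobs:
    user, remaining = jobs.popleft()
    tick += 1
    remaining -= 1                      # this user gets one time slot
    print(f"slot {tick}: ran {user}'s job ({remaining} slot(s) to go)")
    if remaining:
        jobs.append((user, remaining))  # back of the line for the next turn
```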

Fast-forward to now. Let’s say I have 40 servers, each running one application, each one drastically under-utilized. The solution? Virtualization. It’s déjà vu all over again.

Server virtualization requires that you install a new piece of software on the server called a “hypervisor.” This will allow a single server to be split up into several “virtual” servers. Each one can have its own operating system, its own software application, and its own time slot to work in. It works remarkably like the IBM mainframe in that many users can simultaneously access their application and data, and instead of buying one server for each application, you have one server for many applications. Each multi-user application has its very own “virtual server.”
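
To see why that pays off, here’s a rough consolidation sum using the utilization figures from earlier. (Hypothetical numbers; real capacity planning weighs far more than CPU.)

```python
# Back-of-the-envelope consolidation using the figures from above:
# 40 physical servers, each loafing along at roughly 6% utilization.
# How few virtualized hosts could carry the same work with headroom?
# Hypothetical numbers -- this is not a capacity-planning tool.

import math

physical_servers = 40
avg_utilization = 0.06      # the ~5-7% figure from earlier
target_utilization = 0.60   # leave 40% headroom on each virtualized host

total_work = physical_servers * avg_utilization            # 2.4 servers' worth of real work
hosts_needed = math.ceil(total_work / target_utilization)  # pack it into fewer boxes

print(f"{physical_servers} servers -> {hosts_needed} virtualized hosts")  # 40 -> 4
```

CPU is not the only constraint, of course; memory, disk, network, and redundancy all push the host count back up, which is why the real-world reductions cited below are closer to 50 to 70% than to this theoretical best case.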

Server virtualization has reduced the number of servers in an average data center by 50 to 70%. This makes I.T. managers and CFOs see GREEN in more ways than one (it saves both money and energy). Virtualization has had a huge impact on local data centers as well as the large public data centers that make up the Cloud. Honestly, it just might be server virtualization that makes Cloud Computing possible. Without it, the costs would simply be too high to make it feasible.

So, Cloud Computing is a convergence of several Really Big Things. But each of these things is really not all that new. In fact, they all have their genesis in the Age of Aquarius.

(Cue music… Fade to black…)

Greg Dixon
Head Geek
ScanSource, Inc.

 

This post was written by

Greg Dixon joined ScanSource in 1992, where he serves as Chief Technology Officer. In this role, Greg oversees the technical support services provided by the company, as well as develops and manages strategic technological initiatives for ScanSource customers. Greg has more than 34 years of experience in the technology arena. He is seen as an industry expert and has been a featured speaker at many industry events over the years.

Learn more about this topic at scansourcecatalyst.com, scansourcecommunications.com, scansource.com, and scansourcesecurity.com.
