Going Back to the Mainframe

Original Post on Medium by Tim Edwards (@tim_shane)

Decades ago, a young Bill Gates fought for access to code on a time-shared mainframe. The personal computer didn’t yet exist — not for lack of vision, but because computing was far too expensive and had little utility to offer the average family. When Gates graduated from Lakeside School in 1973, ‘personal computers’ were boards you assembled yourself around an Intel 8008 chip, running simple programs you wrote yourself. Only a small (but quickly growing) group of hobbyists was involved in personal computing. The academic and business worlds fed their calculation needs to consoles, with time purchased or reserved on a powerful central processing machine.

“There is no reason anyone would want a computer in their home.”

Ken Olsen, founder of DEC, 1977

Perhaps the mistake Ken Olsen made was not being more specific. Home computing was still a hobbyist domain. As PC power rose and modern operating systems took form, the rest was history, and the mainframes gathered dust. Today, remote and local servers feed our PCs fresh data through services and hardware built to run continuously, but they more closely resemble PCs with especially robust hardware than the mainframes that came before them. At least they did.

With the rise of virtualization, containerization, and powerful distributed server networks from Google, Amazon, and Microsoft, more and more of our computing life is moving to the cloud. The PC is still ubiquitous, but sales of tablets, laptops, and desktops peaked in 2014.

Virtualization of the Personal Computer

The rise in core count and GPU power in servers has lowered the bar for even small businesses to virtualize their workstations and move computing to a single server system. A single server can now power over 100 virtual desktops. These businesses can use cheap hardware known as thin clients to automatically connect users to their virtual hardware. This flexibility lets you use multiple devices to connect to your ‘personal computer’. Your phone, your personal laptop, a hotel computer: all can fire up your virtual PC.

When you combine the rise of server networks like AWS and Azure with the rise of PC virtualization, I see an inevitable rise of ‘desktop as a service’. Microsoft is uniquely positioned in this market. Owning the most popular operating system, Windows, and a rapidly growing server network, Azure, there is little standing in the way of renting you your computer. They could even create a beautiful Surface thin client. In the not-too-distant future, $20 a month and a small stick similar to a Chromecast will connect you to your PC, offering limitless hard disk space and a basic level of computing power. Like to play games? It will seem silly to buy a $500 graphics card when you can just spend an extra $15 a month and have graphics power added to your PC with a mouse click. Working on a project that requires intense calculations? Upgrade your computer for only as long as you need it. Amazon lets you rent a 64-core, 256 GB RAM, 8-GPU server by the hour. A system that easily costs over $100,000 can be had for a few dollars per hour.
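The rent-versus-buy arithmetic above can be sketched in a few lines. The $100,000 purchase price comes from the text; the hourly rental rate is an illustrative assumption, not a quoted AWS price:

```python
# Rough break-even sketch for renting vs. buying high-end compute.
# purchase_price is taken from the article; hourly_rate is an assumed
# illustrative figure ("a few dollars per hour" scaled up for 8 GPUs).

purchase_price = 100_000   # USD, high-end multi-GPU server (from the text)
hourly_rate = 15.0         # USD/hour, assumed rental price

breakeven_hours = purchase_price / hourly_rate
print(f"Break-even after {breakeven_hours:,.0f} rental hours "
      f"(~{breakeven_hours / 24 / 365:.1f} years of continuous use)")
```

Unless you run the machine around the clock for most of a year, renting wins — which is the article's point: occasional heavy workloads are far cheaper by the hour.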

Renting time from a powerful central server? Sounds familiar.

As networks grow and become more reliable, more and more computing moves off-site. We are moving back to the mainframe model, but now it’s a personal mainframe. It isn’t Bell calculating account statements; it’s Google finding people in your photos. It’s Apple playing you music. The Chromebook comes close. The downside is that many people aren’t ready to live entirely on the web. Many need applications installed locally, and I don’t see this changing soon. iTunes, Photoshop, video games, and more require a local file system. Having a cloud desktop is the answer.

Some of the biggest innovations I see in the next 5 years will be the creation and retooling of protocols and infrastructure to accommodate this change. Hardware manufacturers and vendors will struggle and resist, but we’ve seen this before. You can’t argue with market trends. Price, ease, and reliability will win. If you had told me a few years ago that I would love renting Office versus buying perpetual licenses, I wouldn’t have believed you. After seeing the ease of management, reliability, and growing features, I’m a believer. Now I want to rent the rest of my computing as well.

How disaster validates technology

Why ransomware and other threats are a catalyst for digital transformation

All the recent news about the WannaCry ransomware reminded me of an interesting thing about how people think: they only start to act when disaster is about to happen — or, in the case of WannaCry, has already happened.

A short background on the setting I am working in: my company builds a technology for WorkSpace virtualization as an alternative to Citrix or VMware Horizon. It is deeply technical cloud infrastructure technology which is delivered through Service Providers. Most people never see the technology itself, but a lot of people use it in their daily jobs. As with all virtual desktops, there was obviously a major threat from WannaCry (or any similar malware), and many customer systems were affected. Over the weekend we had calls from our Service Providers reporting that their customers had been hit and asking whether there was a way to easily resolve the problem.

This is where the difference in technologies and the solutions implemented had a huge impact on data security and data recovery. The Service Providers could restore the customer desktops within 60 minutes, and the customers were happy.

The interesting part is what followed:

One Service Provider started communicating what had happened to its customer base, and even very conservative and cautious customers suddenly began inquiring about the Digital WorkSpace on offer. These customers had been struck by disaster and learned that new technology is nothing to be feared — transforming the workspace actually increases security and availability.

Disaster almost instantly changed the values, beliefs, and perceptions of the customers regarding new technology, because the benefits exceeded the uncertainty.

The History of Cloud Computing

Defining Cloud Computing

Cloud computing is the outsourcing of local hardware and software to external service providers via networks like the internet. From this arose the common service models: Infrastructure as a Service (IaaS), Platform as a Service (PaaS), and Software as a Service (SaaS). The official definition explains: “cloud computing is a model for enabling convenient, on-demand network access to a shared pool of computing resources (e.g., networks, servers, storage) that can be rapidly provisioned and released with minimal management effort or service provider interaction.” In short: with internet access, cloud computing is the ability to use any or all data and applications from any device.

Where did the term “Cloud Computing” come from?

“It starts with the premise that the data services and architecture should be on servers. We call it cloud computing — they should be in a ‘cloud’ somewhere.” — Eric Schmidt, Google. With the right browser and device, you can access your applications from anywhere. Schmidt, although some may disagree, earned the credit for coining the term “cloud.” What makes this controversial is that within a month of his remarks at the conference, Amazon was set to launch its Elastic Compute Cloud service.

How did cloud computing evolve?

The cloud computing concept dates back to the early 1960s. According to ComputerWeekly, cloud computing evolved from the concept of an “intergalactic computer network.” Remember the mainframe days? With cloud computing, one can store things somewhere else. During the 1970s and 1980s, the idea of having customers connect everything (all their facilities) on the same network was revolutionary. This was when most of the diagrams and workflows were drawn, and when the 1990s came around, the idea of the cloud became reality.

Revolutionizing the Cloud

The true revolution of the cloud came when computer scientists and engineers would show diagrams and slideshows referring to the “network,” meaning the grouping together of computers and storage devices somewhere else. They initially coined the term cloud, though it was not yet known to the outside world. Companies saw the monumental impact the cloud could have and quickly jumped on board. The arrival of Salesforce in 1999, considered a pioneer in cloud computing, delivered software as a service. Then came Amazon Web Services in 2002, and in 2006, as mentioned before, the Elastic Compute Cloud was introduced. Cloud computing is everywhere now, and making it even easier to use has come to the forefront for start-ups and companies like Tocario, which offers a new desktop-as-a-service solution.