Why real estate companies love digital workspaces

Real estate has always been a personal passion of mine, so over the years I have built up a lot of contacts in the industry. Both tocario as a company and I personally therefore have many conversations with industry leaders about the major challenges and changes real estate companies face today.

This post summarizes what is driving this and other industries with regard to the workplace of the future. I will try to answer one question: “How can real estate companies and other traditional industries turn the challenges at hand into opportunities, gaining a competitive advantage by embracing change and industry trends?”

It’s all about the money

The first major challenge is closely related to the construction side of real estate projects. Competition between construction and project development companies has been fierce, and as a result margins have come under heavy pressure. This pressure has forced many project development and construction companies to change the way they manage business proposals in response to requests for proposal (RFPs) for prestigious projects.

Bids on RFPs have shifted from a top-down approach (cost-based estimates plus a safety margin to cover risk) to detailed analyses of the whole project, (re)planning the designs in 3D to arrive at precise, bottom-up estimates of a project’s total cost. This latter approach, part of the BIM (Building Information Modeling) methodology, makes the whole construction process more efficient by identifying planning conflicts upfront and avoiding failure costs. These failure costs are estimated to account for 10% to 15% of a project’s total cost under traditional methodologies.
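As a rough illustration of what is at stake, the 10–15% range can be turned into absolute numbers. The project budget below is a made-up example, not a figure from the text:

```python
# Hypothetical illustration: failure costs that a BIM workflow could help
# avoid. The 10-15% range comes from the article; the budget is invented.
def failure_cost_range(total_cost, low=0.10, high=0.15):
    """Return the (low, high) failure-cost estimate for a project."""
    return total_cost * low, total_cost * high

low, high = failure_cost_range(10_000_000)  # an assumed 10 M EUR project
print(f"Failure costs at risk: {low:,.0f} - {high:,.0f} EUR")
```

On a 10 million project, that is between 1 and 1.5 million that traditional methodologies risk losing to planning conflicts discovered too late.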

The other big challenge concerns the sales processes and methods used by real estate developers and brokers, especially when selling projects that haven’t been built yet. Investors and project developers try to increase volumes by decreasing the payback period (PBP, the time needed to recoup the funds invested in a project) of each project, while maintaining a high sales price to optimize profits and minimize the financial investment risk. To achieve a minimal PBP at optimal margins, the challenge does not lie in producing shiny, high-resolution pictures or glossy marketing to convince someone to invest a significant amount of money based on pictures and location maps alone. The real challenge lies in smoothing the sales process to create value through experience, and in reaching the right buyer: someone who sees the value of a real estate project and is willing to pay the amount requested by the developer. This perceived value is personal and strongly depends on individual preferences, needs and requirements. That is why real estate developers and brokers are looking for disruptive, innovative sales tools as an opportunity to gain a sustainable competitive advantage, increase profits and significantly shorten the PBP.
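The payback period defined above can be sketched as a simple cumulative cash-flow calculation. This is a minimal illustration: the investment and cash-flow figures are invented, and a real PBP analysis would also discount future cash flows.

```python
# Hedged sketch of the payback period (PBP): the time needed to recoup the
# funds invested in a project. All figures below are illustrative only.
def payback_period(investment, yearly_cash_flows):
    """Return the year in which cumulative cash flow covers the investment,
    or None if it never does within the given horizon."""
    cumulative = 0.0
    for year, cash in enumerate(yearly_cash_flows, start=1):
        cumulative += cash
        if cumulative >= investment:
            return year
    return None

# A 2 M investment recouped by sales income in the first years:
print(payback_period(2_000_000, [600_000, 800_000, 700_000, 500_000]))  # 3
```

Shortening the PBP means shifting more of those cash flows into the earliest years, which is exactly what finding the right buyer faster achieves.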

“If you want to know how to sell more, then you’d better know why customers buy.”

Before shaping the ideal innovative sales tool for real estate developers, we have to understand the buying behavior of potential buyers.

In general, the decision to purchase something is strongly influenced by the ability to reduce the (financial) risk and stress of a bad buy, so any sales process or methodology should focus on building trust and confidence to take away hesitation. As an example, let’s identify three major types of buying behavior and map real estate buyers onto them. You’ll see that these behaviors strongly depend on the type of product bought, the size of the investment and the frequency of purchase.

The first behavior relies on a strong return policy to gain trust and convince the buyer to purchase a product. The strategy is simple: if you don’t like it, send it back and get reimbursed, decreasing the financial risk of your purchase. The main example of this behavior is buying fashion and clothing online. The offer online is much larger and easy to access, but the risk that a product won’t fit, or isn’t exactly what you expected, is higher. To remove these buying hurdles, e-stores offer generous return policies to convince customers to purchase and try. These purchases are fairly small and made regularly.

Another behavior, complementary to the first, is buying based on (peer/user) reviews and information such as specifications. Think of buying technology products on Amazon: you want to be well informed before taking action, so you look at the five-star rating and the reviews of other buyers. These purchases are less frequent and consist of medium-sized investments.

The third buying behavior is strongly influenced by, and based on, experience. Experience-based buying is most likely to be observed when making larger investments. These investments are not daily or recurring events but fairly rare: think of buying a car, a boat or an expensive top-notch sound system. That is why huge sums are invested in showrooms, to give a potential buyer an amazing experience, making it easier for them to justify, and feel comfortable about, spending their money. The buyer feels the value of the product they are about to buy, which engages the less rational decision-making part of the brain.

If we want to create an innovative sales tool for real estate developers, with a winning sales strategy that gains a buyer’s trust by taking away the fear of a bad buy, that tool should give any potential buyer added value: the opportunity to experience the real estate project upfront.

“You’ve got to start with the customer experience and work back toward the technology – not the other way around.” – Steve Jobs

The question still remains: “If experience is key to investing in real estate, how can potential buyers be convinced to invest their money in real estate that hasn’t been built yet?”

Nowadays, amazing pictures and location maps are used to give potential buyers a preview of the whole virtual real estate project, to prove its value. Sometimes you can even watch good-looking walkthrough videos with an amazing level of detail. No wonder these pictures and videos come with never-ending sunshine and happy people playing outside all day long: they are meant to trigger your emotions. If you’re lucky, as a potential buyer you may even get to visit the well-designed, cozy interior of a showroom to discover how the interior of the project might look. However, many significant and critical dimensions remain unexplored. You simply can’t visit the project, as it doesn’t physically exist yet, and that is where virtual/immersive reality combined with real-time rendering comes into play.

I recently discovered multiple companies creating virtual/immersive reality content based on 3D and BIM designs. One of them, Nanopixel, added multiple parameters and filters to this virtual reality, up to an unprecedented level of detail. On one of their projects I was able to select the budget, the number of rooms, the orientation and so on. I could also select the time of day and the season, while the system rendered the model in real time to show how the light would come in. For the first time I felt like I was walking around in the apartment, looking at the dimensions that matter to me. It has been the closest experience to physically walking around an existing real estate project. In future projects I expect virtual visitors will be able to change more and more settings, adding dimensions until the experience converges on real life.

Instead of thinking outside the box, get rid of the box

Deepak Chopra

This combination of virtual/immersive reality, real-time rendering and a growing number of parameters, features and dimensions brings us to the next big challenge. The last hurdle on the way to unleashing the full potential of this new sales tool can be summarized as: “How can we bring these models, with their demanding graphical computing requirements, to any potential buyer?” We can’t expect buyers to purchase a heavy workstation, install heavy software and download models comprising multiple gigabytes just to experience a real estate project, can we?

If we want to reach a large group of potential buyers, not only those owning heavy graphical workstations capable of running these immersive models, we need a medium that combines storage and processing power and is accessible anytime, anywhere, with any device. That is how we overcome the last hurdle and create an efficient, innovative sales tool for the real estate business.

This is where cloud computing finally comes in (what did you expect?). With our MyGDaaS platform, performance is streamed from the cloud and no downloads are needed: MyGDaaS runs heavy immersive reality designs in any HTML5 browser, giving any user a fluid experience and letting them open these models instantly, on any device, to experience their next real estate investment.

A conclusion is the place where you get tired of thinking

Arthur Bloch

The main advantage for real estate developers and vendors is the ability to reach a huge number of potential buyers, increasing the project-buyer fit. Finding the right buyer for a project leads to a shorter PBP, higher value and margins, and lower financial risk for developers. Adopting the cloud to sell projects, using immersive reality to enhance the experience, can lead to a significant competitive advantage.

Investors, on the other hand, get access to a larger supply of possible projects, as a real estate project can now be experienced instantly. They no longer need to travel or make appointments: just take your tablet, click a link, and walk through a potential investment project.

It is clear that MyGDaaS is the missing link in creating tomorrow’s innovative sales tool, disruptive through efficiency, removing every hurdle so that any potential buyer can experience their future real estate investment. Try it yourself, and let’s start a conversation about your business.

Going Back to the Mainframe

Original Post on Medium by Tim Edwards (@tim_shane)

Decades ago, a young Bill Gates fought for access to code on a time-shared mainframe. The personal computer didn’t yet exist, not for lack of vision: computing was far too expensive and had little utility to offer the average family. When Gates graduated from Lakeside School in 1973, ‘personal computers’ were boards you built yourself after buying an Intel 8008 chip, running simple programs you wrote yourself. Only a small (but quickly growing) group of hobbyists was involved in personal computing. The academic and business world fed its calculation needs to consoles, with time purchased or reserved on a powerful processing machine.

“There is no reason anyone would want a computer in their home.”

Ken Olsen, founder of DEC, 1977

Perhaps the mistake Ken Olsen made was not being more specific. Home computing was still a hobbyist domain. When PC power rose and modern operating systems began to take form, the rest was history, and the mainframes gathered dust. Today, remote and local servers feed our PCs fresh data through services and hardware built to run continuously, but they resemble PCs with particularly robust hardware more than the mainframes that came before them. At least they did.

With the rise of virtualization, containerization, and powerful distributed server networks from Google, Amazon, and Microsoft, more and more of our computing life is moving to the cloud. The PC is still ubiquitous, but sales of tablets, laptops, and desktops peaked in 2014.

Virtualization of the Personal Computer

The rise in core count and GPU power in servers has lowered the bar for even small businesses to virtualize their workstations and move computing to a single server system. A single server can now power over 100 virtual desktops. These businesses can use cheap hardware known as thin clients to automatically connect users to their virtual hardware. The flexibility lets you use multiple devices to connect to your ‘personal computer’. Your phone, your personal laptop, a hotel computer: all can fire up your virtual PC.

When you combine the rise of server networks like AWS and Azure with the rise of PC virtualization, I see an inevitable rise of ‘desktop as a service’. Microsoft is uniquely positioned in this market. Owning the most popular operating system, Windows, and a rapidly growing server network, Azure, there is little in the way of renting out your computer. They could even create a beautiful Surface thin client. In the not-too-distant future, $20 a month and a small stick similar to a Chromecast will connect you to a PC offering limitless disk space and a basic level of computing power. Like to play games? It will seem silly to buy a $500 graphics card when you can spend an extra $15 a month and have graphics power added to your PC with a mouse click. Working on a project that requires intense calculations? Upgrade your computer for only as long as you need it. Amazon lets you rent a 64-core, 256 GB RAM, 8-GPU server by the hour. A system that easily costs over $100,000 can be had for a few dollars per hour.
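The rent-vs-buy trade-off above can be sketched with a back-of-the-envelope break-even calculation. The $500 card and $15/month figures come from the text; ignoring resale value, electricity and hardware price changes is a simplifying assumption:

```python
# Back-of-the-envelope rent-vs-buy comparison for the $500 graphics card
# versus a $15/month cloud graphics add-on mentioned in the text.
def break_even_months(purchase_price, monthly_rent):
    """Months of renting before buying would have been cheaper
    (ignoring resale value, power costs and price changes)."""
    return purchase_price / monthly_rent

months = break_even_months(500, 15)
print(f"Renting beats buying for roughly {months:.0f} months")
```

Under these assumptions, renting stays cheaper for almost three years, by which point the purchased card would likely be due for replacement anyway.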

Renting time from a powerful central server? Sounds familiar.

As networks grow and become more reliable, more and more computing moves off-site. We are moving back to the mainframe model, but now it’s a personal mainframe. It isn’t Bell calculating account statements; it’s Google finding people in your photos. It’s Apple playing you music. The Chromebook comes close. The downside is that many people aren’t ready to live entirely on the web. Many need locally installed applications, and I don’t see this changing soon. iTunes, Photoshop, video games and more require a local file system. Having a cloud desktop is the answer.

Some of the biggest innovations I expect in the next five years will be the creation and retooling of protocols and infrastructure to accommodate this change. Hardware manufacturers and vendors will struggle and resist, but we’ve seen this before: you can’t argue with market trends. Price, ease, and reliability will win. If you had told me a few years ago that I would love renting Office instead of buying perpetual licenses, I wouldn’t have believed you. After seeing the ease of management, the reliability, and the growing feature set, I’m a believer. Now I want to rent the rest of my computing as well.

How disaster validates technology

Why ransomware and other threats are a catalyst for digital transformation

With all the recent news about the WannaCry ransomware, I was reminded of an interesting trait of human nature: people only start to act when disaster is about to happen, or, in the case of WannaCry, has already happened.

A short background on the setting I work in: my company builds technology for WorkSpace virtualization as an alternative to Citrix or VMware Horizon. It is deeply technical cloud infrastructure technology which is delivered through Service Providers. Most people never see the technology itself, but a lot of people work on it in their daily jobs. As with all virtual desktops, there was obviously a major threat from WannaCry (or any other malware), and many customer systems were affected. Over the weekend we received calls from our Service Providers: their customers had been hit, and they asked whether there was a way to resolve the problem easily.

This is where the differences between technologies and the solutions implemented had a huge impact on data security and data recovery: the Service Providers could recover the customer desktops within 60 minutes, and the customers were happy.

The interesting part is what followed:

One Service Provider communicated what had happened to its customer base, and even very conservative, cautious customers suddenly started inquiring about the Digital WorkSpace on offer. These customers had been struck by disaster and now learned that a new technology is nothing to fear: transforming the workspace actually increases security and availability.

Disaster almost instantly changed customers’ values, beliefs and perception of new technology, because the benefits now exceeded the uncertainty.

The History of Cloud Computing

Definition of Cloud Computing

Cloud computing is the outsourcing of local hardware and software to external service providers via networks such as the internet. From this, the common service models Infrastructure as a Service (IaaS), Platform as a Service (PaaS) and Software as a Service (SaaS) arose. The official definition explains: “Cloud computing is a model for enabling convenient, on-demand network access to a shared pool of computing resources (e.g., networks, servers, storage) that can be rapidly provisioned and released with minimal management effort or service provider interaction.” In short: with internet access, cloud computing is the ability to use any or all of your data and applications from any device.
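As a rough illustration (my own summary, not part of the quoted definition), the three service models differ mainly in who manages which layer of the stack:

```python
# Illustrative responsibility split for the three service models named above.
# The layer names are a common simplification, not an official taxonomy.
SERVICE_MODELS = {
    "IaaS": {"provider": ["servers", "storage", "network"],
             "customer": ["OS", "runtime", "applications", "data"]},
    "PaaS": {"provider": ["servers", "storage", "network", "OS", "runtime"],
             "customer": ["applications", "data"]},
    "SaaS": {"provider": ["servers", "storage", "network", "OS",
                          "runtime", "applications"],
             "customer": ["data"]},
}

for model, split in SERVICE_MODELS.items():
    print(f"{model}: customer manages {', '.join(split['customer'])}")
```

Moving down the list, the provider takes over more layers, which is why SaaS feels the most like “just using an app” while IaaS still feels like running your own machine.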

Where did the term “Cloud Computing” come from?

“It starts with the premise that the data services and architecture should be on servers. We call it cloud computing – they should be in a ‘cloud’ somewhere.” – Eric Schmidt, Google. With the right browser and device, you can access your applications from anywhere. Schmidt, although some may disagree, is credited with coining the term “cloud.” What makes this controversial is that within a month of his remark at the conference, Amazon launched its Elastic Compute Cloud system.

How did cloud computing evolve?

The concept of cloud computing dates back to the early 1960s. According to ComputerWeekly, it evolved from the idea of an “intergalactic computer network.” Remember the mainframe days? With cloud computing, one can store things somewhere else. During the 70s and 80s, the idea of having customers connect all their facilities on the same network was revolutionary. This was when most of the diagrams and workflows were drawn, and when the 90s came around, the idea of the cloud became reality.

Revolutionizing the Cloud

The true revolution came when computer scientists and engineers showed diagrams and slideshows referring to the “network,” meaning computers and storage devices grouped together somewhere else. They initially coined the term cloud, though it was not yet known to the outside world. Companies saw the monumental impact the cloud could have and quickly jumped on board. The arrival of Salesforce in 1999, considered a pioneer in cloud computing, delivered software as a service. Then came Amazon Web Services in 2002, and in 2006, as mentioned before, the Elastic Compute Cloud was introduced. Cloud computing is everywhere now, so making it easier to use has come to the forefront for start-ups and companies like tocario, which offers a new desktop-as-a-service solution.