I recently came across an article called “Building the business case for a server refresh,” written by Daniel Eason, an IT executive, on how to justify a hardware refresh to stakeholders and decision makers. A typical refresh should take place every 3-4 years; however, when the economy dipped in 2008, many companies put off upgrading their hardware. In this blog article, I will argue that although his reasoning for a server refresh is sound, moving to a Cloud environment answers his points even better and offers a solution in which a server refresh never has to be considered again. First, let’s look at an excerpt from his article and his main points when presenting the argument for a server refresh. To read the whole article, click the link below: http://searchvirtualdatacentre.techtarget.co.uk/tip/Building-the-busines…
“Server refresh strategies can be difficult to justify to business stakeholders. And when your existing servers were approved only three or four years ago, building a solid business case for new purchases gets even more difficult. Another challenge is gaining trust among stakeholders, especially when the perception is that a server refresh strategy has been proposed purely for technology’s sake and not to benefit the business.”
“When building a business case, consider the following factors:”
• Operational cost reductions
• Maintenance cost avoidance
• Service reliability
• Better server consolidation ratios
• Application licensing
• Breaking the vicious cycle
I agree with this article that the typical server refresh time frame should be every 3-4 years. Hardware, like anything else, has a lifespan. Pushing it beyond that ultimately costs you more money, because aging hardware needs more maintenance and is not as efficient as it could be. Given the pace of advances in servers and computer hardware, each of his arguments is grounded in fact and will benefit the company in the long run. What the article does not touch on, however, is that in another 3-4 years the company will face another large capital expenditure, negating the savings gained. The main objective of his article is to give IT professionals pointers on breaking the vicious cycle of putting off server refreshes past the recommended 3-4 year time frame. I argue that you should consider breaking the cycle of refreshing hardware altogether by moving your business applications to the Cloud. In this article, let’s take each of his arguments, apply them to Cloud Computing, and show why moving to the Cloud is a better answer than just a server refresh.
Operational cost reductions
Moving to the Cloud means you pay a monthly per-user fee for business applications hosted on virtualized servers. It also means you no longer have to house servers on premise, so you can reallocate your IT staff and divert the cost savings to other areas of your business. This has a much greater impact on operational costs than simply refreshing servers for lower power consumption and connectivity costs.
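To make the comparison concrete, here is a back-of-the-envelope sketch in Python. Every figure in it is a hypothetical placeholder, not a quote from the article or from any vendor — plug in your own hardware, staffing, and per-user prices:

```python
# Rough 4-year cost comparison: on-premise refresh vs. cloud hosting.
# ALL figures below are hypothetical placeholders for illustration.

YEARS = 4
USERS = 25

# On premise: a server refresh (capex) plus yearly power,
# connectivity, and upkeep (opex).
server_capex = 30_000   # hypothetical refresh purchase
yearly_opex = 12_000    # hypothetical power, connectivity, maintenance
on_premise_total = server_capex + yearly_opex * YEARS

# Cloud: a flat monthly per-user fee, with no capex.
monthly_fee_per_user = 50  # hypothetical hosted-application fee
cloud_total = monthly_fee_per_user * USERS * 12 * YEARS

print(f"On premise over {YEARS} years: ${on_premise_total:,}")
print(f"Cloud over {YEARS} years:      ${cloud_total:,}")
```

With these made-up numbers the cloud option comes out ahead, but the real value of a worksheet like this is forcing both sides of the ledger into one place before presenting to stakeholders.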
Maintenance cost avoidance
By moving business applications to the Cloud, in most instances you no longer pay maintenance costs. Maintenance contracts are usually necessary when applications are deployed on premise; in a Cloud-hosted environment, the service provider is responsible for all maintenance and upkeep of the system.
Service reliability
Cloud environments typically come with an uptime guarantee spelled out in an SLA (Service Level Agreement). For example, our NjevityToGo Cloud environment guarantees 99.9% uptime, which means no more than about 8.76 hours of unscheduled downtime in a given year. That level of reliability is achieved through redundant power sources, redundant connectivity, and failover processes that give users maximum uninterrupted service. When you really think about it, you probably experience more downtime than this when hardware is housed on premise and serviced by a full-time IT staff.
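The downtime figure above is easy to verify yourself. A minimal Python sketch that converts any SLA uptime percentage into the maximum unscheduled downtime it permits per year:

```python
# Convert an SLA uptime percentage into the maximum allowed
# unscheduled downtime per year.

HOURS_PER_YEAR = 365 * 24  # 8,760 hours in a non-leap year

def max_downtime_hours(uptime_percent: float) -> float:
    """Maximum yearly downtime permitted by an uptime SLA."""
    return HOURS_PER_YEAR * (1 - uptime_percent / 100)

# A 99.9% ("three nines") SLA allows about 8.76 hours per year;
# a 99.99% SLA tightens that to well under an hour.
print(round(max_downtime_hours(99.9), 2))   # → 8.76
print(round(max_downtime_hours(99.99), 2))  # → 0.88
```

Running the same formula against a vendor's stated SLA is a quick sanity check when comparing hosting proposals.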
Better server consolidation ratios
One of the biggest advantages of Cloud Computing comes from economies of scale: the reduction in per-unit cost as the scale of an operation increases. In Cloud Computing, pooling application users and maximizing server utilization through virtualization reduces costs, and those savings are passed along to the users.
Application licensing
Through advances in virtualization, the ability to pool application resources allows for monthly pricing as opposed to purchasing software licenses outright. Rather than owning the software, you pay for Software as a Service for a low monthly fee. Another advantage of deploying in the Cloud is the ability to scale the number of users up or down as needed. In addition, upgrading business application software to the current version is simpler and more cost effective in the Cloud than purchasing and re-implementing the latest version on premise.
Breaking the vicious cycle
The author’s point is that a server refresh should happen every 3-4 years, but once your business puts off a refresh, it will keep telling itself that it is ok to put it off again. You can break the vicious cycle of server refreshes altogether by moving to the Cloud, where you will never have to worry about another refresh again.
The takeaway is that although refreshing servers is important if your business has an on-premise deployment strategy, Cloud Computing is a much simpler and more cost effective way to deploy business applications. Even with these significant advantages, IT’s biggest objection to moving to the Cloud is that “data is not secure.” Check out the link below to a previous blog article I wrote that addresses these security concerns and explains why IT may not be as objective as you think when considering the Cloud. https://www.njevity.com/blog/how-secure-your-premise-application