Floods, heat wave highlight benefits of cloud computing solutions

Companies in Western Canada that rely on on-premise server and storage infrastructure are likely now wishing they had a more comprehensive cloud computing solution in place, as floods and record temperatures illustrate the downsides of depending exclusively on legacy IT hardware.
Temperature and humidity are typically the two biggest environmental factors that data centre operators must account for, and both have been in excess in British Columbia and Alberta of late. First, record flooding of the Bow and Elbow Rivers earlier this summer shut down Calgary and other cities in southern Alberta for days. CBC News reported that parts of downtown Calgary were evacuated for nearly two weeks due to flooding, and that the event will cost the city approximately $265.5 million in repair costs.
In neighbouring British Columbia, early July brought near-record temperatures, The Canadian Press reported. On Canada Day this year, 15 municipalities in B.C. set new record highs, with temperatures reaching 36 degrees Celsius in Merritt and 36.5 degrees Celsius in Pemberton. On July 1, Lytton was the hottest spot in Canada, with highs in that area of southern British Columbia topping 40 degrees.
Extreme weather pushes data centres to the brink
Although businesses can face data centre downtime at any moment, events like these increase the likelihood of servers and other critical IT hardware going down. In Calgary, the flooding inundated major swaths of land, cutting off Internet access for millions and destroying massive amounts of property in its wake. While the British Columbia heat wave has not been reported to have caused servers to shut down, extremely hot days force data centre air conditioning units to work harder to maintain ideal internal temperatures, increasing the odds that vital temperature monitoring equipment could malfunction or lose power.
While flooding and heat waves are rare occurrences, they represent two of the many external factors that on-premise data centre managers must account for in their disaster recovery and business continuity plans. As network connectivity becomes more crucial than ever, downtime becomes more costly. According to the Aberdeen Group, the average cost of an hour of downtime rose from $97,850 in June 2010 to $138,000 by February 2012. As these costs rise, so too does the risk a company takes on by owning and operating its own IT infrastructure and enterprise storage.
Instead of dealing with these headaches, companies can adopt a cloud computing solution. By hosting mission-critical applications, software and data in a remote environment with built-in redundancy, enterprises can be sure their core assets are shielded from whatever may be affecting their headquarters or remote offices. Cloud computing removes most of the location-dependent IT risks an organization would otherwise have to plan for.
Of course, not all cloud environments and vendors are alike, and businesses need to be certain that the cloud computing solution they choose can support their efforts to limit downtime. FlexITy, one of Canada’s largest IT services providers, offers enterprises a best-in-class cloud computing option designed to keep mission-critical operations running. In addition, the data centre infrastructure that supports FlexITy’s cloud services is built on enterprise-grade equipment, so companies can have confidence in the effectiveness of their cloud-based solution no matter what natural events occur.