Considering that data centres are the infrastructure on which all Internet traffic operates, you’d think that Canada’s largest data centre provider, Q9 Networks, would be a household name.
“For a while, it actually was a bit of a tech darling,” Strahan McCarten, Q9’s Senior Vice President of Product Management & Strategy, told me at the recent Capacity Conference in Toronto, “but then once it was purchased by private equity, it sort of fell off the publicly traded stock market map.”
Founded in Toronto in 1995, Q9 Networks went public on the TSX in 2004 before being taken private again for $1.1 billion a couple of years ago by a consortium that included BCE, the Ontario Teachers’ Pension Plan, Providence Equity Partners and Madison Dearborn Partners.
Today, Q9 has 14 data centres spread across six locations in Canada, from Ontario west, all designed to provide continuous power and cooling distribution and engineered to operate at 100% capacity with at least N+1 redundancy.
The reliability of that capacity matters, given how much Internet traffic relies on the backbone that Q9 provides.
“We believe our focus is on running infrastructure,” says McCarten. “So whether or not that’s Ping-Power-Pipe or traditional real estate infrastructure, or compute and storage, or more of a complete private cloud, we tend to build and manage those things directly ourselves. And in some instances, some people might resell either our service, or include our infrastructure as part of a product that they sell. A lot of our customers are SaaS providers. They are running financial services applications in downtown Toronto, or running services for the oil patch in Calgary, and their software runs in our data centre infrastructure.”
Q9 has a data centre in downtown Toronto and another in Brampton, a roughly 16-megawatt facility that is large in terms of both current and expansion capacity.
Three data centres are in Calgary, one in the downtown financial centre, with two others in the northeast and southeast corners of the city.
And then there’s the Kamloops data centre. When I asked McCarten why Kamloops, he had two reasons.
“Because B.C. has, like many Canadian provinces, privacy laws with respect to data residency in the province. But also seismically, it’s more of a desirable area than, say, the Lower Mainland. In the data centre business, that’s an important feature.”
So, even after the Big One rips the west coast as we know it apart, data connectivity will continue to flow.
Q9’s data centres are also designed to keep running in the event of an extended outage of public utility power.
In this century, data centre providers like Q9 have undergone a shift, from a culture of secrecy, fixed services and a limited international footprint to a friendlier profile, managed services and the pursuit of global business.
“For a long time in the data centre business, certainly in the early days, secrecy was pretty important,” says McCarten. “The market has changed, in part because of customer demand, but in part because of the need to connect.”
The need for secrecy also hinged on data residency; for some international customers, including American ones, Q9’s proximity to the United States without actually being physically located there was very attractive.
“When some of the American organizations started to look for opportunities for growth, they came to Canada,” says McCarten. “And secrecy and confidentiality is frankly kind of paramount to the industry. We take physical security very seriously. We take customer IP very seriously.”
Having been in the data centre provider business since the mid-1990s, Q9 has a good overview of how Internet usage has changed, mainly an increase in capacity as customers move towards cloud services, intensive over-the-top video applications and social networking, not to mention the high-frequency trading of the ad tech industry.
The growth of interconnection between companies through direct links, as opposed to the public Internet, has also transformed customer expectations, which have grown more sophisticated in the past decade.
“As that interconnection explodes, the location of the data centre, the people who run that data centre, become as important, if not more important, than the secrecy element that has historically been a trademark of the industry. So we are now being much more active and vocal about where we are, and how you peer with other providers, and how you connect.”
And of course the advent of “the cloud” has transformed how traffic behaves and has altered the role of data centre providers, as cloud-based web services like Amazon Web Services, Salesforce.com, Gmail, Microsoft Azure, or SoftLayer drive a consolidation in the marketplace.
“What would have been 50 customers hosting their individual email platforms in our data centre, we now have one big customer looking for an environment where they can put their mail system that serves all the 50 customers that used to host their infrastructure with us,” says McCarten. “So we start to see a lot of that consolidation, and that tends to come from international players. It used to be very strong domestic business, and now it’s somewhere between a third and 50% international business.”
Culturally, too, the idea of “the cloud” met with resistance as possibly a passing fad or a marketing ploy. Not to mention that the term itself sounded flaky.
“First of all, I think part of the skepticism from the industry at large was just the term ‘the cloud’,” says McCarten. “It was an amorphous term suddenly used to describe everything, which in and of itself doesn’t say anything. You can’t buy ‘cloud’. You can buy Infrastructure-as-a-Service or Software-as-a-Service or Platform-as-a-Service and the sub-components of it. So that I don’t think necessarily helped its cause in the early days, but very clearly there’s a generation of applications that are being designed and sold to companies that are very clearly taking over how certain services are delivered.”
As McCarten points out, the need for increased capacity has been driven by the gradual acceptance of cloud-based or over-the-top services, as customers have gone from kicking the tires to check dependability to outright dependence on those services.
But for applications that are still best run inside a data centre, clients have also adopted the use of the “hybrid cloud”, which can be described as a little bit of colocation combined with cloud services, or perhaps the building of a private cloud for clients who are handling sensitive data while they use the public cloud for customer-facing applications.
“We as a data centre provider want to give customers access to all the tools that they need inside the data centre, and connect them to all the other tools that they want to use that we don’t necessarily provide,” says McCarten. “So that’s why we think about enabling key services and providing others. So we directly sell colocation, we directly sell private cloud, we directly sell on-demand infrastructure. But we also connect, and try to make it very easy to consume, a variety of other public cloud services that we’re not going to be in the business of selling.”
While quality-of-service standards and reliability have generally always been high in Canada, ease-of-use hasn’t always been one of the data centre provider industry’s hallmark qualities.
What’s remained consistent, though, is that ultimately what the customer wants is to solve a business problem. It’s only the approach to solving those problems that has changed over time.
“The objectives themselves, I don’t think have really changed all that much. Reliable, cost-effective, highly agile, rapid response, revenues and expenses meet in time so that they can run their own profitable business. But the sheer number of tools that are available to allow you to do that in IT, which didn’t exist 10 years ago, is something else. Where traditionally companies had to buy infrastructure, and do that on some sort of forecast of demand, the way it is now, if they have peaks and valleys in their demand cycle, they can build to a base load and then use our services to meet that peak.”