Cloud Mea Culpa: Nick Carr Was Right and I Was Wrong

The point of these changes is that they were not merely fashionable initiatives. Each represented a significant improvement in computing capability, driven by the advance of technology. And each resulted in a massive change in infrastructure, protocols, connectivity, and hardware. This is the point, I think, where Carr's analogy of the electricity grid breaks down. Unlike computing, the fundamentals of electricity settled down quite early and have remained largely stable ever since. Otherwise, how could Livermore's famous 100-year lightbulb remain burning to this very day? If electricity were like computing, the City of Livermore would have been forced to discard the lightbulb due to changes in power frequencies, socket obsolescence, incompatible hardware, or the like.

While it is tempting to, in a bit of a paraphrase, forecast The End of Computing, it's unlikely that IT development will stop at Amazon-hosted (or Microsoft- or Google-hosted, for that matter) centralised computing. To extrapolate one current trend, look at the explosion of computing-capable portable devices like smartphones and music players. What will happen to computing when, thanks to Moore's Law, these become as powerful as today's most powerful desktop computers, with just as much storage as well? Surely computing infrastructures will evolve to integrate that new distributed computing capacity.

Just so will data centers. I doubt that today's best practices in data centers, as evinced by cloud providers, will remain static.

Carr's more general point, however, is well taken. What is the role of corporate data centers in this still-developing IT landscape? Notwithstanding the continuing rapid evolution of infrastructure, does it make sense for companies to run their own data centers going forward? Putting the question another way, is computing so changeable that there is still potential for competitive advantage in running one's own infrastructure? As a corollary, is the variety of corporate applications so profound, and the differences in their architecture, hardware requirements, and so on so important, that it precludes moving to a standardised environment, a la Microsoft's San Antonio data center?

It's clear that the way data centers are run is moving away from manual operation toward automation. As the earlier piece discussing cloud data center cost structures noted, automation is a key reason those centers' cost structure is so much lower than that of standard data centers. For an internal data center to remain competitive from a cost benchmark perspective, it must attain those same automation capabilities. Much of the discussion about "internal clouds" posits that companies can match the agility, automation, and economies of scale that external cloud providers achieve.

I sometimes hear advocates of internal clouds say that they want to provide a way for companies to leverage their existing infrastructure, but make it cloud-capable by layering some additional hardware and software on top. It's an attractive argument, but is it attainable, or does it merely attempt to justify remaining committed to the sunk cost represented by current data centers? More specifically, can real cloud capability be achieved with the current infrastructure, representing as it does the variety of hardware and software purchased to implement specific application initiatives? My sense is that much of existing infrastructure cannot be moved into an automated environment, and a significant part of it will need to be scrapped or replaced with new automation-friendly technology.

To draw another historical analogy, a complement to the one at the end of the cloud center cost posting alluded to above, consider how mass production evolved. When Henry Ford finally realised he needed to replace the Model T with a newer design, he found that his highly automated factory lacked flexibility: it was designed for one thing, making Model Ts. The redesign was enormous, lasting 18 months and requiring the replacement of 40 percent of the factory's machine tools, which could manufacture nothing but the T, before production of the Model A could begin. One might say he was "locked in" to manufacturing Model Ts. Ever since, auto factories have been designed to be far more flexible, enabling general automation no matter what specific type of car is being built.

Today's CIOs are likely to confront the same issue: if they want to move to a fully agile, fully flexible infrastructure, they'll need a general redesign of systems and processes. Put bluntly, it will require significant investment to achieve "internal cloud" automation. Should they do so, that might put them on a level playing field, cost-wise, with commercial cloud providers. But it still won't answer the question of whether "self-generation" of IT infrastructure is worthwhile when public options are available. Or, given the ever-increasing cost and complexity of infrastructure, does it make better sense to focus on applications, which, remember, is where IT capability offers support for competitive advantage, and let someone else deal with the plumbing?

Bernard Golden is CEO of consulting firm HyperStratus, which specialises in virtualisation, cloud computing and related issues. He is also the author of "Virtualization for Dummies," the best-selling book on virtualisation to date.
