Opinion

Has the traditional view of the network reached the end of the line?
By: Rob Bamforth, Principal Analyst, Quocirca
Published: 23rd October 2013
Copyright Quocirca © 2013

When John Gage of Sun Microsystems coined the term ‘the network is the computer’ in 1984, it seemed to effectively sum up the distributed computing paradigm that was overhauling the old world of mainframes and minicomputers. For long enough it looked pretty accurate, so much so that Cisco tweaked it a little further, although less formally, into ‘the network is the platform’.

But is it still true? Perhaps not.

The rot set in some time ago. When, in the early 1990s, a bunch of waggish Sun employees created a tee-shirt (one of hundreds over the years) that proclaimed “the network is the network, the computer is the computer, Sun apologises for any confusion”, it seemed like a reasoned return to sanity. Today, however, that statement is profoundly inaccurate in both of its substantive clauses.

First, the computer is no longer what it once was. Some might try to argue that we are not yet in a ‘post-PC era’, but tablet sales beg to differ. Google’s Eric Schmidt, who recently admitted to being surprised by the tablet boom, said: “It looks to us like the majority of enterprise computing is being done on mobile devices, in particular on tablets. That broke the old model.” If the enterprise is going that way, then it is only keeping up with where the consumer has already gone.

Beyond the portable single pane of glass, there is also a surge in small, smart, connected devices. Sensors and intelligence in objects that do not have users attached to them (the internet of things) are generating significant interest, and some early success, in many areas, from smart metering to self-driving cars. This will continue to grow and, while not always strictly ‘mobile’, it is clear that a good deal of intelligence is being pushed to the network ‘edge’.

At the opposite end is the ‘core’. As IT has evolved towards a utility, service-provider-like model, the core has had to become dramatically more agile. Mainframes, server rooms and even blade hardware no longer scale quickly or flexibly enough to cope with the vagaries of demand and the constraints of budget.

The core has become more and more virtualised. Stage one was within the racks of hardware and operating systems; stage two was across the network and into the ‘cloud’. However, the universal ‘application service provision’ so evangelised during the dotcom boom has not turned out to be quite so simple. The concept is great, but the reality is occasionally flawed.

It turns out that some public cloud providers have periods of downtime, some might be compelled to pass on data to their governments and, as happened recently, some go to the wall (although it should be noted that on-premises IT can suffer in exactly the same ways). The principle of the cloud-like core is sound, but, like any IT project, its use should be properly architected and based on a well-thought-out strategy before reaching for the credit card and diving into implementation. Service provision requires a good understanding of both the qualities of ‘service’ and the lifecycle of ‘provisioning’.

Connecting core and edge should therefore be the place for the network, right?

No. There is no single network; there are many networks, with many layers of interconnectivity, capabilities and demands. Just like the core and the edge, there is multiplicity and variety: wired and wireless, public and private, fast and slow, and a host of different protocols.

Is it the landscape of a single solution or single vendor? No, but there are expectations of simplicity, uniformity and standardisation.

Connectivity from the edge to the core must be seamless. It shouldn’t matter how many, or what types of, networks are used or touched; none of that should be visible in the experience at the edge. Roaming, handover and re-authentication are services the networks must keep to themselves.
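
To make that concrete, here is a minimal Python sketch of such a connectivity layer. The SeamlessLink class, its Network records and the latency figures are entirely hypothetical, standing in for whatever roaming and handover machinery a real network stack would provide.

# Hypothetical sketch only: a connectivity layer that hides network
# selection, handover and re-authentication from the application.
from dataclasses import dataclass

@dataclass
class Network:
    name: str            # e.g. "office-wifi", "4g"
    available: bool
    latency_ms: float
    authenticated: bool = False

class SeamlessLink:
    """Presents one logical connection over many physical networks."""

    def __init__(self, networks):
        self.networks = networks
        self.active = None

    def _best(self):
        candidates = [n for n in self.networks if n.available]
        return min(candidates, key=lambda n: n.latency_ms) if candidates else None

    def send(self, payload: bytes) -> None:
        best = self._best()
        if best is None:
            raise ConnectionError("no network available")
        if best is not self.active:        # handover happens here, invisibly
            if not best.authenticated:
                best.authenticated = True  # placeholder for a real handshake
            self.active = best
        # ... transmit payload over self.active ...

link = SeamlessLink([Network("office-wifi", True, 12.0),
                     Network("4g", True, 45.0)])
link.send(b"hello")  # the caller never sees which network carried it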

Network performance should always be adequate for the task in hand. Given the huge numbers of users, their sudden changes in usage patterns and the variety of application demands, achieving this is no mean feat. Services need to be dynamically provisioned on demand, capacity monitored and traffic shaped to match resources.
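
As a toy illustration of provisioning on demand, the Python sketch below rebalances capacity against a stream of utilisation samples. The rebalance function, its thresholds and the Mbps figures are invented for the example and do not correspond to any real provisioning API.

# Hypothetical sketch: scale provisioned capacity up before saturation
# and reclaim it when demand falls away.
def rebalance(utilisation: float, provisioned_mbps: float,
              headroom: float = 0.2, step_mbps: float = 100.0) -> float:
    """Return new provisioned capacity given current utilisation (0..1)."""
    if utilisation > 1.0 - headroom:
        return provisioned_mbps + step_mbps   # demand close to the ceiling
    if utilisation < headroom and provisioned_mbps > step_mbps:
        return provisioned_mbps - step_mbps   # capacity sitting idle
    return provisioned_mbps                   # already matched to demand

capacity = 500.0
for sample in (0.55, 0.87, 0.91, 0.30):       # monitored utilisation samples
    capacity = rebalance(sample, capacity)
    print(f"utilisation {sample:.0%} -> provision {capacity:.0f} Mbps")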

Intelligence is vital. While much of the demand cannot be predicted, much of its impact can be. Data can be collected and assessed in real time, trends noted as soon as they start and action taken. Problems can be identified, isolated and worked around as they happen.
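
The sketch below illustrates one way that might look in Python, with a hypothetical latency feed: an exponential moving average tracks the trend, and any measurement far above it is flagged for action. The make_detector helper and its thresholds are assumptions for illustration only.

# Hypothetical sketch: flag a developing problem as soon as a measurement
# breaks well clear of the recent trend.
def make_detector(alpha: float = 0.2, tolerance: float = 1.5):
    state = {"ema": None}   # exponential moving average of the metric

    def observe(latency_ms: float) -> bool:
        """Feed one measurement; return True if it breaks the trend."""
        if state["ema"] is None:
            state["ema"] = latency_ms
            return False
        anomalous = latency_ms > state["ema"] * tolerance
        state["ema"] = alpha * latency_ms + (1 - alpha) * state["ema"]
        return anomalous

    return observe

observe = make_detector()
for latency in (20, 22, 21, 23, 60):          # the final spike is flagged
    if observe(latency):
        print(f"{latency} ms breaks the trend: isolate and work around")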

The fact that this does not smack of wires, light pulses and radio spectrum is not accidental. The network is no longer the network or the computer: it has become software. That brings opportunities, but also challenges in the areas of standardisation, transparency and openness. However, there is no going back; the future of the network will be software defined.
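
A final Python sketch hints at what ‘the network as software’ means in practice: forwarding behaviour held as plain data that a controller can rewrite on the fly, loosely in the spirit of OpenFlow-style flow tables. The rules, match fields and actions here are made up for illustration.

# Hypothetical sketch: the network's behaviour expressed as data
# (match -> action rules) that software can edit at any moment.
flow_table = [
    {"match": {"dst_port": 443}, "action": "forward:uplink1"},   # web traffic
    {"match": {"dst_port": 5060}, "action": "forward:voice"},    # prioritise VoIP
    {"match": {}, "action": "drop"},                             # default rule
]

def handle(packet: dict) -> str:
    """Apply the first rule whose match fields are all satisfied."""
    for rule in flow_table:
        if all(packet.get(k) == v for k, v in rule["match"].items()):
            return rule["action"]
    return "drop"

print(handle({"dst_port": 443}))    # forward:uplink1
# A controller changes the network by editing data, not rewiring hardware:
flow_table.insert(0, {"match": {"dst_port": 443}, "action": "forward:uplink2"})
print(handle({"dst_port": 443}))    # forward:uplink2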
