"This could be Heaven or this could be Hell"
Nicholas Carr represents an interesting threat to IT as a profession, a threat which I would call the obscurantist threat. Its essence is a weak understanding of the technological aspects of IT masked by a definite talent as a writer. Because of that talent, many people who know little or nothing about IT (and that, unfortunately, includes a large part of the media) take him seriously on the strength of the stylistic quality of his prose. When the media lends credibility to his ignorance of IT technology, he is able to spread naive or completely wrong ideas, like his hypothesis that "cloud computing is the next best thing since sliced bread". If a person has only a superficial understanding of the technology used in IT systems, or of the problems faced by the IT staff that operates datacenters, how can you expect informed recommendations and forecasts from him?
We tried to show that both his initial article and the subsequent books suffer from superficial knowledge and misrepresentation of IT history (cloud computing is a reincarnation, on a new technological level, of the old idea of central datacenters that was dominant during the mainframe era). His analogy with electrical transmission networks is also superficial and factually wrong. IT is more like the nervous system of the organization than the muscles, which is what electricity powers. The diverse group of technologies that falls under the umbrella term "cloud" has both strong and weak points, but as we tried to show it can never fully replace the existing ("smart") components of IT infrastructure. Another problem with Nicholas Carr is that he sometimes falsifies the evidence: he cherry-picks historical facts to fit his needs instead of trying to provide an objective picture (his usage of the electrical transmission network analogy is a good example of his approach to the selection of facts and analogies). To be more precise, Carr stretches analogies to fit his pre-cooked Utopian vision ("the server is sacred, the desktop is dead"). The central idea of the article "IT Doesn't Matter" is simply a fallacy, and it is not supported by the facts on the ground. This promised nirvana is even farther from us in 2011 than it was in 2003...
|In no way will events with cloud computing unfold in the direction of moving everything to remote, cloud-based servers run by external providers. What will succeed is a hybrid approach, with a large percentage of private clouds (aka remote datacenters) and with powerful desktops (usually in the form of laptops) and smartphones playing an increasingly important role...|
The proposed remedy of providers "in the cloud", external to the organization, is plain vanilla Utopia. There are severe technological limitations to the centralized provision of services even for simple protocols like email (the cloud approach does not work well for smartphones, which in the period since the publication of Carr's paper have become probably more important devices for reading mail than Web browsers). For services like media streaming there are important bandwidth problems, which are pretty evident to frequent users of YouTube: sometimes you just cannot watch the selected video because the server cannot push the data fast enough due to some network bottleneck. At the same time, with buffering, non-real-time content can be delivered over the Internet, as the success of Netflix (and Amazon's attempt to replicate it) has shown everybody very vividly. As they aptly said in Hotel California: "This could be Heaven or this could be Hell".
In other words, the success of "from the cloud" content delivery depends on the protocols you need to provide, the reliability you need to achieve, and the chances of explosive growth of demand (without the latter, the cloud's appeal instantly shrinks in half :-). What Carr failed to understand is that "in the cloud" providers can give a definite competitive advantage to organizations that experience rare but very high load spikes. That includes such an important category as retailers, the larger of which now have their own "private" cloud infrastructure. For each protocol, a detailed analysis of costs and benefits, and field trials, are required. This is not a case of one size fits all.
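The spiky-load trade-off can be made concrete with back-of-the-envelope arithmetic. All prices and load profiles below are illustrative assumptions, not real vendor quotes; the point is only the shape of the comparison: owned hardware must be sized and paid for peak load around the clock, while rented cloud capacity follows the load curve.

```python
# Back-of-the-envelope cost comparison: owning servers sized for the
# peak load vs. renting cloud capacity on demand.  All prices and
# load profiles are illustrative assumptions, not real vendor quotes.

HOURS_PER_MONTH = 730

def owned_cost(peak_servers, monthly_cost_per_server=300.0):
    """Owned hardware must be provisioned for the peak and paid 24/7."""
    return peak_servers * monthly_cost_per_server

def cloud_cost(avg_servers, peak_servers, peak_hours, hourly_rate=0.6):
    """Cloud capacity follows the load: baseline capacity for most of
    the month, peak capacity only during the spike hours."""
    baseline = avg_servers * (HOURS_PER_MONTH - peak_hours) * hourly_rate
    spike = peak_servers * peak_hours * hourly_rate
    return baseline + spike

# A retailer-like spiky profile: 10 servers on average, 100 at peak,
# spikes totaling 40 hours a month.  The cloud wins by a wide margin:
print(round(owned_cost(100)), round(cloud_cost(10, 100, 40)))

# A flat load (10 servers around the clock, no spikes) favors owning:
print(round(owned_cost(10)), round(cloud_cost(10, 10, 0)))
```

With these assumed prices, the spiky retailer pays several times less in the cloud, while the flat-load shop pays more; this is exactly why a per-protocol, per-workload analysis is needed rather than a blanket recommendation.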
As with mainframes, there is an important vendor lock-in problem here that was never discussed by Carr. This is a serious problem for which a solution still needs to be found. Right now an organization with a large deployment of a particular service (say, one that switched to Google for cloud-based applications and email) locks itself in and can't leave the chosen provider without incurring substantial additional costs. Such a conversion to a new provider is actually difficult without a highly qualified internal IT staff, so preserving part of the internal datacenter infrastructure is simply insurance against the worst-case scenario, in which the chosen provider either can't deliver or gradually becomes too costly. In other words, "leaving the door open" is a prerequisite to any large-scale adoption of "in the cloud" providers.
IT in general and the structure of datacenters are driven by technological changes. Right now those changes are concentrated in a half-dozen new technologies:
How those changes will affect the datacenter and IT in general we can only guess. But it is highly unlikely that the only movement will be in the direction of the "cloud-based external providers" that Carr predicts.
It is clear that virtualization and the ability to migrate a workload from one server to another in real time will play a more significant role in the future. But that does not mean that those technologies will be utilized only by external providers like Amazon. Amazon is not cheap, and reliability is not guaranteed, as recent outages convincingly demonstrated. A hybrid approach looks more promising, with a significant client-based component, because of the tremendous computing power of modern laptops, netbooks and other clients (it is the computing power of the desktop that actually dooms dumb clients). No one should forget that along with more powerful servers we get more and more powerful laptops, and providing the computational power of a dual-core 3GHz CPU with, say, 6GB of memory for each user is impossible even for really rich and ambitious companies like Google. This presupposes a long search for the optimal, economically justified equilibrium between server and desktop components for each and every important application. At the end of the day the codebase will be refactored into two layers (centralized vs. local):
Economic factors make the optimal location and structure of "the cloud" open to review. For example, private, corporate-based clouds (remote datacenters with a high dose of virtualization) are pretty attractive economically and are superior for many organizational needs. I doubt that corporations need any preacher extolling the value of remote datacenters :-). Also open to review is the spectrum of applications which can demonstrate a better cost of ownership under the pure "in the cloud" model with services supplied by external providers. Compatibility with other applications, and the security and reliability of those services, still need to be proven. Yes, it is undeniable that several classes of applications have emerged as suitable for a 100% "on the server", SaaS approach. Among them we can mention email, CRM, supply chain management, corporate benefits, travel, expense reporting, many other HR-related applications, and corporate portals.
On the opposite side there are several important corporate applications that proved to be much less suitable: for example, office applications (especially spreadsheets), ERP (SAP R/3, etc.), graphics processing programs, multimedia streaming, backups, and most CPU-intensive applications. I would like to stress again that it is very expensive and rather stupid to provide remotely the power equivalent to a dual-core 3.4GHz CPU and half a dozen gigabytes of 1.33GHz or faster memory.
At best Carr managed to ask several interesting questions, but provided inferior, simplistic and by-and-large completely misleading answers. Unless you can utilize them as a catalyst for your own analysis, Carr's inability to grasp fundamental IT concepts due to his lack of education devalues, sometimes completely, his writings. The initial HBR paper was intentionally controversial, extremely weak on facts but rich in fuzzy (and, as in the case of the electrical power network, faulty) analogies. The latter makes Carr a prominent representative of IT obscurantism. The subsequent books added almost nothing to the story, but continued and actually enhanced Carr's obscurantist tendencies. Some statements really have the flavor of yellow-press publications.
|At best Carr managed to ask interesting questions, but provided inferior, misleading answers. The initial HBR paper was intentionally controversial, extremely weak on facts but rich in fuzzy analogies (obscurantism) and colored by a lack of special education. That lack of special education proved to be a big problem for this talented writer, and it devalued, sometimes completely, his writings.|
This is a typical non-professional approach to a complex problem, and the attractiveness of Carr's book and articles to readers is generally inversely proportional to a given category of readers' understanding of this complex technological area, and directly proportional to the attractiveness of outsourcing to the corporate brass (which is often too greedy for its own good ;-). Instead of writing new books, Carr should probably try to explain why, in almost ten years (his first article was published in 2003), his vision has not materialized. A decade is a very long time for such a fast-developing technology as IT. Yet "in the cloud" computing has not yet become a mainstream proposition, although it did experience growth. But that growth was not as dramatic as the explosion of smartphones (iPhone, Blackberry, etc.), netbooks, ebook readers like the Kindle, tablets like the iPad, and other "smart" devices that negate the trend Carr advocated. Datacenters also continued to grow, including their most traditional form: local datacenters. I would like to see Carr answer a simple question: "Why have powerful wireless devices like the iPad, the Amazon Kindle, smartphones and ultra-portable laptops (netbooks) become so popular?"
The most important recent tendency in modern datacenters has been virtualization. While connected with the concept of cloud computing, it is a distinct technology that is currently used predominantly in local datacenters, or in a mixture of local and remote datacenters (the latter can be called a private cloud for the sake of clarity ;-). Dynamic migration of applications from one server to another is another growing technology; it became feasible with 10GBit local networks and is also increasing in popularity.
Utilization of those technologies by external SaaS providers is linked to several currently unresolved problems. Due to those problems, the deals proposed by "in the cloud" providers proved to be not as exciting as they were hyped up to be. For example, the Amazon elastic cloud became more popular, but Amazon now sells almost as many electronic books as "dead tree" books. In comparison with the situation five years ago, there is now an additional factor: the availability of more or less mature virtualization solutions and much more powerful hardware. But this factor can be played both for and against "utility computing": businesses are more than happy to use virtualization technology within the limits of their private datacenters and get the same (or better, as they do not need to feed a middleman) flexibility and efficiency gains. Carr fails to do the job of identifying criteria and the most promising areas where IT can be profitably commoditized and moved to a utility service provider. Such areas definitely exist, along with areas where outsourcing to an "in the cloud" provider can be suicidal for the company.
|Due to his naive enthusiasm for SaaS, Carr fails to do the most important job: identifying the parts of the common IT infrastructure which can be profitably commoditized and moved to an "in the cloud" utility service provider, and the parts which are not suitable and should be run locally. What is more important, the future is unfolding not as the dominance of pure "in the cloud" service providers, but as a combination of local software and remote services which are carefully balanced and optimized for each task. Attempts to convert current laptops, smartphones, etc. into dumb terminals, as Carr envisions, are naive and doomed to failure...|
The irony of Carr's position as a technology forecaster is that in the five years since the publication of his HBR article local datacenters have actually flourished and have shown no signs of impending demise. Also, his "in the cloud" euphoria does not take into account the mainframe-style social problems that are typical of outsourced IT services, as well as several technical problems (with cost efficiency and bandwidth cost probably the two most important).
The author uses an old, tried and true "snake oil salesman" formula: a shocking title; bashing the existing situation with circumstantial evidence, red herrings, and logical leaps; and then proposing some promising but immature technology as the absolute, the only right solution. Every other solution be damned. The key idea is to bank on the fact that his audience is uninformed and gullible. The author's writings scare me not because of their content (especially given the absurdity of his main thesis) but because of all the praise and attention he got. It is the latter that makes me worry. Authors of miracle diets and "get rich quick" books should beware of the competition, as well as of a potential new field into which they can profitably extend their business ;-).
Both the initial Carr HBR article and the two subsequent books provide a simplistic and flawed hypothesis of where IT is heading. All three major ideas that Carr put forward in his HBR article (and he never added anything substantial to them later, in the two subsequent books) are demonstrably wrong:
There are two larger issues here:
In a sense, Carr's recommendation is similar to throwing passengers out of a sinking boat. IT in this respect is similar to the human nervous system. I strongly doubt that Bear Stearns or Enron went down because of overspending on IT systems.
The entire discussion about IT and its impact on competitive advantage boils down to the availability of smart, highly trained professionals who are capable of broad strategic thinking and able to connect seemingly disparate ideas, systems and protocols. Without them there can be no competitive advantage. But you cannot employ just smart people. So some kind of IT organization of which they are a part is necessary, although it might be quite different from the current structure.
As for the benefits of dismantling the IT organization, such a move, unless done in a very limited and controlled fashion, entails serious dangers, as exemplified by the outsourcing experience of many organizations. The cure might be more dangerous than the disease: an important side effect of totally dismantling the IT organization is that it instantly makes a company a donor in the hands of ruthless external suppliers and contractors. Consultants (especially large consulting firms) can help, but they can also become part of the problem due to the issue of loyalty. We all know what happened to medicine when doctors were allowed to be bribed by pharmaceutical companies. That situation, aptly called "Viva Viagra", in which useless or outright dangerous drugs like Vioxx were allowed to become blockbusters, was fully replicated in IT: the independence of IT consultants is just a myth (and moreover, some commercial IDS/IPS and EMS systems in their destructive potential are not that different from Vioxx ;-).
The key problem is that outsourcing everything to a service provider just moves the underlying problem of dilbertalization and excessive bureaucratization of IT to the service provider level. Moving services to remote "in the cloud" service providers (or even local outsourcers) does not solve this problem, as the latter are also not immune from the disease. It might be even more pronounced in such an environment.
Cloud computing increases the total complexity of the system. It is a well-known fact that simple systems have single points of failure that are relatively easy to diagnose and fix. Complex systems have multiple points of failure that interact in unpredictable and often undetectable ways, and they are by definition very difficult to diagnose and fix. Switching to cloud computing means switching to more complex systems and as such is a mixed blessing. The "maze" of cloud-based software is more complex, as well as more vulnerable to unpredictable downtime and new security risks, and generally makes vital corporate services, including financial, communications, and transportation software, less, not more, reliable.
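The reliability point above follows from elementary probability: a request that must traverse several components in series fails if any one of them fails, so every hop "eats nines", while redundant replicas add them back. A minimal sketch, where the 99.9% per-component availability figure is an illustrative assumption rather than any provider's measured number:

```python
# Toy availability arithmetic for chained vs. redundant components.
# The per-component availability p = 0.999 is an illustrative
# assumption, not a measured figure for any real provider.

def chain_availability(p, n):
    """n independent components in series: all must be up."""
    return p ** n

def redundant_availability(p, k):
    """k independent replicas: the service is up if any one is up."""
    return 1.0 - (1.0 - p) ** k

p = 0.999
# A self-contained local system vs. a four-hop cloud path
# (WAN link, load balancer, application, storage) in series:
print(chain_availability(p, 1))   # 0.999
print(chain_availability(p, 4))   # roughly 0.996: every hop eats nines
# Redundancy works the other way; two replicas push availability up:
print(redundant_availability(p, 2))
```

The same arithmetic cuts both ways: a well-engineered provider can buy back reliability with redundancy, but the longer chain of components between the user and the data is a real cost of the cloud model, not a rhetorical one.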
While WAN connectivity has dramatically improved due to the ubiquity of fiber lines, and the Internet has become more diverse and powerful in recent years, in no way does that suggest that a dramatic switch to "utility computing" is under way, or that the best and/or most economical way to implement now-standard (mature) levels of IT technology (services, as Carr calls them) is to implement them "in the cloud" (bandwidth communism). Carr's predictions also look far from plausible from several other standpoints. Utility computing is a viable trend, but it is not absolute and has severe limitations; as such, it was far from being the dominant and defining trend during the eight years that have passed since 2003. Nothing suggests a dramatic acceleration of this trend in the future.
While "in the cloud" computing has its merits, the artificial link of its success to the dismissal of local datacenters is just another of Carr's fallacies: they are complementary, not antagonistic, technologies, and each has its strong and weak points. The hypothesis that businesses will move mission-critical applications to "the cloud" en masse is far from plausible. One of the problems at the moment is economics. "On demand" computing via the public Internet is only viable for applications that require low bandwidth. That means, for example, that remote backup solutions are mostly limited to private data. The idea of transmitting, say, a terabyte via the public Internet (and that's the typical size of a medium company's daily backup) is simply impractical; even private links are problematic here. Although telecom prices for bandwidth have fallen and bandwidth has increased, the size of local hard drives and the processing power of a typical server and laptop have increased much more rapidly. The amount of graphical information stored has also increased dramatically. That means there is no easy way to process this graphical information in the cloud, unless it is stored at the provider, which then performs the whole processing. The latter creates the problem of feeding it back to user laptops. Due to those constraints it looks like "in the cloud" service providers face serious headwinds. Competing technologies include, but are not limited to, application streaming, virtual appliances and the standardization of datacenter infrastructure (right now represented by "cloud in a box").
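The terabyte-over-the-Internet point is easy to check with pure bandwidth arithmetic. The link speeds below are illustrative assumptions, and the calculation ignores protocol overhead and contention, so real transfers take even longer:

```python
# Naive transfer-time arithmetic for pushing a backup over a WAN link.
# Link speeds are illustrative assumptions; protocol overhead and
# contention are ignored, so real transfers take longer than this.

def transfer_hours(size_terabytes, link_mbps):
    """Hours needed to move size_terabytes over a link_mbps link."""
    bits = size_terabytes * 1e12 * 8          # decimal TB -> bits
    seconds = bits / (link_mbps * 1e6)        # Mbps -> bits/second
    return seconds / 3600.0

# A 1 TB daily backup barely fits into a day on a 100 Mbps link:
for mbps in (100, 1000, 10000):
    print(mbps, "Mbps:", round(transfer_hours(1, mbps), 1), "hours")
```

At 100 Mbps a terabyte takes roughly 22 hours, which is exactly why daily offsite backup over shared public links does not scale and dedicated (expensive) links become unavoidable.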
There are clear analogies between the old mainframe-based datacenter and the new service provider approach. The legendary user hostility toward "glass datacenters" in general and IBM in particular during the mainframe era was a symptom of underlying problems, and something similar might happen to the attractiveness of remote service providers. Any WAN-based service provider faces the unpredictability of public bandwidth, and the cost of solving this problem depends on the type of service. That means some services, like email and Web hosting, are inherently more suitable for outsourcing to providers in the cloud than others (application streaming, backup, SAP-style ERP applications, etc.). For high-bandwidth services like backup there is no free lunch -- you need to pay for private links (and that means paying an arm and a leg ;-) in order to get respectable reliability. Ask any international company what part of its total IT cost is related to the cost of transatlantic and other long-distance links between remote datacenters on various continents. This is a pretty sobering exercise.
Carr's recommendations fare even worse than his key ideas, which is typical of any Utopia: they are dangerously naive. Companies which follow them can definitely be hurt. It might be that Carr's article negatively affected economic growth by giving CEOs a justification for withholding necessary IT investment. I would like to remind the reader of Carr's proposals, which we discussed before in more detail:
- Spend less. That's good, generic, risk-free advice applicable to any large corporate department and any situation. Essentially, this is another way of saying that a penny saved is a penny earned. The problem with it is that "stupidity is punishable" and "a scrooge usually pays twice". You need to understand where to cut spending and where to spend more; the elimination of local IT that Carr advocates prevents you from seeing this, and as such is a stupid solution to the problem, akin to throwing the baby out with the bathwater. Weakening IT is akin to weakening the nervous system, and it can translate into a more general trend that leads to a "me-too" company with average products, average management, and a below-average future.
- Follow, don't lead. This is very questionable advice, as it is IT IQ that provides strategic value to the business. A large company might save a few million by using copycat technologies, but each business is individual enough that this strategy can backfire and in the end lead to losses, not gains, as the adopted product can interfere with established business processes and decision making.
- Focus on vulnerabilities, not opportunities. This is really very dangerous advice, as Carr completely fails to understand the value of sound architecture in security and disaster recovery. Right now too many companies are focused on issues of vulnerabilities and compliance, but it is strategic and operational mistakes that destroy more shareholder value. As one industry study [New Report Reveals Causes for Shareholder Value Destruction] states:
"Only 13% of the decrease in shareholder value in these companies resulted from compliance failures. Sixty percent of the value destruction was attributable to strategic mistakes, such as misjudging customer demand or competitive pressure, or management ineffectiveness.
An additional 27% was due to operational blunders, such as cost overruns or poorly managed integration during mergers and acquisitions."
Contrary to Carr's diagnosis, the key problem with current IT is not excessive costs or underutilization of equipment (another of Carr's fallacies): it is the disconnect of IT from the interests of the organization due to a strangulating and corrupting bureaucracy -- the dilbertalization of IT. It might be a sign of maturity in the best "utilities" style, but it is a perverted sign, reminiscent of Brezhnev-style "mature socialism".
What is implicit in all this "IT does not matter" noise is a complete absence of understanding of the value of architecture and IT talent in modern organizations, on both the software level and the infrastructure level. While businesses must articulate strategy and align IT with that strategy, the results depend on the talent of the system integrators, and like cooks they are not created equal. That means organizations need to preserve talented IT personnel, including but not limited to programmers, architects, IT-literate business leaders, and business-savvy, talented and loyal IT managers.
The truth is, IT departments are more like a nervous system. They are not the brains that drive the business forward, and they should not be. Much like the nervous system, IT connects the different parts of a business organization together in a way that makes it possible for them, like players in a good orchestra, to function in unison. That fact also limits the value of external consultants. Consultants don't know your business -- they know the toolset, and they can excel at hand-holding but not much more. Business processes can be well supported only by a flexible, evolving architecture; the latter (often discussed under the label of adaptability) is probably the greatest challenge in IT. Processes which are cast in stone (as in some popular CRM and ERP applications; also typical of "in the cloud" providers) can be more of a liability than an asset.
Despite its gross oversimplifications, Carr's paper stimulates thinking, and several interesting points can be raised after reading it (the books are just an extended version of the paper's argumentation; they do not bring anything new):
FAIR USE NOTICE This site contains copyrighted material the use of which has not always been specifically authorized by the copyright owner. We are making such material available in our efforts to advance understanding of environmental, political, human rights, economic, democracy, scientific, and social justice issues, etc. We believe this constitutes a 'fair use' of any such copyrighted material as provided for in section 107 of the US Copyright Law. In accordance with Title 17 U.S.C. Section 107, the material on this site is distributed without profit exclusivly for research and educational purposes. If you wish to use copyrighted material from this site for purposes of your own that go beyond 'fair use', you must obtain permission from the copyright owner.
Copyright © 1996-2016 by Dr. Nikolai Bezroukov. www.softpanorama.org was created as a service to the UN Sustainable Development Networking Programme (SDNP) in the author free time. This document is an industrial compilation designed and created exclusively for educational use and is distributed under the Softpanorama Content License.
Original materials copyright belong to respective owners. Quotes are made for educational purposes only in compliance with the fair use doctrine.
The statements, views and opinions presented on this web page are those of the author (or referenced source) and are not endorsed by, nor do they necessarily reflect, the opinions of the author present and former employers, SDNP or any other organization the author may be associated with. We do not warrant the correctness of the information provided or its fitness for any purpose.
Last modified: September 12, 2017