
Anti-historicism


I think it is important to place Carr's writings in historical context. He is neither the first nor the last IT utopist who twisted IT history to support absurd predictions. From a historical perspective his key statement that "IT's evolution has closely mirrored that of earlier infrastructural technologies" is profoundly wrong. While there are certain similarities, the development of IT in no way mirrors the development of technologies such as railways or electrical networks: the differences are more profound than the similarities. And the key reason for this is that IT is the most complex artifact ever created by mankind.

Contrary to Carr's statements, IT has already experienced several dramatic transformations, each of which dramatically changed the nature of IT at the datacenter level and (for the last couple of them) the nature of client computing as well. We will discuss several aspects of Carr's frivolous treatment of IT history:

Questionable, Superficial Analogy with Electricity Generation

An important part of Carr's obscurantism is his use of analogies that are questionable from the historical standpoint, especially the analogy with the evolution of electrical generation plants and transmission networks. The coverage he provides is exactly the same as in the case of IT: extremely superficial, lacking understanding of important historical factors, and showing very limited understanding of the underlying technology.

Carr understands the world of electrical utilities and power transmission lines, his favorite analogy, even less than he understands IT.

Carr's treatment of electricity generation is too simplistic. In reality, along with the trend of building larger and larger stand-alone generation plants (for the reasons explained below), there was an opposite trend of putting generation plants close to the point of consumption. Such local generation of electrical power has many forms, the most important of which are cogeneration (the use of heat wasted during electrical generation to heat buildings, with Manhattan and Eastern European cities as prominent examples), hydro generation, and wind power generation. There are also experiments with solar energy generation.

The most commercially important form of local electricity production is cogeneration (CHP). In recent years, the use of CHP in commercial buildings and multi-residential complexes has increased steadily. Examples of commercial and institutional CHP users include hotels, offices, and hospitals, which tend to have significant energy costs as a percentage of total operating costs, as well as balanced and constant electric and thermal loads. Buildings account for 12% of direct fossil fuel consumption in the USA, 36% of the electricity generated, and 25% of the nation's fuel bill. The main advantage of cogeneration is the ability to productively utilize the tremendous amount of heat (over 50% of the total produced by burning fuel) generated during electricity production for heating and air conditioning of buildings. Here is the basic description from Wikipedia:

Cogeneration (also combined heat and power, CHP) is the use of a heat engine or a power station to simultaneously generate both electricity and useful heat.

Conventional power plants emit the heat created as a byproduct of electricity generation into the environment through cooling towers, flue gas, or by other means. CHP or a bottoming cycle captures the byproduct heat for domestic or industrial heating purposes, either very close to the plant, or —especially in Scandinavia and eastern Europe—for distribution through pipes to heat local housing.

In the United States, Con Edison produces 30 billion pounds of steam each year through its seven cogeneration plants, which boil water to 1,000°F/538°C before pumping it to 100,000 buildings in Manhattan—the biggest commercial steam system in the world.[1][2]

Byproduct heat at moderate temperatures (212-356°F/100-180°C) can also be used in absorption chillers for cooling. A plant producing electricity, heat and cold is sometimes called a trigeneration or, more generally, a polygeneration plant.

Cogeneration is a thermodynamically efficient use of fuel. In separate production of electricity some energy must be rejected as waste heat, but in cogeneration this thermal energy is put to good use.

For example, Shanghai Pudong International Airport operates a CHP plant which generates combined electricity, heating and cooling for the airport's terminals at peak demand times. It is fuelled by natural gas (The International Energy Agency). System details:

Power generator: Gas turbine generators, 4,600 kW
Steam generator: Heat recovery steam generator, producing 11 tonnes/hour at 8 bar, 185 °C
Fuels used: Natural gas
Type of chiller: Absorption chiller

Environmental Performance

CHP electrical efficiency: 29%
CHP total efficiency: 74%
NOx emissions: 5-25 ppm

Economic Performance

Total project costs: USD 1,760,000
CHP plant operating costs: USD 705,000/yr
Costs of separate generation of steam and grid electricity: USD 1,220,000/yr
Annual savings: USD 515,000/yr
Payback period: 3.4 yr

The overall efficiency of the Pudong Airport CHP system is significantly higher than that of network electricity and on-site heat generation. It therefore contributes to both cost and CO2 emissions reductions. The NOx pollution from the system is also estimated to be less than that from coal-fired electricity generation. The system operates 16 hours per day to offset the peak energy demand of the airport, which improves local reliability and reduces overall energy costs. The energy use of the airport is substantial, with electricity demand around 28 MW and heat demand between 20 and 65 tonnes/hour. The CHP system meets 20% to 30% of the airport's electricity demand and 15% to 50% of its heat demand, depending on the season.
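The quoted figures are internally consistent, as a quick back-of-the-envelope check shows (all numbers below are taken from the IEA case data above):

  # Annual savings and payback implied by the Pudong CHP figures
  separate_costs = 1_220_000   # USD/yr, separate steam + grid electricity
  chp_costs = 705_000          # USD/yr, CHP plant operating costs
  project_cost = 1_760_000     # USD, total project costs
  savings = separate_costs - chp_costs
  print(savings)                           # 515000 USD/yr, as quoted
  print(round(project_cost / savings, 1))  # 3.4 yr payback, as quoted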

Another example is the Network Appliance Inc. data center (Computerworld):

On hot summer days in Sunnyvale, Calif., when power demand soars and peak rates hit their highest levels, Network Appliance Inc.'s 1-megawatt data center drops off the grid. The company's natural-gas-powered cogeneration system delivers all of the power it needs -- and saves it about $300,000 a year in energy costs -- while also providing a source of "free cooling" for the data center.

NetApp's cogeneration system is "reducing energy expense by generating power and cooling when electricity prices are high and gas rates are low," says David Robbins, NetApp's vice president of global infrastructure.

The technology, also known as combined heat and power (CHP), combines a generator with a specialized chiller that turns the exhausted waste heat into a source of chilled water. Technically any power source can be used for CHP, but "natural gas is the most commonly used fuel source for cogeneration in small commercial applications of CHP," such as 5MW, 10MW or 20MW plants serving a single building or campus, says William Kosik, managing principal at EYP Mission Critical Facilities Inc., a New York-based engineering firm that has consulted with NetApp.
 

It might well be that rising costs of fossil fuels will require computers to become less powerful and more energy efficient, which means that Carr's attempt to bury the modern laptop is a prediction that runs contrary to the peak-oil problems humanity might now face. From this point of view, solar-powered, autonomous, low-energy-consumption devices might be vital for the future: huge "in the cloud" providers might simply go bankrupt due to the run-up in electricity costs.

Many consider Micro Combined Heat and Power (MicroCHP) products to be the future of the residential HVAC industry (894545):

In 1997, 10.7 Quads of primary energy were used to provide the 3.5 Quads of site electricity used by U.S. residences. The remaining primary energy, 7.2 Quads, was dissipated as waste heat. This waste heat corresponds almost exactly to the residences' heat requirements (7.1 Quads). The huge losses of the electricity system (relative to the losses from space and water heating) mean that while electricity accounts for only 35% of "site" energy, it accounts for 61% of primary energy use and the same or more pollution. If the electricity used in residential buildings were generated by microCHPs and all the waste heat could be used, residential energy use could be reduced significantly.
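These percentages can be verified with simple arithmetic. A minimal sketch; the assumption that non-electric site energy is counted at roughly its primary-energy value is mine, not the source's:

  # Verify the quoted shares: electricity is 35% of site energy
  # but roughly 61% of primary energy
  primary_for_electricity = 10.7       # Quads
  site_electricity = 3.5               # Quads
  print(primary_for_electricity - site_electricity)  # 7.2 Quads of waste heat, as quoted
  site_total = site_electricity / 0.35   # electricity = 35% of site energy
  other_site = site_total - site_electricity
  primary_total = primary_for_electricity + other_site
  print(primary_for_electricity / primary_total)      # ~0.62, close to the quoted 61%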

MicroCHP products slightly improve the efficiency of the residential heating appliance, but, more importantly, they result in a substantial reduction of the environmental impact associated with meeting building electric power needs. The United States Combined Heat and Power Association (USCHPA) estimated that the current generation of MicroCHP can provide both heat and electricity in a cost-effective and environmentally friendly manner for all North-Eastern states. For a building with MicroCHP, during the cold months of the year when the building requires heating, electricity generation is just a byproduct of heating: essentially free. In a sense it is like printing money during the winter months.

MicroCHP appliances use part of the heat generated by burning natural gas to produce electricity, while the other part is used for heating the home. It is important to understand that this "opportunistic generation" (only when you need to heat the home) can provide almost zero-cost electricity, as the amount of heat consumed by the electrical generator is very small. Here is an apt quote about this technology from the Christian Science Monitor:

Down in Bernard Malin's basement is a softly thrumming metal box that turns natural gas into hot water and generates $600 to $800 worth of electricity a year - a bonus byproduct of heating his home.

"It's like printing money," says Mr. Malin, the first person in Massachusetts - perhaps in the nation - to own a residential "micro combined-heat-and-power" system, also known as micro-CHP.

But he's not likely to be the last.

Since Malin changed his home heating system to micro-CHP in February, 18 other families in the Boston area also have adopted the technology, which squeezes about 90 percent of the useful energy from the fuel. That's triple the efficiency of power delivered over the grid.

Factories and other industrial facilities have used large CHP systems for years. But until the US debut of micro-systems in greater Boston, the units had not been small enough, cheap enough, and quiet enough for American homes. Add to that the public's rising concern about electric-power reliability - seen in a sales boom of backup generators in the past couple of years - and some experts see in micro-CHP a power-to-the-people energy revolution.

"Right now these residential micro-CHP systems are just a blip," says Nicholas Lenssen of Energy Insights, a technology advisory firm in Framingham, Mass. "But it's a ... technology that ... could have a big impact as it's adopted more widely over the next five to 10 years."

The micro-CHP markets in Europe and Japan are expanding fast, with thousands of units being sold annually. Even three years ago, in November 2006, before the major energy price rises, micro-CHP had a significant installed base:

In Japan, more than 30,000 homeowners have installed micro-CHP systems driven by quiet, efficient internal-combustion engines, each housed in a sleek metal box made by Honda. Japan is ahead because gas utilities have been subsidizing and promoting the systems. In Britain, where the systems look like dishwashers and sit under kitchen counters, 80,000 systems made by a New Zealand company are on order.

At least five companies are building micro-CHP systems worldwide. Two are trying to enter the US market: Marathon Engine Systems of East Troy, Wis., plans to bring a 4-kilowatt hot-water system it sells in Europe to the US early in 2007. Climate Energy of Medfield, Mass., has developed a forced-hot-air system that marries a high efficiency furnace to a super quiet Honda generator.

In the United States, meanwhile, the residential CHP market remains in its infancy. Still, market analysts expect to see micro-CHP systems gaining ground in the United States over the next five years, especially in the northeastern part of the country, which manufacturers are targeting due to that area's cold climate and more favorable regulatory policies. One major United States natural gas distribution company, KeySpan, is offering a $2600 rebate for customers who purchase a microCHP from Climate Energy of Massachusetts and thereby help to conserve energy consumed by heating homes. The Climate Energy system cuts carbon-dioxide emissions for electricity used in the home by 40 percent, company officials say. Climate Energy utilizes ECR furnaces and 1.2 kWe Honda generators in its micro-CHP systems. The units are 85-93% efficient at converting into usable heat and electricity the fuel inputs that would otherwise have fed a conventional heating furnace alone. Micro-CHP systems that provide a backup power generation capability may hold the most promise.
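A rough sanity check of Mr. Malin's "$600 to $800 worth of electricity a year" quoted earlier is possible using the 1.2 kWe generator figure. The run hours and the retail electricity price below are my assumptions, not figures from the sources:

  # Back-of-envelope annual value of micro-CHP electricity
  generator_kw = 1.2     # kWe Honda generator, per Climate Energy
  heating_hours = 3500   # assumed run hours per heating season
  usd_per_kwh = 0.17     # assumed retail electricity price
  print(generator_kw * heating_hours * usd_per_kwh)  # ~714 USD/yr, inside the quoted range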

Section 923 of the Energy Policy Act of 2005 ("EPAct") advances Micro-CHP development and deployment at the residential level in NJ. Section 921 also authorizes $40 million to advance this activity in 2007 and 2008 (051023rb).

Now let's return to his superficial and incompetent treatment of electrical grid history and problems. First of all, electrical generation plants are in reality huge chemical plants (burning fuel is a chemical reaction, with toxic fumes in the case of coal), and the history of electricity generation was shaped by the desire to optimize this reaction and make it more efficient for the production of electricity. So monstrous electrical generation plants are mainly a side effect of attempts to solve the intricate problem of efficiently burning fuel for electricity generation. The ability to use alternating current and low-loss high-voltage power lines for the transmission of electrical energy is another dominating factor. There was also some political element in building national electrical grids in some countries, as they implicitly increased the power of the central government. It is funny, but the Bolsheviks in Russia in the 1920s used to have the slogan: "Communism is Soviet power plus the electrification of the whole country."

The cost of electricity produced depends on location (his favorite example, Google, can really benefit from a datacenter in the Rockies: in 2007, when natural gas was a nickel in the Rockies, New York City prices were $6.50 per million Btu). The efficiency of power generating plants depends on the size of the reactor, and that was the key factor driving the concentration of electricity production in a few large generating plants. In addition, electrical generation plants running on coal, and to a lesser extent on natural gas, have a significant environmental impact and until recently were simply too dangerous to be placed in big cities (acid rain, radioactive contamination, etc.). Emissions from coal-fired power plants already account for about 27% of American greenhouse emissions. But the main problem is the amount of really toxic substances emitted, and that's why large power generating plants are usually located outside cities: controlling emissions from a few hundred power plants is easier than controlling them from a few thousand. Those are the factors that directly or indirectly stimulated the concentration of the electricity generating industry. None of them is connected with the fairy tales which Carr discusses in his books.

Ironically, right now the tables are turning in favor of local electricity production using wind or solar energy (desert-based generating stations, roof panels, etc.). In a way, oil- or gas-burning furnaces in private homes are a luxury. With modern technology, heating can be done more economically by adding an electrical generator which uses part of the energy released from burning the fuel to generate electricity, and that might happen with higher costs of oil and natural gas (vibration is a problem that needs to be solved). It is best to supply heat to several houses rather than one, but this runs into problems with private home ownership that can be resolved only by higher fuel prices. If prices of natural gas and heating oil triple from the current elevated levels, such a solution might become more feasible.

What Carr does not understand is that it was precisely such factors that were the key drivers for building large electrical plants.

But concentration comes at a cost: maintaining a network of high-voltage transmission lines increases the cost of electricity. There is also some negative ecological impact (for example, the number of birds killed), the amount of land the lines occupy, and their influence on the health of people living close to high-voltage lines. By contrast, there is much less intrinsic need to redistribute computing power between different time zones. That's another important difference between the world of electricity generation and the world of IT.

Also, unlike an electricity stream, a data stream is much more complex, which means that other factors come into play in any attempt to standardize data streams at levels above the TCP/IP protocols. Interoperability of data stream formats is a huge problem.

And of course it never occurs to Carr that, with rising costs of fossil fuels and the ecological effects of coal mining, any huge electricity consumer, including his beloved Google, can really benefit from proximity to a large power generation plant operating in a zone where fuel is cheap and abundant. That even includes the feasibility of integrating a Google datacenter with the generating plant :-). Right now the cost of electricity is relatively low, but if retail electricity costs double due to higher fossil fuel costs, the game might change. That might mean that a return to local electricity generation, but on a new technological level with wind and solar power as important players, could become a viable opportunity to cut electricity costs, especially if cars gradually switch to electrical engines.

Another important economic justification of a universal transmission grid in a large country with multiple time zones is that it provides some level of utilization for generation plants that are almost idle at night, so that coal or gas is not burned uselessly (it is extremely expensive to shut down generators during off-peak periods). Such plants can supply electricity to adjacent time zones. Because of the economics of load balancing, transmission grids now span countries and even large portions of continents. But, truth be told, a significant share of energy is lost in transmission: transmission and distribution losses were estimated at 7.2% in the USA in 1995 and 7.4% in the UK in 1998.
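Those loss figures translate directly into extra generation. A minimal sketch using the 1995 US number:

  # With 7.2% transmission and distribution losses, delivering one
  # unit of energy requires generating noticeably more than one
  loss = 0.072
  generated_per_delivered = 1 / (1 - loss)
  print(generated_per_delivered)   # ~1.078: about 7.8% extra generation per unit delivered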

While electrical plants are reasonably complex, the key process at the plant is relatively simple and has been known to mankind for more than a hundred years: the conversion of heat first into mechanical energy, and then of mechanical energy into electricity. Both of those processes benefit greatly from economies of scale.

At the same time, economies of scale do not transfer directly to IT, because the complexity of IT systems has no precedent in human history. That's why mainframes were decimated by their smaller rivals (minicomputers and microcomputers (PCs)) as if they were dinosaurs. That means that analogies with railways and the electrical grid are deeply and irrevocably flawed. They do not capture the key characteristics of IT technology: its unsurpassed complexity and Lego-type flexibility. IT became the real nervous system of modern organizations, not the muscle system or the legs :-). I would like to stress that IT has a complexity exceeding anything else created by mankind, and an unmatched flexibility which alone ensures that the race to new heights (and new dimensions of competitive advantage for the enterprise) is endless and basically unpredictable.

Here is a very apt description of the fallacy of the electrical analogy from the review of "The Big Switch" written by Charles Fitzgerald:

The Fallacy of the Perfect Analogy

My second critique is that the book turns on the idea that computing is basically similar enough to electricity that it will inexorably follow the same path.  While there are similarities, it is a mistake to assume they are alike in every aspect.  There are enough differences that blind adherence to an analogy is dangerous:

So while the book gets the broad trend to more computing in the cloud right, Carr's extended analogy obscures a lot of the differences and subtleties that will make or break cloud computing endeavors.  Between the caveats and the broad definitions, there is a lot of leeway in his technical vision (admittedly the mark of a savvy forecaster).  Victory will go to those who best exploit both the cloud and the edge of the network.  Carr's own examples -- Napster, Second Life and the CERN Grid -- make this case, even if he either misses their distributed nature or chooses to ignore it.

One Potentially More Suitable Analogy

Recently I came across a discussion that might represent a more suitable analogy, in the Charles Hugh Smith weblog entry "Are People Smarter than Media Pundits? Yes; Something Is Deeply Amiss". This discussion also suggests that "Carrism" is a more widespread phenomenon of incompetent, biased journalism, transcending Carr's treatment of IT:

The analysis of the typical "nice guy looking at stuff with a journalist's eye" is often misleadingly superficial. I vividly recall a long piece either Fallows or Easterbrook wrote in the early 80s lambasting the U.S. Air Force for not buying the cheap trainer-fighter, the F-5, instead of the larger, far more costly F-15.

This analysis, published in a mainstream respected magazine (The Atlantic), made it appear to be completely nonsensical to buy the F-15 instead of the cheerfully cut-rate F-5, as if fighter-bombers were more or less like commute cars and a Chevy Nova was just as good as a Cadillac, so why spend more?

How about the ability to fly in poor weather? Oops. Part of the weight and expense of the F-15 lay in its advanced avionics which enabled it to fly at night in lousy weather--an ability you might want as a pilot or President if the enemy was unsporting enough to start a war in bad weather.

The rest of this MSM "analysis" was driven with the same sort of completely uninformed drivel and misconstrued "evidence." How about survivability in air-to-air combat? How about being able to carry enough air-to-air munitions and targeting capability to overcome numerically superior foes?

That didn't figure into the "analysis" either, revealing that the journalist hadn't bothered talking to pilots who were entrusted to win whatever conflict our civilian leaders sent them to fight. If it were up to the superficial, massively biased MSM writer, it would have been with vastly inferior weapons.

It didn't occur to the writer of this gleeful slam on "needlessly expensive aircraft" to wonder how the pilot might feel, knowing he was risking his life in a cut-rate aircraft with minimal avionics, weaponry, speed, etc. How aggressive could we expect him/her to be in a circumstance where he knows he's flying an inferior plane with cut-rate avionics and capabilities? Which one would you rather fly, the bargain-basement trainer or the one which might actually enable you to survive air combat?

I mention this 1980s-era "analysis" to show that the MSM has long been filled with appalling superficiality even at the highest levels of American journalism. This is neither new nor surprising, but Mr. Easterbrook may well have reached a new pinnacle of perverse blindness to reality...

 

Complete Inability to Grasp the Continuing Growth of Complexity of Modern IT Infrastructure

The trend of gradual increase in the complexity of IT infrastructure has continued unabated for more than 50 years, and there is no sign that this process is slowing down. For example, the complexity of IT infrastructure dramatically increased during the last decade, and that is on top of the no less dramatic increases of each of the previous four decades. This process is very important for projecting any future trends, because the growth of complexity dramatically diminishes the effectiveness of centralization.

Growth of complexity dramatically diminishes the effectiveness of centralization

Carr's complete and total misunderstanding of the level of complexity of IT is nothing new. We can easily find replicas of Carr's views in the 1970s. For example, Andrei Ershov mentioned such views about IT in his famous article Aesthetics and the Human Factor in Programming (based on a presentation to the AFIPS Spring Joint Computer Conference in 1972). He stressed that the inability to grasp the real level of complexity of IT is rather typical for management, and thus the claims of software/IT professionals to a special status will always be disputed. As he noted, even in 1972 programmers and system administrators (IT workers) felt that they were gradually slipping into the paws of managers, who try to turn them into a kind of replaceable assembly-line worker, with all the work planned, measurable, uniform and faceless. The ability of management to grasp the real level of complexity of IT systems has changed little since 1972 ;-). That's why Carr's views have found fertile ground, especially in the "dilbertized" part of IT management (please note that such talented IT managers as Ballmer and McNealy, to name a few, called Carr's views absurd from the very beginning). But despite all the attempts of mediocre IT management to "tame" IT project cost overruns, the overruns continue. Also, despite attempts to convert programmers as well as network and system administrators into another brand of blue-collar workers, they have survived almost intact for the 35 years since Andrei Ershov's speech. And they survived first of all because the complexity of those systems makes real talent in the IT field a very scarce commodity, and simplistic attempts to cut costs and streamline operations backfired (while the bozos who initiated such attempts were often fired or promoted ;-).

Carr makes the valid and fail-safe argument that, after all, who screws things up like IT? He laments that, despite the existence of such giants of thought as himself, in this day and age we still have runaway IT projects and projects that lack business value. But this is a primitive view of the problem. The real challenge for large IT projects, the challenge that usually causes cost and time overruns, is the enormous complexity of the undertaking. If you understand this simple fact, then overruns in costs and time become much more understandable and explainable. This situation is actually typical for any complex product, especially those in which the final product is defined during the project rather than built according to more or less firm specifications. The "flexibility of specifications" problem is most acute in IT because software is extremely malleable. Car manufacturers spend years prototyping cars, but in IT the prototyping and the building of the final product are often not cleanly separated.

I think a talented IT professional needs a unique ability to move between various levels of abstraction (from the hardware register level to the application architecture level), and this ability is as rare as the ability to play chess at master, if not grandmaster, level, or to play the violin at the level of a good orchestra player, if not at David Oistrakh's level. That means that Carr's suggestion that IT functions can be moved entirely to the user level within an organization was, and is, both naive and destructive. It is naive because a regular user has neither the time nor the desire to study the complex software package he or she uses for work. All the user needs are results, which means those results are often obtained in the most inefficient way possible. It is destructive because at the time Carr's article was published (the end of the dot-com boom and almost the lowest point of the stock market) it was as close to backstabbing the industry, which Carr does not understand and for which he never worked, as one can get. Of course money does not smell, but still...

The second example of his anti-historical approach is that Carr professes a very narrow and static definition of IT. During the last 50 years IT has evolved in many different directions. Now it is an umbrella term that includes a dozen or so sub-fields which differ to the point where people from one area often do not understand the technology of another. For example, database engineering is one major part of IT; networking services is another. Web services is an example of a composite field which involves and interacts with networking infrastructure, server infrastructure and databases. These fields use different technologies (few networking engineers know Perl, Java and SQL). Moreover, the level of specialization is such that even within the single field of Unix administration there is quite a bit of specialization among personnel who service Solaris, AIX, HP-UX, Red Hat and Suse. This complexity and diversity limits the possibility of deploying non-professionals. People who do not have specific talents cannot be taught programming, for the same reason that not everybody can be taught to play a violin, guitar or clarinet. You need specific abilities, and they are pretty rare. For example, programming talent (even at the VBA scripting level) is pretty special and might actually correlate better with writing ability and musical talent than with mathematical talent. Of course, some top programmers are simultaneously gifted mathematicians (for example, Donald Knuth).

This complexity also influences the economic benefits of hardware and software deployment in a very interesting way: on each turn of the technological spiral, competitive advantage moves to higher and higher levels and gradually disappears from the lower ones. For example, in 1986 network connectivity was a huge competitive advantage, while in 1994 mere access to the Internet and SMTP mail was a similarly huge competitive advantage. Now those have become standard parts of the infrastructure, and attention has moved to higher-level, architectural issues. The differences in project costs and quality of results between a team run by a talented architect and teams of run-of-the-mill people without any architectural vision are quite drastic.

As for Carr's claim about the lack of competitive advantage, I would suggest that the advantage might disappear, but the huge competitive disadvantage of weak internal IT is evident ;-). Lack of local IT talent instantly makes a company a donor in the hands of ruthless external suppliers and contractors. Consultants (especially large consulting firms) can help, but they are often part of the problem instead of part of the solution, due to the problem of loyalty. We all know what happened to medicine when doctors were allowed to be bribed by pharmaceutical companies. This situation, aptly called "Viva Viagra", in which useless or outright dangerous drugs like Vioxx were allowed to become blockbusters, has been replicated in IT: the independence of IT consultants is just a myth (and, moreover, some commercial IDS and EMS systems are, in their usefulness and side effects, not that different from Vioxx ;-). Cases where a company which lost its IT talent (for example, due to outsourcing) was overcharged two or even ten times for a particular project are not that uncommon. And believe me, clueless companies can be persuaded by corrupt advisers to buy the equivalent of Vioxx in software or hardware (appliances) quite easily, much like clueless patients were prescribed useless or harmful drugs. Such cases are pretty common; they are just too embarrassing for company brass, so few people (or nobody) are fired and the whole mess is usually swept under the carpet.

Lack of local IT talent instantly makes a company a donor in the hands of ruthless external suppliers and contractors.

Anybody with long experience in IT organizations can tell stories of how a particularly stupid or arrogant manager (arrogant managers never trust their own staff) in charge of a functional IT unit cost the company millions of dollars in additional expenses due to overpaid consultants and the unnecessary hardware and enterprise software he or she bravely bought. Add to that exorbitant maintenance fees, and the small victory of a particular manager can turn into a non-strategic but pretty costly disadvantage for the company. Despite the industry's aura of respectability, software salesmen are much closer to used car salesmen than people unrelated to IT presuppose. If those people smell weakness, that's it: the company is cooked. In some ways the atmosphere is also reminiscent of large investment banks during the recent financial crisis: an atmosphere similar to a high-demand cult, with suppressed independence of all actors from top to bottom, where clients are openly despised and viewed as legitimate prey.

Also, the sheer level of complexity of software systems (including even ones as common as MS Office) means that Carr's idea that users can replace IT professionals is simply naive. Yes, in some cases a regular user can easily accomplish simple tasks without help from an expert (and most of the tasks a typical user faces are simple; that is one of the reasons for the huge success of MS Office). But in such cases MS Word is downgraded to a variant of a typewriter. The Mongol horde approach can also work in some cases: instead of using the capabilities of the software, you can add low-paid, low-trained people and, with some additional management effort, get a similar net result. But in other cases such an approach can be an unmitigated disaster. This Mongol horde analogy is of course stretched, but the differences in capabilities between a user and a seasoned professional are so substantial that they cannot be ignored.

The limitations of a regular user can be easily understood by evaluating the average level of mastery of MS Word or Excel. Quite frankly, it is deplorable. A regular MS Word user cannot productively use probably 80% of MS Word functionality. Moreover, out of the 20% of features that he or she does use, probably 10% are used incorrectly due to lack of training and education (styles being one such feature). For Excel the situation is even worse, as it is a more complex and very capable product. This lack of knowledge translates into lower productivity and as such costs money, first of all because additional software is bought and maintained for tasks that, with some level of sophistication, could be accomplished with plain vanilla Office. Knowledge is power, and power is money.

For more complex enterprise tools like Lotus Notes, the LAMP stack (Linux-Apache-MySQL-PHP/Perl/Python), etc., the situation is even less user-friendly, and the differences in productivity, capabilities used, and results achieved between a regular user and a seasoned professional are even more drastic.

The complexity of software products also means that without specialists users tend to overbuy software. For a large firm that can affect the bottom line in a minor way, as an enterprise license for some packages costs half a million dollars or more. For example, it is not uncommon for a large company to have overlapping enterprise packages for monitoring hardware systems and network equipment, each costing, say, $200K a year. Such duplication can be avoided, but if and only if there is adequate IT talent on the floor. And with a high level of talent, the enterprise can use open source products that might provide enough functionality for its needs. In a way, the lower the level of IT talent, the more money the enterprise needs to spend on enterprise software and hardware.

The lower the level of IT talent, the more money the enterprise needs to spend on enterprise software and hardware

Last but not least: it is not enough to understand a particular software technology or protocol. Real professionalism starts with understanding the limits of applicability. This understanding is clearly lacking in Carr's view of service providers, which definitely have their place and importance but are generally overrated by Carr as a universal solution to IT problems. In fact, while solving some existing problems, they are simultaneously a source of new ones. As we will see below, this model has several serious limitations, among them "bandwidth communism" and the re-introduction of the mainframe-style mentality with all its warts.

Frivolous Treatment of IT History

In his writings Carr presupposes that this is the first time IT faces a radical transformation (the switch to "in the cloud" providers, in his vision). For example, in his first book he wrote:

We are arriving at the turning point in the history of IT in business with the convergence of three important trends that will shape the future

But IT history is much more complex and is not driven by the "convergence of ... important trends". Historically it was driven by technological breakthroughs that made the previous generation of computers and software obsolete. In a way, IT is the most dynamic of existing industries. Unlike railways or electrical grids, IT has already experienced several radical transformations, the last major one being the demise of the "mainframe era" dumb terminals and the emergence of PC-based corporate IT. IT history includes half a dozen "turning points" (each amounting to a small revolution), each of which was initiated by a radically new technology:

  1. Predominantly military applications (1945-1950): Harvard Mark I, ENIAC, EDVAC, etc.
  2. Batch computing dominance (1950-1960). The IBM 650 was a popular computer of this period. Later the IBM 7070 (a "transistorized IBM 650") became common, as did the IBM 1401 (about 20K systems sold).
  3. The era of timesharing (late 1960s to early 1970s), when mainframe-based time-sharing services became kings of the hill in the computing centers of their day. This transformation could have been called "batch processing does not matter". The IBM System/360, the most popular mainframe, was announced in 1964.
  4. The minicomputer revolution (late 1970s to early 1980s), when mainframes were first undermined by DEC computers and new operating systems like VAX/VMS and Unix. Later other companies joined DEC, and this class of computers emerged as a less bureaucratic alternative for many mainframe functions.
  5. The PC revolution (early 1980s to late 1990s), during which hundreds of PC companies were created and later most of them perished in a wave of consolidation. The first LANs and WANs emerged; the latter were used for information sharing (Fidonet, Usenet, BBSes). This stage of IT development might well have been called "mainframes do not matter", as it put the final nail in the coffin of mainframe dominance. Paradoxically, mainframes as a class of computers survived and even experienced a kind of limited renaissance later (VM/Linux).
  6. The Internet revolution (early 1990s to now): the emergence of Web-based infrastructure, portals (Yahoo) and search engines (AltaVista, etc.) in the early 90s. The early stage was dominated by email, but from approximately 1994 HTTP became the most common protocol (the Web revolution), accompanied by the gradual switch to fiber optics in WANs, the dot-com boom, and the emergence of Linux and open source. The later stage, often called Web 2.0, included the emergence of laptops with wireless networking as an alternative to cubicle-bound PCs and the re-emergence of collaborative computing (starting from 2000) with new collaborative tools like blogs and wikis as well as Internet search engines like Google.

As you can see, there is very little similarity between IT history and the history of railways or electrical networks. There were never pocket locomotives :-). More importantly, they are artifacts of a different order of complexity. And in no way were computers ever a "proletarian tool", as Carr states:

"Twenty years ago, most executives looked down on computers as proletarian tools – glorified typewriters and calculators – best relegated to low-level employees like secretaries, analysts, and technicians."

The computer was always an instrument (and a very expensive instrument) for management and government bureaucrats. From the very beginning it provided the ability to see a more complete picture of the enterprise. Actually, before the term IT became common, this set of technologies used to be called MIS ("management information systems"). As such, IT was always considered "the nervous system of the enterprise". In 1963 the influential "Executive Leadership Course" from Prentice Hall included a special volume on MIS.

So even forty-five years ago Carr's statement was demonstrably wrong. The problem with Carr's view is that he is talking about PCs, while the main IT workhorse at that time was the mainframe.

IT history does not support Carr's simplistic hypothesis of an imminent, waterfall-style transformation from local datacenters to Internet-connected service providers. There will definitely be new transformations, but so far it is too early to tell what form they will take, as historically they have been bound to major technological breakthroughs.

Poor Forecasting Track Record

If we try to time his predictions, his performance as a forecaster has so far not been impressive. He wrote his paper in 2003, almost at the bottom of the IT slump. But IT recovered from the slump and expanded in 2003-2007. Worldwide end users spent $1.16 trillion on information technology in 2006 and were projected to increase IT spending at a compound annual growth rate (CAGR) of 6.3% to reach $1.48 trillion in 2010 [IDC].
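The IDC figures are internally consistent, as a one-line check shows:

  # 2006 base with four years of 6.3% compound growth
  print(round(1.16 * (1 + 0.063) ** 4, 2))   # 1.48 trillion USD, as quoted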

In a way he behaved like a typical "doom and gloom" stock forecaster who missed the upswing of 2003-2007, and judging from his books he does not care to check his former forecasts -- he is too busy writing new ones.

Truth be told, in 2004-2007 IT employment in corporations (and corporate IT datacenters) grew. For example, in 2007 alone staffing grew 7-8%; 2008 might be a down year due to the general economic slowdown and huge layoffs in the financial sector (for example, UK banks will eliminate 10K IT jobs). But this has nothing to do with Carr's "in the cloud" vision.

In the same timeframe the rise of "cloud based" services was impressive (especially for Web hosting providers), but it has so far failed to doom local datacenters. During those years Google did not add anything strategically important to its search. The Web-based applications it developed were not received very well and linger in relative obscurity. Meanwhile, Microsoft increased its hold with Office 2003 and Office 2007. That strongly suggests that people were voting with their money for locally installed software during this period, and that there are limits to what web applications can provide (limits which are gradually expanding, but at a pace similar to the pace of improvement of regular "installed on the hard drive" applications).

Also, the current switch to Web-based mail by regular users cannot be treated as a success of "cloud" computing. In my opinion it is just a reflection of the limited capabilities of the SMTP protocol, which make a simple Web client more or less adequate for the average user. SMTP does not allow sending large files as attachments (or, more correctly, attachment size is limited on mail hubs), and interaction with the calendar and ToDo lists is pretty simple, so the problem of limited bandwidth does not come into play the way it degrades the experience with Flickr and YouTube. Still, the capabilities of the best Web-based email clients are poor in comparison with the best corporate email clients like Outlook and Lotus Notes. It is also telling that people who need advanced capabilities usually use both local and remote clients. Here we can talk not about replacement but about "peaceful coexistence".
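The attachment limit is worth a numeric footnote: SMTP transports binary attachments base64-encoded, which inflates them by roughly a third before any mail-hub size cap is even applied. A minimal sketch (the 3 MB sample size is an arbitrary choice of mine):

  # SMTP carries binary attachments base64-encoded: every 3 bytes
  # become 4 characters, so a file grows ~33% in transit
  import base64
  payload = bytes(3_000_000)            # a 3 MB attachment (all zeros)
  encoded = base64.b64encode(payload)
  print(len(encoded) / len(payload))    # ~1.33 size overhead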

As for typical Office applications, the "displacement of desktop applications" currently looks like a failure. There are simply not very many takers yet. And laptop power has grown to such an extent that replicating laptop functions with dumb devices (network terminals) connected to remote servers is a very questionable (and expensive) idea indeed. I, for example, doubt that Google can ever provide me the computing power I use in my Dell D620 laptop in Excel and FrontPage. That factor alone limits the applicability of "cloud computing" for me and many other users, although I am not against experimenting with Microsoft's remote information backup/synchronization services when they become available.

All in all, for at least five years since 2003 his prediction can be classified as false (or premature), although the current financial crisis might negatively affect IT employment, but for reasons that have nothing to do with the attractiveness of cloud-based services and all this blah-blah-blah about the electrical utilities analogy. The trends that we can observe are more complex than a simplistic move to "cloud-based" service providers.

In this limited (but not short by IT standards) timeframe Carr proved to be a false prophet.

Sending the concept of a datacenter with dedicated staffing to the dustbin of history because it cannot deliver the impossible is foolish at best. At the same time, the capabilities of software applications are limited mainly by human imagination and the existing generation of computer hardware. Since human creativity is so vast in potential and computer hardware is still evolving by leaps and bounds, it would be foolish to think of software technology as mature. It still evolves quickly, and each decade has introduced new hardware and new elements into the corporate IT infrastructure. As long as software continues to evolve, IT will always be strategic. One example is trucking companies that missed the benefits of GPS-based positioning and wireless communication. Carr's strategic advice to keep a tight grip on the IT budget and "innovate when risks are low" is by definition a defensive strategy suitable for old or dying companies, not for new players hungry for market share. The key to success is the vision of which areas of IT can serve as a catalyst for new business processes or enhance existing ones. Currently the potential for this "catalyst" role is quite substantial for custom Web applications, new types of advertising, and specialized monitoring services (like truck tracking).

André Gide once said: "Believe those who are seeking the truth; doubt those who find it."

Historically Unjustified Extrapolation of Google Success

His extrapolation of Google's success onto the whole industry is extremely naive, taking into account the level of complexity inherent in the industry and the problems Google faced in the past and faces right now. He also ignores the fact that Google's record is far from stellar, and that the company experienced serious difficulties in many attempts to extend its success beyond search (including an attempt to unseat Microsoft Office's dominance). Even the most successful of the "in the cloud" services -- Web hosting -- is far from a paradise, and people often opt out of it as soon as their site grows beyond a certain size and/or complexity. First they switch to accounts with shell access, then to a virtual machine, and then to a hosted server. In the latter case the value of the whole exercise for a large company is doubtful. Yes, I know about a couple of companies (eventually eaten by competitors, as it happens) who paid an arm and a leg for outsourced Web infrastructure. But in the cases I know of, that was more a sign of the incompetence of the IT brass in those particular companies than a sign of new times coming (in one case a company paid approximately $20K per month for a pretty straightforward Windows-based corporate web site). I doubt that Amazon or eBay will outsource their Web hosting to anybody. It is pretty funny that, as I noted before, Carr's vision of "in the cloud" service providers is very similar to the mainframes of old, and we all know the horrors of the mainframe infrastructure that was decimated by the PC revolution. That, by the way, means that in a very deep sense our hero is a reactionary figure who fits the obscurantist label perfectly well.
