
Softpanorama Bulletin
Vol 22, No.01 (January, 2010)


Fifty Glorious Years (1950-2000): The Triumph of US Computer Engineering

The invention of computers was the highest moment in the development of the US high-tech industry, the area which defined the progress of high tech as a whole. This is an area where the USA really has been the greatest nation in the world, a real "shining city on a hill". The USA gave the world great programmers, hardware designers, network architects and managers.

When ENIAC was declassified in 1946 (it made the front page of the New York Times), a lot of things were put into fast motion. As early as 1952, on presidential election night, the UNIVAC computer correctly predicted the winner. While the chance of being right was 50% ;-), this was an impressive introduction of computers to mainstream society.

The giants of the field were either US citizens or people who worked in the USA for a long time. Among them:

  1. Gene Amdahl -- architect of System/360 hardware. Also formulated Amdahl's law.
  2. Frances E. Allen -- an American computer scientist and pioneer in the field of optimizing compilers. Her achievements include seminal work in compilers, code optimization, and parallelization. She also had a role in intelligence work on programming languages for the National Security Agency. Allen was the first female IBM Fellow and in 2006 became the first woman to win the Turing Award.
  3. John Backus -- designed FORTRAN and the first FORTRAN compiler, was one of the designers of ALGOL 60, and co-invented the Backus-Naur form.
  4. Gordon Bell -- designed several of the PDP machines (PDP-4, PDP-6, the PDP-11 Unibus, and the VAX).
  5. Fred Brooks -- managed the development of IBM's System/360 and OS/360; wrote The Mythical Man-Month.
  6. Vint Cerf -- DARPA manager, one of "the fathers of the Internet", sharing this title with Bob Kahn.
  7. John Cocke -- pioneer of optimizing compiler design (the IBM Fortran H compiler), "the father of RISC architecture."
  8. Fernando J. Corbató -- MIT CTSS Time-Sharing System, Multics OS, Corbató's Law
  9. Seymour Cray -- founder of Cray Research, "the father of supercomputing"
  10. Charles Stark Draper -- an American scientist and engineer, known as the "father of inertial navigation". He was the founder and director of the Massachusetts Institute of Technology's Instrumentation Laboratory, later renamed the Charles Stark Draper Laboratory, which made the Apollo moon landings possible through the Apollo Guidance Computer it designed for NASA.
  11. Whitfield Diffie -- an American cryptographer and one of the pioneers of public-key cryptography.
  12. Jack Dongarra -- contributed to the design and implementation of the following open source software packages and systems: EISPACK, LINPACK, the BLAS, LAPACK, ScaLAPACK, Netlib, PVM, MPI, NetSolve, Top500, ATLAS, and PAPI. With Eric Grosse, he pioneered the open source distribution of numeric source code via email with Netlib.
  13. Brendan Eich -- an American computer programmer who created the JavaScript scripting language. Later he became the chief technology officer at the Mozilla Corporation.
  14. Douglas Engelbart -- co-inventor of the computer mouse, instrumental in the development of hypertext
  15. Philip Don Estridge -- led the team that developed the original IBM Personal Computer (PC), and thus is known as the "father of the IBM PC". His decisions dramatically changed the computer industry, resulting in a vast increase in the number of personal computers sold and bought (a computer for each family), thus creating an entire PC industry.
  16. David C. Evans -- the founder of the computer science department at the University of Utah and co-founder (with Ivan Sutherland) of Evans & Sutherland, a computer firm known as a pioneer in the domain of computer-generated imagery.
  17. Edward Feigenbaum -- a computer scientist working in the field of artificial intelligence, often called the "father of expert systems." A former chief scientist of the Air Force, he received the U.S. Air Force Exceptional Civilian Service Award in 1997. In 1984 he was selected as one of the initial fellows of the ACMI, and in 2007 he was inducted as a Fellow of the ACM. In 2011 Feigenbaum was inducted into IEEE Intelligent Systems' AI's Hall of Fame for "significant contributions to the field of AI and intelligent systems".
  18. Robert W. Floyd -- a talented computer scientist who collaborated with Donald Knuth. Invented the Floyd–Warshall algorithm (independently of Stephen Warshall), which efficiently finds all shortest paths in a graph; Floyd's cycle-finding algorithm for detecting cycles in a sequence; and the Floyd-Evans language for parsing. He also introduced the important concept of error diffusion for rendering images, also called Floyd–Steinberg dithering (though he distinguished dithering from diffusion). His lecture notes served as a blueprint for volume three of The Art of Computer Programming (Sorting and Searching). He obtained a full professorship at Stanford without a Ph.D.
  19. Bill Gates -- created the FAT filesystem and was instrumental in the creation of PC DOS, the Windows OSes, Microsoft Office, and the whole PC ecosystem that dominates computing today.
  20. Seymour Ginsburg -- a pioneer of automata theory, formal language theory, and database theory in particular, and of computer science in general. Ginsburg was the first to observe the connection between context-free languages and "ALGOL-like" languages.
  21. Robert M. Graham -- one of the key developers of Multics, one of the first virtual memory time-sharing computer operating systems, which broke ground for all modern operating systems. He had responsibility for protection, dynamic linking, and other key system kernel areas.
  22. Ralph Griswold -- created the groundbreaking string-processing languages SNOBOL, SL5, and, later, Icon.
  23. Richard Hamming -- his contributions include the Hamming code, the Hamming window (digital filters), Hamming numbers, the sphere-packing (or Hamming) bound, and the Hamming distance.
  24. Martin Hellman -- an American cryptologist, best known for his invention of public-key cryptography in cooperation with Whitfield Diffie and Ralph Merkle. Hellman is a long-time contributor to the computer privacy debate and is more recently known for promoting risk analysis studies on nuclear threats, including NuclearRisk.org.
  25. David A. Huffman -- known for Huffman coding.
  26. Steve Jobs -- co-founder of Apple and the marketing force behind the NeXT computer and the iPad and iPhone brands.
  27. Bill Joy -- the major contributor to BSD Unix and Sun's Solaris; creator of the vi editor and the C shell.
  28. Phil Katz -- a computer programmer best known as the co-creator of the ZIP file format for data compression, and the author of PKZIP, a program for creating zip files which ran under DOS.
  29. Alan Kay -- one of the members of Xerox PARC and Atari's chief scientist for three years; best known for his contribution to the Smalltalk language.
  30. Gary Kildall -- creator of the concept of the BIOS and of the CP/M and DR-DOS operating systems.
  31. Donald Knuth -- made a tremendous contribution by systematizing knowledge of computer algorithms, publishing three volumes of The Art of Computer Programming (starting in 1968); see also Donald Knuth: Leonard Euler of Computer Science.
  32. Butler Lampson -- one of the founding members of Xerox PARC in 1970. In 1973 the Xerox Alto, with its three-button mouse and full-page-sized monitor, was born; it is now considered to be the first actual personal computer (at least in terms of what has become the "canonical" GUI mode of operation). At PARC, Lampson helped work on many other revolutionary technologies, such as laser printer design; two-phase commit protocols; Bravo, the first WYSIWYG text formatting program; and Ethernet, the first high-speed local area network (LAN); and he designed several influential programming languages such as Euclid.
  33. Chris Malachowsky -- an American electrical engineer, one of the founders of the computer graphics company Nvidia.
  34. John Mauchly -- an American physicist who, along with J. Presper Eckert, designed ENIAC, the first general purpose electronic digital computer, as well as EDVAC, BINAC, and UNIVAC I, the first commercial computer made in the United States.
  35. John McCarthy -- coined the term "artificial intelligence" (AI), developed the Lisp programming language family, significantly influenced the design of the ALGOL programming language, popularized timesharing, and was very influential in the early development of AI.
  36. Bob Miner -- the co-founder of Oracle Corporation and architect of Oracle's relational database.
  37. Cleve Moler -- a mathematician and computer programmer specializing in numerical analysis. In the mid to late 1970s, he was one of the authors of LINPACK and EISPACK, Fortran libraries for numerical computing. He invented MATLAB, a numerical computing package, to give his students at the University of New Mexico easy access to these libraries without writing Fortran. In 1984, he co-founded MathWorks with Jack Little to commercialize this program.
  38. Gordon E. Moore -- an American businessman, co-founder and Chairman Emeritus of Intel Corporation, and the author of Moore's Law (published in an article of April 19, 1965, in Electronics magazine).
  39. Robert Morris -- a researcher at Bell Labs who worked on Multics and later Unix. Morris's contributions to early versions of Unix include the math library, the bc programming language, the program crypt, and the password encryption scheme used for user authentication. The encryption scheme was based on using a trapdoor function (now called a key derivation function) to compute hashes of user passwords, which were stored in the file /etc/passwd; analogous techniques, relying on different functions, are still in use today (see the sketch after this list).
  40. Allen Newell -- contributed to the Information Processing Language (1956) and two of the earliest AI programs, the Logic Theory Machine (1956) and the General Problem Solver (1957) (with Herbert A. Simon). He was awarded the ACM's A.M. Turing Award along with Herbert A. Simon in 1975 for their basic contributions to artificial intelligence and the psychology of human cognition.
  41. Robert Noyce -- co-founded Fairchild Semiconductor in 1957 and Intel Corporation in 1968. He is also credited (along with Jack Kilby) with the invention of the integrated circuit or microchip, which fueled the personal computer revolution and gave Silicon Valley its name.
  42. Ken Olsen -- an American engineer who co-founded Digital Equipment Corporation (DEC) in 1957 with colleague Harlan Anderson.
  43. John K. Ousterhout -- the creator of the Tcl scripting language and the Tk toolkit.
  44. Alan Perlis -- an American computer scientist known for his pioneering work in programming languages and the first recipient of the Turing Award.
  45. Dennis Ritchie -- an American computer scientist who created the C programming language and, with long-time colleague Ken Thompson, the Unix operating system. Ritchie and Thompson received the Turing Award from the ACM in 1983, the Hamming Medal from the IEEE in 1990, and the National Medal of Technology from President Clinton in 1999.
  46. Claude Shannon -- an American mathematician, electronic engineer, and cryptographer known as "the father of information theory".
  47. Ivan Sutherland -- an American computer scientist and Internet pioneer. He received the Turing Award from the Association for Computing Machinery in 1988 for the invention of Sketchpad, an early predecessor to the sort of graphical user interface that has become ubiquitous in personal computers. He is a member of the National Academy of Engineering and the National Academy of Sciences, and has received many other major awards. In 2012 he was awarded the Kyoto Prize in Advanced Technology for "pioneering achievements in the development of computer graphics and interactive interfaces".
  48. Richard Stallman -- creator of the GNU project; see also Nikolai Bezroukov, Portraits of Open Source Pioneers, Ch. 3: Prince Kropotkin of Software (Richard Stallman and War of Software Clones).
  49. Robert Tarjan -- an American computer scientist. He is the discoverer of several graph algorithms, including Tarjan's off-line least common ancestors algorithm, and co-inventor of both splay trees and Fibonacci heaps. Tarjan is currently the James S. McDonnell Distinguished University Professor of Computer Science at Princeton University, and is also a Senior Fellow at Hewlett-Packard.
  50. Ken Thompson -- designed and implemented the original Unix operating system. He also invented the B programming language, the direct predecessor to the C programming language, and was one of the creators and early developers of the Plan 9 operating system.
  51. Larry Wall -- Creator of Perl language; see also Slightly Skeptical View on Larry Wall and Perl
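
To make the password-scheme idea in item 39 concrete, here is a minimal sketch in Python of the same one-way approach: only a salt and a hash are stored, never the password itself. It uses the standard library's PBKDF2 key derivation function rather than Morris's original DES-based crypt, so the salt size, iteration count, and helper names are illustrative assumptions, not the historical implementation.

    import hashlib, hmac, os

    def make_entry(password: str) -> str:
        # Store only salt + one-way hash, never the password itself
        salt = os.urandom(16)
        digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
        return salt.hex() + "$" + digest.hex()

    def check(password: str, entry: str) -> bool:
        # Re-derive the hash from the candidate password and compare
        salt_hex, stored_hex = entry.split("$")
        candidate = hashlib.pbkdf2_hmac("sha256", password.encode(),
                                        bytes.fromhex(salt_hex), 100_000)
        return hmac.compare_digest(candidate.hex(), stored_hex)

    entry = make_entry("correct horse")   # the kind of string a password file would hold
    print(check("correct horse", entry))  # True
    print(check("wrong guess", entry))    # False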

The people mentioned above are all associated with the USA, and I named just a few whose work I personally know... US computer science research was conducted in close collaboration with British computer scientists, who also made significant contributions (some of the most impressive IBM compilers were actually designed and implemented in Britain), but the leadership role of the USA was indisputable.

A large part of this unique technology culture was destroyed by the outsourcing frenzy that started around 1998, but the period from approximately 1950 to approximately 2000 was a continuous triumph of US engineering. Simultaneously it was a triumph of New Deal policies. When they were dismantled (starting with Reagan, or even Carter), computer science was quickly overtaken by commercial interests and became very similar to economics in the level of corruption of academics and academic institutions.

But that did not happen overnight, and inertia lasted until the late 1990s. Firms also did not escape this transformation into money-making machines, with IBM as a primary example of the disastrous results of such transformations, which started under the "American Express"-style leadership of Lou Gerstner. See IBM marry Linux to Outsourcing.

Here is the timeline, modified from History of Computer Science:

Fifty glorious years


1950's

In 1949 the U.S. Army and the University of Illinois jointly fund the construction of two computers, ORDVAC and ILLIAC (ILLInois Automated Computer). The Digital Computer Laboratory is organized, with Ralph Meagher, a physicist and chief engineer for ORDVAC, as its head. In 1951 ORDVAC (Ordnance Variable Automated Computer), one of the fastest computers in existence, is completed. In 1952 ORDVAC moves to the Army Ballistic Research Laboratory in Aberdeen, Maryland; it is used remotely from the University of Illinois via a teletype circuit for up to eight hours each night until the ILLIAC computer is completed.

Grace Murray Hopper (1906-1992) invented the notion of a compiler at Remington Rand in 1951. Earlier, in 1947, Hopper found the first computer "bug" -- a real one, a moth that had gotten into the Harvard Mark II. (Actually, the use of "bug" to mean defect goes back to at least 1889.) The first compiler was written by Hopper in 1952, for the A-0 System language, and she coined the term "compiler". See History of compiler construction (Wikipedia).

In a famous paper that appeared in the journal Mind in 1950, Alan Turing introduced the Turing Test, one of the first efforts in the field of artificial intelligence. He proposed a definition of "thinking" or "consciousness" using a game: a tester would have to decide, on the basis of written conversation, whether the entity in the next room responding to the tester's queries was a human or a computer. If this distinction could not be made, then it could be fairly said that the computer was "thinking".

In 1952, Alan Turing was arrested for "gross indecency" after a burglary led to the discovery of his affair with Arnold Murray. Overt homosexuality was taboo in 1950's England, and Turing was forced to take estrogen "treatments" which rendered him impotent and caused him to grow breasts. On June 7, 1954, despondent over his situation, Turing committed suicide by eating an apple laced with cyanide.

In the same year, 1952, ILLIAC, the first computer built and owned entirely by an educational institution, became operational. It was ten feet long, two feet wide, and eight and one-half feet high, contained 2,800 vacuum tubes, and weighed five tons.

In the same year IBM began work on its first magnetic disk. In September 1952 IBM opened a facility in San Jose, Calif., a critical moment in the story of Silicon Valley. The company set to work developing a new kind of magnetic memory for its planned Model 305 RAMAC (Random Access Method of Accounting and Control).

In 1952 UNIVAC correctly predicted the result of the presidential election in the USA. Remington Rand seized the opportunity to introduce itself to America as the maker of UNIVAC – the computer system whose name would become synonymous with "computer" in the 1950s. Remington Rand was already widely known as the company that made Remington typewriters. The company had bought out the struggling Eckert-Mauchly Computer Corporation in 1950. J. Presper ("Pres") Eckert and John Mauchly had led the ENIAC project and made one of the first commercially available computers, UNIVAC. See the Computer History Museum: "Have you got a prediction for us, UNIVAC?"

The IBM 650 Magnetic Drum Data Processing Machine was announced 2 July 1953 (as the "Magnetic Drum Calculator", or MDC), but not delivered until December 1954 (same time as the NORC). Principal designer: Frank Hamilton, who had also designed ASCC and SSEC. Two IBM 650s were installed at IBM Watson Scientific Computing Laboratory at Columbia University, 612 West 116th Street, beginning in August 1955.

Edsger Dijkstra invented an efficient algorithm for shortest paths in graphs as a demonstration of the ARMAC computer in 1956. He also invented an efficient algorithm for the minimum spanning tree in order to minimize the wiring needed for the X1 computer. (Dijkstra is famous for his caustic, opinionated memos. For example, see his opinions of some programming languages).
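
For readers who have not seen it, here is a minimal modern sketch of Dijkstra's shortest-path idea in Python: repeatedly settle the closest unsettled node and relax its outgoing edges. The priority-queue formulation and the toy graph are illustrative assumptions; Dijkstra's 1956 ARMAC demonstration predates such data structures.

    import heapq

    def dijkstra(graph, source):
        # graph: node -> list of (neighbor, non-negative edge weight)
        dist = {source: 0}
        heap = [(0, source)]
        while heap:
            d, u = heapq.heappop(heap)
            if d > dist.get(u, float("inf")):
                continue                      # stale queue entry
            for v, w in graph.get(u, []):
                nd = d + w
                if nd < dist.get(v, float("inf")):
                    dist[v] = nd              # found a shorter path to v
                    heapq.heappush(heap, (nd, v))
        return dist

    g = {"a": [("b", 2), ("c", 5)], "b": [("c", 1)], "c": []}
    print(dijkstra(g, "a"))                   # {'a': 0, 'b': 2, 'c': 3}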

In 1956 the IBM 305 RAMAC was announced. It was the first commercial computer that used a moving-head hard disk drive (magnetic disk storage) for secondary storage. The 305 was one of the last vacuum tube computers that IBM built. The IBM 350 disk system stored 5 million 8-bit (7 data bits plus 1 parity bit) characters. It had fifty 24-inch-diameter (610 mm) disks.

The same year the computing center at the Case Institute of Technology received an IBM 650; Donald Knuth entered the college that year and managed to start working at the computing center. That later led to the creation of his three-volume series The Art of Computer Programming, "the bible of programming" as it was called.

On October 4, 1957, the first artificial Earth satellite, Sputnik, was launched by the USSR into an elliptical low Earth orbit. In a way it was a happenstance due to the iron will and talent of Sergey Korolev, the charismatic head of the USSR rocket program (who had actually served some years in the GULAG). But it opened a new era. The ILLIAC I (Illinois Automatic Computer), a pioneering computer built in 1952 by the University of Illinois and the first computer built and owned entirely by a US educational institution, was the first to calculate Sputnik's orbit. The launch of Sputnik led to the creation of NASA and, indirectly, of the US Advanced Research Projects Agency (ARPA, later DARPA) in February 1958 to regain the technological lead. It also led to a dramatic increase in U.S. government spending on scientific research and education via President Eisenhower's bill, the National Defense Education Act. This bill encouraged students to go to college and study math and science; their tuition fees would be paid for. This led to a new emphasis on science and technology in American schools. In other words, Sputnik created the building blocks that largely determined the way computer science developed in the USA for the next decade or two. ARPA later funded the creation of the TCP/IP protocol and the Internet as we know it, and also contributed to the development of large-scale integrated circuits. The rivalry in space, even though it had military motives, served as a tremendous push forward for computers and computer science.

John Backus and his team delivered the first complete compiler, the FORTRAN compiler, in April 1957. FORTRAN stands for FORmula TRANslating system. Backus went on to contribute to the development of ALGOL and the well-known syntax-specification system known as BNF. The first FORTRAN compiler took 18 person-years to create.

LISP, a list-processing language for artificial intelligence programming, was invented by John McCarthy about 1958. The same year Alan Perlis, John Backus, Peter Naur and others developed Algol.

In hardware, Jack Kilby (Texas Instruments) and Robert Noyce (Fairchild Semiconductor) invented the integrated circuit in 1959.

In 1959 LISP 1.5 appears. The same year COBOL is created by the Conference on Data Systems Languages (CODASYL).

See also Knuth Biographic Notes


1960's

In the 1960's, computer science came into its own as a discipline; the decade became a golden age of computer science. In fact, the term "computer science" was coined by George Forsythe, a numerical analyst. The first computer science department was formed at Purdue University in 1962. The first person to receive a Ph.D. from a computer science department was Richard Wexelblat, at the University of Pennsylvania, in December 1965.

Operating systems saw major advances. Fred Brooks at IBM managed the development of System/360, a line of different computers with the same architecture and instruction set, from small machines to top-of-the-line. DEC designed the PDP series. The first PDP-1 was delivered to Bolt, Beranek and Newman in November 1960 and formally accepted the next April. The PDP-1 sold in basic form for $120,000, or about $900,000 in 2011 US dollars. By the time production ended in 1969, 53 PDP-1s had been delivered. At the end of the decade, ARPAnet, a precursor to today's Internet, began to be constructed.

In 1960 ALGOL 60, the first block-structured language, appears. This is the root of the family tree that will ultimately produce PL/I, ALGOL 68, Pascal, Modula, C, Java, C#, and other languages. ALGOL became a popular language in Europe in the mid- to late 1960s. Attempts to simplify Algol led to the creation of BASIC (developed c. 1964 by John Kemeny (1926-1992) and Thomas Kurtz (b. 1928)), which became very popular with the PC revolution.

The 1960's also saw the rise of automata theory and the theory of formal languages. Big names here include Noam Chomsky and Michael Rabin. Chomsky introduced the notion of context free languages and later became well-known for his theory that language is "hard-wired" in human brains, and for his criticism of American foreign policy.

Sometime in the early 1960s, Kenneth Iverson begins work on the language that will become APL (A Programming Language). It uses a specialized character set that, for proper use, requires APL-compatible I/O devices. APL is documented in Iverson's book A Programming Language, published in 1962.

In 1962 ILLIAC II, a transistorized computer 100 times faster than the original ILLIAC, becomes operational. ACM Computing Reviews said of the machine, "ILLIAC II, at its conception in the mid-1950s, represents the spearhead and breakthrough into a new generation of machines." In 1963 Professor Donald B. Gillies discovered three Mersenne prime numbers while testing ILLIAC II, including the largest prime then known, 2**11213 - 1, which has over 3,000 digits.
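
The digit count is easy to verify: the number of decimal digits of 2**11213 - 1 is floor(11213 * log10(2)) + 1 = 3376, and Python's arbitrary-precision integers confirm it directly (a quick sanity check of the claim above, not part of Gillies' original computation):

    # Count the decimal digits of the Mersenne prime 2**11213 - 1
    print(len(str(2**11213 - 1)))   # 3376, i.e. "over 3,000 digits"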

The famous IBM System/360 (S/360) was first announced by IBM on April 7, 1964. The S/360 became the most popular computer system for more than a decade. It introduced the 8-bit byte, byte addressing, and many other things. The same year (1964) the first definition of PL/I appeared. PL/I became the most widely used programming language in Eastern Europe and the USSR, and it later served as a prototype of several other languages, including PL/M and C.

In 1964 the IBM 2311 Direct Access Storage Facility was introduced (see History of IBM magnetic disk drives, Wikipedia) for the System/360 series. It was also available on the IBM 1130 and (using the 2841 Control Unit) the IBM 1800. The 2311 mechanism was largely identical to the 1311, but recording improvements allowed higher data density. The 2311 stored 7.25 megabytes on a single removable IBM 1316 disk pack (the same type used on the IBM 1311) consisting of six platters that rotated as a single unit. Each recording surface had 200 tracks plus three optional tracks which could be used as alternatives in case faulty tracks were discovered. Average seek time was 85 ms. Data transfer rate was 156 kB/s.

Along with the development of the unified System/360 series of computers, IBM wanted a single programming language for all users. It hoped that Fortran could be extended to include the features needed by commercial programmers. In October 1963 a committee was formed, composed originally of 3 IBMers from New York and 3 members of SHARE, the IBM scientific users group, to propose these extensions to Fortran. Given the constraints of Fortran, they were unable to do this and embarked on the design of a "new programming language" based loosely on Algol, labeled "NPL". This acronym conflicted with that of the UK's National Physical Laboratory and was replaced briefly by MPPL (MultiPurpose Programming Language) and, in 1965, with PL/I (with a Roman numeral "I"). The first definition appeared in April 1964. IBM took NPL as a starting point and completed the design to a level that the first compiler could be written; the NPL definition was incomplete in scope and in detail.

Control of the PL/I language was vested initially in the New York Programming Center and later at the IBM UK Laboratory at Hursley. The SHARE and GUIDE user groups were involved in extending the language and had a role in IBM's process for controlling the language through their PL/I Projects. The language was first specified in detail in the manual "PL/I Language Specifications. C28-6571", written in New York from 1965, and superseded by "PL/I Language Specifications. GY33-6003", written in Hursley from 1967. IBM continued to develop PL/I in the late sixties and early seventies, publishing it in the GY33-6003 manual. These manuals were used by the Multics group and other early implementers.

The first production PL/I compiler was the PL/I F compiler for the OS/360 Operating System, built by John Nash's team at Hursley in the UK; the runtime library team was managed by I.M. (Nobby) Clarke. Release 1 shipped in 1966. That was a significant step forward in comparison with earlier compilers. The PL/I D compiler, using 16 kilobytes of memory, was developed by IBM Germany for the DOS/360 low-end operating system. It implemented a subset of the PL/I language, requiring all strings and arrays to have fixed extents, thus simplifying the run-time environment. Reflecting the underlying operating system, it lacked dynamic storage allocation and the controlled storage class. It was shipped within a year of PL/I F.

C.A.R. Hoare invented Quicksort around 1960, while he was a visiting student at Moscow State University.

Douglas C. Engelbart invents the computer mouse c. 1964 at SRI; he demonstrates it publicly in 1968.

The first volume of The Art of Computer Programming was published in 1968 and instantly became a classic. Donald Knuth (b. 1938) later published two additional volumes of his world-famous three-volume treatise.

In 1968 ALGOL 68, a monster language compared to ALGOL 60, appears. Some members of the specifications committee -- including C.A.R. Hoare and Niklaus Wirth -- protest its approval. ALGOL 68 proves difficult to implement. The same year Niklaus Wirth begins his work on a simple teaching language which later becomes Pascal.

Ted Hoff (b. 1937) and Federico Faggin at Intel designed the first microprocessor (computer on a chip) in 1969-1971.

In the late 1960s the PDP-11, one of the first 16-bit minicomputers, was designed in a crash program by Harold McFarland, Gordon Bell, Roger Cady, and others as a response to Data General's 16-bit Nova minicomputer. The project was able to leap forward in design with the arrival of Harold McFarland, who had been researching 16-bit designs at Carnegie Mellon University. One of his simpler designs became the PDP-11. It was launched in 1970 and became a huge success. The first officially named version of Unix ran on the PDP-11/20 in 1970. It is commonly stated that the C programming language took advantage of several low-level PDP-11-dependent programming features, albeit not originally by design. A major advance in the PDP-11 design was Digital's Unibus, which supported all peripherals through memory mapping. This allowed a new device to be added easily, generally only requiring plugging a hardware interface board into the backplane and then installing software that read and wrote to the mapped memory to control it. The relative ease of interfacing spawned a huge market of third-party add-ons for the PDP-11, which made the machine even more useful. The combination of architectural innovations proved superior to competitors, and the "11" architecture was soon the industry leader, propelling DEC back to a strong market position.

A second generation of programming languages, such as BASIC, ALGOL 68, and Pascal (designed by Niklaus Wirth in 1968-1969), appeared at the end of the decade.


1970's

Flat uniform record (relational) databases got a fashionable pseudo-theoretical justification with the work of Edgar F. Codd. While mostly nonsense, it helped to spread relational databases, which became the dominant type of database. That was probably one of the first bouts of fashion in computer science; many more followed. Codd won the Turing Award in 1981.

Unix, a very influential operating system, was developed at Bell Laboratories by Ken Thompson (b. 1943) and Dennis Ritchie (b. 1941) after AT&T withdrew from the Multics project. Ritchie also developed C, which became the most influential systems programming language and was later used as a general-purpose language on personal computers; Brian Kernighan co-authored with Ritchie the definitive book on the language. The first release of C was made in 1972; the definitive reference manual for it did not appear until 1974.

In the early 1970s the PL/I Optimizer and Checkout compilers produced in Hursley supported a common level of the PL/I language and aimed to replace the PL/I F compiler. The compilers had to produce identical results - the Checkout Compiler was used to debug programs that would then be submitted to the Optimizer. Given that the compilers had entirely different designs and were handling the full PL/I language, this goal was challenging: it was achieved.

The PL/I Optimizing compiler took over from the PL/I F compiler and was IBM's workhorse compiler from the 1970s to the 1990s. Like PL/I F, it was a multiple-pass compiler with a 44-kilobyte design point, but it was an entirely new design. Unlike the F compiler, it had to perform compile-time evaluation of constant expressions using the run-time library, reducing the maximum memory for a compiler phase to 28 kilobytes. A second-time-around design, it succeeded in eliminating the annoyances of PL/I F such as cascading diagnostics. It was written in S/360 Macro Assembler by a team, led by Tony Burbridge, most of whom had worked on PL/I F. Macros were defined to automate common compiler services and to shield the compiler writers from the task of managing real-mode storage, allowing the compiler to be moved easily to other memory models. Program optimization techniques developed for the contemporary IBM Fortran H compiler were deployed: the Optimizer equaled Fortran execution speeds in the hands of good programmers. Announced with the IBM S/370 in 1970, it shipped first for the DOS/360 operating system in August 1971, and shortly afterward for OS/360 and the first virtual memory IBM operating systems OS/VS1, MVS and VM/CMS (the developers were unaware that while they were shoehorning the code into 28 kB sections, IBM Poughkeepsie was finally ready to ship virtual memory support in OS/360). It supported the batch programming environments and, under TSO and CMS, it could be run interactively.

Simultaneously PL/C, a dialect of PL/I for education, was developed at Cornell University in the early 1970s. It was designed with the specific goal of being used for teaching programming. The main authors were Richard W. Conway and Thomas R. Wilcox. They published the famous article "Design and implementation of a diagnostic compiler for PL/I" in the Communications of the ACM in March 1973. PL/C eliminated some of the more complex features of PL/I and added extensive debugging and error recovery facilities. The PL/C compiler had the unusual capability of never failing to compile any program, through the use of extensive automatic correction of many syntax errors and by converting any remaining syntax errors to output statements.

In 1972 Gary Kildall implemented a subset of PL/I, called PL/M, for microprocessors. PL/M was used to write the CP/M operating system - and much application software running on CP/M and MP/M. Digital Research also sold a PL/I compiler for the PC written in PL/M. PL/M was used to write much other software at Intel for the 8080, 8085, and Z-80 processors during the 1970s.

In 1973-74 Gary Kildall developed CP/M, an operating system for an Intel Intellec-8 development system equipped with a Shugart Associates 8-inch floppy disk drive interfaced via a custom floppy disk controller. It was written in PL/M. Various aspects of CP/M were influenced by the TOPS-10 operating system of the DECsystem-10 mainframe computer, which Kildall had used as a development environment.

The LSI-11 (PDP-11/03), introduced in February 1975, was the first PDP-11 model produced using large-scale integration, a precursor to the personal computer.

The first RISC architecture was begun by John Cocke in 1975, at the Thomas J. Watson Laboratories of IBM. Similar projects started at Berkeley and Stanford around this time.

In March 1976 one of the first supercomputers, the CRAY-1, was shipped; designed by Seymour Cray (b. 1925), it could perform 160 million operations per second. The Cray X-MP came out in 1982. Later Cray Research was taken over by Silicon Graphics.

There were also major advances in algorithms and computational complexity. In 1971, Steve Cook published his seminal paper on NP-completeness, and shortly thereafter, Richard Karp showed that many natural combinatorial problems were NP-complete. Whit Diffie and Martin Hellman published a paper that introduced the theory of public-key cryptography, and a public-key cryptosystem known as RSA was invented by Ronald Rivest, Adi Shamir, and Leonard Adleman.
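
As an illustration of what a public-key cryptosystem does, here is a toy RSA round trip in Python with deliberately tiny textbook primes (the specific numbers are illustrative assumptions; real RSA uses primes hundreds of digits long plus proper padding):

    # Toy RSA: encrypt with the public key (e, n), decrypt with the private key (d, n)
    p, q = 61, 53
    n = p * q                  # public modulus, 3233
    phi = (p - 1) * (q - 1)    # 3120
    e = 17                     # public exponent, coprime with phi
    d = pow(e, -1, phi)        # private exponent = modular inverse of e, here 2753

    m = 65                     # a "message" smaller than n
    c = pow(m, e, n)           # ciphertext
    print(pow(c, d, n))        # decryption recovers 65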

Microsoft was formed on April 4, 1975 to develop and sell BASIC interpreters for the Altair 8800. Bill Gates and Paul Allen wrote a version of BASIC that they sold to MITS (Micro Instrumentation and Telemetry Systems) on a per-copy royalty basis. MITS was producing the Altair, one of the earliest 8080-based microcomputers, which came with an interpreter for a programming language.

The Apple I went on sale in July 1976 and was market-priced at $666.66 ($2,572 in 2011 dollars, adjusted for inflation).

The Apple II was introduced on April 16, 1977 at the first West Coast Computer Faire. It differed from its major rivals, the TRS-80 and Commodore PET, because it came with color graphics and an open architecture. While early models used ordinary cassette tapes as storage devices, they were superseded by the introduction of a 5 1/4 inch floppy disk drive and interface, the Disk II.

In 1976, DEC decided to extend the PDP-11 architecture to 32 bits while adding a complete virtual memory system to the simple paging and memory protection of the PDP-11. The result was the VAX architecture. The first computer to use a VAX CPU was the VAX-11/780, which DEC referred to as a superminicomputer. Although it was not the first 32-bit minicomputer, the VAX-11/780's combination of features, price, and marketing almost immediately propelled it to a leadership position in the market after it was released in 1978. VAX systems were so successful that they propelled Unix to the status of a major OS. In 1983, DEC canceled its Jupiter project, which had been intended to build a successor to the PDP-10 mainframe, and instead focused on promoting the VAX as the single computer architecture for the company.

In 1978 AWK -- a text-processing language named after the designers, Aho, Weinberger, and Kernighan -- appears. The same year the ANSI standard for FORTRAN 77 appears.

In 1977 Bill Joy, then a graduate student at Berkeley, started compiling the first Berkeley Software Distribution (1BSD), which was released on March 9, 1978.

In 1979, three graduate students in North Carolina developed a distributed news server which eventually became Usenet.

The Second Berkeley Software Distribution (2BSD), was released in May 1979. It included updated versions of the 1BSD software as well as two new programs by Joy that persist on Unix systems to this day: the vi text editor (a visual version of ex) and the C shell.

The same year, 1979, VisiCalc, the first spreadsheet program available for personal computers, was conceived by Dan Bricklin, refined by Bob Frankston, developed by their company Software Arts, and distributed by Personal Software (later named VisiCorp) for the Apple II computer.

The kernel of BSD Unix was largely rewritten by Berkeley students to include a virtual memory implementation, and a complete operating system, including the new kernel and ports of the 2BSD utilities to the VAX, was released as 3BSD at the end of 1979.

Microsoft purchased a license for Version 7 Unix from AT&T in 1979, and announced on August 25, 1980 that it would make it available for the 16-bit microcomputer market.


1980's

The success of 3BSD was a major factor in the Defense Advanced Research Projects Agency's (DARPA) decision to fund Berkeley's Computer Systems Research Group (CSRG), which would develop a standard Unix platform for future DARPA research in the VLSI Project, including a TCP/IP stack. CSRG released 4BSD, containing numerous improvements to the 3BSD system, in late 1980. It offered a number of enhancements over 3BSD, notably job control in the previously released csh, delivermail (the antecedent of sendmail), "reliable" signals, and the Curses programming library.

This decade also saw the rise of the personal computer, thanks to Steve Wozniak and Steve Jobs, founders of Apple Computer.

In 1981 the IBM PC was launched, which made personal computers mainstream. The first computer viruses were also developed in 1981; the term was coined by Leonard Adleman, now at the University of Southern California. The same year the first truly successful portable computer (a predecessor of modern laptops), the Osborne I, was marketed.

In 1982 one of the first scripting languages, REXX, was released by IBM as a product, four years after AWK was released. Over the years IBM included REXX in almost all of its operating systems (VM/CMS, VM/GCS, MVS TSO/E, AS/400, VSE/ESA, AIX, CICS/ESA, PC DOS, and OS/2), and has made versions available for Novell NetWare, Windows, Java, and Linux.

In 1982 Adobe Systems was founded; its PostScript page description language, released in 1984, revolutionized printing on laser printers.

1983 was a year of major events:

4.2BSD took over two years to implement and contained several major overhauls. It incorporated a modified version of BBN's preliminary TCP/IP implementation and the new Berkeley Fast File System, implemented by Marshall Kirk McKusick. The official 4.2BSD release came in August 1983. The same year Stallman resigned from MIT to start the GNU project, with the explicit goal of reimplementing Unix as a "free" operating system. The name stands for "GNU's Not Unix."

In 1984 Stallman began writing a replacement for Gosling's Emacs, released as GNU Emacs, as "free" software (Gosling had sold the rights to his code to a commercial company), and in 1985 he launched the Free Software Foundation (FSF) to support the GNU project. One of the first programs he decided to write was a C compiler, which became widely known as GCC. Also in 1984 Steven Levy's book Hackers was published, with a chapter devoted to RMS that presented him in an extremely favorable light.

In January 1984 Apple introduced the Macintosh, the first mass-produced GUI-based personal computer. It came about two and a half years after the IBM PC was launched and almost seven years after the Apple II. It went on sale on January 24, 1984, two days after the US$1.5 million Ridley Scott television commercial "1984" was aired during Super Bowl XVIII on January 22, 1984. The commercial is now considered a "masterpiece." In it an unnamed heroine represents the coming of the Macintosh (indicated by a Picasso-style picture of Apple's Macintosh computer on her white tank top) as a means of saving humanity from the "conformity" of IBM's attempts to dominate the computer industry.

In 1985 the Intel 80386 introduced 32-bit logical addressing. It became instrumental in the Unix renaissance, which started the same year with the launch of Xenix 2.0 by Microsoft. Xenix was based on UNIX System V; an update numbered 2.1.1 added support for the Intel 80286 processor. The Sperry PC/IT, an IBM PC AT clone, was advertised as capable of supporting eight simultaneous dumb-terminal users under this version. Subsequent releases improved System V compatibility. The era of PC Unix had started, and Microsoft became the dominant vendor of Unix: in the late 1980s, Xenix was, according to The Design and Implementation of the 4.3BSD UNIX Operating System, "probably the most widespread version of the UNIX operating system, according to the number of machines on which it runs". In 1987, SCO ported Xenix to the 386 processor. Microsoft used Xenix on Sun workstations and VAX minicomputers extensively within the company as late as 1992.

Microsoft Excel was first released in 1985, for the Macintosh rather than the IBM PC. The same year the combination of the Mac, Apple's LaserWriter printer, and Mac-specific software like Boston Software's MacPublisher and Aldus PageMaker enabled users to design, preview, and print page layouts complete with text and graphics, an activity that became known as desktop publishing.

The GNU Manifesto was also published in 1985; the first released version of GCC, able to compile itself, came two years later, in 1987.

In 1986-1989 a series of computer viruses for PC DOS made headlines. One of the first mass viruses was a boot virus called Brain, created in 1986 by the Farooq Alvi brothers in Lahore, Pakistan, reportedly to deter piracy of the software they had written.

In 1987, the US National Science Foundation started NSFnet, precursor to part of today's Internet.

The same year, 1987, Perl was released by Larry Wall. In 1988 Perl 2 was released.

Steve Jobs was ousted from Apple and formed a new company, NeXT Computer, with a dozen former Apple employees. The NeXT was the first affordable workstation with over a megaflop of computing power. It was launched in 1988, and the smaller NeXTstation followed in 1990. It was a NeXT cube that was used to develop the World Wide Web at CERN. NeXT machines were also instrumental in creating complex modern GUI interfaces and launching object-oriented programming into the mainstream...

In 1989 the FSF introduced the General Public License (GPL), also known as "copyleft". Stallman redefined the word "free" in software to mean "GPL-compatible". In 1990, as the president of the League for Programming Freedom (an organization that fights software patents), Stallman was given a $240,000 fellowship by the John D. and Catherine T. MacArthur Foundation.


1990's

Microsoft Windows 3.0, which began to approach the Macintosh operating system in both performance and feature set, was released in May 1990 and was a less expensive alternative to the Macintosh platform.

4.3BSD-Reno came in early 1990. It was an interim release during the early development of 4.4BSD, and its use was considered a "gamble", hence the naming after the gambling center of Reno, Nevada. This release was explicitly moving towards POSIX compliance. Among the new features was an NFS implementation from the University of Guelph. In August 2006, Information Week magazine rated 4.3BSD as the "Greatest Software Ever Written". They commented: "BSD 4.3 represents the single biggest theoretical undergirder of the Internet."

On December 25, 1990, the first successful communication between a Hypertext Transfer Protocol (HTTP) client and server via the Internet was accomplished at CERN. It was running on a NeXT:

" Mike Sendall buys a NeXT cube for evaluation, and gives it to Tim [Berners-Lee]. Tim's prototype implementation on NeXTStep is made in the space of a few months, thanks to the qualities of the NeXTStep software development system. This prototype offers WYSIWYG browsing/authoring! Current Web browsers used in "surfing the Internet" are mere passive windows, depriving the user of the possibility to contribute. During some sessions in the CERN cafeteria, Tim and I try to find a catching name for the system. I was determined that the name should not yet again be taken from Greek mythology. Tim proposes "World-Wide Web". I like this very much, except that it is difficult to pronounce in French..." by Robert Cailliau, 2 November 1995.[22]

In 1991 Linux was launched. The USSR was dissolved the same year, which led to an influx of Russian programmers (as well as programmers from Eastern European countries) into the USA.

The first website was online on 6 August 1991:

"Info.cern.ch was the address of the world's first-ever web site and web server, running on a NeXT computer at CERN. The first web page address was http://info.cern.ch/hypertext/WWW/TheProject.html, which centred on information regarding the WWW project. Visitors could learn more about hypertext, technical details for creating their own webpage, and even an explanation on how to search the Web for information. There are no screenshots of this original page and, in any case, changes were made daily to the information available on the page as the WWW project developed. You may find a later copy (1992) on the World Wide Web Consortium website." -CERN

BSDi, the company formed to commercialize the BSD Unix system, found itself in legal trouble with AT&T's Unix System Laboratories (USL) subsidiary, then the owner of the System V copyright and the Unix trademark. The USL v. BSDi lawsuit was filed in 1992 and led to an injunction on the distribution of Net/2 until the validity of USL's copyright claims on the source could be determined. That launched Linux into the mainstream.

FreeBSD development began in 1993 with a quickly growing, unofficial patchkit maintained by users of the 386BSD operating system. This patchkit forked from 386BSD and grew into an operating system taken from U.C. Berkeley's 4.3BSD-Lite (Net/2) tape with many 386BSD components and code from the Free Software Foundation.

In April 1993 CERN released the Web technology into the public domain.

In 1994 the first official Linux kernel, version 1.0, was released. Linux already had about 500,000 users, and the Unix renaissance gathered pace.

The same year, 1994, Microsoft incorporated Visual Basic for Applications into Excel, creating a way to knock out the competition to Microsoft Office.

In February 1995, ISO accepts the 1995 revision of the Ada language. Called Ada 95, it includes OOP features and support for real-time systems.

In 1995 TCP/IP connectivity in the USA became mainstream, and the Internet boom (aka the dot-com boom) hit the USA. Red Hat was formed by a merger with ACC, with Robert Young of ACC (a co-founder of Linux Journal) as CEO.

In 1996 the first computer monitoring systems, such as Tivoli and OpenView, became established players.

The first ANSI/ISO C++ standard was released in 1998.

In 1996 Java 1.0 was released. Although a weak and primitive programming language if we consider its design (it was originally intended for embedded systems), it proved to be a durable and successful successor to Cobol. Sun Microsystems proved to be a capable marketing machine, but the focus on Java led to the deterioration of Solaris' position and the partial neglect of other projects such as Solaris on x86 and Tcl. Microsoft launched a successful derivative of Java, called C#, in December 2002.

In 1998 outsourcing, which in ten years would destroy the US programming industry, became fashionable, fueled by the financial industry's attempts to exploit the Internet boom for quick profits.

In 1999 a craze connected with the so-called Millennium bug (Y2K) hit the USA. It demonstrated the lasting intellectual deterioration of some key US political figures, including Chairman Greenspan, a cult-like figure at the time.

In March 1999 Al Gore revealed that "During my service in the United States Congress, I took the initiative in creating the internet," which was partially true.

This decade ended with the dot-com bust of 2000. See Nikolai Bezroukov, Portraits of Open Source Pioneers, Ch. 4: Grand Replicator aka Benevolent Dictator (A Slightly Skeptical View on Linus Torvalds).

Notes



Old News ;-)

[Nov 04, 2013] RIP Bill Lowe: Father of the IBM PC no longer reading drive C, by Iain Thomson

October 29, 2013 | The Register

Obit: William (Bill) C. Lowe, the IBM manager who broke through Big Blue's corporate structure to build its first personal computer (and inadvertently made Microsoft the industry's powerhouse), has died at the age of 72 after a heart attack.

Lowe joined IBM in 1962 and swiftly rose through the ranks to become lab director at the company's Boca Raton base in Florida. But in 1979 he was given what was, at the time, a seemingly impossible task – building a working personal computer in a year.

Big Blue was the computing company in the 1950s and 60s, but it dealt in big-iron systems. In the 1970s companies such as Altair, Apple and others showed there was a booming market for small computers and IBM felt it had to get in the game, and quickly.

But that was a problem. IBM's corporate culture didn't do things fast – decisions were carefully scrutinized and design teams worked for years to develop their own hardware that was as good as they could make it internally. To build a PC in a year would be impossible, unless the company was willing to buy in hardware and software from third-parties.

Moving pieces in Project Chess

Lowe convinced the IBM board that this particular strategy was the only way to go, and thus set up Project Chess: a team of a dozen engineers who would design and build the first IBM PC. Getting off-the-shelf components to power the system wasn't too big an issue, but getting the software to run it was, and Lowe and his crew went to two companies to get it: Digital Research and Microsoft.

At the time Gary Kildall's firm Digital Research was the biggest operating-system vendor in the nascent PC market and its CP/M software was popular and flexible. Microsoft was then one of the biggest suppliers of BASIC interpreters and other software, so IBM sent out a purchasing squad to get the code it needed.

The team met Gates first – it helped that his mother was on the board of the non-profit United Way of America, as was John Opel, chairman of IBM, so she put in a good word about her son. But before discussing the project, IBM asked Gates and his team to sign one of its legendary non-disclosure agreements (NDA), which gave Big Blue full access to Microsoft's business and locked down the small software house from discussing anything about the meetings.

NDA stands for Not Doing Anything

Bill had no problems signing the NDA and discussed the situation with Lowe and others before confirming Microsoft could supply their programming language needs, although he recommended they speak to Kildall to get CP/M as an operating system.

When the IBM suits arrived at Kildall's house, he was away at the time, indulging in his passion for flying, or so Silicon Valley history has it. His wife, also a director at Digital, answered the door instead, took one look at the NDA and called the firm's lawyer. That day's meeting between Big Blue and Digital, which Kildall turned up to after his flight, failed to produce a deal.

Bill Lowe in 2007. Photo credit: Marcin Wichary

This left IBM in something of a quandary, so they went back to Gates and asked if he had any ideas. Seeing an opportunity, Gates said Microsoft could supply IBM with the operating systems it needed. The only problem was Microsoft didn't have a working operating system, so it went to Seattle Computer Products, which had just written one called QDOS (Quick and Dirty Operating System), bought it for $50,000 and renamed it MS-DOS (but IBM branded it PC DOS).

Lowe moved over to the position of general manager of IBM's Rochester plant in 1980, so it was his successor Don Estridge who launched the IBM Personal Computer 5150 in 1981. It soon became clear that Big Blue had a hit on its hands, with the computer selling six times the forecast figure in its first year. But Lowe and his team had made two crucial mistakes.

Where Big Blue blew it

Firstly, because it used third-party components, IBM didn't have control over the design. The only part of the computer IBM had copyright control of was the BIOS ROM chip, and before long Compaq had figured a way to reverse-engineer it so that it could sell IBM-compatible systems for less than Big Blue was charging.

Secondly, IBM made the mistake of letting Microsoft sell its operating system to other manufacturers, rather than reserving it exclusively for IBM alone. Gates cheerfully sold his OS to the PC cloners and it became the de facto standard for the industry.

Lowe came back to the PC business in 1985 after Estridge's death in an air crash. In an effort to regain control of the market, in 1987 IBM introduced the PS/2 computer, which rewrote the architecture for its PCs, and a new operating system called OS/2. Unfortunately for IBM, buyers weren't sold on changing out their existing PCs to the relatively pricey PS/2 architecture and OS/2 was being written in partnership with Microsoft – which understandably wasn't putting too much effort into coding a rival product.

Neither the new computer nor the operating system took off and IBM was increasingly relegated to also-ran status among PC purchasers. Lowe was partially blamed for the situation, despite his enormous achievement in getting IBM moving, and left the company in 1988 in search of pastures new at Xerox.

He is survived by his wife Cristina, four children, and 10 grandchildren. ®

Computer Visionary Who Invented the Mouse By JOHN MARKOFF

The mouse that roared. See also Doug Engelbart Biography - Doug Engelbart Institute

July 3, 2013 | NYTimes.com

Douglas C. Engelbart, 1925-2013

Douglas C. Engelbart was 25, just engaged to be married and thinking about his future when he had an epiphany in 1950 that would change the world.

Clips of Douglas C. Engelbart's 1968 demonstration of a networked computing system, which included a mouse, text editing, video conferencing, hypertext and windowing.

He had a good job working at a government aerospace laboratory in California, but he wanted to do something more with his life, something of value that might last, even outlive him. Then it came to him. In a single stroke he had what might be safely called a complete vision of the information age.

The epiphany spoke to him of technology's potential to expand human intelligence, and from it he spun out a career that indeed had lasting impact. It led to a host of inventions that became the basis for the Internet and the modern personal computer.

In later years, one of those inventions was given a warmhearted name, evoking a small, furry creature given to scurrying across flat surfaces: the computer mouse.

Dr. Engelbart died on Tuesday at 88 at his home in Atherton, Calif. His wife, Karen O'Leary Engelbart, said the cause was kidney failure.

Computing was in its infancy when Dr. Engelbart entered the field. Computers were ungainly room-size calculating machines that could be used by only one person at a time. Someone would feed them information in stacks of punched cards and then wait hours for a printout of answers. Interactive computing was a thing of the future, or in science fiction. But it was germinating in Dr. Engelbart's restless mind.

In his epiphany, he saw himself sitting in front of a large computer screen full of different symbols - an image most likely derived from his work on radar consoles while in the Navy after World War II. The screen, he thought, would serve as a display for a workstation that would organize all the information and communications for a given project.

It was his great insight that progress in science and engineering could be greatly accelerated if researchers, working in small groups, shared computing power. He called the approach "bootstrapping" and believed it would raise what he called their "collective I.Q."

A decade later, during the Vietnam War, he established an experimental research group at Stanford Research Institute (later renamed SRI and then SRI International). The unit, the Augmentation Research Center, known as ARC, had the financial backing of the Air Force, NASA and the Advanced Research Projects Agency, an arm of the Defense Department. Even so, in the main, computing industry professionals regarded Dr. Engelbart as a quixotic outsider.

In December 1968, however, he set the computing world on fire with a remarkable demonstration before more than a thousand of the world's leading computer scientists at the Fall Joint Computer Conference in San Francisco, one of a series of national conferences in the computer field that had been held since the early 1950s. Dr. Engelbart was developing a raft of revolutionary interactive computer technologies and chose the conference as the proper moment to unveil them.

For the event, he sat on stage in front of a mouse, a keyboard and other controls and projected the computer display onto a 22-foot-high video screen behind him. In little more than an hour, he showed how a networked, interactive computing system would allow information to be shared rapidly among collaborating scientists. He demonstrated how a mouse, which he had invented just four years earlier, could be used to control a computer. He demonstrated text editing, video conferencing, hypertext and windowing.

In contrast to the mainframes then in use, a computerized system Dr. Engelbart created, called the oNLine System, or NLS, allowed researchers to share information seamlessly and to create and retrieve documents in the form of a structured electronic library.

The conference attendees were awe-struck. In one presentation, Dr. Engelbart demonstrated the power and the potential of the computer in the information age. The technology would eventually be refined at Xerox's Palo Alto Research Center and at the Stanford Artificial Intelligence Laboratory. Apple and Microsoft would transform it for commercial use in the 1980s and change the course of modern life.

Judy New Zealand

We stand on the shoulders of giants, as Isaac Newton once said. R.I.P. Dr Engelbart. I never knew you, but I have admired you for years, ever since discovering your story and your famous demonstration while doing research for an IT article. Ideas and achievements will always rate higher with me than the wealth other people turn them into. I never knew "The Mother of All Demos" https://www.youtube.com/watch?v=yJDv-zdhzMY was available on YouTube until reading a comment on this article, and am delighted to find that as well as being brilliant and creative you were also good looking. What a wonderful young man you must have been. My sympathy to your children, and hopes that creativity reigns amongst your grandchildren in whatever form it may take. You definitely were one of Isaac Newton's giants, with millions standing on your shoulders.

Vic Kley Berkeley, CA

I knew Doug and liked him for his wisdom and integrity. He went well beyond pointing devices like the mouse or specific cursor shapes like his "bug".

This article makes a good effort to shift the emphasis from the mouse to Engelbart himself and his realized visions.

He is one of the few visionaries who saw the vision he shared with Vannevar Bush, Turing and others actually come to pass in his own time.

Pointing devices and cursors, as Doug would certainly tell you if he could, were in use before WWII. Trackballs and joysticks, even digital tablets, all came before the mouse. The mouse was adopted quite simply because it was cheap and profitable. That is high praise for Doug, for such properties are the marks of a successful technology.

Doug, you will be missed.

Tom Foth Trumbull

Back in the 1980's I was writing for Softalk, a PC magazine, and wrote a cover article on these "new" things called "mice." Through this and later some mutual friends, I got a chance to meet and spend time with Dr. Engelbart.

He was always frustrated because the other part of his human interface, the chorded keyboard (http://en.wikipedia.org/wiki/Chorded_keyboard), never received the notoriety and use that the mouse did, because it took practice to use.

I was in a lecture he was giving and he showed himself on snow skis at the top of a mountain peak. He said "Anything worth doing, anything that brings value, be it driving a car or skiing down a mountain, is worth doing well." His point was we need to master a tool or a technology to fully exploit it... and not be left to a tool or technology mastering us and our becoming slaves to it.

So much of today's human interface to computers and the ways computers organize information is based on his passion and vision.

I remember having supper with him once around 1985... and I was so humbled to be with him. His intellect and passion were off the chart. Even though I was hardly his intellectual equal, he engaged me in conversation and treated me as his equal. He was as humble as he was brilliant.

We lost an incredible thinker, innovator, and visionary in his passing. My condolences to his family and friends.

hadarmen NYC

I learned about his great work, which he demonstrated in 1968 in what is now accepted as the mother of all demonstrations.

Engelbart foresaw what we are living with our computers today; everything he explained in his presentation is solid reality now. How could one man's mind imagine things with such accuracy and precision? I watched his demonstrations on the internet, and I can say one thing: if I had been in the audience, I most likely would not have understood what he was creating. But today, in 2013, when I listen to Engelbart, I am shocked and awestruck by what he is talking about. He had already created today's computer world in his mind and crystallized it.

Douglas Engelbart is the mind behind today's computer world. What he envisioned came true, and more: the entire world is running on it.

His conceptual idea was worth a Nobel Prize.

R.I.P. Mr. Engelbart; this is how much one man can do to change the world.

It is not only the mouse; watch the 1968 presentation.

N West Coast

Thank you, Dr. Engelbart.

I fell in love with the Mac over the PC 20 years ago while in college, because of the mouse and its meteor-like trailing cursor.

This exemplifies his and Mr. Jobs' genuine vision - that computers are designed to serve us. They shouldn't make us all secretaries, doing mundane data entry with 10 fingers.

Instead, with the mouse, we all become conductors of our own symphony full of GUIs. It's wonderful that Mr. Jobs thought that "a single button was appropriate," like holding onto a conductor's baton.

To this day I have yet to let go of the mouse. I refuse to move to touch pads for the same reasons I dislike the keyboard. Eventually, I will also refuse to use voice-activated commands. We cannot think and move our tongue at the same time, and I would rather devote that time to "look, listen, and feel."

In medicine we had someone similar to Dr. Engelbart, Dr. Netter. They represent a human dimension that is irreplaceable.

We should hope to honor their legacy by providing a nurturing culture in society for the next 100 years, even if only one more Engelbart or Netter comes about.

http://en.wikipedia.org/wiki/Frank_H._Netter

NYSlacker Upstate/Downstate

DO NOT FOLD, SPINDLE OR MUTILATE...

I was privileged to watch the entire history of modern computing when I enrolled at SUNY Binghamton's School of Advanced Technology in 1980.

My wife and I started programming with punch cards. UGH!!! Card decks of hundreds of cards, each card had to be typed exactly, then put thru a card reader and executed. Mistakes had to be corrected by retyping the card, then put thru the reader again (waiting on line for the reader to be free)...

Finally SAT got a time share system. We could work remotely thru a SLOW modem. One program I wrote (in APL) took 8 hours to compile and execute. Better than cards, but not by much.

On my last day at SAT I went to say goodbye to one professor and on his desk was one of the first (arguably with a serial number under 10) IBM PCs. 80x25 char screen, screen colors were green or amber, no graphics, no mouse, clunky but compelling. Then, in 1982, we left for Silicon Valley.

The history lesson continued as the PC boom started and gained momentum. Companies that relied on mainframes began to use PCs, at least for individual workers. The last part of those lessons was the Lisa, Apple's first foray into what we now recognize as modern computing. This was all of Doug Engelbart's ideas in commercialized form. A beautiful machine, with color, graphics, all the bells and whistles we take for granted now.

He was a true Silicon Valley genius who did in fact make everyone's life better than it had been... RIP

calhouri Costa Rica

"The group disbanded in the 1970s, and SRI sold the NLS system in 1977 to a company called Tymshare. Dr. Engelbart worked there in relative obscurity for more than a decade until his contributions became more widely recognized by the computer industry."

As impressive as were the achievements recounted in this obit, I find the above quoted section the most touching part of the memorial. A man who grew up on a farm in Oregon and despite a life of accomplishment most of us could only dream of somehow managed to retain a modesty and self-confidence that precluded any necessity of blowing his own horn. Would that the modern tech-stars mentioned in the piece (and others who need no naming) shared these endearing traits.

Guerrino USA

NYT Pick

I had the privilege to know Doug well, during my first 7 years as Logitech's CEO. He and I met regularly when Doug had his office at our Fremont facility. Our conversations were fascinating and incredibly instructive for me. As executives, we tend to focus on the short/medium term. Long term for us may mean 3-5 years. Doug's vision of bootstrapping Collective IQ, his passion in life, was a 50 year+ vision. His belief that technology is key to mankind's ability to solve difficult problems collectively has the transformative power that few others share. In every conversation with him, the wide ranging strength of his vision energized and amazed me. It made me a better person, a better thinker and a better executive.

In spite of all the awards, the Turing Prize, the Medal of Honor, Doug was one of the most under-recognized geniuses of our times. And he stayed humble, curious and accessible to ideas and people all his life.

He left many lessons for us. As a company, we will be well served by understanding how we can dramatically enhance our effectiveness by working together, tapping into each other's IQ and skills to achieve much more than the sum of our individual parts. It's Doug's legacy, and we will strive to honor it.

Long live Doug, not just in our memories, but in our embodying his vision as a group of individuals.

Guerrino De Luca - chairman Logitech

kat New England

Here's the Xerox Star, commercially available in 1981 (first link), via a 1982 paper (second link), complete with a photo of the graphical interface/display, the mouse, and a what-you-see-is-what-you-get editor:

http://www.guidebookgallery.org/articles/designingthestaruserinterface/p...

http://www.guidebookgallery.org/articles/designingthestaruserinterface

DK Adams Castine, ME

Mr. Engelbart deserves each and all of these accolades. The author of this article, however, could use a bit more education in early computing. Note the comment "Computers were ungainly room-size calculating machines that could be used by only one person at a time." By 1960 - eight years after delivery of the first UNIVAC I - machines were already capable of full multiprogramming, multiuser, timesharing and networking, able to support hundreds of users. UNIVAC introduced its 400 series of commercial real-time computers, developed out of fire-control computers for the US Navy. Beginning in 1962, I worked with these machines and the UNIVAC 1100 series, developing some of the first industrial real-time uses; they were also used in real-time process control (Westinghouse Prodac); Western Union switched their telegrams through them; and Eastern Airlines used its 490 systems for the first efficient reservation system. All of these were running in the early '60s - and all developed in Minnesota and Pennsylvania, not Silicon Valley. Not everything under the sun happens under the California sun.

Drew Levitt, Berlin

I was in the audience when Engelbart received an honorary PhD from Yale a couple years ago. Martin Scorsese was another of the honorees, and was initially getting all the attention. But when the speaker read Engelbart's citation and the magnitude of his contributions sank in, the audience gave him a very long standing ovation.

Concerned American USA

Doug Englebart's contributions were outstanding. This was a great life to celebrate, indeed.

I wonder if any of it would have happened so quickly without government research funding. Invented in 1964 and productized in 1980: that timeline would not show the quarterly or even annual return modern corporations require, and it would have run through the bulk of a patent's duration.

Game-changing, high-risk research is not consonant with the modern corporate model. It may never be with any real corporate model. This is OK, but government high-risk research programs have been cut for years - is it any wonder much modern innovation is modest?

Even Xerox PARC, whose parent Xerox had a strong hold on the photocopy business in those days, had vestiges of government-sponsored research.

Miguel United States

Not everything has stagnated. Thanks to government funding, great advances have been made in surveillance technology.

SJFine Wilton, CT

The story I always heard was that Jobs and Woz saw the potential of the mouse when the Xerox PARC team in charge of it showed them all the hard work they were doing with it. As I understood it, the PARC team did not expect Apple to essentially (if legally) steal it out from under them when they showed them the mouse. Well, that's the story that is told by many people and was shown in the 1999 film, "Pirates of Silicon Valley" (a really fun film with great performances by Noah Wyle as Steve Jobs and Anthony Michael Hall as Bill Gates).

However, I just discovered that version of the story may be at least partially apocryphal as well. This New Yorker article, entitled "Creation Myth" (http://www.newyorker.com/reporting/2011/05/16/110516fa_fact_gladwell), has the tag line, "The mouse was conceived by the computer scientist Douglas Engelbart, developed by Xerox PARC, and made marketable by Apple." It goes on to describe a less commonly known version of the mouse creation story.

I have to admit that until this morning, I had no idea who Douglas C. Engelbart was. (I would bet that a majority of "techie" people didn't know of him either, but that I'm among a small minority of people who will admit it.) I look forward to learning more about this brilliant man. It's interesting to observe the crafting of history that is taking place with the deaths of Jobs and now Engelbart. How many other "historical facts" have few or none of us ever known?

bonongo Ukiah, CA

NYT Pick

I first saw a clip of Doug Engelbart's remarkable 1968 demonstration of the computer mouse in a 1992 documentary, "The Machine that Changed the World," broadcast on PBS. The multi-part series included a fascinating history of the computer, including pioneers like Charles Babbage, Alan Turing -- and Doug Engelbart. The documentary noted that, for all the revolutionary innovations he was responsible for, even by the early 1990s Engelbart had largely been forgotten in the forward rush of the computer revolution. At one point it showed him walking in anonymity on the Stanford campus.

And yet as I write this, I use the computer mouse to move the cursor around on my screen -- much as Dr. Engelbart did nearly a half-century ago. I've never forgotten the recording of that 1968 demonstration, and how advanced his equipment looks, even today, compared to the giant mainframes that most people associated with computers in those days. So thanks, Dr. Engelbart -- your creation has now outlived its creator.

Fred Wilf Philadelphia

>> "Mr. Bates said the name was a logical extension of the term then used for the cursor on a screen: CAT. Mr. Bates did not remember what CAT stood for, but it seemed to all that the cursor was chasing their tailed desktop device.)"

Back in the day, all of the screens were CAThode ray tubes. By the time I started using them in the 1970s, we called them "CRTs" or "tubes" or "screens" or "terminals" depending on the particular technology (some were pretty dumb terminals limited to 80 characters across by 25 lines down, while others could do relatively sophisticated graphics), but I could see them being called "CATs" as well.

RIP, Dr. Engelbart. Your work was groundbreaking. You should be as well-known as others who followed the path that you set.

LongView San Francisco Bay Area

Most unfortunate that his vision did not acknowledge - (a) the consolidation of money and power that computer technology has obtained, (b) massive loss of 'hand and eye' based employment usurped by digital mechanization, driving down wages and employment prospects coincident with the decimation of the "middle class", (c) addiction of the "masses" to point and click rather than read and reflect, and (d) pretty much a wholesale usurpation of the centuries-validated method of learning and acquisition of knowledge - reading, reflection and contemplation, and writing. The future in plain sight.

Ichiro Furusato New Zealand

Actually, you are quite wrong about Doug. I was fortunate enough to be invited into his Bootstrap Institute in the late 1990s where I learned much about the history of computing and Doug's ideas.

Doug talked about how Robert Oppenheimer came to later regret being the "father" of the atomic bomb and that people who develop and work with computers bear a similar responsibility to society, i.e., that technologies are not neutral and that technologists must consider how their inventions may be used both to augment as well as potentially damage human society. Doug was simply in favour of focusing on augmentation, but fully acknowledged the hazards. It's billionaires like Jobs and Zuckerberg who are happy to use others' inventions to make themselves extremely wealthy, but take no responsibility to society. Witness Apple's legacy on worker conditions and disposable products, and Facebook's on privacy.

It seems especially poignant that Doug's death follows closely after the world has learned that the surveillance state has come fully into being, more advanced and pervasive than any dystopian novelist could ever have imagined, and that computer-assisted drones are being used by governments to indiscriminately kill innocent people in faraway lands.

If you're interested in learning more about Doug rather than just spouting off in angry ignorance I might recommend the book "Bootstrapping: Douglas Engelbart, Coevolution, and the Origins of Personal Computing" by Thierry Bardini.

LongView San Francisco Bay Area

Robert Oppenheimer, being a world-class physicist, fully understood - despite the hand-wringing witnessed by President Truman - the magnitude of human death and infrastructure destruction that would result, long before the atomic bombs were detonated over Hiroshima and Nagasaki. To a mathematical mind of Oppenheimer's caliber, the calculations were literally simple and "back of the envelope". The disconnect of Engelbart, and many of his generation and kind, is that they did not have the foresight to understand that a socio-economy based on electronic digital computation would pauperize the human endeavor. Fortunately, several hundred years out - eight or ten human generations - their lack of vision will be but a dim footnote in the world as it will be: significant effects from human-caused climate change, effective depletion of the petroleum resource, and a world made by hand due to the biogeophysical constraints that control all biological species. Indeed, the future in plain sight. Earth abides.

Mei Lin Fung Palo Alto

Doug Engelbart liked to describe himself as a simple farm boy who dared to dream big.

He actually won a Reader's Digest prize ($5?) for an aphorism, paraphrasing: "My talent is my willingness to tolerate a high level of embarrassment to get to my dreams"....

It took a lot of tolerance for embarrassment to imagine hypertext, windows, remote video conferencing and more in 1968, when people were using punched cards to communicate with their computers.

He was part of an Authors at Google video in 2007 - you can watch it here : http://www.youtube.com/watch?v=xQx-tuW9A4Q

Engelbart dedicated his life from the age of 25 to helping people understand that technology was like fire: it could burn us up, or be harnessed for the highest good - for augmenting our humanity. Engelbart inspired generations to work towards this goal of augmenting our humanity.

He was one of our greatest global thinkers, who partnered with Vint Cerf and the other Arpanet pioneers funded by ARPA Program Manager Bob Kahn. These deeply thoughtful people have been instrumental in making the Internet something that everyone could benefit from.

His thesis was that human systems and tool systems have to co-evolve - IBM produced the Co-Evolution Symposium where he was the keynote speaker.

http://www.almaden.ibm.com/coevolution/pdf/engelbart_paper.pdf

Friends of Engelbart are gathering in Palo Alto today to remember him.

kat New England

For those who claim the PC as we know it would have "died" without Jobs, here's the Xerox Star, commercially available in 1981 (first link), via a 1982 paper (second link), complete with a photo of the graphical interface/display, the mouse, and a what-you-see-is-what-you-get editor:

GUIdebook Articles "Designing the Star User Interface" Picture

GUIdebook Articles "Designing the Star User Interface"

martinn palo alto

Shreekar, Parts of Xerox were already commercializing the ideas from the Computer Science Lab. They would certainly not have "died" without Apple. Jobs was a ripoff artist who never gave credit to the people whose ideas he took - Engelbart's, Taylor's, Lampson's, Thacker's.

DeVaughn Silicon Valley

Met Doug Engelbart at SRI, during my first real job right out of college. In SRI's Information Sciences and Engineering Department, he was crafting the mouse. A young PR guy, I could never really explain what it was or did, but I assumed it was important because Doug was working on it and he was, well, important. A good guy who was never celebrated in the Valley with the flourish he deserved.

Dennis Johns Island, SC

Wasn't much celebrated at SRI for a long time either. The DARPA contract ended and -- typical of SRI at the time -- since there was no funding, his lab was closed and Xerox PARC made one of its better decisions and hired this gentle, quite terrific man.

nemecl Big Bear, CA

This was a great breakthrough.

I worked at NYU during the early seventies and I took computer science to get my master's in 1982. As students we were supposed to punch cards - I never managed to do it right. Fortunately, the computer powers-that-be (the computer was built from discrete transistors) allowed very limited teletype access for the chemistry lab where I worked.

A mouse? It was 1971, stupid.

Amazing how much can change during a lifetime of one person (I am nine years younger than Dr. Engelbart).

RIP, Dr. Engelbart.

Swannie Honolulu, HI

Ah, yes...punch cards...one wrong key-stroke and you start all over again from the beginning. Bah-Humbug!

Thomas Zaslavsky Binghamton, N.Y.

PunchED cards make great note papers and good bookmarks. A use for all those typos.

kat New England

Re: "Eventually the technology Dr. Engelbart demonstrated would be refined at Xerox's Palo Alto Research Center and at the Stanford Artificial Intelligence Laboratory and then be transformed for commercial use by Apple and Microsoft in the 1980s."

I think you're forgetting the Xerox Star, released with a mouse in 1981, well ahead of Apple's and Microsoft's systems.

Robert Vancouver, Canada

One of my technical manuscripts was composed on a Xerox Star prototype in March 1980 in Webster, NY, sent by 'ether-net' to Stamford, Conn. for approval, and published in SCIENCE 5 months later.

Two years later most of my colleagues lost their secretarial services and typed their own manuscripts.

A. M. Garrett Lafayette, La.

I almost wish the headline read, "Douglas C. Engelbart, a graduate of public colleges and a government employee, dies at 88"

Sad to see how we are dismantling the great public university system that fostered so many American dreams and gave the world such genius. I wonder how many kids who come from a farm today can afford college. I wonder how many of the government workers we are furloughing to cut an already shrinking deficit are working on just this kind of ground-breaking research.

Mei Lin Fung Palo Alto

Doug was always proud of his alma mater, Oregon State University - he spoke about the importance of the Land Grant System that gave a boy who grew up in the Depression, who did not have more than one pair of shoes to walk to school in, the chance of a university education. He earned his PhD at the University of California at Berkeley. The ground-breaking research was made possible by the great investment in education made by earlier generations. What are we doing for the generations after us?

Thomas Zaslavsky Binghamton, N.Y.

It doesn't matter. Our masters are making out very well and they're making sure things won't change in that respect. The research can go to the Moon for all they care.

Anonymoose & Squirrel Pasadena

That's a very interesting and perceptive comment from Lafayette, where most things are organic and growing.

I spent a couple of years in New Orleans recently, and summers in Lake Charles when I was an early teen, in the 1970s. It's so much hotter now, it seems; that could be the changes in land use, what with fewer trees and swamps. Two summers ago the drought was really bad. I stopped several times in Lafayette, and it was impossible not to notice the way people took things in a stride that was so much more tolerable than in Los Angeles.

The cuts to your university system and health care were the things that made me leave (besides the heat). A distant relative of mine had been a state supreme court justice under Huey Long, but I never did reconnect with that end of the family. That, and the money from the spill has never really been allocated to people or to remediation, while there were plenty of corporations getting in on the act. I think Gov. Long had a hand in this, plus the basic insular nature of the state.

I think all Americans should visit Louisiana, not just New Orleans (and not all at once); the folks there are fun, generous, and intelligent. I never once felt threatened in Louisiana, whereas Los Angeles can be a dangerous place.

KenJr New Mexico

At Fairchild, Jean Hoerni's planar process, mentioned in the next-to-last paragraph, gave birth in that same timeframe to the integrated circuit, first demonstrated at Fairchild in 1960 by Robert Noyce and Jay Last. To say the planar process "improved the electrical output of transistors and made them cheaper to manufacture and available to a mass market" gives short shrift in the extreme to both Hoerni's planar process and Engelbart's "Scaling in Microelectronics" ideas.

Mei Lin Fung Palo Alto

Doug would be the first to say that he was one of many - his aspiration was tapping the collective wisdom, and he often spoke about the importance of generating collective wisdom. Prof. Tom Malone, founder of the Center for Collective Intelligence at MIT, met Dr. Engelbart in California. Tom Malone and Prof. Hiroshi Ishii came to celebrate Engelbart's legacy at the Program for the Future, held by the Tech Museum in 2008 to celebrate the 40th anniversary of the Mother of All Demos. An interview conducted with Engelbart was downloaded 50,000 times in a week - people in 2003/4 were amazed that they had never heard of him, and asked if this was an April Fools' Day joke. Engelbart was one of a kind; he believed each of us had untapped potential, and he devoted his life to making it possible for others to have the opportunities he had to build a better future.

[Jun 24, 2013] NSA Releases Secret Pre-History of Computers - Slashdot

Posted by samzenpus
from the please-forget-about-that-other-stuff dept.

An anonymous reader writes "The National Security Agency has declassified an eye-opening pre-history of computers used for code-breaking between the 1930s and 1960s. The 344-page report, entitled It Wasn't All Magic: The Early Struggle to Automate Cryptanalysis (pdf), is available on the Government Attic web site. Government Attic has also just posted a somewhat less declassified NSA compendium from 1993: A Collection of Writings on Traffic Analysis. (pdf)"

ibwolf

Re:Pay no attention (Score:5, Insightful)

Pay no attention to the man in the Russian airport.

No, they want you to pay attention to him, to this, to ANYTHING except what they (the US government and the NSA in particular) are actually doing with regard to your personal liberties. That is what they are trying to distract you from thinking about.

Anonymous Coward

First pwned! (Score:5, Funny)

Am I crazy for opening a PDF from the NSA?

egcagrac0

Re:First pwned! (Score:5, Informative)

Not if you did it in a VM running a LiveCD...

Anonymous Coward

More Secret History (Score:2, Informative)

How about Bush's blackmail scheme, where he used the NSA to try to obtain material to blackmail UN ambassadors into voting for invading Iraq? Most of the media treated that like it was secret...

stewsters

PDFS (Score:5, Funny)

Hey you guys who are talking about Snowden, download this PDF with some cool additional code! Don't worry about it. I promise we didn't buy exploits from Adobe or Microsoft!

gl4ss

Re:PDFS (Score:4, Interesting)

Hey you guys who are talking about Snowden, download this PDF with some cool additional code! Don't worry about it. I promise we didn't buy exploits from Adobe or Microsoft!

Why buy what you can get for free?

If you don't use up the budget you don't get more next year. Especially if you're working at an agency that can't be measured for efficiency in any way.

Anonymous Coward

The Puzzle Palace (Score:1)

There's a relatively old book about the NSA and SIGINT, written by a journalist who studied publicly available materials using Tom Clancy's MO, that you can buy at Barnes and Noble or Amazon.com. I remember reading it and thinking it was more like "what it's like to work at the NSA" than an exposé, though. Still, IIRC the author and publisher had to square off with the NSA to get it in print.

flyingfsck

Re:Broken Link (Score:4, Funny)

Got it for you. It is called stuxnet-prehistory.pdf.exe

sbrown7792
Re:The site got suspended...

Google webcache has this [googleusercontent.com]

[Sep 16, 2012] Remembrance of Computer Disks Past by Michael Malone

WSJ.com

This month marks the 60th anniversary of computer-disk memory. Don't feel bad if you missed the big celebration-there wasn't one. Computer memory is the forgotten story of the electronics revolution. Yet it may be the most remarkable of all.

In September 1952, IBM opened a facility in San Jose, Calif.-a critical moment in the story of Silicon Valley. The company set to work developing a new kind of magnetic memory for its planned Model 305 Ramac (Random Access Method of Accounting and Control), the world's first "supercomputer."

Like the generations of mainframe computers before it, Ramac was slated to feature ...

[Jun 14, 2012] Alan Turing and his machines - by the men who knew him best

14 June 2012

It is fitting that the greatest code-breaker of World War Two remains a riddle a hundred years after his birth. Alan Turing, the brilliant, maverick mathematician, widely considered to be the father of computer science and artificial intelligence, invented an electromagnetic machine called the 'bombe' which formed the basis for deciphering Germany's Enigma codes.

The man himself has rather eluded definition: painted (too easily) as a nutty professor with a squeaky voice; as a quirky, haphazard character with a sloppy appearance by his mother and schoolmasters; by colleagues as a gruff, socially awkward man; and by his friends as an open-hearted, generous and gentle soul.

The crucial contribution Turing made at Bletchley Park, one that has been credited with shortening the war by two years and saving countless lives, did not become public knowledge until twenty years after his death. His mother, brother and friends did not know, until long after they'd mourned him, the extent of his heroism.

Despite his premature death aged 41, Turing was so prolific and ground-breaking that the Science Museum is dedicating an entire exhibition to what sprang from his mind. It will showcase his machines, his mathematics and his work on codes and morphogenesis, but will also tell the extraordinary story of his life.

"We're calling the exhibition Code-breaker because of Bletchley, but also because Turing broke the codes of science in his work and the codes of society through his homosexuality," says David Rooney, head curator at the Science Museum.

The State which Turing had fought to protect cruelly turned on him in 1952. He was found guilty of gross indecency for homosexual acts, avoiding prison by agreeing to a now unthinkable condition of probation: chemical castration. He took Stilboestrol, a pill containing female hormones, but was removed from his government work and felt himself to have been placed under observation. As the holder of State secrets who was, by 1950s attitudes, a sexual deviant, he was a dangerous outcast.

He was found dead on 7 June 1954, a few weeks before his 42nd birthday, after biting into an apple laced with cyanide. This 'Snow White' suicide is particularly resonant given his enjoyment of the 1937 Disney film of the fairy-tale. In Andrew Hodges' biography Alan Turing: The Enigma, he records Turing's fondness for singing the words from the scene where the witch drops an apple into a sulphurous cauldron: "Dip the apple in the brew/ Let the Sleeping Death seep through".

Fifty-eight years after his suicide, Turing is beginning to get the recognition he deserves. Nearly 35,000 people have signed a petition calling for his criminal record to be posthumously overturned. Another petition (so far 15,000+ names strong) calls for his face to be printed on the £10 note.

Insights into the man behind the machines:

Turing's nephew, Dermot Turing, 51, the son of his brother John, never met him. He was born after his uncle's death, so his impressions are drawn from his father's and stepsisters' stories.

Because my father was quite closely involved in tidying the pieces after Alan had committed suicide, we didn't talk about him at home much.

Frankly, they weren't particularly close as adults. With the business of the prosecution [for homosexuality], which was only a couple of years prior to the suicide, it is hardly surprising that my father found the whole thing pretty distressing. He felt that he'd been left to 'deal' with their mother.

I began to hear about him in the mid Seventies when information about Bletchley became publicly known. During that period there was a lot of talk about it, obviously. I remember being glued to a BBC report revealing the weirder things that took place during the war, of which Enigma was just one. Also, because his mother died in 1976, my father was suddenly able to talk about him.

Alan had written some "scarifying" things about his mother in notebooks for Doctor Greenbaum [a Jungian psychotherapist]. My father felt it was appropriate to conceal the toxic material about granny, so he destroyed all the famous notebooks.

I suspect people might be upset to think Alan hated his mother. It's more complicated than that, of course. I'm not in a position to judge, but my half sisters, who are, would hotly deny that Alan hated her.

That begs the question: why did he write those terrible things? I'm speculating, and anybody else's judgement on this is as good as mine, but I think if you put together the fact that in 1950s England, when social attitudes are very, very different from what they are now, having to explain to his mother (who was essentially an Edwardian lady) what that conviction for homosexuality meant, must have been the toughest thing he'd ever had to do.

I don't think it's possible to comprehend the enormous pressure he was under. It has taken me quite by surprise that a vociferous group of people still think that it's not imaginable that he could have committed suicide.

These people don't feel it was in his nature to do it and believe in evidence that points away from it. The fact that he'd bought himself two pairs of socks the day before, or something. Frankly, I suspect Alan was a victim of mood swings and we probably won't know what it was that tipped him over the edge at that last moment.

That my father, whose initial reaction was that Alan couldn't possibly have committed suicide, found himself persuaded that he must be wrong on that score, is I think the most powerful evidence that he did.

To lots of people this remains an open question. The fact that he can excite such interest about the manner of his death nearly 60 years on is extraordinary. But this year should be about celebrating his achievements rather than reopening questions about his death.

Regarding his 1952 conviction [for homosexuality], I am still, putting it at its mildest, puzzled as to how the court concluded that it had power to push him to do that [take chemical castration]. There is an open question on that.

He was sentenced under the Criminal Justice Act 1948 which introduced the option of probation as an alternative to prison. This was a very new piece of legislation in 1952. How the judge concluded that you could attach conditions to probation so early in the life of this new sentencing power I don't know.

There's a double-sided view of Alan Turing. Talk to the people who worked with him and were his junior assistants and you get this very positive picture of somebody who took time and was approachable. You also get this same sense from people who knew Alan as children: my half sisters, the Greenbaum children, Professor Newman's sons.

If you talk to people who dealt with Alan either as superiors or in a non-technical social setting, and read the really quite acid writings by Alan about what was going on at Cambridge, you realise there was another facet of him: uncompromising, socially a bit awkward. He didn't go out of his way to charm people if it wasn't interesting enough for him to do so.

I can see traits of my father in that, too. Captain Jerry Roberts [a Bletchley Park veteran] said if you passed Alan in the corridor he would turn his eyes to the wall rather than say hello. He obviously wasn't that easy to deal with.

I'm probably not allowed to say things like that. I'm not trying to de-sanctify him but I think there's a tendency to paint him as completely ridiculous. You've got all these stories about weird things that he got up to. My granny's book [Alan. M Turing by Sara Turing] is full of them. Other people assume he's a mad mathematics professor character.

The people who knew him personally will tell you Alan was a bit chaotic - quite the opposite of a person who is good at processing. I suspect he'd often get bored and not finish projects. Having written the design spec for a universal computer, he wasn't particularly interested in its day-to-day application.

Mike Woodger, 89, was Alan Turing's first assistant at the National Physical Laboratory. They worked together on the Ace Pilot Computer.

I was 23 in 1946 when I first met Turing at the NPL. At that point Turing had nobody else to work for him. He was rather motherly towards me.

My initial impression of Turing was of a rather shy and retiring man. We first spoke because I got into difficulty over a puzzle I was trying to solve. Turing looked over my shoulder and said: "Why don't you try singularity?" I'd done a degree in mathematics and should have known what he meant, but didn't. He patiently explained it to me.

You know about his personal life, of course. But I didn't know that he was homosexual until after his death. I went to his home a few times and we got on very well.

He was respected at NPL, but I would not say he was revered as he is now. Not many people knew what he had done during the war. He had a reputation for being rather gruff. He didn't suffer fools gladly.

I went down with glandular fever almost as soon as I arrived at NPL and went off sick for six weeks. I returned in September and there was a charming note from Turing:

Dear Woodger, [He would never have called me Mike]

Unfortunately Wilkinson and I have both arranged to go on leave just at the moment you are coming back. I trust you can keep yourself occupied while we are gone. You could do:

1. Output

2. Try and help out in any measure doing Ace jobs

3. Read the folder

4. Read some good books

5. Relax

I hope you really are alright. It's a shame to have you come back and find the place deserted. It might be wise to have a relapse for a week.

Turing

He was a bit of a fingers and thumbs man. The ideas were brilliant but the execution suffered somewhat from his physical disabilities.

Turing didn't need to be meticulous. He was creative. He was always looking ahead.

He left NPL in 1947 but he returned for the launch of the first Pilot Ace in 1950. He told us how much better we had made it than if he had stayed.

John Turing, Alan's brother, wrote an account of him before he died. It is included as an Afterword in the recently republished Alan M. Turing: Centenary Edition by Sara Turing. Here's an extract:

One Easter holiday in Dinard, Alan spent all his time collecting seaweed and brewing it up in the cellar until at length he extracted a few drops of iodine which he carried back to the science master at Sherborne [the public school both brothers attended] in high triumph.

When later we were living in Guildford, he had a series of crazes. He tried to learn the violin, which was excruciating. Then he turned his attention to breeding those little red banana flies in test tubes, so that he could prove Mendel's theory at first hand. Unfortunately, they escaped and the house was full of banana flies for several days.

Oddest of all, in the heat of summer, he spent much of his time dressed as a private soldier allegedly drilling at Knightsbridge barracks, to what purpose nobody knew, but looking back on it now, I strongly suspect that drilling was not the object of the exercise at all. He was, as I have said, good at beating the system and, of course, the odder the things he did, the less one was likely to enquire into them.

My mother gives a true picture of Alan's generosity. Our family friend Hazel achieved her life's ambition of becoming a missionary with Alan's help. Alan gave his time and brains unstintingly to his friends, paid for the schooling of a boy whom he more or less adopted, spent hours choosing suitable presents for his relations and friends, without regard to expense, and was incredibly patient with and endearing to small children, with whom he would have interesting conversations about the nature of God and other daunting subjects.

Alan could not stand social chat or what he was pleased to call "vapid conversation". What he really liked was a thoroughly disputatious exchange of views. It was pretty tiring, really. You could take a safe bet that if you ventured on some self-evident proposition, as, for example, that the earth was round, Alan would produce a great deal of incontrovertible evidence to prove that it was almost certainly flat, ovular or much the same shape as a Siamese cat which had been boiled for fifteen minutes at a temperature of one thousand degrees Centigrade.

Code-breaker: Alan Turing's Life and Legacy, at the Science Museum from 21 June 2012 to June 2013, www.sciencemuseum.org.uk

[Jun 06, 2012] Science fiction pioneer Ray Bradbury, 91, has died

"Ray Bradbury, author of Fahrenheit 451, the dystopian novel about the logical development of techotrends in modern society, has died at the age of 91 in Los Angeles, California, Tuesday night, June 5th, 2012

Los Angeles Times

Author of more than 27 novels and story collections-most famously "The Martian Chronicles," "Fahrenheit 451," "Dandelion Wine" and "Something Wicked This Way Comes"-and more than 600 short stories, Bradbury has frequently been credited with elevating the often-maligned reputation of science fiction. Some say he singlehandedly helped to move the genre into the realm of literature.

PHOTOS: Ray Bradbury | 1920 - 2012

"The only figure comparable to mention would be [Robert A.] Heinlein and then later [Arthur C.] Clarke," said Gregory Benford, a UC Irvine physics professor who is also a Nebula award-winning science fiction writer. "But Bradbury, in the '40s and '50s, became the name brand."

Much of Bradbury's accessibility and ultimate popularity had to do with his gift as a stylist-his ability to write lyrically and evocatively of lands an imagination away, worlds he anchored in the here and now with a sense of visual clarity and small-town familiarity.

The late Sam Moskowitz, the preeminent historian of science fiction, once offered this assessment: "In style, few match him. And the uniqueness of a story of Mars or Venus told in the contrasting literary rhythms of Hemingway and Thomas Wolfe is enough to fascinate any critic."

His stories were multi-layered and ambitious. Bradbury was far less concerned with mechanics-how many tanks of fuel it took to get to Mars and with what rocket-than what happened once the crew landed there, or what they would impose on their environment. "He had this flair for getting to really major issues," said Paul Alkon, emeritus professor of English and American literature at USC.

"He wasn't interested in current doctrines of political correctness or particular forms of society. Not what was wrong in '58 or 2001 but the kinds of issues that are with us every year."

Whether describing a fledgling Earthling colony bullying its way on Mars (" -- And the Moon Be Still as Bright" in 1948) or a virtual-reality baby-sitting tool turned macabre monster ("The Veldt" in 1950), Bradbury wanted his readers to consider the consequences of their actions: "I'm not a futurist. People ask me to predict the future, when all I want to do is prevent it."

Ray Douglas Bradbury was born Aug. 22, 1920, in Waukegan, Ill., to Leonard Spaulding Bradbury and the former Esther Marie Moberg. As a child he soaked up the ambience of small-town life - wraparound porches, fireflies and the soft, golden light of late afternoon - that would later become a hallmark of much of his fiction.

In 1945, "The Big Black and White Game," published in the American Mercury, opened the doors to other mainstream publications including Saturday Evening Post, Vogue and Colliers. "A young assistant [at Mademoiselle] found one of my stories in the 'slush pile.' It was about a family of vampires [and] called 'The Homecoming.' " Bradbury told the Christian Science Monitor in 1991. "He gave it to the story editor and said, 'You must publish this!' " That young assistant was Truman Capote, whose own"Homecoming" brought him renown.

Bradbury married Marguerite McClure in 1947, the same year he published his first collection of short stories - "Dark Carnival" (Arkham House) - a series of vignettes that revisited his childhood hauntings.

His first big break came in 1950, when Doubleday collected some new and previously published Martian stories in a volume titled "The Martian Chronicles." A progression of pieces that were at once adventures and allegories taking on such freighted issues as censorship, racism and technology, the book established him as an author of particular insight and note. And a rave review from novelist Christopher Isherwood in Tomorrow magazine helped Bradbury step over the threshold from genre writer to mainstream visionary.

"The Martian Chronicles" incorporated themes that Bradbury would continue to revisit for the rest of his life. "Lost love. Love interrupted by the vicissitudes of time and space. Human condition in the large perspective and definition of what is human," said Benford. "He saw ... the problems that the new technologies presented - from robots to the super-intelligent house to the time machine -- that called into question our comfy definitions of human."

Bradbury's follow-up bestseller, 1953's "Fahrenheit 451," was based on two earlier short stories and written in the basement of the UCLA library, where he fed the typewriter 10 cents every half-hour. "You'd type like hell," he often recalled. "I spent $9.80 and in nine days I had 'Fahrenheit 451.' "

Books like "Fahrenheit 451," in which interactive TV spans three walls, and "The Illustrated Man" - the 1951 collection in which "The Veldt" appeared - not only became bestsellers and ultimately films but cautionary tales that became part of the American vernacular.

"The whole problem in 'Fahrenheit' centers around the debate whether technology will destroy us," said George Slusser, curator emeritus of the J. Lloyd Eaton Collection of Science Fiction, Fantasy, Horror and Utopia at UC Riverside. "But there will always be a spirit that keeps things alive. In the case of 'Fahrenheit,' even though this totalitarian government is destroying the books, the people have memorized them. There are people who love the written word. That is true in most of his stories. He has deep faith in human culture."

But as he garnered respect in the mainstream, he lost some standing among science fiction purists. In these circles, Bradbury was often criticized for being "anti-science." Instead of celebrating scientific breakthroughs, he was reserved, even cautious.

Bradbury had very strong opinions about what the future had become. In the drive to make their lives smart and efficient, humans, he feared, had lost touch with their souls. "We've got to dumb America up again," he said.


Bradbury is survived by his daughters Susan Nixon, Ramona Ostergren, Bettina Karapetian and Alexandra Bradbury; and eight grandchildren. His wife, Marguerite, died in 2003.


[May 06, 2012] The Making of Prince of Persia

jordanmechner.com

Before Prince of Persia was a best-selling video game franchise and a Jerry Bruckheimer movie, it was an Apple II computer game created and programmed by one person, Jordan Mechner.

Now available as a paperback and ebook, Mechner's candid journals from the time capture his journey from his parents' basement to the forefront of the fast-growing 1980s video game industry… and the creative, technical and personal struggles that brought the prince into being and ultimately into the homes of millions of people worldwide.

[Feb 28, 2012] NASA retires its last IBM mainframe - Triangle Business Journal

Back in the 1960s, the System/360 mainframe computer was a technological wonder. And frankly it stayed that way for years, but newer systems have long since been developed. Now comes word that NASA has shut down its last functioning IBM mainframe.

"This month marks the end of an era in NASA computing. Marshall Space Flight Center powered down NASA's last mainframe, the IBM Z9 Mainframe,"wrote NASA CIO Linda Cureton in a blog post.

[Jan 10, 2012] Tech luminaries we lost in 2011 - Computerworld

[Nov 16, 2011] The 40th birthday of the first microprocessor, the Intel 4004

Forty years ago today, electronics and semiconductor trade newspaper Electronic News ran an advertisement for a new kind of chip. The Intel 4004, a $60 chip in a 16-pin dual in-line package, was an entire CPU packed onto a single integrated circuit (IC).

At a bare minimum, a CPU is an instruction decoder and an arithmetic logic unit (ALU); the decoder reads instructions from memory and directs the ALU to perform appropriate arithmetic. Prior CPUs were made up of multiple small ICs of a few dozen or hundred transistors (and before that, individual transistors or valves) wired up together to form a complete "CPU." The 4004 integrated the different CPU components into one 2,300-transistor chip.
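
To make that fetch/decode/execute description concrete, here is a minimal sketch of such a loop in Python. The machine it simulates is hypothetical: a toy 4-bit accumulator design with an invented instruction encoding, not the 4004's real instruction set (which used 8- and 16-bit instructions and a 12-bit address space).

    # Toy fetch-decode-execute loop for a hypothetical 4-bit accumulator machine.
    # The opcodes and encoding are invented for illustration; this is NOT the 4004 ISA.
    memory = [0x14, 0x23, 0x35, 0x00, 0x0, 0x0]   # a tiny program followed by two data cells
    acc = 0        # 4-bit accumulator
    pc = 0         # program counter

    while True:
        instr = memory[pc]                          # fetch the next instruction word
        opcode, operand = instr >> 4, instr & 0xF   # decode: high nibble opcode, low nibble operand
        pc += 1
        if opcode == 0x0:                           # HALT
            break
        elif opcode == 0x1:                         # LOAD immediate into the accumulator
            acc = operand
        elif opcode == 0x2:                         # ADD immediate: the ALU step
            acc = (acc + operand) & 0xF             # results stay 4 bits wide
        elif opcode == 0x3:                         # STORE accumulator to memory[operand]
            memory[operand] = acc

    print(acc, memory[5])                           # prints: 7 7

Small as it is, the loop shows the division of labor the article describes: the decoder splits each fetched word into an opcode and an operand, and the ALU (here, the masked addition) does the arithmetic.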

The 4004 wasn't just a new direction for the computer industry; it was also a new direction for Intel. Since its founding in 1968, Intel had been a memory company, making various kinds of RAM and boasting some of the fastest and highest-density memory in the industry. It wasn't in the business of making CPUs or logic chips. Nonetheless, Japanese electronic calculator company Busicom approached Intel in 1969, asking the memory company to build a new set of logic chips for its calculators.

Busicom proposed a fixed-purpose design requiring around a dozen chips. Busicom had designed the logic itself, and even verified that it was correct; it wanted Intel to build the things. Ted Hoff, manager of Intel's Application Department, realized that the design could be simplified and improved by using a general-purpose CPU instead of the specialized calculator logic that Busicom proposed. Hoff managed to convince both Intel and Busicom management that his approach was the right one.

Work started six months later when Intel hired Federico Faggin in April 1970 to work on the project. Faggin had to design and validate the logic of the CPU. This was a challenge for Intel. As a memory company, it didn't have methodologies for designing or validating logic circuits. Intel's processes were geared towards the production of simple, regular repeating structures, rather than the highly varied logic that a CPU requires.

Faggin's job was also made more complex by the use of silicon gate transistors. At the time, aluminum gates were standard, and while silicon eventually won out, its early development was difficult; silicon gates needed different design approaches than aluminum, and those approaches hadn't been invented yet.

Nonetheless, Faggin was successful, and by March 1971 had completed the development work on a family of four different chips. There was a 2048-bit ROM, the 4001; a 40-byte RAM, the 4002; an I/O chip, the 4003; and finally, the CPU itself, the 4004. Intel paid Busicom for the rights to the design, allowing the firm to sell and market the chip family. Branded as MCS-4, the chips started production in June 1971, before being advertised to the commercial markets 40 years ago today.

Clumsy and cutting-edge

The 4004 itself was a peculiar mix of cutting-edge technology and conservative cost-cutting. As an integrated CPU it was a landmark, but the design itself was clumsy even for 1970. Intel management insisted that the chip use a 16-pin DIP, even though larger, 40-pin packages were becoming mainstream at the time. This meant that the chip's external bus was only four bits wide, and this single 4-bit bus had to transport 12-bit memory addresses, 8- and 16-bit instructions, and the 4-bit integers that the CPU operated on. Reading a single 16-bit instruction thus took four separate read operations. The chip itself had a 740 kHz clock, using 8 clock cycles per instruction. It was capable of 92,600 instructions per second - but with the narrow multipurpose bus, achieving this in practice was difficult.
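
As a back-of-the-envelope check on those figures (taking the 740 kHz clock and 8 cycles per instruction stated above as given), 740,000 divided by 8 gives 92,500 instructions per second, in line with the commonly quoted 92,600 peak figure, and a 16-bit instruction moved over a 4-bit bus indeed needs four transfers:

    # Rough arithmetic behind the 4004 figures quoted above.
    clock_hz = 740_000                            # 740 kHz clock
    cycles_per_instruction = 8                    # 8 clock cycles per instruction
    print(clock_hz // cycles_per_instruction)     # 92500: roughly the quoted 92,600 instructions/sec

    instruction_bits = 16                         # widest instruction format
    bus_width_bits = 4                            # single multiplexed external bus
    print(instruction_bits // bus_width_bits)     # 4 separate read operations per 16-bit instruction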

In 1972, Intel produced the 8-bit 8008. As with the 4004, this was built for a third party-this time terminal manufacturer Datapoint-with Datapoint contributing much of the design of the instruction set, but Intel using its 4004 experience to actually design the CPU. In 1974, the company released the 8080, a reworked 8008 that used a 40-pin DIP instead of 8008's 18-pin package. Federico Faggin did much of the design work for the 8008 and 8080.

In spite of these pioneering products, Intel's management still regarded Intel as a memory company, albeit a memory company with a sideline in processors. Faggin left Intel in 1974, founding his own processor company, Zilog. Zilog's most famous product was the Z80, a faster, more powerful, software-compatible derivative of the 8080, which powered early home computers including the Radio Shack TRS-80 and the Sinclair ZX80, ZX81, and ZX Spectrum - systems that were many people's first introduction to the world of computing.

Faggin's decision to leave Intel and go into business for himself caused some bad feeling, with Intel for many years glossing over his contribution. Nonetheless, he left an indelible mark on Intel and the industry as a whole, not least due to his decision to sign his initials, FF, on the 4004 die.

The 8080 instruction set was then extended to 16 bits, with Intel's first 16-bit processor, the 20,000-transistor 8086, released in 1978. This was the processor that first heralded Intel's transition from a memory company that also produced processors into the world's leading processor company. In 1981, IBM picked the Intel 8088 (an 8086 with the external bus cut to 8 bits instead of 16) to power its IBM PC, the computer by which all others would come to be measured. But it wasn't until 1983, with memory revenue being destroyed by cheap Asian competitors, that Intel made microprocessors its core product.

The processors of today continue to owe much of their design (or at least, the design of their instructions) to the 8086. They're unimaginably more complex, with the latest Sandy Bridge E CPUs using 2.2 billion transistors, a million-fold increase on the 4004 and a 100,000-fold increase on the 8086, yet the basic design elements are more than 30 years old.
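
The scale of that comparison is easy to verify. A minimal Python sketch follows; the 4004's transistor count (roughly 2,300) is a widely cited figure assumed here, since the article does not state it:

# Transistor-count ratios behind the "million-fold" and "100,000-fold" claims.
transistors_4004 = 2_300                      # widely cited estimate (assumed)
transistors_8086 = 20_000                     # from the article
transistors_sandy_bridge_e = 2_200_000_000    # from the article

print(round(transistors_sandy_bridge_e / transistors_4004))   # ~956,500, roughly a million-fold
print(transistors_sandy_bridge_e // transistors_8086)         # 110,000, roughly 100,000-fold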

While the 4004 is widely regarded as the first microprocessor, and is certainly the best known, it arguably isn't actually the first. There are two other contenders.

Texas Instruments' TMS 1000 first hit the market in calculators in 1974, but TI claimed it was invented in 1971, before the 4004. Moreover, TI was awarded a patent in 1973 for the microprocessor. Intel subsequently licensed this patent.

Earlier than both of these was a processor called AL1. AL1 was built by a company named Four-Phase Systems. Four-Phase demonstrated systems built using AL1 in 1970, with several machines sold by early 1971. This puts them ahead of both TI and Intel. However, at the time AL1 was not used as a true standalone CPU; instead, three AL1s were used, together with three further logic chips and some ROM chips.

Intel and Cyrix came to blows in a patent dispute in 1990, with TI's patent being one of the contentious ones. To prove that TI's patent should not have been granted, Four-Phase Systems founder Lee Boysel took a single AL1 and assembled it together with RAM, ROM, and I/O chips (but no other AL1s or logic chips) to prove that it was, in fact, a microprocessor, and hence that it was prior art that invalidated TI's claim. As such, although it wasn't used this way, and wasn't sold standalone, the AL1 can retrospectively claim to have been the first microprocessor.

The 4004 is, however, still the first commercial microprocessor, and it's the first microprocessor recognized and used at the time as a microprocessor. Simple and awkward though its design may have been, it started a revolution. Ted Hoff, for convincing Busicom and Intel alike to produce a CPU, Federico Faggin, for designing the CPU, and Intel's management, particularly founders Gordon Moore and Robert Noyce, for buying the rights and backing the project, together changed the world.


[Nov 16, 2011] Ten years of Windows XP how longevity became a curse by Peter Bright

Not sure about the curse, but XP really had a tremendous run...
arstechnica.com

Windows XP's retail release was October 25, 2001, ten years ago today. Though no longer readily available to buy, it continues to cast a long shadow over the PC industry: even now, a slim majority of desktop users are still using the operating system.

...For home users using Windows 95-family operating systems, Windows XP had much more to offer, thanks to its substantially greater stability and security, especially once Service Pack 2 was released.

...Over the course of its life, Microsoft made Windows XP a much better operating system. Service Pack 2, released in 2004, was a major overhaul of the operating system. It made the software better able to handle modern systems, with improved WiFi support and a native Bluetooth stack, and made it far more secure. The firewall was enabled by default, the bundled Internet Explorer 6 gained the "gold bar" popup blocker and ActiveX security feature, and for hardware that supported it, Data Execution Prevention made it more difficult to exploit software flaws.

...Ten years is a good run for any operating system, but it really is time to move on. Windows 7 is more than just a solid replacement: it is a better piece of software, and it's a much better match for the software and hardware of today.

[Oct 30, 2011] Dennis Ritchie Day

Slashdot

"Today we celebrate Dennis Ritchie Day, an idea proposed by Tim O'Reilly. Ritchie, who died earlier this month, made contributions to computing that are so deeply woven into the fabric that they impact us all. We now have to remark on the elephant in the room. If Dennis Ritchie hadn't died just after Steve Jobs, there would probably have been no suggestion of a day to mark his achievements.

We have to admit that it is largely a response to the perhaps over-reaction to Steve Jobs which highlighted the inequality in the public recognition of the people who really make their world work."

[Oct 30, 2011] What Everyone Is Too Polite to Say About Steve Jobs

gawker.com

Before he was deposed from Apple the first time around, Jobs already had a reputation internally for acting like a tyrant. Jobs regularly belittled people, swore at them, and pressured them until they reached their breaking point. In the pursuit of greatness he cast aside politeness and empathy. His verbal abuse never stopped. Just last month Fortune reported about a half-hour "public humiliation" Jobs doled out to one Apple team:

"Can anyone tell me what MobileMe is supposed to do?" Having received a satisfactory answer, he continued, "So why the fuck doesn't it do that?"

"You've tarnished Apple's reputation," he told them. "You should hate each other for having let each other down."

Jobs ended by replacing the head of the group, on the spot.

In his book about Jobs' time at NeXT and return to Apple, The Second Coming of Steve Jobs, Alan Deutschman described Jobs' rough treatment of underlings:

He would praise and inspire them, often in very creative ways, but he would also resort to intimidating, goading, berating, belittling, and even humiliating them... When he was Bad Steve, he didn't seem to care about the severe damage he caused to egos or emotions... suddenly and unexpectedly, he would look at something they were working on and say that it "sucked," that it was "shit."

Jobs had his share of personal shortcomings, too. He has no public record of giving to charity over the years, despite the fact he became wealthy after Apple's 1980 IPO and had accumulated an estimated $7 billion net worth by the time of his death. After closing Apple's philanthropic programs on his return to Apple in 1997, he never reinstated them, despite the company's gusher of profits.

It's possible Jobs has given to charity anonymously, or that he will posthumously, but he has hardly embraced or encouraged philanthropy in the manner of, say, Bill Gates, who pledged $60 billion to charity and who joined with Warren Buffett to push fellow billionaires to give even more.

"He clearly didn't have the time," is what the director of Jobs' short-lived charitable foundation told the New York Times. That sounds about right. Jobs did not lead a balanced life. He was professionally relentless. He worked long hours, and remained CEO of Apple through his illness until six weeks before he died. The result was amazing products the world appreciates. But that doesn't mean Jobs' workaholic regimen is one to emulate.

There was a time when Jobs actively fought the idea of becoming a family man. He had his daughter Lisa out of wedlock at age 23 and, according to Fortune, spent two years denying paternity, even declaring in court papers "that he couldn't be Lisa's father because he was 'sterile and infertile, and as a result thereof, did not have the physical capacity to procreate a child.'" Jobs eventually acknowledged paternity, met and married his wife, now widow, Laurene Powell, and had three more children. Lisa went to Harvard and is now a writer.

[Oct 30, 2011] Ten years of Windows XP

Windows XP's retail release was October 25, 2001, ten years ago today. Though no longer readily available to buy, it continues to cast a long shadow over the PC industry: even now, a slim majority of desktop users are still using the operating system.

Windows XP didn't boast exciting new features or radical changes, but it was nonetheless a pivotal moment in Microsoft's history. It was Microsoft's first mass-market operating system in the Windows NT family. It was also Microsoft's first consumer operating system that offered true protected memory, preemptive multitasking, multiprocessor support, and multiuser security.

The transition to pure 32-bit, modern operating systems was a slow and painful one. Though Windows NT 3.1 hit the market in 1993, its hardware demands and software incompatibility made it a niche operating system. Windows 3.1 and 3.11 both introduced small amounts of 32-bit code, and the Windows 95 family was a complex hybrid of 16-bit and 32-bit code. It wasn't until Windows XP that Windows NT was both compatible enough (most applications having been updated to use Microsoft's Win32 API) and sufficiently light on resources.

In the history of PC operating systems, Windows XP stands alone. Even Windows 95, though a landmark at its release, was a distant memory by 2005. No previous PC operating system has demonstrated such longevity, and it's unlikely that any future operating system will. Nor is its market share dominance ever likely to be replicated; at its peak, Windows XP was used by more than 80 percent of desktop users.

The success was remarkable for an operating system whose reception was initially quite muted. In the wake of the September 11th attacks, the media blitz that Microsoft planned for the operating system was toned down; instead of arriving with great fanfare, it slouched onto the market. Retail sales, though never a major way of delivering operating systems to end users, were sluggish, with the operating system selling at a far slower rate than Windows 98 had done three years previously.

It faced tough competition from Microsoft's other operating systems. Windows 2000, released less than two years prior, had won plaudits with its marriage of Windows NT's traditional stability and security to creature comforts like USB support, reliable plug-and-play, and widespread driver support, and was widely adopted in businesses. For Windows 2000 users, Windows XP was only a minor update: it had a spruced up user interface with the brightly colored Luna theme, an updated Start menu, and lots of little bits and pieces like a firewall, UPnP, System Restore, and ClearType. ...

Long in the tooth it may be, but Windows XP still basically works. Regardless of the circumstances that led to its dominance and longevity, the fact that it remains usable so long after release is remarkable. Windows XP was robust enough, modern enough, well-rounded enough, and usable enough to support this extended life. Not only was Windows XP the first (and only) PC operating system that lasted ten years: it was the first PC operating system that was good enough to last ten years. Windows 98 didn't have the security or stability; Windows 2000 didn't have the security or comfort; Mac OS X 10.1 didn't have the performance, the richness of APIs, or the hardware support.

... ... ...

Given current trends, Windows 7 will overtake XP within the next year, with many businesses now moving away from the decade-old OS in earnest. Not all: there are still companies and governments rolling out Windows XP on new hardware, but the tide has turned. Windows XP, with its weaker security and inferior support for modern hardware, is now becoming a liability; Windows 7 is good enough for business and an eminently worthy successor, in a way that Windows Vista was never felt to be.

Ten years is a good run for any operating system, but it really is time to move on. Windows 7 is more than just a solid replacement: it is a better piece of software, and it's a much better match for the software and hardware of today. Being usable for ten years is quite an achievement, but the stagnation it caused hurts, and is causing increased costs for administrators and developers alike. As incredible as Windows XP's longevity has been, it's a one-off. Several factors (the 32-bit transition, the Longhorn fiasco, even the lack of competition resulting from Apple's own Mac OS X transition) conspired to make Windows XP's position in the market unique. We should not want this situation to recur: Windows XP needs to be not only the first ten-year operating system; it also needs to be the last.

Selected comments

Muon:

"We should not want this situation to recur: Windows XP needs to be not only the first ten-year operating system; it also needs to be the last."

It feels like you completely missed the point.

Stability. Matters.

In today's fast-paced, constantly iterating (not innovating, as they claim) world, "good enough" is an alien concept, a foreign language. Yet we reached "good enough" ten years ago and it shows no signs of ever going away.

superslav223

OttoResponder wrote: All those words and yet one was missed: monopoly. You can't really talk about XP - or any other Microsoft OS - without talking about the company's anti-competitive practices. For example, the way they strong-armed VARs into selling only Windows... Sure the Bush Administration let them off the hook, but the court's judgement still stands.

Pirated XP is still installed far more than Linux despite being an OS from 2001.

Linux on the desktop has shortcomings and pretending they don't exist won't make them go away.

Microsoft used strong arm tactics but the competition also sucked. I have known many geeks that chose XP over Linux because they found the latter to be too much of a hassle, not because of OEMs or software compatibility.

xpclient:

"Windows XP didn't boast exciting new features". I stopped reading there because that's a load of crap/myth. XP came with a large number of NEW and EXCITING features. Read more about them here: http://en.wikipedia.org/wiki/Features_new_to_Windows_XP . XP was a very well engineered system that improved by orders of magnitude upon Windows 2000. Its popularity and continued use demonstrate just how well designed the system was. Great compatibility, excellent stability and performance. Security was an Achilles heel but SP2 nailed it and XP became a very good OS.

Windows 7 has some nice features but plenty of regressions too. Windows 7 can't even do basic operations like freely arrange pictures in a folder or not force a sorting order on files. The search is totally ruined for real-time searching, WMP12 is a UI disaster. Service packs and updates take hours to install instead of minutes and can't be slipstreamed into setup files. There's no surround sound audio in games. There is no choice of a Classic Start Menu. Windows Explorer, the main app where I live (instead of living on Facebook) is thoroughly dumbed down.

chabig:

"Windows XP didn't boast exciting new features or radical changes, but it was nonetheless a pivotal moment in Microsoft's history. It was Microsoft's first mass-market operating system in the Windows NT family. It was also Microsoft's first consumer operating system that offered true protected memory, preemptive multitasking, multiprocessor support, and multiuser security."

Talk about contradictions. First, claim there were no new or exciting features, then list a bunch of them. XP was the first fully 32-bit Windows OS and broke dependence on DOS. I'd say it did offer radical changes for the better.

carlisimo:

I'm still on XP, and I'm hesitant about upgrading... I don't like the changes to Windows Explorer at all.

dnjake:

How often do you need a new spoken language or a new hammer? When people spend effort learning how to use an operating system and that operating system meets their needs, change is a losing proposition. The quality of Microsoft's work is going down. But Microsoft's quality is still far better than almost any of the low grade work that is standard for the Web. Windows 7 does offer some improvement over XP and it is a more mature operating system. It will be used longer than XP and it remains to be seen how long XP's life will turn out to be. The quality of the Web is still low. Even the most basic forms and security sign in applications are primitive and often broken. It may easily take another decade. But the approach Microsoft is talking about with Windows 8 will probably eventually provide a mature system based on the HTML DOM as the standard UI. Between that and XAML, it is hard to see why anything more will be needed. The days of ever changing operating systems are drawing to a close.

microlith

superslav223 wrote:

Windows 2008R2 is probably safer than RHEL for web hosting.

Really? I'd like to see some results on this.

Quote: IE6 and IE9 might as well be entirely different browsers.

Yes, because they had competition surpassing them. Otherwise you get 5+ years of... nothing.

theJonTech:

I am in charge of the PC Deployment team of a Fortune 400 company. I can tell you first hand the nightmare of upgrading to Windows 7. The facts are, legacy apps haven't been upgraded to run on Windows 7, much less Windows 7 64bit. We had the usual suspects, Symantec, Cisco, Citrix all ready for launch, but everyone else drags their feet and we have had to tell our customers, either do away with the legacy app and we can find similar functionality in another application or keep your legacy app and you will be sent an older PC with XP on it (Effectively redeploying what they already have with more memory). Add to that, we are using Office 2010 and it's a complete shell shock to most end users used to Office 2003, though going from 2007 isn't as bad.

On the other hand I do small business consulting and moved over a 10 person office and they were thrilled with Windows 7, as it really took advantage of the newer hardware.

It just depends on the size and complexity of the upgrade. My company just cannot throw away a working OS, when these production applications won't work... Maybe in a perfect IT world

Hagen:

fyzikapan wrote:

The various Linux distros still haven't managed to come up with a solid desktop OS that just works, to say nothing of the dearth of decent applications, and what is there frequently looks and works like some perpetual beta designed by nerds in their spare time.

It's funny b/c it's so very true

jiffylube1024:

Windows XP's longevity is truly remarkable. The article makes a good point in that the strong push towards internet-connected PC's and internet security made running all pre-XP Microsoft desktop OS'es untenable after a few years, especially after Windows XP SP2 released with beefier security.

I personally jumped ship to Vista as soon as I could, because after the stability issues were ironed out within the first 6 months, it was a much smoother, better PC experience than XP (long boot times notwithstanding). Windows 7, which was essentially just a large service pack of Vista sold as a new OS (think OS X releases), was a smoother, more refined Vista.

I believe that Windows 7 is "the new XP", and it will probably still command well over 10% of the desktop market in 5+ years. I believe that for non-touch screen PC's, Windows 7 will be the gold standard for years to come, and that is the vast majority of business PC's and home PC's. New builds of Windows 7 boot faster than XP, and run smoother with fewer hiccups. The GPU-accelerated desktop really does run smoother than the CPU driven ones of the past.

Nothing will approach XP's 10-year run, most of that as the dominant desktop OS. The lines between desktop and laptop have been blurred lately as well; Windows 7 and Mac OS X are considered "desktop" OS'es even when they run on laptops. There is a newfound emphasis on mobile OS'es like never before. More and more people will use tablet devices as media consumption devices - to surf the net, watch videos, etc. More and more people use computers while watching TV; it's a trend that is only increasing, and smartphones and tablets make this even easier.

Because Windows 8's "Metro" UI is so touch-focused, I could see it taking off in school usage, laptops, and, of course, tablets in the 201x decade. It will be interesting to see how Windows 8 tablets run when the OS first launches in late 2012; tablet hardware is at least an order of magnitude slower than desktop hardware. Within a few years of Windows 8's launch, however, there may be no perceptible performance difference between Tablet and desktop/laptop usage.

superslav223

Hagen wrote:

OS X 10.0-10.7 = $704

Windows XP - 7 via upgrades (XP Professional Full - Ultimate - Ultimate) = $780

Windows XP - 7 via full versions (Professional - Ultimate - Ultimate) = $1020

OS X has been cheaper if you were looking for the full experience of Windows each time. I don't have time to research which of the versions were valid upgrades of each other, b/c that was a super PITA, so I'll just leave that there for others to do.

You're too lazy to do the research for your own comparison?

Ultimate mainly exists for the people that max out laptops because they have nothing better to do with their money. The best feature of Ultimate (BitLocker) has a free alternative (TrueCrypt).

You certainly don't need Ultimate for the full Windows experience.

EmeraldArcana:

DrPizza wrote: I don't even understand the question, particularly not with regard to how it relates to Apple. Apple doesn't give you iterative improvements. It releases big new operating systems that you have to buy.

I think the case can be made that the transition between versions of Windows, traditionally, is a bit larger of a jump than the transition between versions of Mac OS X.

The leap from Windows XP to Windows Vista was quite large; compare that to the changes between Mac OS 10.3 and Mac OS 10.4. Similarly, Vista to 7 looks "more" than 10.4 to 10.5.

While Apple's charging for each point iteration of its operating system and adding new features, most of the underlying elements are essentially the same. They round out corners, remove some brushed metal here and there, change a candy stripe or two, but the visual differences between versions aren't as dramatic.

chronomitch

cactusbush wrote: Yeah, we are obliged to upgrade OS's eventually, when we purchase new hardware. Problem is that Windoze is getting worse - not better. Windoze Vista and 7 -STINK- and the future with Win8 looks even bleaker. My issues revolve around having command and control over my personal machine rather than it behaving like a social machine or by having the OS protect me from the machine's inner workings.

Several previous commentators have already remarked along similar lines, their frustration with Win7&8's isolation and de-emphasis of the file managing 'Explorer' app. "aliasundercover" listed several Win7 shortcomings, including excessive 'nag screens', less control over where things get put, "piles of irrelevant things run incessantly in the background", the need for the machine to 'Phone Home' constantly and the "copy protection and activation getting meaner".

Years ago, the purchase of XP, with its new licensing that limited installation to only one computer, determined that XP would be my last MS OS purchase. Linux however has not yet blossomed to a desirable point.

MS is developing dumbed down style of operating systems that I don't want and don't like.

1) There is absolutely nothing stopping you from "having command and control" over your personal Windows 7 machine. In fact, Windows 7 provides more and better facilities for automatically starting and managing various tasks.

2) Nag screens can be easily disabled. Moreover, there are multiple settings for these screens. For example, I only see these screens when I install or uninstall software.

3) I don't see how explorer has been "isolated" or "de-emphasized." There are a few UI changes to the program, but most can be reverted to what XP looked like, save for the lack of an "up" button (which can be restored with certain software). Learn to use the new search in the start menu. It will save you a lot of time in the long run.

4) I'm not sure what the "less control over where things get put" complaint is about. Pretty much every program installer allows you to change where programs are installed.

5) Windows 7 runs faster and more efficiently than Windows XP, regardless of background processes.

6) Windows 7 activation has been painless. I don't see why anyone cares about Windows "phoning home" for activation after installation or the copy protection scheme, unless you're a pirate. Buy a copy of Windows 7 for each PC and stop being such a cheap-ass.

Honestly, it sounds like you have had very little actual exposure to Windows 7 and have just picked up complaints from other people. Neither XP nor 7 are perfect OSes, but 7 is leagues above XP in terms of security, performance, and standards. Windows 7 is a modern OS in every sense of the word. XP is an OS that has been patched and updated many times during its lifespan to include features and security it should have had in the first place.

me987654

lwatcdr wrote:

Nightwish wrote: That and most people walk into a Best Buy and get whatever they're told to get, having no idea what an OS is or that there's a choice. Enterprise is terrified of change.

Enterprise needs to get work done. Vista had all the problems of a major update with the benefits of a minor update.

I'm left wondering the age of the people spouting this "enterprise is terrified of change" meme.

Seriously. This isn't meant as an insult to younger people. It isn't bad to be young. However, youth often don't fully grasp the factors that go into the decision making process.

IT departments aren't afraid of change. Change is exactly what keeps them employed and makes their job interesting. You'll find that they usually run the latest and greatest at home, likely have brand new gadgets, and spend their free time on sites like ars.

So why don't they upgrade? Because upgrading costs time, money, and the devoted attention of people in key roles. It also results in lost productivity in the short term. The benefits of upgrading must be weighed against the costs of upgrading. But not only that, the upgrade must be weighed against other projects that might help the organization more. Only so much change can be managed and endured simultaneously.

Meanwhile, casual and overly emotional observers pretend that IT departments are sticking with XP because they're lazy or haven't given the topic much thought. Rest assured, migration from XP has been given a ton of attention and the decision of when to leap isn't made lightly.

Great post... I think young people don't fully grasp how important it is to keep those main line of business applications operating.

MekkelRichards

I love XP and always will. I have been using it for almost 10 years. Got it just after it came out on my first real computer. The Dell Dimension 4100 with 733MHZ Pentium3 and 128MB SDRAM.

Just a few months ago I sold a brand new replacement laptop that I was sent from Dell so that I could buy an older, cheap laptop. A 2006 Inspiron E1405. It has a 1.83GHZ Core Duo, released before even the Core 2 Duos came out, only a 32-bit CPU. I am running XP SP3 on it with 2gigs of RAM and it flies. I run every program that any normal person would. Currently have 13 tabs open in Chrome, Spotify open, some Windows Explorer windows, and Word 2010 open. Not a hint of slowdown.

XP is just so lean and can be leaned out even further through tons of tweaks.

FOR ANYONE WHO STILL RUNS XP, download an UxTheme.dll patcher so that you can use custom themes!

NuSkoolTone

All this crying about "Holding us back". I say to the contrary, it kept us from moving "Forward". Forward as in needing new hardware every couple of years that in the end gave us NO REAL new functionality, speed, or efficiency. It wasn't until the CORE processors from Intel that there was any NEED for a new OS to take advantage.

Being able to keep working on old hardware that STILL PERFORMED, or being able to upgrade when you FELT like it (instead of being FORCED to because the new crappy whizbang OS brought it to its knees) with results that FLEW was NICE.

Windows 7 is a worthy successor to XP, but that doesn't mean XP wasn't a GREAT OS during its run!

Mr Bil

A few of us are using XP because the 70-250 thousand dollar instrument requires a particular OS to run the software. Upgrading to a new OS (if offered) is a 3 to 14 thousand dollar cost for new controller boards in the PC and the new software, not to mention the additional cost of a new PC. We have two Win98, one WinNT, and three WinXP machines in our lab running instruments.

metalhead0043

I just got on the Windows 7 bandwagon a little over a month ago. There are some things I like and some things I don't like. The boot times and shut down times are considerably faster than XP. Also I feel like the entire OS is just much more stable. I never get programs that hang on shut down. It just plain works. I don't care much for the new Windows 7 themes. I immediately went to the Windows Classic theme as soon as I found it. However, I still like the old XP start menu more. It was just more compact and cleaner. I do like the search feature for Windows 7. There are some other things I don't like, like the Explorer automatically refreshing when I rename a file. It's a pointless feature that adds nothing to the experience.

At my job, however, I see no upgrade from XP in sight. I work for a major office supply retailer, and we are hurting financially. We still use the same old Pentium 4 boxes from when I started back in 2003.

trollhunter

What moves people (and companies) to upgrade (or not) their OS? Basically, 2 things: applications and hardware. For a long time, XP covered these 2 items very well (even in the case of legacy Win-16 or DOS applications in most of the cases, either natively or through 3rd-party support like DOSBox), so the market felt no big need to change. Microsoft knew this, that's why things like DirectX 10 got no support on XP. OTOH, the hardware market evolved to 64-bit platforms, and things like the 3GB address space limit in XP and immature multi-core processor support became real problems.

I think that Windows 7 / Windows 8 will spread faster when people and enterprises feel the need to migrate to 64 bit hardware (that was my case, BTW).

[Oct 29, 2011] Steve Jobs death Apple boss' tangled family and who could inherit his $8.3bn fortune By Mark Duell

October 7, 2011 | Mail Online

A father he never knew, a love-child he once denied and a sister he only met as an adult: The tangled family of Steve Jobs... and who could inherit his $8.3 BILLION fortune

- Apple co-founder survived by two sisters, wife and their three children
- But he also had love child Lisa Brennan-Jobs with Chrisann Brennan
- His Syrian biological father never had a conversation with Jobs as an adult

Steve Jobs's tangled family of a forgotten father, long-lost sister and love child means lawyers may face a delicate task breaking up his $8.3billion fortune. The 56-year-old co-founder and former CEO of Apple is widely seen as one of the world's greatest entrepreneurs - and he died just outside the top 100 world's richest billionaires. But behind the iconic Californian's wealth and fame lies an extraordinary story of a fragmented family. Mr Jobs, of Palo Alto, California, is survived by his sisters Patti Jobs and Mona Simpson, his wife Laurene Powell Jobs and their three children Eve, Erin and Reed.

STEVE JOBS AND HIS FAMILY
Biological parents: Joanne Schieble and Abdulfattah Jandali
Biological sister: Mona Simpson
Adoptive parents: Clara and Paul Jobs
Adoptive sister: Patti Jobs
Wife: Laurene Powell Jobs
Children: Eve, Erin and Reed
Love child: Lisa Brennan-Jobs from relationship with Chrisann Brennan

But his family is far from straightforward. He was adopted as a baby and, despite his biological father's attempts to contact him later on, remained estranged from his natural parents. In his early twenties Mr Jobs became embroiled in a family scandal before his days of close media scrutiny, after he fathered a love child with his high school sweetheart Chrisann Brennan. Ms Brennan, who was his first serious girlfriend, became pregnant in 1977 - and he at first denied he was the father. She gave birth to Lisa Brennan-Jobs in 1978 - and in the same year Mr Jobs created the 'Lisa' computer, but insisted it only stood for 'Local Integrated Software Architecture'. The mother initially raised their daughter on benefits. But he accepted his responsibilities two years later after a court-ordered blood test proved he was the father, despite his claims of being 'infertile'.

Ms Brennan-Jobs has made a living for herself, after graduating from Harvard University, as a journalist and writer. She was eventually invited into her father's life as a teenager and told Vogue that she 'lived with him for a few years'. 'In California, my mother had raised me mostly alone,' Lisa wrote in an article for Vogue in 2008. 'We didn't have many things, but she is warm and we were happy. We moved a lot. We rented.

'My father was rich and renowned, and later, as I got to know him, went on vacations with him, and then lived with him for a few years, I saw another, more glamorous world.'

Mr Jobs was born to Joanne Schieble and Syrian student Abdulfattah Jandali before being given up for adoption.

Mr Jandali was a Syrian student and not married to Ms Schieble at the time of Mr Jobs's birth in San Francisco, California, in February 1955.


She did not want to bring up a child out of wedlock and went to San Francisco from their home in Wisconsin to have the baby.

Mr Jobs is thought never to have made contact with his biological father.

Mr Jandali, 80, a casino boss, has said he wanted to meet his son but was worried that Mr Jobs might think he was after money. He had always hoped that his son would call him to make contact - and had emailed him a few times in an attempt to speak. Mr Jandali once said he 'cannot believe' his son created so many gadgets. 'This might sound strange, though, but I am not prepared, even if either of us was on our deathbeds, to pick up the phone to call him,' he said.

Ms Schieble and Mr Jandali then had a second child called Mona Simpson, who became a novelist. Ms Simpson is an author who once wrote a book loosely based on her biological brother. She lives in Santa Monica, California, with her two children and was once married to producer Richard Appel. But Mr Jobs did not actually meet Ms Simpson until he was aged 27. He never wanted to explain how he tracked down his sister, but she described their relationship as 'close'.

Mr Jobs was adopted by working-class couple Clara and Paul Jobs, who have both since died, but they also later adopted a second child - Patti Jobs. He later had his love child, Lisa Brennan-Jobs, with his longtime girlfriend Ms Brennan in 1978. He met his wife Laurene Powell in 1989 while speaking at Stanford's graduate business school and he had three children with her - Eve, Erin and Reed. They married in 1991 and Reed was born soon after. He is their oldest child, aged 20.

Mr Jobs registered an incredible 338 U.S. patents or patent applications for technology and electronic accessories, reported the International Business Times. He was believed to have driven a 2007 Mercedes Benz SL55 AMG, which was worth around $130,000 new at the time. His 5,700 sq ft home was a 1930s Tudor-style property with seven bedrooms and four bathrooms - and it is estimated by CNBC to be worth $2.6million.

Mr Jobs also owned a huge historic Spanish colonial home in Woodside, which had 14 bedrooms and 13 bathrooms, located across six acres of forested land. But he later had it knocked down to make way for a smaller property after a long legal battle.

His charitable giving has always been a secret topic, just like most other elements of his lifestyle. Mr Jobs reportedly declined to get involved with the Giving Pledge - founded by Warren Buffett and Bill Gates to get the wealthiest people to give away at least half of their wealth. But he is rumoured to have given $150million to the Helen Diller Family Comprehensive Cancer Center at the University of California in San Francisco, reported the New York Times. It is cancer organisations that are most likely to be supported if any charities are in his will, as he died on Wednesday at the age of 56 from the pancreatic form of the illness.

Read more: http://www.dailymail.co.uk/news/article-2046031/Steve-Jobs-death-Apple-boss-tangled-family-inherit-8-3bn-fortune.html#ixzz1cG2DkrVu

I don't care how you spell it out..... money, love children, tangled web of a life..... The world just lost the Henry Ford and the Thomas Edison of our day. Can anyone set the dollar value of his estate aside and look at Steve Jobs holistically? The guy was damn brilliant.... RIP Steve and sympathies to your wife and children. Right now, they don't care what your net worth was..... you were their dad and a father....
- Kurt R, Northville, MI, 08/10/2011 00:07
@Nancy Briones: he was a hero because he epitomized the American dream - brought up in a very modest household, dropped out of college to save his parents' money, started a business in his garage, made it big, failed and was fired, got back up and made it big all over again. His visions and his attention to detail have changed everyone's lives, whether you use Apple products or not. All computers changed because of Apple; the music industry was dragged kicking & screaming into the 21st century, to the benefit of consumers. Pixar revolutionized animated movies. Just simple computer typography changed massively because of Jobs. He took the computer version of ergonomics (that is, their ease of use) to levels no-one else could be remotely bothered to take them. He made computers useful for the liberal arts field, not just number crunching. His mission in life was to improve the world. His salary was $1 per year. He got rich just because he was successful at changing the world.
- DBS, San Francisco, USA, 08/10/2011 00:00
My name is Ozymandias, king of kings: Look on my works, ye Mighty, and despair
- Clive, Fife, 07/10/2011 15:24
Why was he such a hero? He benefited greatly from his creations. It was his job and he was paid for it. Funny how his cancer diagnosis somehow made us all so sympathetic to someone whose mission in life was to amass wealth, not save the world. My heart goes out to his family in their time of loss, however.



[Oct 14, 2011] Dennis Ritchie, 70, Dies, Programming Trailblazer - by Steve Lohr

October 13, 2011 | NYTimes.com
Dennis M. Ritchie, who helped shape the modern digital era by creating software tools that power things as diverse as search engines like Google and smartphones, was found dead on Wednesday at his home in Berkeley Heights, N.J. He was 70.

Mr. Ritchie, who lived alone, was in frail health in recent years after treatment for prostate cancer and heart disease, said his brother Bill.

In the late 1960s and early '70s, working at Bell Labs, Mr. Ritchie made a pair of lasting contributions to computer science. He was the principal designer of the C programming language and co-developer of the Unix operating system, working closely with Ken Thompson, his longtime Bell Labs collaborator.

The C programming language, a shorthand of words, numbers and punctuation, is still widely used today, and successors like C++ and Java build on the ideas, rules and grammar that Mr. Ritchie designed. The Unix operating system has similarly had a rich and enduring impact. Its free, open-source variant, Linux, powers many of the world's data centers, like those at Google and Amazon, and its technology serves as the foundation of operating systems, like Apple's iOS, in consumer computing devices.

"The tools that Dennis built - and their direct descendants - run pretty much everything today," said Brian Kernighan, a computer scientist at Princeton University who worked with Mr. Ritchie at Bell Labs.

Those tools were more than inventive bundles of computer code. The C language and Unix reflected a point of view, a different philosophy of computing than what had come before. In the late '60s and early '70s, minicomputers were moving into companies and universities - smaller and at a fraction of the price of hulking mainframes.

Minicomputers represented a step in the democratization of computing, and Unix and C were designed to open up computing to more people and collaborative working styles. Mr. Ritchie, Mr. Thompson and their Bell Labs colleagues were making not merely software but, as Mr. Ritchie once put it, "a system around which fellowship can form."

C was designed for systems programmers who wanted to get the fastest performance from operating systems, compilers and other programs. "C is not a big language - it's clean, simple, elegant," Mr. Kernighan said. "It lets you get close to the machine, without getting tied up in the machine."

Such higher-level languages had earlier been intended mainly to let people without a lot of programming skill write programs that could run on mainframes. Fortran was for scientists and engineers, while Cobol was for business managers.

C, like Unix, was designed mainly to let the growing ranks of professional programmers work more productively. And it steadily gained popularity. With Mr. Kernighan, Mr. Ritchie wrote a classic text, "The C Programming Language," also known as "K. & R." after the authors' initials, whose two editions, in 1978 and 1988, have sold millions of copies and been translated into 25 languages.

Dennis MacAlistair Ritchie was born on Sept. 9, 1941, in Bronxville, N.Y. His father, Alistair, was an engineer at Bell Labs, and his mother, Jean McGee Ritchie, was a homemaker. When he was a child, the family moved to Summit, N.J., where Mr. Ritchie grew up and attended high school. He then went to Harvard, where he majored in applied mathematics.

While a graduate student at Harvard, Mr. Ritchie worked at the computer center at the Massachusetts Institute of Technology, and became more interested in computing than math. He was recruited by the Sandia National Laboratories, which conducted weapons research and testing. "But it was nearly 1968," Mr. Ritchie recalled in an interview in 2001, "and somehow making A-bombs for the government didn't seem in tune with the times."

Mr. Ritchie joined Bell Labs in 1967, and soon began his fruitful collaboration with Mr. Thompson on both Unix and the C programming language. The pair represented the two different strands of the nascent discipline of computer science. Mr. Ritchie came to computing from math, while Mr. Thompson came from electrical engineering.

"We were very complementary," said Mr. Thompson, who is now an engineer at Google. "Sometimes personalities clash, and sometimes they meld. It was just good with Dennis."

Besides his brother Bill, of Alexandria, Va., Mr. Ritchie is survived by another brother, John, of Newton, Mass., and a sister, Lynn Ritchie of Hexham, England.

Mr. Ritchie traveled widely and read voraciously, but friends and family members say his main passion was his work. He remained at Bell Labs, working on various research projects, until he retired in 2007.

Colleagues who worked with Mr. Ritchie were struck by his code - meticulous, clean and concise. His writing, according to Mr. Kernighan, was similar. "There was a remarkable precision to his writing," Mr. Kernighan said, "no extra words, elegant and spare, much like his code."

[Oct 7, 2011] Excerpts From Steve Jobs' Wikipedia Entry

October 06, 2011 | Moon of Alabama

To consider today's Steve Jobs hype citing some excerpts from the Wikipedia entry about him seems appropriate.

Jobs returned to his previous job at Atari and was given the task of creating a circuit board for the game Breakout. According to Atari founder Nolan Bushnell, Atari had offered $100 for each chip that was eliminated in the machine. Jobs had little interest in or knowledge of circuit board design and made a deal with Wozniak to split the bonus evenly between them if Wozniak could minimize the number of chips. Much to the amazement of Atari, Wozniak reduced the number of chips by 50, a design so tight that it was impossible to reproduce on an assembly line. According to Wozniak, Jobs told Wozniak that Atari had given them only $700 (instead of the actual $5,000) and that Wozniak's share was thus $350.
...
While Jobs was a persuasive and charismatic director for Apple, some of his employees from that time had described him as an erratic and temperamental manager.
...
In the coming months, many employees developed a fear of encountering Jobs while riding in the elevator, "afraid that they might not have a job when the doors opened. The reality was that Jobs' summary executions were rare, but a handful of victims was enough to terrorize a whole company." Jobs also changed the licensing program for Macintosh clones, making it too costly for the manufacturers to continue making machines.
...
After resuming control of Apple in 1997, Jobs eliminated all corporate philanthropy programs.
...
In 2005, Jobs responded to criticism of Apple's poor recycling programs for e-waste in the U.S. by lashing out at environmental and other advocates at Apple's Annual Meeting in Cupertino in April.
...
In 2005, Steve Jobs banned all books published by John Wiley & Sons from Apple Stores in response to their publishing an unauthorized biography, iCon: Steve Jobs.

The article doesn't go into the outsourcing of the production of Apple products to a Chinese company which is essentially using slave labor with 16-hour work days and a series of employee suicides. This while Apple products are beyond real price competition and the company is making extraordinary profits.

Jobs was reported to be 42nd on the list of the richest men in the United States.

He marketed some good products. The NeXT cube was nice. Jobs though wasn't a nice man.

b

@jdmckay NeXT OS & Development tools were 5-10 years beyond... *anything* else out there. NeXT STEP *defined* OOP... when CS professors were still saying it was a fad.

NeXT came out 1988/89.

I learned object oriented programming (OOP) 1985/86 on a Symbolics LISP Machine which had a very nice graphic interface. The machine was of course running at a computer science department at a university and there were several capable CS professors around who were working on such machines and saw them as the future and not as a fad.

Jobs didn't invent anything with NeXT. He created a really nice package of existing technologies using a UNIX derivative and aspects of the LISP Machine and Smalltalk. Objective-C was developed in the early 1980s. Jobs just licensed it. People at XEROX and elsewhere had been working on such stuff for years before Jobs adopted them.

NeXTStep did not define OOP. It made it wider available. There were already some 7000+ LISP machines sold before NeXT came onto the market.

[Oct 06, 2011] Apple's Steve Jobs, visionary leader, dead at 56

Yahoo! Finance

Six years ago, Jobs had talked about how a sense of his mortality was a major driver behind that vision.

"Remembering that I'll be dead soon is the most important tool I've ever encountered to help me make the big choices in life," Jobs said during a Stanford commencement ceremony in 2005.

"Because almost everything -- all external expectations, all pride, all fear of embarrassment or failure -- these things just fall away in the face of death, leaving only what is truly important."

"Remembering that you are going to die is the best way I know to avoid the trap of thinking you have something to lose. You are already naked. There is no reason not to follow your heart."

[Oct 05, 2011] Jobs, Apple Cofounder and Visionary, Is Dead

NYTimes.com

Apple said in a press release that it was "deeply saddened" to announce that Mr. Jobs had passed away on Wednesday.

"Steve's brilliance, passion and energy were the source of countless innovations that enrich and improve all of our lives," the company said. "The world is immeasurably better because of Steve.

Mr. Jobs stepped down from the chief executive role in late August, saying he could no longer fulfill his duties, and became chairman. He underwent surgery for pancreatic cancer in 2004, and received a liver transplant in 2009.

Rarely has a major company and industry been so dominated by a single individual, and so successful. His influence went far beyond the iconic personal computers that were Apple's principal product for its first 20 years. In the last decade, Apple has redefined the music business through the iPod, the cellphone business through the iPhone and the entertainment and media world through the iPad. Again and again, Mr. Jobs gambled that he knew what the customer would want, and again and again he was right.

The early years of Apple long ago passed into legend: the two young hippie-ish founders, Mr. Jobs and Steve Wozniak; the introduction of the first Macintosh computer in 1984, which stretched the boundaries of what these devices could do; Mr. Jobs's abrupt exit the next year in a power struggle. But it was his return to Apple in 1996 that started a winning streak that raised the company from the near-dead to its current position of strength.

Bill Gates, the former chief executive of Microsoft, said in a statement that he was "truly saddened to learn of Steve Jobs's death." He added: "The world rarely sees someone who has had the profound impact Steve has had, the effects of which will be felt for many generations to come. For those of us lucky enough to get to work with him, it's been an insanely great honor. I will miss Steve immensely."

Mr. Jobs's family released a statement that said: "Steve died peacefully today surrounded by his family. In his public life, Steve was known as a visionary; in his private life, he cherished his family. We are thankful to the many people who have shared their wishes and prayers during the last year of Steve's illness; a Web site will be provided for those who wish to offer tributes and memories."

On the home page of Apple's site, product images were replaced with a black-and-white photo of Mr. Jobs.

Mr. Jobs's decision to step down in August inspired loving tributes to him on the Web and even prompted some fans to head to Apple stores to share their sentiments with others. Some compared him to legendary innovators like Thomas Edison.

[Sep 10, 2011] Michael Hart (1947 - 2011) Prophet of Abundance by Glyn Moody

September 08, 2011 | Open Enterprise

I've never written an obituary before in these pages. Happily, that's because the people who are driving the new wave of openness are relatively young, and still very much alive. Sadly, one of the earliest pioneers, Michael Hart, was somewhat older, and died on Tuesday at the age of just 64.

What makes his death particularly tragic is that his name is probably only vaguely known, even to people familiar with the areas he devoted his life to: free etexts and the public domain. In part, that was because he was modest, content with only the barest recognition of his huge achievements. It was also because he was so far ahead of his times that there was an unfortunate disconnect between him and the later generation that built on his trailblazing early work.

To give an idea of how visionary Hart was, it's worth bearing in mind that he began what turned into the free etext library Project Gutenberg in 1971 - fully 12 years before Richard Stallman began to formulate his equivalent ideas for free software. Here's how I described the rather extraordinary beginnings of Hart's work in a feature I wrote in 2006:

In 1971, the year Richard Stallman joined the MIT AI Lab, Michael Hart was given an operator's account on a Xerox Sigma V mainframe at the University of Illinois. Since he estimated this computer time had a nominal worth of $100 million, he felt he had an obligation to repay this generosity by using it to create something of comparable and lasting value.

His solution was to type in the US Declaration of Independence, roughly 5K of ASCII, and to attempt to send it to everyone on ARPANET (fortunately, this trailblazing attempt at spam failed). His insight was that once turned from analogue to digital form, a book could be reproduced endlessly for almost zero additional cost - what Hart termed "Replicator Technology". By converting printed texts into etexts, he was able to create something whose potential aggregate value far exceeded even the heady figure he put on the computing time he used to generate it.

Hart chose the name "Project Gutenberg" for this body of etexts, making a bold claim that they represented the start of something as epoch-making as the original Gutenberg revolution.

Naturally, in preparing to write that feature for LWN.net, I wanted to interview Hart to find out more about him and his project, but he was very reluctant to answer my questions directly - I think because he was uncomfortable with being placed in the spotlight in this way. Instead, he put me on his mailing list, which turned out to be an incredible cornucopia of major essays, quick thoughts, jokes and links that he found interesting.

In one of those messages, he gave a good explanation of what he believed his Project Gutenberg would ultimately make possible:

Today we have terabyte drives for under $100 that are just about the same size as the average book.

10 years ago, in 1999, most people were using gigabytes in their systems rather than terabytes.

10 years before that, in 1989, most people used megabytes.

10 years before that, in 1979, most people used kilobytes.

My predictions run up to about 2021, which would be around the 50th anniversary of that first eBook from 1971.

I predict there will be affordable petabytes in 2021.

If there are a billion eBooks by 2021, they should fit the new petabytes just fine, as follows:

Premise #1:

The average eBook in the plainest format takes a megabyte.

Premise #2:

There will be a billion eBooks in 2021 or shortly after.

Therefore:

A billion eBooks at a megabyte each takes one petabyte.

You will be able to carry all billion eBooks in one hand.
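
Hart's storage arithmetic is easy to check. A minimal Python sketch using only his two premises (decimal megabytes and petabytes, matching his round numbers):

# Premise #1: about a megabyte per plain-text eBook.
# Premise #2: a billion eBooks by around 2021.
ebook_size_bytes = 1_000_000
ebook_count = 1_000_000_000
petabyte = 10 ** 15

total_bytes = ebook_size_bytes * ebook_count
print(total_bytes == petabyte)    # True: the whole library fits in one petabyte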

As this makes clear, Hart was the original prophet of digital abundance, a theme that I and others are now starting to explore. But his interest in that abundance was not merely theoretical - he was absolutely clear about its technological, economic and social implications:

I am hoping that with a library this size that the average middle class person can afford, that the result will be an even greater overthrow of the previous literacy, education and other power structures than happened as direct results of The Gutenberg Press around 500 years ago.

Here are just a few of the highlights that may repeat:

1. Book prices plummet.

2. Literacy rates soar.

3. Education rates soar.

4. Old power structures crumble, as did The Church.

5. Scientific Revolution.

6. Industrial Revolution.

7. Humanitarian Revolution.

Part of those revolutions was what Hart called the "Post-Industrial Revolution", where the digital abundance he had created with Project Gutenberg would be translated into the analogue world thanks to more "replicators" - 3D printers such as the open source RepRap:

If we ... presume the world at large sees its first replicator around 2010, which is probably too early, given how long it took most other inventions to become visible to the world at large [usually 30 years according to thesis by Madelle Becker], we can presume that there will be replicators capable of using all the common materials some 34.5 years into the future from whatever time that may actually be.

Hence the date of 2050 for the possibility of some replicators to actually follow somebody home: if that hasn't already been made illegal by the fears of the more conservative.

Somewhere along the line there will also be demarcations of an assortment of boundaries between replicators who can only make certain products and those who can make new replicators, and a replicator that could actually walk around and follow someone, perhaps all the way home to ask if it could help.

The fact that it was ~30 years from the introduction of eBooks to those early Internet pioneers to the time Google made their big splashy billion dollar media blitz to announce their eBook project without any mention of the fact that eBooks existed in any previous incarnation, simply is additional evidence for an educated thesis mentioned above, that had previously predicted about a 30 year gap between the first public introductions and awareness by the public in general.

So, when you first start to see replicators out there set your alarm clocks for ~30 years, to remind you when you should see, if they haven't been made illegal already, replicators out for a walk in at least some neighborhoods.

Notice the comment "if that hasn't already been made illegal". This was another major theme in Hart's thinking and writings - that copyright laws have always been passed to stop successive waves of new technologies creating abundance:

We keep hearing about how we are in "The Information Age," but rarely is any reference made to any of four previously created Information Ages created by technology change that was as powerful in the day as the Internet is today.

The First Information Age, 1450-1710, The Gutenberg Press, reduced the price of the average books four hundred times. Stifled by the first copyright laws that reduced the books in print in Great Britain from 6,000 to 600, overnight.

The Second Information Age, 1830-1831, Shortest By Far The High Speed Steam Powered Printing Press Patented in 1830, Stifled By Copyright Extension in 1831.

The Third Information Age, ~1900, Electric Printing Press Exemplified by The Sears Catalog, the first book owned by millions of Americans. Reprint houses using such presses were stifled by the U.S. Copyright Act of 1909.

The Fourth Information Age, ~1970, The Xerox Machine made it possible for anyone to reprint anything. Responded to by the U.S. Copyright Act of 1976.

The Fifth Information Age, Today, The Internet and Web. Hundreds of thousands, perhaps even a million, books from A to Z are available either free of charge or at pricing, "Too Cheap To Meter" for download or via CD and DVD. Responded to by the "Mickey Mouse Copyright Act of 1998," The Digital Millennium Copyright Act, The Patriot Act and any number of other attempted restrictions/restructures.

Hart didn't just write about the baleful effect of copyright extensions, he also fought against them. The famous "Eldred v Ashcroft" case in the US that sought to have such unlimited copyright extensions declared unconstitutional originally involved Hart. As he later wrote:

Eldred v Ashcroft was previously labeled as in "Hart v Reno" before I saw that Larry Lessig, Esquire, had no intention of doing what I thought necessary to win. At that point I fired him and he picked up Eric Eldred as his current scapegoat du jour.

As this indicates, Hart was as uncompromising in his defense of the public domain as Stallman is of free software.

Most of his best writings are to be found in the emails that were sent out to his mailing list from time to time, although there is a Web page with links to a couple of dozen essays that are all well worth reading to get a feeling for the man and his mind. There are also more of his writings on the Project Gutenberg site, as well as a useful history of the project.

However, it's hugely regrettable that Hart never published his many and wide-ranging insights as a coherent set of essays, since this has led to a general under-appreciation of the depth of his thinking and the crucial importance of his achievements. Arguably he did more for literature (and literacy) than any Nobel laureate in that subject ever will.

Fortunately, Project Gutenberg, which continues to grow and broaden its collection of freely-available texts in many languages, stands as a fitting and imperishable monument to a remarkable human being who not only gave the world great literature in abundance, but opened our eyes to the transformative power of abundance itself.


[May 07, 2011] Gates, Woz, and the last 2,000 years of computing • The Register

If you program it, they will come

By Gavin Clarke in Mountain View

Posted in Music and Media, 21st January 2011 19:58 GMT

It's weird to see something from your childhood displayed as an ancient cultural artifact. Here at the newly refurbished Computer History Museum in Mountain View, California, I'm standing over a glass case that houses the Commodore 64, the same machine I begged my parents to buy me for Christmas in 1983.

Compared to today's slick and smart personal computers, the Commodore 64 is the village idiot. With its 8-bit, 1MHz MOS 6510 processor and 64KB of memory, the only thing chunky about this machine was its famous built-in keyboard. But the Commodore 64 was in the vanguard of a revolution, one that took computers into people's homes by making them mass-produced, affordable, and usable by people without maths degrees or special training.

The Commodore even bested the iconic Apple II – or Apple ][, as fanbois of the day wrote its name – which was designed by that company's larger-than-life cofounder Steve Wozniak. When Apple's pioneering desktop debuted in 1977, the entry-level version with a mere 4KB of RAM cost $1,298 – significantly more expensive than the Commodore 64's 1982 intro price of $595, which later dropped to $199. The Commodore 64 was so popular that sales estimates range between 17 and 22 million units during its 11-year run.

The Commodore 64 is now among 1,100 objects that comprise a new exhibition called Revolution: The First 2,000 Years of Computing at the Computer History Museum. Assembled by an army of 300 people over eight years, Revolution fills 25,000 square feet and is the crown jewel of a $19m renovation of a museum that's been an easy-to-miss landmark of Silicon Valley – the spiritual home of the tech industry – since 1999.

$19m is a hefty dose of private philanthropy by any standard, and one that's all the more impressive given that it came from an industrial sector famed for entrepreneurs and engineers obsessed by the future, not the past. Among the donors to Revolution is Bill Gates, who also provided the establishing gift: the BASIC interpreter tape he wrote for the MITS Altair 8800 while at Harvard in 1975, which led to Microsoft and Windows.

Museum president and chief executive John Hollar told The Reg on a tour ahead of Revolution's opening that the exhibit centers on thematic moments in computing. "If you knit all those together, you get an interesting picture of where we are today," he said.

Revolution features pieces of the 700-square-foot, 30-ton ENIAC – Electrical Numerical Integrator And Computer – built by the US government between 1943 and 1945 to calculate missile trajectories. Not only was ENIAC the first general-purpose computer to run at "electronic speed" because it lacked mechanical parts, it was also programmed entirely by a staff of six female mathematicians who lacked any manuals and worked purely by deciphering logical and block diagrams.

There's also an IBM/360 from 1964, the first general-purpose computer for businesses that killed custom systems such as ENIAC that were built by and for governments. IBM staked its future on the IBM/360 – in today's dollars the project would cost $80bn.

Revolution is home to one of Google's first rack towers, dating from 1999. Spewing Ethernet cabling down its front, the tower helped establish Google as a search colossus whose thumb is now on the throat of the web and society, choking out $23bn a year from online ads.

Is small and portable more your thing? There's the PalmPilot prototype and the original card and wood mock-up donated by Palm co-founder and Apple graduate Donna Dubinsky. With its stylus input, the PalmPilot became the first widely popular handheld device. The original models, the Pilot 1000 and Pilot 5000, predated Apple's finger-poking iPad by 14 years.

Revolution houses analogue devices that are more like workbenches, and is home to the first Atari Pong arcade game in its plywood case (which ignited the video-game revolution), a gold-colored Bandai Pippin from Apple (which disappeared without a trace), the Apple II, and the Altair that inspired Gates to eventually build the world's largest software company.

While you can't view the actual BASIC interpreter tape that Gates wrote, the code has been immortalized in a huge glass plaque in the newly minted, airy reception area. Nerds take note: the reception area's tiled floor holds a punch-card design – work out what it says and you win a prize.

"Knitting it all together," as Hollar puts it, means you shouldn't be surprised to see that 1999 Google rack server in an exhibit that goes back 2,000 years to the abacus.

"Will Google be remembered 100 years from now? That's hard to say," Hollar told us. "But what's more likely is what Google represents is with us forever - which is finding what you want, when you want it, where you are, and having an expectation that that information is going to be instantaneously available to you. That's unleashed a new kind of human freedom and those powerful forces that affect people at a personal and human level – they don't get put back in the box."

Revolution doesn't just show objects that are important to computing, such as the Google rack, it also tells the stories of their creation and their creators. Not just familiar names such as Alan Turing, but also the ENIAC women whose job title was "computer" and who were classified as "sub professional" by the army and disregarded by their snotty male managers.

Also featured are people such as integrated-circuit inventor Jack Kilby, whose bosses at Texas Instruments told him in 1958 not to bother with his project. Kilby pressed on regardless during his summer holidays, and presented the top brass with the finished article when they returned from their undeserved time away. Such jaw-dropping tales of achievement against all odds are explained with the assistance of 5,000 images, 17 commissioned films, and 45 interactive screens you can poke at and scroll through.

Thanks to its location in the heart of Silicon Valley, down the road from Apple, Intel, HP, and Xerox PARC – companies whose ideas or products now dominate our daily lives – it would be easy for the museum to present only a local feel to the story of computing, and to give a US-centric bias. With displays from Britain, France, Japan, Korea, and Russia, however, Revolution looks beyond the boundaries of Silicon Valley and the US.

A good example of the global focus is the story of LEO, the Lyons Electronic Office, one of the least-known British entries in the history of computing.

Back when Britain ran an Empire, J Lyons & Company was a huge food and catering company famed for running a network of nationwide teashops that refreshed stiff upper lips from Piccadilly to the provinces with cups of tea for the price of just over two old pennies.

Then, as now, catering had extremely narrow margins, and J Lyons took an early interest in computers to help automate its back office and improve its bottom line. Specifically, J Lyons was watching the work of Sir Maurice Wilkes at Cambridge University, where he was building EDSAC, the Electronic Delay Storage Automatic Calculator, which performed its first calculation in 1949. EDSAC was a pre-semiconductor dinosaur, using 3,000 vacuum valves and 12 racks, with mercury delay lines for memory, and it lumbered along at a then mind-blowing 650 operations per second.

Lyons copied EDSAC and came up with LEO in 1951, regarded as the first computer for business. LEO ran the first routine office jobs - payroll and inventory management - while Lyons also used LEO for product development, calculating different tea blends.

Lyons quickly realized the potential of office automation, built the LEO II and formed LEO Computers, which went on to build and install LEO IIs for the British arm of US motor giant Ford Motor Company, the British Oxygen Company, HM Customs & Excise, and the Inland Revenue. LEO computers were exported to Australia, South Africa, and Czechoslovakia - at the height of the Cold War. By 1968 LEO was incorporated into ICL, one of Britain's few computer and services companies, now Fujitsu Services.

The lion goes to pieces

Two surviving fragments of the once mighty LEO were acquired for Revolution at auction in London: a cracked vacuum tube and a battered, grey-blue metal control box with buttons and flip switches that's more car part than computer component.

Alex Bochannek, the museum curator who showed The Reg around Revolution, told us he feels "particularly strongly" about LEO. "Our collecting scope is global in nature and the story of LEO is such a fascinating one, yet almost completely unknown in this country. This is why we decided to add these objects to the collection specifically for Revolution."

The $19m renovation and Revolution make the museum an attractive destination for visitors of all kinds and ages - engineers, non-techies, tourists, and those interested in the history of women in the workplace, to name a few. The museum is also trying to raise its game in academic circles, and Hollar wants to turn it into the premier center on the history of computing.

Just 2 per cent of the museum's entire collection is on display in Revolution, with the plan to make the rest available to a worldwide audience through a new web site in March.

"[The museum] will be seen as a destination of important papers and other important artifacts located around the world for that definitive collection point of oral histories of people whose stories need to be recorded," Hollar said.

After being in the Valley for 12 years, why is now the right time to plough $19m into Revolution? According to Hollar, the time is ripe because of the ubiquity of computing online and in our pockets: we need to understand the journey from moving mechanical parts to digitization, from room-sized single-purpose dinosaurs to the multifunction iPhone, from switches and flashing lights to the keyboard and screen.

The entrepreneurs and engineers of Silicon Valley could also learn a thing or two by examining their past. "Some people in Silicon Valley believe they don't look backward, they only look forwards, but some people here who are very successful do understand they are part of something larger," Hollar said.

I hear echoes of British wartime leader Winston Churchill, who was fond of George Santayana's sentiment that those who fail to learn from history are doomed to repeat it. In this case, however, they might also miss new opportunities.

"The Google search engine is based on a very simple analogy that academic articles are known to become authoritative the more they are cited," Hollar said. "By making that analogy, Larry Page working with Brin took search from looking for words on a page to looking for something that's really important to you."

As more and more of the pioneers of modern computing age and pass away - Sir Wilkes died last year, and just one of the ENIAC women remains with us - there must surely be among them a growing desire for something tangible that preserves and records their achievements. It would be ironic if those obsessed with digitizing and recording data failed to record their own stories, and if those stories slipped into an oral tradition or - worse - a Wikipedia-style group consensus of history where facts are relative and received secondhand. How many more LEOs are out there in the auction houses of the world, waiting to be clawed back?

Was such a concern for legacy behind Gates' donation to the museum?

"I think he's proud of what that little piece of tape represents," Hollar said. "That's the essence of the very first days of Microsoft, and if you talk to Bill and [cofounder] Paul Allen about it they are very aware now they are at a point in their lives where they are very aware that what they did was important and it needs to be preserved.

"I'm very glad they see the museum as the place where they want that to happen."

I just wonder if Gates feels as weird about all this as I did.

[Apr 04, 2010] Microsoft founders lead tributes to 'father of the PC'

BBC News

The "father of the personal computer" who kick-started the careers of Microsoft founders Bill Gates and Paul Allen has died at the age of 68.

Dr Henry Edward Roberts was the inventor of the Altair 8800, a machine that sparked the home computer era.

Gates and Allen contacted Dr Roberts after seeing the machine on the front cover of a magazine and offered to write software for it.

The program was known as Altair-Basic, the foundation of Microsoft's business.

"Ed was willing to take a chance on us - two young guys interested in computers long before they were commonplace - and we have always been grateful to him," the Microsoft founders said in a statement.

"The day our first untested software worked on his Altair was the start of a lot of great things."

Apple co-founder Steve Wozniak told technology website CNET that Dr Roberts had taken " a critically important step that led to everything we have today".

'Fond memories'

Dr Roberts was the founder of Micro Instrumentation and Telemetry Systems (MITS), originally set up to sell electronics kits to model rocket hobbyists.

The company went on to sell electronic calculator kits, but was soon overshadowed by bigger firms.

In the mid-1970s, with the firm struggling with debt, Dr Roberts began to develop a computer kit for hobbyists.

The result was the Altair 8800, a machine operated by switches and with no display.

It was built around the then cutting-edge Intel 8080 microprocessor.

The $395 kit (around £1,000 today) was featured on the cover of Popular Electronics in 1975, prompting a flurry of orders. It was also sold assembled for an additional $100 charge.

Amongst those interested in the machine were Paul Allen and Bill Gates.

The pair contacted Dr Roberts, offering to write software code that would help people program the machine.

The pair eventually moved to Albuquerque - the home of MITS - where they founded Micro-Soft, as it was then known, to develop their software: a variant of the Beginner's All-purpose Symbolic Instruction Code (Basic).

"We will always have many fond memories of working with Ed in Albuquerque, in the MITS office right on Route 66 - where so many exciting things happened that none of us could have imagined back then," the pair said.

Dr Roberts sold his company in 1977.

He died in hospital on 1 April after a long bout of pneumonia.

[Aug 1, 2008] Author Wallace Says Gates Surrounds Himself With Smart People

July 31 (Bloomberg) -- Author James Wallace, a reporter at the Seattle Post-Intelligencer, talks with Bloomberg's Tom Keene about Microsoft Corp.'s strategy and competition with Google Inc., Boeing Co.'s performance, and the shortage of engineers in the U.S. James Wallace and Jim Erickson co-wrote the best seller "Hard Drive: Bill Gates & the Making of the Microsoft Empire," published in 1992.


[Jul 23, 2008] Randy Pausch, whose 'last lecture' became a sensation, dies -- chicagotribune.com

Randy Pausch, a terminally ill professor whose earnest farewell lecture at Carnegie Mellon University became an Internet phenomenon and bestselling book that turned him into a symbol for living and dying well, died Friday. He was 47.

Pausch, a computer science professor and virtual-reality pioneer, died at his home in Chesapeake, Va., of complications from pancreatic cancer, the Pittsburgh university announced.

When Pausch agreed to give the talk, he was participating in a long-standing academic tradition that calls on professors to share their wisdom in a theoretical "last lecture." A month before the speech, the 46-year-old Pausch was told he had only months to live, a prognosis that heightened the poignancy of his address.

Originally delivered last September to about 400 students and colleagues, his message about how to make the most of life has been viewed by millions on the Internet. Pausch gave an abbreviated version of it on "Oprah" and expanded it into a best-selling book, "The Last Lecture," released in April.


Yet Pausch insisted that both the spoken and written words were designed for an audience of three: his children, then 5, 2 and 1.

"I was trying to put myself in a bottle that would one day wash up on the beach for my children," Pausch wrote in his book.

Unwilling to take time from his family to pen the book, Pausch hired a coauthor, Jeffrey Zaslow, a Wall Street Journal writer who had covered the lecture. During more than 50 bicycle rides crucial to his health, Pausch spoke to Zaslow on a cellphone headset.

"The speech made him famous all over the world," Zaslow told The Times. "It was almost a shared secret, a peek into him telling his colleagues and students to go on and do great things. It touched so many people because it was authentic."

Thousands of strangers e-mailed Pausch to say they found his upbeat lecture, laced with humor, to be inspiring and life-changing. They drank up the sentiments of a seemingly vibrant terminally ill man, a showman with Jerry Seinfeld-esque jokes and an earnest Jimmy Stewart delivery.

If I don't seem as depressed or morose as I should be, sorry to disappoint you.

He used that line after projecting CT scans, complete with helpful arrows pointing to the tumors on his liver as he addressed "the elephant in the room" that made every word carry more weight.

Some people believe that those who are dying may be especially insightful because they must make every moment count. Some are drawn to valedictories like the one Pausch gave because they offer a spiritual way to grapple with mortality that isn't based in religion.

Sandra Yarlott, director of spiritual care at UCLA Medical Center, said researchers, including Elisabeth Kubler-Ross, have observed that work done by dying patients "resonates with people in that timeless place deep within."

As Pausch essentially said goodbye at Carnegie Mellon, he touched on just about everything but religion as he raucously relived how he achieved most of his childhood dreams. His ambitions included experiencing the weightlessness of zero gravity; writing an article in the World Book Encyclopedia ("You can tell the nerds early on," he joked); wanting to be both a Disney Imagineer and Captain Kirk from "Star Trek"; and playing professional football.

Onstage, Pausch was a frenetic verbal billboard, delivering as many one-liners as he did phrases to live by.

Experience is what you get when you didn't get what you wanted.

When his virtual-reality students at Carnegie Mellon won a flight in a NASA training plane that briefly simulates weightlessness, Pausch was told faculty members were not allowed to fly. Finding a loophole, he applied to cover it as his team's hometown Web journalist -- and got his 25 seconds of floating.

Since 1997, Pausch had been a professor of computer science, human-computer interaction and design at Carnegie Mellon. With a drama professor, he founded the university's Entertainment Technology Center, which teams students from the arts with those in technology to develop projects.

The popular professor had an "enormous and lasting impact" on Carnegie Mellon, said Jared L. Cohon, the university's president, in a statement. He pointed out that Pausch's "love of teaching, his sense of fun and his brilliance" came together in his innovative software program, Alice, which uses animated characters and storytelling to make it easier to learn to write computer code.

During the lecture, Pausch joked that he had become just enough of an expert to fulfill one childhood ambition. World Book sought him out to write its virtual-reality entry.

[Apr 24, 2008] Eliza's world by Nicholas Carr

April 11, 2008

Reposted from the new edition of Edge:

What is the compelling urgency of the machine that it can so intrude itself into the very stuff out of which man builds his world? - Joseph Weizenbaum

Somehow I managed to miss, until just a few days ago, the news that Joseph Weizenbaum had died. He died of cancer on March 5, in his native Germany, at the age of 85. Coincidentally, I was in Germany that same day, giving a talk at the CeBIT technology show, and - strange but true - one of the books I had taken along on the trip was Weizenbaum's Computer Power and Human Reason.

Born in 1923, Weizenbaum left Germany with his family in 1936, to escape the Nazis, and came to America. After earning a degree in mathematics and working on programming some of the earliest mainframes, he spent most of his career as a professor of computer science at MIT. He became - to his chagrin - something of a celebrity in the 1960s when he wrote the Eliza software program, an early attempt at using a computer to simulate a person. Eliza was designed to mimic the conversational style of a psychotherapist, and many people who used the program found the conversations so realistic that they were convinced that Eliza had a capacity for empathy.

The reaction to Eliza startled Weizenbaum, and after much soul-searching he became, as John Markoff wrote in his New York Times obituary, a "heretic" in the computer-science world, raising uncomfortable questions about man's growing dependence on computers. Computer Power and Human Reason, published in 1976, remains one of the best books ever written about computing and its human implications. It's dated in some of its details, but its messages seem as relevant, and as troubling, as ever. Weizenbaum argued, essentially, that computers impose a mechanistic point of view on their users - on us - and that that perspective can all too easily crowd out other, possibly more human, perspectives.

The influence of computers is hard to resist and even harder to escape, wrote Weizenbaum:

The computer becomes an indispensable component of any structure once it is so thoroughly integrated with the structure, so enmeshed in various vital substructures, that it can no longer be factored out without fatally impairing the whole structure. That is virtually a tautology. The utility of this tautology is that it can reawaken us to the possibility that some human actions, e.g., the introduction of computers into some complex human activities, may constitute an irreversible commitment. . . . The computer was not a prerequisite to the survival of modern society in the post-war period and beyond; its enthusiastic, uncritical embrace by the most "progressive" elements of American government, business, and industry quickly made it a resource essential to society's survival in the form that the computer itself had been instrumental in shaping.

The machine's influence shapes not only society's structures but the more intimate structures of the self. Under the sway of the ubiquitous, "indispensable" computer, we begin to take on its characteristics, to see the world, and ourselves, in the computer's (and its programmers') terms. We become ever further removed from the "direct experience" of nature, from the signals sent by our senses, and ever more encased in the self-contained world delineated and mediated by technology. It is, cautioned Weizenbaum, a perilous transformation:

Science and technology are sustained by their translations into power and control. To the extent that computers and computation may be counted as part of science and technology, they feed at the same table. The extreme phenomenon of the compulsive programmer teaches us that computers have the power to sustain megalomaniac fantasies. But the power of the computer is merely an extreme version of a power that is inherent in all self-validating systems of thought. Perhaps we are beginning to understand that the abstract systems - the games computer people can generate in their infinite freedom from the constraints that delimit the dreams of workers in the real world - may fail catastrophically when their rules are applied in earnest. We must also learn that the same danger is inherent in other magical systems that are equally detached from authentic human experience, and particularly in those sciences that insist they can capture the whole man in their abstract skeletal frameworks.

His own invention, Eliza, revealed to Weizenbaum the ease with which we will embrace a fabricated world. He spent the rest of his life trying to warn us away from the seductions of Eliza and her many friends. The quest may have been quixotic, but there was something heroic about it too.

See other appreciations of Weizenbaum by Andrew Brown, Jaron Lanier, and Thomas Otter.

[Apr 10, 2008] The creation of artificial stupidity reflects badly on the human race, by Andrew Brown

April 10 2008 | The Guardian

Joseph Weizenbaum, who died last month, was one of the computer scientists who changed the way we think. Unfortunately for all of us, he didn't change it in the way he wanted to. His family was driven from Germany by the Nazis in 1936, and by the early 1960s he was a professor at MIT, part of the first wave of brilliant programmers to whom it sometimes seemed that there was nothing that computers could not do. Contemporaries like John McCarthy and Marvin Minsky confidently predicted the emergence of "strong" human-like artificial intelligence (AI). Then, in 1965, Weizenbaum demonstrated artificial stupidity, and the world has never been the same since.

He wrote a program called Eliza, which would respond to sentences typed in at a terminal with sentences of its own that bore some relation to what had been typed in; it mimicked a completely non-directional psychotherapist, who simply encouraged the patient to ramble till they stumbled on the truth, or the end of the session. What happened, of course, was that some students started to confide in the program as if it were a real person.
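
To make concrete the kind of reflection Eliza performed, here is a minimal sketch in Python; the handful of patterns is invented for illustration and is far cruder than Weizenbaum's original script:

    import re

    # A few invented reflection rules in the spirit of Eliza (not Weizenbaum's script).
    RULES = [
        (r"i feel (.*)", "Why do you feel {0}?"),
        (r"i am (.*)", "How long have you been {0}?"),
        (r"my (.*)", "Tell me more about your {0}."),
        (r"(.*)", "Please go on."),              # fallback keeps the patient talking
    ]

    def eliza_reply(sentence: str) -> str:
        text = sentence.lower().strip(".!?")
        for pattern, template in RULES:
            match = re.match(pattern, text)
            if match:
                return template.format(*match.groups())
        return "Please go on."

    print(eliza_reply("I feel nobody listens to me"))
    # -> Why do you feel nobody listens to me?

Even a toy like this hints at why users read empathy into the program: the reply simply echoes the user's own words back as a question.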

Even professional psychiatrists were completely deceived. One of them wrote: "If the Eliza method proves beneficial then it would provide a therapeutic tool which can be made widely available to mental hospitals and psychiatric centres suffering a shortage of therapists ... several hundred patients an hour could be handled by a computer system." Clearly, this is not a misunderstanding of the particular powers of one program, but a much larger misunderstanding of what computers are and what we are.

For Weizenbaum this raised unsettling questions about what human understanding might be. Instead of building computers which were genuinely capable of understanding the world, his colleagues had simply redefined understanding and knowledge until they were things of which computers were, in principle, capable.

We live in a world full of Eliza's grandchildren now, a race of counterfeit humans. I am not thinking of the automated systems that appear to parse the things that we say on customer service hotlines, but the humans chained to scripts whom we eventually reach, trained to react like machines to anything that is said to them.

What made Weizenbaum such an acute critic was not just that he understood computers very well and was himself a considerable programmer. He shared the enthusiasms of his enemies, but unlike them he saw the limits of enthusiasm. Perhaps because of the circumstances of his family's expulsion from Germany, he saw very clearly that the values associated with science - curiosity, determination, hard work and cleverness - were not on their own going to make us happy or good. Scientists had been complicit, sometimes enthusiastically complicit, in the Nazi war machine, and now computer programmers were making possible the weapons that threaten all life on Earth. He was an early campaigner against anti-ballistic missile systems, because they would make war more likely.

He wrote a wonderful denunciation of the early hacking culture in his book, Computer Power and Human Reason:

"Bright young men of disheveled appearance, often with sunken glowing eyes, can be seen sitting at computer consoles, their arms tensed and waiting to fire their fingers at the buttons and keys on which their attention seems to be as riveted ... The hacker ... has only technique, not knowledge. He has nothing he can analyze or synthesize. His skill is therefore aimless, even disembodied. It is simply not connected with anything other than the instrument on which it may be exercised. His skill is like that of a monastic copyist who, though illiterate, is a first-rate calligrapher. His grandiose projects must therefore necessarily have the quality of illusions, indeed, of illusions of grandeur. He will construct the one grand system in which all other experts will soon write their systems."

But Weizenbaum did much more than that himself even if he wrote only one long book. His book has dated very little, and nothing else I've read shows so well how a humanist may love computers without idolising them.

thewormbook.com/helmintholog

Edge ELIZA'S WORLD by JARON LANIER, Computer Scientist and Musician; Columnist, Discover Magazine

We have lost a lion of Computer Science. Joseph Weizenbaum's life is proof that someone can be an absolute alpha-geek and a compassionate, soulful person at the same time. He displayed innovative courage in recognizing the seductive dangers of computation.

History will remember Weizenbaum as the clearest thinker about the philosophy of computation. A metaphysical confrontation dominated his interactions with the non-human centered mainstream. There were endless arguments about whether people were special in ways that cybernetic artifacts could never be. The mainstream preferred to sprinkle the magic dust of specialness on the "instruments," as Weizenbaum put it, instead of people.

But there was a less metaphysical side of Weizenbaum's thinking that is urgently applicable to the most pressing problems we all face right now. He warned that if you believe in computers too much, you lose touch with reality. That's the real danger of the magic dust so liberally sprinkled by the mainstream. We pass this fallacy from the lab out into the world. This is what apparently happened to Wall Street traders in fomenting a series of massive financial failures. Computers can be used rather too easily to improve the efficiency with which we lie to ourselves. This is the side of Weizenbaum that I wish was better known.

We wouldn't let a student become a professional medical researcher without learning about double blind experiments, control groups, placebos, the replication of results, and so on. Why is computer science given a unique pass that allows us to be soft on ourselves? Every computer science student should be trained in Weizenbaumian skepticism, and should try to pass that precious discipline along to the users of our inventions.

Weizenbaum's legacy includes an unofficial minority school in computer science that has remained human-centered. A few of the other members, in my opinion, are David Gelernter, Ted Nelson, Terry Winograd, Alan Kay, and Ben Shneiderman.

Everything about computers has become associated with youth. Turing's abstractions have been woven into a theater in which we can enjoy fantasies of eternal youth. We are fascinated by whiz kids and the latest young billionaires in Silicon Valley. We fantasize that we will be uploaded when the singularity arrives in order to become immortal, and so on. But when we look away from the stage for a moment, we realize that we computer scientists are ultimately people. We die.

The Machine That Made Us KEVIN KELLY Editor-At-Large, Wired; Author, New Rules for the New Economy

Computer scientist Joseph Weizenbaum recently passed away at the age of 85. Weizenbaum invented the famous Eliza chat bot forty years ago. Amazingly, this pseudo-AI still has the power both to amuse and to confuse us. But later in life Weizenbaum became a critic of artificial intelligence. He was primarily concerned about the pervasive conquest of our culture by the computational metaphor - the idea that everything interesting is computation - and worried that in trying to make thinking machines, we would become machines ourselves. Weizenbaum's death has prompted a review of his ideas set out in his book "Computer Power and Human Reason".

On the Edge, Nick Carr says this book "remains one of the best books ever written about computing and its human implications. It's dated in some of its details, but its messages seem as relevant, and as troubling, as ever. Weizenbaum argued, essentially, that computers impose a mechanistic point of view on their users - on us - and that that perspective can all too easily crowd out other, possibly more human, perspectives." He highlights one passage worth inspecting.

The computer becomes an indispensable component of any structure once it is so thoroughly integrated with the structure, so enmeshed in various vital substructures, that it can no longer be factored out without fatally impairing the whole structure. That is virtually a tautology. The utility of this tautology is that it can reawaken us to the possibility that some human actions, e.g., the introduction of computers into some complex human activities, may constitute an irreversible commitment. . . . The computer was not a prerequisite to the survival of modern society in the post-war period and beyond; its enthusiastic, uncritical embrace by the most "progressive" elements of American government, business, and industry quickly made it a resource essential to society's survival in the form that the computer itself had been instrumental in shaping.

That's an elegant summary of a common worry: we are letting the Machine take over, and taking us over in the process.

Reading this worry, I was reminded of a new BBC program called "The Machine That Made Us." This video series celebrates not the computer but the other machine that made us - the printing press. It's a four-part investigation into the role that printing has played in our culture. And it suggested to me that everything that Weizenbaum said about AI might be said about printing.

So I did a search-and-replace in Weizenbaum's text. I replaced "computer" with this other, older technology, "printing."

Printing becomes an indispensable component of any structure once it is so thoroughly integrated with the structure, so enmeshed in various vital substructures, that it can no longer be factored out without fatally impairing the whole structure. That is virtually a tautology. The utility of this tautology is that it can reawaken us to the possibility that some human actions, e.g., the introduction of printing into some complex human activities, may constitute an irreversible commitment. . . . Printing was not a prerequisite to the survival of modern society; its enthusiastic, uncritical embrace by the most "progressive" elements of government, business, and industry quickly made it a resource essential to society's survival in the form that the printing itself had been instrumental in shaping.

Stated this way, it's clear that printing is pretty vital and foundational, and it is. I could have done the same replacement with the technologies of "writing" or "the alphabet" - both equally transformative and essential to our society.

Printing, writing, and the alphabet did in fact bend the culture to favor themselves. They also made themselves so indispensable that we cannot imagine culture and society without them. Who would deny that our culture is unrecognizable without writing? And, as Weizenbaum indicated, the new embedded technology tends to displace the former mindset. Orality is gone, and our bookish culture is often at odds with oral cultures.

Weizenbaum's chief worry seems to be that we would become dependent on this new technology, and because it has its own agenda and self-reinforcement, it will therefore change us away from ourselves (whatever that may be).

All these are true. But as this exercise makes clear, we've gone through these kinds of self-augmenting transitions several times before, and I believe come out better for it. Literacy and printing have improved us, even though we left something behind.

Weizenbaum (and probably Carr) would have been one of those smart, well-meaning elder figures in ancient times preaching against the coming horrors of printing and books. They would highlight the loss of orality, and the way these new-fangled auxiliary technologies demean humanity. We have our own memories, people: use them! They would have been in good company, since even Plato lamented the same.

There may indeed be reasons to worry about AI, but the fact that AI and computers tend to be pervasive, indispensable, foundational, self-reinforcing, and irreversible are not reasons alone to worry. Rather, if the past history of printing and writing is any indication, they are reasons to celebrate. With the advent of ubiquitous computation we are about to undergo another overhaul of our identity.

[Feb 6, 2008] Industry milestone: DNS turns 25

02/06/08 | Network World

The Domain Name System turned 25 last week.

Paul Mockapetris is credited with creating DNS 25 years ago and with successfully testing the technology in June 1983, according to several sources.

The anniversary of the technology that underpins the Internet -- and prevents Web surfers from having to type a string of numbers when looking for their favorite sites -- reminds us how network managers can't afford to overlook even the smallest of details. Now in all honesty, DNS has been on my mind lately because of a recent film that used DNS and network technology in its plot, but savvy network managers have DNS on the mind daily.

DNS is often referred to as the phone book of the Internet: it matches a name to an IP address and makes sure people and devices requesting an address actually arrive at the right place. And if the servers hosting DNS are configured wrong, networks can be susceptible to downtime and attacks, such as DNS poisoning.
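
To see the "phone book" in action, a few lines of Python using only the standard library resolve a name to its addresses (the hostname here is just a placeholder; substitute any name you like):

    import socket

    # Resolve a hostname to its IP addresses - the everyday job of DNS.
    hostname = "example.com"
    addresses = {info[4][0] for info in socket.getaddrinfo(hostname, None)}

    for address in sorted(addresses):
        print(f"{hostname} -> {address}")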

And in terms of managing networks, DNS has become a critical part of many IT organizations' IP address management strategies. And with voice-over-IP and wireless technologies ramping up the number of IP addresses that need to be managed, IT staff are learning they need to also ramp up their IP address management efforts. Companies such as Whirlpool are on top of IP address management projects, but industry watchers say not all IT shops have that luxury.

"IP address management sometimes gets pushed to the back burner because a lot of times the business doesn't see the immediate benefit -- until something goes wrong," says Larry Burton, senior analyst with Enterprise Management Associates.

And the way people are doing IP address management today won't hold up under the proliferation of new devices, an update to the Internet Protocol (from IPv4 to IPv6) and the compliance requirements that demand detailed data on IP addresses.

"IP address management for a lot of IT shops today is manual and archaic. It is now how most would say to manage a critical network service," says Robert Whiteley, a senior analyst at Forrester Research. "Network teams need to fix how they approach IP address management to be considered up to date."

And those looking to overhaul their approach to IP address management might want to consider migrating how they do DNS and DHCP services as well. While these functions can be handled by separate platforms -- although integration among them is a must -- some experts say that, while updating how they manage IP addresses, network managers should also take a look at their DNS and DHCP infrastructure.

"Some people think of IP address management as the straight up managing of IP addresses and others incorporate the DNS/DHCP infrastructure, says Lawrence Orans, research director at Gartner. "If you are updating how you manage IPs it's a good time to also see if how you are doing DNS and DHCP needs an update."

Low-tech Magazine: Email in the 18th century

More than 200 years ago it was already possible to send messages throughout Europe and America at the speed of an aeroplane – wireless, and without the need for electricity.

Email leaves all other communication systems far behind in terms of speed. But the principle of the technology – forwarding coded messages over long distances – is nothing new. It has its origins in the use of plumes of smoke, fire signals and drums, thousands of years before the start of our era. Coded long distance communication also formed the basis of a remarkable but largely forgotten communications network that prepared the arrival of the internet: the optical telegraph.

(Maps and picture : Ecole Centrale de Lyon)


Throughout history, long distance communication was a matter of patience – lots of patience. Postmen existed long before humans could write, but the physical transport of spoken or written messages was always limited by the speed of the messenger. Humans or horses can maintain a speed of 5 or 6 kilometres an hour over long distances. If they walked 10 hours a day, transmitting a message from Paris to Antwerp took about a week.

Already in antiquity, post systems were designed that made use of relays of postmen. In these stations, the message was transferred to another runner or rider, or the horseman could change his horse. These organised systems greatly increased the speed of the postal services. The average speed of a galloping horse is 21 kilometres an hour, which means that the travel time between Paris and Antwerp could be cut to a few days. A carrier pigeon was twice as fast, but less reliable. Intercontinental communication was limited to the speed of shipping.

A chain of towers

Centuries of slow long-distance communications came to an end with the arrival of the telegraph. Most history books start this chapter with the appearance of the electrical telegraph in the middle of the nineteenth century. However, they skip an important intermediate step. Fifty years earlier (in 1791) the Frenchman Claude Chappe developed the optical telegraph. Thanks to this technology, messages could be transferred very quickly over long distances, without the need for postmen, horses, wires or electricity.

The optical telegraph network consisted of a chain of towers, placed 5 to 20 kilometres apart from each other. On each of these towers a wooden semaphore and two telescopes were mounted (the telescope was invented in 1600). The semaphore had two signalling arms, each of which could be placed in seven positions. The wooden post itself could also be turned to 4 positions, so that 196 different positions were possible. Every one of these arrangements corresponded to a code for a letter, a number, a word or (a part of) a sentence.

1,380 kilometres an hour

Every tower had a telegrapher, looking through the telescope at the previous tower in the chain. If the semaphore on that tower was put into a certain position, the telegrapher copied that symbol on his own tower. Next he used the telescope to look at the succeeding tower in the chain, to check whether the next telegrapher had copied the symbol correctly. In this way, messages were relayed symbol by symbol from tower to tower. The semaphore was operated by two levers. A telegrapher could reach a speed of 1 to 3 symbols per minute.

The technology may sound a bit absurd today, but in those times the optical telegraph was a genuine revolution. In a few decades, continental networks were built both in Europe and the United States. The first line was built between Paris and Lille during the French revolution, close to the frontline. It was 230 kilometres long and consisted of 15 semaphores. The very first message – a military victory over the Austrians – was transmitted in less than half an hour. The transmission of 1 symbol from Paris to Lille could happen in ten minutes, which comes down to a speed of 1,380 kilometres an hour – faster than a modern passenger plane, which would not be invented for another century and a half.
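
The figures quoted above are easy to reproduce; a short Python sketch (a rough reconstruction using the article's own numbers) recovers both the 196-symbol code space and the 1,380 km/h effective speed:

    # Rough reconstruction of the article's own figures.
    arm_positions = 7          # each of the two signalling arms
    post_positions = 4         # the wooden post itself
    code_space = arm_positions * arm_positions * post_positions
    print(code_space)          # -> 196 distinct symbols

    distance_km = 230          # length of the Paris-Lille line
    minutes_per_symbol = 10    # time for one symbol to traverse the whole chain
    speed_kmh = distance_km / (minutes_per_symbol / 60)
    print(round(speed_kmh))    # -> 1380 km/h effective signalling speed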

From Amsterdam to Venice

The technology expanded very fast. In less than 50 years the French built a national infrastructure with more than 530 towers and a total length of almost 5,000 kilometres. Paris was connected to Strasbourg, Amsterdam, Toulon, Perpignan, Lyon, Turin, Milan and Venice. At the beginning of the 19th century, it was possible to wirelessly transmit a short message from Amsterdam to Venice in one hour. A few years before, a messenger on a horse would have needed at least a month to do the same.

The system was copied on a large scale in other countries. Sweden developed a country-wide network, followed by parts of England and North America. A bit later, Spain, Germany and Russia also constructed large optical telegraph infrastructures. Most of these countries devised their own variations on the optical telegraph, using shutters instead of arms, for example. Sweden developed a system that was twice as fast; Spain built a telegraph that was windproof. Later the optical telegraph was also put into service in shipping and rail traffic.

A real European network never really existed. The connection between Amsterdam and Venice existed for only a short period. When Napoleon was chased out of the Netherlands, his telegraph network was dismantled. The Spanish, on the other hand, started too late: their nationwide network was only finished when the technology had started to fall into disuse in other countries. The optical telegraph network was used solely for military and national communications; individuals did not have access to it – although it was used for transmitting winning lottery numbers and stock market data. (Map: Ecole Centrale de Lyon)

Intercontinental communication

The optical telegraph disappeared as fast as it came, with the arrival of the electrical telegraph fifty years later. The last optical line in France was stopped in 1853; in Sweden the technology was used up to 1880. The electrical telegraph was not hindered by mist, wind, heavy rainfall or low-hanging clouds, and it could also be used at night. Moreover, the electrical telegraph was cheaper than the mechanical variant. Another advantage was that it was much harder to intercept a message – whoever knew the code of the optical telegraph could decipher the message. The electrical telegraph also made intercontinental communication possible, which was impossible with the optical telegraph (unless you made a large detour via Asia).

The electrical telegraph was the main means of communication for transmitting text messages over long distances for more than 100 years. At first, electrical wires were used; later on, radio waves were used to communicate. The first line was built in 1844; the first transatlantic connection was put into use in 1865. The telegraph made use of Morse code, where dots and dashes symbolize letters and numbers.
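
As a small illustration of the dot-and-dash encoding mentioned above, here is a fragment of the International Morse alphabet in Python (only the letters needed to spell one demo word):

    # A small subset of International Morse code, enough for a short demo word.
    MORSE = {
        "a": ".-", "e": ".", "g": "--.", "h": "....",
        "l": ".-..", "p": ".--.", "r": ".-.", "t": "-",
    }

    def to_morse(word: str) -> str:
        return " ".join(MORSE[ch] for ch in word.lower())

    print(to_morse("telegraph"))   # -> - . .-.. . --. .-. .- .--. ....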

Neither the telephone, nor the railroads, nor radio or television made the telegraph obsolete. The technology only died with the arrival of the fax and computer networks in the second half of the 20th century. In rail traffic and shipping, too, optical telegraphy was replaced by electronic variants, but in shipping the technology is still used in emergency situations (by means of flags or lamps).

Keyboard

The electrical telegraph is the immediate predecessor of e-mail and the internet. Since the thirties, it was even possible to transmit images. A variant equipped with a keyboard was also developed, so that the technology could be used by people without any knowledge of Morse code. The optical as well as the electrical telegraph are in essence the same technology as the internet and e-mail. All these means of communication make use of code language and intermediate stations to transmit information across large distances; the optical telegraph uses visual signs, the electrical telegraph dots and dashes, the internet ones and zeroes. Plumes of smoke and fire signals are also telegraphic systems – in combination with a telescope they would be as efficient as an optical telegraph.

Low-tech internet

Of course, e-mail is much more efficient than the optical telegraph. But that does not alter the fact that the low-tech predecessor of electronic mail more or less obtained the same result without wires or energy, while the internet consists of a cluster of cables and is devouring our energy resources at an ever faster pace.

© Kris De Decker (edited by Vincent Grosjean)

[Nov 10, 2007] MIT releases the sources of MULTICS, the father of UNIX!

November 10, 2007 | Jos Kirps's Popular Science and Technology Blog

This is extraordinary news for all nerds, computer scientists and the Open Source community: the source code of the MULTICS operating system (Multiplexed Information and Computing Service), the father of UNIX and all modern OSes, has finally been opened.

Multics was an extremely influential early time-sharing operating system, started in 1964, that introduced a large number of new concepts, including dynamic linking and a hierarchical file system. It was extremely powerful, and UNIX can in fact be considered a "simplified" successor to MULTICS (the name "Unix" is itself a hack on "Multics"). The last running Multics installation was shut down on October 31, 2000.

From now on, MULTICS can be downloaded from the following page (it's the complete MR12.5 source dumped at CGI in Calgary in 2000, including the PL/1 compiler):

http://web.mit.edu/multics-history

Unfortunately you can't install this on any PC, as MULTICS requires dedicated hardware, and there's no operational computer system today that could run this OS. Nevertheless the software should be considered an outstanding resource for computer researchers and scientists. It is not yet known whether it will be possible to emulate the required hardware to run the OS.

Special thanks to Tom Van Vleck for his continuous work on www.multicians.org, to the Group BULL including BULL HN Information Systems Inc. for opening the sources and making all this possible, to the folks at MIT for releasing it and to all of those who helped to convince BULL to open this great piece of computer history.

UNIX letters: Anti-Foreword by Dennis Ritchie

Dear Mr. Ritchie,

I heard a story from a guy in a UNIX sysadmin class, and was wondering if it was true.

The guy in this class told of a co-worker of his who was in a UNIX training class that got involved in UNIX bashing. You know, like why does the -i option for grep mean ignore case, and the -f option for sort mean ignore case, and so on. Well, the instructor of the course decided to chime in and said something like this:

"Here's another good example of this problem with UNIX. Take the find command for example. WHAT idiot would program a command so that you have to say -print to print the output to the screen. What IDIOT would make a command like this and not have the output go to the screen by default."

And the instructor went on and on, and vented his spleen...

The next morning, one of the ladies in the class raised her hand, the instructor called on her, and she proceeded to say something like this:

"The reason my father programmed the find command that way, was because he was told to do so in his specifications."

I've always wondered if this story was true, and who it was who wrote the find command. In the Oct. 94 issue of Byte they had an article on "UNIX at 25" which said that Dick Haight wrote the find command along with cpio, expr, and a lot of the include files for Version 7 of UNIX. I don't know where to send this message directly to Dick Haight, and I would appreciate it if you would forward it to him, if you are able. If you can't, well then I hope you liked the story. I got your mail address from "The UNIX-Haters Handbook", and would like to add this to your Anti-Foreword:
Until that frozen day in HELL occurs, and the authors of that book write a better operating system, I'm sticking with UNIX.

Sincerely,

Dan Bacus
[email protected].

From: [email protected]
To: danb
Date: Wed, 8 Feb 1995 21:20:30 EST
Subject: Re: story

Thanks for the story and the note.  Dick Haight was in what was
then probably called USG, for Unix Support Group (the name changed
as they grew).  Their major role was to support the system within
AT&T, and later to turn it into a real commercial product.  He was indeed
one of the major people behind find and cpio.  This group was distinct from
the research area where the system originated, and we were somewhat put
off by the syntax of their things.  However, they were clearly quite useful,
and they were accepted.

Dick left AT&T some years ago and I think he's somewhere in South
Carolina, but I don't have an e-mail address for him.  I'm not sure what
he thinks of find and cpio today.  That group always was more concerned
with specifications and the like than we were, but I don't know enough
about their internal interactions to judge how these commands evolved.
All of your story is consistent with what I know up to the punchline,
about which I can't render an opinion!

Thanks again for your note.

       Dennis

[Sep 24, 2007] Happy Birthday, Sputnik! (Thanks for the Internet) by Gary Anthes

September 24, 2007 (Computerworld)

Quick, what's the most influential piece of hardware from the early days of computing? The IBM 360 mainframe? The DEC PDP-1 minicomputer? Maybe earlier computers such as Binac, ENIAC or Univac? Or, going way back to the 1800s, is it the Babbage Difference Engine?

More likely, it was a 183-pound aluminum sphere called Sputnik, Russian for "traveling companion." Fifty years ago, on Oct. 4, 1957, radio-transmitted beeps from the first man-made object to orbit the Earth stunned and frightened the U.S., and the country's reaction to the "October surprise" changed computing forever.

Although Sputnik fell from orbit just three months after launch, it marked the beginning of the Space Age, and in the U.S., it produced angst bordering on hysteria. Soon, there was talk of a U.S.-Soviet "missile gap." Then on Dec. 6, 1957, a Vanguard rocket that was to have carried aloft the first U.S. satellite exploded on the launch pad. The press dubbed the Vanguard "Kaputnik," and the public demanded that something be done.



The most immediate "something" was the creation of the Advanced Research Projects Agency (ARPA), a freewheeling Pentagon office created by President Eisenhower on Feb. 7, 1958. Its mission was to "prevent technological surprises," and in those first days, it was heavily weighted toward space programs.

Speaking of surprises, it might surprise some to learn that on the list of people who have most influenced the course of IT -- people with names like von Neumann, Watson, Hopper, Amdahl, Cerf, Gates and Berners-Lee -- appears the name J.C.R. Licklider, the first director of IT research at ARPA.

Armed with a big budget, carte blanche from his bosses and an unerring ability to attract bright people, Licklider catalyzed the invention of an astonishing array of IT, from time sharing to computer graphics to microprocessors to the Internet.

J.C.R. Licklider

Indeed, although he left ARPA in 1964 and returned only briefly in 1974, it would be hard to name a major branch of IT today that Licklider did not significantly shape through ARPA funding -- all ultimately in reaction to the little Soviet satellite.

But now, the special culture that enabled Licklider and his successors to work their magic has largely disappeared from government, many say, setting up the U.S. once again for a technological drubbing. Could there be another Sputnik? "Oh, yes," says Leonard Kleinrock, the Internet pioneer who developed the principles behind packet-switching, the basis for the Internet, while Licklider was at ARPA. "But it's not going to be a surprise this time. We all see it coming."

The ARPA Way
Licklider had studied psychology as an undergraduate, and in 1962, he brought to ARPA a passionate belief that computers could be far more user-friendly than the unconnected, batch-processing behemoths of the day. Two years earlier, he had published an influential paper, "Man-Computer Symbiosis," in which he laid out his vision for computers that could interact with users in real time. It was a radical idea, one utterly rejected by most academic and industrial researchers at the time. (See sidebar, Advanced Computing Visions from 1960.)

Driven by the idea that computers might not only converse with their users, but also with one another, Licklider set out on behalf of ARPA to find the best available research talent. He found it at companies like the RAND Corp., but mostly he found it at universities, starting first at MIT and then adding to his list Carnegie Mellon University; Stanford University; University of California, Berkeley; the University of Utah; and others.


Advanced Computing Visions from 1960
Nearly a half-century ago, a former MIT professor of psychology and electrical engineering wrote a paper -- largely forgotten today -- that anticipated by decades the emergence of computer time sharing, networks and some features that even today are at the leading edge of IT.

Licklider wrote "Man-Computer Symbiosis" in 1960, at a time when computing was done by a handful of big, stand-alone batch-processing machines. In addition to predicting "networks of thinking centers," he said man-computer symbiosis would require the following advances:

  • Indexed databases. "Implicit in the idea of man-computer symbiosis are the requirements that information be retrievable both by name and by pattern and that it be accessible through procedures much faster than serial search."
  • Machine learning in the form of "self-organizing" programs. "Computers will in due course be able to devise and simplify their own procedures for achieving stated goals."
  • Dynamic linking of programs and applications, or "real-time concatenation of preprogrammed segments and closed subroutines which the human operator can designate and call into action simply by name."
  • More and better methods for input and output. "In generally available computers, there is almost no provision for any more effective, immediate man-machine communication than can be achieved with an electric typewriter."
  • Tablet input and handwriting recognition. "It will be necessary for the man and the computer to draw graphs and pictures and to write notes and equations to each other on the same display surface."
  • Speech recognition. "The interest stems from realization that one can hardly take a ... corporation president away from his work to teach him to type."


Licklider sought out researchers like himself: bright, farsighted and impatient with bureaucratic impediments. He established a culture and modus operandi -- and passed it on to his successors Ivan Sutherland, Robert Taylor, Larry Roberts and Bob Kahn -- that would make the agency, over the next 30 years, the most powerful engine for IT innovation in the world.

Recalls Kleinrock, "Licklider set the tone for ARPA's funding model: long-term, high-risk, high-payoff and visionary, and with program managers that let principal investigators run with research as they saw fit." (Although Kleinrock never worked at ARPA, he played a key role in the development of the ARPAnet, and in 1969, he directed the installation of the first ARPAnet node at UCLA.)

Leonard Kleinrock

From the early 1960s, ARPA built close relationships with universities and a few companies, each doing what it did best while drawing on the accomplishments of the others. What began as a simple attempt to link the computers used by a handful of U.S. Department of Defense researchers ultimately led to the global Internet of today.

Along the way, ARPA spawned an incredible array of supporting technologies, including time sharing, workstations, computer graphics, graphical user interfaces, very large-scale integration (VLSI) design, RISC processors and parallel computing (see DARPA's Role in IT Innovations). There were four ingredients in this recipe for success: generous funding, brilliant people, freedom from red tape and the occasional ascent to the bully pulpit by ARPA managers.

These individual technologies had a way of cross-fertilizing and combining over time in ways probably not foreseen even by ARPA managers. What would become the Sun Microsystems Inc. workstation, for example, owes its origins rather directly to a half-dozen major technologies developed at multiple universities and companies, all funded by ARPA. (See Timeline: Three Decades of DARPA Hegemony.)

Ed Lazowska, a computer science professor at the University of Washington in Seattle, offers this story from the 1970s and early 1980s, when Kahn was a DARPA program manager, then director of its Information Processing Techniques Office:

What Kahn did was absolutely remarkable. He supported the DARPA VLSI program, which funded the [Carver] Mead-[Lynn] Conway integrated circuit design methodology. Then he funded the SUN workstation at Stanford because Forest Baskett needed a high-resolution, bitmapped workstation for doing VLSI design, and his grad student, Andy Bechtolsheim, had an idea for a new frame buffer.

Meanwhile, [Kahn] funded Berkeley to do Berkeley Unix. He wanted to turn Unix into a common platform for all his researchers so they could share results more easily, and he also saw it as a Trojan horse to drive the adoption of TCP/IP. That was at a time when every company had its own networking protocol -- IBM with SNA, DEC with DECnet, the Europeans with X.25 -- all brain-dead protocols.

Bob Kahn

One thing Kahn required in Berkeley Unix was that it have a great implementation of TCP/IP. So he went to Baskett and Bechtolsheim and said, "By the way, boys, you need to run Berkeley Unix on this thing." Meanwhile, Jim Clark was a faculty member at Stanford, and he looked at what Baskett was doing with the VLSI program and realized he could take the entire rack of chips that were Baskett's graphics processor and reduce them to a single board. That's where Silicon Graphics came from.

All this stuff happened because one brilliant guy, Bob Kahn, cherry-picked a bunch of phenomenal researchers -- Clark, Baskett, Mead, Conway, [Bill] Joy -- and headed them off in complementary directions and cross-fertilized their work. It's just utterly remarkable.


Surprise?
The launch of the Soviet satellite Sputnik shocked the world and became known as the "October surprise." But was it really?

Paul Green

Paul Green was working at MIT's Lincoln Laboratory in 1957 as a communications researcher. He had learned Russian and was invited to give talks to the Popov Society, a group of Soviet technology professionals. "So I knew Russian scientists," Green recalls. "In particular, I knew this big-shot academician named [Vladimir] Kotelnikov."

In the summer of 1957, Green told Computerworld, a coterie of Soviet scientists, including Kotelnikov, attended a meeting of the International Scientific Radio Union in Boulder, Colo. Says Green, "At the meeting, Kotelnikov -- who, it turned out later, was involved with Sputnik -- just mentioned casually, 'Yeah, we are about to launch a satellite.'"

"It didn't register much because the Russians were given to braggadocio. And we didn't realize what that might mean -- that if you could launch a satellite in those days, you must have a giant missile and all kinds of capabilities that were scary. It sort of went in one ear and out the other."

And did he tell anyone in Washington? "None of us even mentioned it in our trip reports," he says.



DARPA Today
But around 2000, Kleinrock and other top-shelf technology researchers say, the agency, now called the Defense Advanced Research Projects Agency (DARPA), began to focus more on pragmatic, military objectives. A new administration was in power in Washington, and then 9/11 changed priorities everywhere. Observers say DARPA shifted much of its funding from long-range to shorter-term research, from universities to military contractors, and from unclassified work to secret programs.

Of government funding for IT, Kleinrock says, "our researchers are now being channeled into small science, small and incremental goals, short-term focus and small funding levels." The result, critics say, is that DARPA is much less likely today to spawn the kinds of revolutionary advances in IT that came from Licklider and his successors.

DARPA officials declined to be interviewed for this story. But Jan Walker, a spokesperson for DARPA Director Anthony Tether, said, "Dr. Tether ... does not agree. DARPA has not pulled back from long-term, high-risk, high-payoff research in IT or turned more to short-term projects." (See sidebar, DARPA's Response.)

A Shot in the Rear

David Farber, now a professor of computer science and public policy at Carnegie Mellon, was a young researcher at AT&T Bell Laboratories when Sputnik went up.

"We people in technology had a firm belief that we were leaders in science, and suddenly we got trumped," he recalls. "That was deeply disturbing. The Russians were considerably better than we thought they were, so what other fields were they good in?"

David Farber

Farber says U.S. university science programs back then were weak and out of date, but higher education soon got a "shot in the rear end" via Eisenhower's ARPA. "It provided a jolt of funding," he says. "There's nothing to move academics like funding."

Farber says U.S. universities are no longer weak in science, but they are again suffering from lack of funds for long-range research.

"In the early years, ARPA was willing to fund things like artificial intelligence -- take five years and see what happens," he says. "Nobody cared whether you delivered something in six months. It was, 'Go and put forth your best effort and see if you can budge the field.' Now that's changed. It's more driven by, 'What did you do for us this year?'"

DARPA's budget calls for it to spend $414 million this year on information, communications and computing technologies, plus $483 million more on electronics, including things such as semiconductors. From 2001 to 2004, the percentage going to universities shrank from 39% to 21%, according to the Senate Armed Services Committee. The beneficiaries have been defense contractors.

Victor Zue

Meanwhile, funding from the National Science Foundation (NSF) for computer science and engineering -- most of it for universities -- has increased from $478 million in 2001 to $709 million this year, up 48%. But the NSF tends to fund smaller, more-focused efforts. And because contract awards are based on peer review, bidders on NSF jobs are inhibited from taking the kinds of chances that Licklider would have favored.

"At NSF, people look at your proposal and assign a grade, and if you are an outlier, chances are you won't get funded," says Victor Zue, who directs MIT's 900-person Computer Science and Artificial Intelligence Laboratory, the direct descendent of MIT's Project MAC, which was started with a $2 million ARPA grant in 1963.

"At DARPA, at least in the old days, they tended to fund people, and the program managers had tremendous latitude to say, 'I'm just going to bet on this.' At NSF, you don't bet on something."


DARPA's Response
"We are confident that anyone who attended DARPATech [in Aug. 2007] and heard the speeches given by DARPA's [managers] clearly understands that DARPA continues to be interested in high-risk, high-payoff research," says DARPA spokesperson Jan Walker.

Walker offers the following projects as examples of DARPA's current research efforts:

  • Computing systems able to assimilate knowledge by being immersed in a situation
  • Universal [language] translation
  • Realistic agent-based societal simulation environments
  • Networks that design themselves and collaborate with application services to jointly optimize performance
  • Self-forming information infrastructures that automatically organize services and applications
  • Routing protocols that allow computers to choose the best path for traffic, and new methods for route discovery for wide area networks
  • Devices to interconnect an optically switched backbone with metropolitan-level IP networks
  • Photonic communications in a microprocessor having a theoretical maximum performance of 10 TFLOPS (trillion floating-point operations per second)

Farber sits on a computer science advisory board at the NSF, and he says he has been urging the agency to "take a much more aggressive role in high-risk research." He explains, "Right now, the mechanisms guarantee that low-risk research gets funded. It's always, 'How do you know you can do that when you haven't done it?' A program manager is going to tell you, 'Look, a year from now, I have to write a report that says what this contributed to the country. I can't take a chance that it's not going to contribute to the country.' "

A report by the President's Council of Advisors on Science and Technology, released Sept. 10, indicates that at least some in the White House agree. In "Leadership Under Challenge: Information Technology R&D in a Competitive World," John H. Marburger, science advisor to the president, said, "The report highlights in particular the need to ... rebalance the federal networking and IT research and development portfolio to emphasize more large-scale, long-term, multidisciplinary activities and visionary, high-payoff goals."

Still, turning the clock back would not be easy, says Charles Herzfeld, who was ARPA director in the mid-1960s. The freewheeling behavior of the agency in those days might not even be legal today, he adds. (See The IT Godfather Speaks: Q&A With Charles M. Herzfeld.)

No Help From Industry
The U.S. has become the world's leader in IT because of the country's unique combination of government funding, university research, and industrial research and development, says the University of Washington's Lazowska. But just as the government has turned away from long-range research, so has industry, he says.

According to the Committee on Science, Engineering and Public Policy at the National Academy of Sciences, U.S. industry spent more on tort litigation than on research and development in 2001, the last year for which figures are available. And more than 95% of that R&D is engineering or development, not long-range research, Lazowska says.

Ed Lazowska

"It's not looking out more than one product cycle; it's building the next release of the product," he says. "The question is, where do the ideas come from that allow you to do that five years from now? A lot of it has come from federally funded university research."

A great deal of fundamental research in IT used to take place at IBM, AT&T Inc. and Xerox Corp., but that has been cut way back, Lazowska says. "And of the new companies -- those created over the past 30 years -- only Microsoft is making significant investments that look out more than one product cycle."

Lazowska isn't expecting another event like Sputnik. "But I do think we are likely to wake up one day and find that China and India are producing far more highly qualified engineers than we are. Their educational systems are improving unbelievably quickly."

Farber also worries about those countries. His "Sputnik" vision is to "wake up and find that all our critical resources are now supplied by people who may not always be friendly." He recalls the book, The Japan That Can Say No (Simon & Schuster), which sent a Sputnik-like chill through the U.S. when it was published in 1991 by suggesting that Japan would one day outstrip the U.S. in technological prowess and thus exert economic hegemony over it.

"Japan could never pull that off because their internal markets aren't big enough, but a China that could say no or an India that could say no could be real," Farber says.

The U.S. has already fallen behind in communications, Farber says. "In computer science, we are right at the tender edge, although I do think we still have leadership there."


Science and Technology Funding by the U.S. Department of Defense (in millions)

Account                                  FY 2006 Level   FY 2007 Estimate   FY 2008 Request   $ Change FY 07 vs. FY 08   % Change FY 07 vs. FY 08
Total Basic Research                          $1,457           $1,563            $1,428              -$135                     -8.6%
Total Applied Research                        $4,948           $5,329            $4,357              -$972                     -18%
Total Advanced Technology Development         $6,866           $6,432            $4,987              -$1,445                   -22.4%
Total Science and Technology                 $13,272          $13,325           $10,772              -$2,553                   -19%

Source: The Computing Research Association

Some of the cutbacks in DARPA funding at universities are welcome, says MIT's Zue. "Our reliance on government funding is nowhere near what it was in 1963. In a way, that's healthy, because when a discipline matures, the people who benefit from it ought to begin paying the freight."

"But," Zue adds, "it's sad to see DARPA changing its priorities so that we can no longer rely on it to do the big things."

Gary Anthes is a Computerworld national correspondent.

[Apr 5, 2007] AlanTuring.net The Turing Archive for the History of Computing

The largest web collection of digital facsimiles of original documents by Turing and other pioneers of computing, plus articles about Turing and his work, including artificial intelligence.

Recently declassified, previously top-secret documents about codebreaking.

[Apr 5, 2007] Proceedings of the Symposium "Computer in Europe", 1998

Unique and little-known material.

[Apr 5, 2007] Andrei P. Ershov, Aesthetics and the human factor in programming, Communications of the ACM, v.15 n.7, p.501-505, July 1972

In 1988 the charitable Ershov Fund was founded. The main aim of the Fund was the development of informatics through invention, creative work, art and educational activity.

[Mar 20, 2007] Fortran creator John Backus dies by Brian Bergstein

The first FORTRAN compilers had pretty sophisticated optimization algorithms, and much of early compiler optimization research was done for Fortran compilers. More information can be found at The History of the Development of Programming Languages

March 20, 2007 (MSNBC.com)

John Backus, whose development of the Fortran programming language in the 1950s changed how people interacted with computers and paved the way for modern software, has died. He was 82.

Backus died Saturday in Ashland, Ore., according to IBM Corp., where he spent his career.

Prior to Fortran, computers had to be meticulously "hand-coded" - programmed in the raw strings of digits that triggered actions inside the machine. Fortran was a "high-level" programming language because it abstracted that work - it let programmers enter commands in a more intuitive system, which the computer would translate into machine code on its own.

The breakthrough earned Backus the 1977 Turing Award from the Association for Computing Machinery, one of the industry's highest accolades. The citation praised Backus' "profound, influential, and lasting contributions."

Backus also won a National Medal of Science in 1975 and got the 1993 Charles Stark Draper Prize, the top honor from the National Academy of Engineering.

"Much of my work has come from being lazy," Backus told Think, the IBM employee magazine, in 1979. "I didn't like writing programs, and so, when I was working on the IBM 701 (an early computer), writing programs for computing missile trajectories, I started work on a programming system to make it easier to write programs."

John Warner Backus was born in Wilmington, Del., in 1924. His father was a chemist who became a stockbroker. Backus had what he would later describe as a "checkered educational career" in prep school and the University of Virginia, which he left after six months. After being drafted into the Army, Backus studied medicine but dropped it when he found radio engineering more compelling.

Backus finally found his calling in math, and he pursued a master's degree at Columbia University in New York. Shortly before graduating, Backus toured the IBM offices in midtown Manhattan and came across the company's Selective Sequence Electronic Calculator, an early computer stuffed with 13,000 vacuum tubes. Backus met one of the machine's inventors, Rex Seeber - who "gave me a little homemade test and hired me on the spot," Backus recalled in 1979.

Backus' early work at IBM included computing lunar positions on the balky, bulky computers that were state of the art in the 1950s. But he tired of hand-coding the hardware, and in 1954 he got his bosses to let him assemble a team that could design an easier system.

The result, Fortran, short for Formula Translation, reduced the number of programming statements necessary to operate a machine by a factor of 20.

It showed skeptics that machines could run just as efficiently without hand-coding. A wide range of programming languages and software approaches proliferated, although Fortran also evolved over the years and remains in use.

Backus remained with IBM until his retirement in 1991. Among his other important contributions was a method for describing the particular grammar of computer languages. The system is known as Backus-Naur Form.

© 2007 The Associated Press. All rights reserved. This material may not be published, broadcast, rewritten or redistributed.

[Jan 12, 2007] Jeff Raikes interview -- the whole thing, by Jack Schofield

December 14, 2006 (Guardian Unlimited)

This week's interview, in the printed Guardian Technology section, is with Jeff Raikes, president of the Microsoft Business Division, and "a member of the company's Senior Leadership Team" with Bill Gates and Steve Ballmer. Obviously we don't have room to print more than 3,000 words, even if you have time to read it. However, if you do want more, what follows is an almost complete transcript. You don't often get Microsoft's most senior guys one-on-one, and they are rarely as forthcoming as Raikes was this time....
For searches: the topics covered include Office Genuine Advantage (piracy), the Office 2007 user interface (the ribbon), SharePoint Server, hosted Office and online applications, the new XML file formats, and the Office bundles....

To set the scene, it's around 8.20am at the QEII Conference Centre in London, where Microsoft is holding a conference for software partners. I'm setting up my tape, and one of the PRs is getting us cups of coffee. I'm telling Raikes that I used to run VisiCalc on an Apple II, so I remember he joined Microsoft from Apple in 1981. "Unlike most of the people I talk to nowadays, you've been in this business longer than I have!"

Jeff Raikes: [Laughs] I started on VisiCalc in June of 1980, I actually worked for Atari briefly, when I was finishing up college. I ended up spending more in the company store than I made in income, so it's probably a good thing I moved on. Atari at that time was owned by Warner, so you could buy all the music albums for like a dollar, and games machines for all my friends.

Jack Schofield: Before we get going, did you write the Gates memo?
JR: Which memo are you referring to?

JS: The 1985 memo that Bill Gates sent to Apple, saying "you ought to license Mac OS to make it an industry standard." (http://www.scripting.com/specials/gatesLetter/text.html)
JR: I did. It's funny, there's a great irony in that memo, in that I was absolutely sincere in wanting the Macintosh to succeed, because that was the heart of our applications business at the time. And Apple somehow decided it was a devious plot and that I was the devil....

The irony is that I think if they'd taken the advice in the memo, we'd probably have ended up seeing the Mac be more successful and Windows perhaps not quite as successful, so I guess it all worked out OK in the end!

JS: It was good advice: I always thought you were right!
JR: Thank you. I always thought it was a good memo, too, but if nobody did anything about it then perhaps it wasn't so good...

JS: And you're still in applications, which is amazing after all these years.
JR: It's amazing to see how much the opportunity has grown. If in 1981 we'd said that there would be 500 million people using Microsoft Office tools, people would have thought we were nuts. Yet today, I look at the landscape, at the broad opportunities of impacting how people find, use and share information, how they work together in a world where there's a lot of pressure; at the explosion of content, and how people manage content. And on the horizon, there's voice over IP and Unified Communications, and business intelligence, and software as a service to enhance the information work experience. So I look at it today, and I'm amazed at how much opportunity I had, and how much there is.
I've done different roles -- I spent eight or nine years with Steve Ballmer as we were building the worldwide sales and marketing organisation -- and when I returned to Office in 2000, some people thought there's not that much more to do. Quite the contrary, there was an incredible amount to do!

JS: Is that 500 million paid up users?
JR: Mmm, mmm, no, that's opportunity, Jack! [Laughs]

JS: Now you're getting the equivalent of Windows Genuine Advantage [piracy protection], which is going to be fun.
JR: We do have Office Genuine Advantage now, but it's not implemented exactly the same. Encouraging licensing is an important activity, but it's one of those things where you have to strike the right balance. We want to encourage usage of our software, and we want to make sure that those people who have licensed the software appropriately have a good experience.
I've lived on this copy protection thing since the 1980s, and it could be very obtrusive, so you have to strike the right balance. Office Genuine Advantage is a good approach in that it incents people to want to be licensed.

JS: What about your approach to licensing the ribbon from the Office 2007 user interface? What would happen if OpenOffice.org did a knock-off of Office 2007?
JR: Well, we'd just have to see. We have a certain responsibility to protect our intellectual property, and we try to do that in ways that are good for our customers and of course for our shareholders. So we've come up with a licensing programme [for the ribbon] and we'll see what others want to do. We have made no decisions yet as to exactly what we might do in a set of various scenarios.

JS: You seem to be offering to license it to people who write applications and utilities that support Office but not ones that are competing with Office....
JR: That's right.

JS: There's possibly a fuzzy line there....
JR: That's true, it can be. That's why I say there's a lot to come, to understand what people's interests are and what they may wish to do.

JS: How do you think the take-up of the ribbon is going to go?
JR: If we were to go by the research -- and of course that doesn't always bear out in the market -- it would be extremely positive. If you poll Office users, there's a couple of things that really stand out. One is that they really see that Office is very important to what they do in their jobs, so they care a lot about it. The second thing is that they'd like to be able to do even more. They recognise there's a lot of capability in the product that they're not getting to today. So the research that we put into designing the user experience was to address that issue: to help folks get to more capability and get things done faster and easier. Our research shows they can use 65% fewer keystrokes and less mouse-travel.
People want a results-oriented interface: they want to get things done. So that's the most notable step with Office 2007.
Now there's Office SharePoint Server, which takes the server side to a new level. Bill and I would draw the analogy to when we put together the Office productivity suite in the late 80s: we think Office SharePoint Server will in a few years be recognised as a similarly important strategic initiative. We're bringing together the collaboration, the document libraries, integrated workflow, electronic forms, business intelligence, content management, the portal capability, and having the opportunity to build on it. Bringing that platform together is important.

JS: But how do you promulgate it? SharePoint is more or less confined to large corporations, except for Office Live, and you don't even say that that's based on SharePoint.
JR: Office Live will be a way for people to have access to it quite broadly, both small businesses and individual Office users. In fact, SharePoint is perhaps the fastest growth business in the history of our company: we went from zero to $500 million in three years.

JS: Why isn't it talked about as part of the big web conversation, along with wikis and blogs and so on?
JR: Well, of course, SharePoint 2007 does have support for blogs and wikis, is that what you mean? I'm sorry, I may not be following your question....

JS: Well, when you created Office, it wasn't a corporate sale, it was part of the mass market, part of the conversation between ordinary users. Now SharePoint is a corporate sale, but it isn't part of the wider market conversation about blogs and wikis, Apache, MySQL and so on.

JR: Today, not as much as we would like ... and I think that's an opportunity. As you say, SharePoint is one of the foundations of Office Live, and we have chosen to build Office Live in a progression. We've started with small businesses, but I think that as you recognise -- and the broad market doesn't, yet -- there's certainly the opportunity to open that up to anybody who does information work and anybody who uses Office tools and wants to extend that. So I think that's a great opportunity.

JS: Are you doing anything with hosted Office, apart from watching it disappear?
JR: Today, I don't get a lot of interest in running Word over the internet. Bandwidth is precious, and most people have Office. Nobody's crystal ball is perfect, but I think in a few years those who say software is dead will go the way of those people who said PCs were dead and network computing was the thing.
The reason is, people get very focused in on trying to undermine Microsoft and they don't get very focused in on the customer. You have all this horsepower at your fingertips, whether it's your PC or your laptop or your mobile device, and you have all that horsepower in the cloud. Why not use the combination of the horsepower in order to optimise the experience? Do I really want to run the Word bits over my network connection, or do I want to use it to store contents, to have access to them anywhere, to share and collaborate and so on? It's the combination....

JS: It's noticeable with Office 2007 that you don't always know which things are on the server and which are on your PC, so ultimately the two things blend together....
JR: I think it's important to think about what are the scenarios that will really enhance and extend information work.

JS: You did do Office 2003 as a hosted online service, as part of the Microsoft.Net launch....
JR: People can do that, but most people already have the software on their computers, so there isn't that big a demand for that today. I think Exchange is a platform that will more rapidly move to a service form than Office client applications, where most of the time you want to optimise the power at your fingertips. Or at least that would be my prediction. I think the key strategy is to be able to use the combination.

JS: Hosted Exchange hasn't got a lot of traction, has it?
JR: I think it's a market that's still in its early stage. I would also say that hosted Exchange has done as well as any hosted business email system. So the question is, to what extent will businesses want to access these things online? Some of my colleagues think that, in 10 years, no companies will have their own Exchange servers. I'm not quite that aggressive!
I do believe, though, that many companies will look to hosted Exchange, hosted SharePoint.... I think we'll see more and more of those infrastructure elements. And frankly, Jack, I'll make sure that the people who are developing our servers are thinking of hosted services, which means they have to think through the technical issues. We are going to make sure we have service thinking integrated throughout our software.
At the end of the day, my point of view is: give the customer the choice. Sell them on the value of Exchange as a messaging system and let them choose whether they want it on the premises or have someone run it for them as a service.

JS: What about web-based alternatives such as ThinkFree, which offers a sort of Office online? Is that part of your bailiwick?
JR: There are a number of those web productivity ideas out there. As I said, the thing that will probably trip people up is they'll get focussed on the idea that that's a replacement for the Office suite, when what's most interesting are the new and unique scenarios that you can get by having that capability. But then, it's our responsibility to make sure that our customers have access to those services as part of their use of Office tools. It's about software and services, as opposed to services versus software.

JS: I wondered if that online element was part of your empire or something that someone else was looking after....
JR: It's certainly something that's very top of mind for me....

JS: And I wondered that because Office is a blockbuster, but it does take a while to do things compared to the speed at which things happen on the web. Look at YouTube!
JR: That's a fair point. You know, for better or for worse -- and it's probably both -- the core of what we do with Office probably doesn't have that characteristic, even in a web context. There are billions of documents out there, and people want tools that are compatible with billions of documents, and that have the functionality to allow people to do what they want to do. Things such as Google Docs, there are certainly some nice elements, but if you're a student and you need to do a paper that requires footnotes, well, good luck! [Laughs]
That's not to say they won't get better, but I try and temper my reaction to these things. In the same way I think our competitors get confused by focusing on trying to undermine us, instead of delivering customer value, I think we could get confused if we overreact to what might be the trend. The thing to do is to step back and say: "What is it that customers really want to do?" They may not be doing it today, and they might not know what they want to do, and they don't know the technology well enough to know what's possible, which is what makes this business kind of fun. But if you can make those predictions then you can end up with a winning business.
As an example, what happened with Mac Excel in 1985 was that we had a programmer called Steve Hazelrig who was doing the printing code. Laser printers were expensive then, and ours was way down the hall, so Steve wrote a little routine that put an image of the page up on the screen, with a magnifying glass so he could examine every pixel to make sure he had an accurate rendering of the page. The program manager Jake Blumenthal came down the hall and said: "Wow, that would be a great feature." That's how Print Preview made it into all of our products: no customer ever asked for it.
So the trick is to understand the things people want to do, and they may not know to ask for them, but the opportunity is there. So I think it's more important to understand what customers really want to do, and to make sure we deliver on that.

JS: Who's driving XML file formats? Is that customers or is it Microsoft?
JR: It's a combination: there are actually multiple motivations. First of all, there's the obvious reason: that you can increase the interoperability with file formats by using XML. We have customers who are very excited by the ability to extract information from Open XML formats and use that as part of their applications. But frankly, we would say that we feel document binaries are outliving their usefulness. They're now a source of security threats, in the sense that it's the new frontier for trying to attack computing infrastructures. And we can have more resilient file formats with XML.
People forget that we had rtf, and we had sylk, and those were "open formats" that people didn't really use that much because they were less performant. OK, so we're now in a different era where we can use XML as the foundation, get the benefits of interoperability, and have it be performant. It can actually be smaller footprint, and it's a much better structure than what we had before.
And, frankly, we've had to step up to the recognition that putting these formats with an open standards group is a good thing: it's a good thing for Microsoft, and it's a good thing for society. When I meet with governments, I recognise there's a legitimate interest in making sure that the way we store information is done on a long term basis. I fully support that. Some people say, "Hey, there's a lot of our intellectual property in there and you're opening that up for cloning." Well, we did. We decided. I decided. We needed to go forward and make these part of a standards body and address that interest.

JS: Do you foresee a mass migration of the installed base to the new formats?
JR: I don't. I wish I did -- I think it would be a good thing -- but I just think that's very hard. I think we have to do an excellent job of supporting compatibility, and with Office 2003, we download the compatibility packs so that you can read and write the XML formats. And we're going to work with third parties on ODF [Open Document Format] converters. But again, given your deeper knowledge, you probably recognise that when you don't have [a feature] in your format, well, how does that work? The idea that somehow everybody is going to be able to use ODF for these things, well, let's just say they've got a lot of work to do!
[PR says one last question, we have to go....]

JS: Have you got into a bit of a rut with Office bundles, now that you have SharePoint and OneNote and so on? The operating system side has had a pretty good hit with the Small Business Server, SBS. Now you've got 19 products in Office....
JR: Maybe 22! [Laughs] We've got Small Business Office and it's one of our big products, but its primary distribution vehicle is with the hardware manufacturers and the reseller channel. That's been hugely successful. One of the key changes that we made was to give it small business specific value, with things like Contact Manager. For many businesses, Publisher is the number two used application because they use it as their sales and marketing vehicle.
But we do have Office Professional Plus and Enterprise, which gives people OneNote and Groove, not just Standard and Professional. In the retail market there's an all-in-one type package, like Vista Ultimate, but the high volume at retail is our new Home and Student Edition, at a very attractive price. We used to think of that as our Student and Teacher Edition.

JS: Could you throw SharePoint Server into that? [Laughs]
JR: I'd really like to include some kind of subscription to Office Live: we've looked at doing that and we probably will do that some time in the future. That's one of the beauties of software as a service: it does give us a way to complement the core client applications.
[Getting up to leave]
Thanks very much. It's fun to reflect on history a little bit...

The Raikes empire

The Information Worker Group includes: Access, Business Intelligence Applications, Data Analyzer, Excel, FrontPage, Groove, InfoPath, Live Meeting, Natural Language Processing, Office Live, Microsoft Office System, Office Online, OneNote, Outlook, PowerPoint, Project, Publisher, SharePoint, Visio, and Word.
Business Solutions includes: Microsoft Dynamics, Supply Chain Management, Customer Relationship Management, Financial Management, Microsoft Dynamics AX, Great Plains, Microsoft Dynamics NAV, Enterprise Reporting, Retail Management, Small Business Products, Microsoft Small Business Financials, Microsoft Dynamics SL, Business Contact Manager, Exchange Server and Speech Server.

[Dec 15, 2006] Ralph Griswold died (Lambda the Ultimate)

Ralph Griswold, the creator of the Snobol and Icon programming languages, died in October 2006 of cancer. Until recently computer science was a discipline where the founders were still around; that's changing. Griswold was an important pioneer of programming language design: Snobol's string manipulation facilities are different from, and in some cases faster than, regular expressions.

Ralph Griswold died two weeks ago. He created several programming languages, most notably Snobol (in the 60s) and Icon (in the 70s), both outstandingly innovative, integral, and efficacious in their areas. Despite the abundance of scripting and other languages today, Snobol and Icon are still unsurpassed in many respects, both in elegance of design and in practicality.

Ralph Griswold

See also Ralph Griswold 1934-2006 and Griswold Memorial Endowment
Ralph E. Griswold died in Tucson on October 4, 2006, of complications from pancreatic cancer. He was Regents Professor Emeritus in the Department of Computer Science at the University of Arizona.

Griswold was born in Modesto, California, in 1934. He was an award winner in the 1952 Westinghouse National Science Talent Search and went on to attend Stanford University, culminating in a PhD in Electrical Engineering in 1962.

Griswold joined the staff of Bell Telephone Laboratories in Holmdel, New Jersey, and rose to become head of Programming Research and Development. In 1971, he came to the University of Arizona to found the Department of Computer Science, and he served as department head through 1981. His insistence on high standards brought the department recognition and respect. In recognition of his work the university granted him the title of Regents Professor in 1990.

While at Bell Labs, Griswold led the design and implementation of the groundbreaking SNOBOL4 programming language with its emphasis on string manipulation and high-level data structures. At Arizona, he developed the Icon programming language, a high-level language whose influence can be seen in Python and other recent languages.

Griswold authored numerous books and articles about computer science. After retiring in 1997, his interests turned to weaving. While researching mathematical aspects of weaving design he collected and digitized a large library of weaving documents and maintained a public website. He published technical monographs and weaving designs that inspired the work of others, and he remained active until his final week.

----- Gregg Townsend, Staff Scientist, The University of Arizona

[Mar 3, 2006] ACM Press Release, March 01, 2006

BTW John Backus authored the extremely speculative 1977 ACM Turing Award lecture "Can Programming Be Liberated from the von Neumann Style? A Functional Style and Its Algebra of Programs". It can be found here. As E. W. Dijkstra noted, "The article is a progress report on a valid research effort but suffers badly from aggressive overselling of its significance. This is the more regrettable as it has been published by way of Turing Award Lecture."
From Slashdot: "It's interesting that Peter Naur is being recognized 40 years later, when another Algol team member, Alan Perlis, received the first Turing Award in 1966. Here's a photo of Perlis, Naur and the other Algol 1960 conference participants. [tugurium.com] ".
Some contributions of Algol60 (Score:2, Informative)
by Marc Rochkind (775756) on Saturday March 04, @04:39PM (#14851091)
(http://mudbag.com/)

1. The Report on the language used a formal syntax specification, one of the first, if not the first, to do so. Semantics were specified with prose, however.
2. There was a distinction between the publication language and the implementation language (those probably aren't the right terms). Among other things, it got around differences such as whether to use decimal points or commas in numeric constants.
3. Designed by a committee, rather than a private company or government agency.
4. Archetype of the so-called "Algol-like languages," examples of which are (were?) Pascal, PL/I, Algol68, Ada, C, and Java. (The term Algol-like languages is hardly used any more, since we have few examples of contemporary non-Algol-like languages.)

However, as someone who actually programmed in it (on a Univac 1108 in 1972 or 1973), I can say that Algol60 was extremely difficult to use for anything real, since it lacked string processing, data structures, adequate control flow constructs, and separate compilation. (Or so I recall... it's been a while since I've read the Report.)

Backus Normal Form vs. Backus Naur Form

The following exchange comes from the transcript of the 1978 History of Programming Languages conference, which the book documents:

CHEATHAM: The next question is from Bernie Galler of the University of Michigan, and he asks: "BNF is sometimes pronounced Backus-Naur Form and sometimes Backus Normal Form. What was the original intention?"

NAUR: I don't know where BNF came from in the first place. I don't know -- surely BNF originally meant Backus Normal Form. I don't know who suggested it. Perhaps Ingerman. [This is denied by Peter Z. Ingerman.] I don't know.

CHEATHAM: It was a suggestion that Peter Ingerman proposed then?

NAUR: ... Then the suggestion to change that I think was made by Don Knuth in a letter to the Communications of the ACM, and the justification -- well, he has the justification there. I think I made reference to it, so there you'll find whatever justification was originally made. That's all I would like to say.

About BNF notation

BNF is an acronym for "Backus Naur Form". John Backus and Peter Naur introduced for the first time a formal notation to describe the syntax of a given language (this was for the description of the ALGOL 60 programming language; see [Naur 60]). To be precise, most of BNF was introduced by Backus in a report presented at an earlier UNESCO conference on ALGOL 58.

Few read the report, but when Peter Naur read it he was surprised at some of the differences he found between his and Backus's interpretation of ALGOL 58. He decided that the description of the successor to ALGOL (all participants of the first design had come to recognize some weaknesses) should be given in a similar form, so that all participants would be aware of what they were agreeing to. He made a few modifications that are almost universally used and drew up on his own the BNF for ALGOL 60 at the meeting where it was designed. Depending on how you attribute presenting it to the world, it was either by Backus in 59 or Naur in 60.

(For more details on this period of programming language history, see the introduction to Backus's Turing Award article in Communications of the ACM, Vol. 21, No. 8, August 1978. This note was suggested by William B. Clodius from Los Alamos Natl. Lab.)
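To make the notation concrete, here is a minimal illustrative sketch (it is not part of the original note, and every name in it is invented for this example): a three-rule BNF-style grammar for integer arithmetic expressions, followed by a small Python recursive-descent parser in which each procedure mirrors one grammar rule.

    # A toy BNF-style grammar (hypothetical, for illustration only):
    #   <expr>   ::= <term> { ("+" | "-") <term> }
    #   <term>   ::= <factor> { ("*" | "/") <factor> }
    #   <factor> ::= <number> | "(" <expr> ")"
    import re

    TOKEN = re.compile(r"\s*(\d+|[()+\-*/])")

    def tokenize(text):
        # Split the input into numbers, operators and parentheses.
        pos, tokens = 0, []
        while pos < len(text):
            m = TOKEN.match(text, pos)
            if not m:
                raise SyntaxError("unexpected character at position %d" % pos)
            tokens.append(m.group(1))
            pos = m.end()
        return tokens

    class Parser:
        def __init__(self, tokens):
            self.tokens, self.i = tokens, 0

        def peek(self):
            return self.tokens[self.i] if self.i < len(self.tokens) else None

        def eat(self, expected=None):
            tok = self.peek()
            if tok is None or (expected is not None and tok != expected):
                raise SyntaxError("expected %r, got %r" % (expected, tok))
            self.i += 1
            return tok

        def expr(self):        # <expr> ::= <term> { ("+" | "-") <term> }
            value = self.term()
            while self.peek() in ("+", "-"):
                op, rhs = self.eat(), None
                rhs = self.term()
                value = value + rhs if op == "+" else value - rhs
            return value

        def term(self):        # <term> ::= <factor> { ("*" | "/") <factor> }
            value = self.factor()
            while self.peek() in ("*", "/"):
                op = self.eat()
                rhs = self.factor()
                value = value * rhs if op == "*" else value / rhs
            return value

        def factor(self):      # <factor> ::= <number> | "(" <expr> ")"
            if self.peek() == "(":
                self.eat("(")
                value = self.expr()
                self.eat(")")
                return value
            tok = self.eat()
            if not tok.isdigit():
                raise SyntaxError("expected a number, got %r" % tok)
            return int(tok)

    print(Parser(tokenize("2*(3+4)")).expr())   # prints 14

Running the last line evaluates the expression to 14; the point is simply that a grammar written in BNF maps almost mechanically onto code, which is part of why such notations became standard tools for describing programming languages.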

[Jan 6, 2006] Frank Cary: Drove Personal Computer Creation for IBM. By Patricia Sullivan, Washington Post, Friday; Page B07

Frank T. Cary, 85, the chairman and chief executive of IBM who pushed for the creation of the company's once-dominant personal computer, defended the giant business against a 13-year-long federal antitrust lawsuit and helped launch a decade-long effort by U.S. corporations to end apartheid in South Africa, died Jan. 1 at his home in Darien, Conn.

His wife of 63 years, Anne Curtis Cary, described her husband as a quiet, down-to-earth person who would prefer a bare-bones obituary, if any at all. She declined to provide a cause of death.

Mr. Cary led IBM as its chief executive from 1973 to 1981 and as chairman from 1973 to 1983. He was chairman of biopharmaceutical company Celgene Corp. from 1986 to 1990.

Under his watch in the 1970s, IBM more than doubled its revenues and earnings while operating in one of the most competitive industries in the world.

Mr. Cary, described in the news media at the time as a tight-lipped, powerfully built sales executive with a crushing handshake and an Irish twinkle, oversaw the introduction of the widely popular Selectric typewriter, among other innovations.

Annoyed that smaller companies such as Commodore and Apple had successfully introduced desktop personal computers while IBM's first two efforts had failed, Mr. Cary delegated the task of coming up with a personal computer in a single calendar year to an independent business unit headed by executive Bill Lowe, ordering him to "teach the Big Blue elephant to dance."

Working in Boca Raton, Fla., Lowe and his colleagues used off-the-shelf components, a key decision that let them make their deadline and set the course of the PC industry. They bought the operating system, the software that runs the computer, from the startup company Microsoft; that sale launched the juggernaut that made Microsoft founder Bill Gates the richest man in the world.

When Mr. Cary left IBM, under its then-mandatory policy of retiring executives at 60, the company seemed invincible. Its PC sales were a highly profitable $4 billion, and customers seeking a personal computer were often offered the choice of Commodores, Apples or "IBM clones." But within a decade, the once-dominant business lost control of the PC standard, and of the market as well. No one nowadays refers to the ubiquitous personal computer as an "IBM clone."

Born in Gooding, Idaho, Frank Taylor Cary moved to California when young and graduated from the University of California at Los Angeles. He served in the Army during World War II, received a master's degree in business administration from Stanford University in 1948 and went to work for IBM.

He succeeded Thomas J. Watson Jr., the son of the company's founder, as chief executive. Under Watson, IBM had what the New York Times called "the premier glamour stock," when it sold for 66 times earnings in the mid-1960s. Mr. Cary told the Times that his singular disappointment was that IBM's stock price hovered between 15 and 20 times earnings during his tenure.

He suffered through 45 days of questioning during what he called "the Methuselah of antitrust cases" and what The Washington Post called "a Homeric pretrial paper chase . . . involving 66 million documents and 2,500 depositions."

The trial lasted six years, off and on, with 974 witnesses, 66 million pages of evidence and 104,000 pages of testimony, as the government tried to prove that IBM had broken the law by forming a monopoly and taking out competitors one by one. The case, filed on the last day of President Lyndon B. Johnson's administration, lasted until the Justice Department dropped it in January 1982, a year into President Ronald Reagan's administration.

In 1975, Mr. Cary authorized the creation of IBM's first lobbying office in Washington, which later became a powerful force among corporate lobbyists.

Mr. Cary was not solely concerned with financial goals. According to IBM, he joined General Motors chief executive Tom Murphy and the Rev. Leon Sullivan, a General Motors board member, in 1975 to recruit 21 top American corporate leaders for a decade-long effort to end apartheid in South Africa. The meeting led to the creation of the original Sullivan Principles, which committed businesses to equal and fair pay practices, training of nonwhites for management positions, and improving the quality of life for nonwhites in housing, transportation, school, health and recreation facilities.

Mr. Cary's hobbies included skiing, swimming, tennis and golf. He served on company boards in recent years, including printer manufacturer Lexmark International Inc., medical services provider Lincare Holdings Inc., media company Capital Cities/ABC Inc., and the engineering- and construction-oriented Bechtel Group Inc.

Besides his wife, of Darien, Conn., survivors include four children and 12 grandchildren.

[Jun 4, 2005] Q&A: An Internet Pioneer Looks Ahead (Computerworld). Leonard Kleinrock predicts 'really smart' handhelds, but warns of out-of-control complexity.

Leonard Kleinrock with Interface Message Processor 1, the Arpanet's first switching node. The minicomputer, configured by Bolt, Beranek and Newman, arrived at UCLA on Labor Day weekend in 1969. Two days later, a team led by Kleinrock had messages moving between IMP1 and another computer at UCLA. Thus the Arpanet, the forerunner of today's Internet, was born.

JULY 04, 2005 (COMPUTERWORLD) - Leonard Kleinrock is emeritus professor of computer science at the University of California, Los Angeles. He created the basic principles of packet switching, the foundation of the Internet, while a graduate student at MIT, where he earned a Ph.D. in 1963. The Los Angeles Times in 1999 called him one of the "50 people who most influenced business this century."

Computerworld's Gary H. Anthes interviewed Kleinrock in 1994 as part of the Internet's 25th anniversary celebration. Recently, Anthes asked Kleinrock for an update.

You told Computerworld 11 years ago that the Internet needed, among other things, "a proper security framework." What about today? In the past 11 years, things have gotten far worse, so much so that there are parts of the population that are beginning to question whether the pain they are encountering with spam, viruses and so on is worth the benefit. I don't think there's a silver bullet. We need systemwide solutions. Strong authentication will help. IPv6 will help. Identifying the source of information (a networking issue) to make sure it's not being spoofed will help.

You called for better multimedia capabilities in 1994 as well. One of the major changes related to multimedia in these 11 years has been the explosion of what we call the "mobile Internet." There's this ability now to travel from one location to another and gain access to a rich set of services as easily as you can from your office. The digitization of nearly all content and the convergence of function and content on really smart handheld devices are beginning to enable anytime, anywhere, by anyone Internet -- the mobile Internet. But there is a lot more to be done.

Such as? We have to make it easier for people to move from place to place and get access. What's missing is the billing and authentication interface that allows one to identify oneself easily in a global, mobile, roaming fashion. We [will] see this change to an alternate pricing model where people can subscribe to a Wi-Fi roaming service offered by their company or from their home ISP. As these roaming agreements are forged between the subscription provider and the owners/operators of today's disparate public-access networks, the effective number of locations where a subscriber will be able to connect at no or low fee will grow. A key component in this environment is internetwork interoperability, not only for data traffic but for authentication and billing. The benefits will be ease of use and predictable cost.

You mentioned smart handheld devices. Where are they going? We are seeing your phone, PDA, GPS, camera, e-mail, pager, walkie-talkie, TV, radio, all converging on this handheld device, which you carry around in addition to your laptop. It will [alter the properties of] a lot of content - video, images, music - to match what's come down to the particular device you have. For example, you may be using your handheld cell phone to serve as a passthrough device to receive an image or video that you wish to display on some other output device, say, your PC or your TV. The handheld may need to "dumb down" the image for itself but pass the high-quality stream to the TV, which will render the stream to match its (the TV's) display capability.

Is that capability of interest to corporate IT? Absolutely. We see e-mail already on the handheld, as well as the ability to download business documents such as spreadsheets and PowerPoint presentations. We'll see the ability to handle the occasional videoconference on a handheld, as well as other media-rich communications. We are right on the threshold of seeing these multifunction devices. Of course, the human-computer interface is always a problem.

How might that improve? Voice recognition is going to be really important. And there will be flexible devices where you actually pull out keyboards and screens and expand what you are carrying with you. Haptic technologies, based on touch and force feedback, are not yet here, but there's a lot of research going on. For example, with a handheld, you could display a virtual keyboard on a piece of paper and just touch that.

You have warned that we are "hitting a wall of complexity." What do you mean? We once arrogantly thought that any man-made system could be completely understood, because we created it. But we have reached the point where we can't predict how the systems we design will perform, and it's inhibiting our ability to do some really interesting system designs. We are allowing distributed control and intelligent agents to govern the way these systems behave. But that has its own dangers; there are cascading failures and dependencies we don't understand in these automatic protective mechanisms.

Will we see catastrophic failures of complex systems, like the Internet or power grid? Yes. The better you design a system, the more likely it is to fail catastrophically. It's designed to perform very well up to some limit, and if you can't tell how close it is to this limit, the collapse will occur suddenly and surprisingly. On the other hand, if a system slowly erodes, you can tell when it's weakening; typically, a well-designed system doesn't expose that.

So, how can complex systems be made safer and more reliable? Put the protective control functions in one portion of the design, one portion of the code, so you can see it. People, in an ad hoc fashion, add a little control here, a little protocol there, and they can't see the big picture of how these things interact. When you are willy-nilly patching new controls on top of old ones, that's one way you get unpredictable behavior.
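To make the point concrete, here is a toy C sketch of the principle Kleinrock describes (the limits, names and numbers are invented purely for illustration, not drawn from any real system): all of the protective checks live in one visible function instead of being patched ad hoc into every caller.

/* Toy sketch: keep the protective control logic in one visible place. */
#include <stdio.h>
#include <stdbool.h>

#define MAX_QUEUE_DEPTH  1000   /* invented limit */
#define MAX_RATE_PER_SEC  500   /* invented limit */

/* The single "control portion": every protective rule lives here, so the
   way the limits interact can be read in one screenful instead of being
   discovered during a cascading failure. */
static bool admit_request(int queue_depth, int rate_per_sec)
{
    if (queue_depth >= MAX_QUEUE_DEPTH)   return false;  /* overload guard */
    if (rate_per_sec >= MAX_RATE_PER_SEC) return false;  /* rate guard */
    return true;
}

int main(void)
{
    /* Callers never embed their own limit checks; they ask the one
       control function, so adding a new rule later changes one place. */
    printf("normal load admitted? %s\n", admit_request(10, 100)   ? "yes" : "no");
    printf("overloaded admitted?  %s\n", admit_request(2000, 100) ? "yes" : "no");
    return 0;
}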

[Apr 16, 2005] The Daemon, the Gnu and the Penguin By Peter H. Salus

So, by the beginning of 1974 there were a number of user groups exchanging information and a new operating system that was beginning to get folks excited. No one had thought seriously about licensing. And there were 40 nodes on the ARPAnet.

Early in 1974, Mel Ferentz (then at Brooklyn College) and Lou Katz (then at Columbia's College of Physicians and Surgeons) called a meeting of UNIX users in New York in May. Ken Thompson supplied them with a list of those who had requested a copy of UNIX after the SOSP meeting. Nearly three dozen in under six months. The meeting took place on May 15, 1974. The agenda was a simple one: descriptions of several installations and uses; lunch; "Ken Thompson speaks!"; interchange of UNIX hints; interchange of DEC hints; free-for-all discussion. Lou told me that he thought there were about 20 people in attendance; Mel thought it might have been a few more than that. That's the organization that's now the USENIX Association.

The Ritchie-Thompson paper appeared in the July 1974 issue of Communications of the ACM. The editor described it as "elegant." Soon, Ken was awash in requests for UNIX.

Mike O'Dell's reaction to the article is typical. In 1974, Mike was an undergraduate at the University of Oklahoma. He told me:

When the famous 1974 CACM issue appeared, I was working at the OU Computer Center. We had this thing called ITF, the Intermittent Terminal Facility, which had the world's worst implementation of BASIC, and one of the guys had written some routines which let you do I/O on terminals -- and this was a non-trivial feat. So a group of us sat down and tried to figure out whether we could do something interesting. ...

The UNIX issue came. I remember going down the hall and getting it out of my mailbox and saying to myself, Oh, ACM's got something on operating systems, maybe it's worth reading. And I started reading through it. I remember reading this paper on the UNIX time-sharing system. It was sort of like being hit in the head with a rock. And I reread it. And I got up and went out of my office, around the corner to George Maybry who was one of the other guys involved with this. And I threw the issue down on his desk and said: "How could this many people have been so wrong for so long?"

And he said: "What are you talking about?"

And I said: "Read this and then try to tell me that what we've been doing is not just nuts. We've been crazy. This is what we want."

The CACM article most definitely had a dramatic impact.

Slashdot: Unix's Founding Fathers

by js7a (579872) <james AT bovik DOT org> on Monday July 26, @05:00AM (#9799332)

... It was proprietary software, patents wouldn't have done a thing to it.

Actually, a crucial part of Unix was patented, before software patents were technically allowed. But the fact that it had been was the main reason that Unix spread so rapidly in the 70s and 80s.

Back in the 70s, Bell Labs was required by an antitrust consent decree of January 1956 to reveal what patents it had applied for, supply information about them to competitors, and license them in anticipation of issuance to anyone for nominal fees. Any source code covered by such a Bell Labs patent also had to be licensed for a nominal fee. So about every computer science department on the planet was able to obtain the Unix source.

The patent in question was for the setuid bit, U.S. No. 4,135,240 [uspto.gov]. If you look at it, you will see that it is apparently a hardware patent! This is the kicker paragraph:

... So far this Detailed Description has described the file access control information associated with each stored file, and the function of each piece of information in regulating access to the associated file. It remains now to complete this Detailed Description by illustrating an implementation giving concrete form to this functional description. To those skilled in the computer art it is obvious that such an implementation can be expressed either in terms of a computer program (software) implementation or a computer circuitry (hardware) implementation, the two being functional equivalents of one another. It will be understood that a functionally equivalent software embodiment is within the scope of the inventive contribution herein described. For some purposes a software embodiment may likely be preferable in practice.

Technically, even though that said it "will be understood," and was understood by everyone as a software patent, it wasn't until the 1981 Supreme Court case of Diamond v. Diehr that it became enforceable as such. Perhaps that is why the patent took six years to issue back in the 70s.

So, through the 1970s, Unix spread because it was covered by an unenforceable software patent! Doug McIlroy said, "AT&T distributed Unix with the understanding that a license fee would be collected if and when the setuid patent issued. When the event finally occurred, the logistical problems of retroactively collecting small fees from hundreds of licensees did not seem worth the effort, so the patent was placed in the public domain."
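For readers who have never looked at the bit in question, here is a minimal POSIX C sketch (my own illustration, not anything taken from the patent; /usr/bin/passwd is just a typical setuid program and the path may differ on your system) that reports whether a file carries the setuid bit:

/* Minimal sketch: inspect the setuid bit that patent 4,135,240 covers.
   When the bit is set, the kernel runs the program with the effective
   UID of the file's owner rather than that of the invoking user. */
#include <stdio.h>
#include <sys/stat.h>

int main(int argc, char **argv)
{
    const char *path = (argc > 1) ? argv[1] : "/usr/bin/passwd";
    struct stat st;

    if (stat(path, &st) != 0) {
        perror("stat");
        return 1;
    }
    printf("%s is owned by uid %ld and %s setuid\n",
           path, (long)st.st_uid,
           (st.st_mode & S_ISUID) ? "is" : "is not");
    /* The owner (or root) sets the bit with "chmod u+s file", or in C:
       chmod(path, st.st_mode | S_ISUID);  */
    return 0;
}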

Windows NT and VMS: The Rest of the Story, by Mark Russinovich, InstantDoc #4494, December 1998

Most of NT's core designers had worked on and with VMS at Digital; some had worked directly with Cutler. How could these developers prevent their VMS design decisions from affecting their design and implementation of NT? Many users believe that NT's developers carried concepts from VMS to NT, but most don't know just how similar NT and VMS are at the kernel level (despite the Usenet joke that if you increment each letter in VMS you end up with WNT, i.e., Windows NT).

As in UNIX and most commercial OSs, NT has two modes of execution, as Figure 2 shows. In user mode, applications execute, and the OS/2, DOS, and POSIX environment subsystems execute and export APIs for applications to use. These components are unprivileged because NT controls them and the hardware they run on. Without NT's permission, these components cannot directly access hardware. In addition, the components and hardware cannot access each other's memory space, nor can they access the memory associated with NT's kernel. The components in user mode must call on the kernel if they want to access hardware or allocate physical or logical resources.

The kernel executes in a privileged mode: It can directly access memory and hardware. The kernel consists of several Executive subsystems, which are responsible for managing resources, including the Process Manager, the I/O Manager, the Virtual Memory Manager, the Security Reference Monitor, and a microkernel that handles scheduling and interrupts. The system dynamically loads device drivers, which are kernel components that interface NT to different peripheral devices. The hardware abstraction layer (HAL) hides the specific intricacies of an underlying CPU and motherboard from NT. NT's native API is the API that user-mode applications use to speak to the kernel. This native API is mostly undocumented, because applications are supposed to speak Win32, DOS, OS/2, POSIX, or Win16, and these respective OS environments interact with the kernel on the application's behalf.
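As a rough illustration of that user-mode/kernel-mode split (a minimal sketch assuming a Windows build environment; the file path is arbitrary), the C program below makes only documented Win32 calls from user mode. kernel32.dll forwards such calls through ntdll.dll's largely undocumented native API (NtCreateFile, NtReadFile and friends), which performs the actual transition into kernel mode, where the I/O Manager and the rest of the Executive do the work.

/* User-mode sketch: documented Win32 calls that the Win32 subsystem
   translates into native API system calls on our behalf. */
#include <windows.h>
#include <stdio.h>

int main(void)
{
    /* CreateFileA lives in kernel32.dll; under the hood it calls the
       native NtCreateFile in ntdll.dll, which traps into kernel mode. */
    HANDLE h = CreateFileA("C:\\Windows\\win.ini",   /* arbitrary example file */
                           GENERIC_READ,
                           FILE_SHARE_READ,
                           NULL,                     /* default security */
                           OPEN_EXISTING,
                           FILE_ATTRIBUTE_NORMAL,
                           NULL);
    if (h == INVALID_HANDLE_VALUE) {
        printf("CreateFileA failed: %lu\n", GetLastError());
        return 1;
    }

    char  buf[64];
    DWORD bytesRead = 0;
    /* ReadFile likewise ends up as a kernel-mode request handled by the
       I/O Manager on the application's behalf. */
    if (ReadFile(h, buf, sizeof(buf) - 1, &bytesRead, NULL)) {
        buf[bytesRead] = '\0';
        printf("Read %lu bytes from user mode\n", bytesRead);
    }
    CloseHandle(h);
    return 0;
}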

VMS doesn't have different OS personalities, as NT does, but its kernel and Executive subsystems are clear predecessors to NT's. Digital developers wrote the VMS kernel almost entirely in VAX assembly language. To be portable across different CPU architectures, Microsoft developers wrote NT's kernel almost entirely in C. In developing NT, these designers rewrote VMS in C, cleaning up, tuning, tweaking, and adding some new functionality and capabilities as they went. This statement is in danger of trivializing their efforts; after all, the designers built a new API (i.e., Win32), a new file system (i.e., NTFS), and a new graphical interface subsystem and administrative environment while maintaining backward compatibility with DOS, OS/2, POSIX, and Win16. Nevertheless, the migration of VMS internals to NT was so thorough that within a few weeks of NT's release, Digital engineers noticed the striking similarities.

Those similarities could fill a book. In fact, you can read sections of VAX/VMS Internals and Data Structures (Digital Press) as an accurate description of NT internals simply by translating VMS terms to NT terms. Table 1 lists a few VMS terms and their NT translations. Although I won't go into detail, I will discuss some of the major similarities and differences between Windows NT 3.1 and VMS 5.0, the last version of VMS Dave Cutler and his team might have influenced. This discussion assumes you have some familiarity with OS concepts (for background information about NT's architecture, see "Windows NT Architecture, Part 1" March 1998 and "Windows NT Architecture, Part 2" April 1998).

Pirates of Silicon Valley (1999)

3 out of 5 stars Cheezefest, but also insightful, April 19, 2003
Reviewer: A viewer (Arlington, MA USA)

The video is brutally honest about how Jobs neglects his daughter and abuses Apple employees. He seems to have had a hard time dealing with his own illegitimacy (he was adopted). It is no coincidence that he shaped the Mac project to be the "bastard" project that tears Apple apart from within. Too bad the movie didn't spend a little more time on this theme. I also loved the climactic scene where Gates and Jobs confront each other. Although certainly fictional, it sums up the Mac/PC war brilliantly. Jobs shouts about how the Macintosh is a superior product, and Gates, almost whispering, answers, "That doesn't matter. It just doesn't matter."

4 out of 5 stars A fair overview of Microsoft and Apple beginnings, August 5, 2001
Reviewer: aaron wittenberg (portland, or)

From the people I've talked to that had seen it, this sounded like a great movie. I finally got my chance to see it just a few days ago.

I wasn't using computers back when this movie starts at Berkeley in the early 70's, but from the time the Apple I was invented until the IBM PC came around, I recall that history pretty well.

This movie does an all right job of explaining the founding of Microsoft and Apple. The downside is that many interesting facts have been left out. The writers never mentioned why IBM made a personal computer: they did it because almost every other computer-related company was building computers, and they wanted to cash in on the market. The scene where Gates goes to IBM and offers them DOS was not entirely correct.

What they didn't tell you is that IBM first approached the late Gary Kildall (the owner of CP/M) to supply the operating system, and for whatever reason he wasn't interested. Next was Microsoft, and Bill Gates was interested, but he sold IBM something he didn't have. Instead, Gates bought 86-DOS for $50,000 from its author, Tim Paterson, who worked at Seattle Computer Products way back in 1980. Tim later went to work for Microsoft.

Just think... had Gary been interested, his business would likely be the Microsoft of today.

There are other places where the movie either didn't tell the full story or wasn't entirely accurate. They failed to mention that Microsoft supplied MS-DOS (they renamed 86-DOS for licensing and ownership reasons) to IBM, but that the IBM PC used virtually all Intel components. They failed to mention that a huge chunk of history came from Intel. It was also never mentioned that Apple used Motorola for processors, or else their beloved Macintosh would not exist.

They were right on track about Apple stealing the graphic interface from Xerox. It is true that Xerox invented it sometime during the early-mid 70's and the management wasn't interested in this invention. BIG mistake.

Apple is often credited with inventing the GUI. Not true.

I was a little surprised that the movie made no mention of how Microsoft teamed up with IBM back in the 80's to work on OS/2. Microsoft spent more time working on Windows, and IBM finally finished OS/2 on their own. I truly feel that if they had worked together on a single operating system, we would have one today that doesn't crash like Windows and actually works like an operating system should.

If you are even a little interested in the history of computers and how some of these huge companies started out, you might find this very interesting.

I still remember using Windows 1.0 back in 1986. A lot has changed with it!

So to close, this would make a good time killer and something to give you a little more knowledge about computer history. But please keep in mind, not all of the events are totally accurate, and a lot of critical information was left out. This is by no means the end-all authority.

4 out of 5 stars Great Movie!, June 12, 2001
Reviewer: Matt (Reno, NV)

Even though I do not know either man personally, this movie gives so much insight into both sides. It shows that Apple followers are much like Jobs himself: arrogant, condescending, and self-righteous. It also shows why Apple has not, and never will, get more than 10% of the market share. There is even speculation that Jobs is going to drop Motorola and use AMD chips instead. On the other side, it shows how Gates tells his employees what he wants and lets them complete their tasks without intervention, while Jobs continually abuses and verbally trashes his employees to get the job done. Jobs is an emotional powder keg, while Gates plays it cool. Great movie!

4 out of 5 stars The great american success story, timing is everything!, February 27, 2003
Reviewer: A viewer (Nampa, Id. United States)

This takes you from the beginning to the present day.
It shows Paul Allen (who now OWNS the Seahawks and Trail Blazers pro teams), Bill Gates, Steve Jobs, etc., dropping out of college to pursue a slow-burning fire that would become the personal computer/Windows software that we know today.

What is interesting is that it shows who talks and who works. Gates lies a lot, pretty much living by the saying "telling people what they want to hear" while Paul Allen grinds away at making code.

On the other end it's somewhat the same: loose cannon Steve Jobs handles the business part, while we get a sense that Steve Wozniak is a true tech who goes above and beyond Jobs' rantings to produce the final product.

What is so funny is the irony of this movie:

Loan Officer: "Sorry Mr. Jobs, but we don't think the ordinary person will have any use for a computer".

HP: "You think people are interested in something called a mouse?".

Xerox: "We build it and then they can come right in here and steal it from us? It's just not fair, this operating system is a result of our hard work!".

Jobs to Gates: "You're STEALING FROM US!!!"

Assistant to Gates: "Do you realize Apple has a pirate flag over their front door, and they just gave us 3 prototypes of their operating system?"

Jobs: "I don't want people to look at it like a monitor and mouse, I think of this as art, a vision, people need to think outside the box".

Jobs: "You stole it from ussss!"

Gates: "No it's not stealing, you see, it's like we both have this neighbor, and he leaves his door open all the time. You go over there to get his TV, only I've gotten their first..and now you're calling me the thief?!".

Just some of the excerpts that make this movie a classic and show you everything that went down when a bunch of college dropouts set out and changed the world in which we live today.

5 out of 5 stars Apple vs Microsoft...but not a war, January 17, 2002
Reviewer: Sebastian Brytting (Stockholm, Sweden)

The best thing about this movie, I think, is that it manages to deal with the Apple vs Microsoft discussion without picking a side. It shows Steve Jobs yelling at his employees when his private life is messy. But it also shows him inspiring people and developing products that changed the world, and how he eventually sorted out his private problems.

It shows Bill Gates stealing from every large company he comes across, but he is not portrayed as the 'bad guy.' The viewer can pick sides himself.

Computer related movies most often end up really lousy, but not this one. When Steve Jobs is having fun, you get happy. When he finds out that Bill Gates has betrayed his trust and stolen his life's work, you get sad. When Bill Gates tries to be 'cool', you laugh. (Hilarious scene)

The other great thing about this movie is that since it's so neutral, it makes even the toughest Microsoft fan admit that it was all pirated from Apple. (Though they always add "at least from the beginning" to preserve their pride) =)

Bottom line: This movie rocks! See it! Newbie or hacker, you've got to see this movie!

Microsoft OS/2 Announcement

NEWS RELEASE

M-3592

FOR RELEASE APRIL 2, 1987

Microsoft Operating System/2™ With Windows Presentation Manager Provides Foundation for Next Generation of Personal Computer Industry

REDMOND, WA • April 2, 1987 • Microsoft Corporation today announced Microsoft Operating System/2 (MS OS/2™), a new personal computer system operating system. MS OS/2 is planned for phased release to OEM manufacturers beginning in the fourth quarter of 1987. Designed and developed specifically to harness the capabilities of personal computers based upon the Intel® 80286 and 80386 microprocessors, MS OS/2 provides significant new benefits to personal computer application software developers and end-users.

MS OS/2, a multi-tasking operating system which allows applications software to use up to 16 Mb of memory on 80286 and 80386-based personal computers, can be adapted for use on most personal computers based on the 80286 and 80386 processors, including the IBM® PC AT and other popular systems in use today. The MS OS/2 Windows presentation manager is an integral part of the MS OS/2 product, providing a sophisticated graphical user interface to the MS OS/2 system. The MS OS/2 Windows presentation manager is derived from the existing Microsoft® Windows product developed and marketed by Microsoft for the current generation of IBM personal computers and compatible machines.

The MS OS/2 product is the first to be announced as the result of the Joint Development Agreement announced by IBM and Microsoft in August 1985. Microsoft will be offering MS OS/2, including the MS OS/2 Windows presentation manager, to all its existing OEM customers.

"Microsoft Operating System/2 provides the foundation for the next phase of exciting growth in the personal computer industry," said Bill Gates, chairman of Microsoft. "Microsoft is committed to providing outstanding systems software products to the personal computer industry. MS OS/2 will be the platform upon which the next 1000 exciting personal computer applications software products are built. In particular, our commitment to the power of the graphical user interface has been realized with the announcement of the MS OS/2 Windows presentation manager and the new IBM Personal System/2™ series. We believe that these machines represent a new standard in personal computer graphics capabilities which will drive the software industry toward the creation of incredible new graphics-based applications software products."

Microsoft products to support MS OS/2 local area network systems and applications software developers

In a series of related announcements, Microsoft announced the Microsoft Operating System/2 LAN Manager, a high-performance local area networking software product. The MS OS/2 LAN Manager enables personal computers running either MS OS/2 or MS-DOS® to be connected together on a local area network. Any machine in the network can function as either a server or a workstation.

Microsoft also announced that it plans to begin distribution of the MS OS/2 Software Development Kit (SDK) on August 1. The MS OS/2 SDK includes pre-release software and full product specifications for MS OS/2. This will enable applications developers and other hardware and software developers to begin design and development of products for the MS OS/2 environment in advance of general end-user availability of the MS OS/2 product. The MS OS/2 SDK will include a pre-release version of MS OS/2 and a comprehensive set of software development tools. Microsoft will be providing a high level of support for the MS OS/2 SDK product using the Microsoft Direct Information Access Line (DIAL) electronic mail support service. In addition, users of the MS OS/2 SDK will receive regular software updates and credit for attendance at technical training seminars given by Microsoft personnel.

New version of Microsoft Windows to provide bridge to MS OS/2

Microsoft also announced today a new version of Microsoft Windows for MS-DOS. To be made available in the third quarter of 1987, Microsoft Windows version 2.0 has a number of new features, including significantly improved performance and full utilization of expanded memory features. This MS-DOS version of Windows will run existing Windows applications and will present users with the same visual interface used by the Microsoft OS/2 Windows presentation manager. This interface is based upon the use of overlapping windows rather than the tiling technique used in the current release of Microsoft Windows. The incorporation of the MS OS/2 Windows presentation manager user interface into Microsoft Windows version 2.0 will provide a consistent interface between the current MS-DOS generation of personal computers and future systems based on MS OS/2.

New version of Microsoft MS-DOS to enhance installed base of personal computers

Separately, Microsoft also announced a new version of the MS-DOS® operating system, version 3.3, which provides improved performance and increased hard-disk storage capability for MS-DOS personal computers. "Microsoft is committed both to providing significant enhancements to the current generation of operating systems technology, and to introducing revolutionary new products, such as MS OS/2. MS-DOS version 3.3 is part of Microsoft's ongoing effort to improve our current products," said Gates.

Microsoft products support new IBM Personal System/2 series

Microsoft also announced that it will be releasing updated versions of its existing MS-DOS-based applications software, Microsoft Windows and Microsoft XENIX® products to take advantage of the new IBM Personal System/2 series. Microsoft will also be supporting the new IBM Personal System/2 with versions of the Microsoft Mouse, the most popular pointing device in use on personal computers today.

Microsoft Corporation (NASDAQ "MSFT") develops, markets and supports a wide range of software for business and professional use, including operating systems, languages and application programs as well as books and hardware for the microcomputer marketplace.

# # # # #

Microsoft, MS-DOS, XENIX and the Microsoft logo are registered trademarks of Microsoft Corporation.

Microsoft Operating System/2 and MS OS/2 are trademarks of Microsoft Corporation.
IBM is a registered trademark of International Business Machines Corporation.
Personal System/2 is a trademark of International Business Machines Corporation.
Intel is a registered trademark of Intel Corporation.

Product Information

Microsoft OS/2 Windows Presentation Manager

Introduction

The Windows presentation manager is the graphical user-interface component of Microsoft Operating System/2™.

Product Features

The Windows presentation manager provides a windowed graphical user interface to MS OS/2™ users. It replaces the MS OS/2 command line interface with a full function user shell. This shell features:

Tandy 16b

The Tandy 16b, and the similar Tandy 6000, were Tandy's stab at a multi-user business system. These machines came in 1984 equipped with both a Motorola 68000 and a Zilog Z-80. 64KB of RAM was standard, but could be expanded up to a whole megabyte.

Utilizing the 68000 chip, they could also run Xenix, Microsoft's early UNIX experiment, which later morphed into SCO UNIX. Under Xenix, the 16b/6000 could handle two stand-alone terminals in addition to the base unit itself. The 16b came standard with two 8" 1.2 meg floppy drives. The 6000 could also be equipped with one floppy and an internal 15 meg hard drive. A green monochrome screen was also standard.

Pranks at Microsoft

Tsunami 386

Everyone that knows me knows that I tend to get really worked up about some issues, and I've earned a reputation as quite a flamer over the years.

In late '86 and early '87, Microsoft had decided to get out of the Xenix (unix) operating system business by farming the work off to other vendors, and merging the product with AT&T's offering.

I still believe that Xenix is the best operating system Microsoft has ever sold (even though I was also heavily involved with the development of OS/2, OS/2 2.0, Windows 1 through 3.1, and NT/OS) and it frustrated me greatly that we were dumping Xenix in this ignoble way. On April 1, 1987, several of the xenix developers, most particularly Dave Perlin and Paul Butzi, decided to see if they could get me to go really wild.

They cooked up a story about a deal that had been worked up with some company over a machine called the Tsunami 386, which involved giving them extraordinary rights to the xenix sources, and most particularly, the sources and support of the xenix 386 compiler, which I had written, and was in no way coupled to any of the usual AT&T licenses.

My bosses and their bosses were all informed of the prank, and everybody played along beautifully. It must have been quite a sight, as I went storming around building 2, explaining why this contract was so terribly outrageous. I don't think I've ever been so angry about anything. Finally, when I'm getting ready to escalate to Bill Gates, Dave suggests that I check the date.

I swear, I'll get them back some day. I just haven't thought of how, yet.

Linux-Kernel Archive: Re: Microsoft and Xenix

On Friday 22 June 2001 18:41, Alan Chandler wrote:
> I am not subscribed to the list, but I scan the archives and saw the
> following. Please cc e-mail me in followups.

I've had several requests to start a mailing list on this, actually... Might do so in a bit...

> I was working (and still am) for a UK computer systems integrator called
> Logica. One of our departments sold and supported Xenix (as distributor
> for Microsoft? - all the manuals had Logica on the covers although there
> was at least some mention of Microsoft inside) in the UK. At the time it

I don't suppose you have any of those manuals still lying around?

> It was more like (can't remember exactly when) 1985/1986 that Xenix got
> ported to the IBM PC.

Sure. Before that the PC didn't have enough RAM. DOS 2.0 was preparing the
DOS user base for the day when the PC -would- have enough RAM.

Stuff Paul Allen set in motion while he was in charge of the technical side
of MS still had some momentum when he left. Initially, Microsoft's
partnership with SCO was more along the lines of outsourcing development and
partnering with people who knew Unix. But without Allen rooting for it,
Xenix gradually stopped being strategic.

Gates allowed his company to be led around by the nose by IBM, and sucked
into the whole SAA/SNA thing (which DOS was the bottom tier of along with
a bunch of IBM big iron, and which OS/2 emerged from as an upgrade
path bringing IBM mainframe technology to higher-end PCs.)

IBM had a unix, AIX, which had more or less emerged from the early RISC
research (the 701 project? Lemme grab my notebook...)

Ok, SAA/SNA was "Systems Application Architecture" and "Systems Network
Architecture", which was launched coinciding with the big PS/2 announcement
on April 2, 1987. (models 50, 60, and 80.) The SAA/SNA push also extended
through the System/370 and AS400 stuff too. (I think 370's the mainframe and
AS400 is the minicomputer, but I'd have to look it up. One of them (AS400?)
had a database built into the OS. Interestingly, this is where SQL
originated (my notes say SQL came from the System/370 but I have to
double-check that, I thought the AS400 was the one with the built in
database?). In either case, it was first ported to the PC as part of SAA.
We also got the acronym "API" from IBM about this time.) Dos 4.0 was new, it
added 723 meg disks, EMS bundled into the OS rather than an add-on (the
Lotus-Intel-Microsoft Expanded Memory Specification), and "DOSShell" which
conformed to the SAA graphical user interface guidelines. (Think an
extremely primitive version of midnight commander.)

The PS/2 model 70/80 (desktop/tower versions of same thing) were IBM's first
386 based PC boxes, which came with either DOS 3.3, DOS 4.0, OS/2 (1.0), or
AIX.

AIX was NOT fully SAA/SNA compliant, since Unix had its own standards that
conflicted with IBM's. Either they'd have a non-standard unix, or a non-IBM
os. (They kind of wound up with both, actually.) The IBM customers who
insisted on Unix wanted it to comply with Unix standards, and the result is
that AIX was an outsider in the big IBM cross-platform push of the 80's, and
was basically sidelined within IBM as a result. It was its own little world.

skip skip skip skip (notes about boca's early days... The PC was launched in
August 1981, list of specs, xt, at, specs for PS/2 models 25/30, 50, 70/80,
and the "pc convertable" which is a REALLY ugly laptop.)

Here's what I'm looking for:

AIX was first introduced for the IBM RT/PC in 1986, which came out of the
early RISC research.
It was ported to PS/2 and S/370 by SAA, and was based
on unix SVR2. (The book didn't specify whether the original version or the
version ported to SAA was based on SVR2, I'm guessing both were.)

AIX was "not fully compliant" with SAA due to established and conflicting
unix standards it had to be compliant with, and was treated as a second class
citizen by IBM because of this. It was still fairly hosed according to the
rest of the unix world, but IBM mostly bent standards rather than breaking
them.

Hmmm... Notes on the history of shareware (pc-write/bob wallace/quicksoft,
pc-file/pc-calc/jim button/buttonware, pc-talk/andrew flugelman, apparently
the chronological order is andrew-jim-bob, and bob came up with the name
"shareware" because "freeware" was a trademark of Headlands Press, Inc...)
Notes on the IBM Risc System 6000 launch out of a book by Jim Hoskins (which
is where micro-channel came from, and also had one of the first cd-rom
drives, scsi based, 380 ms access time, 150k/second, with a caddy.) Notes on
the specifications of the 8080 and 8085 processors, plus the Z80

Sorry, that risc thing was the 801 project led by John Cocke, named after the
building it was in and started in 1975.

Ah, here's the rest of it:

The IBM Personal Computer RT (RISC Technology) was launched in January 1986
running AIX. The engineers (in Austin) went on to build the second generation
RISC System/6000 (the RS/6000) with AIX version 3, launched February 15, 1990.
The acronym "POWER" stands for Performance Optimized With Enhanced RISC.

Then my notes diverge into the history of ethernet and token ring (IEEE 802.3
and 802.5, respectively. The nutshell is that ethernet was a commodity and
token ring was IBM only, and commodity out evolves proprietary every time.
The second generation ethernet increased in speed 10x while the second
generation token ring only increased 4x, and ethernet could mix speeds while
token ring had to be homogeneous. Plus ethernet moved to the "baseT" stuff
which was just so much more reliable and convenient, and still cheaper
even if you had to purchase hubs because it was commodity.)

> instead) and I was comparing Xenix, GEM (remember that - for a time it
> looked like it might be ahead of windows) and Microsoft Windows v 1 . We

Ummm... GEM was the Geos stuff? (Yeah I remember it, I haven't researched
it yet though...)

> chose Windows in the end for its graphics capability although by the time
> we started development it was up to v2 and we were using 286's (this was
> 1987/88).

I used windows 2.0 briefly. It was black and white and you could watch the
individual pixels appear on the screen as it drew the fonts. (It looked
about like somebody writing with a pen. Really fast for writing with a pen,
but insanely slow by most other standards. Scrolling the screen was an
excuse to take a sip of beverage du jour.)

The suckiness of windows through the 80's has several reasons. The first
apple windowing system Gates saw was the LISA, -before- the macintosh, and
they actually had a pre-release mac prototype (since they were doing
application software for it) to clone. Yet it took them 11 years to get it
right.

In part this was because PC graphics hardware really sucked. CGA, hercules,
EGA... Painful. Black and white frame buffers pumped through an 8 mhz ISA
bus. (Even the move to 16 bit bus with the AT didn't really help matters too
much.)

In part, when Paul Allen left, Microsoft's in-house technical staff just
disintegrated. (Would YOU work for a company where marketing had absolute
power?) The scraps of talent they had left mostly followed the agenda set by
IBM (DOS 4/5, OS/2 1.0/1.1). A lot of other stuff (like the AIX work) got
outsourced.

Windows was Gates' pet project (I suspect an ego thing with steve jobs may
have been involved a bit, but they BOTH knew that the stuff from Xerox parc
was the future). He didn't want to outsource it, but the in-house resources
available to work on it were just pathetic.

There are a couple good histories of windows (with dates, detailed feature
lists, and screen shots of the various versions) available online. And if
you're discussing windows, you not only have to compare it with the Macintosh
but at least take a swipe at the Amiga and Atari ST as well. And OS/2's
presentation manager development, and of course the early X days (The first
version of X came out of MIT in 1984, the year the macintosh launched.
Unfortunately in 1988 X got caught in a standards committee and development
STOPPED for the next ten years. Development finally got back in gear when
the XFree86 guys told X Open where it could stick its new license a year or
two back and finally decided to forge ahead on their own, and they've been
making up for lost time ever since but they've had a LOT of ground to cover.
Using 3d accelerator cards to play MPEG video streams is only now becoming
feasible to do under X. And it SHOULD be possible to do that through a
100baseT network, let alone gigabit, but the layering's all wrong...)

> Logica sold out its Xenix operation to Santa-Cruz around 1987 (definately
> before October 1987) because we couldn't afford the costs of developing the
> product (which makes me think that we had bought it out from Microsoft - at
> least in the UK). By then we had switched our PDP 11s to System V (I also
> remember BUYING an editor called "emacs" for use on it:-) ).

That would be the X version of emacs. And there's the explanation for the
split between GNU and X emacs: it got forked and the closed-source version
had a few years of divergent development before opening back up, by which
point it was very difficult to reconcile the two code bases.

Such is the fate of BSD licensed code, it seems. At least when there's money
in it, anyway...

And THAT happy experience is why Richard Stallman stopped writing code for a
while and instead started writing licenses. The GPL 1.0 descended directly
from that (and 2.0 from real world use/experience/users' comments in the
field)

(Yes, I HAVE been doing a lot of research. I think I'll head down to the UT
library again this afternoon, actually...)

Rob

Xenix: back when MS was just starting out.

Cable
[email protected]

>Yup, Xenix was 16-bit only.
>But there was a time - and this is hilarious when you
>consider the current state of affairs of NT v. Unix and
>Microsoft v. all the sorry Unix hardware vendors - that
>there were more computers running Microsoft Xenix than ALL
>OTHER VERSIONS OF UNIX COMBINED!

That much may be true. At that time MS had a stable, powerful OS that was 99.44% crash-proof, with true multitasking.

In fact, IBM had planned to offer Xenix as an option for its AT computers in 1984 with Dumb Terminals hooked up to it for multi-user access.

>Yes indeed, Microsoft was the BIGGEST UNIX LICENSEE of ALL!

Maybe.

>No kidding, that's what a mass market platform like a PC
>will get you; even the primitive 286 based PCs that
>existed back then. Lots and lots of stores ran vertical
>applications on Xenix software. Lots and lots of fast food
>joints (big brand names) ran their store and all their
>cash registers on Xenix software.

Yeah one of my business partners used to work at a "Jack In The Box" around the time Xenix was popular, and they used it in the kitchen to place orders. He knew the modem number, and was able to dial in and run programs on their system. He went to college at the time, and learned all he could about Unix.

>Even the mighty Microsoft used 100s of Xenix PCs to run
>their internal email system.

Now they use 100s of multi-processor PC systems to run their e-mail globally.

>(Hard to imagine 1000s of Microsoft developers using vi,
>but who knows? Maybe it happened.)

There are other editors besides the archaic vi; Pico is one such editor.

But what happened to Xenix? Why did it almost vanish in 1987, replaced by a product known as OS/2 (and later by Windows 3.0, which replaced OS/2 when IBM and MS had a falling out)? Why not just use that X-Window interface in Xenix and make Xenix easier to install? :)

Microsoft PressPass - Microsoft Applauds European Commission Decision to Close Santa Cruz Operation Matter


Decision upholds Microsoft's right to receive royalties if SCO utilizes Microsoft's technology

REDMOND, Wash. - November 24, 1997 - Microsoft Corporation today applauded the decision of the European Commission to close the file and take no further action on a dispute between Microsoft and Santa Cruz Operation (SCO) involving a 1987 contract. The Commission's decision follows progress by Microsoft and SCO to resolve a number of commercial issues related to the contract, and upholds Microsoft's right to receive royalty payments from SCO if software code developed by Microsoft is used in SCO's UNIX products.

"We are gratified that the European Commission rejected SCO's request for further action and approved our request to close the file on this case," said Brad Smith, Microsoft's associate general counsel, international.

"We were prepared to address SCO's concerns as long as our intellectual property royalty rights could be protected at the same time. The unique nature of the original 1987 contract made it difficult, but we were able to find a workable solution that resolves SCO's major concerns and still protects Microsoft's intellectual property rights," Smith said.

SCO's complaint concerned a contract originally negotiated in 1987 between Microsoft and AT&T for the development of the UNIX operating system. A principal goal of that contract was to help AT&T reduce fragmentation in the UNIX marketplace by creating a single merged UNIX product. To accomplish this goal, under the contract Microsoft developed for AT&T a new Intel-compatible version of UNIX that improved the program's performance and added compatibility with Microsoft's popular XENIX® operating system, which was at the time the most popular version of UNIX on any hardware platform. When completed in 1988, the merged product created by Microsoft was named "Product of the Year" by UnixWorld Magazine.

To prevent further UNIX fragmentation and at AT&T's behest, the contract obligated the parties to ensure that any future versions of UNIX they developed for the Intel platform would be compatible with this new version of UNIX.

As compensation for Microsoft's technology and for its agreement to give up its leadership position with XENIX, AT&T agreed to pay Microsoft a set royalty for the future copies of UNIX it shipped. AT&T subsequently transferred its rights and obligations under the contract to Novell, which transferred the contract to SCO in 1995.

The code developed by Microsoft under the 1987 contract continues to play an important role in SCO's OpenServer UNIX product. This includes improvements Microsoft made in memory management and system performance, development of a multi-step bootstrap sequence, numerous bug fixes, and the addition of new functions originally developed for XENIX and still documented today by SCO for use by current application developers.

SCO complained to the EC that the provisions in the 1987 contract restricted the manner in which it could develop a future version of UNIX (code-named "Gemini") for the 64-bit generation of Intel processors. After reviewing the matter, Microsoft modified the contract to waive SCO's backward compatibility and development obligations, but insisted on continued payment of royalties for any UNIX versions that include Microsoft's technology. Microsoft then requested that the Commission close the file on the case and take no further action, and the Commission agreed to do so. SCO therefore withdrew its complaint.

Microsoft's Smith said there were basically three issues in the contract that needed to be resolved: (1) the backward compatibility requirement, (2) a development requirement designed to reduce UNIX fragmentation under which each new version of UNIX would be built on the previous versions, and (3) royalty payment obligations for Microsoft's intellectual property rights.

"Microsoft was willing to waive the backward compatibility and development requirements, which were included in the 1987 agreement at AT&T's behest, but we needed to preserve our intellectual property royalty rights, which are fundamental to the software industry as a whole," he noted. "Unfortunately, the old contract was written in a way that made it difficult to separate the development requirement from the royalty rights, but we were able to find a solution that gave SCO what it wanted but protected our intellectual property rights."

Microsoft first learned of SCO's complaint to the European Commission in late March. In a May 22 submission to European Commission officials, Microsoft affirmed that it was willing to waive the backward compatibility requirement in the contract, as long as Microsoft's right to receive royalty payment for use of its copyrighted technology was preserved. On May 26, before receiving Microsoft's submission, the Commission provided Microsoft with a Statement of Objections. This is a preliminary step in the EC process that identifies issues for further deliberation and provides a company an opportunity to present its position in person at an internal hearing. Microsoft reiterated its willingness to waive the backward compatibility requirements in an August 1 filing with the European Commission. Microsoft also requested that the Commission hold a hearing, so that Microsoft could document the various ways in which Microsoft's intellectual property is contained in SCO's present UNIX products.

On November 4, after discussions with SCO were unsuccessful in resolving the matter, Microsoft informed SCO that it was unilaterally waiving the compatibility and development requirements of the contract, but retaining the requirement that SCO pay a royalty to Microsoft when it ships product that utilizes Microsoft's intellectual property rights. Upon receiving Microsoft's waiver, the Commission canceled the hearing, which was scheduled for November 13. Despite Microsoft's action to address SCO's concerns, SCO continued to ask for further action by the European Commission. However, the Commission rejected SCO's request and decided to close the case. SCO therefore withdrew its complaint.

"We're pleased that we were able to resolve these issues to the satisfaction of everyone involved, and we're particularly pleased that the EC upheld our right to collect royalties for the use of our technology. This principle is fundamental to the entire software industry," said Smith.

Founded in 1975, Microsoft (NASDAQ "MSFT") is the worldwide leader in software for personal computers. The company offers a wide range of products and services for business and personal use, each designed with the mission of making it easier and more enjoyable for people to take advantage of the full power of personal computing every day.

Microsoft and XENIX are either registered trademarks or trademarks of Microsoft Corporation in the United States and/or other countries. Other products and company names mentioned herein may be the trademarks of their respective owners.

Note to editors: If you are interested in viewing additional information on Microsoft, please visit the Microsoft Web page at http://www.microsoft.com/presspass/ on Microsoft's corporate information pages.

Linux: How Does SCO Compare?


Adapted from responses by Brian on Tuesday, July 08, 2003

SCO was the first to popularize the use of UNIX software on Intel-based hardware, building on the software they acquired from the XENIX work done by Microsoft in the late 1970s and early 1980s. They've probably sold more UNIX licenses than anyone else, including Sun, but their profit margins have never come close to the margins that Sun has long enjoyed on the larger server systems.

SCO has always excelled in the retail market, but IBM and Sun are moving into that space along with many Linux software vendors.

According to http://www.computerwire.com/recentresearch/CA8DBB43AE69514C80256D57003848CB

"In 1969/70, Kenneth Thompson, Dennis Ritchie, and others at AT&T's Bell
Labs began the development of the Unix operating system. Unable to sell the software due to a DoJ injunction, AT&T instead licensed Unix Version 7 (and later Unix System III and V) to other organizations including the University of California at Berkeley (which developed its version as BSD under a more liberal license) and Microsoft (Xenix), while AT&T continued the development of the original Unix code (System III, System V).

In 1980 Microsoft began the development of Xenix, a version of Unix for Intel processors, with the Santa Cruz Operation (SCO), before handing it over to SCO to concentrate on MS-DOS. SCO renamed it SCO Xenix, then SCO Unix after a code merge with the original System V, before settling on OpenServer."

SCO has always sold low end licenses on Intel hardware. Typical license prices ranged from $100-1000, rarely more than that. In contrast, Sun's prices start around $1000 and go up dramatically for high end hardware. Sun paid at least $1,000,000 to AT&T, but was granted complete rights to all UNIX source code, so that investment was regained long ago.

In contrast, both Caldera and SCO have been jockeying to establish a profitable business, something that neither of them has done consistently. SCO has probably sold more UNIX licenses over the years than anyone else, but just when people really started to run server software in larger numbers on Intel hardware, Sun (with their Solaris on Intel and now Linux) and IBM, with their embrace of Linux software, have started to compete in areas which once belonged almost completely to SCO.

From my experience, I've used a few different versions of Caldera products, the former Linux software division of SCO. They were always very easy to install, mixed with a combination of freely available and commercial software. They tended to be quite stable and useful, but rarely near the leading or bleeding edge. I used Caldera Open Linux Base, an early version of commercial and free software. It was decent. The one I really enjoyed was Caldera Open Linux eDesktop 2.4. Though now quite dated, at the time, it was one of the first really friendly, truly easy to install and configure systems.

More recently, I've used Caldera Open Linux Workstation 3.1. It's now dated, too, and probably pulled from the market. But it ran quite well and exhibited many of the same friendly characteristics as eDesktop 2.4. As far as SCO UNIX products go, I've used XENIX, the Microsoft predecessor to SCO UNIX, and I've used SCO UNIX, too, but I haven't used any of their products in the past two years. In the meantime, things changed: the company changed ownership from Caldera to SCO and underwent management changes, and my interest diminished.

But all in all, SCO seems to offer good, stable, but aging software. It compares very well to Linux software in stability, but lags in current features and tends to be higher in price because of its higher proprietary content. Given the present situation, I feel that SCO will probably try to make a little money off the current litigation, then sell off or get out of the business entirely. For that reason, I hesitate to recommend their software - the future stability of the company itself is in question.

More information on Caldera/SCO is available in the Linux-Select discussion group archives.

The Microsoft-SCO Connection By Steven J. Vaughan-Nichols

Cyber Cynic: The Microsoft-SCO Connection -- 21 May 2003

What is Microsoft really up to by licensing Unix from SCO for between 10 and 30 million dollars? I think the answer's quite simple: they want to hurt Linux. Anything that damages Linux's reputation, which lending support to SCO's Unix intellectual property claims does, is to Microsoft's advantage.

Mary Jo Foley, top reporter of Microsoft Watch, agrees with me. She tells me, "This is just Microsoft making sure the Linux waters get muddier. They are doing this to hurt Linux and keep customers off balance." Eric Raymond, president of the Open Source Initiative, agrees and adds, "Any money they (Microsoft) give SCO helps SCO hurt Linux. I think it's that simple."

Dan Kusnetzky, IDC vice president for system software research, also believes that Microsoft winning can be the only sure result from SCO's legal maneuvering. But, he also thinks that whether SCO wins, loses, or draws, Microsoft will get blamed for SCO's actions.

He's right. People are already accusing Microsoft of bankrolling SCO's attacks on IBM and Linux.

But is there more to it? Is Microsoft actually in cahoots with SCO? I don't think so. Before this deal, both SCO and Caldera had long, rancorous histories with Microsoft.

While Microsoft certainly benefits from any doubt thrown Linux's way, despite rumors to the contrary, Microsoft no longer owns any share of SCO and hasn't for years. In fact, Microsoft's last official dealing with Caldera/SCO was in early January 2000, when Microsoft paid approximately $60 million to Caldera to settle Caldera's claims that Microsoft had tried to destroy DR-DOS. While Microsoft never admitted to wrong-doing, the pay-off speaks louder than words.

The deal didn't make SCO/Caldera feel any kinder towards Microsoft. A typical example of SCO's view of Microsoft until recently can be found in the title of such marketing white papers as "Caldera vs. Microsoft: Attacking the Soft Underbelly" from February 2002.

Historically, Microsoft licensed the Unix code from AT&T in 1980 to make its own version of Unix: Xenix. At the time, the plan was that Xenix would be Microsoft's 16-bit operating system. Microsoft quickly found they couldn't do it on their own, and so started work with what was then a small Unix porting company, SCO. By 1983, SCO XENIX System V had arrived for 8086 and 8088 chips and both companies were marketing it.

It didn't take long, though, for Microsoft to decide that Xenix wasn't for them. In 1984, the combination of AT&T licensing fees and the rise of MS-DOS made Microsoft decide to start moving out of the Unix business.

Microsoft and SCO were far from done with each other yet, though. By 1988, Microsoft and IBM were at loggerheads over the next generation of operating systems: OS/2 and Unix. IBM was backing the Open Software Foundation (OSF), an attempt to come up with a common AIX-based Unix to battle the alliance of AT&T and Sun, which was to lead to Solaris.

Microsoft saw this as working against its plans for IBM and Microsoft's joint operating system project, OS/2, and its own plans for Windows. Microsoft thought briefly about joining the OSF, but decided not to. Instead, Bill Gates and company hedged their operating system bets by buying about 16% of SCO, an OSF member, in March 1989.

In January 2000, Microsoft finally divested the last of its SCO stock. Even before Caldera bought out SCO in August 2000, though, Microsoft and SCO continued to fight with each other. The last such battle was in 1997, when they finally settled a squabble over European Xenix technology royalties that SCO had been paying Microsoft since the 80s.

Despite their long, bad history, no one calling the shots in today's SCO has anything to do with either the old SCO or Caldera. I also think, though, that there hasn't been enough time for SCO and Microsoft to cuddle up close enough for joint efforts against IBM and Linux.

I also think that it's doubtful that Microsoft would buy SCO with the hopes of launching licensing and legal battles against IBM, Sun and the Linux companies. They're still too close to their own monopoly trials. Remember, even though they ended up only being slapped on the wrist, they did lose the trial. Buying the ability to attack their rivals' operating systems could only give Microsoft a world of hurt.

Besides, as Eric Raymond, in the Open Source Initiative's position paper on SCO vs. IBM, and Bruce Perens, in "The FUD War against Linux," point out, it's not like SCO has a great case.

Indeed, as Perens told me the other day, in addition to all the points that have already been made about SCO's weak case, SCO made most 16-bit Unix and 32V Unix source code freely available. To be precise, on January 23, 2002, Caldera wrote, "Caldera International, Inc. hereby grants a fee free license that includes the rights use, modify and distribute this named source code, including creating derived binary products created from the source code." Although not mentioned by name, the letter seems to me to put these operating systems under the BSD license. While System III and System V code are specifically not included, it certainly makes SCO's case even murkier.

SCO has since taken down its own 'Ancient Unix' source code site, but the code and the letter remain available at many mirror sites.

Given all this, I think Microsoft has done all they're going to do with SCO. They've helped spread more FUD for a minimal investment. To try more could only entangle them in further legal problems. No, SCO alone is responsible for our current Unix/Linux situation and alone SCO will have to face its day in court.

Unix/Xenix history comments -- Re: SCO boot disk images

From: [email protected] (Bill Vermillion)
Subject: Re: SCO boot disk images
References: <[email protected]> <[email protected]> <[email protected]> <[email protected]>
Date: Mon, 11 Feb 2002 00:00:22 GMT

In article <[email protected]>,
Bela Lubkin <[email protected]> wrote:
>Bill Vermillion wrote:

>> In article <[email protected]>,
>> Tony Lawrence <[email protected]> wrote:
>> >Bela Lubkin wrote:

>> >> Amazing that after all these years and however many SCO Xenix,
>> >> Unix and OpenServer systems in the field (well over a million
>> >> licenses sold), they still can't bring themselves to name it.
>> >> Also amazing the GNU HURD chose the same ID as SysV Unix --
>> >> something like 10 years after SysV Unix had already claimed
>> >> it...

>> >Yeah, doesn't it just frost you? Far and away SCO had more Unix out
>> >there than anybody else, but it always got ignored- didn't exist as
>> >far as anyone else was concerned.

>> I was running SysV systems on iNTEL devices before SCO ever brought
>> forth their first Unix implementation. The SysV2 was realy raw as it
>> was one where you had to add all the lines in the /etc/password file
>> by hand, and make sure you got all the :'s correct, etc. That was
>> from MicroPort - which I think was the first iNTEL based SysV2. When
>> I used Esix - which was V.3 - it was about a year before SCO came
>> out with their V.3 based implementation. So you can't blame someone
>> for not mentioning something that hadn't yet existed. Of course
>> Linux came along after that so your point is valid there :-)

>The first releases of SCO Xenix were based on AT&T 7th Edition and then
>System III, back in 1984. But even then they used the 0x63 partition
>ID. MicroPort's '286 SysV was released in early '86, I believe. SCO's
>SysV port (Xenix System V 2.1.0 or so) was around the same timeframe,
>with the '386 version a year or two later.

Tony's comment was on SCO SysV Unix. I'm well aware of the early
Xenix uses as I maintained several machines with it. The MicroPort
time frame is correct as it was being promoted at the 1986 summer
Usenix conference where I first saw it. They were promoting it along
the lines of 'buy this hard drive and get Unix free'. This was
also about the same time I'd see your posts on the Dr. Dobbs forum.
[some of us still remember!]

>> Until SCO brought out the Unix implementation the Xenix would only
>> support 16 users - so that one reason it was looked down upon from
>> people in that group. It was thought of more along the lines of a
>> small car than a large truck.

>Was there really a 16-user limitation? I can only think of license
>reasons for that, not software...

That sticks in my memory - but I could be wrong on that one.

>> One of the constant grumblings was that which was forced upon SCO
>> by licensing issues and that was that only the base system was
>> there, and you had to purchase the development system and text
>> processing system separately. I heard that a lot from others
>> who were using SysV iNTEL based systems, who were independant of
>> the above group.

>SCO was forced to unbundle the devsys and text processing
>portions due to unfavorable license agreements with MS and AT&T,
>respectively. The royalties due on each of those portions would
>have made the overall price of the OS unacceptable. You could
>argue (and I would agree) that SCO should have made better royalty
>agreements with MS & AT&T, initially _or_ by renegotiating. But it
>didn't happen.

You notice I did say 'forced upon SCO'. It was the other Unix
users who complained - thinking it was something that SCO did on
purpose. And who among us who was on this list in the early 1990s
will ever forget all of Larry's rants against SCO. SCO's problem
as I see it was that they were about the only pure SW vendor while
others had HW ties.

Intel even had their own brand of Unix for a while. And maybe you
recall - was that the one that went to Kodak, which then became
Interactive, which then went to Sun? Others came and went.
SCO has always 'been there'. That's more than you can say for
other vendors who championed Unix for a while and then quit.

Dell comes immediately to mind - the one Larry championed so
loudly. They pushed it for a while and then they dropped it. Dell
later pushed Linux and then dropped it. Now if they would only
push MS products there might be hope ;-).

The next time I stumble across the SCO price list from 1984/5
that has the pricing for Xenix on the Apple Lisa and the Lyrix word
processing for other platforms [I think the VAX was included] I'll
scan it in. Far too many think of SCO as only doing Xenix and
Unix on iNTEL, but they were far more than that. Their list of
cross-assemblers for different CPUs/platforms was amazing too.
I had forgotten how broad that field was - they had well over a
dozen at that time.

>By the OSR5 timeframe, when we finally got rid of the MS-based
>development system, the idea of selling the DS separately was well
>entrenched, and persists to date (though you can get really deep
>discounts by joining a developer's program, which is either free or
>cheap -- I've lost track). And the text processing package had become
>almost completely irrelevant.

The EU suit against Microsoft, making them stop forcing the inclusion
of the Xenix code, was a good thing. ISTR that it was only about
six months after that when SCO was able to drop that part. People
seem to forget that MS's licensing hurt more than just MS users.
Given the environments where SCO was used I don't know whether
a cheaper or bundled DS would have been beneficial to the business
side or not.

About the only thing the text-processing was being used for by that
time in many places was to write/format man pages, it seemed. Of
course all of SCO's man pages were already formatted, probably
because of this. And writing in troff style was certainly nothing
I ever felt I would like to learn.

The best parts of AT&T text processing never seemed to make it past
AT&T. I was really impressed by the Writer's Workbench. But by
that time serious document production was being done by companies
who specialized in it - and it really didn't belong in the OS.
FrameMaker comes immediately to mind. That did some truly amazing
things, but its target customers were HUGE companies. Main users
were places such as drug manufacturers, who would generate a
semi-truck full of paper for submission for drug approval, and
automobile manufacturers. Unix really shone in those environments.

Bill

--
Bill Vermillion - bv @ wjv . com

Computer Source Publications - Under The Hood Part 8

SourceMagazine.com

Under The Hood: Part 8
November 4, 2002 - Computer Source Magazine

E Pluribus UNIX

Last time, I explained how UNIX evolved and became the operating system (OS) with which most computer science students were most familiar and wanted to emulate. Now, I'll explain how Microsoft became a champion of UNIX for the microcomputer.

Due to its portability and flexibility, UNIX Version 6 (V6) became the minicomputer OS of choice for universities in 1975. At about the same time, microprocessors from Intel, Motorola, MOS Technology and Zilog ushered in the age of the microcomputer and created the home- or personal-computer market. That's also when Bill Gates and Paul Allen founded Micro-Soft and created the MBASIC Beginner's All-purpose Symbolic Instruction Code (BASIC) interpreter for the MITS Altair 8800 microcomputer.

Ironically, MBASIC was actually written on a Digital Equipment Corporation (DEC) Programmed Data Processor 11 (PDP-11) minicomputer running UNIX V6 on the University of California at Berkeley (UCB) campus, which Gates and Allen timeshared.

The Altair 8800 used the Intel 8080 microprocessor, which couldn't run UNIX. Instead, it used MBASIC as a self-contained programming environment. The same was true of the MBASIC interpreters for the Motorola 6800 for the Ohio Scientific OS-9, the Zilog Z80 for the Tandy/Radio Shack TRS-80 and the MOS Technology 6502 for the Apple II and Commodore Personal Electronic Transactor (PET). Each MBASIC interpreter was custom-written specifically for each processor.

UNIX could also be ported to different processors, but at that time only ran on high-end minicomputer and mainframe systems from DEC and IBM. In 1974, the closest thing to UNIX was Digital Research CP/M for the Intel 8080 and Zilog Z80 microprocessors. By 1977, 8080 or Z80 systems with an S-100 bus running CP/M were considered as close to a "real computer" running UNIX as you could get with a microcomputer. It was at this time that Micro-Soft became Microsoft and expanded its inventory of language offerings.
In 1978, Bell Labs distributed UNIX with full source-code and, within a year, academic researchers began developing their own custom versions, most notably the UCB Berkeley Standard Distribution (BSD). In 1979, Microsoft licensed UNIX directly from AT&T, but couldn't license the UNIX name, so it called its UNIX variant Microsoft XENIX.

XENIX was originally developed on a DEC Virtual Address Extension (VAX) running the Virtual Memory System (VMS) and a PDP-11 running UNIX V7, albeit now using Microsoft's own in-house minicomputers, and then converted into assembly language specific to the new 16-bit Motorola 68000 and Intel 8086 microprocessors. This put XENIX at the high end of the microcomputer market, which was still dominated by 8-bit machines, but well below the lowest end of the minicomputer market.

In 1979, brothers Doug and Larry Michels founded the Santa Cruz Operation (SCO) as a UNIX porting and consulting company using venture capital from Microsoft, which handed over all further development of Microsoft XENIX to SCO. Doug Michels recalled that the company's name was a bit of "social engineering" to obscure the fact that it was essentially a two-man operation. "I'd call up and say, 'This is Doug from the Santa Cruz Operation' and be pretty sure they wouldn't catch that the 'O' was capitalized and think I was from another branch of their company."
By 1980, the UNIX family tree had split into three distinct major branches:

  1. AT&T UNIX System III from Bell Labs' UNIX Support Group (USG).
  2. Berkeley Standard Distribution 4.1 from UCB.
  3. XENIX 3.0 from Microsoft and SCO.


Microsoft XENIX was initially an Intel 8086 port of AT&T UNIX Version 7 with some BSD-like enhancements. This became Microsoft/SCO XENIX 3.0 a year or so later. SCO XENIX 5.0 was updated to conform to AT&T UNIX System V Release 0 (SVR0) in 1983, for which SCO bought its own rights to the UNIX source code. XENIX went on to have the largest installed base of any UNIX system during the early 1980s.

Microsoft acquired a 25 percent share of SCO, which at the time gave it a controlling interest. While SCO handled the actual development and added some enhancements of its own, Microsoft handled the marketing of the product, which it touted as the "Microcomputer Operating System of the Future!"

A 1980 issue of Microsoft Quarterly stated, "The XENIX system's inherent flexibility … will make the XENIX OS the standard operating system for the computers of the '80s." The 1983 XENIX Users' Manual declared, "Microsoft announces the XENIX Operating System, a 16 bit adaptation of Bell Laboratories UNIX™ Operating System. We have enhanced the UNIX software for our commercial customer base, and ported it to popular 16-bit microprocessors. We've put the XENIX OS on the DEC® PDP-11™, Intel® 8086, Zilog® Z8000 and Motorola® 68000." It went on to warn against "so-called UNIX-like" products. Similar sentiments were echoed in ads for Microsoft XENIX in the UNIX Review and UNIX World magazines as late as 1984. That's when Microsoft and SCO had a parting of the ways.
What changed?
On August 12, 1981, the IBM Model 5150 Personal Computer changed everything. Then, on January 24, 1984, the Apple Macintosh changed everything … again!

-Dafydd Neal Dyar

notes01

Slide ``Unix history'':

trademark UNIX, genetic UNIX, UNIX-like
	very simple: trademark UNIX: certified by The Open Group to bear the
		trademark UNIX(R), genetic: by inheritance, possibly without
		trademark (examples: BSDs), UNIX-like: independent (linux)

	1969: Thompson and Ritchie worked on Multics: Multiplexed Information
		and Computing Service; joint effort by Bell Telephone
		Laboratories (BTL), GE and MIT.

		BTL withdrew, and "Unics" (UNIplexed Information...) was
		written from scratch in PDP-7 assembly; Thompson then wrote
		"B", a ``cut-down version'' of BCPL (Basic Combined
		Programming Language)

	1970: First elementary UNIX system installed on PDP-11 for text
		preparation, featuring such exciting tools as ``ed'' and
		``roff''.

	1971: First manual published.  (Manual in print!)  Commands included
		b, cat(1), chmod(1), chown(1), cp(1), ls(1), mv(1), wc(1)

	1972: Ritchie rewrote "B" to become "C"; Thompson implemented the
		concept of the pipe (which is attributed to Douglas McIlroy,
		but simultaneously popped up at Dartmouth)

[Photo caption: Ritchie and Thompson, porting UNIX to the PDP-11 via two Teletype 33 terminals.]

	1974: Thompson taught at Berkeley.  Together with Bill Joy (Sun!),
		Chuck Haley and others, he developed the Berkeley Software
		Distribution (BSD)

		BSD added (over the years) the vi editor (Joy), sendmail,
		virtual memory, TCP/IP networking

	throughout 70's:
		Second Berkeley Software Distribution, aka 2BSD.  final
		version of this distribution, 2.11BSD, is a complete system
		used on hundreds of PDP-11's still running in various corners
		of the world.
		
		spread of UNIX also thanks to VAX, to which Joy ported 2BSD,
		which included the virtual memory kernel

		other important factors influencing OS development:  research
		from Xerox Parc, including GUI studies and the mouse, then
		adopted by Steve Jobs for the Apple OS.

	1978: UNIX Version 7 was released and licensed (free to universities);
		from here on (actually, since Version 5), we see two distinct
		directions: BSD and what was to become ``System V''

	1979: 3BSD released for VAX

		At this time, commercial vendors become interested in UNIX and
		start to license it from BTL / ATT.  ATT could not sell the
		work from BTL, so it spread to universities and academia.
		Eventually, work from ATT was taken over by Western Electric
		(WE), to become UNIX System Laboratories, later on owned by
		Novell.


		Some dates:

Version 6 	1975 	Universities
Version 7 	1978 	Universities and commercial. The basis for System V.
System III 	1981 	Commercial
System V, Release 1 	1983 	Commercial
System V, Release 2 	1984 	Commercial, enhancements and performance improvements
Version 8 	1985 	Universities
Version 9 	1986 	Universities
Version 10 	1989 	Universities


	The 80s:
		Note that there never was a ``System I'':  it was thought that
		a ``I'' would imply a buggy system.  (Note references to today's
		version insanity:  RH 100.12, SuSE 7, etc.; Solaris 2.6
		becomes Solaris 7.)  Software version numbering is (largely)
		arbitrary!

		Note there was no 5BSD:  4.1BSD should have been 5BSD, but
		ATT thought users would get confused with ``System V'' and
		objected.

		4.2BSD shipped more copies than SV, since SV at the time did
		not yet include TCP/IP (developed with funding from DARPA at
		Berkeley) or the Berkeley Fast Filesystem (covered in a future
		lecture).

Legal stuff:
	Up until 4.3BSD-Tahoe (in which machine-dependent and
	machine-independent parts were first separated), everybody who wanted
	to get BSD had to get a license from ATT (BSD was never released as
	binary only, but always contained the source, too).  Other vendors
	wanted to use the Berkeley networking code, but didn't want to pay for
	the entire BSD binaries in licenses.

	TCP/IP, entirely developed at Berkeley, was broken out of BSD and
	released in 1989 as the Networking Release 1, with the first BSD
	license.

	Then, people wanted to get a more complete version of a freely
	redistributable OS.  So folks at Berkeley started to rewrite every
	utility from scratch, solely based on the documentation.  In the end
	only about 6 files were left that were ATT contaminated and could not
	trivially be rewritten.  The rewrites etc. were released as Networking
	Release 2;  soon after, the remaining parts were rewritten and 386BSD
	was released, which then turned into NetBSD (hence the name).

	Similarly, BSDI rewrote the files and released their system, even
	advertising it as UNIX (call 1-800-ITS-UNIX).  USL (Unix System
	Laboratories), the part of ATT that sold UNIX, did not like that one
	bit.  Even after BSDI changed their ads and didn't claim it was UNIX,
	they still sued, claiming that BSDI contains USL code.

	BSDI argued that it was willing to discuss the six files they wrote,
	but that they should not be held responsible for the Berkeley code.
	USL, knowing they'd have no case based on just six files, *refiled*
	the lawsuit against BSDI and UC Berkeley.

	UC Berkeley then counter-sued USL, saying they didn't comply with
	their license (no credit for code incorporated into SV).  Soon after,
	USL was bought by Novell, and settlement talks started.  Settlement was
	reached in 1994: three out of 18,000 files of Networking Release 2
	were removed, some minor changes.

	This was released as 4.4BSD-Lite.  Since the settlement also included
	that USL would not sue anybody who did use 4.4BSD-lite as the basis,
	BSDI, NetBSD and FreeBSD all had to merge their changes with
	4.4BSD-lite and get rid of the encumbered files.

	We'll cover more legal stuff in a future lecture.


BSD: First to support VM (inspired by SunOS implementation and MACH OS); SunOS
versions 3 and 4, which brought the UNIX world some of its most well known
features, such as NFS, were derived from 4.2BSD and 4.3BSD, respectively. The
NetBSD and BSD/OS operating systems are descendants of 4.4BSD, with a wealth
of new features; FreeBSD and OpenBSD are two other operating systems which are
descended from 4.4BSD through NetBSD, and, last but certainly not least,
Apple's "Rhapsody", "Darwin", and "OS X" operating systems are mostly NetBSD
and FreeBSD code running atop a microkernel.

System V R4: Commercial UNIX's attempt at standardization. Most commercial
versions of UNIX are compliant with SVR4; some are compliant with the more
recent, but only slightly different, SVR4.2 or SVR5 specifications. Sun's
Solaris is SVR4, with many Sun enhancements; SGI's IRIX is, similarly, SVR4
with many subsequent changes by SGI; UnixWare is SVR4.2 or SVR5, depending on
version. Most differences between different vendor operating systems derived
from SVR4 are not obvious to the application programmer or are comparatively
minor.

Another Unix-like operating system is Linux. While Linux isn't directly
descended from any version of Unix, it is generally similar to SVR4 from the
programmer's point of view; modern Linux systems also implement much of the
functionality of the 4.4BSD-derived systems such as NetBSD or OS X.
Unfortunately, sometimes Linux quite simply "goes its own way" to an extent
otherwise relatively uncommon in the modern era of Unix; functions won't work
quite as they do in other versions of Unix, common utilities will implement
strange new options and deprecate old ones, and so forth. If you learn to
write code that is portable between many versions of Unix, it will run on
Linux -- but be prepared to scratch your head at times!

Xenix was Microsoft's version of UNIX for microprocessors.  When Microsoft
entered into an agreement with IBM to develop OS/2, it lost interest in
promoting Xenix.  Microsoft transferred ownership of Xenix to SCO in an
agreement that left Microsoft owning 25% of SCO.  However, Microsoft continued
to use Xenix internally, submitting a patch to support functionality in UNIX
to AT&T in 1987, which trickled down to the code base of both Xenix and SCO
UNIX. Microsoft is said to have used Xenix on VAX minicomputers extensively
within their company as late as 1992.

SCO released a version of Xenix for the Intel 286 processor in 1985, and
following their port of Xenix to the 386 processor, a 32-bit chip, renamed it
SCO UNIX.


Today's big UNIX versions: SCO UNIX, largely SVR4, mostly irrelevant
			SCO UnixWare (from Novell (again)).
			SCO Linux (from Caldera)

			SunOS: largely irrelevant, as it's superseded by
			Solaris.  SunOS is BSD derived.

			Solaris: SVR4, including features from SunOS (such as
			NFS).  Naming nonsense:  Solaris 2.x is really SunOS 5.x
			underneath (Solaris 2.3 = SunOS 5.3, etc.)  silly
			Strength: NFS
			One of the most popular UNIX versions.  Here lie big
			bucks!

			HP-UX: Version 10 mostly SVR4, dunno much about it.
			
			Digital UNIX: DEC's version, mostly BSD.

			IRIX:  guinness, SV based, includes BSD extensions /
			compatibilities from SGI for Mips architecture.
			Strength: XFS, graphics (SGI -> OpenGL)
			Might soon go away in favor of Linux

			AIX: IBM's, SV based, including a large number of BSD
			changes.  dunno much about it, but obviously IBM's big
			in the Linux business these days
			Acronym: Advanced Interactive eXecutive or AIn't uniX

			*BSD:  NetBSD first BSD.  Designed for correctness
			(portability is just a side effect!).  First release
			was 0.8 in March 1993.
			FreeBSD, born as 1.0 in Dec. 1993, concentrates on
			i386 performance
			OpenBSD forked off NetBSD in October 1995,
			self-proclaimed focus on security.
			BSD/OS or BSDi: commercial BSD.  Mostly irrelevant.

			Linux: completely different:  neither genetic nor
			trademark.  Minix-like.  Just a kernel, not a complete
			OS.  Only GNU makes it a complete OS (so Stallman has
			a point, after all).  Kernel first announced in
			1991.  Monolithic.

Slide ``Some UNIX versions'':
Monolithic vs microkernel:
The microkernel approach consists of defining a very simple virtual machine
over the hardware, with a set of primitives or system calls to implement
minimal OS services such as thread management, address spaces and interprocess
communication.
The main objective is the separation of basic service implementations from the
operation policy of the system.

Examples of a microkernel: 
	GNU Hurd, AIX, Windows NT, Minix, QNX

Monolithic kernels:
	traditional UNIX kernels, BSD
hybrid monolithic kernel:
	can load modules at runtime (Linux, BSD, ...)


Some of the more interesting other UNIX and UNIX-like OS:

		GNU started in 1983 by Stallman.  Intention: use HURD
		(a Mach microkernel), then took Linux.  GNU's Not Unix.

		HURD:  a unix-like microkernel, currently based on GNU Mach

		Mach: unix-like microkernel developed at Carnegie Mellon.
		Mach-based OS include NeXTSTEP and OS/2

		NeXTSTEP: of interest due to connection to Mac OS X.
		Influence in WindowManagers:  see AfterStep, WindowMaker etc.
		Founded by Steve Jobs (throwing in the towel at Apple after
		revolutionizing the GUI using ideas from Xerox Parc with
		the Apple Lisa and the Apple Macintosh) around 1985.

NeXTSTEP is the original object-oriented, multitasking operating system that
NeXT Computer, Inc. developed to run on its proprietary NeXT computers
(informally known as "black boxes"). NeXTSTEP 1.0 was released in 1989 after
several previews starting in 1986, and the last release 3.3 in early 1995. By
that point NeXT had teamed up with Sun Microsystems to develop OpenStep, a
cross-platform standard and implementation (for SPARC, Intel, HP and NeXT m68k
architectures), based on NeXTSTEP.

		includes: mach-kernel + BSD code, postscript windowing engine
		(see OS X), Objective-C, advanced, interesting UI (dock,
		shelf)

		Apple bought NeXT (and with it NeXTSTEP) in 1997 (putting it
		again under Steve Jobs).


	Darwin:
		open source, XNU kernel, integrates Mach 3.0, and Net- and
		FreeBSD.  Stand-alone OS, but really interesting as the core
		of Mac OS X.

	Mac OS X:
		Apple's Unix (not UNIX - not trademarked).  Combines NeXTSTEP
		and Darwin.  Quartz (postscript based windowing engine),
		netinfo, unix core + pretty user interface

Some other interesting UNIX versions:
	QNX:	POSIX compliant real-time UNIX-like OS

	Plan 9 / Inferno:
		Since 2003 even Open Source.
		Incorporates interesting Operating System research.  again
		from Bell Labs.  Absolutely _everything_ is a file, no
		distinction between a local and a remote object.  low-level
		networking protocol known as 9P


Slide: UNIX Basics:
	- kernel: schedules tasks, manages storage, controls hardware,
	  services system calls

	- shell: user interface to the OS and other applications, usually a
	  command interpreter.  provides the functionality of pipes (see
	  the pipe sketch after this list).

	- tools and applications: everything else.  Most common tools adhere
	  to written (POSIX, SUSV3) or unwritten standards.

	- multitasking: running multiple jobs at the same time without hanging
	  in between

	- multiuser: more than one person can use the same resources at one
	  time.  Windows did not have this until XXX, Mac OS only since OS X!
	  Prioritization (nice(1)), user privileges and permissions etc.

	- portability: easily port applications from one UNIX to another --
	  even if sometimes hurdles have to be overcome.  But try porting from
	  Mac OS (pre-X) or Windows to UNIX!

	- networking capabilities: inclusion of TCP/IP early on brought email
	  to every UNIX.  UNIX is the foundation of the internet.
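
A minimal C sketch of what a shell does to set up a pipeline such as
"ls | wc -l": create a pipe, fork one child per command, redirect
stdin/stdout with dup2(2), then exec.  The two commands are chosen only
for illustration.

/* Sketch: how a shell might wire up "ls | wc -l" using
 * pipe(2), fork(2), dup2(2) and execlp(3). */
#include <stdio.h>
#include <stdlib.h>
#include <sys/wait.h>
#include <unistd.h>

int main(void)
{
    int fd[2];                          /* fd[0] = read end, fd[1] = write end */

    if (pipe(fd) == -1) {
        perror("pipe");
        exit(1);
    }

    if (fork() == 0) {                  /* first child: "ls" */
        dup2(fd[1], STDOUT_FILENO);     /* stdout -> write end of the pipe */
        close(fd[0]);
        close(fd[1]);
        execlp("ls", "ls", (char *)NULL);
        perror("execlp ls");
        _exit(127);
    }

    if (fork() == 0) {                  /* second child: "wc -l" */
        dup2(fd[0], STDIN_FILENO);      /* stdin <- read end of the pipe */
        close(fd[0]);
        close(fd[1]);
        execlp("wc", "wc", "-l", (char *)NULL);
        perror("execlp wc");
        _exit(127);
    }

    close(fd[0]);                       /* parent: close both ends ... */
    close(fd[1]);
    while (wait(NULL) > 0)              /* ... and wait for both children */
        ;
    return 0;
}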


Slide: Unix Basics necessitate:

	- multi user concepts:
		each user has unique ID, access granted based on numeric ID,
		not name
		concepts of groups
		some UNIX versions have ACLs

		file ownership:
			ls -l
			chmod bits
			chown
			directory permissions, sticky bit, /tmp dir
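
A small C sketch of the system calls underneath ls -l, chmod(1) and
chown(1): stat(2) reports the numeric owner, group and permission bits,
and chmod(2)/chown(2) change them.  The file name is made up for the
example.

/* Sketch: inspect and change ownership/permission bits on a file. */
#include <stdio.h>
#include <sys/stat.h>
#include <unistd.h>

int main(void)
{
    const char *path = "/tmp/example.txt";   /* hypothetical file */
    struct stat st;

    if (stat(path, &st) == -1) {
        perror("stat");
        return 1;
    }
    /* owner and group are numeric IDs, exactly as noted above */
    printf("uid=%d gid=%d mode=%o\n",
           (int)st.st_uid, (int)st.st_gid, (unsigned)(st.st_mode & 07777));

    /* chmod 640: rw for the owner, r for the group, nothing for others */
    if (chmod(path, S_IRUSR | S_IWUSR | S_IRGRP) == -1)
        perror("chmod");

    /* changing ownership to another user needs uid 0; here we just
     * re-assert the current owner and group */
    if (chown(path, st.st_uid, st.st_gid) == -1)
        perror("chown");

    return 0;
}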

		process priorities:
			kill, pkill, killall
			nice, runaway processes
			fork-exec, PID 1, example login process:

			init (PID1) 	(via fork-exec)
			^ |
			| +----getty----getty----getty----getty
			|        |
			|      *exec*
			|        |
			|      login
			|        |
			|      *exec*
			|        |
			+----- shell ---  *fork-exec* --- ls
			         ^                         |
			         +-------------------------+
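
The same pattern in a minimal C sketch: the parent plays the role of the
shell in the diagram above; it forks, the child execs ls (just an example
command), and the parent waits for it.

/* Sketch of fork-exec: "shell" forks, child execs "ls", parent waits. */
#include <stdio.h>
#include <stdlib.h>
#include <sys/types.h>
#include <sys/wait.h>
#include <unistd.h>

int main(void)
{
    pid_t pid = fork();

    if (pid == -1) {
        perror("fork");
        exit(1);
    }
    if (pid == 0) {                        /* child */
        execlp("ls", "ls", "-l", (char *)NULL);
        perror("execlp");                  /* only reached if exec fails */
        _exit(127);
    }

    /* parent (the "shell") waits, just like in the diagram */
    int status;
    waitpid(pid, &status, 0);
    printf("child %d exited with status %d\n", (int)pid,
           WIFEXITED(status) ? WEXITSTATUS(status) : -1);
    return 0;
}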


		login:
			get username
			modify terminal not to echo next input
			get password
			open /etc/passwd

jschauma:abcdEFG12345:2379:600:Jan Schaumann:/home/jschauma:/bin/ksh

			get information
			if credentials are right, set UID, GID and HOME and
				finally exec shell
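
A very rough C sketch of that sequence.  Real login(1) does much more;
in particular, modern systems keep the password hash in /etc/shadow, so
this sketch only checks that the account exists and notes in a comment
where crypt(3) would be used.

/* Sketch of the login sequence described above. */
#include <pwd.h>
#include <stdio.h>
#include <stdlib.h>
#include <termios.h>
#include <unistd.h>

int main(void)
{
    char user[64], pass[128];
    struct termios t;

    printf("login: ");
    fflush(stdout);
    if (scanf("%63s", user) != 1)
        return 1;

    tcgetattr(STDIN_FILENO, &t);           /* modify terminal: no echo */
    t.c_lflag &= ~ECHO;
    tcsetattr(STDIN_FILENO, TCSANOW, &t);
    printf("Password: ");
    fflush(stdout);
    if (scanf("%127s", pass) != 1)
        return 1;
    t.c_lflag |= ECHO;                     /* restore echo */
    tcsetattr(STDIN_FILENO, TCSANOW, &t);
    printf("\n");

    struct passwd *pw = getpwnam(user);    /* reads the passwd entry */
    /* A real login would hash `pass' with crypt(3) and compare it to the
     * stored hash (in /etc/shadow on modern systems).  Here we only check
     * that the account exists. */
    if (pw == NULL) {
        fprintf(stderr, "Login incorrect\n");
        return 1;
    }

    /* credentials OK: set GID, UID and HOME, finally exec the shell */
    if (setgid(pw->pw_gid) == -1 || setuid(pw->pw_uid) == -1) {
        perror("setuid/setgid");           /* must be started as root */
        return 1;
    }
    setenv("HOME", pw->pw_dir, 1);
    if (chdir(pw->pw_dir) == -1)
        perror("chdir");
    execl(pw->pw_shell, pw->pw_shell, (char *)NULL);
    perror("exec shell");
    return 1;
}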



		communication with users:
			wall, write, talk
			mailing lists
			make sure you can always reach most of the users
			need to notify users in advance

	- super user account
		as before: only identified by ID 0
		toor vs root
		logins, remote logins, password in the clear, su and sudo

	- security considerations in a networked world
		the internet is no longer what it used to be:
			open relays for SMTP
			telnet vs ssh
		use encryption where possible
		strike balance between security and convenience
		guard against threats from inside or outside the network
			80% of all attempts come from inside
			95% of all successful attempts come from the inside!

UNIX and Bull by Jean Bellec

Bull, like many "old" computer companies, faced from the early 1980s the dilemma of "open systems". Bull had a "proprietary systems" culture and a business model oriented towards being the sole supplier of its customers' needs. Engineers in all laboratories were used to designing all the parts of a computer system, excluding the elementary electronic components. Even when Bull adopted a system from another laboratory, the whole system or software was revisited to be "adapted" to specific manufacturing requirements or to specific customer needs. The advent of open systems, where the specifications and the implementations were to be adopted as such, was a cultural shock that has traumatized the company ever since.

The new management of Groupe Bull in the 1980s was convinced of the eventual domination of open systems. Jacques Stern, the new CEO, even prophesied in 1982 the decline and fall of IBM under the pressure of government-backed open standards.

The Bull strategy was then to phase out the various proprietary product lines very progressively and to take positions in the promising open systems market.
Many UNIX projects had been considered in the various components of what was now part of Groupe Bull: Thomson had concluded agreements with Fortune, Transac Alcatel was considering its own line (based on the NS3032), CNET (the engineering arm of France Télécom) had developed its own architecture based on the Motorola 68000 (SM-90)...

The take-over of R2E in the late 1970s had given Bull-Micral a sizeable position in the PC market. But, at that time, many in the company did not envision the overwhelming success of personal computers. So, Bull decided to invest in the more promising minicomputer market based on the UNIX operating system.

Bull developed a UNIX strategy independently of Honeywell's. Honeywell did start a port of UNIX to a customized 386 PC and reoriented Honeywell Italia towards a 68000-based UNIX computer. However, plans were exchanged between the companies and did not differ significantly, while products were developed separately until the eventual take-over of Honeywell by Bull.

Open Software

UNIX was, at that time, the property of AT&T, which was also a potential competitor to existing computer companies. So, Bull undertook a lobbying effort both in the standards organizations (ECMA, ISO) and at the European Commission to establish UNIX as a standard not controlled by AT&T. This lobbying effort succeeded in establishing the X-Open standards, initially for Europe and eventually backed by U.S. manufacturers.

X-Open standardized the UNIX APIs (Application Programming Interfaces), an obvious desire for software houses. But that objective was not sufficient for a hardware or a basic software manufacturer. So, when approached by Digital Equipment and IBM in 1988, Bull supported with enthusiasm the OSF (Open Software Foundation), which had the purpose of developing an alternative to the AT&T-supplied UNIX source code. An OSF laboratory was installed in Boston with a subsidiary lab in Grenoble. Bull enlisted support for OSF from a majority of X-Open backers.
That was the climax of the Unix wars: while AT&T got the support of Sun Microsystems and of the majority of Japanese suppliers - including NEC - the OSF clan gathered H-P, DEC, IBM and even Microsoft, which planned support for the X-Open source code in the still-secret Windows/NT.
IBM had initially granted the AIX know-how to OSF, but a chasm progressively appeared between the Austin AIX developers and the Cambridge OSF. Eventually, OSF abandoned the idea to use AIX as the base of their operating system and went their own way.
When eventually delivered, the first version of OSF's operating system was adopted by DEC alone. IBM and H-P stuck to their own versions of UNIX.

In the meantime, Bull and Honeywell engineers had ported license-free old versions of UNIX to some mainframe architectures: Level 6, DPS-4, Intel PC and DPS-7. Those implementations were not fully X-Open standardized and their distribution was quite limited.

UNIX Hardware

UNIX was the only successful example of an architecture-independent operating system. In the early 1980s, that independence and the related openness of peripheral subsystems were considered enough to satisfy customers. Architects of all companies expected to remain free to invent new instruction sets, and the early 1980s saw a blooming of new RISC architectures, increasing processor performance and keeping many engineers busy porting "standard" software to those architectures.

The initial entry of Bull into the UNIX market was to adopt the French PTT CNET's platform known as SM-90. That platform was based on the Motorola MC-68000 microprocessor, for which Thomson (the future SGS-Thomson) had obtained a manufacturing license.

In parallel, Bull in its Echirolles center and Honeywell in Boston and Pregnana developed several versions of 68000-based UNIX systems. After the purchase of Honeywell's computer assets by Bull, those systems were consolidated into the DPX/2 product line.

Jacques Stern, convinced of the superiority of RISC architectures and having failed to convince his engineers to build the right one, decided in 1984 to invest in Ridge Computers, a Santa Clara start-up founded in 1980 by ex-Hewlett-Packard employees. Ridge systems were licensed to Bull and sold by Bull as the SPS-9. However, Ridge entered a financial crisis in 1986-1988 and, after new capital injections from Bull and others, eventually vanished.

Going back to Silicon Valley to shop for another RISC architecture in 1988, Bull decided to license the upper-range MIPS systems and to move its own MC-68000 products to the soon-to-be-announced MOS-technology MIPS microprocessors. MIPS looked very promising in 1990: its architecture was adopted by Digital, by Siemens, by Silicon Graphics, by Nintendo and others. However, the multiprocessor version of the MIPS chip was delayed and the company entered a financial crisis, ended by its absorption into Silicon Graphics.

Bull decided to abandon MIPS and went shopping for yet another partner. Both Hewlett-Packard and IBM courted Bull in 1991, each hoping Bull would adopt its architecture. The French prime minister publicly supported Hewlett-Packard, while Bull's Francis Lorentz and the French ministry of industry were leaning towards IBM.
Eventually, in January 1992, Bull chose IBM. It adopted the PowerPC RISC architecture, introduced the RS/6000 and entered into cooperative work with IBM's Austin laboratory to develop a multi-processor computer running IBM's AIX operating software. That project, code-named Pegasus, which involved the Bull laboratories of Pregnana and Grenoble, gave birth to the Escala product line.
The PowerPC architecture was completed by the adoption of Motorola's workstations and small servers by Bull as Estrella systems.
In the upper range, Bull attempted unsuccessfully to cooperate with IBM on the SP-x parallel systems. It also failed in its 1994 attempt to repackage Escala as a mainframe-priced system under the name Sagister.

Escala and AIX succeeded satisfactorily. But customers switching to UNIX from mainframes wanted the price of their systems low enough to offset their conversion costs and were very reluctant to buy the kind of hardware profitable to the manufacturer.

In addition to maintaining its AIX/PowerPC-based Escala systems, Bull also had to introduce, in 1996, a line of open systems designed by NEC, based on Intel Pentium microprocessors and running Microsoft Windows/NT.

Conclusion (not a definitive one)

The conversion of the industry to the Open Systems has been much slower than predicted in the early 1980s.

The large-systems customers were reluctant to move, perhaps afraid of Y2K problems. The lower end of the computer world adopted the Intel-Microsoft standard, and its success allowed many companies to take over the small-server market with Windows/NT.

Bull, when defining its UNIX strategy, was not expecting that the future of UNIX might reside in the open version (Linux) designed by a then obscure Finnish university programmer running on PC hardware with an early 1980s architecture.



2000-05-27 Falling on their Face: Six Incidents from Corporate History

3) In 1988/89, UNIX was growing by leaps and bounds. AT&T's Unix Systems Labs and the Open Software Foundation each had dozens of allies and were slinging "roadmaps" at each other, claiming to have the best view of the future of the computer industry.

It was obvious that networking, security, multiprocessing, multi-platform, and multi-vendor was the future. Obviously (to all of us UNIX-lovers), that DOS/Windows kludge was nothing but a sick joke of a toy. A prototype that should never have left the lab… worse, it was a copy of Apple's copy of a prototype at Xerox Palo Alto Research Center. All the good computer scientists knew that the future was already under development at CMU, Berkeley, Bell Labs and other great centers of UNIX-based innovation: X-Windows, NeWS, TCP/IP, Distributed Computing Environment (DCE), Kerberos, C++, Object Oriented Programming, etc.

The "roadmaps" of the various standards committees simply spelled out how each of these "excellent features" would be integrated into product over the next few years. Microsoft (probably Gates) looked at these and again saw the danger… the challengers were about to blow by Microsoft into the new "networked" world, leaving the boys with their toys in the dust, just the way they had left IBM a decade earlier.

Gates went out and bought the best Operating System architect willing to challenge UNIX dominance and the race to Windows NT was begun. By the time Windows NT 3.1 was finally released (some say the "Not a Toy" version of Windows 3.1), the UNIX community had largely self-destructed into a series of battles between vendors who were sure their fellow UNIX vendors were out to get them. The roadmaps that were waved so bravely in 1989 were never followed; USL was sold to Novell and then dismembered. OSF stumbled along and finally joined X-Open, and the attempt to "standardize" UNIX faded into history. Though IBM, NCR and DEC did field DCE implementations that spanned UNIX and mainframes, Sun and other UNIX vendors scoffed and followed ARPA funding, forsaking much of the early research and standardization efforts. Meanwhile, Microsoft fielded a cheaper, stripped-down DCE in the form of the Windows NT Domain model, which today, with Windows 2000, is beginning to meet the goals of the original DCE effort. Lesson: if you're going to publish a roadmap of where you're going, be sure you get there before your competition!

This bit of history makes me sick. In the case of the Apple/Microsoft encounter, Apple was the proprietary 'fool' and suffered for it. In this case, Microsoft first tried to join the UNIX community early on (remember XENIX?) but was roundly ostracized from the community. Apparently Gates saw a flaw in the UNIX brotherhood that we all missed. We missed the self-destructive Not-Invented-Here attitude that ultimately doomed the Open Systems revolution. We forced Microsoft into a one-against-many position, not unlike the position Apple chose. We should have won. Instead, we fractured into a mass of incompatible (enough) UNIX variants and proceeded to blame each other for the failure to meet the #1 promise of UNIX: source-level platform independence.

In this case, we committed fratricide while Microsoft took our plans for a Distributed Computing Environment and made it the centerpiece of their entire Enterprise business. I don't know about the rest of the UNIX community, but living through this history from my position in Bell Labs, I felt we did fall on our faces!

The Creation of the UNIX Operating System

About the FreeBSD Project

1.3.1 A Brief History of FreeBSD Contributed by Jordan Hubbard.

The first CDROM (and general net-wide) distribution was FreeBSD 1.0, released in December of 1993. This was based on the 4.3BSD-Lite (``Net/2'') tape from U.C. Berkeley, with many components also provided by 386BSD and the Free Software Foundation. It was a fairly reasonable success for a first offering, and we followed it with the highly successful FreeBSD 1.1 release in May of 1994.

Around this time, some rather unexpected storm clouds formed on the horizon as Novell and U.C. Berkeley settled their long-running lawsuit over the legal status of the Berkeley Net/2 tape. A condition of that settlement was U.C. Berkeley's concession that large parts of Net/2 were ``encumbered'' code and the property of Novell, who had in turn acquired it from AT&T some time previously. What Berkeley got in return was Novell's ``blessing'' that the 4.4BSD-Lite release, when it was finally released, would be declared unencumbered and that all existing Net/2 users would be strongly encouraged to switch. Under the terms of that agreement, the project was allowed one last release before the deadline, that release being FreeBSD 1.1.5.1.

FreeBSD then set about the arduous task of literally re-inventing itself from a completely new and rather incomplete set of 4.4BSD-Lite bits. The ``Lite'' releases were light in part because Berkeley's CSRG had removed large chunks of code required for actually constructing a bootable running system (due to various legal requirements) and in part because the Intel port of 4.4 was highly incomplete. It took the project until November of 1994 to make this transition, at which point it released FreeBSD 2.0 to the net and on CDROM (in late December). Despite being still more than a little rough around the edges, the release was a significant success and was followed by the more robust and easier to install FreeBSD 2.0.5 release in June of 1995.

We released FreeBSD 2.1.5 in August of 1996, and it appeared to be popular enough among the ISP and commercial communities that another release along the 2.1-STABLE branch was merited. This was FreeBSD 2.1.7.1, released in February 1997 and capping the end of mainstream development on 2.1-STABLE. Now in maintenance mode, only security enhancements and other critical bug fixes will be done on this branch (RELENG_2_1_0).

25th Anniversary of Unix

The Evolution of the Unix Time-sharing System by Dennis Ritchie, ca. 1979

The UNIX Time-sharing System--A Retrospective*

Dennis M. Ritchie
Bell Laboratories, Murray Hill, NJ, 07974

Prophetic Petroglyphs

Attached by magnet to the wall of my office is a yellowed sheet of paper, evidently the tenth page of an internal Bell Labs memo by Doug McIlroy. Unfortunately, I don't have the rest of the note.

To put my strongest concerns into a nutshell:

1. We should have some ways of connecting programs like garden hose--screw in another segment when it becomes necessary to massage data in another way. This is the way of IO also.

2. Our loader should be able to do link-loading and controlled establishment.

3. Our library filing scheme should allow for rather general indexing, responsibility, generations, data path switching.

4. It should be possible to get private system components (all routines are system components) for buggering around with.

Selected Computing Sciences Technical Reports

A Brief History of Unix By Charles Severance

CSRG Archive CD-ROMs

Thanks to the efforts of the volunteers of the ``Unix Heritage Society'' and the willingness of Caldera to release 32/V under an open source license, it is now possible to make the full source archives of the University of California at Berkeley's Computer Systems Research Group (CSRG) available. The archive contains four CD-ROMs; CD-ROM #1 covers the Berkeley systems from 1978-1986 (1bsd, 2.9pucc, 4.1, ...).

Twenty Years of Berkeley Unix- From AT&T-Owned to Freely Redistributable

A Brief History of UNIX

Netizens Netbook Table of Contents by Ronda Hauben and Michael Hauben Last Modified: 6/12/96


Foreword: By Tom Truscott
Preface: What is a Netizen?
Introduction: Participatory Networks

Part I - The Present: What Has Been Created and How?

Chapter 1 - The Net and the Netizens: The Effect the Net has on People's Lives
Chapter 2 - The Evolution of Usenet: The Poor Man's Arpanet
Chapter 3 - The Social Forces Behind The Development of Usenet
Chapter 4 - The World of Usenet

Part II - The Past: Where Has It All Come From?

Chapter 5 - The Vision of Interactive Computing and the Future
Chapter 6 - Cybernetics, Time-sharing, Human-Computer Symbiosis and On-line Communities: Creating a Supercommunity of On-line Communities
Chapter 7 - Behind the Net: Computer Science and the Untold Story of the ARPANET
Chapter 8 - The Birth and Development of the ARPANET
Chapter 9 - On the Early History and Impact of UNIX: Tools to Build the Tools for a New Millennium
Chapter 10 - On the Early Days of Usenet: The Roots of the Cooperative Online Culture

Part III - And the Future?

Chapter 11 - The NTIA Conference on the Future of the Net Creating a Prototype for a Democratic Decision Making Process
Chapter 12 - "Imminent Death of the Net Predicted!"
Chapter 13 - The Effect of the Net on the Professional News Media: The Usenet News Collective and Man-Computer News Symbiosis
Chapter 14 - The Net and the Future of Politics: The Ascendancy of the Commons
Chapter 15 - Exploring New York City's On-Line Community: A Snapshot of NYC.General

Part IV - Contributions Towards Developing a Theoretical Framework

Chapter 16 - The Expanding Commonwealth of Learning: Printing and the Net
Chapter 17 - `Arte': An Economic Perspective
Chapter 18 - The Computer as Democratizer

Bibliography
Glossary of Acronyms

Appendix

Proposed draft Declaration of the Rights of Netizens

USENIX ;login - Use the Source, Luke! Again

Editor's note: This article originally appeared in a slightly different form in the AUUG Newsletter.

use the source, Luke! again

By Warren Toomey
<[email protected]>

Warren Toomey is a lecturer in computer science at the Australian Defence Force Academy, where he just finished his Ph.D. in network congestion. He teaches operating systems, data networks, and system administration courses. He has been playing around on UNIX since 4.2BSD.

So you call yourself a UNIX hacker: you know what bread() is, and the various splxx() routines don't faze you. But are you really a UNIX hacker? Let's have a look at a brief history of UNIX and the community of UNIX users and hackers that grew up around it and some recent developments for real UNIX hackers.

UNIX took the academic world by storm in 1974 with the publication of Ken Thompson's paper about its design, which was published in Communications of the ACM. Although it didn't contain many radically new ideas, UNIX had an elegance, simplicity, and flexibility that other contemporary operating systems did not have. Soon lots of people were asking Bell Laboratories if they could get copies of this wondrous new system.

This was the cause of some concern within AT&T, because of the restrictions of an antitrust decree brought against them in the 1950s. This decree effectively stopped AT&T from selling or supporting software: they could only engage in telco business. Their solution to meet the UNIX demand was to charge a nominal "license" fee to obtain UNIX and to distribute tapes or disks "as is." You'd receive your disk in the mail with just a short note: "Here's your rk05. Love, Dennis."

AT&T's stance on UNIX was often seen as an OHP slide at early conferences:

"This slide was always greeted with wild applause and laughter," says Andy Tanenbaum. This lack of support was tolerated for several reasons: Ken and Dennis did unofficially fix things if you sent them bug reports, and you also had the full source code to UNIX.

At the time, having full source code access for a useful operating system was unheard of. Source code allowed UNIX users to study how the code worked (John Lions's commentary on the sixth edition), fix bugs, write code for new devices, and add extra functionality (the Berkeley Software Releases, AUSAM from UNSW). The access to full source code, combined with AT&T's "no support" policy, engendered the strong UNIX community spirit that thrived in the late 1970s and early 1980s, and brought many UNIX users groups into existence. When in doubt as to how a program (or the kernel) worked, you could always "use the source, Luke!"

During this period, UNIX became wildly popular at universities and in many other places. In 1982, a review of the antitrust decree caused the breakup of AT&T into the various "Baby Bell" companies. This gave AT&T the freedom to start selling software. Source code licenses for UNIX became very expensive, as AT&T realized that UNIX was indeed a money spinner for them. Thus the era of UNIX source code hackers ended, except for some notable activities like the 4BSD work carried out at the University of California, Berkeley.

Those organizations lucky enough to have bought a "cheap" UNIX source license before 1982 were able to obtain the 4BSD releases from UCB and continue to hack UNIX. Everybody else had to be satisfied with a binary-only license and wait for vendors to fix bugs and add extra functionality. John Lions's commentary on how the UNIX kernel worked was no longer available for study; it was restricted to one copy per source code license, and was not to be used for educational purposes.

What were UNIX hackers going to do with no UNIX source code to hack anymore? The solution was to create UNIX clones that didn't require source code licenses. One of the first was Minix, created by Andy Tanenbaum and aimed squarely at teaching operating systems. Early versions of Minix were compatible with the seventh edition UNIX; the most recent version is POSIX compliant and can run on an AT with 2 MB of memory and 30 MB of disk space.

Many Minix users tried to convince Andy to add features such as virtual memory and networking, but Andy wanted to keep the system small for teaching purposes. Eventually, a user named Linus Torvalds got annoyed enough that he used Minix to create another UNIX clone with these extra features. And so Linux was born.

While Linux was taking off like a plague of rabbits, the BSD hackers were working on removing the last vestiges of UNIX source code from their system. They thought they had done so, and BSDI released BSD/386, a version of 4.3BSD that ran on Intel platforms. AT&T, however, wasn't so sure about the complete removal of UNIX source code and took them to court about it.

AT&T is not a good company to be sued by: it has a small army of lawyers. Eventually, the conflict was settled out of court with a few compromises, and we now have several freely available BSDs: FreeBSD, NetBSD, and OpenBSD. Of course, they all come with source code.

UNIX hackers of the late 1990s surely have an abundance of source code to hack on: Linux, Minix, OpenBSD, etc. But are they really UNIX hackers, or just UNIX clone hackers? Wouldn't it be nice if we could hack on real UNIX, for old time's sake?

UNIX turned 25 in 1993, which makes its early versions nearly antiques. Many of the old UNIX hackers (hackers of old UNIX, that is) thought the time had come to get the old, completely antiquated UNIX systems back out for sentimental reasons. After all, ITS, CTSS, and TOPS-20 had been rescued and made publicly available, why not UNIX?

At the time, UNIX was undergoing a crisis of ownership. Did AT&T own UNIX this week, or was it Novell, Hewlett-Packard, or SCO? UNIX is a trademark of someone, but I'm not sure who. After the dust had settled, SCO had the rights to the source code, and X/Open had dibs on the name "UNIX," which is probably still an adjective.

During the ownership crisis, Peter Salus, Dennis Ritchie, and John Lions had begun to lobby Novell: they wanted John's commentary on UNIX to be made publicly available in printed form. It wasn't until the UNIX source code rights had been sold to SCO that this finally was approved. It helped to have some old UNIX hackers, Mike Tilson and Doug Michels, inside SCO to fight the battle. You can now buy John Lions's commentary on 6th Edition UNIX (with source code) from Peer to Peer Communications, ISBN 1-57398-013-7. As Ken Thompson says: "After 20 years, this is still the best exposition of a 'real' operating system."

One of the restrictions on the commentary's publication is that the UNIX source contained within cannot be entered into a computer. OK, so you can read the book, but what use is source code unless you can hack at it?!

At the time that SCO bought UNIX, I began to lobby SCO to make the old source available again, unaware of the efforts to release the Lions's commentary. SCO's initial response was "this will dilute the trade secrets we have in UNIX, and it wouldn't be economically viable." My efforts drew a blank.

To help bring greater lobbying power to bear on SCO, the PDP UNIX Preservation Society (PUPS) was formed. Its aims are to fight for the release of the old UNIX source, to preserve information and source from these old systems, and to help those people who still own PDP-11s to get UNIX up and running on them. After realizing that SCO was never going to make the old UNIX source code freely available, we explored the avenue of cheap, personal-use source licenses. The society set up a Web petition on the topic and gathered nearly 400 electronic signatures.

Inside SCO, we were very fortunate to contact Dion Johnson, who took up our cause and fought tooth and nail with the naysayers and the legal eagles at SCO. The combined efforts of the PUPS petition and Dion's hard work inside SCO have finally borne fruit.

On March 10, 1998, SCO made cheap, personal-use UNIX source code licenses available for the following versions of UNIX: first through seventh edition UNIX, 32V, and derived systems that also run on PDP-11s, such as 2.11BSD. The cost of the license is US$100, and the main restriction is that you cannot distribute the source code to people without licenses. Finally, we can be real UNIX hackers and "use the source, Luke!" again.

Acknowledgments and References

I'd like to thank Dion Johnson, Steven Schultz, the members of the PDP UNIX Preservation Society, and the people who signed the PUPS petition for their help in making cheap UNIX source licenses available again. Dion, in particular, deserves a medal for his efforts on our behalf.

You can find more about the PDP UNIX Preservation Society at <http://minnie.cs.adfa.oz.au/PUPS/> and details on how to obtain your own personal UNIX source license at <http://minnie.cs.adfa.oz.au/PUPS/getlicense.html>.

SCO won't be distributing UNIX source code as part of the license. PUPS members have volunteered to write CDs and tapes to distribute old versions of UNIX to license holders. We currently have fifth, sixth, and seventh editions, 32V, 1BSD, all 2BSDs, Mini UNIX, and Xinu. We are looking for complete versions of PWB UNIX and AUSAM. We desperately want anything before fifth edition and hope these early systems haven't gone to the bit bucket. Please contact us if you have anything from this era worth preserving.

If you are licensed and want a copy of the PUPS Archive, see the PUPS Web page above for more information. We expect to be deluged with requests for copies, so if you can volunteer to write CDs or tapes for us, please let us know.

You don't need to own a PDP-11 to run these old systems. The PUPS Archive has a number of excellent PDP-11 emulators. If you have bought a copy of the Lions's commentary (and you should), now you can run real sixth edition UNIX on an emulator. And if you want, you can hack the code!

SCO - Ancient UNIX -- SCO opened early UNIX code.

News: It's a blizzard--time to innovate

Twenty-five years and one month ago saw a doozy of a storm. Ward Christensen, mainframe programmer and home computer hobbyist, was stuck at home behind drifts too thick to dig. He'd been in the habit of swapping programs with Randy Suess, a fellow hacker--in the old sense of someone who did smart things with dumb electronics--by recording them onto cassettes and posting them.

They'd invented the hardware and software to do that, but in that same chilly month of 1978 someone called Dennis Hayes came up with a neat circuit called the Hayes MicroModem 100. Ward called Randy, complained about the weather, and said wouldn't it be a smart idea to have a computer on the phone line where people could leave messages. "I'll do the hardware. When will the software be ready?" said Randy.

The answer was two weeks later, when the Computerized Bulletin Board System first spun its disk, picked up the line and took a message. February 16th, 1978 was the official birthday: another two weeks after it really came to life, says Christensen, because nobody would believe they did it in a fortnight. He's got a point: these were the days when you couldn't just pop down to PC World and pick up a box, download some freeware and spend most of your time wondering what color to make the opening screen.

Everything about the CBBS was 1970s state of the hobbyist's art: a single 173-kilobyte 8-inch floppy disk to store everything, 300-baud modem, 8-bit processor running at a megahertz or so, and--blimey--64kb of memory.

Christensen wrote the BIOS and all the drivers (as well as the small matter of the bulletin board code itself), while Suess took care of five million solder joints and the odd unforeseen problem. Little things were important: the motor in the floppy disk drive ran from mains electricity instead of the cute little five volts of today--things burned out quickly if left on. So the floppy had to be modified to turn itself on when the phone rang, keep going for a few seconds after the caller had finished to let the rest of the computer save its data, and then quietly go back to sleep. Tell the kids of today that...

The kids of yesterday didn't need telling. Bulletin boards running CBBS spread across the US and further afield; by 1980, Christensen was reporting 11,000 users on his board alone, some of whom called in from Europe and Australia--in the days of monopoly telcos with monstrous international call charges. But that was because there was nothing else like it. People dialed in and got news instantly--well, after five hours of engaged tone--that would otherwise have to wait for the monthly specialist magazines to get into print. And of course, they could swap files and software, starting the process which today has grown into the savior of the human race or the destroyer of all that is noble and good (pick one).

The experience of a BBS (the C got dropped as alternative programs proliferated) was very different on some levels to our broadband, Webbed online lifestyle. Three-hundred baud is around five words a second: you can read faster than that. Commands were single characters, messages were terse but elegant, while a wrong command could land you with a minute's worth of stuff you just didn't need to know. Some software even threw off users who pressed too many keys without being productive enough: it was a harsh, monochrome and entirely textual world.
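
The "around five words a second" figure follows from simple arithmetic. Here is a minimal sketch of that back-of-the-envelope calculation, assuming the usual serial framing of 10 bits per character (8 data bits plus start and stop bits) and the conventional 6-character "word" (five letters plus a space) -- both are assumptions of this note, not figures from the article:

# Back-of-the-envelope throughput of a 300-baud BBS link.
# Assumptions (not from the article): 10 bits per character on the wire,
# 6 characters per "word".
BITS_PER_CHAR = 10
CHARS_PER_WORD = 6

def words_per_second(baud):
    """Approximate text throughput delivered at the given baud rate."""
    chars_per_second = baud / BITS_PER_CHAR
    return chars_per_second / CHARS_PER_WORD

print(words_per_second(300))    # 5.0  -- "around five words a second"
print(words_per_second(1200))   # 20.0 -- the 1200-baud modems of a few years later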

It was also utterly addictive. For the first time, people could converse with others independently of social, temporal or spatial connections. People made the comparison at the time with the great epistolary conversations of the Victorians, where men and women of letters sent long hand-written notes to each other two or three times a day, but BBS life was much more anarchic than that. You didn't know with whom you were swapping messages, but you could quickly find out if they were worth it. At first envisioned as local meeting places where people who knew each other in real life could get together from home, BBSs rapidly became gathering places for complete strangers--the virtual community had arrived, with its joys, flamewars and intense emotions.

Ten years later, bulletin boards had evolved into a cooperative mesh of considerable complexity. A system called Fidonet linked them together, so mail from one could be forwarded to another anywhere in the world via a tortuous skein of late night automated phone calls. File-transfer protocols, graphics, interactive games and far too much politics had all transformed that first box of bits and chips beyond recognition.

Then came that world's own extinction-level event, as the asteroid of the Internet came smashing through the stratosphere and changed the ecosystem for good. That we were ready for the Net, and that every country had its own set of local experts who'd been there, done that and knew what to do next, is in large part due to the great Chicago snowstorm of 1978 and two people who rolled up their sleeves to make a good idea happen. It's an anniversary well worth remembering.

alt.rest.in.peace

Jim Ellis, one of the founders of the Usenet system which predated and helped shape the Web, died Thursday of non-Hodgkin's lymphoma at his home in Pennsylvania. He was 45. The newbies amongst us might not be familiar with Usenet, a massive information-sharing service where files were swapped, friends and enemies were made, and just about every topic imaginable was discussed. It served as the model for modern message boards and for a long time was the coolest thing happening on the Internet. In 1979, as a graduate student at Duke University, Ellis helped design the system, linking computers at Duke to some at the University of North Carolina. Within just a few years, it spread worldwide. By 1993, there were 1,200 newsgroups and the system reflected an increasingly diverse and chaotic online community. Users would post messages and encrypted files in a series of newsgroups built into a hierarchy of interests, such as rec.collecting.stamps and comp.os.linux. The infamous alt. groups were home to the wilder topics, from alt.religion.kibology to alt.pave.the.earth.

In time, as with many communities, it got crowded and went into decline. By 1999, an estimated 37,000 newsgroups were in operation, and legitimate postings had largely been drowned out by ads, spam, and flame wars. But the impact of Ellis' creation on our modern Internet can't be dismissed. For his contributions, Jim Ellis received the Electronic Frontier Foundation's Pioneer Award in 1993 and the Usenix Lifetime Achievement Award in 1995.

An archive of Usenet postings dating back to 1995 is hosted by Google.

The People's Embassy -- Computer history section

10 Years of Impact: Technology, Products, and People -- The Role of Courage in Applied Research, by Ivan Sutherland

While the formulation of a research strategy is a business decision for Sun Labs, the choice of a worthy problem is a highly personal decision for researchers.

"Selecting a project worthy of your team's time and the company's money requires foresight, passion, suspension of disbelief, luck, and courage," said Dr. Ivan Sutherland, Vice President and Fellow of Sun Microsystems. A pioneer in the field of computer graphics and integrated circuit design, Ivan produced some of the world's first virtual reality and 3-D display systems as well as some of the most advanced computer image generators now in use. His groundbreaking research on asynchronous circuits has resulted in technology that could lead to circuits with much higher processing speeds than currently possible using conventional technology.

"A critical first step in picking problems is to understand where the technology will be in 10 years," he said. "And that requires two things: First, you need to project ahead based on your knowledge of the technology; but there's also a critical element of self-deception. The danger in projecting that far ahead is that you'll become overwhelmed with the complexity or difficulty of your mission and never actually get to work. So in part it's a matter of convincing yourself that things are really simpler than they are and getting started before you realize how hard the problem actually is.

"It is also important to weigh the opportunity," Ivan continued. "Some problems aren't worth consideration because a solution is simply too far off--a `beam me up Scotty' transporter, for example. I try to select a problem I think I can solve in a limited time frame using the existing base of technology.

"And on a personal level, it is important not to overlook the role of courage," said Ivan. "With research comes risk. Researchers daily face the uncertainty of whether their chosen approach will succeed or fail. We sometimes face periods of weeks, months, or even years with no visible progress. To succeed, you must have the personal fortitude to overcome discouragement and to keep your focus on the task at hand."

Successful development of technology can also require courage on the part of the enterprise, according to Jon Kannegaard, Sun Vice President and Deputy Director of Sun Labs. "It can be a leap of faith from a business perspective as well as a personal perspective," he said. "There isn't always an objective way to determine whether or not there's a pot of gold at the end of the rainbow, yet the company is called upon to decide whether or not to invest resources to develop the technology. In some cases, ongoing investment may be required to transform the technology into a product, again with no certainty in the outcome. There's courage required at every step."

USENIX ;login: A Tribute to Rich Stevens, by Rik Farrow

Hal Stern reflected that he respected Stevens's teaching and his ability to illuminate and explain. "Charles Mingus, late jazz bassist, sums it up well: 'Taking something complex and making it simple is true creativity.' We have lost one of our truly creative."

[Aug 25, 2001] HALL OF FAME

[Jul 1, 2001] Biographies of pioneers in the computing history

[Jun 30, 2001] Usenet Co-founder Jim Ellis Dies

Slashdot: "Jim Ellis, one of the cofounders of Usenet, has passed away. Usenet is considered the first large information sharing service, predating the WWW by years." He was 45 years old, and died after battling non-Hodgkin's lymphoma for 2 years. Usenet of course began in 1979, and is the 2nd of the 3 most important applications on the net (the first being email, and the third being the web). Truly a man who changed the world.

Thanks a lot. (Score:1, Interesting)
by Anonymous Coward on Friday June 29, @12:01PM EST (#34)
I never knew you, but thanks anyway, dude.

If Usenet is one of the first really democratic institutions, shouldn't we all recognize this as significant as when one of the country's Founding Fathers died? Just an idea...

Son of Usenet (Score:2, Interesting)
by mike_the_kid (http://www.nedyah.org/mailer.asp) on Friday June 29, @12:02PM EST (#35)
(User #58164 Info) http://www.nedyah.org/music_for_the_masses/
Anyone who remembers FidoNet or BBS can realize just how far ahead of its time usenet was. Fidonet was a direct descendant of usenet, and it was quite a resource in its heyday.
The model of usenet, where people can post new articles or reply to older ones is seen right here on slashdot discussions, and all the other web based discussion boards. Bulletin boards are one of the great things about the Internet. The format for discussion, seen today in mailing lists and forums like this, started with usenet.
Fido was my first exposure to this type of information, way before I had an IP address.
If the core of this model was not usenet, what was it? If it was, I must give credit to the people who developed usenet for their forward thinking on information exchange and hierarchy.
It is not a perfect system, but in its flaws (namely the signal to noise ratio) is hope for better methods of communication.

www.nedyah.org -- Careful!

He deserves respect (Score:5, Insightful)
by nougatmachine ([email protected]) on Friday June 29, @12:02PM EST (#39)
(User #445974 Info)
Besides the obvious need to have respect for the dead, I feel that Jim Ellis deserves respect because he made the first internet resource that strived to create a community atmosphere. This is the model that the web boards found on many websites were based on, and certainly was an influence on the Slashdot model. Whoever made the sarcastic comment about the graves saying "make money now", I understand you were trying to be funny, but I have a hard time laughing about people who have recently died. It's hardly Jim's fault Usenet has become such a wasteland.
So many important dudez in heaven (Score:4, Insightful)
by chrysalis on Friday June 29, @12:13PM EST (#69)
(User #50680 Info) http://www.jedi.claranet.fr
Richard Stevens. Douglas Adams (not really internet-related but definitely someone I loved). The ZIP algorithm inventor (sorry I can't remember his name) . And now Usenet's daddy. All rest in heaven now.
But do you think Richard Stevens and the Usenet creator were enjoying today's internet ? They built something that worked perfectly to exchange tons of messages with low bandwidths. Now, everyone has 100x the bandwidth they had when they designed their product. Computers are 100x faster. So what ? Do we find info 100x faster than before ?
Actually not. To read a simple text, you have to download hundreds of kilobytes. 99% is bloat (ads, bloated HTML, useless Java, etc) . Reading messages on a web discussion board is slow. You have to issue dozens of clicks before reading a thread, and wait for every ad to load. Usenet provided a consistent, sorted, easy to parse, and *fast* way to share info with other people.
7 years ago, I was providing access to 12000 newsgroups on Minitel. Minitel is a french terminal, with a 1200 bauds modem (and 75 bauds in emission) . And it worked. People could easily browse all Usenet news. Faster and easier than on web sites.
Another thing is that Usenet let you choose any client. You can choose your preferred fancy interface. Web discussion boards don't let you a lot of choice.
Migrating from Usenet to web sites is stupid. It wastes a lot of bandwidth for nothing. People do this because:
    • Everyone can open their own web site
    • People can force users to see web ads to read messages
Great deal. Web discussion boards provide inconsistency and redundancy. How many web sites discuss the same thing? How many questions are asked on a web site even though they were already answered on another web site? Usenet solved this a long time ago.
What killed Usenet is the load of uuencoded warez and spam. Everyone has to filter messages to find real ones. Lousy. But we can't fight stupidity. Give people mail access, they will send spam. Give people Napster, they will share copyrighted songs. Give people a CD writer, they will burn commercial software. Give people the web, they will DOS it or try root exploits. Give people usenet, they will kill it. And there's no way back.

    -- Pure FTP server - Upgrade your FTP server to something simple and secure.

  • That is truly sad (Score:4, Interesting)
    by jfunk ([email protected]) on Friday June 29, @12:23PM EST (#88)
    (User #33224 Info) http://www.funktronics.ca/
    The Internet to me, at first, was news, ftp, and telnet. I spent an inordinate amount of time in 'nn' every day reading sci.electronics, alt.hackers (that was a very fun newsgroup about *real* hacking), and a host of others.

    When I first saw the 'web' I thought, "this is crap, random words are linked to various things and it doesn't seem to make sense. Back to the newsgroups with me." I realise now that it was just my initial sampling that was total crap, but I kept up with the newsgroups anyway.

    I'm totally sad about the state of USENET over the past few years, and this just makes it all worse.

    However, for that long time I spent thriving on the USENET, I'll have to thank Jim Ellis. He indirectly helped me find out about Linux, electronics, hardware hacking, etc. Things I do professionally these days.

    I think it's a somewhat appropriate time for an:

    ObHack (I'm sorry if it's not a very good one. Good hacks, that are not your employer's intellectual property, seem to decrease to almost nothingness when you're no longer a poor student): We had this hub where a heatsink had broken off inside. I grabbed some solid wire and threaded it through the fins and through holes in the circuit board. Through a fair bit of messing around I made sure that it will *never* come out of place again. Ok, that was bad, so I'll add another simple one: Never underestimate the power of a hot glue gun. It allows you to easily provide strain relief for wires that you've soldered onto a PCB and I've also used it to make prototypes of various sensors. If you want to take it apart, an X-Acto knife does the trick very easily.

    Sigh.

    Honor Jim Ellis and help others with lymphoma (Score:5, Informative)
    by redowa (trillatpurpleturtledotcom) on Friday June 29, @02:56PM EST (#197)
    (User #102115 Info)
    One way to truly honor Jim Ellis's memory and his contributions to the internet as we know it would be to help find a cure for the cancer that killed him.
    The Leukemia & Lymphoma Society (nat'l. non-profit org.) has this amazing program called Team in Training - basically, you train for an endurance event (marathon, century cycle, triathlon, etc.), and in exchange for 3-5 months of professional coaching, staff support, transportation, accommodation, and entrance fee for your event, you agree to fundraise for the Leukemia & Lymphoma Society.
    It's such an inspiring experience. It's totally doable - you can go from complete slothdom to finishing a marathon in just a few months. And you get to meet patients with various blood-related cancers, and hear about their experiences - after you find out what chemo & marrow transplants are like, suddenly your upcoming 14-mile run doesn't seem so hard - and you directly affect their chances of survival with every dollar you raise. It is such a good feeling, both physically and mentally, to be a part of this program.
    Usenet when you could talk to the heroes (Score:2)
    by King Babar on Friday June 29, @03:46PM EST (#209)
    (User #19862 Info) http://www.missouri.edu/~kingjw
    Here, at the time of the passing of its co-creator, I see a great out-pouring of nostalgia for Usenet of old. I also see the posts of many people who were not lucky enough to have seen it at its zenith. I think the one most amazing aspect of Usenet was not merely that you could get fast answers to pressing technical questions, but that you had direct access to some real giants of that day, and see a little bit about how they think. It wasn't just that there was more signal, in some cases the signal came from the creator of whatever it was you were asking about. Even if they worked for a Big Important Company. So if you asked an interesting question in comp.sys.mac.hypercard, chances were good that somebody from Apple would respond. Alexander Stepanov used to respond to traffic about the C++ STL. World experts at your fingertips everywhere! It should have been paradise!

    And I have to say that by and large we really blew it. It wasn't just the spam, or even the massive flamefests. It was really the corrosive effects of ignorance and greed. Take Tom Christiansen (most recently [email protected]). Not always a bunch of rainbows and smiles, he, but an incredibly well-informed individual whose contributions to Usenet are the stuff of legend. Apparently chased away from Usenet for good by one too many "gimme gimme" question and one too many displays of horrible netiquette. A real tragedy.

    This was around the time I discovered Slashdot, and saw what looked like a more clueful albeit imperfect mirror of the Spirit of Usenet. I was quite cheered when I found out that tchrist himself was becoming a key contributor. It might be a new geek paradise! But, of course, that didn't happen. Tom got chased away again by a bunch of cretins.

    And, getting back to the idea of an elegy for Ellis, I believe the final straw there was some jerk maligning Jon Postel when his obituary came up in this forum. Much worse than spam.

    Babar

    Usenet was NOT the Internet (Score:5, Interesting)
    by JoeBuck (jbuck at welsh-buck dot org) on Friday June 29, @12:34PM EST (#114)
    (User #7947 Info) http://www.welsh-buck.org/jbuck/
    Back in the 80s, Usenet was the net for those of us who couldn't get on the Internet, because we didn't have the connections into DARPA (by virtue of being a defense contractor or big research university) to get on it. The only connectivity we had was 1200 baud modems (in some cases, 300 baud). The way you got on was that you had a Unix system and a modem, and a contact with someone that was willing to give you a news feed (possibly in exchange for lightening the load by feeding a couple of other folks).

    Actually, you didn't even need Unix. I was at a small company that did a lot of digital signal processing, and it was a VMS shop, so we ran Usenet on top of Eunice (a Unix-on-top-of-VMS emulation that sort of worked, but had only symbolic links, no hard links). I was the guy who did the Eunice port for 2.11B news: my first involvement in what would now be called a major open source project.

    Back in those days, to send mail you had to have a picture of the UUCP network topology in your head: a series of paths that would get you from here to there. There were a couple of short cuts: sites that would move messages across the country (ihnp4) or internationally (seismo, which later became uunet, the first commercial provider of news feeds).

    Because of the way Usenet worked, in the days where it went over UUCP (before NNTP), it was based on personal connections and a web of trust. Things were pretty loose, but if someone ignored community norms and behaved in a way that would clearly damage the fabric of the net, they just lost their news feed and that was that. It was cheap Internet connections and NNTP that made Canter and Siegel (the first big Usenet spammers) possible. But this reliance on personal connections had its downside: some admins enjoyed being petty dictators too much. The UUCP connection between AMD and National Semi (yes, competitors fed each other news on a completely informal basis, it was a different era) was temporarily dropped because of a personal squabble between the sysadmins.

    There were many other nets then that weren't the Internet: Bitnet, berknet (at Berkeley) and the like. Figuring out how to get mail around required wizardry: mixes of bang paths (...!oliveb!epimass!jbuck), percent signs, and at-signs (user%[email protected]).

    The user interfaces on sites like Slashdot are still vastly inferior to newsreader interfaces, like trn and friends. I could quickly blast through hundreds of messages, killing threads I wasn't interested in, skimming to get to the meat. If only sites like Slashdot would pay more attention to what worked so well about Usenet.
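
    A small sketch of the address forms mentioned in the comment above -- the relay and gateway names come from the comment itself, while the helper functions are hypothetical illustrations added here, not period code:

    # Illustrative sketch of UUCP-era mail address forms (hypothetical helpers).
    def bang_path(relays, user):
        """Source route: each host forwards the message to whatever follows the next '!'."""
        return "!".join(relays + [user])

    def percent_hack(user, remote_host, gateway):
        """Hybrid form: the gateway resolves 'user%host' once the '@' part has been consumed."""
        return "%s%%%s@%s" % (user, remote_host, gateway)

    # The "..." in the original ...!oliveb!epimass!jbuck stood for the hops the
    # sender's own site already knew; only the visible tail is reproduced here.
    print(bang_path(["oliveb", "epimass"], "jbuck"))      # oliveb!epimass!jbuck
    print(percent_hack("user", "site", "uunet.uu.net"))   # user%[email protected]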

    MAKE GREEN CARDS FAST WITH SERDAR ARGIC AND KIBO!! (Score:5, Insightful)
    by connorbd on Friday June 29, @12:01PM EST (#33)
    (User #151811 Info) http://www.geocities.com/ResearchTriangle/Station/2266
    I remember the end of the Usenet glory days (mid-90s, unfortunately just after the September That Never Ended), before it was swallowed by spam. Usenet IMHO is the place where net.culture grew up, even if it wasn't part of the Internet in the beginning. No offense to the /. community, but to those of you who never experienced it, Usenet back in the day was a place the likes of which we probably won't see again.

    Places like /. and k5 still have an echo of the old Usenet, and you likewise still get some of it on mailing lists now, but take a look through Google Groups now -- too much garbage, and the community that's there is somewhat isolated because Usenet isn't as integral to the net experience as it once was.

    Two taps and a v-sign for the man -- not everyone can claim to have created a true community single-handedly.

    /brian

    [Nov 25, 2000] Michael John Muuss -- homepage of the late author of ping...

    Mr. Muuss was born in 1958 and received a BES in Electrical Engineering from the Johns Hopkins University in 1979. He subsequently received numerous awards and citations for his work and was a two-time winner of the U.S. Army Research and Development Achievement Award.

    "The author of the enormously popular freeware network tool PING, Mike Muuss, died in a Maryland car crash last night. The accident happened at 9.30pm (New York time) on route 95 as a result of a previous accident.
    Mike hit a car stuck in the middle of the road and was pushed into the path of an oncoming tractor."

    http://www.theregister.co.uk/content/4/14936.html

    [Sep 12, 2000] Raph's Page Online World Timeline -- interesting game technology timeline

    Also mirrored in other sites, for example Four Below Zero - The Gamer - Timelines

    [Jul 21, 2000] Peter Salus wrote a short overview of the vi history in Open Source Library - Papers:

    The original UNIX editor was ed. It was a line editor of reluctant and recalcitrant style. When UNIX (version 4) got to Queen Mary College, London, in 1973, George Coulouris -- a Professor of Computing -- wasn't happy with it. So he wrote a screen editor, which he called "em," or "ed for mortals."

    Coulouris went on sabbatical to Berkeley, where he installed em on "his" machine. A graduate student noticed it one day, and asked about it. Coulouris explained. He then went off to New Jersey to Bell Labs, and when he returned to Berkeley, he found that em had been transmuted into ex, a display editor that is a superset of ed with a number of extensions -- primarily the one that enables display editing.

    At the beginning of 1978, the first Berkeley Software Distribution was available. It consisted of a tape of the Berkeley Pascal System and the ex text editor. The graduate student was Bill Joy, and the distribution cost $50. The next year Berkeley got some ADM-3a terminals, and Joy rewrote em to vi -- a truly visual editor.

    In sum, ed came out of Bell Labs in New Jersey, went to Queen Mary College in London, from there to the University of California at Berkeley, and from there back to New Jersey, where it was incorporated into the next edition of UNIX.

