
Fifty Glorious Years (1950-2000): the Triumph of the US Computer Engineering



The invention of the computer was the high point of the development of the US high-tech industry, the area which defined the progress of high tech as a whole. This is an area where the USA really was the greatest nation in the world, a real "shining city on the hill". The USA gave the world great programmers, hardware designers, network architects and managers. Those unique conditions, when the whole country was the large Silicon Valley of the world, were destroyed by the neoliberal transformation of society and, especially, the neoliberal transformation of higher education (when universities were turned into for-profit corporations run for the benefit of the university elite), which started in the 1980s and got up to full speed after 2001.

When ENIAC was declassified in 1946 (it made the front page of the New York Times), the computer revolution was put into fast motion. As early as 1952, on presidential election night, the UNIVAC computer correctly predicted the winner. While the chances were 50% ;-), this was an impressive introduction of computers into mainstream society. IBM, DEC, CDC and later Intel, HP, Apple and Dell emerged as the leading producers of hardware. With the advent of microprocessors, all major CPUs, including the Intel x86, Motorola 68000, PowerPC, etc., were US-designed. US programmers created all major world operating systems, such as OS/360, Multics, VM/CMS, VAX/VMS, Unix, CP/M, DOS, Windows, Apple's System 7, major Linux distributions such as Red Hat and Debian, and Android. In 1967 they wrote the first hypervisor (CP-67, later renamed CP/CMS, was available to IBM customers from 1968 to 1972, in source code form without support). In 1972 they shipped the first commercial hypervisor, VM/370. Later they created a series of impressive offerings in this area too, such as VirtualBox and VMware.

Most of the leading programming languages, such as Fortran, Cobol, PL/1, PL/M, Snobol, Lisp, Scheme, Basic, C, C++, C#, Objective-C, Korn shell, Perl, PHP, Java, JavaScript, Tcl, and the compilers/interpreters for them, were "made in the USA" too. From the early 1950s till approximately 2000, academic computer science was also completely dominated by US scientists. From the early 1950s the ACM was the most influential society of computer professionals, and till approximately 1975 its flagship periodical, Communications of the ACM, was the top professional publication of the field, although the British Computer Society's The Computer Journal was also of some stature and influence.

History is written by the winners, and the computer history of the twentieth century was definitely written in the USA. If we assume that professional success is a mixture of natural abilities, hard labor and luck (including being born at the right place at the right time), as Malcolm Gladwell suggested in his unscientific but now popular formulation of the "10,000-hour rule", it is clear that US scientists had all three components. But they were not alone: conditions in Great Britain, Germany and France were not bad either. While we should take Gladwell's findings and his 10,000-hour rule with a grain of salt, they point to one interesting observation. Most of those I mention below were born between 1920 and 1955 -- a window of opportunity in computer science which has since been virtually closed. It is similar to the 1830-1840 window for the titans of the Gilded Age, such as Rockefeller (1839), Carnegie (1835), Gould (1836) and J. P. Morgan (1837), that Gladwell mentioned. "No one—not rock stars, not professional athletes, not software billionaires, and not even geniuses—ever makes it alone", writes Gladwell.

At the same time it is important to see this history not only as "people, places and events" but also via artifacts, be they machines, programs or interviews of pioneers. This part of history is badly preserved in the USA. Moreover, there is a trend toward dumbing down the history of computer science. As Donald Knuth remarked (Kailath Lecture and Colloquia):

For many years the history of computer science was presented in a way that was useful to computer scientists. But nowadays almost all technical content is excised; historians are concentrating rather on issues like how computer scientists have been able to get funding for their projects, and/or how much their work has influenced Wall Street. We no longer are told what ideas were actually discovered, nor how they were discovered, nor why they are great ideas. We only get a scorecard.

Similar trends are occurring with respect to other sciences. Historians generally now prefer "external history" to "internal history", so that they can write stories that appeal to readers with almost no expertise.

Historians of mathematics have thankfully been resisting such temptations. In this talk the speaker will explain why he is so grateful for the continued excellence of papers on mathematical history, and he will make a plea for historians of computer science to get back on track.

History is always written by the winners, and that means right now it is written by neoliberals. Dumbing down the history of computer science is just the application of neoliberalism to one particular narrow field. The essence of the neoliberal approach to history is to dumb down everything. Dumbing down is a deliberate lowering of the intellectual level of education, literature, cinema, news, and culture. Deliberate dumbing down is the goal.

They use the power of vanity to rob us of the vision which history can provide. Knuth's lecture "Let's Not Dumb Down the History of Computer Science" can be viewed at Kailath Lecture and Colloquia. He made the important point that historical errors are as important as achievements, and probably more educational. In this "drama of ideas" (and he mentioned the high educational value of the errors/blunders of Linus Torvalds in the design of the Linux kernel), errors and achievements all have their place and historical value. History gives people stories that are much more educational than anything else; that is the way people learn best.

50 Giants of the field

The giants of the field either were US citizens or worked in the USA for a long time. Among them:

  1. Gene Amdahl (born November 16, 1922) -- architect of the System/360 hardware. Also formulated Amdahl's law.
  2. Frances E. Allen (born August 4, 1932) an American computer scientist and pioneer in the field of optimizing compilers. Her achievements include seminal work in compilers, code optimization, and parallelization. She also had a role in intelligence work on programming languages for the National Security Agency. Allen was the first female IBM Fellow and in 2006 became the first woman to win the Turing Award.
  3. John Backus (December 3, 1924 – March 17, 2007) -- designed FORTRAN and the first Fortran compiler, was one of the designers of Algol 60, and co-invented the Backus-Naur form. The IEEE awarded Backus the W.W. McDowell Award in 1967 for the development of FORTRAN. He received the National Medal of Science in 1975, and the 1977 ACM Turing Award "for profound, influential, and lasting contributions to the design of practical high-level programming systems, notably through his work on FORTRAN, and for publication of formal procedures for the specification of programming languages."
  4. Gordon Bell (born August 19, 1934) -- designed several of DEC's PDP machines (PDP-4, PDP-6, the PDP-11 Unibus and VAX). The DEC founders Ken Olsen and Harlan Anderson recruited him for their new company in 1960, where he designed the I/O subsystem of the PDP-1, including the first UART. Bell was the architect of the PDP-4 and PDP-6. He also contributed to the architecture of the PDP-5 and to the PDP-11's Unibus and General Registers architecture.
  5. Fred Brooks (born April 19, 1931) -- managed the development of IBM's System/360, with its innovative, ground-breaking hardware, and of OS/360, which was the dominant OS on IBM mainframes. He wrote a classic book about his experience as the manager of OS/360 development, The Mythical Man-Month (1975), which remains one of the most-read computer books from the 1970s. Brooks received the National Medal of Technology in 1985 and the Turing Award in 1999.
  6. Vint Cerf (born June 23, 1943) -- DARPA manager, one of "the fathers of the Internet", sharing this title with Bob Kahn.
  7. John Cocke (May 30, 1925 – July 16, 2002) -- led the design and implementation of famous optimizing compilers produced by IBM (the IBM Fortran H compiler), and was one of the "fathers" of the RISC architecture. Contributed to the theory of program graph analysis. He was instrumental in the design of the IBM 801 minicomputer, where he realized that matching the design of the architecture's instruction set to relatively simple instructions allows compilers to produce high-performance binaries at relatively low cost. He is also one of the inventors of the CYK algorithm (the C is for Cocke). He was also involved in the pioneering speech recognition and machine translation work at IBM in the 1970s and 1980s, and is credited by Frederick Jelinek with originating the idea of using a trigram language model for speech recognition.
  8. Fernando J. Corbató (born July 1, 1926) -- a pioneer in the development of time-sharing operating systems, including the famous MIT CTSS Time-Sharing System and the Multics OS. Inventor of Corbató's law. Essentially a godfather of Unix, which would never have happened had AT&T not been involved in the Multics project and learned MIT's technology during this period.
  9. Seymour Cray (September 28, 1925 – October 5, 1996) -- founder of Cray Research, "the father of supercomputing". Cray participated in the design of the ERA 1103, the first commercially successful scientific computer. By 1960 he had completed the design of the CDC 1604, an improved low-cost ERA 1103 that had impressive performance for its price range. After that he designed the CDC 6600, the first commercial supercomputer, outperforming everything then available by a wide margin. He then raised the bar with the later release of the five-fold faster CDC 7600. In 1963, in a Business Week article announcing the CDC 6600, Seymour Cray clearly expressed an idea that is often misattributed to Herb Grosch as so-called Grosch's law: computers should obey a square law -- when the price doubles, you should get at least four times as much speed. After founding Cray Research he released the famous Cray-1 supercomputer in 1976. As with earlier Cray designs, the Cray-1 made sure that the entire computer was fast, as opposed to just the processor.
  10. Charles Stark Draper  (October 2, 1901 – July 25, 1987) an American scientist and engineer, known as the "father of inertial navigation". He was the founder and director of the Massachusetts Institute of Technology's Instrumentation Laboratory, later renamed the Charles Stark Draper Laboratory, which made the Apollo moon landings possible through the Apollo Guidance Computer it designed for NASA.
  11. Whitfield Diffie (born June 5, 1944) is an American cryptographer and one of the pioneers of public-key cryptography. His interest in cryptography began at "age 10 when his father, a professor, brought home the entire crypto shelf of the City College Library in New York." Diffie and Martin Hellman's paper New Directions in Cryptography was published in 1976. It introduced a new method of distributing cryptographic keys. It has become known as Diffie–Hellman key exchange. The article also seems to have stimulated the almost immediate public development of a new class of encryption algorithms, the asymmetric key algorithms. Diffie and Susan Landau's influential book Privacy on the Line was published in 1998 on the politics of wiretapping and encryption. An updated and expanded edition appeared in 2007.
  12. Brendan Eich (born 1960 or 1961) an American computer programmer who created the JavaScript scripting language. Later he became the chief technology officer at the Mozilla Corporation. See his site Brendan Eich.
  13. Douglas Engelbart (January 30, 1925 – July 2, 2013)  -- co-inventor of the computer mouse, instrumental in the development of hypertext. These were demonstrated at The Mother of All Demos in 1968. Engelbart's Law, the observation that the intrinsic rate of human performance is exponential, is named after him.
  14. Philip Don Estridge (June 23, 1937 – August 2, 1985) -- led the team which developed the original IBM Personal Computer (PC), and thus is known as the "father of the IBM PC". His decisions dramatically changed the computer industry, resulting in a vast increase in the number of personal computers sold and bought (a computer for each family), thus creating an entire PC industry.
  15. David C. Evans (February 24, 1924 – October 3, 1998) the founder of the computer science department at the University of Utah and co-founder (with Ivan Sutherland) of Evans & Sutherland, a computer firm which is known as a pioneer in the domain of computer-generated imagery.
  16. Edward Feigenbaum (born January 20, 1936), a computer scientist who is often called the "father of expert systems." A former chief scientist of the Air Force, he received the U.S. Air Force Exceptional Civilian Service Award in 1997. In 1984 he was selected as one of the initial fellows of the ACMI, and in 2007 he was inducted as a Fellow of the ACM. In 2011, Feigenbaum was inducted into IEEE Intelligent Systems' AI's Hall of Fame for "significant contributions to the field of AI and intelligent systems".
  17. Robert W. Floyd (June 8, 1936 – September 25, 2001) -- a young genius who finished school at age 14. Mostly known as the computer scientist who invented the Floyd–Warshall algorithm (independently of Stephen Warshall), which efficiently finds all shortest paths in a graph; Floyd's cycle-finding algorithm for detecting cycles in a sequence; and the Floyd-Evans stack-based language for parsing. He was a pioneer of operator-precedence grammars. He also introduced the important concept of error diffusion for rendering images, also called Floyd–Steinberg dithering (though he distinguished dithering from diffusion). His lecture notes on sorting and searching served as a blueprint for volume three of The Art of Computer Programming (Sorting and Searching). He obtained a full professorship at Stanford without a Ph.D. He received the Turing Award in 1978. Floyd worked closely with Donald Knuth, in particular as the major reviewer for Knuth's seminal book The Art of Computer Programming, and is the person most cited in that work.
  18. Bill Gates (born October 28, 1955) -- created the FAT filesystem, and was instrumental in the creation and success of PC DOS, the Windows 95, 98, NT, 2000 and 2003 OSes, Microsoft Office and, more importantly, the whole PC ecosystem which dominates computing today. He also ensured the possibility of Linux's success by marketing Xenix; see XENIX -- Microsoft Short-lived Love Affair with Unix. In a way Microsoft can be called a godfather of Linux, which would be impossible without mass-produced Windows PC hardware.
  19. Seymour Ginsburg (1927–2004) a pioneer of automata theory, formal language theory, and database theory, in particular; and computer science, in general. Ginsburg was the first to observe the connection between context-free languages and "ALGOL-like" languages.
  20. Robert M. Graham (born in 1929) one of the key developers of Multics, one of the first virtual memory time-sharing computer operating systems, which broke ground for all modern operating systems. He had responsibility for protection, dynamic linking, and other key system kernel areas. In 1996 he was inducted as a Fellow of the Association for Computing Machinery. See Robert M. Graham Home Page.
  21. David Gries (born 26 April 1939) is the author of the influential 1971 book Compiler Construction for Digital Computers (John Wiley and Sons, New York, 1971, 491 pages; translated into Spanish, Japanese, Chinese, Italian and Russian). That was the first systematic exposition of compiler technology. He also participated in the development of one of the best educational compilers (for the programming language PL/C), which was probably the only real competitor to the IBM PL/1 debugging compiler in the quality of its diagnostics and correction of syntax errors.
  22. Ralph Griswold (May 19, 1934 – October 4, 2006), created the groundbreaking string processing languages SNOBOL and SL5 and, later, Icon.
  23. Richard Hamming (February 11, 1915 – January 7, 1998) -- his contributions include the Hamming code, the Hamming window (see his book Digital Filters), Hamming numbers, sphere-packing (the Hamming bound) and the Hamming distance.
  24. Martin Hellman (born October 2, 1945) an American cryptologist, best known for his invention of public key cryptography in cooperation with Whitfield Diffie and Ralph Merkle. Hellman is a long-time contributor to the computer privacy debate and is more recently known for promoting risk analysis studies on nuclear threats.
  25. David A. Huffman (August 9, 1925 – October 7, 1999) known for his Huffman code, an optimal prefix code found using the algorithm he developed while he was a Ph.D. student at MIT and published in the 1952 paper "A Method for the Construction of Minimum-Redundancy Codes". Huffman's algorithm derives a code table based on the estimated probability or frequency of occurrence (weight) of each possible value of the source symbol.
  26. Steve Jobs (February 24, 1955 – October 5, 2011). Co-founder of Apple, the marketing force behind the NeXT computer and the iPad and iPhone brands.
  27. Bill Joy (born November 8, 1954) The major contributor to FreeBSD and the Solaris OS. As a UC Berkeley graduate student, Joy created the Berkeley Software Distribution (BSD) of Unix. He did important work on improving the Unix kernel, and also handled BSD distributions. Joy's speed of programming is legendary, with an oft-told anecdote that he wrote the vi editor in a weekend; Joy denies this assertion. He is the creator of the standard Unix editor vi and of the now less used but very influential C shell. Joy co-founded Sun Microsystems in 1982 along with Vinod Khosla, Scott McNealy and Andreas von Bechtolsheim, and served as chief scientist at the company until 2003.
  28. Phil Katz (November 3, 1962 – April 14, 2000) a computer programmer best known as the co-creator of the ZIP file format for data compression, which became the de facto standard compression method in DOS and Windows. He is the author of PKZIP, a program which pioneered the ZIP format and for a couple of decades of the DOS era was ahead of the competition in quality of compression.
  29. Alan Kay (born May 17, 1940) One of the members of Xerox PARC, and Atari's chief scientist for three years. Best known for his contribution to the Smalltalk language.
  30. Gary Kildall (May 19, 1942 – July 11, 1994) Creator of the concept of the BIOS and of the CP/M and DR-DOS operating systems. Gary Arlen Kildall was an American computer scientist and microcomputer entrepreneur who created the CP/M operating system and founded Digital Research, Inc. (DRI). Kildall was one of the first people to see microprocessors as fully capable computers rather than equipment controllers, and to organize a company around this concept. He also co-hosted the PBS TV show The Computer Chronicles. Although his career in computing spanned more than two decades, he is mainly remembered in connection with IBM's unsuccessful attempt in 1980 to license CP/M for the IBM PC.
  31. Donald Knuth (born January 10, 1938) -- made a tremendous contribution by systematizing knowledge of computer algorithms, publishing three volumes of The Art of Computer Programming (starting in 1968); see also Donald Knuth: Leonard Euler of Computer Science. Also created the TeX typesetting system. Invented a specific style of programming called literate programming and pioneered the experimental study of programs. While working on The Art of Computer Programming, in 1971 he published his groundbreaking paper "An empirical study of FORTRAN programs" (Software -- Practice and Experience, vol. 1, pages 105-133, 1971). In this paper he laid the foundation of the empirical analysis of computer languages by providing convincing empirical evidence of the critical influence of the level of optimization of "inner loops" on performance, and of the fact that programs appear to exhibit a very important property termed locality of reference. He also provided a powerful argument against orthogonal languages, and for introducing "Shannon code style constructs" into the language, by observing that only a small, rather primitive subset of a language is used in 90% of all statements (most arithmetic expressions on the right side of assignment statements are simple increments/decrements or a=a+c where c is a small constant). Formulated the impossibility for a programmer to correctly predict bottlenecks in programs without measurements, and the related Knuth law ("Premature optimization is the root of all evil."). He was a courageous fighter against early fundamentalist trends in programming promoted by the structured-programming cult.
  32. Butler Lampson (born December 23, 1943) Lampson was one of the founding members of Xerox PARC in 1970. In 1973 the Xerox Alto, with its three-button mouse and full-page-sized monitor, was born; it is now considered to be the first actual personal computer (at least in terms of what has become the "canonical" GUI mode of operation). At PARC, Lampson helped work on many other revolutionary technologies, such as laser printer design; two-phase commit protocols; Bravo, the first WYSIWYG text formatting program; and Ethernet, the first high-speed local area network (LAN). He also designed several influential programming languages, such as Euclid.
  33. John Mauchly (August 30, 1907 – January 8, 1980) an American physicist who, along with J. Presper Eckert, designed ENIAC, the first general purpose electronic digital computer, as well as EDVAC, BINAC and UNIVAC I, the first commercial computer made in the United States.
  34. John McCarthy (September 4, 1927 – October 24, 2011) coined the term "artificial intelligence" (AI), developed the Lisp programming language family, significantly influenced the design of the ALGOL programming language, popularized timesharing, and was very influential in the early development of AI.
  35. Bob Miner (December 23, 1941 – November 11, 1994) the co-founder of Oracle Corporation and architect of Oracle's relational database. From 1977 until 1992, Bob Miner led product design and development for the Oracle relational database management system. In December 1992 he left that role and spun off a small advanced-technology group within Oracle. He was an Oracle board member until October 1993.
  36. Cleve Moler -- a mathematician and computer programmer specializing in numerical analysis. In the mid to late 1970s, he was one of the authors of LINPACK and EISPACK, Fortran libraries for numerical computing. He invented MATLAB, a numerical computing package, to give his students at the University of New Mexico easy access to these libraries without writing Fortran. In 1984, he co-founded MathWorks with Jack Little to commercialize this program.
  37. Gordon E. Moore (born January 3, 1929)  an American businessman and co-founder and Chairman Emeritus of Intel Corporation and the author of Moore's Law (published in an article April 19, 1965 in Electronics Magazine).
  38. Robert Morris (July 25, 1932 – June 26, 2011)    a researcher at Bell Labs who worked on Multics and later Unix. Morris's contributions to early versions of Unix include the math library, the bc programming language, the program crypt, and the password encryption scheme used for user authentication. The encryption scheme was based on using a trapdoor function (now called a key derivation function) to compute hashes of user passwords which were stored in the file /etc/passwd; analogous techniques, relying on different functions, are still in use today.
  39. Allen Newell  (March 19, 1927 – July 19, 1992) contributed to the Information Processing Language (1956) and two of the earliest AI programs, the Logic Theory Machine (1956) and the General Problem Solver (1957) (with Herbert A. Simon). He was awarded the ACM's A.M. Turing Award along with Herbert A. Simon in 1975 for their basic contributions to artificial intelligence and the psychology of human cognition.
  40. Robert Noyce co-founded Fairchild Semiconductor in 1957 and Intel Corporation in 1968. He is also credited (along with Jack Kilby) with the invention of the integrated circuit or microchip which fueled the personal computer revolution and gave Silicon Valley its name.
  41. Ken Olsen an American engineer who co-founded Digital Equipment Corporation (DEC) in 1957 with colleague Harlan Anderson.
  42. John K. Ousterhout (born October 15, 1954) the creator of the Tcl scripting language and the Tk toolkit.
  43. Alan Perlis (April 1, 1922 – February 7, 1990)  an American computer scientist known for his pioneering work in programming languages and the first recipient of the Turing Award. In 1982, he wrote an article, Epigrams on Programming, for ACM's SIGPLAN journal, describing in one-sentence distillations many of the things he had learned about programming over his career. The epigrams have been widely quoted.
  44. Dennis Ritchie (September 9, 1941 – October 12, 2011) an American computer scientist who created the C programming language with long-time colleague Ken Thompson, and was instrumental in the creation of the Unix operating system. Ritchie and Thompson received the Turing Award from the ACM in 1983, the Hamming Medal from the IEEE in 1990 and the National Medal of Technology from President Clinton in 1999.
  45. Claude Shannon (April 30, 1916 – February 24, 2001) an American mathematician, electronic engineer, and cryptographer known as "the father of information theory". He is also credited with founding both digital computer and digital circuit design theory in 1937, when, as a 21-year-old master's degree student at the Massachusetts Institute of Technology (MIT), he wrote his thesis demonstrating that electrical applications of Boolean algebra could construct and resolve any logical, numerical relationship. Shannon contributed to the field of cryptanalysis for national defense during World War II, including his basic work on codebreaking and secure telecommunications.
  46. Ivan Sutherland (born May 16, 1938) an American computer scientist and Internet pioneer. He received the Turing Award from the Association for Computing Machinery in 1988 for the invention of Sketchpad, an early predecessor of the sort of graphical user interface that has become ubiquitous in personal computers. He is a member of the National Academy of Engineering, as well as the National Academy of Sciences, among many other major honors. In 2012 he was awarded the Kyoto Prize in Advanced Technology for "pioneering achievements in the development of computer graphics and interactive interfaces".
  47. Richard Stallman (born March 16, 1953) -- creator of the GNU project; see also Nikolai Bezroukov, Portraits of Open Source Pioneers, Ch. 3, Prince Kropotkin of Software (Richard Stallman and War of Software Clones). The GNU project was the second project, after FreeBSD, explicitly oriented toward the creation of clones of existing commercial software, first of all the Unix OS. Since the mid-1990s the GNU project has been by and large superseded by, and integrated into, the Linux project and movement, but it still has its own historical place and importance due to the value of the GPL and the GNU toolchain for the free/open source software movement. He was also the initial developer of two important software packages, GCC and GNU Emacs.
  48. Robert Tarjan (born April 30, 1948) an American computer scientist. He discovered several important graph algorithms, including Tarjan's off-line least common ancestors algorithm, and is a co-inventor of both splay trees and Fibonacci heaps. The Hopcroft-Tarjan planarity testing algorithm was the first linear-time algorithm for planarity testing.
  49. Ken Thompson (born February 4, 1943) designed and implemented the original Unix operating system. He also invented the B programming language, the direct predecessor of the C programming language, and was one of the creators and early developers of the Plan 9 operating system. Thompson had developed the CTSS version of the editor QED, which included regular expressions for searching text. QED and Thompson's later editor ed (the standard text editor on Unix) contributed greatly to the eventual popularity of regular expressions.
  50. Larry Wall (born September 27, 1954) -- creator of the Perl language and the patch program, and originator of the idea of dual licensing and the influential Artistic License. See also Slightly Skeptical View on Larry Wall and Perl.
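Of the laws named in the list above, Amdahl's law (item 1) has a simple quantitative form worth spelling out: if a fraction p of a computation can be sped up n-fold while the rest stays serial, the overall speedup is 1/((1-p) + p/n). A minimal sketch in Python (the function name is mine, purely for illustration):

```python
def amdahl_speedup(p, n):
    """Overall speedup under Amdahl's law: fraction p of the work
    gets an n-fold speedup, the remaining (1-p) stays serial."""
    return 1.0 / ((1.0 - p) + p / n)

# Doubling the speed of half the work gives only a 1.33x overall gain:
print(amdahl_speedup(0.5, 2))

# Even with a million processors, a 5% serial part caps the
# overall speedup just below 1/0.05 = 20:
print(amdahl_speedup(0.95, 1_000_000))
```

The second call illustrates why Amdahl's law became the standard caution against expecting linear scaling from parallel hardware: the serial fraction, however small, sets a hard ceiling.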

The people mentioned above are all associated with the USA, and I named just a few whose work I personally know... US computer science research was often conducted in close collaboration with British computer scientists, who also made significant contributions (some of the most impressive IBM compilers were actually designed and implemented in Britain), but the leadership role of the USA was indisputable. CACM was always a more important publication than The Computer Journal.

A large part of this unique technology culture was destroyed by the outsourcing frenzy which started around 1998, but the period from approximately 1950 till approximately 2000 really was the triumph of US computer engineering. Simultaneously this was a triumph of New Deal policies. When they were dismantled (starting with Reagan, or even Carter) and neoliberalism became the ruling ideology, computer science was quickly overtaken by commercial interests and became very similar to economics in the level of corruption of academics and academic institutions.

But that did not happen overnight, and the inertia lasted till the late 1990s.

Firms also did not escape this transformation into money-making machines, with IBM as a primary example of the disastrous results of such transformations, which started under the "American Express"-style leadership of Lou Gerstner, the first of the financial-shenanigans crowd to become CEO of a major technical company; his kind would later destroy several other major US computer companies, in the interests of shareholders and personal bonuses ;-). See IBM marry Linux to Outsourcing.

Here is a timeline, modified from the History of Computer Science:

Timeline of Fifty Glorious Years



In 1949 the U.S. Army and the University of Illinois jointly funded the construction of two computers, ORDVAC and ILLIAC (ILLInois Automated Computer), and the Digital Computer Laboratory was organized, with Ralph Meagher, a physicist and chief engineer for ORDVAC, as its head. In 1951 ORDVAC (Ordnance Variable Automated Computer), one of the fastest computers in existence, was completed. In 1952 ORDVAC moved to the Army Ballistic Research Laboratory in Aberdeen, Maryland; it was used remotely from the University of Illinois via a teletype circuit, up to eight hours each night, until the ILLIAC computer was completed.

Grace Murray Hopper (1906-1992) invented the notion of a compiler, at Remington Rand, in 1951. Earlier, in 1947, Hopper found the first computer "bug" -- a real one, a moth that had gotten into the Harvard Mark II. (Actually, the use of "bug" to mean defect goes back to at least 1889.) The first compiler was written by Grace Hopper, in 1952, for the A-0 System language; the term "compiler" was coined by Hopper. See History of compiler construction (Wikipedia).

In a famous paper that appeared in the journal Mind in 1950, Alan Turing introduced the Turing Test, one of the first efforts in the field of artificial intelligence. He proposed a definition of "thinking" or "consciousness" using a game: a tester would have to decide, on the basis of written conversation, whether the entity in the next room responding to the tester's queries was a human or a computer. If this distinction could not be made, then it could be fairly said that the computer was "thinking".

In 1952, Alan Turing was arrested for "gross indecency" after a burglary led to the discovery of his affair with Arnold Murray. Overt homosexuality was taboo in 1950's England, and Turing was forced to take estrogen "treatments" which rendered him impotent and caused him to grow breasts. On June 7, 1954, despondent over his situation, Turing committed suicide by eating an apple laced with cyanide.

The same year, 1952, ILLIAC, the first computer built and owned entirely by an educational institution, became operational. It was ten feet long, two feet wide, and eight and one-half feet high, contained 2,800 vacuum tubes, and weighed five tons.

The same year IBM developed its first magnetic disk. In September 1952, IBM opened a facility in San Jose, Calif. -- a critical moment in the story of Silicon Valley. The company set to work developing a new kind of magnetic memory for its planned Model 305 Ramac (Random Access Method of Accounting and Control), the world's first "supercomputer."

In 1952 Univac correctly predicted the results of the presidential election in the USA. Remington Rand seized the opportunity to introduce itself to America as the maker of UNIVAC -- the computer system whose name would become synonymous with "computer" in the 1950s. Remington Rand was already widely known as the company that made Remington typewriters. The company bought out the struggling Eckert-Mauchly Computer Corporation in 1950. Pres Eckert and John Mauchly had led the ENIAC project and made one of the first commercially available computers, UNIVAC. See the Computer History Museum @CHM article "Have you got a prediction for us, UNIVAC?"

The IBM 650 Magnetic Drum Data Processing Machine was announced 2 July 1953 (as the "Magnetic Drum Calculator", or MDC), but not delivered until December 1954 (same time as the NORC). Principal designer: Frank Hamilton, who had also designed ASCC and SSEC. Two IBM 650s were installed at IBM Watson Scientific Computing Laboratory at Columbia University, 612 West 116th Street, beginning in August 1955.

Edsger Dijkstra invented an efficient algorithm for shortest paths in graphs as a demonstration of the ARMAC computer in 1956. He also invented an efficient algorithm for the minimum spanning tree in order to minimize the wiring needed for the X1 computer. (Dijkstra is famous for his caustic, opinionated memos. For example, see his opinions of some programming languages).
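Dijkstra's shortest-path idea is simple enough to sketch in a few lines. Below is a minimal modern rendering in Python (using a binary-heap priority queue, a later refinement; Dijkstra's 1956 formulation scanned the unvisited set directly), not a reconstruction of the ARMAC demonstration program:

```python
import heapq

def dijkstra(graph, source):
    """Shortest-path distances from source in a graph with
    non-negative edge weights.
    graph: dict mapping node -> list of (neighbor, weight)."""
    dist = {source: 0}
    heap = [(0, source)]
    while heap:
        d, u = heapq.heappop(heap)
        if d > dist.get(u, float("inf")):
            continue  # stale heap entry; a shorter path was already found
        for v, w in graph[u]:
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v] = nd
                heapq.heappush(heap, (nd, v))
    return dist

# Toy graph: distances from node 'a'
g = {
    'a': [('b', 1), ('c', 4)],
    'b': [('c', 2), ('d', 5)],
    'c': [('d', 1)],
    'd': [],
}
print(dijkstra(g, 'a'))  # {'a': 0, 'b': 1, 'c': 3, 'd': 4}
```

With non-negative weights, each node's distance is final the first time it is popped from the heap, which is what makes the greedy strategy correct.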

In 1956 IBM 305 RAMAC was announced. It was the first commercial computer that used a moving head hard disk drive (magnetic disk storage) for secondary storage. The 305 was one of the last vacuum tube computers that IBM built. The IBM 350 disk system stored 5 million 8-bit (7 data bits plus 1 parity bit) characters. It had fifty 24-inch-diameter (610 mm) disks.

The same year the Case Institute of Technology Computing Center received an IBM 650; Donald Knuth entered the school that year and managed to start working at the computing center. That work later led to the creation of his three-volume series The Art of Computer Programming -- "the bible of programming", as it was called.

On October 4, 1957, the first artificial Earth satellite, Sputnik, was launched by the USSR into an elliptical low Earth orbit. In a way it was a happenstance, due to the iron will and talent of Sergey Korolev, the charismatic head of the USSR rocket program (who had actually served several years in the GULAG). But it opened a new era. The ILLIAC I (Illinois Automatic Computer), the pioneering computer built in 1952 by the University of Illinois and the first computer built and owned entirely by a US educational institution, was the first to calculate Sputnik's orbit. The launch of Sputnik led to the creation of NASA and, indirectly, of the US Advanced Research Projects Agency (later DARPA) in February 1958, to regain a technological lead. It also led to a dramatic increase in U.S. government spending on scientific research and education via President Eisenhower's bill, the National Defense Education Act. This bill encouraged students to go to college and study math and science, with their tuition fees paid for, and led to a new emphasis on science and technology in American schools. In other words, Sputnik created the building blocks that largely determined the way computer science developed in the USA for the next decade or two. DARPA later funded the creation of the TCP/IP protocol and the Internet as we know it, and also contributed to the development of large-scale integrated circuits. The rivalry in space, even though it had military roots, served as a tremendous push forward for computers and computer science.

John Backus and others developed the first complete compiler -- the FORTRAN compiler -- in April 1957. FORTRAN stands for FORmula TRANslating system. Backus, who headed the team, went on to contribute to the development of ALGOL and the well-known syntax-specification system known as BNF. The first FORTRAN compiler took 18 person-years to create.

LISP, a list-processing language for artificial intelligence programming, was invented by John McCarthy about 1958. The same year Alan Perlis, John Backus, Peter Naur and others developed Algol.

In hardware, Jack Kilby (Texas Instruments) and Robert Noyce (Fairchild Semiconductor) invented the integrated circuit in 1959.

In 1959 LISP 1.5 appears. The same year COBOL is created by the Conference on Data Systems and Languages (CODASYL).

See also Knuth Biographic Notes


In the 1960s, computer science came into its own as a discipline; in fact, this decade became a golden age of computer science. The term itself was coined by George Forsythe, a numerical analyst. The first computer science department was formed at Purdue University in 1962. The first person to receive a Ph.D. from a computer science department was Richard Wexelblat, at the University of Pennsylvania, in December 1965.

Operating systems saw major advances. Fred Brooks at IBM designed System/360, a line of different computers with the same architecture and instruction set, from small machines to the top of the line. DEC designed the PDP series. The first PDP-1 was delivered to Bolt, Beranek and Newman in November 1960 and formally accepted the next April. The PDP-1 sold in basic form for $120,000, or about $900,000 in 2011 US dollars. By the time production ended in 1969, 53 PDP-1s had been delivered. At the end of the decade, ARPAnet, a precursor to today's Internet, began to be constructed.

In 1960 ALGOL 60, the first block-structured language, appeared. This is the root of the family tree that would ultimately produce PL/I, ALGOL 68, Pascal, Modula, C, Java, C# and other languages. ALGOL became a popular language in Europe in the mid- to late 1960s. Attempts to simplify ALGOL led to the creation of BASIC (developed c. 1964 by John Kemeny (1926-1992) and Thomas Kurtz (b. 1928)), which became very popular with the PC revolution.

The 1960's also saw the rise of automata theory and the theory of formal languages. Big names here include Noam Chomsky and Michael Rabin. Chomsky introduced the notion of context free languages and later became well-known for his theory that language is "hard-wired" in human brains, and for his criticism of American foreign policy.

Sometime in the early 1960s, Kenneth Iverson began work on the language that would become APL -- A Programming Language. It uses a specialized character set that, for proper use, requires APL-compatible I/O devices. APL is documented in Iverson's book A Programming Language, published in 1962.

In 1962 ILLIAC II, a transistorized computer 100 times faster than the original ILLIAC, became operational. ACM Computing Reviews said of the machine: "ILLIAC II, at its conception in the mid-1950s, represents the spearhead and breakthrough into a new generation of machines." In 1963 Professor Donald B. Gillies discovered three Mersenne primes while testing ILLIAC II, including the largest prime then known, 2^11213 - 1, which has over 3,000 digits.
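Mersenne candidates were (and still are) checked with the Lucas-Lehmer test, the kind of computation the ILLIAC II search ran. A tiny Python sketch (fine for small exponents; testing 2^11213 - 1 this way works but is slow):

```python
def is_mersenne_prime(p):
    """Lucas-Lehmer test: for an odd prime p, M_p = 2**p - 1 is prime
    iff s(p-2) == 0 mod M_p, where s(0) = 4 and s(k) = s(k-1)**2 - 2."""
    if p == 2:
        return True  # M_2 = 3 is prime; the test below needs odd p
    m = 2 ** p - 1
    s = 4
    for _ in range(p - 2):
        s = (s * s - 2) % m
    return s == 0

# Exponents of the known small Mersenne primes (11 fails: 2047 = 23 * 89)
print([p for p in (2, 3, 5, 7, 11, 13) if is_mersenne_prime(p)])  # [2, 3, 5, 7, 13]
```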

The famous IBM System/360 (S/360) was first announced by IBM on April 7, 1964. The S/360 remained the most popular line of computer systems for more than a decade. It introduced the 8-bit byte, byte addressing and many other things. The same year (1964) PL/I was first defined. It became the most widely used programming language in Eastern Europe and the USSR, and later served as a prototype for several other languages, including PL/M and C.

In 1964 the IBM 2311 Direct Access Storage Facility was introduced (see History of IBM magnetic disk drives - Wikipedia) for the System/360 series. It was also available on the IBM 1130 and (using the 2841 Control Unit) the IBM 1800. The 2311 mechanism was largely identical to the 1311, but recording improvements allowed higher data density. The 2311 stored 7.25 megabytes on a single removable IBM 1316 disk pack (the same type used on the IBM 1311) consisting of six platters that rotated as a single unit. Each recording surface had 200 tracks plus three optional tracks which could be used as alternatives in case faulty tracks were discovered. Average seek time was 85 ms. Data transfer rate was 156 kB/s.

Along with the development of the unified System/360 series of computers, IBM wanted a single programming language for all users. It hoped that Fortran could be extended to include the features needed by commercial programmers. In October 1963 a committee was formed, composed originally of three IBMers from New York and three members of SHARE, the IBM scientific users group, to propose these extensions to Fortran. Given the constraints of Fortran, they were unable to do this and embarked on the design of a "new programming language" based loosely on ALGOL, labeled "NPL". This acronym conflicted with that of the UK's National Physical Laboratory and was replaced briefly by MPPL (MultiPurpose Programming Language) and, in 1965, by PL/I (with a Roman numeral "I"). The first definition appeared in April 1964.

IBM took NPL as a starting point and completed the design to a level at which the first compiler could be written; the NPL definition was incomplete in scope and in detail. Control of the PL/I language was vested initially in the New York Programming Center and later at the IBM UK Laboratory at Hursley. The SHARE and GUIDE user groups were involved in extending the language and had a role in IBM's process for controlling the language through their PL/I Projects. The language was first specified in detail in the manual "PL/I Language Specifications, C28-6571", written in New York from 1965, and superseded by "PL/I Language Specifications, GY33-6003", written in Hursley from 1967. IBM continued to develop PL/I in the late sixties and early seventies, publishing it in the GY33-6003 manual. These manuals were used by the Multics group and other early implementers.

The first production PL/I compiler was the PL/I F compiler for the OS/360 operating system, built by John Nash's team at Hursley in the UK; the runtime library team was managed by I.M. (Nobby) Clarke. Release 1 shipped in 1966. That was a significant step forward in comparison with earlier compilers.
The PL/I D compiler, using 16 kilobytes of memory, was developed by IBM Germany for the DOS/360 low-end operating system. It implemented a subset of the PL/I language, requiring all strings and arrays to have fixed extents, thus simplifying the run-time environment. Reflecting the underlying operating system, it lacked dynamic storage allocation and the controlled storage class. It was shipped within a year of PL/I F.

C.A.R. Hoare invented Quicksort around 1960, reportedly while on a trip to Moscow.
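The algorithm's elegance is easy to show. A minimal functional sketch in Python (not Hoare's original in-place partition scheme):

```python
def quicksort(a):
    """Hoare's divide-and-conquer idea in its simplest form:
    pick a pivot, partition the rest, and recurse on each side."""
    if len(a) <= 1:
        return a
    pivot = a[len(a) // 2]
    less = [x for x in a if x < pivot]
    equal = [x for x in a if x == pivot]
    greater = [x for x in a if x > pivot]
    return quicksort(less) + equal + quicksort(greater)

print(quicksort([3, 1, 4, 1, 5, 9, 2, 6]))  # [1, 1, 2, 3, 4, 5, 6, 9]
```

Production versions partition in place and choose pivots more carefully, but the recursive structure is the same.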

Douglas C. Engelbart invented the computer mouse c. 1968, at SRI.

The first volume of The Art of Computer Programming was published in 1968 and instantly became a classic. Donald Knuth (b. 1938) later published two additional volumes of his world-famous three-volume treatise.

In 1968 ALGOL 68, a monster language compared to ALGOL 60, appeared. Some members of the specification committee -- including C.A.R. Hoare and Niklaus Wirth -- protested its approval. ALGOL 68 proved difficult to implement. The same year Niklaus Wirth began work on a simple teaching language that later became Pascal.

Ted Hoff (b. 1937) and Federico Faggin at Intel designed the first microprocessor (computer on a chip) in 1969-1971.

In the late 1960s the PDP-11, one of the first 16-bit minicomputers, was designed in a crash program by Harold McFarland, Gordon Bell, Roger Cady, and others as a response to the NOVA 16-bit minicomputers. The project was able to leap forward in design with the arrival of McFarland, who had been researching 16-bit designs at Carnegie Mellon University; one of his simpler designs became the PDP-11. It was launched in 1970 and became a huge success. The first officially named version of Unix ran on the PDP-11/20 in 1970. It is commonly stated that the C programming language took advantage of several low-level PDP-11-dependent programming features, albeit not originally by design. A major advance in the PDP-11 design was Digital's Unibus, which supported all peripherals through memory mapping. This allowed a new device to be added easily, generally requiring only plugging a hardware interface board into the backplane and then installing software that read and wrote to the mapped memory to control it. The relative ease of interfacing spawned a huge market of third-party add-ons for the PDP-11, which made the machine even more useful. The combination of architectural innovations proved superior to competitors, and the "11" architecture was soon the industry leader, propelling DEC back to a strong market position.

A second generation of programming languages, such as BASIC, ALGOL 68 and Pascal (designed by Niklaus Wirth in 1968-1969), appeared at the end of the decade.


Flat uniform record (relational) databases got a fashionable pseudo-theoretical justification with the work of Edgar F. Codd. While mostly nonsense, it helped to spread relational databases, which became the dominant type of database. That was probably one of the first bouts of fashion in computer science; many more followed. Codd won the Turing award in 1981.

Unix, a very influential operating system, was developed at Bell Laboratories by Ken Thompson (b. 1943) and Dennis Ritchie (b. 1941) after AT&T withdrew from the Multics project. Ritchie developed C, which became the most influential systems programming language and was also used as a general-purpose language on personal computers; Brian Kernighan later co-authored its definitive reference manual with Ritchie. The first release of C was made in 1972; the definitive reference manual for it did not appear until 1974.

In the early 1970s the PL/I Optimizer and Checkout compilers produced in Hursley supported a common level of PL/I language and aimed to replace the PL/I F compiler. The compilers had to produce identical results -- the Checkout Compiler was used to debug programs that would then be submitted to the Optimizer. Given that the compilers had entirely different designs and were handling the full PL/I language, this goal was challenging; it was achieved. The PL/I Optimizing Compiler took over from the PL/I F compiler and was IBM's workhorse compiler from the 1970s to the 1990s. Like PL/I F, it was a multiple-pass compiler with a 44-kilobyte design point, but it was an entirely new design. Unlike the F compiler, it had to perform compile-time evaluation of constant expressions using the run-time library, reducing the maximum memory for a compiler phase to 28 kilobytes. A second-time-around design, it succeeded in eliminating the annoyances of PL/I F, such as cascading diagnostics. It was written in S/360 Macro Assembler by a team, led by Tony Burbridge, most of whom had worked on PL/I F. Macros were defined to automate common compiler services and to shield the compiler writers from the task of managing real-mode storage, allowing the compiler to be moved easily to other memory models. Program optimization techniques developed for the contemporary IBM Fortran H compiler were deployed: the Optimizer equaled Fortran execution speeds in the hands of good programmers. Announced with the IBM S/370 in 1970, it shipped first for the DOS/360 operating system in August 1971, and shortly afterward for OS/360 and the first virtual-memory IBM operating systems OS/VS1, MVS and VM/CMS (the developers were unaware that while they were shoehorning the code into 28 KB sections, IBM Poughkeepsie was finally ready to ship virtual memory support in OS/360). It supported the batch programming environments and, under TSO and CMS, it could be run interactively.

Simultaneously PL/C, a dialect of PL/I for education, was developed at Cornell University in the early 1970s. It was designed with the specific goal of being used for teaching programming. The main authors were Richard W. Conway and Thomas R. Wilcox, who published the famous article "Design and implementation of a diagnostic compiler for PL/I" in the Communications of the ACM in March 1973. PL/C eliminated some of the more complex features of PL/I and added extensive debugging and error-recovery facilities. The PL/C compiler had the unusual capability of never failing to compile any program, through extensive automatic correction of many syntax errors and by converting any remaining syntax errors to output statements.

In 1972 Gary Kildall implemented a subset of PL/I, called PL/M, for microprocessors. PL/M was used to write the CP/M operating system and much of the application software running on CP/M and MP/M. Digital Research also sold a PL/I compiler for the PC written in PL/M. PL/M was used to write much other software at Intel for the 8080, 8085, and Z80 processors during the 1970s.

In 1973-74 Gary Kildall developed CP/M, an operating system for an Intel Intellec-8 development system equipped with a Shugart Associates 8-inch floppy disk drive interfaced via a custom floppy disk controller. It was written in PL/M. Various aspects of CP/M were influenced by the TOPS-10 operating system of the DECsystem-10 mainframe computer, which Kildall had used as a development environment.

The LSI-11 (PDP-11/03), introduced in February 1975, was the first PDP-11 model produced using large-scale integration, a precursor to the personal computer.

The first RISC architecture was begun by John Cocke in 1975, at the Thomas J. Watson Laboratories of IBM. Similar projects started at Berkeley and Stanford around this time.

In March 1976 one of the first supercomputers, the CRAY-1, was shipped. Designed by Seymour Cray (1925-1996), it could perform 160 million operations per second. The Cray X-MP came out in 1982. Cray Research was later taken over by Silicon Graphics.

There were also major advances in algorithms and computational complexity. In 1971, Steve Cook published his seminal paper on NP-completeness, and shortly thereafter Richard Karp showed that many natural combinatorial problems were NP-complete. In 1976 Whitfield Diffie and Martin Hellman published the paper that introduced the theory of public-key cryptography, and a public-key cryptosystem known as RSA was invented by Ronald Rivest, Adi Shamir, and Leonard Adleman.
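The RSA trapdoor itself fits in a few lines. Here is the classic textbook toy example with tiny primes (real keys use moduli of thousands of bits, random primes, and padding):

```python
# Toy RSA keypair from the standard worked example (p=61, q=53).
p, q = 61, 53
n = p * q                 # 3233: the public modulus
phi = (p - 1) * (q - 1)   # 3120
e = 17                    # public exponent, coprime with phi
d = pow(e, -1, phi)       # private exponent: e*d == 1 (mod phi); Python 3.8+

msg = 65
cipher = pow(msg, e, n)   # encrypt with the public key (e, n)
plain = pow(cipher, d, n) # decrypt with the private key (d, n)
print(cipher, plain)      # 2790 65
```

Security rests on the fact that recovering d requires factoring n, easy for 3233 but believed infeasible for well-chosen 2048-bit moduli.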

Microsoft was formed on April 4, 1975 to develop and sell BASIC interpreters for the Altair 8800. Bill Gates and Paul Allen wrote a version of BASIC that they sold to MITS (Micro Instrumentation and Telemetry Systems) on a per-copy royalty basis. MITS was producing the Altair, one of the earliest 8080-based microcomputers, which came with an interpreter for a programming language.

The Apple I went on sale in July 1976 and was market-priced at $666.66 ($2,572 in 2011 dollars, adjusted for inflation.)

The Apple II was introduced on April 16, 1977 at the first West Coast Computer Faire. It differed from its major rivals, the TRS-80 and Commodore PET, because it came with color graphics and an open architecture. While early models used ordinary cassette tapes as storage devices, they were superseded by the introduction of a 5 1/4 inch floppy disk drive and interface, the Disk II.

In 1976, DEC decided to extend the PDP-11 architecture to 32 bits while adding a complete virtual memory system to the simple paging and memory protection of the PDP-11. The result was the VAX architecture. The first computer to use a VAX CPU was the VAX-11/780, which DEC referred to as a superminicomputer. Although it was not the first 32-bit minicomputer, the VAX-11/780's combination of features, price, and marketing almost immediately propelled it to a leadership position in the market after it was released in 1978. VAX systems were so successful that they propelled Unix to the status of a major OS. In 1983, DEC canceled its Jupiter project, which had been intended to build a successor to the PDP-10 mainframe, and instead focused on promoting the VAX as the single computer architecture for the company.

In 1978 AWK -- a text-processing language named after its designers, Aho, Weinberger, and Kernighan -- appeared. The same year the ANSI standard for FORTRAN 77 appeared.

In 1977 Bill Joy, then a graduate student at Berkeley, started compiling the first Berkeley Software Distribution (1BSD), which was released on March 9, 1978.

In 1979, three graduate students in North Carolina developed a distributed news server which eventually became Usenet.

The Second Berkeley Software Distribution (2BSD), was released in May 1979. It included updated versions of the 1BSD software as well as two new programs by Joy that persist on Unix systems to this day: the vi text editor (a visual version of ex) and the C shell.

The same year, 1979, VisiCalc, the first spreadsheet program available for personal computers, was released for the Apple II. It was conceived by Dan Bricklin, refined by Bob Frankston, developed by their company Software Arts, and distributed by Personal Software (later renamed VisiCorp).

In late 1979 the kernel of BSD Unix was largely rewritten by Berkeley students to include a virtual memory implementation, and a complete operating system, including the new kernel and ports of the 2BSD utilities to the VAX, was released as 3BSD at the end of 1979.

Microsoft purchased a license for Version 7 Unix from AT&T in 1979, and announced on August 25, 1980 that it would make it available for the 16-bit microcomputer market.


The success of 3BSD was a major factor in the Defense Advanced Research Projects Agency's (DARPA) decision to fund Berkeley's Computer Systems Research Group (CSRG), which would develop a standard Unix platform for future DARPA research in the VLSI Project, including a TCP/IP stack. CSRG released 4BSD, containing numerous improvements over 3BSD, in November 1980. It notably offered job control in the previously released csh, delivermail (the antecedent of sendmail), "reliable" signals, and the Curses programming library.

This decade also saw the rise of the personal computer, thanks to Steve Wozniak and Steve Jobs, founders of Apple Computer.

In 1981 the IBM PC was launched, which made the personal computer mainstream. The first computer viruses were also developed in 1981. The term was coined by Leonard Adleman, now at the University of Southern California. The same year, 1981, the first truly successful portable computer (a predecessor of modern laptops), the Osborne I, was marketed.

In 1982 REXX, one of the first scripting languages, was released by IBM as a product, four years after AWK appeared. Over the years IBM included REXX in almost all of its operating systems (VM/CMS, VM/GCS, MVS TSO/E, AS/400, VSE/ESA, AIX, CICS/ESA, PC DOS, and OS/2), and has made versions available for Novell NetWare, Windows, Java, and Linux.

In 1982 PostScript appeared; it went on to revolutionize printing on laser printers.

1983 was a year of major events in the Unix world:

4.2BSD took over two years to implement and contained several major overhauls. It incorporated a modified version of BBN's preliminary TCP/IP implementation and the new Berkeley Fast File System, implemented by Marshall Kirk McKusick. The official 4.2BSD release came in August 1983. The same year Stallman resigned from MIT to start the GNU project, with the explicit goal of reimplementing Unix as a "free" operating system. The name stands for "GNU's Not Unix."

In 1984 Stallman published a rewritten version of Gosling's Emacs (GNU Emacs) as "free" software (Gosling had sold the rights to his code to a commercial company), and in 1985 launched the Free Software Foundation (FSF) to support the GNU project. One of the first programs he decided to write was a C compiler, which became widely known as GCC. Steven Levy's book "Hackers" was published in 1984, with a chapter devoted to RMS that presented him in an extremely favorable light.

In January 1984 Apple introduced the Macintosh, the first mass-produced GUI-based personal computer, roughly three years after the IBM PC was launched and seven years after the Apple II launch. It went on sale on January 24, 1984, two days after the US$1.5 million Ridley Scott television commercial "1984" was aired during Super Bowl XVIII on January 22, 1984. The ad is now considered a masterpiece. In it, an unnamed heroine represents the coming of the Macintosh (indicated by a Picasso-style picture of Apple's Macintosh computer on her white tank top) as a means of saving humanity from the "conformity" of IBM's attempts to dominate the computer industry.

In 1985 the Intel 80386 introduced 32-bit logical addressing. It became instrumental in the Unix renaissance, which started the same year with the launch of Xenix 2.0 by Microsoft. Xenix was based on UNIX System V; an update numbered 2.1.1 added support for the Intel 80286 processor. The Sperry PC/IT, an IBM PC AT clone, was advertised as capable of supporting eight simultaneous dumb-terminal users under this version. Subsequent releases improved System V compatibility. The era of PC Unix had started, and Microsoft became the dominant vendor of Unix: in the late 1980s, Xenix was, according to The Design and Implementation of the 4.3BSD UNIX Operating System, "probably the most widespread version of the UNIX operating system, according to the number of machines on which it runs". In 1987, SCO ported Xenix to the 386 processor. Microsoft used Xenix on Sun workstations and VAX minicomputers extensively within the company as late as 1992.

Microsoft Excel was first released for the Macintosh, not the IBM PC, in 1985. The same year the combination of the Mac, Apple's LaserWriter printer, and Mac-specific software like Boston Software's MacPublisher and Aldus PageMaker enabled users to design, preview, and print page layouts complete with text and graphics -- an activity that became known as desktop publishing.

The first version of GCC was able to compile itself in late 1985. The same year the GNU Manifesto was published.

In 1986-1989 a series of computer viruses for PC DOS made headlines. One of the first mass viruses was the boot virus called Brain, created in 1986 by the Farooq Alvi brothers in Lahore, Pakistan, reportedly to deter piracy of the software they had written.

In 1987, the US National Science Foundation started NSFnet, precursor to part of today's Internet.

The same year, 1987, Perl was released by Larry Wall. In 1988 Perl 2 was released.

Steve Jobs was ousted from Apple and formed his new company, NeXT Computer, with a dozen former Apple employees. The NeXT, introduced in 1988, was the first affordable workstation with over a megaflop of computing power; the smaller NeXTstation followed in 1990. It was a NeXT machine that was used to develop the World Wide Web at CERN. NeXT was also instrumental in creating complex modern GUI interfaces and launching object-oriented programming into the mainstream.

In 1988 the Human Genome sequencing project started. From A Brief History of the Human Genome Project:

In 1988, Congress funded both the NIH and the DOE to embark on further exploration of this concept, and the two government agencies formalized an agreement by signing a Memorandum of Understanding to "coordinate research and technical activities related to the human genome."

James Watson was appointed to lead the NIH component, which was dubbed the Office of Human Genome Research. The following year, the Office of Human Genome Research evolved into the National Center for Human Genome Research (NCHGR).

In 1990, the initial planning stage was completed with the publication of a joint research plan, "Understanding Our Genetic Inheritance: The Human Genome Project, The First Five Years, FY 1991-1995." This initial research plan set out specific goals for the first five years of what was then projected to be a 15-year research effort.

In 1992, Watson resigned, and Michael Gottesman was appointed acting director of the center. The following year, Francis S. Collins was named director.

The advent and employment of improved research techniques, including the use of restriction fragment-length polymorphisms, the polymerase chain reaction, bacterial and yeast artificial chromosomes and pulsed-field gel electrophoresis, enabled rapid early progress. Therefore, the 1990 plan was updated with a new five-year plan announced in 1993 in the journal Science (262: 43-46; 1993).

In 1989 the FSF introduced the General Public License (GPL), also known as "copyleft". Stallman redefined the word "free" in software to mean "GPL-compatible". In 1990, as president of the League for Programming Freedom (an organization that fights software patents), Stallman was given a $240,000 fellowship by the John D. and Catherine T. MacArthur Foundation.


Microsoft Windows 3.0, which began to approach the Macintosh operating system in both performance and feature set, was released in May 1990 and was a less expensive alternative to the Macintosh platform.

4.3BSD-Reno came in early 1990. It was an interim release during the early development of 4.4BSD, and its use was considered a "gamble", hence the naming after the gambling center of Reno, Nevada. This release was explicitly moving towards POSIX compliance. Among the new features was an NFS implementation from the University of Guelph. In August 2006, Information Week magazine rated 4.3BSD as the "Greatest Software Ever Written", commenting: "BSD 4.3 represents the single biggest theoretical undergirder of the Internet."

On December 25, 1990 the first successful communication between a Hypertext Transfer Protocol (HTTP) client and server via the Internet was accomplished at CERN. It was running on a NeXT:

" Mike Sendall buys a NeXT cube for evaluation, and gives it to Tim [Berners-Lee]. Tim's prototype implementation on NeXTStep is made in the space of a few months, thanks to the qualities of the NeXTStep software development system. This prototype offers WYSIWYG browsing/authoring! Current Web browsers used in "surfing the Internet" are mere passive windows, depriving the user of the possibility to contribute. During some sessions in the CERN cafeteria, Tim and I try to find a catching name for the system. I was determined that the name should not yet again be taken from Greek mythology. Tim proposes "World-Wide Web". I like this very much, except that it is difficult to pronounce in French..." by Robert Cailliau, 2 November 1995.[22]

In 1991 Linux was launched. The USSR was dissolved the same year, which led to an influx of Russian programmers (as well as programmers from Eastern European countries) into the USA.

The first website was online on 6 August 1991:

" was the address of the world's first-ever web site and web server, running on a NeXT computer at CERN. The first web page address was, which centred on information regarding the WWW project. Visitors could learn more about hypertext, technical details for creating their own webpage, and even an explanation on how to search the Web for information. There are no screenshots of this original page and, in any case, changes were made daily to the information available on the page as the WWW project developed. You may find a later copy (1992) on the World Wide Web Consortium website." -CERN

BSDi, the company formed to commercialize the BSD Unix system, found itself in legal trouble with AT&T's Unix System Laboratories (USL) subsidiary, then the owner of the System V copyright and the Unix trademark. The USL v. BSDi lawsuit was filed in 1992 and led to an injunction on the distribution of Net/2 until the validity of USL's copyright claims on the source could be determined. That launched Linux into the mainstream.

FreeBSD development began in 1993 with a quickly growing, unofficial patchkit maintained by users of the 386BSD operating system. This patchkit forked from 386BSD and grew into an operating system taken from U.C. Berkeley's 4.3BSD-Lite (Net/2) tape with many 386BSD components and code from the Free Software Foundation.

In April 1993 CERN released the web technology into the public domain.

1994 First official Linux version 1.0 kernel released. Linux already has about 500,000 users. Unix renaissance started.

The same year Microsoft incorporated Visual Basic for Applications into Excel, creating a way for Microsoft Office to knock out its competition.

In February 1995, ISO accepts the 1995 revision of the Ada language. Called Ada 95, it includes OOP features and support for real-time systems. 

In 1995 TCP/IP connectivity in the USA became mainstream, and the Internet boom (aka the dot-com boom) took off. Red Hat was formed by a merger with ACC, with Robert Young of ACC (a co-founder of Linux Journal) as CEO.

In 1996 the first enterprise monitoring systems, such as Tivoli and OpenView, became established players.

In 1996 a draft of the first ANSI/ISO C++ standard was released for public review; the final standard (C++98) was ratified in 1998.

In 1996 Java (JDK 1.0) was released. Although a weak and primitive programming language if we consider its design (it was originally intended for embedded systems), it proved to be a durable and successful successor to Cobol. Sun Microsystems proved to be a capable marketing machine, but that led to the deterioration of Solaris' position and partial neglect of other projects such as Solaris on x86 and Tcl. Microsoft later launched a successful derivative of Java, called C#, in 2002.

In 1998 outsourcing, which within ten years would devastate the US programming industry, became fashionable, fueled by the financial industry's attempts to exploit the Internet boom for quick profits.

In 1999 a craze connected with the so-called Millennium bug (Y2K) hit the USA. It demonstrated the lasting intellectual deterioration of some key US political figures, including Fed chairman Greenspan, a cult-like figure at the time.

In March 1999 Al Gore revealed that "During my service in the United States Congress, I took the initiative in creating the internet," which was partially true.

The decade ended with the dot-com bust of 2000. See Nikolai Bezroukov, Portraits of Open Source Pioneers, Ch 4: Grand Replicator aka Benevolent Dictator (A Slightly Skeptical View on Linus Torvalds).




Old News ;-)

2010-2019 2000-2009 1990-1999 1980-1989 1970-1979 1960-1969 1950-1959

[May 18, 2020] ALGOL 60 at 60: The greatest computer language you've never used and grandaddy of the programming family tree (The Register)

May 18, 2020 |

ALGOL 60 at 60: The greatest computer language you've never used and grandaddy of the programming family tree. Back to the time when tape was king. By Richard Speed, 15 May 2020 at 09:47. An Elliott 803 at Loughborough Grammar School in 1976 (pic: Loughborough Schools Foundation / Peter Onion)

2020 marks 60 years since ALGOL 60 laid the groundwork for a multitude of computer languages.

The Register spoke to The National Museum of Computing's Peter Onion and Andrew Herbert to learn a bit more about the good old days of punch tapes.

ALGOL 60 was the successor to ALGOL 58, which debuted in 1958. ALGOL 58 had introduced the concept of code blocks (replete with begin and end delimiting pairs), but ALGOL 60 took these starting points of structured programming and ran with them, giving rise to familiar faces such as Pascal and C, as well as the likes of B and Simula.
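To give a flavor of that block structure, here is a small illustrative ALGOL 60 fragment (written for this page, not taken from any historical listing; since, as the article notes below, the language defined no standard I/O, the final print call stands in for whatever a given vendor's library provided):

```algol
comment An illustrative ALGOL 60 sketch: begin/end blocks, typed
    declarations, and a recursive procedure using conditional
    expressions. "print" is a vendor-specific library routine;
begin
    integer procedure gcd(a, b);
        value a, b; integer a, b;
        gcd := if b = 0 then a
               else if a < b then gcd(b, a)
               else gcd(a - b, b);

    integer g;
    g := gcd(48, 36);
    print(g)
end
```

The begin/end pairs delimiting blocks, and the typed procedure returning a value through its own name, are exactly the ideas that Pascal, C, and their descendants inherited.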

"In the 1950s most code was originally written in machine code or assembly code," said Herbert, former director of Microsoft Research in Cambridge, with every computer having its own particular twist on things. A first generation of languages, called "Autocode", existed for coding problems like equations which could then be translated into machine code, but lacked the bells and whistles of today. Worse, some had features that others lacked, making hopping between systems tricky.

"There was an Autocode for the [Elliott] 803," said Onion, "but it only supported expressions like A + B = C, so if you've got a complex equation, you have to break it down into individual single binary operations. So there was still a lot of hard work to be done by the programmer."

"Fortran," said Herbert, "emerged as the first real programming language for scientific and numeric work. That convinced people that having higher-level languages (as they called them then – they were pretty primitive by modern standards) made programmers more productive."

The overhead of compiling, and inefficiencies in the compilers themselves, meant that machine code remained king of the performance hill, but for those doing science work, the ability to churn out some code to solve a problem and then simply move on to the next was appealing.

"Fortran," Herbert continued, "was more like an autocode," before laughing, "It still is in some ways!

"And a bunch of people thought you could do better."

Enter the International Federation for Information Processing (IFIP), which Herbert recalled "had a whole bunch of committees who looked at standards and problems in computing".

One group started on the design of what was then called an "Algorithmic Language": a language for writing algorithms. The output, in 1958, described the language "ALGOL 58". However, as engineers began to create compilers for the new system, they found "all kinds of things hadn't really been thought about or worked through properly," recalled Herbert.

And so there were revisions and changes. A periodical called " The ALGOL Bulletin " detailed the travails of those involved as the problems and the weaknesses in the language were dealt with (or at least attempted).

The process was not unlike an open-source mailing list today, but in paper form.

Eventually, Herbert told us, "they published the ALGOL 60 report, which is the baseline that everyone then worked to."

The committees were under pressure and also suffered a little from differing international approaches. The American side had a lot of experience in Fortran and were seeking something that could quickly be made to work on their computers, while the Europeans were a little more cerebral and had, Herbert laughed, "terrible notions like beauty and elegance in mind for the language".

"People were sorting out some of the things that we now take for granted like ideas in structured programming, data structures, data types," he added.

Seeking solutions to the problem of portability of programmers between systems and code between hardware generations as well as avoiding the pain of having to rewrite programs every time a new iteration of computer arrived, vendors embraced the language with variants cropping up over many manufacturers.

ALGOL 60 on tape (pic: Peter Onion)

Alas, those seeking a handy-dandy "HELLO WORLD" example will be disappointed. The Achilles' heel of the language that would go on to inspire so many others was that it lacked standard input/output capabilities.

"The defining committee couldn't agree on how to do input/output," said Herbert. "They decided that would be left to a library, and that library would be user dependent."

"In this case," added Onion, "the user being the compiler writer."

Oh dear. The omission pretty much did for vendor independence as manufacturers naturally went their own way, leaving large chunks of code incompatible between systems. There were also elements of ALGOL 60 that were open to interpretation, leaving it a little compromised from the start.

While ALGOL ploughed its furrow, Fortran continued to be developed in parallel. "People in the Fortran world," explained Herbert, "saw ideas in ALGOL they quite liked and brought them across." As the decades passed, Fortran remained the centre of gravity for scientific computing while ALGOL became more of an academic language, used for teaching computer science ideas.

"It was quite heavily used in the scientific community," Herbert said. "Most mainframe manufacturers supported it."

Some of the team behind ALGOL 60 stayed with the project and went on to come up with ALGOL 68, which, as far as Herbert is concerned, "nailed all the things that ALGOL 60 had left a bit vague".

Indeed, it was hard to avoid in the 1970s for those taking computer science courses. This hack has fond memories of the successor language, while the grandfather of Reg sub-editor Richard Currie had a hand in the development of ALGOL 68-R and RS.

"It had the world's most exotic input output system," Herbert laughed.

It was also, sadly for its enthusiasts, a bit of a dead end. Despite ALGOL 68-R becoming widely used in (particularly British) military applications for a time, it would take until the 1970s for a full implementation of ALGOL 68 to become available.

The last edition of The ALGOL Bulletin was published in 1988, with its editor noting: "ALGOL 68 as a language is very stable. It is used and loved by those who understand its benefits, and ignored (or misquoted) by the rest."

The story of ALGOL 60 is not so much of the language's eventual fate, but also of those that it inspired. ALGOL W, based on a proposal for ALGOL X, by Niklaus Wirth and QuickSort creator Tony Hoare would go on to inspire Wirth's Pascal and Modula-2. Pascal's influence continues to be felt today.

ALGOL 60 also heavily influenced the Combined Programming Language (CPL), developed in the 1960s but not implemented until the following decade. CPL in turn led to Basic CPL (BCPL), from which B descended. The B language was further developed to become C.

Tony Hoare was responsible for the implementation of ALGOL 60 on the Elliott 803 computer , an example of which remains operational at The National Museum of Computing, although compiling and running a program on that hardware is a little different to the development environments to which coders are now accustomed.

First, the compiler must be loaded from paper tape. The ALGOL program itself is then fed into the tape reader and "it sort of chunters away," remarked Onion, "for anything between 30 seconds to perhaps 15 or 20 minutes during the compilation."

Once compiled, a program would be free to use the space originally occupied by the compiler. Doing so would, however, not win the programmer any popularity awards since the next user would have to reload the compiler again. Leaving it in memory meant that multiple programs could be run.

"That made it very popular for teaching," said Herbert, "because you can have a line of students, each with their paper tape with their programme in their hand and you basically march up to the machine, the machine's got the ALGOL system loaded, you run your programme, it produces gibberish, you go away and think about it and the next student runs their programme."

With paper tape being king, Onion observed that the experience of programming taught a bit of focus: "When your edit, compile, edit, compile cycle starts to get above about 10 minutes, you start to pay an awful lot of attention to your source code "

The National Museum of Computing has two Elliott machines in its collection , a 1962 803B (which was donated after spending 15 years lurking in a barn following its decommissioning) and a 903. Both are fully operational and can be seen running once the museum is able to open its doors once again.

The 803B, which is maintained by Onion, also features a Calcomp drum plotter as well as some additional input/output features.

The Lorenz attractor plotted by an ALGOL program (pic: Peter Onion)

As for taking the ALGOL 60 itself out for a spin today, there are a few options for those not fortunate enough to have an Elliott 803 or 903 to hand. MARST will translate ALGOL 60 to C or one can get a feel for the whole 803 experience via a simulator .

Although as ALGOL 60 turns 60, you could just fire up a modern programming language. Lurking within will likely be the ideas of ALGOL's designers. ®

Simon Harris , 4 days

Re: .. never used .. ?

When I was studying Electronic Engineering in the early 1980s, ALGOL was the first language we were formally taught - I remember the ALGOL-68R language guide was a Ministry of Defence book.

Simon Harris , 4 days

Re: .. never used .. ?

Algol 60 on an ICL 1902 around 1980 here; How many freaking errors did you get because of missing semicolons?;

As for PL/1, IBM also had its own extended versions (confidential for some reason), used for internal mainframe code development, called PL/AS and PL/DS.

p.s. ADA anyone?

Archtech , 3 days
Re: Algol 68 is not ALGOL 60

"The more I ponder the principles of language design, and the techniques that put them into practice, the more is my amazement at and admiration of ALGOL 60. Here is a language so far ahead of its time that it was not only an improvement on its predecessors but also on nearly all its successors".

- C.A.R. Hoare, "Hints on Programming Language Design", 1973

Doctor Syntax , 3 days

"When your edit, compile, edit, compile cycle starts to get above about 10 minutes, you start to pay an awful lot of attention to your source code "

10 minutes? Luxury. Punched card jobs run in batches. 2 hours turn-round, max 3 runs a day with the compiler losing track after the first error and rejecting every subsequent line. Then you really paid attention to your source code.

Anonymous Coward , 3 days

The mainframe operators soon learned that when a System 4 (IBM 360) Assembler run produced thousands of errors after the first few statements it just needed a statement adding and rerunning. IIRC something like a USE or BALR 3,0 establishing the addressing base register.

The punched card data-prep women also became quite competent at spotting common mistakes. Hence one compiler test source compiled cleanly - when it was supposed to test those error messages.

On an official training course to learn the System 4 Assembler there was a desk of us who had already had some hands-on practice. The lecturer in Hut K was kept on his toes by our questions. When the set program task was returned from the computer run - he gleefully gave us our failed run listings. We looked at them - then pointed out he had forgotten to include the macro expansion pass. Oops! He then remembered that he always left it out on the first submission to save the expense on his machine time budget. He didn't expect clean compilations from the students.

Primus Secundus Tertius , 3 days

The "bad for science" machine was the IBM 360, where the floating point exponent represented 16**n rather than 2**n. As a result the floating point mantissa sometimes lost three bits. The result was that the single precision floating point was good to only about six decimal digits. Hence the proliferation of double precision floating point on IBM. It was not needed on ICL 190x nor Elliott 803.
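The arithmetic behind that "about six decimal digits" figure can be checked with a short sketch (my illustration, not from the comment): IBM 360 single precision kept a 24-bit fraction, but normalization only guaranteed a nonzero leading hexadecimal digit, so up to three leading bits of the fraction could be zero.

```python
import math

# IBM System/360 single precision: a 24-bit fraction with a base-16 exponent.
# Normalizing to a nonzero leading *hex digit* (rather than a leading bit)
# can waste up to 3 of the 24 fraction bits, leaving as few as 21 useful bits.

def decimal_digits(effective_bits):
    """Decimal digits of precision carried by a binary fraction of that size."""
    return effective_bits * math.log10(2)

print(f"best case  (24 bits): {decimal_digits(24):.1f} decimal digits")  # ~7.2
print(f"worst case (21 bits): {decimal_digits(21):.1f} decimal digits")  # ~6.3
```

The worst case of roughly 6.3 decimal digits matches the commenter's recollection, and explains why double precision was so common on IBM hardware.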

IBM were mainly interested in commercial arithmetic from COBOL compilers. This used binary coded decimal (BCD) arithmetic, which could handle billions of dollars to the nearest cent. COBOL type computational defaulted to BCD, I believe. I was once trying to explain floating point data to a database salesman. I finally got through to him with the phrase computational-type-3.
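The advantage of decimal arithmetic for money can be sketched with Python's decimal module, which is exact in base 10 in the same spirit as the BCD/packed-decimal (computational-3) arithmetic described above (a hedged illustration, not COBOL itself):

```python
from decimal import Decimal

# Accumulate one hundred thousand ten-cent postings both ways.
total_float = 0.0
total_bcd = Decimal("0.00")
for _ in range(100_000):
    total_float += 0.10           # binary float: 0.10 is not exactly representable
    total_bcd += Decimal("0.10")  # decimal arithmetic: exact to the cent

print(total_bcd)    # 10000.00 -- exact
print(total_float)  # slightly off 10000.0 from accumulated binary rounding
```

Scale that to billions of dollars of ledger entries and the appeal of BCD to commercial customers is obvious.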

Primus Secundus Tertius , 3 days

"Who remembers Backus-Naur ?"

I do! Back in the early-1980s, working at the in-house consultancy arm of a multinational, I was on-site doing some tedious task when I was joined by a colleague from our Dutch office. He had even less patience than me, so less than 30 minutes in, he resorted to giving me a formal lecture on Backus-Naur notation for the rest of the morning (and returned to Rotterdam in the afternoon). (When the main Board closed us down the following year, he returned to the University of Leiden. Thank you, Joss - I'll never forget BNF.)

Fustbariclation , 17 hrs

Isn't BNF still useful for representing languages?
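It is -- BNF and its EBNF variants remain the standard way language references and RFCs write down syntax. As a reminder of the notation, here is a toy grammar for integer expressions (an illustration written for this page, not drawn from any particular standard):

```
<expr>    ::= <term> | <expr> "+" <term> | <expr> "-" <term>
<term>    ::= <factor> | <term> "*" <factor> | <term> "/" <factor>
<factor>  ::= <integer> | "(" <expr> ")"
<integer> ::= <digit> | <integer> <digit>
<digit>   ::= "0" | "1" | "2" | "3" | "4" | "5" | "6" | "7" | "8" | "9"
```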

RobThBay , 3 days
The edit/compile/run cycle

"When your edit, compile, edit, compile cycle starts to get above about 10 minutes, you start to pay an awful lot of attention to your source code..."

I remember those days, except I was using punch cards instead of paper tape. Those long turn-arounds forced you to desk check your code and spend time debugging it properly.

[Jan 25, 2020] The evolution of a Linux sysadmin Enable Sysadmin by Nathan Lager

Jan 23, 2020 |
A tale of how a sysadmin went from hobbyist and Level 1 tech to the big time at Red Hat.

We've all got a story, right? I don't know if anyone would read mine, but, to the right audience, it might sound familiar, or at least relatable. My life is sort of fractured between two things I'm passionate about. One is the off-road industry, and the other is open source software. Some time ago, I was involved in a YouTube "challenge" where a bunch of us off-road enthusiasts were asked to share our story, and I told the tale of how I got involved in "Jeeping" and why we do what we do in that space. Here, I am about to tell the other side of my story, where I'm a computer guy with a bunch of nerd cred. So hang on while I tell you the story of a broke high-school kid who stumbled into a career.

I was a kid in the '80s; before computers were everywhere you looked. My dad, though, he was his generation's version of a geek -- a telephone guy, son of a handyman who worked his way through the Depression, making whatever he could out of whatever he had. As a kid, my dad involved me in all sorts of projects involving building things or wiring, and we even built an electrified wooden toy helicopter together. I learned from him that you could build your own happiness. I always joke with people that I learned my alphabet on a Texas Instruments computer connected to our living room TV. My first game system was an Atari computer with a 5.25" floppy drive.

In the early '90s, my dad brought home a state-of-the-art 486 desktop computer. Our first modern computer! He gave me his old Atari to put in my bedroom and tinker on, and that's what I did. The 486, though, that thing had a modern operating system on it, and I was nothing short of enthralled. A friend introduced me to some local dial-up bulletin board systems.

That, I'd have to say, is what started it all. I learned so much from this community of like-minded folks; all dialed into this little BBS. Eventually, I became very curious about how the BBS itself worked and started tinkering with BBS software. I found the space to be expensive, though; you needed hardware I couldn't afford.

Then came the Internet. As I mentioned, my dad was a telephone guy. He was, at the time, an engineer at a local telephone company. One of the partner companies under the same umbrella as his telco was starting up an internet service provider! He was able to get in early, and I was one of the first kids in my grade to get access to the Internet.

The Internet was a very different place at the time. You'd dial in, so it was much slower, and, because of the speed and the technologies involved, it was very much a text world. It was certainly not the entertainment source it is today, but to me, it was still just amazing!

I had a friend who was just as intrigued by technology as I was. He and I were both interested in the world outside of the end-user experience of these neat little computer things. We read everything we could get our hands-on. We read about websites and servers and how all of these things worked together. We read about some of the darker sides of technology, the Hackers Handbook, and how phone phreaking worked. I even learned a bit about how to pick locks! Lucky for me, and my parents, I've always been guided by a pretty fierce sense of morality, or my life could have turned out much differently.

Our reading and learning eventually led us to Linux. The word "free" associated with Linux caught our attention. I didn't have a job, I was in high school, so free was good. Little did I know that picking up that first Linux distro (Red Hat 5.0, I still have the CDs) would steer me into the career I've worked at for my entire adult life. My friend, by the way, he runs the Engineering team at that ISP I mentioned now. I guess our curiosity worked out pretty well for him too!

During the summer between my sophomore and junior years in High School, I picked up and started tinkering with those Red Hat 5.0 install discs. I installed, and reinstalled, and reinstalled, and reinstalled that OS on my little 486 until I finally got it right. I even got dual-booting working, so I could keep my Windows environment and play with Linux. After I graduated, my parents bought me a new PC to use for my school work in college, so I was able to turn my little 486 into a dedicated Linux machine. By now, we'd moved from dial-up internet service to dedicated cable. 500mbps baby! I ran a website off of my little 486. I lost track of the number of times I had to wipe and reinstall that system because some malicious actor broke into my poor little machine and flattened it on me, but I persisted, learning something else each time.

While I was in college, I worked in level 1 tech support for the ISP I mentioned above. I didn't love it. I had no control over the services I was supporting, and let's face it, level 1 tech support is a frustrating IT job. I spent five years doing that, trying and failing to get into the system administration group at the ISP. Eventually, I moved up into a network support role, which was much better than level 1, but not where I wanted to be. I was OK at networking, and I certainly could have made a career out of it, but it wasn't what I wanted to do. I wanted to run servers. I wanted to run Linux.

So, after seven years at the ISP, I left and started a job as a network administrator at a small web host. We were a team of about ten people, though that varied during the time I was there. "Network administrator" was a very loose title there. I was responsible for everything that had a CPU in it. I even had to replace the filter in the building's AC unit. I was responsible for network gear, some WAN links, Cisco routers, switches, and of course, Windows, Linux, and BSD servers. This was much more in line with what I wanted to do. However, I didn't love how they were doing it, not just from a technology aspect, but from a business perspective. They did some things that I thought were questionable. Still, though, I was gaining experience, in so many ways. I implemented more and more Linux systems to replace windows and BSD systems there, architected improvements, and generally did the best job I knew how to do.

After about three and a half years there, I left that web host for what I thought would be my last move. I started as a system administrator at a small liberal arts college near home. By this point, I'm married, and my wife and I are planning on a family. Higher education has some benefits that many people might not know about. It's a great atmosphere, and they put a lot of emphasis on bettering yourself, not just putting in your hours. The only real downside is that the pay is lower than in the private sector. Still, this was an increase over what I was making, and I didn't know it at the time, but I was walking into a team that helped set me up for the future in ways I couldn't have imagined.

See, I believe that IT is made up of two types of people: folks who see IT as a lucrative career, and folks who are passionate about IT and get paid to do it. This place was about 50% passionate people. I had never worked so closely with people so excited to do what they do. I felt like I was at home, I was learning new things every day, and talking with some of the most brilliant IT people I'd ever met. What's more, they all wanted to share what they knew.

Well, over time, that slowly changed. Those brilliant people took other jobs, some changes in management forced some others out, and eventually, I found that I was one of the few left who was still passionate about what had been so important to the mission at the college. More cloud adoption meant less need for a do-it-yourselfer like me. My "I'm going to retire here" plans started to crumble. I eventually moved into a new role they created for high-performance computing, which had promise. We started deploying the college's first HPC cluster. Then I got a message one Sunday afternoon from a contact I'd made within Red Hat.

I'd met Marc (Unclemarc to those who know him) through the Red Hat Accelerators, a customer advocacy group that Red Hat runs, and of which I'd become a member in late 2018. We hung out at Summit in Boston in early 2019, and apparently, he liked what he saw in me. He let me know that the team he's on would likely have an opening soon, and he thought I'd make a great addition. Now, for me, the prospect of a job at Red Hat sounded almost too good to be true. I'd been a fan of Red Hat since well, remember when I said I bought that first Linux distro install disc in 1997 or so? It was Red Hat Linux. I'd based a career on a Linux distro I'd bought out of an interest in a better way of doing something, when I was a kid in high school, looking for a cheaper alternative. Now here I am, a few months into Technical Account Management at Red Hat. I guess you could say I'm pleased with where this path has taken me.

Wondering where your sysadmin career could take you? Take a skills assessment to find the next step on the path for you. Nathan Lager is a Technical Account Manager with Red Hat and an experienced sysadmin with 20 years in the industry. He first encountered Linux (Red Hat 5.0) as a teenager, after deciding that software licensing was too expensive for a kid with no income, in the late 90's.


[Dec 03, 2019] The America of the moon-landing is not the America of today. Graduates of business schools have taken over once great engineering companies

Dec 03, 2019 |

absalom_hicks , 30 minutes ago link

The America of the moon-landing is not the America of today. Graduates of business schools have taken over once great engineering companies. The business students are of a lower intellect and baser motivation -- the worst possible combination.

The desire for science and engineering has weakened in America but greed for money and wealth is greatly increased. The business types produce mostly intellectual garbage and compensate for it with volume. No competent intellect can read it (or wants to do so) and so it remains unchecked, inflates even more and clogs everything.

You can live for a long time on the great inheritance your fathers have bequeathed you, but you cannot live on it forever. Yet this is what we are trying to do, in more ways than one.

schroedingersrat , 6 minutes ago link

The business students are of a lower intellect and baser motivation

Most of them are losers that simply inherited their wealth & power. America is now run by rich dimwits that would flip burgers in a meritocracy.

[Dec 02, 2019] Perhaps it's time to remember Yuri Gagarin's role in creating the USA hardware and software engineering boom from 1960 to 2000

Dec 02, 2019 |

The shock in the US was that the Russians were not only competitive, but had embarrassed US science and engineering by being first. In 1958, President Eisenhower signed into law the National Defense Education Act, and this enabled talented students to flow into science and engineering. The shock waves were felt throughout the entire educational system, from top to bottom. Mathematics was more important than football.

[Dec 01, 2019] Academic Conformism is the road to 1984. - Sic Semper Tyrannis

Highly recommended!
Dec 01, 2019 |

Academic Conformism is the road to "1984."


The world is filled with conformism and groupthink. Most people do not wish to think for themselves. Thinking for oneself is dangerous, requires effort and often leads to rejection by the herd of one's peers.

The profession of arms, the intelligence business, the civil service bureaucracy, the wondrous world of groups like the League of Women Voters, Rotary Club as well as the empire of the thinktanks are all rotten with this sickness, an illness which leads inevitably to stereotyped and unrealistic thinking, thinking that does not reflect reality.

The worst locus of this mentally crippling phenomenon is the world of the academics. I have served on a number of boards that awarded Ph.D and post doctoral grants. I was on the Fulbright Fellowship federal board. I was on the HF Guggenheim program and executive boards for a long time. Those are two examples of my exposure to the individual and collective academic minds.

As a class of people I find them unimpressive. The credentialing exercise in acquiring a doctorate is basically a nepotistic process of sucking up to elders and a crutch for ego support as well as an entrance ticket for various hierarchies, among them the world of the academy. The process of degree acquisition itself requires sponsorship by esteemed academics who recommend candidates who do not stray very far from the corpus of known work in whichever narrow field is involved. The endorsements from RESPECTED academics are often decisive in the award of grants.

This process is continued throughout a career in academic research. PEER REVIEW is the sine qua non for acceptance of a "paper," invitation to career making conferences, or to the Holy of Holies, TENURE.

This life experience forms and creates CONFORMISTS, people who instinctively boot-lick their fellows in a search for the "Good Doggy" moments that make up their lives. These people are for sale. Their price may not be money, but they are still for sale. They want to be accepted as members of their group. Dissent leads to expulsion or effective rejection from the group.

This mentality renders doubtful any assertion that a large group of academics supports any stated conclusion. As a species academics will say or do anything to be included in their caste.

This makes them inherently dangerous. They will support any party or parties, of any political inclination if that group has the money, and the potential or actual power to maintain the academics as a tribe. pl

doug , 01 December 2019 at 01:01 PM


That is the nature of tribes and humans are very tribal. At least most of them. Fortunately, there are outliers. I was recently reading "Political Tribes" which was written by a couple who are both law professors that examines this.

Take global warming (aka the rebranded climate change). Good luck getting grants to do any skeptical research. This highly complex subject which posits human impact is a perfect example of tribal bias.

My success in the private sector comes from consistent questioning what I wanted to be true to prevent suboptimal design decisions.

I also instinctively dislike groups that have some idealized view of "What is to be done?"

As Groucho said: "I refuse to join any club that would have me as a member"

J , 01 December 2019 at 01:22 PM
Reminds one of the Borg, doesn't it?

The 'isms' had it, be it Nazism, Fascism, Communism, Totalitarianism, Elitism: all demand conformity and adherence to group think. If one does not kowtow to whichever 'ism' is at play, those outside their group think are persecuted, ostracized, jailed, and executed, all because they defy their conformity demands and defy allegiance to them.

One world, one religion, one government, one Borg. all lead down the same road to -- Orwell's 1984.

Factotum , 01 December 2019 at 03:18 PM
David Halberstam: The Best and the Brightest. (Reminder how the heck we got into Vietnam, when the best and the brightest were serving as presidential advisors.)

Also good Halberstam re-read: The Powers that Be - when the conservative media controlled the levers of power; not the uber-liberal one we experience today.

[Nov 08, 2019] The monumental impact of C

Nov 08, 2019 |

The season finale of Command Line Heroes offers a lesson in how a small community of open source enthusiasts can change the world. 01 Oct 2019, Matthew Broberg (Red Hat)

The Command Line Heroes podcast explores C's origin story in a way that showcases the longevity and power of its design. It's a perfect synthesis of all the languages discussed throughout the podcast's third season and this series of articles.

C is such a fundamental language that many of us forget how much it has changed. Technically a "high-level language," in the sense that it requires a compiler to be runnable, it's as close to assembly language as people like to get these days (outside of specialized, low-memory environments). It's also considered to be the language that made nearly all languages that came after it possible.

The path to C began with failure

While the myth persists that all great inventions come from highly competitive garage dwellers, C's story is more fit for the Renaissance period.

In the 1960s, Bell Labs in suburban New Jersey was one of the most innovative places of its time. Jon Gertner, author of The Idea Factory, describes a culture marked by optimism and the excitement of solving tough problems. Instead of monetization pressures and tight timelines, Bell Labs offered seemingly endless funding for wild ideas. It had a research and development ethos that aligns well with today's open leadership principles. The results were significant and prove that brilliance can come without the promise of VC funding or an IPO.

Programming and development

The challenge back then was terminal sharing: finding a way for lots of people to access the (very limited number of) available computers. Before there was a scalable answer for that, and long before we had a shell like Bash, there was the Multics project. It was a hypothetical operating system where hundreds or even thousands of developers could share time on the same system. This was a dream of John McCarthy, creator of Lisp and the term artificial intelligence (AI), as I recently explored.

Joy Lisi Rankin, author of A People's History of Computing in the United States, describes what happened next. There was a lot of public interest in driving forward with Multics' vision of more universally available timesharing. Academics, scientists, educators, and some in the broader public were looking forward to this computer-powered future. Many advocated for computing as a public utility, akin to electricity, and the push toward timesharing was a global movement.

Up to that point, high-end mainframes topped out at 40-50 terminals per system. The change of scale was ambitious and eventually failed, as Warren Toomey writes in IEEE Spectrum :

"Over five years, AT&T invested millions in the Multics project, purchasing a GE-645 mainframe computer and dedicating to the effort many of the top researchers at the company's renowned Bell Telephone Laboratories -- including Thompson and Ritchie, Joseph F. Ossanna, Stuart Feldman, M. Douglas McIlroy, and the late Robert Morris. But the new system was too ambitious, and it fell troublingly behind schedule. In the end, AT&T's corporate leaders decided to pull the plug."

Bell Labs pulled out of the Multics program in 1969. Multics wasn't going to happen.

The fellowship of the C

Funding wrapped up, and the powerful GE-645 mainframe was assigned to other tasks inside Bell Labs. But that didn't discourage everyone.

Among the last holdouts from the Multics project were four men who felt passionately tied to the project: Ken Thompson, Dennis Ritchie, Doug McIlroy, and J.F. Ossanna. These four diehards continued to muse and scribble ideas on paper. Thompson and Ritchie developed a game called Space Travel for the PDP-7 minicomputer. While they were working on that, Thompson started implementing all those crazy hand-written ideas about filesystems they'd developed among the wreckage of Multics.

A PDP-7 minicomputer was not top-of-the-line technology at the time, but the team implemented foundational technologies that changed the future of programming languages and operating systems alike.

That's worth emphasizing: Some of the original filesystem specifications were written by hand and then programmed on what was effectively a toy compared to the systems they were using to build Multics. Wikipedia's Ken Thompson page dives deeper into what came next:

"While writing Multics, Thompson created the Bon programming language. He also created a video game called Space Travel . Later, Bell Labs withdrew from the MULTICS project. In order to go on playing the game, Thompson found an old PDP-7 machine and rewrote Space Travel on it. Eventually, the tools developed by Thompson became the Unix operating system : Working on a PDP-7, a team of Bell Labs researchers led by Thompson and Ritchie, and including Rudd Canaday, developed a hierarchical file system , the concepts of computer processes and device files , a command-line interpreter , pipes for easy inter-process communication, and some small utility programs. In 1970, Brian Kernighan suggested the name 'Unix,' in a pun on the name 'Multics.' After initial work on Unix, Thompson decided that Unix needed a system programming language and created B , a precursor to Ritchie's C ."

As Warren Toomey documented in the IEEE Spectrum article mentioned above, Unix showed promise in a way the Multics project never did. After winning over the team and doing a lot more programming, the pathway to Unix was paved.

Getting from B to C in Unix

Thompson quickly created a Unix language he called B. B inherited much from its predecessor BCPL, but it wasn't enough of a breakaway from older languages. B didn't have data types, for starters. It's considered a typeless language, which meant its "Hello World" program looked like this:

main( ) {
extrn a, b, c;
putchar(a); putchar(b); putchar(c); putchar('!*n');
}

a 'hell';
b 'o, w';
c 'orld';

Even if you're not a programmer, it's clear that carving up strings four characters at a time would be limiting. It's also worth noting that this text is considered the original "Hello World" from Brian Kernighan's 1972 tutorial, A Tutorial Introduction to the Language B (although that claim is not definitive).


Typelessness aside, B's assembly-language counterparts were still yielding programs faster than was possible using the B compiler's threaded-code technique. So, from 1971 to 1973, Ritchie modified B. He added a "character type" and built a new compiler so that it didn't have to use threaded code anymore. After two years of work, B had become C.

The right abstraction at the right time

C's use of types and ease of compiling down to efficient machine code made it the perfect language for the rise of minicomputers. B was eventually overtaken by C. Once C became the language of Unix, it became the de facto standard across the budding computer industry. Unix was the sharing platform of the pre-internet days. The more people wrote C, the better it got, and the more it was adopted. It eventually became an open standard itself. According to the Brief history of C programming language:

"For many years, the de facto standard for C was the version supplied with the Unix operating system. In the summer of 1983 a committee was established to create an ANSI (American National Standards Institute) standard that would define the C language. The standardization process took six years (much longer than anyone reasonably expected)."

How influential is C today? A quick look at the languages that descend from it, from C++ and Objective-C to Java and Go, gives some idea.

Decades after they started as scrappy outsiders, Thompson and Ritchie are praised as titans of the programming world. They shared 1983's Turing Award, and in 1998, received the National Medal of Technology for their work on the C language and Unix.


But Doug McIlroy and J.F. Ossanna deserve their share of praise, too. All four of them are true Command Line Heroes.

Wrapping up the season

Command Line Heroes has completed an entire season of insights into the programming languages that affect how we code today. It's been a joy to learn about these languages and share them with you. I hope you've enjoyed it as well!

[Oct 23, 2019] Internet Archive Releases 2,500 MS-DOS Games

Oct 23, 2019 |

BeauHD on Monday October 14, 2019 @10:10PM from the nostalgia-blast dept. The latest update from Internet Archive brings thousands of MS-DOS games from the '90s like 3D Bomber, Zool and Alien Rampage. CNET reports: On Sunday, Internet Archive released 2,500 MS-DOS games that includes action, strategy and adventure titles. Some of the games are Vor Terra, Spooky Kooky Monster Maker, Princess Maker 2 and I Have No Mouth And I Must Scream. "This will be our biggest update yet, ranging from tiny recent independent productions to long-forgotten big-name releases from decades ago," Internet Archive software curator Jason Scott wrote on the site's blog .

One game that might trigger a few memories is the 1992 action-adventure horror game Alone in the Dark , published by Infogrames. In the game, you can play private investigator Edward Carnby or family member Emily Hartwood, who's investigating the suspicious death of Jeremy Hartwood in his Louisiana mansion called Derceto, which is now supposedly haunted. Fighting against rats, zombies and giant worms, you have to solve a number of puzzles to escape. Another retro game included by Internet Archive is a 1994 title played on PCs and Amiga computers called Mr. Blobby (a remake of the SNES game Super Troll Islands). Players can choose from three different characters -- Mr. Blobby, Mrs. Blobby and Baby Blobby. The goal of the game is to color in the computer screen by walking over it. Levels include climbing ladders, avoiding spikes and bouncing on springs.

[Oct 22, 2019] Wired Remembers the Glory Days of Flash

Oct 22, 2019 |


They write that its early popularity in the mid-1990s came in part because "Microsoft needed software capable of showing video on their website, then the default homepage of every Internet Explorer user." But Flash allowed anyone to become an animator. (One Disney artist tells them that Flash could do in three days what would take a professional animator 7 months -- and cost $10,000.)

Their article opens in 2008, a golden age when Flash was installed on 98% of desktops -- then looks back on its impact: The online world Flash entered was largely static. Blinking GIFs delivered the majority of online movement. Constructed in early HTML and CSS, websites lifted clumsily from the metaphors of magazine design: boxy and grid-like, they sported borders and sidebars and little clickable numbers to flick through their pages (the horror).

Flash changed all that. It transformed the look of the web ...

Some of these websites were, to put it succinctly, absolute trash. Flash was applied enthusiastically and inappropriately. The gratuitous animation of restaurant websites was particularly grievous -- kitsch abominations, these could feature thumping bass music and teleporting ingredients . Ishkur's 'guide to electronic music' is a notable example from the era you can still view -- a chaos of pop arty lines and bubbles and audio samples, it looks like the mind map of a naughty child...

In contrast to the web's modern, business-like aesthetic, there is something bizarre, almost sentimental, about billion-dollar multinationals producing websites in line with Flash's worst excess: long loading times, gaudy cartoonish graphics, intrusive sound and incomprehensible purpose... "Back in 2007, you could be making Flash games and actually be making a living," remembers Newgrounds founder Tom Fulp, when asked about Flash's golden age. "That was a really fun time, because that's kind of what everyone's dream is: to make the games you want and be able to make a living off it."
Wired summarizes Steve Jobs' "brutally candid" diatribe against Flash in 2010. "Flash drained batteries. It ran slow. It was a security nightmare. He asserted that an era had come to an end... '[T]he mobile era is about low power devices, touch interfaces and open web standards -- all areas where Flash falls short.'" Wired also argues that "It was economically viable for him to rubbish Flash -- he wanted to encourage people to create native games for iOS."

But they also write that today, "The post-Flash internet looks different. The software's downfall precipitated the rise of a new internet moulded by the specifications of the smartphone and the growth of social media," favoring hits of information rather than striving for more immersive, movie-emulating thrills.

And they add that though Newgrounds long-ago moved away from Flash, the site's founder is now working on a Flash emulator to keep all that early classic content playable in a browser.

[Oct 13, 2019]

Notable quotes:
"... He mostly writes in C today. ..."
Oct 13, 2019 |

Eugene Miya , A friend/colleague. Sometimes driver. Other shared experiences. Updated Mar 22 2017 · Author has 11.2k answers and 7.9m answer views

He mostly writes in C today.

I can assure you he at least knows about Python. Guido's office at Dropbox is 1 -- 2 blocks by a backdoor gate from Don's house.

I would tend to doubt that he would use R (I've used S before as one of my stat packages). Don would probably write something for himself.

Don is not big on functional languages, so I would doubt either Haskell (sorry Paul) or LISP (but McCarthy lived just around the corner from Don; I used to drive him to meetings; actually, I've driven all 3 of us to meetings, and he got his wife an electric version of my car based on riding in my car (score one for friend's choices)). He does use emacs and he does write MLISP macros, but he believes in being closer to the hardware which is why he sticks with MMIX (and MIX) in his books.

Don't discount him learning the machine language of a given architecture.

I'm having dinner with Don and Jill and a dozen other mutual friends in 3 weeks or so (our quarterly dinner). I can ask him then, if I remember (either a calendar entry or at job). I try not to bother him with things like this. Don is well connected to the hacker community

Don's name was brought up at an undergrad architecture seminar today, but Don was not in the audience (an amazing audience; I took a photo for the collection of architects and other computer scientists in the audience (Hennessy and Patterson were talking)). I came close to biking by his house on my way back home.

We do have a mutual friend (actually, I introduced Don to my biology friend at Don's request) who arrives next week, and Don is my wine drinking proxy. So there is a chance I may see him sooner.

Steven de Rooij , Theoretical computer scientist Answered Mar 9, 2017 · Author has 4.6k answers and 7.7m answer views

Nice question :-)

Don Knuth would want to use something that’s low level, because details matter . So no Haskell; LISP is borderline. Perhaps if the Lisp machine ever had become a thing.

He’d want something with well-defined and simple semantics, so definitely no R. Python also contains quite a few strange ad hoc rules, especially in its OO and lambda features. Yes Python is easy to learn and it looks pretty, but Don doesn’t care about superficialities like that. He’d want a language whose version number is converging to a mathematical constant, which is also not in favor of R or Python.

What remains is C. Out of the five languages listed, my guess is Don would pick that one. But actually, his own old choice of Pascal suits him even better. I don't think any languages have been invented since then that score higher on the Knuthometer than Knuth's own original pick.

And yes, I feel that this is actually a conclusion that bears some thinking about. 24.1k views ·

Dan Allen , I've been programming for 34 years now. Still not finished. Answered Mar 9, 2017 · Author has 4.5k answers and 1.8m answer views

In The Art of Computer Programming I think he'd do exactly what he did. He'd invent his own architecture and implement programs in an assembly language targeting that theoretical machine.

He did that for a reason because he wanted to reveal the detail of algorithms at the lowest level of detail which is machine level.

He didn't use any available languages at the time and I don't see why that would suit his purpose now. All the languages above are too high-level for his purposes.

[Oct 01, 2019] Soviet Computing in the 1980s A Survey of the Software and Its Applications - ScienceDirect

Oct 01, 2019 |

Advances in Computers Volume 30 , 1990, Pages 223-306

The chapter surveys some aspects of the Soviet computer software world and examines how computers applied in several fields enjoy a high level of official support. The chapter examines seven major areas of computer applications in the USSR. Various automated systems of management and control (ASU) are discussed. The state of computing in Soviet research and development organizations, which found themselves low on the priority list when it came to allocating computer and communications technology until the mid 1980s is also described. Computer networking is also developing very slowly in the USSR. The Ministry of Telecommunications is hostile to data communications and places various impediments in the way of organizations desiring to use the switched network for this purpose. The chapter reviews Soviet educational computing. Computer courses with a curriculum stressing the development of programming skills and "algorithmic thinking" were introduced into Soviet schools. Computer Aided Design (CAD) is the latest applications area of highest priority. The chapter emphasizes that without radical change, the Soviet software industry will be unable to satisfy domestic demand for high-quality software. The consequence is that Western software will be in great and growing demand, which raises a policy question for the United States and its software industry.

[Oct 01, 2019] What Were Soviet Computers Like?

Feb 01, 2002 |

kwertii asks: "Does anyone have any information on computing in the former Soviet Union? A Google search turned up this virtual museum , which has some good historical background on the development of early Soviet computer technology (a lot only in Russian, unfortunately) but not much on later systems. What sorts of architectures did Soviet computers use? Were there any radically different computing concepts in use, like a standard 9-bit byte or something? What kind of operating systems were common? How has the end of the Cold War and the large scale introduction of Western computer technology affected the course of Russian computer development?"

Bob_Robertson ( 454888 ) , Monday February 18, 2002 @09:21PM ( #3029642 ) Homepage

Like IBM's. ( Score: 4 , Interesting)

The reality is that the KGB was stealing American computer designs from the beginning. As Glastnost was coming into being, and the "west" was getting a look into how things worked inside the Soviet system, they discovered that they were running clones of the IBM 360's.

I've seen an interview recently with an ex-KGB big-wig who said he realized how bankrupt the Soviet system was as he learned how little they developed "in house" rather than copied from the west. The Soviets were always one or two generations of technology behind simply because they weren't inventing it.


Xunker ( 6905 ) , Monday February 18, 2002 @09:26PM ( #3029665 ) Homepage Journal
Another.. ( Score: 5 , Interesting)

On a slightly different note, I found this link a while ago that discusses, in great depth, Sinclair Clones [] from the late 1970's to the early 1990's.

Another thing I remember reading a long while ago was an article in "A+/Incider" magazine (an Apple II magazine) where the cover story was the giant headline "Red Apples"; in it they talked about a clone of the Apple IIe that looked like a negative of the Apple IIe we know (black case, white keys), but otherwise was more or less the same -- compatible logic, just made somewhere else. I may even throw that copy in my flatbed if there is enough interest.

If I had to guess, all but the very high-end or very early machines will be of the same designs as their Western counterparts, probably for engineering reasons: an engineer doesn't want to reinvent the wheel (or bitwise logic, in this case) just to make a machine to do word processing.

morcheeba ( 260908 ) writes:
Re:Another.. ( Score: 3 , Informative)

Here's some info on the Agat [] - a clone of an Apple II.

If you want to buy an old Russian computer, try here (has many pictures!) []. I don't know if this guy's stock is representative of 1980's Russian computing, but it contains a lot (31) of Sinclair clones [], and information on other computers, including IBM PC-compatibles []. If nothing, the names listed should help searches.

deicide ( 195 ) writes:
Re:Another.. ( Score: 2 , Informative)

Sinclair clones are VERY representative of the personal computer market of that time. There were literally dozens of variants, with various extensions and addons, custom operating systems, modified OS, etc. They were self-made (I've had one of those, total cost: $20), with mass-produced PC boards and cases, and even factory-made (even with the OS translated to Russian).

Most of them connected to a TV and used tape recorders for storage. Eventually, I had a dot-matrix printer and could've gotten a 5" floppy drive if I really wanted. I've seen mice, modems and light pens. I've seen a cable and broadcast TV system's audio channel used to broadcast binary data when the station wasn't broadcasting regular programming (would that be a predecessor to cable modems?). We would record audio to tapes and then load them back into the computer.

There were clones of 286 PC's as well (Poisk), although that was just about when I moved to this side of the ocean..

There were also completely original computers with BASIC or FORTRAN interpreter as "operating system".

Tipsy McStagger ( 312800 ) writes:
Re:Another.. ( Score: 1 )

heh he.. sinclair modems.

I remember trying to set up a system with a friend across town where the spectrums were wired up to mangled phones and we'd send messages by saving a program across the phone that the other end would load and then repeat... each message also included the basic app required to send the next one - or something - I forget now

Anonymous Coward writes:
Re:Another.. ( Score: 2 , Interesting)

I lived in the USSR. Most of what I saw were:

- Z80 machines running CP/M or custom operating systems like the DIOS

- Sinclair clones

When the opening to the west happened, there was a huge leap in technology because 286 and 386SX PCs were brought.

I was fortunate enough to have one, and it seemed to me, at that time, that they had gigantic CPU power and a huge memory.

I was running benchmarks all the time to compare my 386sx with my Sinclair.

My 386sx was about 10-15 times faster, and had 15 times more memory!

How was that for a leap?

Now in Eastern Europe we have very good programmers. Why?

Because, when the outside world is not that interesting and funny, more and more people have fun (I mean, programming is lots of fun) with their computers!

Thank you for your time reading this, and sorry for posting as AC. I don't have a ./ account and I find logging it each time in order to read ./ is pretty hard.

cwebster ( 100824 ) writes:
Re:Another.. ( Score: 1 )

>Thank you for your time reading this, and sorry for posting as AC. I don't have a ./ account and I find logging it each time in order to read ./ is pretty hard.

you know, you can cookie your logon and only have to actually log on once a year, when your cookie expires.

Ratface ( 21117 ) writes:
Re:Another.. ( Score: 2 )

They have (had?) one of the Russian Sinclair clones in a display case by the stairs at the staff entrance of the National Science Museum in London when I was there (~5 years ago).

First time I had ever seen one. I've often thought how much fun it must have been trying to deal with the strange 5 functions / key system that the Spectrum had PLUS having everything in Cyrillic!

I'd love to pick up one of those babies!

eufaula ( 163352 ) writes:
Re:Another.. -- Pravetz ( Score: 3 , Interesting)

I have a good friend who is from Bulgaria, and there they mass-produced an Apple IIe knockoff called the Pravetz. They reverse engineered the Apple and started making their own version. He said that they ended up being more powerful than any of the Apple II line.

People like the Dark Avenger (ever had a real computer virus? he probably wrote it) grew up hacking these things. anyway, they are mentioned in a really good wired article [] about the Dark Avenger and the Soviet Bloc's more recent computing history, and Woz even has a picture of one [] on his website.

Anonymous Coward , Monday February 18, 2002 @10:15PM ( #3029732 )
link ( Score: 5 , Informative) -- also has bibliography to printed materials

RGRistroph ( 86936 ) writes:
Re:link ( Score: 2 )

Mod parent up -- it's one of the only two informative posts so far (and no, that guy ranting about how you have to go to the library to do research is not insightful)

Link for the lazy: []

clem.dickey ( 102292 ) , Tuesday February 19, 2002 @12:14AM ( #3030100 )
Ryad line ( Score: 4 , Informative)

In the late 70s or early 80s ACM's "Computing Surveys" ran an article on Soviet computing. Here's what I remember:

The Soviets said that military computers were generally original designs.

Most of the commercial computers were either IBM 360/370 models diverted through 3rd countries (direct exports were prohibited) or the Soviet "Ryad" line. Ryads were 360/370 copies. Not having to worry about copyright and patent issues, the East copied IBM mainframes directly. IBM engineers recognized an I/O problem with one Soviet model, since the IBM original had the same problem. Just as the 360 model development was split among groups in Poughkeepsie and Endicott, different Soviet Bloc countries were assigned development/manufacturing responsibility for the copies.

Software was, of course, pirated OS/360. (Back in those days, software came with source.)

RGRistroph ( 86936 ) writes: < > on Tuesday February 19, 2002 @12:54AM ( #3030261 ) Homepage
Re:Ryad line ( Score: 5 , Informative)

I found the's site search to be unusable on linux/mozilla, which is ironic -- however, a Google search on "soviet" [] turned up some interesting papers available as pdf (special tribute to Russian Dmitry Sklyarov ?):

There are more, but the google search page is probably the place to go, rather than me cutting-and-pasting it here.

By the way, that guy S.E. Goodman seems to have also written an article about Red China's internet infrastructure.

AndyElf ( 23331 ) writes:
Re:link ( Score: 1 )

Also this [] page has interesting info. History, timeline, pics.

fm6 ( 162816 ) , Monday February 18, 2002 @11:01PM ( #3029782 ) Homepage Journal
The Old-Fashioned Way ( Score: 2 , Insightful)

Let's see, so far we've got one offtopic post, one bigoted and ignorant post from the Tom Clancy crowd, and the usual noise. I don't think you'll get much help here. Sometimes all you can find online is opinion and rumor.

Now, don't get me wrong. I love the Web in general and Slashdot in particular. Both are invaluable resources for obscure little questions like the one you're asking. I know I used to write technical documentation without having the net as a reference source -- but I'm damned if I remember how.

Still, information you can get through this kind of informal research is limited in scope. There's a lot of stuff online -- but a lot more that's not. A lot of texts exist only in proprietary databases, not on the web. Not to mention the much larger document base that simply doesn't exist in electronic form.

You need to find a good library, probably one at a university or in a major city. They all have web sites (librarians love the web) and usually have their catalogs online. But searching a library catalog is not as simple as typing a few content words into Google. You probably need to interface with one of those old-fashioned access nodes that are only available onsite -- the ones with comprehensive heuristic and associative search features. I refer, of course, to reference librarians.

duffbeer703 ( 177751 ) writes:
Re:The Old-Fashioned Way ( Score: 3 , Insightful)

That's very politically correct of you. You show a tendency common to most PC types -- Don't let the facts get in the way of feel-good politics.

The Soviet Union didn't do very much independent computer design after the early 1960's. Various Soviet agencies and front organizations obtained IBM, Burroughs and Sperry-Univac mainframes and set up factories to manufacture spares and even a few backward-engineered copies.

The Soviet Union did not embrace information technology. It was a society that was essentially living in the 1930's. Heavy industry was the priority of the USSR, not semiconductors.

If you looked on the desks of Soviet desk jockeys in the late 80's, you'd find most offices to be non-computerized (like many western offices). The ones with computers had green screens, IBM or Apple clones. Engineers had Intergraph or Apollo stuff.

The truth isn't bigoted or ignorant.

fm6 ( 162816 ) writes:
I'm so PC! ( Score: 2 )

I love the term "Politically Correct". It allows you to dismiss any difference of opinion as a kneejerk reaction. Which is itself, of course, a kneejerk reaction.

(I once heard Night of the Living Dead condemned as "Politically Correct" because the main character was black. Too typical.)

Look, I never said the Soviets never ripped off American technology. The US leads the planet in this area. People imitate us. Well, duh. Go to the Sony web site sometime and read that company's history. Their early attempts to reverse-engineer and manufacture magnetic recording devices are quite amusing.

I'm no expert on the history of Soviet technology. But I do know enough to know that saying "They never did anything with computers except rip off American designs" is simplistic and stupid.

In point of fact, Soviet engineers in all areas were not able to imitate Western technology as much as they would have liked. There were many reasons for this, some obvious, some not. If you're really interested in the subject, go do some actual reading. In any case, spare us the Clancy cliches.

duffbeer703 ( 177751 ) writes:
Re:I'm so PC! ( Score: 2 )

The term "Politically Correct" in this context means that you are more concerned with your notion of "fairness" towards the former Soviet Union than the facts.

You have further reinforced my assessment of your original post with your reply. You suggest that I visit the Sony web site to learn about their early reverse-engineering efforts, then admit that you know virtually nothing about Soviet technology. You then assert (while posting in "Ask Slashdot") that we would all be better served by reading printed books (that Tom Clancy didn't write) on the subject rather than asking people on the web.

Maybe you should have taken a second to read my post. In that post I stated clearly that Soviets did have their own computer innovations until sometime in the 1960's. At that point it was cheaper and easier for them to appropriate and/or copy Western equipment. Technology as it applied to semiconductors just was not a priority.

Spare this forum your offtopic pseudo-intellectual rants and go away.

fm6 ( 162816 ) writes:
Grunt. Mumble. ( Score: 2 )

It's so paradoxical being PC. On the one hand, people assume you're so thoroughly brainwashed that you can't think for yourself. On the other hand, they continue to lecture you as if you were actually capable of rational thought!

Well, I can't sneer. Here I am arguing with a guy who enters the discussion with the premise that nothing I say can make sense. Pretty futile, no?

But I love the way you put "fair" in quotes. In this context "fair" simply means admitting that you don't know what you don't know. It means being skeptical about your own prejudices and assumptions.

It might help if you separate out the issue of whether the Soviet system was morally bankrupt and profoundly inefficient. Actually, that's not even an issue any more -- almost everybody agrees that it was. But it doesn't follow from this fact that Soviet technology consisted entirely of pathetic American rip offs. However screwed up the state was, it had some brilliant citizens, and only a bigot would dismiss their accomplishments out of hand.

fm6 ( 162816 ) writes:
Disagreement is not Bigotry ( Score: 2 )

You think I want the whole world to agree with me? What am I doing on Slashdot then?

It's not the opinion that makes you a bigot. Bigotry can happen to anybody, of any stripe. God knows I've caught myself in that mode often enough.

The difference between disagreement and bigotry is the same as the difference between having an honest difference of opinion and being prejudiced. If you disagree with me because you find my arguments uncompelling, then you're just someone with a different POV. That's fair enough. It's even useful -- even if neither of us can admit he's wrong, at least we can keep each other honest.

But if you start out assuming that whole groups of people are incapable of saying or doing anything worth your notice, and sneer at anybody who suggests otherwise, then you're a bigot.

Peter H.S. ( 38077 ) , Tuesday February 19, 2002 @07:57PM ( #3035283 ) Homepage
Re:The Old-Fashioned Way ( Score: 4 , Insightful)

The Soviet Union did not embrace information technology. It was a society that was essentially living in the 1930's. Heavy industry was the priority of the USSR, not semiconductors.

If you looked at the desks of Soviet desk jockeys in the late 80's, you'd find most offices to be non-computerized (like many western offices). The ones with computers had green screens, IBM or Apple clones. Engineers had Intergraph or Apollo stuff.

The USSR was indeed behind the West regarding advanced semiconductor technology, but your anecdotal evidence can be misleading, since the Soviet economy was sharply divided into a civilian part (which got almost nothing) and a military part which had first priority.
So even though the standard USSR office was pen-and-paper, the military complex had access to much more advanced technology.
IMHO, Soviet military equipment from WWII until the eighties was often on par with, if not better than, US equipment (especially missiles, tanks, infantry weapons, airplanes, though perhaps not avionics).
OTOH, civilian USSR equipment was always decades behind what could be found in the West.

The truth isn't bigoted or ignorant.
I believe that a famous USSR newspaper was called "Pravda", meaning "The Truth" ;-).

WetCat ( 558132 ) writes:
Re:Gotta say it....... ( Score: 1 )

In 1991, I was actually still using a BESM-6 computer, which was a completely original design (no IBM copying at all). It was a 48-bit machine. It was faster than an IBM PS/2 at 12 MHz...

andaru ( 535590 ) writes: < > on Monday February 18, 2002 @11:38PM ( #3029931 ) Homepage
Bug free code ( Score: 5 , Interesting)

I remember a book called Writing Bug Free Code (yes, you all scoff, but this is for real) written by a Russian computer scientist.

The basic premise was that he was using punch cards, and the actual computer on which he was compiling and testing his programs was in a relatively distant city.

He would punch up a set of cards and mail them to where the computer was, which would take a week or two. When they got around to it, they would compile his program and print out a test run using input he gave them. This would take another week. The week or two return trip made the average round trip take a month.

Now if you had to wait one month to find out that you had missed a semicolon, wouldn't you be more careful?

Usquebaugh ( 230216 ) writes:
Re:Bug free code ( Score: 2 , Funny)

Depends whether it was fixed price or time and materials :-)

boopus ( 100890 ) writes:
Re:Bug free code ( Score: 2 )

I think you've hit upon the reason many of us capitalists don't believe in communism...

fm6 ( 162816 ) writes:
Card Carrying Programmers ( Score: 2 )

Now if you had to wait one month to find out that you had missed a semicolon, wouldn't you be more careful?
Actually, that POV is not restricted to the former Proletarian Dictatorship. Most of my early programming was done by punching FORTRAN and PL/1 code onto punched cards. I used to stay up all night so I could submit my jobs when the turnaround was down to 15 minutes.

I had a FORTRAN textbook that said this was Very Bad, and not just because of lost sleep. It urged students to think through their code before trying it. Do hand simulation. Read it through with a friend. Later on I read books by people who insisted all software should be "provably correct."

Now I work with Delphi and Kylix, which thoroughly encourages the cut-and-try approach. Oh well.

marcus ( 1916 ) writes:
Brings back memories ( Score: 1 )

Including functional front panels, paper tape and thoughts like "Wow, that 1200bps cassette tape is fast!"

Used to do punch cards in PL/1 at school at least until I discovered the lab with vt-100s in it, and made friends with an operator who showed me how to make the machine punch the cards based on the source file that I had entered at the terminal. ;-) Hello David, are you still out there?

fm6 ( 162816 ) writes:
Re:Brings back memories ( Score: 2 )

Yeah, IBM really resisted interactive computing for a long time. Actually a good thing, since it helped give companies like DEC and DG their shot. One way to do without keypunches in IBM shops was to write a card-reader emulator!

Are we in nostalgia mode? Elsewhere on /., somebody is asking for help porting his RPG code to Linux. I seem to recall that RPG was little more than a software emulator for an IBM accounting machine, which used plugboard programming to process data on punched cards. Perhaps I misremember. Silly to invent a language for something like that!

Satai ( 111172 ) writes:
In... ( Score: 1 , Troll)

...the words of Yakkof Smirnof (or some spelling variation thereof,) in Communist Russia, Computer crash you!

Evil Attraction ( 150413 ) , Tuesday February 19, 2002 @12:57AM ( #3030274 )
Found lots of information ( Score: 3 , Informative)

I found some related (and maybe some not so related) information on this by using Google [] and searching for "soviet union computers technology". Here's a handful of links for ya: "Computing in the Former Soviet Union and Eastern Europe" []; "Where did Soviet scientists go?" []; "Creator of the first stored program computer in continental Europe" []. Not much, but you might find more for yourself by refining your search a little.

Evil Attraction

Detritus ( 11846 ) , Tuesday February 19, 2002 @01:02AM ( #3030291 ) Homepage
Ukraine ( Score: 4 , Informative)

See this [] for a Ukrainian perspective on Soviet computer history.

You also may want to do a google search on the comp.arch newsgroup. I think the topic has been discussed there.

The Soviets reverse engineered a number of American designs (IBM 360, PDP-11). They also did some original designs for special applications.

Some of the work was farmed out to other Warsaw Pact countries, such as the GDR.

Jeremiah Cornelius ( 137 ) writes:
Principally Copies of Successful US Designs. ( Score: 2 , Redundant)

The PDP-11 series were extensively copied in the USSR, as were the IBM 360 mainframes []

fooguy ( 237418 ) , Tuesday February 19, 2002 @01:57AM ( #3030457 ) Homepage
VAX - When you Care Enough to Steal the Very Best ( Score: 5 , Funny)

This quote is from page 15 of the OpenVMS at 20 publication that Digital Published in 1997. The PDF [] is available from Compaq.

During the cold war, VAX systems could not be sold behind the Iron Curtain. Recognizing superior technology, technical people cloned VAX systems in Russia, Hungary, and China. After learning that VAX systems were being cloned, DIGITAL had the following words etched on the CVAX chip, "VAX...when you care enough to steal the very best."

morcheeba ( 260908 ) , Tuesday February 19, 2002 @02:23PM ( #3032956 ) Journal
Re:VAX - When you Care Enough to Steal the Very Be ( Score: 4 , Informative)

Those words were in Cyrillic (of course)... see them on the chip here! []

dylan_- ( 1661 ) writes:
Re:VAX - When you Care Enough to Steal the Very Be ( Score: 2 )

...and then explore the rest of this incredibly cool site.

Thanks for the link, morcheeba.

morcheeba ( 260908 ) writes:
Re:VAX - When you Care Enough to Steal the Very Be ( Score: 1 )

Oh yeah, it's a great site... maybe I should have mentioned it :) I've been lucky to work at two places with good optical equipment ... mainly for PCB inspection/rework, so not quite the magnification at that site. When I mysteriously blew up a FET in a hybrid package, I got to remove the top (a welded-on metal top; none of the dice were potted inside) and see if it was over voltage or over current that killed the part. At another facility, we had an X-ray machine and a scanning electron microscope, neither of which I got to use :(

$pacemold ( 248347 ) writes:
Re:VAX - When you Care Enough to Steal the Very Be ( Score: 1 )

Those words were in Cyrillic (of course)...

...But not in Russian. At least not in understandable Russian.
ader ( 1402 ) , Tuesday February 19, 2002 @06:39AM ( #3030967 ) Homepage
Please to announce computorial system is new ( Score: 2 , Funny)

[Adopt cod Russian accent:]

Glorious new Soviet People's Dual Potato 3000! With advanced UVR (Ultra Root Vegetable(tm)) technology and many obedient clock cycles working for common good. Running Mikkelzoft Window KGB. Own the means of production and experience many kilohertz of glorious revolution in the People's progress today, comrade!


NB. Before you complain, I must point out that as a Linux user myself, I am of course a fervent communist.

Jonmann ( 559611 ) writes:
Ternary computers ( Score: 2 , Funny)

I believe that the coolest invention the Russians ever made (concerning computers) was the ternary computer. More appropriately, the balanced ternary computer.
It was a bit like our binary computers, but it had real potential, with the trits taking the values up, down and neutral. The computer was called SETUN, although it was experimental and was never truly carried forward after the 60's.
If anyone has a link concerning SETUN, I'd be interested; so far my only source has been the meager note in 'An Introduction to Cryptography', Mollin.
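Balanced ternary, as used in Setun, writes numbers with trits valued -1, 0 and +1 instead of binary digits 0 and 1. As a rough illustration (plain Python, no claim about Setun's actual word format), conversion to and from balanced ternary can be sketched like this:

```python
# Sketch of balanced ternary (trits -1, 0, +1), the number system of Setun.
# Illustrative only; function names are made up for this example.

def to_balanced_ternary(n):
    """Return the trits of integer n, most significant first."""
    if n == 0:
        return [0]
    trits = []
    while n != 0:
        r = n % 3
        n //= 3
        if r == 2:        # digit 2 is rewritten as -1 with a carry into the next trit
            r = -1
            n += 1
        trits.append(r)
    return trits[::-1]

def from_balanced_ternary(trits):
    """Evaluate a most-significant-first trit list back to an integer."""
    value = 0
    for t in trits:
        value = value * 3 + t
    return value
```

A pleasant property, which the Setun designers exploited, is that negation is just flipping the sign of every trit, so no separate sign bit or two's-complement convention is needed.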

NaturePhotog ( 317732 ) writes:
Re:Ternary computers ( Score: 3 , Interesting)

A search on Google [] gives a number of interesting links, including: a photo [] at the European Museum on CS and Technology; an article [] (including bibliography) at the Virtual Computer Museum; a discussion of ternary computing [] at American Scientist. One of those indicated it was circa 1958.

Anonymous Coward , Tuesday February 19, 2002 @07:50AM ( #3031081 )
Elbrus Supercomputers ( Score: 5 , Interesting)

There is an article [] on X-bit labs about Soviet supercomputers Elbrus-1 , Elbrus-2 and Elbrus-3 , and their successor, Elbrus-2000 :

The history of world computer science is connected with the name Elbrus. This company was founded in the Lebedev Institute of Precision Mechanics and Computing Equipment, whose team had been developing supercomputers for the Soviet Union's defense establishments for over 40 years. The E2K processor embodies ideas developed in the Russian supercomputer Elbrus-3, built in 1991. Today the Elbrus-3 architecture is referred to as EPIC (Explicitly Parallel Instruction Computing).

According to Boris A. Babaian, chief architect of the Elbrus supercomputers, superscalar architecture was invented in Russia. To quote him: "In 1978 we developed the world's first superscalar computer, Elbrus-1. At present all Western superscalar processors have just the same architecture. The first Western superscalar processor appeared in 1992, while ours appeared in 1978. Moreover, our variant of superscalar is analogous to the Pentium Pro introduced by Intel in 1995".

The historical priority of Elbrus is confirmed in the States as well. According to the same article in Microprocessor Report by Keith Diefendorff, the developer of the Motorola 88110, one of the first Western superscalar processors: "In 1978, almost 15 years ahead of Western superscalar processors, Elbrus implemented a two-issue out-of-order processor with register renaming and speculative execution".

ameoba ( 173803 ) writes:
Maybe somebody can fill this in... ( Score: 2 )

I seem to remember that the only computer system ever built on trinary (base-3) logic was produced in the Soviet Union. The name escapes me, but I think something like that is enough to dispel the idea of them not doing any original research (good research, OTOH...).

sql*kitten ( 1359 ) writes:
Re:Maybe somebody can fill this in... ( Score: 2 )

I seem to remember that the only computer system ever built on trinary (base-3) logic was produced in the Soviet Union.

See this earlier thread [].

fm6 ( 162816 ) writes:
A bit on bytes ( Score: 2 )

I just noticed that kwertii lists 9-bit bytes as a "radically different concept", an example of what Soviet computer architects might have considered. Worth mentioning that the 8-bit byte was not always something you could take for granted. I can't think of any production machines, but I seem to recall that Knuth's specification of his famous MIX [] machine (an imaginary computer he invented for teaching purposes) doesn't require that bytes be implemented as 8-bit values. In fact, a programmer is not even supposed to assume that a byte is a string of bits!

Before IBM introduced the byte concept back in the 60s, all computers used "word-level" addressing. That meant that data path width and the addressable unit of data had to be the same thing. Made it hard to write portable software. By divorcing the addressing scheme from the data path width, IBM was able to design computers where differences in word size were a matter of efficiency, not compatibility.

There was nothing to force manufacturers to use 8-bit bytes. (Unless, of course, they were trying to rip off IBM's instruction set. A few did, but competing head-to-head with Big Blue that way usually didn't work out.) On the one hand, the standard data terminal of the time used a 7-bit character set. On the other hand, you can make a case for a 12-bit byte []. But IBM used an 8-bit byte, and in those days, what IBM did tended to become a standard.
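The point that byte width is a design choice rather than a law of nature can be illustrated by carving the same machine word into differently sized fields. A minimal sketch in Python (the 36-bit word and the 9-bit and 12-bit splits are illustrative, not any specific machine's layout):

```python
# Sketch: slicing a 36-bit word into equal-width "bytes".
# 36 divides evenly by 9 (GCOS-style) and by 12, but not by 8 --
# one reason 36-bit machines never used 8-bit bytes.

WORD_BITS = 36

def split_word(word, byte_bits):
    """Return the byte_bits-wide fields of a WORD_BITS word, MSB first."""
    mask = (1 << byte_bits) - 1
    count = WORD_BITS // byte_bits
    return [(word >> (byte_bits * i)) & mask for i in range(count - 1, -1, -1)]
```

For example, `split_word(w, 9)` yields four 9-bit bytes and `split_word(w, 12)` yields three 12-bit bytes from the same word, which is exactly the "divorce" between word size and addressable unit described above.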

Knight of the Sad Co ( 22131 ) writes:
Re:A bit on bytes ( Score: 1 )

Bull-Honeywell's GCOS machines still use 9-bit bytes. C was designed to run on these machines (Kernighan's Programming in C [] begins ``C is a computer language available on the GCOS and UNIX operating systems...''). The size of various types is intentionally left flexible to allow for these machines.

A 36-bit word on a machine with limited address space allows pointers to individual bits.

Those who do not know their own history are doomed to assume that it was lived only by `backward' peoples?

fm6 ( 162816 ) writes:
Re:A bit on bytes ( Score: 2 )

And /etc/passwd still has a GECOS Field []. Thanks for the example.

fm6 ( 162816 ) writes:
Re:A bit on bytes ( Score: 2 )

I don't know a lot about these boxes, but information on the web seems to indicate that "character oriented" means a very small word size. That would make sense -- the 1401 was a popular business machine with only 16K of RAM. I presume it had a correspondingly small word size -- like 8 bits?

But an 8-bit word and an 8-bit byte are not the same thing. With an 8-bit word you can easily manipulate individual characters, but your ability to do numerical work is extremely limited. If you need to do scientific computing, you have to go find a system with a bigger word size -- and lose the ability to deal with character-size data easily.

Byte architecture eliminates this problem by divorcing data path width ("word size") from the addressable data unit ("byte size").

zap42hod ( 303419 ) writes:
western tech. ( Score: 1 )

I've heard we used to read the architecture of western silicon chips slice by slice.
Also there were many IBM and other boxes bought in. Many of these were copied, since there wasn't enough money to buy them for all the needs.

zap42hod ( 303419 ) writes:
Re:western tech. ( Score: 1 )

And of course I'm not saying we didn't do any original research. The engineers were really good, probably because education had really high standards. That's changed unfortunately, at least here in Estonia with the adoption of international degrees.

o2b-rich ( 455759 ) writes:
Other former Warsaw Pact nations ( Score: 1 )

Not sure if anyone can expand on this, but I thought that Bulgaria was the East European Silicon Valley? As mentioned already, the GDR also made some kit. I've read some material describing Russians buying fairly advanced homegrown systems from Bulgaria; it's no secret that they have a few virus authors there... so they certainly have some latent expertise. It's long been suspected that Russian coding techniques were superior to those in the West, motivated by the presence of less-powerful CPUs. Or was this a myth too?

spaceyhackerlady ( 462530 ) writes:
From a reliable source ( Score: 1 )

A colleague of mine is of Slovak descent, and tells me one of the wildest dodges in the Bad Old Days was CPUs with weird numbers of bits, like 28 bit words. It seems that it was illegal to export 32 bit CPUs to the Eastern Bloc. But anything smaller was OK.

In Wireless World in the late 1980s there was a very good series of articles on Eastern Bloc computing, including all the PDP-11 and S/360 clones that have been mentioned. Sorry, I don't have the exact citation. Check your library.


Webmoth ( 75878 ) writes:
How it is today ( Score: 2 )

Well, I don't know anything about the history of Russian/Soviet computing. However, I was over there last summer, and found a computer store which had state-of-the-art peripherals for sale, right alongside a bootleg copy of Windows 2000. In a bookstore, I found (and bought) a Russian translation of Olaf Kirch's Linux Network Administrator's Guide (aka, The NAG []). The text was Russian but the examples were all in the default language of Linux, English.

The products in the computer store were selling for about the same as in America given the exchange rate at the time (except for the Win2K which was ~USD13). When you consider that the average Russian salary is USD2000-3000/yr, you aren't going to find many Russians online, at least not at home. Businesses seem to be fairly up-to-date as far as technology goes, aside from the mom-and-pop shops. Broadband internet access seems to be more myth than reality there.

Some of the posts here said that the Soviets were a couple of generations behind because they were just copying American technology. It appears they're catching up.

Animats ( 122034 ) writes:
Robotron ( Score: 2 )

Check out the Robotron [] site, created in memory of the East German line of computers. Pictures, manuals, and screenshots. (A PacMan clone!) Z80 clones, 8086 clones, CP/M clones, etc.

raju1kabir ( 251972 ) writes:
Norsk Data in Latvia ( Score: 1 )

Not that helpful, but...

Just after the Baltics broke away, I was visiting the University of Latvia. I asked to see the computer facilities and was led to a room full of Norsk Data text-based terminals with cyrillic keyboards. The displays were able to show both cyrillic and roman characters. I do not, sadly, remember any specifics of the computer they were connected to other than that it had a lot of wires hanging everywhere.

d2ksla ( 89385 ) writes:
Re:Norsk Data in Latvia - 28 bit computers! ( Score: 1 )

Norsk Data ("Norwegian Computers") designed fairly advanced 32-bit systems in the middle of the 80's. I remember using them at my local university in Sweden. (Obviously the VAX 11/785 we also had was more exciting, since it could run Hack under VMS Eunice.)

Back then there was an export embargo on advanced computers to the Soviet union, which basically meant that 32-bit computers couldn't be sold there. So they cut off 4 bits and voila! had an exportable 28-bit computer (ND-505).

Maybe technically not a soviet machine, but still...

morn ( 136835 ) writes:
ICL ( Score: 1 )

I seem to remember hearing something about the ICL almost managing to become the computer supplier to the Soviet government, but this being blocked in the final stages by the British government. I can't find anything to support this anywhere, however - does anyone out there remember more of this than me?

Alex Belits ( 437 ) , Wednesday February 20, 2002 @06:28PM ( #3040511 ) Homepage
The lines were: ( Score: 3 , Informative)

    "BESM"/"Elbrus" line -- originally developed. "ES" Line -- clone of IBM 360 line "Elektronika"/"SM" line -- clone of PDP-11 line, often with some creative changes (high-density floppies, graphics controlers on a second PDP-11 CPU), then some VAXen "DWK"/"UKNC" line -- same as "SM", but made as a desktop. "DWK" models 3 and 4 were built as a single unit with terminal (keyboard was separate), "UKNC" was a very nice flat box with builtin keyboard and extension connectors at the top, connected to a separate monitor. "BK-0010" -- can be described as a PDP-11 squeezed into Sinclair's case, everything was in the keyboard, with TV output, tape recorder connector, and on some models a serial port. "Elektronika-85" -- Dec Pro/350 clone. Was hated just as much as its prototype. "ES-1840","Iskra-1030" lines -- IBM PC clones, usually with some changes. Appeared in early 90's and soon were replaced by conventional PC clones. "Radio-86RK","Specialist" -- hobbyist 8080-based boxes, never were mass-produced but popular among various computer enthusiasts. "Sinclair" clones

There were some others, however I have mentioned the most popular ones.

leob ( 154345 ) writes:
BESM-6 description in English ( Score: 1 )

Is at BESM-6 Nostalgia page [].

hummer357 ( 545850 ) writes:
the latest... ( Score: 1 )

What about this:

[Sep 21, 2019] Dr. Dobb's Journal February 1998 A Conversation with Larry Wall

Perl is a unique, complex, non-orthogonal language, and due to this it has a unique level of expressiveness.
Also, the complexity of Perl to a large extent reflects the complexity of the Perl environment (which was the Unix environment at the beginning, but now also includes the Windows environment with its quirks).
Notable quotes:
"... On a syntactic level, in the particular case of Perl, I placed variable names in a separate namespace from reserved words. That's one of the reasons there are funny characters on the front of variable names -- dollar signs and so forth. That allowed me to add new reserved words without breaking old programs. ..."
"... A script is something that is easy to tweak, and a program is something that is locked in. There are all sorts of metaphorical tie-ins that tend to make programs static and scripts dynamic, but of course, it's a continuum. You can write Perl programs, and you can write C scripts. People do talk more about Perl programs than C scripts. Maybe that just means Perl is more versatile. ..."
"... A good language actually gives you a range, a wide dynamic range, of your level of discipline. We're starting to move in that direction with Perl. The initial Perl was lackadaisical about requiring things to be defined or declared or what have you. Perl 5 has some declarations that you can use if you want to increase your level of discipline. But it's optional. So you can say "use strict," or you can turn on warnings, or you can do various sorts of declarations. ..."
"... But Perl was an experiment in trying to come up with not a large language -- not as large as English -- but a medium-sized language, and to try to see if, by adding certain kinds of complexity from natural language, the expressiveness of the language grew faster than the pain of using it. And, by and large, I think that experiment has been successful. ..."
"... If you used the regular expression in a list context, it will pass back a list of the various subexpressions that it matched. A different computer language may add regular expressions, even have a module that's called Perl 5 regular expressions, but it won't be integrated into the language. You'll have to jump through an extra hoop, take that right angle turn, in order to say, "Okay, well here, now apply the regular expression, now let's pull the things out of the regular expression," rather than being able to use the thing in a particular context and have it do something meaningful. ..."
"... A language is not a set of syntax rules. It is not just a set of semantics. It's the entire culture surrounding the language itself. So part of the cultural context in which you analyze a language includes all the personalities and people involved -- how everybody sees the language, how they propagate the language to other people, how it gets taught, the attitudes of people who are helping each other learn the language -- all of this goes into the pot of context. ..."
"... In the beginning, I just tried to help everybody. Particularly being on USENET. You know, there are even some sneaky things in there -- like looking for people's Perl questions in many different newsgroups. For a long time, I resisted creating a newsgroup for Perl, specifically because I did not want it to be ghettoized. You know, if someone can say, "Oh, this is a discussion about Perl, take it over to the Perl newsgroup," then they shut off the discussion in the shell newsgroup. If there are only the shell newsgroups, and someone says, "Oh, by the way, in Perl, you can solve it like this," that's free advertising. So, it's fuzzy. We had proposed Perl as a newsgroup probably a year or two before we actually created it. It eventually came to the point where the time was right for it, and we did that. ..."
"... For most web applications, Perl is severely underutilized. Your typical CGI script says print, print, print, print, print, print, print. But in a sense, it's the dynamic range of Perl that allows for that. You don't have to say a whole lot to write a simple Perl script, whereas your minimal Java program is, you know, eight or ten lines long anyway. Many of the features that made it competitive in the UNIX space will make it competitive in other spaces. ..."
"... Over the years, much of the work of making Perl work for people has been in designing ways for people to come to Perl. I actually delayed the first version of Perl for a couple of months until I had a sed-to-Perl and an awk-to-Perl translator. One of the benefits of borrowing features from various other languages is that those subsets of Perl that use those features are familiar to people coming from that other culture. What would be best, in my book, is if someone had a way of saying, "Well, I've got this thing in Visual Basic. Now, can I just rewrite some of these things in Perl?" ..."
Feb 28, 1998 |

... ... ...

The creator of Perl talks about language design and Perl. By Eugene Eric Kim

DDJ : Is Perl 5.005 what you envisioned Perl to be when you set out to do it?

LW: That assumes that I'm smart enough to envision something as complicated as Perl. I knew that Perl would be good at some things, and would be good at more things as time went on. So, in a sense, I'm sort of blessed with natural stupidity -- as opposed to artificial intelligence -- in the sense that I know what my intellectual limits are.

I'm not one of these people who can sit down and design an entire system from scratch and figure out how everything relates to everything else, so I knew from the start that I had to take the bear-of-very-little-brain approach, and design the thing to evolve. But that fit in with my background in linguistics, because natural languages evolve over time.

You can apply biological metaphors to languages. They move into niches, and as new needs arise, languages change over time. It's actually a practical way to design a computer language. Not all computer programs can be designed that way, but I think more can be designed that way than have been. A lot of the majestic failures that have occurred in computer science have been because people thought they could design the whole thing in advance.

DDJ : How do you design a language to evolve?

LW: There are several aspects to that, depending on whether you are talking about syntax or semantics. On a syntactic level, in the particular case of Perl, I placed variable names in a separate namespace from reserved words. That's one of the reasons there are funny characters on the front of variable names -- dollar signs and so forth. That allowed me to add new reserved words without breaking old programs.

DDJ : What is a scripting language? Does Perl fall into the category of a scripting language?

LW: Well, being a linguist, I tend to go back to the etymological meanings of "script" and "program," though, of course, that's fallacious in terms of what they mean nowadays. A script is what you hand to the actors, and a program is what you hand to the audience. Now hopefully, the program is already locked in by the time you hand that out, whereas the script is something you can tinker with. I think of phrases like "following the script," or "breaking from the script." The notion that you can evolve your script ties into the notion of rapid prototyping.

A script is something that is easy to tweak, and a program is something that is locked in. There are all sorts of metaphorical tie-ins that tend to make programs static and scripts dynamic, but of course, it's a continuum. You can write Perl programs, and you can write C scripts. People do talk more about Perl programs than C scripts. Maybe that just means Perl is more versatile.

... ... ...

DDJ : Would that be a better distinction than interpreted versus compiled -- run-time versus compile-time binding?

LW: It's a more useful distinction in many ways because, with late-binding languages like Perl or Java, you cannot make up your mind about what the real meaning of it is until the last moment. But there are different definitions of what the last moment is. Computer scientists would say there are really different "latenesses" of binding.

A good language actually gives you a range, a wide dynamic range, of your level of discipline. We're starting to move in that direction with Perl. The initial Perl was lackadaisical about requiring things to be defined or declared or what have you. Perl 5 has some declarations that you can use if you want to increase your level of discipline. But it's optional. So you can say "use strict," or you can turn on warnings, or you can do various sorts of declarations.

DDJ : Would it be accurate to say that Perl doesn't enforce good design?

LW: No, it does not. It tries to give you some tools to help if you want to do that, but I'm a firm believer that a language -- whether it's a natural language or a computer language -- ought to be an amoral artistic medium.

You can write pretty poems or you can write ugly poems, but that doesn't say whether English is pretty or ugly. So, while I kind of like to see beautiful computer programs, I don't think the chief virtue of a language is beauty. That's like asking an artist whether they use beautiful paints and a beautiful canvas and a beautiful palette. A language should be a medium of expression, which does not restrict your feeling unless you ask it to.

DDJ : Where does the beauty of a program lie? In the underlying algorithms, in the syntax of the description?

LW: Well, there are many different definitions of artistic beauty. It can be argued that it's symmetry, which in a computer language might be considered orthogonality. It's also been argued that broken symmetry is what is considered most beautiful and most artistic and diverse. Symmetry breaking is the root of our whole universe according to physicists, so if God is an artist, then maybe that's his definition of what beauty is.

This actually ties back in with the built-to-evolve concept on the semantic level. A lot of computer languages were defined to be naturally orthogonal, or at least the computer scientists who designed them were giving lip service to orthogonality. And that's all very well if you're trying to define a position in a space. But that's not how people think. It's not how natural languages work. Natural languages are not orthogonal, they're diagonal. They give you hypotenuses.

Suppose you're flying from California to Quebec. You don't fly due east, and take a left turn over Nashville, and then go due north. You fly straight, more or less, from here to there. And it's a network. And it's actually sort of a fractal network, where your big link is straight, and you have little "fractally" things at the end for your taxi and bicycle and whatever the mode of transport you use. Languages work the same way. And they're designed to get you most of the way here, and then have ways of refining the additional shades of meaning.

When they first built the University of California at Irvine campus, they just put the buildings in. They did not put any sidewalks, they just planted grass. The next year, they came back and built the sidewalks where the trails were in the grass. Perl is that kind of a language. It is not designed from first principles. Perl is those sidewalks in the grass. Those trails that were there before were the previous computer languages that Perl has borrowed ideas from. And Perl has unashamedly borrowed ideas from many, many different languages. Those paths can go diagonally. We want shortcuts. Sometimes we want to be able to do the orthogonal thing, so Perl generally allows the orthogonal approach also. But it also allows a certain number of shortcuts, and being able to insert those shortcuts is part of that evolutionary thing.

I don't want to claim that this is the only way to design a computer language, or that everyone is going to actually enjoy a computer language that is designed in this way. Obviously, some people speak other languages. But Perl was an experiment in trying to come up with not a large language -- not as large as English -- but a medium-sized language, and to try to see if, by adding certain kinds of complexity from natural language, the expressiveness of the language grew faster than the pain of using it. And, by and large, I think that experiment has been successful.

DDJ : Give an example of one of the things you think is expressive about Perl that you wouldn't find in other languages.

LW: The fact that regular-expression parsing and the use of regular expressions is built right into the language. If you used the regular expression in a list context, it will pass back a list of the various subexpressions that it matched. A different computer language may add regular expressions, even have a module that's called Perl 5 regular expressions, but it won't be integrated into the language. You'll have to jump through an extra hoop, take that right angle turn, in order to say, "Okay, well here, now apply the regular expression, now let's pull the things out of the regular expression," rather than being able to use the thing in a particular context and have it do something meaningful.
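A short sketch of the behavior Wall describes, using an invented date string: in list context a match hands back its captured subexpressions directly, with no extra hoop to jump through.

```perl
use strict;
use warnings;

my $date = "2019-09-21";

# List context: the match returns the captured groups.
my ($year, $month, $day) = $date =~ /^(\d{4})-(\d{2})-(\d{2})$/;
print "$year $month $day\n";    # prints "2019 09 21"

# Scalar context: the same match merely reports success or failure.
my $matched = $date =~ /^\d{4}/;
```

The same expression does something meaningful in either context, which is the integration Wall contrasts with bolt-on regex libraries.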

The school of linguistics I happened to come up through is called tagmemics, and it makes a big deal about context. In a real language -- this is a tagmemic idea -- you can distinguish between what the conventional meaning of the "thing" is and how it's being used. You think of "dog" primarily as a noun, but you can use it as a verb. That's the prototypical example, but the "thing" applies at many different levels. You think of a sentence as a sentence. Transformational grammar was built on the notion of analyzing a sentence. And they had all their cute rules, and they eventually ended up throwing most of them back out again.

But in the tagmemic view, you can take a sentence as a unit and use it differently. You can say a sentence like, "I don't like your I-can-use-anything-like-a-sentence attitude." There, I've used the sentence as an adjective. The sentence isn't an adjective if you analyze it, any way you want to analyze it. But this is the way people think. If there's a way to make sense of something in a particular context, they'll do so. And Perl is just trying to make those things make sense. There's the basic distinction in Perl between singular and plural context -- call it list context and scalar context, if you will. But you can use a particular construct in a singular context that has one meaning that sort of makes sense using the list context, and it may have a different meaning that makes sense in the plural context.
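The singular/plural distinction can be sketched in a few lines (the array contents are illustrative): the same construct, an array, means different things depending on the context it is used in.

```perl
use strict;
use warnings;

my @words = ("piercing", "sweetness");

# Plural (list) context: the array yields its elements.
my @copy = @words;

# Singular (scalar) context: the same array yields its length.
my $n = @words;    # $n is 2

print "have $n words\n";
```

This is the "dog as noun, dog as verb" idea applied to data: one thing, two sensible readings, disambiguated by context.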

That is where the expressiveness comes from. In English, you read essays by people who say, "Well, how does this metaphor thing work?" Owen Barfield talks about this. You say one thing and mean another. That's how metaphors arise. Or you take two things and jam them together. I think it was Owen Barfield, or maybe it was C.S. Lewis, who talked about "a piercing sweetness." And we know what "piercing" is, and we know what "sweetness" is, but you put those two together, and you've created a new meaning. And that's how languages ought to work.

DDJ : Is a more expressive language more difficult to learn?

LW: Yes. It was a conscious tradeoff at the beginning of Perl that it would be more difficult to master the whole language. However, taking another cue from natural language, we do not require 5-year-olds to speak with the same diction as 50-year-olds. It is okay for you to use the subset of a language that you are comfortable with, and to learn as you go. This is not true of so many computer-science languages. If you program C++ in a subset that corresponds to C, you get laughed out of the office.

There's a whole subject that we haven't touched here. A language is not a set of syntax rules. It is not just a set of semantics. It's the entire culture surrounding the language itself. So part of the cultural context in which you analyze a language includes all the personalities and people involved -- how everybody sees the language, how they propagate the language to other people, how it gets taught, the attitudes of people who are helping each other learn the language -- all of this goes into the pot of context.

Because I had already put out other freeware projects (rn and patch), I realized before I ever wrote Perl that a great deal of the value of those things was from collaboration. Many of the really good ideas in rn and Perl came from other people.

I think that Perl is in its adolescence right now. There are places where it is grown up, and places where it's still throwing tantrums. I have a couple of teenagers, and the thing you notice about teenagers is that they're always plus or minus ten years from their real age. So if you've got a 15-year old, they're either acting 25 or they're acting 5. Sometimes simultaneously! And Perl is a little that way, but that's okay.

DDJ : What part of Perl isn't quite grown up?

LW: Well, I think the part of Perl that has not been realistic up until now has been on the order of how you enable people in certain business situations to actually use it properly. There are a lot of people who cannot use freeware because it is, you know, schlocky. Their bosses won't let them, their government won't let them, or they think their government won't let them. There are a lot of people who, unknown to their bosses or their government, are using Perl.

DDJ : So these aren't technical issues.

LW: I suppose it depends on how you define technology. Some of it is perceptions, some of it is business models, and things like that. I'm trying to generate a new symbiosis between the commercial and the freeware interests. I think there's an artificial dividing line between those groups and that they could be more collaborative.

As a linguist, the generation of a linguistic culture is a technical issue. So, these adjustments we might make in people's attitudes toward commercial operations or in how Perl is being supported, distributed, advertised, and marketed -- not in terms of trying to make bucks, but just how we propagate the culture -- these are technical ideas in the psychological and the linguistic sense. They are, of course, not technical in the computer-science sense. But I think that's where Perl has really excelled -- its growth has not been driven solely by technical merits.

DDJ : What are the things that you do when you set out to create a culture around the software that you write?

LW: In the beginning, I just tried to help everybody. Particularly being on USENET. You know, there are even some sneaky things in there -- like looking for people's Perl questions in many different newsgroups. For a long time, I resisted creating a newsgroup for Perl, specifically because I did not want it to be ghettoized. You know, if someone can say, "Oh, this is a discussion about Perl, take it over to the Perl newsgroup," then they shut off the discussion in the shell newsgroup. If there are only the shell newsgroups, and someone says, "Oh, by the way, in Perl, you can solve it like this," that's free advertising. So, it's fuzzy. We had proposed Perl as a newsgroup probably a year or two before we actually created it. It eventually came to the point where the time was right for it, and we did that.

DDJ : Perl has really been pigeonholed as a language of the Web. One result is that people mistakenly try to compare Perl to Java. Why do you think people make the comparison in the first place? Is there anything to compare?

LW: Well, people always compare everything.

DDJ : Do you agree that Perl has been pigeonholed?

LW: Yes, but I'm not sure that it bothers me. Before it was pigeonholed as a web language, it was pigeonholed as a system-administration language, and I think that -- this goes counter to what I was saying earlier about marketing Perl -- if the abilities are there to do a particular job, there will be somebody there to apply it, generally speaking. So I'm not too worried about Perl moving into new ecological niches, as long as it has the capability of surviving in there.

Perl is actually a scrappy language for surviving in a particular ecological niche. (Can you tell I like biological metaphors?) You've got to understand that it first went up against C and against shell, both of which were much loved in the UNIX community, and it succeeded against them. So that early competition actually makes it quite a fit competitor in many other realms, too.

For most web applications, Perl is severely underutilized. Your typical CGI script says print, print, print, print, print, print, print. But in a sense, it's the dynamic range of Perl that allows for that. You don't have to say a whole lot to write a simple Perl script, whereas your minimal Java program is, you know, eight or ten lines long anyway. Many of the features that made it competitive in the UNIX space will make it competitive in other spaces.
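A caricature of the "print, print, print" CGI script Wall describes might look like the following (the page content is invented; the output is accumulated in a string only so it can be emitted in one place):

```perl
use strict;
use warnings;

# The whole "application" is a handful of prints -- Perl's low
# ceremony is what makes such tiny scripts practical.
my $page = "";
$page .= "Content-type: text/html\n\n";
$page .= "<html><body>\n";
$page .= "<h1>Hello from a five-line CGI script</h1>\n";
$page .= "</body></html>\n";
print $page;
```

The equivalent minimal Java program of that era needed a class declaration, a `main` signature, and a compile step before it printed anything at all.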

Now, there are things that Perl can't do. One of the things that you can't do with Perl right now is compile it down to Java bytecode. And if that, in the long run, becomes a large ecological niche (and this is not yet a sure thing), then that is a capability I want to be certain that Perl has.

DDJ : There's been a movement to merge the two development paths between the ActiveWare Perl for Windows and the main distribution of Perl. You were talking about ecological niches earlier, and how Perl started off as a text-processing language. The scripting languages that are dominant on the Microsoft platforms -- like VB -- tend to be more visual than textual. Given Perl's UNIX origins -- awk, sed, and C, for that matter -- do you think that Perl, as it currently stands, has the tools to fit into a Windows niche?

LW: Yes and no. It depends on your problem domain and who's trying to solve the problem. There are problems that only need a textual solution or don't need a visual solution. Automation things of certain sorts don't need to interact with the desktop, so for those sorts of things -- and for the programmers who aren't really all that interested in visual programming -- it's already good for that. And people are already using it for that. Certainly, there is a group of people who would be enabled to use Perl if it had more of a visual interface, and one of the things we're talking about doing for the O'Reilly NT Perl Resource Kit is some sort of a visual interface.

A lot of what Windows is designed to do is to get mere mortals from 0 to 60, and there are some people who want to get from 60 to 100. We are not really interested in being in Microsoft's crosshairs. We're not actually interested in competing head-to-head with Visual Basic, and to the extent that we do compete with them, it's going to be kind of subtle. There has to be some way to get people from the slow lane to the fast lane. It's one thing to give them a way to get from 60 to 100, but if they have to spin out to get from the slow lane to the fast lane, then that's not going to work either.

Over the years, much of the work of making Perl work for people has been in designing ways for people to come to Perl. I actually delayed the first version of Perl for a couple of months until I had a sed-to-Perl and an awk-to-Perl translator. One of the benefits of borrowing features from various other languages is that those subsets of Perl that use those features are familiar to people coming from that other culture. What would be best, in my book, is if someone had a way of saying, "Well, I've got this thing in Visual Basic. Now, can I just rewrite some of these things in Perl?"

We're already doing this with Java. On our UNIX Perl Resource Kit, I've got a hybrid language called "jpl" -- that's partly a pun on my old alma mater, Jet Propulsion Laboratory, and partly for Java, Perl...Lingo, there we go! That's good. "Java Perl Lingo." You've heard it first here! jpl lets you take a Java program and magically turn one of the methods into a chunk of Perl right there inline. It turns Perl code into a native method, and automates the linkage so that when you pull in the Java code, it also pulls in the Perl code, and the interpreter, and everything else. It's actually calling out from Java's Virtual Machine into Perl's virtual machine. And we can call in the other direction, too. You can embed Java in Perl, except that there's a bug in JDK having to do with threads that prevents us from doing any I/O. But that's Java's problem.

It's a way of letting somebody evolve from a purely Java solution into, at least partly, a Perl solution. It's important not only to make Perl evolve, but to make it so that people can evolve their own programs. It's how I program, and I think a lot of people program that way. Most of us are too stupid to know what we want at the beginning.

DDJ : Is there hope down the line to present Perl to a standardization body?

LW: Well, I have said in jest that people will be free to standardize Perl when I'm dead. There may come a time when that is the right thing to do, but it doesn't seem appropriate yet.

DDJ : When would that time be?

LW: Oh, maybe when the federal government declares that we can't export Perl unless it's standardized or something.

DDJ : Only when you're forced to, basically.

LW: Yeah. To me, once things get to a standards body, it's not very interesting anymore. The most efficient form of government is a benevolent dictatorship. I remember walking into some BOF that USENIX held six or seven years ago, and John Quarterman was running it, and he saw me sneak in, sit in the back corner, and he said, "Oh, here comes Larry Wall! He's a standards committee all of his own!"

A great deal of the success of Perl so far has been based on some of my own idiosyncrasies. And I recognize that they are idiosyncrasies, and I try to let people argue me out of them whenever appropriate. But there are still ways of looking at things that I seem to do differently than anybody else. It may well be that perl5-porters will one day degenerate into a standards committee. So far, I have not abused my authority to the point that people have written me off, and so I am still allowed to exercise a certain amount of absolute power over the Perl core.

I just think headless standards committees tend to reduce everything to mush. There is a conservatism that committees have that individuals don't, and there are times when you want to have that conservatism and times you don't. I try to exercise my authority where we don't want that conservatism. And I try not to exercise it at other times.

DDJ : How did you get involved in computer science? You're a linguist by background?

LW: Because I talk to computer scientists more than I talk to linguists, I wear the linguistics mantle more than I wear the computer-science mantle, but they actually came along in parallel, and I'm probably a 50/50 hybrid. You know, basically, I'm no good at either linguistics or computer science.

DDJ : So you took computer-science courses in college?

LW: In college, yeah. In college, I had various majors, but what I eventually graduated in -- I'm one of those people that packed four years into eight -- what I eventually graduated in was a self-constructed major, and it was Natural and Artificial Languages, which seems positively prescient considering where I ended up.

DDJ : When did you join O'Reilly as a salaried employee? And how did that come about?

LW: A year-and-a-half ago. It was partly because my previous job was kind of winding down.

DDJ : What was your previous job?

LW: I was working for Seagate Software. They were shutting down that branch of operations there. So, I was just starting to look around a little bit, and Tim noticed me looking around and said, "Well, you know, I've wanted to hire you for a long time," so we talked. And Gina Blaber (O'Reilly's software director) and I met. So, they more or less offered to pay me to mess around with Perl.

So it's sort of my dream job. I get to work from home, and if I feel like taking a nap in the afternoon, I can take a nap in the afternoon and work all night.

DDJ : Do you have any final comments, or tips for aspiring programmers? Or aspiring Perl programmers?

LW: Assume that your first idea is wrong, and try to think through the various options. I think that the biggest mistake people make is latching onto the first idea that comes to them and trying to do that. It really comes to a thing that my folks taught me about money. Don't buy something unless you've wanted it three times. Similarly, don't throw in a feature when you first think of it. Think if there's a way to generalize it, think if it should be generalized. Sometimes you can generalize things too much. I think like the things in Scheme were generalized too much. There is a level of abstraction beyond which people don't want to go. Take a good look at what you want to do, and try to come up with the long-term lazy way, not the short-term lazy way.

[Sep 21, 2019] The list of programming languages by dates




[Sep 07, 2019] Knuth: Early on in the TeX project I also had to do programming of a completely different type on a Zilog CPU which was at the heart of the laser printer that I used


Knuth: Yeah. That's absolutely true. I've got to get another thought out of my mind though. That is, early on in the TeX project I also had to do programming of a completely different type. I told you last week that this was my first real exercise in structured programming, which was one of Dijkstra's huge... That's one of the few breakthroughs in the history of computer science, in a way. He was actually responsible for maybe two of the ten that I know.

So I'm doing structured programming as I'm writing TeX. I'm trying to do it right, the way I should've been writing programs in the 60s. Then I also got this typesetting machine, which had, inside of it, a tiny 8080 chip or something. I'm not sure exactly. It was a Zilog, or some very early Intel chip. Way before the 386s. A little computer with 8-bit registers and a small number of things it could do. I had to write my own assembly language for this, because the existing software for writing programs for this little micro thing was so bad. I had to write actually thousands of lines of code for this, in order to control the typesetting. Inside the machine I had to control a stepper motor, and I had to accelerate it.

Every so often I had to give another [command] saying, "Okay, now take a step," and then continue downloading a font from the mainframe.

I had six levels of interrupts in this program. I remember talking to you at this time, saying, "Ed, I'm programming in assembly language for an 8-bit computer," and you said "Yeah, you've been doing the same thing and it's fun again."

You know, you'll remember. We'll undoubtedly talk more about that when I have my turn interviewing you in a week or so. This is another aspect of programming: that you also feel that you're in control and that there's not a black box separating you. It's not only the power, but it's the knowledge of what's going on; that nobody's hiding something. It's also this aspect of jumping levels of abstraction. In my opinion, the thing that computer scientists are best at is seeing things at many levels of detail: high level, intermediate levels, and lowest levels. I know if I'm adding 1 to a certain number, that this is getting me towards some big goal at the top. People enjoy most the things that they're good at. Here's a case where if you're working on a machine that has only this 8-bit capability, but in order to do this you have to go through levels, of not only that machine, but also to the next level up of the assembler, and then you have a simulator in which you can help debug your programs, and you have higher level languages that go through, and then you have the typesetting at the top. There are these six or seven levels all present at the same time. A computer scientist is in heaven in a situation like this.

Feigenbaum: Don, to get back, I want to ask you about that as part of the next question. You went back into programming in a really serious way. It took you, as I said before, ten years, not one year, and you didn't quit. As soon as you mastered one part of it, you went into Metafont, which is another big deal. To what extent were you doing that because you needed to, what I might call expose yourself to, or upgrade your skills in, the art that had emerged over the decade-and-a-half since you had done RUNCIBLE? And to what extent did you do it just because you were driven to be a programmer? You loved programming.

Knuth: Yeah. I think your hypothesis is good. It didn't occur to me at the time that I just had to program in order to be a happy man. Certainly I didn't find my other roles distasteful, except for fundraising. I enjoyed every aspect of being a professor except dealing with proposals, which I did my share of, but that was a necessary evil sort of in my own thinking, I guess. But the fact is that now I'm still compelled to... I wake up in the morning with an idea, and it makes my day to think of adding a couple of lines to my program. Gives me a real high. It must be the way poets feel, or musicians and so on, and other people, painters, whatever. Programming does that for me. It's certainly true. But the fact that I had to put so much time in it was not totally that, I'm sure, because it became a responsibility. It wasn't just for Phyllis and me, as it turned out. I started working on it at the AI lab, and people were looking at the output coming out of the machine and they would say, "Hey, Don, how did you do that?" Guy Steele was visiting from MIT that summer and he said, "Don, I want to port this to take it to MIT." I didn't have two users.

First I had 10, and then I had 100, and then I had 1000. Every time it went to another order of magnitude I had to change the system, because it would almost match their needs but then they would have very good suggestions as to something it wasn't covering. Then when it went to 10,000 and when it went to 100,000, the last stage was 10 years later when I made it friendly for the other alphabets of the world, where people have accented letters and Russian letters.

I had started out with only 7-bit codes. I had so many international users by that time, I saw that was a fundamental error. I started out with the idea that nobody would ever want to use a keyboard that could generate more than about 90 characters. It was going to be too complicated. But I was wrong. So it [TeX] was a burden as well, in the sense that I wanted to do a responsible job.

I had actually consciously planned an end-game that would take me four years to finish, and [then] not continue maintaining it and adding on, so that I could have something where I could say, "And now it's done and it's never going to change." I believe this is one aspect of software that, not for every system, but for TeX, it was vital that it became something that wouldn't be a moving target after while.

Feigenbaum: The books on TeX were a period. That is, you put a period down and you said, "This is it."

[Sep 07, 2019] As soon as you stop writing code on a regular basis you stop being a programmer. You lose your qualifications very quickly. That's the typical tragedy of talented programmers who became mediocre managers or, worse, theoretical computer scientists

Programming skills are somewhat similar to the skills of people who play the violin or piano. As soon as you stop playing, the skills start to evaporate: first slowly, then faster. In two years you will probably lose 80%.
Notable quotes:
"... I happened to look the other day. I wrote 35 programs in January, and 28 or 29 programs in February. These are small programs, but I have a compulsion. I love to write programs and put things into it. ..."

Dijkstra said he was proud to be a programmer. Unfortunately he changed his attitude completely, and I think he wrote his last computer program in the 1980s. At this conference I went to in 1967 about simulation language, Chris Strachey was going around asking everybody at the conference what was the last computer program you wrote. This was 1967. Some of the people said, "I've never written a computer program." Others would say, "Oh yeah, here's what I did last week." I asked Edsger this question when I visited him in Texas in the 90s and he said, "Don, I write programs now with pencil and paper, and I execute them in my head." He finds that a good enough discipline.

I think he was mistaken on that. He taught me a lot of things, but I really think that if he had continued... One of Dijkstra's greatest strengths was that he felt a strong sense of aesthetics, and he didn't want to compromise his notions of beauty. They were so intense that when he visited me in the 1960s, I had just come to Stanford. I remember the conversation we had. It was in the first apartment, our little rented house, before we had electricity in the house.

We were sitting there in the dark, and he was telling me how he had just learned about the specifications of the IBM System/360, and it made him so ill that his heart was actually starting to flutter.

He intensely disliked things that he didn't consider clean to work with. So I can see that he would have distaste for the languages that he had to work with on real computers. My reaction to that was to design my own language, and then make Pascal so that it would work well for me in those days. But his response was to do everything only intellectually.

So, programming.

I happened to look the other day. I wrote 35 programs in January, and 28 or 29 programs in February. These are small programs, but I have a compulsion. I love to write programs and put things into it. I think of a question that I want to answer, or I have part of my book where I want to present something. But I can't just present it by reading about it in a book. As I code it, it all becomes clear in my head. It's just the discipline. The fact that I have to translate my knowledge of this method into something that the machine is going to understand just forces me to make that crystal-clear in my head. Then I can explain it to somebody else infinitely better. The exposition is always better if I've implemented it, even though it's going to take me more time.

[Sep 07, 2019] Knuth about computer science and money: At that point I made the decision in my life that I wasn't going to optimize my income;


So I had a programming hat when I was outside of Cal Tech, and at Cal Tech I am a mathematician taking my grad studies. A startup company, called Green Tree Corporation because green is the color of money, came to me and said, "Don, name your price. Write compilers for us and we will take care of finding computers for you to debug them on, and assistance for you to do your work. Name your price." I said, "Oh, okay. $100,000," assuming that this was... In that era this was not quite at Bill Gates's level today, but it was sort of out there.

The guy didn't blink. He said, "Okay." I didn't really blink either. I said, "Well, I'm not going to do it. I just thought this was an impossible number."

At that point I made the decision in my life that I wasn't going to optimize my income; I was really going to do what I thought I could do for... well, I don't know. If you ask me what makes me most happy, number one would be somebody saying "I learned something from you". Number two would be somebody saying "I used your software". But number infinity would be... Well, no. Number infinity minus one would be "I bought your book". It's not as good as "I read your book", you know. Then there is "I bought your software"; that was not in my own personal value system. So that decision came up. I kept up with the literature about compilers. The Communications of the ACM was where the action was. I also worked with people on trying to debug the ALGOL language, which had problems with it. I published a few papers, like "The Remaining Trouble Spots in ALGOL 60" was one of the papers that I worked on. I chaired a committee called "Smallgol" which was to find a subset of ALGOL that would work on small computers. I was active in programming languages.

[Sep 07, 2019] Knuth: maybe 1 in 50 people have the "computer scientist's" type of intellect


Frana: You have made the comment several times that maybe 1 in 50 people have the "computer scientist's mind."

Knuth: Yes.

Frana: I am wondering if a large number of those people are trained professional librarians? [laughter] There is some strangeness there. But can you pinpoint what it is about the mind of the computer scientist that is....

Knuth: That is different?

Frana: What are the characteristics?

Knuth: Two things: one is the ability to deal with non-uniform structure, where you have case one, case two, case three, case four. Or that you have a model of something where the first component is integer, the next component is a Boolean, and the next component is a real number, or something like that, you know, non-uniform structure. To deal fluently with those kinds of entities, which is not typical in other branches of mathematics, is critical. And the other characteristic ability is to shift levels quickly, from looking at something in the large to looking at something in the small, and many levels in between, jumping from one level of abstraction to another. You know that, when you are adding one to some number, that you are actually getting closer to some overarching goal. These skills, being able to deal with nonuniform objects and to see through things from the top level to the bottom level, these are very essential to computer programming, it seems to me. But maybe I am fooling myself because I am too close to it.

Frana: It is the hardest thing to really understand that which you are existing within.

Knuth: Yes.

[Sep 07, 2019] Knuth: students now get a more superficial idea of mathematics than they used to

Sep 07, 2019

Knuth: Well, certainly it seems the way things are going. You take any particular subject that you are interested in and you try to see if somebody with an American high school education has learned it, and you will be appalled. You know, Jesse Jackson thinks that students know nothing about political science, and I am sure the chemists think that students don't know chemistry, and so on. But somehow they get it when they have to later. But I would say certainly the students now have been getting more of a superficial idea of mathematics than they used to. We have to do remedial stuff at Stanford that we didn't have to do thirty years ago.

Frana: Gio [Wiederhold] said much the same thing to me.

Knuth: The most scandalous thing was that Stanford's course in linear algebra could not get to eigenvalues because the students didn't know about complex numbers. Now every course at Stanford that takes linear algebra as a prerequisite does so because they want the students to know about eigenvalues. But here at Stanford, with one of the highest admission standards of any university, our students don't know complex numbers. So we have to teach them that when they get to college. Yes, this is definitely a breakdown.

Frana: Was your mathematics training in high school particularly good, or was it that you spent a lot of time actually doing problems?

Knuth: No, my mathematics training in high school was not good. My teachers could not answer my questions and so I decided I'd go into physics. I mean, I had played with mathematics in high school. I did a lot of work drawing graphs and plotting points and I used pi as the radix of a number system, and explored what the world would be like if you wanted to do logarithms and you had a number system based on pi. And I had played with stuff like that. But my teachers couldn't answer questions that I had.
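Knuth's high-school experiment with pi as a radix can be sketched as a greedy digit expansion (an illustrative reconstruction, not his actual method; the function name and digit count are mine):

```python
import math

def to_base_pi(x, ndigits=8):
    """Greedy digit expansion of x > 0 in radix pi; digits range from 0 to 3."""
    # Find the highest power of pi not exceeding x.
    k = 0
    while math.pi ** (k + 1) <= x:
        k += 1
    digits = []
    for p in range(k, k - ndigits, -1):
        d = int(x // math.pi ** p)   # greedy: largest digit that fits at this place
        digits.append(d)
        x -= d * math.pi ** p
    return digits
```

In such a system pi itself is written "10", and logarithms to base pi of the place values become integers, which is presumably what made the thought experiment interesting.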

... ... ... Frana: Do you have an answer? Are American students different today? In one of your interviews you discuss the problem of creativity versus gross absorption of knowledge.

Knuth: Well, that is part of it. Today we have mostly a sound-bite culture, this lack of attention span and trying to learn how to pass exams.

Frana: Yes...

[Sep 07, 2019] Knuth: I can be a writer, who tries to organize other people's ideas into some kind of a more coherent structure so that it is easier to put things together

Sep 07, 2019

Knuth: I can be a writer, who tries to organize other people's ideas into some kind of a more coherent structure so that it is easier to put things together. I can see that I could be viewed as a scholar that does his best to check out sources of material, so that people get credit where it is due. And to check facts over, not just to look at the abstract of something, but to see what the methods were that did it and to fill in holes if necessary. I look at my role as being able to understand the motivations and terminology of one group of specialists and boil it down to a certain extent so that people in other parts of the field can use it. I try to listen to the theoreticians and select what they have done that is important to the programmer on the street; to remove technical jargon when possible.

But I have never been good at any kind of a role that would be making policy, or advising people on strategies, or what to do. I have always been best at refining things that are there and bringing order out of chaos. I sometimes raise new ideas that might stimulate people, but not really in a way that would be in any way controlling the flow. The only time I have ever advocated something strongly was with literate programming; but I do this always with the caveat that it works for me, not knowing if it would work for anybody else.

When I work with a system that I have created myself, I can always change it if I don't like it. But everybody who works with my system has to work with what I give them. So I am not able to judge my own stuff impartially. So anyway, I have always felt bad about if anyone says, 'Don, please forecast the future,'...

[Sep 06, 2019] Knuth: Programming and architecture are interrelated and it is impossible to create good architecture without actually programming at least a prototype

Notable quotes:
"... When you're writing a document for a human being to understand, the human being will look at it and nod his head and say, "Yeah, this makes sense." But then there's all kinds of ambiguities and vagueness that you don't realize until you try to put it into a computer. Then all of a sudden, almost every five minutes as you're writing the code, a question comes up that wasn't addressed in the specification. "What if this combination occurs?" ..."
"... When you're faced with implementation, a person who has been delegated this job of working from a design would have to say, "Well hmm, I don't know what the designer meant by this." ..."
Sep 06, 2019

...I showed the second version of this design to two of my graduate students, and I said, "Okay, implement this, please, this summer. That's your summer job." I thought I had specified a language. I had to go away. I spent several weeks in China during the summer of 1977, and I had various other obligations. I assumed that when I got back from my summer trips, I would be able to play around with TeX and refine it a little bit. To my amazement, the students, who were outstanding students, had not completed it. They had a system that was able to do about three lines of TeX. I thought, "My goodness, what's going on? I thought these were good students." Well afterwards I changed my attitude to saying, "Boy, they accomplished a miracle."

Because going from my specification, which I thought was complete, they really had an impossible task, and they had succeeded wonderfully with it. These students, by the way, [were] Michael Plass, who has gone on to be the brains behind almost all of Xerox's Docutech software and all kind of things that are inside of typesetting devices now, and Frank Liang, one of the key people for Microsoft Word.

He did important mathematical things as well as his hyphenation methods, which are widely used for all languages now. These guys were actually doing great work, but I was amazed that they couldn't do what I thought was just sort of a routine task. Then I became a programmer in earnest, where I had to do it. The reason is when you're doing programming, you have to explain something to a computer, which is dumb.

When you're writing a document for a human being to understand, the human being will look at it and nod his head and say, "Yeah, this makes sense." But then there's all kinds of ambiguities and vagueness that you don't realize until you try to put it into a computer. Then all of a sudden, almost every five minutes as you're writing the code, a question comes up that wasn't addressed in the specification. "What if this combination occurs?"

It just didn't occur to the person writing the design specification. When you're faced with implementation, a person who has been delegated this job of working from a design would have to say, "Well hmm, I don't know what the designer meant by this."

If I hadn't been in China they would've scheduled an appointment with me and stopped their programming for a day. Then they would come in at the designated hour and we would talk. They would take 15 minutes to present to me what the problem was, and then I would think about it for a while, and then I'd say, "Oh yeah, do this. " Then they would go home and they would write code for another five minutes and they'd have to schedule another appointment.

I'm probably exaggerating, but this is why I think Bob Floyd's Chiron compiler never got going. Bob worked many years on a beautiful idea for a programming language, where he designed a language called Chiron, but he never touched the programming himself. I think this was actually the reason that he had trouble with that project, because it's so hard to do the design unless you're faced with the low-level aspects of it, explaining it to a machine instead of to another person.

Forsythe, I think it was, who said, "People have said traditionally that you don't understand something until you've taught it in a class. The truth is you don't really understand something until you've taught it to a computer, until you've been able to program it." At this level, programming was absolutely important.

[Sep 06, 2019] Oral histories

Sep 06, 2019

Having just celebrated my 10000th birthday (in base three), I'm operating a little bit in history mode. Every once in awhile, people have asked me to record some of my memories of past events --- I guess because I've been fortunate enough to live at some pretty exciting times, computersciencewise. These after-the-fact recollections aren't really as reliable as contemporary records; but they do at least show what I think I remember. And the stories are interesting, because they involve lots of other people.

So, before these instances of oral history themselves begin to fade from my memory, I've decided to record some links to several that I still know about:

Interview by Philip L Frana at the Charles Babbage Institute, November 2001
transcript of OH 332
audio file (2:00:33)
Interviews commissioned by Peoples Archive, taped in March 2006
playlist for 97 videos (about 2--8 minutes each)
Interview by Ed Feigenbaum at the Computer History Museum, March 2007
Part 1 (3:07:25) Part 2 (4:02:46)
Interview by Susan Schofield for the Stanford Historical Society, May 2018
(audio files, 2:20:30 and 2:14:25; transcript)
Interview by David Brock and Hansen Hsu about the computer programs that I wrote during the 1950s, July 2018
video (1:30:00)
(texts of the actual programs)

Some extended interviews, not available online, have also been published in books, notably in Chapters 7--17 of Companion to the Papers of Donald Knuth (conversations with Dikran Karagueuzian in the summer of 1996), and in two books by Edgar G. Daylight, The Essential Knuth (2013), Algorithmic Barriers Falling (2014).

[Sep 06, 2019] Knuth: No, I stopped going to conferences. It was too discouraging. Computer programming keeps getting harder because more stuff is discovered

Sep 06, 2019

Knuth: No, I stopped going to conferences. It was too discouraging. Computer programming keeps getting harder because more stuff is discovered. I can cope with learning about one new technique per day, but I can't take ten in a day all at once. So conferences are depressing; it means I have so much more work to do. If I hide myself from the truth I am much happier.

[Sep 06, 2019] How TAOCP was hatched

Notable quotes:
"... Also, Addison-Wesley was the people who were asking me to do this book; my favorite textbooks had been published by Addison Wesley. They had done the books that I loved the most as a student. For them to come to me and say, "Would you write a book for us?", and here I am just a second-year graduate student -- this was a thrill. ..."
"... But in those days, The Art of Computer Programming was very important because I'm thinking of the aesthetical: the whole question of writing programs as something that has artistic aspects in all senses of the word. The one idea is "art" which means artificial, and the other "art" means fine art. All these are long stories, but I've got to cover it fairly quickly. ..."
Sep 06, 2019

Knuth: This is, of course, really the story of my life, because I hope to live long enough to finish it. But I may not, because it's turned out to be such a huge project. I got married in the summer of 1961, after my first year of graduate school. My wife finished college, and I could use the money I had made -- the $5000 on the compiler -- to finance a trip to Europe for our honeymoon.

We had four months of wedded bliss in Southern California, and then a man from Addison-Wesley came to visit me and said "Don, we would like you to write a book about how to write compilers."

The more I thought about it, I decided "Oh yes, I've got this book inside of me."

I sketched out that day -- I still have the sheet of tablet paper on which I wrote -- I sketched out 12 chapters that I thought ought to be in such a book. I told Jill, my wife, "I think I'm going to write a book."

As I say, we had four months of bliss, because the rest of our marriage has all been devoted to this book. Well, we still have had happiness. But really, I wake up every morning and I still haven't finished the book. So I try to -- I have to -- organize the rest of my life around this, as one main unifying theme. The book was supposed to be about how to write a compiler. They had heard about me from one of their editorial advisors, that I knew something about how to do this. The idea appealed to me for two main reasons. One is that I did enjoy writing. In high school I had been editor of the weekly paper. In college I was editor of the science magazine, and I worked on the campus paper as copy editor. And, as I told you, I wrote the manual for that compiler that we wrote. I enjoyed writing, number one.

Also, Addison-Wesley was the people who were asking me to do this book; my favorite textbooks had been published by Addison Wesley. They had done the books that I loved the most as a student. For them to come to me and say, "Would you write a book for us?", and here I am just a second-year graduate student -- this was a thrill.

Another very important reason at the time was that I knew that there was a great need for a book about compilers, because there were a lot of people who even in 1962 -- this was January of 1962 -- were starting to rediscover the wheel. The knowledge was out there, but it hadn't been explained. The people who had discovered it, though, were scattered all over the world and they didn't know of each other's work either, very much. I had been following it. Everybody I could think of who could write a book about compilers, as far as I could see, they would only give a piece of the fabric. They would slant it to their own view of it. There might be four people who could write about it, but they would write four different books. I could present all four of their viewpoints in what I would think was a balanced way, without any axe to grind, without slanting it towards something that I thought would be misleading to the compiler writer for the future. I considered myself as a journalist, essentially. I could be the expositor, the tech writer, that could do the job that was needed in order to take the work of these brilliant people and make it accessible to the world. That was my motivation.

Now, I didn't have much time to spend on it then, I just had this page of paper with 12 chapter headings on it. That's all I could do while I'm a consultant at Burroughs and doing my graduate work. I signed a contract, but they said "We know it'll take you a while." I didn't really begin to have much time to work on it until 1963, my third year of graduate school, as I'm already finishing up on my thesis.

In the summer of '62, I guess I should mention, I wrote another compiler. This was for Univac; it was a FORTRAN compiler. I spent the summer, I sold my soul to the devil, I guess you say, for three months in the summer of 1962 to write a FORTRAN compiler. I believe that the salary for that was $15,000, which was much more than an assistant professor. I think assistant professors were getting eight or nine thousand in those days.

Feigenbaum: Well, when I started in 1960 at [University of California] Berkeley, I was getting $7,600 for the nine-month year.

Knuth: Yeah, so you see it. I got $15,000 for a summer job in 1962 writing a FORTRAN compiler.

One day during that summer I was writing the part of the compiler that looks up identifiers in a hash table. The method that we used is called linear probing. Basically you take the variable name that you want to look up, you scramble it, like you square it or something like this, and that gives you a number between one and, well in those days it would have been between 1 and 1000, and then you look there. If you find it, good; if you don't find it, go to the next place and keep on going until you either get to an empty place, or you find the number you're looking for. It's called linear probing.

There was a rumor that one of Professor Feller's students at Princeton had tried to figure out how fast linear probing works and was unable to succeed. This was a new thing for me. It was a case where I was doing programming, but I also had a mathematical problem that would go into my other [job]. My winter job was being a math student, my summer job was writing compilers. There was no mix. These worlds did not intersect at all in my life at that point.

So I spent one day during the summer while writing the compiler looking at the mathematics of how fast does linear probing work. I got lucky, and I solved the problem. I figured out some math, and I kept two or three sheets of paper with me and I typed it up. ["Notes on 'Open' Addressing", 7/22/63] I guess that's on the internet now, because this became really the genesis of my main research work, which developed not to be working on compilers, but to be working on what they call analysis of algorithms, which is, have a computer method and find out how good is it quantitatively. I can say, if I got so many things to look up in the table, how long is linear probing going to take. It dawned on me that this was just one of many algorithms that would be important, and each one would lead to a fascinating mathematical problem.
This was easily a good lifetime source of rich problems to work on. Here I am then, in the middle of 1962, writing this FORTRAN compiler, and I had one day to do the research and mathematics that changed my life for my future research trends. But now I've gotten off the topic of what your original question was.
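The linear probing scheme Knuth describes above can be sketched in a few lines (a minimal illustration of the technique, not his 1962 code; the table of 1000 slots echoes his "between 1 and 1000"):

```python
def probe(table, key):
    """Return the slot where key lives, or the first empty slot, by linear probing."""
    n = len(table)
    i = hash(key) % n              # "scramble" the name into a starting position
    while table[i] is not None and table[i][0] != key:
        i = (i + 1) % n            # occupied by a different key: "go to the next place"
    return i

def insert(table, key, value):
    table[probe(table, key)] = (key, value)

def lookup(table, key):
    entry = table[probe(table, key)]
    return entry[1] if entry is not None else None

table = [None] * 1000              # slots are None until an identifier is inserted
insert(table, "ALPHA", 1)
insert(table, "BETA", 2)
```

Note that the probe loop assumes at least one empty slot remains; the question Knuth solved that day was precisely how the expected number of probes grows as the table fills up.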

Feigenbaum: We were talking about sort of the.. You talked about the embryo of The Art of Computing. The compiler book morphed into The Art of Computer Programming, which became a seven-volume plan.

Knuth: Exactly. Anyway, I'm working on a compiler and I'm thinking about this. But now I'm starting, after I finish this summer job, then I began to do things that were going to be relating to the book. One of the things I knew I had to have in the book was an artificial machine, because I'm writing a compiler book but machines are changing faster than I can write books. I have to have a machine that I'm totally in control of. I invented this machine called MIX, which was typical of the computers of 1962.

In 1963 I wrote a simulator for MIX so that I could write sample programs for it, and I taught a class at Caltech on how to write programs in assembly language for this hypothetical computer. Then I started writing the parts that dealt with sorting problems and searching problems, like the linear probing idea. I began to write those parts, which are part of a compiler, of the book. I had several hundred pages of notes gathering for those chapters for The Art of Computer Programming. Before I graduated, I've already done quite a bit of writing on The Art of Computer Programming.

I met George Forsythe about this time. George was the man who inspired both of us [Knuth and Feigenbaum] to come to Stanford during the '60s. George came down to Southern California for a talk, and he said, "Come up to Stanford. How about joining our faculty?" I said "Oh no, I can't do that. I just got married, and I've got to finish this book first." I said, "I think I'll finish the book next year, and then I can come up [and] start thinking about the rest of my life, but I want to get my book done before my son is born." Well, John is now 40-some years old and I'm not done with the book. Part of my lack of expertise is any good estimation procedure as to how long projects are going to take. I way underestimated how much needed to be written about in this book.

Anyway, I started writing the manuscript, and I went merrily along writing pages of things that I thought really needed to be said. Of course, it didn't take long before I had started to discover a few things of my own that weren't in any of the existing literature. I did have an axe to grind. The message that I was presenting was in fact not going to be unbiased at all. It was going to be based on my own particular slant on stuff, and that original reason for why I should write the book became impossible to sustain.

But the fact that I had worked on linear probing and solved the problem gave me a new unifying theme for the book. I was going to base it around this idea of analyzing algorithms, and have some quantitative ideas about how good methods were. Not just that they worked, but that they worked well: this method worked 3 times better than this method, or 3.1 times better than this method. Also, at this time I was learning mathematical techniques that I had never been taught in school. I found they were out there, but they just hadn't been emphasized openly, about how to solve problems of this kind.

So my book would also present a different kind of mathematics than was common in the curriculum at the time, that was very relevant to analysis of algorithm. I went to the publishers, I went to Addison Wesley, and said "How about changing the title of the book from 'The Art of Computer Programming' to 'The Analysis of Algorithms'." They said that will never sell; their focus group couldn't buy that one. I'm glad they stuck to the original title, although I'm also glad to see that several books have now come out called "The Analysis of Algorithms", 20 years down the line.

But in those days, The Art of Computer Programming was very important because I'm thinking of the aesthetical: the whole question of writing programs as something that has artistic aspects in all senses of the word. The one idea is "art" which means artificial, and the other "art" means fine art. All these are long stories, but I've got to cover it fairly quickly.

I've got The Art of Computer Programming started out, and I'm working on my 12 chapters. I finish a rough draft of all 12 chapters by, I think it was like 1965. I've got 3,000 pages of notes, including a very good example of what you mentioned about seeing holes in the fabric. One of the most important chapters in the book is parsing: going from somebody's algebraic formula and figuring out the structure of the formula. Just the way I had done in seventh grade finding the structure of English sentences, I had to do this with mathematical sentences.

Chapter ten is all about parsing of context-free language, [which] is what we called it at the time. I covered what people had published about context-free languages and parsing. I got to the end of the chapter and I said, well, you can combine these ideas and these ideas, and all of a sudden you get a unifying thing which goes all the way to the limit. These other ideas had sort of gone partway there. They would say "Oh, if a grammar satisfies this condition, I can do it efficiently." "If a grammar satisfies this condition, I can do it efficiently." But now, all of a sudden, I saw there was a way to say I can find the most general condition that can be done efficiently without looking ahead to the end of the sentence. That you could make a decision on the fly, reading from left to right, about the structure of the thing. That was just a natural outgrowth of seeing the different pieces of the fabric that other people had put together, and writing it into a chapter for the first time. But I felt that this general concept, well, I didn't feel that I had surrounded the concept. I knew that I had it, and I could prove it, and I could check it, but I couldn't really intuit it all in my head. I knew it was right, but it was too hard for me, really, to explain it well.

So I didn't put in The Art of Computer Programming. I thought it was beyond the scope of my book. Textbooks don't have to cover everything when you get to the harder things; then you have to go to the literature. My idea at that time [is] I'm writing this book and I'm thinking it's going to be published very soon, so any little things I discover and put in the book I didn't bother to write a paper and publish in the journal because I figure it'll be in my book pretty soon anyway. Computer science is changing so fast, my book is bound to be obsolete.

It takes a year for it to go through editing, and people drawing the illustrations, and then they have to print it and bind it and so on. I have to be a little bit ahead of the state-of-the-art if my book isn't going to be obsolete when it comes out. So I kept most of the stuff to myself that I had, these little ideas I had been coming up with. But when I got to this idea of left-to-right parsing, I said "Well here's something I don't really understand very well. I'll publish this, let other people figure out what it is, and then they can tell me what I should have said." I published that paper I believe in 1965, at the end of finishing my draft of the chapter, which didn't get as far as that story, LR(k). Well now, textbooks of computer science start with LR(k) and take off from there. But I want to give you an idea of

Ken Thompson and Dennis Ritchie Explain UNIX (Bell Labs) - YouTube

What is interesting is that in this presentation they stressed that Unix was designed out of a desire to create a simple programming environment.

[Jul 30, 2019] FreeDOS turns 25 years old by Jim Hall

Jul 28, 2019

FreeDOS turns 25 years old: An origin story

The operating system's history is a great example of the open source software model: developers working together to create something.


FreeDOS just turned 25 years old.

That's a major milestone for any open source software project, and I'm proud of the work that we've done on it over the past quarter century. I'm also proud of how we built FreeDOS because it is a great example of how the open source software model works.

For its time, MS-DOS was a powerful operating system. I'd used DOS for years, ever since my parents replaced our aging Apple II computer with a newer IBM machine. MS-DOS provided a flexible command line, which I quite liked and that came in handy to manipulate my files. Over the years, I learned how to write my own utilities in C to expand its command-line capabilities even further.

Around 1994, Microsoft announced that its next planned version of Windows would do away with MS-DOS. But I liked DOS. Even though I had started migrating to Linux, I still booted into MS-DOS to run applications that Linux didn't have yet.

I figured that if we wanted to keep DOS, we would need to write our own. And that's how FreeDOS was born.

On June 29, 1994, I made a small announcement about my idea to the comp.os.msdos.apps newsgroup on Usenet.

A few months ago, I posted articles relating to starting a public domain version of DOS. The general support for this at the time was strong, and many people agreed with the statement, "start writing!" So, I have

Announcing the first effort to produce a PD-DOS. I have written up a "manifest" describing the goals of such a project and an outline of the work, as well as a "task list" that shows exactly what needs to be written. I'll post those here, and let discussion follow.

While I announced the project as PD-DOS (for "public domain," although the abbreviation was meant to mimic IBM's "PC-DOS"), we soon changed the name to Free-DOS and later FreeDOS.

I started working on it right away. First, I shared the utilities I had written to expand the DOS command line. Many of them reproduced MS-DOS features, including CLS, DATE, DEL, FIND, HELP, and MORE. Some added new features to DOS that I borrowed from Unix, such as TEE and TRCH (a simple implementation of Unix's tr). I contributed over a dozen FreeDOS utilities.
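The core of a tr-style utility like TRCH is a single character-for-character translation pass. A minimal sketch of the idea (an illustration only; the real FreeDOS TRCH is a DOS program, not this Python):

```python
import sys

def trch(text, frm, to):
    """Replace each character found in frm with the character at the same position in to."""
    return text.translate(str.maketrans(frm, to))

def main(frm, to):
    # Filter-style usage, the way tr is used in a Unix pipeline:
    #   some-command | trch.py abc xyz
    for line in sys.stdin:
        sys.stdout.write(trch(line, frm, to))

if __name__ == "__main__":
    main(sys.argv[1], sys.argv[2])
```

The two set arguments must be the same length, exactly as with `tr abc xyz`, which maps a to x, b to y, and c to z while leaving every other character alone.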

By sharing my utilities, I gave other developers a starting point. And by sharing my source code under the GNU General Public License (GNU GPL), I implicitly allowed others to add new features and fix bugs.

Other developers who saw FreeDOS taking shape contacted me and wanted to help. Tim Norman was one of the first; Tim volunteered to write a command shell (COMMAND.COM, later named FreeCOM). Others contributed utilities that replicated or expanded the DOS command line.

We released our first alpha version as soon as possible. Less than three months after announcing FreeDOS, we had an Alpha 1 distribution that collected our utilities. By the time we released Alpha 5, FreeDOS boasted over 60 utilities. And FreeDOS included features never imagined in MS-DOS, including internet connectivity via a PPP dial-up driver and dual-monitor support using a primary VGA monitor and a secondary Hercules Mono monitor.

New developers joined the project, and we welcomed them. By October 1998, FreeDOS had a working kernel, thanks to Pat Villani. FreeDOS also sported a host of new features that brought not just parity with MS-DOS but surpassed MS-DOS, including ANSI support and a print spooler that resembled Unix lpr.

You may be familiar with other milestones. We crept our way towards the 1.0 label, finally releasing FreeDOS 1.0 in September 2006, FreeDOS 1.1 in January 2012, and FreeDOS 1.2 in December 2016. MS-DOS stopped being a moving target long ago, so we didn't need to update as frequently after the 1.0 release.

Today, FreeDOS is a very modern DOS. We've moved beyond "classic DOS," and now FreeDOS features lots of development tools such as compilers, assemblers, and debuggers. We have lots of editors beyond the plain DOS Edit editor, including Fed, Pico, TDE, and versions of Emacs and Vi. FreeDOS supports networking and even provides a simple graphical web browser (Dillo). And we have tons of new utilities, including many that will make Linux users feel at home.

FreeDOS got where it is because developers worked together to create something. In the spirit of open source software, we contributed to each other's work by fixing bugs and adding new features. We treated our users as co-developers; we always found ways to include people, whether they were writing code or writing documentation. And we made decisions through consensus based on merit. If that sounds familiar, it's because those are the core values of open source software: transparency, collaboration, release early and often, meritocracy, and community. That's the open source way!

I encourage you to download FreeDOS 1.2 and give it a try.

[Apr 16, 2019] European contributions to computing and the internet

Apr 16, 2019

Amerimutt Golem says: April 16, 2019 at 10:28 am GMT

@Thomm ...

Actually smart Northern European men enabled the very Internet you are using to spread kosher propaganda.

1. Gottfried Leibniz/German – binary number system.
2. George Boole/English – Boolean logic.
3. Konrad Zuse/German – electronic computer.
4. Donald Davies/Welsh – packet switching.
5. Clifford Cocks/English – public key encryption years before Rivest, Shamir, and Adleman.
6. Edsger Dijkstra/Dutch – Dijkstra's algorithm and structured programming.
7. Tim Berners-Lee/English – HTML and HTTP.
8. Håkon Wium Lie/Norwegian – Cascading Style Sheets (CSS).
9. Linus Torvalds/Finn – Linux, on which many web servers run. Klaus Knopper/German – Knoppix Linux variant.
10. Edgar F. Codd/English – relational database model.
11. Michael Widenius/Swede – MySQL, on which many web applications run.
12. Kristen Nygaard & Ole-Johan Dahl/Norwegians – object-oriented programming and the Simula programming language.
13. Guido van Rossum/Dutch – Python programming language.
14. Lennart Augustsson/Swede – Haskell programming language.
15. Bjarne Stroustrup/Dane – C++ programming language.
16. Geoffrey Hinton/English – artificial intelligence.
17. Jürgen Dethloff and Helmut Gröttrup/Germans – chip card used in mobile phones plus credit and debit cards.
18. Karlheinz Brandenburg/German – MP3 format.
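To make item 6 concrete, here is a minimal sketch of Dijkstra's shortest-path algorithm in Python. The graph, node names, and weights are illustrative, not from the original list; this is the textbook priority-queue formulation, not any particular production implementation.

```python
import heapq

def dijkstra(graph, source):
    """Single-source shortest paths over non-negative edge weights.
    graph: dict mapping node -> list of (neighbor, weight) pairs."""
    dist = {source: 0}
    heap = [(0, source)]                      # (distance-so-far, node)
    while heap:
        d, u = heapq.heappop(heap)
        if d > dist.get(u, float("inf")):
            continue                          # stale queue entry, skip
        for v, w in graph.get(u, []):
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v] = nd                  # found a shorter route to v
                heapq.heappush(heap, (nd, v))
    return dist

# Toy network with four nodes
graph = {
    "A": [("B", 1), ("C", 4)],
    "B": [("C", 2), ("D", 5)],
    "C": [("D", 1)],
    "D": [],
}
print(dijkstra(graph, "A"))  # {'A': 0, 'B': 1, 'C': 3, 'D': 4}
```

The same greedy idea underlies link-state routing protocols such as OSPF, which is one reason Dijkstra's work matters to the internet the commenter describes.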

[Oct 30, 2018] The Watson family held integrity, equality, and knowledge share as a formidable synthesis of company ethics. With them gone old IBM was gone...

It is not that the Watson family is gone; it is that New Deal capitalism was replaced with neoliberalism
Notable quotes:
"... Except when your employer is the one preaching associate loyalty and "we are family" your entire career. Then they decide you've been too loyal and no longer want to pay your salary and start fabricating reasons to get rid of you. ADP is guilty of these same practices and eliminating their tenured associates. Meanwhile, the millennials hired play ping pong and text all day, rather than actually working. ..."
Oct 30, 2018 |

Zytor-LordoftheSkies , Thursday, March 22, 2018 11:55 AM

A quick search of the article doesn't find the word "buy backs" but this is a big part of the story. IBM spent over $110 BILLION on stock buy backs between 2000 and 2016. That's the number I found, but it hasn't stopped since. If anything it has escalated.

This is very common among large corporations. Rather than spend on their people, they funnel billions into stock buy backs which raises or at least maintains the stock value so execs can keep cashing in. It's really pretty disgraceful. This was only legalized in 1982, which not-so-coincidentally is not long after real wages stalled, and have stalled ever since.

Suzan Zytor-LordoftheSkies ,
Thanks for this bit of insanely true reporting. When laid off from Westinghouse after 14 years of stellar performance evaluations I was flummoxed by the execs getting million-dollar bonuses as we were told the company wasn't profitable enough to maintain its senior engineering staff. It sold off every division eventually as the execs (many of them newly hired) reaped even more bonuses.
Georgann Putintsev Suzan ,
Thank you ... very insightful of you. As an IBMer and lover of Spreadsheets / Statistics / Data Specialist ... I like reading Annual Reports. Researching these Top Execs, BOD and comparing them to other Companies across-the-board and industry sectors. You'll find a Large Umbrella there.
There is a direct tie and inter-changeable pieces of these elites over the past 55 yrs. Whenever some Corp/ Political/ Government shill (wannabe) needs a payoff, they get placed into high ranking top positions for orchestrating a prescribed dark nwo agenda. Some may come up the ranks like Ginny, but ALL belong to the Council on Foreign Relations and other such high level private clubs or organizations. When IBM sells off their Mainframe Manufacturing (Poughkeepsie) to an elite Saudi, under an American Co. sounding name of course, ... and the U.S. Government ... doesn't balk ... that has me worried for our 1984 future.
Carol Van Linda Suzan ,
Sears is doing this also
Suzan Carol Van Linda ,
Details? Thanks!
vibert Zytor-LordoftheSkies ,
True in every large corporation. They use almost free money from the US Government to do it. (Taxpayer's money)
DDRLSGC vibert ,
Yeah, it is amazing how they stated that they don't need help from the government when in reality they do need government to pass laws that favor them, pack the court system where judges rule in their favor and use their private police and the public sector police to keep the workers down.
Johnny Player DDRLSGC ,
Why do you put disqus in your name? . Is that so you can see if they sell your info and you know where it originated from?
Theo Geauxvan Zytor-LordoftheSkies ,
I wonder how many billions (trillions?) have been funneled from corporate workers pockets this way? It seems all corporations are doing it these days. Large-scale transfer of wealth from the middle class to the wealthy.
Stevie Ponders Theo Geauxvan ,
It's called asset stripping. Basically corporate raiding (as in pillage) from the inside.
R. J. Smith , Thursday, March 22, 2018 9:06 AM
"Member of the IBM family" -- BS. Your employer is not your family.
Randall Smith R. J. Smith
Not anymore. With most large companies, you've never been able to say they are "family." Loyalty used to be a thing though. I worked at a company where I saw loyalty vanish over a 10 year period.
marsto R. J. Smith
Except when your employer is the one preaching associate loyalty and "we are family" your entire career. Then they decide you've been too loyal and no longer want to pay your salary and start fabricating reasons to get rid of you. ADP is guilty of these same practices and eliminating their tenured associates. Meanwhile, the millennials hired play ping pong and text all day, rather than actually working.
DDRLSGC marsto
Yeah, and how many CEOs actually work to make their companies great instead of running them into the ground, thinking about their next job move, and playing golf
Mary Malley R. J. Smith ,
I have to disagree with you. I started with IBM on their rise up in those earlier days, and we WERE valued and shown that we were valued over and over through those glorious years. It did feel like we were in a family, our families mattered to them, our well-being. They gave me a month to find a perfect babysitter when they hired me before I had to go to work!

They helped me find a house in a good school district for my children. They bought my house when I was moving to a new job/location when it didn't sell within 30 days.

They paid the difference in the interest rate of my loan for my new house from the old one. I can't even begin to list all the myriad of things that made us love IBM and the people we worked with and for, and made us feel a part of that big IBM family.

Did they change, yes, but the dedication we gave was freely given and we mutually respected each other. I was lucky to work for them for decades before that shift when they changed to be just like every other large corporation.

Georgann Putintsev Mary Malley ,
The Watson family held integrity, equality, and knowledge share as a formidable synthesis of company ethics moving a Quality based business forward in the 20th to 21st century. They also promoted an (volunteer) IBM Club to help promote employee and family activities inside/outside of work which they by-en-large paid for. This allowed employees to meet and see other employees/families as 'Real' & "Common-Interest" human beings. I participated, created, and organized events and documented how-to-do-events for other volunteers. These brought IBMers together inside or outside of their 'working' environment to have fun, to associate, to realize those innate qualities that are in all of us. I believe it allowed for better communication and cooperation in the work place.

To me it was family. Some old IBMers might remember when Music, Song, Skits were part of IBM Branch Office meetings. As President of the IBM Clubs Palo Alto branch (7 yrs.), I used our Volunteer Club Votes to spend ALL that IBM donated money, because they <administratively> gave it back to IBM if we didn't.

Without a strong IBM Club presence, it gets whittled down to 2-3 events a year. For a time WE WERE a FAMILY.

bookmama3 Georgann Putintsev ,
Absolutely! Back when white shirts/black suits were a requirement. There was a country club in Poughkeepsie, softball teams, Sunday brunch, Halloween parties in the fall, Christmas parties in December where thousands of age appropriate Fisher Price toys were given out to employee's kids. Today "IBMer" is used by execs as a term of derision. Employees are overworked and under appreciated and shortsighted, overpaid executives rule the roost. The real irony is that talented, vital employees are being retired for "costing too much" while dysfunctional top level folk are rewarded with bonuses and stock when they are let go. And it's all legal. It's disgraceful.
OrangeGina R. J. Smith ,
very true, however for many of us, our co-workers of a very long time ARE family. Corporations are NOT people, but they are comprised of them.
HiJinks R. J. Smith ,
It was true at one time, but no more.
Herb Tarlick R. J. Smith ,
This one was until the mid eighties.

[Oct 15, 2018] Microsoft co-founder Paul Allen dead at 65 by Jacob Kastrenakes and Rachel Becker

Oct 15, 2018 |

Microsoft co-founder Paul Allen died today from complications with non-Hodgkin's lymphoma. He was 65. Allen said earlier this month that he was being treated for the disease.

Allen was a childhood friend of Bill Gates, and together, the two started Microsoft in 1975. He left the company in 1983 while being treated for Hodgkin's lymphoma and remained a board member with the company through 2000. He was first treated for non-Hodgkin's lymphoma in 2009, before seeing it go into remission.

In a statement given to ABC News, Gates said he was "heartbroken by the passing of one of my oldest and dearest friends." He went on to commend his fellow co-founder for his life after Microsoft:

From our early days together at Lakeside School, through our partnership in the creation of Microsoft, to some of our joint philanthropic projects over the years, Paul was a true partner and dear friend. Personal computing would not have existed without him.

But Paul wasn't content with starting one company. He channelled his intellect and compassion into a second act focused on improving people's lives and strengthening communities in Seattle and around the world. He was fond of saying, "If it has the potential to do good, then we should do it." That's the kind of person he was.

Paul loved life and those around him, and we all cherished him in return. He deserved much more time, but his contributions to the world of technology and philanthropy will live on for generations to come. I will miss him tremendously.

Microsoft CEO Satya Nadella said Allen's contributions to both Microsoft and the industry were "indispensable." His full statement is quoted below:

Paul Allen's contributions to our company, our industry, and to our community are indispensable. As co-founder of Microsoft, in his own quiet and persistent way, he created magical products, experiences and institutions, and in doing so, he changed the world. I have learned so much from him -- his inquisitiveness, curiosity, and push for high standards is something that will continue to inspire me and all of us as Microsoft. Our hearts are with Paul's family and loved ones. Rest in peace.

In a memoir published in 2011, Allen says that he was responsible for naming Microsoft and creating the two-button mouse. The book also portrayed Allen as going under-credited for his work at Microsoft, and Gates as having taken more ownership of the company than he deserved. It created some drama when it arrived, but the two men ultimately appeared to remain friends, posing for a photo together two years later.

After leaving Microsoft, Allen became an investor through his company Vulcan, buying into a diverse set of companies and markets. Vulcan's current portfolio ranges from the Museum of Pop Culture in Seattle, to a group focused on using machine learning for climate preservation, to Stratolaunch, which is creating a spaceplane . Allen's investments and donations made him a major name in Seattle, where much of his work was focused. He recently funded a $46 million building in South Seattle that will house homeless and low-income families.

Both Apple CEO Tim Cook and Google CEO Sundar Pichai called Allen a tech "pioneer" while highlighting his philanthropic work in statements on Twitter. Amazon CEO Jeff Bezos said Allen's work "inspired so many."


Allen has long been the owner of the Portland Trail Blazers and Seattle Seahawks as well. NFL Commissioner Roger Goodell said Allen "worked tirelessly" to "identify new ways to make the game safer and protect our players from unnecessary risk." NBA Commissioner Adam Silver said Allen "helped lay the foundation for the league's growth internationally and our embrace of new technologies."

He also launched a number of philanthropic efforts, which were later combined under the name Paul G. Allen Philanthropies. His "philanthropic contributions exceed $2 billion," according to Allen's own website, and he had committed to giving away the majority of his fortune.

Allen's sister, Jody Allen, wrote a statement on his family's behalf:

My brother was a remarkable individual on every level. While most knew Paul Allen as a technologist and philanthropist, for us he was a much loved brother and uncle, and an exceptional friend.

Paul's family and friends were blessed to experience his wit, warmth, his generosity and deep concern. For all the demands on his schedule, there was always time for family and friends. At this time of loss and grief for us – and so many others – we are profoundly grateful for the care and concern he demonstrated every day.

Some of Allen's philanthropy has taken a scientific bent: Allen founded the Allen Institute for Brain Science in 2003, pouring $500 million into the non-profit that aims to give scientists the tools and data they need to probe how brain works. One recent project, the Allen Brain Observatory , provides an open-access "catalogue of activity in the mouse's brain," Saskia de Vries, senior scientist on the project, said in a video . That kind of data is key to piecing together how the brain processes information.

In an interview with Matthew Herper at Forbes , Allen called the brain "hideously complex" -- much more so than a computer. "As an ex-programmer I'm still just curious about how the brain functions, how that flow of information really happens," he said . After founding the brain science institute, Allen also founded the Allen Institute for Artificial Intelligence and the Allen Institute for Cell Science in 2014, as well as the Paul G. Allen Frontiers Group in 2016 , which funds cutting-edge research.

Even back in 2012, when Allen spoke with Herper at Forbes , he talked about plans for his financial legacy after his death -- and he said that a large part of it would be "allocated to this kind of work for the future."

In a statement emailed to The Verge, The Allen Institute's President and CEO Allan Jones said:

Paul's vision and insight have been an inspiration to me and to many others both here at the Institute that bears his name, and in the myriad of other areas that made up the fantastic universe of his interests. He will be sorely missed. We honor his legacy today, and every day into the long future of the Allen Institute, by carrying out our mission of tackling the hard problems in bioscience and making a significant difference in our respective fields.

According to Quincy Jones, Allen was also an excellent guitar player .

[Oct 15, 2018] Microsoft Co-Founder Paul Allen Dies of Cancer At Age 65 - Slashdot

Oct 15, 2018 |

bennet42 ( 1313459 ) , Monday October 15, 2018 @08:24PM ( #57483472 )

Re:RIP Paul! ( Score: 5 , Informative)

Man what a shock! I was lucky enough to be working at a Seattle startup that Paul bought back in the 90s ( doing VoIP SOHO phone systems ). He liked to swing by office on a regular basis as we were just a few blocks from Dicks hamburgers on Mercer St (his favorite). He was really an engineer's engineer. We'd give him a status report on how things were going and within a few minutes he was up at the white board spitballing technical solutions to ASIC or network problems. I especially remember him coming by the day he bought the Seahawks. Paul was a big physical presence ( 6'2" 250lbs in those days ), but he kept going on about how after meeting the Seahawks players, he never felt so physically small in his life. Ignore the internet trolls. Paul was a good guy. He was a humble, modest, down-to-earth guy. There was always a pick-up basketball game on his court on Thursday nights. Jam session over at his place were legendary ( I never got to play with him, but every musician that I know that played with him was impressed with his guitar playing ). He left a huge legacy in the pacific northwest. We'll miss you Paul!

Futurepower(R) ( 558542 ) writes: on Monday October 15, 2018 @06:56PM ( #57482948 ) Homepage
Bill Gates was so angry, Allen left the company. ( Score: 5 , Interesting)

The book Paul Allen wrote avoids a full report, but gives the impression that Bill Gates was so angry, Paul Allen left the company because interacting with Bill Gates was bad for his health.

Quotes from the book, Idea Man, by Paul Allen.

Page 49:

THREE DECADES AFTER teaching Bill and me at Lakeside, Fred Wright was asked what he'd thought about our success with Microsoft. His reply: "It was neat that they got along well enough that the company didn't explode in the first year or two."

Page 96:

When Bill pushed on licensing terms or bad-mouthed the flaky Signetics cards, Ed thought he was insubordinate. You could hear them yelling throughout the plant, and it was quite a spectacle-the burly ex-military officer standing toe to toe with the owlish prodigy about half his weight, neither giving an inch.

Page 177:

Bill was sarcastic, combative, defensive, and contemptuous.

Page 180:

"For Bill, the ground had already begun shifting. At product review meetings, his scathing critiques became a perverse badge of honor. One game was to count how many times Bill confronted a given manager; whoever got tagged for the most "stupidest things " won the contest. "I give my feedback," he grumbled to me, "and it doesn't go anywhere."

RubberDogBone ( 851604 ) , Monday October 15, 2018 @10:16PM ( #57483928 )
RIP Dr. Netvorkian ( Score: 2 )

Rest well, Mr. Allen.

He used to have the nickname "Doctor NetVorkian" because many of the things he invested in promptly tanked in one way or another after his investment. He had a lot of bad luck with his investments.

For those who don't understand the joke, a certain Dr. Kevorkian became notorious for helping ill patients commit suicide.

toadlife ( 301863 ) , Monday October 15, 2018 @06:29PM ( #57482740 ) Journal
Heyyyyy! ( Score: 5 , Funny)

Allen had nothing to do with systemd!

hey! ( 33014 ) writes:
Re: ( Score: 2 )

What a ray of sunshine you are.

CohibaVancouver ( 864662 ) , Monday October 15, 2018 @06:44PM ( #57482862 )
Re:Now burning in hell ( Score: 5 , Informative)

He is now burning in hell for Microsoft and Windows

Windows, Anonymous Coward? Allen left Microsoft in 1982. Windows 1.0 launched in 1985.

("The" Windows - Windows 3.1 - Didn't launch until 1992, a decade after Allen had left.)

BitterOak ( 537666 ) , Monday October 15, 2018 @06:56PM ( #57482940 )
Re:Now burning in hell ( Score: 5 , Insightful)
Microsoft created Windows and Allen co-founded Microsoft - he cannot wipe that blood off his hands!

But you can wipe Windows off your hard drive, so I don't get your point. Paul Allen was a great guy in many, many ways.

El Cubano ( 631386 ) writes:
Re: ( Score: 2 )
But you can wipe Windows off your hard drive, so I don't get your point. Paul Allen was a great guy in many, many ways.

Agreed. Even if you could "blame" him for all or part of Windows, he did start the Museum of Pop Culture. If you are ever in Seattle, it is a must see. I mean, they have what is probably the best Star Trek museum display anywhere (which is saying a lot since the Smithsonian has a very nice one as well), including most of the original series set pieces and I believe one of the only actual Enterprise models used for filming. In my mind, that gives him a great deal of geek cred. Plus, as I under

110010001000 ( 697113 ) writes:
Re: ( Score: 2 )

Well if he donated guitars and liked Star Trek then he must have been a good guy.

110010001000 ( 697113 ) , Monday October 15, 2018 @07:20PM ( #57483078 ) Homepage Journal
Re:And Then? ( Score: 1 )

You forgot he was a big Patent Troll. He won't be missed or remembered.

110010001000 ( 697113 ) , Monday October 15, 2018 @10:28PM ( #57483964 ) Homepage Journal
Re:And Then? ( Score: 2 )

I knew someone would say that. You are right. I won't. But he won't either. He was a patent troll. Oh but: RIP and thoughts and prayers, right? He was a great guy and will be missed.

[Sep 07, 2018] This is the Story of the 1970s Great Calculator Race

Sep 07, 2018 |

[Editor's note: all links in the story will lead you to Twitter] : In the 1970s the cost -- and size -- of calculators tumbled. Business tools became toys; as a result prestige tech companies had to rapidly diversify into other products -- or die! This is the story of the 1970s great calculator race... Compact electronic calculators had been around since the mid-1960s, although 'compact' was a relative term. They were serious, expensive tools for business . So it was quite a breakthrough in 1967 when Texas Instruments presented the Cal-Tech: a prototype battery powered 'pocket' calculator using four integrated circuits . It sparked a wave of interest. Canon was one of the first to launch a pocket calculator in 1970. The Pocketronic used Texas Instruments integrated circuits, with calculations printed on a roll of thermal paper. Sharp was also an early producer of pocket calculators. Unlike Canon they used integrated circuits from Rockwell and showed the calculation on a vacuum fluorescent display . The carrying handle was a nice touch!

The next year brought another big leap: the Hewlett-Packard HP-35 . Not only did it use a microprocessor, it was also the first scientific pocket calculator. Suddenly the slide rule was no longer king; the 35 buttons of the HP-35 had taken its crown. The most stylish pocket calculator was undoubtedly the Olivetti Divisumma 18 , designed by Mario Bellini. Its smooth look and soft shape has become something of a tech icon and an inspiration for many designers. It even featured in Space:1999! By 1974 Hewlett-Packard had created another first: the HP-65 programmable pocket calculator . Programmes were stored on magnetic cards slotted into the unit. It was even used during the Apollo-Soyuz space mission to make manual course corrections. The biggest problem for pocket calculators was the power drain: LED displays ate up batteries. As LCD displays gained popularity in the late 1970s the size of battery needed began to reduce . The 1972 Sinclair Executive had been the first pocket calculator to use small circular watch batteries , allowing the case to be very thin. Once LCD displays took off watch batteries increasingly became the norm for calculators. Solar power was the next innovation for the calculator: Teal introduced the Photon in 1977, no batteries required or supplied!

But the biggest shake-up of the emerging calculator market came in 1975, when Texas Instruments -- who made the chips for most calculator companies -- decided to produce and sell their own models. As a vertically integrated company Texas Instruments could make and sell calculators at a much lower price than its competitors . Commodore almost went out of business trying to compete: it was paying more for its TI chips than TI was selling an entire calculator for. With prices falling the pocket calculator quickly moved from business tool to gizmo : every pupil, every student, every office worker wanted one, especially when they discovered the digital fun they could have! Calculator games suddenly became a 'thing' , often combining a calculator with a deck of cards to create new games to play. Another popular pastime was finding numbers that spelt rude words if the calculator was turned upside down; the Samsung Secal even gave you a clue to one!
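The upside-down-word trick mentioned above can be sketched in a few lines of Python. The digit-to-letter mapping below reflects the common seven-segment letter shapes; the function name and examples are illustrative, not from the article.

```python
# Seven-segment digits that resemble letters when the display is inverted.
UPSIDE_DOWN = {"0": "O", "1": "I", "2": "Z", "3": "E", "4": "h",
               "5": "S", "6": "g", "7": "L", "8": "B", "9": "G"}

def upside_down(number: str) -> str:
    """Spell the word shown when the calculator is rotated 180 degrees:
    read the digits right-to-left, each mapped to its letter shape."""
    return "".join(UPSIDE_DOWN[d] for d in reversed(number))

print(upside_down("7734"))   # hELL
print(upside_down("35007"))  # LOOSE
```

Typing 7734 and flipping the calculator was a schoolroom classic of the LED era, which is exactly the kind of "digital fun" that helped turn the calculator from business tool into gizmo.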

The calculator was quickly evolving into a lifestyle accessory . Hewlett Packard launched the first calculator watch in 1977... Casio launched the first credit card sized calculator in 1978 , and by 1980 the pocket calculator and pocket computer were starting to merge. Peak calculator probably came in 1981, with Kraftwerk's Pocket Calculator released as a cassingle in a calculator-shaped box . Although the heyday of the pocket calculator may be over they are still quite collectable. Older models in good condition with the original packaging can command high prices online. So let's hear it for the pocket calculator: the future in the palm of your hand!

Anonymous Coward , Monday September 03, 2018 @10:39PM ( #57248568 )

HP were real engineers ( Score: 3 , Informative)

I have a HP-15C purchased in 1985 and it is still running on the original batteries - 32 years!
That is phenomenal low power design for the technology and knowledge at the time.

mmogilvi ( 685746 ) writes:
Re: ( Score: 3 , Interesting)

I replaced the batteries in my 15c for the first time a couple of years ago. And just to be clear, it has three small non-rechargeable button batteries, like you would find in a watch.

cyn1c77 ( 928549 ) , Monday September 03, 2018 @11:58PM ( #57248754 )
Re:HP were real engineers ( Score: 2 )
I have a HP-15C purchased in 1985 and it is still running on the original batteries - 32 years!
That is phenomenal low power design for the technology and knowledge at the time.

That's phenomenal even by today's design standards!

JBMcB ( 73720 ) , Monday September 03, 2018 @11:21PM ( #57248666 )
Olivetti Divisumma 18 ( Score: 3 )

My dad's friend was a gadget hound, and had one of these in the 80's. Not a great machine. The keys were weird and mushy. It had no electronic display. It only had a thermal printer that printed shiny dark gray numbers on shiny light gray paper. In other words, visibility was poor. It looked amazing, though, and you could spill a coke on it and the keys would still work.

Much more impressive but more utilitarian - he had a completely electro-mechanical rotary auto-dial telephone. It took small, hard plastic punch cards you'd put the number on. You'd push the card into a slot on the telephone, and it would feed the card in and out, generating pulses until it got to the number you punched out. Then it would pull the card back in and do it again for the next number until the whole number was dialed. No digital anything, just relays and motors.

fermion ( 181285 ) , Tuesday September 04, 2018 @01:53AM ( #57248998 ) Homepage Journal
chicken or the egg ( Score: 5 , Insightful)

In some ways, the electronic calculator market was created by TI and its need to sell the new IC. There were not many applications, and one marketable application was the electronic calculator. In some ways it was like Apple leveraging the microdrive for the iPod.

Like the iPod, the TI calculators were not great, but they were very easy to use. The HP calculators were and are beautiful. But ease of use won out.

Another factor: until about a decade ago, all TI calculators were very limited. This made them ideal machines for tests. HP calculators could do unit analysis, and since 1990 they had algebra systems and could even do calculus. This made them the ideal machine for technical students and professionals, but no high school would waste time teaching it because all they care about is filling out bubbles on an answer sheet.

The interesting contemporary issue that I see is that schools are still teaching calculators when really smart phones can do everything and more, especially with apps like Wolfram Alpha. Unless you are a legacy HP user, asking kids to buy a calculator just to boost TI profits seems very wasteful to me. This is going to change as more tests move to online format, and online resources such as Desmos take over from the physical calculator, but in the meantime the taxpayer is on the hook for millions of dollars a year per large school district just for legacy technology.

Dhericean ( 158757 ) , Tuesday September 04, 2018 @05:01AM ( #57249428 )
Japan and Calculators ( Score: 3 )

An interesting NHK World documentary about Japanese calculator culture and the history of calculators in Japan. I generally watch these at speed = 1.5.

Begin Japanology (13 June 2013) - Calculators

weilawei ( 897823 ) , Tuesday September 04, 2018 @05:17AM ( #57249468 ) Homepage
No TI-89 Fans Yet? ( Score: 2 )

I love my TI-89. I still use it daily. There's a lot to be said for multiple decades of practice on a calculator. Even the emulator of it on my phone, for when I don't have it handy, isn't the same.

It doesn't need to be particularly fast or do huge calculations--that's what programming something else is for. But nothing beats a good calculator for immediate results.

mknewman ( 557587 ) , Tuesday September 04, 2018 @09:31AM ( #57250228 )
TI started in 1972 not 1975 ( Score: 2 )

I had a Datamath in 1973, and a SR-57 programmable (100 steps, 10 memories) in 1975. Those were the days.

Ken Hall ( 40554 ) , Tuesday September 04, 2018 @09:50AM ( #57250316 )
TI Programmer ( Score: 2 )

First calculator that did octal and hex math (also binary). Got one when they came out, cost $50 in 1977. Still have it, still works, although the nicad battery died long ago. In a remarkable show of foresight, TI made the battery pack with a standard 9V battery connector, and provided a special battery door that let you replace the rechargeable battery with a normal 9V. I replaced it with a solar powered Casio that did a bunch more stuff, but the TI still works.

[Sep 01, 2018] Trump's NAFTA Deal Simply Can't Solve America's Manufacturing Problems

Notable quotes:
"... By Marshall Auerback, a market analyst and Research Associate at the Levy Institute. Cross-posted from Alternet . ..."
"... This article was produced by the Independent Media Institute . ..."
"... "Silicon Valley grew like a weed with essentially no Department of Defense investment." ..."
"... The key, I think, is the conscious policy to deliberately seed new technologies developed under military auspices into the civilian economy. The best recent example was the Moore School Lectures in July – August 1946, which can be identified as the precise point in time when the USA government, following Hamiltonian principles of nation building, acted to create an entire new industry, and an entirely new phase shift in the technological bases of the economy. ..."
"... I agree, please see this presentation titled The Secret History of Silicon Valley, which details the relationship in fair greater detail. It is not from a professional historian, but the depth of research and citations show a story that runs counter to the common myths. ..."
"... The CIA money is everywhere in the silly valley. ..."
Sep 01, 2018 |

Posted on September 1, 2018 by Lambert Strether

Lambert: This is an important post, especially as it pertains to our implicit and unacknowledged industrial policy, militarization.

By Marshall Auerback, a market analyst and Research Associate at the Levy Institute. Cross-posted from Alternet .

President Trump and his Mexican counterpart, Enrique Peña Nieto, recently announced resolution of major sticking points that have held up the overall renegotiation of the NAFTA Treaty (or whatever new name Trump confers on the expected trilateral agreement). At first glance, there are some marginal improvements on the existing treaty, especially in terms of higher local content sourcing, and the theoretic redirection of more "high wage" jobs back to the U.S.

These benefits are more apparent than real. The new and improved NAFTA deal won't mean much, even if Canada ultimately signs on. The deal amounts to reshuffling a few deck chairs on the Titanic that is American manufacturing in the 21st century: a sector that has been decimated by policies of globalization and offshoring.

Additionally, what has remained onshore is now affected adversely to an increasing degree by the Pentagon. The experience of companies that have become largely reliant on military-based demand is that they gradually lose the ability to compete in global markets.

As early as the 1980s, this insight was presciently confirmed by the late scholar Seymour Melman . Melman was one of the first to state the perhaps not-so-obvious fact that the huge amount of Department of Defense (DoD) Research and Development (R&D) pumped into the economy has actually stifled American civilian industry innovation and competitiveness, most notably in the very manufacturing sector that Trump is seeking to revitalize with these "reformed" trade deals.

The three biggest reasons are:

1. The huge diversion of national R&D investment into grossly overpriced and mostly unjustifiable DoD R&D programs has tremendously misallocated a large proportion of America's finest engineering talent toward unproductive pursuits (e.g., the tactical fighter fiascos, such as the F-35 Joint Strike Fighter that, among myriad other deficiencies, cannot fly within 25 miles of a thunderstorm; producing legacy systems that reflect outdated Cold War defense programs to deal with a massive national power, as opposed to combatting 21st-century terrorist counterinsurgencies). Indicative of this waste, former congressional aide Mike Lofgren quotes a University of Massachusetts study, illustrating that comparable civilian expenditures "would produce anywhere from 35 percent to 138 percent more jobs than spending the same amount on DoD [projects]." The NAFTA reforms won't change any of that.

2. By extension, the wasteful, cost-is-irrelevant habits of mind inculcated into otherwise competent engineers by lavish DoD cost-plus contracting have ruined these engineers for innovation in competitive, cost-is-crucial civilian industries.

3. The ludicrously bureaucratized management systems (systems analysis, systems engineering, five-year planning and on and on through a forest of acronyms) that DoD has so heavily propagandized and forced on contractors have, in symbiosis with the Harvard Business School/Wall Street mega-corporate managerial mindset, thoroughly wrecked efficient management of most sectors of American industry.

Let's drill down to the details of the pact, notably automobiles, which have comprised a big part of NAFTA. Under the new deal, 25 percent of auto content can still be produced outside North America, down from the 37.5 percent permitted before, a concession to the multinational nature of every major automobile manufacturer. Twenty-five percent is still a very large share of the high-end auto content, much of which is already manufactured in Europe, especially expensive parts like engines and transmissions for non-U.S. manufacturers, which won't be much affected by this deal.

Additionally, much of the non–North American auto content that can be or is being manufactured in Europe is the high end of the value-added chain. Certainly the workers producing the engines and transmissions have higher-than-$16-per-hour wage rates, which are trumpeted in the new agreement as proof that more "good jobs for working people" are being re-established by virtue of this deal. Since when is $16 per hour a Trumpian boon for U.S. auto workers? Objectively, $16 is only 27 percent above the 2018 Federal Poverty Threshold for a family with two kids; even worse, $16 is only 54 percent of today's actual average hourly pay ($29.60) in U.S. automobile manufacturing, according to 2018 BLS numbers.
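A quick back-of-the-envelope check of those figures (a sketch in Python; the $29.60 BLS average and the 27 percent margin are the article's numbers, and the implied poverty-line hourly wage is derived here for illustration, not an official figure):

```python
# Sanity-check the wage percentages quoted above.
new_wage_floor = 16.00   # the deal's trumpeted "high wage" threshold, $/hour
bls_avg_wage = 29.60     # 2018 BLS average hourly pay in U.S. auto manufacturing

# $16/hr as a share of the actual average auto-manufacturing wage
share_of_avg = new_wage_floor / bls_avg_wage
print(round(share_of_avg * 100))  # 54, matching the article's "54 percent"

# The article says $16 is 27 percent above the poverty threshold for a
# family with two kids; back out the hourly wage that claim implies.
implied_poverty_wage = new_wage_floor / 1.27
print(round(implied_poverty_wage, 2))  # 12.6, i.e. roughly $12.60/hour
```

Either way you slice it, the $16 floor sits well below what auto workers actually earn today.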

But beyond cars, here's the real problem: Although the ostensible goal of all of Trump's trade negotiations is to revitalize American manufacturing, the truth is that U.S. manufacturing basically suffered a catastrophic setback when China entered the World Trade Organization (WTO) back in 2001. Along with liberalized capital flows, the extensive resort to "offshoring" of manufacturing to China has sapped American manufacturing capabilities and engendered a skills shortage. This includes (to quote a recent Harvard Business Review study co-authored by Professors Gary Pisano and Willy Shih):

"[the] tool and die makers, maintenance technicians, operators capable of working with highly sophisticated computer-controlled equipment, skilled welders, and even production engineers [all of whom] are in short supply.

"The reasons for such shortages are easy to understand. As manufacturing plants closed or scaled back, many people in those occupations moved on to other things or retired. Seeing fewer job prospects down the road, young people opted for other careers. And many community and vocational schools, starved of students, scaled back their technical programs."

The one ready source of demand for U.S.-manufactured goods is the military. High-tech enthusiasts like to claim that the U.S. Defense Department had a crucial role in creating Silicon Valley . The truth is far more nuanced. Silicon Valley grew like a weed with essentially no Department of Defense investment; in fact, until quite recently, most successful Silicon Valley enterprises avoided DoD contracts like the plague.

A bit of history: the transistor was invented through entirely privately funded research at Bell Labs in 1947. Next, the first integrated circuit patent was filed by Werner Jacobi, a Siemens commercial engineer, in 1949 for application to hearing aids. The next advance, the idea of a silicon substrate, was patented in 1952 as a cheaper way of using transistors by Geoffrey Dummer, a reliability engineer working at a British government radar lab; for all of the talk of the defense establishment's role in developing high tech, ironically the British military showed no interest and Dummer couldn't secure the funding support to produce a working prototype. In 1958, Jack Kilby, a newbie at Texas Instruments (which had developed the first transistor radio as a commercial product in 1954), came up with the idea of multiple transistors on a germanium substrate. Almost simultaneously, in 1959, Robert Noyce at Fairchild Semiconductor patented a cheaper solution, a new approach to the silicon substrate, and implemented working prototypes. Both men envisioned mainly civilian applications (Kilby soon designed the first integrated circuit pocket calculator, a commercial success).

It is true that the first customers for both the Noyce and Kilby chips were the U.S. Air Force's B-70 and Minuteman I projects, which gave rise to the idea that the Pentagon played the key role in developing U.S. high-tech manufacturing, although it is worth noting that the electronics on both projects proved to be failures. Both companies soon developed major commercial applications for their integrated circuit innovations, Texas Instruments with considerably more success than Fairchild.

The Defense Advanced Research Projects Agency (aka "DARPA") is generally credited with inventing the internet. That's overblown. In fact, civilian computer science labs in the U.S., UK and France developed the idea of wide area networks in the 1950s. In the early '60s, DARPA started funding ARPANET design concepts to connect DoD laboratories and research facilities, initially without the idea of packet switching. Donald Davies at the UK's National Physical Laboratory first demonstrated practical packet switching in 1967 and had a full inter-lab network going by 1969. The first two nodes of the ARPANET were demonstrated in 1969 using a primitive centralized architecture and the Davies packet switching approach. In 1973, DARPA's Cerf and Kahn borrowed the idea of decentralized nodes from the French CYCLADES networking system, but this wasn't fully implemented as the TCP/IP protocol on the ARPANET until 1983. In 1985, the civilian National Science Foundation started funding the NSFNET, based on the ARPANET's TCP/IP protocol, for a much larger network of universities, supercomputer labs and research facilities. NSFNET started operations with a much larger backbone than ARPANET in 1986 (ARPANET itself was decommissioned in 1990) and started accepting limited commercial service providers in 1988, and, with further expansion and much-needed protocol upgrades, NSFNET morphed into the internet in 1995, at which time the NSFNET backbone was decommissioned.

Today, of course, the DoD is showering the largest Silicon Valley companies with multi-billions and begging them to help the U.S. military out of the hopeless mess it has made of its metastasizing computer, communications and software systems. Needless to say, if this DoD money becomes a significant portion of the income stream of Google, Microsoft, Apple, etc., it is safe to predict their decay and destruction at the hands of new innovators unencumbered by DoD funding, much as occurred with aviation companies such as Lockheed. NAFTA's reforms won't change that reality.

As a result of the militarization of what's left of U.S. manufacturing, along with the enlargement of the trans-Pacific supply chains with China (brought about through decades of offshoring), a mere tweak of the "new NAFTA" is unlikely to achieve Trump's objective of revitalizing America's industrial commons. With China's entry into the WTO, it is possible that U.S. manufacturing has hit a "point of no return," which mitigates the utility of regional trade deals as a means of reorienting the multinational production networks in a way that produces high-quality, high-paying jobs for American workers.

Offshoring and the increasing militarization of the American economy, then, have rendered reforms of the kind introduced by NAFTA almost moot. When it becomes more profitable to move factories overseas, or when it becomes more profitable, more quickly, to focus on finance instead of manufacturing, then your average red-blooded capitalist is going to happily engage in the deconstruction of the manufacturing sector (most, if not all, outsourced factories are profitable, just not as profitable as they might be in China or, for that matter, Mexico). The spoils in a market go to the fastest and strongest, and profits from finance, measured in nanoseconds, are gained more quickly than profits from factories, whose time frame is measured in years. Add to that the desire to weaken unions and the championing of an overvalued dollar by the financial industry, and you have a perfect storm of national decline and growing economic inequality.

Seen in this broader context, the new NAFTA-that's-not-called-NAFTA is a nothing-burger. The much-trumpeted reforms give crumbs to U.S. labor, in contrast to the clear-cut bonanza granted to corporate America under the GOP's new tax "reform" legislation. Hardly a great source of celebration as we approach the Labor Day weekend. In fact, by the time the big bucks corporate lawyer-lobbyists for other mega corporations have slipped in their little wording refinements exempting hundreds of billions of special interest dollars, the new NAFTA-that's-not-called-NAFTA will most likely include an equal screwing for textile, steel, energy sector, chemical, and other U.S. industry workers, and anything else that is left of the civilian economy.

This article was produced by the Independent Media Institute .

Mark Pontin , September 1, 2018 at 3:08 am

From Auerback's article: "Silicon Valley grew like a weed with essentially no Department of Defense investment."

Absolutely false.

In 1960, 100 percent of all integrated circuits produced by Silicon Valley -- therefore, IIRC, at that time 100 percent of all such chips produced in the world -- were bought by the U.S. Department of Defense. This was primarily driven by their use in the guidance systems of the gen-1 ICBMs then being developed under USAF General Bernard Schriever (which also supplied the rockets used for NASA's Redstone and Mercury programs), and by their use in NORAD and descendant early-warning radar networks.

As late as 1967, 75 percent of all integrated circuits from Silicon Valley were still bought by the Pentagon.

By then, Japanese companies like Matsushita, Sharp, etc. had licensed the technology, so Silicon Valley was no longer producing all the world's chip output. But still.

That aside, I think Auerback makes some very good points.

Anthony K Wikrent , September 1, 2018 at 8:09 am

While I accept Auerback's (and Melman's) point that the best engineering talent was diverted by military spending, it does not mean that military spending had an entirely negative effect on the ability to compete in other markets. What no one addresses is: why did USA computer and electronics industries become so successful, while those of Britain did not?

The key, I think, is the conscious policy to deliberately seed new technologies developed under military auspices into the civilian economy. The best recent example was the Moore School Lectures in July – August 1946, which can be identified as the precise point in time when the USA government, following Hamiltonian principles of nation building, acted to create an entire new industry, and an entirely new phase shift in the technological bases of the economy.

This followed historical examples such as the spread of modern metal working machine tools out of the national armories after the War of 1812; the role of the Army in surveying, planning, and constructing railroads in the first half of the 19th century; the role of the Navy in establishing scientific principles of steam engine design during and after the Civil War; and the role of the Navy in creating the profession of mechanical engineering after the Civil War.

The left, unfortunately, in its inexplicable hatred of Alexander Hamilton (such as the left's false argument that Hamilton wanted to create a political economy in which the rich stayed rich and in control -- easily disproven by comparing the descendants of USA wealthy elites of the early 1800s with the descendants of British wealthy elites of the early 1800s) cannot understand Hamiltonian principles of nation building because the left does not want to understand the difference between wealth and the creation of wealth. The left sees Hamilton's focus on creating new wealth (and, be it noted, Hamilton was explicit in arguing that wealth is created by improving the productive powers of labor) and misunderstands it as coddling of wealth.

Scott , September 1, 2018 at 8:40 am

I agree, please see this presentation titled The Secret History of Silicon Valley, which details the relationship in far greater detail. It is not from a professional historian, but the depth of research and citations show a story that runs counter to the common myths.

Marshall Auerback , September 1, 2018 at 9:04 am

But there had already been substantial private investment EARLIER than 1960. The DoD investment came later and initially was unsuccessful. I do later acknowledge that the Pentagon played A role, but not THE role, and that by and large the military's role has been unhelpful to American manufacturing (as opposed to the common myths to the contrary). Multiple events, multiple players, and multiple points of origin need to be mentioned in any sensible understanding of the emergence of Silicon Valley, starting with Bell Laboratories in 1947.

Carolinian , September 1, 2018 at 9:19 am

Thank you for this bit of pushback against anti-SV paranoia. And the above comment is only referring to one aspect of Silicon Valley whereas the industry as we now think about it is as much about personal computing and software as about hardware.

That doesn't make them good guys necessarily but the notion that the USG is running the whole show is surely wrong.

4corners , September 1, 2018 at 3:23 am

With China's entry into the WTO, it is possible that U.S. manufacturing has hit a "point of no return," which mitigates the utility of regional trade deals.

So, what then does Mr Auerback propose? Give up on trade deals altogether? Leave in place the unbalanced tariff structures that Trump claims to be addressing?

Re: autos, 25% imported content (by weight or what?!) seems like a significant move from 37.5%. But the author just dismisses that with a vague suggestion that the higher value-added products might constitute the 25%, and that it probably doesn't matter anyway.

The other half of the double feature, about inefficiency from DoD investments, is interesting, but what is the main criticism of DoD? Is it that contracting with the private sector is inherently inefficient, or is it more about how those programs have been managed? For example, the F-35 seems ill-conceived and poorly managed, but does that suggest DoD-private sector projects are inherently counterproductive?

anon48 , September 1, 2018 at 12:24 pm

"So, what then does Mr Auerback propose? Give up on trade deals altogether?"

That was my takeaway too. Mr Auerback, should we wait until you drop the other shoe?

Mark Pontin , September 1, 2018 at 3:30 am

I guess I should cite a source. An interesting one is Donald MacKenzie's Inventing Accuracy: A Historical Sociology of Nuclear Missile Guidance, MIT Press, 1992-93.

MacKenzie -- a professor of sociology at the University of Edinburgh, specializing in science and technology studies -- is also a good deep dive on financial engineering, the algorithms and the actual hardware of HFT, etc. A book of his from 2008, An Engine, Not a Camera: How Financial Models Shape Markets is absolutely worth reading.

The Rev Kev , September 1, 2018 at 3:42 am

Even working in the private sector can let engineers fall into bad habits. In at least one major corporation, electronic engineers were used to just selecting whatever materials they needed to finish an engineering design. That is, until the EU put a strict ban on the import of electronic equipment that contained certain metals and chemicals, because they were such bad pollutants. But the EU is far too big a market to ignore, so it looked like adjustments had to be made.

A talk was given by a consultant on how the new rules would work in practice. Straight away the engineers asked if a deferral was possible. No. Was it possible to get a waiver for the use of those materials? No. The consultant stated that these were the rules, and then pointed out that he was now on his way to China to give the identical talk to Chinese engineers. He also said that these rules would probably become de facto world standards. Wish that I had saved that article.

DF , September 1, 2018 at 10:35 am

Was this RoHS?

The Rev Kev , September 1, 2018 at 10:39 am

Don't know, but I only saw the original article talking about this about a year or two ago.

John Wright , September 1, 2018 at 11:54 am

This appears to be the original RoHS (Restriction of Hazardous Substances) EU directive.

This involved a lot of different chemicals, but of interest to me was the replacement of tin-lead solder in electronic equipment.

In the early 2000s my job involved migrating a number of shipping tin-lead (Sn-Pb) printed circuit assemblies to Pb-free designs to meet RoHS standards.

A lot of changes were necessary, as the new Pb-free (tin-silver-copper) solder melted at around 223°C while the previous tin-lead solder melted at 183°C.

Components all had to take the higher temperature, and this caused a slew of problems. Usually the bare board had to be changed to a different material that could withstand the higher temp.
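The thermal squeeze described above can be sketched numerically. The melting points are the ones cited in this comment; the roughly 30°C peak-reflow margin above the melting point is an assumed typical value for illustration, not a quoted figure:

```python
# Rough sketch of why the RoHS solder switch forced component requalification.
SNPB_MELT_C = 183    # eutectic tin-lead solder melting point, per the comment
SAC_MELT_C = 223     # Pb-free tin-silver-copper melting point, per the comment
SUPERHEAT_C = 30     # assumed peak-reflow margin above the melting point

peak_snpb = SNPB_MELT_C + SUPERHEAT_C   # ~213 C peak for a Sn-Pb reflow profile
peak_sac = SAC_MELT_C + SUPERHEAT_C     # ~253 C peak for a Pb-free profile

# Every component and the bare laminate must now survive ~40 C more heat,
# which is why parts and board materials had to be requalified or replaced.
delta = peak_sac - peak_snpb
print(delta)  # 40
```

The exact reflow profile varies by assembly, but the ~40°C shift is what drove the board-material and component changes described above.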

The EU ended up being somewhat flexible in the RoHS implementation, as I know that they allowed a number of exemptions for some high-Pb-content (>= 85% Pb) solder where there was no good substitute (for example, high-temperature solders for devices that run hot, such as transducers and loudspeakers, and internal power IC connections).


As I remember, medical equipment and US military equipment were not required to adhere to ROHS.

It was not a simple task to meet the new ROHS standards, so I can understand that some engineers wanted a deferral.

But it provided employment for me.

David , September 1, 2018 at 1:10 pm

When a system is designed to last 20+ years with high reliability and availability requirements, techniques and methods that are used to design and build this year's iPhone aren't always applicable.

The lead in the solder helps prevent the growth of "tin whiskers", which can cause electronic systems to fail over time. One usually gets a RoHS waiver for the part(s) for longer life systems.

But what happens in five or ten years when the part(s) need to be replaced? Will one be able to find that leaded part? Try to buy a new 5-year-old iPhone.

Will that part be a genuinely new part or one that was salvaged, and possibly damaged, from another piece of equipment? How would one know if the part is damaged? It looks okay. It functions in the lab, but will it function, as expected, in the real world? For how long?

What's the risk of putting a similar but different part in the system? Could people be harmed if the part fails or doesn't function as expected?

This is systems engineering. Some wave it off as "ludicrously bureaucratized management systems," but it is important. They may prefer the commercial approach of letting the public find faults in designs, but that's how people get run over by autonomous vehicles.

Unfortunately, systems engineering doesn't directly contribute to the quarterly bottom line, so it is usually underfunded or neglected. It only becomes important when, after all the designers have moved on, the system can't be used because it can't be repaired or something dangerous happens because the system was maintained with parts that were "good enough", but weren't.

What this has to do with NAFTA, I don't know.

AbateMagicThinking But Not Money , September 1, 2018 at 5:22 am

Perhaps covert regime change may be the only way (instead of tariffs):

The medicine required will probably involve hard-faced interviews with all the top CEOs (involving rubber gloves at the private airports as they come back from their jaunts abroad). Who better than special ops for carrying out such persuasion? It will have to be done on the QT, under stricter-than-usual national security secrecy, so that the 'persuaded' won't lose face.

Look out for an appropriate and revolutionary "refocusing initiative" announced by all the top people in the private sector as an indication that the threat of the rubber glove has worked.


Disturbed Voter , September 1, 2018 at 5:41 am

Utopians will make any society into a no-place. You can't run a society solely on consumerism; it has to also run on MIC expenditures, because of excess productive capacity, as was shown in WWII and the Cold War. The other trend is that we don't want to pay anyone with a college degree more than the minimum wage. Per management, engineers make too much salary.

jo6pac , September 1, 2018 at 8:30 am

The CIA money is everywhere in the silly valley.

The article is a good read.

Felix_47 , September 1, 2018 at 8:48 am

I work in this area with the government. This is exactly what I see. It is a deep-seated cancer on our nation. What can be done to change this? I always thought: cut the military budget. But then the only jobs Americans can get (if they're vets with security clearances) would vanish. Does anyone have any solutions?

ambrit , September 1, 2018 at 10:24 am

Declare the Infrastructure Renovation Program to be "In the Interests of National Security," cut the Military Industrial Complex in for a piece of the action, and start hiring. After all, General Dynamics got some serious contracts to run the call centres for Obamacare. The template is there.

Jeff , September 1, 2018 at 10:48 am

That's disgusting. Giving call center contracts to defense firms! And likely for a much bigger mark-up than might have been given to call center management companies.

Newton Finn , September 1, 2018 at 10:43 am

As Edward Bellamy envisioned in the late 19th Century, one possible "solution" would be to transition the military model into serving benevolent civilian uses. Imagine if our now-monstrous MIC was tasked not with protecting elite wealth and destroying other nations, but rather with repairing environmental damage and restoring environmental health, along with addressing other essential needs of people and planet. Change the purpose pursued, and a great evil can become a great good.

Wukchumni , September 1, 2018 at 10:55 am

We largely did that in the early 1970's, as angst against the Vietnam War was at a fever pitch and the ecological movement was ascendant.

It took a while for the effects to be felt, but smog in L.A. went from horrendous to manageable within a decade of auto emissions being regulated. Lakes and waterways across the land were cleaned up as well, to the benefit of everybody.

Bellamy was quite the visionary; so much of what he described in Looking Backward actually happened.

blennylips , September 1, 2018 at 1:38 pm

> now-monstrous MIC

Second Hanford radioactive tunnel collapse expected. And it could be more severe

By Annette Cary
August 28, 2018 07:14 PM

John Zelnicker , September 1, 2018 at 9:37 am

Lambert – Did you intend to post only about a third of Auerbach's article? That's all I see.

MichaeLeroy , September 1, 2018 at 9:48 am

One of my gigs was working as a scientist/engineer for a fledgling Minnesota-based medical device company. The company's R&D team originally consisted of experienced medical device development professionals, and we made progress on the product. Management secured a round of funding from Silicon Valley investors. One of the strings attached to that investment was that Silicon Valley expertise be brought into the R&D effort. There was a huge culture clash between the Minnesota and California teams, as the Silicon Valley engineers tried to apply their military-systems perspective to the development of a device meant for use in humans. From our perspective, the solutions proposed by the weapons guys were dangerous and impractical. However, management considered the Silicon Valley expertise golden and consistently ruled in their favor. This was the early 2000s, so none of the Minnesota R&D team had difficulty moving on to other projects, which is what we did. The result of the Silicon Valley turn was a several-fold increase in cash burn, loss of medical device engineering knowledge, and ultimately no product.

DF , September 1, 2018 at 10:33 am

Another view is that defense procurement is one of the few things that's keeping a lot of US manufacturing and technical expertise alive at all. I went to one of the only electronic parts stores (i.e., a place that sells stuff like ICs, capacitors, resistors, etc.) in Albuquerque, NM a few months ago. The person at the cash register told me that nearly all of their business comes from Sandia National Laboratories, which is the big nuclear weapons laboratory in ABQ.

John Wright , September 1, 2018 at 12:09 pm

I suspect there is a lot more USA technical expertise out there than the small number of surviving retail electronics parts stores indicates. On-line companies such as Digi-Key and Mouser have very large and diverse component selections and quick delivery to retail customers.

Go to Digi-Key's site: a blank parts search shows 8,380,317 items.

John , September 1, 2018 at 10:40 am

NC has an Aug 21, 2015 piece, "How Complex Systems Fail," that seems appropriate here. Death is an unavoidable part of the great cycle. Get used to it.

Mel , September 1, 2018 at 11:31 am

How Complex Systems Fail: PDF here.

[Jun 01, 2018] Xerox Alto Designer Charles Thacker, Co-Inventor Of Ethernet, Dies at 74

Notable quotes:
"... Dealers of Lightning ..."
Jun 18, 2017 |
Charles Thacker, one of the lead hardware designers on the Xerox Alto, the first modern personal computer, died after a brief illness on Monday. He was 74.

The Alto, which was released in 1973 but was never a commercial success, was an incredibly influential machine. Ahead of its time, it boasted resizeable windows as part of its graphical user interface, along with a mouse, Ethernet, and numerous other technologies that didn't become standard until years later. (Last year, Y Combinator acquired one and began restoring it.)

"Chuck" Thacker was born in Pasadena, California, in 1943. He first attended the California Institute of Technology in his hometown but later transferred to the University of California, Berkeley in 1967. While in northern California, Thacker began to abandon his academic pursuit of physics and dove deeper into computer hardware design, where he joined Project Genie , an influential computer research group. By the end of the decade, several members, including Thacker, became the core of Xerox's Palo Alto Research Center (PARC) computer research group, where they developed the Alto.

In a 2010 interview, Thacker recalled :

We knew [the Alto] was revolutionary. We built it with the very first semiconductor dynamic RAM, the Intel 1103, which was the first memory you could buy that was less than a 10th of a cent a bit. As a result, we realized we could build a display that was qualitatively better than what we had at the time. We had character generator terminals, and some of them were quite nice. But they were limited in various ways, whereas the Alto had the property that anything you could represent on paper, you could put on the screen. We knew that was going to be a big deal.

At the age of 24, Apple co-founder Steve Jobs famously visited Xerox's Palo Alto Research Center (PARC) and saw an Alto firsthand in 1979. That episode is often credited with being a huge inspiration for the eventual release of the Macintosh five years later. (Like Jobs, Thacker also founded a company in his 20s: in 1969, the short-lived Berkeley Computer Company was born.)

Michael Hiltzik, a journalist who wrote an entire book on the history of Xerox PARC called Dealers of Lightning , said that Thacker was a "key designer."

Hiltzik told Ars:

He was the quintessential hardware guy at a time when designing the hardware was a key aspect of the task. He was a master at designing logic boards when that was the guts of the computer. This was before the silicon revolution. He did all that, and he had built a couple of computers even before the Alto.

Later in his career, Thacker joined Microsoft in 1997 to help establish the company's lab in Cambridge, England. Two years after that, he designed the hardware for Microsoft's Tablet PC, which was first conceived of by his PARC colleague Alan Kay during the early 1970s.

In 2009, Thacker received the Association for Computing Machinery's prestigious A.M. Turing Award. According to Thomas Haigh, a computer historian and professor at the University of Wisconsin, Milwaukee, it is "very rare" for a "system builder in industry," as opposed to a theorist or an academic, to be given this honor.

Haigh wrote the following in an e-mail to Ars:

Alto is the direct ancestor of today's personal computers. It provided the model: GUI, windows, high-resolution screen, Ethernet, mouse, etc. that the computer industry spent the next 15 years catching up to. Of course others like Alan Kay and Butler Lampson spent years evolving the software side of the platform, but without Thacker's creation of what was, by the standards of the early 1970s, an amazingly powerful personal hardware platform, none of that other work would have been possible.

In the same 2010 interview, Thacker had some simple advice for young computer scientists: "Try to be broad. Learn more math, learn more physics."

Posted by EditorDavid on Saturday June 17, 2017 @01:46PM

An anonymous reader quotes Ars Technica: Charles Thacker, one of the lead hardware designers on the Xerox Alto, the first modern personal computer, died after a brief illness on Monday. He was 74. The Alto, which was released in 1973 but was never a commercial success, was an incredibly influential machine...

Thomas Haigh, a computer historian and professor at the University of Wisconsin, Milwaukee, wrote in an email to Ars, "Alto is the direct ancestor of today's personal computers. It provided the model: GUI, windows, high-resolution screen, Ethernet, mouse, etc. that the computer industry spent the next 15 years catching up to. Of course others like Alan Kay and Butler Lampson spent years evolving the software side of the platform, but without Thacker's creation of what was, by the standards of the early 1970s, an amazingly powerful personal hardware platform, none of that other work would have been possible."

In 1999 Thacker also designed the hardware for Microsoft's Tablet PC, "which was first conceived of by his PARC colleague Alan Kay during the early 1970s," according to the article. "I've found over my career that it's been very difficult to predict the future," Thacker said in a guest lecture in 2013.

"People who tried to do it generally wind up being wrong."

woboyle (1044168), Saturday June 17, 2017 @03:13PM (#54639641)

Ethernet and Robert Metcalfe (Score: 5, Interesting)

The co-inventor of Ethernet at PARC, Robert Metcalfe, has been a friend of mine for 35 years. My sympathies to Thacker's family for their loss. I never knew him, although I may have met him in the early 1980s in Silicon Valley. As a commercial computer sales rep in the Valley back then, I sold Robert the first 100 IBM PCs for his startup, 3Com. When I was an engineer in Boston in the late 1980s and early 1990s, we would meet for dinner before IEEE meetings.

[Mar 27, 2018] The Quest To Find the Longest-Serving Programmer

Notable quotes:
"... the National Museum of Computing ..."
Mar 27, 2018 |

The National Museum of Computing published a blog post in which it tried to find the person who has been programming the longest. At the time, it declared Bill Williams, a 70-year-old, to be one of the world's most durable programmers; he claimed to have started coding for a living in 1969 and was still doing so at the time of publication. The post has been updated several times over the years, and over the weekend, the TNMC updated it once again. The newest contender is Terry Froggatt of Hampshire, who writes: I can beat the claim of your 71-year-old by a couple of years (although I can't compete with the likes of David Hartley). I wrote my first program for the Elliott 903 in September 1966. Now at the age of 73 I am still writing programs for the Elliott 903! I've just written a 903 program to calculate the Fibonacci numbers. And I've written quite a lot of programs in the years in between, some for the 903 but also a good many in Ada.
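The Fibonacci exercise Froggatt mentions is a classic first (and, evidently, seventy-third) program; a minimal sketch in Python, for comparison (the Elliott 903 original would of course look nothing like this):

```python
def fibonacci(n):
    """Return the first n Fibonacci numbers, starting from 0."""
    seq, a, b = [], 0, 1
    for _ in range(n):
        seq.append(a)
        a, b = b, a + b
    return seq

print(fibonacci(10))  # → [0, 1, 1, 2, 3, 5, 8, 13, 21, 34]
```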

[Oct 14, 2017] On December 18, 2017 Perl turns 30 by Ruth Holloway

Notable quotes:
"... there is more than one way to do it ..."
"... Version 5.10 of Perl was released on the 20th anniversary of Perl 1.0: December 18, 2007. Version 5.10 marks the start of the "Modern Perl" movement. ..."
Oct 14, 2017 |

Larry Wall released Perl 1.0 to the comp.sources.misc Usenet newsgroup on December 18, 1987. In the nearly 30 years since then, both the language and the community of enthusiasts that sprang up around it have grown and thrived -- and they continue to do so, despite suggestions to the contrary!

Wall's fundamental assertion -- there is more than one way to do it -- continues to resonate with developers. Perl allows programmers to embody the three chief virtues of a programmer: laziness, impatience, and hubris. Perl was originally designed for utility, not beauty. Perl is a programming language for fixing things, for quick hacks, and for making complicated things possible partly through the power of community. This was a conscious decision on Larry Wall's part: In an interview in 1999, he posed the question, "When's the last time you used duct tape on a duct?"

A history lesson

... ... ...

The Perl community

... ... ...

... ... ...

As Perl turns 30, the community that emerged around Larry Wall's solution to sticky system administration problems continues to grow and thrive. New developers enter the community all the time, and substantial new work is being done to modernize the language and keep it useful for solving a new generation of problems. Interested? Find your local Perl Mongers group, or join us online, or attend a Perl Conference near you!

Ruth Holloway - Ruth Holloway has been a system administrator and software developer for a long, long time, getting her professional start on a VAX 11/780, way back when. She spent a lot of her career (so far) serving the technology needs of libraries, and has been a contributor since 2008 to the Koha open source library automation suite. Ruth is currently a Perl Developer at cPanel in Houston, and also serves as chief of staff for an obnoxious cat. In her copious free time, she occasionally reviews old romance...

[Sep 21, 2017] Bill Gates wishes ctrl-alt-del was one button

Sep 21, 2017 |

Microsoft co-founder Bill Gates has pledged to give more than half of his enormous wealth to philanthropy and helped make the personal computer a reality, but he cannot live down the control-alt-delete keyboard function.

On Wednesday, the billionaire admitted the three keystrokes PC users must press to log on to their computers or interrupt a program could have been just one button.

David Rubenstein, co-founder and co-CEO of private equity firm The Carlyle Group, raised the issue Wednesday during a panel discussion at the Bloomberg Global Business Forum in New York City.

"You are the person who came up with the idea of doing it that way," Rubenstein said in this Bloomberg video of the discussion. "Why did you do that?"

The question drew laughs from the audience and from Gates' fellow panelists. Gates took a long pause before answering.

"Clearly, the people involved, they should have put another key on in order to make that work," he said. "I'm not sure you can go back and change small things in your life without putting the other things at risk. Sure, if I can make one small edit, I'd make that a single key operation." It wasn't the first time Gates addressed control-alt-delete. Rubenstein asked him about it four years ago during an interview at Harvard University.

"We could have had a single button," Gates said then. "The guy who did the IBM keyboard design didn't want to give us our single button... It was a mistake."

[Sep 17, 2017] The last 25 years (or so) were years of tremendous progress in computers and networking that changed the human civilization

Notable quotes:
"... To emulate those capabilities on computers will probably require another 100 years or more. Selective functions can be imitated even now (manipulator that deals with blocks in a pyramid was created in 70th or early 80th I think, but capabilities of human "eye controlled arm" is still far, far beyond even wildest dreams of AI. ..."
"... Similarly human intellect is completely different from AI. At the current level the difference is probably 1000 times larger then the difference between a child with Down syndrome and a normal person. ..."
"... Human brain is actually a machine that creates languages for specific domain (or acquire them via learning) and then is able to operate in terms of those languages. Human child forced to grow up with animals, including wild animals, learns and is able to use "animal language." At least to a certain extent. Some of such children managed to survive in this environment. ..."
"... If you are bilingual, try Google translate on this post. You might be impressed by their recent progress in this field. It did improved considerably and now does not cause instant laugh. ..."
"... One interesting observation that I have is that automation is not always improve functioning of the organization. It can be quite opposite :-). Only the costs are cut, and even that is not always true. ..."
"... Of course the last 25 years (or so) were years of tremendous progress in computers and networking that changed the human civilization. And it is unclear whether we reached the limit of current capabilities or not in certain areas (in CPU speeds and die shrinking we probably did; I do not expect anything significant below 7 nanometers: ). ..."
May 28, 2017 |

libezkova, May 27, 2017 at 10:53 PM

"When combined with our brains, human fingers are amazingly fine manipulation devices."

Not only fingers. The whole human arm is an amazing device. Pure magic, if you ask me.

To emulate those capabilities on computers will probably require another 100 years or more. Selective functions can be imitated even now (a manipulator that dealt with blocks in a pyramid was created in the '70s or early '80s, I think), but the capabilities of the human "eye-controlled arm" are still far, far beyond even the wildest dreams of AI.

Similarly, human intellect is completely different from AI. At the current level the difference is probably 1000 times larger than the difference between a child with Down syndrome and a normal person.

The human brain is actually a machine that creates languages for specific domains (or acquires them via learning) and then is able to operate in terms of those languages. A human child forced to grow up with animals, including wild animals, learns and is able to use "animal language," at least to a certain extent. Some such children managed to survive in this environment.

Such cruel natural experiments have shown that the level of flexibility of human brain is something really incredible. And IMHO can not be achieved by computers (although never say never).

Here we are talking about tasks that are a million times more complex than playing Go or chess, or driving a car on the street.

My impression is that most recent AI successes (especially IBM's win in Jeopardy, which probably was partially staged) are by and large due to the growth of storage and the number of cores of computers, not so much the sophistication of the algorithms used.

The limits of AI are clearly visible when we see the quality of translation from one language to another. For more or less complex technical text it remains medium to low. As in "requires human editing".

If you are bilingual, try Google Translate on this post. You might be impressed by their recent progress in this field. It has improved considerably and no longer causes instant laughter.

Same thing with speech recognition. The progress is tremendous, especially in the last three to five years. But it is still far from perfect. Now, with some training, programs like Dragon are quite usable as dictation devices on, say, a PC with a 4-core 3GHz CPU and 16 GB of memory (especially if you are a native English speaker), but if you deal with specialized text or have a strong accent, they still leave much to be desired (although your level of knowledge of the program, experience and persistence can improve the results considerably).

One interesting observation that I have is that automation does not always improve the functioning of the organization. It can be quite the opposite :-). Only the costs are cut, and even that is not always true.

Of course the last 25 years (or so) were years of tremendous progress in computers and networking that changed the human civilization. And it is unclear whether we reached the limit of current capabilities or not in certain areas (in CPU speeds and die shrinking we probably did; I do not expect anything significant below 7 nanometers).

[Sep 16, 2017] Judge Dismisses Inventor of Email Lawsuit Against Techdirt

Sep 16, 2017 |


Posted by msmash on Thursday September 07, 2017

A federal judge in Massachusetts has dismissed a libel lawsuit filed earlier this year against tech news website Techdirt. From a report: The claim was brought by Shiva Ayyadurai, who has controversially claimed that he invented e-mail in the late 1970s. Techdirt (and its founder and CEO, Mike Masnick) has been a longtime critic of Ayyadurai and institutions that have bought into his claims.

"How The Guy Who Didn't Invent Email Got Memorialized In The Press & The Smithsonian As The Inventor Of Email," reads one Techdirt headline from 2012. One of Techdirt's commenters dubbed Ayyadurai a "liar" and a "charlatan," which partially fueled Ayyadurai's January 2017 libel lawsuit.

In the Wednesday ruling, US District Judge F. Dennis Saylor found that because it is impossible to define precisely and specifically what e-mail is, Ayyadurai's "claim is incapable of being proved true or false."

[Jul 25, 2017] Adobe to pull plug on Flash, ending an era

Notable quotes:
"... (Reporting by Salvador Rodriguez; additional reporting by Stephen Nellis; editing by Jonathan Weber and Lisa Shumaker) ..."
Jul 25, 2017 |

Adobe Systems Inc's Flash, a once-ubiquitous technology used to power most of the media content found online, will be retired at the end of 2020, the software company announced on Tuesday.

Adobe, along with partners Apple Inc., Microsoft Corp., Alphabet Inc.'s Google, Facebook Inc. and Mozilla Corp., said support for Flash will ramp down across the internet in phases over the next three years.

After 2020, Adobe will stop releasing updates for Flash and web browsers will no longer support it. The companies are encouraging developers to migrate their software onto modern programming standards.

"Few technologies have had such a profound and positive impact in the internet era," said Govind Balakrishnan, vice president of product development for Adobe Creative Cloud.

Created more than 20 years ago, Flash was once the preferred software used by developers to create games, video players and applications capable of running on multiple web browsers. When Adobe acquired Flash in its 2005 purchase of Macromedia, the technology was on more than 98 percent of personal computers connected to the web, Macromedia said at the time.

But Flash's popularity began to wane after Apple's decision not to support it on the iPhone.

In a public letter in 2010, late Apple CEO Steve Jobs criticized Flash's reliability, security and performance. Since then, other technologies, like HTML5, have emerged as alternatives to Flash.

In the past year, several web browsers have begun to require users to enable Flash before running it.

On Google's Chrome, the most popular web browser, Flash's usage has already fallen drastically. In 2014, Flash was used each day by 80 percent of desktop users. That number is now at 17 percent "and continues to decline," Google said in a blog on Tuesday.

"This trend reveals that sites are migrating to open web technologies, which are faster and more power-efficient than Flash," Google said. "They're also more secure."

Flash, however, remains in use among some online gamers. Adobe said it will work with Facebook as well as Unity Technologies and Epic Games to help developers migrate their games.

Adobe said it does not expect Flash's sunset to have an impact on its bottom line. "In fact, we think the opportunity for Adobe is greater in a post-Flash world," Balakrishnan said.

(Reporting by Salvador Rodriguez; additional reporting by Stephen Nellis; editing by Jonathan Weber and Lisa Shumaker)

[Jul 11, 2017] 48-Year-Old Multics Operating System Resurrected

Jul 09, 2017 |

"The seminal operating system Multics has been reborn," writes Slashdot reader doon386:

The last native Multics system was shut down in 2000. After more than a dozen years in hibernation a simulator for the Honeywell DPS-8/M CPU was finally realized and, consequently, Multics found new life... Along with the simulator an accompanying new release of Multics -- MR12.6 -- has been created and made available. MR12.6 contains many bug and Y2K fixes and allows Multics to run in a post-Y2K, internet-enabled world. Besides supporting dates in the 21st century, it offers mail and send_message functionality, and can even simulate tape and disk I/O. (And yes, someone has already installed Multics on a Raspberry Pi.)

Version 1.0 of the simulator was released Saturday, and is offering a complete QuickStart installation package with software, compilers, install scripts, and several initial projects (including SysDaemon, SysAdmin, and Daemon).

Plus there are also useful Wiki documents about how to get started, noting that Multics emulation runs on Linux, macOS, Windows, and Raspbian systems. The original submission points out that "This revival of Multics allows hobbyists, researchers and students the chance to experience first hand the system that inspired UNIX."

(142825), Sunday July 09, 2017 @01:47AM (#54772267) Homepage

I used it at MIT in the early 80s. (Score: 4, Informative)

I was a project administrator on Multics for my students at MIT. It was a little too powerful for students, but I was able to lock it down. Once I had access to the source code for the basic subsystem (in PL/1) I was able to make it much easier to use. But it was still command line based.

A command line, emails, and troff. Who needed anything else?

Gravis Zero (934156), Sunday July 09, 2017 @02:10AM (#54772329)
It's not the end! (Score: 4, Interesting)

Considering that processor was likely made with the three micrometer lithographic process, it's quite possible to make the processor in a homemade lab using maskless lithography. Hell, you could even make it NMOS if you wanted. So yeah, emulation isn't the end, it's just another waypoint in bringing old technology back to life.

Tom (822), Sunday July 09, 2017 @04:16AM (#54772487) Homepage Journal
Multics (Score: 5, Interesting)
The original submission points out that "This revival of Multics allows hobbyists, researchers and students the chance to experience first hand the system that inspired UNIX."

More importantly: To take some of the things that Multics did better and port them to Unix-like systems. Much of the secure system design, for example, was dumped from early Unix systems and was then later glued back on in pieces.

nuckfuts (690967), Sunday July 09, 2017 @02:00PM (#54774035)
Influence on Unix (Score: 4, Informative)

From here...

The design and features of Multics greatly influenced the Unix operating system, which was originally written by two Multics programmers, Ken Thompson and Dennis Ritchie. Superficial influence of Multics on Unix is evident in many areas, including the naming of some commands. But the internal design philosophy was quite different, focusing on keeping the system small and simple, and so correcting some deficiencies of Multics because of its high resource demands on the limited computer hardware of the time.

The name Unix (originally Unics) is itself a pun on Multics. The U in Unix is rumored to stand for uniplexed as opposed to the multiplexed of Multics, further underscoring the designers' rejections of Multics' complexity in favor of a more straightforward and workable approach for smaller computers. (Garfinkel and Abelson[18] cite an alternative origin: Peter Neumann at Bell Labs, watching a demonstration of the prototype, suggested the name/pun UNICS (pronounced "Eunuchs"), as a "castrated Multics", although Dennis Ritchie is claimed to have denied this.)

Ken Thompson, in a transcribed 2007 interview with Peter Seibel[20], refers to Multics as "...overdesigned and overbuilt and over everything. It was close to unusable. They (i.e., Massachusetts Institute of Technology) still claim it's a monstrous success, but it just clearly wasn't." He admits, however, that "the things that I liked enough (about Multics) to actually take were the hierarchical file system and the shell, a separate process that you can replace with some other process."

Shirley Marquez (1753714), Monday July 10, 2017 @12:44PM (#54779281) Homepage
A hugely influential failure (Score: 2)

The biggest problem with Multics was GE/Honeywell/Bull, the succession of companies that made the computers that it ran on. None of them were much good at either building or marketing mainframe computers.

So yes, Multics was a commercial failure; the number of Multics systems that were sold was small. But in terms of moving the computing and OS state of the art forward, it was a huge success. Many important concepts were invented or popularized by Multics, including memory mapped file I/O, multi-level file system hierarchies, and hardware protection rings. Security was a major focus in the design of Multics, which led to it being adopted by the military and other security-conscious customers.

[May 29, 2017] Economist's View The Future of Work Automation and Labor Inclusive AI Technology and Policy for a Diverse Human Future

May 29, 2017 |
libezkova, May 28, 2017 at 06:13 PM
"computers merely use syntactic rules to manipulate symbol strings, but have no understanding of meaning or semantics. "

It is actually more complex than that. Here I again would like to point to a very simple program called Eliza, created around 1966, which was one of the first to fake the Turing test. It really deceived humans into believing it was a Rogerian psychotherapist.

It was the first to show that "understanding" in a very deep sense is based on the ability to manipulate symbols in some abstract language, and that the inability of humans to tell that this is not another human being (the essence of the Turing test) is intrinsically connected with our notion of "understanding". In this sense, a program that is able to pass the Turing test in a particular domain "understands" this domain.

For example, when you ask your smartphone voice assistant "What is the weather in San Francisco today?" the answer demonstrates clear understanding of what you are asking for. Your smartphone clearly "understands" you.

On the other hand, when you submit a text for translation to Google and get the result, it clearly shows that in this domain the computer is unable to pass the Turing test. Although if we limit the domain to a very narrow area, I think the situation improves.

Similarly the ability of a computer to compete with humans in chess in a very deep sense means that such a specialized computer "understands chess" (or any other game) despite the fact that mechanisms using which humans come to the next move and computer comes to the next move are different.

So it is the instrumental aspect of understanding that matters most.

What is currently lacking is what we mean by the word "creativity" -- the ability to create a new "artificial language" for a domain (humans invented chess).

At the same time, the ability to perform manipulations of symbols complex enough to let a program pass the Turing test remains a very good proxy for understanding.

So the question is only about the complexity and power of those manipulations with the symbols of the language. In other words at some point quantity turns into quality.

[Apr 26, 2017] Today in Tech – 1960

Notable quotes:
"... IBM announced their plans for their Stretch supercomputer, the IBM 7030. ..."
"... Though the IBM 7030 was not considered a success, it spawned technologies which were incorporated in many, more successful machines. ..."
Apr 26, 2017 |

On this day in 1960 IBM announced their plans for their Stretch supercomputer, the IBM 7030. In an upbeat release the company stated that the Stretch would be the world's fastest and most powerful computer of the time, outperforming the IBM 704 by 60 to 100 times. The actual performance of the IBM 7030 ended up disappointing, at only about 30 times that of the IBM 704. This caused considerable embarrassment for IBM and a significant price cut of the Stretch from $13.5 million to $7.78 million.

Though the IBM 7030 was not considered a success, it spawned technologies which were incorporated in many, more successful machines.

[Apr 19, 2017] Today in Tech – 1957

Notable quotes:
"... by April 19, 1957 the first FORTRAN program was run (apart from internal IBM testing) at Westinghouse, producing a missing comma diagnostic, followed by a successful attempt. ..."
Apr 19, 2017 |

On this day in 1957, the first FORTRAN program was run.

Back in 1953, IBM computer scientist John W. Backus proposed to his superiors that a more practical language be developed for programming their IBM 704 mainframe computer. He, along with a team of programmers, later invented FORTRAN, a "high-level" programming language that greatly simplified program writing. And by April 19, 1957, the first FORTRAN program was run (apart from internal IBM testing) at Westinghouse, producing a missing comma diagnostic, followed by a successful attempt.

[Apr 15, 2017] The first researcher to show that the Turing test can be faked was Joseph Weizenbaum from MIT.

Apr 15, 2017 |
anne -> anne... , April 14, 2017 at 11:47 AM

April 9, 2014

The Chinese Room Argument

The argument and thought-experiment now generally known as the Chinese Room Argument was first published in a paper * in 1980 by American philosopher John Searle (1932- ). It has become one of the best-known arguments in recent philosophy. Searle imagines himself alone in a room following a computer program for responding to Chinese characters slipped under the door. Searle understands nothing of Chinese, and yet, by following the program for manipulating symbols and numerals just as a computer does, he produces appropriate strings of Chinese characters that fool those outside into thinking there is a Chinese speaker in the room. The narrow conclusion of the argument is that programming a digital computer may make it appear to understand language but does not produce real understanding. Hence the "Turing Test" ** is inadequate. Searle argues that the thought experiment underscores the fact that computers merely use syntactic rules to manipulate symbol strings, but have no understanding of meaning or semantics. The broader conclusion of the argument is that the theory that human minds are computer-like computational or information processing systems is refuted. Instead minds must result from biological processes; computers can at best simulate these biological processes. Thus the argument has large implications for semantics, philosophy of language and mind, theories of consciousness, computer science and cognitive science generally. As a result, there have been many critical replies to the argument.



The Turing Test is a test, developed by Alan Turing in 1950, of a machine's ability to exhibit intelligent behaviour equivalent to, or indistinguishable from, that of a human.

libezkova -> anne... , April 14, 2017 at 07:22 PM

This guy is 50 years late

The first researcher who showed that the Turing test can be faked was Joseph Weizenbaum from MIT.

In 1964-1966 he wrote his famous ELIZA program, which simulated a Rogerian psychotherapist and used rules, dictated in the script, to respond with non-directional questions to user inputs.

As such, ELIZA was one of the first chatterbots, but was also regarded as one of the first programs capable of passing the Turing Test.
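The script-driven design described above is remarkably simple to approximate; a minimal sketch in Python (the rules here are invented for illustration and are not Weizenbaum's original script):

```python
import re

# Each rule pairs a regex pattern with a response template.
# Captured groups are substituted into the template, turning the
# user's statement back into a non-directional question.
RULES = [
    (r"i need (.*)", "Why do you need {0}?"),
    (r"i am (.*)", "How long have you been {0}?"),
    (r".*\bmother\b.*", "Tell me more about your family."),
]
DEFAULT = "Please go on."

def respond(text):
    """Return an ELIZA-style reply by trying each rule in order."""
    cleaned = text.strip().rstrip(".")
    for pattern, template in RULES:
        m = re.fullmatch(pattern, cleaned, re.IGNORECASE)
        if m:
            return template.format(*m.groups())
    return DEFAULT

print(respond("I need a vacation"))  # → Why do you need a vacation?
print(respond("It rained today"))    # → Please go on.
```

The entire "understanding" lives in the script: add more rules and the illusion deepens, which is exactly why the program fooled its users so effectively.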

In general humans are unique in the ability to construct new languages and rules of manipulating objects in those languages.

Computers so far can only manipulate objects in a given language using preprogrammed rules, although with neural-network recognition of visual images it is more complex than that.

Amazing achievements of computers in chess and natural language recognition, including the spoken one, are mostly due to huge storage and multiprocessing. A regular desktop for $8K-$10K can now have 64 cores (2 x 32) and 128 GB or even 1TB of memory. This was a supercomputer just 15 years ago.

In chess, computers are also able to work in parallel, deterministically exploring positions from a given one to a larger number of "half-moves" than humans do. That's why computers became competitive with humans in chess, and Deep Blue managed to beat Kasparov in 1997. That has nothing to do with the ability to think.
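The exhaustive half-move (ply) exploration described here is the classic minimax search; a toy sketch, assuming an invented three-branch game tree rather than any real chess engine:

```python
def minimax(state, depth, maximizing, moves, evaluate, apply_move):
    """Score `state` by exhaustively exploring `depth` half-moves (plies)."""
    ms = moves(state)
    if depth == 0 or not ms:
        return evaluate(state)
    values = [minimax(apply_move(state, m), depth - 1, not maximizing,
                      moves, evaluate, apply_move) for m in ms]
    return max(values) if maximizing else min(values)

# Toy game: internal nodes are lists of child positions; leaves are scores.
tree = [[3, 5], [2, 9], [0, 1]]
moves = lambda s: s if isinstance(s, list) else []
evaluate = lambda s: s            # leaves are numeric scores
apply_move = lambda s, m: m       # "making a move" selects a child

best = minimax(tree, 2, True, moves, evaluate, apply_move)
print(best)  # → 3: the maximizer picks the branch whose worst case is best
```

Real engines add enormous storage (opening books, transposition tables) and massive parallelism on top of this skeleton, which is the commenter's point: the gain comes from scale, not from thinking.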

Winning in Jeopardy is a more involved and impressive feat, but I believe IBM cheated.

After Louis Gerstner Jr. destroyed it, IBM became a completely financially driven company, capable of performing all kinds of dirty tricks.

But to outsider such programs are definitely looking "intelligent".

Clarke's third law states: "Any sufficiently advanced technology is indistinguishable from magic."

That's what we are currently observing with computers.

Note: BTW the quality of translation from one language to another remains so dismal that any bilingual human can do better than the best computer program. Although recent experiments with recognizing spoken language, translating it into another language and verbalizing it in the second language are nothing short of amazing.

[Apr 06, 2017] It's 30 years ago: IBM's final battle with reality by Andrew Orlowski

Notable quotes:
"... OS/2 ran fine on PC clones and ISA boards. The link between PS/2 and OS/2 was entirely spurious. ..."
"... After all, OS/2 did TCP/IP long before Windows had a stable IP stack supported by the manufacturer (yes, yes, there was Trumpet WinSock and Chameleon TCP/IP but they were not from Microsoft and MS refused to support them). ..."
"... If they'd put decent mainframe developer tools onto a personal workstation they'd have had a core market to build on. ..."
"... No product manager was going to risk the ongoing investment in mainframe software by allowing personal computers to connect other than as the dumbest of terminals. It's ironic that IBM used the word "System" so much: it never really understood systems, just product families.. ..."
"... Way before then Microsoft had UNIX System 7 running and released as XENIX for the 8086/80286 architecture. ..."
Apr 04, 2017 |
Thirty years ago this month, IBM declared war on reality – and lost. For the 30 years prior to that April day in 1987, nobody had dared to challenge IBM's dominance of the computer industry. It would take almost a decade for the defeat to be complete, but when it came, it was emphatic.

In April of '87, IBM announced both a new PC architecture, PS/2, and a new operating system to run on the boxes, OS/2. "Real" business computers would also run IBM networking software and IBM SQL database software, all bundled into an "Extended Edition" of the operating system. There was only one real way of computing, IBM declared, and it had a /2 on the end. This was signalled by the job titles, the PC business was called "Entry Systems": bikes with training wheels.

While IBM itself has subsequently survived and prospered, it's a very different company today. It subsequently divested its PCs, printers and microprocessor divisions (along with much else) and one wonders how much different it would be today if it hadn't devoted a decade and thousands of staff to trying to bring the PC industry back under its control. Ironically, Lenovo is the biggest PC company again – but by following the rules set by Microsoft and Intel.

Analysts, eh?

OS/2 is an oft-told story, not least here at the Reg, where we gave an account of the OS wars on the 25th anniversary of OS/2. Dominic Connor also shared lots of evocative detail about the IBM culture at the time here (Part One) and here (Part Two). So there's no need to do so again.

But every time history is written, it's from a different vantage point – a different context. It no longer seems a joke to suggest, as IBM CEO Thomas J Watson probably never did, that the world needs only five computers. It's a quote that adorned many .sig files in the early days of email. Hah Hah. How stupid could a tech CEO be?

Well today, just a handful of cloud platforms are threatening to dominate both consumer and business computing, as big data analytics and AI (we're told) will only really work at scale. And only a few companies (Amazon, Alphabet, Microsoft) will have that data. So how many computers will the world really have? Not counting the Javascript interpreter in your pocket or your living room, that clicks on invisible advertisements in the always-connected world, that has been relegated to a runtime.

So if IBM had fought a different war, how might the world look?

The tl;dr

If you're not familiar with the significance of OS/2 and PS/2 and don't want to read thousands of words, here's a capsule summary. The setter of standards for the past three decades, IBM had responded to the microprocessor revolution by allowing its PC to be easily "cloneable" and run a toy third-party operating system. It didn't matter too much to senior IBM management at first: PCs weren't really computers, so few people would buy them. Business computing would be done on real multiuser systems. That was the norm at the time. But the runaway sales of the PC clones worried IBM and in 1984, just three years after the launch of the IBM PC, Big Blue plotted how to regain control. The result was a nostalgic, backward-looking vision that underestimated the degree to which computing standards would increasingly be set by the open market, not by IBM.

The PS/2 series of PCs had world-beating futuristic industrial design, based around clip-together plastic parts that made the beige tins of the time look instantly dated. And PS/2 would have been fine if it hadn't been for the notorious proprietary bus: Micro Channel Architecture.

This plug and play bus was much better than the ISA standards of its day, and good enough for IBM to be using in workstations and even mainframes. However, it was not compatible with the (fairly crude) expansion boards of the day required to do networking or graphics; MCA cards were twice the price of comparable cards; and hybrid PCs were difficult to build. But the killer was that IBM demanded a licence fee from OEMs and sent its lawyers after MCA clones. The result was years of uncertainty. Seven years later, Apple was still making fun of how hard it was to get expansion cards to work in PCs. The real bedevilment of IBM's PCs was the tech titan's cost structure, which made it more expensive than the competition, at least without a heavy corporate discount.

But people don't get emotional about PS/2s the way they got emotional about OS/2 – which, even to a former user, is pretty strange, given how much grief it gave you. The tl;dr of the OS/2 story is that IBM announced a highly advanced operating system for PCs, but it was five years (1992) before it shipped a version that was demonstrably better for the average user. (In fact, it was almost a year before anything shipped at all.)

Since operating systems aren't an end in themselves, but merely a means to an end, a means of running something that alleviates your grunt work (like dBase or Lotus 1-2-3 at the time), the advantages of OS/2 were pretty elusive.

And, even in 1992, "better" meant managing old apps and files better – and the action in that "space" was taking place on Windows.

Because the industry was so unused to believing it could actually set standards, for a long time it didn't. This nervousness in the wake of PS/2 and OS/2 caused a kind of winter. People just didn't bother upgrading – well, why would you? You had to wait and see what the standards would be.

So Microsoft put next to no effort into updating DOS (or even Windows) for the next two years. Application vendors continued to update their applications, but these remained character-mode, and the lack of a standard for addressing extended memory also added to the inertia. Remember that OS/2 had been written for the 80286 chip, introduced in 1984; the 80386 added new modes for protecting memory and virtualising sessions, but without software to take advantage of them, weak demand ensured 386 PCs remained very expensive.

I described IBM's vision of computing as nostalgic and backward-looking, with PCs limited to office paperwork while real computing took place on (IBM) servers. But what if IBM had dared skip a generation and tried something really daring?

Alternative histories

At the time, there just weren't many options.

A week before IBM's PS/2 and OS/2 roadmap was unveiled, Digital Research Inc shipped a multitasking DOS for 286 computers – Concurrent DOS 286 – to OEMs. This emulated the original IBM PC chip in software, and offered a huge leap in multitasking over Microsoft DOS. DRI also had a graphical shell, GEM. Why not dump Microsoft for DRI, which had made the industry-standard OS when IBM set about building its first PC? It was really about control. IBM had excellent engineers and many PhDs; it could make it itself. IBM had little appetite to give Gary Kildall, who understood the microcomputer business much better than IBM, another sniff.

As it turned out, DRI struggled, along with everyone else, to make Concurrent DOS work reliably on the 80286 chip, particularly when it came to networking. The PC was a wild west, and reliable compatibility really needed the hardware virtualisation of the 80386 chip.

The obvious alternative was rapidly maturing: Unix. The trade press had been busily hyping "open systems" for years. In fact, it's little remembered now that the PS/2 came with a version of IBM's Unix: the Advanced Interactive Executive, or AIX, for the PS/2. "The multiuser, multitasking virtual memory operating system will be a subset of the Unix-like AIX operating system for the RT PC", as InfoWorld reported.

(Note that a Unix would be "Unix-like" forever after.)

But IBM didn't like Unix, and didn't get serious about selling it until Sun, with the rest of the industry in hot pursuit, was eating into its business. IBM didn't like distributed architectures that were too distributed, and for IBM, Sun's emphasis on "the network is the computer" put the emphasis in completely the wrong place. The "Open Systems" promise was that the CIO would mix and match their IT – and that was the last thing IBM wanted.

And there were more immediate, practical difficulties with leaping a generation of technology and yoking the IBM PC to Unix long term. The obstacle wasn't ease of use: later, usability became the stick used to beat Unix vendors with, but at the time DOS and Unix were equally hard to use. Backward compatibility was the main issue: Unix systems wouldn't run the PC applications of the day. It seemed far more plausible that IBM could persuade the big PC application vendors of the day – like Lotus and Ashton-Tate – to migrate to OS/2 than to bring their Unix ports to IBM's version of Unix. Far less risky for IBM too.

With the benefit of hindsight, the third option would have been far more attractive: why didn't IBM just buy Apple? It wasn't IBM-compatible, but most of IBM's kit wasn't IBM-compatible, either. (Hence one of those grand IBM unifying strategies of the time: "System Application Architecture".)

As it turned out, IBM and Apple would work closely together, but only out of desperation, once it was clear that Windows was a runaway success and both had lost the developers to Microsoft. Much of the first half of the 1990s saw several thousand IBM and Apple developers working on ambitious joint projects that never bore fruit: Taligent, Workplace OS, and much else.

Apple had a working Intel port of MacOS... in 1994. IBM and Apple had even agreed in principle to merge in 1995, only for Apple CEO Michael Spindler to get cold feet at the signing. He wanted more money from the buyer (the Apple Way).

Still, it's fascinating to speculate what a popular, consumer-friendly OS might have done if IBM had been prepared to license Apple's Macintosh OS aggressively, so it supplanted DOS as the industry standard. Many vendors had a foot in both camps at the time, so it really would have been a case of investing in existing initiatives, rather than starting from scratch. Without big IBM's support, Microsoft would have been relegated to a tools company with a nifty spreadsheet, possibly eventually divesting both. I wonder who would have bought Excel?

Alas, nothing was further from the thoughts of IBM's management at the time, obsessed with taming the industry and bringing it back under central control. What they'd always done had always worked. Why change?

Thomas J Watson may never have said the world will have five computers – it is almost certainly a myth. But it's striking that, as the world moves to three (or four) computing platforms, IBM isn't one of them. ®

1 day Anonymous South African Coward

Re: Interesting times

On the other hand, my experience with OS/2 on a 33MHz 386SX with 4Mb RAM was excellent. First 2.1, then Warp, then Merlin... never had any crashes or funny things. Just wished the dang PC would go faster.

DooM played better than under DOS and some games just loved the extra memory I was able to give to them with the DOS settings under OS/2...

Good days, good memories.

1 day Anonymous Coward

Re: Interesting times

Agreed. I had it on a 386sx 25 (2MB RAM) and then a 486dx2 66 (8MB RAM); it ran videos, games and Windows apps faster than Windows and DOS did. It was a good OS, shame it didn't do better than it did. I was using it up until a couple of years ago, as some older machines where I work used it. But that wasn't a pleasant task, keeping it integrated with the current environment and making workarounds to keep it 'compliant'.

1 day AndrueC

Re: Interesting times

I remember Sunday afternoons playing Geoff Crammond's Formula One Grandprix in a VDM while downloading messages from CompuServe using the multi-threaded Golden Compass. My first experience of real multi-tasking on a PC.

I also used it to develop DOS applications as the crash protection meant that I only had to reopen the VDM, not reboot the machine.

1 day ChrisC

Re: Interesting times

I have fond memories of Warp too - back then I was doing some research work on robotic equations of motion, which had eventually evolved into a hideously complex Matlab script to do all the hard work for me. I'd just define the system geometry parameters at the start, click Go, twiddle my thumbs for an hour or so, and then get a complete set of optimised motion equations out the other end.

Unfortunately this was all being done in the Win3.1 version of Matlab, and as bad as the co-operative multitasking was in 3.1 generally, it was a shining beacon of excellence compared to how it behaved once Matlab started up - I'm pretty sure the Matlab devteam must have misread the Windows documentation and thought it featured "un-cooperative multitasking", because once you let Matlab loose on a script it was game over as far as being able to do anything else on that PC was concerned.

As a hardcore Amiga user at the time, I knew that multitasking didn't have to be this godawful, and I was convinced that the PC I had in front of me, which at the time had roughly twice the raw processing power of the fastest Amiga in my collection, really ought to be able to multitask at least as well as the slowest Amiga in my collection...

I can't recall how I stumbled upon OS/2 as the solution, all I do remember is that having learned of its existence and its claimed abilities to do stuff that Windows could only dream of doing, I dashed into town and bought my own copy of Warp, and once I got over the hurdle of getting it installed as a multi-boot setup with my existing fine-tuned DOS/Win3.1 setup (having expended god knows how many hours tweaking it to run all my games nicely - yes, even those that expected to have almost all of the base memory available, but still also needed to have CDROM *and* mouse drivers shoe-horned in there somewhere too - I didn't want to mess that up) I fired it up, installed Matlab, and tentatively clicked Go... Umm, is it running? This can't be right, the OS is still perfectly responsive, I can launch other Win3.1 applications without any signs of hesitation, and yet my Matlab script really does claim to be churning its way through its calculations about as quickly as it did hogging Win3.1 all to itself.

From that day on, Warp became my go-to OS for anything work-related until the day I finally ditched Win3.1 and made the switch to 95.

So yes, count me in as another one of those people who, despite the problems OS/2 had (I'll readily admit that it could be a bit flakey or just a bit obtuse when trying to get it to do what you wanted it to do) will still quite happily wax lyrical about just how bloody amazing it was in comparison to a DOS/Win16 based setup for anyone wanting to unlock the true potential of the hardware in front of them. Even today I still don't think the Windows dev team *really* understand how multitasking ought to behave, and I do wonder just how much productivity is lost globally due to those annoying random slowdowns and temporary hangs which remain part and parcel of everyday life as a Windows user, despite the underlying hardware being orders of magnitude more powerful than anything we could dream of having sat on our desks back in the 90's.

1 day LDS

Re: Interesting times

OS/2 had a single application message queue, or something like it, and a non-responding application could block the whole queue. 3.0 ran DOS applications very well, and most Windows applications worked well, but some had issues - e.g. Delphi 1.0. But despite IBM asserting it could also support Win32 applications, it never did, and from 1995 onwards the world was quickly migrating to 32-bit applications - and native OS/2 applications were too few and sparse.

But OS/2 was just a part of the PS/2 equation - PS/2 and MCA boards were really too expensive compared to clones - and once Compaq and others started delivering good-enough machines, it was too late to close the stable door. Neither DRI/GEM nor Apple could have turned the tide - it was a matter of customers' money.

Back then PCs and their software were still expensive (much more expensive than today), and for many it was a quite demanding investment - asking even more like IBM did, was just cutting out many, many customers who could only afford clones - all of them running MS-DOS and then Windows.

16 hrs Planty

Re: Interesting times

Interesting times again. Microsoft are now in IBM's shoes, facing irrelevance as nobody really cares about their products anymore. Microsoft have been running around throwing out all sorts of random things, hoping something will stick. Nothing stuck, everything sucked.

15 hrs Richard Plinston

Re: Interesting times

> OS/2 had a single applications message queue,

It was Windows 1 through 3.11 that "had a single applications message queue".

"""Preemptive multitasking has always been supported by Windows NT (all versions), OS/2 (native applications) , Unix and Unix-like systems (such as Linux, BSD and macOS), VMS, OS/360, and many other operating systems designed for use in the academic and medium-to-large business markets."""

> But despite IBM asserting it could also support Win32 applications, it never did,

OS/2 Windows 3.x did support Win32s applications by loading in the win32s module, just like Windows 3.1 could do. However, Microsoft added a completely spurious access to virtual memory beyond the 2Gbyte limit of OS/2 (Windows supported 4Gbyte accesses) just to stop OS/2 using that beyond a particular version. Microsoft then required the new version in their software.

Exactly what you would expect from Microsoft, and still should.

> But OS/2 was just a part of the PS/2 equation - PS/2 and MCA boards were really too expensive compared to clones

OS/2 ran fine on PC clones and ISA boards. The link between PS/2 and OS/2 was entirely spurious.

6 hrs AndrueC

Re: Interesting times

I think you might be confusing two things there. Having pre-emptive multi-tasking doesn't preclude having a single message queue. The two things are only tangentially related.

Multi-tasking is the ability to run multiple processes (or multiple threads) by switching between them. Pre-emptive multitasking means that the OS can force a task switch; cooperative multitasking means that each process has to yield control back to the OS. OS/2 was indeed one of the earliest (possibly the earliest) PC OSes that was preemptive. Windows was only cooperative until 9x and NT.

But nothing about being multi-tasking requires that the OS even support message queues. Early versions of Unix offered pre-emptive multitasking but, in the absence of X Windows, probably didn't have any message queues. Arguably an OS never needs them: message queues are usually a higher-level construct, typically implemented in the GUI framework.
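The cooperative/preemptive distinction above can be shown with a toy example. This is an illustrative sketch only (Python generators standing in for tasks; nothing here is OS/2 or Windows code): in a cooperative scheduler each task must hand control back voluntarily, so a task that never yields – the Matlab scenario described earlier in the thread – starves everything else, whereas a preemptive kernel would force the switch.

```python
# Toy cooperative scheduler: each task runs until it *chooses* to yield.
# A task that never yields would monopolise the "CPU" - the Win3.x
# failure mode; preemptive kernels avoid this by forcing the switch.

def task(name, steps, log):
    """A cooperative task: records one unit of work, then yields."""
    for i in range(steps):
        log.append(f"{name}{i}")
        yield  # voluntarily hand control back to the scheduler

def run(tasks):
    """Round-robin scheduler: cycle through tasks until all finish."""
    ready = list(tasks)
    while ready:
        t = ready.pop(0)
        try:
            next(t)           # let the task run one step
            ready.append(t)   # it yielded; re-queue it
        except StopIteration:
            pass              # task finished, drop it

log = []
run([task("A", 2, log), task("B", 2, log)])
print(log)  # ['A0', 'B0', 'A1', 'B1'] - the two tasks interleave
```

Delete the `yield` and task A runs to completion before B ever gets a step – which is all "cooperative" really means.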

And, sadly, it is indeed true that early versions of the Workplace Shell (the OS/2 default GUI) had a single message queue. IBM's recommendation was that developers implement their own queue sinks: every application would have a dedicated thread that did nothing but accept messages and store them in a queue. The main application thread(s) would then empty this 'personal' queue at their leisure. I'm not sure why they wanted this design - maybe because then it was the application's responsibility to manage message storage? Sadly (and not surprisingly) most developers couldn't be arsed. As a result the WPS could often lock up. Now the OS itself wasn't locked - other processes would keep running just fine. If you were lucky enough to have a full screen VDM open you wouldn't even notice until you tried to go back to the WPS. When it happened to us my colleague and I used to ask the other one to Telnet in to our boxes and kill the main WPS thread to get things going again.
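The queue-sink arrangement described above can be sketched as follows. This is a hypothetical illustration in Python (all names invented; these are not OS/2 APIs): a dedicated thread does nothing but drain the shared system queue into the application's own buffer, so even a busy main thread never backs up the system-wide queue.

```python
import queue
import threading
import time

system_queue = queue.Queue()   # stand-in for the single system-wide queue
private_queue = queue.Queue()  # the application's own message buffer

def queue_sink():
    """Dedicated thread: drain the system queue immediately, do no real work."""
    while True:
        msg = system_queue.get()
        if msg is None:            # shutdown sentinel
            break
        private_queue.put(msg)     # buffer it for the main thread

threading.Thread(target=queue_sink, daemon=True).start()

for i in range(3):                 # the "OS" posts messages
    system_queue.put(f"msg{i}")
system_queue.put(None)

time.sleep(0.1)                    # main thread busy with real work...
handled = []
while not private_queue.empty():   # ...then empties its buffer at leisure
    handled.append(private_queue.get())
print(handled)  # ['msg0', 'msg1', 'msg2']
```

The point the comment makes is that this buffering was pushed onto every application developer; when a developer skipped it, one stalled application backed up the shared queue for everyone.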

One of the big features that OS/2 Warp finally brought was multiple message queues. Sadly, by then it was too late. It's a shame because I did like the WPS. Its object-oriented nature was great. Windows has never offered that. In the WPS an icon is an object that knows where it should appear. In Windows it's just an icon that Explorer chooses to render in a particular location. Right-click it and Explorer tries to work out what to put on the menu. Do the same in WPS and the icon creates its own menu.

OOP GUIs are powerful things.

Re: Interesting times

Let's try not to forget that Microsoft only got the PC-DOS gig after pitching Microsoft's port of AT&T Unix (Xenix) for IBM's Personal Computer. It is also worth mentioning that Windows was initially developed on Xenix and ported to DOS after initial testing.

Had it not been for IBM, the world would be.... pretty much as it is now

5 hrs cmaurand

Re: Interesting times

Actually, it did support 32-bit applications. Microsoft kept changing things in the way Windows handled 32-bit applications; IBM adjusted, then Microsoft came out with Win32s, IBM adjusted, Microsoft changed it again (something about the way Windows does things in memory) and IBM gave up on trying to keep up with Microsoft's changes.

4 hrs AndrueC

Re: Interesting times

Actually, it did support 32 bit applications.

That Windows integration piece in OS/2 was cool, I thought. Pretty much seamless and you could choose how seamless it was - full screen or pretend to be an application on the desktop. The bit where it could link into an existing installation was just brilliant. Licensing payments to Microsoft for Windows? Not today, thanks :D

But then the entire VDM subsystem was cool. A true 'Virtual DOS Machine' rather than the Windows poor cousin. You could even boot up different versions of DOS. I seem to recall one of their bug fixes was to the audio subsystem to support a golf simulator that triggered tens of thousands of interrupts per second. From what I vaguely recall they said they removed the need on the card side and faked the results inside the VDM. The VDM team must have gone to extraordinary lengths in order to create what we'd probably now call a virtual machine.

2 hrs Mine's a Large One

Re: Interesting times

"Now the OS itself wasn't locked - other processes would keep running just fine. If you were lucky enough to have a full screen VDM open you wouldn't even notice until you tried to go back to the WPS. When it happened to us my colleague and I used to ask the other one to Telnet in to our boxes and kill the main WPS thread to get things going again."

We had a couple of apps which used to do this occasionally on our servers and the first we'd know about it would be when we tried to do anything at the server, something we didn't often do - everything had been merrily working away in the background but with a single app having frozen the WPS.

2 hrs Faux Science Slayer

Rockefeller front IBM replaced by Rockefeller front MicroSoft

Edison did not invent electric light, Alex Bell stole his telephone patent, Marconi STOLE Tesla radio patents and Glenn Curtiss stole a dozen Wright Brothers patents. All were set up fronts for Rockefeller monopolies.

"JFK to 9/11, Everything is a Rich Man's Trick" on YouTube....end feudalism in 2017

18 mins CrazyOldCatMan

Re: Interesting times

my experience with OS/2 on a 33MHz 386SX with 4Mb RAM was excellent

It was (even more so than DOS/Windows) sensitive to the hardware - if you had anything slightly out then OS/2 would do lots of mysterious things..

I used it, right up to the end when IBM dropped it. Along with linux (my first linux machine was a 386sx25 with a 330MB ESDI drive - was originally OS/2 with an IDE 80MB drive but OS/2 simply wouldn't load when the ESDI drive interface card was inserted. So I used it for linux instead).

After all, OS/2 did TCP/IP long before Windows had a stable IP stack supported by the manufacturer (yes, yes, there was Trumpet WinSock and Chameleon TCP/IP but they were not from Microsoft and MS refused to support them).

1 day schifreen

Too much credit

You seem to conclude that the problem lay with IBM's poor strategy. You give the company way too much credit. They didn't really have a strategy at all and, if they did, no one knew what it was. Or where they would be working once the latest departmental reorg was complete.

The problem with OS/2 is that it was a solution to a problem that hardly anyone actually had. IBM knew this. Ask senior IBM people at the time (as I did) exactly what problem OS/2 was designed to solve, and why the world needed to ditch the free OS that came with their PC in order to install something that, frankly, never installed properly anyway, and they singularly failed to come up with a sensible answer. Or any answer at all.

1 day Little Mouse

Re: Too much credit

I'd almost forgotten how pervasive the image of IBM was back in those days. I joined the PC party in the days of the 286, and IIRC PCs only really fell into two camps - either IBM or IBM clones.

Amazing that they could get from that position of dominance to this.

1 day I am the liquor

Re: Too much credit

I don't think it's true that OS/2 was a solution to a problem that never existed. Perhaps it was too early. But by the mid 90s when customers did understand the problem, and Microsoft was answering it with Windows NT, surely that should have meant that OS/2 was mature and ready to take advantage.

OS/2 Warp was a better design in many ways than NT 3.51, but IBM's head start didn't help them win the race, and now we all run Windows NT on our PCs.

1 day Warm Braw

Re: A solution to a problem that hardly anyone actually had

It could have been a solution to a problem that a lot of people had. If you've ever had the misfortune to write software for IBM mainframes and been stuck with TSO or SPF (or even VM/CMS, which is only better by comparison with the alternatives), you'd have given your eye teeth for a less developer-hostile environment. If they'd put decent mainframe developer tools onto a personal workstation they'd have had a core market to build on.

But this wasn't the IBM way: accounts were structured along product lines, and no mainframe systems salesman was going to let a PC salesman or a System/34 salesman onto his territory if he thought it might cannibalise his commission. No product manager was going to risk the ongoing investment in mainframe software by allowing personal computers to connect other than as the dumbest of terminals. It's ironic that IBM used the word "System" so much: it never really understood systems, just product families.

1 day Vic

Re: Too much credit

You give the company way too much credit. They didn't really have a strategy at all


I was working at an IBM Systems Centre when the PS/2 launched. We had to go to customers to tell them how wonderful this MCA thing was - without having any knowledge of whether or not it was any better than ISA.

And that was a shame, really - MCA *was* better. But no-one found out until it was way too late. And the Model/30 didn't have MCA anyway...

1 day LDS

Re: Too much credit

Many people started to have problems with the single-application model DOS had - it's no surprise that one of Borland's first big hits was Sidekick, which made TSR (Terminate and Stay Resident) applications common.

Still, everything had to work in the 1MB of memory available in real mode; extended/expanded memory couldn't be used to run code from. DOS extenders could later run bigger applications, but still only one at a time.

Windows was well accepted not only because it was a GUI, but because it allowed users to run more than one application, switch among them easily, and move data across applications. The limited multithreading was not an issue on desktops with only a single-core CPU.

If you were using the PC just to play games, it really didn't matter, but for business users it was a great productivity boost.

18 hrs a_yank_lurker

Re: Too much credit

@schifreen - Itsy Bitsy Morons always had a schizophrenic attitude towards PCs and to a lesser extent minis at the time. They worshipped big iron and could not understand why people would want a "toy" or "crippled iron". What they failed to grasp is many computing activities are not very resource intensive on any computer (I wrote a thesis on an Apple IIe) even early PCs. These are activities that could be easily automated and put on a smaller computer. Others grasped the vacuum left and moved in with both feet.

The other issue for them was selling PCs is very different than selling a mainframe. One buys a PC much like one buys any other appliance from a retailer. There is no formal bid process with tenders to opened and reviewed. At retail, the sales staff is less interested in the brand you bought but is very interested in selling you something.

15 hrs Richard Plinston

Re: Too much credit

> and why the world needed to ditch the free OS that came with their PC

MS-DOS and Windows were never free*. When you bought a computer with Windows installed, and sometimes when Windows was _not_ installed, money went from the OEM to Microsoft. That cost was part of the price.

* actually there was a 'free' version: 'Windows with Bing' that no one wanted.

14 hrs addinall

Re: Too much credit

MS-DOS was never free. Unless you stole it.

13 hrs Richard Plinston

Re: Too much credit

> MS-DOS was never free. Unless you stole it.

When SCP sold 86-DOS to Microsoft for development into PC-DOS and MS-DOS (MS had previously licensed it from SCP) the agreement was that SCP would have as many copies of MS-DOS as they wanted for free as long as they were shipped with a computer (SCP built the Zebra range of S-100 based computers).

After the fire in the SCP factory, which stopped them building computers, they started selling V20 chips (faster clone of the 8088* with 8085 emulation built in) and V30 chips (ditto 8086) with a free copy of MS-DOS. MS bought out the agreement for a reputed $1million.

* swap this for the 8088 to get a 20% faster machine that could also run CP/M software (with suitable loader).

10 hrs Richard Plinston

Re: Too much credit

> MS-DOS was never free. Unless you stole it.

After per-box pricing* was declared illegal, MS came up with another scheme where MS-DOS and Windows were bundled together at the price of Windows alone. Effectively this was MS-DOS for free, to stop DR-DOS being installed. At the time it was MS-DOS 4.01 versus DR-DOS 5, which was infinitely superior, and it took MS 20 months to nearly catch up with MS-DOS 5, at which point DR released DR-DOS 6 with task switching. MS took another year to almost catch up with MS-DOS 6.

* OEMs were contracted to pay Microsoft for MS-DOS on every box sold regardless of whether it had MS or DR-DOS (or other) installed. This was to strangle DR-DOS sales. The alternative to accepting this contract was to never sell any MS products ever again.

10 hrs Richard Plinston

Re: Too much credit

> Itsy Bitsy Morons always had a schizophrenic attitude towards PCs and to a lesser extent minis at the time. They worshipped big iron and could not understand why people would want a "toy" or "crippled iron".

You write as if IBM were one thing. It was divided up into several divisions, each with their own sales and marketing and each competing against the others. The mainframe division (360/370) wanted a small computer to counter the Apple IIs with Visicalc, Z80 softcard and CP/M software invading their sites. The IBM-PC was designed to be 20% better than the Apple II (160Kb floppies instead of 120Kb etc) and also act as a terminal (which is why the IBM PC has DTE serial ports while other micros had DCE) while running the same software. There were also 3740 (terminal) PCs and 360 emulating PCs (with Motorola 68x00 co-processor boards) for developers to use to write mainframe software.

The mainframe division did look down on the Series One, System 3, System 36 and System 38 (AS400) and other divisions, but did not see the IBM PC as any threat at all. They did want to exclude other brands though.

10 hrs Pompous Git

Re: Too much credit

MS-DOS was never free. Unless you stole it.
Only two of the many computers I've owned came with MS-DOS; most were without an OS. Since MS refused to sell DOS at retail, most people did just that: they stole a copy of DOS. A much smaller number of us purchased DR-DOS and reaped the benefits of an arguably better DOS than DOS. Especially if you also ran the 4DOS command processor.

7 hrs Richard Plinston

Re: Too much credit

> Since MS refused to sell DOS at retail

MS had a contractual moratorium on selling MS-DOS at retail for 10 years with IBM. This expired and MS released MS-DOS 5 for retail sales.

> A much smaller number of us purchased DR-DOS and reaped the benefits of an arguably better DOS than DOS.

It was significantly better than the contemporary MS-DOS 4.01 and had a 20-month lead on MS-DOS 5.

Allegedly it reached a 20% market share until MS brought in illegal per-box pricing and bundled MS-DOS+Windows at Windows price.

6 hrs Pompous Git

Re: Too much credit

"MS had a contractual moratorium on selling MS-DOS at retail for 10 years with IBM. This expired and MS released MS-DOS 5 for retail sales."
B-b-b-b-ut that can't be true. Shirley MS are responsible for everything bad in computing... ;-)

5 hrs Anonymous Coward

Re: Too much credit

"At the time it was MS-DOS 4.01 versus DR-DOS 5 which was infinitely superior and it took MS 20 moths to nearly catch up with MS-DOS 5, at which point DR released DR-DOS 6 with task switching. MS took another year to almost catch up with MS-DOS 6"

But MS had a trick up its sleeve to sideline DR-DOS ... with Win 3.0 they had a long (several months) "beta" period where people could download the "beta" for free to try out, so there was a lot of prelaunch publicity. Part of that publicity was that Win 3.0 didn't work on DR-DOS, as there was an error during start-up. In legal terms DR weren't allowed to access the beta, so they couldn't counter all the "if you want Win 3.0 you'll need MS-DOS and not DR-DOS" stories in the press. In reality the error was, I believe, due to MS deliberately using a result from a DOS call which wasn't precisely specified, where Win3 worked with the value MS-DOS returned but not with the one DR-DOS returned ... trivial for DR to fix, and I seem to recall they did so as soon as Win3 launched, but the damage was done.

At the time I was using DR-DOS, so I assumed that to get Win3 I'd need to factor in the cost of MS-DOS as well - at which point the price of OS/2 (with a special launch offer) was similar, so I went for OS/2.

4 hrs FuzzyWuzzys

Re: Too much credit

DR-DOS was superb, it had so much stuff bundled in and it worked like a dream and as proven MS took months to catch up to what DR-DOS did. DR also has one of the more tragic stories in PC history, well worth checking out.

13 mins CrazyOldCatMan

Re: Too much credit

OS/2 Warp was a better design in many ways than NT 3.51, but IBM's head start didn't help them win the race, and now we all run Windows NT on our PCs.

Windows winning has often been called "the triumph of marketing over excellence".

1 day Martin 47
Since operating systems aren't an end in themselves, but merely a means to an end, a means of running something that alleviates your grunt work (like dBase or Lotus 1-2-3 at the time), the advantages of OS/2 were pretty elusive.

Think someone needs to mention that to Microsoft

1 day stephanh

The whole situation seems very similar to Microsoft today desperately getting a bolthole in the mobile market, against Android. Microsoft's "Windows Experience" Android strategy reminds me a lot of OS/2 (replace a working, familiar OS by something nobody needs).

"Hegel remarks somewhere that all great world-historic facts and personages appear, so to speak, twice. He forgot to add: the first time as tragedy, the second time as farce."

-- Karl Marx, "The Eighteenth Brumaire of Louis Napoleon"

10 hrs Pompous Git
"Since operating systems aren't an end in themselves, but merely a means to an end... the advantages of OS/2 were pretty elusive."
Not really. It was far from unknown in publishing to have to leave the machine rendering a print job because it couldn't do anything else at the same time. Being able to continue working while printing would have been a blessing, but had to await WinNT.
1 day Admiral Grace Hopper
The Man In The High Data Centre

I enjoy a good counterfactual. I still wonder occasionally how the world would have looked if IBM and Apple had got Pink to the point where it was marketable.

1 day Tinslave_the_Barelegged

Re: The Man In The High Data Centre

Bloke at our small village local being cagey about what he did at IBM. Eventually I twigged and said "Oh, you're working on Pink?" He seemed amazed that anyone would know about it, and eventually chilled, but it struck me that little episode was a good analogy for the problems with the ill fated IBM/Apple dalliance.

22 hrs jake

Re: The Man In The High Data Centre

Somewhere, I have a T-shirt with the IBM logo of the time superimposed over the "full color" Apple logo of the time on the front. On the back, it reads "Your brain, on drugs.". The first time we wore them at work, we were told we'd be fired if we wore them again ...

1 day Colin Bull 1

successful standard

The PS/2 brought with it one of the longest-used PC standards - VGA. Only in the last year or two has the VGA standard connector been superseded by HDMI.

1 day MrT

Re: successful standard

And the PS/2 port for mice/keyboards.

The Model M keyboard set a high standard that many modern items still struggle to beat...

1 day Peter Gathercole

Re: successful standard

Though to be truthful, the key action in the Model M keyboards had appeared in the earlier Model F keyboards, and the very first Model M Enhanced PC keyboard appeared with an IBM 5-pin DIN connector on the 5170 PC/AT.

We had some 6MHz original model PC/ATs where I worked in 1984, and even then I liked the feel of the keyboard. Unfortunately, the Computer Unit decided to let the departmental secretaries compare keyboards before the volume orders went in, and they said they liked the short-travel 'soft-touch' Cherry keyboards over all the others (including Model Ms).

As this was an educational establishment, the keyboards got absolutely hammered, and these soft-touch keyboards ended up with a lifetime measured in months, whereas the small number of Model Ms never went wrong unless someone spilled something sticky into them.

I wish I had known at the time that they were robust enough to be able to withstand total immersion in clean water, as long as they were dried properly.

1 day Wade Burchette
Re: successful standard

The PS/2 standard is for keyboards/mice and it is still an important connector. It just works.

And VGA has not been supplanted by HDMI, but by DVI, which was supplanted by DisplayPort. HDMI is designed for TVs and has a licensing cost.

1 day Danny 14

Re: successful standard

I bought a second-hand Denford milling machine. The license comes on a 3.5in floppy disk - I needed to buy a USB floppy drive as I didn't have any working ones left (not even sure I had a motherboard with a connection any more either).

Re: successful standard

Ah, those proprietary floppy disk drives which couldn't tell the difference between a 720KB and a 1.44MB disk. I hate to think about the hours spent recovering data from misformatted floppies.

And whilst the industrial design of the PS/2 was attractive and internals were easily accessible, quality of manufacture (the ones made in Scotland, at least) was lousy. I still carry the scars from fitting lids back on Model 55s.

22 hrs Dave 32

Re: successful standard

Ah, you forgot about the 2.88MB floppy disks.

I happen to have an IBM PS/2-model 9595 sitting here under the desk with one of those drives in it. ;-)

Ah, yes, the model 9595; those had that little 8-character LED "Information Panel" display on the front of the case. It only took a tiny bit of programming to write an OS/2 device driver to turn that into a time-of-day clock. For years, that was the only thing that I ran on that 9595 (I called it my "600 Watt clock").

Hmm, why was there suddenly a price spike for old IBM PS/2-model 9595 machines? ;-)


P.S. I'll get my coat; it's the one with the copy of Warp in it.

14 hrs bombastic bob

Re: OS/2 and PS/2 Memories

Back in the 90's, a few months before the release of Windows 3.0, I took an OS/2 presentation manager programming class at the local (night school) city college. Got an 'A'. Only 6 students survived to complete the class, and the room was packed on day 1 when I thankfully had my add slip signed by the prof... [and we had "the PS/2 machines" in the lab to ourselves, since they were the only ones that could run OS/2].

And I really _LIKED_ OS/2 PM. I was able to format a diskette while doing OTHER THINGS, kinda cool because DOS could _NEVER_ do that! Version 1.2 was nice looking, too, 3D SKEUOMORPHIC just like Windows 3.0 would soon become!

But when i tried to BUY it, I ran into NOTHING but brick walls. It was like "get a PS/2, or wait forever for OEMs to get it 'ported' to THEIR machines". Bleah.

THAT is what killed it. NOT making it available for CLONES. When 'Warp' finally released, it was too little, too late.

But the best part of OS/2 was its API naming, which follows the MORE SENSIBLE object-verb naming, rather than verb-object. So in Windows, it's "CreateWindow". In OS/2, it's "WinCreateWindow". And wouldn't you know it, when you read the DOCUMENTATION all of the things that work with WINDOWS are in the SAME PART OF THE MANUAL!

Damn, that was nice! OK I have hard-copy manuals for OS/2 1.2 still laying about somewhere... and corresponding hard-copy Windows 3.0 manuals that I had to thumb back-forth with all the time. Old school, yeah. HARDCOPY manuals. And actual message loops (not toolkits nor ".Not" garbage).

10 hrs Pompous Git

Re: OS/2 and PS/2 Memories

"THAT is what killed it. NOT making it available for CLONES. When 'Warp' finally released, it was too little, too late."
A friend who was a developer at the time says the main thing that killed OS/2 was the cost of the SDK: over $AU1,000. BillG was giving away the SDK for Windows at computer developer events.
1 day Lord Elpuss

"But it's striking as the world moves to three (or four) computing platforms, IBM isn't one of them."

IBM is very much one of the top players. In Cloud, I would say AWS, Azure and IBM are the top 3. Business Intelligence? Oracle, IBM, Microsoft. AI/Cognitive? IBM Watson, Google DeepMind are the big two, Microsoft Cognitive coming a far third (along with highly innovative smaller vendors with a great solution but lacking scale - these will be swallowed by the big 2 (or 3) in short order).

Don't discount Big Blue too early.

1 day wolfetone

The IBM PS/2 Model 70 was my first PC, bought for me on my 10th birthday in 1997. I knew nothing about computers, so it got thrown out a year later for an upgraded Acer 486. It had a huge impact on me, probably the one thing that got my love of computers and computing in general going.

In 2012, after many a weekend looking at eBay, I found a Model 70 for sale in London for £50. I got in the car, drove down and picked it up. I've other more interesting pieces of technology in my collection, but the Model 70 is the jewel as far as I'm concerned.

1 day Peter Gathercole

When the IBM AIX Systems Support Centre in the UK was set up in 1989/1990, the standard system on the desks of the support specialists was a PS/2 Model 80 running AIX 1.2. (I don't recall if they were ever upgraded to 1.2.1, and 1.3 was never marketed in the UK.)

386DX at 25MHz with 4MB of memory as standard, upgraded to 8MB of memory and an MCA 8514 1024x768 XGA graphics adapter and Token Ring card. IIRC, the cost of each unit excluding the monitor ran to over £4500.

Mine was called Foghorn (the specialists were asked to name them, using cartoon character names).

These systems were pretty robust, and most were still working when they were replaced with IBM Xstation 130s (named after Native American tribes), and later RS/6000 43Ps (named after job professions - I named mine Magician, but I was in charge of them by then so could bend the rules).

I nursed a small fleet of these PS/2s, reinstalled with OS/2 Warp (and memory cannibalized from the others to give them 16MB), for the Call-AIX handlers while they were in Havant. I guess they were scrapped after that. One user who had a particular need for processing power had an IBM Blue Lightning 486 processor (made by AMD) bought and fitted.

1 day Danny 14

I remember the day the boss signed a huge contract to strip out the IBMs and put Gateways in instead. 200 machines, and in 1990 it was brand new 486 33s (not the x2 ones), it was a fortune. Out with IBM and in with DOS, Windows for workgroups and good old novell netware logon scripting. Good days and it all pretty much worked. we even had Pegasus mail back in the day and a JANET connection.

1 day kmac499
Back in them days the old "No-one got fired for buying IBM" was still rampant, and the site I was on, or more accurately the tech support crew, backed PS/2 OS/2 for that very reason.

The wheels started to come off quite quickly with the cost and availability of MCA cards.

The other thing that really killed it isn't mentioned in the article: Token Ring networking. Anyone else remember Madge cards? I'm no network guy but IIRC it was all very proprietary, and an expensive pig to modify once installed (MAUs?). Unlike the Xerox Ethernet system with its simpler cabling and connectors.

Of course some PS/2 architecture does survive to this day. The keyboard and mouse connectors; all that work and all that's left is a color coded plug .....

22 hrs Bandikoto

I fixed the Madge Tolkien Ring network card driver when I worked at Telebit, long, long ago, in a Valley far, far away. I'm pretty sure that was part of the work to get IPX working properly in Fred, as well. Token Ring was rated at 60% faster than thin Ethernet, and actually much faster under load, but a real pain in the behind to work with.

1 day Peter2
I remember the pictured computer in the article, and this was typed on the (pictured) Model M keyboard that came with that computer. They don't build equipment like that any more. Which is probably just as well, given that the keyboard weighs practically as much as a modern notebook.
15 hrs Anonymous Coward


I still have a Model M on my home PC. The only real issue with it is that it doesn't have a Windows key. I also wonder how long motherboards will come with a PS/2 keyboard port.

Nothing I have ever used since comes close to it.

1 day BarryProsser
Perspective of the PC developers?

As a retired IBM UK employee, I have followed The Register's articles for years. I like this article more than most of the ones written about IBM. I would love to hear the perspective of the IBMers who were directly involved in the PC developments 30 years ago. I doubt many of them read or participate here. Where best to engage them? Yahoo, Linkedin, Facebook, others?

22 hrs Bandikoto

Re: Perspective of the PC developers?

Do you participate in your local IBM Club? I know that at least one still exists in San Jose, California. There's an IBM Club in Boca, which seems like a good place to start, given that was the home of the IBM Personal Computer.

1 day John Smith 19

Although technically OS/2 is still around

As EComStation

In hindsight (always 20/20) IBM's mistake was to seek to start making money off MCA ASAP. Had they been much more reasonable they would have had a little bite of every board made (through licensing) and MCA would have dominated.

BTW let's not forget what a PoS Windows 2.0 was or how the 286 was so retarded that the MS tech who worked out how to switch it from real to virtual mode and back was hailed a f**king genius.

1 day LDS

Re: Although technically OS/2 is still around

The 286 designers never thought that someone would want to return to the much more limited real mode once in the much more advanced protected (not virtual) mode. Entering protected mode was easy - just set the proper bit. Getting out was "impossible" - except through a reset.

After all, back then, backward compatibility wasn't yet a perceived issue. Probably Intel believed everyone would rewrite their software to work in the new, better, protected world. It turned out to be wrong (only to make the same mistake years later with Itanium, though...).

The problem was not the 286 itself (and IMHO we'll see the segmented memory model back again one day, because it implements a far better security model...); it was that most advanced DOS applications were written to bypass DOS itself and access the BIOS and hardware (RAM and I/O ports) directly, something that was hard to emulate on a 286, and that made porting to other OSes more difficult.

Probably CP/M applications were better behaved, and a "protected" CP/M would have been easier to develop.

17 hrs Paul Crawford

Re: 286

The article has one significant mistake - the 286 did support protected operation, even the option for "no execute" on memory segments. But as it was designed to be either 'real mode' 8086 compatible OR protected mode you had the fsck-up of having to use a keyboard controller interrupt to bring it out of halt state back to 'real' mode.

The major advances for the 386 were:

1) 32-bit registers

2) The "flat" memory model and virtual memory support (not the 16-bit segments of 286, OK still segments but way big enough for a long time)

3) The option to easily change protection modes.

14 hrs Richard Plinston

Re: Although technically OS/2 is still around

> BTW let's not forget what a PoS Windows 2.0 was or how the 286 was so retarded that the MS tech who worked out how to switch it from real to virtual mode and back was hailed a f**king genius.

It was an IBM tech who worked out how to switch the 80286 back to 8086 mode using the keyboard interface chip; there was a standard instruction to switch it to protected mode. The mechanism was incorporated into OS/2 1.0, and MS stole it for Windows/286.
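The asymmetry being described - one instruction to get into protected mode, a whole reset to get back out - can be sketched as a toy model. This is illustrative Python only, not hardware emulation; the class and method names are invented here. The underlying facts are standard: on the 286, LMSW could set the PE bit but not clear it, and PC/ATs escaped protected mode by having the 8042 keyboard controller pulse the CPU reset line (command 0xFE).

```python
# Toy model of the 80286 mode-switch asymmetry (illustration only).

class Toy286:
    def __init__(self):
        self.protected = False   # stands in for the PE bit of the MSW

    def lmsw_set_pe(self):
        # LMSW could set the PE bit to enter protected mode...
        self.protected = True

    def lmsw_clear_pe(self):
        # ...but on a real 286 no instruction could clear it again.
        raise RuntimeError("80286: PE bit cannot be cleared by software")

    def keyboard_controller_reset(self):
        # The PC/AT workaround: command 0xFE to the 8042 pulses the CPU
        # reset line, and the BIOS resumes real-mode execution afterwards.
        self.protected = False

cpu = Toy286()
cpu.lmsw_set_pe()            # entering protected mode: trivial
try:
    cpu.lmsw_clear_pe()      # leaving it the "obvious" way: impossible
except RuntimeError:
    cpu.keyboard_controller_reset()   # the only way back down
print(cpu.protected)   # False
```

The 386 removed the need for this dance by letting software clear PE directly, which is point 3 in Paul Crawford's list above.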

6 hrs LDS

Re: 286

The segmented mode has more than the "no execute" bit. It has an "execute only" mode - the CPU can execute the code in the segment, but the application can neither read nor modify it (goodbye, ROP!). Still, managing segments adds complexity, security checks slow down execution, and applications are less free (which, from a security point of view, is a good thing).

The "flat memory model" was not an inherently new feature - it just means using segments large enough that the application never needs to load another one (usually the whole application address space). Also, code and data segments usually overlap to make everything "easier" (and far less secure).

286 16-bit segments were too small to allow for that. 386 32-bit segments allowed it, while the new paging feature allowed "virtual memory" with page (4K) granularity. 286 virtual memory worked at the segment level - with 64K segments that could work, but with flat 4GB segments it obviously couldn't.

But what made the 386 able to run DOS applications was the Virtual 86 mode. In that mode, the hardware itself trapped direct accesses to memory and I/O ports, and allowed the OS to handle them, without requiring complex, fragile hacks.

This mode is no longer available in 64 bit mode, and that's why Windows 64 bit can't run DOS applications any longer (Windows supports console applications which are native Windows applications, not DOS ones).
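The addressing schemes contrasted above can be sketched in a few lines. This is a toy Python model, not hardware-accurate; the descriptor table and its base/limit values are invented for the example. The real-mode rule (physical = segment * 16 + offset, 20-bit address) and the protected-mode rule (selector indexes a descriptor; offset is limit-checked, then added to the base) are the standard ones.

```python
# Toy contrast: 8086 real-mode vs 286-style protected-mode addressing.

def real_mode_address(segment: int, offset: int) -> int:
    # Real mode: physical = segment * 16 + offset, on a 20-bit bus.
    return ((segment << 4) + offset) & 0xFFFFF

# Invented descriptor table: selector -> base, limit (size - 1), flags.
DESCRIPTOR_TABLE = {
    0x08: {"base": 0x10000, "limit": 0xFFFF, "executable": False},
}

def protected_mode_address(selector: int, offset: int) -> int:
    # Protected mode: the selector picks a descriptor; the CPU checks
    # the offset against the segment limit before forming the address.
    d = DESCRIPTOR_TABLE[selector]
    if offset > d["limit"]:
        raise MemoryError("general protection fault: offset beyond limit")
    return d["base"] + offset

print(hex(real_mode_address(0x1234, 0x0010)))     # 0x12350
print(hex(protected_mode_address(0x08, 0x0020)))  # 0x10020
```

A "flat" 386 model is then just one descriptor whose limit covers the whole 4GB space, which is why the limit check stops mattering in practice, exactly as LDS says.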

1 day David Lawrence
Ah yes I remember it well

I was working for IBM at the time, at their UK Headquarters, so I got my free copy of OS/2 Warp.

There was a big demo with loads of PCs all running Windows-based games and software. I hit problems installing it on my PC because I wasn't prepared to re-partition my hard drive and lose all my existing data. Rubbish.

I watched OS/2 go down the drain, followed by several other doomed products. Then the bottom dropped out of the mainframe market and things took another slide. The 'partnership' with Lotus was a bit of a disaster too.

IBM? They just do software and services right? And a lot (but not all) of the software was obtained through acquisitions. I remember when they actually made stuff - like a PC keyboard that cost £120.

Shame really.

1 day Danny 14

Re: Ah yes I remember it well

lotus notes still makes me blink asymmetrically when I see pictures of it. <shudder>

1 day adam payne
I loved the Model M keyboard, such a fine piece of design and engineering.
1 day ShelLuser

IBM was its own worst enemy

It's been a while but back in the days I was a serious OS/2 advocate. Look, if you even get other people to end up trying out OS/2 because they became sick and tired of Windows 3.11 often bodging up and not being able to network properly then yeah...

But IBM more often than not didn't even seem to care all that much. Looking back I think it was a bit the same as the stories we get to hear about Microsoft now: how divisions in the company do different things, don't always work together and in some rare cases even compete. Even at the expense of customers if they have to!

But IBM... I enrolled in the OS/2 support program (I seriously don't remember how I pulled this off anymore, I think I asked (and got) permission from my work to look into all this and also use their name) which ended up with IBM sending me several beta versions of OS/2 products. Including several OS/2 server environments. It was awesome. OS/2 server (a green covered double CD, that much I remember) was basically OS/2 with additional user management and network configuration settings.

Yet the funniest thing: IBM couldn't care less about your test results. At one time I got an invitation to go to IBM in the Netherlands for an OS/2 server demonstration which would also showcase some of their latest products (I recall being shown a very lightweight laptop). On arrival you had to search for the entrance and where it all was, because announcements or directions were nowhere to be found on site.

I bought OS/2 3.0 Warp and 4.0 Merlin and they always worked like a charm. I seriously liked OS/2 much better than anything else. So when I had the opportunity to buy a PC through my work it was obvious what I would need to get, right? An IBM Aptiva. That would be the ultimate, the thing to get for OS/2. Because obviously an IBM OS will definitely run on IBM hardware, right?

Context: this was at the prime of my OS/2 endeavors. I could optimize and write a config.sys file from mind if I had to, I knew what drivers to use, which to skip, what each command did. Memory optimization? Easy. Bootstrapping a *single* floppy disk to get an OS/2 commandline? Hard, yet not impossible (try it, you'd normally get multiple disks to boot with).

It took me one whole weekend, dozens of phone calls to the IBM support line, and the conclusion was simple: IBM did not care about OS/2 for their own hardware. And with that I mean not at all. It did not work, no matter what I tried. Even they told me that this wasn't going to work. Compaq of all brands did care. Compaq, the brand which tried extremely hard to appeal to the general customer by making their hardware "easy" to use and also "easy" to customize (comparable to Dell a bit), didn't only target Microsoft and Windows. Noooo.... When I eventually ditched my IBM I got myself a Compaq and I also purchased an extra set of drivers and installation media (3 boxes of 3.5in floppy disks, approx. 37 in total) and guess what? Next to a full Windows 3.11 installation plus a different program manager and dozens of drivers, it also included several disks with OS/2 drivers. I removed Windows and installed OS/2 that very same evening.

Compaq... which often advertised that they made Windows easier. And also delivered OS/2 drivers for their hardware...

IBM, which made OS/2 and also made hardware, never even bothered to provide OS/2 drivers for their own PCs. Not even if you asked them.

Does that look like a company which cared?

IBM was its own enemy sometimes.

1 hr Uncle Ron

Re: IBM was its own worst enemy

IBM was and still is its own enemy. So many comments above reflect this so well. E.g.: "IBM's mistake was that it tried to make money right away from MCA." So true. IMHO, it is the MBAs permeating the entire company that are the enemy. They know nothing about the business IBM is actually in, only about cost recovery, expense containment, and fecking business models. For the last 30 years, the real heroes in IBM have been the ones who cut the most, or spend the least, or pound suppliers the worst.

This virus is especially dangerous when a non-MBA contracts it. When they see who gets the most recognition, they can't wait to de-fund sales commissions or training programs or development staffs. They think they are "doing good." It is not only true in IBM. Companies all over the West are infected with the idea that reducing costs (and innovation) towards zero, and increasing revenue towards infinity, is all we should be working on. So, fewer Cheerios in the box, fewer ounces of pepper in the same size can, cut the sales-force, reduce the cost (and quality) of support, and on and on and on.

If there is one word that summarizes this disease, and a word I cannot stand to hear in -any- context, it is the word, "Monetize." It encapsulates all the evils of what I feel is the "Too Smart by Half" mentality. I cannot say how many times I have heard the phrase, "how much money are we leaving on the table?" or, "how many more will we sell if we..." and the room goes silent and a good idea is dropped.

I am sorry I am rambling, I am sad. There'll never be another System/360 or Boeing 747. Incremental from here on out. Elon Musk doesn't seem to be infected...

1 day Anonymous South African Coward
Single Input Queue

Gotta love the silly old SIQ... one app chokes the SIQ and you can do absolutely nothing, except hard reboot :)

Fun times indeed. I had a couple of SIQ incidents as well. All but forgotten, but recalled to memory now :) Sometimes a CTRL-ALT-DEL would work, sometimes not.

And who remembers the CTRL-ALT-NUMLOCK-NUMLOCK sequence?

1 day Kingstonian

Memories - Nostalgia isn't what it used to be.

It all worked well and was of its time. PS/2 and OS/2 made sense in an IBM mainframe-using corporate environment (which I worked in), with some specialized workgroup LANs too. OS/2 EE had mainframe connectivity built in and multitasking that worked. Token Ring was much better for deterministic performance too where near-real-time applications were concerned, and more resilient than Ethernet at the time - Ethernet would die at high usage (CSMA/CD on a bus system) whereas Token Ring would still work if loaded to 100% and just degrade performance gracefully. Ethernet only gained the upper hand in many large corporate environments when 10BASE-T took off. Token Ring would connect to the mainframe too, so no more IRMA boards for PCs.
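The collapse-versus-graceful-degradation contrast can be illustrated with a textbook toy model. Slotted ALOHA stands in for contention-based access here (real CSMA/CD performs considerably better, but its throughput curve has the same general shape under heavy load), while the token ring is idealised as simply saturating at full capacity; all numbers are illustrative only.

```python
import math

def aloha_throughput(offered_load: float) -> float:
    # Classic slotted-ALOHA result: S = G * e^(-G), the fraction of
    # slots carrying a successful (collision-free) frame. It peaks at
    # G = 1 (S = 1/e) and collapses as offered load grows further.
    return offered_load * math.exp(-offered_load)

def token_ring_throughput(offered_load: float) -> float:
    # Idealised token passing: no collisions, so the ring carries
    # everything up to full capacity and then holds there.
    return min(offered_load, 1.0)

for g in (0.5, 1.0, 3.0):
    print(f"G={g}: contention={aloha_throughput(g):.3f}, "
          f"token ring={token_ring_throughput(g):.3f}")
```

At three times capacity the contention model's useful throughput has fallen well below its peak, while the idealised ring stays pinned at 100%, which is the behaviour Kingstonian describes.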

There was OS/2 software available to have a central build server where each workstation could be defined on the server and then set up via the network by booting from floppy disk - useful in the corporate world. DB2 was available for OS/2, so a complete family of useful tools was available. And IBM published its standards.

IBM was used to the big corporate world and moving down to individuals via its PCs whereas Microsoft at that time was more individual standalone PCs and moving up to corporate connectivity. The heritage still shows to some extent. Novell was still the LAN server of choice for us for some time though.

The PS/2 was easy to take apart - our supplier showed us a PS/2 Model 50 when it first came out. He had to leave the room briefly and we had the lid off the machine and had taken it apart (no tools needed) before he returned. He was very worried, but it was very easy just to slide the parts back together and they clipped into place - not something you could do with other PCs then. I came across an old price list recently - the IBM Model M keyboard for the PS/2 was around £200 (without a cable, which came with the base unit: short for the Model 50 and 70 desktops and long for the 60 and 80 towers!). Memory was very expensive too, and OS/2 needed more than DOS. In fact EVERYTHING was expensive.

OS/2 service packs (patches) came on floppy disks in the post. You had to copy them and then return them!

Starting in computing just after the original IBM PC was announced this all brings back fond memories and a huge reminder of the industry changes.

15 hrs addinall

Fake News. Re: Memories - Nostalgia isn't what it used to be.

Your memory is shot. OS/2 was developed by Microsoft AND IBM, first released jointly in December 1987. Bill Gates was at Wembley flogging the OS during 1988.

Way before then Microsoft had UNIX Version 7 running, released as XENIX for the 8086/80286 architecture.

The development of OS/2 began when IBM and Microsoft signed the "Joint Development Agreement" in August 1985. It was code-named "CP/DOS" and it took two years for the first product to be delivered.

OS/2 1.0 was announced in April 1987 and released in December. The original release is textmode-only, and a GUI was introduced with OS/2 1.1 about a year later. OS/2 features an API for controlling the video display (VIO) and handling keyboard and mouse events so that programmers writing for protected-mode need not call the BIOS or access hardware directly. In addition, development tools include a subset of the video and keyboard APIs as linkable libraries so that family mode programs are able to run under MS-DOS. A task-switcher named Program Selector is available through the Ctrl-Esc hotkey combination, allowing the user to select among multitasked text-mode sessions (or screen groups; each can run multiple programs).

Communications and database-oriented extensions were delivered in 1988, as part of OS/2 1.0 Extended Edition: SNA, X.25/APPC/LU 6.2, LAN Manager, Query Manager, SQL.

The promised graphical user interface (GUI), Presentation Manager, was introduced with OS/2 1.1 in October 1988. It had a similar user interface to Windows 2.1, which was released in May of that year. (The interface was replaced in versions 1.2 and 1.3 by a tweaked GUI closer in appearance to Windows 3.1).

OS/2 occupied the "Intelligent Workstation" part of SAA (Systems Application Architecture) and made use of the APPC PU2.1 LU6.2 SNA network stack.

5 hrs Anonymous Coward

"The development of OS/2 began..." etc. etc.

That mostly looks copy & pasted from Wikipedia. A link would have been enough.

7 mins Anonymous Coward
Re: Fake News. Memories - Nostalgia isn't what it used to be.

OS/2 was developed by Microsoft AND IBM

And that's why an OS/2 subsystem lingered in Windows for some time.

(They finally killed it, right? And the POSIX subsystem?)

1 day Primus Secundus Tertius
No Stop button in window

The big defect in OS/2 that I met was the lack of a stop button in any window. Yes, you could close the window but that did not stop the task, just left it sitting in the background.

It was Windows 95 that brought us a proper stop button.

We had an OS/2 system returned to us as not working. The user had been closing the window, and then later starting up another instance. The system was clogged with dormant tasks. Once I removed them, everything worked again; what we had to do then was to update our User Guide.

22 hrs Frish
IBM what could have been

At one point, OS/2 could run multiple PC-DOS Windows "Better than Microsoft can" since memory was partitioned, and a crash in one window wouldn't affect the whole machine. Microsoft wrote a stub to detect whether OS/2 was installed, and killed the attempt to load PC-DOS...

Where IBM Boca missed the boat was thinking that this is a "personal" computer, and therefore, not a "commercial" one. IBM should have OWNED that market, entirely, and the competing product within the corporation that lost to Boca's product recognized that but got shoved aside by the sexiness of the PC, and all the departures from Big Blue tradition, etc.

Also, the way IBM execs got paid meant they were shortsighted about 'solutions' that included IBM Products that they didn't get paid for.

As Product Marketing Manager for IBM CICS OS/2 (announced at Comdex '89; a Sr. VP from Microsoft shared with me on the show floor, "That's the most important announcement in this entire show", as I handed him a press release THAT WAS NOT included in the show's press kit, since the PC division was in charge and kept other divisions' products from being included...), I tried to get the then President of the PC division to just whisper CICS OS/2 to the management of a very large insurance firm. He would have left with a 40,000-PC order but instead chose to say nothing... IDIOTIC but true.

18 hrs Primus Secundus Tertius

Re: IBM what could have been

"...Microsoft wrote a stub to detect whether OS/2 was installed..."

Windows 95 was distributed to end users as an "update CD". It would not run unless it detected an installed Windows 3.x or was presented with the first W3 floppy disk. It would also accept the first OS/2 Warp 3 floppy disk.

9 hrs Pompous Git

Re: IBM what could have been

"Windows 95 was distributed to end users as an "update CD". It would not run unless it detected an installed Windows 3.x or was presented with the first W3 floppy disk. It would also accept the first OS/2 Warp 3 floppy disk."
That's because Warp included a full Win 3.x license.

22 hrs Jim 59


I remember seeing a demonstration of GEM at the PCW show in 1986 or 1987, at Olympia I think it was. Very impressive it was too. Didn't it also come bundled on some Amstrad PCs?

Re: GEM

> Didn't it also come bundled on some Amstrad PCs ?

Yes. They came with DRI's DOS+ and GEM, as well as MS-DOS.

It was in all Atari 512 (and derivatives) running on TOS, which was written by DRI. It also came with BBC Master 512 which ran DRI's DOS+ and GEM on an 80186 (or was it 80188?) processor.

22 hrs Justin Clift
OS/2 resurrection

Just to point out, OS/2 is being resurrected as "ArcaOS":

They've been working on it for a while, and (though I've not used it) apparently it's targeted for release in under two weeks:

Modern Qt 5.x has been ported to it, and many Qt based apps are apparently working properly.

I say this because DB Browser for SQLite is one of them. One of our Community members has been keeping us informed. ;)

22 hrs Jim-234

OS/2 Desktop virtualization before it was cool

I used to run OS/2 very well on a 386 based PC

I could have several windows open, each with a different virtual OS, and you could move around the virtual OS image files etc.

So for example if you had some odd networking hardware (like LanTastic) that only wanted a specific DOS version, well spin up a new little dedicated Virtual OS image for it.

While people think all this virtualization stuff is so new... it was around 25 years ago and worked rather well considering the hardware available at the time.

It's a shame Microsoft was able to blackmail IBM into discontinuing OS/2; with a different vision, OS/2 might have become a serious contender for workplace use.

18 hrs Handlebar

Re: OS/2 Desktop virtualization before it was cool

Er, IBM were doing virtual machines in the 1960s ;-)

9 hrs Pompous Git

Re: OS/2 Desktop virtualization before it was cool

"It's a shame Microsoft was able to blackmail IBM into discontinuing OS/2"
Say what? Are you crazy?
22 hrs VanguardG

The quote from Mr Watson was supposedly in 1943 - when computers were larger than many houses and weighed as much as several buses. To say nothing of being extremely pricey (since they were essentially built on-site by hand) and expensive to operate: they needed to be staffed by a crew trained on just that one machine, the power needs were enormous, and the heat produced prodigious. And at that time there were fairly few tasks that needed doing that required that kind of number-crunching. Did he say it? Maybe not, but given the time frame, it really doesn't seem as boneheaded as it would have sounded 15 years later.

Microchannel *would* have been sweet, were it not for the expense. A basic sound card of the time was $70 for ISA - the same card in MCA was $150.

As for plugging in a keyboard after boot... I still find it amusing that someone actually wrote an error code of "Keyboard not detected. Press F1 to continue." If there's no keyboard, there's no F1 to press.

21 hrs Primus Secundus Tertius

There are other variations of that quote about five computers: e.g. the UK would only need that kind of number.

For the publicly known computations of that time - gunnery trajectories etc., that number is perhaps right. But I believe over a dozen instances of Colossus were built for code cracking. So even then the estimate of five was way out.

However the real expansion of computing came with data processing, where record keeping outweighed the relatively small amount of computation. IBM should have known that, given their existing business in accounting machines fed with punched cards.

18 hrs Updraft102

Well, yeah... there's no F1 to press, so if you want the PC to boot and you see that message, you will have to plug a keyboard in and press F1. That and make sure the keyboard lock is in the off/unlocked position, which is probably more what the keyboard check/stop on fail configuration was about than anything else.

The keyboard lock was an actual physical lock on the front of the PC that used a round key (like on vending machines) which, if set to the locked position, would prevent the PC from detecting the keyboard or responding to keypresses.

Enabling the "Press F1 to resume on keyboard error" BIOS setting made the keylock into a rudimentary system protection device. It wasn't all that effective against anyone who could get into the computer case, as bypassing the lock was as easy as pulling the cord for the keylock off the motherboard, but PC cases also typically had provisions for a small padlock to keep the cover on the case back then too. It wasn't great protection, but it probably provided some relief from pranksters, who probably would not be determined enough to cause physical damage to someone's PC for a joke.

14 hrs Richard Plinston

> So even then the estimate of five was way out.

Actually he was speculating on the sales of the particular model that they were currently building. He was counting the number of government agencies that would be able to afford them and find them useful.

It was only much later that anyone tried to use computers for commercial purposes that would find them a place in businesses: LEO - Lyons Electronic Office was developed for payroll, stock distribution and manufacturing (of cakes).

In the 1950s the British government were deciding where _the_ computer would go. They chose a town that was a major railway junction because then the train loads of punch cards could easily be shipped to it.

20 mins VanguardG

I remember - and people were perpetually misplacing the key to their padlocks. The round keylocks require a lot of expertise to crack open, but the padlocks... well... they were nearly always bought cheap. Basic (legal) hand tools could pop them open in seconds, without any damage to the case and, at most, a scratch or two on the padlock. Some people who were kinda serious had monitor stands that had a built-in, locking keyboard drawer. But those were usually employed by people who had a "favorite" keyboard they were afraid would get stolen by a jealous co-worker rather than because of any actual security concerns.

21 hrs Tim Brown 1

The UK had the best tech for personal computers at the time

For PCs during that period, in pure tech terms, Acorn's ARM machines running RISC-OS were way ahead of offerings from anyone else, and prior to that the BBC Micro (built by Acorn).

It's just such a shame that Acorn lacked any international marketing savvy then.

14 hrs Richard Plinston

Re: The UK had the best tech for personal computers at the time

> It's just such a shame that Acorn lacked any international marketing savvy then.

And yet Acorn spawned a worldwide chip-design powerhouse that currently sells licences for billions of chips every year. Even before phones started using them, ARM was selling tens of millions of chip licences to power embedded equipment (modems, routers, PABXs, ...).

ARM = Acorn RISC Machines

9 hrs jake

Re: The UK had the best tech for personal computers at the time

Not really. I had a 16 bit Heath H11A personal computer in 1978. Acorn didn't ship a 16 bit computer until 1985 ... The old girl still runs. Loudly.

21 hrs Anonymous Coward

"Thomas J Watson may never have said the world will have five computers"

It is funny that people take this statement to be a sign of IBM not understanding the potential of the computing market, if it was ever even said. It actually makes a lot of sense though. TJW, correctly, didn't think it would make sense for every business out there to go build their own little data center and a "best we can do" computing infrastructure. Better to let a giant time share (now called a cloud provider) handle all of that complexity and just rent the resources as needed. It is kind of like saying there is only room for a handful of electrical utilities in the world. Even if everyone, at MSFT's urging, went out and bought a gas powered generator for their house... it still makes sense that there is only room for a handful of utilities.

21 hrs JohnCr

A tiny bit more

A few comments.

1) IBM was STRONGLY urged to skip the 286 and create a true 32 bit, 386 version of OS/2. Even Microsoft was strongly pushing IBM in this direction. IBM's resistance to do so was a major contributor to the split between IBM and Microsoft.

2) The MicroChannel was better than the ISA bus. The problem was that it was not good enough for the future. The PC evolution was moving towards faster graphics and network (100 Mbps) connectivity. The MicroChannel, even three generations into the future, did not have the bandwidth to meet these needs. The industry evolved to the PCI interface. As an interesting historical coincidence, PCI uses the same type of connectors as the MicroChannel did. And the PCI interface found its way into other IBM systems.

3) The IBM PC was inferior to Apple's products. The PC became more successful because a new industry and IBM's customers worked together to make it successful. (Steve Jobs - The Lost Interview) In 1987 IBM turned a deaf ear to the industry and its customers. When IBM stopped listening, its fortunes turned. This culture took over the whole company and was a major factor in the company almost going out of business in the 1990s.

14 hrs Richard Plinston

Re: A tiny bit more

> Even Microsoft was strongly pushing IBM in this direction.

Microsoft had developed its own 286 versions of MS-DOS: 4.0 and 4.1 (not to be confused with the much later 4.01). These were also known as European DOS because they were used by Siemens, ICL (where I worked) and Wang. These versions supported limited multitasking of background tasks and one foreground program. I have a manual here on how to write 'family' applications that would run on 8086 MS-DOS or in protected mode on 80286 MS-DOS 4.x.

It was dumped when they switched to writing OS/2 with IBM.

9 hrs niksgarage

Re: A tiny bit more

MCA was an architecture that worked fine in workstations, just not PCs. In Austin, TX in the '80s I met Fred Strietelmeyer (fairly sure that's his name), the architect of MCA, who told me that MCA was a fine architecture, but not all implementations were. RS/6000 had multi bus-mastering working with MCA, mostly because the address on the channel was a logical address, which went through the memory manager, so AIX could easily control and protect the physical memory space. PS/2 used physical addresses, which meant that either bus mastering was turned off, or the bus-mastering cards needed to have a copy of the memory manager on board as well. If you were running AIX, MCA was not a problem or even a question deserving of five minutes of thought.

The PC industry hated MCA: the connector, the architecture and its licensing. They came out with EISA - a backward-compatible connector to extend the AT bus. I always found it a huge irony that PCI used the same physical connector as MCA years later.

9 hrs niksgarage

Re: A tiny bit more

AND you are exactly right about the 286/386 wars. I was in Boca when the AIX guys from Austin came to see the CPDOS developers (as OS/2 was known in Dev), and showed them true multi-tasking on the 386. They were baffled why IBM was writing another OS when we already had a virtualising, 32-bit ready, pre-emptive multi-tasker that ran on multiple hardware platforms. Its only issue was that it couldn't run on the 286. And for that reason alone, IBM spent enough money on OS/2 development that they could have paid for the Hubble telescope AND had it repaired.

I also saw the numerous prototype machines in Boca (one called 'Nova' in dev, as I recall) which had a 16MHz 386, lots of memory on board and an AT expansion bus. (Also had the 1.44MB diskette drive.) Nice machine, could have sold really well. Only the Model 30 was allowed to be shipped, in case the AT bus continued to outshine MCA.

Marshalltown

What grief?

It always puzzled me what grief OS/2 was supposed to create. I used it as a substitute for Windows, running Windows s/w into the early years of the century. I gathered that IBM might have still been attempting to extract its pound of flesh from developers, but as far as I was concerned, it worked fine. I built my own machines and it ran on them without problems. I also liked Rexx as a scripting language. It was immensely more useful than DOS and much less of a pain (to me) than MS BASIC and all its little dialects and subspecialties.

The only real grief I encountered was developers who "simply couldn't" do a version for OS/2 - and, of course, MS doing their best to see that their software was less compatible than need be.

History seems to forget how thoroughly MS would break things with each "improvement." Many of the useful "improvements" in Windows were first present in OS/2, and some really nice features vanished when IBM decided the effort wasn't worth the candle. The browser with its tree-structured browsing history was remarkable. No browser since has had anything to match. Even now, relics of the OS/2 interface are still present in KDE and GNU. Microsoft has finally moved "on" with the horrible looking and acting interface of Windows 10.

Ah... Concurrent DOS...

IBM did actually use DR Concurrent DOS 286 - but in their 4680 Point-of-Sale OS (described often as a real P.O.S. by those of us who used it).

Re: Ah... Concurrent DOS...

> IBM did actually use DR Concurrent DOS 286 - but in their 4680 Point-of-sale (described often as a real P.O.S. by those of us who used it) OS.

Yes, that was a DRI product, but it was not Concurrent-DOS; it was FlexOS. This shared code with MP/M-86, as did Concurrent-DOS (neither of which was an 80286 product), but FlexOS was 80286 based. The main difference was that FlexOS would not run MS/PC-DOS programs, while Concurrent-CP/M-86 / Concurrent-DOS would run several of them at once (as well as CP/M-86 programs).

DRI had pre-emptive multi-user, multi-tasking systems since 1978 with MP/M, which ran on 8085 and Z80 micros with bank-switched memory (I have a couple of RAIR Blackbox/ICL PC1s here and an ICL PC2 8085AH2 with 512Kbyte). MP/M II and MP/M-86 (for the 8086) were released around 1980. Concurrent-CP/M-86, with multiple virtual screens, ran on an IBM-PC (and other machines - I have a stack of 8086 ICL PC2s) and could use EEMS memory cards such as the AST RamPage to get several Mbytes of memory and do context switching with just a handful of register moves.

Concurrent-CP/M-86 was demonstrated the same month as MS-DOS 2 was released. It had pre-emptive multi-tasking (and multiuser with serial terminals). The virtual screens were just a keystroke away so one could run SuperCalc, Wordstar, and other programs at the same time and just flick between them - even on the serial terminals.

Later, this was developed for 386 into DR-Multiuser-DOS from which DR-DOS 5 and 6 were derived.

There was a FlexOS-386 which had an enhanced GEM-X but it was dropped to concentrate on the Concurrent range.

PS/2 and OS/2

The naming convention always struck me as odd. In my mind, the /2 meant divide by 2, ie half an OS and half a PS :-)

Re: PS/2 and OS/2

Ah - "OS divided by 2" - I'd never thought of it that way before.

Interestingly, for the short time that MS were promoting it, they spelt it OS-2, which could I suppose be read as "OS minus 2".

I'm not sure what if anything we can deduce from that!

Re: PS/2 and OS/2

Nah. IBM (and many others) used the "/" for many, many products. Like System/360. It was a naming convention that seemed to lend weight, not division, to a product family.

Big Blue: IBM's Use and Abuse of Power

To understand the lead up to PCs people should read Richard DeLamarter's Big Blue: IBM's Use and Abuse of Power.

It goes back to the 1890s and shows how IBM became dominant through not-so-ethical practices. The antitrust suit against IBM was dropped by Ronald Reagan, which prompted DeLamarter to write the book.

Make sure your driver strategy is in place if you launch a new O/S

Good article Andrew. I was there in Miami when IBM launched the PS/2. They closed off the streets six blocks around the exhibition centre and, as far as I remember, the Beach Boys played. They should probably have saved their money.

One thing you missed is that not only was OS/2 late, but the driver support in the operating system was very poor. This meant that as well as blocking all the plug-in cards through the new bus architecture, they also bricked all of the add-on peripherals for the PS/2 that had worked with the IBM PC. Add to that the fact that the OS/2 driver team started competing with driver developers for driver business and also blocked them from developing for the architecture (until the OS/2 DDK made a brief appearance and then disappeared), and the set of factors that contributed to IBM's demise was complete.

I recall that when the driver team saw the source code of one of our drivers some years later, they threatened to sue us. That was until I pointed out that the code was based on the OS/2 DDK; they went quiet, but couldn't quite believe that we had managed to obtain a copy in the few weeks that it had popped its head above the surface.

Microsoft worked out early on that driver support is a key element of the success of an operating system. Something that they seem to have lost sight of a bit from Windows Vista onwards, although I suppose the switch to 64-bit has made backwards compatibility more difficult. Keep the nostalgia coming Andrew, it's not like it used to be!

Tony Harris

6 hrs BinkyTheMagicPaperclip

Such a missed opportunity

I didn't use OS/2 1.x at the time (only later), but beyond 1.0 it was fine for server based apps and a good, solid platform. Not so much for desktop apps - insufficient driver support, high memory requirements, and limited app support put paid to that.

OS/2 2.x and beyond was a much improved proposition, but suffered in competition with the large number of Windows apps. The userbase were not, in general, prepared to pay more for a smaller amount of higher quality features - the reality of running a minority platform.

OS/2 might have got further if IBM had concentrated on Intel, but instead they wasted vast amounts of effort on OS/2 PPC. Much though I loved OS/2, the succession plan was flawed. Windows NT is simply better architected: it kept compatibility with 16-bit apps and had much improved security and multi-user support. OS/2 was effectively dead before it really caused a problem, but it would have caused issues later on.

System <n>/Mac OS were also flawed, and the early versions of OS X sucked, but Apple are much better at retaining compatibility whilst updating the OS (at least for a few years, until they drop old kit like a brick).

I've still got an OS/2 system, and a lot of apps, and will be assembling an OS/2 1.3 system (because I'm a masochist and like trying OSes). Haven't bothered with eComStation, but might give Arca 5.0 a go if it's any good and not ludicrously priced. There aren't too many OS/2 apps I really want to run these days, though.

One final note: it's *synchronous* input queue, not single. If messages are not taken off the input queue it hangs the interface, but does not stop apps running. There was a workaround implemented in Warp 3 fixpack 16, but until then a badly behaved app was a real pain. However, Windows successfully moved from the synchronous input queues of Win16 to asynchronous ones in Win32 without breaking too many apps. IBM should have put in the engineering effort to do the same.
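The difference between the two queue designs can be sketched with a toy dispatch model (illustrative only - this is not the OS/2 or Win32 API, and the app names are made up): with one shared synchronous queue, a message the hung app never removes blocks everything behind it; with per-app queues, only the hung app's own messages pile up.

```python
# Toy model: why one shared synchronous input queue hangs the whole UI
# when a single app stops reading its messages, while per-app
# (asynchronous) queues isolate the damage.
from collections import deque

def dispatch_synchronous(events, hung_app):
    """One shared queue: dispatch stops at the first message the hung
    app never removes, so apps queued behind it starve too."""
    queue = deque(events)          # (app, message) pairs in arrival order
    delivered = []
    while queue:
        app, msg = queue[0]
        if app == hung_app:        # hung app never takes its message...
            break                  # ...so nothing behind it is delivered
        delivered.append(queue.popleft())
    return delivered

def dispatch_asynchronous(events, hung_app):
    """Per-app queues: only the hung app's own messages back up."""
    return [(app, msg) for app, msg in events if app != hung_app]

events = [("editor", "key"), ("hung", "key"), ("clock", "tick")]
print(dispatch_synchronous(events, "hung"))   # only the editor's message
print(dispatch_asynchronous(events, "hung"))  # editor and clock both run
```

The Warp 3 fixpack workaround and the Win32 change both amount to moving from the first model to something closer to the second.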

There are also some substantial differences between OS/2's architecture and Windows (or indeed anything else). For instance, the co-ordinate origin in Windows is at the top left of the screen, but in OS/2 it's the bottom left (OS/2 uses the mathematically correct option here).
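The origin difference above only affects the y axis; a minimal sketch of the conversion (assuming an integer pixel grid of rows 0 to height-1, not any particular GUI API):

```python
# Converting a point between a top-left origin (Windows-style) and a
# bottom-left origin (OS/2 PM-style). Only y flips; x is shared.
def top_left_to_bottom_left(x, y, screen_height):
    # Row 0 at the top becomes row (height - 1) counted from the bottom.
    return x, screen_height - 1 - y

def bottom_left_to_top_left(x, y, screen_height):
    # The flip is its own inverse, so the same formula works both ways.
    return x, screen_height - 1 - y

print(top_left_to_bottom_left(10, 0, 768))    # (10, 767): top row of the screen
print(bottom_left_to_top_left(10, 767, 768))  # (10, 0): round-trips back
```

This one-line flip is exactly the kind of detail that bit developers porting code between the two environments.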

The "fun" takeaway from this is that while "industry analysts" have been consistently wrong for at least the last 30 years, we're still listening to what they say...

Concurrent DOS, great stuff!

One of the first multiuser systems I really learned as a spotty teenager back in the late '80s, working with my Dad on getting shared DataEase databases working at his workplace. We had C/DOS on the "master" PC system and a couple of Wyse terminals hanging off the serial ports: 3 systems independently using a shared, albeit cut-down, PC-based RDBMS. I loved it so much as a kid that I ended up with a career working in database systems.

Better than OS/2 / Win9x / NT

Around the time OS/2 was making its way onto the scene and most people used DESQview to multitask on their 286/386 PC, Quantum Software Systems (now QNX Software Systems) had a real-time multiuser, multitasking, networked and distributed OS available for 8086/8088 & 80286 processors.

On PC/XT hardware it ran without any memory protection, but the same binaries would run on the real-mode and protected-mode kernels.

Something about being able to do

$ [2] unzip [3]/tmp/

would run the unzip program on node 2, read as a source archive the /tmp/ file on node 3 and extract the files into the current working directory.
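The node-prefix idea in that command line can be sketched with a toy parser. To be clear, the `[n]` notation, the default node number and the sample filename below are illustrative assumptions, not QNX's actual syntax or API; the point is just that a node number and a path travel together in one argument.

```python
import re

# Toy parser for a "[node]path" notation like the QNX example above.
def split_node(arg, default_node=1):
    """Split an argument into (node, path); bare paths get the default node."""
    m = re.match(r"\[(\d+)\](.*)", arg)
    if m:
        return int(m.group(1)), m.group(2)
    return default_node, arg

print(split_node("[3]/tmp/archive.zip"))  # (3, '/tmp/archive.zip')
print(split_node("/etc/hosts"))           # (1, '/etc/hosts')
```

Once every path carries a node number like this, "run the program on node 2 against a file on node 3" is just routing each open to the right machine, which is what made the distributed setup feel like one system.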

We accessed a local BBS that ran a 4-node QNX network (6 incoming phone lines + X.25 (Datapac) )

Even supported diskless client booting and the sharing of any device over the network. Though at around $1k for a license, it wasn't "mainstream".

It's too bad the few times Quantum tried to take it mainstream, the plans failed. Both the Unisys ICON and a new Amiga had chosen QNX as the base for their OS.

OS/2 from the Big App developers' POV

WordPerfect's then-CEO wrote this memoir and eventually put it on the web for free. It's actually a good read, especially if you've never experienced the insane self-absorption of American corporate insiders. Which is why WordPerfect went from biggest in the world to *pop* so suddenly.

OS/2 first mentioned near end of Ch.8 and then passim. It shows quite a different view of OS/2.

Productive OS/2?

We had a productive OS/2 machine at one of our sites up until very recently. I think the only reason it was got rid of was that the site was axed.

It was running a protein separation machine and had to dual-boot into Win98 if you wanted to copy your data to the NAS. It was impressive that it lasted so long, bearing in mind it lived in a cold room at <5°C.

1 hr ImpureScience

Still Sort Of Miss It

I really liked OS/2, and for a while I thought it would take the place that Windows now owns. But IBM had no idea how to sell to single end users and get developers on board. Despite having a superior product, their financial policies guaranteed that the only customers ended up being banks and insurance companies.

I'm a musician, and I remember going on for over a year with the guy they had put in charge of MIDI on OS/2. It never happened, because which bank, or what insurance company, would be interested?

[Apr 05, 2017] Today in Tech – 1911

Apr 05, 2017

April 5, 1911 – Born on this day is Cuthbert Hurd, a mathematician hired by IBM in 1949. At the time, Hurd was only the second IBM employee hired with a Ph.D. While he may not be widely known, his contribution to IBM and the development of computers is invaluable. During the early 1950s IBM profited greatly from traditional punch card accounting. It was Hurd who quietly encouraged the upper management of IBM to enter the field of computing, as a cross-country sales trip revealed pent-up demand for scientific computers. It was a difficult move for the company, but a rewarding one. Hurd was able to sell 10 of the 18 computers marketed as the IBM 701, IBM's first commercial scientific machine, which rented for $18,000 a month.

Hurd soon became director of the IBM Electronic Data Processing Machines Division, and later became president of the Computer Usage Company, the first independent computer software company.

[Mar 29, 2017] Today in Tech – 1989

Mar 29, 2017

March 29, 1989 – Pixar wins an Academy Award for Tin Toy , the first fully computer-animated work to ever win best animated short film. The film was directed by John Lasseter and financed by then Pixar owner Steve Jobs. It was an official test of the PhotoRealistic RenderMan software. The short film later caught the attention of Disney, which partnered with Pixar to create the highly successful Toy Story , an animated movie turned film franchise with elements inspired by the original short film.

[Mar 22, 2017] Today in Tech – 1971

Mar 22, 2017

This is the first of a new blog series, "Today in Tech", and will feature the significant technological events of specific dates throughout history. Today in Tech:

1971 – Intel announces that the world's first commercial microprocessor – the Intel 4004 – is officially ready for shipping. The 4-bit central processing unit was designed by engineers Ted Hoff and Stan Mazor under the leadership of Federico Faggin, and with the assistance of Masatoshi Shima, an engineer from the Japanese firm Busicom. The microprocessor was designed in April of 1970 and completed in January of 1971, the same year when it was first made commercially available.

[Feb 21, 2017] Designing and managing large technologies

Feb 21, 2017
RC AKA Darryl, Ron : February 20, 2017 at 04:48 AM
RE: Designing and managing large technologies

[This is one of those days where the sociology is better than the economics or even the political history.]

What is involved in designing, implementing, coordinating, and managing the deployment of a large new technology system in a real social, political, and organizational environment? Here I am thinking of projects like the development of the SAGE early warning system, the Affordable Care Act, or the introduction of nuclear power into the civilian power industry.

Tom Hughes described several such projects in Rescuing Prometheus: Four Monumental Projects That Changed the Modern World. Here is how he describes his focus in that book:

Telling the story of this ongoing creation since 1945 carries us into a human-built world far more complex than that populated earlier by heroic inventors such as Thomas Edison and by firms such as the Ford Motor Company. Post-World War II cultural history of technology and science introduces us to system builders and the military-industrial-university complex. Our focus will be on massive research and development projects rather than on the invention and development of individual machines, devices, and processes. In short, we shall be dealing with collective creative endeavors that have produced the communications, information, transportation, and defense systems that structure our world and shape the way we live our lives. (3)

The emphasis here is on size, complexity, and multi-dimensionality. The projects that Hughes describes include the SAGE air defense system, the Atlas ICBM, Boston's Central Artery/Tunnel project, and the development of ARPANET...

[Of course read the full text at the link, but here is the conclusion:]

...This topic is of interest for practical reasons -- as a society we need to be confident in the effectiveness and responsiveness of the planning and development that goes into large projects like these. But it is also of interest for a deeper reason: the challenge of attributing rational planning and action to a very large and distributed organization at all. When an individual scientist or engineer leads a laboratory focused on a particular set of research problems, it is possible for that individual (with assistance from the program and lab managers hired for the effort) to keep the important scientific and logistical details in mind. It is an individual effort. But the projects described here are sufficiently complex that there is no individual leader who has the whole plan in mind. Instead, the "organizational intentionality" is embodied in the working committees, communications processes, and assessment mechanisms that have been established.

It is interesting to consider how students, both undergraduate and graduate, can come to have a better appreciation of the organizational challenges raised by large projects like these. Almost by definition, study of these problem areas in a traditional university curriculum proceeds from the point of view of a specialized discipline -- accounting, electrical engineering, environmental policy. But the view provided from a discipline is insufficient to give the student a rich understanding of the complexity of the real-world problems associated with projects like these. It is tempting to think that advanced courses for engineering and management students could be devised making extensive use of detailed case studies as well as simulation tools that would allow students to gain a more adequate understanding of what is needed to organize and implement a large new system. And interestingly enough, this is a place where the skills of humanists and social scientists are perhaps even more essential than the expertise of technology and management specialists. Historians and sociologists have a great deal to add to a student's understanding of these complex, messy processes.

[A big YEP to that.]

cm -> RC AKA Darryl, Ron... , February 20, 2017 at 12:32 PM
Another rediscovery that work is a social process. But certainly well expressed.

It (or the part you quoted) also doesn't say, but hints at the obvious "problem" - social complexity and especially the difficulty of managing large scale collaboration. Easier to do when there is a strong national or comparable large-group identity narrative, almost impossible with neoliberal YOYO. You can always compel token effort but not the "intangible" true participation.

People are taught to ask "what's in it for me", but the answer better be "the same as what's in it for everybody else" - and literally *everybody*. Any doubts there and you can forget it. The question will usually not be asked explicitly or in this clarity, but most people will still figure it out - if not today then tomorrow.

[Jan 11, 2017] Fake History Alert Sorry BBC, but Apple really did invent the iPhone

Notable quotes:
"... In many ways Treo/Palm and Windows CE anticipated it, but especially the latter tried to bring a "desktop" UI on tiny devices (and designed UIs around a stylus and a physical keyboard). ..."
"... The N900, N810 and N800 are to this day far more "little computers" than any other smartphone so far. Indeed, as they ran a Debian Linux derivative with a themed Enlightenment based desktop, which is pretty much off the shelf Linux software. While they didn't have multitouch, you could use your finger on the apps no problem. It had a stylus for when you wanted extra precision though. ..."
"... I was reading a BBC news web article and it was wrong too. It missed out emphasising that the real reason for success in 2007 was the deals with operators, cheap high cap data packages, often bundled with iPhone from the Mobile Operator. ..."
"... Actually if you had a corporate account, you had a phone already with email, Apps, ability to read MS Office docs, web browser and even real Fax send/receive maybe 5 or 6 years before the iPhone. Apart from an easier touch interface, the pre-existing phones had more features like copy/paste, voice control and recording calls. ..."
"... I remember having a motorola A920 way back in 2003/2004 maybe, and on that I made video calls, went online, had a touch interface, ran 'apps', watched videos.... in fact I could do everything the iPhone could do and more... BUT it was clunky and the screen was not large... the iPhone was a nice step forward in many ways but also a step back in functionality ..."
"... Apple invented everything... They may have invented the iPhone but they DID NOT invent the "smartphone category" as that article suggests. ..."
"... Microsoft had Smartphone 2002 and Pocket PC 2000 which were eventually merged into Windows Mobile and, interface aside, were vastly superior to the iPhone's iOS. ..."
"... Devices were manufactured in a similar fashion to how android devices are now - MS provided the OS and firms like HTC, HP, Acer, Asus, Eten, Motorola made the hardware. ..."
"... The government was looking for a display technology for aircraft that was rugged, light, low powered and more reliable than CRTs. They also wanted to avoid the punitive royalties taken by RCA on CRTs. It was the work done in the 1960s by the Royal Radar Establishment at Malvern and George William Gray and his team at the University of Hull that led to modern LCDs. QinetiQ, which inherited RSRE's intellectual property rights, is still taking royalties on each display sold. ..."
"... The key here is that Steve Jobs had the guts to force the thought of a useful smartphone, gadget for the user first and phone second into the minds of the Telcos, and he was the one to get unlimited/big data bundles. ..."
"... He identified correctly, as many had before but before the power to do anything about it, that the customers are the final users, not the telcos. ..."
Jan 11, 2017 |


Re: The point stands

the point is flat on its back, just like the sophistic reply.

Let's take Apple's first machines: they copied the mouse from Olivetti, and they took the OS look from a Rank Xerox engineer's work. The private sector takes risks and plagiarizes when it can, but the missing person here is the amateur. Take the BBS: private individuals designed, built and ran it, it was the precursor to the net, and a lot of .com companies like AOL and CompuServe were born there.

And the poor clarity in the BBC article is mind-numbing. The modern tech industry has the Fairchild camera company as its granddaddy, which is about as far from federal or state intervention and innovation as you can get.

Deconstructionism only works when you understand the brief and use correct and varied sources, not just one crackpot seeking attention.


Re: Engineering change at the BBC?

"The BBC doesn't "do" engineering "

CEEFAX, PAL Colour TV, 625 line transmissions, The BBC 'B', Satellite Broadcasting, Digital Services, the iPlayer, micro:bit, Smart TV services.

There's also the work that the BBC did in improving loudspeakers including the BBC LS range. That work is one reason that British loudspeakers are still considered among the world's best designs.

By all means kick the BBC, but keep it factual.


Re: I thought I invented it.

That was the first market demographic - iPod users happy to buy one that could also make calls. But that's also where Nokia failed spectacularly - it was by nature phone-centric. Its models were phones that could also do something else. True smartphones are instead little computers that can also make phone calls.

In many ways Treo/Palm and Windows CE anticipated it, but especially the latter tried to bring a "desktop" UI on tiny devices (and designed UIs around a stylus and a physical keyboard).

The iPod probably taught Apple that you need a proper "finger-based" UI for this kind of device - especially for the consumer market - and multitouch solved a lot of problems.


Re: I thought I invented it.

Shortly there-after I duct-taped 4 of them together and invented the tablet.

My version of it all is that the glory goes to iTunes for a consumer-friendly interface (ignore that concept, Linux guys) and easy music purchases; the rest was natural progression and Chinese slave labor.

Smart phones and handheld computers were definitely driven by military dollars world wide but so was the internet. All that fact shows is that a smart balance of Capitalism & Socialism can go a long way.


Re: I thought I invented it.

>That was the first market demographic - iPod users happy to buy one that could also make calls. But that's also where Nokia failed spectacularly - it was by nature phone-centric. Its models were phones that could also do something else. True smartphones are instead little computers that can also make phone calls. In many ways Treo/Palm and Windows CE anticipated it, but especially the latter tried to bring a "desktop" UI on tiny devices (and designed UIs around a stylus and a physical keyboard). The iPod probably taught Apple that you need a proper "finger-based" UI for this kind of device - especially for the consumer market - and multitouch solved a lot of problems.

I don't know exactly why Nokia failed, but it wasn't because their smartphones were "phone centric". The N900, N810 and N800 are to this day far more "little computers" than any other smartphone. Indeed, they ran a Debian Linux derivative with a themed Enlightenment-based desktop, which is pretty much off-the-shelf Linux software. While they didn't have multitouch, you could use your finger on the apps no problem. It had a stylus for when you wanted extra precision, though.

I could apt-get (with some sources tweaking) what I wanted outside of their apps. You could also compile and run proper Linux desktop apps on it, including openoffice (back in the day). It ran like a dog and didn't fit the "mobile-UI" they created, but it worked.

It also had a proper X server, so I could forward any phone app to my big PC if I didn't feel like messing about on a small touchscreen. To this day I miss this ability. To just connect via SSH to my phone over wifi, run a smartphone app, and have it appear on my desktop like any other app would.
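The forwarding trick described above boils down to one ssh invocation. A minimal sketch of the generic X11-forwarding mechanism (the host name `n900.local` and app name `osso-notes` are made up for illustration; this is not Maemo-specific tooling):

```python
import shlex

def x_forward_cmd(host: str, app: str, user: str = "user") -> list[str]:
    """Build an ssh command that runs a remote app with X11 forwarding
    (-X), so the app's window opens on the local machine's X server."""
    return ["ssh", "-X", f"{user}@{host}", app]

# Hypothetical host and app names, purely illustrative.
cmd = x_forward_cmd("n900.local", "osso-notes")
print(shlex.join(cmd))  # ssh -X user@n900.local osso-notes
```

Running the printed command on a desktop with an X server (and sshd on the phone) is all the "magic" there was.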

It had xterm, it had Perl built in, it had Python (a lot of it was written in Python), you even could install a C toolchain on it and develop C code on it. People ported standard desktop UIs on it, and with a VNC/RDP server you could use it as a portable computer just fine (just connect to it using a thin client, or a borrowed PC).

I had written little scripts to batch-send New Year's SMS to contacts, and even piped the output of "fortune" to a select few numbers just for kicks (the days of free SMS and no chat apps). To this day I have no such power on my modern phones.

Damn, now that I think back, it really was a powerful piece of kit. I actually still miss the features *sniff*

And now that I think about it, I suspect they failed because their phones were too much "little computers" at a time when people wanted a phone. Few people (outside of geeks) wanted to fiddle with X forwarding, install SSH, script/program/modify, or otherwise customise their stuff.

Arguably the weakest app on the N900 was the phone application itself, which was not open source and so could not be improved by the community; so much so that people used to say it wasn't really a phone, rather a computer with a phone attached, which is exactly what I wanted.


Invention of iPhone

It wasn't even really an invention.

The BBC frequently "invents" tech history. They probably think MS and IBM created personal computing, when in fact they held it back for 10 years and destroyed innovative companies along the way.

The only significant part was the touch interface by Fingerworks.

I was reading a BBC news web article and it was wrong too. It failed to emphasise that the real reason for success in 2007 was the deals with operators: cheap, high-cap data packages, often bundled with the iPhone by the mobile operator.

This is nonsense:

"Those were the days, by the way, when phones were for making calls but all that was about to change."

Actually if you had a corporate account, you had a phone already with email, Apps, ability to read MS Office docs, web browser and even real Fax send/receive maybe 5 or 6 years before the iPhone. Apart from an easier touch interface, the pre-existing phones had more features like copy/paste, voice control and recording calls.

The revolution was ordinary consumers being able to have a smartphone AND afford the data. The actual HW was commodity stuff. I had the dev system for the SC6400 Samsung ARM CPU used in it.

Why did other phones use resistive + stylus instead of capacitive finger touch?

Capacitive touch existed in the late 1980s, but the "holy grail" was handwriting recognition, not gesture control, though Xerox and IIS had both worked on it and gestures were defined before the 1990s. So the UK guy didn't invent anything.

Also irrelevant.

Mine's the one with an N9110 and later an N9210 in the pocket. The first commercial smartphone was in 1998 and was crippled by high per-MByte or per-second (or both!) charging. Also, in 2002 max speed was often 28K, yet in 2005 my landline was still 19.2K till I got broadband, though I had 128K in the 1990s in the city (ISDN) before I moved.


Re: Invention of iPhone

The ground breaking elements of the iPhone were all to do with usability:

The fixed price data tariff was - to me - the biggest innovation. It may have been the hardest to do, as it involved entrenched network operators in a near monopoly. The hardware engineers only had to deal with the laws of physics.

The apple store made it easy to purchase and install apps and media. Suddenly you didn't have to be a geek or an innovator to make your phone do something useful or fun that the manufacturer didn't want to give to everyone.

The improved touch interface, the styling, and Apple's cachet all helped, and, I assume, fed into the efforts to persuade the network operators to give the average end user access to data without fear.


Re: Invention of iPhone

"Those were the days, by the way, when phones were for making calls but all that was about to change."

I remember having a motorola A920 way back in 2003/2004 maybe, and on that I made video calls, went online, had a touch interface, ran 'apps', watched videos.... in fact I could do everything the iPhone could do and more... BUT it was clunky and the screen was not large... the iPhone was a nice step forward in many ways but also a step back in functionality


Re: Invention of iPhone

"The fixed price data tariff was - to me - the biggest innovation".

In my experience, the iPhone killed the "all you can eat" fixed-price data tariffs.

I purchased a HTC Athena (T-Mobile Ameo) on a T-Mobile-Web and Walk contract in Feb 2007. I had unlimited 3.5G access (including tethering) and fixed call minutes/texts.

When it was time to upgrade, I was told that iPhone 3G users were using too much data and that T-Mobile was no longer offering unlimited internet access.

Robert Carnegie

"First smartphone"

For fun, I put "first smartphone" into Google. It wasn't Apple's. I think a BBC editor may have temporarily said that it was.

As for Apple inventing the first multitouch smartphone, though - there are claims, with some credibility, that Apple's engineers wanted to put a keyboard on their phone. The BlackBerry phone had a keyboard. But Steve Jobs wanted a phone that you could work with your finger (without a keyboard).

One finger.

If you're only using one finger, you're not actually using multi-touch?


Apple invented everything... They may have invented the iPhone but they DID NOT invent the "smartphone category" as that article suggests.

Microsoft had Smartphone 2002 and Pocket PC 2000 which were eventually merged into Windows Mobile and, interface aside, were vastly superior to the iPhone's iOS.

Devices were manufactured in a similar fashion to how android devices are now - MS provided the OS and firms like HTC, HP, Acer, Asus, Eten, Motorola made the hardware.

People rarely know how long HTC has been going, as they used to OEM stuff for the networks - like the original Orange SPV (HTC Canary), a candybar-style device running Microsoft Smartphone 2002. Or the original O2 XDA (HTC Wallaby), one of the first Pocket PC "phone edition" devices and, IIRC, the first touchscreen smartphone to be made by HTC.


Re: Apple invented everything...

Yup, I had Windows-based smartphones made by Qtek and HTC, and my first smartphone was an Orange SPV M2000 (a Qtek 9090) three years before the first iPhone, and after that an O2 XDA, which in 2006 had GPS, MMS, and an SD card slot that held music for my train commute.

Now I'm a fan of the Note series; I had one capacitive-screen smartphone without a stylus (HTC HD2), and missed the stylus too much.


Re: Apple invented everything...

Lotaresco, I used to review a lot of the devices back in the day, as well as using them daily and modifying them (my phone history for ref: ). Not once did they ever fail to make a phone call. Maybe the journalist was biased and made it up (Symbian was massively under threat at the time and all sorts of bullshit stories were flying about), maybe he had dodgy hardware, who knows.

Either way, it doesn't mean that the OS as a whole wasn't superior to what Nokia and Apple produced - because in every other way, it was.


Re: Apple invented everything...


"The weak spot for Microsoft was that it decided to run telephony in the application layer. This meant that any problem with the OS would result in telephony being lost....

Symbian provided a telephone which could function as a computer. The telephony was a low-level service and even if the OS crashed completely you could still make and receive calls. Apple adopted the same architecture, interface and telephony are low level services which are difficult to kill."

Sorry, but if iOS (or Symbian) crashes you cannot make calls. In what capacity were you evaluating phones in 2002? I cannot recall ever seeing a Windows Mobile blue screen. It would hang from time to time, but it never blue-screened.


Seeing how much free advertising the BBC has given Apple over the years I doubt they will care.

And let's be honest here, the guy is kinda correct. We didn't just go from a dumb phone to a smartphone; there was a gradual move towards it as processing power increased and electronic packages got smaller. Had we gone from the old brick phones straight to an iPhone, then I would agree that they owned something like TNT.

Did Apple design the iPhone - Yes, of course.

Did Apple invent the Smart Phone - Nope.

IBM had a touch screen "smart" phone in 1992 that had a square screen with rounded corners.

What Apple did was put it into a great package with a great store behind it, and they made sure it worked - and worked well. I personally am not fond of Apple due to the huge price premium they demand and their overly locked-down ecosystems, but I will admit it was a wonderful product design.


Re: "opinion pieces don't need to be balanced"

"I am no fan of Apple, but to state that something was invented by the State because everyone involved went to state-funded school is a kindergarten-level of thinking that has no place in reasoned argument."

It's actually "Intellectual Yet Idiot" level thinking. Google it. You're right that arguments of this calibre have no place in reasoned argument, but the presence of this sort of quality thinking being shoved down people's throats by the media is why a hell of a lot of people are "fed up with experts".


Hmmm....iPhone 1.0

I actually got one of these for my wife. It was awful. It almost felt like a beta product.

I think it's reasonably fair to say that it was the app store that really allowed the iPhone to become so successful, combined with the then Apple aura and mystique that Jobs was bringing to their products.

As to who invented this bit or that bit - I suggest you could pull most products released in the last 10-20 years and have the same kind of arguments.

But poor show on the beeb for their lack of fact checking on this one.


Re: Hmmm....iPhone 1.0

"...The original iPhone definitely has a proximity sensor. It is possible that your wife's phone was faulty or there was a software issue...."

Have an upvote - hers definitely never worked (and at the time I didn't even know it was supposed to be there), so yeah, probably faulty. I'd just assumed it didn't have one.


There is of course...

.. the fact that the iPhone wouldn't exist without its screen and all LCD displays owe their existence to (UK) government sponsored research. So whereas I agree that Mazzucato is guilty of rabidly promoting an incorrect hypothesis to the status of fact, there is this tiny kernel of truth.

The government was looking for a display technology for aircraft that was rugged, light, low powered and more reliable than CRTs. They also wanted to avoid the punitive royalties taken by RCA on CRTs. It was the work done in the 1960s by the Royal Radar Establishment at Malvern and George William Gray and his team at the University of Hull that led to modern LCDs. QinetiQ, which inherited RSRE's intellectual property rights, is still taking royalties on each display sold.

anonymous boring coward

Re: There is of course...

I had a calculator in the late 1970s with an LCD display. It had no resemblance to my phone's display.

Not even my first LCD screened laptop had much resemblance with a phone's display. That laptop had a colour display, in theory. If looked at at the right angle, in the correct light.

Innovation is ongoing, and not defined by some initial stumbling attempts.


Apple invented the iPhone...

... in the same way that Ford invented the Model T, Sony invented the Walkman or Nintendo invented the Wii. They took existing technologies, iterated and integrated them, and presented them in the right way in the right place at the right time.

And that's been true of pretty much every invention since someone discovered how to knap flint.

As to how much of a part the state had to play: a lot of things - especially in the IT and medical field - have been spun out of military research, though by the same token, much of this is done by private companies funded by government sources.

Equally, a lot of technology has been acquired through trade, acquisition or outright theft. In WW2, the United Kingdom gave the USA a lot of technology via the Tizard mission (and later, jet-engine technology was also licenced), and both Russia and the USA "acquired" a lot of rocket technology by picking over the bones of Germany's industrial infrastructure. Then, Russia spent the next 40 years stealing whatever nuclear/military technology it could from the USA - though I'm sure some things would have trickled the other way as well!

Anyway, if you trace any modern technology back far enough, there will have been state intervention. That shouldn't subtract in any way from the work done by companies and individuals who have produced something where the sum is greater than the parts...


Re: Apple invented the iPhone...

... in the same way that Ford invented the Model T, Sony invented the Walkman or Nintendo invented the Wii. They took existing technologies, iterated and integrated them, and presented them in the right way in the right place at the right time.

And that's been true of pretty much every invention since someone discovered how to knap flint.

Not so sure. Singer did a little more with respect to the sewing machine - his was the first that actually worked. Likewise, Marconi was the first with a working wireless. Yes, both made extensive use of existing technology, but both clearly made that final inventive step; something that isn't so clear in the case of the examples you cite.

Equally, a lot of technology has been acquired through trade, acquisition or outright theft.

Don't disagree, although your analysis omitted Japanese and Chinese acquisition of 'western' technology and know-how...

Anyway, if you trace any modern technology back far enough, there will have been state intervention.

Interesting point, particularly when you consider the case of John Harrison, the inventor of the marine chronometer. Whilst the government did offer a financial reward, it was very reluctant to actually pay anything out...

Aitor 1

Apple invented the iPhone, but not the smartphone.

The smartphone had been shown before in several incarnations, including the "all touch screen" one, several years before Apple decided to dabble in smartphones. So no invention here.

As for the experience, again, nothing new. All thought of before, and in good part even implemented.

The key here is that Steve Jobs had the guts to force the thought of a useful smartphone, gadget for the user first and phone second into the minds of the Telcos, and he was the one to get unlimited/big data bundles.

He identified correctly, as many had before but without the power to do anything about it, that the customers are the final users, not the telcos.

The rest of the smartphones were culled before birth by the telecom industry, as they demanded certain "features" that nobody wanted but that lined their pockets nicely with minimum investment.

So I thank Steve Jobs for that and for being able to buy digital music.

[Dec 26, 2016] FreeDOS 1.2 Is Finally Released

Notable quotes:
"... Jill of the Jungle ..."
Dec 26, 2016 |

Posted by EditorDavid on Sunday December 25, 2016 @02:56PM from the long-term-projects dept.

Very long-time Slashdot reader Jim Hall -- part of GNOME's board of directors -- has a Christmas gift. Since 1994 he's been overseeing an open source project that maintains a replacement for the MS-DOS operating system, and has just announced the release of the "updated, more modern" FreeDOS 1.2 !

[Y]ou'll find a few nice surprises. FreeDOS 1.2 now makes it easier to connect to a network. And you can find more tools and games, and a few graphical desktop options including OpenGEM. But the first thing you'll probably notice is the all-new installer that makes it much easier to install FreeDOS. And after you install FreeDOS, try the FDIMPLES program to install new programs or to remove any you don't want. The official announcement is also available at the FreeDOS Project blog.

FreeDOS also lets you play classic DOS games like Doom, Wolfenstein 3D, Duke Nukem, and Jill of the Jungle -- and today marks a very special occasion, since it's been almost five years since the release of FreeDOS 1.1. "If you've followed FreeDOS, you know that we don't have a very fast release cycle," Jim writes on his blog. "We just don't need to; DOS isn't exactly a moving target anymore..."

[Nov 24, 2016] American Computer Scientists Grace Hopper, Margaret Hamilton Receive Presidential Medals of Freedom

Nov 23, 2016 |

Posted by BeauHD on Wednesday November 23, 2016 @02:00AM from the blast-from-the-past dept.

An anonymous reader quotes a report from FedScoop:

President Barack Obama awarded Presidential Medals of Freedom to two storied women in tech -- one posthumously to Grace Hopper, known as the "first lady of software," and one to programmer Margaret Hamilton. Hopper worked on the Harvard Mark I computer, and invented the first compiler.

"At age 37 and a full 15 pounds below military guidelines, the gutsy and colorful Grace joined the Navy and was sent to work on one of the first computers, Harvard's Mark 1," Obama said at the ceremony Tuesday. "She saw beyond the boundaries of the possible and invented the first compiler, which allowed programs to be written in regular language and then translated for computers to understand." Hopper followed her mother into mathematics, and earned a doctoral degree from Yale, Obama said.

She retired from the Navy as a rear admiral. "From cell phones to Cyber Command, we can thank Grace Hopper for opening programming up to millions more people, helping to usher in the Information Age and profoundly shaping our digital world," Obama said. Hamilton led the team that created the onboard flight software for NASA's Apollo command modules and lunar modules, according to a White House release.

"At this time software engineering wasn't even a field yet," Obama noted at the ceremony. "There were no textbooks to follow, so as Margaret says, 'there was no choice but to be pioneers.'" He added: "Luckily for us, Margaret never stopped pioneering. And she symbolizes that generation of unsung women who helped send humankind into space."

[Sep 06, 2016] The packet switching methodology employed in the ARPANET was based on concepts and designs by Americans Leonard Kleinrock and Paul Baran, British scientist Donald Davies, and Lawrence Roberts of the Lincoln Laboratory

Notable quotes:
"... The packet switching methodology employed in the ARPANET was based on concepts and designs by Americans Leonard Kleinrock and Paul Baran, British scientist Donald Davies, and Lawrence Roberts of the Lincoln Laboratory.[6] The TCP/IP communications protocols were developed for ARPANET by computer scientists Robert Kahn and Vint Cerf, and incorporated concepts by Louis Pouzin for the French CYCLADES project. ..."
"... In 1980 DoD was a huge percent of the IC business, a lot of the R&D was done at Bell Labs, some of that for telecom not DoD. By 1995 or so DoD was shuttering its IC development as it was all being done for Wii. Which is a minor cause for why software is so hard for DoD; the chips are not under control and change too fast. ..."
"... About 20 years ago I conversed with a fellow who was in ARPANET at the beginning. We were getting into firewalls at the time with concerns for security (Hillary was recently elected to the senate) and he was shaking his head saying: "It was all developed for collaboration.... security gets in the way". ..."
Sep 05, 2016 |

pgl : Monday, September 05, 2016 at 11:07 AM

Al Gore could not have invented the Internet since Steve Jobs is taking the bow for that. Actually Jobs started NeXT which Apple bought in 1997 for a mere $427 million. NeXT had sold a couple of computer models that did not do so well but the platform software allowed Apple to sell Web based computers. BTW - the internet really began in the 1980's as something called Bitnet. Really clunky stuff back then but new versions and applications followed. But yes - the Federal government in the 1990's was very supportive of the ICT revolution.
ilsm -> pgl... , Monday, September 05, 2016 at 11:59 AM
DARPA did most of it to keep researchers talking.
RC AKA Darryl, Ron -> pgl... , Monday, September 05, 2016 at 12:35 PM

The Advanced Research Projects Agency Network (ARPANET) was an early packet switching network and the first network to implement the protocol suite TCP/IP. Both technologies became the technical foundation of the Internet. ARPANET was initially funded by the Advanced Research Projects Agency (ARPA) of the United States Department of Defense.[1][2][3][4][5]

The packet switching methodology employed in the ARPANET was based on concepts and designs by Americans Leonard Kleinrock and Paul Baran, British scientist Donald Davies, and Lawrence Roberts of the Lincoln Laboratory.[6] The TCP/IP communications protocols were developed for ARPANET by computer scientists Robert Kahn and Vint Cerf, and incorporated concepts by Louis Pouzin for the French CYCLADES project.

As the project progressed, protocols for internetworking were developed by which multiple separate networks could be joined into a network of networks. Access to the ARPANET was expanded in 1981 when the National Science Foundation (NSF) funded the Computer Science Network (CSNET). In 1982, the Internet protocol suite (TCP/IP) was introduced as the standard networking protocol on the ARPANET. In the early 1980s the NSF funded the establishment for national supercomputing centers at several universities, and provided interconnectivity in 1986 with the NSFNET project, which also created network access to the supercomputer sites in the United States from research and education organizations. ARPANET was decommissioned in 1990...


By mid-1968, Taylor had prepared a complete plan for a computer network, and, after ARPA's approval, a Request for Quotation (RFQ) was issued for 140 potential bidders. Most computer science companies regarded the ARPA–Taylor proposal as outlandish, and only twelve submitted bids to build a network; of the twelve, ARPA regarded only four as top-rank contractors. At year's end, ARPA considered only two contractors, and awarded the contract to build the network to BBN Technologies on 7 April 1969. The initial, seven-person BBN team were much aided by the technical specificity of their response to the ARPA RFQ, and thus quickly produced the first working system. This team was led by Frank Heart. The BBN-proposed network closely followed Taylor's ARPA plan: a network composed of small computers called Interface Message Processors (or IMPs), similar to the later concept of routers, that functioned as gateways interconnecting local resources. At each site, the IMPs performed store-and-forward packet switching functions, and were interconnected with leased lines via telecommunication data sets (modems), with initial data rates of 56kbit/s. The host computers were connected to the IMPs via custom serial communication interfaces. The system, including the hardware and the packet switching software, was designed and installed in nine months...
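The store-and-forward role of the IMPs described above can be illustrated with a toy sketch. The three-node topology and routing table below are made up for illustration; this is nothing like the real IMP software, just the hop-by-hop idea:

```python
# Toy next-hop routing tables for a three-node line: A - B - C
ROUTES = {
    "A": {"B": "B", "C": "B"},
    "B": {"A": "A", "C": "C"},
    "C": {"A": "B", "B": "B"},
}

def store_and_forward(src: str, dst: str) -> list[str]:
    """Move one packet hop by hop: each node receives ('stores') the
    whole packet before forwarding it toward the destination."""
    path, node = [src], src
    while node != dst:
        node = ROUTES[node][dst]  # look up the next hop toward dst
        path.append(node)
    return path

print(store_and_forward("A", "C"))  # ['A', 'B', 'C']
```

A packet from A to C is first held in full at B, then relayed onward, which is why leased 56kbit/s lines between IMPs sufficed.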

sanjait -> RC AKA Darryl, Ron... , Monday, September 05, 2016 at 01:09 PM
Though the thing we currently regard as "the Internet", including such innovations as the world wide web and the web browser, was developed as part of "the Gore bill" from 1991.

In case anyone is trying to argue Gore didn't massively contribute to the development of the Internet, as he claimed.

pgl -> sanjait... , Monday, September 05, 2016 at 02:37 PM
So the American government help paved the way for this ICT revolution. Steve Jobs figures out how Apple could make incredible amounts of income off of this. He also shelters most of that income in a tax haven so Apple does not pay its share of taxes. And Tim Cook lectures the Senate in May of 2013 why they should accept this. No wonder Senator Levin was so upset with Cook.
ilsm -> pgl... , Monday, September 05, 2016 at 04:29 PM
In 1980 DoD was a huge percent of the IC business, a lot of the R&D was done at Bell Labs, some of that for telecom not DoD. By 1995 or so DoD was shuttering its IC development as it was all being done for Wii. Which is a minor cause for why software is so hard for DoD; the chips are not under control and change too fast.
ilsm -> RC AKA Darryl, Ron... , Monday, September 05, 2016 at 04:25 PM
About 20 years ago I conversed with a fellow who was in ARPANET at the beginning. We were getting into firewalls at the time with concerns for security (Hillary was recently elected to the senate) and he was shaking his head saying: "It was all developed for collaboration.... security gets in the way".

[Sep 16, 2015] This Is Why Hewlett-Packard Just Fired Another 30,000

"...An era of leadership in computer technology has died, and there is no grave marker, not even a funeral ceremony or eulogy ... Hewlett-Packard, COMPAQ, Digital Equipment Corp, UNIVAC, Sperry-Rand, Data General, Tektronix, ZILOG, Advanced Micro Devices, Sun Microsystems, etc, etc, etc. So much change in so short a time, leaves your mind dizzy."
Zero Hedge


yeah thanks Carly ...

HP made bullet-proof products that would last forever..... I still buy HP workstation notebooks, especially now when I can get them for $100 on ebay ....

I sold HP products in the 1990s .... we had HP laserjet IIs that companies would run day & night .... virtually no maintenance ... when PCL5 came around then we had LJ IIIs .... and still companies would call for LJ I's, .... 100 pounds of invincible Printing ! .... this kind of product has no place in the World of Planned-Obsolesence .... I'm currently running an 8510w, 8530w, 2530p, Dell 6420 quad i7, hp printers hp scanners, hp pavilion desktops, .... all for less than what a Laserjet II would have cost in 1994, Total.

Not My Real Name

I still have my HP 15C scientific calculator I bought in 1983 to get me through college for my engineering degree. There is nothing better than a hand-held calculator that uses Reverse Polish Notation!
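Since RPN comes up repeatedly in this thread: in Reverse Polish Notation the operands come first and the operator follows, so no parentheses are needed. A minimal evaluator sketch, just to show the stack idea (this has nothing to do with the 15C's actual firmware):

```python
import operator

OPS = {"+": operator.add, "-": operator.sub,
       "*": operator.mul, "/": operator.truediv}

def rpn_eval(expr: str) -> float:
    """Evaluate a space-separated RPN expression, e.g. '3 4 +' -> 7.0."""
    stack = []
    for tok in expr.split():
        if tok in OPS:
            b, a = stack.pop(), stack.pop()  # note the operand order
            stack.append(OPS[tok](a, b))
        else:
            stack.append(float(tok))
    return stack.pop()

print(rpn_eval("3 4 + 2 *"))  # (3 + 4) * 2 -> 14.0
```

On a real RPN calculator you key the operands, press Enter to push, and the operator fires immediately; intermediate results just sit on the stack.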


HP used to make fantastic products. I remember getting their RPN calculators back in the '80s; built like tanks.

Then they decided to "add value" by removing more and more material from their consumer/"prosumer" products until they became unspeakably flimsy. They stopped holding things together with proper fastenings and started hot-melting/gluing them together, so if one died you had to cut it open to have any chance of fixing it.

I still have one of their Laserjet 4100 printers. I expect it to outlast anything they currently produce, and it must be going on 16+ years old now.

Fuck you, HP. You started selling shit and now you're eating through your seed corn. I just wish the "leaders" who did this to you had to pay some kind of penalty greater than getting $25M in a severance package.

Fiscal Reality

HP12C. 31 years old and still humming. WTF happened?

Automatic Choke

+100. The path of HP is everything that is wrong about modern business models. I still have a 5MP LaserJet (one of the first); still works great. Also have a 42S as a day-to-day workhorse and several spares. I don't think the present HP could even dream of making these products today.


How well will I profit, as a salesman, if I sell you something that works?

How valuable are you, as a customer in my database, if you never come back?

Confucius say "Buy another one, and if you can't afford it, f'n finance it!"

It's the growing trend. Look at appliances. Nothing works anymore.

Normalcy Bias

Son of Loki

GE to cut Houston jobs as work moves overseas

" Yes we can! "

Automatic Choke

hey big brother.... if you are curious, there is a damn good android emulator of the HP42S available (Free42). really it is so good that it made me relax about accumulating more spares. still not quite the same as a real calculator. (the 42S, by the way, is the modernization/simplification of the classic HP41, the real hardcore, very-programmable, reconfigurable, hackable unit with all the plug-in-modules that came out in the early 80s.)

Miss Expectations

Imagine working at HP and having to listen to Carly Fiorina bulldoze you... here are 4 minutes of Carly and Ralph Nader (if you can take it):

Miffed Microbiologist

My husband has been a software architect for 30 years at the same company. Never before has he seen such sheer unadulterated panic in the executives. All indices are down and they are planning for the worst. Quality is being sacrificed for "just get some relatively functional piece of shit out the door we can sell". He is fighting because he has always produced a stellar product and refuses to have shit tied to his name (90% of competitor benchmarks fail against his projects). They can't afford to lay him off, but for the first time in my life I see my husband want to quit...


I've been an engineer for 31 years - our management's unspoken motto at the place I'm at (large company) is: "release it now, we'll put in the quality later". I try to put in as much as possible before the product is shoved out the door without killing myself doing it.


Do they even make test equipment anymore?

HP test and measurement was spun off many years ago as Agilent. The electronics part of Agilent was spun off as Keysight late last year.

HP basically makes computer equipment (PCs, servers, printers) and software. Part of the problem is that computer hardware has been commoditized. Since PCs are cheap and frequent replacements are needed, people just buy the cheapest models, expecting to toss them in a couple of years and buy a newer model (aka the flat-screen TV model). So there is no justification to use quality components. The same is becoming true of the server market. Businesses have switched to virtualization and/or cloud systems, so instead of taking a boatload of time to rebuild a crashed server, the VM is just moved to another host.

HP has also adopted the Computer Associates business model (aka Borg). HP buys up new tech companies and sits on the tech and never improves it. It decays and gets replaced with a system from a competitor. It also has a habit of buying outdated tech companies that never generate the revenues HP thinks it will.


When Carly was CEO of HP, she instituted a draconian "pay for performance" plan. She ended up leaving with over $146 Million because she was smart enough not to specify "what type" of performance.


An era of leadership in computer technology has died, and there is no grave marker, not even a funeral ceremony or eulogy ... Hewlett-Packard, COMPAQ, Digital Equipment Corp, UNIVAC, Sperry-Rand, Data General, Tektronix, ZILOG, Advanced Micro Devices, Sun Microsystems, etc, etc, etc. So much change in so short a time, leaves your mind dizzy.

[Dec 26, 2014] Donald Knuth Worried About the Dumbing Down of Computer Science History

Dec 26, 2014 | Slashdot

Anonymous Coward on Friday December 26, 2014 @01:58PM (#48676489)

The physics does NOT define Computer Science (Score:5, Insightful)

The physics does NOT define Computer Science. Computer Science has nothing that depends on transistors, or tubes, or levers and gears.

Computers can be designed and built, and computing performed, at many different levels of physical abstraction.

You can do computer science all on paper for fucks sake.

Ever heard of this guy called Alan Turing?

Knuth is right, the ignorance, even among technical people, is astounding

Dracos (107777) on Friday December 26, 2014 @12:59PM (#48676173)

Re:False Summary - Haigh Agrees with Knuth's Thesi (Score:5, Insightful)

there are indeed no good technical histories of computer science, and little prospect of any.

I see the posthumous reactions to Steve Jobs and Dennis Ritchie as indicators that Knuth is absolutely right.

I bet anyone here would agree that co-authoring UNIX is a far more important event than being the iPod/iPhone taskmaster.

ripvlan (2609033) on Friday December 26, 2014 @11:53AM (#48675785)

But wait,there's more (Score:3, Insightful)

I returned to college several years ago after a 20 year hiatus (the first 6 years were my creative period). My first time around I studied what might be called pure Computer Science. A lot has happened in the industry after 20 years and I very much enjoyed conversations in class - esp with the perspective of the younger generation. I found it fascinating how many kids of today hoped to enter the gaming industry (my generation - Zork was popular when I was a kid and Myst was a breakout success on a new level). Kids today see blockbuster gaming as an almost make it rich experience - plus a "real world" job that sounds like fun.

But more interesting was the concepts of Computer Engineering vs Computer Science. What is Science vs Engineering? Are software "engineers" really scientists? Do they need to learn all this sciencey stuff in order to enter the business school? I attended a large semi-well-known University. Back in the '80s the CS department was "owned" by the school of business. Programming computers was thought to be the money maker - only business really used them with a strong overlap into engineering because computers were really big calculators. However it was a real CS curriculum with only 1 class for business majors. Fast forward a dozen years and CS is now part of the Engineering school (with Business on its own). The "kids" wondered why they needed to study Knuth et al when they were just going to be programming games. What about art? Story telling? They planned on using visual creative studio tools to create their works. Why all this science stuff? (this in a haptics class). Should a poet learn algorithms in order to operate MS-Word?

Since computers are ubiquitous they are used everywhere. I tell students to get a degree in what interests them - and learn how to use/program computers because...well..who doesn't use a computer? I used to program my TI calculator in highschool to pump out answers to physics & algebra questions (basic formulas).

Are those who program Excel Macros computer scientists? No. Computer Engineers? no. Business people solving real problems? Yes/maybe. The land is now wider. Many people don't care about the details of landing a man on the moon - but they like it when the velcro strap on their shoes holds properly. They receive entertainment via the Discovery Channel and get the dumbed down edition of all things "science."

When creating entertainment - it needs to be relatable to your target audience. The down and dirty details and technicalities interest only a few of us. My wife's eyes glaze over when I talk about some cool thing I'm working on. Retell it as saving the world and improving quality - she gets it (only to politely say I should go play with the kids -- but at least she was listening to that version of events).

I think that the dumbing down of history is ... well.. history. There was this thing called World War 2. The details I learned in grade school - lots of details. Each battle, names of important individuals. Today - lots of history has happened in the meantime. WW2 is now a bit dumbed down - still an important subject - but students still only have 8 grades in school with more material to cover.

My brain melts when I watch the Discovery Channel. I'm probably not the target audience. The details of historical science probably interest me. The history of Computing needs to be told like "The Social Network."

Virtucon (127420) on Friday December 26, 2014 @12:19PM (#48675913)

it's everywhere (Score:3)

we've raised at least two generations of self obsessed, no attention-span kids who want instant gratification. Retards like Justin Bieber who today tweets that he bought a new plane. As the later generations grow into the workforce and into fields like journalism, history and computer science it's no small wonder they want to reduce everything down to one liners or soundbites. Pick your field because these kids started with censored cartoons and wound up with Sponge Bob. Shit, even the news is now brokered into short paragraphs that just say "this shit happened now onto the next.."

Screw that! Yeah I'm getting older so get the fuck off my lawn!

xororand (860319) writes: on Friday December 26, 2014 @02:07PM ( #48676543)

The Machine That Changed The World

There's a gem of a documentary about the history of computing before the web.

The Machine That Changed the World is the longest, most comprehensive documentary about the history of computing ever produced.

It's a whirlwind tour of computing before the Web, with brilliant archival footage and interviews with key players - several of whom passed away since the filming.

Episode 1 featured Interviews with, including but not limited to:

Paul Ceruzzi (computer historian), Doron Swade (London Science Museum), Konrad Zuse (inventor of the first functional computer and high-level programming language, died in 1995), Kay Mauchly Antonelli (human computer in WWII and ENIAC programmer, died in 2006), Herman Goldstine (ENIAC developer, died in 2004), J. Presper Eckert (co-inventor of ENIAC, died in 1995), Maurice Wilkes (inventor of EDSAC), Donald Michie (Codebreaker at Bletchley Park)

luis_a_espinal (1810296) on Friday December 26, 2014 @03:49PM (#48676989) Homepage

There is a CS dumbing down going on (Score:2)

Donald Knuth Worried About the "Dumbing Down" of Computer Science History

Whether CS education is appropriate to all people who do computed-assisted technical work is very irrelevant to me since practical forces in real life simply solve that issue.

The problem I care about is a problem I've seen in CS for real. I've met quite a few CS grads who don't know who Knuth, Lamport, Liskov, Hoare, Tarjan, or Dijkstra are.

If you (the generic CS grad) do not know who they are, how the hell do you know about basic CS things like routing algorithms, pre and post conditions, data structures, you know, the very basic shit that is supposed to be the bread and butter of CS????

It is ok not to know these things and these people if you are a Computer Engineer, MIS or Network/Telecomm engineer (to a degree dependent on what your job expects from you.)

But if you are Computer Scientist, my God, this is like hiring an Electrical Engineer who doesn't know who Maxwell was. It does not inspire a lot of confidence, does it?

Aikiplayer (804569) on Friday December 26, 2014 @05:59PM (#48677657)

Re:Don't do what they did to math (Score:1)

Knuth did a nice job of articulating why he wants to look at the history of things at the beginning of the video. Those reasons might not resonate with you but he does have definite reasons for wanting technical histories (not social histories which pander to "the stupid") to be written.

[Dec 26, 2014] The Tears of Donald Knuth

History is always written by the winners, and that means right now it is written by neoliberals. Dumbing down the history of computer science is just the application of neoliberalism to one particular narrow field. The essence of neoliberal history is to dumb down everything: a deliberate lowering of the intellectual level of education, literature, cinema, news, and culture. Deliberate dumbing down is the goal.
They use the power of vanity to rob us of the vision which history can provide. Knuth's lecture "Let's Not Dumb Down the History of Computer Science" can be viewed at Kailath Lecture and Colloquia. He made the important point that historical errors are as important as achievements, and probably more educational. In this "drama of ideas" (and he mentioned the high educational value of the errors/blunders of Linus Torvalds in the design of the Linux kernel) errors and achievements all have their place and historical value. History gives people stories that are much more educational than anything else; that's the way people learn best.
Dec 26, 2014 | Communications of the ACM, January 2015

In his lecture Knuth worried that a "dismal trend" in historical work meant that "all we get nowadays is dumbed down" through the elimination of technical detail. According to Knuth "historians of math have always faced the fact that they won't be able to please everybody." He feels that other historians of science have succumbed to "the delusion that ... an ordinary person can understand physics ..."

I am going to tell you why Knuth's tears were misguided, or at least misdirected, but first let me stress that historians of computing deeply appreciate his conviction that our mission is of profound importance. Indeed, one distinguished historian of computing recently asked me what he could do to get flamed by Knuth. Knuth has been engaged for decades with history. This is not one of his passionate interests outside computer science, such as his project reading verses 3:16 of different books of the Bible. Knuth's core work on computer programming reflects a historical sensibility, as he tracks down the origin and development of algorithms and reconstructs the development of thought in specific areas. For years advertisements for IEEE Annals of the History of Computing, where Campbell-Kelly's paper was published, relied on a quote from Knuth that it was the only publication he read from cover to cover. With the freedom to choose a vital topic for a distinguished lecture Knuth chose to focus on history rather than one of his better-known scientific enthusiasms such as literate programming or his progress with The Art of Computer Programming.

... Distinguished computer scientists are prone to blur their own discipline, and in particular a few dozen elite programs, with the much broader field of computing. The tools and ideas produced by computer scientists underpin all areas of IT and make possible the work carried out by network technicians, business analysts, help desk workers, and Excel programmers. That does not make those workers computer scientists. The U.S. alone is estimated to have more than 10 million "information technology workers," which is about a hundred times more than the ACM's membership. Vint Cerf has warned in Communications that even the population of "professional programmers" dwarfs the association's membership.7 ACM's share of the IT workforce has been in decline for a half-century, despite efforts begun back in the 1960s and 1970s by leaders such as Walter Carlson and Herb Grosch to broaden its appeal.

... ... ...

So why is the history of computer science not being written in the volume it deserves, or the manner favored by Knuth? I am, at heart, a social historian of science and technology and so my analysis of the situation is grounded in disciplinary and institutional factors. Books of this kind would demand years of expert research and sell a few hundred copies. They would thus be authored by those not expected to support themselves with royalties, primarily academics.

... ... ...

The history of science is a kind of history, which is in turn part of the humanities. Some historians of science are specialists within broad history departments, and others work in specialized programs devoted to science studies or to the history of science, technology, or medicine. In both settings, historians judge the work of prospective colleagues by the standards of history, not those of computer science. There are no faculty jobs earmarked for scholars with doctoral training in the history of computing, still less in the history of computer science. The persistently brutal state of the humanities job market means that search committees can shortlist candidates precisely fitting whatever obscure combination of geographical area, time period, and methodological approaches are desired. So a bright young scholar aspiring to a career teaching and researching the history of computer science would need to appear to a humanities search committee as an exceptionally well qualified historian of the variety being sought (perhaps a specialist in gender studies or the history of capitalism) who happens to work on topics related to computing.

... ... ...

Thus the kind of historical work Knuth would like to read would have to be written by computer scientists themselves. Some disciplines support careers spent teaching history to their students and writing history for their practitioners. Knuth himself holds up the history of mathematics as an example of what the history of computing should be. It is possible to earn a Ph.D. within some mathematics departments by writing a historical thesis (euphemistically referred to as an "expository" approach). Such departments have also been known to hire, tenure, and promote scholars whose research is primarily historical. Likewise medical schools, law schools, and a few business schools have hired and trained historians. A friend involved in a history of medicine program recently told me that its Ph.D. students are helped to shape their work and market themselves differently depending on whether they are seeking jobs in medical schools or in history programs. In other words, some medical schools and mathematics departments have created a demand for scholars working on the history of their disciplines and in response a supply of such scholars has arisen.

As Knuth himself noted toward the end of his talk, computer science does not offer such possibilities. As far as I am aware no computer science department in the U.S. has ever hired as a faculty member someone who wrote a Ph.D. on a historical topic within computer science, still less someone with a Ph.D. in history. I am also not aware of anyone in the U.S. having been tenured or promoted within a computer science department on the basis of work on the history of computer science. Campbell-Kelly, now retired, did both things (earning his Ph.D. in computer science under Randell's direction) but he worked in England where reputable computer science departments have been more open to "fuzzy" topics than their American counterparts. Neither are the review processes and presentation formats at prestigious computer conferences well suited for the presentation of historical work. Nobody can reasonably expect to build a career within computer science by researching its history.

In its early days the history of computing was studied primarily by those who had already made their careers and could afford to indulge pursuing historical interests from tenured positions or to dabble after retirement. Despite some worthy initiatives, such as the efforts of the ACM History Committee to encourage historical projects, the impulse to write technical history has not spread widely among younger generations of distinguished and secure computer scientists.

... ... ...

Contrary both to Knuth's despair and to Campbell-Kelly's story of a march of progress away from technical history, some scholars with formal training in history and philosophy have been turning to topics with more direct connections to computer science over the past few years. Liesbeth De Mol and Maarten Bullynck have been working to engage the history and philosophy of mathematics with issues raised by early computing practice and to bring computer scientists into more contact with historical work.3 Working with like-minded colleagues, they helped to establish a new Commission for the History and Philosophy of Computing within the International Union of the History and Philosophy of Science. Edgar Daylight has been interviewing famous computer scientists, Knuth included, and weaving their remarks into fragments of a broader history of computer science.8 Matti Tedre has been working on the historical shaping of computer science and its development as a discipline.22 The history of Algol was a major focus of the recent European Science Foundation project Software for Europe. Algol, as its developers themselves have observed, was important not only for pioneering new capabilities such as recursive functions and block structures, but as a project bringing together a number of brilliant research-minded systems programmers from different countries at a time when computer science had yet to coalesce as a discipline.c Pierre Mounier-Kuhn has looked deeply into the institutional history of computer science in France and its relationship to the development of the computer industry.16

Stephanie Dick, who recently earned her Ph.D. from Harvard, has been exploring the history of artificial intelligence with close attention to technical aspects such as the development and significance of the linked list data structure.d Rebecca Slayton, another Harvard Ph.D., has written about the engagement of prominent computer scientists with the debate on the feasibility of the "Star Wars" missile defense system; her thesis has been published as an MIT Press book.20 At Princeton, Ksenia Tatarchenko recently completed a dissertation on the USSR's flagship Akademgorodok Computer Center and its relationship to Western computer science.21 British researcher Mark Priestley has written a deep and careful exploration of the history of computer architecture and its relationship to ideas about computation and logic.18 I have worked with Priestly to explore the history of ENIAC, looking in great detail at the functioning and development of what we believe to be the first modern computer program ever executed.9 Our research engaged with some of the earliest historical work on computing, including Knuth's own examination of John von Neumann's first sketch of a modern computer program10 and Campbell-Kelly's technical papers on early programming techniques.5

[Nov 12, 2014] 2014 in video gaming

Nov 12, 2014 | Wikipedia,

The year 2014 will see release of numerous games, including new installments for some well-received franchises, such as Alone in the Dark, Assassin's Creed, Bayonetta, Borderlands, Call of Duty, Castlevania, Civilization, Dark Souls, Donkey Kong, Dragon Age, The Elder Scrolls, Elite, EverQuest, Far Cry, Final Fantasy, Forza Horizon, Infamous, Kinect Sports, Kirby, LittleBigPlanet, Mario Golf, Mario Kart, Metal Gear, MX vs. ATV, Ninja Gaiden, Persona, Pokémon, Professor Layton, Shantae, Sniper Elite, Sonic the Hedgehog, Strider Hiryu, Super Smash Bros., Tales, The Amazing Spider-Man, The Legend of Zelda, The Settlers, The Sims, Thief, Trials, Tropico, Wolfenstein and World of Warcraft.

[Nov 10, 2014] Why Google Glass Is Creepy

May 17, 2013 | Scientific American

The biggest concern seems to be distraction. Google Glass looks like a pair of glasses, minus the lenses; it's just a band across your forehead, with a tiny screen mounted at the upper-right side. By tapping the earpiece and using spoken commands, you direct it to do smartphone-ish tasks, such as fielding a calendar alert and finding a nearby sushi restaurant.

Just what we need, right? People reading texts and watching movies while they drive and attaining new heights of rudeness by scanning their e-mail during face-to-face conversation.

Those are misguided concerns. When I finally got to try Google Glass, I realized that they don't put anything in front of your eyes. You still make eye contact when you talk. You still see the road ahead. The screen is so tiny, it doesn't block your normal vision.

Hilarious parody videos show people undergoing all kinds of injury while peering at the world through a screen cluttered with alerts and ads. But that's not quite how it works. You glance up now and then, exactly as you would check your phone. But because you don't have to look down and dig around in your pocket, you could argue that there's less distraction. By being so hands-free, it should be incredibly handy.

Stormport May 17, 2013, 12:42 PM

Although the fallibility of the human monkey is much trumpeted (e.g. "To Err is Human", NAS study of out of control corporate iatrogenic death in America), there is one area of human activity where we score an almost 100% reliability: the 'justifiability' of the sport shooting of the mentally ill, Big Pharma crazed, suicidal, or just simply angry folks amongst us by our local and national 'law enforcement' (LE). Well, not all are simply shooting for sport, many are the result of overwhelming panic (e.g. 17 or 57 bullet holes in the human target) of individuals who shouldn't be allowed to own a sharp pencil much less carry a gun with a license to kill. I have not bothered to look for the statistics presuming them to be either not available or obfuscated in some way but rely on my local newspaper for my almost daily story of a local police shooting.

With that said, one can only say YES! to Google Glass and its most obvious use, replacing the patrol car dash cam. Every uniformed 'law enforcement' officer and 'security guard' must be so equipped and required to have the camera on and recording at any time on 'duty' and not doing duty in the can or on some other 'personal' time. Consider it simply as having one's supervisor as a 'partner'. Same rules would apply. No sweat.

[Sep 01, 2014] Ex-IBM CEO John Akers dies at 79

The last technically competent CEO, before Lou Gerstner with his financial machinations and excessive greed destroyed IBM as we used to know it.
25 Aug 2014 | The Register

Obituary Former IBM CEO John Akers has died in Boston aged 79.

Big Blue announced Akers' passing here, due to a stroke according to Bloomberg. After a stint as a US Navy pilot, the IBM obit states, Akers joined the company in 1960. His 33-year stint with IBM culminated in his appointment as its sixth CEO in 1985, following three years as president.

The top job became something of a poisoned chalice for Akers: the IBM PC project was green-lit during his tenure, and the industry spawned by this computer would cannibalize Big Blue's mainframe revenue, which was already under attack from minicomputers.

His career was founded on the success of the iconic System/360 and System/370 iron, but eventually fell victim to one of the first big disruptions the industry experienced.

He was eventually replaced by Lou Gerstner (as Bloomberg notes, the first CEO to be appointed from outside IBM).

To Gerstner fell the task of reversing the losses IBM was racking up – US$7.8 billion over two years – by embarking on a top-down restructure to shave US$7 billion in costs.

According to retired IBM executive Nicholas Donofrio, Akers took a strong interest in nursing the behind-schedule RS6000 Unix workstation project through to fruition in the late 1980s:

"he asked what additional resources I needed and agreed to meet with me monthly to ensure we made the new schedule".

[Apr 21, 2014] How Google Screwed Up Google Glass by Gene Marks

Apr 21, 2014

It really is a great idea.

A pair of glasses that can project information or perform actions on a virtual screen in front of you about pretty much anything and all you have to do is ask. Driving directions. LinkedIn connections. Order history. A photo. A video. A phone call. An email. The options seem limitless. And they are. Google Glass really is a great idea. The technology can and probably will change the world. So how did Google screw it up?

Yes, screw it up. Since first announcing the product in 2012, Google Glass has been subject to ridicule and even violence. It's become a symbol of the anti-tech, anti-Silicon Valley crowd. Surveys like this one demonstrate the American public's general dislike and distrust of Google Glass. The product has not yet spawned an industry. It has not generated revenues for Google. It's become a frequent joke on late night TV and a target for bloggers and comedians around the country. The word "glasshole" has now risen to the same prominence as "selfie" and "twerk." Yes, it's getting attention. But only as a creepy gimmick which, I'm sure, is not the kind of attention that Google intended when they initially introduced it. As cool as it is, let's admit that Google Glass will go down in the annals of bad product launches. And it will do so because of these reasons.

[Apr 10, 2014] Google Glass Going On Sale To Public For VERY Limited Time

Apr 10, 2014

For a limited time starting Tuesday, Google will make the wearable device available to more than just the select group of users such as apps developers in its Glass Explorer program.

In a blogpost, Google did not say how many pairs it would sell, just that the quantity would be limited.

"Every day we get requests from those of you who haven't found a way into the program yet, and we want your feedback too," the company said on a Thursday blogpost.

"That's why next Tuesday, April 15th, we'll be trying our latest and biggest Explorer Program expansion experiment to date. We'll be allowing anyone in the U.S. to become an Explorer by purchasing Glass."

Many tech pundits expect wearable devices to go mainstream this year, extending smartphone and tablet capabilities to gadgets worn on the body, from watches to headsets. Google has run campaigns in the past to drum up public involvement, including inviting people to tweet under the hashtag #ifihadglass for a chance to buy a pair of the glasses.

Google Glass has raised privacy concerns, prompting some legislators to propose bans on the gadget.

[Apr 08, 2014] Why won't you DIE IBM's S-360 and its legacy at 50

"... IBM's System 360 mainframe, celebrating its 50th anniversary on Monday, was more than just another computer. The S/360 changed IBM just as it changed computing and the technology industry. ..."
"... Big Blue introduced new concepts and de facto standards with us now: virtualisation - the toast of cloud computing on the PC and distributed x86 server that succeeded the mainframe - and the 8-bit byte over the 6-bit byte. ..."
Apr 08, 2014 | The Register

IBM's System 360 mainframe, celebrating its 50th anniversary on Monday, was more than just another computer. The S/360 changed IBM just as it changed computing and the technology industry.

The digital computers that were to become known as mainframes were already being sold by companies during the 1950s and 1960s - so the S/360 wasn't a first.

Where the S/360 was different was that it introduced a brand-new way of thinking about how computers could and should be built and used.

The S/360 made computing affordable and practical - relatively speaking. We're not talking the personal computer revolution of the 1980s, but it was a step.

The secret was a modern system: a new architecture and design that allowed the manufacturer - IBM - to churn out S/360s at relatively low cost. This had the more important effect of turning mainframes into a scalable and profitable business for IBM, thereby creating a mass market.

The S/360 democratized computing, taking it out of the hands of government and universities and putting its power in the hands of many ordinary businesses.

The birth of IBM's mainframe was made all the more remarkable given that making the machine required not just a new way of thinking but a new way of manufacturing. The S/360 produced a corporate and a mental restructuring of IBM, turning it into the computing giant we have today.

The S/360 also introduced new technologies, such as IBM's Solid Logic Technology (SLT) in 1964 that meant a faster and a much smaller machine than what was coming from the competition of the time.

Big Blue introduced new concepts and de facto standards with us now: virtualisation - the toast of cloud computing on the PC and distributed x86 server that succeeded the mainframe - and the 8-bit byte over the 6-bit byte.

The S/360 helped IBM see off a rising tide of competitors such that by the 1970s, rivals were dismissively known as "the BUNCH" or the dwarves. Success was a mixed blessing for IBM, which got in trouble with US regulators for being "too" successful and spent a decade fighting a government antitrust lawsuit over the mainframe business.

The legacy of the S/360 is with us today, outside of IBM and the technology sector.


S/360 I knew you well

The S/390 name is a hint to its lineage: S/360 -> S/370 -> S/390 (I'm not sure what happened to the S/380). Having made a huge jump with S/360, they tried to do the same thing in the 1970s with the Future Systems project. This turned out to be a huge flop: lots of money spent on creating new ideas that would leapfrog the competition, but it ultimately failed. Some of the ideas emerged in the System/38 and the original AS/400s, like having a queryable database for the file system rather than what we are used to now.

The link to NASA with the S/360 is explicit in JES2 (Job Entry Subsystem 2), the element of the OS that controls batch jobs and the like. Messages from JES2 start with the prefix HASP, which stands for Houston Automatic Spooling Priority.

As a side note, CICS is developed at Hursley Park in Hampshire. It wasn't started there though. CICS system messages start with DFH which allegedly stands for Denver Foot Hills. A hint to its physical origins, IBM swapped the development sites for CICS and PL/1 long ago.

I've not touched an IBM mainframe for nearly twenty years, and it worries me that I have this information still in my head. I need to lie down!

Ross Nixon

Re: S/360 I knew you well

I have great memories of being a Computer Operator on a 360/40. They were amazingly capable and interesting machines (and peripherals).


Re: S/360 I knew you well

ESA is the bit that you are missing - the whole extended address thing, data spaces,hyperspaces and cross-memory extensions.

Fantastic machines though - I learned everything I know about computing from Principles of Operation and the source code for VM/SP - they used to ship you all that, and send you the listings for everything else on microfiche. I almost feel sorry for the younger generations that they will never see a proper machine room with the ECL water-cooled monsters and attendant farms of DASD and tape drives. After the 9750's came along they sort of look like very groovy American fridge-freezers.

Mind you, I can get better mippage on my Thinkpad with Hercules than the 3090 I worked with back in the 80's, but I couldn't run a UK-wide distribution system, with thousands of concurrent users, on it.

Nice article, BTW, and an upvote for the post mentioning The Mythical Man Month; utterly and reliably true.

Happy birthday IBM Mainframe, and thanks for keeping me in gainful employment and beer for 30 years!

Anonymous Coward

Re: S/360 I knew you well

I started programming on the IBM 360/67 and have programmed several IBM mainframe computers. One of the reasons for the ability to handle large amounts of data is that these machines communicate with terminals in EBCDIC characters, an 8-bit character encoding that plays the same role as ASCII. It took very few of these characters to program the 3270 display terminals, while modern x86 computers use a graphical display and need a lot of data transmitted to paint a screen. I worked for a company that had an IBM 370/168 with VM running both OS and VMS.

We had over 1500 terminals connected to this mainframe across 4 states. IBM had envisioned VM/CMS for that interactive role; CICS was only supposed to be a temporary solution for handling display terminals, but it became the mainstay in many shops.

Our shop had over 50 3330 300 MB disk drives online with at least 15 tape units. These machines are in use today, in part, because the cost of converting to x86 is prohibitive.

On these old 370 CICS systems, the screens were defined separately from the program. JCL (Job Control Language) was used to initiate jobs, but unlike modern batch files, it would attach resources such as a disk drive or tape to the program. This is totally foreign to any modern OS.

Linux or Unix can come close but MS products are totally different.
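The EBCDIC point above is easy to demonstrate: the same text maps to entirely different byte values under EBCDIC and ASCII. A minimal sketch in Python, which ships EBCDIC code pages in its standard codec library (cp037 is the US EBCDIC code page):

```python
# EBCDIC vs ASCII: same characters, different byte values.
text = "HELLO 360"

ebcdic = text.encode("cp037")   # US EBCDIC code page
ascii_ = text.encode("ascii")

print(ebcdic.hex())  # c8c5d3d3d640f3f6f0 -- 'H' is 0xC8, space 0x40, '0' 0xF0
print(ascii_.hex())  # 48454c4c4f20333630 -- 'H' is 0x48, space 0x20, '0' 0x30

# Round-trip back to text
assert ebcdic.decode("cp037") == text
```

The 0x40 space and the 0xF0-0xF9 digits are instantly recognisable to anyone who has read a mainframe hex dump.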

Stephen Channell

Re: S/360 I knew you well

S/380 was the "future systems program" that was cut down to the S/38 mini.

HASP was the original "grid scheduler", running in Houston on a dedicated mainframe and scheduling work to the other 23 mainframes under the bridge. I nearly wet myself with laughter reading DataSynapse documentation and their "invention" of a job-control-language. 40 years ago HASP was doing Map/Reduce to process data faster than a tape-drive could handle.

If we don't learn the lessons of history, we are destined to IEFBR14!

Pete 2

Come and look at this!

As a senior IT bod said to me one time, when I was doing some work for a mobile phone outfit.

"it's an IBM engineer getting his hands dirty".

And so it was: a hardware guy, with his sleeves rolled up and blood and grime on his hands, replacing a failed board in an IBM mainframe.

The reason it was so noteworthy, even in the early 90's, was that it was such a rare occurrence. It was probably one of the major selling points of IBM computers that they didn't blow a gasket if you looked at them wrong (the other one, with just as much traction, was the ability to do a fork-lift upgrade in a weekend and know it would work).

The reliability and compatibility across ranges is why people choose this kit. It may be arcane, old-fashioned, expensive and untrendy - but it keeps on running.

The other major legacy of OS/360 was, of course, The Mythical Man Month, whose readership is still the most reliable way of telling the professional IT managers from the wannabes who only have buzzwords as a knowledge base.

Amorous Cowherder

Re: Come and look at this!

They were bloody good guys from IBM!

I started off working on mainframes around 1989, as a graveyard-shift "tape monkey" loading tapes for batch jobs. My first solo job was as a Unix admin on a set of RS/6000 boxes; I once blew out the firmware and a test box wouldn't boot.

I called out an IBM engineer after I completely "futzed" the box. He came out and spent about 2 hours with me, teaching me how to select and load the correct firmware. He then spent another 30 mins checking my production system with me and even left me his phone number so I could call him directly if I needed help when I did the production box.

I did the prod box with no issues because of the confidence I got and the time he spent with me. Cheers!

David Beck

Re: 16 bit byte?

The typo must have been fixed; the article says 6-bit now. The following is for those who have no idea what we are talking about.

Generally, machines prior to the S/360 were 6-bit if character oriented or 36-bit if word oriented. The S/360 was the first IBM architecture (thank you Drs Brooks, Blaauw and Amdahl) to provide both data types with appropriate instructions, to include a "full" character set (256 characters instead of 64), and to provide a concise decimal format (2 digits in one character position instead of 1). 8 bits was chosen as the "character" length.

It did mean a lot of Fortran code had to be reworked to deal with 32-bit single precision or 32-bit integers instead of the previous 36-bit.

If you think the old ways are gone, have a look at the data formats for the Unisys 2200.

John Hughes


Virtualisation

Came with the S/370, not the S/360, which didn't even have virtual memory.

Steve Todd

Re: Virtualisation

The 360/168 had it, but it was a rare beast.

Mike 140

Re: Virtualisation

Nope. CP/67 was the forerunner of IBM's VM. Ran on S/360

David Beck

Re: Virtualisation

S/360 Model 67 running CP67 (CMS which became VM) or the Michigan Terminal System. The Model 67 was a Model 65 with a DAT box to support paging/segmentation but CP67 only ever supported paging (I think, it's been a few years).

Steve Todd

Re: Virtualisation

The 360/168 had a proper MMU and thus supported virtual memory. I interviewed at Bradford university, where they had a 360/168 with which they were doing all sorts of things that IBM hadn't contemplated (like using conventional glass teletypes hooked to minicomputers so they could emulate the page-based - and more expensive - IBM terminals).

I didn't get to use an IBM mainframe in anger until the 3090/600 was available (where DEC told the company that they'd need a 96 VAX cluster and IBM said that one 3090/600J would do the same task). At the time we were using VM/TSO and SQL/DS, and were hitting 16MB memory size limits.

Peter Gathercole

Re: Virtualisation @Steve Todd

I'm not sure that the 360/168 was a real model. The Wikipedia article does not think so either.

As far as I recall, the only /168 model was the 370/168, one of which was at Newcastle University in the UK, serving other Universities in the north-east of the UK, including Durham (where I was) and Edinburgh.

They also still had a 360/65, and one of the exercises we had to do was write some JCL in OS/360. The 370 ran MTS rather than an IBM OS.

Grumpy Guts

Re: Virtualisation

You're right. The 360/67 was the first with VM - I had the privilege of trying it out a few times. It was a bit slow though. The first version of CP/67 only supported 2 terminals, as I recall... The VM capability was impressive. You could treat files as though they were in real memory - no explicit I/O necessary.

Chris Miller

Maintenance

This was a big factor in the profitability of mainframes. There was no such thing as an 'industry-standard' interface - either physical or logical. If you needed to replace a memory module or disk drive, you had no option* but to buy a new one from IBM and pay one of their engineers to install it (and your system would probably be 'down' for as long as this operation took). So nearly everyone took out a maintenance contract, which could easily run to an annual 10-20% of the list price. Purchase prices could be heavily discounted (depending on how desperate your salesperson was) - maintenance charges almost never were.

* There actually were a few IBM 'plug-compatible' manufacturers - Amdahl and Fujitsu. But even then you couldn't mix and match components - you could only buy a complete system from Amdahl, and then pay their maintenance charges. And since IBM had total control over the interface specs and could change them at will in new models, PCMs were generally playing catch-up.

David Beck

Re: Maintenance

So true re the service costs, but "Field Engineering" was a profit centre, and a big one at that. It's not true that you had to buy "complete" systems for compatibility, though. In the 70's I had a room full of CDC disks on a Model 40, bought because they were cheaper and had a faster linear-motor positioner (the thing that moved the heads), while the real 2311's used hydraulic positioners. Bad day when there was a puddle of oil under the 2311.

John Smith

@Chris Miller

"This was a big factor in the profitability of mainframes. There was no such thing as an 'industry-standard' interface - either physical or logical. If you needed to replace a memory module or disk drive, you had no option* but to buy a new one from IBM and pay one of their engineers to install it (and your system would probably be 'down' for as long as this operation took). So nearly everyone took out a maintenance contract, which could easily run to an annual 10-20% of the list price. Purchase prices could be heavily discounted (depending on how desperate your salesperson was) - maintenance charges almost never were."


Back in the day one of the Scheduler software suppliers made a shed load of money (the SW was $250k a pop) by making new jobs start a lot faster and letting shops put back their memory upgrades by a year or two.

Mainframe memory was expensive.

Now owned by CA (along with many things mainframe) and so probably gone to s**t.

tom dial

Re: Maintenance

Done with some frequency. In the DoD agency where I worked we had mostly Memorex disks as I remember it, along with various non-IBM as well as IBM tape drives, and later got an STK tape library. Occasionally there were reports of problems where the different manufacturers' CEs would try to shift blame before getting down to the fix.

I particularly remember rooting around in a Syncsort core dump that ran to a couple of cubic feet from a problem eventually tracked down to firmware in a Memorex controller. This highlighted the enormous I/O capacity of these systems, something that seems to have been overlooked in the article. The dump showed mainly long sequences of chained channel programs that allowed the mainframe to transfer huge amounts of data by executing a single instruction to the channel processors, and perform other possibly useful work while awaiting completion of the asynchronous I/O.

Mike Pellatt

Re: Maintenance

@ChrisMiller - The IBM I/O channel was so well-specified that it was pretty much a standard. Look at what the Systems Concepts guys did - a DEC-10 I/O and memory bus to IBM channel converter. Had one of those in the Imperial HENP group so we could use IBM 6250 bpi drives, as DEC were late to market with them. And the DEC 1600 bpi drives were horribly unreliable.

The IBM drives were awesome. It was always amusing explaining to IBM techs why they couldn't run online diags. On the rare occasions when they needed fixing.

David Beck

Re: Maintenance

It all comes flooding back.

A long CCW chain, some of which are the equivalent of NOP in channel talk (where did I put that green card?), with a TIC (Transfer In Channel, think branch) at the bottom of the chain back to the top. The idea was to take an interrupt (PCI) on some CCW in the chain and get back in time to convert the NOPs to real CCWs, continuing the chain without ending it. Certainly the way the page pool was handled in CP67.

And I too remember the dumps coming on trollies. There was software to analyse a dump tape but that name is now long gone (as was the origin of most of the problems in the dumps). Those were the days I could not just add and subtract in hex but multiply as well.
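The looping chain described above can be sketched as data. This is a hedged illustration, not real channel code: the 8-byte CCW layout (command, 24-bit address, flags, count) and the NOP/TIC command codes follow the textbook S/360 format, and CHAIN_BASE is a made-up address.

```python
import struct

# Flag bits in byte 4 of an S/360-format CCW (illustrative sketch)
CC  = 0x40  # command chaining: continue to the next CCW
PCI = 0x08  # program-controlled interruption: interrupt mid-chain

CMD_NOP = 0x03  # control no-op
CMD_TIC = 0x08  # transfer in channel (a branch within the chain)

def ccw(cmd, addr, flags=0, count=1):
    """Pack one 8-byte CCW: cmd(1) addr(3) flags(1) zero(1) count(2)."""
    return struct.pack(">B", cmd) + addr.to_bytes(3, "big") + \
           struct.pack(">BBH", flags, 0, count)

CHAIN_BASE = 0x1000  # hypothetical storage address of the chain

# NOP slots the software rewrites into real CCWs, a PCI to regain
# control mid-chain, and a TIC at the bottom branching back to the top.
chain = (
    ccw(CMD_NOP, 0, flags=CC) +
    ccw(CMD_NOP, 0, flags=CC | PCI) +
    ccw(CMD_TIC, CHAIN_BASE)          # branch back to the first CCW
)

assert len(chain) == 24               # three 8-byte CCWs
assert chain[16] == CMD_TIC
```

The channel keeps executing as long as the software stays ahead of it rewriting the NOPs, which is the trick the comment describes for the CP67 page pool.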

Peter Simpson

The Mythical Man-Month

Fred Brooks' seminal work on the management of large software projects was written after he managed the design of OS/360. If you can get past the mentions of secretaries, typed meeting notes and keypunches, it's required reading for anyone who manages a software project. Come to think of it... *any* engineering project. I've recommended it to several people and been thanked for it.

// Real Computers have switches and lights...


The Mythical Man-Month

The key concepts of this book are as relevant today as they were back in the 60s and 70s - it is still oft quoted ("there are no silver bullets" being one I've heard recently). Unfortunately fewer and fewer people have heard of this book these days and even fewer have read it, even in project management circles.


Was IBM ever cheaper?

I've been in IT since the 1970s.

My understanding from the guys who were old timers when I started was the big thing with the 360 was the standardized Op Codes that would remain the same from model to model, with enhancements, but never would an Op Code be withdrawn.

The beauty of IBM S/360 and S/370 was that you had model independence. The promise was made, and the promise was kept, that after rewriting your programs in BAL (the 360's Basic Assembler Language) you'd never have to re-code your assembler programs ever again.

Also the relocating loader and method of link editing meant you didn't have to re-assemble programs to run them on a different computer. Either they would simply run as is, or they would run after being re-linked. (When I started, linking might take 5 minutes, where re-assembling might take 4 hours, for one program. I seem to recall talk of assemblies taking all day in the 1960s.)

I wasn't there in the 1950s and 60s, but I don't recall anyone ever boasting about how 360s or 370s were cheaper than competitors.

IBM products were always the most expensive, easily the most expensive, at least in Canada.

But maybe in the UK it was like that. After all, the UK had its own native computer manufacturers that IBM had to squeeze out, despite patriotism still being a thing in business at the time.


Cut my programming teeth on S/390 TSO architecture

We were developing CAD/CAM programs in this environment starting in the early eighties, because it's what was available then, based on use of this system for stock control in a large electronics manufacturing environment. We fairly soon moved this Fortran code onto smaller machines, DEC/VAX minicomputers and early Apollo workstations. We even had an early IBM-PC in the development lab, but this was more a curiosity than something we could do much real work on initially. The Unix based Apollo and early Sun workstations were much closer to later PCs once these acquired similar amounts of memory, X-Windows like GUIs and more respectable graphics and storage capabilities, and multi-user operating systems.

Gordon 10

Ahh S/360 I knew thee well

Cut my programming teeth on OS/390 assembler (TPF) at Galileo - one of Amadeus' competitors.

I interviewed for Amadeus's initial project for moving off of S/390 in 1999 and it had been planned for at least a year or 2 before that - now that was a long term project!

David Beck

Re: Ahh S/360 I knew thee well

There are people who worked on Galileo still alive? And ACP/TPF still lives, as zTPF? I remember a headhunter chasing me in the early 80's for a job in Oz: Qantas looking for ACP/TPF coders, $80k US, very tempting.

You can do everything in 2k segments of BAL.

Anonymous IV

No mention of microcode?

Unless I missed it, there was no reference to microcode, which was specific to each individual model of the S/360 and S/370 ranges, at least, and provided the 'common interface' for IBM Assembler op-codes. It is the rough equivalent of PC firmware. It was documented in thick A3 black folders held in two-layer trolleys (most of which held circuit diagrams and other engineering amusements), and was interesting to read (if not understand). There you could see that IBM Assembler op-codes each translated into tens or hundreds of microcode machine instructions. Even 0700, NO-OP, got expanded into surprisingly many machine instructions.

John Smith 19

Re: No mention of microcode?

"I first met microcode by writing a routine to do addition for my company's s/370. Oddly, they wouldn't let me try it out on the production system :-)"

I did not know the microcode store was writeable.

Microcode was a core (no pun intended) feature of the S/360/370/390/4300/z architecture.

It allowed IBM to trade actual hardware (e.g. a full-spec hardware multiplier) for partial (part-word or single-word) or completely software-based (microcode loop) implementations, depending on the machine's spec (and the customer's pocket), without needing a recompile, as at the assembler level it would be the same instruction.

I'd guess hacking the microcode would call for exceptional bravery on a production machine.
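The hardware-for-microcode trade described above can be shown with a toy sketch: a low-end model without a full multiplier could back the same multiply instruction with a shift-and-add loop in microcode. This is purely illustrative Python, not actual S/360 microcode:

```python
# Toy illustration of implementing multiplication with only the
# adder and shifter a cheap model would have in hardware.
def multiply_shift_add(a, b, width=32):
    """Multiply two unsigned ints using only adds and shifts."""
    result = 0
    for _ in range(width):
        if b & 1:          # low bit of multiplier set: add partial product
            result += a
        a <<= 1            # shift multiplicand left
        b >>= 1            # shift multiplier right
    return result & ((1 << (2 * width)) - 1)   # keep a double-width result

assert multiply_shift_add(1234, 5678) == 1234 * 5678
```

Up to 32 such steps per multiply is exactly why the cheap models were slower while remaining instruction-compatible with the big ones.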

Arnaut the less

Re: No mention of microcode? - floppy disk

Someone will doubtless correct me, but as I understood it the floppy was invented as a way of loading the microcode into the mainframe CPU.

tom dial

The rule of thumb in use (from Brooks's Mythical Man Month, as I remember) is around 5 debugged lines of code per programmer per day, pretty much irrespective of the language. And although the end code might have been a million lines, some of it probably needed to be written several times: another memorable Brooks item about large programming projects is "plan to throw one away, because you will."
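Brooks's figure makes the scale of OS/360 concrete. A back-of-envelope calculation (the 250 workdays per year is my assumption for a working year):

```python
# Brooks's rule of thumb: ~5 debugged lines per programmer per day,
# applied to a million-line system.
lines = 1_000_000
lines_per_day = 5
workdays_per_year = 250   # assumed rough figure

programmer_days = lines / lines_per_day          # 200,000 days
programmer_years = programmer_days / workdays_per_year
print(programmer_years)   # 800.0 -- hundreds of programmer-years
```

Which is roughly consistent with the famously large OS/360 team and budget.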

Tom Welsh

Programming systems product

The main reason for what appears, at first sight, low productivity is spelled out in "The Mythical Man-Month". Brooks freely concedes that anyone who has just learned to program would expect to be many times more productive than his huge crew of seasoned professionals. Then he explains, with the aid of a diagram divided into four quadrants.

Top left, we have the simple program. When a program gets big and complex enough, it becomes a programming system, which takes a team to write it rather than a single individual. And that introduces many extra time-consuming aspects and much overhead.

Going the other way, writing a simple program is far easier than creating a product with software at its core. Something that will be sold as a commercial product must be tested seven ways from Sunday, made as maintainable and extensible as possible, be supplemented with manuals, training courses, and technical support services, etc.

Finally, put the two together and you get the programming systems product, which can be 100 times more expensive and time-consuming to create than an equivalent simple program.

Tom Welsh

"Why won't you DIE?"

I suppose that witty, but utterly inappropriate, heading was added by an editor; Gavin knows better.

If anyone is in doubt, the answer would be the same as for other elderly technology such as houses, roads, clothing, cars, aeroplanes, radio, TV, etc. Namely, it works - and after 50 years of widespread practical use, it has been refined so that it now works *bloody well*. In extreme contrast to many more recent examples of computing innovation, I may add.

Whoever added that ill-advised attempt at humour should be forced to write out 1,000 times:

"The definition of a legacy system: ONE THAT WORKS".

Grumpy Guts

Re: Pay Per Line Of Code

I worked for IBM UK in the 60s and wrote a lot of code for many different customers. There was never a charge. It was all part of the built in customer support. I even rewrote part of the OS for one system (not s/360 - IBM 1710 I think) for Rolls Royce aero engines to allow all the user code for monitoring engine test cells to fit in memory.


Sole Source For Hardware?

Even before the advent of Plug Compatible Machines brought competition for the Central Processing Units, the S/360 peripheral hardware market was open to third parties. IBM published the technical specifications for the bus and tag channel interfaces allowing, indeed, encouraging vendors to produce plug and play devices for the architecture, even in competition with IBM's own. My first S/360 in 1972 had Marshall not IBM disks and a Calcomp drum plotter for which IBM offered no counterpart. This was true of the IBM Personal Computer as well. This type of openness dramatically expands the marketability of a new platform architecture.


Eventually we stripped scrapped 360s for components.

"IBM built its own circuits for S/360, Solid Logic Technology (SLT) - a set of transistors and diodes mounted on a circuit twenty-eight-thousandths of a square inch and protected by a film of glass just sixty-millionths of an inch thick. The SLT was 10 times more dense the technology of its day."

When these machines were eventually scrapped we used the components from them for electronic projects. Their unusual construction was a pain; much of the 'componentry' couldn't be used because of the construction. (That was further compounded by IBM actually partially smashing modules before they were released as scrap.)

"p3 [Photo caption] The S/360 Model 91 at NASA's Goddard Space Flight Center, with 2,097,152 bytes of main memory, was announced in 1968"

Around that time our 360 had only 44kB of memory; it was later expanded to 77kB in about 1969. Why those odd values were chosen is still somewhat of a mystery to me.

David Beck

Re: Eventually we stripped scrapped 360s for components.

@RobHib - The odd memory size was probably the memory available to the user, not the hardware size (which came in power-of-2 multiples). The size the OS took was a function of what devices were attached and a few other sysgen parameters. Whatever was left after the OS was user space. There was usually a 2k boundary, since memory-protect keys worked on 2k chunks, but not always; some customers ran naked to squeeze out those extra bytes.

Glen Turner 666

Primacy of software

Good article.

Could have had a little more about the primacy of software: IBM had a huge range of compilers, and having an assembly language common across a wide range was a huge winner (as obvious as that seems today in an age of a handful of processor instruction sets). Furthermore, IBM had a strong focus on binary compatibility, and the lack of that in some competitors' ranges made shipping software for those machines much more expensive than for IBM.

IBM also sustained that commitment to development, which meant that until the minicomputer age they were really the only possibility if you wanted newer features (such as CICS for screen-based transaction processing, VSAM or DB2 for databases, or VMs for a cheaper test-versus-production environment). Other manufacturers would develop against their forthcoming models, not their shipped models, and so IBM would be the company "shipping now" with the feature you desired.

IBM were also very focused on business. They knew how to market (eg, the myth of 'idle' versus 'ready' light on tape drives, whitepapers to explain technology to managers). They knew how to charge (eg, essentially a lease, which matched company's revenue). They knew how to do politics (eg, lobbying the Australian PM after they lost a government sale). They knew how to do support (with their customer engineers basically being a little bit of IBM embedded at the customer). Their strategic planning is still world class.

I would be cautious about lauding the $0.5B taken to develop the OS/360 software as progress. As a counterpoint, consider Burroughs, who delivered better capability with fewer lines of code, since they wrote in Algol rather than assembler. Both companies got one thing right: huge libraries of code which made life much easier for applications programmers.

DEC's VMS learnt that lesson well. It wasn't until MS-DOS that we were suddenly dropped back into an inferior programming environment (but you'll cope with a lot for sheer responsiveness, and it didn't take too long until you could buy in what you needed).

What killed the mainframe was its sheer optimisation for batch and transaction processing and the massive cost if you used it any other way. Consider that TCP/IP used about 3% of the system's resources, or $30k pa of mainframe time. That would pay for a new Unix machine every year to host your website on.

Continued at Computer History Bulletin, 2010-2019


Copyright © 1996-2018 by Softpanorama Society. The site was initially created as a service to the (now defunct) UN Sustainable Development Networking Programme (SDNP) in the author's free time and without any remuneration. This document is an industrial compilation designed and created exclusively for educational use and is distributed under the Softpanorama Content License. Original materials copyright belongs to respective owners. Quotes are made for educational purposes only in compliance with the fair use doctrine.


Last modified: January 01, 2020