Dec 23, 2013 | Computerworld
These 14 individuals left lasting impressions on the industry -- some by creating innovative technologies, and others by building game-changing companies. And all left their marks by challenging the status quo.
Of mouse and man
Fortunes were made from Doug Engelbart's ideas; none by him. One of computing's greatest visionaries, he invented the computer mouse and significantly contributed to the development of hypertext, word processing, graphical user interfaces, networking and real-time collaboration, including videoconferencing. He displayed early forms of all of them in one bombshell demo in 1968.
But he didn't commercialize his ideas; that would be left to others, including members of his lab at Stanford Research Institute, many of whom went to Xerox's Palo Alto Research Center when he lost funding.
But then, Engelbart didn't want to get rich; he wanted to enrich human life. And that he did. He was 88.
October 29, 2013 | The Register
Obit William (Bill) C. Lowe, the IBM manager who broke through Big Blue's corporate structure to build its first personal computer (and inadvertently made Microsoft the industry's powerhouse), has died at the age of 72 after a heart attack.
Lowe joined IBM in 1962 and swiftly rose through the ranks to become lab director at the company's Boca Raton base in Florida. But in 1979 he was given what was, at the time, a seemingly impossible task: building a working personal computer in a year.
Big Blue was the computing company in the 1950s and 60s, but it dealt in big-iron systems. In the 1970s companies such as Altair, Apple and others showed there was a booming market for small computers and IBM felt it had to get in the game, and quickly.
But that was a problem. IBM's corporate culture didn't do things fast: decisions were carefully scrutinized, and design teams worked for years to develop their own hardware, making it as good as they could internally. Building a PC in a year would be impossible unless the company was willing to buy in hardware and software from third parties.
Moving pieces in Project Chess
Lowe convinced the IBM board that this particular strategy was the only way to go, and thus set up Project Chess: a team of a dozen engineers who would design and build the first IBM PC. Getting off-the-shelf components to power the system wasn't too big an issue, but getting the software to run it was, and Lowe and his crew went to two companies to get it: Digital Research and Microsoft.
At the time Gary Kildall's firm Digital Research was the biggest operating-system vendor in the nascent PC market and its CP/M software was popular and flexible. Microsoft was then one of the biggest suppliers of BASIC interpreters and other software, so IBM sent out a purchasing squad to get the code it needed.
The team met Gates first; it helped that his mother was on the board of the non-profit United Way of America, as was John Opel, chairman of IBM, so she put in a good word about her son. But before discussing the project, IBM asked Gates and his team to sign one of its legendary non-disclosure agreements (NDAs), which gave Big Blue full access to Microsoft's business and barred the small software house from discussing anything about the meetings.
NDA stands for Not Doing Anything
Bill had no problem signing the NDA and discussed the situation with Lowe and others before confirming Microsoft could supply their programming-language needs, although he recommended they speak to Kildall to get CP/M as an operating system.
When the IBM suits arrived at Kildall's house, he was away, indulging in his passion for flying, or so Silicon Valley history has it. His wife, also a director at Digital Research, answered the door instead, took one look at the NDA and called the firm's lawyer. That day's meeting between Big Blue and Digital, which Kildall turned up to after his flight, failed to produce a deal.
Bill Lowe in 2007 ... Credit: Marcin Wichary
This left IBM in something of a quandary, so they went back to Gates and asked if he had any ideas. Seeing an opportunity, Gates said Microsoft could supply IBM with the operating systems it needed. The only problem was Microsoft didn't have a working operating system, so it went to Seattle Computer Products, which had just written one called QDOS (Quick and Dirty Operating System), bought it for $50,000 and renamed it MS-DOS (but IBM branded it PC DOS).
Lowe moved over to the position of general manager of IBM's Rochester plant in 1980, so it was his successor Don Estridge who launched the IBM Personal Computer 5150 in 1981. It soon became clear that Big Blue had a hit on its hands, with the computer selling six times the forecast figure in its first year. But Lowe and his team had made two crucial mistakes.
Where Big Blue blew it
Firstly, because it used third-party components, IBM didn't have control over the design. The only part of the computer IBM had copyright control of was the BIOS ROM chip, and before long Compaq had figured a way to reverse-engineer it so that it could sell IBM-compatible systems for less than Big Blue was charging.
Secondly, IBM made the mistake of letting Microsoft sell its operating system to other manufacturers rather than reserving it exclusively for IBM. Gates cheerfully sold his OS to the PC cloners and it became the de facto standard for the industry.
Lowe came back to the PC business in 1985 after Estridge's death in an air crash. In an effort to regain control of the market, in 1987 IBM introduced the PS/2 computer, which rewrote the architecture for its PCs, and a new operating system called OS/2. Unfortunately for IBM, buyers weren't sold on swapping their existing PCs for the relatively pricey PS/2 architecture, and OS/2 was being written in partnership with Microsoft, which understandably wasn't putting too much effort into coding a rival product.
Neither the new computer nor the operating system took off and IBM was increasingly relegated to also-ran status among PC purchasers. Lowe was partially blamed for the situation, despite his enormous achievement in getting IBM moving, and left the company in 1988 in search of pastures new at Xerox.
He is survived by his wife Cristina, four children, and 10 grandchildren. ®
July 3, 2013 | NYTimes.com
Douglas C. Engelbart, 1925-2013
Douglas C. Engelbart was 25, just engaged to be married and thinking about his future when he had an epiphany in 1950 that would change the world.
Clips of Douglas C. Engelbart's 1968 demonstration of a networked computing system, which included a mouse, text editing, video conferencing, hypertext and windowing.
He had a good job working at a government aerospace laboratory in California, but he wanted to do something more with his life, something of value that might last, even outlive him. Then it came to him. In a single stroke he had what might be safely called a complete vision of the information age.
The epiphany spoke to him of technology's potential to expand human intelligence, and from it he spun out a career that indeed had lasting impact. It led to a host of inventions that became the basis for the Internet and the modern personal computer.
In later years, one of those inventions was given a warmhearted name, evoking a small, furry creature given to scurrying across flat surfaces: the computer mouse.
Dr. Engelbart died on Tuesday at 88 at his home in Atherton, Calif. His wife, Karen O'Leary Engelbart, said the cause was kidney failure.
Computing was in its infancy when Dr. Engelbart entered the field. Computers were ungainly room-size calculating machines that could be used by only one person at a time. Someone would feed them information in stacks of punched cards and then wait hours for a printout of answers. Interactive computing was a thing of the future, or in science fiction. But it was germinating in Dr. Engelbart's restless mind.
In his epiphany, he saw himself sitting in front of a large computer screen full of different symbols - an image most likely derived from his work on radar consoles while in the Navy after World War II. The screen, he thought, would serve as a display for a workstation that would organize all the information and communications for a given project.
It was his great insight that progress in science and engineering could be greatly accelerated if researchers, working in small groups, shared computing power. He called the approach "bootstrapping" and believed it would raise what he called their "collective I.Q."
A decade later, during the Vietnam War, he established an experimental research group at Stanford Research Institute (later renamed SRI and then SRI International). The unit, the Augmentation Research Center, known as ARC, had the financial backing of the Air Force, NASA and the Advanced Research Projects Agency, an arm of the Defense Department. Even so, in the main, computing industry professionals regarded Dr. Engelbart as a quixotic outsider.
In December 1968, however, he set the computing world on fire with a remarkable demonstration before more than a thousand of the world's leading computer scientists at the Fall Joint Computer Conference in San Francisco, one of a series of national conferences in the computer field that had been held since the early 1950s. Dr. Engelbart was developing a raft of revolutionary interactive computer technologies and chose the conference as the proper moment to unveil them.
For the event, he sat on stage in front of a mouse, a keyboard and other controls and projected the computer display onto a 22-foot-high video screen behind him. In little more than an hour, he showed how a networked, interactive computing system would allow information to be shared rapidly among collaborating scientists. He demonstrated how a mouse, which he had invented four years earlier, could be used to control a computer. He demonstrated text editing, video conferencing, hypertext and windowing.
In contrast to the mainframes then in use, a computerized system Dr. Engelbart created, called the oNLine System, or NLS, allowed researchers to share information seamlessly and to create and retrieve documents in the form of a structured electronic library.
The conference attendees were awe-struck. In one presentation, Dr. Engelbart demonstrated the power and the potential of the computer in the information age. The technology would eventually be refined at Xerox's Palo Alto Research Center and at the Stanford Artificial Intelligence Laboratory. Apple and Microsoft would transform it for commercial use in the 1980s and change the course of modern life.
Judy New Zealand
We stand on the shoulders of giants, as Isaac Newton once said. R.I.P. Dr Engelbart. I never knew you but have admired you for years since discovering your story and your famous demonstration while doing research for an IT article. Ideas and achievements will always rate higher with me than the wealth other people turn them into.
I never knew "The Mother of All Demos" https://www.youtube.com/watch?v=yJDv-zdhzMY was available on YouTube until reading a comment on this article, and am delighted to find that as well as being brilliant and creative you were also good-looking. What a wonderful young man you must have been. My sympathy to your children, and hopes that creativity reigns amongst your grandchildren in whatever form it may take. You definitely were one of Isaac Newton's giants, with millions standing on your shoulders.
Vic Kley Berkeley, CA
I knew Doug and liked him for his wisdom and integrity. He went well beyond pointing devices like the mouse or specific cursor shapes like his "bug".
This article makes a good effort to shift the emphasis from the mouse to Engelbart himself and his realized visions.
He is one of the few visionaries who saw the vision he shared with Vannevar Bush, Turing and others actually come to pass in his time.
Pointing devices and cursors, as Doug would certainly tell you if he could, were in use before WWII. Trackballs and joysticks, even digital tablets, all came before the mouse. The mouse was adopted quite simply because it was cheap and profitable. That is high praise for Doug, for such properties are the marks of a successful technology.
Doug, you will be missed.
Tom Foth Trumbull
Back in the 1980's I was writing for Softalk, a PC magazine, and wrote a cover article on these "new" things called "mice." Through this and later some mutual friends, I got a chance to meet and spend time with Dr. Engelbart.
He was always frustrated that the other part of his human interface, the chorded keyboard ( http://en.wikipedia.org/wiki/Chorded_keyboard ), never received the notoriety and use that the mouse did, because it took practice to use.
I was in a lecture he was giving and he showed himself on snow skis at the top of a mountain peak. He said "Anything worth doing, anything that brings value, be it driving a car or skiing down a mountain, is worth doing well." His point was we need to master a tool or a technology to fully exploit it... and not be left to a tool or technology mastering us and our becoming slaves to it.
So much of today's human interface to computers and the ways computers organize information is based on his passion and vision.
I remember having supper with him once around 1985... and I was so humbled to be with him. His intellect and passion were off the chart. Even though I was hardly his intellectual equal, he engaged me in conversation and treated me as his equal. He was as humble as he was brilliant.
We lost an incredible thinker, innovator, and visionary in his passing. My condolences to his family and friends.
I learned about his great work, which he demonstrated in 1968 and which became known as the mother of all demos.
Engelbart foresaw what we are living with our computers today; everything he explained in his presentation is solid reality now. How can one man's mind imagine things with such accuracy and preciseness? I watched his demonstrations on the internet, and I can tell you one thing: if I had been in the audience, I most likely wouldn't have understood what he was creating. But today, in 2013, when I listen to Engelbart I am shocked and awestruck by what he is talking about. He had already created today's computer world in his mind and crystallized it.
Douglas Engelbart is the mind behind today's computer world. What he envisioned became truth and more; the entire world is running on it.
His conceptual ideas were worth a Nobel Prize.
R.I.P. Mr. Engelbart; one man can only do so much to change the world.
It is not only the mouse: watch the 1968 presentation.
N West Coast
Thank you, Dr. Engelbart.
I fell in love with the Mac over the PC 20 years ago, while in college, because of the mouse and its meteor-like trailing cursor.
This exemplifies his and Mr. Jobs' genuine vision: that computers are designed to serve us. They shouldn't make us all secretaries, doing mundane data entry with 10 fingers.
Instead, with the mouse, we all become conductors of our own symphony full of GUIs. It's wonderful that Mr. Jobs thought that "a single button was appropriate," like holding onto a conductor's baton.
To this day I have yet to let go of the mouse. I refuse to move to touch pads for the same reasons I dislike the keyboard. Eventually, I will also refuse to use voice-activated commands. We cannot think and move our tongue at the same time, if we want to devote more time to "look, listen, and feel."
In medicine we had someone similar to Dr. Engelbart: Dr. Netter. They represent a human dimension that is irreplaceable.
We should hope to honor their legacy by providing a nurturing culture in society for the next 100 years, even if only another Engelbart or Netter should come about.
DO NOT FOLD SPINDLE OR MUTILATE...
I was privileged to watch the entire history of modern computing when I enrolled at SUNY Binghamton's School of Advanced Technology in 1980.
My wife and I started programming with punch cards. UGH!!! Card decks of hundreds of cards, each card had to be typed exactly, then put thru a card reader and executed. Mistakes had to be corrected by retyping the card, then put thru the reader again (waiting on line for the reader to be free)...
Finally SAT got a time share system. We could work remotely thru a SLOW modem. One program I wrote (in APL) took 8 hours to compile and execute. Better than cards, but not by much.
On my last day at SAT I went to say goodbye to one professor and on his desk was one of the first (arguably with a serial number under 10) IBM PCs. 80x25 char screen, screen colors were green or amber, no graphics, no mouse, clunky but compelling. Then, in 1982, we left for Silicon Valley.
The history lesson continued as the PC boom started and gained momentum. Companies that relied on mainframes began to use PCs, at least for individual workers. The last part of those lessons was the Lisa, Apple's first foray into what we now recognize as modern computing. This was all of Doug Engelbart's ideas in commercialized form. A beautiful machine, with color, graphics, all the bells and whistles we take for granted now.
He was a true Silicon Valley genius who did in fact make everyone's life better than it had been... RIP
calhouri Costa Rica
"The group disbanded in the 1970s, and SRI sold the NLS system in 1977 to a company called Tymshare. Dr. Engelbart worked there in relative obscurity for more than a decade until his contributions became more widely recognized by the computer industry."
As impressive as were the achievements recounted in this obit, I find the above quoted section the most touching part of the memorial. A man who grew up on a farm in Oregon and despite a life of accomplishment most of us could only dream of somehow managed to retain a modesty and self-confidence that precluded any necessity of blowing his own horn.
Would that the modern tech-stars mentioned in the piece (and others who need no naming) shared these endearing traits.
I had the privilege to know Doug well, during my first 7 years as Logitech's CEO. He and I met regularly when Doug had his office at our Fremont facility. Our conversations were fascinating and incredibly instructive for me.
As executives, we tend to focus on the short/medium term. Long term for us may mean 3-5 years. Doug's vision of bootstrapping Collective IQ, his passion in life, was a 50 year+ vision. His belief that technology is key to mankind's ability to solve difficult problems collectively has the transformative power that few others share. In every conversation with him, the wide ranging strength of his vision energized and amazed me. It made me a better person, a better thinker and a better executive.
In spite of all the awards, the Turing Award, the National Medal of Technology, Doug was one of the most under-recognized geniuses of our times. And he stayed humble, curious and accessible to ideas and people all his life.
He left many lessons for us. As a company, we will be well served by understanding how we can dramatically enhance our effectiveness by working together, tapping into each other's IQ and skills to achieve much more than the sum of our individual parts. It's Doug's legacy, and we will strive to honor it.
Long live Doug, not just in our memories, but in our embodying his vision as a group of individuals
Guerrino De Luca - chairman Logitech
kat New England
Here's the commercially available in 1981 Xerox Star (first link), via a 1982 paper (second link), complete with photo of the graphical interface/display, mouse, what you see is what you get editor:
DK Adams Castine, ME
Mr. Engelbart deserves each and every one of these accolades. The author of this article, however, could use a bit more education in early computing. Note the comment: "Computers were ungainly room-size calculating machines that could be used by only one person at a time." By 1960, eight years after delivery of the first UNIVAC I, machines were already capable of full multiprogramming, multiuser, timesharing and networking, able to support hundreds of users. UNIVAC introduced its 400 series of commercial real-time computers, developed out of fire-control computers for the US Navy. Beginning in 1962, I worked with these machines and the UNIVAC 1100 series, developing some of the first industrial real-time uses; they were also used in real-time process control (Westinghouse Prodac); Western Union switched its telegrams through them; and Eastern Airlines used its 490 systems for the first efficient reservation system. All of these were running in the early '60s, and all were developed in Minnesota and Pennsylvania, not Silicon Valley. Not everything under the sun happens under the California sun.
Drew Levitt, Berlin
I was in the audience when Engelbart received an honorary PhD from Yale a couple years ago. Martin Scorsese was another of the honorees, and was initially getting all the attention. But when the speaker read Engelbart's citation and the magnitude of his contributions sank in, the audience gave him a very long standing ovation.
Concerned American USA
Doug Englebart's contributions were outstanding. This was a great life to celebrate, indeed.
I wonder if any of it would have happened so quickly without government research funding? Invented in 1964 and product-ized in 1980. This would not show the modern corporate requisite quarterly or even annual return and would have run the bulk of a patent's duration.
Game changing, high-risk research is not consonant with the modern corporate model. It may never be with any real corporate model. This is OK, but govt high-risk research programs have been cut for years - is it no wonder much modern innovation is modest?
Even Xerox PARC, having the parent Xerox - with a great hold on the photo-copy business in those days, had vestiges of govt sponsored research.
Miguel United States
Not everything has stagnated. Thanks to government funding, great advances have been made in surveillance technology.
SJFine Wilton, CT
The story I always heard was that Jobs and Woz saw the potential of the mouse when the Xerox PARC team in charge of it showed them all the hard work they were doing with it. As I understood it, the PARC team did not expect Apple to essentially (if legally) steal it out from under them when they showed them the mouse. Well, that's the story that is told by many people and was shown in the 1999 film "Pirates of Silicon Valley" (a really fun film with great performances by Noah Wyle as Steve Jobs and Anthony Michael Hall as Bill Gates).
However, I just discovered that version of the story may be at least partially apocryphal as well. This New Yorker article : http://www.newyorker.com/reporting/2011/05/16/110516fa_fact_gladwell entitled "Creation Myth" has the tag line, "The mouse was conceived by the computer scientist Douglas Engelbart, developed by Xerox PARC, and made marketable by Apple." It goes on to describe a less commonly known version of the mouse creation story.
I have to admit that until this morning, I had no idea who Douglas C. Engelbart was. (I would bet that a majority of "techie" people didn't know of him either, but that I'm among the small minority who will admit it.) I look forward to learning more about this brilliant man. It's interesting to observe the crafting of history that is taking place with the deaths of Jobs and now Engelbart. How many other "historical facts" do few or none of us actually know?
bonongo Ukiah, CA
I first saw a clip of Doug Engelbart's remarkable 1968 demonstration of the computer mouse in a 1992 documentary, "The Machine that Changed the World," broadcast on PBS. The multi-part series included a fascinating history of the computer, including pioneers like Charles Babbage, Alan Turing -- and Doug Engelbart.
The documentary noted that, for all the revolutionary innovations he was responsible for, even by the early 1990s Engelbart had largely been forgotten in the forward rush of the computer revolution. At one point it showed him walking in anonymity on the Stanford campus.
And yet as I write this, I use the computer mouse to move the cursor around on my screen -- much as Dr. Engelbart did nearly a half-century ago. I've never forgotten the recording of that 1968 demonstration, and how advanced his equipment looks, even today, compared to the giant mainframes that most people associated with computers in those days. So thanks, Dr. Engelbart -- your creation has now outlived its creator.
Fred Wilf Philadelphia
>> "Mr. Bates said the name was a logical extension of the term then used for the cursor on a screen: CAT. Mr. Bates did not remember what CAT stood for, but it seemed to all that the cursor was chasing their tailed desktop device.)"
Back in the day, all of the screens were CAThode ray tubes. By the time I started using them in the 1970s, we called them "CRTs" or "tubes" or "screens" or "terminals" depending on the particular technology (some were pretty dumb terminals limited to 80 characters across by 25 lines down, while others could do relatively sophisticated graphics), but I could see them being called "CATs" as well.
RIP, Dr. Engelbart. Your work was groundbreaking. You should be as well-known as others who followed the path that you set.
LongView San Francisco Bay Area
Most unfortunate that his vision did not acknowledge: (a) the consolidation of money and power that computer technology has enabled; (b) the massive loss of "hand and eye" based employment, usurped by digital mechanization, driving down wages and employment prospects coincident with the decimation of the "middle class"; (c) the addiction of the "masses" to point-and-click rather than read-and-reflect; and (d) pretty much a wholesale usurpation of the centuries-validated method of learning and acquiring knowledge: reading, reflection and contemplation, and writing. The future in plain sight.
Ichiro Furusato New Zealand
Actually, you are quite wrong about Doug. I was fortunate enough to be invited into his Bootstrap Institute in the late 1990s where I learned much about the history of computing and Doug's ideas.
Doug talked about how Robert Oppenheimer came to later regret being the "father" of the atomic bomb and that people who develop and work with computers bear a similar responsibility to society, i.e., that technologies are not neutral and that technologists must consider how their inventions may be used both to augment as well as potentially damage human society. Doug was simply in favour of focusing on augmentation, but fully acknowledged the hazards. It's billionaires like Jobs and Zuckerberg who are happy to use others' inventions to make themselves extremely wealthy, but take no responsibility to society. Witness Apple's legacy on worker conditions and disposable products, and Facebook's on privacy.
It seems especially poignant that Doug's death follows closely after the world has learned that the surveillance state has come fully into being, more advanced and pervasive than any dystopian novelist could ever have imagined, and that computer-assisted drones are being used by governments to indiscriminately kill innocent people in faraway lands.
If you're interested in learning more about Doug rather than just spouting off in angry ignorance I might recommend the book "Bootstrapping: Douglas Engelbart, Coevolution, and the Origins of Personal Computing" by Thierry Bardini.
LongView San Francisco Bay Area
Robert Oppenheimer, being a world-class physicist, fully understood, despite the hand-wringing witnessed by President Truman, the magnitude of human death and infrastructure destruction that would result, long before the atomic bombs were detonated over Hiroshima and Nagasaki. To a mathematical mind of Oppenheimer's caliber, the calculations were literally simple, "back of the envelope."
The disconnect of Engelbart, and many of his generation and kind, is that he and they did not have the foresight to understand that a socio-economy based on electronic digital computation would pauperize the human endeavor.
Fortunately, several hundred years out, eight or ten human generations, their lack of vision will be but a dim footnote in the world as it will be: significant effects from human-caused climate change, effective depletion of the petroleum resource, and a world made by hand due to the biogeophysical constraints that control all biological species. Indeed, the future in plain sight. Earth abides.
Mei Lin Fung Palo Alto
Doug Engelbart liked to describe himself as a simple farm boy who dared to dream big.
He actually won a Reader's Digest prize ($5?) for an aphorism, paraphrasing: "My talent is my willingness to tolerate a high level of embarrassment to get to my dreams."
It took a lot of tolerance for embarrassment to imagine hypertext, windows, remote video conferencing and more in 1968, when people were using punched cards to communicate with their computers.
He was part of an Authors at Google video in 2007 - you can watch it here : http://www.youtube.com/watch?v=xQx-tuW9A4Q
Engelbart dedicated his life from the age of 25 to helping people understand that technology is like fire: it can burn us up, or be harnessed for the highest good, for augmenting our humanity. Engelbart inspired generations to work towards this goal of augmenting our humanity.
He was one of our greatest global thinkers who partnered with Vint Cerf, and the other Arpanet pioneers who were funded by ARPA Program Manager Bob Kahn. These deeply thoughtful people have been instrumental in making the Internet something that everyone could benefit from.
His thesis was that human systems and tool systems have to co-evolve; IBM produced the Co-Evolution Symposium at which he was the keynote speaker.
Friends of Engelbart are gathering in Palo Alto today to remember him.
kat New England
For those who claim the pc as we know it would have "died" without Jobs, here's the commercially available in 1981 Xerox Star (first link), via a 1982 paper (second link), complete with photo of the graphical interface/display, mouse, what you see is what you get editor:
GUIdebook Articles: "Designing the Star User Interface"
martinn palo alto
Shreekar, Parts of Xerox were already commercializing the ideas from the Computer Science Lab. They would certainly not have "died" without Apple. Jobs was a ripoff artist who never gave credit to the people whose ideas he took - Engelbart's, Taylor's, Lampson's, Thacker's.
DeVaughn Silicon Valley
Met Doug Engelbart at SRI, during my first real job right out of college. In SRI's Information Sciences and Engineering Department, he was crafting the mouse. A young PR guy, I could never really explain what it was or did, but I assumed it was important because Doug was working on it and he was, well, important. A good guy who was never celebrated in the Valley with the flourish he deserved.
Dennis Johns Island, SC
Wasn't much celebrated at SRI for a long time either. The DARPA contract ended and -- typical of SRI at the time -- since there was no funding, his lab was closed and Xerox PARC made one of its better decisions and hired this gentle, quite terrific man.
nemecl Big Bear, CA
This was a great breakthrough.
I worked at NYU during the early seventies, and I took computer science to get my master's in 1982. As students we were supposed to punch cards - I never managed to do it right. Fortunately, the computer powers-that-be (the computer was built from discrete transistors) allowed very limited teletype access for the chemistry lab where I worked.
A mouse? It was 1971, stupid.
Amazing how much can change during a lifetime of one person (I am nine years younger than Dr. Engelbart).
RIP, Dr. Engelbart.
Swannie Honolulu, HI
Ah, yes...punch cards...one wrong key-stroke and you start all over again from the beginning. Bah-Humbug!
Thomas Zaslavsky Binghamton, N.Y.
PunchED cards make great note papers and good bookmarks. A use for all those typos.
kat New England
Re: "Eventually the technology Dr. Engelbart demonstrated would be refined at Xerox's Palo Alto Research Center and at the Stanford Artificial Intelligence Laboratory and then be transformed for commercial use by Apple and Microsoft in the 1980s."
I think you're forgetting the Xerox Star, released with a mouse in 1981, well ahead of Apple's and Microsoft's systems.
Robert Vancouver, Canada
One of my technical manuscripts was composed on a Xerox Star prototype in March 1980 in Webster, NY, sent by 'ether-net' to Stamford, Conn., for approval, and published in SCIENCE five months later.
Two years later most of my colleagues lost their secretarial services and typed their own manuscripts.
A. M. Garrett Lafayette, La.
I almost wish the headline read, "Douglas C. Engelbart, a graduate of public colleges and a government employee, dies at 88"
Sad to see how we are dismantling the great public university system that fostered so many American dreams and gave the world such genius. I wonder how many kids who come from a farm today can afford college. I wonder how many of the government workers we are furloughing to cut an already shrinking deficit are working on just this kind of ground-breaking research.
Mei Lin Fung Palo Alto
Doug was always proud of his alma mater, Oregon State University. He often spoke about the importance of the Land Grant System, which gave a boy who grew up in the Depression, with no more than one pair of shoes to walk to school in, the chance of an education. He earned his PhD at the University of California, Berkeley. The ground-breaking research was made possible by the great investment in education made by earlier generations. What are we doing for the generations after us?
Thomas Zaslavsky Binghamton, N.Y.
It doesn't matter. Our masters are making out very well and they're making sure things won't change in that respect. The research can go to the Moon for all they care.
Anonymoose & Squirrel Pasadena
That's a very interesting and perceptive comment from Lafayette, where most things are organic and growing.
I spent a couple of years in New Orleans recently, and summers in Lake Charles when I was an early teen in the 1970s. It seems so much hotter now; that could be the changes in land use, what with fewer trees and swamps. Two summers ago the drought was really bad. I stopped several times in Lafayette, and it was impossible not to notice how people took things in stride, in a way so much more tolerable than the people in Los Angeles.
The cuts to your university system and health care were the things that made me leave (besides the heat). A distant relative of mine had been a state supreme court justice under Huey Long, but I never did reconnect with that end of the family. That and the money from the spill has never really been allocated to people or to remediation while there were plenty of corporations getting in on the act. I think Gov. Long had a hand in this, plus the basic insular nature of the state.
I think all Americans should visit Louisiana, not just New Orleans (and not all at once); the folks there are fun, generous, and intelligent. I never once felt threatened in Louisiana, whereas Los Angeles can be a dangerous place.
KenJr New Mexico
At Fairchild, Jean Hoerni's planar process, mentioned in the next-to-last paragraph, gave birth in that same timeframe to the integrated circuit, first demonstrated at Fairchild in 1960 by Robert Noyce and Jay Last. To say the planar process "improved the electrical output of transistors and made them cheaper to manufacture and available to a mass market" gives short shrift in the extreme to both Hoerni's planar process and Engelbart's "Scaling in Microelectronics" ideas.
Mei Lin Fung Palo Alto
Doug would be the first to say that he was one of many - his aspiration was tapping the collective wisdom, and he often spoke about the importance of generating collective wisdom. Prof. Tom Malone, founder of the Center for Collective Intelligence at MIT, met Dr. Engelbart in California. Tom Malone and Prof. Hiroshi Ishii came to celebrate Engelbart's legacy at the Program for the Future, held by the Tech Museum in 2008 to mark the 40th anniversary of the Mother of All Demos. An interview conducted with Engelbart was downloaded 50,000 times in a week - people in 2003/4 were amazed that they had never heard of him, and asked if this was an April Fools' Day joke. Engelbart was one of a kind. He believed each of us has untapped potential, and he devoted his life to making it possible for others to have the opportunities he had to build a better future.
June 24, 2013 | Slashdot
Posted by samzenpus
from the please-forget-about-that-other-stuff dept.
An anonymous reader writes "The National Security Agency has declassified an eye-opening pre-history of computers used for code-breaking between the 1930s and 1960s. The 344-page report, entitled It Wasn't All Magic: The Early Struggle to Automate Cryptanalysis (pdf), is available on the Government Attic web site. Government Attic has also just posted a somewhat less declassified NSA compendium from 1993: A Collection of Writings on Traffic Analysis. (pdf)"
Re:Pay no attention (Score:5, Insightful)
Pay no attention to the man in the Russian airport.
No, they want you to pay attention to him, to this, to ANYTHING except for what they (the US government and the NSA in particular) are actually doing with regard to your personal liberties. That is what they are trying to distract you from thinking about.
First pwned! (Score:5, Funny)
Am I crazy for opening a PDF from the NSA?
Re:First pwned! (Score:5, Informative)
Not if you did it in a VM running a LiveCD...
More Secret History (Score:2, Informative)
How about Bush's blackmail scheme where he used the NSA to try to obtain material to blackmail UN ambassadors into voting for invading Iraq. Most of the media treated that like it was secret...
PDFS (Score:5, Funny)
Hey you guys who are talking about Snowden, download this PDF with some cool additional code! Don't worry about it. I promise we didn't buy exploits from Adobe or Microsoft!
Re:PDFS (Score:4, Interesting)
Hey you guys who are talking about Snowden, download this PDF with some cool additional code! Don't worry about it. I promise we didn't buy exploits from Adobe or Microsoft!
Why buy what you can get for free?
If you don't use up the budget you don't get more next year. Especially if you're working at an agency that can't be measured for efficiency in any way.
The Puzzle Palace (Score:1)
There's a relatively old book about the NSA and SIGINT written by a journalist who studied publicly available materials using Tom Clancy's MO, that you can buy at Barnes and Noble or Amazon.com. I remember reading it and thinking it was more like "what it's like to work at the NSA" than an expose, though. Still, IIRC the author and publisher had to square off with the NSA to get it in print.
Re:Broken Link (Score:4, Funny)
Got it for you. It is called stuxnet-prehistory.pdf.exe
Re: The site got suspended...
Google webcache has this [googleusercontent.com]
This month marks the 60th anniversary of computer-disk memory. Don't feel bad if you missed the big celebration-there wasn't one. Computer memory is the forgotten story of the electronics revolution. Yet it may be the most remarkable of all.
In September 1952, IBM opened a facility in San Jose, Calif.-a critical moment in the story of Silicon Valley. The company set to work developing a new kind of magnetic memory for its planned Model 305 Ramac (Random Access Method of Accounting and Control), the world's first "supercomputer."
Like the generations of mainframe computers before it, Ramac was slated to feature ...
14 June 2012
It is fitting that the greatest code-breaker of World War Two remains a riddle a hundred years after his birth. Alan Turing, the brilliant, maverick mathematician, widely considered to be the father of computer science and artificial intelligence, invented an electromechanical machine called the 'bombe' which formed the basis for deciphering Germany's Enigma codes.
The man himself has rather eluded definition: painted (too easily) as a nutty professor with a squeaky voice; as a quirky, haphazard character with a sloppy appearance by his mother and schoolmasters; by colleagues as a gruff, socially awkward man; and by his friends as an open-hearted, generous and gentle soul.
The crucial contribution Turing made at Bletchley Park, one that has been credited with shortening the war by two years and saving countless lives, did not become public knowledge until twenty years after his death. His mother, brother and friends did not learn the extent of his heroism until long after they had mourned him.
Despite his premature death aged 41, Turing was so prolific and ground-breaking that the Science Museum is dedicating an entire exhibition to what sprang from his mind. It will showcase his machines, his mathematics and his work on codes and morphogenesis, but will also tell the extraordinary story of his life.
"We're calling the exhibition Code-breaker because of Bletchley, but also because Turing broke the codes of science in his work and the codes of society through his homosexuality," says David Rooney, head curator at the Science Museum.
The State which Turing had fought to protect cruelly turned on him in 1952. He was found guilty of gross indecency for homosexual acts, avoiding prison by agreeing to a now unthinkable condition of probation: chemical castration. He took Stilboestrol, a pill containing female hormones, but was removed from his government work and felt himself to have been placed under observation. As the holder of State secrets who was, by 1950s attitudes, a sexual deviant, he was a dangerous outcast.
He was found dead on 7 June 1954, a few weeks before his 42nd birthday, after biting into an apple laced with cyanide. This 'Snow White' suicide is particularly resonant given his enjoyment of the 1937 Disney film of the fairy-tale. In Andrew Hodges' biography Alan Turing: The Enigma, he records Turing's fondness for singing the words from the scene where the witch drops an apple into a sulphurous cauldron: "Dip the apple in the brew/ Let the Sleeping Death seep through".
Fifty-eight years after his suicide, Turing is beginning to get the recognition he deserves. Nearly 35,000 people have signed a petition calling for his criminal record to be posthumously overturned. Another petition (so far 15,000+ names strong) calls for his face to be printed on the £10 note.
Insights into the man behind the machines:
Turing's nephew, Dermot Turing, 51, the son of his brother John, never met him. He was born after his uncle's death, so his impressions are drawn from his father's and step-sisters' stories.
Because my father was quite closely involved in picking up the pieces after Alan had committed suicide, we didn't talk about him at home much.
Frankly, they weren't particularly close as adults. With the business of the prosecution [for homosexuality], which was only a couple of years prior to the suicide, it is hardly surprising that my father found the whole thing pretty distressing. He felt that he'd been left to 'deal' with their mother.
I began to hear about him in the mid Seventies when information about Bletchley became publicly known. During that period there was a lot of talk about it, obviously. I remember being glued to a BBC report revealing the weirder things that took place during the war, of which Enigma was just one. Also, because his mother died in 1976, my father was suddenly able to talk about him.
Alan had written some "scarifying" things about his mother in notebooks for Doctor Greenbaum [a Jungian psychotherapist]. My father felt it was appropriate to conceal the toxic material about granny, so he destroyed all the famous notebooks.
I suspect people might be upset to think Alan hated his mother. It's more complicated than that, of course. I'm not in a position to judge, but my half sisters, who are, would hotly deny that Alan hated her.
That begs the question: why did he write those terrible things? I'm speculating, and anybody else's judgement on this is as good as mine, but I think if you put together the fact that in 1950s England, when social attitudes are very, very different from what they are now, having to explain to his mother (who was essentially an Edwardian lady) what that conviction for homosexuality meant, must have been the toughest thing he'd ever had to do.
I don't think it's possible to comprehend the enormous pressure he was under. It has taken me quite by surprise that a vociferous group of people still think that it's not imaginable that he could have committed suicide.
These people don't feel it was in his nature to do it and believe in evidence that points away from it. The fact that he'd bought himself two pairs of socks the day before, or something. Frankly, I suspect Alan was a victim of mood swings and we probably won't know what it was that tipped him over the edge at that last moment.
That my father, whose initial reaction was that Alan couldn't possibly have committed suicide, found himself persuaded that he must be wrong on that score, is I think the most powerful evidence that he did.
To lots of people this remains an open question. The fact that he can excite such interest about the manner of his death nearly 60 years on is extraordinary. But this year should be about celebrating his achievements rather than reopening questions about his death.
Regarding his 1952 conviction [for homosexuality], I am still, putting it at its mildest, puzzled as to how the court concluded that it had power to push him to do that [take chemical castration]. There is an open question on that.
He was sentenced under the Criminal Justice Act 1948 which introduced the option of probation as an alternative to prison. This was a very new piece of legislation in 1952. How the judge concluded that you could attach conditions to probation so early in the life of this new sentencing power I don't know.
There's a double-sided view of Alan Turing. Talk to the people who worked with him and were his junior assistants and you get this very positive picture of somebody who took time and was approachable. You also get this same sense from people who knew Alan as children: my half sisters, the Greenbaum children, Professor Newman's sons.
If you talk to people who dealt with Alan either as superiors or in a non-technical social setting, and read the really quite acid writings by Alan about what was going on at Cambridge, you realise there was another facet of him: uncompromising, socially a bit awkward. He didn't go out of his way to charm people if it wasn't interesting enough for him to do so.
I can see traits of my father in that, too. Captain Jerry Roberts [a Bletchley Park veteran] said if you passed Alan in the corridor he would turn his eyes to the wall rather than say hello. He obviously wasn't that easy to deal with.
I'm probably not allowed to say things like that. I'm not trying to de-sanctify him but I think there's a tendency to paint him as completely ridiculous. You've got all these stories about weird things that he got up to. My granny's book [Alan M. Turing by Sara Turing] is full of them. Other people assume he's a mad mathematics professor character.
The people who knew him personally will tell you Alan was a bit chaotic - quite the opposite of a person who is good at process. I suspect he'd often get bored and not finish projects. Having written the design spec for a universal computer, he wasn't particularly interested in its day-to-day application.
Mike Woodger, 89, was Alan Turing's first assistant at the National Physical Laboratory. They worked together on the Ace Pilot Computer.
I was 23 in 1946 when I first met Turing at the NPL. At that point Turing had nobody else to work for him. He was rather motherly towards me.
My initial impression of Turing was of a rather shy and retiring man. We first spoke because I got into difficulty over a puzzle I was trying to solve. Turing looked over my shoulder and said: "Why don't you try singularity?" I'd done a degree in mathematics and should have known what he meant, but didn't. He patiently explained it to me.
You know about his personal life, of course. But I didn't know that he was homosexual until after his death. I went to his home a few times and we got on very well.
He was respected at NPL, but I would not say he was revered as he is now. Not many people knew what he had done during the war. He had a reputation for being rather gruff. He didn't suffer fools gladly.
I went down with glandular fever almost as soon as I arrived at NPL and went off sick for six weeks. I returned in September and there was a charming note from Turing:
Dear Woodger, [He would never have called me Mike]
Unfortunately Wilkinson and I have both arranged to go on leave just at the moment you are coming back. I trust you can keep yourself occupied while we are gone. You could do:
2. Try and help out in any measure doing Ace jobs
3. Read the folder
4. Read some good books
I hope you really are alright. It's a shame to have you come back and find the place deserted. It might be wise to have a relapse for a week.
He was a bit of a fingers and thumbs man. The ideas were brilliant but the execution suffered somewhat from his physical disabilities.
Turing didn't need to be meticulous. He was creative. He was always looking ahead.
He left NPL in 1947 but he returned for the launch of the first Pilot Ace in 1950. He told us how much better we had made it than if he had stayed.
John Turing, Alan's brother, wrote an account of him before he died. It is included as an Afterword in the recently republished Alan M. Turing: Centenary Edition by Sara Turing. Here's an extract:
One Easter holiday in Dinard, Alan spent all his time collecting seaweed and brewing it up in the cellar until at length he extracted a few drops of iodine which he carried back to the science master at Sherborne [the public school both brothers attended] in high triumph.
When later we were living in Guildford, he had a series of crazes. He tried to learn the violin, which was excruciating. Then he turned his attention to breeding those little red banana flies in test tubes, so that he could prove Mendel's theory at first hand. Unfortunately, they escaped and the house was full of banana flies for several days.
Oddest of all, in the heat of summer, he spent much of his time dressed as a private soldier allegedly drilling at Knightsbridge barracks, to what purpose nobody knew, but looking back on it now, I strongly suspect that drilling was not the object of the exercise at all. He was, as I have said, good at beating the system and, of course, the odder the things he did, the less one was likely to enquire into them.
My mother gives a true picture of Alan's generosity. Our family friend Hazel achieved her life's ambition of becoming a missionary with Alan's help. Alan gave his time and brains unstintingly to his friends, paid for the schooling of a boy whom he more or less adopted, spent hours choosing suitable presents for his relations and friends, without regard to expense, and was incredibly patient with and endearing to small children, with whom he would have interesting conversations about the nature of God and other daunting subjects.
Alan could not stand social chat or what he was pleased to call "vapid conversation". What he really liked was a thoroughly disputatious exchange of views. It was pretty tiring, really. You could take a safe bet that if you ventured on some self-evident proposition, as, for example, that the earth was round, Alan would produce a great deal of incontrovertible evidence to prove that it was almost certainly flat, ovular or much the same shape as a Siamese cat which had been boiled for fifteen minutes at a temperature of one thousand degrees Centigrade.
Code-breaker: Alan Turing's Life and Legacy at the Science Museum, from 21 June 2012 to June 2013, www.sciencemuseum.org.uk
Jun 06, 2012 | Los Angeles Times
Author of more than 27 novels and story collections-most famously "The Martian Chronicles," "Fahrenheit 451," "Dandelion Wine" and "Something Wicked This Way Comes"-and more than 600 short stories, Bradbury has frequently been credited with elevating the often-maligned reputation of science fiction.
Some say he singlehandedly helped to move the genre into the realm of literature.
PHOTOS: Ray Bradbury | 1920 - 2012
"The only figure comparable to mention would be [Robert A.] Heinlein and then later [Arthur C.] Clarke," said Gregory Benford, a UC Irvine physics professor who is also a Nebula award-winning science fiction writer. "But Bradbury, in the '40s and '50s, became the name brand."
Much of Bradbury's accessibility and ultimate popularity had to do with his gift as a stylist-his ability to write lyrically and evocatively of lands an imagination away, worlds he anchored in the here and now with a sense of visual clarity and small-town familiarity.
The late Sam Moskowitz, the preeminent historian of science fiction, once offered this assessment:
"In style, few match him. And the uniqueness of a story of Mars or Venus told in the contrasting literary rhythms of Hemingway and Thomas Wolfe is enough to fascinate any critic."
His stories were multi-layered and ambitious. Bradbury was far less concerned with mechanics-how many tanks of fuel it took to get to Mars and with what rocket-than what happened once the crew landed there, or what they would impose on their environment. "He had this flair for getting to really major issues," said Paul Alkon, emeritus professor of English and American literature at USC.
"He wasn't interested in current doctrines of political correctness or particular forms of society. Not what was wrong in '58 or 2001 but the kinds of issues that are with us every year."
Whether describing a fledgling Earthling colony bullying its way on Mars (" -- And the Moon Be Still as Bright" in 1948) or a virtual-reality baby-sitting tool turned macabre monster ("The Veldt" in 1950), Bradbury wanted his readers to consider the consequences of their actions: "I'm not a futurist. People ask me to predict the future, when all I want to do is prevent it."
Ray Douglas Bradbury was born Aug. 22, 1920, in Waukegan, Ill., to Leonard Spaulding Bradbury and the former Esther Marie Moberg. As a child he soaked up the ambience of small-town life - wraparound porches, fireflies and the soft, golden light of late afternoon - that would later become a hallmark of much of his fiction.
In 1945, "The Big Black and White Game," published in the American Mercury, opened the doors to other mainstream publications including Saturday Evening Post, Vogue and Colliers. "A young assistant [at Mademoiselle] found one of my stories in the 'slush pile.' It was about a family of vampires [and] called 'The Homecoming.' " Bradbury told the Christian Science Monitor in 1991. "He gave it to the story editor and said, 'You must publish this!' " That young assistant was Truman Capote, whose own "Homecoming" brought him renown.
Bradbury married Marguerite McClure in 1947, the same year he published his first collection of short stories - "Dark Carnival" (Arkham House) - a series of vignettes that revisited his childhood hauntings.
His first big break came in 1950, when Doubleday collected some new and previously published Martian stories in a volume titled "The Martian Chronicles." A progression of pieces that were at once adventures and allegories taking on such freighted issues as censorship, racism and technology, the book established him as an author of particular insight and note. And a rave review from novelist Christopher Isherwood in Tomorrow magazine helped Bradbury step over the threshold from genre writer to mainstream visionary.
"The Martian Chronicles" incorporated themes that Bradbury would continue to revisit for the rest of his life. "Lost love. Love interrupted by the vicissitudes of time and space. Human condition in the large perspective and definition of what is human," said Benford. "He saw ... the problems that the new technologies presented - from robots to the super-intelligent house to the time machine -- that called into question our comfy definitions of human."
Bradbury's follow-up bestseller, 1953's "Fahrenheit 451," was based on two earlier short stories and written in the basement of the UCLA library, where he fed the typewriter 10 cents every half-hour. "You'd type like hell," he often recalled. "I spent $9.80 and in nine days I had 'Fahrenheit 451.' "
Books like "Fahrenheit 451," in which interactive TV spans three walls, and "The Illustrated Man" - the 1951 collection in which "The Veldt" appeared - not only became bestsellers and ultimately films but cautionary tales that became part of the American vernacular.
"The whole problem in 'Fahrenheit' centers around the debate whether technology will destroy us," said George Slusser, curator emeritus of the J. Lloyd Eaton Collection of Science Fiction, Fantasy, Horror and Utopia at UC Riverside. "But there will always be a spirit that keeps things alive. In the case of 'Fahrenheit,' even though this totalitarian government is destroying the books, the people have memorized them. There are people who love the written word. That is true in most of his stories. He has deep faith in human culture."
But as he garnered respect in the mainstream, he lost some standing among science fiction purists. In these circles, Bradbury was often criticized for being "anti-science." Instead of celebrating scientific breakthroughs, he was reserved, even cautious.
Bradbury had very strong opinions about what the future had become. In the drive to make their lives smart and efficient, humans, he feared, had lost touch with their souls. "We've got to dumb America up again," he said.
Bradbury is survived by his daughters Susan Nixon, Ramona Ostergren, Bettina Karapetian and Alexandra Bradbury; and eight grandchildren. His wife, Marguerite, died in 2003.
Before Prince of Persia was a best-selling video game franchise and a Jerry Bruckheimer movie, it was an Apple II computer game created and programmed by one person, Jordan Mechner.
Now available as a paperback and ebook, Mechner's candid journals from the time capture his journey from his parents' basement to the forefront of the fast-growing 1980s video game industry and the creative, technical and personal struggles that brought the prince into being and ultimately into the homes of millions of people worldwide.
Feb 28, 2012 | Triangle Business Journal
Back in the 1960s, the System/360 mainframe computer was a technological wonder. And frankly, it stayed that way for years, even as newer systems were developed. Now comes word that NASA has shut down its last functioning IBM mainframe.
"This month marks the end of an era in NASA computing. Marshall Space Flight Center powered down NASA's last mainframe, the IBM Z9 Mainframe," wrote NASA CIO Linda Cureton in a blog post.
Dec 28, 2011 | Computerworld
It's been a rough year for the IT industry. The death of Apple co-founder Steve Jobs in October grabbed international headlines. But we also lost other major figures from almost every area of technology, including Xerox PARC founder Jacob E. Goldman, who died in late December. Here's one last look at some of the people who made a big difference.
Dennis M. Ritchie
Godfather of Unix, Father of C
September 1941 - October 2011
Arguably the most influential programmer of the past 50 years, Dennis Ritchie helped create the Unix operating system and designed the C programming language.
A Knack for Encryption
July 1932 - June 2011
Among the Bell Labs researchers who worked on Unix with Thompson and Ritchie was Bob Morris, who developed Unix's password system, math library, text-processing applications and crypt function.
Intelligence, Artificial and Otherwise
September 1927 - October 2011
He may be best known as the creator of the Lisp programming language and as the "father of artificial intelligence" (he coined the term in 1956), but John McCarthy's influence in IT reached far beyond would-be thinking machines. For example, in 1957 McCarthy started the first project to implement time-sharing on a computer, and that initiative sparked more elaborate time-sharing projects including Multics, which in turn led to the development of Unix.
The Digital Man
February 1926 - February 2011
As an engineer working at MIT's Lincoln Laboratory in the 1950s, Ken Olsen noticed that students lined up to use an outdated computer called the TX-0, even though a much faster mainframe was available. The difference? The mainframe ran batch jobs, while the TX-0 (which Olsen had helped develop as a grad student) allowed online interactivity.
April 1926 - March 2011
Working to make electronic communications bulletproof at the height of the Cold War, Paul Baran developed what would eventually become a core technology of the Internet: packet switching. Baran was a researcher at the Rand Corp. think tank in 1961 when he suggested that messages could be broken into pieces, sent to a destination by multiple routes if necessary and then reassembled upon arrival to guarantee delivery.
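The idea Baran described can be illustrated with a toy sketch (this is purely illustrative, not Baran's actual Rand design): a message is broken into numbered pieces, the pieces may arrive out of order after traveling different routes, and the receiver reassembles them by sequence number.

```python
import random

def packetize(message: str, size: int) -> list[tuple[int, str]]:
    """Split a message into (sequence_number, chunk) packets."""
    return [(i, message[i:i + size]) for i in range(0, len(message), size)]

def reassemble(packets: list[tuple[int, str]]) -> str:
    """Rebuild the original message regardless of arrival order."""
    return "".join(chunk for _, chunk in sorted(packets))

msg = "packets may take different routes"
packets = packetize(msg, 8)
random.shuffle(packets)   # simulate out-of-order arrival over multiple routes
assert reassemble(packets) == msg
```

Real packet networks add headers, routing, and retransmission on loss, but the core insight is this decoupling: no single path needs to survive intact for the message to get through.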
Last of the First Programmers
December 1924 - March 2011
Jean Bartik was the last surviving member of the original programming team for the ENIAC, the first general-purpose electronic computer. But that understates her work. Bartik, the only female math graduate in her 1945 college class, was hired to make the physical connections that let the ENIAC perform artillery calculations, and she served as a lead programmer on the project. But Bartik also developed circuit logic and did design work under the direction of ENIAC's hardware developer, J. Presper Eckert.
Jack Keil Wolf
Disk Drivin' Man
February 1926 - February 2011
There's a reason why the amount of information we can store on hard disks keeps growing -- and its name is Jack Wolf. That may be an overstatement, but it's not too much to say that Wolf did more than almost anyone else to use math to cram more data into magnetic drives, flash memory and electronic communications channels.
June 1925 - September 2011
Silicon Valley had many builders, but one of them literally built some of the high-tech hub's first silicon-making machines. Julius Blank was one of the "Traitorous Eight" engineers who founded Fairchild Semiconductor in 1957. He and his seven colleagues had acquired that unflattering sobriquet because they decided to strike out on their own just a year after Nobel Prize-winning physicist William Shockley had recruited them to create a new kind of transistor at Shockley Labs.
No More Mobile Monopoly
October 1922 - October 2011
Motorola CEO Bob Galvin didn't design the first working handheld mobile phone -- one of his researchers, Marty Cooper, did that in 1973. But Galvin broke AT&T's monopoly on mobile-phone service in the U.S. by demonstrating a Motorola phone at the White House in 1981, spurring then-President Ronald Reagan to push the FCC to approve Motorola's proposal for a competing cellular network, just three years after AT&T had lost its long-distance monopoly.
Gerald A. Lawson
December 1940 - April 2011
The man who created the first home video-game system that used interchangeable game cartridges wasn't a typical Silicon Valley engineer. Jerry Lawson was 6-foot-6, more than 250 lbs. and African-American -- even more of an IT industry rarity in the 1970s than today. Lawson's creation, the Fairchild Channel F, arrived in 1976, a year before Atari's first home game system, and sparked an industry of third-party video games.
December 1915 - September 2011
Lee Davenport didn't invent battlefield radar for tracking enemy planes, but the system he developed -- which used a computer to control anti-aircraft guns -- did its job better than any previous approach during World War II.
Heartbeat of the Century
September 1919 - September 2011
It was an electronic mistake in 1956 that led to the first practical implantable cardiac pacemaker. Wilson Greatbatch, an electrical-engineering professor at the University of Buffalo, was building a heart rhythm monitor for the school's Chronic Disease Research Institute. When he attached a wrong-size resistor to a circuit, it produced intermittent electrical pulses -- which, Greatbatch realized, might be used to regulate a damaged heart.
Nov 16, 2011 | arstechnica.com
Forty years ago today, electronics and semiconductor trade newspaper Electronic News ran an advertisement for a new kind of chip. The Intel 4004, a $60 chip in a 16-pin dual in-line package, was an entire CPU packed onto a single integrated circuit (IC).
At a bare minimum, a CPU is an instruction decoder and an arithmetic logic unit (ALU); the decoder reads instructions from memory and directs the ALU to perform appropriate arithmetic. Prior CPUs were made up of multiple small ICs of a few dozen or hundred transistors (and before that, individual transistors or valves) wired up together to form a complete "CPU." The 4004 integrated the different CPU components into one 2,300-transistor chip.
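The decoder-plus-ALU division of labor described above can be illustrated with a toy fetch/decode/execute loop. This is a minimal sketch of the concept only; the two-operation, 4-bit instruction set here is invented for illustration and is not the 4004's actual instruction set:

```python
# Toy CPU: an instruction decoder directing an ALU, the minimal
# structure a CPU needs. The opcodes (0x0 = ADD, 0x1 = SUB) and the
# single-accumulator design are hypothetical, chosen for brevity.

def run(program, operands):
    """Fetch each opcode, decode it, and direct the 'ALU' accordingly."""
    acc = 0  # accumulator register
    for opcode, value in zip(program, operands):
        # Decoder: map the opcode to an ALU operation
        if opcode == 0x0:                # ADD
            acc = (acc + value) & 0xF    # keep results 4 bits wide
        elif opcode == 0x1:              # SUB
            acc = (acc - value) & 0xF
        else:
            raise ValueError(f"unknown opcode {opcode:#x}")
    return acc

print(run([0x0, 0x0, 0x1], [7, 5, 3]))  # (7 + 5 - 3) mod 16 = 9
```

A real decoder is of course hardware, not an if-chain, but the flow is the same: instructions come in, the decoder selects an ALU operation, and the ALU transforms register contents.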
The 4004 wasn't just a new direction for the computer industry; it was also a new direction for Intel. Since its founding in 1968, Intel had been a memory company, making various kinds of RAM and boasting some of the fastest, highest-density memory in the industry. It wasn't in the business of making CPUs or logic chips. Nonetheless, Japanese electronic calculator company Busicom approached Intel in 1969, asking the memory company to build a new set of logic chips for its calculators.
Busicom proposed a fixed-purpose design requiring around a dozen chips. Busicom had designed the logic itself, and even verified that it was correct; it wanted Intel to build the things. Ted Hoff, manager of Intel's Application Department, realized that the design could be simplified and improved by using a general-purpose CPU instead of the specialized calculator logic that Busicom proposed. Hoff managed to convince both Intel and Busicom management that his approach was the right one.
Work started six months later when Intel hired Federico Faggin in April 1970 to work on the project. Faggin had to design and validate the logic of the CPU. This was a challenge for Intel. As a memory company, it didn't have methodologies for designing or validating logic circuits. Intel's processes were geared towards the production of simple, regular repeating structures, rather than the highly varied logic that a CPU requires.
Faggin's job was also made more complex by the use of silicon gate transistors. At the time, aluminum gates were standard, and while silicon eventually won out, its early development was difficult; silicon gates needed different design approaches than aluminum, and those approaches hadn't been invented yet.
Nonetheless, Faggin was successful, and by March 1971 he had completed the development of a family of four chips: a 2048-bit ROM, the 4001; a 40-byte RAM, the 4002; an I/O chip, the 4003; and finally the CPU itself, the 4004. Intel paid Busicom for the rights to the design, allowing Intel to sell and market the chip family. Branded as the MCS-4, the chips entered production in June 1971, before being advertised to the commercial market 40 years ago today.
Clumsy and cutting-edge
The 4004 itself was a peculiar mix of cutting-edge technology and conservative cost-cutting. As an integrated CPU it was a landmark, but the design itself was clumsy even for 1970. Intel management insisted that the chip use a 16-pin DIP, even though larger 40-pin packages were becoming mainstream at the time. This meant that the chip's external bus was only four bits wide, and this single 4-bit bus had to transport 12-bit memory addresses, 8- and 16-bit instructions, and the 4-bit integers that the CPU operated on. Reading a single 16-bit instruction thus took four separate read operations. The chip had a 740 kHz clock and used 8 clock cycles per instruction, making it capable of 92,600 instructions per second; with the narrow multipurpose bus, however, achieving this in practice was difficult.
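The arithmetic behind those figures is easy to check. A quick sketch, using only the numbers quoted above (the integer division gives roughly the 92,600 figure; the small difference comes from the exact cycle timing):

```python
from math import ceil

CLOCK_HZ = 740_000          # 740 kHz clock
CYCLES_PER_INSTRUCTION = 8  # 8 clock cycles per instruction
BUS_WIDTH_BITS = 4          # single 4-bit external bus

# Peak instruction rate, close to the quoted ~92,600/s
print(CLOCK_HZ // CYCLES_PER_INSTRUCTION)  # 92500

# Bus transfers needed to move each quantity over the 4-bit bus:
# a 12-bit address, an 8-bit instruction, a 16-bit instruction
for bits in (12, 8, 16):
    print(bits, "bits ->", ceil(bits / BUS_WIDTH_BITS), "transfers")
```

The last line of output confirms the claim in the text: a 16-bit instruction needs four separate 4-bit reads before the CPU can even begin executing it.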
In 1972, Intel produced the 8-bit 8008. As with the 4004, this was built for a third party (this time the terminal manufacturer Datapoint), with Datapoint contributing much of the design of the instruction set and Intel using its 4004 experience to actually design the CPU. In 1974, the company released the 8080, a reworked 8008 that used a 40-pin DIP instead of the 8008's 18-pin package. Federico Faggin did much of the design work for both the 8008 and the 8080.
In spite of these pioneering products, Intel's management still regarded Intel as a memory company, albeit a memory company with a sideline in processors. Faggin left Intel in 1974 to found his own processor company, Zilog. Zilog's most famous product was the Z80, a faster, more powerful, software-compatible derivative of the 8080 that powered early home computers including the Radio Shack TRS-80 and the Sinclair ZX80, ZX81, and ZX Spectrum, systems that were many people's first introduction to the world of computing.
Faggin's decision to leave Intel and go into business for himself caused some bad feeling, with Intel for many years glossing over his contribution. Nonetheless, he left an indelible mark on Intel and the industry as a whole, not least through his decision to sign his initials, FF, on the 4004 die.

The 8080 instruction set was then extended to 16 bits with Intel's first 16-bit processor, the 20,000-transistor 8086, released in 1978. This was the processor that first heralded Intel's transition from a memory company that also produced processors into the world's leading processor company. In 1981, IBM picked the Intel 8088 (an 8086 with the external bus cut from 16 bits to 8) to power its IBM PC, the computer by which all others would come to be measured. But it wasn't until 1983, with memory revenue being destroyed by cheap Asian competitors, that Intel made microprocessors its core product.
The processors of today continue to owe much of their design (or at least, the design of their instruction sets) to the 8086. They're unimaginably more complex, with the latest Sandy Bridge E CPUs using 2.2 billion transistors, a million-fold increase on the 4004 and a 100,000-fold increase on the 8086, yet the basic design elements are more than 30 years old.
While the 4004 is widely regarded as the first microprocessor, and is certainly the best known, it arguably isn't actually the first. There are two other contenders.
Texas Instruments' TMS 1000 first hit the market in calculators in 1974, but TI claimed it was invented in 1971, before the 4004. Moreover, TI was awarded a patent in 1973 for the microprocessor. Intel subsequently licensed this patent.
Earlier than both of these was a processor called AL1. AL1 was built by a company named Four-Phase Systems. Four-Phase demonstrated systems built using AL1 in 1970, with several machines sold by early 1971. This puts them ahead of both TI and Intel. However, at the time AL1 was not used as a true standalone CPU; instead, three AL1s were used, together with three further logic chips and some ROM chips.
Intel and Cyrix came to blows in a patent dispute in 1990, with TI's patent being one of the contentious ones. To show that TI's patent should not have been granted, Four-Phase Systems founder Lee Boysel took a single AL1 and assembled it together with RAM, ROM, and I/O chips, but no other AL1s or logic chips, to demonstrate that it was, in fact, a microprocessor, and hence prior art that invalidated TI's claim. As such, although it wasn't used this way and wasn't sold standalone, the AL1 can retrospectively claim to have been the first microprocessor.
The 4004 is, however, still the first commercial microprocessor, and it's the first microprocessor recognized and used at the time as a microprocessor. Simple and awkward though its design may have been, it started a revolution. Ted Hoff, for convincing Busicom and Intel alike to produce a CPU; Federico Faggin, for designing the CPU; and Intel's management, particularly founders Gordon Moore and Robert Noyce, for buying the rights and backing the project: together they changed the world.
Nov 16, 2011 | arstechnica.com
...For home users using Windows 95-family operating systems, Windows XP had much more to offer, thanks to its substantially greater stability and security, especially once Service Pack 2 was released.
...Over the course of its life, Microsoft made Windows XP a much better operating system. Service Pack 2, released in 2004, was a major overhaul of the operating system. It made the software better able to handle modern systems, with improved WiFi support and a native Bluetooth stack, and made it far more secure. The firewall was enabled by default, the bundled Internet Explorer 6 gained the "gold bar" popup blocker and ActiveX security feature, and for hardware that supported it, Data Execution Protection made it more difficult to exploit software flaws.
"Today we celebrate Dennis Ritchie Day, an idea proposed by Tim O'Reilly. Ritchie, who died earlier this month, made contributions to computing that are so deeply woven into the fabric that they impact us all. We now have to remark on the elephant in the room. If Dennis Ritchie hadn't died just after Steve Jobs, there would probably have been no suggestion of a day to mark his achievements.
We have to admit that it is largely a response to the perhaps over-reaction to Steve Jobs which highlighted the inequality in the public recognition of the people who really make their world work."
Before he was deposed from Apple the first time around, Jobs already had a reputation internally for acting like a tyrant. Jobs regularly belittled people, swore at them, and pressured them until they reached their breaking point. In the pursuit of greatness he cast aside politeness and empathy. His verbal abuse never stopped. Just last month Fortune reported on a half-hour "public humiliation" Jobs doled out to one Apple team:

"Can anyone tell me what MobileMe is supposed to do?" Having received a satisfactory answer, he continued, "So why the fuck doesn't it do that?"

"You've tarnished Apple's reputation," he told them. "You should hate each other for having let each other down."

Jobs ended by replacing the head of the group, on the spot.
In his book about Jobs' time at NeXT and return to Apple, The Second Coming of Steve Jobs, Alan Deutschman described Jobs' rough treatment of underlings:
He would praise and inspire them, often in very creative ways, but he would also resort to intimidating, goading, berating, belittling, and even humiliating them... When he was Bad Steve, he didn't seem to care about the severe damage he caused to egos or emotions... suddenly and unexpectedly, he would look at something they were working on and say that it "sucked," that it was "shit."
Jobs had his share of personal shortcomings, too. There is no public record of him giving to charity over the years, despite the fact that he became wealthy after Apple's 1980 IPO and had accumulated an estimated $7 billion net worth by the time of his death. After closing Apple's philanthropic programs on his return to Apple in 1997, he never reinstated them, despite the company's gusher of profits.
It's possible Jobs gave to charity anonymously, or that he will posthumously, but he hardly embraced or encouraged philanthropy in the manner of, say, Bill Gates, who pledged $60 billion to charity and who joined with Warren Buffett to push fellow billionaires to give even more.

"He clearly didn't have the time," is what the director of Jobs' short-lived charitable foundation told the New York Times. That sounds about right. Jobs did not lead a balanced life. He was professionally relentless. He worked long hours, and remained CEO of Apple through his illness until six weeks before he died. The result was amazing products the world appreciates. But that doesn't mean Jobs' workaholic regimen is one to emulate.
There was a time when Jobs actively fought the idea of becoming a family man. He had his daughter Lisa out of wedlock at age 23 and, according to Fortune, spent two years denying paternity, even declaring in court papers "that he couldn't be Lisa's father because he was 'sterile and infertile, and as a result thereof, did not have the physical capacity to procreate a child.'" Jobs eventually acknowledged paternity, met and married his wife, now widow, Laurene Powell, and had three more children. Lisa went to Harvard and is now a writer.
Windows XP's retail release was October 25, 2001, ten years ago today. Though no longer readily available to buy, it continues to cast a long shadow over the PC industry: even now, a slim majority of desktop users are still using the operating system.
Windows XP didn't boast exciting new features or radical changes, but it was nonetheless a pivotal moment in Microsoft's history. It was Microsoft's first mass-market operating system in the Windows NT family. It was also Microsoft's first consumer operating system that offered true protected memory, preemptive multitasking, multiprocessor support, and multiuser security.
The transition to pure 32-bit, modern operating systems was a slow and painful one. Though Windows NT 3.1 hit the market in 1993, its hardware demands and software incompatibility made it a niche operating system. Windows 3.1 and 3.11 both introduced small amounts of 32-bit code, and the Windows 95 family was a complex hybrid of 16-bit and 32-bit code. It wasn't until Windows XP that Windows NT was both compatible enough (most applications having been updated to use Microsoft's Win32 API) and sufficiently light on resources.
In the history of PC operating systems, Windows XP stands alone. Even Windows 95, though a landmark at its release, was a distant memory by 2005. No previous PC operating system has demonstrated such longevity, and it's unlikely that any future operating system will. Nor is its market share dominance ever likely to be replicated; at its peak, Windows XP was used by more than 80 percent of desktop users.
The success was remarkable for an operating system whose reception was initially quite muted. In the wake of the September 11th attacks, the media blitz that Microsoft planned for the operating system was toned down; instead of arriving with great fanfare, it slouched onto the market. Retail sales, though never a major way of delivering operating systems to end users, were sluggish, with the operating system selling at a far slower rate than Windows 98 had done three years previously.
It faced tough competition from Microsoft's other operating systems. Windows 2000, released less than two years prior, had won plaudits with its marriage of Windows NT's traditional stability and security to creature comforts like USB support, reliable plug-and-play, and widespread driver support, and was widely adopted in businesses. For Windows 2000 users, Windows XP was only a minor update: it had a spruced up user interface with the brightly colored Luna theme, an updated Start menu, and lots of little bits and pieces like a firewall, UPnP, System Restore, and ClearType. ...
Long in the tooth it may be, but Windows XP still basically works. Regardless of the circumstances that led to its dominance and longevity, the fact that it remains usable so long after release is remarkable. Windows XP was robust enough, modern enough, well-rounded enough, and usable enough to support this extended life. Not only was Windows XP the first (and only) PC operating system that lasted ten years: it was the first PC operating system that was good enough to last ten years. Windows 98 didn't have the security or stability; Windows 2000 didn't have the security or comfort; Mac OS X 10.1 didn't have the performance, the richness of APIs, or the hardware support.
Given current trends, Windows 7 will overtake XP within the next year, with many businesses now moving away from the decade-old OS in earnest. Not all (there are still companies and governments rolling out Windows XP on new hardware), but the tide has turned. Windows XP, with its weaker security and inferior support for modern hardware, is now becoming a liability; Windows 7 is good enough for business and an eminently worthy successor, in a way that Windows Vista was never felt to be.
Ten years is a good run for any operating system, but it really is time to move on. Windows 7 is more than just a solid replacement: it is a better piece of software, and it's a much better match for the software and hardware of today. Being usable for ten years is quite an achievement, but the stagnation it caused hurts, driving up costs for administrators and developers alike. As incredible as Windows XP's longevity has been, it's a one-off. Several factors (the 32-bit transition, the Longhorn fiasco, even the lack of competition resulting from Apple's own Mac OS X transition) conspired to make Windows XP's position in the market unique. We should not want this situation to recur: Windows XP needs to be not only the first ten-year operating system; it also needs to be the last.
"We should not want this situation to recur: Windows XP needs to be not only the first ten-year operating system; it also needs to be the last."
It feels like you completely missed the point.
In today's fast-paced, constantly iterating (not innovating, as they claim) world, "good enough" is an alien concept, a foreign language. Yet we reached "good enough" ten years ago and it shows no signs of ever going away.
OttoResponder wrote: All those words and yet one was missed: monopoly. You can't really talk about XP, or any other Microsoft OS, without talking about the company's anti-competitive practices. For example, the way they strong-armed VARs into selling only Windows... Sure, the Bush Administration let them off the hook, but the court's judgment still stands.
Pirated XP is still installed far more than Linux despite being an OS from 2001.
Linux on the desktop has shortcomings and pretending they don't exist won't make them go away.
Microsoft used strong arm tactics but the competition also sucked. I have known many geeks that chose XP over Linux because they found the latter to be too much of a hassle, not because of OEMs or software compatibility.
"Windows XP didn't boast exciting new features". I stopped reading there because that's a load of crap/myth. XP came with a large number of NEW and EXCITING features. Read more about them here: http://en.wikipedia.org/wiki/Features_new_to_Windows_XP . XP was a very well engineered system that improved by orders of magnitude upon Windows 2000. Its popularity and continued use demonstrate just how well designed the system was. Great compatibility, excellent stability and performance. Security was an Achilles heel but SP2 nailed it and XP became a very good OS.
Windows 7 has some nice features but plenty of regressions too. Windows 7 can't even do basic operations like freely arrange pictures in a folder or not force a sorting order on files. The search is totally ruined for real-time searching, WMP12 is a UI disaster. Service packs and updates take hours to install instead of minutes and can't be slipstreamed into setup files. There's no surround sound audio in games. There is no choice of a Classic Start Menu. Windows Explorer, the main app where I live (instead of living on Facebook) is thoroughly dumbed down.
"Windows XP didn't boast exciting new features or radical changes, but it was nonetheless a pivotal moment in Microsoft's history. It was Microsoft's first mass-market operating system in the Windows NT family. It was also Microsoft's first consumer operating system that offered true protected memory, preemptive multitasking, multiprocessor support, and multiuser security."
Talk about contradictions. First, claim there were no new or exciting features, then list a bunch of them. XP was the first fully 32-bit Windows OS and broke dependence on DOS. I'd say it did offer radical changes for the better.

carlisimo wrote: I'm still on XP, and I'm hesitant about upgrading... I don't like the changes to Windows Explorer at all.

dnjake wrote: How often do you need a new spoken language or a new hammer? When people spend effort learning how to use an operating system and that operating system meets their needs, change is a losing proposition. The quality of Microsoft's work is going down. But Microsoft's quality is still far better than almost any of the low-grade work that is standard for the Web. Windows 7 does offer some improvement over XP and it is a more mature operating system. It will be used longer than XP, and it remains to be seen how long XP's life will turn out to be. The quality of the Web is still low. Even the most basic forms and security sign-in applications are primitive and often broken. It may easily take another decade. But the approach Microsoft is taking with Windows 8 probably will eventually provide a mature system based on the HTML DOM as the standard UI. Between that and XAML, it is hard to see why anything more will be needed. The days of ever-changing operating systems are drawing to a close.
Windows 2008R2 is probably safer than RHEL for web hosting.
Really? I'd like to see some results on this.
Quote: IE6 and IE9 might as well be entirely different browsers.
Yes, because they had competition surpassing them. Otherwise you get 5+ years of... nothing.
I am in charge of the PC deployment team of a Fortune 400 company. I can tell you firsthand the nightmare of upgrading to Windows 7. The facts are, legacy apps haven't been upgraded to run on Windows 7, much less 64-bit Windows 7. We had the usual suspects (Symantec, Cisco, Citrix) all ready for launch, but everyone else drags their feet, and we have had to tell our customers: either do away with the legacy app and we can find similar functionality in another application, or keep your legacy app and you will be sent an older PC with XP on it (effectively redeploying what they already have, with more memory). Add to that, we are using Office 2010 and it's a complete shell shock to most end users used to Office 2003, though going from 2007 isn't as bad.
On the other hand I do small business consulting and moved over a 10 person office and they were thrilled with Windows 7, as it really took advantage of the newer hardware.
It just depends on the size and complexity of the upgrade. My company just cannot throw away a working OS when these production applications won't work... Maybe in a perfect IT world.
The various Linux distros still haven't managed to come up with a solid desktop OS that just works, to say nothing of the dearth of decent applications, and what is there frequently looks and works like some perpetual beta designed by nerds in their spare time.
It's funny b/c it's so very true
Windows XP's longevity is truly remarkable. The article makes a good point: the strong push towards internet-connected PCs, and the security demands that came with it, made running pre-XP Microsoft desktop OSes untenable after a few years, especially once Windows XP SP2 was released with beefier security.
I personally jumped ship to Vista as soon as I could, because after the stability issues were ironed out within the first 6 months, it was a much smoother, better PC experience than XP (long boot times notwithstanding). Windows 7, which was essentially just a large service pack of Vista sold as a new OS (think OS X releases), was a smoother, more refined Vista.
I believe that Windows 7 is "the new XP", and it will probably still command well over 10% of the desktop market in 5+ years. I believe that for non-touch-screen PCs, Windows 7 will be the gold standard for years to come, and that is the vast majority of business PCs and home PCs. New builds of Windows 7 boot faster than XP, and run smoother with fewer hiccups. The GPU-accelerated desktop really does run smoother than the CPU-driven ones of the past.
Nothing will approach XP's 10-year run, most of it as the dominant desktop OS. The lines between desktop and laptop have blurred lately as well; Windows 7 and Mac OS X are considered "desktop" OSes even when they run on laptops. There is a newfound emphasis on mobile OSes like never before. More and more people will use tablet devices as media-consumption devices: to surf the net, watch videos, etc. More and more people use computers while watching TV; it's a trend that is only increasing, and smartphones and tablets make this even easier.
Because Windows 8's "Metro" UI is so touch-focused, I could see it taking off in school usage, laptops, and, of course, tablets in the 201x decade. It will be interesting to see how Windows 8 tablets run when the OS first launches in late 2012; tablet hardware is at least an order of magnitude slower than desktop hardware. Within a few years of Windows 8's launch, however, there may be no perceptible performance difference between Tablet and desktop/laptop usage.
OS X 10.0-10.7 = $704
Windows XP - 7 via upgrades (XP Professional Full - Ultimate - Ultimate) = $780
Windows XP - 7 via full versions (Professional - Ultimate - Ultimate) = $1020
OS X has been cheaper if you were looking for the full experience of Windows each time. I don't have time to research which of the versions were valid upgrades of each other, b/c that was a super PITA, so I'll just leave that there for others to do.
You're too lazy to do the research for your own comparison?
Ultimate mainly exists for the people that max out laptops because they have nothing better to do with their money. The best feature of Ultimate (bitlocker) has a free alternative (truecrypt).
You certainly don't need Ultimate for the full Windows experience.
DrPizza wrote: I don't even understand the question, particularly not with regard to how it relates to Apple. Apple doesn't give you iterative improvements. It releases big new operating systems that you have to buy.
I think the case can be made that the transition between versions of Windows, traditionally, are a bit larger of a jump than transitions between versions of Mac OS X.
The leap from Windows XP to Windows Vista was quite large; compare that to the changes between Mac OS 10.3 and Mac OS 10.4. Similarly, Vista to 7 looks "more" than 10.4 to 10.5.
While Apple charges for each point iteration of its operating system and adds new features, most of the underlying elements are essentially the same. They round out corners, remove some brushed metal here and there, change a candy stripe or two, but the visual differences between versions aren't as dramatic.
cactusbush wrote: Yeah, we are obliged to upgrade OS's eventually, when we purchase new hardware. Problem is that Windoze is getting worse - not better. Windoze Vista and 7 -STINK- and the future with Win8 looks even bleaker. My issues revolve around having command and control over my personal machine rather than it behaving like a social machine or by having the OS protect me from the machine's inner workings.
Several previous commentators have already remarked along similar lines, their frustration with Win7&8's isolation and de-emphasis of the file managing 'Explorer' app. - "aliasundercover" listed several Win7 shortcomings; including excessive 'nag screens', less control over where things get put, "piles of irrelevant things run incessantly in the background", the need for the machine to 'Phone Home' constantly and the "copy protection and activation getting meaner".
Years ago, the purchase of XP, with its new restriction of installation to only one computer, determined that XP would be my last MS OS purchase. Linux, however, has not yet blossomed to a desirable point.
MS is developing dumbed down style of operating systems that I don't want and don't like.
1) There is absolutely nothing stopping you from "having command and control" over your personal Windows 7 machine. In fact, Windows 7 provides more and better facilities for automatically starting and managing various tasks.
2) Nag screens can be easily disabled. Moreover, there are multiple settings for these screens. For example, I only see these screens when I install or uninstall software.
3) I don't see how explorer has been "isolated" or "de-emphasized." There are a few UI changes to the program, but most can be reverted to what XP looked like, save for the lack of an "up" button (which can be restored with certain software). Learn to use the new search in the start menu. It will save you a lot of time in the long run.
4) I'm not sure what the "less control over where things get put" complaint is about. Pretty much every program installer allows you to change where programs are installed.
5) Windows 7 runs faster and more efficiently than Windows XP, regardless of background processes.
6) Windows 7 activation has been painless. I don't see why anyone cares about Windows "phoning home" for activation after installation or the copy protection scheme, unless you're a pirate. Buy a copy of Windows 7 for each PC and stop being such a cheap-ass.
Honestly, it sounds like you have had very little actual exposure to Windows 7 and have just picked up complaints from other people. Neither XP nor 7 are perfect OSes, but 7 is leagues above XP in terms of security, performance, and standards. Windows 7 is a modern OS in every sense of the word. XP is an OS that has been patched and updated many times during its lifespan to include features and security it should have had in the first place.
Nightwish wrote: That and most people walk into a best buy and get whatever they're told to get having no idea what an OS is or that there's a choice. Enterprise is terrified of change.
Enterprise needs to get work done. Vista had all the problems of a major update with the benefits of a minor update.
I'm left wondering the age of the people spouting this "enterprise is terrified of change" meme.
Seriously. This isn't meant as an insult to younger people. It isn't bad to be young. However, the young often don't fully grasp the factors that go into the decision-making process.
IT departments aren't afraid of change. Change is exactly what keeps them employed and makes their job interesting. You'll find that they usually run the latest and greatest at home, likely have brand-new gadgets, and spend their free time on sites like ars.
So why don't they upgrade? Because upgrading costs time, money, and the devoted attention of people in key roles. It also results in lost productivity in the short term. The benefits of upgrading must be weighed against the costs of upgrading. But not only that, the upgrade must be weighed against other projects that might help the organization more. Only so much change can be managed and endured simultaneously.
Meanwhile, casual and overly emotional observers pretend that IT departments are sticking with XP because they're lazy or haven't given the topic much thought. Rest assured, migration from XP has been given a ton of attention and the decision of when to leap isn't made lightly.
Great post... I think young people don't fully grasp how important it is to keep those main line of business applications operating.
I love XP and always will. I have been using it for almost 10 years. Got it just after it came out, on my first real computer: a Dell Dimension 4100 with a 733MHz Pentium III and 128MB of SDRAM.
Just a few months ago I sold a brand new replacement laptop that I was sent from Dell so that I could buy an older, cheap laptop. A 2006 Inspiron E1405. It has a 1.83GHZ Core Duo, released before even the Core 2 Duos came out, only a 32-bit CPU. I am running XP SP3 on it with 2gigs of RAM and it flies. I run every program that any normal person would. Currently have 13 tabs open in Chrome, Spotift open, some WinExplorer windows, and Word 2010 open. Not a hint of slow down.
XP is just so lean, and it can be leaned out even further through tons of tweaks.
FOR ANYONE WHO STILL RUNS XP, download an UxTheme.dll patcher so that you can use custom themes!
All this crying about "Holding us back". I say to the contrary, it kept us from moving "Forward". Forward as in needing new hardware every couple of years that in the end gave us NO REAL new functionality, speed, or efficiency. It wasn't until the Core processors from Intel that there was any NEED for a new OS to take advantage of the hardware.
Being able to keep working on old hardware that STILL PERFORMED, or being able to upgrade when you FELT like it (instead of being FORCED to because the new crappy whizbang OS brought it to its knees) with results that FLEW was NICE.
Windows 7 is a worthy successor to XP, but that doesn't mean XP wasn't a GREAT OS during its run!
A few of us are using XP because a $70,000-250,000 instrument requires a particular OS to run its software. Upgrading to a new OS (if an upgrade is offered at all) is a $3,000-14,000 cost for new controller boards in the PC and the new software, not to mention the additional cost of a new PC. We have two Win98, one WinNT, and three WinXP machines in our lab running instruments.
I just got on the Windows 7 bandwagon a little over a month ago. There are some things I like and some things I don't. The boot and shutdown times are considerably faster than XP's. I also feel like the entire OS is just much more stable: I never get programs that hang on shutdown. It just plain works. I don't care much for the new Windows 7 themes; I immediately switched to the Windows Classic theme as soon as I found it. However, I still like the old XP start menu more. It was just more compact and cleaner. I do like the search feature in Windows 7. There are some other things I don't like, such as Explorer automatically refreshing when I rename a file. It's a pointless feature that adds nothing to the experience.
At my job, however, I see no upgrade from XP in sight. I work for a major office supply retailer, and we are hurting financially. We still use the same old Pentium 4 boxes from when I started back in 2003.
What moves people (and companies) to upgrade (or not) their OS? Basically, two things: applications and hardware. For a long time XP covered both very well (even, in most cases, for legacy Win16 or DOS applications, either natively or through third-party support like DOSBox), so the market felt no big need to change. Microsoft knew this; that's why things like DirectX 10 got no support on XP. On the other hand, the hardware market evolved to 64-bit platforms, and things like XP's 3GB address-space limit and its immature multi-core processor support became real problems.
I think that Windows 7 / Windows 8 will spread faster when people and enterprises feel the need to migrate to 64 bit hardware (that was my case, BTW).
October 7, 2011 | Mail Online
A father he never knew, a love-child he once denied and a sister he only met as an adult: The tangled family of Steve Jobs... and who could inherit his $8.3 BILLION fortune
Apple co-founder survived by two sisters, wife and their three children
But he also had love child Lisa Brennan-Jobs with Chrisann Brennan
His Syrian biological father never had a conversation with Jobs as an adult
Steve Jobs's tangled family of a forgotten father, long-lost sister and love child means lawyers may face a delicate task breaking up his $8.3billion fortune. The 56-year-old co-founder and former CEO of Apple is widely seen as one of the world's greatest entrepreneurs - and he died just outside the top 100 of the world's richest billionaires. But behind the iconic Californian's wealth and fame lies an extraordinary story of a fragmented family. Mr Jobs, of Palo Alto, California, is survived by his sisters Patti Jobs and Mona Simpson, his wife Laurene Powell Jobs and their three children Eve, Erin and Reed.
STEVE JOBS AND HIS FAMILY
Biological parents: Joanne Schieble and Abdulfattah Jandali
Biological sister: Mona Simpson
Adoptive parents: Clara and Paul Jobs
Adoptive sister: Patti Jobs
Wife: Laurene Powell Jobs
Children: Eve, Erin and Reed
Love child: Lisa Brennan-Jobs, from his relationship with Chrisann Brennan
But his family is far from straightforward. He was adopted as a baby and, despite his biological father's attempts to contact him later on, remained estranged from his natural parents. In his early twenties Mr Jobs became embroiled in a family scandal, before his days of close media scrutiny, after he fathered a love child with his high school sweetheart Chrisann Brennan. Ms Brennan, who was his first serious girlfriend, became pregnant in 1977 - and he at first denied he was the father. She gave birth to Lisa Brennan-Jobs in 1978 - and in the same year Mr Jobs created the 'Lisa' computer, but insisted the name only stood for 'Local Integrated Software Architecture'. The mother initially raised their daughter on benefits, but he accepted his responsibilities two years later after a court-ordered blood test proved he was the father, despite his claims of being 'infertile'. Ms Brennan-Jobs has made a living for herself, after graduating from Harvard University, as a journalist and writer. She was eventually invited into her father's life as a teenager and told Vogue that she 'lived with him for a few years'. 'In California, my mother had raised me mostly alone,' Lisa wrote in an article for Vogue in 2008. 'We didn't have many things, but she is warm and we were happy. We moved a lot. We rented.
'My father was rich and renowned, and later, as I got to know him, went on vacations with him, and then lived with him for a few years, I saw another, more glamorous world.' Mr Jobs was born to Joanne Schieble and Syrian student Abdulfattah Jandali before being given up for adoption.
Mr Jandali was a Syrian student and not married to Ms Schieble at the time of Mr Jobs's birth in San Francisco, California, in February 1955.
She did not want to bring up a child out of wedlock and went to San Francisco from their home in Wisconsin to have the baby.
Mr Jobs is thought never to have made contact with his biological father.
Mr Jandali, 80, a casino boss, has said he wanted to meet his son but was worried that Mr Jobs would think he was after money. He had always hoped that his son would call him to make contact - and had emailed him a few times in an attempt to speak. Mr Jandali once said he 'cannot believe' his son created so many gadgets. 'This might sound strange, though, but I am not prepared, even if either of us was on our deathbeds, to pick up the phone to call him,' he said. Ms Schieble and Mr Jandali then had a second child, Mona Simpson, who became a novelist. Ms Simpson once wrote a book loosely based on her biological brother. She lives in Santa Monica, California, with her two children and was once married to producer Richard Appel. But Mr Jobs did not actually meet Ms Simpson until he was aged 27. He never wanted to explain how he tracked down his sister, but she described their relationship as 'close'. Mr Jobs was adopted by working-class couple Clara and Paul Jobs, who have both since died; they later adopted a second child, Patti Jobs. He met his wife Laurene Powell in 1989 while speaking at Stanford's graduate business school, and he had three children with her - Eve, Erin and Reed. They married in 1991 and Reed was born soon after; he is their oldest child, aged 20. Mr Jobs registered an incredible 338 U.S. patents or patent applications for technology and electronic accessories, reported the International Business Times.
He was believed to have driven a 2007 Mercedes Benz SL55 AMG, which was worth around $130,000 new at the time. His 5,700 sq ft home was a 1930s Tudor-style property with seven bedrooms and four bathrooms - and it is estimated by CNBC to be worth $2.6million.
Mr Jobs also owned a huge historic Spanish colonial home in Woodside, which had 14 bedrooms and 13 bathrooms across six acres of forested land. But he later had it knocked down to make way for a smaller property after a long legal battle.
His charitable giving was always kept private, like most other elements of his lifestyle. Mr Jobs reportedly declined to get involved with the Giving Pledge - founded by Warren Buffett and Bill Gates to get the wealthiest people to give away at least half of their wealth. But he is rumoured to have given $150million to the Helen Diller Family Comprehensive Cancer Center at the University of California, San Francisco, reported the New York Times. Cancer organisations are the most likely to be supported if any charities are in his will, as he died on Wednesday at the age of 56 from the pancreatic form of the illness.
Read more: http://www.dailymail.co.uk/news/article-2046031/Steve-Jobs-death-Apple-boss-tangled-family-inherit-8-3bn-fortune.html
I don't care how you spell it out..... money, love children, tangled web of a life..... The world just lost the Henry Ford and the Thomas Edison of our day. Can anyone set the dollar value of his estate aside and look at Steve Jobs holistically? The guy was damn brilliant.... RIP Steve and sympathies to your wife and children. Right now, they don't care what your net worth was..... you were their dad and a father....
- Kurt R, Northville, MI, 08/10/2011 00:07
@Nancy Briones: he was a hero because he epitomized the American dream - brought up in a very modest household, dropped out of college to save his parents' money, started a business in his garage, made it big, failed and was fired, got back up and made it big all over again. His visions and his attention to detail have changed everyone's lives, whether you use Apple products or not. All computers changed because of Apple; the music industry was dragged kicking & screaming into the 21st century, to the benefit of consumers. Pixar revolutionized animated movies. Just simple computer typography changed massively because of Jobs. He took the computer version of ergonomics (that is, their ease of use) to levels no-one else could be remotely bothered to take them. He made computers useful for the liberal arts field, not just number crunching. His mission in life was to improve the world. His salary was $1 per year. He got rich just because he was successful at changing the world.
- DBS, San Francisco, USA, 08/10/2011 00:00
My name is Ozymandias, king of kings: Look on my works, ye Mighty, and despair
- Clive, Fife, 07/10/2011 15:24
Why was he such a hero? He benefited greatly from his creations. It was his job and he was paid for it. Funny how his cancer diagnosis somehow made us all so sympathetic to someone whose mission in life was to amass wealth, not save the world. My heart goes out to his family in their time of loss, however.
October 13, 2011 | NYTimes.com
Dennis M. Ritchie, who helped shape the modern digital era by creating software tools that power things as diverse as search engines like Google and smartphones, was found dead on Wednesday at his home in Berkeley Heights, N.J. He was 70.
Mr. Ritchie, who lived alone, was in frail health in recent years after treatment for prostate cancer and heart disease, said his brother Bill.
In the late 1960s and early '70s, working at Bell Labs, Mr. Ritchie made a pair of lasting contributions to computer science. He was the principal designer of the C programming language and co-developer of the Unix operating system, working closely with Ken Thompson, his longtime Bell Labs collaborator.
The C programming language, a shorthand of words, numbers and punctuation, is still widely used today, and successors like C++ and Java build on the ideas, rules and grammar that Mr. Ritchie designed. The Unix operating system has similarly had a rich and enduring impact. Its free, open-source variant, Linux, powers many of the world's data centers, like those at Google and Amazon, and its technology serves as the foundation of operating systems, like Apple's iOS, in consumer computing devices.
"The tools that Dennis built - and their direct descendants - run pretty much everything today," said Brian Kernighan, a computer scientist at Princeton University who worked with Mr. Ritchie at Bell Labs.
Those tools were more than inventive bundles of computer code. The C language and Unix reflected a point of view, a different philosophy of computing than what had come before. In the late '60s and early '70s, minicomputers were moving into companies and universities - smaller and at a fraction of the price of hulking mainframes.
Minicomputers represented a step in the democratization of computing, and Unix and C were designed to open up computing to more people and collaborative working styles. Mr. Ritchie, Mr. Thompson and their Bell Labs colleagues were making not merely software but, as Mr. Ritchie once put it, "a system around which fellowship can form."
C was designed for systems programmers who wanted to get the fastest performance from operating systems, compilers and other programs. "C is not a big language - it's clean, simple, elegant," Mr. Kernighan said. "It lets you get close to the machine, without getting tied up in the machine."
Such higher-level languages had earlier been intended mainly to let people without a lot of programming skill write programs that could run on mainframes. Fortran was for scientists and engineers, while Cobol was for business managers.
C, like Unix, was designed mainly to let the growing ranks of professional programmers work more productively. And it steadily gained popularity. With Mr. Kernighan, Mr. Ritchie wrote a classic text, "The C Programming Language," also known as "K. & R." after the authors' initials, whose two editions, in 1978 and 1988, have sold millions of copies and been translated into 25 languages.
Dennis MacAlistair Ritchie was born on Sept. 9, 1941, in Bronxville, N.Y. His father, Alistair, was an engineer at Bell Labs, and his mother, Jean McGee Ritchie, was a homemaker. When he was a child, the family moved to Summit, N.J., where Mr. Ritchie grew up and attended high school. He then went to Harvard, where he majored in applied mathematics.
While a graduate student at Harvard, Mr. Ritchie worked at the computer center at the Massachusetts Institute of Technology, and became more interested in computing than math. He was recruited by the Sandia National Laboratories, which conducted weapons research and testing. "But it was nearly 1968," Mr. Ritchie recalled in an interview in 2001, "and somehow making A-bombs for the government didn't seem in tune with the times."
Mr. Ritchie joined Bell Labs in 1967, and soon began his fruitful collaboration with Mr. Thompson on both Unix and the C programming language. The pair represented the two different strands of the nascent discipline of computer science. Mr. Ritchie came to computing from math, while Mr. Thompson came from electrical engineering.
"We were very complementary," said Mr. Thompson, who is now an engineer at Google. "Sometimes personalities clash, and sometimes they meld. It was just good with Dennis."
Besides his brother Bill, of Alexandria, Va., Mr. Ritchie is survived by another brother, John, of Newton, Mass., and a sister, Lynn Ritchie of Hexham, England.
Mr. Ritchie traveled widely and read voraciously, but friends and family members say his main passion was his work. He remained at Bell Labs, working on various research projects, until he retired in 2007.
Colleagues who worked with Mr. Ritchie were struck by his code - meticulous, clean and concise. His writing, according to Mr. Kernighan, was similar. "There was a remarkable precision to his writing," Mr. Kernighan said, "no extra words, elegant and spare, much like his code."
October 06, 2011 | Moon of Alabama
To counter today's Steve Jobs hype, citing some excerpts from the Wikipedia entry about him seems appropriate.

Jobs returned to his previous job at Atari and was given the task of creating a circuit board for the game Breakout. According to Atari founder Nolan Bushnell, Atari had offered $100 for each chip that was eliminated in the machine. Jobs had little interest in or knowledge of circuit board design and made a deal with Wozniak to split the bonus evenly between them if Wozniak could minimize the number of chips. Much to the amazement of Atari, Wozniak reduced the number of chips by 50, a design so tight that it was impossible to reproduce on an assembly line. According to Wozniak, Jobs told Wozniak that Atari had given them only $700 (instead of the actual $5,000) and that Wozniak's share was thus $350.
While Jobs was a persuasive and charismatic director for Apple, some of his employees from that time had described him as an erratic and temperamental manager.
In the coming months, many employees developed a fear of encountering Jobs while riding in the elevator, "afraid that they might not have a job when the doors opened. The reality was that Jobs' summary executions were rare, but a handful of victims was enough to terrorize a whole company." Jobs also changed the licensing program for Macintosh clones, making it too costly for the manufacturers to continue making machines.
After resuming control of Apple in 1997, Jobs eliminated all corporate philanthropy programs.
In 2005, Jobs responded to criticism of Apple's poor recycling programs for e-waste in the U.S. by lashing out at environmental and other advocates at Apple's Annual Meeting in Cupertino in April.
In 2005, Steve Jobs banned all books published by John Wiley & Sons from Apple Stores in response to their publishing an unauthorized biography, iCon: Steve Jobs.
The article doesn't go into the outsourcing of the production of Apple products to a Chinese company which is essentially using slave labor with 16 hour work days and a series of employee suicides. This while Apple products are beyond real price competitions and the company is making extraordinary profits.
Jobs was reported to be 42nd on the list of the richest men in the United States.
He marketed some good products. The NeXT cube was nice. Jobs though wasn't a nice man.

b
@jdmckay NeXT OS & Development tools were 5-10 years beyond... *anything* else out there. NeXT STEP *defined* OOP... when CS professors were still saying it was a fad.
NeXT came out 1988/89.
I learned object-oriented programming (OOP) in 1985/86 on a Symbolics LISP Machine, which had a very nice graphical interface. The machine was of course running in a computer science department at a university, and there were several capable CS professors around who were working on such machines and saw them as the future, not as a fad.
Jobs didn't invent anything with NeXT. He created a really nice package of existing technologies, using a UNIX derivative and aspects of the LISP Machine and Smalltalk. Objective-C was developed in the early 1980s; Jobs just licensed it. People at Xerox and elsewhere had been working on such things for years before Jobs adopted them.
NeXTSTEP did not define OOP. It made it more widely available. There were already some 7,000+ LISP machines sold before NeXT came onto the market.
Six years ago, Jobs had talked about how a sense of his mortality was a major driver behind that vision.
"Remembering that I'll be dead soon is the most important tool I've ever encountered to help me make the big choices in life," Jobs said during a Stanford commencement ceremony in 2005.
"Because almost everything -- all external expectations, all pride, all fear of embarrassment or failure -- these things just fall away in the face of death, leaving only what is truly important."
"Remembering that you are going to die is the best way I know to avoid the trap of thinking you have something to lose. You are already naked. There is no reason not to follow your heart."
Apple said in a press release that it was "deeply saddened" to announce that Mr. Jobs had passed away on Wednesday.
"Steve's brilliance, passion and energy were the source of countless innovations that enrich and improve all of our lives," the company said. "The world is immeasurably better because of Steve.
Mr. Jobs stepped down from the chief executive role in late August, saying he could no longer fulfill his duties, and became chairman. He underwent surgery for pancreatic cancer in 2004, and received a liver transplant in 2009.
Rarely has a major company and industry been so dominated by a single individual, and so successful. His influence went far beyond the iconic personal computers that were Apple's principal product for its first 20 years. In the last decade, Apple has redefined the music business through the iPod, the cellphone business through the iPhone and the entertainment and media world through the iPad. Again and again, Mr. Jobs gambled that he knew what the customer would want, and again and again he was right.
The early years of Apple long ago passed into legend: the two young hippie-ish founders, Mr. Jobs and Steve Wozniak; the introduction of the first Macintosh computer in 1984, which stretched the boundaries of what these devices could do; Mr. Jobs's abrupt exit the next year in a power struggle. But it was his return to Apple in 1996 that started a winning streak that raised the company from the near-dead to its current position of strength.
Bill Gates, the former chief executive of Microsoft, said in a statement that he was "truly saddened to learn of Steve Jobs's death." He added: "The world rarely sees someone who has had the profound impact Steve has had, the effects of which will be felt for many generations to come. For those of us lucky enough to get to work with him, it's been an insanely great honor. I will miss Steve immensely."
Mr. Jobs's family released a statement that said: "Steve died peacefully today surrounded by his family. In his public life, Steve was known as a visionary; in his private life, he cherished his family. We are thankful to the many people who have shared their wishes and prayers during the last year of Steve's illness; a Web site will be provided for those who wish to offer tributes and memories."
On the home page of Apple's site, product images were replaced with a black-and-white photo of Mr. Jobs.
Mr. Jobs's decision to step down in August inspired loving tributes to him on the Web and even prompted some fans to head to Apple stores to share their sentiments with others. Some compared him to legendary innovators like Thomas Edison.
September 08, 2011 | Open Enterprise
I've never written an obituary before in these pages. Happily, that's because the people who are driving the new wave of openness are relatively young, and still very much alive. Sadly, one of the earliest pioneers, Michael Hart, was somewhat older, and died on Tuesday at the age of just 64.
What makes his death particularly tragic is that his name is probably only vaguely known, even to people familiar with the areas he devoted his life to: free etexts and the public domain. In part, that was because he was modest, content with only the barest recognition of his huge achievements. It was also because he was so far ahead of his time that there was an unfortunate disconnect between him and the later generation that built on his trailblazing early work.
To give an idea of how visionary Hart was, it's worth bearing in mind that he began what turned into the free etext library Project Gutenberg in 1971 - fully 12 years before Richard Stallman began to formulate his equivalent ideas for free software. Here's how I described the rather extraordinary beginnings of Hart's work in a feature I wrote in 2006:
In 1971, the year Richard Stallman joined the MIT AI Lab, Michael Hart was given an operator's account on a Xerox Sigma V mainframe at the University of Illinois. Since he estimated this computer time had a nominal worth of $100 million, he felt he had an obligation to repay this generosity by using it to create something of comparable and lasting value.
His solution was to type in the US Declaration of Independence, roughly 5K of ASCII, and to attempt to send it to everyone on ARPANET (fortunately, this trailblazing attempt at spam failed). His insight was that once turned from analogue to digital form, a book could be reproduced endlessly for almost zero additional cost - what Hart termed "Replicator Technology". By converting printed texts into etexts, he was able to create something whose potential aggregate value far exceeded even the heady figure he put on the computing time he used to generate it.
Hart chose the name "Project Gutenberg" for this body of etexts, making a bold claim that they represented the start of something as epoch-making as the original Gutenberg revolution.
Naturally, in preparing to write that feature for LWN.net, I wanted to interview Hart to find out more about him and his project, but he was very reluctant to answer my questions directly - I think because he was uncomfortable with being placed in the spotlight in this way. Instead, he put me on his mailing list, which turned out to be an incredible cornucopia of major essays, quick thoughts, jokes and links that he found interesting.
In one of those messages, he gave a good explanation of what he believed his Project Gutenberg would ultimately make possible:
Today we have terabyte drives for under $100 that are just about the same size as the average book.
10 years ago, in 1999, most people were using gigabytes in their systems rather than terabytes.
10 years before that, in 1989, most people used megabytes.
10 years before that, in 1979, most people used kilobytes.
My predictions run up to about 2021, which would be around the 50th anniversary of that first eBook from 1971.
I predict there will be affordable petabytes in 2021.
If there are a billion eBooks by 2021, they should fit the new petabytes just fine, as follows:
The average eBook in the plainest format takes a megabyte.
There will be a billion eBooks in 2021 or shortly after.
A billion eBooks at a megabyte each takes one petabyte.
You will be able to carry all billion eBooks in one hand.
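Hart's back-of-the-envelope arithmetic holds up, at least in decimal (SI) units. A quick sketch (the figures are his estimates, not verified projections):

```python
# Hart's estimate: a billion plain-text eBooks at ~1 MB each.
ebook_size = 10**6    # bytes in one average plain-text eBook (~1 megabyte)
n_ebooks = 10**9      # a billion eBooks
petabyte = 10**15     # bytes in one petabyte (SI)

total = ebook_size * n_ebooks
print(total == petabyte)  # a billion megabyte-sized eBooks fill exactly one petabyte
```

(Using binary units, a pebibyte is 2^50 bytes, about 12% larger, so the library fits either way.)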
As this makes clear, Hart was the original prophet of digital abundance, a theme that I and others are now starting to explore. But his interest in that abundance was not merely theoretical - he was absolutely clear about its technological, economic and social implications:
I am hoping that with a library this size that the average middle class person can afford, that the result will be an even greater overthrow of the previous literacy, education and other power structures than happened as direct results of The Gutenberg Press around 500 years ago.
Here are just a few of the highlights that may repeat:
1. Book prices plummet.
2. Literacy rates soar.
3. Education rates soar.
4. Old power structures crumble, as did The Church.
5. Scientific Revolution.
6. Industrial Revolution.
7. Humanitarian Revolution.
Part of those revolutions was what Hart called the "Post-Industrial Revolution", where the digital abundance he had created with Project Gutenberg would be translated into the analogue world thanks to more "replicators" - 3D printers such as the open source RepRap:
If we ... presume the world at large sees its first replicator around 2010, which is probably too early, given how long it took most other inventions to become visible to the world at large [usually 30 years according to thesis by Madelle Becker], we can presume that there will be replicators capable of using all the common materials some 34.5 years into the future from whatever time that may actually be.
Hence the date of 2050 for the possibility of some replicators to actually follow somebody home: if that hasn't already been made illegal by the fears of the more conservative.
Somewhere along the line there will also be demarcations of an assortment of boundaries between replicators who can only make certain products and those who can make new replicators, and a replicator that could actually walk around and follow someone, perhaps all the way home to ask if it could help.
The fact that it was ~30 years from the introduction of eBooks to those early Internet pioneers to the time Google made their big splashy billion dollar media blitz to announce their eBook project without any mention of the fact that eBooks existed in any previous incarnation, simply is additional evidence for an educated thesis mentioned above, that had previously predicted about a 30 year gap between the first public introductions and awareness by the public in general.
So, when you first start to see replicators out there set your alarm clocks for ~30 years, to remind you when you should see, if they haven't been made illegal already, replicators out for a walk in at least some neighborhoods.
Notice the comment "if that hasn't already been made illegal". This was another major theme in Hart's thinking and writings - that copyright laws have always been passed to stop successive waves of new technologies creating abundance:
We keep hearing about how we are in "The Information Age," but rarely is any reference made to any of four previously created Information Ages created by technology change that was as powerful in the day as the Internet is today.
The First Information Age, 1450-1710, The Gutenberg Press, reduced the price of the average books four hundred times. Stifled by the first copyright laws that reduced the books in print in Great Britain from 6,000 to 600, overnight.
The Second Information Age, 1830-1831, Shortest By Far The High Speed Steam Powered Printing Press Patented in 1830, Stifled By Copyright Extension in 1831.
The Third Information Age, ~1900, Electric Printing Press Exemplified by The Sears Catalog, the first book owned by millions of Americans. Reprint houses using such presses were stifled by the U.S. Copyright Act of 1909.
The Fourth Information Age, ~1970, The Xerox Machine made it possible for anyone to reprint anything. Responded to by the U.S. Copyright Act of 1976.
The Fifth Information Age, Today, The Internet and Web. Hundreds of thousands, perhaps even a million, books from A to Z are available either free of charge or at pricing "too cheap to meter," for download or via CD and DVD. Responded to by the "Mickey Mouse Copyright Act of 1998," The Digital Millennium Copyright Act, The Patriot Act and any number of other attempted restrictions/restructures.
Hart didn't just write about the baleful effect of copyright extensions, he also fought against them. The famous "Eldred v Ashcroft" case in the US that sought to have such unlimited copyright extensions declared unconstitutional originally involved Hart. As he later wrote:
Eldred v Ashcroft was previously labeled as in "Hart v Reno" before I saw that Larry Lessig, Esquire, had no intention of doing what I thought necessary to win. At that point I fired him and he picked up Eric Eldred as his current scapegoat du jour.
As this indicates, Hart was as uncompromising in his defense of the public domain as Stallman is of free software.
Most of his best writing is to be found in the emails he sent out to his mailing list from time to time, although there is a Web page with links to a couple of dozen essays, all well worth reading to get a feeling for the man and his mind. There are also more of his writings on the Project Gutenberg site, as well as a useful history of the project.
However, it is hugely regrettable that Hart never published his many and wide-ranging insights as a coherent set of essays, since this has led to a general under-appreciation of the depth of his thinking and the crucial importance of his achievements. Arguably he did more for literature (and literacy) than any Nobel laureate in Literature ever will.
Fortunately, Project Gutenberg, which continues to grow and broaden its collection of freely-available texts in many languages, stands as a fitting and imperishable monument to a remarkable human being who not only gave the world great literature in abundance, but opened our eyes to the transformative power of abundance itself.
Follow me @glynmoody on Twitter or identi.ca, and on Google+
January 21, 2011 | The Register
If you program it, they will come
By Gavin Clarke in Mountain View
It's weird to see something from your childhood displayed as an ancient cultural artifact. Here at the newly refurbished Computer History Museum in Mountain View, California, I'm standing over a glass case that houses the Commodore 64, the same machine I begged my parents to buy me for Christmas in 1983.
Compared to today's slick and smart personal computers, the Commodore 64 is the village idiot. With its 8-bit, 1MHz MOS 6510 processor and 64KB of memory, the only thing chunky about this machine was its famous built-in keyboard. But the Commodore 64 was in the vanguard of a revolution, one that took computers into people's homes by making them mass-produced, affordable, and usable by people without maths degrees or special training.
The Commodore even bested the iconic Apple II (or Apple ][, as fanbois of the day wrote its name), designed by that company's larger-than-life cofounder Steve Wozniak. When Apple's pioneering desktop debuted in 1977, the entry-level version with a mere 4KB of RAM cost $1,298, significantly more than the Commodore 64's 1982 introductory price of $595, which later dropped to $199. The Commodore 64 was so popular that sales estimates range between 17 and 22 million units during its 11-year run.
The Commodore 64 is now among 1,100 objects that comprise a new exhibition called Revolution: The First 2,000 Years of Computing at the Computer History Museum. Assembled by an army of 300 people over eight years, Revolution fills 25,000 square feet and is the crown jewel of a $19m renovation of a museum that, since 1999, has been an easy-to-miss landmark of Silicon Valley, the spiritual home of the tech industry.
$19m is a hefty dose of private philanthropy by any standard, and one that's all the more impressive given that it came from an industrial sector famed for entrepreneurs and engineers obsessed by the future, not the past. Among the donors to Revolution is Bill Gates, who also provided the establishing gift: the BASIC interpreter tape he wrote for the MITS Altair 8800 while at Harvard in 1975, and that led to Microsoft and Windows.
Museum president and chief executive John Hollar told The Reg on a tour ahead of Revolution's opening that the exhibit centers on thematic moments in computing. "If you knit all those together, you get an interesting picture of where we are today," he said.
Revolution features pieces of the 700-square-foot, 30-ton ENIAC (Electrical Numerical Integrator And Computer), built by the US government between 1943 and 1945 to calculate missile trajectories. Not only was ENIAC the first general-purpose computer to run at "electronic speed" because it lacked mechanical parts, it was also programmed entirely by a staff of six female mathematicians who, lacking any manuals, worked purely by deciphering logic and block diagrams.
There's also an IBM/360 from 1964, the first general-purpose computer for businesses, which killed off custom systems such as ENIAC that were built by and for governments. IBM staked its future on the IBM/360; in today's dollars the project would cost $80bn.
Revolution is home to one of Google's first rack towers, dating from 1999. Spewing Ethernet cabling down its front, the tower helped establish Google as a search colossus whose thumb is now on the throat of the web and society, choking out $23bn a year from online ads.
Is small and portable more your thing? There's the PalmPilot prototype and the original card and wood mock-up donated by Palm co-founder and Apple graduate Donna Dubinsky. With its stylus input, the PalmPilot became the first widely popular handheld device. The original models, the Pilot 1000 and Pilot 5000, predated Apple's finger-poking iPad by 14 years.
Revolution houses analogue devices that are more like workbenches, and is home to the first Atari Pong arcade game in its plywood case (which ignited the video-game revolution), a gold-colored Bandai Pippin from Apple (which disappeared without a trace), the Apple II, and the Altair that inspired Gates to eventually build the world's largest software company.
While you can't view the actual BASIC interpreter tape that Gates wrote, the code has been immortalized in a huge glass plaque in the newly minted, airy reception area. Nerds take note: the reception area's tiled floor holds a punch-card design. Work out what it says and you win a prize.
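Solving the floor puzzle is a matter of reading the classic Hollerith card code: each of a card's 80 columns encodes one character as a combination of punched rows, a "zone" punch (row 12, 11, or 0) plus a digit punch for letters, or a single digit punch for numbers. What the museum's floor actually spells is left to visitors, but a minimal decoder for the standard digit-and-letter subset can be sketched as follows (the function names and the example word are illustrative, not anything from the exhibit):

```python
# A column on an 80-column IBM card is a set of punched rows drawn from
# {12, 11, 0, 1..9}.  The classic Hollerith letter code combines one
# zone punch with one digit punch.
def hollerith_table():
    table = {}
    for d in range(10):                        # digits: a single punch in row 0-9
        table[frozenset([d])] = str(d)
    for i, letter in enumerate("ABCDEFGHI"):   # 12-zone + digit 1..9
        table[frozenset([12, i + 1])] = letter
    for i, letter in enumerate("JKLMNOPQR"):   # 11-zone + digit 1..9
        table[frozenset([11, i + 1])] = letter
    for i, letter in enumerate("STUVWXYZ"):    # 0-zone + digit 2..9
        table[frozenset([0, i + 2])] = letter
    table[frozenset()] = " "                   # no punch = blank column
    return table

def decode(card):
    """card: list of per-column row sets, left to right."""
    table = hollerith_table()
    return "".join(table.get(frozenset(col), "?") for col in card)

# "IBM" punched as three columns: I = 12+9, B = 12+2, M = 11+4
print(decode([{12, 9}, {12, 2}, {11, 4}]))   # -> IBM
```

Punctuation characters used more punch combinations that varied between keypunch models, which is why the sketch sticks to digits and letters.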
"Knitting it all together," as Hollar puts it, means you shouldn't be surprised to see that 1999 Google rack server in an exhibit that goes back 2,000 years to the abacus.
"Will Google be remembered 100 years from now? That's hard to say," Hollar told us. "But what's more likely is what Google represents is with us forever - which is finding what you want, when you want it, where you are, and having an expectation that that information is going to be instantaneously available to you. That's unleashed a new kind of human freedom and those powerful forces that affect people at a personal and human level they don't get put back in the box."
Revolution doesn't just show objects that are important to computing, such as the Google rack, it also tells the stories of their creation and their creators. Not just familiar names such as Alan Turing, but also the ENIAC women whose job title was "computer" and who were classified as "sub professional" by the army and disregarded by their snotty male managers.
Also featured are people such as integrated-circuit inventor Jack Kilby, whose bosses at Texas Instruments told him in 1958 not to bother with his project. Kilby pressed on regardless during his summer holidays, and presented the top brass with the finished article when they returned from their undeserved time away. Such jaw-dropping tales of achievement against all odds are explained with the assistance of 5,000 images, 17 commissioned films, and 45 interactive screens you can poke at and scroll through.
Thanks to its location in the heart of Silicon Valley, down the road from Apple, Intel, HP, and Xerox PARC, companies whose ideas or products now dominate our daily lives, it would be easy for the museum to present only a local feel to the story of computing, and to give it a US-centric bias. With displays from Britain, France, Japan, Korea, and Russia, however, Revolution looks beyond the boundaries of Silicon Valley and the US.
A good example of the global focus is the story of LEO, the Lyons Electronic Office that's one of the least-known entries from Britain in the history of personal computing.
Back when Britain ran an Empire, J Lyons & Company was a huge food and catering company famed for running a network of nationwide teashops that refreshed stiff upper lips from Piccadilly to the provinces with cups of tea for the price of just over two old pennies.
Then, as now, catering had extremely narrow margins, and J Lyons took an early interest in computers to help automate its back office and improve its bottom line. Specifically, J Lyons was watching the work of Sir Maurice Wilkes at Cambridge University, where he was building EDSAC, the Electronic Delay Storage Automatic Calculator, which performed its first calculation in 1949. EDSAC was a pre-semiconductor dinosaur, using 3,000 vacuum valves across 12 racks, with mercury delay lines for memory, and it lumbered along at a then mind-blowing 650 operations per second.
Lyons copied EDSAC and came up with LEO in 1951, regarded as the first computer for business. LEO ran the first routine office jobs - payroll and inventory management - while Lyons also used LEO for product development, calculating different tea blends.
Lyons quickly realized the potential of office automation, built the LEO II and formed LEO Computers, which went on to build and install LEO IIs for the British arm of US motor giant Ford, the British Oxygen Company, HM Customs & Excise, and the Inland Revenue. LEO computers were exported to Australia, South Africa, and, at the height of the Cold War, Czechoslovakia. By 1968 LEO had been incorporated into ICL, one of Britain's few computer and services companies, now Fujitsu Services.
The lion goes to pieces
Two surviving fragments of the once mighty LEO were acquired for Revolution at auction in London: a cracked vacuum tube and a battered, grey-blue metal control box with buttons and flip switches that's more car part than computer component.
Alex Bochannek, the museum curator who showed The Reg around Revolution, told us he feels "particularly strongly" about LEO. "Our collecting scope is global in nature and the story of LEO is such a fascinating one, yet almost completely unknown in this country. This is why we decided to add these objects to the collection specifically for Revolution."
The $19m renovation and Revolution make the museum an attractive destination for visitors of all kinds and ages - engineers, non-techies, tourists, and those interested in the history of women in the workplace, to name a few. The museum is also trying to raise its game in academic circles, and Hollar wants to turn it into the premier center on the history of computing.
Just 2 per cent of the museum's entire collection is on display in Revolution, with the plan to make the rest available to a worldwide audience through a new web site in March.
"[The museum] will be seen as a destination of important papers and other important artifacts located around the world for that definitive collection point of oral histories of people whose stories need to be recorded," Hollar said.
After being in the Valley for 12 years, why is now the right time to plough $19m into Revolution? According to Hollar, the time is ripe because of the ubiquity of computing online and in our pockets: we need to understand the journey from moving mechanical parts to digitization, from room-sized single-purpose dinosaurs to the multifunction iPhone, from switches and flashing lights to the keyboard and screen.
The entrepreneurs and engineers of Silicon Valley could also learn a thing or two by examining their past. "Some people in Silicon Valley believe they don't look backward, they only look forwards, but some people here who are very successful do understand they are part of something larger," Hollar said.
I hear echoes of British wartime leader Winston Churchill, who was fond of George Santayana's sentiment that those who fail to learn from history are doomed to repeat it. In this case, however, they might also miss new opportunities.
"The Google search engine is based on a very simple analogy that academic articles are known to become authoritative the more they are cited," Hollar said. "By making that analogy, Larry Page working with Brin took search from looking for words on a page to looking for something that's really important to you."
As more and more of the pioneers of modern computing age and pass away - Sir Wilkes died last year, and just one of the ENIAC women remains with us - there must surely be among modern computing's pioneers a growing desire for something tangible that preserves and records their achievements. It would be ironic if those obsessed with digitizing and recording data fail to record their stories, and if those stories slipped into an oral tradition or - worse - a Wikipedia-style group consensus of history where facts are relative and received secondhand. How many more LEOs are alive in the auction houses of the world, waiting to be clawed back?
Was such a concern for legacy behind Gates' donation to the museum?
"I think he's proud of what that little piece of tape represents," Hollar said. "That's the essence of the very first days of Microsoft, and if you talk to Bill and [cofounder] Paul Allen about it they are very aware now they are at a point in their lives where they are very aware that what they did was important and it needs to be preserved.
"I'm very glad they see the museum as the place where they want that to happen."
I just wonder if Gates feels as weird about all this as I did.
April 2, 2010 | BBC News
The "father of the personal computer" who kick-started the careers of Microsoft founders Bill Gates and Paul Allen has died at the age of 68.
Dr Henry Edward Roberts was the inventor of the Altair 8800, a machine that sparked the home computer era.
Gates and Allen contacted Dr Roberts after seeing the machine on the front cover of a magazine and offered to write software for it.
The program was known as Altair-Basic, the foundation of Microsoft's business.
"Ed was willing to take a chance on us - two young guys interested in computers long before they were commonplace - and we have always been grateful to him," the Microsoft founders said in a statement.
"The day our first untested software worked on his Altair was the start of a lot of great things."
Apple co-founder Steve Wozniak told technology website CNET that Dr Roberts had taken "a critically important step that led to everything we have today".
Dr Roberts was the founder of Micro Instrumentation and Telemetry Systems (MITS), originally set up to sell electronics kits to model rocket hobbyists.
The company went on to sell electronic calculator kits, but was soon overshadowed by bigger firms.
In the mid-1970s, with the firm struggling with debt, Dr Roberts began to develop a computer kit for hobbyists.
The result was the Altair 8800, a machine operated by switches and with no display.
It was built around the then cutting-edge Intel 8080 microprocessor.
The $395 kit (around £1,000 today) was featured on the cover of Popular Electronics in 1975, prompting a flurry of orders. It was also sold assembled for an additional $100 charge.
Amongst those interested in the machine were Paul Allen and Bill Gates.
The pair contacted Dr Roberts, offering to write software code that would help people program the machine.
The pair eventually moved to Albuquerque - the home of MITS - where they founded Micro-Soft, as it was then known, to develop their software: a variant of the Beginners' All-purpose Symbolic Instruction Code (BASIC).
"We will always have many fond memories of working with Ed in Albuquerque, in the MITS office right on Route 66 - where so many exciting things happened that none of us could have imagined back then," the pair said.
Dr Roberts sold his company in 1977.
He died in hospital on 1 April after a long bout of pneumonia.
Copyright © 1996-2018 by Dr. Nikolai Bezroukov. www.softpanorama.org was initially created as a service to the (now defunct) UN Sustainable Development Networking Programme (SDNP) in the author's free time and without any remuneration. This document is an industrial compilation designed and created exclusively for educational use and is distributed under the Softpanorama Content License. Original materials copyright belongs to respective owners. Quotes are made for educational purposes only in compliance with the fair use doctrine.
Last modified: September, 12, 2017