Softpanorama

May the source be with you, but remember the KISS principle ;-)
(slightly skeptical) Educational society promoting "Back to basics" movement against IT overcomplexity and  bastardization of classic Unix

Softpanorama Assembler Bulletin, 2004

[Oct 2, 2004] The flat assembler site has a nice forum that I recommend visiting.

Welcome to the site of flat assembler! This is a place dedicated to assembly language programming for x86 systems and contains many resources for both beginners and advanced assembly programmers. This site is still under construction (and perhaps it will always be), but hopefully you'll find here some useful materials, no matter whether you are trying to learn the assembly language, or just are looking for the solution for some particular problem.

[Sept 12, 2004] The Assembler Connection System 360/370 assembler site

The Assembler Connection provides a suite of sample programs that are written to assemble and link using Assembler/H or High Level Assembler (HLASM) when possible. If a technique is used that is unique to a specific dialect, it will be noted. JCL members are provided to run the jobs as MVS batch jobs on an IBM mainframe or within a project using Micro Focus Mainframe Express (MFE) running on a PC with Windows. The 370 Assembler Option for MFE is required to run on the PC.

[Feb 6, 2004] Slashdot Learning Computer Science via Assembly Language -- it's actually amazing how few people know the history or understand the differences between very-high-level, high-level, and low-level languages :-(. Assembler is the key to efficiency, but the main point is not efficiency itself: you can benefit from using different languages for different parts of your system (if it is a big system that needs to be fast, like a computer game), and only assembler can provide the level of understanding needed to debug such a complex system. The latter is true even if the system was written with zero lines of assembler. Of course, no discussion about programming languages is complete without at least one OO enthusiast who thinks that Java is the perfect language for studying algorithms, but it's a troubling sign when there are too many of them ;-) Below I collected some comments that are interesting from my point of view...


  Available under GNU FDL (Score:5, Informative)
by JoshuaDFranklin (147726) * <joshuadfranklin,NOSPAM&yahoo,com> on Thursday February 05, @06:53PM (#8195987)
(mailto:[email protected])
I don't know why he didn't mention that this is a free documentation project:

http://savannah.nongnu.org/projects/pgubook/ [nongnu.org]

It's also being used at Princeton [princeton.edu]

This book (Score:5, Informative)
by voodoo1man (594237) on Thursday February 05, @08:43PM (#8197123)
has been available [nongnu.org] for some time under the GNU Free Documentation License. I tried to use it a while back when I decided to learn assembler, but I found Paul Carter's PC Assembly Language [drpaulcarter.com] to be a much better introduction.
Re:Linux x86 assembly? (Score:5, Interesting)
by vsprintf (579676) on Thursday February 05, @08:34PM (#8197069)
I don't know why, but just saying the words 'assembly language' sends a chill down my spine. I guess I am too weak-minded to learn it.

Maybe individual brains just work in different ways. In school, I knew some people who were good with high-level languages but just couldn't hack assembler. They could not get down to that absolute minimal step-by-step instruction level. I'm not sure what that says about those of us who use assembler. :) BTW, I certainly don't advocate assembler as a first computer language - second, perhaps.

Re:Linux x86 assembly? (Score:2)
by chthon (580889) on Friday February 06, @07:23AM (#8199964)
Well, my first language was 8080 assembler, and I did not have a computer back then. For practice, I created really small programs, which I converted into hex by hand.

I bought my first computer when I was 18, a ZX Spectrum, which had a Z80. I think that was for me the main reason to buy it.

I have always had the feeling (still) that I knew more about computers and good programming, because I have insight into the processor. I studied electronics, not CS, and for the most part I write better code than people who learned only high-level languages.
Re:Linux x86 assembly? (Score:1)
by douglas jeffries (585519) on Friday February 06, @07:47AM (#8200072)
(http://douglasjeffries.com/ | Last Journal: Tuesday August 13, @09:49PM)
I think that the only difference is that people who understand the lower-level system are more likely to consider the higher-level things. Ultimately, I think that some understanding of each layer of abstraction is necessary to succeed at designing within any layer, and that the best engineers study the entire system.

People who only want to know one layer are probably likely to go for a really high-level one, because Java seems easier than assembly. In truth, though, I think that there are similar challenges and that designing good software is just as difficult in Java as in assembly. IMHO the biggest difference is how obvious your mistakes are.
Re:Linux x86 assembly? (Score:5, Informative)
by Endive4Ever (742304) on Thursday February 05, @07:53PM (#8196662)
Well, some of us code assembly on bare hardware. We have to roll our own 'api' and include it in there with the rest of the code.

I've worked before with programmers who had little experience in programming 'bare hardware' - they do really foolish things like failing to init timers, set up stack pointers, and the like.

Writing bare ASM code for a processor (where it boots up out of your own EPROM or on an emulator) is good experience in minimalism. It can give you a good feeling when the project is all done and you can say you did it all yourself.

For those interested in getting into this kind of thing, start with a PIC embedded controller and a cheap programmer. You can get PIC assembly language tools for free, and build a programmer, or buy a kit for a programmer, that plugs into your serial or parallel port. Your first PIC machine can be the CPU, a clock crystal, a few resistors and capacitors, and the LED you want to blink, or whatever else intrigues you. If you're not into complex soldering, and/or layout and complex schematics, you can buy pre-etched boards you just plug the PIC into.

Another easy-start processor would be the 68HC11. It has a bootstrap built into ROM. Basically, you can jumper the chip so it wakes up listening on the serial port for code you send down the wire at it, and burns it into the EEPROM memory in the 'HC11 chip itself. Move the jumper and reboot the chip, and it's running your code.

I think this is far more interesting than just writing apps that run on an Operating System you didn't roll yourself.
Re:Linux x86 assembly? (Score:4, Insightful)
by Breakfast Pants (323698) on Thursday February 05, @10:22PM (#8197806)
(Last Journal: Wednesday October 16, @01:31AM)
To me, not teaching assembly in a CS major would be insane. It would be like teaching physics without any of the history of how it was discovered and without showing how to derive the various equations from the more fundamental equations. My first 3 semesters were in Java. My fourth semester I was in a C, an Assembly, and an intro ECE class, and I am very glad that I was. The combination of these 3 classes at the same time was great. Sometimes it is a lot more helpful to learn why something works or how something works than just hearing (countless times in my Java classes) "Oh, don't ask questions about that, it's not something you need to know. Java handles this for you automatically." If you want it to only be taught like that, that's great; just don't expect any of your students to ever create the next Java.
 
    Also, you do know that compilers are written by programmers don't you?
Re:Linux x86 assembly? (Score:5, Insightful)
by texaport (600120) <texaport@@@msn...com> on Thursday February 05, @10:53PM (#8197987)

Assembly has little to nothing to do with either programming or computer science anymore. Computer science (IMHO) deals with the study of software engineering and algorithms

If it truly is a science, then someone who finishes a Bachelors and Masters program in Computer Science had better be capable of contributing to the advancement of this field.

This could be through the development of new languages, in which case I hope they know a thing or two about assembly in the first place.

Otherwise, a couple of Computer Science degrees would simply mean someone is a techno-wonk, a professional student, or just a technician rather than a professional engineer/scientist.

--
Disclaimer: 90% of the programmers out there do not need a Computer Science degree, and 90% of the jobs
out there for developers don't need CS graduates

Re:Linux x86 assembly? (Score:4, Insightful)
by John Courtland (585609) on Friday February 06, @12:50AM (#8198618)
It also gave me a better appreciation for optimization. The cycle counting, the instruction scheduling, cheap tricks that save 30 cycles here and there (like SHL a few times instead of multiplying). I miss having that sort of control. Coding for DOS was always a learning experience too, because you basically had to write an OS every time you wanted to do anything non-trivial.

I learned Asm before I learned C, and I must say that was a good way of going about it. I'm glad I don't view C as some sort of "hocus pocus", and I never did. Everything just made sense. Nowadays you've got Joe Blow with dollar signs in his eyes and his shiny new degree in using Java, who doesn't understand the little black box he's entering commands into.

It sort of pisses me off, because I don't want to put gay little buzzwords in my resume like C#, or Java, or .NET. I should be able to put down "Assembler: x86, z80, s/390" and the idiot HR guy should know everything else is a simple matter of syntax.
Re:Syntax, OS interfaces... (Score:5, Informative)
by Anonymous Coward on Thursday February 05, @07:14PM (#8196269)
There are two standards, the AT&T ... and the other one

Incorrect. There are at least four different assemblers and standards:

AS - the GNU Assembler (GAS). AT&T standard, as commonly used on Linux. The syntax hasn't changed since the '60s - which is both very good and very bad. I personally think it should be retired.

MASM - Microsoft Assembler. Intel standard assembly. The syntax is nice, but there are some ambiguous operators (does [] mean the address, or the value at that address? - the meaning changes depending on the context). This is typically what the commercial Windows world uses. MASM itself is mostly obsolete - the Visual C compiler can now do everything that it could and supports all modern CPU instructions (even on Visual C++ 6 if you install the latest CPU pack).

NASM - Netwide Assembler. An assembler that set out to put right all the things that were wrong with MASM. The syntax is excellent, ambiguous operators are cleared up, documentation is also excellent, it interoperates beautifully with Visual C on Windows and GNU C on Linux. Ideally NASM would replace AS as the standard now that it's open source.

TASM - Borland Turbo Assembler. Based around the Intel standards, but does things slightly differently. Has extensions which allow for easy object-oriented assembly programming - which can make for some very nice code. Had a MASM compatibility mode, but nobody in their right mind used that if they could help it. I had version 5, but I don't believe they've kept it up to date, so it's obsolete now.

There are a couple of others as well, most notably AS86 (which was the leading independent solution for writing assembler back in the DOS days).
Re:Linux x86 assembly? (Score:5, Interesting)
by pla (258480) on Thursday February 05, @06:58PM (#8196049)
(Last Journal: Sunday December 08, @05:40PM)
Is "Linux x86 assembly" any different to any other kind of "x86 assembly"?

Yes. Although it requires understanding the CPU's native capabilities to the same degree, Linux uses AT&T syntax, whereas most of the Wintel world uses (unsurprisingly) Intel/Microsoft syntax.

Personally, although I far prefer coding C under Linux, I prefer Intel-syntax assembly. Even with many years of coding experience, I find AT&T syntax unnecessarily convoluted and somewhat difficult to quickly read through.

The larger idea holds, however, regardless of what assembler you use. I wholeheartedly agree with the FP - People who know assembly produce better code by almost any measurement except "object-oriented-ness", which assembly makes difficult to an extreme. On that same note, I consider that as one of the better arguments against OO code - It simply does not map well to real-world CPUs, thus introducing inefficiencies in the translation to something the CPU does handle natively.
Re:Linux x86 assembly? (Score:5, Interesting)
by pla (258480) on Thursday February 05, @07:31PM (#8196434)
(Last Journal: Sunday December 08, @05:40PM)
maxim: cycles are cheap, people are expensive.

True. This topic, however, goes beyond mere maximizing of program performance. Put simply, if you know assembler, you can take the CPU's strengths and weaknesses into consideration while still writing readable, maintainable, "good" code. If you do not know assembly, you might produce simply beautiful code, but then have no clue why it runs like a three-legged dog.

it is significantly better value to design and build a well architected OO solution

Key phrase there, "well-architected". In practice, the entire idea of "object reuse" counts as a complete myth (I would say "lie", but since it seems like more of a self-deception, I won't go that far). I have yet to see a project where more than a handful of objects from older code would provide any benefit at all, and even those that did required subclassing them to add and/or modify over half of their existing functionality. On the other hand, I have literally hundreds of vanilla-C functions I've written over the years from which I draw with almost every program I write, and that require no modification to work correctly (in honesty, the second time I use them, I usually need to modify them to generalize better, but after that, c'est fini).

Who cares if it's not very efficient - it'll run twice as fast in 18 months

Y'know, I once heard an amusing joke about that... "How can you tell a CS guy from a programmer?" "The CS guy writes code that either won't run on any machine you can fit on a single planet, or will run too slowly to serve its purpose until technology catches up with it in a few decades." Something like that - I killed the joke, but you get the idea.

Yeah, computers constantly improve. But the clients want their shiny new software to run this year (if not last year, or at least on 5-year old equipment), not two years hence.
Re:Linux x86 assembly? (Score:5, Insightful)
by gweihir (88907) on Thursday February 05, @09:52PM (#8197620)
(http://www.tik.ee.ethz.ch/~wagner/)
Not only that, but real-world experience shows that code written in ASM is NOT maintainable; the in-depth knowledge of a specific architecture is fleeting, while knowledge of most high-level languages lasts a LONG time.

That is not the point. The point is that knowing one assembly language gives far more insight into what higher-level languages actually do. It is, e.g., very difficult to explain the actual workings of a buffer-overflow exploit to somebody without any assembly knowledge. Or what a pointer is. Or what paging does. Or what an interrupt is. Or what impact the stack has and how it is being used for function arguments. Or how much memory a variable needs...

The only processor I know that actually made assembly programming almost a C-like experience was the Motorola 68xxx family. On the Atari ST, e.g., there were complex applications written entirely in assembly. Today it would indeed be foolish to do a larger project in assembly language, but that is not the point of the book at all.

Bottom line: You need to understand the basic tools well. You don't need to restrict yourself to their use or even use them often. But there is no substitute for this understanding.
Actually, they DON'T. (Score:5, Interesting)
by Ungrounded Lightning (62228) on Thursday February 05, @08:09PM (#8196861)
(Last Journal: Sunday January 18, @06:45AM)
People who know assembly produce better code by almost any measurement except "object-oriented-ness", which assembly makes difficult to an extreme.

Actually, they don't.

A study was done, some decades ago, on the issue of whether compilers were approaching the abilities of a good assembly programmer. The results were surprising:

While a good assembly programmer could usually beat the compiler if he really hunkered down and applied himself to the particular piece of code, on the average his code would be worse - because he didn't maintain that focus on every line of every program.

The programmer might know all the tricks. But the compiler knew MOST of the tricks, and applied them EVERYWHERE, ALL THE TIME.

Potentially the programmer could still beat the compiler in reasonable time by focusing on the code that gets most of the execution. But the second part of Knuth's Law applies: "95% of the processor time is spent in 5% of the code - and it's NOT the 5% you THOUGHT it was." You have to do extra tuning passes AFTER the code is working to find and improve the REAL critical 5%. This typically was unnecessary in applications (though it would sometimes get done in OSes and some servers).

This discovery led directly to two things:

1) Because a programmer can get so much more done and working right with a given time and effort using a compiler than using an assembler, and the compiler was emitting better assembly on the average, assembler was abandoned for anything where it wasn't really necessary. That typically means:

  - A little bit in the kernel where it can't be avoided (typically bootup, the very start of the interrupt handling, and maybe context switching). (The Unix Version 6 kernel was 10k lines, of which 1.5k was assembler - and the assembly fraction got squeezed down from then on.)

  - A little bit in the libraries (typically the very start of a program and the system call subroutines)

  - Maybe a few tiny bits embedded in compiler code, to optimize the core of something slow.

2) The replacement of microcoded CISC processors (e.g. PDP11, VAX, 68K) with RISC processors (e.g. SPARC, MIPS). (x86 was CISC but hung in there due to inertia and cheapness.)

Who cares if it takes three instructions instead of one to do some complex function, or if execution near jumps isn't straightforward? The compiler will crank out the three instructions and keep track of the funny execution sequence. Meanwhile you can shrink the processor and run the instructions at the microcode engine's speed - which can be increased further by reducing the number of gates and length of wiring, and end up with a smaller chip (which means higher yields, which means making use of the next, faster, FAB technology sooner.)

CISC pushed RISC out of general-purpose processors again once the die sizes got big: You can use those extra gates for pipelining, branch prediction, and other stuff that lets you gain back more by parallelism than you lost by expanding the execution units. But RISC is still alive and well in embedded cores (where you need SOME crunch but want to use most of the silicon for other stuff) and in systems that don't need the absolute cutting edge of speed or DO need a very low power-per-computation figure.

The compiler advantage over an assembly programmer is extreme both with RISC and with a poorly-designed CISC instruction set (like the early x86es). Well-designed CISC instruction sets (like PDP11, VAX, and 68k) are tuned to simplify the compilers' work - which makes them understandable enough that the tricks are fewer and good code is easier for a human to write. This puts an assembly programmer back in the running. But on the average the compiler still wins.

(But understanding how assembly instruction sets work, and how compilers work, are both useful for writing better code at the compiler level. Less so now that optimizers are really good - but the understanding is still helpful.)
PDP-11 C / Origin of gcc (Score:3, Interesting)
by rs79 (71822) <[email protected]> on Friday February 06, @03:18AM (#8199162)
(http://www.open-rsc.org/)
I understand that much of C was inspired by that instruction set.

I'm not sure inspired would be the right way to say that. C was invented as a shorthand for assembler, in particular PDP-11 assembler. I'm probably just being pedantic, but I think it's an important distinction.

We owe a lot to those machines, by '74 UNIX and C were available (barely) from Bell Labs but by the late summer of 76 Dave Conroy at Teklogix in Mississauga, Ontario, had written and made work the only C compiler not written by Bell Labs, which ran under RSX-11M. This became DECUS C, and then gcc.

I worked there between high school and university; Dave taught me C to test his compiler and must have got all of about $1200 for writing it, as it only took him a few weeks. It was of course written entirely in assembler.
Re:PDP-11 C / Origin of gcc (Score:3, Interesting)
by nutznboltz (473437) on Friday February 06, @09:37AM (#8200815)
C was invented as a shorthand for assembler, in particular PDP-11 assembler.

Yes, C is basically an abstracted PDP-11.

Dave Conroy at Teklogix in Mississauga, Ontario, had written and made work the only C compiler not written by Bell Labs

Is this the same Dave Conroy that does FPGA re-implementations of old DEC computers?

http://www.spies.com/~dgc/ [spies.com]
Yes, THAT dgc (Score:2)
by rs79 (71822) <[email protected]> on Friday February 06, @11:38AM (#8202296)
(http://www.open-rsc.org/)
Yup, that's Dave. After Teklogix he went to work for DEC and worked on DEC-TALK and then the Alpha.

Dave is the ultimate uber-hacker and I never met anybody like him. He could talk more quickly than a country auctioneer and code even more quickly and was never wrong, ever. He works for MS now in the machine architecture group last time I talked to him about a year ago.
Re:Actually, they DON'T. (Score:1)
by steveg (55825) on Thursday February 05, @08:56PM (#8197217)
(But understanding how assembly instruction sets work, and how compilers work, are both useful for writing better code at the compiler level. Less so now that optimizers are really good - but the understanding is still helpful.)

My understanding of the parent post was that this is exactly what he was saying. I don't think he was claiming that programs written in assembly were better, but that programmers who knew assembly were better programmers.

I think you were agreeing with him.
Re:Actually, they DON'T. (Score:2)
by be-fan (61476) on Thursday February 05, @09:41PM (#8197542)
There was an interesting study done comparing the performance and productivity of C++ vs Lisp vs Java programmers. Results are here. [flownet.com]

One very interesting thing they found was that while the best C++ programs were faster, the average Lisp program was faster*. Programmer experience could not account for this.

In retrospect, it's easy to see why. When you write clean, straightforward code like you would in a production environment, it's much easier for the compiler to optimize high-level code than low-level code. Compilers for languages like Lisp/Scheme/Haskell/etc. do all sorts of optimizations that existing C/C++/Java compilers either don't do (forgotten technology) or can't do (pointers cause lots of problems).

My point is that programming at a higher level, in general, allows the compiler to do more optimization than programming at a lower level. Given infinite time, an asm programmer will always be able to crank out faster code than a C++ programmer, who will always be able to crank out faster code than a Lisp programmer. However, in the real world, it may very well be the case that giving the optimizer more meat to work on will result in a program that is ultimately faster overall.
Re:Actually, they DON'T. (Score:1)
by shmat (124756) <david.dguy@net> on Friday February 06, @10:47AM (#8201667)
(http://www.dguy.net/)
I agree completely. I started my career coding in assembly language (yes, I'm old). When I started using C I thought I had died and gone to heaven because I was 10 TIMES more productive with C.

Like most assembly language programmers, I went through the compiler generated assembly for my first couple of C programs because I wanted to see how bad a job the compiler did. I found that the assembly was hard to understand but very efficient. There were very few places where I could have done better.

As to learning computer science, I think the only value in using assembly language as a teaching tool is that assembly language requires extremely careful attention to detail and patience. So maybe it serves as a screening process because good developers need lots of both. However, algorithms, data structures, OO, patterns, etc. are far more important to learn than assembler.
PDP11, VAX, 68K mislabeled (Score:3, Informative)
by snStarter (212765) on Friday February 06, @01:04AM (#8198707)
No one would really call the PDP-11 a CISC machine. You might call it a RISC VAX however (pause for audience laughter).

Also, many PDP-11's were random logic and not micro-coded. The later 11's were microcoded, of course, the 11/60 being the extreme because it had a writeable control store that let you define your own micro-coded instructions.

It's important to remember that the entire RT-11 operating system was written entirely in MACRO-11 by some amazing software engineers who knew the PDP-11 instruction set inside and out. The result was an operating system that ran very nicely in a 4K word footprint.

The VAX had a terrific compiler, BLISS-32, which created amazingly efficient code; code no human being would ever create, but fantastic nonetheless.
Forgetting the Most Important Point (Score:4, Funny)
by duck_prime (585628) on Thursday February 05, @08:45PM (#8197142)

For learning, we don't have to learn assembly first anymore, you can start with any language. I think it is good to take a two pronged approach. Learn C first, and at the same time, start learning digital logic. [...] When one is comfortable with both, I think learning assembly is much easier.

You are missing the One True Purpose of assembly language, and the One True Reason everyone should learn assembly first:

Nothing else in the Universe can make students grateful -- grateful! -- to be allowed to use C
Don Knuth's assembly language MIX runs on a theoretical processor, and all of the examples in The Art of Computer Programming (TAOCP) are based on it. As he has revised the editions, he has updated the language to be RISC-based (search Google for MMIX [google.com]), but he chose not to update the examples to a higher-level language. Here is his reasoning from his web page [stanford.edu]:
 

Many readers are no doubt thinking, Why does Knuth replace MIX by another machine instead of just sticking to a high-level programming language? Hardly anybody uses assemblers these days.

Such people are entitled to their opinions, and they need not bother reading the machine-language parts of my books. But the reasons for machine language that I gave in the preface to Volume 1, written in the early 1960s, remain valid today:
 

  • One of the principal goals of my books is to show how high-level constructions are actually implemented in machines, not simply to show how they are applied. I explain coroutine linkage, tree structures, random number generation, high-precision arithmetic, radix conversion, packing of data, combinatorial searching, recursion, etc., from the ground up.
     
  • The programs needed in my books are generally so short that their main points can be grasped easily.
     
  • People who are more than casually interested in computers should have at least some idea of what the underlying hardware is like. Otherwise the programs they write will be pretty weird.
     
  • Machine language is necessary in any case, as output of many of the software programs I describe.
     
  • Expressing basic methods like algorithms for sorting and searching in machine language makes it possible to carry out meaningful studies of the effects of cache and RAM size and other hardware characteristics (memory speed, pipelining, multiple issue, lookaside buffers, the size of cache blocks, etc.) when comparing different schemes.


Moreover, if I did use a high-level language, what language should it be? In the 1960s I would probably have chosen Algol W; in the 1970s, I would then have had to rewrite my books using Pascal; in the 1980s, I would surely have changed everything to C; in the 1990s, I would have had to switch to C++ and then probably to Java. In the 2000s, yet another language will no doubt be de rigueur. I cannot afford the time to rewrite my books as languages go in and out of fashion; languages aren't the point of my books, the point is rather what you can do in your favorite language. My books focus on timeless truths.

Therefore I will continue to use English as the high-level language in TAOCP, and I will continue to use a low-level language to indicate how machines actually compute. Readers who only want to see algorithms that are already packaged in a plug-in way, using a trendy language, should buy other people's books.

The good news is that programming for RISC machines is pleasant and simple, when the RISC machine has a nice clean design. So I need not dwell on arcane, fiddly little details that distract from the main points. In this respect MMIX will be significantly better than MIX.

YALE PATT ALREADY DID THIS (Score:2)
by Prof. Pi (199260) on Friday February 06, @02:26PM (#8204582)
One of the leaders in computer architecture, Yale Patt, has already written a book [mhhe.com] based on this concept. He gives enough of an overview of logic design to understand things at the RTL (register-transfer) level, and distills CPU design to its essentials. He doesn't get to C until halfway through the book.

His observation is that CS students have a MUCH easier time comprehending things like recursion when they understand what's really going on inside.

(My efforts to get this book introduced at my old university were unsuccessful, as the department chairman was afraid that teaching assembly language would drive students away. He wanted to teach them Java instead.)

MIXAL (Score:3, Informative)
by texchanchan (471739) <[email protected]> on Thursday February 05, @07:03PM (#8196116)
(http://www.chanchan.net/)
MIXAL, MIX assembly language. MIX was the virtual machine I learned assembly on in 1975. Googling reveals that MIX was, in fact, the Knuth virtual computer. The book came with a little cue card with a picture of Tom Mix [old-time.com] on it. MIX has 1 K of memory. Amazing what can be done in 1 K.
 

Re:Knuth (Score:1)
by d_p (63654) on Thursday February 05, @08:25PM (#8196980)
Knuth uses MIX, a machine language with 6-bit bytes, as well as a form of assembly, for his simulated computer.

It may have been updated to 8-bit bytes in the addenda to the book.
Re:Somewhere in the middle... (Score:5, Insightful)
by Saven Marek (739395) on Thursday February 05, @07:09PM (#8196209)
I learned LOGO and BASIC as a kid, then grew into Cobol and C, and learned a little assembly in the process. I now use C++, Perl, and (shudder) Visual Basic (when the need arises). My introduction to programming at a young age through very simple languages really helped to whet my appetite, but I think that my intermediate experiences with low level languages helps me to write code that is a lot tighter than some of my peers.

I'm with you there. I learned C, C++ and assembler while at university, and came out with the ability to jump into anything. Give me any language and I can guarantee I'll be churning out useful code in a VERY short amount of time.

Compare this to my brother, 12 years younger than me, who has just completed the same comp.sci course at the same uni and knows only one language: Java. Things change, not always for the better. I know many courses haven't gone to the dogs as much as that, but many have. I'm not surprised the idea of teaching coders how the computer works is considered 'novel'.

I can see a great benefit for humanity the closer computers move to 'thinking' like people, for people. But that's just not done at the hardware level, it's done higher. The people who can bring that to the world are coders, and as far as I'm concerned thinking in the same way as the hardware works is absolutely essential for comp.sci. Less so for IT.
Re:Good idea, Bad Idea (Score:5, Insightful)
by RAMMS+EIN (578166) on Thursday February 05, @07:10PM (#8196217)
(http://www.inglorion.net/)
``Bad Idea: Teaching CS by starting with one of the most cryptic languages around, and then trying to teach basic CS fundamentals.''

I completely disagree. Assembly is actually one of the simplest languages around. There is little syntax, and hardly any magic words that have to be memorized. Assembly makes an excellent tool for learning basic CS fundamentals; you get a very direct feeling for how CPUs work, how data structures can be implemented, and why they behave the way they do. I wouldn't recommend assembly for serious programming, but for getting an understanding of the fundamentals, it's hard to beat.
Re:Good idea, Bad Idea (Score:4, Insightful)
by pla (258480) on Thursday February 05, @07:17PM (#8196302)
Then, confuse the hell out of a student with assembly

I disagree. Personally, I learned Basic, then x86 asm, then C (then quite a few more, but irrelevant to my point). Although I considered assembly radically different from the Basic I started with, it made the question of "how the hell does that Hello World program actually work?" a whole lot clearer.

From the complexity aspect, yeah, optimizing your code for a modern CPU takes a hell of a lot of time, effort and research into the behavior of the CPU itself. But to learn the fundamental skill of coding in assembler, I would consider it far less complex than any high-level language. You have a few hundred instructions (of which under a dozen make up 99% of your code). Compare that to C, where you have literally thousands of standard library functions, a good portion of which you need to understand to write any non-trivial program.


There are already problems with people interested in CS getting turned off by intro/intermediate programming classes.

You write that as though you consider it a bad idea...

We have quite enough mediocre high-level hacks (which I don't mean in the good sense, here) flooding the market. If they decide to switch to English or Art History in their first semester, all the better for those of us who can deal with the physical reality of a modern computer. I don't say that as an "elitist" - I fully support those with the mindset to become "good" programmers (hint: If you consider "CS" to have an "S" in it, you've already missed the boat) in their efforts to learn. But it has grown increasingly common for IT-centric companies to have a handful of gods, with dozens or even hundreds of complete wastes-of-budget who those gods need to spend most of their time cleaning up after. We would do better to get rid of the driftwood. Unfortunately, most HR departments consider the highly-paid gods as the driftwood, then wonder why they can't produce anything decent.

Hmm, okay, rant over.
 
Re:Whatever (Score:2, Insightful)
by jhoger (519683) on Thursday February 05, @07:01PM (#8196088)
(http://hogerhuis.net/)
A lot of software work is at a smaller scale. If 60% or so of software's lifecycle is maintenance, and there's a lot of software out there, and also since many software projects are very small, I'd venture to say that process is almost irrelevant for plenty of work.

Being knowledgeable about low level operation of the machine will take you farther, since you won't have the fear of getting down to the bare metal to figure out a problem. And assembly language is important there... but also things like debuggers, protocol sniffers, etc. Anything that lets you get to the bare metal to figure out a problem will get you to a solution quicker.

Process and modern design concepts are important for large projects and at the architectural level.
Great concept. (Score:5, Insightful)
by shaitand (626655) on Thursday February 05, @06:59PM (#8196060)
I started out learning to code in asm on my c64 and I'd have to say it was a very rewarding experience.

Anyone who disagrees with this probably doesn't have much experience coding in assembler to begin with. Asm really is fairly easy; the trick is that most who teach asm spend too much time on computer concepts and not enough time on actual coding. Understanding how the machine works is wonderful, and necessary for writing good assembler, but you should start with the two pages of understanding needed to "get" asm at all.

Then teach language basics, and THEN teach about the machine using actual programs (a text editor, other simple things), explaining in small chunks why they are coded the way they are. Instead of handing out a chart of BIOS calls and a tutorial on basic assembler, introduce BIOS calls in actual use in a program; most of them are simple enough that, shown in context, they are quite clear and anyone can understand them.

After all, assembler (pretty much any assembler) is composed of VERY simple pieces. It's understanding how those pieces fit together to form a simple construct, how those constructs combine into a simple function, and how those functions combine into a simple yet powerful program that teaches someone programming. Learning to program this way keeps things easy, but still yields a wealth of knowledge about the system.

It also means that when you write code for the rest of your life, you'll understand what this or that form of loop does in C (insert language here) and why one of them is going to be faster, since simply looking at the C (insert language here) concepts doesn't show any benefit of one over the other.


Etc




Copyright © 1996-2021 by Softpanorama Society. www.softpanorama.org was initially created as a service to the (now defunct) UN Sustainable Development Networking Programme (SDNP) without any remuneration. This document is an industrial compilation designed and created exclusively for educational use and is distributed under the Softpanorama Content License. Original materials copyright belong to respective owners. Quotes are made for educational purposes only in compliance with the fair use doctrine.



Last modified: March 12, 2019