That's the culture: the on-board shuttle group produces grown-up software, and the way they do it is by being grown-ups. It may not be sexy, it may not be a coding ego-trip -- but it is the future of software. When you're ready to take the next step -- when you have to write perfect software instead of software that's just good enough -- then it's time to grow up.
Software is getting more and more common and more and more important, but it doesn't seem to be getting more and more reliable.
My name is anonymous, I am a programmer, and I approve this statement.
Are they serious about commenting every single line of code? Surely there must be some code or some techniques that are used in more than one place. Commenting on every single instance of duplicate code with distinct, uniquely in-context insights must be some chore.
When lives are on the line, the meaning of a line, its requirement and its effects, must be thoroughly understood. It might be a chore, but them's the breaks.
They don't just comment it, either; there are several pages of documentation for every line. Even when identical code turns up elsewhere (which is more likely than in normal code, given the linear coding style), it gets documented again: numerous people work on the code, and you want to make sure none of them ever has to go looking somewhere else for the explanation. Absolute certainty.
That's all well and good, but I really doubt the feasibility of this kind of coding for regular commercial software; it would just take too much time and be too expensive. (Come to think of it, maybe this is why Longhorn has been so delayed... ha ha, ha ha ha, ha. Ha.) And as for open-source software, forget it! If you think about it, it's a minor miracle that a bunch of geeks who have never met each other face-to-face can get together and do things like Apache and KDE and The GIMP in the first place.
All that effort and the things still blow up every now and again. Maybe instead of bragging about having perfect code they should be making sure the rest of the shuttle team is doing its job properly.
When you need to write software that works, you take a fscking course in Software Engineering. You learn what your code is doing and you make it work. I'm not saying it's "easy", but I don't believe it's "hard". I haven't been around enough to be sure, but I think they're pushing the "normal coders are frivolous twenty-something-year-olds" line a bit hard.
You can make software work, but the reason it often doesn't comes down to poor management and budget constraints. With the tech boom, perhaps people got used to too much, too quickly. Writing software takes time and money, and I get the impression that just doesn't happen nowadays. Apparently no one has the time to do proper requirements analysis and then follow it up with ERDs/DFDs.
You can write robust code, so long as you keep some things in mind. Heck, let's make that one thing: paranoia.
Never assume anything about the data that's coming in (or about what happens to it once it's out of your hands). Which leads us to the next bit: being able to deal with ANY data that does come in (or gets lost).
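For instance, the paranoid version of reading a number from somewhere untrusted looks less like a bare atoi() and more like this (just a sketch; parse_port is a made-up name, not from any real library):

#include <errno.h>
#include <stdlib.h>

/* Paranoid parse of a port number from an untrusted string.
 * Returns 0 on success, -1 on any kind of garbage. */
int parse_port(const char *s, int *out)
{
    char *end;
    long val;

    if (s == NULL || *s == '\0')
        return -1;                      /* no data at all */

    errno = 0;
    val = strtol(s, &end, 10);
    if (errno == ERANGE || *end != '\0')
        return -1;                      /* overflow, or trailing junk */
    if (val < 1 || val > 65535)
        return -1;                      /* parses fine, but out of range */

    *out = (int)val;
    return 0;
}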
We have tools to help us do this. Maybe they're expensive, but you can use them, and they can assure you 100% (and show you the proofs!) that your code cannot fail, assuming the system behaves in a predictable manner (i.e. the CPU keeps running and code is read one line after the other, sorta thing).
Commenting: Removes ambiguity. The big bonus is obviously in a team environment. That way, John doesn't need to ask Jim what $xyz refers to. He already knows.
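A contrived C version of the John-and-Jim situation (the names, numbers and tax rate here are all invented):

double f(double p)
{
    return p * 1.1;        /* what is p? what is 1.1? ask Jim, if he remembers */
}

/* Gross price is the net price plus 10% sales tax (rate made up for the example). */
double gross_price(double net_price)
{
    return net_price * 1.10;
}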
>>6
Who says they aren't? Maybe they are. I wouldn't know though, and neither do you.
Throwing large machines into orbit and bringing them back down again, repeatedly, isn't easy. It's also a thankless job, because no matter how good you are, one oversight can spell the end. And nobody will appreciate what you do, because when you do it properly, nobody notices.
They only notice the mistakes.
>>7
I love the "it's never our mistake" attitude a lot of developers have. It's always mismanagement.
I won't defend managers, since they do deserve a lot of the flak, but coder discipline and communication are also important elements.
> they can assure you 100% (and show you the proofs!)
Static analysis is limited: with any real-world language it can only catch a subset of bugs. Besides, even if you can prove the code is correct against its specification, it can still have bugs with respect to the actual problem at hand.
>>9 management's fault: Well, that's the ego-tripping side. I admit it, I'm not a developer, I'm a bright-eyed, idealistic, naive young-un. Neither mgmt. nor coders can be absolved completely from blame. But as we've already agreed, everyone has to work together.
Static analysis: what I was referring to here was actually not code analysis but logic analysis. We were taught at uni with the "B toolkit" (don't know if you've heard of it, I don't know how widespread it is). You specify your system using Abstract Machine Notation (AMN) and it whirrs its gears and checks that all the logic is bulletproof. Many people in the subject hate it, and I can see why, but by the end of the course, it had really grown on me, and I have a certain respect for it now.
What I meant by 100%-perfect code was that the toolkit can apparently (I never used this feature) write the code for you (presumably C or something). Not all of it, obviously, but the core logic.
And yes, I agree it's still not perfect. The system may be perfect, but the working environment isn't.
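To give the flavour in C (this is not actual B toolkit output, just my sketch of what precondition-checked code derived from an AMN-style machine might look like):

#include <assert.h>

#define QUEUE_MAX 16

/* State of the "machine": a bounded queue of ints. The invariant the
 * spec carries around is 0 <= count <= QUEUE_MAX. */
typedef struct {
    int items[QUEUE_MAX];
    int count;
} Queue;

/* Operation "enqueue", with its AMN-style precondition made explicit.
 * A proof tool would discharge the obligation that the invariant still
 * holds afterwards; here we just assert it. */
void enqueue(Queue *q, int x)
{
    assert(q->count < QUEUE_MAX);       /* precondition: queue not full */
    q->items[q->count] = x;
    q->count = q->count + 1;
    assert(q->count <= QUEUE_MAX);      /* invariant preserved */
}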
Never heard of it, but I know about formal proofs. It's certainly interesting, but it won't be used any time soon.
I can see it producing functional code. C though... pointers and all? Does it let you specify library calls and place constraints on them?
Must be consigned mostly to academia then. Which is a pity, really. Stuff might work more often if it were used. It's not trivial to use, though. Something like an OS specified like that would be unimaginable to me.
I hear (from my tutors, which they probably heard from friend-of-a-friend, etc.) that it was used to figure out the running of France's rail systems. If it's true, that's pretty cool.
Code: Well, that, I couldn't say; I never used it. The C thing was just a wild guess on my part. There are probably languages better suited to being generated automatically.
I hate that article. The future of software? I don't think so.
>>7,10
Ever heard of the halting problem? It means that you may start needing an awful lot of qualifications to say "this code is correct". I can see it working within the bounds of a certain library, but I tend to be suspicious when I hear people say "xxx program can tell you if your code is correct". Also, logic analysis is just one example of various kinds of static analysis you can do, though some would probably argue that any sort of static analysis you can do is logic analysis.
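Concretely: nobody has ever proven whether this little loop terminates for every positive n (it's the open Collatz problem), so good luck getting a tool to certify it in general.

/* Does this halt for every positive n? Nobody knows; it's an open
 * problem (the Collatz conjecture), which is the sort of thing the
 * halting problem warns you about. (Overflow is ignored for brevity.) */
unsigned long collatz_steps(unsigned long n)
{
    unsigned long steps = 0;
    while (n != 1) {
        n = (n % 2 == 0) ? n / 2 : 3 * n + 1;
        steps++;
    }
    return steps;
}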
Machine generated code isn't anything new. lex(1) and yacc(1) have been producing working compilers since the 1970s. Feed them a grammar for the C language, as can be found online in various places, and you can get yourself a working C compiler, written in C, generated completely by machines (because they don't make mistakes). Probably not the sexiest example, but it's the one I'm most familiar with.
The actual contents of the article weren't that bad. "Every line annotated" doesn't mean "every line has a comment", it means that every line in the software can be traced back to the revision in which it was introduced, to all revisions in which it's changed ... it just indicates they're using SCM. Which is ... pretty fricking elementary, in itself. I wonder why >>13 hates the article. (Well, unless it's its tone - I found it really annoying, the whole "haha, look at that mass of 20somethings writing buggy code" air it had for half of it.)
> and you can get yourself a working C compiler, written in C, generated completely by machines (because they don't make mistakes)
You can get a C parser, but not a compiler, which also has to translate to machine language. I don't think there are fully automated toolkits for this, and there'd still be room for error in the specifications for the output.
>>15
Sorry. This is true, but for some machines the extra work is easy - and I've only done it for MIPS, which isn't hard. You could add a third layer for completely automated compilation if you wanted, but such a layer doesn't exist in standard tool form to my knowledge. Still, that's half the heavy lifting of writing a compiler done by machine.
That's kind of misleading. The bulk of compiler design is in optimization, which is definitely not automatable. So you can get a very slow compiler easily, but it will be useless for any real program.
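Right, and even a baby optimization like constant folding is something a human has to sit down and write; it doesn't fall out of the grammar. A sketch, with a made-up expression type:

/* A tiny hand-written optimization pass: fold "const op const" subtrees
 * into a single constant node, bottom-up. The Expr type is invented for
 * this example. */
typedef struct Expr {
    enum { CONST, ADD, MUL } kind;
    int value;                  /* valid when kind == CONST */
    struct Expr *lhs, *rhs;     /* valid otherwise */
} Expr;

Expr *fold(Expr *e)
{
    if (e->kind == CONST)
        return e;
    e->lhs = fold(e->lhs);
    e->rhs = fold(e->rhs);
    if (e->lhs->kind == CONST && e->rhs->kind == CONST) {
        e->value = (e->kind == ADD) ? e->lhs->value + e->rhs->value
                                    : e->lhs->value * e->rhs->value;
        e->kind = CONST;
    }
    return e;
}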