"Young people occasionally ask me whether language X is something they'll need to make computer games, or what they should learn if they want to become programmers. And what language should we teach in school?"
Thought I'd post this here. Full article by a current programmer for Funcom (Anarchy Online): http://enno.homeunix.net/index.php?itemid=224
C
I'm not certain I'd pick C first, but I'd definitely have a person learn C before C++ and Java.
I agree with the author that learning Java first is a mistake. Yes, it's nice that the student doesn't need to worry about pointers (oh, no!), but teaching OO first is a mistake - a very bad one. Java is also horrid in its verbosity and sprawling library.
While OO is conceptually easy, actually implementing it isn't, at least in Java and C++. Further, since it hides logic, hidden effects can occur. This is impossible in C. WYSIWYG.
I've heard people argue that python may be a good starter language.
When I studied theoretical compsci + telematics, we were "taught" Oberon and Algol. Those are nice beginner languages, because they make you realize real fast whether you really want to be a programmer or not, and in that respect I advocate teaching them.
I agree with the notion mentioned in >>4 that python may be a good choice, because -in my opinion- it is conceptually close enough to C to be of use while being somewhat easier to pick up at first.
i'm surprised no one's mentioned perl yet... i'd say perl and c would be the two best choices...
java actually wouldn't be too bad for a beginner language... it was my second language and it definitely made learning other languages seem a lot easier...
C wouldn't be a very good language to start out with, IMHO. It simply isn't a very newbie-friendly language. Error messages are often cryptic, documentation is often inconsistent or incorrect and don't get me started on 3rd party library use.
When learning programming, you are really learning a way of thinking more than learning a language. If you can code in one procedural language, you can usually pick up another procedural language with relative ease. However, while actually learning that 'way of thinking', it is in my eyes better for the learner to be able to focus on that way of thinking rather than on figuring out what 'Error 01' means (true story, by the way... I still have no idea what it's supposed to mean).
I do think a procedural language would be the best to begin with, though. Not only is it easier to grasp than, for example, OO, but once you do begin using OO, you'll also be using concepts used in procedural programming.
>Java is also horrid in its verbosity
Eh? Why would verbosity be horrid? Programming is programming, whether a method is called c_str() or toCharArray(), it all requires the same way of thinking.
>Further, since it hides logic, hidden effects can occur. This is impossible in C.
I'm still curious as to how OO hides logic more than the use of functions and structures does.
>>5
Another benefit of Python is that even though it's capable of being used as an object-oriented language, students can transition smoothly from procedural base libraries to more specialised OO libraries. Unfortunately, the deprecation of procedural functions in the string library doesn't bode well for the future of that sort of transition.
It's also worth noting that Python enforces the use of proper indentation and spacing.
>>7
OO classes make use of private functions that may be incomplete or faulty implementations of functionality available in other third-party libraries used by the same project. It isn't clear exactly how any function works in a given class relative to similar functions provided by other classes. In a well-designed, totally self-sufficient, fully-encapsulated object, I suppose it wouldn't matter, but how many classes realistically provide all the functionality you'd ever need from it without having to subclass it at some point?
> In a well-designed, totally self-sufficient, fully-encapsulated object, I suppose it wouldn't matter, but how many classes realistically provide all the functionality you'd ever need from it without having to subclass it at some point?
I can see you've never used Cocoa.
(Unfortunately, that's probably not an option for teaching GUI programming...)
This should tell you all you need to know:
http://groups.google.co.nz/groups?selm=60k4s3%24dci%40netlab.cs.rpi.edu&output=gplain
>OO classes make use of private functions that may be incomplete or faulty implementations of functionality available in other third-party libraries used by the same project. It isn't clear exactly how any function works in a given class relative to similar functions provided by other classes.
In C, at least, it's also possible to prevent others from using a certain function. So if I wanted the functionality that function provides, I'd have to write my own version, which may itself be an incomplete or faulty implementation of that function.
>In a well-designed, totally self-sufficient, fully-encapsulated object, I suppose it wouldn't matter, but how many classes realistically provide all the functionality you'd ever need from it without having to subclass it at some point?
How is that any different from having to write your own function because a certain library doesn't provide the functionality you need?
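To make the concealment point concrete: in C that's just a static function at file scope. A made-up sketch (util_scale_percent and clamp are invented names, not from any real library):

#include <stdio.h>

/* Imagine this is util.c in some library. The 'static' keyword limits
   clamp() to this file; code elsewhere can't call or link against it,
   so its functionality is concealed much like a private method. */
static int clamp(int value, int lo, int hi)
{
    if (value < lo) return lo;
    if (value > hi) return hi;
    return value;
}

/* Only this function is meant to be visible to other files. */
int util_scale_percent(int value)
{
    return clamp(value, 0, 100);
}

int main(void)
{
    printf("%d\n", util_scale_percent(150));   /* prints 100 */
    return 0;
}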
I think, whatever lang you start out with, it should be an interpreted one. They let you more quickly see the results of your code without having to hassle with the compile-debug-compile loop. And it's not like n00bs are going to be writing serious resource-intensive programs anyway.
I'd go with Python.
C isn't a good first language - it's a bit too unforgiving - but it is definitely a very good second or third one. Everyone should learn C at some point, just for the lower-level understanding of computing it gives you. These days, learning assembly isn't really feasible any more, so C is a good second choice for learning how a machine works. Besides, pretty much everything you actually use is written in C or C++.
For a first language, I'd agree that an interpreted one might be a good choice. The best would be an interpreted one with an interactive mode, but those are few and far between these days. Depending on your mindset, Perl might be very good or completely awful. An easily overlooked alternative, however, is Javascript. Javascript is syntactically very simple and elegant, it lets you do things that are actually somewhat fun, and it is available on pretty much every machine by default. The downside is that it requires some familiarity with HTML before you can really use it.
C is very useful to learn at some point for the sole reason that a whole bunch of languages are based on the C syntax - C++, javascript, java, php, and probably a bunch more. One may like or dislike these but getting to know their syntax for "free" is a good thing.
>>13
If JavaScript is interpreted exactly the same way by all interpreters and executes predictably in every browser, maybe.
>>11
Procedural languages don't conceal data in classes that only the class' own functions can access and manipulate. With C, you can personally debug and correct any function that you write to manipulate the data. In an OO language, you might have to code an entire custom class that reimplements most of another class' functionality from scratch just to work around a bug (child classes can't access the parent's private data). With a procedural language, you can get away with replacing or adding only the one function that's broken or missing.
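A tiny sketch of what that looks like in practice (the leap-year routine is a made-up stand-in for whatever library function happens to be broken):

#include <stdio.h>

/* Pretend this came from a third-party procedural library and is buggy:
   it forgets the 100/400 rules. (Hypothetical function, for illustration.) */
int lib_is_leap_year(int year)
{
    return year % 4 == 0;
}

/* Our one-function replacement; the rest of the library stays in use untouched. */
int fixed_is_leap_year(int year)
{
    return (year % 4 == 0 && year % 100 != 0) || (year % 400 == 0);
}

int main(void)
{
    printf("1900: library says %d, fix says %d\n",
           lib_is_leap_year(1900), fixed_is_leap_year(1900));
    return 0;
}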
There are C interpreters out there. I learned C with one of them. It was limited but very useful to get the hang of C. Plus the code could be re-used with a proper C compiler afterwards.
As for bad documentation, I don't know the current state of affairs but Borland was very clear back then when I learned. One click and the function was explained, with examples that could be copy/pasted.
While Microsoft Visual C's help was a total nightmare, very unintuitive.
The language itself executes pretty predictably across most browsers. It's when you start digging into the DOM that you run into the problems.
>>16
Odd that. I used to think that Microsoft's strong point was their documentation. Going from QuickBasic and QuickC to TurboC++, I remember clearly thinking that Borland's documentation was horrible. I often referred to QC's documentation for details.
I never used QuickC (for DOS) nor TurboC++ (for Windows).
I practiced with Borland TurboC/C++ (for DOS) and MS VisualC 2.x (for Windows).
Maybe they had completely different help?
Meanwhile I found the name of the C Interpreter I used: Quincy.
It was offered with an issue of DrDobbs Journal, and is still available at www.ddj.com.
I find Microsoft's documentation very good as well, particularly in content and organization. It's come a long way in the last 7 or so years. As far as horrible documentation goes, I'm going to have to give the prize to Sun.
>Procedural languages don't conceal data in classes that only the class' own functions can access and manipulate.
True, but that doesn't mean data can't be concealed. It can be.
>With C, you can personally debug and correct any function that you write to manipulate the data.
In an OO language, if you wrote the class doing the concealing of data yourself, then obviously, you'll be able to debug and correct any method that's part of that class to manipulate the data.
If you didn't write the class yourself, but want to write a function that manipulates that data, then you'll need to make some workaround, of course.
But then, the same goes for procedural languages; if the data is concealed (which it can be), you can't access it, and thus can't manipulate it, unless you make some workaround.
To elaborate on verbosity:
I hesitate to elaborate on why I believe Java is too verbose, simply because there are so many exceptions and caveats. After some thought I've come to the conclusion that it isn't the language itself I dislike. It's the standard library that comes with it.
Now, why does terseness matter? The issue for me is the quick recognition of what the code is doing.
Ie:
class YeOldeHelloWorld {
    public static void main(String[] args) {
        System.out.println("Hello World!");
    }
}
vs:
print "Hello world!"
vs:
"Hello world!" .;
Now, you can get around that rather ugly System.out.println mess by importing (see: an exception already). It's also unfair to bash Java for the "public static void" since they serve important functions. However, this whole mindset pervades the entire standard library, and leeches into the code that I've seen other people produce.
"Sure," you say, "but you should have a deep understanding of the code anyway! Most your time won't be reading or writing the code."
That's true! But when you're dealing with large amounts of code, shortening the interval until comprehension is critical. Can you fit the meaning of a code statement in one glance, or do you have to read it for a couple moments? Multiply that by a never-ending stream of code as you sit there all day, and that reading becomes a time sink. More reading, less thinking.
There's also the problem of reader fatigue, which has been studied in different disciplines (http://cogprints.org/753/00/pmuter1.htm for an overview). That's all well and good, except when you bump up against Java's library (I detest Java 3D):
TransformGroup tGrp = new TransformGroup();
rootGrp.addChild(tGrp);
tGrp.setCapability(TransformGroup.ALLOW_TRANSFORM_WRITE);
tGrp.setCapability(TransformGroup.ALLOW_TRANSFORM_READ);
MouseRotate mRot = new MouseRotate();
mRot.setTransformGroup(tGrp);
mRot.setSchedulingBounds(new BoundingSphere());
...which is by far not the worst example I can come up with, since it doesn't involve the creation of useless objects just to pass some silly values to the Java3D library (ie: bc.addBox(0.5f, 0.5f, 0.5f, new Color3f(1, 0, 0), new Color3f(1, 0, 0));).
To point out further why terseness matters, allow me to make an absurd example:
c_str()
vs:
toCharArray()
vs:
ConvertAllCharactersInArrayToOneByteUnsignedCharacters()
Alright, that's extreme. But I hope you see where I'm going: it takes time to read that. Even toCharArray() may be too long, since it will probably be part of an entire statement, not just sitting there on a single line alone.
On the other hand, something like ca() is the opposite problem: it's so short it conveys no useful information (one reason why complex regexes are sometimes called "write-only"). Obviously, a balance must be struck between terseness and descriptiveness. I think the Java libs err on the side of descriptiveness.
Of course, this is entirely my preference. YMMV.
Actually, what the Java runtime does is make difficult things possible, and easy things just as difficult. For instance, if you want to use the Java library to load an image from disk and process it, you have to create a horde of useless objects, including one object whose purpose is to monitor when the image has finished loading and tell you about it. This is all obviously designed to be usable in an environment such as a web browser, where you want to issue requests for images over various kinds of network connections and have them load in the background. However, when you want to load a single image from a local disk, you have to do JUST AS MUCH WORK as if you were loading it asynchronously over the network.
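For contrast, the synchronous local-disk case ought to be roughly this much work. A plain C sketch, obviously not the Java API being complained about; load_file and the file name are made up:

#include <stdio.h>
#include <stdlib.h>

/* Load a whole local file synchronously; no monitor objects required. */
unsigned char *load_file(const char *path, long *out_size)
{
    FILE *f = fopen(path, "rb");
    unsigned char *data = NULL;

    if (f == NULL)
        return NULL;

    fseek(f, 0, SEEK_END);
    *out_size = ftell(f);
    rewind(f);

    data = malloc((size_t)*out_size);
    if (data != NULL && fread(data, 1, (size_t)*out_size, f) != (size_t)*out_size) {
        free(data);
        data = NULL;
    }
    fclose(f);
    return data;
}

int main(void)
{
    long size = 0;
    unsigned char *img = load_file("texture.png", &size);

    if (img != NULL)
        printf("loaded %ld bytes\n", size);
    free(img);   /* free(NULL) is fine */
    return 0;
}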
PHP
I'd agree that PHP is in the same class as BASIC or Pascal, but the danger in teaching people PHP is that they might actually use it for something. <-- troll
Yes. Whereas if we teach them Perl instead, they won't be able to use it for anything. <-- troll right back
>>24
That's the problem with Java in general: it takes the jackhammer approach to everything. It's almost never the right size for any given task.
The concept Java is based on is good, but they really screwed up the implementation.
Well, there is always QuickBasic. The language was simple, as was the IDE, and came with extensive help and examples. It's what I started with, and I had a lot of fun.
What's good in a learning language is instant gratification. Having a good graphics and game library is a great thing in a learning language, because making graphics and games is fun.
If you have an interactive mode to just punch in expressions to see how they work, that's even better.
>>30: Python it is, then.
order i learned: LOGO, then ZZT-OOP, then QuickBasic, then javascript or C (and from there one should be able to learn anything, right?)
brainfuck?
> ZZT-OOP
lol
A school of thought holds that Scheme is a good choice for a first programming language. First, learn how to program, then learn C's syntax.
>>36's school is full of it.
You pretty much end up learning to program twice if you do it that way. Some people who have been doing CS for far too many years have forgotten all about what being a novice programmer is like. Functional programming may appeal to people who are already familiar with the rigorously mathematical way of thinking, but for the novice an imperative language is far easier to understand and follow - reading a recipe is easier than reading a mathematical proof to most people.
I wouldn't be that harsh. Assignment in imperative languages isn't exactly intuitive either.
Scheme as a first language is definitely unorthodox though. o.x
use perl;
Isn't Scheme the first language taught at MIT for CS?
And sure, reading a recipe is easier than a mathematical proof, but it doesn't make you a good cook ^^;
Most people who go into CS should be pretty decent at math, and if they don't understand the algorithm behind a program they should work on catching up because otherwise they are not going to get far.
We did HUGS (Haskell) in first semester CS and I didn't have any particular problems (backed up only by school maths and no programming experience at all), now one semester later we are doing Java and comparing the two of them I'm glad we started with the functional one.
>>40
Fortunately, programming and computer science aren't inextricably bound to each other. Many programming projects require neither math nor mathematical thinking; only a clear sense of what you (or your users) want the program to do, and how to present that functionality. The backend and coding framework will reveal itself with a bit of trial and error.
There's a place for math in physics modeling and all that stuff, but I don't see it as a definitive element that must be mastered to get anywhere.
Two languages should be taught. The first should be C. This gets kids who are used to playing with VB, JavaScript, and the like down to real machine-level problems, like pointer manipulation, buffer resizing, data structure design, etc. You really learn how systems programming actually works with C, something that languages like Java hide from you. And debugging in C is for real, chasing pointers and poking through data structures by hand. It's as close to assembly as you'll get without having to learn a specific machine architecture.
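For instance, a typical early exercise of that sort might look roughly like this (a sketch: read stdin into a buffer that is resized by hand as it fills; nothing here comes from a real course):

#include <stdio.h>
#include <stdlib.h>

int main(void)
{
    size_t cap = 16, len = 0;
    char *buf = malloc(cap);
    int c;

    if (buf == NULL)
        return 1;

    while ((c = getchar()) != EOF) {
        if (len + 1 >= cap) {                  /* keep room for the NUL too */
            char *tmp = realloc(buf, cap * 2); /* grow the buffer by doubling */
            if (tmp == NULL) { free(buf); return 1; }
            buf = tmp;
            cap *= 2;
        }
        buf[len++] = (char)c;
    }
    buf[len] = '\0';

    printf("read %zu bytes\n", len);
    free(buf);
    return 0;
}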
Second should be a language that breaks all the usual paradigms, like Common Lisp. You can teach OO in it, you can teach functional programming in it, you can teach compiler design in it, you can teach complex systems analysis in it, graphics programming, automated code rewriting and optimization, all sorts of stuff. You're abstracted away from the machine unlike in C, but you can dig deep and play with the bits if you want.
After that learning Java, C++, Perl, PL/SQL, COBOL, or anything else is pretty easy because the concepts in all of those languages have simple maps to either C or CL or both.
Your arguments are correct, but these are not "first" and "second" languages, but more like second and third, or even later. This isn't really about what to teach at the university level, but what to teach to those who want to learn programming from the start. There is a lot to be said for a first language that DOES hide all that from you, just to get you used to thinking in terms of programming.
Why do you hate Tcl/Tk? I like it.
>>43
If you don't already have some minor programming experience before entering a university program in computer science then you probably deserve to lose. You should already be capable of thinking algorithmically (although obviously not necessarily good at it) by the time you take your first formal programming class at university, because this is something that shouldn't need to be taught, just as use of variables shouldn't need to be taught in mathematics at a university. It's also something you are unlikely to learn quickly enough to pass your first semester in computer science. Therefore the languages that university students should be learning are languages which expose them to the maximal programmatic experience.
If you want to learn how to think of and write computer programs, go to a community college or learn it in high school. University professors shouldn't need to waste their time handholding CS students. This doesn't apply to programming classes for people in the natural or social sciences, since they're not expected to be talented with computers; however CS students shouldn't be taking those classes (unless they need an easy A).
I don't understand why university CS departments are trying so hard to be monolingual. Linguists are expected to learn at least two other languages besides their native language, and human languages are much harder to learn than programming languages. (It takes a typical linguist a couple of years to learn a language well enough to hold ordinary conversations. It takes a typical programmer a couple of months to learn a programming language well enough to write ordinary programs in it.) It seems to me that CS students should be expected to learn several programming languages, and to use them confidently. Particular classes would of course focus on particular languages, but it's strange to me why departments insist on only teaching one major language and then introduce one or two others in passing for a single semester and have students write only toy programs in them. CS students today ought to graduate with full competence in at least C, Java, and some "minority language" which is quite different from the Pascal/C camp like Lisp, Haskell, Erlang, Smalltalk, etc. They can go on to learn "industry" languages like PL/SQL, COBOL, Intercal, Fortran, et similia in the working world, and will have experience learning new languages quickly because of their education. And they'll naturally pick up Perl or Python or PHP or something because that's just what happens.
Hah, I see your point. It's just that they're rather rare now days (yay!).
COBOL: primarily legacy code in banks
Fortran: legacy code in scientific circles
PL/SQL: heavy-duty DB
Intercal: uh... someone help me here... do people use this?
When I think "industry" I think C/C++, Java/C#, VB, Perl/PHP/Python.
http://www.catb.org/~esr/intercal/ might be what you're looking for.
Here's ROT-13 implemented in Intercal: http://www.ofb.net/~jlm/rot13.i
My teacher used to say one is not a programmer before one has written xx programs in three languages. I forgot what xx was - maybe 30. 3 x 30 = 90 programs.
>>39 is DQN
He's wrong. One is a programmer when one can program any program in any language. Subject to inherent limitations in the languages and what is a reasonable amount of work, of course, but a real programmer transcends language.
Now, that is what should be taught at university: How to write any program in any language.
> a real programmer transcends language.
And this from a Bertrand Russell fan.
My school is switching from VB.NET and Java to Python and standard ISO-compliant C.
For the first language to learn... you should try BASIC...
since not everyone is clever enough to understand why we must learn BASIC (it's for learning flowcharts).
Or try C or C++ (better).
If you just use the above, you won't learn how the program works.. but if you learn in these steps:
you will soon understand what I'm talking about.
The steps I'm describing are basically two steps!
The question being asked is only "What computer language should we teach?" => the answer is the steps above.
If they learn C (not learning it fully is no PROBLEM),
you will find that every program uses the same style as C..
Oh yeah.... Java is the D language.. (Djava)
The reason for C was to upgrade B; D would be the upgrade of C. One more thing you should know.. Java is an island in a country named INDONESIA, and I live there and am writing from that island.
Basic will fuck over your ability to program in every other language by teaching you bad style and making you lazy around UI's.
> Basic will fuck over your ability to program in every other language by teaching you bad style and making you lazy around UI's.
I'm not saying you have to learn BASIC... you learn flowcharting from BASIC... you don't have to deal with the DIM (array) stuff...
Does that suck... YEAH,
but face it... no one is cleverer than me.. since I'm more STUPID than YOU!
>>59 deleted because it was incoherent rambling that broke the tables with coded text. Don't do that kind of shit in here, please.
Teach them to program directly in the target system's machine language. If they have the patience to even manage a hello world program then we can probably trust them not to use horrible shortcuts in higher level languages.
On a more serious note though, I think Python has been touted as a good language for beginners. I agree with >>51 though. Programmers should aim to learn the concepts of programming in a manner that does not depend on implementation. Unfortunately this is not easy to do because most people will balk if they see even the smallest difference in syntax. A lot of people complained to one of my professors for having his examples in Pascal when they were being taught Java.
We shouldn't be teaching a language; we should be teaching how to build components. That means that instead of picking an environment that is language-oriented, like Java, it should be one that is component-oriented, which means ones like the main .NET languages, Delphi, or even, horror, regular Visual Basic.
This places the focus on actually solving the real problem and thinking about how and making your little component behave consistently with other components.
I think we should be teaching people not to think like >>62.
>>62
stfu troll
If you don't understand my comments when I am being intentionally abstruse, you can always ask what the hell I am talking about.
Component oriented looks like a zomg new term for object oriented. Someone explain to me how they differ.
No, "components" are things you license from shady software companies, and have half-finished implementations, bad documentation, and numerious bugs.
The question on this page is: What computer language should we teach?
It means the person asking wants to know what ideal language he/she should teach to everyone, like me or you!
Damn, I believe I was wrong/mistaken about what he/she wanted!
Why did I give that talk? (see BEDUL) >>56 Because at my university I went that way!
BTW... the steps I learned were:
2. BASIC
2b. Fortran
3. Pascal
3b. VB (well, VB 5, not VB.NET)
4. COBOL (trust me... it's real)
5. C
6. ASM
Basically, what you want to teach is up to you. Even if it's only BASIC (or whichever languages you're able to use), you must remember one thing... there are a lot of people in the world who MAY not know about BASIC (or that kind of thing), or who may be MORE ADVANCED than you.. Don't be afraid.. I used to be like that...
What I'm going to do is learn from them... Hey... a teacher (or assistant) can learn from his/her students. Times have changed.. to learn.. you must not always be the superior one... you have to adapt...
BTW, sorry about my language... at first I was arrogant.
I'm from Indonesia... so my English here is rather.. confused... not good English, huh!
Components are more generic pieces. They link to each other using your chosen communication framework, sending messages to each other to get things done.
In one model, the only assumptions you make about other components are in the input and output of each component. Everything, including implementation/language, is hidden. You then use your connecting framework to link everything together.
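Very roughly, and stripped to the bone (a C sketch; the struct, names and "framework" are all invented for illustration, not any real component system):

#include <stdio.h>

/* A component exposes only its input/output contract; the implementation
   behind the pointer could be anything, even another language behind an FFI. */
struct component {
    const char *name;
    int  (*process)(int input);   /* the component's entire public surface */
};

static int double_it(int x) { return 2 * x; }

int main(void)
{
    struct component doubler = { "doubler", double_it };

    /* The "framework" only ever talks to the contract, never the innards. */
    printf("%s(21) = %d\n", doubler.name, doubler.process(21));
    return 0;
}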
That sort of thing has its place, but it's no model for general purpose programming. You're not going to be using linked list components or hash table components in your code anytime soon. The overhead of the communications framework is far too big. That sort of thing is more for high level systems design.
>>71
Yes, that's right.
Personally, I say teach program flow, then OO, then PLs. The people that won't understand the general idea won't understand if you start with Lisp or Scheme anyway. At least this way you get a bunch of people that know how to script, which is useful. Whether you should kick them out of CS is up to you. It has to be done within 2 years if you want to give them a chance to go into something else.
IMO python is best suited for that purpose.
It's easy, powerful, portable, interpreted, and very similar to most popular languages (c/c++,java,...)
I wouldn't touch commercial, unportable stuff like delphi, or vb with a ten-foot pole.
In my country pascal is usually taught in high schools, but i don't think anyone is using it for anything else. ;)
Anyway, if you know one language, learning another isn't hard.
If you know one language, learning another is very hard.
If you know three languages, learning another isn't hard.
Learning just C helps you learn PHP, ASP (even VB-like things), everything Linux-based, and even JAVA!
But learning one language is useless without learning how to make everyone besides the builder understand your language (program).
What I mean is... you must know how to make anyone besides you (I know you will never create the program alone, so everyone else is your partner) understand how the program works.
I am thinking it would be more useful to learn English.
i think assembler is the best as a teaching language. especially RISC assembler >:)
>>76
I concur.
From my experience, the best way to teach programming is to start (and here's the really clever bit) before the University. A mandatory school course in basic computer science, with focus on program flow and algorithm (as opposed to data structure) concepts would not only help teach CS in general, but would also aid the general population to stop treating computers as black boxes that work somewhat in the same way as the laws of physics.
Languages of choice for that would include any scripting languages, with my preference being JavaScript, and possibly Basic. It is important to note that many of the habits picked up by using these languages often reflect a fault in mentality (which should be broken while inside those languages) rather than a fault in the choice of first language. Also, these languages hide the UI behind a black box, exposing newbies to an intermediate between flowcharting and in-depth UI programming. (Seriously, does anybody actually like using message pumps for events?)
After simple algorithmics, it is important to get a feel for the basic structures that power computer programs. A data structures and algorithms class is a must at a very early level; languages of choice for this would be C, Java/C# (little OOP emphasis). While C prevents dealing with OOP concepts right away (which might or might not be a good thing at this point), it introduces a lot of ideas that also might be completely extraneous - namely the nightmare that is memory management. On the other hand, automatic, non-preemptive garbage collecting ala SmartPointers should be covered in a DS class... Really depends on the specific nature of the class.
After this grounding in computer science (which realistically should occur before the University level) concepts emerging from evolution into OOP and complex infrastructures (esp. as Operating Systems, Networked Systems and Enterprise Systems) should be taught. At this point, language really does not matter, except for implementation specifics.
Note: the above is only half of the story. It only covers the practical programming side of CS. The theoretical side is a widely different animal. To start, CS theory has solid grounding in math. Without basic familiarity with combinatorics, probability, and possibly some graph theory, aspiring computer scientists would get lost in the tangles of theory.
Like mathematics, it is best to study CS theory in Historic order - i.e. emergence of static computing engines -> finite state machines -> pushdown automata -> turing machines -> equivalence between language decision/recognition and solution of generic problems -> computability -> complexity... After that the simple linear model portrayed above breaks down, getting into increasingly abstract (though sometimes not so) fields. Information theory, i.e. coding/data size, cryptography, etc. and other such things can really be done in any language, though their formal definitions are usually expressed in mathematical terms.
The more 'academic' languages mentioned above, most notably Scheme, belong in the group of languages used to teach the theory of CS, rather than the practice. I personally like the beauty of Scheme, but would not be caught dead trying to build an enterprise system on it - it is too rigid, too formalized for practical purposes. Similarly, imagine an engineer being forced to use cutting-edge mathematics from theoretical physics to calculate stress from loads in his designs. Not practical whatsoever. If all those applicants that did not know C++ had shown up for the interview mentioned in the article with their language of choice replaced with Scheme, it is unlikely they would have been hired.
A true CS program includes a balanced collection of functional languages, theoretical languages, and specialty languages (SQL, COBOL, the like). Their order should proceed in terms of increasing complexity - increasing demand for logical/declarative rather than intuitive/temporal expression.
Well... the above and 50c might get you on the bus, if the fares ever got lower.
> To start, CS theory has solid grounding in math. Without basic familiarity with combinatorics, probability, and possibly some graph theory, aspiring computer scientists would get lost in the tangles of theory.
I dunno. You can learn most of the maths you'll ever need for CS (unless you get into crypto or something) in a couple of months. You should probably do that, but it's not like CS uses any really heavy maths.
And let's face it, CS people couldn't do a real mathematical proof if their lives depended on it. <-- troll!
> Like mathematics, it is best to study CS theory in Historic order - i.e. emergence of static computing engines -> finite state machines -> pushdown automata -> turing machines -> equivalence between language decision/recognition and solution of generic problems -> computability -> complexity...
I'm not convinced here either. A lot of that is fairly esoteric with little practical use. That doesn't mean you shouldn't learn it - it means you shouldn't learn it first, because then you'll just be going "What the hell is this shit? Why do I care?". You appreciate the theory of many of those topics much more if you've previously dealt with them in practice.
> You appreciate the theory of many of those topics much more if you've previously dealt with them in practice.
Exactly. What I was trying to get across there is that Scheme and other languages of the same nature should not be taught first, and should never be taught as a "practical" language. Which unfortunately, my University does... to "Intro to Computers" students, no less. I feel so bad for them...
>>68
Congratulations, you have just described Java and the majority of open source, all of which fail at componentization. Next you're going to tell us "services" are for enforcing a subscription model to make everyone pay $$$.
>>67
One of the more basic differences between component-oriented and just object-oriented languages is what gets treated as "first-class". CO languages may or may not be OO, but in general, current CO based languages are also OO based.
Most of this thread shows why industry has so little faith in the ability of the educational system - NFI where they're going or what they're doing on the leading edge, 5 years behind otherwise.
>> And let's face it, CS people couldn't do a real mathematical proof if their lives depended on it. <-- troll!
But it's true! :O
> One of the more basic differences between component-oriented and just object-oriented languages are what gets treated as "first-class".
Could you elaborate?
At least with open-source, you don't have to pay through the nose to receive shit. Also, your statement sort of implies that open source software wouldn't suck if it was "componentized". I find this hilarious.
Furthermore, academia has never been about teaching the flavour of the month. You want that, you go to a vocational school. Going from one language to another is trivial for a person with a real education in programming. Going from one paradigm to another, likewise. Especially when these paradigms are so nearly equivalent to each other. You can't even seem to illustrate an actual difference between object oriented and component oriented programming, whatever it is.
Following what Waha said in >>84, I thought I might jump in here and mention my experiences to date.
I started with BASIC back when I was about six or seven years old, and while this didn't go far, it was when I first got my taste for computing, and that's stuck with me ever since. With my limited capacity at the time, I wasn't going to get far beyond simple procedural programming, and that's what happened.
Early in high school when I was eleven or twelve I taught myself HTML through osmosis and picked up some Javascript after that (but I eventually had to buy a book to understand what the '.' was doing, and that was the start of OO).
Not much happened until I got to uni some four/five years later (where I started Computer Engineering). In first year we were taught Haskell (everyone hears Pascal whenever you tell them this) in COMP1A, which is a very... "pure" functional programming language. I admit it was kinda cool in ways, but it also gave me the shits, bigtime. Having a firm idea of procedural programming in my head, this was hard to get a handle on.
Second year we were doing Data Organisation, where we learnt C (what I regard as my first "real" programming language). So we did fun things with memory allocation and pointers and structs.
After that it's been microprocessors and some more hardcore bare-metal stuff. Looking back on it, I think this has been a really great path in learning programming and computing in general. The way the course has built has been very solid and you're getting a view of the whole thing as you go. C may have been invented, what, 30 years ago? There's a reason why it's still used and taught today. Simple fact is, it's relevant.
It's a holistic method of teaching programming and comp. in general. When you understand how it all works and fits together, then you can pick up languages easily. As waha said earlier, when you know maybe three languages, learning another one is easy. Why? Because chances are you now know what you're doing, and why.
Other comments from earlier:
2. Assembler as a teaching language: Sure, but make it the teaching language for when you're teaching hardware at the same time (hardware @ the single-cycle computer level). Assembler is great fun, but be kind to their sanity and make it RISC (I like ARM).
3. Maths in Comp. Okay, I fscking hate the mathematics I'm taught. Sure, I can almost see some of it being useful, but I hate the rest of it. And yeah, I probably can't do a proof to save my arse. Besides, that's what AMN and Z is for, right? :)
>>81
Are you at Berkeley by any chance?
A lot of people complain about Scheme being taught in the CS3 course. But it's not something they picked at random. The people behind CS3 include people with degrees in education as well as CS, which is unusual. So at least they're thinking about it and they have reasons, even if the reasons are arguable.
BASIC is a great first language, but VB is a poor first teaching environment. Get them going in QuickBASIC for the coursework; encourage them to explore VB on their own if they have a mind to, and give them any guidance they want.
QB will make them think about what they are doing, as they must implement the program logic themselves. At the same time, the fact that they can apply what they're learning in QB (with some amendments) to a graphical application in VB (which they can show off to their mates) encourages further personal experimentation.
Don't forget to explain to them why you're teaching them in a crusty old environment; and don't forget to ENCOURAGE THEM to explore on their own. 90% of real world programming (100% made up statistic) is self-taught -- familiarising with interfaces to new modules, searching for and through documentation, etc etc.
Explain to them that you can only show them ways to think about programming; that they must teach themselves how to program, by using this new thinking that you are demonstrating.
Instill in them the hacker mentality: there are no impossible things; merely things I have not yet taught myself to do.
When you teach them to program independently, you don't need to teach them their second and third and nth languages; they will teach themselves, for themselves. (Subsequent programming classes should then focus on concepts, theory, methodologies, etc: the heavy stuff that's better absorbed in the classroom than from google.)
Inspire them!
you're treating students like teaching the wrong language will kill them. more important is determination on the student's part. i tried to learn java when i was eleven/twelve. it was bloody hard by myself, and it took a seriously long time, just over a year before i could do anything good, and i had no results for ages. i would suggest a baptism of fire - teaching them something practical, something built for the present like java or c++ because it allegedly rocks. however i was self-taught and had all the time in the world, different from teaching. i agree with the last guy though - the 'hacker mentality' is very good and useful.
> you're treating students like teaching the wrong language will kill them. more important is determination on the student's part. i tried to learn java when i was eleven/twelve.
As a general policy, I think we shouldn't teach any shitty languages. Java is a shitty language. Therefore we shouldn't teach it to anyone, newcomers included.
Also, I disagree that learning should be a painful experience. "Baptism of fire" makes it sound like you're in the military and your life might be on the line.
Back me up, I'm goin' in!
I'd advocate C as the first language. It's small, in that the number of reserved words and the size of the standard library are small, and these days you could teach the use of debugging tools in the same course. Not to mention that it's low level, so you could build on top of those programming 101 courses that teach MIPS assembler as first language.
Also teaches people either to not fear the almighty pointer, or to be continually scared shitless of the big bad segfault.
>>90
Both of which are good things. Seconded.
C is a bad first language. You have to learn too much right up-front before you can write code.
Certainly no more than you have to learn to use Java, and Java is a popular first language at lots of schools. Actually, you have to learn quite a bit more to use Java meaningfully.
>>92
You can learn C in little bite-size increments. Starting with your first #include directive, which is like "without it, the compiler is mad at you, with it the compiler is happy"; everything else is just a function returning an int and taking no arguments and a function that prints text to stdout. No need to go into pointers or even arrays on the first day. Integers and maybe if() and for(), if it's an university course. That %d in the printf call can remain a magic mystery until a couple of weeks forward.
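Day one might look roughly like this (a sketch; the bottle count is just filler):

#include <stdio.h>   /* "without it, the compiler is mad at you" */

/* A function returning an int and taking no arguments. */
int main(void)
{
    int bottles = 99;

    /* The %d can stay a magic mystery for a couple of weeks. */
    printf("%d bottles of beer on the wall\n", bottles);
    return 0;
}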
Contrast with Java, where... uh... how exactly do you explain what System.out is? Or what "public static" in main's declaration means? And why String needs to be used with a capital S? Or why you need to have an object or a class to call a method? What is a method anyway? Why do instance methods behave differently from class methods? And how does this relate to what we learned about MIPS assembly in computer innards 101?
I tried to learn Verilog about a year ago. It was indeed quite a difficult experience, given that I have absolutely no experience in the low-level nitty-gritty of electrical engineering. Imagine how clueless the young students are going to be when they have no concept of, say, control flow.
>>94
As opposed to most scripting languages, which start with something similar to this:
print "Hello world"
C would make a fine second language. It strikes me as a bit sadistic to recommend it as a first.
>>95
I'm sure those are the exact thoughts of the people who recommended BASIC as the first language.
And what exactly is wrong with BASIC? GOTO/GOSUB? Oh no, the humanity!
I have news for you, man: perl/python/ruby/et al are far more powerful than C, except for certain niche domains. A powerful language doesn't need to have a high learning curve.
A lot is wrong with BASIC. Don't take my word on it, take Edsger Dijkstra's word.
>>97
Except you don't want to teach people powerful languages as their first language. The flipside of powerful is "can't fit a cheat sheet on the sides of a coffee mug", and TMTOWTDI is a pretty much certain way to confuse the newbies. Seriously, try explaining generators to someone who doesn't understand control flow in the first place.
With C it's what you see is what the compiler will give you, undefined results, furballs and all. And that is one of the things where C++ fails as a teaching language; how do you explain in the first few lectures that "yes, at this point this operator is called the into-the-stream operator, but at that other point it's a bit shift"?
Anyway, I'm going to take your "certain niche domains" to mean damn near all of system-level software in modern Unix-like operating systems (i.e. the *BSD variants, Linux and whatever the HURD is these days).
If system-level software isn't a niche domain, what is?
>>100
Elevator controllers.
> Except you don't want to teach people powerful languages as their first language.
Why the hell not? First you (I assume you're >>94) argue you can learn C in "little bite-size increments", now you claim you shouldn't do such a thing in another language?
Regardless, the relevant issue here is as thus: it has less to do with the power of a language, and more with its learning curve. And you can do a lot in Python without using its generators. How much can you do in C without pointers? This isn't even the same ballpark...
> damn near all of system-level software in modern Unix-like operating systems
May I be the first to point out that most of said system-level software need not be written in C? Indeed, the world might have been better off if a more powerful language had been used (and a less shitty library that thinks we're still in the era of the PDP and VAX). Think of all that wasted productivity.
Furthermore, >>100 is bang on. When was the last time you looked for employment?
Pascal reeks of old, but I'd still teach Pascal first. It's simple and straight to the point of the first paradigm one should learn - traditional/imperative/structured programming.
Then I'd move on to C and see things from a low-level standpoint, and from it I'd go to Python which is a pretty good, clean language IMO.
I just wouldn't teach Python first because it deals with more complex things (objects, references, lists, etc.) which you are bound to stumble upon if you do anything with it. Besides, people might think less of it if they see it first, or they may never discover the good stuff it has.
As for OO, no OO for starters, that's a bad idea; and much less Java, which is a crappy language with a crappy API that's going to get students nowhere.
> How much can you do in C without pointers?
Without pointers, not much. Without pointer arithmetic, lots and lots.
It's not really pointers that are hard. It's the pointers-to-base-types and pointer arithmetic that is hard. You can teach a whole lot by just ignoring the existence of these things. As long as you treat pointers to system structures and objects as opaque types, you've basically got what Java has, and nobody says Java is too hard because it uses object references.
Sooner or later you have to learn these things, but you don't have to learn them right away. The learning curve shouldn't suffer if you don't go out of your way to make it hard.
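The standard library itself already works that way; a small sketch using FILE, the classic opaque system structure (the file name is made up):

#include <stdio.h>

int main(void)
{
    /* FILE is opaque: we hold a pointer, pass it to library functions,
       and never need to know or touch what's inside -- essentially an
       object reference, with no pointer arithmetic in sight. */
    FILE *f = fopen("hello.txt", "w");
    if (f == NULL)
        return 1;

    fprintf(f, "Hello, opaque world!\n");
    fclose(f);
    return 0;
}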
> Without pointer arithmetic, lots and lots.
In C, [] is pointer arithmetic. Would you live without arrays?
Pointers aren't hard, but they require you to be agile enough to think of them, and have pretty clear concepts of variables and memory organization. For example, if somebody thinks ***a is more complex than **a, he needs to improve on this.
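To be concrete about [] being pointer arithmetic (a tiny sketch):

#include <stdio.h>

int main(void)
{
    int a[] = { 10, 20, 30, 40 };
    int *p  = a;                    /* the array name decays to a pointer */

    /* a[2] is defined as *(a + 2), so all three expressions below are the
       same access -- even the famously confusing 2[a], which is *(2 + a). */
    printf("%d %d %d\n", a[2], *(p + 2), 2[a]);   /* prints: 30 30 30 */
    return 0;
}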
>>103
I think you should really have a functional language as the second or third. Teach some easy scripting language like Python or JavaScript as a first language, then C and a functional language as the second or third. That sort of balanced mix will get people to be able to think about programming from a variety of perspectives.
That's what it is, but you don't have to know that. You can pretend it's just a magic array index operator, until you reach the point where you're ready to deal with it in more detail.
That sort of approach will also mean each new language is as hard as the first one, since you can't carry over most of your earlier knowledge. If you're smart, you'll probably take it in your stride, but if you're smart you're going to learn this stuff sooner or later anyway. If you're not so smart, you're just learning a bunch of stuff that won't really benefit you and that you would be better off without.
Of course, I think functional languages are largely a waste of time (except maybe as a warning as to why you shouldn't let failed maths majors do CS).
Newsflash: Learning can be hard. Programming is a hard subject. Learning a hard subject is extra hard. Some people will have trouble with it, sure, but they should maybe pick an easier subject.
And really, I do not think you know what you're talking about when you dismiss functional programming. Maybe you haven't worked on projects big enough to understand the need for abstraction? ... Sure, Haskell and Scheme are not the world's most practical programming languages, but techniques from functional programming are invaluable, and they can help you write clear, effective code even in C.
>>106
To tell you the truth, I'm not terribly sure what's a better idea - to start from a more theoretical point of view then build downwards or to start with a more low-level point of view then build upwards.
However, I wouldn't call Python simple for a few reasons; I'd rather start with JavaScript.
I dismiss functional programming for the one single reason that it is not practical. It doesn't matter how mathematically elegant it is, if you can't use it to write real-world code it's just theoretical wanking. But let's not start up that discussion again - look through some of the older threads for more on that.
>>110
Though it wouldn't be a bad idea to teach it. It's a bit of an enlightening experience, and you can use some functional constructs in real world, practical, down-to-Earth multiparadigm languages such as Python or Ruby.
Then wouldn't it be better to teach how to use those constructs in those languages? A lot of students I know don't get enlightened from learning a functional language, they get annoyed, and will want nothing to do with those afterwards.
There's nothing inherently impractical about functional programming. It's quite possible to write "real-world" code in a functional language just as easily as it is in a procedural one. Or rather, it's just as easy in a language that's mostly functional. Of course it's ridiculous to program in a completely functional way, just as it's ridiculous these days to program in a completely procedural way. There's lots of benefits to programming functionally, as long as you don't adhere to the paradigm beyond all reasonable logic.
But this brings up an important point regarding attempting to teach functional programming: There's no reason to use a language like scheme or haskell if all you want to do is teach a paradigm; you can program functionally in python or perl too. Maybe not to as great a degree, but much more than one might expect.
On the flip side of that coin, though, is the fact that so-called functional languages like lisp don't hold you to a functional paradigm either. You can program procedurally in lisp if you want; actually, you can pretty much do anything you want in lisp. That's one of its perks. Moreover, there's plenty of real-world code written in common lisp, and enough libraries and support that it's just as easy to write actual code in lisp as it is in perl or any other language, for the most part. So as far as functional programming goes, if you're concerned about practicality then you're thinking about the issue in the wrong way.
No, I agree entirely that functional programming methods are useful in languages that aren't purely functional. It's just the purely functional languages that I can't stand, because they elevate dogma above everything else.
C++ is the way to go in 99% of all real world applications. Learning script languages isn't nearly as useful.
If by "real world applications" you mean "GUI apps under Windows", sure. But that's hardly all real world apps.
>>115
Has never been a sysadmin.
C++ really is a pretty bad language. It's so big that people normally learn a subset of the language and program in that. Which can work great, until you try to collaborate with other programmers, and they use a different subset. I won't deny that C++ is useful, but the situations where you need it are becoming less and less, and I think that's a good thing.
>>115
Eh, what? A good number of my real world applications are indeed written in C++, but I'm also running lots of low-level software written in plain C, and lots of high-level scripts and applications written in PHP, Python, Perl, and ugh Java.
>>115
moar liek 30% amirite? i speculate that 30% is taken up by c programs, 20% is taken up by high level scripting/interpreted languages and the other 20% taken up by other compiled languages.
I have no idea what language should be taught, but I definitely know which language should be learned first: one you can start using right away, where you can experience your first successes within an hour of starting to fool around, and where experimenting with stuff is easy.
That means: no compilation, and as high-level a language as possible.
Obvious candidates are Python and Ruby, and both have the added advantage of being used in the real world.
>>115
Definitely, because unlike your time, these CPUs are really expensive!
Going only by your description, Visual Basic really is a much better choice than either of those.
Being a compiled language doesn't necessarily mean you can't learn quickly and reach success within an hour.
For example, QBasic is a compiled language, yet, IMO, it's easier to gain quick success with than with Python or Ruby.
>QBasic is a compiled language
But QBasic progs could only be run from inside the QBasic IDE, no? I don't think this truly counts as compilation, in the sense that it doesn't build a stand-alone application.
>>122
I think there's too much magic involved in those visual languages for a beginner to feel that he makes things happen, to understand what is going on.
>>123
I shouldn't have said "no compilation", more that it's important that the process is no more complicated than write and run - no setup before, no waiting after.
>>124
QuickBasic can compile. They're probably thinking of that.
I was of course not seriously suggesting Visual Basic, I just wanted to see what argument you would use against it.
You did say "As high-level language as possible". But then you dismiss Visual Basic as too magic. But isn't all high-level constructs magic? Are you sure you don't mean "suitably high-level", in that case?
"I'm against Visual Basic, now I just need some arguments against it for every practical purpose"
>>127
Yes, I should have stated my idea without big words. Basically, I think that for a beginner, it's important to feel in control.
I think that when you run a small script that writes something to a terminal, you feel more in control than you'd feel displaying it in a window using a visual language and its impressive IDE. (even if a lot is involved in displaying stuff to a terminal - it's purely on the psychological level)
Of course, I'm not arguing that using a GUI is a problem, only that keeping things simple is good because the learner feels in control.
But of course the best argument against VB is that it's a bad language that will teach you bad habits.
Picking up a language when you really need it is easy once you can already code, so I think that for a first language quality is more important than real-world usage, and choosing as a first a good language that is also used in the real world is better since mastering it will be a good investment.
My opinion is to remove the barriers to entry and needless frustrations. As I said in >>121, it's not that good an argument for what language should be taught, but I think it makes sense for which language somebody should decide to learn first: making the experience painless keeps you motivated.
>>128 I support VB, but in the back of my mind I hate it.
I am this close to issuing one of our very, very rare bans for you.
Not only do you post shit in every thread, start new useless shitty threads, but now you are impersonating other posters.
Stop it NOW, or face the consequences. Understand?
>>102
I can make my argument well enough without your strawmen, thank you very much.
My point was that pointers, though much maligned in this limp-wristed era of Java and oodles of other heavy leather mittens languages, aren't that hard to teach given an audience that is acquainted with something like MIPS assembly. In fact, to someone who knows what a memory address is, a pointer is just a hint for the compiler to produce code that offsets the pointer in a type-specific way. Consider that many universities teach assembly to EE/CS students before they go into higher-level languages.
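For example (a small sketch; the arrays are just filler), p + 1 advances by sizeof(*p) bytes, which is exactly the scaled addressing you'd do by hand in MIPS assembly:

#include <stdio.h>

int main(void)
{
    int    ints[4];
    double doubles[4];

    int    *ip = ints;
    double *dp = doubles;

    /* Pointer arithmetic is scaled by the pointed-to type's size. */
    printf("int step:    %td bytes\n", (char *)(ip + 1) - (char *)ip);   /* typically 4 */
    printf("double step: %td bytes\n", (char *)(dp + 1) - (char *)dp);   /* typically 8 */
    return 0;
}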
It's not that I don't see the utility of having a language like Python. But teaching something like that to a student without making clear what the correlation between machine instructions and programming language constructs is only serves to produce a class of people who prefer to think of e.g. compilers as magic and consider the use of a debugger something that only gurus have the skills to do; if it didn't work when they typed it out from the Great Big Cookbook of Black Magic then it's never going to.
What's the better way to teach someone to program computers effectively, then? Top-down, where you tell him that references are magical and exceptions Just Happen and generators are deep magic for when you grow up (close your eyes and think of England, and don't have too much fun because God is watching)? Or bottom-up, by letting the students figure the hows, whys and whats of abstraction out for themselves (or not, in which case there's always a market for COBOL grinders...)?
This all reminds me about a post on some forgotten message board about a young programmer who wanted to get into x86 assembly but was concerned that he might permanently fuck up his computer by accident. He explained that he had read somewhere that this could happen if one wasn't careful. Perhaps something like this is going on with pointers, today? "Ooh eek it's scary, it's frightening, it's something that Sun says is evil and outdated and hello, 1994 calling, they'd like their pointers back!".
(And sorry about the necropost.)
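For the curious, the "type-specific offset" idea is easy to see even without assembly. A minimal sketch using Python's ctypes module (Python only because the thread keeps recommending it; the base address and the 4-byte step are exactly what a C compiler would compute behind an int* for you):

import ctypes

# Four 32-bit integers stored at consecutive addresses.
arr = (ctypes.c_int32 * 4)(10, 20, 30, 40)

base = ctypes.addressof(arr)
step = ctypes.sizeof(ctypes.c_int32)   # 4 bytes per element

# "pointer + 2" is really "base address + 2 * sizeof(element type)":
third = ctypes.c_int32.from_address(base + 2 * step)
print(third.value, arr[2])             # 30 30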
> aren't that hard to teach given an audience that is acquainted with something like MIPS assembly
Which implies that they learned MIPS assembly earlier. That conflicts a bit with teaching C as a first language.
> Consider that many universities teach assembly to EE/CS students before they go into higher-level languages.
EE? I wouldn't be surprised. For what they do assembly is already a fairly high level.
CS? I'm not familiar with any university teaching assembly first.
> He explained that he had read somewhere that this could happen if one wasn't careful.
With good reason, I'd say. I don't recall MASM and TASM coming with libraries, which left you with several hazardous options.
So you want to write to files, do you? cue evil laughter
> > He explained that he had read somewhere that this could happen if one wasn't careful.
> With good reason, I'd say. I don't recall MASM and TASM coming with libraries, which left you with several hazardous options.
> So you want to write to files, do you? cue evil laughter
it's still nearly impossible to permanently fuck up a computer with software. and in those rare cases where it is possible, it's usually because of a hardware bug.
Well, there was the legendary destruction of monitors. In 1994 there were probably still a few vulnerable ones floating around. Having said that, yes, you're right.
My main worry when fooling with assembly was destroying my data. Fortunately it never happened, although I did end up with a non-booting machine several times.
C, just the cross-compatible stuff... If they learn BASIC first, they won't go back, I'd know...
Two problems here. At least.
1st. Students shouldn't be buried. It's inefficient & demoralizes 'em. This implies smallish lang, smallish ref doc, good tools, eg an IDE. Rules out trad bare editors for command line save/compile/debug/edit... cycle. The Stockholm syndrome may demand we make 'em suffer, but computing is too important for us to indulge.
2nd. The lang chosen should not teach bad habits. EWD has a real point. LISP's littered with parens, APL is a terseness too far, & Forth requires simultaneous concern w/ the contents of the stack & much higher-level issues like getting correct information to (or from) disk. Java is so careful it's a feather pillow -- you're buried in fluff, teaching newbies bad expectations. Ditto many of the special purpose languages, eg for Web work. C's syntax is ad hoc and confusing (so's PERL's), and pointer arithmetic (as in C) is a bug farm. C was built for a group of users who were the intellectual peers of Thompson and Ritchie. Few of us are, and no students can be expected to be.
3rd. Don't force teachers to reinvent wheels, eg first-class teaching materials. They should already be available. Algorithms are a central issue in coding, and part of teaching a first lang should be an appreciation of algorithm design and evaluation. The lang shouldn't be so difficult that algorithm structures and mechanisms are obscured.
MUMPS, PILOT, and LOGO are all small, but too special purpose or too unusual. APL, Prolog, Simula are too far from the mainstream to carry over much to actual problems. C++, Ada, PL/1, Smalltalk, and Java are too large for a first lang. Intercal, and other Retro-Computing Museum weirdies, are only for the masochistic; sort of like COBOL and Fortran. Maybe as comic relief? The functional langs (and partial ones like LISP) are too exotic for a first lang, even if they are theoretically cool. Assemblers (for any CPU) are too detailed to be even partially plausible. Even MIX. One of the BASICs might be adequate (I think not VB in any form), but they tend to have some specialized feature that's been over-stressed and so aren't very general purpose.
I'd suggest Modula-2. There's a truly superb text, by K. N. King, in print and in use for a very long time. A very large point in most teaching situations. It's a small lang (the Modula-2 Report is slim), it was specified in BNF (a chance for some teaching about lexing/parsing), and it has no obscure syntax like C's operators. Obfuscated code is next to impossible; students won't acquire bad habits in the Dijkstra EWD498 sense. Yet it has both pass-by-value and pass-by-reference (w/o going thru memory mapping/allocation/deallocation; none needed to make a pedagogic point), and has access to bare hardware if needed to show device register operations. The data types offered are fine to illustrate programming's interaction with the gritty reality of underlying bits. Both are good things, in some intro courses and for some purposes. And it has a sort of multi-programming (the coroutine and monitor provisions) which can be a useful entry into teaching a bit about concurrency; there's a rough sketch of that idea after this post. This will help introduce processes, threads and such, if needed.
If Modula 2 is not on, perhaps as insufficiently OO, try Oberon (or Oberon 2). Full OO, just not the Java, C++, or Smalltalk way. The Oberon Report is much smaller still, for those concerned. Oberon and its relatives are almost as clear in their code (though a newbie can twist anything into a pretzel). Lots of free versions, and even an entire operating system/lang/utilities collection for Intel as well.
My favorite term-long project is to require students to read Mythical Man Month (Brooks) to inject some reality, and to translate most of the code in Software Tools (1st RatFor edition) into the lang they're learning. Groups work on diff chunks to allow the whole thing to be done in a term. With more care and thought, in two terms. When they're done, or as extra help, I have them implement/compare some of the algorithms in Sedgewick.
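(Not Modula-2, but a rough analogue of the coroutine idea mentioned above, sketched with Python generators since that's the thread's favorite: two tasks voluntarily hand control back to a tiny round-robin scheduler. Modula-2's coroutines do this with real execution contexts; this only shows the shape of the concept, and the names are made up for illustration.)

def task(name, steps):
    for i in range(steps):
        print(f"{name}: step {i}")
        yield                        # voluntarily hand control back

def run(tasks):
    queue = list(tasks)
    while queue:
        current = queue.pop(0)
        try:
            next(current)            # resume until the task's next yield
            queue.append(current)    # not finished yet: reschedule it
        except StopIteration:
            pass                     # task finished; drop it

run([task("A", 3), task("B", 2)])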
You're forgetting one incredibly important thing: Nobody wants to learn something useless. It doesn't matter how educational it is, if you can't make "real" programs in it, people will complain non-stop about having to put in the effort to learn something they can not use later on.
>>139
Yeah, agreed. I was going to post, "But hey, CVSup is written in Modula-2", but it turns out that's Modula-3. (http://www.cvsup.org/faq.html#notc)
>>138
I don't understand your "buried in fluff" complaint. Explain?
Ignoring that one, what about Lua (http://www.lua.org/)? It's simple, approachable, and has real-world use as an embedded scripting language.
>>140
You'd think the garbage collector would take care of all that fluff.
I'm guessing Lua, Python, etc. were covered by "many of the special purpose languages." Is 'sane programming languages don't make the coder deal with memory management' a bad expectation?
I don't think that's a bad expectation. It's important to know how memory management works, and many of us in this thread have agreed that C is a good second or third language. But when first learning to program, one shouldn't have to face the bookkeeping.
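To make the "language does the bookkeeping" point concrete, here's a tiny sketch (CPython-specific: its reference counting reclaims the object the moment the last reference disappears; other implementations may wait for the garbage collector; the Node class is made up for illustration):

import weakref

class Node:
    """A throwaway object so we can watch it get reclaimed."""

n = Node()
watcher = weakref.ref(n)   # observe the object without keeping it alive
print(watcher())           # <__main__.Node object at 0x...>

del n                      # drop the only strong reference
print(watcher())           # None: the runtime freed it for us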
>>138
your comment about no student should be expected to be the intellectual peer of someone or other is short sighted. better by far to aim too high and fall short than to aim too low and never advance past success.
the best intellects in the world had to learn once upon a time; you'll never add to the pool if you go around preaching defeat before anyone even starts.
>>139,143
If the goal is an introduction to thinking in computer logic, then languages that give the user quick and satisfying results, such as Logo, Squeak and Kid's Programming Language, are great. Languages such as Pascal are fine too. If the goal is to teach them to think in terms of computer-science-level algorithms, then you use a lower-level language along the lines of C. If the goal is to teach the person how to become a general-purpose programmer, e.g. a junior-level programmer on a company IT software project, use a higher-level language that doesn't waste the programmer's time by requiring them to think at a low level. Languages such as Ruby, C# or Python are great for this.
> it's still nearly impossible to permanently fuck up a computer with software. and in those rare cases where it is possible, it's usually because of a hardware bug.
Unless you are using a BIOS designed by some kind of wacky monkeys, accidentally flashing the BIOS is pretty much impossible.
For me, it depends on what you're teaching: to program, or about programming. In teaching to program, there is a huge class of languages (including C/C++, Java, C#, Python, Ruby, etc.) which offer variations on the same experience (mutable state, strict evaluation, etc., etc.). I'd like to think that my pedagogy will be more important in teaching my students to program than the tools we use.
When teaching about programming, on the other hand, I think that there are significantly fewer options. Depending on the school, many of the students won't have any experience with code that isn't imperative/OO. If you're going to teach about programming (an introduction to PL theory, for example) then you either need an extremely rigorous and flexible language, or you need lots and lots of languages. For me, I'd choose Haskell (though Scheme [witness: SICP, The Reasoned Schemer, etc.] and CL can handle all of the same techniques and content, just with much reduced safety, and Oz is perfect for such an approach [witness: van Roy and Haridi, 2004, Concepts, Techniques, and Models of Computer Programming]) for the sole reason of monads. Implementing imperative, OO, non-deterministic, and logic programming yourself really helps you get to grips with their intricacies. Furthermore, many students won't have used a language with a strong type system before (no, C/C++, Java, etc. don't count); the main reason people rail against type-checking is that their only experience of it is Java's rather pathetic type system. Algebraic data types, pattern matching, type classes, and type inference give a much more compelling case for type-checking than anything from the C/C++, Java, etc. world (a rough illustration of the first two follows after this post).
To link this back to the original question, I don't see why the first case (teaching to program) and the second (teaching about programming) must necessarily be distinct. I plan to try a "fundamentals" approach to teaching programming (in Haskell) next year if I can get a good set-up and presentation figured out in time.
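(Not Haskell, and anachronistic for this thread, but for anyone who has never seen algebraic data types or pattern matching, here is roughly what they look like, sketched in Python 3.10+ with dataclasses and the match statement. Lit, Add and evaluate are made-up names for illustration, and Haskell's type checker gives far stronger guarantees, such as exhaustiveness checking and inference, than this sketch does.)

from dataclasses import dataclass

@dataclass
class Lit:                    # a literal number
    value: int

@dataclass
class Add:                    # the sum of two sub-expressions
    left: "Lit | Add"
    right: "Lit | Add"

Expr = Lit | Add              # a small "sum type" of expression shapes

def evaluate(expr: Expr) -> int:
    match expr:                          # structural pattern matching
        case Lit(value=v):
            return v
        case Add(left=l, right=r):
            return evaluate(l) + evaluate(r)
        case _:
            raise TypeError(f"not an Expr: {expr!r}")

print(evaluate(Add(Lit(1), Add(Lit(2), Lit(3)))))   # prints 6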
> I'd like to think that my pedagogy will be more important in teaching my students to program than the tools we use.
To an experienced programmer, switching languages is like switching pants. One easily forgets that this is most definitely not so for a novice programmer, and the particulars of the language are indeed very important.
As for the rest of the argument, I'd opine that functional languages are great for teaching functional programming. For teaching about programming in general, I am pretty sure your first option of using many languages is the only really useful choice.
>>149 DQN is that-a-way ->
Start with C or assembly language to show people how the machine actually works at the lowest level, then show Java to see a language at a very high level [good for software engineering concepts], then introduce various popular languages like Python, Perl, etc.
I wouldn't call Java a particularly high-level language...
it wouldn't call you one either
Well, Java does implement a lot of the drawbacks of high-level languages!
In addition to those of low-level language, and some it made up itself.
>>147
Saying that Common Lisp and Scheme can do all that Haskell can do is like saying that PL/I can do everything Lisp can do.
*summons the dead
I like the notion of introducing the concepts of programming first, using languages such as Smalltalk (Squeak), Python, Ruby, Perl, Scheme or what have you. Leave out the complicated low-level stuff at first to give them a general feeling of what it is like to write and think in code. Then you can introduce a 2nd language such as C or C++ to teach them about efficiency and real-world applications, as suggested already, but with a twist: combine both languages.
My idea is this: if you taught them one language, finished it and then started with another, the students might forget about the first ("Hey, we're not using language 1 anymore, time for some garbage-collection of my brain to make space for the new one haha pun."). Like Waha said, learning another language when you only know one is hard, since you don't have the fundamental concepts abstracted and safely stored away in your brain. Since both high-level (1st) and low-level (2nd) languages are useful in the real world, why not have them complement each other?
Ruby, Python, Perl, Scheme (Guile) and other languages have C APIs to hook code into the interpreter. The students could, e.g., learn to optimize certain programs in C while retaining the maintainability or whatever of their high-level programs, or how to use external libraries to make the task at hand easier (a quick sketch of the idea follows after this post). They don't need to do this all the time, just often enough to understand why knowing the right tool for the job is important.
I know this sounds somewhat abstract, but hey, I'm not a teacher.
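The quickest classroom demo of that mix-and-match idea is probably not writing a C extension module at all, but calling an existing C library from the high-level side. A minimal sketch with Python's ctypes, assuming a Unix-ish system where the C math library can be located:

import ctypes
import ctypes.util

# Locate and load the system C math library (name/location varies by platform).
libm = ctypes.CDLL(ctypes.util.find_library("m") or "libm.so.6")

# Tell ctypes the C prototype: double sqrt(double);
libm.sqrt.argtypes = [ctypes.c_double]
libm.sqrt.restype = ctypes.c_double

print(libm.sqrt(2.0))   # 1.4142135623730951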
if you dont mind a suggestion, it wasnt too long ago i was student myself n i remember the biggest fault when they started to teach was nothing but theory. even if we did code alot, me n most of the ppl in the class started to feel like "whats the point?, sure i can make my puter write hello world 5000times but what use is that to me?". so i was thinking maybe it'd be better to start with a lang that makes it easier to understand the uses, what i'm talking about is webbased langs like php. php is abit lax with variable declarations n such but it is alot like c++ and if your students already have learned html it would be alot easier for them to see the uses of programming.
last point also, dont even try to have them learn oop right off the bat it'd confuse them alot i imagine..
>>157
you were in the wrong class
>>158
Yes, he was. OOP from the beginning? Was your teacher mad?
As for what lang, I'd definitely say Python. It's, as 157 (or Ninereeds, bah) said, a "lang that makes it easier to understand the uses" (it's actually pretty much like php, but even easier to read and code).
My first language was Pascal. I mean, come on! Why would someone teach a language which has no use? When I moved to C... T_T it got really hard. But, as I see it, Python is even easier to code than Pascal, it has several times its power, is open source, is multiplatform, doesn't force OOP on you (not like Java, which would just be ridiculous to teach to newbies) and can incorporate C. When your students become strong in Python, then you can give them C, C++, a little Java, and PHP, ASP, CGI or whatever application lang you desire. They'll use Python until they die, tho ^^.
There are a couple of great textbooks I'd recommend if you decide to use Python:
1. How to Think Like a Computer Scientist: Learning with Python
http://www.amazon.com/How-Think-Like-Computer-Scientist/dp/0971677506
That also can be found for online reading at its website,
http://www.ibiblio.org/obp/thinkCS/python/english/
2. Python Programming for the Absolute Beginner, by Michael Dawson
http://www.amazon.com/Python-Programming-Absolute-Beginner-Michael/dp/1592000738
I don't know if there is an online reading site for it, but it's a fantastic acquisition. It might be the best Python book ever, because it's fun to read and all ^^.
What's wrong with Pascal?
Start with teaching them that a programmer's job is to solve problems. Teach em logic and how to think. Then let them code, I'll recommend python for the coding part.
>>160
Sorry if I was a bit harsh with Pascal, but... well, it's hard to apply it in real life. You might find yourself using Object Pascal with Delphi, Kylix (which is Delphi, after all) or Lazarus, but... well, why would you want to use something like that if you can build the same thing with a simpler, multiplatform (that is, also on Mac OS) language?
That's why I'd rather use Python.
>>161
You certainly are right.
That kind of reasoning only makes sense to someone who can already program and has forgotten that they once did not know how. If you do not know what tools you have or how they work, you can never learn how to solve problems with them.
FIRST you learn to use the tools, THEN you learn how to apply them.
I've never used Delphi, but everyone I know who used it loved it.
Apparently Free Pascal incorporated most of Delphi's features, and it's cross-platform. The binaries it produces are quite efficient too.
Getting a job with Pascal, particularly outside of maintenance, is another thing though, which is a bit sad.
I'm a big noob in programming, and actually it's the first time I visit /Programming/ here on 4ch, but what I can say is that playing a little with RPGMaker before knowing anything about programming made it much easier. RPGM is way simpler, there's no comparison, but it introduces you to the computer way of thinking (like with variables).
Plus, you can make games.
warota.
RPG maker is Ruby right? can we call it a toy language now?
>>163
teach them to think and solve problems and they'll teach themselves to rtfm
You're still thinking as someone who already knows how to program. You can't teach someone to "think and solve problems" with just words.
Do you think you can teach a man to paint without him ever touching a brush? Do you think you can teach an architect how to design and build a house without him ever trying to draw one? Do you think you can teach a car mechanic how to repair any car without ever letting him touch a car first?
C, perl and/or shell scripting
Once the student decides on their track, get more specific:
Higher level - C++, Java, OOP, etc.
Web Db - MSSQL, MySQL, Oracle
Web - Perl, PHP, Python (iffy, it's falling out of use), ASP.net, put coldfusion on hold for now. Rails might be on the curriculum in 5 years if it's still alive. Java's a valued skill and javascript is a plus even for the backend programmer to know.
Low level - ASM, C, etc. natch.
Games? Java, OOP, Graphics, graphics with C, ASM, etc. It's a whole subset of programming IMO.
>>168
I'm thinking as someone who taught themselves qbasic at lunchtimes out of an old c64 basic manual and the builtin help because they wanted to learn, not because someone held their hand and made them do exercises.
Nobody made you do it, but you did do it by yourself, with a manual and the builtin help in hand.
How much would you have learned if you hadn't had a basic manual and the builtin help, but instead a book on algorithm design?
>>171
who said anything about books on algorithm design?
>>168
Bah. You can't give him a brush and expect him to paint right away, right? You have to show him how to move his hand. There's a reason the theory is called "the basics".
I'm replying to the people who say stuff like
> Start with teaching them that a programmer's job is to solve problems. Teach em logic and how to think. Then let them code, I'll recommend python for the coding part.
and
> teach them to think and solve problems and they'll teach themselves to rtfm
here.
python is the bob ross of programming languages
It, uh, it has happy binary trees?
>>176
do not traverse happy binary trees!
by far the best option in my opinion is python, as it is almost as easy as pseudo-code.
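for what it's worth, a classic first exercise really does read close to the plain-english description of the algorithm. a small sketch (guess-the-number, just as an example):

import random

# Guess-the-number: the code is nearly a transcript of how you'd explain it.
secret = random.randint(1, 100)
guess = None
while guess != secret:
    guess = int(input("Guess a number between 1 and 100: "))
    if guess < secret:
        print("Too low.")
    elif guess > secret:
        print("Too high.")
print("You got it!")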
On the contrary, easy is what you want to avoid when teaching people. They have time to become jaded and satisfy their comfort-seeking weakling needs later in their careers (if any).
Therefore assembly is the best language to teach to newbies. Teaching implies guidance, which combined with something supposedly scary like pointers or complete lack of typing besides "byte, word, longword, quadword" will produce a kind of confidence that Java programmers can only dream of. Once "bare to the C library" assembly has been used to study the basic concepts of computer programming, higher-level languages may be taught with the promise of the compiler catching type-safety errors and not having to do function calls and register spills by hand.
Contrast the end product of a survivor of such a course with the sort who's only ever been taught "heavy gloves so you don't hurt yourself" languages like Java and (as some in this thread propose) Python. Which one would you rather hire? The old-school guy who's got confidence out the arse, or the scaredy-cat Java grinder who has nightmares about the big bad NullPointerException and who doesn't understand why the garbage collector won't catch his file descriptor leak?
I guess that is why Logo is used to teach kids in elementary school and kindergarten.
As far as your perfect candidate question, I think that I would hire this guy http://norvig.com/21-days.html
Now, he is 'the old school survivor' maybe because there was no Python back then, but with his extensive experience, he is still recommending Python. Just like me ;)
'become jaded and satisfy their comfort-seeking weakling needs later in their careers'
that was a nice try...
whatever!
Hey, I couldn't find pointers, garbage collection, or descriptor leaks in Logo: http://el.media.mit.edu/Logo-foundation/logo/programming.html
Shall I call them and tell them to change their curriculum? Their kids just won't have any confidence without those things, and won't be able to produce anything good in the future.
Hey! Why isn't gravity taught with Einstein's relativity stuff, as well as Newton/Leibniz's calculus, instead of starting with more trivial examples, like some apple falling on someone's head? All those educators, what were they thinking?
Quit yer mumblin' and provide me with counterarguments.
>>179
Unfortunately, teaching isn't just about imparting knowledge. You also have to maintain the interest of your students. Assembly programming is tedious and unrewarding; new programmers won't care that it's an excellent way to learn the fundamentals and practice basic algorithm design, they want to do something other than screw around in text mode. This is why Logo is considered great for teaching: its easy-as-pie graphics functions make learning the concepts somewhat more exciting for kids (Python's turtle module, sketched after this post, descends straight from them).
Try running your assembly super-programming course, but don't be too surprised when 90% of the class drops out after the first two weeks and re-enrolls in "Making 3D Games in Visual Basic.NET 101."
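(For the record, the "quick, visible results" argument carries over from Logo to Python almost unchanged, since the turtle module ships in the standard library. A minimal sketch:)

import turtle

pen = turtle.Turtle()
for _ in range(4):        # draw a square, one visible step at a time
    pen.forward(120)
    pen.left(90)

turtle.done()             # keep the window open until it's closed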
>>185
signed
Python will be the most obvious answer since it is almost as easy as pseudo-code. Once they've learned Python by heart, learning other languages will come naturally.
>>185
I think 90% of people just can't program. The basic concepts come easily to some and are extremely difficult for others to wrap their minds around. I don't claim this to be fact, but it's my experience, and from what I've read, I'm not the only one with that experience.
So, there's something to be said for an interesting and engaging class, true, but there's also something to be said for one that weeds out the people who just aren't cut out to be programmers.
Oh, to make this clear, though, I completely agreed with your first paragraph.
You should learn penuscode first
i think python would be best. make them program something they will use.
Everyone thinks something would be best. I think emacs lisp or LOGO are a good thing to learn first. I also think old micro-basic would be good except it's impossible to get at anymore (unless you use an emulator).
I miss the good ol' days of hacking at Nibbles with QBasic. In DOS.
Qbasic was all right, but wasn't quite as immediate as LOGO or lisp (or even other basics). As a result, I think ipython http://ipython.scipy.org/ would probably be a better choice than Qbasic simply because it's more interactive.
QuickBasic had an immediate mode. The online help was fantastic (F1 any keyword and up pop cross-linked discussions and examples), and you could stop/resume a program anywhere without changing the source (including on errors), and a host of other goodies.
Not exactly Genera- or Smalltalk-quality, mind, but something most language implementations nowadays are nowhere near doing. A pity.
I forgot to add you could stop the program and change the source too before resuming. At least in interpreted mode anyway.
QuickBasic's problem is it's long in the tooth and mostly irrelevant.
> The online help was fantastic
Very yes. Very, very yes.
I may have grown to hate Basic dialects with a passion, but I owe my entire passion for programming to that help.
Although Python does have a help() function in its interpreter, and the REPL encourages experimenting more than QBasic's Immediate pane did (there's a tiny sample session after this post), it still has much to improve before it's immediately accessible to the newest aspiring hacker.
(Especially Nibbles. Python needs a Nibbles -- with code that's stupidly simple to understand.)
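A tiny sample session of the kind of prodding the REPL and help() invite (output mostly trimmed):

>>> help(len)                   # built-in docs, right at the prompt
>>> len("nibbles")
7
>>> [c for c in "nibbles" if c in "aeiou"]
['i', 'e']
>>> help(str.split)             # works for any object or method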
By the way, Pascal was meant to be compact and have a small compiler. Its BNF grammar is about half a page long. It used to be the favorite for compiler design courses. A subset, PL/0, has a compiler whose source you can read through pretty easily. The author, Wirth, did not intend Pascal to be used for commercial projects. He designed Modula-2 and Oberon for that.
>>45 just as use of variables shouldn't need to be taught in mathematics at a university
...just...just stop. Right now. That wasn't funny.
I'll bet you don't even have a clue what the difference is between variables that express arbitrariness and those that express generality. Arbitrary variables and general variables. And if you want to step into the realm of formal logic, you'll need to further divide up your variables with adjectives like "free" and "bound."
Variables shouldn't be taught... that's a good one. Perhaps you'll say next that methods of proof don't need to be taught either? Let's just leave mathematical induction to intuition, shall we? Good luck getting through complicated proofs that way.
The post you are replying to is three years old.
Wow. What a slow forum.
he was probably high.
Oh no, this is perfectly ordinary on this kind of messageboard. It's good if discussion is kept in one thread, or when that thread grows to 1000 posts, in a group of similarly named threads.