Exactly. So the student can learn all about computers, from the top to the bottom; from his first and almost magical "Hello, World!" to the lowest level where the secrets of machine code, stacks, memory management, etc., reside.
But they aren't learning anything about "the top"; they are learning only the bottom. By building on a low-level language, you guarantee that students stay mired in the low levels.
I haven't had those problems with any of my students, many of whom have gone on to lucrative careers as professional programmers, independently employed. Perhaps you're teaching the subject in an ineffective way. The key, in three words, is this: TEACHING IS TELLING. The so-called Socratic method is bunk. You want to transfer the working neural patterns in your brain into your student's brain -- not ask the student to reinvent the wheel. So you show him a problem, then you hand him the solution on a silver platter and have him type it in. (Our standard dual-monitor setup, where the teacher controls the mouse and the student controls the keyboard, is ideal for this kind of training.) Then you do it again, with the next problem. And again, and again, and again. When you see that the student is typing the right stuff before you tell him what to type, you're done.
Good job, you've trained code monkeys who are incapable of original thought! Seriously, this method of teaching will only result in students who program in a style very similar to their teacher's. Quite frankly, I've seen this method employed before and it rarely (if ever) turns out someone who actually understands the WHY behind what they are doing.
Actually, more like Pascal or Oberon. But clearly that's all that's needed, both for beginner and professional.
No, those languages have WAY more features than your natural-English garbage. The similarity is only that those languages focused less on symbols and more on using English words.
If it wasn't enough, how could we write an entire integrated development environment (including interface, file manager, text editor, hex dumper, native-code-generating compiler/linker, and WYSIWYG page layout facility for documentation) -- conveniently and efficiently -- with just that and no more?
Oh? So the proof that a language is great is that you can do complex things with it? All of these things you have done were done with assembly just as efficiently. Were those assembly languages enough? No way! Just because you've made a compiler and a WYSIWYG editor in your language says nothing about the goodness of the language. The only thing it proves is that the language is Turing complete and that it has someone dedicated enough to it to make these things.
Just about every popular language out there (and several pretty unpopular ones) has been used to make the things you have made here. So the next question is why a professional programmer should want to use your language. As I've written, your priorities are completely misplaced.
Making complex things says absolutely NOTHING about the quality of a language or programmer. You can get the same results with a power screwdriver as with a manual screwdriver. Yet the tools are fundamentally different.
The beginning student doesn't need to see any of the low-level stuff. That's why, for example, we manage memory for strings automatically. But we do want the more advanced student to understand memory management, so we expose that level of detail when the student is ready to study more advanced data structures.
Great, this shows that you have made a purposefully inconsistent language. Good job on that one. Now "the student" has to figure out when memory management takes place and why strings are treated differently from everything else. The answer? "Well, we wanted this part to be easier, but we added booby traps because it makes people better programmers, right?"
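To illustrate what that split feels like, here's a rough C++ analogy (C++, not Plain English -- and the exact boundary in your language is my assumption): strings clean up after themselves, everything else is on the programmer, and the learner has to discover where the line sits.

```cpp
#include <iostream>
#include <string>

int main() {
    // Automatic: the string's buffer is allocated and freed for us.
    std::string greeting = "Hello, World!";

    // Manual: this buffer is the programmer's problem.
    int* scores = new int[100];
    scores[0] = 42;

    std::cout << greeting << " first score: " << scores[0] << "\n";

    delete[] scores; // the "booby trap": easy to omit, silent when you do
    return 0;
}
```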
Yet smart enough to develop the non-trivial program under discussion. Smart enough to teach others who are now gainfully self-employed in the field. Smart enough to be self-employed in the field myself for over four decades.
I know plenty of idiots who teach programming and have been employees for decades. They have even made non-trivial programs. Their programs are crap and they don't know why. The lessons they teach are useless, but they see them as successes. Given the rest of your post here, I'm guessing you are one of these dinosaurs who couldn't wrap their minds around programming concepts much more complex than methods and variables. You don't understand the higher-order programming concepts, and so you write them all off as useless garbage.
To be frank, 40 years ago anyone could have been a programmer. The field was so new that we really didn't know what made a programmer good or bad. Being employed for 40 years says very little about your abilities as a programmer. Your posts here tell me the whole story. Congratulations, you are a terrible programmer with delusions of grandeur because you've managed to stay in the field for a long time.
That's not true. Memory leaks (including the number of "drips") are reported when a program terminates. The developer can then use our standard debugging facilities and techniques to find his error. Besides, in practice it's hardly ever a serious (or even frequent) problem -- at least not for a student trained as we train them.
It certainly is true.
All types and routines in Plain English are global in scope, and variables are either global or local in scope. Each project is also its own scope. This is sufficient for our needs: for teaching beginners how to write simple programs, and for teaching advanced students how to write sophisticated interfaces, editors, compilers, etc. In fact, it encourages the student not to let a program get too big, and not to integrate too many things into a single executable.
Ahahahaahahahaahahaaha!
You are basically admitting here that your language ISN'T suited for large complex application development. Love it.
Yeah. This is why global variables with no namespacing are such a bad idea. Thanks for exemplifying it while sanctimoniously remaining completely oblivious to the problem.
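For anyone following along, here's a minimal C++ sketch (names invented for illustration) of what namespacing buys you and what a single global scope throws away:

```cpp
#include <iostream>

// Two "libraries" can both offer a routine named draw() without
// colliding, because each lives in its own namespace.
namespace editor {
    void draw() { std::cout << "drawing the editor pane\n"; }
}

namespace canvas {
    void draw() { std::cout << "drawing the canvas\n"; }
}

int main() {
    editor::draw(); // unambiguous: the qualifier says which one we mean
    canvas::draw();
    // With a single global scope, the two draw() definitions would
    // clash and one library would have to be renamed throughout.
    return 0;
}
```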
Yes, the developer copies the libraries (or the parts of the libraries) he needs into his project directory. (There is no separate "project management" facility in Plain English.) This ensures that the developer's project will remain stable and operational even if the source library is "improved" at a later time. It's similar to the way our DNA is carried along in every cell of our bodies.
Got it, abstraction bad and scary! Why would we ever want to reuse code without having to copy and paste it everywhere?
It isn't like code ever has bugs in it... oh... wait.
So tell me. What happens when the library you are copying and pasting inevitably has a security vulnerability, memory leak, or just bad logic? You now have to go to every project you've ever copied and pasted that library into and hope to god that nobody has touched the library logic. Yeah... great... And the fact that you don't see this as a problem is hilarious. Talk about not seeing the forest for the trees. Happily, you have so very few converts to your programming cult, because I shudder for the poor developers who have to maintain the steaming pile of garbage that is this natural-English programming language.
At the time we wrote our prototype, most computers were x86s running Windows, so we chose that as our target. And sure, we could have inserted an intermediate virtual machine into our architecture, but that would have unnecessarily complicated the system for the student, and slowed performance. As I've said before, the elegance of a solution has to be judged based on the goals set for that solution -- not the personal goals of an outside observer like yourself.
So x86 isn't confusing, but an intermediate bytecode would be? Umm, yeah... I would just say look up LLVM for a great example of a low-level bytecode just as simple as x86 that targets multiple platforms while producing high-performance assembly. It is certainly doable and has been done.
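To make that concrete, here's a trivial C++ file; the clang flags in the comment are real, though the file name is just for illustration. One compile step gets you portable IR instead of x86 machine code:

```cpp
// square.cpp -- a sketch of targeting LLVM IR instead of x86 directly.
//
//   clang++ -S -emit-llvm square.cpp -o square.ll   (portable IR)
//
// LLVM's back ends can then lower the same square.ll to x86, ARM,
// and other architectures, with no change to the source.
int square(int x) {
    return x * x;
}
```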
It's a wonder, then, that our code is so simple and reliable. In over eight years of productive use, only a handful of inconsequential bugs have turned up, and a serious defect in the compiler is yet to be found.
The language itself is overly simplistic and offers very few useful features to a programmer. That a compiler for a simple language doesn't have serious defects isn't really that surprising and says nothing about the goodness of the language.
Not needed for our purposes. And our disdain for the object approach (verbs inside nouns indeed!) has been documented elsewhere in this thread.
What does immutability have to do with objects? (hint: nothing).
The imperatives that fall from the lips of everyone on the planet, all day, every day, are typically procedural, not functional, in nature. We thus adopt a strictly procedural paradigm for our natural-language programming system. If it's not for you, fine. The "rest of us" are going to stick with the paradigm that has served humanity well for all 6000 years of recorded history.
Yet mathematics is fundamentally founded on functional concepts (simple algebra is functional in nature), and it has been the gradual adaptation of those concepts that enabled our rapid post-Dark-Ages technical advancement.
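Case in point: composition straight out of algebra, (f . g)(x) = f(g(x)), sketched in C++ (my example, not anything from your system):

```cpp
#include <functional>
#include <iostream>

// Algebraic composition: (f . g)(x) = f(g(x)).
std::function<int(int)> compose(std::function<int(int)> f,
                                std::function<int(int)> g) {
    return [f, g](int x) { return f(g(x)); };
}

int main() {
    auto addOne = [](int x) { return x + 1; };
    auto twice  = [](int x) { return 2 * x; };

    auto h = compose(addOne, twice); // h(x) = 2x + 1
    std::cout << h(5) << "\n";       // prints 11
    return 0;
}
```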
We have advanced as far as we have not because a team of workers learned how to read instruction books, but because specialists first proved theoretically how to do something and later transformed those concepts into actions that they themselves or others could perform.
The future of computers isn't making them dumb procedural executors, but rather expanding their capabilities to understand and handle higher order concepts.
Actually, we are forced to deal with concurrency in some spots -- the same file opened on more than one tab, for example. And we do. But it's not a major area of concern for us or our intended audience.
Who is your intended audience? Beginners? Then why the hell are you including such low-level concepts? Why is concurrency taboo yet assembly is OK? Is it, perhaps, that concurrency is a hard problem not easily represented by strictly procedural languages? Sorry, but any language hoping to be mainstream needs to handle concurrency elegantly; the fact that you ignore it and say "Well, our intended audience doesn't care about this!" shows some big deficiencies in your language. Oh, by the way, concurrency is something that is pretty well represented in the English language.
"Bill, open the door. Ted you get behind the goat, and jim you make sure the goat doesn't get away from us. Now go!"
That statement assumes that the parties involved will be concurrently doing things.
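That sentence maps almost word-for-word onto threads. A minimal C++ sketch (the task names are mine):

```cpp
#include <iostream>
#include <thread>

// One task per person, exactly as the sentence assigns them.
void openDoor()    { std::cout << "Bill opens the door\n"; }
void herdGoat()    { std::cout << "Ted gets behind the goat\n"; }
void blockEscape() { std::cout << "Jim keeps the goat from getting away\n"; }

int main() {
    // "Now go!" -- all three start at once, not one after another.
    std::thread bill(openDoor);
    std::thread ted(herdGoat);
    std::thread jim(blockEscape);

    // Output order varies run to run; that's the concurrency.
    bill.join();
    ted.join();
    jim.join();
    return 0;
}
```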
I'm not going to address the rest of the stuff about the English language; my wife is kicking me and telling me to go to sleep.