A forth draft

Posted 26 Sep 2002 at 12:51 UTC by badvogato

Hi, all. The following is my forth draft of an article, "Thoughtful programming and Forth". To my benefit, all three previous drafts have been executed by a mob of no more than 30 examiners. I thank them all for contributing tremendously to the birth of this forth draft. But since a 'forth' draft is more important than the 1st, 2nd and 3rd drafts, I'd like to take time to make sure it suits my audience's tastes so they can view it in a most favorable light. Keep in mind that the following draft's intended readers are more or less clueful Kuro5hiners. Their standards of good writing are very high. Cheers!

"Thoughtful programming and Forth"
Assignments:
  • social science students read Preface.
  • 1st year computer science students read Chapter 1
  • seniors in computer science, physics and maths read Chapter 2
  • religious & neo-religious study students read Chapter 3
  • future science-fiction writers read all of the above but write something better than what Chuck Moore is writing about.

    Respectful general readers at large, "Abstain" votes are reserved for you unless you have edited this article three times before or you have full confidence that you understand the following without any reservation:
  • What is at stake in fields of software and technology
  • In a 'free' society like ours, 95% of voters trivialize silence, incoherent voices, and incoherent thoughts, because they understand that 'free' carries the price and cost of supporting 95% of the population with only 5% of the resources. The remaining 5% of voters did not agree with the majority 95%. That 5% either have original ideas or they are nuts.
  • All I am asking is to give yourself a chance to try original nuts instead of eating English muffins day after day.
    Cheers

  • haha, posted 26 Sep 2002 at 15:08 UTC by beppu » (Journeyer)

    I liked the original version better. This new one is inefficient.

    The original nut, posted 26 Sep 2002 at 16:43 UTC by tk » (Observer)

    That 5% either have original ideas or they are nuts.
    All I am asking is to give yourself a chance to try original nuts instead of eating English muffins day after day.

    Hey, "original nuts" is a tautology! All nuts are original!

    try aboriginal nuts, posted 26 Sep 2002 at 17:11 UTC by sye » (Journeyer)

    with Gin without any DO loop.

    Suggested exercise, posted 26 Sep 2002 at 17:28 UTC by atai » (Journeyer)

    The Commodore 64 is an excellent platform for Forth development. Back in 1983 the SuperForth cartridge was described as the best high level language available for the C64, beating Basic 2.0. Suggested exercise for the students: write their programs to run on the Commodore 64!

    another Forth for Commodore 64, posted 26 Sep 2002 at 21:44 UTC by atai » (Journeyer)

    Cannot find SuperForth anywhere, but this may do:

    http://www.geocities.com/SiliconValley/Haven/1033/

    ..., posted 27 Sep 2002 at 01:16 UTC by tk » (Observer)

    sye: Then the sentence will become "That 5% either have aboriginal ideas or they are nuts." What are "aboriginal ideas?"

    atai: Exercise for Archaeology students? Nice. It'll show them how enlightened we are nowadays. In the ancient days, one had to write good code to be called a programmer. Nowadays, one still has to write real code, but not necessarily good (or even scalable to the real world). Very soon, a programmer will just need to write "Hello world" and troll a lot. Way to go in removing discrimination from the programming world!

    no no, posted 27 Sep 2002 at 14:11 UTC by sye » (Journeyer)

    the sentence will become "that tk neither has abnormal ideas nor is he a nut." He's just a loner like everybody else.

    tk? The King?, posted 27 Sep 2002 at 16:28 UTC by badvogato » (Master)

    I guess you are right.

    -4 version draft of a financial plan , posted 27 Sep 2002 at 16:38 UTC by badvogato » (Master)

    ok, you folks think the following plan will get us all rich? Of course, the lead developer has already been chosen by the King. But that choice doesn't stop anyone from aspiring to become a porter in the Queen's chamber. Cheers!

    > i thought about a basic financial plan for supporting our endeavor. Here's my
    > very rough estimation:
    >
    > 100 hours of porting at $80/hr = $8000
    >
    > solicit investors at $100 per interest. with $100 invested, he/she will get
    > a 64M secure digital card (retail price ~$50) with your code and documentation, plus
    > 4 licenses to share among his/her friends for the code and an obligation to tell
    > whoever is interested in our code not to abuse the license agreement ( that is to copy
    > freely from the original media).
    >
    > the first 160 investors will make our ends meet. I'll send you a check of $7000. I
    > keep $1000. If we are lucky enough to get 160 more, i'd like to get a 1-U rack
    > colorForth machine on the net, and all of the first 320 investors will be entitled to an
    > account for one year on that machine.
    >
    > After the first 320, i'll transfer the operation and all possible gains/losses to
    > Prof. Chuck Moore's friend Jeff Fox's company if they are interested in taking over
    > the project.
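    The quoted figures can be checked quickly; a sketch in Python (the assumption that the ~$50 card is the only per-investor cost is mine, everything else is from the plan):

```python
# Sanity check of the quoted plan's arithmetic.
porting_fee = 100 * 80   # 100 hours of porting at $80/hr
gross = 160 * 100        # first 160 investors at $100 per interest
cards = 160 * 50         # one ~$50 64M card per investor (assumed sole cost)
net = gross - cards      # what's left: the $7000 check plus the $1000 kept

print(porting_fee, gross, cards, net)  # -> 8000 16000 8000 8000
```

    So the first 160 interests do exactly cover the $8000 porting fee, as the plan claims.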
    

    Re: -4 version draft of a financial plan, posted 28 Sep 2002 at 03:25 UTC by tk » (Observer)

    Zhen grants you the right to leave investors to imagine the contents of each digital card. In the meantime, gems are nice.

    Straw men, posted 28 Sep 2002 at 04:16 UTC by mslicker » (Journeyer)

    If you can't make a point in a real argument, it might be fun to create a straw man to argue against. Also, in the event you can't make a point in a real argument, be sure to destroy all evidence of any attempt.

    Re: Straw men, posted 28 Sep 2002 at 04:35 UTC by tk » (Observer)

    Though a straw man may be made of straw, yet it bears the likeness of a man. What then is the difference between a straw man and a real man, when no action is involved?

    Dishonest men, posted 28 Sep 2002 at 04:57 UTC by mslicker » (Journeyer)

    I don't need to prove myself to you or anyone else. You are free to code your own wish list. If you have to argue against a straw man to make a point, it only reflects your general dishonesty.

    Re: Dishonest men, posted 28 Sep 2002 at 17:02 UTC by tk » (Observer)

    Turing shall be our judge. Amen.

    Turing, posted 28 Sep 2002 at 17:16 UTC by mslicker » (Journeyer)

    Give me a Turing machine, and Forth can address any problem the Turing machine is capable of. Give me a real machine, and Forth can address any problem the real machine is capable of. You have still failed to make any point.

    No..., posted 29 Sep 2002 at 02:23 UTC by tk » (Observer)

    ...not that. This!

    Sourceless programming, posted 29 Sep 2002 at 16:43 UTC by mslicker » (Journeyer)

    If you can program effectively in that system, more power to you! The only other person I've heard of who has coded in hex is Chuck Moore. His was a most impressive result. The first version of OKAD (his VLSI design tools) was coded in hex; he called it "sourceless" programming. Of course, the interface he used was much more sophisticated than what you would expect from an average hex editor. For those curious, there is a demonstration of OKAD from 1993. (Realplayer, mpg, html)

    He later declared it a dead end, self-limiting. That is the impression I get, at least. For me, I see strong advantages in source code, not the least of which is the ability to share your work with others.

    gems are nice but stones are hard and weiqi is fun, posted 30 Sep 2002 at 14:48 UTC by sye » (Journeyer)

    i like stone and salt better...

    C K Yuen - Superscalar execution of stack programs using a reorder buffer, posted 30 Sep 2002 at 15:08 UTC by sye » (Journeyer)

    C K Yuen is a straw man at the School of Computing, National University of Singapore, Kent Ridge, Singapore 119260. Buddha is dust also, so WHAT can't be a straw man? Anything that already has too much significance attached to it can't be a straw man in Buddha's marketplace. 4th reincarnates Buddha in ONE.

    In search of a relevant discussion, posted 30 Sep 2002 at 22:19 UTC by Alleluia » (Journeyer)

    Missing the point

    After reading Preface, Chapter 1, Chapter 2, and Chapter 3, and learning a great deal in a short amount of time, I expected to rejoin this conversation and learn some more. However, I completely missed the point of all previous responses above, excepting maybe the first one, which seems to have been a joke regarding efficiency: fitting, for an article about how to be more efficient.

    As a programmer, I appreciated the 10x, 10x, and 10x efficiency techniques: examination and re-examination of code before writing, carefully designed data structures, and recursive algorithm factoring. While reading, I found words for principles which I have known to be at the essence of good coding for years, but which I rarely find others describing. So this is exciting to me.

    I am concerned that the series of articles is entirely overlooked in the conversation above. I have no experience with Forth, and will look into it more. ColorForth is obviously a very efficient direction to take programming--I already use syntax highlighting to great effect, and replacing syntax with highlighting is the obvious next step.

    Getting the point

    The point of the article, and elegantly delivered according to some of its own principles, is that software "bloat" can be overcome by applying the refining methods that Charles Moore advocates. (Looks like Occam's razor applied to coding). However, it probably won't happen, because people see Chuck as being out of step with practical coding techniques, and thus write off the very keen advice he is giving.

    I also read through some of the footnotes, including discussion of how Chuck's ongoing editions of Forth, by eliminating unnecessary coding techniques in favor of simpler or more elegant solutions, alienated programmers who didn't want to "relearn" how to program, even though the relearning was in the direction of greater efficiency.

    Re: In search of a relevant discussion, posted 30 Sep 2002 at 23:50 UTC by mslicker » (Journeyer)

    Some of these messages are just personal tit for tat, which I would like to cease.

    You are right on about the point. If the methods of Chuck Moore have merit (and I believe they do), it could turn the whole computing industry on its head.

    There are many companies that benefit from slower, buggier, bloated software. Just think about it: PC makers, Microsoft, Intel, etc. Obsolescence and defect are at the heart of their business models. For software makers, it is providing fixes, upgrades, service contracts, etc. For hardware makers, it is providing faster, more capable hardware for ever slower, ever more bloated software. Their business models work together so well, you would almost think that they are running a cartel. In recent times, the hardware makers are pushing the PC more and more as a game platform. Word processing has ceased to be a major computational problem (as if it ever was!!), so they have moved on to more demanding software.

    Simple software and simple hardware would destroy these industries or at least force them to completely change.

    In the mainstream of the free software world, we are copying directly from this corrupt and dysfunctional industry.

    We should be smarter than this, much smarter! We should relearn, as you put it, the proper methods of software development. We have no reason not to have the finest software on the planet; we certainly have the resources. What is lacking is perhaps the strength to examine our own practices and to take seriously the methods being pioneered by the likes of Chuck Moore[1].

    [1] I'd really like to list more besides him, but he is really the only one pushing the limits of simplicity, advocating a viable alternative to bloatware. Please suggest others if you know of any.

    Mindful Programming, posted 1 Oct 2002 at 15:40 UTC by garym » (Master)

    It's an interesting read; it provokes one to ponder, and that is a Good Thing, although it does so with 10x the verbiage needed, which is a not-so-good-but-tolerable thing. It's also dead wrong.

    My first computer, the one that started this all, was a Forth machine. Ok, more accurately, it was a Sinclair ZX-80, which initially comes with a pretty darn amazing BASIC-like interpreter in its 8k ROM, but for a few dollars more you pop that out and pop in the FORTH ROM, and you now have (a) a better way to program small devices and (b) an industry-standard language. That latter feature, (b), is what kills the Thoughtful argument.

    When the 8086 came out, and for the next 10 years at least, about once a year I had to restrain myself from wiping the whole friggin' thing clean to the metal and installing FORTH. There was no logical reason not to do it, I was fed up and pissed off with Greedy Gates Guts and the mind-numbing cash-sucking DOS products, sick of TurboPascal, TurboC, TurboProlog and Terminal TurboNess. But I never did wipe that machine, not with all the new tools-rich FORTHs that emerged over the 20 years since my first exposure.

    Why? Because I am inherently a lazy person. As much as I'd love to take over the universe and improve it, it's not so easy as filling 10x as many pages with words as is needed to state a point. Why? Because what might seem efficient in the microcosm may not be efficient in the macrocosm.

    I'm reminded of that Monty Python segment (Holy Grail?) in the armoury where John Cleese says "I couldn't help noticing that it would be a lot more efficient if you moved your table here," and as he does, the rhythm of the shop is fatally disrupted, the whole thing levelled to the ground within minutes of the "thoughtful" change. What this essay misses, why it's here and not on the front pages of every tech journal, is mindfulness of the whole ecology in which programs sit.

    I didn't scrub my machines down to FORTH then, and I won't now, for the very same reasons I abandoned clever Cambridge BASIC for Forth in 1980: Industry standards, resources in the commons, and whole-system ecology. And it's not just FORTH, but the whole Holy Grail of efficient coding: If I use the horrific BIOS routines, it's at least easy for someone to someday fix all programs with one chip change; if we all code our own, it's a nightmare.

    An interesting aside to this:

    But for those who have been in those BIOSes and system software and seen how bad it can get, it seems like a shame to see people being forced to waste 90% of their investment in hardware or software because it means someone gets to charge more money.
    ... so where then is the open-source BIOS that these people on the inside of the BIOS have graciously coded as the replacement? (and to prove their bold assessment)

    The computer itself is a global collaborative effort, using materials and technologies from almost every nation on earth, combined in a near chaos of tenuous links. Software is no different and should be no different: no one cares that the Web is inefficient and clunky and bloated, because we need the information fix, and that objective, my friend, is more important than the vehicle of the journey. I don't discount that there are applications where blazing efficiency is paramount at any expense, but in the Real World such cases are dwarfed by the overabundance of just getting results.

    All that said, there is still some value to Chapter 3; it seems like stating the obvious, but it is true that an understanding of the hardware is important, at least for those with the luxury of knowing the hardware that will always and forever run their craftwork.

    Personally, I'd prefer it if all those smarts of what and why goes where and how were left to robots, to the compilers, to decide what's best for the deep machine, so it's plug-upgradeable across the board, so I can simply express my algorithms in the simple abstract purity so characteristic of a U of T trained CS student ;) and just let the machine decide how that abstraction would be best rendered on this specific hardware (the Holy Grail of Java, OCaml, Ruby et al). Realistically, however, as warm and fuzzy as it sounds in Theory, and although we've come a long way, we're not there yet, not by a long shot. Until we get to that Promised Land of self-optimizing compilers, coding to the machine is the best option we have in these primitive times, even though it runs the risk of painting itself into the same sort of corners that Linux blundered into by basing its aesthetic on the x86.

    Thoughtful programming , posted 1 Oct 2002 at 18:00 UTC by mslicker » (Journeyer)

    The BIOS is a waste of time for the most part; hardware already comes in too many varieties, and replacing the BIOS just multiplies the problem. That does not mean you need to use the BIOS: Linux and colorForth do not use the BIOS (colorForth uses it once to set the video mode). Still, the BIOS does many things essential for proper operation of the PC.

    Thoughtful programming is not about pitting individual efforts against collaborative efforts. Software complexity has gone well beyond the capability of an individual, especially on the PC. Whether thoughtful or not, programming useful free software systems will be a collaborative effort.

    On an individual level, Thoughtful programming is about perfecting one's skills. A compiler of any sort can only produce results within certain limits. The simple fact is, humans can always do better, sometimes orders of magnitude better. Thoughtful programming begins by rejecting abstract optimizing compilers. Rejecting these compilers is where most of the improvement comes from. It literally changes everything: the programmer is no longer left with speculation as to what is actually done with his/her program. The programmer now understands the machine, the limits and capabilities of the machine, and expresses the solution knowing full well the result.

    This approach does not simply produce a quantitative difference. This is extremely important to note. With a completely different approach comes a large qualitative difference.

    The primary focus of the mainstream of free software is: portability, compatibility, standards compliance, abstraction, generalization, fault tolerance, choice (languages, editors, libraries, etc.), proprietary equivalence.

    The primary focus of the thoughtful approach is simply writing software to accomplish a task. The task may imply implementing parts of various standards; however, the focus is not on the standard, it is on the application that uses the standard. Everything that is not important to the function of the application is brushed aside.

    With thoughtful applications, the software becomes transparent, an extension of the user. All the irrelevant things the programmer brushed aside, the user no longer has to deal with.

    Re: Thoughtful programming, posted 2 Oct 2002 at 05:14 UTC by tk » (Observer)

    I'd like to propose a method by which mslicker can further his noble cause of eradicating bloatware.

    First, set up a non-profit organization called the One True Church of Forth, which will have the writings of Chuck Moore and Jeff Fox as its Holy Canon.

    Next, gather your existing writings on the subject into a series of exegeses on the Canon. You may also wish to write a few more exegeses; allow me to suggest titles for some of them:

    • Resolving the Information-Theoretic Paradox: Why "1 2 +" (5 Bytes) Uses Less Space Than "1+2" (3 Bytes)
    • The Thoughtful Programming Methodology: How to Implement General End-User Systems With Your Thought Power Alone
    • Why the "Right Thing" Of Every Other School of Programming Thought is Wrong, and Forth Is the Right "Right Thing"
    • Detailed 100-Page Commentary on an 8-Page[*] VLSI CAD Application
    • The Computing Revolution Which Is Imminent, and Will Always Be Imminent

    [*] 500 lines is around 500/60 = 8.3 pages
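    For readers unfamiliar with the postfix notation the first title pokes fun at, a minimal stack evaluator sketched in Python (an illustration of the idea, not real Forth):

```python
# A toy postfix (RPN) evaluator: numbers are pushed on a stack,
# operators pop their two operands and push the result.
def rpn(source):
    stack = []
    ops = {"+": lambda a, b: a + b,
           "-": lambda a, b: a - b,
           "*": lambda a, b: a * b}
    for word in source.split():
        if word in ops:
            b = stack.pop()   # top of stack is the second operand
            a = stack.pop()
            stack.append(ops[word](a, b))
        else:
            stack.append(int(word))
    return stack[-1]

print(rpn("1 2 +"))  # -> 3
```

    Whatever the byte counts, "1 2 +" needs no precedence rules or parentheses: evaluation is a single left-to-right pass.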

    Finally, recruit members into the church, in any way you will. Encourage them to write more essays on the subject of Forth, and come up with more proofs-of-concept of the power of Forth. Discourage them from trying to create production systems, because that is a job for code monkeys, not creative people.

    Alas, my writing skills are poor compared to those of mslicker, so I can only spend the rest of my days studying the various schools of programming thought with an equal eye.

    re: Thoughtful programming, posted 2 Oct 2002 at 06:36 UTC by nymia » (Master)

    The primary focus of the thoughtful approach is simply writing software to accomplish a task. The task may imply implementing parts of various standards; however, the focus is not on the standard, it is on the application that uses the standard. Everything that is not important to the function of the application is brushed aside.

    This one reminded me of a machine having a memory in which data and instructions are stored, and then these instructions being executed sequentially, jumping to another location when a jump instruction is encountered.

    Isn't this what language and compiler development is all about? Focusing on how the task is done by grouping instructions or statements so that these abstracted statements can later be converted to machine code.

    From my point of view, this is a universal quality of software, though, present in all of it, especially Unix and those dinosaur machines.

    Maybe I misunderstood the statement, maybe I need further clarification.
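    nymia's description of grouping statements into abstracted units is close to how Forth's colon definitions work: a new word is just a named sequence of existing words, added to the dictionary. A minimal sketch of that mechanism (a hypothetical Python toy, not real Forth):

```python
# A toy Forth-like dictionary: a ":" definition groups existing words
# into a new word, which is what makes factoring cheap in Forth.
stack = []
words = {
    "DUP": lambda: stack.append(stack[-1]),
    "*":   lambda: stack.append(stack.pop() * stack.pop()),
}

def define(name, body):
    # ": name body ;" -- the new word is just the sequence of old ones
    parts = body.split()
    words[name] = lambda: [words[w]() for w in parts]

def run(source):
    for w in source.split():
        if w in words:
            words[w]()            # execute a defined word
        else:
            stack.append(int(w))  # literals go on the stack

define("SQUARE", "DUP *")   # : SQUARE DUP * ;
run("7 SQUARE")
print(stack[-1])            # -> 49
```

    The "compiler" here does nothing but record a sequence of word lookups; converting those words to machine code is exactly the step a native Forth performs.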

    BOOST faith from Berkeley, posted 2 Oct 2002 at 13:09 UTC by sye » (Journeyer)

    "BOOST : Berkeley's Out-of-Order Stack Thingy" by Steve Sinha, Satrajit Chatterjee and Kaushik Ravindran, Dept. of EE and CS, UC Berkeley.
    Abstract: We present a novel scheme based on Tomasulo's algorithm to implement out-of-order execution in stack machines. This scheme not only reduces redundant data movement within the processor core, but also eliminates the stack cache and the associated problem of managing it. We present some preliminary quantitative evaluation of the proposed technique. For a suite of scientific benchmarks we obtain performance that appears to be competitive with out-of-order superscalar GPR processors.
    ...
    5. Conclusion and Future Work: In this paper we have presented a novel scheme to implement out-of-order execution in stack machines. To our knowledge this is the first such scheme proposed in the literature.
    * To this author's intuition, nothing is new under the sun. To this author's knowledge, C.K. Yuen, Chuck Moore, and Dr. Philip Koopman have been working on such schemes for decades. In a round-robin stack, the First is also the Last. That's the genius of Forth; besides, Forth never claims to be the First or the Last.
    Acknowledgements: We thank David Culler for guiding us in the direction of this work and for having faith in architecture.
    * What? Having faith in architecture? That's baloney! They can have faith in David Culler, or have faith in I.M. Pei, or have faith in stupid@cyberspace.org, but "having faith in architecture?!" is just wrong, since architecture can only be an art or a science.

    Focus, posted 2 Oct 2002 at 14:50 UTC by mslicker » (Journeyer)

    nymia: It is about doing the minimum, starting from the application and implementing only what the application needs.

    Unix starts the other way around. It begins by saying everything is a file, then introduces a file system, processes, inter-process communication, multiple users, groups, and file permissions.

    If a standard is needed in a Unix system, then it is abstracted and implemented in a library. The focus of the library is then purely on the standard. What is used or unused is of little concern.

    Now, when I speak of a task, I'm discussing applications. The user doesn't care about how the application was built, the architecture, what language it was written in, how many machines it could run on, etc. The user is only interested in the final result.

    The thoughtful approach is, starting with the result (the application), deriving the requirements, and implementing the software.

    The Forth Methodology Applied to Programming is a useful document for understanding this approach.

    2 comments from the Pope about BOOST faith, posted 2 Oct 2002 at 17:57 UTC by badvogato » (Master)

    I have 2 comments about BOOST.

    Sinha et al. seem to think that stack processors need more speed to match register machines, because they may not be efficient at implementing C programs. The problem is not in the architecture, but in the C code.

    I agree with the value of parallel processing. But rather than introduce complexity into the processor, I simply provide multiple processors. This kind of parallelism requires rethinking the art of programming and the design of languages. Which I think should be done anyway.
    * note: 2 invisible br brackets and 2 invisible blockquote brackets, a total of 4 tags, have been fabricated by this messenger to represent the TRUE layout of the original format typed by the Pope himself. Amen!

    User concerns, posted 5 Oct 2002 at 05:40 UTC by garym » (Master)

    The user doesn't care about how the application was built, the architecture, what language it was written in, how many machines it could run on, etc. The user is only interested in the final result.

    I'm in the wrong industry sector: from where I sit, users care most about the final cost. They will accept any solution within affordable costs, and thus we have (ahem) Microsoft. Not grasping that is why Bill is so fabulously rich and you and I eat at McDonald's. I have heard of applications where efficiency was paramount, but I have never actually met one. Instead I meet clients who know that an engineer costs upwards of $100/hr, and that you can buy a 1U rackmount for the price of one engineer person-day. Even NASA, at one time the agency that spent the most per line of code, is today more sensitive to cost than to most any other factor.

    Obviously the answer is somewhere in between: engineers, and especially designers, who honestly make the effort, within budget, to reduce bloat by implementing only what is actually needed, not what is envisioned as potentially needed, i.e. the subject of both Alan Cooper books (yes, Mr VBasic). This is an issue of human factors and interface design as much as an issue of code, and my guess is we'd do more to trim bloat with the design crew on board than with any amount of engineering discipline.

    Another message from the Pope... this time to garym, posted 5 Oct 2002 at 07:32 UTC by tk » (Observer)

    Factoring the cost of hiring good engineers into the total cost is, of course, wrong.

    A more detailed exegesis on this subject is in the works.

    The Role of Aesthetics in Programming Language Design, posted 5 Oct 2002 at 15:25 UTC by garym » (Master)

    A more sober plan was recently posted to FoRK: The Role of Aesthetics in Programming Language Design by Bruce MacLennan of the University of Tennessee CS dept.

    The crucial role played by aesthetics in programming language design and the importance of elegance in programming languages are defended on the basis of analogies with structural engineering, as presented in Billington's The Tower and the Bridge.

    Cost, posted 5 Oct 2002 at 15:56 UTC by mslicker » (Journeyer)

    I don't expect to change the practice of the computing industry. If that can be done, it can only be done indirectly. The move to GNU/Linux by many in the industry at large and government is a good example of external change.

    With free software, cost is less of a concern, since free software is mostly a volunteer effort. Developer time becomes the resource. Of course, I'm quite respectful of this resource. I see people who are quite disrespectful of it: they create buggy software and call for volunteers to go on a bug hunt, or call for volunteers to polish up their poorly designed, disorganized software. This practice is quite shameful in my opinion.

    I'm not going to make any claims about developer cost. However, with free software we are creating the whole collection of software. Clearly doing the minimum is less work than trying to generalize and abstract everything. "Right by design" software is going to be more costly initially than code without thought. In the later stages of software, maintenance should be much less. A well-thought-out design will have few bugs in the implementation, and it will most likely be easier to extend and modify. In my experience it has been.

    I agree that the interface is where to begin as far as removing bloat goes. With good specifications I believe engineers can do good work. Where the goals are unclear, that is where you see code sprawl uncontrollably.

    OK, now that we're on less controversial ground..., posted 5 Oct 2002 at 16:29 UTC by tk » (Observer)

    Clearly doing the minimum is less work than trying to generalize and abstract everything.

    Except, now that the abstractions are already there in the first place, in the form of C libraries... in that case, it's silly to reinvent the wheel just because you think it's "Right".

    "Right by design" software is going to be more costly initially, than code without thought. In the later stages of software, maintenance should be much less.

    What have all the attempts to design "Right by Design" systems taught us? Lisp, Smalltalk, Eiffel, the Mac UI. How have they fared -- in comparison to C, Perl, Windows, Linux? Why?

    Of course, "economic factors" is a possible answer, but that only means that economic factors do play a significant role in the whole game. "Right by Design" software is more costly; in fact, it's so costly that by the time it's out, people have already got impatient and switched to software written by monkeys.

    Also..., posted 5 Oct 2002 at 16:40 UTC by tk » (Observer)

    "Thoughtful programming"[*] is not the same as "creationist programming". Perhaps if our great pontificator tries to code something larger than Just Another JPEG Decoder, the difference will show up.

    [*] where one actually programs something

    Reply to tk, posted 5 Oct 2002 at 19:59 UTC by mslicker » (Journeyer)

    You ignore the significant complexity these C libraries introduce. By this logic, no new operating systems should be created, since Linux handles almost every combination of hardware. Base complexity cannot be neglected when considering the amount of effort required for a new work. As for me, I'm overwhelmed by the complexity of GNU and Linux. Making even a simple change to Linux seems insurmountable.

    You greatly simplify issues of cost, so greatly that you fail to make a useful statement. First, why is Windows popular? By all accounts the Mac interface has been consistently better. This issue is far more about marketing, control, and dominance than about design. Why is Smalltalk not popular? Anyone who has tried out Squeak will see that Smalltalk, as a language, has no concept of efficient execution. I'm not too interested in why commercial attempts fail. I have no ambition to produce proprietary software, and generally believe software should be free, not controlled by one company or another.

    On the free software side of things, we have GNU/Linux as a successful operating system. Where is the competition? The fact is, the PC discourages competition; the number of devices is overwhelming. Indeed, there is a great initial cost for new systems. However, does the cost of a well-designed system exceed the cost of the complexity that systems like GNU/Linux introduce? I don't know. My intuition says yes, but I'd like scientific evidence to back this up. In any case, any further involvement with computers on my part requires an alternative to this complexity. People who feel similarly will join the effort. Determination and collaboration can overcome the cost.

    On my code: well, this JPEG decoder is not flawless. The process largely followed the document I've linked to above. A lot of thought went into the design before I even started coding. The design was not completely specified. No, the thoughtful approach is not purely design which can then be handed over to people of average ability. The thoughtful approach is being involved in every part of the process. The design time is emphasized, though. This is where you have the most flexibility, where you can consider a number of options and compare solutions without the time and effort of implementing them. Most issues are resolved at design time. There is, however, an interaction between design and code. Design initially guides the code, but the code can later influence the design. There really isn't a mechanical process that anyone follows, though the document I've linked above does give insight into the process that has led to some remarkable creations.

    Correction to Reply, posted 5 Oct 2002 at 20:23 UTC by mslicker » (Journeyer)

    My intuition is that the cost of a well-designed system is less than the increasing maintenance and complexity costs of GNU/Linux. There are some interesting figures in this document. For comparison, estimates would be required for the particular approach, be it colorForth or otherwise.

    ..., posted 6 Oct 2002 at 09:50 UTC by tk » (Observer)

    Making even a simple change to Linux seems insurmountable.
    I've successfully hacked the kernel several times, often without having to ask #kernelnewbies. Besides, how often does one need to hack the kernel?

    By this logic, no new operating systems should be created since Linux handles almost every combination of hardware.
    No new OSes should be created for the sake of religion alone.

    I'm not too interested in why commercial attempts fail.
    Both free software and proprietary software are subject to the laws of physics and the laws of economics.

    People who feel similarly will join the effort. Determination and collaboration can overcome the cost.
    You can consider joining the MFIG Forth OS Project. At the very least, the current status should give an idea of the power of determination and collaboration.

    Reply, posted 6 Oct 2002 at 15:31 UTC by mslicker » (Journeyer)

    I really dislike this style of cutting up sentences.

    More people should want to hack the kernel. The complexity likely scares most off. If you want to be babysat by Linus Torvalds et al., well... that is fine. Others like to be in control of their system.

    You've referenced religion in connection with Forth several times without substantiation. The fact is, Forth makes other methods look religious in comparison. Forth is about ideas. If you want to see religion in computing, take object-oriented programming. There is absolutely no proven benefit of object-oriented programming.

    Free and proprietary software occupy different parts of the economic system. Free software depends only on the well-being of the free software developer. Proprietary software depends on a market for its product. As long as the whole economic system doesn't collapse and the internet remains intact, there will be free software.

    MFIG Forth OS? Bah! That is not determination. Determination is creating your own VLSI CAD software and OS from scratch; determination is designing your own chips and the language they interpret. Linux is determination and collaboration. Don't point me to some dead Forth project. colorForth exists; it has real software that solves real problems.

    Determination, posted 7 Oct 2002 at 04:45 UTC by tk » (Observer)

    MFIG Forth OS? Bah! That is not determination.
    Exactly how different is their "determination" from the "determination" you now exhibit? I mean, I'm also very determined to bring everlasting peace to the world, eradicate poverty and corruption, stamp out environmental pollution, etc. But of course, when it comes to actually doing any of these things, either I don't have the time, or I don't have the energy, or I just don't feel like it. So, I spend my time writing manifestoes. But I'm still very determined! Trust me!

    Interlude: straw men and analogies, posted 7 Oct 2002 at 05:27 UTC by tk » (Observer)

    It seems someone doesn't understand the difference between a straw man and an analogy.

    Straw man: in order to prove P(a), prove P(a'), where a' is similar to a but not equal to it. This is a bogus proof technique.

    Analogy: suppose one asserts that (forall x)[P(x) => Q(x)], and uses this to `prove' that P(a) => Q(a). Then another person may pick a different object b, and directly show that P(b) => Q(b) is false. Since P(b) => Q(b) is false, it follows that (forall x)[P(x) => Q(x)] cannot be true, so it cannot be used to prove P(a) => Q(a). This is a valid proof technique. It is of course understood here that a and b can be very different things, and P(a) and Q(a) can thus have different meanings from P(b) and Q(b) respectively.
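The refutation-by-counterexample pattern above can be sketched in Python over a finite domain (an illustrative aside; the predicates P and Q are arbitrary stand-ins, not anything from the thread):

```python
# Hypothetical universal claim: "every x with P(x) also has Q(x)".
# Stand-in predicates for illustration only:
def P(x):
    return x % 2 == 0   # "x is even"

def Q(x):
    return x % 4 == 0   # "x is divisible by 4"

def claim_holds(domain):
    """Check (forall x in domain)[P(x) => Q(x)]."""
    return all((not P(x)) or Q(x) for x in domain)

# The universal seems to hold on a lucky sample...
assert claim_holds([4, 8, 3, 12])

# ...but one counterexample b falsifies it, so the universal can no
# longer be used to "prove" P(a) => Q(a) for any particular a.
b = 6
assert P(b) and not Q(b)
assert not claim_holds([4, 8, 3, 12, b])
```

The point being made in the post is exactly this: exhibiting one b where P(b) => Q(b) fails is a valid way to demolish an argument that leans on the universal, and it is not a straw man.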

    ..., posted 7 Oct 2002 at 15:15 UTC by mslicker » (Journeyer)

    Your straw man is telling people that something I never said is my argument. Look at your reply "Re: Thoughtful programming", and look at the first three bullets; show me precisely the text that implies these arguments.

    The difference between you and me is that there is weight behind my words. Everything I say, I am prepared to substantiate or concede. When you are proven wrong repeatedly, you just make up a new lie to pursue. You simply have no credibility.

    I'd like to know why someone who works their ass off their whole life is a "prima donna" (your words).

    In the final analysis it appears you recognize you have no talent of your own, so you pursue baseless attacks on people who are doing things. Are you trying to make a name for yourself? As a great seeker of truth? Ha! If I have shown anything here, it is that your argument doesn't hold up to independent analysis. I could continue on to your web site, but really, this is growing tiresome!

    why to write new OS?, posted 8 Oct 2002 at 11:17 UTC by Malx » (Apprentice)

    Simply because the goals changed (or the paradigm?).
    The design of Linux (and other OSs) was based on assumptions which are no longer true.

    For example - "It assumes that user controls all. It knows all about software it uses. Software is a tool, not the artificial intelligence or other clever being". Consequence is "there is UID and all running software has this UID. All security is based on UID."

    This is no longer true - just remember viruses. And even if it were, you can't take responsibility for your software any more.

    It is not possible to create a new OS in a reasonable time. So the only way is to create extension tools, which is bad design: firewalls, antiviruses, encryptors, installation/deinstallation checkers, etc.

    So - this is reality - you can't change any basic OS goal without rewriting everything from scratch. You can't just import any part of the old OS (drivers, libs, apps, etc.).

    ..., posted 8 Oct 2002 at 16:24 UTC by tk » (Observer)

    mslicker says:

    show me precisely the text that implies these arguments.

    *sigh*

    There you are.

    I'd like to know why someone who works their ass off their whole life is a "prima donna"(your words).

    Because some people think that working their whole life means their authority on all matters should be unquestioned.

    Are you trying to make a name for yourself?

    Seriously, if I manage to make a name like this, I'll be very surprised -- and disgusted.

    And you're right, I should do more actual work. Ditto to your good self.

    ..., posted 8 Oct 2002 at 18:30 UTC by mslicker » (Journeyer)

    • I still maintain that infix is a drawback; my reasons are complexity of implementation, case-by-case notation (versus universal notation), and that infix is error-prone. I frequently make errors regarding precedence in infix languages, which can lead to hard-to-find bugs. Certainly, in some cases white space can be eliminated. I'm more interested in global source reductions (lines of code) than local source reductions.
    • What I've said is true. How does any new software come into existence? Is it all just random typing, or is there a concept in the mind of the software writer? In particular, how would that GUI you referenced be created in the first place?
    • Forth provides the features I care about. My current focus is interactive applications. Computing is a diverse field; I don't deny anyone the choice of their tool, nor deny that their tool has power for its uses. I am in fact an admirer of Lisp; there are plainly things you can do in Lisp that would require some work to do in Forth. Smalltalk I have only a superficial understanding of. However, in my current focus, Smalltalk and Lisp do come into comparison. My initial direction was a Lisp system, until I came across Forth and it changed my view considerably. Whether justifiably or not, my impression of Smalltalk is tainted by the Squeak system, which has, in my view, poor performance.
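The precedence point in the first bullet can be made concrete with a toy postfix (RPN) evaluator - a sketch in Python, not colorForth's actual implementation:

```python
# Toy postfix evaluator: each operator applies as soon as it appears,
# so there is no precedence table and no parentheses to get wrong.
def rpn(tokens):
    stack = []
    ops = {
        "+": lambda a, b: a + b,
        "-": lambda a, b: a - b,
        "*": lambda a, b: a * b,
    }
    for tok in tokens:
        if tok in ops:
            b = stack.pop()          # second operand
            a = stack.pop()          # first operand
            stack.append(ops[tok](a, b))
        else:
            stack.append(int(tok))   # literal number
    return stack[0]

# Infix "2 + 3 * 4" forces the reader to recall that * binds tighter
# than +; the two postfix spellings below are each unambiguous:
assert rpn("2 3 4 * +".split()) == 14   # 2 + (3 * 4)
assert rpn("2 3 + 4 *".split()) == 20   # (2 + 3) * 4
```

The evaluator itself is also a few lines of uniform code, whereas an infix parser needs a precedence table and associativity rules - which is the "complexity of implementation" argument in miniature.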

    Chuck Moore an authority? If you examine the Forth community, you will see that ANS Forth dominates discussion. He didn't participate in the standardization, but this is not uncommon in standardization: people who create standards often have different goals than language designers. It creates a divide in the community to a certain degree, but the new ideas and innovations are a good sign of health in a language.

    I haven't had much time lately to write code. I'm looking forward to porting colorForth to the Zaurus when I get some free time.

    Reply, posted 10 Oct 2002 at 05:00 UTC by nymia » (Master)

    nymia, it is about doing the minimum: starting from the application, implementing only what the application needs.

    Unix starts the other way around. It begins by saying everything is a file, then introduces a file system, processes, inter-process communication, multiple users, groups, file permissions.

    If a standard is needed in a Unix system, then it is abstracted and implemented in a library. The focus of the library is then purely on the standard. What is used or unused is of little concern.

    Now, when I speak of a task, I'm discussing applications. The user doesn't care about how the application was built, the architecture, what language it was written in, how many machines it could run on, etc. The user is only interested in the final result.

    The thoughtful approach is: starting with the result (the application), deriving the requirements, and implementing the software.

    The Forth Methodology Applied to Programming is a useful document for understanding this approach.

    I'm beginning to see what you mean. The Unix model basically offers a very thin layer, providing system services at their minimum. Maybe that is the reason why this model lacks so many things an ordinary end user may require. And this thin layer is just another platform on which virtual machines can run. Normally, under this model, a user-space virtual machine can range from basic to complex, including the Forth VM.

    I also thought about the idea of hiding the hardware under the veil of the colorForth VM - isn't this a bit too restrictive? One would be confined to working within the sandbox, without any tools for loading programs the OS can run. From there, it seems no computing industry would rise out of that model, though.

    Unix and colorForth, posted 10 Oct 2002 at 16:31 UTC by mslicker » (Journeyer)

    I wouldn't say Unix offers system services at their minimum. Unix is a set of abstractions. Unix considers its users antagonistic; it feels the system needs to be protected from both users and programmers. It tries to abstract everything it feels is vulnerable to attack. Perhaps abstraction is also considered a convenience. It is true that Unix allows multiple programs to run completely ignorant of each other.

    colorForth is a holistic approach. It is based far more on cooperation, on obeying unenforced protocols. The virtual machine is such a protocol. Multitasking is cooperative and depends on tasks to give up control. There are no restrictions: new protocols can be invented, and every rule that exists can be broken. There is no object code that is loaded; programs are stored as source. I don't see why you couldn't load, say, an ELF executable. However, by the time you have everything to support this, you are back to the Unix scenario, negating any advantages colorForth had to begin with.
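Cooperative multitasking of this kind can be sketched with Python generators (an illustrative analogue, not colorForth's scheduler): each task runs until it voluntarily yields, and nothing preempts it.

```python
from collections import deque

def round_robin(tasks):
    """Run generator-based tasks round-robin until all finish.
    Each task keeps the CPU until it chooses to yield - a misbehaving
    task that never yields would starve everything else, which is
    exactly the 'unenforced protocol' trade-off."""
    queue = deque(tasks)
    log = []
    while queue:
        task = queue.popleft()
        try:
            log.append(next(task))   # run the task up to its next yield
            queue.append(task)       # reschedule it at the back
        except StopIteration:
            pass                     # task finished; drop it
    return log

def worker(name, steps):
    for i in range(steps):
        yield f"{name}:{i}"          # voluntarily give up control

print(round_robin([worker("a", 2), worker("b", 3)]))
# -> ['a:0', 'b:0', 'a:1', 'b:1', 'b:2']
```

The scheduler enforces nothing; the interleaving emerges only because every task obeys the protocol of yielding, which mirrors the cooperative model described above.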

    You are right, I don't think colorForth could support the computing industry in the style of Microsoft etc. However, that is largely the point. It is far more about empowering the individual user, giving him/her back the power and control over the computer that has been taken away.

    my 2 cents for tk, posted 10 Oct 2002 at 20:09 UTC by sye » (Journeyer)

    * An OS layer need not sit between the computing machine and human intelligence. That's where Forth comes in: to strip away the baggage and bondage that accumulated as other OSs established themselves as indispensable to computing hardware and VLSI chips in the consumer markets, and came to believe that the OS ought to dictate what the machine can do for humans, for networking, and for application-layer programs.
    * "To code something larger than JPEG Decoder" what kind of difference are you expecting in selecting different design principles, different coding mechanics and maybe different organizational and monetary control? Something larger, in my mind, like Pyramide du Louvre by I.M. Pei? colorForth and Forth chips by Charles Moore? Proof of Fermat's Last Theorem by Andrew Wiles et al? Where Unix/C, Oracle/SQL, Microsoft/PC stand among other big things to draw awe from people? There's no doubt in my mind that UNIX/C, Oracle/SQL and Microsoft/PC provide employment for more people than we can afford to be enslaved to the service of machines.
