From: Joe Gwinn <joegwinn@comcast.net>
Newsgroups: sci.electronics.design
Subject: Re: Duplicate identifiers in a single namespace
Date: Sun, 20 Oct 2024 16:21:08 -0400
Message-ID: <cslahjdbii4ld914fi1lgtqqs3pd86sdpr@4ax.com>
References: <vdbgch$1ob5k$1@dont-email.me> <nnd$37a9dde7$61356113@e5c57d590be73bf6> <o8jvgjt1vj7jo87i1b1nq867ivv64bvkn8@4ax.com> <vepai9$2dfrb$1@dont-email.me> <hqg0hjl6jshfnqcqtu6982t6tksldn93m0@4ax.com> <vepjok$2eqc8$1@dont-email.me> <5mb8hjt0m6sauphsrj9lt764iejamr66dt@4ax.com> <vf1i38$3atc$2@dont-email.me>
User-Agent: ForteAgent/8.00.32.1272

On Sat, 19 Oct 2024 17:15:24 -0700, Don Y <blockedofcourse@foo.invalid>
wrote:

>On 10/19/2024 3:26 PM, Joe Gwinn wrote:
>>> Will an average *coder* (someone who has managed to figure out how to
>>> get a program to run and proclaimed himself a coder thereafter)
>>> see the differences in his "product" (i.e., the code he has written)
>>> and the blemishes/shortcomings it contains?
>>
>> Well, we had the developers we had, and the team was large enough that
>> they cannot all be superstars in any language.
>
>And business targets "average performers" as the effort to hire
>and retain "superstars" limits what the company can accomplish.
It's a little bit deeper than that.  Startups can afford to have a
large fraction of superstars (so long as they like each other) because
the need for spear carriers is minimal in that world.  But for
industrial scale, there are lots of simpler and more boring jobs that
must also be done, thus diluting the superstars.

War story: I used to run an Operating System Section, and one thing we
needed to develop was hardware memory test programs for use in the
factory.  We had a hell of a lot of trouble getting this done because
our programmers point-blank refused to do such test programs.

One fine day, it occurred to me that the problem was that we were
trying to use race horses to pull plows.  So I went out to get the
human equivalent of a plow horse, one that was a tad autistic and so
would not be bored.  This worked quite well.

Fit the tool to the job.

>>>> I was still scratching my head about why Pascal was so different than
>>>> C, so I looked for the original intent of the founders.  Which I found
>>>> in the Introductions in the Pascal Report and K&R C: Pascal was
>>>> intended for teaching Computer Science students their first
>>>> programming language, while C was intended for implementing large
>>>> systems, like the Unix kernel.
>>>
>>> Wirth maintained a KISS attitude in ALL of his endeavors.  He
>>> failed to see that requiring forward declarations wasn't really
>>> making it any simpler /for the coders/.  Compilers get written
>>> and revised "a few times" but *used* thousands of times.  Why
>>> favor the compiler writer over the developer?
>>
>> Because computers were quite expensive then (circa 1982), and so
>> Pascal was optimized to eliminate as much of the compiler task as
>> possible, given that teaching languages are used to solve toy
>> problems, the focus being learning to program, not to deliver
>> efficient working code for something industrial-scale in nature.
>
>I went to school in the mid 70's.
>Each *course* had its own
>computer system (in addition to the school-wide "computing service")
>because each professor had his own slant on how he wanted to
>teach his courseware.  We wrote code in Pascal, PL/1, LISP, Algol,
>Fortran, SNOBOL, and a variety of "toy" languages designed to
>illustrate specific concepts and OS approaches.  I can't recall
>compile time ever being an issue (but, the largest classes had
>fewer than 400 students)

I graduated in 1969, and there were no computer courses on offer near
me except Basic programming, which I took.  Ten years later, I got a
night-school masters degree in Computer Science.

>>>> Prior operating systems were all written in assembly code, and so were
>>>> not portable between vendors, so Unix needed to be written in
>>>> something that could be ported, and yet was sufficient to implement an
>>>> OS kernel.  Nor can one write an OS in Pascal.
>>>
>>> You can write an OS in Pascal -- but with lots of "helper functions"
>>> that defeat the purpose of the HLL's "safety mechanisms".
>>
>> Yes, lots.  They were generally written in assembler, and it was
>> estimated that about 20% of the code would have to be in assembly if
>> Pascal were used, based on a prior project that had done just that a
>> few years earlier.
>
>Yes.  The same is true of eking out the last bits of performance
>from OSs written in C.  There are too many hardware oddities that
>languages can't realistically address (without tying themselves
>unduly to a particular architecture).
>
>> The target computers were pretty spare, multiple Motorola 68000
>> single-board computers in a VME crate or the like.  I recall that a
>> one megahertz instruction rate was considered really fast then.
>
>Even the 645 ran at ~500KHz (!).  Yet, supported hundreds of users
>doing all sorts of different tasks.  (I think the 6180 ran at
>~1MHz).

Those were the days.  Our computers did integer arithmetic only,
because floating-point was done only in software and was dog slow.
And we needed multi-precision integer arithmetic for many things,
using scaled binary to handle the needed precision and dynamic range.

>But, each of these could exploit the fact that users
>don't consume all of the resources available /at any instant/
>on a processor.
>
>Contrast that with moving to the private sector and having
>an 8b CPU hosting your development system (with dog slow
>storage devices).

A realtime system can definitely consume a goodly fraction of the
computers.

>> Much was made by the Pascal folk of the cost of software maintenance,
>> but on the scale of a radar, maintenance was dominated by the
>> hardware, and software maintenance was a roundoff error on the total
>> cost of ownership.  The electric bill was also larger.
>
>There likely is less call for change in such an "appliance".
>Devices with richer UIs tend to see more feature creep.
>This was one of Wirth's pet peeves; the fact that "designers"
>were just throwing features together instead of THINKING about
>which were truly needed.  E.g., Oberon looks like something
>out of the 1980's...

In the 1970s, there was no such thing as such an appliance.  Nor did
appliances like stoves and toasters possess a computer.

>>>> This did work - only something like 4% of Unix had to be written in
>>>> assembly, and it was simply rewritten for each new family of
>>>> computers.  (Turned out to be 6%.)
>>
>> The conclusion was to use C:  It was designed for the implementation
>> of large realtime systems, while Pascal was designed as a teaching
>> language, and is somewhat slow and awkward for realtime systems,
>> forcing the use of various sidesteps, and much assembly code.  Speed
>> and the ability to drive hardware directly are the dominant issues
>> controlling that part of development cost and risk that is sensitive
>> to choice of implementation language.
>
>One can write reliable code in C.  But, there has to be discipline
>imposed (self or otherwise).
>Having an awareness of the underlying
>hardware goes a long way to making this adjustment.
>
>I had to write a driver for a PROM Programmer in Pascal.  It was
>a dreadful experience!  And, required an entirely different
>mindset.  Things that you would do in C (or ASM) had incredibly
>inefficient analogs in Pascal.
>
>E.g., you could easily create an ASCII character for a particular
>hex-digit and concatenate these to form a "byte"; then those
>to form a word/address, etc.  (imagine doing that for every byte
>you have to ship across to the programmer!)  In Pascal, you spent
>all your time in call/return instead of actually doing any *work*!

Yes, a bullet dodged:

>>>> So the Pascal crowd fell silent, and C was chosen and successfully
>>>> used.
>>>>
>>>> The Ada Mandate was rescinded maybe ten years later.  The ISO-OSI
>>>> mandate fell a year or so later, slain by TCP/IP.
>>>
>>> I had to make a similar decision, early on.  It's really easy to get
>>> on a soapbox and preach how it *should* be done.  But, if you expect
>>> (and want) others to adopt and embellish your work, you have to choose
>>> an implementation that they will accept, if not "embrace".
>>>
>>> And, this without requiring scads of overhead (people and other
>>> resources) to accomplish a particular goal.
>>>
>>> Key in this is figuring out how to *hide* complexity so a user
>>> (of varying degrees of capability across a wide spectrum) can
>>> get something to work within the constraints you've laid out.
>>
>> Hidden complexity is still complexity, with complex failure modes

========== REMAINDER OF ARTICLE TRUNCATED ==========