Path: ...!feeds.phibee-telecom.net!2.eu.feeder.erje.net!feeder.erje.net!newsfeed.bofh.team!paganini.bofh.team!not-for-mail
From: antispam@fricas.org (Waldek Hebisch)
Newsgroups: comp.lang.c
Subject: Re: Python recompile
Date: Fri, 14 Mar 2025 00:37:48 -0000 (UTC)
Organization: To protect and to server
Message-ID:
References: <871pv861ht.fsf@nosuchdomain.example.com> <20250308192940.00001351@yahoo.com>
Injection-Date: Fri, 14 Mar 2025 00:37:48 -0000 (UTC)
Injection-Info: paganini.bofh.team; logging-data="419831"; posting-host="WwiNTD3IIceGeoS5hCc4+A.user.paganini.bofh.team"; mail-complaints-to="usenet@bofh.team"; posting-account="9dIQLXBM7WM9KzA+yjdR4A";
User-Agent: tin/2.6.2-20221225 ("Pittyvaich") (Linux/6.1.0-9-amd64 (x86_64))
X-Notice: Filtered by postfilter v. 0.9.3
Bytes: 28642
Lines: 578

bart wrote:
> On 11/03/2025 01:33, Waldek Hebisch wrote:
>> bart wrote:
>>> On 10/03/2025 10:58, Waldek Hebisch wrote:
>>>> bart wrote:
>>>>>
>>>>> I think nobody does. There's always been some sort of mystique
>>>>> surrounding 'gcc' on Windows.
>>>>>
>>>>> 'MinGW' supposedly 'Minimalist Gnu on Windows'. In that case I
>>>>> wouldn't like to see the full-scale one..
>>>>
>>>> "Minimalist" is not about the size of the compiler. Rather, it is
>>>> about the possible support routines. For a "hosted implementation"
>>>> C mandates the presence of a C library, and there are a lot of
>>>> functions not in the C standard but included in the libraries of C
>>>> compilers. There is also the question of operating system support,
>>>> complicated by the fact that Windows is different from other
>>>> systems. Cygwin solved those issues by offering Posix emulation
>>>> and a sizable collection of libraries. MinGW is minimalist in the
>>>> sense that it provides very few libraries of its own and mainly
>>>> uses what is provided by Windows.
>>>
>>> I still don't get this stuff.
>>>
>>> I get the impression that a port of gcc to Windows is not simply
>>> about building C programs, but building C programs that use a lot
>>> of features from Linux.
>>
>> You apparently do not get the fact that people want tools to
>> automate various routine tasks.
>
> What routine task is this? I'm talking exclusively about turning a
> bunch of source files in some language (here it is C) into an
> executable binary.
>
> This task can be done with a program called a 'compiler'.

You ignore the fact that people are developing programs. And
development is much more than "turning a bunch of source files into
an executable". The GPL says "The source code for a work means the
preferred form of the work for making modifications to it". And that
nicely captures the idea: sources are in a form that is convenient
for the developer and may require several steps before one gets the
executable.

> However, what I'm arguing about is that this simple task has become
> unnecessarily elaborate on OSes like Linux, by introducing makefiles,
> OS-specific scripts, and OS-specific utilities.

There is not much that is OS-specific in "Linux tools". 'make' is
problematic on zOS, because files have no timestamps there, but most
normal OS-es have timestamps (and IIUC even zOS has USS, where
timestamps are available). You need the ability to invoke a program
from a different program; the exact form of this is OS-dependent, but
the ability is there in any semi-reasonable general-purpose OS. You
need the ability to get the exit status of a program; again the exact
form is OS-dependent, but the ability is there. FYI, I started using
"Linux tools" on MS-DOS, before I heard about Linux. And the tools
are widely available; IIUC classic zOS is problematic, but I think
that USS is OK. Some systems may be prone to breakage, when there is
a small number of _active_ users and nobody bothers to report real
problems in a timely way.

Coming back to development, in many cases it is desirable to generate
some C files.
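The two OS facilities mentioned above (timestamps and exit statuses)
are small enough to sketch in a few lines of shell; the file names
here are invented for illustration:

```shell
# Sketch of the two OS facilities 'make' relies on: file timestamps
# and exit statuses. File names are invented for illustration.
touch -t 202001010000 old.txt   # give old.txt an old timestamp
touch new.txt                   # new.txt gets the current time

# test's -nt operator compares modification times, as 'make' does
# when deciding whether a target is out of date.
if [ new.txt -nt old.txt ]; then
    echo "new.txt is newer: would rebuild"
fi

false                           # a command that exits non-zero
status=$?
echo "exit status of 'false' is $status"
```

Any build tool, however elaborate, bottoms out in these two checks
plus the ability to run the compiler.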
The C preprocessor can do some transformations, but it may be
convenient to transform sources in a way that is hard to do via the
preprocessor. Also, this is a C group, but big programs frequently
have parts in different languages, and then it is good to have
language-neutral tools. Simple tools to transform or generate files
are 'sed' and 'awk'. Some people use 'perl' or 'python' for such
purposes, and in principle there are many more possibilities. More
specialised are compiler-writing tools like 'bison' and 'flex'. You
can write a parser by hand, but using 'bison' one does not need to
know much to write a working parser. 'flex' helps in writing
scanners, but IMO it is quite useful for text processing tasks (its
support for regular expressions is nicer than what Perl or Python
offer, and its speed is frequently much better too).

Users need documentation besides the binary. In principle
documentation could be provided as plain text, but there is now a
tendency to offer nicely formatted documentation and to offer
multiple formats. Which means that there is a need for formatting
tools. Old ones are Texinfo and TeX. At some moment there was a push
towards SGML and formats like DocBook. This needs appropriate tools.
Some people now prefer tools like 'sphinx' (which needs Python with
several extension packages). There is 'doxygen', which extracts
information from source files and presents nicely formatted info
about an API (I am not a fan of 'doxygen', but it seems to be widely
used).

Besides building the program and documentation, one wants some
automatic way of running tests. And people want to automate more
tasks, for example removal of generated files or installation.
Automation requires some scripting language. For authors of free
programs an important point is free availability (preferably with
sources) of the chosen language. That ruled out things like MS Basic
or 'command.com' (and 'command.com' is quite limited).
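As a sketch of the kind of source generation described above, here is
an invented example that turns a small data file into a C table with
'awk' (the file names and contents are made up for illustration):

```shell
# Sketch: generating a C source file from a data file with awk.
# File names and contents are invented for illustration.
cat > colors.txt <<'EOF'
red 1
green 2
blue 3
EOF

# Emit one C initializer per input line, wrapped in a declaration.
awk 'BEGIN { print "struct color { const char *name; int id; };"
             print "struct color colors[] = {" }
           { printf "    { \"%s\", %s },\n", $1, $2 }
     END   { print "};" }' colors.txt > colors.c

cat colors.c
```

Adding a row to the table is then a one-line edit to the data file;
the build regenerates colors.c, and no C source needs touching.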
The Unix shell was available as part of the OS on Unix systems, and
quite early there were free ports of the Unix shell to other systems.
So there was a natural preference for the Unix shell, simply because
of its wide availability. Similarly with the other classic Unix
tools. One can argue that those classic tools are now dated, but
there is strong inertia in action: the tools are widely used, widely
available, and there is knowledge of how to use them, so people keep
using them.

When one thinks about possible replacements, it is not an easy job.
Namely, the Unix tools evolved and are designed to work together. A
different tool must either cooperate nicely with the existing tools,
which forces behaviour similar to the current tools, or provide a
replacement for the whole toolkit, which is much more complicated
than replacing a single tool.

When you look at bloat, I expect competing tools to be much more
bloated than the classic Unix tools. Namely, the Unix tools were
designed to be simple, and their power comes from the possibility of
composing them. It is hard to imagine a way of composing tools that
is both simple and offers more possibilities than the Unix way. So
one can have simpler/smaller tools that are less powerful. Or one
can increase the functionality of each tool separately (to some
degree this is happening with the Unix tools), but this increases the
complexity of each tool. One can try to make one tool that replaces
the whole toolkit, but experience suggests that such a single tool
would be much more complicated. To summarize, more powerful tools
are likely to be more complicated, hence more bloated, than the
classic Unix tools. Given that the classic Unix tools are quite
small compared to modern systems (they can be implemented in a few
megabytes of code), the incentive to make smaller tools is rather
low.

> This is done even on smaller, simple applications, and also on those
> that are supposedly cross-platform that are to be built on the target.

You can have a basic toolkit running on a rather wide set of systems.
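The composition argument above can be sketched with an invented
word-frequency pipeline: three small tools, none of which could do
the whole job alone:

```shell
# Sketch of Unix-style composition: small tools joined by pipes.
# The input data is invented for illustration.
printf 'cat\ndog\ncat\nbird\ncat\ndog\n' > words.txt

# sort groups duplicate lines, uniq -c counts each group,
# and sort -rn ranks the counts from highest to lowest.
sort words.txt | uniq -c | sort -rn > ranked.txt
cat ranked.txt
```

A single tool replacing this pipeline would have to rebuild sorting,
counting, and ranking internally, which is the bloat argument in
miniature.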
A basic toolkit, then, is not a big burden for recipients, in
particular since the same toolkit may be used to build many programs.

> If scripts are going to be used, then use them at the developer site
> only, and make the script generate the streamlined set of files for
> the particular platform of interest.

The current free software ecosystem is the collaborative work of many
people. In particular, there are people creating binary packages,
and if all you want is to run a binary, you may be better off getting
a package from them. Now, people creating binaries want to do this
in an automated way. For them, having to give a list of files to the
compiler each time they build a package is too much work. They have
scripts that fetch sources from the net, check checksums (to verify
authenticity), and run 'configure' and 'make'. They are willing to
install the tools which are needed for this work.

> It should not rely on anything that is not native to the target
> platform.

Why not? You already admitted that one needs a compiler. Why should
a bunch of tiny tools be a blocker?

>> And people do not want
>> to reinvent the wheel, they prefer to use code written
>> by others.
>
> What wheels are these that are being reinvented? I'm simply arguing
> for using either 2 or 4 wheels, not 18!
>
> And I'm not suggesting reinventing those 2/4 either, just using the
> ones I have on my OS.

Well, one trouble is that the things you have on your preferred OS
are not free; one cannot use them legally on a different OS. And
when it comes to development tools, what comes with a basic install
of your OS seems to be quite limited. No law prevents you from
installing free tools, and that is the simplest solution.

========== REMAINDER OF ARTICLE TRUNCATED ==========