Path: ...!feed.opticnetworks.net!eternal-september.org!feeder3.eternal-september.org!news.eternal-september.org!.POSTED!not-for-mail
From: David Brown <david.brown@hesbynett.no>
Newsgroups: comp.lang.c
Subject: Re: Baby X is born again
Date: Sat, 15 Jun 2024 12:35:37 +0200
Organization: A noiseless patient Spider
Lines: 123
Message-ID: <v4jqpr$3e1od$1@dont-email.me>
References: <v494f9$von8$1@dont-email.me>
 <v49seg$14cva$1@raubtier-asyl.eternal-september.org>
 <v49t6f$14i1o$1@dont-email.me>
 <v4bcbj$1gqlo$1@raubtier-asyl.eternal-september.org>
 <v4bh56$1hibd$1@dont-email.me> <v4c0mg$1kjmk$1@dont-email.me>
 <v4c8s4$1lki1$4@dont-email.me> <20240613002933.000075c5@yahoo.com>
 <v4emki$28d1b$1@dont-email.me> <20240613174354.00005498@yahoo.com>
 <v4hs0f$2ve92$1@dont-email.me> <v4i1s4$30goi$1@dont-email.me>
MIME-Version: 1.0
Content-Type: text/plain; charset=UTF-8; format=flowed
Content-Transfer-Encoding: 8bit
Injection-Date: Sat, 15 Jun 2024 12:35:39 +0200 (CEST)
Injection-Info: dont-email.me; posting-host="f678a482ffafce70c2ceef8ecfac3e10";
	logging-data="3606285"; mail-complaints-to="abuse@eternal-september.org";	posting-account="U2FsdGVkX1+DgDT6KWMovrzWcOqeZrDis6anGLTd1c8="
User-Agent: Mozilla Thunderbird
Cancel-Lock: sha1:ztQv0oZ6omkQm70+1Ow1fErErtI=
Content-Language: en-GB
In-Reply-To: <v4i1s4$30goi$1@dont-email.me>
Bytes: 7090

On 14/06/2024 20:24, bart wrote:
> On 14/06/2024 17:43, David Brown wrote:
>> On 13/06/2024 16:43, Michael S wrote:
> 
>>> Somewhat more than a second on less modern hardware. Enough for me to
>>> feel that compilation is not instant.
>>> But 1 MB is just an arbitrary number. For 20 MB everybody would feel
>>> the difference. And for 50 MB few people would not want it to be much
>>> faster.
>>>
>>
>> But what would be the point of trying to embed such files in the first 
>> place?  There are much better ways of packing large files.
> 
> I remember complaining that some tool installations were bloated at 
> 100MB, 500MB, 1000MB or beyond, and your attitude was "So what, since 
> there is now almost unlimited storage."

We all remember that :-)

> 
> But now of course, it's "Why would someone ever want to do X with such 
> a large file!"  Suddenly large files are undesirable when it suits you.

It's a /completely/ different situation.  Anyone doing development work 
is going to have a machine with lots of space - 1 GB of disk space is 
peanuts.  But that does not mean it makes sense to have a 1 GB 
initialised array in an executable!

Consider /why/ you might want to include a binary blob inside an 
executable.  I can think of a number of scenarios:

1. You want a "setup.exe" installation file.  Then you use appropriate 
tools for the job; you don't embed the files in a C source file.

2. You want a "portable" version of a big program - portable apps on 
Windows, AppImage on Linux, or something like that.  Then you use 
appropriate tools for the job so that the application can access the 
enclosed files as /normal/ files (not some weird "XML Filesystem" nonsense).

3. You are targeting a platform where there is no big OS and no 
filesystem, and everything is within a single statically-linked binary. 
Then embedded files in C arrays are a good solution (a sketch of such 
an array is shown below), but your files are always small because your 
system is small.

4. You want to include a few "resources" like icons or images in your 
executable, because you don't need much and it makes the results neater. 
Then you use some kind of "resource compiler", such as has been used on 
Windows for decades.

I'm sure there are a few other niche cases where the convenience of a 
single executable file is more important than the inconvenience of not 
being able to access the files with normal file operations.  Even then, 
it's unlikely that they will be big files.
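
To make scenario 3 concrete, the generated source is nothing exotic - 
it is roughly what "xxd -i" produces.  A minimal sketch, with the file 
name and byte values made up for illustration:

/*
 * Roughly the kind of array "xxd -i font_data.bin" generates (name and
 * contents invented for this sketch).  In embedded work you would
 * normally add "const" so the data ends up in flash rather than RAM.
 */
unsigned char font_data_bin[] = {
  0x42, 0x4d, 0x36, 0x00, 0x0c, 0x00, 0x00, 0x00,
  /* ... one byte per element for the rest of the file ... */
};
unsigned int font_data_bin_len = 8;  /* xxd writes the real length here */

The generated C file is compiled and linked like any other translation 
unit, and the rest of the program just sees an ordinary array.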



To give an analogy, consider books.  In a home, it's no problem having a 
set of bookshelves with hundreds of books on them - that's your disk 
storage.  It is also sometimes convenient to have books packed together 
in single units, boxes, even though you need to unpack them to get to 
the books - that's your setup.exe or AppImage files.  And sometimes it 
is nice to have a few /small/ books inside one binding, such as a 
trilogy in a single volume - that's your embedded files.  But no one 
wants the complete Encyclopedia Britannica in one binding.



> 
>>   You can always increase sizes for things until you get problems or 
>> annoying slowdowns, but that does not mean that will happen in 
>> practical situations.
>>
>> And even if you /did/ want to embed a 20 MB file, and even if that 
>> took 20 seconds, so what?  Unless you have a masochistic build setup, 
>> such as refusing to use "make" or insisting that everything goes in 
>> one C file that is re-compiled all the time, that 20 second compile is 
>> a one-off time cost on the rare occasion when you change the big 
>> binary file.
>>
>> Now, I am quite happy to agree that faster is better, all other things 
>> being equal.  And convenience and simplicity are better.  Once the 
>> compilers I use support #embed, if I need to embed a file and I don't 
>> need anything more than an array initialisation, I'll use #embed.  
>> Until then, 5 seconds writing an "xxd -i" line in a makefile and a 20 
>> second compile (if it took that long) beats 5 minutes writing a Python 
>> script to generate string literals, even if the compile is now 2 seconds.
> 
> That's a really bad attitude. It partly explains why such things as 
> #embed take so long to get added.
> 

Using the best tool available for the job, and using a better tool if 
one becomes available, is a "bad attitude"?

Or did you mean it is a "bad attitude" to concentrate on things that are 
important and make a real difference, instead of improving on something 
that was never really a big issue in the first place?
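
To put some code behind that point: the "xxd -i" route is one makefile 
rule plus an include, and #embed (once supported) reduces it to a 
single directive.  A rough sketch - the file name "logo.png" is made up 
purely for illustration:

/*
 * Makefile rule (sketch): regenerate the header only when the binary
 * file actually changes, so the conversion cost is paid once.
 *
 *     logo.h: logo.png
 *             xxd -i logo.png > logo.h
 */
#include "logo.h"    /* defines logo_png[] and logo_png_len */

/* With a C23 compiler that supports #embed, the external step goes
 * away - the directive expands to the comma-separated byte values: */
static const unsigned char logo_data[] = {
#embed "logo.png"
};

Either way, the array only needs regenerating and recompiling when 
logo.png itself changes, so with a sane build setup it is a one-off 
cost.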

> I've heard lots of horror stories elsewhere about projects taking 
> minutes, tens of minutes or even hours to build.

I agree - some kinds of builds take a /long/ time.  Embedding binary 
blobs has absolutely nothing to do with it.  Indeed, long build times 
are often the result of trying to put too much in one build rather than 
splitting things up into separate files and libraries.  (Sometimes such 
big builds are justified, such as for large programs with very large 
user bases.)

> 
> How much of that is due to attitudes like yours? You've managed to find 
> ways of working around speed problems, by throwing hardware resources at 
> it (fast processors, loads of memory, multiple cores, SSD, RAM-disk), or 
> using ingenuity in *avoiding* having to compile stuff as much as 
> possible. Or maybe the programs you build aren't that big.

You are joking, right?  Or trolling?

(I'm snipping the rest, because if it is not trolling, it would take far 
too long to explain to you how the software development world works for 
everyone else.)