
Path: ...!2.eu.feeder.erje.net!feeder.erje.net!eternal-september.org!feeder3.eternal-september.org!news.eternal-september.org!.POSTED!not-for-mail
From: bart <bc@freeuk.com>
Newsgroups: comp.lang.c
Subject: =?UTF-8?Q?Re=3A_technology_discussion_=E2=86=92_does_the_world_need?=
 =?UTF-8?B?IGEgIm5ldyIgQyA/?=
Date: Sat, 6 Jul 2024 10:21:36 +0100
Organization: A noiseless patient Spider
Lines: 45
Message-ID: <v6b2av$3ofef$1@dont-email.me>
References: <v66eci$2qeee$1@dont-email.me> <v67gt1$2vq6a$2@dont-email.me>
 <v687h2$36i6p$1@dont-email.me> <v68sjv$3a7lb$1@dont-email.me>
 <v6a76q$3gqkm$6@dont-email.me>
MIME-Version: 1.0
Content-Type: text/plain; charset=UTF-8; format=flowed
Content-Transfer-Encoding: 8bit
Injection-Date: Sat, 06 Jul 2024 11:21:36 +0200 (CEST)
Injection-Info: dont-email.me; posting-host="c37fc5164b87e9d2d37a8965ff47e68e";
	logging-data="3947983"; mail-complaints-to="abuse@eternal-september.org";	posting-account="U2FsdGVkX1+KoZi4gSG/ic0JPWW7EpnB"
User-Agent: Mozilla Thunderbird
Cancel-Lock: sha1:cz+rZrDyYJHsNglaENAPhP3DkLI=
Content-Language: en-GB
In-Reply-To: <v6a76q$3gqkm$6@dont-email.me>
Bytes: 2941

On 06/07/2024 02:38, Lawrence D'Oliveiro wrote:
> On Fri, 5 Jul 2024 14:31:44 +0100, bart wrote:
> 
>> C also is the only language that is supposed to work on any kind of
>> processor ...
> 
> I don’t think there is anything innate in the design of C to ensure that.
> It was simply its popularity that meant it was usually the first language
> implemented on a new processor.
> 
> For example, C assumes byte addressability.

C didn't define a 'byte' at all. It assumed 'char' addressability, but 
allowed that 'char' to be any width. At some point a minimum of 8 bits 
was imposed (C89's CHAR_BIT >= 8 requirement).

> So that causes awkwardness on
> architectures like the PDP-10, for example.

Which was also the first machine I used, and the first I wrote a 
compiler for.

C didn't exist on it, at least at that establishment, and was never 
mentioned. I didn't take a closer look until 16 years after I started 
coding.

The 36-bit words caused problems for other languages too. There was an 
instruction set extension to allow access to bitfields of any width, but 
that was fiddly to use.

Some languages had to choose between 'packed' strings, and strings using 
one word per character.

> It just so happened such
> architectures became extinct at about the time the rise of 8-bit
> microprocessors (and their more advanced successors) made byte-
> addressability essentially universal.

The next machine I wrote a compiler for was an 8-bit microprocessor, 
using two's complement, byte addressability, some 16-bit capability, and 
16-bit addressing.

Most of today's hardware evolved from such a model: 32- and 64-bit words 
and addresses were an obvious natural progression. C, however, still 
hasn't got the memo.