Rocksolid Light



devel / comp.lang.misc / Re: C Plagiarism

Subject                                Author
* C Plagiarism                         Bart
+* Re: C Plagiarism                    James Harris
|`* Re: C Plagiarism                   Bart
| `* Re: C Plagiarism                  James Harris
|  `* Re: C Plagiarism                 Bart
|   +* Re: C Plagiarism                James Harris
|   |`- Re: C Plagiarism               Bart
|   `* Re: C Plagiarism                David Brown
|    `* Re: C Plagiarism               Bart
|     +* Re: C Plagiarism              Dmitry A. Kazakov
|     |`* Re: C Plagiarism             Bart
|     | `- Re: C Plagiarism            Dmitry A. Kazakov
|     `* Re: C Plagiarism              David Brown
|      `* Re: C Plagiarism             Bart
|       `* Re: C Plagiarism            David Brown
|        +- Re: C Plagiarism           Andy Walker
|        `* Re: C Plagiarism           Bart
|         `- Re: C Plagiarism          Dmitry A. Kazakov
`- Re: C Plagiarism                    David Brown

C Plagiarism

<tlaulk$10io$1@gioia.aioe.org>


https://www.rocksolidbbs.com/devel/article-flat.php?id=2109&group=comp.lang.misc#2109

Path: i2pn2.org!i2pn.org!aioe.org!uabYU4OOdxBKlV2hpj27FQ.user.46.165.242.75.POSTED!not-for-mail
From: bc@freeuk.com (Bart)
Newsgroups: comp.lang.misc
Subject: C Plagiarism
Date: Sat, 19 Nov 2022 16:01:58 +0000
Organization: Aioe.org NNTP Server
Message-ID: <tlaulk$10io$1@gioia.aioe.org>
Mime-Version: 1.0
Content-Type: text/plain; charset=UTF-8; format=flowed
Content-Transfer-Encoding: 7bit
Injection-Info: gioia.aioe.org; logging-data="33368"; posting-host="uabYU4OOdxBKlV2hpj27FQ.user.gioia.aioe.org"; mail-complaints-to="abuse@aioe.org";
User-Agent: Mozilla/5.0 (Windows NT 10.0; Win64; x64; rv:102.0) Gecko/20100101
Thunderbird/102.5.0
X-Notice: Filtered by postfilter v. 0.9.2
 by: Bart - Sat, 19 Nov 2022 16:01 UTC

On 16/11/2022 16:50, David Brown wrote:
> Yes, but for you, a "must-have" list for a programming language would be
> mainly "must be roughly like ancient style C in functionality, but with
> enough change in syntax and appearance so that no one will think it is
> C". If that's what you like, and what pays for your daily bread, then
> that's absolutely fine.

On 18/11/2022 07:12, David Brown wrote:
> Yes, it is a lot like C. It has a number of changes, some that I think
> are good, some that I think are bad, but basically it is mostly like C.

The above remarks imply strongly that my systems language is a rip-off
of C.

This is completely untrue.

I started devising languages in 1981, but didn't write my first C
program until at least 1992 (which is when I acquired a compiler).

I did buy the K&R book around 1982, but was disappointed enough by the
language that I sold it to a colleague (at a significant loss, too). That
1992 Visual C compiler was itself given away for nothing.

I didn't write any significant C program until about 2010, and even for
that, I had to create a syntax wrapper, preprocessed with a script, to
make the process more palatable.

My influences were these languages:

Algol 68 (this one I'd never used, only read about it avidly)
Algol 60
Pascal
Fortran IV
Babbage (a machine-oriented language I'd implemented for PDP10)
ASM for PDP10 and for Z80

There were some things that were eventually adopted from C, but much
later on:

* 0xABC notation for hex constants (instead of 0ABCH)
* F() notation for calling functions with no parameters
(instead of just F)
* Allowing F as well as &F to create function references
* '$caligned' option for structs, to force C-style member
layouts, but mainly to be in line with such structs in APIs

The design of my systems language, especially its type system, was
largely driven by machine architecture, so necessarily had to have
similarities with other lower level languages.

C itself has a considerable number of differences from my approach:

* An utterly different syntax for a start (and a crazy one too)

* Character- not line-oriented

* It's case-sensitive

* Arrays are indexed always from 0

* It uses 3 'char' types (which are really small integers)

* It has very loosely defined integer types

* It has fixed-width types often defined on top of those loose
types, with unwelcome consequences (eg. you cannot use %lld or 123LL
for those u/intN_t types, or int32_t may or may not be a synonym for
either int or long)

* It conflates arrays and pointers (a fact that has significant
consequences)

* It doesn't support value arrays in expressions or for direct
parameter passing (only when contained within structs)

* It has separate statements and expressions

* It has block scopes

* It has the concept of struct tags

* It uses different namespaces for labels, tags and everything else

* How it interprets hex floating point constants is a dramatic
difference, being a mix of hex, decimal and binary (mine is pure
hex)

* There are very peculiar rules for how many {} pairs are needed in
data initialisers

* It uses some 30 system headers, needed even for basic functionality
(eg. it needs a header to enable 'printf', 'int8_t', 'NULL')

* It doesn't allow out-of-order declarations; forward declarations
are needed instead

* Augmented assignments like `a += b` return a value

* It doesn't have a conventional for-loop

* It has a 'const' attribute on types

* It doesn't have default parameter values, or keyword parameters,
or language-supported reference parameters

* It makes signed integer overflow, and lots of other aspects,
undefined behaviour

* The rules for mixing signed and unsigned ints in binary operations,
in terms of what signedness will be used in evaluation and result,
are elaborate

* It uses primitive means for printing values (to print an elaborate
expression that may involve opaque types, you need to know the
exact type)

* There is no easy way to get the length of a fixed array type

* It has a token-based macro system on which it relies extensively for
added functionality.

Lots of other things besides; all of them things my 'rip-off' has
inexplicably never bothered to replicate. Plus there's all the bigger
stuff I listed elsewhere that I have and C doesn't.

From the lofty, condescending viewpoint of a functional language, this
may all seem petty squabbling about inconsequential details.

But for somebody involved in this space of devising and implementing
these lower level languages (the kind that actually run things), these
are all significant.

Re: C Plagiarism

<tlbdkq$39me6$2@dont-email.me>


https://www.rocksolidbbs.com/devel/article-flat.php?id=2113&group=comp.lang.misc#2113

Path: i2pn2.org!i2pn.org!eternal-september.org!reader01.eternal-september.org!.POSTED!not-for-mail
From: james.harris.1@gmail.com (James Harris)
Newsgroups: comp.lang.misc
Subject: Re: C Plagiarism
Date: Sat, 19 Nov 2022 20:17:30 +0000
Organization: A noiseless patient Spider
Lines: 45
Message-ID: <tlbdkq$39me6$2@dont-email.me>
References: <tlaulk$10io$1@gioia.aioe.org>
MIME-Version: 1.0
Content-Type: text/plain; charset=UTF-8; format=flowed
Content-Transfer-Encoding: 8bit
Injection-Date: Sat, 19 Nov 2022 20:17:30 -0000 (UTC)
Injection-Info: reader01.eternal-september.org; posting-host="6918f107364b996941fc57975b841693";
logging-data="3463622"; mail-complaints-to="abuse@eternal-september.org"; posting-account="U2FsdGVkX1/P2l8rHM5l76UEf/NqRY1crwwYXJVXK+c="
User-Agent: Mozilla/5.0 (X11; Linux x86_64; rv:102.0) Gecko/20100101
Thunderbird/102.4.2
Cancel-Lock: sha1:YujW62wZe4+LxV15h4Mg6Uyj2pk=
In-Reply-To: <tlaulk$10io$1@gioia.aioe.org>
Content-Language: en-GB
 by: James Harris - Sat, 19 Nov 2022 20:17 UTC

On 19/11/2022 16:01, Bart wrote:
>
> On 16/11/2022 16:50, David Brown wrote:
> > Yes, but for you, a "must-have" list for a programming language would be
> > mainly "must be roughly like ancient style C in functionality, but with
> > enough change in syntax and appearance so that no one will think it is
> > C".  If that's what you like, and what pays for your daily bread, then
> > that's absolutely fine.
>
> On 18/11/2022 07:12, David Brown wrote:
> > Yes, it is a lot like C.  It has a number of changes, some that I think
> > are good, some that I think are bad, but basically it is mostly like C.
>
> The above remarks implies strongly that my systems language is a rip-off
> of C.

I don't think anyone could accuse /you/ of copying C! Your view of it is
consistently negative. IIRC you even produced a long list of things
which are wrong with C.

....

> My influences were these languages:
>
>     Algol 68 (this one I'd never used, only read about it avidly)
>     Algol 60
>     Pascal
>     Fortran IV
>     Babbage (a machine-oriented language I'd implemented for PDP10)
>     ASM for PDP10 and for Z80

I try to keep my main influences to hardware and various assembly
languages I've used over the years. But even though we try not to be
influenced by C I don't think any of us can help it. Two reasons: C
became the base for so many languages which came after it, and C so well
fits the underlying machine.

I even suspect that the CPUs we use today are also as they are in part
due to C. It has been that influential.

--
James Harris

Re: C Plagiarism

<tlbecl$ecj$1@gioia.aioe.org>


https://www.rocksolidbbs.com/devel/article-flat.php?id=2116&group=comp.lang.misc#2116

Path: i2pn2.org!i2pn.org!aioe.org!uabYU4OOdxBKlV2hpj27FQ.user.46.165.242.75.POSTED!not-for-mail
From: bc@freeuk.com (Bart)
Newsgroups: comp.lang.misc
Subject: Re: C Plagiarism
Date: Sat, 19 Nov 2022 20:30:16 +0000
Organization: Aioe.org NNTP Server
Message-ID: <tlbecl$ecj$1@gioia.aioe.org>
References: <tlaulk$10io$1@gioia.aioe.org> <tlbdkq$39me6$2@dont-email.me>
Mime-Version: 1.0
Content-Type: text/plain; charset=UTF-8; format=flowed
Content-Transfer-Encoding: 7bit
Injection-Info: gioia.aioe.org; logging-data="14739"; posting-host="uabYU4OOdxBKlV2hpj27FQ.user.gioia.aioe.org"; mail-complaints-to="abuse@aioe.org";
User-Agent: Mozilla/5.0 (Windows NT 10.0; Win64; x64; rv:102.0) Gecko/20100101
Thunderbird/102.5.0
X-Notice: Filtered by postfilter v. 0.9.2
 by: Bart - Sat, 19 Nov 2022 20:30 UTC

On 19/11/2022 20:17, James Harris wrote:

>
> I try to keep my main influences to hardware and various assembly
> languages I've used over the years. But even though we try not to be
> influenced by C I don't think any of us can help it. Two reasons: C
> became the base for so many languages which came after it, and C so well
> fits the underlying machine.
>
> I even suspect that the CPUs we use today are also as they are in part
> due to C. It has been that influential.

Well, there's a lot of C code around that needs to be kept working.

However, what aspects of today's processors do you think owe anything to C?

The progression from 8 to 16 to 32 to 64 bits and beyond has long been
on the cards, irrespective of languages.

Actually C is lagging behind since most implementations are stuck with a
32-bit int type. Which means lots of software, for those lazily using
'int' everywhere, will perpetuate the limitations of that type.

C famously also doesn't like to pin down its types. It doesn't even have
a `byte` type, and its `char` type, apart from not having a specified
signedness, could have any width of 8 bits or more.

Re: C Plagiarism

<tlbg9t$3a00n$5@dont-email.me>


https://www.rocksolidbbs.com/devel/article-flat.php?id=2118&group=comp.lang.misc#2118

Path: i2pn2.org!i2pn.org!eternal-september.org!reader01.eternal-september.org!.POSTED!not-for-mail
From: james.harris.1@gmail.com (James Harris)
Newsgroups: comp.lang.misc
Subject: Re: C Plagiarism
Date: Sat, 19 Nov 2022 21:02:53 +0000
Organization: A noiseless patient Spider
Lines: 45
Message-ID: <tlbg9t$3a00n$5@dont-email.me>
References: <tlaulk$10io$1@gioia.aioe.org> <tlbdkq$39me6$2@dont-email.me>
<tlbecl$ecj$1@gioia.aioe.org>
MIME-Version: 1.0
Content-Type: text/plain; charset=UTF-8; format=flowed
Content-Transfer-Encoding: 7bit
Injection-Date: Sat, 19 Nov 2022 21:02:53 -0000 (UTC)
Injection-Info: reader01.eternal-september.org; posting-host="6918f107364b996941fc57975b841693";
logging-data="3473431"; mail-complaints-to="abuse@eternal-september.org"; posting-account="U2FsdGVkX18C1qFYcRwiv0i0+wZ0sBUqXADw1vLA53Y="
User-Agent: Mozilla/5.0 (X11; Linux x86_64; rv:102.0) Gecko/20100101
Thunderbird/102.4.2
Cancel-Lock: sha1:jJvs7h0RXTpB2PlV4/IjH1I/61Q=
In-Reply-To: <tlbecl$ecj$1@gioia.aioe.org>
Content-Language: en-GB
 by: James Harris - Sat, 19 Nov 2022 21:02 UTC

On 19/11/2022 20:30, Bart wrote:
> On 19/11/2022 20:17, James Harris wrote:
>
>>
>> I try to keep my main influences to hardware and various assembly
>> languages I've used over the years. But even though we try not to be
>> influenced by C I don't think any of us can help it. Two reasons: C
>> became the base for so many languages which came after it, and C so
>> well fits the underlying machine.
>>
>> I even suspect that the CPUs we use today are also as they are in part
>> due to C. It has been that influential.
>
> Well, there's a lot of C code around that needs to be keep working.

Yes.

>
> However, what aspects of today's processors do you think owe anything to C?

Things like the 8-bit byte, 2's complement, and the lack of segmentation.

>
> The progression from 8 to 16 to 32 to 64 bits and beyond has long been
> on the cards, irrespective of languages.
>
> Actually C is lagging behind since most implementations are stuck with a
> 32-bit int type. Which means lots of software, for those lazily using
> 'int' everywhere, will perpetuate the limitations of that type.
>
> C famously also doesn't like to pin down its types. It doesn't even have
> a `byte` type, and its `char` type, apart from not have a specified
> signedness, could have any width of 8 bits or more.

Pre C99 yes. But AIUI since C99 C has had very precise types such as

int64_t

but it only allows specific sizes.

--
James Harris

Re: C Plagiarism

<tlbj0f$fne$1@gioia.aioe.org>


https://www.rocksolidbbs.com/devel/article-flat.php?id=2120&group=comp.lang.misc#2120

Path: i2pn2.org!i2pn.org!aioe.org!uabYU4OOdxBKlV2hpj27FQ.user.46.165.242.75.POSTED!not-for-mail
From: bc@freeuk.com (Bart)
Newsgroups: comp.lang.misc
Subject: Re: C Plagiarism
Date: Sat, 19 Nov 2022 21:49:03 +0000
Organization: Aioe.org NNTP Server
Message-ID: <tlbj0f$fne$1@gioia.aioe.org>
References: <tlaulk$10io$1@gioia.aioe.org> <tlbdkq$39me6$2@dont-email.me>
<tlbecl$ecj$1@gioia.aioe.org> <tlbg9t$3a00n$5@dont-email.me>
Mime-Version: 1.0
Content-Type: text/plain; charset=UTF-8; format=flowed
Content-Transfer-Encoding: 8bit
Injection-Info: gioia.aioe.org; logging-data="16110"; posting-host="uabYU4OOdxBKlV2hpj27FQ.user.gioia.aioe.org"; mail-complaints-to="abuse@aioe.org";
User-Agent: Mozilla/5.0 (Windows NT 10.0; Win64; x64; rv:102.0) Gecko/20100101
Thunderbird/102.5.0
X-Notice: Filtered by postfilter v. 0.9.2
 by: Bart - Sat, 19 Nov 2022 21:49 UTC

On 19/11/2022 21:02, James Harris wrote:
> On 19/11/2022 20:30, Bart wrote:
>> On 19/11/2022 20:17, James Harris wrote:
>>
>>>
>>> I try to keep my main influences to hardware and various assembly
>>> languages I've used over the years. But even though we try not to be
>>> influenced by C I don't think any of us can help it. Two reasons: C
>>> became the base for so many languages which came after it, and C so
>>> well fits the underlying machine.
>>>
>>> I even suspect that the CPUs we use today are also as they are in
>>> part due to C. It has been that influential.
>>
>> Well, there's a lot of C code around that needs to be keep working.
>
> Yes.
>
>>
>> However, what aspects of today's processors do you think owe anything
>> to C?
>
> Things like the 8-bit byte, 2's complement, and the lack of segmentation.

Really? C was pretty much the only language in the world that did not
specify the size of a byte. (It doesn't even have a 'byte' type.)

And it's a language that, even now (until C23) DOESN'T stipulate that
integers use two's complement.

As for segmentation, or lack of, that was very common across machines.

It is really nothing at all to do with C. (How would it have influenced
that anyway, given that C implementations were adept at dealing with any
memory model?)

>
>>
>> The progression from 8 to 16 to 32 to 64 bits and beyond has long been
>> on the cards, irrespective of languages.
>>
>> Actually C is lagging behind since most implementations are stuck with
>> a 32-bit int type. Which means lots of software, for those lazily
>> using 'int' everywhere, will perpetuate the limitations of that type.
>>
>> C famously also doesn't like to pin down its types. It doesn't even
>> have a `byte` type, and its `char` type, apart from not have a
>> specified signedness, could have any width of 8 bits or more.
>
> Pre C99 yes. But AIUI since C99 C has had very precise types such as
>
>   int64_t

I'm sure the byte type, its size, and byte-addressability were
influenced more by IBM, such as with its 360 mainframes from the 1960s
BC (Before C). The first byte-addressed machine I used was a 360-clone.

In any case, I would dispute that C even now properly has fixed-width
types. First, you need to do this to enable them:

#include <stdint.h>

Otherwise it knows nothing about them. Second, if you look inside a
typical stdint.h file (this one is from gcc/TDM on Windows), you might
well see:

typedef signed char int8_t;
typedef unsigned char uint8_t;

Nothing here guarantees that int8_t will be an 8-bit type; these
'exact-width' types are defined on top of those loosely-defined types.
They're an illusion.

Re: C Plagiarism

<tlbl16$3adok$1@dont-email.me>


https://www.rocksolidbbs.com/devel/article-flat.php?id=2121&group=comp.lang.misc#2121

Path: i2pn2.org!i2pn.org!eternal-september.org!reader01.eternal-september.org!.POSTED!not-for-mail
From: james.harris.1@gmail.com (James Harris)
Newsgroups: comp.lang.misc
Subject: Re: C Plagiarism
Date: Sat, 19 Nov 2022 22:23:34 +0000
Organization: A noiseless patient Spider
Lines: 92
Message-ID: <tlbl16$3adok$1@dont-email.me>
References: <tlaulk$10io$1@gioia.aioe.org> <tlbdkq$39me6$2@dont-email.me>
<tlbecl$ecj$1@gioia.aioe.org> <tlbg9t$3a00n$5@dont-email.me>
<tlbj0f$fne$1@gioia.aioe.org>
MIME-Version: 1.0
Content-Type: text/plain; charset=UTF-8; format=flowed
Content-Transfer-Encoding: 8bit
Injection-Date: Sat, 19 Nov 2022 22:23:34 -0000 (UTC)
Injection-Info: reader01.eternal-september.org; posting-host="6918f107364b996941fc57975b841693";
logging-data="3487508"; mail-complaints-to="abuse@eternal-september.org"; posting-account="U2FsdGVkX19EUYBPXIvJ53ehW3InZ3F4rbRuprAuA7o="
User-Agent: Mozilla/5.0 (X11; Linux x86_64; rv:102.0) Gecko/20100101
Thunderbird/102.4.2
Cancel-Lock: sha1:wxf8/jjv9hUD8cgEQTTIStnTbok=
In-Reply-To: <tlbj0f$fne$1@gioia.aioe.org>
Content-Language: en-GB
 by: James Harris - Sat, 19 Nov 2022 22:23 UTC

On 19/11/2022 21:49, Bart wrote:
> On 19/11/2022 21:02, James Harris wrote:
>> On 19/11/2022 20:30, Bart wrote:
>>> On 19/11/2022 20:17, James Harris wrote:

....

>>>> I even suspect that the CPUs we use today are also as they are in
>>>> part due to C. It has been that influential.
>>>
>>> Well, there's a lot of C code around that needs to be keep working.
>>
>> Yes.
>>
>>>
>>> However, what aspects of today's processors do you think owe anything
>>> to C?
>>
>> Things like the 8-bit byte, 2's complement, and the lack of segmentation.
>
> Really? C was pretty much the only language in the world that does not
> specify the size of a byte. (It doesn't even a 'byte' type.)
>
> And it's a language that, even now (until C23) DOESN'T stipulate that
> integers use two's complement.

That's not what I was thinking. Rather, it was C's lower-level approach
to storage which helped cement in programmers' minds memory as an array
of bytes. Kernighan's C text even included an allocator which used
standard C to manage memory.

Don't get me wrong, I am not saying C was the main driver, or even that
we wouldn't have had 2's complement and 8-bit bytes without it. But C
gave programmers access to implementation details, and the logic of
chars using 8 bits all encouraged programmers and IT people in general
to think in terms of octet-addressable storage.

>
> As for segmentation, or lack of, that was very common across machines.

I remember reading that when AMD wanted to design a 64-bit architecture
they asked programmers (especially at Microsoft) what they wanted. One
thing was 'no segmentation'. The C model had encouraged programmers to
think in terms of flat address spaces, and the mainstream segmented
approach for x86 was a nightmare that people didn't want to repeat.

....

>>> C famously also doesn't like to pin down its types. It doesn't even
>>> have a `byte` type, and its `char` type, apart from not have a
>>> specified signedness, could have any width of 8 bits or more.
>>
>> Pre C99 yes. But AIUI since C99 C has had very precise types such as
>>
>>    int64_t
>
> I'm sure the byte type, it's size and byte-addressibility, was more
> influenced more by IBM, such as with its 360 mainframes from the 1960s
> BC (Before C). The first byte-addressed machine I used was a 360-clone.

I used a 6502 and a Z80 before starting work but probably like you I
began work on S360. IIRC IBM pioneered different architectures
(including various byte sizes) on their Stretch product.

>
> In any case, I would dispute that C even now properly has fixed-width
> types. First, you need to do this to enable them:
>
>     #include <stdint.h>
>
> Otherwise it knows nothing about them.

Types don't have to be inbuilt to be provided as part of the standard.

> Second, if you look inside a
> typical stdint.h file (this one is from gcc/TDM on Windows), you might
> well see:
>
>     typedef signed char int8_t;
>     typedef unsigned char uint8_t;
>
> Nothing here guarantees that int8_t will be an 8-bit type; these
> 'exact-width' types are defined on top of those loosely-defined types.
> They're an illusion.

The header is built to match the distribution.

--
James Harris

Re: C Plagiarism

<tlbsnk$1bu$1@gioia.aioe.org>


https://www.rocksolidbbs.com/devel/article-flat.php?id=2122&group=comp.lang.misc#2122

Path: i2pn2.org!i2pn.org!aioe.org!uabYU4OOdxBKlV2hpj27FQ.user.46.165.242.75.POSTED!not-for-mail
From: bc@freeuk.com (Bart)
Newsgroups: comp.lang.misc
Subject: Re: C Plagiarism
Date: Sun, 20 Nov 2022 00:35:00 +0000
Organization: Aioe.org NNTP Server
Message-ID: <tlbsnk$1bu$1@gioia.aioe.org>
References: <tlaulk$10io$1@gioia.aioe.org> <tlbdkq$39me6$2@dont-email.me>
<tlbecl$ecj$1@gioia.aioe.org> <tlbg9t$3a00n$5@dont-email.me>
<tlbj0f$fne$1@gioia.aioe.org> <tlbl16$3adok$1@dont-email.me>
Mime-Version: 1.0
Content-Type: text/plain; charset=UTF-8; format=flowed
Content-Transfer-Encoding: 7bit
Injection-Info: gioia.aioe.org; logging-data="1406"; posting-host="uabYU4OOdxBKlV2hpj27FQ.user.gioia.aioe.org"; mail-complaints-to="abuse@aioe.org";
User-Agent: Mozilla/5.0 (Windows NT 10.0; Win64; x64; rv:102.0) Gecko/20100101
Thunderbird/102.5.0
X-Notice: Filtered by postfilter v. 0.9.2
 by: Bart - Sun, 20 Nov 2022 00:35 UTC

On 19/11/2022 22:23, James Harris wrote:
> On 19/11/2022 21:49, Bart wrote:

>> Really? C was pretty much the only language in the world that does not
>> specify the size of a byte. (It doesn't even a 'byte' type.)
>>
>> And it's a language that, even now (until C23) DOESN'T stipulate that
>> integers use two's complement.
>
> That's not what I was thinking. Rather, it was C's lower-level approach
> to storage which helped cement in programmers' minds memory as an array
> of bytes. Kernighan's C text even included an allocator which used
> standard C to manage memory.

> Don't get me wrong I am not saying C was the main driver or even that we
> wouldn't have had 2's complement and 8-bit bytes without it but that C
> gave programmers access to implementation details,and the logic of chars
> using 8 bits all encouraged programmers and IT people in general to
> think in terms of octet-addressable storage.

>>
>> As for segmentation, or lack of, that was very common across machines.
>
> I remember reading that when AMD wanted to design a 64-bit architecture
> they asked programmers (especially at Microsoft) what they wanted. One
> thing was 'no segmentation'. The C model had encouraged programmers to
> think in terms of flat address spaces, and the mainstream segmented
> approach for x86 was a nightmare that people didn't want to repeat.

I think you're ascribing too much to C. In what way did any other
languages (Algol, Pascal, Cobol, Fortran, even Ada by then) encourage
the use of segmented memory?

Do you mean because C required the use of different kinds of pointers,
and people were fed up with that? Whereas other languages hid that
detail better.

You might as well say then that Assembly was equally responsible since
it was even more of a pain to deal with segments!

(Actually, aren't the segments still there on x86? Except they are 4GB
in size instead of 64KB.)

>> Nothing here guarantees that int8_t will be an 8-bit type; these
>> 'exact-width' types are defined on top of those loosely-defined types.
>> They're an illusion.
>
> The header is built to match the distribution.

Still, it is something when the language most famous for being 'close to
the metal' doesn't allow you to use a byte type, unless you enable it.

Re: C Plagiarism

<tlg5ik$3quce$1@dont-email.me>


https://www.rocksolidbbs.com/devel/article-flat.php?id=2130&group=comp.lang.misc#2130

Path: i2pn2.org!i2pn.org!eternal-september.org!reader01.eternal-september.org!.POSTED!not-for-mail
From: david.brown@hesbynett.no (David Brown)
Newsgroups: comp.lang.misc
Subject: Re: C Plagiarism
Date: Mon, 21 Nov 2022 16:30:27 +0100
Organization: A noiseless patient Spider
Lines: 29
Message-ID: <tlg5ik$3quce$1@dont-email.me>
References: <tlaulk$10io$1@gioia.aioe.org>
MIME-Version: 1.0
Content-Type: text/plain; charset=UTF-8; format=flowed
Content-Transfer-Encoding: 8bit
Injection-Date: Mon, 21 Nov 2022 15:30:28 -0000 (UTC)
Injection-Info: reader01.eternal-september.org; posting-host="ddb0d85423665bc01be4416d93681612";
logging-data="4028814"; mail-complaints-to="abuse@eternal-september.org"; posting-account="U2FsdGVkX19zm7r0jMKH4JDSsbBHGaueAubd6xwfVDk="
User-Agent: Mozilla/5.0 (X11; Linux x86_64; rv:91.0) Gecko/20100101
Thunderbird/91.9.1
Cancel-Lock: sha1:VfJBHCjmPvVmzpEy4oy5TROA+jE=
In-Reply-To: <tlaulk$10io$1@gioia.aioe.org>
Content-Language: en-GB
 by: David Brown - Mon, 21 Nov 2022 15:30 UTC

On 19/11/2022 17:01, Bart wrote:
>
> On 16/11/2022 16:50, David Brown wrote:
> > Yes, but for you, a "must-have" list for a programming language would be
> > mainly "must be roughly like ancient style C in functionality, but with
> > enough change in syntax and appearance so that no one will think it is
> > C".  If that's what you like, and what pays for your daily bread, then
> > that's absolutely fine.
>
> On 18/11/2022 07:12, David Brown wrote:
> > Yes, it is a lot like C.  It has a number of changes, some that I think
> > are good, some that I think are bad, but basically it is mostly like C.
>
> The above remarks implies strongly that my systems language is a rip-off
> of C.
>

No, it does not. You can infer what you want from what I write, but I
don't see any such implications from my remark. If anyone were to write
a (relatively) simple structured language for low level work, suitable
for "direct" compilation to assembly on a reasonable selection of common
general-purpose processors, and with the aim of giving a "portable
alternative to writing in assembly", then the result will inevitably
have a good deal in common with C. There can be plenty of differences
in the syntax and details, but the "ethos" or "flavour" of the language
will be similar.

Note that I have referred to Pascal as C-like in this sense.

Re: C Plagiarism

<tlge3j$3rksv$1@dont-email.me>


https://www.rocksolidbbs.com/devel/article-flat.php?id=2132&group=comp.lang.misc#2132

Path: i2pn2.org!i2pn.org!eternal-september.org!reader01.eternal-september.org!.POSTED!not-for-mail
From: david.brown@hesbynett.no (David Brown)
Newsgroups: comp.lang.misc
Subject: Re: C Plagiarism
Date: Mon, 21 Nov 2022 18:56:02 +0100
Organization: A noiseless patient Spider
Lines: 130
Message-ID: <tlge3j$3rksv$1@dont-email.me>
References: <tlaulk$10io$1@gioia.aioe.org> <tlbdkq$39me6$2@dont-email.me>
<tlbecl$ecj$1@gioia.aioe.org> <tlbg9t$3a00n$5@dont-email.me>
<tlbj0f$fne$1@gioia.aioe.org>
MIME-Version: 1.0
Content-Type: text/plain; charset=UTF-8; format=flowed
Content-Transfer-Encoding: 8bit
Injection-Date: Mon, 21 Nov 2022 17:56:03 -0000 (UTC)
Injection-Info: reader01.eternal-september.org; posting-host="ddb0d85423665bc01be4416d93681612";
logging-data="4051871"; mail-complaints-to="abuse@eternal-september.org"; posting-account="U2FsdGVkX18DQ3wSp7DTU+zEod3oaKQKxkkmINjYZ7E="
User-Agent: Mozilla/5.0 (X11; Linux x86_64; rv:91.0) Gecko/20100101
Thunderbird/91.9.1
Cancel-Lock: sha1:igxgSUjslzkt7LQ6xCQYM1BZ/BM=
In-Reply-To: <tlbj0f$fne$1@gioia.aioe.org>
Content-Language: en-GB
 by: David Brown - Mon, 21 Nov 2022 17:56 UTC

On 19/11/2022 22:49, Bart wrote:
> On 19/11/2022 21:02, James Harris wrote:
>> On 19/11/2022 20:30, Bart wrote:
>>> On 19/11/2022 20:17, James Harris wrote:
>>>
>>>>
>>>> I try to keep my main influences to hardware and various assembly
>>>> languages I've used over the years. But even though we try not to be
>>>> influenced by C I don't think any of us can help it. Two reasons: C
>>>> became the base for so many languages which came after it, and C so
>>>> well fits the underlying machine.
>>>>
>>>> I even suspect that the CPUs we use today are also as they are in
>>>> part due to C. It has been that influential.

C is /massively/ influential to the general purpose CPUs we have today.
The prime requirement for almost any CPU design is that you should be
able to use it efficiently for C. After all, the great majority of
software is written in languages that, at their core, are similar to C
(in the sense that once the compiler front-end has finished with them,
you have variables, imperative functions, pointers, objects in memory,
etc., much like C). Those languages that are significantly different
rely on run-times and libraries that are written in C.

>>>
>>> Well, there's a lot of C code around that needs to be keep working.
>>
>> Yes.
>>
>>>
>>> However, what aspects of today's processors do you think owe anything
>>> to C?
>>
>> Things like the 8-bit byte, 2's complement, and the lack of segmentation.
>
> Really? C was pretty much the only language in the world that does not
> specify the size of a byte. (It doesn't even a 'byte' type.)
>

8-bit byte and two's complement were, I think, inevitable regardless of
C. But while the C standard does not require them, their popularity has
grown along with C.

> And it's a language that, even now (until C23) DOESN'T stipulate that
> integers use two's complement.
>
> As for segmentation, or lack of, that was very common across machines.
>

There are plenty of architectures that did not have linear addressing,
and there are many advantages of not allowing memory to be viewed and
accessed as one continuous address space (primarily, it can make buffer
overruns and out of bounds accesses almost impossible). C's model does
not /require/ a simple linear memory space, but such a setup makes C far
easier.

> It is really nothing at all to do with C. (How would it have influenced
> that anyway, given that C implementions were adept are dealing with any
> memory model?)
>

C implementations are /not/ good at dealing with non-linear memory, and
lots of C software assumes memory is linear (and also that bytes are
8-bit, and integers are two's complement). Having the C standard
/allow/ more varied systems does not imply that other systems are good
for C.

But of course C was not the only influence on processor evolution.

>>
>>>
>>> The progression from 8 to 16 to 32 to 64 bits and beyond has long
>>> been on the cards, irrespective of languages.
>>>
>>> Actually C is lagging behind since most implementations are stuck
>>> with a 32-bit int type. Which means lots of software, for those
>>> lazily using 'int' everywhere, will perpetuate the limitations of
>>> that type.
>>>
>>> C famously also doesn't like to pin down its types. It doesn't even
>>> have a `byte` type, and its `char` type, apart from not having a
>>> specified signedness, could have any width of 8 bits or more.
>>
>> Pre C99 yes. But AIUI since C99 C has had very precise types such as
>>
>>    int64_t
>
> I'm sure the byte type, its size and byte-addressability, was
> influenced more by IBM, such as with its 360 mainframes from the 1960s
> BC (Before C). The first byte-addressed machine I used was a 360-clone.
>
> In any case, I would dispute that C even now properly has fixed-width
> types. First, you need to do this to enable them:

Dispute all you want - it does not change a thing.

>
>     #include <stdint.h>
>
> Otherwise it knows nothing about them. Second, if you look inside a
> typical stdint.h file (this one is from gcc/TDM on Windows), you might
> well see:
>
>     typedef signed char int8_t;
>     typedef unsigned char uint8_t;
>
> Nothing here guarantees that int8_t will be an 8-bit type; these
> 'exact-width' types are defined on top of those loosely-defined types.
> They're an illusion.
>

Sorry, you are completely wrong here. Feel free to look it up in the C
standards if you don't believe me.

One of the biggest influences C had on processor design was the idea of
a single stack for return addresses and data, with stack pointer +
offset and frame pointer + offset addressing. C is not the only
language that works well with that setup, but it can't really take any
kind of advantage of more advanced setups with multiple stacks or linked
stack frames. Languages that have local functions, such as Pascal or
Ada, could benefit from more sophisticated stack models. Better stack
models on processors would also greatly reduce the risk of stack
overflows, corruption (intentional or unintentional) of return
addresses on stacks, and other bugs in software.

However, any kind of guesses as to how processors would have looked
without C, and therefore what influence C /really/ had, are always going
to be speculative.

Re: C Plagiarism

<tlggtk$ikp$1@gioia.aioe.org>

https://www.rocksolidbbs.com/devel/article-flat.php?id=2133&group=comp.lang.misc#2133

Newsgroups: comp.lang.misc
Path: i2pn2.org!i2pn.org!aioe.org!uabYU4OOdxBKlV2hpj27FQ.user.46.165.242.75.POSTED!not-for-mail
From: bc@freeuk.com (Bart)
Newsgroups: comp.lang.misc
Subject: Re: C Plagiarism
Date: Mon, 21 Nov 2022 18:44:05 +0000
Organization: Aioe.org NNTP Server
Message-ID: <tlggtk$ikp$1@gioia.aioe.org>
References: <tlaulk$10io$1@gioia.aioe.org> <tlbdkq$39me6$2@dont-email.me>
<tlbecl$ecj$1@gioia.aioe.org> <tlbg9t$3a00n$5@dont-email.me>
<tlbj0f$fne$1@gioia.aioe.org> <tlge3j$3rksv$1@dont-email.me>
Mime-Version: 1.0
Content-Type: text/plain; charset=UTF-8; format=flowed
Content-Transfer-Encoding: 8bit
Injection-Info: gioia.aioe.org; logging-data="19097"; posting-host="uabYU4OOdxBKlV2hpj27FQ.user.gioia.aioe.org"; mail-complaints-to="abuse@aioe.org";
User-Agent: Mozilla/5.0 (Windows NT 10.0; Win64; x64; rv:102.0) Gecko/20100101
Thunderbird/102.5.0
X-Notice: Filtered by postfilter v. 0.9.2
 by: Bart - Mon, 21 Nov 2022 18:44 UTC

On 21/11/2022 17:56, David Brown wrote:
> On 19/11/2022 22:49, Bart wrote:

>>>>> I even suspect that the CPUs we use today are also as they are in
>>>>> part due to C. It has been that influential.
>
> C is /massively/ influential to the general purpose CPUs we have today.

"Massively" influential? Why, how do you think CPUs would have ended up
without C?

Two of the first machines I used were PDP10 and PDP11, developed by DEC
in the 1960s, both using linear memory spaces. While the former was
word-based, the PDP11 was byte-addressable, just like the IBM 360 also
from the 1960s.

The early microprocessors I used (6800, Z80) also had a linear memory
space, at a time when it was unlikely C implementations existed for
them, or that people even thought that much about C outside of Unix.

>  The prime requirement for almost any CPU design is that you should be
> able to use it efficiently for C.

And not Assembly, or Fortran or any other language? Don't forget that at
the point it all began to change, mid-70s to mid-80s, C wasn't that
dominant. Any C implementations for microprocessors were incredibly slow
and produced indifferent code.

The OSes I used (for PDP10, PDP11, ICL 4/72, Z80) had no C involvement.
When x86 popularised segment memory, EVERYBODY hated it, and EVERY
language had a problem with it.

The REASON for segmented memory was that 16 bits and address spaces
larger than 64K words didn't mix. When this was eventually fixed on
80386 for x86, that was able to use 32-bit registers.

According to you, without C, we would have been using 64KB segments even
with 32 bit registers, or we maybe would never have got to 32 bits at
all. What nonsense!

(I was designing paper CPUs with linear addressing long before then,
probably like lots of people.)

> After all, the great majority of
> software is written in languages that, at their core, are similar to C
> (in the sense that once the compiler front-end has finished with them,
> you have variables, imperative functions, pointers, objects in memory,
> etc., much like C).

I wish people would just accept that C does not have and never has had a
monopoly on lower level languages.

It's a shame that people now associate 'close-to-the-metal' programming
with a language where a function pointer type is written as
`void(*)(void)`, and that's in the simplest case.

>

>> Really? C was pretty much the only language in the world that does not
>> specify the size of a byte. (It doesn't even have a 'byte' type.)
>>
>
> 8-bit byte and two's complement were, I think, inevitable regardless of
> C.

So were lots of things. It didn't take a clairvoyant to guess that the
next progression of 8 -> 16 was going to be 32 and then 64.

(The Z8000 came out in 1979. It was a 16-bit processor with a register
set that could be accessed as 8, 16, 32 or 64-bit chunks. Actually you
can also look at 68000 from that era, and the NatSemi 32032. I was an
engineer at the time and very familiar with this stuff.

C didn't figure in that world at all as far as I was concerned.)

>> It is really nothing at all to do with C. (How would it have
>> influenced that anyway, given that C implementations were adept at
>> dealing with any memory model?)
>>
>
> C implementations are /not/ good at dealing with non-linear memory,

No language likes it, as I said.

> But of course C was not the only influence on processor evolution.

OK, you admit now it was not '/massive/'; good!

>>
>>      #include <stdint.h>
>>
>> Otherwise it knows nothing about them. Second, if you look inside a
>> typical stdint.h file (this one is from gcc/TDM on Windows), you might
>> well see:
>>
>>      typedef signed char int8_t;
>>      typedef unsigned char uint8_t;
>>
>> Nothing here guarantees that int8_t will be an 8-bit type; these
>> 'exact-width' types are defined on top of those loosely-defined types.
>> They're an illusion.
>>
>
> Sorry, you are completely wrong here.  Feel free to look it up in the C
> standards if you don't believe me.

The above typedefs are from a C compiler you may have heard of: 'gcc'.
Some may well use internal types such as `__int8`, but the above is the
actual content of stdint.h, and makes `int8_t` a direct synonym for
`signed char`.

> However, any kind of guesses as to how processors would have looked
> without C, and therefore what influence C /really/ had, are always going
> to be speculative.

Without C, another lower-level systems language would have dominated,
since such a language was necessary.

More interesting however is what Unix would have looked like without C.

Re: C Plagiarism

<tlgmi1$185r$1@gioia.aioe.org>

https://www.rocksolidbbs.com/devel/article-flat.php?id=2135&group=comp.lang.misc#2135

Newsgroups: comp.lang.misc
Path: i2pn2.org!i2pn.org!aioe.org!BLmi2Wt9MIz6qXtlQu2iWw.user.46.165.242.91.POSTED!not-for-mail
From: mailbox@dmitry-kazakov.de (Dmitry A. Kazakov)
Newsgroups: comp.lang.misc
Subject: Re: C Plagiarism
Date: Mon, 21 Nov 2022 21:20:20 +0100
Organization: Aioe.org NNTP Server
Message-ID: <tlgmi1$185r$1@gioia.aioe.org>
References: <tlaulk$10io$1@gioia.aioe.org> <tlbdkq$39me6$2@dont-email.me>
<tlbecl$ecj$1@gioia.aioe.org> <tlbg9t$3a00n$5@dont-email.me>
<tlbj0f$fne$1@gioia.aioe.org> <tlge3j$3rksv$1@dont-email.me>
<tlggtk$ikp$1@gioia.aioe.org>
Mime-Version: 1.0
Content-Type: text/plain; charset=UTF-8; format=flowed
Content-Transfer-Encoding: 7bit
Injection-Info: gioia.aioe.org; logging-data="41147"; posting-host="BLmi2Wt9MIz6qXtlQu2iWw.user.gioia.aioe.org"; mail-complaints-to="abuse@aioe.org";
User-Agent: Mozilla/5.0 (Windows NT 10.0; Win64; x64; rv:102.0) Gecko/20100101
Thunderbird/102.5.0
Content-Language: en-US
X-Notice: Filtered by postfilter v. 0.9.2
 by: Dmitry A. Kazakov - Mon, 21 Nov 2022 20:20 UTC

On 2022-11-21 19:44, Bart wrote:

> Two of the first machines I used were PDP10 and PDP11, developed by DEC
> in the 1960s, both using linear memory spaces. While the former was
> word-based, the PDP11 was byte-addressable, just like the IBM 360 also
> from the 1960s.

PDP-11 was not linear. The internal machine address was 24-bit. But the
effective address in the program was 16-bit. The address space was 64K
for data and 64K for code mapped by the virtual memory manager. Some
machines had a third 64K space.

> And not Assembly, or Fortran or any other language?

Assembler is not portable. FORTRAN had no pointers. Programmers
implemented memory management on top of an array (e.g. LOGICAL*1, since
it had no bytes or characters either (:-)). Since FORTRAN was totally
untyped, you didn't even need to cast anything! (:-))

> The REASON for segmented memory was that 16 bits and address spaces
> larger than 64K words didn't mix. When this was eventually fixed on
> 80386 for x86, that was able to use 32-bit registers.

Segmented memory requires fewer memory registers because the segment size
may vary. A potential advantage, as was already mentioned, is that
you could theoretically implement bounds checking on top of it. One
example of such techniques was VAX debugger which ran programs at normal
speed between breakpoints. The trick was to place active breakpoints on
no-access pages. I don't advocate segmented memory, BTW.

> More interesting however is what Unix would have looked like without C.

Though I hate both, I don't think C influenced UNIX much.

--
Regards,
Dmitry A. Kazakov
http://www.dmitry-kazakov.de

Re: C Plagiarism

<tlgmmp$3sdo2$2@dont-email.me>

https://www.rocksolidbbs.com/devel/article-flat.php?id=2136&group=comp.lang.misc#2136

Newsgroups: comp.lang.misc
Path: i2pn2.org!i2pn.org!eternal-september.org!reader01.eternal-september.org!.POSTED!not-for-mail
From: david.brown@hesbynett.no (David Brown)
Newsgroups: comp.lang.misc
Subject: Re: C Plagiarism
Date: Mon, 21 Nov 2022 21:22:49 +0100
Organization: A noiseless patient Spider
Lines: 214
Message-ID: <tlgmmp$3sdo2$2@dont-email.me>
References: <tlaulk$10io$1@gioia.aioe.org> <tlbdkq$39me6$2@dont-email.me>
<tlbecl$ecj$1@gioia.aioe.org> <tlbg9t$3a00n$5@dont-email.me>
<tlbj0f$fne$1@gioia.aioe.org> <tlge3j$3rksv$1@dont-email.me>
<tlggtk$ikp$1@gioia.aioe.org>
MIME-Version: 1.0
Content-Type: text/plain; charset=UTF-8; format=flowed
Content-Transfer-Encoding: 8bit
Injection-Date: Mon, 21 Nov 2022 20:22:49 -0000 (UTC)
Injection-Info: reader01.eternal-september.org; posting-host="33316b73f02e3cd04c1452fac68237a1";
logging-data="4077314"; mail-complaints-to="abuse@eternal-september.org"; posting-account="U2FsdGVkX1/nYWwQg07i7ivyKYEazUPTLwIc7HBdQsU="
User-Agent: Mozilla/5.0 (X11; Linux x86_64; rv:102.0) Gecko/20100101
Thunderbird/102.2.2
Cancel-Lock: sha1:3DnIbVkqeV5FfUj8AddgY+N219M=
In-Reply-To: <tlggtk$ikp$1@gioia.aioe.org>
Content-Language: en-GB
 by: David Brown - Mon, 21 Nov 2022 20:22 UTC

On 21/11/2022 19:44, Bart wrote:
> On 21/11/2022 17:56, David Brown wrote:
>> On 19/11/2022 22:49, Bart wrote:
>
>>>>>> I even suspect that the CPUs we use today are also as they are in
>>>>>> part due to C. It has been that influential.
>>
>> C is /massively/ influential to the general purpose CPUs we have today.
>
> "Massively" influential? Why, how do you think CPUs would have ended up
> without C?

As I said at the end of my previous post, it's very difficult to tell.
Maybe they would be more varied. Maybe we'd have more stacks. Maybe
we'd be freed from the idea that a "pointer" is nothing more than a
linear address - it could have bounds, or access flags. Registers and
memory could hold type information as well as values. Processors could
have had support for multi-threading or parallel processing. They could
have been designed around event models and signal passing, or have
hardware acceleration for accessing code or data by name. They could
have been better at handling coroutines. There are all kinds of
different things hardware /could/ do, at least some of which would
greatly suit some of the many different kinds of programming languages
we have seen through the years.

A few of these have turned up - there are processors with multiple
stacks optimised for Forth, there were early massively parallel
processors designed alongside the Occam language, the company Linn Smart
Computing made a radical new processor design for more efficient
implementation of their own programming language. Some ARM cores had
hardware acceleration for Java virtual machines.

But I have no specific thoughts - predictions about possible parallel
pasts are just as hard as predictions about the future!

>
> Two of the first machines I used were PDP10 and PDP11, developed by DEC
> in the 1960s, both using linear memory spaces. While the former was
> word-based, the PDP11 was byte-addressable, just like the IBM 360 also
> from the 1960s.
>

C was developed originally for these processors, and was a major reason
for their long-term success.

C was designed with some existing processors in mind - I don't think
anyone is suggesting that features such as linear memory came about
solely because of C. But there was more variety of processor
architectures in the old days, while almost all we have now are
processors that are good for running C code.

> The early microprocessors I used (6800, Z80) also had a linear memory
> space, at a time when it was unlikely C implementations existed for
> them, or that people even thought that much about C outside of Unix.
>
>>   The prime requirement for almost any CPU design is that you should
>> be able to use it efficiently for C.
>
> And not Assembly, or Fortran or any other language?

Not assembly, no - /very/ little code is now written in assembly.
FORTRAN efficiency used to be important for processor design, but not
for a very long time. (FORTRAN is near enough the same programming
model as C, however.)

> Don't forget that at
> the point it all began to change, mid-70s to mid-80, C wasn't that
> dominant. Any C implementations for microprocessors were incredibly slow
> and produced indifferent code.
>
> The OSes I used (for PDP10, PDP11, ICL 4/72, Z80) had no C involvement.
> When x86 popularised segment memory, EVERYBODY hated it, and EVERY
> language had a problem with it.
>

Yes - the choice of the 8086 for PC's was a huge mistake. It was purely
economics - the IBM designers wanted a 68000 processor. But IBM PHB's
said that since the IBM PC was just a marketing exercise and they would
never make more than a few thousand machines, technical benefits were
irrelevant and the 8086 devices were cheaper. (By the same logic, they
bought the cheapest OS they could get, despite everyone saying it was
rubbish.)

> The REASON for segmented memory was that 16 bits and address spaces
> larger than 64K words didn't mix. When this was eventually fixed on
> 80386 for x86, that was able to use 32-bit registers.
>
> According to you, without C, we would have been using 64KB segments even
> with 32 bit registers, or we maybe would never have got to 32 bits at
> all. What nonsense!
>

Eh, no. I did not say anything /remotely/ like that.

> (I was designing paper CPUs with linear addressing long before then,
> probably like lots of people.)
>
>
>> After all, the great majority of software is written in languages that, at
>> their core, are similar to C (in the sense that once the compiler
>> front-end has finished with them, you have variables, imperative
>> functions, pointers, objects in memory, etc., much like C).
>
> I wish people would just accept that C does not have and never has had a
> monopoly on lower level languages.
>

It does have, and has had for 40+ years, a /near/ monopoly on low-level
languages. You can dislike C as much as you want, but you really cannot
deny that!

> It's a shame that people now associate 'close-to-the-metal' programming
> with a language where a function pointer type is written as
> `void(*)(void)`, and that's in the simplest case.
>

I don't disagree that it is a shame, or that better (for whatever value
of "better" you like) low-level languages exist or can be made. That
doesn't change the facts.

>>
>
>>> Really? C was pretty much the only language in the world that does
>>> not specify the size of a byte. (It doesn't even have a 'byte' type.)
>>>
>>
>> 8-bit byte and two's complement were, I think, inevitable regardless
>> of C.
>
> So were lots of things. It didn't take a clairvoyant to guess that the
> next progression of 8 -> 16 was going to be 32 and then 64.
>

Agreed.

> (The Z8000 came out in 1979. It was a 16-bit processor with a register
> set that could be accessed as 8, 16, 32 or 64-bit chunks. Actually you
> can also look at 68000 from that era, and the NatSemi 32032. I was an
> engineer at the time and very familiar with this stuff.
>
> C didn't figure in that world at all as far as I was concerned.)
>
>>> It is really nothing at all to do with C. (How would it have
>>> influenced that anyway, given that C implementations were adept at
>>> dealing with any memory model?)
>>>
>>
>> C implementations are /not/ good at dealing with non-linear memory,
>
> No language likes it, as I said.
>
>> But of course C was not the only influence on processor evolution.
>
> OK, you admit now it was not '/massive/'; good!
>

Would you please stop making things up and pretending I said them?

C was a /massive/ influence on processor evolution and the current
standardisation of general-purpose processors as systems for running C
code efficiently. But it was not the only influence, or the sole reason
for current processor design.

>>>
>>>      #include <stdint.h>
>>>
>>> Otherwise it knows nothing about them. Second, if you look inside a
>>> typical stdint.h file (this one is from gcc/TDM on Windows), you
>>> might well see:
>>>
>>>      typedef signed char int8_t;
>>>      typedef unsigned char uint8_t;
>>>
>>> Nothing here guarantees that int8_t will be an 8-bit type; these
>>> 'exact-width' types are defined on top of those loosely-defined
>>> types. They're an illusion.
>>>
>>
>> Sorry, you are completely wrong here.  Feel free to look it up in the
>> C standards if you don't believe me.
>
> The above typedefs are from a C compiler you may have heard of: 'gcc'.
> Some may well use internal types such as `__int8`, but the above is the
> actual content of stdint.h, and makes `int8_t` a direct synonym for
> `signed char`.
>

They are part of C - specified precisely in the C standards. It does
not matter how any particular implementation defines them. The C
standards say they are part of C, and the type names are introduced into
the current namespace using "#include <stdint.h>" (or "#include
<inttypes.h>".) The standards also say that "int8_t" is an 8-bit type,
with no padding, and two's complement representation. This has been the
case since C99 - there is no "looseness" or "illusions" in these types.

>
>
>> However, any kind of guesses as to how processors would have looked
>> without C, and therefore what influence C /really/ had, are always
>> going to be speculative.
>
> Without C, another lower-level systems language would have dominated,
> since such a language was necessary.


Re: C Plagiarism

<tlgr4b$1akj$1@gioia.aioe.org>

https://www.rocksolidbbs.com/devel/article-flat.php?id=2138&group=comp.lang.misc#2138

Newsgroups: comp.lang.misc
Path: i2pn2.org!i2pn.org!aioe.org!uabYU4OOdxBKlV2hpj27FQ.user.46.165.242.75.POSTED!not-for-mail
From: bc@freeuk.com (Bart)
Newsgroups: comp.lang.misc
Subject: Re: C Plagiarism
Date: Mon, 21 Nov 2022 21:38:19 +0000
Organization: Aioe.org NNTP Server
Message-ID: <tlgr4b$1akj$1@gioia.aioe.org>
References: <tlaulk$10io$1@gioia.aioe.org> <tlbdkq$39me6$2@dont-email.me>
<tlbecl$ecj$1@gioia.aioe.org> <tlbg9t$3a00n$5@dont-email.me>
<tlbj0f$fne$1@gioia.aioe.org> <tlge3j$3rksv$1@dont-email.me>
<tlggtk$ikp$1@gioia.aioe.org> <tlgmi1$185r$1@gioia.aioe.org>
Mime-Version: 1.0
Content-Type: text/plain; charset=UTF-8; format=flowed
Content-Transfer-Encoding: 7bit
Injection-Info: gioia.aioe.org; logging-data="43667"; posting-host="uabYU4OOdxBKlV2hpj27FQ.user.gioia.aioe.org"; mail-complaints-to="abuse@aioe.org";
User-Agent: Mozilla/5.0 (Windows NT 10.0; Win64; x64; rv:102.0) Gecko/20100101
Thunderbird/102.5.0
X-Notice: Filtered by postfilter v. 0.9.2
 by: Bart - Mon, 21 Nov 2022 21:38 UTC

On 21/11/2022 20:20, Dmitry A. Kazakov wrote:
> On 2022-11-21 19:44, Bart wrote:
>
>> Two of the first machines I used were PDP10 and PDP11, developed by
>> DEC in the 1960s, both using linear memory spaces. While the former
>> was word-based, the PDP11 was byte-addressable, just like the IBM 360
>> also from the 1960s.
>
> PDP-11 was not linear. The internal machine address was 24-bit. But the
> effective address in the program was 16-bit. The address space was 64K
> for data and 64K for code mapped by the virtual memory manager. Some
> machines had a third 64K space.

My PDP11/34 probably didn't have that much memory. But if you couldn't
access more than 64K per task (say for code or data, if treated
separately), then I would still call that linear from the task's point
of view.

>> And not Assembly, or Fortran or any other language?
>
> Assembler is not portable.

That is not relevant. The suggestion was that keeping C happy was a
motivation for CPU designers, but a lot of ASM code was still being run too.

> FORTRAN had no pointers. Programmers
> implemented memory management on top of an array

But those arrays work better in linear memory. There was a lot of
Fortran code around too (probably a lot more than C at the time I got
into it), and /that/ code needed to stay efficient too.

So I was questioning whether C was that big a factor at that period when
the architectures we have now were just beginning to be developed.

Re: C Plagiarism

<tlgsit$1v1q$1@gioia.aioe.org>

https://www.rocksolidbbs.com/devel/article-flat.php?id=2139&group=comp.lang.misc#2139

Newsgroups: comp.lang.misc
Path: i2pn2.org!i2pn.org!aioe.org!W4+pUJJ+LMQSnRdpBvjvmw.user.46.165.242.91.POSTED!not-for-mail
From: mailbox@dmitry-kazakov.de (Dmitry A. Kazakov)
Newsgroups: comp.lang.misc
Subject: Re: C Plagiarism
Date: Mon, 21 Nov 2022 23:03:09 +0100
Organization: Aioe.org NNTP Server
Message-ID: <tlgsit$1v1q$1@gioia.aioe.org>
References: <tlaulk$10io$1@gioia.aioe.org> <tlbdkq$39me6$2@dont-email.me>
<tlbecl$ecj$1@gioia.aioe.org> <tlbg9t$3a00n$5@dont-email.me>
<tlbj0f$fne$1@gioia.aioe.org> <tlge3j$3rksv$1@dont-email.me>
<tlggtk$ikp$1@gioia.aioe.org> <tlgmi1$185r$1@gioia.aioe.org>
<tlgr4b$1akj$1@gioia.aioe.org>
Mime-Version: 1.0
Content-Type: text/plain; charset=UTF-8; format=flowed
Content-Transfer-Encoding: 7bit
Injection-Info: gioia.aioe.org; logging-data="64570"; posting-host="W4+pUJJ+LMQSnRdpBvjvmw.user.gioia.aioe.org"; mail-complaints-to="abuse@aioe.org";
User-Agent: Mozilla/5.0 (Windows NT 10.0; Win64; x64; rv:102.0) Gecko/20100101
Thunderbird/102.5.0
Content-Language: en-US
X-Notice: Filtered by postfilter v. 0.9.2
 by: Dmitry A. Kazakov - Mon, 21 Nov 2022 22:03 UTC

On 2022-11-21 22:38, Bart wrote:
> On 21/11/2022 20:20, Dmitry A. Kazakov wrote:
>> On 2022-11-21 19:44, Bart wrote:
>>
>>> Two of the first machines I used were PDP10 and PDP11, developed by
>>> DEC in the 1960s, both using linear memory spaces. While the former
>>> was word-based, the PDP11 was byte-addressable, just like the IBM 360
>>> also from the 1960s.
>>
>> PDP-11 was not linear. The internal machine address was 24-bit. But
>> the effective address in the program was 16-bit. The address space was
>> 64K for data and 64K for code mapped by the virtual memory manager.
>> Some machines had a third 64K space.
>
> My PDP11/34 probably didn't have that much memory. But if you couldn't
> access more than 64K per task (say for code or data, if treated
> separately), then I would still call that linear from the task's point
> of view.

So is segmented memory if you have a single segment. Once you needed
more than 64K of data or code, your linearity would end.

>> FORTRAN had no pointers. Programmers implemented memory management on
>> top of an array
>
> But those arrays work better in linear memory.

FORTRAN was not high-level enough to support memory mapping on indexing.
The method of handling larger-than-address-space data structures and
code was the loader's overlay trees, a kind of precursor of paging/swap.
Segmented or paged made no difference.

--
Regards,
Dmitry A. Kazakov
http://www.dmitry-kazakov.de

Re: C Plagiarism

<tlifro$bj1$1@gioia.aioe.org>

https://www.rocksolidbbs.com/devel/article-flat.php?id=2140&group=comp.lang.misc#2140

Newsgroups: comp.lang.misc
Path: i2pn2.org!i2pn.org!aioe.org!uabYU4OOdxBKlV2hpj27FQ.user.46.165.242.75.POSTED!not-for-mail
From: bc@freeuk.com (Bart)
Newsgroups: comp.lang.misc
Subject: Re: C Plagiarism
Date: Tue, 22 Nov 2022 12:38:18 +0000
Organization: Aioe.org NNTP Server
Message-ID: <tlifro$bj1$1@gioia.aioe.org>
References: <tlaulk$10io$1@gioia.aioe.org> <tlbdkq$39me6$2@dont-email.me>
<tlbecl$ecj$1@gioia.aioe.org> <tlbg9t$3a00n$5@dont-email.me>
<tlbj0f$fne$1@gioia.aioe.org> <tlge3j$3rksv$1@dont-email.me>
<tlggtk$ikp$1@gioia.aioe.org> <tlgmmp$3sdo2$2@dont-email.me>
Mime-Version: 1.0
Content-Type: text/plain; charset=UTF-8; format=flowed
Content-Transfer-Encoding: 8bit
Injection-Info: gioia.aioe.org; logging-data="11873"; posting-host="uabYU4OOdxBKlV2hpj27FQ.user.gioia.aioe.org"; mail-complaints-to="abuse@aioe.org";
User-Agent: Mozilla/5.0 (Windows NT 10.0; Win64; x64; rv:102.0) Gecko/20100101
Thunderbird/102.5.0
X-Notice: Filtered by postfilter v. 0.9.2
 by: Bart - Tue, 22 Nov 2022 12:38 UTC

On 21/11/2022 20:22, David Brown wrote:
> On 21/11/2022 19:44, Bart wrote:

>> Two of the first machines I used were PDP10 and PDP11, developed by
>> DEC in the 1960s, both using linear memory spaces. While the former
>> was word-based, the PDP11 was byte-addressable, just like the IBM 360
>> also from the 1960s.
>>
>
> C was developed originally for these processors, and was a major reason
> for their long-term success.

Of the PDP10 and IBM 360? Designed in the 1960s and discontinued in 1983
and 1979 respectively. C only came out in a first version in 1972.

The PDP11 was superseded around this time (either side of 1980) by the
VAX-11, a 32-bit version, no doubt inspired by the C language, one that
was well known for not specifying the sizes of its types - it adapted to
the size of the hardware.

Do you really believe this stuff?

> C was designed with some existing processors in mind - I don't think
> anyone is suggesting that features such as linear memory came about
> solely because of C.  But there was more variety of processor
> architectures in the old days, while almost all we have now are
> processors that are good for running C code.

As I said, C is the language that adapts itself to the hardware, and in
fact still is the primary language now that can and does run on every
odd-ball architecture.

Which is why it is an odd candidate for a language that was supposed to
drive the evolution of hardware because of its requirements.

>> The early microprocessors I used (6800, Z80) also had a linear memory
>> space, at a time when it was unlikely C implementations existed for
>> them, or that people even thought that much about C outside of Unix.
>>
>>>   The prime requirement for almost any CPU design is that you should
>>> be able to use it efficiently for C.
>>
>> And not Assembly, or Fortran or any other language?
>
> Not assembly, no - /very/ little code is now written in assembly.

Now, yes. I'm talking about that formative period of mid-70s to mid-80s
when everything changed. From being dominated by mainframes, to 32-bit
microprocessors which are only one step behind the 64-bit ones we have now.

> FORTRAN efficiency used to be important for processor design, but not
> for a very long time.  (FORTRAN is near enough the same programming
> model as C, however.)

Oh, right. In that case could it possibly have been the need to run
Fortran efficiently that was a driving force in that period?

(I spent a year in the late 70s writing Fortran code in two scientific
establishments in the UK. No one used C.)

>> Don't forget that at the point it all began to change, mid-70s to
>> mid-80s, C wasn't that dominant. Any C implementations for
>> microprocessors were incredibly slow and produced indifferent code.
>>
>> The OSes I used (for PDP10, PDP11, ICL 4/72, Z80) had no C
>> involvement. When x86 popularised segment memory, EVERYBODY hated it,
>> and EVERY language had a problem with it.
>>
>
> Yes - the choice of the 8086 for PC's was a huge mistake.  It was purely
> economics - the IBM designers wanted a 68000 processor.

When you looked at the 68000 more closely, it had nearly as much
non-orthogonality as the 8086. (I was trying at that time to get my
company to switch to a processor like the 68k.)

(The 8086 was bearable, but it had one poor design choice that had huge
implications: forming an address by shifting a 16-bit segment address by
4 bits instead of 8.

That meant an addressing range of only 1MB instead of 16MB, leading to a
situation later where you could cheaply install 4MB or 8MB of memory,
but you couldn't easily make use of it.)

>> According to you, without C, we would have been using 64KB segments
>> even with 32 bit registers, or we maybe would never have got to 32
>> bits at all. What nonsense!
>>
>
> Eh, no.  I did not say anything /remotely/ like that.

It sounds like it! Just accept that C had no more nor less influence
than any other language /at that time/.

> I does have, and has had for 40+ years, a /near/ monopoly on low-level
> languages.  You can dislike C as much as you want, but you really cannot
> deny that!

It's also the fact that /I/ at least have also successfully avoided
using C for 40+ years (and, probably fairly uniquely, have used private
languages). I'm sure there are other stories like mine that you don't
hear about.

>>> But of course C was not the only influence on processor evolution.
>>
>> OK, you admit now it was not '/massive/'; good!
>>
>
> Would you please stop making things up and pretending I said them?

You actually said this:

> C is /massively/ influential to the general purpose CPUs we have today.

Which suggests that you don't think any other language comes close.

I don't know which individual language, if any, was most influential,
but I doubt C played a huge part because it came out too late, and was
not that popular in those formative years, by which time the way
processors were going to evolve was already becoming clear.

(That is, still dominated by von Neumann architectures, as has been the
case since long before C.)

But C probably has influenced modern 64-bit ABIs, even though they are
supposed to be language-independent.

>> More interesting however is what Unix would have looked like without C.
>
> How do you think it would have looked?

Case insensitive? Or maybe that's just wishful thinking.

Re: C Plagiarism

<tlipt7$48av$1@dont-email.me>


https://www.rocksolidbbs.com/devel/article-flat.php?id=2143&group=comp.lang.misc#2143

Newsgroups: comp.lang.misc
Path: i2pn2.org!i2pn.org!eternal-september.org!reader01.eternal-september.org!.POSTED!not-for-mail
From: david.brown@hesbynett.no (David Brown)
Newsgroups: comp.lang.misc
Subject: Re: C Plagiarism
Date: Tue, 22 Nov 2022 16:29:42 +0100
Organization: A noiseless patient Spider
Lines: 249
Message-ID: <tlipt7$48av$1@dont-email.me>
References: <tlaulk$10io$1@gioia.aioe.org> <tlbdkq$39me6$2@dont-email.me>
<tlbecl$ecj$1@gioia.aioe.org> <tlbg9t$3a00n$5@dont-email.me>
<tlbj0f$fne$1@gioia.aioe.org> <tlge3j$3rksv$1@dont-email.me>
<tlggtk$ikp$1@gioia.aioe.org> <tlgmmp$3sdo2$2@dont-email.me>
<tlifro$bj1$1@gioia.aioe.org>
MIME-Version: 1.0
Content-Type: text/plain; charset=UTF-8; format=flowed
Content-Transfer-Encoding: 8bit
Injection-Date: Tue, 22 Nov 2022 15:29:43 -0000 (UTC)
Injection-Info: reader01.eternal-september.org; posting-host="f6dd6be597a36657552486283c84fce2";
logging-data="139615"; mail-complaints-to="abuse@eternal-september.org"; posting-account="U2FsdGVkX1/sy2fDXhoFfnbVdMlg2O4oBCHxYVhLtBc="
User-Agent: Mozilla/5.0 (X11; Linux x86_64; rv:91.0) Gecko/20100101
Thunderbird/91.9.1
Cancel-Lock: sha1:DM80FmHcyY2uuoXCatq9VbqNfIw=
In-Reply-To: <tlifro$bj1$1@gioia.aioe.org>
Content-Language: en-GB
 by: David Brown - Tue, 22 Nov 2022 15:29 UTC

On 22/11/2022 13:38, Bart wrote:
> On 21/11/2022 20:22, David Brown wrote:
>> On 21/11/2022 19:44, Bart wrote:
>
>>> Two of the first machines I used were PDP10 and PDP11, developed by
>>> DEC in the 1960s, both using linear memory spaces. While the former
>>> was word-based, the PDP11 was byte-addressable, just like the IBM 360
>>> also from the 1960s.
>>>
>>
>> C was developed originally for these processors, and was a major
>> reason for their long-term success.
>
> Of the PDP10 and IBM 360? Designed in the 1960s and discontinued in 1983
> and 1979 respectively. C only came out in a first version in 1972.
>

I was thinking primarily of the PDP11, which was the first real target
for C (assuming I have my history correct - this was around the time I
was born). And by "long-term success" of these systems, I mean their
successors that were built in the same style - such as the VAX.

> The PDP11 was superseded around this time (either side of 1980) by the
> VAX-11, a 32-bit version, no doubt inspired by the C language, one that
> was well known for not specifying the sizes of its types - it adapted to
> the size of the hardware.
>
> Do you really believe this stuff?
>
>> C was designed with some existing processors in mind - I don't think
>> anyone is suggesting that features such as linear memory came about
>> solely because of C.  But there was more variety of processor
>> architectures in the old days, while almost all we have now are
>> processors that are good for running C code.
>
> As I said, C is the language that adapts itself to the hardware, and in
> fact still is the primary language now that can and does run on every
> odd-ball architecture.
>

C does not "adapt itself to the hardware". It is specified with some
details of features being decided by the implementer. (Some of these
details are quite important.) Part of the reason for this is to allow
efficient implementations on a wide range of hardware, but it also
determines a balance between implementer freedom, and limits that a
programmer can rely upon. There are plenty of cases where different
implementations on the same hardware make different choices of the
details. (Examples include the size of "long" on 64-bit x86 systems
being different for Windows and the rest of the world, or some compilers
for the original m68k having 16-bit int while others had 32-bit int.)

> Which is why it is an odd candidate for a language that was supposed to
> drive the evolution of hardware because of its requirements.

There is a difference between a language being usable on a range of
systems, and being very /efficient/ on a range of systems. You can use
C on an 8-bit AVR processor - there is a gcc port. But it is not a good
processor design for C - there are few pointer registers, 16-bit
manipulation is inefficient, there are separate address spaces for flash
and ram, there is no stack pointer + offset addressing mode. So while C
is far and away the most popular language for programming AVR's, AVR's
are not good processors for C. (Other 8-bit cores such as the 8051 are
even worse, and that is a reason for them being dropped as soon as
32-bit ARM cores became cheap enough.)

>
>>> The early microprocessors I used (6800, Z80) also had a linear memory
>>> space, at a time when it was unlikely C implementations existed for
>>> them, or that people even thought that much about C outside of Unix.
>>>
>>>>   The prime requirement for almost any CPU design is that you should
>>>> be able to use it efficiently for C.
>>>
>>> And not Assembly, or Fortran or any other language?
>>
>> Not assembly, no - /very/ little code is now written in assembly.
>
> Now, yes. I'm talking about that formative period of mid-70s to mid-80s
> when everything changed. From being dominated by mainframes, to 32-bit
> microprocessors which are only one step behind the 64-bit ones we have now.
>

OK, for a time the ability to program efficiently in assembly was
important. But that was already in decline by the early 1980's in big
systems, as we began to see a move towards RISC processors optimised for
compiler output rather than CISC processors optimised for human assembly
coding. (The continued existence of CISC was almost entirely due to the
IBM PC's choice of the 8088 processor.)

>
>> FORTRAN efficiency used to be important for processor design, but not
>> for a very long time.  (FORTRAN is near enough the same programming
>> model as C, however.)
>
> Oh, right. In that case could it possibly have been the need to run
> Fortran efficiently that was a driving force in that period?

That would have been important too, but C quickly overwhelmed FORTRAN in
popularity. FORTRAN was used in scientific and engineering work, but C
was the choice for systems programming and most application programming.

>
> (I spent a year in the late 70s writing Fortran code in two scientific
> establishments in the UK. No one used C.)
>
>>> Don't forget that at the point it all began to change, mid-70s to
>>> mid-80, C wasn't that dominant. Any C implementations for
>>> microprocessors were incredibly slow and produced indifferent code.
>>>
>>> The OSes I used (for PDP10, PDP11, ICL 4/72, Z80) had no C
>>> involvement. When x86 popularised segment memory, EVERYBODY hated it,
>>> and EVERY language had a problem with it.
>>>
>>
>> Yes - the choice of the 8086 for PC's was a huge mistake.  It was
>> purely economics - the IBM designers wanted a 68000 processor.
>
> When you looked at the 68000 more closely, it had nearly as much
> non-orthogonality as the 8086. (I was trying at that time to get my
> company to switch to a processor like the 68k.)

No, it does not. (Yes, I have looked at it closely, and used 68k
processors extensively.)

>
> (The 8086 was bearable, but it had one poor design choice that had huge
> implications: forming an address by shifting a 16-bit segment address by
> 4 bits instead of 8.
>
> That meant an addressing range of only 1MB instead of 16MB, leading to a
> situation later where you could cheaply install 4MB or 8MB of memory,
> but you couldn't easily make use of it.)

The 8086 was horrible in all sorts of ways. Comparing a 68000 with an
8086 is like comparing a Jaguar E-type with a bathtub with wheels. And
for the actual chip used in the first PC, an 8088, half the wheels were
removed.

>
>
>>> According to you, without C, we would have been using 64KB segments
>>> even with 32 bit registers, or we maybe would never have got to 32
>>> bits at all. What nonsense!
>>>
>>
>> Eh, no.  I did not say anything /remotely/ like that.
>
> It sounds like it! Just accept that C had no more nor less influence
> than any other language /at that time/.
>

The most successful (by a huge margin - like it or not) programming
language evolved, spread and conquered the programming world, at the
same time as the basic processor architecture evolved and solidified
into a style that is very good at executing C programs, and is missing
countless features that would be useful for many other kinds of
programming languages. Coincidence? I think not.

Of course there were other languages that benefited from those same
processors, but none were or are as popular as C and its clear descendants.

>
>> It does have, and has had for 40+ years, a /near/ monopoly on low-level
>> languages.  You can dislike C as much as you want, but you really
>> cannot deny that!
>
> It's also a fact that /I/ at least have successfully avoided
> using C for 40+ years (and, probably fairly uniquely, have used private
> languages). I'm sure there are other stories like mine that you don't
> hear about.

Sure. But for every person like you that has made a successful career
with your own language, there are perhaps 100,000 other programmers who
have used other languages as the basis of their careers. 90% of them at
least will have C or its immediate descendants (C++, Java, C#, etc.) as
their main language.

You can have your opinions about quality, but in terms of /quantity/
there is no contest.

>
>
>>>> But of course C was not the only influence on processor evolution.
>>>
>>> OK, you admit now it was not '/massive/'; good!
>>>
>>
>> Would you please stop making things up and pretending I said them?
>
> You actually said this:
>
> > C is /massively/ influential to the general purpose CPUs we have today.
>
> Which suggests that you don't think any other language comes close.


Re: C Plagiarism

<tlit8t$15vj$1@gioia.aioe.org>


https://www.rocksolidbbs.com/devel/article-flat.php?id=2144&group=comp.lang.misc#2144

Newsgroups: comp.lang.misc
Path: i2pn2.org!i2pn.org!aioe.org!So0LnG6PxyYAGPdAPzwWCg.user.46.165.242.75.POSTED!not-for-mail
From: anw@cuboid.co.uk (Andy Walker)
Newsgroups: comp.lang.misc
Subject: Re: C Plagiarism
Date: Tue, 22 Nov 2022 16:27:08 +0000
Organization: Not very much
Message-ID: <tlit8t$15vj$1@gioia.aioe.org>
References: <tlaulk$10io$1@gioia.aioe.org> <tlbdkq$39me6$2@dont-email.me>
<tlbecl$ecj$1@gioia.aioe.org> <tlbg9t$3a00n$5@dont-email.me>
<tlbj0f$fne$1@gioia.aioe.org> <tlge3j$3rksv$1@dont-email.me>
<tlggtk$ikp$1@gioia.aioe.org> <tlgmmp$3sdo2$2@dont-email.me>
<tlifro$bj1$1@gioia.aioe.org> <tlipt7$48av$1@dont-email.me>
Mime-Version: 1.0
Content-Type: text/plain; charset=UTF-8; format=flowed
Content-Transfer-Encoding: 7bit
Injection-Info: gioia.aioe.org; logging-data="38899"; posting-host="So0LnG6PxyYAGPdAPzwWCg.user.gioia.aioe.org"; mail-complaints-to="abuse@aioe.org";
User-Agent: Mozilla/5.0 (X11; Linux i686; rv:102.0) Gecko/20100101
Thunderbird/102.4.2
Content-Language: en-GB
X-Notice: Filtered by postfilter v. 0.9.2
 by: Andy Walker - Tue, 22 Nov 2022 16:27 UTC

On 22/11/2022 15:29, David Brown wrote:
> Case insensitivity is a mistake, born from the days before computers
> were advanced enough to have small letters as well as capitals.

I don't believe I have ever used a computer that did not "have
small letters". There has been some discussion over in "comp.compilers"
recently, but it's basically the difference between punched cards and
paper tape. The Flexowriter can be traced back to the 1920s, and its
most popular form was certainly being used by computers in the 1950s,
so there really weren't many "days before" to be considered.

--
Andy Walker, Nottingham.
Andy's music pages: www.cuboid.me.uk/andy/Music
Composer of the day: www.cuboid.me.uk/andy/Music/Composers/Hertel

Re: C Plagiarism

<tlivvq$off$1@gioia.aioe.org>


https://www.rocksolidbbs.com/devel/article-flat.php?id=2145&group=comp.lang.misc#2145

Newsgroups: comp.lang.misc
Path: i2pn2.org!i2pn.org!aioe.org!uabYU4OOdxBKlV2hpj27FQ.user.46.165.242.75.POSTED!not-for-mail
From: bc@freeuk.com (Bart)
Newsgroups: comp.lang.misc
Subject: Re: C Plagiarism
Date: Tue, 22 Nov 2022 17:13:32 +0000
Organization: Aioe.org NNTP Server
Message-ID: <tlivvq$off$1@gioia.aioe.org>
References: <tlaulk$10io$1@gioia.aioe.org> <tlbdkq$39me6$2@dont-email.me>
<tlbecl$ecj$1@gioia.aioe.org> <tlbg9t$3a00n$5@dont-email.me>
<tlbj0f$fne$1@gioia.aioe.org> <tlge3j$3rksv$1@dont-email.me>
<tlggtk$ikp$1@gioia.aioe.org> <tlgmmp$3sdo2$2@dont-email.me>
<tlifro$bj1$1@gioia.aioe.org> <tlipt7$48av$1@dont-email.me>
Mime-Version: 1.0
Content-Type: text/plain; charset=UTF-8; format=flowed
Content-Transfer-Encoding: 8bit
Injection-Info: gioia.aioe.org; logging-data="25071"; posting-host="uabYU4OOdxBKlV2hpj27FQ.user.gioia.aioe.org"; mail-complaints-to="abuse@aioe.org";
User-Agent: Mozilla/5.0 (Windows NT 10.0; Win64; x64; rv:102.0) Gecko/20100101
Thunderbird/102.5.0
X-Notice: Filtered by postfilter v. 0.9.2
 by: Bart - Tue, 22 Nov 2022 17:13 UTC

On 22/11/2022 15:29, David Brown wrote:
> On 22/11/2022 13:38, Bart wrote:

>> When you looked at the 68000 more closely, it had nearly as much
>> non-orthogonality as the 8086. (I was trying at that time to get my
>> company to switch to a processor like the 68k.)
>
> No, it does not.  (Yes, I have looked at it closely, and used 68k
> processors extensively.)

As a compiler writer? The first thing you notice is that you have to
decide whether to use D-registers or A-registers, as they had different
characteristics, but the 3-bit register field of instructions could only
address one set or the other.

That made the 8086 simpler because there was no choice! The registers
were limited and only one was general purpose.

>> But C probably has influenced modern 64-bit ABIs, even though they are
>> supposed to be language-independent.
>>
>
> What makes you think they are supposed to be language independent?  What
> makes you think they are not?  What makes you care?

Language A can talk to language B via the machine's ABI. Where does C
come into it?

Language A can talk to a library or OS component that resides in a DLL,
via the ABI. The library might have been implemented in C, or assembler,
or in anything else, but in binary form, is pure machine code anyway.

What makes /you/ think that such ABIs were invented purely for the use
of C programs? Do you think the designers of the ABI simply assumed that
only programs written in the C language could call into the OS?

When you download a shared library DLL, do you think they have different
versions depending on what language will be using the DLL?

> The types and terms from C are a very convenient way to describe an ABI,

They're pretty terrible, actually. The types involved in the SYS V ABI
can be expressed as follows in a form that everyone understands and many
languages use:

i8 i16 i32 i64 i128
u8 u16 u32 u64 u128
f32 f64 f128

This document (https://refspecs.linuxbase.org/elf/x86_64-abi-0.99.pdf)
lists the C equivalents as follows (only signed integers shown):

i8 char, signed char
i16 short, signed short
i32 int, signed int
i64 long, signed long, long long, signed long long
i128 __int128, signed __int128

(No use of int8_t etc., despite the document being dated 2012.)

This comes up in APIs too, where it is 100 times more relevant (only
compiler writers care about the ABI). The C denotations shown here are
not fit for purpose for language-neutral interfaces.

(Notice also that 'long' and 'long long' are both 64 bits, and that
'char' is assumed to be signed. In practice the C denotations would vary
across platforms, while those i8-i128 would stay constant, provided only
that the machine uses conventional register sizes.)
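That mapping can be pinned down with C99's own fixed-width names; a sketch assuming an LP64 SysV target (the i8/u8/f32 spellings are the notation used above, not standard C, and __int128 is a GCC/Clang extension so i128 is left out):

```c
#include <stdint.h>

/* The language-neutral names map one-to-one onto C99 fixed-width types. */
typedef int8_t  i8;   typedef uint8_t  u8;
typedef int16_t i16;  typedef uint16_t u16;
typedef int32_t i32;  typedef uint32_t u32;
typedef int64_t i64;  typedef uint64_t u64;
typedef float   f32;  typedef double   f64;

/* On an LP64 target both the "long" and "long long" rows of the ABI
   table land on i64 -- the redundancy the table exposes. */
_Static_assert(sizeof(long long) == sizeof(i64), "both rows are 64-bit");
```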

So it's more like, such interfaces were developed /despite/ C.

> since it is a language familiar to any programmer who might be
> interested in the details of an ABI.  Such ABI's only cover a
> (relatively) simple common subset of possible interfaces, but do so in a
> way that can be used from any language (with wrappers if needed) and can
> be extended as needed.
>
> People make ABI's for practical use.  MS made the ABI for Win64 to suit
> their own needs and uses.  AMD and a range of *nix developers (both OS
> and application developers) and compiler developers got together to
> develop the 64-bit x86 ABI used by everyone else, designed to suit
> /there/ needs and uses.

x86-32 used a number of different ABIs depending on language and
compiler. x86-64 tends to use one ABI, which is a strong indication that
that ABI was intended to work across languages and compilers.

>> Case insensitive? Or maybe that's just wishful thinking.
>>
>
> Case insensitivity is a mistake, born from the days before computers
> were advanced enough to have small letters as well as capitals.  It
> leads to ugly inconsistencies, wastes the opportunity to convey useful
> semantic information, and is an absolute nightmare as soon as you stray
> from the simple English-language alphabet.

Yet Google searches are case-insensitive. How is that possible, given
that search strings can use Unicode which you say does not define case
equivalents across most alphabets?

As are email addresses and domain names.

As are most things in everyday life, even now that it is all tied up
with computers and smartphones and tablets with everything being online.

(Actually, most people's exposure to case-sensitivity is in online
passwords, which is also the worst place to have it, as usually you
can't see them!)

Your objections make no sense at all. Besides which, plenty of
case-insensitive languages, file-systems and shell programs and
applications exist.

> I believe Unix's predecessor, Multics, was case-sensitive.  But I could
> be wrong.

I'm surprised the Unix and C developers even had a terminal that could
do upper and lower case. I was stuck with upper case for the first year
or two. File-systems and global linker symbols were also restricted in
length and case for a long time, to minimise space.

Case-sensitivity was a luxury into the 80s.

Re: C Plagiarism

<tlj1ur$1rbo$1@gioia.aioe.org>


https://www.rocksolidbbs.com/devel/article-flat.php?id=2146&group=comp.lang.misc#2146

Newsgroups: comp.lang.misc
Path: i2pn2.org!i2pn.org!aioe.org!QJLXApsvkYYOaKx3c4LRTg.user.46.165.242.91.POSTED!not-for-mail
From: mailbox@dmitry-kazakov.de (Dmitry A. Kazakov)
Newsgroups: comp.lang.misc
Subject: Re: C Plagiarism
Date: Tue, 22 Nov 2022 18:47:08 +0100
Organization: Aioe.org NNTP Server
Message-ID: <tlj1ur$1rbo$1@gioia.aioe.org>
References: <tlaulk$10io$1@gioia.aioe.org> <tlbdkq$39me6$2@dont-email.me>
<tlbecl$ecj$1@gioia.aioe.org> <tlbg9t$3a00n$5@dont-email.me>
<tlbj0f$fne$1@gioia.aioe.org> <tlge3j$3rksv$1@dont-email.me>
<tlggtk$ikp$1@gioia.aioe.org> <tlgmmp$3sdo2$2@dont-email.me>
<tlifro$bj1$1@gioia.aioe.org> <tlipt7$48av$1@dont-email.me>
<tlivvq$off$1@gioia.aioe.org>
Mime-Version: 1.0
Content-Type: text/plain; charset=UTF-8; format=flowed
Content-Transfer-Encoding: 7bit
Injection-Info: gioia.aioe.org; logging-data="60792"; posting-host="QJLXApsvkYYOaKx3c4LRTg.user.gioia.aioe.org"; mail-complaints-to="abuse@aioe.org";
User-Agent: Mozilla/5.0 (Windows NT 10.0; Win64; x64; rv:102.0) Gecko/20100101
Thunderbird/102.5.0
X-Notice: Filtered by postfilter v. 0.9.2
Content-Language: en-US
 by: Dmitry A. Kazakov - Tue, 22 Nov 2022 17:47 UTC

On 2022-11-22 18:13, Bart wrote:

> Language A can talk to language B via the machine's ABI. Where does C
> come into it?

Data types of arguments, including padding/gaps in structures, and
calling conventions. E.g. Windows' native calling convention is stdcall,
while C uses cdecl.
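On 32-bit Windows that difference was visible in source code; a sketch using the MSVC-style convention keywords (guarded so it also compiles where those keywords don't exist):

```c
/* 32-bit x86 had several calling conventions in simultaneous use.
   cdecl:   caller pops the arguments; supports varargs; C's default.
   stdcall: callee pops the arguments; used by the Win32 API (WINAPI).
   The keywords are MSVC/MinGW extensions; elsewhere the macros
   expand to nothing and the functions use the platform default. */
#if defined(_M_IX86)
#define CDECL_CC   __cdecl
#define STDCALL_CC __stdcall
#else
#define CDECL_CC
#define STDCALL_CC
#endif

int CDECL_CC   sum_cdecl(int a, int b)   { return a + b; }
int STDCALL_CC sum_stdcall(int a, int b) { return a + b; }

/* On x86-64 there is a single native convention per platform
   (Win64 or SysV), which removed this particular mismatch. */
```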

> Language A can talk to a library or OS component that resides in a DLL,
> via the ABI. The library might have been implemented in C, or assembler,
> or in anything else, but in binary form, is pure machine code anyway.

Same as above. Data types, calling conventions.

> What makes /you/ think that such ABIs were invented purely for the use
> of C programs? Do you think the designers of the ABI simply assumed that
> only programs written in the C language could call into the OS?

That depends on the OS.

- VMS used MACRO-11 and unified calling conventions. That was DEC, and
that was a time when people really cared, before the Dark Age of Computing.

- Windows was stdcall, but then some of its parts gave way to C.

- UNIXes used C's conventions, naturally.

> When you download a shared library DLL, do you think they have different
> versions depending on what language will be using the DLL?

That is certainly a possibility. There are lots of libraries that have
language-specific adapters. If you use a higher-level language you would
like to take advantage of this. Usually there are quite complicated
elaboration protocols upon library loading, ensuring initialization of
complex objects and versioning consistency - all things the stupid
loaders cannot do. The price is that you might not be able to use it
with C or another language.

> I'm surprised the Unix and C developers even had a terminal that could
> do upper and lower case.

No idea, but already the DEC VT52 had lower-case.

(Of course, case-sensitivity was an incredibly stupid choice)

--
Regards,
Dmitry A. Kazakov
http://www.dmitry-kazakov.de
