Discussion:
What is a pile?
Dingbat
2021-11-24 02:51:56 UTC
Many car pileup
https://images.app.goo.gl/NyjN8NAheVG7EMos8

I ask: Not pretty, but where is the pile? Can there be a pileless pile(up)
in a context other than a pileup of cars?
Tony Cooper
2021-11-24 03:07:06 UTC
On Tue, 23 Nov 2021 18:51:56 -0800 (PST), Dingbat
Post by Dingbat
Many car pileup
https://images.app.goo.gl/NyjN8NAheVG7EMos8
I ask: Not pretty, but where is the pile? Can there be a pileless pile(up)
in a context other than a pileup of cars?
There isn't a pile in a "pile up". It's just an expression that
indicates more than one car was in a collision, and there was more
than minor damage to the cars.

The term is written as both "pile up" and "pillup".

We also say that work is piling up on us. It just means we are
overloaded with work.
--
Tony Cooper Orlando Florida
Tony Cooper
2021-11-24 03:09:08 UTC
On Tue, 23 Nov 2021 22:07:06 -0500, Tony Cooper
Post by Tony Cooper
On Tue, 23 Nov 2021 18:51:56 -0800 (PST), Dingbat
Post by Dingbat
Many car pileup
https://images.app.goo.gl/NyjN8NAheVG7EMos8
I ask: Not pretty, but where is the pile? Can there be a pileless pile(up)
in a context other than a pileup of cars?
There isn't a pile in a "pile up". It's just an expression that
indicates more than one car was in a collision, and there was more
than minor damage to the cars.
The term is written as both "pile up" and "pillup".
Argh. "Pile up" and "Pileup".
Post by Tony Cooper
We also say that work is piling up on us. It just means we are
overloaded with work.
--
Tony Cooper Orlando Florida
Dingbat
2021-11-24 07:31:12 UTC
Post by Dingbat
Many car pileup
https://images.app.goo.gl/NyjN8NAheVG7EMos8
I ask: Not pretty, but where is the pile? Can there be a pileless pile(up)
in a context other than a pileup of cars?
There isn't a pile in a "pile up". It's just an expression that
indicates more than one car was in a collision, and there was more
than minor damage to the cars.
The term is written as both "pile up" and "pillup".
We also say that work is piling up on us. It just means we are
overloaded with work.
Thanks. "Not pretty" referred to the cars. The drivers evidently
found the women pretty.
lar3ryca
2021-11-24 17:34:31 UTC
Post by Dingbat
Post by Dingbat
Many car pileup
https://images.app.goo.gl/NyjN8NAheVG7EMos8
I ask: Not pretty, but where is the pile? Can there be a pileless pile(up)
in a context other than a pileup of cars?
There isn't a pile in a "pile up". It's just an expression that
indicates more than one car was in a collision, and there was more
than minor damage to the cars.
The term is written as both "pile up" and "pillup".
We also say that work is piling up on us. It just means we are
overloaded with work.
Thanks. "Not pretty" referred to the cars. The drivers evidently
found the women pretty.
Not necessarily 'pretty'. The drivers may well have found the sight 'scary'.
Quinn C
2021-11-24 14:00:08 UTC
Post by Tony Cooper
On Tue, 23 Nov 2021 18:51:56 -0800 (PST), Dingbat
Post by Dingbat
Many car pileup
https://images.app.goo.gl/NyjN8NAheVG7EMos8
I ask: Not pretty, but where is the pile? Can there be a pileless pile(up)
in a context other than a pileup of cars?
There isn't a pile in a "pile up". It's just an expression that
indicates more than one car was in a collision, and there was more
than minor damage to the cars.
The term is written as both "pile up" and "pillup".
We also say that work is piling up on us. It just means we are
overloaded with work.
I disagree. When we say work is piling up, it can often be figurative,
but the origin of the metaphor is clearly an actual pile, as in a pile
of dishes to be washed or a tall stack of papers to be processed.

That's less obvious in the case of the car crash.
--
There is, at a women's college, always some emancipating
encouragement for those with masculine tastes for such things
as mathematics, philosophy, and friendship.
-- Jane Rule, This Is Not For You, p.15
Quinn C
2021-11-24 03:16:20 UTC
Post by Dingbat
Many car pileup
https://images.app.goo.gl/NyjN8NAheVG7EMos8
I ask: Not pretty, but where is the pile? Can there be a pileless pile(up)
in a context other than a pileup of cars?
More importantly, exactly how many cars make a pile, as Eubulides of
Miletus asked.
--
A patriot must always be ready to defend his country against
his government.
-- Edward Abbey
Dingbat
2021-11-25 00:42:42 UTC
Post by Quinn C
Post by Dingbat
Many car pileup
https://images.app.goo.gl/NyjN8NAheVG7EMos8
I ask: Not pretty, but where is the pile? Can there be a pileless pile(up)
in a context other than a pileup of cars?
More importantly, exactly how many cars make a pile, as Eubulides of
Miletus asked.
He had a beef with Aristotle. I don't know why.
One of his paradoxes is not named after him. I don't know how that came to be.
It is named the Epimenides paradox
https://en.wikipedia.org/wiki/Eubulides
https://en.wikipedia.org/wiki/Epimenides_paradox
Post by Quinn C
--
A patriot must always be ready to defend his country against
his government.
-- Edward Abbey
occam
2021-11-24 09:15:06 UTC
Post by Dingbat
Many car pileup
https://images.app.goo.gl/NyjN8NAheVG7EMos8
I ask: Not pretty, but where is the pile?
If you visit the junk yard where these cars will be taken after this
incident, you will see the pileup.

Post by Dingbat
Can there be a pileless pile(up)
in a context other than a pileup of cars?
A pileup of complaints after any borderline statement on Facebook/Twitter.
Stefan Ram
2021-11-24 14:48:36 UTC
Subject: What is a pile?
|With the assumption that removing a single grain does not
|cause a heap to become a non-heap, the paradox is to consider
|what happens when the process is repeated enough times that
|only one grain remains: is it still a heap? If not, when did
|it change from a heap to a non-heap?
|
from a Web page about the sorites paradox.

Must have been written by a physicist! A mathematician would
have asked:

|What happens when the process is repeated enough times that
|no grain remains: is it still a heap?

.
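Written out, the sorites is a plain induction over a vague predicate.
With H(n) for "n grains make a heap", the standard premises are (in
LaTeX notation):

   H(100000)                                  % a hundred thousand grains is a heap
   \forall n > 1:\ H(n) \rightarrow H(n-1)    % one grain fewer is still a heap
   \therefore\ H(1)                           % so a single grain is a heap

The mathematician's version simply runs the same induction one more
step, to H(0).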
J. J. Lodder
2021-11-25 10:33:23 UTC
Post by Stefan Ram
Subject: What is a pile?
|With the assumption that removing a single grain does not
|cause a heap to become a non-heap, the paradox is to consider
|what happens when the process is repeated enough times that
|only one grain remains: is it still a heap? If not, when did
|it change from a heap to a non-heap?
|
from a Web page about the sorites paradox.
Must have been written by a physicist! A mathematician would have asked:
|What happens when the process is repeated enough times that
|no grain remains: is it still a heap?
Of course it is, to a mathematician.
The empty set is still a set,
so the empty heap is still a heap.

The best of all possible heaps!

Jan
Peter Moylan
2021-11-25 10:43:35 UTC
Post by J. J. Lodder
Post by Stefan Ram
Subject: What is a pile?
|With the assumption that removing a single grain does not
|cause a heap to become a non-heap, the paradox is to consider
|what happens when the process is repeated enough times that
|only one grain remains: is it still a heap? If not, when did
|it change from a heap to a non-heap?
|
from a Web page about the sorites paradox.
Must have been written by a physicist! A mathematician would have asked:
|What happens when the process is repeated enough times that
|no grain remains: is it still a heap?
Of course it is, to a mathematician.
The empty set is still a set,
so the empty heap is still a heap.
The best of all possible heaps!
And the fastest possible case, in the HeapSort algorithm.
--
Peter Moylan Newcastle, NSW http://www.pmoylan.org
Kerr-Mudd, John
2021-11-25 12:24:00 UTC
On Thu, 25 Nov 2021 21:43:35 +1100
Post by Peter Moylan
Post by J. J. Lodder
Post by Stefan Ram
Subject: What is a pile?
|With the assumption that removing a single grain does not
|cause a heap to become a non-heap, the paradox is to consider
|what happens when the process is repeated enough times that
|only one grain remains: is it still a heap? If not, when did
|it change from a heap to a non-heap?
|
from a Web page about the sorites paradox.
Must have been written by a physicist! A mathematician would have asked:
|What happens when the process is repeated enough times that
|no grain remains: is it still a heap?
Of course it is, to a mathematician.
The empty set is still a set,
so the empty heap is still a heap.
The best of all possible heaps!
And the fastest possible case, in the HeapSort algorithm.
I guess pidgin paleface talk is no longer acceptable.

QuickSort is faster, so says the Wikipedia Heapsort article, but
not so clear on the QS page.

Or go for the best of both:
https://en.wikipedia.org/wiki/Introsort
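For the curious, the hybrid is small enough to sketch. A minimal
introsort on ints in C (illustrative only, not any library's actual
code; real implementations such as libstdc++'s std::sort also finish
small ranges with insertion sort). The idea: quicksort until the
recursion gets deeper than about 2*log2(n), then fall back to heapsort
so the worst case stays O(n log n).

#include <stddef.h>

static void swap_int(int *a, int *b) { int t = *a; *a = *b; *b = t; }

/* Restore the max-heap property below 'root' in a heap of size n. */
static void sift_down(int *a, size_t root, size_t n)
{
    for (;;) {
        size_t child = 2 * root + 1;
        if (child >= n) return;
        if (child + 1 < n && a[child + 1] > a[child]) child++;
        if (a[root] >= a[child]) return;
        swap_int(&a[root], &a[child]);
        root = child;
    }
}

static void heap_sort(int *a, size_t n)
{
    for (size_t i = n / 2; i-- > 0; ) sift_down(a, i, n);  /* build heap */
    for (size_t i = n; i-- > 1; ) {                        /* extract max */
        swap_int(&a[0], &a[i]);
        sift_down(a, 0, i);
    }
}

/* Hoare partition around a[0]; for n >= 2 returns j with j < n-1. */
static size_t hoare(int *a, size_t n)
{
    int pivot = a[0];
    size_t i = (size_t)-1, j = n;
    for (;;) {
        do i++; while (a[i] < pivot);
        do j--; while (a[j] > pivot);
        if (i >= j) return j;
        swap_int(&a[i], &a[j]);
    }
}

static void intro_rec(int *a, size_t n, int depth)
{
    if (n < 2) return;
    if (depth == 0) { heap_sort(a, n); return; }  /* quicksort misbehaving */
    size_t j = hoare(a, n);
    intro_rec(a, j + 1, depth - 1);
    intro_rec(a + j + 1, n - j - 1, depth - 1);
}

void introsort(int *a, size_t n)
{
    int depth = 0;
    for (size_t m = n; m > 1; m >>= 1) depth++;   /* roughly log2(n) */
    intro_rec(a, n, 2 * depth);
}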
--
Bah, and indeed Humbug.
lar3ryca
2021-11-25 16:46:29 UTC
Post by Kerr-Mudd, John
On Thu, 25 Nov 2021 21:43:35 +1100
Post by Peter Moylan
Post by J. J. Lodder
Post by Stefan Ram
Subject: What is a pile?
|With the assumption that removing a single grain does not
|cause a heap to become a non-heap, the paradox is to consider
|what happens when the process is repeated enough times that
|only one grain remains: is it still a heap? If not, when did
|it change from a heap to a non-heap?
|
from a Web page about the sorites paradox.
Must have been written by a physicist! A mathematician would have asked:
|What happens when the process is repeated enough times that
|no grain remains: is it still a heap?
Of course it is, to a mathematician.
The empty set is still a set,
so the empty heap is still a heap.
The best of all possible heaps!
And the fastest possible case, in the HeapSort algorithm.
I guess pidgin paleface talk is no longer acceptable.
QuickSort is faster, so says the Wikipedia Heapsort article, but
not so clear on the QS page.
Hmm... he did not claim that HeapSort is the fastest sort. He only claimed that a Heapsort on the empty heap is the fastest possible HeapSort.
Post by Kerr-Mudd, John
https://en.wikipedia.org/wiki/Introsort
Interesting! Thanks for the link.
Peter Moylan
2021-11-25 21:04:54 UTC
Post by lar3ryca
On Thu, 25 Nov 2021 21:43:35 +1100 Peter Moylan
Post by Peter Moylan
Post by Stefan Ram
Subject: What is a pile?
|With the assumption that removing a single grain does not
|cause a heap to become a non-heap, the paradox is to consider
|what happens when the process is repeated enough times that
|only one grain remains: is it still a heap? If not, when did
|it change from a heap to a non-heap?
|
from a Web page about the sorites paradox.
Must have been written by a physicist! A mathematician would have asked:
|What happens when the process is repeated enough times that
|no grain remains: is it still a heap?
Of course it is, to a mathematician. The empty set is still a
set, so the empty heap is still a heap.
The best of all possible heaps!
And the fastest possible case, in the HeapSort algorithm.
I guess pidgin paleface talk is no longer acceptable.
QuickSort is faster, so says the Wikipedia Heapsort article, but
not so clear on the QS page.
Hmm... he did not claim that HeapSort is the fastest sort. He only
claimed that a Heapsort on the empty heap is the fastest possible
HeapSort.
Precisely. I use Quicksort for my own sorting, as it happens. I've tried
HeapSort and like it less. They perform equally well on zero-sized data.

When sorting an array of large records - a case that the textbooks never
cover, but which in my experience is the most important practical case -
one important contribution to the execution time is the time to move the
records around. With that in mind, my implementation delays the movement
and instead moves "holes" around. A simple idea, but one that nobody
else seems to use in QuickSort, although using "holes" is more
intuitively obvious in HeapSort.

Of course one could handle the long-record case by sorting an array of
pointers, but then you get a messy problem at the end when moving
everything to its final position; an annoying problem if, as usually
happens, you are trying to sort the array in place.
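For concreteness, the "hole" idea in a quicksort partition can look
like this: a minimal C sketch, assuming fixed-size records with an
integer key (Rec and the function names are made up here, not Peter's
actual code). Lifting the pivot out leaves a hole; every record that
must cross the pivot is then copied once, into the current hole,
instead of costing the three copies of a swap.

#include <stddef.h>

typedef struct {
    int  key;
    char payload[252];   /* stand-in for a large record body */
} Rec;

/* Partition a[lo..hi] by moving records into a hole rather than
 * swapping pairs. */
static size_t partition_hole(Rec *a, size_t lo, size_t hi)
{
    Rec pivot = a[lo];                 /* a[lo] is now the hole */
    size_t i = lo, j = hi;
    while (i < j) {
        while (i < j && a[j].key >= pivot.key) j--;
        if (i < j) a[i++] = a[j];      /* fill hole at i; hole moves to j */
        while (i < j && a[i].key <= pivot.key) i++;
        if (i < j) a[j--] = a[i];      /* fill hole at j; hole moves to i */
    }
    a[i] = pivot;                      /* drop the pivot into the last hole */
    return i;
}

static void quicksort_hole(Rec *a, size_t lo, size_t hi)
{
    if (lo >= hi) return;
    size_t p = partition_hole(a, lo, hi);
    if (p > lo) quicksort_hole(a, lo, p - 1);   /* guard size_t underflow */
    quicksort_hole(a, p + 1, hi);
}

/* Usage: if (n) quicksort_hole(recs, 0, n - 1); */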
Post by lar3ryca
https://en.wikipedia.org/wiki/Introsort
Interesting! Thanks for the link.
That's designed to have good worst-case performance, but personally I'd
rather have best average-case performance.
--
Peter Moylan Newcastle, NSW http://www.pmoylan.org
Athel Cornish-Bowden
2021-11-26 09:11:24 UTC
Post by Peter Moylan
Post by lar3ryca
On Thu, 25 Nov 2021 21:43:35 +1100 Peter Moylan
Post by Peter Moylan
Post by Stefan Ram
Subject: What is a pile?
|With the assumption that removing a single grain does not
|cause a heap to become a non-heap, the paradox is to consider
|what happens when the process is repeated enough times that
|only one grain remains: is it still a heap? If not, when did
|it change from a heap to a non-heap?
|
from a Web page about the sorites paradox.
Must have been written by a physicist! A mathematician would have asked:
|What happens when the process is repeated enough times that
|no grain remains: is it still a heap?
Of course it is, to a mathematician. The empty set is still a
set, so the empty heap is still a heap.
The best of all possible heaps!
And the fastest possible case, in the HeapSort algorithm.
I guess pidgin paleface talk is no longer acceptable.
QuickSort is faster, so says the Wikipedia Heapsort article, but
not so clear on the QS page.
Hmm... he did not claim that HeapSort is the fastest sort. He only
claimed that a Heapsort on the empty heap is the fastest possible
HeapSort.
Precisely. I use Quicksort for my own sorting, as it happens. I've tried
HeapSort and like it less. They perform equally well on zero-sized data.
When sorting an array of large records - a case that the textbooks never
cover, but which in my experience is the most important practical case -
one important contribution to the execution time is the time to move the
records around. With that in mind, my implementation delays the movement
and instead moves "holes" around. A simple idea, but one that nobody
else seems to use in QuickSort, although using "holes" is more
intuitively obvious in HeapSort.
Of course one could handle the long-record case by sorting an array of
pointers, but then you get a messy problem at the end when moving
everything to its final position; an annoying problem if, as usually
happens, you are trying to sort the array in place.
Post by lar3ryca
https://en.wikipedia.org/wiki/Introsort
Interesting! Thanks for the link.
That's designed to have good worst-case performance, but personally I'd
rather have best average-case performance.
The last time I needed to incorporate a sorting algorithm in a program
(around 1992, I think) I used comb sort. I don't remember why, but it
had to do with memory requirements. (I was an avid reader of Byte at
that time, and I had probably read the 1991 article by Lacey and Box.)
Anyway, it worked very well, not just an improvement on its ancestor,
the simple-minded bubble sort, but a vast improvement.
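For reference, comb sort is tiny. A sketch on ints, using the 1.3
shrink factor from the Lacey and Box article (gap*10/13 is the usual
integer approximation):

#include <stddef.h>

/* Comb sort: bubble sort with a shrinking gap.  Comparing elements
 * gap apart lets the small values stranded near the end ("turtles"),
 * which cripple plain bubble sort, move left in large jumps. */
void comb_sort(int *a, size_t n)
{
    size_t gap = n;
    int swapped = 1;
    while (gap > 1 || swapped) {
        gap = gap * 10 / 13;           /* shrink by ~1.3 each pass */
        if (gap < 1) gap = 1;
        swapped = 0;
        for (size_t i = 0; i + gap < n; i++) {
            if (a[i] > a[i + gap]) {
                int t = a[i]; a[i] = a[i + gap]; a[i + gap] = t;
                swapped = 1;
            }
        }
    }
}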
--
Athel -- French and British, living mainly in England until 1987.
Adam Funk
2021-11-26 16:33:04 UTC
Post by Athel Cornish-Bowden
Post by Peter Moylan
Post by lar3ryca
On Thu, 25 Nov 2021 21:43:35 +1100 Peter Moylan
Post by Peter Moylan
Post by Stefan Ram
Subject: What is a pile?
|With the assumption that removing a single grain does not
|cause a heap to become a non-heap, the paradox is to consider
|what happens when the process is repeated enough times that
|only one grain remains: is it still a heap? If not, when did
|it change from a heap to a non-heap?
|
from a Web page about the sorites paradox.
Must have been written by a physicist! A mathematician would have asked:
|What happens when the process is repeated enough times that
|no grain remains: is it still a heap?
Of course it is, to a mathematician. The empty set is still
a set, so the empty heap is still a heap.
The best of all possible heaps!
And the fastest possible case, in the HeapSort algorithm.
I guess pidgin paleface talk is no longer acceptable.
QuickSort is faster, so says the Wikipedia Heapsort article,
but not so clear on the QS page.
Hmm... he did not claim that HeapSort is the fastest sort. He only
claimed that a Heapsort on the empty heap is the fastest
possible HeapSort.
Precisely. I use Quicksort for my own sorting, as it happens. I've
tried HeapSort and like it less. They perform equally well on
zero-sized data.
When sorting an array of large records - a case that the textbooks
never cover, but which in my experience is the most important
practical case - one important contribution to the execution time
is the time to move the records around. With that in mind, my
implementation delays the movement and instead moves "holes"
around. A simple idea, but one that nobody else seems to use in
QuickSort, although using "holes" is more intuitively obvious in
HeapSort.
Of course one could handle the long-record case by sorting an array
of pointers, but then you get a messy problem at the end when
moving everything to its final position; an annoying problem if, as
usually happens, you are trying to sort the array in place.
Post by lar3ryca
https://en.wikipedia.org/wiki/Introsort
Interesting! Thanks for the link.
That's designed to have good worst-case performance, but personally
I'd rather have best average-case performance.
The last time I needed to incorporate a sorting algorithm in a
program (around 1992, I think) I used comb sort. I don't remember
why, but it had to do with memory requirements. (I was an avid reader
of Byte at that time, and I had probably read the 1991 article by
Lacey and Box.) Anyway, it worked very well, not just an improvement
on its ancestor, the simple-minded bubble sort, but a vast
improvement.
I like to keep my GEDCOM files (genealogy data) sorted by surname, and
to do this I use a merge sort. The basic idea is simple and easy to
implement. You break the file into two halves, sort the two halves, and
then merge the two resulting files. Clearly this involves recursion,
where each half is itself split into halves.
To make it efficient, you have to add one extra detail: when you reach a
file small enough to fit in memory, sort it by a different method such
as Quicksort.
The whole sort goes very quickly. For this particular application, you
have the extra benefit that the file is already nearly sorted, because
of the sort you did the last time the file was modified.
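The in-memory analogue of that scheme is compact. A C sketch of the
hybrid (the cutoff of 32 and the use of qsort() as the "different
method" are illustrative choices, not anyone's actual code):

#include <stdlib.h>
#include <string.h>

#define CUTOFF 32   /* "small enough": hand off to another method */

static int cmp_int(const void *p, const void *q)
{
    int a = *(const int *)p, b = *(const int *)q;
    return (a > b) - (a < b);
}

/* Recursive merge sort; tmp must be at least as large as a. */
static void merge_sort(int *a, int *tmp, size_t n)
{
    if (n <= CUTOFF) {                 /* small: switch methods */
        qsort(a, n, sizeof *a, cmp_int);
        return;
    }
    size_t mid = n / 2;
    merge_sort(a, tmp, mid);           /* sort the two halves... */
    merge_sort(a + mid, tmp, n - mid);
    size_t i = 0, j = mid, k = 0;      /* ...then merge them */
    while (i < mid && j < n)
        tmp[k++] = (a[i] <= a[j]) ? a[i++] : a[j++];
    while (i < mid) tmp[k++] = a[i++];
    while (j < n)   tmp[k++] = a[j++];
    memcpy(a, tmp, n * sizeof *a);
}

On a file too large for memory, the halves become temporary files and
the merge becomes a line-by-line merge of two sorted files, but the
recursion has the same shape. The <= in the merge also keeps the sort
stable.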
You have GEDCOM files too big to load into memory?
--
suckerpunch the demons from my dreams
Peter Moylan
2021-11-26 23:35:53 UTC
Post by Adam Funk
I like to keep my GEDCOM files (genealogy data) sorted by surname,
and to do this I use a merge sort. The basic idea is simple and
easy to implement. You break the file into two halves, sort the
two halves, and then merge the two resulting files. Clearly this
involves recursion, where each half is itself split into halves.
To make it efficient, you have to add one extra detail: when you
reach a file small enough to fit in memory, sort it by a different
method such as Quicksort.
The whole sort goes very quickly. For this particular application,
you have the extra benefit that the file is already nearly sorted,
because of the sort you did the last time the file was modified.
You have GEDCOM files too big to load into memory?
Checing the relevant directory, I see that the largest is only 500 kB,
but it goes against my sense of economic program design to declare an
array that large.

My GEDCOM files appear to contain about 10,000 individuals. I deal with
that by having several different files, with cross-links between them.
One day, I suppose, I should merge them, but that's a job that I don't
feel I can safely automate, and I'd need to set several days aside for
checking for duplicates.

It now occurs to me, though, that I could semi-automate the job by
preparing a list of suspected duplicates, and going through that
(probably small) file by hand.
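The mechanical part might be no more than flagging neighbours in a
sorted name list. A crude, hypothetical C sketch (it assumes the names
have already been extracted and sorted, one per line on stdin, which
is not Peter's actual setup):

#include <stdio.h>
#include <string.h>
#include <ctype.h>

/* Case-insensitive string equality. */
static int eq_fold(const char *a, const char *b)
{
    while (*a && *b &&
           tolower((unsigned char)*a) == tolower((unsigned char)*b)) {
        a++;
        b++;
    }
    return tolower((unsigned char)*a) == tolower((unsigned char)*b);
}

int main(void)
{
    char prev[256] = "", line[256];
    while (fgets(line, sizeof line, stdin)) {
        line[strcspn(line, "\n")] = '\0';      /* strip newline */
        if (prev[0] != '\0' && eq_fold(prev, line))
            printf("suspected duplicate: %s\n", line);
        strcpy(prev, line);
    }
    return 0;
}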
--
Peter Moylan Newcastle, NSW http://www.pmoylan.org
Transition Zone
2021-11-27 20:28:57 UTC
Post by Peter Moylan
Post by Adam Funk
I like to keep my GEDCOM files (genealogy data) sorted by surname,
and to do this I use a merge sort. The basic idea is simple and
easy to implement. You break the file into two halves, sort the
two halves, and then merge the two resulting files. Clearly this
involves recursion, where each half is itself split into halves.
To make it efficient, you have to add one extra detail: when you
reach a file small enough to fit in memory, sort it by a different
method such as Quicksort.
The whole sort goes very quickly. For this particular application,
you have the extra benefit that the file is already nearly sorted,
because of the sort you did the last time the file was modified.
You have GEDCOM files too big to load into memory?
Checing the relevant directory, I see that the largest is only 500 kB,
but it goes against my sense of economic program design to declare an
array that large.
Checing is a French word, as in: 'cheching de la qualité'. 'Gestin des réclamations, interface entre lservice clientèle et la production, cheching de la qualité, responsable secteur.' [Roughly: 'complaints handling, liaison between customer service and production, quality checking, area manager.'] Unless you meant 'checking'.
Peter Moylan
2021-11-27 22:12:52 UTC
On Friday, November 26, 2021 at 6:36:01 PM UTC-5, Peter Moylan
Post by Peter Moylan
Checing the relevant directory, I see that the largest is only 500
kB, but it goes against my sense of economic program design to
declare an array that large.
Checing is a French word, as in: 'cheching de la qualité'. 'Gestin
des réclamations, interface entre lservice clientèle et la
production, cheching de la qualité, responsable secteur.' Unless
you meant 'checking'.
It was a typo, and I meant checking. But I never knew that checing is a
French word. Atilf doesn't have either checing or cheching. Canadian?
--
Peter Moylan Newcastle, NSW http://www.pmoylan.org
Quinn C
2021-11-29 00:03:41 UTC
Post by Peter Moylan
Post by Adam Funk
I like to keep my GEDCOM files (genealogy data) sorted by surname,
and to do this I use a merge sort. The basic idea is simple and
easy to implement. You break the file into two halves, sort the
two halves, and then merge the two resulting files. Clearly this
involves recursion, where each half is itself split into halves.
To make it efficient, you have to add one extra detail: when you
reach a file small enough to fit in memory, sort it by a different
method such as Quicksort.
The whole sort goes very quickly. For this particular application,
you have the extra benefit that the file is already nearly sorted,
because of the sort you did the last time the file was modified.
You have GEDCOM files too big to load into memory?
Checing the relevant directory, I see that the largest is only 500 kB,
but it goes against my sense of economic program design to declare an
array that large.
Wow. I'd consider that trivial at this point.

For years, people suspected our main software of having a memory leak,
and many hours were spent trying to trace it and to discuss it. I never
really believed in this. And I haven't observed any of the issues that
led to the suspicion since we routinely work with 16 GB RAM. 8 GB RAM
was simply tight for the amount of data we're working with.

We're not loading any of our dictionaries (up to 1 GB on disk in a
database format that's fairly compressed) into memory, but there are
layers of caching. At some point, I had the task of looking at whether
loading all of it into memory would gain us speed, and I found that a
number of different technologies didn't result in any significant gain.
--
The wrong body ... now comes not to claim rightness but to
dismantle the system that metes out rightness and wrongness
according to the dictates of various social orders.
-- Jack Halberstam, Unbuilding Gender
Ken Blake
2021-11-29 17:45:43 UTC
Post by Peter Moylan
Post by Adam Funk
I like to keep my GEDCOM files (genealogy data) sorted by surname,
and to do this I use a merge sort. The basic idea is simple and
easy to implement. You break the file into two halves, sort the
two halves, and then merge the two resulting files. Clearly this
involves recursion, where each half is itself split into halves.
To make it efficient, you have to add one extra detail: when you
reach a file small enough to fit in memory, sort it by a different
method such as Quicksort.
The whole sort goes very quickly. For this particular application,
you have the extra benefit that the file is already nearly sorted,
because of the sort you did the last time the file was modified.
You have GEDCOM files too big to load into memory?
Checing the relevant directory, I see that the largest is only 500 kB,
but it goes against my sense of economic program design to declare an
array that large.
And here I load 8 MB csv/tsv files into a pandas.DataFrame without
even thinking about it.
I remember when 8MB was large. Heck, I remember when 80KB was large.
The first computer I programmed, an IBM 1401, had 4K of RAM (and that
was 4000, not 4096). It was possible to go as high as 16K, but we never
got one that big while I worked there.
Those days are long gone.
Yep!
Dingbat
2021-11-30 14:37:58 UTC
Post by Ken Blake
Post by Peter Moylan
Post by Adam Funk
I like to keep my GEDCOM files (genealogy data) sorted by surname,
and to do this I use a merge sort. The basic idea is simple and
easy to implement. You break the file into two halves, sort the
two halves, and then merge the two resulting files. Clearly this
involves recursion, where each half is itself split into halves.
To make it efficient, you have to add one extra detail: when you
reach a file small enough to fit in memory, sort it by a different
method such as Quicksort.
The whole sort goes very quickly. For this particular application,
you have the extra benefit that the file is already nearly sorted,
because of the sort you did the last time the file was modified.
You have GEDCOM files too big to load into memory?
Checing the relevant directory, I see that the largest is only 500 kB,
but it goes against my sense of economic program design to declare an
array that large.
And here I load 8 MB csv/tsv files into a pandas.DataFrame without
even thinking about it.
I remember when 8MB was large. Heck, I remember when 80KB was large.
The first computer I programmed, an IBM 1401, had 4K of RAM (and that
was 4000, not 4096). It was possible to go as high as 16K, but we never
got one that big while I worked there.
Those days are long gone.
Yep!
I programmed on a TI-58C calculator with 1/2 KB bubble memory and moved
up to a hand-me-down IBM 1620. I don't know how much core memory it
had; the max was 32KB.
lar3ryca
2021-12-01 05:17:03 UTC
Post by Dingbat
Post by Ken Blake
Post by Peter Moylan
Post by Adam Funk
I like to keep my GEDCOM files (genealogy data) sorted by surname,
and to do this I use a merge sort. The basic idea is simple and
easy to implement. You break the file into two halves, sort the
two halves, and then merge the two resulting files. Clearly this
involves recursion, where each half is itself split into halves.
To make it efficient, you have to add one extra detail: when you
reach a file small enough to fit in memory, sort it by a different
method such as Quicksort.
The whole sort goes very quickly. For this particular application,
you have the extra benefit that the file is already nearly sorted,
because of the sort you did the last time the file was modified.
You have GEDCOM files too big to load into memory?
Checing the relevant directory, I see that the largest is only 500 kB,
but it goes against my sense of economic program design to declare an
array that large.
And here I load 8 MB csv/tsv files into a pandas.DataFrame without
even thinking about it.
I remember when 8MB was large. Heck, I remember when 80KB was large.
The first computer I programmed, an IBM 1401, had 4K of RAM (and that
was 4000, not 4096). It was possible to go as high as 16K, but we never
got one that big while I worked there.
Those days are long gone.
Yep!
I programmed on a TI-58C calculator with 1/2 KB bubble memory and moved
up to a hand-me-down IBM 1620, I don't know how much core memory it
had; the max was 32KB.
Ah, the old 1620. We used to call it CADET. Can't Add, Doesn't Even Try.
Peter Moylan
2021-12-01 07:43:07 UTC
Post by lar3ryca
Post by Dingbat
Post by Ken Blake
Post by Peter Moylan
Post by Adam Funk
I like to keep my GEDCOM files (genealogy data) sorted
by surname, and to do this I use a merge sort. The
basic idea is simple and easy to implement. You break
the file into two halves, sort the two halves, and then
merge the two resulting files. Clearly this involves
recursion, where each half is itself split into
halves.
when you reach a file small enough to fit in memory,
sort it by a different method such as Quicksort.
The whole sort goes very quickly. For this particular
application, you have the extra benefit that the file
is already nearly sorted, because of the sort you did
the last time the file was modified.
You have GEDCOM files too big to load into memory?
Checing the relevant directory, I see that the largest is
only 500 kB, but it goes against my sense of economic
program design to declare an array that large.
And here I load 8 MB csv/tsv files into a pandas.DataFrame
without even thinking about it.
I remember when 8MB was large. Heck, I remember when 80KB was large.
The first computer I programmed, an IBM 1401, had 4K of RAM (and
that was 4000, not 4096). It was possible to go as high as 16K,
but we never got one that big while I worked there.
Those days are long gone.
Yep!
I programmed on a TI-58C calculator with 1/2 KB bubble memory and
moved up to a hand-me-down IBM 1620, I don't know how much core
memory it had; the max was 32KB.
Ah, the old 1620. We used to call it CADET. Can't Add, Doesn't Even Try.
When I first arrived at Newcastle University, the university's computer
(an IBM 1130) had just had its memory upgraded from 8K to 16K words.
That decision was controversial; many people doubted that it was
possible to write a program that would need that much memory.
--
Peter Moylan Newcastle, NSW http://www.pmoylan.org
Adam Funk
2021-12-01 09:06:31 UTC
Post by Peter Moylan
Post by lar3ryca
Post by Dingbat
Post by Ken Blake
Post by Peter Moylan
Post by Adam Funk
I like to keep my GEDCOM files (genealogy data) sorted
by surname, and to do this I use a merge sort. The
basic idea is simple and easy to implement. You break
the file into two halves, sort the two halves, and then
merge the two resulting files. Clearly this involves
recursion, where each half is itself split into
halves.
when you reach a file small enough to fit in memory,
sort it by a different method such as Quicksort.
The whole sort goes very quickly. For this particular
application, you have the extra benefit that the file
is already nearly sorted, because of the sort you did
the last time the file was modified.
You have GEDCOM files too big to load into memory?
Checing the relevant directory, I see that the largest is
only 500 kB, but it goes against my sense of economic
program design to declare an array that large.
And here I load 8 MB csv/tsv files into a pandas.DataFrame
without even thinking about it.
I remember when 8MB was large. Heck, I remember when 80KB was large.
The first computer I programmed, an IBM 1401, had 4K of RAM (and
that was 4000, not 4096). It was possible to go as high as 16K,
but we never got one that big while I worked there.
Those days are long gone.
Yep!
I programmed on a TI-58C calculator with 1/2 KB bubble memory and
moved up to a hand-me-down IBM 1620, I don't know how much core
memory it had; the max was 32KB.
Ah, the old 1620. We used to call it CADET. Can't Add, Doesn't Even Try.
When I first arrived at Newcastle University, the university's computer
(an IBM 1130) had just had its memory upgraded from 8K to 16K words.
That decision was controversial; many people doubted that it was
possible to write a program that would need that much memory.
Whereas now, OTOH, it's hard to write one that doesn't.
--
it's the nexus of the crisis
and the origin of storms
Peter Moylan
2021-12-01 09:56:00 UTC
Post by Adam Funk
Post by Peter Moylan
When I first arrived at Newcastle University, the university's
computer (an IBM 1130) had just had its memory upgraded from 8K to
16K words. That decision was controversial; many people doubted
that it was possible to write a program that would need that much
memory.
Whereas now, OTOH, it's hard to write one that doesn't.
Just for the hell of it I've just written a "Hello world" program. 60
kB. I was shocked.

Then I changed it to have a single statement "x := 5", so that no screen
output was needed. 40 kB. Still shocking.

Long ago my programming environment (TopSpeed, at the time) had a smart
linker that only included library functions that were actually called.
Today's linkers don't seem to bother.
--
Peter Moylan Newcastle, NSW http://www.pmoylan.org
Snidely
2021-12-01 10:11:14 UTC
Post by Peter Moylan
Post by Adam Funk
Post by Peter Moylan
When I first arrived at Newcastle University, the university's
computer (an IBM 1130) had just had its memory upgraded from 8K to
16K words. That decision was controversial; many people doubted
that it was possible to write a program that would need that much
memory.
Whereas now, OTOH, it's hard to write one that doesn't.
Just for the hell of it I've just written a "Hello world" program. 60
kB. I was shocked.
Then I changed it to have a single statement "x := 5", so that no screen
output was needed. 40 kB. Still shocking.
Long ago my programming environment (TopSpeed, at the time) had a smart
linker that only included library functions that were actually called.
Today's linkers don't seem to bother.
The exposed APIs often have a lot more going on behind the scenes than
in the Olde Days, so it is more difficult to tell what all is being
called IME.

/dps
--
"This is all very fine, but let us not be carried away be excitement,
but ask calmly, how does this person feel about it in his cooler
moments next day, with six or seven thousand feet of snow and stuff on
top of him?"
_Roughing It_, Mark Twain.
Snidely
2021-12-01 10:16:36 UTC
Post by Peter Moylan
Post by Adam Funk
Post by Peter Moylan
When I first arrived at Newcastle University, the university's
computer (an IBM 1130) had just had its memory upgraded from 8K to
16K words. That decision was controversial; many people doubted
that it was possible to write a program that would need that much
memory.
Whereas now, OTOH, it's hard to write one that doesn't.
Just for the hell of it I've just written a "Hello world" program. 60
kB. I was shocked.
Then I changed it to have a single statement "x := 5", so that no screen
output was needed. 40 kB. Still shocking.
But since that's much less than one swap page, why do you care? You're
not using a 64KB machine anymore, and those with real memory
constraints (usually embedded processors, AFAICT) will have special
libraries to suit their constraints.
Post by Peter Moylan
Long ago my programming environment (TopSpeed, at the time) had a smart
linker that only included library functions that were actually called.
Today's linkers don't seem to bother.
The exposed APIs often have a lot more going on behind the scenes than in the
Olde Days, so it is more difficult to tell what all is being called IME.
/dps
--
Rule #0: Don't be on fire.
In case of fire, exit the building before tweeting about it.
(Sighting reported by Adam F)
Mark Brader
2021-12-01 12:58:19 UTC
Post by Peter Moylan
Just for the hell of it I've just written a "Hello world" program. 60
kB. I was shocked.
Then I changed it to have a single statement "x := 5", so that no screen
output was needed. 40 kB. Still shocking.
Well, if you *will* use the wrong language...

% cat x5.c
int main() {
    int x;
    x = 5;
}
%
% cc -s -o x5 x5.c
% ls -l x5
-rwx------ 1 msb nobody 4856 Dec 1 07:56 x5
--
Mark Brader            At any rate, C++ != C.  Actually, the value of
Toronto                the expression "C++ != C" is [undefined].
***@vex.net                -- Peter da Silva

My text in this article is in the public domain.
Adam Funk
2021-12-01 13:34:11 UTC
Post by Peter Moylan
Post by Adam Funk
Post by Peter Moylan
When I first arrived at Newcastle University, the university's
computer (an IBM 1130) had just had its memory upgraded from 8K to
16K words. That decision was controversial; many people doubted
that it was possible to write a program that would need that much
memory.
Whereas now, OTOH, it's hard to write one that doesn't.
Just for the hell of it I've just written a "Hello world" program. 60
kB. I was shocked.
Then I changed it to have a single statement "x := 5", so that no screen
output was needed. 40 kB. Still shocking.
Long ago my programming environment (TopSpeed, at the time) had a smart
linker that only included library functions that were actually called.
Today's linkers don't seem to bother.
I was going to ask which := programming language you were using, but I
think TopSpeed (other than a courier service) was for Modula-2?
--
We live in capitalism. Its power seems inescapable. But then, so did
the divine right of kings. Any human power can be resisted and changed
by human beings. Resistance and change often begin in art. Very often
in the art of words. ---Ursula Le Guin
Dingbat
2021-12-01 14:35:10 UTC
Post by Adam Funk
Post by Peter Moylan
Post by Adam Funk
Post by Peter Moylan
When I first arrived at Newcastle University, the university's
computer (an IBM 1130) had just had its memory upgraded from 8K to
16K words. That decision was controversial; many people doubted
that it was possible to write a program that would need that much
memory.
Whereas now, OTOH, it's hard to write one that doesn't.
Just for the hell of it I've just written a "Hello world" program. 60
kB. I was shocked.
Then I changed it to have a single statement "x := 5", so that no screen
output was needed. 40 kB. Still shocking.
Long ago my programming environment (TopSpeed, at the time) had a smart
linker that only included library functions that were actually called.
Today's linkers don't seem to bother.
I was going to ask which := programming language you were using, but I
think TopSpeed (other than a courier service) was for Modula-2?
Topspeed had an ad announcing that they'd have a compiler for every
language under the sun. It was about 10 languages. I don't remember
the list but C was included.
Peter Moylan
2021-12-02 01:13:21 UTC
Post by Adam Funk
Post by Peter Moylan
Long ago my programming environment (TopSpeed, at the time) had a
smart linker that only included library functions that were
actually called. Today's linkers don't seem to bother.
I was going to ask which := programming language you were using, but
I think TopSpeed (other than a courier service) was for Modula-2?
It supported Modula-2, C++, C, and assembly language. I used all of them.

TopSpeed made it easy to combine modules in multiple languages. Borland
didn't want to support that, which is why its best developers left and
formed their own company.

TopSpeed also had a very sophisticated pragma system, to specify which
registers were used for parameter passing and things like that. I even
managed to write some zero-byte functions using pragmas.
--
Peter Moylan Newcastle, NSW http://www.pmoylan.org
Bob Martin
2021-12-02 07:08:58 UTC
Post by Peter Moylan
Post by Adam Funk
Post by Peter Moylan
Long ago my programming environment (TopSpeed, at the time) had a
smart linker that only included library functions that were
actually called. Today's linkers don't seem to bother.
I was going to ask which := programming language you were using, but
I think TopSpeed (other than a courier service) was for Modula-2?
It supported Modula-2, C++, C, and assembly language. I used all of them.
and Pascal.
Post by Peter Moylan
TopSpeed made it easy to combine modules in multiple languages. Borland
didn't want to support that, which is why its best developers left and
formed their own company.
TopSpeed also had a very sophisticated pragma system, to specify which
registers were used for parameter passing and things like that. I even
managed to write some zero-byte functions using pragmas.
Peter Moylan
2021-12-02 07:58:10 UTC
Post by Bob Martin
Post by Peter Moylan
Post by Adam Funk
Post by Peter Moylan
Long ago my programming environment (TopSpeed, at the time) had a
smart linker that only included library functions that were
actually called. Today's linkers don't seem to bother.
I was going to ask which := programming language you were using, but
I think TopSpeed (other than a courier service) was for Modula-2?
It supported Modula-2, C++, C, and assembly language. I used all of them.
and Pascal.
Yes, sorry. My silly oversight.
--
Peter Moylan Newcastle, NSW http://www.pmoylan.org
Adam Funk
2021-12-02 09:07:14 UTC
Post by Peter Moylan
Post by Adam Funk
Post by Peter Moylan
Long ago my programming environment (TopSpeed, at the time) had a
smart linker that only included library functions that were
actually called. Today's linkers don't seem to bother.
I was going to ask which := programming language you were using, but
I think TopSpeed (other than a courier service) was for Modula-2?
It supported Modula-2, C++, C, and assembly language. I used all of them.
TopSpeed made it easy to combine modules in multiple languages. Borland
didn't want to support that, which is why its best developers left and
formed their own company.
I wonder why they didn't want that. I remember using some of Borland's
"Turbo" products, but I'm not sure how they overlapped the TopSpeed
range.
Post by Peter Moylan
TopSpeed also had a very sophisticated pragma system, to specify which
registers were used for parameter passing and things like that. I even
managed to write some zero-byte functions using pragmas.
--
President Business is going to end the world? But he's such a good
guy! And Octan, they make good stuff: music, dairy products, coffee,
TV shows, surveillance systems, all history books, voting
machines... wait a minute! ---Emmet
Peter Moylan
2021-12-02 09:57:15 UTC
Post by Adam Funk
Post by Peter Moylan
Post by Adam Funk
Post by Peter Moylan
Long ago my programming environment (TopSpeed, at the time) had
a smart linker that only included library functions that were
actually called. Today's linkers don't seem to bother.
I was going to ask which := programming language you were using,
but I think TopSpeed (other than a courier service) was for
Modula-2?
It supported Modula-2, C++, C, and assembly language. I used all of them.
TopSpeed made it easy to combine modules in multiple languages.
Borland didn't want to support that, which is why its best
developers left and formed their own company.
I wonder why they didn't want that. I remember using some of
Borland's "Turbo" products, but I'm not sure how they overlapped the
TopSpeed range.
I think Modula-2 was close to the centre of the problem, and possibly
was even the major issue. The developers wanted to do a Modula-2
compiler, a natural desire since a lot of the Pascal work could have
been reused. Borland didn't want to add another language to its range. A
bad decision, because the outcome was that Borland lost some of its best
people.
--
Peter Moylan Newcastle, NSW http://www.pmoylan.org
Adam Funk
2021-12-02 13:42:22 UTC
Post by Peter Moylan
Post by Adam Funk
Post by Peter Moylan
Post by Adam Funk
Post by Peter Moylan
Long ago my programming environment (TopSpeed, at the time) had
a smart linker that only included library functions that were
actually called. Today's linkers don't seem to bother.
I was going to ask which := programming language you were using,
but I think TopSpeed (other than a courier service) was for
Modula-2?
It supported Modula-2, C++, C, and assembly language. I used all of them.
TopSpeed made it easy to combine modules in multiple languages.
Borland didn't want to support that, which is why its best
developers left and formed their own company.
I wonder why they didn't want that. I remember using some of
Borland's "Turbo" products, but I'm not sure how they overlapped the
TopSpeed range.
I think Modula-2 was close to the centre of the problem, and possibly
was even the major issue. The developers wanted to do a Modula-2
compiler, a natural desire since a lot of the Pascal work could have
been reused.
I'm not surprised (I haven't used Modula-2 but AIUI it's very similar
to Pascal --- designed by Wirth as a successor).
Post by Peter Moylan
Borland didn't want to add another language to its range. A
bad decision, because the outcome was that Borland lost some of its best
people.
--
Random numbers should not be generated with a method chosen at random.
---Donald Knuth
Peter Moylan
2021-12-03 00:18:23 UTC
Post by Adam Funk
Post by Peter Moylan
I think Modula-2 was close to the centre of the problem, and
possibly was even the major issue. The developers wanted to do a
Modula-2 compiler, a natural desire since a lot of the Pascal work
could have been reused.
I'm not surprised (I haven't used Modula-2 but AIUI it's very
similar to Pascal --- designed by Wirth as a successor).
Wirth designed Modula-2 with two goals in mind:

1. To get a clean implementation of "modularity", where the external
specification is separate from the implementation, and things the caller
doesn't need to know are hidden inside the implementation module.

2. To fix all the faults he'd noticed in the design of Pascal.

There's a surprising lot in part 2.
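Point 1 is roughly what C programmers approximate with an opaque type:
the DEFINITION module plays the role of the header, the IMPLEMENTATION
module the role of the .c file. A hypothetical sketch (counter.h and
counter.c are made-up names):

/* counter.h -- the "definition module": all the caller ever sees. */
typedef struct CounterOunter;
typedef struct Counter Counter;            /* opaque: layout hidden */
Counter *counter_new(void);
void     counter_bump(Counter *c);
int      counter_value(const Counter *c);

/* counter.c -- the "implementation module" (#includes counter.h). */
#include <stdlib.h>
struct Counter { int n; };                 /* invisible to callers */
Counter *counter_new(void) { return calloc(1, sizeof(Counter)); }
void counter_bump(Counter *c) { c->n++; }
int counter_value(const Counter *c) { return c->n; }

The difference is that Modula-2 enforces the separation in the
language, whereas in C it is only a convention.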
--
Peter Moylan Newcastle, NSW http://www.pmoylan.org
Athel Cornish-Bowden
2021-12-03 08:27:19 UTC
Post by Peter Moylan
Post by Adam Funk
Post by Peter Moylan
I think Modula-2 was close to the centre of the problem, and
possibly was even the major issue. The developers wanted to do a
Modula-2 compiler, a natural desire since a lot of the Pascal work
could have been reused.
I'm not surprised (I haven't used Modula-2 but AIUI it's very
similar to Pascal --- designed by Wirth as a successor).
1. To get a clean implementation of "modularity", where the external
specification is separate from the implementation, and things the caller
doesn't need to know are hidden inside the implementation module.
2. To fix all the faults he'd noticed in the design of Pascal.
There's a surprising lot in part 2.
I was very attracted by Modula-2 when I first learned about it.
However, I didn't find a suitable compiler, so I stuck with Pascal.
--
Athel -- French and British, living mainly in England until 1987.
Peter Moylan
2021-12-03 11:02:06 UTC
I worked in Ada before it got real inheritance, and I haven't
followed up with the modern Gnu compiler for it. I knew the
compiler writers at the time I used it. I liked Ada, but then I
had already spent years working in PL/M, and there was a general
resemblance, whereas C was very foreign to me at the time.
I was once the external examiner of a PhD thesis (in biochemistry,
not in computer science) of someone who was the Ada expert at
Edinburgh University. I found his remarks about Ada very interesting,
but I wasn't tempted to move to Ada.
I found Ada disappointing given the buildup it had had. A competition
was set up where the winner was going to be the one language that
everyone had to use from then on, forsaking all others, at least in the
US department of defense. The winner was named Ada in honour of Ada
Lovelace. It certainly has many virtues, but it didn't achieve its goal
of making everyone abandon all other programming languages.

The one big effect it had on my own programming (in other languages) is
that I now label all of my procedure and function parameters, except for
those passed by value, as *IN* or *OUT* or *INOUT*. Unlike in Ada, I can
only do this in comments, but it has a huge effect on program readability.
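In C, for instance, the convention can look like this (a made-up
function, purely to show the labels):

#include <stddef.h>
#include <string.h>

/* Copy the tail of src, starting at *pos, into dst; advance *pos. */
static void copy_tail(const char *src,   /* IN                      */
                      size_t     *pos,   /* INOUT: cursor, advanced */
                      char       *dst,   /* OUT                     */
                      size_t      cap)   /* IN: capacity of dst     */
{
    if (cap == 0) return;
    size_t n = strlen(src + *pos);
    if (n >= cap) n = cap - 1;
    memcpy(dst, src + *pos, n);
    dst[n] = '\0';
    *pos += n;
}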

Back then there was another language called Linda, but it sucked.
--
Peter Moylan Newcastle, NSW http://www.pmoylan.org
Kerr-Mudd, John
2021-12-03 11:17:19 UTC
On Fri, 3 Dec 2021 22:02:06 +1100
Post by Peter Moylan
I worked in Ada before it got real inheritance, and I haven't
followed up with the modern Gnu compiler for it. I knew the
compiler writers at the time I used it. I liked Ada, but then I
had already spent years working in PL/M, and there was a general
resemblance, whereas C was very foreign to me at the time.
I was once the external examiner of a PhD thesis (in biochemistry,
not in computer science) of someone who was the Ada expert at
Edinburgh University. I found his remarks about Ada very
interesting, but I wasn't tempted to move to Ada.
I found Ada disappointing given the buildup it had had. A competition
was set up where the winner was going to be the one language that
everyone had to use from then on, forsaking all others, at least in
the US department of defense. The winner was named Ada in honour of
Ada Lovelace. It certainly has many virtues, but it didn't achieve
its goal of making everyone abandon all other programming languages.
The one big effect it had on my own programming (in other languages)
is that I now label all of my procedure and function parameters,
except for those passed by value, as *IN* or *OUT* or *INOUT*. Unlike
in Ada, I can only do this in comments, but it has a huge effect on
program readability.
Back then there was another language called Linda, but it sucked.
I heard there was a later object-orientated edition called Monica.
--
Bah, and indeed Humbug.
bruce bowser
2021-12-04 19:27:53 UTC
Post by Kerr-Mudd, John
On Fri, 3 Dec 2021 22:02:06 +1100
Post by Peter Moylan
I worked in Ada before it got real inheritance, and I haven't
followed up with the modern Gnu compiler for it. I knew the
compiler writers at the time I used it. I liked Ada, but then I
had already spent years working in PL/M, and there was a general
resemblance, whereas C was very foreign to me at the time.
I was once the external examiner of a PhD thesis (in biochemistry,
not in computer science) of someone who was the Ada expert at
Edinburgh University. I found his remarks about Ada very
interesting, but I wasn't tempted to move to Ada.
I found Ada disappointing given the buildup it had had. A competition
was set up where the winner was going to be the one language that
everyone had to use from then on, forsaking all others, at least in
the US department of defense. The winner was named Ada in honour of
Ada Lovelace. It certainly has many virtues, but it didn't achieve
its goal of making everyone abandon all other programming languages.
The one big effect it had on my own programming (in other languages)
is that I now label all of my procedure and function parameters,
except for those passed by value, as *IN* or *OUT* or *INOUT*. Unlike
in Ada, I can only do this in comments, but it has a huge effect on
program readability.
Back then there was another language called Linda, but it sucked.
I heard there was a later object-orientated edition called Monica.
I wonder if people died from eating in a Mexican restaurant in a city named Linda. I mean Monica. I'm sorry, Monaca.
Dingbat
2021-12-04 20:42:15 UTC
Post by Kerr-Mudd, John
On Fri, 3 Dec 2021 22:02:06 +1100
Post by Peter Moylan
I found Ada disappointing given the buildup it had had. A competition
was set up where the winner was going to be the one language that
everyone had to use from then on, forsaking all others, at least in
the US department of defense. The winner was named Ada in honour of
Ada Lovelace. It certainly has many virtues, but it didn't achieve
its goal of making everyone abandon all other programming languages.
The one big effect it had on my own programming (in other languages)
is that I now label all of my procedure and function parameters,
except for those passed by value, as *IN* or *OUT* or *INOUT*. Unlike
in Ada, I can only do this in comments, but it has a huge effect on
program readability.
Back then there was another language called Linda, but it sucked.
That figures. It was named after Linda Lovelace.

Its distributed tuple spaces have implementations in other languages.
<https://en.wikipedia.org/wiki/Linda_(coordination_language)>
<https://www.google.com/search?q=pylinda+cpplinda>
Post by Kerr-Mudd, John
I heard there was a later object-orientated edition called Monica.
She didn't go anywhere, so she couldn't get to suck.
Unless M Lewinsky begs to differ.
Adam Funk
2021-12-05 19:31:37 UTC
Post by Peter Moylan
I worked in Ada before it got real inheritance, and I haven't
followed up with the modern Gnu compiler for it. I knew the
compiler writers at the time I used it. I liked Ada, but then I
had already spent years working in PL/M, and there was a general
resemblance, whereas C was very foreign to me at the time.
I was once the external examiner of a PhD thesis (in biochemistry,
not in computer science) of someone who was the Ada expert at
Edinburgh University. I found his remarks about Ada very interesting,
but I wasn't tempted to move to Ada.
I found Ada disappointing given the buildup it had had. A competition
was set up where the winner was going to be the one language that
everyone had to use from then on, forsaking all others, at least in the
alternatively "one Language to rule them all"
Post by Peter Moylan
US department of defense. The winner was named Ada in honour of Ada
Lovelace. It certainly has many virtues, but it didn't achieve its goal
of making everyone abandon all other programming languages.
The one big effect it had on my own programming (in other languages) is
that I now label all of my procedure and function parameters, except for
those passed by value, as *IN* or *OUT* or *INOUT*. Unlike in Ada, I can
only do this in comments, but it has a huge effect on program readability.
Good idea.
Post by Peter Moylan
Back then there was another language called Linda, but it sucked.
--
Radiation! Yes, indeed. You hear the most outrageous lies about
it. Half-baked goggle-box do-gooders telling everybody it's bad
for you. Pernicious nonsense! ---J Frank Parnell
Peter Moylan
2021-12-06 22:09:47 UTC
Post by Adam Funk
[...] I found Ada disappointing given the buildup it had had. A
competition was set up where the winner was going to be the one
language that everyone had to use from then on, forsaking all
others, at least in the
alternatively "one Language to rule them all"
PL/1 was supposed to unite Fortran and COBOL.
Back in the 1960s someone from IBM came to our university and gave a
talk on PL/1. He said that it was so complete that nobody would ever
need any other programming language.

At question time, the first question was "when is PL/2 coming out?"

It turns out that IBM copyrighted (or perhaps trade-mark registered)
everything from PL/1 to PL/99. Despite that, there is/was a medium-level
language, back when many people (including me) were designing such
things, called PL/99.
--
Peter Moylan Newcastle, NSW http://www.pmoylan.org
lar3ryca
2021-12-01 17:42:16 UTC
Post by Peter Moylan
Post by lar3ryca
Post by Dingbat
Post by Ken Blake
Post by Peter Moylan
Post by Adam Funk
I like to keep my GEDCOM files (genealogy data) sorted
by surname, and to do this I use a merge sort. The
basic idea is simple and easy to implement. You break
the file into two halves, sort the two halves, and then
merge the two resulting files. Clearly this involves
recursion, where each half is itself split into
halves.
when you reach a file small enough to fit in memory,
sort it by a different method such as Quicksort.
The whole sort goes very quickly. For this particular
application, you have the extra benefit that the file
is already nearly sorted, because of the sort you did
the last time the file was modified.
You have GEDCOM files too big to load into memory?
Checing the relevant directory, I see that the largest is
only 500 kB, but it goes against my sense of economic
program design to declare an array that large.
And here I load 8 MB csv/tsv files into a pandas.DataFrame
without even thinking about it.
I remember when 8MB was large. Heck, I remember when 80KB was large.
The first computer I programmed, an IBM 1401, had 4K of RAM (and
that was 4000, not 4096). It was possible to go as high as 16K,
but we never got one that big while I worked there.
Those days are long gone.
Yep!
I programmed on a TI-58C calculator with 1/2 KB bubble memory and
moved up to a hand-me-down IBM 1620. I don't know how much core
memory it had; the max was 32KB.
Ah, the old 1620. We used to call it CADET. Can't Add, Doesn't Even Try.
When I first arrived at Newcastle University, the university's computer
(an IBM 1130) had just had its memory upgraded from 8K to 16K words.
That decision was controversial; many people doubted that it was
possible to write a program that would need that much memory.
--
Peter Moylan Newcastle, NSW http://www.pmoylan.org
For some reason, my email response did not make it, so...

The first computer I worked on (final test line, in Don Mills, Ontario)
was the IBM 6400 Electronic Accounting Machine. 32 words of 36-bit
memory, but the program was not stored in memory. It was a plugboard.
Program stepping, command, and arguments were set with the plugboard.
Stepping itself was done with relays, and the actual operations were
done using SMS (Standard Modular System) cards.

See http://www.glennsmuseum.com/ibm/ibm.html - Figure 11 for a picture
of the plugboard.

The first program I ever wrote was a brute-force prime number
generator. It tested every possible divisor for all odd numbers, and if
a number turned out to be divisible only by 1 and itself, it printed the
result on the Selectric typewriter. I started it running at 5PM or so,
and when I arrived back at work at about 8AM, the last number on the
list was 853. Not surprising, since a multiply or divide took about
300ms (yes, milliseconds).
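A rough re-creation of that method in Python, assuming plain trial
division with no square-root cutoff, just to show how much work each
number costs; on a machine taking ~300 ms per divide, a few hundred
results overnight is about right:

def primes_brute_force(limit):
    """Test each odd number by trying every possible divisor, as
    described above; no square-root cutoff or other shortcuts."""
    found = [2]
    for n in range(3, limit, 2):
        if all(n % d != 0 for d in range(2, n)):   # every divisor from 2 to n-1
            found.append(n)
    return found

print(primes_brute_force(1000))   # ends ..., 983, 991, 997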
Ken Blake
2021-12-01 18:48:59 UTC
Permalink
Post by lar3ryca
The first computer I worked on (final test line, in Don Mills, Ontario)
was the IBM 6400 Electronic Accounting Machine. 32 words of 36-bit
memory, but the program was not stored in memory. It was a plugboard.
Program stepping, command, and arguments were set with the plugboard.
Stepping itself was done with relays,
Do you have the IBM model number right? Do you perhaps mean a 405 or 407?

Assuming that's what you mean, those machines weren't called computers;
they were called EAM Machines, and the people who set them up weren't
called programmers.

On my first programming job (programming an IBM 1401) one day I needed
something done on a 407 (I can't remember what or why), and all the
people there who knew how to do it were busy and unavailable. So I
grabbed a manual, read through it, figured out how to do what I needed
to do, and stuck the wires into the board. It was a lot easier than
programming the 1401. It worked.
Post by lar3ryca
and the actual operations were
done using SMS (Standard Modular System) cards.
I don't think you mean SMS cards. I think you mean 80-column punched
cards. That's what was used in those days. I still have a few around here
somewhere; I used them as bookmarks for years.
lar3ryca
2021-12-01 20:56:50 UTC
Permalink
Post by Ken Blake
Post by lar3ryca
The first computer I worked on (final test line, in Don Mills, Ontario)
was the IBM 6400 Electronic Accounting Machine. 32 words of 36-bit
memory, but the program was not stored in memory. It was a plugboard.
Program stepping, command, and arguments were set with the plugboard.
Stepping itself was done with relays,
Do you have the IBM model number right? Do you perhaps mean a 405 or 407?
Assuming that's what you mean, those machines weren't called computers;
they were called EAM Machines, and the people who set them up weren't
called programmers.
Well, we called it the 6400 Electronic Accounting Machine. It was
definitely not the 405 or 407. I call it a computer because it could be
programmed to do calculations, manipulate strings, and do I/O.
At the time I worked on the final test line, the 360/30 was a topic of conversation.
Being a strictly hardware guy, I could not figure out how a program could be
stored in memory. I chatted with a few guys who were giving courses on it, and
finally understood.
From the site I referenced, http://www.glennsmuseum.com/ibm/ibm.html
Figure 11 shows a particularly rare piece: the control
plugboard for an IBM 6400 Accounting Machine.
---
The IBM 6400 series is a series of four calculating and accounting
machines produced by IBM starting in 1962.[1] The IBM 6405 was a
desk-size calculator, and the 6410, 6420, and 6430 were more advanced
accounting machines.
The 6405, 6410, and 6420 were developed by IBM in Lexington,
Kentucky, United States, in the early 1960s. Manufacturing was done
by IBM in Lexington and by IBM in Don Mills, Ontario, Canada.
In 1966 all work was transferred to Don Mills.
---
The model I worked on was the 6420, but we all called it the 6400.
I transferred to IBM in Kitchener, before the 6430 was developed,
so I can tell you nothing about it. Shortly after that I trained on
and serviced the 360 Model 30.
Post by Ken Blake
On my first programming job (programming an IBM 1401) one day I needed
something done on a 407 (I can't remember what or why, and all the
people there who knew how to do it were busy and unavailable. So I
grabbed a manual, read through it, figured out how to do what I needed
to do and stuck the wires into the board. It was a lot easier than
programming the 1401. It worked.
I never did work on a 1401 computer. I did, however, work on the IBM 1401,
but they were attached to System 360 mainframes.
Post by Ken Blake
Post by lar3ryca
and the actual operations were
done using SMS (Standard Modular System) cards.
I don't think you mean SMS cards. I think you mean 80-column punched
cards. That's what was used in those days. I still have a few around here
somewhere; I used them as bookmarks for years.
No, I meant SMS cards. They were not input devices. They lived in the electronic
'gate' and were circuit boards containing the logic. They each had specific jobs
to do, and consisted of transistors, resistors, etc.
https://en.wikipedia.org/wiki/IBM_Standard_Modular_System
PS:
The term 'gate' referred to one of two swing-out racks, one of which held the relays
and the other held the SMS cards. The cards were wired using wire-wrap.

And oops!

When I said "I did, however, work on the IBM 1401", I meant to say "I did,
however, work on the IBM 1401 chain printer".
Ken Blake
2021-12-01 22:37:03 UTC
Permalink
Post by lar3ryca
Post by Ken Blake
Post by lar3ryca
The first computer I worked on (final test line, in Don Mills, Ontario)
was the IBM 6400 Electronic Accounting Machine. 32 words of 36-bit
memory, but the program was not stored in memory. It was a plugboard.
Program stepping, command, and arguments were set with the plugboard.
Stepping itself was done with relays,
Do you have the IBM model number right? Do you perhaps mean a 405 or 407?
Assuming that's what you mean, those machines weren't called computers;
they were called EAM Machines, and the people who set them up weren't
called programmers.
Well, we called it the 6400 Electronic Accounting Machine. It was
definitely not the 405 or 407. I call it a computer because it could be
programmed to do calculations, manipulate strings, and do I/O.
At the time I worked on the final test line, the 360/30 was a topic of conversation.
Being a strictly hardware guy, I could not figure out how a program could be
stored in memory. I chatted with a few guys who were giving courses on it, and
finally understood.
From the site I referenced, http://www.glennsmuseum.com/ibm/ibm.html
Figure 11 shows a particularly rare piece: the control
plugboard for an IBM 6400 Accounting Machine.
---
The IBM 6400 series is a series of four calculating and accounting
machines produced by IBM starting in 1962.[1] The IBM 6405 was a
desk-size calculator, and the 6410, 6420, and 6430 were more advanced
accounting machines.
The 6405, 6410, and 6420 were developed by IBM in Lexington,
Kentucky, United States, in the early 1960s. Manufacturing was done
by IBM in Lexington and by IBM in Don Mills, Ontario, Canada.
In 1966 all work was transferred to Don Mills.
---
The model I worked on was the 6420, but we all called it the 6400.
I transferred to IBM in Kitchener, before the 6430 was developed,
so I can tell you nothing about it. Shortly after that I trained on
and serviced the 360 Model 30.
Post by Ken Blake
On my first programming job (programming an IBM 1401) one day I needed
something done on a 407 (I can't remember what or why, and all the
people there who knew how to do it were busy and unavailable. So I
grabbed a manual, read through it, figured out how to do what I needed
to do and stuck the wires into the board. It was a lot easier than
programming the 1401. It worked.
I never did work on a 1401 computer. I did, however, work on the IBM 1401,
but they were attached to System 360 mainframes.
Post by Ken Blake
Post by lar3ryca
and the actual operations were
done using SMS (Standard Modular System) cards.
I don't think you mean SMS cards. I think you mean 80-column punched
cards. That's what was used in those days. I still have a few around here
somewhere; I used them as bookmarks for years.
No, I meant SMS cards. They were not input devices. They lived in the electronic
'gate' and were circuit boards containing the logic. They each had specific jobs
to do, and consisted of transistors, resistors, etc.
https://en.wikipedia.org/wiki/IBM_Standard_Modular_System
The term 'gate' referred to one of two swing-out racks, one of which held the relays
and the other held the SMS cards. The cards were wired using wire-wrap.
And oops!
When I said "I did, however, work on the IBM 1401", I meant to say "I did, however,
work on the IBM 1401 chain printer"
Yes, the 1403.

An anecdote: Back in those days, I was once in the computer room doing a
tape dump on the 1403. I ran out of paper, and since each tape record
was only 15 or 20 characters long, and not wanting to waste any more
paper, I took the used paper, turned it around, and fed it back into the
printer.

The computer operator on duty that day came over and looked with me at
the paper rising from the printer. He saw the characters right side up
on the left side of the paper, and upside down on the right side. He did
a double take and said "How did you do that?"
Ken Blake
2021-12-01 22:29:39 UTC
Permalink
Post by Ken Blake
Post by lar3ryca
The first computer I worked on (final test line, in Don Mills, Ontario)
was the IBM 6400 Electronic Accounting Machine. 32 words of 36-bit
memory, but the program was not stored in memory. It was a plugboard.
Program stepping, command, and arguments were set with the plugboard.
Stepping itself was done with relays,
Do you have the IBM model number right? Do you perhaps mean a 405 or 407?
Assuming that's what you mean, those machines weren't called computers;
they were called EAM Machines, and the people who set them up weren't
called programmers.
Well, we called it the 6400 Electronic Accounting Machine. It was
definitely not the 405 or 407. I call it a computer because it could be
programmed to do calculations, manipulate strings, and do I/O.
At the time I worked on the final test line, the 360/30 was a topic of conversation.
Being a strictly hardware guy, I could not figure out how a program could be
stored in memory. I chatted with a few guys who were giving courses on it, and
finally understood.
From the site I referenced, http://www.glennsmuseum.com/ibm/ibm.html
Figure 11 shows a particularly rare piece: the control
plugboard for an IBM 6400 Accounting Machine.
OK, thanks, that looks nothing like the 405 or 407 boards.
---
The IBM 6400 series is a series of four calculating and accounting
machines produced by IBM starting in 1962.[1] The IBM 6405 was a
desk-size calculator, and the 6410, 6420, and 6430 were more advanced
accounting machines.
OK, I've never heard of those. I thought I knew about everything in IBM's
line at that time, but I guess I was wrong.
The 6405, 6410, and 6420 were developed by IBM in Lexington,
Kentucky, United States, in the early 1960s. Manufacturing was done
by IBM in Lexington and by IBM in Don Mills, Ontario, Canada.
In 1966 all work was transferred to Don Mills.
---
The model I worked on was the 6420, but we all called it the 6400.
I transferred to IBM in Kitchener, before the 6430 was developed,
so I can tell you nothing about it. Shortly after that I trained on
and serviced the 360 Model 30.
I knew and programmed the 360/20, 25, 30, 40, 50, 65, and 67. Also
several 370 models.
Post by Ken Blake
On my first programming job (programming an IBM 1401) one day I needed
something done on a 407 (I can't remember what or why, and all the
people there who knew how to do it were busy and unavailable. So I
grabbed a manual, read through it, figured out how to do what I needed
to do and stuck the wires into the board. It was a lot easier than
programming the 1401. It worked.
I never did work on a 1401 computer. I did, however, work on the IBM 1401,
but they were attached to System 360 mainframes.
Post by Ken Blake
Post by lar3ryca
and the actual operations were
done using SMS (Standard Modular System) cards.
I don't think you mean SMS cards. I think you mean 80-column punched
cards. That's what was used in those days. I still have a few around here
somewhere; I used them as bookmarks for years.
No, I meant SMS cards. They were not input devices. They lived in the electronic
'gate' and were circuit boards containing the logic. They each had specific jobs
to do, and consisted of transistors, resistors, etc.
OK, sorry to have misunderstood you.
https://en.wikipedia.org/wiki/IBM_Standard_Modular_System
Peter Moylan
2021-12-02 10:17:26 UTC
Permalink
As for programming, the neatest one I ever did was written in
assembler and punched onto 3 80-column cards. Because of the way the
360 line loaded a program from a card deck, you could actually write
a complete (small) program using three cards.
The one I wrote was a card deck copier. It ran on a 2540 card
reader/punch. You put the three-card program into the reader,
followed by the cards you wanted to copy, followed by a single blank
card. You needed to have enough blank cards in the punch hopper, and
when you selected the address of the reader and hit IPL (Initial
Program Load), it read in the program, then the deck to be copied,
and when it got to a blank card, it stopped reading and punched the
copy.
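The control flow is simple enough to mimic in a few lines of Python (a
toy stand-in, of course; the real program was three cards of 360
assembler):

def copy_deck(reader):
    """Buffer cards until the first blank one, then 'punch' a copy,
    mimicking the three-card copier described above."""
    deck = []
    for card in reader:          # cards following the loader itself
        if card.strip() == "":   # a blank card marks the end of the deck
            break
        deck.append(card)
    return list(deck)            # the punched copy

print(copy_deck(["FIRST CARD", "SECOND CARD", "", "not read"]))
# ['FIRST CARD', 'SECOND CARD']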
The first computer I ever worked with was a PDP8-S, and I remember being
fascinated with the way the bootstrap loader worked. You keyed in a few
instructions from the front panel, basically just enough to start
reading data from a paper tape. The stuff on the paper tape contained
the part of the bootstrap loader that you hadn't yet loaded, so
instructions were being put in memory just before it was time to execute
them. Once the bootstrap loader was in memory the paper tape switched to
a different format, containing a better-quality loader that the
bootstrap loader was able to load. It was very much a case of the loader
lifting itself up by its own bootstraps.

A few years later we got one of the first PDP-11s, and again the initial
loader had to be keyed in on the front panel. Later we were saved from
that tedium when we got a ROM containing the bootstrap loader. That ROM
was not a semiconductor ROM. It was just an array of diodes, and you
programmed it with a pair of sidecutters.

Eventually we were able to add a disk drive, and then initial program
loading worked the way the big machines did: the loader was on the first
block of the disk, and that block was loaded in automagically.
--
Peter Moylan Newcastle, NSW http://www.pmoylan.org
Snidely
2021-12-02 12:45:45 UTC
Permalink
Post by Peter Moylan
As for programming, the neatest one I ever did was written in
assembler and punched onto 3 80-column cards. Because of the way the
360 line loaded a program from a card deck, you could actually write
a complete (small) program using three cards.
The one I wrote was a card deck copier. It ran on a 2540 card
reader/punch. You put the three-card program into the reader,
followed by the cards you wanted to copy, followed by a single blank
card. You needed to have enough blank cards in the punch hopper, and
when you selected the address of the reader and hit IPL (Initial
Program Load), it read in the program, then the deck to be copied,
and when it got to a blank card, it stopped reading and punched the
copy.
The first computer I ever worked with was a PDP8-S, and I remember being
fascinated with the way the bootstrap loader worked. You keyed in a few
instructions from the front panel, basically just enough to start
reading data from a paper tape. The stuff on the paper tape contained
the part of the bootstrap loader that you hadn't yet loaded, so
instructions were being put in memory just before it was time to execute
them. Once the bootstrap loader was in memory the paper tape switched to
a different format, containing a better-quality loader that the
bootstrap loader was able to load. It was very much a case of the loader
lifting itself up by its own bootstraps.
I had a little access to an 8S, but most of my learning was on a
cupboard full of PDP-8 ("straight 8"?) with DECtape drives, and a
second 4K was eventually added. I was jealous of the folk who had a
PDP-8 with the plexiglas covers.

The DECtape boot loader I first learned was about 15 words (12-bit
words), but someone got it down to about 8 [the actual numbers are
shadows in the past]. And we generally felt that we had the best
console switches of the various ones we encountered on DEC machines,
including the PDP-10.
Post by Peter Moylan
A few years later we got one of the first PDP-11s, and again the initial
loader had to be keyed in on the front panel. Later we were saved from
that tedium when we got a ROM containing the bootstrap loader. That ROM
was not a semiconductor ROM. It was just an array of diodes, and you
programmed it with a pair of sidecutters.
Eventually we were able to add a disk drive, and then initial program
loading worked the way the big machines did: the loader was on the first
block of the disk, and that block was loaded in automagically.
I missed out on the PDP-11 that replaced that 8, and went off to a
school with an IBM 1130. I never had to bootstrap it. I briefly
worked in a test department that used PDP-11s and DG [mumble]s, but
changed jobs before I had much toggling experience. My VAX experience
was all at the end of a long terminal cable, and was never taken into
the Actual Presence.
/dps
--
Ieri, oggi, domani
Adam Funk
2021-12-02 13:42:57 UTC
Permalink
Post by Peter Moylan
As for programming, the neatest one I ever did was written in
assembler and punched onto 3 80-column cards. Because of the way the
360 line loaded a program from a card deck, you could actually write
a complete (small) program using three cards.
The one I wrote was a card deck copier. It ran on a 2540 card
reader/punch. You put the three-card program into the reader,
followed by the cards you wanted to copy, followed by a single blank
card. You needed to have enough blank cards in the punch hopper, and
when you selected the address of the reader and hit IPL (Initial
Program Load), it read in the program, then the deck to be copied,
and when it got to a blank card, it stopped reading and punched the
copy.
The first computer I ever worked with was a PDP8-S, and I remember being
fascinated with the way the bootstrap loader worked. You keyed in a few
instructions from the front panel, basically just enough to start
reading data from a paper tape. The stuff on the paper tape contained
the part of the bootstrap loader that you hadn't yet loaded, so
instructions were being put in memory just before it was time to execute
them. Once the bootstrap loader was in memory the paper tape switched to
a different format, containing a better-quality loader that the
bootstrap loader was able to load. It was very much a case of the loader
lifting itself up by its own bootstraps.
A few years later we got one of the first PDP-11s, and again the initial
loader had to be keyed in on the front panel. Later we were saved from
that tedium when we got a ROM containing the bootstrap loader. That ROM
was not a semiconductor ROM. It was just an array of diodes, and you
programmed it with a pair of sidecutters.
Wouldn't a breadboard have been easier? Or too easy to knock diodes
loose?
Post by Peter Moylan
Eventually we were able to add a disk drive, and then initial program
loading worked the way the big machines did: the loader was on the first
block of the disk, and that block was loaded in automagically.
--
books by the blameless and by the dead
Peter Moylan
2021-12-03 00:33:40 UTC
Permalink
Post by Adam Funk
Post by Peter Moylan
A few years later we got one of the first PDP-11s, and again the
initial loader had to be keyed in on the front panel. Later we
were saved from that tedium when we got a ROM containing the
bootstrap loader. That ROM was not a semiconductor ROM. It was just
an array of diodes, and you programmed it with a pair of
sidecutters.
Wouldn't a breadboard have been easier? Or too easy to knock diodes
loose?
A diode array is cheaper to build. Also, a breadboard takes too much
space, and that's a consideration given that the module has to be plugged
into a slot somewhere inside the machine. Knocking the diodes is an
issue only when you're plugging the board into that slot, and typically
you do that only once.

I probably should have mentioned that if you make a mistake in cutting
the diodes, the error can be corrected with a soldering iron. That's
what you call programming right down to the bare metal.
--
Peter Moylan Newcastle, NSW http://www.pmoylan.org
Snidely
2021-12-03 00:54:19 UTC
Permalink
Post by Peter Moylan
Post by Adam Funk
Post by Peter Moylan
A few years later we got one of the first PDP-11s, and again the
initial loader had to be keyed in on the front panel. Later we
were saved from that tedium when we got a ROM containing the
bootstrap loader. That ROM was not a semiconductor ROM. It was just
an array of diodes, and you programmed it with a pair of
sidecutters.
Wouldn't a breadboard have been easier? Or too easy to knock diodes
loose?
A diode array is cheaper to build. Also, a breadboard takes too much
space, and that's a consideration given that the module has to be plugged
into a slot somewhere inside the machine. Knocking the diodes is an
issue only when you're plugging the board into that slot, and typically
you do that only once.
I probably should have mentioned that if you make a mistake in cutting
the diodes, the error can be corrected with a soldering iron. That's
what you call programming right down to the bare metal.
Heh!

/dps "The PDP-8 had about 1 gate per 3x5 card"
--
"That's a good sort of hectic, innit?"

" Very much so, and I'd recommend the haggis wontons."
-njm
Adam Funk
2021-12-03 09:22:30 UTC
Permalink
Post by Peter Moylan
Post by Adam Funk
Post by Peter Moylan
A few years later we got one of the first PDP-11s, and again the
initial loader had to be keyed in on the front panel. Later we
were saved from that tedium when we got a ROM containing the
bootstrap loader. That ROM was not a semiconductor ROM. It was just
an array of diodes, and you programmed it with a pair of
sidecutters.
Wouldn't a breadboard have been easier? Or too easy to knock diodes
loose?
A diode array is cheaper to build. Also, a breadboard takes too much
space, and that's a consideration given that the module has to be plugged
into a slot somewhere inside the machine. Knocking the diodes is an
issue only when you're plugging the board into that slot, and typically
you do that only once.
I probably should have mentioned that if you make a mistake in cutting
the diodes, the error can be corrected with a soldering iron. That's
what you call programming right down to the bare metal.
I see --- I guess it wasn't reprogrammed very often.
--
Apparently I lack some particular perversion which today's
employer is seeking. ---Ignatius J Reilly
Ken Blake
2021-12-02 18:09:22 UTC
Permalink
Post by Ken Blake
Post by Ken Blake
Post by lar3ryca
The first computer I worked on (final test line, in Don Mills, Ontario)
was the IBM 6400 Electronic Accounting Machine. 32 words of 36-bit
memory, but the program was not stored in memory. It was a plugboard.
Program stepping, command, and arguments were set with the plugboard.
Stepping itself was done with relays,
Do you have the IBM model number right? Do you perhaps mean a 405 or 407?
Assuming that's what you mean, those machines weren't called computers;
they were called EAM Machines, and the people who set them up weren't
called programmers.
Well, we called it the 6400 Electronic Accounting Machine. It was
definitely not the 405 or 407. I call it a computer because it could be
programmed to do calculations, manipulate strings, and do I/O.
At the time I worked on the final test line, the 360/30 was a topic of conversation.
Being a strictly hardware guy, I could not figure out how a program could be
stored in memory. I chatted with a few guys who were giving courses on it, and
finally understood.
From the site I referenced, http://www.glennsmuseum.com/ibm/ibm.html
Figure 11 shows a particularly rare piece: the control
plugboard for an IBM 6400 Accounting Machine.
OK, thanks, that looks nothing like the 405 or 407 boards.
---
The IBM 6400 series is a series of four calculating and accounting
machines produced by IBM starting in 1962.[1] The IBM 6405 was a
desk-size calculator, and the 6410, 6420, and 6430 were more advanced
accounting machines.
OK, I've never heard of those. I thought I knew about everything in IBM's
line at that time, but I guess I was wrong.
The 6405, 6410, and 6420 were developed by IBM in Lexington,
Kentucky, United States, in the early 1960s. Manufacturing was done
by IBM in Lexington and by IBM in Don Mills, Ontario, Canada.
In 1966 all work was transferred to Don Mills.
---
The model I worked on was the 6420, but we all called it the 6400.
I transferred to IBM in Kitchener, before the 6430 was developed,
so I can tell you nothing about it. Shortly after that I trained on
and serviced the 360 Model 30.
I knew and programmed the 360/20, 25, 30, 40, 50, 65, and 67. Also
several 370 models.
I was a CE (Customer Engineer, IBMSpeak for technician),
Yes, I know the term CE very well. Also SE (Systems Engineer).
30, 40, 44, 50, and 95, though that last one was only to do ECs
(Engineering Changes) at the University of Waterloo.
I never programmed a 95 (a giant machine, in its day), but I saw one
once, at an IBM facility. If I'm not mixing it up with another very
large computer, it was surrounded by a football-field of disk drives.
lar3ryca
2021-12-02 21:33:40 UTC
Permalink
Post by Ken Blake
Post by Ken Blake
Post by Ken Blake
Post by lar3ryca
The first computer I worked on (final test line, in Don Mills, Ontario)
was the IBM 6400 Electronic Accounting Machine. 32 words of 36-bit
memory, but the program was not stored in memory. It was a plugboard.
Program stepping, command, and arguments were set with the plugboard.
Stepping itself was done with relays,
Do you have the IBM model number right? Do you perhaps mean a 405 or 407?
Assuming that's what you mean, those machines weren't called computers;
they were called EAM Machines, and the people who set them up weren't
called programmers.
Well, we called it the 6400 Electronic Accounting Machine. It was
definitely not the 405 or 407. I call it a computer because it could be
programmed to do calculations, manipulate strings, and do I/O.
At the time I worked on the final test line, the 360/30 was a topic of conversation.
Being a strictly hardware guy, I could not figure out how a program could be
stored in memory. I chatted with a few guys who were giving courses on it, and
finally understood.
From the site I referenced, http://www.glennsmuseum.com/ibm/ibm.html
Figure 11 shows a particularly rare piece: the control
plugboard for an IBM 6400 Accounting Machine.
OK, thanks, that looks nothing like the 405 or 407 boards.
---
The IBM 6400 series is a series of four calculating and accounting
machines produced by IBM starting in 1962.[1] The IBM 6405 was a
desk-size calculator, and the 6410, 6420, and 6430 were more advanced
accounting machines.
OK, I've never heard of those. I thought I knew about everything in IBM's
line at that time, but I guess I was wrong.
The 6405, 6410, and 6420 were developed by IBM in Lexington,
Kentucky, United States, in the early 1960s. Manufacturing was done
by IBM in Lexington and by IBM in Don Mills, Ontario, Canada.
In 1966 all work was transferred to Don Mills.
---
The model I worked on was the 6420, but we all called it the 6400.
I transferred to IBM in Kitchener, before the 6430 was developed,
so I can tell you nothing about it. Shortly after that I trained on
and serviced the 360 Model 30.
After I moved into the field, and to Kitchener, one of the 6400 systems I
worked on was at Budd Automotive. They made car frames. Once they were
made, someone had to measure the frame in a number of places to ensure
they met the required specifications. The specs were such that the
measurements interacted, in that it was OK for a particular measurement to
be a certain amount off, provided that another particular measurement was
within a certain amount.

They used digital calipers and depth gauges, etc., which had RS232 interfaces.
The measurements were printed out, then entered by hand into the 6400, using
the program to calculate the result.

The plant manager asked IBM for an RPQ to provide an RS232-to-6400 input, and
was quoted something in the range of $20,000. On my next call, I saw something
attached to the numeric keypad and asked him about it. He told me it took RS232,
then activated solenoids to enter the data. It turns out he had approached a few
companies, and Victor Comptometer (I think) provided it for under $1,000.
All he had to do was get permission from IBM to drill 4 mounting holes in the
keypad surround (the machine was rented). He says he told them that if he didn't
get their permission, he would glue it on.
Post by Ken Blake
Post by Ken Blake
I knew and programmed the 360/20, 25, 30, 40, 50, 65, and 67. Also
several 370 models.
I was a CE (Customer Engineer, IBMSpeak for technician),
Yes, I know the term CE very well. Also SE (Systems Engineer).
30, 40, 44, 50, and 95, though that last one was only to do ECs
(Engineering Changes) at the University of Waterloo.
I never programmed a 95 (a giant machine, in its day), but I saw one
once, at an IBM facility. If I'm not mixing it up with another very
large computer, it was surrounded by a football-field of disk drives.
The only model 44 I ever worked on was also at UW.
It had something I never saw on any other 360. It looked like a sort of
cartridge disk drive that fit into a slot on the side of the cabinet.

Never did see a model 65 or 67. What sort of read-only storage did they use?
I know the model 30 used CCROS and the 40 used TROS.

There were a number (perhaps 2 or 3) of CEs who were pretty much stationed
at UW. The only time I saw them was when there was a meeting at the IBM office.

Yes, they had a LOT of disk drives.
Peter Moylan
2021-12-02 01:20:28 UTC
Permalink
Post by lar3ryca
The first computer I worked on (final test line, in Don Mills,
Ontario) was the IBM 6400 Electronic Accounting Machine. 32 words of
36-bit memory, but the program was not stored in memory. It was a
plugboard. Program stepping, command, and arguments were set with the
plugboard. Stepping itself was done with relays, and the actual
operations were done using SMS (Standard Modular System) cards.
I've never seen a digital machine with a plugboard, but of course I'm
familiar with using plugboards on analogue computers.

Analogue computers were going out of fashion by about 1970, but it has
just now occurred to me that with today's electronics you could build a
far superior analogue computer on a shoestring budget. In fact the
plugboards would probably account for most of the cost.
--
Peter Moylan Newcastle, NSW http://www.pmoylan.org
lar3ryca
2021-12-02 03:36:06 UTC
Permalink
Post by Peter Moylan
Post by lar3ryca
The first computer I worked on (final test line, in Don Mills,
Ontario) was the IBM 6400 Electronic Accounting Machine. 32 words of
36-bit memory, but the program was not stored in memory. It was a
plugboard. Program stepping, command, and arguments were set with the
plugboard. Stepping itself was done with relays, and the actual
operations were done using SMS (Standard Modular System) cards.
I've never seen a digital machine with a plugboard, but of course I'm
familiar with using plugboards on analogue computers.
Analogue computers were going out of fashion by about 1970, but it has
just now occurred to me that with today's electronics you could build a
far superior analogue computer on a shoestring budget. In fact the
plugboards would probably account for most of the cost.
Ahh. Perhaps my opening statement should have read "The first
_digital_ computer I worked on..."

My step-brother and I bought an analogue computer kit in about 1960.
ISTR that we purchased it from Van Nostrand, Fischer Scientific, or
Scientific Supplies. I don't think I ever figured out the circuit.
It had three potentiometers, a centre-zero meter, and some
switches. It was sort of an electric slide rule.

I wish I could remember more details about it.
Peter T. Daniels
2021-12-02 15:38:53 UTC
Permalink
Post by Peter Moylan
I've never seen a digital machine with a plugboard, but of course I'm
familiar with using plugboards on analogue computers.
Analogue computers were going out of fashion by about 1970, but it has
just now occurred to me that with today's electronics you could build a
far superior analogue computer on a shoestring budget. In fact the
plugboards would probably account for most of the cost.
<Analogue> looks quite odd. <Dialogue> and <catalogue> and I suppose
other <-logue>s don't bother me, but that one does.
Jerry Friedman
2021-12-02 15:44:49 UTC
Permalink
Post by Peter T. Daniels
Post by Peter Moylan
I've never seen a digital machine with a plugboard, but of course I'm
familiar with using plugboards on analogue computers.
Analogue computers were going out of fashion by about 1970, but it has
just now occurred to me that with today's electronics you could build a
far superior analogue computer on a shoestring budget. In fact the
plugboards would probably account for most of the cost.
<Analogue> looks quite odd. <Dialogue> and <catalogue> and I suppose
other <-logue>s don't bother me, but that one does.
In the American usage I'm familiar with, "analogue" means something
that's analogous to something else--torque is the rotational analogue
of force--and "analog" describes a kind of computer.
--
Jerry Friedman
Peter Moylan
2021-12-03 00:28:12 UTC
Permalink
Post by Jerry Friedman
Post by Peter T. Daniels
Post by Peter Moylan
I've never seen a digital machine with a plugboard, but of course
I'm familiar with using plugboards on analogue computers.
Analogue computers were going out of fashion by about 1970, but
it has just now occurred to me that with today's electronics you
could build a far superior analogue computer on a shoestring
budget. In fact the plugboards would probably account for most of
the cost.
<Analogue> looks quite odd. <Dialogue> and <catalogue> and I
suppose other <-logue>s don't bother me, but that one does.
In the American usage I'm familiar with, "analogue" means something
that's analogous to something else--torque is the rotational
analogue of force--and "analog" describes a kind of computer.
The most important components in an analog(ue) computer are the
integrators, because by interconnecting them with the plugboard you can
solve differential equations. In setting up the differential equation
you are creating an analogue of the physical system you are trying to model.

At least, that's the way I see it. But in real-life writing we
distinguish between pairs like program/programme purely on the basis
that that's the way we are used to writing the word or words.
--
Peter Moylan Newcastle, NSW http://www.pmoylan.org
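The integrator point is easy to demonstrate numerically. A sketch in
Python of the patch an analogue computer would use for y'' = -y, two
integrators in a feedback loop (the step size and initial conditions
are arbitrary choices):

import math

dt = 0.001
y, dy = 1.0, 0.0                 # y(0) = 1, y'(0) = 0, so y(t) = cos(t)
for _ in range(int(2 * math.pi / dt)):
    dy += -y * dt                # first integrator: -y accumulates into y'
    y += dy * dt                 # second integrator: y' accumulates into y

print(round(y, 3))               # close to 1.0 again after one full period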
Snidely
2021-12-03 01:01:55 UTC
Permalink
Post by Peter Moylan
Post by Jerry Friedman
Post by Peter T. Daniels
Post by Peter Moylan
I've never seen a digital machine with a plugboard, but of course
I'm familiar with using plugboards on analogue computers.
Analogue computers were going out of fashion by about 1970, but
it has just now occurred to me that with today's electronics you
could build a far superior analogue computer on a shoestring
budget. In fact the plugboards would probably account for most of
the cost.
<Analogue> looks quite odd. <Dialogue> and <catalogue> and I
suppose other <-logue>s don't bother me, but that one does.
In the American usage I'm familiar with, "analogue" means something
that's analogous to something else--torque is the rotational
analogue of force--and "analog" describes a kind of computer.
I don't think I see the longue spelling of the noun often, and I'm
mostly reading AmE (The Atlantic, Gizmodo, Ars Technica, The Orange
County Register, The LA Times, ...). And don't forget that _Analog_ is
a magazine.
Post by Peter Moylan
The most important components in an analog(ue) computer are the
integrators, because by interconnecting them with the plugboard you can
solve differential equations. In setting up the differential equation
you are creating an analogue of the physical system you are trying to model.
At least, that's the way I see it. But in real-life writing we
distinguish between pairs like program/programme purely on the basis
that that's the way we are used to writing the word or words.
And integrators are mostly done with op-amps, which come many to a chip
package these days. If you can scale your inputs to the levels these
use, you could probably do the analog side of a Linc-8 in a box not
much bigger than used for a Raspberry Pi. Oh, and the RPi may have
daughter boards with op amps, so there you go.
/dps
--
"This is all very fine, but let us not be carried away be excitement,
but ask calmly, how does this person feel about in in his cooler
moments next day, with six or seven thousand feet of snow and stuff on
top of him?"
_Roughing It_, Mark Twain.
Dingbat
2021-11-27 23:44:45 UTC
Permalink
Post by Athel Cornish-Bowden
Post by Peter Moylan
Post by lar3ryca
On Thu, 25 Nov 2021 21:43:35 +1100 Peter Moylan
Post by Peter Moylan
Post by Stefan Ram
Subject: What is a pile?
| With the assumption that removing a single grain does not
| cause a heap to become a non-heap, the paradox is to consider
| what happens when the process is repeated enough times that
| only one grain remains: is it still a heap? If not, when did
| it change from a heap to a non-heap?
-- from a Web page about the sorites paradox.
Must have been written by a physicist! A mathematician would
| What happens when the process is repeated enough times
| that no grain remains: is it still a heap?
Of course it is, to a mathematician. The empty set is still
a set, so the empty heap is still a heap.
The best of all possible heaps!
And the fastest possible case, in the HeapSort algorithm.
I guess pidgin paleface talk is no longer acceptable.
QuickSort is faster, so says the Wikipedia Heapsort article,
but not so clear on the QS page.
Hmm... he did not claim that HeapSort is the fastest sort. He only
claimed that a Heapsort on the empty heap is the fastest
possible HeapSort.
Precisely. I use Quicksort for my own sorting, as it happens. I've
tried HeapSort and like it less. They perform equally well on
zero-sized data.
When sorting an array of large records - a case that the textbooks
never cover, but which in my experience is the most important
practical case - one important contribution to the execution time
is the time to move the records around. With that in mind, my
implementation delays the movement and instead moves "holes"
around. A simple idea, but one that nobody else seems to use in
QuickSort, although using "holes" is more intuitively obvious in
HeapSort.
Of course one could handle the long-record case by sorting an array
of pointers, but then you get a messy problem at the end when
moving everything to its final position; an annoying problem if, as
usually happens, you are trying to sort the array in place.
Post by lar3ryca
https://en.wikipedia.org/wiki/Introsort
Interesting! Thanks for the link.
That's designed to have good worst-case performance, but personally
I'd rather have best average-case performance.
The last time I needed to incorporate a sorting algorithm in a
program (around 1992, I think) I used comb sort. I don't remember
why, but it had to do with memory requirements. (I was an avid reader
of Byte at that time, and I had probably read the 1991 article by
Lacey and Box.) Anyway, it worked very well, not just an improvement
on its ancestor, the simple-minded bubble sort, but a vast
improvement.
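Comb sort is only a few lines; a sketch from the usual textbook
description (shrink factor 1.3, as in Lacey and Box), not the 1992 code
mentioned above:

def comb_sort(a):
    """Bubble sort with a shrinking gap: large gaps move small items
    near the end ('turtles') toward the front quickly."""
    gap, swapped = len(a), True
    while gap > 1 or swapped:
        gap = max(1, int(gap / 1.3))   # the shrink factor from Lacey and Box
        swapped = False
        for i in range(len(a) - gap):
            if a[i] > a[i + gap]:
                a[i], a[i + gap] = a[i + gap], a[i]
                swapped = True
    return a

print(comb_sort([5, 1, 4, 2, 8, 0]))   # [0, 1, 2, 4, 5, 8]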
I like to keep my GEDCOM files (genealogy data) sorted by surname, and
to do this I use a merge sort. The basic idea is simple and easy to
implement. You break the file into two halves, sort the two halves, and
then merge the two resulting files. Clearly this involves recursion,
where each half is itself split into halves.
To make it efficient, you have to add one extra detail: when you reach a
file small enough to fit in memory, sort it by a different method such
as Quicksort.
The whole sort goes very quickly. For this particular application, you
have the extra benefit that the file is already nearly sorted, because
of the sort you did the last time the file was modified.
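A minimal sketch of that scheme in Python, with in-memory lists standing
in for the two halves on disk and an arbitrary threshold playing the
part of "small enough to fit in memory":

def merge_sort_external(records, small_enough=100_000):
    """Split until a chunk is 'small enough', sort it in memory,
    then merge the sorted halves, as described above."""
    if len(records) <= small_enough:
        return sorted(records)          # the in-memory sort step
    mid = len(records) // 2
    left = merge_sort_external(records[:mid], small_enough)
    right = merge_sort_external(records[mid:], small_enough)
    out, i, j = [], 0, 0                # merge the two sorted halves
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            out.append(left[i]); i += 1
        else:
            out.append(right[j]); j += 1
    out.extend(left[i:]); out.extend(right[j:])
    return out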
Quicksort goes slowly if the data are already mostly sorted.
In a worst case, it's of order n^2. There was a b-sort in CACM,
about 1985, which seemed not to have this demerit.

Ah, here it is:
01-Apr-1985 — Bsort, a variation of Quicksort, combines the interchange
technique used in Bubble sort with the Quicksort algorithm.

It was the 1st hit returned by this search:
<https://www.google.com/search?q=b-sort+cacm>
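The n^2 behaviour is easy to reproduce: with a naive first-element
pivot, every partition of already-sorted input is maximally lopsided,
one element versus all the rest. A small demonstration in Python
(median-of-three pivot selection is one classic fix):

import sys

def quicksort_naive(a):
    """First-element pivot: on sorted input each split is 1 vs n-1,
    so there are n recursion levels and O(n^2) comparisons."""
    if len(a) <= 1:
        return a
    pivot = a[0]
    smaller = [x for x in a[1:] if x < pivot]
    larger = [x for x in a[1:] if x >= pivot]
    return quicksort_naive(smaller) + [pivot] + quicksort_naive(larger)

sys.setrecursionlimit(10_000)                   # sorted input recurses ~n deep
print(quicksort_naive(list(range(2000)))[:5])   # works, but needlessly slow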
Adam Funk
2021-11-26 09:25:56 UTC
Permalink
Post by Peter Moylan
Post by lar3ryca
On Thu, 25 Nov 2021 21:43:35 +1100 Peter Moylan
Post by Peter Moylan
Post by Stefan Ram
Subject: What is a pile?
| With the assumption that removing a single grain does not
| cause a heap to become a non-heap, the paradox is to consider
| what happens when the process is repeated enough times that
| only one grain remains: is it still a heap? If not, when did
| it change from a heap to a non-heap?
-- from a Web page about the sorites paradox.
Must have been written by a physicist! A mathematician would
| What happens when the process is repeated enough times that
| no grain remains: is it still a heap?
Of course it is, to a mathematician. The empty set is still a
set, so the empty heap is still a heap.
The best of all possible heaps!
And the fastest possible case, in the HeapSort algorithm.
I guess pidgin paleface talk is no longer acceptable.
QuickSort is faster, so says the Wikipedia Heapsort article, but
not so clear on the QS page.
Hmm... he did not claim that HeapSort is the fastest sort. He only
claimed that a Heapsort on the empty heap is the fastest possible
HeapSort.
Precisely. I use Quicksort for my own sorting, as it happens. I've tried
HeapSort and like it less. They perform equally well on zero-sized data.
What's the worst sorting algorithm for an empty list?

I think the standard answer for "which sorting algorithm should I
use?" in most modern programming languages is "use the one that comes
with the language [1] unless you really know what you're doing & have a
good reason to use something else".

[1] or its standard library, to be pedantic.
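In Python, for example, that advice also covers the nearly-sorted
GEDCOM case above: the built-in sort (Timsort) exploits existing runs
of order. The record layout below is a made-up placeholder:

records = [{"surname": "Moylan"}, {"surname": "Blake"}, {"surname": "Funk"}]
# Timsort runs in near-linear time on nearly-sorted input, so a file
# saved in sorted order last time re-sorts almost for free.
records.sort(key=lambda r: r["surname"])
print([r["surname"] for r in records])   # ['Blake', 'Funk', 'Moylan']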
Post by Peter Moylan
When sorting an array of large records - a case that the textbooks never
cover, but which in my experience is the most important practical case -
one important contribution to the execution time is the time to move the
records around. With that in mind, my implementation delays the movement
and instead moves "holes" around. A simple idea, but one that nobody
else seems to use in QuickSort, although using "holes" is more
intuitively obvious in HeapSort.
Of course one could handle the long-record case by sorting an array of
pointers, but then you get a messy problem at the end when moving
everything to its final position; an annoying problem if, as usually
happens, you are trying to sort the array in place.
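The exact implementation isn't shown, so this is only one reading of
the "holes" idea, sketched in Python: lift the pivot record out,
leaving a hole, and fill the hole from alternating ends, so each record
is copied once instead of three times per swap. (In Python the saving
is moot, since lists hold references; the pattern is what matters.)

def partition_with_hole(a, lo, hi):
    """Hoare-style partition that moves a hole instead of swapping."""
    pivot = a[lo]                  # lifting the pivot out creates the hole
    left, right = lo, hi
    while left < right:
        while left < right and a[right] >= pivot:
            right -= 1
        a[left] = a[right]         # a record drops into the hole, which moves
        while left < right and a[left] <= pivot:
            left += 1
        a[right] = a[left]         # the hole moves back to the left side
    a[left] = pivot                # the pivot lands in the final hole
    return left

def quicksort_holes(a, lo=0, hi=None):
    if hi is None:
        hi = len(a) - 1
    if lo < hi:
        p = partition_with_hole(a, lo, hi)
        quicksort_holes(a, lo, p - 1)
        quicksort_holes(a, p + 1, hi)

data = [5, 2, 9, 1, 5, 6]
quicksort_holes(data)
print(data)                        # [1, 2, 5, 5, 6, 9]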
Post by lar3ryca
https://en.wikipedia.org/wiki/Introsort
Interesting! Thanks for the link.
That's designed to have good worst-case performance, but personally I'd
rather have best average-case performance.
--
Not even computers will replace committees, because committees buy
computers. ---Shepherd Mead
Ken Blake
2021-11-24 15:57:47 UTC
Permalink
A hemorrhoid?
Sam Plusnet
2021-11-24 20:16:50 UTC
Permalink
Post by Dingbat
Many car pileup
https://images.app.goo.gl/NyjN8NAheVG7EMos8
I ask: Not pretty, but where is the pile? Can there be a pileless pile(up)
in a context other than a pileup of cars?
Alessandro Volta knew all about piles.
--
Sam Plusnet
Peter Moylan
2021-11-24 23:22:55 UTC
Permalink
Post by Sam Plusnet
Post by Dingbat
Many car pileup
https://images.app.goo.gl/NyjN8NAheVG7EMos8
I ask: Not pretty, but where is the pile? Can there be a pileless pile(up)
in a context other than a pileup of cars?
Alessandro Volta knew all about piles.
Piles are a pain in the neck.
--
Peter Moylan Newcastle, NSW http://www.pmoylan.org
Athel Cornish-Bowden
2021-11-25 09:26:07 UTC
Permalink
Post by Peter Moylan
Post by Sam Plusnet
Post by Dingbat
Many car pileup
https://images.app.goo.gl/NyjN8NAheVG7EMos8
I ask: Not pretty, but where is the pile? Can there be a pileless pile(up)
in a context other than a pileup of cars?
Alessandro Volta knew all about piles.
French makes a distinction between "pile" and "battérie", and I need to
remember which is which when I want to buy a battery.
Post by Peter Moylan
Piles are a pain in the neck.
--
Athel -- French and British, living mainly in England until 1987.
Peter Moylan
2021-11-25 10:08:06 UTC
Permalink
Post by Athel Cornish-Bowden
Post by Sam Plusnet
Many car pileup https://images.app.goo.gl/NyjN8NAheVG7EMos8
I ask: Not pretty, but where is the pile? Can there be a
pileless pile(up) in a context other than a pileup of cars?
Alessandro Volta knew all about piles.
French makes a distinction between "pile" and "battérie", and I need
to remember which is which when I want to buy a battery.
I'd be interested in the answer. Atilf only has the electrical "pile" as
a special case of what you get by stacking things one above the other.
For "batterie" (no accent mark) it has "Groupement d'un certain nombre
de piles ou d'accumulateurs disposés en série ou en parallèle" (a
grouping of a certain number of cells or accumulators arranged in
series or in parallel). (As a subsidiary meaning of the word that means
beating someone up, or attacking them with an artillery battery.)

That suggests that BOTH of these mean the same as the strict-sense
meaning of "battery" in English: a collection of N>1 voltaic cells in
series and/or parallel. This can be contrasted with "cell", which means
the N=1 case.

But I think I have the answer. Since "batterie" is defined as a
combination of "piles", that suggests that French "pile" means the same
as English "cell".

That's taken me back for another look at atilf's definition of "pile".
It's definitely a stack of things, but in the electrical case it appears
to be a stack of exactly one object.

Of course we all know that English speakers - those who are not
electrical engineers - are sloppy about the distinction. Where I write
"AAA cells" on my shopping list, many other people would write "AAA
batteries".

Anyway, here's a mnemonic for you: French "batterie" = English "battery"
(both in the electrical sense and the "assault and battery" sense).
--
Peter Moylan Newcastle, NSW http://www.pmoylan.org
bruce bowser
2021-11-28 15:57:29 UTC
Permalink
Post by Dingbat
Many car pileup
https://images.app.goo.gl/NyjN8NAheVG7EMos8
I ask: Not pretty, but where is the pile? Can there be a pileless pile(up)
in a context other than a pileup of cars?
What is a pfeil?
J. J. Lodder
2021-11-28 20:48:31 UTC
Permalink
Post by bruce bowser
Post by Dingbat
Many car pileup
https://images.app.goo.gl/NyjN8NAheVG7EMos8
I ask: Not pretty, but where is the pile? Can there be a pileless pile(up)
in a context other than a pileup of cars?
What is a pfeil?
Something you may find in your apfelstrudel. What's a tell?
--
Jan
bruce bowser
2021-11-28 21:30:30 UTC
Permalink
Post by J. J. Lodder
Post by bruce bowser
Post by Dingbat
Many car pileup
https://images.app.goo.gl/NyjN8NAheVG7EMos8
I ask: Not pretty, but where is the pile? Can there be a pileless pile(up)
in a context other than a pileup of cars?
What is a pfeil?
Something you may find in your apfelstrudel. What's a tell?
We shall see: 'tell' can mean vertellen ("to tell"), to put it in English. In Swiss German (an answer you might want), a hero or heroine? Of a sort?