Discussion:
If Scheme is so good why MIT drops it?
Xah Lee
2009-07-20 14:04:09 UTC
How do you explain that something as inferior as Python beat Lisp in
the marketplace, despite starting 40 years later?
The same way that VHS 'beat' the film formats for home videos.
The same way it will be 'beaten'.
Wishful-thinking tech-geeking idiots always think that. It's a
psychological defense mechanism. When you are the loser, you invoke
fancy adages to comfort yourself. Thus, Mac people, lisp people, and
unix people, when it comes to Microsoft, always quote the same
motherfucking VHS vs Beta fuck.

In my life experience and interest in this issue over the past 5
years, I have found that the world is basically pretty fair, all
things considered. The successful are successful not because they are
unethical or whatever motherfuck. Successful people or products in
any industry, be it computing, singing, music, software,
business ..., succeed due to reasonable causes, social and/or
technical: e.g. mundane hard work, a good price/performance ratio,
good advertising, talent, and perhaps a little of what we'd have to
call fortuity.

Scheme has become more obscure because it wasn't that great in the
first fucking place. Scheme grew upon a myth of being elegant, and
this cult largely came from the fact that, when lisps were all
bewildering industrial langs, Scheme was the one that came with a
design to reduce lisp's ugliness. Since then, the word “elegance” has
stuck with Scheme.

As an illustration, we can see Scheme's problems in the following 2
tech aspects: (1) the cons business; (2) no namespace/library
mechanism. In the 1990s or before, these 2 reasons were not
sufficient to kill it, since other langs weren't much better in these
areas, and were much worse in general. But today, after the 2000s,
with the proliferation of langs and tools, Scheme isn't fit to
compete for popularity.

References:

• Fundamental Problems of Lisp
http://xahlee.org/UnixResource_dir/writ/lisp_problems.html

• Lisp's List Problem
http://xahlee.org/emacs/lisp_list_problem.html

• Proliferation of Computing Languages
http://xahlee.org/UnixResource_dir/writ/new_langs.html

• Xah Lee's Computing Experience Bio
http://xahlee.org/PageTwo_dir/Personal_dir/xah_comp_exp.html

• Language, Purity, Cult, and Deception
http://xahlee.org/UnixResource_dir/writ/lang_purity_cult_deception.html

Xah
∑ http://xahlee.org/


Peter Keller
2009-07-20 16:36:47 UTC
Post by Xah Lee
Scheme has become more obscure because it wasn't that great in the
first fucking place. Scheme grew upon a myth of being elegant, and
this cult largely came from the fact that, when lisps were all
bewildering industrial langs, Scheme was the one that came with a
design to reduce lisp's ugliness. Since then, the word “elegance” has
stuck with Scheme.
Then why are most modern languages evolving towards all of the
functionality of functional languages? Things like lexical closures,
higher-order functions, garbage collection, etc., are commonplace in
your "accepted and modern" languages these days. Give it another 20
years and full macro systems will be commonplace as well. Why does
this diffusion happen?

Because languages like Scheme are "crucible" languages. People explore
the language in their implementations and then the knowledge of what works
and what doesn't gets disseminated throughout the relatively small culture
of language/compiler designers.

Syntax isn't what drives a language forward; any reasonable one will
do just fine. The four things which dominate language acceptance are A)
pure luck, B) the tools one uses to interact with it, C) expressivity
coupled with suggestivity, and D) available libraries.

If you think that because I said libraries I somehow justify your
argument, that isn't so. If you notice, C's libraries are probably the
most prolific and widest in use. Yet, there is no library structure at
all and most other languages create FFIs specifically to bring in the
functionality of C libraries.

Syntax isn't elegant: ideas are elegant.

-pete
Xah Lee
2009-07-21 12:31:45 UTC
Post by Peter Keller
Post by Xah Lee
Scheme has become more obscure because it wasn't that great in the
first fucking place. Scheme grew upon a myth of being elegant, and
this cult largely came from the fact that, when lisps were all
bewildering industrial langs, Scheme was the one that came with a
design to reduce lisp's ugliness. Since then, the word “elegance” has
stuck with Scheme.
Then why are most modern languages evolving towards all of the
functionality of functional languages? Things like lexical closures,
higher-order functions, garbage collection, etc., are commonplace in
your "accepted and modern" languages these days.
You, as a Scheme or Lisp fan, perceive that the world is all copying
you.

Mac fans perceive that Windows is all copying Mac, even today.

More realistically, the world didn't just copy you. Lisp is one of the
early languages, and in our opinion a good one, containing many good
ideas. However, just because many of lisp's ideas are common in many
of today's langs, you can't say that the world all copied lisp. For
example, automatic memory management and the list datatype are natural
ideas that would come into being anyway with increasing computer
hardware power and the progress of computer science.

In the late 1990s, when Perl was raging, it was common to see debates
here about whether Perl is a lisp. Quite fucking idiotic. When XML was
raging in the early 2000s, lisper fanatics thought that the world had
finally understood sexp but did a lousy job of copying it. What a
motherfucking idiocy. Today, especially a few years ago when Ruby was
raging with its Rail whatnotfuck, you see people discussing with
subject lines like “Is Ruby Lisp Done Right?”. What a motherfuck.

On the other hand, many, many of today's langs' features are not in
lisp.

It's the magic of wishful thinking at work.

Also, there are functional langs the likes of Mathematica,
ML/OCaml/F#, Haskell, Erlang, Oz ... and they are getting more popular
today. Some of these have roots in the 1980s. Lisps, in comparison to
these, don't seem to have a dick. Of course, the hardcore lispers look
at these askance, thinking that they are seeing some oddity from outer
space that has little earthly bearing; the same way imperative coding
monkeys look at lisp — something they don't understand and ignore.

Also, lisp's macros: a feature that gets lispers into much ado about
nothing. In Mathematica (b. ~1989), the whole language can be
considered an extended lisp macro system. When I learned about
lisp's macros while doing practical elisp coding, I found lisp macros
rather trivial, painful to use, and laughable. In fact, I never use
them, never see a need to use them. But you see lisp fanatics getting
giddy about macros all day, the way Haskeller idiots drivel about
monads all day. All day, day and night. Macros! Monads!
Post by Peter Keller
Give it another 20 years
and full macro systems will be commonplace as well. Why does this
diffusion happen?
LOL. (LOL = Laughing Out Loud.) Dream on.

In 20 years, bots will code for you, meat brains will have embedded
silicon chips, and quantum computing will be reasonable too! I'm not
too sure Common Lisp, Scheme Lisp, and Emacs Lisp would still exist.
LOL.
Post by Peter Keller
Because languages like Scheme are "crucible" languages. People explore
the language in their implementations and then the knowledge of what works
and what doesn't gets disseminated throughout the relatively small culture
of language/compiler designers.
Syntax isn't what drives a language forward, any reasonable one will
do just fine.
Is this the “syntax is not important” adage? This is another idiotic
myth, popular among sophomoric computer language geekers, including
some book writers.

Syntax is the MOST important aspect of a computer language! Witness
lisp's nested parens. Without them, lisp wouldn't have developed its
characteristics and features.
Post by Peter Keller
The four things which dominate language acceptance are A)
pure luck, B) the tools one uses to interact with it, C) expressivity
coupled with suggestivity, and D) available libraries.
If you think that because I said libraries I somehow justify your
argument, that isn't so. If you notice, C's libraries are probably the
most prolific and widest in use. Yet, there is no library structure at
all and most other languages create FFIs specifically to bring in the
functionality of C libraries.
Syntax isn't elegant: ideas are elegant.
What a motherfucking meaningless idiocy.

Xah
∑ http://xahlee.org/


Alex Queiroz
2009-07-21 14:10:34 UTC
Hallo,
Post by Xah Lee
Also, lisp's macros: a feature that gets lispers into much ado about
nothing. In Mathematica (b. ~1989), the whole language can be
considered an extended lisp macro system. When I learned about
lisp's macros while doing practical elisp coding, I found lisp macros
rather trivial, painful to use, and laughable. In fact, I never use
them, never see a need to use them. But you see lisp fanatics getting
giddy about macros all day, the way Haskeller idiots drivel about
monads all day. All day, day and night. Macros! Monads!
This is interesting. This shows you cannot really code in Lisp,
be it Elisp, CL or Scheme. But deep in your disturbed mind you
believe you can, and to solve this cognitive dissonance you end up
harassing everyone with your idiotic ideas of how to improve Lisp.

-alex
Keith H Duggar
2009-07-21 16:40:43 UTC
Post by Xah Lee
motherfucking VHS vs Beta fuck.
whatever motherfuck
first fucking place
Quite fucking idiotic
motherfucking idiocy
Rail whatnotfuck
What a motherfuck.
What a motherfucking meaningless idiocy.
Xah, this should get your creative juices flowing:

http://www.duggar.org/pub/humor/FWordIsBeautiful.wav

KHD
Bata
2009-07-22 12:38:19 UTC
lol
Peter Keller
2009-07-21 23:27:13 UTC
Post by Xah Lee
Post by Peter Keller
Syntax isn't elegant: ideas are elegant.
What a motherfucking meaningless idiocy.
For example: I read the articles on your website and while I think your syntax
was passable, your ideas were not. I don't think there is any language you
could translate that stuff into to make the concepts in it better.

See the difference?

-pete
Mr.Cat
2009-07-21 11:29:34 UTC
1. What's wrong with cons again? If you don't like it - just don't use
it.
2. Namespace/library mechanism is not absent. It exists in r6rs. In
r5rs it is not absent, it is just implementation-specific.
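For concreteness, here is a minimal R6RS library sketch (the library
name and exports are made up for illustration):

(library (example geometry)
  (export area)
  (import (rnrs))
  (define (area width height) (* width height)))

A program that does (import (example geometry)) then sees only the
exported area binding, which is exactly the namespace mechanism the
complaint says is missing.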
Mr.Cat
2009-07-21 11:37:05 UTC
3. Cons is not just about proper lists. You may use it to build
arbitrary graphs (a common example is a circular list).
Xah Lee
2009-07-21 12:27:23 UTC
Post by Mr.Cat
1. What's wrong with cons again? If you don't like it - just don't use
it.
2. Namespace/library mechanism is not absent. It exists in r6rs. In
r5rs it is not absent, it is just implementation-specific.
dear idiot,

see the FAQ at the bottom of:

• Fundamental Problems of Lisp
http://xahlee.org/UnixResource_dir/writ/lisp_problems.html

Excerpt:

Q: If you don't like cons, lisp has arrays and hashmaps, too.

A: Suppose there's a lang called gisp. In gisp, there's cons but also
fons. Fons are just like cons except they have 3 cells with 3 accessors:
car, cbr, cdr. Now, gisp is an old lang; the fons are deeply rooted in
the lang. Every hundred or so lines of code you'll see a use of fons
with its extra accessor cbr, or any one of cbaar, cdabr, cbbar, cbbbar,
etc. You get annoyed by this. You, as a critic, complain that fons is
bad. But then some gisp fanatic retorts: “If you don't like fons,
gisp has cons, too!”

You see, “having something too” does not solve the problem of
pollution. Sure, you can use just cons in gisp, but in every lib or
other people's code you encounter, there's an invasion of fons with its
cbar, cbbar, cbbbar. The problem created by fons does not go away by
“having cons too”.
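To make the analogy concrete in real Scheme (gisp and fons are
hypothetical; this sketch only illustrates the kind of positional
cons-accessor code the complaint is aimed at):

; a library representing a 3D point as nested cons cells
(define (make-point3d x y z) (cons x (cons y (cons z '()))))
(define (point3d-x p) (car p))    ; first cell
(define (point3d-y p) (cadr p))   ; second cell
(define (point3d-z p) (caddr p))  ; third cell
; callers that skip the accessors write opaque chains such as
; (car (cdr (cdr p))) instead of (point3d-z p)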

Xah
∑ http://xahlee.org/


Jon Harrop
2009-07-21 14:11:18 UTC
Post by Xah Lee
Post by Mr.Cat
1. What's wrong with cons again? If you don't like it - just don't use
it.
2. Namespace/library mechanism is not absent. It exists in r6rs. In
r5rs it is not absent, it is just implementation-specific.
dear idiot,
• Fundamental Problems of Lisp
http://xahlee.org/UnixResource_dir/writ/lisp_problems.html
Q: If you don't like cons, lisp has arrays and hashmaps, too.
A: Suppose there's a lang called gisp. In gisp, there's cons but also
fons. Fons are just like cons except they have 3 cells with 3 accessors:
car, cbr, cdr. Now, gisp is an old lang; the fons are deeply rooted in
the lang. Every hundred or so lines of code you'll see a use of fons
with its extra accessor cbr, or any one of cbaar, cdabr, cbbar, cbbbar,
etc. You get annoyed by this. You, as a critic, complain that fons is
bad. But then some gisp fanatic retorts: “If you don't like fons,
gisp has cons, too!”
ROTFL. :-)
--
Dr Jon D Harrop, Flying Frog Consultancy Ltd.
http://www.ffconsultancy.com/?u
vippstar
2009-07-21 16:19:25 UTC
Post by Xah Lee
Post by Mr.Cat
1. What's wrong with cons again? If you don't like it - just don't use
it.
2. Namespace/library mechanism is not absent. It exists in r6rs. In
r5rs it is not absent, it is just implementation-specific.
dear idiot,
• Fundamental Problems of Lisp
 http://xahlee.org/UnixResource_dir/writ/lisp_problems.html
Q: If you don't like cons, lisp has arrays and hashmaps, too.
A: Suppose there's a lang called gisp. In gisp, there's cons but also
fons. Fons are just like cons except they have 3 cells with 3 accessors:
car, cbr, cdr. Now, gisp is an old lang; the fons are deeply rooted in
the lang. Every hundred or so lines of code you'll see a use of fons
with its extra accessor cbr, or any one of cbaar, cdabr, cbbar, cbbbar,
etc. You get annoyed by this. You, as a critic, complain that fons is
bad. But then some gisp fanatic retorts: “If you don't like fons,
gisp has cons, too!”
You see, “having something too” does not solve the problem of
pollution. Sure, you can use just cons in gisp, but in every lib or
other people's code you encounter, there's an invasion of fons with its
cbar, cbbar, cbbbar. The problem created by fons does not go away by
“having cons too”.
Except that fons is useless while I can see use in cons - I can't
imagine lisp without it. Perhaps you can do that though, so please
explain or link me to an article of yours if you've already done so,
how do you imagine lisp without cons cells.
Jon Harrop
2009-07-22 19:05:43 UTC
Post by vippstar
Except that fons is useless while I can see use in cons - I can't
imagine lisp without it. Perhaps you can do that though, so please
explain or link me to an article of yours if you've already done so,
how do you imagine lisp without cons cells.
Easy: just treat cons as the special case of a 2-element array. That is
essentially what Mathematica does and it works just fine.
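A minimal sketch of that idea in Scheme (vcons, vcar, vcdr and vlist
are made-up names, not any standard procedures):

(define (vcons a d) (vector a d))   ; a pair is just a 2-slot vector
(define (vcar p) (vector-ref p 0))
(define (vcdr p) (vector-ref p 1))
(define (vlist . xs)                ; a proper list built from such pairs
  (if (null? xs)
      '()
      (vcons (car xs) (apply vlist (cdr xs)))))
; (vcar (vcdr (vlist 1 2 3))) => 2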

I believe the original motivation for separating cons out was performance
but, as we know now, that just led to slightly less cripplingly-bad
performance.
--
Dr Jon D Harrop, Flying Frog Consultancy Ltd.
http://www.ffconsultancy.com/?u
vippstar
2009-07-23 01:21:05 UTC
Post by Jon Harrop
Post by vippstar
Except that fons is useless while I can see use in cons - I can't
imagine lisp without it. Perhaps you can do that though, so please
explain or link me to an article of yours if you've already done so,
how do you imagine lisp without cons cells.
Easy: just treat cons as the special case of a 2-element array. That is
essentially what Mathematica does and it works just fine.
Cons is not a two-element array, it contains more information than
just that. An array can not hold two different types of elements,
which is necessary for cons to represent a list:

'(1 . (2 . nil))

1 -> INTEGER
(2 . nil) -> CONS

I don't know how Mathematica does this, but Mathematica's arrays
likely have different semantics than the usual arrays, likely making
them much slower.
Post by Jon Harrop
I believe the original motivation for separating cons out was performance
but, as we know now, that just led to slightly less cripplingly-bad
performance.
What do you mean by 'separating cons out'? I don't understand;
separating it out of what?
Jon Harrop
2009-07-23 08:03:35 UTC
Post by vippstar
Post by Jon Harrop
Easy: just treat cons as the special case of a 2-element array. That is
essentially what Mathematica does and it works just fine.
Cons is not a two-element array, it contains more information than
just that. An array can not hold two different types of elements...
Not necessarily in dynamic languages.
Post by vippstar
Post by Jon Harrop
I believe the original motivation for separating cons out was performance
but, as we know now, that just led to slightly less cripplingly-bad
performance.
What do you mean by 'separating cons out'? I don't understand;
separating it out of what?
Introducing an artificial distinction between cons and arbitrary-length
arrays.
--
Dr Jon D Harrop, Flying Frog Consultancy Ltd.
http://www.ffconsultancy.com/?u
ACL
2009-07-23 21:20:59 UTC
Post by Jon Harrop
Post by vippstar
Post by Jon Harrop
Easy: just treat cons as the special case of a 2-element array. That is
essentially what Mathematica does and it works just fine.
Cons is not a two-element array, it contains more information than
just that. An array can not hold two different types of elements...
Not necessarily in dynamic languages.
Post by vippstar
Post by Jon Harrop
I believe the original motivation for separating cons out was performance
but, as we know now, that just led to slightly less cripplingly-bad
performance.
What do you mean by 'separating cons out'? I don't understand;
separating it out of what?
Introducing an artificial distinction between cons and arbitrary-length
arrays.
--
Dr Jon D Harrop, Flying Frog Consultancy Ltd.
http://www.ffconsultancy.com/?u
Let's remove the distinction between data structures and blocks of
memory.
Everything is just a bunch of 1s and 0s stored in a chunk of memory
anyway.
Raffael Cavallaro
2009-07-23 06:56:46 UTC
Post by vippstar
An array can not hold two different types of elements,
? (defparameter *heterogeneous-array* (make-array 2 :initial-contents
(list pi "a string")))
*HETEROGENEOUS-ARRAY*
? (type-of (aref *heterogeneous-array* 0))
DOUBLE-FLOAT
? (type-of (aref *heterogeneous-array* 1))
(SIMPLE-BASE-STRING 8)
--
Raffael Cavallaro
josephoswald+gg@gmail.com
2009-07-23 12:14:18 UTC
Post by Jon Harrop
Post by vippstar
Except that fons is useless while I can see use in cons - I can't
imagine lisp without it. Perhaps you can do that though, so please
explain or link me to an article of yours if you've already done so,
how do you imagine lisp without cons cells.
Easy: just treat cons as the special case of a 2-element array. That is
essentially what Mathematica does and it works just fine.
I believe the original motivation for separating cons out was performance
but, as we know now, that just led to slightly less cripplingly-bad
performance.
--
Dr Jon D Harrop, Flying Frog Consultancy Ltd.
http://www.ffconsultancy.com/?u
Cons pairs are typically represented as two consecutive machine words,
one for car, one for cdr, using the low-tag bits in a reference to
identify those words as a cons pair; 2-element arrays typically must
have an array header that indicates that the object 1) is an array and
2) has two generic elements, in addition to the two machine words that
hold the array elements themselves.
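A rough model of that difference in portable Scheme, using plain
integers as stand-in addresses (the tag value and word size below are
illustrative assumptions, not any particular implementation's layout):

(define word-size 8)      ; assume 8-byte words, so word-aligned addresses
(define cons-lowtag 3)    ; assumed tag kept in the otherwise-zero low bits
(define (tag-as-cons address)       ; address is assumed word-aligned
  (+ address cons-lowtag))
(define (cons-reference? x)
  (= (modulo x word-size) cons-lowtag))
(define (car-slot-address tagged)   ; word 0 of the pair
  (- tagged cons-lowtag))
(define (cdr-slot-address tagged)   ; word 1 of the pair
  (+ (- tagged cons-lowtag) word-size))
; e.g. (cons-reference? (tag-as-cons 1024)) => #t
; no header word is needed: the two slots are the whole object.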

Perhaps in your phrase "slightly less cripplingly-bad performance" you
were referring to Mathematica, which is hardly an example of how to do
Lisp right.
Jon Harrop
2009-07-23 14:49:04 UTC
Post by josephoswald+***@gmail.com
Post by Jon Harrop
I believe the original motivation for separating cons out was performance
but, as we know now, that just led to slightly less cripplingly-bad
performance.
Cons pairs are typically represented as two consecutive machine words,
one for car, one for cdr, using the low-tag bits in a reference to
identify those words as a cons pair; 2-element arrays typically must
have an array header that indicates that the object 1) is an array and
2) has two generic elements, in addition to the two machine words that
hold the array elements themselves.
Perhaps in your phrase "slightly less cripplingly-bad performance" you
were referring to Mathematica, which is hardly an example of how to do
Lisp right.
No, I was referring to the irrelevance of such bit twiddling of pointers
when implementing a performant Lisp today.
--
Dr Jon D Harrop, Flying Frog Consultancy Ltd.
http://www.ffconsultancy.com/?u
josephoswald+gg@gmail.com
2009-07-24 15:31:10 UTC
Post by Jon Harrop
Post by josephoswald+***@gmail.com
Post by Jon Harrop
I believe the original motivation for separating cons out was performance
but, as we know now, that just led to slightly less cripplingly-bad
performance.
Cons pairs are typically represented as two consecutive machine words,
one for car, one for cdr, using the low-tag bits in a reference to
identify those words as a cons pair; 2-element arrays typically must
have an array header that indicates that the object 1) is an array and
2) has two generic elements, in addition to the two machine words that
hold the array elements themselves.
Perhaps in your phrase "slightly less cripplingly-bad performance" you
were referring to Mathematica, which is hardly an example of how to do
Lisp right.
No, I was referring to the irrelevance of such bit twiddling of pointers
when implementing a performant Lisp today.
--
Dr Jon D Harrop, Flying Frog Consultancy Ltd.
Froggy, you aren't making yourself clear here.

What do "performant Lisp" implementations do today to implement cons
pairs? Do they add multiple words of overhead (your "2 element array"
technique), or do they use the same data tagging techniques that have
been used for decades to implement Lisps on conventional un-tagged
machines, or something else like depending on the JVM or CLR to do
whatever they do under the hood?

The bits of a lowtag are rarely twiddled: fast CAR and CDR simply
fetch the machine word at a fixed offset from the CONS (tagged
address) value. The lowtag bits are extracted only when an operation
wants to detect if a value is a CONS cell or not. The lowtag is set
only by the CONS operation, which allocates two words of memory, puts
the CAR value in one, the CDR value in the other, then returns the
location of that word pair, offset by a value that is the CONS-
specific lowtag. On most architectures today, addresses are byte-
aligned, while machine words are 4- or 8-byte aligned, so this lowtag
is mostly using what would be zero bits anyhow, meaning CONS pairs can
typically be located at any even word address, with zero padding
needed.

No allocation, writing, or reading of an array header. No wasted
memory. CAR and CDR can be single machine instructions when the type-
check for CONS can be omitted.

How can this be made more performant?
Jon Harrop
2009-07-25 08:53:45 UTC
Post by josephoswald+***@gmail.com
What do "performant Lisp" implementations do today to implement cons
pairs?
They build on existing performant VMs where all bit twiddling has been
abstracted away.
--
Dr Jon D Harrop, Flying Frog Consultancy Ltd.
http://www.ffconsultancy.com/?u
Vend
2009-07-30 00:06:29 UTC
Post by josephoswald+***@gmail.com
memory. CAR and CDR can be single machine instructions when the type-
check for CONS can be omitted.
How can this be made more performant?
There are performance benefits in using a special storage format for
cons cells, but why do you need to expose this to the programmer?
Couldn't the array creation primitive just use the special format when
the array size is 2?

Anyway, I don't think that conses should have been a special case of
arrays, but rather some special type used only to build proper linked
lists (as is done in many functional languages).

In lisp languages, instead, despite the name, there is no language-
level list type. The cons cell is a generic 2-tuple used to implement
various data structures, and it is the programmer's responsibility to
ensure consistency.
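A sketch of that alternative in Scheme, using SRFI 9 / R7RS
define-record-type (the names are made up): a node type meant only for
proper lists, distinct from a general-purpose pair:

(define-record-type <lnode>
  (lcons head tail)       ; tail is expected to be another lnode or 'empty
  lnode?
  (head lnode-head)
  (tail lnode-tail))

(define empty-list 'empty)
(define (llist . xs)
  (if (null? xs)
      empty-list
      (lcons (car xs) (apply llist (cdr xs)))))
; (lnode-head (lnode-tail (llist 1 2 3))) => 2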
ACL
2009-07-23 21:15:35 UTC
Post by Jon Harrop
Post by vippstar
Except that fons is useless while I can see use in cons - I can't
imagine lisp without it. Perhaps you can do that though, so please
explain or link me to an article of yours if you've already done so,
how do you imagine lisp without cons cells.
Easy: just treat cons as the special case of a 2-element array. That is
essentially what Mathematica does and it works just fine.
I believe the original motivation for separating cons out was performance
but, as we know now, that just led to slightly less cripplingly-bad
performance.
--
Dr Jon D Harrop, Flying Frog Consultancy Ltd.
http://www.ffconsultancy.com/?u
Since when does lisp have cripplingly bad performance?
It seems fine to me.

I think there is even a scheme implementation whose performance is as
good as C's in a lot of cases.
Tim Bradshaw
2009-07-24 19:04:24 UTC
Post by ACL
Since when does lisp have cripplingly bad performance?
It seems fine to me.
It is fine. The person to whom you're responding has something to sell
you which isn't Lisp.
Jon Harrop
2009-07-25 08:52:44 UTC
Post by Tim Bradshaw
Post by ACL
Since when does lisp have cripplingly bad performance?
It seems fine to me.
It is fine. The person to whom you're responding has something to sell
you which isn't Lisp.
Note that 99% of developers have something to sell you which isn't Lisp.
--
Dr Jon D Harrop, Flying Frog Consultancy Ltd.
http://www.ffconsultancy.com/?u
Scott Burson
2009-07-25 18:40:48 UTC
Post by Jon Harrop
Post by ACL
Since when does lisp have cripplingly bad performance?
It seems fine to me.
It is fine.  The person to whom you're responding has something to sell
you which isn't Lisp.
Note that 99% of developers have something to sell you which isn't Lisp.
Since this statement, even if true, would prove nothing whatsoever, it
can only be considered to be more adolescent heckling.

-- Scott
Jon Harrop
2009-07-26 10:01:48 UTC
Post by Jon Harrop
Post by ACL
Since when does lisp have cripplingly bad performance?
It seems fine to me.
It is fine.  The person to whom you're responding has something to sell
you which isn't Lisp.
Note that 99% of developers have something to sell you which isn't Lisp.
Since this statement, even if true...
Even if true?
--
Dr Jon D Harrop, Flying Frog Consultancy Ltd.
http://www.ffconsultancy.com/?u
Tim Bradshaw
2009-07-26 11:23:20 UTC
Post by Jon Harrop
Note that 99% of developers have something to sell you which isn't Lisp.
I care about Lisp's popularity the same way I care about Mahler's
Richard Fateman
2009-07-26 18:51:20 UTC
Post by Jon Harrop
Post by vippstar
Except that fons is useless while I can see use in cons - I can't
imagine lisp without it. Perhaps you can do that though, so please
explain or link me to an article of yours if you've already done so,
how do you imagine lisp without cons cells.
Easy: just treat cons as the special case of a 2-element array. That is
essentially what Mathematica does and it works just fine.
1. Mathematica, last I looked, uses the name List for something that
most of us would call a vector. Prepending a value to the front of a
List L, the O(1) operation in Lisp called cons, takes, in Mathematica,
O(n) time where n is the length of L. This is not "fine" to many
people. Another operation that Mathematica associates with creating such
a new data structure is that it goes through all the elements to update
them -- to see if they depend on some value that has changed recently.
This is not "fine" to many people.

As for storing a cons cell as a 2-element array, this is a case of
economy of storage and operation (for the cons) vs. redundant (larger)
storage and slower general operation (for the array).
Post by Jon Harrop
I believe the original motivation for separating cons out was performance
but, as we know now, that just led to slightly less cripplingly-bad
performance.
You are of course free to believe anything at all, but your attestation
as to the "original motivation" is not particularly credible. Are you
talking about the roots of Lisp, e.g. Lisp 1.5 and earlier? Are you
claiming that you have a better instruction sequence for accessing
elements 0 and 1 of an array than CAR and CDR on the IBM 709X?
You can read the Lisp 1.5 programmer's manual online:

http://www.softwarepreservation.org/projects/LISP/book/LISP%201.5%20Programmers%20Manual.pdf

In some lisps (Maclisp, Franz Lisp) there were data objects called
hunks, which could be allocated in power-of-2 sizes, and which did not
have special headers -- they were tagged by virtue of their locations.
Hunks of size 2, 4, ..., 512 were on different pages.
And cons cells were on different pages.

This was a feature of BIBOP allocation of objects in those Lisps. So if
you really wanted a kind-of cons cell with, say, 4 or 8 spots, you
could allocate them, etc.

Common Lisp does not have hunks.
RJF
Jon Harrop
2009-07-28 16:26:58 UTC
Post by Richard Fateman
Post by Jon Harrop
Post by vippstar
Except that fons is useless while I can see use in cons - I can't
imagine lisp without it. Perhaps you can do that though, so please
explain or link me to an article of yours if you've already done so,
how do you imagine lisp without cons cells.
Easy: just treat cons as the special case of a 2-element array. That is
essentially what Mathematica does and it works just fine.
1. Mathematica, last I looked, uses the name List for something that
most of us would call a vector. Prepending a value to the front of a
List L, the O(1) operation in Lisp called cons, takes, in Mathematica,
O(n) time where n is the length of L. This is not "fine" to many
people. Another operation that Mathematica associates with creating such
a new data structure is that it goes through all the elements to update
them -- to see if they depend on some value that has changed recently.
This is not "fine" to many people.
That is neither relevant nor accurate. We are talking about 2-element arrays
which are obviously O(1). If you want to build sequences efficiently in
Mathematica use Sow and Reap as the documentation describes. The AppendTo
and PrependTo functions do not rewrite their sequence inputs as you claim.
If Mathematica were "not fine to many people" it would not have orders of
magnitude more users than Lisp or anything you have ever written (given
away for free, even).
Post by Richard Fateman
As for storing a cons cell as a 2-element array, this is a case of
economy of storage and operation (for the cons) vs. redundant (larger)
storage and slower general operation (for the array).
No, it is the case of sacrificing the bit twiddling of cons cells for the
power of a production-quality VM like the JVM or CLR. The benefits those
VMs offer in the context of parallelism alone far outweigh the benefits of
bit twiddled cons cells in any of today's Lisp implementations.
Post by Richard Fateman
Post by Jon Harrop
I believe the original motivation for separating cons out was performance
but, as we know now, that just led to slightly less cripplingly-bad
performance.
You are of course free to believe anything at all, but your attestation
as to the "original motivation" is not particularly credible. Are you
talking about the roots of Lisp, e.g. Lisp 1.5 and earlier? Are you
claiming that you have a better instruction sequence for accessing
elements 0 and 1 of an array than CAR and CDR on the IBM 709X?
In other words, it was done for performance exactly as I said.
--
Dr Jon D Harrop, Flying Frog Consultancy Ltd.
http://www.ffconsultancy.com/?u
Richard Fateman
2009-07-28 17:25:02 UTC
Post by Jon Harrop
Post by Richard Fateman
Post by Jon Harrop
Post by vippstar
Except that fons is useless while I can see use in cons - I can't
imagine lisp without it. Perhaps you can do that though, so please
explain or link me to an article of yours if you've already done so,
how do you imagine lisp without cons cells.
Easy: just treat cons as the special case of a 2-element array. That is
essentially what Mathematica does and it works just fine.
1. Mathematica, last I looked, uses the name List for something that
most of us would call a vector. Prepending a value to the front of a
List L, the O(1) operation in Lisp called cons, takes, in Mathematica,
O(n) time where n is the length of L. This is not "fine" to many
people. Another operation that Mathematica associates with creating such
a new data structure is that it goes through all the elements to update
them -- to see if they depend on some value that has changed recently.
This is not "fine" to many people.
That is neither relevant nor accurate. We are talking about 2-element arrays
which are obviously O(1).
Well, 4 element arrays would also be O(1).
As for the timing, to do Prepend, consider this, in Mathematica:

b1= Table[a[i],{i,1,10}];
b2= Table[a[i],{i,1,100000}];
b3= Table[a[i],{i,1,1000000}];

Timing[Prepend[b1,zero];] --> {0., Null}

/* that's time in seconds, also the suppressed [not printed] answer
because of the ";"*/

Timing[Prepend[b2,zero];] --> {0.032, Null}
Timing[Prepend[b2,zero];] --> {0.203, Null}

The relevance is that Lists in Mathematica are arrays, and arrays are,
generally speaking, unsuitable for making conses because every time you
construct a CONS-like object Mathematica checks to see if it has to
re-evaluate the CAR and CDR, so to speak. Recursively.

So this is not, as I said, just fine. Making a CONS cell in Mathematica
out of this structure would be a very bad idea.
Post by Jon Harrop
If you want to build sequences efficiently in
Mathematica use Sow and Reap as the documentation describes.
Sow and Reap were introduced in Mathematica version 5.0.
I see no reason for you to think that this is efficient, compared to
some other method (unspecified!) to produce the same sequence.


Post by Jon Harrop
The AppendTo
and PrependTo functions do not rewrite their sequence inputs as you claim.
I'm not sure what you mean by "rewrite". They construct entirely new
sequences.
Post by Jon Harrop
If Mathematica were "not fine to many people" it would not have orders of
magnitude more users than Lisp or anything you have ever written (given
away for free, even).
Hm, 1 order of magnitude is, in common usage, a factor of ten.
"Orders of magnitude" would, I suppose, be a factor of 100 or 1000 or more.

According to sourceforge
http://sourceforge.net/projects/maxima/files/

The latest version (April 2009) of windows Maxima (a program written in
Lisp, descendant of Macsyma) and which I wrote parts of, has been
downloaded 27,691 times directly. Who knows how many times it has been
transferred. Another 2500 or so for Linux and Mac OSX.

The previous version (Dec. 2008) was downloaded 36,000 times or so.

Now I don't know how one counts "users" -- Mathematica could count "paid
up licenses" but do you think Mathematica has 100 X 30,000 paid up
licenses? That's 3 million. If each licensee pays on average, say $500,
that means Mathematica earns $1.5 billion/year. Does this jibe
with your understanding of that company?

That's just Maxima compared to Mathematica.

While I have nothing to do with the implementation of common lisp called
CLISP, I note that sourceforge says it has been downloaded 317,000 times.
This could, of course, be one user downloading it 317,000 times, but I
doubt it. Then there are also scheme implementations, and programs that
link lisp to graphics, java, lapack, ...

And those are just a few of the free programs; add to that the
commercial sales, AutoCAD, and the possible claim that anyone who uses
emacs is a lisp user.

Jon, when you make statements that are so easily disproven, you give
trolling a bad name.
Post by Jon Harrop
Post by Richard Fateman
As for storing a cons cell as a 2-element array, this is a case of
economy of storage and operation (for the cons) vs. redundant (larger)
storage and slower general operation (for the array).
No, it is the case of sacrificing the bit twiddling of cons cells for the
power of a production-quality VM like the JVM or CLR.
Are you selling toothpaste? all new, brighter teeth, organic?

My guess is that you are somehow objecting to the use of tagged
pointers, which is often a really good implementation idea.
Post by Jon Harrop
The benefits those
VMs offer in the context of parallelism alone far outweigh the benefits of
bit twiddled cons cells in any of today's Lisp implementations.
I can't imagine why you think that the overhead of a virtual machine is
(what, faster?) than, or incompatible with, tagged pointers. Or why
tagged pointers are incompatible with parallelism.

Perhaps you would turn your keen eye to comparing the benefits of real
memory management and a quality garbage collector (as in some Lisps) to
the state of the art in JVM or similar memory management?
Post by Jon Harrop
Post by Richard Fateman
Post by Jon Harrop
(JH) I believe the original motivation for separating cons out was performance
but, as we know now, that just led to slightly less cripplingly-bad
performance.
(RJF) You are of course free to believe anything at all, but your attestation
as to the "original motivation" is not particularly credible. Are you
talking about the roots of Lisp, e.g. Lisp 1.5 and earlier? Are you
claiming that you have a better instruction sequence for accessing
elements 0 and 1 of an array than CAR and CDR on the IBM 709X?
In other words, it was done for performance exactly as I said.
Uh, I guess you are free to claim that you agree with me.

It is presumptuous of you to claim that I agree with you. The point you
seem determined to miss is that CAR and CDR were not somehow second
choice abstractions compared to (say) arrays, some sacrifice in the name
of efficiency. Indeed the concept of "ordered pair" is in some sense an
extraordinarily powerful tool for building anything, and this was
realized fairly early. AND, as it happens, easy/fast to implement on
most machines.

RJF
Jon Harrop
2009-07-28 19:39:31 UTC
Post by Richard Fateman
Post by Jon Harrop
That is neither relevant nor accurate. We are talking about 2-element
arrays which are obviously O(1).
Well, 4 element arrays would also be O(1).
b1= Table[a[i],{i,1,10}];
b2= Table[a[i],{i,1,100000}];
b3= Table[a[i],{i,1,1000000}];
Timing[Prepend[b1,zero];] --> {0., Null}
/* that's time in seconds, also the suppressed [not printed] answer
because of the ";"*/
Timing[Prepend[b2,zero];] --> {0.032, Null}
Timing[Prepend[b2,zero];] --> {0.203, Null}
You are timing the creation of an array with 10^5 elements. We are talking
about 2-element arrays...
Post by Richard Fateman
The relevance is that Lists in Mathematica are arrays, and arrays are,
generally speaking, unsuitable for making conses because every time you
construct a CONS-like object Mathematica checks to see if it has to
re-evaluate the CAR and CDR, so to speak. Recursively.
As an aside, the time stamp on the first cons should cause Mathematica to
stop rewriting immediately. However, this has nothing to do with the topic
of conversation: replacing cons with 2-element arrays in a Lisp.
Post by Richard Fateman
So this is not, as I said, just fine.
Nor is it relevant.
Post by Richard Fateman
Post by Jon Harrop
If you want to build sequences efficiently in
Mathematica use Sow and Reap as the documentation describes.
Sow and Reap were introduced in Mathematica version 5.0.
I see no reason for you to think that this is efficient, compared to
some other method (unspecified!) to produce the same sequence.
RTFM.
Post by Richard Fateman
Post by Jon Harrop
If Mathematica were "not fine to many people" it would not have orders of
magnitude more users than Lisp or anything you have ever written (given
away for free, even).
Hm, 1 order of magnitude is, in common usage, a factor of ten.
"Orders of magnitude" would, I suppose, be a factor of 100 or 1000 or more.
Yes.
Post by Richard Fateman
According to sourceforge
http://sourceforge.net/projects/maxima/files/
The latest version (April 2009) of windows Maxima (a program written in
Lisp, descendant of Macsyma) and which I wrote parts of, has been
downloaded 27,691 times directly. Who knows how many times it has been
transferred. Another 2500 or so for Linux and Mac OSX.
The previous version (Dec. 2008) was downloaded 36,000 times or so.
Now I don't know how one counts "users" -- Mathematica could count "paid
up licenses" but do you think Mathematica has 100 X 30,000 paid up
licenses? That's 3 million.
If each licensee pays on average, say $500, that means the Mathematica
earns $1.5 billion/year.
The most common Mathematica licence is only $50. You seriously think those
Mathematica users upgrade every 5 weeks?
Post by Richard Fateman
Does this jibe with your understanding of that company?
Stephen Wolfram is now a billionaire thanks to sales of Mathematica. That
equates to 1e9/50 = 20,000,000 student licenses sold. In reality, WRI will
have pulled in far more than $1bn in licenses because they have significant
overheads (like paying hundreds of employees for decades).

So yes, I can easily believe that Mathematica has millions of active users.
Post by Richard Fateman
That's just Maxima compared to Mathematica.
While I have nothing to do with the implementation of common lisp called
CLISP, I note that sourceforge says it has been downloaded 317,000 times.
This could, of course, be one user downloading it 317,000 times, but I
doubt it.
You think hundreds of thousands of people are programming with clisp?

Ubuntu lists 4,273 clisp installs of which only 112 are actively used. I
seriously doubt clisp has over 1,000 users.
Post by Richard Fateman
Then there are also scheme implementations, and programs that
link lisp to graphics, java, lapack, ...
How is that relevant?
Post by Richard Fateman
...anyone who uses emacs is a lisp user...
That's hilarious.

Even if you count end users, the Ubuntu popcon gives under 100k emacs
installs ever (the vast majority of whom know nothing about Lisp) whereas
Wolfram Alpha pulled in a million users in a single month.
Post by Richard Fateman
Jon, when you make statements that are so easily disproven, you give
trolling a bad name.
Yes, you didn't have to clutch at extraordinarily long straws to "prove"
that at all.
Post by Richard Fateman
Post by Jon Harrop
Post by Richard Fateman
As for storing a cons cell as a 2-element array, this is a case of
economy of storage and operation (for the cons) vs. redundant (larger)
storage and slower general operation (for the array).
No, it is the case of sacrificing the bit twiddling of cons cells for the
power of a production-quality VM like the JVM or CLR.
Are you selling toothpaste? all new, brighter teeth, organic?
Eh?
Post by Richard Fateman
Post by Jon Harrop
The benefits those
VMs offer in the context of parallelism alone far outweigh the benefits
of bit twiddled cons cells in any of today's Lisp implementations.
I can't imagine why you think that the overhead of a virtual machine is
(what, faster?) than, or incompatible with, tagged pointers.
Oh, you're going to implement cons cells in the JVM or CLR?
Post by Richard Fateman
Or why tagged pointers are incompatible with parallelism.
You would have to implement a Lisp with a production-quality concurrent GC.
That is certainly possible but nobody has actually done it yet.
Post by Richard Fateman
Perhaps you would turn your keen eye to comparing the benefits of real
memory management and a quality garbage collector (as in some Lisps) to
the state of the art in JVM or similar memory management?
Turn my eye to the topic I have been discussing all along?
Post by Richard Fateman
Post by Jon Harrop
Post by Richard Fateman
Post by Jon Harrop
(JH) I believe the original motivation for separating cons out was performance
but, as we know now, that just led to slightly less cripplingly-bad
performance.
(RJF) You are of course free to believe anything at all, but your attestation
as to the "original motivation" is not particularly credible. Are you
talking about the roots of Lisp, e.g. Lisp 1.5 and earlier? Are you
claiming that you have a better instruction sequence for accessing
elements 0 and 1 of an array than CAR and CDR on the IBM 709X?
In other words, it was done for performance exactly as I said.
Uh, I guess you are free to claim that you agree with me.
It is presumptuous of you to claim that I agree with you. The point you
seem determined to miss is that CAR and CDR were not somehow second
choice abstractions compared to (say) arrays, some sacrifice in the name
of efficiency. Indeed the concept of "ordered pair" is in some sense an
extraordinarily powerful tool for building anything, and this was
realized fairly early. AND, as it happens, easy/fast to implement on
most machines.
A 2-element array is an ordered pair. Indeed, that was the whole point...
--
Dr Jon D Harrop, Flying Frog Consultancy Ltd.
http://www.ffconsultancy.com/?u
Rainer Joswig
2009-07-28 18:42:43 UTC
Post by Jon Harrop
As an aside, the time stamp on the first cons should cause Mathematica to
stop rewriting immediately. However, this has nothing to do with the topic
of conversation: replacing cons with 2-element arrays in a Lisp.
Yeah, let's talk about that. Why would you replace cons cells with 2-
element arrays?
Jon Harrop
2009-07-28 22:23:15 UTC
Why would you replace cons cells with 2-element arrays?
Simplicity.
--
Dr Jon D Harrop, Flying Frog Consultancy Ltd.
http://www.ffconsultancy.com/?u
Rainer Joswig
2009-07-28 21:40:43 UTC
Post by Jon Harrop
Why would you replace cons cells with 2-element arrays?
Simplicity.
What?
Jon Harrop
2009-07-28 23:16:11 UTC
What?
You can simplify the language by replacing conses with 2-element arrays.
--
Dr Jon D Harrop, Flying Frog Consultancy Ltd.
http://www.ffconsultancy.com/?u
Andrew Reilly
2009-07-28 23:38:33 UTC
Post by Jon Harrop
What?
You can simplify the language by replacing conses with 2-element arrays.
How does passing around array indexes (which will be zero or one),
storing the vector length for each node, and checking the indices against
the length get to be simpler? Cons seems like a perfectly valid exercise
in constant propagation optimization at the language level, to me.

This is really a very amusing line of argument, by the way.
Congratulations on a successful troll.

Cheers,
--
Andrew
w_a_x_man
2009-07-28 23:51:34 UTC
Post by Andrew Reilly
Congratulations on a successful troll.
You beg the question.
Jon Harrop
2009-07-29 20:40:36 UTC
Post by Andrew Reilly
Post by Jon Harrop
What?
You can simplify the language by replacing conses with 2-element arrays.
How does passing around array indexes (which will be zero or one),
storing the vector length for each node, and checking the indices against
the length get to be simpler? Cons seems like a perfectly valid exercise
in constant propagation optimization at the language level, to me.
Then you must want to add lots of other special cases, like a custom
incompatible redundant representation of 3-element vectors as well? After
all, it is a valid exercise in exposing constant propagation so that the
programmer must deal with it by hand...
--
Dr Jon D Harrop, Flying Frog Consultancy Ltd.
http://www.ffconsultancy.com/?u
Richard Fateman
2009-07-28 20:06:14 UTC
Post by Jon Harrop
Post by Richard Fateman
Post by Jon Harrop
That is neither relevant nor accurate. We are talking about 2-element
arrays which are obviously O(1).
Well, 4 element arrays would also be O(1).
b1= Table[a[i],{i,1,10}];
b2= Table[a[i],{i,1,100000}];
b3= Table[a[i],{i,1,1000000}];
Timing[Prepend[b1,zero];] --> {0., Null}
/* that's time in seconds, also the suppressed [not printed] answer
because of the ";"*/
Timing[Prepend[b2,zero];] --> {0.032, Null}
Timing[Prepend[b2,zero];] --> {0.203, Null}
... oops, that should have been b3 in the line above;
transcribing/simplifying resulted in a typo.
Post by Jon Harrop
You are timing the creation of an array with 10^5 elements. We are talking
about 2-element arrays...
The point was that Prepend in Mathematica is not "just fine". Compare to
CONS in Lisp, which is rather fast, and in particular, (cons a b) does
not depend on the length of b.
Post by Jon Harrop
Post by Richard Fateman
The relevance is that Lists in Mathematica are arrays, and arrays are,
generally speaking, unsuitable for making conses because every time you
construct a CONS-like object Mathematica checks to see if it has to
re-evaluate the CAR and CDR, so to speak. Recursively.
As an aside, the time stamp on the first cons should cause Mathematica to
stop rewriting immediately.
Rewriting? For an explanation of Mathematica's time stamp philosophy,
you can read my review of Mathematica. Frankly, I doubt that you know
what you are talking about.
Post by Jon Harrop
However, this has nothing to do with the topic
of conversation: replacing cons with 2-element arrays in a Lisp.
Post by Richard Fateman
So this is not, as I said, just fine.
Nor is it relevant.
Post by Richard Fateman
Post by Jon Harrop
If you want to build sequences efficiently in
Mathematica use Sow and Reap as the documentation describes.
Sow and Reap were introduced in Mathematica version 5.0.
I see no reason for you to think that this is efficient, compared to
some other method (unspecified!) to produce the same sequence.
RTFM.
I have, have you? I see no reason for you to think that this is
efficient, compared to some other method.
Post by Jon Harrop
Post by Richard Fateman
Post by Jon Harrop
If Mathematica were "not fine to many people" it would not have orders of
magnitude more users than Lisp or anything you have ever written (given
away for free, even).
Hm, 1 order of magnitude is, in common usage, a factor of ten.
"Orders of magnitude" would, I suppose, be a factor of 100 or 1000 or more.
Yes.
Post by Richard Fateman
According to sourceforge
http://sourceforge.net/projects/maxima/files/
The latest version (April 2009) of windows Maxima (a program written in
Lisp, descendant of Macsyma) and which I wrote parts of, has been
downloaded 27,691 times directly. Who knows how many times it has been
transferred. Another 2500 or so for Linux and Mac OSX.
The previous version (Dec. 2008) was downloaded 36,000 times or so.
Now I don't know how one counts "users" -- Mathematica could count "paid
up licenses" but do you think Mathematica has 100 X 30,000 paid up
licenses? That's 3 million.
If each licensee pays on average, say $500, that means the Mathematica
earns $1.5 billion/year.
The most common Mathematica licence is only $50. You seriously think those
Mathematica users upgrade every 5 weeks?
The price for Mathematica 7 "home edition" is $295. The Mathematica for
students price is $139.50. "Real" licenses are much more.
Post by Jon Harrop
Post by Richard Fateman
Does this jibe with your understanding of that company?
Stephen Wolfram is now a billionaire thanks to sales of Mathematica.
What evidence do you have for this fairly implausible claim?
Post by Jon Harrop
That
equates to 1e9/50 = 20,000,000 student licenses sold. In reality, WRI will
have pulled in far more than $1bn in licenses because they have significant
overheads (like paying hundreds of employees for decades).
You think that if you go out in the street, and grab the first 100
people, including children, that even one of them would have a
Mathematica license?? Assuming that 10 million were sold in the USA,
where you presumably live, you would expect 3 out of 100 to be
Mathematica licensees.
Post by Jon Harrop
So yes, I can easily believe that Mathematica has millions of active users.
I think this is called "drinking your own Kool Aid".

Isn't Google wonderful... I learned

"Inhabitants of the Kamchitka Peninsula, in Northeastern Siberia,
traditionally drank the urine of individuals who had ingested the
psychedelic mushroom amanita muscaria....
These individuals then shared their urine, and so on, and so on --
ensuring that everyone shared in the high. The intoxicant would also
drink his own urine, thus keeping his high going for a few days at a
time, without any additional expense. "

And this was even before the internet!


reference:
http://www.rotten.com/library/medicine/bodily-functions/pissing/drinking-pee/




Returning to the case at hand (after washing up...)


20 million active users?? According to Wolfram|Alpha there are 4.3
million science and engineering students in post-secondary education in
USA. Are you counting them all as active users of Mathematica, as well
as the students in China, India, all of Europe?
Post by Jon Harrop
Post by Richard Fateman
That's just Maxima compared to Mathematica.
While I have nothing to do with the implementation of common lisp called
CLISP, I note that sourceforge says it has been downloaded 317,000 times.
This could, of course, be one user downloading it 317,000 times, but I
doubt it.
You think hundreds of thousands of people are programming with clisp?
Is a user of lisp a programmer? Not necessarily.

You think 20 million people are programming in Mathematica? I doubt it.
Post by Jon Harrop
Ubuntu lists 4,273 clisp installs of which only 112 are actively used. I
seriously doubt clisp has over 1,000 users.
Why would Ubuntu matter? All versions of unix amount to something like
1-3 percent of installed systems.

And why would anyone trust your judgment? Seriously.
Post by Jon Harrop
Post by Richard Fateman
Then there are also scheme implementations, and programs that
link lisp to graphics, java, lapack, ...
How is that relevant?
You were attacking (a) lisp [all kinds?] and (b) lisp applications [all
kinds] as being less common than Mathematica.
Post by Jon Harrop
Post by Richard Fateman
...anyone who uses emacs is a lisp user...
That's hilarious.
It is true. Anyone who books a plane ticket with Orbitz is a lisp user,
as is anyone who visits any number of web sites that are implemented in
lisp.
Post by Jon Harrop
Even if you count end users, the Ubuntu popcon gives under 100k emacs
installs ever (the vast majority of whom know nothing about Lisp) whereas
Wolfram Alpha pulled in a million users in a single month.
I ran a web site called Tilu, written in Lisp, doing integral table
lookups. It amassed about 260,000 user hits before I took it down.
Are you counting visitors to Wolfram Alpha as Mathematica users??
Then I should count Tilu users. And multiply by 2 orders of magnitude.
Tilu beats you.
Post by Jon Harrop
Post by Richard Fateman
Jon, when you make statements that are so easily disproven, you give
trolling a bad name.
Yes, you didn't have to clutch at extraordinarily long straws to "prove"
that at all.
Counting Wolfram Alpha visitors -- now that's a long straw.
Post by Jon Harrop
Post by Richard Fateman
Post by Jon Harrop
Post by Richard Fateman
As for storing a cons cell as a 2-element array, this is a case of
economy of storage and operation (for the cons) vs. redundant (larger)
storage and slower general operation (for the array).
No, it is the case of sacrificing the bit twiddling of cons cells for the
power of a production-quality VM like the JVM or CLR.
Are you selling toothpaste? all new, brighter teeth, organic?
Eh?
Hey, I use a production-quality toothbrush.
Post by Jon Harrop
Post by Richard Fateman
Post by Jon Harrop
The benefits those
VMs offer in the context of parallelism alone far outweigh the benefits
of bit twiddled cons cells in any of today's Lisp implementations.
I can't imagine why you think that the overhead of a virtual machine is
(what, faster?) than, or incompatible with, tagged pointers.
Oh, you're going to implement cons cells in the JVM or CLR?
Ah, you are not peddling virtual machines, but particular virtual
machine implementations. I would suggest that you explore the history
of virtual machines and byte-coded Lisps to see how implementing
operations and representations for Lisp in a VM might be done. There
were such implementations for BBN Lisp, Interlisp on D-machines, Franz
Lisp (on VAX), and CLISP.
Post by Jon Harrop
Post by Richard Fateman
Or why tagged pointers are incompatible with parallelism.
You would have to implement a Lisp with a production-quality concurrent GC.
That is certainly possible but nobody has actually done it yet.
How would you know?
Post by Jon Harrop
Post by Richard Fateman
Perhaps you would turn your keen eye to comparing the benefits of real
memory management and a quality garbage collector (as in some Lisps) to
the state of the art in JVM or similar memory management?
Turn my eye to the topic I have been discussing all along?
You seem to confuse "concurrent" with "quality".
[snip]
Post by Jon Harrop
A 2-element array is an ordered pair. Indeed, that was the whole point...
Your "point" was that there would be some advantage by storing a pair in
a 2-element array as opposed to some carefully crafted representation
that stored 2 pointers in a "cons cell" whatever that might be. And that
advantage would be related to some made-up attribute like
"production-quality-ness" or "parallel-ness".

I doubt that your "point" has anything to do with reality, but it
certainly has not been demonstrated.


RJF
Rainer Joswig
2009-07-28 20:23:01 UTC
Permalink
Post by Jon Harrop
Post by Richard Fateman
...anyone who uses emacs is a lisp user...
That's hilarious.
It is true.  Anyone who books a plane ticket with Orbitz is a lisp user,
as is anyone who visits any number of web sites that are implemented in
lisp.
Everyone who uses technical objects is a Lisp user, thanks to Autocad
(and its clones):
Buildings, bridges, motors, elevators, ...

...
w_a_x_man
2009-07-28 20:40:36 UTC
Permalink
Post by Rainer Joswig
Post by Jon Harrop
Post by Richard Fateman
...anyone who uses emacs is a lisp user...
That's hilarious.
It is true.  Anyone who books a plane ticket with Orbitz is a lisp user,
as is anyone who visits any number of web sites that are implemented in
lisp.
Everyone who uses technical objects is a Lisp user, thanks to Autocad
Buildings, bridges, motors, elevators, ...
...
It was generous of him to share his urine with you.
Rainer Joswig
2009-07-28 20:51:48 UTC
Permalink
Post by w_a_x_man
Post by Rainer Joswig
Post by Jon Harrop
Post by Richard Fateman
...anyone who uses emacs is a lisp user...
That's hilarious.
It is true.  Anyone who books a plane ticket with Orbitz is a lisp user,
as is anyone who visits any number of web sites that are implemented in
lisp.
Everyone who uses technical objects is a Lisp user, thanks to Autocad
Buildings, bridges, motors, elevators, ...
...
It was generous of him to share his urine with you.
You are a pisser.
w_a_x_man
2009-07-28 20:38:18 UTC
Permalink
It is true.  Anyone who books a plane ticket with Orbitz is a lisp user,
as is anyone who visits any number of web sites that are implemented in
lisp.
Any honest man would have to agree with that (if he had drunk
as much drug-laced urine as you have).
Jon Harrop
2009-07-28 23:10:07 UTC
Permalink
Post by Richard Fateman
Post by Jon Harrop
You are timing the creation of an array with 10^5 elements. We are
talking about 2-element arrays...
The point was that Prepend in Mathematica is not "just fine".
Which remains completely irrelevant.
Post by Richard Fateman
Post by Jon Harrop
Post by Richard Fateman
The relevance is that Lists in Mathematica are arrays, and arrays are,
generally speaking, unsuitable for making conses because every time you
construct a CONS-like object Mathematica checks to see if it has to
re-evaluate the CAR and CDR, so to speak. Recursively.
As an aside, the time stamp on the first cons should cause Mathematica to
stop rewriting immediately.
Rewriting? For an explanation of Mathematica's time stamp philosophy,
you can read my review of Mathematica.
I can read some of your woefully misinformed whining or remember what the
guy who actually wrote the code told me. Hmm, let me think...
Post by Richard Fateman
Post by Jon Harrop
Post by Richard Fateman
Post by Jon Harrop
If you want to build sequences efficiently in
Mathematica use Sow and Reap as the documentation describes.
Sow and Reap were introduced in Mathematica version 5.0.
I see no reason for you to think that this is efficient, compared to
some other method (unspecified!) to produce the same sequence.
RTFM.
I have, have you? I see no reason for you to think that this is
efficient, compared to some other method.
So you didn't see the "Possible Issues" section about performance that gave
measurements about exactly this then?

Here, let me cite a URL of the exact documentation:

http://reference.wolfram.com/mathematica/ref/AppendTo.html

Maybe you need someone to read it out loud to you?
Post by Richard Fateman
Post by Jon Harrop
Post by Richard Fateman
If each licensee pays on average, say $500, that means the Mathematica
earns $1.5 billion/year.
The most common Mathematica licence is only $50. You seriously think
those Mathematica users upgrade every 5 weeks?
The price for Mathematica 7 "home edition" is $295. The Mathematica for
students price is $139.50. "Real" licenses are much more.
"In just 10 minutes, for about $50, any student around the world can be up
and running with Mathematica." -
http://www.wolfram.com/news/timedstudent.html
Post by Richard Fateman
Post by Jon Harrop
Post by Richard Fateman
Does this jibe with your understanding of that company?
Stephen Wolfram is now a billionaire thanks to sales of Mathematica.
What evidence do you have for this fairly implausible claim?
That was widely reported in the mainstream press a few years ago.
Post by Richard Fateman
Post by Jon Harrop
That
equates to 1e9/50 = 20,000,000 student licenses sold. In reality, WRI
will have pulled in far more than $1bn in licenses because they have
significant overheads (like paying hundreds of employees for decades).
You think that if you go out in the street, and grab the first 100
people, including children, that even one of them would have a
Mathematica license??
Probably a lot more here in Cambridge but we are hardly representative of
the rest of the world.
Post by Richard Fateman
Assuming that 10 million were sold in the USA,
You think half of the world's population is in the USA?
Post by Richard Fateman
where you presumably live,
Err, no. I'm in the 95% of the world's population who live outside the USA.
Post by Richard Fateman
you would expect 3 out of 100 to be Mathematica licensees.
7 billion people in the world divided by no more than 20M licenses gives one
licence to at least 350 people. So you're out by an order of magnitude
again...

Wolfram Research actually have a building just outside Oxford filled with
multilingual people who sell it across the globe. Did you not do that with
Maxima?
Post by Richard Fateman
Post by Jon Harrop
So yes, I can easily believe that Mathematica has millions of active users.
I think this is called "drinking your own Kool Aid".
...
20 million active users?? According to Wolfram|Alpha there are 4.3
million science and engineering students in post-secondary education in
USA. Are you counting them all as active users of Mathematica, as well
as the students in China, India, all of Europe?
You think Mathematica is only used by science and engineering students in
post-secondary education?
Post by Richard Fateman
Post by Jon Harrop
Post by Richard Fateman
That's just Maxima compared to Mathematica.
While I have nothing to do with the implementation of common lisp called
CLISP, I note that sourceforge says it has been downloaded 317,000
times. This could, of course, be one user downloading it 317,000 times,
but I doubt it.
You think hundreds of thousands of people are programming with clisp?
Is a user of lisp a programmer? Not necessarily.
You think 20 million people are programming in Mathematica? I doubt it.
Right. It is more like 1,000 Lisp programmers and 1,000,000 Mathematica
programmers.
Post by Richard Fateman
Post by Jon Harrop
Ubuntu lists 4,273 clisp installs of which only 112 are actively used. I
seriously doubt clisp has over 1,000 users.
Why would ubuntu matter? All versions of unix amount to something like
1-3 percent of installed systems.
You think 98% of Lisp programmers use Windows?
Post by Richard Fateman
And why would anyone trust your judgment? Seriously.
You can verify it for yourself:

http://popcon.ubuntu.com
Post by Richard Fateman
Post by Jon Harrop
Post by Richard Fateman
Then there are also scheme implementations, and programs that
link lisp to graphics, java, lapack, ...
How is that relevant?
You were attacking (a) lisp [all kinds?] (b) lisp applications [all
kinds] as being less common than Mathematica..
No, that would be the strawman argument you built and then beat to death.
What I actually said was "Lisp or anything you have ever written".
Post by Richard Fateman
Post by Jon Harrop
Post by Richard Fateman
...anyone who uses emacs is a lisp user...
That's hilarious.
...Anyone who books a plane ticket with Orbitz is a lisp user...
This just keeps getting better.
Post by Richard Fateman
Post by Jon Harrop
Even if you count end users, the Ubuntu popcon gives under 100k emacs
installs ever (the vast majority of whom know nothing about Lisp) whereas
Wolfram Alpha pulled in a million users in a single month.
I ran a web site called Tilu, written in Lisp, doing integral table
lookups. It amassed about 260,000 user hits before I took it down.
What is a "user hit"?
Post by Richard Fateman
Post by Jon Harrop
Post by Richard Fateman
Jon, when you make statements that are so easily disproven, you give
trolling a bad name.
Yes, you didn't have to clutch at extraordinarily long straws to "prove"
that at all.
Counting Wolfram Alpha visitors -- now that's a long straw.
If you want to count people editing text with emacs or booking flights
on-line as Lisp programmers then it seems more than reasonable.
Post by Richard Fateman
Post by Jon Harrop
Post by Richard Fateman
Are you selling toothpaste? all new, brighter teeth, organic?
Eh?
Hey, I use a production-quality toothbrush.
I imagine you'd have to with all that BS spewing out of your mouth.
Post by Richard Fateman
Post by Jon Harrop
Oh, you're going to implement cons cells in the JVM or CLR?
Ah, you are not peddling virtual machines, but particular virtual
machine implementations.
You could probably have inferred that from the way I explicitly named the
JVM and CLR several times in this thread already.
Post by Richard Fateman
I would suggest that you explore the history
of virtual machines and byte-coded Lisps to see how implementing
operations and representations for Lisp in a VM might be done.
I suggest you reread my statement in order to understand why that is also
completely irrelevant.
Post by Richard Fateman
Post by Jon Harrop
Post by Richard Fateman
Or why tagged pointers are incompatible with parallelism.
You would have to implement a Lisp with a production-quality concurrent
GC. That is certainly possible but nobody has actually done it yet.
How would you know?
Oh, is there a secret Lisp implementation that doesn't suck? Shhh or it
might get users...
Post by Richard Fateman
Post by Jon Harrop
Post by Richard Fateman
Perhaps you would turn your keen eye to comparing the benefits of real
memory management and a quality garbage collector (as in some Lisps) to
the state of the art in JVM or similar memory management?
Turn my eye to the topic I have been discussing all along?
You seem to confuse "concurrent" with "quality".
Oh, the JVM and CLR are just toys are they? Not production quality like
Maxima.
Post by Richard Fateman
Post by Jon Harrop
A 2-element array is an ordered pair. Indeed, that was the whole point...
Your "point" was that there would be some advantage by storing a pair in
a 2-element array as opposed to some carefully crafted representation
that stored 2 pointers in a "cons cell" whatever that might be. And that
advantage would be related to some made-up attribute like
"production-quality-ness" or "parallel-ness".
No, that was not my "point" at all.
Post by Richard Fateman
I doubt that your "point" has anything to do with reality, but it
certainly has not been demonstrated.
Really? Your "strawman" has not been demonstrated? How enlightening.
--
Dr Jon D Harrop, Flying Frog Consultancy Ltd.
http://www.ffconsultancy.com/?u
Dimiter "malkia" Stanev
2009-07-28 22:43:56 UTC
Permalink
Post by Jon Harrop
Post by Richard Fateman
Post by Jon Harrop
You are timing the creation of an array with 10^5 elements. We are
talking about 2-element arrays...
The point was that Prepend in Mathematica is not "just fine".
Which remains completely irrelevant.
Post by Richard Fateman
Post by Jon Harrop
Post by Richard Fateman
The relevance is that Lists in Mathematica are arrays, and arrays are,
generally speaking, unsuitable for making conses because every time you
construct a CONS-like object Mathematica checks to see if it has to
re-evaluate the CAR and CDR, so to speak. Recursively.
As an aside, the time stamp on the first cons should cause Mathematica to
stop rewriting immediately.
Rewriting? For an explanation of Mathematica's time stamp philosophy,
you can read my review of Mathematica.
I can read some of your woefully misinformed whining or remember what the
guy who actually wrote the code told me. Hmm, let me think...
Post by Richard Fateman
Post by Jon Harrop
Post by Richard Fateman
Post by Jon Harrop
If you want to build sequences efficiently in
Mathematica use Sow and Reap as the documentation describes.
Sow and Reap were introduced in Mathematica version 5.0.
I see no reason for you to think that this is efficient, compared to
some other method (unspecified!) to produce the same sequence.
RTFM.
I have, have you? I see no reason for you to think that this is
efficient, compared to some other method.
So you didn't see the "Possible Issues" section about performance that gave
measurements about exactly this then?
http://reference.wolfram.com/mathematica/ref/AppendTo.html
Maybe you need someone to read it out loud to you?
Post by Richard Fateman
Post by Jon Harrop
Post by Richard Fateman
If each licensee pays on average, say $500, that means the Mathematica
earns $1.5 billion/year.
The most common Mathematica licence is only $50. You seriously think
those Mathematica users upgrade every 5 weeks?
The price for Mathematica 7 "home edition" is $295. The Mathematica for
students price is $139.50. "Real" licenses are much more.
"In just 10 minutes, for about $50, any student around the world can be up
and running with Mathematica." -
http://www.wolfram.com/news/timedstudent.html
But that's only for six months. It's quite costly for a student - about
$8 a month for it; some of them are living on $800/month (well,
that's a figure from a friend at UCLA, after paying rent) - and every
dollar is worth keeping. There are other licenses: $70 for a year, and
$140 for as long as one is an active student:

Mathematica for Students: Semester Edition
6 months, $44.95

* Product license remains active for six months from the date of
registration
* Available to part-time or full-time students working toward a
high school, associate's, bachelor's, master's, doctoral, or equivalent
degree
* Available to part-time, non-degree-seeking students if required
for a class
* Available by download only

http://www.wolfram.com/products/student/mathforstudents/licenses.html

I've bought the Mathematica Personal Edition for about $299.
Jon Harrop
2009-07-29 20:36:13 UTC
Permalink
Post by Dimiter "malkia" Stanev
Post by Jon Harrop
"In just 10 minutes, for about $50, any student around the world can be
up and running with Mathematica." -
http://www.wolfram.com/news/timedstudent.html
But that's only for six months. It's quite costly for a student - about
$8 a month for it; some of them are living on $800/month (well,
that's a figure from a friend at UCLA, after paying rent) - and every
dollar is worth keeping.
IME, poor students tend to pay $50 and then use the same licence for their
3-year degree by changing their clock and, if necessary, setting the MAC
address at startup.
--
Dr Jon D Harrop, Flying Frog Consultancy Ltd.
http://www.ffconsultancy.com/?u
w_a_x_man
2009-07-28 23:12:57 UTC
Permalink
Post by Jon Harrop
Maybe you need someone to read it out loud to you?
aloud to you?
Post by Jon Harrop
--
Dr Jon D Harrop
If you don't have a first name that's pronounced "durr",
then that ought to be
Dr. Jon D. Harrop
Richard Fateman
2009-07-29 00:05:53 UTC
Permalink
Post by Jon Harrop
Post by Richard Fateman
Rewriting? For an explanation of Mathematica's time stamp philosophy,
you can read my review of Mathematica.
I can read some of your woefully misinformed whining or remember what the
guy who actually wrote the code told me. Hmm, let me think...
You seem determined to not think, but I will certainly give you
permission to think.
Post by Jon Harrop
Post by Richard Fateman
Post by Jon Harrop
Post by Richard Fateman
Post by Jon Harrop
If you want to build sequences efficiently in
Mathematica use Sow and Reap as the documentation describes.
Sow and Reap were introduced in Mathematica version 5.0.
I see no reason for you to think that this is efficient, compared to
some other method (unspecified!) to produce the same sequence.
RTFM.
I have, have you? I see no reason for you to think that this is
efficient, compared to some other method.
So you didn't see the "Possible Issues" section about performance that gave
measurements about exactly this then?
http://reference.wolfram.com/mathematica/ref/AppendTo.html
Maybe you need someone to read it out loud to you?
Well, for someone so concerned with performance, let me explain this to you:

program 1 takes 5 seconds
program 2 takes 0.3 seconds to compute the same thing (using Reap, Sow).

Your conclusion is that this has proved that Reap/Sow is efficient.
This is FALSE in general, and in this specific case as well.

Reason: given ANY program 2, another program can be constructed that
does the same thing but slower.

To take examples from the cited page... (This is Mathematica code)

program 1
BlockRandom[
Timing[a = {}; sum = 0;
While[sum < 10^5, r = RandomInteger[9]; sum += r; AppendTo[a, r]];
Length[a]]] takes 3.42 sec

program 2
BlockRandom[
Timing[sum = 0; {r, {a}} =
Reap[While[sum < 10^5, r = RandomInteger[9]; sum += r; Sow[r]]];
Length[a]]] takes 0.27 sec

Here's an equivalent program that I wrote that is 3X FASTER

(* draw 5*10^4 random digits up front, keep the running totals that stay
   below 10^5, and recover the individual increments with Differences *)
BlockRandom[Timing[(
Length[
a = Differences[
Select[Accumulate[
Table[RandomInteger[9], {i, 1, 5*10^4}]], # < 10^5 &]]])]]


It is also simpler, having no While, and no variables except a, which is
used to return the array answer.
(I suppose I could make it faster, but I didn't need to in order to make my point.)

So my point is this: Sow and Reap are not "efficient" per se. They
provide a different paradigm for collection, similar in some ways to
lisp's catch/throw. Because AppendTo and Prepend are such terrible
programs, you shouldn't use them often. See the program above.
Post by Jon Harrop
Post by Richard Fateman
Post by Jon Harrop
Post by Richard Fateman
If each licensee pays on average, say $500, that means the Mathematica
earns $1.5 billion/year.
The most common Mathematica licence is only $50. You seriously think
those Mathematica users upgrade every 5 weeks?
The price for Mathematica 7 "home edition" is $295. The Mathematica for
students price is $139.50. "Real" licenses are much more.
"In just 10 minutes, for about $50, any student around the world can be up
and running with Mathematica." -
http://www.wolfram.com/news/timedstudent.html
Post by Richard Fateman
Post by Jon Harrop
Post by Richard Fateman
Does this jibe with your understanding of that company?
Stephen Wolfram is now a billionaire thanks to sales of Mathematica.
What evidence do you have for this fairly implausible claim?
That was widely reported in the mainstream press a few years ago.
Uh, Google News didn't notice any of those widely reported reports.
Neither did Fortune Magazine.
Perhaps you saw the term bilious..

..bilious means suffering from liver dysfunction (and especially
excessive secretion of bile). And, further by extension, it is
indicative of a peevish ill-natured disposition.

Of course many people write about Wolfram and Wolfram's work. For example,
Google found this comment on his New Kind of Science book:...

"Its publication has been seen as initiating a paradigm shift of
historic importance in science, with new implications emerging at an
increasing rate every year."

Though that was Wolfram praising himself.
Post by Jon Harrop
7 billion people in the world divided by no more than 20M licenses gives one
licence to at least 350 people. So you're out by an order of magnitude
again...
OK, if you believe that you should divide up the licenses to Mathematica
equally over the whole population (incl. people who don't have
computers??), I think you are way off.
Post by Jon Harrop
Wolfram Research actually have a building just outside Oxford filled with
multilingual people who sell it across the globe. Did you not do that with
Maxima?
No, as you know Maxima is given away free, but the manual is in several
languages. I think Spanish, Portuguese, German, French, maybe Japanese.
Post by Jon Harrop
You think Mathematica is only used by science and engineering students in
post-secondary education?
No, just to give you a place to start with a back-of-the-envelope
estimate. I assure you that many such students DON'T use Mathematica.
Post by Jon Harrop
Post by Richard Fateman
You think 20 million people are programming in Mathematica? I doubt it.
Right. It is more like 1,000 Lisp programmers and 1,000,000 Mathematica
programmers.
I think it is kind of pointless to discuss numbers that you just pull
out of your posterior.
Post by Jon Harrop
You think 98% of Lisp programmers use Windows?
The Gartner Group provides figures that suggest that something like 95%
of PCs run Windows.
Post by Jon Harrop
Post by Richard Fateman
And why would anyone trust your judgment? Seriously.
http://popcon.ubuntu.com
What does ubuntu have to do with anything?
Post by Jon Harrop
Post by Richard Fateman
You were attacking (a) lisp [all kinds?] (b) lisp applications [all
kinds] as being less common than Mathematica..
No, that would be the strawman argument you built and then beat to death.
What I actually said was "Lisp or anything you have ever written".
I think the argument was your strawman exactly.
Post by Jon Harrop
Post by Richard Fateman
Post by Jon Harrop
Post by Richard Fateman
...anyone who uses emacs is a lisp user...
That's hilarious.
...Anyone who books a plane ticket with Orbitz is a lisp user...
This just keeps getting better.
Post by Richard Fateman
Post by Jon Harrop
Even if you count end users, the Ubuntu popcon gives under 100k emacs
installs ever (the vast majority of whom know nothing about Lisp) whereas
Wolfram Alpha pulled in a million users in a single month.
I ran a web site called Tilu, written in Lisp, doing integral table
lookups. It amassed about 260,000 user hits before I took it down.
What is a "user hit"?
Someone who submits a mathematical formula to the web site to find its
integral. I suppose you could consider "typing a formula" to be
"writing a program". I think that's a weak argument, but you may be
using it with respect to Mathematica.
[big snip]

If you think that the only way to make a production quality lisp is to
use JVM, that's too bad.

I think your pee is recirculating.
Larry Coleman
2009-07-29 12:10:46 UTC
Permalink
Post by Richard Fateman
If you think that the only way to make a production quality lisp is to
use JVM, that's too bad.
No, it's much worse than that. He thinks the only way to make a
production quality lisp is to write his own VM first.
Jon Harrop
2009-07-29 19:44:46 UTC
Permalink
Post by Larry Coleman
Post by Richard Fateman
If you think that the only way to make a production quality lisp is to
use JVM, that's too bad.
No, it's much worse than that. He thinks the only way to make a
production quality lisp is to write his own VM first.
Yes.
--
Dr Jon D Harrop, Flying Frog Consultancy Ltd.
http://www.ffconsultancy.com/?u
Pillsy
2009-07-28 20:42:02 UTC
Permalink
On Jul 28, 3:39 pm, Jon Harrop <***@ffconsultancy.com> wrote:
[...]
Post by Jon Harrop
Even if you count end users, the Ubuntu popcon gives under 100k emacs
installs ever (the vast majority of whom know nothing about Lisp) whereas
Wolfram Alpha pulled in a million users in a single month.
There are so many things wrong with this comparison that I don't even
know where to begin.

IME, Emacs users use Lisp to about the extent that most Mathematica
users use the Mathematica programming language: for customization,
simple extension, and basic scripting of repetitive tasks. Anything
other than making assignments to global variables or defining simple
"functions" is beyond them.

Later,
Pillsy
Jon Harrop
2009-07-28 22:20:32 UTC
Permalink
Post by Pillsy
[...]
Post by Jon Harrop
Even if you count end users, the Ubuntu popcon gives under 100k emacs
installs ever (the vast majority of whom know nothing about Lisp) whereas
Wolfram Alpha pulled in a million users in a single month.
There are so many things wrong with this comparison that I don't even
know where to begin.
IME, Emacs users use Lisp to about the extent that most Mathematica
users use the Mathematica programming language: for customization,
simple extension, and basic scripting of repetitive tasks. Anything
other than making assignments to global variables or defining simple
"functions" is beyond them.
Actually I think the discrepancy is much larger than that. The vast majority
of emacs users probably never even touched the settings written in lisp. I
don't.
--
Dr Jon D Harrop, Flying Frog Consultancy Ltd.
http://www.ffconsultancy.com/?u
fft1976
2009-07-21 18:52:05 UTC
Permalink
Post by Xah Lee
• Fundamental Problems of Lisp
 http://xahlee.org/UnixResource_dir/writ/lisp_problems.html
This sounds very interesting, but the translation is very bad. Did you
use babelfish?
gugamilare
2009-07-21 20:59:15 UTC
Permalink
Post by fft1976
Post by Xah Lee
• Fundamental Problems of Lisp
 http://xahlee.org/UnixResource_dir/writ/lisp_problems.html
This sounds very interesting,
You can't be serious.
Mr.Cat
2009-07-21 23:13:45 UTC
Permalink
Post by Xah Lee
dear idiot,
...
Post by Xah Lee
Q: If you don't like cons, lisp has arrays and hashmaps, too.
No need to be insulting here.

When it comes to lists, scheme is rather close to other functional
languages like haskell or erlang. Lists in haskell/erlang also have cons-
cell semantics, but it is a bit less explicit. At the same time, in
scheme you're not limited to cons/car/cdr when dealing with lists.
You've got different utility functions (e.g. srfi-1), pattern matching
and so on.
Aaron W. Hsu
2009-07-21 23:57:14 UTC
Permalink
Post by Mr.Cat
When it comes to lists, scheme is rather close to other functional
languages like haskell or erlang. Lists in haskell/erlang also have cons-
cell semantics, but it is a bit less explicit. At the same time, in
scheme you're not limited to cons/car/cdr when dealing with lists.
You've got different utility functions (e.g. srfi-1), pattern matching
and so on.
Xah is, I believe, complaining about the very existence of the construct
in the core language, arguing that its frequent use represents a problem
to him, and writing his own code. Apparently, the use of CONS in any code
makes that code bad, for some value of bad. I hope that the illegitimacy
of such "reasoning" appears self-evident to the majority of readers.

Aaron Hsu
--
Of all tyrannies, a tyranny sincerely exercised for the good of its
victims may be the most oppressive. -- C. S. Lewis
fft1976
2009-07-22 01:07:36 UTC
Permalink
Post by Mr.Cat
When it comes to lists, scheme is rather close to other functional
languages like haskell or erlang. Lists in haskell/erlang also have cons-
cell semantics, but it is a bit less explicit. At the same time, in
scheme you're not limited to cons/car/cdr when dealing with lists.
You've got different utility functions (e.g. srfi-1), pattern matching
and so on.
Xah is, I believe, complaining about the very existence of the construct  
in the core language, arguing that its frequent use represents a problem  
to him, and writing his own code. Apparently, the use of CONS in any code
makes that code bad, for some value of bad. I hope that the illegitimacy  
of such "reasoning" appears self-evident to the majority of readers.
But the same people who think undesirable optional features are not a
valid reason to complain about a language (Common Lisp), will turn
around and claim that C++ sucks compared to C.

I maintain that Xah is smarter than most people here. He's simply
misunderstood (because he seems to be using babelfish for his posts)
Mr.Cat
2009-07-22 09:23:24 UTC
Permalink
Post by fft1976
I maintain that Xah is smarter than most people here. He's simply
misunderstood (because he seems to be using babelfish for his posts)
The cons (and cdr, car, caadar etc) are fundamentally rooted in the lisp langs, and are thus not something that can be easily mended.
(A) create a standard that all list data structures must be proper lists.
(B) Expose only higher-level list manipulation functions (i.e. interfaces to cons) in all newer literature.
(C) Even mark cons-related constructs as obsolete.
But this is already done for scheme (except for making cons obsolete
of course). Thus, I don't know what we are going to discuss here.
Michele Simionato
2009-07-22 13:37:16 UTC
Permalink
Post by Mr.Cat
Post by fft1976
(A) create a standard that all list data structures must be proper lists.
(B) Expose only higher-level list manipulation functions (i.e. interfaces to cons) in all newer literature.
(C) Even mark cons-related constructs as obsolete.
But this is already done for scheme (except for making cons obsolete
of course).
What are you talking about? Neither A, nor B, nor C is done in Scheme.
Aaron W. Hsu
2009-07-22 19:37:41 UTC
Permalink
On Wed, 22 Jul 2009 09:37:16 -0400, Michele Simionato
Post by Michele Simionato
Post by Mr.Cat
Post by fft1976
(A) create a standard that all list data structures must be proper
lists.
Post by fft1976
(B) Expose only higher-level list manipulation functions (i.e.
interfaces to cons) in all newer literature.
Post by fft1976
(C) Even mark cons-related constructs as obsolete.
But this is already done for scheme (except for making cons obsolete
of course).
What are you talking about? Neither A, nor B, nor C is done in Scheme.
And thank goodness for that.

Aaron
--
Of all tyrannies, a tyranny sincerely exercised for the good of its
victims may be the most oppressive. -- C. S. Lewis
Pillsy
2009-07-27 15:15:36 UTC
Permalink
On Jul 20, 10:04 am, Xah Lee <***@gmail.com> wrote:
[....]
Post by Xah Lee
As a illustration, the following 2 tech aspects we can see Scheme's
problem: (1) the cons business.
The idea that exposing something as low-level as "cons" is a major
obstacle to a language's popularity seems to have two major
problems.

A. It implicitly depends on the assertion that exposing[1] low-level
constructs is a major impediment to a language's widespread adoption,
but C exposes an even lower-level detail than conses, in the form of
pointers, and doing anything useful in C without understanding
pointers is virtually impossible. This hasn't prevented C from
becoming vastly more widely used than Lisp or any of the functional
languages you often cite as somehow superseding it.

B. It's extremely common for functional languages that aren't Lisp to
have conses as well, and to represent lists as recursively built out
of conses. That's because they're a very convenient data structure
when you're doing things (constructing lists, or reducing across them,
or mapping over them) in a functional way, using recursive functions.
If you don't have them, it makes functional programming a pain, and
trying to program in a functional style in Mathematica is often a real
pain because you don't have conses. Oh, sure, you can fake a cons with
a two-element Mathematica List, but then you either need to
gratuitously flatten things all over the place, or you need to use
convoluted and non-idiomatic things in order to do your mapping and
reducing.

C. Lisp does provide simple syntax for expressing those lists built
out of conses. The parenthesized syntax of Lisp is precisely that: a
simple syntax for representing lists built recursively from conses!
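To make C concrete, here is a minimal sketch at a Common Lisp REPL (the
forms below are an editorial illustration of this point, nothing more):

(cons 1 (cons 2 (cons 3 nil)))                    ; => (1 2 3)
(equal '(1 2 3) (cons 1 (cons 2 (cons 3 nil))))   ; => T
(car '(1 2 3))                                    ; => 1
(cdr '(1 2 3))                                    ; => (2 3)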

All that remains is that conses are too low-level a representation for
a list datatype. This is a more common complaint (Ron Garret makes the
same complaint elsewhere in this discussion), but I don't really agree
with it either. At some point, and probably sooner rather than later,
you're going to kill the performance of an otherwise perfectly
legitimate algorithm if you don't know what sort of underlying list
representation you have. People are likely to want both
representations depending on the context, so why not provide both and
ensure that the various functions people want to use on abstract lists
work with both representations?
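As a rough sketch of what "work with both representations" can look like,
Common Lisp's generic sequence functions already accept either a cons-based
list or a vector (illustrative REPL forms only):

(reduce #'+ '(1 2 3))            ; => 6, on a cons-based list
(reduce #'+ #(1 2 3))            ; => 6, same call on a vector
(map 'list   #'1+ '(1 2 3))      ; => (2 3 4)
(map 'vector #'1+ #(1 2 3))      ; => #(2 3 4)
(elt '(1 2 3) 1)                 ; => 2
(elt #(1 2 3) 1)                 ; => 2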

Cheers,
Pillsy
Jon Harrop
2009-07-28 16:29:28 UTC
Permalink
Post by Pillsy
Post by Xah Lee
As a illustration, the following 2 tech aspects we can see Scheme's
problem: (1) the cons business.
The idea that exposing something as low-level as "cons" is a major
obstacle to a language's popularity seems to have two major
problems.
A. It implicitly depends on the assertion that exposing[1] low-level
constructs is a major impediment to a language's widespread adoption,
but C exposes an even lower-level detail than conses, in the form of
pointers, and doing anything useful in C without understanding
pointers is virtually impossible. This hasn't prevented C from
becoming vastly more widely used than Lisp or any of the functional
languages you often cite as somehow superseding it.
Pointers in C do the job well. Cons cells in Lisp do not.
Post by Pillsy
B. It's extremely common for functional languages that aren't Lisp to
have conses as well,
No. They often use the name "cons" to refer to non-empty lists but they are
not cons cells in the Lisp sense.
Post by Pillsy
If you don't have them, it makes functional programming a pain...
No. In all modern FPL implementations, lists are just another data structure
provided in the standard library. Variant types are the core feature
provided in the language itself that lists are built upon.
Post by Pillsy
All that remains is that conses are too low-level a representation for
a list datatype. This is a more common complaint (Ron Garret makes the
same complaint elsewhere in this discussion), but I don't really agree
with it either. At some point, and probably sooner rather than later,
you're going to kill the performance of an otherwise perfectly
legitimate algorithm if you don't know what sort of underlying list
representation you have. People are likely to want both
representations depending on the context, so why not provide both and
ensure that the various functions people want to use on abstract lists
work with both representations?
Why not forget about cons cells because they have been of no practical
relevance for several decades?
--
Dr Jon D Harrop, Flying Frog Consultancy Ltd.
http://www.ffconsultancy.com/?u
Pillsy
2009-07-28 15:50:59 UTC
Permalink
[...]
Post by Jon Harrop
Post by Pillsy
A. It implicitly depends on the assertion that exposing[1] low-level
constructs is a major impediment to a language's widespread adoption,
but C exposes an even lower-level detail than conses, in the form of
pointers, and doing anything useful in C without understanding
pointers is virtually impossible. This hasn't prevented C from
becoming vastly more widely used than Lisp or any of the functional
languages you often cite as somehow superseding it.
Pointers in C do the job well. Cons cells in Lisp do not.
Assumes facts not in evidence.
Post by Jon Harrop
Post by Pillsy
B. It's extremely common for functional languages that aren't Lisp to
have conses as well,
No. They often use the name "cons" to refer to non-empty lists but they are
not cons cells in the Lisp sense.
This is not a particularly interesting difference, because it hinges
much more on the different approaches to typing taken by FPLs and Lisp
than anything else.

Of course, I expect this to launch you on a tirade about the evils of
dynamic typing, which is probably a more useful approach than
continuing this nonsensical attack on conses.
[...]
Post by Jon Harrop
Post by Pillsy
If you don't have them, it makes functional programming a pain...
No. In all modern FPL implementations, lists are just another data structure
provided in the standard library.
And your point is?
[...]
Post by Jon Harrop
Post by Pillsy
People are likely to want both representations depending on the context,
so why not provide both and ensure that the various functions people want to
use on abstract lists work with both representations?
Why not forget about cons cells because they have been of no practical
relevance for several decades?
If they're of no practical relevance, why provide them in the standard
library?

If they are of practical relevance, why quibble over details of
implementation?

Later,
Pillsy
Jon Harrop
2009-07-28 17:47:23 UTC
Permalink
Post by Pillsy
[...]
Post by Jon Harrop
Post by Pillsy
A. It implicitly depends on the assertion that exposing[1] low-level
constructs is a major impediment to a language's widespread adoption,
but C exposes an even lower-level detail than conses, in the form of
pointers, and doing anything useful in C without understanding
pointers is virtually impossible. This hasn't prevented C from
becoming vastly more widely used than Lisp or any of the functional
languages you often cite as somehow superseding it.
Pointers in C do the job well. Cons cells in Lisp do not.
Assumes facts not in evidence.
Pointers remain popular in systems programming languages. Cons died with
Lisp.
Post by Pillsy
Post by Jon Harrop
Post by Pillsy
B. It's extremely common for functional languages that aren't Lisp to
have conses as well,
No. They often use the name "cons" to refer to non-empty lists but they
are not cons cells in the Lisp sense.
This is not a particularly interesting difference, because it hinges
much more on the different approaches to typing taken by FPLs and Lisp
than anything else.
No, it has nothing whatsoever to do with the type system. There is nothing
special about the representation of a non-empty list in languages like SML,
OCaml, Haskell and F#. It is just another variant type constructor.
Post by Pillsy
Post by Jon Harrop
Post by Pillsy
If you don't have them, it makes functional programming a pain...
No. In all modern FPL implementations, lists are just another data
structure provided in the standard library.
And your point is?
Your statement was totally factually incorrect.
Post by Pillsy
Post by Jon Harrop
Post by Pillsy
People are likely to want both representations depending on the
context, so why not provide both and ensure that the various functions
people want to use on abstract lists work with both representations?
Why not forget about cons cells because they have been of no practical
relevance for several decades?
If they're of no practical relevance, why provide them in the standard
library?
Indeed. Get rid of them entirely as all modern FPLs already did.
--
Dr Jon D Harrop, Flying Frog Consultancy Ltd.
http://www.ffconsultancy.com/?u
Pillsy
2009-07-28 17:01:30 UTC
Permalink
[...]
Post by Pillsy
This is not a particularly interesting difference, because it hinges
much more on the different approaches to typing taken by FPLs and Lisp
than anything else.
*No, it has nothing whatsoever to do with the type system.* There is nothing
special about the representation of a non-empty list in languages like SML,
OCaml, Haskell and F#. *It is just another variant type constructor.*
Emphasis mine. Remind me why I should pay more attention to you than
you pay to yourself?

You still haven't explained why the details of how conses are
implemented (whether as variant types in an FPL, or as cons cells in
Lisp) is a shortcoming of Lisp. I would hardly be surprised if you
continue to not explain that, because it's such a self-evidently
idiotic position that I think even you wouldn't want to defend it.

Later,
Pillsy
[...]
Jon Harrop
2009-07-28 18:34:01 UTC
Permalink
Post by Pillsy
You still haven't explained why the details of how conses are
implemented (whether as variant types in an FPL, or as cons cells in
Lisp) is a shortcoming of Lisp.
They are an obsolete special case, i.e. historical baggage.
--
Dr Jon D Harrop, Flying Frog Consultancy Ltd.
http://www.ffconsultancy.com/?u
Peter Keller
2009-07-28 18:34:08 UTC
Permalink
Post by Jon Harrop
Post by Pillsy
You still haven't explained why the details of how conses are
implemented (whether as variant types in an FPL, or as cons cells in
Lisp) is a shortcoming of Lisp.
They are an obsolete special case, i.e. historical baggage.
So, why does the C++ STL have Pair? It is a special case of Vector. Why
not get rid of Pair from the STL and rewrite all of the code to use a
Vector instead?

If you can show how your argument works in this case, where the
fundamental aspects of the C++ language and its libraries don't utilize
the exposed pair the way they do in Scheme, maybe people would understand
better.

-pete
Jon Harrop
2009-07-28 22:18:12 UTC
Permalink
Post by Peter Keller
Post by Jon Harrop
Post by Pillsy
You still haven't explained why the details of how conses are
implemented (whether as variant types in an FPL, or as cons cells in
Lisp) is a shortcoming of Lisp.
They are an obsolete special case, i.e. historical baggage.
So, why does the C++ STL have Pair? It is a special case of Vector.
C++ is statically typed so a vector is a homogeneous collection where the
elements must be of the same type. There is no such restriction in a
dynamic language: the two elements of your vector may be of different types
such as the head element and tail list equivalent to a cons cell.
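For what it's worth, a minimal Common Lisp sketch of that idea (vcons, vcar
and vcdr are made-up names for illustration, not anyone's actual proposal):

(defun vcons (a d) (vector a d))   ; a "cons" as a two-element vector
(defun vcar (c) (svref c 0))
(defun vcdr (c) (svref c 1))

(vcons 1 (vcons 2 (vcons 3 nil)))                ; => #(1 #(2 #(3 NIL)))
(vcar (vcdr (vcons 1 (vcons 2 (vcons 3 nil)))))  ; => 2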
--
Dr Jon D Harrop, Flying Frog Consultancy Ltd.
http://www.ffconsultancy.com/?u
Peter Keller
2009-07-28 22:31:37 UTC
Permalink
Post by Jon Harrop
Post by Peter Keller
Post by Jon Harrop
Post by Pillsy
You still haven't explained why the details of how conses are
implemented (whether as variant types in an FPL, or as cons cells in
Lisp) is a shortcoming of Lisp.
They are an obsolete special case, i.e. historical baggage.
So, why does the C++ STL have Pair? It is a special case of Vector.
C++ is statically typed so a vector is a homogeneous collection where the
elements must be of the same type. There is no such restriction in a
dynamic language: the two elements of your vector may be of different types
such as the head element and tail list equivalent to a cons cell.
Homogeneous doesn't matter. I could just make a class TypedObject
definition, and everything inherits from that. Then I can represent all
known types from that object and using polymorphism embed functionally
distinct types into a Vector<TypedObject>.

To perform closure on the Vector, I could derive an OVector from TypedObject
which implements Vector. Then, using OVector and the base type of TypedObject,
I can embed OVectors within themselves. And, I didn't even go the easy
route of using union to do the type mashing.

So, again, no need for Pair. So why does it exist?

If you say "because it makes it easy without the large type engine" then
you know why Scheme has it.

-pete
Keith H Duggar
2009-07-29 17:28:45 UTC
Permalink
Post by Peter Keller
Post by Jon Harrop
Post by Peter Keller
Post by Jon Harrop
Post by Pillsy
You still haven't explained why the details of how conses are
implemented (whether as variant types in an FPL, or as cons cells in
Lisp) is a shortcoming of Lisp.
They are an obsolete special case, i.e. historical baggage.
So, why does the C++ STL have Pair? It is a special case of Vector.
C++ is statically typed so a vector is a homogeneous collection where the
elements must be of the same type. There is no such restriction in a
dynamic language: the two elements of your vector may be of different types
such as the head element and tail list equivalent to a cons cell.
Homogeneous doesn't matter. I could just make a class TypedObject
definition, and everything inherits from that. Then I can represent all
known types from that object and using polymorphism embed functionally
distinct types into a Vector<TypedObject>.
To perform closure on the Vector, I could derive an OVector from TypedObject
which implements Vector. Then, using OVector and the base type of TypedObject,
I can embed OVectors within themselves. And, I didn't even go the easy
route of using union to do the type mashing.
The "Everything is an Object" model sketched above (and used by
Java) has the following consequences in C++: 1) one is forced to use
type casting to use the contained objects as derived objects, 2)
it adds runtime overhead, 3) it adds memory overhead, 4) it prevents
one from utilizing a wide range of static type checking, 5) etc.

So, to provide statically checked, maximally efficient (in both
space and speed) heterogeneous containers in the general case (i.e.
not involving special cases such as when both types have equal
sizeof), std::pair, boost::tuple, or the like is needed.
Post by Peter Keller
So, again, no need for Pair. So why does it exist?
It or something equivalent is needed for the general case; see
above. For the special case of a typical cons cell, where both objects
have the same size, we can use a union along with either dynamic-size
arrays such as std::vector or more efficient static-size
arrays such as boost::array.
Post by Peter Keller
If you say "because it makes it easy without the large type
engine" then you know why Scheme has it.
I'm not sure what that means.

KHD
Jon Harrop
2009-07-29 19:42:19 UTC
Permalink
Post by Peter Keller
Post by Jon Harrop
Post by Peter Keller
Post by Jon Harrop
Post by Pillsy
You still haven't explained why the details of how conses are
implemented (whether as variant types in an FPL, or as cons cells in
Lisp) is a shortcoming of Lisp.
They are an obsolete special case, i.e. historical baggage.
So, why does the C++ STL have Pair? It is a special case of Vector.
C++ is statically typed so a vector is a homogeneous collection where the
elements must be of the same type. There is no such restriction in a
dynamic language: the two elements of your vector may be of different
types such as the head element and tail list equivalent to a cons cell.
Homogeneous doesn't matter. I could just make a class TypedObject
definition, and everything inherits from that. Then I can represent all
known types from that object and using polymorphism embed functionally
distinct types into a Vector<TypedObject>.
That means that Pair<Object, Object> is a special case of Vector<Object>
with two elements. However, that is not the same as saying that Pair<T,U> is a
special case of Vector<?> for any ? that is valid in the C++ type system, because
not everything is derived from Object in C++.

For example, Pair<int, double> is not a special case of Vector<?> for
any "?".
--
Dr Jon D Harrop, Flying Frog Consultancy Ltd.
http://www.ffconsultancy.com/?u
Pillsy
2009-07-28 18:50:48 UTC
Permalink
Post by Jon Harrop
Post by Pillsy
You still haven't explained why the details of how conses are
implemented (whether as variant types in an FPL, or as cons cells in
Lisp) is a shortcoming of Lisp.
They are an obsolete special case, i.e. historical baggage.
Replacing cons cells with something equivalent (instances of a
STRUCTURE-CLASS would be the most obvious approach in Common Lisp) and
propagating the needed changes through the rest of the library of
functions would do nothing to address any of the alleged problems that
Xah Lee raises. Nor would it solve any other problems I can see,
beyond your bare assertion that they ought to be discarded as
"historical baggage".

Later,
Pillsy
Jon Harrop
2009-07-28 22:19:12 UTC
Permalink
Post by Pillsy
Post by Jon Harrop
Post by Pillsy
You still haven't explained why the details of how conses are
implemented (whether as variant types in an FPL, or as cons cells in
Lisp) is a shortcoming of Lisp.
They are an obsolete special case, i.e. historical baggage.
Replacing cons cells with something equivalent (instances of a
STRUCTURE-CLASS would be the most obvious approach in Common Lisp) and
propagating the needed changes through the rest of the library of
functions would do nothing to address any of the alleged problems that
Xah Lee raises. Nor would it solve any other problems I can see,
beyond your bare assertion that they ought to be discarded as
"historical baggage".
The benefit is simplicity.
--
Dr Jon D Harrop, Flying Frog Consultancy Ltd.
http://www.ffconsultancy.com/?u
Pillsy
2009-07-29 18:00:35 UTC
Permalink
[...]
Post by Jon Harrop
Post by Pillsy
Replacing cons cells with something equivalent (instances of a
STRUCTURE-CLASS would be the most obvious approach in Common Lisp) and
propagating the needed changes through the rest of the library of
functions would do nothing to address any of the alleged problems that
Xah Lee raises. Nor would it solve any other problems I can see,
beyond your bare assertion that they ought to be discarded as
"historical baggage".
The benefit is simplicity.
What simplicity does the user gain? Everything that they could do with
the old CONS they can do with the new CONS, which means they have to
understand the new CONS just as well as they had to understand the old
CONS (however well that is) to productively program in Lisp. The
*only* difference is that now

(type-of (class-of (cons 1 nil))) => STRUCTURE-CLASS

instead of

(type-of (class-of (cons 1 nil))) => BUILT-IN-CLASS

but how is that any simpler for the user?
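For concreteness, here is a rough Common Lisp sketch of the kind of
structure-based replacement being discussed (PAIR and NEW-CONS are
hypothetical names for illustration, not an actual proposal):

(defstruct (pair (:constructor new-cons (head tail)))  ; a struct playing the role of CONS
  head tail)

(new-cons 1 nil)                           ; => #S(PAIR :HEAD 1 :TAIL NIL)
(pair-head (new-cons 1 (new-cons 2 nil)))  ; => 1
(type-of (class-of (new-cons 1 nil)))      ; => STRUCTURE-CLASS, in a typical implementation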

Later,
Pillsy
Jon Harrop
2009-07-29 19:36:58 UTC
Permalink
Post by Pillsy
What simplicity does the user gain? Everything that they could do with
the old CONS they can do with the new CONS,
Yes.
Post by Pillsy
which means they have to
understand the new CONS just as well as they had to understand the old
CONS (however well that is) to productively program in Lisp.
No. They now have to understand only vectors and not both cons and vectors.
Post by Pillsy
The *only* difference is that now
(type-of (class-of (cons 1 nil))) => STRUCTURE-CLASS
instead of
(type-of (class-of (cons 1 nil))) => BUILT-IN-CLASS
* (type-of (class-of #(1 2)))

BUILT-IN-CLASS
--
Dr Jon D Harrop, Flying Frog Consultancy Ltd.
http://www.ffconsultancy.com/?u
Peter Keller
2009-07-28 17:07:16 UTC
Permalink
[snip; "them" here means cons cells]
Post by Jon Harrop
Post by Pillsy
If they're of no practical relevance, why provide them in the standard
library?
Indeed. Get rid of them entirely as all modern FPLs already did.
So, the obvious solution to this conundrum: why doesn't one of the
people who are stating that the cons cell should be removed from the language
implement a small dialect of scheme which in fact does just that? With
an actual kernel of a language people can download and try out, a better
choice can be made. The small dialect doesn't need the full monty of all
of the features of scheme behind it, just enough to show whether it would
or wouldn't be a good idea.

-pete
r***@gmail.com
2019-07-12 20:29:56 UTC
Permalink
I'm a decade late in joining the party, but I think Dick Gabriel answers the original question better than anybody. And he's in a position to know:

http://www.dreamsongs.net/WorseIsBetter.html
r***@gmail.com
2019-07-12 20:38:57 UTC
Permalink
I'm a decade late in joining the party, but I think Dick Gabriel answers the original question better than anybody in his essay "Lisp: Good News, Bad News, How to Win Big". And he's in a position to know:

https://www.dreamsongs.com/WIB.html
