Discussion:
"Java is the SUV of programming tools"
Kent Paul Dolan
2003-09-22 23:49:36 UTC
Permalink
http://blogs.law.harvard.edu/philg/2003/09/20#a1762
Sigh. As someone who came up through the computer graphics
community from when drawing meant creating your own bits one
by one to make an image, that fellow is _so_ out of line.

Java is the most productive out of the 12 dozen programming
languages I've learned, and mostly it's because so many wheels
I used to have to invent anew in each language are now library
calls.

xanthian.
--
Posted via Mailgate.ORG Server - http://www.Mailgate.ORG
Jason Voegele
2003-09-23 13:24:41 UTC
Permalink
Post by Kent Paul Dolan
http://blogs.law.harvard.edu/philg/2003/09/20#a1762
[snip]
Post by Kent Paul Dolan
Java is the most productive out of the 12 dozen programming
languages I've learned, and mostly it's because so many wheels
I used to have to invent anew in each language are now library
calls.
Have you used any of:

Smalltalk
Ruby
Python
Lisp
Eiffel
Perl
?

If so, do you really feel that Java is more productive?

Ruby programs, for example, are typically less than half the size of their
Java equivalents. In some cases Ruby code can be less than 10% of the size
of the Java code. Much the same can be said for the other languages I've
listed, even if not as extreme in some cases.
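[Editor's note: a hypothetical illustration of the size claim above, not from the original post. A word-frequency count in Java, with a rough Ruby equivalent of the loop shown in a comment:]

```java
import java.util.Map;
import java.util.TreeMap;

public class WordCount {
    // The loop below is roughly one line of Ruby:
    //   text.split.each { |w| counts[w] = counts.fetch(w, 0) + 1 }
    static Map<String, Integer> wordCounts(String text) {
        Map<String, Integer> counts = new TreeMap<String, Integer>();
        for (String word : text.split("\\s+")) {
            Integer n = counts.get(word);
            counts.put(word, n == null ? 1 : n + 1);
        }
        return counts;
    }

    public static void main(String[] args) {
        System.out.println(wordCounts("the quick brown fox and the lazy dog"));
    }
}
```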
--
Jason Voegele
"We believe that we invent symbols. The truth is that they invent us."
-- Gene Wolfe, The Book of the New Sun
DaiIchi
2003-09-27 05:44:37 UTC
Permalink
Language snobs. Sheesh. Java is a great general purpose language.
Kinda slow on the startup... but very flexible. A Java programmer
learns a skill that he can apply to multiple situations: he can build
web pages (ala JSP), server code, client code, phone code... with very
little relearning to do. Try to do that with Perl.

BTW, a good site to compare languages is :
http://www.bagley.org/~doug/shootout/ you quickly see that in many
cases, Java isn't the fastest--and it isn't the slowest. But it beats
out Ruby and Perl handily in many cases. Java code has a smaller
footprint and uses less memory. Go figure.
Post by Jason Voegele
Post by Kent Paul Dolan
http://blogs.law.harvard.edu/philg/2003/09/20#a1762
[snip]
Post by Kent Paul Dolan
Java is the most productive out of the 12 dozen programming
languages I've learned, and mostly it's because so many wheels
I used to have to invent anew in each language are now library
calls.
Smalltalk
Ruby
Python
Lisp
Eiffel
Perl
?
If so, do you really feel that Java is more productive?
Ruby programs, for example, are typically less than half the size of their
Java equivalents. In some cases Ruby code can be less than 10% of the size
of the Java code. Much the same can be said for the other languages I've
listed, even if not as extreme in some cases.
Martin Drautzburg
2003-09-27 09:26:05 UTC
Permalink
Post by DaiIchi
http://www.bagley.org/~doug/shootout/ you quickly see that in many
cases, Java isn't the fastest--and it isn't the slowest. But it beats
out Ruby and Perl handily in many cases. Java code has a smaller
footprint and uses less memory. Go figure.
This site does not talk about productivity at all, with the minor
exception of listing lines of code, and in that respect Java does
not exactly shine. The benchmarks are typical micro-benchmarks, and
it is well known that Java does a decent job on those.

In my experience the single most important factor in achieving good
performance is maintainable, elegant code (i.e. programmer
performance) and NOT a good micro-benchmark rating.

Also neither Smalltalk nor Lisp is among the languages compared.

Then again I agree that Java is an acceptable general purpose
language. But considering what the computing world knew when Java was
invented it was not a big leap (at least not towards the cutting
edge).
Isaac Gouy
2003-09-27 20:36:54 UTC
Permalink
Post by Martin Drautzburg
Also neither Smalltalk nor Lisp is among the languages compared
Several Lisps and Schemes are represented (for at least some tests).
In fact, Stalin and Bigloo (Scheme) and Common Lisp are up with C on
one test http://www.bagley.org/~doug/shootout/bench/ary3/

I don't know whether any Smalltalk contributions were sent to Doug
Bagley; they have been provided to the Win32 Shootout - maybe one day
they'll appear.

The main problem has been that these tests are intended to be like
command-line utilities (originally it was a comparison of scripting
languages), and Smalltalk implementations haven't focused on this
kind of use. (Yes, it can be done, but the build process is more
involved.)
James A. Robertson
2003-09-27 14:05:00 UTC
Permalink
Post by DaiIchi
Language snobs. Sheesh. Java is a great general purpose language.
Kinda slow on the startup... but very flexible. A Java programmer
learns a skill that he can apply to multiple situations: he can build
web pages (ala JSP), server code, client code, phone code... with very
little relearning to do. Try to do that with Perl.
What you say above is true of Smalltalk, and Lisp, and Python, and
Ruby. I think you've found the language snobs, but not where you
thought...
Post by DaiIchi
http://www.bagley.org/~doug/shootout/ you quickly see that in many
cases, Java isn't the fastest--and it isn't the slowest. But it beats
out Ruby and Perl handily in many cases. Java code has a smaller
footprint and uses less memory. Go figure.
Look here:

http://lists.squeakfoundation.org/pipermail/seaside/2003-September/002132.html

smaller footprint and less memory? Certainly not the former, and I
doubt the latter.
Post by DaiIchi
Post by Jason Voegele
Post by Kent Paul Dolan
http://blogs.law.harvard.edu/philg/2003/09/20#a1762
[snip]
Post by Kent Paul Dolan
Java is the most productive out of the 12 dozen programming
languages I've learned, and mostly it's because so many wheels
I used to have to invent anew in each language are now library
calls.
Smalltalk
Ruby
Python
Lisp
Eiffel
Perl
?
If so, do you really feel that Java is more productive?
Ruby programs, for example, are typically less than half the size of their
Java equivalents. In some cases Ruby code can be less than 10% of the size
of the Java code. Much the same can be said for the other languages I've
listed, even if not as extreme in some cases.
<Talk Small and Carry a Big Class Library>
James Robertson, Product Manager, Cincom Smalltalk
http://www.cincomsmalltalk.com/blog/blogView
Isaac Gouy
2003-09-27 20:21:27 UTC
Permalink
***@googlenews.test.xhome.us (DaiIchi) wrote
-SNIP-
Post by DaiIchi
http://www.bagley.org/~doug/shootout/ you quickly see that in many
cases, Java isn't the fastest--and it isn't the slowest. But it beats
out Ruby and Perl handily in many cases. Java code has a smaller
footprint and uses less memory. Go figure.
That site is no longer updated.
The Win32 Shootout does get updated sometimes:
http://dada.perl.it/shootout/
and includes other languages (C#, Ada, ...)
Isaac Gouy
2003-09-23 14:43:35 UTC
Permalink
Post by Kent Paul Dolan
Java is the most productive out of the 12 dozen programming
languages I've learned, and mostly it's because so many wheels
I used to have to invent anew in each language are now library
calls.
That's one aspect: it's why I first used Smalltalk, and why Perl,
Python (and still Fortran), and others are popular - the code
libraries.

The conversation about productivity is quite different when you have
to write new code or maintain old code. Then, it's the language itself
that matters.
Tom Welsh
2003-09-23 15:36:14 UTC
Permalink
Post by Kent Paul Dolan
http://blogs.law.harvard.edu/philg/2003/09/20#a1762
Sigh. As someone who came up through the computer graphics
community from when drawing meant creating your own bits one
by one to make an image, that fellow is _so_ out of line.
Java is the most productive out of the 12 dozen programming
languages I've learned, and mostly it's because so many wheels
I used to have to invent anew in each language are now library
calls.
Right on.

Btw, since when have Harvard lawyers become experts on professional
programming techniques?
--
Tom Welsh
Martin Drautzburg
2003-09-26 07:05:05 UTC
Permalink
Post by Kent Paul Dolan
Java is the most productive out of the 12 dozen programming
languages I've learned, and mostly it's because so many wheels
I used to have to invent anew in each language are now library
calls.
Can you name a few of the 12 ? I'd really like to know.

No doubt someone will post a collection of languages that she claims
are more productive than Java. It'll be interesting to compare these
lists.
Kent Paul Dolan
2003-09-26 16:07:49 UTC
Permalink
Post by Martin Drautzburg
Post by Kent Paul Dolan
Java is the most productive out of the 12 dozen programming
languages I've learned, and mostly it's because so many wheels
I used to have to invent anew in each language are now library
calls.
Can you name a few of the 12 ? I'd really like to know.
Read that again.

xanthian.
--
Posted via Mailgate.ORG Server - http://www.Mailgate.ORG
Isaac Gouy
2003-09-27 20:47:13 UTC
Permalink
Post by Kent Paul Dolan
Post by Martin Drautzburg
Post by Kent Paul Dolan
Java is the most productive out of the 12 dozen programming
languages I've learned, and mostly it's because so many wheels
I used to have to invent anew in each language are now library
calls.
Can you name a few of the 12 ? I'd really like to know.
Read that again.
Hmmm, learned 144 languages?

Takes me at least a year of daily use before I start to think I've
learned a language (and usually another year realizing what I hadn't
learned).

Guess there are different ideas about when something's been learnt.
Kent Paul Dolan
2003-09-27 21:41:00 UTC
Permalink
Post by Isaac Gouy
Hmmm, learned 144 languages?
Yes. That's no big deal, though, I started in 1961.
Post by Isaac Gouy
Takes me at least a year of daily use before I start to think I've
learned a language (and usually another year realizing what I hadn't
learned).
Same here, but it's quite possible to be learning three or four at the
same time. I managed seven my first year at one job, out of necessity,
they were already part of the software package whose upkeep I accepted
with the job, along with six others I already knew. One I learned from
scratch on a trans-Pacific airplane flight, and put to use when I
stepped off the plane; having a laptop along helped.
Post by Isaac Gouy
Guess there are different ideas about when somethings been learnt.
I'll freely admit to mostly learning just the kernel parts of each
language. All imperative programming languages are pretty much alike,
once you realize the small set of things each of them must do. The rest
you get by reading the manual a couple of times to know that they are
present, then go back and "work from the book" those few times you need
a specialization of a particular language.

xanthian.
--
Posted via Mailgate.ORG Server - http://www.Mailgate.ORG
Isaac Gouy
2003-09-28 14:40:37 UTC
Permalink
Post by Kent Paul Dolan
I'll freely admit to mostly learning just the kernel parts of each
language. All imperative programming languages are pretty much alike,
once you realize the small set of things each of them must do. The rest
you get by reading the manual a couple of times to know that they are
present, then go back and "work from the book" those few times you need
a specialization of a particular language.
Know what you mean - on the other hand it's when I start to think
differently about programming that it seems I've learned, rather than
when I can program language X as though it were language Y.

You could give a short-form answer to Martin by answering the question
Jason originally put:

"Have you used any of:
Smalltalk
Ruby
Python
Lisp
Eiffel
Perl

If so, do you really feel that Java is more productive?"
Kent Paul Dolan
2003-09-28 19:52:23 UTC
Permalink
Post by Jason Voegele
Smalltalk
Ruby
Python
Lisp
Eiffel
Perl
I've used Lisp and Perl, the latter with much more
expertise; I _have_ Ruby and Python, have not yet used
either; but the forced format of Python repels me, and I
don't yet know enough about Ruby to comment. However, I'm
not a fan of terseness for terseness' sake: I happen to be
a reasonably fast touch typist. I prefer a language that
any technically competent person can desk check with me.
Languages like Perl, Forth, C, and APL are famous for the
obscurity of code you can write with them, and I have. The
rest of that story is that they make programming in the
large a pain because only the original author can make
sense of all but the most deliberately self-documenting
code.

I'll probably never try Smalltalk: languages that promote
sloppiness by making nearly every statement execute whether
it makes sense as written or not have highly negative
effects on my productivity; I prefer stricter languages
like Ada, Pascal, or Modula-2 that catch most of my logic
errors at compile time.
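[Editor's note: an editor-supplied sketch of the compile-time protection being described. The first call compiles and runs; the commented-out calls are rejected by javac before the program can ever run:]

```java
public class CompileTimeCheck {
    static int doubleIt(int n) { return n * 2; }

    public static void main(String[] args) {
        System.out.println(doubleIt(21));
        // Both of the following are caught at compile time, not at run time:
        // doubleIt("twenty-one");   // error: incompatible types
        // String s = doubleIt(1);   // error: int cannot be converted to String
    }
}
```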

Eiffel I admire from a distance; programming by contract is
very attractive, and in some other language whose name I've
since forgotten I've programmed with preconditions and
postconditions with good success.

However, an extremely offputting and shrill Eiffel advocate
spamming comp.lang.ada turned me off to the language for
political rather than technical reasons, much like the C#
proponents here.

In comparison to all the languages in that kit which I
_have_ used, and to a long list of others I'm sure I could
no longer reproduce accurately, Java is most productive for
programming tasks of the type I do today, heavy in human
interface construction and complex data structures with lots
of objects of short lifespans; for pure straightforward
math, on the other hand, I don't think APL can be beaten for
productivity and fit to the task at hand.

The sidebar to put a limit on how seriously this should all
be taken is that I'm strictly an imperative language
programmer; I have no "hands-on" knowledge of the
productivity of functional or equational languages in
current use. Also, there are literally thousands of
computer programming languages, and so my experience is
still just a tiny fraction of all there is to know.

xanthian.
--
Posted via Mailgate.ORG Server - http://www.Mailgate.ORG
James A. Robertson
2003-09-28 21:17:30 UTC
Permalink
On Sun, 28 Sep 2003 19:52:23 +0000 (UTC), "Kent Paul Dolan"
Post by Kent Paul Dolan
Post by Jason Voegele
Smalltalk
Ruby
Python
Lisp
Eiffel
Perl
I've used Lisp and Perl, the latter with much more
expertise; I _have_ Ruby and Python, have not yet used
either; but the forced format of Python repels me, and I
In what way is a tab different from a curly brace? Both are single
characters, and - arguably - a tab is better at conveying semantic
meaning to the person reading the code.
Post by Kent Paul Dolan
don't yet know enough about Ruby to comment. However, I'm
not a fan of terseness for terseness' sake: I happen to be
a reasonably fast touch typist. I prefer a language that
any technically competent person can desk check with me.
Languages like Perl, Forth, C, and APL are famous for the
obscurity of code you can write with them, and I have. The
rest of that story is that they make programming in the
large a pain because only the original author can make
sense of all but the most deliberately self-documenting
code.
I'll probably never try Smalltalk, languages that promote
sloppiness by making nearly every statement execute whether
it makes sense as written or not have highly negative
You have no idea what you are talking about here. None, zero, zip.
If you send a message that is not understood in Smalltalk, you get a
well formed exception. Take C++ or C (please) - you'll actually get
an attempt to execute the blasted thing, followed by who knows what
kind of nasty blowback.

Yeah, static typing helps that problem loads. Not to mention that -
in 10 years of Smalltalk development - I've had an MNU in a deployed
application only a handful of times. I've had about as many
NullPointerExceptions in deployed Java apps I've used, and far, far
more errors that cause crashes in C/C++ apps.

So what it boils down to is, you don't know Smalltalk, you refuse to
look at it, but you are convinced that it's a bad thing. Hmm....
Post by Kent Paul Dolan
effects on my productivity; I prefer the stricter languages
like Ada, Pascal, or Modula 2 that catch most of my logic
errors at compile time.
Eiffel I admire from a distance, programming by contract is
very attractive, and in some other language whose name I've
since forgotten. I've programmed with preconditions and
postconditions with good success.
However, an extremely offputting and shrill Eiffel advocate
spamming comp.lang.ada turned me off to the language for
political rather than technical reasons, much like the C#
proponents here.
In comparison to all the languages in that kit which I
_have_ used, and to a long list of others I'm sure I could
no longer reproduce accurately, Java is most productive for
programming tasks of the type I do today, heavy in human
interface construction and complex data structures with lots
of objects of short lifespans; for pure straightforward
math, on the other hand, I don't think APL can be beaten for
productivity and fit to the task at hand.
The sidebar to put a limit on how seriously this should all
be taken is that I'm strictly an imperative language
programmer; I have no "hands-on" knowledge of the
productivity of functional or equational languages in
current use. Also, there are literally thousands of
computer programming languages, and so my experience is
still just a tiny fraction of all there is to know.
xanthian.
<Talk Small and Carry a Big Class Library>
James Robertson, Product Manager, Cincom Smalltalk
http://www.cincomsmalltalk.com/blog/blogView
Kent Paul Dolan
2003-09-29 02:09:04 UTC
Permalink
Post by Kent Paul Dolan
the forced format of Python repels me
In what way is a tab different from a curly brace? Both
are single characters, and - arguably - a tab is better at
conveying semantic meaning to the person reading the code.
Why do you try to put words in my mouth? I never made the
claim the issue was tab versus curly brace, yet in your
ignorance, that's all you can imagine. The issue is as
stated: a forced choice of format. I started my programming
career when fixed format languages were the only kind
around. I have no interest in going backwards to crippled
technology.

I format my own code in a two dimensional style that uses
many times the whitespace of the usual programmer's style,
and the whitespace is a resource to me used to make my code
better self-documenting; I have no more interest in
surrendering that resource to a language designer's
idiosyncratic taste than I have love for case-insensitive
languages which nevertheless insist on uppercasing keywords
where my own usage is the opposite.
Post by Kent Paul Dolan
I'll probably never try Smalltalk, languages that promote
sloppiness by making nearly every statement execute whether
it makes sense as written or not have highly negative
You have no idea what you are talking about here. None,
zero, zip.
God I love it when the victim _runs_ into the trap.

A clueful person might have stopped to think that someone
with 42 years of programming experience might have read just
a bit about Smalltalk in all those years; might have
subscribed to SIGPLAN and SIGSOFT and TOPLAS and Byte for a
number of those decades. I've put Smalltalk on systems I
control several times, but it's never called out to be used.
If you send a message that is not understood in Smalltalk,
you get a well formed exception.
Which, of course, is exactly the contrapositive of the
problem area; your naiveté about programming is showing.

The problem with your hobby horse is that call compatibility
is defined at the method level rather than at the class
level, and so a call ("message") that matches the method
signature but was never intended for an object of that
class in the current semantic context is blithely and
silently executed, without the protection of static typing
to highlight the class mismatch error at compile time, and
with no protection at all at run time, and disaster typically
ensues swiftly. Smalltalk is hardly the only dynamically
typed language, after all, the problem is common to the
breed.
Take C++ or C (please) - you'll actually get an attempt to
execute the blasted thing, followed by who knows what kind
of nasty blowback.
Exactly the case with Smalltalk as described above.
So what it boils down to is, you don't know Smalltalk, you
refuse to look at it, but you are convinced that it's a
bad thing. Hmm....
Again, you are made a fool of by going on assumptions which
are all your feeble imagination can produce, instead of
thinking before you type. Thank you for this entertaining
example of why True Believers act as punching dummies in
newsgroups dedicated to competing technologies.

Have a nice rant in reply. I might even answer, if you say
anything intelligent, but I'm not holding my breath.

xanthian.
--
Posted via Mailgate.ORG Server - http://www.Mailgate.ORG
James A. Robertson
2003-09-29 04:20:31 UTC
Permalink
On Mon, 29 Sep 2003 02:09:04 +0000 (UTC), "Kent Paul Dolan"
Post by Kent Paul Dolan
Post by Kent Paul Dolan
the forced format of Python repels me
In what way is a tab different from a curly brace? Both
are single characters, and - arguably - a tab is better at
conveying semantic meaning to the person reading the code.
Why do you try to put words in my mouth? I never made the
claim the issue was tab versus curly brace, yet in your
ignorance, that's all you can imagine. The issue is as
stated: a forced choice of format. I started my programming
career when fixed format languages were the only kind
around. I have no interest in going backwards to crippled
technology.
And you haven't answered my question. So you have to indent - in a
way that adds semantic meaning. This is bad how? In that it disturbs
your sense of esthetics somehow?
Post by Kent Paul Dolan
I format my own code in a two dimensional style that uses
many times the whitespace of the usual programmer's style,
and the whitespace is a resource to me used to make my code
better self-documenting; I have no more interest in
surrendering that resource to a language designer's
idiosyncratic taste than I have love for case-insensitive
languages which nevertheless insist on uppercasing keywords
where my own usage is the opposite.
Post by Kent Paul Dolan
I'll probably never try Smalltalk, languages that promote
sloppiness by making nearly every statement execute whether
it makes sense as written or not have highly negative
You have no idea what you are talking about here. None,
zero, zip.
God I love it when the victim _runs_ into the trap.
A clueful person might have stopped to think that someone
with 42 years of programming experience might have read just
a bit about Smalltalk in all those years; might have
subscribed to SIGPLAN and SIGSOFT and TOPLAS and Byte for a
number of those decades. I've put Smalltalk on systems I
control several times, but it's never called out to be used.
I might, but <I'm talking to you>. Your ignorance shines through
quite strongly.
Post by Kent Paul Dolan
If you send a message that is not understood in Smalltalk,
you get a well formed exception.
Which, of course, is exactly the contrapositive of the
problem area; your naiveté about programming is showing.
The problem with your hobby horse is that call compatibility
is defined at the method level rather than at the class
level, and so a call ("message") that matches the method
signature but was never intended for an object of that
class in the current semantic context is blithely and
silently executed, without the protection of static typing
to highlight the class mismatch error at compile time, and
with no protection at all at run time, and disaster typically
ensues swiftly. Smalltalk is hardly the only dynamically
typed language, after all, the problem is common to the
breed.
No, in fact it's not executed; thus your ignorance. The message is
sent to the object; it's not found. An exception is raised; the
exception is well understood, and can be handled. As a matter of
fact, for building proxy objects (to remote systems or database
objects), this is quite a powerful possibility; the proxy doesn't
understand much, but it grabs the MNU and forwards the message to the
actual intended recipient.
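[Editor's note: the forwarding-proxy idea James describes can be sketched in Java with a dynamic proxy - an editor-supplied analogy, not Smalltalk's actual doesNotUnderstand: mechanism; the interface and class names are hypothetical:]

```java
import java.lang.reflect.InvocationHandler;
import java.lang.reflect.Method;
import java.lang.reflect.Proxy;

interface Service {
    String greet(String name);
}

class RealService implements Service {
    public String greet(String name) { return "hello, " + name; }
}

public class ForwardingProxy {
    // Wrap a target in a proxy that intercepts every call it receives
    // and forwards it on to the real recipient, much as a Smalltalk
    // proxy can catch the MNU and resend the message.
    static Service wrap(final Service target) {
        return (Service) Proxy.newProxyInstance(
                Service.class.getClassLoader(),
                new Class<?>[] { Service.class },
                new InvocationHandler() {
                    public Object invoke(Object proxy, Method m, Object[] args)
                            throws Throwable {
                        // forward the "message" to the intended recipient
                        return m.invoke(target, args);
                    }
                });
    }

    public static void main(String[] args) {
        Service s = wrap(new RealService());
        System.out.println(s.greet("world"));
    }
}
```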

And this lack of static typing is not a problem; it's a feature. IME,
I got far, far more type errors in statically typed languages than I
do in dynamic ones. Why? Because the system forces me to remember
useless information.
Post by Kent Paul Dolan
Take C++ or C (please) - you'll actually get an attempt to
execute the blasted thing, followed by who knows what kind
of nasty blowback.
Exactly the case with Smalltalk as described above.
Nope. Not even close. Thus, your ignorance of the subject. In C or
C++, the system attempts to execute the message, and blows up
(typically with a core dump) in the attempt. In Smalltalk, the system
is robust - it handles the situation. Cleanly.
Post by Kent Paul Dolan
So what it boils down to is, you don't know Smalltalk, you
refuse to look at it, but you are convinced that it's a
bad thing. Hmm....
Again, you are made a fool of by going on assumptions which
are all your feeble imagination can produce, instead of
thinking before you type. Thank you for this entertaining
example of why True Believers act as punching dummies in
newsgroups dedicated to competing technologies.
No, based on your post I was spot on. You have no clue how Smalltalk
works. You don't know a thing about method lookup in it, and you
don't understand the exception handling system <at all>. But you like
to think you do.
Post by Kent Paul Dolan
Have a nice rant in reply. I might even answer, if you say
anything intelligent, but I'm not holding my breath.
Try reading something about Smalltalk first, please.
Post by Kent Paul Dolan
xanthian.
<Talk Small and Carry a Big Class Library>
James Robertson, Product Manager, Cincom Smalltalk
http://www.cincomsmalltalk.com/blog/blogView
Kent Paul Dolan
2003-09-29 13:58:05 UTC
Permalink
Post by James A. Robertson
No, in fact it's not executed; thus your ignorance. The message is
sent to the object; it's not found.
I'm sorry, I'm afraid dealing with deliberate stupidity on your part is
beyond me. That is not the scenario I described.

xanthian.
--
Posted via Mailgate.ORG Server - http://www.Mailgate.ORG
James A. Robertson
2003-09-29 15:23:29 UTC
Permalink
On Mon, 29 Sep 2003 13:58:05 +0000 (UTC), "Kent Paul Dolan"
Post by Kent Paul Dolan
Post by James A. Robertson
No, in fact it's not executed; thus your ignorance. The message is
sent to the object; it's not found.
I'm sorry, I'm afraid dealing with deliberate stupidity on your part is
beyond me. That is not the scenario I described.
You said:

"> Take C++ or C (please) - you'll actually get an attempt to
Post by Kent Paul Dolan
execute the blasted thing, followed by who knows what kind
of nasty blowback.
Exactly the case with Smalltalk as described above."


It's not even close to the same situation as C/C++. As I said in my
post:

-- you don't understand Smalltalk method lookup semantics
-- you don't understand Smalltalk exception semantics

And yet, you <think> you do, and are willing to speak with assumed
authority on the subject.

Even worse, when called on your ignorance of the subject, you pretend
that the other person is ignorant and not worth arguing with. No
wonder you are unemployed; you must be a real joy to have on a
development team.....
Post by Kent Paul Dolan
xanthian.
<Talk Small and Carry a Big Class Library>
James Robertson, Product Manager, Cincom Smalltalk
http://www.cincomsmalltalk.com/blog/blogView
Kent Paul Dolan
2003-09-29 16:33:21 UTC
Permalink
Post by Kent Paul Dolan
Post by Kent Paul Dolan
Post by James A. Robertson
No, in fact it's not executed; thus your ignorance. The message is
sent to the object; it's not found.
I'm sorry, I'm afraid dealing with deliberate stupidity on your part is
beyond me. That is not the scenario I described.
Take C++ or C (please) - you'll actually get an attempt to
execute the blasted thing, followed by who knows what kind
of nasty blowback.
Exactly the case with Smalltalk as described above.
It's not even close to the same situation as C/C++.
Nor did I claim so; had you bothered to _read_ the "as
described above", you would have noticed that I had turned
your bogus exoneration of Smalltalk around to point out the
situation where Smalltalk's vulnerabilities lie, which, when
provoked, cause exactly the same situation as with C/C++:
code is executed that has no business _being_ executed, and
the application goes crashy burny bye bye.

In the case of C/C++, that is because all function pointers
are interchangeable; in the case of Smalltalk, that is
because all similarly-signatured methods are interchangeable;
in each case, the weak typing allows entities that match by
coincidence the form of the entity the programmer actually
intended to address to be used in inappropriate situations
leading to program failure, in both cases, stronger typing
in better programming languages provides a way to catch that
kind of error at compile time.
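[Editor's note: Kent's scenario - a message that matches a method's signature but was never meant for that receiver's class - can be sketched in Java, using reflection to stand in for dynamic dispatch; the classes here are hypothetical:]

```java
import java.lang.reflect.Method;

class Gun   { public String fire() { return "bang"; } }
class Clerk { public String fire() { return "dismissed"; } }

public class SignatureClash {
    // Dispatch by name and signature alone, with no compile-time check
    // that the receiver is of the class the caller actually intended.
    static String send(Object receiver, String message) throws Exception {
        Method m = receiver.getClass().getMethod(message);
        return (String) m.invoke(receiver);
    }

    public static void main(String[] args) throws Exception {
        System.out.println(send(new Gun(), "fire"));   // intended receiver
        System.out.println(send(new Clerk(), "fire")); // coincidental match; runs anyway
    }
}
```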

As I said, I am simply incompetent to deal with you when
you are being deliberately stupid. Your attempts to
obfuscate Smalltalk's very obvious liabilities are every
bit the equivalent of YGBKM's similar intellectual
dishonesty on behalf of Microsoft.

It is far past evident that you are a full blown True
Believer that strong, static typing has nothing to offer
you. Having programmed extensively in both modes, I believe
I know better from experience, and I certainly know what
works better for me as a programmer of large (100M SLOC)
team-built software systems. I have no misimpression
that your mind is open to facts contradicting your
essentially illogical faith-based beliefs, and will not
waste the time trying to feed them to you.

xanthian.
--
Posted via Mailgate.ORG Server - http://www.Mailgate.ORG
James A. Robertson
2003-09-29 17:09:05 UTC
Permalink
On Mon, 29 Sep 2003 16:33:21 +0000 (UTC), "Kent Paul Dolan"
<snip>
Post by Kent Paul Dolan
Post by James A. Robertson
It's not even close to the same situation as C/C++.
Nor did I claim so; had you bothered to _read_ the "as
described above", you would have noticed that I had turned
your bogus exoneration of Smalltalk around to point out the
situation where Smalltalk's vulnerabilities lie, which, when
code is executed that has no business _being_ executed, and
the application goes crashy burny bye bye.
In the case of C/C++, that is because all function pointers
are interchangeable; in the case of Smalltalk, that is
because all similarly-signatured methods are interchangeable;
in each case, the weak typing allows entities that match by
coincidence the form of the entity the programmer actually
intended to address to be used in inappropriate situations
leading to program failure, in both cases, stronger typing
in better programming languages provides a way to catch that
kind of error at compile time.
Ok, it's worse than I thought; you don't know strong typing from weak
typing. Smalltalk is strongly, but dynamically, typed. C and C++ are
weakly, but statically, typed. In C or C++ the system actually tries
to execute the non-existent code, and burns trying. In Smalltalk,
there is no such attempt. The system notes that the method cannot be
found, and raises an exception. In C or C++, there really isn't a way
to recover from this; you just die. In Smalltalk, this is an
exception like any other - and it can be caught. And handled.

And static typing does not guarantee that this gets caught at compile
time. In a loosely coupled system, where you load components
dynamically, one can easily get a bad version of a component which is
not API-compatible with code that wants to talk to it - all it takes
is a deployment error, and - given human processes - that's possible.
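[Editor's note: this deployment scenario has a well-known Java analogue - code compiled against one version of a component can hit NoSuchMethodError at run time when a different version is loaded. An editor-supplied sketch, with reflection standing in for the mis-deployed component:]

```java
import java.lang.reflect.Method;

public class DeploymentSkew {
    // Check at run time whether a loaded component actually provides the
    // API call the client was compiled against; reflection stands in here
    // for a component loaded from the wrong version of a jar.
    static boolean understands(Object component, String apiCall) {
        try {
            component.getClass().getMethod(apiCall);
            return true;
        } catch (NoSuchMethodException e) {
            return false; // discovered only after deployment, not at compile time
        }
    }

    public static void main(String[] args) {
        Object component = "stand-in for a dynamically loaded component";
        System.out.println(understands(component, "length")); // present in this version
        System.out.println(understands(component, "shipIt")); // missing: bad version
    }
}
```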
Post by Kent Paul Dolan
As I said, I am simply incompetent to deal with you when
you are being deliberately stupid. Your attempts to
obfuscate Smalltalk's very obvious liabilities are every
bit the equivalent of YGBKM's similar intellectual
dishonesty on behalf of Microsoft.
Incompetent is the right word all right. You completely don't
understand Smalltalk. It's not clear that you understand the
difference between strong and weak typing, since you seem to think
"static" = "strong".
Post by Kent Paul Dolan
It is far past evident that you are a full blown True
Believer that strong, static typing has nothing to offer
you. Having programmed extensively in both modes, I believe
Not nothing. I simply believe that the costs - in flexibility and
downstream maintenance - are higher than the benefits. That comes
from fairly long experience with both types of systems. Now, I
haven't tried out any functional languages with type inferencing -
those might offer benefits, but I haven't really looked.
Post by Kent Paul Dolan
I know better from experience, and I certainly know what
works better for me as a programmer of large (100M SLOC)
team-built software systems. I have no misimpression
that your mind is open to facts contradicting your
essentially illogical faith-based beliefs, and will not
waste the time trying to feed them to you.
You know from one thing - statically typed languages, and weakly typed
dynamic ones. You have no clear understanding of Smalltalk, and - as
stated above - your grasp of type systems is questionable.
Post by Kent Paul Dolan
xanthian.
<Talk Small and Carry a Big Class Library>
James Robertson, Product Manager, Cincom Smalltalk
http://www.cincomsmalltalk.com/blog/blogView
Isaac Gouy
2003-09-30 17:39:21 UTC
Permalink
Post by Kent Paul Dolan
Nor did I claim so; had you bothered to _read_ the "as
described above", you would have noticed that I had turned
your bogus exoneration of Smalltalk around to point out the
situation where Smalltalk's vulnerabilities lie, which, when
code is executed that has no business _being_ executed, and
the application goes crashy burny bye bye.
In the case of C/C++, that is because all function pointers
are interchangeable; in the case of Smalltalk, that is
because all similarly-signatured methods are interchangeable;
in each case, the weak typing allows entities that match by
coincidence the form of the entity the programmer actually
intended to address to be used in inappropriate situations
leading to program failure; in both cases, stronger typing
in better programming languages provides a way to catch that
kind of error at compile time.
Let's say we have 2 unrelated classes, Blob and Green, and we have a
method with the same signature that they (or a superclass) implements
- plang. Further, let's say that we have some arbitrary method with a
single argument and in that method we do this:

myMethod: someObject
someObject plang

It seems that you are pointing out that if (for whatever reason)
someObject is a Blob when it was intended to be a Green, then plang
will still succeed (and perhaps destructively change the Blob).
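That Blob/Green scenario can be sketched in Python (hypothetical classes mirroring the example above, with an invented destructive side effect on Blob to make the hazard visible):

```python
class Blob:
    def plang(self):
        # Destructive here: a Blob interprets plang as "erase yourself".
        self.state = "erased"
        return "blob-planged"

class Green:
    def plang(self):
        return "green-planged"

def my_method(some_object):
    # No class check anywhere: any object with a plang method is accepted.
    return some_object.plang()

print(my_method(Green()))  # the intended use
print(my_method(Blob()))   # also succeeds, and mutates the Blob
```

The second call is exactly the "succeeds when it shouldn't" case: the dispatch is by method name alone, so nothing flags the unintended receiver before the call runs.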

In contrast, if this was some statically checked language then we
would probably have one of these (assuming no common superclass or
interface) and the error would be detected at compile time:
void myMethod(Blob someObject) { someObject.plang(); }
or
void myMethod(Green someObject) { someObject.plang(); }


That seems correct as a statement of a possibly unsafe situation. (I
don't believe I've ever encountered that situation, but I probably
wouldn't remember, and your point was that it was possible, not
common.)
Kent Paul Dolan
2003-10-01 07:02:01 UTC
Permalink
Post by Isaac Gouy
That seems correct as a statement of a possibly unsafe
situation. (I don't believe I've ever encountered that
situation, but I probably wouldn't remember, and your
point was that it was possible, not common.)
Possible, yes. Uncommon? Humans are creatures of habit,
and I know I have multiple methods in multiple classes all
capable of returning their names as a string, and all named
"getName()". If I ask the object to return its name, but it
is an object of an unexpected class with a name in an
unanticipated format not captured as "signature", something
just broke when I go to use that name further.

In a "programming in the large" situation, where hundreds
(in the largest shop in which I've worked, bigger ones
exist) of worker-bee programmers add code to a common base,
the chances of method name and signature collisions
somewhere in the code base are essentially 100%. What are the
odds that an object of the wrong class but with the matching
method will be used as an argument? Larger if there is no
need for the class to be correct for the method to be
called. This is the environment and the situation where
static type checking is most needed: where programmers
unfamiliar with the entire code base (and in a shop of that
size, being so is humanly impossible) must depend on
namespace barriers such as class types to prevent
inadvertent successful use of inappropriately accessed
methods.

xanthian.
--
Posted via Mailgate.ORG Server - http://www.Mailgate.ORG
Isaac Gouy
2003-10-01 17:38:31 UTC
Permalink
Post by Kent Paul Dolan
Post by Isaac Gouy
That seems correct as a statement of a possibly unsafe
situation. (I don't believe I've ever encountered that
situation, but I probably wouldn't remember and you're
point was that it was possible not common.)
Possible, yes. Uncommon? Humans are creatures of habit,
and I know I have multiple methods in multiple classes all
capable of returning their names as a string, and all named
"getName()".
Agreed. Although getting more than 3 people to agree on the same names
can be a struggle ;-)
Post by Kent Paul Dolan
If I ask the object to return its name, but it
is an object of an unexpected class with a name in an
unanticipated format not captured as "signature", something
just broke when I go to use that name further.
The scenario seems to be that getName() returns a String, and
additionally for blob.getName() the last 3 chars are 'XXX' and for
green.getName() those chars are '!!!'; and somewhere we make use of
that difference in the software.

(I am struggling to match this scenario to something that I've worked
on.)
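One way to picture that scenario in Python (hypothetical classes; the 'XXX'/'!!!' suffixes stand in for a format difference that no method signature captures):

```python
class Blob:
    def get_name(self):
        return "blob-001XXX"   # Blob names happen to end in 'XXX'

class Green:
    def get_name(self):
        return "green-042!!!"  # Green names happen to end in '!!!'

def short_name(obj):
    # Downstream code assumes a Green-style name and strips the '!!!' marker.
    name = obj.get_name()
    if not name.endswith("!!!"):
        raise ValueError("unexpected name format: " + name)
    return name[:-3]

print(short_name(Green()))      # fine
try:
    short_name(Blob())          # same signature, wrong class
except ValueError as exc:
    print("runtime failure:", exc)
```

The getName() call itself succeeds on either class; the breakage only shows up later, when the differently-formatted result is used, which is the "something just broke when I go to use that name further" complaint.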
Post by Kent Paul Dolan
In a "programming in the large" situation, where hundreds
(in the largest shop in which I've worked, bigger ones
exist) of worker-bee programmers add code to a common base,
the chances of method name and signature collisions
somewhere in the code base are essentially 100%.
I agree that the probability is greater than zero. Without measuring
programming mistakes I don't know how we could reasonably assess what
the probability is.

(fyi: Given that I have no wish to start a 'that isn't really
programming in the large' debate - let me just say that dynamically
checked languages have been used successfully for more than
programming-in-the-small. This is an accessible example: "Four-fold
Increase in Productivity and Quality"
http://www.erlang.se/publications/Ulf_Wiger.pdf )
Post by Kent Paul Dolan
What are the
odds that an object of the wrong class but with the matching
method will be used as an argument? Larger if there is no
need for the class to be correct for the method to be
called.
This is the environment and the situation where
static type checking is most needed: where programmers
unfamiliar with the entire code base (and in a shop of that
size, being so is humanly impossible) must depend on
namespace barriers such as class types to prevent
inadvertent successful use of inappropriately accessed
methods.
Kent Paul Dolan
2003-10-01 21:08:11 UTC
Permalink
Post by Isaac Gouy
http://www.erlang.se/publications/Ulf_Wiger.pdf )
I've just begun, but it "springs off the page" to me
that while the abstract credits a programming language
for the productivity increase, most of the chapter
titles are about human engineering concerns, which to
me are where most solutions to the productivity
question are found.

xanthian.
--
Posted via Mailgate.ORG Server - http://www.Mailgate.ORG
Isaac Gouy
2003-10-02 06:28:12 UTC
Permalink
Post by Kent Paul Dolan
Post by Isaac Gouy
http://www.erlang.se/publications/Ulf_Wiger.pdf )
I've just begun, but it "springs off the page" to me
that while the abstract credits a programming language
for the productivity increase, most of the chapter
titles are about human engineering concerns, which to
me are where most solutions to the productivity
question are found.
Of course :-)

They are a group of pragmatic engineers, trying their best to succeed.
They created a language and frameworks that suit their application
domain, and broke out of the existing "standard practice" for large
scale development within their organization.

My impression from their mailing list and other publications is that
they don't feel that there is some problem with what they do that
static type checking would solve. If there was they would use it
gladly - they've certainly done enough language experiments.

The fun observation (which I'm sure you've seen before in other
experience reports) is that individual developers pretty much wrote
the same locs irrespective of language - so things go quicker with a
more expressive language.
James A. Robertson
2003-09-29 04:35:05 UTC
Permalink
On Mon, 29 Sep 2003 02:09:04 +0000 (UTC), "Kent Paul Dolan"
Post by Kent Paul Dolan
Post by Kent Paul Dolan
the forced format of Python repels me
<snip>
Post by Kent Paul Dolan
Take C++ or C (please) - you'll actually get an attempt to
execute the blasted thing, followed by who knows what kind
of nasty blowback.
Exactly the case with Smalltalk as described above.
I explained in the other post how it's not the same thing at all.
Here's an example of how I often use this capability. Say I have a
domain object - let's call it Foo

The Foo object periodically sends change events off when interesting
state changes occur. Here's the kicker - Foo has no idea what other
objects may be interested, nor does it know (or care) which messages
those other objects may be interested in. So one of those objects may
be Bar, and of all the events sent to it, he only cares about a
handful. Now, I'd rather not introduce a huge case statement into my
event handler - I'd have to revisit it every time I cared to handle a
new event. Instead, I write the handler like this:

update: anEventName with: aValue from: aModel

[anEventName isKeyword
ifTrue: [self perform: anEventName with: aValue]
ifFalse: [self perform: anEventName]]
on: MessageNotUnderstood
do: [:exception | exception return]

The upshot is, I send the inbound event name to myself (possibly with
an argument). If I don't implement a handler for that event, I get an
MNU - which I catch and just ignore. Ignoring is the correct
behavior, because I may get events I don't care about - and this
handily filters them out, while allowing my object to easily deal with
the ones it does care about.


I write event handlers like that a fair bit - and they take advantage
of this capability that seems to scare your socks off....
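A rough Python analogue of that dispatch pattern (getattr and AttributeError standing in for perform: and MessageNotUnderstood; the class and event names are invented for illustration):

```python
class Listener:
    def on_price_changed(self, value):
        self.last_price = value

    def update(self, event_name, value):
        # Send the event name to ourselves as a method call; if no handler
        # exists, swallow the lookup failure -- the analogue of catching
        # MessageNotUnderstood and returning.
        try:
            handler = getattr(self, event_name)
        except AttributeError:
            return  # an event this object does not care about
        handler(value)

listener = Listener()
listener.update("on_price_changed", 42)     # handled
listener.update("on_color_changed", "red")  # silently filtered out
```

One design note: catching the error around the lookup only, rather than around the whole handler call, avoids silently swallowing a genuine failure inside a handler; the Smalltalk version as written catches MessageNotUnderstood raised anywhere in the handler, too.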
Post by Kent Paul Dolan
xanthian.
<Talk Small and Carry a Big Class Library>
James Robertson, Product Manager, Cincom Smalltalk
http://www.cincomsmalltalk.com/blog/blogView
Kent Paul Dolan
2003-09-29 14:08:51 UTC
Permalink
Post by Kent Paul Dolan
I format my own code in a two dimensional style that uses
many times the whitespace of the usual programmer's style,
and the whitespace is a resource to me used to make my code
better self-documenting;
Can everyone else read your whitespace-based-self-documenting code or
don't you care about that?
The words "self-documenting" aren't in your vocabulary?

xanthian.
--
Posted via Mailgate.ORG Server - http://www.Mailgate.ORG
Kent Paul Dolan
2003-09-29 17:24:30 UTC
Permalink
I had similar apprehensions about Python, but if Bruce Eckel can make
the transition and find value in doing so
<http://mindview.net/WebLog/log-0025> then it might be worth a 2nd look.
Who am I to be so rigid?
Perhaps someone who has received the wisdom that you cannot test a
program into correctness? The blog entry makes the correct claim
that programming without unit testing is madness, but slides from
there straight to the incorrect claim that unit testing _replaces_
all the safety checks that strong typing provides. This is simply
not the case, and to promote working as if it were is a path to
software disasters.
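The gap being described can be made concrete in a few lines of Python (hypothetical function; the unit test passes while a type-mismatched call still fails, and only at run time):

```python
def average(values):
    # Intended for sequences of numbers.
    return sum(values) / len(values)

# The unit test exercises only the anticipated argument type, and passes:
assert average([2, 4, 6]) == 4.0

# A call with the wrong element type sails past that test and fails
# only when the bad path actually executes:
try:
    average(["2", "4", "6"])
except TypeError:
    print("runtime type failure the test suite never saw")
```

A static type checker would reject the mismatched call before the program ever ran; a test suite only catches it if someone thought to write that exact test.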

In any case, with the software industry's miserable current
record, choosing to opt out of some sorts of testing like compile
time type checking because they are "productivity destroying" is
to risk becoming another provider of the kind of worthless
rubbish currently infesting the Net. Programmers need all the help
they can get, especially as project complexity continues to grow
without visible limit, and static type checking is one such source
of help. One pays for that help in the time and effort exerted to
create the circumstances that make it available _beyond_ what one
exerts to create software without making it available.

"Quick and dirty" software development is almost overwhelmingly
seductive. Fred Brooks' "5 lines of delivered code per programmer
day" for software in the large drives budget writers to fits of
frothing madness. With the attitude that "we can do better with
just a few shortcuts" come the disasters that have so often recently
swept across the web.

Fact is, as events demonstrate again and again, we _cannot_ yet do
better than that mark, in safety, for all the same reasons it was
unassailable 40 years ago. It isn't a typing speed limitation -- any
fool with what Asimov called "good sitzfleisch"(sp?) can hack out a
working 600 line program in a day. It is a limitation on our ability
to organize humans to do complex tasks ungraspable in the whole by
any single person, cooperatively.

xanthian.
--
Posted via Mailgate.ORG Server - http://www.Mailgate.ORG
Bobby Parker
2003-09-29 17:33:01 UTC
Permalink
Post by Kent Paul Dolan
In the case, and with the software industry's miserable current
record, choosing to opt out of some sorts of testing like compile
time type checking because they are "productivity destroying" is
to risk becoming another provider of the kind of worthless
rubbish currently infesting the Net. Programmers need all the help
they can get, especially as project complexity continues to grow
without visible limit, and static type checking is one such source
of help. One pays for that help in the time and effort exerted to
create the circumstances that make it available _beyond_ what one
exerts to create software without making it available.
I LOVE my immutable strings, dammit!

bp
Kent Paul Dolan
2003-09-29 19:04:14 UTC
Permalink
Post by Bobby Parker
I LOVE my immutable strings, dammit!
Okay, little woodenhead, you can have them back.

xanthian. [really, that was a _pro_ Java post]
--
Posted via Mailgate.ORG Server - http://www.Mailgate.ORG
Bobby Parker
2003-09-29 20:55:49 UTC
Permalink
Post by Kent Paul Dolan
Okay, little woodenhead, you can have them back.
xanthian. [really, that was a _pro_ Java post]
Really that was an encouraging cheer. Honest! I think strong typing is a
good thing.

bp
Kent Paul Dolan
2003-09-29 21:17:21 UTC
Permalink
I think strong typing is a good thing.
My bruised and battered keyboard would like to have
words with you and your friend.

xanthian.
--
Posted via Mailgate.ORG Server - http://www.Mailgate.ORG
Bobby Parker
2003-09-29 22:16:10 UTC
Permalink
Post by Kent Paul Dolan
I think strong typing is a good thing.
My bruised and battered keyboard would like to have
words with you and your friend.
Yeah, well I have a nice, ancient IBM PS/2 keyboard (yes, from a real PS/2)
and a Silicon Graphics keyboard from an O2 (got 3 SGI workstations). Two of
the toughest customers you'll ever see. Heavier than hell, both of them,
packing EMP shielding, heat sinks and god knows what-all. Springs are built
to take a pounding. I *HAVE* pounded the IBM keyboard in the worst fits of
Microsoft Rage, and it still plunks away just fine.

But you gotta be NICE to laptops, doncha? My school of typing included the
IBM Selectric typewriter. Strong typing is in my blood.

bp
Kent Paul Dolan
2003-09-30 02:44:23 UTC
Permalink
Post by Bobby Parker
Strong typing is in my blood.
Hey, you're "A+" too!

xanthian, no rare donor.
--
Posted via Mailgate.ORG Server - http://www.Mailgate.ORG
Bobby Parker
2003-09-30 03:39:16 UTC
Permalink
Post by Kent Paul Dolan
Post by Bobby Parker
Strong typing is in my blood.
Hey, you're "A+" too!
Hey, that there's confidential govt. information buddy!

bp
Kent Paul Dolan
2003-09-29 20:39:19 UTC
Permalink
Considering we already have software disasters with the
predominant languages all supporting static typing
suggests (at least) that compile-time testing is
inadequate to the task (I think we all agree on this) but
that testing is the only way to prove a system's
/correctness/.
Bzzzt! The proof that proving a program's correctness by
testing is infeasible is an undergraduate computer science
exercise.

The only known way to _prove_ program correctness is by
formal mathematical validation methods, and because they
are not yet splendidly automated, they don't extend to
programming in the large economically feasibly.

Read the many writings of Edsger Dijkstra on the subject.
If the latter is the target and the former does not
guarantee the latter, then do we need the former?
Nothing we can pay for "does the latter", we are in a mode
of trying to move to "doing the best we can" rather than
"doing the least we can".

You need the former because it takes out a whole class of
frequent errors quickly, automatically, and most important,
cheaply in comparison to formal SQA methods. In a large
programming effort, programmers are a small minority of all
salary costs, so having programmers work harder to save
overall costs makes economic sense.
Does the flexibility forfeited to the compiler pay for
itself in hard or soft dollars (pick your currency)?
The "flexibility" you are "forfeiting" is the chance to make
stupid mistakes even an idiot machine could have caught; not
high on my hit parade.

On the issue of payback, read the publications of the
Software Engineering Institute, in particular their widely
used Capability Maturity Model book, and judge for yourself;
I lent my copy out and never got it back. Sigh.

http://www.sei.cmu.edu/publications/publications.html
There are probably all levels of programmers, just as
there are in any profession. Is it truly unimaginable
that some programmers are capable of delivering quality
(tested) systems without the assistance of compile-time
type checking?
It is truly imaginable that anyone I have ever heard make
that claim for himself or herself was just excusing bone
laziness. I have known half a dozen people with 180 IQs in
my life, and most likely each of them could pull it off, but
definitely each of them has better sense than to try, as
well. For the rest of us, yes, it is truly unimaginable, and
the record bears me out.
Which have been the predominant languages for the past 10,
20, 30 years? Certainly not dynamically typed ones. C,
C++, Delphi, and Java constitute the majority of
development since (at least) the early 80s, all of them
favor static typing (though I'm uncertain about VB--never
used it). Is there a correlation?
No, because you have the wrong end of the stick; the problem
of creating software in the large is a problem in human
engineering, the failures of the past have not been, in any
great proportion, software development tool failures but
software management failures (read: Microsoftisms), and the
question is, does using tools less capable of assisting
correctness improve or damage the human engineering aspect?

Given that there is little choice in good times but to
employ programmers at all levels of competence, the question
of whether to give them all the double-checking help they
can get, whether they want it or not, has an obvious answer
if software quality and on-time delivery are important to
you as an employer.
If insanity really is doing the same thing over and over
again and expecting a different result is it possible
statically typed languages and the assumptions about them
have outlived their usefulness?
No, but it is possible you are slipping into ranting instead
of discussing. The human engineering aspects are slowly
being addressed (see the SEI CMM book), and there have been
some really excellent results in programs where loss of
human life was an issue in terms of delivered bugs per line
of code, most typically with Ada, as strict a taskmaster as
you could ask for in a compiler. The code came out no faster,
but its quality improved dramatically.
Why were the newsgroups misc.misc and talk.bizarre included
on the message? Is there a lot of Java and OO stuff over
there?
Misc.misc because I always try to do so (see my recent note
in rec.games.roguelike.nethack on the subject "Posting to
misc.misc..."), and talk.bizarre because I send only my best
quality rants there to share with my peers(*), in this case
also to share your article URL with the software types
there. It also helps lure intelligent people to talk.bizarre,
though a lot of dead wood gets dragged along too.

xanthian.

(*) None of whom appreciate seeing those rants in the least:
it's an ego thingie, and they put up with me for the most
part, after 18 years to learn resignation.
--
Posted via Mailgate.ORG Server - http://www.Mailgate.ORG
Martin Drautzburg
2003-09-29 21:25:14 UTC
Permalink
Post by Kent Paul Dolan
There are probably all levels of programmers, just as
there are in any profession. Is it truly unimaginable
that some programmers are capable of delivering quality
(tested) systems without the assistance of compile-time
type checking?
It is truly imaginable that anyone I have ever heard make
that claim for himself or herself was just excusing bone
laziness.
In software, laziness is a virtue, not a vice.
Post by Kent Paul Dolan
I have known half a dozen people with 180 IQs in
my life, and most likely each of them could pull it off, but
definitely each of them has better sense than to try, as
well. For the rest of us, yes, it is truly unimaginable, and
the record bears me out.
But then how could a handful of people possibly write most of an
operating system, a windowing system, a virtual machine, an IDE
including the compiler, and a debugger in a dynamically typed language,
and that within just a few years?

This is what happened with squeak (written mostly in itself). Dynamic
typing can't be all that bad if such things are possible.

How much does IBM spend on Eclipse ?
Kent Paul Dolan
2003-09-30 04:10:14 UTC
Permalink
[can anyone code better without doing the extra effort
needed for static type checking?]
Post by Martin Drautzburg
Post by Kent Paul Dolan
It is truly imaginable that anyone I have ever heard make
that claim for him or her self was just excusing bone
laziness.
In software, laziness is a virtue, not a vice.
That laziness captured in the aphorism "good programmers
write great code; great programmers *steal* great code" is
indeed a virtue; I've sat beside a colleague (Pedro Tsai)
who was twenty times as productive as those around him yet
rarely needed to write original code: it's a mindset.

That laziness which results in code suffering run time
failures that could have been caught in the compilation
stage is _not_ a virtue, it is a menace, much like the
habits of programmers who write code as if they were to
be its only readers.

Pithy sayings need to be kept within the limits of their
applicability; if allowed to stray outside those limits,
and taken as universal truths, they trend to disaster.
Post by Martin Drautzburg
Post by Kent Paul Dolan
I have known half a dozen people with 180 IQs in my life,
and most likely each of them could pull it off, but
definitely each of them has better sense than to try, as
well. For the rest of us, yes, it is truly unimaginable,
and the record bears me out.
But then how could a handful of people possibly write most
of an operating system, a windowing system, a virtual
machine, an IDE including the compiler, and a debugger in a
dynamically typed language, and that within just a few
years?
Umm, writing an OS is a senior-year, one-semester, one-person
undergraduate programming chore at the last place I took
classes. Again, stealing working code or design can make any
project go forward quickly. Speed is not a distinguisher
between the two coding styles. What you describe isn't
really that fast a pace, given that working examples of all
the parts in other applications were available from the
beginning.
Post by Martin Drautzburg
This is what happened with squeak (written mostly in
itself).
The GNU Ada Translator, except for an initial bootstrap
version, is written entirely in Ada, a language with
severely strong typing, and is ported by cross-compilers
again written in Ada; so "written in itself" is not a
distinguisher between the two coding styles.
Post by Martin Drautzburg
Dynamic typing can't be all that bad if such things are
possible.
You've lost track of the goal here, which was to improve
the *quality* of software widely distributed for direct
Internet interface use, to reduce the set of current
disasters from buggy code. As I've said at least three
times already in this thread, choosing the faster, less
secure coding style is exactly how that code gains
access to the outside world.
Post by Martin Drautzburg
How much does IBM spend on Eclipse ?
Why would I know or care? I write software with a text
editor, not a crutch.

xanthian.
--
Posted via Mailgate.ORG Server - http://www.Mailgate.ORG
James A. Robertson
2003-09-30 05:25:02 UTC
Permalink
On Tue, 30 Sep 2003 04:10:14 +0000 (UTC), "Kent Paul Dolan"
Post by Kent Paul Dolan
That laziness which results in code suffering run time
failures that could have been caught in the compilation
stage is _not_ a virtue, it is a menace, much like the
habits of programmers who write code as if they were to
be its only readers.
10 years of Smalltalk development. I can still count - on one hand -
the number of times I've had such a type error. Now, in the last 10
years, with the fantastic help of your static type system, how many
such type errors has the compiler brought to your attention?

If it's more than a handful, you might want to ponder it.
Post by Kent Paul Dolan
Pithy sayings need to be kept within the limits of their
applicability; if allowed to stray outside those limits,
and taken as universal truths, they trend to disaster.
That's static typing and safety, in a nutshell. Note that static
typing didn't help the Ariane 5, but rigorous testing would have.
Post by Kent Paul Dolan
Post by Martin Drautzburg
Post by Kent Paul Dolan
I have known half a dozen people with 180 IQs in my life,
and most likely each of them could pull it off, but
definitely each of them has better sense than to try, as
well. For the rest of us, yes, it is truly unimaginable,
and the record bears me out.
But then how could a handful of people possibly write most
of an operating system, a windowing system, a virtual
machine an IDE including the compiler and a debugger in a
dynamically typed language and that within just a few
years ?
Umm, writing an OS is a senior-year, one-semester, one-person
undergraduate programming chore at the last place I took
classes. Again, stealing working code or design can make any
project go forward quickly. Speed is not a distinguisher
between the two coding styles. What you describe isn't
really that fast a pace, given that working examples of all
the parts in other applications were available from the
beginning.
Post by Martin Drautzburg
This is what happened with squeak (written mostly in
itself).
The GNU Ada Translator, except for an initial bootstrap
version, is written entirely in Ada, a language with
severely strong typing, and is ported by cross-compilers
again written in Ada; so "written in itself" is not a
distinguisher between the two coding styles.
Post by Martin Drautzburg
Dynamic typing can't be all that bad if such things are
possible.
You've lost track of the goal here, which was to improve
the *quality* of software widely distributed for direct
Internet interface use, to reduce the set of current
disasters from buggy code. As I've said at least three
times already in this thread, choosing the faster, less
secure coding style is exactly how that code gains
access to the outside world.
Post by Martin Drautzburg
How much does IBM spend on Eclipse ?
Why would I know or care? I write software with a text
editor, not a crutch.
xanthian.
<Talk Small and Carry a Big Class Library>
James Robertson, Product Manager, Cincom Smalltalk
http://www.cincomsmalltalk.com/blog/blogView
Kent Paul Dolan
2003-09-30 06:08:13 UTC
Permalink
Post by James A. Robertson
Post by Kent Paul Dolan
That laziness which results in code suffering run time
failures that could have been caught in the compilation
stage is _not_ a virtue, it is a menace, much like the
habits of programmers who write code as if they were to
be its only readers.
10 years of Smalltalk development. I can still count - on
one hand - the number of times I've had such a type error.
Umm, but of course, with dynamic type checking, it is your
customers who normally first encounter them, since after all
your code _passed_ your testing, and they often respond by
not bothering to return.
Post by James A. Robertson
Now, in the last 10 years, with the fantastic help of your
static type system, how many such type errors has the
compiler brought to your attention?
How many type errors has the compiler caught? Several per
hour of programming. In large part that is because I
program deliberately so that logic errors are caught as type
conflict errors whenever possible. My experience with Ada
has been that code that compiled with that development model
is code that executed correctly the first time it was
tested.
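The style being described, giving distinct types to values that must never be mixed so that a logic error surfaces as a type conflict, can be sketched in Python (hypothetical Meters/Feet wrappers; in a statically checked language such as Ada the mix would be rejected at compile time, where here it is caught at run time):

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Meters:
    value: float

    def __add__(self, other):
        # Refuse to combine with anything but another Meters value, so a
        # unit-confusion logic error shows up as a type error.
        if not isinstance(other, Meters):
            raise TypeError("cannot add {!r} to Meters".format(other))
        return Meters(self.value + other.value)

@dataclass(frozen=True)
class Feet:
    value: float

print(Meters(3.0) + Meters(4.0))   # fine
try:
    Meters(3.0) + Feet(4.0)        # mixed units: caught, not silently mis-added
except TypeError as exc:
    print("caught:", exc)
```

With bare floats the mixed-unit addition would quietly produce a wrong number; wrapping each unit in its own type converts that silent logic error into a loud type conflict.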
Post by James A. Robertson
If it's more than a handful, you might want to ponder it.
Sure. I don't have to claim to be a superman, and yet I
still get out more reliable code than you do. Not much to
ponder. My only delivered COBOL program ever implemented
a small self-contained database with no outside DBMS; when
I delivered it I never heard back from the user, and thought
they'd decided not to employ it. When I asked why they
weren't using it, a few years later, they told me all 20 of
them used it daily, it just had never suffered a failure or
needed an interface improvement. Not bad for a language I
learned to write that one bit of code.
Post by James A. Robertson
Post by Kent Paul Dolan
Pithy sayings need to be kept within the limits of their
applicability; if allowed to stray outside those limits,
and taken as universal truths, they trend to disaster.
That's static typing and safety, in a nutshell. Note that
static typing didn't help the Ariane 5, but rigorous
testing would have.
Interesting choice of examples, since the Ariane 5 crash was
caused by a human engineering error, a decision to reuse
the Ariane 4 inertial reference software without taking the expense
to re-analyze, rework, and retest it, and was completely
independent of the language used; Logo or BASIC would have
had the same nose first landing result. Making the choice to
avoid the "expense" of writing code in a language supporting
strong static typing, because that is a less fun programming
modality for the prima donna developers, is a similar human
engineering error, with similar results to be expected.

But hey, if your code doesn't do anything important, why do
you care? My COBOL program was used in an application where
human life and large property loss was at risk from software
errors, so I bothered to take the time to make sure it was
correct with every tool available, not just the fun to use
ones.

xanthian.
--
Posted via Mailgate.ORG Server - http://www.Mailgate.ORG
James A. Robertson
2003-09-30 13:36:05 UTC
Permalink
On Tue, 30 Sep 2003 06:08:13 +0000 (UTC), "Kent Paul Dolan" >
Post by Kent Paul Dolan
Post by James A. Robertson
10 years of Smalltalk development. I can still count - on
one hand - the number of times I've had such a type error.
Umm, but of course, with dynamic type checking, it is your
customers who normally first encounter them, since after all
your code _passed_ your testing, and they often respond by
not bothering to return.
Nope. With BottomFeeder - an RSS news aggregator - I get bug reports,
but they are typically related to either HTML display or RSS handling
issues. I've yet to get a bug report related to a type error. I run
the app on my desktop day in and day out, and that class of error just
doesn't come up. It's one static typing advocates <think> is common -
but when using Smalltalk, it just isn't.
Post by Kent Paul Dolan
Post by James A. Robertson
Now, in the last 10 years, with the fantastic help of your
static type system, how many such type errors has the
compiler brought to your attention?
How many type errors has the compiler caught? Several per
hour of programming. In large part that is because I
program deliberately so that logic errors are caught as type
conflict errors whenever possible. My experience with Ada
has been that code that compiled with that development model
is code that executed correctly the first time it was
tested.
That's because the compiler forces you to make premature - and mostly
useless - type decisions. Logic errors are simply not caught by
static typing. If you think they are, you are truly, truly naive.
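The Ada style Kent describes, arranging for logic errors to surface as type conflicts, can be sketched in Java as well. This is a hypothetical illustration (the Meters/Feet wrappers are invented for the example, not taken from any post here): give each domain quantity its own type, and a unit mix-up is rejected by the compiler instead of surviving until runtime.

```java
// Hypothetical sketch: distinct wrapper types turn a unit mix-up
// (a logic error) into a compile-time type conflict.
final class Meters {
    final double value;
    Meters(double value) { this.value = value; }
}

final class Feet {
    final double value;
    Feet(double value) { this.value = value; }
}

class TypeGuard {
    // Accepts only Meters; passing a Feet value cannot compile.
    static Meters addAltitude(Meters base, Meters delta) {
        return new Meters(base.value + delta.value);
    }

    public static void main(String[] args) {
        Meters cruise = addAltitude(new Meters(10000.0), new Meters(500.0));
        System.out.println(cruise.value);
        // addAltitude(new Meters(10000.0), new Feet(500.0)); // javac rejects this line
    }
}
```

Whether the extra declaration work pays for itself is exactly what the two camps in this thread dispute.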
Post by Kent Paul Dolan
Post by James A. Robertson
If it's more than a handful, you might want to ponder it.
Sure. I don't have to claim to be a superman, and yet I
still get out more reliable code than you do. Not much to
And you base this on what? That you've compared your results to mine?
Heck, on that basis, I'll state that my employment status proves that
I produce better code than you. Not a fair measure, but every bit as
accurate as yours.
Post by Kent Paul Dolan
ponder. My only delivered COBOL program ever, implemented
a small self-contained database with no outside DBMS; when
I delivered it I never heard back from the user, and thought
they'd decided not to employ it. When I asked why they
weren't using it, a few years later, they told me all 20 of
them used it daily, it just had never suffered a failure or
needed an interface improvement. Not bad for a language I
learned to write that one bit of code.
Post by James A. Robertson
Post by Kent Paul Dolan
Pithy sayings need to be kept within the limits of their
applicability; if allowed to stray outside those limits,
and taken as universal truths, they trend to disaster.
That's static typing and safety, in a nutshell. Note that
static typing didn't help the Ariane 5, but rigorous
testing would have.
Interesting choice of examples, since the Ariane 5 crash was
caused by a human engineering error, a decision to reuse
part of the nozzle control software without taking the expense
to re-analyze, rework, and retest it, and was completely
independent of the language used; Logo or BASIC would have
had the same nose first landing result. Making the choice to
avoid the "expense" of writing code in a language supporting
strong static typing, because that is a less fun programming
modality for the prima donna developers, is a similar human
engineering error, with similar results to be expected.
But hey, if your code doesn't do anything important, why do
you care? My COBOL program was used in an application where
human life and large property loss was at risk from software
errors, so I bothered to take the time to make sure it was
correct with every tool available, not just the fun to use
ones.
xanthian.
<Talk Small and Carry a Big Class Library>
James Robertson, Product Manager, Cincom Smalltalk
http://www.cincomsmalltalk.com/blog/blogView
Isaac Gouy
2003-09-30 21:06:45 UTC
Permalink
Post by James A. Robertson
On Tue, 30 Sep 2003 06:08:13 +0000 (UTC), "Kent Paul Dolan" >
Post by Kent Paul Dolan
Post by James A. Robertson
10 years of Smalltalk development. I can still count - on
one hand - the number of times I've had such a type error.
Let me suggest that has something to do with your expertise.

Maybe a more interesting question is how often does a Smalltalk newbie
make that mistake, or deletes a method without checking senders, or
...
Of course, now we all do TDD...

(I do agree that programmers experienced with dynamically checked
languages don't make the kind of 'type mistakes' that someone who has
only used a statically checked language might imagine. Part of the
reason is that they manually check what the methods will do, and
manually document the argument types. The usual problem is failing to
initialize dynamic data structures correctly.)
Post by James A. Robertson
Post by Kent Paul Dolan
Post by James A. Robertson
Now, in the last 10 years, with the fantastic help of your
static type system, how many such type errors has the
compiler brought to your attention?
How many type errors has the compiler caught? Several per
hour of programming. In large part that is because I
program deliberately so that logic errors are caught as type
conflict errors whenever possible. My experience with Ada
has been that code that compiled with that development model
is code that executed correctly the first time it was
tested.
That's because the compiler forces you to make premature - and mostly
useless - type decisions.
Languages without polymorphism force you to make premature type
decisions. Using a statically checked language that supports many
forms of polymorphism is very different from using a statically
checked language like Modula-2.

Anyway, given that we refactor like crazy in Smalltalk, perhaps we
should accept that if a premature type decision has been made then it
will be unmade when the code is refactored ;-)

Yes, there are false positives. They are errors that are purely an
artifact of the limitations of the particular type system - shouldn't
we be able to use an int wherever we use a double?
Post by James A. Robertson
Logic errors are simply not caught by
static typing. If you think they are, you are truly, truly naive.
That overstates the situation - some logic errors are revealed by
static type checking, even with something as simple as Modula-2.
-SNIP-
James A. Robertson
2003-10-01 01:26:02 UTC
Permalink
Post by Isaac Gouy
Post by James A. Robertson
On Tue, 30 Sep 2003 06:08:13 +0000 (UTC), "Kent Paul Dolan" >
Post by Kent Paul Dolan
Post by James A. Robertson
10 years of Smalltalk development. I can still count - on
one hand - the number of times I've had such a type error.
Let me suggest that has something to do with your expertise.
Maybe a more interesting question is how often does a Smalltalk newbie
make that mistake, or deletes a method without checking senders, or
...
Of course, now we all do TDD...
I clearly recall not getting type errors of that sort when I started
with Smalltalk, and could barely find my way around a browser.
Post by Isaac Gouy
(I do agree that programmers experienced with dynamically checked
languages don't make the kind of 'type mistakes' that someone who has
only used a statically checked language might imagine. Part of the
reason is that they manually check what the methods will do, and
manually document the argument types. The usual problem is failing to
initialize dynamic data structures correctly.)
Post by James A. Robertson
Post by Kent Paul Dolan
Post by James A. Robertson
Now, in the last 10 years, with the fantastic help of your
static type system, how many such type errors has the
compiler brought to your attention?
How many type errors has the compiler caught? Several per
hour of programming. In large part that is because I
program deliberately so that logic errors are caught as type
conflict errors whenever possible. My experience with Ada
has been that code that compiled with that development model
is code that executed correctly the first time it was
tested.
That's because the compiler forces you to make premature - and mostly
useless - type decisions.
Languages without polymorphism force you to make premature type
decisions. Using a statically checked language that supports many
forms of polymorphism is very different from using a statically
checked language like Modula-2.
Anyway, given that we refactor like crazy in Smalltalk, perhaps we
should accept that if a premature type decision has been made then it
will be unmade when the code is refactored ;-)
Yes, there are false positives. They are errors that are purely an
artifact of the limitations of the particular type system - shouldn't
we be able to use an int wherever we use a double?
Post by James A. Robertson
Logic errors are simply not caught by
static typing. If you think they are, you are truly, truly naive.
That overstates the situation - some logic errors are revealed by
static type checking, even with something as simple as Modula-2.
-SNIP-
<Talk Small and Carry a Big Class Library>
James Robertson, Product Manager, Cincom Smalltalk
http://www.cincomsmalltalk.com/blog/blogView
Kent Paul Dolan
2003-10-01 06:39:23 UTC
Permalink
Post by James A. Robertson
Logic errors are simply not caught by
static typing. If you think they are,
you are truly, truly naive.
Or, not having put static typing to much
use, you don't know how to use it to write
code whose logic errors are caught as
type conflicts, a possibility that you
might consider, since my experience is
hardly rare in the Ada community.

xanthian.
--
Posted via Mailgate.ORG Server - http://www.Mailgate.ORG
James A. Robertson
2003-10-01 15:20:16 UTC
Permalink
On Wed, 1 Oct 2003 06:39:23 +0000 (UTC), "Kent Paul Dolan"
Post by Kent Paul Dolan
Post by James A. Robertson
Logic errors are simply not caught by
static typing. If you think they are,
you are truly, truly naive.
Or, not having put static typing to much
use, you don't know how to use it to write
code whose logic errors are caught as
type conflicts, a possibility that you
might consider, since my experience is
hardly rare in the Ada community.
I spent years developing in languages with static typing. Haven't
used Ada, but - improper use of an algorithm isn't going to be flagged
by type checking.
Post by Kent Paul Dolan
xanthian.
<Talk Small and Carry a Big Class Library>
James Robertson, Product Manager, Cincom Smalltalk
http://www.cincomsmalltalk.com/blog/blogView
Martin Drautzburg
2003-10-01 18:58:02 UTC
Permalink
Post by James A. Robertson
I spent years developing in languages with static typing. Haven't
used Ada, but - improper use of an algorithm isn't going to be flagged
by type checking.
I always thought that static typing is a historical leftover (correct
me if I'm wrong).

In languages like C, variables are pushed onto the stack. Since the
compiler also generates the code that advances the stack pointer, it
needs to know the size of each variable, i.e. these sizes must be
known at compile time. The same is true when dealing with struct
types - the compiler needs to know the sizes of the struct members in
order to access them.

Also, improper use of typed variables leads to disastrous consequences
in these languages. These include segmentation faults, corrupted
stacks, etc. Most of these problems don't exist in dynamically typed
languages, and most of them are not related to logical errors at all.

So far, static typing is more a necessity than a feature.

Since static typing also puts an additional burden on the programmer's
neck, its advocates argue that static typing increases software
quality. It is undisputed that catching errors at compile time is
better than catching them at runtime. Consequently, a compiler should
refuse to compile things like

#include <stdio.h>

int main(void)
{
    int x;
    int y = x;          // obvious nonsense: x is uninitialized
    int *px;
    int z = *px;        // obvious nonsense: dereferencing an uninitialized pointer

    printf("%s\n", z);  // %s applied to an int: likely segmentation fault
    return 0;
}

But it compiles this stuff even though the nonsense is obvious at
compile time and the language is statically typed.

Now many languages, even statically typed ones (including Java) refuse
to compile such nonsense and these days we don't have to do pointer
magic anymore (neither did we 20 years ago).

But my feeling is still that the prominent role of static typing dates
back to those languages that actually *needed* it.
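Martin's point about Java holds for the fragment above: Java's definite-assignment rule makes the uninitialized read a compile-time error. A minimal sketch, with the rejected line shown commented out so the example compiles:

```java
class DefiniteAssignment {
    public static void main(String[] args) {
        int x;
        // System.out.println(x); // javac: "variable x might not have been initialized"
        x = 42;                   // after definite assignment, the read below is legal
        System.out.println(x);
    }
}
```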
Isaac Gouy
2003-10-02 07:45:07 UTC
Permalink
Martin Drautzburg <***@web.de> wrote
-SNIP-
Post by Martin Drautzburg
I always thought that static typing is a historical leftover (correct
me if I'm wrong).
In languages like C, variables are pushed onto the stack. Since the
compiler also generates the code that advances the stack pointer, it
needs to know the size of each variable, i.e. these sizes must be
known at compile time. The same is true when dealing with struct
types - the compiler needs to know the sizes of the struct members in
order to access them.
AFAIK BCPL was designed so that dynamic variables and vectors "can be
allocated space on a simple runtime stack." BCPL is proudly typeless.


In the beginning was the Word. (Wynken de Worde)

And then there was FORTRAN (with a later resurgence of the Word in
BCPL - seemingly as a reaction to the complications encountered
implementing CPL, with later derivatives like B and C).

http://www.fh-jena.de/~kleine/history/


-SNIP-
Post by Martin Drautzburg
But it compiles this stuff even though the nonsense is obvious at
compile time and the language is statically typed.
Now many languages, even statically typed ones (including Java) refuse
to compile such nonsense and these days we don't have to do pointer
magic anymore (neither did we 20 years ago).
Neither did we 40 years ago. Which makes this a sad example of a
language that doesn't know if it's typed or typeless.
Post by Martin Drautzburg
But my feeling is still that the prominent role of static typing dates
back to those languages that actually *needed* it.
C is recent, look at FORTRAN and Algol.
Isaac Gouy
2003-10-01 19:50:33 UTC
Permalink
Post by Kent Paul Dolan
Post by James A. Robertson
Logic errors are simply not caught by
static typing. If you think they are,
you are truly, truly naive.
Or, not having put static typing to much
use, you don't know how to use it write
code whose logic errors are caught as
type conflicts, a possiblity that you
might consider, since my experience is
hardly rare in the Ada community.
And that is the experience reported by practitioners of newer statically
checked languages like Haskell - they talk of type-full programming.

Dynamically checked and statically checked languages require such
different programming practices, that it's enormously hard to
appreciate the other POV.

If only the typeless languages BCPL and MCPL were more popular this
could be a 3 way mis-understanding ;-)
Dr Chaos
2003-09-30 17:12:56 UTC
Permalink
Post by James A. Robertson
10 years of Smalltalk development. I can still count - on one hand -
the number of times I've had such a type error. Now, in the last 10
years, with the fantastic help of your static type system, how many
such type errors has the compiler brought to your attention?
But you know that's not fair, because in a statically
type constrained language, people use the type system to express
constraints and requirements which would be expressed in
different ways in a different language.
Post by James A. Robertson
Post by Kent Paul Dolan
Pithy sayings need to be kept within the limits of their
applicability; if allowed to stray outside those limits,
and taken as universal truths, they trend to disaster.
That's static typing and safety, in a nutshell. Note that static
typing didn't help the Ariane 5, but rigorous testing would have.
static type proofs eliminate one category of errors.

but I think the more important benefit of type checks comes not in
initial development, but when you (or somebody else) go back to work
with code that hasn't been touched in a long time.

I find I make a number of silly errors in the initial stages
which type checks quickly find for me.
Post by James A. Robertson
Post by Kent Paul Dolan
Why would I know or care? I write software with a text
editor, not a crutch.
and a visual editor like emacs is a crutch instead of
teco

and automatic static type checking is a crutch instead of
random errors.
Isaac Gouy
2003-09-30 18:14:57 UTC
Permalink
"Kent Paul Dolan" <***@well.com> wrote
-SNIP-
Post by Kent Paul Dolan
You've lost track of the goal here, which was to improve
the *quality* of software widely distributed for direct
Internet interface use, to reduce the set of current
disasters from buggy code.
-SNIP-

AFAIK many of the widely reported problems (buffer overflows) result
from lack of run-time checks rather than lack of static type checks.


Tony Hoare on implementing Algol 60 (in 1961)
http://www.braithwaite-lee.com/opinions/p75-hoare.pdf

"(1) The first principle was security: ...
A consequence of this principle is that every occurrence of every
subscript of every subscripted variable was on every occasion checked
at run time against both the upper and the lower declared bounds of
the array.

Many years later we asked our customers whether they wished us to
provide an option to switch off these checks in the interests of
efficiency on production runs. Unanimously, they urged us not to -
they already knew how frequently subscript errors occur on production
runs where failure to detect them could be disastrous.
"
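The subscript checking Hoare describes is what JVM arrays still do today: every index is checked against the array bounds at run time, and an out-of-range access raises an exception instead of silently corrupting memory. A small sketch:

```java
class BoundsCheck {
    public static void main(String[] args) {
        int[] buffer = new int[4];       // valid indices are 0..3
        try {
            buffer[4] = 99;              // one past the end: checked at run time
        } catch (ArrayIndexOutOfBoundsException e) {
            System.out.println("caught: out-of-bounds write was refused");
        }
    }
}
```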
James A. Robertson
2003-09-30 19:32:16 UTC
Permalink
Post by Isaac Gouy
-SNIP-
Post by Kent Paul Dolan
You've lost track of the goal here, which was to improve
the *quality* of software widely distributed for direct
Internet interface use, to reduce the set of current
disasters from buggy code.
-SNIP-
AFAIK many of the widely reported problems (buffer overflows) result
from lack of run-time checks rather than lack of static type checks.
Yep. Years of developers worrying about exactly the wrong thing.
Post by Isaac Gouy
Tony Hoare on implementing Algol 60 (in 1961)
http://www.braithwaite-lee.com/opinions/p75-hoare.pdf
"(1) The first principle was security: ...
A consequence of this principle is that every occurrence of every
subscript of every subscripted variable was on every occasion checked
at run time against both the upper and the lower declared bounds of
the array.
Many years later we asked our customers whether they wished us to
provide an option to switch off these checks in the interests of
efficiency on production runs. Unanimously, they urged us not to -
they already knew how frequently subscript errors occur on production
runs where failure to detect them could be disastrous.
"
<Talk Small and Carry a Big Class Library>
James Robertson, Product Manager, Cincom Smalltalk
http://www.cincomsmalltalk.com/blog/blogView
Kent Paul Dolan
2003-10-01 07:45:38 UTC
Permalink
Post by James A. Robertson
Post by Isaac Gouy
AFAIK many of the widely reported problems (buffer overflows) result
from lack of run-time checks rather than lack of static type checks.
Yep. Years of developers worrying about exactly the wrong thing.
Sigh. Your lack of insight is quite astounding.

The problems which we see today, use of strcpy() where strncpy() is
safer being the usual implementation of the buffer overflow error,
were, as Isaac quoted, solved in 1961 and the intelligence of the
solution recognized at that time, 11 years before the language
containing strcpy() was invented.

Why then do buffer overflows plague today's software?

A "best practice" approach was rejected or neglected in favor of
less careful methods, for reasons which turn out not to justify
the problems they have caused.

The lesson for today in the context of this discussion is obvious,
one would think.

To most.

xanthian.
--
Posted via Mailgate.ORG Server - http://www.Mailgate.ORG
Tom Welsh
2003-10-01 10:59:09 UTC
Permalink
Post by Kent Paul Dolan
Why then do buffer overflows plague today's software?
A "best practice" approach was rejected or neglected in favor of
less careful methods, for reasons which turn out not to justify
the problems they have caused.
Precisely.

Perhaps we should consider that the benefits accruing from neglect of
these better methods were enjoyed by one group of people - while the
problems which they caused affected, mostly, a quite different group.
Thus, a software manufacturer can become wealthy on the profits built up
largely by neglecting proper reliability and security. Meanwhile, losses
far greater (in total) than the manufacturer's profits are experienced
by its customers. Every hang, every reboot, every hour of an experienced
person's time spent explaining how to get around the more egregious
wrong behaviours of the software, every incorrect calculation, every
loss of data... the list goes on.

In modern society, with its complex net of interactions and
dependencies, one way of telling the street-smart operators is that they
are the ones who succeed in "exporting" their costs to other people,
most of whom remain oblivious to the way their pockets are being picked.
--
Tom Welsh
James A. Robertson
2003-10-01 15:29:29 UTC
Permalink
On Wed, 1 Oct 2003 07:45:38 +0000 (UTC), "Kent Paul Dolan"
Post by Kent Paul Dolan
Post by James A. Robertson
Post by Isaac Gouy
AFAIK many of the widely reported problems (buffer overflows) result
from lack of run-time checks rather than lack of static type checks.
Yep. Years of developers worrying about exactly the wrong thing.
Sigh. Your lack of insight is quite astounding.
The problems which we see today, use of strcpy() where strncpy() is
safer being the usual implementation of the buffer overflow error,
were, as Isaac quoted, solved in 1961 and the intelligence of the
solution recognized at that time, 11 years before the language
containing strcpy() was invented.
Why then do buffer overflows plague today's software?
Because people still use languages that allow the problem to
happen. Best practices won't be followed, period. Get over it.
Post by Kent Paul Dolan
A "best practice" approach was rejected or neglected in favor of
less careful methods, for reasons which turn out not to justify
the problems they have caused.
The lesson for today in the context of this discussion is obvious,
one would think.
To most.
xanthian.
<Talk Small and Carry a Big Class Library>
James Robertson, Product Manager, Cincom Smalltalk
http://www.cincomsmalltalk.com/blog/blogView
Kent Paul Dolan
2003-10-01 21:39:43 UTC
Permalink
Post by James A. Robertson
Best practices won't be followed, period.
Not by you, obviously, since you have been railing
against them here loudly and longly. This should
warn potential customers clearly about the likely
results of employing your services.

xanthian.
--
Posted via Mailgate.ORG Server - http://www.Mailgate.ORG
Dr Chaos
2003-10-01 17:44:42 UTC
Permalink
Post by Kent Paul Dolan
Post by James A. Robertson
Post by Isaac Gouy
AFAIK many of the widely reported problems (buffer overflows) result
from lack of run-time checks rather than lack of static type checks.
Yep. Years of developers worrying about exactly the wrong thing.
Sigh. Your lack of insight is quite astounding.
The problems which we see today, use of strcpy() where strncpy() is
safer being the usual implementation of the buffer overflow error,
were, as Isaac quoted, solved in 1961 and the intelligence of the
solution recognized at that time, 11 years before the langauge
containing strcpy() was invented.
Why then do buffer overflows plague today's software?
A "best practice" approach was rejected or neglected in favor of
less careful methods, for reasons which turn out not to justify
the problems they have caused.
An anti-intellectual bias for thinking that "C" is great and other
"academic" {a euphemism for carefully thought-out} langauges are for
losers.
Kent Paul Dolan
2003-10-02 02:36:09 UTC
Permalink
Yet again, when challenged, you resort to ad-homeneim attacks.
Perhaps daddy could also teach you how silly you look trying to use
fancy words you don't know how to spell (or from the looks of that
spelling, pronounce)?

Much like your championing of a programming language in which you
have to be instructed by people who don't even use it, how it works?

You seem uniquely lacking in detail orientation, for someone trying
to earn his living as a programmer. You should try outside sales,
instead, a better fit to that personality.

xanthian.

"Overflow of an integer is an implementation detail ... . The mere
fact that a business developer has to worry about it is a <problem>."
-- James A. Robertson, business software developer

Mmmpfh!

Heaven save the industry from business programmers who don't want to
be bothered knowing if their data fits in the space allowed, a "mere
implementation detail".
--
Posted via Mailgate.ORG Server - http://www.Mailgate.ORG
Mark Smith
2003-10-02 03:21:22 UTC
Permalink
Post by Kent Paul Dolan
"Overflow of an integer is an implementation detail ... . The mere
fact that a business developer has to worry about it is a <problem>."
-- James A. Robertson, business software developer
Mmmpfh!
Heaven save the industry from business programmers who don't want to
be bothered knowing if their data fits in the space allowed, a "mere
implementation detail".
Oh good grief. Here's a hint: http://c2.com/cgi/wiki?SmallInteger

James is right, integer overflow is exactly the sort of low-level
bit-twiddling detail that a "business programmer" shouldn't have to
deal with, and Smalltalk ensures that they don't.
Kent Paul Dolan
2003-10-02 03:51:50 UTC
Permalink
Post by Mark Smith
Oh good grief. Here's a hint: http://c2.com/cgi/wiki?SmallInteger
Oh, my, and Smalltalkers consider that a language _feature_?

It counts as a bug, by me.

xanthian.
--
Posted via Mailgate.ORG Server - http://www.Mailgate.ORG
James A. Robertson
2003-10-02 04:58:49 UTC
Permalink
On Thu, 2 Oct 2003 03:51:50 +0000 (UTC), "Kent Paul Dolan"
Post by Kent Paul Dolan
Post by Mark Smith
Oh good grief. Here's a hint: http://c2.com/cgi/wiki?SmallInteger
Oh, my, and Smalltalkers consider that a language _feature_?
It counts as a bug, by me.
Ok - explain how it's a bug. In a typical business app, under what
circumstances do I need to know or care how many bits are in some
value that has business meaning? There may be maximum and minimum
values - but those will be business related (max amount payable on a
claim, etc) - not implementation detail oriented.
Post by Kent Paul Dolan
xanthian.
<Talk Small and Carry a Big Class Library>
James Robertson, Product Manager, Cincom Smalltalk
http://www.cincomsmalltalk.com/blog/blogView
Roedy Green
2003-10-02 16:50:01 UTC
Permalink
On Thu, 02 Oct 2003 00:58:49 -0400, James A. Robertson
Post by James A. Robertson
Ok - explain how it's a bug. In a typical business app, under what
circumstances do I need to know or care how many bits are in some
value that has business meaning?
In a business language you should specify the upper and lower bounds
of a numeric value, not the size in bytes. The language should then
enforce those limits, either by corralling out-of-bound values to the
max/min value or by raising an exception.

--
Canadian Mind Products, Roedy Green.
Coaching, problem solving, economical contract programming.
See http://mindprod.com/jgloss/jgloss.html for The Java Glossary.
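Roedy's bounds-or-exception behavior can be approximated in stock Java, though, as James notes elsewhere in the thread, only by opting in explicitly. A sketch contrasting the three behaviors (silent wraparound, checked arithmetic, arbitrary precision):

```java
import java.math.BigInteger;

class OverflowDemo {
    public static void main(String[] args) {
        int big = Integer.MAX_VALUE;

        // Default int arithmetic wraps around silently.
        System.out.println(big + 1);                  // -2147483648

        // Math.addExact raises an exception on overflow (opt-in).
        try {
            Math.addExact(big, 1);
        } catch (ArithmeticException e) {
            System.out.println("caught: " + e.getMessage());
        }

        // BigInteger never overflows - the behavior Smalltalk gives by default.
        System.out.println(BigInteger.valueOf(big).add(BigInteger.ONE));
    }
}
```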
Kent Paul Dolan
2003-10-02 05:20:33 UTC
Permalink
Post by Kent Paul Dolan
Post by Mark Smith
Oh good grief. Here's a hint: http://c2.com/cgi/wiki?SmallInteger
Oh, my, and Smalltalkers consider that a language _feature_?
It counts as a bug, by me.
And by the way, it is quite possible to do the same thing in Java,
while not surrendering knowledge of when a variable has attained a
value outside its expected range, by explicit coding.

http://www.well.com/user/xanthian/public/code/java/Hailstone.java

I'm sure that code will horrify "quick and dirty"-loving programmers.

[It is also an example to answer a question posed earlier about why I
don't use Python -- I need my indentation for formatting for
readability; I cannot surrender that to have it instead convey
syntactic information to the code translation mechanism.]

The same ideas could be encapsulated in a class, I just wasn't going
to reuse them in this case.

xanthian.
--
Posted via Mailgate.ORG Server - http://www.Mailgate.ORG
James A. Robertson
2003-10-02 06:02:46 UTC
Permalink
On Thu, 2 Oct 2003 05:20:33 +0000 (UTC), "Kent Paul Dolan"
Post by Kent Paul Dolan
Post by Kent Paul Dolan
Post by Mark Smith
Oh good grief. Here's a hint: http://c2.com/cgi/wiki?SmallInteger
Oh, my, and Smalltalkers consider that a language _feature_?
It counts as a bug, by me.
And by the way, it is quite possible to do the same thing in Java,
while not surrendering knowledge of when a variable has attained a
value outside its expected range, by explicit coding.
http://www.well.com/user/xanthian/public/code/java/Hailstone.java
The point - which still eludes you - is that having to worry about the
<implementation detail> of how many bits a given sort of number
supports is a <problem>. Developers need to support <business rules>,
not number of bits.

Also, your solution only works if you make sure to upfront use
BigIntegers - it won't "just work" like Smalltalk numerics do
Post by Kent Paul Dolan
I'm sure that code will horrify "quick and dirty"-loving programmers.
it horrifies me, but for different reasons.
Post by Kent Paul Dolan
[It is also an example to answer a question posed earlier about why I
don't use Python -- I need my indentation for formatting for
readability; I cannot surrender that to have it instead convey
syntactic information to the code translation mechanism.]
So your quirks are more relevant than having consistent meaning for
all developers?
Post by Kent Paul Dolan
The same ideas could be encapsulated in a class, I just wasn't going
to reuse them in this case.
xanthian.
<Talk Small and Carry a Big Class Library>
James Robertson, Product Manager, Cincom Smalltalk
http://www.cincomsmalltalk.com/blog/blogView
Björn Eiderbäck
2003-10-02 08:24:07 UTC
Permalink
Post by Kent Paul Dolan
Post by Kent Paul Dolan
Post by Mark Smith
Oh good grief. Here's a hint: http://c2.com/cgi/wiki?SmallInteger
Oh, my, and Smalltalkers consider that a language _feature_?
It counts as a bug, by me.
And by the way, it is quite possible to do the same thing in Java,
while not surrendering knowledge of when a variable has attained a
value outside its expected range, by explicit coding.
http://www.well.com/user/xanthian/public/code/java/Hailstone.java
If that's the kind of style you use while writing programs, I
understand why you need types or any other aids to assure yourself
that you have gotten anything right!
Maybe you have some performance reasons to make it this complicated.
The last time I made the exercise of writing Hailstone in Smalltalk it
was like 5 lines of code (including code for output).

Björn
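For comparison with Björn's five-line Smalltalk version, here is a compact hailstone (Collatz) sequence in Java. This is a sketch, not Kent's linked Hailstone.java; using BigInteger throughout makes overflow a non-issue, at the cost of the wrapper calls Smalltalk hides:

```java
import java.math.BigInteger;
import java.util.ArrayList;
import java.util.List;

class Hailstone {
    // Hailstone sequence from n down to 1: n -> 3n+1 if odd, n/2 if even.
    static List<BigInteger> hailstone(BigInteger n) {
        List<BigInteger> seq = new ArrayList<>();
        seq.add(n);
        while (!n.equals(BigInteger.ONE)) {
            n = n.testBit(0)
                ? n.multiply(BigInteger.valueOf(3)).add(BigInteger.ONE)
                : n.shiftRight(1);
            seq.add(n);
        }
        return seq;
    }

    public static void main(String[] args) {
        // The sequence starting at 27 is famously long: 112 terms.
        System.out.println(hailstone(BigInteger.valueOf(27)).size());
    }
}
```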
Mark Smith
2003-10-02 13:21:18 UTC
Permalink
Post by Kent Paul Dolan
Post by Kent Paul Dolan
Post by Mark Smith
Oh good grief. Here's a hint: http://c2.com/cgi/wiki?SmallInteger
Oh, my, and Smalltalkers consider that a language _feature_?
It counts as a bug, by me.
And by the way, it is quite possible to do the same thing in Java,
while not surrendering knowledge of when a variable has attained a
value outside its expected range, by explicit coding.
No, it's not possible without extraordinarily ugly tricks.
Post by Kent Paul Dolan
The same ideas could be encapsulated in a class, I just wasn't going
to reuse them in this case.
Uh huh. The same ideas could be encapsulated in a hamburger for all
the understanding you've demonstrated.
James A. Robertson
2003-10-02 04:56:44 UTC
Permalink
On Thu, 2 Oct 2003 02:36:09 +0000 (UTC), "Kent Paul Dolan"
Post by Kent Paul Dolan
Yet again, when challenged, you resort to ad-homeneim attacks.
Perhaps daddy could also teach you how silly you look trying to use
fancy words you don't know how to spell (or from the looks of that
spelling, pronounce)?
sigh. So I didn't use a dictionary. Is this the best you can do?
I'm truly starting to understand why it is that you are unemployed.
Post by Kent Paul Dolan
Much like your championing of a programming language in which you
have to be instructed by people who don't even use it, how it works?
Not really. You're the one with a weak grasp of the difference
between static typing and strong typing. As well, you have amply
demonstrated that you have no idea how exception handling or method
lookup works in Smalltalk - and yet, you like to speak authoritatively
about it. Simply amazing.
Post by Kent Paul Dolan
You seem uniquely lacking in detail orientation, for someone trying
to earn his living as a programmer. You should try outside sales,
instead, a better fit to that personality.
I'm in Product Management, not full time development. Although I do
write software - BottomFeeder (RSS Reader) and the software that runs
my blog.
Post by Kent Paul Dolan
xanthian.
"Overflow of an integer is an implementation detail ... . The mere
fact that a business developer has to worry about it is a <problem>."
-- James A. Robertson, business software developer
Mmmpfh!
You can't get an overflow in Smalltalk. Period. Why? Because of a
runtime system that makes it impossible. i.e., it's a non-problem for
developers. If I have a one-up counter, I need never worry about its
size. That's what I meant. However, since you don't really get
Smalltalk, you didn't follow what I meant.
Post by Kent Paul Dolan
Heaven save the industry from business programmers who don't want to
be bothered knowing if their data fits in the space allowed, a "mere
implementation detail".
In Smalltalk, that's a detail I need never worry about - unless I have
to work with an RDBMS. If I work with a decent database like
Gemstone, I don't need to worry about it then either. These are
details that business developers don't need to worry about - they are
trying to solve problems, and having to worry about that level of
detail gets in the way of solving the problem.
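For readers coming from Java, the behavior Robertson describes can be
approximated with the standard java.math.BigInteger class, which grows as
needed instead of wrapping; a sketch of the contrast:

```java
import java.math.BigInteger;

public class NoOverflow {
    public static void main(String[] args) {
        // A one-up counter that can never overflow, Smalltalk-style:
        // BigInteger grows as needed instead of wrapping at 2^63 - 1.
        BigInteger counter = BigInteger.valueOf(Long.MAX_VALUE);
        counter = counter.add(BigInteger.ONE);   // no wrap, no exception
        System.out.println(counter);             // one past Long.MAX_VALUE

        // The fixed-size primitive, by contrast, wraps silently.
        long wrapped = Long.MAX_VALUE + 1;       // becomes Long.MIN_VALUE
        System.out.println(wrapped);
    }
}
```

The trade-off is syntax and speed: BigInteger arithmetic goes through
method calls rather than the `+` operator, which Smalltalk hides from you.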


<Talk Small and Carry a Big Class Library>
James Robertson, Product Manager, Cincom Smalltalk
http://www.cincomsmalltalk.com/blog/blogView
Kent Paul Dolan
2003-10-02 05:08:33 UTC
Permalink
d) I say if that's the case, then even better static typing
can lead to even better code.
The problem of course being that by "better" you mean instead "a
parody of".

And here is where you fall into a common logical error. Just because
some of something is good doesn't _ever_ mean that lots of it _must_ be
better.

Salt and water are both necessary to human health. They will each kill
you if ingested in sufficient quantities, too.
At the same time, static typing restricts the flexibility of the
programming language. It makes certain designs impossible which are
otherwise simple with dynamic typing.
Sigh. Is the concept of "Turing Complete Language" no longer taught in
undergraduate computer science curricula? Just because one addicted to
dynamic typing cannot see a way to accomplish the same goal with types
checked at compile time (note that many statically typed languages, Ada
and Java among them, support run time polymorphism and dynamic dispatch)
doesn't, and by the definition of a Turing complete language, _cannot_,
mean that the same problem cannot be solved by a statically typed language.

It is all a matter of being willing to learn how it is done.
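A minimal example of the parenthetical point, that a statically typed
language like Java still selects method bodies at run time (the class
names here are illustrative, not from the thread):

```java
public class Dispatch {
    interface Shape { double area(); }

    static class Circle implements Shape {
        final double r;
        Circle(double r) { this.r = r; }
        public double area() { return Math.PI * r * r; }
    }

    static class Square implements Shape {
        final double s;
        Square(double s) { this.s = s; }
        public double area() { return s * s; }
    }

    // The parameter type is checked at compile time; the method body
    // that runs is selected at run time by the receiver's actual class.
    static double areaOf(Shape sh) { return sh.area(); }

    public static void main(String[] args) {
        System.out.println(areaOf(new Circle(1)));  // dispatches to Circle.area
        System.out.println(areaOf(new Square(2)));  // dispatches to Square.area
    }
}
```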

xanthian.
James A. Robertson
2003-10-02 06:04:20 UTC
Permalink
On Thu, 2 Oct 2003 05:08:33 +0000 (UTC), "Kent Paul Dolan"
Post by Kent Paul Dolan
Sigh. Is the concept of "Turing Complete Language" no longer taught in
undergraduate computer science curricula? Just because one addicted to
dynamic typing cannot see a way to accomplish the same goal with types
checked at compile time (note that many statically typed languages, Ada
and Java among them, support run time polymorphism and dynamic dispatch)
doesn't, and by the definition of a Turing complete language, _cannot_,
mean that the same problem cannot be solved by a static typed language.
Turing equivalence does not imply the same level of effort being
required. Read David's post again.
Post by Kent Paul Dolan
It is all a matter of being willing to learn how it is done.
xanthian.
<Talk Small and Carry a Big Class Library>
James Robertson, Product Manager, Cincom Smalltalk
http://www.cincomsmalltalk.com/blog/blogView
Björn Eiderbäck
2003-10-02 08:12:39 UTC
Permalink
Post by Kent Paul Dolan
Sigh. Is the concept of "Turing Complete Language" no longer taught in
undergraduate computer science curricula? Just because one addicted to
dynamic typing cannot see a way to accomplish the same goal with types
checked at compile time (note that many statically typed languages, Ada
and Java among them, support run time polymorphism and dynamic dispatch)
doesn't, and by the definition of a Turing complete language, _cannot_,
mean that the same problem cannot be solved by a static typed language.
But way harder! Maybe so hard that you never do it at all.
By the way: the last time I checked, the "Turing machine" was typeless!!
(or rather handled only one type).
Post by Kent Paul Dolan
It is all a matter of being willing to learn how it is done.
Exactly!!
Post by Kent Paul Dolan
xanthian.
Björn
Isaac Gouy
2003-10-01 20:52:57 UTC
Permalink
Post by James A. Robertson
Post by Isaac Gouy
-SNIP-
Post by Kent Paul Dolan
You've lost track of the goal here, which was to improve
the *quality* of software widely distributed for direct
Internet interface use, to reduce the set of current
disasters from buggy code.
-SNIP-
AFAIK many of the widely reported problems (buffer overflows) result
from lack of run-time checks rather than lack of static type checks.
Yep. Years of developers worrying about exactly the wrong thing.
In this case that "wrong thing" would be performance at the expense of
safety ;-)

The problem couldn't exist in Ada or Smalltalk (assuming the C
implementation of the Smalltalk primitives and OE does the 'right
thing'...)
Post by James A. Robertson
Post by Isaac Gouy
Tony Hoare on implementing Algol 60 (in 1961)
http://www.braithwaite-lee.com/opinions/p75-hoare.pdf
"(1) The first principle was security: ...
A consequence of this principle is that every occurrence of every
subscript of every subscripted variable was on every occasion checked
at run time against both the upper and the lower declared bounds of
the array.
Many years later we asked our customers whether they wished us to
provide an option to switch off these checks in the interests of
efficiency on production runs. Unanimously, they urged us not to -
they already knew how frequently subscript errors occur on production
runs where failure to detect them could be disastrous.
"
Thomas Gagné
2003-09-30 11:08:17 UTC
Permalink
Post by Kent Paul Dolan
Considering we already have software disasters with the
predominant languages all supporting static typing
suggests (at least) that compile-time testing is
inadequate to the task (I think we all agree on this) but
that testing is the only way to prove a system's
/correctness/.
Bzzzt! The proof that proving a program's correctness by
testing is infeasible is an undergraduate computer science
exercise.
Good point. Can we agree that testing gets us closer to delivering a
quality system than compiling and linking does?
Post by Kent Paul Dolan
<snip>
If the latter is the target and the former does not
guarantee the latter, then do we need the former?
Nothing we can pay for "does the latter"; we are in a mode
of trying to move to "doing the best we can" rather than
"doing the least we can".
You need the former because it takes out a whole class of
frequent errors quickly, automatically, and most important,
cheaply in comparison to formal SQA methods. In a large
programming effort, programmers are a small minority of all
salary costs, so having programmers work harder to save
overall costs makes economic sense.
I'd like to agree with you on the last point, but the offshore industry
would be less appealing were it true.

Our experience about compile time checking is from two different
universes. But it recently occurred to me why those types of errors are
less frequent (mostly absent) in Smalltalk than they are in Java, and it
has to do with idiom.

In Smalltalk, if I want a read or write stream on an object (a file or
an array) I ask the object to return me a stream of the proper type. I
don't have to know what the proper type is and be sure I pass it the
correct argument to its constructor. This is where /nearly all/ my type
errors come from in Java--parameters to constructors. I sometimes
reverse them (because I don't remember their order) or I forget an
argument (because I forget it's there). This is another kind of error
that doesn't happen in Smalltalk because the method's signature suggests
the number, types, and order of its parameters.
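The hazard described here, and the factory-method idiom that mitigates it
in Java, can be sketched as follows (the Range class is hypothetical,
purely for illustration):

```java
public class Range {
    final int low, high;

    private Range(int low, int high) {
        this.low = low;
        this.high = high;
    }

    // `new Range(10, 1)` with swapped arguments would compile silently;
    // a factory with a descriptive name can validate and fail fast.
    static Range between(int low, int high) {
        if (low > high) throw new IllegalArgumentException("low > high");
        return new Range(low, high);
    }

    public static void main(String[] args) {
        Range r = Range.between(1, 10);
        System.out.println(r.low + ".." + r.high);
    }
}
```

The factory doesn't make the mistake impossible, but it turns a silent
argument swap into an immediate, diagnosable failure.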

So James Robertson isn't blowing smoke on this, though I wasn't prepared
to offer any reasons why Smalltalkers don't appreciate the value of
compile-time type checking. Considering those reasons, Java programmers
are right, compiler errors are needed because Java programming is prone
to these kinds of errors.

But that's OK. It's what Java programmers want.
Post by Kent Paul Dolan
<snip
On the issue of payback, read the publications of the
Software Engineering Institute, in particular their widely
used Capability Maturity Model book, and judge for yourself;
I lent my copy out and never got it back. Sigh.
http://www.sei.cmu.edu/publications/publications.html
Thanks for the link. I'll check it out. I've had books go unreturned.
I've started writing my name on the pages, inside the cover, everywhere
I can to make sure that if people don't return them to me, they at least
will be reminded of it every time they look at the book.
Post by Kent Paul Dolan
There are probably all levels of programmers, just are
there are in any profession. Is it truly unimaginable
that some programmers are capable of delivering quality
(tested) systems without the assistance of compile-time
type checking?
It is truly imaginable that anyone I have ever heard make
that claim for himself or herself was just excusing bone
laziness. I have known half a dozen people with 180 IQs in
my life, and most likely each of them could pull it off, but
definitely each of them has better sense than to try, as
well. For the rest of us, yes, it is truly unimaginable, and
the record bears me out.
I've decided that I think you're right, but for different reasons. It
is possible to design a language and its class libraries that don't
*require* static typing just as it is possible to design a language that
does. I would never advocate removing compile-time checking from Java.
It just couldn't work no matter your IQ.
Post by Kent Paul Dolan
<snip>
Given that there is little choice in good times but to
employ programmers at all levels of competence, the question
of whether to give them all the double-checking help they
can get, whether they want it or not, has an obvious answer
if software quality and on-time delivery are important to
you as an employer.
Good point. It's too bad programmers as a whole are considered only as
competent as their weakest link, and there are more of them than there are
strong ones. But the market is determined by the masses... sigh.
Post by Kent Paul Dolan
If insanity really is doing the same thing over and over
again and expecting a different result, is it possible that
statically typed languages and the assumptions about them
have outlived their usefulness?
No, but it is possible you are slipping into ranting instead
of discussing.
Perhaps. I guess I was wondering out loud to myself. I can type nearly
as fast as I think, even on my QWERTY keyboard (another topic.. :-)
--
.tom
remove email address' dashes for replies
opensource middleware at <http://isectd.sourceforge.net>
http://gagne.homedns.org
Kent Paul Dolan
2003-10-01 06:33:39 UTC
Permalink
I can type nearly as fast as I think
Man, if I could only think as slowly as I type, how much better
considered a set of thoughts I could produce.

xanthian.
Martin Drautzburg
2003-09-29 18:39:15 UTC
Permalink
Post by Kent Paul Dolan
"Quick and dirty" software development is almost overwhelmingly
seductive. Fred Brooks' "5 lines of delivered code per programmer
day" for software in the large drives budget writers to fits of
frothing madness. With the attitude that "we can do better with
just a few shortcuts" come the disasters that have so often recently
swept across the web.
A few weeks ago I made a strange observation. I had written a little
application in squeak (Smalltalk). It had taken me about a
week. Before I showed it to my colleagues I counted the lines of code
because I wanted to say: "see you can write that within a week and it
only takes 800 LOC".

To my surprise I counted 5000 LOC.
Thomas Gagné
2003-09-30 11:10:20 UTC
Permalink
Post by Martin Drautzburg
Post by Kent Paul Dolan
"Quick and dirty" software development is almost overwhelmingly
seductive. Fred Brooks' "5 lines of delivered code per programmer
day" for software in the large drives budget writers to fits of
frothing madness. With the attitude that "we can do better with
just a few shortcuts" come the disasters that have so often recently
swept across the web.
A few weeks ago I made a strange observation. I had written a little
application in squeak (Smalltalk). It had taken me about a
week. Before I showed it to my colleagues I counted the lines of code
because I wanted to say: "see you can write that within a week and it
only takes 800 LOC".
To my surprise I counted 5000 LOC.
So, don't keep us in suspense! How many type errors were there in a
week's worth of 5000 lines of Smalltalk?

It does seem like a lot of code for a week. Perhaps you didn't have
enough time to make it shorter?
--
.tom
remove email address' dashes for replies
opensource middleware at <http://isectd.sourceforge.net>
http://gagne.homedns.org
Phillip Lord
2003-09-30 11:25:48 UTC
Permalink
Post by Martin Drautzburg
A few weeks ago I made a strange observation. I had written a
little application in squeak (Smalltalk). It had taken me about a
week. Before I showed it to my colleagues I counted the lines of
code because I wanted to say: "see you can write that within a
week and it only takes 800 LOC".
To my surprise I counted 5000 LOC.
Thomas> So, don't keep us in suspense! How many type errors were
Thomas> there in a week's worth of 5000 lines of Smalltalk?

I thought it was a nice story. "I'll just knock this up" are classic
famous last words. You often find that you have written more than you
think.

Phil
Isaac Gouy
2003-09-30 21:17:11 UTC
Permalink
Post by Phillip Lord
Post by Martin Drautzburg
A few weeks ago I made a strange observation. I had written a
little application in squeak (Smalltalk). It had taken me about a
week. Before I showed it to my colleagues I counted the lines of
code because I wanted to say: "see you can write that within a
week and it only takes 800 LOC".
To my surprise I counted 5000 LOC.
Thomas> So, don't keep us in suspense! How many type errors were
Thomas> there in a week's worth of 5000 lines of Smalltalk?
I thought it was a nice story. "I'll just knock this up" are classic
famous last words. You often find that you have written more than you
think.
Maybe you missed the Smalltalk joke -
Post by Phillip Lord
It does seem like a lot of code for a week. Perhaps you didn't have
enough time to make it shorter?
The refactoring experience is to add functionality and reduce line
count, simultaneously ;-)
Phillip Lord
2003-10-01 11:45:03 UTC
Permalink
Isaac> Maybe you missed the Smalltalk joke -
Post by Thomas Gagné
It does seem like a lot of code for a week. Perhaps you didn't
have enough time to make it shorter?
Isaac> The refactoring experience is to add functionality and reduce
Isaac> line count, simultaneously ;-)

I'm not sure that this is unique to Smalltalk. I forget who wrote
"Sorry this is such a longer letter, I didn't have time to a shorter
one".

It's true in most languages, at least the way I write. Chopping out
code is one of the best ways of improving it.



Which reminds me of this quote. It's more or less irrelevant, but I'll
stick it on anyway...


'How is the Dictionary getting on?' said Winston, raising his voice to
overcome the noise.

'Slowly,' said Syme. 'I'm on the adjectives. It's fascinating.'

He had brightened up immediately at the mention of Newspeak. He pushed
his pannikin aside, took up his hunk of bread in one delicate hand and
his cheese in the other, and leaned across the table so as to be able
to speak without shouting.

'The Eleventh Edition is the definitive edition,' he said. 'We're
getting the language into its final shape -- the shape it's going to
have when nobody speaks anything else. When we've finished with it,
people like you will have to learn it all over again. You think, I
dare say, that our chief job is inventing new words. But not a bit of
it! We're destroying words -- scores of them, hundreds of them, every
day. We're cutting the language down to the bone. The Eleventh Edition
won't contain a single word that will become obsolete before the year
2050.'

He bit hungrily into his bread and swallowed a couple of mouthfuls,
then continued speaking, with a sort of pedant's passion. His thin
dark face had become animated, his eyes had lost their mocking
expression and grown almost dreamy.

'It's a beautiful thing, the destruction of words. Of course the great
wastage is in the verbs and adjectives, but there are hundreds of
nouns that can be got rid of as well. It isn't only the synonyms;
there are also the antonyms. After all, what justification is there
for a word which is simply the opposite of some other word? A word
contains its opposite in itself. Take "good", for instance. If you
have a word like "good", what need is there for a word like "bad"?
"Ungood" will do just as well -- better, because it's an exact
opposite, which the other is not. Or again, if you want a stronger
version of "good", what sense is there in having a whole string of
vague useless words like "excellent" and "splendid" and all the rest
of them? "Plusgood" covers the meaning, or "doubleplusgood" if you
want something stronger still. Of course we use those forms
already. But in the final version of Newspeak there'll be nothing
else. In the end the whole notion of goodness and badness will be
covered by only six words -- in reality, only one word. Don't you see
the beauty of that, Winston? It was B.B.'s idea originally, of
course,' he added as an afterthought.
Thomas Gagné
2003-10-01 12:49:06 UTC
Permalink
Post by Phillip Lord
Isaac> Maybe you missed the Smalltalk joke -
Post by Thomas Gagné
It does seem like a lot of code for a week. Perhaps you didn't
have enough time to make it shorter?
Isaac> The refactoring experience is to add functionality and reduce
Isaac> line count, simultaneously ;-)
"Sorry this is such a longer letter, I didn't have time to a shorter
one".
B. Pascal.
--
.tom
remove email address' dashes for replies
opensource middleware at <http://isectd.sourceforge.net>
http://gagne.homedns.org
Phillip Lord
2003-10-01 13:36:04 UTC
Permalink
Post by Phillip Lord
"Sorry this is such a longer letter, I didn't have time to a
shorter one".
Thomas> B. Pascal.



How strange. What goes around, comes around, I guess.

Phil
Martin Drautzburg
2003-09-30 19:13:48 UTC
Permalink
Post by Thomas Gagné
Post by Martin Drautzburg
A few weeks ago I made a strange observation. I had written a little
application in squeak (Smalltalk). It had taken me about a
week. Before I showed it to my colleagues I counted the lines of code
because I wanted to say: "see you can write that within a week and it
only takes 800 LOC".
To my surprise I counted 5000 LOC.
So, don't keep us in suspense! How many type errors were there in a
week's worth of 5000 lines of Smalltalk?
There were probably about as many errors as there were lines of
code. Most of them were typing errors. These were fixed within seconds. Er
- you asked for *type* errors, not typing errors, didn't you? Well, I
didn't pay attention. I didn't know they were something special.
Post by Thomas Gagné
It does seem like a lot of code for a week. Perhaps you didn't have
enough time to make it shorter?
Absolutely right.
Kent Paul Dolan
2003-10-04 09:40:04 UTC
Permalink
Post by Kent Paul Dolan
"Quick and dirty" software development is almost overwhelmingly
seductive.
I hadn't intended to make a career out of this one thread, and I
think the heat of the discussion has made the above point abundantly
clear. You all argue it out among yourselves.

xanthian.
James A. Robertson
2003-10-04 14:51:28 UTC
Permalink
On Sat, 4 Oct 2003 09:40:04 +0000 (UTC), "Kent Paul Dolan"
Post by Kent Paul Dolan
Post by Kent Paul Dolan
"Quick and dirty" software development is almost overwhelmingly
seductive.
I hadn't intended to make a career out of this one thread, and I
think the heat of the discussion has made the above point abundantly
clear. You all argue it out among yourselves.
Translation: I have no answers to the valid points that have been
raised. But thanks for playing.
Post by Kent Paul Dolan
xanthian.
<Talk Small and Carry a Big Class Library>
James Robertson, Product Manager, Cincom Smalltalk
http://www.cincomsmalltalk.com/blog/blogView
Kent Paul Dolan
2003-10-04 19:55:49 UTC
Permalink
Post by James A. Robertson
Translation: I have no answers to the valid points that have been
raised. But thanks for playing.
Sorry, ducks, but after watching you flounder about in the language
where you claim expertise, unable to get the point even though multiple
posters tried to explain it to you, you've taken on that characteristic
called "invincible ignorance", and I try not to play that game.

xanthian.
James A. Robertson
2003-10-04 22:53:51 UTC
Permalink
On Sat, 4 Oct 2003 19:55:49 +0000 (UTC), "Kent Paul Dolan"
Post by Kent Paul Dolan
Post by James A. Robertson
Translation: I have no answers to the valid points that have been
raised. But thanks for playing.
Sorry, ducks, but after watching you flounder about in the language
where you claim expertise, unable to get the point even though multiple
posters tried to explain it to you, you've taken on that characteristic
called "invincible ignorance", and I try not to play that game.
The question you have refused to answer is why having bit level limits
that cause overflow/underflow issues is a good thing. Numeric limits
at the app level are business related, not bit related. Having to
worry whether a given value conforms to a business rule is important -
having to worry whether it fits in N bits is just stupid. You don't
see that, or you don't understand it; it's unclear which.

As to ignorance, you haven't really pointed to any - but you do seem
to love name calling.
Post by Kent Paul Dolan
xanthian.
<Talk Small and Carry a Big Class Library>
James Robertson, Product Manager, Cincom Smalltalk
http://www.cincomsmalltalk.com/blog/blogView
Earthlink
2003-10-05 07:28:45 UTC
Permalink
Post by James A. Robertson
On Sat, 4 Oct 2003 19:55:49 +0000 (UTC), "Kent Paul Dolan"
Post by Kent Paul Dolan
Post by James A. Robertson
Translation: I have no answers to the valid points that have been
raised. But thanks for playing.
Sorry, ducks, but after watching you flounder about in the language
where you claim expertise, unable to get the point even though multiple
posters tried to explain it to you, you've taken on that characteristic
called "invincible ignorance", and I try not to play that game.
The question you have refused to answer is why having bit level limits
that cause overflow/underflow issues is a good thing. Numeric limits
at the app level are business related, not bit related. Having to
worry whether a given value conforms to a business rule is important -
having to worry whether it fits in N bits is just stupid.
I guess after lurking through this discussion for several days that I'm
missing your rational(ization) too. I DO need to know the number of bits if
I'm to create software that has to interact with any other system out there.
Examples include: storing data into a database and serializing data for
any other system besides Smalltalk.
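A concrete version of the serialization point, using Java's standard
DataOutputStream, which commits an int to exactly four big-endian bytes
regardless of its value:

```java
import java.io.ByteArrayOutputStream;
import java.io.DataOutputStream;
import java.io.IOException;

public class WireFormat {
    // Encodes an int as exactly four big-endian bytes, the fixed wire
    // width of a 32-bit Java int.
    static byte[] encode(int value) {
        ByteArrayOutputStream bytes = new ByteArrayOutputStream();
        DataOutputStream out = new DataOutputStream(bytes);
        try {
            out.writeInt(value);        // always 4 bytes, big-endian
        } catch (IOException e) {       // cannot happen for an in-memory stream
            throw new RuntimeException(e);
        }
        return bytes.toByteArray();
    }

    public static void main(String[] args) {
        byte[] b = encode(258);         // 258 == 0x00000102
        System.out.println(b.length + " bytes");
    }
}
```

An arbitrary-precision integer has no such fixed width, which is the
interoperability handicap being described.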

Ignoring the interoperability issues for a second, I also have to question
if the flexibility of the automatic type promotion is worth the performance
trade-offs? It's interesting that a language like C#, clearly designed to
do Java one better (please argue their success in a different thread),
chose to expose much of the "metal" with value types. To address your point
about under/overflow, if that's important to you, you can mark any
manipulations with "checked" and you get compiler-level under/overflow
checking in addition to the speed of native math.
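For comparison, later versions of Java grew an analogous facility to C#'s
"checked": the Math.*Exact methods keep native int arithmetic but throw
on overflow instead of wrapping silently. A sketch:

```java
public class CheckedMath {
    public static void main(String[] args) {
        System.out.println(Math.addExact(1, 2));    // ordinary case: 3
        try {
            Math.addExact(Integer.MAX_VALUE, 1);    // plain '+' would wrap
        } catch (ArithmeticException e) {
            System.out.println("overflow trapped: " + e.getMessage());
        }
    }
}
```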

The auto promotion sure seems like a nice feature for prototyping and
probably works great for non-speed (or less speed) sensitive Smalltalk only
deployments, but I really question if the trade-offs work in the real world.

--hms
James A. Robertson
2003-10-05 14:08:17 UTC
Permalink
Post by Earthlink
Post by James A. Robertson
On Sat, 4 Oct 2003 19:55:49 +0000 (UTC), "Kent Paul Dolan"
The question you have refused to answer is why having bit level limits
that cause overflow/underflow issues is a good thing. Numeric limits
at the app level are business related, not bit related. Having to
worry whether a given value conforms to a business rule is important -
having to worry whether it fits in N bits is just stupid.
I guess after lurking through this discussion for several days that I'm
missing your rational(ization) too. I DO need to know the number of bits if
I'm to create software that has to interact with any other system out there.
Examples include: storing data into a database and serializing data for
any other system besides Smalltalk.
The fact that the database cares is a bug, not a feature. Again, the
only relevant piece of information is any business rule on the value
of a number - how many bits it fits in is mental overhead.
Post by Earthlink
Ignoring the interoperability issues for a second, I also have to question
if the flexibility of the automatic type promotion is worth the performance
trade-offs? It's interesting that a language like C#, clearly designed to
do Java one better (please argue their success in a different thread),
chose to expose much of the "metal" with value types. To address your issue
of under/over flow issues, if that's important to you, you can mark any
manipulations with "checked" and you get compiler level under/over flow
checking in addition to the speed of native math.
In Smalltalk, I simply don't worry about it at all. Arithmetic "just
works", and I can go worry about other things.
Post by Earthlink
The auto promotion sure seems like a nice feature for prototyping and
probably works great for non-speed (or less speed) sensitive Smalltalk only
deployments, but I really question if the trade-offs work in the real world.
Seems to have worked out just fine over the last 20 some odd years.
<Talk Small and Carry a Big Class Library>
James Robertson, Product Manager, Cincom Smalltalk
http://www.cincomsmalltalk.com/blog/blogView
Earthlink
2003-10-05 16:50:06 UTC
Permalink
Post by James A. Robertson
Post by Earthlink
Post by James A. Robertson
On Sat, 4 Oct 2003 19:55:49 +0000 (UTC), "Kent Paul Dolan"
The question you have refused to answer is why having bit level limits
that cause overflow/underflow issues is a good thing. Numeric limits
at the app level are business related, not bit related. Having to
worry whether a given value conforms to a business rule is important -
having to worry whether it fits in N bits is just stupid.
I guess after lurking through this discussion for several days that I'm
missing your rational(ization) too. I DO need to know the number of bits if
I'm to create software that has to interact with any other system out there.
Examples include: storing data into a database and serializing data for
any other system besides Smalltalk.
The fact that the database cares is a bug, not a feature. Again, the
only relevant piece of information is any business rule on the value
of a number - how many bits it fits in is mental overhead.
We can argue if it's a bug or a feature another day. It sure looks to me
like almost EVERYONE/EVERYTHING cares about variable sizes when interacting
outside of their native environment. If you use a language that gives you
no control, or only limited control, over variable sizes, you suffer a
handicap during integration.
Post by James A. Robertson
Post by Earthlink
Ignoring the interoperability issues for a second, I also have to question
if the flexibility of the automatic type promotion is worth the performance
trade-offs? It's interesting that a language like C#, clearly designed to
do Java one better (please argue their success in a different thread),
chose to expose much of the "metal" with value types. To address your issue
of under/over flow issues, if that's important to you, you can mark any
manipulations with "checked" and you get compiler level under/over flow
checking in addition to the speed of native math.
IN Smalltalk, I simply don't worry about it at all. Arithmetic "just
works", and I can go worry about other things.
Post by Earthlink
The auto promotion sure seems like a nice feature for prototyping and
probably works great for non-speed (or less speed) sensitive Smalltalk only
deployments, but I really question if the trade-offs work in the real world.
Seems to have worked out just fine over the last 20 some odd years.
At what cost? Doesn't seem like Smalltalk has become a wildly popular
language. I'm sure that the auto promotion features are not the root cause,
but it's just one more straw on the back of that camel.
Martin Drautzburg
2003-10-06 18:40:08 UTC
Permalink
Post by Earthlink
Post by James A. Robertson
The fact that the database cares is a bug, not a feature. Again, the
only relevant piece of information is any business rule on the value
of a number - how many bits it fits in is mental overhead.
We can argue if it's a bug or a feature another day. It sure looks to me
like almost EVERYONE/EVERYTHING cares about variable sizes when interacting
outside of their native environment. If you use a language that doesn't give
you any, or limited, control over variable sizes, you suffer handicap during
integration.
If that were a type issue, then Java would certainly need quite a few
more types e.g. an ofloat datatype, whose values will always fit into
an Oracle NUMBER field.

Dr Chaos
2003-10-06 00:20:20 UTC
Permalink
Post by James A. Robertson
The fact that the database cares is a bug, not a feature. Again, the
only relevant piece of information is any business rule on the value
of a number - how many bits it fits in is mental overhead.
suppose it takes 10,000,000,000,000,000,000,000 bits.

everybody, even a database, will eventually care.
Bobby Parker
2003-10-06 01:13:25 UTC
Permalink
Post by James A. Robertson
The fact that the database cares is a bug, not a feature. Again, the
only relevant piece of information is any business rule on the value
of a number - how many bits it fits in is mental overhead.
Honestly, I would think a programmer would understand these things. It's
clear to me you don't understand enterprise databases at all. The number of
bits required for storage in a database DIRECTLY AFFECTS the physical size
of the database itself, and so such numbers are chosen in such a way as to
optimize database performance and/or storage space. The general rule of
thumb is that the intelligent developer chooses the appropriate number type
for the job at hand.

Go read the documentation for MySQL, and storage requirements for the same,
when you reach millions of records. Then do the math on it. You might be
surprised.

GRANTED: Nowadays data storage has become quite cheap, with massive amounts
of space to be had, etc., however the performance issues STILL
apply. A bit of research is indeed in order for you.

bp
Bobby Parker
2003-10-05 12:21:02 UTC
Permalink
Post by James A. Robertson
The question you have refused to answer is why having bit level limits
that cause overflow/underflow issues is a good thing. Numeric limits
at the app level are business related, not bit related. Having to
worry whether a given value conforms to a business rule is important -
having to worry whether it fits in N bits is just stupid. You don't
see that, or you don't understand it; it's unclear which.
I hate to say it, but all this noise about "business rules" just makes me
remember COBOL.

Seriously, I never looked at a given language from a "business" standpoint.
It would seem to me that taking that angle would cause you to overlook
other realms of software development, and result in a language that was
useless for other things. As far as data manipulation goes, numeric limits
at the app level would *really* depend upon the application's goals. I
wouldn't just say it's a "business related" issue.

bp
Dr Chaos
2003-10-06 00:18:53 UTC
Permalink
Post by James A. Robertson
On Sat, 4 Oct 2003 19:55:49 +0000 (UTC), "Kent Paul Dolan"
Post by Kent Paul Dolan
Post by James A. Robertson
Translation: I have no answers to the valid points that have been
raised. But thanks for playing.
Sorry, ducks, but after watching you flounder about in the language
where you claim expertise, unable to get the point even though multiple
posters tried to explain it to you, you've taken on that characteristic
called "invincible ignorance", and I try not to play that game.
The question you have refused to answer is why having bit level limits
that cause overflow/underflow issues is a good thing. Numeric limits
at the app level are business related, not bit related.
Why does Smalltalk not support arbitrary algebraic or p-adic integer
fields instead of ordinary "Z"?
Post by James A. Robertson
Having to
worry whether a given value conforms to a business rule is important -
having to worry whether it fits in N bits is just stupid. You don't
see that, or you don't understand it; it's unclear which.
Everybody has to make a choice between generality and concrete
usefulness somewhere.
Isaac Gouy
2003-10-04 22:52:58 UTC
Permalink
Post by James A. Robertson
On Sat, 4 Oct 2003 09:40:04 +0000 (UTC), "Kent Paul Dolan"
Post by Kent Paul Dolan
Post by Kent Paul Dolan
"Quick and dirty" software development is almost overwhelmingly
seductive.
I hadn't intended to make a career out of this one thread, and I
think the heat of the discussion has made the above point abundantly
clear. You all argue it out among yourselves.
Translation: I have no answers to the valid points that have been
raised. But thanks for playing.
Play is good.

There were some things to be learned:

- I learned (or was reminded) about unsafe integer overflow in Java
and C# (and C++ and C?). Seems this untrapped error could create
problems similar to those xanthian described in the tropical cyclone
system.

- maybe Thomas has a better idea about what the equivalent of blocks
looks like in a statically typed language.

- hopefully David wasn't so put-off by the abuse that he failed to
read the example of lexical closures in a manifestly typed language.

- maybe we were reminded that "Quick and dirty" can also mean choosing
speed over safety by not performing runtime checks. Java, C#, ... have
type checking weaknesses that aren't present in other statically typed
languages.

- it would be interesting to see lifecycle cost measurements for
Smalltalk. There are function point measurements, but it seems
reasonable to suggest that manifest type information as documentation
and static type checks are most valuable during code maintenance.
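On the "equivalent of blocks in a statically typed language" point above: in the Java of this era the usual stand-in for a Smalltalk block like [:x | x + 10] was an anonymous inner class over a one-method interface. A minimal sketch (the interface name is illustrative, not a standard API):

```java
// One-method interface playing the role of a block's type.
interface IntFunction1 {
    int apply(int x);
}

public class BlockDemo {
    // Higher-order method: applies the "block" twice.
    static int applyTwice(IntFunction1 f, int x) {
        return f.apply(f.apply(x));
    }

    public static void main(String[] args) {
        final int offset = 10; // captured locals had to be final
        IntFunction1 addOffset = new IntFunction1() {
            public int apply(int x) { return x + offset; }
        };
        System.out.println(applyTwice(addOffset, 1)); // prints 21
    }
}
```

The verbosity of this idiom, compared to a one-line block, is much of what the Smalltalk side of the thread is complaining about.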
mathew
2003-10-05 21:59:24 UTC
Permalink
Post by Kent Paul Dolan
In that case, and with the software industry's miserable current
record, choosing to opt out of some sorts of testing like compile
time type checking because they are "productivity destroying" is
to risk becoming another provider of the kind of worthless
rubbish currently infesting the Net.
Static type checking is undoubtedly a good idea. However, like many good
ideas, it becomes substantially less good when made compulsory in all
situations.

Documentation is undoubtedly a good idea too. Yet I can't help thinking
that there would be howls of complaint if lack of JavaDoc directives was
made a fatal compilation error in Java 1.5.

Complete lack of static type checking is an extremist position.
Mandatory static type checking is also an extremist position. I contend
that the sensible tradeoff is to have static type checking, but allow it
to be bypassed when necessary to prevent an explosion of
complexity--because needless complexity is a much nastier demon than the
possibility of run-time type errors.


mathew
Isaac Gouy
2003-09-29 06:33:24 UTC
Permalink
Post by Kent Paul Dolan
I've used Lisp and Perl
Lisp and Scheme have so much to offer that I wish I'd learned them
years ago - not sure I ever will now.
Post by Kent Paul Dolan
I'll probably never try Smalltalk, languages that promote
sloppiness by making nearly every statement execute whether
it makes sense as written or not have highly negative
effects on my productivity; I prefer the stricter languages
like Ada, Pascal, or Modula 2 that catch most of my logic
errors at compile time.
Never did enough Ada to really get it, there's so much there.
Modula-2 I liked. Then there was a project that might need some kind
of AI approach, and before that could be tried it would rely on direct
manipulation of gantt charts and histograms - 15 years ago on a 286 PC
;-)
I saw an article about a Smalltalk system developed at Tektronix for
debugging oscilloscopes which combined graphics of where to set the
test probes with expert advice; a Smalltalk implementation appeared
for the 286 and I learnt that you can get an awful lot done with
Smalltalk ;-)

"making nearly every statement execute"
Don't know what you mean by this?
Are you just referring to lack of static type checking?
Post by Kent Paul Dolan
In comparison to all the languages in that kit which I
_have_ used, and to a long list of others I'm sure I could
no longer reproduce accurately, Java is most productive for
programming tasks of the type I do today, heavy in human
interface construction and complex data structures with lots
of objects of short lifespans
Moving from a limited language like Modula-2 or something as capable
as Ada, to Java - I can see that. Moving from Smalltalk to Java you
simply gave up productivity - partly language, partly tools, partly
libraries.

Happily some of the Smalltalk stuff has appeared in Java (refactoring
browser, xUnit) and, after a period of stagnation, there are "new and
improved" Smalltalk implementations.
Post by Kent Paul Dolan
The sidebar to put a limit on how seriously this should all
be taken is that I'm strictly an imperative language
programmer; I have no "hands-on" knowledge of the
productivity of functional or equational languages in
current use.
Well, there's a language called Nice which brings some of the ideas
that have been developed in functional programming into an object
oriented language, that runs on JVM and interoperates with Java
classes and methods.

There are things about Nice that will appeal to you - it's intended to
be a safer language than Java - distinguishing between ordinary types
and Option Types which may include null, works wonders for
NullPointerExceptions; parametric classes remove the need for nonsense
casts; there's support for DbC.

For me, the appeal is that Nice is more expressive and has less
baggage - anonymous functions instead of inner classes, type inference
for method variables instead of redundant declarations, a type system
where int is a subtype of double...

Given that you favour static type checking, the modern pure functional
languages are quite a kick - you gotta love type inference ;-)
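The Option-type idea mentioned above can be sketched in Java as a small wrapper that forces an explicit emptiness check before the value can be touched. This is purely illustrative: the generics used here only reached Java in 1.5, and java.util.Optional arrived far later, in Java 8.

```java
// A minimal option type: the only route to the value is past a check.
final class Opt<T> {
    private final T value; // null means "none"

    private Opt(T value) { this.value = value; }

    static <T> Opt<T> some(T value) {
        if (value == null) throw new IllegalArgumentException("null");
        return new Opt<T>(value);
    }

    static <T> Opt<T> none() { return new Opt<T>(null); }

    boolean isPresent() { return value != null; }

    T get() {
        if (value == null) throw new IllegalStateException("empty");
        return value;
    }
}

public class OptionDemo {
    public static void main(String[] args) {
        Opt<String> name = Opt.some("xanthian");
        if (name.isPresent()) {
            System.out.println(name.get().length()); // prints 8
        }
    }
}
```

The point of Nice's version is that the distinction lives in the type system, so forgetting the isPresent() check is a compile-time error rather than a runtime NullPointerException.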
Kent Paul Dolan
2003-09-29 14:44:21 UTC
Permalink
Post by Isaac Gouy
There are things about Nice that will appeal to you - it's intended to
be a safer language than Java - distinguishing between ordinary types
and Option Types which may include null, works wonders for
NullPointerExceptions;
This brings to mind one of Java's least successful endeavors: for a
language which claims not to _have_ pointers, finding the most common
way for software to die is via a _pointer_ exception is _way_
offputting. The Java compiler could do far more path analysis and warn
the programmer when a reachable null dereference exists in the code, as
other languages' compilers do; but perhaps the
tradeoff to do that much extra work against Java's currently blindingly
fast compilation speed is considered too grim. Still, a lint()-like
routine that spent the extra effort at the programmer's option to do
such path analysis would be a welcome toolkit member; does such exist?
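A minimal sketch of the kind of reachable-null path such an analysis would flag; the compiler accepts this without complaint. (As it happens, static checkers in the lint() spirit, such as FindBugs, appeared for Java around this time and catch exactly this pattern.)

```java
public class NullPathDemo {
    // Compiles cleanly, yet one path dereferences null.
    static int length(String s, boolean useDefault) {
        String value = useDefault ? "default" : null;
        if (s != null) {
            value = s;
        }
        // If s == null and useDefault == false, value is null here:
        return value.length(); // possible NullPointerException
    }

    public static void main(String[] args) {
        System.out.println(length("hello", false)); // prints 5
        try {
            length(null, false);
        } catch (NullPointerException e) {
            System.out.println("NPE on the null path");
        }
    }
}
```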

Referring back to Nice, which I've looked at briefly, I suspect putting
the burden on the programmer to express the suitability of "null" as a
value for each object reference is entirely appropriate, and the markups
that Perl, for example, requires for the programmer to define the
intention of a token as a scalar, array, hash, or function name is
similar in impact: a slight mental adjustment and you're there.

You might want to look at OCaml, a strongly typed object oriented
version of Caml, a child of functional language ML, a child of Lisp, if
I have the story right. It is reported to be an extremely productive
language compared to the C family, and to create extremely tight code.

http://www.ocaml.org/ -- OCaml
http://caml.inria.fr/ -- Caml
http://www.faqs.org/faqs/meta-lang-faq/ -- good place to start ML

I'm a bit excluded by lacking more than a cursory idea of the
functioning of functional programming.

That's if you're still in the new language learning mode. Each new
one for me is becoming more and more of a struggle as I deal with the
problems of advancing age, and Java as a language system in the large
is growing faster than I can keep up, so far, which is actually praise.

xanthian.
--
Posted via Mailgate.ORG Server - http://www.Mailgate.ORG
Isaac Gouy
2003-09-29 21:16:19 UTC
Permalink
Post by Kent Paul Dolan
Post by Isaac Gouy
There are things about Nice that will appeal to you - it's intended to
be a safer language than Java - distinguishing between ordinary types
and Option Types which may include null, works wonders for
NullPointerExceptions;
This brings to mind one of Java's least successful endeavors: for a
language which claims not to _have_ pointers, finding the most common
way for software to die is via a _pointer_ exception is _way_
offputting. The Java compiler could do a lot more work to do path
analysis and warn the programmer if a way to achieve a null dereference
exists in the code, other language's compilers do
Quite - all those runtime failures are a bit much after the sermons on
static type checking ;-)
Post by Kent Paul Dolan
Still, a lint()-like
routine that spent the extra effort at the programmer's option to do
such path analysis would be a welcome toolkit member; does such exist?
Don't know.
Post by Kent Paul Dolan
Referring back to Nice, which I've looked at briefly, I suspect putting
the burden on the programmer to express the suitability of "null" as a
value for each object reference is entirely appropriate, and the markups
that Perl, for example, requires for the programmer to define the
intention of a token as a scalar, array, hash, or function name is
similar in impact: a slight mental adjustment and you're there.
You might want to look at OCaml,
Thanks, last year I gave it a try and stumbled over my confusion about
OCaml's ideas about subtyping. It seemed that there were more hoops to
jump through - that hasn't seemed to be the case with Nice.
Post by Kent Paul Dolan
I'm a bit excluded by lacking more than a cursory idea of the
functioning of functional programming.
It's hard to gain a new way to think about programming. I'm just
starting to grok simple algebraic datatypes and think about
programming as transformation.
Post by Kent Paul Dolan
That's if you're still in the new language learning mode. Each new
one for me is becoming more and more of a struggle as I deal with the
problems of advancing age, and Java as a language system in the large
is growing faster than I can keep up, so far, which is actually praise.
Doing more of the same had less and less appeal. Happily I stumbled
onto a purist functional language which forced me to think differently
about programming. Clean compiles to fast code, but it's definitely
more of a research language than OCaml.
Skip Hendrix
2003-10-01 11:31:34 UTC
Permalink
http://blogs.law.harvard.edu/philg/2003/09/20#a1762
P. Greenspun?

Take a look at ArsDigita

oh wait. I guess you can't.