Discussion:
Worse is better?
Joakim via Digitalmars-d
2014-10-08 19:44:03 UTC
Permalink
This is a somewhat famous phrase from a late-'80s essay that
gets mentioned sometimes, but I hadn't read it till this week.
It's a fascinating one-page read; he predicted that Lisp would
lose out to C++ when he delivered this speech in 1990. Well
worth reading:

https://www.dreamsongs.com/RiseOfWorseIsBetter.html

Since "worse" and "better" are subjective terms, I interpret it
as "simpler spreads faster and wider than complex." He thinks
simpler is worse and complex is often better, hence the title.
Perhaps it's not as true anymore because that was the wild west
of computing back then, whereas billions of people use the
software built using these languages these days, so maybe we
cannot afford to be so fast and loose.

What does this have to do with D? Well, the phenomenon he describes
probably has a big effect on D's adoption even today, as he was
talking about the spread of programming languages, ones we use to
this day. Certainly worth thinking about, as we move forward
with building D.
John Carter via Digitalmars-d
2014-10-08 20:36:57 UTC
Permalink
Post by Joakim via Digitalmars-d
This is a somewhat famous phrase from a late '80s essay that's
mentioned sometimes, but I hadn't read it till this week.
Keep reading; he is still pretty ambivalent about the whole
concept...

http://dreamsongs.com/WorseIsBetter.html
Peter Alexander via Digitalmars-d
2014-10-09 08:17:07 UTC
Permalink
Post by Joakim via Digitalmars-d
What does this have to do with D? Well, the phenomenon he describes
probably has a big effect on D's adoption even today, as he was
talking about the spread of programming languages, ones we use
to this day. Certainly worth thinking about, as we move
forward with building D.
That ship has sailed for D. It is no longer a simple language. It
now tries to do The Right Thing.

I found the turning point:

https://github.com/D-Programming-Language/dlang.org/commit/67e5f0d8b59aa0ce26b2be9bd79c93d1127b2db6#diff-b6ac8bc22fdbb33f7266c9422db97c2bL212

:-)
deadalnix via Digitalmars-d
2014-10-10 00:36:44 UTC
Permalink
On Thursday, 9 October 2014 at 08:17:09 UTC, Peter Alexander
Post by Peter Alexander via Digitalmars-d
Post by Joakim via Digitalmars-d
What does this have to do with D? Well, the phenomenon he describes
probably has a big effect on D's adoption even today, as he
was talking about the spread of programming languages, ones we
use to this day. Certainly worth thinking about, as we move
forward with building D.
That ship has sailed for D. It is no longer a simple language.
It now tries to do The Right Thing.
https://github.com/D-Programming-Language/dlang.org/commit/67e5f0d8b59aa0ce26b2be9bd79c93d1127b2db6#diff-b6ac8bc22fdbb33f7266c9422db97c2bL212
:-)
Is this the politically correct way to say "we don't care about
simplicity anymore!"?
Peter Alexander via Digitalmars-d
2014-10-10 09:00:16 UTC
Permalink
Post by deadalnix via Digitalmars-d
Is this the politically correct way to say "we don't care about
simplicity anymore!"?
Heh. I don't think so. We've just rebalanced our priorities.

You can't have simple, expressive, and low level control. D1 was
simple but lacking in expressiveness and control. D2 traded in
some simplicity to improve the situation. I think it has been
worthwhile (modulo the inevitable hiccups and warts).
Ola Fosheim Grostad via Digitalmars-d
2014-10-10 21:11:19 UTC
Permalink
Post by Peter Alexander via Digitalmars-d
You can't have simple, expressive, and low level control.
Why not?
Peter Alexander via Digitalmars-d
2014-10-10 21:54:30 UTC
Permalink
On Friday, 10 October 2014 at 21:11:20 UTC, Ola Fosheim Grostad
On Friday, 10 October 2014 at 09:00:17 UTC, Peter Alexander
Post by Peter Alexander via Digitalmars-d
You can't have simple, expressive, and low level control.
Why not?
It's just something I believe from experience.

The gist of my reasoning is that to get low level control you
need to specify things. When those things are local and isolated,
all is good, but often the things you specify bleed across
interfaces and affect either all the implementations (making
things more complex) or all the users (making things less
expressive).

For example, consider the current memory allocation/management
debate. I cannot think of a possible way to handle this that
simultaneously:

(a) gives users full control over how every function
allocates/manages memory (control).
(b) makes the implementation of those functions easy (simple).
(c) makes it easy to compose functions with different management
policies (expressive).

There are trade-offs on every axis. I'm sure we'll be able to
find something reasonable, that maybe does a good job on each
axis, but I don't think it's possible to get 10/10 on all of them.
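
For illustration, a rough D sketch (the names are made up for this
post, not taken from Phobos or the allocator discussion) of how
giving the caller control over allocation leaks into signatures and
composition:

import core.stdc.stdlib : malloc, free;

// (a) control: the caller decides how memory is obtained, but the
// policy now appears in the signature and every implementation must
// thread it through.
int[] dupWith(alias allocate)(const(int)[] src)
{
    // no error handling; sketch only
    auto p = cast(int*) allocate(src.length * int.sizeof);
    p[0 .. src.length] = src[];
    return p[0 .. src.length];
}

// (b) simple: the GC hides the policy, but the caller has no say in it.
int[] dupSimple(const(int)[] src)
{
    return src.dup; // GC-allocated copy
}

void main()
{
    auto a = dupWith!malloc([1, 2, 3]); // caller-chosen policy, caller-owned
    auto b = dupSimple([1, 2, 3]);      // zero ceremony, zero control
    // (c) composing the two styles is where the friction shows: `a`
    // must be freed by hand, `b` belongs to the GC, and anything built
    // on top has to know which is which.
    free(a.ptr);
}

Each variant does well on one axis and poorly on the others, which is
the trade-off being described.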

Maybe there's a way to do it, but if there is I imagine that
language and programming experience is going to be vastly
different from what we have now (in any language).
Chris Williams via Digitalmars-d
2014-10-10 22:22:08 UTC
Permalink
Post by Peter Alexander via Digitalmars-d
(a) gives users full control over how every function
allocates/manages memory (control).
(b) makes the implementation of those functions easy (simple).
(c) makes it easy to compose functions with different
management policies (expressive).
Probably the method would be to make garbage management an aspect
of the language itself, like how Go handles parallel processing
at the compiler level. Developers would write everything as if it
were all magically garbage collected, with maybe a few
metatags/keywords sprinkled around, then tell the compiler what
the default garbage collection should be, and the garbage
collector goes in and rewrites the code according to different
strategies, including an option for static-analysis-based
collection like Mercury's.

D could potentially be moved in that direction, but I would
imagine that adding reference versions of structs would be
necessary first, so that pointers become less prevalent and are
only allowed in blocks marked as dangerous, where the programmer
has to perform any management himself.
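
Something like the following sketch, say. The @gcStrategy attribute
is a purely hypothetical marker invented for this post (written as an
inert UDA so the snippet still compiles); no D compiler rewrites
anything based on it:

// The attribute is an ordinary, inert UDA standing in for the
// hypothetical per-declaration "metatags" described above.
struct gcStrategy { string name; }

@gcStrategy("refcount")     // hypothetical: reclaim via reference counting
class Mesh { float[] verts; }

@gcStrategy("region")       // hypothetical: arena freed when the scope exits
void loadLevel()
{
    auto m = new Mesh;      // written as if it were ordinary GC allocation
    m.verts = new float[16];
    // a sufficiently smart compiler would pick the cleanup strategy here
}

void main() { loadLevel(); }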
via Digitalmars-d
2014-10-13 11:07:38 UTC
Permalink
Post by Peter Alexander via Digitalmars-d
On Friday, 10 October 2014 at 21:11:20 UTC, Ola Fosheim Grostad
On Friday, 10 October 2014 at 09:00:17 UTC, Peter Alexander
Post by Peter Alexander via Digitalmars-d
You can't have simple, expressive, and low level control.
Why not?
It's just something I believe from experience.
OK, beliefs are good, but one should not limit one's vision by them.
Post by Peter Alexander via Digitalmars-d
The gist of my reasoning is that to get low level control you
need to specify things. When those things are local and
isolated, all is good, but often the things you specify bleed
across interfaces and affect either all the implementations
(making things more complex) or all the users (making things
less expressive).
I don't think there is anything that prevents a language from:

1. Allowing the user to specify the constraints and letting the
system fill in the details.

2. Letting the user guide the search down to the low-level
details for efficiency.

So, from a theoretical point of view I'd say it should be
possible to go from high level to low level with a reasonably
simple language, at the cost of an advanced compiler.

If you can specify how a program should work and let a human
being construct a program from it that works from the
specification alone, then a "competent" compiler/expert system
should be able to do the same thing.
Post by Peter Alexander via Digitalmars-d
For example, consider the current memory allocation/management
debate. I cannot think of a possible way to handle this that
(a) gives users full control over how every function
allocates/manages memory (control).
(b) makes the implementation of those functions easy (simple).
(c) makes it easy to compose functions with different
management policies (expressive).
I think the compiler should handle memory management and let the
user configure the compiler.

It is rather obvious that the compiler sometimes should
instantiate two different versions of the same function based on
usage at the call site, so if you don't let the compiler handle
this then achieving the full optimization potential becomes
difficult.

Another point: compiling code to run on allocated activation
records is not the same as compiling it for a call stack. If you
want lots of fibers you have to give up the concept of a stack
and stick to activation records. It is also a more generic
concept (Simula used it to represent both objects and memory for
function blocks; they had the same internal representation).
Post by Peter Alexander via Digitalmars-d
Maybe there's a way to do it, but if there is I imagine that
language and programming experience is going to be vastly
different from what we have now (in any language).
Probably. So if you are going to support low level programming
then it is better to focus on the low level and be a bit more
reluctant to add high level features.

From a system level language I don't really need:
- templates
- exceptions
- fibers
- garbage collection

I'd rather have basic building blocks and some kind of well
designed deductive capability. Type systems are deductive in
nature, so I think deductive compile time evaluation makes sense.
Paulo Pinto via Digitalmars-d
2014-10-13 11:39:25 UTC
Permalink
On Monday, 13 October 2014 at 11:07:39 UTC, Ola Fosheim Grøstad
Post by via Digitalmars-d
...
- templates
- exceptions
- fibers
- garbage collection
Ada, Modula-3 ? :)
via Digitalmars-d
2014-10-13 12:55:29 UTC
Permalink
Post by Paulo Pinto via Digitalmars-d
On Monday, 13 October 2014 at 11:07:39 UTC, Ola Fosheim Grøstad
Post by via Digitalmars-d
...
- templates
- exceptions
- fibers
- garbage collection
Ada, Modula-3 ? :)
Ada would actually be a nice starting point. :)
Alex Ogheri via Digitalmars-d
2014-10-13 13:12:47 UTC
Permalink
Post by Paulo Pinto via Digitalmars-d
On Monday, 13 October 2014 at 11:07:39 UTC, Ola Fosheim Grøstad
Post by via Digitalmars-d
...
- templates
- exceptions
- fibers
- garbage collection
Ada, Modula-3 ? :)
Didn't Modula-3 have generics, so... templates but MUCH BETTER?

Exceptions?

And garbage collection too?

Of the mentioned features, I see only fibers missing in
Modula-3...
Paulo Pinto via Digitalmars-d
2014-10-13 13:39:49 UTC
Permalink
Post by Alex Ogheri via Digitalmars-d
Post by Paulo Pinto via Digitalmars-d
On Monday, 13 October 2014 at 11:07:39 UTC, Ola Fosheim
Post by via Digitalmars-d
...
- templates
- exceptions
- fibers
- garbage collection
Ada, Modula-3 ? :)
Didn't Modula-3 have generics, so... templates but MUCH BETTER?
Exceptions?
And garbage collection too?
Of the mentioned features, I see only fibers missing in
Modula-3...
Well, it had real OS threads.

I was being ironic because, as much as I like D, at least those
languages were already used to implement real OSes.

The fact that they did not make the jump to mainstream is
another matter.

--
Paulo
eles via Digitalmars-d
2014-10-13 13:56:19 UTC
Permalink
Post by Paulo Pinto via Digitalmars-d
On Monday, 13 October 2014 at 11:07:39 UTC, Ola Fosheim Grøstad
Post by via Digitalmars-d
...
- templates
This is the no. 1 feature that I would like to have in a
system-level programming language such as a hypothetical "C with
templates".

Add D's scope() statement.
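
For reference, this is the statement meant; a minimal example of D's
scope(exit) cleaning up a manual allocation:

import core.stdc.stdlib : malloc, free;
import std.stdio : writeln;

void main()
{
    auto p = malloc(64);
    scope(exit) free(p);   // runs when the enclosing scope ends, even on
                           // early return or a thrown exception
    writeln("use the buffer here");
}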
Post by Paulo Pinto via Digitalmars-d
Post by via Digitalmars-d
- exceptions
Even those, sometimes you would like to have them.
Post by Paulo Pinto via Digitalmars-d
Post by via Digitalmars-d
- fibers
- garbage collection
I don't have that much against GC, but not having RAII is a real
issue for me, and I blame the GC for that.
via Digitalmars-d
2014-10-13 14:58:06 UTC
Permalink
Post by eles via Digitalmars-d
Post by Paulo Pinto via Digitalmars-d
On Monday, 13 October 2014 at 11:07:39 UTC, Ola Fosheim
Post by via Digitalmars-d
...
- templates
This is the no. 1 feature that I would like to have in a system
level programming languages such as hypothetical "C with
templates"
In practice I use few templates in low level code. I might start
out with a template, and then end up using something concrete for
various reasons (performance, needing to modify the ADT as the
code base evolves, memory layout, desire for transparent source
code).

Nice to have, but not critical to success IMO.
eles via Digitalmars-d
2014-10-13 15:04:49 UTC
Permalink
On Monday, 13 October 2014 at 14:58:08 UTC, Ola Fosheim Grøstad
Post by via Digitalmars-d
Post by Paulo Pinto via Digitalmars-d
On Monday, 13 October 2014 at 11:07:39 UTC, Ola Fosheim
Nice to have, but not critical to success IMO.
Of course it's not that critical, because C succeeded without
them, but it's still nice to have.
Paulo Pinto via Digitalmars-d
2014-10-13 16:28:13 UTC
Permalink
Post by via Digitalmars-d
Nice to have, but not critical to success IMO.
Of course it's not that critical, because C succeeded without them, but it's
still nice to have.
It had a killer application called UNIX, just like JavaScript has the
browser or Objective-C has the iPhone....

I doubt it would ever have succeeded on its own.

--
Paulo
via Digitalmars-d
2014-10-13 18:38:46 UTC
Permalink
Post by Paulo Pinto via Digitalmars-d
It had a killer application called UNIX, just like JavaScript
has the browser or Objective-C has the iPhone....
I doubt it would ever have succeeded on its own.
It probably would have, since it was better than
http://en.wikipedia.org/wiki/BCPL
Paulo Pinto via Digitalmars-d
2014-10-13 19:00:37 UTC
Permalink
On 13.10.2014 at 20:38, Ola Fosheim Grøstad wrote:
Post by via Digitalmars-d
Post by Paulo Pinto via Digitalmars-d
It had a killer application called UNIX, just like JavaScript has the
browser or Objective-C has the iPhone....
I doubt it would ever have succeeded on its own.
It probably would have, since it was better than
http://en.wikipedia.org/wiki/BCPL
But no better than Algol or PL/... variants.

There were other alternatives.
via Digitalmars-d
2014-10-13 21:56:22 UTC
Permalink
Post by Paulo Pinto via Digitalmars-d
But no better than Algol or PL/... variants.
There were other alternatives.
Algol compilers required a lot more RAM than BCPL (~120k vs ~20k)

:)
Paulo Pinto via Digitalmars-d
2014-10-13 22:27:03 UTC
Permalink
On 13.10.2014 at 23:56, Ola Fosheim Grøstad wrote:
Post by via Digitalmars-d
Post by Paulo Pinto via Digitalmars-d
But no better than Algol or PL/... variants.
There were other alternatives.
Algol compilers required a lot more RAM than BCPL (~120k vs ~20k)
:)
If you wish, I can enumerate other alternatives with comparable memory
requirements. :)

It was a matter of luck: being tied to UNIX, just like JavaScript is tied
to the browser; having a few of the key developers spread into American
universities outside AT&T; and creating workstation startups that
succeeded in the market.

Outside of the workstation market based on UNIX systems and a few
universities, barely anyone was using C in Europe.

If those startups that paved the way for the likes of Sun and SGI, among
others, had failed to capture the market, C would just be another footnote
in the history of programming languages.

--
Paulo
Dicebot via Digitalmars-d
2014-10-14 01:36:06 UTC
Permalink
On Monday, 13 October 2014 at 14:58:08 UTC, Ola Fosheim Grøstad
Post by via Digitalmars-d
Nice to have, but not critical to success IMO.
Templates are absolutely critical for any new system level
programming language for me to even consider it. I had my share
of pain emulating those in plain C and don't want to ever do it
again.
via Digitalmars-d
2014-10-14 07:25:56 UTC
Permalink
Post by Dicebot via Digitalmars-d
Templates are absolutely critical for any new system level
programming language for me to even consider it. I had my share
of pain emulating those in plain C and don't want to ever do it
again.
But maybe you don't really do low level programming then? In
which areas of low level programming are templates critical? I
have trouble finding examples where I have used them or seen them
used for anything non-trivial in performant code.

In theory it is nice to write double/float/fixed-point functions
once, but in reality you often need to change the algorithms or
the implementation when moving from double to float if you care
about performance.

Similarly for data structures. You can often reduce the number of
data structure collections needed to implement an algorithm by
creating a special one that targets the dominant access patterns
and operations of the algorithm. Or significantly improve memory
handling. Or significantly improve cache performance.

In fact, when I think about it, you lose a lot when going from
machine language to a programming language in the first place.
When coding on a CISC like 68000 (that allows "high level
assembly") you would structure the data, lookup tables and memory
address space in a way to fit the problem and the instruction set
to get performance and tight code.
Dicebot via Digitalmars-d
2014-10-14 08:00:19 UTC
Permalink
On Tuesday, 14 October 2014 at 07:25:59 UTC, Ola Fosheim Grøstad
Post by via Digitalmars-d
But maybe you don't really do low level programming then? In
which areas of low level programming are templates critical? I
have trouble finding examples where I have used it or seen it
used for anything non-trivial in performant code.
I don't do it right now but I definitely did (assuming barebone
MIPS sounds low-level enough). And I can't say anything good
about that experience from a pure programming-technology point of
view.

Templates are not about the domain being low-level or high-level.
They are a tool to reduce code redundancy and simplify maintenance
of a large code base. I am not even speaking about algorithms in
the STL or std.algorithm sense but about much more routine things -
common small snippets that either get copy-pasted or hidden behind
C macros.

Probably when you say "low level" you imagine something like
embedded microcontrollers. But there are quite a few huge-scale
systems out there too, sometimes reaching millions of lines of C
code. And those struggle with C's minimal abstraction capabilities.
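
To make the kind of snippet concrete (a made-up example, not from any
particular code base), here is a C-macro-style helper next to the D
template that replaces it:

// C version, typically done with a macro and no type checking:
//   #define SWAP(T, a, b) do { T tmp = (a); (a) = (b); (b) = tmp; } while (0)

// D version: an ordinary template function, type-checked and inlinable.
void swap(T)(ref T a, ref T b)
{
    auto tmp = a;
    a = b;
    b = tmp;
}

unittest
{
    int x = 1, y = 2;
    swap(x, y);
    assert(x == 2 && y == 1);
}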
Walter Bright via Digitalmars-d
2014-10-14 08:27:56 UTC
Permalink
Templates are not about the domain being low-level or high-level. They are a
tool to reduce code redundancy and simplify maintenance of a large code base. I
am not even speaking about algorithms in the STL or std.algorithm sense but
about much more routine things - common small snippets that either get
copy-pasted or hidden behind C macros.
I discovered something very interesting about templates when writing Warp.
Templates make it easy for unittests to test a function, by accepting dummy
input that is conveniently of another type (such as using an array of data
instead of an input range).

I'm sure I've heard of this before, type mocking and all, but it didn't sink in
until I wrote Warp.
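
A minimal sketch of that pattern (the function and names are
illustrative, not taken from Warp):

import std.range.primitives : isInputRange, ElementType;

// Accepts any input range of characters, so production code can pass a
// lazy file/stream range while a unittest just passes a string.
size_t countNonSpace(R)(R input)
    if (isInputRange!R && is(ElementType!R : dchar))
{
    size_t n;
    foreach (c; input)
        if (c != ' ')
            ++n;
    return n;
}

unittest
{
    // The "dummy input of another type": a plain string instead of
    // whatever range the real program feeds in.
    assert(countNonSpace("a b c") == 3);
}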
via Digitalmars-d
2014-10-14 09:36:33 UTC
Permalink
Post by Dicebot via Digitalmars-d
large code base. I am not even speaking about algorithms in STL
or std.algorithm sense but much more routine things - common
small snippets that either get copy-pasted or hidden behind C
macros.
C has macros to compensate for deficiencies in the language.

What kind of routine things are you thinking about that cannot be
covered either by better features or by explicit inlining?
Post by Dicebot via Digitalmars-d
Probably when you say "low level" you imagine something like
embedded microcontrollers. But there quite many huge scale
I am thinking about the stuff where it makes sense to drop down
to C/C++ due to the nature of the problem.

For most applications it makes more sense to write the high level
stuff in a high level language such as Objective-C/Swift and drop
down to C/C++ for engine level stuff.

People often write everything in C/C++ for portability, but that
is really a compiler/platform issue, not a language-design issue.
Post by Dicebot via Digitalmars-d
systems out there too, sometimes reaching millions lines of C
code. And those struggle from minimal C abstraction
capabilities.
Or they struggle with C not having the right feature set. Sure,
with templates you can implement more convenient ref-counting and
unique-pointers, and you can get a little bit more type safety.
But C suffers from the simple design of BCPL, which was a
bare-bones version of CPL.
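
For instance, a bare-bones unique-pointer template of the kind
mentioned above might look like this (a sketch for illustration, not
std.typecons.Unique):

import core.stdc.stdlib : malloc, free;

// Ownership is released exactly once, in the destructor.
struct Unique(T)
{
    private T* ptr;

    this(T value)
    {
        ptr = cast(T*) malloc(T.sizeof);
        *ptr = value;
    }

    @disable this(this);        // no copies, so ownership stays unique

    ~this()
    {
        if (ptr !is null)
            free(ptr);
    }

    ref T get() { return *ptr; }
}

unittest
{
    auto u = Unique!int(42);
    assert(u.get == 42);
}   // freed here, exactly once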
ketmar via Digitalmars-d
2014-10-14 09:55:58 UTC
Permalink
On Tue, 14 Oct 2014 09:36:33 +0000
Post by via Digitalmars-d
C has macros
KILL! KILL! KILL! HULK SMASH!
Paulo Pinto via Digitalmars-d
2014-10-14 11:04:01 UTC
Permalink
On Tuesday, 14 October 2014 at 09:36:34 UTC, Ola Fosheim Grøstad
Post by via Digitalmars-d
Post by Dicebot via Digitalmars-d
large code base. I am not even speaking about algorithms in
STL or std.algorithm sense but much more routine things -
common small snippets that either get copy-pasted or hidden
behind C macros.
C has macros to compensate for deficiencies in the language.
What kind of routine things are you thinking about that cannot
be covered either by better features or by explicit inlining?
Post by Dicebot via Digitalmars-d
Probably when you say "low level" you imagine something like
embedded microcontrollers. But there quite many huge scale
I am thinking about the stuff where it makes sense to drop down
to C/C++ due to the nature of the problem.
For most applications it makes more sense to write the high
level stuff in a high level language such as Objective-C/Swift
and drop down to C/C++ for engine level stuff.
Why drop down to C/C++?

It would be like saying you need to drop down to them from D.
Post by via Digitalmars-d
People often write everything in C/C++ for portability, but
that is really a compiler/platform issue, not a language-design
issue.
This is what made me move away from Turbo Pascal back in the day.

If UNIX variants had had Turbo Pascal 7 or Modula-2 compatible
compilers, I would have stayed in that world a lot longer.
Post by via Digitalmars-d
Post by Dicebot via Digitalmars-d
systems out there too, sometimes reaching millions lines of C
code. And those struggle from minimal C abstraction
capabilities.
Or they struggle with C not having the right feature set. Sure,
with templates you can implement more convenient ref-counting
and unique-pointers, and you can get a little bit more type
safety. But C suffers from the simple design of BCPL which was
a bare bones version of CPL.
C suffers from its designers not wanting to acknowledge what
other systems programmers were doing, not from BCPL design.

--
Paulo
ketmar via Digitalmars-d
2014-10-14 11:10:09 UTC
Permalink
On Tue, 14 Oct 2014 11:04:01 +0000
Post by Paulo Pinto via Digitalmars-d
This is what made me move away from Turbo Pascal back in the day.
If UNIX variants had a Turbo Pascal 7 or Modula-2 compatible
compilers,
I would have stayed in that world for a lot longer.
*nix is very hostile to non-C languages. i dropped FreePascal due to
the lack of headers for libraries. automatic converters still can't do
a good job, and converting/fixing headers manually is *very* tedious.

this was a hard move, as i had to drop all my fpc libraries and start
writing new ones for C.
ketmar via Digitalmars-d
2014-10-14 11:15:59 UTC
Permalink
On Tue, 14 Oct 2014 14:10:09 +0300
Post by ketmar via Digitalmars-d
this was a hard move, as i had to drop all my fpc libraries and start
writing new ones for C.
p.s. transition to D is much easier, as i can use all my C libraries in
D. thank gods that i didn't switch to C++!
Sag Academy via Digitalmars-d
2014-10-14 11:24:44 UTC
Permalink
On Tuesday, 14 October 2014 at 11:16:09 UTC, ketmar via
Post by ketmar via Digitalmars-d
On Tue, 14 Oct 2014 14:10:09 +0300
Post by ketmar via Digitalmars-d
this was a hard move, as i had to drop all my fpc libraries
and start
writing new ones for C.
p.s. transition to D is much easier, as i can use all my C
libraries in
D. thanks gods that i didn't switched to C++!
here, does C mean the language or the drive?
ketmar via Digitalmars-d
2014-10-14 11:28:52 UTC
Permalink
On Tue, 14 Oct 2014 11:24:44 +0000
Post by Sag Academy via Digitalmars-d
here c means language or drive?
i never used CP/M for work.
eles via Digitalmars-d
2014-10-14 12:53:57 UTC
Permalink
On Tuesday, 14 October 2014 at 11:29:02 UTC, ketmar via
Post by ketmar via Digitalmars-d
On Tue, 14 Oct 2014 11:24:44 +0000
Sag Academy via Digitalmars-d <digitalmars-d at puremagic.com>
Post by Sag Academy via Digitalmars-d
here c means language or drive?
i never used CP/M for work.
Wow. I did use it, but only at school. :)
via Digitalmars-d
2014-10-14 11:57:42 UTC
Permalink
Post by Paulo Pinto via Digitalmars-d
Why drop down to C/C++?
It would be like saying you need to drop down to them from D.
Not sure what you meant here. Cocoa+tooling provides a fairly
high level environment. You drop down to C when you need speed or
low level interfacing. It was only an example, you could pick any
high level environment.
Post by Paulo Pinto via Digitalmars-d
C suffers from its designers not wanting to acknowledge what
other systems programmers were doing, not from BCPL design.
Well, I am not really sure that C suffers all that much. It was an
improvement on BCPL and aimed at being easy to port to new hardware
platforms. And it has been rather successful at that. D is nowhere
near that level of platform support.
Paulo Pinto via Digitalmars-d
2014-10-14 12:39:36 UTC
Permalink
On Tuesday, 14 October 2014 at 11:57:44 UTC, Ola Fosheim Grøstad
Post by via Digitalmars-d
Post by Paulo Pinto via Digitalmars-d
Why drop down to C/C++?
It would be like saying you need to drop down to them from D.
Not sure what you meant here. Cocoa+tooling provides a fairly
high level environment. You drop down to C when you need speed
or low level interfacing. It was only an example, you could
pick any high level environment.
I don't need to drop down to C from Objective-C or Swift.

Objective-C is a C superset, and Swift offers the required unsafe
constructs for ultimate performance if I really want them.

My remark was that what should be emphasized is coding in a more
performance-aware style; there is no need to switch languages.

--
Paulo
via Digitalmars-d
2014-10-14 13:02:51 UTC
Permalink
Post by Paulo Pinto via Digitalmars-d
Objective-C is a C superset and Swift offers the required
unsafe constructs for the ultimate performance if I really want
to.
My remark was that what should be emphasized is coding in a
more performance aware style, no need to switch languages.
Objective-C is a C superset in name, but not in spirit.
Objective-C/Cocoa is to a large extent a parallel universe. If
you want to go for performant C you have to drop down to
CoreFoundation et al.
Paulo Pinto via Digitalmars-d
2014-10-14 13:20:21 UTC
Permalink
On Tuesday, 14 October 2014 at 13:02:52 UTC, Ola Fosheim Grøstad
Post by via Digitalmars-d
Post by Paulo Pinto via Digitalmars-d
Objective-C is a C superset and Swift offers the required
unsafe constructs for the ultimate performance if I really
want to.
My remark was that what should be emphasized is coding in a
more performance aware style, no need to switch languages.
Objective-C is a C superset in name, but not in spirit.
Objective-C/Cocoa is to a large extent a parallell universe. If
you want to go for performant C you have to drop down to
CoreFoundation et al.
Don't blame a library; for a language layer, what matters is the
language grammar. :)

--
Paulo

ketmar via Digitalmars-d
2014-10-14 09:46:09 UTC
Permalink
On Tue, 14 Oct 2014 07:25:56 +0000
Post by via Digitalmars-d
But maybe you don't really do low level programming then? In
which areas of low level programming are templates critical?
in the same areas as other things, like conditional branches, for
example.

templates are a very cool macro system, with type checks and so on. i
haven't seen low-level programmers who don't use macros.

the only other required thing is a good optimizing compiler with a good
inliner, so small templates will really be inserted in place, like C
macros.

there is also attribute inference for template functions, it's nice. ;-)
Walter Bright via Digitalmars-d
2014-10-10 22:25:12 UTC
Permalink
Post by deadalnix via Digitalmars-d
Is this the politically correct way to say "we don't care about
simplicity anymore!"?
If simplicity was the overriding goal, we'd settle for the simplest possible
language that was Turing complete.

The problem, however, is that what makes a language simple to comprehend also
tends to make writing programs with it complicated!

For an analogy, when I was younger I had a set of hand tools to do everything
with. I couldn't afford a more complete set. Not having the right tool for each
job meant approximating it with some other tool. I'd get the jobs done, but at
the cost of much extra time invested, and crummy results.

For example, I could never get a square cut on a piece of wood. Now, I have a sweet
miter saw that quickly and accurately cuts wood every time. Of course it is a
far more complex tool than a handsaw, but much simpler to actually get work done
with.

So it's not that we don't care about simplicity anymore. We care about what makes
it simple for the programmer to get complex work done quickly and accurately. I
like to think of D as a fully equipped machine shop, where the programmer
doesn't have to make do with inadequate (but simple) tools. As professional
programmers, isn't that what we really care about?
Chris Williams via Digitalmars-d
2014-10-10 23:41:51 UTC
Permalink
Post by Walter Bright via Digitalmars-d
So it's not that we don't care about simplicity anymore. We
care about what is simple for the programmer to get complex
work done quickly and accurately. I like to think of D as a
fully equipped machine shop, where the programmer doesn't have
to make do with inadequate (but simple) tools. As professional
programmers, isn't that what we really care about?
Agreed. Overall, I'd say that there's a third way beyond "better"
or "worse", which is "non-holistic better".

I always start any new task not by designing the whole
application nor by starting to hack together parts as I need them.
Instead, I identify "tools" - parts of the application that I
know will exist, but could be used in any variety of applications
- and build nicely designed, generic libraries for those. With a
set of "better" libraries the remaining code that links them
together is fairly small, so it's easy to shuffle things around
or build out new functionality.
Nick Sabalausky via Digitalmars-d
2014-10-11 06:53:31 UTC
Permalink
Post by Walter Bright via Digitalmars-d
Post by deadalnix via Digitalmars-d
Is this the politically correct way to say "we don't care about
simplicity anymore!"?
If simplicity was the overriding goal, we'd settle for the simplest
possible language that was Turing complete.
In other words, Brainfuck: http://en.wikipedia.org/wiki/Brainfuck

It only has 8 commands and *nothing* else, SOOO SIMPLE!!!

I've been incredibly productive ever since I switched to BF! It's so
amazingly simple and orthogonal that it only takes *just* a day or so to
write a "hello world"! Brilliant!

Today "hello world", next millennium "pong", and then some glorious
day...*real useful software*! Mwahahahahah!!!!!

/sarcastic_bastard turns back to his complicated language and resumes
getting real work done...
via Digitalmars-d
2014-10-09 08:34:58 UTC
Permalink
a fascinating one-page read, he predicted that lisp would lose
out to C++ when he delivered this speech in 1990, well worth
Lisp has never been in the same class of languages as C++. Lisp
gained traction in a time period when there were few
alternatives, and it was easy to implement an interpreter for it.
It was cool among the academic geeks who hung around universities,
so it gained traction among the young who looked up to them. That
prolonged the lifespan of Lisp, but Lisp as a language has never
been great from a usability perspective.

Worse is not better, but things tend to get worse when you add
features to a core where the new parts do not fit and you
insist on backwards compatibility.

The dynamics of evolving around an "installed base"...

You see this in X11, Windows and the x86 instruction set too.

D should be able to do a lot better, with a small installed base,
but you probably need to delay that to D3.

And even then you have a problem when so many D users think that
"alias this" is a good idea
 It is a hack and a "worse is better"
design. In order to avoid such constructs you need to think about
the semantics of the language in a more "axiomatic" manner.
Walter Bright via Digitalmars-d
2014-10-10 22:27:03 UTC
Permalink
On 10/9/2014 1:34 AM, "Ola Fosheim Grøstad"
And even then you have a problem when so many D users think that "alias this" is
a good idea... It is a hack and a "worse is better" design. In order to avoid such
constructs you need to think about the semantics of the language in a more
"axiomatic" manner.
I agree that 'alias this' syntax is a bit hackish, and I've never been happy
with that, but the semantics are pretty darned good.
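
For readers who have not used the feature, a minimal example of what
alias this does (implicit conversion to, i.e. subtyping of, the
aliased member):

struct Meters
{
    double value;
    alias value this;       // Meters implicitly converts to its wrapped double
}

void main()
{
    Meters m = Meters(3.5);
    double d = m;            // uses the alias this conversion
    assert(d == 3.5);
    assert(m + 1.0 == 4.5);  // arithmetic works through the subtyping
}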
via Digitalmars-d
2014-10-13 10:41:02 UTC
Permalink
Post by Walter Bright via Digitalmars-d
I agree that 'alias this' syntax is a bit hackish, and I've
never been happy with that, but the semantics are pretty darned
good.
I think that such features often are the result of special casing
that could have been handled by more generic solutions. Then they
become superfluous later when more generic constructs are added
and you end up with many ways of doing the same thing (complexity
with no gain).

In this case some kind of inheritance, some variation of static
interface, or some kind of deductive system/rewrite system
probably would have been more powerful and covered the same use
case.
ketmar via Digitalmars-d
2014-10-13 10:51:44 UTC
Permalink
On Mon, 13 Oct 2014 10:41:02 +0000
Post by via Digitalmars-d
In this case some kind of inheritance, some variation of static
interface, or some kind of deductive system/rewrite system
probably would have been more powerful and covered the same use
case.
AST macros! AST macros can do almost anything! ;-)
via Digitalmars-d
2014-10-13 12:40:18 UTC
Permalink
On Monday, 13 October 2014 at 10:51:54 UTC, ketmar via
Post by ketmar via Digitalmars-d
AST macros! AST macros can do almost anything! ;-)
Including making it impossible to add new features to the
language... :)
ketmar via Digitalmars-d
2014-10-13 13:10:49 UTC
Permalink
On Mon, 13 Oct 2014 12:40:18 +0000
Post by via Digitalmars-d
On Monday, 13 October 2014 at 10:51:54 UTC, ketmar via
Post by ketmar via Digitalmars-d
AST macros! AST macros can do almost anything! ;-)
Including making it impossible to add new features to the
language... :)
with AST macros everyone can add new features to the language!
almost all ERs can be closed with "write an AST macro!" ;-)
Walter Bright via Digitalmars-d
2014-10-10 22:12:00 UTC
Permalink
This is a somewhat famous phrase from a late '80s essay that's mentioned
sometimes, but I hadn't read it till this week. It's a fascinating one-page
read, he predicted that lisp would lose out to C++ when he delivered this speech
https://www.dreamsongs.com/RiseOfWorseIsBetter.html
Since "worse" and "better" are subjective terms, I interpret it as "simpler
spreads faster and wider than complex." He thinks simpler is worse and complex
is often better, hence the title. Perhaps it's not as true anymore because that
was the wild west of computing back then, whereas billions of people use the
software built using these languages these days, so maybe we cannot afford to be
so fast and loose.
I can't help but think that this is nothing more than different people having
different ideas of what "better" is.

It reminds me of the Beta vs VHS debate. Rarely mentioned is that a movie could
fit on one VHS tape rather than two Beta tapes. (VHS was cheaper, too.) That made VHS
"better" for an awful lot of people.

"What Sony did not take into account was what consumers wanted. While Betamax
was believed to be the superior format in the minds of the public and press (due
to excellent marketing by Sony), consumers wanted an affordable VCR (a VHS often
cost hundreds of dollars less than a Betamax);[9] Sony believed that having
better quality recordings was the key to success, and that consumers would be
willing to pay a higher retail price for this, whereas it soon became clear that
consumer desire was focused more intently on recording time, lower retail price,
compatibility with other machines for sharing (as VHS was becoming the format in
the majority of homes), brand loyalty to companies who licensed VHS (RCA,
Magnavox, Zenith, Quasar, Mitsubishi, Panasonic, even JVC itself, et al.), and
compatibility for easy transfer of information.[10] In addition, Sony, being the
first producer to offer their technology, also thought it would establish
Betamax as the leading format. This kind of lock-in and path dependence failed
for Sony, but succeeded for JVC. For thirty years JVC dominated the home market
with their VHS, Super VHS and VHS-Compact formats, and collected billions in
royalty payments."

-- http://en.wikipedia.org/wiki/Videotape_format_war#End_of_Beta

It's a cautionary tale for us.