was no superfluous correction but meant to hint at
something larger and to split off a thread.
But it was not completely off topic:
We hone every *bit* when expressing ourselves to
silicon-based processors, but when trying to make ourselves
understood by carbon-based processors we tend to leave an
undue amount of work to them.
This does *not* mean that anyone choosing a sub-optimal
phrasing or even making a mere typo is sloppy or whatever.
(This is especially true for people speaking English as a
foreign language, as I do.)
And zxq9 (I hesitate to use first names without prior
authorisation, especially in East Asia) is very probably
much more fluent and expressive in English than I am, and
he certainly is so in Erlang (a statement based on my
having browsed the erlang-questions archive for years (two?)).
The line was simply a perfect example of how distracting
even a small "mistake" can be. *Can* be: it was to me, and
might be to others, in a context where every bit counts, too.
My intention was to hint at a waste of brain cycles (and a
strange dichotomy between programming and spoken languages):
* On the first layer people want to implement their vision
in Erlang, which requires one or more layers of
understanding (depending on complexity).
* Often this requires them to add at least one more layer:
reading and understanding the user guide.
* Being thrown into "language analysis mode" by buggy
writing takes them one layer further away, and this can
be quite a thick layer.
The first three take them far from their actual work, but
they are more or less necessary (depending on experience).
I would simply like people to see how really *help*ful it
is to reduce this "unnecessary" last layer.
(told you the subject was not merely rhetorical :-)
We write bugs into our software.
We write "bugs" into our writings.
We appreciate *help* with debugging our software
(the silicon simply being relentless).
We get irritated by *help* with debugging our writings?!
I do not, and I am surprised again and again by how many
people treat programming and spoken languages so
differently; do they respect processes running
on Si-based hardware more than those running on C-based
hardware?
Of course, many cases of "text bugs" waste few cycles;
many are "debugged" so quickly that they go unnoticed,
but they get debugged.
Unfortunately even a small waste is multiplied by the
number of times it is incurred, and texts get copied and
distributed and served and read and read and read again.
The idea is to free capacity for complexity on higher
levels by reducing complexity on lower levels.