Is concurrency hard?

16 messages

Is concurrency hard?

Joe Armstrong (AL/EAB)


> -----Original Message-----
> From: owner-erlang-questions
> [mailto:owner-erlang-questions]On Behalf Of Lyn Headley
> Sent: 2 November 2005 02:26
> To: erlang-questions
> Subject: RE: Is concurrency hard?
>
>
>
> > We are used to handling data where everybody has their own private
> > copy of a data structure.
> > If I say "think of the number 2007" - then I send a sound message
> > to my listeners - each one
> > of them who hears the message forms their own private mental image
> > of "2007" - there is no shared
> > data structure.
> >
>
> I'm not sure about this.  In fact I think we are constantly working
> on both a sequential and concurrent level with both shared data
> structures and parallelism.  Take an academic lecture.  We are all in
> the same room and more importantly share the same atmosphere which
> transmits our messages.  Only one of us can use this atmosphere at a
> time and we have to negotiate access to it by raising our hands and
> having a scheduled speaker. Furthermore there is a shared sense that
> what is going on is a lecture.  coughs and throat clearing are
> defined as peripheral.

I don't think so - N people are receiving more or less the same sensory data.

If you asked them afterwards what had happened (in detail) they would all disagree -
that's because they received slightly different sensory inputs.

Now you can do an experiment to test this.

Arrange for two people at the left and right sides of your lecture theatre to suddenly
explode and vanish in a puff of orange smoke with a very loud bang.

Make sure that they have synchronised watches, and that they vanish
at the "same" time (ie the accuracy of the watches is finer than the time taken for a ray
of light to pass from one side of the lecture theatre to the other).

Now ask the question: Who vanished first?

Now the people on the left-hand side of the theatre would say that the person at the left-hand side had vanished first - and the people on the right-hand side would say that the person at the right had vanished first.

Oh dear - who is right? - both - it is the question that is silly - you cannot ask questions about
the simultaneity of events occurring at different places (basic physics).
>
> But maybe that was actually your point.  The only thing that is
> shared is the channel.  I suppose that's not a data structure.  But
> it's a resource.  Is that right?

That's not how I'd put it.

I'm a physicist (or at least I was - a long time ago)

Now in physics there is no concept of sharing and no concept of simultaneity at
a distance. We can only say that two things occur at the same time if they occur at the same place.

In physics, light propagates through a medium (called the ether) - but nobody
knows what the ether is.  In Newtonian and relativistic mechanics there is no sharing of the
ether - everybody could use it at once. << In esoteric things like Brane universes and 4 + 22 dimensional universes there are some twisting parameters, so perhaps photons might get a
"time slice", move a bit, and then get swapped out while the other universes and dimensions
do their bit - who knows >>

So to me channels don't really exist - they are just labels attached to messages.
If we say "A talks to B" (in a room) then A doesn't *really* talk to B - there is no A-to-B channel.

A talks - everybody in the room could potentially listen - but the message was intended for B.
So we might call this a channel - just for convenience.
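The "channel as label" idea can be sketched in a few lines (a Python sketch for brevity; all the names here are illustrative, not any real messaging API):

```python
# A sketch of "channels are just labels": everyone in the room hears
# every message; the "channel" is only the intended-recipient tag.
# All names are illustrative.

def broadcast(listeners, sender, intended, content):
    """Deliver the message to everyone; each listener decides locally
    whether it was meant for them."""
    reactions = {}
    for name, inbox in listeners.items():
        inbox.append((sender, intended, content))   # everyone receives
        if intended == name:                        # only B acts on it
            reactions[name] = content
    return reactions

room = {"A": [], "B": [], "C": []}
heard = broadcast(room, sender="A", intended="B", content="hello")

assert heard == {"B": "hello"}                          # only B treats it as "for me"
assert all(len(inbox) == 1 for inbox in room.values())  # but everyone heard it
```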

In the real world (the physical world) there is no sharing, there are no channels, no locks,
no simultaneous things happening in different places (well there are, but we can only indirectly
infer this *after* the event) - all there are are messages containing information.

Since this is the case in the real world, software that pretends otherwise
is bound to run into problems.

How can we ensure that two computers in different places have the same data? - we can't -
this is the well-known Byzantine Generals problem. We can only be reasonably sure - normally "reasonably sure"
means "two-phase commit has succeeded" - but we're really deluding ourselves. All we can say
with certainty is "the last message I received from this machine had this content".

At first it seems difficult to program things where each process has its own view of the
world and this view only changes when the process receives a message - but after a while
this becomes second nature. This is how we work - ie how we (people) interact - we receive messages
through our senses - we send messages - that's it (unless you want to introduce a deity).
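That programming model - private state, changed only on message receipt - can be sketched minimally (illustrative Python, not Erlang's actual semantics):

```python
from queue import Queue

# Minimal actor sketch: each process owns private state that changes
# only when it receives a message. Names are illustrative.

class Process:
    def __init__(self):
        self.state = {}          # private: nobody else can touch this
        self.mailbox = Queue()

    def send(self, msg):
        self.mailbox.put(msg)    # the only way to influence this process

    def run_once(self):
        key, value = self.mailbox.get()
        self.state[key] = value  # view of the world updates on receive

a, b = Process(), Process()
a.send(("year", 2007))
b.send(("year", 2007))
a.run_once(); b.run_once()

# Each formed its own private "mental image" - there is no shared structure.
assert a.state == {"year": 2007} and b.state == {"year": 2007}
assert a.state is not b.state
```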


/Joe






>
> Lyn Headley
> UCSD Communication and Science Studies
>
>
>
>
>



Is concurrency hard?

Mats Cronqvist (ÄL2/EAB)

Joe Armstrong (AL/EAB) wrote:
> I'm a physicist (or at least I was - a long time ago)
>
> Now in physics there is no concept of sharing and no concept of simultaneity at
> a distance. We can only say that two things occur at the same time if they occur at the same place.
>
> In physics, light propagates through a media (called the ether) - but nobody
> knows what the ether is.  In Newtonian and relativistic mechanics there is no sharing of the
> ether - everybody could use it at once.
[...]

   joe must be older than he seems since the aether theory was discounted nearly
100 years ago... from wikipedia->Luminiferous_aether:

In the late 19th century the luminiferous aether ("light-bearing aether"), or
ether, was a substance
postulated to be the medium for the propagation of light. Later theories,
including Einstein's
Theory of Relativity, suggested that an aether did not have to exist, and today
the concept is considered "quaint".

   joe's point (that reality is to all intents and purposes concurrent) is of
course perfectly true.
   i believe the only reason concurrency is perceived as hard is cultural;
programmers are trained to think sequentially. this in turn is because C++ is
glorified C, C is glorified assembly, and assembly is sequential because CPUs are.
   so no, concurrency isn't hard. what's hard is to unlearn the habit of turning
everything into a sequential problem.

   mats



Is concurrency hard?

Marc van Woerkom-2
In reply to this post by Joe Armstrong (AL/EAB)
>On dear - who is right? - both - it is the question that
>is silly - you cannot ask questions about
>the simultaneity of event occurring at different places
>(basic physics).

Each one has his own correct point of view about the
events happening, and it is possible to find out what is
going on in another observer's local coordinate frame (one
knows the proper transformations).


>Now in physics there is no concept of sharing and no
>concept of simultaneity at
>a distance. We can only say that two things occur at the
>same time if they occur at the same place.

The theory of relativity has abandoned absolute time, that
is true.

On the other hand, we have that strange world of quantum
mechanics.
There you have the odd phenomenon of coupled states.
E.g. certain radioactive isotopes have an event where two
photons are emitted in opposite directions. The
polarisation of the photons is undetermined at first.
But if you measure one photon, and thus force nature to
make a choice, instantaneously the other photon takes the
opposite polarisation.

The interesting bit is that people start to employ such
odd quantum effects to create unique and potentially more
powerful computing devices.
The world is quantum, not classical, so let's use that!
Read a text on quantum computing to see what interesting
combinations of theoretical computer science and quantum
physics have been developed so far.
E.g. the above mentioned coupled photons are used to
realize secure communication channels.


>In physics, light propagates through a media (called the
>ether) - but nobody
>knows what the ether is.

For 100 years, since the annus mirabilis of Albert Einstein,
the ether has been abandoned.

Regards,
Marc





Is concurrency hard?

Robert Raschke-6
In reply to this post by Mats Cronqvist (ÄL2/EAB)
Mats wrote:
>    i believe the only reason concurrency is percieved as hard is cultural;
> programmers are trained to think sequentially. this in turn is because C++ is
> glorified C, C is glorified assembly, and assembly is sequential because CPU's are.
>    so no, concurrency isn't hard. what's hard is to unlearn the habit of turning
> everything into a sequential problem.

Umm, try telling "CPU's are sequential" to the people who design and
make them.  I'm sure they'll disagree quite violently.

In my experience, it's one of the big culture clashes: the people who
deal with hardware (where things are more like Joe describes) and the
people who do software (who would just love to abstract everything
into their sequential view of the world) have enormous difficulties
talking to each other.

I believe that concurrent programming is hard to most people, because
of the poor abstractions used by most programmers, i.e., state, lots
of it.

Robby




Is concurrency hard?

Mats Cronqvist (ÄL2/EAB)


Robert Raschke wrote:

> Mats wrote:
>
>>   i believe the only reason concurrency is percieved as hard is cultural;
>>programmers are trained to think sequentially. this in turn is because C++ is
>>glorified C, C is glorified assembly, and assembly is sequential because CPU's are.
>>   so no, concurrency isn't hard. what's hard is to unlearn the habit of turning
>>everything into a sequential problem.
>
>
> Umm, try telling "CPU's are sequential" to the people who design and
> make them.  I'm sure they'll disagree quite violently.

   maybe i should have said "CPUs were sequential" (at the time C/FORTRAN was
designed). at least i don't remember having to deal with concurrency when i
programmed 6800's. otoh, i don't see how it matters what the CPU actually does,
as long as it seems sequential to the programmer.
   the arrival of multi-core CPUs will of course change all that.

[...]
> I believe that concurrent programming is hard to most people, because
> of the poor abstractions used by most programmers, i.e., state, lots
> of it.

   how's that?

   mats



Is concurrency hard?

David Hopwood-2
In reply to this post by Mats Cronqvist (ÄL2/EAB)
Mats Cronqvist wrote:
>   joe's point (that reality is to all intents and purposes concurrent)
> is of course perfectly true.
>   i believe the only reason concurrency is percieved as hard is
> cultural; programmers are trained to think sequentially. this in turn is
> because C++ is glorified C, C is glorified assembly, and assembly is
> sequential because CPU's are.

The CPUs for which C was designed were sequential [*]. CPUs now are highly
concurrent beasts that use all kinds of complicated tricks to attempt to
present a sequential façade to most programs.

>   so no, concurrency isn't hard. what's hard is to unlearn the habit of
> turning everything into a sequential problem.

That's one issue. There is also "artificial" difficulty introduced by the
programming model, beyond any inherent difficulty in the problem or in
understanding the problem. To be slightly contrary given the previous
answers, I think there is still significant artificial difficulty introduced
by message passing concurrency models supported in current languages,
including Erlang, although it's much less than for shared state models.


[*] although the techniques of out-of-order and speculative execution go way back,
    at least to IBM's Stretch: <http://www.cs.clemson.edu/~mark/stretch.html>

--
David Hopwood <david.nospam.hopwood>





Is concurrency hard?

David Hopwood-2
In reply to this post by Marc van Woerkom-2
Marc van Woerkom wrote:

>> On dear - who is right? - both - it is the question that is silly -
>> you cannot ask questions about the simultaneity of event occurring at
>> different places (basic physics).
>
> Each one has his own correct point of view about the events happening,
> and it is possible to find out what is going on in another observer's
> local coordinate frame (one knows the proper transformations).
>
>> Now in physics there is no concept of sharing and no concept of
>> simultaneity at a distance. We can only say that two things occur at
>> the same time if they occur at the same place.
>
> The theory of relativity has abandoned absolute time, that is true.
>
> One the other hand we have that strange world of quantum mechanics.
> There you have the odd phenomenon of coupled states.
> E.g. certain radioactive isotopes have an event, where two photons are
> emitted in opposite directions. The polarisation of the photons is
> undetermined at first. But if you measure one photon, and thus force
> nature to make a choice, instantaneously the other photon takes the
> opposite polarisation.

More precisely, the two photons will not both be measured as having the
same polarisation. It's an overspecification of what is observable (and
itself a subtle example of sequential thinking) to describe this as
though measuring one photon causes a change to the other.

But enough of the armchair physics. I'm sure we're giving any real
physicists in the audience fits.

--
David Hopwood <david.nospam.hopwood>




Is concurrency hard?

Ulf Wiger-5
On 2005-11-02 18:20:29, David Hopwood
<david.nospam.hopwood> wrote:


> But enough of the armchair physics. I'm sure we're giving any real
> physicists in the audience fits.

I'm sure of it. (:

Going back on topic, I spent some time following the attempts of
companies like Encore and Expertelligence in the early 90's to
deploy software on highly parallel architectures (in the case of
Encore, Unix machines interconnected with very fast pipes,  and
in the case of Expertelligence, Lisp compiled down to transputer
arrays.) Back in those days, it seemed too difficult to make the
mental shift from sequential programming to multiprocessing. Encore had
to rewrite software quite extensively in order to port it to
its architecture - AFAIR, they thought many of the rewrites could
have been avoided if the original programmers had followed some
common sense rules. But since deployment on distributed architectures
wasn't really conceived by most at the time, this didn't happen.

Most attempts to exploit concurrency in hardware seem to have been
severely limited by what software designers were ready/willing to
cope with. The current shift to multicore is driven by the apparent
fact that we've exploited all other known alternative routes to high
performance. Time then to force programmers to re-think. Luckily
for Erlangers, the mental shift shouldn't be that traumatic.

/Uffe
--
Ulf Wiger



Is concurrency hard?

Matthias Lang-2

Ulf Wiger writes:

 > Most attempts to exploit concurrency in hardware seem to have been
 > severely limited by what software designers were ready/willing to
 > cope with. The current shift to multicore is driven by the apparent
 > fact that we've exploited all other known alternative routes to high
 > performance. Time then to force programmers to re-think. Luckily
 > for Erlangers, the mental shift shouldn't be that traumatic.

I do not expect Erlang to far outshine other languages on multicore
machines. Erlang hasn't made a splash in applications designed to run
on multi-CPU machines and I see no huge difference* between multi-CPU
and multi-core.

The main benefit of using Erlang will continue to be that you get a
significantly simpler program for some types of problem. Most of the
time, the aspect of the problem which benefits the most from Erlang's
concurrency will not be concurrency in the underlying hardware.

Matthias

 * Ok, I'll try harder: multicore machines seem more likely to become
   widespread than multi-CPU machines.



Is concurrency hard?

Thomas Lindgren-5
In reply to this post by Ulf Wiger-5


--- Ulf Wiger <ulf> wrote:
> The current shift to multicore is driven
> by the apparent
> fact that we've exploited all other known
> alternative routes to high
> performance. Time then to force programmers to
> re-think. Luckily
> for Erlangers, the mental shift shouldn't be that
> traumatic.

Getting higher capacity will probably be
straightforward (as long as we have enough bandwidth
to that multicore chip, etc). Just throw more requests
at your system, as if it were a server farm.

Getting lower latency will be trickier. You can't rely
on clockspeeds to improve, obviously. Instead, you
must turn a sequential computation into a parallel
one. (Or construct a parallel one from the start.)

There are two conflicting objectives: you must break
down the work into independent portions that can be
processed in parallel (and joined together
efficiently). At the same time, you must keep the work
items large enough that threading overheads don't
swamp the system. A potentially delicate trade-off.
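The granularity trade-off can be sketched with a worker pool that processes chunks rather than single items, so per-task overhead is amortised over many work units (a Python sketch; the names, chunk size and the stand-in "compile" step are illustrative):

```python
from concurrent.futures import ThreadPoolExecutor

# Sketch of the granularity trade-off: submit work in chunks large
# enough that scheduling overhead doesn't swamp the useful work.

def compile_function(f):
    return f * f          # stand-in for "compile one function"

def chunked(items, size):
    for i in range(0, len(items), size):
        yield items[i:i + size]

def parallel_compile(functions, workers=4, chunk_size=25):
    with ThreadPoolExecutor(max_workers=workers) as pool:
        # one task per chunk, not per item: fewer, fatter work units
        results = pool.map(lambda c: [compile_function(f) for f in c],
                           chunked(functions, chunk_size))
        return [r for chunk in results for r in chunk]

out = parallel_compile(list(range(100)))
assert out == [f * f for f in range(100)]
```

Tuning `chunk_size` is exactly the "potentially delicate trade-off": too small and overhead dominates, too large and a slow chunk becomes a bottleneck.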

A simple but illustrative example is speeding up the
beam compiler on a multicore machine. A bit of
parallelization is easy: each function can be compiled
independently. (And even easier, multiple files can be
compiled in parallel, "make -j".) But after that, we
run into problems:

1. Per-function compilation is only part of the work.
Parsing the source is essentially sequential (as far
as I know), and emitting the object code may be too.
Thus, speedup will be limited by Amdahl's law. If
parsing the source and writing the object code uses
25% of the total time, we can get at most 4x speedup,
and probably less.

2. Even getting to that point may be difficult. Big
functions will take longer time to compile than small
ones, and may then become a bottleneck even in the
parallelized part of the compiler. We may then
eventually have to parallelize the compiler algorithms
as such. Last time I checked, the outlook for doing
this was pessimistic.
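The 4x ceiling in point 1 follows directly from Amdahl's law; a quick sanity check (illustrative Python arithmetic):

```python
# Amdahl's law: if a fraction s of the work is inherently sequential,
# the speedup on n cores is 1 / (s + (1 - s) / n).

def amdahl_speedup(sequential_fraction, cores):
    return 1.0 / (sequential_fraction + (1.0 - sequential_fraction) / cores)

# 25% sequential (parsing + code emission), arbitrarily many cores:
# the ceiling is 1 / 0.25 = 4x, as stated above.
assert abs(amdahl_speedup(0.25, 10**9) - 4.0) < 1e-6

# With a realistic 8 cores the speedup is well below the ceiling:
assert round(amdahl_speedup(0.25, 8), 2) == 2.91
```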

This is only half the problem, breaking the sequential
work into parallel parts. The other problem is to
throttle excess parallelism.

The same reasoning goes for a number of other
problems. In conclusion: for some of us, there are
interesting challenges ahead.

Best,
Thomas



               



Is concurrency hard?

Robert Raschke-6
In reply to this post by Mats Cronqvist (ÄL2/EAB)
mats wrote:
> Robert Raschke wrote:
>> I believe that concurrent programming is hard to most people, because
>> of the poor abstractions used by most programmers, i.e., state, lots
>> of it.
>
>    how's that?

The kind of code I see in day to day life:

A = something
B = something else
routine_reading_and_writing_A_and_maybe_reading_B_and_writing_to_C_and_D()
E = D /* because the next routine works only on E, not on D */
routine_reading_E_writing_to_F()
routine_reading_A_and_F()

... and so on ...

This kind of Basic-style programming is extremely widespread,
regardless of how advanced the underlying technology.  Functional
and declarative languages make it just that little bit harder to
program in such a style - not much, but a little.

I get the impression a lot of people think about where and how to
store data when programming (i.e., how do I manipulate the data),
instead of concentrating on the transformational aspect (i.e., what am
I doing to the data).

I think that as long as you are concentrating on manually moving data
about in piecemeal fashion, you are unlikely to notice any
opportunities for making use of concurrency.
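The contrast between the storage-oriented style and the transformational one might be sketched like this (a toy computation in Python, purely illustrative):

```python
# Storage-oriented: intermediate names, written and read piecemeal,
# echoing the A..F pseudocode above.
def storage_style(data):
    a = [x + 1 for x in data]
    b = a                       # "because the next routine works on B"
    c = [x * 2 for x in b]
    return sum(c)

# Transformational: say what is done to the data, not where it lives.
def transform_style(data):
    return sum(2 * (x + 1) for x in data)

assert storage_style([1, 2, 3]) == transform_style([1, 2, 3]) == 18
```

In the transformational version there is no named intermediate storage left to serialise on, which is the point being made about noticing concurrency.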

I feel that the best way of noticing concurrency in your application
is by carefully crafting the vocabulary that describes your problem
and its solution.  Very often you will find that storage does not rate
very highly in such a vocabulary, allowing you to concentrate on the
functional aspect of your application.  Even more fundamentally, if
your vocabulary does end up with words for things that need storage,
you can be sure that they are the ones that will limit your
concurrency.

Robby




Is concurrency hard?

Alex Arnon-3
On 11/5/05, Robert Raschke <rrerlang> wrote:
[snip]

> I get the impression a lot of people think about where and how to
> store data when programming (i.e., how do I manipulate the data),
> instead of concentrating on the transformational aspect (i.e., what am
> I doing to the data).
>
> I think that as long as you are concentrating on manually moving data
> about in piecemeal fashion, you are unlikely to notice any
> opportunities for making use of concurrency.
>
> I feel that the best way of noticing concurrency in your application
> is by carefully crafting the vocabulary that describes your problem
> and its solution. Very often you will find that storage does not rate
> very highly in such a vocabulary, allowing you to concentrate on the
> functional ascpect of you application. Even more fundamentally, if
> your vocabulary does end up with words for things that need storage,
> you can be sure that they are the ones that will limit your
> concurrency.


[/snip]

Thank you for so eloquently putting that into words - I've been trying to
properly verbalize this for a while :)


Is concurrency hard?

Marthin Laubscher-2
In reply to this post by Robert Raschke-6
Micky Latowicki writes:

> Is concurrency inherently hard?

Concurrency itself is extremely easy - it's how the world works. What
difficulty people experience with concurrency stems from how our technology
patriarchs chose to deal with their limitations. "We" made two fundamental
choices (called breakthroughs at the time) that went directly against
reality and consequently still cripple almost every aspect of computing
today.
a) We represent things with digital approximations.
b) We forced things into sequences of instructions.
(You should easily recognise that those two things are seen as the
cornerstones of computing.)

This is not the time and place to wonder about where analogue computing
could have been today if all the money that was spent on digital computing
was spent on the analogue branch.

Micky's question does however require that we look a little deeper in to the
sequential aspect.

Most programmers are first introduced to programming in the sequential
programming paradigm (COBOL, Fortran, C, PL/I, Pascal, BASIC, ADA, Java,
etc.) To some this is easier than to others, but eventually student
programmers learn to align their thinking, reasoning and approach to solving
problems to that mindset, and they become programmers. Once set, many would
never grow beyond that mindset, not because they're stubborn or stupid, but
because the mindset dictates that everything you encounter should be
"brought back to a sequential set of instructions" and so even when faced
with different opportunities and environments, the "programmed behaviour" is
to force everything back to sequential thinking.

Programming techniques allowing us to address problems in a different
manner, even the successful ones, require that we adapt to that mindset
before we can hope to be effective in solving problems.

Some examples:
1) SQL: To harness the power of the relational database, we needed to think
in terms of sets and set theory. Once that mindset is in place, the
technique is so powerful that I was able to write two and a half
telecommunications billing systems purely inside the database. But the
gravitational pull of the masses of people and environments that don't grasp
set theory is so strong that the vast majority of relational database
activity ends up with very little usage of set theory and a lot of
sequential processing of data. Had the internet spread as much at the time
as now, we'd have all seen just how many times the "Is RDBMS/SQL/Set Theory
inherently hard?" issue was in fact raised.

2) Event-driven programming was another such technique that tripped up
countless very experienced programmers (and made life quite difficult for
the Windows and X11 worlds). But many did get it relatively quickly, in
particular the young and open-minded folk that Microsoft targeted at the
time - which accounts for a large part of its success. Judging from who got
into the swing of things at the time of its rise and who stayed clear, the only
reason "Is Event-Driven Programming inherently hard?" was asked perhaps a
little less often was the egos of the old hands hating how the "youngens"
took off.

3) Object orientation: Wow, didn't that take the world through turmoil? The
extent to which the original concept was enhanced or retarded by traditional
sequential thinking is another debate not worth having. Nevertheless, Micky
might as well have asked "Is OO inherently hard?"

In each of the example cases, including concurrency, the answer is the same:

No, it's not inherently hard. But it is hard to ignore our bad habits. It is
our bad habits - entrenched ways of thinking, of approximating the world
around us in order to deal with it inside a computer - that make
it seem hard.
 
>
> I've seen it stated elsewhere that writing concurrent software
> cannot be made easy. The author was writing in a context of a very
> different approach to concurrent programming than erlang's and I was
> curious if erlang developers feel the same way. If so, then what's
> hard about it when working with erlang?

That's true in the same way as there being no such thing as a probability of
1 or 0.

For my liking, I find far too little reference to C.A.R. Hoare's CSP in the
Erlang community. Strange as it might seem, I "learned to program in Erlang"
a decade before the language was conceived. How? I learned to express
concurrency solutions in CSP at varsity in the mid 80's. At the time, CSP
was a notation, a concept, a specification language, an approach to
concurrency. I was over the moon with excitement a decade or so later when I
learned that (although largely extended with a lisp-like syntax and other
elements to make it practical) there now was a way of actually executing CSP
- Erlang.

CSP has formed a large part of my foundation to conceptualise concurrent
processes interacting, each of them sequential in nature, interacting.
With that, concurrency is easy, without it, a mess.

Communicating Sequential Processes (CSP) is still alive and kicking today
and what's become of the 1985 textbook is now available at
http://www.usingcsp.com/ for free.

In case anybody wondered, no, beyond what the original book meant to me 20
years ago, I am in no way affiliated with CSP, Oxford, Tony Hoare or Jim Davies.

Maybe CSP's silent disfavour has something to do with what became of its
author - according to the usingCSP website, he's at Microsoft Research.

Marthin Laubscher







Is concurrency hard?

Matthias Lang-2
Marthin Laubscher writes:

 > "We" made two fundamental choices (called breakthroughs at the
 > time) that went directly against reality and consequently still
 > cripple almost every aspect of computing today.

 > a) We represent things with digital approximations.
[...]

 > This is not the time and place to wonder about where analogue computing
 > could have been today if all the money that was spent on digital computing
 > was spent on the analogue branch.

When you say something is 'analog', do you mean that the parameters of
the system all occupy a continuous space? Or do you mean something
else?

Matthias



Is concurrency hard?

Marthin Laubscher-2

Matthias Lang writes:

>
> Marthin Laubscher writes:
>
>  > "We" made two fundamental choices (called breakthroughs at the
>  > time) that went directly against reality and consequently still
>  > cripple almost every aspect of computing today.
>
>  > a) We represent things with digital approximations.
> [...]
>
>  > This is not the time and place to wonder about where analogue computing
>  > could have been today if all the money that was spent on digital
> computing
>  > was spent on the analogue branch.
>
> When you say something is 'analog', do you mean that the parameters of
> the system all occupy a continuous space? Or do you mean something
> else?
[Marthin Laubscher]

A little of both I suppose. If we were as highly evolved analogue
"computing"-wise as we are in the digital sense, what would concepts such as
"system" and "parameter" have meant? If we assume that we mean by "system"
and "parameter" something akin to what cybernetics, complexity theory and
such describe, then in all likelihood the answer is "Yes, and that's not the
end of it, and hardly the beginning". But in a fleeting-remark sort of way,
yes, I mean by analogue that every value in the system occupies a continuous
space, and also that it doesn't necessarily reside in a register or a memory
position from where it is copied around and addressed by yet another digital
number.

It's as tough and mind-bending to think about these things while having a
digital worldview, as it is to understand the language of dolphins.

>
> Matthias

[Marthin Laubscher]







Is concurrency hard?

Matthew D Swank
Marthin Laubscher wrote:

> It's as tough and mind-bending to think about these things while having a
> digital worldview, as it is to understand the language of dolphins.
>
>

It is?  I was under the impression that we'd been thinking about these
things since Newton.  Besides, if you look at the history of Math and
Logic (especially in the 20th century), continuous systems are as
artificial as any other math construct.  A model's success in mirroring
the "real" world doesn't change the fact that it is still a model, and
not the world.

Matt
--
"You do not really understand something unless you can explain it to
your grandmother." ? Albert Einstein.