
> the worst of all ways of doing concurrency

That would be threads.



You can't do concurrency with threads. I think you're confusing concurrency with parallelism.


@shit_hn_says

Why not? A thread allows for two parts of the program to be running concurrently. Whether or not they run in parallel is orthogonal. Why don't you think they allow for concurrency?

A few citations to back myself up:

Concurrent Haskell is all about adding threads to Haskell: http://www.haskell.org/ghc/docs/latest/html/libraries/base/C....

The Java tutorials on concurrency all use threads: http://docs.oracle.com/javase/tutorial/essential/concurrency....

The Scheme manual uses threads for concurrency: http://sisc-scheme.org/manual/html/ch06.html.

The Oz language also uses threads for concurrency: http://www.mozart-oz.org/home/doc/tutorial/node8.html.

So who's wrong here? The world-leading academics behind Haskell, Scheme and Oz? The industrialists and pragmatists behind Java? Or little old you?

The reality is that threads were created only to allow for concurrency. They were then re-used to allow for parallelism when we had multi-processors and then multi-cores.
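To make the point concrete, here is a minimal Python sketch (names are illustrative): two OS threads both make progress even when the machine has only one core, because the scheduler time-slices them. That is concurrency; whether the steps also overlap in real time (parallelism) is up to the hardware.

```python
import threading

log = []
lock = threading.Lock()

def worker(name):
    # Each thread records its steps. On a single core the OS interleaves
    # the two threads: concurrent execution without any parallelism.
    for i in range(3):
        with lock:
            log.append((name, i))

t1 = threading.Thread(target=worker, args=("a",))
t2 = threading.Thread(target=worker, args=("b",))
t1.start(); t2.start()
t1.join(); t2.join()

# Both threads completed every step; only the interleaving order is
# left to the scheduler.
assert sorted(log) == [("a", 0), ("a", 1), ("a", 2),
                       ("b", 0), ("b", 1), ("b", 2)]
```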

Edit: does this guy work at Google? Unbelievable.


And I think you are wrong.

From wikipedia:

>In computer science, concurrency is a property of systems in which several computations are executing simultaneously, and potentially interacting with each other. The computations may be executing on multiple cores in the same chip, preemptively time-shared threads on the same processor, or executed on physically separated processors. A number of mathematical models have been developed for general concurrent computation including Petri nets, process calculi, the Parallel Random Access Machine model, the Actor model and the Reo Coordination Language.

You can certainly build the actor model on the backs of threads.
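As a sketch of that claim (assumed structure, not any particular library's API): an actor can be just a thread draining a private mailbox, with its state touched only by that one thread, so no locks are needed.

```python
import threading
import queue

class Actor:
    """A minimal actor built on a thread plus a mailbox queue.
    Only the actor's own thread touches its state."""
    def __init__(self):
        self.mailbox = queue.Queue()
        self.total = 0
        self._thread = threading.Thread(target=self._run)
        self._thread.start()

    def _run(self):
        while True:
            msg = self.mailbox.get()
            if msg is None:        # poison pill: shut down
                break
            self.total += msg      # messages are processed one at a time

    def send(self, msg):
        self.mailbox.put(msg)

    def stop(self):
        self.mailbox.put(None)
        self._thread.join()

acc = Actor()
for n in range(1, 11):
    acc.send(n)          # safe to call from any thread
acc.stop()
assert acc.total == 55   # sequential processing per actor
```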


What are you on about? Concurrent programming is the set of problems that arises when multiple 'computational processes' access shared state. This is a 'thread' (overloaded term) about JS, but Java has a package called java.util.concurrent, and there's also a seminal book called "Java Concurrency in Practice"; both are about nothing but threads.

Parallelism is just a new buzzword for people who have only just discovered that threads/processes exist and that they can have multiple things running at the same time. They just removed the shared state because they don't want to deal with the headaches of concurrent programming.


You are wrong by definition: parallelism is a specialized form of concurrency...


Parallelism isn't concurrency and concurrency isn't parallelism.

Parallelism is simultaneity. Parallelism can be deterministic.

Concurrency is, by definition, non-determinism. Concurrency does not, by any means, necessitate or mean simultaneity.
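On the claim that parallelism can be deterministic, a minimal Python sketch (a thread pool is used here purely as a stand-in; a process pool would behave the same way): `map` over a pool preserves input order, so the result is identical on every run no matter how the individual calls are scheduled.

```python
from concurrent.futures import ThreadPoolExecutor

# ThreadPoolExecutor.map preserves the order of its input, so the
# combined result is deterministic even though the individual calls
# may run simultaneously and finish in any order.
with ThreadPoolExecutor(max_workers=4) as pool:
    squares = list(pool.map(lambda n: n * n, range(8)))

assert squares == [0, 1, 4, 9, 16, 25, 36, 49]
```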


Who says concurrency has to be non-deterministic? Back that up with a single peer-reviewed paper or published book.


http://en.wikipedia.org/wiki/Unbounded_nondeterminism

And I quote, "In computer science, unbounded nondeterminism or unbounded indeterminacy is a property of concurrency by which the amount of delay in servicing a request can become unbounded as a result of arbitration of contention for shared resources while still guaranteeing that the request will eventually be serviced. Unbounded nondeterminism became an important issue in the development of the denotational semantics of concurrency, and later became part of research into the theoretical concept of hypercomputation."

If you haven't even read the wiki page, what have you read about concurrency exactly? Are you just making it up as you go along? Is confabulation about topics subject to factual review a habit for you?


> can become unbounded

Unfortunately, "can" does not mean "always will". For example, wait-free algorithms guarantee that every operation completes in a bounded number of steps even under contention, ergo we can have concurrency without running into unbounded non-determinism. Please try to actually understand what you're talking about before being so rude to people. Copying and pasting the first thing you find on Google is no substitute for learning and careful research.


I'm actually pretty familiar with the current state of deterministic concurrency as a field of research and it has little to do with lock-free algorithms at all.

For that matter, what little deterministic concurrency exists (FP interleaving, coroutines) is only the palest imitation of true concurrency as it gets used in practice. One can SAFELY assume that useful and meaningful concurrency DOES IN FACT constitute non-determinism, which also happens to be an apt way to summarize and contrast the subject with parallelism.

The best way to demonstrate deterministic concurrency would probably be: http://en.wikipedia.org/wiki/Very_long_instruction_word

But for the purposes of anybody on here, that's mostly irrelevant.


What? You can't just dismiss coroutines as not "meaningful concurrency". Given that every major language outside of C and Fortran has a native implementation, I'd call that pretty useful, and pretty meaningful.

> Concurrency is by definition, non-determinism.

The fact is that you are wrong: coroutines are a form of deterministic concurrency. They're also very useful.
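A sketch of that point in Python (generator-based coroutines with a hypothetical round-robin scheduler): the interleaving is fixed by the code, not by an OS scheduler, so every run produces the same trace.

```python
def counter(name, n):
    # A generator-based coroutine: it yields control back to the
    # scheduler at every step, cooperatively.
    for i in range(n):
        yield (name, i)

def round_robin(*coroutines):
    # A deterministic cooperative scheduler: steps each coroutine in a
    # fixed order, dropping any that finish.
    pending = list(coroutines)
    trace = []
    while pending:
        for c in list(pending):
            try:
                trace.append(next(c))
            except StopIteration:
                pending.remove(c)
    return trace

trace = round_robin(counter("a", 2), counter("b", 2))
# Concurrency (two interleaved computations) with a fully
# deterministic schedule: the same trace on every run.
assert trace == [("a", 0), ("b", 0), ("a", 1), ("b", 1)]
```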


But that's a different thing.


Web worker



