Hacker News
Functional programming, APL and UNIX pipes (porg.es)
32 points by philh on May 23, 2010 | hide | past | favorite | 22 comments


I noticed the parallel between pipes and functions only a few weeks ago, and it struck me that rewriting Lisp expressions left to right instead of 'inside out' suddenly makes it much easier for me to understand what is going on.

So if I look at f(b(c(d(e)))), it is the same as 'cat e | d | c | b | f'

Where 'e' is the initial input data and the other letters are the functions transforming e in successive steps to the desired output.

The analogy probably breaks down pretty quickly, but for simple examples it seems to hold.
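For what it's worth, the parallel can be made concrete in Haskell: `&` from Data.Function is reverse application, so data flows left to right exactly like a shell pipeline. (The functions below are just placeholders standing in for f, b, c and d.)

```haskell
import Data.Function ((&))

-- Inside-out style: f (b (c (d e)))
nested :: Int
nested = negate (sum (map (* 2) (filter even [1 .. 10])))

-- Pipeline style, read left to right like 'cat e | d | c | b | f':
piped :: Int
piped = [1 .. 10] & filter even & map (* 2) & sum & negate

main :: IO ()
main = print (nested, piped)  -- both are -60
```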

Is there a programming language that works in this left-to-right pipelined fashion?


In Clojure, there are the thrush operators: -> and ->>

  user=> (->> 10 (+ 2) (* 5))
  60
  user=> (* 5 (+ 2 10))
  60
http://debasishg.blogspot.com/2010/04/thrush-in-clojure.html


That's exactly it!

Neat, I never knew that was in there. I really should do more reading before diving into using something. Bad habit.

It would be nice to see two fair-sized chunks of code side by side, one in each style, and then ask a panel of programmers which of the two is more readable.

I'm slowly beginning to 'grok' functional code; I can read it a bit more easily now than two months ago (exposure), but I still feel that, like olives, it is an acquired taste.

I hope I'll get over that; it definitely doesn't feel natural yet, and it takes me much longer to understand a piece of functional code than it does when I look at 'imperative' code.


I just started picking up Clojure (coming from Python) a couple weeks ago. It's hard to understand at first, but I'm really starting to like it more than Python. Once you get over the mind-backflip that comes with the leap from imperative to functional, it really feels a lot more coherent.


The biggest advantage I've noticed so far is that once it works it just keeps on working; it is almost as if it is harder to create bugs in functional code than in imperative code.

This must have something to do with the lack of side effects.

For some fun I coded up a small site in PHP in a functional style (I can see a lot of HN'ers gouging out their eyeballs at this sentence; apologies, it was just an experiment), and it gave me a lot more practice in reading functional code.

But the side-effect-free trick had - pun intended - a side effect of its own: once the minor syntactical problems were dealt with (mostly quoting issues), the code worked the first time out, and that's unusual for a project of 1500 lines or so.

That really was unexpected; if there is any concrete explanation for it (other than blind luck) I'd like to hear it, and whether this is more common in the 'functional world'.


Yes, concatenative languages. Check out Factor[1] or Forth[2], for example.

[1] http://www.factorcode.org/

[2] http://thinking-forth.sourceforge.net/


Funny you should mention Forth; I never even thought of Forth that way (I did some Forth long ago), but I can see the parallel, with the stack being the 'intermediate' that gets passed from function to function.
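That parallel can be modelled directly: treat each Forth-style word as a function from stack to stack, and compose the words left to right. A toy Haskell sketch (the word set here is made up for illustration, not real Forth):

```haskell
import Control.Arrow ((>>>))

type Stack = [Int]

push :: Int -> Stack -> Stack
push n s = n : s

-- Like Forth's + : pop two values, push their sum.
add :: Stack -> Stack
add (x : y : s) = (x + y) : s
add s           = s

-- Like Forth's DUP : duplicate the top of the stack.
dup :: Stack -> Stack
dup (x : s) = x : x : s
dup s       = s

-- "3 4 + DUP" read left to right; (>>>) is left-to-right composition,
-- and the stack is the 'intermediate' passed from word to word.
program :: Stack -> Stack
program = push 3 >>> push 4 >>> add >>> dup

main :: IO ()
main = print (program [])  -- [7,7]
```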

Never heard of 'Factor' before; I'll have a look at it.

Thanks!


Factor is interesting because it tries to be Lisp in concatenative form. Because of this, it has a lot of interesting and powerful abstractions that make you rely on the order of the stack a lot less (as they like to say, idiomatic Factor doesn't use the stack, and Factor has more in common with Lisp than with Forth). It's well worth a try, though it has a pretty steep learning curve.


What's Factor's approach to macros?


EDIT: whether or not anything I say below is actually true, making new syntax is fully supported in Factor, both in the language and in the community.

http://factor-language.blogspot.com/2009/09/survey-of-domain...

Pretty much the same as Lisp's. MACRO: defines a Lisp-style macro, http://docs.factorcode.org/content/word-MACRO__colon__%2Cmac... (MACRO:: defines a macro with local variables)

SYNTAX: defines a 'parsing word', http://docs.factorcode.org/content/article-parsing-words.htm... , which I think is basically the same as a Lisp reader macro.

Though you don't seem to need macros very often, because you tend to pass things around as lambdas; cond, for instance, is implemented as a normal function.
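The "pass lambdas instead of macros" trick can be sketched in Haskell (all names here are illustrative, not Factor's actual implementation): cond becomes an ordinary function once each branch body is wrapped in a lambda, so evaluation is delayed without any macro machinery.

```haskell
-- 'cond' as a plain function: each branch body is a thunk,
-- so only the branch that is selected ever gets evaluated.
cond :: [(Bool, () -> a)] -> (() -> a) -> a
cond ((True,  f) : _)    _   = f ()
cond ((False, _) : rest) def = cond rest def
cond []                  def = def ()

main :: IO ()
main = putStrLn (cond [ (1 > 2, \_ -> "first")
                      , (2 > 1, \_ -> "second") ]
                      (\_ -> "default"))  -- prints "second"
```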


One commonly used parsing word is INFIX: which, as you would expect from the name, allows you to write equations in infix notation rather than the default postfix. Macros work exactly as someone familiar with Lisp would expect them to.


I've played with using Factor in this fashion.

I went so far as defining | as a noop (I think it was SYNTAX: | ; )

That way I could do stuff like [1, 3, 4] | first | dup


Hah, that's a nice and easy way to mirror shell piping, though I personally prefer it without the extra (unneeded) punctuation.


I recently asked here about an intermediate language to use for compiling a domain specific language I use.

http://news.ycombinator.com/item?id=1361382

The language I'm compiling does this, but with non-linear "pipe" systems. It's a bizarre mongrel of functional, dataflow, OO and declarative (as per Prolog), and revisits Alan Kay's idea that OO is about the messages, not the objects. On steroids.

It uses the left-to-right version, so function application is (args).(func), although that's not the syntax. In particular, using Lisp-like notation, addition is written similarly to (3 4 +), where 3 is sent to the tail of the list. The tail of the list is 4 sent to +, which creates a function that adds 4 to its argument. Thus (4 +) is a function in its own right.
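The "(4 +) is a function in its own right" idea is exactly currying; in Haskell the same thing is an operator section (the names below are illustrative):

```haskell
-- (+ 4) is a section: a one-argument function that adds 4,
-- analogous to "(4 +)" in the language described above.
addFour :: Int -> Int
addFour = (+ 4)

-- Sending 3 to it, as in "(3 4 +)":
result :: Int
result = addFour 3

main :: IO ()
main = print result  -- 7
```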

There are interesting parallels with continuations and the like, but we've found some problems with the underlying theoretical basis and are re-working some of the ideas.

Of course it's probably isomorphic to an ad hoc, informally-specified, bug-ridden, slow implementation of half of Common Lisp, but there you go.


Haskell does. Roughly the author's example using monads:

   [[10..20], [50..100]] >>= id >>= classify 

The only thing you can't directly do with monads is the fold, since the fold breaks you out of the monad. Also, you could use arrows:

   myCounter = ( concat >>> (map classify) >>> (foldr getCounts (0,0,0)) ) 

Then myCounter [ [10..20], [50..60]] is the function you want.


Actually, just realized you can do everything with monads.

    runIdentity (return [[10..20], [50..100]] >>= return . concat >>= return . map classify >>= return . foldr getCounts (0,0,0))

Just use the identity monad rather than the list monad, wrapping each plain step with 'return .' so it fits >>=.
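With placeholder definitions for classify and getCounts (assumptions; the original article's versions may differ), a runnable sketch of the Identity-monad pipeline looks like this, lifting each plain function with 'return .' so it fits >>=:

```haskell
import Data.Functor.Identity (runIdentity)

-- Placeholder stand-ins for the thread's classify/getCounts:
data Bucket = Low | Mid | High

classify :: Int -> Bucket
classify n
  | n < 30    = Low
  | n < 70    = Mid
  | otherwise = High

getCounts :: Bucket -> (Int, Int, Int) -> (Int, Int, Int)
getCounts Low  (l, m, h) = (l + 1, m, h)
getCounts Mid  (l, m, h) = (l, m + 1, h)
getCounts High (l, m, h) = (l, m, h + 1)

counts :: (Int, Int, Int)
counts = runIdentity $
  return [[10 .. 20], [50 .. 100]]
    >>= return . concat                     -- flatten
    >>= return . map classify               -- bucket each element
    >>= return . foldr getCounts (0, 0, 0)  -- tally the buckets

main :: IO ()
main = print counts  -- (11,20,31)
```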


This idea is one of those perennials that makes for interesting discussion yet hasn't been done much with. It's come up on HN a few times (I remember touching on it here: http://news.ycombinator.com/item?id=236704). The closest thing to it that I've seen get widely adopted is the "fluent interface" or "chaining" style that has caught on in OO over the last few years; particularly, from what I've seen, in Java and JS.

The APL/J/K languages are built around a construct very similar to this, so the OP's reference to APL is apt. You really should check out J or K if you haven't yet. J comes with a marvelous set of tutorials. K comes with nothing. :)

(But K is a work of genius, in my opinion. A pinnacle.)


It's pleasant enough for doing computations, but if your C/C++ is not up to snuff, you will find yourself very frustrated trying to integrate it into an existing ecosystem. Additionally, there are some REALLY annoying things you face due to the single threaded nature of the language if you're doing real time systems.

But if you go and learn K or Q and you've got good C chops, you can absolutely get a job in Manhattan or Chicago (must be willing to work long hours with little support, build almost all of your own tools, and handle the constant pressure of making everything work perfectly the first time).


The monadic chaining style of LINQ from the .NET world is also relevant; it's even more pipe-like, in that it's creating a pipeline, rather than modifying the data en masse, one step at a time.


In actuality, the pipe-style programming paradigm is essentially a dataflow language, similar to LabView (https://www.ni.com/labview/) or XEE (http://www.futurepointsystems.com/?page=products).

Output from one becomes the input to another component. Most takes on it that I've seen (like the two above) slap a graphical interface on top for flexibility.


Method chaining in a fairly pure OO language (Ruby or Smalltalk or similar) can work like this, where all operations are methods on objects rather than functions in their own namespaces/utility classes.

In this case the "." is functioning sort of like the pipe.

something like [1, 2, 3].first().add(2)


The pipe operator |> in F# is used in this way. When combined with currying it is especially effective:

  let double_then_sum a_list =
      a_list
      |> List.map  ((*) 2)
      |> List.fold (+) 0
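Haskell's & from Data.Function plays roughly the role of F#'s |>; a sketch of the same function:

```haskell
import Data.Function ((&))

-- A rough Haskell analogue of the F# snippet: (&) stands in for |>,
-- and operator sections give the same currying payoff.
doubleThenSum :: [Int] -> Int
doubleThenSum aList =
  aList
    & map (* 2)
    & foldl (+) 0

main :: IO ()
main = print (doubleThenSum [1, 2, 3])  -- 12
```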



