Very nice work, and the code is clear and beautiful to a Haskell illiterate such as myself. But I have a problem. If your solution involves writing a library and importing it, have you really written a program with "only 13 nodes total"? If the answer is yes, then why can't I just put that program into a library, call it from a new program with only 1 or 2 nodes, and say "look at the size of this!"?
It's not obvious what counts as "the language" and what as "a library". Is printf part of C?
I suppose one could argue that if a library is general enough to make an entire class of programs shorter, as opposed to a few specific programs, then it gets to count as "the language". But that doesn't resolve the question so much as reword it.
Anyway, this is probably more of an objection to the original challenge than it is to this lovely Haskell code.
I thought the point of the arc challenge was to say to ruby/rails, smalltalk/seaside, python/django, etc., "hey, look what my language/libraries can do, can yours do this? or better than this?" If the answer is "no, I need to write a new library to do that sort of thing, or rewrite my existing one," then arc beat your language/library at the challenge.
Yes, I had the same problem with the initial challenge. I could patch virtually any programming language implementation such that it would be possible to write a 5-line web app, but that doesn't really say much about the language.
The criterion for tests like this is whether you're relying on stuff you'd expect any language already to have, or not. Features fall on a continuum in that respect. You'd expect any language to be able to add two numbers; you couldn't expect a language to have a library for parsing some format you'd just invented; other features fall somewhere between these extremes.
I honestly didn't think any of the things I did in the original Arc challenge required features that wouldn't already have existed in a language that had been used for web programming. All it does is ask the user to input something, then show it on another page. It's practically the hello world of stateful web apps. It seemed reasonable to expect any language that had been used for web apps would already have had convenient libraries for doing such basic things.
The reason I posted that is precisely the point where Arc shines: no inline HTML. But in practice, I want to avoid putting my presentation logic inline with my programming logic anyway.
And I fear that's the problem with the challenge; it seems to be highlighting the wrong kind of simplicity. I could add a few utility functions to make Ruby able to generate that HTML with a tiny amount of code, but I wouldn't ever actually use those functions in a real program.
Edit: Out of curiosity, I just implemented a pseudo-concision-obsessive "library" to see how a Ruby version might look:
The other matter of style is whether the order of the code matches the order in which the user interacts with it.
The three steps in user experience order:
first: a form to accept input,
second: a link to click,
third: a printout of the input.
Here are the orders used in code:
Ruby+sinatra: third, first, second
arc version 1: third, second, first (original)
arc version 2: second, third, first (May 2009)
Haskell+custom: first, second, third
Interesting results; I think Haskell wins here. Though technically it fails the challenge (the arc version 2 example fails too), since the supporting library was written after the challenge was posed, by the author of the example.
The Perl+Continuity solution also uses "first, second & third" style and does it nicely without sessions and adheres to challenge by being an established library: http://arclanguage.org/item?id=805
Here is a version using the HTML::AsSubs module:
    use Continuity;
    use HTML::AsSubs;

    Continuity->new->loop; # This starts the webserver

    sub main {
        my $request = shift;
        my $p = 'foo';

        # Page 1: print the form, then block until it is submitted.
        $request->print( aform( $p )->as_HTML );
        my $foo = $request->next->param( $p );

        # Page 2: print the link, then block until it is clicked.
        $request->print( w_link()->as_HTML );

        # Page 3: echo the input back.
        $request->next->print( "You said: $foo" );
    }

    sub aform {
        form(
            input( { type => 'text',  name => $_[0] } ),
            input( { type => 'submit' } )
        );
    }

    sub w_link { a( { href => '.' }, 'Click Here' ) }
But Arc itself is exactly this "patch of a programming language implementation".
Also, everything in programming could be much less concise if you wanted it to be. Consider addition. It's an operator in your language. Then it's a CPU instruction. Then it's bits going through logic gates. Since it's a common thing to use, the complexity is abstracted away in many places.
If you are writing web apps, it makes sense for your web app libraries to abstract the web stuff away like this too.
I have had the same problem with the challenge as the gp ever since I read it. I'm a bit perplexed at what exactly Paul's point was.
So I write a templating module for OCaml that lets me beat or tie the Arc challenge. I compile it and never have to look at it again. It becomes part of my toolkit.
What do I have to do in order to win the challenge? Get my module accepted into some standard library? Why would what some language committee decides be important in terms of how expressive the language is?
It doesn't make sense to me. All it shows is that Paul controls Arc and has configured it to deliver small answers to questions that he finds important. I'm not sure it says anything about Arc or any other language for that matter. It might say something about functional versus imperative languages, but perhaps we already knew that?
I agree. The distinction between having a standard library and extending it with a new one is a weak one. The only thing it shows is that arc handled one specific common function out of the box. What about all the other batteries that are not included in arc vs. language X? If you need to do any one of a hundred normal things that languages do, arc falls flat on its face. Try connecting to a database and fetching large result sets; try having robust integration points with a host of common facilities: true pipes, cross-platform support, concurrency models, etc. My point is that any number of languages could offer a dozen challenges that arc would fail, but pg could always add the missing piece to his library and declare it standard, simply because he is in control as the authoritative source.
I'm a bit perplexed at what exactly Paul's point was.
What do I have to do in order to win the challenge?
I think the source of your perplexity may be that you're misunderstanding who (or rather what) is being challenged. It's not a contest between people, but between languages. So you don't win it; a language does.
The way a language wins is if you can write the program shorter using it plus some standard libraries. The definition of "standard" doesn't have to be exacting; anything short of a library written specifically to win this contest would be ok; so e.g. any library that existed prior to the challenge obviously would be.
But isn't Arc a library written specifically to win this contest?
A well-balanced "contest" would probably include a bunch of specifics (web, 3d, computation, ...) or stick to generic "programming" ("implement a self-balancing binary tree").
I see the Arc contest as more of something like, "take a look at what happens when you think about a problem space in depth and write a domain-specific language to make solving problems in that space really easy".
The only downside of Arc is that other domains are not as easy to work in. If you design a DSL on top of an existing language, then you don't have that limitation, which is what's good about the Haskell implementation. If I need to write a parser for part of my web app, I can use another nice DSL for that. If I used Arc, I would not have that option.
No, actually. The Arc libraries were written to make general-purpose web apps. I didn't propose the challenge till after the first version of Arc was released. And the way I chose the problem was to think of the simplest stateful web app I could. If this isn't the hello world of stateful web apps, what is?
There's nothing about this problem that's biased towards Arc's strengths. Take input from a form and print it on the next page. Every popular language already had libraries for such basic things.
I suppose Arc was designed to win this contest in the sense that Arc is designed to make programs short, and winning in this contest is measured by brevity. Is that what you meant?
Not really. If I added a condition to the contest that you didn't take into account when writing Arc, Arc would no longer do as well. When the same person designs the programming language and the contest to "prove" it's the best, the contest is likely to prove that his programming language is the best. Not due to malice, but rather because both the language designer and contest designer think exactly the same way (as they are both the same person).
When the same person designs the programming language and the contest to "prove" it's the best, the contest is likely to prove that his programming language is the best.
Ok. Now let's explore (or rather, return to) the question of whether that happened in this case.
Since I was aware people would make this type of criticism when I proposed the challenge, I made a conscious effort to make the problem very generic-- in fact to be the simplest stateful web app I could think of. In your opinion, did I succeed? Is it a simple, generic problem to ask for input on one page and display it on the next? Or is this a complex problem that requires unusual, Arc-specific functionality to solve?
Well, this is a concise way to sum up what you want to do, but it could just be rewritten in any language of your choice, with the appropriate libraries then written. For example, in Java:
createForm("click here", new Response("you said %foo"), new InputElement("foo"), new SubmitElement());
I think there are more important things to compare in languages: speed, error handling, stability, memory management, etc.
Here's the critical distinction: you can use only whatever's already provided by the language+libraries you use. That's what I did. I didn't go back and add stuff to the Arc libraries to make the answer to the challenge shorter. I just used what was there already. This will be clear to anyone familiar with the source of HN; all the language features used in the Arc version of the challenge are used throughout news.arc.
But I could just write 'arc' for Java, and the above example would fall out.
The solution for arc doesn't have anything unique in it that you can't do in other languages just as concisely.
So I don't think it says anything at all about the language, it just says how well you designed the interface to helper libraries.
>> "all the language features used in the Arc version of the challenge are used throughout news.arc."
I don't see any 'language features' in the solution for Arc. It's just functions and parameters. They're not language features, they're helper libraries. They may be well designed good libraries, but that's what they are. The language used is irrelevant. You could have written arc in BASIC and the solution would be the same.
Maybe I just don't understand why arc is referred to as a 'language' rather than a framework/library, and why that distinction is important.
But I could just write 'arc' for Java, and the above example would fall out.
In terms of language design, if you were to write news.arc for Java, would you expect a solution to the challenge to just fall out? This probably says more about MzScheme+pg vs Java+you than it does about arc, however.
One language feature that's used here is lexical closures. This would be hard to implement concisely in a language without them.
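To make that concrete, here is a minimal Haskell sketch (the Params and Handler types below are hypothetical stand-ins, not any real library's API): the handler for the next page is just a lambda that closes over the submitted value, so no session table or explicit state threading is needed.

    type Params  = [(String, String)]  -- parsed query parameters
    type Handler = Params -> String    -- a page handler producing HTML

    -- The continuation for the "click here" page is a closure:
    -- it captures `foo` lexically from the first request.
    afterForm :: Params -> Handler
    afterForm params =
      let foo = maybe "" id (lookup "foo" params)
      in \_click -> "you said: " ++ foo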
If you wrote an Arc implementation in Java, then ran the program on top of that system, surely that would count as an instance of Arc winning the challenge, not Java.
I'm not even sure if you're serious at this point, but the important distinction here is that the hypothetical library function you created to make this work in Java is (like the even shorter version proposed here: http://news.ycombinator.com/item?id=1005199) not one that anyone ever would put in a library. Its only function is this one case. Whereas the Arc version is built by combining highly orthogonal components that can be recombined to solve completely different problems. Do you really not see the difference?
Its only function is this one case. Whereas the Arc version is built by combining highly orthogonal components that can be recombined to solve completely different problems. Do you really not see the difference?
This is why the appropriate test is the ability of an average programmer to put together a DSL for a randomly-selected problem domain: it actually speaks to the power of the language. Orthogonality and combinatorial flexibility are exactly what DSLs are made of. Arc doesn't have a monopoly on them. The only thing that is interesting here is your choice to include web-specific functionality in Arc. I think it's a great choice, but it just doesn't say much about the general expressiveness of the language as a whole.
We may have reached the point here where splitting hairs over DSLs versus included libraries is not going to get anybody anywhere. From what I understand, I would certainly agree that Arc programmers having such easy access to stateful web programming in a highly flexible manner is a great thing for the language.
But I would judge any language by the ability to easily add solutions to other problem domains that are highly orthogonal and flexible, not necessarily by the problem domains that are enabled by default.
Hope that makes sense. I think I finally figured out what your point was.
You've widened the scope of "language" to its libraries. And that's entirely fair. So, yes, Arc has it "built in", but Arc itself is (IMO) only marginally more mature than the libraries one might invent for (e.g.) Haskell to do the same thing (ok, I'm being a little unfair here, but not that much IMO). And one can invent those libraries for another language and can match Arc in the challenge using those invented libraries -- at least you don't seem to be denying this.
So, then, what's the point? That Arc already has the libraries available? That the Arc libraries meet the challenge? It certainly isn't that those libraries aren't possible in another language. The challenge means (almost) nothing with regards to comparing programming languages as you first implied, and is more about what tools and libraries were invented along with Arc to develop web apps.
And one can invent those libraries for another language and can match Arc in the challenge using those invented libraries -- at least you don't seem to be denying this.
Depends on the language, obviously. It seems unlikely you could in C, for example. Presumably the problems you'd encounter would gradually decrease as the language grew more powerful. That's why I phrased the problem as a challenge. I was curious to see what happened when you tried to solve this very simple problem using existing language/library options.
No, actually, I didn't. I chose the simplest stateful web app I could think of: take input on one page and print it back on the next. What about that problem is specific to Arc?
Not specific, no; of course you can do it without much trouble in any language/framework. It's easy to do in Arc, and in web programming with continuations in general. But you want an example that is representative of things you'd like to do in practice, and this example isn't. I'm not saying you did that deliberately. You have been programming websites in this continuation style for a long time, so it's understandable that you'd choose an example that's simple with continuations if you try to choose a simple example instead of a representative one. Most web programming is (1) display a list of things, (2) display a detail view for something in the list, (3) provide a form to add something to the list. The code for such an example would say much more about how good a language/framework is for real-world programming.
By the way, why did you choose to use string names for the data in the input form? Why not use variables directly and make aform work more like let (and like Mathematica's manipulate):
    (form
      (name (input-string))
      (age (input-number))
      (pr "Hello, " name ". You are " age " years old."))
Where input-X writes HTML output and returns a function that extracts the value of the field from the HTTP request. Validation works well too.
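For what it's worth, this idea is essentially an applicative formlet, and it translates to Haskell quite directly. A hedged sketch (all the names below are made up for illustration, not any particular library's API): rendering a field also yields the function that reads its value back out of the submitted parameters.

    type Params = [(String, String)]

    -- A field renders some HTML and knows how to extract itself
    -- from the submitted request.
    data Field a = Field { html :: String, extract :: Params -> Maybe a }

    instance Functor Field where
      fmap f (Field h e) = Field h (fmap f . e)

    instance Applicative Field where
      pure x = Field "" (const (Just x))
      Field h1 e1 <*> Field h2 e2 = Field (h1 ++ h2) (\p -> e1 p <*> e2 p)

    inputString :: String -> Field String
    inputString n = Field ("<input type='text' name='" ++ n ++ "'/>") (lookup n)

    inputNumber :: String -> Field Int
    inputNumber = fmap read . inputString  -- no real validation in this sketch

    greeting :: Field String
    greeting =
      (\name age -> "Hello, " ++ name ++ ". You are " ++ show age ++ " years old.")
        <$> inputString "name"
        <*> inputNumber "age"

A missing field comes back as Nothing, which is exactly where validation would hook in.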
> Get my module accepted into some standard library?
Yes, that is the gist of it. More specifically, the challenge is, "at this point in time, has anyone gotten the appropriate code into a standard library?"
As far as the challenge is concerned, we have to take it on faith that when the challenge was posed, it was no more than a coincidence that pg's language supported an example written in a style pg likes.
I think I'm getting this. Let me try another bit of questioning to make sure I got it.
So let's say I think CSV-file processing is important. If I show doing something useful and common with CSV files in 15-20 symbols with my language of choice would it be fair to challenge other languages to do the same?
If I understand you correctly, this all hinges around what libraries you think languages should have by default and how many symbols it takes to get something "common" done using these commonly-available pieces.
This is a little too subjective for me, although I'll easily grant that web programming is much more common than CSV-file processing.
Instead of relying on the arbitrariness of what components have been built or what various committees have approved, I would amend this to be something like "after an initial bit of programming not to exceed 3 days, how much symbology do I have to piece together to solve various problems in some sort of common domain like web question/response?" Or something like that. Because as a practical matter I'm always taking a bit to ramp up on new pieces of languages anyway and 3 days or so in this context is a nit.
Yes, that's how I understand the challenge. And I agree, it is subjective[1].
I think that the most positive way of looking at it is to declare that session storage in a web app and parsing csv files are understood problems which we don't want to get bogged down solving again. Last week I was writing a small web app to browse a 2GB or so data set, which was provided to me variously in xml and csv data files in a zip file.
If I could have called:
import data_20091201.zip
and gotten something useful, I would be all the happier.
I can now make that call, but in terms of getting stuff done, I'm judging my programming environment on the availability of unzip utilities, xml and csv parsing libraries, and database tools. As far as the language itself is concerned, I want to be able to structure my code neatly without worrying about forgotten temp files if the import fails.
I see the arc challenge as a proxy for making this sort of judgment, but it does come down to: "Do the people who maintain the language and contribute to its libraries worry about solving the sorts of problems that I would like to solve? And are they successful in making my life easier?"
[1] Subjective in the sense that the problem to solve is one that you may or may not care about solving, or may prefer to solve in a way that happens to use more code to make different aspects more explicit.
I'd be interested to hear if anyone has solved the problem of embedding a series of pages inside of another (in the sense of Seaside) in a pure functional way. In other words, here's my challenge: write http://www.seaside.st/about/examples/multicounter using the same monadic style.
Except that these two code examples do two very different things. The Haskell version actually uses a continuation to run multiple requests; the JavaScript version modifies local mutable state. The Haskell version does it via server interaction, which makes it much more useful for modern web apps.
Isn't this problematic? Lispers fling around lots of superlatives, as if their favorite language will never be dethroned from its position atop the language kingdom. And they have always felt justified in this.
Shouldn't we confront them about this?
I do think there's a science of programming languages which can inform us how to best design the tools we use. Isn't it just possible that these Hindley-Milner languages are the next step in the progression that LISP started?
Look at the transition from Newtonian to Einsteinian mechanics. They were both great at the time of release, but one is clearly a step forward. The former is usually embeddable within the latter, but edge cases in the former don't quite work the same in the latter. And some cases in the former are just downright illegal in the latter. And unfortunately, the latter is a lot more complex for the power users. But, the latter is more expressive, teaches us more about the world... and enables completely new possibilities like nuclear fusion/fission.
This is like the difference between dynamic and static typing. Dynamic typing is certainly embeddable within Haskell. Just define a data type Dyn which is the sum of types Int, String, Float, [Dyn], and Dyn -> Dyn. You will see how much of LISP is possible without much added syntactic cruft. But Haskell users don't need or want to operate this way, because they can leverage the benefits of static typing.
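Spelled out, that embedding really is just a few lines (a minimal sketch; the constructor names are mine):

    data Dyn
      = DInt Int
      | DStr String
      | DFloat Float
      | DList [Dyn]
      | DFun (Dyn -> Dyn)

    -- Untyped-style application; the type error surfaces at runtime,
    -- exactly as it would in a dynamically typed language.
    apply :: Dyn -> Dyn -> Dyn
    apply (DFun f) x = f x
    apply _        _ = error "runtime type error: not a function"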
Remember, these things don't come from the whims of clever people or corporations. LISP and Haskell both come from research into very fundamental ideas. This is much more like real science than social science. And we should keep our eyes open to the future of this field, instead of proclaiming that we're finally done with our "100 year language".
There isn't some unified continuum of power in languages, though. There are some problems for which a H-M-ish type system is a tremendous advantage, and there are others for which it gets in the way. Same with lazy evaluation - it can make a problem much simpler or much harder to reason about. Sometimes using a language built from the ground up to support distributed and concurrent programming (e.g. Erlang), unification and backtracking (Prolog), or constraint programming is the right tool for the job. Sometimes being able to work close to the Unix kernel or run on an embedded system without an OS is more important.
Languages are a means to an end -- they help manage complexity while solving a problem. There's value in being able to extend a language's semantics to support a superset of several major language families, but unless handled very carefully, it can turn the host language into a sprawling mess in the process.
That's why we need a language with a scalable type system that supports both static and dynamic typing. (Actually VB had that years ago with Variant data types, but the rest of the type system was horrible.)
I suspect it might be fruitful to do static analysis for constraints, with type identity as just one attribute. While inferring that X is an int is useful, inferring that it's an int which is always positive and less than 256 would allow a lot of other optimizations. Even if an inference engine can't completely prove something is always a (string * int list) pair, it would still be useful to know that it's (string * (either int or double) list), and the list cannot be empty. Etc. Type declarations or inference are a bit all-or-nothing, and I think being able to read through the properties that the compiler could infer (or at least confirm) would help find bugs, suggest optimizations, etc.
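To make the flavor concrete, here is one such constraint written out by hand in Haskell, the kind of fact an inference engine would ideally derive on its own (a sketch; the names are mine):

    -- An int known to be >= 0 and < 256, enforced at the only
    -- construction site; everywhere downstream the invariant just holds.
    newtype Byte = Byte Int

    mkByte :: Int -> Maybe Byte
    mkByte n
      | n >= 0 && n < 256 = Just (Byte n)
      | otherwise         = Nothing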
If I ever get past the first dozen projects on my list, I'd like to write a compiler for a dialect of Prolog designed with constraint analysis in mind. (I also need to read more about what's already been tried, first - this is just me being curious about how far constraint analysis could go and wondering out loud.) It would be tricky, but more feasible with Prolog-like semantics than in, say, C.
Isn't it just possible that these Hindley-Milner languages are the next step in the progression that LISP started?
It's possible they've discovered interesting ideas, but they're not on the line of development Lisp started. They grow more out of the Algol tradition.
I don't think anyone has ever proclaimed "we're finally done with our 100 year language." That would be extremely unlikely. The question is more which present languages are on the path to it.
Dynamic typing is certainly embeddable within Haskell. Just define a data type Dyn which is the sum of types Int, String, Float, [Dyn], and Dyn -> Dyn. You will see how much of LISP is possible without much added syntactic cruft. But Haskell users don't need or want to operate this way, because they can leverage the benefits of static typing.
A closed sum type like that is very simple to implement, but true dynamic typing is a bit more complicated, and there are some constructs from dynamically-typed languages which can't be expressed within the bounds of Haskell's type system.
The Data.Dynamic[1] module is a pretty good implementation of dynamic typing in Haskell, but it's still a bit awkward to use compared to a language which supports dynamic typing natively (eg Python).
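For reference, it looks roughly like this in use: toDyn erases the static type (keeping a Typeable tag), and fromDynamic recovers it as a Maybe, which is exactly the awkwardness mentioned above.

    import Data.Dynamic

    mixed :: [Dynamic]
    mixed = [toDyn (1 :: Int), toDyn "two", toDyn (3.0 :: Double)]

    -- Every read-back goes through a type-directed Maybe.
    ints :: [Dynamic] -> [Int]
    ints ds = [n | Just n <- map fromDynamic ds]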
Well, I'm arguing from the Bob Harper standpoint that untyped lambda calculus is simply a mode of use of typed lambda calculus. That is, untyped lambda calculus is typed lambda calculus with exactly one type.
Now, neither LISP nor Haskell are just lambda calculus... but I look at Haskell as a bigger language that subsumes most of LISP.
Lazy evaluation. Template Haskell. Quasiquotations (EDSLs). The GHC API. (Liskell.)
I believe between them anything Lisp macros can do they can do, though it may not be so easy. (Strangely, we seem to have little need of them, but I will leave it to the reader as to whether this is a Blub situation or regular Haskell is just that good.)
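As a small taste of Template Haskell specifically, here is the classic splice that builds the i-th selector for an n-tuple at compile time (a textbook example, not production code):

    {-# LANGUAGE TemplateHaskell #-}
    import Language.Haskell.TH

    -- Builds \(_, ..., x, ..., _) -> x
    sel :: Int -> Int -> ExpQ
    sel i n = do
      x <- newName "x"
      lamE [tupP [if j == i then varP x else wildP | j <- [1 .. n]]]
           (varE x)

    -- Usage, from another module: $(sel 2 3) ('a', 'b', 'c')  ==>  'b'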
My theory, as someone admittedly inexpert in both Lisp and Haskell but having used both on occasion, is that both are so ridiculously powerful in different ways compared to "normal" languages that very few people ever even reach the point of being comfortable Blub programmers in either, never mind hitting the wall and wondering "what's next?" [0]
Really, how many programmers are out there who simultaneously 1) spend enough time with either language to master it 2) are smart enough to not only realize the language is limiting them, but to invent a new language to surpass those limits 3) aren't heavily tied to their current language 4) have enough spare time to bootstrap a new language with all the associated scaffolding (libraries, &c.) needed for anyone to want to try it?
[0] Feel free to substitute Scheme, an ML dialect, or other related languages into that sentence.
For something going beyond (but including) Haskell, see Curry [1]. Curry marries functional and logic programming, and compiles down to Haskell in recent versions.
But that's a tradeoff you make. If you have anything more than the most rudimentary of basic syntax, macros become hard. Whereas most of the things that Haskell brings to the table can be incorporated into any other functional language.
I, as a Haskeller, envy Lispers as well. Metaprogramming is so incredibly easy in Lisp. Oftentimes I can find good solutions in Haskell for things that people would normally do using metaprogramming, but still. And almost always the types really help me, but every now and then they get in the way.
All I really want in a language is something similar to Haskell, with Lisp-like metaprogramming, easy support for Erlang-style concurrency, and syntax that doesn't immediately terrify the Blub programmers. Is that so much to ask?
Metaprogramming and syntax don't go that well together. If you're going to manipulate programs effectively, you're going to be operating on source trees, and it's easiest to do that when source code and source tree are of the same form.
Thank you for picking up on that half of my joke. Everyone else went for the type system part...
Not that it isn't easy to make a candied Lisp dialect with "friendly" syntax more appealing to non-Lisp programmers, but anyone who actually learns a Lisp quickly realizes that it just gets in the way...
If I were to personally try creating an "ideal language" to match my tongue-in-cheek plea, I'd probably start with a terse Lisp-like syntax and a HM-like type system and go from there.
Yes I think that is too much to ask. Metaprogramming, I think, is by definition incompatible with compile time type checking.
The "meta" in Metaprogramming means that the program is able to manipulate itself (recursively) at runtime. I believe it's theoretically impossible to combine that with AOT type checking and Haskell without its type system would not be Haskell at all.
[edit] Of course you could argue that the sort of thing C++ does with templates is metaprogramming as well, only at compile time. In that case I suggest that we need another term for that because it's totally unlikle what Lisp is so good at.
That's a very valid point. So I think we have these two types of metaprogramming, the runtime type and the compile time type. The latter is compatible with static typing but the former clearly is not.
If the compiler is part of the runtime (i.e. an interpreter that compiles expressions in order to evaluate them), and the type checker is part of the compiler, then even eval should be able to throw type errors. What, then, is the problem?
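Here is a toy version of that in Haskell (a sketch of the idea, not a real eval): the checking pass runs as part of evaluation, so an expression built "at runtime" still gets a type error before any of it executes.

    data Expr = I Int | B Bool | Add Expr Expr
    data Ty   = TInt | TBool deriving (Eq, Show)

    check :: Expr -> Either String Ty
    check (I _)     = Right TInt
    check (B _)     = Right TBool
    check (Add a b) = do
      ta <- check a
      tb <- check b
      if (ta, tb) == (TInt, TInt)
        then Right TInt
        else Left "type error: Add expects two Ints"

    -- "eval" refuses to run anything the checker rejects.
    evalChecked :: Expr -> Either String Int
    evalChecked e = case check e of
      Left err    -> Left err
      Right TBool -> Left "type error: top-level expression must be an Int"
      Right TInt  -> Right (go e)
      where
        go (I n)     = n
        go (Add a b) = go a + go b
        go (B _)     = error "unreachable: rejected by check"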
It depends on what you mean by "runtime." If you're creating a new datatype, is that really "at runtime," or are you just talking to the compiler interactively?
To put it another way: metaprogramming might have effects at runtime, but not the kind of effects that change depending on what the runtime does. Metaprogramming should be deterministic for your program to be considered to be "in production."
Thank you so much! I've had informal ideas quite like this from working on my "HTML rewriting system" part of my (unreleased) Common Lisp web framework. I now have a very specific search term to acquire more formalized knowledge which should speed up some future developments a lot because I won't have to independently come up with solutions to as many of the problems now.
heh. no problem. if you're interested in this kind of thing check out http://lambda-the-ultimate.com from time to time. just skimming the conversations can keep you up to date...
If I create syntax with a program (code writing code), is that metaprogramming? Doesn't C++, or any code that generates code, already do this?
Maybe it's not "native" and I have to make system-level calls to force compilation and execution of the generated code.
Almost all programs do metaprogramming (interpreted code, Scala on the JVM, etc).
What's the differentiator for Lisp's metaprogramming (self-contained language structures?).
I'm pretty interested in simplification of programmer interfaces on iteratively more functional (but perhaps more complex) underbellies. I will certainly investigate.
My definition of metaprogramming is writing a program that manipulates itself. I'm not sure I would call any arbitrary code generator metaprogramming as this would cause the "self" in my definition suffer a severe identity crisis :-)
But yes, C++ templates as well as Lisp macros are compile time metaprogramming. All languages that have eval and/or allow function/method bindings to be replaced at runtime allow runtime metaprogramming.
Since static typing guarantees certain invariants it cannot be compatible with a program that violates those invariants at runtime. You could still decide to consider it metaprogramming when the program manipulates itself only within the limits of those guarantees, but when you look at what real world runtime metaprogramming is being used for (for instance in Rails) you will realise that these things (e.g method_missing) would not be possible within the limits of a statically typed language.
[edit] Lisp makes both compile time and runtime metaprogramming exceptionally easy due to its homoiconic nature.
It's certainly possible for a typed language like ML or Haskell to support metaprogramming. As others have noted, Haskell has Template Haskell. However, systems like MetaML and MetaOCaml support metaprogramming and give much stronger typing guarantees than Template Haskell. See http://www.metaocaml.org/examples/ for inspiration.
Sure, but there are theoretical limits to what runtime metaprogramming can do whilst upholding the guarantees that a static type system provides.
For instance, in order to check that a particular function call conforms to the function signature, the function signature must at least exist. Type-checking a call to a function for which not even the signature exists anywhere within the system is impossible.
One of Haskell's defining features is a very expressive and powerful static type system, so yeah, compile-time type inference/checking was pretty heavily implied by "similar to Haskell".
And yes, the fact that I was essentially asking for a language with both static typing and runtime self-modification was the reason for the tongue-in-cheek "is that so much to ask". Might as well ask for a program that can compile any legal perl program [0].
That said, I suspect there's a lot that could be done to allow certain subsets of metaprogramming techniques in a static-typed language; some sort of crazy meta-type system that lets the compiler prove that something will only produce code with a particular polymorphic type, maybe? I don't know.
Yes. A lot of the meta-programming is actually programming over the structure. For example, given a Java class, you could generate database code. In Java, this relies heavily on introspection, but if you're interested in doing this in a type-safe way then you could look at Generic Programming (in Haskell).
Also, you can generate code (at runtime) which is type-safe, compile the code, etc. Everything you ask for is possible in Haskell, however, it is definitely more complicated than in Lisp.
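A hedged sketch of the generic-programming route, using the standard Data.Data machinery (the User type and the SQL rendering are made up for illustration):

    {-# LANGUAGE DeriveDataTypeable #-}
    import Data.Data
    import Data.List (intercalate)

    data User = User { userName :: String, userAge :: Int }
      deriving (Data, Typeable)

    -- Read the record's field names generically, analogous to
    -- reflection in Java; the Data instance is derived automatically.
    columns :: Data a => a -> [String]
    columns = constrFields . toConstr

    createTable :: Data a => String -> a -> String
    createTable tbl x =
      "CREATE TABLE " ++ tbl ++ " (" ++ intercalate ", " (columns x) ++ ")"

    -- createTable "users" (User "alice" 30)
    --   ==> "CREATE TABLE users (userName, userAge)"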
Compiling code at runtime may improve performance, but it doesn't give the kind of guarantees that static typing gives you.
Static type checking proves that certain invariants hold at runtime and hence that certain defects are not possible. If a program can manipulate itself at runtime in arbitrary ways (not just reflect on itself in a read-only fashion), those proofs become invalid.
Lisp-like metaprogramming is really tricky, because a lot of the language that you interact with is defined by macros. The Haskell that you see is a thick layer of syntactic sugar over a much smaller Haskell core.
Whoa. That doesn't match my experience at all. I'd say it's simple and practically effortless (compared to what you'd have to do elsewhere). It would be surprising if it weren't, given that the whole language is organized around code=data.
You should check out the Qi language, though the developer who has historically contributed the most to it has departed some months ago. It has the ideas you are looking for (sans message-passing concurrency, I think.)
Does Template Haskell affect this discussion at all? (That's not a rhetorical question - I haven't tried any Template Haskell, so I'm genuinely interested to know what it's capable of and how it compares to Lisp macros.)
TH does a lot of the same stuff. On a practical level, it isn't as good, though. The implementation leaves a lot to be desired. For example, I once tried to use TH to check XMonad configurations at compile time (specifically, that their keymaps were sensible), but it turns out TH cannot operate on anything defined in the same module!
Can you give an example of a situation where Haskell type checking gets in your way? Not to take issue, but because I'm a collector. I've been told that it's not that easy to make a list of functions in Haskell unless they all have the same input and output types, which seems restrictive to me.
It's not "easy" to make a list, but instead maybe you would be able to work with a tuple (which is easy).
More practically, Haskell is going to ask you implicitly why you want a heterogenous list of functions. In all likelihood they're going to have a common interface at some point and you abstract around that so that the list is still "homogenous" (which is harder, but still pretty easy).
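Concretely, the two situations look like this (a quick sketch; wrapping to a shared type in the second list is the "abstract around a common interface" step):

    -- Fine as-is: all the functions share one type.
    steps :: [Int -> Int]
    steps = [(+ 1), (* 2), subtract 3]

    pipeline :: Int -> Int
    pipeline = foldr (.) id steps

    -- Differently-typed functions get wrapped to a common interface.
    mixed :: [Int -> String]
    mixed = [show . (+ 1), \n -> replicate n '*']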
Well, polymorphic lists are one thing that is somewhat awkward. Another is that a small change in one part of a program (say, introducing an additional parameter, or having to use IO or state) can require massive changes in the rest of the program, or a PhD in category theory.
I'm a bit late here, but if you want to really smash the challenge, you can use Lisp-style metaprogramming with my Hasp program. Something like this is possible (10 nodes):
    def arc
      with->>= name input
      link "click here"
      display++ "you said: " name