
Picolisp - what a weird thing. I have to admire the temerity of a guy who in 2023 is still team "dynamic scope is better."


It's not just temerity, I think. Picolisp is a very different animal compared to all modern Lisp and Scheme flavors. It is the last "true Lisp" that I am aware of -- it has an ultra-minimalist interpreter (hand-written in assembly, by the way) that actually represents programs as linked lists. A function is really just a list whose first element contains the arguments and whose rest is the body, and the arguments are bare symbols. Picolisp has no compiler (not even a bytecode compiler), no lexical analysis and no other preprocessing. There is only the reader, and its output goes directly to the interpreter. On the upside, this makes Picolisp the only language with truly "first class" functions, in the sense that you can really create and manipulate them at runtime just like you would strings or integers, unlike pretty much every other language out there, where "lambdas" are just syntactic sugar over function pointers. On the downside, this is all of course completely unchecked and pretty unsafe, and, to come back to the original point, you do not get such conveniences as lexical scoping. That would be literally impossible to implement without changing the nature of Picolisp into a proto-compiled language.
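For illustration, here is a rough Python sketch (not Picolisp itself; `ev`, the value cells and the list encoding are all made up) of what such a direct list-interpreter with shallow dynamic binding could look like: functions are plain lists, and there is nothing to compile.

```python
cells = {}  # symbol name -> current value ("value cell")

def ev(x):
    """Evaluate a form: a number, a symbol, or a list (op arg ...)."""
    if isinstance(x, (int, float)):
        return x
    if isinstance(x, str):                 # a symbol: read its value cell
        return cells[x]
    op, *args = x
    if op == "+":                          # one builtin, for the demo
        return sum(ev(a) for a in args)
    fn = cells[op] if isinstance(op, str) else op
    params, *body = fn                     # a "function" is just a list
    vals = [ev(a) for a in args]           # evaluate args before binding
    saved = [(p, cells.get(p)) for p in params]
    for p, v in zip(params, vals):         # shallow-bind: overwrite the cells
        cells[p] = v
    try:
        result = None
        for form in body:
            result = ev(form)
        return result
    finally:                               # restore on exit (dynamic extent)
        for p, old in saved:
            cells[p] = old

# A function built at runtime out of an ordinary list, then called:
cells["twice"] = [["n"], ["+", "n", "n"]]
print(ev(["twice", 21]))               # 42

# Dynamic scope: the callee sees the *caller's* binding of n.
cells["show-n"] = [[], "n"]            # body refers to the free variable n
print(ev([[["n"], ["show-n"]], 7]))    # 7
```

Note how cheap dynamic binding is here: binding is just overwriting a cell and restoring it on the way out, with no environment structure to build.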


> It is the last "true lisp" that I am aware of -- it has an ultra-minimalist interpreter (hand-written in assembly, by the way) that actually represents programs as linked lists.

Strange, I thought many Lisp systems still have source-level, list-based interpreters. For Common Lisp I would think of: SBCL (optional), CLISP, Allegro CL, LispWorks, ECL, ... They can also compile code. Compiling Lisp code was already available in the first Lisp implementations, and having a compiler was an explicit goal of the original implementors.

Let's use the LispWorks Listener (the REPL tool):

    CL-USER 25 > (defun foo (bar) (break) (print (list :hello bar)))
    FOO

    CL-USER 26 > (foo 10)

    Break.
      1 (continue) Return from break.
      2 (abort) Return to top loop level 0.

    Type :b for backtrace or :c <option number> to proceed.
    Type :bug-form "<subject>" for a bug report template or :? for other options.

    CL-USER 27 : 1 > :bq

    INVOKE-DEBUGGER <- BREAK <- FOO <- EVAL <- CAPI::CAPI-TOP-LEVEL-FUNCTION <- CAPI::INTERACTIVE-PANE-TOP-LOOP
    <- MP::PROCESS-SG-FUNCTION

    CL-USER 28 : 1 > :n
    Call to INVOKE-DEBUGGER

    CL-USER 29 : 1 > :n
    Call to BREAK

    CL-USER 30 : 1 > :n
    Interpreted call to FOO

    CL-USER 31 : 1 > :lambda
    (LAMBDA (BAR) (DECLARE (SYSTEM::SOURCE-LEVEL #<EQ Hash Table{0} 81D03EFF03>)) (DECLARE (LAMBDA-NAME FOO)) (BREAK) (PRINT (LIST :HELLO BAR)))

The debugger output of the currently running function looks like a linked list to me. I could modify it destructively, if I wanted.

LispWorks has a list-level interpreter. One can also compile code.

Typically, Lisp interpreters tend to be written in C, since that is usually more portable than assembly.

> On the downside, this is all of course completely unchecked and pretty unsafe, and, to come back to the original point, you do not have such conveniences as lexical scoping.

I would expect from a typical Lisp interpreter (a source-level, list-based interpreter) that it does all kinds of runtime checks and also provides lexical scoping. If there is a closure, then this closure would be a combination of some function and an environment. In standard Common Lisp there is no access to that environment, but I can look into it from an inspector:

    CL-USER 54 > (defun example (a) (lambda () (break) (print a)))
    EXAMPLE

    CL-USER 55 > (example 10)
    #<anonymous interpreted function 8020001A59>

    CL-USER 56 > (describe *)

    #<anonymous interpreted function 8020001A59> is a TYPE::INTERPRETED-FUNCTION
    Code             (LAMBDA NIL (BREAK) (PRINT A))
    Environment      ((A . 10) (#:SOURCE-LEVEL-ENVIRONMENT-MARKER FUNCTION NIL . #<EQ Hash Table{0} 81D03EFF03>) (#:FUNCTOR-MARKER LAMBDA (A) (DECLARE (SYSTEM::SOURCE-LEVEL #<EQ Hash Table{0} 81D03EFF03>)) (DECLARE (LAMBDA-NAME EXAMPLE)) (LAMBDA NIL (BREAK) (PRINT A))))
As we can see, the interpreted closure has its code as a list and an environment where A = 10.


> The debugger output of the currently running function looks like a linked list to me. I could modify it destructively, if I wanted.

I doubt that actually, though I don't have LispWorks installed to try it. Modifying a function at runtime as if it were a list is actually the best test to see if your Lisp really represents functions as lists, or as some other internal object that is rendered as a list in the REPL by accessing the stored definition. E.g. both CLISP and guile error out if I try `(car (lambda (a b) (+ a b)))`.

Another good test is to construct a function at runtime and try to call it. To do that you will probably need to call `eval` or equivalent, just like you would in Lua or Python. Not in Picolisp though, which is why I consider it to be the only truly homoiconic programming language.
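To make the contrast concrete, here is what that looks like in Python: code assembled at runtime is inert data until an explicit `eval` step, whereas the claim about Picolisp is that the list itself is already callable.

```python
# Runtime-built code in Python is just a string (or an AST) at first.
src = "lambda a, b: a + b"     # inert data, not callable
f = eval(src)                  # explicit processing step turns it into code
print(f(2, 3))                 # 5
```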


> I doubt that actually

You can doubt that. But I have done it. I know that it works.

    CL-USER 63 > (defun foo (a) (print 'hey) (print a))
    FOO

    CL-USER 64 > (foo 'jack)

    HEY 
    JACK 
    JACK

    CL-USER 65 > (function-lambda-expression 'foo)
    (LAMBDA (A) (DECLARE (SYSTEM::SOURCE-LEVEL #<EQ Hash Table{0} 81D03EFF03>)) (DECLARE (LAMBDA-NAME FOO)) (PRINT (QUOTE HEY)) (PRINT A))
    NIL
    FOO

    CL-USER 66 > (fifth *)
    (PRINT (QUOTE HEY))

    CL-USER 67 > (setf (fifth (function-lambda-expression 'foo)) '(print 'hello))
    (PRINT (QUOTE HELLO))

    CL-USER 68 > (foo 'jack)

    HELLO 
    JACK 
    JACK

> E.g. both CLISP and guile error out if I try `(car (lambda (a b) (+ a b)))`.

Sure, but in CLISP the function is still a list internally. An interpreted function is a record, which stores the code as a list internally. The internally stored list is interpreted.

Python compiles the code to byte code. CLISP has both a list-based interpreter and a byte code interpreter.


Mmh. Like I said, I'm not familiar with LispWorks, so take this with a grain of salt, but to me it looks like the system is just retrieving the original source expression that it keeps around in addition to the executable representation. But this is ultimately a question of implementation. My original point was that in Picolisp runtime-constructed lists are directly executable, without any processing. An unroller function that takes an action `foo` and a runtime number, e.g. 3, would return `'(() (foo) (foo) (foo))` and that would be it. In other Lisps you would first build the equivalent of this list and then pass it to `eval` to make it actually executable. Whether this step is expensive or not depends on the system. E.g. efficient closures require scanning the containing scopes and creating minimal state objects. Just storing the parent environment pointer would be super-inefficient and would prevent the entire environment from being garbage-collected. Hence my claim that dynamic scope is the only thing that really makes sense in a direct-interpreted Lisp, and that the presence of lexical scope implies some nontrivial processing before execution, though not necessarily as extensive as what would usually be called compilation.

Edit: lexical analysis -> lexical scope
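To make the closure-cost point concrete, here is a rough Python sketch (the helper names are made up, this is not any real Lisp's internals) of the two capture strategies: keeping a pointer to the whole caller environment versus scanning for the free variables and copying out a minimal state object.

```python
def close_over_whole_env(env, body):
    # naive: the closure keeps the entire environment (and everything
    # reachable from it) alive until the closure itself dies
    return lambda: body(env)

def close_over_free_vars(env, free_vars, body):
    # scan once, keep a minimal state object with only the captured names
    captured = {name: env[name] for name in free_vars}
    return lambda: body(captured)

big_env = {"a": 1, "huge_buffer": [0] * 1_000_000}
f = close_over_free_vars(big_env, ["a"], lambda e: e["a"] + 1)
del big_env   # the buffer can now be garbage-collected; f still works
print(f())    # 2
```

The scan in `close_over_free_vars` is exactly the kind of pre-execution processing the comment above is talking about.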


But the behavior changed accordingly when lispm mutated the source expression.

So if there is another form that is actually being used for the execution, the change in source must have been detected and propagated to that other form.

Anyway, that situation looks like true blue interpreted functions. There is nested list source you can tweak, and the tweaks somehow go into effect.


    CL-USER 100 > (defun unroll (n exp) `(lambda () ,(loop repeat n collect exp)))
    UNROLL

    CL-USER 101 > (unroll 3 '(foo))
    (LAMBDA NIL ((FOO) (FOO) (FOO)))

    CL-USER 102 > (eval *)
    #<anonymous interpreted function 8020000EC9>

    CL-USER 103 > (describe *)

    #<anonymous interpreted function 8020000EC9> is a TYPE::INTERPRETED-FUNCTION
    CODE      (LAMBDA NIL ((FOO) (FOO) (FOO)))
As you can see, the thing is basically the same as a cons cell with two entries: the type and the code:

    (TYPE::INTERPRETED-FUNCTION . (LAMBDA NIL ((FOO) (FOO) (FOO))))
The above Lisp implementation does not use a cons cell, but a different type mechanism to easily and reliably identify the runtime type.

In Picolisp this is hardwired into the interpreter. The interpreter also needs to check at runtime, every time, whether the list structure is actually a function and what kind of function it is.

In above Lisp, the type of the function is encoded during EVAL and the check for the type is then a type tag check.

For this example here, using the LispWorks implementation, it also makes no difference to EVAL whether the function has 10 or 100000 subforms. The execution time is small. No special processing of the list of subforms takes place; for example, the code is not compiled, not converted to byte code, not converted to another representation.

    CL-USER 111 > (let ((f (unroll 10 '(foo)))) (time (eval f)))
    Timing the evaluation of (EVAL F)

    User time    =        0.000
    System time  =        0.000
    Elapsed time =        0.000
    Allocation   = 184 bytes
    0 Page faults
    GC time      =        0.000
    #<anonymous interpreted function 8020001DA9>

    CL-USER 112 > (let ((f (unroll 100000 '(foo)))) (time (eval f)))
    Timing the evaluation of (EVAL F)

    User time    =        0.000
    System time  =        0.000
    Elapsed time =        0.000
    Allocation   = 184 bytes
    0 Page faults
    GC time      =        0.000
    #<anonymous interpreted function 8020000839>



I stand corrected, thank you :)

I always thought that `eval` in CL was an un-idiomatic and fairly expensive operation, even for code that is not compiled. You learn something every day...


A Common Lisp implementation may implement EVAL by calling the compiler. That would be more expensive. Several Common Lisp implementations use EVAL to create an interpreted function, and the user can then call COMPILE to compile it.


> (car (lambda (a b) (+ a b)))

An interpreted function in a Common Lisp cannot literally just be a lambda expression list, because that would not satisfy the type system. It has to be of type function and a function is not a subtype of list.

What happens is that there is some container object which says "I'm an (interpreted) function", which has slots that hold the raw source code. It might not be a lambda form; for instance, the original lambda might be destructured into parameters and body that are separately held.

There is some API by which the interpreter gets to those pieces and then it's just recursing over the nested lists.

> Another good test is to construct a function at runtime and try to call it.

Common Lisp doesn't provide a standard API for constructing an interpreted function (or even make provisions for the existence of such a thing). Lisps that have interpreted functions may expose a way for application code to construct them without having to eval a lambda expression.

It's just a matter of constructing that aforementioned container object and stuffing it with the code piece or pieces. If that is possible then that object is something you can call.

When you call that function, eval ends up being used anyway.


There's also the ultra-minimalist part. Picolisp has like one data structure, a cell with two pointers, and that's it. Maybe symbols are implemented in some other way, I'm not sure, but pretty much everything is based around that.

Portability is not a concern. Either you run something 64 bit POSIX or you aren't going to use Picolisp (except if you get your hands on an old 32 bit build). I think it's usually tested by a user on OpenBSD but outside of Debian you're basically on your own.

There are like three basic data types. Fixnums, symbols and the linked list. If you do something similar to what you're showing from SBCL (or LispWorks, didn't read closely enough at first) it'll look pretty much like it does in source.

     $ pil +
     : (de myfun ()(prinl "yo")(prinl "world"))
     : (cdr myfun)
     -> ((prinl "yo") (prinl "world"))
     : myfun
     -> (NIL (prinl "yo") (prinl "world"))
     : (myfun)
     yo
     world
     -> "world"
     : (set (cdr myfun) '(prinl "hey"))
     -> (prinl "hey")
     : myfun
     -> (NIL (prinl "hey") (prinl "world"))
     : (myfun)
     hey
     world
     -> "world"
Hijacking the 'de mechanism is not something you'll do often, but looking at definitions like this is something you'll do a lot, and from time to time you'll navigate them with list browsing functions.

It boils down to some very simple interpreter behaviours, and after some time surprises become quite rare. I find it takes off quite a bit of cognitive load when solving non-trivial scripting tasks compared to e.g. bash or Python. Especially since 'fork and 'in/'out are so easy to work with: with the former you just pass in an executable list, '((V1 V2 Vn)(code 'here)(bye)), with the latter you get a direct no-hassle connection to POSIX pipes.


LispWorks, I change the interpreted code:

    CL-USER 63 > (defun foo (a) (print 'hey) (print a))
    FOO

    CL-USER 64 > (foo 'jack)

    HEY 
    JACK 
    JACK

    CL-USER 65 > (function-lambda-expression 'foo)
    (LAMBDA (A) (DECLARE (SYSTEM::SOURCE-LEVEL #<EQ Hash Table{0} 81D03EFF03>)) (DECLARE (LAMBDA-NAME FOO)) (PRINT (QUOTE HEY)) (PRINT A))
    NIL
    FOO

    CL-USER 66 > (fifth *)
    (PRINT (QUOTE HEY))

    CL-USER 67 > (setf (fifth (function-lambda-expression 'foo)) '(print 'hello))
    (PRINT (QUOTE HELLO))

    CL-USER 68 > (foo 'jack)

    HELLO 
    JACK 
    JACK


Is the first line in the body executable?

  (LAMBDA (A) 
    (DECLARE (SYSTEM::SOURCE-LEVEL #<EQ Hash Table{0} 81D03EFF03>)) 
    (DECLARE (LAMBDA-NAME FOO)) 
    (PRINT (QUOTE HEY)) 
    (PRINT A))
If not, one would probably need to do a bit of fiddling to tear the function away from the symbol, should one feel a sudden urge to do so.

Perhaps similar to this:

    : (de myfun (D)(prinl "heyo") (prinl D))
    -> myfun
    : myfun
    -> ((D) (prinl "heyo") (prinl D))
    : (mapcar '((D) (prinl "heyo") (prinl D)) '("world"))
    heyo
    world
    -> ("world")
    : (mapcar '(NIL (prinl "hey")) '(lel))
    hey
    -> ("hey")
    : (car myfun)
    -> (D)
    : (cdr myfun)
    -> ((prinl "heyo") (prinl D))
    : (cons (car myfun) (cdr myfun))
    -> ((D) (prinl "heyo") (prinl D))
    : (mapcar (cons (car myfun) (cdr myfun)) '("world"))
    heyo
    world
    -> ("world")
Mostly I use this to get at an implementation so I can test it or a portion of it against some particular value, or just to see how something works. Most builtins are implemented in assembler and their symbols only return a pointer, but for example 'doc is implemented in Picolisp:

    : doc
    -> ((Sym Browser) (raw T) (call (or Browser (sys "BROWSER") "w3m") (pack "file:" (and (= 47 (char (path "@"))) "//") (path (if (get Sym 'doc) (pack @ "#" Sym) "@doc/ref.html")))) (raw NIL))
    : de
    -> 270351
    : macro
    -> ("Prg" (run (fill "Prg")))
I really like the Picolisp 'match (https://software-lab.de/doc/refM.html#match ) function. What's the easiest way to do the same in Common Lisp? If it's not obvious from the examples there, it can also be used with character lists, i.e. transient symbols, i.e. strings, chopped up into a list of UTF-8 characters. It's similar to unification in logic programming, which is something Picolisp supports.


LispWorks:

    CL-USER 130 > (defun myfun (d) (print "heyo") (print d))
    MYFUN

    CL-USER 131 > (let ((source (function-lambda-expression #'myfun)))
                    (mapcar (eval (list* 'lambda
                                         (second source)
                                         (nthcdr 4 source)))
                            '("world")))

    "heyo" 
    "world" 
    ("world")
The above can in some ways be done in many Lisp implementations. In this form it is simply not widely used. For most applications it's more interesting to use macros to manipulate code, which can then be compiled to efficient code.

Pattern matching has been implemented in Lisp many times.

I adapted this code for a pattern matcher from a book (LISP, Winston/Horn), probably >30 years ago:

    CL-USER 115 > (pmatch:match '(#$a is #$b) '(this is a test))
    ((B (A TEST)) (A (THIS)))

    CL-USER 116 > (pmatch:match '(#$X (d #$Y) #$Z) '((a b c) (d (e f) g) h i))
    ((Z (H I)) (Y ((E F) G)) (X ((A B C))))
Not to say, that Picolisp isn't great for you, but it is not the only language where lists can be manipulated.


"Not to say, that Picolisp isn't great for you, but it is not the only language where lists can be manipulated."

Don't think I've made this claim.

Can the #$a &c. be used as variables?

    : (and (match '("h" "e" "l" "l" "o" @A ~(chop "ld")) (chop "helloworld")) @A) 
    -> ("w" "o" "r")


> Don't think I've made this claim.

That's true, but it sounds a bit as if Lisp interpreters were something very unusual. The syntax and other details may differ, but the general idea of executing source code via an interpreter is very old and often implemented, and so is the idea that the code can be mutable. It's just not very fashionable, since some Lisp dialects are designed such that one wants the compiler to be able to statically check the code for various things before runtime.

In Common Lisp we would not want to introduce variables into an outer scope by enclosed functions. One would explicitly set up a scope.

Example: this is a macro which creates a scope where the matching match-variables are also Lisp variables.

    CL-USER 165 > (pmatch:when-match (append (coerce "hello" 'list)
                                             '(#$a)
                                             (coerce "ld" 'list))
                      (coerce "helloworld" 'list)
                      (length a))

    3


Wait, is it global by default (Lua, Bash) or truly dynamic? The latter would be kind of mind-bending to program with as the sole or default style. Was that ever a thing? Maybe I'm just too young to have experienced that.


It's a shame that John N. Shutt is no longer with us.

He created a very Scheme-like language called Kernel that seemed to walk the line between lexical and dynamic in a much more controlled way.

https://web.cs.wpi.edu/~jshutt/kernel.html


I think Kernel is the other extreme from Picolisp, since it wants all objects to be first class but wishes to maintain lexical scope information for all of them. I think this is hard because, in a certain sense, the names of things in a program have no natural correspondence to the meaning of the program, at least from the point of view of a compiler writer. Code calculates a value using other values, or changes the state of memory, or however you want to conceive of it. The names one used to tell the compiler how to do that have no obvious relation to the transformation, and keeping them around so that the programmer can meta-program is complex and makes the generated code slower. In a way, Common Lisp and Scheme seem like two distinct local maxima, of which I prefer the latter. Kernel is neat though.


Kernel is "mostly" just lexical unless you explicitly opt-out with FEXPRs. FEXPRs are what draw most people into Kernel.

However, what is probably more important but doesn't immediately stick out until you poke at Kernel a lot harder are the fully reified "environments". "Environments are copy on write that don't destroy older references" has very subtle consequences that seem to make dynamic scope a lot better behaved.

This also has the consequence that I can force things to explicitly evaluate in reference to the "ground environment" which is an explicit signal that it can always be compiled.

I suspect there is a lot of fertile research ground here. In addition, there is a lot of implementation subtlety that I'm not sure he really grasped. Environments need a special data structure otherwise they cons up an enormous amount of garbage (I suspect it really needs a Bitmapped Vector Trie like Clojure).

I wish Shutt were still around to talk to. :(


Emacs Lisp had dynamic binding as the default (without any true support for lexical binding) until 2012.


People talk as if dynamic scoping was objectively a mistake, but the fact that it works well and is really useful in a complex piece of software like Emacs seems to suggest otherwise.


The original opposition to lexical binding in Lisp circles was that lexical binding would be slower. That turned out to be false.

Emacs Lisp explicitly kept dynamic binding for everything because it made for simpler overriding of functions deep inside the system, but this resulted in lower performance and various other issues, and ultimately most of the benefit from such shadowing seems to have been taken over by defadvice and the like.


I can understand why that objection would be raised, because lexical binding is slower in code that is interpreted rather than compiled, compared to (shallow) dynamic binding. Under shallow dynamic binding, there isn't a chained dynamic environment structure. Variables are simply global: every variable is just the value cell of the symbol that names it. The value cell can be integrated directly into the representation of a symbol, and so accessing a variable under interpretation is very fast, compared to accessing a lexical variable, which must be looked up in an environment structure.
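The cost difference can be sketched in a few lines of Python (the two lookup helpers and the environment encoding are made up for illustration):

```python
# Shallow dynamic binding: variable access is a single step, reading the
# symbol's value cell. Lexical lookup in a naive interpreter instead
# walks a chain of environment frames.

cell = {"x": 10}                         # one global value cell per symbol

def dynamic_lookup(name):
    return cell[name]                    # O(1), no environment to search

def lexical_lookup(name, env):
    # env is a chain of (frame, parent) pairs, innermost frame first
    while env is not None:
        frame, parent = env
        if name in frame:
            return frame[name]
        env = parent
    raise NameError(name)

outer = ({"x": 1}, None)
inner = ({"y": 2}, outer)                # nested scope; x is one frame out
print(dynamic_lookup("x"))               # 10
print(lexical_lookup("x", inner))        # 1
```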


A rather weak argument when you consider what kinds of mechanisms (like a digital clock with a working seven-segment display) people have put together in Conway's Game of Life; to me this does not suggest in any way or manner that GoL could ever be my favored platform for simulating a digital clock (or anything more complex than a glider, for that matter). Likewise, vacuum cleaners and toothbrushes have likely been made hosts for playing Doom, and people accomplish all kinds of stuff like quines and working software in brainf*ck. None of these feats are indicative of the respective platform being suitable, or the right tool, for a sizable number of programmers.


As someone that used it in languages like Clipper, or Emacs Lisp, and ADL rule in C++ templates, cool to make programming tricks of wonder, a pain to debug when something goes wrong several months later.


Few deny the utility of dynamic-style variables for certain kinds of programming. But it can be helpful to segregate that behavior more carefully than in a language where it is the default.


I used to program in PicoLisp a long time ago but I have forgotten most of it.

Hope you can make sense of this:

https://picolisp.com/wiki/?firstclassenvironments


> PicoLisp uses dynamic binding for symbolic variables. This means that the value of a symbol is determined by the current runtime context, not by the lexical context in the source file.

> This has advantages in practical programming. It allows you to write independent code fragments as data which can be passed to other parts of the program to be later executed as code in that context.

This amuses me because, while it's technically true, this amazing feat is accomplished only by denuding the code of substantial expressive power, namely the relation between the lexical denotation of the code and its meaning. I will say this: aesthetically, I prefer Picolisp's approach to Common Lisp's, which is to just paper over this problem with gensyms, packages, etc. Give me hygienic macros or give me death.


Gensyms and packages are not required to make lexical scope work. Macros in an unhygienic macro system use them internally so that their expansions don't have unexpected behaviors in the scope where they are planted. The problems avoided by gensyms or packages affect both dynamic and lexical scopes. A dynamic variable can be wrongly captured by an internal macro variable, not only a lexical variable.

It may be there are solutions favored in Picolisp without using macros that would be done using macros in idiomatic Common Lisp, and so those solutions don't need gensyms and whatnot.


My point is only that unless you are using a hygienic macro system, the idea that you are manipulating code in your macro is an (often white) lie. Code has semantics, a meaning, and unless the object you manipulate carries those semantics with it (as the syntax objects of e.g. `syntax-case` do), you're just manipulating some data which has a necessarily superficial relationship with the code itself. Picolisp resolves this by simply "eliminating" lexical scope, which means that code really is trivially related to its denotation, since the semantics of variable binding really are just "whatever is currently bound to this variable." Scheme resolves this by having syntax transformations instead of macros: functions which genuinely manipulate syntax objects which carry along with them, among other things, information about their lexical context. Common Lisp accepts that most of the issues arising from the distinction between code itself and its bare denotation can be worked around, and provides the tools to do that, but in Common Lisp one still transforms the denotation of the code, not the code itself. From a purely aesthetic point of view, the Scheme approach is much more satisfactory. From a practical point of view, it doesn't seem to be particularly onerous to program in, although Scheme macros seem to lack the immediate intelligibility of the Common Lisp ones.


You are manipulating fragments of source code in a macro. Material in which tokens have been converted to objects and which has a nested structure. So, nicer than textual source code.


I mean, yes and no. In a CL macro you are manipulating lists of symbols and other atoms, and in a sense that is code. But code has some static properties (of which lexical binding is one) which are not reflected in that structure and which you can break pretty easily in a CL macro. A Scheme syntax object carries the lexical information which is so critical to the meaning of the code, and because it does, it is much harder to accidentally manipulate the code in such a way that its meaning changes. It is exactly the static lexical binding semantics of Common Lisp which introduce the conceptual tension in macro programming that requires the programmer to manually worry about gensyms. Because Picolisp lacks lexical binding, manipulating code lacks this complication (and, in fact, the complication of a macro system almost reduces to a trivial combination of quotation and evaluation).


Programmers say that they are manipulating code when they go "vi foo.c" at their Unix prompt, so that's a bit of an upstream rhetorical paddle.

> It is exactly the static lexical binding semantics Common Lisp which introduce the conceptual tension in macro programming that requires the programmer to manually worry about gensyms.

A dynamically scoped Lisp (like Emacs lisp by default) with those kinds of macros needs gensyms all the same. It isn't the lexical scope.

When we have (let ((x 1)) (+ x x)), then regardless of whether x is lexical or dynamic, there is a lower level of binding going on. The x in (+ x x) physically belongs to the enclosing (let ...). That is not lexical scope; it's a fact about the position of the code pieces regardless of x being lexical or dynamic.

This is why in that strategy for implementing hygienic Scheme macros that you're alluding to, syntax objects, there is a different kind of closure at play: the syntactic closure. It is not a lexical closure.

The syntactic closure doesn't say that "x is bound as a variable". Only "this x expression is meant to be enclosed in this code".

Picolisp doesn't run into hygiene issues requiring gensym because it doesn't perform macro expansion:

https://picolisp.com/wiki/?macros

If you don't have a code manipulating process that invisibly transplants pieces of code from here to there, then of course you don't have the issues which that entails.
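To make the capture problem concrete, here is a Python sketch of an unhygienic, purely textual expansion (the swap template and the `gensym` helper are made up for illustration; this is not any real macro system):

```python
import itertools

# A textual "macro": a swap template that introduces a temporary name.
template = "tmp = {a}; {a} = {b}; {b} = tmp"

# Capture: the user's own variable happens to be named "tmp".
ns = {"tmp": 1, "x": 2}
exec(template.format(a="x", b="tmp"), ns)
print(ns["x"], ns["tmp"])      # 2 2 -- the swap silently failed

# With a gensym for the temporary, the expansion cannot collide.
_counter = itertools.count()
def gensym():
    return f"_g{next(_counter)}"

ns2 = {"tmp": 1, "x": 2}
t = gensym()
exec(f"{t} = x; x = tmp; tmp = {t}", ns2)
print(ns2["x"], ns2["tmp"])    # 1 2 -- correct swap
```

Note that nothing here depends on lexical vs. dynamic scope; the collision comes purely from the invisible transplanting of code, which is the point made above.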


Lisps sure is fun! I didn't understand any of this kind of stuff until I learned Lisp.


Like Dijkstra said, you're able to think previously impossible thoughts.


idk doesn't this mean I can't get any help from IDEs? code-completion? find-all-references?


You'd probably be the only person using an IDE for development in Picolisp.

The main author does (or at least did) a lot of development on a tablet, with his own software keyboard (https://play.google.com/store/apps/details?id=de.software_la... , which I've enjoyed for years on my handhelds, in part due to the tmux-arpeggio), and his own editor (https://picolisp.com/wiki/?vip ). I think most of us do something similar, using vim or vip, maybe on larger computers, but generally a pretty minimal setup.

The REPL has string based completion, besides completing symbols it will also complete file paths. Development is heavily REPL based, you'd spend a lot more time inspecting the runtime than searching for string occurrences in files.

From the REPL you'd also read the language reference, most likely in w3m, the preferred text web browser in this community. (doc 'macro) will open the reference on this entry if you started the REPL with 'pil +', where the + is a flag denoting debug mode. You can expect the web GUI framework to work rather well in w3m.


Of all the quirks and weirdness in Picolisp, this is what gets to you?

Either a Picolisp system is short-lived enough that your execution environment actually maps to the file you're loading, or you're going to be interactively inspecting it anyway.


honestly, I feel the same way. for some reason dynamic scope just feels "wrong" rather than quirky or weird, in that it doesn't fit my mental model of how a programming language should behave and what sort of bookkeeping needs to be the compiler's problem rather than my problem. never used Picolisp, but I really wanted to like Lush back in the day, and the dynamic scope was the stumbling block.


There is no compiler in Picolisp, only a very simple interpreter.

The only bookkeeping problem I've encountered in practice is littering the runtime with symbols. As far as I know there's no way to make it forget about symbols it has encountered, and they are interned as soon as they are encountered. I think the namespacing is supposed to counter this, but I've never learnt it properly.

I'm not sure in what situation the scoping would be a problem. The way I usually go about picolisping is using maybe a few globals (could be options, credentials, global stacks, something like that), conventionally marked with an asterisk, like *Global, and then everything else is functions that typically take one parameter, or perhaps a data parameter and a numeric limit for an iterator. Besides let-assignment inside functions variables rarely seem to be the right tool for me.

Lush, if it's the Lisp-like shell thingie, seems like a rather different programming environment, what with the inline C and whatnot. Might try it out, seems it hasn't been updated in fifteen years or so, could be an adventure.


lush was basically an early attempt at what julia does today - a high level lisp like language and a high performance c-like language with good ffi support wrapped in one.

I do see your point about picolisp being simple enough that the dynamic scope fits into the overall model; I might give it a try sometime just to see what it's like in practice.


It's one of my favourite tools; the entire Picolisp system is like 1.5 MB or so, so it's really easy to move to some constrained or remote environment, as long as it's POSIX.

When terminal incantations grow unwieldy I usually abstract by putting them in a Picolisp function and start doing things in the REPL. It takes like ten minutes to write a wrapper around a MySQL CLI client that takes a query string and outputs a list of lists, and then you can wrap that in fork/bye and mapcar it over cuts from a list to run queries or imports in parallel. Similarly you can trivially hack up an 'Ansible at home' by running commands over SSH in parallel. If I'm worried about data races I let Linux handle it by writing lines to a file and then slurping that back in when every process is done.

Surprisingly many things are heavier or slower than launching and disbanding a POSIX process, so it's often quite viable to just hand over to forks. Once I scraped out data about many thousands of people in tens of organisations by wrapping w3m and forking on URLs I'd gathered in a similar way. It can probably be done with Beautiful Soup too, but I already knew how to use a browser to grab a copy of a web page, and low-level string parsing was probably faster and easier to implement than some abstracted XML representation or XPath. The value of that data set was easily a couple of orders of magnitude larger than what I got paid to assemble it.

I mean, you can do these things in bash or Python or Clojure or something, but the Picolisp REPL has really good ergonomics and gets out of the way. Performance isn't great, but I find it good enough and have handled files with hundreds of thousands of lines without getting annoyed. Sometimes I reach for Elixir instead but it feels a bit clumsy in comparison.


It's like monkey patching in Ruby. It can be used for good and for evil.


In a less balanced both-sides view Ruby as a social construct is an outlier, monkey patching being (rightly so IMHO) regarded as a more-than-questionable practice in most other mainstream PL communities. I mean, yeah, GOTO can be used for good and for evil, and so can GOSUB 200.



