As others noted, the Dragon Book spends far too much time on parsing, and not nearly enough on optimizations. To be fair, the first edition of the book is from 1986. It looks like the 2nd edition (2006) adds three chapters on data flow analysis and parallelization. I don't happen to have that version of the book, so I don't know how good the material is.
I took compilers in grad school in the early 90's. Even then, we mostly skipped parsing because that was considered covered in another class (Theory of Computation, a class that covered regular expressions, finite automata, Turing machines, P vs NP, those sorts of topics).
The professor spent as much time as he could on optimizations, since that was his research focus. He then went on to write his own compiler book (Engineering a Compiler; Cooper & Torczon), so you can compare the table of contents (https://www.elsevier.com/books/engineering-a-compiler/cooper...) and see what current compiler researchers feel are important topics to cover in a textbook.
Not throwing the Dragon Book under the bus, but it probably is in the "cited widely but rarely read" category as you have noted, just from its position as being first/early.
An anecdote from me - I had the occasion to write a parser for a work project a few years ago. Rather than reach for flex/bison or any theory I had from compiler courses... I went with the new-to-me method of a parser combinator. This was awesome and I might even describe as fun. Granted the target language I was parsing was much simpler than a programming language, but I hope that is covered in today's compilers courses. I remember really tedious parsing homework problems back in the day.
> As others noted, the Dragon Book spends far too much time on parsing, and not nearly enough on optimizations.
If you actually want to specialize in writing compilers, that may be true.
But most people who study compilers don't write compilers. Instead, the time they spend studying compilers will help them to better understand programming language design, which will make them more effective at programming anything.
And for that, it is far better to spend lots of time on parsing than on esoteric compiler optimizations.
> (Engineering a Compiler; Cooper & Torczon)
I bought that book, but didn't get much out of it because it concentrates on compiler optimizations and seems to be truly targeted for an audience of compiler specialists. Instead, I chose Programming Language Pragmatics by Michael L. Scott, and went through nearly the entire thing with a study group over about 10 months.
The D programming language design is based in part on my experience with designing and building compiler optimizers, including the first data flow analysis C compiler for DOS.
It's why D has, for example, immutable types rather than just const types. (You can't optimize C const pointers because another mutable reference to the same value can change it.) D's contracts are also designed with an eye towards enabling optimizations.
> Instead, the time they spend studying compilers will help them to better understand programming language design, which will make them more effective at programming anything.
What makes you think that? Especially the second part, but really both.
I can confirm that the second edition gives a quite-good treatment of data flow analysis. Not as thorough as you'd find in a proper program analysis book (where Nielson is the usual choice, though the recent Rival/Yi book on abstract interpretation is fantastic IMO) but it's well-written and at around the right level of detail for a compilers course.
Source: PL PhD candidate, used (parts of) it to teach a graduate compilers course.
> To be fair, the first edition of the book is from 1986.
The first edition of Compilers: Principles, Techniques, and Tools is from 1986. The first edition of the dragon book wasn't Compilers, it was Principles of Compiler Design (1977). The 2nd edition of Compilers is the 3rd dragon book.
My impression (it's been a very long time since my compilers class) was that it didn't spend too much time on parsing; rather, it spent too much time on how to implement parser generators. I've spent a lot of time parsing things in my life, but much less on the details of LR parsers.
I was doing compiler design back in the late 70s/early 80s, and writing your own compiler generator was quite common at the time; it wasn't until things like bison/yacc became available that this changed. They not only did an excellent job but also allowed one to hack on them and/or the result (for example, my Algol68 compiler needed to be able to decide how to resolve some shift/reduce conflicts on the fly ... that's how you do changeable operator precedence).
Well, it was basically the only book out at that time. I remember a few others from the 1990s (Compiler Design in C, Holub; and later, after I graduated, Modern Compiler Implementation in C, Appel).
Hey something cool I found while googling - Holub's book is available as a free download from the author! (https://holub.com/compiler/)
The Dragon Book has flaws, but it is also a case of the pioneer catching all the arrows. The OP asked about unsubstantiated criticisms of the book, so I added in the one I remember from my courses - mostly too much on parsing, not enough on everything else. The 2nd compiler course I took didn't even use a textbook, Dragon book or otherwise; it was a handful of papers and a semester-long project on SSA and associated documentation.
I took a professional development course in C from Alan Holub in Berkeley that I think was the base material from that book, ages ago. I can say that a roomful of talented people worked really, really hard to keep up the pace in that course. Like some other classes at UC Berkeley proper, surviving the course says something about you! It was all in portable C, and rigorous.
I took a course in compilers from Hennessy and Ullman at Stanford in the 80's. The course material was a series of papers written by them on various techniques, including a lot of data flow stuff.
I've thought this before, but I think it's a shame that parsing is lumped in with compiler courses. Parsing is such a huge subject, and I'd say that the Dragon Book only lightly touches on parsing (because it's a compiler book after all).
CS course designers of the future - separate compiler courses into two separate subjects 1. Compilers, and 2. Parsing!