Dijkstra was empirically wrong about that (generations of perfectly good programmers were taught BASIC first in schools), and about most of the other witty and self-congratulatory quotes in that collection: http://www.cs.utexas.edu/users/EWD/transcriptions/EWD04xx/EW... Perhaps the most amusingly wrong one is the claim that FORTRAN, which today sits at the core of the scientific Python stack, was "hopelessly inadequate for whatever computer application you have in mind today [i.e., 1975]: it is now too clumsy, too risky, and too expensive to use."
Dijkstra was good at many things, but there's no need for an argument from authority when we have actual data and not just opinions made to sound stronger because they are phrased more aggressively.
Or, to put it more accurately, he exaggerated for humor. It's a communication style. He was wrong if you read him too literally; read less literally, his basic point was right.
Plenty of programmers who start out in a language like BASIC never make it to the other side. And this applies much more to Java than to BASIC. With BASIC, everyone who goes into programming will move on at some point (there aren't BASIC jobs out there), so they'll be forced to learn something different. With Java, plenty of people learn it, work whole careers, and never know any better. That doesn't just prevent them from coding in Ruby on Rails or whatever else -- it makes them worse Java programmers too. They understand WHAT, but they don't understand WHY. They can't reason about things like abstraction from first principles.
You want to start out with a broad view of computation. From there, you then want to narrow. Java is okay for a junior-level course on OOP, but it's really, really lousy for a first exposure to programming, or a freshman course.