You can strive for perfection and still have a grounded outlook on tradeoffs. Such people would in fact be very good at engineering security-critical aspects, IMHO. I don't think "striving for perfection" in itself implies an inability to accept a calculated risk, or some kind of paralysis.
I'd look at this the other way around: there are people who don't strive for perfection, who simply deliver something once it meets the expected bar and give it no thought beyond that. I wouldn't want those people designing my safety systems; they'd leave possible improvements lying on the ground by not caring to think even a little beyond the boundary of their "box".
It feels like the same coin to me: an inability to accept calculated risk.