No, that was always an illusion put on to make the entire movement seem non-threatening to outsiders. EA started in the rationalist community as a way to justify why a small group of grifters deserved money to help save us from our future AI overlords. They cobbled together a bunch of mumbo jumbo, claimed credit for a bunch of efforts they weren’t really involved in, and called the result EA: a thin coat of paint over what is essentially Pascal’s Wager. It worked because, as it turns out, the rationalists aren’t actually all that rational.
They’d attract impressionable, idealistic people with talk of malaria nets and universal basic income, then gradually expose them to simulation theory and x-risk until they were convinced that the real EA was funding a bunch of esoteric institutes most outsiders had never heard of. They’ve become more mask-off in recent years as the wheels have come off the grift, and now topics like longtermism are out in the open rather than secret esoteric knowledge you only gain access to once you’ve cleared your thetans, er, cognitive biases. To me, there’s no better way of understanding what the real EA is than to read their debate over whether it was justified for the EA foundation to spend £15M on a mansion in rural England. Oh, and as with pretty much every perfectionist cult, it also morphed into a misogynistic sex cult where sexual abuse was systematic and rampant.
There’s a reason the larger charity community has largely wanted nothing to do with them for the entire time they’ve existed, and it has nothing to do with how “blinded” and “stuck in an old paradigm” the non-profit industrial complex supposedly is. To paraphrase a famous saying: “EA is both novel and good. Unfortunately, the good bits aren’t novel and the novel bits aren’t good.”