I am not sure if you are confusing the current philosophy of longtermism with the general concept of taking the long view. I think any rational person would agree that looking beyond the immediate is necessary to thrive, although the further out you look, the less able you are to make an informed decision. But longtermism basically treats the 8 billion people who are currently alive as completely expendable, with the exception of a very few who will be necessary to bring about some future with a galaxy full of happy humans. Conveniently, those who choose to embrace this view see themselves as part of those few. Either they are uniquely gifted and visionary enough to help bring this about, or they have unique genes that need to be spread and advantaged so their descendants can. The rest of us are just faceless NPCs in their story.
I am not sure how a non-malignantly narcissistic version of longtermism would work. How would you weigh the value of a current human against a future human? I can see supporting NASA and private space companies, but what else? We have no idea at this point whether things like AI and human bioengineering will push humanity toward or away from a future utopia. And what if you became convinced of some version of Dune's Golden Path, believing the only way to get those free and happy trillion humans is to force the next 50 generations into horrific slavery?