How do game developers handle advancements in technology?

Continuing the discussion from Star Citizen is the Future of PC Gaming, and it [was] Free to Play this week (Edit: No Longer Free):

The Star Citizen thread recently celebrated its 6th birthday. The game itself has been in development for ten years.

One thing I’ve heard from grizzled PCMR veterans is that part of Star Citizen’s appeal is that it doesn’t coddle casual hardware. It’s so technically ambitious that it won’t even be available on consoles, which generally lag behind contemporary high-end gaming rigs in terms of power.

Buuuut a powerful PC from 2012 (the year of its Kickstarter) is what the kids today would call a potato. The prettiest and most complex games of 2014 (the year SC was supposed to release) aren’t blowing anybody’s minds today.

Star Citizen is an outlier, but plenty of games spend 3-6 years in development and then launch as finished products. How do they deal with the improvements and innovations that happen during that cycle? Is the heavy lifting actually done by the developers of big engines like Unreal, with game developers just updating and renovating as they go without much hassle? Is GPU scaling generally predictable enough that it can be planned for as part of the development cycle? Is implementing brand-new technology (I hear a lot about ray tracing and DLSS these days) relatively easy, or are those Big Deal decisions that end up driving a lot of the toxic crunch we keep hearing about?