How will Fallout 4 look on a 4K TV when played on Xbox One or PC?

So I recently got a new 65" Sony LED 4K TV and was amazed at the picture (it was a big step up because my old TV was a 42" Samsung plasma that was starting to die)… I also got the Xbox One X and have been playing Fallout 4 on the 4K TV, and the picture is very good… it greatly enhances the gaming experience, I feel. But I also have a pretty good gaming computer with an i5-2500 processor and a GTX 980 Ti graphics card that I built a couple of years ago, but it is not being taken full advantage of because of the crappy 1080i monitor from 2007. I’m not really interested in upgrading the monitor because

  1. I’m not really interested in spending a lot of $$$ on a state-of-the-art gaming monitor unless it will be worth it (and many people say the price of 4K monitors is way too high)
  2. I don’t really play PC games that much, except once in a while

So is there a way to let the high-end gaming components in the PC do the work and the 4K TV do the displaying of the picture? Basically, streaming from the computer to the TV in 4K? Would that be possible, or would the input lag be too high for it to be useful? And would the Xbox One X have a better picture simply because it is designed to render games onto a 4K TV and a PC is not? Another question: can the streaming be done over a local network or the internet, or does a long cable need to be attached between the TV and PC?

Sure.

Your TV is essentially just a big computer monitor.

You can plug your PC in directly, switch to that input on the TV, and off you go (assuming your video card has an HDMI output, which it probably does… if not, there are cheap adapters).

Input lag differs from one TV to another. Some are good, some terrible. Some TVs have a special game mode to minimize it. You’d have to look at your model to see what’s what.

You’ve got a couple of options available to you.

  1. The direct connection mentioned by Whack-a-Mole (preferred).

  2. If you have Windows, there will be an option to “connect to a wireless display” in your display settings. You would be able to cast to a device like a Roku, or possibly directly to the TV if it’s network enabled.

  3. Similar to the second option: get a Steam Link to connect to the TV and let Steam deal with sending the picture over. This will probably have less lag than option 2.
ETA: Whack-a-Mole: What do you have against moles? What did they ever do to you?

ETA 2: Never mind the Steam Link. It only supports 1080p.

The 980 Ti will most likely choke on FO4 at 4K. You’ll probably have to scale down to 1080p to get playable framerates.
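To see why, a back-of-the-envelope sketch in Python: 4K pushes four times the pixels of 1080p, and a GPU that is shading/fill-rate bound slows down roughly in proportion to pixel count, so a quarter of the framerate is the pessimistic expectation (real games scale less than linearly, so treat this as a ballpark only):

```python
# Rough pixel-count comparison, 4K vs. 1080p.
# A purely fill-rate-bound GPU would see framerate fall roughly
# in proportion to pixel count; real games scale less than linearly.
uhd = 3840 * 2160   # 8,294,400 pixels
fhd = 1920 * 1080   # 2,073,600 pixels

print(f"4K / 1080p pixel ratio: {uhd / fhd:.1f}x")                 # 4.0x
print(f"60 fps at 1080p ~ {60 * fhd / uhd:.0f} fps at 4K (worst case)")
```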

If your TV and video card both have DisplayPort 1.2 (or later) or HDMI 2.0 (or later), use that. You do not want to stream it. I’m not sure whether a Steam Link over Ethernet would work well, but I can’t imagine intentionally using a controller if you have keyboard + mouse as an option.

Your biggest limitation is whether the game will run at a reasonable framerate.

The main reason I asked this question is that there are some who say a PC will always output a better picture than a console. I think it has something to do with the resolution or frame rate; for example, a PC will output a native 4K image at 60 fps while an Xbox One X will only do 30 fps. And then there is also HDR. Can someone explain what HDR gaming is and why it is being talked about so much? On my TV there is a setting called “HDR picture mode” under the picture modes, and while it makes the image very colorful and bright, I’m not sure if that is what they are referring to when they say “HDR gaming”. Isn’t HDR dynamic, meaning the lighting levels change as you go through the game? Do computers also do HDR gaming?
What does it mean to calibrate a 4K TV? There are many different sliders like brightness, contrast, gamma, etc., and I’m guessing the picture modes select a set of those settings based on what you are doing (gaming, watching a movie). But is calibrating the 4K TV the way you get the best picture?

What graphics card do you recommend? (Does the processor matter? I have an i5-2500K and always assumed I could just overclock it to suit my needs, for example overclock it to 4 GHz if a game is getting slow.) I have a good CPU cooler on it.

As it happens, the conventional wisdom is that a GTX 1080 (maybe a 1070?) is the bare minimum for 4K PC gaming, and an i5-2500K is still just sufficient not to be a direct bottleneck (we are talking eight-year-old tech here). It’s not a rare combination, in fact; it’s a common budget-enthusiast choice, especially where you also need to run some older software. And some adequate gaming 4K TVs (yes, they exist) now cost less than $400. I’m not sure about your model; lag could be the main problem.

The most expensive one you can afford, to be honest. GTX 1080 minimum. Most games bottleneck on the GPU, not the CPU.

The only reason PC games look better is that console versions have to make a bunch of optimizations to keep a steady frame rate on the fixed hardware, and that generally means looking worse.

For a PC game, they figure that if you really are a graphics enthusiast, you can throw as much money as you have available at keeping a good frame rate at ultra settings and high resolutions. If you’re poor, you put everything on low and get by.

I play PC games because I tend to favor multiplayer shooters, and nothing is better than a keyboard and mouse for those. Graphics are secondary, and I will turn down whatever settings I have to in order to get a good frame rate.

I agree the 980 Ti is nowhere near sufficient for 4K gaming.

You need a 1080 Ti at a minimum. Fortunately, the new 2080s just hit the street, so 1080 Ti prices should be dropping. The 2080 seems roughly equal to a 1080 Ti, but with the ray-tracing thing that few or no games use yet. The 2080 Ti is a noticeable improvement and the fastest card out there, but it costs an arm and a leg.

I’d say the 1080 Ti is the best bang for the buck for 4K gaming. It CAN do it, and it is not bad, but it is kind of on the edge. Sometimes it chugs…

And yes, PC graphics can always be better than console graphics (partly because you usually have a lot more configuration options for getting the best results). Consoles cut corners in order to keep costs down: they do not have the best hardware, and you cannot upgrade them. Generally, I think, they are equivalent to a middling gaming PC in terms of performance.

Not to mention the ability to mod many PC games. Look at Skyrim as an example…kept alive and awesome by a dedicated modding community. That can really make a difference.

Also, mouse + keyboard > game controller in almost all games. I forget where I read it, but supposedly Microsoft was considering cross-platform play so Xbox players could play with PC gamers. They set up some tests, and the PC gamers so consistently destroyed the Xbox gamers using controllers that it held even when they pitted expert Xbox gamers against PC gamers.

Mind you, I have a PS4 and a Switch; I’m not dumping on consoles. They have their place, to be sure, and they’re fun, but you will always get the best results and the most options from a PC, because it is completely configurable by you and a console isn’t. It costs you more for the pleasure, too. That’s the tradeoff.

Graphics card performance comparison with prices:

While I’ve not played FO4 specifically, I play other games, and anyone telling you that you can’t play at 4K on a 980 Ti is talking rubbish and hasn’t tried it. I was gaming at 4K on a 780 Ti (the previous generation) and later a Titan X (Maxwell), essentially a 980 Ti with extra VRAM. Indeed, I’ve only just upgraded to an RTX 2080 Ti.

What you are unlikely to be able to do is play at 4K on maximum settings. But you know what? You won’t really notice. Turn settings down from Ultra to High and turn off expensive effects like anti-aliasing and, in particular, god rays, and you’re likely good to go.
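If you’d rather not go through the launcher every time, Fallout 4 keeps its render resolution in Fallout4Prefs.ini under [Display], as the “iSize W” and “iSize H” keys. Here’s a minimal Python sketch that drops the render resolution to 1080p; the Documents-folder path is an assumption about a default Windows install, so adjust it for your system (and back the file up first):

```python
# Minimal sketch: lower Fallout 4's render resolution to 1080p
# by editing Fallout4Prefs.ini. The path assumes a default Windows
# install under Documents\My Games\Fallout4 -- adjust as needed.
from configparser import ConfigParser
from pathlib import Path

prefs = Path.home() / "Documents" / "My Games" / "Fallout4" / "Fallout4Prefs.ini"

cfg = ConfigParser(strict=False, interpolation=None)  # tolerate duplicate keys and '%'
cfg.optionxform = str            # preserve the keys' original capitalization
cfg.read(prefs)

# "iSize W" / "iSize H" (the space is part of the key) set the render resolution.
cfg["Display"]["iSize W"] = "1920"
cfg["Display"]["iSize H"] = "1080"

with prefs.open("w") as f:
    cfg.write(f, space_around_delimiters=False)   # keep the game's key=value style
print("Wrote", prefs)
```

The launcher’s Options dialog does the same thing more safely; the point is just that the resolution (and most of the heavy settings) are plain text you can flip in seconds.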

You will find this Guru3D thread of interest.

A more serious problem is connectivity. To run at 4K you need a DisplayPort 1.2 port or an HDMI 2.0 port, and while your TV likely has only the latter, the 980 Ti has only three of the former plus a DVI port and an HDMI 1.4 port. So you will need to buy a DP 1.2 to HDMI 2.0 converter (like this one) or a DVI to HDMI 2.0 converter (like this one) to get the two talking to each other.
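For the curious, the 4K30-vs-4K60 split falls straight out of the arithmetic. A rough check in Python, using the standard 4K HDMI timing (4400 x 2250 total pixels per frame including blanking) and 24-bit RGB; the link limits quoted are the approximate pixel-data rates of HDMI 1.4 and 2.0:

```python
# Back-of-the-envelope: uncompressed video data rate for 4K at a given
# refresh rate, vs. the approximate pixel-data capacity of HDMI 1.4 / 2.0.
def data_rate_gbps(fps, total_w=4400, total_h=2250, bits_per_pixel=24):
    """Standard 4K HDMI timing: 4400 x 2250 total pixels incl. blanking."""
    return total_w * total_h * fps * bits_per_pixel / 1e9

HDMI_1_4 = 8.16   # Gbps of pixel data (340 MHz TMDS clock x 24 bits)
HDMI_2_0 = 14.4   # Gbps of pixel data (600 MHz TMDS clock x 24 bits)

for fps in (30, 60):
    need = data_rate_gbps(fps)
    print(f"4K @ {fps} Hz needs ~{need:.1f} Gbps | "
          f"fits HDMI 1.4: {need <= HDMI_1_4} | fits HDMI 2.0: {need <= HDMI_2_0}")
```

4K30 comes out around 7.1 Gbps, comfortably inside HDMI 1.4; 4K60 needs about 14.3 Gbps, which only HDMI 2.0 (or DisplayPort 1.2, at roughly 17 Gbps) can carry at full 8-bit RGB.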

Thanks for the responses.

Will there be any noticeable difference between 4K on an Xbox One X and 4K on a PC (both displaying the picture on a 65” Sony screen)? It seems like consoles have overtaken PCs for the moment, given that you can buy an Xbox One X for under $500 and have it run great 4K graphics with no lag, instead of spending thousands on PC gaming components capable of doing the same. Are there any advantages that PCs have when it comes to graphics (picture quality) that I’m not understanding? And what about HDR and calibrating a 4K TV? Has anyone done it or taken advantage of it? Does HDR really make a mind-blowing difference, or is it just a marketing gimmick?

Your 4K on the console will be 30 fps, and the resolution may be dynamically downgraded. The PC will be a static 4K, and you can tune the level of detail to get the frame rate you enjoy.

So the Xbox One X will only output 30 fps max? Is that true of all consoles and all games (or just FO4 on the Xbox One X)? Now another question: can a PC output any frame rate (30 fps, 40 fps, 100 fps) if you have the components to back it up and everything?

To those who have 4K monitors/TVs and play PC games on them: is the difference between 1080p and 4K really incredible and worth it?

Was this it?