What Powers a TV’s Video When Streaming Directly From a Desktop PC?

Hi. I’m trying to figure out what TV to buy (probably a 55" 4K QLED) and might have a million questions at some point. But for right now I’d like to know about what happens when I send a signal directly from my desktop PC to the TV via HDMI. In that case, what is driving the picture/video? Is it solely the GPU in my PC or is it the integrated graphics on the TV’s chip?

What I’ve read suggests it would be the PC’s GPU, but that doesn’t make much sense to me. If the TV is capable of driving the picture on its own (like it would for an OTA signal, or when streaming directly from the TV’s crappy streaming interface), then why doesn’t it drive the signal from the PC?

I’d like to keep my PC’s GPU unencumbered so it can drive my PC monitors, handle video-editing renders, etc.

Could I force it to use the TV’s processing power instead of my PC GPU? This would be a 55" 4K TV so I’d imagine it would place some demand on my PC’s GPU.

Wassup??

It’s the GPU in your computer. The GPU renders the frames and sends them over the HDMI connection to the TV.

The TV is not actually rendering frames; it’s just decompressing the stream sent from the service and displaying the frames. There’s a little bit of processing happening, but not much.

No. The TV is not the equivalent of an outboard graphics card and doesn’t have the high-speed data connection that a GPU takes advantage of. The GPU load for video playback, even in 4K, is just not all that much, though. And screen size doesn’t matter: a 4K screen is a 4K screen.
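
To put rough numbers on “a 4K screen is a 4K screen” (back-of-the-envelope arithmetic in Python, nothing measured):

    # A 4K frame is the same number of pixels whether the panel is a 27" monitor or a 55" TV.
    width, height = 3840, 2160                   # UHD "4K"
    fps = 60
    pixels_per_frame = width * height            # 8,294,400 pixels
    pixels_per_second = pixels_per_frame * fps   # ~498 million pixel values per second
    print(pixels_per_frame, pixels_per_second)

The workload scales with resolution and frame rate, not with the physical size of the panel it ends up on.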

Your computer is sending uncompressed video data to the TV. Your GPU is handling all the work. If you want your TV to do the work, you’ll need to set up something like a Plex media server hosted on your computer and then send the compressed data over Ethernet to your TV running the Plex app. Your TV will then handle the decompressing and decoding.
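
If you want a feel for that division of labor without installing anything, here’s a bare-bones sketch using only Python’s standard library. Big caveats: the port (8000) is arbitrary, many smart-TV apps can’t open a plain HTTP URL at all, and this simple server doesn’t support seeking, which is exactly why people reach for Plex or a DLNA server instead.

    # Minimal sketch: serve the current folder of video files over the local network.
    # The PC only ships compressed bytes; whatever plays the URL does the decoding.
    # Point a player on the TV (or a streaming box) at http://<this-PC's-IP>:8000/
    from http.server import HTTPServer, SimpleHTTPRequestHandler

    HTTPServer(("0.0.0.0", 8000), SimpleHTTPRequestHandler).serve_forever()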

Your PC is sending digital video data and audio to the TV. The TV decodes and renders that data to the display.
The PC may or may not need to do quite a bit of processing to generate the video data in the first place.

So is the computer not rendering frames either and thus using only tiny amounts of energy?

If I were streaming directly via the TV’s interface with no computer involved, obviously there would be no PC GPU using any power at all. Does that mean my GPU will only expend a tiny amount of power/energy/memory processing a streaming signal sent to the TV?

Seems when I’m streaming via computer on my computer monitors, the GPU uses a decent amount of energy powering a 27" monitor. Won’t a 55" screen be that much harder on the GPU?

I pretty much get what you guys are saying, but I don’t quite see how streaming Netflix via my computer or streaming it via the TV use different amounts of power (i.e., one uses a big NVIDIA GPU and one doesn’t, for the same result on a TV screen).

BTW, the reason I would want to power streaming from my computer instead of the TV is to avoid the horrible, horrible, on-screen streaming controls.

Just having the computer ON is going to consume a lot of energy. Modern streaming codecs are simple enough to decompress in dedicated hardware, which is why you can have a streaming box the size of a stick of gum that consumes a few watts at most. Using a computer to stream the video to the TV adds 100W or more, possibly much more for a gaming machine.
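
Rough arithmetic on what that means for a two-hour movie (the wattages are the ballpark figures above, not measurements):

    # Ballpark energy for a 2-hour movie: PC as the source vs. a streaming stick / the TV itself.
    hours = 2
    pc_watts = 100       # desktop with a discrete GPU, mostly idling while it plays video
    stick_watts = 5      # streaming stick, or the TV's own smart-TV board
    print(pc_watts * hours / 1000)      # 0.2 kWh
    print(stick_watts * hours / 1000)   # 0.01 kWh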

A 4K display has the same number of pixels whether it’s a small tablet screen or a 100" TV.

In any case, decompressing a video stream and sending video frames out the HDMI port is a minimal load. As was pointed out upthread, one of those streaming sticks can easily do it in the form factor of a USB thumb drive.

What does “powers” mean to you?

In your 27" monitor case, the power supply and hardware in the monitor is powering the pixels; supplying enough volts and amps to light them up. The computer’s GPU is deciding what color & brightness each pixel ought to be moment by moment and is sending that data stream to the monitor to distribute those control instructions to each pixel.

In your 55" TV case, the power supply and hardware in the TV is powering the pixels; supplying enough volts and amps to light them up. The computer’s GPU is deciding what color & brightness each pixel ought to be and is sending that data stream to the TV to distribute those control instructions to each pixel.

You didn’t ask, but if you owned a 75 x 150 foot Jumbotron in a stadium …

In your Jumbotron’s case, the power supply and hardware in the Jumbotron is powering the pixels; supplying enough volts and amps to light them up. The computer’s GPU is deciding what color & brightness each pixel ought to be and is sending that data stream to the Jumbotron to distribute those control instructions to each pixel.

Bottom line:
The computer does not care in the slightest what size or electrical consumption your display is. All it knows is 4K, or 1080p, or whatever. All the “powering”, for any sensible definition of the term, is done by the power supply of the destination display device.

For streaming stuff like Netflix, it’ll be the desktop CPU (or the GPU’s dedicated video-decode block) doing most of the work to decode the stream from the Netflix server. That won’t use much power, but it’ll still use a lot more power than your TV, because the PC is doing other things in the background and the TV probably uses a more efficient processor.

I think what you really want is an AppleTV. That’s the easiest solution to bad smart TV features.

I knew while I was writing it that it wasn’t the right term; I just couldn’t think of the right word. Still can’t. I knew “power” was a bad choice because I don’t mean electric power like watts. I meant more like “drive” or “produce” or “put the pretty pictures on the screen”.

I’m thinking about the “load” (not elec. power!) on my GPU to stream a movie on my computer monitor, let’s say. I guess I don’t get why my GPU has any “load” at all from the TV screen considering the TV can put the video on the screen just as well as my GPU without any help.

ETA: I guess by “power” I meant not watts but more like “gigaflops” (or what have you). I meant computing power regardless of how much electricity it uses or heat it creates.

With that clarification …

If you don’t want your computer expending GPU computing effort decoding the stream, then stop using the computer as a streaming source. Have the TV consume the stream directly from wherever. That’s what it’s designed for.

Streaming a video on your computer doesn’t actually use that much of the GPU’s processing power. You can stream a video on a GeForce 5080 or a GeForce 1050 with essentially the same results. Advanced graphics hardware is mainly needed for gaming, not for just displaying video.

As mentioned above, HDMI transmits decoded video frames, just a sequence of plain bitmap images. Very little computing power is needed to display them, since no decoding is involved. Any decoding of a compressed stream into bitmap frames happens in the HDMI source (the computer, set top box, or whatever). A fancy TV may do some processing of the images to enhance the quality, but this is not necessary and simple TVs get by without any image processing.
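
To put numbers on how lopsided the two jobs are (round figures only; real links vary with bit depth, chroma subsampling, and blanking):

    # Raw 4K60 video as it travels over HDMI, versus the compressed stream it came from.
    width, height, fps = 3840, 2160, 60
    bytes_per_pixel = 3                                            # 8-bit RGB
    frame_mb = width * height * bytes_per_pixel / 1e6              # ~24.9 MB per frame
    raw_gbps = width * height * bytes_per_pixel * fps * 8 / 1e9    # ~11.9 Gbit/s of raw pixels
    stream_mbps = 16                        # ballpark bitrate of a compressed 4K stream
    print(frame_mb, raw_gbps, raw_gbps * 1000 / stream_mbps)       # ratio on the order of 750:1

The source device does all the decompression; the display just has to paint the pixels it’s handed.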

The TV has its own little computer inside of it, and that computer has its own GPU that is used to display things on the TV’s screen.

An analogy that might help? Imagine your PC as a big pickup truck with a thirsty V8 engine that gets 16 MPG. Imagine the TV’s built-in computer as a little hybrid that gets 50 MPG. With only a single occupant, both vehicles driving at the same speed will get that occupant to the destination in the same amount of time, but the pickup will use a lot more gas to do it.

Now imagine that you need to move a load of bricks. All of a sudden the pickup truck with its lower fuel efficiency makes sense. That is a job the little hybrid cannot do.

Streaming is driving with one person in the car. The little computer in the TV can handle it efficiently. Your PC can do it, too, but less efficiently.

Playing a graphics heavy game is carrying the load of bricks. This is what the Nvidia GPU in your PC was designed to do, and it can do it very well, even if it uses lots of power doing it. The little computer in the TV cannot play that game.

So it comes down to design goals and tradeoffs. At idle, your NVIDIA GPU uses maybe 30 watts. That’s probably more than the peak power consumed by the computer in your TV.

If you just connect the computer to the TV, it doesn’t take any more GPU power to run the video than when you send it to your monitor. It’s the same process and the same signal being sent.

If you’re sending over Wi-Fi or something, it might take a tiny bit more power to encode it for that, but the difference will be minuscule. Probably less than a percent on anything modern.

I mentioned earlier that I plan to stream from my computer to circumvent the TV’s ultra-shitty (yes, even on top-tier TVs) streaming interface. I can’t deal with trying to type in my password, etc., with a remote control on a TV screen. And I don’t want the ads and the data snooping.

Thanks! OK. Now I’m catching on to some things. I was thinking about all the resources my GPU uses when rendering the video stuff I edit. And I know that the GPU also drives the monitors. So, I probably conflate some of that stuff.

If it really would be light on its resource consumption, that would work OK for me.

A TV does 2 things: it either receives pixel-by-pixel information over HDMI, or it receives a compressed data stream from over-the-air broadcasts. HDMI is already decoded to individual pixels, 4K (3840x2160) or lower (usually 1080p). The TV has dedicated GPU-type electronics to decode compressed OTA streams, or to upscale if the stream isn’t the panel’s native size (i.e. 720p, etc.).

I don’t know much about over-the-air because my “live” TV and DVR come from a cable box that has an HDMI out, so the box has already decoded the streams from the cable company using the same sort of GPU-decoder. It also hosts apps like Netflix, Apple TV, and Prime, so it decodes those streams too.

Obviously much older PCs can decode these streams too, so decoding the stream isn’t a big draw on the PC’s compute power. I use a PC from about 2010, hooked to my TV with an HDMI connection, to show MP4 movies. Even my older laptop can decode MP4s without lag. I assume a stream is not much different in concept from a recording of a streamed service (minus all the security built into a commercial stream to prevent copying).
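
If you’re curious what kind of decoding a given file actually demands, ffprobe (part of the ffmpeg package, if you have it installed) will report the codec, resolution, and frame rate. A small sketch; "movie.mp4" is just a placeholder filename:

    # Sketch: ask ffprobe what's inside a video file.
    # Assumes ffprobe is installed and on PATH; "movie.mp4" is a placeholder.
    import subprocess

    result = subprocess.run(
        ["ffprobe", "-v", "error", "-select_streams", "v:0",
         "-show_entries", "stream=codec_name,width,height,avg_frame_rate",
         "-of", "default=noprint_wrappers=1", "movie.mp4"],
        capture_output=True, text=True,
    )
    print(result.stdout)   # e.g. codec_name=h264, width=1920, height=1080, ...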

As for power as in amount of electricity, technology has improved. My LCD TV from 2007 is physically warm to the touch; my more recent one in the basement runs cool. (I believe that’s the difference between fluorescent backlighting and LED.) All other tech being equal, the power draw is also related to the brightness and size of the lights that provide the TV’s glow. Nowadays a monitor is just a smaller TV without the over-the-air electronics, so no need for the hardware to decode streams. My TV is, for me, just a glorified monitor, since it never needs to decode streams itself.

What impact will streaming via PC and HDMI have on performance for other tasks? Only one way to find out. Try playing a movie, say 4K from YouTube or such, and see how it affects video editing and other tasks. I doubt that playing one video will tax a modern GPU card; I expect they multitask just fine.
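
If the card is NVIDIA, nvidia-smi can put numbers on that experiment while you run it. This is only a sketch; it assumes nvidia-smi is on your PATH and that your driver exposes the decoder-utilization query (recent ones do):

    # Sketch: poll GPU 3D-engine and video-decoder utilization once a second for 30 seconds.
    # A high decoder number with a low GPU number means playback is riding the dedicated
    # decode block, not the general-purpose shaders used for rendering.
    import subprocess, time

    for _ in range(30):
        result = subprocess.run(
            ["nvidia-smi",
             "--query-gpu=utilization.gpu,utilization.decoder",
             "--format=csv,noheader"],
            capture_output=True, text=True,
        )
        print(result.stdout.strip())   # e.g. "7 %, 12 %"
        time.sleep(1)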