I’ve been using TVs as PC monitors for over 15 years. It’s easy! You’ll want to use an output on the back of the graphics card.

The advice about needing the latest graphics cards only matters if you want to watch 4K HDR content off Netflix or whatever, as older cards can’t output high resolutions with HDR colour, nor high frame rates such as 120 fps above 1080p. HDCP could potentially be an issue as well if your GPU is reeeeeaaally old, but again that only matters for streamed movies or DVD/Blu-ray discs.

If you’re planning to just browse, watch YouTube, or play some older games, you’re going to be absolutely fine. Just make sure you set your desktop resolution to match the TV’s native resolution (for example 1920x1080, 1366x768, or 1280x720) for a sharp, clear image.
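If you want to double-check what the PC is actually outputting, here’s a rough Python sketch. It’s Windows-only (it calls the Win32 GetSystemMetrics API through ctypes), and the 1920x1080 figure is just an example, so swap in your own TV’s native resolution:

```
# Rough check: does the Windows desktop resolution match the TV's native resolution?
# Windows-only sketch; 1920x1080 is an example value, use your TV's real native resolution.
import ctypes

NATIVE_WIDTH, NATIVE_HEIGHT = 1920, 1080  # example native resolution

user32 = ctypes.windll.user32
user32.SetProcessDPIAware()  # ask for physical pixels, not DPI-scaled ones

desktop_w = user32.GetSystemMetrics(0)  # SM_CXSCREEN: primary display width
desktop_h = user32.GetSystemMetrics(1)  # SM_CYSCREEN: primary display height

if (desktop_w, desktop_h) == (NATIVE_WIDTH, NATIVE_HEIGHT):
    print(f"Desktop is {desktop_w}x{desktop_h}, which matches the TV's native resolution.")
else:
    print(f"Desktop is {desktop_w}x{desktop_h}, but the TV is {NATIVE_WIDTH}x{NATIVE_HEIGHT} native.")
    print("Set the desktop to the native resolution for a pixel-perfect image.")
```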

One other thing you may have to fix is the GPU under-scanning the image. Basically, back in the olden days TVs had a visible screen area that was smaller than the actual size of the display itself; the manufacturer would hide the edges behind a fancy bezel. To allow for this, almost every device would output an image smaller than the full screen size, and this is called under-scan. The opposite, where the image is larger than the screen, is called over-scan. If the image matches the screen size 1:1 it’s called just-scan, or sometimes true-scan.
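To put rough numbers on it: if the GPU applies, say, a 5% under-scan to a 1920x1080 signal (the 5% is just a figure I’ve picked to illustrate), the picture shrinks to about 1824x1026 and you get black bars of roughly 48 pixels on the left and right and 27 on the top and bottom. A quick sketch of the arithmetic:

```
# Rough arithmetic for what an under-scan does to the picture.
# The 5% figure is just an example; actual amounts vary by GPU and TV.
def underscan(width, height, percent):
    """Return (scaled_w, scaled_h, border_x, border_y) for a given under-scan percentage."""
    scale = 1 - percent / 100
    scaled_w = round(width * scale)
    scaled_h = round(height * scale)
    border_x = (width - scaled_w) // 2   # black bar on each side
    border_y = (height - scaled_h) // 2  # black bar top and bottom
    return scaled_w, scaled_h, border_x, border_y

w, h, bx, by = underscan(1920, 1080, 5)
print(f"5% under-scan on 1920x1080 -> image {w}x{h}, "
      f"~{bx}px bars left/right, ~{by}px top/bottom")
```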

If you set everything correctly but still have a 1” black border around the whole frame, the GPU is probably under-scanning to allow for the TV’s assumed viewable area. If this happens, you’ll need to go into your GPU’s settings and look for an under/over-scan adjustment, or something like a “TV output safe zone” option. You might have to Google the exact name for your specific GPU.
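If you happen to be on Linux/X11 instead, some drivers expose an under-scan control you can poke at from the command line rather than a GUI. Here’s a rough Python sketch around xrandr; the output name (HDMI-1), the property name, and whether your driver exposes it at all are assumptions, so check what xrandr --prop actually lists on your machine:

```
# Rough Linux/X11 sketch: list the output properties xrandr exposes, then look
# for anything named like "underscan". Output and property names vary by driver,
# so treat HDMI-1 and "underscan" below as examples, not gospel.
import subprocess

# Show every output and its properties.
print(subprocess.run(["xrandr", "--prop"], capture_output=True, text=True).stdout)

# If the driver does expose an under-scan property (some AMD/radeon setups do),
# something along these lines turns it off so the image fills the panel 1:1:
# subprocess.run(["xrandr", "--output", "HDMI-1", "--set", "underscan", "off"])
```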