I've been using TVs as PC monitors for over 15 years. It's easy! You'll want to use the output on the back of the graphics card.

The advice about needing the latest graphics cards only matters if you want to watch 4K HDR content from Netflix or whatever, as older devices can't do high resolutions with wide colour output, nor high frame rates such as 120 fps above 1080p. HDCP could potentially be an issue as well if your GPU is really old, but again that's only for watching streamed movies or DVD/Blu-ray discs.

If you're planning to just browse, watch YouTube, or play some older games, you're going to be absolutely fine. Just make sure you set your desktop resolution to match the TV's native resolution (for example 1920x1080, 1366x768, or 1280x720) for a sharp and clear image.
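To see why matching the native resolution matters, here's a quick sketch (the resolutions below are just example values): any mismatch forces the TV's scaler to stretch the image by a non-integer factor, so source pixels land between physical pixels and everything goes soft.

```python
# Illustration only: the stretch factor a TV's scaler must apply when the
# desktop resolution doesn't match the panel's native resolution.
def scale_factors(desktop, native):
    """Per-axis stretch factor from desktop resolution to panel resolution."""
    return (native[0] / desktop[0], native[1] / desktop[1])

# 1280x720 desktop on a 1366x768 panel: ~1.067x stretch on each axis -> blurry
print(scale_factors((1280, 720), (1366, 768)))

# 1920x1080 desktop on a 1920x1080 panel: exactly 1.0 -> pixel-perfect
print(scale_factors((1920, 1080), (1920, 1080)))
```

Anything other than a clean 1.0 (or an exact integer, on displays that support integer scaling) means the scaler has to interpolate, which is where the blur comes from.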

One other thing you may have to fix up is the GPU under-scanning the image. Basically, back in the olden days TVs had a visible screen area that was smaller than the actual size of the display itself; the manufacturer would put a fancy bezel around the edges. To allow for this, almost every device would make the image output smaller than the maximum size, which is called under-scan. The opposite, where the image is larger than the screen, is called over-scan. If the image matches the screen size 1:1, it's called just-scan, or sometimes true-scan.
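As a rough illustration of how much screen an under-scan eats (the 5% figure below is a typical ballpark I've seen, not any kind of spec):

```python
# Sketch: size of the black border left by under-scanning a given percentage.
def underscan_border(width, height, percent):
    """Return (horizontal, vertical) border thickness in pixels, per side."""
    scale = 1 - percent / 100
    image_w = round(width * scale)
    image_h = round(height * scale)
    return ((width - image_w) // 2, (height - image_h) // 2)

# 5% under-scan on a 1920x1080 panel leaves a 48 px border left/right
# and a 27 px border top/bottom
print(underscan_border(1920, 1080, 5))  # -> (48, 27)
```

On a modern flat panel there's no bezel hiding anything, so that whole border is just wasted screen.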

If you set everything correctly but still have a 1-inch black border around the whole frame, the GPU is probably under-scanning to allow for the TV's actual viewable area. If this happens, you'll need to go into your GPU's settings and look for under/over-scan, or something like a TV output safe zone. You might have to Google it for your specific GPU.