You know that Fud won't listen to nonsense like that.
Alan, true, there's no "real" need, but I don't give a fuck.
I wonder if you could use the 1080 Ti as a dedicated PhysX card alongside the "3080".
So I'm playing Control.
A few people (I won't name names but there's this Italian guy) pooh-poohed RTX, saying it was too soon and that it took away from PCs being able to pump out 4K graphics at eye-popping frame rates. And they were right: playing any game with RTX enabled will tank your frame rate, even at 1080p.
But it seems Nvidia knew this, so they packaged a thing called DLSS onto their GPUs. When they were first unveiled I said it would be the low-key game changer.
I'm playing Control on an RTX 2070, which is the second-lowest RTX GPU in the lineup, at full 4K/60fps with all the RTX features turned on and every graphics setting at full tilt, and it looks phenomenal. This is possible because DLSS's AI upscaling reconstructs a 1080p image into something as sharp as true 4K, and you can barely tell the difference.
I know 60fps isn't a super high framerate. I'm guessing a 2080 Ti might manage a solid 60fps with the same settings at native 4K; leverage DLSS on top of that and you'd get maybe another 20fps.
DLSS is a game changer. In the coming years it'll be put to good use as GPUs get even faster, and it will be critical for anyone hoping to run 8K at 60fps in the near future.
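A bit of back-of-the-envelope pixel counting shows why rendering internally at 1080p and upscaling is such a win. This is just arithmetic on resolutions, not a claim about how DLSS works internally:

```python
# Rough pixel-count comparison: rendering internally at 1080p and
# upscaling to 4K means shading only a quarter of the pixels per frame.
native_4k = 3840 * 2160       # 8,294,400 pixels
internal_1080p = 1920 * 1080  # 2,073,600 pixels

ratio = native_4k / internal_1080p
print(ratio)  # 4.0 -> roughly 4x less shading work per frame
```

That 4x reduction in shaded pixels is the headroom that lets a mid-range card keep RTX effects on while still hitting a 4K output.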
It's been about a year since I put together my PC. No better time than now to open her up and see if it needs a clean.
And it really didn't!
From the bottom it looked terrible: I keep the PC on the floor next to my TV, and with two cats in the household the box did its best to suck a bunch of fur onto the vents.
But luckily those dust covers did a phenomenal job. On the inside... front and back, only a small amount of dust got through.
Here's a close shot. Note the fans on the GPU will be blowing upwards from the bottom in this orientation.
With the card out here's the scene. No major dust on the mobo that I could see. There's certain to be dust in the GPU heatsink but there's no way I'm getting it out without pulling it apart so nyeah.
There is dust on the leading faces of all the fan blades and inside the shrouds, so I wiped what I could and left the rest, safe in the knowledge that dust is technically dead skin cells and cat urine particles that have wafted up from the litter trays downstairs. Probably litter dust too. Actually definitely cat litter dust with some poo for good measure. I can live with this. I wonder what's in the fabric of my couches...
I also pulled the filters out and gave them a rinse under the tap, and vacuumed the rest. What the Dyson couldn't move I wiped with a damp cloth.
Freshly serviced, I put it back together and set it to work where it happily plays Portal 2 at 289fps. Nice.
Mine is in need of a cleaning, badly...
So, I finally got my 21:9 LG 34WK650 monitor: pretty satisfied with it.
Discussion: I gather (from searching around and trying to pinpoint AMD official responses) that Freesync has to be on and V-sync as well to get the best of both worlds, possibly with FPS limit capped at your refresh rate or at just 1 fps lower (75 or 74Hz for me, for example).
What are your thoughts on this?
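For what it's worth, the usual reasoning behind capping 1 fps below the refresh rate is that each frame then takes slightly longer than one refresh interval, so the game never spills above the variable-refresh window. A quick sanity check on the numbers (my own arithmetic, not an official AMD recommendation):

```python
# Why cap at refresh - 1? Compare frame time to the refresh interval.
refresh_hz = 75
cap_fps = refresh_hz - 1  # e.g. a 74 fps cap on a 75 Hz panel

refresh_interval_ms = 1000 / refresh_hz  # ~13.33 ms per refresh
frame_time_ms = 1000 / cap_fps           # ~13.51 ms per frame

# Each frame takes a hair longer than one refresh, so the fps stays
# just inside the FreeSync window and vsync never has to engage.
print(round(refresh_interval_ms, 2), round(frame_time_ms, 2))
```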
Also, got new earphones: Logitech G533. They are plasticky and the sound is just ok-ish (although I'm positively surprised by the virtual 7.1 surround audio), but they're wireless, and possibly light and loose enough to not make me suffer from clamping force sickness.
I think if you use Freesync or G-sync, you want V-sync off. The former 2 sync the monitor to the game fps, while the latter syncs the game to the monitor, no? So I think they'd fight each other?
I do put a cap on though.
G-Sync has to be used without V-sync. Freesync is a different thing. Here is official AMD support answering about it.
-----------------------------------------------------------------------------------------------------------------------
"Do not use any sort of frame capping with FreeSync. The frame time analysis algorithms that govern FreeSync, FRTC or other methods will conflict and break both solutions. It's unnecessary. Here's why:
1. The only time you'd want to turn off vsync with FreeSync is if the app's FPS can go way above your monitor's max refresh and you want the lowest possible input latency at the expense of a little tearing at high framerates. FRTC is the antithesis of this, so it doesn't make sense to use FRTC in this case.
2. If you're not trying to get the lowest possible input latency, or the app's FPS stays inside your monitor's DRR window on the regular, then leaving vsync enabled will cap your framerate anyways. FRTC is redundant.
3. FRTC is for people with regular ol' monitors who are playing low-demand games running in the hundreds of FPS, which just burns power and runs the fan faster than necessary.
FREESYNC WITH VSYNC
If vsync is enabled, it is only active when the FPS is above or below your monitor's refresh rate range.
If FPS is below, the monitor has no choice but to use vsync in the double or triple-buffer mode you've set. This will avoid tearing, but add input latency.
If the FPS is above, the GPU will reject frames ("FPS cap") to keep the application inside the FreeSync window. It will enforce smoothness. You won't get the lowest possible input latency due to rejected frames, but no extra latency is being added.
When your game is in the FreeSync window, this is the lowest possible input latency.
FREESYNC WITHOUT VSYNC
If you really care about input latency, then you can turn vsync off.
If the app is inside the FreeSync window, FreeSync is active. This is the lowest possible input latency.
If the app is below the FreeSync range, monitor will run at max refresh until the app's FPS gets back inside the DRR window. You will experience tearing, but no frames will be buffered or held as with vsync.
If the app is above the FreeSync range, monitor will run at max refresh and your FPS can go however high it will. This sustains the lowest possible input latency because no frames are being buffered, held or rejected as with vsync. You will experience some tearing until the FPS falls back inside the FreeSync window and FreeSync resumes.
IF YOUR MONITOR IS COMPATIBLE WITH FREESYNC LFC
If your monitor has a sufficiently wide range to support our Low Framerate Compensation feature, this supersedes your vsync setting. It has lower input latency, no tearing, and no vsync stutter. It's much better than vsync.
tl;dr: Most people want to leave FreeSync + Vsync enabled.
PROS: GPU won't waste power/heat/noise on unused frames, game forced inside the FreeSync range as often as possible, no stuttering, no tearing.
CONS: Lowest possible input latency will not be achieved if app goes outside of FreeSync range. Vsync stutter possible when app is below FreeSync window.
If you're a stickler for mouse latency, use FreeSync + vsync OFF.
PROS: No stuttering/tearing inside FreeSync window, lowest possible latency at all times.
CONS: Tearing possible when app leaves FreeSync window.
IF app FPS < min_refresh THEN Low Framerate Compensation (LFC) supersedes vsync.
----------------------------------------------------------------------------------
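To make the quoted rules easier to compare at a glance, here's the whole decision tree condensed into one illustrative function. The function name, parameters, and return strings are my own shorthand for the behaviors AMD describes, not anything from an actual driver API:

```python
def sync_behavior(fps, freesync_min, freesync_max, vsync_on, has_lfc):
    """Illustrative summary of the quoted FreeSync + vsync rules."""
    if fps < freesync_min:
        if has_lfc:
            # LFC supersedes the vsync setting when below the range.
            return "LFC: no tearing, no vsync stutter"
        if vsync_on:
            # Monitor falls back to vsync: no tearing, added latency.
            return "vsync: buffered frames, added latency"
        # Monitor runs at max refresh; tearing but no buffering.
        return "tearing: monitor runs at max refresh"
    if fps > freesync_max:
        if vsync_on:
            # GPU rejects frames to keep the app inside the window.
            return "frames rejected: capped at max refresh"
        # Uncapped: some tearing, but lowest possible latency.
        return "tearing possible: uncapped, lowest latency"
    # Inside the FreeSync window: the ideal case either way.
    return "FreeSync active: no tearing, lowest latency"
```

For example, a 100fps game on a 48-144Hz panel lands in the FreeSync window regardless of the vsync setting, which is why the tl;dr says most people should just leave FreeSync + vsync on.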