
Thread: Somebody stop me... (Gaming PC Build Thread)

  1. #481
    Senior Member
    Join Date
    Jan 2014
    Posts
    6,007
    I'm vaguely interested in the 27" and 32" models.

  2. #482
    Female Masturbatory Aid
    Join Date
    Jan 2014
    Location
    JAX
    Posts
    2,363
    I'm still regretting buying the 1080ti, and not the 2080ti. (Not really, but I was seriously thinking about it)

  3. #483
    Senior Member
    Join Date
    Jan 2014
    Posts
    2,104
    So, I just bought something I've never bought before: a sound card. In the past I always relied on integrated audio or on sound cards pulled for free from friends' old PCs.
    I got tired of managing Realtek shit and of keeping my headphones plugged into the integrated audio while the 2.1 speakers stay connected to my monitor with audio pushed through DisplayPort, because: 1) I can't be arsed to physically swap cables every time I want to hop from headphones to speakers (my front case audio is disconnected: the graphics card sits right over the mobo's front-audio header - shitty mobo, I know), and 2) the integrated audio's output is abysmal and, above all, far too quiet.

    So I got a retail Creative Sound Blaster Z (with two annoying internal red LEDs that can't be turned off - I may either cut them or cover them with black electrical tape), and I'm waiting for it to be delivered next week.
    People in the reviews say it's a totally different world compared to integrated audio; let's hope so.

    It will eventually be part of my future new PC, but I'm going to wait at least until next fall before starting on that (although with the RX 580, the SSD, the 650W PSU and the Sound Blaster I'll already be halfway through the build).
    Anyone got this card? A lot of people suggested I get an external USB audio interface instead, but I can't stand having hubs and stuff hanging outside my case anymore...

    (There was also a bulk version, without the two LEDs, the metal shield and the microphone... but at 79 euros vs. 82 euros it was really a no-brainer to get the retail version, even with the annoying LEDs.)


  4. #484
    Junior Potato
    Join Date
    Jan 2014
    Posts
    9,621
    So AMD has unveiled a new 7nm Radeon graphics card to push the high end of the consumer market, and the world let out a collective “Meh.”

    Even the Nvidia CEO gave it a blast, apparently.

    What’s the deal?

    Well, it’s got performance that pushes into - but doesn’t quite reach - 2080 territory, no ray tracing and no AI cores, but at a lower cost. Are they aiming for the productivity market, as with Threadripper, by boosting raw power over features and going for the user who wants to render large graphics or edit 4K video files, rather than the gamers, who form the largest part of the graphics sector?

    They’re not exactly troubling Nvidia, which definitely has power up its sleeve. It’s a shame, as I’d love to see another GPU arms race between these two companies.

  5. #485
    Senior Member
    Join Date
    Jan 2014
    Posts
    2,104
    Actually, Nvidia seems nervous (never mind that ray tracing is so far overrated, underwhelming and available in a grand total of two games on the market (TWO), that the 2080 can't hold 4K at 60 fps with RT on, and have we forgotten about all the new RTX cards with memory issues dying on people?).
    The Radeon VII is the gaming counterpart of their compute beast, the MI60 - 7nm, 16GB of HBM2 memory, 1 TB/s of bandwidth (quick napkin math below)... 2080 who? That's only for rich noobs blathering about RT and DLSS.

    More and more people are already turning to AMD for cheap gaming rigs with Ryzen (but always remember to buy RAM rated at least 3000 MHz, or performance will suffer a lot), and Radeons are the new budget alternative (the 1060 gets trampled by the 580 and 590).
    I think Nvidia is worried and trying to hide it by aggressively attacking AMD's presentation.
    $699 for the Radeon VII... it will surely drop by $100 within a month or two, making it way cheaper than a 2080 and the useless 2080 Ti.
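
    For what it's worth, that 1 TB/s figure is easy to sanity-check. A rough sketch in Python, assuming the commonly quoted Radeon VII memory specs (a 4096-bit HBM2 bus at 2.0 Gbit/s per pin - numbers pulled from public spec sheets, not from anything earlier in this thread):

    Code:
    # Rough HBM2 bandwidth estimate: bus width (bits) x per-pin data rate / 8.
    # The 4096-bit / 2.0 Gbit/s figures are assumed from public Radeon VII spec sheets.
    bus_width_bits = 4096        # 4 stacks x 1024-bit HBM2 interfaces
    data_rate_gbps = 2.0         # effective transfer rate per pin
    bandwidth_gb_s = bus_width_bits * data_rate_gbps / 8
    print(f"{bandwidth_gb_s} GB/s")  # 1024.0 GB/s, i.e. roughly 1 TB/s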

  6. #486
    Director Freude am Fahren
    Join Date
    Jan 2014
    Location
    DFW
    Posts
    5,109
    And rumor has it Nvidia is planning an 11xx series. I guess they'd have similar specs to their 20xx counterparts, without RT or DLSS? Maybe similar framerates to the 20xx cards with RTX off, back at 10xx price points?

    I don't fault them for RTX. It's a big step in tech, and it's going to take time to work in seamlessly and become ubiquitous. They just kinda blew it on the marketing/QC/price side. They probably should have released the 2060 sooner, but I'm guessing they were worried about benchmarkers seeing unplayable frame rates with RTX on. They kinda made themselves into a meme, with high prices, some bad units and some rough RTX implementations, and in the gamer market that only gets amplified.

    It's important to note that even though only BFV uses RTX so far, and it took a huge FPS hit at launch, they made big improvements after only one patch. So as developers work with it, I think we'll see RTX become usable and more popular. I can't wait to see it in driving and flying games/sims like ACC, PC3, maybe X-Plane, etc.

  7. #487
    Senior Member
    Join Date
    Jan 2014
    Posts
    2,104
    Good points, especially on the 2060 (shall we forget it was supposed to be a regular GTX card without ray tracing, and then out of the blue it became an RTX card?).
    Nvidia could be afraid that people will eventually stop giving a damn about RT (I already don't; it sounds like a gimmick to me) and will shell out less money for a cheaper, equivalent AMD counterpart without RT (and with better quality control).

  8. #488
    Junior Potato
    Join Date
    Jan 2014
    Posts
    9,621
    I’m throwing my hat into the ring and saying that RT is definitely the future. This gen is the awkward first step that we have to have, and we will see big improvements over the next 12-18 months.

    The only complaining I’ve seen stems from the notion that the cards need to run games at full settings and achieve 4K/60 output at minimum, which is a fair comment. I think everyone was expecting a GTX1180 that would nail this. But we’re at a time when 4K screens are not totally ubiquitous, yet 8K is on the horizon and graphics cards are having a hard time keeping up with pixel counts while keeping costs at a reasonable level.
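
    Quick back-of-the-envelope numbers behind that, using nothing but the standard 16:9 resolutions:

    Code:
    # Pixels per frame at common 16:9 resolutions.
    for name, w, h in [("1080p", 1920, 1080), ("4K", 3840, 2160), ("8K", 7680, 4320)]:
        print(f"{name}: {w * h:,} pixels")
    # 8K pushes ~4x the pixels of 4K and ~16x the pixels of 1080p.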

    It’s almost like tyre manufacturers that are struggling to keep up with power outputs from the latest supercars.

    I’d say that there is a 50/50 chance that both Sony and Microsoft will announce ray tracing for their next gen consoles, so then the tech will be ubiquitous and developers can unify their lighting systems. Then we will see some new techniques to make ray tracing faster and more efficient.

    It should be noted that Polyphony Digital has been using real-time ray tracing in experimental builds of Gran Turismo, and used it to bake in the lighting effects on GT Sport. There’s already a push towards it outside the PC space.

  9. #489
    Senior Member
    Join Date
    Jan 2014
    Posts
    2,104
    RT is nowhere to be seen in video games, on any platform.
    And the whole video game market is dominated by AMD, no matter that Nvidia has faster cards. Why? Consoles, which are all powered by AMD hardware.
    Nvidia is afraid developers won't give a damn about RT because consoles and console games won't follow... suddenly getting cold feet, Mr. Huang?
    Current consoles can't even run native 4K at a solid 30 fps with graphics close to top-of-the-line PCs, and you think RT will be in next-gen consoles? Ludicrous to me.
    And lest we forget, Nvidia has financial problems: its stock has lost 50% of its value since the 1st of October, and there's a class action on the horizon.
    Ray tracing, pfff... we haven't even gotten far with DirectX 12 and Vulkan yet (all Nvidia's fault for boycotting them, as it did with DirectX 10 in the past).
    Last edited by Blerpa; January 10th, 2019 at 03:44 PM.

  10. #490
    Senior Member
    Join Date
    Jan 2014
    Posts
    2,104
    Oh, almost forgot: AMD not only dominates the gaming world, but ironically the mobile one too. The Qualcomm Adreno, which is in smartphones and tablets everywhere, started life as an ATI (then AMD, after ATI was bought) chip from the Imageon line, now property of Qualcomm.
    Adreno is an anagram of Radeon.
    Far-fetched, but too fun not to share.
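
    If you don't feel like counting letters, a Python one-liner confirms it:

    Code:
    # Same letters, different order.
    print(sorted("adreno") == sorted("radeon"))  # True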
