Author Topic: Ampere vs RDNA 2: FIGHT! Thread of new GPUs (let's be honest, nvidia gonna win)  (Read 36447 times)


Human Snorenado

  • Stay out of Malibu, Lebowski
  • Icon
I think the list is pretty on par with the games that had DLSS when that launched

:yeshrug

Far Cry 6 and RE Village are pretty big games that will be getting it at some point soon, and apparently it's pretty easy to implement. Will be interested to see if Cyberjank 2077 gets it.
yar

naff

  • someday you feed on a tree frog
  • Senior Member
sure, but as of right now it has very little support. but yeah, i mean, it seems like a great launch relative to DLSS, which was shovelware until 2.0. i am definitely interested in comparisons, and when i do eventually upgrade i'll prob be getting the 6800xt... whatever i can find that seems an ok deal  :lol. hopefully btc keeps getting crushed and the market is flooded with miner cards

edit: just checked and it seems there's already a bunch of miner cards being put up  :doge my 1070 i bought years ago was an ex miner i copped for a little under 200usd. still going strong
« Last Edit: June 24, 2021, 10:58:29 PM by naff »
◕‿◕

naff

  • someday you feed on a tree frog
  • Senior Member
2nd hand 2080ti for $850 cop or not? (has warranty)
◕‿◕

Human Snorenado

  • Stay out of Malibu, Lebowski
  • Icon
Ehhhhh if you knew it hadn't been used to mine I'd say get it but good luck knowing that
yar

remy

  • my hog is small but it is mighty
  • Senior Member
I heard being a miner card isn't even that bad because most of the time people are cooling them properly/not blasting them at high temps because they want them to be stable.

naff

  • someday you feed on a tree frog
  • Senior Member
I heard being a miner card isn't even that bad because most of the time people are cooling them properly/not blasting them at high temps because they want them to be stable.

my 4 year old 1070 is an old mining card i bought off a local trading site. showed up with cash and dude handed it to me in an old shopping bag. still works great 8) that's the silicon lottery for you

built a new machine over the weekend but haven't had much time to play with it this week. looking forward to the weekend. specs: a 5800x (only $70 more than the 5600x here, whereas the 5900x was $450 more - the 5800x may be unpopular because it's the weird one in the middle and not much better for gaming, but it's better for compute, i'd already bought an aio cooler, 8/16 threads is more future proof than 6/12, slightly better single core clocks etc) and a gigabyte 3080 eagle rev 2, all in a p500a case. the only rgb is on some deeply discounted ram i couldn't say no to (corsair vengeance 3600 cl18) and the gpu. was going to disable the rgb, then i actually found a fine basic use for it - turning the brightness way down and pairing the colour of the dimms to the gpu temperature sensors, so they're light purple normally with a "rain" pattern, shift to brighter orange and turquoise when the gpu is over 70c, and go red at 85+.
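(for anyone who wants to rig up something similar themselves, here's a minimal sketch of the idea - it assumes the OpenRGB SDK server is running with the openrgb-python bindings installed and uses nvidia-smi for the GPU temperature; the colours and thresholds are placeholders, not the exact setup described above)

Code:
import subprocess
import time

from openrgb import OpenRGBClient
from openrgb.utils import DeviceType, RGBColor


def gpu_temp_c() -> int:
    # nvidia-smi prints one integer (degrees C) per GPU with these flags
    out = subprocess.check_output(
        ["nvidia-smi", "--query-gpu=temperature.gpu",
         "--format=csv,noheader,nounits"],
        text=True,
    )
    return int(out.splitlines()[0])


def colour_for(temp: int) -> RGBColor:
    # placeholder thresholds and colours, tweak to taste
    if temp >= 85:
        return RGBColor(255, 0, 0)    # red: too hot
    if temp >= 70:
        return RGBColor(255, 110, 0)  # orange: working hard
    return RGBColor(80, 40, 110)      # dim purple: normal


client = OpenRGBClient()  # connects to the local OpenRGB SDK server
dimms = client.get_devices_by_type(DeviceType.DRAM)

while True:
    colour = colour_for(gpu_temp_c())
    for dimm in dimms:
        dimm.set_color(colour)
    time.sleep(5)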

ran a few stress tests and couldn't get anything to go very hot - the gpu peaked at ~80c and the cpu at ~76c. the big thing i think messes people up with the 5800x is that the default power limit is waaaay too high. i dropped it closer to what amd list in their specs but don't set by default for whatever reason (142 down to 112 watts). kinda fucked up they do this, but whatever - it meant i got a relatively cheap CPU partly because people report this cpu running so damn hot and call it a bad buy, the "odd one in the middle" with "no strong use case" between the 5600x and 5900x :p got peak temps down from ~85c to ~75c at full load with the fans nearly silent. it actually improved single core performance too, which is also weird af. might try bumping the wattage back up a little since i have a beefy ekwb 360 aio cooler, but yeah, pretty whack default settings from amd. look into undervolting if you have this cpu
« Last Edit: August 05, 2021, 08:50:15 PM by naff »
◕‿◕

naff

  • someday you feed on a tree frog
  • Senior Member
I actually don't disagree the 5800x is a hard cpu to justify, but it's not "bad", it just has bad out-of-the-box settings. The 5600x is more efficient and 100% all you need for gaming/general productivity, and the 5900x is way better for big rendering/compilation workloads. I like the middle-tier jack-of-all-trades bracket tho
◕‿◕

Nintex

  • Finish the Fight
  • Senior Member
I got the 5800x because it was the only one available at the time, plus it's a bit better for productivity. With that said, you can't really go wrong with any ~$300 CPU. An Intel i7 10th or 11th gen runs 99.9% of games just as well.

I must say that it took until 2 weeks ago for the B550 platform to become stable though.

My temps are fine though. ~73 at peak or so
🤴

Rufus

  • 🙈🙉🙊
  • Senior Member
ran a few stress tests and couldn't get anything to go very hot - the gpu peaked at ~80c and the cpu at ~76c. the big thing i think messes people up with the 5800x is that the default power limit is waaaay too high. i dropped it closer to what amd list in their specs but don't set by default for whatever reason (142 down to 112 watts). kinda fucked up they do this, but whatever - it meant i got a relatively cheap CPU partly because people report this cpu running so damn hot and call it a bad buy, the "odd one in the middle" with "no strong use case" between the 5600x and 5900x :p got peak temps down from ~85c to ~75c at full load with the fans nearly silent. it actually improved single core performance too, which is also weird af. might try bumping the wattage back up a little since i have a beefy ekwb 360 aio cooler, but yeah, pretty whack default settings from amd. look into undervolting if you have this cpu
The various power limits (PPT, TDC, and EDC) are set by the motherboard maker. The defaults might not be stock if they're being particularly cheeky. Gotta get those extra pixels on the benchmark graph. ::)

Your cooling is top notch. These chips just run hot by design. Even the temps you were getting before are within expectations, though improved single core perf (boosts higher for longer, due to lower overall temp) is nice. If you want to run something at 240Hz, it's worth keeping in mind.
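(For reference, and this is just my understanding of AMD's stock limits so treat the multiplier as approximate: the socket power limit (PPT) on AM4 is derived from the TDP as roughly PPT ≈ 1.35 × TDP, so the 5800X's 105 W TDP works out to 1.35 × 105 ≈ 142 W, the stock figure mentioned above, and capping it at 112 W is pretty close to just holding the chip to its nominal TDP.)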


Coax

  • Member
I like the comment of 'People could totally tell, it was just never confirmed'... and then the follow-up comments clarify only one scene in the entire presentation was fully CG and it's the obvious one where he disintegrates and the set folds away :doge

Human Snorenado

  • Stay out of Malibu, Lebowski
  • Icon
Uggggh I got selected for a 3070ti in the Newegg shuffle. It's not really THAT much of an upgrade over the 6700 I have, tho. Think I'll pass.
yar

Beezy

  • Senior Member
Uggggh I got selected for a 3070ti in the Newegg shuffle. It's not really THAT much of an upgrade over the 6700 I have, tho. Think I'll pass.
Ask a friend if they've been trying to get one and sell it to them at cost then. These things are still hard to get. How much is it?

Human Snorenado

  • Stay out of Malibu, Lebowski
  • Icon
It's in a bundle with a motherboard for like $1160
yar

Nintex

  • Finish the Fight
  • Senior Member
Hoard everything that has chips bois

https://asia.nikkei.com/Business/Tech/Semiconductors/TSMC-hikes-chip-prices-up-to-20-amid-supply-shortage

Quote
TSMC hikes chip prices up to 20% amid supply shortage

The golden age of cheap technology is over :fbm
🤴

Human Snorenado

  • Stay out of Malibu, Lebowski
  • Icon
Ugh, goddamnit. Best Buy has a bunch of founders edition RTX cards in store tomorrow, but the only stores in Georgia that are getting any are in Augusta and Savannah, which are like all the way on the coast, 3-4 hours away from me. Fuck that.

I wish 6800xts actually existed but they basically never show up even in the Newegg Shuffle. TBH I'm thinking about going down from a 4k monitor to 1440p for my main gaming screen to stretch the life of this 6700xt I have.
yar

Beezy

  • Senior Member
5600 and 5800 are both on sale. Most reviews I've seen recommend the 5600, but the 5800 is only $50 more. They also mention that the 5600 comes with a cooler, but I haven't seen anyone mention if it's actually good or not, so I think I'll go with the 5800.

I thought about going with the new Intel CPUs, but apparently the mobos for it are expensive right now?

Nintex

  • Finish the Fight
  • Senior Member
5600 and 5800 are both on sale. Most reviews I've seen recommend the 5600, but the 5800 is only $50 more. They also mention that the 5600 comes with a cooler, but I haven't seen anyone mention if it's actually good or not, so I think I'll go with the 5800.

I thought about going with the new Intel CPUs, but apparently the mobos for it are expensive right now?
I got the 5800X mainly for the additional cores. As a consumer you don't buy a CPU for 12 months (like the tech Tubers) and the 2 extra cores will matter in the long run.

The new Intel CPUs are a bit better, but in the CPU space you're talking about 120fps vs. 130fps or 65fps vs. 70fps depending on the game, and that's only if you don't have a GPU bottleneck.
By the time your 5800X can't keep up, neither can a 12th gen Intel i7. The Xbox Series X / PS5 basically run on the 3700X so you're good for this generation (and most likely the next).
🤴

Beezy

  • Senior Member
Anything that I need to look out for when it comes to mobos and ram? You already mentioned on the last page that I should stay away from Gigabyte mobos.

pilonv1

  • I love you just the way I am
  • Senior Member
I picked up a 5600X recently because it was on sale and it's been fantastic, but if you can afford the 5800 then go for it for the extra cores. I was upgrading from a 3570k so it's already a huge improvement. Really depends on the price gap between the two though and what you're willing to spend.
itm

Nintex

  • Finish the Fight
  • Senior Member
Anything that I need to look out for when it comes to mobos and ram? You already mentioned on the last page that I should stay away from Gigabyte mobos.
Yeah, I had a difficult time with mine but they've fixed most of the issues with the B550 platform after a couple of BIOS updates.

Any of the bigger brands (MSI/ASUS) will do. I know a lot of folks like ASUS for their build quality and features, but those are usually a bit more expensive.
Gigabyte is hit and miss: cheaper and overall good hardware (especially for the price), but they haven't figured out the QA or UX for some of their products.
For GPUs this is less of an issue (you use the Nvidia drivers anyway), but for motherboards, with their specific chipsets and mix of utilities, it matters more...
🤴

who is ted danson?

  • ⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀✋💎✋🤬
  • Senior Member
I went a bit crazy at Black Friday, so I now have:
4690k -> 5900X
8GB RAM -> 32GB RAM
256GB SSD + HDD -> 1TB PCI-E 3 SSD
1060 -> still the same 1060   :-\

It is nice seeing Cyberpunk using like 25% CPU / RAM   :lol
Really wish the GPU shortage would end. Maybe next year...
⠀⠀⠀⠀⠀

Nintex

  • Finish the Fight
  • Senior Member
Impressive rig bro :pitbull

Don't forget to upgrade the screen either, this 32" Odyssey G7 is heavenly once you turn on Adaptive Sync.
(if you don't it's a flickering mess when you turn on G-Sync, can't believe they launched them like that  :lol )

Not sure when I'll replace this 2070 Super, probably late 2022 at the latest if the 4000 series arrives, but I might swap out the One X for the Series X instead and wait another year.
There's nothing that an RTX 2070 Super can't run. Hell, even that 1060 still packs a punch in most modern games at 1080p. 
🤴

who is ted danson?

  • ⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀✋💎✋🤬
  • Senior Member
I actually have upgraded my monitors recently too  :lol.

1080p/240Hz Alienware AW2521HF
1440p/165Hz Dell S2721DGF (also bought on black friday)

Wanted to have a high refresh 1080p screen for FPS gayming, and the 1440p will get to stretch its legs with everything else (once I can get a new GPU).
The Alienware is my primary for now because as you mention the 1060 only really has the grunt for 1080p.

⠀⠀⠀⠀⠀


who is ted danson?

  • ⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀✋💎✋🤬
  • Senior Member
I finally got a 3060ti FE at RRP at the end of last week. Glad to finally have a decent GPU for the next few years (and no longer have to pay attention to Part Alert).

Tried out at bit of Control with all the bells and whistles on and it looks pretty nice. Gonna have to try out Cyberpunk again at some point
⠀⠀⠀⠀⠀

who is ted danson?

  • ⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀✋💎✋🤬
  • Senior Member
https://www.techpowerup.com/296396/nvidia-to-cut-down-tsmc-5nm-orders-with-the-crypto-gravy-train-derailed-amd-could-benefit

Quote
NVIDIA is reportedly looking to reduce orders for 5 nm wafers from TSMC as it anticipates a significant drop in demand from both gamers and crypto-currency miners. Miners are flooding the market with used GeForce RTX 30-series graphics cards, which gamers are all too happy to buy, affecting NVIDIA's sales to both segments of the market. Before the crypto-currency crash of Q1-2022, NVIDIA had projected good sales of its next-generation GeForce GPUs, and prospectively placed orders for a large allocation of 5 nm wafers from TSMC. The company had switched back over to TSMC from Samsung, which makes the 8 nm GPUs of the RTX 30-series.

With NVIDIA changing its mind on 5 nm orders, it is at the mercy of TSMC, which has made those allocations (and now faces a loss). It's incumbent on NVIDIA to find a replacement customer for the 5 nm volumes it wants to back out from. Chiakokhua (aka Retired Engineer), interpreted a DigiTimes article originally written in Chinese, which says that NVIDIA has made pre-payments to TSMC for its 5 nm allocation, and now wants to withdraw from some of it. TSMC is unwilling to budge—it could at best hold off shipments by a quarter to Q1-2023, allowing NVIDIA to get the market to digest inventory of 8 nm GPUs; and NVIDIA is responsible for finding replacement customers for the cancelled allocation.

The same article paints a different picture for AMD: the company has reduced orders for 7 nm and 6 nm nodes, but its 5 nm orders are unaffected. AMD makes not only its next-generation RDNA3 GPUs on 5 nm, but also its next-generation "Zen 4" CPU chiplets. Any drop in demand for GPU silicon would be internally adjusted by increasing "Zen 4" chiplet orders. AMD's growth as a processor manufacturer is no longer bottlenecked by technology leadership, but by volume. The company could jump at the prospect of a higher 5 nm allocation, as it would enable it to increase output of "Zen 4" silicon to meet rising demand for high-margin server processors with its upcoming EPYC "Genoa" and "Bergamo" parts.

The BTC/ETH gravy train is over. GAMERS RISE UP!!
⠀⠀⠀⠀⠀

Polident Hive

  • Member
Nvidia is offering games with GPU purchases again. Older games: Doom Eternal complete and Ghostwire Tokyo. But with cards hitting at or below MSRP plus incentives, it's healing.

Nintex

  • Finish the Fight
  • Senior Member


More on the TSMC/Nvidia story.

Honestly selling a 5nm Switch 2 GPU to Nintendo seems like the best solution.  :doge
They would still have to produce something now but will eventually recoup their costs.

It's almost like the Tegra situation. Nvidia had a surplus and Nintendo wanted a cheap yet capable GPU.
🤴

Nintex

  • Finish the Fight
  • Senior Member
The lower end market is fucked too.
Nvidia drops a GTX 1630 for a whopping $199, which is worse than the $150 Radeon 6400, previously the worst budget card in price vs. performance.

🤴

Skullfuckers Anonymous

  • Will hunt bullies for fruit baskets. PM for details.
  • Senior Member
Holy shit @ no new EVGA gpus.



Next Tuesday is going to be very interesting.

Coax

  • Member
Oh fuck. EVGA was my go-to brand for Nvidia cards  :-\

Nintex

  • Finish the Fight
  • Senior Member
Already a new generation, but they will keep the old one in circulation.

RTX 4080 ($899) offers twice the performance of the 3090 Ti. 4090 comes in at $1599.
DLSS3 is very impressive too.





🤴

Bebpo

  • Senior Member
Prices are zany  :doge

It's like they priced them for miners.

Nintex

  • Finish the Fight
  • Senior Member
Nvidia is in a pickle because they ordered a lot of production capacity that they can't fill, and they either have to hold up their end of the contract with TSMC or find other clients willing to take over those orders.
Add in the difficulties with shipping and other things, and all they can do is charge a premium price to cover the higher costs of making these cards.

They've also rebranded the 4070 into a 4080 12GB model to justify the $899 price tag.
People looking to buy a 3080 or 3090 Ti will switch to these new 4080/4090 models because it doesn't make sense to buy a 3080 for $799 when another $100 gets you twice the performance.
But everyone else will look at $400 - $500 options.

The 3060 Ti performs about like a 2080, which is roughly a 2070 Super with an overclock, and that still runs everything at 1440p ultra.
When you get into 4k, all you need to move from 50 to 60fps in certain games is a 3070. This won't change anytime soon considering the Xbox Series / PS5 hardware that most games target.
There is no big game that launches alongside these cards either. Even if the performance jump over the 2080 is significant there's nothing to upgrade for.
It'll probably be a very limited supply paper launch to drum up demand.

They are pushing ray tracing hard though; this tool is pretty awesome. Basically any DX8/9 game can be modded into an RT game now.
https://www.moddb.com/news/nvidia-releases-dx89-rtx-modding-tool
« Last Edit: September 20, 2022, 06:11:29 PM by Nintex »
🤴

Bebpo

  • Senior Member
Yeah, if they aren't useful for miners anymore, I don't see these cards selling at this price. I think Nvidia is gonna get fucked during the 4xxx generation for the reasons you mentioned. Which hopefully means bargain bin sales after a while at which point I'll grab the 16GB 4080.

I wonder how much of this played into EVGA burning bridges.

Polident Hive

  • Member
Debating what path I should take here.

I’m on the 1000 series (1060) and to put it gently, it sucks ass and wheezes and spittles running some games at 900p 60fps. I’ve a 1440p 144hz monitor and rarely am I seeing the benefits in games. Some games I’d classify as borderline unplayable. CPU isn’t hot either. Some intel mid range 4000 chip. Regardless, gotta put together a new desktop.

One decision I’ve already made is going for a sff case. Takes the top end cards out for thermal and space limitations. Waiting for the 4060s and 4070s could be a ways off. Depending on discounts and sales, thinking a 3060 as a crutch for now is best. Still a massive improvement. Don’t do too many CPU intensive tasks so instead of waiting on the 13th gen, a 12700 should be fine for some time.

Nintex

  • Finish the Fight
  • Senior Member
Debating what path I should take here.

I’m on the 1000 series (1060) and to put it gently, it sucks ass and wheezes and spittles running some games at 900p 60fps. I’ve a 1440p 144hz monitor and rarely am I seeing the benefits in games. Some games I’d classify as borderline unplayable. CPU isn’t hot either. Some intel mid range 4000 chip. Regardless, gotta put together a new desktop.

One decision I’ve already made is going for a sff case. Takes the top end cards out for thermal and space limitations. Waiting for the 4060s and 4070s could be a ways off. Depending on discounts and sales, thinking a 3060 as a crutch for now is best. Still a massive improvement. Don’t do too many CPU intensive tasks so instead of waiting on the 13th gen, a 12700 should be fine for some time.
CPUs are fantastic value these days. Just make sure you have enough cores and threads.
The 12600K or a Ryzen 5600X or 5800X is all you need. There are some crazy deals on the 5800X; I've seen them as low as $250 on Amazon.

AMD is going to announce their new line-up November 3rd, so you might want to hold off to see what they have on offer (if only to see what it does to prices).
But overall, if you can get a good price on a 3060 Ti I would go for it. Below that Nvidia has a gap and your best bet is a ~$300 6600XT.

I have a 2070 Super and a Ryzen 5800X and everything runs perfectly on my 1440p Samsung Odyssey panel. DLSS goes a long way to smooth out the edges on more demanding games.
🤴

Polident Hive

  • Member
Luckily I have access to a micro center. Intel chips are reasonably priced year round. Right now a 12600k is $250 (or $230 with motherboard). During holidays they get cheaper. Before the 4000 announcements, GPU prices there were seeing triple digit discounts with bundled games. Mental math, aiming for around $1.2K on a compromised build. Then splurging on a GPU upgrade down the road.

Honestly, everything I’ve read makes AMD the smarter choice. But been intel+nvidia for so long, just gonna stick with it. Going with a smaller build, I hadn’t considered how hot modern hardware gets, and AMD all around appears more efficient.

Since building my desktop during Obama's admin, AIOs have taken over as the go-to CPU cooling solution? Always went with fans and air cooling, but water cooling looks dead simple today. Two cases I'm looking at are the fractal design torrent nano, larger and air cooled, and the cooler master nr200, smaller and suited to an aio. There's even a model of the nr200 with an aio and psu pre-installed for $400.

naff

  • someday you feed on a tree frog
  • Senior Member
I don't think you'll regret going with the 12 series chips. I have a 5800x but the alder lake chips are sick, and the Ampere cards are fine. I was moaning about not getting all the DLSS3 features, but looking at the price of those 40 series cards... yeah, fuck that for fancy frame insertion.

You can squeeze a 3080 into many sff cases and get decent thermals
◕‿◕

Tycoon Padre

  • Junior Member
Insane pricing that's designed to make the glut of 30 series cards they over-manufactured  look reasonably priced. Who even needs these? A 3080 with DLSS will get you 4K/60 on literally everything but Cyberpunk and Flight Sim.

The early DF coverage of these seems like it'll be pretty icky again btw, just like with the 30 series (even though those were actually good cards!). Looks like they'll be testing on Nvidia's terms again and obfuscating raw performance #s. Gross.
« Last Edit: September 22, 2022, 09:41:18 AM by Tycoon Padre »

Tycoon Padre

  • Junior Member
(wrong thread)

Pissy F Benny

  • Is down with the sickness
  • Senior Member
Insane pricing that's designed to make the glut of 30 series cards they over-manufactured  look reasonably priced. Who even needs these? A 3080 with DLSS will get you 4K/60 on literally everything but Cyberpunk and Flight Sim.

The early DF coverage of these seems like it'll be pretty icky again btw, just like with the 30 series (even though those were actually good cards!). Looks like they'll be testing on Nvidia's terms again and obfuscating raw performance #s. Gross.

Hold up, are you suggesting games coverage may not be on the up and up?
(ice)

Skullfuckers Anonymous

  • Will hunt bullies for fruit baskets. PM for details.
  • Senior Member
Was planning on upgrading to the 4000 series from my 2070 super, but probably going to hold off for now due to price and unknowns about dlss 3. I looked at big upcoming releases and the only one I’ll probably want next year is re4 remake which I shouldn’t have trouble running so I don’t see the point right now.


Nintex

  • Finish the Fight
  • Senior Member


Big Chungus
🤴


Nintex

  • Finish the Fight
  • Senior Member
How does that even fit into your case without completely killing your airflow in the process? Is that why some of these have fans added on the back?
🤴


Nintex

  • Finish the Fight
  • Senior Member
Dutch Founders Edition pricing for the 4000 series

RTX 4090 €1959
RTX 4080 (16GB) €1509
RTX 4080 (12GB) €1129 (which is actually the 4070)

MSI/Gigabyte/Asus cards etc. will probably be 200 - 400 euros more expensive than the FE depending on the cooling solution and OCs.
I remember when people were complaining that the top tier 1080 Ti launched at 800  :lol

I paid €629 for my RTX 2070 Aorus Super in 2019 and that included CONTROL and Wolfenstein for free.   
« Last Edit: September 25, 2022, 06:28:21 AM by Nintex »
🤴

Nintex

  • Finish the Fight
  • Senior Member


Big chungus
🤴

Nintex

  • Finish the Fight
  • Senior Member


Intel's cards perform well in new games but not so well in old games.
🤴

Nintex

  • Finish the Fight
  • Senior Member
The 4090's power connectors are melting down
https://twitter.com/earlygamegg/status/1580472261358026753
🤴


who is ted danson?

  • ⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀✋💎✋🤬
  • Senior Member
nvidia subreddit is full of "my shit melted" posts
https://www.reddit.com/r/nvidia/
⠀⠀⠀⠀⠀

remy

  • my hog is small but it is mighty
  • Senior Member
The 4090's power connectors are melting down
https://twitter.com/earlygamegg/status/1580472261358026753
I thought this was a weird meme the first time i saw it and not what was actually happening to guys with 4090s  :lol


Coax

  • Member
The 4090's power connectors are melting down
[...]

That's... an EVGA card :p (It's a joke tweet, though there are enough people seeing real melted connectors).

Nintex

  • Finish the Fight
  • Senior Member
RDNA 3

RX 7900 XT / 20GB = $899 / 300 watt

RX 7900 XTX / 24GB = $999 / 355 watt / 61TFLOPS

Up to 1.7x faster than the RX6950
Bunch of new technologies shown, like DisplayPort 2.1, a new Snowdrop Engine from Ubisoft, FSR3, some fancy UE5 tech demos, and a HYPR-RX mode that enables all the fancy RX enhancements with 1 click so you don't have to tinker with the settings.

Some performance thingies that I noticed.
1440p@300fps Apex Legends
1440p@600fps Overwatch 2
1440p@833fps valorant

Assassin's Creed Valhalla 8k@96fps
UE5 Matrix city, 4k(?)@121fps with FSR enabled.


https://twitter.com/tomwarren/status/1588277571011026944
:wut
« Last Edit: November 03, 2022, 05:20:58 PM by Nintex »
🤴

pilonv1

  • I love you just the way I am
  • Senior Member
4080 seems like a good card if the price wasn't jacked.
itm