RTX 5060s announced.
By Afania 2025-04-20 02:48:34
The only reason nVidia is still selling any consumer GPUs is that Jensen likes to wear leather jackets and stand on stage acting like he's Steve Jobs.
Nah, his social influence in the world is several times higher now because of AI hardware. No one except game nerds cared about him back when he only sold GPUs to gamers.
These days he can wear leather jackets and stand on stage acting like he's Steve Jobs while talking about AI stuff. And every big manufacturing company CEO and politician listens to him talk like he's Jesus alive, with the media talking about him for three days straight.
And all that social influence came from AI servers, not the 5090. Even if he wants social influence, consumer GPUs don't offer nearly as much of it.
By K123 2025-04-20 06:21:59
Ok, Meta might be using 5nm, but it definitely wouldn't be out of choice; they just don't have the might to get capacity on 3nm. No one right now will be designing and making chips for 3nm that aren't already far down the design pipeline. I struggle to believe Nvidia got the demand wrong. Everyone was trashing the 5070 and 5080 since the specs were released long ago. They could have been stuck with quantities they ordered before this, maybe, but they are far under market demand for the 5090. I'd have bought a 5090 at £1800, maybe at £2000, but I'm not paying any more out of principle.
By Afania 2025-04-20 06:59:15
I'd have bought a 5090 at £1800, maybe at £2000, but I'm not paying any more out of principle.
Don't then. The 5090 is a luxury item for rich people who can pay anything for the best; it's not a requirement to play games. You are not their target audience. You can play the majority of games at lower settings with a weaker GPU.
By Seun 2025-04-20 09:14:01
I'd have bought a 5090 at £1800, maybe at £2000, but I'm not paying any more out of principle.
Don't then. The 5090 is a luxury item for rich people who can pay anything for the best; it's not a requirement to play games. You are not their target audience. You can play the majority of games at lower settings with a weaker GPU.
Seconded.
It's discretionary spending for a gamer, but it's a business expense and a write-off for a professional. There really isn't any value to be found in a market where some of the consumers don't have to care about the cost. We can keep fooling ourselves with the pretty charts and graphs, but it's a fallacy.
By K123 2025-04-20 10:09:10
I'd have bought a 5090 at £1800, maybe at £2000, but I'm not paying any more out of principle.
Don't then. The 5090 is a luxury item for rich people who can pay anything for the best; it's not a requirement to play games. You are not their target audience. You can play the majority of games at lower settings with a weaker GPU.
I only have the 3090 for AI. Same reason the demand is so high for the 5090. Prior to this I was happy with a 4GB AMD card for FFXI and emulators.
Asura.Saevel
Server: Asura
Game: FFXI
Posts: 10097
By Asura.Saevel 2025-04-20 14:39:46
Clamshell is rarely used because it reduces the chip's effective bandwidth by half, and bandwidth is usually far more important to performance than raw capacity.
This is part of the reason I wanted to see the head-to-head, specifically in instances where they're not spilling. I'm curious why 28G/s and not 32, though? It makes my brain cell itch.
I posted it earlier .....
The 4060 Ti 8/16GB comparison was already done. It's the exact same card, the only difference being memory, and you can see when and where more memory starts to matter. And WTH are you talking about with 28G/s? The 4060 Ti has 288GB/s memory bandwidth; the 5060 Ti has 448GB/s.
By Seun 2025-04-20 15:20:00
Clamshell is rarely used because it reduces the chip's effective bandwidth by half, and bandwidth is usually far more important to performance than raw capacity.
This is part of the reason I wanted to see the head-to-head, specifically in instances where they're not spilling. I'm curious why 28G/s and not 32, though? It makes my brain cell itch.
I posted it earlier .....
The 4060 Ti 8/16GB comparison was already done. It's the exact same card, the only difference being memory, and you can see when and where more memory starts to matter.
My bad, I thought it was normal vs OC. I didn't think 8GB cards were released to reviewers, so I wasn't expecting to see anything before next week.
And WTH are you talking about with 28G/s? The 4060 Ti has 288GB/s memory bandwidth; the 5060 Ti has 448GB/s.
28 x 16 = 448? That's not how it works?
Asura.Saevel
Server: Asura
Game: FFXI
Posts: 10097
By Asura.Saevel 2025-04-20 18:21:33
28 x 16 = 448? That's not how it works?
You need to specify what you're referencing. In this case it's the memory data rate, measured in Gb/s per pin.
Converting bits into bytes, we get 3.5GB/s per pin:
28 / 8 = 3.5
The bus is 128 bits wide, meaning 128 data pins, so 3.5 * 128 = 448GB/s.
The other way is to do a per-package conversion: (28 * 32) / 8 = 112GB/s per chip. Four chips give us 4 * 112 = 448GB/s.
28Gbps is the speed of the modules that nVidia purchased from their suppliers. The 5080 uses 32Gbps modules instead.
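Both conversions above can be sketched in a few lines. The figures (28Gbps data rate, 128-bit bus, 32-bit interface per memory package) are taken straight from the posts; nothing else is assumed.

```python
# Two equivalent ways to get the 5060 Ti's 448 GB/s from its
# 28 Gb/s-per-pin modules on a 128-bit bus.

data_rate_gbps = 28   # per-pin data rate in Gb/s (from the post)
bus_width_bits = 128  # total bus width = number of data pins

# Method 1: per-pin. 28 Gb/s / 8 = 3.5 GB/s per pin, times 128 pins.
per_pin_gbs = data_rate_gbps / 8
bandwidth_per_pin = per_pin_gbs * bus_width_bits

# Method 2: per-package. Each chip exposes a 32-bit interface,
# so (28 * 32) / 8 = 112 GB/s per chip, and 128 / 32 = 4 chips.
per_chip_gbs = data_rate_gbps * 32 / 8
bandwidth_per_chip = per_chip_gbs * (bus_width_bits // 32)

print(bandwidth_per_pin, bandwidth_per_chip)  # 448.0 448.0
```

The "28 x 16 = 448" coincidence earlier in the thread gives the right number only because 16 happens to equal 128/8; the per-pin and per-package routes are the ones that generalize to other cards.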
By Seun 2025-04-20 22:30:32
I arbitrarily picked the bus speed because 32 sounds better than 28, but I meant more bandwidth in general. Thanks for the breakdown though. I wasn't visualizing the configuration correctly.
By K123 2025-04-28 05:14:58
https://videocardz.com/newz/geforce-rtx-5080-super-rumored-to-feature-24gb-memory-rtx-5070-super-with-18gb-config
Says everything I was saying, e.g. the 5060 should have used 3GB modules for 12GB, etc.
Nvidia has completely butchered this gen. If there are 2-3 years before the 6000 series, then I don't even expect desktop PCs to be anything like they are today by then. CAMM modules, etc.
By Seun 2025-04-28 18:45:10
Would it really be worth giving up speed and bandwidth for capacity here?
By K123 2025-04-28 20:10:34
That was also assuming they didn't use such a narrow bus, I suppose.
By Seun 2025-04-29 02:50:05
I suppose they wouldn't have to, but the 5070 would look even more out of place if a 5060 had the same memory configuration. Plus the 5070 still runs out of memory, so I assume a 12GB version of the 5060 Ti would have the same issue.
By Felgarr 2025-04-29 04:05:33
Wow, I hadn't heard of CAMM modules until I read through this thread. However, I expect CAMM modules to end up in thin laptops; I'm less likely to see them in ITX/mATX/ATX motherboards for desktop PCs. Still interesting though.
What other changes do you foresee for desktop PCs in 2-3 years?
Asura.Saevel
Server: Asura
Game: FFXI
Posts: 10097
By Asura.Saevel 2025-04-29 19:22:00
Wow, I hadn't heard of CAMM modules until I read through this thread. However, I expect CAMM modules to end up in thin laptops; I'm less likely to see them in ITX/mATX/ATX motherboards for desktop PCs. Still interesting though.
What other changes do you foresee for desktop PCs in 2-3 years?
CAMMs are nonexistent in the consumer desktop space. Some companies made them, but ultimately there was very little demand; it's a solution looking for a problem. The two-to-four-slot DIMM form factor is just too useful and flexible; it'd be like trying to replace USB peripherals with a proprietary fruit-branded... ohh wait. It's like MXM: useful for notebooks and embedded devices, scarce everywhere else.
By K123 2025-04-30 13:00:57
Wow, I hadn't heard of CAMM modules until I read through this thread. However, I expect CAMM modules to end up in thin laptops; I'm less likely to see them in ITX/mATX/ATX motherboards for desktop PCs. Still interesting though.
What other changes do you foresee for desktop PCs in 2-3 years?
Affixing GPUs perpendicular to the mobo as standard is a dumb idea that's way overdue for retirement. I hope cases that support horizontal mounting become standard.
Asura.Eiryl
By Asura.Eiryl 2025-04-30 14:12:01
Hot take: mounting any PC hardware vertically is dogshit.
Everything should be flat. I don't know how the idiots won and made fishtank cases and unicorn vomit the standard, but it's gross.
(I actually do know why; it's the only answer to every question.)
By K123 2025-04-30 14:18:10
I have my tower flat on the floor to keep my 3090 from sagging.
By Seun 2025-04-30 14:38:27
If there isn't a performance increase that comes along with vertical mounting, it shouldn't be standard. Not everyone wants to pay more money simply to have all sides of their GPU plastered with marketing and branding. Some of us are actually more concerned with what the display looks like rather than the inside of the case.
Shiva.Thorny
Server: Shiva
Game: FFXI
Posts: 3148
By Shiva.Thorny 2025-04-30 14:46:58
Pretty sure the 'performance increase' is that it takes up like 160 square inches instead of 400 and won't be stepped on by pets if you put it on the floor. It's a valid concern, especially for office PCs, but there should definitely be more emphasis on horizontal layout with the weight of current GPUs.
Carbuncle.Nynja
Server: Carbuncle
Game: FFXI
Posts: 4929
By Carbuncle.Nynja 2025-04-30 14:52:22
My GPU has a weight-supporting plate that's something like an inch thick. It's gonna be a literal stick of rebar sooner or later.
As far as vertical vs horizontal:
Heat rises. PCs today generate more heat than yesterday's PCs did. Top-mounted fans and vents may be hindered if you slap a monitor on top of the case. There's a reason airflow best practice is in from the front, out through the top (or the back, which is usually close to the wall).
By K123 2025-04-30 15:20:36
If there isn't a performance increase that comes along with vertical mounting, it shouldn't be standard. Not everyone wants to pay more money simply to have all sides of their GPU plastered with marketing and branding. Some of us are actually more concerned with what the display looks like rather than the inside of the case.
GPU sag can make the card pull out of the PCIe slot and damage it. Big GPUs are heavy as f now.
By Seun 2025-04-30 15:37:21
If there isn't a performance increase that comes along with vertical mounting, it shouldn't be standard. Not everyone wants to pay more money simply to have all sides of their GPU plastered with marketing and branding. Some of us are actually more concerned with what the display looks like rather than the inside of the case.
GPU sag can make the card pull out of the PCIe slot and damage it. Big GPUs are heavy as f now.
My GPU is locked into the PCIe slot. If the sag were that pronounced, the PCIe connector itself would detach from the mobo. A GPU falling out of the PCIe slot sounds like user error.
Asura.Saevel
Server: Asura
Game: FFXI
Posts: 10097
By Asura.Saevel 2025-04-30 18:06:07
Pretty sure the 'performance increase' is that it takes up like 160 square inches instead of 400 and won't be stepped on by pets if you put it on the floor. It's a valid concern, especially for office PCs, but there should definitely be more emphasis on horizontal layout with the weight of current GPUs.
I think it's more to do with the form factor allowing for better air circulation. The slot design dates all the way back to the '70s and '80s, when daughter boards generated less heat and didn't require power inputs that melt connectors. With the card perpendicular to and adjoining the motherboard, the motherboard blocks the airflow from under the card. If the card is parallel to the motherboard, there is nothing blocking the airflow in any direction.
GPU waterblocks get around the problem entirely by just moving the radiator and fans elsewhere.
By K123 2025-04-30 18:16:03
If there isn't a performance increase that comes along with vertical mounting, it shouldn't be standard. Not everyone wants to pay more money simply to have all sides of their GPU plastered with marketing and branding. Some of us are actually more concerned with what the display looks like rather than the inside of the case.
GPU sag can make the card pull out of the PCIe slot and damage it. Big GPUs are heavy as f now.
My GPU is locked into the PCIe slot. If the sag were that pronounced, the PCIe connector itself would detach from the mobo. A GPU falling out of the PCIe slot sounds like user error.
I don't think you have experience with beefy GPUs. You can google this rather than making wild assumptions that I don't know how to install a GPU, for God's sake. It is a widespread issue, hence why extender cables exist, thick metal brackets, etc.
By Seun 2025-04-30 19:55:26
I don't think you have experience with beefy GPUs. You can google this rather than making wild assumptions that I don't know how to install a GPU, for God's sake. It is a widespread issue, hence why extender cables exist, thick metal brackets, etc.
I guess it depends on your threshold for 'beefy'.
You shouldn't take this personally, because it's just a general statement. If my GPU fell out of the PCIe slot, the first question I would ask myself is "Did I install that GPU correctly?" because that's the most likely problem. If a GPU is properly seated and locked into the PCIe slot, it's far more likely that the weight of an unsecured GPU would pull the entire PCIe slot away from the motherboard. Not a wild assumption at all.
Hot take: Nobody should have to buy support brackets for their 'beefy' GPU. It's not difficult to manufacture a card that has enough support on its own to mitigate the issue.
Valefor.Yandaime
Server: Valefor
Game: FFXI
Posts: 798
By Valefor.Yandaime 2025-04-30 21:28:13
Hot take: Nobody should have to buy support brackets for their 'beefy' GPU. It's not difficult to manufacture a card that has enough support on its own to mitigate the issue.
Well, the thing is, support brackets are indeed necessary. Hence why several GPU models have literally come with support brackets ever since the 3080 and onward. (I don't remember if the 20-series had such a thing or not.)
The 3090 and 4090 both came with arm supports that attach to the expansion slots. The 3090 arm held up just fine, but the 4090 arm was starting to sag noticeably before too long, so I fashioned one out of a 1" all-thread rod + coupling + a washer welded to the coupling. The 5090 Vanguard comes with its own little jack support.
You really only need about 2lbs of lift or so at the far end to make it stable, so you can literally use anything, but you DO need something or you risk the thing eventually damaging the PCIe slot over time. There's simply too much weight in there now on the heavier, longer cards. The slot just can't physically support the weight that far out without bending things that don't like to be bent.
Edit: As for the question of what counts as "beefy"? Any Nvidia xx80 or xx90 card from the 3000 series or newer falls into the beefy box. Additional support was never a concern before that lineup. xx70 and xx60 cards are still a non-issue.
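The "2lbs of lift" figure roughly checks out with a quick lever-arm estimate. This sketch assumes a 2 kg card with its center of mass ~15 cm out from the slot bracket and a support placed ~30 cm out; all three numbers are illustrative guesses, not specs for any particular card.

```python
# Lever-arm estimate of GPU sag: torque about the PCIe slot/bracket,
# and the upward force a far-end support must supply to cancel it.
# All masses and distances are assumed, not measured.

g = 9.81                 # m/s^2
card_mass_kg = 2.0       # heavy modern GPU (assumed)
com_dist_m = 0.15        # center of mass ~15 cm from the slot (assumed)
support_dist_m = 0.30    # support near the card's far end (assumed)

torque_nm = card_mass_kg * g * com_dist_m     # moment the slot must resist
support_force_n = torque_nm / support_dist_m  # lift needed at the support

print(f"{torque_nm:.1f} N*m, {support_force_n:.1f} N "
      f"(~{support_force_n / 4.448:.1f} lbf)")  # 2.9 N*m, 9.8 N (~2.2 lbf)
```

Under these assumptions the support only needs to push up with about 2.2 lbf, which is why almost anything rigid works as a prop, and why the long moment arm is what stresses the slot rather than the raw weight alone.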
By K123 2025-05-01 01:57:54
Hot take: Nobody should have to buy support brackets for their 'beefy' GPU. It's not difficult to manufacture a card that has enough support on its own to mitigate the issue.
Well, the thing is, support brackets are indeed necessary. Hence why several GPU models have literally come with support brackets ever since the 3080 and onward. (I don't remember if the 20-series had such a thing or not.)
The 3090 and 4090 both came with arm supports that attach to the expansion slots. The 3090 arm held up just fine, but the 4090 arm was starting to sag noticeably before too long, so I fashioned one out of a 1" all-thread rod + coupling + a washer welded to the coupling. The 5090 Vanguard comes with its own little jack support.
You really only need about 2lbs of lift or so at the far end to make it stable, so you can literally use anything, but you DO need something or you risk the thing eventually damaging the PCIe slot over time. There's simply too much weight in there now on the heavier, longer cards. The slot just can't physically support the weight that far out without bending things that don't like to be bent.
Edit: As for the question of what counts as "beefy"? Any Nvidia xx80 or xx90 card from the 3000 series or newer falls into the beefy box. Additional support was never a concern before that lineup. xx70 and xx60 cards are still a non-issue.
This guy knows, except for the last comment: high-end professional CAD and AI cards started coming with support brackets a few years ago, before the 3000 series, when they started getting ridiculously long and heavy.
By Seun 2025-05-01 02:13:06
Gotta love nVidia AIBs. They make you pay 3k for a GPU that comes with a crutch so you can prop it up and keep it from collapsing on itself.
2kg is too much, huh? We can't engineer our way out of that somehow? Someone please tell me it's not science fiction.