|
Post by crashV👀d👀 on Jan 9, 2024 15:34:44 GMT
Even though my PC is less accessible than I would like, I have pondered upgrades (RTX 4000 Supers are a possibility along with a new mobo and CPU). Been looking at articles about the new 12V power connectors - are they adaptable (safely) to older PSUs or will I need a new one of those too? (Antec TruePower 720 is what I have IIRC, fairly old)
I used a non-ATX3 PSU with my 4090 for a while and it worked ok. They come with a cat-o'-nine-tails power cable (12VHPWR on one end, splitting to 2, 3 or 4 older 8-pins on the other). You shouldn't need to replace the PSU, but I would just confirm the draw requirement for a 4080 if you go that far...
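The draw check suggested above can be sketched as a quick back-of-the-envelope headroom calculation. All the wattages below are illustrative assumptions (roughly 320 W for a 4080-class card, ~150 W for a CPU, ~80 W for the rest), not measured figures for any real build:

```python
# Back-of-the-envelope PSU headroom check -- all wattages here are
# illustrative assumptions, not measured figures for any real build.
def psu_headroom(psu_watts, component_watts, transient_margin=0.2):
    """Watts left over after component draw plus a spike allowance.

    transient_margin reserves a fraction of the summed draw for the
    short power excursions modern GPUs are known for.
    """
    budget = sum(component_watts) * (1 + transient_margin)
    return psu_watts - budget

# Hypothetical 4080-class build on a 720 W unit:
# ~320 W GPU + ~150 W CPU + ~80 W for board, drives and fans.
print(psu_headroom(720, [320, 150, 80]))
```

With those made-up numbers a 720 W unit clears the bar with roughly 60 W to spare, which is why checking the specific card's draw (and the PSU's 12V rail rating) still matters before committing.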
|
|
|
Post by One_Vurfed_Gwrx on Jan 9, 2024 18:10:30 GMT
Thanks for the reply, that was what I was wondering - I remember using adaptors for stuff decades ago but haven't needed to in recent years.
|
|
|
Post by captbirdseye on Jan 10, 2024 14:30:23 GMT
The power supply maker might sell a dedicated 12vhpwr cable rather than using the ugly Nvidia one.
|
|
malek86
Junior Member
Pomegranate Deseeder
Posts: 3,159
|
Post by malek86 on Jan 10, 2024 19:47:38 GMT
Wondering if it might be a good idea to replace my Ryzen 3600 with a 5800X3D on the (somewhat) cheap, which should presumably last me until the end of the console generation. I was even thinking of a 5700X3D, but the clock might be a tad low. I'll have to wait for benchmarks.
The main problem is that it would have to go with my current slow DDR4-2666 memory; not sure how much that would hold it back. Also, looking at some tests online, it seems that for a mainstream card like my RX 6600 XT there might not be that much of an improvement, since the GPU is more of a bottleneck in modern games, even at 1080p.
The alternative is replacing the entire set and buying a new mobo with DDR5 and Ryzen 8000-whatever when it comes out. Maybe that's the best solution, if a little more expensive. At least I'd be more future-proof.
|
|
|
Post by uiruki on Jan 10, 2024 20:19:00 GMT
The X3D is actually much less dependent on fast RAM. If you do other stuff with your PC it'll be a big upgrade; for games the 5800X3D is way faster than my old 3900X even though it has 4 fewer cores.
|
|
|
Post by crashV👀d👀 on Jan 10, 2024 21:47:35 GMT
malek86 I went from a Ryzen 7 3700X to a 5800X3D and the difference was absolutely noticeable, especially in the 1% lows. It just steamrolls through tasks and feeds the GPU like a champ, meaning gaming is super smooth. Don't know how much it is now but it was well worth £300 at the time.
|
|
malek86
Junior Member
Pomegranate Deseeder
Posts: 3,159
|
Post by malek86 on Jan 11, 2024 22:40:49 GMT
Yeah, though in my case the 6600 XT is most likely the main bottleneck in modern games, there have been some games where I've noticed framerate drops regardless of resolution and graphics settings. I think even if max framerates wouldn't change much, the 1% lows would benefit from it. I'm pretty sure my 3600 is slightly held back by my slow RAM.
Only, the 5800X3D does look a bit expensive for a mid-life upgrade - it can be upwards of 320 euro nowadays. I kinda want to wait for the 5700X3D and see how much of a difference the lower clocks make. Mind, I'll also have to wait until Gigabyte makes my mobo compatible with it. I have a B450, so anything Ryzen 7000 is out of the question unless I upgrade that too.
|
|
malek86
Junior Member
Pomegranate Deseeder
Posts: 3,159
|
Post by malek86 on Feb 1, 2024 11:45:44 GMT
The 5700X3D is out. Still waiting for reviews, but the general consensus seems to be that it should be nearly as fast as the 5800X3D. Meaning I could get it for 300 euro. It's compatible with my mobo etc., so it would be a fairly cheap upgrade for sure, especially compared to an AM5 system with CPU and RAM.
I'm just not sure I want to. If I do buy it, then I won't upgrade for another three years at least. Do I want my AM4 motherboard with 16GB of slow DDR4-2666 (yeah, I could upgrade it too and even expand it, but that would feel like a bit of a waste) and PCIe 3.0 to go on that much longer? Realistically, three years is also how long I'd like my 6600 XT to last. So unless my 3600 is holding me back massively, even the 5800X3D would feel like a waste, because when I do upgrade the entire system, it will be left behind anyway.
That's kinda hard to say though. There's this review, but it mostly uses older games: www.techspot.com/review/2502-upgrade-ryzen-3600-to-5800x3d/
So I have no idea how much a 3600 struggles today. I don't even play that many modern games. Dead Space did seem somewhat CPU limited at times, but... it seemed GPU limited much more often. And with modern games, it's hard to say what is a CPU bottleneck and what is just stutter due to DX12. Maybe I'll wait for the next paycheck and see how much I have left.
|
|
|
Post by uiruki on Feb 1, 2024 12:05:44 GMT
Considering that the PS5/XSX are roughly 3700X equivalents anyway, with an X3D you're basically set for the generation CPU-wise with room to take 30fps console games to 60 with lower settings or a faster graphics card.
If you're happy with the features on your motherboard (Wifi, ports etc), I'd say go for it - you're not likely to get that significant a jump from an AM4 X3D processor without spending a large amount of cash. That's money you can instead shuffle to a faster graphics card in a year or two.
|
|
malek86
Junior Member
Pomegranate Deseeder
Posts: 3,159
|
Post by malek86 on Feb 1, 2024 13:39:50 GMT
Yeah, I know. It should be enough for another while. Worst case, I could add some semi-cheap extra memory for comfort. Heck, I might even just make it 2x4GB and cruise along with 24GB, which should be more than enough. And I think 16GB is already enough anyway.
Regarding the GPU, I don't mind playing at sub-native res or low settings. I do enjoy a steady 60fps though, and perhaps the 6600 XT is a tad undersized for my current ultrawide 1080p monitor. After all, in recent games even the PS5/XSX are often seen dropping below 1080p for 60fps. Dead Space for me struggled even at 2310x990, medium settings and with FSR2 enabled. But again, you never know when a PC port might be merely unoptimized.
|
|
|
Post by uiruki on Feb 1, 2024 13:47:17 GMT
Don’t mix and match RAM! Even when it seems to work, all it’ll do is add a bunch of weird instability that will persist until you undo it.
If you can’t work out whether something is GPU or CPU limited, get MSI Afterburner or similar installed and activate the per-core stats for your CPU. If your GPU is consistently dropping below 90% while dropping frames, and/or you’ve got at least one core pinned at 100%, then you have a CPU limit. Going to the X3D you get both extra cores and considerably stronger individual cores, as well as removing the cross-CCD latency because of the way the processor is laid out.
In the case of Dead Space, in my playthrough I had a few bits where a 3080 struggled; the DF video noted that it’d just spike VRAM usage, which would specifically bring the 10GB card down to a crawl. Combine that with the traversal stutter as it struggles to load in what’s ahead of you and you have a rough port that’s tricky to fix with just settings.
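The rule of thumb above (GPU utilisation consistently under ~90% while at least one core sits pinned near 100% points to a CPU limit) can be written down as a toy classifier. The thresholds and the function itself are illustrative, just a sketch of the heuristic applied to figures you'd read off an overlay like MSI Afterburner:

```python
# Toy bottleneck classifier based on the rule of thumb above.
# Inputs are utilisation percentages from one overlay sample;
# the thresholds are illustrative, not canonical.
def classify_bottleneck(gpu_util, per_core_utils,
                        gpu_threshold=90.0, core_threshold=97.0):
    """Return 'cpu', 'gpu', or 'unclear' from one utilisation sample."""
    core_pinned = any(u >= core_threshold for u in per_core_utils)
    if gpu_util < gpu_threshold and core_pinned:
        return "cpu"      # GPU starved while a core is maxed out
    if gpu_util >= gpu_threshold:
        return "gpu"      # GPU is at (or near) its limit
    return "unclear"      # neither saturated: engine/stutter territory

# One core pinned while the GPU idles below 90%: classic CPU limit.
print(classify_bottleneck(72.0, [100.0, 55.0, 40.0, 38.0]))
```

In practice you'd watch these numbers over a stretch of gameplay rather than a single sample, since shader compilation and traversal stutter can spike either reading briefly.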
|
|
malek86
Junior Member
Pomegranate Deseeder
Posts: 3,159
|
Post by malek86 on Feb 1, 2024 13:52:01 GMT
Ah, don't worry, I wouldn't just blindly buy any stick. I'd get the same sticks as the ones I have now, same latency and speed, just a different size. Though it turns out nowadays they cost the same as 2x8GB, so... it's pretty pointless...
|
|
|
Post by uiruki on Feb 1, 2024 13:59:13 GMT
That applies to size too. If you were running faster RAM I’d just say buy a new set of 32 while it’s still cheap, as memory controllers nowadays are really skittish. Believe it or not, DDR5 is even worse: you could get two sets of the same brand, same specs but different runs, and it might be different enough to make things not work properly at XMP/EXPO speeds.
|
|
malek86
Junior Member
Pomegranate Deseeder
Posts: 3,159
|
Post by malek86 on Feb 1, 2024 14:16:37 GMT
Darn, things are getting worse, and they were already bad. I do remember having trouble getting my Corsair Vengeance sticks to run at XMP speed on my previous A320. Went to read the mobo support list and it said those specific sticks couldn't run at XMP speed in dual channel mode. So I had to keep them at 2133MHz, which sucked, especially on my older Ryzen 1500X. After some years, I upgraded to a Ryzen 3600 and B450M, and I didn't have as much trouble, because this time the mobo specifically supported those sticks running at XMP speed in dual channel mode.
Anyway, for 50 euro I could get another 2x8GB of exactly the same type as the ones I'm running now. The support list says I can run that specific RAM model and version in 4 sockets too, so it should be fine, I guess...? Getting a brand new 32GB kit would be interesting too - I'd take the chance to increase the speed (my mobo supports up to 3600MHz) - but that would be a bit more expensive for a decent brand, and chances are the 5700X3D, with its massive cache, might not need it as much.
All assuming I even need an extra 16GB. All tests I've seen show that 32GB barely makes any difference, except for Hogwarts Legacy. Also, I'll admit it would be interesting to run 24GB of RAM. That's pretty uncommon.
|
|
KD
Junior Member
RIP EG
Posts: 1,314
|
Post by KD on Feb 1, 2024 14:53:28 GMT
|
|
|
Post by uiruki on Feb 1, 2024 14:54:14 GMT
If it's the same then it should be fine - if not, I think you'd have a pretty good case for a refund. And yeah, I specifically bought some sticks when I upgraded to my 3900X and my X570 board just wasn't having it, even though they were specifically on the approved list. I then did just that - got another 16 gigs of the stuff I already had and have been running that solidly since.
With DDR5 I've seen suggestions that you don't want to run more than two sticks at all, even. Hopefully things improve as it matures but it's not like the days when I split two types of SIMMs on my 486 to get 5 megabytes of memory.
|
|
|
Post by dfunked on Feb 1, 2024 15:03:11 GMT
8MB soldered on to the motherboard for me + a whopping 32MB added for 40MB total. I could finally play Blood on it!
Good times!
|
|
|
Post by uiruki on Feb 1, 2024 15:05:09 GMT
Yeah, Blood could technically start up in 8 but if you actually tried to play the game you'd see it just wasn't having it.
|
|
malek86
Junior Member
Pomegranate Deseeder
Posts: 3,159
|
Post by malek86 on Feb 1, 2024 15:38:31 GMT
Luckily, the 8MB of PC-100 my Compaq Presario came with was enough to play Quake.
|
|
|
Post by crashV👀d👀 on Feb 1, 2024 17:12:04 GMT
I upgraded to 32GB when I went through my Cities: Skylines phase, and with lots of mods it was easily chewing through 16GB. I then drifted off that game and don't think I've ever needed it again.
|
|
malek86
Junior Member
Pomegranate Deseeder
Posts: 3,159
|
Post by malek86 on Feb 2, 2024 8:09:58 GMT
Aside from some specific cases like a modded Cities: Skylines, I think the only other game so far that actually benefits from 32GB is Hogwarts Legacy. Other than that, we'll have to see in future games.
|
|
|
Post by barchetta on Feb 2, 2024 8:42:59 GMT
I'm hoping MSFS2024 will make more use of available RAM.
|
|
|
Post by dfunked on Feb 2, 2024 9:37:26 GMT
32GB seems to be a safe amount to aim for if games are starting to recommend it. It's a nice amount to have for an extra bit of breathing room, and for stuff like mucking about with VMs.
I couldn't resist going for 64GB when I saw a deal recently, but in hindsight that's a fucking daft amount to have unless you genuinely have a specific use case for it. I'll probably go back to 32GB whenever I do a full upgrade, although that'll probably be 3+ years away in fairness so things might've changed.
|
|
|
Post by Phattso on Feb 2, 2024 9:49:00 GMT
Many of you must be using your systems very differently to how I use mine. It’s not just about specific games needing the RAM for me but more about overall system snappiness and multitasking. I can keep a bunch of apps and browsers open without issue on my 32GB laptop and still play top tier games.
|
|
|
Post by Vandelay on Feb 2, 2024 10:14:52 GMT
I have 32GB and probably don't hugely require it. I don't have loads of things open, and I shudder a bit when I see people with browsers that have dozens of tabs (I don't get it - clicking through all of them to find the one you need takes just as long as getting back to the page. Plus, I only go to about 3 websites anyway). Having said that, I do regularly get near 16GB with the way I use my system. I certainly wouldn't want any less than 16.
It probably would have made more sense for me to upgrade my 16GB with faster RAM rather than more last time, as I'm only on 3000MHz at the moment. Next upgrade, I'll likely stick with 32GB, but make sure I get some more speed. My 5800X seems to be fine for the moment though, and I don't think it's worth doing a RAM upgrade until I move to DDR5 (or whatever it might be by the time I upgrade). Might take a look at the Zen 5s that come out towards the end of this year, but I think I can probably stick with my current CPU for another couple of years.
|
|
|
Post by crashV👀d👀 on Feb 2, 2024 10:45:50 GMT
Yeah, I do. I have laptops for all sorts of other bullshit, but my PC is for games and games only. Loads of services turned off, as little loaded into the systray/memory as possible, and everything to the benefit of FPS, frametimes and speed.
|
|
|
Post by uiruki on Feb 2, 2024 10:57:23 GMT
I think that 16's probably going to be quite tight in the future, but the only upgrade from that is to double it. If you're on DDR4 and anticipate a move to DDR5 in the next year it's hard to recommend an upgrade, as you'll be moving on before things start to get rough. But if you're already on DDR5, or are building anew, then 32 is likely the way to go, unless you're just going to be on the desktop rather than gaming or working.
|
|
|
Post by Matt A on Feb 2, 2024 11:20:07 GMT
I'm going to get a new PC in a couple of years or so. I currently have a 3070 Ti laptop with 16GB and it works great for most games.
|
|
|
Post by crashV👀d👀 on Feb 10, 2024 23:25:33 GMT
|
|
|
Post by stuz359 on Feb 17, 2024 22:55:01 GMT
I've gone for a 4060 as an interim card. I'm going to build my own PC this year, but this is an upgrade on the 1070 and has much more modern features besides. The plan is to go top tier on the PC build, so I didn't want to spunk a lot of money on this. The 4060 Ti was a bit too expensive for its features, and the 3070/3070 Ti were too power hungry. I was kind of tempted by the 4070, but that would have involved trading in the Xbox Series X and my 1070 so I could afford it. I basically use my Xbox as a backwards compatibility device, and I have no use for a 1070.
A big factor was power draw: the 4060's TDP is around 115W, the 1070's around 150W, meaning I don't have to upgrade the PSU. I have a PSU ready for an upgrade, but for an interim card it's another headache. I just wanted a nice simple upgrade, without the hassle of multiple component upgrades.
I'll let you know how it goes.
|
|