|
Post by crashV👀d👀 on Sept 12, 2024 21:17:31 GMT
You could bypass them, but Microsoft have now patched it out. I’ve got a PC that could easily last a few more years, but because the motherboard doesn’t have TPM, I need a new build 🤬
You can download the ISO direct from Microsoft, then use Rufus to create an installation USB that bypasses whatever requirement your system doesn't meet. Don't know if that might cause some funny business with future updates, but it might be worth a go if you don't want to upgrade. Surely you could also just acquire an older build of Windows and do the install without an online connection?
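(For reference, what Rufus bakes into the install media is essentially the well-known LabConfig registry bypass — you can also apply it by hand during Setup with Shift+F10 → regedit. These are the commonly circulated keys, with the caveat that Microsoft could stop honouring them in any future build:)

```
Windows Registry Editor Version 5.00

[HKEY_LOCAL_MACHINE\SYSTEM\Setup\LabConfig]
"BypassTPMCheck"=dword:00000001
"BypassSecureBootCheck"=dword:00000001
"BypassRAMCheck"=dword:00000001
```

Rufus just spares you from typing those in manually mid-install.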
|
|
X201
Full Member
Posts: 5,106
|
Post by X201 on Sept 12, 2024 21:23:35 GMT
Looks like they've patched Windows 11 itself to check for it after install, so even if you get it running it's going to be a case of finding new loopholes to keep it running.
|
|
|
Post by crashV👀d👀 on Sept 12, 2024 21:38:18 GMT
Wouldn't that mean any installs that hopped through that loophole will now fall over and become e-waste?
I have 3 laptops and all of them are older hardware where I used Rufus to install windows. None have fallen over yet (not saying they won't at some point though)
|
|
malek86
Junior Member
Pomegranate Deseeder
Posts: 3,247
|
Post by malek86 on Sept 15, 2024 14:07:58 GMT
Finally got my Ryzen 5700X3D. Upgrading wasn't as hard as I expected - I forgot my aftermarket cooler is set up so that it can be removed without any need to take out the mobo.
All I've been able to try so far is the Assassin's Creed Odyssey Discovery Tour, and my CPU time went from 14-16ms (with the 3600) to 10-11ms. So that's a big improvement to be sure. Even if the slow RAM were a limiting factor, it would be fine for my 60hz monitor until the end of the generation at least. So with relatively little money spent, I can definitely press on for another few years. When UE6 games start coming out on the PS6, I'll just upgrade the entire system.
|
|
Blue_Mike
Full Member
Meet Hanako At Embers
Posts: 5,365
Member is Online
|
Post by Blue_Mike on Sept 15, 2024 16:44:14 GMT
|
|
|
Post by crashV👀d👀 on Sept 17, 2024 8:06:17 GMT
Most of the 650 boards seem to be PCIe 4 for storage, with a select few offering PCIe 5. The ones that offer 5 seem to be up there in the £200 range, at which point isn't it just better to get a 670 board?
Am I missing something?
|
|
|
Post by Fake_Blood on Sept 17, 2024 8:41:59 GMT
Probably some common sense.
|
|
|
Post by crashV👀d👀 on Sept 17, 2024 10:10:02 GMT
Helpful.
The last time I looked at boards it was the difference between the 570 and the 550, and the 550 still offered the same-speed storage options, just fewer of them. Whereas here that doesn't seem to hold, depending on the board, as far as I can tell — and then we have the E variants on top.
|
|
|
Post by uiruki on Sept 17, 2024 11:03:49 GMT
The ones offering PCIE5 will be B650E, because AMD wanted another variant to upsell on the stack.
The main other connectivity difference between B650E and X670 is USB and SATA, where the X670 chipset has double the capacity: that gives 2 USB ports at 20Gbps and 12 at 10Gbps on the 670s, versus 1 and 6 on the 650s.
B650E gives you more PCIE5 lanes than non-E X670, but slightly fewer overall (B650E has 36 lanes, of which 24 can run at Gen5 speeds, while X670 is 44/8 and X670E is 44/24). A single NVME drive uses up to 4 lanes and a graphics card 16, so you've got room for two full-speed PCIE5 drives and your graphics card, then another 12 or 20 lanes for PCIE4. Only 8 Gen5 lanes on the X670 seems bad at first, but even the 4090 only runs at PCIE4 speeds, so you're unlikely to be hamstrung by that. That makes X670E hard to justify, depending on pricing.
My take, as someone who isn't in the market: if you want to use PCIE5 NVME drives, then consider B650E only if it's cheaper than X670; otherwise just go with B650. If you need the extra lanes and USB on X670/E, then you already know you need them. After that, choose your board by the features you want and the number of M.2 sockets.
In short: Just want a computer that runs all your peripherals at full speed? B650. Do you really want PCIE5 storage, something that no games currently take advantage of? B650E or X670, whichever is cheaper.
Do you want a halo product that unless you're doing workstation/heavy duty networking stuff you'll definitely not use at all? That's what X670E is for.
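Purely as a sanity check on that lane arithmetic, a quick sketch (chipset lane counts as quoted above; treating the GPU as x16 and each NVMe drive as x4):

```python
# Chipset lane budgets as quoted above: (total usable lanes, lanes that can run at Gen5).
PLATFORMS = {
    "B650E": (36, 24),
    "X670":  (44, 8),
    "X670E": (44, 24),
}

GPU_LANES = 16    # a graphics card takes x16
NVME_LANES = 4    # an NVMe drive takes up to x4

def lane_budget(platform: str, gen5_drives: int = 2):
    """Fit a x16 GPU plus `gen5_drives` Gen5 NVMe drives into the Gen5 pool.
    Returns (spare Gen5 lanes, remaining Gen4-capable lanes); a negative
    first value means the Gen5 devices don't all fit at full speed."""
    total, gen5 = PLATFORMS[platform]
    wanted = GPU_LANES + gen5_drives * NVME_LANES
    return gen5 - wanted, total - gen5

for name in PLATFORMS:
    spare5, gen4 = lane_budget(name)
    print(f"{name}: {spare5:+d} spare Gen5 lanes, {gen4} Gen4 lanes left over")
```

Which lands exactly on the "another 12 or 20 lanes for PCIE4" figures for B650E and X670E, and shows non-E X670 can't run GPU plus two drives all at Gen5.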
|
|
|
Post by Fake_Blood on Sept 17, 2024 11:51:49 GMT
helpful the last time i looked at boards its was the diff between 570 and 550. the 550 still offered the same speed storage option just less of it where-as this appears to not offer the same depending on board as far as I can tell and then we have E variants.
Only joking of course — didn't we have virtually the same enthusiast-level PC?
|
|
|
Post by crashV👀d👀 on Sept 17, 2024 12:03:53 GMT
Pretty much, I'm so enthusiastic.
I'm still AM4 with a 5800x3d and you went whole hog and got a 7800 didn't ya?
|
|
|
Post by crashV👀d👀 on Sept 17, 2024 12:05:14 GMT
uiruki appreciate that, thanks. It's pretty much what I was understanding, especially after digging into it a bit more this morning, but I needed the sanity check.
|
|
|
Post by Fake_Blood on Sept 21, 2024 11:23:06 GMT
Pretty much, I'm so enthusiastic. I'm still AM4 with a 5800x3d and you went whole hog and got a 7800 didn't ya?
Yeah, I went AM5 with the new PCI-whatever thing so I can switch to the faster NVMe drives whenever that'll make sense or make a measurable difference. I was switching from Intel though, so I needed new everything anyway.
|
|
|
Post by crashV👀d👀 on Sept 26, 2024 20:34:05 GMT
Pinch of salt time people: videocardz.com/newz/nvidia-geforce-rtx-5090-and-rtx-5080-specs-leaked
"GeForce RTX 5090 to feature 21760 CUDA cores, 32GB GDDR7 memory and 600W; RTX 5080 gets 16GB VRAM. Coming from Kopite7kimi himself.
One of the most reliable NVIDIA leakers has now confirmed the specs for two of NVIDIA's upcoming Blackwell graphics cards, specifically the RTX 5090 and RTX 5080. According to Kopite7kimi, the RTX 5090 is now said to feature a 512-bit memory bus, not 448-bit as mentioned earlier. The RTX 5090 will also feature 32GB of GDDR7 memory.
RTX 5090 with 170 SMs, 32GB VRAM, return of 512-bit bus: The flagship Blackwell gaming model is expected to use the GB202-300 GPU with 21,760 FP32 (CUDA) cores, which is fewer than the 24,576 cores of the full chip (a reduction of 13%). Additionally, the card is rumored to have a 600W spec, though the leaker has not confirmed whether that's the TBP or TGP, as both can refer to different metrics. Interestingly, although the power has increased from
|
|
X201
Full Member
Posts: 5,106
|
Post by X201 on Sept 26, 2024 21:49:54 GMT
|
|
X201
Full Member
Posts: 5,106
|
Post by X201 on Sept 26, 2024 21:53:35 GMT
I’ve been saving for ages to make the leap from my old 1080, so I’ll be getting one for my rebuild - after I’ve picked myself up from the floor after seeing the price. The extra RAM will be great for me - my 3D rendering software collapses if there are too many assets in my scenes.
|
|
|
Post by Chopsen on Sept 26, 2024 22:21:59 GMT
The last two generations of GPUs have been less about actual hardware innovation and more about physically bigger chips drawing more power to achieve any performance improvement. Oh, and some fancy-pants stuff on the driver side to make it look like more frames and detail were being drawn than actually were.
|
|
|
Post by captbirdseye on Sept 27, 2024 7:23:52 GMT
32GB of GDDR7 will be expensive as hell.
|
|
Frog
Full Member
Posts: 7,272
|
Post by Frog on Sept 27, 2024 7:41:20 GMT
I'm not sure that's fair, the frame generation is one of the biggest innovations in GPU technology for quite a while.
|
|
|
Post by Vandelay on Sept 27, 2024 7:51:33 GMT
Frame gen is pretty great. Even just for efficiency rather than actual performance gains. I'm playing Frostpunk 2 at the moment and I have locked it at 60fps and enabled frame gen. Gone from using 90%+ of my GPU to only 50-60%. It's not something I would do for an action game, as it means the game is really running at 30fps-ish, but perfect for a strategy/management game.
Of course, great for performance too. Even on my 4090 it can be hard to get the most out of my 144hz screen, even though it is only 1440p (ultra wide). Stick on frame gen though and it is no problem.
|
|
|
Post by dfunked on Sept 27, 2024 8:14:11 GMT
My 750w PSU is sweating right now...
In fairness I probably don't use my PC enough to justify upgrading my trusty 3080. VRAM limits aside it's still more than adequate for me.
|
|
|
Post by crashV👀d👀 on Sept 27, 2024 8:57:02 GMT
Some wild rumour going around that it'll require two of the new 16-pin power connectors, which surely can't be true.
|
|
Tomo
Junior Member
Posts: 3,490
|
Post by Tomo on Sept 27, 2024 9:09:31 GMT
My 750w PSU is sweating right now... In fairness I probably don't use my PC enough to justify upgrading my trusty 3080. VRAM limits aside it's still more than adequate for me. Yeah, same. My 3080 still feels like a monster tbh, unless I want to see nose hairs from 50ft. I'm also not convinced there are enough games coming out that actually need these bonkers cards. GPUs feel ahead of the games themselves atm in my mind.
|
|
|
Post by Vandelay on Sept 27, 2024 9:30:20 GMT
I upgraded my 500W PSU when I went from my 1070 to a 2080 Super. Picked up an 850W thinking that would be more than enough... then I got a 4090 and it's only just meeting the requirements. I wouldn't want to put anything beefier than that in, and the 5090 certainly looks like it's even more power hungry.
I do actually cap my 4090 at 80% power limit, as it becomes massively inefficient above that (I think some even drop it down to 70% with minimal effect). I did have it run full tilt initially and the PSU was fine, but I expect it would run into issues without the cap.
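For anyone wanting to try the same cap, a minimal sketch — assuming an NVIDIA card, the `nvidia-smi` tool that ships with the driver, and the 4090's stock 450W board power (check your own card's default with `nvidia-smi -q -d POWER` before using that figure):

```python
import shutil
import subprocess

DEFAULT_W = 450   # assumed stock 4090 board power; verify for your own card
PCT = 80          # cap at 80% of default, per the post above

limit = DEFAULT_W * PCT // 100   # 360 W
print(f"Target power limit: {limit} W")

# Only attempt the change if the NVIDIA driver tools are actually present.
# Setting the limit needs admin/root, and it resets on reboot unless
# re-applied (persistence mode keeps it while the driver stays loaded).
if shutil.which("nvidia-smi"):
    subprocess.run(["nvidia-smi", "-pm", "1"], check=False)       # persistence mode
    subprocess.run(["nvidia-smi", "-pl", str(limit)], check=False)  # set power limit
else:
    print("nvidia-smi not found; nothing to do")
```

Most card vendors' tools (MSI Afterburner etc.) expose the same power-limit slider if you'd rather not touch the command line.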
|
|
X201
Full Member
Posts: 5,106
|
Post by X201 on Sept 27, 2024 12:44:37 GMT
I’ve just been reading about a 2200W PSU. It’s bonkers that we’re getting close to electric kettle levels.
|
|
|
Post by Fake_Blood on Sept 27, 2024 13:36:01 GMT
I also have 850 watts from my previous build; it’s one of the reasons why I went from a 9900K to a 7800X3D — that thing uses like 60 watts when gaming.
|
|
Blue_Mike
Full Member
Meet Hanako At Embers
Posts: 5,365
Member is Online
|
Post by Blue_Mike on Sept 27, 2024 16:17:47 GMT
Also on 850w. Fucking hell, we should start installing solar panels connected to independent generators before long if we want to keep our 'leccy bills down after upgrading to new hardware.
|
|
|
Post by crashV👀d👀 on Sept 27, 2024 17:48:14 GMT
I swapped my 750W for a 1000W because my 4090 was causing it to whine, plus there's the added benefit of a direct power cable without splitters/adapters.
Might need to have a separate PSU just for the GPU in the future at this rate
|
|
|
Post by Chopsen on Sept 27, 2024 18:59:32 GMT
You're going to need a separate *wall socket* for your GPU at this rate. It's fucking nuts.
|
|
|
Post by crashV👀d👀 on Sept 27, 2024 23:12:18 GMT
|
|