zagibu
Junior Member
Posts: 1,946
Post by zagibu on May 22, 2024 15:19:27 GMT
Excuse my total ignorance, but how did Intel (and AMD I guess) suddenly get so far behind the curve when it comes to processors?
They didn't really get behind the curve. They just stuck with their original philosophy and kept everything backwards compatible. But nowadays power efficiency is more important than raw performance in a lot of applications (mobile, cloud), which makes ARM, or native RISC chips in general, an easy choice there. Also, many workloads now don't benefit from a complex instruction set, because they are very specialized and uniform in what they do, so even for some HPC stuff you don't gain enough to offset the lower power efficiency.
The reason some companies are also moving away from x86 in areas where neither of the above applies is mostly internal logistics, I would guess. It probably doesn't make sense for Apple to roll out hardware with two very different ecosystems if they can get away with only ARM, especially because the x86 platform is also prone to more security issues due to its much higher complexity.
But that's just my guesses. I'm a professional software developer, but I'm actually very far away from the hardware, so somebody else might be better informed to answer this question.
Post by Phattso on May 22, 2024 15:20:39 GMT
Kind of feels like Intel has already answered the tiny low-power fanless PC requirement with the N100, for example. I got an N100 mini PC for a silly amount of money from AliExpress and it's surprisingly competent even as a daily driver machine for the size and cost, and it natively runs everything. Ableton would be the only real blocker for me switching my laptop. If it wasn't for that I'd probably have switched to *nix yonks ago, as there aren't really any other native Windows apps I need. Switching my desktop would be a much harder sell.
The difference is that the ARM chips are going toe to toe with the big Intel and AMD chips. The N100 is "ok for what it is", but you can have cool, quiet, AND obscenely powerful. Again: the Apple M-series has pointed the way. My Ableton projects stopped glitching as soon as I binned the Intel and went M1.
malek86
Junior Member
Pomegranate Deseeder
Posts: 3,247
Post by malek86 on May 22, 2024 18:15:36 GMT
Intel had a good product and no real competition, so they had a stretch of 6 years where they were stuck at 14nm and just kept adding more cores and upping the frequency. So: a successful product, no real competition, and a company run by bean counters who don't want to change anything as long as the line on the chart keeps pointing up. Intel did try moving away from 14nm, but their 10nm efforts went completely bust and then they had to scramble to get 7nm working in time to avoid falling too far behind AMD. Anyway, it doesn't matter, because I imagine even an average ARM CPU will probably still be a better pick for mobile devices than the most efficient x86 CPU. The big question mark is software compatibility. Chromebooks didn't really catch on, but then maybe people really wanted Windows.
Post by Jambowayoh on May 22, 2024 18:24:58 GMT
Kind of feels like Intel has already answered the tiny low-power fanless PC requirement with the N100, for example. I got an N100 mini PC for a silly amount of money from AliExpress and it's surprisingly competent even as a daily driver machine for the size and cost, and it natively runs everything. Ableton would be the only real blocker for me switching my laptop. If it wasn't for that I'd probably have switched to *nix yonks ago, as there aren't really any other native Windows apps I need. Switching my desktop would be a much harder sell.
The difference is that the ARM chips are going toe to toe with the big Intel and AMD chips. The N100 is "ok for what it is", but you can have cool, quiet, AND obscenely powerful. Again: the Apple M-series has pointed the way. My Ableton projects stopped glitching as soon as I binned the Intel and went M1.
It really is incredible how impressive the M chips are.
Post by Phattso on May 22, 2024 18:52:43 GMT
Indeed! Our work machines were recently upgraded to M3 MacBook Pros with 64GB (genuinely necessary for the work we do) and it really does feel like an impossible amount of power for a unit I can work hard for 8 hours on one charge and literally never hear. It doesn't even get particularly warm.
Game changer in every sense. And for the Studio boxes they staple two of those fuckers together in “Ultra” configuration. It’s scary good. And, Apple being Apple, insultingly priced.
Post by Jambowayoh on May 22, 2024 19:00:07 GMT
I do wonder what would have happened if Nvidia had been allowed to buy ARM.
Tomo
Junior Member
Posts: 3,490
Post by Tomo on May 22, 2024 20:12:20 GMT
Yeah, I hate Apple generally, but the M chips are really good. Absolute pain in the ass programming on them a couple of years ago whilst software devs had to fix all their packages, but now it's mostly smooth sailing.
dmukgr
Junior Member
Posts: 1,516
Post by dmukgr on May 22, 2024 20:43:41 GMT
I do wonder what would have happened if Nvidia had been allowed to buy ARM.
I'd be richer in my job (as I'm in this industry, and some of the speculation in this thread isn't that accurate really; I worked on the M cores for many years before they existed in silicon).
zephro
Junior Member
Posts: 2,990
Post by zephro on May 23, 2024 19:34:56 GMT
Recall is already getting investigated by the UK data protection (?) watchdog.
Excuse my total ignorance, but how did Intel (and AMD I guess) suddenly get so far behind the curve when it comes to processors? It doesn't seem like the kind of industry where you could just rock up and compete with no prior experience, and they should have had a lock on all the most experienced engineers, and all the decades of knowledge and patents.
Eh, it's somewhat illusory. Most stuff except x86 has been RISC since the 80s. PowerPC chips (as seen in old MacBooks, the 360 and PS3) were RISC, and ARM was originally designed for the BBC Micro, in effect. So it's not even remotely new. They've always been more efficient, which hypothetically means that if you clocked one high enough they always had a raw power advantage over CISC: less power per instruction means that if you push the same power through it, you get more instructions done.
The main changes are economic. Amazon and Apple have gone in designing their own ARM chips, as has Nvidia with Tegra, whereas previously the R&D was going into making them lower power for mobiles, as that was the way for Qualcomm etc. to grow. Now there are tech giants trying to get the performance out of them.
That all said, since Core 1 decades ago, Intel chips have internally run something that's basically RISC. The pipeline is heavily based on RISC principles and nothing like the 40-stage pipeline in the Pentium 4, which was the last true CISC design. Internally, the first stage of a modern Intel chip is a decode layer into an internal RISC system.
Intel maintained a genuine lead for most of this time because they added more "features" for peak performance: integrated GPUs, SIMD extensions, hyper-threading etc., the new hotness being AI co-processors. The main change of late has been Apple, Amazon and Google actually adding those features to ARM (or paying ARM to do the R&D).
Back when I were a student, the new hotness R&D from ARM was clockless asynchronous processors.
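If the "decode layer into an internal RISC system" bit sounds abstract, here's a toy sketch in Python of the general idea: a complex CISC-style instruction that reads, modifies and writes memory in one go gets cracked by the front end into simple load/compute/store micro-ops. The instruction and micro-op names are completely made up for illustration; this is not Intel's actual micro-op format.

```python
# Toy illustration only: crack one CISC-style read-modify-write instruction
# into RISC-like micro-ops, roughly the job of a modern x86 front-end decoder.

def decode_cisc(instruction: str) -> list[str]:
    """Split a complex instruction into simple load/compute/store micro-ops."""
    if instruction == "ADD [mem], reg":        # CISC: memory operand, one instruction
        return [
            "LOAD  tmp <- [mem]",              # micro-op 1: fetch the operand
            "ADD   tmp <- tmp + reg",          # micro-op 2: do the arithmetic
            "STORE [mem] <- tmp",              # micro-op 3: write the result back
        ]
    return [instruction]                       # simple ops pass through unchanged

if __name__ == "__main__":
    for uop in decode_cisc("ADD [mem], reg"):
        print(uop)
```

The back end then schedules and executes those micro-ops much like a RISC core would, which is why the "x86 vs ARM" argument is mostly about the decoder and the economics rather than what executes underneath.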
Post by Phattso on May 24, 2024 7:08:00 GMT
Of late?! Apple and Co added Tensor cores to their chips about 8 years ago. It’s how on-device photo manipulation went from wank to world class.
Fake_Blood
Junior Member
Posts: 1,642
Post by Fake_Blood on May 24, 2024 9:43:55 GMT
Yeah, here I’m complaining that windows will be taking screenshots, but meanwhile Apple knows everything in my photo library.
Post by Phattso on May 24, 2024 10:29:06 GMT
Yeah. It's somewhat terrifying that the Apple Photo of the Day thing can pick out a face that's about five pixels high and put it in the right category. But I LOVE the feature, so fuck it. They can have my data.
Post by uiruki on May 24, 2024 10:44:02 GMT
I think there's a big difference between a camera app looking at what's in the viewfinder when you press a button and something just slurping everything that appears on your screen, and it's the kind of difference AI tech bro types will try to make seem smaller than it is in order to make this kind of shit a fait accompli.
It's part of the reason the hype artists have been pushing to group previously existing ML-driven processes, like computational photography, under the 'AI' banner alongside the massive-scale theft of work that is Midjourney and the various LLMs that have sprung up.
Post by Jambowayoh on May 24, 2024 11:06:45 GMT
Yeah. It's somewhat terrifying that the Apple Photo of the Day thing can pick out a face that's about five pixels high and put it in the right category. But I LOVE the feature, so fuck it. They can have my penis. FTFY
Post by Phattso on May 24, 2024 11:22:55 GMT
Yeah. It's somewhat terrifying that the Apple Photo of the Day thing can pick out a face that's about five pixels high and put it in the right category. But I LOVE the feature, so fuck it. They can have my penis. FTFY
Mate, if they wanna put close-up pics of my oozing bangle into one of their image models, more power to them. I think I might have just put myself off my own lunch with that one. Jesus.
Fake_Blood
Junior Member
Posts: 1,642
Post by Fake_Blood on May 24, 2024 12:18:16 GMT
Yeah. It's somewhat terrifying that the Apple Photo of the Day thing can pick out a face that's about five pixels high and put it in the right category. But I LOVE the feature, so fuck it. They can have my data.
Luckily it's all handled locally right? Right?? I'll be honest though, I trust Apple more than Microsoft.
zephro
Junior Member
Posts: 2,990
Post by zephro on May 24, 2024 19:20:41 GMT
Of late?! Apple and Co added Tensor cores to their chips about 8 years ago. It's how on-device photo manipulation went from wank to world class.
The first superscalar CPU was in the 60s and the first common implementation was the Pentium 1 in the mid 90s. RISC designs like ARM fundamentally date to the 80s. Silicon changes slowly. Plus the neural engine and tensor chips aren't that specialised compared to a GPU, or they weren't originally. But the move to heterogeneous computing with SoCs is more of an economic change regardless; it's been in progress for 15 years or so.
Post by Phattso on May 24, 2024 19:25:10 GMT
Sure. But, like the process nodes, the gaps between major steps have been shortening. Eight years is an eternity in the present tech world. Plus there are more players now, and software is playing a bigger part. But your post just reminds me how much I miss the 80s and 90s, when it felt like one could understand it all. Now there's just too much.
zephro
Junior Member
Posts: 2,990
Post by zephro on May 24, 2024 20:21:27 GMT
Sure. But, like the process nodes, the gaps between major steps have been shortening. Eight years is an eternity in the present tech world. Plus there are more players now, and software is playing a bigger part. But your post just reminds me how much I miss the 80s and 90s, when it felt like one could understand it all. Now there's just too much.
I feel like we're finally returning to something more like the 90s. When I was a student (in the 00s) learning all this stuff, there were all sorts of major players making chips: Sun with SPARC, IBM with PowerPC, ARM, Motorola, plus all the mad shit like custom Cray CPUs. Though that was already a reduction from the glory days. By the 2010s we hit an absolutely dreary period of everything being either x86 or ARM (but only in mobile devices). This is getting back to the fun times of co-processors for specific jobs.
Though I'm not sure the major steps are all that shortened; they're by and large incremental. Like when I did my PhD, everyone was adamant that the MESI protocol for cache coherence could not scale past 16 cores on a die. Lo and behold, that basically still holds true; Intel are only edging around it because of the "efficiency" core thing. They've just got a shitload more cache on chip these days.
Feels kinda fun. But also people have been saying this shit for decades; Intel's dominance just kinda kept it under wraps.
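For anyone who hasn't met it, MESI is the classic four-state cache coherence protocol (Modified, Exclusive, Shared, Invalid). Here's a heavily simplified sketch in Python, illustrative only, of how one cache line's state in one core's cache moves as local and remote accesses happen; real implementations also have to handle the bus/interconnect traffic, write-backs and races, which is exactly the part that gets painful as core counts grow.

```python
# Minimal, simplified MESI sketch: state of ONE cache line in ONE core's cache.
from enum import Enum

class State(Enum):
    MODIFIED = "M"   # this cache holds the only copy, and it's dirty
    EXCLUSIVE = "E"  # this cache holds the only copy, and it's clean
    SHARED = "S"     # other caches may also hold clean copies
    INVALID = "I"    # this cache's copy must not be used

def next_state(state: State, event: str) -> State:
    """Events: local_read, local_write, remote_read, remote_write (as seen on the bus)."""
    if event == "local_write":
        return State.MODIFIED                   # writing gives us the (only) dirty copy
    if event == "local_read":
        # On a miss from INVALID we'd load the line; E vs S depends on whether
        # anyone else holds it -- assume SHARED here for simplicity.
        return State.SHARED if state == State.INVALID else state
    if event == "remote_read":
        # Another core read the line: a dirty/exclusive copy gets downgraded.
        return State.SHARED if state in (State.MODIFIED, State.EXCLUSIVE) else state
    if event == "remote_write":
        return State.INVALID                    # our copy is now stale
    raise ValueError(f"unknown event: {event}")

if __name__ == "__main__":
    s = State.INVALID
    for ev in ["local_read", "remote_write", "local_write", "remote_read"]:
        s = next_state(s, ev)
        print(ev, "->", s.value)   # I->S, S->I, I->M, M->S
```

Every remote write forces an invalidation broadcast, which is why snooping-based MESI gets harder to keep fast as you add cores and why the big-cache, directory-and-efficiency-core workarounds exist.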
crashV👀d👀
Junior Member
not just a game anymore...
Posts: 3,856
Post by crashV👀d👀 on Jun 21, 2024 9:49:35 GMT
Post by Jambowayoh on Jun 21, 2024 9:53:22 GMT
I wonder if I should invest in some ARM shares seeing as how Nvidia are seemingly the world's most valuable company.
dmukgr
Junior Member
Posts: 1,516
Post by dmukgr on Jun 21, 2024 10:20:01 GMT
ARM are making an interesting play for their customers' markets, moving more towards SoCs. They have seven years' worth of royalties in the pipeline, which means it's a good time to take this risk; it's them attempting to replicate Nvidia's success.
Not that that helps you with your decision on shares. I have no idea, and I don't know how the RISC threat will play out either, though I am personally invested in the latter, through work.
dmukgr
Junior Member
Posts: 1,516
Post by dmukgr on Jun 21, 2024 10:22:14 GMT
Everyone's favourite grandpa investor, Buffett, always recommends you invest in what you know. I guess I need to purchase a load of shares in mustard.
crashV👀d👀
Junior Member
not just a game anymore...
Posts: 3,856
Post by crashV👀d👀 on Sept 20, 2024 21:28:39 GMT
Post by Chopsen on Sept 20, 2024 21:45:42 GMT
Do Qualcomm have their own fabrication plants? Intel's are so big they can't even fill the manufacturing capacity with demand for their own chips, and there was on/off talk about spinning the manufacturing bit off entirely.
zephro
Junior Member
Posts: 2,990
Post by zephro on Sept 20, 2024 23:07:16 GMT
Lol, what a crock of shit. Intel may be struggling through a rebuild at the moment, but their long-term prospects are still solid even if they're not the Nvidia darling of this world. Considering the article is basically just gaming and AI references, I'm not sure they've paid attention to the insane number of regular laptop/desktop and server chips Intel sell. They're not flashy or going to increase your share price, but it's much like selling a fuckton of Ford Fiestas.