Former Intel CPU engineer details how internal x86-64 efforts were suppressed prior to AMD64's success
(www.tomshardware.com)
Itanium was an interesting architecture, but it relied on compilers to build efficient code for its bundles. Compare that to x86, which dedicates loads of silicon to getting the best performance even out of mediocre code. Unfortunately, the Itanic architecture was also really ill suited to emulating x86.
ETA: rm dups.
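To make the trade-off concrete (a rough sketch of my own, not anything from the article or the engineer's account): the loop below has plenty of instruction-level parallelism across iterations. An EPIC compiler targeting Itanium has to find that parallelism at build time and pack independent operations into explicit bundles, while an out-of-order x86 core finds it at run time with its scheduling hardware, even from mediocre compiler output.

```c
#include <stddef.h>

/* Each iteration is independent of the others, so there is parallelism
 * to exploit. On IA-64 the compiler must expose it statically (unrolling,
 * software pipelining, explicit instruction bundles); on x86 the
 * out-of-order core extracts it dynamically from whatever instruction
 * stream it receives. */
void saxpy(float *restrict y, const float *restrict x, float a, size_t n)
{
    for (size_t i = 0; i < n; i++) {
        y[i] = a * x[i] + y[i];   /* load, multiply, load, add, store */
    }
}
```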
It dedicates dedicated silicon?
All the cache and prediction logic that eventually gave us Spectre is basically compensating for whatever crap random compilers kick out.
I was only making a joke, and not asking for clarification. Do you not see the redundancy of saying "dedicates dedicated"? I thought my reply would have made it obvious.