Not bad overall, although I don't know where they got that "Atari 2600 Pitfall" screenshot. Not only is it not taken from the actual game, the 2600 couldn't even display that image. It looks like someone who only half-remembered the game drew it in MS Paint.
If you might be interested but don't want to click a random link without knowing what it's for, this is a video about the history of a Mario 64 speedrun category, 120 stars, which involves collecting every star in the game.
I think it depends a lot on a person's individual knowledge. If you keep studying far enough away from your main area of expertise, there'll still be some point where you stop and have to blindly accept that something "just works", but it will no longer feel like that's what your main field is based upon.
Imagine a chef. You can be an OK chef just by memorizing facts and getting a "feel" for how recipes work. Many chefs study chemistry to better understand how various cooking/baking processes work. A few might even get into the physics underlying the chemical reactions just to satisfy their curiosity. But you don't need to go all the way down to subatomic particles before cooking stops feeling like it's based on mysterious unknowns.
For my personal interest, I've learned about compilers, machine code, microcode and CPU design, down to transistor-based logic. Most of this isn't directly applicable to modern programming, and my knowledge still ends at a certain point, but programming itself no longer feels like it's built on a mystery.
I don't recommend that every programmer go to this extreme, but we don't have to feel that our work is based on "magic smoke" if we really don't want to.
ADDED: If anyone's curious, I highly recommend Ben Eater's YouTube videos about "Building an 8-bit breadboard computer!" It's a playlist/course that covers pretty much everything starting from an overview of oscillators and logic gates, and ending with a simple but functional computer, including a CPU core built out of discrete components. He uses a lot of ICs, but he usually explains what circuits they contain, in isolation, before he adds them to the CPU. He does a great job of covering the important points, and tying them together well.
AI coding is an improvement on infinite-monkey Shakespeare in the sense that it only types whole words from a dictionary. Although that dictionary was built from a mix of classic literature and social media posts.
So this is a list of responses given by AI when you correct it? My guess was "Things you will never hear from a client when you politely point out a logical inconsistency, an incorrect assumption, or a wild over/underestimation in their project plan." 'Cause in my experience the response you will get, 99% of the time, is "That won't happen."
Ha! I was a Mega Drive fan as a teen, and I got really angry about this... until I realized that you were talking about the Mega CD and 32X specifically. Yep, there really weren't many good games for either of them.
I think the DC had the technical strength to go up against the PS2, not just early on, but for quite a while. The PS2 is incredibly flexible in theory, but looking at its library it seems like most developers just used Sony's default rendering setups. If you ignore the quickie PS1-to-DC ports and only compare titles which got equal effort from developers, it can be hard to tell the difference, and in some cases I'd even say the DC version looks a little nicer.
In this alternate universe where the DC didn't get killed off prematurely, what might've eventually turned the tide for the PS2 is its extra RAM: roughly twice the main memory, or closer to 1.5 times as much if you count the total across main, video, and sound RAM. Although that advantage might not have existed at all if it weren't for the large gap between their release dates.
But Sony could afford to delay for two years; consumers waited for them. Sega couldn't sustain launch-pitch marketing for that long, especially with an actual console on store shelves that people could experience firsthand, as opposed to teaser videos of what the console "might" be capable of. Few publishers or consumers wanted to invest in a console before the clear winner of the previous generation had entered the market.
All that being said, I don't know that the DC was really under-utilized, in technical terms. I feel like a good proportion of the games in its library are using almost all of the power it had under the hood. Perhaps Sega's management and engineers had learned their lesson from the Saturn, because the DC seems very straightforward from a programming perspective. It's almost ironic that it lost to the PS2, which took flexibility and parallelism to heart at least as much as the Saturn did, if not more.
Also, the battery life was hideously short. It would suck down a set of 6 AAs in less than 3 hours. I suspect that the CCFL backlight on the LCD screen was the culprit. And the console was huge. I have the official belt pouch and as a teen it reached most of the way down to my knee. The redesign was a bit smaller, but not much.
A lot of the games sucked, but there were some pretty good ones too. Just not enough games overall, I think.
How do you think your A.R.S.E. compares to Microsoft's planned Binary Universal Technology Translation System, and Sony's upcoming Original Software Heuristic Inter-platform Real-time Interpreter?
I like the big offering from MS, I cannot lie. Sony's outline looks well-rounded, too. I searched online, but I haven't seen any real details about your system. Even after I put down my phone and got on my desktop to type on a proper keyboard, I couldn't find A.R.S.E. with both hands.
Also available on itch (and currently on sale there, just as it is on Steam), for those who might like to know.
ASM doesn't care about your variable types, because it doesn't care about your variables. What's a variable, anyway? There is only address space.
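A minimal C sketch of the same idea, purely for illustration (the names and values here are my own, not from anything above): the same four bytes in memory are a float or an integer depending only on how you ask for them.

    #include <stdio.h>
    #include <string.h>
    #include <inttypes.h>

    int main(void) {
        /* To the hardware this is just 4 bytes at some address;
           "float" is a story the compiler tells on our behalf. */
        float f = 1.0f;
        uint32_t bits;

        /* Reread those same bytes as a different "type"; the CPU
           never knew the difference. */
        memcpy(&bits, &f, sizeof bits);
        printf("float 1.0 in memory: 0x%08" PRIX32 "\n", bits); /* 0x3F800000 */
        return 0;
    }

(Using memcpy rather than a pointer cast keeps this well-defined C, but the compiled machine code is typically just a plain load either way.)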
I've had to interact with too many people who say this with a straight face.