Redkey

joined 2 years ago
[–] Redkey@programming.dev 4 points 1 week ago

Gain Ground and Arcus Odyssey both got many hours of play on my Mega Drive back in the day. :)

[–] Redkey@programming.dev 4 points 2 weeks ago* (last edited 2 weeks ago)

I'm not too knowledgeable about the detailed workings of the latest hardware and APIs, but I'll outline a bit of history that may make things easier to absorb.

Back in the early 1980s, IBM was still setting the base designs and interfaces for PCs. The last video card they released that became an accepted standard was VGA. It was a standard because no matter whether the system your software ran on had an original IBM VGA card or a clone, you knew that calling interrupt X with parameters Y and Z would have the same result. You knew that in 320x200 mode (you knew there would be a 320x200 mode) you could write to the display buffer at memory location ABC, that what you wrote needed to be bytes indexing a colour table at another fixed address in the memory space, and that the ordering of pixels in memory was left-to-right, then top-to-bottom. It was all very direct, without any middleware or software APIs.
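
Just to make that concrete, here's a rough sketch of what DOS-era direct VGA access looked like in C (mode 13h, the classic 320x200 256-colour mode). The interrupt, mode number, and segment are the well-known published values; the helper names and compiler assumptions are just illustrative:

```c
/* Sketch of DOS-era VGA mode 13h programming (320x200, 256 colours).
   Assumes a 16-bit real-mode DOS compiler such as Turbo C or Open Watcom;
   helper names are illustrative, not from any particular codebase. */
#include <dos.h>

static unsigned char far *vga = (unsigned char far *)MK_FP(0xA000, 0x0000);

void set_mode_13h(void)
{
    union REGS r;
    r.x.ax = 0x0013;      /* AH=0x00 (set video mode), AL=0x13 (320x200x256) */
    int86(0x10, &r, &r);  /* BIOS video interrupt */
}

void put_pixel(int x, int y, unsigned char colour_index)
{
    /* Pixels are stored left-to-right, top-to-bottom, one byte per pixel;
       each byte is an index into the 256-entry palette. */
    vga[y * 320 + x] = colour_index;
}
```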

But IBM dragged their feet over releasing a new video card to replace VGA. They believed that VGA still had plenty of life in it. The clone manufacturers started adding little extras to their VGA clones. More resolutions, extra hardware backbuffers, extended palettes, and the like. Eventually the clone manufacturers got sick of waiting and started releasing what became known as "Super VGA" cards. They were backwards compatible with VGA BIOS interrupts and data structures, but offered even further enhancements over VGA.

The problem for software support was that it was a bit of a wild west in terms of interfaces. The market quickly solidified around a handful of "standard" SVGA resolutions and colour depths, but under the hood every card had quite different programming interfaces, even between different cards from the same manufacturer. For a while, programmers figured out tricky ways to detect which card a user had installed, and/or let the user select their card in an ANSI text-based setup utility.

Eventually, VESA standards were created, and various libraries and drivers were produced that took a lot of this load off the shoulders of application and game programmers. We could make a standardised call to the VESA library, and it would have (virtually) every video card perform the same action (if possible, or return an error code if not). The VESA libraries could also tell us where and in what format the card expected to receive its writes, so we could keep most of the speed of direct access. This was mostly still in MS-DOS, although Windows also had video drivers (for its own use, not exposed to third-party software) at the time.
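
As a simplified illustration, a VBE mode-set looked roughly like this on any compliant card, regardless of manufacturer. The function number and success code come from the VBE spec; everything else here is just a sketch:

```c
/* Minimal sketch of a VESA VBE mode-set from real-mode DOS C.
   Function 0x4F02 and the 0x004F success code are from the VBE spec;
   the surrounding code is illustrative only. */
#include <dos.h>

int set_vesa_mode(unsigned short mode)
{
    union REGS r;
    r.x.ax = 0x4F02;      /* VBE function 02h: set video mode */
    r.x.bx = mode;        /* e.g. 0x101 for 640x480 at 256 colours */
    int86(0x10, &r, &r);
    return (r.x.ax == 0x004F) ? 0 : -1;  /* AL=0x4F: supported, AH=0x00: success */
}
```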

Fast-forward to the introduction of hardware 3D acceleration into consumer PCs. This was after the release of Windows 95 (sorry, I'm going to be PC-centric here, but 1: it's what I know, and 2: I doubt that Apple was driving much of this as they have always had proprietary systems), and using software drivers to support most hardware had become the norm. Naturally, the 3D accelerators used drivers as well, but we were nearly back to that SVGA wild west again; almost every hardware manufacturer was trying to introduce their own driver API as "the standard" for 3D graphics on PC, naturally favouring their own hardware's design. On the actual cards, data still had to be written to specific addresses in specific formats, but the manufacturers had recognized the need for a software abstraction layer.

OpenGL on PC evolved from an effort to create a unified API for professional graphics workstations. PC hardware manufacturers eventually settled on OpenGL as a standard that their drivers would support. At around the same time, Microsoft had seen the writing on the wall with regard to games in Windows (they sucked) and had started working on the "WinG" graphics API back in the Windows 3.1 days; after a time, that became DirectX. Originally, DirectX only supported 2D video operations, but Microsoft worked with hardware manufacturers to add 3D acceleration support.

So we still had a bunch of different hardware designs, but they shared a lot of fundamental similarities, which allowed for a standard API that could translate fairly easily for all of them. And this is how the hardware and APIs have continued to evolve hand-in-hand: from fixed pipelines in early OpenGL/DirectX, to less-dedicated hardware units in later versions, to the extremely generalized parallel hardware that prompted the introduction of Vulkan, Metal, and the latest DirectX versions.

To sum up, all of these graphics APIs represent a standard "language" for software to use when talking to graphics drivers, which then translate those API calls into the correctly formatted reads and writes that actually make the graphics hardware jump. That's why we sometimes have issues when a manufacturer's drivers don't implement the API correctly, or when the API specification has a point that isn't defined clearly enough, so some drivers interpret a call one way while others interpret the same call slightly differently.

[–] Redkey@programming.dev 3 points 2 weeks ago* (last edited 2 weeks ago)

In my (admittedly limited) experience, SDL/SDL2 is more of a general-purpose library for dealing with different operating systems than an abstraction over graphics APIs. While it does include a graphics abstraction layer for simple 2D drawing, many people use it just to have the OS set up a window and a process, handle whatever other housekeeping is needed, and create and attach a graphics surface or context to that window. Then they talk to that surface directly, using the appropriate graphics API rather than SDL. I've done it with OpenGL, but my impression is that using Vulkan is very similar.
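
For example, a bare-bones SDL2-plus-OpenGL setup looks roughly like this (a minimal sketch, not production code; error checking omitted):

```c
/* Minimal sketch of using SDL2 only for windowing while talking to OpenGL
   directly, as described above. Error handling is mostly omitted. */
#include <SDL2/SDL.h>
#include <SDL2/SDL_opengl.h>

int main(int argc, char *argv[])
{
    SDL_Init(SDL_INIT_VIDEO);

    /* SDL handles the OS-specific window and context plumbing... */
    SDL_Window *win = SDL_CreateWindow("demo",
        SDL_WINDOWPOS_CENTERED, SDL_WINDOWPOS_CENTERED,
        640, 480, SDL_WINDOW_OPENGL);
    SDL_GLContext ctx = SDL_GL_CreateContext(win);

    /* ...but drawing goes straight to the graphics API, not through SDL. */
    glClearColor(0.2f, 0.3f, 0.4f, 1.0f);
    glClear(GL_COLOR_BUFFER_BIT);
    SDL_GL_SwapWindow(win);

    SDL_Delay(2000);

    SDL_GL_DeleteContext(ctx);
    SDL_DestroyWindow(win);
    SDL_Quit();
    return 0;
}
```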

SDL_gui appears to sit on top of SDL/SDL2's 2D graphics abstraction to draw custom interactive UI elements. I presume it also grabs input through SDL and runs the whole show, just outputting a queue of events for your program to process.

[–] Redkey@programming.dev 2 points 2 weeks ago (1 children)

I'm a bit confused about this story synopsis, to be honest. How much does the home video game crash of 1983 have to do with video game arcades?

I'll admit that I was still very young at the time, and wasn't living in the USA. I had no idea about the crash until decades later when I read about it as a historical event. Looking back I can recall that console games in my country were a bit thin on the ground for two or three years, but home computers (the 8-bit and 16-bit kinds, not PC/Mac) and arcades kept going strong.

Video game arcades and random video games tucked into corners of takeaway shops and shopping malls were still a common thing for me well into the 1990s. They only really started dying out rapidly in the 2000s due to increasing competition from home systems and computers (this time it was PC/Mac) in terms of power and networking, along with the rise of Internet-connected feature phones and then smartphones, which gave many people an ever-present source of time-killing entertainment in their pocket.

Dedicated arcades with big cabinet games involving custom controllers, displays, or other special features hung on for a while, but those machines rarely appeared anywhere else, and in the end even most of those arcades couldn't keep drawing enough customers to remain profitable.

[–] Redkey@programming.dev 19 points 2 weeks ago* (last edited 2 weeks ago) (3 children)

At the very least, please state which section you made small changes to, even if you are sure it's not worth mentioning what or why.

[–] Redkey@programming.dev 4 points 2 weeks ago

Maybe we could treat the appearances of recognizable, non-living entities in games (cars, buildings, airplanes, etc.) the same way we treat musical scores; the producer would be legally obligated to pay some reasonable, small, fixed fee per use to the original creator, and that creator wouldn't be allowed to object. And this wouldn't entitle the producer to use any trademarked brand or model name, just the form.

[–] Redkey@programming.dev 8 points 3 weeks ago (1 children)

Because the Gameboy logo check and the actual display of the logo happen separately, there were ways to pass the check while still displaying a different logo on the screen. Given that I bought cartridges from major retailers that did this, I'm guessing that Nintendo either didn't know about them, or didn't like their odds in court.

Sega was doing something conceptually similar around the same time, and that did get tested at trial (Sega v. Accolade), where the court ruled that Sega could go suck a lemon. So there's some doubt as to whether any of this is enforceable anyway, although Sega kept including a similar system in their hardware up to and including the Dreamcast.

Of course, a company as large as Nintendo could just bankrupt a lot of smaller companies with legal fees via delaying tactics.

[–] Redkey@programming.dev 7 points 3 weeks ago

It's weird to me how GIMP and Krita clearly share a large amount of code under the hood, and even some UI design, but at the same time it feels so much less painful to draw illustrations in Krita than in GIMP. I'm glad I gave it a try.

[–] Redkey@programming.dev 5 points 3 weeks ago

I think that like a great many game mechanics, the fact that it's been done badly many times doesn't mean that it can't be done well.

[–] Redkey@programming.dev 2 points 3 weeks ago

I totally agree with the article's author. He mirrors my own feelings perfectly with the line "I can only say, with the best will in the world, that I'll believe it when I see it at this point."

I still look forward to playing Routine, and I believe that it will be released before I die (unforeseen accidents notwithstanding), but beyond that I'm not holding my breath.

[–] Redkey@programming.dev 4 points 4 weeks ago

Child's play compared to what you'd need to do on a modern chip.

I don't think it's the chips, but the operating environments. Modern CPUs offer dozens of multipurpose registers and many more instructions and addressing modes than those old, low-cost CPUs did, which should make things easier, not harder. But no-one's building old-style dedicated systems around modern CPUs; our code now has to play nice with firmware, the OS, libraries, and other processes, including resource management and preemption.

Compare a single-gear go-kart to an automatic sedan. Getting top performance out of the go-kart on a closed track is difficult and requires nuance. If we could drive the automatic sedan around the same closed track, we could easily demolish the go-kart, and not just with raw engine power. The improved acceleration, braking assist, and power steering are enough. But when we drive the sedan we're usually doing it on public roads with traffic signals, intersections, speed limits, and other road users. That's what's more difficult.

[–] Redkey@programming.dev 2 points 1 month ago

Apparently the original game and the Brood War expansion are free to install through the Battle.net launcher these days.

If you have the original discs, the later official patches let you copy the "mpq" files from the CD into the game's directory so you no longer need the disc in the drive. Of course, you'll still need a drive for the initial installation. That should work for single player (it's been a few years since I last did it), but I don't know about online multiplayer.
