Redkey

joined 2 years ago
[–] Redkey@programming.dev 7 points 3 months ago

I was so triggered by the conversion from char-to-int-to-string-to-packedint that I had to write a version that just does char-to-packedint (and back again) with bitwise operators.

https://pastebin.com/V2An9Xva

As others have pointed out, there are probably better options for doing this today in most real-life situations, but packing like this might make sense on old low-spec systems, if not for all the intermediate conversion steps; that's why I wrote this.
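
For anyone who doesn't want to click through, here's a minimal sketch of the general idea in C (not the pastebin code itself): shift each character into a byte-sized field of an integer, and mask it back out again.

```c
/* Minimal sketch of the idea (not the linked pastebin code):
   pack up to four 8-bit chars straight into a 32-bit int and
   back, with nothing but shifts and masks. */
#include <stdint.h>
#include <stdio.h>

static uint32_t pack4(const char *s)
{
    uint32_t v = 0;
    int i;
    for (i = 0; i < 4 && s[i]; i++)
        v |= (uint32_t)(unsigned char)s[i] << (8 * i);
    return v;
}

static void unpack4(uint32_t v, char *out)  /* out must hold 5 bytes */
{
    int i;
    for (i = 0; i < 4; i++)
        out[i] = (char)((v >> (8 * i)) & 0xFF);
    out[4] = '\0';
}

int main(void)
{
    char buf[5];
    uint32_t v = pack4("WASD");
    unpack4(v, buf);
    printf("0x%08lX -> %s\n", (unsigned long)v, buf);
    return 0;
}
```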

[–] Redkey@programming.dev 3 points 3 months ago

Outside the major cities, at least, video arcades in Japan are still hanging on in 2025 with a mix of games. There are a lot of pseudo-gambling token games (think prize tickets), crane-style prize games, and simple, highly physical games (big buttons and levers, controller and body tracking) aimed at the 5-to-10-year-old segment.

In terms of things we'd recognize as "real" games, almost everything is groups of locally networked terminals with some kind of physical gimmick that doesn't translate well to a home experience. There are still some racing games, music games, and the like, with uncommon controllers and layouts, but the most common format right now is probably a flat table with an embedded screen that has some way of scanning and tracking collectible trading cards. The cards aren't just scanned in once for use and then put aside, but actually moved around the table as tokens within the game. Obviously there are "Magic" style games, but also RPGs (both turn-based and action), MOBAs, real-time strategy, and more. Horse racing games are also popular, but to be clear, the players don't "ride" the horses; they raise, trade, manage, and "bet" on them, and watch simulated races.

And these days almost everything uses player profiles saved to IC cards, ranked across the country and sometimes even the world.

Occasionally you'll see four or six of the old sit-down "city" style cabinets (like the ones pictured in the article) in a corner, running 1-on-1 fighting games, but those are mainly found in the specifically "retro" arcades.

[–] Redkey@programming.dev 2 points 4 months ago

One thing I discovered about charging PS3 pads, which doesn't seem to be mentioned much, is that they appear (my guess, unconfirmed) to require proper USB current negotiation before they'll start charging. Plenty of sources say that they can be charged from any USB power source, but that isn't true.

The USB 2.0 standard says that a device may draw no more than 100mA until it has been configured; to draw more (up to 500mA), it has to declare the higher figure in its configuration descriptor and wait for the host to accept it. I assume that the PS3's USB ports implement this properly, as do pretty much all computer USB ports. But the majority of wall-plug USB chargers don't bother; they just allow a maximum draw of 500mA (or more) from the start and never negotiate anything.
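
To illustrate where that request actually lives, here's a sketch of a standard USB 2.0 configuration descriptor; the 500mA figure for the pad is my assumption, not a confirmed value.

```c
/* Sketch of a standard USB 2.0 configuration descriptor, showing
   where the current request lives. (Real firmware would pack this
   struct byte-exactly; omitted here for brevity.) */
#include <stdint.h>

struct usb_config_descriptor {
    uint8_t  bLength;             /* descriptor size: 9 bytes */
    uint8_t  bDescriptorType;     /* 0x02 = CONFIGURATION */
    uint16_t wTotalLength;        /* combined length of all descriptors */
    uint8_t  bNumInterfaces;
    uint8_t  bConfigurationValue;
    uint8_t  iConfiguration;
    uint8_t  bmAttributes;        /* bit 7 reserved (set), bit 6 = self-powered */
    uint8_t  bMaxPower;           /* requested current, in 2mA units */
};

/* A bus-powered device asking for the full 500mA (my guess at what
   the pad declares): */
static const struct usb_config_descriptor cfg = {
    .bLength             = 9,
    .bDescriptorType     = 0x02,
    .wTotalLength        = 9,     /* would also count interface/endpoint sizes */
    .bNumInterfaces      = 1,
    .bConfigurationValue = 1,
    .iConfiguration      = 0,
    .bmAttributes        = 0x80,  /* bus-powered */
    .bMaxPower           = 250,   /* 250 * 2mA = 500mA */
};
```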

In practice, a lot of hardware takes this part of the spec loosely: actually enforcing the limit requires current-limiting circuitry, so many ports simply allow the maximum draw from the start, and the "negotiation" is a formality in software, with every configuration accepted.

However, I think that the PS3 pads actually wait for that acceptance from the host before they'll start charging, and the majority of wall chargers (especially the cheap ones) never send it. I had to use the PS3 itself or a PC (direct connection, not through a hub) to charge my pads, until I found a cheap PS3 controller charging dock that works with any supply.

[–] Redkey@programming.dev 7 points 4 months ago (1 children)

I have a stack of Logitech F310 controllers, and I've never had them fail to work on any system (Windows, Linux, Android). They're not "pro gamer" or anything, fairly basic, but they've always responded smoothly for me even after many years of use. They're inexpensive, wired, and have an XInput/DirectInput switch on the back (at least mine do; that feature may have been removed by now).

The F310 (what I use) is wired and has no rumble feedback.

The F510 is wired and has rumble feedback, but I've never used one.

The F710 is wireless 2.4GHz (not Bluetooth) and has rumble feedback. I have two of these, and in my experience neither of them connects reliably, even under Windows with the official software installed.

[–] Redkey@programming.dev 1 points 4 months ago

I loved my MDs and Hi-MDs, but they had so many frills. All the frills. That was part of why I loved them!

[–] Redkey@programming.dev 1 points 4 months ago* (last edited 4 months ago)

The PlayStation 1 had a copy protection system that measured physical properties of the disc which couldn't be replicated by normal CD writers. There were a few ways to get around this, but booting a burned CD directly into the game, as you would with an original disc, required the installation of a fairly complex mod chip. A lot of people used the "swap trick" instead, which is how I used to play my imported original games.

The Dreamcast's copy protection relied heavily on using dual-density GD-ROM discs rather than regular CDs, even though the two look the same to the naked eye. There were other checks in place as well, but simply using GD-ROMs was pretty effective in and of itself.

Unfortunately, Sega also added support for a format called "MIL-CD" to the Dreamcast. MIL-CD was intended to let regular music CDs include interactive multimedia components when played on the console. However, MIL-CD worked from otherwise completely standard CDs, including burned ones, and had no copy protection, because Sega wanted to make it as easy as possible for other companies to produce MIL-CDs so the format could spread and hopefully become popular. Someone found a way to "break out" of the MIL-CD system and take over the console to run arbitrary code like a regular, officially released game, and that was the end of the Dreamcast's copy protection.

People couldn't just copy an original game disc 1:1 and have it work; a cracked copy still needed some effort to fit on a burned CD and run (sometimes quite a lot of effort, actually), but no console modification was needed. Anyone with a Dreamcast released before Sega patched this issue (which seems to be most of them) can simply burn a CD and play it on their console, provided they can get a cracked copy of the game.

[–] Redkey@programming.dev 2 points 4 months ago

Very nice. The minimal permissions are especially welcome for me.

[–] Redkey@programming.dev 2 points 4 months ago

I presume that you have at least some vague idea of what you want your program to do. If not, maybe a good random writing prompt generator can help.

But if you do have something in mind, I usually start by thinking about how I'm going to store my data. That goes hand-in-hand with thinking about how to break up the top-level tasks into individual functions and sub-functions. Those two processes tend to feed into each other, and before I'm aware of it, I've got a basic framework hammered out, at least in my notes.
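
A trivial sketch of what I mean, with invented names for an imaginary to-do app: pin down the data first, and the function breakdown tends to follow.

```c
/* Trivial sketch of "data first, then functions"; every name
   here is invented for illustration. */
#include <stddef.h>

/* Step 1: decide how the data is stored. */
struct task {
    char name[64];
    int  done;           /* 0 = pending, 1 = finished */
};

struct task_list {
    struct task items[128];
    size_t      count;
};

/* Step 2: the top-level jobs fall out as function signatures,
   each of which can be broken into sub-functions later. */
int  task_add(struct task_list *list, const char *name);
int  task_complete(struct task_list *list, size_t index);
void task_print_all(const struct task_list *list);
```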

[–] Redkey@programming.dev 3 points 4 months ago

Go for it, if it's to satisfy your own curiosity, but there's virtually no practical use for it these days. I had a personal interest in it at uni, and a project involving coding in assembly for an imaginary processor was a small part of one optional CS course. Over the years I've dabbled with asm for 32-bit Intel PCs and various retro consoles; at the moment I'm writing something for the Atari 2600.

In the past, assembly was useful for squeezing performance out of low-powered and embedded systems. But now that "embedded" includes SoCs with clock speeds in the hundreds of MHz and several megabytes of RAM, and optimizing compilers have improved greatly, the tiny potential performance gain is almost always outweighed by the overhead of hand-writing and maintaining assembly (and you have to be very good at it before you can even match most optimizing compilers, let alone beat them).

[–] Redkey@programming.dev 2 points 4 months ago

My main concern is getting games in a form that I can store locally for 20 years and then reasonably expect to boot up and play. A secondary concern (ever since I moved permanently to another country) is going digital whenever possible because shipping stuff long distances is expensive. I had hundreds of physical books that it pained me to give away, but it simply wasn't economical to move them to my new home. I kept my physical games, CDs, and DVDs, because they're mostly thin discs and air-filled plastic cases (often replaceable once paper inserts have been removed) and I was able to bring them over affordably.

Over the last few years I'd say I've slowed down on physical retro collecting and only bought a couple dozen retro console games. More often I sail the high seas looking for them, because decades after release there's no sane moral argument that paying $50-100 to a private collector or dealer today (a secondary or tertiary sale) has any impact on the developer's or publisher's profits. The physical game media and packaging have ceased to be games and have become artifacts, almost independent of their content, like other vintage or antique items. Of course that doesn't apply if the game has been rereleased in more or less its original form, in which case I either buy it (if the price is reasonable) or don't play it at all (if the price is unreasonable). I actually have such a game in digital storage that I've been meaning to play for years, and I learned that it quite recently went up on GOG, so now I'm morally obligated to buy it if I still want to play it, heh. Luckily for me the price seems fair.

And speaking of GOG, about 95% of my recent game purchases have been split pretty evenly between GOG and itch.io. I basically haven't bought anything directly from Steam for more than a decade. I understand that many games there are actually DRM-free, but I'm not interested in trying to research every game before I make a purchase. If each game's store page indicated its true DRM status clearly (not just "third-party DRM"), I'd consider buying through Steam again. As it is, whenever I learn about an interesting game that's on Steam, I try to find it on itch.io or GOG, and if I can't, I generally don't buy it; I'll only buy it on Steam if it looks really interesting and it's dirt cheap.

Whenever I look at ~~buying~~ "leasing with no fixed term" anything with DRM, I assume that it will be taken away from me or otherwise rendered unusable unexpectedly at some point in the future, through no fault of my own. It's already happened to me a couple of times, and once bitten, twice shy. I know everyone loves Gabe Newell; he seems like a genuinely good guy, and he's said that if Steam ever closed its doors they'd unlock everything. However, the simple fact is that in most of the situations where that might happen, the call wouldn't be up to Gaben, even for games published by Valve.

So yeah, I may put up with DRM in a completely offline context, but in any situation where my access terms can be changed remotely and unilaterally with a forced update, server shutdown, or removal, that's a hard pass from me.

[–] Redkey@programming.dev 4 points 5 months ago

Gain Ground and Arcus Odyssey both got many hours of play on my Mega Drive back in the day. :)

[–] Redkey@programming.dev 4 points 5 months ago* (last edited 5 months ago)

I'm not too knowledgeable about the detailed workings of the latest hardware and APIs, but I'll outline a bit of history that may make things easier to absorb.

Back in the early 1980s, IBM was still setting the base designs and interfaces for PCs. The last video card they released that became an accepted standard was VGA. It was a standard because no matter whether the system your software was running on had an original IBM VGA card or a clone, you knew that calling interrupt X with parameters Y and Z would have the same result. You knew that in 320x200 mode (you knew that there would be a 320x200 mode) you could write to the display buffer at a fixed memory location, and that what you wrote needed to be bytes indexing a colour table at another fixed address, and that the ordering of pixels in memory was left-to-right, then top-to-bottom. It was all very direct, without any middleware or software APIs.
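
To make that concrete, here's roughly what such direct access looked like, sketched in old Borland/Turbo-style real-mode DOS C (mode 13h being that 320x200, 256-colour mode):

```c
/* Rough sketch of direct VGA access under real-mode DOS
   (Borland/Turbo C style; dos.h provides int86() and MK_FP()). */
#include <dos.h>

int main(void)
{
    union REGS r;
    /* The mode 13h framebuffer lives at segment A000h. */
    unsigned char far *vga = (unsigned char far *)MK_FP(0xA000, 0x0000);
    int x, y;

    r.x.ax = 0x0013;          /* INT 10h, AX=0013h: 320x200, 256 colours */
    int86(0x10, &r, &r);

    /* One byte per pixel, left-to-right then top-to-bottom;
       each byte indexes the 256-entry hardware palette. */
    for (y = 0; y < 200; y++)
        for (x = 0; x < 320; x++)
            vga[y * 320 + x] = (unsigned char)((x ^ y) & 0xFF);

    r.h.ah = 0x00;            /* INT 16h, AH=00h: wait for a keypress */
    int86(0x16, &r, &r);

    r.x.ax = 0x0003;          /* back to 80x25 text mode */
    int86(0x10, &r, &r);
    return 0;
}
```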

But IBM dragged their feet over releasing a new video card to replace VGA. They believed that VGA still had plenty of life in it. The clone manufacturers started adding little extras to their VGA clones. More resolutions, extra hardware backbuffers, extended palettes, and the like. Eventually the clone manufacturers got sick of waiting and started releasing what became known as "Super VGA" cards. They were backwards compatible with VGA BIOS interrupts and data structures, but offered even further enhancements over VGA.

The problem for software support was that it was a bit of a wild west in terms of interfaces. The market quickly solidified around a handful of "standard" SVGA resolutions and colour depths, but under the hood every card had quite different programming interfaces, even between different cards from the same manufacturer. For a while, programmers figured out tricky ways to detect which card a user had installed, and/or let the user select their card in an ANSI text-based setup utility.

Eventually, the VESA standards (notably the VESA BIOS Extensions, or VBE) were created, and various libraries and drivers were produced that took a lot of this load off the shoulders of application and game programmers. We could make a standardised call to the VESA interface, and it would have (virtually) every video card perform the same action (if possible, or return an error code if not). The VESA functions could also tell us where and in what format the card expected to receive its writes, so we kept most of the speed of direct access. This was mostly still in MS-DOS, although Windows also had video drivers (for its own use, not exposed to third-party software) at the time.
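
A sketch of what a VBE call looked like from the programmer's side (again Borland/Turbo-style DOS C; mode 101h was the standard 640x480, 256-colour SVGA mode):

```c
/* Sketch of setting an SVGA mode through the VESA BIOS Extensions
   (VBE) instead of poking card-specific registers. */
#include <dos.h>
#include <stdio.h>

int main(void)
{
    union REGS r;

    /* INT 10h, AX=4F02h: VBE "set mode". BX holds the mode number;
       101h is the standard VBE 640x480, 256-colour mode. */
    r.x.ax = 0x4F02;
    r.x.bx = 0x0101;
    int86(0x10, &r, &r);

    /* VBE functions return 004Fh in AX on success; anything else
       means the card (or its VBE BIOS) couldn't comply. */
    if (r.x.ax != 0x004F) {
        printf("VBE mode set failed\n");
        return 1;
    }

    /* From here, VBE function 4F01h ("get mode info") would tell us
       where and in what format to write pixels for this mode. */
    return 0;
}
```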

Fast-forward to the introduction of hardware 3D acceleration into consumer PCs. This was after the release of Windows 95 (sorry, I'm going to be PC-centric here, but 1: it's what I know, and 2: I doubt that Apple was driving much of this as they have always had proprietary systems), and using software drivers to support most hardware had become the norm. Naturally, the 3D accelerators used drivers as well, but we were nearly back to that SVGA wild west again; almost every hardware manufacturer was trying to introduce their own driver API as "the standard" for 3D graphics on PC, naturally favouring their own hardware's design. On the actual cards, data still had to be written to specific addresses in specific formats, but the manufacturers had recognized the need for a software abstraction layer.

OpenGL on PC evolved from an effort to create a unified API for professional graphics workstations. PC hardware manufacturers eventually settled on OpenGL as a standard for their drivers to support. At around the same time, Microsoft had seen the writing on the wall with regard to games in Windows (they sucked), and had started working on the "WinG" graphics API back in Windows 3.1; after a time that became DirectX. Originally, DirectX only supported 2D video operations, but Microsoft worked with hardware manufacturers to add 3D acceleration support.

So we still had a bunch of different hardware designs, but they shared a lot of fundamental similarities. That allowed for a standard API that could easily be translated to drive all of them. And this is how the hardware and APIs have continued to evolve hand-in-hand: from fixed pipelines in early OpenGL/DirectX, to less-dedicated hardware units in later versions, to the extremely generalized parallel hardware that prompted the introduction of Vulkan, Metal, and the latest DirectX versions.

To sum up, all of these graphics APIs represent a standard "language" for software to use when talking to graphics drivers, which then translate those API calls into the correctly-formatted writes and reads that actually make the graphics hardware jump. That's why we sometimes have issues when a manufacturer's driver doesn't implement the API correctly, or when the API specification turns out to have a point that isn't defined clearly enough, so different drivers interpret the same call in slightly different ways.
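
As a modern illustration (a minimal sketch assuming the cross-platform GLFW library for window and context setup), the exact same calls run on any vendor's GPU, because each vendor's driver does the translation:

```c
/* Minimal sketch: the same OpenGL calls run on any vendor's GPU,
   because the driver translates them. Assumes GLFW is installed. */
#include <GLFW/glfw3.h>

int main(void)
{
    if (!glfwInit())
        return 1;

    GLFWwindow *win = glfwCreateWindow(640, 480, "Hello, driver", NULL, NULL);
    if (!win) {
        glfwTerminate();
        return 1;
    }
    glfwMakeContextCurrent(win);  /* bind the GL context to this thread */

    while (!glfwWindowShouldClose(win)) {
        /* Standard API calls; the driver turns these into whatever
           register writes and command buffers its hardware needs. */
        glClearColor(0.2f, 0.0f, 0.4f, 1.0f);
        glClear(GL_COLOR_BUFFER_BIT);

        glfwSwapBuffers(win);
        glfwPollEvents();
    }

    glfwTerminate();
    return 0;
}
```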
