Seth Michael Larson
sethmlarson.dev.web.brid.gy
Python, open source, and the internet

[bridged from https://sethmlarson.dev/ on the web: https://fed.brid.gy/web/sethmlarson.dev ]
Cutting spritesheets like cookies with Python & Pillow 🍪
_Happy new year!_ 🎉 For an upcoming project on the blog requiring many video-game sprites I've created a small tool (“sugarcookie”) using the always-lovely Python image-processing library Pillow. The tool takes a spritesheet, a set of mask colors, and minimum and maximum sprite dimensions, then cuts the spritesheet into its component sprites. I'm sure this could be implemented more efficiently, or with a friendly command-line interface, but for my own purposes (~10 spritesheets) this worked just fine. Feel free to use, share, and improve. The script is available as a GitHub gist, but also included below.

Source code for `sugarcookie`:

```python
#!/usr/bin/env python
# /// script
# requires-python = ">=3.13"
# dependencies = [
#   "Pillow",
#   "tqdm"
# ]
# ///
# License: MIT
# Copyright 2025, Seth Larson
import os.path
import math
from PIL import Image
import tqdm

# Parameters
spritesheet = ""  # Path to spritesheet.
masks = set()  # Set of 3-tuples for RGB.
min_dim = 10  # Min and max dimensions in pixels.
max_dim = 260

img = Image.open(spritesheet)
if img.mode == "RGB":  # Ensure an alpha channel.
    alpha = Image.new("L", img.size, 255)
    img.putalpha(alpha)
output_prefix = os.path.splitext(os.path.basename(spritesheet))[0]
data = img.getdata()
visited = set()
shapes = set()
reroll_shapes = set()


def getpixel(x, y) -> tuple[int, int, int, int]:
    return data[x + (img.width * y)]


def make_2n(value: int) -> int:
    return 2 ** int(math.ceil(math.log2(value)))


with tqdm.tqdm(
    desc="Cutting cookies",
    total=int(img.width * img.height),
    unit="pixels",
) as t:
    for x in range(img.width):
        for y in range(img.height):
            xy = (x, y)
            if xy in visited:
                continue
            inshape = set()
            candidates = {(x, y)}

            def add_candidates(cx, cy):
                global candidates
                candidates |= {
                    (cx - 1, cy),
                    (cx + 1, cy),
                    (cx, cy - 1),
                    (cx, cy + 1),
                }

            while candidates:
                cx, cy = candidates.pop()
                if (
                    (cx, cy) in visited
                    or cx < 0
                    or cx >= img.width
                    or cy < 0
                    or cy >= img.height
                    or abs(cx - x) > max_dim
                    or abs(cy - y) > max_dim
                ):
                    continue
                visited.add((cx, cy))
                rgba = r, g, b, a = getpixel(cx, cy)
                if a == 0 or (r, g, b) in masks:
                    continue
                else:
                    inshape.add((cx, cy))
                    add_candidates(cx, cy)
            if inshape:
                shapes.add(tuple(inshape))
        t.update(img.height)

max_width = 0
max_height = 0
shapes_and_offsets = []
for shape in sorted(shapes):
    min_x = img.width + 2
    min_y = img.height + 2
    max_x = -1
    max_y = -1
    for x, y in shape:
        max_x = max(x, max_x)
        max_y = max(y, max_y)
        min_x = min(x, min_x)
        min_y = min(y, min_y)
    width = max_x - min_x + 1
    height = max_y - min_y + 1

    # Too small! We have to reroll this
    # potentially into another shape.
    if width < min_dim or height < min_dim:
        reroll_shapes.add(shape)
        continue

    max_width = max(max_width, width)
    max_height = max(max_height, height)
    shapes_and_offsets.append((shape, (width, height), (min_x, min_y)))

# Make them powers of two!
max_width = make_2n(max_width)
max_height = make_2n(max_height)

sprite_number = 0
with tqdm.tqdm(
    desc="Baking cookies", total=len(shapes_and_offsets), unit="sprites"
) as t:
    for shape, (width, height), (offset_x, offset_y) in shapes_and_offsets:
        new_img = Image.new(mode="RGBA", size=(max_width, max_height))
        margin_x = (max_width - width) // 2
        margin_y = (max_height - height) // 2
        for rx in range(max_width):
            for ry in range(max_height):
                x = rx + offset_x
                y = ry + offset_y
                if (x, y) not in shape:
                    continue
                new_img.putpixel((rx + margin_x, ry + margin_y), getpixel(x, y))
        new_img.save(f"images/{output_prefix}-{sprite_number}.png")
        sprite_number += 1
        t.update(1)
```

When using the tool you may find yourself needing to add additional masking across elements, such as the original spritesheet curator's name, in order for the cutting process to work perfectly. This script also doesn't work well for sprites which aren't contiguous across their bounding box. There's an exercise left to the reader to implement `reroll_shapes`, a feature I didn't end up needing for my own project. Let me know if you implement this and send me a patch!

* * *

Thanks for keeping RSS alive! ♥
sethmlarson.dev
January 1, 2026 at 9:58 PM
Nintendo GameCube and Switch “Wrapped” 2025 🎮🎁
> This is my last blog post for 2025 💜 Thanks for reading, see you in 2026!

One of my goals for 2025 was to _play more games!_ I've been collecting play activity for my Nintendo Switch, Switch 2, and my Nintendo GameCube. I've published a combined SQLite database with this data for 2025 with games, play sessions, and more. Feel free to dig into this data yourself; I've included some queries and my own thoughts, too. Here are the questions I answered with this data:

* What were my favorite games this year?
* Which game system did I play the most?
* Which games did I play the most?
* Which games did I play most per-session?
* When did I start and stop playing each game?
* Which game was I most consistently playing?
* When did I play games?
* Which day of the week did I play most?

## What were my favorite games this year?

Before we get too deep into quantitative analysis, let's start with the games I enjoyed the most and that defined this year for me.

My favorite game for the GameCube in 2025 is **Pikmin 2**. The Pikmin franchise has always held a close place in my heart, being a lover of plants, nature, and dandori. One of my major video-gaming projects in 2025 was to gather every unique treasure in Pikmin 2. I called this the “International Treasure Hoard”, following a similar naming to the “National PokéDex”. This project required buying both the Japanese and PAL versions of Pikmin 2, which I was excited to add to my collection.

The project was inspired by a YouTube short from a Pikmin content creator I enjoy named “JessicaIn3D”. The video shows how there are three different treasure hoards in Pikmin 2, one per region, meaning it's impossible to collect every treasure in a single game.

_“You Can't Collect Every Treasure In Pikmin 2” by JessicaIn3D_

The project took 4 months to complete, running from June to October. I published a script which analyzed Pikmin 2 save file data using this documentation on the save file format.
From here the HTML table in the project page could be automatically updated as I made more progress. I published regular updates to the project page as the project progressed, too.

_Pikmin 2 for the GameCube (NTSC-J, NTSC-M, and PAL) and the Switch._

My favorite game for the Switch family of consoles in 2025 is **Kirby Air Riders**. This game is pure chaotic fun with a heavy dose of nostalgia for me. I was one of the very few that played the original game on the Nintendo GameCube 22 years ago. I still can't believe this gem of a game only sold 750,000 units in the United States. It's amazing to see what is essentially the game my brother and I dreamed of as a sequel: taking everything great about the original release and adding online play, a more expansive world, and near-infinite customization and unlockables.

This game is fast-paced and fits into a busy life easily: a single play session of City Trial or Air Ride mode lasts less than 7 minutes from start to finish. I'll frequently pick this game up to play a few rounds between work and whatever the plans of the evening are. Each round is different, and you can pursue whichever strategy you prefer (combat, speed, gliding, legendary-vehicle hunting) and expect to have a fighting chance in the end.

_Kirby Air Ride for the GameCube (NTSC-J and NTSC-M) and Kirby Air Riders for the Switch 2._

## Which game system did I play the most?

Even with the Switch and Switch 2 bundled into one category, I **played the GameCube more in 2025**. This was a big year for me and the GameCube: I finally modded a platinum cube and my childhood indigo cube with the FlippyDrive and ETH2GC. I've got a lot more in store for the GameCube next year, too.

System | Duration
---|---
GameCube | 41h 55m
Switch | 35h 45m

> SQLite query
>
>     SELECT game_system, SUM(duration) AS d
>     FROM sessions
>     WHERE STRFTIME('%Y', date) = '2025'
>     GROUP BY game_system
>     ORDER BY d DESC;

Here's the same data stylized like the GitHub contributor graph.
Blue squares represent days when I played the GameCube and red squares days when I played the Switch or Switch 2, starting in June 2025:

> I also played **Sonic Adventure** on a newly-purchased **SEGA Dreamcast** for the first time in 2025, too. Unfortunately I don't have a way to track play data for the Dreamcast (yet?) but my experience with the Dreamcast and Sonic Adventure will likely get its own short blog post eventually, stay tuned.

## Which games did I play the most?

I played 9 unique titles this year (including region and platform variants), but which ones did I play the most?

Game | Duration
---|---
Pikmin 2 | 27h 11m
Animal Crossing | 16h 47m
Kirby Air Riders | 16h 15m
Mario Kart World | 10h 25m
Super Mario Odyssey | 4h 45m
Pikmin 4 | 1h 5m
Overcooked! 2 | 45m
Kirby's Airride | 15m
Sonic Origins | 10m

> SQLite query
>
>     SELECT game_name, SUM(duration) AS d
>     FROM sessions
>     WHERE STRFTIME('%Y', date) = '2025'
>     GROUP BY game_name
>     ORDER BY d DESC;

That's a lot of **Pikmin 2**, huh? This year I collected all 245 unique treasures across the three regions of Pikmin 2 (JP, US, and PAL), including a 100% complete save file for the Japanese region. This is the first time I collected all treasures for a single Pikmin 2 play-through. We can break down how much time was spent in each region and system for Pikmin 2:

System | Region | Duration
---|---|---
GameCube | US | 9h 24m
GameCube | JP | 9h 17m
GameCube | PAL | 6h 9m
Switch | US | 2h 20m

> SQLite query
>
>     SELECT game_system, game_region, SUM(duration) AS d
>     FROM sessions
>     WHERE STRFTIME('%Y', date) = '2025'
>       AND game_name = 'Pikmin 2'
>     GROUP BY game_system, game_region
>     ORDER BY d DESC;

You can see I even started playing the Switch edition of Pikmin 2 but bailed on that idea pretty early. Playing through the same game 3 times in a year was already enough :) The US and JP versions were ~9 hours each, with PAL receiving less play time. This is due to PAL only having 10 unique treasures, so I was able to speed-run most of the game.
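If you'd like to poke at the published database yourself, the queries above run fine from Python's built-in `sqlite3` module. Here's a minimal sketch of that, using an in-memory stand-in with made-up rows; the `sessions` columns (`game_system`, `game_name`, `game_region`, `date`, `duration`) are taken from the queries, but the assumption that `duration` is stored in minutes is mine:

```python
import sqlite3

# In-memory stand-in for the published database; the real file would be
# opened with sqlite3.connect("<path-to-downloaded-database>").
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE sessions ("
    " game_system TEXT, game_name TEXT, game_region TEXT,"
    " date TEXT, duration INTEGER)"  # duration in minutes (assumption)
)
conn.executemany(
    "INSERT INTO sessions VALUES (?, ?, ?, ?, ?)",
    [
        ("GameCube", "Pikmin 2", "US", "2025-06-01", 60),
        ("GameCube", "Pikmin 2", "JP", "2025-06-02", 45),
        ("Switch", "Kirby Air Riders", "US", "2025-11-07", 30),
    ],
)

# Total play time per system, same shape as the post's first query.
rows = conn.execute(
    "SELECT game_system, SUM(duration) AS d FROM sessions"
    " WHERE STRFTIME('%Y', date) = '2025'"
    " GROUP BY game_system ORDER BY d DESC"
).fetchall()
print(rows)  # → [('GameCube', 105), ('Switch', 30)]
```

Against the real database the sums would come back in minutes of play per system, ready to be formatted as "41h 55m"-style durations.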
## Which games did I play most per session?

This query sorta indicates “binge-ability”: when I did play a title, how long was that play session on average? **Super Mario Odyssey** just barely took the top spot here, but the two Switch 2 titles I own were close behind.

Name | Duration
---|---
Super Mario Odyssey | 57m
Mario Kart World | 56m
Kirby Air Riders | 51m
Pikmin 2 | 49m
Animal Crossing | 47m
Overcooked! 2 | 45m
Pikmin 4 | 32m
Kirby's Airride | 15m
Sonic Origins | 5m

> SQLite query
>
>     SELECT game_name, SUM(duration)/COUNT(DISTINCT date) AS d
>     FROM sessions
>     WHERE STRFTIME('%Y', date) = '2025'
>     GROUP BY game_name
>     ORDER BY d DESC;

## When did I start and stop playing each game?

I only have enough time to _focus_ on one game at a time, so there is a pretty linear history of which game is top-of-mind for me at any one time. From this query we can construct a linear history:

* Pikmin 2 (June→Oct)
* Animal Crossing (July→Aug)
* Super Mario Odyssey (Oct)
* Pikmin 4 (Nov, “Decor Pikmin”)
* Mario Kart World (July→Nov)
* Kirby Air Riders (Nov→Dec)

I still want to return to Super Mario Odyssey, I was having a great time with the game! It's just that Kirby Air Riders came out and stole my attention.

Game | First played | Last played
---|---|---
Pikmin 2 | 2025-06-01 | 2025-10-06
Mario Kart World | 2025-07-20 | 2025-11-17
Animal Crossing | 2025-07-29 | 2025-09-08
Sonic Origins | 2025-08-11 | 2025-08-25
Super Mario Odyssey | 2025-10-13 | 2025-10-21
Kirby Air Riders | 2025-11-07 | 2025-12-21
Pikmin 4 | 2025-11-10 | 2025-11-12

> SQLite query
>
>     SELECT game_name, MIN(date) AS fp, MAX(date)
>     FROM sessions
>     WHERE STRFTIME('%Y', date) = '2025'
>     GROUP BY game_name
>     ORDER BY fp ASC;

## Which game was I most consistently playing?

We can take the data from the “Days” column above and use that as a divisor for the number of unique days each game was played.
This will give a sense of how often I was playing a game within the time span that I was “active” with a game:

Game | % | Days Played | Span
---|---|---|---
Pikmin 4 | 100% | 2 | 2
Super Mario Odyssey | 63% | 5 | 8
Animal Crossing | 51% | 21 | 41
Kirby Air Riders | 43% | 19 | 44
Pikmin 2 | 26% | 33 | 127
Sonic Origins | 14% | 2 | 14
Mario Kart World | 9% | 11 | 120

> SQLite query
>
>     SELECT game_name,
>            COUNT(DISTINCT date) AS played,
>            (STRFTIME('%j', MAX(date)) - STRFTIME('%j', MIN(date))) AS days
>     FROM sessions
>     WHERE STRFTIME('%Y', date) = '2025'
>     GROUP BY game_name
>     ORDER BY MIN(date) ASC;

If we look at total-year gaming “saturation” for 2025 and June-onwards (214 days):

Days Played | % Days (2025) | % Days (>=June)
---|---|---
89 | 24% | 42%

> SQLite query
>
>     SELECT COUNT(DISTINCT date)
>     FROM sessions
>     WHERE STRFTIME('%Y', date) = '2025';

## When did I play games?

Looking at the year, I didn't start playing games on either system until June. That lines up with me receiving my GameCube FlippyDrives, which I had pre-ordered in 2024. After installing these modifications to my GameCube I began playing games more regularly again :)

Month | Duration
---|---
June | 10h 4m
July | 9h 40m
August | 18h 26m
September | 7h 22m
October | 10h 0m
November | 15h 5m
December | 7h 0m

> SQLite query
>
>     SELECT STRFTIME('%m', date) AS m, SUM(duration)
>     FROM sessions
>     WHERE STRFTIME('%Y', date) = '2025'
>     GROUP BY m
>     ORDER BY m ASC;

August was the month with the most play! This was due entirely to playing Animal Crossing Deluxe (~16 hours), a mod by Cuyler for Animal Crossing on the GameCube. Animal Crossing feels the best when you play for short sessions each day, which is why I was playing so often.

Game | Duration
---|---
Animal Crossing | 15h 41m
Mario Kart World | 2h 15m
Pikmin 2 | 19m
Sonic Origins | 10m

> SQLite query
>
>     SELECT game_name, SUM(duration)
>     FROM sessions
>     WHERE STRFTIME('%Y-%m', date) = '2025-08'
>     GROUP BY game_name;

## Which day of the week did I play most?
Unsurprisingly, weekends tend to be the days with the longest average play sessions. **Sunday** just barely takes the highest average play duration per day. Wednesday, Thursday, and Friday have the lowest play activity as these days are reserved for board-game night, seeing family, and date night respectively :)

Day | Duration | Days | Average
---|---|---|---
Sun | 16h 16m | 15 | 1h 5m
Mon | 13h 52m | 17 | 48m
Tues | 14h 9m | 16 | 53m
Wed | 11h 17m | 15 | 45m
Thur | 6h 35m | 9 | 43m
Fri | 5h 45m | 8 | 43m
Sat | 9h 42m | 9 | 1h 4m

> SQLite query
>
>     SELECT STRFTIME('%w', date) AS day_of_week,
>            SUM(duration),
>            COUNT(DISTINCT date),
>            SUM(duration)/COUNT(DISTINCT date)
>     FROM sessions
>     WHERE STRFTIME('%Y', date) = '2025'
>     GROUP BY day_of_week
>     ORDER BY day_of_week ASC;

* * *

Thanks for keeping RSS alive! ♥
sethmlarson.dev
December 30, 2025 at 1:02 AM
Getting started with Playdate on Ubuntu 🟨
Trina got me a Playdate for Christmas this year! I've always been intrigued by this console, as it is highly constrained in terms of pixels and color depth (400x240, 2 colors), but also provides many helpful resources for game development such as a software development kit (SDK) and a simulator to quickly test games during development. I first discovered software programming as an amateur game developer using BYOND, so “returning to my roots” and doing some game development feels like a fun and fulfilling diversion from the current direction software is taking. Plus, I now have a reason to learn a new programming language: Lua!

_Running software on the Playdate!_

## Getting started with Playdate on Ubuntu

Here's what I did to quickly get started with a Playdate development environment on my Ubuntu 24.04 laptop:

* Unbox the Playdate and start charging the console so it's charged enough for the next steps involving the console.
* Create a Playdate account.
* Download the SDK. For Linux you need to extract it to your desired directory (I chose `~/PlaydateSDK`) and run the setup script (`sudo ~/PlaydateSDK/setup.sh`).
* Add the SDK `bin` directory to `PATH` and set the `PLAYDATE_SDK_PATH` environment variable in your `~/.bashrc`.
* Start the simulator (`PlaydateSimulator`) and register the simulator to your Playdate account when prompted.
* Turn on the console and play the startup tutorial. Connect to Wi-Fi and let the console update.
* When prompted by the console, register the console to your Playdate account.
* Download and install VSCode. I used the `.deb` installer for Ubuntu.
* Disable all AI features in VSCode by adding `"chat.disableAIFeatures": true` to your `settings.json`.
* Copy the `.vscode` directory from this Playdate template project. The author of this template, SquidGod, has multiple video guides about Playdate development.
* Select "Extensions" in VSCode and install the "Lua" and "Playdate Debug" extensions.
* Create two directories: `source` and `builds`.
* Within the `source` directory create a file called `main.lua`. This file will be the entry-point into your Playdate application.

That's it, your Playdate development environment should be ready to use.

## “Hello, world” on the Playdate

Within `source/main.lua` put the following Lua code:

```lua
import "CoreLibs/graphics"
import "CoreLibs/ui"

-- Localizing commonly used globals
local pd <const> = playdate
local gfx <const> = playdate.graphics

function playdate.update()
    gfx.drawTextAligned(
        "Hello, world",
        200, 30,
        kTextAlignment.center
    )
end
```

Try building and running this with the simulator (`Ctrl+Shift+B`). You should see our "Hello, world" message in the simulator.

## Running “Hello, world” on real hardware

Next is getting your game running on an actual Playdate console. Connect the Playdate to your computer using the USB cable and make sure the console is awake. Start your game in the simulator (`Ctrl+Shift+B`) and then once the simulator starts select `Device` > `Upload Game to Device` in the menus or use the hotkey `Ctrl+U`.

Uploading the game to the Playdate console takes a few seconds, so be patient. The console will show a message like “Sharing DATA segment with USB. Will reboot when ejected”. You can select the "Home" button in the Playdate console menu to stop the game.

## Making a network request

One of my initial hesitations with buying a Playdate was that it didn't originally ship with network connectivity within games, despite supporting Wi-Fi. This is no longer the case, as this year Playdate OS 2.7 added support for HTTP and TCP networking. So immediately after my "Hello, world" game, I wanted to try this new feature.
I created the following small application that sends an HTTP request after pressing the `A` button:

```lua
import "CoreLibs/graphics"
import "CoreLibs/ui"

local pd <const> = playdate
local gfx <const> = playdate.graphics
local net <const> = playdate.network

local networkEnabled = false

function networkHttpRequest()
    local host = "sethmlarson.dev"
    local port = 443
    local useHttps = true
    local req = net.http.new(
        host, port, useHttps,
        "Making an HTTP request"
    )
    local path = "/"
    local headers = {}
    req:get(path, headers)
end

function networkEnabledCallback(err)
    networkEnabled = true
end

function init()
    net.setEnabled(true, networkEnabledCallback)
end

function playdate.update()
    gfx.clear()
    if networkEnabled then
        gfx.drawTextAligned(
            "Network enabled",
            200, 30,
            kTextAlignment.center
        )
        if pd.buttonJustPressed(pd.kButtonA) then
            networkHttpRequest()
        end
    else
        gfx.drawTextAligned(
            "Network disabled",
            200, 30,
            kTextAlignment.center
        )
    end
end

init()
```

First I tried running this program with a local HTTP server on `localhost:8080` with `useHttps` set to `false` and was able to capture this HTTP request using Wireshark:

```
0000  47 45 54 20 2f 20 48 54 54 50 2f 31 2e 31 0d 0a  GET / HTTP/1.1..
0010  48 6f 73 74 3a 20 6c 6f 63 61 6c 68 6f 73 74 0d  Host: localhost.
0020  0a 55 73 65 72 2d 41 67 65 6e 74 3a 20 50 6c 61  .User-Agent: Pla
0030  79 64 61 74 65 2f 53 69 6d 0d 0a 43 6f 6e 6e 65  ydate/Sim..Conne
0040  63 74 69 6f 6e 3a 20 63 6c 6f 73 65 0d 0a 0d 0a  ction: close....
```

So we can see that Playdate HTTP requests are quite minimal, only sending `Host`, `User-Agent`, and `Connection: close` headers by default. Keep-Alive and other headers can be optionally configured. The `User-Agent` for the Playdate simulator was `Playdate/Sim`.

I then tested on real hardware, targeting my own website (`sethmlarson.dev:443`) with `useHttps` set to `true`. This resulted in the same request being sent, with a `User-Agent` of `Playdate/3.0.2`.
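If you'd rather not reach for Wireshark, a throwaway local server is enough to see what the simulator sends. Here's a minimal sketch of my own (not from the original post) using Python's standard-library `http.server`, listening on `localhost:8080` to match the `useHttps = false` test above:

```python
from http.server import BaseHTTPRequestHandler, HTTPServer
import threading


class EchoHeadersHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # Print the request line and headers the client sent, e.g. the
        # Playdate simulator's "User-Agent: Playdate/Sim" header.
        print(self.requestline)
        print(self.headers)
        body = b"Hello, Playdate!"
        self.send_response(200)
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)


# Serve on localhost:8080 in a background thread so the
# terminal stays free while the simulator makes requests.
server = HTTPServer(("127.0.0.1", 8080), EchoHeadersHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()
```

Point the Lua app's `host`/`port` at `localhost`/`8080` with `useHttps = false` and each press of `A` should print the same minimal header set seen in the Wireshark capture.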
There's no-doubt lots of experimentation ahead for what's possible with a networked Playdate. That's all for now, _happy cranking!_ * * * Thanks for keeping RSS alive! ♥
sethmlarson.dev
December 26, 2025 at 9:58 PM
Blind Carbon Copy (BCC) for SMS
Have you ever wanted the power of email Blind Carbon Copy (BCC), but for SMS? I've wanted this functionality myself for parties and organizing, specifically without needing to use a third-party service. This script automates the difficult parts of drafting and sending a text message to many recipients with SMS URLs and QR codes. Draft your message, choose your recipients, and then scan-and-send all the QR codes until you're done. Save your command for later to follow up with different groups.

## Source code

Copy-and-paste the following source code into a file named `sms-bcc`, make the file executable (`chmod a+x sms-bcc`), and you're ready to start using the script. It requires Python and the `qrcode` package (`pip install qrcode`) to run. This script is licensed MIT.

Source code for the `sms-bcc` script:

```python
#!/usr/bin/env python
# /// script
# requires-python = ">=3.12"
# dependencies = [
#   "qrcode"
# ]
# ///
# License: MIT
# Copyright 2025, Seth Larson
import argparse
import pathlib
import re
import sys
import urllib.parse

from qrcode.console_scripts import main as qrcode_main

__version__ = "2025.12.26"


def sms_url(recipients: list[str], message: str, mobile_os: str | None = None) -> str:
    """
    Generate an SMS URL from a list of recipients and a message.
    """
    if len(recipients) > 1 and mobile_os is None:
        raise ValueError("Mobile OS required for multi-recipient messages")
    if not recipients:
        raise ValueError("Recipients required")
    message_encoded = urllib.parse.quote(message)
    if mobile_os is None or mobile_os == "android":
        return f"sms:{','.join(recipients)}?body={message_encoded}"
    else:  # mobile_os == "ios"
        return f"sms://open?addresses={','.join(recipients)}&body={message_encoded}"


def parse_contacts(contacts_data: str) -> dict[str, str]:
    """
    Parses a vCard file. Assumes that each contact
    has a full name and telephone number.
    """
    vcard_fn_re = re.compile(r"^FN:(.+)$", re.MULTILINE)
    vcard_tel_re = re.compile(
        r"^(?:item[0-9]\.)?TEL[^:]*:([ \.\(\)+0-9\-]+)$", re.MULTILINE
    )
    names_to_tel = {}
    for vcard in contacts_data.split("BEGIN:VCARD"):
        if not (
            (match_fn := vcard_fn_re.search(vcard))
            and (match_tel := vcard_tel_re.search(vcard))
        ):
            continue
        tel = re.sub(r"[^0-9]", "", match_tel.group(1))
        names_to_tel[match_fn.group(1)] = tel
    return names_to_tel


def main() -> int:
    parser = argparse.ArgumentParser(
        description="Blind Carbon Copy (BCC) for SMS"
    )
    parser.add_argument(
        "--contacts",
        type=str,
        required=True,
        help="Path to contacts file in the vCard format",
    )
    parser.add_argument(
        "--recipients",
        type=str,
        nargs="+",
        required=False,
        help="List of recipients pulled from contacts",
    )
    parser.add_argument(
        "--always-recipients",
        type=str,
        nargs="+",
        required=False,
        help="List of recipients to include in every recipient group",
    )
    parser.add_argument(
        "--message",
        type=str,
        required=True,
        help="Message to send",
    )
    parser.add_argument(
        "--mobile-os",
        type=str,
        choices=["ios", "android"],
        required=False,
        default="ios",
        help="Mobile OS, only required for multi-recipient messages",
    )
    args = parser.parse_args(sys.argv[1:])

    contacts_data = pathlib.Path(args.contacts).read_text()
    names_to_tel = parse_contacts(contacts_data)
    message_data = pathlib.Path(args.message).read_text()

    list_of_recipients = [
        [r.strip() for r in recipients.split(",")]
        for recipients in args.recipients
    ]
    always_recipients = list(args.always_recipients or ())
    if (mobile_os := args.mobile_os) not in (None, "android", "ios"):
        raise ValueError("--mobile-os must be one of 'android' or 'ios'")

    def clear_terminal() -> None:
        print(chr(27) + "[2J")

    for recipients in list_of_recipients:
        recipients.extend(always_recipients)

        # Figure out which telephone numbers to include
        # and exclude. Can be numbers or names.
        recipient_tels = {}
        for recipient in recipients:
            # Last character is a number, probably a telephone number.
            if recipient[-1].isdigit():
                recipient_tels[recipient] = recipient
                continue
            for name, tel in names_to_tel.items():
                if recipient in name:
                    recipient_tels[name] = tel

        # Remove names filtered via '-Name'.
        for recipient in recipients:
            if recipient.startswith("-"):
                recipient_tels = {
                    name: tel
                    for name, tel in recipient_tels.items()
                    if recipient[1:] not in name
                }

        clear_terminal()
        qrcode_data = sms_url(
            sorted(set(recipient_tels.values())), message_data, mobile_os
        )
        qrcode_main(["--error-correction=L", qrcode_data])
        input(
            f"\n\nSending to: {', '.join(sorted(recipient_tels.keys()))}\nScan, send, and press enter to continue."
        )

    clear_terminal()
    print(f"Done sending {len(list_of_recipients)} messages")
    return 0


if __name__ == "__main__":
    sys.exit(main())
```

## How to use

Export your contacts from your phone to a vCard file (`.vcf`). For iPhones this is done within the contacts app: long-press-and-hold “All Contacts” and select “Export”. This will create a `.vcf` file that you can transfer to your computer.

Now run the `sms-bcc` script with `--contacts` pointing at the `.vcf` file, draft a message in a file and pass it with the `--message` option, and choose your recipients by name with the `--recipients` option.

```
./sms-bcc \
  --contacts contacts.vcf \
  --recipients Alex,Bob Charlie \
  --message ./message.txt
```

This will draft the message to two groups: "You, Alex, and Bob" and "You and Charlie". Note how spaces delimit groups of recipients and commas (`,`) delimit recipient names within a group.

After running this script, a series of QR codes using the `sms://` URL scheme will be generated one after another. Scan the QR code to load the message and recipients into your phone, then you can optionally send the message or skip it, then press `Enter` to generate the next QR code.

The `--recipients` option uses a simple string-contains operation, so I recommend having full names in your contacts to avoid excessive duplicates.
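To make the URL scheme concrete, here's a small sketch (mirroring the script's `sms_url` function) of the URLs that end up encoded into the QR codes; the phone numbers and message are made up:

```python
import urllib.parse


def sms_url(recipients, message, mobile_os="ios"):
    # Mirrors the sms-bcc script: Android uses the plain sms: scheme with
    # a body query parameter, iOS uses the sms://open form instead.
    body = urllib.parse.quote(message)
    if mobile_os == "android":
        return f"sms:{','.join(recipients)}?body={body}"
    return f"sms://open?addresses={','.join(recipients)}&body={body}"


# Hypothetical recipients and message:
print(sms_url(["15555550100", "15555550101"], "Party at 7pm?", "android"))
# → sms:15555550100,15555550101?body=Party%20at%207pm%3F
print(sms_url(["15555550100"], "Party at 7pm?", "ios"))
# → sms://open?addresses=15555550100&body=Party%20at%207pm%3F
```

Phones that scan the QR code hand this URL to the messaging app, which pre-fills the recipients and the decoded message body.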
You can pass a name with a leading hyphen/minus (`-`) character to exclude a name from the list of recipients. The below invocation will match people named "Alex" without matching "Alexander":

```
./sms-bcc --recipients Alex,-Alexander
```

If you have a spouse or partner that you want to include in every recipient group, use the `--always-recipients` option:

```
./sms-bcc \
  --contacts contacts.vcf \
  --recipients Bob Charlie,Dave \
  --always-recipients Alex \
  --message ./message.txt
```

This will draft the message for "You, Alex, and Bob" and "You, Alex, Charlie, and Dave".

🎄 _Merry Christmas and happy organizing!_ 🎄

## Changelog

* `2025.12.26`: Better handling for many different telephone number formats such as `(555) 555-555`. Added inline script metadata to header.
* `2025.12.25`: Initial release.

* * *

Thanks for keeping RSS alive! ♥
sethmlarson.dev
December 26, 2025 at 9:58 PM
PEP 770 Software Bill-of-Materials (SBOM) data from PyPI, Fedora, and Red Hat
This year I authored PEP 770, which proposed a new standardized location for Software Bill-of-Materials (SBOM) data within Python wheel archives. SBOM data can now be stored in `.dist-info/sboms/`. You can see the canonical specification on packaging.python.org. While writing this document we also reserved all subdirectory names under `.dist-info/` within a registry for future use in other standards.

Reviewers agreed that this method of defining file-based metadata (such as SBOMs, but also licenses) is a great mechanism, as it doesn't require creating a new metadata field and version. Creating a new metadata field in particular requires large amounts of “head-of-line blocking” to roll out completely to an ecosystem of independent packaging installers, builders, publishers, and the Python Package Index; the proposed method side-steps all of this by making inclusion in the directory the mechanism instead.

So now that this PEP is published, what has happened since? A few things:

## Unmasking the Phantom Dependency problem

In case you missed it, I published a white paper on this project with Alpha-Omega. If you want to learn more about the whole project from end-to-end, this is a good place to start!

## Auditwheel and cibuildwheel

Back in 2022 a public issue was opened for Auditwheel asking to generate an SBOM during the `auditwheel repair` command. As of Auditwheel v6.5.0, released in early November, Auditwheel automatically generates SBOM data and includes the SBOM in the wheel following PEP 770. The manylinux container images adopted the new Auditwheel version soon after publication. These images are used by common Python wheel building platforms like cibuildwheel and multibuild.
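Because PEP 770 pins SBOMs to a fixed location inside the wheel, checking whether any given wheel ships one is just an archive listing. A minimal sketch of my own (not from the post), since wheels are zip archives:

```python
import zipfile


def pep770_sbom_paths(wheel):
    """Return archive members under the PEP 770 .dist-info/sboms/ directory."""
    with zipfile.ZipFile(wheel) as archive:
        return [
            name
            for name in archive.namelist()
            if ".dist-info/sboms/" in name and not name.endswith("/")
        ]


# Hypothetical usage against a downloaded wheel file:
# pep770_sbom_paths("somepackage-1.0-py3-none-any.whl")
# might return something like ["somepackage-1.0.dist-info/sboms/example.cdx.json"]
```

A non-empty result means the wheel is shipping PEP 770 SBOM data, the same signal used in the dataset query below.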
Because this functionality was enabled by default, we can look at Python wheel data and determine how many packages already supply PEP 770 SBOM data. When querying the pypi-code.org dataset, which includes all code within Python wheels, I was able to find 332 projects on PyPI that are shipping SBOM data in their wheels:

```sql
SELECT repository, project_name, path
FROM './dataset-*.parquet'
WHERE archive_path LIKE '%.dist-info/sboms/%'
  AND skip_reason == ''
LIMIT 10;
```

Of these projects, these are the top-10 most downloaded with SBOM data so far:

Project | Downloads/Month
---|---
greenlet | 205M
numba | 33M
pymssql | 27M
ddtrace | 17M
psycopg-binary | 14M
faiss-cpu | 13M
logbook | 6M
simsimd | 2M
clang-format | 2M
nodejs-wheel-binaries | 1M

There are far more projects which will likely require SBOM data for their bundled dependencies, so I'll continue watching the numbers grow over time!

## Red Hat and Fedora adopt PEP 770

Back in July of this year, Miro Hrončok asked if there was a mechanism for specifying the "origin" of a package, as many tools incorrectly assume that any package that's installed to an environment originated from the Python Package Index (and would therefore use a Package URL like `pkg:pypi/...`). Their use-case was Python packages provided by the system package manager, such as `rpm` on Fedora and Red Hat Linux. Vulnerability scanners were incorrectly assuming packages like `pip` were vulnerable because older versions of `pip` are packaged, but with vulnerability patches backported and applied to those older versions.

_SBOMs to the rescue!_ Miro adopted PEP 770 for Fedora and Red Hat Linux to reduce false-positives in vulnerability scans by defining the actual correct Package URL for the installed package in the SBOM:

```json
{
  "bomFormat": "CycloneDX",
  "specVersion": "1.6",
  "components": [
    {
      "type": "library",
      "name": "python3.11-setuptools",
      "version": "65.5.1-3.el9",
      // This Package URL is for the Red Hat distribution
      // of setuptools, not the PyPI distribution.
      "purl": "pkg:rpm/redhat/python3.11-setuptools@65.5.1-3.el9?arch=src"
    }
  ]
}
```

If scanners adopt this approach and other Linux distros do as well, there will be far fewer false-positives from scanning Python environments on those Linux distros. A win for everyone! Miro is asking for feedback on this approach from consuming tools.

* * *

Thanks for keeping RSS alive! ♥
sethmlarson.dev
December 23, 2025 at 9:58 PM
Delta emulator adds support for SEGA Genesis games
The Delta emulator, which I've used for mobile retro-gaming in the past, has added beta support for SEGA Genesis and Master System games! Riley and Shane made the announcement through the Delta emulator Patreon and also on Mastodon. You can install the emulator on iOS through the “TestFlight” application to get access right away. I've done so and tested many of my favorite games, including the Sonic the Hedgehog and Streets of Rage trilogies, and found that the emulator handled these games flawlessly.

_Delta emulator loaded with SEGA Genesis ROMs_

The addition of SEGA Genesis support in Delta is quite exciting for me as the Genesis was my first console. I've amassed quite the collection of SEGA Genesis ROMs from the Sonic Mega Collection on the GameCube and the SEGA Classics collection previously available on Steam. Now I can play any of these games on the go, but I'll probably need to buy a simple Bluetooth controller with a D-Pad for the hand ergonomics.

Unrelatedly: did you know that the AltStore is connected to the Fediverse now? Pretty cool stuff.

Have you tried the Delta emulator, or did you grow up playing SEGA Genesis games like me? Let me know your favorite game from this era!

_Playing Sonic the Hedgehog 3 & Knuckles using “LOCK-ON Technology”_

* * *

Thanks for keeping RSS alive! ♥
sethmlarson.dev
December 18, 2025 at 10:00 PM
Extracting Nintendo Switch “Play Activity” with OCR
Despite considering myself a “gamer”, I realized I had only played **~5 hours of video-games in the whole year 2022** and ~6 hours in 2021. Honestly, these numbers made me a bit sad to see... You can't “improve” what you don't measure, so I started looking for low-effort ways to measure my play time while getting back into _actually playing video-games_. By mid-2025 I had already achieved what I wanted for the GameCube using the Memcard Pro GC's Wi-Fi and API. I've blogged about this setup, which gathers date and duration data for GameCube play, but I wanted to cover my other consoles.

## What about the Nintendo Switch?

Surprisingly, the Nintendo Switch offered no such data, despite having an option called “Play Activity” in the menus of the Nintendo Switch, Nintendo Account, and many of their mobile apps. This was unfortunate, as I was playing many more new Nintendo Switch games like the Paper Mario: The Thousand-Year Door remake and Pikmin 4, and going back to games I had “missed” like Super Mario Odyssey.

That is... until the Nintendo Store app was released just a few weeks ago. This app provides “Play Activity” data at a **much higher resolution** than any other Nintendo app or service. You can find complete historical data across your Nintendo Account, going back as far as the _Nintendo 3DS and Wii U!_ The data includes games played, dates, and play durations in 15-minute increments. Shoutout to the WULFF DEN podcast for talking about this, otherwise I would never have discovered this niche new feature. But how can I query this data for my own purposes?

_Example of data available in the Nintendo Store “Play Activity”._

## Using Optical Character Recognition (OCR)

The data was in the app, but it couldn't be selected, copy-pasted, or exported. Instead, the data would have to be transferred to a queryable format another way. I took this as an opportunity to try out a technology I'd never used before: Optical Character Recognition (OCR).
OCR turns pictures of letters and numbers into actual strings of text. The state of the art for OCR today appears to be machine-learning models. After a bit of research, I landed on EasyOCR, which uses pre-trained PyTorch models. This appeared to require downloading the model from the internet, which bothered me a bit, but I decided that running the model within a Docker container without network access (`--net=none`) was _probably_ enough to guarantee this library wasn't sending my data off my machine.

I created a workflow (source code available on GitHub) that takes a directory of images mounted as a volume, runs OCR on each image, and then returns the parsed text as “JSON lines” for each image along with the checksum of the image. This checksum is stored by the program processing the OCR text to avoid running OCR on images more than once. This is an example of the text that OCR is able to read from one screenshot:

```json
[
  "20:13",
  "15",
  "Play Activity",
  "Animal Crossing: New Horizons",
  "5/9/2020",
  "1 hr; 15 min.",
  "5/8/2020",
  "1 hr. 0 min:",
  "5/5/2020",
  "45 min:",
  "5/4/2020",
  "1 hr. 30 min:",
  "5/3/2020",
  "A few min.",
  ...
]
```

There are some unexpected elements here! Notice how the phone time and battery are picked up by OCR, and how the play time durations all end with either `.` or `:`. This extra punctuation seems to come from the vertical border on the screen to the right of the text. The least consistent readings occur when there is text as part of the game logo.

## Segmenting and parsing OCR data

OCR can consistently read the actual text from the application itself, so we can use the `Play Activity` and `First played` labels as anchors to know where the other data is. Using these anchors we can segment the OCR text into:

* Phone UI (time, battery %)
* Game information (title, first played, last played)
* Game play activity (date, duration)

For some games the model really struggles to read the game title consistently.
To fix this I created a list of words that the OCR model _does_ consistently read and mapped those words to the corresponding game titles, such as “`Wonder`” → “`Super Mario Bros. Wonder`”. This would be a problem if I played more games, but we'll cross that bridge when we come to it! ;)

The game play activity data parses fairly consistently. The date is always `MM/DD/YYYY` and there are three forms of duration that the application uses:

* `A few min`
* `XX min`
* `X hr Y min`

Parsing the date and duration text while accounting for the extra punctuation was accomplished with a single regular expression:

```
([1-9][0-9]?/[1-9][0-9]?/2[0-9]{3}) (A few min|(?:([0-9]+)\s*hr[:;,. ]+)?([0-9]+)\s*min)
```

This parses out into 4 groups: the date, a “flag” for detecting “A few min”, and then hours and minutes. Because resolution below 15 minutes isn't shown by the application, I assigned the “A few min” duration an approximate value of 5 minutes of play time. The explicit hours and minutes values are calculated as expected.

So now we have the game name and a list of play activity dates and durations from a single image. Do that for each image and insert the results into an SQLite database that you can query:

```sql
SELECT STRFTIME('%Y', date) AS y, SUM(duration)/3600 AS d
FROM sessions
GROUP BY y ORDER BY y ASC;
```

The results show just how little I was playing video games in 2021 and 2022, and how I started playing more again from 2023 onwards.

Year | Play Activity (Hours)
---|---
2020 | 151
2021 | 6
2022 | 5
2023 | 30
2024 | 33
2025 | 66 ❤️

Whenever I want fresh data I can take new screenshots of the Nintendo Store app on my phone, place them in the `images/` folder, and run the `index.py` script, which only runs OCR on the new images. If this blog post was interesting to you, I'm planning to look at this data combined with my GameCube play activity data before the end of 2025. _Stay tuned and play more games!_

* * *

Thanks for keeping RSS alive! ♥
sethmlarson.dev
December 10, 2025 at 9:49 PM
Deprecations via warnings don’t work for Python libraries
Last week urllib3 v2.6.0 was released, containing removals of several APIs that we've known were problematic since 2019 and that have been deprecated since 2022. The deprecations were marked in the documentation, the changelog, and what I incorrectly believed would be the most meaningful signal to users: a `DeprecationWarning` emitted for each use of the API. The API that urllib3 recommended users use instead has the same features and no compatibility issues between urllib3 1.x and 2.x:

```python
resp = urllib3.request("GET", "https://example.com")

# Deprecated APIs
resp.getheader("Content-Length")
resp.getheaders()

# Recommended APIs
resp.headers.get("Content-Length")
resp.headers
```

This API was emitting warnings for over 3 years in a top-3 Python package by downloads, urging libraries and users to stop using the API, and **that was not enough**. We still received feedback from users that the removal was unexpected and was breaking dependent libraries. We ended up adding the APIs back and creating a hurried release to fix the issue.

It's not clear to me that waiting longer would have helped, either. The libraries that were impacted are actively developed, like the Kubernetes client, Fastly client, and Airflow, and I trust that if the message had reached them they would have taken action.

My conclusion from this incident is that `DeprecationWarning` in its current state does not work for deprecating APIs, at least for Python libraries. That is unfortunate, as `DeprecationWarning` and the `warnings` module are easy-to-use, language-“blessed”, and explicit without impacting users that don't need to take action due to deprecations. Any other method of deprecating API features is likely to be home-grown and different for each project, which is far worse for users and project maintainers.

## Possible solutions?

`DeprecationWarning` is called out in the “ignored by default” list for Python.
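The “ignored by default” behavior is easy to demonstrate. This is a minimal sketch that simulates CPython's default filters explicitly with `warnings.simplefilter` (so the result doesn't depend on how the interpreter was started): the `DeprecationWarning` silently vanishes while the `UserWarning` is recorded.

```python
import warnings

# Simulate CPython's default treatment of warnings raised from
# imported (non-__main__) code: DeprecationWarning is ignored,
# most other warnings are shown.
with warnings.catch_warnings(record=True) as caught:
    warnings.simplefilter("ignore", DeprecationWarning)
    warnings.simplefilter("always", UserWarning)

    warnings.warn("getheader() is deprecated", DeprecationWarning)
    warnings.warn("something else is wrong", UserWarning)

# Only the UserWarning was recorded; the deprecation disappeared.
print([w.category.__name__ for w in caught])  # → ['UserWarning']
```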
I could ask for more Python developers to run with warnings enabled, but solutions in the form of “if only we could all just” are a folly. Maybe the answer is for each library to create its own “deprecation warning” equivalent just to not be in the “ignored by default” list:

```python
import warnings

class Urllib3DeprecationWarning(UserWarning):
    pass

warnings.warn(
    "HTTPResponse.getheader() is deprecated",
    category=Urllib3DeprecationWarning,
    stacklevel=2,
)
```

Maybe the answer is to do away with advance notice and adopt SemVer with many major versions, similar to how Cryptography operates for API compatibility. Let me know if you have other ideas.

* * *

Thanks for keeping RSS alive! ♥
sethmlarson.dev
December 8, 2025 at 9:59 PM
One weird trick for cheaper physical Switch 2 games?
**Maybe sell your boxes?** I happened to be browsing PriceCharting and saw that _only the box_ for Kirby Air Riders was selling for **$20** on average. I couldn't believe my eyes! But I looked, and there were at least two boxes sold on eBay. I quickly put together this table after manually sorting through the detected eBay listings for three popular Switch 2 titles: Mario Kart World, Donkey Kong Bananza, and Kirby Air Riders. The data only includes listings that actually sold at the price and filters out incorrect listings (such as boxes for the Switch 2 console or bundles). Here are the results in USD:

Game | Box-Only Prices (SOLD)
---|---
Kirby Air Riders | $16.99, $29.99 (!!!)
Mario Kart World | $17.99, $19.00, $15.00
Donkey Kong Bananza | $16.49, $27.97 (!!!)
**Average Box Price** | **$20.49**

I am not sure what to make of this. If you include a conservative shipping cost of $8 you're _still saving money_ over a digital copy by buying physical and selling the box, assuming you're able to sell the box for an average price. I suspect if everyone were to employ this strategy the return would deteriorate quickly. If you combine this with the storage cost savings of buying physical over digital then you're "saving" even more money despite the $10 price difference. Something to consider if you're not a video-game ~~hoarder~~ “ _collector_ ”, like me.

* * *

Thanks for keeping RSS alive! ♥
sethmlarson.dev
December 2, 2025 at 5:54 PM
Mobile browsers see telephone numbers everywhere
Just like Excel seeing everything as a date, mobile browsers automatically interpret many numbers as telephone numbers. When detected, mobile browsers replace the text in the HTML with a clickable `<a href="tel:...">` element that, when selected, will call the number denoted. This can be helpful sometimes, but frustrating other times as random numbers in your HTML suddenly become useless hyperlinks.

Below I've included numbers that _may_ be turned into phone numbers so you can see for yourself why this may be a problem and how many cases there are. Numbers that are detected as a phone number by your browser are highlighted blue by this CSS selector:

```css
a[href^=tel] { background-color: #00ccff; }
```

None of the values below are denoted as telephone number links in the source HTML; they are all automatically created by the browser. Also, if you're not using a mobile browser **then the below numbers won't be highlighted**. Try opening this page on a mobile phone.

* 2
* 22
* 222
* 2222
* 22222
* 222222
* 2222222
* 22222222
* 222222222
* 2222222222
* 22222222222
* 111111111111
* 222222222222
* 555555555555
* 1111111111111
* 2222222222222 (???)
* 5555555555555
* 11111111111111
* 22222222222222
* 55555555555555
* 111111111111111
* 222222222222222
* 555555555555555
* 2-2
* 2-2-2
* 22-2-2
* 22-22-2
* 22-22-22
* 22-22-222
* 22-222-222
* 222-222-222
* 222-222-2222
* 222-2222-2222
* 2222-2222-2222
* 2222-2222-22222
* 2222-22222-22222
* 22222-22222-22222
* 2 222-222-2222
* +1 222-222-2222
* +2 222-222-2222 (There is no +2 country code...)
* +28 222-222-2222 (Unassigned codes aren't used)
* +1222-222-2222
* +2222-222-2222
* (+1)222-222-2222
* (+2)222-222-2222
* (1)222-222-2222
* (2)222-222-2222
* (1222-222-2222
* (1 222-222-2222
* 1)222-222-2222
* 222–222–2222 (en-dashes)
* 222—222—2222 (em-dashes)
* [1]222-222-2222
* <1>222-222-2222

Are there any other combinations that get detected as telephone numbers that I missed? Send me a pull request or email.
## How to prevent automatic telephone number detection?

So how can you prevent browsers from parsing telephone numbers automatically? Add this HTML to your `<head>` section:

```html
<meta name="format-detection" content="telephone=no">
```

This will disable automatic telephone detection, and then you can be explicit about clickable telephone numbers by using the `tel:` URL scheme like so:

```html
<a href="tel:+222-222-222-2222">(+222)222-222-2222</a>
```

* * *

Thanks for keeping RSS alive! ♥
sethmlarson.dev
November 25, 2025 at 5:44 PM
Blogrolls are the Best(rolls)
Happy 6-year blogiversary to me! 🎉 To celebrate I want to talk about _other people's blogs_, more specifically the magic of “blogrolls”. Blogrolls are “lists of other sites that you read, are a follower of, or recommend”. Any blog can host a blogroll, and sometimes a website can be one big blogroll. I’ve hosted a blogroll on my own blog since 2023 and encourage other bloggers to do so. My own blogroll is generated from the list of RSS feeds I subscribe to and articles that I “favorite” within my RSS reader. If you want to be particularly fancy you can add an RSS feed (example) to your blogroll that provides readers a method to “subscribe” to future blogroll updates.

Blogrolls are like catnip for me: I cannot resist opening and `Ctrl`-clicking every link until I can’t see my tabs anymore. The feeling is akin to the first deep breath of air before starting a hike: there’s a rush of new information, topics, and potential new blogs to follow.

Blogrolls can bridge the “effort chasm” I frequently hear about as an issue when I recommend folks try an RSS feed reader. We’re not used to empty feeds anymore, and self-curating blogs until you receive multiple articles per day takes time and effort. Blogrolls can help here, especially ones published in the importable OPML format. You can instantly populate your feed reader app with hundreds of feeds from blogs that are likely relevant to you. Simply create an account on a feed reader, import the blogroll OPML document from a blogger you enjoy, and watch the articles “roll” in. Blogrolls are almost like Bluesky “Starter Packs” in this way!

Hopefully this has convinced you to either curate your own blogroll or to start looking for (or asking for!) blogrolls from your favorite writers on the Web. Share your favorite blogroll with me via email or social media. Title inspired by “Hexagons are the Best-agons”.

* * *

Thanks for keeping RSS alive! ♥
sethmlarson.dev
November 12, 2025 at 5:42 PM
Ice Pikmin and difficulty of Pikmin Bloom event decor sets
I play Pikmin Bloom regularly with a group of friends. The game can best be described as “Pokémon Go, but walking”. One of the main goals of the game is to collect “decor Pikmin”, which can come from the environment, landmarks, and businesses that you walk by. Recently there's been a change to the game that makes completing sets of decor Pikmin **significantly more difficult**; this post explores the new difficulty increase.

Every month there are special decor Pikmin which are earned by completing challenges like walking, growing Pikmin, or planting flowers. The type of Pikmin you receive is randomized among the available types, and you can only complete the set by collecting one of each available Pikmin type. You can only receive these special decors during the specific month; after the month is over you have to wait a calendar year before you can continue collecting seedlings.

Just a few days ago there were 7 Pikmin types, corresponding to the first three mainline Pikmin games: Red, Yellow, Blue, White, Purple, Rock, and Wing Pikmin. With the most recent update another Pikmin type has been added: Ice Pikmin from Pikmin 4. This means that going forward there will be 8 Pikmin types per event decor set instead of only 7. **So, what does that mean for the difficulty of the game?**

I could probably do some math here, but it'd be much easier to simulate how many event Pikmin you'd need to receive before completing the set, depending on whether there are 7 or 8 total Pikmin types. Running this Python simulation script creates the below table, which shows the difference in cumulative probability of completing a decor set after receiving a number of Pikmin seedlings. For example, if you have grown 10 seedlings you'd have a **10.5% chance** of completing the decor set **before Ice Pikmin** and only a **2.8% chance** of completing the decor set **after Ice Pikmin**.
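The original simulation script isn't reproduced here, but the idea is a classic “coupon collector” simulation and can be sketched in a few lines (my own reconstruction, not the author's script): draw uniformly-random Pikmin types until one of each has been seen, then tally how often the set is complete within a given number of seedlings.

```python
import random

def seedlings_to_complete(num_types: int, rng: random.Random) -> int:
    """Draw uniformly-random Pikmin types until we've seen one of each."""
    seen: set[int] = set()
    draws = 0
    while len(seen) < num_types:
        seen.add(rng.randrange(num_types))
        draws += 1
    return draws

def completion_rate(num_types: int, max_seedlings: int,
                    trials: int = 50_000, seed: int = 42) -> float:
    """Fraction of simulated months where the full decor set
    was collected within `max_seedlings` seedlings."""
    rng = random.Random(seed)
    completed = sum(
        seedlings_to_complete(num_types, rng) <= max_seedlings
        for _ in range(trials)
    )
    return completed / trials

# Roughly matches the table below for 10 seedlings:
# ~10.5% with 7 types, ~2.8% with 8 types.
print(f"{completion_rate(7, 10):.1%}", f"{completion_rate(8, 10):.1%}")
```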
# Seedlings | Before | After | Diff
---|---|---|---
7 | 0.6% | 0.0% | -0.6%
8 | 2.4% | 0.3% | -2.2%
9 | 5.8% | 1.1% | -4.7%
10 | 10.5% | 2.8% | -7.7%
11 | 16.3% | 5.6% | -10.7%
12 | 22.8% | 9.4% | -13.5%
13 | 29.7% | 13.9% | -15.8%
14 | 36.6% | 19.2% | -17.4%
15 | 43.3% | 24.9% | -18.5%
16 | 49.7% | 30.7% | -19.0%
17 | 55.7% | 36.5% | -19.1%
18 | 61.1% | 42.3% | -18.8%
19 | 66.0% | 47.9% | -18.1%
20 | 70.4% | 53.1% | -17.3%
21 | 74.3% | 57.9% | -16.4%
22 | 77.7% | 62.4% | -15.3%
23 | 80.7% | 66.5% | -14.2%
24 | 83.3% | 70.3% | -13.1%
25 | 85.6% | 73.6% | -12.0%
26 | 87.6% | 76.7% | -10.9%
27 | 89.3% | 79.4% | -9.9%
28 | 90.8% | 81.8% | -9.0%
29 | 92.1% | 84.0% | -8.1%
30 | 93.2% | 85.9% | -7.3%
31 | 94.2% | 87.6% | -6.5%
32 | 95.0% | 89.1% | -5.9%
33 | 95.7% | 90.4% | -5.3%
34 | 96.3% | 91.6% | -4.7%
35 | 96.8% | 92.6% | -4.2%
36 | 97.3% | 93.6% | -3.7%
37 | 97.7% | 94.4% | -3.3%
38 | 98.0% | 95.1% | -2.9%
39 | 98.3% | 95.7% | -2.6%
40 | 98.5% | 96.2% | -2.3%
41 | 98.7% | 96.7% | -2.1%
42 | 98.9% | 97.1% | -1.8%
43 | 99.1% | 97.5% | -1.6%
44 | 99.2% | 97.8% | -1.4%
45 | 99.3% | 98.0% | -1.3%
46 | 99.4% | 98.3% | -1.1%
47 | 99.5% | 98.5% | -1.0%
48 | 99.6% | 98.7% | -0.9%
49 | 99.6% | 98.9% | -0.8%
50 | 99.7% | 99.0% | -0.7%

For mid-range numbers of Pikmin seedlings (13-22) you'll be **at least 15% less likely** to have completed the Pikmin decor set for that number of seedlings. To have a 95% chance of completing a decor set you'd need to gather 32 seedlings prior to Ice Pikmin; with Ice Pikmin you'd need to collect 38 seedlings.

I don't know how many event Pikmin seedlings I receive in a typical month, so I'll be watching that number to see if I'm able to complete the set. Good luck out there, Pikmin players! 😬

* * *

Thanks for keeping RSS alive! ♥
sethmlarson.dev
November 6, 2025 at 5:41 PM
GameCube Nintendo Classics and storage size
If you're into GameCube collecting and archiving you may already know that GameCube ISOs or "ROMs" are around ~1.3 GB in size, regardless of the game contained within the `.iso` file. This is because GameCube ROMs are all copies of the same disc format: the GameCube Game Disc (DOL-006). The GameCube Game Disc is an 8 cm miniDVD-based disc with a fixed storage capacity of 1.5 GB. Compare this to cartridges, which, using memory-mapping controllers (MMC), can contain different amounts of ROM storage depending on the size of the game data itself.

This was a concern raised by some GameCube players on Switch. Given the prices of microSD Express storage (~28¢/GB) and the size of the GameCube game library (>650 total, 45 first-party), storage requirements could increase quickly as new GameCube titles are added. Luckily, looking at the data about the GameCube Nintendo Classics application on the Switch, we can see that the ROMs in use are "trimmed", such that their size is less than 1.3 GB:

Date | Titles | Games | Storage | Storage/Game
---|---|---|---|---
2025-06-03 | F-Zero GX, Legend of Zelda: The Wind Waker, Soulcalibur II | 3 | 3.5 GB | 1.16 GB
2025-07-03 | Super Mario Strikers | 4 | ??? GB | ??? GB
2025-08-21 | Chibi-Robo! | 5 | 6.9 GB | 1.38 GB
2025-10-30 | Luigi's Mansion | 6 | 7.2 GB | 1.2 GB

Luigi's Mansion in particular is known to only require ~100 MB of data on the 1.3 GB disc. Animal Crossing for the GameCube is also legendarily small due to starting life as an N64 game, requiring only 50 MB of data.

It'll be interesting to see what happens for the first multi-disc game to be added to Nintendo Classics. Notably, Namco already has a GameCube game in Nintendo Classics: Soulcalibur II. For this reason, I suspect that the first multi-disc game will be one of these three published by Namco:

* Baten Kaitos Origins
* Baten Kaitos: Eternal Wings and the Lost Ocean
* Tales of Symphonia

* * *

Thanks for keeping RSS alive! ♥
sethmlarson.dev
November 4, 2025 at 5:40 PM
RSS feed for new Nintendo Classics games
It's November! Many folks use this month to write more, whether it's a novel or generating text. I'm going to be trying to write and share more often, too. So here's something I created mostly for me, but maybe for you too. I've created a small RSS feed for new games being added to the Nintendo Classics collection over time. Nintendo uses this collection as the drippiest-of-drip-feeds, so there are typically only a few new games per month. So instead of checking frequently, I can follow this feed in my feed reader and be notified of new releases.

I thought this was interesting for a few reasons. First, I implemented “Oxford commas” when joining a `list[str]` using f-strings, like so (split apart for easier reading):

```python
def oxford_comma(x: list[str]) -> str:
    return (
        f"{', '.join(x[:-1])}"
        f"{',' if len(x) >= 3 else ''}"     # Oxford comma!
        f"{' and ' if len(x) >= 2 else ''}"
        f"{x[-1]}"                          # Last or only element.
    )
```

I'm sure this could be done in less space somehow; if you're able to code-golf this smaller please send me your code :)

The second interesting thing about the RSS feed is that there's only one `<item>` or entry in the feed at any one time. I suppose I could have implemented the latest N groups of new releases, but I felt that wouldn't be useful for me, as I was already caught up on what had been released in the past month or so. Unsurprisingly, my feed reader had no issue with a single-entry feed on the first crawl, but I've never seen this in the wild so it'll be interesting to see how the reader reacts to a new entry replacing the old one.

* * *

Thanks for keeping RSS alive! ♥
sethmlarson.dev
November 2, 2025 at 5:40 PM
Drawing an ASCII TIE fighter for post-quantum cryptography
This is a funny short story about contributing to internet standards. The real heroes of the story are Filippo Valsorda and all the other contributors to post-quantum cryptography (PQC) standards. Without their efforts internet communications would be less secure, so thank you :)

As I understand the situation, the internet standard being discussed is “Concrete Hybrid PQ/T Key Encapsulation Mechanisms” (KEMs), which combine PQ algorithms with traditional algorithms. The reasoning is that PQ algorithms are relatively new, which is not _great_ for cryptography, a field that prefers algorithms that have survived many years of scrutiny and analysis, such as EdDSA. Therefore, it is desirable to have a “hybrid” KEM designed to “fail gracefully”: only the _quantum safety_ property is lost if the PQ algorithm turns out to be insecure, while safety against the traditional attack vectors that exist without sufficiently powerful quantum computers is not compromised. My intuition is this will allow for more confident experimentation and deployment of PQ algorithms, as the stakes are much lower for actually rolling out the new algorithms with this construction.

There are three concrete hybrid KEM instances defined within the standard. They are named as follows, though the names may change in the future:

* `MLKEM768-P256`
* `MLKEM768-X25519` (aka “X-Wing”)
* `MLKEM1024-P384`

Note that the X-Wing KEM was first published in January 2024, much earlier than the other KEMs in this draft. You may have already guessed that the name of the "X-Wing" KEM is relevant to the title of this blog post :)

These hybrid KEMs are made up of 5 components:

* A traditional component that is either a “nominal group” or a traditional KEM
* A post-quantum KEM
* A pseudo-random number generator (PRG)
* A key-derivation function (KDF)
* And finally, a label: a byte string identifying the specific combination of the above components.
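Conceptually, the label's job is to be bound into the key derivation alongside both components' outputs, so that different hybrid combinations can never derive the same key. A very rough sketch of that idea (this is _not_ the draft's actual construction; the draft specifies exactly which values are fed to the KDF and in what encoding):

```python
import hashlib

def hybrid_combine(pq_ss: bytes, trad_ss: bytes,
                   transcript: bytes, label: bytes) -> bytes:
    """Toy combiner: derive one shared secret from both the
    post-quantum and traditional shared secrets, the protocol
    transcript (ciphertexts/public keys), and the label."""
    h = hashlib.sha3_256()
    for part in (pq_ss, trad_ss, transcript, label):
        h.update(part)
    return h.digest()

# X-Wing's label bytes, from the draft:
label = bytes.fromhex("5C2E2F2F5E5C")
ss = hybrid_combine(b"\x01" * 32, b"\x02" * 32, b"ciphertexts", label)
print(len(ss))  # → 32
```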
For the X-Wing KEM the label is the byte string `0x5C2E2F2F5E5C`, which printed as ASCII looks like an X-Wing from Star Wars:

```
\./
/^\
```

Note that the newline was added to better visualize the ASCII art of an X-Wing; the newline isn't present in the actual KEM label byte string.

This is where my concrete understanding of the context gets fuzzier, and figuring it all out would require digging through IETF mailing list exchanges. As far as I could tell from a quick read, the new KEM constructions being proposed were going to have labels that were real names instead of ASCII art, matching the actual name of the construction, basically whatever name you'd end up configuring in OpenSSL, NGINX, etc. This naming discussion was taking time, and because the label is used for key derivation, implementations of this draft standard couldn't be deployed: the label was not finalized and could change later.

Filippo asked on the IETF mailing list whether the label and the name could be disconnected for the new set of KEM constructions, instead following the lead of X-Wing by using small ASCII art of spaceships. This would let the naming discussion continue while allowing implementers to begin deploying their experiments without fear of the labels changing at a later time. So Filippo created a few new ASCII art pieces, one of an imperial TIE fighter for `MLKEM768-P256` and another of an imperial Lambda shuttle for `MLKEM1024-P384`:

```
|A|   |
|V|  /-\
```

Filippo posted about this on Mastodon, where Frederik Braun suggested changing the TIE fighter to `|-o-|`. Filippo wanted to keep the characters used at exactly 6 bytes, so I suggested:

> Post by @sethmlarson@mastodon.social View on Mastodon

Frederik and Filippo approved of my rendition, and submitted a pull request which was eventually merged into the draft. I love the little easter-eggs left in internet standards by their authors, so it felt special to be able to contribute my own for future readers' enjoyment.
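If you'd like to double-check the X-Wing label decoding yourself, here's a quick snippet (the 3-character split is only for display; the label itself has no newline):

```python
# Decode the X-Wing KEM label bytes into their ASCII art form.
label = bytes.fromhex("5C2E2F2F5E5C")
art = label.decode("ascii")

print(art[:3])  # top half:    \./
print(art[3:])  # bottom half: /^\
```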
_Happy implementing!_ :)

* * *

Thanks for keeping RSS alive! ♥
sethmlarson.dev
October 30, 2025 at 5:41 PM
Re(blog, tweet, toot, skoot, skeet, post)
Have you noticed the similar terms used for sharing someone else's content, with attribution, from your own account? Reblogging was the original term for “blogging” another user's content, first developed by project “reBlog” and popularized by Tumblr. Remember that “blog” is a truncation of “weblog” (... should it have been **’blog** to capture the shortening?) Anyway, here's a railroad diagram of all the different words I could think of:

_(Railroad diagram: “re” followed by “blog”, “t(wee|oo)t” (Twitter & Mastodon), “sk(ee|oo)t” (Bluesky), or “post”; plus “boost” on its own.)_

Tweet was coined for Twitter, a “micro-blogging” platform, so retweets were what reblogging was called on that platform. That naming has since changed, but wouldn't you know it: tweets used to be known as “twits”. Given Twitter's owner, that name makes more sense now than ever. “Toots” are an elephant-themed “tweet” for Mastodon, with “boost” being the official term for sharing. “Skoot” was initially proposed for Bluesky, but cleverer folks suggested “skeet”, much to Jay's dismay (stop trying to make skoot happen). Now you see less platform-specific terminology being thrown around, like “post” and “repost”. Personally, I'm not a fan: check your _posts_ at the door, **we're goin’ ‘bloggin’!** 🏄︎

I and many other blogs publish a “blogroll”, a list of other blogs and pages that we've “reblogged” on our own websites. If you're interested, give those a read and discover something new by surfing the web like we did in the 90s.

* * *

Thanks for keeping RSS alive! ♥
sethmlarson.dev
October 28, 2025 at 5:38 PM
Is the "Nintendo Classics" collection a good value?
Nintendo Classics is a collection of hundreds of retro video games from Nintendo (and Sega) consoles, from the NES to the GameCube. Nintendo Classics is included with the Nintendo Switch Online (NSO) subscription, which starts at $20/year (~$1.66/month) for individual users. Looking at the prices of retro games these days, this seems like an incredible value for players that want to play these games. This post shares a dataset that I've curated about Nintendo Classics games, mapping their value to actual physical prices of the same games, with some interesting queries. For example, here's a graph showing the total value (in $USD) of Nintendo Classics over time:

The dataset was generated from the tables provided on Wikipedia (CC-BY-SA). The dataset doesn't contain pricing information, only links to the corresponding PriceCharting pages. This page only shares approximate aggregate price information, not prices of individual games. This page will be automatically updated over time as Nintendo announces more games coming to Nintendo Classics. This page was last updated **October 7th, 2025**.

## How many games and value per platform?

There are 8 unique platforms on Nintendo Classics, each with its own collection of games. The below table includes the value of both added and announced-but-not-added games. You can see that the total value of games in Nintendo Classics is _many thousands of dollars_ if genuine physical copies were purchased instead.
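The aggregate tables in this post come from SQL queries over the dataset. The schema itself isn't shown, but judging by the columns those queries reference, it's roughly something like the following (a guess for illustration, with a made-up example row; not the author's actual schema or data):

```python
import sqlite3

# Guessed schema covering every column the post's queries reference:
# platform, name, publisher, price, added_date, removed_date.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE games (
        platform TEXT NOT NULL,
        name TEXT NOT NULL,
        publisher TEXT NOT NULL,
        price REAL,         -- physical price in USD (illustrative)
        added_date TEXT,    -- ISO 8601, NULL if only announced
        removed_date TEXT   -- NULL unless delisted
    )
""")
conn.execute(
    "INSERT INTO games VALUES (?, ?, ?, ?, ?, ?)",
    ("SNES", "EarthBound", "Nintendo", 210.0, "2022-02-09", None),
)

# The per-platform aggregation used throughout the post:
row = conn.execute("""
    SELECT platform, COUNT(*), SUM(price), SUM(price)/COUNT(*)
    FROM games GROUP BY platform
""").fetchone()
print(row)  # → ('SNES', 1, 210.0, 210.0)
```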
Here's a graph showing the total value of each platform changing over time:

And here's the data for all published and announced games as a table:

Platform | Games | Total Value | Value per Game
---|---|---|---
NES | 91 | $1980 | $21
SNES | 83 | $3600 | $43
Game Boy (GB/GBC) | 41 | $1615 | $39
Nintendo 64 (N64) | 42 | $1130 | $26
Sega Genesis | 51 | $2910 | $57
Game Boy Advance (GBA) | 30 | $930 | $31
GameCube | 9 | $640 | $71
Virtual Boy | 14 | $2580 | $184
All Platforms | 361 | $15385 | $42

View SQL query

```sql
SELECT platform, COUNT(*), SUM(price), SUM(price)/COUNT(*)
FROM games GROUP BY platform;
```

## How much value is in each Nintendo Classics tier?

There are multiple "tiers" of Nintendo Classics, each with a different up-front price (for the console itself) and ongoing price for the Nintendo Switch Online (NSO) subscription. Certain collections require specific hardware: Virtual Boy requires either the recreation ($100) or cardboard ($30) Virtual Boy headset, and the GameCube collection requires a Switch 2 ($450). All other collections work just fine with a Switch Lite ($100). All platforms beyond NES, SNES, Game Boy, and Game Boy Color require NSO + Expansion Pack.

Platforms | Requires | Price | Games | Games Value
---|---|---|---|---
NES, SNES, GB, GBC | Switch Lite & NSO * | $100 + $20/Yr | 215 | $7195
+N64, Genesis, GBA | Switch Lite & NSO+EP | $100 + $50/Yr | 338 | $12165
+Virtual Boy | Switch Lite, NSO+EP, & VB | $130 + $50/Yr | 352 | $14745
+GameCube | Switch 2 & NSO+EP | $450 + $50/Yr | 361 | $15385

\* I wanted to highlight that Nintendo Switch Online (NSO) without the Expansion Pack has the option to pay $3 monthly rather than $20 yearly. This doesn't make sense if you're paying for a whole year anyway, but if you just want to play a game in the NES, SNES, GB, or GBC collections you can pay $3 for a month of NSO and play games very cheaply.

## How often are games added to Nintendo Classics?
Nintendo Classics tends to add a few games per platform every year. Usually when a platform is first announced, a whole slew of games is added with the announcement, followed by a slow drip-feed of games later. Here's the breakdown of how many games were added to each platform per year:

Platform | 2018 | 2019 | 2020 | 2021 | 2022 | 2023 | 2024 | 2025
---|---|---|---|---|---|---|---|---
NES | 30 | 30 | 8 | 2 | 5 | 4 | 12 |
SNES | | 25 | 18 | 13 | 9 | 1 | 9 | 8
N64 | | | | 10 | 13 | 8 | 8 | 3
Genesis | | | | 20 | 17 | 8 | 3 | 3
Game Boy | | | | | | 19 | 16 | 6
GBA | | | | | | 13 | 12 | 5
GameCube | | | | | | | | 9
Virtual Boy | | | | | | | |
All Platforms | 30 | 55 | 26 | 55 | 43 | 53 | 60 | 34

View SQL query:

SELECT platform, STRFTIME('%Y', added_date) AS year, COUNT(*)
FROM games GROUP BY platform, year ORDER BY platform, year DESC;

## What are the rarest or most valuable games in Nintendo Classics?

There are a bunch of valuable and rare games available in Nintendo Classics. Here are the top-50 most expensive games that are available in the collection:

Platform | Game | Added Date
---|---|---
Virtual Boy | Jack Bros. | TBA
Virtual Boy | Virtual Bowling | TBA
Genesis | Crusader of Centy | 2023-06-27
Genesis | Pulseman | 2023-04-18
Virtual Boy | Space Invaders Virtual Collection | TBA
Genesis | Alien Soldier | 2022-03-16
SNES | EarthBound | 2022-02-09
Genesis | MUSHA | 2021-10-25
SNES | Harvest Moon | 2022-03-30
SNES | Wild Guns | 2020-05-20
Virtual Boy | Innsmouth no Yakata | TBA
Genesis | Mega Man: The Wily Wars | 2022-06-30
GB/GBC | Mega Man V | 2024-06-07
SNES | Sutte Hakkun | 2025-01-24
NES | Fire 'n Ice | 2021-02-17
SNES | Kirby's Dream Land 3 | 2019-09-05
NES | Donkey Kong Jr. Math | 2024-07-04
GB/GBC | Survival Kids | 2025-05-23
SNES | Demon's Crest | 2019-09-05
GameCube | Chibi-Robo! | 2025-08-21
GameCube | Pokémon XD: Gale of Darkness | TBA
GB/GBC | Castlevania Legends | 2023-10-31
NES | S.C.A.T.: Special Cybernetic Attack Team | 2020-09-23
SNES | Star Fox 2 | 2019-12-12
SNES | Kirby's Star Stacker | 2022-07-21
GBA | F-Zero Climax | 2024-10-11
GameCube | Pokémon Colosseum | TBA
GB/GBC | Mega Man IV | 2024-06-07
SNES | Uncharted Waters: New Horizons | 2025-03-28
Virtual Boy | Virtual Boy Wario Land | TBA
NES | Shadow of the Ninja | 2020-02-19
SNES | Super Metroid | 2019-09-05
GBA | Mr. Driller 2 | 2025-09-25
SNES | Joe & Mac 2: Lost in the Tropics | 2019-09-05
SNES | Breath of Fire II | 2019-12-12
SNES | Umihara Kawase | 2022-05-26
Genesis | Gunstar Heroes | 2021-10-25
Genesis | Ristar | 2021-10-25
Virtual Boy | Virtual Fishing | TBA
NES | Vice: Project Doom | 2019-08-21
N64 | Sin and Punishment | 2021-10-25
N64 | Pokémon Stadium 2 | 2023-08-08
Genesis | Castlevania: Bloodlines | 2021-10-25
Genesis | Phantasy Star IV | 2021-10-25
SNES | The Peace Keepers | 2020-09-23
GB/GBC | Kirby Tilt 'n' Tumble | 2023-06-05
N64 | The Legend of Zelda: Majora's Mask | 2022-02-25
Virtual Boy | Mario Clash | TBA
SNES | Super Valis IV | 2020-12-18
SNES | Wrecking Crew '98 | 2024-04-12

View SQL query:

SELECT platform, name, added_date
FROM games ORDER BY price DESC LIMIT 50;

## Who publishes their games to Nintendo Classics?

Nintendo Classics has more publishers than just Nintendo and Sega. Looking at which third-party publishers are publishing their games to Nintendo Classics can give you a hint at which future games might make their way to the collection:

Publisher | Games | Value
---|---|---
Capcom | 17 | $1055
Xbox Game Studios | 13 | $245
Koei Tecmo | 13 | $465
City Connection | 11 | $240
Konami | 10 | $505
Bandai Namco Entertainment | 9 | $190
Sunsoft | 7 | $155
Natsume Inc. | 7 | $855
G-Mode | 7 | $190
Arc System Works | 6 | $110

View SQL query:

SELECT publisher, COUNT(*) AS num_games, SUM(price)
FROM games WHERE publisher NOT IN ('Nintendo', 'Sega')
GROUP BY publisher ORDER BY num_games DESC LIMIT 20;

## What games have been removed from Nintendo Classics?

Only one game has been removed from Nintendo Classics so far. There will likely be more in the future:

Platform | Game | Added Date | Removed Date
---|---|---|---
SNES | Super Soccer | 2019-09-05 | 2025-03-25

View SQL query:

SELECT platform, name, added_date, removed_date
FROM games WHERE removed_date IS NOT NULL;

This site uses the MIT-licensed ChartJS for the line chart visualization.

* * *

Thanks for keeping RSS alive! ♥
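The SQL queries throughout this post can be run with nothing more than Python's built-in `sqlite3` module. Here's a minimal sketch, assuming the dataset lives in a SQLite database with a `games` table whose columns (`platform`, `name`, `price`, and so on) are inferred from the queries in this post:

```python
import sqlite3

def value_per_platform(conn: sqlite3.Connection) -> list[tuple]:
    """Run the per-platform value query: game count, total value,
    and (integer) average value per game, grouped by platform."""
    return conn.execute(
        "SELECT platform, COUNT(*), SUM(price), SUM(price)/COUNT(*) "
        "FROM games GROUP BY platform ORDER BY platform"
    ).fetchall()

# Hypothetical usage against a local "games.db" file:
# conn = sqlite3.connect("games.db")
# for platform, games, total, per_game in value_per_platform(conn):
#     print(f"{platform}: {games} games, ${total} total, ${per_game}/game")
```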
sethmlarson.dev
October 20, 2025 at 5:37 PM
Winning a bet about “six”, the Python 2 compatibility shim
Exactly five years ago today Andrey Petrov and I made a bet about whether “`six`”, the compatibility shim for Python 2 and 3 APIs, would still be in the top 20 daily downloads on PyPI. I said it would; Andrey took the side against. Well, today I can say that I've won the bet. When the bet was placed, `six` was #2 in terms of daily downloads, and today `six` is #14. Funnily enough, `six` was still exactly #14 back in 2023:

> “six is top 14 after 3 years, 2 years left, sounds like [Andrey] is winning”
> -- Quentin Pradet (2023-07-09)

Completely unrelated to this bet, Hynek mentioned `six` still being in the top 20 downloaded packages during his PyCon UK keynote.

`six` itself isn't a library that many use on its own: at least 96% of `six` downloads come from Python 3 versions, where the compatibility shim isn't needed. Instead, the library is installed because other libraries depend on it. Here are the top packages that still depend on `six`:

Package | Downloads / Day | Last Uploaded
---|---|---
python-dateutil | 22M | 2024-03-01
yandexcloud | 6M | 2025-09-22
azure-core | 4M | 2025-09-11
jedi | 2M | 2024-11-11
kubernetes | 2M | 2025-06-09
rfc3339-validator | 2M | 2021-05-12
google-pasta | 1M | 2020-03-13
confluent-kafka | 1M | 2025-08-18
oauth2client | 1M | 2018-09-07
ecdsa | 1M | 2025-03-13

These packages were found by querying my own dataset about PyPI:

SELECT packages.name, packages.downloads
FROM packages JOIN deps ON packages.name = deps.package_name
WHERE deps.dep_name = 'six'
GROUP BY packages.name ORDER BY packages.downloads DESC LIMIT 10;

Notice how a single popular library, `python-dateutil`, keeping `six` as a dependency was enough to carry me to victory. Without `python-dateutil` I likely would have lost this bet. I also wanted to note the "Last Uploaded" dates: some of these libraries aren't uploaded frequently, which may explain why they still depend on `six`.

> “surely in 10 years, six won't be a thing. right? RIGHT?”
> -- Andrey Petrov (2020-10-01)

We'll see! ;)

Thanks to Benjamin Peterson for creating and maintaining `six`.

* * *

Thanks for keeping RSS alive! ♥
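Curious whether anything in your own environment still drags in `six`? Here's a small sketch using the standard library's `importlib.metadata` to scan installed distributions for a declared dependency; the `requirement_name` helper is my own rough parse of requirement strings, not part of any library API:

```python
import re
from importlib.metadata import distributions

def requirement_name(req: str) -> str:
    """Extract the bare project name from a requirement string
    like 'six (>=1.5)', 'six>=1.5', or 'six; python_version < "3"'."""
    return re.split(r"[ ;<>=!~(\[]", req.strip(), maxsplit=1)[0]

def packages_depending_on(target: str) -> list[str]:
    """Scan the current environment for distributions declaring
    a dependency on `target`."""
    found = []
    for dist in distributions():
        for req in dist.requires or []:
            if requirement_name(req).lower() == target.lower():
                found.append(dist.metadata["Name"])
                break
    return sorted(found)

if __name__ == "__main__":
    print(packages_depending_on("six"))
```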
sethmlarson.dev
October 14, 2025 at 5:36 PM
GZipped files and streams may contain names
It's just another day: you're sending a bunch of files to a friend. For no particular reason you decide to name the archive with your controversial movie opinions:

$ tar -cf i-did-not-care-for-the-godfather.tar *.txt
$ gzip i-did-not-care-for-the-godfather.tar

Realizing you'd be sharing this file with others, you decide to rename the file:

$ mv i-did-not-care-for-the-godfather.tar.gz \
     i-love-the-godfather.tar.gz

That's better! Now your secret is safe. You share the tarball with your colleague, who notes your "good taste" in movies and proceeds to extract the archive:

$ gunzip --name i-love-the-godfather.tar.gz
i-love-the-godfather.tar.gz: 100.0% -- replaced with i-did-not-care-for-the-godfather.tar

Uh oh, your secret is out! The decompressed `.tar` file was named `i-did-not-care-for-the-godfather.tar` instead of `i-love-the-godfather.tar` like we intended. _How could this happen?_

It turns out that GZip streams have fields for information about the original file, including the filename, modified timestamp, and comments. This means GZip streams can leak secret information if it's contained within that file metadata.

Luckily, `tar` with `$ tar -czf` (which is the typical workflow) doesn't preserve the original filename in the GZip stream the way the standalone `gzip` and `gunzip` commands do. If you do have to use `gzip`, **use the `--no-name` option to strip this information from the GZip stream.** If you're unsure, check the GZip compressed file with a hex editor.

* * *

Thanks for keeping RSS alive! ♥
sethmlarson.dev
October 9, 2025 at 5:26 PM
Extracting NES & N64 ROMs from Zelda Collector's Edition
Gaming as a hobby is **about to become much more expensive** in the United States due to tariffs. I cannot recall a past time when a **console's price increased during its generation**, and yet the Xbox Series X & S, the Nintendo Switch, and most recently the PlayStation 5 have all had price hikes. These are not normal times.

So here's another entry in my mini-series (#1, #2) of extracting ROMs from GameCube games, this time the “Zelda Collector's Edition” which contains 2 NES and 2 N64 Zelda titles. This article only took so long because I was trying to actually implement the Nintendo “TGC” archive format, but it turns out that even the popular tools handling TGC can't properly parse the NES ROMs out of the archives included in the Zelda Collector's Edition game. ¯\\_(ツ)_/¯

So instead I created a script which looks for NES and N64 ROM header magic strings (`NES\x1A` and `\x80\x37\x12\x40`) and uses known lengths for these ROMs instead of the TGC format for framing information. _So much easier!_

Game | Length | MD5
---|---|---
Legend of Zelda | 131088 | `BF8266F0FA69A5A8DAF5F23C2876A1AD`
Zelda II - The Adventure of Link | 262160 | `32308B00B9DEC4DF130C7BF703340FF3`
Legend of Zelda - Ocarina of Time | 33554432 | `CD09029EDCFB7C097AC01986A0F83D3F`
Legend of Zelda - Majora's Mask | 33554432 | `AC0751DBC23AB2EC0C3144203ACA0003`

You know the drill by now: buying these games separately costs over $150 USD while “Zelda Collector's Edition” for the GameCube only costs ~$50 USD. Pretty good savings, especially for two of the most celebrated Zelda titles. However, the price is high enough that you might consider buying a year of “Nintendo Switch Online + Expansion Pack” instead if you already have the console.

I still haven't beaten “Ocarina of Time” or “Majora's Mask”, even though I know they are both masterpieces. The only Legend of Zelda games I've beaten end-to-end are “Wind Waker” and “Four Swords Adventures”; can you tell I'm a GameCube player?
Let me know what your favorite Legend of Zelda title is. * * * Thanks for keeping RSS alive! ♥
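The magic-string approach described above only takes a few lines of Python. Here's a rough sketch (the `carve` function and the usage comments are my own; the magic bytes and the 131088-byte length come from the post):

```python
import hashlib

def carve(data: bytes, magic: bytes, length: int) -> list[bytes]:
    """Slice `length` bytes out of `data` at every occurrence of
    `magic`, skipping past each carved region before searching again."""
    roms, i = [], data.find(magic)
    while i != -1:
        roms.append(data[i:i + length])
        i = data.find(magic, i + length)
    return roms

# Hypothetical usage against a full disc dump:
# data = open("zelda-collectors-edition.iso", "rb").read()
# for rom in carve(data, b"NES\x1a", 131088):             # iNES magic
#     print(hashlib.md5(rom).hexdigest().upper())
# for rom in carve(data, b"\x80\x37\x12\x40", 33554432):  # N64 big-endian magic
#     print(hashlib.md5(rom).hexdigest().upper())
```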
sethmlarson.dev
September 15, 2025 at 4:27 PM
Draft SMS and iMessage from any computer keyboard
If you're like me, you don't _love_ the ergonomics of writing long text messages on your mobile phone keyboard. We own an “Arteck HB066” Bluetooth keyboard for this use case, which works great and costs $45. But I'm not interested in _spending money_ today. What if I could write text messages, both SMS and iMessage, using any computer keyboard?

This little tool does just that: write text messages in this browser window, and it'll generate a QR code which you can scan with your phone camera to send the message. If you are sending to multiple recipients, use a comma (`,`) to delimit the recipient phone numbers. I recommend using international codes (`+1` for the USA), but it appears to work, at least on iOS, without them.

Don't know or don't want to type in your recipient's phone number directly? Add a `1` as the recipient, scan the QR code, and then fill in the recipients on your phone to use auto-complete from your contacts list.

All data stays within the browser: your data is not processed, saved, or sent to any other server. If this tool is useful: bookmark the page for later use and let me know what you think.

* * *

[Interactive form on the original page: recipient(s), message, and a selector for which phone is sending the messages (iPhone/iOS vs. Android, Pixel, Samsung, etc.).]

* * *

QR codes generated using qrcode-svg, licensed MIT and Copyright (c) 2020 datalog.

* * *

Thanks for keeping RSS alive! ♥
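Under the hood, a tool like this just encodes an `sms:` URI into the QR code. Here's a minimal sketch of building one; the `sms_uri` helper is my own, and the iOS-vs-Android separator handling reflects commonly reported platform behavior rather than anything from the tool's actual source:

```python
from urllib.parse import quote

def sms_uri(recipients: list[str], body: str, ios: bool = False) -> str:
    """Build an sms: URI carrying recipients and a prefilled body.
    Android expects "?" before the body parameter while iOS expects
    "&" -- which is why the tool asks what phone is sending."""
    sep = "&" if ios else "?"
    return f"sms:{','.join(recipients)}{sep}body={quote(body)}"

# Example:
# sms_uri(["+15555550123"], "hello from my keyboard")
# -> "sms:+15555550123?body=hello%20from%20my%20keyboard"
```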
sethmlarson.dev
September 5, 2025 at 4:11 PM