Today I LearnedRSS

Most Recently

2026-04-04
We Haven't Seen the Worst of What Gambling and Prediction Markets Will Do to America

I've not got much to add here. I guess I should point out that it's not just the United States of America, as the title claims, but I assume that's fairly obvious. Welcome to the next big tech hype after generative AI.

2026-04-03
Lecture Friday: The Beauty of Bézier Curves

You may have already watched this video, given its view count and its feature in 3Blue1Brown's Summer of Math Exposition. If you haven't, it's definitely worth the watch.

While you can't compute the arc length along a Bézier curve in closed form, you can at least calculate the roots, as I do in my Bézier visualizer using Cardano's method. Just an extra tidbit I found hard to locate when I was doing the research for that project.
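For the curious, here's a minimal Python sketch of Cardano's method for the real roots of a cubic. This is written purely for illustration; it is not the code from my visualizer. The idea: depress the general cubic with a substitution, then branch on the discriminant (one real root via cube roots, three real roots via the trigonometric form).

```python
import math

def cbrt(x):
    """Real cube root that handles negative inputs."""
    return math.copysign(abs(x) ** (1.0 / 3.0), x)

def cubic_roots(a, b, c, d):
    """Real roots of a*x^3 + b*x^2 + c*x + d = 0 via Cardano's method."""
    # Depress the cubic: substitute x = t - b/(3a) to get t^3 + p*t + q = 0.
    p = (3 * a * c - b * b) / (3 * a * a)
    q = (2 * b ** 3 - 9 * a * b * c + 27 * a * a * d) / (27 * a ** 3)
    shift = -b / (3 * a)
    disc = (q / 2) ** 2 + (p / 3) ** 3  # > 0: one real root; < 0: three

    if abs(disc) < 1e-12:
        u = cbrt(-q / 2)
        ts = [0.0] if abs(q) < 1e-12 else [2 * u, -u]  # repeated roots
    elif disc > 0:
        s = math.sqrt(disc)
        ts = [cbrt(-q / 2 + s) + cbrt(-q / 2 - s)]     # one real root
    else:
        # Three real roots: trigonometric form (p is negative here).
        r = 2 * math.sqrt(-p / 3)
        phi = math.acos(3 * q / (2 * p) * math.sqrt(-3 / p))
        ts = [r * math.cos(phi / 3 - 2 * math.pi * k / 3) for k in range(3)]

    return sorted(t + shift for t in ts)
```

For example, `cubic_roots(1, -6, 11, -6)` recovers the roots of (x-1)(x-2)(x-3). The explicit tolerance on the discriminant is a pragmatic choice; a production version would scale it to the coefficients.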

2026-04-02
The Three Pillars of JavaScript Bloat

Great rundown on the problem and a number of tools to help you solve it.

We're definitely at a tipping point as an industry. Security vulnerabilities are now being routinely exploited within hours of a patch being made available. Supply chain attacks are punishing those too eager to update. We're being squeezed on both sides.

Your best defence is to narrow and shorten your supply chains. You need fewer dependencies. We just saw the axios supply chain attack. Maybe you dodged the bullet because you pinned dependencies and waited a few days. Something is going to sneak through. How much longer until something as brutal as the XZ backdoor slips through without someone managing to catch it early enough to save you?

Fewer dependencies make every other countermeasure more effective. If you think scanning is the answer, there's still a false negative rate to worry about. If that scanning is centralized at the public repository level, attackers can keep probing from throwaway accounts until they find something that slips through, then launch their attack. If there are vendor solutions, those can be tested VirusTotal-style too. If you're instead focused on manually reviewing and signing off on updates, good luck having your team manually review tens or hundreds of updates a week. Every countermeasure is harder at scale. You're going to have to descale.

That said, if you've got a better idea than manual or automated review, there could be a million-dollar idea in there. For now I'm building a little tool that feeds the diff of package updates to an LLM to try and flag suspicious code for manual review. A hybrid approach run locally. It's not a great solution, but I'm also sure it's only a matter of time until the antivirus vendors catch on and offer basically the same thing but with much better classifiers and heuristics.

The problem is that every source code dependency has effectively complete and unrestricted access. I'd love it if my execution runtime came with something like pledge(2) at the module level, so I could declare just the permissions each package is allowed to use. A module could then only call other modules whose privileges are a subset of its own. I'd even use this for my own code, like I already do at the process level: start by taking an inventory, then strictly watch that new additions are appropriate. The hard part is doing it without requiring a whole new programming language or dependency ecosystem.

In any case, a little duplication is better than a little dependency.

2026-04-01
Shell Tricks That Actually Make Life Easier (And Save Your Sanity)

As someone who spends easily over 85% of my time in a terminal, it's rare for me to learn something new from one of these sorts of articles, especially those starting with simple concepts. I did not know about Ctrl-Y.

Kudos to the author here. This is the exact sort of list of everyday things I use too, so it's likely off the top of their head and not just regurgitating the Zsh manual at you. Not everything noted is a daily mainstay for me (I've never needed pushd/popd), but it's got a lot of the things I use constantly and very little else I don't use.

Here is one bonus tip: they cover cd - but didn't mention just running cd without an argument. That takes you to your home directory.

2026-03-31
GitHub's Historic Uptime

Wow, that's telling. Now I really want to know how this happened. What changed internally to mess things up this badly? I just want to learn from this car crash.

2026-03-27
Lecture Friday: Misuser

The divide between technologist and artist is frustrating, but it shouldn't be. Too many techies look down on the humanities, arguing, "Their problems can't be objectively answered. If there are no right answers, it's all meaningless."

This mindset is a legacy of multiple-choice questions and standardized testing. The idea that any question has a single objectively correct answer is nonsense outside of borderline fantasy levels of abstraction (spherical cows in a frictionless vacuum and all that).

But the art of engineering is often just as subjective as painting. You can talk about "thinking like an artist," but frankly, that's just old-school hacker culture. They're the same thing: making something do what nobody thought it could, because you can. Technology used to be cool because it rewarded those who could build something nobody thought possible. Those who made dreams real. I'd argue that's still what makes technology cool, but being profitable has sucked all the air out of the room.

Yet, the problem of keeping our digital artifacts running is fascinating. Theory says digital media lasts forever; reality says otherwise. While a film reel can last a few decades and a good book can last a century, most digital formats only survive a couple of years.

Even just looking at raw device lifetimes, you rarely hit the decade mark with the hardware. Factor in the digital services—the gatekeepers of media delivery—and the average lifespan drops to mere years. Web links rot at a rate of roughly 1-3% per year.

It's not just historians who need this. One of the biggest problems with our current media landscape is echo chambers. Hopefully, you're aware that what you experience online is mostly whatever happily passes through your cognitive filter. If you agree, the loop is reinforced. If you disagree, the confrontation is avoided. Over time, you curate an information diet that pre-filters for your existing beliefs. Algorithms amplify this because platforms want to maximize time on site.

The best way to break out of this loop is to engage with "slow media." Media you can take a break from and ponder on. Read the work of people steeped in a world before the recent wars, before the industrial expansion, or even mercantile trade. Reading a work a hundred years old forces you to encounter values vastly different from your own, allowing you to see the world through eyes that don't share your assumptions.

However, there is a darker risk. Popular works still generate profit, so they get copied and updated, re-encoded and given the occasional touch-up. But common works and periodicals go missing because there's no financial incentive to update them. If you've been online since the early 2000s or before, ask yourself: how much of that internet actually survives?

I once heard someone muse about a possible second dark age. A dark age is not about barbarism as many people think, though we can leave discussions of our current political climate for another time. No, a dark age refers to a period of history where not many works survive to inform historical understanding. When history becomes dark, we are left to infer the era by looking through the trash (archaeology).

I'd argue we're neither in nor entering a modern dark age, but I'd be overstating my case if I didn't acknowledge that conditions are ripening. We are entering a period where the future isn't strongly considered. As a designer, engineer, or creator, ask yourself: what are you leaving behind?

We have an ugly narrative forming, one that assumes climate change, war, or disease will cause a complete global collapse. We tell ourselves there is no future, that we should only live for today. I mean, I do have 100% certainty you're going to die. Mortality is still the undefeated champion of life. There will even be destruction, loss, and disasters. But assuming every human will vanish or that you can prepare for such a fate is delusional. It's a coping mechanism for fear. Why not consider trying to improve the lives of those who will come after?

Are you planting metaphorical trees for them to sit under? Or, are you busy tearing down the past to justify your present? Spending your time criticizing those who came before so that those who came after can criticize you? Why leave the future an orphaned people without hope or heritage?

Many people talk about how the internet used to be fun. It still is! You just have to ignore the thousand-pound corporate gorillas. It's not that the internet changed; it's that you did. What used to be your counterculture is now the dominant culture.

So go make a subculture, go make a new counterculture. To me that means making and sharing things in your own space, with your own friends. Build real communities of real people in the real world who collaborate online.

This is the only way to avoid the dark age. We need to leave something behind, not just consume what is given. Yes, it's hard work. Anything worth doing is. But boredom is the key. If you keep satiating it with online content, you pacify the desire to self-actualize.

2026-03-26
The Last Quiet Thing

Your phone is not a slot machine. It's a to-do list that writes itself.

They're describing basically the same trick the consumer goods industry played when it blamed you for plastic waste. It's not their fault they package everything in single-use plastics; it's your fault for not recycling hard enough. Sure, most of that recycled plastic just goes to landfill, but that's governments' fault for not spending all your money subsidising it.

If recycling worked, they'd pay you to do it because those materials would be valuable. They're not.

Unfortunately, this article claims not to want to tell you that personal action is the answer, yet implicitly tells you to vote with your wallet. Vote in an election where the fattest wallets get the most votes.

Sure, there's a method to the madness. I don't buy phones that don't have a 3.5mm headphone jack. My 2011 car, complete with all the knobs and buttons, has an AUX port that plays FLAC albums in excellent quality. My TV has never experienced the internet for itself, cursed to live vicariously through a connected PC complete with an ad-blocking web browser and a hard drive stuffed with DRM-free video games. But this is a strategy to cope, not a solution.

Welcome to the atomized society. A place where our answer to all abuse, from consumer fraud to sexual assault, is the same: if you don't like it, make better choices, buy better things. There is no alternative. There are only individuals and families.

This isn't going to change until we stop thinking memes and internet points count as political action. Physical action counts, everything else is intellectual masturbation.

2026-03-20
Lecture Friday: Computers for Cynics

Project Xanadu might just be the most compelling example in favour of the Worse is Better philosophy.

I feel talks by Ted Nelson are a lot like those by Alan Kay. Full of thoughts and ideas to really expand your conception of what a computer can be, but always best taken in moderation.

Every time I revisit this set of lectures I find more things that click. It's full of wisdom and I hope you too get something useful to help you rethink what you "know" about technology.

2026-03-18
How Kernel Anti-Cheats Work: A Deep Dive into Modern Game Protection

Surely not every trick in the book, but a good look under the hood. Just sharing because I only had a theoretical understanding before this. This article is fairly detailed and specific. Great work!

Really interesting point about how this moves the best attacks into hardware. Designing an FPGA-based memory proxy could be a really cool project, and the applications are genuinely interesting.

2026-03-16
Nobody Gets Promoted for Simplicity

It comes back to selling your work. That said, the example paragraphs they wrote for selling simple solutions are atrocious. It's not that they're wrong about their thesis, just that the examples provided aren't going to be of much help to the audience. That's because getting recognition for your work is a performance. You have to learn how to sell your work effectively, not just at all.

Never just list features you built, nor the ones you considered. Instead, always talk about your work in the broader context of the impact to the business and its customers. I've deleted a few dozen lines of code that saved a company a six-figure sum. Sure, 99% of the work was being completely sure those lines weren't load-bearing. You think I sell it by saying I deleted a little bit of code? No! I talk about how I just saved us all a small yacht's worth of cash.

Making more money requires people thinking you're too valuable to not pay for. Meritocracies are a utopian myth. If they really existed, companies wouldn't be spending trillions on advertising. Your skills and abilities are as valuable as people think they are. Part of that is genuine demonstrated ability. But people pay more for Coca-Cola over the store brand because it gives them a good feeling. Blind taste tests show store brands and homemade soda taste better. I'd link a study here, but there are hundreds of them and, to my limited knowledge, no good meta-analysis. Just go search around.

Think about it like balancing the five P's of marketing: product, price, place, people, and promotion. You want your price to go up, and let's assume you're already working as hard as you can on your product, that is, doing your best work. You can still impact place, people, and especially promotion (because so few engineers are thinking about it). Place here is where you spend your time. Are you working on the problems your company cares about? If not, you have to spend extra time convincing them that the work you're doing even matters. Why fight an uphill battle into a fierce headwind? People is about soft skills. Knowing your audience, presenting yourself professionally, and having great customer service; that is, being attentive, friendly, and communicative. Lastly is promotion, and that's a big part of building the feeling people have about the work you do.

That feeling matters. In all but the most Byzantine systems, promotions happen way before any promotion package or formal process. Even in large companies, people know roughly who they can count on to get things done well. You're basically playing dodgeball in gym class again and managers are picking team members. They know the kid who can run fastest isn't always the best at the game. But that kid's at least got a reputation. The weird kid who keeps to themselves is an enigma, though. Especially if they mostly just stand there and maybe walk back and forth a bit. They may be a secret martial arts master able to dodge anything, but nobody can tell from the outside.

One of the missing pieces in this discussion is how much harder simple solutions are to make. This is going to hurt your perceived velocity. What's most likely to hold you back is if you're spending your time simplifying everyone else's projects. You need things you can take credit for. If you're actively leading projects and able to deliver working solutions under budget—in both time and money—then it becomes easier and easier to demonstrate a track record.

My strategy here is to have long-term goals for the business and your customers. Push for the time to act on the things that would lead to demonstrable improvements in business-value terms. Not all of your ideas will be tractable. Some of the simplifications I've been pushing for, I'm still pushing for over five years later. Many of the simplifications I wanted have already been done. Doing those and demonstrating the value gave people confidence in me to go work on bigger projects. Past performance inspires confidence in future results, but people only know about your performance if you talk about it. If you can deliver concrete value, you'll be trusted with the power to do more.

2026-03-13
Lecture Friday: A Swift Introduction to Geometric Algebra

I don't have a lot to add here, only to say that I keep coming across geometric algebra and it keeps seeming like something I need to really dig into and learn. I've started going through a bunch of the resources from bivector.net. Lots of things to unpack in this. Not fully sure if this is a better basis for a 3D renderer than quaternions, but it's definitely worth playing with given how much more intuitive the math notation seems to be.

If you're also going to look into studying this, another term you want to know is Clifford Algebra.

2026-03-12
Stop Telling People To Sanitize User Input

Signal boosting this. Always transform user data for safe encoding on output, never on input. Your input code shouldn't know how the data's going to be used. Maybe it'll be put in an email, maybe it's going on a webpage, maybe you're printing it. Each of those has very different sanitization requirements. Your input code should not be trying to account for all the different ways it'll be displayed.

Data given to the system should be treated like you're handling explosives. There are ways to safely move it, ways to safely store it, but you don't go messing with it prematurely. Definitely don't trust, touch, look at, or lick it unless you absolutely have to.
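To make the encode-on-output rule concrete, here's a small Python sketch (the example value and sink choices are mine, just for illustration). The raw value is stored once, untouched; each output boundary applies its own context-appropriate encoding.

```python
import html
import urllib.parse

# Store user input verbatim; encode only at each output boundary,
# because every sink has different rules.
comment = '<script>alert("hi")</script> & "friends"\x07'

# Sink 1: HTML page body -> escape markup-significant characters.
html_safe = html.escape(comment)

# Sink 2: URL query string -> percent-encode, spaces become '+'.
url_safe = urllib.parse.quote_plus(comment)

# Sink 3: terminal/log line -> drop non-printable control characters.
log_safe = "".join(ch for ch in comment if ch.isprintable())
```

Note that the three encoded forms are all different, which is exactly why doing one "sanitize" pass at input time can't work: the input layer can't know which of these rules will apply.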

2026-03-11
Dynamicland

I know I missed posting my usual Lecture Friday, so I come bearing great gifts. Today I was reading Dave Gauer's book review of The Art of Doing Science and Engineering, and he opened by talking about Bret Victor's foreword and noted that Victor's site has an amazing collection of the best research papers in computing. Upon reading that I wondered, "Huh, what's Bret up to these days?" I then proceeded to fall headfirst into Dynamicland. I'm still not sure what to make of it all, but I'm really excited by everything I'm seeing.

So what can I say except, "You're welcome?"

2026-02-27
Lecture Friday: Wat

Basically programming standup comedy. Not much to learn here other than how weird some parts of some programming languages are.

2026-02-20
Lecture Friday: The Thirty Million Line Problem

I disagree that software used to be "serene and pleasant." I remember using MS-DOS for things like desktop publishing and video games, and a lot of it was fairly slow and unreliable. Like watch-the-screen-paint-the-UI slow. Crashes are probably a toss-up. They were worse back then, and the blast radius has shrunk since: from whole-computer crashes, to application crashes, to tab crashes, to cringe mascots. On the other hand, the world now runs so much more software that I experience crashes a dozen or more times a day. We literally joke about "snow days" at work every time GitHub shits the bed; a near-weekly occurrence these days.

But software used to crash enough that I developed a compulsive Ctrl-S habit from word processors and image editors that would crash and lose your work every few hours. I remember the advice to periodically close and restart Photoshop because of how buggy and unstable it was (is?). Overall, I think there's a lot of rose tinting in this assessment. You usually remember games from your childhood as incredible experiences, but far too often you go back and play them and many don't hold up to your now refined modern standards. More importantly, they've also all stabilized. Developers aren't updating them anymore, so they're either known to still work or not. Really popular old software may even have community patches you're expected to apply. But it's all been worked out. You know what to expect. It won't stop working tomorrow because someone pushed a broken change through CI.

I agree with the problem: there's too much code, and that means too many bugs, too many security vulnerabilities, too much fiddling after launch. Monocultures are inherently a problem. Large codebases are inherently a problem. "Many eyes make bugs shallow" is a myth. A good theory, but it's not backed up empirically. Whose eyes, and what they're looking for, matter more. Even when a project is doing literally everything it can, the pressure to break in becomes enormous with enough users, and the larger the interface the more porous to attack it becomes. It's why I don't put OpenSSH directly on the internet.

Even still, I think his sales pitch is a bit weak. The reason operating systems consolidated (or we consolidated on operating systems, if you want) isn't hardware interoperability. It's that running on an operating system is more convenient than running on metal. Preemptive multitasking is just that good. It's one of those core technologies that made the computer what it's become, like networking, bitmap displays, floating point arithmetic, and most importantly, backward compatibility. Even if I have my computer set to boot from USB, having to shut down, dig through my stash of sticks, plug one in, and boot again—it's too much. At one point I dual-booted Windows and Linux. I got rid of the Windows partition because juggling two whole systems on the same machine was annoying.

In this "every application is an operating system" world, I still want to listen to music while I program and consult various sources of documentation. Then when I build and run that program, I want changes I make in code reflected as fast as possible so I can iterate quickly. I also want to be able to use one or more best-in-class debuggers and profilers to quickly figure out where and what the problems are. That leads the development environment back to being a platform, and you invoke jwz's law: "Every program attempts to expand until it can read mail. Those programs which cannot so expand are replaced by ones which can." Unix is just a developer environment that grew into an operating system. Browsers are just fancy document readers that grew into operating systems.

If you want the user experience, consider that I might want to check a wiki while I play some video games. I might need to consult my email while working on a spreadsheet. I want to balance my ledger against my bills and bank account. I could go on. Would any of these be as possible if we relied on a single vendor to bundle all these features into their application?

Games are maybe the only completely isolated software experiences left, and even there many people want to stream podcasts, take screenshots, reference the community wiki, chat with friends, stream their session online, or mod their experience, all without relying on the developer's support for any of it. It's the classic TV-VCR goal-dilution trap. Combo units were wildly unpopular no matter how good the TV or VCR was, because you always knew getting each as a dedicated unit would be better. When we switched to DVD, anyone who did get the combo unit ended up buying a dedicated box anyway, or throwing it all away and upgrading to a three-in-one. The GNSS (a.k.a. GPS) built into modern cars is universally awful because car manufacturers don't face the pressure to be as good as Google or Apple Maps. You already bought the car.

Do we expect people to buy half a dozen different computers to be able to do these at once? An MP3 player, a wiki reader, a handheld email client, a digital painting tablet? That's all called a phone and it's wildly successful because it's portable, integrated, and so simple you can park a toddler in front of it as a babysitter.

Convenience plays a massive role in purchasing decisions. This is why developers have also embraced this tower. To send you these text files, I don't have to write a font renderer. I don't have to write a video codec. I don't have to write a web server or file system or network driver or remote administration server or resource cache or smooth scrolling interface or vector graphics interpreter or hyperlink loader. Better still, you've already downloaded half that stuff in a browser and I can cobble the rest together with Neocities and PeerTube for free in an afternoon.

These towers of code are as big as they are for various human factors, just one being the allure of convenience. Others include the overhead and imprecision of communicating between people, especially over time. How easy something is to get started with and learn. The challenge of fighting entropy in a long lived software project. The temptation to add a layer of indirection or reuse something it was only kind of designed to handle originally. The fear of not making payroll and trying to find safety in doing whatever users say they want. Egos, politics, inertia, greed, ignorance, sloth; the list goes on.

A lot of factors led us here. SoCs don't seem like they're going to help much. The Raspberry Pi and the ESP32 are probably the most popular examples. You might argue it's the binary-blob firmware that's holding them back. Maybe? The ESP32's Wi-Fi stack is being reverse engineered and open sourced, though. You could then review that code to create your own MAC driver for the chip. Maybe after that the dominant application delivery platform will begin to shift?

Obviously you can level the "it's not good enough" retort that's become popular these days. They're not as powerful as your average desktop computer. Sure, but if they were, you'd then want to run all your existing software on them. That would require Windows, or maybe one of the big two mobile operating systems. And if you're running one of those, why would you want to reboot into a game instead of launching it from within? Latency is a super niche reason. That's why Windows is still popular despite being widely loathed: it runs almost all the world's consumer software. Every game on Steam has to run on Windows according to the Steam publishing agreement. All the most popular Linux "only" software has been ported to Windows.

That's why Linux still only has single-digit adoption. The console wars understood this. VHS and Betamax understood this. Content sells platforms. Nobody buys a platform for its own sake; they buy a platform to access content. Think of it as a little two-by-two matrix. On one side is their current platform; on the other, the new one. Between them you have forces pushing users off the old, forces pulling them toward the new, forces repelling them from the new, and forces holding them back on the old. To switch, the first two must overcome the second two. One insanely valuable application can drive the sale of a whole platform, but usually it's the sum of all the potential that drives the sale, because making something so valuable it's worth hundreds or thousands of dollars in switching costs is really hard.

With that in mind, the 3-6% or whatever Linux is currently at is almost exclusively driven by just how incompetent directors at Microsoft are in managing their cash cow, resulting in people who hate Windows enough to switch. There's very little pulling users, lots repelling them, and a mountain of things holding them back. But that mountain's been shrinking as they keep abusing their developers who've been moving to the web, the pushes have been getting stronger with ads and instability, and the repellents have been slowly easing as polished user friendly UIs and Windows compatibility layers have gotten pretty great. All that's missing is the killer app. Something you can only get on Linux and not Windows that's worth switching for.

That's also why people build on popular platforms and end up reinforcing their dominance. Gambling on being so good people would switch for you is risky. Why use a different platform if it's not likely to pay off spectacularly well when you succeed?

That all aside, we may get to see how this plays out, because who says the best market is wealthy industrialized nations? With the insanely low and still-falling cost of solar, lots of developing nations are rapidly expanding their energy capacity. That opens up a wealth of opportunities for low-cost computerization like Raspberry Pis, even in remote areas. If there's ever been a time to see whether SoC bare-metal single-purpose systems will take off, it started a couple of years ago and will play itself out over the next decade and a half. And it kind of is happening. Inverters and charge controllers are becoming a huge area of concern because they are network-connected, embedded, single-purpose, proprietary systems that also pose significant risk as nations continue to expand their weaponization of supply chains.

I've considered going with a unikernel for my backend. A single binary operating system and application together. Boot the machine directly to the server shipped as a single disk image. But then I ask, why? I reboot this server a couple times a year. All that effort to save maybe 60 seconds a year waiting for the server to boot. This machine sees a normal CPU load of between 0.5% and 1.5%, mostly depending on how much connection spam I'm handling.

Sure, I could spend my time maintaining my own data storage (file system, database, and backups), network stack (TCP/IP, DNS, HTTPS, and firewall), thread scheduler, and remote administration/debugging interface. Or, I could spend my time writing these posts to collect my thoughts and spread my ideas. I'm not cursed to live forever. I can only do so much in the very limited time I have left to live. I'm not seeing an advantage to doing it myself instead of bringing together a pile of open source software. With those, I spend the equivalent of 1/5th of a full time dev (in exchange for not learning to play guitar or something) plus $0.03/hour ($20/month) building and maintaining this thing.

And that's for a target that almost amounts to an SoC. I run on a VPS, which conforms to the VirtIO machine spec running on an x86_64 Skylake chip. But why do I want to write and maintain a SCSI controller? The machine is already so huge for the workload, and the existing driver in Linux is good enough. Sure, there are efficiencies to be gained by writing a controller that exposes a relational database interface directly instead of layering storage controller, file system, and database. It's also thousands of hours of work to build. It's hard to justify. This site is never going to get big. If it does, I can throw money at it and consider the tradeoffs involved in rewriting it to be more performant. Sure, writing it to handle planet scale from the start sounds great, maybe even cost effective, but I'm very unlikely to correctly design such a system from first principles (scaling behaviour always surprises you), and I'd spend all my time designing a platform without content, guaranteeing I never need to scale.

2026-02-14
Wikipedia: Henrietta Lacks

Learned about her last night, or rather, her contribution to medical research for things like the polio vaccine (a disease a segment of the population seems adamant to bring back from the edge of extinction). Sadly, not voluntarily. Go read her Wikipedia page. The least you can do is learn how much she's helped humanity, without her knowledge or consent, thanks to her unfortunate early death.

2026-02-13
Lecture Friday: Repsych: Psychological Warfare in Reverse Engineering

This talk is wonderful. You won't learn too much, but it's such a cool idea taken to great lengths. It's stuff like this that gets me excited about computers. I hope it brightens your week.

2026-02-06
Lecture Friday: Stop Rate Limiting! Capacity Management Done Right

Quick refresher on queueing theory. Really take some time and redo the math he's doing yourself, by hand; that's the big skill from this talk, and it builds your ability to use Little's Law in your own programming and design work. If you just let him do it for you, you won't learn it. The simple summary: always limit request concurrency. Specifically, he shows how to do this by leveraging TCP's congestion control.
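If you want the one formula to hold onto, it's Little's Law: L = λW, where L is the average number of requests in the system, λ the arrival rate, and W the average time a request spends in the system. A toy back-of-the-envelope in Python (the numbers are made up for illustration):

```python
# Little's Law: L = lambda * W
arrival_rate = 200.0   # requests per second (assumed workload)
avg_latency = 0.05     # 50 ms average time in system

# Average requests in flight a service at this load must hold:
concurrency = arrival_rate * avg_latency   # 10 requests

# Inverted: a concurrency limit of 10 at 50 ms latency caps
# sustainable throughput at concurrency / avg_latency.
max_throughput = concurrency / avg_latency  # 200 requests/second
```

The inversion is the practical part: pick a concurrency limit your service can actually handle, measure latency, and you know the throughput ceiling before anything starts queueing.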

Can't agree more. The failure mode I see again and again in asynchronous services is not limiting queue sizes, including the request queue. You never want an infinite queue. Honestly, you usually don't want queues at all; you want stacks, because stacks prioritize liveness, not fairness. If someone shows up with a thousand things to do, a stack ensures everyone who shows up after with the odd request gets priority.
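A contrived Python illustration of that last point, using one deque read both ways:

```python
from collections import deque

backlog = deque()

# A burst of 1000 requests from one bulk client arrives first...
for i in range(1000):
    backlog.append(("bulk", i))

# ...then a single interactive request shows up.
backlog.append(("interactive", 0))

# FIFO (queue): the interactive request waits behind all 1000.
fifo_next = backlog[0]    # a bulk request is served next

# LIFO (stack): the latest arrival is served immediately.
lifo_next = backlog[-1]   # the interactive request is served next
```

The stack is "unfair" to the bulk client, but that's the point: under overload, the freshest requests are the ones whose callers are still waiting, so serving them keeps the system feeling alive.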