I'm occasionally accused of using LLMs to write the content on my blog. I don't do that, and I don't think my writing has much of an LLM smell to it... with one notable exception:
```python
# Finally, do em dashes
s = s.replace(' - ', u'\u2014')
```
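For context, here is that substitution as a minimal standalone function (the function name and wrapper are my illustration; the blog applies the replacement inline in its rendering code):

```python
def apply_em_dashes(s: str) -> str:
    """Replace a spaced hyphen with an em dash (U+2014).

    Only ' - ' with a space on each side is replaced, so hyphenated
    words like 'long-lost' are left untouched.
    """
    return s.replace(' - ', '\u2014')

assert apply_em_dashes('one - two') == 'one\u2014two'
assert apply_em_dashes('long-lost') == 'long-lost'
```

Because the pattern requires the surrounding spaces, ranges like "1998-2000" and compound words pass through unchanged.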
That code to add em dashes to my posts dates back to at least 2015 when I ported my blog from an older version of Django (in a long-lost Mercurial repository) and started afresh on GitHub.
In the latest OpenSSL security release on January 27, 2026, twelve new zero-day vulnerabilities (meaning unknown to the maintainers at the time of disclosure) were announced. Our AI system is responsible for the original discovery of all twelve, each found and responsibly disclosed to the OpenSSL team during the fall and winter of 2025. Of those, ten were assigned CVE-2025 identifiers and two received CVE-2026 identifiers. Adding those ten to the three we had already found in the fall 2025 release, AISLE is credited with surfacing 13 of the 14 OpenSSL CVEs assigned in 2025, and 15 in total across both releases. That is a historically unusual concentration for any single research team, let alone an AI-driven one.
These weren’t trivial findings either. They included CVE-2025-15467, a stack buffer overflow in CMS message parsing that is potentially remotely exploitable without valid key material, and for which exploits were quickly developed online. OpenSSL rated it HIGH severity; NIST’s CVSS v3 score is 9.8 out of 10 (CRITICAL, an extremely rare rating for a project of this maturity). Three of the bugs had been present since 1998-2000, missed for over a quarter century by intense machine and human effort alike. One predated OpenSSL itself, inherited from Eric Young’s original SSLeay implementation in the 1990s. All of this in a codebase that has been fuzzed for millions of CPU-hours and audited extensively for over two decades by teams including Google’s.
In five of the twelve cases, our AI system directly proposed the patches that were accepted into the official release.
AI vulnerability finding is changing cybersecurity, faster than expected. This capability will be used by both offense and defense.
Opus 4.6 is notably better at finding high-severity vulnerabilities than previous models, and a sign of how quickly things are moving. Security teams have been automating vulnerability discovery for years, investing heavily in fuzzing infrastructure and custom harnesses to find bugs at scale. But what stood out in early testing is how quickly Opus 4.6 found vulnerabilities out of the box, without task-specific tooling, custom scaffolding, or specialized prompting. Even more interesting is how it found them. Fuzzers work by throwing massive amounts of random inputs at code to see what breaks. Opus 4.6 reads and reasons about code the way a human researcher would: looking at past fixes to find similar bugs that weren’t addressed, spotting patterns that tend to cause problems, or understanding a piece of logic well enough to know exactly what input would break it. When we pointed Opus 4.6 at some of the most well-tested codebases (projects that have had fuzzers running against them for years, accumulating millions of hours of CPU time), it found high-severity vulnerabilities, including some that had gone undetected for decades.
The details of how Claude Opus 4.6 found these zero-days are the interesting part; read the whole blog post.
The 2026 Winter Olympics in Milan and Cortina put us in the action
Image via NBC Olympics
This year, drones have taken center stage. Not the quiet, distant kind hovering politely overhead.
These are fast, nimble FPV drones that dive down ski runs, chase snowboarders through powder, and skim the ice alongside bobsleds. For the first time, watching winter sports actually feels fast.
For decades, Olympic coverage meant long lenses and helicopter shots. Beautiful, sure. But distant. Now the camera drops into the action. It banks when the skier banks. It feels the pitch of the slope. It rides the line.
While we could hear the buzz of the drones on the TV broadcast, the athletes, wearing helmets and plummeting down hills, could not.
Rigorous testing was done to ensure that the drones didn’t distract the athletes or interfere with their event.
A drone chasing a skier. Image via NBC Olympics
The result was visceral. You can almost sense the cold air and the edge of steel carving into ice as the Olympians competed.
And more importantly, it does the athletes justice. These competitors spend their lives chasing hundredths of a second, committing fully to risk, gravity, and precision.
A static camera flattens that ambition. This new perspective honors it. You finally see how steep the slope really is. How tight the turns are. How little room there is for error.
The drones helped give some added (and needed) perspective to those of us watching from afar.
What did you think of the new perspectives that the drones offered this year?