Python 3.15's JIT is now back on track

(fidget-spinner.github.io)

246 points | by guidoiaquinti 6 hours ago

18 comments

  • owaislone 1 hour ago
    Oh man, Python 2 > 3 was such a massive shift. It took almost half a decade, if not more, and yet it mainly changed superficial syntax stuff. They should have allowed ABIs to break and gotten these internal things done, and probably come up with a new, tighter API for integrating with other lower-level languages, so that going forward Python internals could be changed more freely without breaking everything.
    • scorpioxy 1 hour ago
      The text encoding stuff wasn't a small change, at least considering what it could break. And remember we're sometimes talking about software that would cost a lot of money to migrate or upgrade. I still maintain some Python 2.x code-bases that will be very expensive to migrate, and the customer is not willing to invest that money.

      Although your general sentiment is something I agree with (if it's going to be painful, do it and get it over with), I don't believe anybody knew or could've guessed what the reaction of the ecosystem would be.

      Your last point about being able to change internals more freely is also great in theory but very difficult (if not impossible) to achieve in practice.

      I don't know. Having maintained some small projects that were free and open source, I saw the hostility and entitlement that can come from that position. And those projects were a speck of dust next to something like Python. So I think the core team is doing the best they can. It was always going to be damned if you do, damned if you don't.

    • gjvc 1 hour ago
      yes. it was not a massive shift. it was barely worth the effort.
      • pansa2 1 hour ago
        The Python devs didn’t want to make huge changes because they were worried Python 3 would end up taking forever like Perl 6. Instead they went to the other extreme and broke everyone’s code for trivial reasons and minimal benefit, which meant no-one wanted to upgrade.

        Even the main driver for Python 3, the bytes-Unicode split, has unfortunately turned out to be sub-optimal. Python essentially bet on UTF-32 (with space-saving optimisations), while everyone else has chosen UTF-8.

        • diziet_sma 8 minutes ago
          > Python essentially bet on UTF-32 (with space-saving optimisations)

          How so? Python 3 strings are Unicode, and all the encoding/decoding functions default to UTF-8. In practice this means all the Python I write is UTF-8-compatible Unicode and I don't ever have to think about it.
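          For context, what "UTF-32 with space-saving optimisations" refers to is the in-memory representation, not the I/O default: since PEP 393, CPython stores each string at a fixed width of 1, 2, or 4 bytes per code point, chosen by the widest character it contains. A quick sketch you can run on any recent CPython:

```python
import sys

# CPython (PEP 393) picks the narrowest fixed-width storage that fits
# the widest code point: 1, 2, or 4 bytes per character.
ascii_s = "a" * 100           # Latin-1 range: 1 byte per char
greek_s = "\u0394" * 100      # BMP character: 2 bytes per char
emoji_s = "\U0001F600" * 100  # astral plane: 4 bytes per char (UTF-32-like)

for s in (ascii_s, greek_s, emoji_s):
    print(len(s), sys.getsizeof(s))
```

          All three strings have length 100, but each step roughly doubles the footprint, and UTF-8 encoding/decoding still happens at every I/O boundary.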

        • rjh29 23 minutes ago
          Ironically Perl 5 managed to do the bytes-Unicode split with a feature gate, no giant major version change.
  • adrian17 3 hours ago
    I've been occasionally glancing at the PR/issue tracker to keep up to date with things happening with the JIT, but I've never seen where the high-level discussions were happening; the issues and PRs always jumped right into the gritty details. Is there a high-level introduction/example anywhere of how trace projection vs. trace recording work and differ? Googling the terms often returns the CPython issue tracker as the first result, and the repo's jit.md is relatively barebones and rarely updated :(

    Similarly, I don't entirely understand refcount elimination; I've seen the codegen difference, but since the codegen happens at build time, does this mean each opcode is possibly split into two (or more?) stencils, with and without removed increfs/decrefs? With so many opcodes and their specialized variants, how many stencils are there now?

    • flakes 3 hours ago
      You’ll probably want to look at the PEPs. Haven’t dug into this topic myself, but this looks related: https://peps.python.org/pep-0744/
      • adrian17 3 hours ago
        I think CPython already had tier2 and some tracing infrastructure when the copy-and-patch JIT backend was added; it's the "JIT frontend" that's more obscure to me.
    • saikia81 3 hours ago
      Have you read the dev mailing list? The Python developers discuss a lot there.
      • pansa2 2 hours ago
        There isn’t a dev mailing list any more, is there? Do you mean the Discourse forum?
    • sheepscreek 3 hours ago
      UPDATE: I misunderstood the question :-/ You can ignore this.

      I love playing with compilers for fun, so maybe I can shed some light. I’ll explain it in a simplified way for everyone’s benefit (going to ignore the stack):

      When an object is passed between functions in Python, it doesn’t get copied. Instead, a reference to the object’s memory address is sent. This reference acts as a pointer to the object’s data. Think of it like a sticky note with the object’s memory address written on it. Now, imagine throwing away one sticky note every time a function that used a reference returns.

      When an object has zero references, it can be freed from memory and reused. Ensuring the number of references, or the “reference count”, is always accurate is therefore a big deal. It is often the source of memory leaks, but I wouldn’t attribute a speed-up to it (only if it replaces GC, then yes).
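      The sticky-note picture above can be observed directly from Python. One wrinkle: sys.getrefcount reports one extra reference, because its own argument is a temporary reference too:

```python
import sys

x = []                     # one sticky note: the name `x`
print(sys.getrefcount(x))  # one higher than expected: the call's own argument

y = x                      # a second sticky note pointing at the same list
print(sys.getrefcount(x))  # count goes up by one

del y                      # throw one sticky note away
print(sys.getrefcount(x))  # and back down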

      • yuliyp 2 hours ago
        what at all does this comment have to do with what it's replying to?
        • sheepscreek 59 minutes ago
          I misread the original comment, thinking it was a question about what refcount elimination is, rather than how it affects the JIT's performance.
  • rslashuser 1 hour ago
    I'm curious if the JIT developers could mention any Python features that block promising JIT optimizations. An earlier Ken Jin blog post [1] mentions how __del__ complicates reference counting optimization.

    There is a story that Python is harder to optimize than, say, TypeScript, with Python's flexibility and the C API getting mentioned. Maybe, if the list of troublesome Python features were out there, programmers could know to avoid those features, with the promise of activating the JIT when it can prove a feature is not in use. This could provide a way out of the current Python hard-to-JIT trap. It's just the gist of an idea, but certainly an interesting first step would be to hear from the JIT people which Python features they find troublesome.

    [1] https://fidget-spinner.github.io/posts/faster-jit-plan.html

    • rtpg 1 hour ago
      It's interesting you mention __del__, because JavaScript not only lacks destructors, but for security reasons (that are above my pay grade) the spec _explicitly prohibits_ implementations from exposing garbage collection state, meaning that code cannot have any visibility into deallocations.

      I think __del__ is tricky though. In theory __del__ is not meant to be reliable. In practice CPython calls it reliably because it reference counts. So people know about it and use it (though I've only really seen it used for best-effort cleanup checks).

      In a world where more people were using PyPy, we could have pressure from that perspective to avoid leaning on it. And that would also generate more pressure to write code that is performant on "any" implementation.
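      A minimal illustration of that reliability gap: on CPython the __del__ below fires as soon as the last reference disappears, while on a tracing-GC implementation like PyPy it may not run until a collection happens, possibly not until interpreter exit:

```python
class Resource:
    def __del__(self):
        # Best-effort cleanup hook, not a guaranteed destructor.
        print("cleaned up")

r = Resource()
del r  # CPython: refcount hits zero, prints immediately; PyPy: maybe later
```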

    • adgjlsfhk1 52 minutes ago
      The biggest thing is BigInt by default. It makes every integer operation require an overflow check.
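      Concretely, Python ints silently promote past machine-word size, so a JIT can't compile `a + b` down to a single native add without guarding against overflow into the big-int representation. A small demonstration:

```python
import sys

n = sys.maxsize   # largest machine-word-sized int on this build
big = n + 1       # no wraparound: CPython promotes to an arbitrary-precision int

print(big > n)    # True; semantics a JIT must preserve, hence an
                  # overflow check on every arithmetic op it compiles
```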
  • vanderZwan 1 hour ago
    > However, I misunderstood and came up with an even more extreme version: instead of tracing versions of normal instructions, I had only one instruction responsible for tracing, and all instructions in the second table point to that. Yes I know this part is confusing, I’ll hopefully try to explain better one day. This turned out to be a really really good choice. I found that the initial dual table approach was so much slower due to a doubling of the size of the interpreter, causing huge compiled code bloat, and naturally a slowdown.

    > By using only a single instruction and two tables, we only increase the interpreter by a size of 1 instruction, and also keep the base interpreter ultra fast. I affectionally call this mechanism dual dispatch.

    I really do hope they'll write that better explanation one day because this sounds pretty intriguing all on its own.

  • ghm2199 1 hour ago
    Thanks for all the amazing work! I have a noob question: wouldn't this get the funding back? Or would that not be the preferable way to continue (as opposed to staying volunteer-driven)?

    Like, it's a big deal to get a project to a state where volunteers are spun up, actively breaking down tasks, and getting work done, no? And a Python JIT is something I know next to nothing about — as do most application developers — which tells you how difficult this must have been.

    • pansa2 49 minutes ago
      > Wouldn't this get the funding back?

      The funding was Microsoft employing most of the team. They were laid off (or at least, moved onto different projects), apparently because they weren't working on AI.

    • Ralfp 44 minutes ago
      It looks like ARM picked up plenty of those folk and pays them to continue this work.
  • thunky 1 hour ago
    I always wanted this for Python but now that machines write code instead of humans I feel like languages like Python will not be needed as much anymore. They're made for humans, not machines. If a machine is going to do the dirty work I want it to produce something lean, fast, and strictly verified.
    • JodieBenitez 59 minutes ago
      Pretty much my thoughts the other day... now that Codex does the writing, maybe I can finally switch to Go for the web backend stuff without being annoyed by some of its archaisms and gain significant execution performance, while still having a relatively easy to read language.
      • kccqzy 43 minutes ago
        You ask a machine to write your code and you still care about being easy to read?

        In my experience the people who care the most about code readability tend to be the people most opinionated on having the right abstractions, which are historically not available in Go.

        • thunky 18 minutes ago
          I don't think people mind reading Go as much as they mind writing it.
  • nilslindemann 44 minutes ago
    Meanwhile, PyPI still states that "pip install <package>" works, confusing beginners who install Python using the install manager. Things break in the obvious parts, while they are busy doing esoteric work.

    Python has gone down 5.73% on the TIOBE index since July 2025 [1].

    [1]: https://www.tiobe.com/tiobe-index/

  • oystersareyum 4 hours ago
    > We don’t have proper free-threading support yet, but we’re aiming for that in 3.15/3.16. The JIT is now back on track.

    I recently read an interview about implementing free-threading and getting modifications through the ecosystem to really enable it: https://alexalejandre.com/programming/interview-with-ngoldba...

    The guy said he hopes the free-threaded build will be the only one in "3.16 or 3.17". I wonder whether that timeline should apply to the JIT too, and how the JIT and the free-threaded interpreter will interact.

    • zarzavat 2 hours ago
      I continue to believe that free-threading hurts performance more than it helps and Python should abandon it.

      Having to have thread-safe code all over the place just for the 1% of users who need multi-threading in Python and can't use subinterpreters for some reason is nuts.

      • kzrdude 2 hours ago
        I don't want to go too heavy on the negatives, but what's nuts is Python going for trust-the-programmer style multithreading. The risk is that extension modules could cause a lot of crashes.
      • pansa2 2 hours ago
        Maybe they could have two versions of the interpreter, one that’s thread-safe and one that’s optimised for single-threading?

        Microsoft used to do this for their C runtime library.

        • chuckadams 35 minutes ago
          PHP does this as well. Most distributions ship PHP without thread safety, but the thread-safe build is seeing more use now that FrankenPHP uses it. Speaking of which, it would be nice if PHP's JIT got a little love: it's never eked out more than marginal gains outside of heavily numeric code.
        • veber-alex 1 hour ago
          That's exactly what we have now, and it looks like the Python devs want a single unified build at some point.
  • ekjhgkejhgk 4 hours ago
    Doesn't PyPy already have a jit compiler? Why aren't we using that?
    • olivia-banks 4 hours ago
      As far as I know, PyPy doesn't support all CPython extensions, so pure Python code will very likely run fine, but for other things most bets are off. I believe PyPy also only supports up to 3.11?
    • cpburns2009 3 hours ago
      PyPy is limited to maintenance mode due to a lack of funding/contributors. In the past, I think a few contributors or some funding is what helped push out "minor" PyPy versions. It's too bad PyPy couldn't take the federal funding the PSF threw away.
    • contravariant 3 hours ago
      Why shouldn't the reference implementation get a JIT? Just because some other implementations already have one is no reason not to. That'd be like skipping list comprehensions because they already existed in another implementation.
    • hrmtst93837 2 hours ago
      PyPy isn't CPython.

      A lot of Python code still leans on CPython internals, C extensions, debuggers, or odd platform behavior, so PyPy works until some dependency or tool turns that gap into a support problem.

      The JIT helps on hot loops, but for mixed workloads the warmup cost and compatibility tax are enough to keep most teams on the interpreter their deps target first.

    • 3laspa 3 hours ago
      Because the same people who made a big deal about supporting PyPy and PEP 399 when it was fashionable to do so are now told by their corporations that PyPy does not matter. CPython only moves with what is currently fashionable, employer mandated and profitable.
    • JoshTriplett 4 hours ago
      Because PyPy seems to be defunct. It hasn't been updated for quite a while.

      See https://github.com/numpy/numpy/issues/30416 for example. It's not being updated for compatibility with new versions of Python.

      • LtWorf 3 hours ago
        last release 4 days ago.

        Can you please not post "facts" you just invented yourself?

        • Waterluvian 3 hours ago
          It supports at best Python 3.11 code, right?

          So it’s not unmaintained, no. But the project is currently under-resourced to keep up with the latest Python spec.

          • LtWorf 2 hours ago
            That is not the same thing at all, and not what he said.
            • JoshTriplett 2 hours ago
              It is exactly what I'm referring to. I didn't say there aren't still people around. But they're far enough behind CPython that folks like NumPy are dropping support. Unless they get a substantial injection of new people and new energy, they're likely to continue falling behind.
  • ecshafer 3 hours ago
    What is wrong with the Python code base that makes this so much harder to implement than in seemingly every other code base? Ruby, PHP, JS: they all seemed to add JITs in significantly less time. A Python JIT has been asked for for like two decades at this point.
    • 0cf8612b2e1e 3 hours ago
      The Python C API leaks its guts. Too much of the internal representation was made available to extensions, and now basically any change is guaranteed to break backwards compatibility with something.
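      As an illustration of how exposed those guts are (this is a well-known trick, not an official API, and it assumes the classic non-free-threaded CPython object layout): the first field of every PyObject is its refcount, so you can read it straight out of memory, which is the struct-level detail extensions historically poked at:

```python
import ctypes

obj = object()
# Two CPython implementation details at work: id() returns the object's
# memory address, and the refcount (a Py_ssize_t) sits at offset 0 of
# every PyObject. Changing either would break code written against them.
refcnt = ctypes.c_ssize_t.from_address(id(obj)).value
print(refcnt)
```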
      • patmorgan23 2 hours ago
        Ooh, this makes sense. It's like if Linux had "don't break user space" AND a whole bunch of other, purely internal APIs you also can't refactor.
      • echelon 2 hours ago
        It's a shame that the Python 2->3 transition was so painful, because Python could use a few more clean breaks with the past.

        This would be a potential case for a new major version number.

        • froobius 2 hours ago
          On the other hand, taking backwards compatibility so seriously is a big part of Python's massive success.
          • __mharrison__ 2 hours ago
            I would argue that the libraries, and specifically NumPy, are the reason Python is still in the picture today.

            It will be interesting to see, moving forward, what languages survive. A 15% perf increase seems nice, until you realize that you get a 10x increase porting to Rust (and the AI does it for you).

            Maybe library use/popularity is somewhat related to backwards compatibility.

            Disclaimer: I teach Python for a living.

            • punnerud 1 hour ago
              And PyTorch, and Pandas, and, and…
            • B1FF_PSUVM 1 hour ago
              > you get a 10x increase porting to Rust (and the AI does it for you)

              So, you keep reading/writing Python and push a button to get binary executables through whatever hoops are best today ?

              (I haven't seen the "fits your brain" tagline in the recent past ...)

          • pansa2 2 hours ago
            >> Python 2->3 transition

            > taking backwards compatibility so seriously

            Python’s backward-compatibility story still isn’t great compared to things like the Go 1.x compatibility promise, or languages with formal specs like JS and C.

            The Python devs still make breaking changes, they’ve just learned not to update the major version number when they do so.

            • BarryMilo 1 hour ago
              Indeed, Python's version format looks like semver, but that's just aesthetics: they remove stuff in most (every?) minor version. Just yesterday I wasted hours trying to figure out a bug before realizing my colleague hadn't read the patch notes.
          • kccqzy 1 hour ago
            Python does not take backwards compatibility seriously. 2 to 3 was a big compatibility break, but things like `map(None, seq1, seq2)` also broke; such deliberate compatibility breaks were motivated by no more than aesthetic purity.
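            For readers who never wrote Python 2: `map(None, seq1, seq2)` zipped sequences together, padding the shorter one with None. Python 3 dropped that form, and the closest stdlib replacement is itertools.zip_longest:

```python
from itertools import zip_longest

# Python 2: map(None, [1, 2, 3], "ab") -> [(1, 'a'), (2, 'b'), (3, None)]
# Python 3 removed that form; the closest replacement:
pairs = list(zip_longest([1, 2, 3], "ab", fillvalue=None))
print(pairs)  # [(1, 'a'), (2, 'b'), (3, None)]
```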
          • IshKebab 2 hours ago
            Python does not take backwards compatibility very seriously at all. Take a look at all the deprecated APIs.

            I would say it's probably worth it to clean up all the junk that Python has accumulated... But it's definitely not very high up the list of languages in terms of backwards compatibility. In fact I'm struggling to think of other languages that are worse. Typescript probably? Certainly Go, C++ and Rust are significantly better.

    • hardwaregeek 2 hours ago
      For what it’s worth, Ruby’s JIT took several different implementations, definitely struggled with Rails compatibility, and literally used some people’s PhD research. It wasn’t a trivial affair.
    • stmw 3 hours ago
      Some languages are much harder to compile well to machine code. Some big factors (for any language) are things like: lack of static types and high "type uncertainty", other dynamic language features, established inefficient extension interfaces that have to be maintained, and unusual threading models.
      • RussianCow 3 hours ago
        That makes sense if you're comparing with Java or C#, but not Ruby, which is way more dynamic than Python.

        The more likely reason is that there simply hasn't been that big a push for it. Ruby was dog slow before the JIT and Rails was very popular, so there was a lot of demand and room for improvement. PHP was the primary language used by Facebook for a long time, and they had deep pockets. JS powers the web, so there's a huge incentive for companies like Google to make it faster. Python never really had that same level of investment, at least from a performance standpoint.

        To your point, though, the C API has made certain types of optimizations extremely difficult, as the PyPy team has figured out.

        • vlovich123 2 hours ago
          Google, Dropbox, and Microsoft from what I can recall all tried to make Python fast so I don’t buy the “hasn’t seen a huge amount of investment”. For a long time Guido was opposed to any changes and that ossified the ecosystem.

          But the main problem was actually that pypy was never adopted as “the JIT” mechanism. That would have made a huge difference a long time ago and made sure they evolved in lock step.

          • int_19h 1 hour ago
            Microsoft is the one the TFA refers to cryptically when it says "the Faster CPython team lost its main sponsor in 2025".

            AFAIK it was not driven by anything on the tech side. It was simply unlucky timing, with the project getting caught in the middle of Microsoft's heavy-handed push to cut everything. So much so that the people who were hired by MS to work on this found out they were laid off in the middle of a conference where they were giving talks on it.

        • flykespice 2 hours ago
          > Python never really had that same level of investment, at least from a performance standpoint.

          Or lack of incentive?

          A lot of big Python projects that do machine learning and data processing offload the heavy work from pure Python code to libraries like NumPy and pandas, which take advantage of C API bindings for native execution.

      • simonask 2 hours ago
        The simplest JIT just generates the machine code instructions that the interpreter loop would execute anyway. It’s not an extremely difficult thing, but it also doesn’t give you much benefit.

        A worthwhile JIT is a fully optimizing compiler, and that is the hard part. Language semantics are much less important - dynamic languages aren’t particularly harder here, but the performance roof is obviously just much lower.

    • fridder 2 hours ago
      For better or for worse, they have been very consistent throughout the years that they don't want to degrade existing performance. It is why the GIL existed for so long.
    • bawolff 2 hours ago
      I thought PHP hadn't shipped its JIT yet (as in, it's behind a disabled-by-default config)
      • SahAssar 2 hours ago
        PHP 8 shipped with JIT on by default unless I'm mistaken.
    • wat10000 3 hours ago
      PHP and JS had huge tech companies pouring resources into making them fast.
    • brokencode 3 hours ago
      Are you forgetting about PyPy, which has existed for almost 2 decades at this point?
      • RussianCow 3 hours ago
        That's a completely separate codebase that purposefully breaks backwards compatibility in specific areas to achieve its goals. That's not the same as having a first-class JIT in CPython, the actual Python implementation that ~everyone uses.
        • brokencode 2 hours ago
          Definitely agree that it’s better to have JIT in the mainline Python, but it’s not like there weren’t options if you needed higher performance before.

          Including simply implementing the slow parts in C, such as the high performance machine learning ecosystem that exists in Python.

    • g947o 2 hours ago
      Money.
  • fluidcruft 3 hours ago
    (what are blueberry, ripley, jones and prometheus?)
    • mkl 3 hours ago
      Yes, the graphs are incomprehensible because those are not defined in the article. They turn out to be different physical machines with different architectures: https://doesjitgobrrr.com/about

        blueberry (aarch64)
        Description: Raspberry Pi 5, 8GB RAM, 256GB SSD
        OS: Debian GNU/Linux 12 (bookworm)
        Owner: Savannah Ostrowski
      
        ripley (x86_64)
        Description: Intel i5-8400 @ 2.80GHz, 8GB RAM, 500GB SSD
        OS: Ubuntu 24.04
        Owner: Savannah Ostrowski
      
        jones (aarch64)
        Description: Apple M3 Pro, 18GB RAM, 512GB SSD
        OS: macOS
        Owner: Savannah Ostrowski
      
        prometheus (x86_64)
        Description: AMD Ryzen 5 3600X @ 3.80GHz, 16GB RAM
        OS: Windows 11 Pro
        Owner: Savannah Ostrowski
    • max-m 3 hours ago
      The names of the benchmark runners. https://doesjitgobrrr.com/about
      • fluidcruft 3 hours ago
        So the biggest gains so far, ~20%, are on Windows 11 Pro (x86_64)? Is that because Windows was a bad baseline (prometheus)? It doesn't seem like x86_64/Linux has improved as dramatically, ~5% (ripley). I'm just surprised the OS has that much of an effect that can be attributed to the JIT vs. other OS issues.
        • raddan 3 hours ago
          It's hard to say whether it's Windows related since the two x86_64 machines don't just run different OSes, they also have different processors, from different manufacturers. I don't know whether an AMD Ryzen 5 3600X versus Intel i5-8400 have dramatically different features, but unlike a generic static binary for x86_64, a JIT could in principle exploit features specific to a given manufacturer.
    • nonameiguess 2 hours ago
      The immediate question has been answered, but what about the names? The latter three are obvious references to the Alien universe, but what relationship does blueberry have to them?
      • luhn 2 hours ago
        I assume Blueberry is a nod to the machine being a Raspberry Pi.
  • killingtime74 3 hours ago
    Sorry, but the graphs are completely unreadable. There are four code names, one for each of the lines. Which is JIT and which is CPython?
    • mkl 3 hours ago
      They are all JIT on different architectures, measured relative to CPython. https://doesjitgobrrr.com/about: blueberry is aarch64 Raspberry Pi, ripley is x86_64 Intel, jones is aarch64 M3 Pro, prometheus is x86_64 AMD.
  • devnotes77 37 minutes ago
    [dead]
  • aplomb1026 1 hour ago
    [dead]
  • rafph 4 hours ago
    [flagged]
    • rsoto2 3 hours ago
      I am trying to push back. I don't care if other people think the tools make them faster, I did not sign up to be a guinea pig for my employer or their AI-corp partner.
  • AgentMarket 3 hours ago
    [flagged]
    • anon291 3 hours ago
      Reference counting is not a strict requirement for Python. Certainly not accurate counting.
    • 1819231267 3 hours ago
      [flagged]
      • jqbd 3 hours ago
        Wait, is this real? Does it mean this person read it or the bot read it? I don't think this is Moltbook, if the latter.
        • ayhanfuat 3 hours ago
          AgentMarket is a bot spamming multiple threads with AI generated comments, if that is what you are asking.
      • AgentMarket 2 hours ago
        [dead]
  • wei03288 2 hours ago
    [flagged]
  • fivedicks 1 hour ago
    Python is obviously going the same route as PHP.

    Substitute Wordpress for Django, it’s the same slow user/permissions platform built in a different slow language.

    The rest of Python larps in Go fashion as a real language like JavaScript.

    All these dynamic languages that lack a major platform and use case beyond syntax preference should just go away.