idoubtit 21 hours ago

The README is quite ahead of reality: it never mentions that Mago is still beta software. A roadmap to a first release was created 5 days ago: https://github.com/carthage-software/mago/issues/405.

I've just tried to apply it to a medium-sized project, and it spat out tens of thousands of errors where phpstan and psalm see none. At first glance, that's because Mago does not parse phpdoc. In its current beta state, Mago is unusable for the big PHP projects that are its main target.

Mago might succeed, but I wouldn't bet on it. Its main selling point is that it promises to be faster than the usual static analysers and linters (phpstan and psalm). But if it does not reach feature parity with them, the speed gain probably won't convince PHP projects to drop the standard tools. Since phpstan and co. keep evolving, keeping feature parity will require constant work. And PHP is more niche than Python or JS, so contributors who master both Rust and PHP will be scarcer than contributors to phpstan/psalm, which are written in PHP.

  • eider 20 hours ago

    I pulled one of the classes from my project that doesn't depend on anything external and fed it to their demo site. It threw up a bunch of errors complaining that RuntimeException is undefined. It seems it doesn't understand built-ins unless you prefix them with \, even though they are imported properly with a use statement. This is a pretty core issue to lack support for. Calling it "beta" is actually cutting it a lot of slack; I'd say it's closer to a proof of concept.
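
    A minimal sketch of the kind of code that triggered it (the namespace, class, and method names are invented for illustration, not copied from my project):

      <?php

      namespace App\Service;

      use RuntimeException; // built-in exception imported from the global namespace

      final class Importer
      {
          public function run(string $path): void
          {
              if (!is_readable($path)) {
                  // Resolves to \RuntimeException via the `use` statement above;
                  // the playground build only seemed to accept the \-prefixed form.
                  throw new RuntimeException("Cannot read {$path}");
              }
          }
      }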

    • azjezz2 15 hours ago

      Mago author here.

      The online playground is running a very old version (~0.20.0 from months ago, which did not even have a static analyzer) and gives a poor impression of the current tool. That's on me to fix or take down.

      The issue you saw with built-in classes like `RuntimeException` was absolutely a bug in those early alpha versions, but it has been fixed for a long time now. The analyzer has matured a lot since then.

      The current beta is stable enough to be the sole static analysis tool for a couple of extremely well-typed projects:

      - https://github.com/azjezz/psl/

      - https://github.com/carthage-software/cel-php

      I'd definitely encourage trying the latest release locally to get a real feel for it. Thanks again for the write-up!

  • azjezz2 15 hours ago

    Mago author here. Thanks for the feedback.

    You're right that the README should be clearer about the beta status and the current feature set. That's a great suggestion, and I'll get that updated.

    Regarding the errors you saw, I suspect the main culprit isn't a general lack of PHPDoc parsing, but rather the two biggest remaining features we're actively working on: support for the magic `@method` and `@property` tags. Mago has full support for generics (`@template`), assertions (`@psalm-assert*`), conditional types, etc., but the absence of those two is definitely a major source of noise on established projects right now. They are our top priority and should land in the next beta release.
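
    For readers unfamiliar with those tags, here is roughly what they look like in the wild (a contrived sketch; `Model` and its members are made up):

      <?php

      /**
       * Virtual members that only exist via __get()/__callStatic() at runtime,
       * declared here so a static analyser can resolve them:
       *
       * @property-read int    $id
       * @property      string $name
       *
       * @method static self create(array $attributes)
       */
      class Model
      {
          // __get(), __set(), and __callStatic() would handle the members above.
      }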

    On the topic of feature parity, you're right that it's a moving target. Our goal isn't to be a 1-to-1 clone of Psalm or PHPStan, but a different tool with its own strengths (see: https://github.com/carthage-software/mago/discussions/379). For example, Mago will flag code like `[0 => $a, $b] = ["a", "b"]` as an error, which other tools currently do not.
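
    A quick sketch of why that particular pattern is worth flagging: as far as I know, PHP itself rejects the mix of keyed and unkeyed entries at compile time, so this is a real error rather than a style nit.

      <?php

      // Fatal error: Cannot mix keyed and unkeyed array entries in assignments
      [0 => $a, $b] = ["a", "b"];

      // Either form on its own is fine:
      [$a, $b] = ["a", "b"];
      [0 => $a, 1 => $b] = ["a", "b"];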

    We're very aware of the current noise level. We test Mago daily against massive, multi-million-line codebases. On one such project, the first beta reported ~250,000 errors; we're now down to ~30,000. While still a lot, it shows how quickly we're closing the gap on false positives.

    Thanks again for the valuable feedback. It's a long road, but we're confident we can reach and surpass the current standards in a very short time.

    • idoubtit 7 hours ago

      Thanks for explaining your goal and some of the context.

      I suggest that stating these prominently would help the project. When I read Mago's home page, I downloaded the latest release, followed the "Getting started" process on a local repository, then nearly lost interest when I saw the number of false errors. If I had first read "here is our goal, with this roadmap, this kind of validation on real projects, and this position toward existing well-known tools", I would have been much more willing to follow the project and accept its false positives.

      As a side note, for my first try with Mago, I think its inability to parse `@property` was a major source of false errors, because the source code it analyzed had a few omnipresent classes that used it. BTW, Mago panicked when I tried to lint another repository... I'll open an issue.

anta40 a day ago

I initially thought this was a PHP implementation in Rust... but it's not.

> Will Mago implement a PHP runtime?

> Absolutely not. The PHP runtime is incredibly complex. Major efforts by large companies (e.g., Facebook's HHVM, VK's KPHP) have struggled to reach full parity with Zend Engine. Achieving this as a smaller project is infeasible and would lead to community fragmentation. We are focused on tooling, not runtimes.

https://mago.carthage.software/faq

  • password4321 a day ago

    https://www.peachpie.io compiles PHP to .NET.

    • VoidWhisperer a day ago

      Isn't this more transpiling than compiling?

      • tialaramex 21 hours ago

        Not really? .NET has a "Common Language Runtime", which you can think of as analogous to the Java VM or to Beam.

        A transpiler might read PHP and spit out C, Java, or some other existing programming language. Spitting out code for a virtual machine doesn't make you a transpiler, unless you're going to argue that all compilers are just transpilers. That's like one of those "actually, goats are fish" arguments: OK, but now the word "fish" is useless, so why go to this bother?

NietTim 20 hours ago

I've seen this around quite a bit over the past few days. I wish the GitHub landing page/readme would actually substantiate why this is better beyond it being written in Rust, which seems to be the main argument for the tool right now. I make my money from PHP; I prefer stability.

  • ajsnigrutin 17 hours ago

    Years ago it was "... written in Ruby"; now Ruby lies forgotten and Rust is the new language of the week.

CiaranMcNulty 5 hours ago

It's good to see more focus on static analysis.

With PHPStan and Psalm already in the space, I'd like to see more differentiating features than "written in Rust". There are certainly advantages to that, but the disadvantage of not using PHP is that it's harder to get contributions from the community that uses the tool.

Cool project overall!

dzonga a day ago

seems Rust's biggest win was improving other languages' toolchains and bringing increased productivity to those languages.

  • retrocog a day ago

    Not a bad win so far, right? One hand washes the other and both wash the face.

  • giancarlostoro a day ago

    I am waiting for someone to build a modern scripting language in Rust that has the popularity and rich tooling and capabilities of Rust as a result.

    • ekidd 21 hours ago

      There are two deep capabilities that make Rust, Rust:

      1. Banning shared, mutable data. You can't change data that other code might be reading. This is a huge win for threading and multiple CPUs, but it's a dramatic departure from other popular languages.

      2. Knowing how data is laid out in memory. This is classic "systems programming" stuff, and it's also present in C, C++, Zig, etc. This usually goes along with making memory allocation visible (though not always in C++). This is a big win for performance.

      If you wanted to build a "scripting language" version of Rust, you could probably lose (2). Languages like Haskell are even stricter than Rust, but they hide the details of memory layout. But then you need to decide whether to keep or lose (1). If you keep it, your language will have good threading, but users will need to think about ownership and borrowing. If you lose (1), then your language won't feel very much like Rust.
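
      As a tiny illustration of (1), this is the kind of program the borrow checker rejects outright (a made-up snippet, nothing more):

        fn main() {
            let mut scores = vec![1, 2, 3];

            // An immutable borrow: some other code is "reading" the data...
            let first = &scores[0];

            // ...so mutating it while `first` is live is rejected at compile time:
            //     scores.push(4);
            // error[E0502]: cannot borrow `scores` as mutable because it is also
            // borrowed as immutable

            println!("first = {first}");
        }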

      It would be an interesting intellectual exercise! But actually turning it into a popular scripting language would probably require the same luck and the same 10 years of work that most successful languages need to get real traction.

      • coolsunglasses 20 hours ago

        >If you wanted to build a "scripting language" version of Rust, you could probably lose (2).

        Not really, no. I work on an interpreted-language runtime in Rust professionally, and it's still a huge help even if you're still eating perf pain on the interpreted language itself for the same reasons everyone else does. There's more benefit to Rust than you're really capturing here, but that's to be expected; it's a short comment.

        Here are some other things we get from using Rust for interpreted languages:

        - The `unsafe` parts are relatively limited in scope and we have much better and more automated verification tools for `unsafe`, especially WRT undefined behavior

        - Being able to make large architectural changes and work through them mechanically/quickly because of the type-checking and borrow-checking is absurdly powerful for any project, all the more so in this context.

        - The library ecosystem for Rust has been fantastic for all kinds of projects but it's especially good for PL runtimes.

        - LLMs are a lot better at Rust than other programming languages. I have a lot of experience using LLMs in a variety of domains and programming languages and it's far better at Rust than anything else that's expressly about programming. Arguably it's even better at Terraform and Ansible but I consider that a different category. Controversial point maybe but I get tremendous yield out of it.

        - It's not just that Rust is fast. It is on par w/ C/C++ all else being equal. What's significant here is that it is a _lot_ quicker/easier to hit the 80/20 perf targets as well as the bleeding edge performance frontier in a Rust application than it is in C and C++. A lot of C and C++ projects leave performance on the table either because it's too hard to make the ownership model human-maintainable/correct or because it would be too much work to refactor for the hoped-for perf yield. Not as much an issue in Rust. You can gin up hypothetical perf improvements in Rust with gpt-5 lickety-split and the types/borrowck will catch most problems while the agent is iterating.

        Shared, mutable data isn't really banned; we use it strategically in our Rust interpreter, it's just not permitted by default. Aliasing is precisely the distinction between a safe reference and an unsafe pointer in Rust. Aliasing a mutable pointer in Rust isn't UB, it's just `unsafe`. OTOH, aliasing a mutable reference _is_ UB and not allowed in Rust. Miri will catch you if you do this.
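
        To make the pointer-vs-reference distinction concrete, here is a minimal sketch (it assumes the `&raw` syntax from recent Rust; older code would spell it `addr_of_mut!`):

          fn main() {
              let mut x = 42u32;

              // Two raw pointers aliasing the same place: creating them needs no
              // `unsafe`, writing through them does, and the aliasing itself is not UB.
              let p1 = &raw mut x;
              let p2 = &raw mut x;
              unsafe {
                  *p1 = 1;
                  *p2 = 2;
              }

              // Two live &mut references to the same place are rejected by borrowck:
              //     let r1 = &mut x;
              //     let r2 = &mut x;
              //     *r1 += 1; // error[E0499]: cannot borrow `x` as mutable more than once
              // and manufacturing aliasing &mut via unsafe casts is the UB Miri flags.

              println!("{x}");
          }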

        On top of all that, you have some nice kit for experimenting with JIT like Cranelift.

        • Cyph0n 17 hours ago

          > You can gin up hypothetical perf improvements in Rust with gpt-5 lickety-split and the types/borrowck will catch most problems while the agent is iterating.

          I am a huge Rust fan, but never really got a chance to write it in the modern LLM era. It makes absolute sense that the borrow checker would make LLM agent-driven refactors easier.

    • jasonpeacock 19 hours ago

      I just use Rust to do any "scripting" work. I stopped using Python and write it in Rust instead, and I'm more productive than before.

      What do you need a scripting language for that's different than using Rust?

      • Aeolos 17 hours ago

        How do you deal with slow compilation times?

    • librasteve 19 hours ago

      suggest you test drive https://raku.org while you wait (spoiler alert - written in C)

      • giancarlostoro 18 hours ago

        I have fiddled with Raku, but it feels like a language that's from a parallel universe to Perl and might not get serious adoption. When I look at languages, I evaluate them primarily by the libraries available for UI, database access, networking, and web frameworks.

        • librasteve 7 hours ago

          lol

          Thanks for giving Raku a test drive (https://raku.org)

          While Raku can access all the Perl CPAN modules and Python modules via Inline::Perl5 and Inline::Python, I agree native modules are also a good indication of the level of "adoptability" of a language.

          I would say that Raku is currently ready for "bleeding edge" and "early-adopters" but not for "early-main" (terms from Crossing the Chasm)

          For example, there are three pretty nice actively maintained web framework libraries:

            - Cro (also HARC stack that uses Cro)
            - Hummingbird
            - Web::App
          
          For me the strength of being an early adopter is that the community is small and friendly (with many experts who help me out) and that I can be an influential contributor to shape the ecosystem to do things that meet my needs. And this is not in an ocean of cruft (like CPAN and Python).

          Similar stories for each of the domains you mention: https://chatgpt.com/share/68c67f75-d654-8009-9c8d-fdb1081869...

    • testdelacc1 a day ago

      Like Gleam?

      • giancarlostoro 21 hours ago

        I was thinking similar but more like a Python or Lua but all in Rust.

      • scotty79 21 hours ago

        It doesn't have imperative constructs, I think. So half of developers or more are out from the get-go.

  • ainiriand a day ago

    Do you really think that this is Rust's biggest win or are you just joking/trolling?

    • IshKebab a day ago

      To be fair it is pretty significant. Especially uv. I don't know anything about PHP's existing toolchain but I do know that Python's is a horrifying mess, and uv basically fixes it.

      It's a small thing in the Rust community but it's pretty huge in the world simply because there are so many Python developers (and also because of the extreme magnitude of improvement). Probably wouldn't have happened without Rust.

      • 3eb7988a1663 a day ago

        Thanks to Rust, there are heaps of next generation CLI utilities that have come onto the scene in the past decade. Cross platform by default, UTF8 aware, more likely to be multi-threaded, simple distribution, and most importantly - improving on some unfortunate legacy API decisions.

        Ripgrep, fd, tokei, Just, zellij, uv, and so forth. Porting languages has given the opportunity to remove some of the cruft decided on a whim in the 70s. None of these are world changing, but they do make life easier than the originals.

        • johnisgood 18 hours ago

          Why is it "thanks to Rust"? They could have been written in any language.

          • 3eb7988a1663 16 hours ago

            To replace a historical C tool, you cannot compromise too much on the original constraints. The replacement has to be equivalently fast and have no runtime. Which means that even if you made a great design in C#/Java/Python/whatever, it is going to be a deal breaker for some. Safety is (somehow) not a compelling enough argument vs. battle-tested.

            Rust and Go are about the only popular languages around that can meet the spec. That Rust will always be theoretically more performant than GC'd Go makes it the more attractive option for re-imagining some of these bedrock utilities.

            • johnisgood 6 hours ago

              There are actually quite a few viable alternatives to Rust for building these fast toolchains, such as Zig, Odin, OxCaml, and Ada.

              Granted, not all of them are popular, but that is how Rust started off, too, right?

              ---

              Ada would have been great as it is quite mature and used for serious and critical software, but lacks the "cool factor" that Rust has.

              I would like to add that there is clearly a disconnect between technical merit and adoption. Ada is a perfect example of this. It has been around since the 1980s with:

              * Stronger safety guarantees than Rust in many ways (stricter type system, built-in contracts, formal verification support)

              * Proven track record in safety-critical systems (aerospace, defense, medical devices)

              * Native compilation with no runtime overhead

              * Mature toolchain and decades of real-world deployment

              * ... and much more.

              If safety was truly the driving factor, Ada should have dominated systems programming decades ago. But it didn't, and that reveals what's really going on.

              I believe that Rust was chosen over Ada because of:

              * Timing - arrived when web developers were ready to try systems programming

              * Community building - excellent documentation, welcoming culture, modern package manager[1]

              * Aesthetic appeal - syntax familiar to C/Java/JavaScript developers

              * Marketing narrative - "systems programming without fear", i.e. fuckton of hype

              Ada had superior safety for decades but lacked the cultural momentum. It was associated with DoD contracts and enterprise software, not cool indie CLI tools.

              [1] Ada has Alire now as its modern package manager.

          • Aeolos 17 hours ago

            Yet they were not - why is that?

            • johnisgood 7 hours ago

              Why are you asking me back?

              I wonder why, too. Now, you can give me a reply for why you think that is, which you could have done without this comment, or you can just keep adding noise to this thread. Up to you.

    • wavemode 21 hours ago

      I don't think it's strange at all to call this Rust's biggest win. Adopting Rust-based tooling has sped up development and lowered CI costs for millions of JavaScript and Python developers. I can't think of anything else Rust has been used for which has had a larger direct impact on people's everyday lives.

    • testdelacc1 a day ago

      Not the biggest, but definitely the most visible to people who aren’t dialled into Rust news. For example, many people use Android but they wouldn’t know or need to know that their Bluetooth stack is written in Rust.

      Whereas anyone who uses Python would have heard of uv and why it’s much faster than other tools.

  • smt88 a day ago

    You're wrong because it's also incrementally replacing individual, high-risk components in Windows and Linux.

    But even if you're not wrong, a major mission of Rust was to be a safer C/C++, and language tooling used to be dominated by those languages.

    • tredre3 a day ago

      All the language tools that are being displaced by newer rust replacements were definitely not written in C/C++. They were/are written in the host language (js/java/python/php/ruby).

      • tialaramex 21 hours ago

        Which is striking, right? Nobody went "Oh, I should write C++ to speed up my Python tool", or if they did, we don't know about it because they're still trying to understand the six thousand lines of template spaghetti their compiler spat out due to a typo in one line of their code.

darkamaul a day ago

So I guess this is `uv`, but for PHP?

If it has remotely the same success, that would be a huge win for the ecosystem!

  • techtalsky a day ago

    It's more like `ruff` for PHP.

  • aszen a day ago

    No, it's different: PHP already has a good package manager. This is about formatting, linting, and type checking.

  • Einenlum 12 hours ago

    PHP already has a pretty good package manager. But it would be great indeed to have a tool to install different versions of the runtime, on top of a Ruff equivalent

muglug a day ago

It is very cool that this exists, but the PHP community lacks the resources to see a non-PHP tool thrive.

Tools like Sorbet (a C++ typechecker for Ruby) or tsgo (the Go-based successor to TypeScript's typechecker) are only viable because big profitable companies can back them up with engineering hours.

  • hu3 a day ago

    > PHP community lacks the resources to see a non-PHP tool thrive.

    Why do you think so?

    The PHP Foundation has raised over 2 million USD in contributions and currently has over 500K in its balance, according to:

    https://opencollective.com/phpfoundation

    PHP has some well-funded groups using it, like WordPress, Wikipedia, and Laravel, to name a few.

    And recently the PHP Foundation started officially sponsoring a Go project, FrankenPHP.

    https://thephp.foundation/blog/2025/05/15/frankenphp/

    So PHP looks like a friendly and well supported community to foster tooling made in other languages.

    • muglug a day ago

      > The PHP Foundation has raised over 2 million USD in contribution and has over 500K in their balance

      This is great, but it is still dwarfed by the amount Microsoft has spent on TypeScript and also by the amount Stripe has spent on Sorbet.

      500k is roughly comparable to the amount my previous company spent (grudgingly) to keep me employed and working on PHP tooling for a couple of years.

      • hu3 a day ago

        True, but TypeScript and Sorbet are orders of magnitude beyond linting and formatting PHP in terms of challenge size.

        TypeScript is a very complex language with a huge mission. From the same creator of C#.

        Sorbet is trying to tame a dynamically typed language which supports monkey patching. Stripe can get away with it because they have close to infinite money and a large Ruby codebase.

        Meanwhile, PHP is stable and typed. Parsing the AST, linting, and formatting are trivial in comparison to the examples you cited. Its package manager, composer, is also boring and stable, in a good way. Prime target for a second pass if need be.

        • muglug 20 hours ago

          I would posit that you do not know what you’re talking about. Mayo is also a static analysis tool that does typechecking. It incorporates and is heavily influenced by code I wrote.

          • hu3 19 hours ago

            I'll assume you meant Mago instead of Mayo.

            Why do you feel the need to personally discredit me instead of sticking to constructive arguments?

            And where did I say Mago wasn't a static analysis tool?

            I'm glad you wrote Psalm. However I and most of the PHP community use https://phpstan.org instead, as you may know.

            At the time I made this choice on technical grounds. Then PHPStan found a way to stay profitable with PHPStan Pro while Psalm stagnated, which cemented my decision. Vimeo's recent acquisition by Bending Spoons doesn't help either, since Psalm still lives in https://github.com/vimeo/psalm

            • muglug 18 hours ago

              I wasn’t talking about Psalm - I was talking about Hakana, a Rust project that informs much of Mago’s type-checking architecture. Hakana was designed to analyse Hack, a fork of PHP that doesn’t support PHP magic methods and so is much easier to analyse statically.

              The big challenge to typechecking PHP is that it’s essentially two different languages — there’s typechecking code like Laravel that makes heavy use of magic methods (effectively impossible statically) and typechecking code that doesn’t (very doable).
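
              The kind of code that makes the first category hard, as a contrived sketch (not taken from Laravel itself):

                <?php

                /**
                 * No `where` method exists anywhere in the source; calls are routed
                 * through __call() at runtime, so its signature and return type are
                 * invisible to a purely static analyser.
                 */
                class Query
                {
                    public function __call(string $name, array $arguments): static
                    {
                        // dispatch on $name, e.g. accumulate SQL fragments
                        return $this;
                    }
                }

                $q = (new Query())->where('id', 1); // return type only knowable by running code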

              Psalm was mostly designed for the latter, whereas PHPStan excelled at the former with dynamic analysis in plugins like Larastan.

              That dynamic analysis — where you have to run some PHP code to make sense of a PHP codebase — is another big reason a Rust-based tool will have a hard time becoming popular in the PHP ecosystem.

              • hu3 14 hours ago

                Thanks for the information. I understand the challenge now.

                And yeah, the parts I don't like about Laravel are usually related to magic. IDEs often need extensions to understand Laravel, which IMO is not a good look.

  • retrocog a day ago

    Interesting. Do you have any thoughts to share along the same lines about FrankenPHP?

    • muglug a day ago

      It's cool that it's part of the PHP foundation, but it's not all that complex.

      FrankenPHP has >100 contributors, including 3 very frequent ones, and about 17k lines of Go.

      Mago has 11 contributors, with just 1 very frequent one, and about 135k lines of Rust.

Raed667 a day ago

Love seeing some Tunisian representation here ! Kudos on the project !

quotemstr 20 hours ago

How many hours and dollars get wasted on reimplementing the same basic concepts over and over for this or that language runtime?

  • azjezz2 15 hours ago

    Hours? A lot. Dollars? A couple hundred for the logo.

loeg 20 hours ago

But what does it do?

cynicalsecurity a day ago

All of this already exists and each separate product is actively developed, keeping up with all of the changes in PHP. This toolset looks too ambitious.

  • lucideer a day ago

    Astral did something similar for the Python ecosystem (Rust-inspired tooling built in Rust, replacing a lot of pre-existing - bad - tooling) & the impact has been revolutionary. Python had some of the worst tooling of any popular language & now has some of the best.

    Composer is one of the best package managers in any language ecosystem, but beyond that, other PHP tooling, while technically well maintained, isn't particularly great at what it does. It's an ideal starting point for positive disruption.

    • SXX 21 hours ago

      I haven't really used Composer for the last couple of years. Has it stopped eating 2 GB of RAM for any project that uses a framework like Laravel?

      Composer is a good tool, but its resource usage was abysmal.

      • gocartStatue 20 hours ago

        2 GB of RAM at build time is not that outrageous. However, yes, with Composer 2 (released about five years ago) it uses less RAM. Also, the frameworks got less bloated and more modular.

      • 10us 20 hours ago

        Yes, it did. Composer 2 is way lighter and faster.

    • CR007 18 hours ago

      BS, compare projects like Infection vs whatever you have to fax to a guy running Ruby.

  • azjezz2 15 hours ago

    That's a great point, and you've touched on what might seem like a paradox: while the toolset is ambitious, keeping up with new PHP features is actually one of Mago's biggest strengths.

    The traditional PHP tooling ecosystem relies on a dependency chain. A new syntax feature has to be implemented in a core library like nikic/PHP-Parser, then released, then adopted by tools like Psalm or PHPStan, and then finally those tools make a new release. This process can take weeks or months.

    Because Mago is a single, cohesive toolchain, we control the entire stack. We can add support for new syntax across the lexer, parser, formatter, linter, and analyzer in one go.

    For example:

    - Mago's formatter and analyzer already have full support for the Pipe Operator (`|>`) and `clone with` from the upcoming PHP 8.5 (see the sketch below). The pipe operator was implemented across the entire toolchain in about 30 minutes, just hours after its RFC was approved.

    - For comparison, many existing tools are still catching up with PHP 8.4 features like Property Hooks.
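
    For readers who haven't seen it, here is roughly what the pipe syntax looks like, going by my reading of the accepted RFC (details could still shift before 8.5 ships):

      <?php

      // Each right-hand side is a callable taking one argument; the piped value
      // flows straight through the chain.
      $greeting = "  hello mago  "
          |> trim(...)
          |> ucwords(...)
          |> (fn (string $s): string => $s . '!');

      // $greeting === "Hello Mago!"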

    This agility is a core part of the project's value proposition.