In Defense of Matlab Code
Key topics
The debate around Matlab's merits has been reignited, with a passionate defense of its "whiteboard-style" code sparking a lively discussion about its rivals, particularly Julia and Python. While some commenters, like lemonwaterlime and fph, argue that Julia has bridged the gap, others, such as drnick1 and kelipso, counter that Python, augmented with tools like Numba, remains a viable alternative, and that Julia's advantages are overstated. However, proponents of Julia, including SatvikBeri and jakobnissen, point out its unique strengths, such as seamless performance and dynamic language features, that set it apart from Python. As the discussion unfolds, it becomes clear that the quest for a more readable, efficient, and user-friendly coding environment remains a pressing concern.
Snapshot generated from the HN discussion
Discussion Activity
- Very active discussion
- First comment: 3d after posting
- Peak period: 111 comments in the 72-84h window
- Avg / period: 20
- Based on 160 loaded comments
Key moments
- Story posted: Dec 12, 2025 at 3:13 PM EST (21 days ago)
- First comment: Dec 15, 2025 at 2:58 PM EST (3d after posting)
- Peak activity: 111 comments in 72-84h (hottest window of the conversation)
- Latest activity: Dec 19, 2025 at 9:29 PM EST (14 days ago)
Earlier in my career, I found that my employers would often not buy Matlab licenses, or would make everyone share even when it was a resource needed daily by everyone. Not having access to the closed-source, proprietary tool hurt my ability to be effective. So I did my "whiteboard coding" in Julia. I still do.
But Julia also introduces new problems, such as JIT warmup (so it's not really suitable for scripting) and is still not considered trustworthy:
https://yuri.is/not-julia/
> I don't think Julia really solves any problems that aren't already solved by Python.
I don't really need proper furniture, the cardboard boxes and books setup I had previously "solved" the same problems, but I feel less worried about random parts of it suddenly buckling, and it is much more ergonomic in practice too.
At least it has those tools and libraries, which cannot be said about Julia.
My experience on this website is that it would be rather pointless to enumerate them, because you will then point to some poorly documented, buggy Julia "alternatives" that support only a fraction of the features of Python packages or APIs that are developed and maintained by well-resourced organizations.
The same goes for tooling - an unstable, buggy Julia plugin for VSCode is not the same as having products like PyCharm and official Python plugins made by Microsoft for VS and VSCode.
Now, I will admit that Julia also has some niceties that would be hard to find in Python ecosystem (mainly SciML packages), but it is not enough.
> Have you used the language or merely speculating?
I just saw the logo in Google Images.
This is a huge understatement. At the hedge fund I work at, I learned Julia by porting a heavily optimized Python pipeline. Hundreds of hours had gone into the Python version – it was essentially entirely glue code over C.
In about two weeks of learning Julia, I ported the pipeline and got it 14x faster. This was worth multiple senior FTE salaries. With the same amount of effort, my coworkers – who are much better engineers than I am – had not managed to get any significant part of the pipeline onto Numba.
> And if something is truly performance critical, it should be written or rewritten in C++ anyway.
Part of our interview process is a take-home where we ask candidates to build the fastest version of a pipeline they possibly can. People usually use C++ or Julia. All of the fastest answers are in Julia.
That's surprising to me and piques my interest. What sort of pipeline is this that's faster in Julia than C++? Does Julia automatically use something like SIMD or other array magic that C++ doesn't?
In my view, it's not that Julia itself is faster than Rust - on the contrary, Rust as a language is faster than Julia. However, Julia's prototyping, iteration speed, benchmarking, profiling and observability is better. By the time I would have written the first working Rust version, I would have written it in Julia, profiled it, maybe changed part of the algorithm, and optimised it. Also, Julia makes more heavy use of generics than Rust, which often leads to better code specialization.
There are some ways in which Julia produces better machine code than Rust, but they're usually not decisive, and there are more ways in which Rust produces better machine code than Julia. Also, the performance ceiling for Rust is higher because Rust allows you to do more advanced, low-level optimisations than Julia.
Everything you can do in Julia you can do in C++, but lots of projects that would take a week in C++ can be done in an hour in Julia.
The pipeline was pretty heavily focused on mathematical calculations – something like, given a large set of trading signals, calculate a bunch of stats for those signals. All the best Julia and C++ answers used SIMD.
It would be fun if you could share a similar pipeline problem to your take-home (I know you can't share what's in your interview). I started off in scientific Python in 2003 and like noodling around with new programming languages, and it's great to have challenges like this to work through. I enjoyed the 1BRC problem in 2024.
https://github.com/gunnarmorling/1brc
But isn't the whole point of this article that Matlab is more readable than Python (i.e. solves the readability problem)? The Matlab and Julia code for the provided example are equivalent[1], which means Julia has more readable math than Python.
[1]: Technically, the article's code will not work in Julia because Julia gives semantic meaning to commas in brackets, while Matlab does not. It is perfectly valid to use spaces as separators in Matlab, meaning that the following Julia code is also valid Matlab which is equivalent to the Matlab code block provided in the article.
It's wild what people get used to. Rustaceans adapt to excruciating compile times and borrowchecker nonsense, and apparently Pythonistas think it's a great argument in favor of Python that all performance sensitive Python libraries must be rewritten in another language.
In fairness, we Julians have to adapt to a script having a 10 second JIT latency before even starting...
It is, because usually someone already did it for them.
Our goal is to make a runtime that lets people stay at the math layer as much as possible, and run the math as fast as possible.
The diag is admittedly unfortunate and has confused even me; it should really be 2 different functions (which are sort of the reverse of each other, weirdly making it sort of an involution).
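The same overloaded behaviour exists in NumPy's `np.diag`, which makes for a quick way to see what the comment means (a minimal sketch; the data here is illustrative):

```python
import numpy as np

# np.diag is really two functions sharing a name:
# given a 1-D vector it builds a diagonal matrix;
# given a 2-D matrix it extracts the diagonal.
v = np.array([1.0, 2.0, 3.0])
M = np.diag(v)   # 3x3 matrix with v on the diagonal
d = np.diag(M)   # back to the 1-D vector

assert M.shape == (3, 3)
assert np.array_equal(d, v)  # diag(diag(v)) == v -- the "involution" feel
```

Note the asymmetry: `diag(diag(v)) == v` for vectors, but `diag(diag(M))` only preserves a matrix's diagonal, which is why it is just "sort of" an involution.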
[1] https://numpy.org/devdocs/user/basics.broadcasting.html#gene...
This works not only with addition and subtraction, but with the dot product and other functions as well. You can do this across arbitrary dimensions, as long as your input matrices' non-unit dimensions do not overlap.
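The rule being described is NumPy broadcasting: arrays of shapes `(3, 1)` and `(1, 4)` combine into `(3, 4)` because their non-unit dimensions sit on different axes. A minimal sketch with made-up data:

```python
import numpy as np

col = np.arange(3).reshape(3, 1)  # shape (3, 1)
row = np.arange(4).reshape(1, 4)  # shape (1, 4)

# Each unit axis is stretched to match the other operand's size.
summed = col + row                # shape (3, 4): an "outer sum"
prod = col * row                  # shape (3, 4): an outer product

assert summed.shape == (3, 4)
assert summed[2, 3] == 2 + 3
assert prod[2, 3] == 2 * 3
```

If both arrays had non-unit sizes on the same axis (say shapes `(3, 2)` and `(3, 4)`), broadcasting would fail with a shape-mismatch error, which is the "do not overlap" condition above.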
The most cogent argument for the use of parentheses for array slicing (which derives from Fortran, another language that I love) is that it can be thought of as a lookup table, but in practice it's useful to immediately identify if you are calling a function or slicing an array.
Companies do not buy matlab to do scientific computing. They buy matlab, because it is the only software package in the world where you can get basically everything you ever want to do with software from a single vendor.
I say this as someone who’d be quite happy never seeing Matlab code again: Mathworks puts a lot of effort into support and engineering applications.
https://www.youtube.com/watch?v=kc9HwsxE1OY
I think it seems pretty interesting.
matlab is like what it would look like to put the math in an ascii email, just like python is what it would look like to write pseudocode, and in both cases it is a good thing.
Why not use X.transpose()?
To make `Z` a column vector, we would need something like `Z = (Y @ X)[:,np.newaxis]`.
Doesn't just (Y @ X)[None] work? None adding an extra dimension works in practice but I don't know if you're "supposed" to do that
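A quick shape check answers the question above: `[None]` prepends an axis, so it produces a row vector, not the column that `[:, np.newaxis]` gives. (`X` and `Y` here are stand-ins, not the article's actual data.)

```python
import numpy as np

X = np.arange(9.0).reshape(3, 3)
Y = np.array([1.0, 2.0, 3.0])

Z = Y @ X                     # 1-D result, shape (3,)
col = (Y @ X)[:, np.newaxis]  # new trailing axis: shape (3, 1), a column
row = (Y @ X)[None]           # None prepends an axis: shape (1, 3), a row

assert Z.shape == (3,)
assert col.shape == (3, 1)
assert row.shape == (1, 3)
```

So `(Y @ X)[None]` "works" in the sense that indexing with `None` is the documented spelling of `np.newaxis` (they are literally the same object), but it adds the axis at the front rather than producing a column vector.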
I know what all of these do, but I just can’t shake the feeling that I’m constantly fighting with an actual python. Very aptly named.
I also think it’s more to do with the libraries than with the language, which I actually like the syntax of. But numpy tries to be this highly unopinionated tool that can do everything, and ends up annoying to use anywhere. Matplotlib is even worse, possibly the worst API I’ve ever used.
This seems to work, though it is arguably more complicated than calling the `.reshape(3,1)` method.
Or just `X.T`, the shorthand alias for that.
RunMat is an interesting idea, but a lot of MATLAB's utility comes from the toolboxes, and unless RunMat supports every single toolbox I need, I'm going to be reaching for that expensive MATLAB license over and over again.
Will have a really solid rust inspired package manager soon, and a single #macro to expose a rust function in the RunMat script's namespace (= easy to bring any aspects of the rust ecosystem to RunMat).
When I've had to write similar code in Python, it's a massive pain to "prove" that my conversion code is correct. Often I've resorted to using MATLAB's trusted functions to generate "truth" data and then feeding that to Python to verify it gets the same results.
Obviously this is more work than just using the premade stuff that comes with the toolbox.
Any MATLAB alternative faces the same trust issue. Until it reaches enough mindshare that people assume that it's too popular to have incorrect math (which might not be a good assumption but it is one that people make about MATLAB) then it doesn't actually mimic the main benefit of MATLAB which is that I don't need to check its work.
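The "truth data" workflow described above is essentially a tolerance check against a trusted reference. A self-contained sketch (in practice the reference would be exported from MATLAB, e.g. a .mat file read with `scipy.io.loadmat`; here a hard-coded array and a trivial `np.sin` port stand in for both sides):

```python
import numpy as np

# Hypothetical values exported from MATLAB's trusted implementation.
truth = np.array([0.0, 0.84147098, 0.90929743])

def my_port(x):
    """The re-implemented routine under test (here just np.sin)."""
    return np.sin(x)

result = my_port(np.array([0.0, 1.0, 2.0]))

# Element-wise tolerance check against the trusted output.
assert np.allclose(result, truth, rtol=1e-7, atol=1e-8)
```

The tolerances matter: bitwise equality across two numerical stacks is rarely achievable, so the practical question is whether the ported code agrees to within an error bound you are willing to sign off on.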
Cases where a JIT running would conflict with requirements notwithstanding (e.g. HIL with strict requirements and whatnot)...
Octave is not particularly fast.
RunMat is very fast (orders of magnitude -- see benchmarks).
I think the MATLAB JIT compiler is probably difficult to match.
Biggest drawback though is that it's over-optimized for matrix math, in that it forces you to think about everything as matrices, even if that's not how your data naturally lies. The first thing they teach about performant Matlab code is that simple for-loops will tank performance. And you feel it pretty quickly; I once saw a case of some image processing with a 1000x speedup from Matlab-optimized syntax.
Other issues I've run into are string handling (painful) and generally unnatural OOP. Would love to see something with the convenient math syntax of Matlab, but with the broader ease of use of something like JS.
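The loop-vs-vectorized rewrite being described translates directly to NumPy, where the same penalty applies to plain Python loops. A toy sketch (the "gamma curve" operation is made up for illustration; both forms compute identical results, and the vectorized one is the fast path in both Matlab and NumPy):

```python
import numpy as np

img = np.linspace(0.0, 1.0, 10_000)  # stand-in for image data

# Loop form: slow in interpreted Matlab and plain Python alike.
out_loop = np.empty_like(img)
for i in range(img.size):
    out_loop[i] = 255.0 * (img[i] ** 2.2)  # hypothetical gamma curve

# Vectorized form: one array expression, same result.
out_vec = 255.0 * img ** 2.2

assert np.allclose(out_loop, out_vec)
```

The speedup factor depends entirely on the workload and runtime; the point is only that the mental rewrite, from per-element loops to whole-array expressions, is the same skill in both ecosystems.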
Author of RunMat (this project) here --
> The first thing they teach about performant Matlab code is that simple for-loops will tank performance.
Yes! Since in RunMat we're building a computation graph and fusing operations into GPU kernels, we built the foundations to extend this to loop fusion.
That should allow RunMat to take loops as written, and unwrap the matrix math in the computation graph into singular GPU programs -- effectively letting loop written math run super fast too.
Will share more on this soon as we finish loop fusion, but see `docs/fusion/INTERNAL_NOTE_FLOOPS_VM_OPS.md` in the repo if curious (we're also creating VM ops for math idioms where they're advantageous).
> Would love to see something with the convenient math syntax of Matlab, but with broader ease of use of something like JS.
What does "convenient math syntax of Matlab, but with broader ease of use of something like JS" look like to you? What do you wish you could do with Matlab but can't / it doesn't do well with?
Honest question, Octave is an old project that never gained as much traction as Julia or NumPy, so I'm sure it has problems, and I wouldn't be surprised if you have excellent reasons for starting fresh. I'm just curious to hear what they are, and I suspect you'll save yourself some time fielding the same question over and over if you add a few sentences about it. I did find [1] on the site, and read it, but I'm still not clear on if you considered e.g. adding a JIT to Octave.
[1] https://runmat.org/blog/matlab-alternatives
We like Octave a lot, but the reason we started fresh is architectural: RunMat is a new runtime written in Rust with a design centered on aggressive fusion and CPU/GPU execution. That’s not a small feature you bolt onto an older interpreter; it changes the core execution model, dataflow, and how you represent/optimize array programs.
Could you add a JIT to Octave? Maybe in theory, but in practice you’d still be fighting the existing stack and end up with a very long, risky rewrite inside a mature codebase. Starting clean let us move fast (first release in August, Fusion landed last month, ~250 built-ins already) and build toward things that depend on the new engine.
This isn’t a knock on Octave, it’s just a different goal: Octave prioritizes broad compatibility and maturity; we’re prioritizing a modern, high-performance runtime for math workloads.
Unfortunately, mathworks is a quite litigious company. I guess you are aware of mathworks versus AccelerEyes or Comsol.
For our department, we mostly stopped using MATLAB about 7 years ago, migrating to Python, R or Julia. Julia fits the "executable math" niche quite well for me.
Yes, please. What do I google?
Try JuliaC for compiling shared libraries if that is what you mean by a "module": https://github.com/JuliaLang/JuliaC.jl
That said Julia's original design focused on just-in-time compilation rather than ahead-of-time compilation, so the AOT process is still rough.
> I don't understand why there's so much friction between julia and python. You should be able to trivially throw a numpy array at julia and get a result back.
You can throw a numpy array at Julia and get a result back. See https://juliapy.github.io/PythonCall.jl/stable/juliacall/
The loop fusion idea sounds amazing. Another point of friction which I ran into is that MATLAB uses 1-based offsets instead of 0-based offsets for matrices/arrays, which can make porting code examples from other languages tricky. I wish there was a way to specify the offset base with something like a C #define or compiler directive. Or a way to rewrite code in-place to use the other base, a bit like running Go's gofmt to format code. Apologies if something like this exists and I'm just too out of the loop.
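No such offset-base directive exists in Matlab, but the wished-for shim is easy to picture. A hypothetical sketch in Python of a thin 1-based wrapper for porting Matlab-style index arithmetic (the `OneBased` class and its behaviour are entirely made up for illustration; Julia's OffsetArrays.jl, mentioned below in the thread, is a real instance of this idea):

```python
import numpy as np

class OneBased:
    """Hypothetical thin wrapper giving 1-based integer indexing
    over a NumPy array, to ease porting Matlab-style code."""

    def __init__(self, data):
        self.data = np.asarray(data)

    def __getitem__(self, i):
        if not isinstance(i, int):
            raise TypeError("sketch handles plain integer indices only")
        if i < 1:
            raise IndexError("1-based index must be >= 1")
        return self.data[i - 1]

a = OneBased([10, 20, 30])
assert a[1] == 10 and a[3] == 30  # Matlab-style indices
```

A real version would also need slices, negative-index policy, and write access, which is exactly where such wrappers tend to become a bug surface of their own.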
I'd like to point out one last thing, which is that working at the fringe outside of corporate sponsorship causes good ideas to take 10 or 20 years to mature. We all suffer poor tooling because the people that win the internet lottery pull up the ladder behind them.
Julia has OffsetArrays.jl implementing arbitrary-base indexing: https://juliaarrays.github.io/OffsetArrays.jl/stable/
The experience with this has been quite mixed, creating a new surface for bugs to appear. Used well, it can be very convenient for the reasons you state.
I think this is what inspired the creation of Julia -- they wanted a Matlab clone where for loops were fast because some problems don't fit the matrix mindset.
Matlab is successful because of precisely one thing, which nobody has replicated. It offers a complete software environment from one source.
Nowhere else can you get scientific computing, a GUI toolkit, a high level embedded software environment, a HiL/SiL toolkit, a model based simulation environment, a plotting and visualization toolkit and so much more in a single cohesive package. Nobody else has any offering that comes even close.
>The engine is closed source. You cannot see how fft or ode45 are implemented under the hood. For high-stakes engineering, not being able to audit your tools is a risk.
This is just a lie. Open matlab and you can inspect all the implementation details behind ode45. It is not a black box.
>The Cloud Gap: Modern engineering happens in CI/CD pipelines, Docker containers, and cloud clusters. Integrating a heavy, licensed desktop application into these lightweight, automated workflows is painful.
Another lie. See: https://de.mathworks.com/help/compiler/package-matlab-standa... Mathworks has done everything hard for you already. I do not understand why the author feels the need to authoritatively speak on a subject he absolutely does not understand.
How do I see the .c files / trace how `ode45` will execute on my machine? Can I see the JIT's source code?
--
Entitled to your view, but clearly difference of opinion here. From perspective of open / closed source -- maybe for you it qualifies as open source, but I can't follow the logic chain, so to me it's not open source.
"You cannot see how fft or ode45 are implemented under the hood." is a totally false statement. You absolutely can do exactly that. This is not a matter of opinion. Right click the function and open it, you can view it like any other matlab function.
> From perspective of open / closed source -- maybe for you it qualifies as open source
Matlab is obviously not open source. Who said anything about that? The article claims you can not audit ode45, that is false and it seems pretty embarrassing for someone speaking authoritatively about matlab to make such basic claims, which every matlab user can disprove with two clicks. Every single matlab user has the ability to view exactly how ode45 is implemented and has the ability to audit that function. This is not a matter of opinion, this is a matter about being honest about what matlab offers.
But okay -- as I mentioned, you're entitled to your views!
% Built-in function.
The algorithms written in C and compiled by mex are the "built-in" ones that are not viewable.
% Built-in function.
If the algorithm is implemented as a compiled mex function, then you cannot inspect its details.
Mathematica does. Arguably Mathematica is even more cohesive because it's not split up into "feature sold separately" packages.
Also there are high quality free and/or open source alternatives.
GNU Octave https://octave.org and Octave online https://octave-online.net/
Freemat https://freemat.sourceforge.net/ (sadly no ongoing development)
Scilab https://www.scilab.org/ and Scilab online https://cloud.scilab.in/
https://runmat.org
as much as I love the meme in your post, it's the reason I won't be able to share it with work colleagues who use matlab every day
just something to consider
Cheers for the comment!
Among modern alternatives that don't strictly follow MATLAB syntax, Julia is probably the best now?
GNU Octave, as a superset of the MATLAB language, was (is) most capable of running existing MATLAB code. While Octave implemented some solvers better than MATLAB, the former just could not replicate a large enough portion of the latter's functionality, so many scientists/engineers were unable to fully commit to it. I wonder whether runmat.org would run up against this same problem.
The other killer app of MATLAB is Simulink, which to my knowledge is not replicated in any other open source ecosystem.
It gets better and better all the time.
Real advantages of matlab:
* Simulink
* Autocoding straight to embedded
* Reproducible & easily versioned environment
* Single-source dependency easier to get security to sign off on
* Plotting still better than anything else
Big disadvantages of matlab:
* Cost
* Lock-in
* Bad namespaces
* Bad typing
* 1-indexing
* Small package ecosystem
* Low interoperability & support in 3rd party toolchains
I will add to that:
* it does not support true 1d arrays; you have to artificially choose them to be row or column vectors.
Ironically, the snippet in the article shows that MATLAB has forced them into this awkward mindset; as soon as they get a 1d vector they feel the need to artificially make it into a 2d column. (BTW (Y @ X)[:,np.newaxis] would be more idiomatic for that than Y @ X.reshape(3, 1) but I acknowledge it's not exactly compact.)
They cleverly chose column concatenation as the last operation, hardly the most common matrix operation, to make it seem like it's very natural to want to choose row or column vectors. In my experience, writing matrix maths in numpy is much easier thanks to not having to make this arbitrary distinction. "Is this 1D array a row or a column?" is just one less thing to worry about in numpy. And I learned MATLAB first, so I don't think I'm saying that just because it's what I'm used to.
I despise Matlab, but I don't think this is a valid criticism. It simply isn't possible to do serious math with vectors that are ambiguously column vs. row, and this is in fact a constant annoyance with NumPy that one has to solve by checking the docs and/or running test lines on a REPL or in a debugger.
You do actually need to make a decision on how to handle 0 or 1-dimensional vectors, and I do not think that NumPy (or PyTorch, or TensorFlow, or any Python lib I've encountered) is particularly consistent about this, unless you ingrain certain habits to always call e.g. .ravel or .flatten, followed by subsequent .reshape calls to avoid these issues.
(There is unhelpful subtext here that I can't possibly have done serious math, but putting that aside...) On the contrary, most actual linear algebra is easier when you have real 1D arrays. Compare an inner product form in Matlab:

    x'*A*y

vs numpy:

    x@A@y

OK, that saving of one character isn't life changing, but the point is that you don't need to form row and column vectors first (x[None,:] @ A @ y[:,None] - which BTW would give you a 1x1 matrix rather than the 0D scalar you actually want). You can just shed that extra layer of complexity from your mind (and your formulae).
> math texts ... are all extremely clear about column vs row vectors and notation too, and all make it clear whether column vs. row vector is the default notation, and use superscript transpose accordingly.
That's because they use the blunt tool of matrix multiplication for composing their tensors. If they had an equivalent of the @ operator then there would be no need, as in the above formula. (It does mean that, conversely, numpy needs a special notation for the outer product, whereas if you only ever use matrix multiplication and column vectors then you can do x * y', but I don't think that's a big deal.)
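The inner/outer distinction in the two comments above checks out with real 1-D arrays (data here is illustrative):

```python
import numpy as np

A = np.array([[2.0, 0.0],
              [0.0, 3.0]])
x = np.array([1.0, 2.0])
y = np.array([3.0, 4.0])

inner = x @ A @ y        # true 1-D arrays give a plain scalar directly
outer = np.outer(x, y)   # the outer product needs its own function

assert inner == 30.0     # (x @ A) = [2, 6]; [2, 6] @ [3, 4] = 6 + 24
assert outer.shape == (2, 2)
assert outer[1, 0] == 2.0 * 3.0
```

With 2-D column vectors instead, `x[None, :] @ A @ y[:, None]` yields a `(1, 1)` matrix rather than the scalar, which is exactly the extra layer the parent comment is arguing against.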
> This is also a constant issue working with scikit-learn, and if you regularly read through the source there, you see why.
I don't often use scikit-learn but I tried to look for 1D/2D agreement issues in the source as you suggested. I found a couple, and maybe they weren't representative, but they were for functions that could operate on a single 1D vector or could be passed as a 2D numpy array but, philosophically, with a meaning more like "list of vectors to operate on in parallel" rather than an actual matrix. So if you only care about 1d arrays then you can just pass it in (there's a np.newaxis in the implementation, but you as the user don't need to care). If you do want to take advantage of passing multiple vectors then, yes, you would need to care about whether those are treated column-wise or row-wise but that's no different from having to check the same thing in Matlab.
Notably, this fuss is precisely not because you're doing "real linear algebra" - again, those formulae are (usually) easiest with real 1D arrays. It when you want to do software-ish things, like vectorise operations as part of a library function, that you might start to worry about axes.
> unless you ingrain certain habits to always call e.g. .ravel or .flatten or [:, :, None] arcana
You shouldn't have to call .ravel or .flatten if you want a 1D array - you should already have one! Unless you needlessly went to the extra effort of turning it into a 2D row/column vector. (Or unless you want to flatten an actual multidimensional array to 1D, which does happen; but that's the same as doing A(:) in Matlab.)
Writing foo[:, None] vs foo[None, :] is no different from deciding whether to make a column or row vector (respectively) in MATLAB. I will admit it's a bit harder to remember - I can never remember which index is which (but I also couldn't remember without checking back when I used Matlab either). But that's because it's more general in that it works for higher dimensions too. Plus, as I've said, you should rarely need it in practice.
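For the which-index-is-which question, a quick shape check is the usual memory aid (array contents here are arbitrary):

```python
import numpy as np

foo = np.array([1.0, 2.0, 3.0])

assert foo[:, None].shape == (3, 1)  # column: new axis appended
assert foo[None, :].shape == (1, 3)  # row: new axis prepended

# The same mechanism generalizes to higher dimensions,
# which is where it earns its keep over a row/column toggle:
bar = np.zeros((2, 3))
assert bar[:, None, :].shape == (2, 1, 3)
```

The rule is simply that `None` inserts a length-1 axis at the position where it appears in the index expression.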
I used this twenty-something years ago. It worked, but I would not have wanted to use it for anything serious. Admittedly, at the time, C on embedded platforms was a truly awful experience, but the C (and Rust, etc) toolchain situation is massively improved these days.
> Plotting still better than anything else
Is it? IIRC one could fairly easily get a plot displayed on a screen, but if you wanted nice vector output suitable for use in a PDF, the experience was not enjoyable.
It was a very unpleasant feeling when I graduated from my PhD and realized that most, if not all, of the Matlab scripts I had used for my research would now be useless to me unless I joined a company or national laboratory that paid for licenses with the specific toolboxes I had used.
I'm glad that a significant portion of tools in my current field are in open source languages such as Python and Julia. It widens access to other researchers who can then build upon it.
(And yes, I'm aware of Octave. It does not have the capabilities of Matlab in the areas that I worked in, and was not able to run all of my PhD scripts.)
Was there a specific reason for that? Or was it simply nobody wrote the code?
It’s like how open source will never replace Excel but probably worse because it’s multiple fields and it’s way harder to replicate it.
https://hg.savannah.gnu.org/hgweb/octave/file/tip/scripts/he...
And this is why you should write free software and, as a scientist, develop algorithms that do not rely on the facilities of a specific language or platform. Nothing is more annoying than reading a scientific paper and finding out that 90% of the "implementation" is calling a third party library treated as a blackbox.
I’m going to stop you right there. Matlab has 5 issues:
1. The license
2. Most users don’t understand what makes Matlab special and they write for loops over their arrays.
3. The other license
4. The other license
5. The license server
Mathworks seems to have set up licensing to maximize how much revenue they can extract with no thought given to how deeply annoying it is to use.
In my case, trivial uses are as important as high-visibility projects. I can spin up a complete Python installation to do something like log data from some sensors in the lab, while I do something in another lab, and have something going at my desk, and at home. I use hobby projects to learn new skills. I've played with CircuitPython to create little gadgets that my less technically inclined colleagues can work with. I encouraged my kids to learn Python. I write little apps and give them to colleage. I probably have a dozen Python installations running here and there at any moment.
This isn't a slam on Matlab, since I know it has a loyal following. And I'm unaware of an alternative to Simulink, if that's your bag. And Matlab might be doing the right thing for their business. My impression is that most "engineering software" is geared towards the engineer sitting at a stationary workstation all day, like a CAD operator. And this may be the main way that software is used. Maybe I'm the freak.
thankfully there are fast open source alternatives out there now, hint hint runmat ;)
I had to jump like 3 links and 4 pages down to figure out what runmat actually "is" / "does".
As someone who's done their whole thesis using Octave this looks interesting.
I love Octave, it's one of my favourite languages. And, for reasons I don't understand even myself, I don't like matlab that much (though I admit their documentation is excellent).
How would you "sell" runmat to someone like me?
Coming from Octave, you'll notice significant speedup advantages, you can see some of our benchmarks with it here https://runmat.org/blog/introducing-runmat
Last month, we put out 250+ built-in functions and Accelerate, which fuses operations and routes between CPU/GPU without any extra code/memory management, i.e. no GPUarray.
We're still fleshing out the plotting functions, but we'll have updates to share around that and a browser version very soon.
I feel like you should be saying Matlab / Octave wherever possible; especially since your target audience is far more likely to be the one that wants a "faster Octave" rather than a "cheaper Matlab".
PS: Don't trust github language stats; half of that code is octave specific, but still gets labelled as Matlab.
Terrible HPC integration.
Proprietary runtime.
Granted, I've seen Python horrors on university HPC clusters too, but at least there are libraries and clear documentation (e.g. Lightning, Ray, etc) for how to properly manage these things. Good luck finding that with Matlab.
You claimed they asserted that "Universities are wasteful".
Put the goalposts back where they were.
I am saying that because it is much harder to find good documentation on using MATLAB on HPCs, a lot of computations on HPCs that use MATLAB are highly wasteful compared to if they had been written using a language and/or tools that make it much easier to use HPC resources more efficiently. I was NOT in any way saying that "universities are wasteful".
All other things being equal, suppose two research programs are proposing to study roughly the same thing (say, some novel optimization or basic stuff on something like simple neural networks; and let's pretend this is some years ago when people still actually reached for MATLAB neural net tools). If both request significant compute allocations, and I see one is planning on MATLAB, and the other on PyTorch Lightning, I know for sure I would want to give the MATLAB users far less funding, or even none at all, since they're really going to struggle to properly leverage the CPUs and GPUs available to them, whereas the Lightning people will largely just have this work immediately, and almost certainly be able to iterate faster and be more likely to find something meaningful.
It's a contrived and unfair example, and in practice the real problem is actually the annoying MATLAB licensing mostly, but also it really is a fact that MATLAB screws up even basic stuff in HPC environments (see e.g. https://docs.alliancecan.ca/wiki/MATLAB#Simultaneous_paralle...).
I want to say the least offensive way possible that I really do hope you do not ever get to review NSF/NIH grants because of bias like this.
The example that you gave is/was also a solved problem in the HPC systems I used. SLURM/PBS, when properly configured, can ensure that local clusters do not spill over and use resources they are not supposed to use.
and I just straight up installed GNU Octave on the server and called out to it from python, using the exact code the mathematician had devised.
I have gone further and asked AI to port working but somewhat slow numerical scripts to C++ and it's completely effortless and very low risk when you have the original implementation as test.
At the time, we had massive issues with using Matlab with large fMRI/EEG/MEG data sets, and attempts to write naive matrix-based versions of code would occasionally blow up memory consumption, and turn a 3-week analysis into a 50-year analysis.
So, yeah, I had to rewrite a decent amount of pretty matrix code into gnarly, but performant, for loops. Maybe the situation has improved since then, but I don't care to find out.
---
Want strings? You had your choice of cells or 2D char matrices. Who ever thought char matrices were a good idea? strfind() vs findstr()? Even after years of Matlab, I had to double-check the docs to recall which one I wanted.
---
Anything to encourage reliability or assist scientists in their workflows, like built-in version control? Nope. Or basic testing support for your ad hoc statistical functions? No.
I guarantee there's a ton of Matlab code that produced biased/wrong results, and nobody knows because it produced numbers in the expected range, and nobody ever thought to check it.
Mathworks was in a unique position to improve scientific code quality, and did nothing with it.
---
Matlab really excelled at only two things: matrix math and making pretty plots. As soon as you needed to do anything else, it was unbelievably painful, and that's where my personal dislike came from.
In my experience, those arguing for the value of Matlab are mostly 55+ years old, or are in an extremely niche industry using something like e.g. Simulink or other highly-industry-specific tooling, in which case it seems the considerations are irrelevant to something like 99.5% of the modern population.
Matlab will clearly be dead and irrelevant otherwise, in a short amount of time and in almost all domains.
Wild to hear. At the time, almost everybody in the field used it. The then-dominant fMRI package (SPM) and EEG/MEG package (Fieldtrip) were both open-source Matlab. (I think I knew one prof who used BrainVoyager, and that's because he hired a former BV employee as an RA.)
I feel attacked.
They came to speak at my school and described open source alternatives (Python in particular) as the biggest threat to MATLAB.
I think if they open-sourced the MATLAB runtime and embraced a model similar to Canonical or Red Hat where users paid for support or integrations, they'd make more money. But it's hard to get there from where they are now.
You can do the same thing in other languages but it won't be built in like that.
I suppose it basically boils down to whether your org's engineering is run by academics or software engineers, but Matlab doesn't really do anything that Python can't for free. And Python is more accessible, has more use cases, and strong academic support already.