Author here. Somehow the worst thing I ever wrote is on the front page of HN.
I wrote this fast so there's jargon and bad prose.
The title is deliberately dry and bland, so I wasn't expecting anyone to click it. I've also slightly changed my mind on some of the claims .. might write that up later.
The main reason I like to think of creative work in a more abstract/formal/geometric way (acceptance volume, latency, sampling) is that it's easier for me to categorize tasks, modalities, and domains and to know how to design for or work around them. It's very much biased by my own experiences making things.
Also, abstract technical concepts often come with nice guarantees/properties/utils to build on .. some would say that's their raison d'être.
Re comments:
* "this is just diminishing returns" -- ok and this is a framework for why: the non-worsening region collapses, so most micro-edits fail
* "bands record bangers in an hour" -- the practice tax was prepaid. The recording session is exploitation/search riding on cached heuristics, imo (and it still takes hours of repeated recording/mixing/producing to actually produce a single album track).
* music key example -- yes, I should've picked a different one. The main point was that some choices create wider tolerance (arrangement/range/timbre) even if keys are symmetric in equal temperament
The interesting web problems (collaborative tools, creative software, scientific computing) have to fight against a framework ecosystem designed for the blandest landing pages.
Next.js and most web frameworks assume you're building an e-commerce site that only has to differentiate on loading speed.
It's great people experiment. Yet, after a quick glance, the framework seems to break expected semantics for no gain. For example: the "component" construct auto-renders the JSX within the brackets instead of being a function that returns a view as a value (if I understood correctly). I'm fine with breaking "f(x) => UI", but what for? The TS and VS Code integration is all well and good, but what's the gain of using JSX (again) instead of the beloved HTML-like Svelte markup? Have people ever loved JSX? Nostalgia, maybe? Again, I'm all for tearing down the house, but shouldn't we get more elegance or more expressive semantics in return?
Still, it's great he did it; I just hope he'll be bolder next time! Of the 7 mentioned features, 3-4 are not actually about the language but minor tooling hook-ins (Prettier, VS Code) ... maybe that's the wrong focus.
Either way, the strong similarity with other JS/TS frameworks (in semantics and syntax) suggests that for really new ideas the community should look elsewhere? Perhaps ClojureScript or Laravel (PHP) ...
I love JSX and hate HTML template syntax, so there's at least one person.
What I like about JSX is that you can reuse all of the control-flow primitives from JS: you have if, for, map, filter, flatMap, etc. If you want a new combinator, you just write a function, as you would for any other kind of data transformation.
I consider this one of the biggest advantages of React compared to the approach of having separate templating and scripting languages.
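As a minimal sketch of that point (using plain objects as stand-in vnodes rather than React itself, with a hypothetical `h` helper), the same `filter`/`map` you'd use on any data works for building a view:

```javascript
// Stand-in for JSX output: vnodes as plain data. React's JSX compiles
// down to function calls producing objects much like these.
const h = (tag, props, ...children) => ({ tag, props, children });

const users = [
  { name: "Ada", active: true },
  { name: "Bob", active: false },
  { name: "Cleo", active: true },
];

// Ordinary JS control flow builds the view: filter the data,
// then map each item to a vnode. No template DSL needed.
const userList = h(
  "ul",
  null,
  ...users.filter((u) => u.active).map((u) => h("li", null, u.name))
);
```

Because the view is just a value, any custom combinator is just another function over arrays of vnodes.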
Don't you still have imperative constructs if you use Svelte? It's not like any web framework can remove the need to render things conditionally.
If you have `{#if cond}`, that's not declarative; it's just imperative programming using a different syntax.
Using a different syntax for a conditional/loop depending on the type of the data is odd when you think about it. Why should vnodes use one syntax and every other data type use another syntax?
I will grant that C's ?: ternary operator syntax is ugly. If C had had expression-based if/else like Rust, maybe people wouldn't mind JSX so much.
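To make the "same syntax for every data type" point concrete, here is a sketch (plain objects as stand-in vnodes, nothing framework-specific): the conditional over view values is the same expression you'd use for numbers or strings.

```javascript
// Stand-in vnode helper: views are just values.
const h = (tag, props, ...children) => ({ tag, props, children });

// In a template DSL you'd write something like:
//   {#if loggedIn} <span>Welcome</span> {:else} <a>Log in</a> {/if}
// With vnodes-as-values, the ordinary conditional expression does the job:
const greeting = (loggedIn) =>
  loggedIn
    ? h("span", null, "Welcome")
    : h("a", { href: "/login" }, "Log in");
```

The ternary here is exactly the construct you'd reach for with any other type; the vnode case gets no special syntax.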
Absolutely, and that's one of the reasons that, as much as I was enthusiastic about svelte, it still didn't quite hit the spot. Close but not quite there yet.
It's neither HTML nor JS. Still a bit confusing.
I've never liked JSX, much preferring HTML-like syntax (Vue, Svelte), but I do like this template syntax. On reading, it makes sense to me in a way that JSX never did; JSX always felt squashed into the language.
HTML is an implementation detail; JSX is a syntax for describing UI components. For example, JSX can be reused across different contexts, such as React Native. It just so happens that most JSX is used to render HTML, so it superficially looks like writing HTML. But I prefer JSX because it allows me to write UI components as pure functions, without having to syntactically write out actual functions in whatever programming language I'm using.
Why are people getting hung up on his hubris? If his megalomania gives us Mathematica who cares. It's a phenomenal accomplishment. It can, out of the box, do hundreds of things that'd take you weeks in python/julia/etc or would be entirely impossible for most in other systems.
> Why are people getting hung up on his hubris? If his megalomania gives us Mathematica who cares.
We've had some bad results taking that approach recently - which we really should have anticipated, with millennia-old stories warning us about it. For one thing, megalomaniacs tend to be frauds (perhaps because, to be a megalomaniac, it's necessary to lie to yourself); maybe Mathematica will turn out to be the FTX, or Tesla Autopilot, or ..., of mathematical software/languages.
No way - how could that happen? All those people at Mathematica would just go along with it, right? There's no evidence! It would be too brazen!
Mathematica has been publicly available for 35 years now, this is the 14th release. The tool has worked extremely well and been tangible for decades, with the value during that period being in the customer's direct use. In what ways does this remind you of FTX or Tesla auto-pilot?
Well, he claimed that Mathematica, particularly cellular automata implemented in Mathematica, would bring about "A New Kind of Science", as described in his thick book of that name. It didn't, of course -- people still like to play with cellular automata like the Game of Life (or my favorite, Wireworld), but no revolution in science involving CAs has occurred. The whole thing was a marketing ploy for Mathematica disguised as science. Not to mention that the only truly new thing in the book (that the Rule 110 CA is Turing complete) wasn't even Wolfram's finding, but rather Matthew Cook's (the crediting dispute ended in a lawsuit, which has since been settled).
Wolfram does lots of wacky stuff but how does it make Mathematica any less tangibly useful today? He can want to use Mathematica as part of eradicating cancer using toilet brushes and it would still not change that Mathematica is and has been an extremely useful piece of software for decades. The point isn't every single thought he has is gospel revolution it's that Mathematica is already a delivered value.
Eeeh. Scott Aaronson's review [0] and Cosma Shalizi's review [1] of NKS kind of point in this direction, so he may well be a fraud. The plagiarism case regarding Rule 110 (and the attempts to hide it through an NDA and lawsuits [2]) doesn't do him any favors, either. Indeed, maybe Wolfram's biggest problem isn't the megalomania per se, but the total unwillingness to give credit to others and cite their damn papers. Wolfram simply isn't in the business of sharing his bibliography, which is problematic for a scientist.
However, the newer Wolfram Physics [3] [4] looks so damn promising that I'm willing to entertain possible quackery. I mean, it surely has many ideas about how a digital universe would look; he may be wrong in all the details, but his ideas look like important contributions to me. I sometimes wonder what it's like to be at the frontier of science; today we take relativity (for example) for granted, but there was a time when it was up in the air. When I read Wolfram's stuff I just think this could very well be true, and while there's absolutely no evidence for it, the conjectures all make sense.
> maybe Mathematica will turn out to be the FTX, or Tesla auto-pilot, or ..., of mathematical software/languages.
You can install Mathematica at any time. It produces graphical output. You can see for yourself what it does, and what it does not do.
If Wolfram were making claims about a future unreleased version of Mathematica, sure, I would absolutely weigh his ego against those claims. But they are largely irrelevant when it comes to a currently existing and available product. If he were to claim that Mathematica 14 cures cancer, that would deserve eyerolls but it wouldn't tarnish the quality of the software.
> We've had some bad results taking that approach recently
I'm struggling to believe that Wolfram's hubris is anything like that of Trump or Musk or SBF etc. Wolfram seems like a genuinely smart person who knows he's smart, but is probably too aware of it. But he's not done anything majorly bad - and arguably has done good in creating tools for scientists and engineers. I think we need to give him a break. I certainly think that the perennial discussion of his ego is tiresome and distracting.
I'm basing this on my experience a few years back.
It's not that you can't do it with other software, Mathematica just has all of it included by default, with very comprehensive documentation. And a slick UI.
Sure, you can probably do most of it in Python, but you'll find yourself chasing obscure modules to get some things working - even just to get arbitrary-precision calculations for just about everything, for example. And don't forget the importance of documentation that explains every option with examples (barring some obscure stuff, which can be quite annoying when you encounter it).
Since it's all integrated Mathematica allows you to go from calculating Sin[2] to arbitrary precision, to calculating the derivative of Sin[2x], to showing the first 10 terms of its Taylor series. All using the same sine function.
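For reference, the calculus behind that last step (the derivative and Taylor series of sin(2x) that Mathematica computes symbolically):

```latex
\frac{d}{dx}\sin(2x) = 2\cos(2x),
\qquad
\sin(2x) = \sum_{n=0}^{\infty} \frac{(-1)^n (2x)^{2n+1}}{(2n+1)!}
         = 2x - \frac{(2x)^3}{3!} + \frac{(2x)^5}{5!} - \cdots
```

The point of the comment stands: the same `Sin` symbol flows through numeric evaluation, differentiation, and series expansion without any glue code.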
This does have some downsides. For one, it's a pretty heavy program to run. And because they want to include everything, they need to be quite opinionated about certain things. There are multiple ways to define fractional derivatives, for instance; since they include a function for it, they had to pick one. And then there are the name clashes: I once had to laugh quite loudly when I tried "Rotate[{0,1}, 45deg]" and got back an image of {0,1} at a 45 degree angle.
It is an impressive piece of engineering. Which could have had much more of an impact if it was a bit more open, but oh well...
You should give it a spin. It’s an incredibly comprehensive Computer Algebra System with all the batteries included…. No, with a small nuclear reactor in the box.
I’m just sad that I don’t have access anymore after college, being too cheap to pay for a license.
- Far better language design, actually designed from the ground up for writing mathematics, rather than hacked in later.
- Specifically, the language supports symbolic computation natively. In a lot of software, you have to declare symbols as symbolic variables before using them. In Mathematica, you don't have to deal with this nonsense.
- A library of both symbolic and numerical algorithms that is far, far better than any other library out there. It's especially outstanding how many symbolic computation algorithms are built in. Fairly fast numerical algorithms as well. Best of all, Mathematica guesses the best algorithm to use in a particular situation with frightening accuracy.
- A lot of maths is just built in. Example: a number of common groups are just there for you to immediately start playing with.
I had thought of him as megalomaniac, but I recently heard him do a two hour podcast on the Joe Walker Podcast (formerly Jolly Swagman). He came off as relatively normal for someone who earned a Ph.D. in particle physics from Caltech at the ripe age of 20.
Sad that this guy could've continued doing good CS research as an associate professor, but instead is compelled to do YouTube and sell productivity courses on the same three rehashed ideas ad infinitum.
It’s even sadder that you feel the need to comment on someone else’s career as if you decide what is important or not in terms of human work and worth.
Cal has applied systematic and algorithmic thinking to the messy world of productivity and life, and teaches unique frameworks to this day about how to pursue a life that is eudaimonic and well lived.
It’s also distressing that you think that mathematics research is more important than your fellow man having a fulfilling career.
His non-academic work is an unbroken continuation/extension of work he started as an undergraduate student writing about productivity for students before he was ever a researcher.
Your comment reads like he was just another researcher searching for gigs to supplement a meager academic income instead of being a human being with more interests than just his primary line of work.