The integer 2026 is semiprime and a happy number, with 365 as one of its primitive roots. Although 2026 may not be particularly noteworthy in number theory, these properties provide a great excuse to create various elaborate visualizations that reveal some interesting aspects of the number.
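These claims are easy to verify. Below is a minimal, dependency-free Raku sketch; the helper subs are mine, written for illustration:

```raku
# Trial-division factorization; good enough for small integers like 2026.
sub prime-factors(Int $n is copy) {
    my @factors;
    my $d = 2;
    while $d * $d <= $n {
        while $n %% $d { @factors.push($d); $n div= $d }
        $d++;
    }
    @factors.push($n) if $n > 1;
    @factors
}

say prime-factors(2026);   # (2 1013) -- exactly two primes, so 2026 is semiprime

# Happy number: iterating "sum of the squares of the digits" must reach 1.
sub is-happy(Int $n is copy) {
    my %seen;
    until $n == 1 || %seen{$n}++ {
        $n = $n.comb.map(* ** 2).sum;
    }
    $n == 1
}
say is-happy(2026);        # True (2026 → 44 → 32 → 13 → 10 → 1)

# 365 is a primitive root mod 2026 iff its multiplicative order is φ(2026) = 1012,
# i.e. 365 ** (1012/q) ≠ 1 (mod 2026) for every prime q dividing 1012.
say 365 gcd 2026;          # 1, so 365's order modulo 2026 is well defined
my $phi = 1012;
say so prime-factors($phi).unique.map({ expmod(365, $phi div $_, 2026) != 1 }).all;  # True
```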
This document (notebook) shows how to transform a movie dataset into a format more suitable for data analysis and for building a movie recommender system. It is the first of a three-part series of notebooks that showcase Raku packages for doing Data Science (DS).
Yes, Wolfram Language (WL) -- aka Mathematica -- introduced `Tabular` in 2025. It is a new data structure with a constellation of related functions (like `ToTabular`, `PivotToColumns`, etc.). In my experience, using it is 10 to 100 times faster than using WL's older `Dataset` structure, with both didactic and real-life data of 1,000 to 100,000 rows and 10 to 100 columns.
Mostly, because Python is not a good "discovery" and prototyping language. It is like that by design -- Guido van Rossum decided that TMTOWTDI is counter-productive.
Another point, which I could have mentioned in my previous response -- Raku has a more elegant and easier-to-use framework for asynchronous computation.
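For instance, spawning and joining concurrent tasks needs nothing beyond the language core -- a minimal sketch with placeholder task bodies:

```raku
# Launch four tasks on the thread pool; each `start` returns a Promise.
my @promises = (1..4).map: -> $n {
    start { sleep $n / 10; "task $n done" }
};

# `await` blocks until all Promises are kept and returns their results.
.say for await @promises;
```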
IMO, Python's introspection matches Raku's introspection.
Some argue that Python's LLM packages are more numerous and better than Raku's. I agree on the "more" part. I am not sure about the "better" part:
- Generally speaking, different people prefer to decompose computations in different ways.
- When I re-implemented Raku's LLM packages in Python a few years ago, Python did not have equally convenient packages.
Ah, yes, Raku's "LLM::Graph" is heavily inspired by the design of the Wolfram Language (aka Mathematica) function LLMGraph.
WL's LLMGraph is more developed and productized, but Raku's "LLM::Graph" is catching up.
I would like to say that "LLM::Graph" was relatively easy to program because of Raku's introspection, wrappers, asynchronous features, and pre-existing LLM packages. As a consequence, the code of "LLM::Graph" is short.
Wolfram Language does not have that level of introspection, but otherwise it is likely the better choice, mostly because of its far greater scope of functionality. (Mathematics, graphics, computable data, etc.)
In principle, a corresponding Python "LLMGraph" package can be developed for comparison purposes. Then the "better choice" question can be answered in a more informed manner. (The Raku packages "LLM::Functions" and "LLM::Prompts" already have corresponding Python implementations.)
Asynchronous LLM computations with Raku's "LLM::Graph" manage complex, multi-step LLM workflows by representing them as graphs. With a workflow defined as a graph, independent LLM function calls execute concurrently, giving higher throughput and lower latency than synchronous, step-by-step evaluation.
"LLM::Graph" uses the graph structure to manage dependencies between tasks: each node represents a computation, and the edges dictate the flow. Asynchronous evaluation is the default, with specific options available for control.
I think Raku is better than Python for agent-based systems, for a few reasons:
- You don't have to think about concurrency or multithreading the way you do in Python. There is no GIL to worry about, and built-in support for things like Supply and hyper-operators is right there in the language. It is really easy to hook up the disparate parts of a distributed agent without reaching for async or actor libraries, as you would in Python. (See the first sketch after this list.)
- Something I prefer is the OOP abstractions in Raku. They are much richer than Python's. YMMV, depending on what you prefer.
- Better out-of-the-box support for gradual typing and constraints in Raku. (See the second sketch after this list.)
Python wins on the AI ecosystem though :)
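To illustrate the concurrency point, here is a minimal sketch of a message bus shared by two "agents", using only built-ins; the bus and agent roles are illustrative:

```raku
# A shared message bus: Supplier is the writable end, .Supply the readable one.
my $bus = Supplier.new;

# Two "agents" tap the same live supply and react independently.
$bus.Supply.tap(-> $msg { say "logger saw: $msg" });
$bus.Supply.grep(*.starts-with('task:')).tap(-> $msg { say "worker handling $msg" });

# Publish a few messages; both taps receive them.
$bus.emit($_) for 'hello', 'task: summarize', 'task: translate';
```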
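And for the typing point, a sketch of a `subset` constraint enforced at the signature level; the `Probability` type is my own example:

```raku
# A refinement type: any Numeric between 0 and 1.
subset Probability of Numeric where { 0 <= $_ <= 1 };

sub weight(Probability $p, Str $label = 'score') { "$label: $p" }

say weight(0.8);              # score: 0.8
my $bad = try weight(1.5);    # the type-check failure is caught
say $bad // 'rejected by the Probability constraint';
```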
I started messing around with this code several years ago, when the LLM libraries in Raku were not as rich as they are today. I thought I needed a specific type of LLM message-handling structure that could be extended to do tool handling and some Letta-style memory management (which I never got around to!). I had some Python libraries of my own, and I ported them. I suspect that if I were starting now, I would use what is available in the community. This version of TallMountain is the last of a long series of prototypes, so I never rewrote those parts.
Nice to see others who think that Raku is a good fit for LLMs ... I have had some success integrating LLM::DWIM (a Raku command-line LLM client built on LLM::Functions etc.) with a DSL approach to make a command-line calculator based on Raku Grammars:
> crag
> ?^<elephant mass in kg> / ?^<mouse mass in kg>   # 300000
> ?^<speed of a flying swallow in mph>             # 30 mph
- «Numerically 2026 is unremarkable yet happy: semiprime with primitive roots» https://community.wolfram.com/groups/-/m/t/3594686
- «Happy √2²²-22 -- And other ways to calculate 2026» https://community.wolfram.com/groups/-/m/t/3599161