It can be argued that there is an intuitive satisfaction/pleasure/utility that spectators gain from watching sports competitions. The payoff is a lot more obvious and instant. Whereas with a lot of tech these days, what needle are we really moving? Are people truly happier scrolling for two hours, compared with watching an edge-of-your-seat soccer game?
I recall way back in the day when Stephen Fry used to host QI, they did a bit about a lasagne battery. Sean Lock was on the panel, if I remember, and spun it out into an iPod-style "lasagne-pod".
Trivial to eliminate through window treatments, and through training to mitigate shoulder-surfing risks.
It’s probably more valuable as a surveillance and monitoring tool than an espionage one, but they would no doubt be the first customers (if not already).
I find this interesting but the nomenclature escapes me: how is T(z) = z + zT + zT^2 + ... ? The jump from the functional programming description of the tree to this recurrence relation isn't intuitive to me.
I'm afraid the author is writing for an audience that's familiar with generating functions. They're using a common trick in analytic combinatorics; the notation is actually shorthand.
Very informally, let T(z) be the generating function where the coefficient of the z^n term is the number of trees we are searching with exactly n nodes.
We know from the functional description that such a tree can be:
- a leaf;
- a node with 1 child;
- a node with 2 children;
- a node with 3 children.
A leaf is a tree with one node and that's it, so it "contributes" as a single tree of size 1, i.e. the term z.
How many trees of size n are there that fall in the second case? Exactly as many as the total trees of size (n-1), hence this case "contributes" as zT(z).
How many trees of size n fall in the third case? That's a bit harder to see, because now you have (n-1) nodes in total, split between the two branches. So you can have 1 node in the left branch and n-2 in the right, 2 in the left and n-3 in the right, and so on. With a bit of basic combinatorics you can notice that this is exactly the convolution product of the series with itself, in analytic terms T^2(z). Hence the third case "contributes" zT^2(z).
The fourth case is similar, contributing zT^3(z).
At this point we have constructed the inverse of the generating function we are looking for, i.e. z as a function of T(z) rather than the reverse, but luckily there's a standard trick for extracting the coefficients anyway: Lagrange inversion.
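To make this concrete, here's a quick sketch (my own check, not from the original post) that computes the coefficients of T(z) = z(1 + T + T^2 + T^3) by fixed-point iteration on truncated power series, then cross-checks one coefficient against the Lagrange inversion formula [z^n] T = (1/n) [w^(n-1)] (1 + w + w^2 + w^3)^n:

```python
# Count trees where each node has 0, 1, 2 or 3 children,
# i.e. the coefficients of T(z) = z * (1 + T + T^2 + T^3).

N = 10  # keep coefficients of z^0 .. z^(N-1)

def mul(a, b):
    """Convolution product of two truncated power series."""
    c = [0] * N
    for i, ai in enumerate(a):
        for j, bj in enumerate(b):
            if i + j < N:
                c[i + j] += ai * bj
    return c

one = [1] + [0] * (N - 1)
z = [0, 1] + [0] * (N - 2)

# Fixed-point iteration: each pass pins down one more coefficient,
# because the right-hand side multiplies T by z.
T = [0] * N
for _ in range(N):
    T2 = mul(T, T)
    T3 = mul(T2, T)
    s = [a + b + c + d for a, b, c, d in zip(one, T, T2, T3)]
    T = mul(z, s)

print(T[1:8])  # -> [1, 1, 2, 5, 13, 36, 104]

# Cross-check t_5 via Lagrange inversion:
#   [z^n] T = (1/n) * [w^(n-1)] (1 + w + w^2 + w^3)^n
n = 5
p = [1]
for _ in range(n):  # expand (1 + w + w^2 + w^3)^n
    q = [0] * (len(p) + 3)
    for i, pi in enumerate(p):
        for j in range(4):
            q[i + j] += pi
    p = q
assert p[n - 1] // n == T[n]  # 65 // 5 == 13
```

The printed values 1, 1, 2, 5, 13, 36, 104 agree with a brute-force count of such trees with 1 through 7 nodes.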
Generating functions are definitely strange and unintuitive, but they can be really helpful for combinatorics.
The introduction to generating functions in this well-known text on the subject (at the beginning of Chapter 1) might be more helpful than the Wikipedia article linked in the original post:
> Although giving a simple formula for the members of the sequence may be out of the question, we might be able to give a simple formula for the sum of a power series, whose coefficients are the sequence that we’re looking for.
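For instance (my illustration, not from the text): there's no obvious elementary formula for the n-th Fibonacci number at first sight, yet the sum of the power series whose coefficients are the Fibonacci numbers has the very simple closed form z / (1 - z - z^2), which is easy to check numerically inside the radius of convergence:

```python
# Check that sum_{n>=0} f_n * z^n == z / (1 - z - z^2)
# for the Fibonacci numbers f_0 = 0, f_1 = 1, f_n = f_{n-1} + f_{n-2}.

f = [0, 1]
for _ in range(40):
    f.append(f[-1] + f[-2])

x = 0.3  # inside the radius of convergence (about 0.618)
series = sum(fn * x**n for n, fn in enumerate(f))  # partial sum, 42 terms
closed = x / (1 - x - x * x)

assert abs(series - closed) < 1e-9  # both are ~0.4918
```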
This. There seems to be one of these announcements every so often, and I haven't seen any of them used at scale, or making any kind of dent in the status quo.
Do you use WRF as an input? I think GraphCast uses some kind of NWP input (which in my mind I've equated to WRF). Thank you for offering to answer questions!
We use a commercial meteorology service (OpenWeatherMap, with 500 m x 500 m resolution) as one input, so they might use WRF data (and indirectly my model as well).
OWM was chosen because they provide an easy API for general weather data.
Last I checked, the AI models still used the same initial conditions as models like the GFS and ECMWF: a combination of a previous forecast adjusted with observations.
What are some good metrics for evaluating LLM output performance in general? Or is it too hard to quantify at this stage (or not understood well enough)? Perhaps the latter; otherwise those metrics could be in the loss function itself.