Only if the Windows Regional Settings list separator happens to be a comma, which is not the case in most of Europe (even in regions that use the decimal point), so only CSV files with SEP=, as the first line work reliably with Excel.
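As a toy illustration (Python stdlib only, nothing Excel-specific), a sketch of writing a CSV with the `sep=,` hint as the first line, and skipping that hint when reading the file back:

```python
import csv
import io

# Write a CSV with an Excel "sep=," hint as the first line.
# Excel honors this hint and uses "," as the delimiter even when
# the regional list separator is ";" (common in Europe).
buf = io.StringIO()
buf.write("sep=,\n")
writer = csv.writer(buf, delimiter=",")
writer.writerow(["name", "value"])
writer.writerow(["pi", "3.14159"])

text = buf.getvalue()

# When reading such a file back, skip the hint line before parsing,
# since it is not part of the data.
lines = text.splitlines()
if lines[0].lower().startswith("sep="):
    lines = lines[1:]
rows = list(csv.reader(lines, delimiter=","))
print(rows)  # [['name', 'value'], ['pi', '3.14159']]
```

Note that only Excel understands the hint line; other CSV consumers will treat it as data, which is why it has to be stripped on read.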
Literally did this all day today: took a CSV file, parsed it in Elixir, processed it, and created a new CSV file, then opened that in Excel to confirm the changes. At least 100 times today.
Does anyone predict the economy/population/... by simulating individual people based on real census information? Monte Carlo simulation of major events (births, deaths, ...) based on known statistics conditioned on age, economic background, location, education, profession, etc.? The population doesn't seem so large that this would be computationally infeasible, and states and companies have plenty of data to feed into such systems. Is it not needed because other alternatives give better results, or is it already being done?
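At toy scale the idea can be sketched directly; here is a minimal, hypothetical Python example (made-up rates, not real census statistics) simulating per-person births and deaths with age-dependent probabilities:

```python
import random

random.seed(42)

# Hypothetical annual rates by age -- illustrative only,
# not real demographic statistics.
def death_prob(age):
    return 0.001 if age < 60 else 0.02 + 0.001 * (age - 60)

def birth_prob(age):
    return 0.08 if 20 <= age < 40 else 0.0

# Each person is represented only by an age here; a real model would
# also carry location, education, profession, income, etc.
population = [random.randint(0, 90) for _ in range(10_000)]

for year in range(10):
    next_pop = []
    for age in population:
        if random.random() < death_prob(age):
            continue  # death event: person is removed
        next_pop.append(age + 1)  # survivor ages one year
        if random.random() < birth_prob(age):
            next_pop.append(0)  # birth event: add a newborn
    population = next_pop

print(len(population))
```

Scaling this from one attribute and two event types to full behavioral profiles over hundreds of millions of people is where the difficulties discussed below come in.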
I've done a lot of advanced research in this domain. It is far more difficult than people expect for a few reasons.
The biggest issue is that the basic data model for population behavior is a sparse metastable graph with many non-linearities. How to even represent these types of data models at scale is a set of open problems in computer science. Using existing "big data" platforms is completely intractable; they are incapable of expressing what is needed. These data models also tend to be quite large, 10s of PB at a bare minimum.
You cannot use population aggregates like census data. Doing so produces poor models that don't match ground truth in practice, for reasons that are generally understood. It requires having distinct behavioral models of every entity in the simulation, i.e. a basic behavioral profile of every person. It is very difficult to get entity data sufficient to produce a usable model. Think privileged telemetry from mobile carrier backbones at country scale (which is a lot of data -- this can get into petabytes per day for large countries).
Current AI tech is famously bad at these types of problems. There is an entire set of open problems here around machine learning and analytic algorithms that you would need to research and develop. There is negligible literature around it. You can't just throw tensorflow or LLMs at the problem.
This is all doable in principle, it is just extremely difficult technically. I will say that if you can demonstrably address all of the practical and theoretical computer science problems at scale, gaining access to the required data becomes much less of a problem.
I’m also super interested in this kind of question. The late Soviet Union and their cybernetics research were really into simulating this kind of stuff to improve the planned economy. But I’m curious if something like this can be done on a more local scale, to improve things like a single company output.
You might find early agent-based models (e.g. the Santa Fe Institute's Artificial Stock Market[0]) interesting.
IMO the short answer is that such models can be made to generate realistic trajectories, but calibrating the model to the specific trajectory of reality we inhabit requires knowledge of the current state of the world bordering on omniscience.
Agent-based modeling (ABM) is an attempt at this. I've wanted to forecast the economy on a per-person basis since playing Sim City as a kid (although Sim City is not an ABM, to be clear). From doing a bit of research a while back, it seemed like the research and real-world forecasting have been done on a pretty small scale and nothing as grand as I'd hoped. It's been a while since I've looked into it, so I would be happy to be corrected.
Doyne Farmer's group at Oxford does 'agent-based' economics simulations in this vein. He has a new book called 'Making Sense of Chaos' that describes it.
Feynman has an interesting story about critical mass:
> Los Alamos was going to make the bomb, but at Oak Ridge they were trying to separate the isotopes of uranium ... he saw them wheeling a tank carboy of water, green water - which is uranium nitrate solution. He says, "Uh, you're going to handle it like that when it's purified too? Is that what you're going to do?" They said, "Sure -- why not?" "Won't it explode?" he says. "Huh! Explode?" ... he noticed certain boxes in big lots in a room, but he didn't notice a lot of boxes in another room on the other side of the same wall ... what you would have to do to fix this. It's rather easy. You put cadmium in solutions to absorb the neutrons in the water, and you separate the boxes so they are not too dense ...
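The fix Feynman describes can be shown with a toy one-group neutron balance (illustrative numbers, not real cross-section data): the multiplication factor k is neutrons produced per neutron absorbed, and adding an absorber like cadmium raises the absorption term, pushing k below 1 so no chain reaction can build up.

```python
# Toy one-group neutron balance -- all numbers are made up
# for illustration, not real nuclear data.
def multiplication_factor(production, fuel_absorption, absorber_absorption):
    """k = neutrons produced per neutron absorbed."""
    return production / (fuel_absorption + absorber_absorption)

# Without cadmium: a hypothetical solution near criticality.
k_without = multiplication_factor(1.05, 1.0, 0.0)

# With cadmium dissolved in the solution: strong extra absorption.
k_with = multiplication_factor(1.05, 1.0, 0.5)

print(k_without > 1.0)  # supercritical: each generation grows
print(k_with < 1.0)     # subcritical: neutrons are soaked up
```

Separating the boxes works on the other term: lower density means more neutrons leak away instead of being absorbed in fuel, which also drives k down.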
WPF is now open source (MIT licensed [1]), and its XAML control templates provide _as data_ a full declarative description of how a native Windows control is supposed to look (in multiple Windows themes, like Aero for Win7, Aero2 for Win10, Luna + Royale for WinXP, and Classic for the Win95 look and feel [2]).
This includes everything like the exact colors and gradient stops and animation timing and vector shapes and accessibility behavior etc. of buttons and scrollbars and everything. Example: [3]
I wonder what one could learn / achieve trying to "port WPF to rust" / implement a XAML control template renderer in Rust. If you can "simply" parse and interpret those XAML files do you instantly get a native-like GUI that supports the exact look and feel of these different Windows themes? (on any OS!)
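To make the "it's all just data" point concrete, here is a toy sketch (Python stdlib XML parsing of a made-up template fragment, not real WPF theme XAML) that pulls the gradient stops out of a control template:

```python
import xml.etree.ElementTree as ET

# A made-up fragment in the style of a WPF control template;
# the real theme XAML (e.g. Aero2's Button template) is far
# larger but equally declarative.
template = """
<ControlTemplate xmlns="http://schemas.microsoft.com/winfx/2006/xaml/presentation">
  <Border>
    <Border.Background>
      <LinearGradientBrush>
        <GradientStop Color="#FFF3F3F3" Offset="0.0"/>
        <GradientStop Color="#FFEBEBEB" Offset="0.5"/>
        <GradientStop Color="#FFCDCDCD" Offset="1.0"/>
      </LinearGradientBrush>
    </Border.Background>
  </Border>
</ControlTemplate>
"""

ns = "{http://schemas.microsoft.com/winfx/2006/xaml/presentation}"
root = ET.fromstring(template)

# Everything a renderer needs -- colors, offsets -- is plain data.
stops = [
    (stop.attrib["Offset"], stop.attrib["Color"])
    for stop in root.iter(f"{ns}GradientStop")
]
print(stops)
```

A hypothetical Rust renderer would do the same kind of traversal, then map the extracted brushes, shapes, and animation timings onto its own drawing backend.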
Somehow I think people don't realize how amazing that is!
I’ve had similar thoughts and started writing my own Rust XAML framework with exchangeable backends (backend implementations incomplete), but didn’t find much interest from the community despite how awesome XAML is at separating the UI from the toolkit.