It’s very hard to do tags in the physical world. You need to stick different colored Post-its to things and do a full table scan (with your eyes) any time you want to process all the docs with a given tag. Or you cluster things together by color.
Hierarchy is easy in the physical world.
But what’s crazy is that since the dawn of computing we’ve been able to store data however we want and project it however we want… and yet we still use hierarchy for file storage, as if we still just have a filing cabinet of manila folders.
My favorite is traditional Japanese time. It breaks day and night into six equal time periods each, and adjusts them as the seasons change. I made a Shaku Dokei (19th-century Japanese pillar clock) simulator to play with it.
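For a feel of how those seasonally stretched hours work, here’s a minimal TypeScript sketch; the sunrise/sunset figures are made-up inputs, since a real clock would derive them from the date and latitude:

```typescript
// Temporal hours: daylight is split into six equal units, so the length
// of an "hour" stretches and shrinks with the season.
function temporalHourMinutes(sunrise: number, sunset: number): number {
  // sunrise and sunset are given as minutes since midnight
  return (sunset - sunrise) / 6;
}

// A midsummer day with 05:00-19:00 daylight: each daytime hour is 140 min.
console.log(temporalHourMinutes(5 * 60, 19 * 60)); // 140
// A midwinter day with 08:00-17:00 daylight: each daytime hour is only 90.
console.log(temporalHourMinutes(8 * 60, 17 * 60)); // 90
```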
In a way, China kind of has something like the Swatch gimmick for real. There's just one time zone in the whole country (which is roughly the size of the Continental US). This has benefits (easy to coordinate video conferences in different cities) and drawbacks (the official time is far off from what the sun would indicate in much of China).
It only works because the overwhelming majority of the population, and all of the political and economic power, lie on the east coast of China in a single time zone. I doubt that the people in Urumqi are happy to have the sun rise at 10 am, and I doubt that anyone cares about their opinions.
I stayed in Urumqi four times (in 1993, 1996, 2006 and 2010), each time for a few weeks. It is really confusing that the (traditional) working hours are from 10AM to 2PM and 4PM to 8PM. I found myself looking at the clock and subtracting two hours every time. It was similar to when we changed currencies in the Netherlands when the Euro was introduced. I guess it would have taken about half a year to stop doing the reverse time calculations.
Given that the sun rose at 10 am approximately 0 times this year [1], I guess the people in Urumqi were ecstatic. Also note that the sunrise time varied by about 3h over the course of the year, so how many times do you want to change the clocks?
Of course, to your actual point (and everybody else’s who brings up that same one): you do not need to wake up at 8am every day ... If the sun actually rises at 1400 then feel free to start your day at 1500. I’m not sure why people keep arguing as if they can’t figure out a time besides 8am to wake up; look at the world around you, so many people wake up at wildly different times within the same timezone.
I get that it’s all arbitrary, but if we’re going to go for a global time system, it just makes sense to me to try to align with already-existing universal standards. Both UTC+0000 and UTC+0100 are pretty alien to me as someone in UTC+0800, so it’s not like I have any bias toward Switzerland or the UK either way...
I think the idea is that for most human uses of time we don’t specify start or end times to a precision finer than about 5 minutes. For stuff like train timetables you might want to go down to about a minute. So one could argue that we have at least 60 times the resolution we really need for day-to-day use.
If you absolutely need more precision (accurate timestamping) then decimals are available.
Yep, though most people use microwaves by pressing the "30s" button (I guess it would be labelled 1/2 or 1/3) n times. Other cooking seldom requires time precision of less than a minute; for finicky, precise things you usually watch the process and manage it by eye rather than relying on absolute time.
"way less precise" ? There are only 1440 minutes in a day, so a beat is 1 minute and 26.4 seconds, precise enough. And then, if you you want more precision, like we use seconds for minutes, you can divide a beat by 100 (@500.12), not less inconvenient than using seconds.
What if my SQL engine is Presto, Trino [1], or a similar query engine? If it’s federating multiple source databases, do we peel the SQL back and get... SQL? Or do we peel the SQL back and get... S3 + Mongo + Hadoop? Junior analysts would work at 1/10th the speed if they had to use those raw.
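For what it’s worth, "peeling back" a federated query mostly just exposes more SQL over connectors. A rough TypeScript sketch against Trino’s REST protocol; the coordinator URL, catalogs, and table names here are all hypothetical:

```typescript
// One SQL statement spanning two systems: a Hive table on S3 joined to a
// MongoDB collection. Trino's client protocol is: POST the SQL to
// /v1/statement, then follow the nextUri links to page through results.
const sql = `
  SELECT o.order_id, c.name
  FROM hive.sales.orders o        -- files on S3, via the Hive connector
  JOIN mongodb.crm.customers c    -- a Mongo collection
    ON o.customer_id = c.id
  LIMIT 10
`;

const res = await fetch("http://localhost:8080/v1/statement", {
  method: "POST",
  headers: { "X-Trino-User": "analyst" }, // a user header is required
  body: sql,
});
const first = await res.json();
console.log(first.nextUri); // poll this URI until the query finishes
```

The point being: the junior analyst writes the join; Trino’s connectors do the raw S3/Mongo work underneath.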
>My dream is to make everything visualizable at runtime.
Check out demos of the old Lisp Machines: [1] is a brief overview demo, and [2] links to a timestamp with a view of some simple diagramming. But I’ve seen TI and Symbolics beasts routinely display complex relationships in Lisp code on their massive (for the time) bitmapped screens. The limitation was the end user managing the visualization complexity.
With open-source LLVM, Clang, and similar tools exposing abstract syntax trees and even semantic-analysis and type-checking results, LLMs assisting with decompiling binary blobs, and modern hardware (goggles, graphics cards, and so on), I sometimes wonder how close we can come to reproducing that aspect of the Lisp Machine experience on open source operating systems.
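As a small taste of that, Clang will already hand you its AST as JSON. A hypothetical TypeScript sketch that shells out to it and lists top-level declarations; the file name is made up, and it assumes a reasonably recent clang on your PATH:

```typescript
import { execFileSync } from "node:child_process";

// Ask clang to dump the AST of a translation unit as JSON, then walk the
// top level of the tree.
const out = execFileSync(
  "clang",
  ["-Xclang", "-ast-dump=json", "-fsyntax-only", "hello.c"],
  { maxBuffer: 64 * 1024 * 1024 }, // AST dumps can be large
);

const ast = JSON.parse(out.toString());

// A trivial "visualization": print each top-level declaration's kind and name.
for (const node of ast.inner ?? []) {
  console.log(node.kind, node.name ?? "");
}
```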
Are you referring to the "exclusive, mandatory file locks by default" policy it carried over from MS-DOS? If so, yes, I do feel like that’s the worst architectural decision in Windows.
Ultimately, it’s what makes software updates as hard as they are there: just being able to do an "apt-get upgrade" with applications open, without anything exploding, was a true revelation the first time I used a Unix.
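A tiny Node/TypeScript demo of the Unix behavior that makes this possible; run it on a Unix filesystem (the file name is arbitrary):

```typescript
import { openSync, readSync, unlinkSync, writeFileSync } from "node:fs";

// On Unix, unlinking a file only removes its directory entry; the inode
// survives until the last open descriptor closes. That's why a package
// manager can replace a binary while the old version keeps running.
writeFileSync("demo.txt", "still readable after unlink\n");
const fd = openSync("demo.txt", "r");
unlinkSync("demo.txt"); // the name is gone...

const buf = Buffer.alloc(64);
const n = readSync(fd, buf, 0, buf.length, 0);
console.log(buf.toString("utf8", 0, n)); // ...but the contents are not
```

On Windows, with a default exclusive open, that unlink step is exactly what fails while the file is held.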
Something cool, however: you can actually build the open-source WebKit browser engine yourself and make closed-source Safari use your locally built version.
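If I remember the WebKit project’s workflow correctly (worth double-checking against the build instructions on webkit.org), it’s roughly: clone the WebKit repo, build with `Tools/Scripts/build-webkit`, then launch via `Tools/Scripts/run-safari`, which starts the installed Safari app against your freshly built WebKit frameworks (by pointing `DYLD_FRAMEWORK_PATH` at the build output) instead of the system copies.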
Interesting read. It’s amazing more people don’t use runtime variable value annotation tools like Wallaby.js, or a debugger.
So much time is spent mentally remembering what is in which variable based on naming alone.
I often find myself adding "// e.g. foo, bar" to show example cases for some lines of code… like regexes, for example. Wallaby.js is a godsend for this, though.
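Something like this, say (a made-up slugify helper, purely to illustrate the annotation style):

```typescript
// The kind of inline "// e.g." annotation described above: sample values
// noted next to each step so you don't have to re-derive them mentally.
function slugify(title: string): string {
  const lower = title.toLowerCase();                   // e.g. "hello, world!"
  const stripped = lower.replace(/[^a-z0-9\s-]/g, ""); // e.g. "hello world"
  return stripped.trim().replace(/\s+/g, "-");         // e.g. "hello-world"
}

console.log(slugify("Hello, World!")); // "hello-world"
```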