> One massive restriction on ultralight airplanes is that you can't carry passengers. Again, on paper, because of risk, but it ends up having some controversial effects. If you want to occasionally carry passengers, as many pilots do, you can't fly a cheap, simple, light, and slow ultralight aircraft.
Interesting. I got a ride as a passenger in an ultralight two-seater. The pilot was a Vietnam vet. It was in Virginia, and we overflew part of the Great Dismal Swamp, which has alligators.
Nice! If it was a two-seater, it doesn't legally qualify as an ultralight in the US, see FAR 103.1(a) [1]. Back in the day, some older ultralights were designed with a very wide seat (a "loveseat") that physically, but not legally, allowed the pilot to take a passenger. Not sure if that's the case here. If the aircraft actually had two distinct seats, one clearly intended for a passenger, then it may have been a homebuilt aircraft, registered in the "experimental" category (FAR 21.191). Just like ultralights, these also operate without a type certificate, but upon completion of the build, the builder needs to provide a bunch of logs / photos / etc. documenting the construction process for the FAA to approve the aircraft for flight. Because such aircraft are designed to be built/assembled mostly (51%) by regular people (not at the factory), they often look very basic, just like proper ultralights, but they can be heavier, faster, carry passengers, etc.
I see, so this guy was probably licensed to carry passengers. Thanks.
The seating arrangement was pilot in the front, passenger behind. When my son (about 8) got off his ride and was asked if he enjoyed it, he said "I prefer planes with doors".
"Voter literacy" is obviously a good thing; but blockchain and encryption are not part of that. Very few software developers, for example, are qualified to evaluate a crypto scheme.
> But it is possible that every extended family will have at least one member who can run a server
That's as may be; but many, many people have no access to an "extended family". And extended families are not necessarily warm, safe spaces where everyone trusts everyone else; extended families are more likely to be "broken" than nuclear families.
> And extended families are not necessarily warm, safe spaces where everyone trusts everyone else; extended families are more likely to be "broken" than nuclear families.
It is a good thing to promote and advance privacy, security, and freedom to isolated, atomized individuals; but it is important for all of humanity to promote and advance those same ideals to extended families. People who have no access to an extended family will ultimately either join a different one or disappear into the mists of ages past. In 100 years, the Earth will be populated mostly by the descendants of people in extended families today, however imperfect or even broken those extended families may be. If those people today don't see privacy, security, and freedom as both possible and worthy, their descendants may not value or even possess any of those ideals.
> How costly can it be to test the file fully in a CI job?
It didn't need a CI job. It just needed one person to actually boot and run a Windows instance with the Crowdstrike software installed: a smoke test.
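Even the most basic version of that would have caught this. A rough sketch of what I mean is below; everything in it is made up for illustration (the host name, the install path, the assumption that the test VM has OpenSSH), it's not CrowdStrike's actual tooling, just "push the file to one machine, reboot it, and see if it comes back":

```python
# A bare-bones smoke test (hypothetical plumbing throughout): copy the candidate
# content file onto one disposable Windows VM, reboot it, and refuse to ship if
# the machine never comes back. Host name, install path, and the use of
# OpenSSH/scp are assumptions for the sketch, not CrowdStrike's real tooling.
import socket
import subprocess
import sys
import time

VM_HOST = "win-smoke-vm.example.internal"    # throwaway test VM (assumed)
CONTENT_FILE = "candidate-content-file.sys"  # whatever file is about to ship
INSTALL_DIR = "C:/CrowdStrike"               # placeholder path on the VM
RDP_PORT = 3389                              # open RDP port as a crude "Windows booted" check
BOOT_TIMEOUT_S = 600

def vm_is_up() -> bool:
    try:
        with socket.create_connection((VM_HOST, RDP_PORT), timeout=5):
            return True
    except OSError:
        return False

def main() -> int:
    # Push the candidate file where the sensor will pick it up.
    subprocess.run(["scp", CONTENT_FILE, f"{VM_HOST}:{INSTALL_DIR}/{CONTENT_FILE}"], check=True)
    # Reboot so the kernel driver parses the new file at boot, the worst case.
    subprocess.run(["ssh", VM_HOST, "shutdown /r /t 0"], check=True)

    time.sleep(60)  # give the reboot a head start
    deadline = time.time() + BOOT_TIMEOUT_S
    while time.time() < deadline:
        if vm_is_up():
            print("smoke test passed: VM rebooted cleanly with the new content file")
            return 0
        time.sleep(10)

    print("smoke test FAILED: VM never came back -- do not ship this file")
    return 1

if __name__ == "__main__":
    sys.exit(main())
```

If the one machine never comes back, the update doesn't ship. That's the whole test.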
TFA is mostly an irrelevant discourse on the product architecture, stuffed with proprietary Crowdstrike jargon, with only a couple of paragraphs dedicated to the actual problem; and they don't mention the non-existence of a smoke test.
To me, TFA is not a signal that Crowdstrike has a plan to remediate the problem, yet.
You just got tricked by this dishonest article. The whole section that mentions dogfooding is only about actual updates to the kernel driver. This was not a kernel driver update, the entire section is irrelevant.
This was a "content file", and the first time it was interpreted by the kernel driver was when it was pushed to customer production systems worldwide. There was no testing of any sort.
It's worse than that -- if your strategy actually was to use the customer fleet as QA and monitoring, then it probably wouldn't take you an hour and a half to notice that the fleet was exploding and withdraw the update, as it did here. There was simply no QA anywhere.
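If you really are going to treat the customer fleet as your QA, the bare minimum is a staged rollout with telemetry-driven auto-halt, something like the sketch below. All the hooks are hypothetical placeholders, not a real CrowdStrike API; the point is only that a sudden drop in check-ins from a small canary cohort should withdraw the update automatically within minutes:

```python
# Sketch of "fleet as QA" done minimally responsibly (all hooks here are
# hypothetical placeholders): roll out in stages, watch canary check-in
# telemetry, and withdraw the update automatically within minutes.
import time

ROLLOUT_STAGES = [0.001, 0.01, 0.1, 1.0]   # fraction of the fleet per stage
MISSING_CHECKIN_THRESHOLD = 0.02           # >2% of canaries going silent => abort
OBSERVATION_WINDOW_S = 15 * 60             # minutes, not the ~90 it took here

def staged_rollout(push_update, count_checkins, withdraw_update, fleet_size):
    """push_update(n) deploys to n hosts; count_checkins(n) returns how many of
    those n hosts phoned home during the window; withdraw_update() pulls the file."""
    for stage in ROLLOUT_STAGES:
        cohort = max(1, int(fleet_size * stage))
        push_update(cohort)
        time.sleep(OBSERVATION_WINDOW_S)     # let telemetry accumulate
        missing = 1.0 - count_checkins(cohort) / cohort
        if missing > MISSING_CHECKIN_THRESHOLD:
            withdraw_update()                # automatic, no human in the loop
            raise RuntimeError(
                f"rollout halted at {stage:.1%}: {missing:.1%} of canaries stopped checking in"
            )
    # only after every stage stays healthy does the update reach the whole fleet
```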
Thing is, as far as I can see, deploying this database update to a Windows machine will result promptly and unconditionally in a BSOD. That implies that this update was tried on exactly zero machines before it was shipped.
The bug can't have "slipped through internal testing"; it would have failed immediately on any machine it was loaded on.
This is true, but has no bearing on posters who command neither my endorsement nor my respect.
You get my endorsement if I agree with you. You get my respect even if I don't agree with you, provided you don't post lies and nonsense, AND are amenable to rational argument.
But isn't that a fairly tiny risk, compared with letting a third party meddle with your kernel modules without asking nicely? I've never been hit by a zero-day (unless Drupageddon counts).
I would say no, it's definitely not a tiny risk. I'm confused what would lead you to call getting exploited by vulnerabilities a tiny risk -- if that were actually true, then Crowdstrike wouldn't have a business!
Companies get hit by zero days all the time. I have worked for one that got ransomwared as a result of a zero day. If it had been patched earlier, maybe they wouldn't have gotten ransomwared. If they start intentionally waiting two extra days to patch, the risk obviously goes up.
Companies get hit by zero day exploits daily, more often than Crowdstrike deploys a bug like this.
It's easy to say you should have done the other thing when something bad happens. If your security vendor had been releasing definitions 48 hours later than it could have, and some huge hack happened because of that, the internet commentary would obviously say they were stupid to wait 48 hours.
But if you think the risk of getting exploited by a vulnerability is less than the risk of being harmed by Crowdstrike software, and you are a decision maker at your organization, then obviously your organization would not be a Crowdstrike customer! That's fine.
I don't think so anymore, but in the last century there were several cases of therapists using hypnosis to unlock "repressed memories" of alien abductions. They usually wrote books about it for profit.
This has been thoroughly studied by psychologists. I feel obligated to tell you that multiple therapists have used access to repressed memories to control their patients. A large, highly organized religious movement overtly claims that its alien teachings are supported by repressed memories.
Not specifically, but they will follow the absurd path of asking you leading questions until you convince yourself that you were abducted by aliens, and then, being quacks, they will decide that their methodology couldn't possibly be wrong, and so they validate your invented beliefs no matter how stupid.
If it keeps you coming back for another session, they'll keep doing it, even if they know the whole process is bullshit.
Back in the 80s, we used to draw network diagrams on the whiteboard; the parts of the network that belonged neither to us nor to our users were represented by the outline of a cloud. That cloud didn't provide storage or (usable) computing resources. If you pushed stuff in here, it came out there.
I think it was a reasonable analogy. You can't see inside it; you don't know how it works, and you don't need to. Note that at that time, 'the internet' wasn't the only way of joining heterogeneous networks; there was also the OSI stack.
So I was annoyed when some bunch of kids who had never seen such whiteboard diagrams decided to re-purpose the term to refer to whatever piece of the internet they had decided to appropriate, fence-in and then rent out.