Oh man! That hot swap method! I remember my computer, running Linux (which freaked 14 year old me out) from a live CD, case open, and plugging in the HDD while the Xbox and PC were on.
I felt like a warlock when I could rip my games after that.
Great memories.
And my mate at school saying there is no such thing as a soft mod! Haha
I recall our progression was chipmod, drivemod, softmod. We really didn’t use the softmod much because we had so many chipmodded devices, but we didn’t do much online then.
Some Call of Duty discs contain basically no data at all.
>Game disc only contains 1GB of data (In some regions it has even less data on disc) forcing you to download a 40+GB patch (at launch) and another 40GB of data packs in order to play the game.
That is a nice site. I was first made aware of this issue with Switch games. Some publishers will cut content from the game card and force a download so their game doesn't require a larger-capacity card, which costs more.
These aren't even new games that it is reasonable to expect to be patched. Re-releases like "Spyro Reignited Trilogy" require a download, which is just a cost-saving exercise.
It’s also a plausible anti-leak measure: if the game card contains everything needed to play the game, the game can easily leak early while the cards are on their way to retail.
If a day1 patch is required, then it can’t leak until that patch is available?
Actually, it’s called a day-one patch. Similar to zero-day vulnerabilities (in name only), these patches are usually required to play the game on day one…
I wonder how many publishers use S3 for this. Because, at current retail (quantity 1) prices, a bigger card looks like it will pay for itself after a whopping two downloads.
I assume that the game downloading ecosystem uses something that’s actually cost-effective. At AWS prices, it seems like it would be basically impossible to be a profitable publisher of multi-gigabyte games at any scale.
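Back-of-envelope, with every number here an assumption rather than a quote (roughly the list S3 egress rate, a guessed premium for the next card size up, a 40GB forced download):

    # All numbers are illustrative assumptions, not vendor quotes.
    s3_egress_per_gb = 0.09   # assumed ~S3 data-transfer-out rate, USD/GB
    patch_size_gb = 40        # assumed size of the forced download
    extra_card_cost = 7.00    # assumed retail premium for the bigger card, USD

    cost_per_download = s3_egress_per_gb * patch_size_gb
    breakeven_downloads = extra_card_cost / cost_per_download

    print(f"one forced download costs ~${cost_per_download:.2f}")
    print(f"bigger card pays for itself after ~{breakeven_downloads:.1f} downloads")

With those guesses it's roughly two downloads to break even, which is why I suspect nobody shipping at scale is paying anything close to retail AWS egress.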
That also has the effect of preventing pre-release leaks, though since we've seen some of Nintendo's own games shared on the internet weeks before release, I don't imagine it's a big part of the reason for requiring a download.
Since figuring THAT out might require reading the whole disc and doing byte-by-byte comparisons (or a whole-disc checksum), it's easier to just download the whole thing.
Unless they track literally every single DVD variant perfectly, which ain’t happening.
And that is ignoring that many discs are basically just a hardware license dongle, and don’t actually have a full playable version of the game.
It’s an info theory thing. But nice try. Why don’t you propose something better? Besides what I already proposed anyway.
The ways to optimize/‘solve it’ all require degrees of rigor in information control and tracking that aren’t realistic given the market conditions and supply chains.
At least unless people stop being okay with paying for $40+ games that download 40GB patches anyway. Which would require a severe change in the trajectory of bandwidth availability, and that isn’t likely.
No sir, excuse my bluntness but you are full of shit trying to claim that information theory blocks this from being practical. I worked on stuff like this. I worked on the format that Windows setup uses for installation media for example. It has delta patches too. I think the same exact idea used there would work. It has a hash of every file precomputed and only stores what's unique.
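A minimal sketch of that idea, assuming a hypothetical JSON manifest of per-file hashes for the target build (the manifest format and names here are made up, not the actual Windows setup format):

    import hashlib, json, os

    def file_sha256(path, chunk=1 << 20):
        # Hash a file in 1 MiB chunks so nothing big sits in memory.
        h = hashlib.sha256()
        with open(path, "rb") as f:
            while block := f.read(chunk):
                h.update(block)
        return h.hexdigest()

    def files_to_fetch(disc_root, manifest_path):
        # manifest: {relative_path: sha256} for the up-to-date build (hypothetical format)
        with open(manifest_path) as f:
            target = json.load(f)
        stale = []
        for relpath, want in target.items():
            local = os.path.join(disc_root, relpath)
            if not os.path.exists(local) or file_sha256(local) != want:
                stale.append(relpath)
        return stale  # only these files need to be downloaded

You precompute the hashes once at mastering time, ship the manifest with the patch metadata, and only pull the files whose hashes don't match what's already on the disc.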
HN rate limiting sucks. And client crashes suck more.
You’re not reading my comment, or thinking about the game distribution problem.
MS can benefit from reading all the files, verifying hashes, etc.
And in a typical OS update scenario, MS can trust that a file's contents haven’t changed since the hash was checked. Whereas when reading from a DVD/Blu-ray, scratches are a problem and it isn’t that simple.
Games typically don’t, especially those on a DVD or Blu-ray, because those are slow and have terrible seek times.
So, like I pointed out, it doesn’t make sense to do the information-theory work you’d need to do to actually apply a delta patch in these scenarios.
I mean, zsync? Performing simple hashes of blocks of data isn’t exactly hard for the console. On the CDN side, just add a caching layer for the resulting chunks and it should sort itself out, since there are only so many variants of the source disc. It won’t get you the best compression ratios, but it’s flexible. We were considering this for firmware updates of an IoT product. It’s not like differential updates are unheard of.
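Rough sketch of what I mean. Real zsync uses rolling checksums so blocks can match at any offset; fixed-size blocks keep the idea short (the block size and names here are mine, not from any console SDK):

    import hashlib

    BLOCK = 4 * 1024 * 1024  # 4 MiB, arbitrary choice

    def block_hashes(image_path):
        # Hash the disc image in fixed-size blocks.
        hashes = []
        with open(image_path, "rb") as f:
            while block := f.read(BLOCK):
                hashes.append(hashlib.sha256(block).hexdigest())
        return hashes

    def blocks_to_download(disc_image, target_block_hashes):
        # Blocks of the target version not already present on the disc;
        # the CDN can cache each block keyed by its hash.
        have = set(block_hashes(disc_image))
        return [i for i, h in enumerate(target_block_hashes)
                if h not in have]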
For that sync to work, you have to read the whole disc, compute the hashes, then compare them to the hashes for a given other version. Which requires reading the DVD/Blu-ray (slow) and comparing against the versions on the server.
And preferably, if you’re just reading hashes off the disc, none of the actual data on the disc is corrupted or failing to match the hashes.
And then, due to the reality of the way games are distributed, downloading 90%+ of the game anyway.
Or just download the full version, which is simpler and likely the same amount of bandwidth, and faster since you’re not having to read/check the slow discs for anything more than a basic ‘is it this game’ check.
I understand the types of considerations that may make console developers not bother, but also it is a bit ridiculous on its face when they are selling physical media. But for patch-heavy games like CoD, maybe it’s a lost battle anyways, archival be damned.
But I will push back a little and say that a zsync-like differential update scheme would still be totally feasible. BD read speeds are going to be in the 100s of megabits per second, and the compute for the hash is free (i.e. it’s I/O-bound). You can parallelize and start downloading blocks before you’re done hashing every block on the disc. It seems likely to me that you’d still end up better off with this scheme if you have slower-than-gigabit download speeds (which is true for the vast majority of the US). Zsync is fairly coarse and flexible, and essentially looks like BitTorrent between two peers over HTTP if you squint. If you assume the download speed and network reliability are the bottleneck that outweighs things like disc I/O and compute, it essentially degrades gracefully to just downloading the entire update in the case that there are zero matching blocks.
Edit: I should mention that another key aspect of the setup is that there are a small number of printed disc revisions and a small number of target downloads (most people will get the same game files for a given region). This means that a CDN cache will quickly find the hot blocks to serve, even without any precomputation of the diff between source and target.
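Rough sketch of the pipelining, with one simplifying assumption (disc and target layouts line up block-for-block, which real zsync doesn't require) and a hypothetical fetch_block callback standing in for "download block i from the CDN":

    import hashlib, queue, threading

    BLOCK = 4 * 1024 * 1024  # 4 MiB, arbitrary

    def hash_disc_blocks(disc_image, out_q):
        # Producer: hash the disc sequentially, handing each block hash
        # to the downloader as soon as it's ready.
        with open(disc_image, "rb") as f:
            while block := f.read(BLOCK):
                out_q.put(hashlib.sha256(block).hexdigest())
        out_q.put(None)  # sentinel: disc fully read

    def update_from_disc(disc_image, target_hashes, fetch_block):
        # Consumer: walk the target's block list in order, reuse a block if the
        # disc already has it at the same offset, otherwise fetch it. If nothing
        # matches, this degrades to downloading every block.
        q = queue.Queue()
        threading.Thread(target=hash_disc_blocks, args=(disc_image, q),
                         daemon=True).start()
        disc_hashes, disc_done = [], False
        for i, want in enumerate(target_hashes):
            while not disc_done and len(disc_hashes) <= i:
                h = q.get()
                if h is None:
                    disc_done = True
                else:
                    disc_hashes.append(h)
            if i < len(disc_hashes) and disc_hashes[i] == want:
                continue        # already on disc, nothing to download
            fetch_block(i)      # network fetch overlaps the disc still being hashed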
For most games, near as I can tell, the version on the DVD/Blu-ray at initial release is pretty much never finished, and often barely playable. Even at the 'release' date, the initial update is often at least as large as the data on the DVD/Blu-ray, if not larger.
So possible? Sure, almost anything is possible with enough work and tradeoffs. It just isn't economical or likely actually faster given current bandwidth constraints and how they're distributed.
Especially if you consider that if there is network play, they'll have to be up to date anyway or most games won't let them connect, so 'offline' play is going to be a relatively rare situation. So why optimize for it?
Are delta patches still viable given the current sizes of games? I'm not sure if this is the state of the art, but according to https://www.daemonology.net/bsdiff/, bspatch would require more memory than most systems can offer.
I'd expect the patch generation to be memory-hungry, not the patch application, which should be only data and offsets. If it uses maximum compression it might generate a huge data dictionary, but since that has to be distributed too, it would be counterproductive for patch size.
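For what it's worth, if I'm reading that page right, bsdiff needs roughly max(17n, 9n+m) bytes and bspatch roughly n+m bytes, with n the old size and m the new size. Back-of-envelope with assumed sizes:

    # Memory formulas as I recall them from the linked bsdiff page:
    #   bsdiff  ~ max(17*n, 9*n + m) bytes   (patch generation)
    #   bspatch ~ n + m bytes                (patch application)
    GB = 1024 ** 3
    n = 40 * GB   # assumed size of the on-disc data
    m = 45 * GB   # assumed size of the fully patched install

    print(f"bsdiff:  ~{max(17 * n, 9 * n + m) / GB:.0f} GB")
    print(f"bspatch: ~{(n + m) / GB:.0f} GB")

So generation is by far the hungrier side (hundreds of GB at those sizes), but even application lands well past a console's RAM unless it's done piecewise rather than whole-file.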
The patch download shows up as a download once you put a disc in for installation. The console still installs from disc and you can usually play without the patch. I actually had a game that failed to install from disc; the replacement disc worked fine. So unless this is something very recent, I have no idea why an internet connection would have an impact on disc installations.
Some of us have *really* slow internet connections, such that even the much smaller day-one patch is still going to take quite a long time to download. Much longer than the full installation from disc.
When I switched from an Intel Mac to the M2, I was easily able to get my Linux and Windows VMs running again.
As well as my Intel apps. It was pretty smooth, and the speed and power efficiency gains are obvious. My MacBook Air draws about 5W total while running the DAW and stays cool as a cucumber.
I have a suspicion that "number of crashes" got badly Goodharted inside major OS vendors. After all, you can "prevent" most crashes by just having a top-level "catch(Exception e) {}" handler, which of course just leads to the program doing nothing instead in an un-debuggable way, but hey: crashes went down! KPI achieved!
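The Python analog of that trick, just to spell out why it's so corrosive (a toy example, obviously not any vendor's actual code):

    def main():
        raise RuntimeError("the actual bug")

    # "Crash prevention", KPI edition: the process never dies...
    try:
        main()
    except Exception:
        pass  # ...it just silently does nothing, with no stack trace left to debug.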
My experience is that macOS especially really REALLY absolutely hates when the Internet goes sideways - not actually down, but really bad packet loss; DNS starts being slow and failing sometimes but not completely ... then you enter hell.
Next time it happens try turning network connections all the way off.
Autosave also has versioning. You can always go back through the autosave history. You can also revert to the originally opened file. This is done from the File menu.
Also, if like a good Mac user you have Time Machine configured and on, you can browse your versions even further into the past.
Heavy iWork user, I don’t know about Xcode though.
Does the version browser show a line-by-line diff? When I last used it ~10 yrs ago it displayed a fancy Time Machine-like interface that was useless for telling if I accidentally inserted a character in page 11.
It is a regular, but not universal, occurrence for people to elide the extra possessive 's when the word already ends in s.
For example, although adding a possessive suffix 's to the name "Jordan" would result in "Jordan's," adding it to "James" could result in "James'". There's an apostrophe written at the end, but it's pronounced the same way as "James."
I said it's not universal because I still know people who would say "James's" (with an extra syllable at the end) in everyday speech anyway. I don't know to what extent this varies by dialect.
It's also elided when a plural already ends in s (and I think this is universal, but I haven't looked it up). For example, "the doctor's computer" (the computer of one doctor) sounds the same as "the doctors' computers" (the computers owned by the doctors). The apostrophe is written on the other side of the s, but it also sounds the same. Note that not all plurals end in s, e.g. man/men, woman/women, goose/geese, etc., and in these cases you still add the s, e.g. "the geese's beaks".
There is Goodnotes, which supports such functionality for PDFs. I’ve purchased such PDFs (bullet journal calendar/notes) and, without trying it, I think it’s a safe assumption that Goodnotes could consume what is generated by the Hyperpaper link above.
It'll work just fine on an iPad; several folks are doing so (Goodnotes and Notability are the most common apps people use for loading and annotating the PDF). I've done a little work towards color themes to better support the iPad, but for now it's just black & white since most customers use eInk tablets.