
Is there no line, in your opinion? At this point, there are computers (many of which run variants of Linux) in my:

1. Laptop

2. Phone

3. Car

4. Washing machine

5. Handheld GPS

6. E-reader

7. TV

Is there some intrinsic difference between a device where the manufacturer has programmed it using an ARM/x86-based chip vs. a microcontroller vs. some other method that means in the first case I have the right to install whatever I want? Because that feels like what's happened with cell phones: manufacturers started building them with more capable and powerful components to drive the features they wanted to include, and because those components overlapped with what we'd seen in desktop computers, we've decided that we have an intrinsic right to treat them like we historically treated those computers.

For everything on that list, I'd say that if you figure out how to run software of your choice on them the manufacturer shouldn't be able to legally stop you. (And specifically, the anti-circumvention clauses of the DMCA are terrible).

Phones get a lot of attention in this regard because they've replaced a large amount of PC usage, so locking them down has the effect of substantially reducing computing freedom.


This is sort of delightfully circular?

> I'd say that if you figure out how to run software of your choice on them the manufacturer shouldn't be able to legally stop you.

That's already the case. The manufacturer can't come after you for anything you do to your device. They can:

1. Set up their terms of service so that things you do to alter the device are grounds for blocking your access to cloud/connected services that they host on their infrastructure.

2. Attempt to make it difficult to run software of your choice.

3. Use legal means to combat specific methods of redistributing, to other people, tools that circumvent the measures described in number 2.


There is already a widespread notion of a "general computing" device.

For all intents and purposes, a laptop computer and a smart phone are one. This is, for example, evidenced by the fact that we run general-purpose "applications" on them (not defined ahead of time), including the most general app of them all (a web browser).

For the other device types you bring up, I would draw a very similar distinction: when you can run an open-ended app platform like a browser, why not be able to install non-browser-based applications as well? Why require going through a vendor to do that?


"why not" isn't a compelling case for something to be a fundamental right.

I'm not saying I dislike the concept of being able to run my own code on my devices. I love it. I do it on several devices, some of which involve circumventing manufacturer restrictions or controls.

I just don't think that because manufacturers started using the same chips in phones as in computers, they magically had new requirements applied to them. Phones had app stores before they were built using the same chips. My watch lets me install apps from an app store.


You've asked for an intrinsic difference between classes of devices: no, you are unlikely to want to run general-purpose apps on your washing machine. Yes, you are likely to do so on your smart phone. Probable on your modern "smart TV". Low probability on your e-reader.

Legislation like the EU Cybersecurity Act hopefully pushes this toward being a fundamental right by demanding that devices not go into the trash pile as soon as the vendor stops issuing security updates: it mandates an ability to keep operating these devices without negatively affecting the Internet at large (by, for example, becoming part of a botnet).

This is already possible with many general-compute devices by putting an up-to-date version of GNU/Linux or FreeBSD or... on them. And for a smaller subset of general-compute smartphones, with AOSP-based Android.


I'm not asking for an intrinsic difference: I'm suggesting that if "I can install custom applications/code on this device I own" is a fundamental right, there would need to be an intrinsic difference. My personal opinion is that there is not one. "I want to do it to these devices and not those" can't be the justification for it being a right.

To counter your claim, I've tried to explain what that intrinsic difference is in my previous comment.

I am not sure if you are disagreeing with me or ignoring my point :)


The only one that sounds potentially harmful is the car, and in that case I think it should have to meet emissions standards and prove you aren't running a defeat device. But, yeah: I should be allowed to run my own infotainment system that doesn't crash and doesn't spy on me.

I'd like to be able to install my own software on all of these.

I'm not asking what you'd like to do. I'd like to be able to customize all of those things too.

I'm asking why taking a device that uses a microcontroller and making a new model with an ARM chipset and a Linux-based OS seems to suddenly make people treat the ability to install custom software on it as a fundamental right.


If I own it, regardless of whether it's Linux- or ARM-based, I should be able to install things on it.

Video game consoles?

Good catch. They are noteworthy in a similar way to phones: there are all kinds of projects and tools built around making custom and modded games for the Game Boy, or hacking the NES, but there wasn't a movement saying Nintendo was violating our fundamental rights by not allowing users to overwrite or modify the code inside the actual console.

Then consoles started shipping with recognizable internals, and we had waves of people very frustrated at things like Sony's removal of OtherOS, or Nintendo's attempts to squash the exploits that enabled Wii Homebrew.


Yes, you absolutely should have the right to install (or uninstall) whatever software you want on any of those, assuming it contains writable program memory. The alternative is a nightmarish dystopian future where your washing machine company is selling its estimate of your political inclinations, sexual activities, and risk aversion to your car insurance company, your ex-husband, your trade union representative, and your homeowners' association.

I thought I had this line, but then I imagined my credit card having writable program memory: I'd be fine with a third party preventing me from using it for its intended purpose if it wasn't trusted there. There may be some purpose, for my own good, in preventing me from writing to my own program memory, but I should be able to waive that protection if I deem it worth it.

Likewise, I'd be fine with banking apps on phones requiring some level of trust, but it shouldn't affect how the rest of my phone works so drastically.


Why would your credit card need to act against your interests? The only thing it should be doing is signing transactions to signal that you approve. The credit card company has their own computers that can be consulted to ask them if they approve a transaction. They don't need one in your pocket. They can rent a rack in a data center. It's not that expensive.

Similarly, the banking app on your phone should be representing your interests, 100%. It may need to keep secrets, such as a private transaction signing key, from your bank or from your boyfriend, but not from you. And it definitely should not be collecting information on your phone against your will or without your knowledge. But that is currently common practice.
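To make that concrete, here's a minimal sketch of the division of labor I mean. It's hypothetical, not any real issuer's protocol: the Ed25519 keypair, the transaction encoding, and the function names are illustrative assumptions. The point is that the device-side code only signs, and every approval decision runs on the issuer's own hardware.

    # Hypothetical sketch, using python-cryptography's Ed25519 primitives.
    from cryptography.exceptions import InvalidSignature
    from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

    card_key = Ed25519PrivateKey.generate()   # lives only on the card
    issuer_pubkey = card_key.public_key()     # registered with the issuer once

    def card_sign(transaction: bytes) -> bytes:
        # The card attests exactly one thing: "the holder approved this".
        return card_key.sign(transaction)

    def issuer_approve(transaction: bytes, signature: bytes) -> bool:
        # Fraud checks, spending limits, etc. belong here, on the issuer's
        # own rack in the data center, not in your pocket.
        try:
            issuer_pubkey.verify(signature, transaction)
        except InvalidSignature:
            return False
        return True

    tx = b"merchant=example;amount=4.50;nonce=8f3a"
    assert issuer_approve(tx, card_sign(tx))

Nothing in that flow requires the card to keep secrets from its owner or act against them; the issuer's trust rests entirely on their own verification step.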


Why?

My washing machine could be programmed to do all of those things you're worried about without any writable memory. Why do the parts the manufacturer puts into it turn it from an appliance that washes my clothes into a computer that I have a right to install custom code on?


The principle is that the owner should have full control of their own device, because that's what defines private property. In particular, everything that the maker can make the device do must be something that the owner can make the device do. If the device is simply incapable of doing a certain thing, that might be bad for the owner, but it's not an abrogation of their right to their own property, and it doesn't create an ongoing opportunity for exploitation by the maker.

Maybe in theory your washing machine could be programmed to do those things without writable program memory. Like, if you fabricated custom large ROM chips with the malicious code? And custom Harvard-architecture microcontrollers with separate off-chip program and data buses? But then the functionality would be in theory detectable at purchase time (unlike, for example, Samsung's new advertising functionality: https://news.ycombinator.com/item?id=45737338) and you could avoid it by buying an older model that didn't have the malicious code. This would greatly reduce the maker's incentives to incorporate such features, even if it were possible. In practice, I don't think you could implement those features at all without writable program memory, even with the custom silicon designs I've posited here.

If you insist that manufacturers must not prevent owners from changing the code on their devices, you're insisting that they must not use any ROM, for any purpose, including things like the PLA that the 6502 used to decode instructions. It's far more viable, and probably sufficient, to insist that owners must be able to change any code on their devices that manufacturers could change.



