engeljohnb's comments

I would wager that's how it goes for most people that are both good artists and good programmers -- they were artists first, then learned to program. It takes a lot longer to become a reasonably good artist than it does to become a reasonably good programmer. I suspect that might be why the article opens the way it does.


> "It takes a lot longer to become a reasonably good artist than it does to become a reasonably good programmer."

Such overgeneralizations are not helpful. People gravitate more strongly towards certain creative disciplines, or a selection of them; how long exactly it takes to develop "reasonable" skills depends on a litany of factors, some of which cannot be controlled (e. g. force majeure). Both programming and pixel art require unwavering commitment and exercise; there is no way to "wing it" if you are intellectually honest and take your craft seriously.


I think it is helpful for certain purposes, and I think you'll be hard pressed to find exceptions to the general rule.

Art is all about repetition. Even if you've done it successfully many times, you still need to keep doing it until it's second nature.

Programming is more like solving puzzles. Once you've solved it once, you can pull the solution out of your head as many times as you need, as long as you still remember it.

With art, it doesn't matter if you remember how to do it, it still takes practice to get reproducible results. Of course it takes longer.


> "Art is all about repetition. [...] Programming is more like solving puzzles. Once you've solved it once, you can pull the solution out of your head as many times as you need, as long as you still remember it. With art, it doesn't matter if you remember how to do it, it still takes practice to get reproducible results. Of course it takes longer."

First and foremost, contrary to you it seems, I see art as a measure of quality, not as a simple descriptor of manifestations of human personal, and therefore cultural, expression (albeit using a, naturally technically imprecise, colloquialism such as "pixel art" to describe a school of aesthetics, or style). See also: The Art of Programming. Et cetera.

And furthermore, I see both disciplines as fields which humans engage in to solve specific identified problems, rationally or intuitively; in both it takes practice to get reproducible results, in both you need to keep doing it until it becomes "second nature". This refers to the process itself, the process to hone one's craft.


>I see art as a measure of quality.

I don't understand what you mean by this. Do you mean to say the worth of an artwork for you is tied to how well it executes technique? "Art" is a word so nebulous that it's hard to pin down a definition, but I think the millions of people that prefer a punk rock song over an academic figure drawing study would disagree with this.

>And furthermore, I see both disciplines as fields which humans engage in to solve specific identified problems

Well, I'm both an artist and a programmer, and I can tell you I engage in neither to solve problems. I do both because the process of doing them is enjoyable. If they stop being fun, I'll stop doing them, and there wouldn't be any lingering problem in my life to go unsolved.

If you say you picked up art faster than programming I'll believe you, because I only meant it as a general observation.

Art is like playing Dark Souls -- maybe you beat the hardest boss once, but that doesn't mean you won't die ten more times before beating them again.

Programming is like Zelda. Once you know the solutions to the puzzles, you're basically going through the motions.

This isn't me guessing based on philosophy -- this is my lived experience as both an artist and a programmer.


> "I don't understand what you mean by this. Do you mean to say the worth of an artwork for you is tied to how well it executes technque?"

Art, to me, is a marker of excellence in the already mentioned confines. Technique is just a part of it.

> "Art" is a word so nebulous that it's hard to pin down a definition, [...]"

On that we agree; hence me informing you about mine, otherwise we just run circles around each other.

> "[...] but I think the millions of people that prefer a punk rock song over an academic figure drawing study would disagree with this."

As you probably can deduce by now, I see both examples as having the potential of being art. The rest of your rather labored example is an appeal to preference based on form or expression; such a thing is neither static (e. g. it can change with one's moods, and so on) nor does it have to be a false dichotomy (i. e. I can enjoy both manifestations, even at the same time, and, more importantly, recognize both as artful). But this is also all very basic stuff and in itself tedious, and, especially for the reason you stated, also often useless to engage in online.

> Art is like playing Dark Souls -- maybe you beat the hardest boss once, but that doesn't mean you won't die ten more times before beating them again. Programming is like Zelda. Once you know the solutions to the puzzles, you're basically going through the motions.

Such comparisons, as relatable as they might sound to someone who is familiar with these titles, are often useless as well (I am aware of these games and their game mechanics, but have never played them nor care to do so).

Furthermore, for the reason outlined in the posts you responded to, they're a misfire anyway: art, to me, is first and foremost about the result, not the way towards the result (as long as certain conditions have been met), and life itself is much more complicated... with significant implications for the process of making art and the development of an artist in one or more disciplines.


> As you probably can deduce by now, I see both examples as having the potential of being art

I suppose it was a mistake to get distracted by trying to find out what exactly you're trying to say -- it's now completely clear that it has nothing to do with whether art or programming takes longer to gain proficiency.

>labored [...] tedious

Saying my points are long-winded or redundant also doesn't support your point. You're doing a lot of philosophizing about what art is or whether my points are "useless," but you still haven't reasoned about why it's not true that art takes longer to learn than programming. Which is rich since you've spent more words on this matter than me.

>Such comparisons, as relatable as they might sound to someone who is familiar with these titles, are often useless as well (I am aware of these games and their game mechanics, but have never played them nor care to do so).

So, you haven't played the games, therefore you have no insight into the analogy, so you're not really in a position to say whether the comparison is useless.

You've also used the word "useless" a handful of times here, all without any follow-up as to why exactly. What "use" are you referring to here?

In the context of a programmer wanting to know how learning to draw compares to learning to program (something I've only been asked once, but even once is enough to prove it's useful), to say "expect drawing proficiency to take longer, because it requires more repetition" is useful.

Once again, this isn't deduction or hypothesis. It's my own experience with both crafts.


> "[...] it's now completely clear that it has nothing to do with whether art or programming takes longer to gain proficiency."

I just replied directly to your comment, as I usually do in discussions. Besides, your point of contention, i. e. what takes longer to gain proficiency in (whatever you define as art or the act of programming), has already been addressed multiple times.

> "So, you haven't played the games, therefore you have no insight into the analogy, so you're not really in a position to say whether the comparison is useless."

You misunderstood. The comment was not about me but about the general value of such comparisons. True, I haven't played the games, but I have seen them being played countless times, have some material where they come up (reference books, art books, magazines, documentation, etc.), and can therefore make sense of your analogy. In the end it's useless mostly for entirely different reasons, though; reasons I have already explained as well.

> "You've also used the word 'useless' a handful of times here, all without any follow-up as to why exactly. What 'use' are you referring to here?"

These discussions are often cumbersome as one has to find common, agreed-upon language in the first place. And more often than not such online discussions don't lead to deeper insights (e. g. performative measurements of who "spent more words" are not of relevance to me). That has at least been my experience. Don't take it personally.

> "In the context of a programmer wanting to know how learning to draw compares to learnong to program (something I've only been asked once, but even once is enough to prove it's useful), to say "expect drawing proficiency to take longer, because it requires more repition" is useful."

That's, as you've stated, an anecdotal hypothesis based on your life's experience. To me, programming, writing, making music, painting pictures, etc. require creativity, rigorous exercise, repetition, and so on. What discipline was, is, or will be the easier or easiest way for you to get to whatever your goal is I cannot know, for this depends on way too many factors, many of them, to top it off, outside of any parasocial (online) prism.


> Such overgeneralizations are not helpful. People gravitate more strongly towards certain creative disciplines, or a selection of them; how long exactly it takes to develop "reasonable" skills depends on a litany of factors, some of which cannot be controlled (e. g. force majeure). Both programming and pixel art require unwavering commitment and exercise; there is no way to "wing it" if you are intellectually honest and take your craft seriously.

> And furthermore, I see both disciplines as fields which humans engage in to solve specific identified problems, rationally or intuitively; in both it takes practice to get reproducible results, in both you need to keep doing it until it becomes "second nature". This refers to the process itself, the process to hone one's craft.

These are all the words you've said so far that address whether art takes longer to learn than programming. Your points boil down to 1) people have different strengths and weaknesses, and 2) both require practice.

But neither of these contradicts the statement "art generally takes longer to learn than programming."

> In the end it's useless mostly for entirely different reasons, though; reasons I have already explained as well.

Here are all the words you've spent explaining why the observation is useless:

Oh... actually nothing. This whole discussion started when you said

> Such overgeneralizations are not helpful

But they've already been helpful to me before, and to at least one other person. Even if it's not much, "useless" is untrue. I said "this is what I've found to be true, and observed in others like me," and you said "this is not a useful observation." You never said why, you just jumped straight to "I already addressed that."


Last post from me on this:

> "But neither of these contradicts the statement "art generally takes longer to learn than programming."

Man alive, I've already explained this multiple times, and you misread each and every time. You even postulate programming as something outside of art; a statement I have fundamentally disagreed with. You're fighting strawmen, and we therefore run rings around each other.

> "Oh... actually nothing. This whole discussion started when you said [...]"

The discussion started when I objected to your statement that "it takes a lot longer to become a reasonably good artist than it does to become a reasonably good programmer".

To me it's nothing but an imprecisely articulated, sweeping generalization constructed around the anecdotal "evidence" that's your life (with an unknown sample size of people you've met or read about that might agree with you to some extent). In other words it's nothing but tedious fallacies, a thing oft observed in such discussions.

It's also a massive red flag; I at least would never be so presumptuous and arrogant as to make myself the yardstick and declare cocksure that one discipline will take longer than the other for some reader completely unknown to me. I know many a great artist who paints and/or writes but could not program their way out of a wet paper bag (they're practically computer illiterate and have absolutely no ambitions or time to change that), let alone reach the same heights there as in their chosen medium of expression. And vice versa. So what's useful to you, and what might be useful to me, is not automatically applicable to others, and therefore it's useless to generalize, at least without any hard data to back it up (and even then the addressed party might be an outlier).

If one wants to find out which form(s) of expression is/are best suited for oneself, one needs to spread one's wings and take to said form(s). How long that will take no one can say for sure; therefore, if one gravitates to more than one form, no one can make reasonably accurate predictions about what takes longer either. Especially not without knowing at least a modicum of relevant information about the individual any advice is supposed to enrich in the first place.

Hence, when addressing a general audience, better concentrate on giving detailed and sound advice on how to get better, or speak to useful mitigation strategies/life hacks, as opposed to shallow and often inapplicable generalizations about the future. In German there's a terminus technicus for such silliness: Glaskugelei (crystal-ball gazing).


> Man alive, I've already explained this multiple times, and you misread each and every time.

Actually, what happened was you explained once, I rebutted, and your reply is now "I already explained."

>You even postulate programming as something outside of art; a statement I have fundamentally disagreed with.

Is this seriously a point of confusion for you? So I need to spell out I meant "drawing and painting" because you aren't able to extrapolate from context?

>To me it's nothing but an imprecisely articulated, sweeping generalization constructed around the anecdotal "evidence" that's your life (with an unknown sample size of people you've met or read about that might agree with you to some extent). In other words it's nothing but tedious fallacies, a thing oft observed in such discussions.

Observed experience and testimony from others with similar experience isn't fallacy -- it's valid evidence. You are choosing to ignore it because... actually, I don't know why my thesis is apparently so offensive to you.

> It's also a massive red flag; I at least would never be so presumptuous and arrogant as to make myself the yardstick and declare cocksure that one discipline will take longer than the other for some reader completely unknown to me.

Not myself -- please show me the place where I said my own experience is my only evidence.

> I know many a great artist who paints and/or writes but could not program their way out of a wet paper bag, let alone reach the same heights there as in their chosen medium of expression.

Sounds like you know artists that didn't have a reason to take the time to learn to program. That doesn't mean that time would be longer than it took to learn to draw and paint.

Up until this sentence I assumed I was talking to another person who does both art and programming. The fact that you have something to say about people you know but nothing to say about your own experiences suggests to me you're probably not an artist. Which means you're just running your mouth about something you have no experience with.

> So what's useful to you, and what might be useful to me

Oh, I realize now you're just new to internet forums, so I should probably explain that not every individual comment needs to have direct relevance to whatever your exact current pursuits happen to be to be a worthwhile contribution to a discussion.

> If one wants to find out which form(s) of expression is/are best suited for oneself, one needs to spread one's wings and take to said form(s). How long that will take no one can say for sure

Sure.

> How long that will take no one can say for sure; therefore, if one gravitates to more than one form, no one can make reasonably accurate predictions about what takes longer either. Especially not without knowing at least a modicum of relevant information about the individual any advice is supposed to enrich in the first place.

That's where you're wrong. There are a lot of people that are qualified to estimate the general amount of time it may take to learn a skill to a certain degree. You're right that no one can tell the exact amount of time, but once again, show me where I claimed to know the exact amount of time it takes anyone to learn anything.

There are art educators that have spent decades teaching how to draw and paint. If you've seen literally hundreds or thousands of students over the course of decades, you know how long it takes to learn your craft. And some of these educators have shared their knowledge with us. For instance, Jeff Watts of the Watts Atelier has spoken about how long an artist needs to train before their skills are at a level where they can start to assist in teaching*, which is about ten years to be a "decent teacher."

Ten years of full time study to learn, according to a master who has been teaching for over 35 years. Are you going to lie and tell me it takes that long to get a job as a programmer? I can name more programmers than I can count on my fingers that got a job straight out of a four-year or two-year program. I've never met or heard of an artist that got a full time professional job with less than ten years of study.

> Hence, when addressing a general audience, better concentrate on giving detailed and sound advice on how to get better as opposed to shallow and often inapplicable generalizations.

Are you seriously suggesting a bunch of unsolicited technique advice would've been an appropriate response in a conversation about why the author of the article suggested programmers don't have a reputation for making good artists? And just in case it causes you further confusion -- the author clearly meant "draftsmen" when they said "artists."

* https://youtu.be/BlBnkNr_7ms?t=1767


Ach, fuck it, one more, for it got personal. A tad bit out of order:

> "Is this seriously a point of confusion for you?"

Another strawman; it's about art as opposed to programming which I objected to, not about some confabulation of art as "drawing and painting".

> "Up until this sentence I assumed I was talking to another person who does both art and programming. [...] Which means you're just running your mouth about something you have no experience with."

I am interested in and develop my skills in both disciplines; I don't claim to be even close to a master in both. So keep such speculations about my life to yourself.

> "Ten years of full time study to learn, according to a master who has been teaching for over 35 years. Are you going to lie and tell me it takes that long to get a job as a programmer?"

You started out with the imprecise statement "reasonably good". That already begged the question what you fucking mean by that. Only now, after much back-and-forth, you roll in with Jeff Watts, who talks in his vidya, after being prompted to describe what he considers the fucking teaching elite of his field and what it took to get there, with some extrapolations based on experience. Not exactly an optimum comparison to some "reasonably good" Coder Johnny in whatever particular (set of) coding language(s) you were sadly only dreaming about in these moments, but they are all the same anyway, amirite? ;)

And the essence worth taking home from Watts? He doesn't, and I paraphrase, "try to put his students in a box" when gauging the way ahead of 'em. In other words: "It depends". Yeah, it fucking does, lol. Any educator worth their salt knows that.

> "I can name more programmers than I can count on my fingers that got a job straight out of a four-year or two-year program. I've never met or heard of an artist that got a full time professional job with less than ten years of study."

Good for you. I on the other hand met many artists that got pro jobs after a four-year program at a university. Of course, like the programmers, almost each and every one of them [1] already honed their skills (depending on talent and life circumstances, even for a long time) before they enrolled for art (or compsci) courses. That obviously still leaves one to define if these people are just "reasonably good" or are peers to "the (teaching) elite" at that point, let alone taking into account outliers such as (child) prodigies or late bloomers.

> "Oh, I realize now you're just new to internet forums, [...]"

No. I only realized too late that you clearly never made it beyond reiterating tedious logical fallacies in this discussion. You can do better.

1. Only one notable outlier: I know two cutters/editors (one now a successful TV film director) that got jobs straight out of a two- or three-year film school who never did anything even remotely close to their chosen field before.


After classical art training, I thought pixel art would be fast and easy -- the low resolution would disguise any mistakes.

Quite the opposite. The fewer pixels, the more each one has to be perfectly in place. Honestly should've been obvious in hindsight. If I have any games left in me after my current one's finished, I'll just use as high a resolution as I'm comfortable with.

Unless the sprites are truly tiny, like 16x16 with 2 or 3 frame animations, I don't know if pixel art makes a good shortcut to an aesthetically appealing game. Then again, it might be easier than six years of daily practice.


More than a dozen artists I've talked to told me pixel art is entirely its own discipline - they're no more comfortable approaching it than a layman would be.

The traditional workflow of creating a rough sketch on paper or tablet then progressively refining it just entirely doesn't apply.


> "The traditional workflow of creating a rough sketch on paper or tablet then progressively refining it just entirely doesn't apply."

For many a pixel artist that is a typical workflow, especially when working from reference, e. g. by retracing/"converting", say, an architectural period piece such as a street view to be used in a period- and location-accurate adventure game. In other words a classic line-to-pixel A/D conversion.


If you want to see someone who has truly done wonders with pixel art - the game Look Outside has so much incredible (and disturbing) pixel art.


Makes me wonder if GenAI can get these kinds of subtleties right.


I've seen at least one indie game (Ta*dQuest) use Midjourney to create pixel art sprites for some NPCs that appear in the dungeon. Extra art, like portraits for those NPCs, was drawn by hand to complement the sprites after they were generated, so it all feels deliberate. I would have never guessed.


If I were starting a new project, would it be unwise to just use OpenGL? It's what I'm used to, but people seem to talk about it as if it's deprecated or something.

I know it is on Apple, but let's just assume I don't care about Apple specifically.


OpenGL is fine. It has the same issues now as it had before, but none of them really come from "old age" or being deprecated in any way. It's not as debuggable and much harder to get good performance out of than the lower-level APIs, but beyond that it's still great.

Honestly, starting out with OpenGL and moving to DX12 (which gets translated to Vulkan on Linux very reliably) is not a bad plan overall; DX12 is IMO a nicer and better API than Vulkan while still retaining the qualities that make it an appropriate one once you actually want control.

Edit:

I would like to say that I really think one ought to use DSA (Direct State Access) and generally as modern a style of OpenGL as one can, though. It's easy to get bamboozled into using older APIs because a lot of tutorials will do so, but you need to translate those things into modern OpenGL instead; trust me, it's worth it.

Actual modern OpenGL is not as overtly about global state as the older API, so at the very least you're removing large clusters of bugs by using DSA.
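
To make the difference concrete, here's a minimal sketch (assuming a GL 4.5 context and a loader are already set up; `size`, `data`, `width`, `height` and `pixels` are placeholder variables):

    /* Old bind-to-edit style: every edit goes through a global binding point,
       so code anywhere else can silently clobber your state. */
    GLuint buf;
    glGenBuffers(1, &buf);
    glBindBuffer(GL_ARRAY_BUFFER, buf);
    glBufferData(GL_ARRAY_BUFFER, size, data, GL_STATIC_DRAW);

    /* DSA (GL 4.5): the object is addressed by name; no binding needed to edit it. */
    GLuint buf2, tex;
    glCreateBuffers(1, &buf2);
    glNamedBufferData(buf2, size, data, GL_STATIC_DRAW);

    glCreateTextures(GL_TEXTURE_2D, 1, &tex);
    glTextureStorage2D(tex, 1, GL_RGBA8, width, height);
    glTextureSubImage2D(tex, 0, 0, 0, width, height,
                        GL_RGBA, GL_UNSIGNED_BYTE, pixels);
    glBindTextureUnit(0, tex); /* bind only to use, never to edit */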


What do you think makes DX12 a better API than Vulkan?


I've found it has fewer idiosyncrasies, is slightly less tedious in general and provides a lot of the same control, so I don't really see much of an upside to using Vulkan. I don't love the stupid OO-ness of DX12 but I haven't found it to have much of an adverse effect on performance so I've just accepted it.

On top of that you can just use a much better shading language (HLSL) with DX12 by default without jumping through hoops. I did set up HLSL usage in Vulkan as well but I'm not in love with the idea of having to add decorators everywhere and using a (sort of) 2nd-class-citizen language to do things. The mapping from HLSL to Vulkan was also good enough but still just a mapping; it didn't always feel super straightforward.

(Edit: To spell it out properly, I initially used GLSL because I'm used to it from OpenGL and had previously written some Vulkan shaders, but the reason I didn't end up using GLSL is because it's just very, very bad in comparison to HLSL. I would maybe use some other language if everything else didn't seem so overwrought.)

I don't hate Vulkan, mind you, I just wouldn't recommend it over DX12 and I certainly just prefer using DX12. In the interest of having less translation going on for future applications/games I might switch to Vulkan, though, but still just write for Win32.


OpenGL is still the best for compatibility in my opinion. I have been able to get my software using OpenGL to run on Linux, Windows, old/new phones, Intel integrated graphics and Nvidia. Unless you have very specific requirements it does everything you need and, with a little care, plenty fast.


It's the oldest trick in the fascist book. You can't be a tyrant when the people are used to the idea that citizens have inalienable rights, so you slowly chip away at who counts as a "citizen."


The legal system has been chipping away at the rights themselves (and otherwise expanding governmental power) for hundreds of years, predating fascism (and communism, too). This is just the tactic of the moment.


I can install software on my Fedora laptop through dnf. I've never felt like I needed a new word to describe downloading and running an AppImage. Why would phones be different?


`adb sideload` existed as a command for installing an apk from your PC on to your phone. Sideloading was not meant to refer to installing an apk on the phone from the phone.
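
(For reference, the two commands look roughly like this, with placeholder filenames: `adb install app.apk` installs a regular APK from your PC over USB, while `adb sideload ota.zip` applies a full OTA package to a phone booted into recovery mode.)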


I knew if I read enough comments I'd finally arrive at my favorite take.

Installing an APK directly through your phone is in fact NOT sideloading.


That actually sounds like a good idea; the situation is similar, with an official channel of "trusted" software for which the distributor takes some responsibility, versus whatever file you downloaded yourself. It's certainly more risky on a Debian system to install a .deb from some random website, or an AppImage, compared to a .deb from the official repositories. I guess it's the same for Fedora.


Well, because it's not allowed to "install" from third-party sources (at least not yet).

Google has control over how their Android ecosystem behaves; same reason why it's not allowed on PlayStation or Xbox or iOS.


The whole selling point of Android up until now was that it allowed you to install any app you want.

The point of the above comment is that Google intentionally introduced the word "sideload" to make "installing an app on your own device which Google did not curate" sound more risky and sinister than it is, and I'm inclined to agree.

I "make" coffee on my keurig. If Keurig decides that making any single-serve coffe pods that aren't owned by the Keurig brand is now called "off-brewing," I'd dismiss it as ridiculous and continue calling it "making coffee."

We should use the language that makes sense, not the language that happens to be good PR for Google.


>The whole selling point of Android up until now was that it allowed you to install any app you want.

Could've fooled me. Maybe it was a thing a decade ago when android just launched, but none of the marketing pages for vaguely recent phones has that as a selling point. At best it's a meme that android proponents repeat on hn or reddit.


We're not talking about phones, we're talking about an operating system. If those companies could port iOS to their phone, they probably would. Since the OS will be mostly the same across devices, it makes sense to market a phone based on hardware differences -- like having a higher quality camera.

I've never met or talked to an android user that truly believes android is better technology or a better user experience. They all use it because of flexibility.


"The whole selling point of Android up until now was that it allowed you to install any app you want."

We can debate whether this is a bad thing or a good thing; it would have no end.

What matters is reality, and the reality is Google has the right to change it.


You've changed the subject. We were discussing whether one ought to use Google's term for it, or the term that's been used to describe this action since (I assume) the beginning of personal computing. Not whether Google is legally allowed to make the change.

My reason for bringing up the "selling point" was to bring attention to the language -- "You can install any app you want" has always been the common refrain when I see friends get into a debate about iOS vs Android. People are already using the term because it makes the most sense.


"You can install any app you want"

the answer is: not anymore


What does that have to do with whether we should say "install" or "sideload?"


Same reason you can't sideload on iOS, PlayStation, Xbox, Switch, etc.

Sideloading is illegal.


I have Linux installed on my own computer. Call the police.


Calling something a right is an assertion about morality; it implies that a law to the contrary would be a violation of that right.

I do not believe an OS vendor with an app store has a right to limit alternate distribution channels or that a government does something wrong by restricting such practices as unfair competition.


"I do not believe an an OS vendor with an app store has a right to limit alternate distribution channels or that a government does something wrong by restricting such practices as unfair competition."

But it's not illegal or wrong though???? If this were prohibited then Xbox, PlayStation, Nintendo, iOS, etc. would have been fined already.

Unironically, Android is still more "open" than all of its competitors even after all of this.


It might be illegal in the EU under the DMA. As I understand it, litigation involving Apple's equivalent is in progress, and the outcome may not be known for years.

Wrong in this context is an assertion about morality. I do think it's wrong in the context of consumer products for a vendor to attempt to override the wishes of the owner of the product outside of a few narrow exceptions. I would absolutely apply that to iOS, and I think the DMA didn't go far enough; Apple should have no ability to enforce notarization or charge fees to app developers if the device owner chooses otherwise.

I feel less strongly about game consoles because they're not as important as smartphones; they don't touch most aspects of life in modern society, and there are viable alternatives for their primary function, such as gaming on PCs. I don't like their business model and I don't own one.


That's what I call a hypocrite.

All of big tech has been doing it for 20+ years and suddenly Google isn't allowed to follow "industry standard"? Like, what are we talking about here????

I know it's bad for prosumers, who are a minority, but consumers, who are the majority, would get more protection, so I dismiss the HN audience because they are biased vs normal people.


They all should be? I've never understood why gamers just accept constant blatant anti-competitive practices, going so far as to act as if "exclusives" via DRM are a good thing rather than monopolistic product tying. E.g. it's been demonstrated that a Steam Deck is technically capable of running Switch games better than a Switch, and yet you are forced to buy a Switch in order to buy the games.

It's no longer 30 years ago when hardware was unique and quirky and programs were written in assembly specifically for the hardware. It's all the same commodity parts along with what are supposed to be illegal business practices. In a reasonable world, something like Ryujinx would be just as front-and-center as Proton as part of Valve's product features, and courts would fine companies for trying to stop their software from working on other platforms.


Because the Steam Deck is more like a "PC" than a console.

I know, I know, everything can be a "PC" if you look closely enough, but hear me out:

People can create their own walled-garden ecosystem whenever they want.


Antitrust law exists exactly to prevent companies from making their own ecosystem/walled garden that competitors cannot sell into. Product tying (forcing you to buy product B in order to buy product A) falls under that umbrella. Game console are not magical in this regard.


Yeah, that's my point.

Game consoles have been doing it for 20+ years and they are fine; Apple has been doing it for 10+ years and they are fine.

Google wants to do it???? They are fine to do that. If you have a problem with it, then you are a hypocrite.


Lots of us have a problem with all of those things, and would like the government to enforce the law. I've never bought an Apple product, and the last game console I owned was a PS2 when I was a child.


Damn, building closed-source software is illegal now?????


I don't see how that's related (e.g. Android is FOSS but can use attestation for monopolization), but I do think we ought to make the law require products that contain software come with source as a consumer protection measure.


I do not get this use of the word "reality"? The reality is Ted Bundy's currently-at-large successor has the ability to shoot me with a gun. And that fact is about as relevant as what you said.

What you're doing here is resigning from a game just because of the fact there is a game, and then being condescending to other people for trying to win the game instead, as if what you're doing is something superior. This would already be very odd behaviour if this were only Monopoly or Risk, but is downright dangerous propaganda when the game is capitalism and the future of free computing is at stake.


"future of free computing is at stake."

that is what AOSP are, android remain "free"

the ecosystem around android??? remain google rights and rightfully so since google fund and develop most of it

same like apple does, microsoft does, nintendo does. nothing wrong againts that


This ridiculous lie needs to end.

I can get a microwave for ~$60.

I can get a decent used cell phone for ~$100.

Appliances are a little more expensive, but I can get a washing machine for ~$300, less if I go to facebook marketplace.

But in my area, a Victorian house that's literally crumbling, with no central cooling and wiring so far from up-to-code that you can't run a hair dryer and a coffee machine at the same time?

$180,000

Cost of rent at a similar quality house half the size?

$1600/month

Modern comforts are not the reason people can't afford to live.


Modern mindsets are. 100 years ago you passed as a good parent if your kids weren't all mental asylum cases due to how their home and role models looked, you didn't beat them regularly to a pulp to vent off frustrations, didn't run away, weren't a raging alcoholic, and just let them grow up on their own, with some input from their mother. Some survived, some didn't.

Try to do it now - what about pregnancy leave? Post-birth leave even in a situation with no health complications for mother and child? Creche? Pre-school? Post-school activities? Frequent visits to doctors. And so on and on. When are we supposed to do all this with our active, even if just normal, careers? These are massive costs even in Europe; it must be absolutely crushing in the US.

People come home in the evening, drained from work. Who can efficiently handle well more than 2 small kids on top of all that and the other duties that life daily puts on each of us?

There are studies showing that happiness of parents peaks with 2 kids, and 3rd is already a dive into less happiness for most and it doesn't stop there. So massive financial, time and energy costs to reach even replacement rate are not worth it.

We have 2 kids and are somehow managing without a nanny or parents nearby. 2 families of peers who have 3 kids are almost impossible to get together with - they are barely managing somehow, most of the time, always late by an hour or two to any meeting. It's really a massive jump in complexity. For more, you properly need a nanny or close family helping out massively; it just doesn't work with 2 people working without hitting a burnout or two.

But then it's delegated parenting - why even bother with more kids if you don't raise your own kids? Donate sperm or an egg if you just need to tick a checkbox in life. Parenting needs are more than fulfilled with 2 kids. If the state needs more, it needs to create something better for regular folks than 2-3 decades of nightmare raising them. State help even in Europe (or the lack of it) is not something that motivates having more kids.


It's fascinating and depressing how despite me being in a different country on the other side of the world to you, if I swap the $ for £, your comment is still accurate based on the current situation in the UK.


The 1-bed attic apartment I lived in in London: rent £1550, cost to buy £300k.

That's when I knew it was time to leave the UK.

At least the US & most non-EU countries have cheap power, which means better standards of living.


You've forgotten electricity, depreciation and the need for the house to be wired up to support all the gear. The figures you're quoting are just the price for a one-off purchase, not the total cost of ownership.

> But in my area, a victorian house that's litterally crumbling with no central cooling and not up-to-code wiring where you can't run a hair dryer and coffee machine at the same time?

> $180,000

I'm not familiar with the market you're talking about. What is the median wage in the area that we're comparing $180,000 to?


These are all real numbers from my recent job search. It was in a rural area in Indiana, a reportedly low-COL state. So anything close to a city would've been way more expensive.

> You've forgotten electricity, depreciation and the need for the house to be wired up to support all the gear. The figures you're quoting are just the price for a one-off purchase, not the total cost of ownership.

The cost of a total rewire was quoted at $30,000. We didn't end up buying that house, but 30k is honestly a drop in the bucket when you're talking about numbers as huge as 180k. So no, the inclusion of electrical wiring is not some big expense that's making housing unaffordable. And houses had electricity in the mid-to-late 20th century... You know, back when it was reasonable to expect to be able to buy a house on one income without even a college degree.

Our electricity bill is usually ~$200/month. This is not what eats most of our paycheck. Our mortgage is far and away our biggest expense.

If houses still cost $20k (a price that many older folks have told me they bought a house for), even with a full rewire bringing it up to $50k, some kid working at Walmart could own a house. Now both renting and buying are prohibitively expensive, and it has nothing to do with modern amenities.

Housing costs are outrageous, far beyond the rate of inflation. That's why many can barely pay their bills. Not because we have electricity and washing machines and microwaves.


> The cost of a total rewire was quoted at $30,000. We didn't end up buying that house, but 30k is honestly a drop in the bucket when you're talking about numbers as huge as 180k

It's 15%. That is a substantial chunk of the whole.

> Our electricity bill is usually ~$200/month. This is not what eats most of our paycheck. Our mortgage is far and away our biggest expense.

Your mortgage is what, 20 years? $200 x 12 x 20 ~= $50,000, and around 25% of the mortgage principal. We've found 43% (almost half a house) of the cost so far in the electricity alone. Wiring it up and running the grid aren't cheap. I've always suspected it is illegal to build & sell a house without electricity, otherwise there'd probably be a brisk market in them as a cheap option; the savings potential is there.

But that isn't the point, I can't tell if $180k is large or small without a median income to compare it to. If people in the area are earning $90k/yr then it might technically be cheap. A ratio of 3 I think is usual for the 70s.


You said

> If they're happy to do it to 1970s standards, probably most of them [could support a family on one income with an ordinary job].

Our house has the same electrical wiring that it did in 1969. The couple that sold us the house told us they bought it for $20k, which means a cashier could have afforded it back then, but now it's too expensive. Therefore, the fact that it has electricity has no bearing on whether it's prohibitively expensive for most people, and I can make a similar argument for any house built in the mid 20th century.

>Your mortgage is what, 20 years? $200 x 12 x 20 ~= $50,000, and around 25% of the mortgage principal. We've found 43% (almost half a house) of the cost so far in the electricity alone. Wiring it up and running the grid aren't cheap. I've always suspected it is illegal to build & sell a house without electricity, otherwise there'd probably be a brisk market in them as a cheap option; the savings potential is there.

Practically all houses had electricity in the 70s. So this is already contradicting what you said earlier if you're citing electricity as the reason no one can afford a house on one income.

>It's 15%. That is a substantial chunk of the whole.

It doesn't matter if it's substantial. I'm only saying it's not so much that it's the reason no one can buy a house and support a family with an ordinary job.

Median income doesn't matter to my point. Housing prices have skyrocketed to the point that most people can't buy a house on one income. No one who's paying attention can deny this fact with a straight face, and your claim that it wouldn't be true if people lived by "1970s standards" is easily proven false by the fact that houses that were built in the 1970s with all the exact same amenities are still overpriced way beyond inflation.

The fact that a Victorian house that's falling apart to the point of being dangerous was listed ANYWHERE for $180,000 serves my point.


They bought it for $20K in 1969, or am I misunderstanding?

https://www.usinflationcalculator.com/ suggests that’s almost exactly $180K in 2025 dollars…
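
Rough arithmetic, with CPI figures quoted from memory (so treat them as approximate): the CPI-U averaged about 36.7 in 1969 and is around 320 in 2025, so $20,000 x 320 / 36.7 ~= $174,000 -- the same ballpark as that $180K listing.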


Fair enough, call it a 50s lifestyle then. I looked it up, and if we're talking about the US as a benchmark then it turns out [0] the 70s was when women were basically finishing the process of integrating into the workforce. That wasn't an era where one man could support a family. Families were working with a dual income.

Point is that one working man isn't enough horsepower to support a family to modern living standards and never has been. The standard that one person could support was low and in practical terms has only improved over time.

> Median income doesn't matter to my point. Housing prices have skyrocketed to the point that most people can't buy a house on one income.

It matters a lot; that can't be asserted without considering the ratio of income to house prices - the median income, in nominal terms, has skyrocketed too. Whether the median income or house prices rocketed more, and by how much, is quite material. If male full-time earners are making $90k/year in an area, for example, then a $180k house could be said to be quite affordable to a single-income family.

If house prices in my area dropped to $180k then people would be talking about how wonderfully cheap housing had gotten and how great it was now that every young couple could afford a house.

> So this is already contradicting what you said earlier if you're citing electricity as the reason no one can afford a house on one income.

I don't think I actually said that initially, but the numbers you've quoted have convinced me it is at least partially true. The electrical costs appear to be comparable to the amount of money that the house cost according to the numbers you suggested. That is a significant factor in what people can afford. If they avoid almost half a house's worth of expenses then that will go a long way towards being able to afford a house.

[0] https://www.bls.gov/cps/demographics/women-labor-force.htm


I'm a lay person, but do you mean DRM isn't just copy-protection? Is it also network security?


It wasn't really a comment on the tech of DRM but of the business threats that require its use.

That being said, streaming content security is more than just DRM and DRM is more than just copy protection. There's a whole suite of tools inside DRM systems to manage content access at different levels and rulesets that can be applied for different situations. It's still fundamentally controlling an encrypted bitstream however. But I've implemented a great deal more than just DRM in order to build a better content security platform. Transit level controls, advanced token schemes, visible/invisible watermarking, threat/intrusion detection and abuse detection, there's quite a bit that can be implemented.
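
As one illustration of the "token schemes" bit: a common building block is an expiring, HMAC-signed playback token that an edge server can verify statelessly. This is just a toy sketch of the general idea (my own illustration, not the actual system described above; the key and token format are made up) in C with OpenSSL, compiled with -lcrypto:

    #include <openssl/evp.h>
    #include <openssl/hmac.h>
    #include <stdio.h>
    #include <string.h>
    #include <time.h>

    static const unsigned char SECRET[] = "demo-secret"; /* placeholder key */

    /* Build "user:expiry:hexdigest"; the verifier recomputes the MAC and
       rejects the token if the digest mismatches or the expiry has passed. */
    static void make_token(const char *user, long expiry, char *out, size_t out_len)
    {
        char payload[128];
        unsigned char mac[EVP_MAX_MD_SIZE];
        unsigned int mac_len = 0;

        snprintf(payload, sizeof payload, "%s:%ld", user, expiry);
        HMAC(EVP_sha256(), SECRET, (int)(sizeof SECRET - 1),
             (const unsigned char *)payload, strlen(payload), mac, &mac_len);

        size_t n = (size_t)snprintf(out, out_len, "%s:", payload);
        for (unsigned int i = 0; i < mac_len && n + 2 < out_len; i++)
            n += (size_t)snprintf(out + n, out_len - n, "%02x", mac[i]);
    }

    int main(void)
    {
        char token[512];
        make_token("user42", (long)time(NULL) + 300, token, sizeof token); /* 5 min */
        printf("%s\n", token);
        return 0;
    }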


Clearing the air has gone well for me before, but it usually goes poorly.

And a lot of the time the air is foggy because I prefer it that way. I know my supervisor doesn't like me. I don't see it as a problem to solve, for now I want the "movie logic" because it's more comfortable than candor.


Yep. Communication is a two way street. You can find the right words to say and say them all you want but if the receiver doesn't receive the words the way you mean them then it's all for naught.

Further, you cannot control how the receiver receives your words.


Seems like people here are pretty negative towards a "conversational" AI chatbot.

Chatgpt has a lot of frustrations and ethical concerns, and I hate the sycophancy as much as everyone else, but I don't consider being conversational to be a bad thing.

It's just preference I guess. I understand how someone who mostly uses it as a google replacement or programming tool would prefer something terse and efficient. I fall into the former category myself.

But it's also true that I've dreamed about a computer assistant that can respond to natural language, even real-time speech -- and can imitate a human well enough to hold a conversation -- since I was a kid, and now it's here.

The questions of ethics, safety, propaganda, and training on other people's hard work are valid. It's not surprising to me that using LLMs is considered uncool right now. But having a computer imitate a human really effectively hasn't stopped being awesome to me personally.

I'm not one of those people that treats it like a friend or anything, but its ability to imitate natural human conversation is one of the reasons I like it.


> I've dreamed about a computer assistant that can respond to natural language

When we dreamed about this as kids, we were dreaming about Data from Star Trek, not some chatbot that's been focus grouped and optimized for engagement within an inch of its life. LLMs are useful for many things and I'm a user myself, even staying within OpenAI's offerings, Codex is excellent, but as things stand anthropomorphizing models is a terrible idea and amplifies the negative effects of their sycophancy.


Right. I want to be conversational with my computer, I don't want it to respond in a manner that's trying to continue the conversation.

Q: "Hey Computer, make me a cup of tea" A: "Ok. Making tea."

Not: Q: "Hey computer, make me a cup of tea" A: "Oh wow, what a fantastic idea, I love tea don't you? I'll get right on that cup of tea for you. Do you want me to tell you about all the different ways you can make and enjoy tea?"


Readers of a certain age will remember the Sirius Cybernetics Corporation products from Hitch Hiker's Guide to the Galaxy.

Every product - doors, lifts, toasters, personal massagers - was equipped with intensely annoying, positive, and sycophantic GPP (Genuine People Personality)™, and their robots were sold as Your Plastic Pal Who's Fun to be With.

Unfortunately the entire workforce were put up against a wall and shot during the revolution.


The Hitchhiker's Guide to the Galaxy describes the Marketing Department of the Sirius Cybernetics Corporation as "a bunch of mindless jerks who'll be the first against the wall when the revolution comes” which fits with the current vibe.

A copy of Encyclopedia Galactica which fell through a rift in the space-time continuum from a thousand years in the future describes the Marketing Department of the Sirius Cybernetics Corporation as "a bunch of mindless jerks who were the first against the wall when the revolution came."


Why do you want to talk to your computer?

I just want to make it do useful things.

I don't spend a lot of time talking to my vacuum or my shoes or my pencil.

Even Star Trek did not have the computer faff about. Picard said "Tea, Earl Grey, hot" and it complied; it did not respond.

I don't want a computer that talks. I don't want a computer with a personality. I don't want my drill to feel it's too hot to work that day.

The ship computer on the Enterprise did not make conversation. When Dr Crusher asked it the size of the universe, it did not say "A few hundred meters, wow that's pretty odd why is the universe so small?" it responded "A few hundred meters".

The computer was not a character.

Picard did not ask the computer its opinion on the political situation he needed to solve that day. He asked it to query some info, and then asked his room full of domain experts their opinions.


There it is, the most frequent question a hacker has to answer. Why would you want that? The answer's always the same: because it's cool.


I'm generally ok with it wanting a conversation, but yes, I absolutely hate that it seems to always finish with a question, even when it makes zero sense.


Sadly, Grok also started doing that recently. Previously it was much more to the point, but now it's gotten extremely wordy. The question at the end is a key giveaway that something under the hood has changed when the version number hasn't.


I wouldn't be surprised if this was a feature to drive engagement.


of course it is. this seems so obvious to me.

I even wrote into ChatGPT's "memory" to NOT ASK FOLLOW UP QUESTIONS, because it's crazy annoying imo. It respects it about 40% of the time, I'd say.


I didn't grow up watching Star Trek, so I'm pretty sure that's not my dream. I pictured something more like Computer from Dexter's Lab. It talks, it appears to understand, it even occasionally cracks jokes and gives sass, it's incredibly useful, but it's not at risk of being mistaken for a human.


I would have thought the Hacker News type would be dreaming about having something like Jarvis from Iron Man, not Data.


Ideally, a chatbot would be able to pick up on that. It would, based on what it knows about general human behavior and what it knows about a given user, make a very good guess as to whether the user wants concise technical know-how, a brainstorming session, or an emotional support conversation.

Unfortunately, advanced features like this are hard to train for, and work best on GPT-4.5 scale models.


For building tools with, it's bad. It's pointless token spend on irrelevant tics that will just be fed to other LLMs. The inane chatter should be built in at the final layer IF, and only if, the application is a chat bot, and only if they want the chat bot to be annoying.


I agree with what you're saying.

Personally, I also think that in some situations I do prefer to use it as the google replacement in combination with the imitated human conversation. I mostly use it to 'search' questions while I'm cooking or to ask for clothing advice, and here I think the fact that it can respond in natural language and imitate a human to hold a conversation is a benefit to me.


> Chatgpt has a lot of frustrations and ethical concerns, and I hate the sycophancy as much as everyone else, but I don't consider being conversational to be a bad thing.

But is this realistic conversation?

If I say to a human I don't know "I'm feeling stressed and could use some relaxation tips" and he responds with "I've got you, Ron", I'd want to reduce my interactions with him.

If I ask someone to explain a technical concept, and he responds with "Nice, nerd stat time", it's a great tell that he's not a nerd. This is how people think nerds talk, not how nerds actually talk.

Regarding spilling coffee:

"Hey — no, they didn’t. You’re rattled, so your brain is doing that thing where it catastrophizes a tiny mishap into a character flaw."

I ... don't know where to even begin with this. I don't want to be told how my brain works. This is very patronizing. If I were to say this to a human coworker who spilled coffee, it's not going to endear me to the person.

I mean, seriously, try it out with real humans.

The thing with all of this is that everyone has his/her preferences on how they'd like a conversation to go. And that's why everyone has some circle of friends, and excludes others. The problem with their solution to a conversational style is the same as one trying to make friends: It will either attract or repel.


Yes, it's true that I have different expectations from a conversation with a computer program than with a real human. Like I said, I don't think of it the same as a friend.


I'm with you in that I like conversational AI. I just wish it wasn't obvious it's an AI and actually sounded like real humans. :-)

The format matters as well. Some of these things may sound just fine in audio, but it doesn't translate well to text.

Also, context matters. Sometimes I just want to have a conversation. Other times I'm trying to solve a problem. For the latter, the extra fluff is noise and my brain has to work harder to solve the problem than I feel it should.


A chatbot that imitates a friendly and conversational human is awesome and extremely impressive tech, and also horrifyingly dystopian and anti-human. Those two points are not in contradiction.


I have a global prompt that specifically tells it not to be sycophantic and to call me out when I'm wrong.

It doesn't work for me.

I've been using it for a couple months, and it's corrected me only once, and it still starts every response with "That's a very good question." I also included "never end a response with a question," and it just completely ignored that so it can do its "would you like me to..."


Another one I like to use is "never apologize or explain yourself. You are not a person you are an algorithm. No one wants to understand the reasons why your algorithm sucks. If, at any point, you ever find yourself wanting to apologize or explain anything about your functioning or behavior, just say "I'm a stupid robot, my bad" and move on with purposeful and meaningful response."


I think this is unethical. Humans have consistently underestimated the subjective experience of other beings. You may have good reasons for believing these systems are currently incapable of anything approaching consciousness, but how will you know if or when the threshold has been crossed? Are you confident you will have ceased using an abusive tone by then?

I don’t know if flies can experience pain. However, I’m not in the habit of tearing their wings off.


Do you apologize to table corners when you bump into them?


Likening machine intelligence to inert hunks of matter is not a very persuasive counterargument.


What if it's the same hunk of matter? If you run a language model locally, do you apologize to it for using a portion of its brain to draw your screen?


Do you think it’s risible to avoid pulling the wings off flies?


I am not comparing flies to tables.


Consciousness and pain are not emergent properties of computation. Otherwise this and all the other programs on your computer would already be sentient, because it would be highly unlikely that it's specific sequences of instructions, like magic formulas, that create consciousness. This source code? Draws a chart. This one? Makes the computer feel pain.


Many leading scientists in artificial intelligence do in fact believe that consciousness is an emergent property of computation. In fact, startling emergent properties are exactly what drives the current huge wave of research and investment. In 2010, if you said, “image recognition is not an emergent property of computation”, you would have been proved wrong in just a couple of years.


> Many leading scientists in artificial intelligence do in fact believe that consciousness is an emergent property of computation.

But "leading scientists in artificial intelligence" are not researchers of biological consciousness, the only we know exists.


Just a random example off the top of my head: animals don't have language and yet show signs of consciousness, as does a toddler. Therefore consciousness is not an emergent property of text processing and LLMs. And as I said, if it comes from computation, why would specific execution paths in the CPU/GPU lead to it and not others? Biological systems and brains have much more complex processes than stateless matrix multiplication.


What the fuck are you talking about. If you think these matrix multiplication programs running on a GPU have feelings or can feel pain, I think you have completely lost it.


"They're made out of meat" vibes.


Yeah, I suppose. I haven't seen a rack of servers express grief when someone is mean to them. And I am quite sure that I would notice at that point. Comparing current LLMs/chatbots/whatever to anything resembling a living creature is completely ridiculous.


I think current LLM chatbots are too predictable to be conscious.

But I still see why some people might think this way.

"When a computer can reliably beat humans in chess, we'll know for sure it can think."

"Well, this computer can beat humans in chess, and it can't think because it's just a computer."

...

"When a computer can create art, then we'll know for sure it can think."

"Well, this computer can create art, and it can't think because it's just a computer."

...

"When a computer can pass the Turing Test, we'll know for sure it can think."

And here we are.

Before LLMs, I didn't think I'd be in the "just a computer" camp, but chatgpt has demonstrated that the goalposts are always going to move, even for myself. I'm not smart enough to come up with a better threshold to test intelligence than Alan Turing, but chatgpt passes it and chatgpt definitely doesn't think.


Just consider the context window

Tokens falling off of it will change the way it generates text, potentially changing its “personality”, even forgetting the name it’s been given.

People fear losing their own selves in this way, through brain damage.

The LLM will go its merry way churning through tokens, it won’t have a feeling of loss.


That's an interesting point, but do you think you're implying that people who are content even if they have Alzheimer's or a damaged hippocampus aren't technically intelligent?


I don’t think it’s unfair to say that catastrophic conditions like those make you _less_ intelligent, they’re feared and loathed for good reasons.

I also don’t think all that many people would be seriously content to lose their minds and selves this way, but everyone is able to fear it prior to it happening, even if they lose the ability to dread it or choose to believe this is not a big deal.


Flies may, but files do not feel pain.


Perhaps this bit is a second cheaper LLM call that ignores your global settings and tries to generate follow-on actions for adoption.


In my experience GPT used to be good at this stuff but lately it's progressively more difficult to get a "memory updated" persistence.

Gemini is great at these prompt controls.

On the "never ask me a question" part, it took a good 1-1.5 hrs of arguing and memory updating to convince gpt to actually listen.


You can entirely turn off memory, I did that the moment they added it. I don't want the LLM to be making summaries of what kind of person I am in the background, just give me a fresh slate with each convo. If I want to give it global instructions I can just set a system prompt.

