You don't need an educated workforce if you have machines that can do it reliably. The more important question is: who will buy your crap if your population is too poor due to a lack of well-paying jobs? A look at England or Germany provides the answer.
For what it's worth, I have been using Gemini 2.5/3 extensively for my masters thesis and it has been a tremendous help. It's done a lot of math for me that I couldn't have done on my own (without days of research), suggested many good approaches to problems that weren't on my mind, and helped me explore ideas quickly. When I ask it to generate entire chapters they're never up to my standard, but that's mostly an issue of style. It seems to me that LLMs are good when you don't know exactly what you want or you don't care too much about the details. Asking it to generate a presentation is an utter crapshoot, even if you merely ask for bullet points without formatting.
I bet they were talking about how people didn't do long division when the calculator first came out too. Is using MATLAB and Excel OK but AI not? Where do we draw the line with tools?
Apparently not. This is the most perfect example of "I can recite it, but I don't understand it, so I don't know if it's really right or not" that I've seen in a while.
I don't do a lot of frontend work. But when I did, the mocks were almost always best case; no mocks for when there was missing data (which was often). I did work on one project with an interaction designer, which was great --- having all the flows laid out was awesome, but it was only the happy path.
You have photos and text; try making them funny and showing off that you're not afraid to share things about yourself that others would hide - maybe emphasize a weird hobby or interest that other people might be worried to reveal. Of course this only works if you're actually funny or actually have unusual hobbies. One guy I know literally posted silly photos of himself awkwardly pole dancing and got tons of matches. Even if you're tall and rich, I advise being interesting instead and making it impossible to tell those things, if you want to match with people who are not boring and shallow. Have a good time and be light-hearted - a bad attitude about how dating is unfair will be impossible to hide and looks like a major personality flaw.
Well, sure, it's all relative and no system is perfect. Not every mother is perfect, doesn't mean I escort mine around the house at gunpoint whenever she visits.
TFA is indignantly reacting to the least charitable interpretation of what his employer has asked him to do. I'd like to know the honest shape of the code his manager rejected before judging the employer over this.
Sometimes? A lot of the time the point of therapy is to challenge your statements and get to the heart of the issue so you can mend it, or to recognize things that are off base and handle them differently. A lot of the relationship is meant to be a supportive kind of conflict so that you can get better. Sometimes people really do need validation, but other times they need to be challenged in order to improve. As it stands today, AI models can't challenge you the way a human therapist can.
Any field has hacks. Telling someone what they want to hear and helping get someone where they want to be are different things. Quality professionals help people reach their goals without judgment or presumption. That goes for mental health professionals as well as any professional field.
Absolutely true but I don't think a person should rely on an LLM alone for that reason. It's just not smart and insightful enough.
It's more like that really good friend that's not a therapist but always tells you what you want to hear and makes you feel a bit better until you get to your actual therapist.
Yes but still, talking to someone helps. No matter what they say back. If an LLM is the only thing around at the moment (e.g. in the middle of the night) this can be useful for therapeutic purposes.
Therapy isn't only about what the therapist says to you. A lot of it is about you talking to them and the process that creates in your mind. By sharing your thoughts with someone else, you already view them from a different perspective.
Then it’s actually better to talk to yourself. Not an LLM that’s trained on all of the internet’s combo of valuable and unhinged takes on all manner of trauma.
It's not the same. An internalised conversation doesn't have the same effect.
And I have good experiences with the LLM for this purpose. It's probably my prompt and the RAG setup I fed with a lot of my personal material, but even the uncensored model I use is always supportive and often comes up with interesting takes and practical suggestions.
I don't rely on it for advice, but for talking to when real friends aren't around and something urgent is worrying me, it's really good.
Would you be willing to email me at me@xeiaso.net? I have some questions I'd like to ask you as part of research for a followup piece. No judgement, I just want to know how it's affected your life.
I honestly would recommend against that, but we’re all free to do with our brains as we please. I just hope it’s not as destructive as I intuit it to be…
Anyone interested in better understanding a complex system can benefit from a qualified professional’s collaboration, often and especially when an outside perspective can help find different approaches than what appear to be available from inside the system.
Not really. Good therapy is uncomfortable. You are learning how to deal with thought patterns that are habitual but unhealthy. Changing those requires effort, not soothing compliments and validation of the status quo.
This paper is so important. Just imagine how much pain could have been avoided if the Gitlab and Github developers read this before making the steaming shit pile of Github Actions.