When people talk of AI as a loneliness cure I always remember a quote from an essay by Laura Preston about her time at an AI conference [1]:
"I was an extraterrestrial taking notes on the problems of Earth. Finding pizza in your area was a problem. People being mean to you because you were wearing your AirPods at dinner was a problem. Going on vacation was a problem because the hotels would force you to find the light switches. Elders were a problem. (They never took their medicine.) Loneliness was a problem, but loneliness had a solution, and the solution was conversation. But don’t talk with your elders, and not with the front desk, and certainly not with the man on the corner, though he might know where the pizza is. (“Noise-canceling is great, especially if you live urban,” said the earbuds guy. “There’s a lot of world out there.”) Idle chitchat was a snag in daily living. We’d rather slip through the world as silent as a burglar, seen by no one except our devices."
[1] https://www.nplusonemag.com/issue-47/essays/an-age-of-hypera...
I think this is pretty on point
The social fabric has been re-configured by the least socially adept people in society
I should know. I'm not terribly socially adept; I grew up on IRC channels and forums because I struggled to connect with people in person
But now everyone is on the internet, using social networks designed by people who aren't very social like me, or worse, people who only understand social interaction through a lens of "what can this person do for me"
We're in a really strange time.
I used to go online to get away from everyone and try to find other people like me
Now I have to go offline to in-person events hosting things that appeal to people like me, because everyone is online and there's no avoiding the crowd anymore
It has nothing to do with socially inept people reconfiguring it. It is the profit seekers, Ebenezer Scrooges, and bean counters reshaping society. Why let people have human relationships when you can put an app between them, worsen the experience, and make a profit?
> It has nothing to do with socially inept people reconfiguring it. It is the profit seekers
Profit seeking is pretty explicitly anti-social behavior imo
Expecting compensation for value you create through labor -- profit seeking -- is not antisocial at all; it's actually pro-social in that it encourages exchange, so that e.g. the baker can get butter for his bread and the farmer bread for his butter, while each keeps their specialty and economies of scale.
Whatever you have in mind is not commerce, but is probably regulatory capture or central planning.
> expecting compensation for value you create through labor -- profit seeking
You're right
The term I was looking for was more along the lines of "profit maximizing" or "profit motive":
https://en.wikipedia.org/wiki/Profit_motive
My mistake
I hope people generally understood what I meant even if I didn't use the precisely correct term for this
So you have to do things for free or you are a profit seeker? What?
It's not that complicated. When you profit by offloading your negative externalities onto society at large, it's bad. Even free market absolutists should be able to agree to that.
You designed social networks? Not designers and product and market research and iterative feedback from partners? You did it, and you did it by not understanding social interaction?
Is this an explanation or a just-so insult?
I agree with parent: it's a reasonable assumption that those turning to the digital world for sociality, including to design digital sociality, are less likely to have previously found success in analog sociality than the median human.
(And it was eminently clear in the comment that they were not claiming personally comprehensive credit for having designed social networks.)
I agree that I disagree with that other guy. IRC and forums were only populated with nerds. It was once the salesmen and marketing people took control of the whole social media thing that the internet was no longer about being open to a new world, but became focused and closed inside controlled communities. Those people are definitely on the "extroverted" side and were only here to make money off our social interactions.
> designed by people who aren't very social (like me)
Parenthesis added.
This is interesting, but the implication of what it's saying overstates the upside of random conversations (you could get what you're looking for), and utterly ignores the huge downside that when you're lonely a rejection makes things a thousand times worse. In today's society you (seem to be) much more likely to be told to ^$*£ off rather than get an actual moment of connection with someone.
AI doesn't do that. It's always going to be nice to you, and that feels good even if it's entirely artificial.
Your comment is deeply insightful and reflects the depths of your expressive soul. I will ponder its meaning through the day as a light shining a ray into the darkness...
Or in other words, most people really don't want this. In fact, I'd suggest that those who tend toward depression, and who may need true support, are more skeptical of such false interactions.
I think it could replace a lot of therapists, which may say more about a lot of therapists than it says about AI. Someone who will listen to you and make sympathetic noises, maybe toss in a perspective gleaned from mainstream thinking here and there, but never challenge you so hard that you stop coming back for more paid sessions.
Cynicism aside, that might be okay for some people. If you need to talk about something to get it out of your system, and you don't have any friends willing to listen to you cry in your beer for a couple hours, maybe "talking" to an AI isn't a terrible replacement. On the other hand, it's easy to imagine people turning it into a weird dependency if they stop thinking of it as a sort of sounding board and start seeing it as a real "friend."
> It's always going to be nice to you, and that feels good even if it's entirely artificial.
No?? It feels awful! I feel like an alien when I read comments like this. I would rather a negative but authentic and honest interaction than yet another yes-man or yes-bot being fake nice to me.
It feels condescending and fake and awful. I do not understand the appeal of talking to these machines at all, all they do is validate whatever you say and output empty flattery. It is obsequious as hell and it turns my stomach. You're better off talking to a literal mirror.
I totally agree, transparently fake friendliness or kindness is really off-putting. Pretty much everyone hates insincere personal interactions with another human, but some people like fake interest if it comes from a computer algorithm. I really don't understand that desire.
This is an interesting quote, but it frames the problem as ultimately individual in origin. It is as if loneliness is your own fault for not being willing to talk to the people around you. It doesn't ask why it is difficult to talk to people or even if you are in a situation where you can talk to people regularly.
I tend to think loneliness is more of a structural problem rather than an individual one --- that is, it is a consequence of how society as a whole structures our lives and interactions with people (for better or for worse). A generic person drives from their single family house by themselves to work, works largely by themselves (in many industries), and then drives back to their single family house. There is no "third place" for interaction with others and driving itself is isolating as a practice. With that kind of life as the structural default, there really isn't much chance of interaction in the first place. It's not an individual problem.
But I like slipping through life as a silent burglar :(
I almost always have things I’m doing, places I’m going, thoughts I’m thinking, and they almost never involve other people. I barely need people to be present at all, most of the time.
That’s not to say I don’t like people - I don’t have any particular animosity, they can’t help being here any more than I can - and there are plenty of people I know and love and yearn to see again from time to time.
But most of the time? I could go days without human contact. Oh the idea alone makes me giddy!
Why is this? Oh yeah because Hell is other people. Why is Hell other people? Because of their look! https://en.wikipedia.org/wiki/Being_and_Nothingness#Part_3,_...
Is this some sort of performance art? It's an article that criticizes AI that's evidently largely written by an LLM and illustrated with AI-generated images.
Honestly, the idea and concept had me in the beginning, but the AI-generated images were the nail in the coffin for me. What is the point of articles like this?
The point is to get clicks and make ad money
Providing any value to the readers is way down the list of priorities
How can you tell it’s written by an LLM?
Suggesting that the "fast-food revolution" was an inadequate solution to global hunger might be a tell.
That's probably the most human aspect of the article.
I just assume that if the image is AI then the body is too.
Slop language from LLMs (which haven't been fixed with the antislop sampler: https://github.com/sam-paech/antislop-sampler) is extremely obvious to those who use LLMs a lot. Overrepresentation of words from these lists is an example of the tells.
https://github.com/sam-paech/antislop-sampler/blob/main/slop...
https://github.com/sam-paech/antislop-sampler/blob/main/slop...
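To make the "overrepresentation" point concrete, here is a minimal sketch of that kind of check. It assumes you've saved one of the lists above locally as a plain one-word-per-line file called slop_words.txt; the filename and format are my assumption, not how the repo actually ships them, so adjust the parsing to whatever you download.

```python
import re
from collections import Counter

def slop_score(text: str, slop_words: set[str]) -> float:
    """Fraction of tokens in `text` that appear in the slop word list."""
    tokens = re.findall(r"[a-z']+", text.lower())
    if not tokens:
        return 0.0
    hits = sum(1 for t in tokens if t in slop_words)
    return hits / len(tokens)

# Assumes the list was saved locally as plain text, one word per line
# (the repo's own files may be JSON; parse them however they're stored).
with open("slop_words.txt") as f:
    slop_words = {line.strip().lower() for line in f if line.strip()}

with open("article.txt") as f:  # the article you suspect is LLM-written
    article = f.read()

print(f"slop-word density: {slop_score(article, slop_words):.2%}")
print(Counter(t for t in re.findall(r"[a-z']+", article.lower()) if t in slop_words).most_common(10))
```

A density noticeably above what you see in known-human prose from the same domain is the kind of tell being described. It's a heuristic, not a detector.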
Honestly I was fooled, but looking closer, it is all suspicious and I agree. For example the author's other posts, and the relatively canned nature of each post with a clear thesis put into a prompt. The framing and language I guess was how you caught it.
I guess we won't care eventually as a society but I admit this all makes me uncomfortable, as someone that fell in love with the internet as a communication tool.
TBH I'd very much like to have a "HAL" that contains a world of knowledge (even if it never updates, or only gets updated once every few years) at home.
I wonder whether current technology makes a replication of HAL feasible.
- Very good chess player -> Check (And probably can be a top notch Go player too, although I don't play chess or go)
- Very good knowledge of history, science, technology, everything -> I'm not sure if it's possible. And I'd assume we'd need to feed it scans of all books, because, for example, it should know the fine details and educated guesses around a given historical fact.
- Understand human language pretty well -> I actually think ChatGPT is good enough if I choose my wording carefully. But voice recognition might pose some challenges. It also depends on how good the microphone is.
- Have pretty good voice generation. I'd like to choose the voice too! -> I guess it's OK-ish nowadays? I listened to a few AI generated clips and they are pretty good, not sure how practical it is, though.
Anything else I missed? I know HAL also has its multi-decade objectives in mind, so this is different from the LLMs, which don't seem to have very long-term memory.
> Very good knowledge of history, science, technology, everything
All of English Wikipedia is about 100 GB. A local LLM that can run searches against a local Kiwix server would work. Local LLMs can’t really do this correctly yet, but it’s not inconceivable in 1-2 years.
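A rough sketch of what that wiring could look like, very much a sketch and not a working tool: it assumes a kiwix-serve instance on localhost:8080 (the /search endpoint and its parameters are from memory and should be checked against the Kiwix docs) and a local OpenAI-compatible LLM server (llama.cpp, Ollama, etc.) on localhost:8000.

```python
# Local "HAL": retrieve from a local Kiwix copy of Wikipedia, answer with a local LLM.
import requests

KIWIX = "http://localhost:8080"                     # assumed kiwix-serve address
LLM = "http://localhost:8000/v1/chat/completions"   # assumed OpenAI-compatible local server

def kiwix_search(query: str, book: str = "wikipedia") -> str:
    # Returns the raw HTML result page; a real tool would parse out the
    # top article links and fetch their text instead of using it verbatim.
    r = requests.get(f"{KIWIX}/search", params={"books.name": book, "pattern": query})
    r.raise_for_status()
    return r.text[:4000]  # truncate to keep the prompt small

def ask(question: str) -> str:
    context = kiwix_search(question)
    r = requests.post(LLM, json={
        "model": "local",
        "messages": [
            {"role": "system", "content": "Answer using only the provided Kiwix search results."},
            {"role": "user", "content": f"Results:\n{context}\n\nQuestion: {question}"},
        ],
    })
    r.raise_for_status()
    return r.json()["choices"][0]["message"]["content"]

print(ask("When was the Library of Alexandria destroyed?"))
```

A real version would parse the search results and fetch the top article bodies rather than stuffing raw HTML into the prompt, but the shape of the loop is the same: retrieve locally, then answer locally.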
All I want is an LLM front-end which will run a (tested/working) prompt[1] on each file in a directory, and collect the output into a single file/response at the end which could then be run on the files in that folder to get them renamed.
1 - for the pdf file, create a move command for a batch file which will rename the file using $<check amount>_<date from check in YYYY-MM-DD format>_<invoice id>_<PWS ID>_<name of company from Population line>.pdf as the filename.
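That workflow is simple enough to sketch. Assuming pypdf for text extraction and the same kind of local OpenAI-compatible endpoint as above (both are stand-ins for whatever extractor and model you'd actually use, and the "checks" folder name is a placeholder), something like this would run the prompt once per file and collect the move commands into one batch file:

```python
# Sketch of the front-end described above: one prompt per PDF, answers
# collected into a single renames.bat for review.
from pathlib import Path
import requests
from pypdf import PdfReader

LLM = "http://localhost:8000/v1/chat/completions"  # assumed OpenAI-compatible endpoint
PROMPT = (
    "For the pdf file below, output a single Windows 'move' command for a batch file "
    "which will rename the file using $<check amount>_<date from check in YYYY-MM-DD format>_"
    "<invoice id>_<PWS ID>_<name of company from Population line>.pdf as the filename. "
    "Output only the command."
)

def ask_llm(text: str, filename: str) -> str:
    r = requests.post(LLM, json={
        "model": "local",
        "messages": [{"role": "user",
                      "content": f"{PROMPT}\n\nOriginal filename: {filename}\n\n{text[:6000]}"}],
    })
    r.raise_for_status()
    return r.json()["choices"][0]["message"]["content"].strip()

commands = []
for pdf in sorted(Path("checks").glob("*.pdf")):
    text = "\n".join(page.extract_text() or "" for page in PdfReader(pdf).pages)
    commands.append(ask_llm(text, pdf.name))

Path("renames.bat").write_text("\n".join(commands) + "\n")
```

You'd want to eyeball renames.bat before executing it, since nothing here validates that the model actually produced a well-formed move command for every file.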
Basically the computer from startrek right?
The one from 2001/2061, but yeah that one is also good.
It's not black and white. For example, I use an AI bot to act as a central knowledge base for my Discord community. It nudges people to talk; even if initially it's with the bot, other people might jump in, and from there it snowballs into discussion between multiple people.
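For what it's worth, the bot itself doesn't need to be fancy. A bare-bones sketch with discord.py, answering only when mentioned and deferring to a local LLM endpoint (the endpoint URL, model name, and system prompt here are placeholders, not my actual setup), might look like:

```python
# Minimal knowledge-base bot sketch: replies only when @mentioned, so it
# nudges conversation without drowning the channel.
import os
import discord
import requests

LLM = "http://localhost:8000/v1/chat/completions"  # placeholder local endpoint

intents = discord.Intents.default()
intents.message_content = True
client = discord.Client(intents=intents)

@client.event
async def on_message(message: discord.Message):
    if message.author.bot or client.user not in message.mentions:
        return
    # Blocking HTTP call is fine for a sketch; use aiohttp in a real bot.
    r = requests.post(LLM, json={
        "model": "local",
        "messages": [
            {"role": "system", "content": "You are the community knowledge base. Be brief and invite others to chime in."},
            {"role": "user", "content": message.clean_content},
        ],
    })
    r.raise_for_status()
    answer = r.json()["choices"][0]["message"]["content"]
    await message.reply(answer[:2000])  # Discord caps messages at 2000 chars

client.run(os.environ["DISCORD_TOKEN"])
```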
It depends on how you use it.
You should at least put the whole context of Bowling Alone and maybe some Emily Bender (for spice) in the window before clicking the generate button -- you're a lot more likely to get something approximating insights out of the resulting ... product.
I agree, but we can go a step further. The real problem is that people don't know how to be alone. We aren't really left alone, due to all the technology. Without being alone, we don't really appreciate being together and just fill our time with cheap substitutes (not specific to AI). It's certainly possible to be alone without being lonely, but these cheap substitutes make that less and less likely.
AI as a cure for loneliness gets the flak that it deserves, but I will say that in my personal experience I've caught myself many times already thinking how nice it is that I have this artificial thing I can talk to about various hobbies and interests that maybe my friends aren't interested in, or don't have the time to talk about. It's really a surprising, unexpectedly positive outlet for me in this sense.
Comes down to the goal I think. When you just need to talk through something then it is helpful I'd say - a bit like rubber duck debugging.
But if you're looking for connection then less so. Personally I find even an idle chat about the weather with a random cashier to be more meaningful than a lengthy chatbot chat. On that front the empty-calorie analogy seems 100% apt.
I like the "empty calories" metaphor. Sure, if you want to have a conversation with guard rails like at a Victorian high tea, maybe AI is for you.
I don't have any use for a mechanical friend who refuses to discuss politics months before any election and who stubbornly and haughtily gives wrong answers on any number of complex topics.
Why was this flagged? This is a really good article
Why would you need AI as a cure for loneliness, when endless flavors of social media are readily available when you want to talk to real humans?
Because real humans have their own wants and preferences, and may not like talking to you, or may not like talking about what you like to talk about. And even if they do, they might not be available when it's convenient for you, and the constraints of the platform might prevent you from forming a long-term relationship, which is really what loneliness is about: not having friends who know you. An LLM will always find your discussion engaging, will remember almost everything you share with it, and won't ever make you feel unwanted. I don't think it's good or healthy that people engage with these tools in this way, but there are a lot of ways in which they are obviously superior to people.
Because a lot of real humans on social media are awful.
(Some platforms are better than others, obviously.)
Case in point, my suggestion of just talk to real humans got downvoted.
I think because the forms of interaction that social media provides aren't very fulfilling for most people
The cure for loneliness is a real connection, but AI is all about fakeness: a fake intelligence faking a conversation.
Run of the mill Convenience Society commentary.
I think this is a very interesting hypothesis. I hadn't considered the analogue of fast food. Solving one problem to cause another. I've never seen AI as a fix for a lack of human connection, and we should not as a society, IMO.
As the article points out, human connection is only valuable because it is hard and there's another human on the other end of the relationship. A perpetual yes-man or yes-woman on the end of an LLM could never be a reasonable replacement for human connection.
As social media shifted from direct communication to “broadcasting”, we lost any sense of reciprocity in these online connections and they’ve become meaningless (or worse, parasocial).
I can’t think of a way to introduce any meaningful reciprocity into a human-LLM relationship
>As the article points out, human connection is only valuable because it is hard and there's another human on the other end of the relationship.
I'm not convinced that's the only reason connection is valuable. We evolved as social primates, and human beings who feared ostracism were likely selected for during that process. I think, for many people, even the illusion of human connection likely comes with improvements to one's sense of well-being, which can have positive knock-on effects elsewhere in life. Forging real human connections would obviously be preferable, but in their absence, a crutch is better than being hobbled.