or…they’re both assholes and she’s a gaslighting psychopath. just going off the evidence at my disposal.
at this point if you’re with a partner who refuses to acknowledge your needs in the relationship there’s literally no reason to stay.
Like her need for him to answer reasonable questions? Why does the origin of the question pose a threat and why doesn’t he give examples? He’s like the rando poster who says ‘hey guys I forgot the passcode to my iPhone, got a workaround for that?’ okay buddy, so you stole a phone then.
if they were reasonable questions then she wouldn’t need AI to ask them.
she’s using AI to analyze her perception of the argument and then attacking him based on a flawed analysis.
he’s not sharing enough info to determine why they have so many arguments or what they’re about.
they’re both being shitty to each other and they both need to acknowledge the relationship is failing due to the individual flaws they have as people.
in a relationship differences can be strengths, similarities can be weaknesses, and personality flaws can be dangerous. it all depends on how those in the relationship deal with their differences, similarities, and flaws. these two obviously can’t deal.
Are you saying it would be preferable if she was given the same advice from a human or read it in a book? This guy cannot defend his point of view because it’s probably not particularly defensible, the robot is immaterial.
Are you saying it would be preferable if she was given the same advice from a human or read it in a book?
I’ll spell it out for you. Y E S
I’m not going to argue the finer points of how an LLM has literally no concept of human relationships. Or how LLMs give the least effective advice on record.
if you trust an LLM to give anything other than half-baked garbage I genuinely feel sad for any of your current and future partners.
This guy cannot defend his point of view because it’s probably not particularly defensible, the robot is immaterial.
when you have a disagreement in a long-term intimate relationship it’s not about who’s right or wrong. it’s about what you and your partner can agree and disagree on while still respecting each other.
I’ve been married for almost 10 years, been together for over 20, and we don’t agree on everything. I still respect my partner’s opinion and trust their judgment with my life.
every good relationship is based on trust and respect. both are concepts foreign to LLMs, but not impossible for a real person to comprehend. this is why getting a second opinion from a 3rd party is so effective. even if it’s advice from a book, the idea comes from a separate person.
a good marriage counselor will not choose sides, they aren’t there to judge. a counselor’s first responsibility is to build a bridge of trust with both members of the relationship, acting as a conduit to open dialogue between the two. they do this by asking questions like, “how did that make you feel?” and “tell me more about why you said that to them.”
the goal is open dialogue, and what she is doing by using ChatGPT is removing her voice from the relationship. she’s sitting back and forcing the guy to have relationship-building discussions with an LLM. now stop, now think about how fucked up that is.
in their relationship he is expressing what he needs from her: “I want you to stop using ChatGPT and just talk to me.” she refuses and ignores what he needs. in this scenario we don’t know what she needs because he didn’t communicate that. the only thing we can assume based on her actions is that she has a need to be “right”. what did we learn about relationships and being “right”? it’s counterproductive to the goals of a healthy relationship.
my point is, they’re both flawed and failing to communicate. neither is right, and introducing LLMs into a broken relationship is not the answer.
Okay so you don’t trust the robot to give relationship advice, even if that advice is identical to what humans say. The trouble is we never really know where ideas come from. They percolate up into consciousness, unbidden. Did I speak to a robot earlier? Are you speaking to a robot right now? Who knows. All I know is that when someone I love and respect asks me to explain myself I feel that I should do that no matter what.
If she’s using ChatGPT to try and understand behavioural psychology, she’s not smarter than him.
It would be one thing to go off and do some reading and come back with some resources and functional strategies for OP to avoid argumentative fallacies and navigate civil discourse, but she’s using a biased generative AI to armchair diagnose her boyfriend.
“you don’t have the emotional bandwidth to understand what I’m saying” okay, so what if he doesn’t, now what lady? Does ChatGPT have a self-development program so your boyfriend can develop the emotional intelligence required to converse with you?
Picking apart an argument is not how you resolve an argument. ChatGPT is picking it apart because she’s prompting it to do that, whereas a therapist or couples counsellor would actually help address the root issues of the argument.
He’s probably gaslighting her and she doesn’t have anyone else to turn to for a reality check.
His question amounts to ‘how can I continue to shape her reality with my narrative?’
It doesn’t matter what ChatGPT or anyone else says, he ought to be able to answer reasonable questions. Note that he doesn’t provide any specific examples. He would get roasted in the comment section.
This man is probably an asshole and this woman is probably smarter than him.