It doesn't have to. As long as it provides reasonable justification for a decision, that's better than no justification at all, e.g., like the OP said, "because I wanted to".
How is a false justification reasonable or better than admitting that the choice is arbitrary?
(especially in the framework of the comment I replied to, where they are talking about the process being internally useful. Of course it may be useful to bullshit someone else, but that's a different thing.)
Because at least some thought was put into it, which again, is better than no thought at all.
Consider your question applied to math: how is a false mathematical argument for believing a theorem better than just believing it for no reason at all? It's clearly better because you can actually point out the flaw in the specific argument.
If someone accepts a mental framework and is faced with a choice they want to make, either:
1. They can find no justification for a choice they want to make, at which point they are more likely to question whether their choice is justifiable, or their mental framework is sufficient. Either outcome should be encouraged.
2. If they find a justification that's valid in the framework, then they have an explicable basis from which to convince others they made the right choice.
3. If they find a justification that's fallacious, then pointing out either how the framework is incomplete, or their argument within the framework is incorrect is far more likely to change their minds than simply claiming their choice was wrong and trying to explain why from your own framework, which they haven't accepted (and likely won't without a lot more convincing).
Ultimately, applying thought to a problem is always better than no thought.
Doubtful. Like the OP said, "because I want to" is all the thought it requires. It's the answer children instinctively give. How much thought do they give their justifications?
Finally, if you're giving credence to a process, even a faulty one, then at least it's possible to erode confidence in the process or your application of it. To see its effectiveness, you need only browse the countless posts on reddit written by former Christians about how they came to question and ultimately give up their religion after a discussion with an atheist.
>Because at least some thought was put into it, which again, is better than no thought at all.
That's the problem. By opting for an argument that sounds reasonable to justify your pre-existing decision/idea, you don't "put thought into it". You use your emotions to pick a logical-sounding justification.
That makes it even more dangerous than admitting "I don't really have a justification, I just like this decision/idea".
That is what we call "rationalization", and it's the worst thing one can do. It's how abuse victims remain with their abuser, how racists find supporting arguments for their beliefs, etc.
>Consider your question applied to math: how is a false mathematical argument for believing a theorem better than just believing it for no reason at all? It's clearly better because you can actually point out the flaw in the specific argument.
That's a false equivalence, because in actual life, arguments about personal beliefs (about decisions/ideas/relationships/etc.) cannot always be shown to be faulty, as they are not axiomatic but casual.
The fault in such arguments often depends on qualitative judgements, guesstimates, subjective opinions about people and situations, and so on.
> By opting for an argument that sounds reasonable to justify your pre-existing decision/idea, you don't "put thought into it". You use your emotions to pick a logical-sounding justification.
So taking the time to read or consider an argument that might justify something you instinctively believe is putting zero thought into it? Come on, the amount of thought is quite literally not zero, whereas it is zero when you simply let your emotions drive your decisions. Therefore, yes, a rationalization is putting some thought into it.
> That makes it even more dangerous than admitting "I don't really have a justification, I just like this decision/idea". That is what we call "rationalization", and it's the worst thing one can do.
Is it? What evidence do you have that people who rationalize make worse or "more dangerous" decisions in any or most contexts than those who apply no thought at all to their decisions?
> It's how abuse victims remain with their abuser, how racists find supporting arguments for their beliefs, etc.
And by relying on emotions alone to make decisions, do you think victims of domestic abuse and racists will suddenly change those behaviours? The rationalization is a complete red herring here.
Domestic abuse victims stay in abusive relationships for emotional reasons. At least deconstructing their rationalizations could drive them to seek help. What argument do you think could contravene an emotional decision that was given no rationalization?
Similarly, do you think racists who don't rationalize are somehow "more rational", or more open to changing their views, than those who do? Do you have evidence of this?
> That's a false equivalence, because in actual life, arguments about personal beliefs (about decisions/ideas/relationships/etc.) cannot always be shown to be faulty, as they are not axiomatic but casual.
Which is exactly the case for emotional decisions with no rationalization. Decisions which have been subjected to some thought actually have some justification that can be challenged.
The other person may assert they're right regardless, in which case you're no better off than if they had made the decision without a rationalization. Or they may acknowledge their mistake or switch to another rationalization, either of which is a victory for truth. In the latter case, it's a retreat to an ever-narrowing gap of fallacious reasoning.
Like I said, you can see exactly this progression in hundreds of testimonials in the atheism subreddit. Former Christians admit that when challenged, they fell back on progressively more absurd rationalizations for Christian dogma until they could no longer accept it, and then they eventually became agnostics or atheists.
So I have plenty of evidence suggesting the progression I describe actually works. I've seen zero evidence that emotional decisions can be changed in any way other than some oppressive action (like jail time, social shunning, etc., all of which are dehumanizing).
In order to accept your framing that rationalization is "more dangerous", I require a comparable amount of evidence that people who don't rationalize are more rational on average, or are more easily convinced of the truth when challenged, or something along those lines. Can you provide that?
Sorry this is really random but I've been reading your contributions/comments on your profile and I've enjoyed it so much. I really like the way you think.