mirh t1_j9k2rle wrote
> First, often people respond to them differently across demographic groups, particularly different cultures,
No shit — as with anything and everything? Even semantic memory inevitably springs from a lifetime of experiences.
> and second; small, irrelevant changes in how thought experiments are worded can change entirely how we respond to them.
And that's a plus, not a negative?
Just like in normal "physical" experiments, figuring this out allows you to notice nuances and variables that you had never thought mattered or even just existed.
You can't criticize people with the hindsight of their future selves, who discovered them to be non-trivially wrong. Ironically, this is exactly the kind of insight the Gettier problem eventually leads you to.
> and their assessment of free will and responsibility differ from the one found in other parts of the world. Women have different intuitions about moral dilemmas such as the Trolley cases from men.
That's literally the observational point of running these experiments in the first place. In fact, thank god you had such simplified thought experiments to begin with, because there's no way anything more convoluted would have given you a better time.
Beyond the most obvious "you should always be careful with X" platitude, this article is absolute trash.
> Of particular interest is the recent emphasis on conceptual engineering, i.e. on attempts to reform philosophically significant concepts.
That's known as ordinary language philosophy, and it's about a hundred years old by now.
Wizzdom t1_j9ki66s wrote
I think he should have focused more on why it could be problematic to apply thought experiments to real world applications such as AI or self-driving cars. That would be a much more interesting conversation imo.
mirh t1_j9ko0fn wrote
I could swear I had read a very insightful comment/article in this regard, but I cannot find it anymore...
Anyhow, I see where you are coming from. But then you aren't talking about thought experiments "per se" anymore (this dude even lowkey criticizes Gettier somehow!) but just warning not to talk out of one's ass like in any other kind of argument.
Like, those atrocious "should the car kill the elderly person or the baby" dilemmas are either more of an engineering problem than real philosophy, or they're ethics from somebody who thinks either too sanctimoniously about people or too naively about computers.