Linkerbaan@lemmy.world to Mildly Infuriating@lemmy.world · English · 10 months ago
Google Gemini refuses to answer questions about deaths in Gaza but has no problem answering the same question for Ukraine. (image post, lemmy.world, 63 comments)
Viking_Hippie@lemmy.world · 10 months ago
> Why do I find it so condescending?
Because it absolutely is. It's almost as condescending as it is evasive.
Lad@reddthat.com · 10 months ago
For me, the censorship and condescending responses are the worst thing about these LLM/AI chatbots. I WANT YOU TO HELP ME, NOT LECTURE ME
Viking_Hippie@lemmy.world · 10 months ago
That sort of simultaneously condescending and circular reasoning makes it seem like they already have been lol