Linkerbaan@lemmy.world (banned) to Mildly Infuriating@lemmy.world · 1 year ago
Google Gemini refuses to answer questions about deaths in Gaza but has no problem answering the same question for Ukraine.
eatthecake@lemmy.world · 1 year ago
Why do I find it so condescending? I don't want to be schooled on how to think by a bot.
Viking_Hippie@lemmy.world · 1 year ago
> Why do I find it so condescending?
Because it absolutely is. It's almost as condescending as it is evasive.
Lad@reddthat.com · 1 year ago
For me, the censorship and condescending responses are the worst thing about these LLM/AI chatbots. I WANT YOU TO HELP ME, NOT LECTURE ME
Viking_Hippie@lemmy.world · 1 year ago
That sort of simultaneously condescending and circular reasoning makes it seem like they already have been lol