if you have a soup of all liquids and a sieve that only lets coffee and ice cream through, it produces coffee ice cream (metaphor, don’t think too hard about it)
that’s how gen ai works. each step sieves out raw data to get closer to the prompt.
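a toy sketch of that "sieve" idea, if it helps: start from pure noise and repeatedly nudge it toward a target standing in for the prompt. this is only an illustration of the stepwise-refinement intuition — a real diffusion model *learns* the denoising step from data, and the target here is a made-up stand-in, not a real embedding.

```python
import random

def toy_generate(target, steps=50, seed=0):
    """Start from random noise and sieve it toward `target` step by step."""
    rng = random.Random(seed)
    x = [rng.uniform(-1, 1) for _ in target]  # begin as pure noise
    for t in range(steps):
        strength = (t + 1) / steps            # later steps sieve harder
        # nudge each value a little closer to the target
        x = [xi + strength * 0.2 * (ti - xi) for xi, ti in zip(x, target)]
    return x

# pretend this vector is "coffee ice cream" in the model's space
target = [0.9, -0.3, 0.5]
out = toy_generate(target)
print([round(v, 2) for v in out])  # ends up very close to the target
```

each pass keeps a bit more of what matches the prompt and discards a bit more of what doesn't, which is the whole metaphor in three lines of arithmetic.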
Nobody trained them on what things made out of spaghetti look like, but they can generate them because smushing multiple things together is precisely what they do.
What the fuck is AI being trained on to produce this stuff?
Pictures of clothed children and naked adults.
Given the “we spared no expense” attitude to the rest of the data these things are trained on, I fear that may be wishful thinking…
Well, that’s somewhat reassuring.
Still reprehensible that it’s being used that way, of course.