Asking ChatGPT to Repeat Words ‘Forever’ Is Now a Terms of Service Violation
misophist @ misophist @lemmy.world Posts 0Comments 176Joined 2 yr. ago
misophist @ misophist @lemmy.world
Posts
0
Comments
176
Joined
2 yr. ago
That is entirely not the point. The issue isn't the infinitely repeated word. The issue is that requesting an infinitely repeated word has been found to semi-reliably cause LLM hallucinations that devolve into revealing training data. In short, it is an unintended exploit and until they have it reliably patched, they are making it against their TOS to try to exploit their systems.