No. Even if you tried, the AI companies don't care. You don't have the resources to enforce it, and slap-on-the-wrist fines don't bother big tech companies.
First, the obvious: legal means of stopping things like this have never actually succeeded in stopping them.
Second, what would you lose from your book being used to train LLMs, and what would you gain from preventing that? Are you afraid of lost sales? Or maybe plagiarism? Those problems have existed for a loooong time, LLMs or not. Or is it that you don't generally like the current AI craze and don't want to fuel it?
Recent court rulings have affirmed that it is OK for models to read copyrighted material as part of training, as long as the companies pay for it.
That being said, some AI companies are offering ways to opt out.
I hope that sometime soon the US and other major countries backtrack on that decision and make it fully opt-in rather than opt-out.
What's wrong with an AI reading your book? They have to pay for it, so you will make money.
Don't show it to anybody.
Or only show it to people in a place where they have no way of copying it onto a digital medium.
Include the gamer word on every page and the whole book might get filtered out of the dataset.
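For what it's worth, the filtering that comment is betting on usually works at the document level: if a blocklisted word appears anywhere, the whole document gets dropped from the training set. Below is a minimal toy sketch of that idea; the blocklist contents and function name are made up for illustration, not taken from any real pipeline.

```python
# Toy sketch of document-level blocklist filtering, as sometimes used in
# dataset cleaning. The blocklist entries below are placeholders.
BLOCKLIST = {"badword1", "badword2"}

def keep_document(text: str) -> bool:
    """Return False if the document contains any blocklisted word."""
    words = {w.strip(".,;:!?\"'()").lower() for w in text.split()}
    return BLOCKLIST.isdisjoint(words)

docs = [
    "A perfectly ordinary page of prose.",
    "A page that contains badword1 somewhere in the middle.",
]
print([d for d in docs if keep_document(d)])  # only the first page survives
```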
Never