Reporting known instances of transactions which could result in the training of a large AI model with potential capabilities that could be used in malicious cyber-enabled activity.
Which is to say: every language model and image generator with >100M params. In other words, IaaS providers must report nearly every transaction, since you can train a LoRA module for a small model on a few hundred ARM cores, or on nearly any datacenter GPU.
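For a sense of how low that bar is, here's a rough sketch (assuming the Hugging Face transformers and peft libraries; the exact numbers are illustrative): attaching a LoRA adapter to GPT-2, a model already over the 100M-parameter line, leaves only a few hundred thousand trainable weights, which is why nearly any GPU, or a stack of CPU cores, is enough to fine-tune it.

```python
# Minimal sketch: how little compute a LoRA fine-tune of a >100M-param model needs.
# Assumes `transformers` and `peft` are installed; GPT-2 small is ~124M parameters.
from transformers import AutoModelForCausalLM
from peft import LoraConfig, get_peft_model

base = AutoModelForCausalLM.from_pretrained("gpt2")  # ~124M params, over the 100M line

lora_cfg = LoraConfig(
    r=8,                        # low-rank dimension of the adapter
    lora_alpha=16,
    target_modules=["c_attn"],  # GPT-2's fused attention projection
    fan_in_fan_out=True,        # GPT-2 uses Conv1D, which stores weights transposed
    lora_dropout=0.05,
    task_type="CAUSAL_LM",
)

model = get_peft_model(base, lora_cfg)
model.print_trainable_parameters()
# roughly: trainable params: ~0.3M || all params: ~124M || trainable%: ~0.24
```

Only about 0.3M of the 124M weights are trained, so the job fits comfortably on a single datacenter GPU, and, slowly, on CPUs.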
To be clear, though, the rule doesn't threaten you, the GPU user, with prison time. The prison time and/or fines fall on the noncompliant IaaS provider, which means that cloud GPUs (and possibly every other cloud resource) will become much more expensive and harder to access.