I remember, probably incorrectly, a court case from a few decades ago in which the police wanted to (or did) track suspects by attaching GPS units to their cars. The case boiled down to this: a court order is needed to place a GPS unit on someone’s car, but no court order is required to drive around following someone. The rationale was that physically tailing someone is self-regulating, because it is expensive and difficult to do. The police are unlikely to abuse their power or violate people’s rights that way because it’s too much trouble; they don’t have the time or the manpower for it. At least that’s how I remember it.

Training an AI is, in my mind, like the police placing a GPS unit in any car they deem suspect. The comparison I’m drawing is between a human looking at your work and copying it versus an AI looking at your work and copying it. The AI looks at your work with far less effort than a human would, and with just as little effort it will take your job. We don’t need to regulate humans copying other humans’ work, because doing so is hard. We do need to regulate AI, because the ease with which it copies is exactly what invites abuse.

The next time someone like me says that there is no difference between an AI looking at your work and a human looking at your work, tell them they are wrong. I was. We need to think about regulation when abuse comes easily.