Whitewashing A.I.

Lately, I have been seeing a lot of A.I. companies putting supposedly “ethical” frameworks in place, in an apparent attempt to make their services look ethically compliant.

In some cases, this looks like a genuine effort to at least seek the approval of the artists whose work is fed to the machine in order to “train” it. But in other cases, it looks like cynical whitewashing: slapping on a legal disclaimer while still allowing anyone to upload the work of others, as long as they pretend not to have stolen it. We’ve seen this before, in illegal file sharing. Wink wink, nudge nudge, this work is “not” someone else’s, and I “promise” I have the “right” to distribute it.

We can expect to see many ways in which the tech industry will try to circumvent the ethical dilemma at the heart of this. But regardless of the authenticity and sincerity behind those attempts, one thing they simply cannot avoid is the fact that generative A.I. output is replacing its human counterparts.

While that may be acceptable if we are talking about menial tasks like street cleaning, or outer space mining, it is emphatically not okay when it purports to remove humans from profoundly human endeavors like art, or literature, or music. Nothing good will come of replacing humans in that context. I don’t care if you wish to make a qualitative argument, that only a select few artists are worth respecting, and that the others somehow deserve to have the rug pulled out from under them in this way. (Yes, I have heard that argument made many times: “why should art be sacred?”)

We can find productive uses for machines in many different aspects of society and human life, but replacing artists and poets is simply a monstrous, inhumane proposition. Why would we refer to the arts as “the Humanities” if this weren’t about humanity and humanism as a whole?

Undermining humanism by depleting the intellectual capital and artistic abilities that make us human is not an effort we should ever accept as a species.
