I hardly need to mention that there are a lot of feelings about AI-generated content, and LL has already heard quite a lot of feedback about it from residents.
I'd like to suggest a thing we can do about it!
Let me first admit that no solution is perfect, and that an AI-using creator could compile, train, and operate AI independently without the tech I mention below. But the 99% use case is that people generate content using ready-made solutions, and an enormous number of these AI-generation businesses have committed to a measure of transparency and accountability that I want to mention.
If a person generates AI content (marketplace images, for example) using Midjourney, OpenAI (ChatGPT), NanoBanana (Google Gemini), Firefly (Adobe), Runway AI... or any of the myriad next-tier platforms implementing Stable Diffusion 3, then ALL of these tools automatically embed a content credential into the work. Cropping, resizing, or trivial modifications to the work do not alter the credential. Only significant human-made alterations to the content of the image result in the credential being changed.
The presence of a content credential does not, I should note, mean "made with AI". It means "there's a content credential here." It's important because all major genAI tools DO generate a credential that describes the origin of the work as a genAI tool. The content credential carries a modification history with it. This means that even the creator of an original, handmade work could embed a credential, and if their work were taken and modified by AI, the credential would list out both versions.
Again, all major genAI tools use this, and the credentials are already there today. A credential is not a value judgement, and it is not an ethical judgement. It simply sits inside the AI-generated content and helps us see what was done with a piece of content. The customer/end user makes the value judgement for themselves.
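Just to illustrate that the information really is sitting inside the file already, here's a rough sketch (TypeScript for Node, not anything LL would actually ship) that walks a JPEG's segment markers and checks whether an APP11 segment carries the "c2pa" JUMBF label, which is where Content Credentials are embedded in JPEGs. This is only a detection heuristic; real validation should go through the official C2PA SDK or the c2patool CLI, and the file name below is made up.

```typescript
// Rough heuristic: does this JPEG carry an embedded C2PA manifest?
// NOT a substitute for real validation (use the official C2PA SDK or
// c2patool for that) -- it just walks the JPEG segment markers and
// looks for APP11 (0xFFEB) segments whose payload contains the "c2pa"
// JUMBF label, which is where Content Credentials live in JPEG files.

import { readFileSync } from "node:fs";

function hasContentCredential(path: string): boolean {
  const bytes = readFileSync(path);

  // A JPEG starts with the SOI marker 0xFFD8.
  if (bytes.length < 4 || bytes[0] !== 0xff || bytes[1] !== 0xd8) {
    return false;
  }

  let offset = 2;
  while (offset + 4 <= bytes.length) {
    if (bytes[offset] !== 0xff) break; // lost sync; give up
    const marker = bytes[offset + 1];

    // SOS (0xDA) starts the entropy-coded image data; metadata ends here.
    if (marker === 0xda) break;

    // Segment length is big-endian and includes its own two bytes.
    const length = bytes.readUInt16BE(offset + 2);
    const payload = bytes.subarray(offset + 4, offset + 2 + length);

    // APP11 (0xEB) is where JUMBF/C2PA data is embedded in JPEG.
    if (marker === 0xeb && payload.includes(Buffer.from("c2pa"))) {
      return true;
    }
    offset += 2 + length;
  }
  return false;
}

// Example (hypothetical file name):
console.log(hasContentCredential("marketplace-listing.jpg"));
```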
LL has seen that customers want to know. The information is already within the file. The platform just needs to expose that information. A simple "CR" tag (using the established logo from the C2PA standard) on the corner of any image on the marketplace, with a hover-tip to explain the content credential, would go a long way. Further, adding a property for texture assets generated for use within SL itself would be a nice next step (but less trivial to implement).
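And for the "CR" tag idea, here's a hedged sketch of how small the front-end piece could be, assuming the marketplace backend has already flagged which images carry a credential. The class name, the data attribute, and the tooltip wording are all invented for illustration, and the real badge should use the official Content Credentials "CR" pin artwork rather than plain text:

```typescript
// Sketch of the marketplace-side UX: overlay a small "CR" badge with a
// hover tooltip on any listing image that has a content credential.
// Class names, the data attribute, and the tooltip text are made up
// for illustration -- the point is how little UI this actually takes.

function addCredentialBadge(img: HTMLImageElement): void {
  // Assume the backend already set data-has-credential="true" after
  // inspecting the uploaded file (e.g. with the C2PA SDK).
  if (img.dataset.hasCredential !== "true") return;

  const wrapper = document.createElement("span");
  wrapper.style.position = "relative";
  wrapper.style.display = "inline-block";
  img.replaceWith(wrapper);
  wrapper.appendChild(img);

  const badge = document.createElement("span");
  badge.textContent = "CR";
  badge.title =
    "This image carries a Content Credential (C2PA). " +
    "View its origin and edit history.";
  Object.assign(badge.style, {
    position: "absolute",
    top: "4px",
    right: "4px",
    padding: "2px 4px",
    font: "bold 11px sans-serif",
    background: "rgba(0,0,0,0.65)",
    color: "#fff",
    borderRadius: "3px",
    cursor: "help",
  });
  wrapper.appendChild(badge);
}

// Apply to every flagged listing image on the page.
document
  .querySelectorAll<HTMLImageElement>("img.listing-image")
  .forEach(addCredentialBadge);
```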
For more info about the industry-wide standard C2PA (Coalition for Content Provenance and Authenticity): https://c2pa.org/
The specification is available here:
This isn't a fun, easy feature, but it feels like an important one.