Web/Marketplace Features

Marketplace and SL websites
  • Search existing ideas before submitting
  • Use support.secondlife.com for customer support issues
  • Keep posts on-topic
Thank you for your ideas!
Add an "AI Generated" Disclosure and "Do Not Show" Filter on Marketplace
Please add a required boolean field to Marketplace listing creation that asks the creator to disclose whether AI generative tools were used in part or all of the product's creation. Add this field to the 'Do Not Show' section of the Marketplace search filters, and allow listings to be reported if they do not honestly disclose it.

For the health of the market, the use of AI tools needs to be disclosed by creators. Personally, I would prefer that no AI generation be supported at all. I understand that tolerance of it is an effort to avoid the slippery slope of censorship, and that the question is complicated by the sheer variety of AI tools that currently exist. But regardless of opinions or complications, its existence cannot be ignored. So let's focus on what can be done, by giving some power of choice to the consumer instead.

Users should be given a way to choose products with or without AI generation while shopping, and naturally, creators who choose not to disclose or who lie should face the same reportability and moderation as any other creator who lies while listing. The moderation half of this should not necessarily focus on trying to prove the use of AI generation tools; it doesn't actually need to. Creators who have done the work to provide support and properly prepare AI-generated content for its use in SL are unlikely to warrant being reported, and should not need to lie about the usage anyway. It is products that blatantly lack support and have egregious LOD & LI values that would be inclined to lie, and that users would want to report. This is the same for ANY product currently listed: if you buy it and you don't get what was described, it should be reportable. The same goes for AI-generated products; they should not get a pass because you have to 'prove' something first, when the fact remains that you didn't get what you thought you bought. So do not focus on "did you use a tool?"; focus on "can you fix & support the product?"

If it can't be supported, then disclose that it was made via AI generation and move on. This is a similar complication to full-perm and gacha resales, but also to ripped models, copybotting, etc., in that they will all lack support because they were not made directly by the creator of the listing. There is a big difference between full-perm and gacha resale versus ripping and botting, and that difference is honest disclosure.
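The proposal above amounts to one required boolean on the listing record plus one new 'Do Not Show' filter option. A minimal Python sketch of that idea; the field and function names (`ai_generated`, `hide_ai_generated`, `search`) are hypothetical illustrations, not an actual Marketplace schema:

```python
from dataclasses import dataclass

@dataclass
class Listing:
    title: str
    ai_generated: bool  # required disclosure at listing creation (hypothetical field)

def search(listings, hide_ai_generated=False):
    """Apply the proposed 'Do Not Show' filter: drop disclosed AI listings on request."""
    if hide_ai_generated:
        return [l for l in listings if not l.ai_generated]
    return list(listings)

catalog = [Listing("Hand-modeled chair", False), Listing("AI-textured rug", True)]
print([l.title for l in search(catalog, hide_ai_generated=True)])  # ['Hand-modeled chair']
```

The point of the sketch is that the consumer-facing half of the feature is cheap; the cost sits in moderation of dishonest disclosure, as argued above.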
Mandatory information if off-world servers are used
Complex projects and products have always relied on external servers under the control of the specific designer/creator who made the product. This has always carried risk: if the designer goes out of business, or the external server is compromised or its data leaked, the result is at least a disruption of service, and ultimately a paid-for product that no longer works. In the past, using a setup like this required skill and experience, leading to the educated assumption that the off-world servers would be professionally maintained and secured. A prominent example would be CasperVend.

Fast-forward to 2026, where LLMs give laypeople the tools to set up complex client-server solutions, connecting LLM-generated LSL scripts with LLM-generated off-world server scripts. It's a sad truth that those projects often lack basic cybersecurity and scalability, can leak data, or can incur sudden, often dramatic, cost spikes for their creators. The result is that as fast as those products land on the Marketplace, just as fast they become inoperable, and a customer is left with the SL side of such a product and a non-functioning off-world server, rendering their legitimate purchase useless.

I suggest a mandatory field for new or updated Marketplace listings, where the creator has to declare whether their product uses off-world components or not. That way, a potential customer can decide for themselves whether they trust the creator's expertise enough to believe the product will still work in one year or five, or whether they would rather not take the risk. I am aware that this is hard to police, and that it WILL be abused (i.e. false assurance will be given). But in case of conflict, it would be easy for the Lab to check any given product for off-world HTTP connections, and if a creator has given a false assurance, this can be dealt with.

Any creator serious about cybersecurity, privacy, scalability, and reliability can state so in their product description and/or the "Profile" part of their Marketplace store. Ultimately, the customers win and are enabled to make their own decisions.
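The claim above that it would be easy to check a product for off-world HTTP connections can be illustrated with a rough sketch: scanning a script's source for the LSL functions that open or accept outside HTTP traffic (`llHTTPRequest`, `llRequestURL`, `llRequestSecureURL` are real LSL functions; the audit function `uses_offworld_http` and the idea of the Lab running such a scan are assumptions, not a described LL process):

```python
import re

# LSL calls that make or accept off-world HTTP connections
OFFWORLD_CALLS = ("llHTTPRequest", "llRequestURL", "llRequestSecureURL")

def uses_offworld_http(script_source: str) -> bool:
    """Flag a script that calls any off-world HTTP function (hypothetical audit check)."""
    pattern = r"\b(" + "|".join(OFFWORLD_CALLS) + r")\s*\("
    return re.search(pattern, script_source) is not None

print(uses_offworld_http('llHTTPRequest("https://example.com/api", [], "");'))  # True
print(uses_offworld_http('llSay(0, "hello");'))  # False
```

A text scan like this would only be a first pass (it misses obfuscation and flags commented-out calls), but it shows why verifying a false "no off-world components" assurance after a complaint is plausible.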