NSFW DApp – Renting AI Trained Model on iExec
NSFW (Not Safe For Work) detection filters out images that are inappropriate in a professional context. A typical use case: when a user uploads a new picture to a web service, the service automatically checks whether the image is appropriate to show to other users.
In the iExec NSFW dapp, the requester (end-user) can choose which trained model to use to filter an image. The default is the standard open_nsfw model, historically maintained by Yahoo. The end-user workflow is straightforward: choose the dataset (model) you want to use, select the picture to be analyzed, and run the app.
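With the iExec SDK CLI, that workflow can be sketched roughly as below. The app and dataset addresses are placeholders (the real ones come from the iExec marketplace), and the image URL is purely illustrative; this is a sketch of the general `iexec app run` flow, not the dapp's exact invocation.

```shell
# Placeholder addresses -- replace with the actual NSFW app and the
# model dataset you picked from the iExec marketplace.
APP=0x0000000000000000000000000000000000000000
MODEL_DATASET=0x0000000000000000000000000000000000000000

# Run the app against the chosen model, passing the image to analyze
# as the task argument, and wait for the task to complete.
iexec app run "$APP" \
  --dataset "$MODEL_DATASET" \
  --args "https://example.com/picture.jpg" \
  --watch

# When the task is done, fetch the result archive for the task id
# printed by the previous command.
iexec task show <taskid> --download
```

Selecting a different dataset address is all it takes to switch models: the app code stays the same, and the dataset carries the trained weights.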