
Bubble.io Plugin

Vettly's Bubble.io plugin lets you add AI-powered content moderation to your no-code apps.

Installation

  1. Go to the Bubble Plugin Marketplace
  2. Search for "Vettly - AI Content Moderation"
  3. Install the plugin to your app

Setup

  1. Get your API key from vettly.dev/dashboard
  2. In Bubble, go to Plugins → Vettly
  3. Paste your API key in the api_key field

Available Actions

Check Text Content

Moderate text content in real time.

Inputs:

| Field | Type | Required | Description |
| --- | --- | --- | --- |
| Content | Text | Yes | The text to moderate |
| Policy ID | Text | No | Custom policy (default: "default") |
| Use Case | Text | No | Context: comment, review, message, etc. |

Outputs:

| Field | Type | Description |
| --- | --- | --- |
| Safe | Yes/No | Is the content safe? |
| Flagged | Yes/No | Was the content flagged for review? |
| Action | Text | allow, warn, flag, or block |
| Decision ID | Text | Reference ID for this decision |
| Top Category | Text | Highest-scoring category |
| Top Score | Number | Score of the top category (0-1) |
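
If you need the same check outside Bubble (for example from a server-side script), the sketch below shows what an equivalent direct API call could look like in TypeScript. The endpoint URL, request field names, and response shape are assumptions inferred from the action's inputs and outputs above, not confirmed API details; consult the Vettly API docs for the actual contract.

```typescript
// Hypothetical direct call mirroring the "Check Text Content" action.
// The endpoint path, request fields, and response shape are assumptions
// based on the plugin's documented inputs and outputs.
interface TextCheckResult {
  safe: boolean;        // maps to the plugin's "Safe" output
  flagged: boolean;     // maps to "Flagged"
  action: "allow" | "warn" | "flag" | "block"; // maps to "Action"
  decision_id: string;  // maps to "Decision ID"
  top_category: string; // maps to "Top Category"
  top_score: number;    // maps to "Top Score" (0-1)
}

async function checkText(apiKey: string, content: string): Promise<TextCheckResult> {
  const res = await fetch("https://api.vettly.dev/v1/moderate/text", { // assumed URL
    method: "POST",
    headers: {
      Authorization: `Bearer ${apiKey}`,
      "Content-Type": "application/json",
    },
    body: JSON.stringify({
      content,                // required, as in the plugin
      policy_id: "default",   // optional custom policy
      use_case: "comment",    // optional context
    }),
  });
  if (!res.ok) throw new Error(`Moderation request failed: ${res.status}`);
  return res.json() as Promise<TextCheckResult>;
}
```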

Check Image

Moderate image content for NSFW, violence, etc.

Inputs:

| Field | Type | Required | Description |
| --- | --- | --- | --- |
| Image URL | Text | Yes | URL of the image to moderate |
| Policy ID | Text | No | Custom policy |

Outputs: Same as Check Text Content

Check Multimodal Content

Moderate text and images together.

Inputs:

| Field | Type | Required | Description |
| --- | --- | --- | --- |
| Text | Text | No | Text content |
| Image URLs | List of texts | No | Up to 10 image URLs |
| Policy ID | Text | No | Custom policy |
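
As a rough illustration of how the multimodal inputs combine into one request, here is a hypothetical TypeScript sketch. The endpoint and field names are assumptions based on the input table above, not documented API values; only the 10-image limit comes from the plugin description.

```typescript
// Hypothetical request mirroring the "Check Multimodal Content" action.
// Endpoint and field names are assumptions, not confirmed API details.
interface MultimodalCheckRequest {
  text?: string;          // optional text content
  image_urls?: string[];  // up to 10 image URLs, per the plugin's limit
  policy_id?: string;     // optional custom policy
}

async function checkMultimodal(apiKey: string, req: MultimodalCheckRequest) {
  if (req.image_urls && req.image_urls.length > 10) {
    throw new Error("The plugin accepts at most 10 image URLs per check");
  }
  const res = await fetch("https://api.vettly.dev/v1/moderate/multimodal", { // assumed URL
    method: "POST",
    headers: { Authorization: `Bearer ${apiKey}`, "Content-Type": "application/json" },
    body: JSON.stringify(req),
  });
  if (!res.ok) throw new Error(`Moderation request failed: ${res.status}`);
  return res.json();
}
```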

Batch Check Content

Moderate multiple items at once (up to 100).

Inputs:

| Field | Type | Required | Description |
| --- | --- | --- | --- |
| Content Items | List of texts | Yes | Items to moderate |
| Policy ID | Text | No | Custom policy |

Outputs:

| Field | Type | Description |
| --- | --- | --- |
| Total | Number | Total items processed |
| Safe Count | Number | Items that passed |
| Flagged Count | Number | Items flagged for review |
| Blocked Count | Number | Items blocked |
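
For context on how the batch outputs aggregate per request, here is a hedged TypeScript sketch of a hypothetical direct batch call. The endpoint, request body, and response field names are assumptions derived from the inputs and outputs above; only the 100-item limit is stated by the plugin.

```typescript
// Hypothetical batch call mirroring the "Batch Check Content" action.
// Endpoint and field names are assumptions based on the documented
// inputs (list of items, optional policy) and aggregate outputs.
interface BatchCheckResult {
  total: number;         // "Total" output
  safe_count: number;    // "Safe Count"
  flagged_count: number; // "Flagged Count"
  blocked_count: number; // "Blocked Count"
}

async function batchCheck(
  apiKey: string,
  items: string[],
  policyId?: string,
): Promise<BatchCheckResult> {
  if (items.length > 100) {
    throw new Error("Batch checks are limited to 100 items");
  }
  const res = await fetch("https://api.vettly.dev/v1/moderate/batch", { // assumed URL
    method: "POST",
    headers: { Authorization: `Bearer ${apiKey}`, "Content-Type": "application/json" },
    body: JSON.stringify({ items, policy_id: policyId }),
  });
  if (!res.ok) throw new Error(`Batch moderation failed: ${res.status}`);
  return res.json() as Promise<BatchCheckResult>;
}
```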

Example Workflows

Moderate Comment Before Saving

When Form is submitted:
  → Plugin: Vettly - Check Text Content
      content = Input Comment's value
  → Only when Result's action is "block":
      → Show alert "Your comment violates our guidelines"
  → Only when Result's action is not "block":
      → Create new Comment
          flagged = Result's flagged

Moderate Profile Picture

When Picture Uploader's value changes:
  → Plugin: Vettly - Check Image
      image_url = Picture Uploader's value
  → Only when Result's safe is "no":
      → Show alert "This image is not allowed"
      → Reset Picture Uploader

Moderate Listing Before Publish

When Button "Publish" is clicked:
  → Plugin: Vettly - Check Multimodal Content
      text = Input Title's value + Input Description's value
      image_urls = Repeating Group Images' list of URLs
  → Only when Result's action is "block":
      → Show alert "Your listing contains prohibited content"
  → Only when Result's action is not "block":
      → Create new Listing

Content Categories

Vettly detects these categories:

| Category | Description |
| --- | --- |
| hate_speech | Discrimination, slurs, dehumanizing content |
| harassment | Bullying, threats, intimidation |
| violence | Graphic violence, gore, threats of harm |
| self_harm | Self-injury, suicide content |
| sexual | NSFW, explicit content |
| spam | Promotional spam, scams |
| profanity | Swear words, vulgar language |
| scam | Fraud, phishing attempts |
| illegal | Drug sales, illegal activities |

Pricing

The Bubble plugin is free to install. You pay for Vettly API usage based on your plan:

| Plan | Text Checks | Image Checks | Video Checks | Price |
| --- | --- | --- | --- | --- |
| Developer | 2,000/mo | 100/mo | 25/mo | $0 |
| Growth | 50,000/mo | 5,000/mo | 1,000/mo | $49/mo |
| Pro | 250,000/mo | 25,000/mo | 5,000/mo | $149/mo |
| Enterprise | Custom | Custom | Custom | Custom |

See vettly.dev/pricing for details.

Support