fal.toolkit.image.nsfw_filter package

Submodules

fal.toolkit.image.nsfw_filter.env module

fal.toolkit.image.nsfw_filter.env.get_requirements()
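`get_requirements()` is undocumented here; in fal toolkits a helper of this shape typically returns the pip requirement strings the filter needs at runtime. The sketch below is an illustrative assumption of that pattern; the package names and the return value are guesses, not the toolkit's actual dependencies:

```python
from typing import List

def get_requirements() -> List[str]:
    """Hypothetical sketch: return pip requirement strings for the
    NSFW filter's runtime dependencies. The packages listed are
    assumptions for illustration, not the real values."""
    return [
        "torch",         # assumed: inference backend
        "transformers",  # assumed: pretrained NSFW classifier
        "pillow",        # assumed: image decoding
    ]
```

A caller would feed this list into whatever environment builder installs the filter's dependencies.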

fal.toolkit.image.nsfw_filter.inference module

class fal.toolkit.image.nsfw_filter.inference.NSFWImageDetectionInput(**data)

Bases: BaseModel

image_url: str
model_config: ClassVar[ConfigDict] = {}

Configuration for the model; it should be a dictionary conforming to [ConfigDict][pydantic.config.ConfigDict].
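`NSFWImageDetectionInput` is a pydantic `BaseModel` with a single required `image_url` field. The stand-in below uses plain Python (to keep the sketch dependency-free) to show the same shape and the type check pydantic would perform on construction:

```python
class NSFWImageDetectionInput:
    """Plain-Python stand-in for the pydantic model: one required
    str field, image_url, validated on construction."""

    def __init__(self, *, image_url: str) -> None:
        if not isinstance(image_url, str):
            raise TypeError("image_url must be a str")
        self.image_url = image_url

# Constructed the same way the real model is used:
inp = NSFWImageDetectionInput(image_url="https://example.com/cat.png")
```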

class fal.toolkit.image.nsfw_filter.inference.NSFWImageDetectionOutput(**data)

Bases: BaseModel

model_config: ClassVar[ConfigDict] = {}

Configuration for the model; it should be a dictionary conforming to [ConfigDict][pydantic.config.ConfigDict].

nsfw_probability: float

fal.toolkit.image.nsfw_filter.inference.check_nsfw_content(pil_image)

fal.toolkit.image.nsfw_filter.inference.run_nsfw_estimation(input)
Return type:

NSFWImageDetectionOutput

fal.toolkit.image.nsfw_filter.model module

fal.toolkit.image.nsfw_filter.model.get_model()

Module contents

class fal.toolkit.image.nsfw_filter.NSFWImageDetectionInput(**data)

Bases: BaseModel

image_url: str
model_config: ClassVar[ConfigDict] = {}

Configuration for the model; it should be a dictionary conforming to [ConfigDict][pydantic.config.ConfigDict].

class fal.toolkit.image.nsfw_filter.NSFWImageDetectionOutput(**data)

Bases: BaseModel

model_config: ClassVar[ConfigDict] = {}

Configuration for the model; it should be a dictionary conforming to [ConfigDict][pydantic.config.ConfigDict].

nsfw_probability: float

fal.toolkit.image.nsfw_filter.run_nsfw_estimation(input)
Return type:

NSFWImageDetectionOutput
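The package re-exports the inference API at the top level, so callers use `fal.toolkit.image.nsfw_filter.run_nsfw_estimation` directly. A typical caller then thresholds `nsfw_probability`; the 0.5 cutoff below is an illustrative choice, not one prescribed by the toolkit:

```python
NSFW_THRESHOLD = 0.5  # assumed cutoff; tune per application

def is_safe(nsfw_probability: float) -> bool:
    """Return True when the classifier score falls below the cutoff."""
    return nsfw_probability < NSFW_THRESHOLD
```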