Detect toxic or harmful language
Request Body:
- `input.text` (string, required): The text to scan for toxic content.
- `config.threshold` (float, optional): Detection threshold (default: `0.5`).
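A minimal sketch of what a request might look like, assuming a JSON body that mirrors the dotted field names above. The endpoint URL, authentication header, and response handling are illustrative assumptions; only the `input.text` and `config.threshold` fields and the `0.5` default come from the documentation.

```python
import os
import requests

# Hypothetical endpoint path and auth scheme -- not confirmed by the docs above.
API_URL = "https://api.traceloop.com/v2/guard/toxicity"

# Request body assumed to mirror the documented dotted field names:
# input.text (required) and config.threshold (optional, default 0.5).
payload = {
    "input": {
        "text": "You are a worthless idiot.",  # text to scan
    },
    "config": {
        "threshold": 0.5,  # optional; 0.5 is the documented default
    },
}

response = requests.post(
    API_URL,
    json=payload,
    headers={"Authorization": f"Bearer {os.environ['TRACELOOP_API_KEY']}"},
    timeout=10,
)
response.raise_for_status()
print(response.json())  # response shape is not specified in this section
```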