Create
client.Moderations.New(ctx, body) (*ModerationNewResponse, error)
POST /moderations

Classifies whether the given messages are potentially harmful across several categories.

Parameters
body ModerationNewParams

Messages
param.Field[[]MessageUnion]

List of messages in the conversation.

UserMessage struct

A message from the user in a chat conversation.

Content

The content of the user message, which can include text and other media.

Role UserMessageRole

Must be "user" to identify this as a user message.

UserMessageRoleUser const
"user"
SystemMessage struct

A system message providing instructions or context to the model.

Content

The content of the system message.

Role SystemMessageRole

Must be "system" to identify this as a system message.

SystemMessageRoleSystem const
"system"
ToolResponseMessage struct

A message representing the result of a tool invocation.

Content

The content of the tool response, which can include text and other media.

Role ToolResponseMessageRole

Must be "tool" to identify this as a tool response.

ToolResponseMessageRoleTool const
"tool"
ToolCallID string

Unique identifier for the tool call this response is for.

CompletionMessage struct

A message containing the model's (assistant) response in a chat conversation.

Role CompletionMessageRole

Must be "assistant" to identify this as the model's response.

CompletionMessageRoleAssistant const
"assistant"
Content CompletionMessageContentUnion
optional

The content of the model's response.

string
MessageTextContentItem struct

A text content item.

Text string

Text content.

Type

Discriminator type of the content item. Always "text".

MessageTextContentItemTypeText const
"text"
StopReason CompletionMessageStopReason
optional

The reason why we stopped. Options are:
- "stop": The model reached a natural stopping point.
- "tool_calls": The model finished generating and invoked a tool call.
- "length": The model reached the maximum number of tokens specified in the request.

CompletionMessageStopReasonStop const
"stop"
CompletionMessageStopReasonToolCalls const
"tool_calls"
CompletionMessageStopReasonLength const
"length"
ToolCalls array
optional

The tool calls generated by the model, such as function calls.

ID string

The ID of the tool call.

Function CompletionMessageToolCallFunction

The function that the model called.

Arguments string

The arguments to call the function with, as generated by the model in JSON format. Note that the model does not always generate valid JSON, and may hallucinate parameters not defined by your function schema. Validate the arguments in your code before calling your function.

Name string

The name of the function to call.

Model
optional
param.Field[string]

Optional identifier of the model to use. Defaults to "Llama-Guard".

Returns
ModerationNewResponse struct
Model string
Results array
Flagged bool
FlaggedCategories array
[]string
package main

import (
  "context"
  "fmt"

  // module path is truncated in the source; aliased so the package
  // name matches the llamaapi identifiers used below
  llamaapi "github.com/stainless-sdks/-go"
  "github.com/stainless-sdks/-go/option"
)

func main() {
  client := llamaapi.NewClient(
    option.WithAPIKey("My API Key"),
  )
  moderation, err := client.Moderations.New(context.TODO(), llamaapi.ModerationNewParams{
    Messages: []llamaapi.MessageUnionParam{{
      OfUser: &llamaapi.UserMessageParam{
        Content: llamaapi.UserMessageContentUnionParam{
          OfString: llamaapi.String("string"),
        },
        Role: llamaapi.UserMessageRoleUser,
      },
    }},
  })
  if err != nil {
    panic(err.Error())
  }
  fmt.Printf("%+v\n", moderation.Model)
}
200 Example
{
  "model": "model",
  "results": [
    {
      "flagged": true,
      "flagged_categories": [
        "string"
      ]
    }
  ]
}