Deploying a Smart FAQ Model in a Dialogue Tree

Before You Begin

Create a Smart FAQ Model, add relevant Content Sources, and ensure the Content Sources are active.

Overview

This article explains how to use the Smart FAQ model you've created to deploy FAQ bots powered by generative AI.

Steps to Deploy Model

  1. Navigate to the Dialogue Tree where you intend to integrate the model you've created.

    Enablement note:

To ensure the BotSmartReply API is enabled in your environment, please work with your Success Manager.

  2. Insert an API node along the path where you want to get responses from the GPT Model.

  3. Choose BotSmartReply/BotSmartReply from the API selection dropdown menu.

  4. Configure the input parameters for the API node as follows (a consolidated example appears after these steps):

    A. engineID:

    i. The engineID value should match the ModelID of the Smart FAQ model you intend to utilize for generating responses.

    ii. You can locate this value on the Smart FAQ Model page; it appears as the last segment of the page URL.

    B. text:

    • Description: The actual question or query posed by the user. This text will be analyzed and processed by the Smart FAQ model to generate a response.

    • Example: "text": "How can I reset my password?"

    C. assetId:

    • Description: The unique identifier for the asset (such as a document, article, or guide) that is being used to answer the user's query. This helps in tracking and referring to the asset used in generating the answer.

    • Example: "assetID": "guide_001"

    D. language:

    • Description: The language of the user's query, in which the response should be generated.

    • Example: "language": "en"

    E. readTimeout:

    • Description: The duration, in milliseconds, after which the system will trigger a timeout if no response is received from the model. This helps in preventing long delays in the response process.

    • Example: "read_timeout": 3000 (indicating a 3-second timeout)

    F. prompt:

    • Description: The input prompt sent to OpenAI to generate an answer for the FAQ query. This prompt helps OpenAI understand the context of the user's question.

    • Example: "prompt": "Answer the user's question based on the FAQ guide provided"

    G. additional tag:

    • You can enhance the bot's accuracy by using the tag functionality. Passing a tag as an additional parameter in the API input lets you fine-tune the bot's responses.

    • Example:

      def additional = [:]
      additional['tags'] = ['Important']
      return additional

  5. Configure the output parameters of the API node so that the model's response is stored for use in later nodes.

  6. After setting up the API node, display the stored output parameter in a bot reply node.
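
For reference, here is a rough consolidation of the input parameters from step 4, written as a Groovy snippet in the same style as the tag example. The values are placeholders only (the engineID shown is hypothetical), and in practice each parameter is configured individually on the API node rather than as a single map.

  // Illustrative values only: substitute your own ModelID, query text, and asset.
  def input = [:]
  input['engineID'] = '64f1a2b3c4d5e6f7'        // ModelID of your Smart FAQ model (hypothetical value)
  input['text'] = 'How can I reset my password?' // the user's query
  input['assetId'] = 'guide_001'                 // asset used to answer the query
  input['language'] = 'en'                       // language of the query and response
  input['prompt'] = "Answer the user's question based on the FAQ guide provided"
  input['readTimeout'] = 3000                    // timeout in milliseconds (3 seconds)
  def additional = [:]
  additional['tags'] = ['Important']             // optional tag to fine-tune responses
  input['additional'] = additional
  return input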
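
Steps 5 and 6 depend on the response returned by the BotSmartReply API, whose exact schema is not shown in this article. The snippet below is only a hypothetical sketch of pulling the generated answer out of a JSON response and returning it for the bot reply node; the field name "answer" is an assumed placeholder, not a confirmed part of the API response.

  import groovy.json.JsonSlurper

  // Hypothetical response handling: 'answer' is a placeholder field name.
  def rawResponse = '{"answer": "You can reset your password from the login page."}'
  def parsed = new JsonSlurper().parseText(rawResponse)
  def botReply = parsed['answer'] ?: 'Sorry, I could not find an answer to that.'
  return botReply   // map this value to the output parameter shown in the bot reply node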

What's Next?

Test your responses using the golden test set.
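
As a rough illustration of that step, the Groovy sketch below loops over a small golden test set and checks each bot answer against an expected fragment. The fetchBotAnswer closure is a hypothetical stand-in for however you collect answers from the deployed bot (for example, via a test harness); it is not part of the platform.

  // Minimal golden-test sketch; fetchBotAnswer is a hypothetical stand-in.
  def goldenSet = [
      'How can I reset my password?': 'reset your password from the login page'
  ]

  def fetchBotAnswer = { String question ->
      // Replace this stub with a call into your deployed bot or test harness.
      return 'You can reset your password from the login page.'
  }

  goldenSet.each { question, expectedFragment ->
      def answer = fetchBotAnswer(question)
      def passed = answer.toLowerCase().contains(expectedFragment.toLowerCase())
      println "${passed ? 'PASS' : 'FAIL'} :: ${question}"
  }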