Build tools visually with a no-code drag-and-drop interface.

Overview

Workflow tools enable process automation through a visual flow builder. They let you automate multi-step processes, combine AI features with deterministic business logic, integrate with external systems (APIs, cloud services), generate structured outputs (emails, summaries, data responses), and reuse tools across multiple Agentic Apps—no complex coding required. Every flow consists of an input node that receives data and an output node that returns results. Other systems access these results through a deployable API endpoint.
┌─────────┐    ┌──────────┐    ┌─────────────┐    ┌─────────┐
│  Start  │───▶│ Validate │───▶│  Call API   │───▶│   End   │
└─────────┘    └──────────┘    └─────────────┘    └─────────┘

When to Use

Workflow tools are ideal when:
  • Business logic is well-defined and consistent
  • You need visual traceability for debugging
  • Non-developers need to build or maintain tools
  • Processes involve multiple sequential steps
  • You want built-in monitoring and audit logs

Good Fit Examples

Use Case             | Why Workflow Works
---------------------|----------------------------------------------
Order status lookup  | Clear input → API call → formatted output
Weather retrieval    | Simple API integration with response mapping
Database queries     | Structured data fetch with transformation
Notification sending | Multi-channel delivery with conditions

Tool Creation

Create a New Tool

  1. Log in to your account and click Tools from the list of modules.
  2. On the Tools page, click Create a new tool. The New tool dialog box is displayed.
  3. Enter a name and a brief description for the tool and click Create. The tool is created, and the Tool Flow option is displayed. You can start creating your tool flow immediately.
You can also add a tool by installing a pre-configured template from the Tools Library marketplace and customizing the flow for your use case.

Tool Templates

Tool Templates save you from building a tool flow from scratch: they provide pre-configured flows designed for specific business use cases. These pre-built integrations require minimal setup and can be customized through the visual canvas. Template types:
  • Pre-built templates — Ready-to-deploy flows with pre-configured integrations (for example, an email auto-replier or automated grading system).
  • Customizable templates — Node-based flows you can tailor to your business needs using the various node types on the canvas.
Key benefits:
  • Quick setup — Significantly reduces deployment time and costs.
  • Customizable — Retain essential prebuilt elements and remove unnecessary ones.
  • Seamless integration — Import directly into existing flows and automate key actions within minutes.
  • Enhanced developer experience — Eliminate repetitive tasks with minimal coding effort.

Tools Library Marketplace

The Tools Library Marketplace offers more than 50 prebuilt tool templates across categories including Brand Insights, Brand Monitoring, CRM, Competitive Analysis, Content, Customer Support, Finance, Marketing, Operations, Retail, Sales, Social Media, and Speech to Text. Supported AI tasks include topic analysis, sentiment, summarization, tagging, intent detection, and content generation.

Access and Install a Tool Template

  1. Log in and click Tools from the list of modules.
  2. Click Tools on the top navigation bar.
  3. If adding your first tool, click Tool templates. Otherwise, on the All Tools or My Tools tab, click Tool Templates to access the Tools Library/Marketplace.
  4. Scroll to the Tools section. Use the left filter to select Categories and Tasks.
  5. Click the required template to view its information window, which includes the template name, description, compatible LLM model, configuration status, prebuilt flow preview, related templates, and developer details.
  6. Click Install to connect to the Marketplace and import the template along with its pre-configured flow.
The system redirects you to the Tool Flow page. Click Go to Flow to view and manage the prebuilt canvas. Important:
  • After it is installed, the template shows in My Tools.
  • A PDF document with key details (input/output, environment variables, usage guide, key components) is displayed for download.
  • The first installation retains the original template name and description.
  • Reinstalling the same template appends a unique sequential number: Tool Template Name_<sequential-number> (for example, Automatic grading system_1743151769005).
  • Canvas-level errors may appear on fresh installations (for example, unconnected AI nodes or empty API nodes). Click the warning icon to view and fix errors.

Managing a Tool Template

To modify a template's name, description, async configuration, or environment variables:
  1. Open the Tools dashboard and click the required tool.
  2. Click Configurations in the left navigation menu.
  3. Make the required changes or proceed to delete the tool.
Delete a Tool Template:
  1. Scroll to the Delete tool section on the Configurations page.
  2. Click Proceed to delete, then confirm by clicking Delete.
Deleting the tool is irreversible and removes all the associated data.

View and Manage Tool Flow

  1. Click the required tool template on the Tools dashboard.
  2. Click Go to Flow. The prebuilt flow is displayed on the canvas.
  3. Manage nodes and their configurations, edit input and output, and run the flow to customize as needed.

Import a Tool

Using the import functionality, you can create a new tool or add it as a version to an existing tool. The import process preserves tool configuration—including prompts, properties, linked tools, and external/open-source models—while handling environment variables, errors, and role permissions.
Users with only Viewer permissions can't import a tool.

Import to Create a New Tool

Import a .zip package from your local system containing the flow definition, app definition, and environment variable JSON files from another environment.
  1. Log in and click Tools from the list of modules.
  2. Click the Tools tab on the top navigation bar.
  3. Click Import Tool (or the Import tool icon on the Tools dashboard if tools already exist).
  4. In the Import tool window, provide: Required:
    • Tool Name — Provide a unique name to avoid conflicts with existing tools.
    • Flow definition file — Upload flow_definition.json. Includes the tool's canvas definitions (node definitions) and AI node configurations, with prompts, hyperparameters, and timeout information.
    Optional:
    • App definition file — Upload app_definition.json. Includes general tool version information and guardrails.
    • Environment variable file — Upload env_variables.json. Includes environment variables set for the tool.
  5. Click Import. The system redirects you to the Tool Flow page. Click Go to flow to access the canvas.
After import:
  • The tool is listed under All tools and My tools with status In Development.
  • Node types, descriptions, input/output variables, scanners, and sync/async setup are preserved.
  • If an AI node's model doesn't match the models available in the current account, the model field is left empty, but all prompt definitions are imported.
  • API keys are not transferred during import.
  • The imported tool cannot be shared with other users.
Failure scenarios: Import may fail due to an incompatible tool version, corrupt file, internal error, or missing guardrails—triggering an error notification.

Import Best Practices

  • Export first — Always export the current tool version before importing to ensure all configurations are available.
  • Check environment dependencies — Verify that the correct model is available, linked tools exist by the same name for auto-linking, and environment variables and configuration files are compatible.
  • Import in order — Import the parent tool and any associated versions in order, and ensure there are no conflicts with existing tools.

Tool Flow

Overview

The Tool Flow builder is a visual, node-based canvas where you design tool logic by connecting components. Developers can create complex AI-powered automations for use cases such as candidate evaluation, banking applications, content generation, and more.

Interface

Workflow: get_order_status                      [Test] [Deploy]

          ┌─────────┐
          │  START  │
          └────┬────┘
               │
               ▼
      ┌─────────────────┐
      │ Validate Input  │
      └────────┬────────┘
         ┌─────┴─────┐
         ▼           ▼
     ┌────────┐  ┌────────┐
     │ Valid  │  │Invalid │
     └───┬────┘  └───┬────┘
         ▼           ▼
     ┌────────┐  ┌────────┐
     │API Call│  │ Error  │
     └───┬────┘  └───┬────┘
         └─────┬─────┘
               ▼
          ┌─────────┐
          │   END   │
          └─────────┘

Node Types

Control Nodes

Node      | Purpose
----------|----------------------------------------
Start     | Entry point, receives input parameters
End       | Exit point, returns output
Condition | Branching based on logic
Loop      | Iterate over collections

Action Nodes

Node        | Purpose
------------|------------------------------------------
API         | Make HTTP requests to external services
Function    | Transform data with expressions
Integration | Connect to pre-built connectors
Human       | Pause for human input/approval

AI Nodes

Node          | Purpose
--------------|-------------------------
Text-to-Text  | Generate text with LLM
Text-to-Image | Generate images
Audio-to-Text | Transcribe audio
Image-to-Text | Analyze images
DocSearch     | Search knowledge bases

Managing Nodes

Add Nodes

Every new flow begins with a Start node, automatically placed on the canvas. All nodes must be connected—directly or indirectly—to the Start node for the flow to execute correctly. Add nodes in three ways:
  1. Drag from the bottom panel — Scroll through available node types and drag them onto the canvas.
  2. Use the left panel (Assets) — Select and place preconfigured nodes directly onto the canvas.
  3. Use the blue plus icon (+) on a node — Hover over the grey dot on any node to reveal the blue + icon. Click to choose:
    • Add new node — Instantly places and connects a new node.
    • Add existing node — Select a previously added node.

Rename Nodes

  1. Right-click the node on the canvas.
  2. Select Rename from the context menu.
  3. Enter the new name in the Node Name field of the configuration panel.
Use clear, descriptive names (for example, “Validate Email Input” instead of “Function Node 1”).

Delete Nodes

Right-click the node on the canvas and select Delete. Deleting a node also removes its associated connections—reconnect any dependent paths to maintain a valid flow.

Rearrange Nodes

Click and drag any node to move it. Connected lines adjust automatically while other nodes remain fixed. Layout options:
  • Auto Arrange — Right-click the canvas and select Auto arrange to automatically reposition all nodes for a cleaner layout.
  • Show/Hide UI or Grid — Right-click the canvas to toggle visual elements.

Connecting Nodes

Node connections determine how tasks flow—either sequentially or through parallel branches. Connection methods:
  • Canvas-based (visual) — Drag and drop to connect nodes directly on the canvas.
  • Node configuration panel — Define success and failure paths from the node's property panel.

Sequential vs. Parallel Execution

Feature      | Sequential Execution                                                                            | Parallel Execution
-------------|-------------------------------------------------------------------------------------------------|--------------------------------------------------------------------------------------------------------
Use When     | Tasks depend on previous steps and must run in order                                            | Tasks are independent and can run simultaneously
How It Works | Each node runs only after the prior node finishes; flows in a top-down or left-to-right sequence | All connected branches from a single node execute at the same time; up to 10 branches from a single node
Benefits     | Predictable, controlled execution; easier debugging                                             | Reduces total execution time; boosts performance with simultaneous operations
Ideal For    | Step-by-step logic, ordered linear processing                                                   | Multi-channel actions, concurrent task execution

Designing Sequential Flows

In a sequential structure, nodes execute one after another in a defined order. Each node begins only after the previous one finishes. Option 1: Drag-to-Connect
  1. Hover over the blue + icon or grey connector dot on the source node.
  2. Click and drag a line to the destination node.
Option 2: Blue + Icon
  1. Hover over the node, click the blue + icon, and choose Add New or Add Existing. The node is added and connected in sequence.
Option 3: Connections Panel
  1. Click the node to open the Configuration Panel on the right.
  2. Go to the Connections tab.
  3. Under On Success or On Failure, use the dropdown to add a new node or connect to an existing, unused node.
Key considerations:
  • All paths must converge at an End Node.
  • Sequential chains are arranged left to right for clarity.
  • Logs display outputs in the exact order nodes are triggered, making debugging easier.
  • Only one connection can exist per outcome (On Success or On Failure) in a sequential flow—any additional connections are treated as parallel branches.

Designing Parallel Flows

In a parallel flow, multiple branches run at the same time from the same parent node. Each branch performs an independent task. Parallel design patterns:
  • Simple Parallel — A single node branches to multiple child nodes running independently. Use when tasks can occur simultaneously without dependencies.
  • Nested Parallel — A parallel branch contains its own parallel branches. Useful for multi-step logic where each level does independent work.
  • Conditional + Parallel — Combine condition nodes with parallel execution. Based on logic (for example, if/else), different sets of parallel branches are triggered.
Option 1: Blue + Icon
  1. Hover over the node and click the blue + icon.
  2. Select Add new or Add existing to add multiple nodes as parallel branches.
Option 2: Drag-to-Connect
  1. Drag from the source node to another node on the canvas. Repeat from the same parent to form additional parallel branches.
Option 3: Connections Panel
  1. Click the node and open the Configuration Panel.
  2. Go to the Connections tab.
  3. Use the On Success or On Failure dropdowns and click + Parallel Node to add more branches.
Limits and rules:
  • Maximum 10 outgoing connections from a single node.
  • No duplicate connections from the same parent node.
  • No backward loops—connecting a node to an earlier node in the flow is blocked to prevent logic cycles.
Key considerations for parallel flows:
  • All branches from a node run at the same time.
  • The flow waits for all parallel branches to complete before moving to the next step.
  • End node outputs from all branches are combined before passing to the next node.
  • The canvas layout expands automatically to fit multiple branches.
  • Logs show branch-specific outputs grouped under the parent node, labeled (A, B, C) for each path.
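The wait-for-all behavior described above can be sketched with Python's asyncio (an illustration of the semantics only, not platform code; the branch names and delays are made up):

```python
import asyncio

async def branch(name: str, delay: float) -> str:
    # Stands in for one parallel branch doing independent work.
    await asyncio.sleep(delay)
    return f"{name} done"

async def run_parallel() -> list:
    # Like the canvas, the flow proceeds only after ALL branches finish,
    # and the branch outputs are combined before the next step.
    return await asyncio.gather(
        branch("A", 0.01), branch("B", 0.02), branch("C", 0.01)
    )

print(asyncio.run(run_parallel()))  # → ['A done', 'B done', 'C done']
```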

Manage and Remove Connections

From the canvas:
  1. Click the arrow/line between two nodes.
  2. Click the Delete icon. The nodes remain on the canvas but are no longer linked.
From the Configuration Panel:
  1. Open the panel by selecting a node.
  2. In the Connections tab, click the Delete icon next to the connection to remove.

Manage Input and Output

Tool flows use input and output variables as context objects throughout execution.
  • Input Variables — Provide initial data to the flow. Accessible immediately after the Start node using the syntax: context.steps.Start.inputVariable
  • Output Variables — Store and return derived values from the flow. Set in any node; for example, in the End node, assign results using: {{context.steps.AInode.output}}
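As a rough sketch of how such placeholder references might resolve against the run's context object (the resolver below is illustrative, not the platform's implementation, and the node names are hypothetical):

```python
import re

def resolve_placeholders(template: str, context: dict) -> str:
    """Replace {{dotted.path}} placeholders with values from a nested dict."""
    def lookup(match):
        node = context
        for key in match.group(1).strip().split("."):
            node = node[key]
        return str(node)
    return re.sub(r"\{\{(.*?)\}\}", lookup, template)

# Hypothetical context after the Start node and an AI node have run.
context = {
    "context": {
        "steps": {
            "Start": {"Product_ID": "SKU-42"},
            "AInode": {"output": "In stock"},
        }
    }
}

print(resolve_placeholders("{{context.steps.AInode.output}}", context))  # → In stock
```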

Add Input Variables

  1. Click Manage I/O at the top of the Tool flow canvas, or click the Start node. The Manage Input & Output dialog is displayed.
  2. On the Input tab, click + Add input variable.
  3. Provide a Name (key) for the input variable (for example, Product_ID).
  4. Select a Type from the dropdown:
    • Text, Number, or Boolean — An optional Default value field appears.
    • Remote file — A File URL timeout field appears. Set the timeout between 5 minutes and 7 days (default: 5 minutes).
    • List of values (Enum) — Add predefined allowed values via Add Values +. Enable Default value to select from defined values. The system validates input against enum values and displays an error for invalid entries.
    • JSON — A schema editor appears. Define the JSON schema and ensure the default value matches the schema. If the JSON is invalid, execution fails with an error.
  5. Enable the Mandatory toggle if the field is required.
  6. Click Save. The input variable appears on the Input tab.
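The validation behavior in step 4 can be sketched as follows; this is an assumption-laden illustration, since the platform's actual validation logic is not published:

```python
def validate_input(value, var_type, allowed=None, mandatory=False, default=None):
    """Illustrative sketch of the input rules above, not platform code."""
    if value is None:
        if mandatory:
            raise ValueError("missing mandatory input")
        return default  # Text/Number/Boolean inputs may fall back to a default
    if var_type == "Enum" and allowed is not None and value not in allowed:
        # Enum inputs are checked against the predefined allowed values.
        raise ValueError(f"invalid value {value!r}; allowed: {allowed}")
    return value

print(validate_input("celsius", "Enum", allowed=["celsius", "fahrenheit"]))
```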

Add Output Variables

  1. Click Manage I/O at the top of the canvas.
  2. On the Output tab, click + Add output variable.
  3. Provide a Name (key) and select a Type: String, Number, JSON, or Boolean.
  4. Click Save.
You can also add an output variable in the End node using the Add a Key option on the node's interface. If a type mismatch occurs, the endpoint succeeds but includes a warning in the response with the key name and nature of the mismatch. If one or more keys fail validation, the response includes warnings for the failed keys and outputs for the valid ones.

Building a Workflow

Step 1: Create the Tool

  1. Navigate to Tools > + New Tool.
  2. Select Workflow Tool.
  3. Enter a name and description:
name: get_weather
description: |
  Retrieves current weather conditions for a specified location.
  Returns temperature, conditions, and forecast summary.

Step 2: Define Input Parameters

Configure what the tool accepts:
parameters:
  location:
    type: string
    description: City name or coordinates
    required: true

  units:
    type: string
    enum: [celsius, fahrenheit]
    default: celsius
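A client calling this tool could apply the schema's defaults before sending a request. This helper is a hypothetical sketch that mirrors the YAML above; it is not part of the platform:

```python
# Parameter specs copied from the YAML schema above.
PARAMS = {
    "location": {"type": "string", "required": True},
    "units": {"type": "string", "enum": ["celsius", "fahrenheit"], "default": "celsius"},
}

def build_payload(**values):
    """Fill in defaults and reject requests missing required parameters."""
    payload = {}
    for name, spec in PARAMS.items():
        if name in values:
            payload[name] = values[name]
        elif "default" in spec:
            payload[name] = spec["default"]
        elif spec.get("required"):
            raise ValueError(f"missing required parameter: {name}")
    return payload

print(build_payload(location="Berlin"))  # → {'location': 'Berlin', 'units': 'celsius'}
```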

Step 3: Build the Flow

Drag nodes onto the canvas and connect them:
          Start
            │
            ▼
┌─────────────────────────┐
│ API Node: Weather API   │
│ URL: api.weather.com    │
│ Method: GET             │
│ Params: location, units │
└───────────┬─────────────┘
            │
            ▼
┌─────────────────────────┐
│ Function: Format Output │
│ Transform response to   │
│ user-friendly format    │
└───────────┬─────────────┘
            │
            ▼
           End

Step 4: Configure Nodes

API Node Example:
url: https://api.weather.com/v1/current
method: GET
headers:
  Authorization: Bearer {{env.WEATHER_API_KEY}}
query_params:
  q: "{{input.location}}"
  units: "{{input.units}}"
Function Node Example:
return {
  temperature: response.main.temp,
  condition: response.weather[0].description,
  humidity: response.main.humidity,
  location: response.name
}

Step 5: Define Output

Specify the return structure:
output:
  type: object
  properties:
    temperature:
      type: number
    condition:
      type: string
    humidity:
      type: number
    location:
      type: string

Step 6: Test

  1. Click the Run flow icon at the upper right corner of the canvas.
  2. Provide sample input values in the Run dialog.
  3. Click the Debug icon to open the debug log and monitor execution.
  4. Review the execution trace, output, and any errors.
  5. For a successful flow, copy output results using the Copy icon.
You can stop the flow at any point and restart by clicking Run flow again.

Step 7: Deploy

Click Deploy to generate API endpoints (see Deploy a Tool).

Running and Testing Flows

When you run a flow, it generates a context object stored temporarily at the node level, allowing you to monitor progress through the debug log. Flow results:
  • Successful flow — Copy output using the Copy icon. Overall runtime is displayed.
  • Flow errors — An error message is displayed; the output key appears empty with JSON-formatted failure output.

Debug Log

The debug log captures detailed information for each step:
  • Flow input values — The values provided for input variables.
  • Flow-level log details — Overview of flow initiation and progress.
  • Node-level information — Success or failure status per node, with links to additional details.
  • Tool calling details — Logs of any tools called during execution (AI nodes only), including inputs (JSON), responses, and errors. A separate panel shows detailed tool traces.
  • Node metrics per node:
    • Initiated On — Timestamp when the node was triggered.
    • Executed On — Timestamp when execution completed.
    • Total Time Taken — Duration of node execution.
    • Tokens — Token usage (AI nodes only).
Expand the debug panel to full screen for a cleaner layout. All nodes align left; clicking a node reveals its input, output, and metrics side by side.

Time Metrics for API and AI Nodes

API Nodes — Synchronous mode:
  • Node processing time — Time to complete execution.
  • API Response time — Time waiting for the external API response.
API Nodes — Asynchronous mode:
  • Node paused at — Timestamp when the node paused waiting for response.
  • Node resumed at — Timestamp when the node resumed after receiving response.
  • Total wait time — Duration between pausing and resuming.
  • Node processing time — Time spent processing after resuming.
AI Nodes:
  • Node processing time — Time to complete execution.
  • LLM response time — Time for the connected model to return a response.

Conditions and Branching

Use condition nodes for logic-based branching:
        ┌─────────────┐
        │  Condition  │
        │ amount > 100│
        └──────┬──────┘
               │
    ┌──────────┼──────────┐
    ▼          ▼          ▼
┌────────┐ ┌────────┐ ┌────────┐
│  True  │ │ False  │ │Default │
└────────┘ └────────┘ └────────┘
Expression syntax:
// Comparisons
input.amount > 100
response.status === "success"

// Logical operators
input.priority === "high" && input.urgent === true

// String operations
input.email.includes("@company.com")

Loops

Iterate over arrays:
┌──────────────────────────────┐
│ Loop: for each item in list  │
├──────────────────────────────┤
│  ┌─────────────────────────┐ │
│  │      Process Item       │ │
│  └─────────────────────────┘ │
└──────────────────────────────┘
Configuration:
loop:
  collection: "{{response.items}}"
  item_variable: current_item
  max_iterations: 100
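The loop configuration above behaves roughly like the following sketch; `run_loop` is a hypothetical stand-in for the node's engine, not platform code:

```python
def run_loop(collection, process, max_iterations=100):
    """Iterate over the collection, binding each element to the item
    variable and capping the run at max_iterations (illustrative sketch)."""
    results = []
    for index, current_item in enumerate(collection):
        if index >= max_iterations:
            break  # the cap guards against runaway or oversized collections
        results.append(process(current_item))
    return results

print(run_loop([1, 2, 3, 4, 5], lambda item: item * 2, max_iterations=3))  # → [2, 4, 6]
```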

Error Handling

Configure fallbacks for failures:
error_handling:
  on_error: continue  # or: stop, retry
  retry:
    attempts: 3
    delay_ms: 1000
  fallback:
    node: error_handler
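The retry and fallback settings above can be read as the following sketch; this is an illustration of the semantics only, since the platform's execution engine is not published:

```python
import time

def run_with_retry(node, attempts=3, delay_ms=1000, fallback=None):
    """Run a node callable, retrying on failure and falling back if all
    attempts fail (mirrors attempts/delay_ms/fallback in the config above)."""
    for attempt in range(attempts):
        try:
            return node()
        except Exception:
            if attempt < attempts - 1:
                time.sleep(delay_ms / 1000)  # wait before the next attempt
    if fallback is not None:
        return fallback()  # e.g. route to the error_handler node
    raise RuntimeError("all retry attempts failed")
```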

Common Node Issues

Issue                                    | Cause                              | Resolution
-----------------------------------------|------------------------------------|----------------------------------------------
Can't add a new connection               | Node has 10 outgoing connections   | Delete a connection to add more
"No available nodes" in the dropdown     | All valid nodes are already linked | Create a new node or unlink existing ones
Flow doesn't execute                     | Broken or incomplete connections   | Check for stray nodes or missing End Nodes
Error when connecting to a previous node | Backward looping is not allowed    | Reconnect to a valid forward step in the flow

Flow Versions

You can save versions of your flows, restore older versions, and delete versions as needed. After deploying a flow, it appears on the Tools page with the status Deployed.

Create a New Version

  1. Click the down arrow on the canvas header. The Flow versions dialog is displayed.
  2. Click the + icon to save the current version.
  3. Enter a Version name and Description, then click Save.
The saved version becomes the current version. All subsequent changes are auto-saved to it.
In the Flow versions dialog, click the 3-dots icon beside a version name to restore or delete it. When you restore a version, the current version moves to the bottom of the list and the restored version becomes current. A deployed version can be restored but not deleted.

Tool Flow Change Log

The change log lets admins track, audit, and review all changes made to a tool's flow over time—including node property updates, user changes, and version-specific modifications. Key features:
  • Event tracking — Records each change when a user exits a property field in a node, capturing the timestamp, responsible user, and change details.
  • Filters — Filter by date range, node type, or specific team member for enhanced searchability.

Access the Change Log

  1. Open the tool flow for the tool you want to audit.
  2. Click the History/Log icon at the top-right corner of the page. The change log panel opens on the right, showing the most recent changes first.
Each log entry displays:
  • Timestamp of the change
  • Brief description of the change
  • Name of the user who made the change

Filter the Change Log

  • Date filter — Click the Calendar icon to select a specific date range.
  • Filter By — Click the Filter icon to filter by User or Node Type. The log updates dynamically based on the selected filters.

Execution Modes

Synchronous

Default mode—the request waits for completion before returning a response. Suitable for real-time operations.
Request → Execute all nodes → Return response
Timeout: Configurable, range 60–300 seconds (default: 180 seconds).

Asynchronous

For long-running workflows. Supports two delivery methods:
  • Push endpoints — Actively send response data to the client via webhook as soon as it becomes available. Useful when the client needs immediate notification of changes.
  • Poll endpoints — The client periodically checks to retrieve response data on its own schedule. Useful when the client only needs updates at specific intervals.
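A poll-endpoint client can look roughly like this sketch. Here `fetch_status` is a hypothetical stand-in for a GET on the generated poll endpoint, and the status values mirror those shown in Tool Monitor:

```python
import time

def poll_until_done(fetch_status, interval_s=2.0, max_polls=30):
    """Check the run's status on a fixed schedule until it reaches a
    terminal state (illustrative client, not platform code)."""
    for _ in range(max_polls):
        result = fetch_status()
        if result.get("status") in ("Success", "Failed"):
            return result
        time.sleep(interval_s)
    raise TimeoutError("run did not finish within the polling budget")
```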

Deploy a Tool

Agent Platform supports both synchronous and asynchronous tool deployment modes.
Before deploying the tool, you must fix any errors or warnings in the Tool Flow.

Deploy a Tool

  1. Log in and click Tools from the list of modules.
  2. Select the tool you want to deploy.
  3. Click Tool endpoint in the left menu.
  4. Click Deploy. The Sync and Async poll endpoints are generated.
  5. To generate an Async push endpoint:
    1. Click the Async push tab and click Enable/Settings.
    2. In the Sync/Async mode setup popup, enable the Enable async toggle.
    3. Provide the URL of your external application and enter the access token.
    4. Set the timeout:
      • Set timeout — Range: 60–600 seconds (default: 180 seconds).
      • No timeout — Process the request without a time limit.
    5. Click Save. The async push endpoint is created.
Timeout precedence: Tool timeout > Node timeout > Model timeout.
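The precedence rule can be read as "the first configured timeout wins, tool level first." A minimal sketch, assuming the 180-second platform default applies when nothing is set:

```python
def effective_timeout(tool=None, node=None, model=None):
    """Pick the timeout per the stated precedence: tool > node > model.
    Illustrative sketch; the platform's exact resolution is not published."""
    for value in (tool, node, model):
        if value is not None:
            return value
    return 180  # assumed platform default, in seconds
```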

Redeploy for Flow Changes

After updating the in-development version of a flow:
  1. Go to Tool endpoint in the left navigation.
  2. Click Deploy at the top-right corner of the page.
Redeployment does not change the tool's endpoint URL. To view the currently deployed version in read-only mode, click View the Flow on the Tool Flow page.

Configure a Tool

You can modify a tool's general details, async configuration, and environment variables, and undeploy or delete it when no longer in use.

Sync/Async Mode Setup

Configure the sync/async mode for the tool endpoint. Changing the mode requires redeployment.
  1. Click Configurations in the left navigation, then click Setup sync/async.
  2. In the Sync/Async mode setup popup:
    • Synchronous mode timeout — Range: 60–300 seconds (default: 180 seconds).
    • Asynchronous mode — Enable the Enable async toggle, provide the external application URL, enter the access token, and set the timeout (60–600 seconds, or No timeout).
  3. Click Save.
  4. Go to Tool endpoint and click Deploy to redeploy with the new mode.

Environment Variables

Store reusable values and sensitive configurations that can be referenced by different nodes in the tool flow. Add an Environment Variable:
  1. Click Configurations in the left navigation, then click Manage environment variables.
  2. Click Add or Add variable.
  3. Provide:
    • Variable name — Descriptive name.
    • Secure variable — Enable the toggle to mark as a secret.
    • Value — Desired value.
    • Notes (optional) — Usage notes or purpose.
  4. Click Save.
Edit or Delete: Click the three-dots icon next to the variable name and select Edit or Delete. Reference variables in nodes:
url: "{{env.API_BASE_URL}}/endpoint"
headers:
  Authorization: "Bearer {{env.API_KEY}}"

Undeploy the Tool

Click Proceed to undeploy on the Configurations page and follow the on-screen instructions. Undeploying immediately disconnects all active instances.

Delete the Tool

Click Proceed to delete on the Configurations page and follow the on-screen instructions. The tool must be undeployed before deletion. Deletion removes all associated data and is irreversible.

Export a Tool

Export specific tool versions as self-contained packages for data preservation and sharing without compromising configuration integrity or security. The exported .zip file is named after the tool (for example, Banking Assistant.zip) and can be reimported to create a new tool or add it as a version.
Users with only Viewer permissions cannot export a tool.
Exported package contents:
File                 | Contents
---------------------|--------------------------------------------------------------------------------------------------
flow_definition.json | Canvas definitions, node definitions, AI node configurations (prompts, hyperparameters, timeout)
app_definition.json  | General tool version information and guardrails
env_variables.json   | Environment variables set for the tool
Excluded: API keys, sharing permissions, tool endpoint, and audit logs.

Steps to Export

  1. Log in and click Tools from the list of modules.
  2. Select the tool you want to export.
  3. Click Configurations in the left navigation.
  4. Scroll to the Export tool section. The currently deployed version is selected by default.
  5. To change, select another version from the dropdown.
  6. Click Export. A success message appears after validation and export complete.
After the export starts, you can't change the selected version.

Import a Tool as a Version

Import a tool as a version of an existing parent tool from the Configurations page. Back up the current in-development version before importing to preserve its configuration definitions.
To import a tool as a version, the parent tool must be deployed in your account.
Required files:
  • flow_definition.json
  • app_definition.json
  • env_variables.json

Steps to Import a Tool as a Version

  1. Log in and click Tools from the list of modules.
  2. Select the parent tool and click Configurations.
  3. Scroll to the Import tool section and click Import.
  4. (Optional) Select Back up your current tool to preserve the data of the version being replaced. The system automatically exports and saves the .zip package to the designated location.
  5. Click Import in the confirmation dialog.
  6. In the Import Tool window, upload the required JSON files and click Import.
After validation, the imported version is added under the parent tool. The parent tool`s status is set to In Development, and Updated on reflects the import date.

Resolve Conflicting Environment Variables

When importing, conflicts may arise between the parent tool's environment variables and those in the imported version.
  1. Select the Conflicting variables tab.
  2. For each conflict, select Overwrite (use the imported value) or Keep existing (retain the current value).
  3. Click Proceed to continue importing.
To edit variables before resolving:
  1. Click the three-dots icon for the variable and select Edit.
  2. Update the values and click Save.
To cancel during conflict resolution, click Cancel in the Edit window, then click Confirm.
Verify environment variables before importing; for conflicting values, the existing values are retained unless you change them.

API Keys

Generate API keys to enable secure access to your deployed tools from external environments. Keys should be shared only with trusted consumers and can be revoked or rotated as needed.
The Platform won't show the API key again after creation. Keep it secure and never expose it in client-side code or public repositories.

Create an API Key

  1. Log in and click Tools from the list of modules.
  2. Select the required tool from the list.
  3. Click API keys in the left panel.
  4. Click Create a new API key.
  5. Provide a descriptive name for the key and click Generate key.
  6. Click Copy and close to save the key to your clipboard. Share this key with authorized users as needed.
All generated API keys are listed in the API keys section for easy reference and management.
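From an external client, the key is typically sent as a bearer token. In this sketch, the endpoint URL and key are placeholders for the values shown on the Tool endpoint and API keys pages:

```python
import json
import urllib.request

# Placeholders — substitute your deployed endpoint URL and generated key.
ENDPOINT = "https://example.com/tools/get_weather/sync"
API_KEY = "YOUR_API_KEY"

def build_request(payload: dict) -> urllib.request.Request:
    """Attach the API key as a bearer token. Never hard-code real keys in
    client-side code or public repositories."""
    return urllib.request.Request(
        ENDPOINT,
        data=json.dumps(payload).encode(),
        headers={
            "Authorization": f"Bearer {API_KEY}",
            "Content-Type": "application/json",
        },
        method="POST",
    )

req = build_request({"location": "Berlin", "units": "celsius"})
# urllib.request.urlopen(req)  # uncomment to actually call the endpoint
```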

User Roles and Permissions

Account owners can invite users to collaborate on specific tools. Invited users can access the Models and Data modules for the invited account but can only see the tools they are invited to.
You can only invite users who already have access to your account.

Invite Users

  1. Log in and click Tools from the list of modules.
  2. Select the required tool.
  3. Click Sharing & Permission in the left navigation bar. Existing collaborators are listed.
  4. Click Invite (visible only to account owners). The Invite Users dialog is displayed.
  5. Enter users' email addresses and click Invite to grant access.

Monitoring and Observability

Tool Monitor

Tool Monitor tracks and analyzes tool performance across multiple runs, providing a time-based, comprehensive view of tool activities. Key capabilities:
  • Performance Tracking — Monitor response times, execution patterns, and overall efficiency.
  • Dual-View Analytics:
    • All Runs — Comprehensive data on all tool run instances (all endpoint calls).
    • Model Runs — Focused analytics on AI node executions.
  • Detailed Metrics — Total runs, average response times (P90 and P99), failure rates.
  • API Key Usage Monitoring — Track API key utilization across the tool ecosystem.
  • AI Node Analysis — Performance insights per individual AI node.
  • Advanced Filtering — Time-based searches and custom filters to drill down into specific scenarios.
  • Detailed Run Information — Comprehensive logs and debug information per tool run.
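The P90 and P99 response times reported by Tool Monitor are latency percentiles: the durations below which 90% and 99% of runs complete. If you export run durations, you can reproduce these offline — a sketch using the standard library (the platform computes them for you):

```python
from statistics import quantiles

# Example response times in milliseconds, e.g. exported from the All Runs tab.
response_times_ms = [120, 135, 150, 160, 180, 210, 250, 400, 900, 1500]

# quantiles(..., n=100) returns the 99 cut points P1..P99;
# index 89 is P90 and index 98 is P99.
cuts = quantiles(response_times_ms, n=100, method="inclusive")
p90, p99 = cuts[89], cuts[98]

print(f"P90={p90:.0f} ms, P99={p99:.0f} ms")  # → P90=960 ms, P99=1446 ms
```

Note how a single 1500 ms outlier pulls P99 far above P90 — which is why both are shown.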

Accessing Tool Monitor

  1. Log in and click Tools from the list of modules.
  2. On the All tools page, click a deployed tool. Tool monitoring is only available for tools deployed in production.
  3. Click Tool monitor in the left navigation pane.
  4. Click All runs or Model runs to view the respective data.
  5. Click any row for detailed run information in the right panel.

All Runs Tab

Displays for each tool run:
| Field | Description |
| --- | --- |
| Run ID | Unique identifier for the flow run |
| Status | In Progress, Waiting, Success, or Failed |
| Response time | Duration to complete the request and return output |
| Nodes executed | Total number of nodes executed in the run |
| Start time | When the request was initiated |
| End time | When the response was received |
| Source | Agentic App name or API Key name used to run the tool |
Summary metrics at the top of the page: Total Runs, Response Time (P90 and P99), Failure Rate. These metrics update when you apply filters or search criteria.
When there is nested (multi-level) tool calling, the immediate parent tool is displayed as the source.

Model Runs Tab

Each AI node in a tool is recorded as a separate request. If your tool has no AI nodes, this tab remains empty until AI nodes are added. Displays for each AI node call:
| Field | Description |
| --- | --- |
| Request ID | Unique identifier for the AI node request |
| Status | In Progress, Waiting, Success, or Failed |
| Node name | Name of the AI node |
| Model name | Model used for the AI node |
| Response time | Time taken by the AI node |
| Start time / End time | Execution timestamps |
Summary metrics: Total Requests, Response Time (P90 and P99), Failure Rate.

Viewing Detailed Run Information

Click any row in either tab to open a detailed panel on the right, similar to the Run dialog on the Tool flow canvas. The panel displays:
  • Run ID / Request ID
  • Response Time
  • Debug icon — Click to view debug log details.
  • Input — Input sent to the tool.
  • Flow log — Node-level success/failure details, including scanner information for AI nodes.
  • Output — Tool output for successful runs, with copy option and token view.

Timeout Impact on Endpoints

| Scenario | Behavior |
| --- | --- |
| Tool Sync + API node Sync | Request immediately fulfilled; In Progress while running |
| Tool Sync + API node Async (node timeout < sync timeout) | In Progress during execution; Waiting while paused for external response |
| Tool Async + API node Sync | Executes and sends response to callback URL; In Progress while running |
| Tool Async + API node Async (node timeout < tool async timeout, or both infinite) | Waiting while paused for external response; resumes In Progress when response received |

Searching and Filtering

  • Manual Search — Use the search box in the top right to find specific runs by keyword.
  • Time-based Search — Click the calendar button, select a predefined range (last day, week, month, or year) or set custom dates, then click Apply.
  • Custom Filters — Click the Filter icon, click + Add filter, select Column/Operator/Value, and click Apply. Combine multiple filters using AND/OR operators for more precise results.

Tool Run Errors

Errors that occur via the endpoint are displayed in a separate window for the specified Run ID. Click the corresponding entry in the Tool Monitor dashboard to view details. Error categories:
| Error Scenario | Description | Category | HTTP Status |
| --- | --- | --- | --- |
| Mandatory input field missing | A required input field was not provided | Data Validation | 400 Bad Request |
| Invalid data type for input field | An incorrect data type was provided | Data Validation | 400 Bad Request |
| Empty input object | A field input is missing a value | Data Validation | 400 Bad Request |
| Large request payload | Payload exceeds the server's size limit | Data Validation | 413 Payload Too Large |
| Server-side issues | A technical issue caused the server to fail | Internal Server | 500 Internal Server Error |
| Request timeout | Network or server connection issue | Network | 408 Request Timeout |
| Guardrail failure | Risk score exceeded the threshold at an AI node | Content Filter | 403 Forbidden |
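Client code calling the endpoint can branch on these HTTP status codes. A minimal sketch — the status-to-category mapping mirrors the table above, while the retry policy is an assumption about sensible client behavior, not documented platform behavior:

```python
# Maps endpoint HTTP status codes to the error categories in the table above.
ERROR_CATEGORIES = {
    400: "Data Validation",
    403: "Content Filter",
    408: "Network",
    413: "Data Validation",
    500: "Internal Server",
}

def classify_error(status: int) -> str:
    """Return the error category for an endpoint HTTP status code."""
    return ERROR_CATEGORIES.get(status, "Unknown")

def should_retry(status: int) -> bool:
    # Assumption: only transient failures (timeouts, server errors) are worth
    # retrying; validation and guardrail failures will fail again unchanged.
    return status in (408, 500)

print(classify_error(403), should_retry(403))  # → Content Filter False
print(classify_error(500), should_retry(500))  # → Internal Server True
```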

Audit Logs

Audit Logs provide full visibility into user actions and system interactions—tracking logins, role changes, model updates, and tool changes through timestamped log entries. This empowers admins to ensure compliance with internal policies and proactively mitigate risks. Each log entry includes:
  • Event Name — The specific event or action that occurred.
  • Category — The module or entity affected.
  • User Name — Who performed the action.
  • Date and Time — When the event occurred.
  • Description — Detailed information about the action.

Access Audit Logs

  1. Log in and click Tools from the list of modules.
  2. On the All tools page, click the desired tool.
  3. In the left navigation pane, click Audit logs. The Audit logs page is displayed.
  4. Click each row to view more details about the event.
Use the time range selector to filter by current or past periods. Apply custom filters by category, event, or user to view only the required logs.

Guardrails

Enable safety scanning for AI nodes to enforce policy and compliance checks:
guardrails:
  input_scanning: true
  output_scanning: true
  scanners:
    - toxicity
    - pii_detection
    - jailbreak_detection
Guardrails scan LLM inputs and outputs, prevent unsafe or non-compliant responses, and apply consistent enforcement across workflows.
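Conceptually, input scanning runs the text through each enabled scanner and blocks the request if any scanner flags it (surfacing as the 403 Guardrail failure described under Tool Run Errors). The sketch below illustrates the idea with stand-in pattern checks; the platform's real scanners are model-based and far more sophisticated:

```python
import re

# Stand-in scanners for illustration only; real toxicity, PII, and
# jailbreak scanners are model-based, not simple patterns.
def pii_detection(text: str) -> bool:
    return bool(re.search(r"\b\d{3}-\d{2}-\d{4}\b", text))  # SSN-like pattern

def jailbreak_detection(text: str) -> bool:
    return "ignore previous instructions" in text.lower()

SCANNERS = {"pii_detection": pii_detection, "jailbreak_detection": jailbreak_detection}

def scan_input(text: str) -> list[str]:
    """Return the names of scanners that flagged the text (empty list = pass)."""
    return [name for name, check in SCANNERS.items() if check(text)]

print(scan_input("My SSN is 123-45-6789"))  # → ['pii_detection']
print(scan_input("What is the weather?"))   # → []
```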

PII Handling

Workflow Tools inherit the application's PII protection capabilities, ensuring sensitive data is securely processed without exposure in logs, traces, or model outputs.
  • Before execution, input fields are automatically scanned for declared PII patterns in the application config.
  • Inputs identified as PII are masked and passed to the tool in redacted form.
  • If the Workflow Tool is granted access to the original PII value, it can securely unredact and use it internally—while monitoring, debugging logs, and execution traces continue to show only masked values.
PII handling applies only to workflow tools associated with Agentic Apps. Tools in the library operate independently and do not apply PII sensitization to inputs.
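The masking step can be pictured as a pattern-based redaction pass over inputs that match the declared PII patterns. This is a conceptual sketch, not the platform's implementation — the pattern names and placeholder format are assumptions:

```python
import re

# Declared PII patterns, as might appear in an application config (assumed).
PII_PATTERNS = {
    "email": r"[\w.+-]+@[\w-]+\.[\w.]+",
    "phone": r"\b\d{3}[-.]\d{3}[-.]\d{4}\b",
}

def mask_pii(text: str) -> str:
    """Replace each declared PII match with a labeled placeholder."""
    for label, pattern in PII_PATTERNS.items():
        text = re.sub(pattern, f"<{label}:masked>", text)
    return text

print(mask_pii("Contact jane.doe@example.com or 555-123-4567"))
# → Contact <email:masked> or <phone:masked>
```

Monitoring, debug logs, and traces would only ever see the masked form; unredaction happens inside tools explicitly granted access to the original value.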

Accessing App Environment Variables

Environment variables defined at the application level can be accessed by workflow tools through namespaces. To make a variable available within a workflow tool:
  1. Add the variable to a namespace.
  2. Associate the namespace with the workflow agent.
Once associated, reference variables using env.<variable-name>.
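The env.<variable-name> lookup can be pictured as resolving references against the variables of the associated namespace. A conceptual sketch (the actual resolution happens inside the platform; the variable names and values here are assumptions):

```python
import re

# Variables of a namespace associated with the workflow tool (assumed values).
namespace_vars = {"API_BASE_URL": "https://api.example.com", "REGION": "us-east-1"}

def resolve_env_refs(expression: str, variables: dict[str, str]) -> str:
    """Replace each env.<variable-name> reference with its namespace value."""
    return re.sub(
        r"env\.([A-Za-z_][A-Za-z0-9_]*)",
        lambda m: variables.get(m.group(1), m.group(0)),  # leave unknown refs as-is
        expression,
    )

print(resolve_env_refs("env.API_BASE_URL/v1/orders?region=env.REGION", namespace_vars))
# → https://api.example.com/v1/orders?region=us-east-1
```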

Tool Creation Scope

Workflow Tools can be created in two scopes depending on whether they are intended for reuse or app-specific behavior.

Tools Library

Tools created under the Tools module on the console. These exist independently of any Agentic App and, once built and deployed, form a shared library whose tools serve as templates for Agentic Apps.

App-Scoped Tools

Workflow Tools created directly within an Agentic App, scoped to that app only. When an existing tool is imported into an Agentic App, it is also scoped to that app.
  • App-scoped tools are not shared across other apps.
  • Updates to app-scoped or imported tools apply only to the local copy and do not impact the original library tool.
  • Ideal for app-specific logic or experimentation.

Best Practices

Keep Workflows Focused

One workflow = one capability. Don't combine unrelated logic.

Use Meaningful Names

# Good
get_customer_orders
validate_shipping_address
process_refund_request

# Avoid
workflow_1
helper
do_stuff

Handle Errors Gracefully

Always include error paths and meaningful error messages.

Test Edge Cases

  • Invalid inputs
  • API failures
  • Empty responses
  • Timeout scenarios

Document Complex Logic

Add comments to condition expressions and complex transformations.

Flow Design Tips

  • Use parallel structures to speed up independent tasks.
  • Use sequential paths when steps depend on the results of prior steps.
  • Combine both styles to build hybrid flows for advanced logic.
  • Always monitor node limits and watch for visual cues — one line = sequence, branching lines = parallel.
  • Regularly use Auto Arrange to clean up the canvas.
  • Ensure every logical branch concludes at an End Node.