
Updated in 2026.10

To start using an Amazon Bedrock model with Cognigy.AI features, follow these steps:
  1. Add a Model
  2. Apply the Model

Add Models

You can add a model using one of the following interfaces:

Add Models via GUI

You can add a model provided by Amazon Bedrock to Cognigy.AI in Build > LLM. To add the model, you will need the following parameters:
  • Access Key ID — Enter the Access Key ID. Log in to the AWS Management Console, go to the IAM dashboard, select Users, and choose the IAM user. Navigate to the Security credentials tab, and under Access keys, create a new access key if one hasn't been created. Copy the Access Key ID provided after creation.
  • Secret Access Key — Enter the Secret Access Key. After creating the access key, you'll be prompted to download a file containing the Access Key ID and the Secret Access Key. Alternatively, you can retrieve the Secret Access Key by navigating to the IAM dashboard, selecting the user, going to the Security credentials tab, and clicking Show next to the Access Key ID to reveal and copy the Secret Access Key.
  • Region — Enter the AWS region where your model is located, for example, us-east-1 for the US East (N. Virginia) region.
  • Location — Select the routing behavior of your inference calls. Select one of the following options:
    • Region — the data sent in the request to the LLM stays within the AWS region you provided in Region. This is the default behavior.
    • Geo — the data sent in the request can be routed within a geographic boundary. This option allows for higher request throughput while keeping the benefits of a geographic boundary, for example, data residency for compliance with regulations. Enter the geographic boundary in the Geo field, for example, us, eu, or apac. The AWS region must be located within the geographic boundary entered in this parameter.
    • Global — the data sent in the request can be routed worldwide. This option allows for the highest request throughput and best performance for cases with no data residency constraints.
For more information, read the Regional availability article.
Apply the changes. To verify that the connection is set up correctly, click Test.

Add Models via the API

You can add either a standard or a custom model using the Cognigy.AI API request POST /v2.0/largelanguagemodels. Then, test the connection for the created model via the Cognigy.AI API request POST /v2.0/largelanguagemodels/{largeLanguageModelId}/test.
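As a sketch of this API flow, the two requests could be prepared as below. The base URL, the X-API-Key header, and the payload field names are assumptions for illustration; consult the Cognigy.AI OpenAPI reference for the authoritative schema. The sketch only builds the requests, it does not send them.

```python
import json

# Hypothetical sketch of the two Cognigy.AI API calls. Base URL, auth
# header, and payload fields are placeholders, not the confirmed schema.
BASE_URL = "https://api-trial.cognigy.ai/new/v2.0"  # example base URL


def build_create_model_request(api_key: str, model: dict) -> dict:
    """Prepare the POST /v2.0/largelanguagemodels request."""
    return {
        "method": "POST",
        "url": f"{BASE_URL}/largelanguagemodels",
        "headers": {"X-API-Key": api_key, "Content-Type": "application/json"},
        "body": json.dumps(model),
    }


def build_test_model_request(api_key: str, model_id: str) -> dict:
    """Prepare the POST /v2.0/largelanguagemodels/{id}/test request."""
    return {
        "method": "POST",
        "url": f"{BASE_URL}/largelanguagemodels/{model_id}/test",
        "headers": {"X-API-Key": api_key},
        "body": "",
    }


# Example usage with placeholder values:
req = build_create_model_request(
    "my-api-key",
    {"name": "Bedrock model", "provider": "amazonBedrock"},  # illustrative fields
)
print(req["url"])
```

You would pass the prepared method, URL, headers, and body to any HTTP client (for example, `requests.request(**...)`) to perform the actual calls.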

Apply the Model

To apply a model, follow these steps:
  1. In the left-side menu of the Project, go to Manage > Settings.
  2. Go to the section that matches your use case:
    • Generative AI Settings. In the Generative AI Settings section, activate Enable Generative AI Features. This setting is toggled on by default if you have previously set up the Generative AI credentials.
    • Knowledge AI Settings. Use this section if you need to add a model for Knowledge AI. Select a model for the Knowledge Search and Answer Extraction features. Refer to the list of standard models and find the models that support these features.
  3. Navigate to the desired feature and select a model from the list. If there are no models available for the selected feature, the system will automatically select None. Save changes.

More Information