AiCollabOptions Interface

Options for the AI collaboration.

This API is provided as an alpha preview and may change without notice.

To use, import via @fluidframework/ai-collab/alpha.

For more information about our API support guarantees, see the API support levels documentation.

Signature

export interface AiCollabOptions

Properties

Property | Alerts | Modifiers | Type | Description
dumpDebugLog | Alpha | optional, readonly | boolean | When enabled, the library will console.log information useful for debugging the AI collaboration.
finalReviewStep | Alpha | optional, readonly | boolean | When set to true, the LLM will be asked to complete a final review of the changes and determine whether any additional changes are needed. When set to false, no final review is requested.
limiters | Alpha | optional, readonly | { readonly abortController?: AbortController; readonly maxSequentialErrors?: number; readonly maxModelCalls?: number; readonly tokenLimits?: TokenLimits; } | Optional mechanisms for limiting this library's usage of the LLM.
openAI | Alpha | readonly | OpenAiClientOptions | The OpenAI client options to use for the LLM-based AI collaboration.
planningStep | Alpha | optional, readonly | boolean | When set to true, the LLM will be asked to first produce a plan, based on the user's ask, before generating any changes to your application's data. This can help the LLM produce better results. When set to false, no plan is requested.
prompt | Alpha | readonly | { readonly systemRoleContext: string; readonly userAsk: string; } | The prompt context to give the LLM in order to collaborate with your application's data.
treeNode | Alpha | readonly | TreeNode | The specific tree node you want the AI to collaborate on. Pass the root node of your tree if you intend for the AI to work on the entire tree.
validator | Alpha | optional, readonly | (newContent: TreeNode) => void | An optional validator function that can be used to validate the new content produced by the LLM.
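
Example

The sketch below shows how these options might be assembled and passed to the package's collaboration entry point. The aiCollab function and the client/modelName shape of OpenAiClientOptions are assumptions about the surrounding alpha API; only the AiCollabOptions properties themselves are documented on this page.

import { OpenAI } from "openai";
import type { TreeNode } from "fluid-framework";
// The aiCollab entry point is an assumption; only AiCollabOptions is documented here.
import { aiCollab, type AiCollabOptions } from "@fluidframework/ai-collab/alpha";

async function runCollaboration(taskListNode: TreeNode): Promise<void> {
    const options: AiCollabOptions = {
        // The client/modelName shape of OpenAiClientOptions is an assumption.
        openAI: {
            client: new OpenAI({ apiKey: process.env.OPENAI_API_KEY }),
            modelName: "gpt-4o",
        },
        treeNode: taskListNode, // the node (or root) the AI is allowed to edit
        prompt: {
            systemRoleContext: "You are helping the user manage a shared task list.",
            userAsk: "Mark every task assigned to Alice as complete.",
        },
        planningStep: true, // ask the LLM for a plan before it makes any edits
        finalReviewStep: true, // ask the LLM to review its changes at the end
        dumpDebugLog: false,
        limiters: { maxModelCalls: 20 },
    };
    await aiCollab(options);
}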

Property Details

dumpDebugLog

When enabled, the library will console.log information useful for debugging the AI collaboration.

Signature

readonly dumpDebugLog?: boolean;

Type: boolean

finalReviewStep

When set to true, the LLM will be asked to complete a final review of the changes and determine whether any additional changes are needed. When set to false, no final review is requested.

Signature

readonly finalReviewStep?: boolean;

Type: boolean

limiters

Optional mechanisms for limiting this library's usage of the LLM.

Signature

readonly limiters?: {
readonly abortController?: AbortController;
readonly maxSequentialErrors?: number;
readonly maxModelCalls?: number;
readonly tokenLimits?: TokenLimits;
};

Type: { readonly abortController?: AbortController; readonly maxSequentialErrors?: number; readonly maxModelCalls?: number; readonly tokenLimits?: TokenLimits; }
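
Example

A minimal sketch of the limiters object, using only the fields shown in the type above. The shape of TokenLimits is not shown on this page, so tokenLimits is left out.

import type { AiCollabOptions } from "@fluidframework/ai-collab/alpha";

// An AbortController owned by the host application can cancel the collaboration.
const abortController = new AbortController();

const limiters: AiCollabOptions["limiters"] = {
    abortController,
    maxModelCalls: 20, // stop after 20 calls to the model
    maxSequentialErrors: 3, // stop after 3 consecutive errors
    // tokenLimits is omitted because the TokenLimits shape is not documented here.
};

// Elsewhere in the application, for example when the user clicks "Stop":
abortController.abort();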

openAI

The OpenAI client options to use for the LLM-based AI collaboration.

Signature

readonly openAI: OpenAiClientOptions;

Type: OpenAiClientOptions

planningStep

When set to true, the LLM will be asked to first produce a plan, based on the user's ask, before generating any changes to your application's data. This can help the LLM produce better results. When set to false, no plan is requested.

Signature

readonly planningStep?: boolean;

Type: boolean

prompt

The prompt context to give the LLM in order to collaborate with your application's data.

Signature

readonly prompt: {
readonly systemRoleContext: string;
readonly userAsk: string;
};

Type: { readonly systemRoleContext: string; readonly userAsk: string; }
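
Example

A sketch of the prompt object; the task-board wording is a hypothetical application scenario.

import type { AiCollabOptions } from "@fluidframework/ai-collab/alpha";

const prompt: AiCollabOptions["prompt"] = {
    // Background the LLM should assume about the application's data (hypothetical app).
    systemRoleContext:
        "You are editing the task board of a project-management app. " +
        "Each task has a title, an assignee, and a status.",
    // The end user's actual request.
    userAsk: "Move all of Bob's unfinished tasks to the 'Blocked' status.",
};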

treeNode

The specific tree node you want the AI to collaborate on. Pass the root node of your tree if you intend for the AI to work on the entire tree.

Signature

readonly treeNode: TreeNode;

Type: TreeNode

Remarks

  • Optional root nodes are not supported.
  • Primitive root nodes are not supported.
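
Example

A sketch of choosing the node to collaborate on, assuming a SharedTree TreeView over a hypothetical TaskBoard schema imported from a hypothetical ./schema module.

import type { TreeView } from "fluid-framework";
import type { AiCollabOptions } from "@fluidframework/ai-collab/alpha";
// TaskBoard is a hypothetical SharedTree schema class for this sketch.
import { TaskBoard } from "./schema";

function chooseCollaborationTarget(view: TreeView<typeof TaskBoard>): AiCollabOptions["treeNode"] {
    // Pass view.root to let the AI work on the entire tree; returning a child
    // node instead (e.g. a hypothetical view.root.backlog) would confine its
    // edits to that subtree.
    return view.root;
}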

validator

An optional validator function that can be used to validate the new content produced by the LLM.

Signature

readonly validator?: (newContent: TreeNode) => void;

Type: (newContent: TreeNode) => void
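
Example

A sketch of a validator. The documentation above only states that the function receives the new content; signaling failure by throwing is an assumption consistent with the void return type, and isValidForMyApp is a hypothetical application-level check.

import type { TreeNode } from "fluid-framework";
import type { AiCollabOptions } from "@fluidframework/ai-collab/alpha";

// Hypothetical application-level invariant check.
declare function isValidForMyApp(content: TreeNode): boolean;

const validator: AiCollabOptions["validator"] = (newContent: TreeNode): void => {
    // Throwing here is an assumption about how failures are reported.
    if (!isValidForMyApp(newContent)) {
        throw new Error("AI-produced content failed application validation.");
    }
};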