electron/docs/api/language-model.md
David Sanders 39aed69a33 feat: implement the Prompt API via localAIHandler
Assisted-by: Claude Opus 4.6
2026-04-03 23:15:18 -07:00


# Class: LanguageModel

> Implement local AI language models

Process: Utility

### `new LanguageModel(initialState)`

* `initialState` Object
  * `contextUsage` number
  * `contextWindow` number

> [!NOTE]
> Do not use this constructor directly outside of the class itself, as it will not be properly connected to the `localAIHandler`.

### Static Methods

The `LanguageModel` class has the following static methods:

#### `LanguageModel.create(options)` _Experimental_

Returns `Promise<LanguageModel>` - Resolves with a new `LanguageModel` created with the provided options.

#### `LanguageModel.availability([options])` _Experimental_

Returns `Promise<string>` - Resolves with one of the following strings describing the availability of the language model:

* `available`
* `downloadable`
* `downloading`
* `unavailable`
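As a sketch of how these availability strings might be used to gate session creation — the helper below is illustrative, not part of the API, and calling `LanguageModel.create()` with no arguments is an assumption:

```javascript
// Illustrative helper: decide whether a session can (eventually) be
// created from an availability string returned by
// LanguageModel.availability().
function canCreateSession (availability) {
  // 'available' is usable now; 'downloadable' and 'downloading' mean
  // the model can be used once fetched; only 'unavailable' rules it out.
  return availability !== 'unavailable'
}

// Usage, inside an Electron utility process:
// const availability = await LanguageModel.availability()
// if (canCreateSession(availability)) {
//   const model = await LanguageModel.create()
// }
```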

### Instance Properties

The following properties are available on instances of `LanguageModel`:

#### `languageModel.contextUsage` _Experimental_

A `number` representing how many tokens are currently in the context window.

#### `languageModel.contextWindow` _Experimental_

A `number` representing the size of the context window, in tokens.

### Instance Methods

The following methods are available on instances of `LanguageModel`:

#### `languageModel.prompt(input, options)` _Experimental_

Returns `Promise<string> | Promise<import('stream/web').ReadableStream<string>>` - Prompts the model for a response.
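Since `prompt()` may resolve with a web `ReadableStream<string>`, the caller needs to drain the stream. A minimal sketch, assuming streaming was requested via `options` (the exact option name is not specified in this document and is hypothetical below):

```javascript
// Illustrative helper: concatenate a ReadableStream<string> (as may be
// returned by languageModel.prompt()) into a single string. Web
// ReadableStreams are async iterable in Node.js and Electron.
async function collectStream (stream) {
  let text = ''
  for await (const chunk of stream) {
    text += chunk
  }
  return text
}

// Usage, inside an Electron utility process ({ stream: true } is a
// hypothetical option, not confirmed by this document):
// const stream = await model.prompt(input, { stream: true })
// const response = await collectStream(stream)
```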

#### `languageModel.append(input, options)` _Experimental_

Returns `Promise<undefined>` - Appends a message to the context without prompting the model for a response.

#### `languageModel.measureContextUsage(input, options)` _Experimental_

Returns `Promise<number>` - Measures how many tokens the input would use.
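Combined with `contextUsage` and `contextWindow`, this allows checking whether an input fits before prompting. The helper below is an illustrative sketch, not part of the API:

```javascript
// Illustrative helper: would adding inputTokens overflow the context
// window, given the current usage? All arguments are token counts.
function wouldOverflow (contextUsage, inputTokens, contextWindow) {
  return contextUsage + inputTokens > contextWindow
}

// Usage, inside an Electron utility process:
// const inputTokens = await model.measureContextUsage(input)
// if (!wouldOverflow(model.contextUsage, inputTokens, model.contextWindow)) {
//   const response = await model.prompt(input)
// }
```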

#### `languageModel.clone(options)` _Experimental_

Returns `Promise<LanguageModel>` - Clones the `LanguageModel` such that the context and initial prompt are preserved.

#### `languageModel.destroy()` _Experimental_

Destroys the model; any ongoing executions are aborted.