Intelligence SDK

Overview

The Intelligence SDK provides a unified interface for plugins to access AI capabilities, supporting multiple AI Providers (OpenAI, Anthropic, DeepSeek, SiliconFlow, etc.).

Quick Start

EXAMPLE.TYPESCRIPT
import { useIntelligence } from '@talex-touch/utils/renderer/hooks'

const { text, vision } = useIntelligence()

// AI Chat
const chatRes = await text.chat({
  messages: [{ role: 'user', content: 'Hello!' }]
})
console.log(chatRes.result)

// Translation
const translateRes = await text.translate({
  text: 'Hello World',
  targetLang: 'zh-CN'
})
console.log(translateRes.result)

// OCR
const ocrRes = await vision.ocr({
  source: { type: 'data-url', dataUrl: imageDataUrl }
})
console.log(ocrRes.result.text)

See also

  • /docs/dev/intelligence (Developer chapter)
  • /docs/dev/intelligence/configuration
  • /docs/dev/intelligence/capabilities
  • /docs/dev/intelligence/troubleshooting

API Reference

useIntelligence()

Returns the Intelligence SDK instance (a Vue composable).

EXAMPLE.TYPESCRIPT
import { useIntelligence } from '@talex-touch/utils/renderer/hooks'

const intelligence = useIntelligence()
void intelligence

Returns an object with the following properties and methods:

Property/Method   Description
invoke            Generic invocation interface
text              Text processing capabilities
code              Code processing capabilities
analysis          Analysis capabilities
vision            Vision processing capabilities
embedding         Vector embedding capabilities
rag               RAG retrieval capabilities
agent             Agent capabilities
isLoading         Loading state (Ref)
lastError         Last error message (Ref)

Text Processing (text)

text.chat(payload, options?)

AI Chat.

EXAMPLE.TYPESCRIPT
const result = await text.chat({
  messages: [
    { role: 'system', content: 'You are a helpful assistant' },
    { role: 'user', content: 'Hello!' }
  ],
  temperature: 0.7,
  maxTokens: 1000
})

console.log(result.result) // Chat response
console.log(result.usage) // { promptTokens, completionTokens, totalTokens }

text.translate(payload, options?)

Text translation.

EXAMPLE.TYPESCRIPT
const result = await text.translate({
  text: 'Hello World',
  sourceLang: 'en', // Optional, auto-detect
  targetLang: 'zh-CN'
})
console.log(result.result)

text.summarize(payload, options?)

Text summarization.

EXAMPLE.TYPESCRIPT
const result = await text.summarize({
  text: 'This is a long article',
  maxLength: 200,
  style: 'bullet-points' // 'concise' | 'detailed' | 'bullet-points'
})
console.log(result.result)

text.rewrite(payload, options?)

Text rewriting.

EXAMPLE.TYPESCRIPT
const result = await text.rewrite({
  text: 'This product is good',
  style: 'formal', // 'formal' | 'casual' | 'professional' | 'creative'
  tone: 'authoritative' // 'neutral' | 'friendly' | 'authoritative'
})
console.log(result.result)

text.grammarCheck(payload, options?)

Grammar checking.

EXAMPLE.TYPESCRIPT
const res = await text.grammarCheck({
  text: 'I has a apple',
  language: 'en',
  checkTypes: ['spelling', 'grammar', 'punctuation']
})
console.log(res.result)

Code Processing (code)

code.generate(payload, options?)

Code generation.

EXAMPLE.TYPESCRIPT
const res = await code.generate({
  description: 'Implement a quicksort algorithm',
  language: 'typescript',
  includeTests: true,
  includeComments: true
})
console.log(res.result)

code.explain(payload, options?)

Code explanation.

EXAMPLE.TYPESCRIPT
const res = await code.explain({
  code: 'let a = 1, b = 2; [a, b] = [b, a]',
  language: 'javascript',
  depth: 'detailed',
  targetAudience: 'beginner'
})
console.log(res.result)

code.review(payload, options?)

Code review.

EXAMPLE.TYPESCRIPT
const res = await code.review({
  code: myCode,
  language: 'typescript',
  focusAreas: ['security', 'performance', 'best-practices']
})
console.log(res.result)

code.refactor(payload, options?)

Code refactoring.

EXAMPLE.TYPESCRIPT
const res = await code.refactor({
  code: legacyCode,
  language: 'javascript',
  goals: ['readability', 'maintainability']
})
console.log(res.result)

code.debug(payload, options?)

Code debugging.

EXAMPLE.TYPESCRIPT
const res = await code.debug({
  code: buggyCode,
  error: 'TypeError: Cannot read property...',
  stackTrace: '...'
})
console.log(res.result)

Analysis Capabilities (analysis)

analysis.detectIntent(payload, options?)

Intent detection.

EXAMPLE.TYPESCRIPT
const res = await analysis.detectIntent({
  text: 'Book a flight to New York tomorrow',
  possibleIntents: ['book_flight', 'book_hotel', 'query_weather']
})
console.log(res.result)

analysis.analyzeSentiment(payload, options?)

Sentiment analysis.

EXAMPLE.TYPESCRIPT
const res = await analysis.analyzeSentiment({
  text: 'This product is amazing!',
  granularity: 'document'
})
console.log(res.result)

analysis.extractContent(payload, options?)

Content extraction.

EXAMPLE.TYPESCRIPT
const res = await analysis.extractContent({
  text: 'Contact John at [email protected] or call 555-1234',
  extractTypes: ['people', 'phones', 'emails']
})
console.log(res.result)

analysis.extractKeywords(payload, options?)

Keyword extraction.

EXAMPLE.TYPESCRIPT
const res = await analysis.extractKeywords({
  text: articleContent,
  maxKeywords: 10,
  includeScores: true
})
console.log(res.result)

analysis.classify(payload, options?)

Text classification.

EXAMPLE.TYPESCRIPT
const res = await analysis.classify({
  text: 'Apple released a new iPhone',
  categories: ['Technology', 'Sports', 'Entertainment', 'Finance'],
  multiLabel: false
})
console.log(res.result)

Vision Processing (vision)

vision.ocr(payload, options?)

OCR text recognition.

EXAMPLE.TYPESCRIPT
const res = await vision.ocr({
  source: {
    type: 'data-url',
    dataUrl: 'data:image/png;base64,...'
  },
  language: 'en',
  includeLayout: true,
  includeKeywords: true
})
console.log(res.result.text)

Image Source Types:

EXAMPLE.TYPESCRIPT
// Data URL
const sourceDataUrl = { type: 'data-url', dataUrl: 'data:image/png;base64,...' } as const

// File path
const sourceFile = { type: 'file', filePath: '/path/to/image.png' } as const

// Base64
const sourceBase64 = { type: 'base64', base64: '...' } as const

console.log(sourceDataUrl.type, sourceFile.type, sourceBase64.type)
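
When an image arrives as raw bytes rather than a ready-made data URL, a small helper can wrap it into a data-url source. The helper below is a sketch, not part of the SDK:

```typescript
// Hypothetical helper (not part of the SDK): wrap raw image bytes as a
// data-url source suitable for vision.* calls.
function toDataUrlSource(bytes: Uint8Array, mime: string) {
  const base64 = Buffer.from(bytes).toString('base64')
  return { type: 'data-url' as const, dataUrl: `data:${mime};base64,${base64}` }
}

const source = toDataUrlSource(new Uint8Array([137, 80, 78, 71]), 'image/png')
console.log(source.dataUrl.startsWith('data:image/png;base64,')) // true
```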

vision.caption(payload, options?)

Image captioning.

EXAMPLE.TYPESCRIPT
const res = await vision.caption({
  source: { type: 'data-url', dataUrl: imageUrl },
  style: 'detailed',
  language: 'en'
})
console.log(res.result)

vision.analyze(payload, options?)

Image analysis.

EXAMPLE.TYPESCRIPT
const res = await vision.analyze({
  source: { type: 'data-url', dataUrl: imageUrl },
  analysisTypes: ['objects', 'faces', 'colors', 'scene']
})
console.log(res.result)

vision.generate(payload, options?)

Image generation.

EXAMPLE.TYPESCRIPT
const res = await vision.generate({
  prompt: 'A cute cat sitting on a sofa',
  width: 1024,
  height: 1024,
  quality: 'hd',
  count: 1
})
console.log(res.result)

Embedding (embedding)

embedding.generate(payload, options?)

Generate text embeddings.

EXAMPLE.TYPESCRIPT
const res = await embedding.generate({
  text: 'This is some text',
  model: 'text-embedding-3-small'
})

console.log(res.result.length)
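
Embedding vectors are typically compared with cosine similarity. A minimal sketch (the helper is illustrative, not part of the SDK):

```typescript
// Cosine similarity between two embedding vectors (hypothetical helper,
// not part of the SDK). Returns a value in [-1, 1]; 1 means identical direction.
function cosineSimilarity(a: number[], b: number[]): number {
  let dot = 0
  let normA = 0
  let normB = 0
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i]
    normA += a[i] * a[i]
    normB += b[i] * b[i]
  }
  return dot / (Math.sqrt(normA) * Math.sqrt(normB))
}

console.log(cosineSimilarity([1, 0], [1, 0])) // 1
console.log(cosineSimilarity([1, 0], [0, 1])) // 0
```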

RAG (rag)

rag.query(payload, options?)

RAG query.

EXAMPLE.TYPESCRIPT
const res = await rag.query({
  query: 'How to configure plugins?',
  documents: [],
  topK: 5
})
console.log(res.result)

rag.semanticSearch(payload, options?)

Semantic search.

EXAMPLE.TYPESCRIPT
const res = await rag.semanticSearch({
  query: 'user interface design',
  corpus: 'documentation',
  limit: 10
})
console.log(res.result)

rag.rerank(payload, options?)

Result reranking.

EXAMPLE.TYPESCRIPT
const res = await rag.rerank({
  query: 'search query',
  documents: searchResults,
  topK: 5
})
console.log(res.result)

Agent (agent)

agent.run(payload, options?)

Run an agent.

EXAMPLE.TYPESCRIPT
const res = await agent.run({
  task: 'Analyze this data for me',
  tools: ['calculator', 'web_search'],
  context: { data: [] }
})
console.log(res.result)

Generic Invocation

invoke(capabilityId, payload, options?)

Directly invoke any capability.

EXAMPLE.TYPESCRIPT
const res = await invoke('text.chat', {
  messages: [{ role: 'user', content: 'Hello' }]
})
console.log(res.result)

Invocation Options

All methods support a second options parameter:

EXAMPLE.TYPESCRIPT
interface IntelligenceInvokeOptions {
  strategy?: string // Strategy ID
  modelPreference?: string[] // Preferred models list
  costCeiling?: number // Cost ceiling
  latencyTarget?: number // Target latency (ms)
  timeout?: number // Timeout (ms)
  stream?: boolean // Enable streaming
  preferredProviderId?: string // Preferred Provider
  allowedProviderIds?: string[] // Allowed Provider list
}

const _invokeOptions: IntelligenceInvokeOptions = {}
void _invokeOptions
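
In practice it can help to centralize defaults so every call shares the same timeout and provider constraints. A sketch under those assumptions (the withDefaults helper and the chosen values are illustrative, not SDK API):

```typescript
// A subset of the invoke options shown above.
interface IntelligenceInvokeOptions {
  strategy?: string
  modelPreference?: string[]
  timeout?: number
  stream?: boolean
  preferredProviderId?: string
}

// Hypothetical helper: merge caller options over shared defaults.
function withDefaults(opts: IntelligenceInvokeOptions = {}): IntelligenceInvokeOptions {
  return { timeout: 30_000, stream: false, ...opts }
}

// Caller overrides win; unspecified fields fall back to the defaults.
const opts = withDefaults({ timeout: 5_000, preferredProviderId: 'openai' })
console.log(opts.timeout, opts.stream) // 5000 false
```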

Response Structure

All APIs return a unified response structure:

EXAMPLE.TYPESCRIPT
interface IntelligenceInvokeResult<T> {
  result: T // Result data
  usage: {
    promptTokens: number
    completionTokens: number
    totalTokens: number
    cost?: number
  }
  model: string // Model used
  latency: number // Request latency (ms)
  traceId: string // Trace ID
  provider: string // Provider used
}

const _invokeResult = {} as IntelligenceInvokeResult<string>
void _invokeResult
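
Because every response carries the same usage shape, it is straightforward to aggregate tokens and cost across a batch of calls. The helper below is a sketch, not part of the SDK:

```typescript
// The usage shape from the response structure above.
interface IntelligenceUsage {
  promptTokens: number
  completionTokens: number
  totalTokens: number
  cost?: number
}

// Hypothetical helper: sum usage across several invocations, e.g. to
// report total tokens and cost for a batch of requests.
function totalUsage(usages: IntelligenceUsage[]): IntelligenceUsage {
  return usages.reduce(
    (acc, u) => ({
      promptTokens: acc.promptTokens + u.promptTokens,
      completionTokens: acc.completionTokens + u.completionTokens,
      totalTokens: acc.totalTokens + u.totalTokens,
      cost: (acc.cost ?? 0) + (u.cost ?? 0)
    }),
    { promptTokens: 0, completionTokens: 0, totalTokens: 0, cost: 0 }
  )
}

const sum = totalUsage([
  { promptTokens: 10, completionTokens: 20, totalTokens: 30, cost: 0.01 },
  { promptTokens: 5, completionTokens: 5, totalTokens: 10 }
])
console.log(sum.totalTokens) // 40
```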

State Management

EXAMPLE.TYPESCRIPT
import { watch } from 'vue'
import { useIntelligence } from '@talex-touch/utils/renderer/hooks'

const { isLoading, lastError } = useIntelligence()

// Watch loading state
watch(isLoading, loading => console.log('loading', loading))

// Watch errors
watch(lastError, error => console.log('error', error))

Provider Types

EXAMPLE.TYPESCRIPT
enum IntelligenceProviderType {
  OPENAI = 'openai',
  ANTHROPIC = 'anthropic',
  DEEPSEEK = 'deepseek',
  SILICONFLOW = 'siliconflow',
  LOCAL = 'local',
  CUSTOM = 'custom'
}

void IntelligenceProviderType.OPENAI

Capability Types

EXAMPLE.TYPESCRIPT
enum IntelligenceCapabilityType {
  // Text
  CHAT, COMPLETION, EMBEDDING, SUMMARIZE, TRANSLATE, REWRITE, GRAMMAR_CHECK,
  // Code
  CODE_GENERATE, CODE_EXPLAIN, CODE_REVIEW, CODE_REFACTOR, CODE_DEBUG,
  // Analysis
  INTENT_DETECT, SENTIMENT_ANALYZE, CONTENT_EXTRACT, KEYWORDS_EXTRACT, CLASSIFICATION,
  // Audio
  TTS, STT, AUDIO_TRANSCRIBE,
  // Vision
  VISION, VISION_OCR, IMAGE_CAPTION, IMAGE_ANALYZE, IMAGE_GENERATE, IMAGE_EDIT,
  // RAG
  RAG_QUERY, SEMANTIC_SEARCH, RERANK,
  // Workflow
  WORKFLOW, AGENT
}

void IntelligenceCapabilityType.CHAT

Best Practices

  • Gate calls by quota/subscription to avoid failures.
  • Redact sensitive input where appropriate and prompt for confirmation.
  • Cache or rate-limit high-frequency requests to control cost.

Technical Notes

  • The SDK wraps calls in the renderer, while the main process routes to concrete providers.
  • Unified responses include timing and token usage for monitoring.