Jesús Pérez 93b0e5225c
feat(platform): control plane — NATS JetStream + SurrealDB + SOLID enforcement
New crates
  - platform-nats: async_nats JetStream bridge; pull/push consumers, explicit ACK,
    subject prefixing under provisioning.>, 6 stream definitions on startup
  - platform-db: SurrealDB pool (embedded RocksDB solo, Surreal<Mem> tests,
    WebSocket server multi-user); migrate() with DEFINE TABLE IF NOT EXISTS DDL
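The subject-prefixing rule in platform-nats can be sketched with a small helper. This is an illustrative reconstruction, not the crate's actual API: the function name and domain/entity/event split are assumptions, grounded only in the `provisioning.>` namespace and the `provisioning.workspace.*.deploy.done` subject mentioned elsewhere in this commit.

```rust
/// Hypothetical sketch: every subject the JetStream bridge publishes is
/// namespaced under the `provisioning.` prefix so a single `provisioning.>`
/// wildcard subscription (or stream subject filter) can capture all of them.
fn prefixed_subject(domain: &str, entity: &str, event: &str) -> String {
    format!("provisioning.{domain}.{entity}.{event}")
}
```

A consumer filtering on `provisioning.workspace.*.deploy.done` would then match `prefixed_subject("workspace", "ws-42", "deploy.done")`.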

Service integrations
  - orchestrator: NATS pub on task state transitions, execution_logs → SurrealDB,
    webhook handler (HMAC-SHA256), AuditCollector (batch INSERT, 100-event/1s flush)
  - control-center: solo_auth_middleware (intentional bypass, --mode solo only),
    NATS session events, WebSocket bridge via JetStream subscription (no polling)
  - vault-service: NATS lease flow; credentials over HTTPS only (lease_id in NATS);
    SurrealDB storage backend with MVCC retry + exponential backoff
  - secretumvault: complete SurrealDB backend replacing HashMap; 9 unit + 19 integration tests
  - extension-registry: NATS lifecycle events, vault:// credential resolver with TTL cache,
    cache invalidation via provisioning.workspace.*.deploy.done
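The AuditCollector's "100-event/1s flush" policy can be sketched in a few lines of std-only Rust. This is a minimal illustration of the stated policy, not the orchestrator's actual code; the type and method names are assumptions, and the real collector presumably hands the drained batch to a SurrealDB batch INSERT rather than returning it.

```rust
use std::time::{Duration, Instant};

/// Illustrative sketch of the flush policy described above: buffer audit
/// events and drain them as one batch when either the buffer reaches
/// 100 entries or 1 second has elapsed since the last flush.
struct AuditCollector {
    buf: Vec<String>,
    last_flush: Instant,
    max_batch: usize,
    max_age: Duration,
}

impl AuditCollector {
    fn new() -> Self {
        Self {
            buf: Vec::new(),
            last_flush: Instant::now(),
            max_batch: 100,
            max_age: Duration::from_secs(1),
        }
    }

    /// Buffers one event; returns `Some(batch)` when a flush is due
    /// (size or age threshold hit), leaving the buffer empty.
    fn push(&mut self, event: String) -> Option<Vec<String>> {
        self.buf.push(event);
        if self.buf.len() >= self.max_batch || self.last_flush.elapsed() >= self.max_age {
            self.last_flush = Instant::now();
            Some(std::mem::take(&mut self.buf))
        } else {
            None
        }
    }
}
```

Batching like this trades a bounded delay (at most 1 s) for far fewer round-trips to the database than per-event inserts.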

Clippy workspace clean
  - cargo clippy --workspace -- -D warnings: 0 errors
  - Patterns fixed: derivable_impls (#[default] on enum variants), excessive_nesting
    (let-else, boolean arithmetic in retain, extracted helpers), io_error_other,
    redundant_closure, iter_kv_map, manual_range_contains, pathbuf_instead_of_path
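Two of the lint fixes above are worth a concrete before/after sketch. The examples below are illustrative (the enum and function are hypothetical, not taken from the workspace): `derivable_impls` replaces a hand-written `impl Default` with a derived one plus `#[default]` on the chosen variant, and let-else flattens nesting by handling the failure arm up front.

```rust
/// derivable_impls fix: instead of a manual `impl Default for Mode`,
/// derive it and mark the default variant.
#[derive(Debug, Default, PartialEq)]
enum Mode {
    #[default]
    Solo,
    MultiUser,
}

/// excessive_nesting fix: let-else keeps the happy path at one
/// indentation level instead of nesting inside `if let`.
fn parse_port(raw: Option<&str>) -> u16 {
    let Some(s) = raw else {
        return 8080; // fall back to the default port when unset
    };
    s.parse().unwrap_or(8080)
}
```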
2026-02-17 23:58:14 +00:00


//! LLM integration using stratum-llm

use stratum_llm::{
    AnthropicProvider, ConfiguredProvider, CredentialSource, GenerationOptions, Message,
    ProviderChain, Role, UnifiedClient,
};
use tracing::info;

use crate::error::Result;

/// Thin wrapper around a `stratum_llm::UnifiedClient` configured with a
/// single Anthropic provider.
pub struct LlmClient {
    client: UnifiedClient,
    pub model: String,
}

impl LlmClient {
    /// Builds a client for `model`. If `ANTHROPIC_API_KEY` is unset the
    /// client is still constructed (with an empty key) so startup succeeds,
    /// but generation calls will fail.
    pub fn new(model: String) -> Result<Self> {
        let api_key = std::env::var("ANTHROPIC_API_KEY").ok();
        if api_key.is_none() {
            tracing::warn!("ANTHROPIC_API_KEY not set - LLM calls will fail");
        }
        let provider = AnthropicProvider::new(api_key.unwrap_or_default(), model.clone());
        let configured = ConfiguredProvider {
            provider: Box::new(provider),
            credential_source: CredentialSource::EnvVar {
                name: "ANTHROPIC_API_KEY".to_string(),
            },
            priority: 0,
        };
        let chain = ProviderChain::with_providers(vec![configured]);
        let client = UnifiedClient::builder()
            .with_chain(chain)
            .build()
            .map_err(|e| {
                crate::error::RagError::LlmError(format!("Failed to build LLM client: {}", e))
            })?;
        info!("Initialized stratum-llm client: {}", model);
        Ok(Self { client, model })
    }

    /// Answers `query` grounded in the retrieved `context`, which is
    /// embedded in the system prompt ahead of the user message.
    pub async fn generate_answer(&self, query: &str, context: &str) -> Result<String> {
        let system_prompt = format!(
            r#"You are a helpful assistant answering questions about a provisioning platform.
You have been provided with relevant documentation context below.
Answer the user's question based on this context.
Be concise and accurate.
# Retrieved Context
{}
"#,
            context
        );
        let messages = vec![
            Message {
                role: Role::System,
                content: system_prompt,
            },
            Message {
                role: Role::User,
                content: query.to_string(),
            },
        ];
        let options = GenerationOptions {
            max_tokens: Some(1024),
            ..Default::default()
        };
        let response = self
            .client
            .generate(&messages, Some(&options))
            .await
            .map_err(|e| {
                crate::error::RagError::LlmError(format!("LLM generation failed: {}", e))
            })?;
        info!("Generated answer: {} characters", response.content.len());
        Ok(response.content)
    }
}

#[cfg(test)]
mod tests {
    use super::*;

    #[test]
    fn test_llm_client_creation() {
        let client = LlmClient::new("claude-opus-4".to_string());
        assert!(client.is_ok());
    }

    #[test]
    fn test_llm_client_model() {
        let client = LlmClient::new("claude-sonnet-4".to_string());
        assert!(client.is_ok());
        assert_eq!(client.unwrap().model, "claude-sonnet-4");
    }
}