ontoref/crates/ontoref-derive/src/lib.rs
Jesús Pérez 82a358f18d
feat: #[onto_mcp_tool] catalog, OCI credential vault layer, validate ADR-018 mode hierarchy
ontoref-derive: #[onto_mcp_tool] attribute macro registers MCP tool unit-structs in
  the catalog at link time via inventory::submit!; annotated item is emitted unchanged,
  ToolBase/AsyncTool impls stay on the struct. All 34 tools migrated from manual wiring
  (net +5: ontoref_list_projects, ontoref_search, ontoref_describe,
  ontoref_list_ontology_extensions, ontoref_get_ontology_extension).

  validate modes (ADR-018): reads level_hierarchy from workflow.ncl and checks every
  .ncl mode for level declared, strategy declared, delegate chain coherent, compose
  extends valid. mode resolve <id> shows which hierarchy level handles a mode and why.
  --self-test generates synthetic fixtures in a temp dir for CI smoke-testing.

  validate run-cargo: two-step Cargo.toml resolution — workspace layout first
  (crates/<check.crate>/Cargo.toml), single-crate fallback by package name or repo
  basename. Lets the same ADR constraint shape apply to workspace and single-crate repos.

  ontology/schemas/manifest.ncl: registry_topology_type contract — multi-registry
  coordination, push targets, participant scopes, per-namespace capability.

  reflection/requirements/base.ncl: oras ≥1.2.0, cosign ≥2.0.0, sops ≥3.9.0, age
  ≥1.1.0, restic declared as Hard/Soft requirements with version_min, check_cmd, and
  install_hint (ADR-017 toolchain surface).

  ADR-019: per-file recipient routing for tenant isolation without multi-vault. Schema
  additions: sops.recipient_groups + sops.recipient_rules in ontoref-project.ncl.
  secrets-bootstrap generates .sops.yaml from project.ncl in declarative mode. Three
  new secrets-audit checks: recipient-routing-coherent, recipient-routing-coverage,
  no-multi-vault. Adoption templates: single-team/, multi-tenant/, agent-first/.
  Integration templates: domain-producer/, mode-producer/, mode-consumer/.

  UI: project_picker surfaces registry badge (⟳ participant) and vault badge
  (⛁ vault_id · N, green=declarative / amber=legacy) per project card. Expanded panel
  adds collapsible Registry section with namespace, endpoint, and push/pull capability.
  manage.html gains Runtime Services card — MCP and GraphQL toggleable without restart
  via HTMX POST /ui/manage/services/{service}/toggle.

  describe.nu: capabilities JSON includes registry_topology and vault_state per project.
  sync.nu: drift check extended to detect //! absence on newly registered crates.
  qa.ncl: six entries — credential-vault-best-practice (layered data-flow diagram),
  credential-vault-templates (paths A/B/C), credential-vault-troubleshooting (15 named
  errors), integration-what-and-why (ADR-042 OCI federation), integration-how-to-implement,
  integration-troubleshooting.

  on+re: core.ncl + manifest.ncl updated to reflect OCI, MCP, and mode-hierarchy nodes.
  Deleted stale presentation assets (2026-02 slides + voice notes).
2026-05-12 04:46:15 +01:00


//! Proc-macro crate for the ontoref protocol.
//!
//! Exports `#[onto_api]`, `#[onto_mcp_tool]`, `#[derive(OntologyNode)]`,
//! `#[derive(ConfigFields)]`, and `#[onto_validates]`.
//!
//! `#[onto_api(...)]` is an attribute macro for daemon HTTP handler
//! functions that registers each endpoint in the `api_catalog` at link time
//! via `inventory::submit!`. Metadata declared in the attribute (auth level,
//! actor set, tags, params) is exported to `artifacts/api-catalog-*.ncl` by
//! `just export-api-catalog`, making the full HTTP surface queryable as typed
//! NCL without requiring a running daemon (ADR-007).
use proc_macro::TokenStream;
use proc_macro2::Span;
use quote::quote;
use syn::{
parse_macro_input, punctuated::Punctuated, DeriveInput, Expr, ExprLit, ItemFn, Lit, LitStr,
MetaNameValue, Token,
};
// ── #[onto_api(...)]
// ──────────────────────────────────────────────────────────
/// Attribute macro for daemon HTTP handler functions.
///
/// Registers the endpoint in the `api_catalog` at link time via
/// `inventory::submit!`. The annotated function is emitted unchanged.
///
/// # Required keys
/// - `method = "GET"` — HTTP verb
/// - `path = "/graph/impact"` — URL path pattern (axum syntax)
/// - `description = "..."` — one-line description (optional if a `///` doc
/// comment is present; explicit attribute value takes priority over the doc
/// comment)
///
/// # Optional keys
/// - `auth = "none"` — authentication level: "none" | "viewer" | "bearer" |
/// "admin" (default: "none")
/// - `actors = "agent, developer"` — comma-separated actor contexts
/// - `params = "name:type:constraint:desc; ..."` — semicolon-separated param
/// entries
/// - `tags = "graph, federation"` — comma-separated semantic tags
/// - `feature = "db"` — feature flag required for this endpoint (empty = always
/// available)
///
/// # Example
/// ```ignore
/// #[onto_api(
/// method = "GET", path = "/graph/impact",
/// description = "Cross-project impact graph from an ontology node",
/// auth = "viewer", actors = "agent, developer",
/// params = "node:string:required:Ontology node id; depth:u32:default=2:Max BFS hops",
/// tags = "graph, federation",
/// )]
/// async fn graph_impact(...) { ... }
/// ```
#[proc_macro_attribute]
pub fn onto_api(args: TokenStream, input: TokenStream) -> TokenStream {
match expand_onto_api(args, input) {
Ok(ts) => ts.into(),
Err(err) => err.to_compile_error().into(),
}
}
/// Parsed fields from `#[onto_api(...)]`.
struct OntoApiAttr {
method: String,
path: String,
description: String,
auth: String,
actors: Vec<String>,
params: Vec<OntoApiParam>,
tags: Vec<String>,
feature: String,
}
struct OntoApiParam {
name: String,
kind: String,
constraint: String,
description: String,
}
fn expand_onto_api(args: TokenStream, input: TokenStream) -> syn::Result<proc_macro2::TokenStream> {
let item = proc_macro2::TokenStream::from(input);
// Extract first non-empty `///` doc comment from the annotated function.
// `/// text` compiles to `#[doc = " text"]` before the macro sees it.
let doc_desc: Option<String> = syn::parse2::<ItemFn>(item.clone())
.ok()
.and_then(|fn_item| fn_item.attrs.into_iter().find_map(doc_attr_text));
let kv_args = syn::parse::Parser::parse(
Punctuated::<MetaNameValue, Token![,]>::parse_terminated,
args,
)?;
let mut method: Option<String> = None;
let mut path: Option<String> = None;
let mut description: Option<String> = None;
let mut auth = "none".to_owned();
let mut actors: Vec<String> = Vec::new();
let mut params_raw: Option<String> = None;
let mut tags: Vec<String> = Vec::new();
let mut feature = String::new();
for kv in &kv_args {
let key = kv
.path
.get_ident()
.ok_or_else(|| syn::Error::new_spanned(&kv.path, "expected identifier"))?
.to_string();
let val = lit_str(&kv.value)
.ok_or_else(|| syn::Error::new_spanned(&kv.value, "expected string literal"))?;
match key.as_str() {
"method" => method = Some(val),
"path" => path = Some(val),
"description" => description = Some(val),
"auth" => match val.as_str() {
"none" | "viewer" | "bearer" | "admin" => auth = val,
other => {
return Err(syn::Error::new_spanned(
&kv.value,
format!(
"unknown auth level '{other}'; expected none | viewer | bearer | admin"
),
))
}
},
"actors" => actors = split_csv(&val),
"params" => params_raw = Some(val),
"tags" => tags = split_csv(&val),
"feature" => feature = val,
other => {
return Err(syn::Error::new_spanned(
&kv.path,
format!("unknown onto_api key: {other}"),
))
}
}
}
let method = method.ok_or_else(|| {
syn::Error::new(Span::call_site(), "#[onto_api] requires method = \"...\"")
})?;
let path = path
.ok_or_else(|| syn::Error::new(Span::call_site(), "#[onto_api] requires path = \"...\""))?;
let desc = description.or(doc_desc).ok_or_else(|| {
syn::Error::new(
Span::call_site(),
"#[onto_api] requires description = \"...\" or a /// doc comment on the function",
)
})?;
let params = parse_params(params_raw.as_deref().unwrap_or(""))?;
let attr = OntoApiAttr {
method,
path,
description: desc,
auth,
actors,
params,
tags,
feature,
};
let ts = emit_onto_api(attr, item);
Ok(ts)
}
fn split_csv(s: &str) -> Vec<String> {
s.split(',')
.map(|p| p.trim().to_owned())
.filter(|p| !p.is_empty())
.collect()
}
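The list-valued keys (`actors`, `tags`) share this trimming and empty-filtering behaviour. As an illustrative sketch (a standalone copy, not part of the macro crate's API), the function can be exercised directly:

```rust
// Standalone copy of `split_csv` above: trims whitespace around each
// element and drops empty entries, so trailing commas are harmless.
fn split_csv(s: &str) -> Vec<String> {
    s.split(',')
        .map(|p| p.trim().to_owned())
        .filter(|p| !p.is_empty())
        .collect()
}
```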
/// Parse `"name:type:constraint:description; ..."` param string.
/// Separator between params: `;`. Fields within a param: `:`, split with
/// `splitn(4, ':')`, so the trailing description field may itself contain
/// colons.
fn parse_params(raw: &str) -> syn::Result<Vec<OntoApiParam>> {
if raw.trim().is_empty() {
return Ok(Vec::new());
}
raw.split(';')
.map(|entry| {
let parts: Vec<&str> = entry.trim().splitn(4, ':').collect();
if parts.len() < 3 {
return Err(syn::Error::new(
Span::call_site(),
format!("param entry '{entry}' must have at least name:type:constraint"),
));
}
Ok(OntoApiParam {
name: parts[0].trim().to_owned(),
kind: parts[1].trim().to_owned(),
constraint: parts[2].trim().to_owned(),
description: parts.get(3).map(|s| s.trim()).unwrap_or("").to_owned(),
})
})
.collect()
}
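The grammar above can be sketched without any `syn` machinery. This hypothetical helper (`parse_entry` is not in the crate) mirrors the per-entry split: only the first three `:` are structural, so a description containing colons survives intact:

```rust
// Mirror of the `parse_params` field split above: fields separated by ':',
// at most four pieces via `splitn(4, ..)`, name/type/constraint mandatory.
fn parse_entry(entry: &str) -> Option<(String, String, String, String)> {
    let parts: Vec<&str> = entry.trim().splitn(4, ':').collect();
    if parts.len() < 3 {
        return None; // name:type:constraint are required
    }
    Some((
        parts[0].trim().to_owned(),
        parts[1].trim().to_owned(),
        parts[2].trim().to_owned(),
        // Fourth piece is the free-text description; may be absent.
        parts.get(3).map(|s| s.trim()).unwrap_or("").to_owned(),
    ))
}
```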
fn emit_onto_api(attr: OntoApiAttr, item: proc_macro2::TokenStream) -> proc_macro2::TokenStream {
let method = LitStr::new(&attr.method, Span::call_site());
let path = LitStr::new(&attr.path, Span::call_site());
let desc = LitStr::new(&attr.description, Span::call_site());
let auth = LitStr::new(&attr.auth, Span::call_site());
let feature = LitStr::new(&attr.feature, Span::call_site());
let actor_lits: Vec<LitStr> = attr
.actors
.iter()
.map(|a| LitStr::new(a, Span::call_site()))
.collect();
let tag_lits: Vec<LitStr> = attr
.tags
.iter()
.map(|t| LitStr::new(t, Span::call_site()))
.collect();
let param_exprs: Vec<_> = attr
.params
.iter()
.map(|p| {
let n = LitStr::new(&p.name, Span::call_site());
let k = LitStr::new(&p.kind, Span::call_site());
let c = LitStr::new(&p.constraint, Span::call_site());
let d = LitStr::new(&p.description, Span::call_site());
quote! {
::ontoref_ontology::api::ApiParam { name: #n, kind: #k, constraint: #c, description: #d }
}
})
.collect();
// Ident derived from method+path; registering the same route twice in one
// module reproduces the ident and fails to compile.
let unique = {
let s = format!("{}{}", attr.method, attr.path);
s.bytes()
.fold(5381u64, |h, b| h.wrapping_mul(33).wrapping_add(b as u64))
};
let static_ident = syn::Ident::new(
&format!("__ONTOREF_API_ROUTE_{unique:x}"),
Span::call_site(),
);
quote! {
::inventory::submit! {
::ontoref_ontology::api::ApiRouteEntry {
method: #method,
path: #path,
description: #desc,
auth: #auth,
actors: &[#(#actor_lits),*],
params: &[#(#param_exprs),*],
tags: &[#(#tag_lits),*],
feature: #feature,
// file!() expands at call site — the .rs file where #[onto_api] is placed.
source_file: file!(),
}
}
#[doc(hidden)]
#[allow(non_upper_case_globals, dead_code)]
static #static_ident: () = ();
#item
}
}
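The unique-ident derivation used here (and by the other macros below) is a plain djb2 fold. A standalone sketch of that hashing, under the assumption that collisions between distinct routes are acceptable only as compile errors on the generated static:

```rust
// The djb2 hash used above to derive a static ident from method+path.
// Distinct routes get distinct suffixes; duplicate registrations of the
// same route in one module map to the same ident.
fn route_ident_hash(method: &str, path: &str) -> u64 {
    format!("{method}{path}")
        .bytes()
        .fold(5381u64, |h, b| h.wrapping_mul(33).wrapping_add(b as u64))
}
```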
// ── #[onto_mcp_tool(...)]
// ──────────────────────────────────────────────────
/// Attribute macro for MCP tool unit-structs in ontoref-daemon.
///
/// Registers the tool in the MCP catalog at link time via
/// `inventory::submit!(McpToolEntry{...})`. The annotated item is emitted
/// unchanged — `ToolBase` and `AsyncTool` impls below the struct continue to
/// own the executable behaviour (ADR-015).
///
/// # Required keys
/// - `name = "ontoref_xxx"` — must equal `ToolBase::name()` for the same struct
/// - `description = "..."` — one-line agent-facing description
///
/// # Optional keys
/// - `category = "discovery"` — semantic grouping (discovery | ontology |
/// knowledge | validation | config). Empty by default.
/// - `params = "name:type:constraint:desc; ..."` — same grammar as
/// `#[onto_api(params = ...)]`; semicolon-separated, four-field entries.
///
/// # Example
/// ```ignore
/// #[onto_mcp_tool(
/// name = "ontoref_search",
/// description = "Free-text search across nodes, ADRs, and modes.",
/// category = "discovery",
/// params = "query:string:required:Search term; project:string:optional:Project slug",
/// )]
/// struct SearchTool;
/// ```
#[proc_macro_attribute]
pub fn onto_mcp_tool(args: TokenStream, input: TokenStream) -> TokenStream {
match expand_onto_mcp_tool(args, input) {
Ok(ts) => ts.into(),
Err(err) => err.to_compile_error().into(),
}
}
fn expand_onto_mcp_tool(
args: TokenStream,
input: TokenStream,
) -> syn::Result<proc_macro2::TokenStream> {
let item = proc_macro2::TokenStream::from(input);
let kv_args = syn::parse::Parser::parse(
Punctuated::<MetaNameValue, Token![,]>::parse_terminated,
args,
)?;
let mut name: Option<String> = None;
let mut description: Option<String> = None;
let mut category = String::new();
let mut params_raw: Option<String> = None;
for kv in &kv_args {
let key = kv
.path
.get_ident()
.ok_or_else(|| syn::Error::new_spanned(&kv.path, "expected identifier"))?
.to_string();
let val = lit_str(&kv.value)
.ok_or_else(|| syn::Error::new_spanned(&kv.value, "expected string literal"))?;
match key.as_str() {
"name" => name = Some(val),
"description" => description = Some(val),
"category" => category = val,
"params" => params_raw = Some(val),
other => {
return Err(syn::Error::new_spanned(
&kv.path,
format!(
"unknown onto_mcp_tool key: {other}; expected name, description, \
category, params"
),
))
}
}
}
let name = name.ok_or_else(|| {
syn::Error::new(
Span::call_site(),
"#[onto_mcp_tool] requires name = \"ontoref_xxx\"",
)
})?;
let description = description.ok_or_else(|| {
syn::Error::new(
Span::call_site(),
"#[onto_mcp_tool] requires description = \"...\"",
)
})?;
let params = parse_params(params_raw.as_deref().unwrap_or(""))?;
let name_lit = LitStr::new(&name, Span::call_site());
let desc_lit = LitStr::new(&description, Span::call_site());
let category_lit = LitStr::new(&category, Span::call_site());
let param_exprs: Vec<_> = params
.iter()
.map(|p| {
let n = LitStr::new(&p.name, Span::call_site());
let k = LitStr::new(&p.kind, Span::call_site());
let c = LitStr::new(&p.constraint, Span::call_site());
let d = LitStr::new(&p.description, Span::call_site());
quote! {
::ontoref_ontology::ApiParam { name: #n, kind: #k, constraint: #c, description: #d }
}
})
.collect();
let unique = name
.bytes()
.fold(5381u64, |h, b| h.wrapping_mul(33).wrapping_add(b as u64));
let static_ident =
syn::Ident::new(&format!("__ONTOREF_MCP_TOOL_{unique:x}"), Span::call_site());
Ok(quote! {
::inventory::submit! {
::ontoref_ontology::McpToolEntry {
name: #name_lit,
description: #desc_lit,
category: #category_lit,
params: &[#(#param_exprs),*],
source_file: file!(),
}
}
#[doc(hidden)]
#[allow(non_upper_case_globals, dead_code)]
static #static_ident: () = ();
#item
})
}
// ── Attribute parsing
// ─────────────────────────────────────────────────────────
/// Parsed contents of a single `#[onto(...)]` attribute.
#[derive(Default)]
struct OntoAttr {
id: Option<String>,
name: Option<String>,
level: Option<String>,
pole: Option<String>,
description: Option<String>,
adrs: Vec<String>,
paths: Vec<String>,
invariant: Option<bool>,
}
/// Parse `key = "value"` pairs from a `#[onto(k = "v", ...)]` attribute.
fn parse_onto_attr(attr: &syn::Attribute) -> syn::Result<OntoAttr> {
let mut out = OntoAttr::default();
let args = attr.parse_args_with(Punctuated::<MetaNameValue, Token![,]>::parse_terminated)?;
for kv in &args {
let key = kv
.path
.get_ident()
.ok_or_else(|| syn::Error::new_spanned(&kv.path, "expected identifier"))?
.to_string();
match key.as_str() {
"id" | "name" | "level" | "pole" | "description" => {
let s = lit_str(&kv.value)
.ok_or_else(|| syn::Error::new_spanned(&kv.value, "expected string literal"))?;
match key.as_str() {
"id" => out.id = Some(s),
"name" => out.name = Some(s),
"level" => out.level = Some(s),
"pole" => out.pole = Some(s),
"description" => out.description = Some(s),
_ => unreachable!(),
}
}
"adrs" => {
// adrs = "adr-001, adr-002" — comma-separated list in a single string
let s = lit_str(&kv.value)
.ok_or_else(|| syn::Error::new_spanned(&kv.value, "expected string literal"))?;
out.adrs.extend(s.split(',').map(|a| a.trim().to_owned()));
}
"paths" => {
// paths = "crates/foo/, docs/foo.md" — comma-separated artifact paths
let s = lit_str(&kv.value)
.ok_or_else(|| syn::Error::new_spanned(&kv.value, "expected string literal"))?;
out.paths.extend(
s.split(',')
.map(|p| p.trim().to_owned())
.filter(|p| !p.is_empty()),
);
}
"invariant" => {
out.invariant =
Some(lit_bool(&kv.value).ok_or_else(|| {
syn::Error::new_spanned(&kv.value, "expected bool literal")
})?);
}
other => {
return Err(syn::Error::new_spanned(
&kv.path,
format!(
"unknown onto key: {other}; expected id, name, level, pole, description, \
adrs, paths, invariant"
),
));
}
}
}
Ok(out)
}
/// Extract the text of a `#[doc = "..."]` attribute, or `None` if it is empty
/// or not a doc attribute.
fn doc_attr_text(attr: syn::Attribute) -> Option<String> {
if !attr.path().is_ident("doc") {
return None;
}
let syn::Meta::NameValue(mnv) = attr.meta else {
return None;
};
let Expr::Lit(ExprLit {
lit: Lit::Str(s), ..
}) = mnv.value
else {
return None;
};
let t = s.value().trim().to_owned();
if t.is_empty() {
None
} else {
Some(t)
}
}
fn lit_str(expr: &Expr) -> Option<String> {
if let Expr::Lit(ExprLit {
lit: Lit::Str(s), ..
}) = expr
{
Some(s.value())
} else {
None
}
}
fn lit_bool(expr: &Expr) -> Option<bool> {
if let Expr::Lit(ExprLit {
lit: Lit::Bool(b), ..
}) = expr
{
Some(b.value())
} else {
None
}
}
// ── #[derive(OntologyNode)]
// ───────────────────────────────────────────────────
/// Derive macro that registers a Rust type as a
/// `NodeContribution` (see `ontoref_ontology::contrib::NodeContribution`).
///
/// The `#[onto(...)]` attribute declares the node's identity in the ontology
/// DAG. All `#[onto]` helper attributes on the type are merged in declaration
/// order — later keys overwrite earlier ones, except `adrs` and `paths`,
/// which accumulate.
///
/// # Required attributes
/// - `id = "my-node-id"` — unique node identifier (must match NCL convention)
/// - `level = "Practice"` — `AbstractionLevel` variant name
/// - `pole = "Yang"` — `Pole` variant name
///
/// # Optional attributes
/// - `name = "Human Name"` — display name (defaults to `id` if absent)
/// - `description = "..."` — one-line description; omit to fall back to the
/// `///` doc comment on the type
/// - `adrs = "adr-001, adr-002"` — comma-separated ADR references (accumulates
/// across multiple `#[onto]` attributes)
/// - `paths = "crates/foo/, docs/bar.md"` — comma-separated artifact paths
/// (accumulates across multiple `#[onto]` attributes)
/// - `invariant = true` — mark node as invariant (default: false)
///
/// # Example
/// ```ignore
/// /// Caches nickel export results to avoid re-eval on unchanged files.
/// #[derive(OntologyNode)]
/// #[onto(id = "ncl-cache", name = "NCL Cache", level = "Practice", pole = "Yang")]
/// #[onto(adrs = "adr-002, adr-004", paths = "crates/ontoref-daemon/src/cache.rs")]
/// pub struct NclCache { /* ... */ }
/// ```
#[proc_macro_derive(OntologyNode, attributes(onto))]
pub fn derive_ontology_node(input: TokenStream) -> TokenStream {
let ast = parse_macro_input!(input as DeriveInput);
match expand_ontology_node(ast) {
Ok(ts) => ts.into(),
Err(err) => err.to_compile_error().into(),
}
}
fn expand_ontology_node(ast: DeriveInput) -> syn::Result<proc_macro2::TokenStream> {
// Merge all #[onto(...)] attributes on the type.
let mut merged = OntoAttr::default();
for attr in ast.attrs.iter().filter(|a| a.path().is_ident("onto")) {
let parsed = parse_onto_attr(attr)?;
if parsed.id.is_some() {
merged.id = parsed.id;
}
if parsed.level.is_some() {
merged.level = parsed.level;
}
if parsed.pole.is_some() {
merged.pole = parsed.pole;
}
if parsed.description.is_some() {
merged.description = parsed.description;
}
if parsed.invariant.is_some() {
merged.invariant = parsed.invariant;
}
merged.adrs.extend(parsed.adrs);
merged.paths.extend(parsed.paths);
if parsed.name.is_some() {
merged.name = parsed.name;
}
}
let id = merged.id.ok_or_else(|| {
syn::Error::new(
Span::call_site(),
"#[derive(OntologyNode)] requires #[onto(id = \"...\")]",
)
})?;
let level_str = merged.level.ok_or_else(|| {
syn::Error::new(
Span::call_site(),
"#[derive(OntologyNode)] requires #[onto(level = \"...\")]",
)
})?;
let pole_str = merged.pole.ok_or_else(|| {
syn::Error::new(
Span::call_site(),
"#[derive(OntologyNode)] requires #[onto(pole = \"...\")]",
)
})?;
// Validate level and pole at compile time via known variant names.
let level_variant = match level_str.as_str() {
"Axiom" => quote! { ::ontoref_ontology::AbstractionLevel::Axiom },
"Tension" => quote! { ::ontoref_ontology::AbstractionLevel::Tension },
"Practice" => quote! { ::ontoref_ontology::AbstractionLevel::Practice },
"Project" => quote! { ::ontoref_ontology::AbstractionLevel::Project },
"Moment" => quote! { ::ontoref_ontology::AbstractionLevel::Moment },
other => {
return Err(syn::Error::new(
Span::call_site(),
format!(
"unknown AbstractionLevel: {other}; expected one of Axiom, Tension, Practice, \
Project, Moment"
),
))
}
};
let pole_variant = match pole_str.as_str() {
"Yang" => quote! { ::ontoref_ontology::Pole::Yang },
"Yin" => quote! { ::ontoref_ontology::Pole::Yin },
"Spiral" => quote! { ::ontoref_ontology::Pole::Spiral },
other => {
return Err(syn::Error::new(
Span::call_site(),
format!("unknown Pole: {other}; expected one of Yang, Yin, Spiral"),
))
}
};
// description: explicit attribute wins; fall back to /// doc comment on the
// type.
let doc_desc_type: Option<String> = ast.attrs.iter().cloned().find_map(doc_attr_text);
let description = merged.description.or(doc_desc_type).unwrap_or_default();
// name: explicit attribute wins; fall back to id.
let name = merged.name.unwrap_or_else(|| id.clone());
let invariant = merged.invariant.unwrap_or(false);
let adrs: Vec<LitStr> = merged
.adrs
.iter()
.filter(|s| !s.is_empty())
.map(|s| LitStr::new(s, Span::call_site()))
.collect();
let path_lits: Vec<LitStr> = merged
.paths
.iter()
.filter(|s| !s.is_empty())
.map(|s| LitStr::new(s, Span::call_site()))
.collect();
let id_lit = LitStr::new(&id, Span::call_site());
let name_lit = LitStr::new(&name, Span::call_site());
let description_lit = LitStr::new(&description, Span::call_site());
// Derive a unique identifier for the inventory submission from the type name.
let type_name = &ast.ident;
let submission_ident = syn::Ident::new(
&format!("__ONTOREF_NODE_CONTRIB_{}", type_name),
Span::call_site(),
);
Ok(quote! {
#[automatically_derived]
impl #type_name {
/// Returns the ontology node declared by `#[derive(OntologyNode)]`.
pub fn ontology_node() -> ::ontoref_ontology::Node {
::ontoref_ontology::Node {
id: #id_lit.to_owned(),
name: #name_lit.to_owned(),
pole: #pole_variant,
level: #level_variant,
description: #description_lit.to_owned(),
invariant: #invariant,
artifact_paths: vec![#(#path_lits.to_owned()),*],
adrs: vec![#(#adrs.to_owned()),*],
}
}
}
#[cfg(feature = "derive")]
::inventory::submit! {
::ontoref_ontology::NodeContribution {
supplier: <#type_name>::ontology_node,
}
}
// Deriving `OntologyNode` twice for the same type name in one module
// collides on this static at compile time.
#[cfg(feature = "derive")]
#[doc(hidden)]
static #submission_ident: () = ();
})
}
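The overwrite-vs-accumulate merge performed across multiple `#[onto]` attributes reduces to a small std-only model. This simplified sketch (hypothetical struct with one scalar key and one list key, not the real `OntoAttr`) shows the two behaviours side by side:

```rust
// Simplified model of the `#[onto]` merge above: scalar keys are
// last-writer-wins, while `adrs` (and `paths`) accumulate across attributes.
#[derive(Default)]
struct Merged {
    id: Option<String>,
    adrs: Vec<String>,
}

fn merge(attrs: &[(Option<&str>, &[&str])]) -> Merged {
    let mut out = Merged::default();
    for (id, adrs) in attrs {
        if let Some(id) = id {
            out.id = Some((*id).to_owned()); // later attribute overwrites
        }
        out.adrs.extend(adrs.iter().map(|a| (*a).to_owned())); // accumulates
    }
    out
}
```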
/// Extract a `#[serde(rename = "...")]` value from a field's attributes.
/// Returns `None` if no serde rename is present.
fn serde_rename_of(field: &syn::Field) -> Option<String> {
use syn::punctuated::Punctuated;
use syn::MetaNameValue;
for attr in &field.attrs {
if !attr.path().is_ident("serde") {
continue;
}
let Ok(args) =
attr.parse_args_with(Punctuated::<MetaNameValue, Token![,]>::parse_terminated)
else {
continue;
};
let renamed = args
.iter()
.find(|kv| kv.path.is_ident("rename"))
.and_then(|kv| lit_str(&kv.value));
if renamed.is_some() {
return renamed;
}
}
None
}
// ── #[derive(ConfigFields)]
// ────────────────────────────────────────────────────────
/// Derive macro that extracts serde field names from a config struct and
/// registers them via `inventory::submit!` at link time.
///
/// Annotate the struct with `#[config_section(id = "...", ncl_file = "...")]`
/// to declare which NCL section this struct reads. The macro then emits an
/// `inventory::submit!(ConfigFieldsEntry { ... })` for each annotated struct,
/// allowing ontoref to compare declared Rust fields against NCL section exports
/// without running the daemon.
///
/// Respects `#[serde(rename = "...")]` — the registered field name is the
/// JSON key serde would deserialize, not the Rust identifier.
///
/// # Example
///
/// ```ignore
/// #[derive(serde::Deserialize, ConfigFields)]
/// #[config_section(id = "server", ncl_file = "config/server.ncl")]
/// pub struct ServerConfig {
/// pub host: String,
/// #[serde(rename = "listen_port")]
/// pub port: u16,
/// }
/// // Registers: section_id="server", fields=["host","listen_port"]
/// ```
#[proc_macro_derive(ConfigFields, attributes(config_section))]
pub fn derive_config_fields(input: TokenStream) -> TokenStream {
let ast = parse_macro_input!(input as DeriveInput);
match expand_config_fields(ast) {
Ok(ts) => ts.into(),
Err(err) => err.to_compile_error().into(),
}
}
fn expand_config_fields(ast: DeriveInput) -> syn::Result<proc_macro2::TokenStream> {
// Parse #[config_section(id = "...", ncl_file = "...")]
let mut section_id: Option<String> = None;
let mut ncl_file: Option<String> = None;
for attr in ast
.attrs
.iter()
.filter(|a| a.path().is_ident("config_section"))
{
let args =
attr.parse_args_with(Punctuated::<MetaNameValue, Token![,]>::parse_terminated)?;
for kv in &args {
let key = kv
.path
.get_ident()
.ok_or_else(|| syn::Error::new_spanned(&kv.path, "expected identifier"))?
.to_string();
let val = lit_str(&kv.value)
.ok_or_else(|| syn::Error::new_spanned(&kv.value, "expected string literal"))?;
match key.as_str() {
"id" => section_id = Some(val),
"ncl_file" => ncl_file = Some(val),
other => {
return Err(syn::Error::new_spanned(
&kv.path,
format!("unknown config_section key: {other}; expected id or ncl_file"),
))
}
}
}
}
let section_id = section_id.ok_or_else(|| {
syn::Error::new(
Span::call_site(),
"#[derive(ConfigFields)] requires #[config_section(id = \"...\", ncl_file = \"...\")]",
)
})?;
let ncl_file = ncl_file.ok_or_else(|| {
syn::Error::new(
Span::call_site(),
"#[derive(ConfigFields)] requires #[config_section(ncl_file = \"...\")]",
)
})?;
// Extract named fields, respecting #[serde(rename = "...")].
let fields = match &ast.data {
syn::Data::Struct(s) => match &s.fields {
syn::Fields::Named(named) => &named.named,
_ => {
return Err(syn::Error::new(
Span::call_site(),
"#[derive(ConfigFields)] requires a struct with named fields",
))
}
},
_ => {
return Err(syn::Error::new(
Span::call_site(),
"#[derive(ConfigFields)] can only be used on structs",
))
}
};
let field_names: Vec<String> = fields
.iter()
.map(|f| {
serde_rename_of(f)
.unwrap_or_else(|| f.ident.as_ref().map(|i| i.to_string()).unwrap_or_default())
})
.filter(|s| !s.is_empty())
.collect();
let field_lits: Vec<LitStr> = field_names
.iter()
.map(|s| LitStr::new(s, Span::call_site()))
.collect();
let section_lit = LitStr::new(&section_id, Span::call_site());
let ncl_file_lit = LitStr::new(&ncl_file, Span::call_site());
let type_name = &ast.ident;
let struct_name_lit = LitStr::new(&type_name.to_string(), Span::call_site());
let unique = {
let s = format!("{section_id}{ncl_file}");
s.bytes()
.fold(5381u64, |h, b| h.wrapping_mul(33).wrapping_add(b as u64))
};
let static_ident = syn::Ident::new(
&format!("__ONTOREF_CONFIG_FIELDS_{unique:x}"),
Span::call_site(),
);
Ok(quote! {
::inventory::submit! {
::ontoref_ontology::ConfigFieldsEntry {
section_id: #section_lit,
ncl_file: #ncl_file_lit,
struct_name: #struct_name_lit,
fields: &[#(#field_lits),*],
}
}
#[doc(hidden)]
#[allow(non_upper_case_globals, dead_code)]
static #static_ident: () = ();
})
}
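The name-extraction precedence above (a `#[serde(rename = "...")]` value wins over the Rust identifier, empty names are dropped) can be sketched without `syn`. The helper below is hypothetical, operating on pre-extracted `(ident, rename)` pairs rather than real fields:

```rust
// Sketch of the field-name extraction above: the serde rename, when
// present, replaces the Rust identifier; empty names are filtered out.
fn effective_names(fields: &[(&str, Option<&str>)]) -> Vec<String> {
    fields
        .iter()
        .map(|&(ident, rename)| rename.unwrap_or(ident).to_owned())
        .filter(|s| !s.is_empty())
        .collect()
}
```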
// ── #[onto_validates]
// ─────────────────────────────────────────────────────────
/// Attribute macro for test functions: registers which ontology practices and
/// ADRs the test validates.
///
/// Only active under `#[cfg(test)]` and the `derive` feature — zero
/// production binary impact.
///
/// # Example
/// ```ignore
/// #[onto_validates(practice = "ncl-cache", adr = "adr-002")]
/// #[test]
/// fn cache_returns_stale_on_missing_file() { /* ... */ }
/// ```
#[proc_macro_attribute]
pub fn onto_validates(args: TokenStream, input: TokenStream) -> TokenStream {
match expand_onto_validates(args, input) {
Ok(ts) => ts.into(),
Err(err) => err.to_compile_error().into(),
}
}
fn expand_onto_validates(
args: TokenStream,
input: TokenStream,
) -> syn::Result<proc_macro2::TokenStream> {
let item = proc_macro2::TokenStream::from(input);
// Parse key=value pairs from the attribute args.
let kv_args = syn::parse::Parser::parse(
Punctuated::<MetaNameValue, Token![,]>::parse_terminated,
args,
)?;
let mut practice_id: Option<String> = None;
let mut adr_id: Option<String> = None;
for kv in &kv_args {
let key = kv
.path
.get_ident()
.ok_or_else(|| syn::Error::new_spanned(&kv.path, "expected identifier"))?
.to_string();
match key.as_str() {
"practice" => {
practice_id = Some(
lit_str(&kv.value)
.ok_or_else(|| syn::Error::new_spanned(&kv.value, "expected string"))?,
)
}
"adr" => {
adr_id = Some(
lit_str(&kv.value)
.ok_or_else(|| syn::Error::new_spanned(&kv.value, "expected string"))?,
)
}
other => {
return Err(syn::Error::new_spanned(
&kv.path,
format!("unknown onto_validates key: {other}; expected 'practice' or 'adr'"),
))
}
}
}
let practice_tokens = match &practice_id {
Some(p) => quote! { ::core::option::Option::Some(#p) },
None => quote! { ::core::option::Option::None },
};
let adr_tokens = match &adr_id {
Some(a) => quote! { ::core::option::Option::Some(#a) },
None => quote! { ::core::option::Option::None },
};
// We need a unique ident for the inventory submission per call site.
// Hash the annotated fn name together with the args so two tests that
// validate the same practice/ADR pair in one module do not collide.
let fn_name = syn::parse2::<ItemFn>(item.clone())
.ok()
.map(|f| f.sig.ident.to_string())
.unwrap_or_default();
let hash = {
let s = format!(
"{}{}{}",
fn_name,
practice_id.as_deref().unwrap_or(""),
adr_id.as_deref().unwrap_or("")
);
// Simple djb2 hash for uniqueness in the ident.
s.bytes()
.fold(5381u64, |h, b| h.wrapping_mul(33).wrapping_add(b as u64))
};
let submission_ident = syn::Ident::new(
&format!("__ONTOREF_TEST_COVERAGE_{hash:x}"),
Span::call_site(),
);
Ok(quote! {
#[cfg(all(test, feature = "derive"))]
::inventory::submit! {
::ontoref_ontology::TestCoverage {
practice_id: #practice_tokens,
adr_id: #adr_tokens,
}
}
#[cfg(all(test, feature = "derive"))]
#[doc(hidden)]
static #submission_ident: () = ();
// Emit the original item unchanged.
#item
})
}