---
feat: API catalog surface, protocol v2 tooling, MCP expansion, on+re update
## Summary
Session 2026-03-23. Closes the loop between handler code and discoverability
across all three surfaces (browser, CLI, MCP agent) via compile-time inventory
registration. Adds protocol v2 update tooling, extends MCP from 21 to 29 tools,
and brings the self-description up to date.
## API Catalog Surface (#[onto_api] proc-macro)
- crates/ontoref-derive: new proc-macro crate; `#[onto_api(method, path,
description, auth, actors, params, tags)]` emits `inventory::submit!(ApiRouteEntry{...})`
at link time
- crates/ontoref-daemon/src/api_catalog.rs: `catalog()` — pure fn over
`inventory::iter::<ApiRouteEntry>()`, zero runtime allocation
- GET /api/catalog: returns full annotated HTTP surface as JSON
- templates/pages/api_catalog.html: new page with client-side filtering by
method, auth, path/description; detail panel per route (params table,
feature flag); linked from dashboard card and nav
- UI nav: "API" link (</> icon) added to mobile dropdown and desktop bar
- inventory = "0.3" added to workspace.dependencies (MIT, zero transitive deps)
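The submit-then-iterate pattern described above can be sketched without the proc-macro. In this sketch the `ApiRouteEntry` field set is an assumption inferred from the attribute keys listed in this commit, and a static slice stands in for `inventory::iter::<ApiRouteEntry>()`:

```rust
// Minimal stand-in for the link-time catalog: the daemon populates entries
// through `inventory::submit!`; here a hand-built slice plays that role.
struct ApiRouteEntry {
    method: &'static str,
    path: &'static str,
    auth: &'static str,
    tags: &'static [&'static str],
}

// Analogue of `catalog()`: a pure iteration over registered entries.
static ROUTES: &[ApiRouteEntry] = &[
    ApiRouteEntry { method: "GET", path: "/api/catalog", auth: "none", tags: &["meta"] },
    ApiRouteEntry { method: "GET", path: "/graph/impact", auth: "viewer", tags: &["graph"] },
];

fn by_auth(auth: &str) -> Vec<&'static ApiRouteEntry> {
    ROUTES.iter().filter(|r| r.auth == auth).collect()
}

fn main() {
    let viewer = by_auth("viewer");
    assert_eq!(viewer.len(), 1);
    assert_eq!(viewer[0].path, "/graph/impact");
}
```

Because the slice is immutable static data, the filter allocates only the result vector, matching the "zero runtime allocation" claim for `catalog()` itself.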
## Protocol Update Mode
- reflection/modes/update_ontoref.ncl: 9-step DAG (5 detect parallel, 2 update
idempotent, 2 validate, 1 report) — brings any project from protocol v1 to v2
by adding manifest.ncl and connections.ncl if absent, scanning ADRs for
deprecated check_hint, validating with nickel export
- reflection/templates/update-ontology-prompt.md: 8-phase reusable prompt for
agent-driven ontology enrichment (infrastructure → audit → core.ncl →
state.ncl → manifest.ncl → connections.ncl → ADR migration → validation)
## CLI — describe group extensions
- reflection/bin/ontoref.nu: `describe diff [--fmt] [--file]` and
`describe api [--actor] [--tag] [--auth] [--fmt]` registered as canonical
subcommands with log-action; aliases `df` and `da` added; QUICK REFERENCE
and ALIASES sections updated
## MCP — new tools (21 → 29 total)
- ontoref_api_catalog: filters catalog() output by actor/tag/auth; returns
{ routes, total } — no HTTP roundtrip, calls inventory directly
- ontoref_file_versions: reads ProjectContext.file_versions DashMap per slug;
returns BTreeMap<filename, u64> reload counters
- insert_mcp_ctx: audited and updated from 15 to 28 entries in 6 groups
- HelpTool JSON: 8 new entries (validate_adrs, validate, impact, guides,
bookmark_list, bookmark_add, api_catalog, file_versions)
- ServerHandler::get_info instructions updated to mention new tools
## Web UI — dashboard additions
- Dashboard: "API Catalog" card (9th); "Ontology File Versions" section showing
per-file reload counters from file_versions DashMap
- dashboard_mp: builds BTreeMap<String, u64> from ctx.file_versions and injects
into Tera context
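The `dashboard_mp` step above boils down to snapshotting an unordered concurrent map into a `BTreeMap` so the template renders files in a stable sorted order. A sketch, with a plain `HashMap` standing in for the daemon's `DashMap`:

```rust
use std::collections::{BTreeMap, HashMap};

// Snapshot per-file reload counters into sorted form for template rendering.
// (The daemon reads a DashMap; a HashMap stands in here for the same idea.)
fn snapshot(versions: &HashMap<String, u64>) -> BTreeMap<String, u64> {
    versions.iter().map(|(k, v)| (k.clone(), *v)).collect()
}

fn main() {
    let mut live = HashMap::new();
    live.insert("state.ncl".to_string(), 3);
    live.insert("core.ncl".to_string(), 7);
    let snap = snapshot(&live);
    // BTreeMap iterates in key order regardless of insertion order.
    let keys: Vec<_> = snap.keys().cloned().collect();
    assert_eq!(keys, vec!["core.ncl".to_string(), "state.ncl".to_string()]);
}
```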
## on+re update
- .ontology/core.ncl: describe-query-layer and adopt-ontoref-tooling descriptions
updated; ontoref-daemon updated ("11 pages", "29 tools", API catalog,
per-file versioning, #[onto_api]); new node api-catalog-surface (Yang/Practice)
with 3 edges; artifact_paths extended across 3 nodes
- .ontology/state.ncl: protocol-maturity blocker updated (protocol v2 complete);
self-description-coverage catalyst updated with session 2026-03-23 additions
- ADR-007: "API Surface Discoverability via #[onto_api] Proc-Macro" — Accepted
## Documentation
- README.md: crates table updated (11 pages, 29 MCP tools, ontoref-derive row);
MCP representative table expanded; API Catalog, Semantic Diff, Per-File
Versioning paragraphs added; update_ontoref onboarding section added
- CHANGELOG.md: [Unreleased] section with 4 change groups
- assets/web/src/index.html: tool counts 19→29 (EN+ES), page counts 12→11
(EN+ES), daemon description paragraph updated with API catalog + #[onto_api]
2026-03-23 00:58:27 +01:00
use proc_macro::TokenStream;
use proc_macro2::Span;
use quote::quote;
use syn::{
    parse_macro_input, punctuated::Punctuated, DeriveInput, Expr, ExprLit, Lit, LitStr,
    MetaNameValue, Token,
};
// ── #[onto_api(...)]
// ──────────────────────────────────────────────────────────

/// Attribute macro for daemon HTTP handler functions.
///
/// Registers the endpoint in the `api_catalog` at link time via
/// `inventory::submit!`. The annotated function is emitted unchanged.
///
/// # Required keys
/// - `method = "GET"` — HTTP verb
/// - `path = "/graph/impact"` — URL path pattern (axum syntax)
/// - `description = "..."` — one-line description of what the endpoint does
///
/// # Optional keys
/// - `auth = "none"` — authentication level: "none" | "viewer" | "admin"
///   (default: "none")
/// - `actors = "agent, developer"` — comma-separated actor contexts
/// - `params = "name:type:constraint:desc; ..."` — semicolon-separated param
///   entries
/// - `tags = "graph, federation"` — comma-separated semantic tags
/// - `feature = "db"` — feature flag required for this endpoint (empty = always
///   available)
///
/// # Example
/// ```ignore
/// #[onto_api(
///     method = "GET", path = "/graph/impact",
///     description = "Cross-project impact graph from an ontology node",
///     auth = "viewer", actors = "agent, developer",
///     params = "node:string:required:Ontology node id; depth:u32:default=2:Max BFS hops",
///     tags = "graph, federation",
/// )]
/// async fn graph_impact(...) { ... }
/// ```
#[proc_macro_attribute]
pub fn onto_api(args: TokenStream, input: TokenStream) -> TokenStream {
    match expand_onto_api(args, input) {
        Ok(ts) => ts.into(),
        Err(err) => err.to_compile_error().into(),
    }
}
/// Parsed fields from `#[onto_api(...)]`.
struct OntoApiAttr {
    method: String,
    path: String,
    description: String,
    auth: String,
    actors: Vec<String>,
    params: Vec<OntoApiParam>,
    tags: Vec<String>,
    feature: String,
}

struct OntoApiParam {
    name: String,
    kind: String,
    constraint: String,
    description: String,
}
fn expand_onto_api(args: TokenStream, input: TokenStream) -> syn::Result<proc_macro2::TokenStream> {
    let item = proc_macro2::TokenStream::from(input);

    let kv_args = syn::parse::Parser::parse(
        Punctuated::<MetaNameValue, Token![,]>::parse_terminated,
        args,
    )?;

    let mut method: Option<String> = None;
    let mut path: Option<String> = None;
    let mut description: Option<String> = None;
    let mut auth = "none".to_owned();
    let mut actors: Vec<String> = Vec::new();
    let mut params_raw: Option<String> = None;
    let mut tags: Vec<String> = Vec::new();
    let mut feature = String::new();

    for kv in &kv_args {
        let key = kv
            .path
            .get_ident()
            .ok_or_else(|| syn::Error::new_spanned(&kv.path, "expected identifier"))?
            .to_string();
        let val = lit_str(&kv.value)
            .ok_or_else(|| syn::Error::new_spanned(&kv.value, "expected string literal"))?;

        match key.as_str() {
            "method" => method = Some(val),
            "path" => path = Some(val),
            "description" => description = Some(val),
            "auth" => match val.as_str() {
                "none" | "viewer" | "admin" => auth = val,
                other => {
                    return Err(syn::Error::new_spanned(
                        &kv.value,
                        format!("unknown auth level '{other}'; expected none | viewer | admin"),
                    ))
                }
            },
            "actors" => actors = split_csv(&val),
            "params" => params_raw = Some(val),
            "tags" => tags = split_csv(&val),
            "feature" => feature = val,
            other => {
                return Err(syn::Error::new_spanned(
                    &kv.path,
                    format!("unknown onto_api key: {other}"),
                ))
            }
        }
    }

    let method = method.ok_or_else(|| {
        syn::Error::new(Span::call_site(), "#[onto_api] requires method = \"...\"")
    })?;
    let path = path
        .ok_or_else(|| syn::Error::new(Span::call_site(), "#[onto_api] requires path = \"...\""))?;
    let desc = description.ok_or_else(|| {
        syn::Error::new(
            Span::call_site(),
            "#[onto_api] requires description = \"...\"",
        )
    })?;

    let params = parse_params(params_raw.as_deref().unwrap_or(""))?;

    let attr = OntoApiAttr {
        method,
        path,
        description: desc,
        auth,
        actors,
        params,
        tags,
        feature,
    };
    let ts = emit_onto_api(attr, item);
    Ok(ts)
}
fn split_csv(s: &str) -> Vec<String> {
    s.split(',')
        .map(|p| p.trim().to_owned())
        .filter(|p| !p.is_empty())
        .collect()
}
/// Parse `"name:type:constraint:description; ..."` param string.
/// Separator between params: `;`. Fields within a param: `:` (max 4 splits).
fn parse_params(raw: &str) -> syn::Result<Vec<OntoApiParam>> {
    if raw.trim().is_empty() {
        return Ok(Vec::new());
    }
    raw.split(';')
        .map(|entry| {
            let parts: Vec<&str> = entry.trim().splitn(4, ':').collect();
            if parts.len() < 3 {
                return Err(syn::Error::new(
                    Span::call_site(),
                    format!("param entry '{entry}' must have at least name:type:constraint"),
                ));
            }
            Ok(OntoApiParam {
                name: parts[0].trim().to_owned(),
                kind: parts[1].trim().to_owned(),
                constraint: parts[2].trim().to_owned(),
                description: parts.get(3).map(|s| s.trim()).unwrap_or("").to_owned(),
            })
        })
        .collect()
}
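A detail worth noting in `parse_params`: because the split is capped at four fields with `splitn(4, ':')`, the description field may itself contain colons. A standalone sketch of that split rule:

```rust
// Standalone sketch of the `name:type:constraint:description` split used by
// `parse_params`: `splitn(4, ':')` caps the split, so everything after the
// third ':' lands in the description verbatim.
fn split_param(entry: &str) -> (String, String, String, String) {
    let parts: Vec<&str> = entry.trim().splitn(4, ':').collect();
    (
        parts[0].trim().to_string(),
        parts[1].trim().to_string(),
        parts[2].trim().to_string(),
        parts.get(3).map(|s| s.trim()).unwrap_or("").to_string(),
    )
}

fn main() {
    let (name, kind, constraint, desc) =
        split_param("depth:u32:default=2:Max BFS hops: clamped server-side");
    assert_eq!(name, "depth");
    assert_eq!(kind, "u32");
    assert_eq!(constraint, "default=2");
    // The fourth field keeps its inner colon intact.
    assert_eq!(desc, "Max BFS hops: clamped server-side");
}
```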
fn emit_onto_api(attr: OntoApiAttr, item: proc_macro2::TokenStream) -> proc_macro2::TokenStream {
    let method = LitStr::new(&attr.method, Span::call_site());
    let path = LitStr::new(&attr.path, Span::call_site());
    let desc = LitStr::new(&attr.description, Span::call_site());
    let auth = LitStr::new(&attr.auth, Span::call_site());
    let feature = LitStr::new(&attr.feature, Span::call_site());

    let actor_lits: Vec<LitStr> = attr
        .actors
        .iter()
        .map(|a| LitStr::new(a, Span::call_site()))
        .collect();

    let tag_lits: Vec<LitStr> = attr
        .tags
        .iter()
        .map(|t| LitStr::new(t, Span::call_site()))
        .collect();

    let param_exprs: Vec<_> = attr
        .params
        .iter()
        .map(|p| {
            let n = LitStr::new(&p.name, Span::call_site());
            let k = LitStr::new(&p.kind, Span::call_site());
            let c = LitStr::new(&p.constraint, Span::call_site());
            let d = LitStr::new(&p.description, Span::call_site());
            quote! {
                crate::api_catalog::ApiParam { name: #n, kind: #k, constraint: #c, description: #d }
            }
        })
        .collect();

    // Unique ident derived from path+method to prevent duplicate statics.
    let unique = {
        let s = format!("{}{}", attr.method, attr.path);
        s.bytes()
            .fold(5381u64, |h, b| h.wrapping_mul(33).wrapping_add(b as u64))
    };
    let static_ident = syn::Ident::new(
        &format!("__ONTOREF_API_ROUTE_{unique:x}"),
        Span::call_site(),
    );

    quote! {
        ::inventory::submit! {
            crate::api_catalog::ApiRouteEntry {
                method: #method,
                path: #path,
                description: #desc,
                auth: #auth,
                actors: &[#(#actor_lits),*],
                params: &[#(#param_exprs),*],
                tags: &[#(#tag_lits),*],
                feature: #feature,
            }
        }

        #[doc(hidden)]
        #[allow(non_upper_case_globals, dead_code)]
        static #static_ident: () = ();

        #item
    }
}
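The `unique` computation in `emit_onto_api` is the classic djb2 string hash (seed 5381, `h = h * 33 + byte`), run over method plus path so each route gets a deterministic static name. Isolated for inspection:

```rust
// djb2 as used for the unique static ident above: deterministic across
// builds, so the same route always produces the same ident.
fn djb2(s: &str) -> u64 {
    s.bytes()
        .fold(5381u64, |h, b| h.wrapping_mul(33).wrapping_add(b as u64))
}

fn main() {
    // Empty input folds zero times, leaving the seed.
    assert_eq!(djb2(""), 5381);
    assert_eq!(djb2("GET/graph/impact"), djb2("GET/graph/impact"));
    // Different verb or path is expected to produce a different ident;
    // collisions are possible in principle but not between these inputs.
    assert_ne!(djb2("GET/graph/impact"), djb2("POST/graph/impact"));
}
```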
// ── Attribute parsing
// ─────────────────────────────────────────────────────────

/// Parsed contents of a single `#[onto(...)]` attribute.
#[derive(Default)]
struct OntoAttr {
    id: Option<String>,
    level: Option<String>,
    pole: Option<String>,
    description: Option<String>,
    adrs: Vec<String>,
    invariant: Option<bool>,
}
/// Parse `key = "value"` pairs from a `#[onto(k = "v", ...)]` attribute.
fn parse_onto_attr(attr: &syn::Attribute) -> syn::Result<OntoAttr> {
    let mut out = OntoAttr::default();

    let args = attr.parse_args_with(Punctuated::<MetaNameValue, Token![,]>::parse_terminated)?;

    for kv in &args {
        let key = kv
            .path
            .get_ident()
            .ok_or_else(|| syn::Error::new_spanned(&kv.path, "expected identifier"))?
            .to_string();

        match key.as_str() {
            "id" | "level" | "pole" | "description" => {
                let s = lit_str(&kv.value)
                    .ok_or_else(|| syn::Error::new_spanned(&kv.value, "expected string literal"))?;
                match key.as_str() {
                    "id" => out.id = Some(s),
                    "level" => out.level = Some(s),
                    "pole" => out.pole = Some(s),
                    "description" => out.description = Some(s),
                    _ => unreachable!(),
                }
            }
            "adrs" => {
                // adrs = "adr-001, adr-002" — comma-separated list in a single string
                let s = lit_str(&kv.value)
                    .ok_or_else(|| syn::Error::new_spanned(&kv.value, "expected string literal"))?;
                out.adrs = s.split(',').map(|a| a.trim().to_owned()).collect();
            }
            "invariant" => {
                out.invariant =
                    Some(lit_bool(&kv.value).ok_or_else(|| {
                        syn::Error::new_spanned(&kv.value, "expected bool literal")
                    })?);
            }
            other => {
                return Err(syn::Error::new_spanned(
                    &kv.path,
                    format!("unknown onto key: {other}"),
                ));
            }
        }
    }
    Ok(out)
}
fn lit_str(expr: &Expr) -> Option<String> {
    if let Expr::Lit(ExprLit {
        lit: Lit::Str(s), ..
    }) = expr
    {
        Some(s.value())
    } else {
        None
    }
}

fn lit_bool(expr: &Expr) -> Option<bool> {
    if let Expr::Lit(ExprLit {
        lit: Lit::Bool(b), ..
    }) = expr
    {
        Some(b.value())
    } else {
        None
    }
}
// ── #[derive(OntologyNode)]
// ───────────────────────────────────────────────────

/// Derive macro that registers a Rust type as an
/// [`ontoref_ontology::NodeContribution`].
///
/// The `#[onto(...)]` attribute declares the node's identity in the ontology
/// DAG. All `#[onto]` helper attributes on the type are merged in declaration
/// order — later keys overwrite earlier ones, except `adrs` which concatenates.
///
/// # Required attributes
/// - `id = "my-node-id"` — unique node identifier (must match NCL convention)
/// - `level = "Practice"` — [`AbstractionLevel`] variant name
/// - `pole = "Yang"` — [`Pole`] variant name
///
/// # Optional attributes
/// - `description = "..."` — human-readable description
/// - `adrs = "adr-001, adr-002"` — comma-separated ADR references
/// - `invariant = true` — mark node as invariant (default: false)
///
/// # Example
/// ```ignore
/// #[derive(OntologyNode)]
/// #[onto(id = "ncl-cache", level = "Practice", pole = "Yang")]
/// #[onto(description = "Caches NCL exports to avoid re-eval on unchanged files")]
/// #[onto(adrs = "adr-002, adr-004")]
/// pub struct NclCache { /* ... */ }
/// ```
///
/// [`AbstractionLevel`]: ontoref_ontology::AbstractionLevel
/// [`Pole`]: ontoref_ontology::Pole
#[proc_macro_derive(OntologyNode, attributes(onto))]
pub fn derive_ontology_node(input: TokenStream) -> TokenStream {
    let ast = parse_macro_input!(input as DeriveInput);
    match expand_ontology_node(ast) {
        Ok(ts) => ts.into(),
        Err(err) => err.to_compile_error().into(),
    }
}
fn expand_ontology_node(ast: DeriveInput) -> syn::Result<proc_macro2::TokenStream> {
    // Merge all #[onto(...)] attributes on the type.
    let mut merged = OntoAttr::default();
    for attr in ast.attrs.iter().filter(|a| a.path().is_ident("onto")) {
        let parsed = parse_onto_attr(attr)?;
        if parsed.id.is_some() {
            merged.id = parsed.id;
        }
        if parsed.level.is_some() {
            merged.level = parsed.level;
        }
        if parsed.pole.is_some() {
            merged.pole = parsed.pole;
        }
        if parsed.description.is_some() {
            merged.description = parsed.description;
        }
        if parsed.invariant.is_some() {
            merged.invariant = parsed.invariant;
        }
        merged.adrs.extend(parsed.adrs);
    }

    let id = merged.id.ok_or_else(|| {
        syn::Error::new(
            Span::call_site(),
            "#[derive(OntologyNode)] requires #[onto(id = \"...\")]",
        )
    })?;
    let level_str = merged.level.ok_or_else(|| {
        syn::Error::new(
            Span::call_site(),
            "#[derive(OntologyNode)] requires #[onto(level = \"...\")]",
        )
    })?;
    let pole_str = merged.pole.ok_or_else(|| {
        syn::Error::new(
            Span::call_site(),
            "#[derive(OntologyNode)] requires #[onto(pole = \"...\")]",
        )
    })?;

    // Validate level and pole at compile time via known variant names.
    let level_variant = match level_str.as_str() {
        "Axiom" => quote! { ::ontoref_ontology::AbstractionLevel::Axiom },
        "Tension" => quote! { ::ontoref_ontology::AbstractionLevel::Tension },
        "Practice" => quote! { ::ontoref_ontology::AbstractionLevel::Practice },
        "Project" => quote! { ::ontoref_ontology::AbstractionLevel::Project },
        "Moment" => quote! { ::ontoref_ontology::AbstractionLevel::Moment },
        other => {
            return Err(syn::Error::new(
                Span::call_site(),
                format!(
                    "unknown AbstractionLevel: {other}; expected one of Axiom, Tension, Practice, \
                     Project, Moment"
                ),
            ))
        }
    };

    let pole_variant = match pole_str.as_str() {
        "Yang" => quote! { ::ontoref_ontology::Pole::Yang },
        "Yin" => quote! { ::ontoref_ontology::Pole::Yin },
        "Spiral" => quote! { ::ontoref_ontology::Pole::Spiral },
        other => {
            return Err(syn::Error::new(
                Span::call_site(),
                format!("unknown Pole: {other}; expected one of Yang, Yin, Spiral"),
            ))
        }
    };

    let description = merged.description.as_deref().unwrap_or("");
    let invariant = merged.invariant.unwrap_or(false);
    let adrs: Vec<LitStr> = merged
        .adrs
        .iter()
        .filter(|s| !s.is_empty())
        .map(|s| LitStr::new(s, Span::call_site()))
        .collect();

    let id_lit = LitStr::new(&id, Span::call_site());
    let id_lit2 = id_lit.clone();
    let description_lit = LitStr::new(description, Span::call_site());

    // Derive a unique identifier for the inventory submission from the type name.
    let type_name = &ast.ident;
    let submission_ident = syn::Ident::new(
        &format!("__ONTOREF_NODE_CONTRIB_{}", type_name),
        Span::call_site(),
    );

    Ok(quote! {
        #[automatically_derived]
        impl #type_name {
            /// Returns the ontology node declared by `#[derive(OntologyNode)]`.
            pub fn ontology_node() -> ::ontoref_ontology::Node {
                ::ontoref_ontology::Node {
                    id: #id_lit.to_owned(),
                    name: #id_lit2.to_owned(),
                    pole: #pole_variant,
                    level: #level_variant,
                    description: #description_lit.to_owned(),
                    invariant: #invariant,
                    artifact_paths: vec![],
                    adrs: vec![#(#adrs.to_owned()),*],
                }
            }
        }

        #[cfg(feature = "derive")]
        ::inventory::submit! {
            ::ontoref_ontology::NodeContribution {
                supplier: <#type_name>::ontology_node,
            }
        }

        // Unique static to prevent duplicate submissions at link time.
        #[cfg(feature = "derive")]
        #[doc(hidden)]
        static #submission_ident: () = ();
    })
}
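The merge loop at the top of `expand_ontology_node` implements the documented rule: later `#[onto]` attributes overwrite scalar keys, while `adrs` concatenates in declaration order. A plain-data sketch of just that rule (field names chosen to mirror the real ones):

```rust
// Later attribute wins for scalars; adrs lists concatenate.
#[derive(Default, Debug)]
struct Merged {
    description: Option<String>,
    adrs: Vec<String>,
}

fn merge(mut acc: Merged, next: Merged) -> Merged {
    if next.description.is_some() {
        acc.description = next.description; // last writer wins
    }
    acc.adrs.extend(next.adrs); // declaration-order concatenation
    acc
}

fn main() {
    let first = Merged { description: Some("v1".into()), adrs: vec!["adr-001".into()] };
    let second = Merged { description: Some("v2".into()), adrs: vec!["adr-002".into()] };
    let out = merge(first, second);
    assert_eq!(out.description.as_deref(), Some("v2"));
    assert_eq!(out.adrs, vec!["adr-001".to_string(), "adr-002".to_string()]);
}
```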
feat: config surface, NCL contracts, override-layer mutation, on+re update
Config surface — per-project config introspection, coherence verification, and
audited mutation without destroying NCL structure (ADR-008):
- crates/ontoref-daemon/src/config.rs — typed DaemonNclConfig (parse-at-boundary
pattern); all section structs derive ConfigFields + config_section(id, ncl_file)
emitting inventory::submit!(ConfigFieldsEntry{...}) at link time
- crates/ontoref-derive/src/lib.rs — #[derive(ConfigFields)] proc-macro; serde
rename support; serde_rename_of() helper extracted to fix excessive_nesting
- crates/ontoref-daemon/src/main.rs — 3-tuple bootstrap block (nickel_import_path,
loaded_ncl_config: Option<DaemonNclConfig>, stdin_raw); apply_ui_config takes
&UiConfig; NATS call site typed; resolve_asset_dir cfg(feature = "ui")
- crates/ontoref-daemon/src/api.rs — config GET/PUT endpoints, quickref, coherence,
cross-project comparison; index_section_fields() extracted (excessive_nesting)
- crates/ontoref-daemon/src/config_coherence.rs — multi-consumer coherence;
merge_meta_into_section() extracted; and() replaces unnecessary and_then
NCL contracts for ontoref's own config:
- .ontoref/contracts.ncl — LogConfig (LogLevel, LogRotation, PositiveInt) and
DaemonConfig (Port, optional overrides); std.contract.from_validator throughout
- .ontoref/config.ncl — log | C.LogConfig applied
- .ontology/manifest.ncl — contracts_path, log/daemon contract refs, daemon section
with DaemonRuntimeConfig consumer and 7 declared fields
Protocol:
- adrs/adr-008-ncl-first-config-validation-and-override-layer.ncl — NCL contracts
as single validation gate; Rust structs are contract-trusted; override-layer
mutation writes {section}.overrides.ncl + _overrides_meta, never touches source
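The override-layer idea above can be sketched as a two-map overlay: overrides live in a separate structure (standing in for `{section}.overrides.ncl`) that shadows the source config at read time, so the source is never mutated. This is an illustration of the principle, not the daemon's implementation:

```rust
use std::collections::BTreeMap;

// Effective config = source config shadowed by the override layer.
fn effective(
    base: &BTreeMap<String, String>,
    overrides: &BTreeMap<String, String>,
) -> BTreeMap<String, String> {
    let mut out = base.clone();
    for (k, v) in overrides {
        out.insert(k.clone(), v.clone()); // override shadows base
    }
    out
}

fn main() {
    let mut base = BTreeMap::new();
    base.insert("log.level".to_string(), "info".to_string());
    let mut ovr = BTreeMap::new();
    ovr.insert("log.level".to_string(), "debug".to_string());
    let eff = effective(&base, &ovr);
    assert_eq!(eff["log.level"], "debug");
    assert_eq!(base["log.level"], "info"); // source config untouched
}
```

Deleting the override file restores the source values, which is what makes the mutation auditable and reversible.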
on+re update:
- .ontology/core.ncl — config-surface node (28 practices); adr-lifecycle extended
to adr-007 + adr-008; 6 new edges (ManifestsIn daemon, DependsOn ontology-crate,
Complements api-catalog-surface/dag-formalized/self-describing/adopt-ontoref)
- .ontology/state.ncl — protocol-maturity blocker and self-description-coverage
catalyst updated for session 2026-03-26
- README.md / CHANGELOG.md updated
2026-03-26 20:20:22 +00:00
/// Extract a `#[serde(rename = "...")]` value from a field's attributes.
/// Returns `None` if no serde rename is present.
fn serde_rename_of(field: &syn::Field) -> Option<String> {
    use syn::punctuated::Punctuated;
    use syn::MetaNameValue;
    for attr in &field.attrs {
        if !attr.path().is_ident("serde") {
            continue;
        }
        let Ok(args) =
            attr.parse_args_with(Punctuated::<MetaNameValue, Token![,]>::parse_terminated)
        else {
            continue;
        };
        let renamed = args
            .iter()
            .find(|kv| kv.path.is_ident("rename"))
            .and_then(|kv| lit_str(&kv.value));
        if renamed.is_some() {
            return renamed;
        }
    }
    None
}
// ── #[derive(ConfigFields)]
// ────────────────────────────────────────────────────────

/// Derive macro that extracts serde field names from a config struct and
/// registers them via `inventory::submit!` at link time.
///
/// Annotate the struct with `#[config_section(id = "...", ncl_file = "...")]`
/// to declare which NCL section this struct reads. The macro then emits an
/// `inventory::submit!(ConfigFieldsEntry { ... })` for each annotated struct,
/// allowing ontoref to compare declared Rust fields against NCL section exports
/// without running the daemon.
///
/// Respects `#[serde(rename = "...")]` — the registered field name is the
/// JSON key serde would deserialize, not the Rust identifier.
///
/// # Example
///
/// ```ignore
/// #[derive(serde::Deserialize, ConfigFields)]
/// #[config_section(id = "server", ncl_file = "config/server.ncl")]
/// pub struct ServerConfig {
///     pub host: String,
///     #[serde(rename = "listen_port")]
///     pub port: u16,
/// }
/// // Registers: section_id="server", fields=["host","listen_port"]
/// ```
#[proc_macro_derive(ConfigFields, attributes(config_section))]
pub fn derive_config_fields(input: TokenStream) -> TokenStream {
    let ast = parse_macro_input!(input as DeriveInput);
    match expand_config_fields(ast) {
        Ok(ts) => ts.into(),
        Err(err) => err.to_compile_error().into(),
    }
}
fn expand_config_fields(ast: DeriveInput) -> syn::Result<proc_macro2::TokenStream> {
    // Parse #[config_section(id = "...", ncl_file = "...")]
    let mut section_id: Option<String> = None;
    let mut ncl_file: Option<String> = None;

    for attr in ast
        .attrs
        .iter()
        .filter(|a| a.path().is_ident("config_section"))
    {
        let args =
            attr.parse_args_with(Punctuated::<MetaNameValue, Token![,]>::parse_terminated)?;
        for kv in &args {
            let key = kv
                .path
                .get_ident()
                .ok_or_else(|| syn::Error::new_spanned(&kv.path, "expected identifier"))?
                .to_string();
            let val = lit_str(&kv.value)
                .ok_or_else(|| syn::Error::new_spanned(&kv.value, "expected string literal"))?;
            match key.as_str() {
                "id" => section_id = Some(val),
                "ncl_file" => ncl_file = Some(val),
                other => {
                    return Err(syn::Error::new_spanned(
                        &kv.path,
                        format!("unknown config_section key: {other}; expected id or ncl_file"),
                    ))
                }
            }
        }
    }

    let section_id = section_id.ok_or_else(|| {
        syn::Error::new(
            Span::call_site(),
            "#[derive(ConfigFields)] requires #[config_section(id = \"...\", ncl_file = \"...\")]",
        )
    })?;
    let ncl_file = ncl_file.ok_or_else(|| {
        syn::Error::new(
            Span::call_site(),
            "#[derive(ConfigFields)] requires #[config_section(ncl_file = \"...\")]",
        )
    })?;

    // Extract named fields, respecting #[serde(rename = "...")].
    let fields = match &ast.data {
        syn::Data::Struct(s) => match &s.fields {
            syn::Fields::Named(named) => &named.named,
            _ => {
                return Err(syn::Error::new(
                    Span::call_site(),
                    "#[derive(ConfigFields)] requires a struct with named fields",
                ))
            }
        },
        _ => {
            return Err(syn::Error::new(
                Span::call_site(),
                "#[derive(ConfigFields)] can only be used on structs",
            ))
        }
    };

    let field_names: Vec<String> = fields
        .iter()
        .map(|f| {
            serde_rename_of(f)
                .unwrap_or_else(|| f.ident.as_ref().map(|i| i.to_string()).unwrap_or_default())
        })
        .filter(|s| !s.is_empty())
        .collect();

    let field_lits: Vec<LitStr> = field_names
        .iter()
        .map(|s| LitStr::new(s, Span::call_site()))
        .collect();

    let section_lit = LitStr::new(&section_id, Span::call_site());
    let ncl_file_lit = LitStr::new(&ncl_file, Span::call_site());

    let type_name = &ast.ident;
    let struct_name_lit = LitStr::new(&type_name.to_string(), Span::call_site());

    let unique = {
        let s = format!("{section_id}{ncl_file}");
        s.bytes()
            .fold(5381u64, |h, b| h.wrapping_mul(33).wrapping_add(b as u64))
    };
    let static_ident = syn::Ident::new(
        &format!("__ONTOREF_CONFIG_FIELDS_{unique:x}"),
        Span::call_site(),
    );

    Ok(quote! {
        ::inventory::submit! {
            ::ontoref_ontology::ConfigFieldsEntry {
                section_id: #section_lit,
                ncl_file: #ncl_file_lit,
                struct_name: #struct_name_lit,
                fields: &[#(#field_lits),*],
            }
        }

        #[doc(hidden)]
        #[allow(non_upper_case_globals, dead_code)]
        static #static_ident: () = ();
    })
}
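The field-naming rule in `expand_config_fields` reduces to: a serde rename, when present, wins over the Rust identifier. Isolated as plain data (the helper name here is illustrative, not part of the crate):

```rust
// Registered field name: the serde rename if declared, else the identifier.
fn registered_name(ident: &str, serde_rename: Option<&str>) -> String {
    serde_rename.map(str::to_string).unwrap_or_else(|| ident.to_string())
}

fn main() {
    assert_eq!(registered_name("port", Some("listen_port")), "listen_port");
    assert_eq!(registered_name("host", None), "host");
}
```

This is what lets the coherence check compare NCL section keys against the JSON keys serde actually deserializes.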
---
feat: API catalog surface, protocol v2 tooling, MCP expansion, on+re update
## Summary
Session 2026-03-23. Closes the loop between handler code and discoverability
across all three surfaces (browser, CLI, MCP agent) via compile-time inventory
registration. Adds protocol v2 update tooling, extends MCP from 21 to 29 tools,
and brings the self-description up to date.
## API Catalog Surface (#[onto_api] proc-macro)
- crates/ontoref-derive: new proc-macro crate; `#[onto_api(method, path,
description, auth, actors, params, tags)]` emits `inventory::submit!(ApiRouteEntry{...})`
at link time
- crates/ontoref-daemon/src/api_catalog.rs: `catalog()` — pure fn over
`inventory::iter::<ApiRouteEntry>()`, zero runtime allocation
- GET /api/catalog: returns full annotated HTTP surface as JSON
- templates/pages/api_catalog.html: new page with client-side filtering by
method, auth, path/description; detail panel per route (params table,
feature flag); linked from dashboard card and nav
- UI nav: "API" link (</> icon) added to mobile dropdown and desktop bar
- inventory = "0.3" added to workspace.dependencies (MIT, zero transitive deps)
## Protocol Update Mode
- reflection/modes/update_ontoref.ncl: 9-step DAG (5 detect parallel, 2 update
idempotent, 2 validate, 1 report) — brings any project from protocol v1 to v2
by adding manifest.ncl and connections.ncl if absent, scanning ADRs for
deprecated check_hint, validating with nickel export
- reflection/templates/update-ontology-prompt.md: 8-phase reusable prompt for
agent-driven ontology enrichment (infrastructure → audit → core.ncl →
state.ncl → manifest.ncl → connections.ncl → ADR migration → validation)
## CLI — describe group extensions
- reflection/bin/ontoref.nu: `describe diff [--fmt] [--file]` and
`describe api [--actor] [--tag] [--auth] [--fmt]` registered as canonical
subcommands with log-action; aliases `df` and `da` added; QUICK REFERENCE
and ALIASES sections updated
## MCP — new tools (21 → 29 total)
- ontoref_api_catalog: filters catalog() output by actor/tag/auth; returns
{ routes, total } — no HTTP roundtrip, calls inventory directly
- ontoref_file_versions: reads ProjectContext.file_versions DashMap per slug;
returns BTreeMap<filename, u64> reload counters
- insert_mcp_ctx: audited and updated from 15 to 28 entries in 6 groups
- HelpTool JSON: 8 new entries (validate_adrs, validate, impact, guides,
bookmark_list, bookmark_add, api_catalog, file_versions)
- ServerHandler::get_info instructions updated to mention new tools
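The actor/tag/auth filtering in ontoref_api_catalog can be sketched as follows (field and function names here are assumptions, not the crate's API; `None` means "no constraint"):

```rust
// Assumed route shape; only the fields the filters touch.
struct Route {
    actors: &'static [&'static str],
    tags: &'static [&'static str],
    auth: bool,
}

// Each Option-valued filter either passes everything (None) or keeps
// only matching routes (Some). Chaining .filter() keeps the logic
// per-criterion and order-independent.
fn filter_routes<'a>(
    routes: &'a [Route],
    actor: Option<&str>,
    tag: Option<&str>,
    auth: Option<bool>,
) -> Vec<&'a Route> {
    routes
        .iter()
        .filter(|r| actor.map_or(true, |a| r.actors.contains(&a)))
        .filter(|r| tag.map_or(true, |t| r.tags.contains(&t)))
        .filter(|r| auth.map_or(true, |w| r.auth == w))
        .collect()
}

fn main() {
    let routes = [
        Route { actors: &["agent"], tags: &["catalog"], auth: false },
        Route { actors: &["human"], tags: &["ui"], auth: true },
    ];
    // A { routes, total } response is then the filtered list plus its length.
    let hits = filter_routes(&routes, Some("agent"), None, None);
    println!("total = {}", hits.len());
}
```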
## Web UI — dashboard additions
- Dashboard: "API Catalog" card (9th); "Ontology File Versions" section showing
per-file reload counters from file_versions DashMap
- dashboard_mp: builds BTreeMap<String, u64> from ctx.file_versions and injects
into Tera context
## on+re update
- .ontology/core.ncl: describe-query-layer and adopt-ontoref-tooling descriptions
updated; ontoref-daemon updated ("11 pages", "29 tools", API catalog,
per-file versioning, #[onto_api]); new node api-catalog-surface (Yang/Practice)
with 3 edges; artifact_paths extended across 3 nodes
- .ontology/state.ncl: protocol-maturity blocker updated (protocol v2 complete);
self-description-coverage catalyst updated with session 2026-03-23 additions
- ADR-007: "API Surface Discoverability via #[onto_api] Proc-Macro" — Accepted
## Documentation
- README.md: crates table updated (11 pages, 29 MCP tools, ontoref-derive row);
MCP representative table expanded; API Catalog, Semantic Diff, Per-File
Versioning paragraphs added; update_ontoref onboarding section added
- CHANGELOG.md: [Unreleased] section with 4 change groups
- assets/web/src/index.html: tool counts 19→29 (EN+ES), page counts 12→11
(EN+ES), daemon description paragraph updated with API catalog + #[onto_api]
2026-03-23 00:58:27 +01:00

// ── #[onto_validates]
// ─────────────────────────────────────────────────────────

/// Attribute macro for test functions: registers which ontology practices and
/// ADRs the test validates.
///
/// Only active under `#[cfg(test)]` — zero production binary impact.
///
/// # Example
/// ```ignore
/// #[onto_validates(practice = "ncl-cache", adr = "adr-002")]
/// #[test]
/// fn cache_returns_stale_on_missing_file() { /* ... */ }
/// ```
#[proc_macro_attribute]
pub fn onto_validates(args: TokenStream, input: TokenStream) -> TokenStream {
    match expand_onto_validates(args, input) {
        Ok(ts) => ts.into(),
        Err(err) => err.to_compile_error().into(),
    }
}

fn expand_onto_validates(
    args: TokenStream,
    input: TokenStream,
) -> syn::Result<proc_macro2::TokenStream> {
    let item = proc_macro2::TokenStream::from(input);

    // Parse key=value pairs from the attribute args.
    let kv_args = syn::parse::Parser::parse(
        Punctuated::<MetaNameValue, Token![,]>::parse_terminated,
        args,
    )?;

    let mut practice_id: Option<String> = None;
    let mut adr_id: Option<String> = None;

    for kv in &kv_args {
        let key = kv
            .path
            .get_ident()
            .ok_or_else(|| syn::Error::new_spanned(&kv.path, "expected identifier"))?
            .to_string();
        match key.as_str() {
            "practice" => {
                practice_id = Some(
                    lit_str(&kv.value)
                        .ok_or_else(|| syn::Error::new_spanned(&kv.value, "expected string"))?,
                )
            }
            "adr" => {
                adr_id = Some(
                    lit_str(&kv.value)
                        .ok_or_else(|| syn::Error::new_spanned(&kv.value, "expected string"))?,
                )
            }
            other => {
                return Err(syn::Error::new_spanned(
                    &kv.path,
                    format!("unknown onto_validates key: {other}; expected 'practice' or 'adr'"),
                ))
            }
        }
    }

    let practice_tokens = match &practice_id {
        Some(p) => quote! { ::core::option::Option::Some(#p) },
        None => quote! { ::core::option::Option::None },
    };
    let adr_tokens = match &adr_id {
        Some(a) => quote! { ::core::option::Option::Some(#a) },
        None => quote! { ::core::option::Option::None },
    };

    // We need a unique ident for the inventory submission per call site.
    // Hash the args together with the annotated item's tokens: hashing the
    // args alone would collide for two tests with identical practice/adr.
    let hash = {
        let s = format!(
            "{}{}{}",
            practice_id.as_deref().unwrap_or(""),
            adr_id.as_deref().unwrap_or(""),
            item
        );
        // Simple djb2 hash for uniqueness in the ident.
        s.bytes()
            .fold(5381u64, |h, b| h.wrapping_mul(33).wrapping_add(b as u64))
    };
    let submission_ident = syn::Ident::new(
        &format!("__ONTOREF_TEST_COVERAGE_{hash:x}"),
        Span::call_site(),
    );

    Ok(quote! {
        #[cfg(all(test, feature = "derive"))]
        ::inventory::submit! {
            ::ontoref_ontology::TestCoverage {
                practice_id: #practice_tokens,
                adr_id: #adr_tokens,
            }
        }

        #[cfg(all(test, feature = "derive"))]
        #[doc(hidden)]
        static #submission_ident: () = ();

        // Emit the original item unchanged.
        #item
    })
}
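The djb2 fold used for the submission ident is plain Rust and can be checked in isolation. A standalone sketch (not part of the crate):

```rust
// djb2: h = h * 33 + byte, seeded with 5381 — the same fold used in
// expand_onto_validates above, extracted for a quick sanity check.
fn djb2(s: &str) -> u64 {
    s.bytes()
        .fold(5381u64, |h, b| h.wrapping_mul(33).wrapping_add(b as u64))
}

fn main() {
    // Empty input returns the seed; each byte perturbs the value.
    assert_eq!(djb2(""), 5381);
    assert_eq!(djb2("a"), 5381 * 33 + 97);
    // Distinct practice/adr arg strings yield distinct ident suffixes.
    assert_ne!(djb2("ncl-cacheadr-002"), djb2("ncl-cacheadr-003"));
    println!("{:x}", djb2("ncl-cacheadr-002"));
}
```

djb2 is a weak hash by cryptographic standards, but for disambiguating generated identifiers within one compilation it is cheap and dependency-free.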