holonic.backends package
Submodules
holonic.backends.fuseki_backend module
Apache Jena Fuseki backend for holonic.
Wraps a FusekiClient (async) with synchronous methods matching the HolonicStore protocol. Uses asyncio.run() for sync callers; for async usage, call the underlying client directly.
Requires: aiohttp
- class holonic.backends.fuseki_backend.FusekiBackend(base_url, *, dataset, extra_headers=None, **client_kwargs)[source]
Bases: AbstractHolonicStore
HolonicStore implementation backed by an Apache Jena Fuseki server.
- Parameters:
  - base_url (str) – Fuseki server URL, e.g. “http://localhost:3030”. Positional.
  - dataset (str) – Dataset name on the server. Keyword-only since 0.4.0.
  - extra_headers (dict[str, str] | None) – Optional HTTP headers merged into every outbound request.
  - **client_kwargs (Any) – Extra kwargs forwarded to FusekiClient.
- Changed in version 0.4.0: dataset is now keyword-only. Callers using the legacy positional form (FusekiBackend(base_url, dataset)) receive a TypeError; migrate to FusekiBackend(base_url, dataset=name).
holonic.backends.protocol module
Deprecated: legacy import path for the store protocol.
This module exists for backward compatibility with 0.3.x code. New
code should import from holonic.backends.store. This shim will
be removed in 0.5.0.
Importing GraphBackend from here emits a DeprecationWarning
the first time it happens per Python session. Set the environment
variable HOLONIC_SILENCE_DEPRECATION=1 to suppress the warning
(useful in CI until migration is complete).
- class holonic.backends.protocol.AbstractHolonicStore[source]
Bases: ABC
Abstract base class for holonic stores with optional-method defaults.
Inheriting this is the recommended way to implement a backend. Subclasses define the mandatory methods (abstract here); the ABC provides Python fallback implementations of optional methods so backend authors don’t have to ship them.
Mandatory surface
Eleven methods marked @abstractmethod: named-graph CRUD (graph_exists, get_graph, put_graph, post_graph, delete_graph, parse_into), SPARQL dispatch (query, construct, ask, update), and one utility (list_named_graphs). Python refuses to instantiate a subclass that doesn’t implement all eleven.
Optional surface
Additional methods that backends MAY override to replace the library’s generic Python fallbacks with native, typically faster implementations. Discovery is duck-typed via hasattr(store, method_name); no registration is required.
As of 0.4.0, one optional method is recognized:
refresh_graph_metadata(graph_iri, registry_iri) -> GraphMetadata | None – recompute per-graph metadata (triple count, class inventory, last-modified timestamp) natively. The library’s MetadataRefresher.refresh_graph dispatches to this if the method exists on the store; otherwise it runs the generic Python implementation.
Future 0.4.x releases will add more optional methods for scope walking, bulk load, and pipeline execution (see SPEC R9.17).
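The duck-typed discovery described above can be sketched with two toy stores (names and return shapes here are illustrative, not the library's actual dispatcher):

```python
class GenericOnlyStore:
    """Implements only the mandatory surface; no native fast paths."""

class NativeMetadataStore:
    """Ships a native refresh_graph_metadata fast path."""
    def refresh_graph_metadata(self, graph_iri, registry_iri):
        return {"iri": graph_iri, "triple_count": 42, "source": "native"}

def refresh_graph(store, graph_iri, registry_iri):
    # Dispatcher sketch: prefer the native method when it exists,
    # otherwise fall back to a generic implementation.
    native = getattr(store, "refresh_graph_metadata", None)
    if callable(native):
        return native(graph_iri, registry_iri)
    return {"iri": graph_iri, "triple_count": None, "source": "generic"}
```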
Example:
A minimal backend implementing only the mandatory surface:
```python
from holonic.backends.store import AbstractHolonicStore

class MyBackend(AbstractHolonicStore):
    def __init__(self):
        self._store = {}  # graph_iri -> set[(s, p, o)]

    def graph_exists(self, graph_iri):
        return bool(self._store.get(graph_iri))

    def get_graph(self, graph_iri):
        from rdflib import Graph
        g = Graph()
        for triple in self._store.get(graph_iri, ()):
            g.add(triple)
        return g

    # ... other mandatory methods ...
```
A backend with a native metadata fast path:
```python
class FusekiBackend(AbstractHolonicStore):
    # ... mandatory methods ...

    def refresh_graph_metadata(self, graph_iri, registry_iri):
        # Use Fuseki's native statistics endpoint
        stats = self._fetch_stats(graph_iri)
        return GraphMetadata(
            iri=graph_iri,
            triple_count=stats["count"],
            last_modified=stats["modified"],
            ...
        )
```
See Also:
- HolonicStore – The Protocol view of the mandatory surface; use this for type annotations on library APIs.
- holonic._metadata.MetadataRefresher – Dispatcher that chooses native vs generic metadata paths.
- abstractmethod graph_exists(graph_iri)[source]
Return True if the named graph contains at least one triple.
Implementations SHOULD treat “does not exist” and “exists but empty” as equivalent — both return False. Callers use this as a cheap presence check before committing to a full read.
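The missing-vs-empty equivalence can be shown on a toy dict-of-sets store (names are illustrative):

```python
# Toy presence check matching the semantics above: a missing graph and an
# empty graph both read as "does not exist".
store = {"urn:empty": set(), "urn:full": {("s", "p", "o")}}

def graph_exists(graph_iri):
    return bool(store.get(graph_iri))

assert graph_exists("urn:full") is True
assert graph_exists("urn:empty") is False    # exists but empty
assert graph_exists("urn:missing") is False  # does not exist
```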
- abstractmethod get_graph(graph_iri)[source]
Return the named graph as an rdflib.Graph.
The returned graph is a copy for local processing; mutations do not flow back to the store. Callers wanting to mutate the backing state use put_graph / post_graph / parse_into / update.
If the named graph does not exist, implementations SHOULD return an empty rdflib.Graph rather than raise.
- abstractmethod put_graph(graph_iri, g)[source]
Replace the named graph with the contents of g.
Existing triples in the named graph are removed; the new triples are then added. Atomic with respect to other callers where the backing store supports it; non-atomic implementations SHOULD document the window.
- abstractmethod post_graph(graph_iri, g)[source]
Append the triples in g to the named graph.
Existing triples are preserved. Duplicate triples are coalesced at the RDF level (a quad store stores each (s, p, o, g) at most once).
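The replace-vs-append distinction can be seen on a toy dict-of-sets store, where set semantics give the duplicate coalescing for free (names are illustrative):

```python
store = {}  # graph_iri -> set of (s, p, o) triples

def put_graph(graph_iri, triples):
    # Replace: any existing triples in the named graph are dropped first.
    store[graph_iri] = set(triples)

def post_graph(graph_iri, triples):
    # Append: existing triples survive; duplicates coalesce in the set.
    store.setdefault(graph_iri, set()).update(triples)

put_graph("urn:g", [("s1", "p", "o")])
post_graph("urn:g", [("s1", "p", "o"), ("s2", "p", "o")])  # s1 coalesced
assert store["urn:g"] == {("s1", "p", "o"), ("s2", "p", "o")}

put_graph("urn:g", [("s3", "p", "o")])  # replace wipes s1/s2
assert store["urn:g"] == {("s3", "p", "o")}
```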
- abstractmethod delete_graph(graph_iri)[source]
Delete the named graph entirely.
SHOULD be idempotent: deleting a non-existent graph is a no-op, not an error.
- abstractmethod parse_into(graph_iri, data, format='turtle')[source]
Parse serialized RDF and append into the named graph.
format is an rdflib parser name; common values are "turtle", "xml", "n3", "json-ld", "nquads". Semantically equivalent to post_graph(graph_iri, rdflib.Graph().parse(data=data, format=format)), but implementations MAY optimize (e.g. stream-parse into the backing store directly).
- abstractmethod query(sparql, **bindings)[source]
Execute a SPARQL SELECT query.
Returns a list of binding dictionaries, one per result row. Each dict maps variable names (without the leading ?) to their bound values. Values are Python scalars for literals (strings, ints, floats, booleans, datetime objects for xsd:dateTime) and strings for IRIs.
bindings is reserved for future parameterized-query support; implementations MAY raise NotImplementedError on non-empty bindings in 0.4.x.
- abstractmethod construct(sparql, **bindings)[source]
Execute a SPARQL CONSTRUCT query.
Returns the constructed triples as an
rdflib.Graph. The return value is a fresh graph, not bound to any named graph in the store; callers wanting to persist it useput_graphorpost_graph.bindings: seequery.
- abstractmethod ask(sparql, **bindings)[source]
Execute a SPARQL ASK query.
Returns True if the query has at least one solution, False otherwise.
bindings: see query.
- abstractmethod update(sparql)[source]
Execute a SPARQL UPDATE (INSERT / DELETE / DROP / CREATE).
Mutates the backing store according to the update request. Callers using this path bypass the library’s metadata-refresh machinery; if metadata_updates="eager" is the dataset policy, call HolonicDataset.refresh_metadata after out-of-band updates to reconcile.
- abstractmethod list_named_graphs()[source]
Return the IRIs of all named graphs in the store.
Implementations SHOULD exclude graphs that exist as identifiers but contain no triples. The default graph (if the backing store has one) is NOT included; the library does not use the default graph and expects every triple to live in a named graph per R1.4.
- class holonic.backends.protocol.HolonicStore(*args, **kwargs)[source]
Bases: Protocol
Mandatory interface for a quad-aware graph store.
Every backend must satisfy this protocol. The methods cover named-graph CRUD and SPARQL dispatch — enough for all holonic operations when combined with the library’s Python-side helpers (MetadataRefresher, ScopeResolver, run_projection).
Any object matching this protocol shape can be used with HolonicDataset, regardless of whether it inherits AbstractHolonicStore. Inheritance is recommended for the optional-method defaults it provides, but not required.
Choosing between Protocol and ABC
Use the Protocol (HolonicStore) for type annotations on library-public functions and APIs. It captures the structural contract without requiring inheritance from users:

```python
def do_something(store: HolonicStore) -> None: ...
```

Use the ABC (AbstractHolonicStore) as the base class for new backend implementations. It adds @abstractmethod enforcement (so Python refuses to instantiate a subclass that forgets a method) plus hook points for optional-method defaults:

```python
class MyBackend(AbstractHolonicStore):
    def graph_exists(self, graph_iri): ...
    # ... all the other abstract methods
```
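The split can be demonstrated end to end with stand-in one-method interfaces (the real surfaces have eleven methods; runtime_checkable is assumed here purely so isinstance works in the demo):

```python
from abc import ABC, abstractmethod
from typing import Protocol, runtime_checkable

@runtime_checkable
class MiniStore(Protocol):        # stand-in for HolonicStore
    def graph_exists(self, graph_iri: str) -> bool: ...

class AbstractMiniStore(ABC):     # stand-in for AbstractHolonicStore
    @abstractmethod
    def graph_exists(self, graph_iri): ...

class DuckTyped:                  # satisfies the protocol with no inheritance
    def graph_exists(self, graph_iri):
        return False

assert isinstance(DuckTyped(), MiniStore)  # structural match is enough

class Forgetful(AbstractMiniStore):        # forgot graph_exists
    pass

try:
    Forgetful()
except TypeError:
    print("ABC refused to instantiate the incomplete subclass")
```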
Examples:
The two first-party backends (RdflibBackend, FusekiBackend) both inherit the ABC. Duck-typed protocol satisfaction works too, as verified by isinstance(backend, HolonicStore).
See Also:
- AbstractHolonicStore – Recommended base class for new backends.
- holonic.backends.rdflib_backend.RdflibBackend – First-party default backend.
- holonic.backends.fuseki_backend.FusekiBackend – First-party HTTP backend.
- parse_into(graph_iri, data, format='turtle')[source]
Parse serialized RDF into the named graph (append).
- query(sparql, **bindings)[source]
Execute a SELECT query. Return list of binding dicts.
Each dict maps variable names (without the leading ?) to their values. Values are strings (IRIs/literals) — callers convert as needed.
- construct(sparql, **bindings)[source]
Execute a CONSTRUCT query. Return results as an rdflib.Graph.
holonic.backends.rdflib_backend module
rdflib.Dataset backend for holonic.
This is the default backend — zero infrastructure, pure Python. Uses rdflib.Dataset (a ConjunctiveGraph with explicit named-graph support) as the quad store.
- class holonic.backends.rdflib_backend.RdflibBackend(dataset=None)[source]
Bases: AbstractHolonicStore
HolonicStore implementation backed by an rdflib.Dataset.
- Parameters:
  - dataset (Dataset | None) – An existing rdflib.Dataset instance. If None, a fresh in-memory dataset is created.
holonic.backends.store module
Holonic store protocol and ABC (0.4.0).
Any graph store — rdflib.Dataset, Fuseki, Oxigraph, GraphDB — can back
a HolonicDataset by satisfying the HolonicStore protocol. All
methods operate on named graphs via IRIs and SPARQL strings; no
rdflib types leak through the interface beyond rdflib.Graph for
query results.
Design
HolonicStore is a typing.Protocol that declares the MANDATORY
surface. Any object with these methods can be used with
HolonicDataset.
AbstractHolonicStore is an ABC that inherits the protocol and
provides default implementations of OPTIONAL methods in terms of the
mandatory ones. Backends that want native optimizations inherit the
ABC and override; backends that implement only the protocol get the
generic Python fallbacks from the library’s helpers (MetadataRefresher,
ScopeResolver).
Optional surface (0.4.0)
Minimal to start — see docs/DECISIONS.md § 0.4.0:
refresh_graph_metadata(graph_iri) — recompute per-graph metadata natively. The library dispatches to this if present; otherwise it falls back to the Python MetadataRefresher.
Future 0.4.x extensions (scope walking, bulk load, pipeline execution) will be additive — a backend that implements none of them continues to work; a backend that implements some gets native speed for those operations.
Backward compatibility
holonic.backends.protocol.GraphBackend is a deprecated alias for
HolonicStore kept through all of 0.4.x. Removal scheduled for
0.5.0. See docs/MIGRATION.md.
- class holonic.backends.store.AbstractHolonicStore[source]
Bases: ABC
Abstract base class for holonic stores with optional-method defaults.
Inheriting this is the recommended way to implement a backend. Subclasses define the mandatory methods (abstract here); the ABC provides Python fallback implementations of optional methods so backend authors don’t have to ship them.
Mandatory surface
Eleven methods marked @abstractmethod: named-graph CRUD (graph_exists, get_graph, put_graph, post_graph, delete_graph, parse_into), SPARQL dispatch (query, construct, ask, update), and one utility (list_named_graphs). Python refuses to instantiate a subclass that doesn’t implement all eleven.
Optional surface
Additional methods that backends MAY override to replace the library’s generic Python fallbacks with native, typically faster implementations. Discovery is duck-typed via hasattr(store, method_name); no registration is required.
As of 0.4.0, one optional method is recognized:
refresh_graph_metadata(graph_iri, registry_iri) -> GraphMetadata | None – recompute per-graph metadata (triple count, class inventory, last-modified timestamp) natively. The library’s MetadataRefresher.refresh_graph dispatches to this if the method exists on the store; otherwise it runs the generic Python implementation.
Future 0.4.x releases will add more optional methods for scope walking, bulk load, and pipeline execution (see SPEC R9.17).
Example:
A minimal backend implementing only the mandatory surface:
```python
from holonic.backends.store import AbstractHolonicStore

class MyBackend(AbstractHolonicStore):
    def __init__(self):
        self._store = {}  # graph_iri -> set[(s, p, o)]

    def graph_exists(self, graph_iri):
        return bool(self._store.get(graph_iri))

    def get_graph(self, graph_iri):
        from rdflib import Graph
        g = Graph()
        for triple in self._store.get(graph_iri, ()):
            g.add(triple)
        return g

    # ... other mandatory methods ...
```
A backend with a native metadata fast path:
```python
class FusekiBackend(AbstractHolonicStore):
    # ... mandatory methods ...

    def refresh_graph_metadata(self, graph_iri, registry_iri):
        # Use Fuseki's native statistics endpoint
        stats = self._fetch_stats(graph_iri)
        return GraphMetadata(
            iri=graph_iri,
            triple_count=stats["count"],
            last_modified=stats["modified"],
            ...
        )
```
See Also:
- HolonicStore – The Protocol view of the mandatory surface; use this for type annotations on library APIs.
- holonic._metadata.MetadataRefresher – Dispatcher that chooses native vs generic metadata paths.
- abstractmethod graph_exists(graph_iri)[source]
Return True if the named graph contains at least one triple.
Implementations SHOULD treat “does not exist” and “exists but empty” as equivalent — both return False. Callers use this as a cheap presence check before committing to a full read.
- abstractmethod get_graph(graph_iri)[source]
Return the named graph as an rdflib.Graph.
The returned graph is a copy for local processing; mutations do not flow back to the store. Callers wanting to mutate the backing state use put_graph / post_graph / parse_into / update.
If the named graph does not exist, implementations SHOULD return an empty rdflib.Graph rather than raise.
- abstractmethod put_graph(graph_iri, g)[source]
Replace the named graph with the contents of g.
Existing triples in the named graph are removed; the new triples are then added. Atomic with respect to other callers where the backing store supports it; non-atomic implementations SHOULD document the window.
- abstractmethod post_graph(graph_iri, g)[source]
Append the triples in g to the named graph.
Existing triples are preserved. Duplicate triples are coalesced at the RDF level (a quad store stores each (s, p, o, g) at most once).
- abstractmethod delete_graph(graph_iri)[source]
Delete the named graph entirely.
SHOULD be idempotent: deleting a non-existent graph is a no-op, not an error.
- abstractmethod parse_into(graph_iri, data, format='turtle')[source]
Parse serialized RDF and append into the named graph.
format is an rdflib parser name; common values are "turtle", "xml", "n3", "json-ld", "nquads". Semantically equivalent to post_graph(graph_iri, rdflib.Graph().parse(data=data, format=format)), but implementations MAY optimize (e.g. stream-parse into the backing store directly).
- abstractmethod query(sparql, **bindings)[source]
Execute a SPARQL SELECT query.
Returns a list of binding dictionaries, one per result row. Each dict maps variable names (without the leading ?) to their bound values. Values are Python scalars for literals (strings, ints, floats, booleans, datetime objects for xsd:dateTime) and strings for IRIs.
bindings is reserved for future parameterized-query support; implementations MAY raise NotImplementedError on non-empty bindings in 0.4.x.
- abstractmethod construct(sparql, **bindings)[source]
Execute a SPARQL CONSTRUCT query.
Returns the constructed triples as an
rdflib.Graph. The return value is a fresh graph, not bound to any named graph in the store; callers wanting to persist it useput_graphorpost_graph.bindings: seequery.
- abstractmethod ask(sparql, **bindings)[source]
Execute a SPARQL ASK query.
Returns True if the query has at least one solution, False otherwise.
bindings: see query.
- abstractmethod update(sparql)[source]
Execute a SPARQL UPDATE (INSERT / DELETE / DROP / CREATE).
Mutates the backing store according to the update request. Callers using this path bypass the library’s metadata-refresh machinery; if metadata_updates="eager" is the dataset policy, call HolonicDataset.refresh_metadata after out-of-band updates to reconcile.
- abstractmethod list_named_graphs()[source]
Return the IRIs of all named graphs in the store.
Implementations SHOULD exclude graphs that exist as identifiers but contain no triples. The default graph (if the backing store has one) is NOT included; the library does not use the default graph and expects every triple to live in a named graph per R1.4.
- class holonic.backends.store.HolonicStore(*args, **kwargs)[source]
Bases: Protocol
Mandatory interface for a quad-aware graph store.
Every backend must satisfy this protocol. The methods cover named-graph CRUD and SPARQL dispatch — enough for all holonic operations when combined with the library’s Python-side helpers (MetadataRefresher, ScopeResolver, run_projection).
Any object matching this protocol shape can be used with HolonicDataset, regardless of whether it inherits AbstractHolonicStore. Inheritance is recommended for the optional-method defaults it provides, but not required.
Choosing between Protocol and ABC
Use the Protocol (HolonicStore) for type annotations on library-public functions and APIs. It captures the structural contract without requiring inheritance from users:

```python
def do_something(store: HolonicStore) -> None: ...
```

Use the ABC (AbstractHolonicStore) as the base class for new backend implementations. It adds @abstractmethod enforcement (so Python refuses to instantiate a subclass that forgets a method) plus hook points for optional-method defaults:

```python
class MyBackend(AbstractHolonicStore):
    def graph_exists(self, graph_iri): ...
    # ... all the other abstract methods
```
Examples:
The two first-party backends (RdflibBackend, FusekiBackend) both inherit the ABC. Duck-typed protocol satisfaction works too, as verified by isinstance(backend, HolonicStore).
See Also:
- AbstractHolonicStore – Recommended base class for new backends.
- holonic.backends.rdflib_backend.RdflibBackend – First-party default backend.
- holonic.backends.fuseki_backend.FusekiBackend – First-party HTTP backend.
- parse_into(graph_iri, data, format='turtle')[source]
Parse serialized RDF into the named graph (append).
- query(sparql, **bindings)[source]
Execute a SELECT query. Return list of binding dicts.
Each dict maps variable names (without the leading ?) to their values. Values are strings (IRIs/literals) — callers convert as needed.
- construct(sparql, **bindings)[source]
Execute a CONSTRUCT query. Return results as an rdflib.Graph.
Module contents
Graph store backends for holonic.
Canonical protocol: HolonicStore from holonic.backends.store.
Recommended base class for new backends: AbstractHolonicStore.
GraphBackend is preserved as a deprecated alias through the
entire 0.4.x series. New code should use HolonicStore.
- class holonic.backends.AbstractHolonicStore[source]
Bases: ABC
Abstract base class for holonic stores with optional-method defaults.
Inheriting this is the recommended way to implement a backend. Subclasses define the mandatory methods (abstract here); the ABC provides Python fallback implementations of optional methods so backend authors don’t have to ship them.
Mandatory surface
Eleven methods marked @abstractmethod: named-graph CRUD (graph_exists, get_graph, put_graph, post_graph, delete_graph, parse_into), SPARQL dispatch (query, construct, ask, update), and one utility (list_named_graphs). Python refuses to instantiate a subclass that doesn’t implement all eleven.
Optional surface
Additional methods that backends MAY override to replace the library’s generic Python fallbacks with native, typically faster implementations. Discovery is duck-typed via hasattr(store, method_name); no registration is required.
As of 0.4.0, one optional method is recognized:
refresh_graph_metadata(graph_iri, registry_iri) -> GraphMetadata | None – recompute per-graph metadata (triple count, class inventory, last-modified timestamp) natively. The library’s MetadataRefresher.refresh_graph dispatches to this if the method exists on the store; otherwise it runs the generic Python implementation.
Future 0.4.x releases will add more optional methods for scope walking, bulk load, and pipeline execution (see SPEC R9.17).
Example:
A minimal backend implementing only the mandatory surface:
```python
from holonic.backends.store import AbstractHolonicStore

class MyBackend(AbstractHolonicStore):
    def __init__(self):
        self._store = {}  # graph_iri -> set[(s, p, o)]

    def graph_exists(self, graph_iri):
        return bool(self._store.get(graph_iri))

    def get_graph(self, graph_iri):
        from rdflib import Graph
        g = Graph()
        for triple in self._store.get(graph_iri, ()):
            g.add(triple)
        return g

    # ... other mandatory methods ...
```
A backend with a native metadata fast path:
```python
class FusekiBackend(AbstractHolonicStore):
    # ... mandatory methods ...

    def refresh_graph_metadata(self, graph_iri, registry_iri):
        # Use Fuseki's native statistics endpoint
        stats = self._fetch_stats(graph_iri)
        return GraphMetadata(
            iri=graph_iri,
            triple_count=stats["count"],
            last_modified=stats["modified"],
            ...
        )
```
See Also:
- HolonicStore – The Protocol view of the mandatory surface; use this for type annotations on library APIs.
- holonic._metadata.MetadataRefresher – Dispatcher that chooses native vs generic metadata paths.
- abstractmethod graph_exists(graph_iri)[source]
Return True if the named graph contains at least one triple.
Implementations SHOULD treat “does not exist” and “exists but empty” as equivalent — both return False. Callers use this as a cheap presence check before committing to a full read.
- abstractmethod get_graph(graph_iri)[source]
Return the named graph as an rdflib.Graph.
The returned graph is a copy for local processing; mutations do not flow back to the store. Callers wanting to mutate the backing state use put_graph / post_graph / parse_into / update.
If the named graph does not exist, implementations SHOULD return an empty rdflib.Graph rather than raise.
- abstractmethod put_graph(graph_iri, g)[source]
Replace the named graph with the contents of g.
Existing triples in the named graph are removed; the new triples are then added. Atomic with respect to other callers where the backing store supports it; non-atomic implementations SHOULD document the window.
- abstractmethod post_graph(graph_iri, g)[source]
Append the triples in g to the named graph.
Existing triples are preserved. Duplicate triples are coalesced at the RDF level (a quad store stores each (s, p, o, g) at most once).
- abstractmethod delete_graph(graph_iri)[source]
Delete the named graph entirely.
SHOULD be idempotent: deleting a non-existent graph is a no-op, not an error.
- abstractmethod parse_into(graph_iri, data, format='turtle')[source]
Parse serialized RDF and append into the named graph.
format is an rdflib parser name; common values are "turtle", "xml", "n3", "json-ld", "nquads". Semantically equivalent to post_graph(graph_iri, rdflib.Graph().parse(data=data, format=format)), but implementations MAY optimize (e.g. stream-parse into the backing store directly).
- abstractmethod query(sparql, **bindings)[source]
Execute a SPARQL SELECT query.
Returns a list of binding dictionaries, one per result row. Each dict maps variable names (without the leading ?) to their bound values. Values are Python scalars for literals (strings, ints, floats, booleans, datetime objects for xsd:dateTime) and strings for IRIs.
bindings is reserved for future parameterized-query support; implementations MAY raise NotImplementedError on non-empty bindings in 0.4.x.
- abstractmethod construct(sparql, **bindings)[source]
Execute a SPARQL CONSTRUCT query.
Returns the constructed triples as an rdflib.Graph. The return value is a fresh graph, not bound to any named graph in the store; callers wanting to persist it use put_graph or post_graph.
bindings: see query.
- abstractmethod ask(sparql, **bindings)[source]
Execute a SPARQL ASK query.
Returns True if the query has at least one solution, False otherwise.
bindings: see query.
- abstractmethod update(sparql)[source]
Execute a SPARQL UPDATE (INSERT / DELETE / DROP / CREATE).
Mutates the backing store according to the update request. Callers using this path bypass the library’s metadata-refresh machinery; if metadata_updates="eager" is the dataset policy, call HolonicDataset.refresh_metadata after out-of-band updates to reconcile.
- abstractmethod list_named_graphs()[source]
Return the IRIs of all named graphs in the store.
Implementations SHOULD exclude graphs that exist as identifiers but contain no triples. The default graph (if the backing store has one) is NOT included; the library does not use the default graph and expects every triple to live in a named graph per R1.4.
- class holonic.backends.HolonicStore(*args, **kwargs)[source]
Bases: Protocol
Mandatory interface for a quad-aware graph store.
Every backend must satisfy this protocol. The methods cover named-graph CRUD and SPARQL dispatch — enough for all holonic operations when combined with the library’s Python-side helpers (MetadataRefresher, ScopeResolver, run_projection).
Any object matching this protocol shape can be used with HolonicDataset, regardless of whether it inherits AbstractHolonicStore. Inheritance is recommended for the optional-method defaults it provides, but not required.
Choosing between Protocol and ABC
Use the Protocol (HolonicStore) for type annotations on library-public functions and APIs. It captures the structural contract without requiring inheritance from users:

```python
def do_something(store: HolonicStore) -> None: ...
```

Use the ABC (AbstractHolonicStore) as the base class for new backend implementations. It adds @abstractmethod enforcement (so Python refuses to instantiate a subclass that forgets a method) plus hook points for optional-method defaults:

```python
class MyBackend(AbstractHolonicStore):
    def graph_exists(self, graph_iri): ...
    # ... all the other abstract methods
```
Examples:
The two first-party backends (RdflibBackend, FusekiBackend) both inherit the ABC. Duck-typed protocol satisfaction works too, as verified by isinstance(backend, HolonicStore).
See Also:
- AbstractHolonicStore – Recommended base class for new backends.
- holonic.backends.rdflib_backend.RdflibBackend – First-party default backend.
- holonic.backends.fuseki_backend.FusekiBackend – First-party HTTP backend.
- parse_into(graph_iri, data, format='turtle')[source]
Parse serialized RDF into the named graph (append).
- query(sparql, **bindings)[source]
Execute a SELECT query. Return list of binding dicts.
Each dict maps variable names (without the leading ?) to their values. Values are strings (IRIs/literals) — callers convert as needed.
- construct(sparql, **bindings)[source]
Execute a CONSTRUCT query. Return results as an rdflib.Graph.
- class holonic.backends.RdflibBackend(dataset=None)[source]
Bases: AbstractHolonicStore
HolonicStore implementation backed by an rdflib.Dataset.
- Parameters:
  - dataset (Dataset | None) – An existing rdflib.Dataset instance. If None, a fresh in-memory dataset is created.