Dynaconf Plugin API Reference

Structum Dynaconf (structum-dynaconf)

Documentation Python 3.11+ License: Apache-2.0

Structum Dynaconf integrates the power of Dynaconf into Structum Lab, providing multi-source configuration management, Pydantic validation, and hot reloading.

🚀 Installation

pip install structum-dynaconf

📚 Full Documentation

Documentation, API Reference, and Guides: 👉 https://structum-lab.pages.dev/plugins/dynaconf

✨ Key Features

  • Multi-Source Loading: Load configuration from files (TOML, YAML), environment variables, Redis, and Vault.

  • Type Safety: Strict validation via Pydantic v2.

  • Secrets Management: Secure handling of secrets.

  • Hot Reloading: Automatic reload when configuration files change (optional).
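
A minimal quickstart, assembled from the provider examples later in this reference (the directory layout and the MYAPP prefix are illustrative):

from structum_lab.plugins.dynaconf import DynaconfConfigProvider

# Create the provider and auto-discover config/app/*.toml
# (with optional Pydantic models in config/models/)
config = DynaconfConfigProvider(root_path=".", env_prefix="MYAPP")
config.auto_discover(app_dir="config/app", models_dir="config/models")

# Dot-notation access with a default
db_host = config.get("database.host", "localhost")

# Validated update; persistence is explicit
config.set("database.port", 5433)
config.save()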

Dynaconf Plugin Package

structum_lab.plugins.dynaconf - Production-Grade Configuration Engine Plugin

Architecture:

  • core/: Essential components (Provider, Builders, Manager)

  • features/: Production-grade features (Health, Cache, Concurrency, Migrations, Hot Reload)

class structum_lab.plugins.dynaconf.DynaconfConfigProvider(root_path: str = '.', env_prefix: str = 'STRUCTUM', environments: bool = True, active_env: str = 'development', enable_hot_reload: bool = False, enable_cache: bool = True)[source]

Bases: ConfigProviderInterface

Advanced configuration provider built on Dynaconf with Pydantic validation.

Acts as adapter between Structum’s key-value ConfigInterface and strongly-typed Pydantic models, providing:

  • Auto-discovery: Scans config/app/*.toml and matches with config/models/*.py

  • Type safety: Pydantic validation on all config updates

  • Hot reload: Optional file watching with automatic reload

  • Transactions: Atomic multi-key updates

  • Health checks: Configuration integrity and validation monitoring

  • Smart caching: Multi-level cache with invalidation

  • Metrics: Prometheus-compatible operation metrics

Architecture:

The provider maintains two representations:

  1. Pydantic Models (source of truth) - Strongly typed, validated

  2. Flat Dictionary (performance cache) - Fast dot-notation access

All updates flow through Pydantic validation before being persisted.

Configuration Flow:

TOML Files → Dynaconf → Pydantic Models → Flat Cache → get()
                               ↓
                       Validation Errors
                               ↓
                    ConfigurationError

Example

Basic usage with auto-discovery:

from structum_lab.plugins.dynaconf import DynaconfConfigProvider

# Initialize provider
config = DynaconfConfigProvider(
    root_path=".",
    env_prefix="MYAPP",
    enable_hot_reload=True
)

# Auto-discover config files
config.auto_discover(
    app_dir="config/app",
    models_dir="config/models"
)

# Access configuration
db_host = config.get("database.host", "localhost")
db_port = config.get("database.port", 5432)

# Update with validation
config.set("database.port", 5433)  # Validates against Pydantic model
config.save()  # Persist to ~/.structum/

With manual builder registration:

from structum_lab.plugins.dynaconf.core.builders import GenericConfigBuilder
from myapp.config_models import DatabaseConfig

# Register typed configuration
builder = GenericConfigBuilder(
    name="database",
    files=["config/database.toml"],
    model=DatabaseConfig  # Pydantic model
)
config.register_builder("database", builder)

# Access is now type-safe
port = config.get("database.port")  # Validated as int

Atomic transactions:

# Multiple updates atomically
with config.transaction() as tx:
    tx.set("api.rate_limit", 1000)
    tx.set("api.timeout_seconds", 30)
    # All changes applied together on commit

Implementations:

This is the primary implementation of ConfigInterface for Structum Lab applications.

Note

  • Thread-safe with read-write locks

  • Supports environment variable overrides (MYAPP_DATABASE_PORT)

  • Hot reload requires watchdog package

  • Persistence files stored in ~/.structum/

Warning

Do not modify _flat_config or _manager directly. Always use public methods (get, set, transaction) to ensure validation and cache coherence.

See also

ConfigInterface: Base protocol AbstractConfigBuilder: Custom builders ConfigTransaction: Transaction API

__del__() None[source]

Cleanup hot reload observer on provider destruction.

Automatically called by Python garbage collector. Stops file watchers to prevent resource leaks.

Note

Manual cleanup of the hot reload observer is not required.

__init__(root_path: str = '.', env_prefix: str = 'STRUCTUM', environments: bool = True, active_env: str = 'development', enable_hot_reload: bool = False, enable_cache: bool = True)[source]

Initialize the configuration provider.

Sets up the provider infrastructure including thread-safe locks, configuration manager, health checks, caching, and optional hot reload capabilities.

Parameters:
root_path : str

Root directory for resolving configuration file paths. All relative paths in builders will be resolved from this directory. Defaults to current directory (".").

env_prefix : str

Prefix for environment variable overrides. E.g., "MYAPP" enables MYAPP_DATABASE_PORT to override database.port. Defaults to "STRUCTUM".

environments : bool

Enable multi-environment support ([production], [development] in TOML files). If True, environment-specific sections are loaded based on active_env. Defaults to True.

active_env : str

Active environment name when environments=True. Determines which TOML section to load. Defaults to "development".

enable_hot_reload : bool

Enable automatic file watching and reload. Requires watchdog package. When True, watches configuration files for changes and reloads automatically. Defaults to False.

enable_cache : bool

Enable smart caching for get() operations. Significantly improves performance for frequently accessed keys. Defaults to True.

Raises:

ImportError – If enable_hot_reload=True but watchdog not installed.

Example

Development setup with hot reload:

config = DynaconfConfigProvider(
    root_path="/app",
    env_prefix="MYAPP",
    active_env="development",
    enable_hot_reload=True  # Auto-reload on file changes
)

Production setup (no hot reload, custom prefix):

config = DynaconfConfigProvider(
    root_path="/etc/myapp",
    env_prefix="PROD",
    active_env="production",
    enable_hot_reload=False,  # Static config in production
    enable_cache=True
)

Note

  • Hot reload is started when first builder is registered

  • Cache is populated lazily on first get() call

  • Health checks are registered automatically

  • No configuration is loaded until register_builder() or auto_discover()

See also

auto_discover(): Automatic configuration discovery register_builder(): Manual builder registration enable_hot_reload(): Enable hot reload after initialization

auto_discover(app_dir: str = 'config/app', models_dir: str = 'config/models') list[str][source]

Automatically discover and register configuration files.

Scans directories for TOML files and optionally matches them with Pydantic models. This is the recommended initialization pattern for structured projects.

Discovery Process:

  1. Scan {root_path}/{app_dir}/*.toml for configuration files

  2. For each TOML file (e.g., database.toml): check for a matching model in {root_path}/{models_dir}/database.py; if a model exists, load with strict Pydantic validation; if not, load dynamically without validation

  3. Register all discovered namespaces

Directory Structure Example:

myproject/
├── config/
│   ├── app/
│   │   ├── database.toml      # → database namespace
│   │   ├── api.toml           # → api namespace
│   │   └── cache.toml         # → cache namespace
│   └── models/
│       ├── database.py         # DatabaseConfig (Pydantic)
│       └── api.py              # ApiConfig (Pydantic)
└── app.py

Parameters:
app_dir : str

Relative directory containing TOML files. Defaults to "config/app".

models_dir : str

Relative directory containing Pydantic models. Defaults to "config/models".

Returns:

List of discovered namespace names (e.g., ["database", "api", "cache"]).

Return type:

list[str]

Example

Basic auto-discovery:

from structum_lab.plugins.dynaconf import DynaconfConfigProvider

config = DynaconfConfigProvider(root_path=".")
namespaces = config.auto_discover()
# Logs: Auto-discovered: database [Strict (DatabaseConfig)] → config/app/database.toml
# Logs: Auto-discovered: api [Strict (ApiConfig)] → config/app/api.toml
# Logs: Auto-discovered: cache [Dynamic] → config/app/cache.toml

print(namespaces)  # ['database', 'api', 'cache']

Custom directories:

config = DynaconfConfigProvider(root_path="/etc/myapp")
config.auto_discover(
    app_dir="settings",
    models_dir="schemas"
)

With FastAPI:

from fastapi import FastAPI

app = FastAPI()

@app.on_event("startup")
def load_config():
    config = DynaconfConfigProvider()
    discovered = config.auto_discover()
    app.state.config = config
    log.info(f"Loaded {len(discovered)} config namespaces")

Model File Format (config/models/database.py):

from pydantic import BaseModel, Field

class DatabaseConfig(BaseModel):
    host: str = "localhost"
    port: int = Field(ge=1, le=65535, default=5432)
    database: str
    username: str
    password: str  # Will be overridden by env vars

Warning

  • Hidden files (starting with .) are skipped

  • Already-registered builders are skipped

  • Model loading errors fall back to dynamic mode (logged as warnings)

Note

  • Thread-safe operation

  • Models must inherit from pydantic.BaseModel

  • Model class name can be anything (first BaseModel found is used)

  • Dynamically imports model files via importlib

See also

register_builder(): Manual builder registration load(): Load single file without discovery GenericConfigBuilder: Builder implementation

enable_hot_reload(watch_secrets: bool = True, debounce_seconds: float = 2.0, callback: Any | None = None) None[source]

Enable automatic configuration reloading on file changes.

Watches configuration files (TOML, JSON) and automatically calls reload() when changes detected. Requires watchdog package.

Parameters:
watch_secrets : bool

Whether to watch secret files in addition to config files. Defaults to True.

debounce_seconds : float

Debounce interval to avoid rapid reloads on multiple file changes. Defaults to 2.0 seconds.

callback : Optional[Callable]

Optional callback function called after reload. Signature: callback(changed_files: list[str]) -> None.

Raises:

ImportError – If watchdog package not installed.

Example

Development environment:

config = DynaconfConfigProvider(
    root_path=".",
    enable_hot_reload=False  # Start disabled
)
config.auto_discover()

# Enable later with custom debounce
config.enable_hot_reload(debounce_seconds=5.0)

# Now changes to config files auto-reload

With callback:

def on_config_changed(files):
    log.info(f"Config reloaded: {files}")
    # Notify connected clients
    websocket.broadcast({"type": "config_updated"})

config.enable_hot_reload(
    debounce_seconds=3.0,
    callback=on_config_changed
)

Production (disabled):

# Don't use hot reload in production
config = DynaconfConfigProvider(
    enable_hot_reload=False  # Static config
)

Warning

  • Not recommended for production (use SIGHUP reload instead)

  • Debounce prevents rapid reload storms but adds latency

  • File watcher holds file system resources

Note

  • Watches both TOML source files and JSON persistence files

  • Auto-started on first builder if enable_hot_reload=True in __init__

  • Calls are idempotent (ignores if already enabled)

  • Cleanup automatic on provider destruction

See also

reload(): Manual reload HotReloadManager: Watcher implementation

get(key: str, default: Any = None) Any[source]

Retrieve a configuration value by dot-notation key.

Implements get() with smart caching and performance metrics. Cache-first lookup minimizes lock contention.

Parameters:
key : str

Dot-notation configuration key (e.g., "database.host", "api.rate_limit").

default : Any

Default value if key not found. Defaults to None.

Returns:

Configuration value (typed according to Pydantic model) or default.

Return type:

Any

Example

Basic usage:

# Simple get with default
db_host = config.get("database.host", "localhost")
port = config.get("database.port", 5432)

# Nested keys
timeout = config.get("api.client.timeout_seconds", 30)

Environment override:

# TOML: database.host = "localhost"
# ENV:  MYAPP_DATABASE_HOST="prod-db.example.com"
host = config.get("database.host")  # Returns "prod-db.example.com"

Type safety with Pydantic:

# Model defines port as int with validation
port = config.get("database.port")  # Guaranteed int type

Note

  • Thread-safe read operation

  • Cache hit bypasses locks entirely

  • First access populates cache

  • Metrics emitted for monitoring (cache hit/miss rate, latency)

See also

set(): Update configuration values has(): Check key existence get_cache_stats(): Cache performance metrics

get_cache_stats() dict[str, Any][source]

Retrieve cache performance metrics for monitoring.

Returns cache hit rate, size, and other statistics. Automatically emits metrics to Prometheus/statsD via metrics interface.

Returns:

Cache statistics dictionary:
  • enabled (bool): Whether cache is active

  • size (int): Number of cached keys

  • hit_rate (float): Cache hit ratio (0.0-1.0)

  • Additional metrics from cache implementation

Return type:

Dict[str, Any]

Example

Monitoring dashboard:

stats = config.get_cache_stats()
print(f"Cache enabled: {stats['enabled']}")
print(f"Cache size: {stats['size']} keys")
print(f"Hit rate: {stats['hit_rate']:.2%}")

Prometheus metrics:

# Called automatically by metrics.gauge()
# structum_config_cache_size{} 1234
# structum_config_cache_hit_rate{} 0.95

Health check integration:

@app.get("/health/cache")
def cache_health():
    stats = config.get_cache_stats()
    if not stats.get('enabled'):
        return {"status": "disabled"}

    if stats['hit_rate'] < 0.7:
        return {"status": "degraded", "hit_rate": stats['hit_rate']}

    return {"status": "healthy", **stats}

Note

  • Returns {"enabled": False} if cache disabled

  • Metrics auto-emitted as gauges

  • Thread-safe read operation

See also

__init__(): enable_cache parameter SmartCache: Cache implementation

has(key: str) bool[source]

Check if a configuration key exists.

Implements has() with thread-safe lookup.

Parameters:
key : str

Dot-notation configuration key.

Returns:

True if key exists in configuration, False otherwise.

Return type:

bool

Example

Conditional configuration:

# Check before accessing
if config.has("features.experimental"):
    experimental = config.get("features.experimental")
    if experimental:
        enable_experimental_features()

# Avoid KeyError in strict mode
if not config.has("optional.setting"):
    config.set("optional.setting", default_value)

Note

  • Thread-safe read operation

  • Checks flat config cache (no Pydantic model lookup)

  • Does not distinguish between None value and missing key

See also

get(): Retrieve values with default

health_check() dict[str, HealthCheckResult][source]

Execute comprehensive configuration health verification.

Runs all registered health checks and returns detailed results. Useful for monitoring dashboards, readiness probes, and diagnostics.

Returns:

Mapping of check names to results.

Each result contains status (healthy/unhealthy), message, and metrics.

Return type:

Dict[str, HealthCheckResult]

Example

FastAPI readiness probe:

@app.get("/health/config")
def config_health():
    results = config.health_check()
    all_healthy = all(r.is_healthy for r in results.values())

    if all_healthy:
        return {"status": "healthy", "checks": results}
    else:
        return JSONResponse(
            {"status": "unhealthy", "checks": results},
            status_code=503
        )

Prometheus metrics:

results = config.health_check()
for check_name, result in results.items():
    metrics.gauge(
        "config_health_check",
        1 if result.is_healthy else 0,
        tags={"check": check_name}
    )

Note

Thread-safe read operation. Does not modify configuration state.

See also

HealthCheckResult: Result structure

load(name: str, file_path: str) None[source]

Load a configuration file manually without Pydantic model.

Convenience shortcut for GenericConfigBuilder registration. Useful for quick configuration loading during development or dynamic configs.

Parameters:
name : str

Configuration namespace (e.g., "payment", "cache").

file_path : str

Relative path to TOML file from root_path (e.g., "config/app/payment.toml").

Example

Quick load without model:

# Load payment config
config.load("payment", "config/app/payment.toml")

# Access immediately
api_key = config.get("payment.stripe.api_key")
webhook_secret = config.get("payment.stripe.webhook_secret")

Multiple configs:

# Load several configs
config.load("cache", "config/cache.toml")
config.load("email", "config/email.toml")
config.load("features", "config/features.toml")

Warning

No Pydantic validation - values are loaded as-is from TOML. For strict typing, use register_builder() with a model.

Note

  • Internally creates a GenericConfigBuilder without model

  • Equivalent to register_builder(name, GenericConfigBuilder(name, [file_path]))

  • File must be valid TOML format

See also

register_builder(): Register with Pydantic model auto_discover(): Automatic multi-file discovery

register_builder(name: str, builder: type[AbstractConfigBuilder] | AbstractConfigBuilder) None[source]

Register a custom configuration builder.

Builders define how to load and validate configuration from files. Must be called during application bootstrap before accessing config values.

Parameters:
name : str

Configuration namespace (e.g., "database", "api"). Used as prefix for all keys (database.host, database.port).

builder : Union[Type[AbstractConfigBuilder], AbstractConfigBuilder]

Builder class or instance: either a builder class (instantiated internally) or a pre-configured builder instance (e.g., a GenericConfigBuilder).

Raises:

ConfigurationError – If builder registration fails or model validation fails.

Example

With custom Pydantic model:

from structum_lab.plugins.dynaconf.core.builders import GenericConfigBuilder
from myapp.models import DatabaseConfig

# Define strict Pydantic model
class DatabaseConfig(BaseModel):
    host: str = "localhost"
    port: int = Field(ge=1, le=65535)
    database: str

# Register with model
builder = GenericConfigBuilder(
    name="database",
    files=["config/database.toml"],
    model=DatabaseConfig
)
config.register_builder("database", builder)

# Access validated config
port = config.get("database.port")  # Guaranteed int, 1-65535

Without model (dynamic):

# Load without validation
config.load("cache", "config/cache.toml")
ttl = config.get("cache.ttl", 3600)  # No type enforcement

Note

  • Configuration is loaded immediately upon registration

  • Validation errors are logged but don’t block startup

  • Hot reload auto-starts on first builder if enable_hot_reload=True

  • Thread-safe write operation

See also

auto_discover(): Automatic builder registration load(): Shortcut for GenericConfigBuilder AbstractConfigBuilder: Builder base class

reload() None[source]

Reload all configuration from disk.

Re-reads TOML files, JSON overrides, and environment variables. Rebuilds Pydantic models and flushes cache. Useful for pulling external configuration changes.

Example

Manual reload:

# External process updated ~/.structum/database.json
config.reload()
new_port = config.get("database.port")  # Fresh value

Reload on signal:

import signal

def handle_sighup(signum, frame):
    log.info("Reloading configuration...")
    config.reload()

signal.signal(signal.SIGHUP, handle_sighup)

Hot reload (automatic):

# Instead of manual reload, use hot reload
config.enable_hot_reload()
# Config auto-reloads on file changes

Warning

  • Blocks all config operations during reload (write lock)

  • May raise validation errors if files corrupted

  • Cache is completely flushed

Note

  • Thread-safe write operation

  • Reloads in priority order: TOML → JSON → ENV

  • Hot reload feature calls this automatically on file changes

See also

save(): Persist configuration to disk enable_hot_reload(): Automatic reload on file changes

save() None[source]

Persist current configuration state to disk.

Saves all registered configuration namespaces to JSON files in ~/.structum/. Each namespace gets its own file (e.g., database.json, api.json).

Raises:

IOError – If file write fails (permissions, disk full, etc.).

Example

Explicit save after updates:

# Make changes
config.set("database.port", 5433)
config.set("database.pool_size", 20)

# Persist to disk
config.save()  # Writes ~/.structum/database.json

Transaction with save:

with config.transaction() as tx:
    tx.set("api.rate_limit", 1000)
    tx.set("api.timeout", 30)
# Auto-saved on commit

Periodic auto-save:

from apscheduler.schedulers.background import BackgroundScheduler

scheduler = BackgroundScheduler()
scheduler.add_job(
    config.save,
    'interval',
    minutes=5,
    id='config_autosave'
)
scheduler.start()

Warning

Overwrites existing files. No backup is created automatically. Consider version control or backup strategy for production.

Note

  • Creates ~/.structum/ directory if missing

  • Saves Pydantic models as JSON with {"default": {...}} wrapper

  • Uses model’s by_alias=True for UPPERCASE keys if defined

  • Errors are logged but don’t block (individual namespace failures)

See also

reload(): Load configuration from disk set(): Update values (doesn’t auto-save)

set(key: str, value: Any) None[source]

Update a configuration value with Pydantic validation.

Implements set() with atomic updates, cache invalidation, and model validation. Does NOT auto-save - call save() explicitly to persist.

Parameters:
key : str

Dot-notation key (e.g., "database.port").

value : Any

New value. Must pass Pydantic model validation.

Raises:

ConfigurationError – If key invalid, namespace not registered, or value fails Pydantic validation.

Example

Update with validation:

# Pydantic model validates port is int, 1-65535
config.set("database.port", 5433)  # OK
config.set("database.port", "invalid")  # Raises ConfigurationError

# Must save explicitly
config.save()  # Persist to ~/.structum/database.json

Runtime reconfiguration:

@app.post("/admin/config")
def update_config(key: str, value: Any):
    try:
        config.set(key, value)
        config.save()
        return {"status": "updated", "key": key}
    except ConfigurationError as e:
        raise HTTPException(400, str(e))

Cache invalidation behavior:

# Setting "backend.db.port" invalidates:
# - "backend.db.port" (exact key)
# - "backend.db.*" (parent prefix)
# - "backend.*" (grandparent prefix)
config.set("backend.db.port", 5433)

Warning

Changes are in-memory only until save() is called. Use transaction() for atomic multi-key updates.

Note

  • Thread-safe write operation

  • Validates against Pydantic model in real-time

  • Cascading cache invalidation for consistency

  • Metrics emitted for operations and errors

See also

get(): Retrieve values save(): Persist changes to disk transaction(): Atomic multi-updates

set_atomic(key: str, value: Any) None[source]

Set value atomically without acquiring lock (internal use).

Used by ConfigTransaction which already holds write lock. Applies change and saves immediately.

Parameters:
key : str

Configuration key.

value : Any

New value.

Note

Assumes caller holds write lock. Not for direct use.

See also

set(): Public set method transaction(): Atomic multi-updates

transaction() Generator[ConfigTransaction, None, None][source]

Atomic transaction context for multiple configuration updates.

Groups multiple set() operations into a single atomic unit. All changes commit together or rollback on exception.

Yields:

ConfigTransaction – Transaction object with set() method.

Raises:

ConfigurationError – If any update in the transaction fails validation; the whole transaction is rolled back and no changes are applied.

Example

Atomic multi-key update:

with config.transaction() as tx:
    tx.set("api.rate_limit", 1000)
    tx.set("api.timeout_seconds", 30)
    tx.set("api.retry_attempts", 3)
# All changes committed atomically here

Rollback on error:

try:
    with config.transaction() as tx:
        tx.set("database.port", 5433)
        tx.set("database.invalid_key", "value")  # Error!
except ConfigurationError:
    # Transaction auto-rolled back
    port = config.get("database.port")  # Still old value

Feature flag migration:

# Safely migrate multiple feature flags
with config.transaction() as tx:
    tx.set("features.new_ui", True)
    tx.set("features.old_ui", False)
    tx.set("features.migration_complete", True)

Note

  • Single write lock held for entire transaction

  • Auto-saves on successful commit

  • Automatic rollback on any exception

See also

set(): Single value updates ConfigTransaction: Transaction implementation

class structum_lab.plugins.dynaconf.AbstractConfigBuilder[source]

Bases: Generic[T], ABC

Abstract contract for all configuration builders.

To create a concrete builder, extend this class and implement:

  • config_name: unique name of the configuration

  • config_model_class: the Pydantic class used to validate the configuration

  • _get_specific_files: the list of TOML files to load

Example:

class MyConfigBuilder(AbstractConfigBuilder[MyConfig]):

    @property
    def config_name(self) -> str:
        return "myapp"

    @property
    def config_model_class(self) -> Type[MyConfig]:
        return MyConfig

    def _get_specific_files(self) -> List[str]:
        return ["config/app/myapp.toml"]

abstract property config_model_class : type[T]

The Pydantic class that represents the configuration.

abstract property config_name : str

The unique name of the configuration (e.g. 'backend').

create_config_model(settings: LazySettings) T[source]

Create and validate the Pydantic model from the Dynaconf settings.

property envvar_prefix : str

Prefix for environment variables.

Returns:

The prefix (default: STRUCTUM).

Return type:

str

get_migrations() list[Migration][source]

Override this method to register migrations for this config.

Returns:

List of Migration instances to apply.

get_persistence_file() Path[source]

Return the full path of the JSON persistence file.

get_user_config_dir() Path[source]

Resolve the user directory used for persistence (~/.structum).

load_settings() LazySettings[source]

Load settings by merging static TOML files with user JSON overrides.

class structum_lab.plugins.dynaconf.GenericConfigBuilder(name: str, files: list[str], model: type[BaseModel] | None = None, env_prefix: str = 'STRUCTUM')[source]

Bases: AbstractConfigBuilder

A ready-to-use generic builder. It removes the need to create a subclass for every configuration file.

Usage:

builder = GenericConfigBuilder("auth", ["config/app/auth.toml"])
provider.register_builder("auth", builder)

Optional:

You can pass a specific Pydantic model to enforce types. If none is provided, a permissive dynamic model is created (extra="allow").

__init__(name: str, files: list[str], model: type[BaseModel] | None = None, env_prefix: str = 'STRUCTUM')[source]
property config_model_class : type[BaseModel]

The Pydantic class that represents the configuration.

property config_name : str

The unique name of the configuration (e.g. 'backend').

property envvar_prefix : str

Prefix for environment variables.

Returns:

The prefix (default: STRUCTUM).

Return type:

str

exception structum_lab.plugins.dynaconf.ConfigurationError(message: str)[source]

Bases: Exception

General configuration error (base exception).

__init__(message: str)[source]

Core Components

Provider

Structum provider built on Dynaconf.

It exposes the ConfigInterface to the core, using the internal library to build validated Pydantic models and provide dot-notation access.

class structum_lab.plugins.dynaconf.core.provider.DynaconfConfigProvider(root_path: str = '.', env_prefix: str = 'STRUCTUM', environments: bool = True, active_env: str = 'development', enable_hot_reload: bool = False, enable_cache: bool = True)[source]

Bases: ConfigProviderInterface

Advanced configuration provider built on Dynaconf with Pydantic validation.

Acts as adapter between Structum’s key-value ConfigInterface and strongly-typed Pydantic models, providing:

  • Auto-discovery: Scans config/app/*.toml and matches with config/models/*.py

  • Type safety: Pydantic validation on all config updates

  • Hot reload: Optional file watching with automatic reload

  • Transactions: Atomic multi-key updates

  • Health checks: Configuration integrity and validation monitoring

  • Smart caching: Multi-level cache with invalidation

  • Metrics: Prometheus-compatible operation metrics

Architecture:

The provider maintains two representations:

  1. Pydantic Models (source of truth) - Strongly typed, validated

  2. Flat Dictionary (performance cache) - Fast dot-notation access

All updates flow through Pydantic validation before being persisted.

Configuration Flow:

TOML Files → Dynaconf → Pydantic Models → Flat Cache → get()
                               ↓
                       Validation Errors
                               ↓
                    ConfigurationError

Example

Basic usage with auto-discovery:

from structum_lab.plugins.dynaconf import DynaconfConfigProvider

# Initialize provider
config = DynaconfConfigProvider(
    root_path=".",
    env_prefix="MYAPP",
    enable_hot_reload=True
)

# Auto-discover config files
config.auto_discover(
    app_dir="config/app",
    models_dir="config/models"
)

# Access configuration
db_host = config.get("database.host", "localhost")
db_port = config.get("database.port", 5432)

# Update with validation
config.set("database.port", 5433)  # Validates against Pydantic model
config.save()  # Persist to ~/.structum/

With manual builder registration:

from structum_lab.plugins.dynaconf.core.builders import GenericConfigBuilder
from myapp.config_models import DatabaseConfig

# Register typed configuration
builder = GenericConfigBuilder(
    name="database",
    files=["config/database.toml"],
    model=DatabaseConfig  # Pydantic model
)
config.register_builder("database", builder)

# Access is now type-safe
port = config.get("database.port")  # Validated as int

Atomic transactions:

# Multiple updates atomically
with config.transaction() as tx:
    tx.set("api.rate_limit", 1000)
    tx.set("api.timeout_seconds", 30)
    # All changes applied together on commit

Implementations:

This is the primary implementation of ConfigInterface for Structum Lab applications.

Note

  • Thread-safe with read-write locks

  • Supports environment variable overrides (MYAPP_DATABASE_PORT)

  • Hot reload requires watchdog package

  • Persistence files stored in ~/.structum/

Warning

Do not modify _flat_config or _manager directly. Always use public methods (get, set, transaction) to ensure validation and cache coherence.

See also

ConfigInterface: Base protocol AbstractConfigBuilder: Custom builders ConfigTransaction: Transaction API

__init__(root_path: str = '.', env_prefix: str = 'STRUCTUM', environments: bool = True, active_env: str = 'development', enable_hot_reload: bool = False, enable_cache: bool = True)[source]

Initialize the configuration provider.

Sets up the provider infrastructure including thread-safe locks, configuration manager, health checks, caching, and optional hot reload capabilities.

Parameters:
root_path : str

Root directory for resolving configuration file paths. All relative paths in builders will be resolved from this directory. Defaults to current directory (".").

env_prefix : str

Prefix for environment variable overrides. E.g., "MYAPP" enables MYAPP_DATABASE_PORT to override database.port. Defaults to "STRUCTUM".

environments : bool

Enable multi-environment support ([production], [development] in TOML files). If True, environment-specific sections are loaded based on active_env. Defaults to True.

active_env : str

Active environment name when environments=True. Determines which TOML section to load. Defaults to "development".

enable_hot_reload : bool

Enable automatic file watching and reload. Requires watchdog package. When True, watches configuration files for changes and reloads automatically. Defaults to False.

enable_cache : bool

Enable smart caching for get() operations. Significantly improves performance for frequently accessed keys. Defaults to True.

Raises:

ImportError – If enable_hot_reload=True but watchdog not installed.

Example

Development setup with hot reload:

config = DynaconfConfigProvider(
    root_path="/app",
    env_prefix="MYAPP",
    active_env="development",
    enable_hot_reload=True  # Auto-reload on file changes
)

Production setup (no hot reload, custom prefix):

config = DynaconfConfigProvider(
    root_path="/etc/myapp",
    env_prefix="PROD",
    active_env="production",
    enable_hot_reload=False,  # Static config in production
    enable_cache=True
)

Note

  • Hot reload is started when first builder is registered

  • Cache is populated lazily on first get() call

  • Health checks are registered automatically

  • No configuration is loaded until register_builder() or auto_discover()

See also

auto_discover(): Automatic configuration discovery register_builder(): Manual builder registration enable_hot_reload(): Enable hot reload after initialization

health_check() dict[str, HealthCheckResult][source]

Execute comprehensive configuration health verification.

Runs all registered health checks and returns detailed results. Useful for monitoring dashboards, readiness probes, and diagnostics.

Returns:

Mapping of check names to results.

Each result contains status (healthy/unhealthy), message, and metrics.

Return type:

Dict[str, HealthCheckResult]

Example

FastAPI readiness probe:

@app.get("/health/config")
def config_health():
    results = config.health_check()
    all_healthy = all(r.is_healthy for r in results.values())

    if all_healthy:
        return {"status": "healthy", "checks": results}
    else:
        return JSONResponse(
            {"status": "unhealthy", "checks": results},
            status_code=503
        )

Prometheus metrics:

results = config.health_check()
for check_name, result in results.items():
    metrics.gauge(
        "config_health_check",
        1 if result.is_healthy else 0,
        tags={"check": check_name}
    )

Note

Thread-safe read operation. Does not modify configuration state.

See also

HealthCheckResult: Result structure

register_builder(name: str, builder: type[AbstractConfigBuilder] | AbstractConfigBuilder) None[source]

Register a custom configuration builder.

Builders define how to load and validate configuration from files. Must be called during application bootstrap before accessing config values.

Parameters:
name : str

Configuration namespace (e.g., "database", "api"). Used as prefix for all keys (database.host, database.port).

builder : Union[Type[AbstractConfigBuilder], AbstractConfigBuilder]

Builder class or instance: either a builder class (instantiated internally) or a pre-configured builder instance (e.g., a GenericConfigBuilder).

Raises:

ConfigurationError – If builder registration fails or model validation fails.

Example

With custom Pydantic model:

from structum_lab.plugins.dynaconf.core.builders import GenericConfigBuilder
from myapp.models import DatabaseConfig

# Define strict Pydantic model
class DatabaseConfig(BaseModel):
    host: str = "localhost"
    port: int = Field(ge=1, le=65535)
    database: str

# Register with model
builder = GenericConfigBuilder(
    name="database",
    files=["config/database.toml"],
    model=DatabaseConfig
)
config.register_builder("database", builder)

# Access validated config
port = config.get("database.port")  # Guaranteed int, 1-65535

Without model (dynamic):

# Load without validation
config.load("cache", "config/cache.toml")
ttl = config.get("cache.ttl", 3600)  # No type enforcement

Note

  • Configuration is loaded immediately upon registration

  • Validation errors are logged but don’t block startup

  • Hot reload auto-starts on first builder if enable_hot_reload=True

  • Thread-safe write operation

See also

auto_discover(): Automatic builder registration load(): Shortcut for GenericConfigBuilder AbstractConfigBuilder: Builder base class

transaction() Generator[ConfigTransaction, None, None][source]

Atomic transaction context for multiple configuration updates.

Groups multiple set() operations into a single atomic unit. All changes commit together or rollback on exception.

Yields:

ConfigTransaction – Transaction object with set() method.

Raises:

ConfigurationError – If any update in the transaction fails validation; the whole transaction is rolled back and no changes are applied.

Example

Atomic multi-key update:

with config.transaction() as tx:
    tx.set("api.rate_limit", 1000)
    tx.set("api.timeout_seconds", 30)
    tx.set("api.retry_attempts", 3)
# All changes committed atomically here

Rollback on error:

try:
    with config.transaction() as tx:
        tx.set("database.port", 5433)
        tx.set("database.invalid_key", "value")  # Error!
except ConfigurationError:
    # Transaction auto-rolled back
    port = config.get("database.port")  # Still old value

Feature flag migration:

# Safely migrate multiple feature flags
with config.transaction() as tx:
    tx.set("features.new_ui", True)
    tx.set("features.old_ui", False)
    tx.set("features.migration_complete", True)

Note

  • Single write lock held for entire transaction

  • Auto-saves on successful commit

  • Automatic rollback on any exception

See also

set(): Single value updates ConfigTransaction: Transaction implementation

load(name: str, file_path: str) None[source]

Load a configuration file manually without Pydantic model.

Convenience shortcut for GenericConfigBuilder registration. Useful for quick configuration loading during development or dynamic configs.

Parameters:
name : str

Configuration namespace (e.g., "payment", "cache").

file_path : str

Relative path to TOML file from root_path (e.g., "config/app/payment.toml").

Example

Quick load without model:

# Load payment config
config.load("payment", "config/app/payment.toml")

# Access immediately
api_key = config.get("payment.stripe.api_key")
webhook_secret = config.get("payment.stripe.webhook_secret")

Multiple configs:

# Load several configs
config.load("cache", "config/cache.toml")
config.load("email", "config/email.toml")
config.load("features", "config/features.toml")

Warning

No Pydantic validation - values are loaded as-is from TOML. For strict typing, use register_builder() with a model.

Note

  • Internally creates a GenericConfigBuilder without model

  • Equivalent to register_builder(name, GenericConfigBuilder(name, [file_path]))

  • File must be valid TOML format

See also

register_builder(): Register with Pydantic model auto_discover(): Automatic multi-file discovery

auto_discover(app_dir: str = 'config/app', models_dir: str = 'config/models') list[str][source]

Automatically discover and register configuration files.

Scans directories for TOML files and optionally matches them with Pydantic models. This is the recommended initialization pattern for structured projects.

Discovery Process:

  1. Scan {root_path}/{app_dir}/*.toml for configuration files

  2. For each TOML file (e.g., database.toml): check for a matching model in {root_path}/{models_dir}/database.py; if a model exists, load with strict Pydantic validation; if not, load dynamically without validation

  3. Register all discovered namespaces

Directory Structure Example:

myproject/
├── config/
│   ├── app/
│   │   ├── database.toml      # → database namespace
│   │   ├── api.toml           # → api namespace
│   │   └── cache.toml         # → cache namespace
│   └── models/
│       ├── database.py         # DatabaseConfig (Pydantic)
│       └── api.py              # ApiConfig (Pydantic)
└── app.py

Parameters:
app_dir : str

Relative directory containing TOML files. Defaults to "config/app".

models_dir : str

Relative directory containing Pydantic models. Defaults to "config/models".

Returns:

List of discovered namespace names (e.g., ["database", "api", "cache"]).

Return type:

list[str]

Example

Basic auto-discovery:

from structum_lab.plugins.dynaconf import DynaconfConfigProvider

config = DynaconfConfigProvider(root_path=".")
namespaces = config.auto_discover()
# Logs: Auto-discovered: database [Strict (DatabaseConfig)] → config/app/database.toml
# Logs: Auto-discovered: api [Strict (ApiConfig)] → config/app/api.toml
# Logs: Auto-discovered: cache [Dynamic] → config/app/cache.toml

print(namespaces)  # ['database', 'api', 'cache']

Custom directories:

config = DynaconfConfigProvider(root_path="/etc/myapp")
config.auto_discover(
    app_dir="settings",
    models_dir="schemas"
)

With FastAPI:

from fastapi import FastAPI

app = FastAPI()

@app.on_event("startup")
def load_config():
    config = DynaconfConfigProvider()
    discovered = config.auto_discover()
    app.state.config = config
    log.info(f"Loaded {len(discovered)} config namespaces")

Model File Format (config/models/database.py):

from pydantic import BaseModel, Field

class DatabaseConfig(BaseModel):
    host: str = "localhost"
    port: int = Field(ge=1, le=65535, default=5432)
    database: str
    username: str
    password: str  # Will be overridden by env vars

Warning

  • Hidden files (starting with .) are skipped

  • Already-registered builders are skipped

  • Model loading errors fall back to dynamic mode (logged as warnings)

Note

  • Thread-safe operation

  • Models must inherit from pydantic.BaseModel

  • Model class name can be anything (first BaseModel found is used)

  • Dynamically imports model files via importlib

See also

register_builder(): Manual builder registration load(): Load single file without discovery GenericConfigBuilder: Builder implementation

get(key: str, default: Any = None) Any[source]

Retrieve a configuration value by dot-notation key.

Implements get() with smart caching and performance metrics. Cache-first lookup minimizes lock contention.

Parameters:
key : str

Dot-notation configuration key (e.g., "database.host", "api.rate_limit").

default : Any

Default value if key not found. Defaults to None.

Returns:

Configuration value (typed according to Pydantic model) or default.

Return type:

Any

Example

Basic usage:

# Simple get with default
db_host = config.get("database.host", "localhost")
port = config.get("database.port", 5432)

# Nested keys
timeout = config.get("api.client.timeout_seconds", 30)

Environment override:

# TOML: database.host = "localhost"
# ENV:  MYAPP_DATABASE_HOST="prod-db.example.com"
host = config.get("database.host")  # Returns "prod-db.example.com"

Type safety with Pydantic:

# Model defines port as int with validation
port = config.get("database.port")  # Guaranteed int type

Note

  • Thread-safe read operation

  • Cache hit bypasses locks entirely

  • First access populates cache

  • Metrics emitted for monitoring (cache hit/miss rate, latency)

See also

set(): Update configuration values has(): Check key existence get_cache_stats(): Cache performance metrics

set(key: str, value: Any) None[source]

Update a configuration value with Pydantic validation.

Implements set() with atomic updates, cache invalidation, and model validation. Does NOT auto-save - call save() explicitly to persist.

Parameters:
key : str

Dot-notation key (e.g., "database.port").

value : Any

New value. Must pass Pydantic model validation.

Raises:

ConfigurationError – If key invalid, namespace not registered, or value fails Pydantic validation.

Example

Update with validation:

# Pydantic model validates port is int, 1-65535
config.set("database.port", 5433)  # OK
config.set("database.port", "invalid")  # Raises ConfigurationError

# Must save explicitly
config.save()  # Persist to ~/.structum/database.json

Runtime reconfiguration:

@app.post("/admin/config")
def update_config(key: str, value: Any):
    try:
        config.set(key, value)
        config.save()
        return {"status": "updated", "key": key}
    except ConfigurationError as e:
        raise HTTPException(400, str(e))

Cache invalidation behavior:

# Setting "backend.db.port" invalidates:
# - "backend.db.port" (exact key)
# - "backend.db.*" (parent prefix)
# - "backend.*" (grandparent prefix)
config.set("backend.db.port", 5433)

Warning

Changes are in-memory only until save() is called. Use transaction() for atomic multi-key updates.

Note

  • Thread-safe write operation

  • Validates against Pydantic model in real-time

  • Cascading cache invalidation for consistency

  • Metrics emitted for operations and errors

See also

get(): Retrieve values save(): Persist changes to disk transaction(): Atomic multi-updates

set_atomic(key: str, value: Any) None[source]

Set value atomically without acquiring lock (internal use).

Used by ConfigTransaction which already holds write lock. Applies change and saves immediately.

Parameters:
key : str

Configuration key.

value : Any

New value.

Note

Assumes caller holds write lock. Not for direct use.

See also

set(): Public set method transaction(): Atomic multi-updates

has(key: str) bool[source]

Check if a configuration key exists.

Implements has() with thread-safe lookup.

Parameters:
key : str

Dot-notation configuration key.

Returns:

True if key exists in configuration, False otherwise.

Return type:

bool

Example

Conditional configuration:

# Check before accessing
if config.has("features.experimental"):
    experimental = config.get("features.experimental")
    if experimental:
        enable_experimental_features()

# Avoid KeyError in strict mode
if not config.has("optional.setting"):
    config.set("optional.setting", default_value)

Note

  • Thread-safe read operation

  • Checks flat config cache (no Pydantic model lookup)

  • Does not distinguish between None value and missing key

See also

get(): Retrieve values with default

save() None[source]

Persist current configuration state to disk.

Saves all registered configuration namespaces to JSON files in ~/.structum/. Each namespace gets its own file (e.g., database.json, api.json).

Raises:

IOError – If file write fails (permissions, disk full, etc.).

Example

Explicit save after updates:

# Make changes
config.set("database.port", 5433)
config.set("database.pool_size", 20)

# Persist to disk
config.save()  # Writes ~/.structum/database.json

Transaction with save:

with config.transaction() as tx:
    tx.set("api.rate_limit", 1000)
    tx.set("api.timeout", 30)
# Auto-saved on commit

Periodic auto-save:

from apscheduler.schedulers.background import BackgroundScheduler

scheduler = BackgroundScheduler()
scheduler.add_job(
    config.save,
    'interval',
    minutes=5,
    id='config_autosave'
)
scheduler.start()

Warning

Overwrites existing files. No backup is created automatically. Consider version control or backup strategy for production.

Note

  • Creates ~/.structum/ directory if missing

  • Saves Pydantic models as JSON with {"default": {...}} wrapper

  • Uses model’s by_alias=True for UPPERCASE keys if defined

  • Errors are logged but don’t block (individual namespace failures)

See also

reload(): Load configuration from disk set(): Update values (doesn’t auto-save)

reload() None[source]

Reload all configuration from disk.

Re-reads TOML files, JSON overrides, and environment variables. Rebuilds Pydantic models and flushes cache. Useful for pulling external configuration changes.

Example

Manual reload:

# External process updated ~/.structum/database.json
config.reload()
new_port = config.get("database.port")  # Fresh value

Reload on signal:

import signal

def handle_sighup(signum, frame):
    log.info("Reloading configuration...")
    config.reload()

signal.signal(signal.SIGHUP, handle_sighup)

Hot reload (automatic):

# Instead of manual reload, use hot reload
config.enable_hot_reload()
# Config auto-reloads on file changes

Warning

  • Blocks all config operations during reload (write lock)

  • May raise validation errors if files corrupted

  • Cache is completely flushed

Note

  • Thread-safe write operation

  • Reloads in priority order: TOML → JSON → ENV

  • Hot reload feature calls this automatically on file changes

See also

save(): Persist configuration to disk enable_hot_reload(): Automatic reload on file changes

enable_hot_reload(watch_secrets: bool = True, debounce_seconds: float = 2.0, callback: Any | None = None) None[source]

Enable automatic configuration reloading on file changes.

Watches configuration files (TOML, JSON) and automatically calls reload() when changes detected. Requires watchdog package.

Parameters:
watch_secrets : bool

Whether to watch secret files in addition to config files. Defaults to True.

debounce_seconds : float

Debounce interval to avoid rapid reloads on multiple file changes. Defaults to 2.0 seconds.

callback : Optional[Callable]

Optional callback function called after reload. Signature: callback(changed_files: list[str]) -> None.

Raises:

ImportError – If watchdog package not installed.

Example

Development environment:

config = DynaconfConfigProvider(
    root_path=".",
    enable_hot_reload=False  # Start disabled
)
config.auto_discover()

# Enable later with custom debounce
config.enable_hot_reload(debounce_seconds=5.0)

# Now changes to config files auto-reload

With callback:

def on_config_changed(files):
    log.info(f"Config reloaded: {files}")
    # Notify connected clients
    websocket.broadcast({"type": "config_updated"})

config.enable_hot_reload(
    debounce_seconds=3.0,
    callback=on_config_changed
)

Production (disabled):

# Don't use hot reload in production
config = DynaconfConfigProvider(
    enable_hot_reload=False  # Static config
)

Warning

  • Not recommended for production (use SIGHUP reload instead)

  • Debounce prevents rapid reload storms but adds latency

  • File watcher holds file system resources

Note

  • Watches both TOML source files and JSON persistence files

  • Auto-started on first builder if enable_hot_reload=True in __init__

  • Calls are idempotent (ignores if already enabled)

  • Cleanup automatic on provider destruction

See also

reload(): Manual reload HotReloadManager: Watcher implementation

__del__() None[source]

Cleanup hot reload observer on provider destruction.

Automatically called by Python garbage collector. Stops file watchers to prevent resource leaks.

Note

Manual cleanup of the hot reload observer is not required.

get_cache_stats() dict[str, Any][source]

Retrieve cache performance metrics for monitoring.

Returns cache hit rate, size, and other statistics. Automatically emits metrics to Prometheus/statsD via metrics interface.

Returns:

Cache statistics dictionary:
  • enabled (bool): Whether cache is active

  • size (int): Number of cached keys

  • hit_rate (float): Cache hit ratio (0.0-1.0)

  • Additional metrics from cache implementation

Return type:

Dict[str, Any]

Example

Monitoring dashboard:

stats = config.get_cache_stats()
print(f"Cache enabled: {stats['enabled']}")
print(f"Cache size: {stats['size']} keys")
print(f"Hit rate: {stats['hit_rate']:.2%}")

Prometheus metrics:

# Called automatically by metrics.gauge()
# structum_config_cache_size{} 1234
# structum_config_cache_hit_rate{} 0.95

Health check integration:

@app.get("/health/cache")
def cache_health():
    stats = config.get_cache_stats()
    if not stats.get('enabled'):
        return {"status": "disabled"}

    if stats['hit_rate'] < 0.7:
        return {"status": "degraded", "hit_rate": stats['hit_rate']}

    return {"status": "healthy", **stats}

Note

  • Returns {"enabled": False} if cache disabled

  • Metrics auto-emitted as gauges

  • Thread-safe read operation

See also

__init__(): enable_cache parameter SmartCache: Cache implementation

Manager

Central Configuration Manager for Structum.

Provides a centralized, singleton access point for all configurations. Supports dynamic builder registration for extensibility.

class structum_lab.plugins.dynaconf.core.manager.ConfigManager[source]

Bases: object

Handles loading, caching, and access to configurations.

__init__()[source]
register_builder(name: str, builder: type[AbstractConfigBuilder] | AbstractConfigBuilder) None[source]

Register a new configuration builder.

load_all() None[source]

Force-load all registered configurations.

get_config(config_name: str) Any[source]

Retrieve a specific configuration with lazy caching.

refresh() None[source]

Invalidate the internal cache.
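
A minimal usage sketch of the manager on its own, based on the methods listed above (normally the manager is driven internally by DynaconfConfigProvider; the "auth" namespace and file path are illustrative):

from structum_lab.plugins.dynaconf.core.builders import GenericConfigBuilder
from structum_lab.plugins.dynaconf.core.manager import ConfigManager

manager = ConfigManager()

# Register a builder for the "auth" namespace
manager.register_builder("auth", GenericConfigBuilder("auth", ["config/app/auth.toml"]))

# Force-load every registered configuration
manager.load_all()

# Lazy, cached access to a single configuration model
auth_config = manager.get_config("auth")

# Invalidate the internal cache so the next get_config() rebuilds the model
manager.refresh()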

Builders

Configuration Builders (part of the Builder pattern).

This module provides the abstract base class AbstractConfigBuilder, which defines the contract for all configuration builders.

Applications that use structum_lab.plugins.dynaconf must provide their own concrete implementations of this base class.

class structum_lab.plugins.dynaconf.core.builders.AbstractConfigBuilder[source]

Bases: Generic[T], ABC

Abstract contract for all configuration builders.

To create a concrete builder, extend this class and implement:

  • config_name: unique name of the configuration

  • config_model_class: the Pydantic class used to validate the configuration

  • _get_specific_files: the list of TOML files to load

Example:

class MyConfigBuilder(AbstractConfigBuilder[MyConfig]):

    @property
    def config_name(self) -> str:
        return "myapp"

    @property
    def config_model_class(self) -> Type[MyConfig]:
        return MyConfig

    def _get_specific_files(self) -> List[str]:
        return ["config/app/myapp.toml"]

abstract property config_name : str

The unique name of the configuration (e.g. 'backend').

abstract property config_model_class : type[T]

The Pydantic class that represents the configuration.

property envvar_prefix : str

Prefix for environment variables.

Returns:

The prefix (default: STRUCTUM).

Return type:

str

get_user_config_dir() Path[source]

Resolve the user directory used for persistence (~/.structum).

get_persistence_file() Path[source]

Return the full path of the JSON persistence file.

get_migrations() list[Migration][source]

Override this method to register migrations for this config.

Returns:

List of Migration instances to apply.

load_settings() LazySettings[source]

Load settings by merging static TOML files with user JSON overrides.

create_config_model(settings: LazySettings) T[source]

Create and validate the Pydantic model from the Dynaconf settings.

class structum_lab.plugins.dynaconf.core.builders.GenericConfigBuilder(name: str, files: list[str], model: type[BaseModel] | None = None, env_prefix: str = 'STRUCTUM')[source]

Bases: AbstractConfigBuilder

A ready-to-use generic builder. It removes the need to create a subclass for every configuration file.

Usage:

builder = GenericConfigBuilder("auth", ["config/app/auth.toml"])
provider.register_builder("auth", builder)

Optional:

You can pass a specific Pydantic model to enforce types. If none is provided, a permissive dynamic model is created (extra="allow").

__init__(name: str, files: list[str], model: type[BaseModel] | None = None, env_prefix: str = 'STRUCTUM')[source]
property config_name : str

The unique name of the configuration (e.g. 'backend').

property config_model_class : type[BaseModel]

The Pydantic class that represents the configuration.

property envvar_prefix : str

Prefix for environment variables.

Returns:

The prefix (default: STRUCTUM).

Return type:

str

Director

Configuration Director (part of the Builder pattern).

This module defines the ConfigDirector class, which plays the "Director" role in the Builder design pattern. Its sole responsibility is to orchestrate the construction of a configuration object by calling the methods of an AbstractConfigBuilder in a specific sequence.

The Director decouples the client (the ConfigManager) from the concrete construction process: the ConfigManager only needs to know that it needs a configuration, without knowing the exact steps required to build it.

Architecture:

[ConfigManager] -> [ConfigDirector] -> [SpecificConfigBuilder]

class structum_lab.plugins.dynaconf.core.director.ConfigDirector[source]

Bases: object

Orchestrates the configuration construction process.

This class is stateless and exposes a single static construct method that runs the build sequence. It has no knowledge of how a specific configuration is built (e.g. which files are read); it delegates those details entirely to the builder it receives.

static construct(builder: AbstractConfigBuilder[T]) T[source]

Build a configuration object using the provided builder.

This method runs the standard "recipe" for building any configuration in the system:

  1. Call load_settings() on the builder to load all raw data from files and environment variables into a Dynaconf object.

  2. Pass the resulting Dynaconf object to the builder's create_config_model() method, which performs Pydantic validation and creates the final model.

Parameters:
builder: AbstractConfigBuilder[T]

A concrete AbstractConfigBuilder instance that knows how to build a specific type of configuration.

Returns:

A fully validated Pydantic configuration model instance, ready to use.
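
A short sketch of how the Director is typically invoked (MyConfigBuilder is the hypothetical builder from the AbstractConfigBuilder example above):

from structum_lab.plugins.dynaconf.core.director import ConfigDirector

# Build a fully validated Pydantic model from a concrete builder
builder = MyConfigBuilder()
my_config = ConfigDirector.construct(builder)

# The result is an instance of the builder's config_model_class
print(my_config.model_dump())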

Exceptions

Custom exceptions for the Structum Dynaconf plugin.

exception structum_lab.plugins.dynaconf.core.exceptions.ConfigurationError(message: str)[source]

Bases: Exception

General configuration error (base exception).

__init__(message: str)[source]
exception structum_lab.plugins.dynaconf.core.exceptions.ConfigDiscoveryError(message: str)[source]

Bases: ConfigurationError

Error raised during auto-discovery (e.g. missing directories).

exception structum_lab.plugins.dynaconf.core.exceptions.ConfigLoadError(message: str)[source]

Bases: ConfigurationError

Error raised while loading a TOML file (e.g. a syntax error).

exception structum_lab.plugins.dynaconf.core.exceptions.ConfigValidationError(message: str)[source]

Bases: ConfigurationError

Pydantic validation error.

exception structum_lab.plugins.dynaconf.core.exceptions.ConfigPersistenceError(message: str)[source]

Bases: ConfigurationError

Error raised during runtime save or load.

exception structum_lab.plugins.dynaconf.core.exceptions.NamespaceNotFoundError(message: str)[source]

Bases: ConfigurationError

The requested namespace is not registered.
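
Because every exception above derives from ConfigurationError, callers can catch a specific failure mode or the whole family. A minimal sketch (whether set() surfaces ConfigValidationError specifically or only the base ConfigurationError is an assumption; config and log come from earlier examples):

from structum_lab.plugins.dynaconf.core.exceptions import (
    ConfigurationError,
    ConfigValidationError,
)

try:
    config.set("database.port", "not-a-port")
except ConfigValidationError as exc:
    # Pydantic rejected the value; the previous value is still in place
    log.warning(f"Invalid config value: {exc}")
except ConfigurationError as exc:
    # Any other configuration failure (discovery, load, persistence, unknown namespace)
    log.error(f"Configuration error: {exc}")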

Features

Cache

Smart Caching System for Configuration.

Provides TTL-based caching with LRU eviction and selective invalidation.

class structum_lab.plugins.dynaconf.features.cache.CacheEntry(value: ~typing.Any, timestamp: ~datetime.datetime = <factory>, hit_count: int = 0)[source]

Bases: object

Entry in cache with metadata.

value : Any
timestamp : datetime
hit_count : int = 0
is_expired(ttl: timedelta) bool[source]

Check if entry has exceeded TTL.

age_seconds() float[source]

Return age in seconds.

__init__(value: ~typing.Any, timestamp: ~datetime.datetime = <factory>, hit_count: int = 0)
class structum_lab.plugins.dynaconf.features.cache.SmartCache(max_size: int = 1000, default_ttl: timedelta = datetime.timedelta(seconds=3600))[source]

Bases: object

Cache with TTL, LRU eviction, and selective invalidation.

__init__(max_size: int = 1000, default_ttl: timedelta = datetime.timedelta(seconds=3600))[source]
get(key: str) Any | None[source]

Retrieve from cache.

set(key: str, value: Any) None[source]

Insert into cache.

invalidate(key: str) None[source]

Invalidate a specific key.

invalidate_prefix(prefix: str) int[source]

Invalidate all keys starting with prefix.

clear() None[source]

Clear entire cache.

get_stats() dict[str, Any][source]

Get cache statistics for monitoring.
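
A brief standalone sketch of the cache API listed above (the provider manages a SmartCache internally; the TTL and keys are illustrative):

from datetime import timedelta

from structum_lab.plugins.dynaconf.features.cache import SmartCache

cache = SmartCache(max_size=500, default_ttl=timedelta(minutes=10))

cache.set("database.port", 5432)
port = cache.get("database.port")  # 5432, or None if expired/evicted

# Invalidate a single key, or every key under a namespace prefix
cache.invalidate("database.port")
invalidated = cache.invalidate_prefix("database.")

print(cache.get_stats())  # size, hit rate, and other monitoring metrics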

Health Checks

Configuration Health Check System.

Provides architectural components to perform runtime validation of the configuration system integrity.

class structum_lab.plugins.dynaconf.features.health.HealthStatus(*values)[source]

Bases: str, Enum

HEALTHY = 'healthy'
DEGRADED = 'degraded'
UNHEALTHY = 'unhealthy'
class structum_lab.plugins.dynaconf.features.health.HealthCheckResult(name: str, status: structum_lab.plugins.dynaconf.features.health.HealthStatus, message: str, metadata: Dict[str, Any] = <factory>)[source]

Bases: object

name : str
status : HealthStatus
message : str
metadata : Dict[str, Any]
__init__(name: str, status: ~structum_lab.plugins.dynaconf.features.health.HealthStatus, message: str, metadata: ~typing.Dict[str, ~typing.Any] = <factory>)
class structum_lab.plugins.dynaconf.features.health.HealthCheck(*args, **kwargs)[source]

Bases: Protocol

Protocol for a configuration health check component.

abstractmethod check() HealthCheckResult[source]

Executes the check and returns the result.

__init__(*args, **kwargs)
class structum_lab.plugins.dynaconf.features.health.ConfigFileIntegrityCheck(config_manager: ConfigManager)[source]

Bases: object

Verifies that persistence files are valid JSON if they exist.

__init__(config_manager: ConfigManager)[source]
check() HealthCheckResult[source]
class structum_lab.plugins.dynaconf.features.health.PydanticValidationCheck(config_manager: ConfigManager)[source]

Bases: object

Re-validates all loaded configuration models against their schemas.

__init__(config_manager: ConfigManager)[source]
check() HealthCheckResult[source]
class structum_lab.plugins.dynaconf.features.health.HealthCheckRegistry[source]

Bases: object

Registry to manage and run multiple health checks.

__init__()[source]
register(check: HealthCheck) None[source]
run_all() dict[str, HealthCheckResult][source]

Executes all registered checks.
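
A minimal sketch wiring the registry with the two built-in checks (the provider's health_check() does this for you; the manager variable is assumed to be a ConfigManager as in the Manager example above):

from structum_lab.plugins.dynaconf.features.health import (
    ConfigFileIntegrityCheck,
    HealthCheckRegistry,
    HealthStatus,
    PydanticValidationCheck,
)

registry = HealthCheckRegistry()
registry.register(ConfigFileIntegrityCheck(manager))
registry.register(PydanticValidationCheck(manager))

# Run every registered check and report unhealthy ones
for name, result in registry.run_all().items():
    if result.status != HealthStatus.HEALTHY:
        print(f"{name}: {result.status.value} - {result.message}")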

Locks

Synchronization primitives for concurrent configuration access.

Provides a ReadWriteLock to allow multiple concurrent readers but exclusive writers, optimizing performance for read-heavy configuration workloads.

class structum_lab.plugins.dynaconf.features.locks.ReadWriteLock[source]

Bases: object

A lock allowing multiple simultaneous readers but exclusive access for writers.

This implementation gives priority to writers to prevent starvation.

__init__()[source]
read_lock() Generator[None, None, None][source]

Acquire a read lock. Blocks if there are active writers.

write_lock() Generator[None, None, None][source]

Acquire a write lock. Blocks until all readers and writers release.
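
A minimal sketch of guarding shared state with the lock, assuming read_lock() and write_lock() are context managers as their generator signatures suggest (the shared dict is illustrative):

from structum_lab.plugins.dynaconf.features.locks import ReadWriteLock

lock = ReadWriteLock()
shared_config: dict[str, object] = {}

def read_value(key: str) -> object | None:
    # Multiple readers may hold the lock at the same time
    with lock.read_lock():
        return shared_config.get(key)

def write_value(key: str, value: object) -> None:
    # Writers get exclusive access; writer priority prevents starvation
    with lock.write_lock():
        shared_config[key] = value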