diff --git a/CHANGELOG.md b/CHANGELOG.md index f95cbb7..e2f9391 100644 --- a/CHANGELOG.md +++ b/CHANGELOG.md @@ -18,6 +18,7 @@ and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0 ### Added +- Dynamic plugin configuration UI (PRD-v2 P0.15, task 15): plugins declaring a `[config]` block in their `plugin.toml` now expose their schema at runtime. Backend adds `ConfigField` / `ConfigFieldType` / `PluginConfigSchema` to `domain/model/plugin.rs` (typed validation, enum options, `min`/`max` bounds, regex via a std-only matcher — no external import in the domain), a `PluginConfigStore` port (`get_values` / `set_value` / `list_all` / `delete_all`) implemented by `SqlitePluginConfigRepo` backed by the new `plugin_configs (plugin_name, key, value)` table (migration `m20260425_000005_create_plugin_configs`, composite primary key). The manifest parser (`adapters/driven/plugin/manifest.rs`) now extracts `type`, `default`, `options`, `description`, `min`, `max`, `regex` on top of the existing defaults, and rejects defaults that fail their own field validation. CQRS gains `UpdatePluginConfigCommand` (validates against the schema, applies the runtime first then persists, rolls back on failure) and `GetPluginConfigQuery` (returns the schema plus persisted values, dropping any persisted entry that no longer matches the current schema and falling back to manifest defaults). `PluginLoader` is extended with `get_manifest()` and `set_runtime_config()`; `ExtismPluginLoader` implements both by reading from `PluginRegistry` and writing to `SharedHostResources::plugin_configs`, so `get_config(key)` calls from the WASM plugin observe the new value without a reload. At startup, `lib.rs` replays persisted configs onto the in-memory map before plugins are loaded. 
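As a rough illustration of the typed-validation idea, a minimal sketch (simplified, hypothetical types — the real `ConfigField` / `PluginConfigSchema` live in `domain/model/plugin.rs` and carry more state, including the regex matcher omitted here):

```rust
// Sketch of schema-side validation for plugin config values.
// `Field` / `FieldType` are illustrative, not the actual domain types.
#[derive(Clone, Copy, PartialEq)]
enum FieldType { String, Boolean, Integer, Float, Enum }

struct Field {
    ty: FieldType,
    options: Vec<String>, // for enum (or string-with-options) fields
    min: Option<f64>,     // numeric lower bound
    max: Option<f64>,     // numeric upper bound
}

impl Field {
    fn validate(&self, value: &str) -> Result<(), String> {
        match self.ty {
            FieldType::Boolean => value
                .parse::<bool>()
                .map(|_| ())
                .map_err(|_| format!("'{value}' is not a boolean")),
            FieldType::Integer | FieldType::Float => {
                let n: f64 = value
                    .parse()
                    .map_err(|_| format!("'{value}' is not numeric"))?;
                if self.ty == FieldType::Integer && n.fract() != 0.0 {
                    return Err(format!("'{value}' is not an integer"));
                }
                if self.min.is_some_and(|m| n < m) || self.max.is_some_and(|m| n > m) {
                    return Err(format!("'{value}' is out of bounds"));
                }
                Ok(())
            }
            FieldType::Enum => {
                if self.options.iter().any(|o| o == value) {
                    Ok(())
                } else {
                    Err(format!("'{value}' is not one of the declared options"))
                }
            }
            FieldType::String => Ok(()), // regex check omitted in this sketch
        }
    }
}
```

The same `validate` path being reused for manifest defaults is what lets the parser reject a default that fails its own field's constraints.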
Frontend adds two components: `PluginConfigField.tsx` (dispatcher renderer: `string` → text input, `boolean` → shadcn switch, `integer`/`float` → numeric input with bounds, `url` → url input, `enum` (and `string` with options) → shadcn select; `aria-describedby` on the control points to the error message) and `PluginConfigDialog.tsx` (loads the schema via `useQuery`, validates each field on the UI side (rejects empty floats, validates JSON arrays) before sending, persists changed values sequentially, guards the schema-reset effect while a save is in flight to avoid clobbering the draft, invalidates the query on success). `PluginsView` queries `plugin_config_get` for each installed plugin (keyed off the unfiltered installed list to avoid churn while typing in search) to decide whether the *Configure* button (Settings icon, next to the *More* menu) should render: a plugin without `[config]` exposes no button. New IPC commands `plugin_config_get(name) → PluginConfigView` and `plugin_config_update(name, key, value)`. i18n (en/fr): `plugins.action.configure`, `plugins.config.{title,description,loading,error,noFields,toast.{saveSuccess,validationFailed}}`. (task 15)
- History retention with automatic daily purge (PRD-v2 P0.14, task 14): new `history_retention_days` setting (default 30, presets 7 / 30 / 90 / 365 / `0 = unlimited`) exposed in the *General* Settings tab as a `Select` dropdown wired to `settings_update`. Backend ships a `Clock` domain port (`SystemClock` adapter under `adapters/driven/scheduler/`) and a `HistoryPurgeWorker` daemon spawned during Tauri setup that hard-deletes `history` rows where `completed_at < now - retention_days * 86_400`. The worker persists its last run as a Unix-epoch timestamp in a `.history_purge_state` sentinel file (filename constant `HISTORY_PURGE_STATE_FILE`).
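The worker's arithmetic can be sketched as two small helpers (hypothetical names `purge_cutoff` / `initial_delay` — the real worker reads time through the `Clock` port and stores the last run in its sentinel file):

```rust
// Sketch: purge cutoff and first-sleep computation for the history purge
// worker. Times are seconds since the Unix epoch, matching `completed_at`.
const SECS_PER_DAY: i64 = 86_400;

// `retention_days <= 0` means "unlimited": no cutoff, nothing is purged.
fn purge_cutoff(now_epoch_secs: i64, retention_days: i64) -> Option<i64> {
    if retention_days <= 0 {
        return None; // retention disabled: purge is a no-op
    }
    Some(now_epoch_secs - retention_days * SECS_PER_DAY)
}

// Seconds to wait before the first post-launch purge. A missing or corrupt
// sentinel (`None`) is treated as "never ran", so the purge fires at once;
// otherwise the next run stays anchored to the previous successful run.
fn initial_delay(last_run: Option<i64>, now: i64) -> i64 {
    match last_run {
        None => 0,
        Some(prev) => {
            let elapsed = now - prev;
            if elapsed >= SECS_PER_DAY { 0 } else { SECS_PER_DAY - elapsed }
        }
    }
}
```

Rows with `completed_at` below the returned cutoff are hard-deleted; a `None` cutoff skips the delete entirely.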
On startup, the daemon reads the sentinel and either runs immediately (missing/stale) or sleeps for `SECS_PER_DAY - elapsed` so the first post-launch purge stays anchored to the previous successful run instead of drifting up to ~47h after a restart; the recurring loop then ticks every 24h via `tokio::time::interval` with `MissedTickBehavior::Skip`. `retention_days <= 0` is a no-op that does not write the sentinel, so the next run re-fires the moment the user re-enables retention; corrupt sentinels are treated as "never ran" so a stuck file never blocks the scheduler. The worker shares the same `Arc`-shared state the IPC layer already mutates, so a settings change is observed without restart. Domain helper `normalize_history_retention_days` clamps negatives back to `0` and is now applied at every write boundary — `apply_patch` (so a crafted `settings_update` payload cannot persist a negative) and the `From` conversion into `AppConfig` (so a hand-edited `config.toml` is normalized at load) — plus the worker itself for defense-in-depth. (task 14)
- Change-directory action that moves a download's on-disk file (and its `.vortex-meta` sidecar when present) into a new destination folder (PRD-v2 P0.13, task 13). New Tauri IPC commands `download_change_directory(id, newDestinationDir)` and `download_change_directory_bulk(ids, newDestinationDir)` are backed by `ChangeDirectoryCommand` / `ChangeDirectoryBulkCommand` in the application layer; the bulk variant returns a structured `{ moved: number[], failed: { id, message }[] }` outcome so the UI can keep failed rows selected for retry instead of swallowing partial errors. The handler pauses the download engine for `Downloading` items, relocates the body and the `.vortex-meta` sidecar, persists the new path, then resumes — segments survive the move so the engine picks up exactly where it left off. `Extracting` and `Checking` downloads are rejected because another worker is actively reading the file.
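The partial-failure bookkeeping for the bulk variant could look roughly like this (illustrative types and names — the real handler also pauses/resumes the engine and moves the `.vortex-meta` sidecar per item):

```rust
// Sketch: aggregate per-item results into the { moved, failed } outcome
// the UI uses to keep failed rows selected for retry.
struct BulkOutcome {
    moved: Vec<i64>,
    failed: Vec<(i64, String)>, // (id, error message)
}

// Try every id; a failure records the error and moves on instead of
// aborting the whole batch, so partial success is reported faithfully.
fn run_bulk(ids: &[i64], mut move_one: impl FnMut(i64) -> Result<(), String>) -> BulkOutcome {
    let mut outcome = BulkOutcome { moved: Vec::new(), failed: Vec::new() };
    for &id in ids {
        match move_one(id) {
            Ok(()) => outcome.moved.push(id),
            Err(msg) => outcome.failed.push((id, msg)),
        }
    }
    outcome
}
```

The frontend can then re-narrow the selection to `failed` ids and surface the partial-failure toast.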
The `FileStorage` port grows `move_file`, `move_meta` and `file_exists`; the production `FsFileStorage` adapter prefers `fs::rename` for same-filesystem moves and falls back to copy + size-verify + delete-source for cross-device cases (EXDEV / `ErrorKind::CrossesDevices`), with rollback on any partial failure so the source file always stays intact. New `DomainEvent::DownloadDirectoryChanged { id, newDestinationPath }` is forwarded to the frontend as the `download-directory-changed` event. Frontend ships a reusable move-destination dialog component (folder picker via `useBrowseFolder`, current path + selected path preview, confirm disabled until a folder is picked) and a `Move to...` action in the downloads `ActionsBar` selection toolbar that wires the bulk IPC, surfaces success / partial-failure / error toasts and clears or re-narrows the selection accordingly. New i18n keys `downloads.actions.moveSelected`, `downloads.moveDialog.*` and `downloads.toast.{moveSucceeded,movePartial,moveError}` (en/fr). (task 13)
- Queue reordering via drag & drop and Move-to-Top / Move-to-Bottom (PRD-v2 P0.12, task 12): new Tauri IPC commands `download_move_to_top(id)`, `download_move_to_bottom(id)`, `download_reorder_queue(orderedIds)` backed by `MoveToTopCommand` / `MoveToBottomCommand` / `ReorderQueueCommand` in the application layer. A new `queue_position` column (migration `m20260425_000004_add_queue_position`, `BIGINT NOT NULL DEFAULT 0`, index `idx_downloads_queue_position`) persists the manual ordering so drag-reorders survive restart. `QueueManager` now sorts candidates by priority desc → `queue_position` asc → `created_at` asc, and also subscribes to two new domain events (`DownloadPrioritySet`, `QueueReordered`) so changing priority triggers immediate rescheduling — a high-priority item starts as soon as a slot is free. The default `download_list` sort uses `queue_position` ASC → `created_at` DESC so fresh downloads (position 0) still appear newest-first while manually-moved rows stick.
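The scheduling order described above (priority desc → `queue_position` asc → `created_at` asc) can be sketched as a plain sort key (hypothetical row struct — the real `QueueManager` sorts its own candidate type):

```rust
// Sketch: candidate ordering used when picking the next queued download.
// Higher priority first; within a priority, lower queue_position first;
// remaining ties broken by oldest created_at.
#[derive(Debug, PartialEq)]
struct Row { priority: i32, queue_position: i64, created_at: i64 }

fn sort_candidates(rows: &mut Vec<Row>) {
    rows.sort_by(|a, b| {
        b.priority
            .cmp(&a.priority) // descending priority
            .then(a.queue_position.cmp(&b.queue_position)) // ascending position
            .then(a.created_at.cmp(&b.created_at)) // ascending age
    });
}
```

Chaining `Ordering::then` keeps each tiebreaker inert unless the previous key compared equal.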
Frontend integration in `DownloadsTable` adds `@dnd-kit/core` + `@dnd-kit/sortable` with a drag handle column (enabled only for Queued/Retry/Waiting rows), a `SortableContext` around the virtualized rows, and a `computeReorderedIds` helper that filters non-reorderable IDs from the new order before invoking `download_reorder_queue`. Row dropdown menu gets Move to top / Move to bottom items for reorderable rows. New i18n keys `downloads.table.actions.moveToTop` / `moveToBottom` (en/fr). `DownloadView` / `DownloadViewDto` now expose `priority` + `queuePosition`. (task 12) diff --git a/src-tauri/src/adapters/driven/plugin/capabilities.rs b/src-tauri/src/adapters/driven/plugin/capabilities.rs index d6a265a..7823c08 100644 --- a/src-tauri/src/adapters/driven/plugin/capabilities.rs +++ b/src-tauri/src/adapters/driven/plugin/capabilities.rs @@ -106,6 +106,24 @@ pub fn build_host_functions( // Ensure per-plugin config/state maps exist before any function runs. let plugin_configs = shared.plugin_configs.entry(name.clone()).or_default(); + // Persisted overrides may have been replayed before the plugin loaded. + // Drop entries that no longer pass the current schema (e.g. a manifest + // update narrowed an enum, tightened a regex, or removed the key) so + // the WASM plugin never observes a stale schema-invalid value via + // `get_config`. 
+ let schema = manifest.config_schema(); + plugin_configs.retain(|key, value| { + if schema.validate(key, value).is_ok() { + true + } else { + tracing::warn!( + plugin = %name, + key = %key, + "dropping persisted plugin config value that no longer matches schema" + ); + false + } + }); for (key, value) in manifest.config_defaults() { plugin_configs .entry(key.clone()) diff --git a/src-tauri/src/adapters/driven/plugin/extism_loader.rs b/src-tauri/src/adapters/driven/plugin/extism_loader.rs index 5d897d0..97a5f3b 100644 --- a/src-tauri/src/adapters/driven/plugin/extism_loader.rs +++ b/src-tauri/src/adapters/driven/plugin/extism_loader.rs @@ -252,6 +252,19 @@ impl PluginLoader for ExtismPluginLoader { self.registry.set_enabled(name, enabled) } + fn get_manifest(&self, name: &str) -> Result<Option<PluginManifest>, DomainError> { + Ok(self.registry.manifest(name)) + } + + fn set_runtime_config(&self, name: &str, key: &str, value: &str) -> Result<(), DomainError> { + self.shared_resources + .plugin_configs() + .entry(name.to_string()) + .or_default() + .insert(key.to_string(), value.to_string()); + Ok(()) + } + fn extract_links(&self, url: &str) -> Result { self.call_url_plugin_function(url, "extract_links") } diff --git a/src-tauri/src/adapters/driven/plugin/manifest.rs b/src-tauri/src/adapters/driven/plugin/manifest.rs index 6739d61..ab2cc1e 100644 --- a/src-tauri/src/adapters/driven/plugin/manifest.rs +++ b/src-tauri/src/adapters/driven/plugin/manifest.rs @@ -6,7 +6,10 @@ use std::path::{Path, PathBuf}; use serde::Deserialize; use crate::domain::error::DomainError; -use crate::domain::model::plugin::{PluginCategory, PluginInfo, PluginManifest}; +use crate::domain::model::plugin::{ + ConfigField, ConfigFieldType, PluginCategory, PluginConfigSchema, PluginInfo, PluginManifest, + regex_syntax_error, unsupported_regex_feature, +}; #[derive(Deserialize)] struct RawManifest { @@ -36,7 +39,14 @@ struct RawCapabilities { #[derive(Deserialize)] struct RawConfigEntry { + #[serde(rename = "type")] + 
field_type: Option<String>, default: Option<toml::Value>, + description: Option<String>, + options: Option<Vec<toml::Value>>, + min: Option<f64>, + max: Option<f64>, + regex: Option<String>, } /// Parse a plugin directory containing `plugin.toml` and a `.wasm` file. @@ -83,10 +93,12 @@ pub fn parse_manifest(dir: &Path) -> Result<(PluginManifest, PathBuf), DomainErr .map(build_capabilities) .unwrap_or_default(); let config_defaults = build_config_defaults(&raw.config)?; + let config_schema = build_config_schema(&raw.config)?; let mut manifest = PluginManifest::new(info) .with_capabilities(caps) - .with_config_defaults(config_defaults); + .with_config_defaults(config_defaults) + .with_config_schema(config_schema); if let Some(v) = raw.plugin.min_vortex_version { manifest = manifest.with_min_version(v); } @@ -140,6 +152,61 @@ fn build_config_defaults( Ok(defaults) } +fn build_config_schema( + raw_config: &HashMap<String, RawConfigEntry>, +) -> Result<PluginConfigSchema, DomainError> { + let mut schema = PluginConfigSchema::new(); + for (key, entry) in raw_config { + let field_type = match entry.field_type.as_deref() { + Some(t) => t + .parse::<ConfigFieldType>() + .map_err(|e| DomainError::PluginError(format!("config field '{key}': {e}")))?, + None => ConfigFieldType::String, + }; + + let mut field = ConfigField::new(field_type); + if let Some(default) = &entry.default { + field = field.with_default(encode_config_default(default)?); + } + if let Some(desc) = &entry.description { + field = field.with_description(desc.clone()); + } + if let Some(options) = &entry.options { + let opts = options + .iter() + .map(encode_config_default) + .collect::<Result<Vec<_>, _>>()?; + field = field.with_options(opts); + } + if let Some(min) = entry.min { + field = field.with_min(min); + } + if let Some(max) = entry.max { + field = field.with_max(max); + } + if let Some(regex) = &entry.regex { + if let Some(err) = regex_syntax_error(regex) { + return Err(DomainError::PluginError(format!( + "config field '{key}' regex '{regex}' is malformed: {err}" + ))); + } + if let Some(bad) = unsupported_regex_feature(regex) { + return 
Err(DomainError::PluginError(format!( + "config field '{key}' regex '{regex}' uses unsupported feature '{bad}' (alternation, groups and counted quantifiers are not implemented)" + ))); + } + field = field.with_regex(regex.clone()); + } + if let Some(default) = field.default_value() { + field.validate(default).map_err(|e| { + DomainError::PluginError(format!("config field '{key}' has invalid default: {e}")) + })?; + } + schema.insert(key.clone(), field); + } + Ok(schema) +} + fn encode_config_default(value: &toml::Value) -> Result<String, DomainError> { match value { toml::Value::String(s) => Ok(s.clone()), @@ -448,4 +515,210 @@ description = "Dir name mismatch" assert!(result.is_err()); assert!(matches!(result.unwrap_err(), DomainError::PluginError(_))); } + + #[test] + fn test_parse_manifest_extracts_full_config_schema() { + use crate::domain::model::plugin::ConfigFieldType; + + let tmp = TempDir::new().unwrap(); + let plugin_dir = tmp.path().join("with-schema"); + std::fs::create_dir_all(&plugin_dir).unwrap(); + write_plugin_toml( + &plugin_dir, + r#" +[plugin] +name = "with-schema" +version = "1.0.0" +category = "crawler" +author = "Alice" +description = "Schema fields" + +[config] +default_quality = { type = "enum", default = "1080p", options = ["360p", "720p", "1080p"], description = "Preferred resolution" } +extract_audio_only = { type = "boolean", default = false } +max_retries = { type = "integer", default = 3, min = 0, max = 10 } +"#, + ); + write_dummy_wasm(&plugin_dir, "with-schema.wasm"); + + let (manifest, _) = parse_manifest(&plugin_dir).unwrap(); + let schema = manifest.config_schema(); + assert_eq!(schema.len(), 3); + + let q = schema.get("default_quality").unwrap(); + assert_eq!(q.field_type(), ConfigFieldType::Enum); + assert_eq!(q.default_value(), Some("1080p")); + assert_eq!(q.options(), &["360p", "720p", "1080p"]); + assert_eq!(q.description(), Some("Preferred resolution")); + + let a = schema.get("extract_audio_only").unwrap(); + assert_eq!(a.field_type(), 
ConfigFieldType::Boolean); + assert_eq!(a.default_value(), Some("false")); + + let r = schema.get("max_retries").unwrap(); + assert_eq!(r.field_type(), ConfigFieldType::Integer); + assert_eq!(r.min(), Some(0.0)); + assert_eq!(r.max(), Some(10.0)); + } + + #[test] + fn test_parse_manifest_missing_type_defaults_to_string() { + use crate::domain::model::plugin::ConfigFieldType; + + let tmp = TempDir::new().unwrap(); + let plugin_dir = tmp.path().join("loose-config"); + std::fs::create_dir_all(&plugin_dir).unwrap(); + write_plugin_toml( + &plugin_dir, + r#" +[plugin] +name = "loose-config" +version = "1.0.0" +category = "crawler" +author = "Alice" +description = "Loose schema" + +[config] +api_token = { default = "" } +"#, + ); + write_dummy_wasm(&plugin_dir, "loose-config.wasm"); + + let (manifest, _) = parse_manifest(&plugin_dir).unwrap(); + let f = manifest.config_schema().get("api_token").unwrap(); + assert_eq!(f.field_type(), ConfigFieldType::String); + } + + #[test] + fn test_parse_manifest_unknown_type_returns_err() { + let tmp = TempDir::new().unwrap(); + let plugin_dir = tmp.path().join("bad-type"); + std::fs::create_dir_all(&plugin_dir).unwrap(); + write_plugin_toml( + &plugin_dir, + r#" +[plugin] +name = "bad-type" +version = "1.0.0" +category = "crawler" +author = "Alice" +description = "Bad type" + +[config] +foo = { type = "spaceship" } +"#, + ); + write_dummy_wasm(&plugin_dir, "bad-type.wasm"); + + let result = parse_manifest(&plugin_dir); + assert!(result.is_err()); + let err = result.unwrap_err().to_string(); + assert!(err.contains("spaceship"), "got: {err}"); + } + + #[test] + fn test_parse_manifest_no_config_yields_empty_schema() { + let tmp = TempDir::new().unwrap(); + let plugin_dir = tmp.path().join("no-config"); + std::fs::create_dir_all(&plugin_dir).unwrap(); + write_plugin_toml( + &plugin_dir, + r#" +[plugin] +name = "no-config" +version = "1.0.0" +category = "utility" +author = "Charlie" +description = "No config block" +"#, + ); + 
write_dummy_wasm(&plugin_dir, "no-config.wasm"); + + let (manifest, _) = parse_manifest(&plugin_dir).unwrap(); + assert!(manifest.config_schema().is_empty()); + } + + #[test] + fn test_parse_manifest_rejects_malformed_regex() { + let tmp = TempDir::new().unwrap(); + let plugin_dir = tmp.path().join("malformed-regex"); + std::fs::create_dir_all(&plugin_dir).unwrap(); + write_plugin_toml( + &plugin_dir, + r#" +[plugin] +name = "malformed-regex" +version = "1.0.0" +category = "utility" +author = "Alice" +description = "Malformed regex" + +[config] +mode = { type = "string", regex = "[abc" } +"#, + ); + write_dummy_wasm(&plugin_dir, "malformed-regex.wasm"); + + let result = parse_manifest(&plugin_dir); + let err = result.unwrap_err().to_string(); + assert!( + err.contains("malformed"), + "expected malformed-pattern error, got: {err}" + ); + } + + #[test] + fn test_parse_manifest_rejects_unsupported_regex_feature() { + let tmp = TempDir::new().unwrap(); + let plugin_dir = tmp.path().join("bad-regex"); + std::fs::create_dir_all(&plugin_dir).unwrap(); + write_plugin_toml( + &plugin_dir, + r#" +[plugin] +name = "bad-regex" +version = "1.0.0" +category = "utility" +author = "Alice" +description = "Bad regex" + +[config] +mode = { type = "string", regex = "^(foo|bar)$" } +"#, + ); + write_dummy_wasm(&plugin_dir, "bad-regex.wasm"); + + let result = parse_manifest(&plugin_dir); + let err = result.unwrap_err().to_string(); + assert!( + err.contains("unsupported feature"), + "expected unsupported-feature error, got: {err}" + ); + } + + #[test] + fn test_parse_manifest_extracts_regex_constraint() { + let tmp = TempDir::new().unwrap(); + let plugin_dir = tmp.path().join("regexed"); + std::fs::create_dir_all(&plugin_dir).unwrap(); + write_plugin_toml( + &plugin_dir, + r#" +[plugin] +name = "regexed" +version = "1.0.0" +category = "utility" +author = "Alice" +description = "Regex" + +[config] +api_key = { type = "string", regex = "^[a-z0-9]+$" } +"#, + ); + 
write_dummy_wasm(&plugin_dir, "regexed.wasm"); + + let (manifest, _) = parse_manifest(&plugin_dir).unwrap(); + let f = manifest.config_schema().get("api_key").unwrap(); + assert_eq!(f.regex(), Some("^[a-z0-9]+$")); + } } diff --git a/src-tauri/src/adapters/driven/plugin/registry.rs b/src-tauri/src/adapters/driven/plugin/registry.rs index d649d27..318622f 100644 --- a/src-tauri/src/adapters/driven/plugin/registry.rs +++ b/src-tauri/src/adapters/driven/plugin/registry.rs @@ -51,6 +51,13 @@ impl PluginRegistry { self.plugins.contains_key(name) } + /// Clone the full manifest of a loaded plugin so callers (like the + /// configuration query handler) can inspect its `[config]` schema + /// without holding a registry reference. + pub fn manifest(&self, name: &str) -> Option<PluginManifest> { + self.plugins.get(name).map(|entry| entry.manifest.clone()) + } + /// Returns info for all plugins (enabled and disabled). pub fn list_info(&self) -> Vec<PluginInfo> { self.plugins diff --git a/src-tauri/src/adapters/driven/sqlite/entities/mod.rs b/src-tauri/src/adapters/driven/sqlite/entities/mod.rs index 3cf5cba..f25d1e2 100644 --- a/src-tauri/src/adapters/driven/sqlite/entities/mod.rs +++ b/src-tauri/src/adapters/driven/sqlite/entities/mod.rs @@ -1,3 +1,4 @@ pub mod download; pub mod download_segment; pub mod history; +pub mod plugin_config; diff --git a/src-tauri/src/adapters/driven/sqlite/entities/plugin_config.rs b/src-tauri/src/adapters/driven/sqlite/entities/plugin_config.rs new file mode 100644 index 0000000..bb19f21 --- /dev/null +++ b/src-tauri/src/adapters/driven/sqlite/entities/plugin_config.rs @@ -0,0 +1,16 @@ +use sea_orm::entity::prelude::*; + +#[derive(Clone, Debug, PartialEq, DeriveEntityModel)] +#[sea_orm(table_name = "plugin_configs")] +pub struct Model { + #[sea_orm(primary_key, auto_increment = false)] + pub plugin_name: String, + #[sea_orm(primary_key, auto_increment = false)] + pub key: String, + pub value: String, +} + +#[derive(Copy, Clone, Debug, EnumIter, DeriveRelation)] +pub enum 
Relation {} + +impl ActiveModelBehavior for ActiveModel {} diff --git a/src-tauri/src/adapters/driven/sqlite/migrations/m20260425_000005_create_plugin_configs.rs b/src-tauri/src/adapters/driven/sqlite/migrations/m20260425_000005_create_plugin_configs.rs new file mode 100644 index 0000000..3b2cdc9 --- /dev/null +++ b/src-tauri/src/adapters/driven/sqlite/migrations/m20260425_000005_create_plugin_configs.rs @@ -0,0 +1,48 @@ +use sea_orm_migration::prelude::*; + +#[derive(DeriveMigrationName)] +pub struct Migration; + +#[async_trait::async_trait] +impl MigrationTrait for Migration { + async fn up(&self, manager: &SchemaManager) -> Result<(), DbErr> { + manager + .create_table( + Table::create() + .table(PluginConfigs::Table) + .col( + ColumnDef::new(PluginConfigs::PluginName) + .string() + .not_null(), + ) + .col(ColumnDef::new(PluginConfigs::Key).string().not_null()) + .col(ColumnDef::new(PluginConfigs::Value).text().not_null()) + .primary_key( + Index::create() + .col(PluginConfigs::PluginName) + .col(PluginConfigs::Key), + ) + .to_owned(), + ) + .await + } + + async fn down(&self, manager: &SchemaManager) -> Result<(), DbErr> { + manager + .drop_table( + Table::drop() + .table(PluginConfigs::Table) + .if_exists() + .to_owned(), + ) + .await + } +} + +#[derive(DeriveIden)] +enum PluginConfigs { + Table, + PluginName, + Key, + Value, +} diff --git a/src-tauri/src/adapters/driven/sqlite/migrations/mod.rs b/src-tauri/src/adapters/driven/sqlite/migrations/mod.rs index 05fad73..4590293 100644 --- a/src-tauri/src/adapters/driven/sqlite/migrations/mod.rs +++ b/src-tauri/src/adapters/driven/sqlite/migrations/mod.rs @@ -4,6 +4,7 @@ mod m20260407_000001_create_tables; mod m20260415_000002_add_download_error_message; mod m20260424_000003_add_checksum_columns; mod m20260425_000004_add_queue_position; +mod m20260425_000005_create_plugin_configs; pub struct Migrator; @@ -15,6 +16,7 @@ impl MigratorTrait for Migrator { 
Box::new(m20260415_000002_add_download_error_message::Migration), Box::new(m20260424_000003_add_checksum_columns::Migration), Box::new(m20260425_000004_add_queue_position::Migration), + Box::new(m20260425_000005_create_plugin_configs::Migration), ] } } diff --git a/src-tauri/src/adapters/driven/sqlite/mod.rs b/src-tauri/src/adapters/driven/sqlite/mod.rs index c35bb9b..88cfa29 100644 --- a/src-tauri/src/adapters/driven/sqlite/mod.rs +++ b/src-tauri/src/adapters/driven/sqlite/mod.rs @@ -4,6 +4,7 @@ pub mod download_repo; pub mod entities; pub mod history_repo; pub mod migrations; +pub mod plugin_config_repo; pub mod progress_bridge; pub mod stats_repo; mod util; diff --git a/src-tauri/src/adapters/driven/sqlite/plugin_config_repo.rs b/src-tauri/src/adapters/driven/sqlite/plugin_config_repo.rs new file mode 100644 index 0000000..9313cb0 --- /dev/null +++ b/src-tauri/src/adapters/driven/sqlite/plugin_config_repo.rs @@ -0,0 +1,173 @@ +use std::collections::HashMap; + +use sea_orm::{ + ActiveValue::Set, ColumnTrait, DatabaseConnection, EntityTrait, QueryFilter, + sea_query::OnConflict, +}; + +use crate::domain::error::DomainError; +use crate::domain::ports::driven::plugin_config_store::PluginConfigStore; + +use super::entities::plugin_config; + +use super::util::{block_on, map_db_err}; + +pub struct SqlitePluginConfigRepo { + db: DatabaseConnection, +} + +impl SqlitePluginConfigRepo { + pub fn new(db: DatabaseConnection) -> Self { + Self { db } + } +} + +impl PluginConfigStore for SqlitePluginConfigRepo { + fn get_values(&self, plugin_name: &str) -> Result<HashMap<String, String>, DomainError> { + let db = self.db.clone(); + let plugin = plugin_name.to_string(); + block_on(async move { + let rows = plugin_config::Entity::find() + .filter(plugin_config::Column::PluginName.eq(plugin)) + .all(&db) + .await + .map_err(map_db_err)?; + Ok(rows.into_iter().map(|m| (m.key, m.value)).collect()) + }) + } + + fn set_value(&self, plugin_name: &str, key: &str, value: &str) -> Result<(), DomainError> { + let 
db = self.db.clone(); + let model = plugin_config::ActiveModel { + plugin_name: Set(plugin_name.to_string()), + key: Set(key.to_string()), + value: Set(value.to_string()), + }; + block_on(async move { + plugin_config::Entity::insert(model) + .on_conflict( + OnConflict::columns([ + plugin_config::Column::PluginName, + plugin_config::Column::Key, + ]) + .update_column(plugin_config::Column::Value) + .to_owned(), + ) + .exec(&db) + .await + .map_err(map_db_err)?; + Ok(()) + }) + } + + fn list_all(&self) -> Result<HashMap<String, HashMap<String, String>>, DomainError> { + let db = self.db.clone(); + block_on(async move { + let rows = plugin_config::Entity::find() + .all(&db) + .await + .map_err(map_db_err)?; + let mut out: HashMap<String, HashMap<String, String>> = HashMap::new(); + for row in rows { + out.entry(row.plugin_name) + .or_default() + .insert(row.key, row.value); + } + Ok(out) + }) + } + + fn delete_all(&self, plugin_name: &str) -> Result<(), DomainError> { + let db = self.db.clone(); + let plugin = plugin_name.to_string(); + block_on(async move { + plugin_config::Entity::delete_many() + .filter(plugin_config::Column::PluginName.eq(plugin)) + .exec(&db) + .await + .map_err(map_db_err)?; + Ok(()) + }) + } +} + +#[cfg(test)] +mod tests { + use super::*; + use crate::adapters::driven::sqlite::connection::setup_test_db; + + async fn make_repo() -> SqlitePluginConfigRepo { + let db = setup_test_db().await.unwrap(); + SqlitePluginConfigRepo::new(db) + } + + #[tokio::test(flavor = "multi_thread")] + async fn test_get_values_returns_empty_map_when_none_persisted() { + let repo = make_repo().await; + let values = repo.get_values("ghost").unwrap(); + assert!(values.is_empty()); + } + + #[tokio::test(flavor = "multi_thread")] + async fn test_set_then_get_round_trip() { + let repo = make_repo().await; + repo.set_value("youtube", "default_quality", "1080p") + .unwrap(); + let values = repo.get_values("youtube").unwrap(); + assert_eq!(values.get("default_quality"), Some(&"1080p".to_string())); + } + + #[tokio::test(flavor = "multi_thread")] 
+ async fn test_set_value_upserts_existing_key() { + let repo = make_repo().await; + repo.set_value("youtube", "default_quality", "720p") + .unwrap(); + repo.set_value("youtube", "default_quality", "1080p") + .unwrap(); + let values = repo.get_values("youtube").unwrap(); + assert_eq!(values.len(), 1); + assert_eq!(values.get("default_quality"), Some(&"1080p".to_string())); + } + + #[tokio::test(flavor = "multi_thread")] + async fn test_set_value_isolates_plugins() { + let repo = make_repo().await; + repo.set_value("youtube", "k", "yt").unwrap(); + repo.set_value("vimeo", "k", "vm").unwrap(); + assert_eq!( + repo.get_values("youtube").unwrap().get("k"), + Some(&"yt".to_string()) + ); + assert_eq!( + repo.get_values("vimeo").unwrap().get("k"), + Some(&"vm".to_string()) + ); + } + + #[tokio::test(flavor = "multi_thread")] + async fn test_list_all_groups_by_plugin() { + let repo = make_repo().await; + repo.set_value("youtube", "a", "1").unwrap(); + repo.set_value("youtube", "b", "2").unwrap(); + repo.set_value("vimeo", "x", "9").unwrap(); + + let all = repo.list_all().unwrap(); + assert_eq!(all.len(), 2); + let yt = all.get("youtube").unwrap(); + assert_eq!(yt.get("a"), Some(&"1".to_string())); + assert_eq!(yt.get("b"), Some(&"2".to_string())); + assert_eq!(all.get("vimeo").unwrap().get("x"), Some(&"9".to_string())); + } + + #[tokio::test(flavor = "multi_thread")] + async fn test_delete_all_removes_only_target_plugin() { + let repo = make_repo().await; + repo.set_value("youtube", "k", "v").unwrap(); + repo.set_value("vimeo", "k", "v").unwrap(); + repo.delete_all("youtube").unwrap(); + assert!(repo.get_values("youtube").unwrap().is_empty()); + assert_eq!( + repo.get_values("vimeo").unwrap().get("k"), + Some(&"v".to_string()) + ); + } +} diff --git a/src-tauri/src/adapters/driving/tauri_ipc.rs b/src-tauri/src/adapters/driving/tauri_ipc.rs index 611e9fe..89e0d3a 100644 --- a/src-tauri/src/adapters/driving/tauri_ipc.rs +++ b/src-tauri/src/adapters/driving/tauri_ipc.rs @@ 
-21,18 +21,20 @@ use crate::application::commands::{ PauseDownloadCommand, PurgeHistoryCommand, RedownloadCommand, RedownloadSource, RemoveDownloadCommand, ReorderQueueCommand, ResolveLinksCommand, ResolvedLinkDto, ResumeAllDownloadsCommand, ResumeDownloadCommand, RetryDownloadCommand, SetPriorityCommand, - StartDownloadCommand, UninstallPluginCommand, UpdateConfigCommand, VerifyChecksumCommand, - VerifyChecksumOutcome, + StartDownloadCommand, UninstallPluginCommand, UpdateConfigCommand, UpdatePluginConfigCommand, + VerifyChecksumCommand, VerifyChecksumOutcome, }; use crate::application::error::AppError; use crate::application::queries::{ CountDownloadsByStateQuery, GetDownloadDetailQuery, GetDownloadsQuery, GetHistoryEntryQuery, - GetStatsQuery, ListHistoryQuery, ListPluginsQuery, SearchHistoryQuery, TopModulesQuery, + GetPluginConfigQuery, GetStatsQuery, ListHistoryQuery, ListPluginsQuery, SearchHistoryQuery, + TopModulesQuery, }; use crate::application::query_bus::QueryBus; use crate::application::read_models::download_detail_view::DownloadDetailViewDto; use crate::application::read_models::download_view::DownloadViewDto; use crate::application::read_models::history_view::HistoryViewDto; +use crate::application::read_models::plugin_config_view::PluginConfigView; use crate::application::read_models::plugin_store_view::PluginStoreEntryDto; use crate::application::read_models::plugin_view::PluginViewDto; use crate::application::read_models::stats_view::{ModuleStatsDto, StatsViewDto}; @@ -577,6 +579,36 @@ pub async fn plugin_store_update(state: State<'_, AppState>, name: String) -> Re .map_err(|e| e.to_string()) } +#[tauri::command] +pub async fn plugin_config_get( + state: State<'_, AppState>, + name: String, +) -> Result<PluginConfigView, String> { + state + .query_bus + .handle_get_plugin_config(GetPluginConfigQuery { plugin_name: name }) + .await + .map_err(|e| e.to_string()) +} + +#[tauri::command] +pub async fn plugin_config_update( + state: State<'_, AppState>, + name: String, + key: 
String, + value: String, +) -> Result<(), String> { + state + .command_bus + .handle_update_plugin_config(UpdatePluginConfigCommand { + plugin_name: name, + key, + value, + }) + .await + .map_err(|e| e.to_string()) +} + fn store_cache_path() -> Result<PathBuf, String> { dirs::config_dir() .ok_or_else(|| "cannot determine config directory — store unavailable".to_string()) diff --git a/src-tauri/src/application/command_bus.rs b/src-tauri/src/application/command_bus.rs index e9fc8c2..9487fa8 100644 --- a/src-tauri/src/application/command_bus.rs +++ b/src-tauri/src/application/command_bus.rs @@ -8,7 +8,7 @@ use crate::domain::ports::driven::{ ArchiveExtractor, ChecksumComputer, ClipboardObserver, ConfigStore, CredentialStore, DownloadEngine, DownloadRepository, EventBus, FileOpener, FileStorage, HistoryRepository, - HttpClient, PluginLoader, PluginStoreClient, + HttpClient, PluginConfigStore, PluginLoader, PluginStoreClient, }; /// Central dispatcher for CQRS commands. @@ -30,6 +30,7 @@ pub struct CommandBus { plugin_store_client: Option<Arc<dyn PluginStoreClient>>, checksum_computer: Option<Arc<dyn ChecksumComputer>>, file_opener: Option<Arc<dyn FileOpener>>, + plugin_config_store: Option<Arc<dyn PluginConfigStore>>, /// Serializes queue-position allocation across handlers. Without this, /// two concurrent move-to-top/move-to-bottom/start-download calls can /// observe the same min/max and write colliding `queue_position` @@ -68,10 +69,23 @@ impl CommandBus { plugin_store_client, checksum_computer: None, file_opener: None, + plugin_config_store: None, queue_position_lock: tokio::sync::Mutex::new(()), } } + /// Builder-style setter for the plugin configuration persistence port. + /// Optional so existing test fixtures don't have to construct one when + /// they don't exercise the plugin-config commands. 
+    pub fn with_plugin_config_store(mut self, store: Arc<dyn PluginConfigStore>) -> Self {
+        self.plugin_config_store = Some(store);
+        self
+    }
+
+    pub fn plugin_config_store(&self) -> Option<&dyn PluginConfigStore> {
+        self.plugin_config_store.as_deref()
+    }
+
     /// Acquire the application-wide lock that serializes queue-position
     /// allocation. Held by handlers that read the current min/max and
     /// then persist a new `queue_position`, so the read+write is atomic
diff --git a/src-tauri/src/application/commands/mod.rs b/src-tauri/src/application/commands/mod.rs
index bef897b..ffc7d65 100644
--- a/src-tauri/src/application/commands/mod.rs
+++ b/src-tauri/src/application/commands/mod.rs
@@ -31,6 +31,7 @@ mod toggle_clipboard;
 mod toggle_plugin;
 mod uninstall_plugin;
 mod update_config;
+mod update_plugin_config;
 mod verify_checksum;
 
 use std::path::PathBuf;
@@ -174,6 +175,19 @@ pub struct UpdateConfigCommand {
 }
 impl Command for UpdateConfigCommand {}
 
+/// Update a single (key, value) pair on a plugin's persisted configuration.
+///
+/// The handler validates the value against the manifest schema before
+/// persisting, so the backend remains the source of truth even if a
+/// rogue caller bypasses the UI form.
+#[derive(Debug)]
+pub struct UpdatePluginConfigCommand {
+    pub plugin_name: String,
+    pub key: String,
+    pub value: String,
+}
+impl Command for UpdatePluginConfigCommand {}
+
 // Handler: task 26 (archive extraction)
 #[derive(Debug)]
 pub struct ExtractArchiveCommand {
diff --git a/src-tauri/src/application/commands/update_plugin_config.rs b/src-tauri/src/application/commands/update_plugin_config.rs
new file mode 100644
index 0000000..d0b2e93
--- /dev/null
+++ b/src-tauri/src/application/commands/update_plugin_config.rs
@@ -0,0 +1,471 @@
+//! Handler for `UpdatePluginConfigCommand`.
+//!
+//! Validates the new value against the plugin's declared schema, persists
+//! it via [`PluginConfigStore`], and updates the loader's in-memory map so
+//! subsequent `get_config` calls inside the WASM plugin see the new value
+//! without a reload.
+
+use crate::application::command_bus::CommandBus;
+use crate::application::error::AppError;
+use crate::domain::error::DomainError;
+
+impl CommandBus {
+    pub async fn handle_update_plugin_config(
+        &self,
+        cmd: super::UpdatePluginConfigCommand,
+    ) -> Result<(), AppError> {
+        let manifest = self
+            .plugin_loader()
+            .get_manifest(&cmd.plugin_name)?
+            .ok_or_else(|| {
+                AppError::Plugin(format!("plugin '{}' is not loaded", cmd.plugin_name))
+            })?;
+
+        if manifest.config_schema().is_empty() {
+            return Err(AppError::Domain(DomainError::ValidationError(format!(
+                "plugin '{}' has no configuration schema",
+                cmd.plugin_name
+            ))));
+        }
+
+        manifest
+            .config_schema()
+            .validate(&cmd.key, &cmd.value)
+            .map_err(AppError::Domain)?;
+
+        let store = self
+            .plugin_config_store()
+            .ok_or_else(|| AppError::Plugin("plugin config store not configured".into()))?;
+
+        // Capture the prior effective value so we can roll back the runtime
+        // if the persistence step fails. The runtime map is seeded with both
+        // manifest defaults and persisted overrides at plugin load time, so
+        // the rollback target is the persisted value when present, else the
+        // manifest default, else nothing (in which case we leave the new
+        // value in the runtime map — the field has no fallback to restore).
+        let manifest_default = manifest
+            .config_schema()
+            .get(&cmd.key)
+            .and_then(|f| f.default_value())
+            .map(String::from);
+        let previous = store
+            .get_values(&cmd.plugin_name)
+            .map_err(AppError::Domain)?
+            .get(&cmd.key)
+            .cloned()
+            .or(manifest_default);
+
+        // Apply the runtime change first so a runtime failure can short
+        // circuit before anything is persisted.
+        self.plugin_loader()
+            .set_runtime_config(&cmd.plugin_name, &cmd.key, &cmd.value)
+            .map_err(AppError::Domain)?;
+
+        if let Err(e) = store.set_value(&cmd.plugin_name, &cmd.key, &cmd.value) {
+            if let Some(prev) = previous {
+                let _ = self
+                    .plugin_loader()
+                    .set_runtime_config(&cmd.plugin_name, &cmd.key, &prev);
+            }
+            return Err(AppError::Domain(e));
+        }
+
+        Ok(())
+    }
+}
+
+#[cfg(test)]
+mod tests {
+    use std::collections::HashMap;
+    use std::path::Path;
+    use std::sync::{Arc, Mutex};
+
+    use crate::application::command_bus::CommandBus;
+    use crate::application::commands::UpdatePluginConfigCommand;
+    use crate::domain::error::DomainError;
+    use crate::domain::event::DomainEvent;
+    use crate::domain::model::config::{AppConfig, ConfigPatch};
+    use crate::domain::model::credential::Credential;
+    use crate::domain::model::download::{Download, DownloadId, DownloadState};
+    use crate::domain::model::http::HttpResponse;
+    use crate::domain::model::meta::DownloadMeta;
+    use crate::domain::model::plugin::{
+        ConfigField, ConfigFieldType, PluginCategory, PluginConfigSchema, PluginInfo,
+        PluginManifest,
+    };
+    use crate::domain::ports::driven::{
+        ClipboardObserver, ConfigStore, CredentialStore, DownloadEngine, DownloadRepository,
+        EventBus, FileStorage, HttpClient, PluginConfigStore, PluginLoader,
+    };
+
+    struct MockDownloadRepo;
+    impl DownloadRepository for MockDownloadRepo {
+        fn find_by_id(&self, _: DownloadId) -> Result<Option<Download>, DomainError> {
+            Ok(None)
+        }
+        fn save(&self, _: &Download) -> Result<(), DomainError> {
+            Ok(())
+        }
+        fn delete(&self, _: DownloadId) -> Result<(), DomainError> {
+            Ok(())
+        }
+        fn find_by_state(&self, _: DownloadState) -> Result<Vec<Download>, DomainError> {
+            Ok(vec![])
+        }
+    }
+
+    struct MockDownloadEngine;
+    impl DownloadEngine for MockDownloadEngine {
+        fn start(&self, _: &Download) -> Result<(), DomainError> {
+            Ok(())
+        }
+        fn pause(&self, _: DownloadId) -> Result<(), DomainError> {
+            Ok(())
+        }
+        fn resume(&self, _: DownloadId) -> Result<(), DomainError> {
+            Ok(())
+        }
+        fn cancel(&self, _: DownloadId) -> Result<(), DomainError> {
+            Ok(())
+        }
+    }
+
+    struct MockEventBus;
+    impl EventBus for MockEventBus {
+        fn publish(&self, _: DomainEvent) {}
+        fn subscribe(&self, _: Box<dyn Fn(DomainEvent) + Send + Sync>) {}
+    }
+
+    struct MockFileStorage;
+    impl FileStorage for MockFileStorage {
+        fn create_file(&self, _: &Path, _: u64) -> Result<(), DomainError> {
+            Ok(())
+        }
+        fn write_segment(&self, _: &Path, _: u64, _: &[u8]) -> Result<(), DomainError> {
+            Ok(())
+        }
+        fn read_meta(&self, _: &Path) -> Result<Option<DownloadMeta>, DomainError> {
+            Ok(None)
+        }
+        fn write_meta(&self, _: &Path, _: &DownloadMeta) -> Result<(), DomainError> {
+            Ok(())
+        }
+        fn delete_meta(&self, _: &Path) -> Result<(), DomainError> {
+            Ok(())
+        }
+    }
+
+    struct MockHttpClient;
+    impl HttpClient for MockHttpClient {
+        fn head(&self, _: &str) -> Result<HttpResponse, DomainError> {
+            Ok(HttpResponse {
+                status_code: 200,
+                headers: HashMap::new(),
+                body: vec![],
+            })
+        }
+        fn get_range(&self, _: &str, _: u64, _: u64) -> Result<Vec<u8>, DomainError> {
+            Ok(vec![])
+        }
+        fn supports_range(&self, _: &str) -> Result<bool, DomainError> {
+            Ok(true)
+        }
+    }
+
+    struct MockConfigStore;
+    impl ConfigStore for MockConfigStore {
+        fn get_config(&self) -> Result<AppConfig, DomainError> {
+            Ok(AppConfig::default())
+        }
+        fn update_config(&self, _: ConfigPatch) -> Result<AppConfig, DomainError> {
+            Ok(AppConfig::default())
+        }
+    }
+
+    struct MockCredentialStore;
+    impl CredentialStore for MockCredentialStore {
+        fn get(&self, _: &str) -> Result<Option<Credential>, DomainError> {
+            Ok(None)
+        }
+        fn store(&self, _: &str, _: &Credential) -> Result<(), DomainError> {
+            Ok(())
+        }
+        fn delete(&self, _: &str) -> Result<(), DomainError> {
+            Ok(())
+        }
+    }
+
+    struct MockClipboardObserver;
+    impl ClipboardObserver for MockClipboardObserver {
+        fn start(&self) -> Result<(), DomainError> {
+            Ok(())
+        }
+        fn stop(&self) -> Result<(), DomainError> {
+            Ok(())
+        }
+        fn get_urls(&self) -> Result<Vec<String>, DomainError> {
+            Ok(vec![])
+        }
+    }
+
+    struct StubPluginLoader {
+        manifest: Option<PluginManifest>,
+        runtime_writes: Mutex<Vec<(String, String, String)>>,
+    }
+
+    impl StubPluginLoader {
+        fn new(manifest: Option<PluginManifest>) -> Self {
+            Self {
+                manifest,
+                runtime_writes: Mutex::new(Vec::new()),
+            }
+        }
+    }
+
+    impl PluginLoader for StubPluginLoader {
+        fn load(&self, _: &PluginManifest) -> Result<(), DomainError> {
+            Ok(())
+        }
+        fn unload(&self, _: &str) -> Result<(), DomainError> {
+            Ok(())
+        }
+        fn resolve_url(&self, _: &str) -> Result<Option<String>, DomainError> {
+            Ok(None)
+        }
+        fn list_loaded(&self) -> Result<Vec<PluginInfo>, DomainError> {
+            Ok(vec![])
+        }
+        fn set_enabled(&self, _: &str, _: bool) -> Result<(), DomainError> {
+            Ok(())
+        }
+        fn get_manifest(&self, _: &str) -> Result<Option<PluginManifest>, DomainError> {
+            Ok(self.manifest.clone())
+        }
+        fn set_runtime_config(
+            &self,
+            name: &str,
+            key: &str,
+            value: &str,
+        ) -> Result<(), DomainError> {
+            self.runtime_writes.lock().unwrap().push((
+                name.to_string(),
+                key.to_string(),
+                value.to_string(),
+            ));
+            Ok(())
+        }
+    }
+
+    struct InMemoryPluginConfigStore {
+        values: Mutex<HashMap<(String, String), String>>,
+    }
+
+    impl InMemoryPluginConfigStore {
+        fn new() -> Self {
+            Self {
+                values: Mutex::new(HashMap::new()),
+            }
+        }
+    }
+
+    impl PluginConfigStore for InMemoryPluginConfigStore {
+        fn get_values(&self, plugin_name: &str) -> Result<HashMap<String, String>, DomainError> {
+            Ok(self
+                .values
+                .lock()
+                .unwrap()
+                .iter()
+                .filter(|((p, _), _)| p == plugin_name)
+                .map(|((_, k), v)| (k.clone(), v.clone()))
+                .collect())
+        }
+        fn set_value(&self, plugin_name: &str, key: &str, value: &str) -> Result<(), DomainError> {
+            self.values.lock().unwrap().insert(
+                (plugin_name.to_string(), key.to_string()),
+                value.to_string(),
+            );
+            Ok(())
+        }
+        fn list_all(&self) -> Result<HashMap<String, HashMap<String, String>>, DomainError> {
+            let mut out: HashMap<String, HashMap<String, String>> = HashMap::new();
+            for ((p, k), v) in self.values.lock().unwrap().iter() {
+                out.entry(p.clone())
+                    .or_default()
+                    .insert(k.clone(), v.clone());
+            }
+            Ok(out)
+        }
+        fn delete_all(&self, plugin_name: &str) -> Result<(), DomainError> {
+            self.values
+                .lock()
+                .unwrap()
+                .retain(|(p, _), _| p != plugin_name);
+            Ok(())
+        }
+    }
+
+    struct FakeArchiveExtractor;
+    impl crate::domain::ports::driven::ArchiveExtractor for FakeArchiveExtractor {
+        fn detect_format(
+            &self,
+            _: &Path,
+        ) -> Result<Option<crate::domain::model::archive::ArchiveFormat>, DomainError> {
+            Ok(None)
+        }
+        fn can_extract(&self, _: &Path) -> Result<bool, DomainError> {
+            Ok(false)
+        }
+        fn extract(
+            &self,
+            _: &Path,
+            _: &Path,
+            _: Option<&str>,
+        ) -> Result<crate::domain::model::archive::ExtractSummary, DomainError> {
+            Ok(crate::domain::model::archive::ExtractSummary {
+                extracted_files: 0,
+                extracted_bytes: 0,
+                duration_ms: 0,
+                warnings: vec![],
+            })
+        }
+        fn list_contents(
+            &self,
+            _: &Path,
+            _: Option<&str>,
+        ) -> Result<Vec<crate::domain::model::archive::ArchiveEntry>, DomainError> {
+            Ok(vec![])
+        }
+        fn detect_segments(
+            &self,
+            _: &Path,
+        ) -> Result<Option<Vec<std::path::PathBuf>>, DomainError> {
+            Ok(None)
+        }
+    }
+
+    fn make_manifest_with_schema() -> PluginManifest {
+        let info = PluginInfo::new(
+            "yt".to_string(),
+            "1.0.0".to_string(),
+            "yt".to_string(),
+            "x".to_string(),
+            PluginCategory::Crawler,
+        );
+        let mut schema = PluginConfigSchema::new();
+        schema.insert(
+            "default_quality",
+            ConfigField::new(ConfigFieldType::Enum)
+                .with_options(vec!["360p".into(), "720p".into(), "1080p".into()])
+                .with_default("720p"),
+        );
+        PluginManifest::new(info).with_config_schema(schema)
+    }
+
+    fn make_bus(
+        loader: Arc<StubPluginLoader>,
+        store: Arc<InMemoryPluginConfigStore>,
+    ) -> CommandBus {
+        CommandBus::new(
+            Arc::new(MockDownloadRepo),
+            Arc::new(MockDownloadEngine),
+            Arc::new(MockEventBus),
+            Arc::new(MockFileStorage),
+            Arc::new(MockHttpClient),
+            loader,
+            Arc::new(MockConfigStore),
+            Arc::new(MockCredentialStore),
+            Arc::new(MockClipboardObserver),
+            Arc::new(FakeArchiveExtractor),
+            Arc::new(crate::application::test_support::NoopHistoryRepo),
+            None,
+        )
+        .with_plugin_config_store(store)
+    }
+
+    #[tokio::test]
+    async fn test_update_plugin_config_persists_and_propagates() {
+        let loader = Arc::new(StubPluginLoader::new(Some(make_manifest_with_schema())));
+        let store = Arc::new(InMemoryPluginConfigStore::new());
+        let bus = make_bus(loader.clone(), store.clone());
+
+        bus.handle_update_plugin_config(UpdatePluginConfigCommand {
+            plugin_name: "yt".into(),
+            key: "default_quality".into(),
+            value: "1080p".into(),
+        })
+        .await
+        .unwrap();
+
+        assert_eq!(
+            store.get_values("yt").unwrap().get("default_quality"),
+            Some(&"1080p".to_string())
+        );
+        let writes = loader.runtime_writes.lock().unwrap().clone();
+        assert_eq!(writes.len(), 1);
+        assert_eq!(writes[0].0, "yt");
+        assert_eq!(writes[0].1, "default_quality");
+        assert_eq!(writes[0].2, "1080p");
+    }
+
+    #[tokio::test]
+    async fn test_update_plugin_config_rejects_invalid_value() {
+        let loader = Arc::new(StubPluginLoader::new(Some(make_manifest_with_schema())));
+        let store = Arc::new(InMemoryPluginConfigStore::new());
+        let bus = make_bus(loader, store.clone());
+
+        let err = bus
+            .handle_update_plugin_config(UpdatePluginConfigCommand {
+                plugin_name: "yt".into(),
+                key: "default_quality".into(),
+                value: "8K".into(),
+            })
+            .await
+            .unwrap_err();
+
+        assert!(matches!(
+            err,
+            crate::application::error::AppError::Domain(DomainError::ValidationError(_))
+        ));
+        assert!(store.get_values("yt").unwrap().is_empty());
+    }
+
+    #[tokio::test]
+    async fn test_update_plugin_config_unknown_plugin_returns_err() {
+        let loader = Arc::new(StubPluginLoader::new(None));
+        let store = Arc::new(InMemoryPluginConfigStore::new());
+        let bus = make_bus(loader, store);
+
+        let err = bus
+            .handle_update_plugin_config(UpdatePluginConfigCommand {
+                plugin_name: "ghost".into(),
+                key: "k".into(),
+                value: "v".into(),
+            })
+            .await
+            .unwrap_err();
+        assert!(matches!(
+            err,
+            crate::application::error::AppError::Plugin(_)
+        ));
+    }
+
+    #[tokio::test]
+    async fn test_update_plugin_config_unknown_key_returns_not_found() {
+        let loader = Arc::new(StubPluginLoader::new(Some(make_manifest_with_schema())));
+        let store = Arc::new(InMemoryPluginConfigStore::new());
+        let bus = make_bus(loader, store);
+
+        let err = bus
+            .handle_update_plugin_config(UpdatePluginConfigCommand {
+                plugin_name: "yt".into(),
+                key: "ghost".into(),
+                value: "x".into(),
+            })
+            .await
+            .unwrap_err();
+        assert!(matches!(
+            err,
+            crate::application::error::AppError::Domain(DomainError::NotFound(_))
+        ));
+    }
+}
diff --git a/src-tauri/src/application/queries/get_plugin_config.rs b/src-tauri/src/application/queries/get_plugin_config.rs
new file mode 100644
index 0000000..bff9da7
--- /dev/null
+++ b/src-tauri/src/application/queries/get_plugin_config.rs
@@ -0,0 +1,357 @@
+//! Handler for `GetPluginConfigQuery`.
+//!
+//! Returns the schema declared by the plugin's manifest joined with the
+//! current persisted values (or the manifest defaults when nothing has
+//! been persisted yet). The frontend uses the schema to render typed
+//! form fields and the values to populate them.
+
+use std::collections::HashMap;
+
+use crate::application::error::AppError;
+use crate::application::query_bus::QueryBus;
+use crate::application::read_models::plugin_config_view::PluginConfigView;
+
+impl QueryBus {
+    pub async fn handle_get_plugin_config(
+        &self,
+        query: super::GetPluginConfigQuery,
+    ) -> Result<PluginConfigView, AppError> {
+        let loader = self
+            .plugin_loader()
+            .ok_or_else(|| AppError::Plugin("plugin loader not configured".into()))?;
+        let manifest = loader.get_manifest(&query.plugin_name)?.ok_or_else(|| {
+            AppError::Plugin(format!("plugin '{}' is not loaded", query.plugin_name))
+        })?;
+
+        let store = self
+            .plugin_config_store()
+            .ok_or_else(|| AppError::Plugin("plugin config store not configured".into()))?;
+        let mut values = store
+            .get_values(&query.plugin_name)
+            .map_err(AppError::Domain)?;
+
+        // Drop persisted values that no longer match the current schema
+        // (e.g. after a plugin update tightens a regex, removes an enum
+        // option, or renames a key) so the UI never surfaces a value
+        // the backend would reject on save.
+        let schema = manifest.config_schema();
+        values.retain(|key, value| schema.validate(key, value).is_ok());
+
+        // Fill missing keys with their manifest defaults so the UI never
+        // renders an empty input for a field that has a declared default.
+        for (key, field) in schema.fields() {
+            if !values.contains_key(key)
+                && let Some(default) = field.default_value()
+            {
+                values.insert(key.clone(), default.to_string());
+            }
+        }
+
+        Ok(PluginConfigView::new(
+            manifest.config_schema(),
+            values_to_view(values),
+        ))
+    }
+}
+
+fn values_to_view(values: HashMap<String, String>) -> HashMap<String, String> {
+    values
+}
+
+#[cfg(test)]
+mod tests {
+    use std::collections::HashMap;
+    use std::sync::Arc;
+
+    use crate::application::queries::GetPluginConfigQuery;
+    use crate::application::query_bus::QueryBus;
+    use crate::domain::error::DomainError;
+    use crate::domain::model::download::DownloadId;
+    use crate::domain::model::plugin::{
+        ConfigField, ConfigFieldType, PluginCategory, PluginConfigSchema, PluginInfo,
+        PluginManifest,
+    };
+    use crate::domain::model::views::{
+        DownloadDetailView, DownloadFilter, DownloadView, HistoryEntry, HistoryFilter, HistorySort,
+        ModuleStats, SortOrder, StateCountMap, StatsPeriod, StatsView,
+    };
+    use crate::domain::ports::driven::{
+        ArchiveExtractor, DownloadReadRepository, HistoryRepository, PluginConfigStore,
+        PluginLoader, PluginReadRepository, StatsRepository,
+    };
+
+    struct FakeArchiveExtractor;
+    impl ArchiveExtractor for FakeArchiveExtractor {
+        fn detect_format(
+            &self,
+            _: &std::path::Path,
+        ) -> Result<Option<crate::domain::model::archive::ArchiveFormat>, DomainError> {
+            Ok(None)
+        }
+        fn can_extract(&self, _: &std::path::Path) -> Result<bool, DomainError> {
+            Ok(false)
+        }
+        fn extract(
+            &self,
+            _: &std::path::Path,
+            _: &std::path::Path,
+            _: Option<&str>,
+        ) -> Result<crate::domain::model::archive::ExtractSummary, DomainError> {
+            Ok(crate::domain::model::archive::ExtractSummary {
+                extracted_files: 0,
+                extracted_bytes: 0,
+                duration_ms: 0,
+                warnings: vec![],
+            })
+        }
+        fn list_contents(
+            &self,
+            _: &std::path::Path,
+            _: Option<&str>,
+        ) -> Result<Vec<crate::domain::model::archive::ArchiveEntry>, DomainError> {
+            Ok(vec![])
+        }
+        fn detect_segments(
+            &self,
+            _: &std::path::Path,
+        ) -> Result<Option<Vec<std::path::PathBuf>>, DomainError> {
+            Ok(None)
+        }
+    }
+
+    struct MockReadRepo;
+    impl DownloadReadRepository for MockReadRepo {
+        fn find_downloads(
+            &self,
+            _: Option<DownloadFilter>,
+            _: Option<SortOrder>,
+            _: Option<u32>,
+            _: Option<u32>,
+        ) -> Result<Vec<DownloadView>, DomainError> {
+            Ok(vec![])
+        }
+        fn find_download_detail(
+            &self,
+            _: DownloadId,
+        ) -> Result<Option<DownloadDetailView>, DomainError> {
+            Ok(None)
+        }
+        fn count_by_state(&self) -> Result<StateCountMap, DomainError> {
+            Ok(HashMap::new())
+        }
+    }
+
+    struct MockHistoryRepo;
+    impl HistoryRepository for MockHistoryRepo {
+        fn record(&self, _: &HistoryEntry) -> Result<(), DomainError> {
+            Ok(())
+        }
+        fn find_recent(&self, _: usize) -> Result<Vec<HistoryEntry>, DomainError> {
+            Ok(vec![])
+        }
+        fn find_by_download(&self, _: DownloadId) -> Result<Vec<HistoryEntry>, DomainError> {
+            Ok(vec![])
+        }
+        fn list(
+            &self,
+            _: Option<HistoryFilter>,
+            _: Option<HistorySort>,
+            _: Option<SortOrder>,
+            _: Option<u32>,
+        ) -> Result<Vec<HistoryEntry>, DomainError> {
+            Ok(vec![])
+        }
+        fn search(&self, _: &str) -> Result<Vec<HistoryEntry>, DomainError> {
+            Ok(vec![])
+        }
+        fn find_by_id(&self, _: u64) -> Result<Option<HistoryEntry>, DomainError> {
+            Ok(None)
+        }
+        fn delete_by_id(&self, _: u64) -> Result<bool, DomainError> {
+            Ok(false)
+        }
+        fn delete_all(&self) -> Result<u64, DomainError> {
+            Ok(0)
+        }
+        fn delete_older_than(&self, _: u64) -> Result<u64, DomainError> {
+            Ok(0)
+        }
+    }
+
+    struct MockStatsRepo;
+    impl StatsRepository for MockStatsRepo {
+        fn record_completed(&self, _: u64, _: u64) -> Result<(), DomainError> {
+            Ok(())
+        }
+        fn get_stats(&self, _: StatsPeriod) -> Result<StatsView, DomainError> {
+            Ok(StatsView {
+                total_downloaded_bytes: 0,
+                total_files: 0,
+                avg_speed: 0,
+                peak_speed: 0,
+                success_rate: 0.0,
+                daily_volumes: vec![],
+                top_hosts: vec![],
+            })
+        }
+        fn top_modules(&self, _: u32) -> Result<Vec<ModuleStats>, DomainError> {
+            Ok(vec![])
+        }
+    }
+
+    struct EmptyPluginRepo;
+    impl PluginReadRepository for EmptyPluginRepo {
+        fn list_loaded(&self) -> Result<Vec<PluginInfo>, DomainError> {
+            Ok(vec![])
+        }
+    }
+
+    struct StubPluginLoader {
+        manifest: Option<PluginManifest>,
+    }
+    impl PluginLoader for StubPluginLoader {
+        fn load(&self, _: &PluginManifest) -> Result<(), DomainError> {
+            Ok(())
+        }
+        fn unload(&self, _: &str) -> Result<(), DomainError> {
+            Ok(())
+        }
+        fn resolve_url(&self, _: &str) -> Result<Option<String>, DomainError> {
+            Ok(None)
+        }
+        fn list_loaded(&self) -> Result<Vec<PluginInfo>, DomainError> {
+            Ok(vec![])
+        }
+        fn set_enabled(&self, _: &str, _: bool) -> Result<(), DomainError> {
+            Ok(())
+        }
+        fn get_manifest(&self, _: &str) -> Result<Option<PluginManifest>, DomainError> {
+            Ok(self.manifest.clone())
+        }
+    }
+
+    struct InMemoryStore {
+        values: HashMap<(String, String), String>,
+    }
+    impl InMemoryStore {
+        fn new() -> Self {
+            Self {
+                values: HashMap::new(),
+            }
+        }
+    }
+    impl PluginConfigStore for InMemoryStore {
+        fn get_values(&self, plugin_name: &str) -> Result<HashMap<String, String>, DomainError> {
+            Ok(self
+                .values
+                .iter()
+                .filter(|((p, _), _)| p == plugin_name)
+                .map(|((_, k), v)| (k.clone(), v.clone()))
+                .collect())
+        }
+        fn set_value(&self, _: &str, _: &str, _: &str) -> Result<(), DomainError> {
+            Ok(())
+        }
+        fn list_all(&self) -> Result<HashMap<String, HashMap<String, String>>, DomainError> {
+            Ok(HashMap::new())
+        }
+        fn delete_all(&self, _: &str) -> Result<(), DomainError> {
+            Ok(())
+        }
+    }
+
+    fn make_manifest() -> PluginManifest {
+        let info = PluginInfo::new(
+            "yt".to_string(),
+            "1.0.0".to_string(),
+            "yt".to_string(),
+            "x".to_string(),
+            PluginCategory::Crawler,
+        );
+        let mut schema = PluginConfigSchema::new();
+        schema.insert(
+            "default_quality",
+            ConfigField::new(ConfigFieldType::Enum)
+                .with_options(vec!["360p".into(), "720p".into(), "1080p".into()])
+                .with_default("720p"),
+        );
+        schema.insert(
+            "audio_only",
+            ConfigField::new(ConfigFieldType::Boolean).with_default("false"),
+        );
+        PluginManifest::new(info).with_config_schema(schema)
+    }
+
+    fn make_query_bus(loader: StubPluginLoader, store: InMemoryStore) -> QueryBus {
+        QueryBus::new(
+            Arc::new(MockReadRepo),
+            Arc::new(MockHistoryRepo),
+            Arc::new(MockStatsRepo),
+            Arc::new(EmptyPluginRepo),
+            Arc::new(FakeArchiveExtractor),
+        )
+        .with_plugin_loader(Arc::new(loader))
+        .with_plugin_config_store(Arc::new(store))
+    }
+
+    #[tokio::test]
+    async fn test_get_plugin_config_returns_schema_and_defaults() {
+        let bus = make_query_bus(
+            StubPluginLoader {
+                manifest: Some(make_manifest()),
+            },
+            InMemoryStore::new(),
+        );
+        let view = bus
+            .handle_get_plugin_config(GetPluginConfigQuery {
+                plugin_name: "yt".into(),
+            })
+            .await
+            .unwrap();
+        assert_eq!(view.fields.len(), 2);
+        assert_eq!(
+            view.values.get("default_quality"),
+            Some(&"720p".to_string())
+        );
+        assert_eq!(view.values.get("audio_only"), Some(&"false".to_string()));
+    }
+
+    #[tokio::test]
+    async fn test_get_plugin_config_persisted_overrides_default() {
+        let mut store = InMemoryStore::new();
+        store
+            .values
+            .insert(("yt".into(), "default_quality".into()), "1080p".into());
+        let bus = make_query_bus(
+            StubPluginLoader {
+                manifest: Some(make_manifest()),
+            },
+            store,
+        );
+        let view = bus
+            .handle_get_plugin_config(GetPluginConfigQuery {
+                plugin_name: "yt".into(),
+            })
+            .await
+            .unwrap();
+        assert_eq!(
+            view.values.get("default_quality"),
+            Some(&"1080p".to_string())
+        );
+    }
+
+    #[tokio::test]
+    async fn test_get_plugin_config_unknown_plugin_returns_err() {
+        let bus = make_query_bus(StubPluginLoader { manifest: None }, InMemoryStore::new());
+        let err = bus
+            .handle_get_plugin_config(GetPluginConfigQuery {
+                plugin_name: "ghost".into(),
+            })
+            .await
+            .unwrap_err();
+        assert!(matches!(
+            err,
+            crate::application::error::AppError::Plugin(_)
+        ));
+    }
+}
diff --git a/src-tauri/src/application/queries/mod.rs b/src-tauri/src/application/queries/mod.rs
index f3a54fe..1316df4 100644
--- a/src-tauri/src/application/queries/mod.rs
+++ b/src-tauri/src/application/queries/mod.rs
@@ -7,6 +7,7 @@ mod count_by_state;
 mod get_download_detail;
 mod get_downloads;
 mod get_history_entry;
+mod get_plugin_config;
 mod get_plugin_store;
 mod get_stats;
 mod list_archive_contents;
@@ -89,3 +90,12 @@ pub struct ListArchiveContentsQuery {
     pub password: Option<String>,
 }
 impl Query for ListArchiveContentsQuery {}
+
+/// Read the schema and current values for a single plugin's
+/// configuration. Powers the dynamic UI form rendered in the plugin
+/// row's "Configure" dialog.
+#[derive(Debug)]
+pub struct GetPluginConfigQuery {
+    pub plugin_name: String,
+}
+impl Query for GetPluginConfigQuery {}
diff --git a/src-tauri/src/application/query_bus.rs b/src-tauri/src/application/query_bus.rs
index ed23dcf..756a5dd 100644
--- a/src-tauri/src/application/query_bus.rs
+++ b/src-tauri/src/application/query_bus.rs
@@ -6,8 +6,8 @@ use std::sync::Arc;
 
 use crate::domain::ports::driven::{
-    ArchiveExtractor, DownloadReadRepository, HistoryRepository, PluginReadRepository,
-    StatsRepository,
+    ArchiveExtractor, DownloadReadRepository, HistoryRepository, PluginConfigStore, PluginLoader,
+    PluginReadRepository, StatsRepository,
 };
 
 /// Central dispatcher for CQRS queries.
@@ -20,6 +20,8 @@ pub struct QueryBus {
     stats_repo: Arc<dyn StatsRepository>,
     plugin_read_repo: Arc<dyn PluginReadRepository>,
     archive_extractor: Arc<dyn ArchiveExtractor>,
+    plugin_loader: Option<Arc<dyn PluginLoader>>,
+    plugin_config_store: Option<Arc<dyn PluginConfigStore>>,
 }
 
 impl QueryBus {
@@ -36,9 +38,25 @@ impl QueryBus {
             stats_repo,
             plugin_read_repo,
             archive_extractor,
+            plugin_loader: None,
+            plugin_config_store: None,
         }
     }
 
+    /// Builder-style setter for the plugin loader. Optional so test
+    /// fixtures that never query plugin manifests don't have to provide
+    /// one.
+    pub fn with_plugin_loader(mut self, loader: Arc<dyn PluginLoader>) -> Self {
+        self.plugin_loader = Some(loader);
+        self
+    }
+
+    /// Builder-style setter for the plugin config persistence port.
+    pub fn with_plugin_config_store(mut self, store: Arc<dyn PluginConfigStore>) -> Self {
+        self.plugin_config_store = Some(store);
+        self
+    }
+
     pub fn download_read_repo(&self) -> &dyn DownloadReadRepository {
         self.download_read_repo.as_ref()
     }
@@ -55,6 +73,14 @@ impl QueryBus {
         self.plugin_read_repo.as_ref()
     }
 
+    pub fn plugin_loader(&self) -> Option<&dyn PluginLoader> {
+        self.plugin_loader.as_deref()
+    }
+
+    pub fn plugin_config_store(&self) -> Option<&dyn PluginConfigStore> {
+        self.plugin_config_store.as_deref()
+    }
+
     pub(crate) fn archive_extractor_arc(&self) -> Arc<dyn ArchiveExtractor> {
         Arc::clone(&self.archive_extractor)
     }
diff --git a/src-tauri/src/application/read_models/mod.rs b/src-tauri/src/application/read_models/mod.rs
index 57fcac2..845c73c 100644
--- a/src-tauri/src/application/read_models/mod.rs
+++ b/src-tauri/src/application/read_models/mod.rs
@@ -3,6 +3,7 @@
 pub mod download_detail_view;
 pub mod download_view;
 pub mod history_view;
+pub mod plugin_config_view;
 pub mod plugin_store_view;
 pub mod plugin_view;
 pub mod stats_view;
diff --git a/src-tauri/src/application/read_models/plugin_config_view.rs b/src-tauri/src/application/read_models/plugin_config_view.rs
new file mode 100644
index 0000000..bc655c0
--- /dev/null
+++ b/src-tauri/src/application/read_models/plugin_config_view.rs
@@ -0,0 +1,98 @@
+//! Serializable plugin-configuration view DTO for the frontend.
+//!
+//! Bundles the schema (so the UI can render typed fields) and the
+//! current values (so the UI can populate them) in a single payload.
+
+use std::collections::HashMap;
+
+use serde::Serialize;
+
+use crate::domain::model::plugin::{ConfigField, PluginConfigSchema};
+
+#[derive(Debug, Clone, Serialize)]
+#[serde(rename_all = "camelCase")]
+pub struct ConfigFieldDto {
+    pub key: String,
+    pub field_type: String,
+    pub default: Option<String>,
+    pub description: Option<String>,
+    pub options: Vec<String>,
+    pub min: Option<f64>,
+    pub max: Option<f64>,
+    pub regex: Option<String>,
+}
+
+impl ConfigFieldDto {
+    pub fn from_field(key: &str, field: &ConfigField) -> Self {
+        Self {
+            key: key.to_string(),
+            field_type: field.field_type().to_string(),
+            default: field.default_value().map(|s| s.to_string()),
+            description: field.description().map(|s| s.to_string()),
+            options: field.options().to_vec(),
+            min: field.min(),
+            max: field.max(),
+            regex: field.regex().map(|s| s.to_string()),
+        }
+    }
+}
+
+#[derive(Debug, Clone, Serialize)]
+#[serde(rename_all = "camelCase")]
+pub struct PluginConfigView {
+    pub fields: Vec<ConfigFieldDto>,
+    pub values: HashMap<String, String>,
+}
+
+impl PluginConfigView {
+    pub fn new(schema: &PluginConfigSchema, values: HashMap<String, String>) -> Self {
+        let mut fields: Vec<ConfigFieldDto> = schema
+            .fields()
+            .iter()
+            .map(|(k, f)| ConfigFieldDto::from_field(k, f))
+            .collect();
+        fields.sort_by(|a, b| a.key.cmp(&b.key));
+        Self { fields, values }
+    }
+}
+
+#[cfg(test)]
+mod tests {
+    use super::*;
+    use crate::domain::model::plugin::{ConfigField, ConfigFieldType, PluginConfigSchema};
+
+    #[test]
+    fn test_plugin_config_view_serializes_camel_case() {
+        let mut schema = PluginConfigSchema::new();
+        schema.insert(
+            "default_quality",
+            ConfigField::new(ConfigFieldType::Enum)
+                .with_options(vec!["360p".into(), "720p".into()])
+                .with_default("720p")
+                .with_description("Quality tier"),
+        );
+        let mut values = HashMap::new();
+        values.insert("default_quality".to_string(), "720p".to_string());
+
+        let view = PluginConfigView::new(&schema, values);
+        let json = serde_json::to_value(&view).unwrap();
+        assert!(json.get("fields").is_some());
+        assert!(json.get("values").is_some());
+        let field0 = &json.get("fields").unwrap().as_array().unwrap()[0];
+        assert_eq!(field0.get("key").unwrap(), "default_quality");
+        assert_eq!(field0.get("fieldType").unwrap(), "enum");
+        assert_eq!(field0.get("default").unwrap(), "720p");
+    }
+
+    #[test]
+    fn test_plugin_config_view_fields_sorted_by_key() {
+        let mut schema = PluginConfigSchema::new();
+        schema.insert("zeta", ConfigField::new(ConfigFieldType::String));
+        schema.insert("alpha", ConfigField::new(ConfigFieldType::String));
+        schema.insert("mu", ConfigField::new(ConfigFieldType::String));
+
+        let view = PluginConfigView::new(&schema, HashMap::new());
+        let keys: Vec<&str> = view.fields.iter().map(|f| f.key.as_str()).collect();
+        assert_eq!(keys, vec!["alpha", "mu", "zeta"]);
+    }
+}
diff --git a/src-tauri/src/domain/model/plugin.rs b/src-tauri/src/domain/model/plugin.rs
index f74015b..314e353 100644
--- a/src-tauri/src/domain/model/plugin.rs
+++ b/src-tauri/src/domain/model/plugin.rs
@@ -2,6 +2,8 @@ use std::collections::HashMap;
 use std::fmt;
 use std::str::FromStr;
 
+use crate::domain::error::DomainError;
+
 #[derive(Debug, Clone, Copy, PartialEq, Eq)]
 pub enum PluginCategory {
     Crawler,
@@ -115,6 +117,7 @@ pub struct PluginManifest {
     capabilities: Vec<String>,
     min_vortex_version: Option<String>,
     config_defaults: HashMap<String, String>,
+    config_schema: PluginConfigSchema,
 }
 
 impl PluginManifest {
@@ -124,6 +127,7 @@
             capabilities: Vec::new(),
             min_vortex_version: None,
             config_defaults: HashMap::new(),
+            config_schema: PluginConfigSchema::new(),
         }
     }
@@ -142,6 +146,11 @@ impl PluginManifest {
         self
     }
 
+    pub fn with_config_schema(mut self, schema: PluginConfigSchema) -> Self {
+        self.config_schema = schema;
+        self
+    }
+
     pub fn info(&self) -> &PluginInfo {
         &self.info
     }
@@ -161,6 +170,768 @@ impl PluginManifest {
     pub fn config_defaults(&self) -> &HashMap<String, String> {
         &self.config_defaults
     }
+
+    pub fn config_schema(&self) -> &PluginConfigSchema {
+        &self.config_schema
+    }
+}
+
+/// Type tag of a single configuration field.
+#[derive(Debug, Clone, Copy, PartialEq, Eq)]
+pub enum ConfigFieldType {
+    String,
+    Boolean,
+    Integer,
+    Float,
+    Url,
+    Enum,
+    Array,
+}
+
+impl fmt::Display for ConfigFieldType {
+    fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
+        let name = match self {
+            ConfigFieldType::String => "string",
+            ConfigFieldType::Boolean => "boolean",
+            ConfigFieldType::Integer => "integer",
+            ConfigFieldType::Float => "float",
+            ConfigFieldType::Url => "url",
+            ConfigFieldType::Enum => "enum",
+            ConfigFieldType::Array => "array",
+        };
+        write!(f, "{name}")
+    }
+}
+
+impl FromStr for ConfigFieldType {
+    type Err = String;
+
+    fn from_str(s: &str) -> Result<Self, Self::Err> {
+        match s {
+            "string" => Ok(ConfigFieldType::String),
+            "boolean" | "bool" => Ok(ConfigFieldType::Boolean),
+            "integer" | "int" => Ok(ConfigFieldType::Integer),
+            "float" | "number" => Ok(ConfigFieldType::Float),
+            "url" => Ok(ConfigFieldType::Url),
+            "enum" => Ok(ConfigFieldType::Enum),
+            "array" => Ok(ConfigFieldType::Array),
+            other => Err(format!("unknown config field type: '{other}'")),
+        }
+    }
+}
+
+/// One configuration field declared by a plugin's `[config]` table.
+///
+/// Values are encoded as strings on the wire (matching the host's
+/// `plugin_configs` storage). [`ConfigField::validate`] is the single
+/// source of truth — UI hints are derived from the field metadata but
+/// the backend re-validates before persisting.
+#[derive(Debug, Clone, PartialEq)]
+pub struct ConfigField {
+    field_type: ConfigFieldType,
+    default: Option<String>,
+    description: Option<String>,
+    options: Vec<String>,
+    min: Option<f64>,
+    max: Option<f64>,
+    regex: Option<String>,
+}
+
+impl ConfigField {
+    pub fn new(field_type: ConfigFieldType) -> Self {
+        Self {
+            field_type,
+            default: None,
+            description: None,
+            options: Vec::new(),
+            min: None,
+            max: None,
+            regex: None,
+        }
+    }
+
+    pub fn with_default(mut self, default: impl Into<String>) -> Self {
+        self.default = Some(default.into());
+        self
+    }
+
+    pub fn with_description(mut self, description: impl Into<String>) -> Self {
+        self.description = Some(description.into());
+        self
+    }
+
+    pub fn with_options(mut self, options: Vec<String>) -> Self {
+        self.options = options;
+        self
+    }
+
+    pub fn with_min(mut self, min: f64) -> Self {
+        self.min = Some(min);
+        self
+    }
+
+    pub fn with_max(mut self, max: f64) -> Self {
+        self.max = Some(max);
+        self
+    }
+
+    pub fn with_regex(mut self, regex: impl Into<String>) -> Self {
+        self.regex = Some(regex.into());
+        self
+    }
+
+    pub fn field_type(&self) -> ConfigFieldType {
+        self.field_type
+    }
+
+    pub fn default_value(&self) -> Option<&str> {
+        self.default.as_deref()
+    }
+
+    pub fn description(&self) -> Option<&str> {
+        self.description.as_deref()
+    }
+
+    pub fn options(&self) -> &[String] {
+        &self.options
+    }
+
+    pub fn min(&self) -> Option<f64> {
+        self.min
+    }
+
+    pub fn max(&self) -> Option<f64> {
+        self.max
+    }
+
+    pub fn regex(&self) -> Option<&str> {
+        self.regex.as_deref()
+    }
+
+    pub fn validate(&self, value: &str) -> Result<(), DomainError> {
+        match self.field_type {
+            ConfigFieldType::Boolean => {
+                if value != "true" && value != "false" {
+                    return Err(DomainError::ValidationError(format!(
+                        "expected 'true' or 'false', got '{value}'"
+                    )));
+                }
+            }
+            ConfigFieldType::Integer => {
+                let parsed: i64 = value.parse().map_err(|_| {
+                    DomainError::ValidationError(format!("expected integer, got '{value}'"))
+                })?;
+                // Use ceil/floor on f64 bounds so fractional limits like
+                // min=1.5 still reject the integer 1, and so we never widen
+                // the range by truncating toward zero (e.g. -1.5 as i64 = -1).
+                if let Some(min) = self.min {
+                    let min_required = min.ceil() as i64;
+                    if parsed < min_required {
+                        return Err(DomainError::ValidationError(format!(
+                            "value {parsed} below minimum {min}"
+                        )));
+                    }
+                }
+                if let Some(max) = self.max {
+                    let max_allowed = max.floor() as i64;
+                    if parsed > max_allowed {
+                        return Err(DomainError::ValidationError(format!(
+                            "value {parsed} above maximum {max}"
+                        )));
+                    }
+                }
+            }
+            ConfigFieldType::Float => {
+                let parsed: f64 = value.parse().map_err(|_| {
+                    DomainError::ValidationError(format!("expected float, got '{value}'"))
+                })?;
+                // `f64::from_str` accepts "NaN", "inf", "infinity" (case
+                // insensitive). NaN compares false against any bound so it
+                // would silently pass `check_numeric_bounds` — reject the
+                // non-finite values up front.
+                if !parsed.is_finite() {
+                    return Err(DomainError::ValidationError(format!(
+                        "expected finite float, got '{value}'"
+                    )));
+                }
+                self.check_numeric_bounds(parsed)?;
+            }
+            ConfigFieldType::Url => {
+                if !value.starts_with("http://") && !value.starts_with("https://") {
+                    return Err(DomainError::ValidationError(format!(
+                        "expected http(s) URL, got '{value}'"
+                    )));
+                }
+            }
+            ConfigFieldType::Enum => {
+                if !self.options.iter().any(|o| o == value) {
+                    return Err(DomainError::ValidationError(format!(
+                        "value '{value}' not in allowed options"
+                    )));
+                }
+            }
+            ConfigFieldType::String => {
+                if !self.options.is_empty() && !self.options.iter().any(|o| o == value) {
+                    return Err(DomainError::ValidationError(format!(
+                        "value '{value}' not in allowed options"
+                    )));
+                }
+            }
+            ConfigFieldType::Array => {
+                let count = parse_json_array_len(value).ok_or_else(|| {
+                    DomainError::ValidationError(format!("expected JSON array, got '{value}'"))
+                })?;
+                if let Some(min) = self.min
+                    && (count as f64) < min
+                {
+                    return Err(DomainError::ValidationError(format!(
+                        "array has {count} element(s), below minimum {min}"
+                    )));
+                }
+                if let Some(max) = self.max
+                    && (count as f64) > max
+                {
+                    return Err(DomainError::ValidationError(format!(
+                        "array has {count} element(s), above maximum {max}"
+                    )));
+                }
+            }
+        }
+
+        if let Some(pattern) = &self.regex {
+            if let Some(err) = regex_syntax_error(pattern) {
+                return Err(DomainError::ValidationError(format!(
+                    "regex pattern '{pattern}' is malformed: {err}"
+                )));
+            }
+            if let Some(bad) = unsupported_regex_feature(pattern) {
+                return Err(DomainError::ValidationError(format!(
+                    "regex pattern '{pattern}' uses unsupported feature '{bad}' (alternation, groups and counted quantifiers are not implemented)"
+                )));
+            }
+            if !match_regex(pattern, value) {
+                return Err(DomainError::ValidationError(format!(
+                    "value '{value}' does not match regex"
+                )));
+            }
+        }
+
+        Ok(())
+    }
+
+    fn check_numeric_bounds(&self, n: f64) -> Result<(), DomainError> {
+        if let Some(min) = self.min
+            && n < min
+        {
+            return Err(DomainError::ValidationError(format!(
+                "value {n} below minimum {min}"
+            )));
+        }
+        if let Some(max) = self.max
+            && n > max
+        {
+            return Err(DomainError::ValidationError(format!(
+                "value {n} above maximum {max}"
+            )));
+        }
+        Ok(())
+    }
+}
+
+/// Schema describing every configurable field of a plugin.
+#[derive(Debug, Clone, Default, PartialEq)] +pub struct PluginConfigSchema { + fields: HashMap<String, ConfigField>, +} + +impl PluginConfigSchema { + pub fn new() -> Self { + Self { + fields: HashMap::new(), + } + } + + pub fn insert(&mut self, key: impl Into<String>, field: ConfigField) { + self.fields.insert(key.into(), field); + } + + pub fn get(&self, key: &str) -> Option<&ConfigField> { + self.fields.get(key) + } + + pub fn fields(&self) -> &HashMap<String, ConfigField> { + &self.fields + } + + pub fn is_empty(&self) -> bool { + self.fields.is_empty() + } + + pub fn len(&self) -> usize { + self.fields.len() + } + + pub fn validate(&self, key: &str, value: &str) -> Result<(), DomainError> { + let field = self.fields.get(key).ok_or_else(|| { + DomainError::NotFound(format!("config key '{key}' not declared by plugin")) + })?; + field.validate(value) + } +} + +/// Strict JSON-array parser built with std only. +/// +/// Returns the element count when `value` is a syntactically valid JSON +/// array (rejecting things like `[1 2]`, `[1,]`, `[tru]`), or `None` for +/// any malformation. Domain layer constraint forbids external crates, +/// so this is a hand-rolled recursive descent parser over the JSON grammar. +fn parse_json_array_len(value: &str) -> Option<usize> { + let chars: Vec<char> = value.chars().collect(); + let mut p = JsonCursor::new(&chars); + p.skip_ws(); + let count = p.parse_array()?; + p.skip_ws(); + if p.pos < p.src.len() { + return None; + } + Some(count) +} + +struct JsonCursor<'a> { + src: &'a [char], + pos: usize, +} + +impl<'a> JsonCursor<'a> { + fn new(src: &'a [char]) -> Self { + Self { src, pos: 0 } + } + + fn peek(&self) -> Option<char> { + self.src.get(self.pos).copied() + } + + fn bump(&mut self) -> Option<char> { + let c = self.peek()?; + self.pos += 1; + Some(c) + } + + fn expect(&mut self, c: char) -> Option<()> { + if self.peek()? 
== c { + self.pos += 1; + Some(()) + } else { + None + } + } + + fn skip_ws(&mut self) { + while let Some(c) = self.peek() { + if c.is_whitespace() { + self.pos += 1; + } else { + break; + } + } + } + + fn parse_array(&mut self) -> Option<usize> { + self.expect('[')?; + self.skip_ws(); + if self.peek()? == ']' { + self.pos += 1; + return Some(0); + } + let mut count = 0; + loop { + self.parse_value()?; + count += 1; + self.skip_ws(); + match self.peek()? { + ',' => { + self.pos += 1; + self.skip_ws(); + if self.peek()? == ']' { + return None; // trailing comma + } + } + ']' => { + self.pos += 1; + return Some(count); + } + _ => return None, + } + } + } + + fn parse_object(&mut self) -> Option<()> { + self.expect('{')?; + self.skip_ws(); + if self.peek()? == '}' { + self.pos += 1; + return Some(()); + } + loop { + self.skip_ws(); + self.parse_string()?; + self.skip_ws(); + self.expect(':')?; + self.parse_value()?; + self.skip_ws(); + match self.peek()? { + ',' => { + self.pos += 1; + } + '}' => { + self.pos += 1; + return Some(()); + } + _ => return None, + } + } + } + + fn parse_value(&mut self) -> Option<()> { + self.skip_ws(); + match self.peek()? { + '"' => self.parse_string(), + '[' => self.parse_array().map(|_| ()), + '{' => self.parse_object(), + 't' => self.parse_keyword("true"), + 'f' => self.parse_keyword("false"), + 'n' => self.parse_keyword("null"), + '-' | '0'..='9' => self.parse_number(), + _ => None, + } + } + + fn parse_string(&mut self) -> Option<()> { + self.expect('"')?; + loop { + let c = self.bump()?; + match c { + '"' => return Some(()), + '\\' => { + let esc = self.bump()?; + match esc { + '"' | '\\' | '/' | 'b' | 'f' | 'n' | 'r' | 't' => {} + 'u' => { + for _ in 0..4 { + let h = self.bump()?; + if !h.is_ascii_hexdigit() { + return None; + } + } + } + _ => return None, + } + } + // RFC 8259: unescaped control characters (U+0000..=U+001F) + // are forbidden inside strings. 
+ c if (c as u32) < 0x20 => return None, + _ => {} + } + } + } + + fn parse_keyword(&mut self, word: &str) -> Option<()> { + for ec in word.chars() { + if self.bump()? != ec { + return None; + } + } + Some(()) + } + + fn parse_number(&mut self) -> Option<()> { + let start = self.pos; + if self.peek() == Some('-') { + self.pos += 1; + } + // RFC 8259: integer part is `0` or [1-9][0-9]* — no leading zeros. + match self.peek()? { + '0' => { + self.pos += 1; + } + '1'..='9' => { + self.pos += 1; + while let Some(c) = self.peek() { + if c.is_ascii_digit() { + self.pos += 1; + } else { + break; + } + } + } + _ => return None, + } + if self.peek() == Some('.') { + self.pos += 1; + let frac_start = self.pos; + while let Some(c) = self.peek() { + if c.is_ascii_digit() { + self.pos += 1; + } else { + break; + } + } + if self.pos == frac_start { + return None; + } + } + if matches!(self.peek(), Some('e') | Some('E')) { + self.pos += 1; + if matches!(self.peek(), Some('+') | Some('-')) { + self.pos += 1; + } + let exp_start = self.pos; + while let Some(c) = self.peek() { + if c.is_ascii_digit() { + self.pos += 1; + } else { + break; + } + } + if self.pos == exp_start { + return None; + } + } + if self.pos == start { None } else { Some(()) } + } +} + +/// Returns a syntax-level error for `pattern` (unclosed `[`, trailing +/// `\`), or `None` when the pattern is well-formed for the matcher. The +/// matcher would otherwise silently degrade malformed patterns into +/// "never matches", so plugin authors get no feedback. 
+pub fn regex_syntax_error(pattern: &str) -> Option<String> { + let chars: Vec<char> = pattern.chars().collect(); + let mut i = 0; + let mut in_class = false; + while i < chars.len() { + let c = chars[i]; + match c { + '\\' => { + if i + 1 >= chars.len() { + return Some("trailing backslash".into()); + } + i += 2; + } + '[' if !in_class => { + in_class = true; + i += 1; + } + ']' if in_class => { + in_class = false; + i += 1; + } + _ => i += 1, + } + } + if in_class { + return Some("unclosed '['".into()); + } + None +} + +/// Returns the first regex feature in `pattern` the matcher does not +/// support, or `None` if the pattern only uses the supported subset +/// (anchors, `.`, `[...]`, `*`/`+`/`?`, `\d`/`\w`/`\s`, escapes). Plugin +/// authors expecting full PCRE/Rust regex semantics for `|`, `(...)` or +/// `{n,m}` would otherwise hit silent match failures — the manifest +/// parser uses this to fail loudly at load time. +pub fn unsupported_regex_feature(pattern: &str) -> Option<char> { + let chars: Vec<char> = pattern.chars().collect(); + let mut i = 0; + let mut in_class = false; + while i < chars.len() { + let c = chars[i]; + if c == '\\' { + i += 2; + continue; + } + if c == '[' && !in_class { + in_class = true; + i += 1; + continue; + } + if c == ']' && in_class { + in_class = false; + i += 1; + continue; + } + if !in_class && matches!(c, '|' | '(' | ')' | '{' | '}') { + return Some(c); + } + i += 1; + } + None +} + +/// Minimal POSIX-like regex matcher built with std only. +/// +/// Domain layer constraint: no external crate. Supports anchors (`^`, `$`), +/// wildcard (`.`), char classes (`[a-z]`, `[^abc]`), greedy quantifiers +/// (`*`, `+`, `?`) and escapes (`\d`, `\w`, `\s`, `\.`). Sufficient for the +/// validation patterns declared by community plugins. Returns `false` on +/// malformed patterns rather than panicking. 
+fn match_regex(pattern: &str, value: &str) -> bool { + let pat: Vec<char> = pattern.chars().collect(); + let val: Vec<char> = value.chars().collect(); + + let (anchor_start, body) = if pat.first() == Some(&'^') { + (true, &pat[1..]) + } else { + (false, &pat[..]) + }; + let (anchor_end, body) = if body.last() == Some(&'$') { + (true, &body[..body.len() - 1]) + } else { + (false, body) + }; + + if anchor_start { + regex_match_from(body, &val, 0, anchor_end) + } else { + for start in 0..=val.len() { + if regex_match_from(body, &val, start, anchor_end) { + return true; + } + } + false + } +} + +fn regex_match_from(pat: &[char], val: &[char], start: usize, anchor_end: bool) -> bool { + let mut pi = 0; + let mut vi = start; + + while pi < pat.len() { + let (atom_pat, atom_len) = parse_atom(&pat[pi..]); + let next_pi = pi + atom_len; + let quantifier = pat.get(next_pi).copied(); + + match quantifier { + Some('*') => { + let mut matches = vi; + while matches < val.len() && atom_match(&atom_pat, val[matches]) { + matches += 1; + } + loop { + if regex_match_from(&pat[next_pi + 1..], val, matches, anchor_end) { + return true; + } + if matches == vi { + return false; + } + matches -= 1; + } + } + Some('+') => { + if vi >= val.len() || !atom_match(&atom_pat, val[vi]) { + return false; + } + let mut matches = vi + 1; + while matches < val.len() && atom_match(&atom_pat, val[matches]) { + matches += 1; + } + while matches > vi { + if regex_match_from(&pat[next_pi + 1..], val, matches, anchor_end) { + return true; + } + matches -= 1; + } + return false; + } + Some('?') => { + if vi < val.len() + && atom_match(&atom_pat, val[vi]) + && regex_match_from(&pat[next_pi + 1..], val, vi + 1, anchor_end) + { + return true; + } + return regex_match_from(&pat[next_pi + 1..], val, vi, anchor_end); + } + _ => { + if vi >= val.len() || !atom_match(&atom_pat, val[vi]) { + return false; + } + vi += 1; + pi = next_pi; + } + } + } + + if anchor_end { vi == val.len() } else { true } +} + +#[derive(Debug, 
Clone)] +enum Atom { + Any, + Literal(char), + Class(Vec<(char, char)>, bool), + Digit, + Word, + Space, +} + +fn parse_atom(pat: &[char]) -> (Atom, usize) { + if pat.is_empty() { + return (Atom::Any, 0); + } + match pat[0] { + '.' => (Atom::Any, 1), + '\\' => { + if pat.len() < 2 { + return (Atom::Literal('\\'), 1); + } + let atom = match pat[1] { + 'd' => Atom::Digit, + 'w' => Atom::Word, + 's' => Atom::Space, + c => Atom::Literal(c), + }; + (atom, 2) + } + '[' => { + let mut i = 1; + let mut ranges = Vec::new(); + let negate = pat.get(1) == Some(&'^'); + if negate { + i = 2; + } + while i < pat.len() && pat[i] != ']' { + let start = pat[i]; + if i + 2 < pat.len() && pat[i + 1] == '-' && pat[i + 2] != ']' { + ranges.push((start, pat[i + 2])); + i += 3; + } else { + ranges.push((start, start)); + i += 1; + } + } + if i >= pat.len() || pat[i] != ']' { + return (Atom::Class(Vec::new(), false), pat.len()); + } + (Atom::Class(ranges, negate), i + 1) + } + c => (Atom::Literal(c), 1), + } +} + +fn atom_match(atom: &Atom, c: char) -> bool { + match atom { + Atom::Any => true, + Atom::Literal(l) => *l == c, + Atom::Class(ranges, negate) => { + let inside = ranges.iter().any(|(a, b)| c >= *a && c <= *b); + inside != *negate + } + Atom::Digit => c.is_ascii_digit(), + Atom::Word => c.is_ascii_alphanumeric() || c == '_', + Atom::Space => c.is_whitespace(), + } } #[cfg(test)] @@ -243,4 +1014,318 @@ mod tests { assert_eq!(PluginCategory::Notifier.to_string(), "Notifier"); assert_eq!(PluginCategory::Utility.to_string(), "Utility"); } + + #[test] + fn test_config_field_type_from_str_known() { + assert_eq!( + "string".parse::<ConfigFieldType>().unwrap(), + ConfigFieldType::String + ); + assert_eq!( + "boolean".parse::<ConfigFieldType>().unwrap(), + ConfigFieldType::Boolean + ); + assert_eq!( + "integer".parse::<ConfigFieldType>().unwrap(), + ConfigFieldType::Integer + ); + assert_eq!( + "float".parse::<ConfigFieldType>().unwrap(), + ConfigFieldType::Float + ); + assert_eq!( + "url".parse::<ConfigFieldType>().unwrap(), + ConfigFieldType::Url + ); + assert_eq!( + 
"enum".parse::().unwrap(), + ConfigFieldType::Enum + ); + } + + #[test] + fn test_config_field_type_from_str_unknown_returns_err() { + assert!("unknown".parse::().is_err()); + } + + #[test] + fn test_config_field_type_display_lowercase() { + assert_eq!(ConfigFieldType::String.to_string(), "string"); + assert_eq!(ConfigFieldType::Boolean.to_string(), "boolean"); + assert_eq!(ConfigFieldType::Integer.to_string(), "integer"); + } + + #[test] + fn test_config_field_validate_boolean_accepts_true_false() { + let f = ConfigField::new(ConfigFieldType::Boolean); + assert!(f.validate("true").is_ok()); + assert!(f.validate("false").is_ok()); + } + + #[test] + fn test_config_field_validate_boolean_rejects_other() { + let f = ConfigField::new(ConfigFieldType::Boolean); + let err = f.validate("yes").unwrap_err(); + assert!(matches!(err, DomainError::ValidationError(_))); + } + + #[test] + fn test_config_field_validate_integer_parses_and_checks_bounds() { + let f = ConfigField::new(ConfigFieldType::Integer) + .with_min(1.0) + .with_max(10.0); + assert!(f.validate("5").is_ok()); + assert!(matches!( + f.validate("abc").unwrap_err(), + DomainError::ValidationError(_) + )); + assert!(matches!( + f.validate("0").unwrap_err(), + DomainError::ValidationError(_) + )); + assert!(matches!( + f.validate("11").unwrap_err(), + DomainError::ValidationError(_) + )); + } + + #[test] + fn test_config_field_validate_float_with_bounds() { + let f = ConfigField::new(ConfigFieldType::Float) + .with_min(0.0) + .with_max(1.0); + assert!(f.validate("0.5").is_ok()); + assert!(f.validate("1.5").is_err()); + assert!(f.validate("-0.5").is_err()); + } + + #[test] + fn test_config_field_validate_url_requires_http_scheme() { + let f = ConfigField::new(ConfigFieldType::Url); + assert!(f.validate("https://example.com").is_ok()); + assert!(f.validate("http://example.com").is_ok()); + assert!(f.validate("ftp://example.com").is_err()); + assert!(f.validate("not-a-url").is_err()); + } + + #[test] + fn 
test_config_field_validate_enum_checks_options() { + let f = ConfigField::new(ConfigFieldType::Enum).with_options(vec![ + "360p".to_string(), + "720p".to_string(), + "1080p".to_string(), + ]); + assert!(f.validate("720p").is_ok()); + assert!(f.validate("4K").is_err()); + } + + #[test] + fn test_config_field_validate_string_with_options_acts_as_enum() { + let f = ConfigField::new(ConfigFieldType::String) + .with_options(vec!["fast".to_string(), "slow".to_string()]); + assert!(f.validate("fast").is_ok()); + assert!(f.validate("medium").is_err()); + } + + #[test] + fn test_config_field_validate_string_without_options_accepts_anything() { + let f = ConfigField::new(ConfigFieldType::String); + assert!(f.validate("anything goes").is_ok()); + assert!(f.validate("").is_ok()); + } + + #[test] + fn test_config_field_validate_regex_constrains_string() { + let f = ConfigField::new(ConfigFieldType::String).with_regex(r"^[a-z]+$"); + assert!(f.validate("hello").is_ok()); + assert!(f.validate("Hello").is_err()); + assert!(f.validate("hello123").is_err()); + } + + #[test] + fn test_config_field_default_value_optional() { + let f = ConfigField::new(ConfigFieldType::String).with_default("hi"); + assert_eq!(f.default_value(), Some("hi")); + let g = ConfigField::new(ConfigFieldType::Integer); + assert!(g.default_value().is_none()); + } + + #[test] + fn test_plugin_config_schema_insert_and_get() { + let mut schema = PluginConfigSchema::new(); + assert!(schema.is_empty()); + schema.insert( + "quality", + ConfigField::new(ConfigFieldType::Enum) + .with_options(vec!["360p".into(), "720p".into()]) + .with_default("720p"), + ); + assert!(!schema.is_empty()); + assert_eq!(schema.len(), 1); + let field = schema.get("quality").unwrap(); + assert_eq!(field.field_type(), ConfigFieldType::Enum); + assert_eq!(field.default_value(), Some("720p")); + } + + #[test] + fn test_plugin_config_schema_validate_unknown_key_returns_not_found() { + let schema = PluginConfigSchema::new(); + let err = 
schema.validate("ghost", "v").unwrap_err(); + assert!(matches!(err, DomainError::NotFound(_))); + } + + #[test] + fn test_plugin_config_schema_validate_delegates_to_field() { + let mut schema = PluginConfigSchema::new(); + schema.insert("audio", ConfigField::new(ConfigFieldType::Boolean)); + assert!(schema.validate("audio", "true").is_ok()); + assert!(matches!( + schema.validate("audio", "yes").unwrap_err(), + DomainError::ValidationError(_) + )); + } + + #[test] + fn test_plugin_manifest_with_config_schema() { + let info = make_info(); + let mut schema = PluginConfigSchema::new(); + schema.insert("foo", ConfigField::new(ConfigFieldType::String)); + let manifest = PluginManifest::new(info).with_config_schema(schema); + assert_eq!(manifest.config_schema().len(), 1); + assert!(manifest.config_schema().get("foo").is_some()); + } + + #[test] + fn test_plugin_manifest_default_config_schema_empty() { + let manifest = PluginManifest::new(make_info()); + assert!(manifest.config_schema().is_empty()); + } + + #[test] + fn test_parse_json_array_len_accepts_valid_arrays() { + assert_eq!(parse_json_array_len("[]"), Some(0)); + assert_eq!(parse_json_array_len("[1]"), Some(1)); + assert_eq!(parse_json_array_len(" [ 1 , 2 , 3 ] "), Some(3)); + assert_eq!(parse_json_array_len("[\"a\", \"b\"]"), Some(2)); + assert_eq!(parse_json_array_len("[null, true, false]"), Some(3)); + assert_eq!(parse_json_array_len("[{\"k\": \"v\"}, [1, 2]]"), Some(2)); + assert_eq!(parse_json_array_len("[-1.5e2]"), Some(1)); + } + + #[test] + fn test_parse_json_array_len_rejects_malformed() { + assert_eq!(parse_json_array_len("[1 2]"), None); // missing comma + assert_eq!(parse_json_array_len("[1,]"), None); // trailing comma + assert_eq!(parse_json_array_len("[tru]"), None); // partial keyword + assert_eq!(parse_json_array_len("[\"unterminated]"), None); + assert_eq!(parse_json_array_len("[1, 2"), None); // unclosed + assert_eq!(parse_json_array_len("[1] junk"), None); // trailing content + 
assert_eq!(parse_json_array_len("not an array"), None); + assert_eq!(parse_json_array_len("{\"k\":\"v\"}"), None); // object, not array + } + + #[test] + fn test_config_field_validate_array_applies_min_max_count() { + let f = ConfigField::new(ConfigFieldType::Array) + .with_min(1.0) + .with_max(3.0); + assert!(f.validate("[1]").is_ok()); + assert!(f.validate("[1, 2, 3]").is_ok()); + assert!(matches!( + f.validate("[]").unwrap_err(), + DomainError::ValidationError(_) + )); + assert!(matches!( + f.validate("[1, 2, 3, 4]").unwrap_err(), + DomainError::ValidationError(_) + )); + assert!(matches!( + f.validate("[1,]").unwrap_err(), + DomainError::ValidationError(_) + )); + } + + #[test] + fn test_unsupported_regex_feature_detects_unimplemented() { + assert_eq!(unsupported_regex_feature("^(foo|bar)$"), Some('(')); + assert_eq!(unsupported_regex_feature("a|b"), Some('|')); + assert_eq!(unsupported_regex_feature("a{2,3}"), Some('{')); + assert_eq!(unsupported_regex_feature("^[a-z]+$"), None); + assert_eq!(unsupported_regex_feature(r"\d+"), None); + // Escaped versions are literal, allowed. + assert_eq!(unsupported_regex_feature(r"a\|b"), None); + // Inside character class these are literal too. 
+ assert_eq!(unsupported_regex_feature("[|()]"), None); + } + + #[test] + fn test_config_field_validate_regex_rejects_unsupported_syntax() { + let f = ConfigField::new(ConfigFieldType::String).with_regex("^(foo|bar)$"); + let err = f.validate("foo").unwrap_err(); + match err { + DomainError::ValidationError(msg) => assert!( + msg.contains("unsupported feature"), + "expected unsupported-feature message, got: {msg}" + ), + other => panic!("expected ValidationError, got {other:?}"), + } + } + + #[test] + fn test_regex_syntax_error_detects_malformations() { + assert_eq!(regex_syntax_error("^[a-z]+$"), None); + assert_eq!(regex_syntax_error(r"\d+"), None); + assert!(regex_syntax_error("[abc").is_some()); // unclosed + assert!(regex_syntax_error("foo\\").is_some()); // trailing backslash + // Backslash inside class still requires a follow-up. + assert!(regex_syntax_error("[a\\").is_some()); + } + + #[test] + fn test_config_field_validate_regex_rejects_malformed_pattern() { + let f = ConfigField::new(ConfigFieldType::String).with_regex("[abc"); + let err = f.validate("a").unwrap_err(); + match err { + DomainError::ValidationError(msg) => assert!( + msg.contains("malformed"), + "expected malformed-pattern message, got: {msg}" + ), + other => panic!("expected ValidationError, got {other:?}"), + } + } + + #[test] + fn test_parse_json_array_len_rejects_invalid_string_escapes() { + // \q is not a valid JSON escape. + assert_eq!(parse_json_array_len(r#"["\q"]"#), None); + // Bare control characters (here U+000A newline) must be escaped. + assert_eq!(parse_json_array_len("[\"line\nbreak\"]"), None); + // Valid escapes still pass. 
+ assert_eq!(parse_json_array_len(r#"["\n", "\t", "A"]"#), Some(3)); + } + + #[test] + fn test_parse_json_array_len_rejects_leading_zero_numbers() { + assert_eq!(parse_json_array_len("[01]"), None); + assert_eq!(parse_json_array_len("[001]"), None); + assert_eq!(parse_json_array_len("[-01]"), None); + // Single zero and decimals starting with zero are valid. + assert_eq!(parse_json_array_len("[0]"), Some(1)); + assert_eq!(parse_json_array_len("[0.5]"), Some(1)); + assert_eq!(parse_json_array_len("[-0.5]"), Some(1)); + } + + #[test] + fn test_config_field_validate_float_rejects_nan_and_infinity() { + let f = ConfigField::new(ConfigFieldType::Float); + for input in ["NaN", "nan", "inf", "Infinity", "-Infinity", "-inf"] { + let err = f.validate(input).unwrap_err(); + assert!( + matches!(err, DomainError::ValidationError(_)), + "expected ValidationError for {input}" + ); + } + assert!(f.validate("0.5").is_ok()); + assert!(f.validate("-3.14").is_ok()); + } } diff --git a/src-tauri/src/domain/ports/driven/mod.rs b/src-tauri/src/domain/ports/driven/mod.rs index b5f20f3..0f74a16 100644 --- a/src-tauri/src/domain/ports/driven/mod.rs +++ b/src-tauri/src/domain/ports/driven/mod.rs @@ -15,6 +15,7 @@ pub mod file_opener; pub mod file_storage; pub mod history_repository; pub mod http_client; +pub mod plugin_config_store; pub mod plugin_loader; pub mod plugin_read_repository; pub mod plugin_store_client; @@ -34,6 +35,7 @@ pub use file_opener::FileOpener; pub use file_storage::FileStorage; pub use history_repository::HistoryRepository; pub use http_client::HttpClient; +pub use plugin_config_store::PluginConfigStore; pub use plugin_loader::PluginLoader; pub use plugin_read_repository::PluginReadRepository; pub use plugin_store_client::PluginStoreClient; diff --git a/src-tauri/src/domain/ports/driven/plugin_config_store.rs b/src-tauri/src/domain/ports/driven/plugin_config_store.rs new file mode 100644 index 0000000..fcb4149 --- /dev/null +++ 
b/src-tauri/src/domain/ports/driven/plugin_config_store.rs @@ -0,0 +1,33 @@ +//! Persistent storage for plugin configuration values. +//! +//! Each plugin owns a flat (key, value) map; values are encoded as +//! UTF-8 strings (matching the in-memory `plugin_configs` map used by +//! the host functions). The schema lives on the manifest, not here. + +use std::collections::HashMap; + +use crate::domain::error::DomainError; + +/// Persists plugin configuration values across restarts. +/// +/// Adapters typically back this with a SQLite table. Implementations +/// must be thread-safe so command handlers can mutate values from +/// concurrent IPC calls. +pub trait PluginConfigStore: Send + Sync { + /// Read every (key, value) pair recorded for `plugin_name`. + /// Returns an empty map when the plugin has no persisted overrides. + fn get_values(&self, plugin_name: &str) -> Result<HashMap<String, String>, DomainError>; + + /// Persist a single (key, value) pair, replacing the previous value + /// when one exists. + fn set_value(&self, plugin_name: &str, key: &str, value: &str) -> Result<(), DomainError>; + + /// Read every (plugin_name → key → value) tuple persisted by the + /// store. Used at startup to repopulate the in-memory `plugin_configs` + /// map after defaults from the manifest are applied. + fn list_all(&self) -> Result<HashMap<String, HashMap<String, String>>, DomainError>; + + /// Remove every persisted value for `plugin_name`. Called when a + /// plugin is uninstalled so the configs do not linger as ghost rows. + fn delete_all(&self, plugin_name: &str) -> Result<(), DomainError>; +} diff --git a/src-tauri/src/domain/ports/driven/plugin_loader.rs b/src-tauri/src/domain/ports/driven/plugin_loader.rs index 6e54fd1..c6c4f06 100644 --- a/src-tauri/src/domain/ports/driven/plugin_loader.rs +++ b/src-tauri/src/domain/ports/driven/plugin_loader.rs @@ -107,6 +107,23 @@ pub trait PluginLoader: Send + Sync { "download_to_file not supported by this loader".into(), )) } + + /// Return the full manifest of a loaded plugin by name. 
+ /// + /// Used by the configuration query/command handlers to read the + /// declared `[config]` schema. Default returns `None` so adapters + /// without a manifest registry remain compatible. + fn get_manifest(&self, _name: &str) -> Result<Option<PluginManifest>, DomainError> { + Ok(None) + } + + /// Update the live config value for a plugin so subsequent + /// `get_config(key)` calls from inside the WASM plugin observe the + /// new value without a reload. Default is a no-op for adapters that + /// don't expose a runtime config map. + fn set_runtime_config(&self, _name: &str, _key: &str, _value: &str) -> Result<(), DomainError> { + Ok(()) + } } #[cfg(test)] diff --git a/src-tauri/src/lib.rs b/src-tauri/src/lib.rs index 41ffe8b..23207c0 100644 --- a/src-tauri/src/lib.rs +++ b/src-tauri/src/lib.rs @@ -65,10 +65,11 @@ pub use adapters::driving::tauri_ipc::{ download_reorder_queue, download_resume, download_resume_all, download_retry, download_set_priority, download_start, download_verify_checksum, history_clear, history_delete_entry, history_export, history_get_by_id, history_list, - history_purge_older_than, history_search, link_resolve, plugin_disable, plugin_enable, - plugin_install, plugin_list, plugin_store_install, plugin_store_list, plugin_store_refresh, - plugin_store_update, plugin_uninstall, reveal_in_folder, settings_get, settings_update, - stats_get, stats_top_modules, status_bar_get, + history_purge_older_than, history_search, link_resolve, plugin_config_get, + plugin_config_update, plugin_disable, plugin_enable, plugin_install, plugin_list, + plugin_store_install, plugin_store_list, plugin_store_refresh, plugin_store_update, + plugin_uninstall, reveal_in_folder, settings_get, settings_update, stats_get, + stats_top_modules, status_bar_get, }; #[cfg_attr(mobile, tauri::mobile_entry_point)] pub fn run() { @@ -144,11 +145,43 @@ pub fn run() { // ── Plugin system ─────────────────────────────────────── let shared_resources = Arc::new(SharedHostResources::new()); + let plugin_config_store: 
Arc< + dyn crate::domain::ports::driven::PluginConfigStore, + > = Arc::new( + crate::adapters::driven::sqlite::plugin_config_repo::SqlitePluginConfigRepo::new( + db.clone(), + ), + ); let plugin_loader_impl = Arc::new( - ExtismPluginLoader::new(plugins_dir.clone(), shared_resources) + ExtismPluginLoader::new(plugins_dir.clone(), shared_resources.clone()) .map_err(|e| e.to_string())?, ); + // Replay persisted plugin configs into the in-memory map so + // `get_config()` calls inside loaded plugins observe the user's + // last-saved values from the previous session, not just the + // manifest defaults seeded by `build_host_functions`. Values + // are inserted raw here — `build_host_functions` re-validates + // the per-plugin map against the current schema when each + // plugin loads (including hot-loads via the file watcher), + // so stale entries get pruned at the right moment without + // dropping overrides for plugins that load later in the + // session. + match plugin_config_store.list_all() { + Ok(all) => { + for (plugin_name, kv) in all { + let entry = shared_resources + .plugin_configs() + .entry(plugin_name) + .or_default(); + for (k, v) in kv { + entry.insert(k, v); + } + } + } + Err(e) => tracing::warn!(error = %e, "startup: failed to load plugin configs"), + } + // Scan existing plugin directories and load them at startup. 
// The PluginWatcher reacts only to file-system events, so plugins // already present on disk before the watcher starts would otherwise @@ -276,7 +309,7 @@ pub fn run() { event_bus.clone(), file_storage, http_client, - plugin_loader, + plugin_loader.clone(), config_store, credential_store, clipboard_observer, @@ -285,16 +318,21 @@ pub fn run() { Some(store_client), ) .with_checksum_computer(checksum_computer) - .with_file_opener(file_opener), + .with_file_opener(file_opener) + .with_plugin_config_store(plugin_config_store.clone()), ); - let query_bus = Arc::new(QueryBus::new( - download_read_repo, - history_repo, - stats_repo, - plugin_read_repo, - archive_extractor, - )); + let query_bus = Arc::new( + QueryBus::new( + download_read_repo, + history_repo, + stats_repo, + plugin_read_repo, + archive_extractor, + ) + .with_plugin_loader(plugin_loader.clone()) + .with_plugin_config_store(plugin_config_store), + ); // ── Register AppState ─────────────────────────────────── let app_plugin_loader: Arc<dyn PluginLoader> = plugin_loader_impl.clone(); @@ -390,6 +428,8 @@ pub fn run() { plugin_store_refresh, plugin_store_install, plugin_store_update, + plugin_config_get, + plugin_config_update, link_resolve, clipboard_toggle, clipboard_state, diff --git a/src/i18n/locales/en.json b/src/i18n/locales/en.json index 6603db2..c84814b 100644 --- a/src/i18n/locales/en.json +++ b/src/i18n/locales/en.json @@ -328,6 +328,7 @@ "uninstall": "Uninstall", "update": "Update", "more": "More actions", + "configure": "Configure", "refresh": "Check updates", "browseRepo": "Browse repository" }, @@ -359,6 +360,17 @@ "disableSuccess": "Plugin disabled", "enableSuccess": "Plugin enabled", "uninstallSuccess": "Plugin uninstalled" + }, + "config": { + "title": "Configure {{name}}", + "description": "Settings declared by the plugin manifest. 
Values are validated before being saved.", + "loading": "Loading configuration…", + "error": "Failed to load configuration", + "noFields": "This plugin exposes no configuration fields.", + "toast": { + "saveSuccess": "Configuration saved", + "validationFailed": "Fix validation errors before saving" + } } }, "clipboard": { diff --git a/src/i18n/locales/fr.json b/src/i18n/locales/fr.json index 846a22a..31529ee 100644 --- a/src/i18n/locales/fr.json +++ b/src/i18n/locales/fr.json @@ -328,6 +328,7 @@ "uninstall": "Désinstaller", "update": "Mettre à jour", "more": "Plus d'actions", + "configure": "Configurer", "refresh": "Vérifier les mises à jour", "browseRepo": "Parcourir le dépôt" }, @@ -359,6 +360,17 @@ "disableSuccess": "Plugin désactivé", "enableSuccess": "Plugin activé", "uninstallSuccess": "Plugin désinstallé" + }, + "config": { + "title": "Configurer {{name}}", + "description": "Paramètres déclarés par le manifeste du plugin. Les valeurs sont validées avant d'être enregistrées.", + "loading": "Chargement de la configuration…", + "error": "Échec du chargement de la configuration", + "noFields": "Ce plugin n'expose aucun champ de configuration.", + "toast": { + "saveSuccess": "Configuration enregistrée", + "validationFailed": "Corrigez les erreurs de validation avant d'enregistrer" + } } }, "clipboard": { diff --git a/src/types/plugin-config.ts b/src/types/plugin-config.ts new file mode 100644 index 0000000..ff81c10 --- /dev/null +++ b/src/types/plugin-config.ts @@ -0,0 +1,24 @@ +export type PluginConfigFieldType = + | "string" + | "boolean" + | "integer" + | "float" + | "url" + | "enum" + | "array"; + +export interface PluginConfigField { + key: string; + fieldType: PluginConfigFieldType; + default: string | null; + description: string | null; + options: string[]; + min: number | null; + max: number | null; + regex: string | null; +} + +export interface PluginConfigView { + fields: PluginConfigField[]; + values: Record; +} diff --git a/src/views/PluginsView.tsx 
b/src/views/PluginsView.tsx index 38cd091..a259abf 100644 --- a/src/views/PluginsView.tsx +++ b/src/views/PluginsView.tsx @@ -1,13 +1,17 @@ import { useMemo, useState } from "react"; import { useTranslation } from "react-i18next"; +import { useQuery } from "@tanstack/react-query"; +import { tauriInvoke } from "@/api/client"; import { PluginStoreRow } from "./PluginsView/PluginStoreRow"; import { PluginsHeader } from "./PluginsView/PluginsHeader"; import { PluginsToolbar } from "./PluginsView/PluginsToolbar"; +import { PluginConfigDialog } from "./PluginsView/PluginConfigDialog"; import { groupByCategory } from "./PluginsView/groupByCategory"; import { usePluginStore } from "./PluginsView/usePluginStore"; import { useTauriMutation } from "@/api/hooks"; import { toast } from "@/lib/toast"; import { Button } from "@/components/ui/button"; +import type { PluginConfigView } from "@/types/plugin-config"; const CATEGORIES = [ "all", @@ -38,6 +42,12 @@ export function PluginsView() { // field. State is not persisted across reloads on purpose — same as the // pre-PR behaviour. const [locallyDisabled, setLocallyDisabled] = useState>(new Set()); + const [configPluginName, setConfigPluginName] = useState(null); + + // Each installed plugin's schema is fetched once on mount so the + // "Configure" button can be hidden when the manifest declares no + // `[config]` fields. We piggyback on TanStack Query's cache so the + // dialog can reuse the same key without a duplicate fetch. 
const { entries, @@ -107,6 +117,37 @@ export function PluginsView() { [entries, locallyDisabled], ); + const installedNames = useMemo( + () => + entries + .filter((e) => isInstalled(e.status) && !locallyDisabled.has(e.name)) + .map((e) => e.name), + [entries, locallyDisabled], + ); + + const { data: configsByPlugin } = useQuery({ + queryKey: ["plugin_config_get_all", installedNames], + enabled: installedNames.length > 0, + queryFn: async () => { + const results = await Promise.all( + installedNames.map(async (name) => { + try { + const view = await tauriInvoke("plugin_config_get", { name }); + return [name, view] as const; + } catch { + return [name, null] as const; + } + }), + ); + return Object.fromEntries(results) as Record; + }, + }); + + const hasConfig = (name: string): boolean => { + const view = configsByPlugin?.[name]; + return view !== null && view !== undefined && view.fields.length > 0; + }; + const groups = useMemo(() => groupByCategory(filtered), [filtered]); return ( @@ -171,6 +212,8 @@ export function PluginsView() { onDisable={(name) => disableMutation.mutate({ name })} onEnable={(name) => enableMutation.mutate({ name })} onUninstall={(name) => uninstallMutation.mutate({ name })} + onConfigure={(name) => setConfigPluginName(name)} + hasConfig={hasConfig(entry.name)} isInstalling={isInstalling(entry.name)} isUpdating={isUpdating(entry.name)} /> @@ -182,6 +225,14 @@ export function PluginsView() {
+
+      <PluginConfigDialog
+        pluginName={configPluginName}
+        open={configPluginName !== null}
+        onOpenChange={(open) => {
+          if (!open) setConfigPluginName(null);
+        }}
+      />
     </div>
   );
 }
diff --git a/src/views/PluginsView/PluginConfigDialog.tsx b/src/views/PluginsView/PluginConfigDialog.tsx
new file mode 100644
index 0000000..e5df707
--- /dev/null
+++ b/src/views/PluginsView/PluginConfigDialog.tsx
@@ -0,0 +1,233 @@
+import { useEffect, useMemo, useRef, useState } from "react";
+import { useQuery, useQueryClient } from "@tanstack/react-query";
+import { useTranslation } from "react-i18next";
+import { tauriInvoke } from "@/api/client";
+import { useTauriMutation } from "@/api/hooks";
+import { toast } from "@/lib/toast";
+import {
+  Dialog,
+  DialogContent,
+  DialogDescription,
+  DialogFooter,
+  DialogHeader,
+  DialogTitle,
+} from "@/components/ui/dialog";
+import { Button } from "@/components/ui/button";
+import { PluginConfigField } from "./PluginConfigField";
+import type { PluginConfigField as ConfigField, PluginConfigView } from "@/types/plugin-config";
+
+interface PluginConfigDialogProps {
+  pluginName: string | null;
+  open: boolean;
+  onOpenChange: (open: boolean) => void;
+}
+
+const QUERY_KEY = (name: string) => ["plugin_config_get", { name }] as const;
+
+function validate(field: ConfigField, value: string): string | null {
+  if (field.fieldType === "boolean") {
+    if (value !== "true" && value !== "false") return "Invalid boolean";
+    return null;
+  }
+  if (field.fieldType === "integer") {
+    if (!/^-?\d+$/.test(value)) return "Must be an integer";
+    const n = Number(value);
+    if (field.min !== null && n < field.min) return `Min ${field.min}`;
+    if (field.max !== null && n > field.max) return `Max ${field.max}`;
+    return null;
+  }
+  if (field.fieldType === "float") {
+    if (value.trim() === "") return "Must be a number";
+    const n = Number(value);
+    // `Number.isFinite` rejects both NaN and ±Infinity. The Rust backend's
+    // `f64::from_str` accepts "Infinity" / "inf", so we'd otherwise let the
+    // user submit a value that the schema validator then turns into a
+    // generic toast instead of an inline field error.
+    if (!Number.isFinite(n)) return "Must be a finite number";
+    if (field.min !== null && n < field.min) return `Min ${field.min}`;
+    if (field.max !== null && n > field.max) return `Max ${field.max}`;
+    return null;
+  }
+  if (field.fieldType === "url") {
+    if (!/^https?:\/\//.test(value)) return "Must be http(s)://";
+    return null;
+  }
+  if (field.fieldType === "enum") {
+    if (!field.options.includes(value)) return "Pick one of the options";
+    return null;
+  }
+  if (field.fieldType === "array") {
+    let parsed: unknown;
+    try {
+      parsed = JSON.parse(value);
+    } catch {
+      return "Must be valid JSON";
+    }
+    if (!Array.isArray(parsed)) return "Must be a JSON array";
+    if (field.min !== null && parsed.length < field.min) {
+      return `Min ${field.min} item(s)`;
+    }
+    if (field.max !== null && parsed.length > field.max) {
+      return `Max ${field.max} item(s)`;
+    }
+    return null;
+  }
+  if (field.fieldType === "string") {
+    if (field.options.length > 0 && !field.options.includes(value)) {
+      return "Pick one of the options";
+    }
+    if (field.regex !== null) {
+      try {
+        if (!new RegExp(field.regex).test(value)) return "Does not match pattern";
+      } catch {
+        // Malformed regex shipped by the plugin — let the backend reject it.
+      }
+    }
+    return null;
+  }
+  return null;
+}
+
+export function PluginConfigDialog({
+  pluginName,
+  open,
+  onOpenChange,
+}: PluginConfigDialogProps) {
+  const { t } = useTranslation();
+  const queryClient = useQueryClient();
+  const [draft, setDraft] = useState<Record<string, string>>({});
+
+  const enabled = open && pluginName !== null;
+  const { data, isLoading, isError } = useQuery({
+    queryKey: pluginName ? QUERY_KEY(pluginName) : ["plugin_config_get_disabled"],
+    queryFn: () =>
+      tauriInvoke<PluginConfigView>("plugin_config_get", { name: pluginName! }),
+    enabled,
+  });
+
+  const updateMutation = useTauriMutation<
+    void,
+    { name: string; key: string; value: string }
+  >("plugin_config_update");
+
+  // Seed the draft on the first successful fetch per dialog opening only.
+  // Subsequent refetches must not clobber the user's in-progress edits —
+  // in particular after a failed save, where keying off `isPending`
+  // would re-overwrite the draft when the mutation settles.
+  const initializedFor = useRef<string | null>(null);
+  useEffect(() => {
+    if (!enabled) {
+      initializedFor.current = null;
+      return;
+    }
+    if (data && initializedFor.current !== pluginName) {
+      setDraft(data.values);
+      initializedFor.current = pluginName;
+    }
+  }, [data, enabled, pluginName]);
+
+  const errors = useMemo(() => {
+    if (!data) return {} as Record<string, string>;
+    const out: Record<string, string> = {};
+    for (const field of data.fields) {
+      const value = draft[field.key] ?? "";
+      const err = validate(field, value);
+      if (err) out[field.key] = err;
+    }
+    return out;
+  }, [data, draft]);
+
+  async function handleSave() {
+    if (!pluginName || !data) return;
+    if (Object.keys(errors).length > 0) {
+      toast.error(t("plugins.config.toast.validationFailed"));
+      return;
+    }
+
+    const changedKeys = data.fields
+      .map((f) => f.key)
+      .filter((key) => draft[key] !== data.values[key]);
+
+    if (changedKeys.length === 0) {
+      onOpenChange(false);
+      return;
+    }
+
+    try {
+      for (const key of changedKeys) {
+        await updateMutation.mutateAsync({
+          name: pluginName,
+          key,
+          value: draft[key],
+        });
+      }
+      // Single invalidation after the loop so each iteration doesn't
+      // race a refetch back into the draft.
+      queryClient.invalidateQueries({ queryKey: QUERY_KEY(pluginName) });
+      toast.success(t("plugins.config.toast.saveSuccess"));
+      onOpenChange(false);
+    } catch {
+      // useTauriMutation already surfaces a toast on each error.
+    }
+  }
+
+  return (
+    <Dialog open={open} onOpenChange={onOpenChange}>
+      <DialogContent>
+        <DialogHeader>
+          <DialogTitle>
+            {t("plugins.config.title", { name: pluginName ?? "" })}
+          </DialogTitle>
+          <DialogDescription>
+            {t("plugins.config.description")}
+          </DialogDescription>
+        </DialogHeader>
+
+        {isLoading && (
+          <p>{t("plugins.config.loading")}</p>
+        )}
+        {isError && (
+          <p>{t("plugins.config.error")}</p>
+        )}
+        {data && data.fields.length === 0 && (
+          <p>{t("plugins.config.noFields")}</p>
+        )}
+        {data && data.fields.length > 0 && (
+          <div>
+            {data.fields.map((field) => (
+              <PluginConfigField
+                key={field.key}
+                field={field}
+                value={draft[field.key] ?? ""}
+                onChange={(value) =>
+                  setDraft((prev) => ({ ...prev, [field.key]: value }))
+                }
+                errorMessage={errors[field.key]}
+              />
+            ))}
+          </div>
+        )}
+
+        <DialogFooter>
+          <Button variant="outline" onClick={() => onOpenChange(false)}>
+            {t("common.cancel")}
+          </Button>
+          <Button onClick={handleSave} disabled={updateMutation.isPending}>
+            {t("common.save")}
+          </Button>
+        </DialogFooter>
+      </DialogContent>
+    </Dialog>
+  );
+}
diff --git a/src/views/PluginsView/PluginConfigField.tsx b/src/views/PluginsView/PluginConfigField.tsx
new file mode 100644
index 0000000..8f82ce5
--- /dev/null
+++ b/src/views/PluginsView/PluginConfigField.tsx
@@ -0,0 +1,130 @@
+import { Input } from "@/components/ui/input";
+import { Switch } from "@/components/ui/switch";
+import {
+  Select,
+  SelectContent,
+  SelectItem,
+  SelectTrigger,
+  SelectValue,
+} from "@/components/ui/select";
+import type { PluginConfigField as ConfigField } from "@/types/plugin-config";
+
+interface PluginConfigFieldProps {
+  field: ConfigField;
+  value: string;
+  onChange: (value: string) => void;
+  errorMessage?: string;
+}
+
+function isEnumLike(field: ConfigField): boolean {
+  if (field.fieldType === "enum") return true;
+  return field.fieldType === "string" && field.options.length > 0;
+}
+
+export function PluginConfigField({
+  field,
+  value,
+  onChange,
+  errorMessage,
+}: PluginConfigFieldProps) {
+  const labelId = `plugin-config-field-${field.key}`;
+  const describedBy = errorMessage ? `${labelId}-error` : undefined;
+
+  return (
+    <div>
+      <label id={labelId}>
+        {field.key}
+      </label>
+      {field.description && (
+        <p>
+          {field.description}
+        </p>
+      )}
+      {renderControl(field, value, onChange, labelId, field.key, describedBy)}
+      {errorMessage && (
+        <p id={`${labelId}-error`} role="alert">
+          {errorMessage}
+        </p>
+      )}
+    </div>
+  );
+}
+
+function renderControl(
+  field: ConfigField,
+  value: string,
+  onChange: (value: string) => void,
+  labelId: string,
+  inputId: string,
+  describedBy: string | undefined,
+) {
+  if (field.fieldType === "boolean") {
+    return (
+      <Switch
+        id={inputId}
+        aria-labelledby={labelId}
+        aria-describedby={describedBy}
+        checked={value === "true"}
+        onCheckedChange={(checked) => onChange(checked ? "true" : "false")}
+      />
+    );
+  }
+
+  if (isEnumLike(field)) {
+    return (
+      <Select value={value} onValueChange={onChange}>
+        <SelectTrigger
+          id={inputId}
+          aria-labelledby={labelId}
+          aria-describedby={describedBy}
+          className="h-8 text-xs"
+        >
+          <SelectValue />
+        </SelectTrigger>
+        <SelectContent>
+          {field.options.map((option) => (
+            <SelectItem key={option} value={option}>
+              {option}
+            </SelectItem>
+          ))}
+        </SelectContent>
+      </Select>
+    );
+  }
+
+  if (field.fieldType === "integer" || field.fieldType === "float") {
+    return (
+      <Input
+        id={inputId}
+        type="number"
+        aria-labelledby={labelId}
+        aria-describedby={describedBy}
+        min={field.min ?? undefined}
+        max={field.max ?? undefined}
+        value={value}
+        onChange={(e) => onChange(e.target.value)}
+        className="h-8 text-xs"
+      />
+    );
+  }
+
+  return (
+    <Input
+      id={inputId}
+      type={field.fieldType === "url" ? "url" : "text"}
+      aria-labelledby={labelId}
+      aria-describedby={describedBy}
+      value={value}
+      onChange={(e) => onChange(e.target.value)}
+      placeholder={field.default ?? ""}
+      className="h-8 text-xs"
+    />
+  );
+}
diff --git a/src/views/PluginsView/PluginStoreRow.tsx b/src/views/PluginsView/PluginStoreRow.tsx
index 9609fcd..7bb9f2e 100644
--- a/src/views/PluginsView/PluginStoreRow.tsx
+++ b/src/views/PluginsView/PluginStoreRow.tsx
@@ -1,4 +1,4 @@
-import { MoreVertical, ArrowUpCircle } from "lucide-react";
+import { MoreVertical, ArrowUpCircle, Settings } from "lucide-react";
 import { useTranslation } from "react-i18next";
 import { Badge } from "@/components/ui/badge";
 import { Button } from "@/components/ui/button";
@@ -19,6 +19,13 @@ interface PluginStoreRowProps {
   onDisable?: (name: string) => void;
   onEnable?: (name: string) => void;
   onUninstall?: (name: string) => void;
+  /**
+   * Open the dynamic config dialog for this plugin. Hidden when omitted
+   * or when the plugin declares no `[config]` fields (the row gets
+   * `hasConfig` from the parent's resolution of the schema query).
+   */
+  onConfigure?: (name: string) => void;
+  hasConfig?: boolean;
   isInstalling: boolean;
   isUpdating: boolean;
 }
@@ -42,6 +49,8 @@ export function PluginStoreRow({
   onDisable,
   onEnable,
   onUninstall,
+  onConfigure,
+  hasConfig = false,
   isInstalling,
   isUpdating,
 }: PluginStoreRowProps) {
@@ -124,6 +133,18 @@ export function PluginStoreRow({
         )}
 
+        {installed && hasConfig && onConfigure && (
+          <Button
+            variant="ghost"
+            size="icon"
+            aria-label={t("plugins.action.configure")}
+            onClick={() => onConfigure(name)}
+          >
+            <Settings className="h-4 w-4" />
+          </Button>
+        )}
+
         {installed && (onDisable || onEnable || onUninstall) && (
diff --git a/src/views/PluginsView/__tests__/PluginConfigField.test.tsx b/src/views/PluginsView/__tests__/PluginConfigField.test.tsx
new file mode 100644
index 0000000..5552c7b
--- /dev/null
+++ b/src/views/PluginsView/__tests__/PluginConfigField.test.tsx
@@ -0,0 +1,127 @@
+import { describe, it, expect, vi } from "vitest";
+import { render, screen, fireEvent } from "@testing-library/react";
+import { PluginConfigField } from "../PluginConfigField";
+import type { PluginConfigField as ConfigField } from "@/types/plugin-config";
+
+function makeField(overrides: Partial<ConfigField> = {}): ConfigField {
+  return {
+    key: "test_key",
+    fieldType: "string",
+    default: null,
+    description: null,
+    options: [],
+    min: null,
+    max: null,
+    regex: null,
+    ...overrides,
+  };
+}
+
+describe("PluginConfigField", () => {
+  it("renders a text input for string fields", () => {
+    const onChange = vi.fn();
+    render(
+      <PluginConfigField field={makeField()} value="hello" onChange={onChange} />,
+    );
+    const input = screen.getByLabelText("test_key") as HTMLInputElement;
+    expect(input.tagName).toBe("INPUT");
+    expect(input.type).toBe("text");
+    expect(input.value).toBe("hello");
+  });
+
+  it("renders a number input with bounds for integer fields", () => {
+    render(
+      <PluginConfigField
+        field={makeField({ fieldType: "integer", min: 0, max: 10 })}
+        value="5"
+        onChange={() => {}}
+      />,
+    );
+    const input = screen.getByLabelText("test_key") as HTMLInputElement;
+    expect(input.type).toBe("number");
+    expect(input.min).toBe("0");
+    expect(input.max).toBe("10");
+  });
+
+  it("renders a switch for boolean fields and propagates checked state", () => {
+    const onChange = vi.fn();
+    render(
+      <PluginConfigField
+        field={makeField({ fieldType: "boolean" })}
+        value="false"
+        onChange={onChange}
+      />,
+    );
+    const sw = screen.getByLabelText("test_key");
+    fireEvent.click(sw);
+    expect(onChange).toHaveBeenCalledWith("true");
+  });
+
+  it("renders a select for enum fields with the given options", () => {
+    render(
+      <PluginConfigField
+        field={makeField({ fieldType: "enum", options: ["720p", "1080p"] })}
+        value="720p"
+        onChange={() => {}}
+      />,
+    );
+    expect(screen.getByText("720p")).toBeInTheDocument();
+  });
+
+  it("renders a select when string field declares options (enum-like)", () => {
+    render(
+      <PluginConfigField
+        field={makeField({ options: ["fast", "slow"] })}
+        value="fast"
+        onChange={() => {}}
+      />,
+    );
+    expect(screen.getByText("fast")).toBeInTheDocument();
+  });
+
+  it("renders a url input for url fields", () => {
+    render(
+      <PluginConfigField
+        field={makeField({ fieldType: "url" })}
+        value="https://example.com"
+        onChange={() => {}}
+      />,
+    );
+    const input = screen.getByLabelText("test_key") as HTMLInputElement;
+    expect(input.type).toBe("url");
+  });
+
+  it("displays the description when provided", () => {
+    render(
+      <PluginConfigField
+        field={makeField({ description: "Choose your preferred quality" })}
+        value=""
+        onChange={() => {}}
+      />,
+    );
+    expect(screen.getByText("Choose your preferred quality")).toBeInTheDocument();
+  });
+
+  it("displays the error message when provided", () => {
+    render(
+      <PluginConfigField
+        field={makeField()}
+        value=""
+        onChange={() => {}}
+        errorMessage="Required"
+      />,
+    );
+    const alert = screen.getByRole("alert");
+    expect(alert).toHaveTextContent("Required");
+  });
+});
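Reviewer note: the schema that `plugin_config_get` returns originates in the plugin's `plugin.toml`. As a sketch only, a `[config]` block consuming the keys the manifest parser extracts (`type`, `default`, `options`, `description`, `min`, `max`, `regex`) might look like the following; the table-per-field layout and the sample field names are assumptions, not taken from the repository:

```toml
# Hypothetical plugin.toml excerpt. The field names (`quality`,
# `max_retries`) and the [config.<key>] layout are illustrative
# assumptions; only the per-field keys come from the parser description.
[config.quality]
type = "enum"
default = "1080p"
options = ["720p", "1080p", "2160p"]
description = "Preferred download quality"

[config.max_retries]
type = "integer"
default = "3"
min = 0
max = 10
```

Per the parser change in this PR, defaults are validated against their own field, so shipping `default = "11"` for a `0..=10` integer field like `max_retries` above would be rejected at manifest parse time.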
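One subtlety from `validate`'s float branch deserves a standalone illustration: `Number(value)` alone would accept `"Infinity"`, and Rust's `f64::from_str` parses it too, so the bad value would only surface later as a generic backend toast instead of an inline field error. A minimal sketch of the check (the `validateFloat` helper is illustrative; the PR keeps this logic inline in `validate`):

```typescript
// Sketch of the dialog's float validation, extracted for illustration.
// Number.isFinite rejects NaN *and* ±Infinity, catching inputs that
// Number() happily parses (e.g. "Infinity").
function validateFloat(
  value: string,
  min: number | null,
  max: number | null,
): string | null {
  if (value.trim() === "") return "Must be a number"; // empty coerces to 0 otherwise
  const n = Number(value);
  if (!Number.isFinite(n)) return "Must be a finite number";
  if (min !== null && n < min) return `Min ${min}`;
  if (max !== null && n > max) return `Max ${max}`;
  return null;
}

console.log(validateFloat("Infinity", null, null)); // → "Must be a finite number"
console.log(validateFloat("2.5", 0, 10)); // → null
```

The same reasoning explains the dialog's explicit empty-string check: `Number("")` is `0`, which would otherwise pass silently as a valid float.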