Translation Dictionary System - Implementation Plan

Version: 1.0.0
Date: December 9, 2024
Status: PLANNING
Author: Architecture Planning (Gemini 3 Pro Simulation)


Executive Summary

This document provides a comprehensive implementation plan for the Translation Dictionary System - a critical missing component in the DSS Python core that enables mapping between external design token formats and the canonical DSS structure.

Why This Matters

The Translation Dictionary System is the keystone of the entire DSS philosophy:

  • DSS Core is immutable - external systems adapt to DSS, not vice versa
  • Each client project needs its own mappings from legacy tokens to DSS canonical tokens
  • Custom props must be isolated in client-specific namespaces
  • Full traceability from source to DSS to output

Current State

Component              Status      Location
---------------------  ----------  ---------------------------------
DSS Core Principles    Documented  DSS_PRINCIPLES.md
Translation Schema     Documented  DSS_PRINCIPLES.md
Python Implementation  MISSING     Should be dss/translations/
MCP Integration        BLOCKED     Depends on Python implementation

Implementation Impact

Phase                        Current State  After Implementation
---------------------------  -------------  ----------------------------
Workflow 1 (Figma Import)    Working        Enhanced with translation
Workflow 2 (Skin Loading)    60%            100% - Full skin support
Workflow 3 (Design Apply)    10%            100% - Full token resolution

1. Architecture Overview

1.1 System Architecture Diagram

                                    DSS TRANSLATION DICTIONARY SYSTEM
+=======================================================================================+
|                                                                                       |
|   EXTERNAL SOURCES                    TRANSLATION LAYER                DSS CORE       |
|   ================                    =================                ========       |
|                                                                                       |
|   +-------------+                     +------------------+             +-----------+  |
|   | Figma       |----+                |  Translation     |             | Canonical |  |
|   | Tokens      |    |                |  Dictionary      |             | Tokens    |  |
|   +-------------+    |                |  Loader          |             |           |  |
|                      |                +--------+---------+             | color.    |  |
|   +-------------+    |    Load &               |                       |  primary. |  |
|   | Legacy CSS  |----+----Parse---->  +--------v---------+  Resolve    |  500      |  |
|   | Variables   |    |                |  Translation     |---------->  |           |  |
|   +-------------+    |                |  Registry        |             | spacing.  |  |
|                      |                +--------+---------+             |  md       |  |
|   +-------------+    |                         |                       |           |  |
|   | HeroUI/     |----+                +--------v---------+             | etc.      |  |
|   | shadcn      |    |                |  Token           |             +-----------+  |
|   +-------------+    |                |  Resolver        |                   |        |
|                      |                +--------+---------+                   |        |
|   +-------------+    |                         |                             |        |
|   | Custom      |----+                +--------v---------+             +-----v-----+  |
|   | JSON/YAML   |                     |  Custom Props    |             | Merged    |  |
|   +-------------+                     |  Merger          |             | Theme     |  |
|                                       +--------+---------+             | Output    |  |
|                                                |                       +-----------+  |
|                                       +--------v---------+                   |        |
|                                       |  Validation      |                   |        |
|                                       |  Engine          |                   v        |
|                                       +------------------+             OUTPUT FILES   |
|                                                                        - CSS vars     |
|                                                                        - SCSS vars    |
|                                                                        - JSON tokens  |
|                                                                        - TypeScript   |
|                                                                                       |
+=======================================================================================+

1.2 Data Flow Diagram

                          TRANSLATION DICTIONARY DATA FLOW
+--------------------------------------------------------------------------------+
|                                                                                |
|  PROJECT                                                                       |
|  +----------------------------+                                                |
|  | .dss/                      |                                                |
|  |   config.json              |--------> Project Configuration                 |
|  |   translations/            |                                                |
|  |     figma.json            -|--------> Figma Token Mappings                  |
|  |     legacy-css.json       -|--------> Legacy CSS Mappings                   |
|  |     heroui.json           -|--------> HeroUI Mappings                       |
|  |     shadcn.json           -|--------> shadcn Mappings                       |
|  |     custom.json           -|--------> Custom Props Extensions               |
|  +----------------------------+                                                |
|              |                                                                 |
|              v                                                                 |
|  +----------------------------+                                                |
|  | TranslationDictionaryLoader|  <-- Single entry point                        |
|  +----------------------------+                                                |
|              |                                                                 |
|              v                                                                 |
|  +----------------------------+     +------------------------+                 |
|  | TranslationRegistry       |<--->| ValidationEngine       |                 |
|  | (in-memory cache)         |     | - Schema validation    |                 |
|  +----------------------------+     | - DSS canonical check  |                 |
|              |                      | - Conflict detection   |                 |
|              |                      +------------------------+                 |
|              v                                                                 |
|  +----------------------------+                                                |
|  | TokenResolver              |                                                |
|  | - Resolve source -> DSS    |                                                |
|  | - Resolve DSS -> source    |  <-- BIDIRECTIONAL                             |
|  | - Handle aliases           |                                                |
|  | - Chain references         |                                                |
|  +----------------------------+                                                |
|              |                                                                 |
|              v                                                                 |
|  +----------------------------+     +------------------------+                 |
|  | ThemeMerger                |<--->| Base Theme             |                 |
|  | - Merge base + custom      |     | (light/dark)           |                 |
|  | - Apply translations       |     +------------------------+                 |
|  | - Generate resolved theme  |                                                |
|  +----------------------------+                                                |
|              |                                                                 |
|              v                                                                 |
|  +----------------------------+                                                |
|  | ResolvedProjectTheme       |  <-- Final output                              |
|  | - All tokens resolved      |                                                |
|  | - Custom props merged      |                                                |
|  | - Ready for export         |                                                |
|  +----------------------------+                                                |
|                                                                                |
+--------------------------------------------------------------------------------+
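The flow above reduces to a small, pydantic-free sketch: per-source token maps are merged into one combined registry map, and forward resolution is a dictionary lookup. The token names here are illustrative, not real project data:

```python
from typing import Dict, Optional


def build_registry(dictionaries: Dict[str, Dict[str, str]]) -> Dict[str, str]:
    """Merge per-source token maps into one combined source -> DSS map."""
    combined: Dict[str, str] = {}
    for token_map in dictionaries.values():
        for source_token, dss_token in token_map.items():
            combined.setdefault(source_token, dss_token)  # first mapping wins
    return combined


def resolve_to_dss(combined: Dict[str, str], source_token: str) -> Optional[str]:
    """Forward translation: source token -> DSS canonical path."""
    return combined.get(source_token)


# Hypothetical per-source dictionaries, as the loader would produce them.
dictionaries = {
    "css": {"--brand-blue": "color.primary.500"},
    "figma": {"Brand/Blue": "color.primary.500"},
}
combined = build_registry(dictionaries)
assert resolve_to_dss(combined, "--brand-blue") == "color.primary.500"
```

The real TokenResolver (section 4.2) adds reverse maps, aliasing, and reference chains on top of this lookup.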

1.3 Module Integration Diagram

                    DSS MODULE INTEGRATION
+-------------------------------------------------------------+
|                                                             |
|  dss/__init__.py                                            |
|  +-----------------------------------------------------------+
|  |                                                           |
|  |  EXISTING MODULES              NEW MODULE                 |
|  |  =================             ==========                 |
|  |                                                           |
|  |  +-----------+                 +------------------+       |
|  |  | ingest    |<--------------->| translations     |       |
|  |  | - CSS     |   Token         | - loader         |       |
|  |  | - SCSS    |   extraction    | - registry       |       |
|  |  | - JSON    |   results       | - resolver       |       |
|  |  | - merge   |                 | - merger         |       |
|  |  +-----------+                 | - validator      |       |
|  |       ^                        | - models         |       |
|  |       |                        +------------------+       |
|  |       |                               ^                   |
|  |  +-----------+                        |                   |
|  |  | themes    |<-----------------------+                   |
|  |  | - default |   Base theme           |                   |
|  |  | - light   |   for merging          |                   |
|  |  | - dark    |                        |                   |
|  |  +-----------+                        |                   |
|  |       ^                               |                   |
|  |       |                               |                   |
|  |  +-----------+                        |                   |
|  |  | models    |<-----------------------+                   |
|  |  | - Theme   |   Data structures      |                   |
|  |  | - Token   |   & types              |                   |
|  |  +-----------+                        |                   |
|  |       ^                               |                   |
|  |       |                               |                   |
|  |  +-----------+                        |                   |
|  |  | validators|<-----------------------+                   |
|  |  | - schema  |   Validation           |                   |
|  |  | - project |   utilities            |                   |
|  |  +-----------+                        |                   |
|  |       ^                               |                   |
|  |       |                               v                   |
|  |  +-----------+                 +------------------+       |
|  |  | storybook |<--------------->| MCP Plugin       |       |
|  |  | - scanner |   Theme         | (tools/dss_mcp)  |       |
|  |  | - theme   |   generation    |                  |       |
|  |  +-----------+                 +------------------+       |
|  |                                                           |
|  +-----------------------------------------------------------+
|                                                             |
+-------------------------------------------------------------+

2. File Structure

2.1 Complete Module Structure

dss-mvp1/dss/translations/
|
+-- __init__.py              # Module exports
+-- models.py                # Pydantic data models
+-- loader.py                # Dictionary file loading
+-- registry.py              # In-memory translation registry
+-- resolver.py              # Token path resolution
+-- merger.py                # Theme + custom props merging
+-- validator.py             # Schema & semantic validation
+-- writer.py                # Dictionary file writing
+-- utils.py                 # Utility functions
+-- canonical.py             # DSS canonical structure definitions
|
+-- schemas/                 # JSON Schema files
|   +-- translation-v1.schema.json
|   +-- config.schema.json
|
+-- presets/                 # Pre-built translation dictionaries
    +-- heroui.json          # HeroUI -> DSS mappings
    +-- shadcn.json          # shadcn -> DSS mappings
    +-- tailwind.json        # Tailwind -> DSS mappings

2.2 Project .dss Structure

project-root/
|
+-- .dss/                    # DSS project configuration
    |
    +-- config.json          # Project configuration
    |
    +-- translations/        # Translation dictionaries
    |   +-- figma.json       # Figma source mappings
    |   +-- legacy-css.json  # Legacy CSS mappings
    |   +-- custom.json      # Custom props for this project
    |
    +-- cache/               # Computed/resolved cache
        +-- resolved-theme.json
        +-- token-map.json
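For illustration, a minimal translations/figma.json consistent with the schema in section 3.2 could look like the following (built here as a Python dict so it can be serialized with json; the project name acme-web and all token names are hypothetical):

```python
import json

# Hypothetical contents of .dss/translations/figma.json.
figma_dictionary = {
    "$schema": "dss-translation-v1",
    "project": "acme-web",
    "source": "figma",
    "version": "1.0.0",
    "mappings": {
        "tokens": {
            "Brand/Blue/500": "color.primary.500",
            "Spacing/Medium": "spacing.md",
        }
    },
    # Custom props live in a client-specific namespace.
    "custom_props": {"color.brand.acme.primary": "#0055ff"},
    # Source tokens with no DSS equivalent stay tracked, not guessed.
    "unmapped": ["Effects/Glow"],
}

# Round-trips cleanly as JSON, so it can be written to .dss/translations/.
encoded = json.dumps(figma_dictionary, indent=2)
assert json.loads(encoded) == figma_dictionary
```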

3. Data Models

3.1 models.py - Core Data Models

"""
Translation Dictionary Data Models

Pydantic models for translation dictionary system.
"""

from datetime import datetime
from enum import Enum
from typing import Any, Dict, List, Optional, Union
from uuid import uuid4
from pydantic import BaseModel, Field, ConfigDict, field_validator


class TranslationSource(str, Enum):
    """Source types for translation dictionaries."""
    FIGMA = "figma"
    CSS = "css"
    SCSS = "scss"
    HEROUI = "heroui"
    SHADCN = "shadcn"
    TAILWIND = "tailwind"
    JSON = "json"
    CUSTOM = "custom"


class MappingType(str, Enum):
    """Types of mappings in a translation dictionary."""
    TOKEN = "token"
    COMPONENT = "component"
    PATTERN = "pattern"


class TokenMapping(BaseModel):
    """Single token mapping from source to DSS canonical."""
    model_config = ConfigDict(extra="forbid")

    source_token: str = Field(
        ...,
        description="Source token name (e.g., '--brand-blue', '$primary-color')"
    )
    dss_token: str = Field(
        ...,
        description="DSS canonical token path (e.g., 'color.primary.500')"
    )
    source_value: Optional[str] = Field(
        None,
        description="Original value from source (for reference)"
    )
    notes: Optional[str] = Field(
        None,
        description="Human-readable notes about this mapping"
    )
    confidence: float = Field(
        default=1.0,
        ge=0.0,
        le=1.0,
        description="Confidence score for auto-generated mappings"
    )
    auto_generated: bool = Field(
        default=False,
        description="Whether this mapping was auto-generated"
    )


class ComponentMapping(BaseModel):
    """Single component mapping from source to DSS canonical."""
    model_config = ConfigDict(extra="forbid")

    source_component: str = Field(
        ...,
        description="Source component (e.g., '.btn-primary', 'HeroButton')"
    )
    dss_component: str = Field(
        ...,
        description="DSS canonical component (e.g., 'Button[variant=primary]')"
    )
    prop_mappings: Dict[str, str] = Field(
        default_factory=dict,
        description="Prop name mappings (source -> DSS)"
    )
    notes: Optional[str] = Field(None)


class PatternMapping(BaseModel):
    """Pattern mapping for structural translations."""
    model_config = ConfigDict(extra="forbid")

    source_pattern: str = Field(
        ...,
        description="Source pattern (e.g., 'form-row', 'card-grid')"
    )
    dss_pattern: str = Field(
        ...,
        description="DSS canonical pattern"
    )
    notes: Optional[str] = Field(None)


class CustomProp(BaseModel):
    """Custom property not in DSS core."""
    model_config = ConfigDict(extra="forbid")

    name: str = Field(
        ...,
        description="Token name in DSS namespace (e.g., 'color.brand.acme.primary')"
    )
    value: Any = Field(
        ...,
        description="Token value"
    )
    type: str = Field(
        default="string",
        description="Value type (color, dimension, string, etc.)"
    )
    description: Optional[str] = Field(None)
    deprecated: bool = Field(default=False)
    deprecated_message: Optional[str] = Field(None)


class TranslationMappings(BaseModel):
    """Container for all mapping types."""
    model_config = ConfigDict(extra="forbid")

    tokens: Dict[str, str] = Field(
        default_factory=dict,
        description="Token mappings: source_token -> dss_token"
    )
    components: Dict[str, str] = Field(
        default_factory=dict,
        description="Component mappings: source_component -> dss_component"
    )
    patterns: Dict[str, str] = Field(
        default_factory=dict,
        description="Pattern mappings: source_pattern -> dss_pattern"
    )


class TranslationDictionary(BaseModel):
    """Complete translation dictionary for a project."""
    model_config = ConfigDict(extra="forbid")

    # Metadata
    schema_version: str = Field(
        default="dss-translation-v1",
        alias="$schema",
        description="Schema version identifier"
    )
    uuid: str = Field(
        default_factory=lambda: str(uuid4()),
        description="Unique identifier for this dictionary"
    )
    project: str = Field(
        ...,
        description="Project identifier"
    )
    source: TranslationSource = Field(
        ...,
        description="Source type for this dictionary"
    )
    version: str = Field(
        default="1.0.0",
        description="Dictionary version"
    )
    created_at: datetime = Field(
        default_factory=datetime.utcnow
    )
    updated_at: datetime = Field(
        default_factory=datetime.utcnow
    )

    # Mappings
    mappings: TranslationMappings = Field(
        default_factory=TranslationMappings,
        description="All mappings from source to DSS"
    )

    # Custom extensions
    custom_props: Dict[str, Any] = Field(
        default_factory=dict,
        description="Custom props not in DSS core (namespaced)"
    )

    # Tracking
    unmapped: List[str] = Field(
        default_factory=list,
        description="Source tokens that couldn't be mapped"
    )
    notes: List[str] = Field(
        default_factory=list,
        description="Human-readable notes"
    )

    @field_validator('custom_props')
    @classmethod
    def validate_custom_props_namespace(cls, v: Dict[str, Any]) -> Dict[str, Any]:
        """Ensure custom props use proper namespacing."""
        for key in v.keys():
            # Custom props should be namespaced (e.g., color.brand.acme.primary)
            if '.' not in key:
                raise ValueError(
                    f"Custom prop '{key}' must use dot-notation namespace "
                    "(e.g., 'color.brand.project.name')"
                )
        return v


class TranslationRegistry(BaseModel):
    """In-memory registry of all loaded translation dictionaries."""
    model_config = ConfigDict(arbitrary_types_allowed=True)

    dictionaries: Dict[str, TranslationDictionary] = Field(
        default_factory=dict,
        description="Loaded dictionaries by source type"
    )
    combined_token_map: Dict[str, str] = Field(
        default_factory=dict,
        description="Combined source->DSS token mappings"
    )
    combined_component_map: Dict[str, str] = Field(
        default_factory=dict,
        description="Combined source->DSS component mappings"
    )
    all_custom_props: Dict[str, Any] = Field(
        default_factory=dict,
        description="Merged custom props from all dictionaries"
    )
    conflicts: List[Dict[str, Any]] = Field(
        default_factory=list,
        description="Detected mapping conflicts"
    )


class ResolvedToken(BaseModel):
    """A fully resolved token with provenance."""
    model_config = ConfigDict(extra="forbid")

    dss_path: str = Field(
        ...,
        description="DSS canonical path (e.g., 'color.primary.500')"
    )
    value: Any = Field(
        ...,
        description="Resolved value"
    )
    source_token: Optional[str] = Field(
        None,
        description="Original source token if translated"
    )
    source_type: Optional[TranslationSource] = Field(
        None,
        description="Source type if translated"
    )
    is_custom: bool = Field(
        default=False,
        description="Whether this is a custom prop"
    )
    provenance: List[str] = Field(
        default_factory=list,
        description="Resolution chain for debugging"
    )


class ResolvedTheme(BaseModel):
    """Fully resolved theme with all translations applied."""
    model_config = ConfigDict(arbitrary_types_allowed=True)

    name: str
    version: str = "1.0.0"
    base_theme: str = Field(
        ...,
        description="Base theme name (light/dark)"
    )
    tokens: Dict[str, ResolvedToken] = Field(
        default_factory=dict
    )
    custom_props: Dict[str, ResolvedToken] = Field(
        default_factory=dict
    )
    translations_applied: List[str] = Field(
        default_factory=list,
        description="List of translation dictionaries applied"
    )
    resolved_at: datetime = Field(
        default_factory=datetime.utcnow
    )

3.2 JSON Schema - translation-v1.schema.json

{
  "$schema": "http://json-schema.org/draft-07/schema#",
  "$id": "https://dss.dev/schemas/translation-v1.schema.json",
  "title": "DSS Translation Dictionary",
  "description": "Schema for DSS translation dictionary files",
  "type": "object",
  "required": ["$schema", "project", "source"],
  "properties": {
    "$schema": {
      "type": "string",
      "const": "dss-translation-v1",
      "description": "Schema version identifier"
    },
    "project": {
      "type": "string",
      "minLength": 1,
      "description": "Project identifier"
    },
    "source": {
      "type": "string",
      "enum": ["figma", "css", "scss", "heroui", "shadcn", "tailwind", "json", "custom"],
      "description": "Source type for this dictionary"
    },
    "version": {
      "type": "string",
      "pattern": "^\\d+\\.\\d+\\.\\d+$",
      "default": "1.0.0",
      "description": "Semantic version"
    },
    "mappings": {
      "type": "object",
      "properties": {
        "tokens": {
          "type": "object",
          "additionalProperties": {
            "type": "string",
            "pattern": "^[a-z][a-z0-9]*\\.[a-z][a-z0-9]*(\\.[a-z0-9]+)*$"
          },
          "description": "Token mappings: source -> DSS canonical path"
        },
        "components": {
          "type": "object",
          "additionalProperties": {
            "type": "string"
          },
          "description": "Component mappings"
        },
        "patterns": {
          "type": "object",
          "additionalProperties": {
            "type": "string"
          },
          "description": "Pattern mappings"
        }
      },
      "additionalProperties": false
    },
    "custom_props": {
      "type": "object",
      "propertyNames": {
        "pattern": "^[a-z][a-z0-9]*\\.[a-z][a-z0-9]*(\\.[a-z0-9]+)*$"
      },
      "description": "Custom properties in DSS namespace"
    },
    "unmapped": {
      "type": "array",
      "items": {
        "type": "string"
      },
      "description": "Source tokens that couldn't be mapped"
    },
    "notes": {
      "type": "array",
      "items": {
        "type": "string"
      },
      "description": "Human-readable notes"
    }
  },
  "additionalProperties": false
}
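The token-path pattern shared by mappings.tokens and custom_props propertyNames can be exercised directly with Python's re module; the example paths are illustrative:

```python
import re

# Same pattern as in translation-v1.schema.json.
DSS_PATH = re.compile(r"^[a-z][a-z0-9]*\.[a-z][a-z0-9]*(\.[a-z0-9]+)*$")

assert DSS_PATH.match("color.primary.500")   # canonical token path
assert DSS_PATH.match("spacing.md")          # two segments is the minimum
assert not DSS_PATH.match("color")           # single segment rejected
assert not DSS_PATH.match("Color.Primary")   # uppercase rejected
assert not DSS_PATH.match("color..primary")  # empty segment rejected
```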

4. Core Classes & Methods

4.1 loader.py - Dictionary Loader

"""
Translation Dictionary Loader

Loads and parses translation dictionaries from project .dss directory.
"""

import json
from pathlib import Path
from typing import Dict, List, Optional, Union
from .models import TranslationDictionary, TranslationSource, TranslationRegistry
from .validator import TranslationValidator


class TranslationDictionaryLoader:
    """
    Loads translation dictionaries from project .dss/translations/ directory.

    Usage:
        loader = TranslationDictionaryLoader("/path/to/project")
        registry = await loader.load_all()

        # Or load specific dictionary
        figma_dict = await loader.load_dictionary("figma")
    """

    DEFAULT_DIR = ".dss/translations"

    def __init__(
        self,
        project_path: Union[str, Path],
        translations_dir: Optional[str] = None,
        validate: bool = True
    ):
        """
        Initialize loader.

        Args:
            project_path: Root path to project
            translations_dir: Custom translations directory (default: .dss/translations)
            validate: Whether to validate dictionaries on load
        """
        self.project_path = Path(project_path).resolve()
        self.translations_dir = self.project_path / (
            translations_dir or self.DEFAULT_DIR
        )
        self.validate = validate
        self.validator = TranslationValidator() if validate else None

    async def load_all(self) -> TranslationRegistry:
        """
        Load all translation dictionaries from project.

        Returns:
            TranslationRegistry with all loaded dictionaries
        """
        registry = TranslationRegistry()

        if not self.translations_dir.exists():
            return registry

        for json_file in self.translations_dir.glob("*.json"):
            try:
                dictionary = await self.load_dictionary_file(json_file)
                if dictionary:
                    registry.dictionaries[dictionary.source.value] = dictionary
                    self._merge_to_registry(registry, dictionary)
            except Exception as e:
                # Log error but continue loading other dictionaries
                registry.conflicts.append({
                    "file": str(json_file),
                    "error": str(e),
                    "type": "load_error"
                })

        return registry

    async def load_dictionary(
        self,
        source: Union[str, TranslationSource]
    ) -> Optional[TranslationDictionary]:
        """
        Load a specific translation dictionary by source type.

        Args:
            source: Source type (e.g., "figma", "css", TranslationSource.FIGMA)

        Returns:
            TranslationDictionary or None if not found
        """
        if isinstance(source, str):
            source = TranslationSource(source)

        file_path = self.translations_dir / f"{source.value}.json"
        if not file_path.exists():
            return None

        return await self.load_dictionary_file(file_path)

    async def load_dictionary_file(
        self,
        file_path: Union[str, Path]
    ) -> Optional[TranslationDictionary]:
        """
        Load a translation dictionary from a specific file.

        Args:
            file_path: Path to JSON file

        Returns:
            TranslationDictionary

        Raises:
            FileNotFoundError: If the dictionary file does not exist
            ValueError: If validation is enabled and the dictionary is invalid
        """
        file_path = Path(file_path)
        if not file_path.exists():
            raise FileNotFoundError(f"Dictionary file not found: {file_path}")

        with open(file_path, 'r', encoding='utf-8') as f:
            data = json.load(f)

        # Validate if enabled
        if self.validator:
            validation_result = self.validator.validate_dictionary(data)
            if not validation_result.is_valid:
                raise ValueError(
                    f"Invalid dictionary {file_path}: "
                    f"{[str(e) for e in validation_result.errors]}"
                )

        return TranslationDictionary.model_validate(data)

    def _merge_to_registry(
        self,
        registry: TranslationRegistry,
        dictionary: TranslationDictionary
    ) -> None:
        """Merge dictionary mappings into registry."""
        # Merge token mappings
        for source_token, dss_token in dictionary.mappings.tokens.items():
            if source_token in registry.combined_token_map:
                existing = registry.combined_token_map[source_token]
                if existing != dss_token:
                    registry.conflicts.append({
                        "type": "token_conflict",
                        "source_token": source_token,
                        "existing_mapping": existing,
                        "new_mapping": dss_token,
                        "source": dictionary.source.value
                    })
                    continue
            registry.combined_token_map[source_token] = dss_token

        # Merge component mappings
        for source_comp, dss_comp in dictionary.mappings.components.items():
            if source_comp in registry.combined_component_map:
                existing = registry.combined_component_map[source_comp]
                if existing != dss_comp:
                    registry.conflicts.append({
                        "type": "component_conflict",
                        "source_component": source_comp,
                        "existing_mapping": existing,
                        "new_mapping": dss_comp,
                        "source": dictionary.source.value
                    })
                    continue
            registry.combined_component_map[source_comp] = dss_comp

        # Merge custom props
        for prop_name, prop_value in dictionary.custom_props.items():
            if prop_name in registry.all_custom_props:
                existing = registry.all_custom_props[prop_name]
                if existing != prop_value:
                    registry.conflicts.append({
                        "type": "custom_prop_conflict",
                        "prop_name": prop_name,
                        "existing_value": existing,
                        "new_value": prop_value,
                        "source": dictionary.source.value
                    })
                    continue
            registry.all_custom_props[prop_name] = prop_value

    def get_translations_dir(self) -> Path:
        """Get the translations directory path."""
        return self.translations_dir

    def has_translations(self) -> bool:
        """Check if project has any translation dictionaries."""
        if not self.translations_dir.exists():
            return False
        return any(self.translations_dir.glob("*.json"))

    def list_available_dictionaries(self) -> List[str]:
        """List available dictionary source types."""
        if not self.translations_dir.exists():
            return []
        return [
            f.stem for f in self.translations_dir.glob("*.json")
        ]
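
The three merge loops above share one policy: the first-loaded mapping wins, and later disagreements are recorded as conflicts rather than silently overwritten. A minimal standalone sketch of that policy (the function name and dict shapes are illustrative, not part of the planned API):

```python
from typing import Dict, List

def merge_with_conflicts(
    target: Dict[str, str],
    incoming: Dict[str, str],
    source_name: str,
) -> List[Dict[str, str]]:
    """Merge `incoming` into `target`, recording conflicts instead of
    overwriting. The first-seen mapping always wins."""
    conflicts: List[Dict[str, str]] = []
    for key, value in incoming.items():
        existing = target.get(key)
        if existing is not None and existing != value:
            conflicts.append({
                "key": key,
                "existing_mapping": existing,
                "new_mapping": value,
                "source": source_name,
            })
            continue  # keep the first-seen mapping
        target[key] = value
    return conflicts
```

Because conflicts carry the losing dictionary's source name, a later report can tell the user exactly which file to reconcile.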

4.2 resolver.py - Token Resolution

"""
Token Resolver

Resolves tokens between source formats and DSS canonical structure.
Supports bidirectional translation.
"""

from typing import Any, Dict, List, Optional, Union
from .models import (
    TranslationRegistry,
    TranslationSource,
    ResolvedToken,
)
from .canonical import DSS_CANONICAL_TOKENS


class TokenResolver:
    """
    Resolves tokens between source and DSS canonical formats.

    Supports:
    - Source -> DSS translation (forward)
    - DSS -> Source translation (reverse)
    - Token path resolution with aliasing
    - Reference chain resolution

    Usage:
        resolver = TokenResolver(registry)

        # Forward translation
        dss_token = resolver.resolve_to_dss("--brand-blue")
        # -> "color.primary.500"

        # Reverse translation
        source_token = resolver.resolve_to_source("color.primary.500", "css")
        # -> "--brand-blue"
    """

    def __init__(self, registry: TranslationRegistry):
        """
        Initialize resolver with translation registry.

        Args:
            registry: Loaded TranslationRegistry with mappings
        """
        self.registry = registry
        self._reverse_map: Dict[str, Dict[str, str]] = {}
        self._build_reverse_maps()

    def _build_reverse_maps(self) -> None:
        """Build reverse lookup maps (DSS -> source) for each source type."""
        for source_type, dictionary in self.registry.dictionaries.items():
            self._reverse_map[source_type] = {
                dss: source
                for source, dss in dictionary.mappings.tokens.items()
            }

    def resolve_to_dss(
        self,
        source_token: str,
        source_type: Optional[Union[str, TranslationSource]] = None
    ) -> Optional[str]:
        """
        Resolve source token to DSS canonical path.

        Args:
            source_token: Source token (e.g., "--brand-blue", "$primary")
            source_type: Optional source type hint (searches all if not provided)

        Returns:
            DSS canonical path or None if not found
        """
        # Direct lookup in combined map
        if source_token in self.registry.combined_token_map:
            return self.registry.combined_token_map[source_token]

        # If source type specified, look only there
        if source_type:
            if isinstance(source_type, str):
                source_type = TranslationSource(source_type)
            dictionary = self.registry.dictionaries.get(source_type.value)
            if dictionary:
                return dictionary.mappings.tokens.get(source_token)

        # Fallback: normalize the name (strip prefixes, unify separators).
        # This only helps when dictionary keys were authored in dot-notation.
        normalized = self._normalize_token_name(source_token)
        return self.registry.combined_token_map.get(normalized)

    def resolve_to_source(
        self,
        dss_token: str,
        source_type: Union[str, TranslationSource]
    ) -> Optional[str]:
        """
        Resolve DSS token to source format (reverse translation).

        Args:
            dss_token: DSS canonical path (e.g., "color.primary.500")
            source_type: Target source type

        Returns:
            Source token name or None if not mapped
        """
        if isinstance(source_type, str):
            source_type_str = source_type
        else:
            source_type_str = source_type.value

        reverse_map = self._reverse_map.get(source_type_str, {})
        return reverse_map.get(dss_token)

    def resolve_token_value(
        self,
        source_token: str,
        base_theme_tokens: Dict[str, Any],
        source_type: Optional[Union[str, TranslationSource]] = None
    ) -> Optional[ResolvedToken]:
        """
        Fully resolve a source token to its DSS value.

        Args:
            source_token: Source token name
            base_theme_tokens: Base theme token values
            source_type: Optional source type hint

        Returns:
            ResolvedToken with full provenance or None
        """
        # Get DSS path
        dss_path = self.resolve_to_dss(source_token, source_type)
        if not dss_path:
            # Check if it's a custom prop
            if source_token in self.registry.all_custom_props:
                return ResolvedToken(
                    dss_path=source_token,
                    value=self.registry.all_custom_props[source_token],
                    source_token=source_token,
                    is_custom=True,
                    provenance=[f"custom_prop: {source_token}"]
                )
            return None

        # Resolve value from base theme
        value = self._get_token_value(dss_path, base_theme_tokens)

        # Determine source type if not provided, normalizing to the enum
        resolved_source = source_type
        if resolved_source is None:
            for src_type, dictionary in self.registry.dictionaries.items():
                if source_token in dictionary.mappings.tokens:
                    resolved_source = src_type
                    break
        if isinstance(resolved_source, str):
            resolved_source = TranslationSource(resolved_source)

        return ResolvedToken(
            dss_path=dss_path,
            value=value,
            source_token=source_token,
            source_type=resolved_source,
            is_custom=False,
            provenance=[
                f"source: {source_token}",
                f"mapped_to: {dss_path}",
                f"value: {value}"
            ]
        )

    def resolve_all_mappings(
        self,
        base_theme_tokens: Dict[str, Any]
    ) -> Dict[str, ResolvedToken]:
        """
        Resolve all mapped tokens to their DSS values.

        Args:
            base_theme_tokens: Base theme token values

        Returns:
            Dict of DSS path -> ResolvedToken
        """
        resolved = {}

        # Resolve all mapped tokens
        for source_token, dss_path in self.registry.combined_token_map.items():
            value = self._get_token_value(dss_path, base_theme_tokens)

            # Find source type
            source_type = None
            for src_type, dictionary in self.registry.dictionaries.items():
                if source_token in dictionary.mappings.tokens:
                    source_type = TranslationSource(src_type)
                    break

            resolved[dss_path] = ResolvedToken(
                dss_path=dss_path,
                value=value,
                source_token=source_token,
                source_type=source_type,
                is_custom=False,
                provenance=[f"source: {source_token}", f"mapped_to: {dss_path}"]
            )

        # Add custom props
        for prop_name, prop_value in self.registry.all_custom_props.items():
            resolved[prop_name] = ResolvedToken(
                dss_path=prop_name,
                value=prop_value,
                is_custom=True,
                provenance=[f"custom_prop: {prop_name}"]
            )

        return resolved

    def _get_token_value(
        self,
        dss_path: str,
        base_tokens: Dict[str, Any]
    ) -> Any:
        """Get token value from base theme using DSS path."""
        # Handle nested paths (e.g., "color.primary.500")
        parts = dss_path.split('.')
        current = base_tokens

        for part in parts:
            if isinstance(current, dict):
                current = current.get(part)
                if current is None:
                    break
            else:
                return None

        # If we got a DesignToken object, extract value
        if hasattr(current, 'value'):
            return current.value

        return current

    def _normalize_token_name(self, token: str) -> str:
        """Normalize token name for lookup."""
        normalized = token.strip()

        # Unwrap var() references first, before separators are rewritten
        if normalized.startswith('var(') and normalized.endswith(')'):
            normalized = normalized[4:-1]

        # Remove common prefixes (CSS custom-property dashes, SCSS $)
        normalized = normalized.lstrip('-$')

        # Convert various formats to dot notation
        normalized = normalized.replace('-', '.').replace('_', '.')

        return normalized.lower()

    def get_unmapped_tokens(self) -> List[str]:
        """Get a sorted, de-duplicated list of tokens that couldn't be mapped."""
        unmapped = []
        for dictionary in self.registry.dictionaries.values():
            unmapped.extend(dictionary.unmapped)
        return sorted(set(unmapped))

    def validate_dss_path(self, path: str) -> bool:
        """Validate that a path matches DSS canonical structure."""
        return path in DSS_CANONICAL_TOKENS or self._is_valid_custom_namespace(path)

    def _is_valid_custom_namespace(self, path: str) -> bool:
        """Check if path uses valid custom namespace."""
        parts = path.split('.')
        if len(parts) < 3:
            return False
        # Custom props should be like: color.brand.acme.primary
        return parts[1] == 'brand' or parts[1] == 'custom'
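
One subtlety in `_build_reverse_maps`: inversion is lossy when several source tokens target the same DSS path, because the dict comprehension keeps whichever pair is iterated last. A minimal reproduction (the helper name is illustrative):

```python
from typing import Dict

def build_reverse_map(forward: Dict[str, str]) -> Dict[str, str]:
    """Invert a source->DSS map for reverse (DSS -> source) lookups.
    When several source tokens share one DSS path, the pair iterated
    last wins, matching TokenResolver._build_reverse_maps."""
    return {dss: source for source, dss in forward.items()}
```

If round-tripping every alias matters, the reverse map would need to hold a list of source tokens per DSS path instead of a single string.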

4.3 merger.py - Theme Merger

"""
Theme Merger

Merges base DSS theme with translation mappings and custom props.
"""

from typing import Any, Dict, List, Optional, Union
from datetime import datetime
from .models import (
    TranslationRegistry,
    ResolvedToken,
    ResolvedTheme,
)
from .resolver import TokenResolver
from dss.models.theme import Theme, DesignToken, TokenCategory
from dss.themes.default_themes import get_default_light_theme, get_default_dark_theme


class ThemeMerger:
    """
    Merges base DSS theme with project-specific customizations.

    The merge hierarchy:
    1. Base Theme (DSS Light or Dark)
    2. Translation Mappings (external tokens -> DSS)
    3. Custom Props (project-specific extensions)

    Usage:
        merger = ThemeMerger(registry)
        resolved = await merger.merge(base_theme="light")
    """

    def __init__(self, registry: TranslationRegistry):
        """
        Initialize merger with translation registry.

        Args:
            registry: TranslationRegistry with loaded dictionaries
        """
        self.registry = registry
        self.resolver = TokenResolver(registry)

    async def merge(
        self,
        base_theme: str = "light",
        project_name: Optional[str] = None
    ) -> ResolvedTheme:
        """
        Merge base theme with translations and custom props.

        Args:
            base_theme: Base theme name ("light" or "dark")
            project_name: Project name for resolved theme

        Returns:
            ResolvedTheme with all tokens resolved
        """
        # Get base theme
        if base_theme == "light":
            theme = get_default_light_theme()
        elif base_theme == "dark":
            theme = get_default_dark_theme()
        else:
            raise ValueError(f"Unknown base theme: {base_theme}")

        # Convert theme tokens to dict for resolution
        base_tokens = self._theme_to_dict(theme)

        # Resolve all mapped tokens
        resolved_tokens = self.resolver.resolve_all_mappings(base_tokens)

        # Separate core tokens from custom props
        core_tokens = {}
        custom_props = {}

        for dss_path, resolved in resolved_tokens.items():
            if resolved.is_custom:
                custom_props[dss_path] = resolved
            else:
                core_tokens[dss_path] = resolved

        # Add base theme tokens that aren't in mappings
        for token_name, token in theme.tokens.items():
            # Normalize token name to DSS path
            dss_path = self._normalize_to_dss_path(token_name)
            if dss_path not in core_tokens:
                core_tokens[dss_path] = ResolvedToken(
                    dss_path=dss_path,
                    value=token.value,
                    is_custom=False,
                    provenance=[f"base_theme: {base_theme}"]
                )

        return ResolvedTheme(
            name=project_name or f"resolved-{base_theme}",
            version="1.0.0",
            base_theme=base_theme,
            tokens=core_tokens,
            custom_props=custom_props,
            translations_applied=list(self.registry.dictionaries.keys()),
            resolved_at=datetime.utcnow()
        )

    def _theme_to_dict(self, theme: Theme) -> Dict[str, Any]:
        """Convert Theme object to nested dict for resolution."""
        result = {}
        for token_name, token in theme.tokens.items():
            # Convert flat token names to nested structure
            parts = self._normalize_to_dss_path(token_name).split('.')
            current = result
            for part in parts[:-1]:
                if part not in current:
                    current[part] = {}
                current = current[part]
            current[parts[-1]] = token.value
        return result

    def _normalize_to_dss_path(self, token_name: str) -> str:
        """Normalize token name to DSS canonical path."""
        # Handle various formats
        normalized = token_name.replace('-', '.').replace('_', '.')

        # Map common prefixes
        prefix_map = {
            'space.': 'spacing.',
            'radius.': 'border.radius.',
            'text.': 'typography.size.',
        }

        for old, new in prefix_map.items():
            if normalized.startswith(old):
                normalized = new + normalized[len(old):]
                break

        return normalized

    async def merge_custom_props(
        self,
        resolved_theme: ResolvedTheme,
        additional_props: Dict[str, Any]
    ) -> ResolvedTheme:
        """
        Add additional custom props to a resolved theme.

        Args:
            resolved_theme: Existing resolved theme
            additional_props: Additional custom props to merge

        Returns:
            Updated ResolvedTheme
        """
        for prop_name, prop_value in additional_props.items():
            resolved_theme.custom_props[prop_name] = ResolvedToken(
                dss_path=prop_name,
                value=prop_value,
                is_custom=True,
                provenance=["additional_custom_prop"]
            )

        resolved_theme.resolved_at = datetime.utcnow()
        return resolved_theme

    def export_as_theme(self, resolved: ResolvedTheme) -> Theme:
        """
        Convert ResolvedTheme back to Theme model.

        Args:
            resolved: ResolvedTheme to convert

        Returns:
            Theme model instance
        """
        tokens = {}

        # Add core tokens
        for dss_path, resolved_token in resolved.tokens.items():
            token_name = dss_path.replace('.', '-')
            tokens[token_name] = DesignToken(
                name=token_name,
                value=resolved_token.value,
                type=self._infer_type(dss_path, resolved_token.value),
                category=self._infer_category(dss_path),
                source=f"resolved:{resolved.base_theme}"
            )

        # Add custom props
        for dss_path, resolved_token in resolved.custom_props.items():
            token_name = dss_path.replace('.', '-')
            tokens[token_name] = DesignToken(
                name=token_name,
                value=resolved_token.value,
                type=self._infer_type(dss_path, resolved_token.value),
                category=TokenCategory.OTHER,
                source="custom_prop"
            )

        return Theme(
            name=resolved.name,
            version=resolved.version,
            tokens=tokens
        )

    def _infer_type(self, path: str, value: Any) -> str:
        """Infer token type from path and value."""
        if 'color' in path:
            return 'color'
        if 'spacing' in path or 'size' in path or 'radius' in path:
            return 'dimension'
        if 'font' in path:
            return 'typography'
        if 'shadow' in path:
            return 'shadow'
        return 'string'

    def _infer_category(self, path: str) -> TokenCategory:
        """Infer token category from DSS path."""
        if path.startswith('color'):
            return TokenCategory.COLOR
        if path.startswith('spacing'):
            return TokenCategory.SPACING
        if path.startswith('typography') or path.startswith('font'):
            return TokenCategory.TYPOGRAPHY
        if path.startswith('border') or path.startswith('radius'):
            return TokenCategory.RADIUS
        if path.startswith('shadow'):
            return TokenCategory.SHADOW
        return TokenCategory.OTHER
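
`_theme_to_dict` expands flat dotted token names into the nested dict that `_get_token_value` later walks. A self-contained sketch of that expansion step (the helper name is illustrative):

```python
from typing import Any, Dict

def dotted_to_nested(flat: Dict[str, Any]) -> Dict[str, Any]:
    """Expand {'color.primary.500': '#0ea5e9'} into nested dicts,
    the same shape ThemeMerger._theme_to_dict produces."""
    result: Dict[str, Any] = {}
    for path, value in flat.items():
        parts = path.split('.')
        current = result
        for part in parts[:-1]:
            # setdefault collapses the membership check and assignment
            current = current.setdefault(part, {})
        current[parts[-1]] = value
    return result
```

Note the invariant this assumes: no path may be both a leaf and a prefix of another path (e.g. `color.primary` alongside `color.primary.500`), or the later write would clobber the earlier one.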

4.4 validator.py - Validation Engine

"""
Translation Dictionary Validator

Validates translation dictionary schema and semantic correctness.
"""

import json
import re
from pathlib import Path
from typing import Any, Dict, List, Optional
from pydantic import ValidationError as PydanticValidationError
from .models import TranslationDictionary, TranslationSource
from .canonical import DSS_CANONICAL_TOKENS, DSS_CANONICAL_COMPONENTS


class ValidationError:
    """Single validation error."""

    def __init__(
        self,
        message: str,
        path: Optional[str] = None,
        severity: str = "error"
    ):
        self.message = message
        self.path = path
        self.severity = severity  # error, warning, info

    def __str__(self) -> str:
        if self.path:
            return f"[{self.severity}] {self.path}: {self.message}"
        return f"[{self.severity}] {self.message}"


class ValidationResult:
    """Validation result container."""

    def __init__(self):
        self.is_valid = True
        self.errors: List[ValidationError] = []
        self.warnings: List[ValidationError] = []
        self.info: List[ValidationError] = []

    def add_error(self, message: str, path: Optional[str] = None) -> None:
        self.errors.append(ValidationError(message, path, "error"))
        self.is_valid = False

    def add_warning(self, message: str, path: Optional[str] = None) -> None:
        self.warnings.append(ValidationError(message, path, "warning"))

    def add_info(self, message: str, path: Optional[str] = None) -> None:
        self.info.append(ValidationError(message, path, "info"))


class TranslationValidator:
    """
    Validates translation dictionaries.

    Validation stages:
    1. Schema validation - JSON structure matches Pydantic model
    2. Token path validation - DSS paths are canonical
    3. Component validation - Component mappings are valid
    4. Custom prop validation - Namespacing is correct
    5. Consistency validation - No conflicts or duplicates
    """

    # Valid DSS path pattern. Segments after the first may be numeric
    # ("500", "2xl") or camelCase ("lineHeight"), both of which occur
    # in the canonical token set.
    DSS_PATH_PATTERN = re.compile(
        r'^[a-z][a-zA-Z0-9]*(\.[a-zA-Z0-9]+)+$'
    )

    def __init__(
        self,
        strict: bool = False,
        allow_unknown_tokens: bool = True
    ):
        """
        Initialize validator.

        Args:
            strict: If True, unknown DSS tokens are errors (not warnings)
            allow_unknown_tokens: If False, all tokens must exist in canonical
        """
        self.strict = strict
        self.allow_unknown_tokens = allow_unknown_tokens

    def validate_dictionary(
        self,
        data: Dict[str, Any]
    ) -> ValidationResult:
        """
        Validate a translation dictionary.

        Args:
            data: Dictionary data to validate

        Returns:
            ValidationResult with all errors/warnings
        """
        result = ValidationResult()

        # Stage 1: Schema validation
        self._validate_schema(data, result)
        if not result.is_valid:
            return result

        # Stage 2: Token path validation
        self._validate_token_paths(data, result)

        # Stage 3: Component validation
        self._validate_components(data, result)

        # Stage 4: Custom prop validation
        self._validate_custom_props(data, result)

        # Stage 5: Consistency validation
        self._validate_consistency(data, result)

        return result

    def _validate_schema(
        self,
        data: Dict[str, Any],
        result: ValidationResult
    ) -> None:
        """Stage 1: Validate JSON structure."""
        try:
            TranslationDictionary(**data)
        except PydanticValidationError as e:
            for error in e.errors():
                path = ".".join(str(loc) for loc in error["loc"])
                result.add_error(error["msg"], path)
        except Exception as e:
            result.add_error(f"Schema validation failed: {str(e)}")

    def _validate_token_paths(
        self,
        data: Dict[str, Any],
        result: ValidationResult
    ) -> None:
        """Stage 2: Validate DSS token paths."""
        mappings = data.get("mappings", {})
        tokens = mappings.get("tokens", {})

        for source_token, dss_path in tokens.items():
            # Validate path format
            if not self.DSS_PATH_PATTERN.match(dss_path):
                result.add_error(
                    f"Invalid DSS path format: '{dss_path}' "
                    "(must be dot-notation like 'color.primary.500')",
                    f"mappings.tokens.{source_token}"
                )
                continue

            # Validate against canonical structure
            if dss_path not in DSS_CANONICAL_TOKENS:
                if self._is_custom_namespace(dss_path):
                    # Custom namespaces are allowed
                    result.add_info(
                        f"Custom namespace token: {dss_path}",
                        f"mappings.tokens.{source_token}"
                    )
                elif self.allow_unknown_tokens:
                    result.add_warning(
                        f"DSS token not in canonical structure: {dss_path}",
                        f"mappings.tokens.{source_token}"
                    )
                else:
                    result.add_error(
                        f"Unknown DSS token: {dss_path}",
                        f"mappings.tokens.{source_token}"
                    )

    def _validate_components(
        self,
        data: Dict[str, Any],
        result: ValidationResult
    ) -> None:
        """Stage 3: Validate component mappings."""
        mappings = data.get("mappings", {})
        components = mappings.get("components", {})

        for source_comp, dss_comp in components.items():
            # Extract base component name (before any variant specifiers)
            base_comp = dss_comp.split('[')[0]

            if base_comp not in DSS_CANONICAL_COMPONENTS:
                result.add_warning(
                    f"Component not in DSS canonical set: {base_comp}",
                    f"mappings.components.{source_comp}"
                )

            # Validate variant syntax if present
            if '[' in dss_comp:
                if not self._validate_variant_syntax(dss_comp):
                    result.add_error(
                        f"Invalid variant syntax: {dss_comp}",
                        f"mappings.components.{source_comp}"
                    )

    def _validate_custom_props(
        self,
        data: Dict[str, Any],
        result: ValidationResult
    ) -> None:
        """Stage 4: Validate custom prop namespacing."""
        custom_props = data.get("custom_props", {})

        for prop_name, prop_value in custom_props.items():
            # Must use dot notation
            if '.' not in prop_name:
                result.add_error(
                    f"Custom prop must use dot-notation namespace: {prop_name}",
                    f"custom_props.{prop_name}"
                )
                continue

            # Should use brand/custom namespace
            parts = prop_name.split('.')
            if len(parts) >= 2 and parts[1] not in ('brand', 'custom'):
                result.add_warning(
                    f"Custom prop should use 'brand' or 'custom' namespace: {prop_name}. "
                    f"Recommended: {parts[0]}.brand.{'.'.join(parts[1:])}",
                    f"custom_props.{prop_name}"
                )

    def _validate_consistency(
        self,
        data: Dict[str, Any],
        result: ValidationResult
    ) -> None:
        """Stage 5: Validate internal consistency."""
        mappings = data.get("mappings", {})
        tokens = mappings.get("tokens", {})
        custom_props = data.get("custom_props", {})

        # Check for duplicate DSS targets
        dss_targets = list(tokens.values())
        seen = set()
        for target in dss_targets:
            if target in seen:
                result.add_warning(
                    f"Multiple source tokens map to same DSS token: {target}",
                    "mappings.tokens"
                )
            seen.add(target)

        # Check custom props don't conflict with mappings
        for prop_name in custom_props.keys():
            if prop_name in tokens.values():
                result.add_error(
                    f"Custom prop conflicts with mapping target: {prop_name}",
                    f"custom_props.{prop_name}"
                )

    def _is_custom_namespace(self, path: str) -> bool:
        """Check if path uses custom namespace."""
        parts = path.split('.')
        if len(parts) >= 2:
            return parts[1] in ('brand', 'custom')
        return False

    def _validate_variant_syntax(self, comp: str) -> bool:
        """Validate component variant syntax like Button[variant=primary]."""
        if '[' not in comp:
            return True

        # Check for matching brackets
        if comp.count('[') != comp.count(']'):
            return False

        # Extract variant part
        variant_match = re.search(r'\[([^\]]+)\]', comp)
        if not variant_match:
            return False

        # Validate key=value format
        variant_str = variant_match.group(1)
        for pair in variant_str.split(','):
            if '=' not in pair:
                return False
            key, value = pair.split('=', 1)
            if not key.strip() or not value.strip():
                return False

        return True

    def validate_file(self, file_path: str) -> ValidationResult:
        """
        Validate a translation dictionary file.

        Args:
            file_path: Path to JSON file

        Returns:
            ValidationResult
        """
        result = ValidationResult()

        try:
            with open(file_path, 'r', encoding='utf-8') as f:
                data = json.load(f)
        except json.JSONDecodeError as e:
            result.add_error(f"Invalid JSON: {str(e)}")
            return result
        except FileNotFoundError:
            result.add_error(f"File not found: {file_path}")
            return result

        return self.validate_dictionary(data)
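
The variant grammar accepted by `_validate_variant_syntax` can be exercised in isolation; the sketch below mirrors its checks (the standalone helper name is illustrative):

```python
import re

def is_valid_variant_syntax(comp: str) -> bool:
    """Accept 'Button[variant=primary,size=lg]'-style specifiers,
    mirroring TranslationValidator._validate_variant_syntax."""
    if '[' not in comp:
        return True  # plain component name, no variant specifier
    if comp.count('[') != comp.count(']'):
        return False  # unbalanced brackets
    match = re.search(r'\[([^\]]+)\]', comp)
    if not match:
        return False  # empty or malformed specifier
    for pair in match.group(1).split(','):
        # partition yields an empty value when '=' is missing
        key, _, value = pair.partition('=')
        if not key.strip() or not value.strip():
            return False
    return True
```

Keeping this grammar tiny (comma-separated `key=value` pairs, no nesting) is deliberate: anything richer belongs in the component mapping itself, not the dictionary syntax.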

4.5 canonical.py - DSS Canonical Definitions

"""
DSS Canonical Structure Definitions

Defines the immutable DSS canonical token and component structure.
These definitions are used for validation and auto-completion.
"""

from typing import Set, Dict, List

# DSS Canonical Token Paths
# These are the core tokens that DSS defines
DSS_CANONICAL_TOKENS: Set[str] = {
    # Colors - Primary
    "color.primary.50",
    "color.primary.100",
    "color.primary.200",
    "color.primary.300",
    "color.primary.400",
    "color.primary.500",
    "color.primary.600",
    "color.primary.700",
    "color.primary.800",
    "color.primary.900",

    # Colors - Secondary
    "color.secondary.50",
    "color.secondary.100",
    "color.secondary.200",
    "color.secondary.300",
    "color.secondary.400",
    "color.secondary.500",
    "color.secondary.600",
    "color.secondary.700",
    "color.secondary.800",
    "color.secondary.900",

    # Colors - Neutral
    "color.neutral.50",
    "color.neutral.100",
    "color.neutral.200",
    "color.neutral.300",
    "color.neutral.400",
    "color.neutral.500",
    "color.neutral.600",
    "color.neutral.700",
    "color.neutral.800",
    "color.neutral.900",

    # Colors - Semantic
    "color.success.500",
    "color.warning.500",
    "color.danger.500",
    "color.info.500",
    "color.accent.500",

    # Colors - Surface
    "color.background",
    "color.foreground",
    "color.muted",
    "color.border",
    "color.ring",

    # Spacing
    "spacing.xs",
    "spacing.sm",
    "spacing.md",
    "spacing.lg",
    "spacing.xl",
    "spacing.2xl",
    "spacing.base",

    # Typography - Size
    "typography.size.xs",
    "typography.size.sm",
    "typography.size.base",
    "typography.size.lg",
    "typography.size.xl",
    "typography.size.2xl",
    "typography.size.3xl",
    "typography.size.4xl",

    # Typography - Weight
    "typography.weight.light",
    "typography.weight.normal",
    "typography.weight.medium",
    "typography.weight.semibold",
    "typography.weight.bold",

    # Typography - Line Height
    "typography.lineHeight.tight",
    "typography.lineHeight.normal",
    "typography.lineHeight.relaxed",

    # Typography - Font Family
    "typography.family.sans",
    "typography.family.serif",
    "typography.family.mono",

    # Border Radius
    "border.radius.none",
    "border.radius.sm",
    "border.radius.md",
    "border.radius.lg",
    "border.radius.xl",
    "border.radius.full",

    # Border Width
    "border.width.none",
    "border.width.thin",
    "border.width.default",
    "border.width.thick",

    # Shadows
    "shadow.none",
    "shadow.sm",
    "shadow.md",
    "shadow.lg",
    "shadow.xl",
    "shadow.inner",

    # Motion - Duration
    "motion.duration.instant",
    "motion.duration.fast",
    "motion.duration.normal",
    "motion.duration.slow",

    # Motion - Easing
    "motion.easing.linear",
    "motion.easing.ease",
    "motion.easing.easeIn",
    "motion.easing.easeOut",
    "motion.easing.easeInOut",

    # Z-Index
    "zIndex.base",
    "zIndex.dropdown",
    "zIndex.sticky",
    "zIndex.fixed",
    "zIndex.modal",
    "zIndex.popover",
    "zIndex.tooltip",

    # Opacity
    "opacity.0",
    "opacity.25",
    "opacity.50",
    "opacity.75",
    "opacity.100",

    # Breakpoints
    "breakpoint.sm",
    "breakpoint.md",
    "breakpoint.lg",
    "breakpoint.xl",
    "breakpoint.2xl",
}

# Commonly used aliases for DSS tokens
DSS_TOKEN_ALIASES: Dict[str, str] = {
    # Color aliases
    "color.primary": "color.primary.500",
    "color.secondary": "color.secondary.500",
    "color.success": "color.success.500",
    "color.warning": "color.warning.500",
    "color.danger": "color.danger.500",
    "color.destructive": "color.danger.500",
    "color.error": "color.danger.500",

    # Spacing aliases
    "space.xs": "spacing.xs",
    "space.sm": "spacing.sm",
    "space.md": "spacing.md",
    "space.lg": "spacing.lg",
    "space.xl": "spacing.xl",

    # Radius aliases
    "radius.sm": "border.radius.sm",
    "radius.md": "border.radius.md",
    "radius.lg": "border.radius.lg",

    # Typography aliases
    "font.size.base": "typography.size.base",
    "font.weight.bold": "typography.weight.bold",
    "lineHeight.normal": "typography.lineHeight.normal",
}

# DSS Canonical Components
DSS_CANONICAL_COMPONENTS: Set[str] = {
    # Primitives
    "Button",
    "Input",
    "Textarea",
    "Select",
    "Checkbox",
    "Radio",
    "RadioGroup",
    "Switch",
    "Slider",
    "Toggle",

    # Layout
    "Box",
    "Flex",
    "Grid",
    "Container",
    "Stack",
    "Spacer",
    "Divider",

    # Data Display
    "Card",
    "Avatar",
    "Badge",
    "Chip",
    "Tag",
    "Icon",
    "Image",
    "Table",
    "List",
    "ListItem",

    # Feedback
    "Alert",
    "Toast",
    "Progress",
    "Spinner",
    "Skeleton",
    "Tooltip",

    # Overlay
    "Modal",
    "Dialog",
    "Drawer",
    "Popover",
    "Dropdown",
    "DropdownMenu",
    "ContextMenu",

    # Navigation
    "Tabs",
    "TabList",
    "Tab",
    "TabPanel",
    "Breadcrumb",
    "Pagination",
    "Menu",
    "MenuItem",
    "NavLink",
    "Link",

    # Typography
    "Text",
    "Heading",
    "Label",
    "Code",

    # Forms
    "Form",
    "FormControl",
    "FormLabel",
    "FormHelperText",
    "FormErrorMessage",
}

# DSS Component Variants
DSS_COMPONENT_VARIANTS: Dict[str, List[str]] = {
    "Button": ["variant", "size", "colorScheme", "isDisabled", "isLoading"],
    "Input": ["variant", "size", "isDisabled", "isInvalid", "isReadOnly"],
    "Card": ["variant", "size", "shadow"],
    "Badge": ["variant", "colorScheme", "size"],
    "Alert": ["status", "variant"],
    "Modal": ["size", "isCentered", "scrollBehavior"],
}

# Valid variant values
DSS_VARIANT_VALUES: Dict[str, Dict[str, List[str]]] = {
    "Button": {
        "variant": ["solid", "outline", "ghost", "link", "unstyled"],
        "size": ["xs", "sm", "md", "lg"],
        "colorScheme": ["primary", "secondary", "success", "warning", "danger"],
    },
    "Input": {
        "variant": ["outline", "filled", "flushed", "unstyled"],
        "size": ["xs", "sm", "md", "lg"],
    },
    "Card": {
        "variant": ["elevated", "outline", "filled", "unstyled"],
        "size": ["sm", "md", "lg"],
    },
}
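
The variant tables can back a simple advisory check. The `is_valid_variant_value` helper below is an illustrative sketch (not part of the planned module surface) that treats unknown components or props as valid, so the check warns without blocking:

```python
from typing import Dict, List

# Trimmed copy of the DSS_VARIANT_VALUES table, for a self-contained example
DSS_VARIANT_VALUES: Dict[str, Dict[str, List[str]]] = {
    "Button": {
        "variant": ["solid", "outline", "ghost", "link", "unstyled"],
        "size": ["xs", "sm", "md", "lg"],
    },
}


def is_valid_variant_value(component: str, prop: str, value: str) -> bool:
    """Return True when value is allowed for the component prop.

    Unknown components/props yield True so the check stays advisory.
    """
    allowed = DSS_VARIANT_VALUES.get(component, {}).get(prop)
    return True if allowed is None else value in allowed


assert is_valid_variant_value("Button", "variant", "ghost")
assert not is_valid_variant_value("Button", "size", "xxl")
assert is_valid_variant_value("Card", "shadow", "anything")  # unknown -> advisory pass
```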


def get_canonical_token_categories() -> Dict[str, List[str]]:
    """Get tokens organized by category."""
    categories: Dict[str, List[str]] = {}

    for token in DSS_CANONICAL_TOKENS:
        parts = token.split('.')
        category = parts[0]
        if category not in categories:
            categories[category] = []
        categories[category].append(token)

    return categories


def is_valid_dss_token(path: str) -> bool:
    """Check if token path is in canonical structure or valid custom namespace."""
    if path in DSS_CANONICAL_TOKENS:
        return True

    # Check aliases
    if path in DSS_TOKEN_ALIASES:
        return True

    # Check custom namespace
    parts = path.split('.')
    if len(parts) >= 3 and parts[1] in ('brand', 'custom'):
        return True

    return False


def resolve_alias(path: str) -> str:
    """Resolve token alias to canonical path."""
    return DSS_TOKEN_ALIASES.get(path, path)
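
The helpers above combine as follows. This self-contained snippet uses trimmed copies of the canonical tables to demonstrate alias pass-through and the `brand`/`custom` namespace rule:

```python
from typing import Dict, Set

# Trimmed copies of the canonical tables, just enough to exercise the helpers
DSS_CANONICAL_TOKENS: Set[str] = {"color.primary.500", "spacing.sm"}
DSS_TOKEN_ALIASES: Dict[str, str] = {"color.primary": "color.primary.500"}


def resolve_alias(path: str) -> str:
    return DSS_TOKEN_ALIASES.get(path, path)


def is_valid_dss_token(path: str) -> bool:
    if path in DSS_CANONICAL_TOKENS or path in DSS_TOKEN_ALIASES:
        return True
    parts = path.split('.')
    # Custom namespace rule: second segment must be 'brand' or 'custom'
    return len(parts) >= 3 and parts[1] in ('brand', 'custom')


assert resolve_alias("color.primary") == "color.primary.500"
assert resolve_alias("spacing.sm") == "spacing.sm"       # non-alias passes through
assert is_valid_dss_token("color.brand.acme.accent")     # client namespace accepted
assert not is_valid_dss_token("colour.primary.500")      # unknown path rejected
```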

### 4.6 `writer.py` - Dictionary Writer

"""
Translation Dictionary Writer

Writes and updates translation dictionary files.
"""

import json
from datetime import datetime
from pathlib import Path
from typing import Any, Dict, List, Optional, Union
from .models import TranslationDictionary, TranslationSource, TranslationMappings


class TranslationDictionaryWriter:
    """
    Writes translation dictionaries to project .dss/translations/ directory.

    Usage:
        writer = TranslationDictionaryWriter("/path/to/project")

        # Create new dictionary
        await writer.create(
            source=TranslationSource.CSS,
            project="my-project",
            token_mappings={"--brand-blue": "color.primary.500"}
        )

        # Add mapping to existing dictionary
        await writer.add_mapping(
            source=TranslationSource.CSS,
            source_token="--brand-green",
            dss_token="color.success.500"
        )
    """

    DEFAULT_DIR = ".dss/translations"

    def __init__(
        self,
        project_path: Union[str, Path],
        translations_dir: Optional[str] = None
    ):
        """
        Initialize writer.

        Args:
            project_path: Root path to project
            translations_dir: Custom translations directory
        """
        self.project_path = Path(project_path).resolve()
        self.translations_dir = self.project_path / (
            translations_dir or self.DEFAULT_DIR
        )

    async def create(
        self,
        source: Union[str, TranslationSource],
        project: str,
        token_mappings: Optional[Dict[str, str]] = None,
        component_mappings: Optional[Dict[str, str]] = None,
        custom_props: Optional[Dict[str, Any]] = None,
        notes: Optional[List[str]] = None
    ) -> TranslationDictionary:
        """
        Create a new translation dictionary.

        Args:
            source: Source type
            project: Project identifier
            token_mappings: Initial token mappings
            component_mappings: Initial component mappings
            custom_props: Initial custom props
            notes: Optional notes

        Returns:
            Created TranslationDictionary
        """
        if isinstance(source, str):
            source = TranslationSource(source)

        # Ensure directory exists
        self.translations_dir.mkdir(parents=True, exist_ok=True)

        # Create dictionary
        dictionary = TranslationDictionary(
            project=project,
            source=source,
            mappings=TranslationMappings(
                tokens=token_mappings or {},
                components=component_mappings or {},
            ),
            custom_props=custom_props or {},
            notes=notes or []
        )

        # Write to file
        file_path = self.translations_dir / f"{source.value}.json"
        await self._write_file(file_path, dictionary)

        return dictionary

    async def update(
        self,
        source: Union[str, TranslationSource],
        token_mappings: Optional[Dict[str, str]] = None,
        component_mappings: Optional[Dict[str, str]] = None,
        custom_props: Optional[Dict[str, Any]] = None,
        notes: Optional[List[str]] = None
    ) -> TranslationDictionary:
        """
        Update an existing translation dictionary.

        Args:
            source: Source type
            token_mappings: Token mappings to add/update
            component_mappings: Component mappings to add/update
            custom_props: Custom props to add/update
            notes: Notes to append

        Returns:
            Updated TranslationDictionary
        """
        if isinstance(source, str):
            source = TranslationSource(source)

        file_path = self.translations_dir / f"{source.value}.json"
        if not file_path.exists():
            raise FileNotFoundError(
                f"Dictionary not found: {file_path}. Use create() first."
            )

        # Load existing
        with open(file_path, 'r', encoding='utf-8') as f:
            data = json.load(f)

        dictionary = TranslationDictionary.model_validate(data)  # handles the "$schema" alias

        # Update mappings
        if token_mappings:
            dictionary.mappings.tokens.update(token_mappings)
        if component_mappings:
            dictionary.mappings.components.update(component_mappings)
        if custom_props:
            dictionary.custom_props.update(custom_props)
        if notes:
            dictionary.notes.extend(notes)

        dictionary.updated_at = datetime.utcnow()

        # Write back
        await self._write_file(file_path, dictionary)

        return dictionary

    async def add_mapping(
        self,
        source: Union[str, TranslationSource],
        source_token: str,
        dss_token: str
    ) -> None:
        """
        Add a single token mapping to a dictionary.

        Args:
            source: Source type
            source_token: Source token name
            dss_token: DSS canonical path
        """
        await self.update(
            source=source,
            token_mappings={source_token: dss_token}
        )

    async def add_custom_prop(
        self,
        source: Union[str, TranslationSource],
        prop_name: str,
        prop_value: Any
    ) -> None:
        """
        Add a custom prop to a dictionary.

        Args:
            source: Source type
            prop_name: Property name (must use DSS namespace)
            prop_value: Property value
        """
        # Validate namespace
        if '.' not in prop_name:
            raise ValueError(
                f"Custom prop must use dot-notation namespace: {prop_name}"
            )

        await self.update(
            source=source,
            custom_props={prop_name: prop_value}
        )

    async def remove_mapping(
        self,
        source: Union[str, TranslationSource],
        source_token: str
    ) -> None:
        """
        Remove a token mapping from a dictionary.

        Args:
            source: Source type
            source_token: Source token to remove
        """
        if isinstance(source, str):
            source = TranslationSource(source)

        file_path = self.translations_dir / f"{source.value}.json"
        if not file_path.exists():
            return

        with open(file_path, 'r', encoding='utf-8') as f:
            data = json.load(f)

        dictionary = TranslationDictionary.model_validate(data)  # handles the "$schema" alias

        if source_token in dictionary.mappings.tokens:
            del dictionary.mappings.tokens[source_token]
            dictionary.updated_at = datetime.utcnow()
            await self._write_file(file_path, dictionary)

    async def mark_unmapped(
        self,
        source: Union[str, TranslationSource],
        unmapped_tokens: List[str]
    ) -> None:
        """
        Add tokens to unmapped list.

        Args:
            source: Source type
            unmapped_tokens: List of tokens that couldn't be mapped
        """
        if isinstance(source, str):
            source = TranslationSource(source)

        file_path = self.translations_dir / f"{source.value}.json"
        if not file_path.exists():
            return

        with open(file_path, 'r', encoding='utf-8') as f:
            data = json.load(f)

        dictionary = TranslationDictionary.model_validate(data)  # handles the "$schema" alias

        # Add unique unmapped tokens
        existing = set(dictionary.unmapped)
        for token in unmapped_tokens:
            if token not in existing:
                dictionary.unmapped.append(token)

        dictionary.updated_at = datetime.utcnow()
        await self._write_file(file_path, dictionary)

    async def _write_file(
        self,
        file_path: Path,
        dictionary: TranslationDictionary
    ) -> None:
        """Write dictionary to JSON file."""
        data = dictionary.model_dump(by_alias=True, mode='json')

        # Convert datetime to ISO format
        data['created_at'] = dictionary.created_at.isoformat()
        data['updated_at'] = dictionary.updated_at.isoformat()

        with open(file_path, 'w', encoding='utf-8') as f:
            json.dump(data, f, indent=2, ensure_ascii=False)

    def delete(
        self,
        source: Union[str, TranslationSource]
    ) -> bool:
        """
        Delete a translation dictionary file.

        Args:
            source: Source type

        Returns:
            True if deleted, False if not found
        """
        if isinstance(source, str):
            source = TranslationSource(source)

        file_path = self.translations_dir / f"{source.value}.json"
        if file_path.exists():
            file_path.unlink()
            return True
        return False
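
`_write_file` above truncates the target file in place; if a crash mid-write must never leave a corrupt dictionary, a temp-file-plus-rename variant is a common alternative. This is a sketch of that option, not part of the plan:

```python
import json
import os
import tempfile
from pathlib import Path
from typing import Any, Dict


def write_json_atomic(file_path: Path, data: Dict[str, Any]) -> None:
    """Write JSON via a same-directory temp file, then os.replace.

    os.replace is atomic on POSIX, so a reader never observes a
    half-written dictionary even if the process dies mid-write.
    """
    fd, tmp_name = tempfile.mkstemp(
        dir=file_path.parent, prefix=file_path.name, suffix=".tmp"
    )
    try:
        with os.fdopen(fd, "w", encoding="utf-8") as f:
            json.dump(data, f, indent=2, ensure_ascii=False)
        os.replace(tmp_name, file_path)
    except BaseException:
        os.unlink(tmp_name)  # clean up the temp file on any failure
        raise
```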

### 4.7 `__init__.py` - Module Exports

"""
DSS Translation Dictionary Module

Provides translation between external design token formats and DSS canonical structure.
"""

from .models import (
    TranslationSource,
    MappingType,
    TokenMapping,
    ComponentMapping,
    PatternMapping,
    CustomProp,
    TranslationMappings,
    TranslationDictionary,
    TranslationRegistry,
    ResolvedToken,
    ResolvedTheme,
)

from .loader import TranslationDictionaryLoader
from .resolver import TokenResolver
from .merger import ThemeMerger
from .validator import TranslationValidator, ValidationResult, ValidationError
from .writer import TranslationDictionaryWriter
from .canonical import (
    DSS_CANONICAL_TOKENS,
    DSS_CANONICAL_COMPONENTS,
    DSS_TOKEN_ALIASES,
    DSS_COMPONENT_VARIANTS,
    is_valid_dss_token,
    resolve_alias,
    get_canonical_token_categories,
)

__all__ = [
    # Models
    "TranslationSource",
    "MappingType",
    "TokenMapping",
    "ComponentMapping",
    "PatternMapping",
    "CustomProp",
    "TranslationMappings",
    "TranslationDictionary",
    "TranslationRegistry",
    "ResolvedToken",
    "ResolvedTheme",

    # Loader
    "TranslationDictionaryLoader",

    # Resolver
    "TokenResolver",

    # Merger
    "ThemeMerger",

    # Validator
    "TranslationValidator",
    "ValidationResult",
    "ValidationError",

    # Writer
    "TranslationDictionaryWriter",

    # Canonical Definitions
    "DSS_CANONICAL_TOKENS",
    "DSS_CANONICAL_COMPONENTS",
    "DSS_TOKEN_ALIASES",
    "DSS_COMPONENT_VARIANTS",
    "is_valid_dss_token",
    "resolve_alias",
    "get_canonical_token_categories",
]

## 5. Implementation Phases

### Phase 1: Foundation (Days 1-2)

Day 1:
+-- models.py              # Complete data models
+-- canonical.py           # DSS canonical definitions
+-- __init__.py            # Module exports

Day 2:
+-- loader.py              # Basic dictionary loading
+-- validator.py           # Schema validation only
+-- schemas/
    +-- translation-v1.schema.json

**Deliverables:**

- All Pydantic models defined
- Basic loading from `.dss/translations/`
- Schema validation working
- Unit tests for models

### Phase 2: Core Functionality (Days 3-4)

Day 3:
+-- resolver.py            # Token resolution
+-- utils.py               # Utility functions

Day 4:
+-- merger.py              # Theme merging
+-- writer.py              # Dictionary writing

**Deliverables:**

- Bidirectional token resolution
- Theme + custom props merging
- Create/update dictionary files
- Integration tests

### Phase 3: Presets & Polish (Day 5)

Day 5:
+-- presets/
    +-- heroui.json        # HeroUI translations
    +-- shadcn.json        # shadcn translations
    +-- tailwind.json      # Tailwind translations
+-- Enhanced validation
+-- Full test coverage

**Deliverables:**

- Pre-built translation dictionaries
- Semantic validation (not just schema)
- 90%+ test coverage
- Documentation

## 6. Testing Strategy

### 6.1 Test Structure

tests/unit/translations/
|
+-- test_models.py         # Model validation
+-- test_loader.py         # Dictionary loading
+-- test_resolver.py       # Token resolution
+-- test_merger.py         # Theme merging
+-- test_validator.py      # Validation logic
+-- test_writer.py         # File writing
+-- test_canonical.py      # Canonical definitions

tests/integration/translations/
|
+-- test_full_workflow.py  # End-to-end workflows
+-- test_mcp_integration.py # MCP tool integration

tests/fixtures/translations/
|
+-- valid_dictionary.json  # Valid test dictionaries
+-- invalid_schema.json    # Invalid schema tests
+-- conflict_test.json     # Conflict detection tests

### 6.2 Key Test Cases

# tests/unit/translations/test_resolver.py

import pytest
from dss.translations import (
    TranslationRegistry,
    TranslationDictionary,
    TranslationSource,
    TokenResolver,
)


class TestTokenResolver:
    """Test suite for TokenResolver."""

    @pytest.fixture
    def sample_registry(self):
        """Create registry with sample mappings."""
        dictionary = TranslationDictionary(
            project="test",
            source=TranslationSource.CSS,
            mappings={
                "tokens": {
                    "--brand-blue": "color.primary.500",
                    "--brand-dark": "color.primary.700",
                    "--text-main": "color.neutral.900",
                }
            }
        )
        registry = TranslationRegistry()
        registry.dictionaries["css"] = dictionary
        registry.combined_token_map = dictionary.mappings.tokens
        return registry

    def test_forward_resolution(self, sample_registry):
        """Test source -> DSS resolution."""
        resolver = TokenResolver(sample_registry)

        result = resolver.resolve_to_dss("--brand-blue")
        assert result == "color.primary.500"

    def test_reverse_resolution(self, sample_registry):
        """Test DSS -> source resolution."""
        resolver = TokenResolver(sample_registry)

        result = resolver.resolve_to_source("color.primary.500", "css")
        assert result == "--brand-blue"

    def test_unknown_token_returns_none(self, sample_registry):
        """Test that unknown tokens return None."""
        resolver = TokenResolver(sample_registry)

        result = resolver.resolve_to_dss("--unknown-token")
        assert result is None

    def test_normalize_var_reference(self, sample_registry):
        """Test normalization of var() references."""
        resolver = TokenResolver(sample_registry)

        # Should normalize var(--brand-blue) to --brand-blue
        result = resolver.resolve_to_dss("var(--brand-blue)")
        assert result == "color.primary.500"

### 6.3 Integration Test Example

# tests/integration/translations/test_full_workflow.py

import pytest
import tempfile
from pathlib import Path
from dss.translations import (
    TranslationDictionaryLoader,
    TranslationDictionaryWriter,
    ThemeMerger,
    TokenResolver,
    TranslationSource,
)


class TestFullWorkflow:
    """Test complete translation workflow."""

    @pytest.fixture
    def temp_project(self):
        """Create temporary project directory."""
        with tempfile.TemporaryDirectory() as tmpdir:
            project_path = Path(tmpdir)
            translations_dir = project_path / ".dss" / "translations"
            translations_dir.mkdir(parents=True)
            yield project_path

    @pytest.mark.asyncio
    async def test_create_load_resolve_workflow(self, temp_project):
        """Test: Create dictionary -> Load -> Resolve -> Merge."""

        # 1. Create dictionary
        writer = TranslationDictionaryWriter(temp_project)
        await writer.create(
            source=TranslationSource.CSS,
            project="test-project",
            token_mappings={
                "--brand-blue": "color.primary.500",
                "--space-sm": "spacing.sm",
            },
            custom_props={
                "color.brand.test.accent": "#FF5733"
            }
        )

        # 2. Load dictionary
        loader = TranslationDictionaryLoader(temp_project)
        registry = await loader.load_all()

        assert len(registry.dictionaries) == 1
        assert "css" in registry.dictionaries

        # 3. Resolve tokens
        resolver = TokenResolver(registry)

        dss_path = resolver.resolve_to_dss("--brand-blue")
        assert dss_path == "color.primary.500"

        source_token = resolver.resolve_to_source("color.primary.500", "css")
        assert source_token == "--brand-blue"

        # 4. Merge with base theme
        merger = ThemeMerger(registry)
        resolved_theme = await merger.merge(base_theme="light", project_name="test")

        assert resolved_theme.name == "test"
        assert "color.brand.test.accent" in resolved_theme.custom_props
        assert len(resolved_theme.translations_applied) == 1

## 7. MCP Integration Plan

### 7.1 New MCP Tools

After the Python module is complete, add these MCP tools:

# tools/dss_mcp/integrations/translations.py

from typing import Dict, Any, List, Optional
from dss.translations import (
    TranslationDictionaryLoader,
    TranslationDictionaryWriter,
    ThemeMerger,
    TokenResolver,
    TranslationValidator,
    TranslationSource,
)


class TranslationIntegration:
    """MCP integration for translation dictionaries."""

    async def translation_load_all(
        self,
        project_path: str
    ) -> Dict[str, Any]:
        """
        Load all translation dictionaries for a project.

        MCP Tool: translation_load_all
        """
        loader = TranslationDictionaryLoader(project_path)
        registry = await loader.load_all()

        return {
            "dictionaries": [
                dict_obj.model_dump(mode='json')
                for dict_obj in registry.dictionaries.values()
            ],
            "total_token_mappings": len(registry.combined_token_map),
            "total_custom_props": len(registry.all_custom_props),
            "conflicts": registry.conflicts,
        }

    async def translation_resolve_token(
        self,
        project_path: str,
        source_token: str,
        source_type: Optional[str] = None
    ) -> Dict[str, Any]:
        """
        Resolve a source token to DSS canonical.

        MCP Tool: translation_resolve_token
        """
        loader = TranslationDictionaryLoader(project_path)
        registry = await loader.load_all()
        resolver = TokenResolver(registry)

        dss_path = resolver.resolve_to_dss(source_token, source_type)

        return {
            "source_token": source_token,
            "dss_path": dss_path,
            "found": dss_path is not None,
        }

    async def translation_create_dictionary(
        self,
        project_path: str,
        source: str,
        project_name: str,
        token_mappings: Optional[Dict[str, str]] = None,
        custom_props: Optional[Dict[str, Any]] = None
    ) -> Dict[str, Any]:
        """
        Create a new translation dictionary.

        MCP Tool: translation_create
        """
        writer = TranslationDictionaryWriter(project_path)
        dictionary = await writer.create(
            source=TranslationSource(source),
            project=project_name,
            token_mappings=token_mappings,
            custom_props=custom_props,
        )

        return {
            "success": True,
            "dictionary": dictionary.model_dump(mode='json'),
            "file_path": str(writer.translations_dir / f"{source}.json"),
        }

    async def translation_add_mapping(
        self,
        project_path: str,
        source: str,
        source_token: str,
        dss_token: str
    ) -> Dict[str, Any]:
        """
        Add a token mapping to an existing dictionary.

        MCP Tool: translation_add_mapping
        """
        writer = TranslationDictionaryWriter(project_path)
        await writer.add_mapping(
            source=TranslationSource(source),
            source_token=source_token,
            dss_token=dss_token,
        )

        return {
            "success": True,
            "mapping": {source_token: dss_token},
        }

    async def translation_validate(
        self,
        project_path: str,
        source: Optional[str] = None
    ) -> Dict[str, Any]:
        """
        Validate translation dictionaries.

        MCP Tool: translation_validate
        """
        loader = TranslationDictionaryLoader(project_path, validate=False)
        validator = TranslationValidator()

        results = []

        if source:
            dictionary = await loader.load_dictionary(source)
            if dictionary:
                result = validator.validate_dictionary(dictionary.model_dump())
                results.append({
                    "source": source,
                    "is_valid": result.is_valid,
                    "errors": [str(e) for e in result.errors],
                    "warnings": [str(w) for w in result.warnings],
                })
        else:
            for dict_file in loader.translations_dir.glob("*.json"):
                dictionary = await loader.load_dictionary_file(dict_file)
                result = validator.validate_dictionary(dictionary.model_dump())
                results.append({
                    "source": dict_file.stem,
                    "is_valid": result.is_valid,
                    "errors": [str(e) for e in result.errors],
                    "warnings": [str(w) for w in result.warnings],
                })

        all_valid = all(r["is_valid"] for r in results)

        return {
            "all_valid": all_valid,
            "results": results,
        }

    async def translation_merge_theme(
        self,
        project_path: str,
        base_theme: str = "light",
        project_name: Optional[str] = None
    ) -> Dict[str, Any]:
        """
        Merge base theme with translations and custom props.

        MCP Tool: translation_merge_theme
        """
        loader = TranslationDictionaryLoader(project_path)
        registry = await loader.load_all()
        merger = ThemeMerger(registry)

        resolved = await merger.merge(
            base_theme=base_theme,
            project_name=project_name,
        )

        return {
            "name": resolved.name,
            "base_theme": resolved.base_theme,
            "token_count": len(resolved.tokens),
            "custom_prop_count": len(resolved.custom_props),
            "translations_applied": resolved.translations_applied,
            "tokens": {
                path: token.model_dump(mode='json')
                for path, token in resolved.tokens.items()
            },
            "custom_props": {
                path: token.model_dump(mode='json')
                for path, token in resolved.custom_props.items()
            },
        }

### 7.2 MCP Tool Manifest Update

# Add to tools/dss_mcp/handler.py

TRANSLATION_TOOLS = [
    {
        "name": "translation_load_all",
        "description": "Load all translation dictionaries for a project",
        "parameters": {
            "project_path": {"type": "string", "required": True}
        }
    },
    {
        "name": "translation_resolve_token",
        "description": "Resolve a source token to DSS canonical path",
        "parameters": {
            "project_path": {"type": "string", "required": True},
            "source_token": {"type": "string", "required": True},
            "source_type": {"type": "string", "required": False}
        }
    },
    {
        "name": "translation_create",
        "description": "Create a new translation dictionary",
        "parameters": {
            "project_path": {"type": "string", "required": True},
            "source": {"type": "string", "required": True},
            "project_name": {"type": "string", "required": True},
            "token_mappings": {"type": "object", "required": False},
            "custom_props": {"type": "object", "required": False}
        }
    },
    {
        "name": "translation_add_mapping",
        "description": "Add a token mapping to a dictionary",
        "parameters": {
            "project_path": {"type": "string", "required": True},
            "source": {"type": "string", "required": True},
            "source_token": {"type": "string", "required": True},
            "dss_token": {"type": "string", "required": True}
        }
    },
    {
        "name": "translation_validate",
        "description": "Validate translation dictionaries",
        "parameters": {
            "project_path": {"type": "string", "required": True},
            "source": {"type": "string", "required": False}
        }
    },
    {
        "name": "translation_merge_theme",
        "description": "Merge base theme with translations and custom props",
        "parameters": {
            "project_path": {"type": "string", "required": True},
            "base_theme": {"type": "string", "required": False, "default": "light"},
            "project_name": {"type": "string", "required": False}
        }
    }
]
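
Wiring these tool definitions to the `TranslationIntegration` methods can be as simple as a name-to-coroutine map. The sketch below stubs the integration class; the real registration depends on the existing `dss_mcp` handler code:

```python
import asyncio
from typing import Any, Awaitable, Callable, Dict, Optional


# Stand-in for the real TranslationIntegration defined in section 7.1
class TranslationIntegration:
    async def translation_resolve_token(
        self,
        project_path: str,
        source_token: str,
        source_type: Optional[str] = None,
    ) -> Dict[str, Any]:
        # Real version loads the registry and resolves; stubbed here
        return {"source_token": source_token, "dss_path": None, "found": False}


integration = TranslationIntegration()

TOOL_HANDLERS: Dict[str, Callable[..., Awaitable[Dict[str, Any]]]] = {
    "translation_resolve_token": integration.translation_resolve_token,
}


async def dispatch(tool_name: str, params: Dict[str, Any]) -> Dict[str, Any]:
    """Route an MCP tool call to its integration coroutine."""
    handler = TOOL_HANDLERS.get(tool_name)
    if handler is None:
        raise ValueError(f"Unknown tool: {tool_name}")
    return await handler(**params)


result = asyncio.run(dispatch(
    "translation_resolve_token",
    {"project_path": "/tmp/project", "source_token": "--brand-blue"},
))
assert result["source_token"] == "--brand-blue"
```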

## 8. Example Usage

### 8.1 Example Translation Dictionary - HeroUI

{
  "$schema": "dss-translation-v1",
  "project": "heroui-migration",
  "source": "heroui",
  "version": "1.0.0",
  "mappings": {
    "tokens": {
      "--heroui-primary-50": "color.primary.50",
      "--heroui-primary-100": "color.primary.100",
      "--heroui-primary-200": "color.primary.200",
      "--heroui-primary-300": "color.primary.300",
      "--heroui-primary-400": "color.primary.400",
      "--heroui-primary-500": "color.primary.500",
      "--heroui-primary-600": "color.primary.600",
      "--heroui-primary-700": "color.primary.700",
      "--heroui-primary-800": "color.primary.800",
      "--heroui-primary-900": "color.primary.900",
      "--heroui-content1": "color.neutral.50",
      "--heroui-content2": "color.neutral.100",
      "--heroui-content3": "color.neutral.200",
      "--heroui-content4": "color.neutral.300",
      "--heroui-radius-small": "border.radius.sm",
      "--heroui-radius-medium": "border.radius.md",
      "--heroui-radius-large": "border.radius.lg",
      "--heroui-shadow-small": "shadow.sm",
      "--heroui-shadow-medium": "shadow.md",
      "--heroui-shadow-large": "shadow.lg"
    },
    "components": {
      "Button": "Button",
      "Card": "Card",
      "Input": "Input",
      "Modal": "Modal",
      "Dropdown": "Select"
    }
  },
  "custom_props": {},
  "unmapped": [],
  "notes": [
    "HeroUI uses numeric scales - direct 1:1 mapping",
    "Content layers map to neutral scale"
  ]
}

### 8.2 Example Translation Dictionary - shadcn

{
  "$schema": "dss-translation-v1",
  "project": "shadcn-migration",
  "source": "shadcn",
  "version": "1.0.0",
  "mappings": {
    "tokens": {
      "--background": "color.background",
      "--foreground": "color.foreground",
      "--primary": "color.primary.500",
      "--primary-foreground": "color.primary.50",
      "--secondary": "color.secondary.500",
      "--secondary-foreground": "color.secondary.50",
      "--muted": "color.neutral.200",
      "--muted-foreground": "color.neutral.600",
      "--accent": "color.accent.500",
      "--accent-foreground": "color.accent.50",
      "--destructive": "color.danger.500",
      "--card": "color.neutral.50",
      "--card-foreground": "color.foreground",
      "--popover": "color.neutral.50",
      "--popover-foreground": "color.foreground",
      "--border": "color.border",
      "--input": "color.neutral.200",
      "--ring": "color.ring",
      "--radius": "border.radius.md"
    },
    "components": {
      "Button": "Button",
      "Card": "Card",
      "Input": "Input",
      "Dialog": "Modal",
      "Popover": "Popover",
      "Select": "Select"
    }
  },
  "custom_props": {
    "color.brand.shadcn.chart.1": "hsl(12 76% 61%)",
    "color.brand.shadcn.chart.2": "hsl(173 58% 39%)",
    "color.brand.shadcn.chart.3": "hsl(197 37% 24%)",
    "color.brand.shadcn.sidebar.primary": "var(--sidebar-primary)",
    "color.brand.shadcn.sidebar.accent": "var(--sidebar-accent)"
  },
  "unmapped": [
    "--chart-1",
    "--chart-2",
    "--chart-3",
    "--chart-4",
    "--chart-5"
  ],
  "notes": [
    "shadcn is HEADLESS - no numeric color scales",
    "Semantic names expand to 500 (default) DSS values",
    "Chart colors are shadcn-specific, isolated in custom_props"
  ]
}

### 8.3 Example Python Usage

```python
# Example: Complete workflow

import asyncio
from dss.translations import (
    TranslationDictionaryLoader,
    TranslationDictionaryWriter,
    ThemeMerger,
    TokenResolver,
    TranslationSource,
)


async def main():
    project_path = "/path/to/my-project"

    # 1. Create translation dictionaries for project
    writer = TranslationDictionaryWriter(project_path)

    # Create CSS legacy mappings
    await writer.create(
        source=TranslationSource.CSS,
        project="my-project",
        token_mappings={
            "--brand-primary": "color.primary.500",
            "--brand-secondary": "color.secondary.500",
            "--spacing-unit": "spacing.base",
            "--card-radius": "border.radius.md",
        },
        custom_props={
            "color.brand.myproject.gradient.start": "#FF6B6B",
            "color.brand.myproject.gradient.end": "#4ECDC4",
        },
        notes=["Migrated from legacy CSS variables"]
    )

    # 2. Load all dictionaries
    loader = TranslationDictionaryLoader(project_path)
    registry = await loader.load_all()

    print(f"Loaded {len(registry.dictionaries)} dictionaries")
    print(f"Total mappings: {len(registry.combined_token_map)}")
    print(f"Custom props: {len(registry.all_custom_props)}")

    # 3. Resolve tokens
    resolver = TokenResolver(registry)

    # Forward: source -> DSS
    dss_path = resolver.resolve_to_dss("--brand-primary")
    print(f"--brand-primary -> {dss_path}")

    # Reverse: DSS -> source
    source_token = resolver.resolve_to_source("color.primary.500", "css")
    print(f"color.primary.500 -> {source_token}")

    # 4. Merge with base theme
    merger = ThemeMerger(registry)
    resolved = await merger.merge(
        base_theme="light",
        project_name="my-project"
    )

    print(f"\nResolved theme: {resolved.name}")
    print(f"  Base: {resolved.base_theme}")
    print(f"  Tokens: {len(resolved.tokens)}")
    print(f"  Custom props: {len(resolved.custom_props)}")

    # 5. Export as Theme object (for Storybook integration)
    theme = merger.export_as_theme(resolved)
    print(f"\nExported Theme: {theme.name}")


if __name__ == "__main__":
    asyncio.run(main())
```

## 9. Dependencies

### 9.1 Python Dependencies

```text
# No new dependencies required!
# Uses existing DSS dependencies:

pydantic>=2.0.0          # Already in requirements.txt
pydantic-settings>=2.0.0 # Already in requirements.txt
```

### 9.2 Integration with Existing Modules

| Module | Integration Type | Description |
|--------|------------------|-------------|
| `dss.models.theme` | Import | Use `Theme`, `DesignToken` models |
| `dss.themes` | Import | Use default light/dark themes |
| `dss.ingest` | Complement | Translations work with ingested tokens |
| `dss.validators` | Pattern | Follow validation patterns |
| `dss.storybook` | Consumer | Storybook uses merged themes |
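To make the registry's role among these modules concrete, here is a rough, dependency-free sketch of the shapes it would tie together. The plan prescribes Pydantic models; plain dataclasses are used here only to keep the sketch self-contained, and all field names are assumptions mirroring the usage example in section 8.3:

```python
# Rough shape of the registry that ties the modules together. The real plan
# prescribes Pydantic models (see dss.models.theme in the table above); plain
# dataclasses are used here only to keep the sketch dependency-free. All field
# names are assumptions mirroring the usage example in section 8.3.

from dataclasses import dataclass, field
from typing import Dict, List

@dataclass
class TranslationDictionary:
    source: str                                  # "css", "figma", "shadcn", ...
    project: str
    token_mappings: Dict[str, str]               # source token -> DSS path
    custom_props: Dict[str, str] = field(default_factory=dict)
    notes: List[str] = field(default_factory=list)

@dataclass
class TranslationRegistry:
    dictionaries: List[TranslationDictionary] = field(default_factory=list)

    @property
    def combined_token_map(self) -> Dict[str, str]:
        # Later dictionaries win on conflicts (precedence rules, section 10).
        merged: Dict[str, str] = {}
        for d in self.dictionaries:
            merged.update(d.token_mappings)
        return merged

    @property
    def all_custom_props(self) -> Dict[str, str]:
        merged: Dict[str, str] = {}
        for d in self.dictionaries:
            merged.update(d.custom_props)
        return merged

css = TranslationDictionary(
    source="css", project="my-project",
    token_mappings={"--brand-primary": "color.primary.500"},
)
registry = TranslationRegistry(dictionaries=[css])
print(registry.combined_token_map["--brand-primary"])  # color.primary.500
```

Computing the combined maps lazily as properties (rather than materializing them at load time) keeps the per-dictionary data authoritative, which matters for the traceability goal stated in the executive summary.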

## 10. Risks & Mitigations

| Risk | Impact | Likelihood | Mitigation |
|------|--------|------------|------------|
| Schema changes break existing dictionaries | High | Medium | Version schema, provide migrations |
| Performance with large dictionaries | Medium | Low | Implement caching, lazy loading |
| Circular token references | High | Low | Detect cycles during resolution |
| Conflict resolution ambiguity | Medium | Medium | Clear precedence rules, manual override |
| MCP tool complexity | Medium | Medium | Simple API, comprehensive docs |
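The mitigation for circular token references is to detect cycles during resolution. A small sketch of that check, under the assumption that alias tokens are modelled as a flat mapping of token path to referenced token path:

```python
# Sketch: detect circular token references before resolution. Assumes alias
# tokens are modelled as a flat mapping of token path -> referenced token path
# (a non-alias value simply has no entry). Raises on the first cycle found.

from typing import Dict, List

def check_cycles(aliases: Dict[str, str]) -> None:
    """Raise ValueError if following alias references ever loops."""
    for start in aliases:
        seen: List[str] = [start]
        current = start
        while current in aliases:
            current = aliases[current]
            if current in seen:
                cycle = " -> ".join(seen + [current])
                raise ValueError(f"Circular token reference: {cycle}")
            seen.append(current)

# A safe chain resolves fine...
check_cycles({"--brand": "color.primary.500"})

# ...but a loop is rejected:
try:
    check_cycles({"a": "b", "b": "a"})
except ValueError as exc:
    print(exc)  # Circular token reference: a -> b -> a
```

Running the check once at dictionary-load time keeps resolution itself simple: the resolver can then follow aliases without re-checking for loops on every lookup.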

## 11. Success Criteria

### Phase 1 (Foundation)

- All Pydantic models pass validation tests
- Dictionary loader can read `.dss/translations/*.json`
- Schema validation catches invalid dictionaries
- 80%+ unit test coverage

### Phase 2 (Core)

- Forward resolution works (source -> DSS)
- Reverse resolution works (DSS -> source)
- Theme merger produces valid themes
- Writer can create/update dictionaries
- Integration tests pass
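The forward and reverse resolution criteria above can be pinned down as a round-trip property: resolving a source token to DSS and back must return the original token. A hedged sketch, with plain dicts standing in for the planned `TokenResolver`:

```python
# Sketch of the round-trip property behind the Phase 2 criteria: forward
# resolution (source -> DSS) composed with reverse resolution (DSS -> source)
# must be the identity for every mapped token. Plain dicts stand in for the
# planned TokenResolver; the mappings are illustrative.

forward = {
    "--brand-primary": "color.primary.500",
    "--brand-secondary": "color.secondary.500",
}
# Inverting assumes a one-to-one mapping; if several source tokens map to one
# DSS path, reverse resolution needs the source hint shown in section 8.3.
reverse = {dss: src for src, dss in forward.items()}

def resolve_to_dss(token: str) -> str:
    return forward[token]

def resolve_to_source(dss_path: str) -> str:
    return reverse[dss_path]

for source_token in forward:
    assert resolve_to_source(resolve_to_dss(source_token)) == source_token
print("round-trip holds for", len(forward), "tokens")
```

Expressing the criterion as a property over every mapped token (rather than a handful of hand-picked cases) makes it a natural candidate for the integration test suite.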

### Phase 3 (Polish)

- Pre-built dictionaries for HeroUI, shadcn, Tailwind
- Semantic validation (not just schema)
- 90%+ test coverage
- Documentation complete
- MCP integration working

## 12. Next Steps

1. Review this plan with core team
2. Approve architecture and data models
3. Begin Phase 1 implementation
4. Create GitHub issues for tracking
5. Set up CI/CD for test automation

## Appendix A: DSS Config Schema

```json
{
  "$schema": "http://json-schema.org/draft-07/schema#",
  "$id": "https://dss.dev/schemas/config.schema.json",
  "title": "DSS Project Configuration",
  "description": "Project-level DSS configuration",
  "type": "object",
  "required": ["project", "version"],
  "properties": {
    "project": {
      "type": "string",
      "minLength": 1,
      "description": "Project identifier"
    },
    "version": {
      "type": "string",
      "pattern": "^\\d+\\.\\d+\\.\\d+$",
      "description": "Configuration version"
    },
    "base_theme": {
      "type": "string",
      "enum": ["light", "dark"],
      "default": "light",
      "description": "Base theme to use"
    },
    "translations": {
      "type": "object",
      "properties": {
        "sources": {
          "type": "array",
          "items": {
            "type": "string",
            "enum": ["figma", "css", "scss", "heroui", "shadcn", "tailwind", "json"]
          },
          "description": "Active translation sources"
        },
        "auto_map": {
          "type": "boolean",
          "default": true,
          "description": "Enable automatic token mapping"
        }
      }
    },
    "output": {
      "type": "object",
      "properties": {
        "formats": {
          "type": "array",
          "items": {
            "type": "string",
            "enum": ["css", "scss", "json", "typescript"]
          },
          "description": "Output formats to generate"
        },
        "dir": {
          "type": "string",
          "default": "dist/tokens",
          "description": "Output directory"
        }
      }
    }
  }
}
```
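To show what enforcing this schema looks like in practice, here is a dependency-free sketch that spot-checks a project config against three of its constraints: the required keys, the semver `version` pattern, and the `base_theme` enum. A real implementation would hand the full schema to a JSON Schema validator library instead:

```python
# Spot-check a DSS project config against constraints from the schema above:
# required keys, the semver "version" pattern, and the "base_theme" enum.
# A real implementation would hand the full schema to a JSON Schema validator;
# this stdlib-only sketch covers just those three rules.

import re

def validate_config(config: dict) -> list:
    """Return a list of human-readable violations (empty list == valid)."""
    errors = []
    for key in ("project", "version"):            # schema: "required"
        if key not in config:
            errors.append(f"missing required key: {key}")
    version = config.get("version", "")
    if version and not re.fullmatch(r"\d+\.\d+\.\d+", version):
        errors.append(f"version {version!r} does not match ^\\d+\\.\\d+\\.\\d+$")
    theme = config.get("base_theme", "light")     # schema: default "light"
    if theme not in ("light", "dark"):
        errors.append(f"base_theme {theme!r} not in ['light', 'dark']")
    return errors

print(validate_config({"project": "my-project", "version": "1.0.0"}))  # []
print(validate_config({"project": "x", "version": "1.0", "base_theme": "sepia"}))
```

Returning a list of violations rather than raising on the first one mirrors how schema validators usually report, and gives MCP tooling a complete picture to surface to the user.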

Document End

This implementation plan provides a complete blueprint for the Translation Dictionary System. The architecture follows existing DSS patterns, uses Pydantic for data validation, and integrates seamlessly with the MCP plugin infrastructure.