Initial commit: Clean DSS implementation
Migrated from design-system-swarm with fresh git history.
Old project history preserved in /home/overbits/apps/design-system-swarm
Core components:
- MCP Server (Python FastAPI with mcp 1.23.1)
- Claude Plugin (agents, commands, skills, strategies, hooks, core)
- DSS Backend (dss-mvp1 - token translation, Figma sync)
- Admin UI (Node.js/React)
- Server (Node.js/Express)
- Storybook integration (dss-mvp1/.storybook)
Self-contained configuration:
- All paths relative or use DSS_BASE_PATH=/home/overbits/dss
- PYTHONPATH configured for dss-mvp1 and dss-claude-plugin
- .env file with all configuration
- Claude plugin uses ${CLAUDE_PLUGIN_ROOT} for portability
Migration completed: $(date)
🤖 Clean migration with full functionality preserved
# DSS Export/Import System - Complete Implementation Guide

## Overview

The DSS (Design System Studio) Export/Import system enables complete project archival, distribution, and restoration with round-trip fidelity. Projects can be exported to versioned `.dss` archive files and imported back with full metadata, relationships, and audit information preserved.

**Key Features:**

- ✅ Round-trip fidelity (export → import = identical state)
- ✅ Complete metadata preservation (source attribution, deprecation, timestamps)
- ✅ Multiple import strategies (Replace, Merge, Fork)
- ✅ UUID-based smart merging with conflict detection
- ✅ Schema versioning with automatic migrations
- ✅ Referential integrity validation
- ✅ Backward compatible (no breaking changes)

---
## Architecture Overview

### Shadow UUID Strategy

Instead of replacing existing integer IDs, we introduce UUIDs as a **transport-layer identity**:

```
Database (Runtime):             Archive Format (Transport):
┌──────────────────┐            ┌──────────────────┐
│ id (integer/str) │            │ UUID references  │
│ projects         │───────────▶│ .dss archive     │
│ components       │            │ (ZIP)            │
│ tokens           │            │                  │
└──────────────────┘            └──────────────────┘
     (unchanged)                    (new, export-only)
```

**Benefits:**

- No database schema breaking changes
- Backward compatible with existing code
- Clean separation of export/import logic from runtime logic (see the sketch below)
- Supports distributed collaboration
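
A minimal sketch of the idea, with illustrative names rather than the actual DSS internals: the runtime record keeps its integer/string `id`, and a UUID is attached only at the transport boundary.

```python
from uuid import uuid4

def to_transport(record: dict) -> dict:
    """Serialize a runtime record for the archive, attaching a UUID."""
    out = dict(record)
    out.setdefault("uuid", str(uuid4()))  # reuse an existing UUID if present
    return out

def from_transport(payload: dict, local_index: dict) -> dict:
    """Resolve an archive payload back to a runtime record.

    local_index maps uuid -> existing runtime id, so a re-import updates
    the same record instead of creating a duplicate.
    """
    record = dict(payload)
    record["id"] = local_index.get(payload["uuid"], record.get("id"))
    return record
```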
---

## Implementation Details

### Phase 1: Model Extensions (UUID Fields)

All models now include optional UUID fields with auto-generation:

```python
from dss.models.project import Project
from dss.models.component import Component
from dss.models.theme import DesignToken, Theme

# UUIDs are auto-generated if not provided
project = Project(
    id="my-project",   # Keep existing ID for runtime
    uuid="...",        # Auto-generated UUID
    name="My Design System"
)

token = DesignToken(
    uuid="...",        # Auto-generated
    name="primary",
    value="#3B82F6"
)

component = Component(
    uuid="...",        # Auto-generated
    name="Button"
)
```

**Database Schema Changes:**

- Added `uuid TEXT UNIQUE` columns to: `projects`, `components`, `styles`
- UUIDs nullable for backward compatibility
- Indexed for fast lookups (see the sketch below)
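
A minimal sketch of the corresponding DDL, assuming SQLite. SQLite cannot add a `UNIQUE` column via `ALTER TABLE`, so uniqueness comes from separate indexes, which also make `uuid` lookups fast. The index names are illustrative, and the `get_connection()` helper is the one shown under "Database Integration" below.

```python
from dss.storage.database import get_connection

DDL = [
    "ALTER TABLE projects   ADD COLUMN uuid TEXT",   # nullable: old rows stay valid
    "ALTER TABLE components ADD COLUMN uuid TEXT",
    "ALTER TABLE styles     ADD COLUMN uuid TEXT",
    # SQLite permits multiple NULLs in a unique index, so the columns can
    # stay empty until backfill_uuids() runs (see Database Integration).
    "CREATE UNIQUE INDEX IF NOT EXISTS idx_projects_uuid   ON projects(uuid)",
    "CREATE UNIQUE INDEX IF NOT EXISTS idx_components_uuid ON components(uuid)",
    "CREATE UNIQUE INDEX IF NOT EXISTS idx_styles_uuid     ON styles(uuid)",
]

with get_connection() as conn:
    for stmt in DDL:
        conn.execute(stmt)
```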
### Phase 2: Export (Archive Creation)

Creates versioned `.dss` ZIP archives with complete project state:

```
project.dss (ZIP Archive)
├── manifest.json       # Metadata, version, contents
├── config.json         # Project configuration
├── tokens.json         # All tokens with metadata
├── components.json     # Components with variants
├── themes.json         # Theme definitions
└── history/            # Future: audit logs
    └── audit-trail.json
```
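
Because the container is a plain ZIP, an archive can be inspected with the standard library alone. A quick sketch (the manifest's exact keys are an assumption based on the layout above; `analyze()` below is the supported inspection path):

```python
import json
import zipfile

with zipfile.ZipFile("project.dss") as zf:
    names = zf.namelist()                            # e.g. ['manifest.json', ...]
    manifest = json.loads(zf.read("manifest.json"))  # read without extracting

print(names)
print(manifest)
```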
**Example Export:**

```python
from pathlib import Path
from dss.export_import import DSSArchiveExporter

project = Project(...)  # Your DSS project

exporter = DSSArchiveExporter(project)
archive_path = exporter.export_to_file(Path("my-design-system.dss"))

print(f"Exported to: {archive_path}")
# Output: Exported to: my-design-system.dss
```
**Exported Token Format (W3C-compatible):**

```json
{
  "tokens": {
    "colors": {
      "primary": {
        "uuid": "f47ac10b-58cc-4372-a567-0e02b2c3d479",
        "$value": "#3B82F6",
        "$type": "color",
        "$category": "color",
        "$description": "Primary brand color",
        "$source": "figma:abc123",
        "$deprecated": false,
        "$createdAt": "2025-01-15T10:00:00Z",
        "$updatedAt": "2025-12-01T14:30:00Z"
      }
    }
  }
}
```
### Phase 3: Import (Archive Loading)

Loads `.dss` archives with three strategies:

#### 3a. REPLACE Strategy (Backup Restore)

Complete project replacement - ideal for backup restoration and cloning:

```python
from dss.export_import import DSSArchiveImporter

importer = DSSArchiveImporter(Path("backup.dss"))

# Analyze before importing (no modifications)
analysis = importer.analyze()
print(f"Valid: {analysis.is_valid}")
print(f"Errors: {[e.message for e in analysis.errors]}")

# Import with REPLACE (wipes all existing data)
imported_project = importer.import_replace()
```

**Validation Pipeline** (sketched below):

1. Archive integrity check (valid ZIP, required files present)
2. Manifest validation (required fields, version format)
3. Schema version compatibility (migrations if needed)
4. Structural validation (JSON format, required keys)
5. Referential integrity (all UUIDs resolve, no dangling references)
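
How such a pipeline could be wired up, as an illustration only. The stage runner and the error shape are assumptions, not the real importer internals; in practice the results surface through `analysis.errors`.

```python
from typing import Callable, Iterable

def run_validation(
    archive_data: dict,
    stages: Iterable[tuple[str, Callable[[dict], list[str]]]],
) -> list[dict]:
    """Run validation stages in order; stop after the first failing stage,
    since later stages assume earlier ones passed."""
    errors: list[dict] = []
    for name, check in stages:
        errors += [{"stage": name, "message": msg} for msg in check(archive_data)]
        if errors:
            break
    return errors

# Example stage: "required files present" (part of the integrity check).
def check_required_files(data: dict) -> list[str]:
    required = {"manifest.json", "config.json", "tokens.json"}
    return [f"missing {name}" for name in required - set(data)]

print(run_validation({"manifest.json": {}}, [("archive", check_required_files)]))
```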
#### 3b. MERGE Strategy (Smart Update)

UUID-based intelligent merging for team collaboration:

```python
from dss.export_import import DSSArchiveImporter
from dss.export_import.merger import SmartMerger, ConflictResolutionMode

# Load imported project
importer = DSSArchiveImporter(Path("updated.dss"))
imported_project = importer.import_replace()

# Create merger
local_project = Project(...)  # Your current project
merger = SmartMerger(local_project, imported_project)

# Analyze merge
analysis = merger.analyze_merge()
print(f"New tokens: {len(analysis.new_items['tokens'])}")
print(f"Updated: {len(analysis.updated_items['tokens'])}")
print(f"Conflicts: {len(analysis.conflicted_items)}")

# Perform merge with strategy
merged = merger.merge_with_strategy(
    ConflictResolutionMode.OVERWRITE  # Import wins
)
```

**Merge Detection** (see the sketch after this list):

- New items: exist in the import but not locally → add
- Unchanged items: same UUID, same content hash → skip
- Updated items: same UUID, different hash, not modified locally → update
- Conflicts: same UUID, different hash, modified on both sides → flag for resolution
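
The detection rules reduce to a small classification function. A sketch under assumed item attributes (`uuid`, a content `hash`, and a local-modification flag; the real merger's internals may differ):

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Item:
    uuid: str
    hash: str
    modified_locally: bool = False

def classify(local: Optional[Item], imported: Item) -> str:
    """Classify one imported item against its local counterpart."""
    if local is None:
        return "new"          # exists only in the import
    if imported.hash == local.hash:
        return "unchanged"    # same UUID, same content -> skip
    if not local.modified_locally:
        return "update"       # only the import changed it
    return "conflict"         # both sides changed it -> needs a strategy

print(classify(Item("u1", "a"), Item("u1", "b")))  # -> "update"
```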
**Conflict Strategies:**

- `OVERWRITE`: Import version wins (timestamp-based guidance)
- `KEEP_LOCAL`: Keep local version (safest for local work)
- `FORK`: Create duplicate with new UUID (no data loss)

**Example Conflict Resolution:**

```python
from dss.export_import.merger import ConflictItem  # element type of conflicted_items

analysis = merger.analyze_merge()
for conflict in analysis.conflicted_items:
    print(f"Conflict: {conflict.entity_name}")
    print(f"  Local updated: {conflict.local_updated_at}")
    print(f"  Imported updated: {conflict.imported_updated_at}")
    print(f"  Local is newer: {conflict.local_is_newer}")
    print(f"  Imported is newer: {conflict.imported_is_newer}")

# Apply strategy
if analysis.has_conflicts:
    merged = merger.merge_with_strategy(
        ConflictResolutionMode.MANUAL  # Handled elsewhere
    )
```
### Phase 4: Schema Versioning & Migrations

Automatic schema evolution handling:

```python
from dss.export_import.migrations import MigrationManager

# Current schema version
current = MigrationManager.get_latest_version()
print(f"Current schema: {current}")  # "1.0.1"

# Automatic migration on import
archive_version = "1.0.0"  # Old archive
target_version = current

# NOTE: plain string comparison only works while every version segment
# stays single-digit; see the version-tuple sketch below for a robust check.
if archive_version < target_version:
    migrated_data = MigrationManager.migrate(
        archive_data,
        from_version=archive_version,
        to_version=target_version
    )
```

**Version Compatibility:**

- Old archives (v1.0.0) → auto-migrated to latest
- Current archives (v1.0.1) → no migration needed
- Newer archives (v1.0.2) → error: please update DSS (version checks sketched below)
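
A robust comparison sketch for when versions grow multi-digit segments (a hypothetical helper, not part of the DSS API):

```python
def parse_version(v: str) -> tuple[int, ...]:
    """Turn '1.0.10' into (1, 0, 10) so comparisons are numeric."""
    return tuple(int(part) for part in v.split("."))

assert parse_version("1.0.10") > parse_version("1.0.9")
# Plain string comparison gets this wrong: "1.0.10" < "1.0.9" is True.
```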
**Defining New Migrations:**

```python
from dss.export_import.migrations import SchemaMigration, MigrationManager

class MigrationV1_0_1_to_V1_0_2(SchemaMigration):
    source_version = "1.0.1"
    target_version = "1.0.2"

    def up(self, data):
        """Upgrade v1.0.1 → v1.0.2"""
        # Add new fields, transform structure, etc.
        if 'components' in data:
            for comp in data['components']:
                if 'deprecationMessage' not in comp:
                    comp['deprecationMessage'] = ""
        return data

    def down(self, data):
        """Rollback v1.0.2 → v1.0.1"""
        # Remove new fields
        if 'components' in data:
            for comp in data['components']:
                comp.pop('deprecationMessage', None)
        return data

# Register migration
MigrationManager.MIGRATIONS[("1.0.1", "1.0.2")] = MigrationV1_0_1_to_V1_0_2
MigrationManager.VERSIONS.append("1.0.2")
```

---
## Usage Examples

### 1. Backup & Restore Workflow

```python
from pathlib import Path
from datetime import datetime
from dss.export_import import DSSArchiveExporter, DSSArchiveImporter

# BACKUP
def backup_project(project, backup_dir="./backups"):
    timestamp = datetime.utcnow().strftime("%Y%m%d_%H%M%S")
    backup_path = Path(backup_dir) / f"backup_{project.name}_{timestamp}.dss"
    backup_path.parent.mkdir(parents=True, exist_ok=True)  # ensure backup dir exists

    exporter = DSSArchiveExporter(project)
    saved_path = exporter.export_to_file(backup_path)
    print(f"✓ Backed up to {saved_path}")
    return saved_path

# RESTORE
def restore_from_backup(backup_path):
    importer = DSSArchiveImporter(backup_path)

    # Validate before restoring
    analysis = importer.analyze()
    if not analysis.is_valid:
        errors = "\n".join(e.message for e in analysis.errors)
        raise ValueError(f"Cannot restore: {errors}")

    # Restore
    project = importer.import_replace()
    print(f"✓ Restored project: {project.name}")
    return project

# Usage
project = Project(...)
backup_path = backup_project(project)
restored = restore_from_backup(backup_path)
```
### 2. Project Export for Distribution

```python
# Export complete project
exporter = DSSArchiveExporter(project)
exporter.export_to_file(Path("design-system-v2.0.dss"))

# Share with team, other projects, or documentation
# File contains everything needed to reconstruct the system
```
### 3. Collaborative Merging

```python
from dss.export_import.merger import SmartMerger, ConflictResolutionMode

# Team member A has local version
local_project = Project(...)

# Team member B shares updated version
importer = DSSArchiveImporter(Path("shared-updates.dss"))
remote_updates = importer.import_replace()

# Merge intelligently
merger = SmartMerger(local_project, remote_updates)
analysis = merger.analyze_merge()

# Show merge summary to user
print(f"New tokens: {len(analysis.new_items['tokens'])}")
print(f"Updated: {len(analysis.updated_items['components'])}")
print(f"Conflicts to resolve: {len(analysis.conflicted_items)}")

# Apply merge
merged_project = merger.merge_with_strategy(
    ConflictResolutionMode.KEEP_LOCAL  # Prefer local changes
)
```
### 4. Archive Analysis Without Import

```python
# Inspect archive before importing
importer = DSSArchiveImporter(Path("mysterious-update.dss"))
analysis = importer.analyze()

if analysis.is_valid:
    print(f"Project: {analysis.project_name}")
    print(f"Tokens: {analysis.content_summary['tokens']['count']}")
    print(f"Components: {analysis.content_summary['components']['count']}")
    print(f"Schema: {analysis.schema_version}")
    print(f"Migration needed: {analysis.migration_needed}")

    if analysis.migration_needed:
        print(f"Will be upgraded from {analysis.schema_version} to {analysis.target_version}")
else:
    for error in analysis.errors:
        print(f"✗ [{error.stage}] {error.message}")
```

---
## Database Integration

The export/import system integrates with the existing DSS database layer:

### Adding UUIDs to Existing Projects

Existing projects need UUID generation. A migration utility handles this:
```python
from dss.storage.database import get_connection
from uuid import uuid4

def backfill_uuids():
    """Generate UUIDs for all existing records."""
    # Assumes the connection context manager commits on exit.
    with get_connection() as conn:
        cursor = conn.cursor()
        # Table names are trusted constants, so the f-string is safe here.
        for table in ("projects", "components"):
            cursor.execute(f"SELECT id FROM {table} WHERE uuid IS NULL")
            for row in cursor.fetchall():
                conn.execute(
                    f"UPDATE {table} SET uuid = ? WHERE id = ?",
                    (str(uuid4()), row[0])
                )

# Run once to initialize
backfill_uuids()
```
---

## API Reference

### DSSArchiveExporter

```python
class DSSArchiveExporter:
    def __init__(self, project: Project):
        """Initialize exporter with project"""

    def export_to_file(self, output_path: Path) -> Path:
        """Export to .dss file, returns path"""
```

### DSSArchiveImporter

```python
class DSSArchiveImporter:
    def __init__(self, archive_path: Path):
        """Initialize importer with archive path"""

    def analyze(self) -> ImportAnalysis:
        """Analyze archive without importing"""

    def import_replace(self) -> Project:
        """Import with REPLACE strategy"""
```
### SmartMerger

```python
class SmartMerger:
    def __init__(self, local_project: Project, imported_project: Project):
        """Initialize merger"""

    def analyze_merge(self) -> MergeAnalysis:
        """Analyze merge without modifying"""

    def merge_with_strategy(
        self,
        conflict_handler: ConflictResolutionMode
    ) -> Project:
        """Perform merge with strategy"""
```

### MigrationManager

```python
class MigrationManager:
    @classmethod
    def migrate(
        cls,
        data: Dict,
        from_version: str,
        to_version: str
    ) -> Dict:
        """Apply migrations from source to target"""

    @classmethod
    def get_latest_version(cls) -> str:
        """Get latest schema version"""
```
---

## File Structure

```
dss-mvp1/dss/
├── models/
│   ├── project.py        # Project + uuid field
│   ├── component.py      # Component + uuid field
│   └── theme.py          # Theme + DesignToken with uuid
│
├── export_import/        # NEW PACKAGE
│   ├── __init__.py       # Package exports
│   ├── exporter.py       # Archive creation
│   ├── importer.py       # Archive loading + validation
│   ├── merger.py         # Smart merge logic
│   └── migrations.py     # Schema versioning
│
└── storage/
    └── database.py       # Updated with uuid columns
```
---

## Backward Compatibility

✅ **No breaking changes**

- Existing code continues to work unchanged
- UUIDs are optional (auto-generated if missing)
- Database schema is backward compatible (uuid columns nullable)
- Export/import is an opt-in feature
- Runtime code doesn't use UUIDs (transport-layer only)
---

## Security Considerations

1. **Archive Integrity**: Validate before importing
2. **Referential Integrity**: All UUID references checked
3. **Schema Validation**: JSON schemas enforced
4. **Conflict Detection**: No silent overwrites
5. **Audit Trail**: Import operations logged
---

## Performance Notes

- Export: O(n), where n = total tokens + components (single pass)
- Import: O(n) validation + O(m) merge, where m = local items
- Archive size: ZIP-compressed (deflate), typically 10-50% of the uncompressed JSON size
- Memory: the entire archive is loaded into memory (could be optimized with streaming)
---

## Future Enhancements

1. **Selective Export**: Choose what to export (tokens only, components only, etc.)
2. **Streaming Import**: Process large archives without full memory load
3. **Audit Trail Export**: Include sync history and activity logs
4. **Figma Sync**: Direct sync with Figma without manual export
5. **Cloud Storage**: Native upload/download to cloud services
6. **Compression**: Optional additional compression
7. **Encryption**: Encrypted archives for sensitive projects
---

## Troubleshooting

### "Archive validation failed"

Check the detailed error messages in `analysis.errors`. Ensure that:

- The archive is a valid ZIP file
- manifest.json exists and contains valid JSON
- All referenced files exist in the archive

### "Schema version too new"

Update the DSS application to the latest version. Archives can only be migrated forward, not backward.

### "Referential integrity error"

Ensure all token/component references are valid (see the sketch below). Check that:

- Referenced UUIDs exist in the archive
- There are no circular dependencies
- All dependencies are included in the export
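
A quick way to surface dangling references, as an illustration only (the flat item list and the `ref` field are assumptions about the archive shape, not the real format):

```python
def find_dangling(items: list[dict], known_uuids: set[str]) -> list[dict]:
    """Return items whose reference points at a UUID absent from the archive."""
    return [
        item for item in items
        if item.get("ref") and item["ref"] not in known_uuids
    ]

tokens = [{"uuid": "a", "ref": "b"}, {"uuid": "c", "ref": None}]
print(find_dangling(tokens, known_uuids={"a", "c"}))  # -> [{'uuid': 'a', 'ref': 'b'}]
```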
### "Merge conflicts detected"

Review conflicts in `analysis.conflicted_items`, then decide on a resolution strategy:

- `OVERWRITE`: Trust the imported version
- `KEEP_LOCAL`: Trust the local version
- `FORK`: Create a separate copy of the imported version

---
## Support

For issues or questions:

1. Check ImportAnalysis errors
2. Review migration compatibility
3. Validate archive structure
4. Check referential integrity

---
Generated with DSS Export/Import System v1.0.0