# DSS Architecture
> Detailed system design, module interactions, and extension points
## System Overview
DSS follows a **layered architecture** with clear separation of concerns:
```
Clients (Admin UI, CLI, AI Agents)
                │
API Layer (REST + MCP)
                │
Core Modules (Ingest, Analyze, Storybook, Figma)
                │
Data Layer (SQLite, File Cache)
```
## Architecture Principles
1. **Separation of Concerns**: Each module has a single, well-defined responsibility
2. **Dependency Inversion**: Modules depend on abstractions (base classes), not concrete implementations
3. **Open/Closed**: Open for extension (new parsers), closed for modification (core unchanged)
4. **Immutability**: DSS canonical structure is immutable; external systems adapt via translation
5. **Local-First**: All operations are local by default; remote access requires explicit configuration
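The dependency-inversion and open/closed principles can be sketched together: callers depend only on an abstract source, and new formats are added as subclasses rather than by editing the pipeline. The class and function names below are illustrative, not the actual DSS implementations.

```python
# Minimal sketch: callers depend on the abstract TokenSource, never on a
# concrete parser. Names are illustrative stand-ins for the DSS classes.
from abc import ABC, abstractmethod
from typing import List

class TokenSource(ABC):
    @abstractmethod
    def parse(self, content: str) -> List[dict]:
        """Extract tokens from raw source content."""

class CSSTokenSource(TokenSource):
    def parse(self, content: str) -> List[dict]:
        # Naive example: treat each "--name: value;" pair as a token
        tokens = []
        for line in content.splitlines():
            line = line.strip().rstrip(";")
            if line.startswith("--") and ":" in line:
                name, value = line.split(":", 1)
                tokens.append({"name": name.strip(), "value": value.strip()})
        return tokens

def ingest(source: TokenSource, content: str) -> List[dict]:
    # Closed for modification: supporting a new format means adding a new
    # TokenSource subclass, not editing this function.
    return source.parse(content)

tokens = ingest(CSSTokenSource(), ":root {\n  --color-primary: #007acc;\n}")
```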
## Module Interactions
### Data Flow: Figma → DSS → Storybook
```
┌─────────────┐
│  Figma API  │
└──────┬──────┘
       │ 1. Extract variables/components
┌──────┴──────┐
│ figma_tools │
└──────┬──────┘
       │ 2. Parse to DesignToken
┌──────┴──────┐
│   ingest    │
│   base.py   │
└──────┬──────┘
       │ 3. Store tokens
┌──────┴──────┐
│   storage   │
│ database.py │
└──────┬──────┘
       │ 4. Load tokens
┌──────┴──────┐
│  storybook  │
│  theme.py   │
└──────┬──────┘
       │ 5. Generate theme
┌──────┴──────┐
│  Storybook  │
│  theme.js   │
└─────────────┘
```
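The five steps above can be sketched end-to-end as plain functions. All names here are illustrative stand-ins for the real `figma_tools`, `ingest`, `storage`, and `storybook` modules.

```python
# Toy pipeline: Figma file -> tokens -> store -> load -> Storybook theme.
from typing import Dict, List

def extract_variables(figma_file: dict) -> List[dict]:            # step 1
    return figma_file.get("variables", [])

def parse_to_tokens(variables: List[dict]) -> List[dict]:         # step 2
    return [{"name": v["name"], "value": v["value"]} for v in variables]

def store_tokens(db: Dict[str, str], tokens: List[dict]) -> None: # step 3
    for t in tokens:
        db[t["name"]] = t["value"]

def load_tokens(db: Dict[str, str]) -> List[dict]:                # step 4
    return [{"name": k, "value": v} for k, v in db.items()]

def generate_theme(tokens: List[dict]) -> str:                    # step 5
    entries = ", ".join(f"'{t['name']}': '{t['value']}'" for t in tokens)
    return "export const theme = { " + entries + " };"

db: Dict[str, str] = {}
figma_file = {"variables": [{"name": "colorPrimary", "value": "#007acc"}]}
store_tokens(db, parse_to_tokens(extract_variables(figma_file)))
theme_js = generate_theme(load_tokens(db))
```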
### Data Flow: Code Analysis → Quick Wins
```
┌────────────────┐
│ React Project  │
└───────┬────────┘
        │ 1. Scan files
┌───────┴────────┐
│    analyze     │
│   scanner.py   │
└───────┬────────┘
        │ 2. Extract components
┌───────┴────────┐
│    analyze     │
│    react.py    │
└───────┬────────┘
        │ 3. Analyze styles
┌───────┴────────┐
│    analyze     │
│   styles.py    │
└───────┬────────┘
        │ 4. Find issues
┌───────┴────────┐
│    analyze     │
│ quick_wins.py  │
└───────┬────────┘
        │ 5. Generate report
┌───────┴────────┐
│    MCP Tool    │
│ get_quick_wins │
└────────────────┘
```
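A toy version of steps 1–5: scan "files", pull out components, flag inline styles as quick wins, and emit a report. This is purely illustrative; the real `scanner`/`react`/`quick_wins` modules are far more thorough.

```python
import json
from typing import Dict, List

def scan_files(project: Dict[str, str]) -> List[str]:             # step 1
    return [name for name in project if name.endswith((".jsx", ".tsx"))]

def extract_components(project, files) -> Dict[str, str]:         # step 2
    return {f: project[f] for f in files}

def analyze_styles(components) -> Dict[str, bool]:                # step 3
    return {f: "style={{" in src for f, src in components.items()}

def find_quick_wins(style_info) -> List[dict]:                    # step 4
    return [{"file": f, "issue": "inline style"}
            for f, used in style_info.items() if used]

def generate_report(wins) -> str:                                 # step 5
    return json.dumps({"quick_wins": wins, "count": len(wins)})

project = {
    "Button.jsx": "export const Button = () => <button style={{color: 'red'}} />",
    "Card.tsx": "export const Card = () => <div className='card' />",
    "README.md": "docs",
}
report = generate_report(
    find_quick_wins(analyze_styles(extract_components(project, scan_files(project))))
)
```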
## Module Design
### 1. Ingestion Module (`/tools/ingest/`)
**Responsibility**: Parse design tokens from multiple sources
**Class Hierarchy**:
```
TokenSource (ABC)
├── CSSTokenSource
├── SCSSTokenSource
├── TailwindTokenSource
├── JSONTokenSource
└── FigmaTokenSource
```
**Key Methods**:
- `parse()` - Extract tokens from source
- `normalize()` - Convert to DSS canonical format
- `validate()` - Check token validity
**Extension Point**: Add new `TokenSource` subclass for new formats
**Example**:
```python
from typing import List

from tools.ingest.base import TokenSource, DesignToken

class YAMLTokenSource(TokenSource):
    def parse(self, content: str) -> List[DesignToken]:
        # Parse YAML content
        # Return a list of DesignToken objects
        pass
```
### 2. Analysis Module (`/tools/analyze/`)
**Responsibility**: Code intelligence and improvement detection
**Class Hierarchy**:
```
ProjectScanner
├── detect_framework()
├── detect_styling()
└── scan_files()
ReactAnalyzer
├── extract_components()
├── find_inline_styles()
└── analyze_imports()
StyleAnalyzer
├── find_duplicates()
├── find_token_candidates()
└── analyze_naming()
DependencyGraph
├── build()
├── find_orphans()
└── find_hubs()
QuickWinFinder
├── find_all()
├── prioritize()
└── generate_report()
```
**Extension Point**: Add new analyzer for Vue, Angular, etc.
**Example**:
```python
from tools.analyze.base import ProjectAnalyzer, ProjectAnalysis

class VueAnalyzer(ProjectAnalyzer):
    def analyze(self, path: str) -> ProjectAnalysis:
        # Analyze Vue single-file components
        pass
```
### 3. Storybook Module (`/tools/storybook/`)
**Responsibility**: Storybook integration and generation
**Class Hierarchy**:
```
StorybookScanner
├── find_stories()
├── parse_story()
└── get_coverage()
StoryGenerator
├── generate_story()
├── select_template()
└── render()
ThemeGenerator
├── generate_from_tokens()
├── generate_theme_file()
└── generate_preview_head()
```
**Templates**:
- CSF3: Modern Component Story Format
- CSF2: Legacy format
- MDX: Markdown + JSX
**Extension Point**: Add new template types
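As a sketch of what template rendering produces, a minimal CSF3 story could be generated from a component name like this. The template string is hypothetical, not the actual `StoryGenerator` output.

```python
# Hypothetical CSF3 template; {{ }} escapes literal braces in str.format.
CSF3_TEMPLATE = """import {{ {name} }} from './{name}';

export default {{
  title: 'Components/{name}',
  component: {name},
}};

export const Default = {{}};
"""

def render_csf3_story(name: str) -> str:
    """Render a minimal CSF3 story module for the given component."""
    return CSF3_TEMPLATE.format(name=name)

story = render_csf3_story("Button")
```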
### 4. Figma Module (`/tools/figma/`)
**Responsibility**: Figma API integration
**Class Hierarchy**:
```
FigmaClient
├── get_file()
├── get_variables()
├── get_components()
└── get_styles()
FigmaToolSuite
├── extract_variables()
├── extract_components()
├── extract_styles()
├── sync_tokens()
└── validate_components()
```
**Caching Strategy**:
- SQLite-based cache
- Configurable TTL (default 300s)
- Cache key: `figma:{md5(endpoint)}`
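The documented key scheme and TTL check can be sketched as follows; a plain dict stands in for the SQLite cache.

```python
import hashlib
import time

CACHE_TTL = 300  # seconds, matching the default above

def cache_key(endpoint: str) -> str:
    # The documented scheme: figma:{md5(endpoint)}
    return "figma:" + hashlib.md5(endpoint.encode()).hexdigest()

def get_cached(store: dict, endpoint: str):
    # Return the cached value only if it has not expired
    entry = store.get(cache_key(endpoint))
    if entry and entry["expires_at"] > time.time():
        return entry["value"]
    return None

def put_cached(store: dict, endpoint: str, value) -> None:
    store[cache_key(endpoint)] = {
        "value": value,
        "expires_at": time.time() + CACHE_TTL,
    }

store: dict = {}
put_cached(store, "/v1/files/abc123", {"name": "Design File"})
```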
**Fallback Strategy**:
1. Try Variables API (requires paid plan)
2. Fall back to document-level data (free)
### 5. Storage Module (`/tools/storage/`)
**Responsibility**: Data persistence
**Tables**:
```sql
CREATE TABLE cache (
    key        TEXT PRIMARY KEY,
    value      TEXT,
    expires_at INTEGER
);

CREATE TABLE activity_log (
    id          INTEGER PRIMARY KEY,
    timestamp   INTEGER,
    action      TEXT,
    entity_type TEXT,
    entity_id   TEXT,
    details     TEXT
);
```
**Extension Point**: Add new tables for migration tracking, etc.
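The `cache` table above can be exercised directly with Python's built-in `sqlite3`; here an in-memory database stands in for `.dss/dss.db`.

```python
import sqlite3
import time

conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE cache (key TEXT PRIMARY KEY, value TEXT, expires_at INTEGER)"
)

def cache_set(key: str, value: str, ttl: int = 300) -> None:
    # Upsert the entry with an absolute expiry timestamp
    conn.execute(
        "INSERT OR REPLACE INTO cache (key, value, expires_at) VALUES (?, ?, ?)",
        (key, value, int(time.time()) + ttl),
    )

def cache_get(key: str):
    # Only return entries that have not yet expired
    row = conn.execute(
        "SELECT value FROM cache WHERE key = ? AND expires_at > ?",
        (key, int(time.time())),
    ).fetchone()
    return row[0] if row else None

cache_set("figma:abc", '{"tokens": []}')
```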
## API Design
### REST API Pattern
```python
from typing import Optional

@app.get("/tokens")
async def list_tokens(
    category: Optional[str] = None,
    source: Optional[str] = None,
):
    """List all tokens with optional filters"""
    # Implementation
    pass
```
### MCP Tool Pattern
```python
@mcp.tool()
async def analyze_react_components(path: str) -> str:
    """
    Analyze React components in project.

    Args:
        path: Path to React project

    Returns:
        JSON with component analysis
    """
    # Implementation
    pass
```
## Configuration Management
### Environment Variables
```python
# tools/config.py
import os

class FigmaConfig:
    token: str = os.getenv("FIGMA_TOKEN", "")
    cache_ttl: int = int(os.getenv("FIGMA_CACHE_TTL", "300"))
    use_variables_api: bool = bool(token)

class ServerConfig:
    host: str = os.getenv("DSS_MCP_HOST", "127.0.0.1")
    port: int = int(os.getenv("DSS_MCP_PORT", "3457"))
    rest_port: int = int(os.getenv("PORT", "3456"))
```
### Security
- **No external MCP connections** by default
- **Local-only operations** unless explicitly configured
- **Environment-based secrets** (never hardcoded)
- **Host/port configuration** for remote access
## Extension Points
### 1. Add New Token Source
```python
# tools/ingest/new_source.py
from typing import List

from tools.ingest.base import TokenSource, DesignToken

class NewTokenSource(TokenSource):
    def parse(self, content: str) -> List[DesignToken]:
        # Parse implementation
        pass
```
### 2. Add New Analyzer
```python
# tools/analyze/vue.py
from tools.analyze.base import ProjectAnalyzer, ProjectAnalysis

class VueAnalyzer(ProjectAnalyzer):
    def analyze(self, path: str) -> ProjectAnalysis:
        # Vue-specific analysis
        pass
```
### 3. Add New MCP Tool
```python
# tools/api/mcp_server.py
@mcp.tool()
async def new_tool(param: str) -> str:
    """Tool description"""
    # Implementation
    pass
```
### 4. Add New REST Endpoint
```python
# tools/api/server.py
@app.get("/new-endpoint")
async def new_endpoint():
    """Endpoint description"""
    # Implementation
    pass
```
## Error Handling Strategy
### Python Modules
```python
try:
    result = await operation()
except FigmaAPIError as e:
    logger.error(f"Figma API error: {e}")
    return {"error": str(e), "status": "failed"}
except ValidationError as e:
    logger.error(f"Validation error: {e}")
    return {"error": str(e), "status": "invalid"}
except Exception as e:
    logger.error(f"Unexpected error: {e}")
    return {"error": "Internal error", "status": "error"}
```
### MCP Tools
- Return JSON strings with `{"error": "message"}` on failure
- Use descriptive error messages
- Log errors for debugging
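A minimal sketch of that contract, assuming nothing about the real tool implementations: the wrapper always returns a JSON string, with `{"error": ...}` on failure.

```python
import json

def safe_tool_result(fn, *args):
    """Run a tool body and always return a JSON string."""
    try:
        return json.dumps({"result": fn(*args)})
    except Exception as e:
        # Descriptive error message, serialized in the documented shape
        return json.dumps({"error": str(e)})

result_ok = safe_tool_result(lambda x: x * 2, 21)
result_err = safe_tool_result(lambda x: 1 / 0, 21)
```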
### REST API
- Use proper HTTP status codes (400, 404, 500)
- Return consistent error format
- Include error details in response body
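One way to sketch a consistent mapping from exceptions to status codes and a uniform response body; the exception-to-status choices here are illustrative, not the actual server's.

```python
def error_response(exc: Exception):
    """Map an exception to (HTTP status, consistent error body)."""
    if isinstance(exc, FileNotFoundError):
        status = 404
    elif isinstance(exc, ValueError):
        status = 400
    else:
        status = 500
    return status, {"error": type(exc).__name__, "detail": str(exc)}

status, body = error_response(ValueError("category must be a string"))
```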
## Performance Considerations
### Caching
- **Figma API**: SQLite cache with TTL
- **File scanning**: Cache project structure
- **Token parsing**: Cache parsed results
### Optimization Targets
- Project analysis: < 5 seconds for a 100-component project
- Token merge: < 1 second for 1000 tokens
- Figma extraction: < 3 seconds (with cache)
### Scaling Limits
- **Tokens**: Tested up to 1000 tokens
- **Components**: Tested up to 100 components
- **Files**: Tested up to 500 files
## Testing Strategy
### Unit Tests
```bash
pytest tools/ingest/ # Test parsers
pytest tools/analyze/ # Test analyzers
pytest tools/storybook/ # Test generators
```
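A unit test in the style pytest would collect might look like this. The parsing logic is inlined so the example is self-contained; a real test would import from `tools.ingest`.

```python
def parse_css_variables(css: str) -> dict:
    """Inlined stand-in for a CSS token parser under test."""
    tokens = {}
    for line in css.splitlines():
        line = line.strip().rstrip(";")
        if line.startswith("--") and ":" in line:
            name, value = line.split(":", 1)
            tokens[name.strip()] = value.strip()
    return tokens

def test_parse_css_variables():
    css = ":root {\n  --spacing-sm: 4px;\n  --spacing-md: 8px;\n}"
    tokens = parse_css_variables(css)
    assert tokens == {"--spacing-sm": "4px", "--spacing-md": "8px"}

test_parse_css_variables()
```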
### Integration Tests
```bash
pytest tests/integration/ # End-to-end workflows
```
### Test Coverage Goals
- Core modules: > 80%
- Utils: > 60%
- Overall: > 70%
## Deployment Architecture
### Development
```
Local Machine
├── Python 3.x
├── Node.js (for CLI)
└── SQLite database
```
### Production
```
Server (overbits user)
├── systemd services
│ ├── dss-api.service (REST API)
│ └── dss-mcp.service (MCP server)
├── nginx reverse proxy
├── SSL/TLS (Let's Encrypt)
└── SQLite database
```
## Future Architecture Considerations
### Phase 4: Migration Engine
```
Migration Module
├── AST Parser (tree-sitter)
├── Code Transformer
├── Preview Generator
├── Rollback System
└── Test Generator
```
### Phase 5: UI Library Compatibility
```
Compatibility Module
├── HeroUI Translator
├── shadcn Translator
├── Tailwind v4 Generator
└── Translation Dictionary Generator
```
## Monitoring & Logging
### Activity Log
All major operations are logged to SQLite:
- Token ingestion
- Figma API calls
- Analysis runs
- Storybook generation
### Metrics
- Cache hit rate
- API response times
- Error frequencies
- Tool usage statistics
## Security Architecture
### Threat Model
1. **Token Exposure**: Figma tokens in environment variables
2. **Injection Attacks**: CSS/SCSS parsing vulnerabilities
3. **Path Traversal**: File system access attacks
### Mitigations
1. **Environment-based secrets**: No hardcoded tokens
2. **Input validation**: Sanitize all user inputs
3. **Path restrictions**: Validate file paths before access
4. **Sandboxing**: Run parsers in isolated contexts
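The path-restriction mitigation can be sketched with `pathlib`: resolve the requested path and require it to remain inside an allowed base directory.

```python
from pathlib import Path

def is_safe_path(base_dir: str, requested: str) -> bool:
    """Reject any path that escapes base_dir after resolution."""
    base = Path(base_dir).resolve()
    target = (base / requested).resolve()
    # Safe only if the resolved target is base_dir itself or beneath it
    return target == base or base in target.parents

ok = is_safe_path("/tmp/project", "src/App.jsx")
bad = is_safe_path("/tmp/project", "../../etc/passwd")
```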
## Maintenance
### Regular Tasks
- Clear old cache entries (> 7 days)
- Archive old activity logs (> 30 days)
- Update dependencies (monthly)
- Review error logs (weekly)
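The cache-cleanup task can be sketched against the `cache` table's schema, with an in-memory SQLite database standing in for the real one.

```python
import sqlite3
import time

conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE cache (key TEXT PRIMARY KEY, value TEXT, expires_at INTEGER)"
)
now = int(time.time())
week = 7 * 24 * 3600
conn.executemany(
    "INSERT INTO cache VALUES (?, ?, ?)",
    [("fresh", "a", now + 100), ("stale", "b", now - week - 1)],
)

def clear_expired(conn: sqlite3.Connection, cutoff: int) -> int:
    """Delete entries that expired before cutoff; return rows removed."""
    cur = conn.execute("DELETE FROM cache WHERE expires_at < ?", (cutoff,))
    return cur.rowcount

removed = clear_expired(conn, now)
```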
### Health Checks
```bash
# Check MCP server
curl http://localhost:3457/health
# Check REST API
curl http://localhost:3456/status
# Check database
sqlite3 .dss/dss.db "SELECT COUNT(*) FROM cache;"
```
## Documentation Maintenance
- Update PROJECT_MEMORY.md after major changes
- Update this file when architecture changes
- Keep DSS_PRINCIPLES.md synchronized with implementation
- Document all breaking changes in CHANGELOG.md