Plugin Marketplace Architecture Analysis
Date: 2025-12-06
Status: ✅ Analysis Complete - Hybrid Architecture Recommended
Confidence: Very High (90%+)
Validated By: Gemini 3 Pro Preview (6-step thinkdeep analysis)
Executive Summary
Question: Should we build a plugin marketplace on our DSS server for plugin distribution?
Answer: NO to full marketplace. YES to lightweight hybrid architecture.
Recommendation: GitHub hosting + DSS Registry API (Hybrid Model)
Analysis Results
🔍 Key Discovery
Claude Code already supports multiple GitHub-based plugin marketplaces via the `extraKnownMarketplaces` setting in `.claude/settings.json`:

```json
{
  "extraKnownMarketplaces": [
    {
      "name": "My Marketplace",
      "url": "https://github.com/user/repo",
      "description": "Custom plugins"
    }
  ]
}
```
Implication: The infrastructure for plugin distribution already exists. We don't need to reinvent it.
Architecture Comparison
| Aspect | Full Marketplace | Hybrid (Recommended) |
|---|---|---|
| Implementation Time | 5-7 weeks | 2-3 days |
| Maintenance Burden | ~40 hrs/month | ~2 hrs/month |
| Security Risk | High | Low |
| Infrastructure Needed | File hosting, CDN, backups, versioning | Simple REST API |
| GitHub Integration | Manual sync | Direct consumption |
| Development Cost | $20-30K equivalent | ~$1K equivalent |
Recommended Architecture: Hybrid Model
Components
1. GitHub (File Hosting & Distribution)
- Hosts actual plugin files (`.py`, `.json`, etc.)
- Provides version control, code review, audit trail
- Handles security scanning, access control
- Free infrastructure (GitHub Pages, Releases)
2. DSS Registry API (Discovery & Analytics)
- Lightweight REST API on DSS server
- Provides plugin discovery, search, categories
- Tracks download/usage analytics per team
- Enables private plugin sharing within organizations
3. Claude Code (Installation)
- Users add GitHub URLs to `extraKnownMarketplaces`
- Existing Claude Code plugin system handles installation
- Zero changes needed to Claude Code itself
Data Flow
```
┌──────────────┐
│    GitHub    │  Hosts: plugin files, metadata, releases
│  (Storage)   │
└──────┬───────┘
       │
       │ ① Fetch metadata
       │
┌──────▼───────┐
│ DSS Registry │  Provides: search, analytics, team features
│     API      │
└──────┬───────┘
       │
       │ ② Query available plugins
       │
┌──────▼───────┐
│ Claude Code  │  Displays: plugin browser, install instructions
│      UI      │
└──────┬───────┘
       │
       │ ③ Install: Add to extraKnownMarketplaces
       │
┌──────▼───────┐
│     User     │
└──────────────┘
```
Implementation Plan
Phase 1: Registry API (2-3 days)
API Endpoints:
```
GET  /api/plugins/registry          - List all registered plugins
GET  /api/plugins/search?q={query}  - Search plugins
GET  /api/plugins/{id}              - Get plugin details + analytics
POST /api/plugins/register          - Register new plugin (GitHub URL)
POST /api/plugins/{id}/analytics    - Track download/usage event
```
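The endpoints above are thin wrappers over simple registry logic. A framework-independent sketch of that logic (class and method names are illustrative, with an in-memory dict standing in for the database) might look like:

```python
from dataclasses import dataclass, field

@dataclass
class PluginRecord:
    id: str
    name: str
    github_url: str
    description: str = ""
    downloads_count: int = 0

@dataclass
class PluginRegistry:
    _plugins: dict = field(default_factory=dict)

    def register(self, github_url: str, name: str = "", description: str = "") -> PluginRecord:
        # Derive a stable id from the last path segment of the GitHub URL.
        plugin_id = github_url.rstrip("/").rsplit("/", 1)[-1]
        record = PluginRecord(id=plugin_id, name=name or plugin_id,
                              github_url=github_url, description=description)
        self._plugins[plugin_id] = record
        return record

    def search(self, query: str) -> list:
        # Case-insensitive substring match over name and description.
        q = query.lower()
        return [p for p in self._plugins.values()
                if q in p.name.lower() or q in p.description.lower()]

    def get(self, plugin_id: str) -> PluginRecord:
        return self._plugins[plugin_id]

    def track_download(self, plugin_id: str) -> None:
        self._plugins[plugin_id].downloads_count += 1
```

In the actual service each method would sit behind a FastAPI route and read/write the `plugins` table described below.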
Database Schema:
```sql
-- plugins table
CREATE TABLE plugins (
    id VARCHAR(50) PRIMARY KEY,
    name VARCHAR(200) NOT NULL,
    description TEXT,
    github_url VARCHAR(500) NOT NULL,
    author VARCHAR(100),
    category VARCHAR(50),
    tags TEXT[],
    version VARCHAR(20),
    downloads_count INTEGER DEFAULT 0,
    rating_avg DECIMAL(3,2),
    created_at TIMESTAMP DEFAULT NOW(),
    updated_at TIMESTAMP DEFAULT NOW()
);

-- plugin_analytics table
CREATE TABLE plugin_analytics (
    id SERIAL PRIMARY KEY,
    plugin_id VARCHAR(50) REFERENCES plugins(id),
    team_id VARCHAR(50),
    event_type VARCHAR(20), -- install, uninstall, usage
    timestamp TIMESTAMP DEFAULT NOW()
);
```
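To exercise the analytics flow locally, here is a sqlite sketch of the same two tables. PostgreSQL-specific types (`SERIAL`, `TEXT[]`) are replaced with sqlite equivalents, and `track_event` is a hypothetical helper, not part of the real API:

```python
import sqlite3

# Local illustration only; the production schema above is PostgreSQL.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE plugins (
    id TEXT PRIMARY KEY,
    name TEXT NOT NULL,
    github_url TEXT NOT NULL,
    downloads_count INTEGER DEFAULT 0
);
CREATE TABLE plugin_analytics (
    id INTEGER PRIMARY KEY AUTOINCREMENT,
    plugin_id TEXT REFERENCES plugins(id),
    team_id TEXT,
    event_type TEXT,
    timestamp TIMESTAMP DEFAULT CURRENT_TIMESTAMP
);
""")

def track_event(plugin_id: str, team_id: str, event_type: str) -> None:
    """Record one analytics event and bump the cached download counter."""
    conn.execute(
        "INSERT INTO plugin_analytics (plugin_id, team_id, event_type) VALUES (?, ?, ?)",
        (plugin_id, team_id, event_type))
    if event_type == "install":
        conn.execute(
            "UPDATE plugins SET downloads_count = downloads_count + 1 WHERE id = ?",
            (plugin_id,))

conn.execute("INSERT INTO plugins (id, name, github_url) VALUES (?, ?, ?)",
             ("network-logger", "Network Logger",
              "https://github.com/overbits/dss-plugins"))
track_event("network-logger", "team-a", "install")
```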
GitHub Integration:
```python
async def sync_plugin_metadata(plugin_id: str, github_url: str):
    """Fetch plugin.json from GitHub and update registry."""
    # 1. Parse the GitHub URL
    # 2. Fetch raw content of .claude-plugin/plugin.json
    # 3. Validate the schema
    # 4. Update the database with metadata
    # 5. Cache for 1 hour
    ...
```
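One possible way to flesh out that stub, illustrative only: it assumes `httpx` is available and that registered plugins live under a `github.com/<owner>/<repo>/tree/<branch>/<path>` URL. The rewrite to `raw.githubusercontent.com` is an assumption about GitHub's raw-content layout.

```python
def github_tree_to_raw(github_url: str, filename: str) -> str:
    """Rewrite a .../tree/<branch>/<path> GitHub URL to its raw-content URL."""
    owner_repo, _, branch_and_path = github_url.partition("/tree/")
    owner_repo = owner_repo.removeprefix("https://github.com/")
    return f"https://raw.githubusercontent.com/{owner_repo}/{branch_and_path}/{filename}"

async def sync_plugin_metadata(plugin_id: str, github_url: str) -> dict:
    """Fetch .claude-plugin/plugin.json from GitHub and return validated metadata."""
    import httpx  # deferred so the URL helper above stays dependency-free

    raw_url = github_tree_to_raw(github_url, ".claude-plugin/plugin.json")
    async with httpx.AsyncClient() as client:
        resp = await client.get(raw_url, timeout=10.0)
        resp.raise_for_status()
        metadata = resp.json()

    # Minimal schema validation before the registry accepts the plugin.
    for key in ("name", "version"):
        if key not in metadata:
            raise ValueError(f"plugin.json missing required key: {key}")
    return metadata  # caller persists this to the plugins table and caches it
```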
Phase 2: Dynamic Plugin Loader ⚠️ CRITICAL
Current Problem:
- Plugins are hardcoded in `tools/dss_mcp/tools/project_tools.py`
- Adding a new plugin requires manual code changes
- No auto-discovery mechanism
Solution: Create plugin auto-loader
File: `tools/dss_mcp/plugin_loader.py`
"""Dynamic plugin loader for DSS MCP server."""
import importlib
import pkgutil
import logging
from pathlib import Path
from typing import List, Dict, Any
logger = logging.getLogger(__name__)
def load_plugins(plugin_dirs: List[str] = None) -> List[Dict[str, Any]]:
"""
Dynamically load all plugins from specified directories.
Plugin Contract:
- Must be a Python module (.py file)
- Must expose EXPORT_TOOLS variable (list of MCP tool definitions)
- Optional: PLUGIN_METADATA dict with name, version, author
Args:
plugin_dirs: List of directories to scan. Defaults to ['tools'].
Returns:
List of MCP tool definitions from all discovered plugins.
"""
if plugin_dirs is None:
plugin_dirs = ['tools']
registry = []
for plugin_dir in plugin_dirs:
plugin_path = Path(plugin_dir)
if not plugin_path.exists():
logger.warning(f"Plugin directory not found: {plugin_dir}")
continue
# Scan for Python modules
for module_file in plugin_path.glob('*.py'):
if module_file.name.startswith('_'):
continue # Skip __init__.py, __pycache__, etc.
module_name = module_file.stem
try:
# Dynamically import the module
spec = importlib.util.spec_from_file_location(
f"plugins.{module_name}",
module_file
)
module = importlib.util.module_from_spec(spec)
spec.loader.exec_module(module)
# Check for plugin contract compliance
if hasattr(module, 'EXPORT_TOOLS'):
tools = module.EXPORT_TOOLS
registry.extend(tools)
metadata = getattr(module, 'PLUGIN_METADATA', {})
logger.info(
f"Loaded plugin: {metadata.get('name', module_name)} "
f"({len(tools)} tools)"
)
else:
logger.debug(
f"Skipping {module_name} (no EXPORT_TOOLS variable)"
)
except Exception as e:
logger.error(f"Failed to load plugin {module_name}: {e}")
return registry
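The loader's contract can be checked end-to-end without the server: write a minimal plugin to a temporary directory, import it the same way the loader does, and confirm `EXPORT_TOOLS` is discovered. The snippet below is self-contained and assumes only the contract described above.

```python
import importlib.util
import tempfile
from pathlib import Path

# A minimal plugin satisfying the contract (EXPORT_TOOLS + optional metadata).
plugin_src = '''
PLUGIN_METADATA = {"name": "Demo", "version": "0.1.0"}
EXPORT_TOOLS = [{"name": "demo_tool", "description": "No-op demo tool"}]
'''

with tempfile.TemporaryDirectory() as d:
    module_file = Path(d) / "demo_plugin.py"
    module_file.write_text(plugin_src)

    # Same import mechanism the loader uses.
    spec = importlib.util.spec_from_file_location("plugins.demo_plugin", module_file)
    module = importlib.util.module_from_spec(spec)
    spec.loader.exec_module(module)
    tools = getattr(module, "EXPORT_TOOLS", [])
```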
Plugin Contract Example:
```python
# tools/network_logger.py - Example plugin

PLUGIN_METADATA = {
    "name": "Network Logger",
    "version": "1.0.0",
    "author": "DSS Team",
    "description": "Captures and analyzes browser network traffic"
}

EXPORT_TOOLS = [
    {
        "name": "get_network_requests",
        "description": "Get browser network requests",
        "inputSchema": {
            "type": "object",
            "properties": {
                "filter_type": {
                    "type": "string",
                    "enum": ["xhr", "fetch", "all"]
                }
            }
        }
    }
]

# Implementation class would go here
```
Update MCP Server:
```python
# tools/dss_mcp/server.py
from .plugin_loader import load_plugins

# Instead of hardcoded imports:
# from .tools.project_tools import PROJECT_TOOLS
# from .tools.debug_tools import DEBUG_TOOLS

# Use dynamic loading:
all_tools = load_plugins(['tools/dss_mcp/tools'])

# Register with MCP
for tool in all_tools:
    mcp.tool()(tool)
```
Phase 3: Discovery UI (1-2 days)
Location: admin-ui/plugins.html
Features:
- Browse plugins by category (Design Tools, Testing, Debugging, etc.)
- Search functionality
- Plugin details page with:
  - Description, author, version
  - Download count, ratings
  - GitHub repository link
  - Installation instructions
- "Install" button → copies this entry to the clipboard:

```json
{
  "name": "Network Logger",
  "url": "https://github.com/overbits/dss-plugins/tree/main/network-logger",
  "description": "Network traffic analyzer"
}
```
User Flow:
1. Browse/search plugins at https://dss.overbits.luz.uy/plugins
2. Click "Install" → copies GitHub URL
3. Add to `.claude/settings.json` → `extraKnownMarketplaces`
4. Claude Code auto-discovers and installs the plugin
Phase 4: Team Features (Optional, 1-2 days)
- Private Plugins: Team-specific plugin repos
- Usage Analytics: Which plugins are most used by your team
- Recommendations: "Teams like yours also use..."
- Admin Dashboard: Manage approved plugins for organization
What We DON'T Need ❌
- ❌ File hosting/storage infrastructure
- ❌ CDN (Content Delivery Network)
- ❌ Backup systems for plugin files
- ❌ Version management (GitHub handles this)
- ❌ Code security scanning (GitHub handles this)
- ❌ User authentication for downloads (GitHub handles access control)
- ❌ Payment processing (free plugins, GitHub Sponsors for premium)
Security Implications
GitHub-Based (Recommended) ✅
Strengths:
- Code review via Pull Requests
- Audit trail (Git history)
- GitHub's security scanning (Dependabot, CodeQL)
- Access control (public/private repos, teams)
- Trusted platform
Risks:
- Malicious plugins (mitigated by curated registry)
- Dependency vulnerabilities (mitigated by GitHub scanning)
Mitigation Strategy:
- Curated Registry: All plugins require PR review before appearing in registry
- Checksum Validation: Registry stores SHA256 checksums
- Dependency Whitelist: Start with "standard library only" or pre-approved packages
- Sandbox Execution: Future enhancement - run plugins in isolated environment
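The checksum step in the mitigation list above can be as small as a stdlib helper: the registry stores a SHA256 digest at registration time and refuses any fetched file that does not match. Function names here are illustrative.

```python
import hashlib
import hmac

def sha256_hex(data: bytes) -> str:
    """Hex SHA256 digest of the plugin file contents."""
    return hashlib.sha256(data).hexdigest()

def verify_plugin(data: bytes, expected_sha256: str) -> bool:
    """Accept a fetched plugin only if its digest matches the registered one."""
    # Constant-time comparison avoids leaking digest prefixes.
    return hmac.compare_digest(sha256_hex(data), expected_sha256.lower())
```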
Server-Hosted ❌
Risks:
- Code injection vulnerabilities
- File upload attacks
- Unvetted code execution
- No audit trail
- Maintenance nightmare
Migration Path
Existing Plugins
Current structure:
```
dss-claude-plugin/
├── .claude-plugin/
│   └── plugin.json
├── commands/
├── skills/
└── agents/
```
Step 1: Create GitHub repo structure
```
dss-plugins/
├── network-logger/
│   ├── .claude-plugin/plugin.json
│   └── network_logger.py
├── performance-analyzer/
│   ├── .claude-plugin/plugin.json
│   └── performance_analyzer.py
└── registry.json    # Central registry file
```
Step 2: Publish to GitHub
```bash
gh repo create overbits/dss-plugins --public
git push origin main
```
Step 3: Register in DSS API
```
POST /api/plugins/register
{
  "github_url": "https://github.com/overbits/dss-plugins/tree/main/network-logger"
}
```
Step 4: Users install
```jsonc
// .claude/settings.json
{
  "extraKnownMarketplaces": [
    {
      "name": "DSS Official Plugins",
      "url": "https://github.com/overbits/dss-plugins",
      "description": "Official DSS design system plugins"
    }
  ]
}
```
Success Metrics
Phase 1 (Registry API)
- ✅ API endpoints functional
- ✅ Database schema created
- ✅ GitHub metadata sync working
- ✅ Basic search functional
Phase 2 (Dynamic Loader)
- ✅ Plugins auto-discovered on server start
- ✅ No manual registration needed
- ✅ Error handling for broken plugins
- ✅ Plugin contract documented
Phase 3 (Discovery UI)
- ✅ Plugin browser accessible
- ✅ Search & filter working
- ✅ Install flow documented
- ✅ Analytics tracking events
Phase 4 (Team Features)
- ✅ Private plugin support
- ✅ Team usage dashboard
- ✅ Admin controls functional
Timeline
| Phase | Component | Effort | Priority |
|---|---|---|---|
| 2 | Dynamic Plugin Loader | 1 day | 🔴 CRITICAL |
| 1 | Registry API | 2 days | 🟡 High |
| 3 | Discovery UI | 1-2 days | 🟢 Medium |
| 4 | Team Features | 1-2 days | 🔵 Low |
Total: 5-7 days for full implementation
Minimum Viable: 3 days (Phases 1+2)
Expert Validation Notes
From Gemini 3 Pro Preview analysis:
"The proposed Hybrid Architecture (GitHub-hosted files + DSS Registry API) is the correct strategic choice. It avoids the 'marketplace antipattern' (building infrastructure for billing/hosting that already exists) while retaining control over discovery and curation."
Critical Insight:
"For this to work practically, the underlying DSS MCP server must be refactored to support dynamic tool loading, otherwise 'installing' a plugin requires manual code changes."
Security Warning:
"Code Injection Risk: Downloading Python files and executing them is high-risk. Mitigation: Registry must be curated (PR review required)."
Dependency Management:
"Plugins might require conflicting Python packages. Mitigation: Start with 'standard library only' or pre-approved dependency list (like httpx, numpy)."
Conclusion
Recommendation: Implement Hybrid Architecture (GitHub + DSS Registry API)
Priority Order:
- Phase 2 (Dynamic Loader) - CRITICAL, blocks everything else
- Phase 1 (Registry API) - Core discovery infrastructure
- Phase 3 (Discovery UI) - Improves UX
- Phase 4 (Team Features) - Nice-to-have
Total Effort: 5-7 days
Maintenance: ~2 hours/month
Security Risk: Low (GitHub handles file hosting)
Cost: Minimal (no infrastructure needed)
Next Steps:
- Create `plugin_loader.py` with auto-discovery
- Define plugin contract documentation
- Test with example plugin
- Implement Registry API
- Create discovery UI
- Launch with curated initial plugins
Analysis validated by: Gemini 3 Pro Preview
Confidence: Very High (90%+)
Date: 2025-12-06