Internal Plugin Marketplace Architecture
Date: 2025-12-06
Status: ✅ Analysis Complete - Server-Side Plugin Loading Recommended
Use Case: Internal team use only (no external distribution)
Confidence: Very High (95%+)
Validated By: Gemini 3 Pro Preview (6-step thinkdeep analysis + expert validation)
Executive Summary
Question: How should we distribute plugins for internal DSS team use?
Answer: Server-Side Plugin Directory with auto-loading via plugin_loader.py
Previous Analysis Corrected: The GitHub-based marketplace analysis was for PUBLIC distribution. For internal-only use, we need a much simpler approach.
Key Requirements (Clarified)
- ✅ Internal use only - DSS is for company team members
- ✅ Own plugins only - No external plugin installation needed
- ✅ Simple distribution - Hosted on DSS server itself
- ✅ Works with REMOTE + LOCAL modes - Same plugins for both scenarios
- ✅ Zero client installation - Plugins auto-available when connected
Recommended Architecture: Server-Side Plugin Loading
Core Concept
Instead of "installing" plugins to developer machines, all plugins run on the DSS MCP server and are auto-loaded on server startup.
```
┌─────────────────────────────────────────────────┐
│  DSS Server (dss.overbits.luz.uy)               │
│                                                 │
│  ┌────────────────────────────────────────┐     │
│  │ /plugins/ (Server-Side Directory)      │     │
│  │  ├── network-logger/                   │     │
│  │  ├── performance-analyzer/             │     │
│  │  └── custom-workflow/                  │     │
│  └────────────────┬───────────────────────┘     │
│                   │                             │
│  ┌────────────────▼───────────────────────┐     │
│  │ plugin_loader.py                       │     │
│  │  - Scans /plugins/ on server startup   │     │
│  │  - Imports TOOLS from each plugin      │     │
│  │  - Registers with FastMCP dynamically  │     │
│  └────────────────┬───────────────────────┘     │
│                   │                             │
│  ┌────────────────▼───────────────────────┐     │
│  │ DSS MCP Server                         │     │
│  │  - Built-in tools (project, debug)     │     │
│  │  - Plugin tools (auto-discovered)      │     │
│  │  - Strategy Pattern (REMOTE/LOCAL)     │     │
│  └────────────────┬───────────────────────┘     │
└───────────────────┼─────────────────────────────┘
                    │
                    │ MCP Protocol
                    │
         ┌──────────▼──────────┐
         │ Developer Machine   │
         │  - Claude Code CLI  │
         │  - Connects to MCP  │
         │  - All tools ready  │
         └─────────────────────┘
```
Why Server-Side?
REMOTE Mode:
- ✅ MCP server on dss.overbits.luz.uy
- ✅ Plugins execute server-side
- ✅ Access to Shadow State API
- ✅ No localhost access needed
LOCAL Mode:
- ✅ MCP server still on dss.overbits.luz.uy
- ✅ Plugins use Strategy Pattern
- ✅ LocalStrategy handles browser automation
- ⚠️ May need reverse tunnel for localhost access (future enhancement)
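The mode split above leans on the Strategy Pattern the MCP server already uses. A minimal sketch of that dispatch follows; the `BrowserStrategy` interface and method names are illustrative, though `RemoteStrategy`/`LocalStrategy` mirror the names used later in this document:

```python
from abc import ABC, abstractmethod


class BrowserStrategy(ABC):
    """Common interface both execution modes implement (illustrative)."""

    @abstractmethod
    def fetch_logs(self, session_id: str) -> list:
        ...


class RemoteStrategy(BrowserStrategy):
    """REMOTE mode: read from the server-side Shadow State."""

    def fetch_logs(self, session_id: str) -> list:
        # Would call the Shadow State API here.
        return [{"session": session_id, "source": "shadow-state"}]


class LocalStrategy(BrowserStrategy):
    """LOCAL mode: drive the developer's local browser."""

    def fetch_logs(self, session_id: str) -> list:
        # Would use browser automation (possibly via a reverse tunnel).
        return [{"session": session_id, "source": "local-browser"}]


def get_strategy(mode: str) -> BrowserStrategy:
    """Pick the strategy once at startup; plugin tools stay mode-agnostic."""
    return RemoteStrategy() if mode == "REMOTE" else LocalStrategy()
```

Because plugins only see the interface, the same plugin code serves both scenarios; only the strategy selected at server startup changes.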
Implementation Design
1. Directory Structure
```
/home/overbits/dss/
├── tools/dss_mcp/
│   ├── server.py              (main MCP server)
│   ├── plugin_loader.py       (NEW - auto-discovery)
│   ├── tools/
│   │   ├── project_tools.py   (built-in)
│   │   └── debug_tools.py     (built-in)
│   └── plugins/               (NEW - plugin directory)
│       ├── __init__.py
│       ├── README.md          (plugin development guide)
│       ├── _template/         (copy-paste template)
│       │   ├── __init__.py
│       │   ├── tools.py
│       │   └── README.md
│       ├── network-logger/
│       │   ├── __init__.py    (exports TOOLS)
│       │   ├── tools.py       (implementation)
│       │   └── README.md
│       └── performance-analyzer/
│           ├── __init__.py
│           ├── tools.py
│           └── README.md
```
2. Plugin Contract
Every plugin MUST have an __init__.py that exports a TOOLS list.
Example: tools/dss_mcp/plugins/network-logger/__init__.py
"""Network Logger Plugin - Captures browser network requests."""
from .tools import get_network_requests, analyze_network_waterfall
# Plugin Contract:
# - TOOLS: List of callables to register as MCP tools
# - RESOURCES: (Optional) List of resources
# - PROMPTS: (Optional) List of prompts
TOOLS = [
get_network_requests,
analyze_network_waterfall
]
# Optional metadata
__version__ = "1.0.0"
__author__ = "DSS Team"
__description__ = "Captures and analyzes browser network traffic"
Example: tools/dss_mcp/plugins/network-logger/tools.py
"""Network Logger tool implementations."""
import logging
from typing import Dict, Any, List, Optional
logger = logging.getLogger(__name__)
async def get_network_requests(
session_id: str,
filter_type: str = "all"
) -> List[Dict[str, Any]]:
"""
Get browser network requests from Shadow State.
Args:
session_id: Browser session ID
filter_type: Filter by type (xhr, fetch, all)
Returns:
List of network request objects
"""
# Implementation using RemoteStrategy or LocalStrategy
# Can access Shadow State API via /api/browser-logs/{session_id}
logger.info(f"Fetching network requests for session {session_id}")
# TODO: Implement actual network request retrieval
return []
async def analyze_network_waterfall(
session_id: str
) -> Dict[str, Any]:
"""
Analyze network request waterfall.
Returns:
Performance analysis with waterfall data
"""
logger.info(f"Analyzing network waterfall for session {session_id}")
# TODO: Implement waterfall analysis
return {"status": "not_implemented"}
3. Plugin Loader with Deferred Registration Pattern
File: tools/dss_mcp/plugin_loader.py
"""
Dynamic plugin loader for DSS MCP server.
Implements the "Deferred Registration" pattern for FastMCP:
- Plugins define callable functions (not decorated)
- Loader scans /plugins/ directory
- Loader imports and collects TOOLS from each plugin
- Server applies @mcp.tool() decorator at runtime
"""
import importlib
import pkgutil
import os
import logging
from pathlib import Path
from typing import List, Callable
logger = logging.getLogger(__name__)
class PluginLoader:
"""
Loads plugins from a directory and collects their tools.
Resiliency: If one plugin fails to load, it logs the error
and continues with other plugins (doesn't crash the server).
"""
def __init__(self, plugin_dir: str):
"""
Initialize plugin loader.
Args:
plugin_dir: Absolute path to plugins directory
"""
self.plugin_dir = plugin_dir
self.loaded_tools: List[Callable] = []
self.failed_plugins: List[str] = []
def load_plugins(self) -> List[Callable]:
"""
Scan plugin directory, import modules, and aggregate tools.
Returns:
List of callables ready to be decorated by FastMCP.
"""
if not os.path.exists(self.plugin_dir):
logger.warning(f"Plugin directory not found: {self.plugin_dir}")
return []
logger.info(f"Scanning for plugins in {self.plugin_dir}")
# Iterate over subdirectories in plugins/
for module_info in pkgutil.iter_modules([self.plugin_dir]):
if module_info.ispkg:
self._load_single_plugin(module_info.name)
logger.info(
f"Plugin loading complete: "
f"{len(self.loaded_tools)} tools from "
f"{len(self.loaded_tools) - len(self.failed_plugins)} plugins"
)
if self.failed_plugins:
logger.warning(f"Failed to load plugins: {', '.join(self.failed_plugins)}")
return self.loaded_tools
def _load_single_plugin(self, plugin_name: str):
"""
Load a single plugin by name.
Args:
plugin_name: Name of the plugin subdirectory
"""
try:
# Dynamic import: tools.dss_mcp.plugins.<plugin_name>
# This assumes the server runs from project root
module_path = f"tools.dss_mcp.plugins.{plugin_name}"
module = importlib.import_module(module_path)
# Check plugin contract
if hasattr(module, "TOOLS") and isinstance(module.TOOLS, list):
self.loaded_tools.extend(module.TOOLS)
# Log plugin metadata if available
version = getattr(module, "__version__", "unknown")
description = getattr(module, "__description__", "")
logger.info(
f"✓ Loaded plugin '{plugin_name}' v{version}: "
f"{len(module.TOOLS)} tools"
)
if description:
logger.info(f" └─ {description}")
else:
logger.warning(
f"✗ Plugin '{plugin_name}' skipped: "
f"No 'TOOLS' list found in __init__.py"
)
self.failed_plugins.append(plugin_name)
except Exception as e:
# CRITICAL: Do not crash the server for a bad plugin
logger.error(
f"✗ Failed to load plugin '{plugin_name}': {str(e)}",
exc_info=True
)
self.failed_plugins.append(plugin_name)
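The discovery mechanics above can be exercised in isolation. The following self-contained sketch builds a throwaway plugin package in a temp directory and imports it top-level (rather than through the `tools.dss_mcp.plugins` dotted path the loader uses in production):

```python
import importlib
import pkgutil
import sys
import tempfile
from pathlib import Path

# Build a throwaway plugin on disk to exercise the discovery logic.
root = Path(tempfile.mkdtemp())
pkg = root / "demo_plugin"
pkg.mkdir()
(pkg / "__init__.py").write_text(
    "async def hello():\n"
    "    return 'hi'\n"
    "TOOLS = [hello]\n"
    "__version__ = '0.1.0'\n"
)

# Same mechanics as PluginLoader: iter_modules over the directory,
# import each package, collect its TOOLS list.
sys.path.insert(0, str(root))
tools = []
for info in pkgutil.iter_modules([str(root)]):
    if info.ispkg:
        module = importlib.import_module(info.name)
        tools.extend(getattr(module, "TOOLS", []))

print([t.__name__ for t in tools])  # prints ['hello']
```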
4. Server Integration
File: tools/dss_mcp/server.py (update)
"""DSS MCP Server with dynamic plugin loading."""
import os
from mcp.server.fastmcp import FastMCP
# Import built-in tools
from .tools.project_tools import PROJECT_TOOLS
from .tools.debug_tools import DEBUG_TOOLS
# Import plugin loader
from .plugin_loader import PluginLoader
# Initialize FastMCP
mcp = FastMCP("DSS Core")
# Initialize Plugin Loader
plugin_path = os.path.join(os.path.dirname(__file__), "plugins")
loader = PluginLoader(plugin_path)
# Load plugins
discovered_tools = loader.load_plugins()
# Register all tools dynamically
# FastMCP requires us to apply the decorator manually
for tool_func in discovered_tools:
# Apply @mcp.tool() decorator to each discovered function
mcp.tool()(tool_func)
# Also register built-in tools
for tool_func in PROJECT_TOOLS + DEBUG_TOOLS:
mcp.tool()(tool_func)
# ... rest of server startup ...
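One gap worth closing at this registration step is the naming-collision risk called out in the expert validation: two plugins exporting same-named functions would silently shadow each other. A small illustrative guard (the helper below is a sketch, not part of FastMCP):

```python
from typing import Callable, Dict, List


def check_collisions(tool_funcs: List[Callable]) -> Dict[str, int]:
    """Return {tool_name: count} for any function name seen more than once."""
    counts: Dict[str, int] = {}
    for func in tool_funcs:
        counts[func.__name__] = counts.get(func.__name__, 0) + 1
    return {name: n for name, n in counts.items() if n > 1}


def register_all(mcp, tool_funcs: List[Callable]) -> None:
    """Fail fast on duplicates, then apply the decorator as server.py does."""
    duplicates = check_collisions(tool_funcs)
    if duplicates:
        raise ValueError(f"Duplicate tool names across plugins: {duplicates}")
    for func in tool_funcs:
        mcp.tool()(func)
```

Failing fast at startup turns a silent shadowing bug into a loud, greppable error, which fits the "mitigated by team discipline" stance: the server refuses to start until the clash is renamed.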
Implementation Plan
Phase 1: Plugin Loader (1 day) - 🔴 CRITICAL
Files to create:
- `tools/dss_mcp/plugin_loader.py` (complete implementation above)
- `tools/dss_mcp/plugins/__init__.py` (empty)
- `tools/dss_mcp/plugins/README.md` (plugin development guide)
Files to update:
- `tools/dss_mcp/server.py` (integrate plugin loader)
Testing:
```shell
# Create test plugin
mkdir -p tools/dss_mcp/plugins/test_plugin
cat > tools/dss_mcp/plugins/test_plugin/__init__.py << 'EOF'
async def hello_dss():
    """Test plugin tool."""
    return "Hello from test plugin!"

TOOLS = [hello_dss]
EOF

# Restart MCP server
sudo supervisorctl restart dss-mcp

# Verify plugin loaded (check logs)
tail -f /var/log/supervisor/dss-mcp.log
```
Phase 2: Plugin Directory Structure (0.5 day)
Create plugin template:
tools/dss_mcp/plugins/_template/__init__.py:
"""
Template Plugin - Copy this directory to create new plugins.
Steps:
1. Copy _template/ to your-plugin-name/
2. Update __init__.py with your plugin metadata
3. Implement your tools in tools.py
4. Export TOOLS list with your tool functions
5. Restart MCP server: sudo supervisorctl restart dss-mcp
"""
from .tools import example_tool
TOOLS = [
example_tool
]
__version__ = "1.0.0"
__author__ = "Your Name"
__description__ = "Description of what your plugin does"
tools/dss_mcp/plugins/_template/tools.py:
"""Plugin tool implementations."""
import logging
logger = logging.getLogger(__name__)
async def example_tool(input_param: str) -> dict:
"""
Example tool function.
Args:
input_param: Description of parameter
Returns:
Result dictionary
"""
logger.info(f"Example tool called with: {input_param}")
# Your implementation here
return {
"status": "success",
"result": f"Processed: {input_param}"
}
Phase 3: Example Plugins (1 day)
Create 2-3 real-world plugins:
- network-logger - Captures network requests via Shadow State
- performance-analyzer - Analyzes performance metrics
- workflow-helper - Common workflow shortcuts
Phase 4: Optional Discovery API (0.5 day)
Add to tools/api/server.py:
```python
import importlib
import logging
from pathlib import Path

logger = logging.getLogger(__name__)


@app.get("/api/plugins/list")
async def list_plugins():
    """List all available server-side plugins."""
    plugins_dir = Path("/home/overbits/dss/tools/dss_mcp/plugins")
    plugins = []
    for plugin_dir in plugins_dir.iterdir():
        if plugin_dir.is_dir() and not plugin_dir.name.startswith("_"):
            try:
                # Import to get metadata
                module_name = f"tools.dss_mcp.plugins.{plugin_dir.name}"
                module = importlib.import_module(module_name)
                plugins.append({
                    "id": plugin_dir.name,
                    "name": plugin_dir.name.replace("-", " ").title(),
                    "version": getattr(module, "__version__", "unknown"),
                    "author": getattr(module, "__author__", "unknown"),
                    "description": getattr(module, "__description__", ""),
                    "tools_count": len(getattr(module, "TOOLS", [])),
                })
            except Exception as e:
                logger.error(f"Error reading plugin {plugin_dir.name}: {e}")
    return {"plugins": plugins}
```
Developer Workflow
Adding a New Plugin
```shell
# 1. Copy template
cd /home/overbits/dss/tools/dss_mcp/plugins
cp -r _template my-new-plugin

# 2. Edit plugin files
vim my-new-plugin/__init__.py
#    Update metadata (__version__, __author__, __description__)
#    Update TOOLS list
vim my-new-plugin/tools.py
#    Implement your tool functions

# 3. Restart MCP server
sudo supervisorctl restart dss-mcp

# 4. Verify plugin loaded
tail -20 /var/log/supervisor/dss-mcp.log | grep "my-new-plugin"

# 5. Test from Claude Code
#    All tools should now be available automatically!
```
Using Plugins (Developer Perspective)
```python
# Developer connects to DSS MCP server.
# All plugins auto-available, no installation needed!

# Example: Use network logger plugin
result = await get_network_requests(
    session_id="abc123",
    filter_type="xhr",
)

# Example: Analyze performance
analysis = await analyze_network_waterfall(session_id="abc123")
```
Key Advantages
- ✅ Zero Client Setup - No installation on developer machines
- ✅ Central Management - Update plugins server-side, all devs get new version
- ✅ Instant Availability - Connect to MCP → all plugins ready
- ✅ Team Consistency - Everyone uses exact same toolset
- ✅ Simple Development - Copy template, edit, restart server
- ✅ Works with REMOTE/LOCAL - Plugins use Strategy Pattern
- ✅ Resilient - Bad plugin doesn't crash server
- ✅ No External Dependencies - Everything internal
Security & Access Control
For Internal Use:
- ✅ No authentication needed for plugin loading (trusted code)
- ✅ File permissions on `/plugins/` directory (dev team only)
- ✅ Access control at MCP server level (who can connect)
- ✅ Code review before adding to `/plugins/` (team process)
Security Note:
Since this is internal-only:
- All code in `/plugins/` is trusted (written by team)
- No sandboxing needed (not running untrusted code)
- Simple file permissions sufficient (Unix permissions)
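A sketch of those Unix permissions; the `devteam` group name is illustrative, and the snippet demos on a temp-dir stand-in so it can run anywhere (use the real plugins path in production):

```shell
# Restrict the plugins directory to the dev team.
PLUGINS_DIR="${PLUGINS_DIR:-$(mktemp -d)}"   # real value: /home/overbits/dss/tools/dss_mcp/plugins
chmod 2770 "$PLUGINS_DIR"    # rwx for owner+group only; setgid keeps the group on new files
# chgrp -R devteam "$PLUGINS_DIR"            # uncomment with the actual dev-team group
ls -ld "$PLUGINS_DIR"
```

The setgid bit (`2` in `2770`) means plugin directories created later inherit the team group automatically, so code review plus group membership is the whole access-control story.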
Comparison: Previous vs Current Approach
| Aspect | GitHub Marketplace (Previous) | Server-Side Loading (Current) |
|---|---|---|
| Use Case | Public distribution | Internal team use |
| Installation | Client downloads | Zero installation |
| Updates | Each dev updates | Server-side only |
| Complexity | High (GitHub integration) | Low (file-based) |
| Implementation | 5-7 days | 2-3 days |
| Maintenance | Ongoing (registry API) | Minimal (add files) |
| Developer UX | Multi-step install | Instant availability |
| Consistency | Version mismatches | Always same version |
Future Enhancements
Possible Additions (Not Required Initially):
- Hot Reloading - Reload plugins without server restart
  - Use `importlib.reload()` with file watching
  - WATCH OUT: Python caching issues
- Plugin Dependencies - Handle inter-plugin dependencies
  - Add `REQUIRES = ['other-plugin']` to the contract
  - Load in dependency order
- Plugin Versioning - Semantic versioning and compatibility
  - Check `__min_server_version__` before loading
  - Warn on incompatible plugins
- Local Proxy Agent - For LOCAL mode localhost access
  - Lightweight agent on developer machine
  - Reverse tunnel or SSH forwarding
  - Enables LOCAL browser automation from server-side plugins
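To make the hot-reloading caveat concrete, here is a self-contained sketch of `importlib.reload()` and the stale-reference pitfall it carries; the `hot_demo` module is a throwaway created in a temp directory:

```python
import importlib
import os
import sys
import tempfile
import time
from pathlib import Path

# Create a tiny module on disk so we can reload it.
root = Path(tempfile.mkdtemp())
(root / "hot_demo.py").write_text("VALUE = 1\n")
sys.path.insert(0, str(root))

import hot_demo

assert hot_demo.VALUE == 1

# Edit the file, bump its mtime so the bytecode cache is invalidated,
# then reload. Note the caching pitfall flagged above: reload re-executes
# the module in place, but any references already handed out (e.g. tool
# functions registered with FastMCP) still point at the OLD objects.
(root / "hot_demo.py").write_text("VALUE = 2\n")
os.utime(root / "hot_demo.py", (time.time() + 10, time.time() + 10))
importlib.reload(hot_demo)
print(hot_demo.VALUE)  # prints 2
```

Because already-registered tool callables would keep pointing at pre-reload objects, a supervised restart (as the validation recommends) stays the safer default.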
Expert Validation Summary
From Gemini 3 Pro Preview analysis:
"This is a robust architectural pivot. Moving from a hypothetical 'GitHub Marketplace' to a concrete Server-Side Plugin Loader significantly reduces complexity while solving the immediate requirement: extending DSS capabilities without managing client-side installations."
Key Validation Points:
- ✅ Deferred Registration Pattern correctly identified for FastMCP
- ✅ Resiliency Core with try/except to prevent bad plugins crashing server
- ✅ Plugin Contract with TOOLS list export pattern
- ✅ Naming Collision Risk noted (mitigated by team discipline)
- ✅ Dependency Risk handled via try/except and logging
- ✅ Hot Reloading correctly identified as risky (use supervisord restart)
Next Steps
Priority Order:
- 🔴 Phase 1 - Create `plugin_loader.py` (1 day)
- 🟡 Phase 2 - Create plugin directory structure (0.5 day)
- 🟢 Phase 3 - Create example plugins (1 day)
- 🔵 Phase 4 - Optional discovery API (0.5 day)
Total Effort: 2-3 days for complete implementation
Start With: Phase 1 - implement and test plugin_loader.py with a single test plugin.
Status: ✅ Ready for Implementation Confidence: Very High (95%+) Last Updated: 2025-12-06