diff --git a/.gitignore b/.gitignore
index 0d59aaf..4702ba0 100644
--- a/.gitignore
+++ b/.gitignore
@@ -99,9 +99,12 @@ ehthumbs.db
Thumbs.db
# Trading bot specific
-data/
-backtest-results/
-logs/
+.data/
+.backtest-results/
+.logs/
+.old/
+.mongo/
+.chat/
*.db
*.sqlite
*.sqlite3
diff --git a/ARCHITECTURE.md b/ARCHITECTURE.md
deleted file mode 100644
index 8838558..0000000
--- a/ARCHITECTURE.md
+++ /dev/null
@@ -1,703 +0,0 @@
-# Stock Bot Trading System - Architecture Documentation
-
-## Table of Contents
-- [System Overview](#system-overview)
-- [Current Architecture](#current-architecture)
-- [Service Breakdown](#service-breakdown)
-- [Data Flow](#data-flow)
-- [Technology Stack](#technology-stack)
-- [Future Architecture](#future-architecture)
-- [Improvement Recommendations](#improvement-recommendations)
-- [Deployment Architecture](#deployment-architecture)
-- [Security Architecture](#security-architecture)
-- [Monitoring & Observability](#monitoring--observability)
-
----
-
-## System Overview
-
-The Stock Bot Trading System is a **microservice-based**, **event-driven** trading platform built for **real-time market analysis**, **strategy execution**, and **risk management**. The system follows a **service-oriented architecture (SOA)** with **clear separation of concerns** and **horizontal scalability**.
-
-### Core Principles
-- **Microservices Architecture**: Independent, deployable services
-- **Event-Driven Communication**: WebSocket and Redis pub/sub
-- **Real-Time Processing**: Sub-second latency requirements
-- **Scalable Design**: Horizontal scaling capabilities
-- **Fault Tolerance**: Circuit breakers and graceful degradation
-- **Type Safety**: Full TypeScript implementation
-
----
-
-## Current Architecture
-
-```mermaid
-graph TB
- subgraph "Frontend Layer"
- UI[Angular Trading Dashboard]
- end
-
- subgraph "API Gateway Layer"
- GW[API Gateway - Future]
- end
-
- subgraph "Core Services"
-        MDG[Market Data Gateway<br/>Port 3001]
-        RG[Risk Guardian<br/>Port 3002]
-        SO[Strategy Orchestrator<br/>Port 4001]
- end
-
- subgraph "Data Layer"
- Redis[(Redis Cache)]
- PG[(PostgreSQL)]
- QDB[(QuestDB)]
- Mongo[(MongoDB)]
- end
-
- subgraph "External APIs"
- Alpha[Alpha Vantage]
- IEX[IEX Cloud]
- Yahoo[Yahoo Finance]
- end
-
- UI -->|WebSocket/HTTP| MDG
- UI -->|WebSocket/HTTP| RG
- UI -->|WebSocket/HTTP| SO
-
- MDG --> Redis
- MDG --> QDB
- MDG -->|Fetch Data| Alpha
- MDG -->|Fetch Data| IEX
- MDG -->|Fetch Data| Yahoo
-
- RG --> Redis
- RG --> PG
-
- SO --> Redis
- SO --> PG
- SO --> Mongo
-```
-
----
-
-## Service Breakdown
-
-### **1. Interface Services**
-
-#### **Trading Dashboard** (`apps/interface-services/trading-dashboard`)
-- **Framework**: Angular 20 + Angular Material + Tailwind CSS
-- **Port**: 4200 (development)
-- **Purpose**: Real-time trading interface and strategy management
-- **Key Features**:
- - Real-time market data visualization
- - Strategy creation and backtesting UI
- - Risk management dashboard
- - Portfolio monitoring
- - WebSocket integration for live updates
-
-**Current Structure:**
-```
-trading-dashboard/
-├── src/
-│   ├── app/
-│   │   ├── components/              # Reusable UI components
-│   │   │   ├── sidebar/             # Navigation sidebar
-│   │   │   └── notifications/       # Alert system
-│   │   ├── pages/                   # Route-based pages
-│   │   │   ├── dashboard/           # Main trading dashboard
-│   │   │   ├── market-data/         # Market data visualization
-│   │   │   ├── portfolio/           # Portfolio management
-│   │   │   ├── strategies/          # Strategy management
-│   │   │   └── risk-management/     # Risk controls
-│   │   ├── services/                # Angular services
-│   │   │   ├── api.service.ts       # HTTP API communication
-│   │   │   ├── websocket.service.ts # WebSocket management
-│   │   │   └── strategy.service.ts  # Strategy operations
-│   │   └── shared/                  # Shared utilities
-│   └── styles.css                   # Global styles
-```
-
-### **2. Core Services**
-
-#### **Market Data Gateway** (`apps/core-services/market-data-gateway`)
-- **Framework**: Hono + Bun
-- **Port**: 3001
-- **Purpose**: Market data aggregation and real-time distribution
-- **Database**: QuestDB (time-series), Redis (caching)
-
-**Responsibilities:**
-- Aggregate data from multiple market data providers
-- Real-time WebSocket streaming to clients
-- Historical data storage and retrieval
-- Rate limiting and API management
-- Data normalization and validation
-
-**API Endpoints:**
-```
-GET /health # Health check
-GET /api/market-data/:symbol # Get latest market data
-GET /api/historical/:symbol # Get historical data
-WS /ws # WebSocket for real-time data
-```
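-
-These endpoints can be exercised with plain `fetch` and the native WebSocket API. The sketch below is illustrative: the paths match the list above, but the subscribe-message shape is an assumption rather than a documented contract.
-
-```typescript
-// Hedged sketch — the subscribe message format is assumed, not specified.
-const res = await fetch('http://localhost:3001/api/market-data/AAPL');
-const latest = await res.json();
-
-const ws = new WebSocket('ws://localhost:3001/ws');
-ws.onopen = () =>
-  ws.send(JSON.stringify({ action: 'subscribe', symbols: ['AAPL'] }));
-ws.onmessage = (event) => console.log(JSON.parse(event.data));
-```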
-
-#### **Risk Guardian** (`apps/core-services/risk-guardian`)
-- **Framework**: Hono + Bun
-- **Port**: 3002
-- **Purpose**: Real-time risk monitoring and controls
-- **Database**: PostgreSQL (persistent), Redis (real-time)
-
-**Responsibilities:**
-- Position size monitoring
-- Daily loss tracking
-- Portfolio risk assessment
-- Volatility monitoring
-- Real-time risk alerts
-- Risk threshold management
-
-**API Endpoints:**
-```
-GET /health # Health check
-GET /api/risk/thresholds # Get risk thresholds
-PUT /api/risk/thresholds # Update risk thresholds
-POST /api/risk/evaluate # Evaluate position risk
-GET /api/risk/history # Risk evaluation history
-WS /ws # WebSocket for risk alerts
-```
-
-#### **Strategy Orchestrator** (`apps/intelligence-services/strategy-orchestrator`)
-- **Framework**: Hono + Bun
-- **Port**: 4001
-- **Purpose**: Strategy lifecycle management and execution
-- **Database**: MongoDB (strategies), PostgreSQL (trades), Redis (signals)
-
-**Responsibilities:**
-- Strategy creation and management
-- Backtesting engine (vectorized & event-based)
-- Real-time strategy execution
-- Signal generation and broadcasting
-- Performance analytics
-- Strategy optimization
-
-**Current Structure:**
-```
-strategy-orchestrator/
-├── src/
-│   ├── core/
-│   │   ├── backtesting/
-│   │   │   ├── BacktestEngine.ts           # Main backtesting engine
-│   │   │   ├── BacktestService.ts          # Backtesting service layer
-│   │   │   ├── MarketDataFeed.ts           # Historical data provider
-│   │   │   └── PerformanceAnalytics.ts     # Performance metrics
-│   │   ├── execution/
-│   │   │   └── StrategyExecutionService.ts # Real-time execution
-│   │   ├── strategies/
-│   │   │   ├── Strategy.ts                 # Base strategy interface
-│   │   │   ├── StrategyRegistry.ts         # Strategy management
-│   │   │   ├── BaseStrategy.ts             # Abstract base class
-│   │   │   ├── VectorizedStrategy.ts       # Vectorized base class
-│   │   │   ├── MovingAverageCrossover.ts   # MA strategy
-│   │   │   └── MeanReversionStrategy.ts    # Mean reversion
-│   │   └── indicators/
-│   │       └── TechnicalIndicators.ts      # Technical analysis
-│   ├── controllers/
-│   │   └── StrategyController.ts           # API endpoints
-│   └── index.ts                            # Main entry point
-```
-
-**API Endpoints:**
-```
-GET /health # Health check
-GET /api/strategies # List strategies
-POST /api/strategies # Create strategy
-PUT /api/strategies/:id # Update strategy
-POST /api/strategies/:id/:action # Start/stop/pause strategy
-GET /api/strategies/:id/signals # Get strategy signals
-POST /api/strategies/:id/backtest # Run backtest
-GET /api/strategies/:id/performance # Get performance metrics
-WS /ws # WebSocket for strategy updates
-```
-
-### **3. Shared Packages**
-
-#### **Shared Types** (`packages/types`)
-```typescript
-export interface MarketData {
- symbol: string;
- price: number;
- volume: number;
- timestamp: Date;
- bid: number;
- ask: number;
-}
-
-export interface Strategy {
- id: string;
- name: string;
- symbols: string[];
-  parameters: Record<string, unknown>;
- status: 'ACTIVE' | 'PAUSED' | 'STOPPED';
-}
-
-export interface BacktestResult {
- totalReturn: number;
- sharpeRatio: number;
- maxDrawdown: number;
- winRate: number;
- totalTrades: number;
-}
-```
-
-#### **Configuration** (`packages/config`)
-```typescript
-export const config = {
- redis: {
- host: process.env.REDIS_HOST || 'localhost',
- port: parseInt(process.env.REDIS_PORT || '6379')
- },
- database: {
- postgres: process.env.POSTGRES_URL,
- questdb: process.env.QUESTDB_URL,
- mongodb: process.env.MONGODB_URL
- },
- marketData: {
- alphaVantageKey: process.env.ALPHA_VANTAGE_KEY,
- iexKey: process.env.IEX_KEY
- }
-};
-```
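-
-A service would consume this shared config when opening connections. The example below assumes `ioredis` as the client and `@stock-bot/config` as the package name; neither is confirmed by the codebase.
-
-```typescript
-// Assumed client library (ioredis) and package name — adjust to the actual setup.
-import Redis from 'ioredis';
-import { config } from '@stock-bot/config';
-
-const redis = new Redis({ host: config.redis.host, port: config.redis.port });
-```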
-
----
-
-## Data Flow
-
-### **Real-Time Market Data Flow**
-```mermaid
-sequenceDiagram
- participant EXT as External APIs
- participant MDG as Market Data Gateway
- participant Redis as Redis Cache
- participant QDB as QuestDB
- participant UI as Trading Dashboard
-
- EXT->>MDG: Market data feed
- MDG->>Redis: Cache latest prices
- MDG->>QDB: Store historical data
- MDG->>UI: WebSocket broadcast
- UI->>UI: Update charts/tables
-```
-
-### **Strategy Execution Flow**
-```mermaid
-sequenceDiagram
- participant UI as Trading Dashboard
- participant SO as Strategy Orchestrator
- participant MDG as Market Data Gateway
- participant RG as Risk Guardian
- participant Redis as Redis
-
- UI->>SO: Start strategy
- SO->>MDG: Subscribe to market data
- MDG->>SO: Real-time price updates
- SO->>SO: Generate trading signals
- SO->>RG: Risk evaluation
- RG->>SO: Risk approval/rejection
- SO->>Redis: Store signals
- SO->>UI: WebSocket signal broadcast
-```
-
----
-
-## Technology Stack
-
-### **Backend**
-- **Runtime**: Bun (ultra-fast JavaScript runtime)
-- **Web Framework**: Hono (lightweight, fast web framework)
-- **Language**: TypeScript (type safety)
-- **Build Tool**: Turbo (monorepo management)
-
-### **Frontend**
-- **Framework**: Angular 20 (latest stable)
-- **UI Library**: Angular Material + Tailwind CSS
-- **State Management**: Angular Signals (reactive programming)
-- **WebSocket**: Native WebSocket API
-
-### **Databases**
-- **Time-Series**: QuestDB (market data storage)
-- **Relational**: PostgreSQL (structured data)
-- **Document**: MongoDB (strategy configurations)
-- **Cache/Pub-Sub**: Redis (real-time data)
-
-### **Infrastructure**
-- **Containerization**: Docker + Docker Compose
-- **Process Management**: PM2 (production)
-- **Monitoring**: Built-in health checks
-- **Development**: Hot reload, TypeScript compilation
-
----
-
-## Future Architecture
-
-### **Phase 1: Enhanced Microservices (Q2 2025)**
-```mermaid
-graph TB
- subgraph "API Gateway Layer"
- GW[Kong/Envoy API Gateway]
- LB[Load Balancer]
- end
-
- subgraph "Authentication"
-        AUTH[Auth Service<br/>JWT + OAuth]
- end
-
- subgraph "Core Services"
- MDG[Market Data Gateway]
- RG[Risk Guardian]
- SO[Strategy Orchestrator]
- OE[Order Execution Engine]
- NS[Notification Service]
- end
-
- subgraph "Analytics Services"
- BA[Backtest Analytics]
- PA[Performance Analytics]
- ML[ML Prediction Service]
- end
-
- subgraph "Message Queue"
- NATS[NATS/Apache Kafka]
- end
-```
-
-### **Phase 2: Machine Learning Integration (Q3 2025)**
-- **ML Pipeline**: Python-based ML services
-- **Feature Engineering**: Real-time feature computation
-- **Model Training**: Automated model retraining
-- **Prediction API**: Real-time predictions
-
-### **Phase 3: Multi-Asset Support (Q4 2025)**
-- **Crypto Trading**: Binance, Coinbase integration
-- **Forex Trading**: OANDA, FXCM integration
-- **Options Trading**: Interactive Brokers integration
-- **Futures Trading**: CME, ICE integration
-
----
-
-## Improvement Recommendations
-
-### **1. High Priority Improvements**
-
-#### **API Gateway Implementation**
-Introduce Kong or Envoy to provide:
-- Rate limiting per service
-- Authentication/authorization
-- Request/response transformation
-- Circuit breaker patterns
-- Load balancing
-
-#### **Enhanced Error Handling**
-```typescript
-// Implement structured error handling:
-interface ServiceError {
- code: string;
- message: string;
- service: string;
- timestamp: Date;
- correlationId: string;
-}
-```
-
-#### **Comprehensive Logging**
-```typescript
-// Implement structured logging:
-interface LogEntry {
- level: 'debug' | 'info' | 'warn' | 'error';
- service: string;
- message: string;
-  metadata: Record<string, unknown>;
- timestamp: Date;
- correlationId: string;
-}
-```
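-
-A minimal emitter for this shape might look as follows; writing one JSON object per line keeps the output machine-parseable. The console transport is a placeholder for whatever log shipper is adopted.
-
-```typescript
-// Sketch only: transport and correlation-ID propagation are assumptions.
-function log(level: LogEntry['level'], service: string, message: string,
-             metadata: Record<string, unknown>, correlationId: string): void {
-  const entry: LogEntry = {
-    level, service, message, metadata,
-    timestamp: new Date(), correlationId,
-  };
-  console.log(JSON.stringify(entry)); // one JSON object per line
-}
-```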
-
-### **2. Medium Priority Improvements**
-
-#### **Database Optimization**
-```sql
--- QuestDB optimizations for market data:
-CREATE TABLE market_data (
- symbol SYMBOL,
- timestamp TIMESTAMP,
- price DOUBLE,
- volume DOUBLE,
- bid DOUBLE,
- ask DOUBLE
-) timestamp(timestamp) PARTITION BY DAY;
-
--- QuestDB has no CREATE INDEX statement; SYMBOL columns are indexed explicitly:
-ALTER TABLE market_data ALTER COLUMN symbol ADD INDEX;
-```
-
-#### **Caching Strategy**
-```typescript
-// Implement multi-layer caching:
-interface CacheStrategy {
-  L1: Map<string, unknown>;  // In-memory cache
- L2: Redis; // Distributed cache
- L3: Database; // Persistent storage
-}
-```
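-
-A read-through lookup over those three layers could be sketched like this; `l1`, `redis`, and `db` are assumed handles, not existing code.
-
-```typescript
-// Illustrative read-through cache — handle names are hypothetical.
-async function getQuote(symbol: string): Promise<MarketData | undefined> {
-  const hot = l1.get(symbol);                        // L1: in-process map
-  if (hot) return hot;
-  const cached = await redis.get(`quote:${symbol}`); // L2: distributed cache
-  if (cached) return JSON.parse(cached);
-  return db.latestQuote(symbol);                     // L3: persistent storage
-}
-```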
-
-#### **WebSocket Optimization**
-```typescript
-// Implement WebSocket connection pooling:
-interface WSConnectionPool {
-  connections: Map<string, WebSocket>;
- balancer: RoundRobinBalancer;
- heartbeat: HeartbeatManager;
-}
-```
-
-### **3. Low Priority Improvements**
-
-#### **Code Quality**
-- Implement comprehensive unit tests (>90% coverage)
-- Add integration tests for all services
-- Implement E2E tests for critical user flows
-- Add performance benchmarks
-
-#### **Documentation**
-- API documentation with OpenAPI/Swagger
-- Developer onboarding guide
-- Deployment runbooks
-- Architecture decision records (ADRs)
-
----
-
-## Deployment Architecture
-
-### **Development Environment**
-```yaml
-# docker-compose.dev.yml
-version: '3.8'
-services:
- # Databases
- postgres:
- image: postgres:15
- ports: ["5432:5432"]
-
- redis:
- image: redis:7-alpine
- ports: ["6379:6379"]
-
- questdb:
- image: questdb/questdb:latest
- ports: ["9000:9000", "8812:8812"]
-
- mongodb:
- image: mongo:6
- ports: ["27017:27017"]
-
- # Services
- market-data-gateway:
- build: ./apps/core-services/market-data-gateway
- ports: ["3001:3001"]
- depends_on: [redis, questdb]
-
- risk-guardian:
- build: ./apps/core-services/risk-guardian
- ports: ["3002:3002"]
- depends_on: [postgres, redis]
-
- strategy-orchestrator:
- build: ./apps/intelligence-services/strategy-orchestrator
- ports: ["4001:4001"]
- depends_on: [mongodb, postgres, redis]
-
- trading-dashboard:
- build: ./apps/interface-services/trading-dashboard
- ports: ["4200:4200"]
-```
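-
-Assuming the file is saved as `docker-compose.dev.yml` at the repo root, the stack can be brought up and smoke-tested with:
-
-```bash
-docker compose -f docker-compose.dev.yml up -d
-curl http://localhost:3001/health   # Market Data Gateway health check
-```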
-
-### **Production Environment**
-```yaml
-# kubernetes deployment example
-apiVersion: apps/v1
-kind: Deployment
-metadata:
- name: market-data-gateway
-spec:
- replicas: 3
- selector:
- matchLabels:
- app: market-data-gateway
- template:
- metadata:
- labels:
- app: market-data-gateway
- spec:
- containers:
- - name: market-data-gateway
- image: stockbot/market-data-gateway:latest
- ports:
- - containerPort: 3001
- resources:
- requests:
- memory: "256Mi"
- cpu: "250m"
- limits:
- memory: "512Mi"
- cpu: "500m"
-```
-
----
-
-## Security Architecture
-
-### **Authentication & Authorization**
-```typescript
-// JWT-based authentication
-interface AuthToken {
- userId: string;
- roles: string[];
- permissions: string[];
- expiresAt: Date;
-}
-
-// Role-based access control
-enum UserRole {
- TRADER = 'TRADER',
- ADMIN = 'ADMIN',
- VIEWER = 'VIEWER'
-}
-
-enum Permission {
- READ_MARKET_DATA = 'READ_MARKET_DATA',
- EXECUTE_TRADES = 'EXECUTE_TRADES',
- MANAGE_STRATEGIES = 'MANAGE_STRATEGIES',
- CONFIGURE_RISK = 'CONFIGURE_RISK'
-}
-```
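-
-A permission guard over these types might be as small as the following; how the token is verified and attached to the request is left open.
-
-```typescript
-// Sketch only — JWT verification is out of scope here.
-function requirePermission(token: AuthToken, needed: Permission): void {
-  if (!token.permissions.includes(needed)) {
-    throw new Error(`Missing permission: ${needed}`);
-  }
-}
-```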
-
-### **API Security**
-```typescript
-// Rate limiting configuration
-interface RateLimit {
- windowMs: number; // 15 minutes
- maxRequests: number; // 100 requests per window
- skipIf: (req: Request) => boolean;
-}
-
-// Input validation
-interface ApiValidation {
- schema: JSONSchema;
- sanitization: SanitizationRules;
- authentication: AuthenticationMiddleware;
-}
-```
-
-### **Data Security**
-- **Encryption at Rest**: AES-256 for sensitive data
-- **Encryption in Transit**: TLS 1.3 for all communications
-- **Secrets Management**: Kubernetes secrets or HashiCorp Vault
-- **Network Security**: VPC, security groups, firewalls
-
----
-
-## Monitoring & Observability
-
-### **Metrics Collection**
-```typescript
-interface ServiceMetrics {
- // Performance metrics
- requestLatency: Histogram;
- requestRate: Counter;
- errorRate: Counter;
-
- // Business metrics
- tradesExecuted: Counter;
- strategiesActive: Gauge;
- portfolioValue: Gauge;
-
- // System metrics
- memoryUsage: Gauge;
- cpuUsage: Gauge;
- dbConnections: Gauge;
-}
-```
-
-### **Health Checks**
-```typescript
-interface HealthCheck {
- service: string;
- status: 'healthy' | 'degraded' | 'unhealthy';
- checks: {
- database: boolean;
- redis: boolean;
- externalApis: boolean;
- webSocket: boolean;
- };
- uptime: number;
- version: string;
-}
-```
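-
-In a Hono service, a health endpoint returning this shape could be sketched as below; the individual checks are stubbed with constants rather than real probes.
-
-```typescript
-import { Hono } from 'hono';
-
-const app = new Hono();
-// Stubbed health response — real services would probe each dependency.
-app.get('/health', (c) => c.json({
-  service: 'market-data-gateway',
-  status: 'healthy',
-  checks: { database: true, redis: true, externalApis: true, webSocket: true },
-  uptime: process.uptime(),
-  version: '0.1.0',
-}));
-```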
-
-### **Alerting Rules**
-```yaml
-# Prometheus alerting rules
-groups:
-- name: stockbot
- rules:
- - alert: HighErrorRate
- expr: rate(http_requests_total{status=~"5.."}[5m]) > 0.1
- for: 2m
-
- - alert: HighLatency
- expr: http_request_duration_seconds{quantile="0.95"} > 1
- for: 2m
-
- - alert: ServiceDown
- expr: up{job="stockbot"} == 0
- for: 30s
-```
-
----
-
-## Migration Plan
-
-### **Phase 1: Current → Enhanced (1-2 months)**
-1. **Week 1-2**: Implement API Gateway and authentication
-2. **Week 3-4**: Add comprehensive logging and monitoring
-3. **Week 5-6**: Enhance error handling and resilience
-4. **Week 7-8**: Performance optimization and testing
-
-### **Phase 2: Enhanced → ML-Ready (2-3 months)**
-1. **Month 1**: Implement ML pipeline infrastructure
-2. **Month 2**: Develop feature engineering services
-3. **Month 3**: Integrate ML predictions into strategies
-
-### **Phase 3: ML-Ready → Multi-Asset (3-4 months)**
-1. **Month 1**: Abstract market data interfaces
-2. **Month 2**: Implement crypto trading support
-3. **Month 3**: Add forex and options trading
-4. **Month 4**: Performance optimization and testing
-
----
-
-## Success Metrics
-
-### **Technical KPIs**
-- **Latency**: < 100ms for market data updates
-- **Throughput**: > 10,000 requests/second
-- **Availability**: 99.9% uptime
-- **Error Rate**: < 0.1% of requests
-
-### **Business KPIs**
-- **Strategy Performance**: Sharpe ratio > 1.5
-- **Risk Management**: Max drawdown < 5%
-- **Execution Quality**: Slippage < 0.01%
-- **System Adoption**: > 90% user satisfaction
-
----
-
-This architecture document serves as a living blueprint for the Stock Bot Trading System, providing clear guidance for current development and future scaling decisions.
\ No newline at end of file
diff --git a/CONTEXT.md b/CONTEXT.md
deleted file mode 100644
index 52b8c6b..0000000
--- a/CONTEXT.md
+++ /dev/null
@@ -1,229 +0,0 @@
-# Stock Bot Trading System - Architecture Context
-
-## System Overview
-
-A comprehensive, microservice-based trading bot system built with **Bun**, **TypeScript**, and **Turborepo**. The system features a service-oriented architecture designed for real-time market data processing, strategy execution, and risk management.
-
-## Current System Status
-
-### ✅ **Operational Services**
-- **Market Data Gateway** (`apps/core-services/market-data-gateway`) - Port 3004
- - **UNIFIED IMPLEMENTATION** - Merged from duplicate services
- - Real-time market data ingestion and processing
- - WebSocket server for live data streaming (Port 3005)
- - REST API for market data queries and configuration
- - Mock data implementation for testing
- - Full TypeScript implementation with resolved compilation errors
-
-- **Trading Dashboard** (`apps/interface-services/trading-dashboard`) - Port 5173
- - React + TypeScript frontend with Tremor UI
- - Real-time data visualization
- - WebSocket client for live updates
- - Professional financial dashboard components
-
-### **Ready for Implementation**
-- **Strategy Orchestrator** (`apps/intelligence-services/strategy-orchestrator`) - Port 4001
- - Package structure created, implementation needed
- - Strategy execution and management
- - Signal generation coordination
-
-- **Risk Guardian** (`apps/core-services/risk-guardian`) - Port 3002
- - Package structure created, implementation needed
- - Real-time risk monitoring and alerts
- - Position and exposure limits
-
-## Service Architecture
-
-### **Service Categories**
-```
-apps/
-├── core-services/              # Essential trading infrastructure
-│   ├── market-data-gateway/    ✅ Operational (UNIFIED)
-│   └── risk-guardian/          Ready to implement
-├── intelligence-services/      # Strategy and signal generation
-│   └── strategy-orchestrator/  Ready to implement
-├── interface-services/         # User interfaces and APIs
-│   ├── trading-dashboard/      ✅ Operational
-│   └── trading-dashboard-react/  Alternative implementation
-├── data-services/              # Data processing and analytics
-├── execution-services/         # Order management and execution
-├── integration-services/       # External system integrations
-└── platform-services/          # Infrastructure and monitoring
-```
-
-### **Shared Packages**
-```
-packages/
-├── types/          # TypeScript type definitions
-├── config/         # Configuration management
-├── database/       # Database utilities (planned)
-└── trading-core/   # Core trading logic (planned)
-```
-
-## Communication Architecture
-
-### **Event-Driven Core (Dragonfly)**
-- **Primary Event Bus**: Dragonfly Redis-compatible streams
-- **Event Types**: Market data, trading signals, risk alerts, portfolio updates
-- **Communication Pattern**: Publish/Subscribe for loose coupling
-- **Real-time Processing**: Sub-millisecond event propagation
-
-### **Data Flow Patterns**
-1. **Market Data Flow**
- ```
-   External APIs → Market Data Gateway → Dragonfly Events → Dashboard/Services
- ```
-
-2. **Trading Signal Flow**
- ```
-   Market Data → Strategy Orchestrator → Signals → Risk Guardian → Execution
- ```
-
-3. **Real-time Updates**
- ```
-   Services → WebSocket Server → Dashboard (Live UI Updates)
- ```
-
-### **API Communication**
-- **REST APIs**: Service-to-service and client-to-service communication
-- **WebSocket**: Real-time bidirectional communication
-- **HTTP Health Checks**: Service monitoring and discovery
-
-## Data Infrastructure
-
-### **Storage Systems**
-- **Dragonfly** (Port 6379): Redis-compatible event streaming and caching
-- **PostgreSQL** (Port 5432): Operational data with trading schemas
-- **QuestDB** (Ports 9000/8812): Time-series market data and analytics
-- **MongoDB** (Port 27017): Document storage for sentiment analysis and raw documents
-
-### **Database Schemas**
-- `trading.*` - Orders, positions, executions, accounts
-- `strategy.*` - Strategies, signals, performance metrics
-- `risk.*` - Risk limits, events, monitoring
-- `audit.*` - System events, health checks, configuration
-
-### **MongoDB Collections**
-- `sentiment_analysis` - Market sentiment scores and analysis
-- `raw_documents` - News articles, social media posts, research reports
-- `market_events` - Significant market events and their impact
-
-### **Admin Interfaces**
-- **Redis Insight** (Port 8001): Dragonfly management
-- **PgAdmin** (Port 8080): PostgreSQL administration
-- **QuestDB Console** (Port 9000): Time-series data management
-- **Mongo Express** (Port 8081): MongoDB document browser and editor
-
-## Infrastructure & DevOps
-
-### **Container Management**
-- **Docker Compose**: Multi-service orchestration
-- **Development Environment**: `npm run dev:full`
-- **Infrastructure Management**: PowerShell scripts in `scripts/docker.ps1`
-
-### **Monitoring Stack**
-- **Prometheus** (Port 9090): Metrics collection
-- **Grafana** (Port 3000): Dashboards and alerting
-- **Health Checks**: Each service exposes `/health` endpoints
-
-### **Development Workflow**
-```bash
-# Start infrastructure
-npm run infra:up
-
-# Start admin tools
-npm run docker:admin
-
-# Start development services
-npm run dev:full
-```
-
-## Port Allocation
-
-| Service Category | Port Range | Current Services |
-|-----------------|------------|------------------|
-| **Core Services** | 3001-3099 | Market Data Gateway (3001), Risk Guardian (3002) |
-| **Intelligence** | 4001-4099 | Strategy Orchestrator (4001) |
-| **Data Services** | 5001-5099 | (Future expansion) |
-| **Interface** | 5173, 8001+ | Trading Dashboard (5173) |
-| **Infrastructure** | 6379, 5432, 9000+, 27017 | Dragonfly, PostgreSQL, QuestDB, MongoDB |
-
-## Implementation Roadmap
-
-### **Phase 1 - Foundation** ✅ Complete
-- [x] Monorepo setup with Turborepo
-- [x] Market Data Gateway with real-time streaming
-- [x] Professional React dashboard with Tremor UI
-- [x] Docker infrastructure with Dragonfly/PostgreSQL/QuestDB
-- [x] WebSocket real-time communication
-
-### **Phase 2 - Trading Logic** (Next Priority)
-- [ ] Strategy Orchestrator implementation
-- [ ] Risk Guardian implementation
-- [ ] Event-driven strategy execution
-- [ ] Portfolio position tracking
-
-### **Phase 3 - Advanced Features** (Future)
-- [ ] Execution Engine with broker integration
-- [ ] Advanced analytics and backtesting
-- [ ] Machine learning signal generation
-- [ ] Multi-broker support
-
-## Technology Stack
-
-### **Backend Services**
-- **Runtime**: Bun (fast JavaScript runtime)
-- **Framework**: Hono (lightweight web framework)
-- **Language**: TypeScript (type safety)
-- **Events**: Dragonfly Redis Streams
-- **WebSocket**: Native WebSocket implementation
-
-### **Frontend**
-- **Framework**: React with TypeScript
-- **Build Tool**: Vite (fast development)
-- **UI Library**: Tremor UI (financial components)
-- **Styling**: Modern CSS with responsive design
-
-### **Infrastructure**
-- **Monorepo**: Turborepo for build orchestration
-- **Containers**: Docker & Docker Compose
-- **Databases**: PostgreSQL, QuestDB, Dragonfly
-- **Monitoring**: Prometheus + Grafana
-
-## Security & Configuration
-
-### **Environment Management**
-- Environment-specific configurations
-- Secure credential management
-- Development vs production separation
-
-### **Service Security**
-- Inter-service authentication
-- API rate limiting
-- Database connection security
-- External API key management
-
-## Getting Started
-
-1. **Prerequisites**: Docker Desktop, Bun runtime
-2. **Infrastructure**: `npm run infra:up`
-3. **Admin Tools**: `npm run docker:admin`
-4. **Development**: `npm run dev:full`
-5. **Access**: Dashboard at http://localhost:5173
-
-## Key Configuration Files
-
-- `turbo.json` - Monorepo build configuration
-- `docker-compose.yml` - Infrastructure orchestration
-- `packages/config/` - Shared configuration management
-- `database/postgres/init/` - Database schema definitions
-
----
-
-**Architecture Design Principles:**
-- **Microservices**: Independent, scalable services
-- **Event-Driven**: Loose coupling via Dragonfly events
-- **Type Safety**: TypeScript across all services
-- **Real-time**: WebSocket and event streaming
-- **Observability**: Comprehensive monitoring and logging
-- **Developer Experience**: Fast development with hot reload
diff --git a/README.md b/README.md
deleted file mode 100644
index 6269de6..0000000
--- a/README.md
+++ /dev/null
@@ -1,180 +0,0 @@
-# Stock Bot Trading System
-
-A comprehensive trading bot built with Bun and Turborepo, featuring a service-oriented architecture for real-time market data processing and strategy execution.
-
-## Quick Start
-
-### Prerequisites
-- [Bun](https://bun.sh/) runtime
-- Node.js 18+ (for compatibility)
-
-### Installation
-```bash
-# Clone and install dependencies
-git clone <repository-url>
-cd stock-bot
-bun install
-```
-
-### Running the System
-
-#### Option 1: VS Code Tasks (Recommended)
-1. Open the project in VS Code
-2. Press `Ctrl+Shift+P` (or `Cmd+Shift+P` on Mac)
-3. Type "Tasks: Run Task" and select it
-4. Choose "Start All Services"
-
-#### Option 2: Manual Startup
-```bash
-# Terminal 1: Start Market Data Gateway
-cd apps/core-services/market-data-gateway
-bun run dev
-
-# Terminal 2: Start Trading Dashboard
-cd apps/interface-services/trading-dashboard
-bun run dev
-```
-
-### Access Points
-- **Trading Dashboard**: http://localhost:5173
-- **Market Data API**: http://localhost:3001
-- **Health Check**: http://localhost:3001/health
-
-## Dashboard Features
-
-### Real-time Market Data
-- Live price feeds for AAPL, GOOGL, MSFT, TSLA, AMZN
-- WebSocket connections for real-time updates
-- Service health monitoring
-
-### Professional UI Components
-- Built with Tremor UI for financial visualizations
-- Interactive charts and metrics
-- Responsive design for all devices
-
-### Dashboard Tabs
-1. **Market Data**: Live prices, volume, bid/ask spreads
-2. **Portfolio**: Holdings allocation and performance
-3. **Charts**: Price and volume analysis
-4. **Performance**: Trading metrics and statistics
-
-## Architecture
-
-### Service-Oriented Design
-```
-apps/
-├── core-services/
-│   └── market-data-gateway/   # Market data ingestion
-├── interface-services/
-│   └── trading-dashboard/     # React dashboard
-├── data-services/             # (Future) Data processing
-├── execution-services/        # (Future) Order management
-├── intelligence-services/     # (Future) Strategy engine
-├── platform-services/         # (Future) Infrastructure
-└── integration-services/      # (Future) External APIs
-```
-
-### Shared Packages
-```
-packages/
-├── types/          # TypeScript definitions
-├── config/         # Configuration management
-├── database/       # (Future) Database utilities
-└── trading-core/   # (Future) Core trading logic
-```
-
-## Development
-
-### Project Structure
-- **Turborepo**: Monorepo management
-- **Bun**: Package manager and runtime
-- **TypeScript**: Type safety across all services
-- **React + Vite**: Modern frontend development
-- **Tremor UI**: Financial dashboard components
-
-### Key Technologies
-- **Backend**: Hono framework, WebSockets, Redis
-- **Frontend**: React, TypeScript, Tremor UI
-- **Data**: QuestDB (planned), PostgreSQL (planned)
-- **Deployment**: Docker, Kubernetes (planned)
-
-## Current Status
-
-### ✅ Completed
-- [x] Monorepo setup with Turborepo
-- [x] Market Data Gateway service
-- [x] Real-time WebSocket connections
-- [x] Professional React dashboard
-- [x] Tremor UI integration
-- [x] TypeScript type system
-- [x] Service health monitoring
-
-### In Progress
-- [ ] Strategy execution engine
-- [ ] Risk management system
-- [ ] Portfolio tracking
-- [ ] Real broker integration
-
-### Planned
-- [ ] Advanced charting
-- [ ] Backtesting framework
-- [ ] Machine learning signals
-- [ ] Multi-broker support
-- [ ] Mobile application
-
-## API Endpoints
-
-### Market Data Gateway (Port 3001)
-```
-GET /health # Service health check
-GET /api/market-data/:symbol # Current market data
-GET /api/ohlcv/:symbol # Historical OHLCV data
-WS ws://localhost:3001 # Real-time data stream
-```
-
-### Data Format
-```typescript
-interface MarketData {
- symbol: string;
- price: number;
- bid: number;
- ask: number;
- volume: number;
- timestamp: string;
-}
-```
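-
-A dashboard client can consume the stream and type each message against this interface; JSON framing is an assumption about the gateway's output.
-
-```typescript
-// Illustrative consumer of the real-time stream.
-const ws = new WebSocket('ws://localhost:3001');
-ws.onmessage = (event) => {
-  const data: MarketData = JSON.parse(event.data);
-  console.log(`${data.symbol}: ${data.price} (${data.timestamp})`);
-};
-```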
-
-## Configuration
-
-Environment variables are managed in `.env`:
-```bash
-# Database Configuration
-DATABASE_URL=postgresql://...
-QUESTDB_URL=http://localhost:9000
-
-# External APIs
-ALPHA_VANTAGE_API_KEY=your_key_here
-ALPACA_API_KEY=your_key_here
-
-# Service Configuration
-NODE_ENV=development
-LOG_LEVEL=info
-```
-
-## Contributing
-
-1. Fork the repository
-2. Create a feature branch: `git checkout -b feature/amazing-feature`
-3. Commit changes: `git commit -m 'Add amazing feature'`
-4. Push to branch: `git push origin feature/amazing-feature`
-5. Open a Pull Request
-
-## License
-
-This project is licensed under the MIT License - see the [LICENSE](LICENSE) file for details.
-
-## Acknowledgments
-
-- [Tremor UI](https://tremor.so/) for beautiful financial components
-- [Bun](https://bun.sh/) for fast runtime and package management
-- [Turborepo](https://turbo.build/) for monorepo tooling
diff --git a/REFACTORING.md b/REFACTORING.md
deleted file mode 100644
index 0f679fc..0000000
--- a/REFACTORING.md
+++ /dev/null
@@ -1,62 +0,0 @@
-# Stock Bot Project Structure Refactoring
-
-This document outlines the changes made to improve separation of concerns in the stock-bot project architecture.
-
-## Directory Structure Changes
-
-1. Created a dedicated `libs/` directory (replacing the previous `packages/` approach)
-2. Split monolithic type definitions into domain-specific modules
-3. Created specialized libraries for cross-cutting concerns
-
-## New Libraries
-
-### 1. `@stock-bot/types`
-Domain-specific type definitions organized by functional area:
-- `market/` - Market data structures (OHLCV, OrderBook, etc.)
-- `trading/` - Trading types (Orders, Positions, etc.)
-- `strategy/` - Strategy and signal types
-- `events/` - Event definitions for the event bus
-- `api/` - Common API request/response types
-- `config/` - Configuration type definitions
-
-### 2. `@stock-bot/event-bus`
-A Redis-based event bus implementation for inter-service communication:
-- Publish/subscribe pattern for asynchronous messaging
-- Support for typed events
-- Reliable message delivery
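The typed publish/subscribe surface described above can be sketched in-process. Note the real `@stock-bot/event-bus` is Redis-backed; the class and event names below are illustrative assumptions, not the library's actual API:

```typescript
// In-process sketch of a typed pub/sub surface; the real event bus is
// Redis-based and its actual API is not shown in this document.
type EventMap = {
  "market.tick": { symbol: string; price: number };
  "order.filled": { orderId: string; qty: number };
};

type Handler<T> = (payload: T) => void;

class TypedEventBus {
  private handlers = new Map<keyof EventMap, Handler<any>[]>();

  subscribe<K extends keyof EventMap>(event: K, handler: Handler<EventMap[K]>): void {
    const list = this.handlers.get(event) ?? [];
    list.push(handler);
    this.handlers.set(event, list);
  }

  publish<K extends keyof EventMap>(event: K, payload: EventMap[K]): void {
    for (const h of this.handlers.get(event) ?? []) h(payload);
  }
}

const bus = new TypedEventBus();
let lastPrice = 0;
bus.subscribe("market.tick", (t) => { lastPrice = t.price; });
bus.publish("market.tick", { symbol: "AAPL", price: 150.25 });
console.log(lastPrice); // 150.25
```

The `EventMap` keeps publishers and subscribers agreeing on payload shapes at compile time, which is the "support for typed events" benefit listed above.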
-
-### 3. `@stock-bot/utils`
-Common utility functions shared across services:
-- `dateUtils` - Date/time helpers for market data
-- `financialUtils` - Financial calculations (Sharpe ratio, drawdown)
-- `logger` - Standardized logging service
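As an example of the financial calculations mentioned, a max-drawdown helper of the kind `financialUtils` would contain — a sketch only, since the library's actual signatures are not shown here:

```typescript
// Illustrative max-drawdown calculation; function name and signature are
// assumptions, not the actual financialUtils API.
function maxDrawdown(equity: number[]): number {
  let peak = -Infinity;
  let worst = 0;
  for (const v of equity) {
    peak = Math.max(peak, v);
    worst = Math.max(worst, (peak - v) / peak);
  }
  return worst; // fraction of peak lost, e.g. 0.25 means a 25% drawdown
}

console.log(maxDrawdown([100, 120, 90, 110])); // 0.25 (120 -> 90)
```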
-
-### 4. `@stock-bot/api-client`
-Type-safe API clients for inter-service communication:
-- `BaseApiClient` - Common HTTP client functionality
-- Service-specific clients (BacktestClient, StrategyClient)
-
-## Benefits of the New Architecture
-
-1. **Better Separation of Concerns**
- - Each library has a clear, focused purpose
- - Domain types are logically grouped
-
-2. **Improved Maintainability**
- - Smaller, focused modules
- - Clear dependencies between modules
-
-3. **Type Safety**
- - Consistent types across the entire system
- - Better IDE autocompletion and error checking
-
-4. **Reduced Duplication**
- - Shared utilities in a central location
- - Consistent implementation of common patterns
-
-## Next Steps
-
-1. Update service implementations to use the new libraries
-2. Migrate from the old packages directory to the new libs structure
-3. Implement domain-specific validations in each type module
-4. Add unit tests for each library
diff --git a/apps/core-services/market-data-gateway/README.md b/apps/core-services/market-data-gateway/README.md
deleted file mode 100644
index 8a351bf..0000000
--- a/apps/core-services/market-data-gateway/README.md
+++ /dev/null
@@ -1,196 +0,0 @@
-# Market Data Gateway - Unified Implementation
-
-## Overview
-
-The Market Data Gateway is a unified service that consolidates real-time market data ingestion, processing, and distribution capabilities. This service has been created by merging the previous core-services and data-services market-data-gateway implementations into a single, comprehensive solution.
-
-## Architecture
-
-### Unified Design
-- **Single Service**: Combines data ingestion, processing, and distribution in one service
-- **HTTP API**: RESTful endpoints for configuration and data retrieval
-- **WebSocket Server**: Real-time data streaming capabilities
-- **Type Safety**: Full TypeScript implementation with comprehensive type definitions
-
-### Key Components
-- **Data Source Management**: Configure and manage multiple market data sources
-- **Real-time Processing**: Stream processing pipelines for market data
-- **WebSocket Streaming**: Real-time data distribution to clients
-- **Health Monitoring**: Comprehensive health checks and metrics
-- **Cache Management**: Redis-based caching for performance optimization
-
-## Features
-
-### HTTP Endpoints
-
-#### Health & Status
-- `GET /health` - Basic health check
-- `GET /health/readiness` - Readiness probe
-- `GET /health/liveness` - Liveness probe
-- `GET /api/v1/gateway/status` - Gateway status and metrics
-- `GET /api/v1/gateway/config` - Current configuration
-
-#### Data Sources
-- `GET /api/v1/sources` - List configured data sources
-- `POST /api/v1/sources` - Add new data source
-- `PUT /api/v1/sources/:sourceId` - Update data source
-- `DELETE /api/v1/sources/:sourceId` - Remove data source
-
-#### Market Data
-- `GET /api/v1/data/tick/:symbol` - Latest tick data for symbol
-- `GET /api/v1/data/candles/:symbol` - Historical candle data
-- `GET /api/v1/subscriptions` - List active subscriptions
-- `POST /api/v1/subscriptions` - Create new subscription
-
-#### Metrics
-- `GET /api/v1/metrics` - System and gateway metrics
-
-### WebSocket Streaming
-
-Connect to `ws://localhost:3005/ws` for real-time data streaming.
-
-#### Message Types
-
-**Subscribe to symbols:**
-```json
-{
- "type": "subscribe",
- "symbols": ["AAPL", "GOOGL", "MSFT"]
-}
-```
-
-**Unsubscribe:**
-```json
-{
- "type": "unsubscribe",
- "subscriptionId": "sub_1234567890"
-}
-```
-
-**Receive tick data:**
-```json
-{
- "type": "tick",
- "data": {
- "symbol": "AAPL",
- "price": 150.25,
- "volume": 1000,
- "timestamp": "2025-06-03T13:01:49.638Z",
- "bid": 150.20,
- "ask": 150.30
- }
-}
-```
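The message shapes above translate directly into small client-side helpers. A sketch assuming the documented payloads; the helper names are hypothetical:

```typescript
// Client-side helpers for the WebSocket messages documented above.
// Field names mirror the README payloads; function names are illustrative.
interface TickData {
  symbol: string;
  price: number;
  volume: number;
  timestamp: string;
  bid: number;
  ask: number;
}

/** Build the subscribe message for a set of symbols. */
function subscribeMessage(symbols: string[]): string {
  return JSON.stringify({ type: "subscribe", symbols });
}

/** Extract tick data from an incoming frame, or null for other types. */
function parseTick(raw: string): TickData | null {
  const msg = JSON.parse(raw);
  return msg.type === "tick" ? (msg.data as TickData) : null;
}

const sub = subscribeMessage(["AAPL", "GOOGL"]);
const tick = parseTick(
  '{"type":"tick","data":{"symbol":"AAPL","price":150.25,"volume":1000,' +
  '"timestamp":"2025-06-03T13:01:49.638Z","bid":150.2,"ask":150.3}}'
);
console.log(sub);          // {"type":"subscribe","symbols":["AAPL","GOOGL"]}
console.log(tick?.price);  // 150.25
```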
-
-## Configuration
-
-The service is configured through environment variables and the `GatewayConfig` interface:
-
-### Environment Variables
-- `PORT` - HTTP server port (default: 3004)
-- `HOST` - Server host (default: 0.0.0.0)
-- `REDIS_HOST` - Redis host (default: localhost)
-- `REDIS_PORT` - Redis port (default: 6379)
-- `REDIS_PASSWORD` - Redis password (optional)
-- `REDIS_DB` - Redis database number (default: 0)
-- `METRICS_PORT` - Metrics port (default: 9090)
-
-### Configuration Structure
-```typescript
-interface GatewayConfig {
- server: ServerConfig;
- dataSources: DataSourceConfig[];
- processing: ProcessingConfig;
- cache: CacheConfig;
- monitoring: MonitoringConfig;
-}
-```
-
-## Development
-
-### Prerequisites
-- Bun runtime
-- Redis server
-- TypeScript
-
-### Setup
-```bash
-cd apps/core-services/market-data-gateway
-bun install
-```
-
-### Development Mode
-```bash
-bun run dev
-```
-
-### Build
-```bash
-bun run build
-```
-
-### Testing
-The service includes mock data for testing purposes. When running, it will:
-- Respond to health checks
-- Provide mock tick and candle data
-- Accept WebSocket connections
-- Send simulated real-time data every 5 seconds
-
-## Deployment
-
-The service can be deployed using:
-- Docker containers
-- Kubernetes
-- Direct Node.js/Bun deployment
-
-### Docker
-```dockerfile
-FROM oven/bun:latest
-WORKDIR /app
-COPY package.json .
-COPY src/ ./src/
-RUN bun install
-RUN bun run build
-EXPOSE 3004 3005
-CMD ["bun", "run", "start"]
-```
-
-## Migration Notes
-
-This unified implementation replaces both:
-- `apps/core-services/market-data-gateway` (original)
-- `apps/data-services/market-data-gateway` (duplicate)
-
-### Changes Made
-1. **Consolidated Architecture**: Merged real-time and storage capabilities
-2. **Fixed Type Issues**: Resolved all TypeScript compilation errors
-3. **Simplified Configuration**: Aligned with `GatewayConfig` interface
-4. **Working WebSocket**: Functional real-time streaming
-5. **Comprehensive API**: Full REST API implementation
-6. **Mock Data**: Testing capabilities with simulated data
-
-### Removed Duplicates
-- Removed `apps/data-services/market-data-gateway` directory
-- Consolidated type definitions
-- Unified configuration structure
-
-## Future Enhancements
-
-1. **Real Data Sources**: Replace mock data with actual market data feeds
-2. **Advanced Processing**: Implement complex processing pipelines
-3. **Persistence Layer**: Add database storage for historical data
-4. **Authentication**: Add API authentication and authorization
-5. **Rate Limiting**: Implement request rate limiting
-6. **Monitoring**: Enhanced metrics and alerting
-7. **Load Balancing**: Support for horizontal scaling
-
-## Status
-
-✅ **COMPLETED**: TypeScript compilation errors resolved
-✅ **COMPLETED**: Unified service architecture
-✅ **COMPLETED**: Working HTTP and WebSocket servers
-✅ **COMPLETED**: Mock data implementation
-✅ **COMPLETED**: Health and metrics endpoints
-✅ **COMPLETED**: Duplicate service removal
-
-The market data gateway merge is now complete and the service is fully operational.
diff --git a/apps/core-services/market-data-gateway/package.json b/apps/core-services/market-data-gateway/package.json
deleted file mode 100644
index 413a8ab..0000000
--- a/apps/core-services/market-data-gateway/package.json
+++ /dev/null
@@ -1,58 +0,0 @@
-{
- "name": "@stock-bot/market-data-gateway",
- "version": "1.0.0",
- "description": "Unified market data gateway - real-time processing and historical storage",
- "main": "src/index.ts",
- "type": "module",
- "scripts": {
- "dev": "bun --watch src/index.ts",
- "build": "tsc",
- "start": "bun src/index.ts",
- "test": "bun test",
- "lint": "eslint src/**/*.ts",
- "type-check": "tsc --noEmit"
- },
- "dependencies": {
- "hono": "^4.6.3",
- "@hono/node-server": "^1.8.0",
- "ws": "^8.18.0",
- "bull": "^4.12.0",
- "ioredis": "^5.4.1",
- "zod": "^3.22.0",
- "uuid": "^9.0.0",
- "compression": "^1.7.4",
- "helmet": "^7.1.0",
- "rate-limiter-flexible": "^5.0.0",
- "node-cron": "^3.0.3",
- "eventemitter3": "^5.0.1",
- "fast-json-stringify": "^5.10.0",
- "pino": "^8.17.0",
- "dotenv": "^16.3.0",
- "@stock-bot/http-client": "*",
- "@stock-bot/config": "*",
- "@stock-bot/types": "*",
- "@stock-bot/event-bus": "*",
- "@stock-bot/utils": "*"
- },
- "devDependencies": {
- "@types/node": "^20.11.0",
- "@types/ws": "^8.5.12",
- "@types/uuid": "^9.0.0",
- "@types/compression": "^1.7.5",
- "@types/node-cron": "^3.0.11",
- "typescript": "^5.3.0",
- "eslint": "^8.56.0",
- "@typescript-eslint/eslint-plugin": "^6.19.0",
- "@typescript-eslint/parser": "^6.19.0",
- "bun-types": "^1.2.15"
- },
- "keywords": [
- "market-data",
- "gateway",
- "real-time",
- "websocket",
- "historical",
- "stock-bot",
- "core-services"
- ]
-}
diff --git a/apps/core-services/market-data-gateway/src/config/DataProviderConfig.ts b/apps/core-services/market-data-gateway/src/config/DataProviderConfig.ts
deleted file mode 100644
index 60d11cb..0000000
--- a/apps/core-services/market-data-gateway/src/config/DataProviderConfig.ts
+++ /dev/null
@@ -1,114 +0,0 @@
-// Data Provider Configuration
-export interface DataProviderConfig {
- name: string;
- type: 'rest' | 'websocket' | 'both';
- enabled: boolean;
- endpoints: {
- quotes?: string;
- candles?: string;
- trades?: string;
- websocket?: string;
- };
- authentication?: {
- type: 'api_key' | 'bearer' | 'basic';
- key?: string;
- secret?: string;
- token?: string;
- };
- rateLimits: {
- requestsPerSecond: number;
- requestsPerMinute: number;
- requestsPerHour: number;
- };
- retryPolicy: {
- maxRetries: number;
- backoffMultiplier: number;
- initialDelayMs: number;
- };
- timeout: number;
- priority: number; // 1-10, higher is better
-}
-
-export const dataProviderConfigs: Record<string, DataProviderConfig> = {
- 'alpha-vantage': {
- name: 'Alpha Vantage',
- type: 'rest',
- enabled: true,
- endpoints: {
- quotes: 'https://www.alphavantage.co/query',
- candles: 'https://www.alphavantage.co/query',
- },
- authentication: {
- type: 'api_key',
- key: process.env.ALPHA_VANTAGE_API_KEY,
- },
- rateLimits: {
- requestsPerSecond: 5,
- requestsPerMinute: 500,
- requestsPerHour: 25000,
- },
- retryPolicy: {
- maxRetries: 3,
- backoffMultiplier: 2,
- initialDelayMs: 1000,
- },
- timeout: 10000,
- priority: 7,
- },
- 'yahoo-finance': {
- name: 'Yahoo Finance',
- type: 'rest',
- enabled: true,
- endpoints: {
- quotes: 'https://query1.finance.yahoo.com/v8/finance/chart',
- candles: 'https://query1.finance.yahoo.com/v8/finance/chart',
- },
- rateLimits: {
- requestsPerSecond: 10,
- requestsPerMinute: 2000,
- requestsPerHour: 100000,
- },
- retryPolicy: {
- maxRetries: 3,
- backoffMultiplier: 1.5,
- initialDelayMs: 500,
- },
- timeout: 8000,
- priority: 8,
- },
- 'polygon': {
- name: 'Polygon.io',
- type: 'both',
- enabled: false,
- endpoints: {
- quotes: 'https://api.polygon.io/v2/last/nbbo',
- candles: 'https://api.polygon.io/v2/aggs/ticker',
- trades: 'https://api.polygon.io/v3/trades',
- websocket: 'wss://socket.polygon.io/stocks',
- },
- authentication: {
- type: 'api_key',
- key: process.env.POLYGON_API_KEY,
- },
- rateLimits: {
- requestsPerSecond: 100,
- requestsPerMinute: 5000,
- requestsPerHour: 100000,
- },
- retryPolicy: {
- maxRetries: 5,
- backoffMultiplier: 2,
- initialDelayMs: 200,
- },
- timeout: 5000,
- priority: 9,
- },
-};
-
-export function getEnabledProviders(): DataProviderConfig[] {
- return Object.values(dataProviderConfigs).filter(config => config.enabled);
-}
-
-export function getProviderByPriority(): DataProviderConfig[] {
- return getEnabledProviders().sort((a, b) => b.priority - a.priority);
-}
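The `retryPolicy` fields in `DataProviderConfig` imply an exponential backoff schedule. A sketch assuming `delay(n) = initialDelayMs * backoffMultiplier^n`, which is a common reading of these fields rather than something this file confirms:

```typescript
// Derive the retry delay schedule from a DataProviderConfig retryPolicy.
// The exponential formula is an assumption about how these fields are used.
interface RetryPolicy {
  maxRetries: number;
  backoffMultiplier: number;
  initialDelayMs: number;
}

function backoffDelays(policy: RetryPolicy): number[] {
  return Array.from(
    { length: policy.maxRetries },
    (_, attempt) => policy.initialDelayMs * Math.pow(policy.backoffMultiplier, attempt)
  );
}

// Alpha Vantage policy from the config above: 3 retries, x2 backoff, 1s start.
const delays = backoffDelays({ maxRetries: 3, backoffMultiplier: 2, initialDelayMs: 1000 });
console.log(delays); // [1000, 2000, 4000]
```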
diff --git a/apps/core-services/market-data-gateway/src/controllers/GatewayController.ts b/apps/core-services/market-data-gateway/src/controllers/GatewayController.ts
deleted file mode 100644
index a6f6fa7..0000000
--- a/apps/core-services/market-data-gateway/src/controllers/GatewayController.ts
+++ /dev/null
@@ -1,449 +0,0 @@
-import { Context } from 'hono';
-import { MarketDataGatewayService } from '../services/MarketDataGatewayService';
-import {
- DataSourceConfig,
- SubscriptionRequest,
- ProcessingPipelineConfig,
- Logger
-} from '../types/MarketDataGateway';
-
-export class GatewayController {
- constructor(
- private gatewayService: MarketDataGatewayService,
- private logger: Logger
- ) {}
-
- // Gateway status and control
- async getStatus(c: Context) {
- try {
- const status = this.gatewayService.getStatus();
- return c.json(status);
- } catch (error) {
- this.logger.error('Failed to get gateway status:', error);
- return c.json({ error: 'Failed to get gateway status' }, 500);
- }
- }
-
- async getHealth(c: Context) {
- try {
- const health = this.gatewayService.getHealth();
- const statusCode = health.status === 'healthy' ? 200 :
- health.status === 'degraded' ? 200 : 503;
- return c.json(health, statusCode);
- } catch (error) {
- this.logger.error('Failed to get gateway health:', error);
- return c.json({
- status: 'unhealthy',
- message: 'Health check failed',
- timestamp: new Date().toISOString()
- }, 503);
- }
- }
-
- async getMetrics(c: Context) {
- try {
- const metrics = this.gatewayService.getMetrics();
- return c.json(metrics);
- } catch (error) {
- this.logger.error('Failed to get gateway metrics:', error);
- return c.json({ error: 'Failed to get gateway metrics' }, 500);
- }
- }
-
- async getConfiguration(c: Context) {
- try {
- const config = this.gatewayService.getConfiguration();
- return c.json(config);
- } catch (error) {
- this.logger.error('Failed to get gateway configuration:', error);
- return c.json({ error: 'Failed to get gateway configuration' }, 500);
- }
- }
-
- async updateConfiguration(c: Context) {
- try {
- const updates = await c.req.json();
- await this.gatewayService.updateConfiguration(updates);
-
- return c.json({
- message: 'Configuration updated successfully',
- timestamp: new Date().toISOString()
- });
- } catch (error) {
- this.logger.error('Failed to update gateway configuration:', error);
- return c.json({ error: 'Failed to update gateway configuration' }, 500);
- }
- }
-
- // Data source management
- async getDataSources(c: Context) {
- try {
- const sources = this.gatewayService.getDataSources();
- return c.json({ dataSources: Array.from(sources.values()) });
- } catch (error) {
- this.logger.error('Failed to get data sources:', error);
- return c.json({ error: 'Failed to get data sources' }, 500);
- }
- }
-
- async getDataSource(c: Context) {
- try {
- const sourceId = c.req.param('sourceId');
- const source = this.gatewayService.getDataSource(sourceId);
-
- if (!source) {
- return c.json({ error: 'Data source not found' }, 404);
- }
-
- return c.json(source);
- } catch (error) {
- this.logger.error('Failed to get data source:', error);
- return c.json({ error: 'Failed to get data source' }, 500);
- }
- }
-
- async addDataSource(c: Context) {
- try {
- const sourceConfig: DataSourceConfig = await c.req.json();
-
- // Validate required fields
- if (!sourceConfig.id || !sourceConfig.type || !sourceConfig.provider) {
- return c.json({
- error: 'Missing required fields: id, type, provider'
- }, 400);
- }
-
- await this.gatewayService.addDataSource(sourceConfig);
-
- return c.json({
- message: 'Data source added successfully',
- sourceId: sourceConfig.id,
- timestamp: new Date().toISOString()
- }, 201);
- } catch (error) {
- this.logger.error('Failed to add data source:', error);
- return c.json({ error: 'Failed to add data source' }, 500);
- }
- }
-
- async updateDataSource(c: Context) {
- try {
- const sourceId = c.req.param('sourceId');
- const updates = await c.req.json();
-
- await this.gatewayService.updateDataSource(sourceId, updates);
-
- return c.json({
- message: 'Data source updated successfully',
- sourceId,
- timestamp: new Date().toISOString()
- });
- } catch (error) {
- this.logger.error('Failed to update data source:', error);
- return c.json({ error: 'Failed to update data source' }, 500);
- }
- }
-
- async removeDataSource(c: Context) {
- try {
- const sourceId = c.req.param('sourceId');
- await this.gatewayService.removeDataSource(sourceId);
-
- return c.json({
- message: 'Data source removed successfully',
- sourceId,
- timestamp: new Date().toISOString()
- });
- } catch (error) {
- this.logger.error('Failed to remove data source:', error);
- return c.json({ error: 'Failed to remove data source' }, 500);
- }
- }
-
- async startDataSource(c: Context) {
- try {
- const sourceId = c.req.param('sourceId');
- await this.gatewayService.startDataSource(sourceId);
-
- return c.json({
- message: 'Data source started successfully',
- sourceId,
- timestamp: new Date().toISOString()
- });
- } catch (error) {
- this.logger.error('Failed to start data source:', error);
- return c.json({ error: 'Failed to start data source' }, 500);
- }
- }
-
- async stopDataSource(c: Context) {
- try {
- const sourceId = c.req.param('sourceId');
- await this.gatewayService.stopDataSource(sourceId);
-
- return c.json({
- message: 'Data source stopped successfully',
- sourceId,
- timestamp: new Date().toISOString()
- });
- } catch (error) {
- this.logger.error('Failed to stop data source:', error);
- return c.json({ error: 'Failed to stop data source' }, 500);
- }
- }
-
- // Subscription management
- async getSubscriptions(c: Context) {
- try {
- const subscriptions = this.gatewayService.getSubscriptions();
- return c.json({ subscriptions: Array.from(subscriptions.values()) });
- } catch (error) {
- this.logger.error('Failed to get subscriptions:', error);
- return c.json({ error: 'Failed to get subscriptions' }, 500);
- }
- }
-
- async getSubscription(c: Context) {
- try {
- const subscriptionId = c.req.param('subscriptionId');
- const subscription = this.gatewayService.getSubscription(subscriptionId);
-
- if (!subscription) {
- return c.json({ error: 'Subscription not found' }, 404);
- }
-
- return c.json(subscription);
- } catch (error) {
- this.logger.error('Failed to get subscription:', error);
- return c.json({ error: 'Failed to get subscription' }, 500);
- }
- }
-
- async createSubscription(c: Context) {
- try {
- const subscriptionRequest: SubscriptionRequest = await c.req.json();
-
- // Validate required fields
- if (!subscriptionRequest.clientId || !subscriptionRequest.symbols || subscriptionRequest.symbols.length === 0) {
- return c.json({
- error: 'Missing required fields: clientId, symbols'
- }, 400);
- }
-
- const subscriptionId = await this.gatewayService.subscribe(subscriptionRequest);
-
- return c.json({
- message: 'Subscription created successfully',
- subscriptionId,
- clientId: subscriptionRequest.clientId,
- symbols: subscriptionRequest.symbols,
- timestamp: new Date().toISOString()
- }, 201);
- } catch (error) {
- this.logger.error('Failed to create subscription:', error);
- return c.json({ error: 'Failed to create subscription' }, 500);
- }
- }
-
- async updateSubscription(c: Context) {
- try {
- const subscriptionId = c.req.param('subscriptionId');
- const updates = await c.req.json();
-
- await this.gatewayService.updateSubscription(subscriptionId, updates);
-
- return c.json({
- message: 'Subscription updated successfully',
- subscriptionId,
- timestamp: new Date().toISOString()
- });
- } catch (error) {
- this.logger.error('Failed to update subscription:', error);
- return c.json({ error: 'Failed to update subscription' }, 500);
- }
- }
-
- async deleteSubscription(c: Context) {
- try {
- const subscriptionId = c.req.param('subscriptionId');
- await this.gatewayService.unsubscribe(subscriptionId);
-
- return c.json({
- message: 'Subscription deleted successfully',
- subscriptionId,
- timestamp: new Date().toISOString()
- });
- } catch (error) {
- this.logger.error('Failed to delete subscription:', error);
- return c.json({ error: 'Failed to delete subscription' }, 500);
- }
- }
-
- // Processing pipeline management
- async getProcessingPipelines(c: Context) {
- try {
- const pipelines = this.gatewayService.getProcessingPipelines();
- return c.json({ pipelines: Array.from(pipelines.values()) });
- } catch (error) {
- this.logger.error('Failed to get processing pipelines:', error);
- return c.json({ error: 'Failed to get processing pipelines' }, 500);
- }
- }
-
- async getProcessingPipeline(c: Context) {
- try {
- const pipelineId = c.req.param('pipelineId');
- const pipeline = this.gatewayService.getProcessingPipeline(pipelineId);
-
- if (!pipeline) {
- return c.json({ error: 'Processing pipeline not found' }, 404);
- }
-
- return c.json(pipeline);
- } catch (error) {
- this.logger.error('Failed to get processing pipeline:', error);
- return c.json({ error: 'Failed to get processing pipeline' }, 500);
- }
- }
-
- async createProcessingPipeline(c: Context) {
- try {
- const pipelineConfig: ProcessingPipelineConfig = await c.req.json();
-
- // Validate required fields
- if (!pipelineConfig.id || !pipelineConfig.name || !pipelineConfig.processors) {
- return c.json({
- error: 'Missing required fields: id, name, processors'
- }, 400);
- }
-
- await this.gatewayService.addProcessingPipeline(pipelineConfig);
-
- return c.json({
- message: 'Processing pipeline created successfully',
- pipelineId: pipelineConfig.id,
- timestamp: new Date().toISOString()
- }, 201);
- } catch (error) {
- this.logger.error('Failed to create processing pipeline:', error);
- return c.json({ error: 'Failed to create processing pipeline' }, 500);
- }
- }
-
- async updateProcessingPipeline(c: Context) {
- try {
- const pipelineId = c.req.param('pipelineId');
- const updates = await c.req.json();
-
- await this.gatewayService.updateProcessingPipeline(pipelineId, updates);
-
- return c.json({
- message: 'Processing pipeline updated successfully',
- pipelineId,
- timestamp: new Date().toISOString()
- });
- } catch (error) {
- this.logger.error('Failed to update processing pipeline:', error);
- return c.json({ error: 'Failed to update processing pipeline' }, 500);
- }
- }
-
- async deleteProcessingPipeline(c: Context) {
- try {
- const pipelineId = c.req.param('pipelineId');
- await this.gatewayService.removeProcessingPipeline(pipelineId);
-
- return c.json({
- message: 'Processing pipeline deleted successfully',
- pipelineId,
- timestamp: new Date().toISOString()
- });
- } catch (error) {
- this.logger.error('Failed to delete processing pipeline:', error);
- return c.json({ error: 'Failed to delete processing pipeline' }, 500);
- }
- }
-
- // Market data queries
- async getLatestTick(c: Context) {
- try {
- const symbol = c.req.param('symbol');
- if (!symbol) {
- return c.json({ error: 'Symbol parameter is required' }, 400);
- }
-
- const tick = await this.gatewayService.getLatestTick(symbol);
-
- if (!tick) {
- return c.json({ error: 'No data available for symbol' }, 404);
- }
-
- return c.json(tick);
- } catch (error) {
- this.logger.error('Failed to get latest tick:', error);
- return c.json({ error: 'Failed to get latest tick' }, 500);
- }
- }
-
- async getCandles(c: Context) {
- try {
- const symbol = c.req.param('symbol');
- const timeframe = c.req.query('timeframe') || '1m';
- const startTime = c.req.query('startTime');
- const endTime = c.req.query('endTime');
- const limit = parseInt(c.req.query('limit') || '100');
-
- if (!symbol) {
- return c.json({ error: 'Symbol parameter is required' }, 400);
- }
-
- const candles = await this.gatewayService.getCandles(
- symbol,
- timeframe,
- startTime ? parseInt(startTime) : undefined,
- endTime ? parseInt(endTime) : undefined,
- limit
- );
-
- return c.json({
- symbol,
- timeframe,
- candles,
- count: candles.length,
- timestamp: new Date().toISOString()
- });
- } catch (error) {
- this.logger.error('Failed to get candles:', error);
- return c.json({ error: 'Failed to get candles' }, 500);
- }
- }
-
- // System operations
- async flushCache(c: Context) {
- try {
- await this.gatewayService.flushCache();
-
- return c.json({
- message: 'Cache flushed successfully',
- timestamp: new Date().toISOString()
- });
- } catch (error) {
- this.logger.error('Failed to flush cache:', error);
- return c.json({ error: 'Failed to flush cache' }, 500);
- }
- }
-
- async restart(c: Context) {
- try {
- await this.gatewayService.restart();
-
- return c.json({
- message: 'Gateway restart initiated',
- timestamp: new Date().toISOString()
- });
- } catch (error) {
- this.logger.error('Failed to restart gateway:', error);
- return c.json({ error: 'Failed to restart gateway' }, 500);
- }
- }
-}
diff --git a/apps/core-services/market-data-gateway/src/controllers/HealthController.ts b/apps/core-services/market-data-gateway/src/controllers/HealthController.ts
deleted file mode 100644
index 34e96ed..0000000
--- a/apps/core-services/market-data-gateway/src/controllers/HealthController.ts
+++ /dev/null
@@ -1,146 +0,0 @@
-import { Context } from 'hono';
-import { MarketDataGatewayService } from '../services/MarketDataGatewayService';
-import { Logger } from '../types/MarketDataGateway';
-
-export class HealthController {
- constructor(
- private gatewayService: MarketDataGatewayService,
- private logger: Logger
- ) {}
-
- async getHealth(c: Context) {
- try {
- const health = this.gatewayService.getHealth();
- const statusCode = health.status === 'healthy' ? 200 :
- health.status === 'degraded' ? 200 : 503;
-
- return c.json(health, statusCode);
- } catch (error) {
- this.logger.error('Health check failed:', error);
- return c.json({
- status: 'unhealthy',
- message: 'Health check failed',
- timestamp: new Date().toISOString(),
- error: error instanceof Error ? error.message : 'Unknown error'
- }, 503);
- }
- }
-
- async getReadiness(c: Context) {
- try {
- const health = this.gatewayService.getHealth();
- const components = health.details?.components || {};
-
- // Check if all critical components are ready
- const criticalComponents = [
- 'dataSourceManager',
- 'processingEngine',
- 'subscriptionManager'
- ];
-
- const allReady = criticalComponents.every(component => {
- const componentHealth = components[component];
- return componentHealth && componentHealth.status === 'healthy';
- });
-
- const readinessStatus = {
- status: allReady ? 'ready' : 'not-ready',
- message: allReady ? 'All critical components are ready' : 'Some critical components are not ready',
- timestamp: new Date().toISOString(),
- components: Object.fromEntries(
- criticalComponents.map(component => [
- component,
- components[component]?.status || 'unknown'
- ])
- )
- };
-
- const statusCode = allReady ? 200 : 503;
- return c.json(readinessStatus, statusCode);
- } catch (error) {
- this.logger.error('Readiness check failed:', error);
- return c.json({
- status: 'not-ready',
- message: 'Readiness check failed',
- timestamp: new Date().toISOString(),
- error: error instanceof Error ? error.message : 'Unknown error'
- }, 503);
- }
- }
-
- async getLiveness(c: Context) {
- try {
- // Basic liveness check - just verify the service is responding
- const uptime = process.uptime();
- const memoryUsage = process.memoryUsage();
-
- return c.json({
- status: 'alive',
- message: 'Service is alive and responding',
- timestamp: new Date().toISOString(),
- uptime: Math.floor(uptime),
- memory: {
- rss: Math.round(memoryUsage.rss / 1024 / 1024), // MB
- heapUsed: Math.round(memoryUsage.heapUsed / 1024 / 1024), // MB
- heapTotal: Math.round(memoryUsage.heapTotal / 1024 / 1024), // MB
- }
- });
- } catch (error) {
- this.logger.error('Liveness check failed:', error);
- return c.json({
- status: 'dead',
- message: 'Liveness check failed',
- timestamp: new Date().toISOString(),
- error: error instanceof Error ? error.message : 'Unknown error'
- }, 503);
- }
- }
-
- async getComponentHealth(c: Context) {
- try {
- const component = c.req.param('component');
- const health = this.gatewayService.getHealth();
- const components = health.details?.components || {};
-
- if (!components[component]) {
- return c.json({
- error: 'Component not found',
- availableComponents: Object.keys(components)
- }, 404);
- }
-
- return c.json({
- component,
- ...components[component],
- timestamp: new Date().toISOString()
- });
- } catch (error) {
- this.logger.error('Component health check failed:', error);
- return c.json({
- error: 'Component health check failed',
- message: error instanceof Error ? error.message : 'Unknown error'
- }, 500);
- }
- }
-
- async getDetailedHealth(c: Context) {
- try {
- const health = this.gatewayService.getHealth();
- const metrics = this.gatewayService.getMetrics();
- const status = this.gatewayService.getStatus();
-
- return c.json({
- health,
- metrics,
- status,
- timestamp: new Date().toISOString()
- });
- } catch (error) {
- this.logger.error('Detailed health check failed:', error);
- return c.json({
- error: 'Detailed health check failed',
- message: error instanceof Error ? error.message : 'Unknown error'
- }, 500);
- }
- }
-}
diff --git a/apps/core-services/market-data-gateway/src/controllers/MetricsController.ts b/apps/core-services/market-data-gateway/src/controllers/MetricsController.ts
deleted file mode 100644
index 9635eb5..0000000
--- a/apps/core-services/market-data-gateway/src/controllers/MetricsController.ts
+++ /dev/null
@@ -1,330 +0,0 @@
-import { Context } from 'hono';
-import { MetricsCollector } from '../services/MetricsCollector';
-import { Logger } from '../types/MarketDataGateway';
-
-export class MetricsController {
- constructor(
- private metricsCollector: MetricsCollector,
- private logger: Logger
- ) {}
-
- async getMetrics(c: Context) {
- try {
- const format = c.req.query('format') || 'json';
-
- if (format === 'prometheus') {
- const prometheusMetrics = this.metricsCollector.exportMetrics('prometheus');
- return c.text(prometheusMetrics, 200, {
- 'Content-Type': 'text/plain; charset=utf-8'
- });
- }
-
- const metrics = this.metricsCollector.getAggregatedMetrics();
- return c.json(metrics);
- } catch (error) {
- this.logger.error('Failed to get metrics:', error);
- return c.json({ error: 'Failed to get metrics' }, 500);
- }
- }
-
- async getMetric(c: Context) {
- try {
- const metricName = c.req.param('metricName');
- const duration = c.req.query('duration');
-
- if (!metricName) {
- return c.json({ error: 'Metric name is required' }, 400);
- }
-
- const durationMs = duration ? parseInt(duration) * 1000 : undefined;
- const metricData = this.metricsCollector.getMetric(metricName, durationMs);
-
- return c.json({
- metric: metricName,
- duration: duration || 'all',
- data: metricData,
- count: metricData.length
- });
- } catch (error) {
- this.logger.error('Failed to get metric:', error);
- return c.json({ error: 'Failed to get metric' }, 500);
- }
- }
-
- async getMetricAverage(c: Context) {
- try {
- const metricName = c.req.param('metricName');
- const duration = c.req.query('duration');
-
- if (!metricName) {
- return c.json({ error: 'Metric name is required' }, 400);
- }
-
- const durationMs = duration ? parseInt(duration) * 1000 : undefined;
- const average = this.metricsCollector.getAverageMetric(metricName, durationMs);
-
- return c.json({
- metric: metricName,
- duration: duration || 'all',
- average,
- timestamp: new Date().toISOString()
- });
- } catch (error) {
- this.logger.error('Failed to get metric average:', error);
- return c.json({ error: 'Failed to get metric average' }, 500);
- }
- }
-
- async getMetricLatest(c: Context) {
- try {
- const metricName = c.req.param('metricName');
-
- if (!metricName) {
- return c.json({ error: 'Metric name is required' }, 400);
- }
-
- const latest = this.metricsCollector.getLatestMetric(metricName);
-
- if (latest === null) {
- return c.json({ error: 'Metric not found or no data available' }, 404);
- }
-
- return c.json({
- metric: metricName,
- value: latest,
- timestamp: new Date().toISOString()
- });
- } catch (error) {
- this.logger.error('Failed to get latest metric:', error);
- return c.json({ error: 'Failed to get latest metric' }, 500);
- }
- }
-
- async getMetricRate(c: Context) {
- try {
- const metricName = c.req.param('metricName');
- const duration = c.req.query('duration') || '60'; // Default 60 seconds
-
- if (!metricName) {
- return c.json({ error: 'Metric name is required' }, 400);
- }
-
- const durationMs = parseInt(duration) * 1000;
- const rate = this.metricsCollector.getRate(metricName, durationMs);
-
- return c.json({
- metric: metricName,
- duration: duration,
- rate: rate,
- unit: 'per second',
- timestamp: new Date().toISOString()
- });
- } catch (error) {
- this.logger.error('Failed to get metric rate:', error);
- return c.json({ error: 'Failed to get metric rate' }, 500);
- }
- }
-
- async getMetricPercentile(c: Context) {
- try {
- const metricName = c.req.param('metricName');
- const percentile = c.req.query('percentile') || '95';
- const duration = c.req.query('duration');
-
- if (!metricName) {
- return c.json({ error: 'Metric name is required' }, 400);
- }
-
- const percentileValue = parseFloat(percentile);
- if (isNaN(percentileValue) || percentileValue < 0 || percentileValue > 100) {
- return c.json({ error: 'Percentile must be a number between 0 and 100' }, 400);
- }
-
- const durationMs = duration ? parseInt(duration) * 1000 : undefined;
- const value = this.metricsCollector.getPercentile(metricName, percentileValue, durationMs);
-
- return c.json({
- metric: metricName,
- percentile: percentileValue,
- value,
- duration: duration || 'all',
- timestamp: new Date().toISOString()
- });
- } catch (error) {
- this.logger.error('Failed to get metric percentile:', error);
- return c.json({ error: 'Failed to get metric percentile' }, 500);
- }
- }
-
- async getAlerts(c: Context) {
- try {
- const activeOnly = c.req.query('active') === 'true';
-
- if (activeOnly) {
- const alerts = this.metricsCollector.getActiveAlerts();
- return c.json({ alerts, count: alerts.length });
- }
-
- const rules = this.metricsCollector.getAlertRules();
- const alerts = this.metricsCollector.getActiveAlerts();
-
- return c.json({
- alertRules: rules,
- activeAlerts: alerts,
- rulesCount: rules.length,
- activeCount: alerts.length
- });
- } catch (error) {
- this.logger.error('Failed to get alerts:', error);
- return c.json({ error: 'Failed to get alerts' }, 500);
- }
- }
-
- async addAlertRule(c: Context) {
- try {
- const rule = await c.req.json();
-
- // Validate required fields
- if (!rule.id || !rule.metric || !rule.condition || rule.threshold === undefined) {
- return c.json({
- error: 'Missing required fields: id, metric, condition, threshold'
- }, 400);
- }
-
- // Validate condition
- const validConditions = ['gt', 'lt', 'eq', 'gte', 'lte'];
- if (!validConditions.includes(rule.condition)) {
- return c.json({
- error: `Invalid condition. Must be one of: ${validConditions.join(', ')}`
- }, 400);
- }
-
- this.metricsCollector.addAlertRule(rule);
-
- return c.json({
- message: 'Alert rule added successfully',
- ruleId: rule.id,
- timestamp: new Date().toISOString()
- }, 201);
- } catch (error) {
- this.logger.error('Failed to add alert rule:', error);
- return c.json({ error: 'Failed to add alert rule' }, 500);
- }
- }
-
- async removeAlertRule(c: Context) {
- try {
- const ruleId = c.req.param('ruleId');
-
- if (!ruleId) {
- return c.json({ error: 'Rule ID is required' }, 400);
- }
-
- this.metricsCollector.removeAlertRule(ruleId);
-
- return c.json({
- message: 'Alert rule removed successfully',
- ruleId,
- timestamp: new Date().toISOString()
- });
- } catch (error) {
- this.logger.error('Failed to remove alert rule:', error);
- return c.json({ error: 'Failed to remove alert rule' }, 500);
- }
- }
-
- async recordCustomMetric(c: Context) {
- try {
- const { name, value, labels, type = 'gauge' } = await c.req.json();
-
- if (!name || value === undefined) {
- return c.json({
- error: 'Missing required fields: name, value'
- }, 400);
- }
-
- if (typeof value !== 'number') {
- return c.json({
- error: 'Value must be a number'
- }, 400);
- }
-
- switch (type) {
- case 'gauge':
- this.metricsCollector.setGauge(name, value, labels);
- break;
- case 'counter':
- this.metricsCollector.incrementCounter(name, value, labels);
- break;
- case 'histogram':
- this.metricsCollector.recordHistogram(name, value, labels);
- break;
- default:
- return c.json({
- error: 'Invalid metric type. Must be one of: gauge, counter, histogram'
- }, 400);
- }
-
- return c.json({
- message: 'Custom metric recorded successfully',
- name,
- value,
- type,
- timestamp: new Date().toISOString()
- });
- } catch (error) {
- this.logger.error('Failed to record custom metric:', error);
- return c.json({ error: 'Failed to record custom metric' }, 500);
- }
- }
-
- async getSystemMetrics(c: Context) {
- try {
- const process = require('process');
- const uptime = process.uptime();
- const memoryUsage = process.memoryUsage();
- const cpuUsage = process.cpuUsage();
-
- const systemMetrics = {
- uptime: Math.floor(uptime),
- memory: {
- rss: Math.round(memoryUsage.rss / 1024 / 1024), // MB
- heapUsed: Math.round(memoryUsage.heapUsed / 1024 / 1024), // MB
- heapTotal: Math.round(memoryUsage.heapTotal / 1024 / 1024), // MB
- external: Math.round(memoryUsage.external / 1024 / 1024), // MB
- },
- cpu: {
- user: cpuUsage.user,
- system: cpuUsage.system,
- },
- timestamp: new Date().toISOString()
- };
-
- return c.json(systemMetrics);
- } catch (error) {
- this.logger.error('Failed to get system metrics:', error);
- return c.json({ error: 'Failed to get system metrics' }, 500);
- }
- }
-
- async exportMetrics(c: Context) {
- try {
- const format = c.req.query('format') || 'json';
- const exported = this.metricsCollector.exportMetrics(format);
-
- if (format === 'prometheus') {
- return c.text(exported, 200, {
- 'Content-Type': 'text/plain; charset=utf-8'
- });
- }
-
- return c.text(exported, 200, {
- 'Content-Type': 'application/json',
- 'Content-Disposition': `attachment; filename="metrics-${new Date().toISOString()}.json"`
- });
- } catch (error) {
- this.logger.error('Failed to export metrics:', error);
- return c.json({ error: 'Failed to export metrics' }, 500);
- }
- }
-}
diff --git a/apps/core-services/market-data-gateway/src/index.ts b/apps/core-services/market-data-gateway/src/index.ts
deleted file mode 100644
index 0ed49f4..0000000
--- a/apps/core-services/market-data-gateway/src/index.ts
+++ /dev/null
@@ -1,651 +0,0 @@
-// Market Data Gateway - Enhanced Implementation
-import { Hono } from 'hono';
-import { cors } from 'hono/cors';
-import { logger } from 'hono/logger';
-import { prettyJSON } from 'hono/pretty-json';
-import { WebSocketServer } from 'ws';
-
-// Types
-import { GatewayConfig } from './types/MarketDataGateway';
-
-// Services
-import { DataNormalizer, DataNormalizationResult } from './services/DataNormalizer';
-import { MarketDataCache } from './services/AdvancedCache';
-import { ConnectionPoolManager } from './services/ConnectionPoolManager';
-import { dataProviderConfigs, getEnabledProviders } from './config/DataProviderConfig';
-
-// Simple logger interface
-interface Logger {
- info: (message: string, ...args: any[]) => void;
- error: (message: string, ...args: any[]) => void;
- warn: (message: string, ...args: any[]) => void;
- debug: (message: string, ...args: any[]) => void;
-}
-
-// Create application logger
-const appLogger: Logger = {
- info: (message: string, ...args: any[]) => console.log(`[MDG-ENHANCED] [INFO] ${message}`, ...args),
- error: (message: string, ...args: any[]) => console.error(`[MDG-ENHANCED] [ERROR] ${message}`, ...args),
- warn: (message: string, ...args: any[]) => console.warn(`[MDG-ENHANCED] [WARN] ${message}`, ...args),
- debug: (message: string, ...args: any[]) => console.debug(`[MDG-ENHANCED] [DEBUG] ${message}`, ...args),
-};
-
-// Initialize services
-const dataNormalizer = new DataNormalizer();
-const marketDataCache = new MarketDataCache();
-const connectionPool = new ConnectionPoolManager({
- maxConnections: 100,
- maxConnectionsPerHost: 20,
- connectionTimeout: 10000,
- requestTimeout: 30000,
- retryAttempts: 3,
- retryDelay: 1000,
- keepAlive: true,
- maxIdleTime: 300000, // 5 minutes
-});
-
-// Configuration matching the GatewayConfig interface
-const config: GatewayConfig = {
- server: {
- port: parseInt(process.env.PORT || '3004'),
- host: process.env.HOST || '0.0.0.0',
- maxConnections: 1000,
- cors: {
- origins: ['http://localhost:3000', 'http://localhost:3001', 'http://localhost:8080'],
- methods: ['GET', 'POST', 'PUT', 'DELETE', 'OPTIONS'],
- headers: ['Content-Type', 'Authorization'],
- },
- }, dataSources: getEnabledProviders().map(provider => ({
- id: provider.name.toLowerCase().replace(/\s+/g, '-'),
- name: provider.name,
- type: provider.type === 'both' ? 'websocket' : provider.type as any,
- provider: provider.name,
- enabled: provider.enabled,
- priority: provider.priority,
- rateLimit: {
- requestsPerSecond: provider.rateLimits.requestsPerSecond,
- burstLimit: provider.rateLimits.requestsPerMinute,
- },
- connection: {
- url: provider.endpoints.quotes || provider.endpoints.websocket || '',
- authentication: provider.authentication ? {
- type: provider.authentication.type === 'api_key' ? 'apikey' as const : 'basic' as const,
- credentials: {
- apiKey: provider.authentication.key || '',
- secret: provider.authentication.secret || '',
- token: provider.authentication.token || '',
- },
- } : undefined,
- },
- subscriptions: {
- quotes: true,
- trades: true,
- orderbook: provider.endpoints.websocket ? true : false,
- candles: true,
- news: false,
- },
- symbols: ['AAPL', 'GOOGL', 'MSFT', 'AMZN', 'TSLA'], // Default symbols
- retryPolicy: {
- maxRetries: provider.retryPolicy.maxRetries,
- backoffMultiplier: provider.retryPolicy.backoffMultiplier,
- maxBackoffMs: provider.retryPolicy.initialDelayMs * 10,
- },
- healthCheck: {
- intervalMs: 30000,
- timeoutMs: provider.timeout,
- expectedLatencyMs: 1000,
- },
- })),
- processing: {
- pipelines: [],
- bufferSize: 10000,
- batchSize: 100,
- flushIntervalMs: 1000,
- }, cache: {
- redis: {
- host: process.env.REDIS_HOST || 'localhost',
- port: parseInt(process.env.REDIS_PORT || '6379'),
- password: process.env.REDIS_PASSWORD,
- db: parseInt(process.env.REDIS_DB || '0'),
- },
- ttl: {
- quotes: 60000, // 1 minute
- trades: 300000, // 5 minutes
- candles: 86400000, // 24 hours
- orderbook: 30000, // 30 seconds
- },
- }, monitoring: {
- metrics: {
- enabled: true,
- port: parseInt(process.env.METRICS_PORT || '9090'),
- intervalMs: 5000,
- retention: '24h',
- },
- alerts: {
- enabled: true,
- thresholds: {
- latency: 1000,
- latencyMs: 1000,
- errorRate: 0.05,
- connectionLoss: 3,
- },
- },
- },
-};
-
-// Global state variables
-let webSocketServer: WebSocketServer | null = null;
-let isShuttingDown = false;
-
-// Create Hono application
-const app = new Hono();
-
-// Middleware setup
-app.use('*', logger());
-app.use('*', prettyJSON());
-app.use('*', cors({
- origin: config.server.cors.origins,
- allowMethods: config.server.cors.methods,
- allowHeaders: config.server.cors.headers,
-}));
-
-// Mock data for testing
-const mockTickData = {
- symbol: 'AAPL',
- price: 150.25,
- volume: 1000,
- timestamp: new Date().toISOString(),
- bid: 150.20,
- ask: 150.30,
-};
-
-const mockCandleData = {
- symbol: 'AAPL',
- timeframe: '1m',
- timestamp: new Date().toISOString(),
- open: 150.00,
- high: 150.50,
- low: 149.75,
- close: 150.25,
- volume: 5000,
-};
-
-// Health endpoints
-app.get('/health', async (c) => {
- return c.json({
- status: 'healthy',
- timestamp: new Date().toISOString(),
- service: 'market-data-gateway',
- version: '1.0.0',
- });
-});
-
-app.get('/health/readiness', async (c) => {
- return c.json({
- status: 'ready',
- timestamp: new Date().toISOString(),
- checks: {
- webSocket: webSocketServer ? 'connected' : 'disconnected',
- cache: 'available',
- },
- });
-});
-
-app.get('/health/liveness', async (c) => {
- return c.json({
- status: 'alive',
- timestamp: new Date().toISOString(),
- uptime: process.uptime(),
- });
-});
-
-// Gateway status endpoints
-app.get('/api/v1/gateway/status', async (c) => {
- return c.json({
- status: 'running',
- dataSources: config.dataSources.length,
- activeConnections: webSocketServer ? webSocketServer.clients.size : 0,
- timestamp: new Date().toISOString(),
- });
-});
-
-app.get('/api/v1/gateway/config', async (c) => {
- return c.json({
- server: config.server,
- dataSourcesCount: config.dataSources.length,
- processingConfig: config.processing,
- monitoring: config.monitoring,
- });
-});
-
-// Data source management endpoints
-app.get('/api/v1/sources', async (c) => {
- return c.json({
- dataSources: config.dataSources,
- total: config.dataSources.length,
- });
-});
-
-app.post('/api/v1/sources', async (c) => {
- try {
- const newSource = await c.req.json();
- // In a real implementation, validate and add the data source
- return c.json({
- message: 'Data source configuration received',
- source: newSource,
- status: 'pending_validation',
- });
- } catch (error) {
- return c.json({ error: 'Invalid data source configuration' }, 400);
- }
-});
-
-// Subscription management endpoints
-app.get('/api/v1/subscriptions', async (c) => {
- return c.json({
- subscriptions: [],
- total: 0,
- message: 'Subscription management not yet implemented',
- });
-});
-
-app.post('/api/v1/subscriptions', async (c) => {
- try {
- const subscription = await c.req.json();
- return c.json({
- subscriptionId: `sub_${Date.now()}`,
- status: 'active',
- subscription,
- });
- } catch (error) {
- return c.json({ error: 'Invalid subscription request' }, 400);
- }
-});
-
-// Market data endpoints with enhanced functionality
-app.get('/api/v1/data/tick/:symbol', async (c) => {
- const symbol = c.req.param('symbol').toUpperCase();
- const source = c.req.query('source') || 'yahoo-finance';
-
- try {
- // Check cache first
- const cacheKey = marketDataCache.getQuoteKey(symbol);
- const cachedData = marketDataCache.get(cacheKey);
-
- if (cachedData) {
- appLogger.debug(`Cache hit for ${symbol}`);
- return c.json({
- ...cachedData,
- cached: true,
- timestamp: new Date().toISOString(),
- });
- }
-
- // Fetch from provider
- const provider = dataProviderConfigs[source];
- if (!provider || !provider.enabled) {
- return c.json({ error: 'Data source not available' }, 400);
- }
-
- // Mock data for now (replace with actual API calls)
- const mockData = {
- symbol,
- price: 150.25 + (Math.random() - 0.5) * 10,
- volume: Math.floor(Math.random() * 100000),
- timestamp: new Date().toISOString(),
- bid: 150.20,
- ask: 150.30,
- source,
- };
-
- // Normalize the data
- const normalizedResult = dataNormalizer.normalizeMarketData(mockData, source);
-
- if (!normalizedResult.success) {
- return c.json({ error: normalizedResult.error }, 500);
- }
-
- // Cache the result
- marketDataCache.setQuote(symbol, normalizedResult.data);
-
- return c.json({
- ...normalizedResult.data,
- cached: false,
- processingTimeMs: normalizedResult.processingTimeMs,
- });
-
- } catch (error) {
- appLogger.error(`Error fetching tick data for ${symbol}:`, error);
- return c.json({ error: 'Internal server error' }, 500);
- }
-});
-
-app.get('/api/v1/data/candles/:symbol', async (c) => {
- const symbol = c.req.param('symbol').toUpperCase();
- const timeframe = c.req.query('timeframe') || '1m';
- const limit = Math.min(parseInt(c.req.query('limit') || '100'), 1000); // Max 1000
- const source = c.req.query('source') || 'yahoo-finance';
-
- try {
- // Generate cache key
- const cacheKey = `candles:${symbol}:${timeframe}:${limit}`;
- const cachedData = marketDataCache.get(cacheKey);
-
- if (cachedData) {
- appLogger.debug(`Cache hit for candles ${symbol}:${timeframe}`);
- return c.json({
- candles: cachedData,
- cached: true,
- count: cachedData.length,
- });
- }
-
- // Mock candle data generation (replace with actual API calls)
- const candles = Array.from({ length: limit }, (_, i) => {
- const timestamp = new Date(Date.now() - i * 60000);
- const basePrice = 150 + (Math.random() - 0.5) * 20;
- const variation = (Math.random() - 0.5) * 2;
-
- return {
- symbol,
- timeframe,
- timestamp: timestamp.toISOString(),
- open: basePrice + variation,
- high: basePrice + variation + Math.random() * 2,
- low: basePrice + variation - Math.random() * 2,
- close: basePrice + variation + (Math.random() - 0.5),
- volume: Math.floor(Math.random() * 10000),
- source,
- };
- }).reverse(); // Oldest first
-
- // Normalize OHLCV data
- const normalizedResult = dataNormalizer.normalizeOHLCV(
- { candles: candles.map(c => ({ ...c, timestamp: new Date(c.timestamp) })) },
- source
- );
-
- if (!normalizedResult.success) {
- return c.json({ error: normalizedResult.error }, 500);
- }
-
- // Cache the result
- marketDataCache.set(cacheKey, normalizedResult.data, marketDataCache['getCandleTTL'](timeframe));
-
- return c.json({
- candles: normalizedResult.data,
- cached: false,
- count: normalizedResult.data?.length || 0,
- processingTimeMs: normalizedResult.processingTimeMs,
- });
-
- } catch (error) {
- appLogger.error(`Error fetching candles for ${symbol}:`, error);
- return c.json({ error: 'Internal server error' }, 500);
- }
-});
-
-// Enhanced metrics endpoints
-app.get('/api/v1/metrics', async (c) => {
- const cacheStats = marketDataCache.getStats();
- const connectionStats = connectionPool.getStats();
-
- return c.json({
- system: {
- uptime: process.uptime(),
- memoryUsage: process.memoryUsage(),
- cpuUsage: process.cpuUsage(),
- },
- gateway: {
- activeConnections: webSocketServer ? webSocketServer.clients.size : 0,
- dataSourcesCount: config.dataSources.filter(ds => ds.enabled).length,
- messagesProcessed: 0,
- },
- cache: cacheStats,
- connectionPool: connectionStats,
- timestamp: new Date().toISOString(),
- });
-});
-
-// Data quality assessment endpoint
-app.get('/api/v1/data/quality/:symbol', async (c) => {
- const symbol = c.req.param('symbol').toUpperCase();
- const source = c.req.query('source') || 'yahoo-finance';
-
- try {
- // Get recent data for quality assessment (mock for now)
- const recentData = Array.from({ length: 10 }, (_, i) => ({
- symbol,
- price: 150 + (Math.random() - 0.5) * 10,
- bid: 149.5,
- ask: 150.5,
- volume: Math.floor(Math.random() * 10000),
- timestamp: new Date(Date.now() - i * 60000),
- }));
-
- const qualityMetrics = dataNormalizer.assessDataQuality(recentData, source);
-
- return c.json({
- symbol,
- source,
- dataPoints: recentData.length,
- qualityMetrics,
- timestamp: new Date().toISOString(),
- });
-
- } catch (error) {
- appLogger.error(`Error assessing data quality for ${symbol}:`, error);
- return c.json({ error: 'Internal server error' }, 500);
- }
-});
-
-// Cache management endpoints
-app.get('/api/v1/cache/stats', async (c) => {
- return c.json({
- stats: marketDataCache.getStats(),
- keys: marketDataCache.keys().slice(0, 100), // Limit to first 100 keys
- timestamp: new Date().toISOString(),
- });
-});
-
-app.delete('/api/v1/cache/clear', async (c) => {
- marketDataCache.clear();
- return c.json({
- message: 'Cache cleared successfully',
- timestamp: new Date().toISOString(),
- });
-});
-
-app.delete('/api/v1/cache/key/:key', async (c) => {
- const key = c.req.param('key');
- const deleted = marketDataCache.delete(key);
-
- return c.json({
- message: deleted ? 'Key deleted successfully' : 'Key not found',
- key,
- deleted,
- timestamp: new Date().toISOString(),
- });
-});
-
-// Data providers status endpoint
-app.get('/api/v1/providers', async (c) => {
- const providers = Object.values(dataProviderConfigs).map(provider => ({
- name: provider.name,
- enabled: provider.enabled,
- type: provider.type,
- priority: provider.priority,
- rateLimits: provider.rateLimits,
- endpoints: Object.keys(provider.endpoints),
- }));
-
- return c.json({
- providers,
- enabled: providers.filter(p => p.enabled).length,
- total: providers.length,
- timestamp: new Date().toISOString(),
- });
-});
-
-// WebSocket server setup
-function setupWebSocketServer(): void {
- const wsPort = config.server.port + 1; // Use port + 1 for WebSocket
-
- webSocketServer = new WebSocketServer({
- port: wsPort,
- perMessageDeflate: false,
- });
-
- webSocketServer.on('connection', (ws, request) => {
- appLogger.info(`New WebSocket connection from ${request.socket.remoteAddress}`);
-
- ws.on('message', async (message) => {
- try {
- const data = JSON.parse(message.toString());
-
- switch (data.type) {
- case 'subscribe':
- if (data.symbols && Array.isArray(data.symbols)) {
- const subscriptionId = `sub_${Date.now()}`;
- ws.send(JSON.stringify({
- type: 'subscription_confirmed',
- subscriptionId,
- symbols: data.symbols,
- timestamp: new Date().toISOString(),
- }));
-
- // Send mock data every 5 seconds
- const interval = setInterval(() => {
- if (ws.readyState === ws.OPEN) {
- data.symbols.forEach((symbol: string) => {
- ws.send(JSON.stringify({
- type: 'tick',
- data: {
- ...mockTickData,
- symbol: symbol.toUpperCase(),
- price: mockTickData.price + (Math.random() - 0.5) * 2,
- timestamp: new Date().toISOString(),
- },
- }));
- });
- } else {
- clearInterval(interval);
- }
- }, 5000);
- }
- break;
-
- case 'unsubscribe':
- if (data.subscriptionId) {
- ws.send(JSON.stringify({
- type: 'unsubscription_confirmed',
- subscriptionId: data.subscriptionId,
- timestamp: new Date().toISOString(),
- }));
- }
- break;
-
- default:
- ws.send(JSON.stringify({
- type: 'error',
- message: 'Unknown message type',
- }));
- }
- } catch (error) {
- ws.send(JSON.stringify({
- type: 'error',
- message: 'Invalid message format',
- }));
- }
- });
-
- ws.on('close', () => {
- appLogger.info('WebSocket connection closed');
- });
-
- ws.on('error', (error) => {
- appLogger.error('WebSocket error:', error);
- });
- });
-
- appLogger.info(`WebSocket server listening on port ${wsPort}`);
-}
-
-// Enhanced graceful shutdown handler
-async function gracefulShutdown(): Promise<void> {
- if (isShuttingDown) return;
- isShuttingDown = true;
-
- appLogger.info('Initiating graceful shutdown...');
-
- try {
- // Close WebSocket server
- if (webSocketServer) {
- webSocketServer.clients.forEach((client) => {
- client.terminate();
- });
- webSocketServer.close();
- appLogger.info('WebSocket server closed');
- }
-
- // Close connection pool
- await connectionPool.close();
- appLogger.info('Connection pool closed');
-
- // Clean up cache
- marketDataCache.destroy();
- appLogger.info('Cache destroyed');
-
- appLogger.info('Graceful shutdown completed');
- process.exit(0);
- } catch (error) {
- appLogger.error('Error during shutdown:', error);
- process.exit(1);
- }
-}
-
-// Enhanced start server function
-async function startServer(): Promise<void> {
- try {
- appLogger.info('Starting Enhanced Market Data Gateway...');
-
- // Initialize cache event listeners
- marketDataCache.on('hit', (key) => appLogger.debug(`Cache hit: ${key}`));
- marketDataCache.on('miss', (key) => appLogger.debug(`Cache miss: ${key}`));
- marketDataCache.on('evict', (key) => appLogger.debug(`Cache evict: ${key}`));
-
- // Initialize connection pool event listeners
- connectionPool.on('connectionCreated', (host) => appLogger.debug(`Connection created for: ${host}`));
- connectionPool.on('error', ({ host, error }) => appLogger.warn(`Connection error for ${host}: ${error}`));
-
- // Setup WebSocket server
- setupWebSocketServer();
-
- // Setup graceful shutdown handlers
- process.on('SIGTERM', gracefulShutdown);
- process.on('SIGINT', gracefulShutdown);
-
- // Log service status
- appLogger.info(`HTTP server starting on ${config.server.host}:${config.server.port}`);
- appLogger.info(`WebSocket server running on port ${config.server.port + 1}`);
- appLogger.info(`Data sources configured: ${config.dataSources.length}`);
- appLogger.info(`Enabled providers: ${config.dataSources.filter(ds => ds.enabled).length}`);
- appLogger.info(`Cache max size: ${marketDataCache['config'].maxSize}`);
- appLogger.info(`Connection pool max connections: ${connectionPool['config'].maxConnections}`);
- appLogger.info('Enhanced Market Data Gateway started successfully');
-
- } catch (error) {
- appLogger.error('Failed to start server:', error);
- process.exit(1);
- }
-}
-
-// Start the application
-if (require.main === module) {
- startServer();
-}
-
-export default {
- port: config.server.port,
- fetch: app.fetch,
-};
diff --git a/apps/core-services/market-data-gateway/src/index_clean.ts b/apps/core-services/market-data-gateway/src/index_clean.ts
deleted file mode 100644
index 126d00e..0000000
--- a/apps/core-services/market-data-gateway/src/index_clean.ts
+++ /dev/null
@@ -1,386 +0,0 @@
-// Market Data Gateway - Unified Implementation
-import { Hono } from 'hono';
-import { cors } from 'hono/cors';
-import { logger } from 'hono/logger';
-import { prettyJSON } from 'hono/pretty-json';
-import { WebSocketServer } from 'ws';
-
-// Types
-import { GatewayConfig } from './types/MarketDataGateway';
-
-// Simple logger interface
-interface Logger {
- info: (message: string, ...args: any[]) => void;
- error: (message: string, ...args: any[]) => void;
- warn: (message: string, ...args: any[]) => void;
- debug: (message: string, ...args: any[]) => void;
-}
-
-// Create application logger
-const appLogger: Logger = {
- info: (message: string, ...args: any[]) => console.log(`[MDG-UNIFIED] [INFO] ${message}`, ...args),
- error: (message: string, ...args: any[]) => console.error(`[MDG-UNIFIED] [ERROR] ${message}`, ...args),
- warn: (message: string, ...args: any[]) => console.warn(`[MDG-UNIFIED] [WARN] ${message}`, ...args),
- debug: (message: string, ...args: any[]) => console.debug(`[MDG-UNIFIED] [DEBUG] ${message}`, ...args),
-};
-
-// Configuration matching the GatewayConfig interface
-const config: GatewayConfig = {
- server: {
- port: parseInt(process.env.PORT || '3004'),
- host: process.env.HOST || '0.0.0.0',
- maxConnections: 1000,
- cors: {
- origins: ['http://localhost:3000', 'http://localhost:3001', 'http://localhost:8080'],
- methods: ['GET', 'POST', 'PUT', 'DELETE', 'OPTIONS'],
- headers: ['Content-Type', 'Authorization'],
- },
- },
- dataSources: [], // Array of DataSourceConfig, initially empty
- processing: {
- pipelines: [],
- bufferSize: 10000,
- batchSize: 100,
- flushIntervalMs: 1000,
- }, cache: {
- redis: {
- host: process.env.REDIS_HOST || 'localhost',
- port: parseInt(process.env.REDIS_PORT || '6379'),
- password: process.env.REDIS_PASSWORD,
- db: parseInt(process.env.REDIS_DB || '0'),
- },
- ttl: {
- quotes: 60000, // 1 minute
- trades: 300000, // 5 minutes
- candles: 86400000, // 24 hours
- orderbook: 30000, // 30 seconds
- },
- }, monitoring: {
- metrics: {
- enabled: true,
- port: parseInt(process.env.METRICS_PORT || '9090'),
- intervalMs: 5000,
- retention: '24h',
- },
- alerts: {
- enabled: true,
- thresholds: {
- latency: 1000,
- latencyMs: 1000,
- errorRate: 0.05,
- connectionLoss: 3,
- },
- },
- },
-};
-
-// Global state variables
-let webSocketServer: WebSocketServer | null = null;
-let isShuttingDown = false;
-
-// Create Hono application
-const app = new Hono();
-
-// Middleware setup
-app.use('*', logger());
-app.use('*', prettyJSON());
-app.use('*', cors({
- origin: config.server.cors.origins,
- allowMethods: config.server.cors.methods,
- allowHeaders: config.server.cors.headers,
-}));
-
-// Mock data for testing
-const mockTickData = {
- symbol: 'AAPL',
- price: 150.25,
- volume: 1000,
- timestamp: new Date().toISOString(),
- bid: 150.20,
- ask: 150.30,
-};
-
-const mockCandleData = {
- symbol: 'AAPL',
- timeframe: '1m',
- timestamp: new Date().toISOString(),
- open: 150.00,
- high: 150.50,
- low: 149.75,
- close: 150.25,
- volume: 5000,
-};
-
-// Health endpoints
-app.get('/health', async (c) => {
- return c.json({
- status: 'healthy',
- timestamp: new Date().toISOString(),
- service: 'market-data-gateway',
- version: '1.0.0',
- });
-});
-
-app.get('/health/readiness', async (c) => {
- return c.json({
- status: 'ready',
- timestamp: new Date().toISOString(),
- checks: {
- webSocket: webSocketServer ? 'connected' : 'disconnected',
- cache: 'available',
- },
- });
-});
-
-app.get('/health/liveness', async (c) => {
- return c.json({
- status: 'alive',
- timestamp: new Date().toISOString(),
- uptime: process.uptime(),
- });
-});
-
-// Gateway status endpoints
-app.get('/api/v1/gateway/status', async (c) => {
- return c.json({
- status: 'running',
- dataSources: config.dataSources.length,
- activeConnections: webSocketServer ? webSocketServer.clients.size : 0,
- timestamp: new Date().toISOString(),
- });
-});
-
-app.get('/api/v1/gateway/config', async (c) => {
- return c.json({
- server: config.server,
- dataSourcesCount: config.dataSources.length,
- processingConfig: config.processing,
- monitoring: config.monitoring,
- });
-});
-
-// Data source management endpoints
-app.get('/api/v1/sources', async (c) => {
- return c.json({
- dataSources: config.dataSources,
- total: config.dataSources.length,
- });
-});
-
-app.post('/api/v1/sources', async (c) => {
- try {
- const newSource = await c.req.json();
- // In a real implementation, validate and add the data source
- return c.json({
- message: 'Data source configuration received',
- source: newSource,
- status: 'pending_validation',
- });
- } catch (error) {
- return c.json({ error: 'Invalid data source configuration' }, 400);
- }
-});
-
-// Subscription management endpoints
-app.get('/api/v1/subscriptions', async (c) => {
- return c.json({
- subscriptions: [],
- total: 0,
- message: 'Subscription management not yet implemented',
- });
-});
-
-app.post('/api/v1/subscriptions', async (c) => {
- try {
- const subscription = await c.req.json();
- return c.json({
- subscriptionId: `sub_${Date.now()}`,
- status: 'active',
- subscription,
- });
- } catch (error) {
- return c.json({ error: 'Invalid subscription request' }, 400);
- }
-});
-
-// Market data endpoints
-app.get('/api/v1/data/tick/:symbol', async (c) => {
- const symbol = c.req.param('symbol');
- return c.json({
- ...mockTickData,
- symbol: symbol.toUpperCase(),
- });
-});
-
-app.get('/api/v1/data/candles/:symbol', async (c) => {
- const symbol = c.req.param('symbol');
- const timeframe = c.req.query('timeframe') || '1m';
- const limit = parseInt(c.req.query('limit') || '100');
-
- const candles = Array.from({ length: limit }, (_, i) => ({
- ...mockCandleData,
- symbol: symbol.toUpperCase(),
- timeframe,
- timestamp: new Date(Date.now() - i * 60000).toISOString(),
- }));
-
- return c.json({ candles });
-});
-
-// Metrics endpoints
-app.get('/api/v1/metrics', async (c) => {
- return c.json({
- system: {
- uptime: process.uptime(),
- memoryUsage: process.memoryUsage(),
- cpuUsage: process.cpuUsage(),
- },
- gateway: {
- activeConnections: webSocketServer ? webSocketServer.clients.size : 0,
- dataSourcesCount: config.dataSources.length,
- messagesProcessed: 0,
- },
- timestamp: new Date().toISOString(),
- });
-});
-
-// WebSocket server setup
-function setupWebSocketServer(): void {
- const wsPort = config.server.port + 1; // Use port + 1 for WebSocket
-
- webSocketServer = new WebSocketServer({
- port: wsPort,
- perMessageDeflate: false,
- });
-
- webSocketServer.on('connection', (ws, request) => {
- appLogger.info(`New WebSocket connection from ${request.socket.remoteAddress}`);
-
- ws.on('message', async (message) => {
- try {
- const data = JSON.parse(message.toString());
-
- switch (data.type) {
- case 'subscribe':
- if (data.symbols && Array.isArray(data.symbols)) {
- const subscriptionId = `sub_${Date.now()}`;
- ws.send(JSON.stringify({
- type: 'subscription_confirmed',
- subscriptionId,
- symbols: data.symbols,
- timestamp: new Date().toISOString(),
- }));
-
- // Send mock data every 5 seconds
- const interval = setInterval(() => {
- if (ws.readyState === ws.OPEN) {
- data.symbols.forEach((symbol: string) => {
- ws.send(JSON.stringify({
- type: 'tick',
- data: {
- ...mockTickData,
- symbol: symbol.toUpperCase(),
- price: mockTickData.price + (Math.random() - 0.5) * 2,
- timestamp: new Date().toISOString(),
- },
- }));
- });
- } else {
- clearInterval(interval);
- }
- }, 5000);
- }
- break;
-
- case 'unsubscribe':
- if (data.subscriptionId) {
- ws.send(JSON.stringify({
- type: 'unsubscription_confirmed',
- subscriptionId: data.subscriptionId,
- timestamp: new Date().toISOString(),
- }));
- }
- break;
-
- default:
- ws.send(JSON.stringify({
- type: 'error',
- message: 'Unknown message type',
- }));
- }
- } catch (error) {
- ws.send(JSON.stringify({
- type: 'error',
- message: 'Invalid message format',
- }));
- }
- });
-
- ws.on('close', () => {
- appLogger.info('WebSocket connection closed');
- });
-
- ws.on('error', (error) => {
- appLogger.error('WebSocket error:', error);
- });
- });
-
- appLogger.info(`WebSocket server listening on port ${wsPort}`);
-}
-
-// Graceful shutdown handler
-async function gracefulShutdown(): Promise<void> {
- if (isShuttingDown) return;
- isShuttingDown = true;
-
- appLogger.info('Initiating graceful shutdown...');
-
- try {
- // Close WebSocket server
- if (webSocketServer) {
- webSocketServer.clients.forEach((client) => {
- client.terminate();
- });
- webSocketServer.close();
- appLogger.info('WebSocket server closed');
- }
-
- appLogger.info('Graceful shutdown completed');
- process.exit(0);
- } catch (error) {
- appLogger.error('Error during shutdown:', error);
- process.exit(1);
- }
-}
-
-// Start server function
-async function startServer(): Promise<void> {
- try {
- appLogger.info('Starting Market Data Gateway...');
-
- // Setup WebSocket server
- setupWebSocketServer();
-
- // Setup graceful shutdown handlers
- process.on('SIGTERM', gracefulShutdown);
- process.on('SIGINT', gracefulShutdown);
-
- appLogger.info(`HTTP server starting on ${config.server.host}:${config.server.port}`);
- appLogger.info(`WebSocket server running on port ${config.server.port + 1}`);
- appLogger.info('Market Data Gateway started successfully');
-
- } catch (error) {
- appLogger.error('Failed to start server:', error);
- process.exit(1);
- }
-}
-
-// Start the application
-if (require.main === module) {
- startServer();
-}
-
-export default {
- port: config.server.port,
- fetch: app.fetch,
-};
diff --git a/apps/core-services/market-data-gateway/src/realtime/index.ts b/apps/core-services/market-data-gateway/src/realtime/index.ts
deleted file mode 100644
index d21ca34..0000000
--- a/apps/core-services/market-data-gateway/src/realtime/index.ts
+++ /dev/null
@@ -1,7 +0,0 @@
-// Real-time market data processing components
-export { SubscriptionManager } from '../services/SubscriptionManager';
-export { DataSourceManager } from '../services/DataSourceManager';
-export { EventPublisher } from '../services/EventPublisher';
-export { ProcessingEngine } from '../services/ProcessingEngine';
-export { MarketDataService } from '../services/MarketDataService';
-export { MarketDataGatewayService } from '../services/MarketDataGatewayService';
diff --git a/apps/core-services/market-data-gateway/src/services/AdvancedCache.ts b/apps/core-services/market-data-gateway/src/services/AdvancedCache.ts
deleted file mode 100644
index e52bb34..0000000
--- a/apps/core-services/market-data-gateway/src/services/AdvancedCache.ts
+++ /dev/null
@@ -1,361 +0,0 @@
-import { EventEmitter } from 'events';
-
-export interface CacheEntry<T> {
- data: T;
- timestamp: number;
- ttl: number;
- hits: number;
- lastAccessed: number;
-}
-
-export interface CacheStats {
- totalEntries: number;
- memoryUsage: number;
- hitRate: number;
- totalHits: number;
- totalMisses: number;
- averageAccessTime: number;
-}
-
-export interface CacheConfig {
- maxSize: number;
- defaultTtl: number;
- cleanupInterval: number;
- enableStats: boolean;
- compressionEnabled: boolean;
-}
-
-export class AdvancedCache<T = any> extends EventEmitter {
- private cache = new Map<string, CacheEntry<T>>();
- private stats = {
- hits: 0,
- misses: 0,
- totalAccessTime: 0,
- accessCount: 0,
- };
- private cleanupTimer: NodeJS.Timeout | null = null;
-
- constructor(private config: CacheConfig) {
- super();
- this.startCleanupTimer();
- }
-
- /**
- * Get value from cache
- */
- get(key: string): T | null {
- const startTime = Date.now();
- const entry = this.cache.get(key);
-
- if (!entry) {
- this.stats.misses++;
- this.emit('miss', key);
- return null;
- }
-
- // Check if entry has expired
- if (Date.now() > entry.timestamp + entry.ttl) {
- this.cache.delete(key);
- this.stats.misses++;
- this.emit('expired', key, entry);
- return null;
- }
-
- // Update access statistics
- entry.hits++;
- entry.lastAccessed = Date.now();
- this.stats.hits++;
-
- if (this.config.enableStats) {
- this.stats.totalAccessTime += Date.now() - startTime;
- this.stats.accessCount++;
- }
-
- this.emit('hit', key, entry);
- return entry.data;
- }
-
- /**
- * Set value in cache
- */
- set(key: string, value: T, ttl?: number): void {
- const effectiveTtl = ttl || this.config.defaultTtl;
-
- // Check cache size limits
- if (this.cache.size >= this.config.maxSize && !this.cache.has(key)) {
- this.evictLeastUsed();
- }
-
- const entry: CacheEntry<T> = {
- data: value,
- timestamp: Date.now(),
- ttl: effectiveTtl,
- hits: 0,
- lastAccessed: Date.now(),
- };
-
- this.cache.set(key, entry);
- this.emit('set', key, entry);
- }
-
- /**
- * Delete value from cache
- */
- delete(key: string): boolean {
- const deleted = this.cache.delete(key);
- if (deleted) {
- this.emit('delete', key);
- }
- return deleted;
- }
-
- /**
- * Check if key exists in cache
- */
- has(key: string): boolean {
- const entry = this.cache.get(key);
- if (!entry) return false;
-
- // Check if expired
- if (Date.now() > entry.timestamp + entry.ttl) {
- this.cache.delete(key);
- return false;
- }
-
- return true;
- }
-
- /**
- * Clear all cache entries
- */
- clear(): void {
- this.cache.clear();
- this.resetStats();
- this.emit('clear');
- }
-
- /**
- * Get cache statistics
- */
- getStats(): CacheStats {
- const memoryUsage = this.estimateMemoryUsage();
- const hitRate = this.stats.hits + this.stats.misses > 0
- ? this.stats.hits / (this.stats.hits + this.stats.misses)
- : 0;
- const averageAccessTime = this.stats.accessCount > 0
- ? this.stats.totalAccessTime / this.stats.accessCount
- : 0;
-
- return {
- totalEntries: this.cache.size,
- memoryUsage,
- hitRate,
- totalHits: this.stats.hits,
- totalMisses: this.stats.misses,
- averageAccessTime,
- };
- }
-
- /**
- * Get all cache keys
- */
- keys(): string[] {
- return Array.from(this.cache.keys());
- }
-
- /**
- * Get cache size
- */
- size(): number {
- return this.cache.size;
- }
-
- /**
- * Get or set with async loader function
- */
- async getOrSet<K>(
- key: string,
- loader: () => Promise<K>,
- ttl?: number
- ): Promise<K> {
- const cached = this.get(key) as K;
- if (cached !== null) {
- return cached;
- }
-
- try {
- const value = await loader();
- this.set(key, value as any, ttl);
- return value;
- } catch (error) {
- this.emit('error', key, error);
- throw error;
- }
- }
-
- /**
- * Batch get multiple keys
- */
- mget(keys: string[]): Map<string, T | null> {
- const result = new Map<string, T | null>();
- for (const key of keys) {
- result.set(key, this.get(key));
- }
- return result;
- }
-
- /**
- * Batch set multiple key-value pairs
- */
- mset(entries: Map<string, T>, ttl?: number): void {
- for (const [key, value] of entries) {
- this.set(key, value, ttl);
- }
- }
-
- /**
- * Clean up expired entries
- */
- cleanup(): number {
- const now = Date.now();
- let removedCount = 0;
-
- for (const [key, entry] of this.cache.entries()) {
- if (now > entry.timestamp + entry.ttl) {
- this.cache.delete(key);
- removedCount++;
- this.emit('expired', key, entry);
- }
- }
-
- return removedCount;
- }
-
- /**
- * Evict least recently used entries
- */
- private evictLeastUsed(): void {
- let oldestKey: string | null = null;
- let oldestTime = Date.now();
-
- for (const [key, entry] of this.cache.entries()) {
- if (entry.lastAccessed < oldestTime) {
- oldestTime = entry.lastAccessed;
- oldestKey = key;
- }
- }
-
- if (oldestKey) {
- this.cache.delete(oldestKey);
- this.emit('evict', oldestKey);
- }
- }
-
- /**
- * Estimate memory usage in bytes
- */
- private estimateMemoryUsage(): number {
- let totalSize = 0;
-
- for (const [key, entry] of this.cache.entries()) {
- // Rough estimation: key size + data size (as JSON string)
- totalSize += key.length * 2; // UTF-16 encoding
- totalSize += JSON.stringify(entry.data).length * 2;
- totalSize += 64; // Overhead for entry metadata
- }
-
- return totalSize;
- }
-
- /**
- * Reset statistics
- */
- private resetStats(): void {
- this.stats = {
- hits: 0,
- misses: 0,
- totalAccessTime: 0,
- accessCount: 0,
- };
- }
-
- /**
- * Start cleanup timer
- */
- private startCleanupTimer(): void {
- if (this.cleanupTimer) {
- clearInterval(this.cleanupTimer);
- }
-
- this.cleanupTimer = setInterval(() => {
- const removed = this.cleanup();
- if (removed > 0) {
- this.emit('cleanup', removed);
- }
- }, this.config.cleanupInterval);
- }
-
- /**
- * Stop cleanup timer and close cache
- */
- destroy(): void {
- if (this.cleanupTimer) {
- clearInterval(this.cleanupTimer);
- this.cleanupTimer = null;
- }
- this.clear();
- this.removeAllListeners();
- }
-}
-
-// Specialized cache for market data
-export class MarketDataCache extends AdvancedCache {
- constructor() {
- super({
- maxSize: 10000,
- defaultTtl: 60000, // 1 minute
- cleanupInterval: 30000, // 30 seconds
- enableStats: true,
- compressionEnabled: false,
- });
- }
-
- // Market data specific cache keys
- getQuoteKey(symbol: string): string {
- return `quote:${symbol}`;
- }
-
- getCandleKey(symbol: string, timeframe: string, timestamp: Date): string {
- return `candle:${symbol}:${timeframe}:${timestamp.getTime()}`;
- }
-
- getOrderBookKey(symbol: string): string {
- return `orderbook:${symbol}`;
- }
-
- // Market data specific TTLs
- setQuote(symbol: string, data: any): void {
- this.set(this.getQuoteKey(symbol), data, 60000); // 1 minute
- }
-
- setCandle(symbol: string, timeframe: string, timestamp: Date, data: any): void {
- const ttl = this.getCandleTTL(timeframe);
- this.set(this.getCandleKey(symbol, timeframe, timestamp), data, ttl);
- }
-
- setOrderBook(symbol: string, data: any): void {
- this.set(this.getOrderBookKey(symbol), data, 30000); // 30 seconds
- }
-
- private getCandleTTL(timeframe: string): number {
- const ttlMap: Record<string, number> = {
- '1m': 60000, // 1 minute
- '5m': 300000, // 5 minutes
- '15m': 900000, // 15 minutes
- '1h': 3600000, // 1 hour
- '1d': 86400000, // 24 hours
- };
-
- return ttlMap[timeframe] || 300000; // Default 5 minutes
- }
-}
diff --git a/apps/core-services/market-data-gateway/src/services/CacheManager.ts b/apps/core-services/market-data-gateway/src/services/CacheManager.ts
deleted file mode 100644
index 03fd8da..0000000
--- a/apps/core-services/market-data-gateway/src/services/CacheManager.ts
+++ /dev/null
@@ -1,372 +0,0 @@
-import Redis from 'ioredis';
-import { EventEmitter } from 'events';
-import {
- MarketDataTick,
- MarketDataCandle,
- CacheConfig,
- Logger,
- HealthStatus,
- GatewayMetrics
-} from '../types/MarketDataGateway';
-
-interface CacheMetrics {
- hits: number;
- misses: number;
- sets: number;
- deletes: number;
- errors: number;
- totalRequests: number;
- hitRate: number;
-}
-
-interface CacheEntry<T> {
- data: T;
- timestamp: number;
- ttl: number;
-}
-
-export class CacheManager extends EventEmitter {
- private redis: Redis;
- private config: CacheConfig;
- private logger: Logger;
- private metrics: CacheMetrics;
- private isInitialized: boolean = false;
-
- constructor(config: CacheConfig, logger: Logger) {
- super();
- this.config = config;
- this.logger = logger;
- this.redis = new Redis({
- host: config.redis.host,
- port: config.redis.port,
- password: config.redis.password,
- db: config.redis.database || 0,
- retryDelayOnFailover: 100,
- maxRetriesPerRequest: 3,
- lazyConnect: true,
- });
-
- this.metrics = {
- hits: 0,
- misses: 0,
- sets: 0,
- deletes: 0,
- errors: 0,
- totalRequests: 0,
- hitRate: 0,
- };
-
- this.setupEventHandlers();
- }
-
- private setupEventHandlers(): void {
- this.redis.on('connect', () => {
- this.logger.info('Cache manager connected to Redis');
- this.isInitialized = true;
- this.emit('connected');
- });
-
- this.redis.on('error', (error) => {
- this.logger.error('Redis connection error:', error);
- this.metrics.errors++;
- this.emit('error', error);
- });
-
- this.redis.on('close', () => {
- this.logger.warn('Redis connection closed');
- this.isInitialized = false;
- this.emit('disconnected');
- });
- }
-
- async initialize(): Promise<void> {
- try {
- await this.redis.connect();
- this.logger.info('Cache manager initialized successfully');
- } catch (error) {
- this.logger.error('Failed to initialize cache manager:', error);
- throw error;
- }
- }
-
- async shutdown(): Promise<void> {
- try {
- await this.redis.quit();
- this.logger.info('Cache manager shut down successfully');
- } catch (error) {
- this.logger.error('Error shutting down cache manager:', error);
- }
- }
-
- // Market data caching methods
- async cacheTick(symbol: string, tick: MarketDataTick, ttl?: number): Promise<void> {
- try {
- const key = this.getTickKey(symbol);
- const cacheEntry: CacheEntry<MarketDataTick> = {
- data: tick,
- timestamp: Date.now(),
- ttl: ttl || this.config.tickTtl,
- };
-
- await this.redis.setex(key, cacheEntry.ttl, JSON.stringify(cacheEntry));
- this.metrics.sets++;
-
- // Also cache latest price for quick access
- await this.redis.setex(
- this.getLatestPriceKey(symbol),
- this.config.tickTtl,
- tick.price.toString()
- );
-
- this.emit('tick-cached', { symbol, tick });
- } catch (error) {
- this.logger.error(`Failed to cache tick for ${symbol}:`, error);
- this.metrics.errors++;
- throw error;
- }
- }
-
- async getLatestTick(symbol: string): Promise<MarketDataTick | null> {
- try {
- this.metrics.totalRequests++;
- const key = this.getTickKey(symbol);
- const cached = await this.redis.get(key);
-
- if (cached) {
- this.metrics.hits++;
- const entry: CacheEntry<MarketDataTick> = JSON.parse(cached);
- return entry.data;
- }
-
- this.metrics.misses++;
- return null;
- } catch (error) {
- this.logger.error(`Failed to get latest tick for ${symbol}:`, error);
- this.metrics.errors++;
- return null;
- } finally {
- this.updateHitRate();
- }
- }
-
- async cacheCandle(symbol: string, timeframe: string, candle: MarketDataCandle, ttl?: number): Promise<void> {
- try {
- const key = this.getCandleKey(symbol, timeframe, candle.timestamp);
- const cacheEntry: CacheEntry<MarketDataCandle> = {
- data: candle,
- timestamp: Date.now(),
- ttl: ttl || this.config.candleTtl,
- };
-
- await this.redis.setex(key, cacheEntry.ttl, JSON.stringify(cacheEntry));
- this.metrics.sets++;
-
- // Also add to sorted set for range queries
- await this.redis.zadd(
- this.getCandleSetKey(symbol, timeframe),
- candle.timestamp,
- key
- );
-
- this.emit('candle-cached', { symbol, timeframe, candle });
- } catch (error) {
- this.logger.error(`Failed to cache candle for ${symbol}:`, error);
- this.metrics.errors++;
- throw error;
- }
- }
-
- async getCandleRange(
- symbol: string,
- timeframe: string,
- startTime: number,
- endTime: number
- ): Promise<MarketDataCandle[]> {
- try {
- this.metrics.totalRequests++;
- const setKey = this.getCandleSetKey(symbol, timeframe);
- const candleKeys = await this.redis.zrangebyscore(setKey, startTime, endTime);
-
- if (candleKeys.length === 0) {
- this.metrics.misses++;
- return [];
- }
-
- const pipeline = this.redis.pipeline();
- candleKeys.forEach(key => pipeline.get(key));
- const results = await pipeline.exec();
-
- const candles: MarketDataCandle[] = [];
- let hasData = false;
-
- results?.forEach((result) => {
- if (result && result[1]) {
- hasData = true;
- try {
- const entry: CacheEntry<MarketDataCandle> = JSON.parse(result[1] as string);
- candles.push(entry.data);
- } catch (parseError) {
- this.logger.error('Failed to parse cached candle:', parseError);
- }
- }
- });
-
- if (hasData) {
- this.metrics.hits++;
- } else {
- this.metrics.misses++;
- }
-
- return candles.sort((a, b) => a.timestamp - b.timestamp);
- } catch (error) {
- this.logger.error(`Failed to get candle range for ${symbol}:`, error);
- this.metrics.errors++;
- return [];
- } finally {
- this.updateHitRate();
- }
- }
-
- // Generic caching methods
- async set<T>(key: string, value: T, ttl?: number): Promise<void> {
- try {
- const cacheEntry: CacheEntry<T> = {
- data: value,
- timestamp: Date.now(),
- ttl: ttl || this.config.defaultTtl,
- };
-
- await this.redis.setex(key, cacheEntry.ttl, JSON.stringify(cacheEntry));
- this.metrics.sets++;
- } catch (error) {
- this.logger.error(`Failed to set cache key ${key}:`, error);
- this.metrics.errors++;
- throw error;
- }
- }
-
- async get<T>(key: string): Promise<T | null> {
- try {
- this.metrics.totalRequests++;
- const cached = await this.redis.get(key);
-
- if (cached) {
- this.metrics.hits++;
- const entry: CacheEntry<T> = JSON.parse(cached);
- return entry.data;
- }
-
- this.metrics.misses++;
- return null;
- } catch (error) {
- this.logger.error(`Failed to get cache key ${key}:`, error);
- this.metrics.errors++;
- return null;
- } finally {
- this.updateHitRate();
- }
- }
-
- async delete(key: string): Promise<void> {
- try {
- await this.redis.del(key);
- this.metrics.deletes++;
- } catch (error) {
- this.logger.error(`Failed to delete cache key ${key}:`, error);
- this.metrics.errors++;
- throw error;
- }
- }
-
- async deletePattern(pattern: string): Promise<number> {
- try {
- const keys = await this.redis.keys(pattern);
- if (keys.length > 0) {
- const deleted = await this.redis.del(...keys);
- this.metrics.deletes += deleted;
- return deleted;
- }
- return 0;
- } catch (error) {
- this.logger.error(`Failed to delete pattern ${pattern}:`, error);
- this.metrics.errors++;
- throw error;
- }
- }
-
- // Cache management
- async flush(): Promise<void> {
- try {
- await this.redis.flushdb();
- this.logger.info('Cache flushed successfully');
- } catch (error) {
- this.logger.error('Failed to flush cache:', error);
- this.metrics.errors++;
- throw error;
- }
- }
-
- async getSize(): Promise<number> {
- try {
- return await this.redis.dbsize();
- } catch (error) {
- this.logger.error('Failed to get cache size:', error);
- return 0;
- }
- }
-
- async getMemoryUsage(): Promise<{ used: number; peak: number }> {
- try {
- const info = await this.redis.memory('usage');
- const stats = await this.redis.memory('stats');
-
- return {
- used: parseInt(info as string) || 0,
- peak: parseInt(stats['peak.allocated'] as string) || 0,
- };
- } catch (error) {
- this.logger.error('Failed to get memory usage:', error);
- return { used: 0, peak: 0 };
- }
- }
-
- // Health and metrics
- getHealth(): HealthStatus {
- return {
- status: this.isInitialized ? 'healthy' : 'unhealthy',
- message: this.isInitialized ? 'Cache manager is operational' : 'Cache manager not connected',
- timestamp: new Date().toISOString(),
- details: {
- connected: this.isInitialized,
- metrics: this.metrics,
- },
- };
- }
-
- getMetrics(): CacheMetrics {
- return { ...this.metrics };
- }
-
- private updateHitRate(): void {
- this.metrics.hitRate = this.metrics.totalRequests > 0
- ? this.metrics.hits / this.metrics.totalRequests
- : 0;
- }
-
- // Key generation methods
- private getTickKey(symbol: string): string {
- return `tick:${symbol}:latest`;
- }
-
- private getLatestPriceKey(symbol: string): string {
- return `price:${symbol}:latest`;
- }
-
- private getCandleKey(symbol: string, timeframe: string, timestamp: number): string {
- return `candle:${symbol}:${timeframe}:${timestamp}`;
- }
-
- private getCandleSetKey(symbol: string, timeframe: string): string {
- return `candles:${symbol}:${timeframe}`;
- }
-}
diff --git a/apps/core-services/market-data-gateway/src/services/ConnectionPoolManager.ts b/apps/core-services/market-data-gateway/src/services/ConnectionPoolManager.ts
deleted file mode 100644
index 97e1773..0000000
--- a/apps/core-services/market-data-gateway/src/services/ConnectionPoolManager.ts
+++ /dev/null
@@ -1,346 +0,0 @@
-import { EventEmitter } from 'eventemitter3';
-import {
- BunHttpClient,
- RequestConfig,
- HttpResponse,
- ConnectionStats,
- HttpClientConfig
-} from '@stock-bot/http-client';
-
-export interface ConnectionPoolConfig {
- maxConnections: number;
- maxConnectionsPerHost: number;
- connectionTimeout: number;
- requestTimeout: number;
- retryAttempts: number;
- retryDelay: number;
- keepAlive: boolean;
- maxIdleTime: number;
-}
-
-export interface QueuedRequest {
- id: string;
- config: RequestConfig;
- resolve: (value: any) => void;
- reject: (error: any) => void;
- timestamp: number;
- retryCount: number;
-}
-
-export class ConnectionPoolManager extends EventEmitter {
- private clients = new Map<string, BunHttpClient>();
- private activeRequests = new Map<string, number>(); // host -> count
- private requestQueue: QueuedRequest[] = [];
- private stats = {
- totalConnections: 0,
- successfulRequests: 0,
- failedRequests: 0,
- totalResponseTime: 0,
- requestCount: 0,
- };
- private isProcessingQueue = false;
- private queueProcessor?: NodeJS.Timeout;
-
- constructor(private config: ConnectionPoolConfig) {
- super();
- this.startQueueProcessor();
- }
-
- /**
- * Get or create a client for a host
- */
- private getClient(host: string): BunHttpClient {
- if (!this.clients.has(host)) {
- const client = new BunHttpClient({
- baseURL: `https://${host}`,
- timeout: this.config.requestTimeout,
- retries: this.config.retryAttempts,
- retryDelay: this.config.retryDelay,
- keepAlive: this.config.keepAlive,
- headers: {
- 'User-Agent': 'StockBot-MarketDataGateway/1.0',
- 'Accept': 'application/json',
- },
- validateStatus: (status: number) => status < 500,
- });
-
- // Listen for events from the client
- client.on('response', (data) => {
- const responseTime = data.response.timing.duration;
- this.updateStats(true, responseTime);
- this.emit('response', {
- host,
- responseTime,
- status: data.response.status
- });
- });
-
- client.on('error', (data) => {
- const responseTime = data.error?.config?.metadata?.startTime
- ? Date.now() - data.error.config.metadata.startTime
- : 0;
-
- this.updateStats(false, responseTime);
- this.emit('error', {
- host,
- error: data.error.message,
- responseTime
- });
- });
-
- this.clients.set(host, client);
- this.activeRequests.set(host, 0);
- this.stats.totalConnections++;
-
- this.emit('connectionCreated', host);
- }
-
- return this.clients.get(host)!;
- }
-
- /**
- * Make an HTTP request with connection pooling
- */
- async request<T = any>(config: RequestConfig): Promise<T> {
- return new Promise((resolve, reject) => {
- const requestId = this.generateRequestId();
- const queuedRequest: QueuedRequest = {
- id: requestId,
- config,
- resolve,
- reject,
- timestamp: Date.now(),
- retryCount: 0,
- };
-
- this.requestQueue.push(queuedRequest);
- this.processQueue();
- });
- }
-
- /**
- * Process the request queue
- */
- private async processQueue(): Promise<void> {
- if (this.isProcessingQueue || this.requestQueue.length === 0) {
- return;
- }
-
- this.isProcessingQueue = true;
-
- while (this.requestQueue.length > 0) {
- const request = this.requestQueue.shift()!;
-
- try {
- const host = this.extractHost(request.config.url || '');
- const currentConnections = this.activeRequests.get(host) || 0;
-
- // Check connection limits
- if (currentConnections >= this.config.maxConnectionsPerHost) {
- // Put request back in queue
- this.requestQueue.unshift(request);
- break;
- }
-
- // Check global connection limit
- const totalActive = Array.from(this.activeRequests.values()).reduce((sum, count) => sum + count, 0);
- if (totalActive >= this.config.maxConnections) {
- this.requestQueue.unshift(request);
- break;
- }
-
- // Execute the request
- this.executeRequest(request, host);
-
- } catch (error) {
- request.reject(error);
- }
- }
-
- this.isProcessingQueue = false;
- }
-
- /**
- * Execute a single request
- */
- private async executeRequest(request: QueuedRequest, host: string): Promise<void> {
- const client = this.getClient(host);
-
- // Increment active connections
- this.activeRequests.set(host, (this.activeRequests.get(host) || 0) + 1);
-
- try {
- // Add metadata to track timing
- if (!request.config.metadata) {
- request.config.metadata = {};
- }
- request.config.metadata.startTime = Date.now();
-
- // Execute request using our client
- const response = await client.request(request.config);
- request.resolve(response.data);
-
- } catch (error: any) {
- // No need to handle retries explicitly as the BunHttpClient handles them internally
- request.reject(error);
-
- // Emit retry event for monitoring
- if (error.retryCount) {
- this.emit('retry', {
- requestId: request.id,
- retryCount: error.retryCount,
- error
- });
- }
- } finally {
- // Decrement active connections
- this.activeRequests.set(host, Math.max(0, (this.activeRequests.get(host) || 0) - 1));
- }
- }
-
- /**
- * Extract host from URL
- */
- private extractHost(url: string): string {
- try {
- const urlObj = new URL(url);
- return urlObj.host;
- } catch {
- return 'default';
- }
- }
-
- /**
- * Generate unique request ID
- */
- private generateRequestId(): string {
- return `req_${Date.now()}_${Math.random().toString(36).substr(2, 9)}`;
- }
-
- /**
- * Update statistics
- */
- private updateStats(success: boolean, responseTime: number): void {
- this.stats.requestCount++;
- this.stats.totalResponseTime += responseTime;
-
- if (success) {
- this.stats.successfulRequests++;
- } else {
- this.stats.failedRequests++;
- }
- }
-
- /**
- * Get connection pool statistics
- */
- getStats(): ConnectionStats {
- const totalActive = Array.from(this.activeRequests.values()).reduce((sum, count) => sum + count, 0);
- const averageResponseTime = this.stats.requestCount > 0
- ? this.stats.totalResponseTime / this.stats.requestCount
- : 0;
- const utilization = this.config.maxConnections > 0
- ? totalActive / this.config.maxConnections
- : 0;
-
- // Combine our stats with the stats from all clients
- const clientStats = Array.from(this.clients.values()).map(client => client.getStats());
-
- let successfulRequests = this.stats.successfulRequests;
- let failedRequests = this.stats.failedRequests;
-
- for (const stats of clientStats) {
- successfulRequests += stats.successfulRequests;
- failedRequests += stats.failedRequests;
- }
-
- return {
- activeConnections: totalActive,
- totalConnections: this.stats.totalConnections,
- successfulRequests,
- failedRequests,
- averageResponseTime,
- connectionPoolUtilization: utilization,
- requestsPerSecond: 0 // Will be calculated by the http-client
- };
- }
-
- /**
- * Start queue processor timer
- */
- private startQueueProcessor(): void {
- this.queueProcessor = setInterval(() => {
- this.processQueue();
- }, 100); // Process queue every 100ms
- }
-
- /**
- * Close all connections and clean up
- */
- async close(): Promise<void> {
- // Stop the queue processor
- if (this.queueProcessor) {
- clearInterval(this.queueProcessor);
- }
-
- // Wait for pending requests to complete (with timeout)
- const timeout = 30000; // 30 seconds
- const startTime = Date.now();
-
- while (this.requestQueue.length > 0 && Date.now() - startTime < timeout) {
- await new Promise(resolve => setTimeout(resolve, 100));
- }
-
- // Clear remaining requests
- while (this.requestQueue.length > 0) {
- const request = this.requestQueue.shift()!;
- request.reject(new Error('Connection pool closing'));
- }
-
- // Close all clients
- const closePromises = Array.from(this.clients.values()).map(client => client.close());
- await Promise.all(closePromises);
-
- // Clear clients and requests
- this.clients.clear();
- this.activeRequests.clear();
-
- this.emit('closed');
- }
-
- /**
- * Health check for the connection pool
- */
- async healthCheck(): Promise<{ healthy: boolean; details: any }> {
- const stats = this.getStats();
- const queueSize = this.requestQueue.length;
-
- // Check health of all clients
- const clientHealthChecks = await Promise.all(
- Array.from(this.clients.entries()).map(async ([host, client]) => {
- const health = await client.healthCheck();
- return {
- host,
- healthy: health.healthy,
- details: health.details
- };
- })
- );
-
- const healthy =
- stats.connectionPoolUtilization < 0.9 && // Less than 90% utilization
- queueSize < 100 && // Queue not too large
- stats.averageResponseTime < 5000 && // Average response time under 5 seconds
- clientHealthChecks.every(check => check.healthy); // All clients healthy
-
- return {
- healthy,
- details: {
- stats,
- queueSize,
- clients: clientHealthChecks,
- connections: Array.from(this.clients.keys()),
- },
- };
- }
-}
diff --git a/apps/core-services/market-data-gateway/src/services/DataNormalizer.ts b/apps/core-services/market-data-gateway/src/services/DataNormalizer.ts
deleted file mode 100644
index 2d7579a..0000000
--- a/apps/core-services/market-data-gateway/src/services/DataNormalizer.ts
+++ /dev/null
@@ -1,396 +0,0 @@
-import { dataProviderConfigs, DataProviderConfig } from '../config/DataProviderConfig';
-
-// Define local types for market data
-interface MarketDataType {
- symbol: string;
- price: number;
- bid: number;
- ask: number;
- volume: number;
- timestamp: Date;
-}
-
-interface OHLCVType {
- symbol: string;
- timestamp: Date;
- open: number;
- high: number;
- low: number;
- close: number;
- volume: number;
-}
-
-export interface DataNormalizationResult<T = any> {
- success: boolean;
- data?: T;
- error?: string;
- source: string;
- timestamp: Date;
- processingTimeMs: number;
-}
-
-export interface DataQualityMetrics {
- completeness: number; // 0-1
- accuracy: number; // 0-1
- timeliness: number; // 0-1
- consistency: number; // 0-1
- overall: number; // 0-1
-}
-
-export class DataNormalizer {
- private readonly providerConfigs: Record<string, DataProviderConfig>;
-
- constructor() {
- this.providerConfigs = dataProviderConfigs;
- }
-
- /**
- * Normalize market data from different providers to our standard format
- */
- normalizeMarketData(rawData: any, source: string): DataNormalizationResult<MarketDataType> {
- const startTime = Date.now();
- try {
- let normalizedData: MarketDataType;
-
- switch (source.toLowerCase()) {
- case 'alpha-vantage':
- normalizedData = this.normalizeAlphaVantage(rawData);
- break;
- case 'yahoo-finance':
- normalizedData = this.normalizeYahooFinance(rawData);
- break;
- case 'polygon':
- normalizedData = this.normalizePolygon(rawData);
- break;
- default:
- return {
- success: false,
- error: `Unsupported data source: ${source}`,
- source,
- timestamp: new Date(),
- processingTimeMs: Date.now() - startTime,
- };
- }
-
- // Validate the normalized data
- if (!this.validateMarketData(normalizedData)) {
- return {
- success: false,
- error: 'Data validation failed',
- source,
- timestamp: new Date(),
- processingTimeMs: Date.now() - startTime,
- };
- }
-
- return {
- success: true,
- data: normalizedData,
- source,
- timestamp: new Date(),
- processingTimeMs: Date.now() - startTime,
- };
- } catch (error) {
- return {
- success: false,
- error: error instanceof Error ? error.message : 'Unknown error',
- source,
- timestamp: new Date(),
- processingTimeMs: Date.now() - startTime,
- };
- }
- }
-
- /**
- * Normalize OHLCV data from different providers
- */
- normalizeOHLCV(rawData: any, source: string): DataNormalizationResult<OHLCVType[]> {
- const startTime = Date.now();
- try {
- let normalizedData: OHLCVType[];
-
- switch (source.toLowerCase()) {
- case 'alpha-vantage':
- normalizedData = this.normalizeAlphaVantageOHLCV(rawData);
- break;
- case 'yahoo-finance':
- normalizedData = this.normalizeYahooFinanceOHLCV(rawData);
- break;
- case 'polygon':
- normalizedData = this.normalizePolygonOHLCV(rawData);
- break;
- default:
- return {
- success: false,
- error: `Unsupported data source: ${source}`,
- source,
- timestamp: new Date(),
- processingTimeMs: Date.now() - startTime,
- };
- }
-
- // Validate each OHLCV entry
- const validData = normalizedData.filter(item => this.validateOHLCV(item));
-
- if (validData.length === 0) {
- return {
- success: false,
- error: 'No valid OHLCV data after normalization',
- source,
- timestamp: new Date(),
- processingTimeMs: Date.now() - startTime,
- };
- }
-
- return {
- success: true,
- data: validData,
- source,
- timestamp: new Date(),
- processingTimeMs: Date.now() - startTime,
- };
- } catch (error) {
- return {
- success: false,
- error: error instanceof Error ? error.message : 'Unknown error',
- source,
- timestamp: new Date(),
- processingTimeMs: Date.now() - startTime,
- };
- }
- }
-
- private normalizeAlphaVantage(data: any): MarketDataType {
- const quote = data['Global Quote'];
- return {
- symbol: quote['01. symbol'],
- price: parseFloat(quote['05. price']),
- bid: parseFloat(quote['05. price']) - 0.01, // Approximate bid/ask
- ask: parseFloat(quote['05. price']) + 0.01,
- volume: parseInt(quote['06. volume']),
- timestamp: new Date(),
- };
- }
- private normalizeYahooFinance(data: any): MarketDataType {
- return {
- symbol: data.symbol,
- price: data.regularMarketPrice,
- bid: data.bid || data.regularMarketPrice - 0.01,
- ask: data.ask || data.regularMarketPrice + 0.01,
- volume: data.regularMarketVolume,
- timestamp: new Date(data.regularMarketTime * 1000),
- };
- }
-
- private normalizePolygon(data: any): MarketDataType {
- // Polygon.io format normalization
- return {
- symbol: data.T || data.symbol,
- price: data.c || data.price,
- bid: data.b || data.bid,
- ask: data.a || data.ask,
- volume: data.v || data.volume,
- timestamp: new Date(data.t || data.timestamp),
- };
- }
-
- private normalizeAlphaVantageOHLCV(data: any): OHLCVType[] {
- const timeSeries = data['Time Series (1min)'] || data['Time Series (5min)'] || data['Time Series (Daily)'];
- const symbol = data['Meta Data']['2. Symbol'];
-
- return Object.entries(timeSeries).map(([timestamp, values]: [string, any]) => ({
- symbol,
- timestamp: new Date(timestamp),
- open: parseFloat(values['1. open']),
- high: parseFloat(values['2. high']),
- low: parseFloat(values['3. low']),
- close: parseFloat(values['4. close']),
- volume: parseInt(values['5. volume']),
- })).sort((a, b) => b.timestamp.getTime() - a.timestamp.getTime());
- }
- private normalizeYahooFinanceOHLCV(data: any): OHLCVType[] {
- const result = data.chart.result[0];
- const timestamps = result.timestamp;
- const quotes = result.indicators.quote[0];
-
- return timestamps.map((timestamp: number, index: number) => ({
- symbol: result.meta.symbol,
- timestamp: new Date(timestamp * 1000),
- open: quotes.open[index],
- high: quotes.high[index],
- low: quotes.low[index],
- close: quotes.close[index],
- volume: quotes.volume[index],
- }));
- }
-
- private normalizePolygonOHLCV(data: any): OHLCVType[] {
- // Polygon.io aggregates format
- if (data.results && Array.isArray(data.results)) {
- return data.results.map((candle: any) => ({
- symbol: data.ticker || candle.T,
- timestamp: new Date(candle.t),
- open: candle.o,
- high: candle.h,
- low: candle.l,
- close: candle.c,
- volume: candle.v,
- }));
- }
-
- return [];
- }
-
- /**
- * Validate market data quality
- */
- validateMarketData(data: MarketDataType): boolean {
- return (
- typeof data.symbol === 'string' &&
- data.symbol.length > 0 &&
- typeof data.price === 'number' &&
- data.price > 0 &&
- typeof data.volume === 'number' &&
- data.volume >= 0 &&
- data.timestamp instanceof Date &&
- !isNaN(data.timestamp.getTime()) &&
- typeof data.bid === 'number' &&
- typeof data.ask === 'number' &&
- data.ask >= data.bid
- );
- }
- /**
- * Validate OHLCV data quality
- */
- validateOHLCV(data: OHLCVType): boolean {
- return (
- typeof data.symbol === 'string' &&
- data.symbol.length > 0 &&
- typeof data.open === 'number' && data.open > 0 &&
- typeof data.high === 'number' && data.high > 0 &&
- typeof data.low === 'number' && data.low > 0 &&
- typeof data.close === 'number' && data.close > 0 &&
- data.high >= Math.max(data.open, data.close) &&
- data.low <= Math.min(data.open, data.close) &&
- typeof data.volume === 'number' && data.volume >= 0 &&
- data.timestamp instanceof Date &&
- !isNaN(data.timestamp.getTime())
- );
- }
- /**
- * Assess data quality metrics for market data
- */
- assessDataQuality(data: MarketDataType[], source: string): DataQualityMetrics {
- if (data.length === 0) {
- return {
- completeness: 0,
- accuracy: 0,
- timeliness: 0,
- consistency: 0,
- overall: 0,
- };
- }
-
- // Completeness: percentage of valid data points
- const validCount = data.filter(item => this.validateMarketData(item)).length;
- const completeness = validCount / data.length;
-
- // Accuracy: based on price consistency and reasonable ranges
- const accuracyScore = this.assessAccuracy(data);
-
- // Timeliness: based on data freshness
- const timelinessScore = this.assessTimeliness(data);
-
- // Consistency: based on data patterns and outliers
- const consistencyScore = this.assessConsistency(data);
-
- const overall = (completeness + accuracyScore + timelinessScore + consistencyScore) / 4;
-
- return {
- completeness,
- accuracy: accuracyScore,
- timeliness: timelinessScore,
- consistency: consistencyScore,
- overall,
- };
- }
-
- private assessAccuracy(data: MarketDataType[]): number {
- let accuracySum = 0;
-
- for (const item of data) {
- let score = 1.0;
-
- // Check for reasonable price ranges
- if (item.price <= 0 || item.price > 100000) score -= 0.3;
-
- // Check bid/ask spread reasonableness
- const spread = item.ask - item.bid;
- const spreadPercentage = spread / item.price;
- if (spreadPercentage > 0.1) score -= 0.2; // More than 10% spread is suspicious
-
- // Check for negative volume
- if (item.volume < 0) score -= 0.5;
-
- accuracySum += Math.max(0, score);
- }
-
- return data.length > 0 ? accuracySum / data.length : 0;
- }
-
- private assessTimeliness(data: MarketDataType[]): number {
- const now = new Date();
- let timelinessSum = 0;
-
- for (const item of data) {
- const ageMs = now.getTime() - item.timestamp.getTime();
- const ageMinutes = ageMs / (1000 * 60);
-
- // Score based on data age (fresher is better)
- let score = 1.0;
- if (ageMinutes > 60) score = 0.1; // Very old data
- else if (ageMinutes > 15) score = 0.5; // Moderately old
- else if (ageMinutes > 5) score = 0.8; // Slightly old
-
- timelinessSum += score;
- }
-
- return data.length > 0 ? timelinessSum / data.length : 0;
- }
-
- private assessConsistency(data: MarketDataType[]): number {
- if (data.length < 2) return 1.0;
-
- // Sort by timestamp
- const sortedData = [...data].sort((a, b) => a.timestamp.getTime() - b.timestamp.getTime());
-
- let consistencySum = 0;
-
- for (let i = 1; i < sortedData.length; i++) {
- const current = sortedData[i];
- const previous = sortedData[i - 1];
-
- // Check for reasonable price movements
- const priceChange = Math.abs(current.price - previous.price) / previous.price;
-
- let score = 1.0;
- if (priceChange > 0.5) score -= 0.7; // More than 50% change is suspicious
- else if (priceChange > 0.1) score -= 0.3; // More than 10% change is notable
-
- consistencySum += Math.max(0, score);
- }
-
- return consistencySum / (sortedData.length - 1);
- }
- /**
- * Clean and sanitize market data
- */
- sanitizeMarketData(data: MarketDataType): MarketDataType {
- return {
- symbol: data.symbol.toUpperCase().trim(),
- price: Math.max(0, Number(data.price) || 0),
- bid: Math.max(0, Number(data.bid) || 0),
- ask: Math.max(0, Number(data.ask) || 0),
- volume: Math.max(0, Math.floor(Number(data.volume) || 0)),
- timestamp: new Date(data.timestamp),
- };
- }
-}
diff --git a/apps/core-services/market-data-gateway/src/services/DataSourceManager.ts b/apps/core-services/market-data-gateway/src/services/DataSourceManager.ts
deleted file mode 100644
index 438b47c..0000000
--- a/apps/core-services/market-data-gateway/src/services/DataSourceManager.ts
+++ /dev/null
@@ -1,598 +0,0 @@
-import { EventEmitter } from 'eventemitter3';
-// Local logger interface to avoid pino dependency issues
-interface Logger {
- info(msg: string | object, ...args: any[]): void;
- error(msg: string | object, ...args: any[]): void;
- warn(msg: string | object, ...args: any[]): void;
- debug(msg: string | object, ...args: any[]): void;
- child(options: any): Logger;
-}
-
-// Simple logger implementation
-const createLogger = (name: string): Logger => ({
- info: (msg: string | object, ...args: any[]) => console.log(`[${name}] INFO:`, msg, ...args),
- error: (msg: string | object, ...args: any[]) => console.error(`[${name}] ERROR:`, msg, ...args),
- warn: (msg: string | object, ...args: any[]) => console.warn(`[${name}] WARN:`, msg, ...args),
- debug: (msg: string | object, ...args: any[]) => console.debug(`[${name}] DEBUG:`, msg, ...args),
- child: (options: any) => createLogger(`${name}.${options.component || 'child'}`)
-});
-
-import WebSocket from 'ws';
-// Simple HTTP client to replace axios
-interface HttpClient {
- get(url: string): Promise<{ data: any }>;
- post(url: string, data?: any): Promise<{ data: any }>;
-}
-
- const createHttpClient = (baseURL: string, headers?: Record<string, string>): HttpClient => ({
- get: async (url: string) => {
- const response = await fetch(`${baseURL}${url}`, { headers });
- return { data: await response.json() };
- },
- post: async (url: string, data?: any) => {
- const response = await fetch(`${baseURL}${url}`, {
- method: 'POST',
- headers: { 'Content-Type': 'application/json', ...headers },
- body: data ? JSON.stringify(data) : undefined
- });
- return { data: await response.json() };
- }
-});
-import {
- DataSourceConfig,
- DataSourceMetrics,
- DataSourceError,
- MarketDataTick,
- MarketDataCandle,
- MarketDataTrade
-} from '../types/MarketDataGateway';
-
-interface DataSourceConnection {
- config: DataSourceConfig;
- connection?: WebSocket | HttpClient;
- status: 'disconnected' | 'connecting' | 'connected' | 'error';
- lastConnectedAt?: Date;
- lastErrorAt?: Date;
- retryCount: number;
- metrics: {
- messagesReceived: number;
- bytesReceived: number;
- errors: number;
- latencyMs: number[];
- };
-}
-
-export class DataSourceManager extends EventEmitter {
- private dataSources: Map<string, DataSourceConnection> = new Map();
- private logger: Logger;
- private healthCheckInterval?: NodeJS.Timeout;
- private reconnectTimeouts: Map<string, NodeJS.Timeout> = new Map();
-
- constructor(configs: DataSourceConfig[], logger: Logger) {
- super();
- this.logger = logger;
- this.initializeDataSources(configs);
- }
-
- private initializeDataSources(configs: DataSourceConfig[]) {
- for (const config of configs) {
- this.dataSources.set(config.id, {
- config,
- status: 'disconnected',
- retryCount: 0,
- metrics: {
- messagesReceived: 0,
- bytesReceived: 0,
- errors: 0,
- latencyMs: []
- }
- });
- }
- }
-
- public async start(): Promise<void> {
- this.logger.info('Starting Data Source Manager');
-
- // Connect to all enabled data sources
- const connectionPromises = Array.from(this.dataSources.values())
- .filter(ds => ds.config.enabled)
- .map(ds => this.connectDataSource(ds.config.id));
-
- await Promise.allSettled(connectionPromises);
-
- // Start health check interval
- this.startHealthCheck();
-
- this.logger.info('Data Source Manager started');
- }
-
- public async stop(): Promise<void> {
- this.logger.info('Stopping Data Source Manager');
-
- // Clear health check interval
- if (this.healthCheckInterval) {
- clearInterval(this.healthCheckInterval);
- }
-
- // Clear all reconnect timeouts
- for (const timeout of this.reconnectTimeouts.values()) {
- clearTimeout(timeout);
- }
- this.reconnectTimeouts.clear();
-
- // Disconnect all data sources
- const disconnectionPromises = Array.from(this.dataSources.keys())
- .map(sourceId => this.disconnectDataSource(sourceId));
-
- await Promise.allSettled(disconnectionPromises);
-
- this.logger.info('Data Source Manager stopped');
- }
-
- public async addDataSource(config: DataSourceConfig): Promise<void> {
- this.logger.info({ sourceId: config.id }, 'Adding data source');
-
- this.dataSources.set(config.id, {
- config,
- status: 'disconnected',
- retryCount: 0,
- metrics: {
- messagesReceived: 0,
- bytesReceived: 0,
- errors: 0,
- latencyMs: []
- }
- });
-
- if (config.enabled) {
- await this.connectDataSource(config.id);
- }
- }
- public async removeDataSource(sourceId: string): Promise<void> {
- this.logger.info(`Removing data source: ${sourceId}`);
-
- await this.disconnectDataSource(sourceId);
- this.dataSources.delete(sourceId);
-
- const timeout = this.reconnectTimeouts.get(sourceId);
- if (timeout) {
- clearTimeout(timeout);
- this.reconnectTimeouts.delete(sourceId);
- }
- }
-
- public async updateDataSource(sourceId: string, updates: Partial<DataSourceConfig>): Promise<void> {
- const dataSource = this.dataSources.get(sourceId);
- if (!dataSource) {
- throw new Error(`Data source ${sourceId} not found`);
- }
-
- this.logger.info(`Updating data source: ${sourceId}`, updates);
-
- // Update configuration
- dataSource.config = { ...dataSource.config, ...updates };
-
- // Reconnect if needed
- if (dataSource.status === 'connected') {
- await this.disconnectDataSource(sourceId);
- if (dataSource.config.enabled) {
- await this.connectDataSource(sourceId);
- }
- }
- }
-
- public getDataSources(): DataSourceConfig[] {
- return Array.from(this.dataSources.values()).map(ds => ds.config);
- }
-
- public getDataSourceMetrics(sourceId?: string): DataSourceMetrics[] {
- const sources = sourceId
- ? [this.dataSources.get(sourceId)].filter(Boolean)
- : Array.from(this.dataSources.values());
-
- return sources.map(ds => ({
- sourceId: ds!.config.id,
- timestamp: new Date(),
- connections: {
- active: ds!.status === 'connected' ? 1 : 0,
- total: 1,
- failed: ds!.metrics.errors
- },
- messages: {
- received: ds!.metrics.messagesReceived,
- processed: ds!.metrics.messagesReceived, // Assuming all received are processed
- errors: ds!.metrics.errors,
- dropped: 0
- },
- latency: {
- avgMs: this.calculateAverageLatency(ds!.metrics.latencyMs),
- p50Ms: this.calculatePercentile(ds!.metrics.latencyMs, 0.5),
- p95Ms: this.calculatePercentile(ds!.metrics.latencyMs, 0.95),
- p99Ms: this.calculatePercentile(ds!.metrics.latencyMs, 0.99)
- },
- bandwidth: {
- inboundBytesPerSecond: ds!.metrics.bytesReceived / 60, // Rough estimate
- outboundBytesPerSecond: 0
- }
- }));
- }
-
- private async connectDataSource(sourceId: string): Promise<void> {
- const dataSource = this.dataSources.get(sourceId);
- if (!dataSource) {
- throw new Error(`Data source ${sourceId} not found`);
- }
-
- if (dataSource.status === 'connected' || dataSource.status === 'connecting') {
- return;
- }
-
- this.logger.info({ sourceId }, 'Connecting to data source');
- dataSource.status = 'connecting';
-
- try {
- if (dataSource.config.type === 'websocket') {
- await this.connectWebSocket(dataSource);
- } else if (dataSource.config.type === 'rest') {
- await this.connectREST(dataSource);
- } else {
- throw new Error(`Unsupported data source type: ${dataSource.config.type}`);
- }
-
- dataSource.status = 'connected';
- dataSource.lastConnectedAt = new Date();
- dataSource.retryCount = 0;
-
- this.logger.info({ sourceId }, 'Data source connected');
- this.emit('connected', sourceId);
-
- } catch (error) {
- this.logger.error({ sourceId, error }, 'Failed to connect to data source');
- dataSource.status = 'error';
- dataSource.lastErrorAt = new Date();
- dataSource.metrics.errors++;
-
- this.emit('error', sourceId, error);
- this.scheduleReconnect(sourceId);
- }
- }
-
- private async connectWebSocket(dataSource: DataSourceConnection): Promise<void> {
- const { config } = dataSource;
- const ws = new WebSocket(config.connection.url, {
- headers: config.connection.headers
- });
-
- return new Promise<void>((resolve, reject) => {
- const connectTimeout = setTimeout(() => {
- ws.close();
- reject(new Error('WebSocket connection timeout'));
- }, 10000);
-
- ws.on('open', () => {
- clearTimeout(connectTimeout);
- this.logger.debug({ sourceId: config.id }, 'WebSocket connected');
-
- // Send subscription messages
- this.sendWebSocketSubscriptions(ws, config);
-
- dataSource.connection = ws;
- resolve();
- });
-
- ws.on('message', (data: Buffer) => {
- const receiveTime = Date.now();
- this.handleWebSocketMessage(config.id, data, receiveTime);
- });
-
- ws.on('error', (error) => {
- clearTimeout(connectTimeout);
- this.logger.error({ sourceId: config.id, error }, 'WebSocket error');
- reject(error);
- });
-
- ws.on('close', () => {
- this.logger.warn({ sourceId: config.id }, 'WebSocket disconnected');
- dataSource.status = 'disconnected';
- this.emit('disconnected', config.id);
- this.scheduleReconnect(config.id);
- });
- });
- }
-
- private async connectREST(dataSource: DataSourceConnection): Promise<void> {
- const { config } = dataSource;
-
- // Build request headers, merging authentication credentials when configured
- // (note: config.connection.queryParams would need to be appended to the URL by the fetch-based client)
- const headers: Record<string, string> = { ...config.connection.headers };
- const auth = config.connection.authentication;
- if (auth) {
- if (auth.type === 'apikey') {
- headers['X-API-Key'] = auth.credentials.apiKey;
- } else if (auth.type === 'basic') {
- const encoded = Buffer.from(`${auth.credentials.username}:${auth.credentials.password}`).toString('base64');
- headers['Authorization'] = `Basic ${encoded}`;
- } else if (auth.type === 'jwt') {
- headers['Authorization'] = `Bearer ${auth.credentials.token}`;
- }
- }
-
- const httpClient = createHttpClient(config.connection.url, headers);
-
- // Test connection
- try {
- await httpClient.get('/health');
- dataSource.connection = httpClient;
-
- // Start polling for REST data sources
- this.startRESTPolling(config.id);
-
- } catch (error) {
- throw new Error(`REST API health check failed: ${error}`);
- }
- }
-
- private sendWebSocketSubscriptions(ws: WebSocket, config: DataSourceConfig): void {
- const subscriptions = [];
-
- if (config.subscriptions.quotes) {
- subscriptions.push({
- type: 'subscribe',
- channel: 'quotes',
- symbols: config.symbols
- });
- }
-
- if (config.subscriptions.trades) {
- subscriptions.push({
- type: 'subscribe',
- channel: 'trades',
- symbols: config.symbols
- });
- }
-
- if (config.subscriptions.orderbook) {
- subscriptions.push({
- type: 'subscribe',
- channel: 'orderbook',
- symbols: config.symbols
- });
- }
-
- if (config.subscriptions.candles) {
- subscriptions.push({
- type: 'subscribe',
- channel: 'candles',
- symbols: config.symbols
- });
- }
-
- for (const subscription of subscriptions) {
- ws.send(JSON.stringify(subscription));
- }
- }
-
- private handleWebSocketMessage(sourceId: string, data: Buffer, receiveTime: number): void {
- const dataSource = this.dataSources.get(sourceId);
- if (!dataSource) return;
-
- try {
- const message = JSON.parse(data.toString());
-
- // Update metrics
- dataSource.metrics.messagesReceived++;
- dataSource.metrics.bytesReceived += data.length;
-
- // Calculate latency if timestamp is available
- if (message.timestamp) {
- const latency = receiveTime - message.timestamp;
- dataSource.metrics.latencyMs.push(latency);
-
- // Keep only last 1000 latency measurements
- if (dataSource.metrics.latencyMs.length > 1000) {
- dataSource.metrics.latencyMs = dataSource.metrics.latencyMs.slice(-1000);
- }
- }
-
- // Emit normalized data
- const normalizedData = this.normalizeMessage(message, sourceId);
- if (normalizedData) {
- this.emit('data', sourceId, normalizedData);
- }
-
- } catch (error) {
- this.logger.error({ sourceId, error }, 'Error parsing WebSocket message');
- dataSource.metrics.errors++;
- }
- }
-
- private startRESTPolling(sourceId: string): void {
- const dataSource = this.dataSources.get(sourceId);
- if (!dataSource || !dataSource.connection) return;
-
- const pollInterval = 1000; // 1 second polling
-
- const poll = async () => {
- try {
- const httpClient = dataSource.connection as HttpClient;
- const response = await httpClient.get('/market-data');
-
- dataSource.metrics.messagesReceived++;
- dataSource.metrics.bytesReceived += JSON.stringify(response.data).length;
-
- const normalizedData = this.normalizeMessage(response.data, sourceId);
- if (normalizedData) {
- this.emit('data', sourceId, normalizedData);
- }
-
- } catch (error) {
- this.logger.error({ sourceId, error }, 'REST polling error');
- dataSource.metrics.errors++;
- }
-
- // Schedule next poll
- if (dataSource.status === 'connected') {
- setTimeout(poll, pollInterval);
- }
- };
-
- poll();
- }
-
- private normalizeMessage(message: any, sourceId: string): MarketDataTick | MarketDataCandle | MarketDataTrade | null {
- // This is a simplified normalization - in reality, you'd have specific
- // normalizers for each data source format
- try {
- if (message.type === 'quote' || message.price !== undefined) {
- return {
- symbol: message.symbol || message.s,
- timestamp: message.timestamp || message.t || Date.now(),
- price: message.price || message.p,
- volume: message.volume || message.v || 0,
- bid: message.bid || message.b,
- ask: message.ask || message.a,
- source: sourceId
- } as MarketDataTick;
- }
-
- if (message.type === 'trade') {
- return {
- id: message.id || `${sourceId}-${Date.now()}`,
- symbol: message.symbol || message.s,
- timestamp: message.timestamp || message.t || Date.now(),
- price: message.price || message.p,
- size: message.size || message.q,
- side: message.side || 'buy',
- source: sourceId
- } as MarketDataTrade;
- }
-
- if (message.type === 'candle' || message.ohlc) {
- return {
- symbol: message.symbol || message.s,
- timestamp: message.timestamp || message.t || Date.now(),
- open: message.open || message.o,
- high: message.high || message.h,
- low: message.low || message.l,
- close: message.close || message.c,
- volume: message.volume || message.v,
- timeframe: message.timeframe || '1m',
- source: sourceId
- } as MarketDataCandle;
- }
-
- return null;
- } catch (error) {
- this.logger.error({ error, message, sourceId }, 'Error normalizing message');
- return null;
- }
- }
-
- private async disconnectDataSource(sourceId: string): Promise<void> {
- const dataSource = this.dataSources.get(sourceId);
- if (!dataSource || dataSource.status === 'disconnected') {
- return;
- }
-
- this.logger.info({ sourceId }, 'Disconnecting data source');
-
- if (dataSource.connection) {
- if (dataSource.connection instanceof WebSocket) {
- dataSource.connection.close();
- }
- // For REST connections, we just stop polling (handled in status check)
- }
-
- dataSource.status = 'disconnected';
- dataSource.connection = undefined;
- }
-
- private scheduleReconnect(sourceId: string): void {
- const dataSource = this.dataSources.get(sourceId);
- if (!dataSource || !dataSource.config.enabled) {
- return;
- }
-
- const { retryPolicy } = dataSource.config;
- const backoffMs = Math.min(
- retryPolicy.backoffMultiplier * Math.pow(2, dataSource.retryCount),
- retryPolicy.maxBackoffMs
- );
-
- if (dataSource.retryCount < retryPolicy.maxRetries) {
- this.logger.info({
- sourceId,
- retryCount: dataSource.retryCount,
- backoffMs
- }, 'Scheduling reconnect');
-
- const timeout = setTimeout(() => {
- dataSource.retryCount++;
- this.connectDataSource(sourceId);
- this.reconnectTimeouts.delete(sourceId);
- }, backoffMs);
-
- this.reconnectTimeouts.set(sourceId, timeout);
- } else {
- this.logger.error({ sourceId }, 'Max retries exceeded, giving up');
- dataSource.status = 'error';
- }
- }
-
- private startHealthCheck(): void {
- this.healthCheckInterval = setInterval(() => {
- for (const [sourceId, dataSource] of this.dataSources.entries()) {
- if (dataSource.config.enabled && dataSource.status === 'disconnected') {
- this.logger.debug({ sourceId }, 'Health check: attempting reconnect');
- this.connectDataSource(sourceId);
- }
- }
- }, 30000); // Check every 30 seconds
- }
-
- // Merge authentication credentials into a plain headers object for the fetch-based client
- private configureAuthentication(headers: Record<string, string>, auth: any): void {
- switch (auth.type) {
- case 'apikey':
- headers['X-API-Key'] = auth.credentials.apiKey;
- break;
- case 'basic': {
- const encoded = Buffer.from(`${auth.credentials.username}:${auth.credentials.password}`).toString('base64');
- headers['Authorization'] = `Basic ${encoded}`;
- break;
- }
- case 'jwt':
- headers['Authorization'] = `Bearer ${auth.credentials.token}`;
- break;
- }
- }
-
- private calculateAverageLatency(latencies: number[]): number {
- if (latencies.length === 0) return 0;
- return latencies.reduce((sum, lat) => sum + lat, 0) / latencies.length;
- }
-
- private calculatePercentile(values: number[], percentile: number): number {
- if (values.length === 0) return 0;
- const sorted = [...values].sort((a, b) => a - b);
- const index = Math.ceil(sorted.length * percentile) - 1;
- return sorted[Math.max(0, index)];
- }
-
- public async updateConfig(configs: DataSourceConfig[]): Promise<void> {
- this.logger.info('Updating data source configurations');
-
- // Remove sources that are no longer in config
- const configIds = new Set(configs.map(c => c.id));
- for (const sourceId of this.dataSources.keys()) {
- if (!configIds.has(sourceId)) {
- await this.removeDataSource(sourceId);
- }
- }
-
- // Add or update sources
- for (const config of configs) {
- if (this.dataSources.has(config.id)) {
- await this.updateDataSource(config.id, config);
- } else {
- await this.addDataSource(config);
- }
- }
- }
-}
diff --git a/apps/core-services/market-data-gateway/src/services/EventPublisher.ts b/apps/core-services/market-data-gateway/src/services/EventPublisher.ts
deleted file mode 100644
index 4c679a5..0000000
--- a/apps/core-services/market-data-gateway/src/services/EventPublisher.ts
+++ /dev/null
@@ -1,140 +0,0 @@
-import Redis from 'ioredis';
-import { databaseConfig } from '@stock-bot/config';
-import type { MarketDataEvent, SignalEvent, TradingEvent } from '@stock-bot/types';
-
-export class EventPublisher {
- private dragonfly: Redis;
- private readonly STREAM_NAME = 'trading-events';
-
- constructor() {
- this.dragonfly = new Redis({
- host: databaseConfig.dragonfly.host,
- port: databaseConfig.dragonfly.port,
- password: databaseConfig.dragonfly.password,
- maxRetriesPerRequest: databaseConfig.dragonfly.maxRetriesPerRequest,
- });
-
- this.dragonfly.on('connect', () => {
- console.log('๐ Connected to Dragonfly for event publishing');
- });
-
- this.dragonfly.on('error', (error) => {
- console.error('โ Dragonfly connection error:', error);
- });
- }
-
- /**
- * Publish a market data event to the event stream
- */
- async publishMarketData(event: MarketDataEvent): Promise<void> {
- try {
- await this.dragonfly.xadd(
- this.STREAM_NAME,
- '*',
- 'type', event.type,
- 'data', JSON.stringify(event.data),
- 'timestamp', event.timestamp.toISOString()
- );
- } catch (error) {
- console.error('Error publishing market data event:', error);
- throw error;
- }
- }
- /**
- * Publish a trading signal event
- */
- async publishSignal(event: SignalEvent): Promise<void> {
- try {
- await this.dragonfly.xadd(
- this.STREAM_NAME,
- '*',
- 'type', event.type,
- 'signal', JSON.stringify(event.signal),
- 'timestamp', event.timestamp.toISOString()
- );
- } catch (error) {
- console.error('Error publishing signal event:', error);
- throw error;
- }
- }
-
- /**
- * Publish any trading event
- */
- async publishEvent(event: TradingEvent): Promise<void> {
- try {
- const fields: string[] = ['type', event.type, 'timestamp', event.timestamp.toISOString()];
-
- if ('data' in event) {
- fields.push('data', JSON.stringify(event.data));
- }
- if ('order' in event) {
- fields.push('order', JSON.stringify(event.order));
- }
- if ('signal' in event) {
- fields.push('signal', JSON.stringify(event.signal));
- }
-
- await this.dragonfly.xadd(this.STREAM_NAME, '*', ...fields);
- } catch (error) {
- console.error('Error publishing event:', error);
- throw error;
- }
- }
- /**
- * Cache market data in Dragonfly for quick access
- */
- async cacheMarketData(symbol: string, data: any, ttl: number = 60): Promise<void> {
- try {
- const key = `market-data:${symbol}`;
- await this.dragonfly.setex(key, ttl, JSON.stringify(data));
- } catch (error) {
- console.error('Error caching market data:', error);
- }
- }
- /**
- * Get cached market data from Dragonfly
- */
- async getCachedMarketData(symbol: string): Promise<any> {
- try {
- const key = `market-data:${symbol}`;
- const cached = await this.dragonfly.get(key);
- return cached ? JSON.parse(cached) : null;
- } catch (error) {
- console.error('Error getting cached market data:', error);
- return null;
- }
- }
- /**
- * Publish to a specific channel for real-time subscriptions
- */
- async publishToChannel(channel: string, data: any): Promise<void> {
- try {
- await this.dragonfly.publish(channel, JSON.stringify(data));
- } catch (error) {
- console.error(`Error publishing to channel ${channel}:`, error);
- throw error;
- }
- }
- /**
- * Set up health monitoring
- */
- async setServiceHealth(serviceName: string, status: 'healthy' | 'unhealthy'): Promise<void> {
- try {
- const key = `health:${serviceName}`;
- const healthData = {
- status,
- timestamp: new Date().toISOString(),
- lastSeen: Date.now(),
- };
- await this.dragonfly.setex(key, 300, JSON.stringify(healthData)); // 5 minutes TTL
- } catch (error) {
- console.error('Error setting service health:', error);
- }
- }
- /**
- * Close Dragonfly connection
- */
- async disconnect(): Promise<void> {
- await this.dragonfly.quit();
- }
-}
diff --git a/apps/core-services/market-data-gateway/src/services/MarketDataGatewayService.ts b/apps/core-services/market-data-gateway/src/services/MarketDataGatewayService.ts
deleted file mode 100644
index e69de29..0000000
diff --git a/apps/core-services/market-data-gateway/src/services/MarketDataGatewayService.ts.backup b/apps/core-services/market-data-gateway/src/services/MarketDataGatewayService.ts.backup
deleted file mode 100644
index 44a548a..0000000
--- a/apps/core-services/market-data-gateway/src/services/MarketDataGatewayService.ts.backup
+++ /dev/null
@@ -1,404 +0,0 @@
-import { EventEmitter } from 'eventemitter3';
-// Local logger interface to avoid pino dependency issues
-interface Logger {
- info(msg: string | object, ...args: any[]): void;
- error(msg: string | object, ...args: any[]): void;
- warn(msg: string | object, ...args: any[]): void;
- debug(msg: string | object, ...args: any[]): void;
- child(options: any): Logger;
-}
-
-// Simple logger implementation
-const createLogger = (name: string): Logger => ({
- info: (msg: string | object, ...args: any[]) => {
- if (typeof msg === 'object') {
- console.log(`[${name}] INFO:`, JSON.stringify(msg), ...args);
- } else {
- console.log(`[${name}] INFO:`, msg, ...args);
- }
- },
- error: (msg: string | object, ...args: any[]) => {
- if (typeof msg === 'object') {
- console.error(`[${name}] ERROR:`, JSON.stringify(msg), ...args);
- } else {
- console.error(`[${name}] ERROR:`, msg, ...args);
- }
- },
- warn: (msg: string | object, ...args: any[]) => {
- if (typeof msg === 'object') {
- console.warn(`[${name}] WARN:`, JSON.stringify(msg), ...args);
- } else {
- console.warn(`[${name}] WARN:`, msg, ...args);
- }
- },
- debug: (msg: string | object, ...args: any[]) => {
- if (typeof msg === 'object') {
- console.debug(`[${name}] DEBUG:`, JSON.stringify(msg), ...args);
- } else {
- console.debug(`[${name}] DEBUG:`, msg, ...args);
- }
- },
- child: (options: any) => createLogger(`${name}.${options.component || 'child'}`)
-});
-import {
- GatewayConfig,
- DataSourceConfig,
- ProcessingPipeline,
- ClientSubscription,
- SubscriptionRequest,
- DataSourceMetrics,
- GatewayMetrics,
- MarketDataTick,
- MarketDataCandle,
- MarketDataTrade,
- MarketDataOrder,
- HealthStatus
-} from '../types/MarketDataGateway';
-import { DataSourceManager } from './DataSourceManager';
-import { ProcessingEngine } from './ProcessingEngine';
-import { SubscriptionManager } from './SubscriptionManager';
-import { CacheManager } from './CacheManager';
-import { MetricsCollector } from './MetricsCollector';
-import { ServiceIntegrationManager } from './ServiceIntegrationManager';
-
-export class MarketDataGatewayService extends EventEmitter {
- private config: GatewayConfig;
- private logger: Logger;
- private dataSourceManager!: DataSourceManager;
- private processingEngine!: ProcessingEngine;
- private subscriptionManager!: SubscriptionManager;
- private cacheManager!: CacheManager;
- private metricsCollector!: MetricsCollector;
- private serviceIntegration!: ServiceIntegrationManager;
- private _isRunning = false;
- private startTime: Date = new Date();
-
- constructor(config: GatewayConfig, logger: Logger) {
- super();
- this.config = config;
- this.logger = logger;
-
- this.initializeComponents();
- this.setupEventHandlers();
- }
-
- private initializeComponents() {
- this.logger.info('Initializing Market Data Gateway components');
-
- // Initialize core components
- this.dataSourceManager = new DataSourceManager(
- this.config.dataSources,
- this.logger.child({ component: 'DataSourceManager' })
- );
-
- this.processingEngine = new ProcessingEngine(
- this.config.processing,
- this.logger.child({ component: 'ProcessingEngine' })
- );
-
- this.subscriptionManager = new SubscriptionManager(
- this.logger.child({ component: 'SubscriptionManager' })
- );
-
- this.cacheManager = new CacheManager(
- this.config.cache,
- this.logger.child({ component: 'CacheManager' })
- );
-
- this.metricsCollector = new MetricsCollector(
- this.logger.child({ component: 'MetricsCollector' })
- );
-
- this.serviceIntegration = new ServiceIntegrationManager(
- this.logger.child({ component: 'ServiceIntegration' })
- );
- }
-
- private setupEventHandlers() {
- // Data source events
- this.dataSourceManager.on('data', this.handleIncomingData.bind(this));
- this.dataSourceManager.on('error', this.handleDataSourceError.bind(this));
- this.dataSourceManager.on('connected', this.handleDataSourceConnected.bind(this));
- this.dataSourceManager.on('disconnected', this.handleDataSourceDisconnected.bind(this));
-
- // Processing engine events
- this.processingEngine.on('processed', this.handleProcessedData.bind(this));
- this.processingEngine.on('error', this.handleProcessingError.bind(this));
-
- // Subscription events
- this.subscriptionManager.on('subscribed', this.handleClientSubscribed.bind(this));
- this.subscriptionManager.on('unsubscribed', this.handleClientUnsubscribed.bind(this));
- this.subscriptionManager.on('error', this.handleSubscriptionError.bind(this));
-
- // Cache events
- this.cacheManager.on('cached', this.handleDataCached.bind(this));
- this.cacheManager.on('error', this.handleCacheError.bind(this));
-
- // Service integration events
- this.serviceIntegration.on('data-forwarded', this.handleDataForwarded.bind(this));
- this.serviceIntegration.on('integration-error', this.handleIntegrationError.bind(this));
- }
-
- public async start(): Promise<void> {
- if (this._isRunning) {
- this.logger.warn('Gateway is already running');
- return;
- }
-
- try {
- this.logger.info('Starting Market Data Gateway');
- this.startTime = new Date();
-
- // Start components in order
- await this.cacheManager.start();
- await this.metricsCollector.start();
- await this.serviceIntegration.start();
- await this.processingEngine.start();
- await this.subscriptionManager.start();
- await this.dataSourceManager.start();
-
- this._isRunning = true;
- this.logger.info('Market Data Gateway started successfully');
- this.emit('started');
-
- } catch (error) {
- this.logger.error({ error }, 'Failed to start Market Data Gateway');
- await this.stop();
- throw error;
- }
- }
-
- public async stop(): Promise<void> {
- if (!this._isRunning) {
- return;
- }
-
- try {
- this.logger.info('Stopping Market Data Gateway');
-
- // Stop components in reverse order
- await this.dataSourceManager.stop();
- await this.subscriptionManager.stop();
- await this.processingEngine.stop();
- await this.serviceIntegration.stop();
- await this.metricsCollector.stop();
- await this.cacheManager.stop();
-
- this._isRunning = false;
- this.logger.info('Market Data Gateway stopped');
- this.emit('stopped');
-
- } catch (error) {
- this.logger.error({ error }, 'Error stopping Market Data Gateway');
- throw error;
- }
- }
-
- // Data handling methods
- private async handleIncomingData(sourceId: string, data: any): Promise<void> {
- try {
- this.metricsCollector.recordMessage(sourceId, 'received');
-
- // Process data through pipeline
- const processedData = await this.processingEngine.process(data);
-
- // Cache processed data
- await this.cacheManager.cache(processedData);
-
- // Forward to subscribers
- await this.subscriptionManager.broadcast(processedData);
-
- // Forward to integrated services
- await this.serviceIntegration.forwardData(processedData);
-
- this.emit('data-processed', { sourceId, data: processedData });
-
- } catch (error) {
- this.logger.error({ error, sourceId, data }, 'Error handling incoming data');
- this.metricsCollector.recordError(sourceId);
- }
- }
-
- private async handleProcessedData(data: any): Promise<void> {
- this.logger.debug({ data }, 'Data processed successfully');
- this.metricsCollector.recordMessage('processing', 'processed');
- }
-
- private handleDataSourceError(sourceId: string, error: Error): void {
- this.logger.error({ sourceId, error }, 'Data source error');
- this.metricsCollector.recordError(sourceId);
- this.emit('source-error', { sourceId, error });
- }
-
- private handleDataSourceConnected(sourceId: string): void {
- this.logger.info({ sourceId }, 'Data source connected');
- this.metricsCollector.recordConnection(sourceId, 'connected');
- }
-
- private handleDataSourceDisconnected(sourceId: string): void {
- this.logger.warn({ sourceId }, 'Data source disconnected');
- this.metricsCollector.recordConnection(sourceId, 'disconnected');
- }
-
- private handleProcessingError(error: Error, data: any): void {
- this.logger.error({ error, data }, 'Processing error');
- this.emit('processing-error', { error, data });
- }
-
- private handleClientSubscribed(subscription: ClientSubscription): void {
- this.logger.info({
- clientId: subscription.request.clientId,
- symbols: subscription.request.symbols
- }, 'Client subscribed');
- }
-
- private handleClientUnsubscribed(clientId: string): void {
- this.logger.info({ clientId }, 'Client unsubscribed');
- }
-
- private handleSubscriptionError(error: Error, clientId: string): void {
- this.logger.error({ error, clientId }, 'Subscription error');
- }
-
- private handleDataCached(key: string, data: any): void {
- this.logger.debug({ key }, 'Data cached');
- }
-
- private handleCacheError(error: Error, operation: string): void {
- this.logger.error({ error, operation }, 'Cache error');
- }
-
- private handleDataForwarded(service: string, data: any): void {
- this.logger.debug({ service }, 'Data forwarded to service');
- }
-
- private handleIntegrationError(service: string, error: Error): void {
- this.logger.error({ service, error }, 'Service integration error');
- }
-
- // Public API methods
- public async subscribe(request: SubscriptionRequest): Promise<string> {
- return this.subscriptionManager.subscribe(request);
- }
-
- public async unsubscribe(subscriptionId: string): Promise<void> {
- return this.subscriptionManager.unsubscribe(subscriptionId);
- }
-
- public async getSubscriptions(clientId?: string): Promise<ClientSubscription[]> {
- return this.subscriptionManager.getSubscriptions(clientId);
- }
-
- public async addDataSource(config: DataSourceConfig): Promise<void> {
- return this.dataSourceManager.addDataSource(config);
- }
-
- public async removeDataSource(sourceId: string): Promise<void> {
- return this.dataSourceManager.removeDataSource(sourceId);
- }
-
- public async updateDataSource(sourceId: string, config: Partial<DataSourceConfig>): Promise<void> {
- return this.dataSourceManager.updateDataSource(sourceId, config);
- }
-
- public async getDataSources(): Promise<DataSourceConfig[]> {
- return this.dataSourceManager.getDataSources();
- }
-
- public async addProcessingPipeline(pipeline: ProcessingPipeline): Promise<void> {
- return this.processingEngine.addPipeline(pipeline);
- }
-
- public async removeProcessingPipeline(pipelineId: string): Promise<void> {
- return this.processingEngine.removePipeline(pipelineId);
- }
-
- public async getProcessingPipelines(): Promise<ProcessingPipeline[]> {
- return this.processingEngine.getPipelines();
- }
-
- public async getMetrics(): Promise<GatewayMetrics> {
- return this.metricsCollector.getMetrics();
- }
-
- public async getDataSourceMetrics(sourceId?: string): Promise<DataSourceMetrics[]> {
- return this.metricsCollector.getDataSourceMetrics(sourceId);
- }
-
- public async getHealthStatus(): Promise<HealthStatus> {
- const metrics = await this.getMetrics();
- const dataSources = await this.getDataSources();
-
- // Check component health
- const dependencies = [
- {
- name: 'cache',
- status: await this.cacheManager.isHealthy() ? 'healthy' : 'unhealthy' as const
- },
- {
- name: 'processing-engine',
- status: this.processingEngine.isHealthy() ? 'healthy' : 'unhealthy' as const
- },
- {
- name: 'data-sources',
- status: dataSources.every(ds => ds.enabled) ? 'healthy' : 'unhealthy' as const
- }
- ];
-
- const hasUnhealthyDependencies = dependencies.some(dep => dep.status === 'unhealthy');
-
- return {
- service: 'market-data-gateway',
- status: hasUnhealthyDependencies ? 'degraded' : 'healthy',
- timestamp: new Date(),
- uptime: Date.now() - this.startTime.getTime(),
- version: process.env.SERVICE_VERSION || '1.0.0',
- dependencies,
- metrics: {
- connectionsActive: metrics.subscriptions.active,
- messagesPerSecond: metrics.processing.messagesPerSecond,
- errorRate: metrics.processing.errorRate,
- avgLatencyMs: metrics.dataSources.reduce((sum, ds) => sum + ds.latency.avgMs, 0) / metrics.dataSources.length || 0
- }
- };
- }
-
- // Cache operations
- public async getCachedData(key: string): Promise<any> {
- return this.cacheManager.get(key);
- }
-
- public async setCachedData(key: string, data: any, ttl?: number): Promise<void> {
- return this.cacheManager.set(key, data, ttl);
- }
-
- // Configuration management
- public getConfig(): GatewayConfig {
- return { ...this.config };
- }
-
- public async updateConfig(updates: Partial<GatewayConfig>): Promise<void> {
- this.config = { ...this.config, ...updates };
- this.logger.info('Gateway configuration updated');
-
- // Notify components of config changes
- if (updates.dataSources) {
- await this.dataSourceManager.updateConfig(updates.dataSources);
- }
-
- if (updates.processing) {
- await this.processingEngine.updateConfig(updates.processing);
- }
-
- this.emit('config-updated', this.config);
- }
-
- // Utility methods
- public isRunning(): boolean {
- return this._isRunning;
- }
-
- public getUptime(): number {
- return Date.now() - this.startTime.getTime();
- }
-}
diff --git a/apps/core-services/market-data-gateway/src/services/MarketDataService.ts b/apps/core-services/market-data-gateway/src/services/MarketDataService.ts
deleted file mode 100644
index aacee20..0000000
--- a/apps/core-services/market-data-gateway/src/services/MarketDataService.ts
+++ /dev/null
@@ -1,278 +0,0 @@
-import type { MarketData, OHLCV, MarketDataEvent } from '@stock-bot/types';
-import { dataProviderConfigs } from '@stock-bot/config';
-import { EventPublisher } from './EventPublisher';
-import { DataNormalizer } from './DataNormalizer';
-
-export class MarketDataService {
- private wsClients: Set<any> = new Set();
- private subscriptions: Map<string, Set<any>> = new Map();
- private dataUpdateInterval: Timer | null = null;
- private readonly UPDATE_INTERVAL = 5000; // 5 seconds
-
- constructor(
- private eventPublisher: EventPublisher,
- private dataNormalizer: DataNormalizer
- ) {}
-
- /**
- * Initialize the market data service
- */
- async initialize(): Promise<void> {
- console.log('Initializing Market Data Service...');
-
- // Set up periodic data updates for demo purposes
- this.startDataUpdates();
-
- // Set service health
- await this.eventPublisher.setServiceHealth('market-data-gateway', 'healthy');
-
- console.log('Market Data Service initialized');
- }
-
- /**
- * Get latest market data for a symbol
- */
- async getLatestData(symbol: string): Promise<MarketData> {
- // First check cache
- const cached = await this.eventPublisher.getCachedMarketData(symbol);
- if (cached) {
- return cached;
- }
-
- // Fetch fresh data (using demo data for now)
- const marketData = this.generateDemoData(symbol);
-
- // Cache the data
- await this.eventPublisher.cacheMarketData(symbol, marketData, 60);
-
- // Publish market data event
- const event: MarketDataEvent = {
- type: 'MARKET_DATA',
- data: marketData,
- timestamp: new Date(),
- };
- await this.eventPublisher.publishMarketData(event);
-
- return marketData;
- }
-
- /**
- * Get OHLCV data for a symbol
- */
- async getOHLCV(symbol: string, interval: string, limit: number): Promise<OHLCV[]> {
- // Generate demo OHLCV data
- const ohlcvData = this.generateDemoOHLCV(symbol, limit);
-
- // Cache the data
- await this.eventPublisher.cacheMarketData(`ohlcv:${symbol}:${interval}`, ohlcvData, 300);
-
- return ohlcvData;
- }
-
- /**
- * Add WebSocket client for real-time updates
- */
- addWebSocketClient(ws: any): void {
- this.wsClients.add(ws);
- }
-
- /**
- * Remove WebSocket client
- */
- removeWebSocketClient(ws: any): void {
- this.wsClients.delete(ws);
-
- // Remove from all subscriptions
- for (const [symbol, clients] of this.subscriptions) {
- clients.delete(ws);
- if (clients.size === 0) {
- this.subscriptions.delete(symbol);
- }
- }
- }
-
- /**
- * Handle WebSocket messages
- */
- handleWebSocketMessage(ws: any, data: any): void {
- try {
- const message = typeof data === 'string' ? JSON.parse(data) : data;
-
- switch (message.type) {
- case 'subscribe':
- this.subscribeToSymbol(ws, message.symbol);
- break;
- case 'unsubscribe':
- this.unsubscribeFromSymbol(ws, message.symbol);
- break;
- default:
- console.log('Unknown WebSocket message type:', message.type);
- }
- } catch (error) {
- console.error('Error handling WebSocket message:', error);
- }
- }
-
- /**
- * Subscribe WebSocket client to symbol updates
- */
- private subscribeToSymbol(ws: any, symbol: string): void {
- if (!this.subscriptions.has(symbol)) {
- this.subscriptions.set(symbol, new Set());
- }
- this.subscriptions.get(symbol)!.add(ws);
-
- ws.send(JSON.stringify({
- type: 'subscribed',
- symbol,
- timestamp: new Date().toISOString(),
- }));
- }
-
- /**
- * Unsubscribe WebSocket client from symbol updates
- */
- private unsubscribeFromSymbol(ws: any, symbol: string): void {
- const clients = this.subscriptions.get(symbol);
- if (clients) {
- clients.delete(ws);
- if (clients.size === 0) {
- this.subscriptions.delete(symbol);
- }
- }
-
- ws.send(JSON.stringify({
- type: 'unsubscribed',
- symbol,
- timestamp: new Date().toISOString(),
- }));
- }
-
- /**
- * Start periodic data updates for demo
- */
- private startDataUpdates(): void {
- this.dataUpdateInterval = setInterval(async () => {
- const symbols = ['AAPL', 'GOOGL', 'MSFT', 'TSLA', 'AMZN'];
-
- for (const symbol of symbols) {
- if (this.subscriptions.has(symbol)) {
- const marketData = this.generateDemoData(symbol);
-
- // Send to subscribed WebSocket clients
- const clients = this.subscriptions.get(symbol)!;
- const message = JSON.stringify({
- type: 'market_data',
- data: marketData,
- timestamp: new Date().toISOString(),
- });
-
- for (const client of clients) {
- try {
- client.send(message);
- } catch (error) {
- console.error('Error sending WebSocket message:', error);
- clients.delete(client);
- }
- }
-
- // Publish event
- const event: MarketDataEvent = {
- type: 'MARKET_DATA',
- data: marketData,
- timestamp: new Date(),
- };
- await this.eventPublisher.publishMarketData(event);
- }
- }
- }, this.UPDATE_INTERVAL);
- }
-
- /**
- * Generate demo market data
- */
- private generateDemoData(symbol: string): MarketData {
- const basePrice = this.getBasePrice(symbol);
- const variation = (Math.random() - 0.5) * 0.02; // ±1% variation
- const price = basePrice * (1 + variation);
-
- return {
- symbol,
- price: Math.round(price * 100) / 100,
- bid: Math.round((price - 0.01) * 100) / 100,
- ask: Math.round((price + 0.01) * 100) / 100,
- volume: Math.floor(Math.random() * 1000000) + 100000,
- timestamp: new Date(),
- };
- }
-
- /**
- * Generate demo OHLCV data
- */
- private generateDemoOHLCV(symbol: string, limit: number): OHLCV[] {
- const basePrice = this.getBasePrice(symbol);
- const data: OHLCV[] = [];
- let currentPrice = basePrice;
-
- for (let i = limit - 1; i >= 0; i--) {
- const variation = (Math.random() - 0.5) * 0.05; // ±2.5% variation
- const open = currentPrice;
- const close = open * (1 + variation);
- const high = Math.max(open, close) * (1 + Math.random() * 0.02);
- const low = Math.min(open, close) * (1 - Math.random() * 0.02);
-
- data.push({
- symbol,
- timestamp: new Date(Date.now() - i * 60000), // 1 minute intervals
- open: Math.round(open * 100) / 100,
- high: Math.round(high * 100) / 100,
- low: Math.round(low * 100) / 100,
- close: Math.round(close * 100) / 100,
- volume: Math.floor(Math.random() * 50000) + 10000,
- });
-
- currentPrice = close;
- }
-
- return data;
- }
-
- /**
- * Get base price for demo data
- */
- private getBasePrice(symbol: string): number {
- const prices: Record<string, number> = {
- 'AAPL': 175.50,
- 'GOOGL': 142.30,
- 'MSFT': 378.85,
- 'TSLA': 208.75,
- 'AMZN': 151.20,
- 'NVDA': 465.80,
- 'META': 298.45,
- 'NFLX': 425.60,
- };
-
- return prices[symbol] || 100.00;
- }
-
- /**
- * Shutdown the service
- */
- async shutdown(): Promise<void> {
- console.log('Shutting down Market Data Service...');
-
- if (this.dataUpdateInterval) {
- clearInterval(this.dataUpdateInterval);
- }
-
- // Close all WebSocket connections
- for (const client of this.wsClients) {
- client.close();
- }
-
- await this.eventPublisher.setServiceHealth('market-data-gateway', 'unhealthy');
- await this.eventPublisher.disconnect();
-
- console.log('Market Data Service shutdown complete');
- }
-}
diff --git a/apps/core-services/market-data-gateway/src/services/MetricsCollector.ts b/apps/core-services/market-data-gateway/src/services/MetricsCollector.ts
deleted file mode 100644
index 70dbbd4..0000000
--- a/apps/core-services/market-data-gateway/src/services/MetricsCollector.ts
+++ /dev/null
@@ -1,511 +0,0 @@
-import { EventEmitter } from 'events';
-import {
- GatewayMetrics,
- Logger,
- HealthStatus,
- ProcessingMetrics,
- DataSourceMetrics,
- SubscriptionMetrics
-} from '../types/MarketDataGateway';
-
-interface MetricPoint {
- value: number;
- timestamp: number;
- labels?: Record<string, string>;
-}
-
-interface TimeSeriesMetric {
- name: string;
- points: MetricPoint[];
- maxPoints: number;
-}
-
-interface AlertRule {
- id: string;
- metric: string;
- condition: 'gt' | 'lt' | 'eq' | 'gte' | 'lte';
- threshold: number;
- duration: number; // ms
- enabled: boolean;
- lastTriggered?: number;
-}
-
-interface Alert {
- id: string;
- rule: AlertRule;
- value: number;
- timestamp: number;
- message: string;
- severity: 'info' | 'warning' | 'error' | 'critical';
-}
-
-export class MetricsCollector extends EventEmitter {
- private logger: Logger;
- private metrics: Map<string, TimeSeriesMetric>;
- private aggregatedMetrics: GatewayMetrics;
- private alerts: Map<string, Alert>;
- private alertRules: Map<string, AlertRule>;
- private collectInterval: NodeJS.Timeout | null = null;
- private isRunning: boolean = false;
-
- constructor(logger: Logger) {
- super();
- this.logger = logger;
- this.metrics = new Map();
- this.alerts = new Map();
- this.alertRules = new Map();
-
- this.aggregatedMetrics = {
- totalMessages: 0,
- messagesPerSecond: 0,
- averageLatency: 0,
- errorRate: 0,
- activeConnections: 0,
- activeSubscriptions: 0,
- cacheHitRate: 0,
- uptime: 0,
- timestamp: new Date().toISOString(),
- dataSources: new Map(),
- processing: {
- totalProcessed: 0,
- processedPerSecond: 0,
- processingLatency: 0,
- errorCount: 0,
- queueDepth: 0,
- processorMetrics: new Map(),
- },
- subscriptions: {
- totalSubscriptions: 0,
- activeClients: 0,
- messagesSent: 0,
- sendRate: 0,
- subscriptionsBySymbol: new Map(),
- clientMetrics: new Map(),
- },
- };
-
- this.setupDefaultAlertRules();
- this.startCollection();
- }
-
- private setupDefaultAlertRules(): void {
- const defaultRules: AlertRule[] = [
- {
- id: 'high-error-rate',
- metric: 'errorRate',
- condition: 'gt',
- threshold: 0.05, // 5%
- duration: 60000, // 1 minute
- enabled: true,
- },
- {
- id: 'high-latency',
- metric: 'averageLatency',
- condition: 'gt',
- threshold: 1000, // 1 second
- duration: 30000, // 30 seconds
- enabled: true,
- },
- {
- id: 'low-cache-hit-rate',
- metric: 'cacheHitRate',
- condition: 'lt',
- threshold: 0.8, // 80%
- duration: 300000, // 5 minutes
- enabled: true,
- },
- {
- id: 'high-queue-depth',
- metric: 'processing.queueDepth',
- condition: 'gt',
- threshold: 1000,
- duration: 60000, // 1 minute
- enabled: true,
- },
- ];
-
- defaultRules.forEach(rule => {
- this.alertRules.set(rule.id, rule);
- });
- }
-
- startCollection(): void {
- if (this.isRunning) return;
-
- this.isRunning = true;
- this.collectInterval = setInterval(() => {
- this.collectMetrics();
- this.checkAlerts();
- this.cleanupOldMetrics();
- }, 1000); // Collect every second
-
- this.logger.info('Metrics collection started');
- }
-
- stopCollection(): void {
- if (!this.isRunning) return;
-
- this.isRunning = false;
- if (this.collectInterval) {
- clearInterval(this.collectInterval);
- this.collectInterval = null;
- }
-
- this.logger.info('Metrics collection stopped');
- }
-
- // Metric recording methods
- recordMessage(source: string, latency?: number, error?: boolean): void {
- this.recordMetric('totalMessages', 1);
- this.recordMetric('messagesPerSecond', 1);
-
- if (latency !== undefined) {
- this.recordMetric('latency', latency, { source });
- }
-
- if (error) {
- this.recordMetric('errors', 1, { source });
- }
- }
-
- recordProcessing(processed: number, latency: number, errors: number): void {
- this.recordMetric('processing.totalProcessed', processed);
- this.recordMetric('processing.processedPerSecond', processed);
- this.recordMetric('processing.processingLatency', latency);
- this.recordMetric('processing.errorCount', errors);
- }
-
- recordSubscription(action: 'subscribe' | 'unsubscribe', symbol: string, clientId: string): void {
- this.recordMetric('subscriptions.totalSubscriptions', action === 'subscribe' ? 1 : -1);
- this.recordMetric(`subscriptions.symbol.${symbol}`, action === 'subscribe' ? 1 : -1);
- this.recordMetric(`subscriptions.client.${clientId}`, action === 'subscribe' ? 1 : -1);
- }
-
- recordDataSource(sourceId: string, metrics: Partial<DataSourceMetrics>): void {
- Object.entries(metrics).forEach(([key, value]) => {
- if (typeof value === 'number') {
- this.recordMetric(`dataSource.${sourceId}.${key}`, value);
- }
- });
- }
-
- recordCacheMetrics(hitRate: number, size: number, memoryUsage: number): void {
- this.recordMetric('cacheHitRate', hitRate);
- this.recordMetric('cacheSize', size);
- this.recordMetric('cacheMemoryUsage', memoryUsage);
- }
-
- setGauge(metric: string, value: number, labels?: Record<string, string>): void {
- this.recordMetric(metric, value, labels, true);
- }
-
- incrementCounter(metric: string, value: number = 1, labels?: Record<string, string>): void {
- this.recordMetric(metric, value, labels, false);
- }
-
- recordHistogram(metric: string, value: number, labels?: Record<string, string>): void {
- this.recordMetric(`${metric}.value`, value, labels);
- this.recordMetric(`${metric}.count`, 1, labels);
- }
-
- private recordMetric(
- name: string,
- value: number,
- labels?: Record<string, string>,
- isGauge: boolean = false
- ): void {
- const point: MetricPoint = {
- value,
- timestamp: Date.now(),
- labels,
- };
-
- if (!this.metrics.has(name)) {
- this.metrics.set(name, {
- name,
- points: [],
- maxPoints: 3600, // Keep 1 hour of data at 1-second intervals
- });
- }
-
- const metric = this.metrics.get(name)!;
-
- if (isGauge) {
- // For gauges, replace the last value
- metric.points = [point];
- } else {
- // For counters and histograms, append
- metric.points.push(point);
- }
-
- // Trim old points
- if (metric.points.length > metric.maxPoints) {
- metric.points = metric.points.slice(-metric.maxPoints);
- }
- }
-
- // Metric retrieval methods
- getMetric(name: string, duration?: number): MetricPoint[] {
- const metric = this.metrics.get(name);
- if (!metric) return [];
-
- if (!duration) return [...metric.points];
-
- const cutoff = Date.now() - duration;
- return metric.points.filter(point => point.timestamp >= cutoff);
- }
-
- getAverageMetric(name: string, duration?: number): number {
- const points = this.getMetric(name, duration);
- if (points.length === 0) return 0;
-
- const sum = points.reduce((acc, point) => acc + point.value, 0);
- return sum / points.length;
- }
-
- getLatestMetric(name: string): number | null {
- const metric = this.metrics.get(name);
- if (!metric || metric.points.length === 0) return null;
-
- return metric.points[metric.points.length - 1].value;
- }
-
- getRate(name: string, duration: number = 60000): number {
- const points = this.getMetric(name, duration);
- if (points.length < 2) return 0;
-
- const oldest = points[0];
- const newest = points[points.length - 1];
- const timeDiff = newest.timestamp - oldest.timestamp;
- const valueDiff = newest.value - oldest.value;
-
- return timeDiff > 0 ? (valueDiff / timeDiff) * 1000 : 0; // per second
- }
-
- getPercentile(name: string, percentile: number, duration?: number): number {
- const points = this.getMetric(name, duration);
- if (points.length === 0) return 0;
-
- const values = points.map(p => p.value).sort((a, b) => a - b);
- const index = Math.ceil((percentile / 100) * values.length) - 1;
- return values[Math.max(0, index)];
- }
-
- // Aggregated metrics
- getAggregatedMetrics(): GatewayMetrics {
- return { ...this.aggregatedMetrics };
- }
-
- private collectMetrics(): void {
- const now = new Date().toISOString();
-
- // Update basic metrics
- this.aggregatedMetrics.totalMessages = this.getLatestMetric('totalMessages') || 0;
- this.aggregatedMetrics.messagesPerSecond = this.getRate('messagesPerSecond');
- this.aggregatedMetrics.averageLatency = this.getAverageMetric('latency', 60000);
- this.aggregatedMetrics.cacheHitRate = this.getLatestMetric('cacheHitRate') || 0;
- this.aggregatedMetrics.timestamp = now;
-
- // Calculate error rate
- const totalMessages = this.aggregatedMetrics.totalMessages;
- const totalErrors = this.getLatestMetric('errors') || 0;
- this.aggregatedMetrics.errorRate = totalMessages > 0 ? totalErrors / totalMessages : 0;
-
- // Update processing metrics
- this.aggregatedMetrics.processing.totalProcessed = this.getLatestMetric('processing.totalProcessed') || 0;
- this.aggregatedMetrics.processing.processedPerSecond = this.getRate('processing.processedPerSecond');
- this.aggregatedMetrics.processing.processingLatency = this.getAverageMetric('processing.processingLatency', 60000);
- this.aggregatedMetrics.processing.errorCount = this.getLatestMetric('processing.errorCount') || 0;
- this.aggregatedMetrics.processing.queueDepth = this.getLatestMetric('processing.queueDepth') || 0;
-
- // Update subscription metrics
- this.aggregatedMetrics.subscriptions.totalSubscriptions = this.getLatestMetric('subscriptions.totalSubscriptions') || 0;
- this.aggregatedMetrics.subscriptions.messagesSent = this.getLatestMetric('subscriptions.messagesSent') || 0;
- this.aggregatedMetrics.subscriptions.sendRate = this.getRate('subscriptions.messagesSent');
-
- this.emit('metrics-updated', this.aggregatedMetrics);
- }
-
- // Alert management
- addAlertRule(rule: AlertRule): void {
- this.alertRules.set(rule.id, rule);
- this.logger.info(`Alert rule added: ${rule.id}`);
- }
-
- removeAlertRule(ruleId: string): void {
- this.alertRules.delete(ruleId);
- this.alerts.delete(ruleId);
- this.logger.info(`Alert rule removed: ${ruleId}`);
- }
-
- getAlertRules(): AlertRule[] {
- return Array.from(this.alertRules.values());
- }
-
- getActiveAlerts(): Alert[] {
- return Array.from(this.alerts.values());
- }
-
- private checkAlerts(): void {
- for (const rule of this.alertRules.values()) {
- if (!rule.enabled) continue;
-
- const value = this.getMetricValue(rule.metric);
- if (value === null) continue;
-
- const isTriggered = this.evaluateCondition(value, rule.condition, rule.threshold);
-
- if (isTriggered) {
- const now = Date.now();
- const existingAlert = this.alerts.get(rule.id);
-
- // Check if alert should be triggered based on duration
- if (!existingAlert || (now - existingAlert.timestamp) >= rule.duration) {
- const alert: Alert = {
- id: rule.id,
- rule,
- value,
- timestamp: now,
- message: `Alert: ${rule.metric} ${rule.condition} ${rule.threshold} (current: ${value})`,
- severity: this.getSeverity(rule.metric, value),
- };
-
- this.alerts.set(rule.id, alert);
- this.emit('alert-triggered', alert);
- this.logger.warn(`Alert triggered: ${alert.message}`);
- }
- } else {
- // Clear alert if condition is no longer met
- if (this.alerts.has(rule.id)) {
- this.alerts.delete(rule.id);
- this.emit('alert-cleared', rule.id);
- this.logger.info(`Alert cleared: ${rule.id}`);
- }
- }
- }
- }
-
- private getMetricValue(metricPath: string): number | null {
- if (metricPath.includes('.')) {
- // Handle nested metric paths
- const parts = metricPath.split('.');
- let value: any = this.aggregatedMetrics;
-
- for (const part of parts) {
- if (value && typeof value === 'object' && part in value) {
- value = value[part];
- } else {
- return null;
- }
- }
-
- return typeof value === 'number' ? value : null;
- }
-
- return this.getLatestMetric(metricPath);
- }
-
- private evaluateCondition(value: number, condition: string, threshold: number): boolean {
- switch (condition) {
- case 'gt': return value > threshold;
- case 'lt': return value < threshold;
- case 'eq': return value === threshold;
- case 'gte': return value >= threshold;
- case 'lte': return value <= threshold;
- default: return false;
- }
- }
-
- private getSeverity(metric: string, value: number): Alert['severity'] {
- // Define severity based on metric type and value
- if (metric.includes('error') || metric.includes('Error')) {
- if (value > 0.1) return 'critical'; // > 10% error rate
- if (value > 0.05) return 'error'; // > 5% error rate
- if (value > 0.01) return 'warning'; // > 1% error rate
- return 'info';
- }
-
- if (metric.includes('latency') || metric.includes('Latency')) {
- if (value > 5000) return 'critical'; // > 5 seconds
- if (value > 2000) return 'error'; // > 2 seconds
- if (value > 1000) return 'warning'; // > 1 second
- return 'info';
- }
-
- return 'warning'; // Default severity
- }
-
- private cleanupOldMetrics(): void {
- const cutoff = Date.now() - (24 * 60 * 60 * 1000); // 24 hours
-
- for (const metric of this.metrics.values()) {
- metric.points = metric.points.filter(point => point.timestamp > cutoff);
- }
- }
-
- // Health and status
- getHealth(): HealthStatus {
- const activeAlerts = this.getActiveAlerts();
- const criticalAlerts = activeAlerts.filter(a => a.severity === 'critical');
- const errorAlerts = activeAlerts.filter(a => a.severity === 'error');
-
- let status: 'healthy' | 'degraded' | 'unhealthy' = 'healthy';
- let message = 'Metrics collector is operational';
-
- if (criticalAlerts.length > 0) {
- status = 'unhealthy';
- message = `${criticalAlerts.length} critical alerts active`;
- } else if (errorAlerts.length > 0) {
- status = 'degraded';
- message = `${errorAlerts.length} error alerts active`;
- }
-
- return {
- status,
- message,
- timestamp: new Date().toISOString(),
- details: {
- isRunning: this.isRunning,
- totalMetrics: this.metrics.size,
- activeAlerts: activeAlerts.length,
- alertRules: this.alertRules.size,
- },
- };
- }
-
- // Export methods
- exportMetrics(format: 'json' | 'prometheus' = 'json'): string {
- if (format === 'prometheus') {
- return this.exportPrometheusFormat();
- }
-
- return JSON.stringify({
- aggregated: this.aggregatedMetrics,
- timeSeries: Object.fromEntries(this.metrics),
- alerts: Object.fromEntries(this.alerts),
- }, null, 2);
- }
-
- private exportPrometheusFormat(): string {
- const lines: string[] = [];
-
- // Export aggregated metrics
- lines.push(`# HELP gateway_total_messages Total messages processed`);
- lines.push(`# TYPE gateway_total_messages counter`);
- lines.push(`gateway_total_messages ${this.aggregatedMetrics.totalMessages}`);
-
- lines.push(`# HELP gateway_messages_per_second Messages processed per second`);
- lines.push(`# TYPE gateway_messages_per_second gauge`);
- lines.push(`gateway_messages_per_second ${this.aggregatedMetrics.messagesPerSecond}`);
-
- lines.push(`# HELP gateway_average_latency Average processing latency in milliseconds`);
- lines.push(`# TYPE gateway_average_latency gauge`);
- lines.push(`gateway_average_latency ${this.aggregatedMetrics.averageLatency}`);
-
- lines.push(`# HELP gateway_error_rate Error rate as percentage`);
- lines.push(`# TYPE gateway_error_rate gauge`);
- lines.push(`gateway_error_rate ${this.aggregatedMetrics.errorRate}`);
-
- return lines.join('\n');
- }
-}
diff --git a/apps/core-services/market-data-gateway/src/services/ProcessingEngine.ts b/apps/core-services/market-data-gateway/src/services/ProcessingEngine.ts
deleted file mode 100644
index aea1fc6..0000000
--- a/apps/core-services/market-data-gateway/src/services/ProcessingEngine.ts
+++ /dev/null
@@ -1,539 +0,0 @@
-import { EventEmitter } from 'eventemitter3';
-import { Logger } from 'pino';
-import {
- ProcessingPipeline,
- DataProcessor,
- MarketDataTick,
- MarketDataCandle,
- MarketDataTrade,
- ProcessingError
-} from '../types/MarketDataGateway';
-
-interface ProcessingJob {
- id: string;
- data: any;
- pipeline: ProcessingPipeline;
- timestamp: Date;
- attempts: number;
-}
-
-export class ProcessingEngine extends EventEmitter {
- private config: any;
- private logger: Logger;
- private pipelines: Map<string, ProcessingPipeline> = new Map();
- private processors: Map<string, DataProcessor> = new Map();
- private processingQueue: ProcessingJob[] = [];
- private isProcessing = false;
- private processingStats = {
- totalProcessed: 0,
- totalErrors: 0,
- avgProcessingTimeMs: 0,
- processingTimes: [] as number[]
- };
-
- constructor(config: any, logger: Logger) {
- super();
- this.config = config;
- this.logger = logger;
- this.initializeBuiltInProcessors();
- }
-
- private initializeBuiltInProcessors() {
- // Data validation processor
- this.processors.set('data-validator', {
- id: 'data-validator',
- name: 'Data Validator',
- type: 'validation',
- enabled: true,
- priority: 1,
- config: {},
- process: this.validateData.bind(this)
- });
-
- // Data enrichment processor
- this.processors.set('data-enricher', {
- id: 'data-enricher',
- name: 'Data Enricher',
- type: 'enrichment',
- enabled: true,
- priority: 2,
- config: {},
- process: this.enrichData.bind(this)
- });
-
- // Data normalization processor
- this.processors.set('data-normalizer', {
- id: 'data-normalizer',
- name: 'Data Normalizer',
- type: 'normalization',
- enabled: true,
- priority: 3,
- config: {},
- process: this.normalizeData.bind(this)
- });
-
- // Outlier detection processor
- this.processors.set('outlier-detector', {
- id: 'outlier-detector',
- name: 'Outlier Detector',
- type: 'filter',
- enabled: true,
- priority: 4,
- config: {
- priceDeviationThreshold: 0.1, // 10% price deviation
- volumeThreshold: 1000000 // Minimum volume threshold
- },
- process: this.detectOutliers.bind(this)
- });
-
- // Market hours filter
- this.processors.set('market-hours-filter', {
- id: 'market-hours-filter',
- name: 'Market Hours Filter',
- type: 'filter',
- enabled: true,
- priority: 5,
- config: {
- marketOpen: '09:30',
- marketClose: '16:00',
- timezone: 'America/New_York'
- },
- process: this.filterMarketHours.bind(this)
- });
-
- // OHLC aggregator
- this.processors.set('ohlc-aggregator', {
- id: 'ohlc-aggregator',
- name: 'OHLC Aggregator',
- type: 'aggregation',
- enabled: true,
- priority: 6,
- config: {
- timeframes: ['1m', '5m', '15m', '1h', '1d']
- },
- process: this.aggregateOHLC.bind(this)
- });
- }
-
- public async start(): Promise<void> {
- this.logger.info('Starting Processing Engine');
-
- // Load configured pipelines
- if (this.config.pipelines) {
- for (const pipeline of this.config.pipelines) {
- this.addPipeline(pipeline);
- }
- }
-
- // Start processing loop
- this.startProcessing();
-
- this.logger.info('Processing Engine started');
- }
-
- public async stop(): Promise<void> {
- this.logger.info('Stopping Processing Engine');
-
- this.isProcessing = false;
-
- // Wait for current processing to complete
- while (this.processingQueue.length > 0) {
- await new Promise(resolve => setTimeout(resolve, 100));
- }
-
- this.logger.info('Processing Engine stopped');
- }
-
- public async process(data: any): Promise<any> {
- const startTime = Date.now();
-
- try {
- // Find applicable pipelines for this data
- const applicablePipelines = this.findApplicablePipelines(data);
-
- if (applicablePipelines.length === 0) {
- // No processing needed, return original data
- return data;
- }
-
- let processedData = data;
-
- // Process through each applicable pipeline
- for (const pipeline of applicablePipelines) {
- processedData = await this.processThroughPipeline(processedData, pipeline);
- }
-
- // Update processing stats
- const processingTime = Date.now() - startTime;
- this.updateProcessingStats(processingTime, false);
-
- this.emit('processed', processedData);
- return processedData;
-
- } catch (error) {
- this.logger.error({ error, data }, 'Processing error');
- this.updateProcessingStats(Date.now() - startTime, true);
- this.emit('error', error, data);
- throw error;
- }
- }
-
- public addPipeline(pipeline: ProcessingPipeline): void {
- this.logger.info({ pipelineId: pipeline.id }, 'Adding processing pipeline');
- this.pipelines.set(pipeline.id, pipeline);
- }
-
- public removePipeline(pipelineId: string): void {
- this.logger.info({ pipelineId }, 'Removing processing pipeline');
- this.pipelines.delete(pipelineId);
- }
-
- public getPipelines(): ProcessingPipeline[] {
- return Array.from(this.pipelines.values());
- }
-
- public addProcessor(processor: DataProcessor): void {
- this.logger.info({ processorId: processor.id }, 'Adding data processor');
- this.processors.set(processor.id, processor);
- }
-
- public removeProcessor(processorId: string): void {
- this.logger.info({ processorId }, 'Removing data processor');
- this.processors.delete(processorId);
- }
-
- public getProcessors(): DataProcessor[] {
- return Array.from(this.processors.values());
- }
-
- public getProcessingStats() {
- return {
- ...this.processingStats,
- queueDepth: this.processingQueue.length
- };
- }
-
- public isHealthy(): boolean {
- return this.isProcessing && this.processingStats.totalErrors / Math.max(this.processingStats.totalProcessed, 1) < 0.1;
- }
-
- private findApplicablePipelines(data: any): ProcessingPipeline[] {
- const applicable: ProcessingPipeline[] = [];
-
- for (const pipeline of this.pipelines.values()) {
- if (this.isPipelineApplicable(data, pipeline)) {
- applicable.push(pipeline);
- }
- }
-
- return applicable;
- }
-
- private isPipelineApplicable(data: any, pipeline: ProcessingPipeline): boolean {
- const { inputFilter } = pipeline;
-
- // Check symbol filter
- if (inputFilter.symbols && inputFilter.symbols.length > 0) {
- if (!data.symbol || !inputFilter.symbols.includes(data.symbol)) {
- return false;
- }
- }
-
- // Check source filter
- if (inputFilter.sources && inputFilter.sources.length > 0) {
- if (!data.source || !inputFilter.sources.includes(data.source)) {
- return false;
- }
- }
-
- // Check data type filter
- if (inputFilter.dataTypes && inputFilter.dataTypes.length > 0) {
- const dataType = this.getDataType(data);
- if (!inputFilter.dataTypes.includes(dataType)) {
- return false;
- }
- }
-
- return true;
- }
-
- private getDataType(data: any): string {
- if (data.id && data.side) return 'trade';
- if (data.open !== undefined && data.high !== undefined) return 'candle';
- if (data.price !== undefined) return 'quote';
- if (data.bids || data.asks) return 'orderbook';
- return 'unknown';
- }
-
- private async processThroughPipeline(data: any, pipeline: ProcessingPipeline): Promise<any> {
- let processedData = data;
-
- // Sort processors by priority
- const sortedProcessors = pipeline.processors
- .filter(p => p.enabled)
- .sort((a, b) => a.priority - b.priority);
-
- for (const processorConfig of sortedProcessors) {
- const processor = this.processors.get(processorConfig.id);
- if (!processor) {
- this.logger.warn({
- processorId: processorConfig.id,
- pipelineId: pipeline.id
- }, 'Processor not found');
- continue;
- }
-
- try {
- processedData = await processor.process(processedData);
-
- // If processor returns null/undefined, filter out the data
- if (processedData === null || processedData === undefined) {
- this.logger.debug({
- processorId: processor.id,
- pipelineId: pipeline.id
- }, 'Data filtered out by processor');
- return null;
- }
-
- } catch (error) {
- this.logger.error({
- error,
- processorId: processor.id,
- pipelineId: pipeline.id,
- data: processedData
- }, 'Processor error');
-
- // Continue processing with original data on error
- // You might want to implement different error handling strategies
- }
- }
-
- return processedData;
- }
-
- private startProcessing(): void {
- this.isProcessing = true;
-
- const processLoop = async () => {
- while (this.isProcessing) {
- if (this.processingQueue.length > 0) {
- const job = this.processingQueue.shift()!;
-
- try {
- await this.processThroughPipeline(job.data, job.pipeline);
- } catch (error) {
- this.logger.error({
- jobId: job.id,
- error
- }, 'Job processing error');
- }
- } else {
- // Wait for new jobs
- await new Promise(resolve => setTimeout(resolve, 10));
- }
- }
- };
-
- processLoop();
- }
-
- private updateProcessingStats(processingTime: number, isError: boolean): void {
- this.processingStats.totalProcessed++;
-
- if (isError) {
- this.processingStats.totalErrors++;
- }
-
- this.processingStats.processingTimes.push(processingTime);
-
- // Keep only last 1000 processing times
- if (this.processingStats.processingTimes.length > 1000) {
- this.processingStats.processingTimes = this.processingStats.processingTimes.slice(-1000);
- }
-
- // Update average processing time
- this.processingStats.avgProcessingTimeMs =
- this.processingStats.processingTimes.reduce((sum, time) => sum + time, 0) /
- this.processingStats.processingTimes.length;
- }
-
- // Built-in processor implementations
- private async validateData(data: any): Promise<any> {
- // Basic data validation
- if (!data) {
- throw new Error('Data is null or undefined');
- }
-
- if (!data.symbol) {
- throw new Error('Missing symbol');
- }
-
- if (!data.timestamp) {
- data.timestamp = Date.now();
- }
-
- // Validate price data
- if (data.price !== undefined) {
- if (typeof data.price !== 'number' || data.price <= 0) {
- throw new Error('Invalid price');
- }
- }
-
- // Validate volume data
- if (data.volume !== undefined) {
- if (typeof data.volume !== 'number' || data.volume < 0) {
- throw new Error('Invalid volume');
- }
- }
-
- return data;
- }
-
- private async enrichData(data: any): Promise<any> {
- // Add computed fields
- const enriched = { ...data };
-
- // Add processing timestamp
- enriched.processedAt = Date.now();
-
- // Add data type
- enriched.dataType = this.getDataType(data);
-
- // Calculate derived metrics for quotes
- if (data.price && data.prevClose) {
- enriched.change = data.price - data.prevClose;
- enriched.changePercent = (enriched.change / data.prevClose) * 100;
- }
-
- // Add market session info
- const marketSession = this.getMarketSession(data.timestamp);
- enriched.marketSession = marketSession;
-
- return enriched;
- }
-
- private async normalizeData(data: any): Promise<any> {
- const normalized = { ...data };
-
- // Normalize symbol format
- if (normalized.symbol) {
- normalized.symbol = normalized.symbol.toUpperCase().trim();
- }
-
- // Normalize timestamp to milliseconds
- if (normalized.timestamp) {
- if (typeof normalized.timestamp === 'string') {
- normalized.timestamp = new Date(normalized.timestamp).getTime();
- } else if (normalized.timestamp.toString().length === 10) {
- // Convert seconds to milliseconds
- normalized.timestamp *= 1000;
- }
- }
-
- // Round price values to appropriate precision
- if (normalized.price) {
- normalized.price = Math.round(normalized.price * 10000) / 10000;
- }
-
- return normalized;
- }
-
- private async detectOutliers(data: any): Promise<any> {
- // Simple outlier detection - in practice, you'd use historical data
- const config = this.processors.get('outlier-detector')?.config;
-
- if (data.price && data.prevClose) {
- const priceDeviation = Math.abs(data.price - data.prevClose) / data.prevClose;
- if (priceDeviation > (config?.priceDeviationThreshold || 0.1)) {
- this.logger.warn({
- symbol: data.symbol,
- price: data.price,
- prevClose: data.prevClose,
- deviation: priceDeviation
- }, 'Price outlier detected');
-
- // You could either filter out or flag the data
- data.outlier = true;
- data.outlierReason = 'price_deviation';
- }
- }
-
- if (data.volume) {
- const volumeThreshold = config?.volumeThreshold || 1000000;
- if (data.volume > volumeThreshold) {
- data.highVolume = true;
- }
- }
-
- return data;
- }
-
- private async filterMarketHours(data: any): Promise<any> {
- const config = this.processors.get('market-hours-filter')?.config;
-
- if (!config?.marketOpen || !config?.marketClose) {
- return data;
- }
-
- // Simple market hours check - in practice, you'd use proper timezone handling
- const timestamp = new Date(data.timestamp);
- const timeString = timestamp.toTimeString().substring(0, 5);
-
- if (timeString < config.marketOpen || timeString > config.marketClose) {
- // Mark as after hours
- data.afterHours = true;
- }
-
- return data;
- }
-
- private async aggregateOHLC(data: any): Promise<any> {
- // This is a simplified version - in practice, you'd maintain
- // aggregation windows and emit candles when complete
- if (data.dataType === 'quote' && data.price) {
- const candle = {
- symbol: data.symbol,
- timestamp: data.timestamp,
- open: data.price,
- high: data.price,
- low: data.price,
- close: data.price,
- volume: data.volume || 0,
- timeframe: '1m',
- source: data.source,
- dataType: 'candle'
- };
-
- // In practice, you'd emit this separately or include it in results
- this.emit('candle-generated', candle);
- }
-
- return data;
- }
-
- private getMarketSession(timestamp: number): string {
- const date = new Date(timestamp);
- const timeString = date.toTimeString().substring(0, 5);
-
- if (timeString < '09:30') return 'pre-market';
- if (timeString <= '16:00') return 'regular';
- if (timeString <= '20:00') return 'after-hours';
- return 'closed';
- }
-
- public async updateConfig(config: any): Promise<void> {
- this.config = config;
- this.logger.info('Processing engine configuration updated');
-
- // Update pipelines if provided
- if (config.pipelines) {
- // Clear existing pipelines
- this.pipelines.clear();
-
- // Add new pipelines
- for (const pipeline of config.pipelines) {
- this.addPipeline(pipeline);
- }
- }
- }
-}
diff --git a/apps/core-services/market-data-gateway/src/services/ServiceIntegrationManager.ts b/apps/core-services/market-data-gateway/src/services/ServiceIntegrationManager.ts
deleted file mode 100644
index fbeca5c..0000000
--- a/apps/core-services/market-data-gateway/src/services/ServiceIntegrationManager.ts
+++ /dev/null
@@ -1,540 +0,0 @@
-import { EventEmitter } from 'events';
-import axios, { AxiosInstance } from 'axios';
-import {
- ServiceIntegration,
- Logger,
- HealthStatus,
- MarketDataTick,
- MarketDataCandle,
- ProcessedData,
- DataPipelineJob,
- FeatureComputationRequest,
- DataAsset
-} from '../types/MarketDataGateway';
-
-interface ServiceEndpoint {
- baseUrl: string;
- timeout: number;
- retries: number;
- healthPath: string;
-}
-
-interface ServiceHealth {
- serviceId: string;
- status: 'healthy' | 'degraded' | 'unhealthy' | 'unreachable';
- lastCheck: number;
- responseTime: number;
- errorCount: number;
-}
-
-interface IntegrationMetrics {
- totalRequests: number;
- successfulRequests: number;
- failedRequests: number;
- averageResponseTime: number;
- lastRequestTime: number;
-}
-
-export class ServiceIntegrationManager extends EventEmitter {
- private config: ServiceIntegration;
- private logger: Logger;
- private httpClients: Map<string, AxiosInstance>;
- private serviceHealth: Map<string, ServiceHealth>;
- private integrationMetrics: Map<string, IntegrationMetrics>;
- private healthCheckInterval: NodeJS.Timeout | null = null;
- private isInitialized: boolean = false;
-
- constructor(config: ServiceIntegration, logger: Logger) {
- super();
- this.config = config;
- this.logger = logger;
- this.httpClients = new Map();
- this.serviceHealth = new Map();
- this.integrationMetrics = new Map();
- }
-
- async initialize(): Promise<void> {
- try {
- // Initialize HTTP clients for each service
- const services = [
- { id: 'data-processor', config: this.config.dataProcessor },
- { id: 'feature-store', config: this.config.featureStore },
- { id: 'data-catalog', config: this.config.dataCatalog },
- ];
-
- for (const service of services) {
- if (service.config.enabled) {
- const client = axios.create({
- baseURL: service.config.baseUrl,
- timeout: service.config.timeout || 30000,
- headers: {
- 'Content-Type': 'application/json',
- 'User-Agent': 'market-data-gateway/1.0.0',
- },
- });
-
- // Add request interceptor for metrics
- client.interceptors.request.use((config) => {
- const startTime = Date.now();
- config.metadata = { startTime };
- return config;
- });
-
- // Add response interceptor for metrics and error handling
- client.interceptors.response.use(
- (response) => {
- const endTime = Date.now();
- const startTime = response.config.metadata?.startTime || endTime;
- this.updateMetrics(service.id, true, endTime - startTime);
- return response;
- },
- (error) => {
- const endTime = Date.now();
- const startTime = error.config?.metadata?.startTime || endTime;
- this.updateMetrics(service.id, false, endTime - startTime);
- return Promise.reject(error);
- }
- );
-
- this.httpClients.set(service.id, client);
-
- // Initialize health tracking
- this.serviceHealth.set(service.id, {
- serviceId: service.id,
- status: 'unreachable',
- lastCheck: 0,
- responseTime: 0,
- errorCount: 0,
- });
-
- // Initialize metrics
- this.integrationMetrics.set(service.id, {
- totalRequests: 0,
- successfulRequests: 0,
- failedRequests: 0,
- averageResponseTime: 0,
- lastRequestTime: 0,
- });
- }
- }
-
- // Start health monitoring
- this.startHealthMonitoring();
- this.isInitialized = true;
-
- this.logger.info('Service integration manager initialized successfully');
- this.emit('initialized');
- } catch (error) {
- this.logger.error('Failed to initialize service integration manager:', error);
- throw error;
- }
- }
-
- async shutdown(): Promise<void> {
- try {
- if (this.healthCheckInterval) {
- clearInterval(this.healthCheckInterval);
- this.healthCheckInterval = null;
- }
-
- this.isInitialized = false;
- this.logger.info('Service integration manager shut down successfully');
- this.emit('shutdown');
- } catch (error) {
- this.logger.error('Error shutting down service integration manager:', error);
- }
- }
-
- // Data Processor Integration
- async sendToDataProcessor(data: ProcessedData[]): Promise<any> {
- if (!this.config.dataProcessor.enabled) {
- this.logger.debug('Data processor integration disabled');
- return;
- }
-
- try {
- const client = this.httpClients.get('data-processor');
- if (!client) throw new Error('Data processor client not initialized');
-
- const payload = {
- source: 'market-data-gateway',
- timestamp: new Date().toISOString(),
- data: data,
- };
-
- const response = await client.post('/api/v1/data/ingest', payload);
-
- this.logger.debug(`Sent ${data.length} records to data processor`);
- this.emit('data-sent', { service: 'data-processor', count: data.length });
-
- return response.data;
- } catch (error) {
- this.logger.error('Failed to send data to data processor:', error);
- this.emit('integration-error', { service: 'data-processor', error });
- throw error;
- }
- }
-
- async createDataPipeline(pipelineConfig: any): Promise<string> {
- if (!this.config.dataProcessor.enabled) {
- throw new Error('Data processor integration disabled');
- }
-
- try {
- const client = this.httpClients.get('data-processor');
- if (!client) throw new Error('Data processor client not initialized');
-
- const response = await client.post('/api/v1/pipelines', pipelineConfig);
-
- this.logger.info(`Created data pipeline: ${response.data.id}`);
- return response.data.id;
- } catch (error) {
- this.logger.error('Failed to create data pipeline:', error);
- throw error;
- }
- }
-
- async triggerPipelineJob(pipelineId: string, jobConfig: Partial<DataPipelineJob>): Promise<string> {
- if (!this.config.dataProcessor.enabled) {
- throw new Error('Data processor integration disabled');
- }
-
- try {
- const client = this.httpClients.get('data-processor');
- if (!client) throw new Error('Data processor client not initialized');
-
- const response = await client.post(`/api/v1/pipelines/${pipelineId}/jobs`, jobConfig);
-
- this.logger.info(`Triggered pipeline job: ${response.data.jobId}`);
- return response.data.jobId;
- } catch (error) {
- this.logger.error('Failed to trigger pipeline job:', error);
- throw error;
- }
- }
-
- // Feature Store Integration
- async publishToFeatureStore(features: any[]): Promise<any> {
- if (!this.config.featureStore.enabled) {
- this.logger.debug('Feature store integration disabled');
- return;
- }
-
- try {
- const client = this.httpClients.get('feature-store');
- if (!client) throw new Error('Feature store client not initialized');
-
- const payload = {
- source: 'market-data-gateway',
- timestamp: new Date().toISOString(),
- features: features,
- };
-
- const response = await client.post('/api/v1/features/ingest', payload);
-
- this.logger.debug(`Published ${features.length} features to feature store`);
- this.emit('features-published', { count: features.length });
-
- return response.data;
- } catch (error) {
- this.logger.error('Failed to publish features to feature store:', error);
- this.emit('integration-error', { service: 'feature-store', error });
- throw error;
- }
- }
-
- async requestFeatureComputation(request: FeatureComputationRequest): Promise<any> {
- if (!this.config.featureStore.enabled) {
- throw new Error('Feature store integration disabled');
- }
-
- try {
- const client = this.httpClients.get('feature-store');
- if (!client) throw new Error('Feature store client not initialized');
-
- const response = await client.post('/api/v1/features/compute', request);
-
- this.logger.info(`Requested feature computation: ${request.featureGroupId}`);
- return response.data;
- } catch (error) {
- this.logger.error('Failed to request feature computation:', error);
- throw error;
- }
- }
-
- async getFeatureGroup(featureGroupId: string): Promise<any> {
- if (!this.config.featureStore.enabled) {
- throw new Error('Feature store integration disabled');
- }
-
- try {
- const client = this.httpClients.get('feature-store');
- if (!client) throw new Error('Feature store client not initialized');
-
- const response = await client.get(`/api/v1/feature-groups/${featureGroupId}`);
- return response.data;
- } catch (error) {
- this.logger.error(`Failed to get feature group ${featureGroupId}:`, error);
- throw error;
- }
- }
-
- // Data Catalog Integration
- async registerDataAsset(asset: Omit<DataAsset, 'id'>): Promise<string> {
- if (!this.config.dataCatalog.enabled) {
- this.logger.debug('Data catalog integration disabled');
- return '';
- }
-
- try {
- const client = this.httpClients.get('data-catalog');
- if (!client) throw new Error('Data catalog client not initialized');
-
- const response = await client.post('/api/v1/assets', asset);
-
- this.logger.info(`Registered data asset: ${asset.name}`);
- this.emit('asset-registered', { assetId: response.data.id, name: asset.name });
-
- return response.data.id;
- } catch (error) {
- this.logger.error('Failed to register data asset:', error);
- this.emit('integration-error', { service: 'data-catalog', error });
- throw error;
- }
- }
-
- async updateDataLineage(fromAssetId: string, toAssetId: string, transformationType: string): Promise<void> {
- if (!this.config.dataCatalog.enabled) {
- this.logger.debug('Data catalog integration disabled');
- return;
- }
-
- try {
- const client = this.httpClients.get('data-catalog');
- if (!client) throw new Error('Data catalog client not initialized');
-
- const lineageData = {
- fromAssetId,
- toAssetId,
- transformationType,
- timestamp: new Date().toISOString(),
- source: 'market-data-gateway',
- };
-
- await client.post('/api/v1/lineage', lineageData);
-
- this.logger.debug(`Updated data lineage: ${fromAssetId} -> ${toAssetId}`);
- this.emit('lineage-updated', lineageData);
- } catch (error) {
- this.logger.error('Failed to update data lineage:', error);
- this.emit('integration-error', { service: 'data-catalog', error });
- throw error;
- }
- }
-
- async reportDataQuality(assetId: string, qualityMetrics: any): Promise<void> {
- if (!this.config.dataCatalog.enabled) {
- this.logger.debug('Data catalog integration disabled');
- return;
- }
-
- try {
- const client = this.httpClients.get('data-catalog');
- if (!client) throw new Error('Data catalog client not initialized');
-
- const qualityReport = {
- assetId,
- metrics: qualityMetrics,
- timestamp: new Date().toISOString(),
- source: 'market-data-gateway',
- };
-
- await client.post('/api/v1/quality/reports', qualityReport);
-
- this.logger.debug(`Reported data quality for asset: ${assetId}`);
- this.emit('quality-reported', { assetId, metrics: qualityMetrics });
- } catch (error) {
- this.logger.error('Failed to report data quality:', error);
- this.emit('integration-error', { service: 'data-catalog', error });
- throw error;
- }
- }
-
- // Health monitoring
- private startHealthMonitoring(): void {
- this.healthCheckInterval = setInterval(() => {
- this.checkServiceHealth();
- }, 30000); // Check every 30 seconds
- }
-
- private async checkServiceHealth(): Promise<void> {
- const healthPromises = Array.from(this.httpClients.entries()).map(
- async ([serviceId, client]) => {
- const startTime = Date.now();
- try {
- await client.get('/health');
- const responseTime = Date.now() - startTime;
-
- this.updateServiceHealth(serviceId, 'healthy', responseTime, false);
- } catch (error) {
- const responseTime = Date.now() - startTime;
- this.updateServiceHealth(serviceId, 'unhealthy', responseTime, true);
- }
- }
- );
-
- await Promise.allSettled(healthPromises);
- }
-
- private updateServiceHealth(
- serviceId: string,
- status: ServiceHealth['status'],
- responseTime: number,
- isError: boolean
- ): void {
- const health = this.serviceHealth.get(serviceId);
- if (!health) return;
-
- health.status = status;
- health.lastCheck = Date.now();
- health.responseTime = responseTime;
-
- if (isError) {
- health.errorCount++;
- } else {
- health.errorCount = Math.max(0, health.errorCount - 1); // Decay error count
- }
-
- this.serviceHealth.set(serviceId, health);
- this.emit('service-health-updated', { serviceId, health });
- }
-
- private updateMetrics(serviceId: string, success: boolean, responseTime: number): void {
- const metrics = this.integrationMetrics.get(serviceId);
- if (!metrics) return;
-
- metrics.totalRequests++;
- metrics.lastRequestTime = Date.now();
-
- if (success) {
- metrics.successfulRequests++;
- } else {
- metrics.failedRequests++;
- }
-
- // Update average response time
- const totalSuccessful = metrics.successfulRequests;
- if (totalSuccessful > 0) {
- metrics.averageResponseTime =
- (metrics.averageResponseTime * (totalSuccessful - 1) + responseTime) / totalSuccessful;
- }
-
- this.integrationMetrics.set(serviceId, metrics);
- }
-
- // Status and metrics
- getServiceHealth(serviceId?: string): ServiceHealth | ServiceHealth[] {
- if (serviceId) {
- return this.serviceHealth.get(serviceId) || {
- serviceId,
- status: 'unreachable',
- lastCheck: 0,
- responseTime: 0,
- errorCount: 0,
- };
- }
-
- return Array.from(this.serviceHealth.values());
- }
-
- getIntegrationMetrics(serviceId?: string): IntegrationMetrics | IntegrationMetrics[] {
- if (serviceId) {
- return this.integrationMetrics.get(serviceId) || {
- totalRequests: 0,
- successfulRequests: 0,
- failedRequests: 0,
- averageResponseTime: 0,
- lastRequestTime: 0,
- };
- }
-
- return Array.from(this.integrationMetrics.values());
- }
-
- getHealth(): HealthStatus {
- const allHealthy = Array.from(this.serviceHealth.values()).every(
- health => health.status === 'healthy'
- );
-
- const degradedServices = Array.from(this.serviceHealth.values()).filter(
- health => health.status === 'degraded'
- );
-
- const unhealthyServices = Array.from(this.serviceHealth.values()).filter(
- health => health.status === 'unhealthy' || health.status === 'unreachable'
- );
-
- let status: 'healthy' | 'degraded' | 'unhealthy' = 'healthy';
- let message = 'All service integrations are healthy';
-
- if (unhealthyServices.length > 0) {
- status = 'unhealthy';
- message = `${unhealthyServices.length} services are unhealthy`;
- } else if (degradedServices.length > 0) {
- status = 'degraded';
- message = `${degradedServices.length} services are degraded`;
- }
-
- return {
- status,
- message,
- timestamp: new Date().toISOString(),
- details: {
- isInitialized: this.isInitialized,
- totalServices: this.serviceHealth.size,
- healthyServices: Array.from(this.serviceHealth.values()).filter(h => h.status === 'healthy').length,
- degradedServices: degradedServices.length,
- unhealthyServices: unhealthyServices.length,
- serviceHealth: Object.fromEntries(this.serviceHealth),
- integrationMetrics: Object.fromEntries(this.integrationMetrics),
- },
- };
- }
-
- // Configuration management
- updateServiceConfig(serviceId: string, config: Partial<ServiceEndpoint>): void {
- const currentConfig = this.getServiceConfig(serviceId);
- if (!currentConfig) {
- this.logger.error(`Service ${serviceId} not found for config update`);
- return;
- }
-
- // Update the configuration
- Object.assign(currentConfig, config);
-
- // Reinitialize the HTTP client if URL changed
- if (config.baseUrl) {
- const client = this.httpClients.get(serviceId);
- if (client) {
- client.defaults.baseURL = config.baseUrl;
- client.defaults.timeout = config.timeout || client.defaults.timeout;
- }
- }
-
- this.logger.info(`Updated configuration for service: ${serviceId}`);
- this.emit('service-config-updated', { serviceId, config });
- }
-
- private getServiceConfig(serviceId: string): any {
- switch (serviceId) {
- case 'data-processor':
- return this.config.dataProcessor;
- case 'feature-store':
- return this.config.featureStore;
- case 'data-catalog':
- return this.config.dataCatalog;
- default:
- return null;
- }
- }
-}
diff --git a/apps/core-services/market-data-gateway/src/services/SubscriptionManager.ts b/apps/core-services/market-data-gateway/src/services/SubscriptionManager.ts
deleted file mode 100644
index 6b197b8..0000000
--- a/apps/core-services/market-data-gateway/src/services/SubscriptionManager.ts
+++ /dev/null
@@ -1,617 +0,0 @@
-import { EventEmitter } from 'eventemitter3';
-import { Logger } from 'pino';
-import WebSocket from 'ws';
-import {
- SubscriptionRequest,
- ClientSubscription,
- WebSocketMessage,
- WebSocketDataMessage,
- MarketDataTick,
- MarketDataCandle,
- MarketDataTrade
-} from '../types/MarketDataGateway';
-
-interface WebSocketClient {
- id: string;
- ws: WebSocket;
- subscriptions: Set<string>;
- connectedAt: Date;
- lastPing: Date;
- metadata: {
- userAgent?: string;
- ip?: string;
- userId?: string;
- };
-}
-
-export class SubscriptionManager extends EventEmitter {
- private logger: Logger;
- private subscriptions: Map<string, ClientSubscription> = new Map();
- private clients: Map<string, WebSocketClient> = new Map();
- private symbolSubscriptions: Map<string, Set<string>> = new Map(); // symbol -> subscription IDs
- private heartbeatInterval?: NodeJS.Timeout;
- private cleanupInterval?: NodeJS.Timeout;
-
- constructor(logger: Logger) {
- super();
- this.logger = logger;
- }
-
- public async start(): Promise<void> {
- this.logger.info('Starting Subscription Manager');
-
- // Start heartbeat for WebSocket clients
- this.startHeartbeat();
-
- // Start cleanup for stale subscriptions
- this.startCleanup();
-
- this.logger.info('Subscription Manager started');
- }
-
- public async stop(): Promise<void> {
- this.logger.info('Stopping Subscription Manager');
-
- // Clear intervals
- if (this.heartbeatInterval) {
- clearInterval(this.heartbeatInterval);
- }
-
- if (this.cleanupInterval) {
- clearInterval(this.cleanupInterval);
- }
-
- // Close all WebSocket connections
- for (const client of this.clients.values()) {
- client.ws.close();
- }
-
- this.clients.clear();
- this.subscriptions.clear();
- this.symbolSubscriptions.clear();
-
- this.logger.info('Subscription Manager stopped');
- }
-
- public async subscribe(request: SubscriptionRequest): Promise<string> {
- this.logger.info({
- clientId: request.clientId,
- symbols: request.symbols,
- dataTypes: request.dataTypes
- }, 'Creating subscription');
-
- // Validate subscription request
- this.validateSubscriptionRequest(request);
-
- // Create subscription
- const subscription: ClientSubscription = {
- request,
- status: 'active',
- connectedAt: new Date(),
- lastUpdate: new Date(),
- metrics: {
- messagesDelivered: 0,
- bytesTransferred: 0,
- errors: 0,
- avgLatencyMs: 0
- }
- };
-
- this.subscriptions.set(request.id, subscription);
-
- // Track symbol subscriptions for efficient lookup
- for (const symbol of request.symbols) {
- if (!this.symbolSubscriptions.has(symbol)) {
- this.symbolSubscriptions.set(symbol, new Set());
- }
- this.symbolSubscriptions.get(symbol)!.add(request.id);
- }
-
- this.emit('subscribed', subscription);
-
- this.logger.info({ subscriptionId: request.id }, 'Subscription created');
- return request.id;
- }
-
- public async unsubscribe(subscriptionId: string): Promise<void> {
- const subscription = this.subscriptions.get(subscriptionId);
- if (!subscription) {
- throw new Error(`Subscription ${subscriptionId} not found`);
- }
-
- this.logger.info({ subscriptionId }, 'Removing subscription');
-
- // Remove from symbol tracking
- for (const symbol of subscription.request.symbols) {
- const symbolSubs = this.symbolSubscriptions.get(symbol);
- if (symbolSubs) {
- symbolSubs.delete(subscriptionId);
- if (symbolSubs.size === 0) {
- this.symbolSubscriptions.delete(symbol);
- }
- }
- }
-
- // Remove subscription
- this.subscriptions.delete(subscriptionId);
-
- this.emit('unsubscribed', subscription.request.clientId);
-
- this.logger.info({ subscriptionId }, 'Subscription removed');
- }
-
- public getSubscriptions(clientId?: string): ClientSubscription[] {
- const subscriptions = Array.from(this.subscriptions.values());
-
- if (clientId) {
- return subscriptions.filter(sub => sub.request.clientId === clientId);
- }
-
- return subscriptions;
- }
-
- public async broadcast(data: MarketDataTick | MarketDataCandle | MarketDataTrade): Promise<void> {
- const symbol = data.symbol;
- const dataType = this.getDataType(data);
-
- // Get subscriptions for this symbol
- const subscriptionIds = this.symbolSubscriptions.get(symbol);
- if (!subscriptionIds || subscriptionIds.size === 0) {
- return;
- }
-
- const deliveryPromises: Promise<void>[] = [];
-
- for (const subscriptionId of subscriptionIds) {
- const subscription = this.subscriptions.get(subscriptionId);
- if (!subscription || subscription.status !== 'active') {
- continue;
- }
-
- // Check if subscription wants this data type
- if (!subscription.request.dataTypes.includes(dataType as any)) {
- continue;
- }
-
- // Apply filters
- if (!this.passesFilters(data, subscription.request.filters)) {
- continue;
- }
-
- // Apply throttling if configured
- if (subscription.request.throttle && !this.passesThrottle(subscription)) {
- continue;
- }
-
- // Deliver data based on delivery method
- deliveryPromises.push(this.deliverData(subscription, data));
- }
-
- // Wait for all deliveries
- await Promise.allSettled(deliveryPromises);
- }
-
- public addWebSocketClient(ws: WebSocket, clientId: string, metadata: any = {}): void {
- this.logger.info({ clientId }, 'Adding WebSocket client');
-
- const client: WebSocketClient = {
- id: clientId,
- ws,
- subscriptions: new Set(),
- connectedAt: new Date(),
- lastPing: new Date(),
- metadata
- };
-
- this.clients.set(clientId, client);
-
- // Setup WebSocket event handlers
- ws.on('message', (message: Buffer) => {
- this.handleWebSocketMessage(clientId, message);
- });
-
- ws.on('close', () => {
- this.removeWebSocketClient(clientId);
- });
-
- ws.on('error', (error) => {
- this.logger.error({ clientId, error }, 'WebSocket client error');
- this.removeWebSocketClient(clientId);
- });
-
- ws.on('pong', () => {
- const client = this.clients.get(clientId);
- if (client) {
- client.lastPing = new Date();
- }
- });
-
- // Send welcome message
- this.sendWebSocketMessage(ws, {
- type: 'status',
- id: 'welcome',
- timestamp: Date.now(),
- payload: {
- status: 'connected',
- clientId,
- serverTime: new Date().toISOString()
- }
- });
- }
-
- public removeWebSocketClient(clientId: string): void {
- const client = this.clients.get(clientId);
- if (!client) {
- return;
- }
-
- this.logger.info({ clientId }, 'Removing WebSocket client');
-
- // Unsubscribe from all subscriptions
- for (const subscriptionId of client.subscriptions) {
- try {
- this.unsubscribe(subscriptionId);
- } catch (error) {
- this.logger.error({ subscriptionId, error }, 'Error unsubscribing client');
- }
- }
-
- // Close WebSocket if still open
- if (client.ws.readyState === WebSocket.OPEN) {
- client.ws.close();
- }
-
- this.clients.delete(clientId);
- }
-
- private validateSubscriptionRequest(request: SubscriptionRequest): void {
- if (!request.id) {
- throw new Error('Subscription ID is required');
- }
-
- if (!request.clientId) {
- throw new Error('Client ID is required');
- }
-
- if (!request.symbols || request.symbols.length === 0) {
- throw new Error('At least one symbol is required');
- }
-
- if (!request.dataTypes || request.dataTypes.length === 0) {
- throw new Error('At least one data type is required');
- }
-
- if (this.subscriptions.has(request.id)) {
- throw new Error(`Subscription ${request.id} already exists`);
- }
-
- // Validate symbols format
- for (const symbol of request.symbols) {
- if (typeof symbol !== 'string' || symbol.length === 0) {
- throw new Error(`Invalid symbol: ${symbol}`);
- }
- }
-
- // Validate data types
- const validDataTypes = ['quotes', 'trades', 'orderbook', 'candles', 'news'];
- for (const dataType of request.dataTypes) {
- if (!validDataTypes.includes(dataType)) {
- throw new Error(`Invalid data type: ${dataType}`);
- }
- }
- }
-
- private getDataType(data: any): string {
- if (data.id && data.side) return 'trades';
- if (data.open !== undefined && data.high !== undefined) return 'candles';
- if (data.price !== undefined) return 'quotes';
- if (data.bids || data.asks) return 'orderbook';
- return 'unknown';
- }
-
- private passesFilters(data: any, filters?: any): boolean {
- if (!filters) {
- return true;
- }
-
- // Price range filter
- if (filters.priceRange && data.price) {
- if (data.price < filters.priceRange.min || data.price > filters.priceRange.max) {
- return false;
- }
- }
-
- // Volume threshold filter
- if (filters.volumeThreshold && data.volume) {
- if (data.volume < filters.volumeThreshold) {
- return false;
- }
- }
-
- // Exchange filter
- if (filters.exchanges && data.exchange) {
- if (!filters.exchanges.includes(data.exchange)) {
- return false;
- }
- }
-
- return true;
- }
-
- private passesThrottle(subscription: ClientSubscription): boolean {
- const throttle = subscription.request.throttle;
- if (!throttle) {
- return true;
- }
-
- const now = Date.now();
- const timeSinceLastUpdate = now - subscription.lastUpdate.getTime();
- const minInterval = 1000 / throttle.maxUpdatesPerSecond;
-
- return timeSinceLastUpdate >= minInterval;
- }
-
- private async deliverData(subscription: ClientSubscription, data: any): Promise<void> {
- const startTime = Date.now();
-
- try {
- const message: WebSocketDataMessage = {
- type: 'data',
- id: subscription.request.id,
- timestamp: Date.now(),
- payload: {
- dataType: this.getDataType(data),
- data
- }
- };
-
- switch (subscription.request.delivery.method) {
- case 'websocket':
- await this.deliverViaWebSocket(subscription, message);
- break;
- case 'webhook':
- await this.deliverViaWebhook(subscription, message);
- break;
- case 'eventbus':
- await this.deliverViaEventBus(subscription, message);
- break;
- default:
- throw new Error(`Unsupported delivery method: ${subscription.request.delivery.method}`);
- }
-
- // Update metrics
- const latency = Date.now() - startTime;
- subscription.metrics.messagesDelivered++;
- subscription.metrics.avgLatencyMs =
- (subscription.metrics.avgLatencyMs * (subscription.metrics.messagesDelivered - 1) + latency) /
- subscription.metrics.messagesDelivered;
- subscription.lastUpdate = new Date();
-
- } catch (error) {
- this.logger.error({
- subscriptionId: subscription.request.id,
- error
- }, 'Error delivering data');
-
- subscription.metrics.errors++;
-
- if (subscription.metrics.errors > 10) {
- subscription.status = 'error';
- this.emit('error', error, subscription.request.clientId);
- }
- }
- }
-
- private async deliverViaWebSocket(subscription: ClientSubscription, message: WebSocketDataMessage): Promise<void> {
- const client = this.clients.get(subscription.request.clientId);
- if (!client || client.ws.readyState !== WebSocket.OPEN) {
- throw new Error('WebSocket client not available');
- }
-
- this.sendWebSocketMessage(client.ws, message);
-
- const messageSize = JSON.stringify(message).length;
- subscription.metrics.bytesTransferred += messageSize;
- }
-
- private async deliverViaWebhook(subscription: ClientSubscription, message: any): Promise<void> {
- // Webhook delivery implementation would go here
- // This would use HTTP POST to deliver the data
- throw new Error('Webhook delivery not implemented');
- }
-
- private async deliverViaEventBus(subscription: ClientSubscription, message: any): Promise<void> {
- // Event bus delivery implementation would go here
- // This would publish to the event bus
- this.emit('event-bus-delivery', subscription.request.clientId, message);
- }
-
- private sendWebSocketMessage(ws: WebSocket, message: WebSocketMessage): void {
- if (ws.readyState === WebSocket.OPEN) {
- ws.send(JSON.stringify(message));
- }
- }
-
- private handleWebSocketMessage(clientId: string, message: Buffer): void {
- try {
- const parsedMessage = JSON.parse(message.toString()) as WebSocketMessage;
-
- switch (parsedMessage.type) {
- case 'subscribe':
- this.handleWebSocketSubscribe(clientId, parsedMessage as any);
- break;
- case 'unsubscribe':
- this.handleWebSocketUnsubscribe(clientId, parsedMessage);
- break;
- case 'heartbeat':
- this.handleWebSocketHeartbeat(clientId);
- break;
- default:
- this.logger.warn({ clientId, messageType: parsedMessage.type }, 'Unknown WebSocket message type');
- }
- } catch (error) {
- this.logger.error({ clientId, error }, 'Error parsing WebSocket message');
- }
- }
-
- private async handleWebSocketSubscribe(clientId: string, message: any): Promise<void> {
- try {
- const subscriptionRequest: SubscriptionRequest = {
- id: `${clientId}-${Date.now()}`,
- clientId,
- symbols: message.payload.symbols,
- dataTypes: message.payload.dataTypes,
- filters: message.payload.filters,
- throttle: message.payload.throttle,
- delivery: {
- method: 'websocket',
- format: 'json'
- }
- };
-
- const subscriptionId = await this.subscribe(subscriptionRequest);
-
- const client = this.clients.get(clientId);
- if (client) {
- client.subscriptions.add(subscriptionId);
- }
-
- // Send confirmation
- const confirmationMessage: WebSocketMessage = {
- type: 'status',
- id: message.id,
- timestamp: Date.now(),
- payload: {
- status: 'subscribed',
- subscriptionId,
- symbols: subscriptionRequest.symbols,
- dataTypes: subscriptionRequest.dataTypes
- }
- };
-
- const ws = this.clients.get(clientId)?.ws;
- if (ws) {
- this.sendWebSocketMessage(ws, confirmationMessage);
- }
-
- } catch (error) {
- this.logger.error({ clientId, error }, 'Error handling WebSocket subscribe');
-
- // Send error message
- const errorMessage: WebSocketMessage = {
- type: 'error',
- id: message.id,
- timestamp: Date.now(),
- payload: {
- error: error instanceof Error ? error.message : 'Unknown error'
- }
- };
-
- const ws = this.clients.get(clientId)?.ws;
- if (ws) {
- this.sendWebSocketMessage(ws, errorMessage);
- }
- }
- }
-
- private async handleWebSocketUnsubscribe(clientId: string, message: WebSocketMessage): Promise<void> {
- try {
- const subscriptionId = message.payload?.subscriptionId;
- if (!subscriptionId) {
- throw new Error('Subscription ID is required');
- }
-
- await this.unsubscribe(subscriptionId);
-
- const client = this.clients.get(clientId);
- if (client) {
- client.subscriptions.delete(subscriptionId);
- }
-
- // Send confirmation
- const confirmationMessage: WebSocketMessage = {
- type: 'status',
- id: message.id,
- timestamp: Date.now(),
- payload: {
- status: 'unsubscribed',
- subscriptionId
- }
- };
-
- const ws = this.clients.get(clientId)?.ws;
- if (ws) {
- this.sendWebSocketMessage(ws, confirmationMessage);
- }
-
- } catch (error) {
- this.logger.error({ clientId, error }, 'Error handling WebSocket unsubscribe');
- }
- }
-
- private handleWebSocketHeartbeat(clientId: string): void {
- const client = this.clients.get(clientId);
- if (client) {
- client.lastPing = new Date();
-
- const heartbeatMessage: WebSocketMessage = {
- type: 'heartbeat',
- timestamp: Date.now(),
- payload: {
- serverTime: new Date().toISOString()
- }
- };
-
- this.sendWebSocketMessage(client.ws, heartbeatMessage);
- }
- }
-
- private startHeartbeat(): void {
- this.heartbeatInterval = setInterval(() => {
- const now = Date.now();
- const timeout = 60000; // 60 seconds
-
- for (const [clientId, client] of this.clients.entries()) {
- const timeSinceLastPing = now - client.lastPing.getTime();
-
- if (timeSinceLastPing > timeout) {
- this.logger.warn({ clientId }, 'Client heartbeat timeout');
- this.removeWebSocketClient(clientId);
- } else if (client.ws.readyState === WebSocket.OPEN) {
- // Send ping
- client.ws.ping();
- }
- }
- }, 30000); // Check every 30 seconds
- }
-
- private startCleanup(): void {
- this.cleanupInterval = setInterval(() => {
- const now = Date.now();
- const maxAge = 24 * 60 * 60 * 1000; // 24 hours
-
- for (const [subscriptionId, subscription] of this.subscriptions.entries()) {
- const age = now - subscription.connectedAt.getTime();
-
- if (subscription.status === 'error' || age > maxAge) {
- this.logger.info({ subscriptionId }, 'Cleaning up stale subscription');
- this.unsubscribe(subscriptionId);
- }
- }
- }, 60000); // Check every minute
- }
-
- public getMetrics() {
- return {
- totalSubscriptions: this.subscriptions.size,
- activeSubscriptions: Array.from(this.subscriptions.values())
- .filter(sub => sub.status === 'active').length,
- connectedClients: this.clients.size,
- symbolsTracked: this.symbolSubscriptions.size,
- totalMessagesDelivered: Array.from(this.subscriptions.values())
- .reduce((sum, sub) => sum + sub.metrics.messagesDelivered, 0),
- totalErrors: Array.from(this.subscriptions.values())
- .reduce((sum, sub) => sum + sub.metrics.errors, 0)
- };
- }
-}
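The deleted `SubscriptionManager` above gates deliveries with a throttle check and tracks per-subscription latency with a running average. Both rules can be restated as small pure functions (an illustration only; the function names here are mine, not from the deleted file):

```typescript
// Throttle gate from passesThrottle(): an update is allowed only if at least
// 1000 / maxUpdatesPerSecond milliseconds have elapsed since the last delivery.
function passesThrottle(lastUpdateMs: number, nowMs: number, maxUpdatesPerSecond: number): boolean {
  return nowMs - lastUpdateMs >= 1000 / maxUpdatesPerSecond;
}

// Incremental running average from deliverData():
// newAvg = (oldAvg * (n - 1) + latest) / n, where n counts the new sample.
function updateAvg(oldAvg: number, count: number, latestMs: number): number {
  return (oldAvg * (count - 1) + latestMs) / count;
}
```

The incremental form avoids storing every latency sample, at the cost of equal weighting across the subscription's lifetime; an exponential moving average would favor recent samples instead.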
diff --git a/apps/core-services/market-data-gateway/src/shared/index.ts b/apps/core-services/market-data-gateway/src/shared/index.ts
deleted file mode 100644
index 71e0850..0000000
--- a/apps/core-services/market-data-gateway/src/shared/index.ts
+++ /dev/null
@@ -1,5 +0,0 @@
-// Shared components used by both realtime and storage modules
-export { CacheManager } from '../services/CacheManager';
-export { DataNormalizer } from '../services/DataNormalizer';
-export { MetricsCollector } from '../services/MetricsCollector';
-export { ServiceIntegrationManager } from '../services/ServiceIntegrationManager';
diff --git a/apps/core-services/market-data-gateway/src/storage/ArchivalService.ts b/apps/core-services/market-data-gateway/src/storage/ArchivalService.ts
deleted file mode 100644
index 37f87d5..0000000
--- a/apps/core-services/market-data-gateway/src/storage/ArchivalService.ts
+++ /dev/null
@@ -1,52 +0,0 @@
-/**
- * Archival service for managing data lifecycle and storage tiers
- * Handles cold storage, data compression, and retention policies
- */
-export class ArchivalService {
- private compressionLevel: number;
- private retentionPolicies: Map<string, any>;
-
- constructor() {
- this.compressionLevel = 6; // Default compression level
- this.retentionPolicies = new Map();
- }
-
- /**
- * Archive old data to cold storage
- */
- async archiveData(symbol: string, cutoffDate: Date): Promise<void> {
- try {
- console.log(`Archiving data for ${symbol} before ${cutoffDate}`);
- // Implementation for archiving
- } catch (error) {
- console.error('Error archiving data:', error);
- throw error;
- }
- }
-
- /**
- * Compress data for storage optimization
- */
- async compressData(data: any[]): Promise<Buffer> {
- try {
- // Implementation for data compression
- return Buffer.from(JSON.stringify(data));
- } catch (error) {
- console.error('Error compressing data:', error);
- throw error;
- }
- }
-
- /**
- * Apply retention policies
- */
- async applyRetentionPolicies(): Promise<void> {
- try {
- console.log('Applying retention policies...');
- // Implementation for applying retention policies
- } catch (error) {
- console.error('Error applying retention policies:', error);
- throw error;
- }
- }
-}
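The deleted `ArchivalService` stub stored a `compressionLevel` of 6 but never compressed anything (`compressData` just wrapped the JSON in a Buffer). A minimal sketch of what it might have done with Node's built-in `zlib`, assuming gzip was the intended codec:

```typescript
import { gzipSync, gunzipSync } from "zlib";

// Hypothetical implementation: serialize to JSON, then gzip at the
// service's default level of 6 (zlib accepts levels 0-9).
function compressData(data: unknown[], level = 6): Buffer {
  return gzipSync(Buffer.from(JSON.stringify(data)), { level });
}

function decompressData(buf: Buffer): unknown[] {
  return JSON.parse(gunzipSync(buf).toString("utf8"));
}
```

Level 6 is zlib's own default trade-off between speed and ratio, which is presumably why the stub picked it.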
diff --git a/apps/core-services/market-data-gateway/src/storage/QueryEngine.ts b/apps/core-services/market-data-gateway/src/storage/QueryEngine.ts
deleted file mode 100644
index cc29f65..0000000
--- a/apps/core-services/market-data-gateway/src/storage/QueryEngine.ts
+++ /dev/null
@@ -1,46 +0,0 @@
-import { TimeSeriesStorage } from './TimeSeriesStorage';
-
-/**
- * Query engine for efficient historical data retrieval
- * Optimizes queries and provides various aggregation capabilities
- */
-export class QueryEngine {
- private storage: TimeSeriesStorage;
-
- constructor(storage: TimeSeriesStorage) {
- this.storage = storage;
- }
-
- /**
- * Execute optimized query with caching
- */
- async executeQuery(queryParams: any): Promise<any[]> {
- try {
- // Implementation for optimized queries
- console.log('Executing optimized query:', queryParams);
- return [];
- } catch (error) {
- console.error('Error executing query:', error);
- throw error;
- }
- }
-
- /**
- * Aggregate data by time intervals
- */
- async aggregateByInterval(
- symbol: string,
- interval: string,
- startTime: Date,
- endTime: Date
- ): Promise<any[]> {
- try {
- // Implementation for aggregation
- console.log(`Aggregating ${symbol} by ${interval}`);
- return [];
- } catch (error) {
- console.error('Error aggregating data:', error);
- throw error;
- }
- }
-}
diff --git a/apps/core-services/market-data-gateway/src/storage/TimeSeriesStorage.ts b/apps/core-services/market-data-gateway/src/storage/TimeSeriesStorage.ts
deleted file mode 100644
index 37e6113..0000000
--- a/apps/core-services/market-data-gateway/src/storage/TimeSeriesStorage.ts
+++ /dev/null
@@ -1,78 +0,0 @@
-import { CacheManager } from '../services/CacheManager';
-import { DataNormalizer } from '../services/DataNormalizer';
-import { MetricsCollector } from '../services/MetricsCollector';
-
-/**
- * Historical data storage and retrieval service
- * Handles time-series storage, archival, and query capabilities
- */
-export class TimeSeriesStorage {
- private cache: CacheManager;
- private normalizer: DataNormalizer;
- private metrics: MetricsCollector;
-
- constructor(
- cache: CacheManager,
- normalizer: DataNormalizer,
- metrics: MetricsCollector
- ) {
- this.cache = cache;
- this.normalizer = normalizer;
- this.metrics = metrics;
- }
-
- /**
- * Store historical market data
- */
- async storeHistoricalData(symbol: string, data: any[]): Promise<void> {
- try {
- // Implementation for storing historical data
- console.log(`Storing historical data for ${symbol}:`, data.length, 'records');
- await this.metrics.incrementCounter('historical_data_stored', { symbol });
- } catch (error) {
- console.error('Error storing historical data:', error);
- throw error;
- }
- }
-
- /**
- * Query historical data by time range
- */
- async queryTimeRange(
- symbol: string,
- startTime: Date,
- endTime: Date,
- interval?: string
- ): Promise<any[]> {
- try {
- // Implementation for querying time range data
- console.log(`Querying ${symbol} from ${startTime} to ${endTime}`);
- await this.metrics.incrementCounter('historical_query', { symbol });
-
- // Return mock data for now
- return [];
- } catch (error) {
- console.error('Error querying historical data:', error);
- throw error;
- }
- }
-
- /**
- * Get data statistics and metadata
- */
- async getDataStats(symbol: string): Promise<any> {
- try {
- // Implementation for getting data statistics
- return {
- symbol,
- recordCount: 0,
- firstRecord: null,
- lastRecord: null,
- intervals: []
- };
- } catch (error) {
- console.error('Error getting data stats:', error);
- throw error;
- }
- }
-}
diff --git a/apps/core-services/market-data-gateway/src/storage/index.ts b/apps/core-services/market-data-gateway/src/storage/index.ts
deleted file mode 100644
index ccdc38c..0000000
--- a/apps/core-services/market-data-gateway/src/storage/index.ts
+++ /dev/null
@@ -1,4 +0,0 @@
-// Storage and historical data components
-export { TimeSeriesStorage } from './TimeSeriesStorage';
-export { QueryEngine } from './QueryEngine';
-export { ArchivalService } from './ArchivalService';
diff --git a/apps/core-services/market-data-gateway/src/types/MarketDataGateway.ts b/apps/core-services/market-data-gateway/src/types/MarketDataGateway.ts
deleted file mode 100644
index fb00621..0000000
--- a/apps/core-services/market-data-gateway/src/types/MarketDataGateway.ts
+++ /dev/null
@@ -1,426 +0,0 @@
-// Market Data Gateway Types - Consolidated and organized
-
-// Market Data Types
-export interface MarketDataTick {
- symbol: string;
- timestamp: number;
- price: number;
- volume: number;
- bid?: number;
- ask?: number;
- bidSize?: number;
- askSize?: number;
- source: string;
- exchange?: string;
- lastTradeSize?: number;
- dayHigh?: number;
- dayLow?: number;
- dayOpen?: number;
- prevClose?: number;
- change?: number;
- changePercent?: number;
-}
-
-export interface MarketDataCandle {
- symbol: string;
- timestamp: number;
- open: number;
- high: number;
- low: number;
- close: number;
- volume: number;
- timeframe: string;
- source: string;
- exchange?: string;
- vwap?: number;
- trades?: number;
-}
-
-export interface MarketDataTrade {
- id: string;
- symbol: string;
- timestamp: number;
- price: number;
- size: number;
- side: 'buy' | 'sell';
- source: string;
- exchange?: string;
- conditions?: string[];
-}
-
-export interface MarketDataOrder {
- id: string;
- symbol: string;
- timestamp: number;
- side: 'buy' | 'sell';
- price: number;
- size: number;
- source: string;
- exchange?: string;
- orderType?: 'market' | 'limit' | 'stop';
- level?: number;
-}
-
-// Data Source Configuration
-export interface DataSourceConfig {
- id: string;
- name: string;
- type: 'websocket' | 'rest' | 'fix' | 'stream';
- provider: string;
- enabled: boolean;
- priority: number;
- rateLimit: {
- requestsPerSecond: number;
- burstLimit: number;
- };
- connection: {
- url: string;
- headers?: Record<string, string>;
- queryParams?: Record<string, string>;
- authentication?: {
- type: 'apikey' | 'oauth' | 'basic' | 'jwt';
- credentials: Record<string, string>;
- };
- };
- subscriptions: {
- quotes: boolean;
- trades: boolean;
- orderbook: boolean;
- candles: boolean;
- news: boolean;
- };
- symbols: string[];
- retryPolicy: {
- maxRetries: number;
- backoffMultiplier: number;
- maxBackoffMs: number;
- };
- healthCheck: {
- intervalMs: number;
- timeoutMs: number;
- expectedLatencyMs: number;
- };
-}
-
-// Data Processing Pipeline
-export interface DataProcessor {
- id: string;
- name: string;
- type: 'enrichment' | 'validation' | 'normalization' | 'aggregation' | 'filter';
- enabled: boolean;
- priority: number;
- config: Record<string, any>;
- process(data: MarketDataTick | MarketDataCandle | MarketDataTrade): Promise<MarketDataTick | MarketDataCandle | MarketDataTrade | null>;
-}
-
-export interface ProcessingPipeline {
- id: string;
- name: string;
- processors: DataProcessor[];
- inputFilter: {
- symbols?: string[];
- sources?: string[];
- dataTypes?: string[];
- };
- outputTargets: {
- eventBus?: boolean;
- database?: boolean;
- cache?: boolean;
- websocket?: boolean;
- dataProcessor?: boolean;
- featureStore?: boolean;
- };
-}
-
-// ProcessingPipelineConfig is an alias for ProcessingPipeline
-export type ProcessingPipelineConfig = ProcessingPipeline;
-
-// Subscription Management
-export interface SubscriptionRequest {
- id: string;
- clientId: string;
- symbols: string[];
- dataTypes: ('quotes' | 'trades' | 'orderbook' | 'candles' | 'news')[];
- filters?: {
- priceRange?: { min: number; max: number };
- volumeThreshold?: number;
- exchanges?: string[];
- };
- throttle?: {
- maxUpdatesPerSecond: number;
- aggregationWindow?: number;
- };
- delivery: {
- method: 'websocket' | 'webhook' | 'eventbus';
- endpoint?: string;
- format: 'json' | 'protobuf' | 'avro';
- };
-}
-
-export interface ClientSubscription {
- request: SubscriptionRequest;
- status: 'active' | 'paused' | 'error' | 'stopped';
- connectedAt: Date;
- lastUpdate: Date;
- metrics: {
- messagesDelivered: number;
- bytesTransferred: number;
- errors: number;
- avgLatencyMs: number;
- };
-}
-
-// Gateway Configuration
-export interface GatewayConfig {
- server: {
- port: number;
- host: string;
- maxConnections: number;
- cors: {
- origins: string[];
- methods: string[];
- headers: string[];
- };
- };
- dataSources: DataSourceConfig[];
- processing: {
- pipelines: ProcessingPipeline[];
- bufferSize: number;
- batchSize: number;
- flushIntervalMs: number;
- };
- cache: {
- redis: {
- host: string;
- port: number;
- password?: string;
- db: number;
- };
- ttl: {
- quotes: number;
- trades: number;
- candles: number;
- orderbook: number;
- };
- };
- monitoring: {
- metrics: {
- enabled: boolean;
- port: number;
- intervalMs: number;
- retention: string;
- };
- alerts: {
- enabled: boolean;
- thresholds: {
- errorRate: number;
- latency: number;
- latencyMs: number;
- connectionLoss: number;
- };
- };
- };
-}
-
-// Metrics and Monitoring
-export interface DataSourceMetrics {
- sourceId: string;
- status: 'connected' | 'disconnected' | 'error';
- messagesReceived: number;
- bytesReceived: number;
- latencyMs: number;
- errorCount: number;
- lastUpdate: Date;
-}
-
-export interface GatewayMetrics {
- timestamp: Date;
- uptime: number;
- system: {
- cpuUsage: number;
- memoryUsage: number;
- diskUsage: number;
- networkIO: {
- bytesIn: number;
- bytesOut: number;
- };
- };
- dataSources: DataSourceMetrics[];
- subscriptions: {
- total: number;
- active: number;
- byDataType: Record<string, number>;
- };
- processing: {
- messagesPerSecond: number;
- avgProcessingTimeMs: number;
- queueDepth: number;
- errorRate: number;
- };
-}
-
-// Health Check
-export interface HealthStatus {
- service: string;
- status: 'healthy' | 'degraded' | 'unhealthy';
- timestamp: Date;
- uptime: number;
- version: string;
- dependencies: {
- name: string;
- status: 'healthy' | 'unhealthy';
- latencyMs?: number;
- error?: string;
- }[];
- metrics: {
- connectionsActive: number;
- messagesPerSecond: number;
- errorRate: number;
- avgLatencyMs: number;
- };
-}
-
-// WebSocket Types
-export interface WebSocketMessage {
- type: string;
- payload: any;
- timestamp: number;
- id?: string;
-}
-
-export interface WebSocketSubscribeMessage extends WebSocketMessage {
- type: 'subscribe';
- payload: SubscriptionRequest;
-}
-
-export interface WebSocketDataMessage extends WebSocketMessage {
- type: 'data';
- payload: MarketDataTick | MarketDataTrade | MarketDataCandle | MarketDataOrder;
- dataType?: string;
-}
-
-// Error Types
-export interface DataSourceError {
- sourceId: string;
- timestamp: Date;
- type: 'connection' | 'authentication' | 'ratelimit' | 'data' | 'timeout';
- message: string;
- details?: Record<string, any>;
- severity: 'low' | 'medium' | 'high' | 'critical';
- recoverable: boolean;
-}
-
-// Event Types
-export interface MarketDataEvent {
- id: string;
- type: 'market.tick' | 'market.trade' | 'market.candle' | 'market.orderbook';
- source: 'market-data-gateway';
- timestamp: Date;
- data: MarketDataTick | MarketDataTrade | MarketDataCandle | MarketDataOrder;
- metadata?: Record<string, any>;
-}
-
-// Processing and Integration Types
-export interface ProcessingError {
- code: string;
- message: string;
- timestamp: Date;
- data?: any;
- source?: string;
-}
-
-export interface ServiceIntegration {
- serviceName: string;
- endpoint: string;
- enabled: boolean;
- config: Record<string, any>;
- dataProcessor: {
- enabled: boolean;
- endpoint: string;
- timeout: number;
- retries: number;
- };
- featureStore: {
- enabled: boolean;
- endpoint: string;
- timeout: number;
- retries: number;
- };
- dataCatalog: {
- enabled: boolean;
- endpoint: string;
- timeout: number;
- retries: number;
- };
-}
-
-export interface Logger {
- info(message: string, ...args: any[]): void;
- error(message: string, ...args: any[]): void;
- warn(message: string, ...args: any[]): void;
- debug(message: string, ...args: any[]): void;
-}
-
-export interface ProcessedData {
- source: string;
- timestamp: Date;
- data: any;
- processedAt: Date;
- metadata?: Record<string, any>;
-}
-
-export interface DataPipelineJob {
- id: string;
- type: string;
- status: 'pending' | 'running' | 'completed' | 'failed';
- data: any;
- createdAt: Date;
- startedAt?: Date;
- completedAt?: Date;
-}
-
-export interface FeatureComputationRequest {
- featureGroupId: string;
- features: string[];
- data: any;
- timestamp: Date;
- metadata?: Record<string, any>;
-}
-
-export interface DataAsset {
- id: string;
- name: string;
- type: string;
- source: string;
- metadata: Record<string, any>;
- createdAt: Date;
- updatedAt: Date;
-}
-
-// Add missing types
-export interface CacheConfig {
- redis: {
- host: string;
- port: number;
- password?: string;
- db: number;
- };
- ttl: {
- quotes: number;
- trades: number;
- candles: number;
- orderbook: number;
- };
-}
-
-export interface ProcessingMetrics {
- totalProcessed: number;
- processedPerSecond: number;
- processingLatency: number;
- errorCount: number;
-}
-
-export interface SubscriptionMetrics {
- totalSubscriptions: number;
- messagesSent: number;
- sendRate: number;
-}
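Note that none of the payload interfaces deleted above carry an explicit type tag; the gateway's `SubscriptionManager.getDataType` classified payloads by shape instead. Restated as a standalone function, directly from the deleted service code:

```typescript
// Shape-based classification from SubscriptionManager.getDataType:
// checked in order, since a trade also has a price and a tick does too.
function getDataType(data: any): string {
  if (data.id && data.side) return "trades"; // MarketDataTrade
  if (data.open !== undefined && data.high !== undefined) return "candles"; // MarketDataCandle
  if (data.price !== undefined) return "quotes"; // MarketDataTick
  if (data.bids || data.asks) return "orderbook";
  return "unknown";
}
```

The check order matters: `MarketDataTrade` has both `id`/`side` and `price`, so testing `price` first would misclassify trades as quotes.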
diff --git a/apps/core-services/market-data-gateway/tsconfig.json b/apps/core-services/market-data-gateway/tsconfig.json
deleted file mode 100644
index 175764e..0000000
--- a/apps/core-services/market-data-gateway/tsconfig.json
+++ /dev/null
@@ -1,32 +0,0 @@
-{
- "extends": "../../../tsconfig.json",
- "compilerOptions": {
- "outDir": "./dist",
- "module": "ESNext",
- "moduleResolution": "bundler",
- "types": ["bun-types"],
- "baseUrl": "../../../",
- "paths": {
- "@stock-bot/*": ["libs/*/src", "libs/*/dist"]
- },
- "rootDir": "../../../"
- },
- "include": [
- "src/**/*",
- "../../../libs/*/src/**/*"
- ],
- "exclude": [
- "node_modules",
- "dist",
- "../../../libs/*/examples/**/*",
- "../../../libs/**/*.test.ts",
- "../../../libs/**/*.spec.ts"
- ],
- "references": [
- { "path": "../../../libs/config" },
- { "path": "../../../libs/types" },
- { "path": "../../../libs/logger" },
- { "path": "../../../libs/http-client" },
- { "path": "../../../libs/event-bus" }
- ]
-}
diff --git a/apps/core-services/risk-guardian/package.json b/apps/core-services/risk-guardian/package.json
deleted file mode 100644
index 9760927..0000000
--- a/apps/core-services/risk-guardian/package.json
+++ /dev/null
@@ -1,22 +0,0 @@
-{
- "name": "risk-guardian",
- "version": "1.0.0",
- "description": "Real-time risk monitoring and controls service",
- "main": "src/index.ts",
- "scripts": {
- "dev": "bun run --watch src/index.ts",
- "start": "bun run src/index.ts",
- "test": "echo 'No tests yet'"
- },
- "dependencies": {
- "hono": "^4.6.3",
- "ioredis": "^5.4.1",
- "@stock-bot/config": "*",
- "@stock-bot/types": "*",
- "ws": "^8.18.0"
- },
- "devDependencies": {
- "bun-types": "^1.2.15",
- "@types/ws": "^8.5.12"
- }
-}
diff --git a/apps/core-services/risk-guardian/src/index.ts b/apps/core-services/risk-guardian/src/index.ts
deleted file mode 100644
index 7f64d1c..0000000
--- a/apps/core-services/risk-guardian/src/index.ts
+++ /dev/null
@@ -1,245 +0,0 @@
-import { Hono } from 'hono';
-import { WebSocketServer } from 'ws';
-import Redis from 'ioredis';
-
-const app = new Hono();
-const redis = new Redis({
- host: process.env.REDIS_HOST || 'localhost',
- port: parseInt(process.env.REDIS_PORT || '6379'),
- enableReadyCheck: false,
- maxRetriesPerRequest: null,
-});
-
-// WebSocket server for real-time risk alerts
-const wss = new WebSocketServer({ port: 8081 });
-
-// Risk thresholds configuration
-interface RiskThresholds {
- maxPositionSize: number;
- maxDailyLoss: number;
- maxPortfolioRisk: number;
- volatilityLimit: number;
-}
-
-const defaultThresholds: RiskThresholds = {
- maxPositionSize: 100000, // $100k max position
- maxDailyLoss: 10000, // $10k max daily loss
- maxPortfolioRisk: 0.02, // 2% portfolio risk
- volatilityLimit: 0.3 // 30% volatility limit
-};
-
-// Health check endpoint
-app.get('/health', (c) => {
- return c.json({
- service: 'risk-guardian',
- status: 'healthy',
- timestamp: new Date(),
- version: '1.0.0',
- connections: wss.clients.size
- });
-});
-
-// Get risk thresholds
-app.get('/api/risk/thresholds', async (c) => {
- try {
- const thresholds = await redis.hgetall('risk:thresholds');
- const parsedThresholds = Object.keys(thresholds).length > 0
- ? Object.fromEntries(
- Object.entries(thresholds).map(([k, v]) => [k, parseFloat(v as string)])
- )
- : defaultThresholds;
-
- return c.json({
- success: true,
- data: parsedThresholds
- });
- } catch (error) {
- console.error('Error fetching risk thresholds:', error);
- return c.json({ success: false, error: 'Failed to fetch thresholds' }, 500);
- }
-});
-
-// Update risk thresholds
-app.put('/api/risk/thresholds', async (c) => {
- try {
- const thresholds = await c.req.json();
- await redis.hmset('risk:thresholds', thresholds);
-
- // Broadcast threshold update to connected clients
- const message = JSON.stringify({
- type: 'THRESHOLD_UPDATE',
- data: thresholds,
- timestamp: new Date()
- });
-
- wss.clients.forEach(client => {
- if (client.readyState === 1) { // WebSocket.OPEN
- client.send(message);
- }
- });
-
- return c.json({ success: true, data: thresholds });
- } catch (error) {
- console.error('Error updating risk thresholds:', error);
- return c.json({ success: false, error: 'Failed to update thresholds' }, 500);
- }
-});
-
-// Real-time risk monitoring endpoint
-app.post('/api/risk/evaluate', async (c) => {
- try {
- const { symbol, quantity, price, portfolioValue } = await c.req.json();
-
- const thresholds = await redis.hgetall('risk:thresholds');
- const activeThresholds = Object.keys(thresholds).length > 0
- ? Object.fromEntries(
- Object.entries(thresholds).map(([k, v]) => [k, parseFloat(v as string)])
- )
- : defaultThresholds;
-
- const positionValue = quantity * price;
- const positionRisk = positionValue / portfolioValue;
-
- const riskEvaluation = {
- symbol,
- positionValue,
- positionRisk,
- violations: [] as string[],
- riskLevel: 'LOW' as 'LOW' | 'MEDIUM' | 'HIGH'
- };
-
- // Check risk violations
- if (positionValue > activeThresholds.maxPositionSize) {
- riskEvaluation.violations.push(`Position size exceeds limit: $${positionValue.toLocaleString()}`);
- }
-
- if (positionRisk > activeThresholds.maxPortfolioRisk) {
- riskEvaluation.violations.push(`Portfolio risk exceeds limit: ${(positionRisk * 100).toFixed(2)}%`);
- }
-
- // Determine risk level
- if (riskEvaluation.violations.length > 0) {
- riskEvaluation.riskLevel = 'HIGH';
- } else if (positionRisk > activeThresholds.maxPortfolioRisk * 0.7) {
- riskEvaluation.riskLevel = 'MEDIUM';
- }
-
- // Store risk evaluation
- await redis.setex(
- `risk:evaluation:${symbol}:${Date.now()}`,
- 3600, // 1 hour TTL
- JSON.stringify(riskEvaluation)
- );
-
- // Send real-time alert if high risk
- if (riskEvaluation.riskLevel === 'HIGH') {
- const alert = {
- type: 'RISK_ALERT',
- level: 'HIGH',
- data: riskEvaluation,
- timestamp: new Date()
- };
-
- wss.clients.forEach(client => {
- if (client.readyState === 1) {
- client.send(JSON.stringify(alert));
- }
- });
- }
-
- return c.json({ success: true, data: riskEvaluation });
- } catch (error) {
- console.error('Error evaluating risk:', error);
- return c.json({ success: false, error: 'Failed to evaluate risk' }, 500);
- }
-});
-
-// Get risk history
-app.get('/api/risk/history', async (c) => {
- try {
- const keys = await redis.keys('risk:evaluation:*');
- const evaluations: any[] = [];
-
- for (const key of keys.slice(0, 100)) { // Limit to 100 recent evaluations
- const data = await redis.get(key);
- if (data) {
- evaluations.push(JSON.parse(data));
- }
- }
-
- return c.json({
- success: true,
- data: evaluations.sort((a: any, b: any) => new Date(b.timestamp).getTime() - new Date(a.timestamp).getTime())
- });
- } catch (error) {
- console.error('Error fetching risk history:', error);
- return c.json({ success: false, error: 'Failed to fetch risk history' }, 500);
- }
-});
-
-// WebSocket connection handling
-wss.on('connection', (ws) => {
- console.log('New risk monitoring client connected');
-
- // Send welcome message
- ws.send(JSON.stringify({
- type: 'CONNECTED',
- message: 'Connected to Risk Guardian',
- timestamp: new Date()
- }));
-
- ws.on('close', () => {
- console.log('Risk monitoring client disconnected');
- });
-
- ws.on('error', (error) => {
- console.error('WebSocket error:', error);
- });
-});
-
-// Redis event subscriptions for cross-service communication
-redis.subscribe('trading:position:opened', 'trading:position:closed');
-
-redis.on('message', async (channel, message) => {
- try {
- const data = JSON.parse(message);
-
- if (channel === 'trading:position:opened') {
- // Auto-evaluate risk for new positions
- const evaluation = await evaluatePositionRisk(data);
-
- // Broadcast to connected clients
- wss.clients.forEach(client => {
- if (client.readyState === 1) {
- client.send(JSON.stringify({
- type: 'POSITION_RISK_UPDATE',
- data: evaluation,
- timestamp: new Date()
- }));
- }
- });
- }
- } catch (error) {
- console.error('Error processing Redis message:', error);
- }
-});
-
-async function evaluatePositionRisk(position: any) {
- // Implementation would evaluate position against current thresholds
- // This is a simplified version
- return {
- symbol: position.symbol,
- riskLevel: 'LOW',
- timestamp: new Date()
- };
-}
-
-const port = parseInt(process.env.PORT || '3002');
-
-console.log(`🛡️ Risk Guardian starting on port ${port}`);
-console.log(`📡 WebSocket server running on port 8081`);
-
-export default {
- port,
- fetch: app.fetch,
-};
diff --git a/apps/core-services/risk-guardian/tsconfig.json b/apps/core-services/risk-guardian/tsconfig.json
deleted file mode 100644
index 168d88b..0000000
--- a/apps/core-services/risk-guardian/tsconfig.json
+++ /dev/null
@@ -1,12 +0,0 @@
-{
- "extends": "../../../tsconfig.json",
- "compilerOptions": {
- "outDir": "./dist",
- "rootDir": "./src",
- "module": "ESNext",
- "moduleResolution": "bundler",
- "types": ["bun-types"]
- },
- "include": ["src/**/*"],
- "exclude": ["node_modules", "dist"]
-}
diff --git a/apps/data-services/data-catalog/package.json b/apps/data-services/data-catalog/package.json
deleted file mode 100644
index f7fcb1a..0000000
--- a/apps/data-services/data-catalog/package.json
+++ /dev/null
@@ -1,40 +0,0 @@
-{
- "name": "@stock-bot/data-catalog",
- "version": "1.0.0",
- "private": true,
- "description": "Data catalog and discovery service for stock-bot",
- "type": "module",
- "main": "dist/index.js",
- "scripts": {
- "dev": "bun run --hot src/index.ts",
- "build": "tsc",
- "start": "node dist/index.js",
- "test": "bun test",
- "type-check": "tsc --noEmit"
- },
- "dependencies": {
- "@stock-bot/types": "workspace:*",
- "@stock-bot/utils": "workspace:*",
- "@stock-bot/event-bus": "workspace:*",
- "@stock-bot/api-client": "workspace:*",
- "hono": "^4.0.0",
- "zod": "^3.22.0",
- "elasticsearch": "^16.7.3",
- "neo4j-driver": "^5.15.0",
- "cron": "^3.1.6",
- "uuid": "^9.0.1"
- },
- "devDependencies": {
- "@types/uuid": "^9.0.8",
- "@types/cron": "^2.4.0",
- "@types/node": "^20.0.0",
- "typescript": "^5.3.0"
- },
- "keywords": [
- "data-catalog",
- "data-discovery",
- "data-lineage",
- "metadata",
- "stock-bot"
- ]
-}
diff --git a/apps/data-services/data-catalog/src/controllers/DataCatalogController.ts b/apps/data-services/data-catalog/src/controllers/DataCatalogController.ts
deleted file mode 100644
index 828917f..0000000
--- a/apps/data-services/data-catalog/src/controllers/DataCatalogController.ts
+++ /dev/null
@@ -1,360 +0,0 @@
-import { Context } from 'hono';
-import { Logger } from '@stock-bot/utils';
-import { DataCatalogService } from '../services/DataCatalogService';
-import {
- CreateDataAssetRequest,
- UpdateDataAssetRequest,
- DataAssetType,
- DataClassification
-} from '../types/DataCatalog';
-
-export class DataCatalogController {
- constructor(
- private dataCatalogService: DataCatalogService,
- private logger: Logger
- ) {}
-
- async createAsset(c: Context) {
- try {
- const request: CreateDataAssetRequest = await c.req.json();
-
- // Validate required fields
- if (!request.name || !request.type || !request.description || !request.owner) {
- return c.json({ error: 'Missing required fields: name, type, description, owner' }, 400);
- }
-
- const asset = await this.dataCatalogService.createAsset(request);
-
- this.logger.info('Asset created via API', {
- assetId: asset.id,
- name: asset.name,
- type: asset.type
- });
-
- return c.json(asset, 201);
- } catch (error) {
- this.logger.error('Failed to create asset', { error });
- return c.json({ error: 'Internal server error' }, 500);
- }
- }
-
- async getAsset(c: Context) {
- try {
- const assetId = c.req.param('id');
-
- if (!assetId) {
- return c.json({ error: 'Asset ID is required' }, 400);
- }
-
- const asset = await this.dataCatalogService.getAsset(assetId);
-
- if (!asset) {
- return c.json({ error: 'Asset not found' }, 404);
- }
-
- return c.json(asset);
- } catch (error) {
- this.logger.error('Failed to get asset', { error });
- return c.json({ error: 'Internal server error' }, 500);
- }
- }
-
- async updateAsset(c: Context) {
- try {
- const assetId = c.req.param('id');
- const updates: UpdateDataAssetRequest = await c.req.json();
-
- if (!assetId) {
- return c.json({ error: 'Asset ID is required' }, 400);
- }
-
- const asset = await this.dataCatalogService.updateAsset(assetId, updates);
-
- if (!asset) {
- return c.json({ error: 'Asset not found' }, 404);
- }
-
- this.logger.info('Asset updated via API', {
- assetId,
- changes: Object.keys(updates)
- });
-
- return c.json(asset);
- } catch (error) {
- this.logger.error('Failed to update asset', { error });
- return c.json({ error: 'Internal server error' }, 500);
- }
- }
-
- async deleteAsset(c: Context) {
- try {
- const assetId = c.req.param('id');
-
- if (!assetId) {
- return c.json({ error: 'Asset ID is required' }, 400);
- }
-
- await this.dataCatalogService.deleteAsset(assetId);
-
- this.logger.info('Asset deleted via API', { assetId });
-
- return c.json({ message: 'Asset deleted successfully' });
- } catch (error) {
- this.logger.error('Failed to delete asset', { error });
-
- if (error instanceof Error && error.message.includes('not found')) {
- return c.json({ error: 'Asset not found' }, 404);
- }
-
- return c.json({ error: 'Internal server error' }, 500);
- }
- }
-
- async listAssets(c: Context) {
- try {
- const query = c.req.query();
-      const filters: Record<string, any> = {};
-
- // Parse query parameters
- if (query.type) filters.type = query.type;
- if (query.owner) filters.owner = query.owner;
- if (query.classification) filters.classification = query.classification;
- if (query.tags) {
- filters.tags = Array.isArray(query.tags) ? query.tags : [query.tags];
- }
-
- const assets = await this.dataCatalogService.listAssets(filters);
-
- return c.json({
- assets,
- total: assets.length,
- filters: filters
- });
- } catch (error) {
- this.logger.error('Failed to list assets', { error });
- return c.json({ error: 'Internal server error' }, 500);
- }
- }
-
- async searchAssets(c: Context) {
- try {
- const query = c.req.query('q');
- const queryParams = c.req.query();
-
- if (!query) {
- return c.json({ error: 'Search query is required' }, 400);
- }
-
-      const filters: Record<string, any> = {};
- if (queryParams.type) filters.type = queryParams.type;
- if (queryParams.owner) filters.owner = queryParams.owner;
- if (queryParams.classification) filters.classification = queryParams.classification;
-
- const assets = await this.dataCatalogService.searchAssets(query, filters);
-
- this.logger.info('Asset search performed', {
- query,
- filters,
- resultCount: assets.length
- });
-
- return c.json({
- assets,
- total: assets.length,
- query,
- filters
- });
- } catch (error) {
- this.logger.error('Failed to search assets', { error });
- return c.json({ error: 'Internal server error' }, 500);
- }
- }
-
- async getAssetsByOwner(c: Context) {
- try {
- const owner = c.req.param('owner');
-
- if (!owner) {
- return c.json({ error: 'Owner is required' }, 400);
- }
-
- const assets = await this.dataCatalogService.getAssetsByOwner(owner);
-
- return c.json({
- assets,
- total: assets.length,
- owner
- });
- } catch (error) {
- this.logger.error('Failed to get assets by owner', { error });
- return c.json({ error: 'Internal server error' }, 500);
- }
- }
-
- async getAssetsByType(c: Context) {
- try {
- const type = c.req.param('type') as DataAssetType;
-
- if (!type) {
- return c.json({ error: 'Asset type is required' }, 400);
- }
-
- if (!Object.values(DataAssetType).includes(type)) {
- return c.json({ error: 'Invalid asset type' }, 400);
- }
-
- const assets = await this.dataCatalogService.getAssetsByType(type);
-
- return c.json({
- assets,
- total: assets.length,
- type
- });
- } catch (error) {
- this.logger.error('Failed to get assets by type', { error });
- return c.json({ error: 'Internal server error' }, 500);
- }
- }
-
- async getAssetsByClassification(c: Context) {
- try {
- const classification = c.req.param('classification') as DataClassification;
-
- if (!classification) {
- return c.json({ error: 'Classification is required' }, 400);
- }
-
- if (!Object.values(DataClassification).includes(classification)) {
- return c.json({ error: 'Invalid classification' }, 400);
- }
-
- const assets = await this.dataCatalogService.getAssetsByClassification(classification);
-
- return c.json({
- assets,
- total: assets.length,
- classification
- });
- } catch (error) {
- this.logger.error('Failed to get assets by classification', { error });
- return c.json({ error: 'Internal server error' }, 500);
- }
- }
-
- async getAssetsByTags(c: Context) {
- try {
- const tagsParam = c.req.query('tags');
-
- if (!tagsParam) {
- return c.json({ error: 'Tags parameter is required' }, 400);
- }
-
- const tags = Array.isArray(tagsParam) ? tagsParam : [tagsParam];
- const assets = await this.dataCatalogService.getAssetsByTags(tags);
-
- return c.json({
- assets,
- total: assets.length,
- tags
- });
- } catch (error) {
- this.logger.error('Failed to get assets by tags', { error });
- return c.json({ error: 'Internal server error' }, 500);
- }
- }
-
- async getAssetMetrics(c: Context) {
- try {
- const assetId = c.req.param('id');
-
- if (!assetId) {
- return c.json({ error: 'Asset ID is required' }, 400);
- }
-
- const asset = await this.dataCatalogService.getAsset(assetId);
-
- if (!asset) {
- return c.json({ error: 'Asset not found' }, 404);
- }
-
- const metrics = {
- id: asset.id,
- name: asset.name,
- type: asset.type,
- classification: asset.classification,
- usage: {
- accessCount: asset.usage.accessCount,
- uniqueUsers: asset.usage.uniqueUsers,
- lastAccessed: asset.usage.lastAccessed,
- usageTrend: asset.usage.usageTrend
- },
- quality: {
- overallScore: asset.quality.overallScore,
- lastAssessment: asset.quality.lastAssessment,
- issueCount: asset.quality.issues.filter(issue => !issue.resolved).length
- },
- governance: {
- policiesApplied: asset.governance.policies.length,
- complianceStatus: asset.governance.compliance.every(c => c.status === 'passed') ? 'compliant' : 'non-compliant',
- auditEntries: asset.governance.audit.length
- },
- lineage: {
- upstreamCount: asset.lineage.upstreamAssets.length,
- downstreamCount: asset.lineage.downstreamAssets.length
- }
- };
-
- return c.json(metrics);
- } catch (error) {
- this.logger.error('Failed to get asset metrics', { error });
- return c.json({ error: 'Internal server error' }, 500);
- }
- }
-
- async getCatalogStatistics(c: Context) {
- try {
- const allAssets = await this.dataCatalogService.listAssets();
-
- const statistics = {
- totalAssets: allAssets.length,
- assetsByType: this.groupByProperty(allAssets, 'type'),
- assetsByClassification: this.groupByProperty(allAssets, 'classification'),
- assetsByOwner: this.groupByProperty(allAssets, 'owner'),
- recentAssets: allAssets
- .sort((a, b) => b.createdAt.getTime() - a.createdAt.getTime())
- .slice(0, 10)
- .map(asset => ({
- id: asset.id,
- name: asset.name,
- type: asset.type,
- owner: asset.owner,
- createdAt: asset.createdAt
- })),
- mostAccessed: allAssets
- .sort((a, b) => b.usage.accessCount - a.usage.accessCount)
- .slice(0, 10)
- .map(asset => ({
- id: asset.id,
- name: asset.name,
- type: asset.type,
- accessCount: asset.usage.accessCount,
- lastAccessed: asset.usage.lastAccessed
- }))
- };
-
- return c.json(statistics);
- } catch (error) {
- this.logger.error('Failed to get catalog statistics', { error });
- return c.json({ error: 'Internal server error' }, 500);
- }
- }
-
- // Helper method to group assets by property
-  private groupByProperty(assets: any[], property: string): Record<string, number> {
- return assets.reduce((acc, asset) => {
- const value = asset[property];
- acc[value] = (acc[value] || 0) + 1;
- return acc;
- }, {});
- }
-}
diff --git a/apps/data-services/data-catalog/src/controllers/GovernanceController.ts b/apps/data-services/data-catalog/src/controllers/GovernanceController.ts
deleted file mode 100644
index de70890..0000000
--- a/apps/data-services/data-catalog/src/controllers/GovernanceController.ts
+++ /dev/null
@@ -1,414 +0,0 @@
-import { Hono } from 'hono';
-import { DataGovernanceService } from '../services/DataGovernanceService';
-import {
- GovernancePolicy,
- ComplianceCheck,
- AccessRequest,
- DataSubjectRequest,
- AuditLog
-} from '../types/DataCatalog';
-
-export class GovernanceController {
- private app: Hono;
- private governanceService: DataGovernanceService;
-
- constructor() {
- this.app = new Hono();
- this.governanceService = new DataGovernanceService();
- this.setupRoutes();
- }
-
- private setupRoutes() {
- // Create governance policy
- this.app.post('/policies', async (c) => {
- try {
-        const policy: Omit<GovernancePolicy, 'id' | 'createdAt' | 'updatedAt'> = await c.req.json();
- const createdPolicy = await this.governanceService.createPolicy(policy);
-
- return c.json({
- success: true,
- data: createdPolicy
- });
- } catch (error) {
- console.error('Error creating governance policy:', error);
- return c.json({
- success: false,
- error: error instanceof Error ? error.message : 'Unknown error'
- }, 500);
- }
- });
-
- // Get governance policies
- this.app.get('/policies', async (c) => {
- try {
- const type = c.req.query('type');
- const category = c.req.query('category');
- const active = c.req.query('active') === 'true';
-
- const filters: any = {};
- if (type) filters.type = type;
- if (category) filters.category = category;
- if (active !== undefined) filters.active = active;
-
- const policies = await this.governanceService.getPolicies(filters);
-
- return c.json({
- success: true,
- data: policies
- });
- } catch (error) {
- console.error('Error getting governance policies:', error);
- return c.json({
- success: false,
- error: error instanceof Error ? error.message : 'Unknown error'
- }, 500);
- }
- });
-
- // Update governance policy
- this.app.put('/policies/:policyId', async (c) => {
- try {
- const policyId = c.req.param('policyId');
-        const updates: Partial<GovernancePolicy> = await c.req.json();
-
- const updatedPolicy = await this.governanceService.updatePolicy(policyId, updates);
-
- return c.json({
- success: true,
- data: updatedPolicy
- });
- } catch (error) {
- console.error('Error updating governance policy:', error);
- return c.json({
- success: false,
- error: error instanceof Error ? error.message : 'Unknown error'
- }, 500);
- }
- });
-
- // Delete governance policy
- this.app.delete('/policies/:policyId', async (c) => {
- try {
- const policyId = c.req.param('policyId');
- await this.governanceService.deletePolicy(policyId);
-
- return c.json({
- success: true,
- message: 'Governance policy deleted successfully'
- });
- } catch (error) {
- console.error('Error deleting governance policy:', error);
- return c.json({
- success: false,
- error: error instanceof Error ? error.message : 'Unknown error'
- }, 500);
- }
- });
-
- // Apply policy to asset
- this.app.post('/policies/:policyId/apply/:assetId', async (c) => {
- try {
- const policyId = c.req.param('policyId');
- const assetId = c.req.param('assetId');
-
- await this.governanceService.applyPolicy(policyId, assetId);
-
- return c.json({
- success: true,
- message: 'Policy applied successfully'
- });
- } catch (error) {
- console.error('Error applying policy:', error);
- return c.json({
- success: false,
- error: error instanceof Error ? error.message : 'Unknown error'
- }, 500);
- }
- });
-
- // Check compliance for asset
- this.app.post('/compliance/check', async (c) => {
- try {
- const request: { assetId: string; policyIds?: string[] } = await c.req.json();
- const complianceResult = await this.governanceService.checkCompliance(
- request.assetId,
- request.policyIds
- );
-
- return c.json({
- success: true,
- data: complianceResult
- });
- } catch (error) {
- console.error('Error checking compliance:', error);
- return c.json({
- success: false,
- error: error instanceof Error ? error.message : 'Unknown error'
- }, 500);
- }
- });
-
- // Get compliance violations
- this.app.get('/compliance/violations', async (c) => {
- try {
- const assetId = c.req.query('assetId');
- const severity = c.req.query('severity');
- const status = c.req.query('status');
- const limit = c.req.query('limit') ? parseInt(c.req.query('limit')!) : 100;
- const offset = c.req.query('offset') ? parseInt(c.req.query('offset')!) : 0;
-
- const filters: any = {};
- if (assetId) filters.assetId = assetId;
- if (severity) filters.severity = severity;
- if (status) filters.status = status;
-
- const violations = await this.governanceService.getComplianceViolations(
- filters,
- { limit, offset }
- );
-
- return c.json({
- success: true,
- data: violations
- });
- } catch (error) {
- console.error('Error getting compliance violations:', error);
- return c.json({
- success: false,
- error: error instanceof Error ? error.message : 'Unknown error'
- }, 500);
- }
- });
-
- // Request access to asset
- this.app.post('/access/request', async (c) => {
- try {
-        const request: Omit<AccessRequest, 'id' | 'status'> = await c.req.json();
- const accessRequest = await this.governanceService.requestAccess(request);
-
- return c.json({
- success: true,
- data: accessRequest
- });
- } catch (error) {
- console.error('Error requesting access:', error);
- return c.json({
- success: false,
- error: error instanceof Error ? error.message : 'Unknown error'
- }, 500);
- }
- });
-
- // Approve/deny access request
- this.app.patch('/access/:requestId', async (c) => {
- try {
- const requestId = c.req.param('requestId');
- const { action, reviewedBy, reviewComments } = await c.req.json();
-
- const updatedRequest = await this.governanceService.reviewAccessRequest(
- requestId,
- action,
- reviewedBy,
- reviewComments
- );
-
- return c.json({
- success: true,
- data: updatedRequest
- });
- } catch (error) {
- console.error('Error reviewing access request:', error);
- return c.json({
- success: false,
- error: error instanceof Error ? error.message : 'Unknown error'
- }, 500);
- }
- });
-
- // Check access authorization
- this.app.post('/access/check', async (c) => {
- try {
- const { userId, assetId, action } = await c.req.json();
- const authorized = await this.governanceService.checkAccess(userId, assetId, action);
-
- return c.json({
- success: true,
- data: {
- userId,
- assetId,
- action,
- authorized
- }
- });
- } catch (error) {
- console.error('Error checking access:', error);
- return c.json({
- success: false,
- error: error instanceof Error ? error.message : 'Unknown error'
- }, 500);
- }
- });
-
- // Handle data subject request (GDPR)
- this.app.post('/privacy/subject-request', async (c) => {
- try {
-        const request: Omit<DataSubjectRequest, 'id' | 'status'> = await c.req.json();
- const subjectRequest = await this.governanceService.handleDataSubjectRequest(request);
-
- return c.json({
- success: true,
- data: subjectRequest
- });
- } catch (error) {
- console.error('Error handling data subject request:', error);
- return c.json({
- success: false,
- error: error instanceof Error ? error.message : 'Unknown error'
- }, 500);
- }
- });
-
- // Anonymize asset data
- this.app.post('/privacy/anonymize/:assetId', async (c) => {
- try {
- const assetId = c.req.param('assetId');
- const { fields, method, requestedBy } = await c.req.json();
-
- const result = await this.governanceService.anonymizeData(
- assetId,
- fields,
- method,
- requestedBy
- );
-
- return c.json({
- success: true,
- data: result
- });
- } catch (error) {
- console.error('Error anonymizing data:', error);
- return c.json({
- success: false,
- error: error instanceof Error ? error.message : 'Unknown error'
- }, 500);
- }
- });
-
- // Get audit logs
- this.app.get('/audit/logs', async (c) => {
- try {
- const assetId = c.req.query('assetId');
- const userId = c.req.query('userId');
- const action = c.req.query('action');
- const startDate = c.req.query('startDate');
- const endDate = c.req.query('endDate');
- const limit = c.req.query('limit') ? parseInt(c.req.query('limit')!) : 100;
- const offset = c.req.query('offset') ? parseInt(c.req.query('offset')!) : 0;
-
- const filters: any = {};
- if (assetId) filters.assetId = assetId;
- if (userId) filters.userId = userId;
- if (action) filters.action = action;
- if (startDate) filters.startDate = new Date(startDate);
- if (endDate) filters.endDate = new Date(endDate);
-
- const logs = await this.governanceService.getAuditLogs(filters, { limit, offset });
-
- return c.json({
- success: true,
- data: logs
- });
- } catch (error) {
- console.error('Error getting audit logs:', error);
- return c.json({
- success: false,
- error: error instanceof Error ? error.message : 'Unknown error'
- }, 500);
- }
- });
-
- // Log access event
- this.app.post('/audit/log', async (c) => {
- try {
-        const logEntry: Omit<AuditLog, 'id' | 'timestamp'> = await c.req.json();
- const logged = await this.governanceService.logAccess(logEntry);
-
- return c.json({
- success: true,
- data: logged
- });
- } catch (error) {
- console.error('Error logging access event:', error);
- return c.json({
- success: false,
- error: error instanceof Error ? error.message : 'Unknown error'
- }, 500);
- }
- });
-
- // Get retention policies
- this.app.get('/retention/policies', async (c) => {
- try {
- const assetType = c.req.query('assetType');
- const policies = await this.governanceService.getRetentionPolicies(assetType);
-
- return c.json({
- success: true,
- data: policies
- });
- } catch (error) {
- console.error('Error getting retention policies:', error);
- return c.json({
- success: false,
- error: error instanceof Error ? error.message : 'Unknown error'
- }, 500);
- }
- });
-
- // Apply retention policy
- this.app.post('/retention/apply', async (c) => {
- try {
- const { assetId, policyId, requestedBy } = await c.req.json();
- const result = await this.governanceService.applyRetentionPolicy(
- assetId,
- policyId,
- requestedBy
- );
-
- return c.json({
- success: true,
- data: result
- });
- } catch (error) {
- console.error('Error applying retention policy:', error);
- return c.json({
- success: false,
- error: error instanceof Error ? error.message : 'Unknown error'
- }, 500);
- }
- });
-
- // Get governance metrics
- this.app.get('/metrics', async (c) => {
- try {
- const timeRange = c.req.query('timeRange') || '30d';
- const metrics = await this.governanceService.getGovernanceMetrics(timeRange);
-
- return c.json({
- success: true,
- data: metrics
- });
- } catch (error) {
- console.error('Error getting governance metrics:', error);
- return c.json({
- success: false,
- error: error instanceof Error ? error.message : 'Unknown error'
- }, 500);
- }
- });
- }
-
- public getApp(): Hono {
- return this.app;
- }
-}
diff --git a/apps/data-services/data-catalog/src/controllers/HealthController.ts b/apps/data-services/data-catalog/src/controllers/HealthController.ts
deleted file mode 100644
index 8695106..0000000
--- a/apps/data-services/data-catalog/src/controllers/HealthController.ts
+++ /dev/null
@@ -1,172 +0,0 @@
-import { Hono } from 'hono';
-
-export class HealthController {
- private app: Hono;
-
- constructor() {
- this.app = new Hono();
- this.setupRoutes();
- }
-
- private setupRoutes() {
- // Basic health check
- this.app.get('/', async (c) => {
- return c.json({
- service: 'data-catalog',
- status: 'healthy',
- timestamp: new Date().toISOString(),
- version: process.env.SERVICE_VERSION || '1.0.0'
- });
- });
-
- // Detailed health check
- this.app.get('/detailed', async (c) => {
- try {
- const healthStatus = {
- service: 'data-catalog',
- status: 'healthy',
- timestamp: new Date().toISOString(),
- version: process.env.SERVICE_VERSION || '1.0.0',
- uptime: process.uptime(),
- memory: process.memoryUsage(),
- dependencies: {
- database: await this.checkDatabase(),
- search: await this.checkSearchService(),
- eventBus: await this.checkEventBus()
- }
- };
-
- // Determine overall status based on dependencies
- const hasUnhealthyDependencies = Object.values(healthStatus.dependencies)
- .some(dep => dep.status !== 'healthy');
-
- if (hasUnhealthyDependencies) {
- healthStatus.status = 'degraded';
- }
-
- const statusCode = healthStatus.status === 'healthy' ? 200 : 503;
- return c.json(healthStatus, statusCode);
- } catch (error) {
- console.error('Health check error:', error);
- return c.json({
- service: 'data-catalog',
- status: 'unhealthy',
- timestamp: new Date().toISOString(),
- error: error instanceof Error ? error.message : 'Unknown error'
- }, 503);
- }
- });
-
- // Readiness check
- this.app.get('/ready', async (c) => {
- try {
- // Check if service is ready to accept requests
- const readyChecks = await Promise.all([
- this.checkDatabase(),
- this.checkSearchService()
- ]);
-
- const isReady = readyChecks.every(check => check.status === 'healthy');
-
- if (isReady) {
- return c.json({
- service: 'data-catalog',
- ready: true,
- timestamp: new Date().toISOString()
- });
- } else {
- return c.json({
- service: 'data-catalog',
- ready: false,
- timestamp: new Date().toISOString(),
- checks: readyChecks
- }, 503);
- }
- } catch (error) {
- console.error('Readiness check error:', error);
- return c.json({
- service: 'data-catalog',
- ready: false,
- timestamp: new Date().toISOString(),
- error: error instanceof Error ? error.message : 'Unknown error'
- }, 503);
- }
- });
-
- // Liveness check
- this.app.get('/live', async (c) => {
- return c.json({
- service: 'data-catalog',
- alive: true,
- timestamp: new Date().toISOString()
- });
- });
- }
-
- private async checkDatabase(): Promise<{ name: string; status: string; responseTime?: number }> {
- const start = Date.now();
- try {
- // Simulate database check
- // In real implementation, this would ping the actual database
- await new Promise(resolve => setTimeout(resolve, 10));
-
- return {
- name: 'database',
- status: 'healthy',
- responseTime: Date.now() - start
- };
- } catch (error) {
- return {
- name: 'database',
- status: 'unhealthy',
- responseTime: Date.now() - start
- };
- }
- }
-
- private async checkSearchService(): Promise<{ name: string; status: string; responseTime?: number }> {
- const start = Date.now();
- try {
- // Simulate search service check
- // In real implementation, this would check search index health
- await new Promise(resolve => setTimeout(resolve, 5));
-
- return {
- name: 'search',
- status: 'healthy',
- responseTime: Date.now() - start
- };
- } catch (error) {
- return {
- name: 'search',
- status: 'unhealthy',
- responseTime: Date.now() - start
- };
- }
- }
-
- private async checkEventBus(): Promise<{ name: string; status: string; responseTime?: number }> {
- const start = Date.now();
- try {
- // Simulate event bus check
- // In real implementation, this would check message broker connectivity
- await new Promise(resolve => setTimeout(resolve, 3));
-
- return {
- name: 'eventBus',
- status: 'healthy',
- responseTime: Date.now() - start
- };
- } catch (error) {
- return {
- name: 'eventBus',
- status: 'unhealthy',
- responseTime: Date.now() - start
- };
- }
- }
-
- public getApp(): Hono {
- return this.app;
- }
-}
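The readiness probe in the controller above runs its dependency checks in parallel and reports ready only when every check is healthy. That timing-and-aggregation pattern can be sketched as a standalone helper; the `timedCheck` and `readiness` names and shapes here are illustrative assumptions, not the deleted service's actual API:

```typescript
// Illustrative sketch of the readiness-aggregation pattern used above.
type DependencyCheck = {
  name: string;
  status: 'healthy' | 'unhealthy';
  responseTime: number;
};

// Run a single probe, recording how long it took and whether it threw.
async function timedCheck(
  name: string,
  probe: () => Promise<void>
): Promise<DependencyCheck> {
  const start = Date.now();
  try {
    await probe();
    return { name, status: 'healthy', responseTime: Date.now() - start };
  } catch {
    return { name, status: 'unhealthy', responseTime: Date.now() - start };
  }
}

// Ready only when every dependency check succeeds (mirrors readyChecks.every).
async function readiness(probes: Record<string, () => Promise<void>>) {
  const checks = await Promise.all(
    Object.entries(probes).map(([name, probe]) => timedCheck(name, probe))
  );
  return { ready: checks.every((c) => c.status === 'healthy'), checks };
}
```

Failing probes surface as `unhealthy` entries rather than rejections, so a single broken dependency yields a 503-style "not ready" response instead of an unhandled error.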
diff --git a/apps/data-services/data-catalog/src/controllers/LineageController.ts b/apps/data-services/data-catalog/src/controllers/LineageController.ts
deleted file mode 100644
index 2e4b819..0000000
--- a/apps/data-services/data-catalog/src/controllers/LineageController.ts
+++ /dev/null
@@ -1,211 +0,0 @@
-import { Hono } from 'hono';
-import { DataLineageService } from '../services/DataLineageService';
-import { CreateLineageRequest, LineageQuery, ImpactAnalysisQuery } from '../types/DataCatalog';
-
-export class LineageController {
- private app: Hono;
- private lineageService: DataLineageService;
-
- constructor() {
- this.app = new Hono();
- this.lineageService = new DataLineageService();
- this.setupRoutes();
- }
-
- private setupRoutes() {
- // Create lineage relationship
- this.app.post('/', async (c) => {
- try {
- const request: CreateLineageRequest = await c.req.json();
- const lineage = await this.lineageService.createLineage(request);
- return c.json({
- success: true,
- data: lineage
- });
- } catch (error) {
- console.error('Error creating lineage:', error);
- return c.json({
- success: false,
- error: error instanceof Error ? error.message : 'Unknown error'
- }, 500);
- }
- });
-
- // Get lineage for asset
- this.app.get('/assets/:assetId', async (c) => {
- try {
- const assetId = c.req.param('assetId');
- const direction = c.req.query('direction') as 'upstream' | 'downstream' | 'both';
- const depth = c.req.query('depth') ? parseInt(c.req.query('depth')!) : undefined;
-
- const lineage = await this.lineageService.getAssetLineage(assetId, {
- direction: direction || 'both',
- depth: depth || 10
- });
-
- return c.json({
- success: true,
- data: lineage
- });
- } catch (error) {
- console.error('Error getting asset lineage:', error);
- return c.json({
- success: false,
- error: error instanceof Error ? error.message : 'Unknown error'
- }, 500);
- }
- });
-
- // Get upstream dependencies
- this.app.get('/assets/:assetId/upstream', async (c) => {
- try {
- const assetId = c.req.param('assetId');
- const depth = c.req.query('depth') ? parseInt(c.req.query('depth')!) : 5;
-
- const upstream = await this.lineageService.getUpstreamDependencies(assetId, depth);
-
- return c.json({
- success: true,
- data: upstream
- });
- } catch (error) {
- console.error('Error getting upstream dependencies:', error);
- return c.json({
- success: false,
- error: error instanceof Error ? error.message : 'Unknown error'
- }, 500);
- }
- });
-
- // Get downstream dependencies
- this.app.get('/assets/:assetId/downstream', async (c) => {
- try {
- const assetId = c.req.param('assetId');
- const depth = c.req.query('depth') ? parseInt(c.req.query('depth')!) : 5;
-
- const downstream = await this.lineageService.getDownstreamDependencies(assetId, depth);
-
- return c.json({
- success: true,
- data: downstream
- });
- } catch (error) {
- console.error('Error getting downstream dependencies:', error);
- return c.json({
- success: false,
- error: error instanceof Error ? error.message : 'Unknown error'
- }, 500);
- }
- });
-
- // Perform impact analysis
- this.app.post('/impact-analysis', async (c) => {
- try {
- const query: ImpactAnalysisQuery = await c.req.json();
- const analysis = await this.lineageService.performImpactAnalysis(query);
-
- return c.json({
- success: true,
- data: analysis
- });
- } catch (error) {
- console.error('Error performing impact analysis:', error);
- return c.json({
- success: false,
- error: error instanceof Error ? error.message : 'Unknown error'
- }, 500);
- }
- });
-
- // Get lineage graph
- this.app.get('/graph', async (c) => {
- try {
- const assetIds = c.req.query('assetIds')?.split(',') || [];
- const depth = c.req.query('depth') ? parseInt(c.req.query('depth')!) : 3;
-
- if (assetIds.length === 0) {
- return c.json({
- success: false,
- error: 'Asset IDs are required'
- }, 400);
- }
-
- const graph = await this.lineageService.getLineageGraph(assetIds, depth);
-
- return c.json({
- success: true,
- data: graph
- });
- } catch (error) {
- console.error('Error getting lineage graph:', error);
- return c.json({
- success: false,
- error: error instanceof Error ? error.message : 'Unknown error'
- }, 500);
- }
- });
-
- // Check for circular dependencies
- this.app.get('/assets/:assetId/circular-check', async (c) => {
- try {
- const assetId = c.req.param('assetId');
- const hasCycles = await this.lineageService.hasCircularDependencies(assetId);
-
- return c.json({
- success: true,
- data: {
- assetId,
- hasCircularDependencies: hasCycles
- }
- });
- } catch (error) {
- console.error('Error checking circular dependencies:', error);
- return c.json({
- success: false,
- error: error instanceof Error ? error.message : 'Unknown error'
- }, 500);
- }
- });
-
- // Delete lineage relationship
- this.app.delete('/:lineageId', async (c) => {
- try {
- const lineageId = c.req.param('lineageId');
- await this.lineageService.deleteLineage(lineageId);
-
- return c.json({
- success: true,
- message: 'Lineage relationship deleted successfully'
- });
- } catch (error) {
- console.error('Error deleting lineage:', error);
- return c.json({
- success: false,
- error: error instanceof Error ? error.message : 'Unknown error'
- }, 500);
- }
- });
-
- // Get lineage statistics
- this.app.get('/stats', async (c) => {
- try {
- const stats = await this.lineageService.getLineageStatistics();
-
- return c.json({
- success: true,
- data: stats
- });
- } catch (error) {
- console.error('Error getting lineage statistics:', error);
- return c.json({
- success: false,
- error: error instanceof Error ? error.message : 'Unknown error'
- }, 500);
- }
- });
- }
-
- public getApp(): Hono {
- return this.app;
- }
-}
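The `/circular-check` route delegates cycle detection to the lineage service. A minimal sketch of how such a check could work over an adjacency map of asset IDs, using depth-first search with a "currently visiting" set; this is an assumption about the approach, not the deleted `DataLineageService` implementation:

```typescript
// Hypothetical cycle check over a lineage adjacency map (asset -> downstream assets).
function hasCircularDependencies(
  edges: Map<string, string[]>,
  start: string
): boolean {
  const visiting = new Set<string>(); // nodes on the current DFS path
  const finished = new Set<string>(); // nodes fully explored, cycle-free

  const dfs = (node: string): boolean => {
    if (visiting.has(node)) return true; // back edge: found a cycle
    if (finished.has(node)) return false;
    visiting.add(node);
    for (const next of edges.get(node) ?? []) {
      if (dfs(next)) return true;
    }
    visiting.delete(node);
    finished.add(node);
    return false;
  };

  return dfs(start);
}
```

The `finished` set keeps the walk linear in the size of the graph, which matters when lineage graphs are deep and heavily shared.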
diff --git a/apps/data-services/data-catalog/src/controllers/QualityController.ts b/apps/data-services/data-catalog/src/controllers/QualityController.ts
deleted file mode 100644
index 2f3126f..0000000
--- a/apps/data-services/data-catalog/src/controllers/QualityController.ts
+++ /dev/null
@@ -1,321 +0,0 @@
-import { Hono } from 'hono';
-import { DataQualityService } from '../services/DataQualityService';
-import {
- QualityAssessmentRequest,
- QualityRule,
- QualityIssue,
- QualityReportRequest
-} from '../types/DataCatalog';
-
-export class QualityController {
- private app: Hono;
- private qualityService: DataQualityService;
-
- constructor() {
- this.app = new Hono();
- this.qualityService = new DataQualityService();
- this.setupRoutes();
- }
-
- private setupRoutes() {
- // Assess asset quality
- this.app.post('/assess', async (c) => {
- try {
- const request: QualityAssessmentRequest = await c.req.json();
- const assessment = await this.qualityService.assessQuality(request);
-
- return c.json({
- success: true,
- data: assessment
- });
- } catch (error) {
- console.error('Error assessing quality:', error);
- return c.json({
- success: false,
- error: error instanceof Error ? error.message : 'Unknown error'
- }, 500);
- }
- });
-
- // Get quality assessment for asset
- this.app.get('/assets/:assetId', async (c) => {
- try {
- const assetId = c.req.param('assetId');
- const assessment = await this.qualityService.getQualityAssessment(assetId);
-
- if (!assessment) {
- return c.json({
- success: false,
- error: 'Quality assessment not found'
- }, 404);
- }
-
- return c.json({
- success: true,
- data: assessment
- });
- } catch (error) {
- console.error('Error getting quality assessment:', error);
- return c.json({
- success: false,
- error: error instanceof Error ? error.message : 'Unknown error'
- }, 500);
- }
- });
-
- // Create quality rule
- this.app.post('/rules', async (c) => {
- try {
- const rule: Omit<QualityRule, 'id'> = await c.req.json();
- const createdRule = await this.qualityService.createQualityRule(rule);
-
- return c.json({
- success: true,
- data: createdRule
- });
- } catch (error) {
- console.error('Error creating quality rule:', error);
- return c.json({
- success: false,
- error: error instanceof Error ? error.message : 'Unknown error'
- }, 500);
- }
- });
-
- // Get quality rules
- this.app.get('/rules', async (c) => {
- try {
- const assetType = c.req.query('assetType');
- const dimension = c.req.query('dimension');
- const activeParam = c.req.query('active');
-
- const filters: any = {};
- if (assetType) filters.assetType = assetType;
- if (dimension) filters.dimension = dimension;
- if (activeParam !== undefined) filters.active = activeParam === 'true';
-
- const rules = await this.qualityService.getQualityRules(filters);
-
- return c.json({
- success: true,
- data: rules
- });
- } catch (error) {
- console.error('Error getting quality rules:', error);
- return c.json({
- success: false,
- error: error instanceof Error ? error.message : 'Unknown error'
- }, 500);
- }
- });
-
- // Update quality rule
- this.app.put('/rules/:ruleId', async (c) => {
- try {
- const ruleId = c.req.param('ruleId');
- const updates: Partial<QualityRule> = await c.req.json();
-
- const updatedRule = await this.qualityService.updateQualityRule(ruleId, updates);
-
- return c.json({
- success: true,
- data: updatedRule
- });
- } catch (error) {
- console.error('Error updating quality rule:', error);
- return c.json({
- success: false,
- error: error instanceof Error ? error.message : 'Unknown error'
- }, 500);
- }
- });
-
- // Delete quality rule
- this.app.delete('/rules/:ruleId', async (c) => {
- try {
- const ruleId = c.req.param('ruleId');
- await this.qualityService.deleteQualityRule(ruleId);
-
- return c.json({
- success: true,
- message: 'Quality rule deleted successfully'
- });
- } catch (error) {
- console.error('Error deleting quality rule:', error);
- return c.json({
- success: false,
- error: error instanceof Error ? error.message : 'Unknown error'
- }, 500);
- }
- });
-
- // Validate quality rules for asset
- this.app.post('/validate/:assetId', async (c) => {
- try {
- const assetId = c.req.param('assetId');
- const data = await c.req.json();
-
- const validationResults = await this.qualityService.validateQualityRules(assetId, data);
-
- return c.json({
- success: true,
- data: validationResults
- });
- } catch (error) {
- console.error('Error validating quality rules:', error);
- return c.json({
- success: false,
- error: error instanceof Error ? error.message : 'Unknown error'
- }, 500);
- }
- });
-
- // Report quality issue
- this.app.post('/issues', async (c) => {
- try {
- const issue: Omit<QualityIssue, 'id'> = await c.req.json();
- const reportedIssue = await this.qualityService.reportQualityIssue(issue);
-
- return c.json({
- success: true,
- data: reportedIssue
- });
- } catch (error) {
- console.error('Error reporting quality issue:', error);
- return c.json({
- success: false,
- error: error instanceof Error ? error.message : 'Unknown error'
- }, 500);
- }
- });
-
- // Get quality issues
- this.app.get('/issues', async (c) => {
- try {
- const assetId = c.req.query('assetId');
- const severity = c.req.query('severity');
- const status = c.req.query('status');
- const dimension = c.req.query('dimension');
- const limit = c.req.query('limit') ? parseInt(c.req.query('limit')!) : 100;
- const offset = c.req.query('offset') ? parseInt(c.req.query('offset')!) : 0;
-
- const filters: any = {};
- if (assetId) filters.assetId = assetId;
- if (severity) filters.severity = severity;
- if (status) filters.status = status;
- if (dimension) filters.dimension = dimension;
-
- const issues = await this.qualityService.getQualityIssues(filters, { limit, offset });
-
- return c.json({
- success: true,
- data: issues
- });
- } catch (error) {
- console.error('Error getting quality issues:', error);
- return c.json({
- success: false,
- error: error instanceof Error ? error.message : 'Unknown error'
- }, 500);
- }
- });
-
- // Resolve quality issue
- this.app.patch('/issues/:issueId/resolve', async (c) => {
- try {
- const issueId = c.req.param('issueId');
- const { resolution, resolvedBy } = await c.req.json();
-
- const resolvedIssue = await this.qualityService.resolveQualityIssue(
- issueId,
- resolution,
- resolvedBy
- );
-
- return c.json({
- success: true,
- data: resolvedIssue
- });
- } catch (error) {
- console.error('Error resolving quality issue:', error);
- return c.json({
- success: false,
- error: error instanceof Error ? error.message : 'Unknown error'
- }, 500);
- }
- });
-
- // Get quality trends
- this.app.get('/trends', async (c) => {
- try {
- const assetId = c.req.query('assetId');
- const dimension = c.req.query('dimension');
- const timeRange = c.req.query('timeRange') || '30d';
-
- const trends = await this.qualityService.getQualityTrends(
- assetId,
- dimension,
- timeRange
- );
-
- return c.json({
- success: true,
- data: trends
- });
- } catch (error) {
- console.error('Error getting quality trends:', error);
- return c.json({
- success: false,
- error: error instanceof Error ? error.message : 'Unknown error'
- }, 500);
- }
- });
-
- // Generate quality report
- this.app.post('/reports', async (c) => {
- try {
- const request: QualityReportRequest = await c.req.json();
- const report = await this.qualityService.generateQualityReport(request);
-
- return c.json({
- success: true,
- data: report
- });
- } catch (error) {
- console.error('Error generating quality report:', error);
- return c.json({
- success: false,
- error: error instanceof Error ? error.message : 'Unknown error'
- }, 500);
- }
- });
-
- // Get quality metrics summary
- this.app.get('/metrics/summary', async (c) => {
- try {
- const assetIds = c.req.query('assetIds')?.split(',');
- const timeRange = c.req.query('timeRange') || '7d';
-
- const summary = await this.qualityService.getQualityMetricsSummary(
- assetIds,
- timeRange
- );
-
- return c.json({
- success: true,
- data: summary
- });
- } catch (error) {
- console.error('Error getting quality metrics summary:', error);
- return c.json({
- success: false,
- error: error instanceof Error ? error.message : 'Unknown error'
- }, 500);
- }
- });
- }
-
- public getApp(): Hono {
- return this.app;
- }
-}
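The issues endpoint above builds a filters object from query parameters and hands it to the service together with `{ limit, offset }`. The filter-then-paginate pattern it relies on can be sketched in isolation; `Issue` here is a simplified stand-in for the service's `QualityIssue` type, not the real shape:

```typescript
// Illustrative filter-then-paginate helper over an in-memory issue list.
type Issue = { assetId: string; severity: string; status: string };

function queryIssues(
  issues: Issue[],
  filters: Partial<Issue>,
  page: { limit: number; offset: number }
): Issue[] {
  // Keep only issues matching every provided filter field.
  const matched = issues.filter((issue) =>
    (Object.keys(filters) as (keyof Issue)[]).every(
      (key) => issue[key] === filters[key]
    )
  );
  // Apply offset/limit paging to the filtered set.
  return matched.slice(page.offset, page.offset + page.limit);
}
```

Filtering before slicing keeps pagination stable: page boundaries refer to positions in the filtered result, not the raw store.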
diff --git a/apps/data-services/data-catalog/src/controllers/SearchController.ts b/apps/data-services/data-catalog/src/controllers/SearchController.ts
deleted file mode 100644
index 14e534b..0000000
--- a/apps/data-services/data-catalog/src/controllers/SearchController.ts
+++ /dev/null
@@ -1,334 +0,0 @@
-import { Context } from 'hono';
-import { Logger } from '@stock-bot/utils';
-import { SearchService } from '../services/SearchService';
-import { SearchQuery, SearchFilters } from '../types/DataCatalog';
-
-export class SearchController {
- constructor(
- private searchService: SearchService,
- private logger: Logger
- ) {}
-
- async search(c: Context) {
- try {
- const queryParams = c.req.query();
-
- const searchQuery: SearchQuery = {
- text: queryParams.q || '',
- offset: parseInt(queryParams.offset || '0'),
- limit: parseInt(queryParams.limit || '20'),
- sortBy: queryParams.sortBy,
- sortOrder: queryParams.sortOrder as 'asc' | 'desc',
- userId: queryParams.userId
- };
-
- // Parse filters
- const filters: SearchFilters = {};
- if (queryParams.types) {
- filters.types = Array.isArray(queryParams.types) ? queryParams.types : [queryParams.types];
- }
- if (queryParams.classifications) {
- filters.classifications = Array.isArray(queryParams.classifications) ? queryParams.classifications : [queryParams.classifications];
- }
- if (queryParams.owners) {
- filters.owners = Array.isArray(queryParams.owners) ? queryParams.owners : [queryParams.owners];
- }
- if (queryParams.tags) {
- filters.tags = Array.isArray(queryParams.tags) ? queryParams.tags : [queryParams.tags];
- }
- if (queryParams.createdAfter) {
- filters.createdAfter = new Date(queryParams.createdAfter);
- }
- if (queryParams.createdBefore) {
- filters.createdBefore = new Date(queryParams.createdBefore);
- }
-
- if (Object.keys(filters).length > 0) {
- searchQuery.filters = filters;
- }
-
- const result = await this.searchService.search(searchQuery);
-
- this.logger.info('Search API call completed', {
- query: searchQuery.text,
- resultCount: result.total,
- searchTime: result.searchTime
- });
-
- return c.json(result);
- } catch (error) {
- this.logger.error('Search API call failed', { error });
- return c.json({ error: 'Internal server error' }, 500);
- }
- }
-
- async suggest(c: Context) {
- try {
- const partial = c.req.query('q');
-
- if (!partial || partial.length < 2) {
- return c.json({ suggestions: [] });
- }
-
- const suggestions = await this.searchService.suggest(partial);
-
- return c.json({ suggestions });
- } catch (error) {
- this.logger.error('Suggestion API call failed', { error });
- return c.json({ error: 'Internal server error' }, 500);
- }
- }
-
- async searchByFacets(c: Context) {
- try {
- const facets = await c.req.json();
-
- if (!facets || typeof facets !== 'object') {
- return c.json({ error: 'Facets object is required' }, 400);
- }
-
- const assets = await this.searchService.searchByFacets(facets);
-
- return c.json({
- assets,
- total: assets.length,
- facets
- });
- } catch (error) {
- this.logger.error('Facet search API call failed', { error });
- return c.json({ error: 'Internal server error' }, 500);
- }
- }
-
- async searchSimilar(c: Context) {
- try {
- const assetId = c.req.param('id');
- const limit = parseInt(c.req.query('limit') || '10');
-
- if (!assetId) {
- return c.json({ error: 'Asset ID is required' }, 400);
- }
-
- const similarAssets = await this.searchService.searchSimilar(assetId, limit);
-
- return c.json({
- assetId,
- similarAssets,
- total: similarAssets.length
- });
- } catch (error) {
- this.logger.error('Similar search API call failed', { error });
- return c.json({ error: 'Internal server error' }, 500);
- }
- }
-
- async getPopularSearches(c: Context) {
- try {
- const limit = parseInt(c.req.query('limit') || '10');
-
- const popularSearches = await this.searchService.getPopularSearches(limit);
-
- return c.json({
- searches: popularSearches,
- total: popularSearches.length
- });
- } catch (error) {
- this.logger.error('Popular searches API call failed', { error });
- return c.json({ error: 'Internal server error' }, 500);
- }
- }
-
- async getRecentSearches(c: Context) {
- try {
- const userId = c.req.param('userId');
- const limit = parseInt(c.req.query('limit') || '10');
-
- if (!userId) {
- return c.json({ error: 'User ID is required' }, 400);
- }
-
- const recentSearches = await this.searchService.getRecentSearches(userId, limit);
-
- return c.json({
- userId,
- searches: recentSearches,
- total: recentSearches.length
- });
- } catch (error) {
- this.logger.error('Recent searches API call failed', { error });
- return c.json({ error: 'Internal server error' }, 500);
- }
- }
-
- async reindexAssets(c: Context) {
- try {
- await this.searchService.reindexAll();
-
- this.logger.info('Search index rebuilt via API');
-
- return c.json({ message: 'Search index rebuilt successfully' });
- } catch (error) {
- this.logger.error('Reindex API call failed', { error });
- return c.json({ error: 'Internal server error' }, 500);
- }
- }
-
- async getSearchAnalytics(c: Context) {
- try {
- const timeframe = c.req.query('timeframe') || 'week';
-
- const analytics = await this.searchService.getSearchAnalytics(timeframe);
-
- return c.json({
- timeframe,
- analytics
- });
- } catch (error) {
- this.logger.error('Search analytics API call failed', { error });
- return c.json({ error: 'Internal server error' }, 500);
- }
- }
-
- async advancedSearch(c: Context) {
- try {
- const searchRequest = await c.req.json();
-
- if (!searchRequest) {
- return c.json({ error: 'Search request is required' }, 400);
- }
-
- // Build advanced search query
- const searchQuery: SearchQuery = {
- text: searchRequest.query || '',
- offset: searchRequest.offset || 0,
- limit: searchRequest.limit || 20,
- sortBy: searchRequest.sortBy,
- sortOrder: searchRequest.sortOrder,
- userId: searchRequest.userId,
- filters: searchRequest.filters
- };
-
- const result = await this.searchService.search(searchQuery);
-
- // If no results and query is complex, try to suggest simpler alternatives
- if (result.total === 0 && searchQuery.text && searchQuery.text.split(' ').length > 2) {
- const simpleQuery = searchQuery.text.split(' ')[0];
- const simpleResult = await this.searchService.search({
- ...searchQuery,
- text: simpleQuery
- });
-
- if (simpleResult.total > 0) {
- result.suggestions = [`Try searching for "${simpleQuery}"`];
- }
- }
-
- this.logger.info('Advanced search API call completed', {
- query: searchQuery.text,
- resultCount: result.total,
- searchTime: result.searchTime
- });
-
- return c.json(result);
- } catch (error) {
- this.logger.error('Advanced search API call failed', { error });
- return c.json({ error: 'Internal server error' }, 500);
- }
- }
-
- async exportSearchResults(c: Context) {
- try {
- const queryParams = c.req.query();
- const format = queryParams.format || 'json';
-
- if (format !== 'json' && format !== 'csv') {
- return c.json({ error: 'Unsupported export format. Use json or csv' }, 400);
- }
-
- // Perform search with maximum results
- const searchQuery: SearchQuery = {
- text: queryParams.q || '',
- offset: 0,
- limit: 10000, // Large limit for export
- sortBy: queryParams.sortBy,
- sortOrder: queryParams.sortOrder as 'asc' | 'desc'
- };
-
- const result = await this.searchService.search(searchQuery);
-
- if (format === 'csv') {
- const csv = this.convertToCSV(result.assets);
- c.header('Content-Type', 'text/csv');
- c.header('Content-Disposition', 'attachment; filename="search-results.csv"');
- return c.text(csv);
- } else {
- c.header('Content-Type', 'application/json');
- c.header('Content-Disposition', 'attachment; filename="search-results.json"');
- return c.json(result);
- }
- } catch (error) {
- this.logger.error('Export search results API call failed', { error });
- return c.json({ error: 'Internal server error' }, 500);
- }
- }
-
- async getSearchStatistics(c: Context) {
- try {
- const timeframe = c.req.query('timeframe') || 'week';
-
- const analytics = await this.searchService.getSearchAnalytics(timeframe);
-
- const statistics = {
- searchVolume: analytics.totalSearches,
- uniqueQueries: analytics.uniqueQueries,
- averageResultsPerSearch: Math.round(analytics.averageResults),
- noResultQueriesPercent: analytics.totalSearches > 0
- ? Math.round((analytics.noResultQueries / analytics.totalSearches) * 100)
- : 0,
- topSearchTerms: analytics.topQueries.slice(0, 5),
- searchTrend: analytics.searchTrend.trend,
- facetUsage: analytics.facetUsage
- };
-
- return c.json({
- timeframe,
- statistics
- });
- } catch (error) {
- this.logger.error('Search statistics API call failed', { error });
- return c.json({ error: 'Internal server error' }, 500);
- }
- }
-
- // Helper method to convert assets to CSV format
- private convertToCSV(assets: any[]): string {
- if (assets.length === 0) {
- return 'No results found';
- }
-
- const headers = [
- 'ID', 'Name', 'Type', 'Description', 'Owner', 'Classification',
- 'Tags', 'Created At', 'Updated At', 'Last Accessed'
- ];
-
- const csvRows = [headers.join(',')];
-
- for (const asset of assets) {
- const row = [
- asset.id,
- `"${asset.name.replace(/"/g, '""')}"`,
- asset.type,
- `"${asset.description.replace(/"/g, '""')}"`,
- asset.owner,
- asset.classification,
- `"${asset.tags.join('; ')}"`,
- asset.createdAt.toISOString(),
- asset.updatedAt.toISOString(),
- asset.lastAccessed ? asset.lastAccessed.toISOString() : ''
- ];
- csvRows.push(row.join(','));
- }
-
- return csvRows.join('\n');
- }
-}
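`convertToCSV` above escapes embedded quotes by doubling them and wrapping the field in quotes, in the style of RFC 4180. That escaping rule generalizes to a small helper; this is a sketch, not the controller's code, which only quotes selected columns:

```typescript
// Quote a CSV field when it contains a delimiter, quote, or newline,
// doubling any embedded quotes (RFC 4180 style).
function csvEscape(field: string): string {
  if (/[",\n]/.test(field)) {
    return `"${field.replace(/"/g, '""')}"`;
  }
  return field;
}

// Join one row, escaping each field independently.
function toCsvRow(fields: string[]): string {
  return fields.map(csvEscape).join(',');
}
```

Escaping every field uniformly avoids the subtle failure mode of the column-by-column approach, where an unquoted field (such as an owner name containing a comma) silently shifts all subsequent columns.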
diff --git a/apps/data-services/data-catalog/src/index.ts b/apps/data-services/data-catalog/src/index.ts
deleted file mode 100644
index 90cdcee..0000000
--- a/apps/data-services/data-catalog/src/index.ts
+++ /dev/null
@@ -1,201 +0,0 @@
-import { Hono } from 'hono';
-import { cors } from 'hono/cors';
-import { logger } from 'hono/logger';
-import { prettyJSON } from 'hono/pretty-json';
-import { serve } from '@hono/node-server';
-
-// Import controllers
-import { DataCatalogController } from './controllers/DataCatalogController';
-import { SearchController } from './controllers/SearchController';
-import { LineageController } from './controllers/LineageController';
-import { QualityController } from './controllers/QualityController';
-import { GovernanceController } from './controllers/GovernanceController';
-import { HealthController } from './controllers/HealthController';
-
-// Create main application
-const app = new Hono();
-
-// Add middleware
-app.use('*', cors({
- origin: ['http://localhost:3000', 'http://localhost:4000', 'http://localhost:5173'],
- allowMethods: ['GET', 'POST', 'PUT', 'DELETE', 'PATCH', 'OPTIONS'],
- allowHeaders: ['Content-Type', 'Authorization', 'X-Requested-With'],
- credentials: true
-}));
-
-app.use('*', logger());
-app.use('*', prettyJSON());
-
-// Initialize controllers
-const dataCatalogController = new DataCatalogController();
-const searchController = new SearchController();
-const lineageController = new LineageController();
-const qualityController = new QualityController();
-const governanceController = new GovernanceController();
-const healthController = new HealthController();
-
-// Setup routes
-app.route('/api/v1/assets', dataCatalogController.getApp());
-app.route('/api/v1/search', searchController.getApp());
-app.route('/api/v1/lineage', lineageController.getApp());
-app.route('/api/v1/quality', qualityController.getApp());
-app.route('/api/v1/governance', governanceController.getApp());
-app.route('/health', healthController.getApp());
-
-// Root endpoint
-app.get('/', (c) => {
- return c.json({
- service: 'Data Catalog Service',
- version: '1.0.0',
- description: 'Comprehensive data catalog and governance service for stock-bot data platform',
- endpoints: {
- assets: '/api/v1/assets',
- search: '/api/v1/search',
- lineage: '/api/v1/lineage',
- quality: '/api/v1/quality',
- governance: '/api/v1/governance',
- health: '/health'
- },
- documentation: '/api/v1/docs'
- });
-});
-
-// API documentation endpoint
-app.get('/api/v1/docs', (c) => {
- return c.json({
- title: 'Data Catalog Service API',
- version: '1.0.0',
- description: 'RESTful API for data catalog, lineage, quality, and governance operations',
- endpoints: {
- assets: {
- description: 'Data asset management',
- methods: {
- 'GET /api/v1/assets': 'List assets with filtering and pagination',
- 'POST /api/v1/assets': 'Create new data asset',
- 'GET /api/v1/assets/:id': 'Get asset by ID',
- 'PUT /api/v1/assets/:id': 'Update asset',
- 'DELETE /api/v1/assets/:id': 'Delete asset',
- 'GET /api/v1/assets/:id/schema': 'Get asset schema',
- 'PUT /api/v1/assets/:id/schema': 'Update asset schema',
- 'GET /api/v1/assets/:id/usage': 'Get asset usage analytics',
- 'POST /api/v1/assets/:id/usage': 'Record asset usage'
- }
- },
- search: {
- description: 'Data discovery and search',
- methods: {
- 'GET /api/v1/search': 'Search assets with full-text and faceted search',
- 'GET /api/v1/search/suggest': 'Get search suggestions',
- 'GET /api/v1/search/facets': 'Get available search facets',
- 'GET /api/v1/search/similar/:id': 'Find similar assets',
- 'GET /api/v1/search/trending': 'Get trending searches',
- 'POST /api/v1/search/export': 'Export search results'
- }
- },
- lineage: {
- description: 'Data lineage and impact analysis',
- methods: {
- 'POST /api/v1/lineage': 'Create lineage relationship',
- 'GET /api/v1/lineage/assets/:assetId': 'Get asset lineage',
- 'GET /api/v1/lineage/assets/:assetId/upstream': 'Get upstream dependencies',
- 'GET /api/v1/lineage/assets/:assetId/downstream': 'Get downstream dependencies',
- 'POST /api/v1/lineage/impact-analysis': 'Perform impact analysis',
- 'GET /api/v1/lineage/graph': 'Get lineage graph visualization',
- 'GET /api/v1/lineage/assets/:assetId/circular-check': 'Check for circular dependencies',
- 'DELETE /api/v1/lineage/:lineageId': 'Delete lineage relationship',
- 'GET /api/v1/lineage/stats': 'Get lineage statistics'
- }
- },
- quality: {
- description: 'Data quality assessment and monitoring',
- methods: {
- 'POST /api/v1/quality/assess': 'Assess data quality',
- 'GET /api/v1/quality/assets/:assetId': 'Get quality assessment',
- 'POST /api/v1/quality/rules': 'Create quality rule',
- 'GET /api/v1/quality/rules': 'Get quality rules',
- 'PUT /api/v1/quality/rules/:ruleId': 'Update quality rule',
- 'DELETE /api/v1/quality/rules/:ruleId': 'Delete quality rule',
- 'POST /api/v1/quality/validate/:assetId': 'Validate quality rules',
- 'POST /api/v1/quality/issues': 'Report quality issue',
- 'GET /api/v1/quality/issues': 'Get quality issues',
- 'PATCH /api/v1/quality/issues/:issueId/resolve': 'Resolve quality issue',
- 'GET /api/v1/quality/trends': 'Get quality trends',
- 'POST /api/v1/quality/reports': 'Generate quality report',
- 'GET /api/v1/quality/metrics/summary': 'Get quality metrics summary'
- }
- },
- governance: {
- description: 'Data governance and compliance',
- methods: {
- 'POST /api/v1/governance/policies': 'Create governance policy',
- 'GET /api/v1/governance/policies': 'Get governance policies',
- 'PUT /api/v1/governance/policies/:policyId': 'Update governance policy',
- 'DELETE /api/v1/governance/policies/:policyId': 'Delete governance policy',
- 'POST /api/v1/governance/policies/:policyId/apply/:assetId': 'Apply policy to asset',
- 'POST /api/v1/governance/compliance/check': 'Check compliance',
- 'GET /api/v1/governance/compliance/violations': 'Get compliance violations',
- 'POST /api/v1/governance/access/request': 'Request data access',
- 'PATCH /api/v1/governance/access/:requestId': 'Review access request',
- 'POST /api/v1/governance/access/check': 'Check access authorization',
- 'POST /api/v1/governance/privacy/subject-request': 'Handle data subject request',
- 'POST /api/v1/governance/privacy/anonymize/:assetId': 'Anonymize asset data',
- 'GET /api/v1/governance/audit/logs': 'Get audit logs',
- 'POST /api/v1/governance/audit/log': 'Log access event',
- 'GET /api/v1/governance/retention/policies': 'Get retention policies',
- 'POST /api/v1/governance/retention/apply': 'Apply retention policy',
- 'GET /api/v1/governance/metrics': 'Get governance metrics'
- }
- },
- health: {
- description: 'Service health monitoring',
- methods: {
- 'GET /health': 'Basic health check',
- 'GET /health/detailed': 'Detailed health check with dependencies',
- 'GET /health/ready': 'Readiness check',
- 'GET /health/live': 'Liveness check'
- }
- }
- }
- });
-});
-
-// 404 handler
-app.notFound((c) => {
- return c.json({
- success: false,
- error: 'Endpoint not found',
- availableEndpoints: [
- '/api/v1/assets',
- '/api/v1/search',
- '/api/v1/lineage',
- '/api/v1/quality',
- '/api/v1/governance',
- '/health'
- ]
- }, 404);
-});
-
-// Error handler
-app.onError((err, c) => {
- console.error('Application error:', err);
-
- return c.json({
- success: false,
- error: 'Internal server error',
- message: process.env.NODE_ENV === 'development' ? err.message : 'Something went wrong'
- }, 500);
-});
-
-// Start server
-const port = parseInt(process.env.PORT || '3003');
-
-console.log(`🚀 Data Catalog Service starting on port ${port}`);
-console.log(`📚 API Documentation available at http://localhost:${port}/api/v1/docs`);
-console.log(`❤️ Health endpoint available at http://localhost:${port}/health`);
-
-serve({
- fetch: app.fetch,
- port: port
-});
-
-export default app;
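The server above reads its port with `parseInt(process.env.PORT || '3003')`, which yields `NaN` when `PORT` is set but non-numeric. A defensive variant could validate before falling back; this is a sketch of that guard, not something the deleted service did:

```typescript
// Parse a TCP port from an environment-style string, falling back when the
// value is missing, non-numeric, or outside the valid 1-65535 range.
function resolvePort(raw: string | undefined, fallback = 3003): number {
  const parsed = Number.parseInt(raw ?? '', 10);
  return Number.isInteger(parsed) && parsed > 0 && parsed < 65536
    ? parsed
    : fallback;
}
```

`Number.isInteger(NaN)` is `false`, so a garbage `PORT` value degrades to the default rather than crashing the listener at startup.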
diff --git a/apps/data-services/data-catalog/src/services/DataCatalogService.ts b/apps/data-services/data-catalog/src/services/DataCatalogService.ts
deleted file mode 100644
index f85c0e0..0000000
--- a/apps/data-services/data-catalog/src/services/DataCatalogService.ts
+++ /dev/null
@@ -1,312 +0,0 @@
-import { EventBus } from '@stock-bot/event-bus';
-import { Logger } from '@stock-bot/utils';
-import {
- DataAsset,
- CreateDataAssetRequest,
- UpdateDataAssetRequest,
- DataAssetType,
- DataClassification
-} from '../types/DataCatalog';
-
-export interface DataCatalogService {
- createAsset(request: CreateDataAssetRequest): Promise<DataAsset>;
- getAsset(id: string): Promise<DataAsset | null>;
- updateAsset(id: string, request: UpdateDataAssetRequest): Promise<DataAsset | null>;
- deleteAsset(id: string): Promise<void>;
- listAssets(filters?: Record<string, any>): Promise<DataAsset[]>;
- searchAssets(query: string, filters?: Record<string, any>): Promise<DataAsset[]>;
- getAssetsByOwner(owner: string): Promise<DataAsset[]>;
- getAssetsByType(type: DataAssetType): Promise<DataAsset[]>;
- getAssetsByClassification(classification: DataClassification): Promise<DataAsset[]>;
- getAssetsByTags(tags: string[]): Promise<DataAsset[]>;
-}
-
-export class DataCatalogServiceImpl implements DataCatalogService {
- private assets: Map<string, DataAsset> = new Map();
-
- constructor(
- private eventBus: EventBus,
- private logger: Logger
- ) {}
-
- async createAsset(request: CreateDataAssetRequest): Promise<DataAsset> {
- try {
- const asset: DataAsset = {
- id: this.generateId(),
- name: request.name,
- type: request.type,
- description: request.description,
- owner: request.owner,
- steward: request.steward,
- tags: request.tags || [],
- classification: request.classification,
- schema: request.schema,
- location: request.location,
- metadata: {
- customProperties: {},
- ...request.metadata
- },
- lineage: {
- id: this.generateId(),
- assetId: '',
- upstreamAssets: [],
- downstreamAssets: [],
- transformations: [],
- impact: {
- downstreamAssets: [],
- affectedUsers: [],
- estimatedImpact: 'low',
- impactDescription: '',
- recommendations: []
- },
- createdAt: new Date(),
- updatedAt: new Date()
- },
- quality: {
- id: this.generateId(),
- assetId: '',
- overallScore: 100,
- dimensions: [],
- rules: [],
- issues: [],
- trend: {
- timeframe: 'week',
- dataPoints: [],
- trend: 'stable',
- changeRate: 0
- },
- lastAssessment: new Date()
- },
- usage: {
- id: this.generateId(),
- assetId: '',
- accessCount: 0,
- uniqueUsers: 0,
- lastAccessed: new Date(),
- topUsers: [],
- accessPatterns: [],
- popularQueries: [],
- usageTrend: {
- timeframe: 'week',
- dataPoints: [],
- trend: 'stable',
- changeRate: 0
- }
- },
- governance: request.governance || {
- id: this.generateId(),
- assetId: '',
- policies: [],
- compliance: [],
- retention: {
- retentionPeriod: 365,
- retentionReason: 'Business requirement',
- legalHold: false
- },
- access: {
- defaultAccess: 'none',
- roles: [],
- users: []
- },
- privacy: {
- containsPII: false,
- sensitiveFields: [],
- anonymizationRules: [],
- consentRequired: false,
- dataSubjectRights: []
- },
- audit: []
- },
- createdAt: new Date(),
- updatedAt: new Date()
- };
-
- // Set correct asset IDs in nested objects
- asset.lineage.assetId = asset.id;
- asset.quality.assetId = asset.id;
- asset.usage.assetId = asset.id;
- asset.governance.assetId = asset.id;
-
- this.assets.set(asset.id, asset);
-
- this.logger.info('Data asset created', { assetId: asset.id, name: asset.name });
-
- await this.eventBus.emit('data.asset.created', {
- assetId: asset.id,
- asset,
- timestamp: new Date()
- });
-
- return asset;
- } catch (error) {
- this.logger.error('Failed to create data asset', { request, error });
- throw error;
- }
- }
-
- async getAsset(id: string): Promise<DataAsset | null> {
- try {
- const asset = this.assets.get(id);
-
- if (asset) {
- // Update last accessed time
- asset.lastAccessed = new Date();
- asset.usage.lastAccessed = new Date();
- asset.usage.accessCount++;
-
- await this.eventBus.emit('data.asset.accessed', {
- assetId: id,
- timestamp: new Date()
- });
- }
-
- return asset || null;
- } catch (error) {
- this.logger.error('Failed to get data asset', { assetId: id, error });
- throw error;
- }
- }
-
- async updateAsset(id: string, request: UpdateDataAssetRequest): Promise<DataAsset | null> {
- try {
- const asset = this.assets.get(id);
- if (!asset) {
- return null;
- }
-
- // Update only provided fields
- if (request.name !== undefined) asset.name = request.name;
- if (request.description !== undefined) asset.description = request.description;
- if (request.owner !== undefined) asset.owner = request.owner;
- if (request.steward !== undefined) asset.steward = request.steward;
- if (request.tags !== undefined) asset.tags = request.tags;
- if (request.classification !== undefined) asset.classification = request.classification;
- if (request.schema !== undefined) asset.schema = request.schema;
- if (request.metadata !== undefined) {
- asset.metadata = { ...asset.metadata, ...request.metadata };
- }
-
- asset.updatedAt = new Date();
-
- this.assets.set(id, asset);
-
- this.logger.info('Data asset updated', { assetId: id, changes: request });
-
- await this.eventBus.emit('data.asset.updated', {
- assetId: id,
- asset,
- changes: request,
- timestamp: new Date()
- });
-
- return asset;
- } catch (error) {
- this.logger.error('Failed to update data asset', { assetId: id, request, error });
- throw error;
- }
- }
-
- async deleteAsset(id: string): Promise<void> {
- try {
- const asset = this.assets.get(id);
- if (!asset) {
- throw new Error(`Asset with id ${id} not found`);
- }
-
- this.assets.delete(id);
-
- this.logger.info('Data asset deleted', { assetId: id });
-
- await this.eventBus.emit('data.asset.deleted', {
- assetId: id,
- asset,
- timestamp: new Date()
- });
- } catch (error) {
- this.logger.error('Failed to delete data asset', { assetId: id, error });
- throw error;
- }
- }
-
- async listAssets(filters?: Record<string, any>): Promise<DataAsset[]> {
- try {
- let assets = Array.from(this.assets.values());
-
- if (filters) {
- assets = assets.filter(asset => {
- return Object.entries(filters).every(([key, value]) => {
- if (key === 'type') return asset.type === value;
- if (key === 'owner') return asset.owner === value;
- if (key === 'classification') return asset.classification === value;
- if (key === 'tags') return Array.isArray(value) ?
- value.some(tag => asset.tags.includes(tag)) :
- asset.tags.includes(value);
- return true;
- });
- });
- }
-
- return assets;
- } catch (error) {
- this.logger.error('Failed to list data assets', { filters, error });
- throw error;
- }
- }
-
- async searchAssets(query: string, filters?: Record<string, any>): Promise<DataAsset[]> {
- try {
- let assets = Array.from(this.assets.values());
-
- // Simple text search in name, description, and tags
- const searchTerm = query.toLowerCase();
- assets = assets.filter(asset =>
- asset.name.toLowerCase().includes(searchTerm) ||
- asset.description.toLowerCase().includes(searchTerm) ||
- asset.tags.some(tag => tag.toLowerCase().includes(searchTerm))
- );
-
- // Apply additional filters
- if (filters) {
- assets = assets.filter(asset => {
- return Object.entries(filters).every(([key, value]) => {
- if (key === 'type') return asset.type === value;
- if (key === 'owner') return asset.owner === value;
- if (key === 'classification') return asset.classification === value;
- return true;
- });
- });
- }
-
- this.logger.info('Asset search completed', {
- query,
- filters,
- resultCount: assets.length
- });
-
- return assets;
- } catch (error) {
- this.logger.error('Failed to search data assets', { query, filters, error });
- throw error;
- }
- }
-
- async getAssetsByOwner(owner: string): Promise<DataAsset[]> {
- return this.listAssets({ owner });
- }
-
- async getAssetsByType(type: DataAssetType): Promise<DataAsset[]> {
- return this.listAssets({ type });
- }
-
- async getAssetsByClassification(classification: DataClassification): Promise<DataAsset[]> {
- return this.listAssets({ classification });
- }
-
- async getAssetsByTags(tags: string[]): Promise<DataAsset[]> {
- return this.listAssets({ tags });
- }
-
- private generateId(): string {
- return `asset_${Date.now()}_${Math.random().toString(36).substr(2, 9)}`;
- }
-}
diff --git a/apps/data-services/data-catalog/src/services/DataGovernanceService.ts b/apps/data-services/data-catalog/src/services/DataGovernanceService.ts
deleted file mode 100644
index 5c1a526..0000000
--- a/apps/data-services/data-catalog/src/services/DataGovernanceService.ts
+++ /dev/null
@@ -1,764 +0,0 @@
-import { EventBus } from '@stock-bot/event-bus';
-import { Logger } from '@stock-bot/utils';
-import {
- DataGovernance,
- GovernancePolicy,
- ComplianceCheck,
- RetentionPolicy,
- AccessControl,
- PrivacySettings,
- AuditEntry,
- DataAsset,
- GovernancePolicyType,
- ComplianceStatus,
- DataClassification
-} from '../types/DataCatalog';
-
-export interface DataGovernanceService {
- createPolicy(policy: Omit<GovernancePolicy, 'id' | 'createdAt' | 'updatedAt'>): Promise<GovernancePolicy>;
- updatePolicy(policyId: string, updates: Partial<GovernancePolicy>): Promise<GovernancePolicy | null>;
- deletePolicy(policyId: string): Promise<void>;
- getPolicy(policyId: string): Promise<GovernancePolicy | null>;
- listPolicies(filters?: Record<string, any>): Promise<GovernancePolicy[]>;
- applyPolicy(assetId: string, policyId: string): Promise<void>;
- removePolicy(assetId: string, policyId: string): Promise<void>;
- checkCompliance(assetId: string): Promise<ComplianceCheck[]>;
- updateRetentionPolicy(assetId: string, retention: RetentionPolicy): Promise<void>;
- updateAccessControl(assetId: string, access: AccessControl): Promise<void>;
- updatePrivacySettings(assetId: string, privacy: PrivacySettings): Promise<void>;
- auditAccess(assetId: string, userId: string, action: string, details?: any): Promise<void>;
- getAuditTrail(assetId: string, filters?: Record<string, any>): Promise<AuditEntry[]>;
- generateComplianceReport(assetIds: string[]): Promise<any>;
- validateDataAccess(assetId: string, userId: string, action: string): Promise<boolean>;
- anonymizeData(assetId: string, options?: any): Promise<void>;
- handleDataSubjectRequest(assetId: string, request: any): Promise<any>;
-}
-
-export class DataGovernanceServiceImpl implements DataGovernanceService {
- private policies: Map<string, GovernancePolicy> = new Map();
- private governance: Map<string, DataGovernance> = new Map();
- private assets: Map<string, DataAsset> = new Map();
-
- constructor(
- private eventBus: EventBus,
- private logger: Logger
- ) {
- this.initializeDefaultPolicies();
- }
-
- async createPolicy(policy: Omit<GovernancePolicy, 'id' | 'createdAt' | 'updatedAt'>): Promise<GovernancePolicy> {
- try {
- const fullPolicy: GovernancePolicy = {
- ...policy,
- id: this.generateId(),
- createdAt: new Date(),
- updatedAt: new Date()
- };
-
- this.policies.set(fullPolicy.id, fullPolicy);
-
- this.logger.info('Governance policy created', {
- policyId: fullPolicy.id,
- name: fullPolicy.name,
- type: fullPolicy.type
- });
-
- await this.eventBus.emit('data.governance.policy.created', {
- policy: fullPolicy,
- timestamp: new Date()
- });
-
- return fullPolicy;
- } catch (error) {
- this.logger.error('Failed to create governance policy', { policy, error });
- throw error;
- }
- }
-
- async updatePolicy(policyId: string, updates: Partial<GovernancePolicy>): Promise<GovernancePolicy | null> {
- try {
- const policy = this.policies.get(policyId);
- if (!policy) {
- return null;
- }
-
- const updatedPolicy: GovernancePolicy = {
- ...policy,
- ...updates,
- updatedAt: new Date()
- };
-
- this.policies.set(policyId, updatedPolicy);
-
- this.logger.info('Governance policy updated', { policyId, changes: updates });
-
- await this.eventBus.emit('data.governance.policy.updated', {
- policy: updatedPolicy,
- changes: updates,
- timestamp: new Date()
- });
-
- return updatedPolicy;
- } catch (error) {
- this.logger.error('Failed to update governance policy', { policyId, updates, error });
- throw error;
- }
- }
-
- async deletePolicy(policyId: string): Promise<void> {
- try {
- const policy = this.policies.get(policyId);
- if (!policy) {
- throw new Error(`Policy with id ${policyId} not found`);
- }
-
- this.policies.delete(policyId);
-
- // Remove policy from all assets
- for (const [assetId, governance] of this.governance) {
- governance.policies = governance.policies.filter(p => p.id !== policyId);
- this.governance.set(assetId, governance);
- }
-
- this.logger.info('Governance policy deleted', { policyId });
-
- await this.eventBus.emit('data.governance.policy.deleted', {
- policyId,
- policy,
- timestamp: new Date()
- });
- } catch (error) {
- this.logger.error('Failed to delete governance policy', { policyId, error });
- throw error;
- }
- }
-
- async getPolicy(policyId: string): Promise<GovernancePolicy | null> {
- try {
- return this.policies.get(policyId) || null;
- } catch (error) {
- this.logger.error('Failed to get governance policy', { policyId, error });
- throw error;
- }
- }
-
- async listPolicies(filters?: Record<string, any>): Promise<GovernancePolicy[]> {
- try {
- let policies = Array.from(this.policies.values());
-
- if (filters) {
- policies = policies.filter(policy => {
- return Object.entries(filters).every(([key, value]) => {
- if (key === 'type') return policy.type === value;
- if (key === 'active') return policy.active === value;
- if (key === 'classification') return policy.applicableClassifications?.includes(value);
- return true;
- });
- });
- }
-
- return policies;
- } catch (error) {
- this.logger.error('Failed to list governance policies', { filters, error });
- throw error;
- }
- }
-
- async applyPolicy(assetId: string, policyId: string): Promise<void> {
- try {
- const policy = this.policies.get(policyId);
- if (!policy) {
- throw new Error(`Policy with id ${policyId} not found`);
- }
-
- let governance = this.governance.get(assetId);
- if (!governance) {
- governance = this.createEmptyGovernance(assetId);
- }
-
- // Check if policy is already applied
- if (!governance.policies.find(p => p.id === policyId)) {
- governance.policies.push(policy);
- this.governance.set(assetId, governance);
-
- // Perform compliance check after applying policy
- await this.checkCompliance(assetId);
-
- this.logger.info('Policy applied to asset', { assetId, policyId });
-
- await this.eventBus.emit('data.governance.policy.applied', {
- assetId,
- policyId,
- timestamp: new Date()
- });
- }
- } catch (error) {
- this.logger.error('Failed to apply policy to asset', { assetId, policyId, error });
- throw error;
- }
- }
-
- async removePolicy(assetId: string, policyId: string): Promise<void> {
- try {
- const governance = this.governance.get(assetId);
- if (!governance) {
- throw new Error(`Governance not found for asset ${assetId}`);
- }
-
- governance.policies = governance.policies.filter(p => p.id !== policyId);
- this.governance.set(assetId, governance);
-
- this.logger.info('Policy removed from asset', { assetId, policyId });
-
- await this.eventBus.emit('data.governance.policy.removed', {
- assetId,
- policyId,
- timestamp: new Date()
- });
- } catch (error) {
- this.logger.error('Failed to remove policy from asset', { assetId, policyId, error });
- throw error;
- }
- }
-
- async checkCompliance(assetId: string): Promise<ComplianceCheck[]> {
- try {
- const governance = this.governance.get(assetId);
- const asset = this.assets.get(assetId);
-
- if (!governance || !asset) {
- return [];
- }
-
- const complianceChecks: ComplianceCheck[] = [];
-
- for (const policy of governance.policies) {
- if (!policy.active) continue;
-
- const check = await this.performComplianceCheck(asset, policy);
- complianceChecks.push(check);
- }
-
- // Update governance with compliance results
- governance.compliance = complianceChecks;
- this.governance.set(assetId, governance);
-
- // Log compliance issues
- const failedChecks = complianceChecks.filter(check => check.status === 'failed');
- if (failedChecks.length > 0) {
- this.logger.warn('Compliance violations detected', {
- assetId,
- violationCount: failedChecks.length
- });
-
- await this.eventBus.emit('data.governance.compliance.violation', {
- assetId,
- violations: failedChecks,
- timestamp: new Date()
- });
- }
-
- return complianceChecks;
- } catch (error) {
- this.logger.error('Failed to check compliance', { assetId, error });
- throw error;
- }
- }
-
- async updateRetentionPolicy(assetId: string, retention: RetentionPolicy): Promise<void> {
- try {
- let governance = this.governance.get(assetId);
- if (!governance) {
- governance = this.createEmptyGovernance(assetId);
- }
-
- governance.retention = retention;
- this.governance.set(assetId, governance);
-
- this.logger.info('Retention policy updated', { assetId, retentionPeriod: retention.retentionPeriod });
-
- await this.eventBus.emit('data.governance.retention.updated', {
- assetId,
- retention,
- timestamp: new Date()
- });
- } catch (error) {
- this.logger.error('Failed to update retention policy', { assetId, retention, error });
- throw error;
- }
- }
-
- async updateAccessControl(assetId: string, access: AccessControl): Promise<void> {
- try {
- let governance = this.governance.get(assetId);
- if (!governance) {
- governance = this.createEmptyGovernance(assetId);
- }
-
- governance.access = access;
- this.governance.set(assetId, governance);
-
- this.logger.info('Access control updated', { assetId, defaultAccess: access.defaultAccess });
-
- await this.eventBus.emit('data.governance.access.updated', {
- assetId,
- access,
- timestamp: new Date()
- });
- } catch (error) {
- this.logger.error('Failed to update access control', { assetId, access, error });
- throw error;
- }
- }
-
- async updatePrivacySettings(assetId: string, privacy: PrivacySettings): Promise<void> {
- try {
- let governance = this.governance.get(assetId);
- if (!governance) {
- governance = this.createEmptyGovernance(assetId);
- }
-
- governance.privacy = privacy;
- this.governance.set(assetId, governance);
-
- this.logger.info('Privacy settings updated', {
- assetId,
- containsPII: privacy.containsPII,
- consentRequired: privacy.consentRequired
- });
-
- await this.eventBus.emit('data.governance.privacy.updated', {
- assetId,
- privacy,
- timestamp: new Date()
- });
- } catch (error) {
- this.logger.error('Failed to update privacy settings', { assetId, privacy, error });
- throw error;
- }
- }
-
- async auditAccess(assetId: string, userId: string, action: string, details?: any): Promise<void> {