adding data-services
This commit is contained in:
parent e3bfd05b90
commit 405b818c86
139 changed files with 55943 additions and 416 deletions

34104 .chat/Claude4-June2.json (Normal file)
File diff suppressed because one or more lines are too long

703 ARCHITECTURE.md (Normal file)
@ -0,0 +1,703 @@
# 🏗️ Stock Bot Trading System - Architecture Documentation

## 📋 Table of Contents

- [System Overview](#system-overview)
- [Current Architecture](#current-architecture)
- [Service Breakdown](#service-breakdown)
- [Data Flow](#data-flow)
- [Technology Stack](#technology-stack)
- [Future Architecture](#future-architecture)
- [Improvement Recommendations](#improvement-recommendations)
- [Deployment Architecture](#deployment-architecture)
- [Security Architecture](#security-architecture)
- [Monitoring & Observability](#monitoring--observability)

---

## 🎯 System Overview

The Stock Bot Trading System is a **microservice-based**, **event-driven** trading platform built for **real-time market analysis**, **strategy execution**, and **risk management**. The system follows a **service-oriented architecture (SOA)** with **clear separation of concerns** and **horizontal scalability**.

### Core Principles

- **Microservices Architecture**: Independent, deployable services
- **Event-Driven Communication**: WebSocket and Redis pub/sub
- **Real-Time Processing**: Sub-second latency requirements
- **Scalable Design**: Horizontal scaling capabilities
- **Fault Tolerance**: Circuit breakers and graceful degradation
- **Type Safety**: Full TypeScript implementation

---

## 🏗️ Current Architecture
```mermaid
graph TB
    subgraph "Frontend Layer"
        UI[Angular Trading Dashboard]
    end

    subgraph "API Gateway Layer"
        GW[API Gateway - Future]
    end

    subgraph "Core Services"
        MDG[Market Data Gateway<br/>Port 3001]
        RG[Risk Guardian<br/>Port 3002]
        SO[Strategy Orchestrator<br/>Port 4001]
    end

    subgraph "Data Layer"
        Redis[(Redis Cache)]
        PG[(PostgreSQL)]
        QDB[(QuestDB)]
        Mongo[(MongoDB)]
    end

    subgraph "External APIs"
        Alpha[Alpha Vantage]
        IEX[IEX Cloud]
        Yahoo[Yahoo Finance]
    end

    UI -->|WebSocket/HTTP| MDG
    UI -->|WebSocket/HTTP| RG
    UI -->|WebSocket/HTTP| SO

    MDG --> Redis
    MDG --> QDB
    MDG -->|Fetch Data| Alpha
    MDG -->|Fetch Data| IEX
    MDG -->|Fetch Data| Yahoo

    RG --> Redis
    RG --> PG

    SO --> Redis
    SO --> PG
    SO --> Mongo
```
---

## 🔧 Service Breakdown

### **1. Interface Services**

#### **Trading Dashboard** (`apps/interface-services/trading-dashboard`)
- **Framework**: Angular 20 + Angular Material + Tailwind CSS
- **Port**: 4200 (development)
- **Purpose**: Real-time trading interface and strategy management
- **Key Features**:
  - Real-time market data visualization
  - Strategy creation and backtesting UI
  - Risk management dashboard
  - Portfolio monitoring
  - WebSocket integration for live updates

**Current Structure:**
```
trading-dashboard/
├── src/
│   ├── app/
│   │   ├── components/              # Reusable UI components
│   │   │   ├── sidebar/             # Navigation sidebar
│   │   │   └── notifications/       # Alert system
│   │   ├── pages/                   # Route-based pages
│   │   │   ├── dashboard/           # Main trading dashboard
│   │   │   ├── market-data/         # Market data visualization
│   │   │   ├── portfolio/           # Portfolio management
│   │   │   ├── strategies/          # Strategy management
│   │   │   └── risk-management/     # Risk controls
│   │   ├── services/                # Angular services
│   │   │   ├── api.service.ts       # HTTP API communication
│   │   │   ├── websocket.service.ts # WebSocket management
│   │   │   └── strategy.service.ts  # Strategy operations
│   │   └── shared/                  # Shared utilities
│   └── styles.css                   # Global styles
```
### **2. Core Services**

#### **Market Data Gateway** (`apps/core-services/market-data-gateway`)
- **Framework**: Hono + Bun
- **Port**: 3001
- **Purpose**: Market data aggregation and real-time distribution
- **Database**: QuestDB (time-series), Redis (caching)

**Responsibilities:**
- Aggregate data from multiple market data providers
- Real-time WebSocket streaming to clients
- Historical data storage and retrieval
- Rate limiting and API management
- Data normalization and validation

**API Endpoints:**
```
GET  /health                    # Health check
GET  /api/market-data/:symbol   # Get latest market data
GET  /api/historical/:symbol    # Get historical data
WS   /ws                        # WebSocket for real-time data
```

#### **Risk Guardian** (`apps/core-services/risk-guardian`)
- **Framework**: Hono + Bun
- **Port**: 3002
- **Purpose**: Real-time risk monitoring and controls
- **Database**: PostgreSQL (persistent), Redis (real-time)

**Responsibilities:**
- Position size monitoring
- Daily loss tracking
- Portfolio risk assessment
- Volatility monitoring
- Real-time risk alerts
- Risk threshold management

**API Endpoints:**
```
GET  /health                    # Health check
GET  /api/risk/thresholds       # Get risk thresholds
PUT  /api/risk/thresholds       # Update risk thresholds
POST /api/risk/evaluate         # Evaluate position risk
GET  /api/risk/history          # Risk evaluation history
WS   /ws                        # WebSocket for risk alerts
```
#### **Strategy Orchestrator** (`apps/intelligence-services/strategy-orchestrator`)
- **Framework**: Hono + Bun
- **Port**: 4001
- **Purpose**: Strategy lifecycle management and execution
- **Database**: MongoDB (strategies), PostgreSQL (trades), Redis (signals)

**Responsibilities:**
- Strategy creation and management
- Backtesting engine (vectorized & event-based)
- Real-time strategy execution
- Signal generation and broadcasting
- Performance analytics
- Strategy optimization

**Current Structure:**
```
strategy-orchestrator/
├── src/
│   ├── core/
│   │   ├── backtesting/
│   │   │   ├── BacktestEngine.ts           # Main backtesting engine
│   │   │   ├── BacktestService.ts          # Backtesting service layer
│   │   │   ├── MarketDataFeed.ts           # Historical data provider
│   │   │   └── PerformanceAnalytics.ts     # Performance metrics
│   │   ├── execution/
│   │   │   └── StrategyExecutionService.ts # Real-time execution
│   │   ├── strategies/
│   │   │   ├── Strategy.ts                 # Base strategy interface
│   │   │   ├── StrategyRegistry.ts         # Strategy management
│   │   │   ├── BaseStrategy.ts             # Abstract base class
│   │   │   ├── VectorizedStrategy.ts       # Vectorized base class
│   │   │   ├── MovingAverageCrossover.ts   # MA strategy
│   │   │   └── MeanReversionStrategy.ts    # Mean reversion
│   │   └── indicators/
│   │       └── TechnicalIndicators.ts      # Technical analysis
│   ├── controllers/
│   │   └── StrategyController.ts           # API endpoints
│   └── index.ts                            # Main entry point
```

**API Endpoints:**
```
GET  /health                             # Health check
GET  /api/strategies                     # List strategies
POST /api/strategies                     # Create strategy
PUT  /api/strategies/:id                 # Update strategy
POST /api/strategies/:id/:action         # Start/stop/pause strategy
GET  /api/strategies/:id/signals         # Get strategy signals
POST /api/strategies/:id/backtest        # Run backtest
GET  /api/strategies/:id/performance     # Get performance metrics
WS   /ws                                 # WebSocket for strategy updates
```
### **3. Shared Packages**

#### **Shared Types** (`packages/shared-types`)
```typescript
export interface MarketData {
  symbol: string;
  price: number;
  volume: number;
  timestamp: Date;
  bid: number;
  ask: number;
}

export interface Strategy {
  id: string;
  name: string;
  symbols: string[];
  parameters: Record<string, any>;
  status: 'ACTIVE' | 'PAUSED' | 'STOPPED';
}

export interface BacktestResult {
  totalReturn: number;
  sharpeRatio: number;
  maxDrawdown: number;
  winRate: number;
  totalTrades: number;
}
```
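As an illustration of how the `BacktestResult` fields relate to each other, the sketch below computes them from an equity curve and a list of closed-trade P&Ls. `summarize` is a hypothetical helper for this document only, not part of the actual `packages/shared-types` API:

```typescript
// Illustrative only: derives the BacktestResult metrics above from an
// equity curve and closed-trade P&Ls. Not the real shared-types API.
interface BacktestResult {
  totalReturn: number;
  sharpeRatio: number;
  maxDrawdown: number;
  winRate: number;
  totalTrades: number;
}

function summarize(equity: number[], tradePnl: number[]): BacktestResult {
  // Per-period returns derived from the equity curve
  const returns = equity.slice(1).map((v, i) => v / equity[i] - 1);
  const mean = returns.reduce((a, b) => a + b, 0) / returns.length;
  const variance =
    returns.reduce((a, r) => a + (r - mean) ** 2, 0) / returns.length;
  const std = Math.sqrt(variance);

  // Max drawdown: largest peak-to-trough decline of the equity curve
  let peak = equity[0];
  let maxDrawdown = 0;
  for (const v of equity) {
    peak = Math.max(peak, v);
    maxDrawdown = Math.max(maxDrawdown, (peak - v) / peak);
  }

  return {
    totalReturn: equity[equity.length - 1] / equity[0] - 1,
    sharpeRatio: std === 0 ? 0 : mean / std, // per-period, unannualized
    maxDrawdown,
    winRate: tradePnl.filter((p) => p > 0).length / tradePnl.length,
    totalTrades: tradePnl.length,
  };
}

const r = summarize([100, 110, 99, 121], [10, -11, 22]);
console.log(r.totalTrades, r.maxDrawdown.toFixed(2)); // 3 0.10
```

Note the Sharpe ratio here is per-period; in practice it would be annualized by the bar frequency before being compared against the KPI targets later in this document.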
#### **Configuration** (`packages/config`)
```typescript
export const config = {
  redis: {
    host: process.env.REDIS_HOST || 'localhost',
    port: parseInt(process.env.REDIS_PORT || '6379', 10)
  },
  database: {
    postgres: process.env.POSTGRES_URL,
    questdb: process.env.QUESTDB_URL,
    mongodb: process.env.MONGODB_URL
  },
  marketData: {
    alphaVantageKey: process.env.ALPHA_VANTAGE_KEY,
    iexKey: process.env.IEX_KEY
  }
};
```

---
## 🔄 Data Flow

### **Real-Time Market Data Flow**
```mermaid
sequenceDiagram
    participant EXT as External APIs
    participant MDG as Market Data Gateway
    participant Redis as Redis Cache
    participant QDB as QuestDB
    participant UI as Trading Dashboard

    EXT->>MDG: Market data feed
    MDG->>Redis: Cache latest prices
    MDG->>QDB: Store historical data
    MDG->>UI: WebSocket broadcast
    UI->>UI: Update charts/tables
```

### **Strategy Execution Flow**
```mermaid
sequenceDiagram
    participant UI as Trading Dashboard
    participant SO as Strategy Orchestrator
    participant MDG as Market Data Gateway
    participant RG as Risk Guardian
    participant Redis as Redis

    UI->>SO: Start strategy
    SO->>MDG: Subscribe to market data
    MDG->>SO: Real-time price updates
    SO->>SO: Generate trading signals
    SO->>RG: Risk evaluation
    RG->>SO: Risk approval/rejection
    SO->>Redis: Store signals
    SO->>UI: WebSocket signal broadcast
```

---
## 💻 Technology Stack

### **Backend**
- **Runtime**: Bun (ultra-fast JavaScript runtime)
- **Web Framework**: Hono (lightweight, fast web framework)
- **Language**: TypeScript (type safety)
- **Build Tool**: Turbo (monorepo management)

### **Frontend**
- **Framework**: Angular 20 (latest stable)
- **UI Library**: Angular Material + Tailwind CSS
- **State Management**: Angular Signals (reactive programming)
- **WebSocket**: Native WebSocket API

### **Databases**
- **Time-Series**: QuestDB (market data storage)
- **Relational**: PostgreSQL (structured data)
- **Document**: MongoDB (strategy configurations)
- **Cache/Pub-Sub**: Redis (real-time data)

### **Infrastructure**
- **Containerization**: Docker + Docker Compose
- **Process Management**: PM2 (production)
- **Monitoring**: Built-in health checks
- **Development**: Hot reload, TypeScript compilation

---
## 🚀 Future Architecture

### **Phase 1: Enhanced Microservices (Q2 2025)**
```mermaid
graph TB
    subgraph "API Gateway Layer"
        GW[Kong/Envoy API Gateway]
        LB[Load Balancer]
    end

    subgraph "Authentication"
        AUTH[Auth Service<br/>JWT + OAuth]
    end

    subgraph "Core Services"
        MDG[Market Data Gateway]
        RG[Risk Guardian]
        SO[Strategy Orchestrator]
        OE[Order Execution Engine]
        NS[Notification Service]
    end

    subgraph "Analytics Services"
        BA[Backtest Analytics]
        PA[Performance Analytics]
        ML[ML Prediction Service]
    end

    subgraph "Message Queue"
        NATS[NATS/Apache Kafka]
    end
```

### **Phase 2: Machine Learning Integration (Q3 2025)**
- **ML Pipeline**: Python-based ML services
- **Feature Engineering**: Real-time feature computation
- **Model Training**: Automated model retraining
- **Prediction API**: Real-time predictions

### **Phase 3: Multi-Asset Support (Q4 2025)**
- **Crypto Trading**: Binance, Coinbase integration
- **Forex Trading**: OANDA, FXCM integration
- **Options Trading**: Interactive Brokers integration
- **Futures Trading**: CME, ICE integration

---
## 📈 Improvement Recommendations

### **1. High Priority Improvements**

#### **API Gateway Implementation**
Introduce Kong or Envoy to provide:
- Rate limiting per service
- Authentication/authorization
- Request/response transformation
- Circuit breaker patterns
- Load balancing

#### **Enhanced Error Handling**
```typescript
// Implement structured error handling:
interface ServiceError {
  code: string;
  message: string;
  service: string;
  timestamp: Date;
  correlationId: string;
}
```

#### **Comprehensive Logging**
```typescript
// Implement structured logging:
interface LogEntry {
  level: 'debug' | 'info' | 'warn' | 'error';
  service: string;
  message: string;
  metadata: Record<string, any>;
  timestamp: Date;
  correlationId: string;
}
```

### **2. Medium Priority Improvements**
#### **Database Optimization**
```sql
-- QuestDB optimizations for market data.
-- Note: QuestDB indexes SYMBOL columns inline (or via ALTER TABLE);
-- it does not support a standalone CREATE INDEX statement.
CREATE TABLE market_data (
    symbol SYMBOL INDEX,
    timestamp TIMESTAMP,
    price DOUBLE,
    volume DOUBLE,
    bid DOUBLE,
    ask DOUBLE
) TIMESTAMP(timestamp) PARTITION BY DAY;
```
#### **Caching Strategy**
```typescript
// Implement multi-layer caching:
interface CacheStrategy {
  L1: Map<string, any>; // In-memory cache
  L2: Redis;            // Distributed cache
  L3: Database;         // Persistent storage
}
```
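The read path implied by that layering can be sketched as a read-through lookup. `AsyncStore` below is a stand-in for the Redis and database clients (mocked as async key-value lookups), not the real driver APIs:

```typescript
// Read-through lookup across the three cache layers described above.
// AsyncStore is an illustrative stand-in for Redis/database clients.
type AsyncStore = {
  get: (key: string) => Promise<string | undefined>;
  set?: (key: string, value: string) => Promise<void>;
};

async function readThrough(
  key: string,
  l1: Map<string, string>, // process-local, fastest
  l2: AsyncStore,          // e.g. Redis
  l3: AsyncStore           // e.g. PostgreSQL/QuestDB
): Promise<string | undefined> {
  const hot = l1.get(key);
  if (hot !== undefined) return hot;

  // L2 hit: promote into L1 so the next read is local
  const warm = await l2.get(key);
  if (warm !== undefined) {
    l1.set(key, warm);
    return warm;
  }

  // L3 hit: backfill both cache layers
  const cold = await l3.get(key);
  if (cold !== undefined) {
    await l2.set?.(key, cold);
    l1.set(key, cold);
  }
  return cold;
}
```

The promotion-on-hit behavior is the key design choice: hot symbols migrate toward the fastest layer without any explicit warm-up step.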
#### **WebSocket Optimization**
```typescript
// Implement WebSocket connection pooling:
interface WSConnectionPool {
  connections: Map<string, WebSocket[]>;
  balancer: RoundRobinBalancer;
  heartbeat: HeartbeatManager;
}
```
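`RoundRobinBalancer` is only named above; a minimal version, kept generic so it needs no WebSocket dependency, could look like this (an illustrative sketch, not an existing class in the codebase):

```typescript
// Minimal round-robin balancer, as referenced by WSConnectionPool above.
// Generic over the connection type, so it is testable without sockets.
class RoundRobinBalancer<T> {
  private next = 0;

  constructor(private items: T[]) {}

  // Returns items in rotation, wrapping around at the end
  pick(): T {
    if (this.items.length === 0) throw new Error('no connections available');
    const item = this.items[this.next];
    this.next = (this.next + 1) % this.items.length;
    return item;
  }
}

const balancer = new RoundRobinBalancer(['conn-a', 'conn-b', 'conn-c']);
console.log(balancer.pick(), balancer.pick(), balancer.pick(), balancer.pick());
// conn-a conn-b conn-c conn-a
```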
### **3. Low Priority Improvements**

#### **Code Quality**
- Implement comprehensive unit tests (>90% coverage)
- Add integration tests for all services
- Implement E2E tests for critical user flows
- Add performance benchmarks

#### **Documentation**
- API documentation with OpenAPI/Swagger
- Developer onboarding guide
- Deployment runbooks
- Architecture decision records (ADRs)

---
## 🏗️ Deployment Architecture

### **Development Environment**
```yaml
# docker-compose.dev.yml
version: '3.8'
services:
  # Databases
  postgres:
    image: postgres:15
    ports: ["5432:5432"]

  redis:
    image: redis:7-alpine
    ports: ["6379:6379"]

  questdb:
    image: questdb/questdb:latest
    ports: ["9000:9000", "8812:8812"]

  mongodb:
    image: mongo:6
    ports: ["27017:27017"]

  # Services
  market-data-gateway:
    build: ./apps/core-services/market-data-gateway
    ports: ["3001:3001"]
    depends_on: [redis, questdb]

  risk-guardian:
    build: ./apps/core-services/risk-guardian
    ports: ["3002:3002"]
    depends_on: [postgres, redis]

  strategy-orchestrator:
    build: ./apps/intelligence-services/strategy-orchestrator
    ports: ["4001:4001"]
    depends_on: [mongodb, postgres, redis]

  trading-dashboard:
    build: ./apps/interface-services/trading-dashboard
    ports: ["4200:4200"]
```
### **Production Environment**
```yaml
# kubernetes deployment example
apiVersion: apps/v1
kind: Deployment
metadata:
  name: market-data-gateway
spec:
  replicas: 3
  selector:
    matchLabels:
      app: market-data-gateway
  template:
    metadata:
      labels:
        app: market-data-gateway
    spec:
      containers:
        - name: market-data-gateway
          image: stockbot/market-data-gateway:latest
          ports:
            - containerPort: 3001
          resources:
            requests:
              memory: "256Mi"
              cpu: "250m"
            limits:
              memory: "512Mi"
              cpu: "500m"
```

---
## 🔒 Security Architecture

### **Authentication & Authorization**
```typescript
// JWT-based authentication
interface AuthToken {
  userId: string;
  roles: string[];
  permissions: string[];
  expiresAt: Date;
}

// Role-based access control
enum UserRole {
  TRADER = 'TRADER',
  ADMIN = 'ADMIN',
  VIEWER = 'VIEWER'
}

enum Permission {
  READ_MARKET_DATA = 'READ_MARKET_DATA',
  EXECUTE_TRADES = 'EXECUTE_TRADES',
  MANAGE_STRATEGIES = 'MANAGE_STRATEGIES',
  CONFIGURE_RISK = 'CONFIGURE_RISK'
}
```
### **API Security**
```typescript
// Rate limiting configuration
interface RateLimit {
  windowMs: number;    // 15 minutes
  maxRequests: number; // 100 requests per window
  skipIf: (req: Request) => boolean;
}

// Input validation
interface ApiValidation {
  schema: JSONSchema;
  sanitization: SanitizationRules;
  authentication: AuthenticationMiddleware;
}
```
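A minimal fixed-window limiter matching the `RateLimit` shape above can be sketched as follows; only the counting logic is shown, and wiring it into Hono middleware is left out. The class name is illustrative, not an existing implementation:

```typescript
// Fixed-window rate limiter implementing the RateLimit idea above.
// Counts requests per key (e.g. client IP) inside a time window.
class FixedWindowLimiter {
  private counts = new Map<string, { windowStart: number; count: number }>();

  constructor(private windowMs: number, private maxRequests: number) {}

  // Returns true if the request is allowed, false if rate-limited
  allow(key: string, now: number = Date.now()): boolean {
    const entry = this.counts.get(key);
    if (!entry || now - entry.windowStart >= this.windowMs) {
      // First request in a fresh window resets the counter
      this.counts.set(key, { windowStart: now, count: 1 });
      return true;
    }
    if (entry.count >= this.maxRequests) return false;
    entry.count++;
    return true;
  }
}

const limiter = new FixedWindowLimiter(15 * 60 * 1000, 2);
console.log(limiter.allow('1.2.3.4'), limiter.allow('1.2.3.4'), limiter.allow('1.2.3.4'));
// true true false
```

A sliding-window or token-bucket variant avoids the burst at window boundaries, at the cost of storing per-request timestamps.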
### **Data Security**
- **Encryption at Rest**: AES-256 for sensitive data
- **Encryption in Transit**: TLS 1.3 for all communications
- **Secrets Management**: Kubernetes secrets or HashiCorp Vault
- **Network Security**: VPC, security groups, firewalls

---
## 📊 Monitoring & Observability

### **Metrics Collection**
```typescript
interface ServiceMetrics {
  // Performance metrics
  requestLatency: Histogram;
  requestRate: Counter;
  errorRate: Counter;

  // Business metrics
  tradesExecuted: Counter;
  strategiesActive: Gauge;
  portfolioValue: Gauge;

  // System metrics
  memoryUsage: Gauge;
  cpuUsage: Gauge;
  dbConnections: Gauge;
}
```

### **Health Checks**
```typescript
interface HealthCheck {
  service: string;
  status: 'healthy' | 'degraded' | 'unhealthy';
  checks: {
    database: boolean;
    redis: boolean;
    externalApis: boolean;
    webSocket: boolean;
  };
  uptime: number;
  version: string;
}
```

### **Alerting Rules**
```yaml
# Prometheus alerting rules
groups:
  - name: stockbot
    rules:
      - alert: HighErrorRate
        expr: rate(http_requests_total{status=~"5.."}[5m]) > 0.1
        for: 2m

      - alert: HighLatency
        expr: http_request_duration_seconds{quantile="0.95"} > 1
        for: 2m

      - alert: ServiceDown
        expr: up{job="stockbot"} == 0
        for: 30s
```

---
## 📋 Migration Plan

### **Phase 1: Current → Enhanced (1-2 months)**
1. **Week 1-2**: Implement API Gateway and authentication
2. **Week 3-4**: Add comprehensive logging and monitoring
3. **Week 5-6**: Enhance error handling and resilience
4. **Week 7-8**: Performance optimization and testing

### **Phase 2: Enhanced → ML-Ready (2-3 months)**
1. **Month 1**: Implement ML pipeline infrastructure
2. **Month 2**: Develop feature engineering services
3. **Month 3**: Integrate ML predictions into strategies

### **Phase 3: ML-Ready → Multi-Asset (3-4 months)**
1. **Month 1**: Abstract market data interfaces
2. **Month 2**: Implement crypto trading support
3. **Month 3**: Add forex and options trading
4. **Month 4**: Performance optimization and testing

---
## 🎯 Success Metrics

### **Technical KPIs**
- **Latency**: < 100ms for market data updates
- **Throughput**: > 10,000 requests/second
- **Availability**: 99.9% uptime
- **Error Rate**: < 0.1% of requests

### **Business KPIs**
- **Strategy Performance**: Sharpe ratio > 1.5
- **Risk Management**: Max drawdown < 5%
- **Execution Quality**: Slippage < 0.01%
- **System Adoption**: > 90% user satisfaction

---

This architecture document serves as a living blueprint for the Stock Bot Trading System, providing clear guidance for current development and future scaling decisions.
62 REFACTORING.md (Normal file)
@ -0,0 +1,62 @@
# Stock Bot Project Structure Refactoring

This document outlines the changes made to improve separation of concerns in the stock-bot project architecture.

## Directory Structure Changes

1. Created a dedicated `libs/` directory (replacing the previous `packages/` approach)
2. Split monolithic type definitions into domain-specific modules
3. Created specialized libraries for cross-cutting concerns

## New Libraries

### 1. `@stock-bot/shared-types`
Domain-specific type definitions organized by functional area:
- `market/` - Market data structures (OHLCV, OrderBook, etc.)
- `trading/` - Trading types (Orders, Positions, etc.)
- `strategy/` - Strategy and signal types
- `events/` - Event definitions for the event bus
- `api/` - Common API request/response types
- `config/` - Configuration type definitions
### 2. `@stock-bot/event-bus`
A Redis-based event bus implementation for inter-service communication:
- Publish/subscribe pattern for asynchronous messaging
- Support for typed events
- Reliable message delivery
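The "typed events" idea can be sketched without Redis. In the real library the transport would be Redis pub/sub; the in-memory handler map and the names below are illustrative, not the actual `@stock-bot/event-bus` API:

```typescript
// Illustrative typing pattern for a typed event bus. The actual
// @stock-bot/event-bus would back this with Redis pub/sub instead of
// the in-memory handler map used here.
interface EventMap {
  'market.tick': { symbol: string; price: number };
  'strategy.signal': { strategyId: string; action: 'BUY' | 'SELL' };
}

class TypedEventBus {
  private handlers = new Map<string, Array<(payload: unknown) => void>>();

  subscribe<K extends keyof EventMap>(
    event: K,
    handler: (payload: EventMap[K]) => void
  ): void {
    const list = this.handlers.get(event) ?? [];
    list.push(handler as (payload: unknown) => void);
    this.handlers.set(event, list);
  }

  publish<K extends keyof EventMap>(event: K, payload: EventMap[K]): void {
    for (const handler of this.handlers.get(event) ?? []) {
      handler(payload);
    }
  }
}

const bus = new TypedEventBus();
bus.subscribe('market.tick', (tick) => {
  // tick is fully typed as { symbol: string; price: number }
  console.log(`${tick.symbol} @ ${tick.price}`);
});
bus.publish('market.tick', { symbol: 'AAPL', price: 190.12 });
```

The `EventMap` interface is the load-bearing part: publishing a payload that does not match the event name fails at compile time, which is what "support for typed events" buys over raw Redis channels.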
### 3. `@stock-bot/utils`
Common utility functions shared across services:
- `dateUtils` - Date/time helpers for market data
- `financialUtils` - Financial calculations (Sharpe ratio, drawdown)
- `logger` - Standardized logging service
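As a hypothetical example of the kind of helper `dateUtils` contains (the function name is invented for this document and ignores exchange holidays):

```typescript
// Hypothetical dateUtils-style helper: step back to the previous
// weekday. Skips weekends only; exchange holidays are not handled.
function previousTradingDay(date: Date): Date {
  const d = new Date(date);
  do {
    d.setUTCDate(d.getUTCDate() - 1);
  } while (d.getUTCDay() === 0 || d.getUTCDay() === 6); // Sun=0, Sat=6
  return d;
}

// Monday 2025-01-06 -> Friday 2025-01-03
console.log(previousTradingDay(new Date(Date.UTC(2025, 0, 6))).getUTCDate()); // 3
```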
### 4. `@stock-bot/api-client`
Type-safe API clients for inter-service communication:
- `BaseApiClient` - Common HTTP client functionality
- Service-specific clients (BacktestClient, StrategyClient)

## Benefits of the New Architecture

1. **Better Separation of Concerns**
   - Each library has a clear, focused purpose
   - Domain types are logically grouped

2. **Improved Maintainability**
   - Smaller, focused modules
   - Clear dependencies between modules

3. **Type Safety**
   - Consistent types across the entire system
   - Better IDE autocompletion and error checking

4. **Reduced Duplication**
   - Shared utilities in a central location
   - Consistent implementation of common patterns

## Next Steps

1. Update service implementations to use the new libraries
2. Migrate from the old packages directory to the new libs structure
3. Implement domain-specific validations in each type module
4. Add unit tests for each library
40 apps/data-services/data-catalog/package.json (Normal file)
@ -0,0 +1,40 @@
{
  "name": "@stock-bot/data-catalog",
  "version": "1.0.0",
  "private": true,
  "description": "Data catalog and discovery service for stock-bot",
  "type": "module",
  "main": "dist/index.js",
  "scripts": {
    "dev": "bun run --hot src/index.ts",
    "build": "tsc",
    "start": "node dist/index.js",
    "test": "bun test",
    "type-check": "tsc --noEmit"
  },
  "dependencies": {
    "@stock-bot/shared-types": "workspace:*",
    "@stock-bot/utils": "workspace:*",
    "@stock-bot/event-bus": "workspace:*",
    "@stock-bot/api-client": "workspace:*",
    "hono": "^4.0.0",
    "zod": "^3.22.0",
    "elasticsearch": "^16.7.3",
    "neo4j-driver": "^5.15.0",
    "cron": "^3.1.6",
    "uuid": "^9.0.1"
  },
  "devDependencies": {
    "@types/uuid": "^9.0.8",
    "@types/cron": "^2.4.0",
    "@types/node": "^20.0.0",
    "typescript": "^5.3.0"
  },
  "keywords": [
    "data-catalog",
    "data-discovery",
    "data-lineage",
    "metadata",
    "stock-bot"
  ]
}
@ -0,0 +1,360 @@
import { Context } from 'hono';
import { Logger } from '@stock-bot/utils';
import { DataCatalogService } from '../services/DataCatalogService';
import {
  CreateDataAssetRequest,
  UpdateDataAssetRequest,
  DataAssetType,
  DataClassification
} from '../types/DataCatalog';

export class DataCatalogController {
  constructor(
    private dataCatalogService: DataCatalogService,
    private logger: Logger
  ) {}

  async createAsset(c: Context) {
    try {
      const request: CreateDataAssetRequest = await c.req.json();

      // Validate required fields
      if (!request.name || !request.type || !request.description || !request.owner) {
        return c.json({ error: 'Missing required fields: name, type, description, owner' }, 400);
      }

      const asset = await this.dataCatalogService.createAsset(request);

      this.logger.info('Asset created via API', {
        assetId: asset.id,
        name: asset.name,
        type: asset.type
      });

      return c.json(asset, 201);
    } catch (error) {
      this.logger.error('Failed to create asset', { error });
      return c.json({ error: 'Internal server error' }, 500);
    }
  }

  async getAsset(c: Context) {
    try {
      const assetId = c.req.param('id');

      if (!assetId) {
        return c.json({ error: 'Asset ID is required' }, 400);
      }

      const asset = await this.dataCatalogService.getAsset(assetId);

      if (!asset) {
        return c.json({ error: 'Asset not found' }, 404);
      }

      return c.json(asset);
    } catch (error) {
      this.logger.error('Failed to get asset', { error });
      return c.json({ error: 'Internal server error' }, 500);
    }
  }

  async updateAsset(c: Context) {
    try {
      const assetId = c.req.param('id');
      const updates: UpdateDataAssetRequest = await c.req.json();

      if (!assetId) {
        return c.json({ error: 'Asset ID is required' }, 400);
      }

      const asset = await this.dataCatalogService.updateAsset(assetId, updates);

      if (!asset) {
        return c.json({ error: 'Asset not found' }, 404);
      }

      this.logger.info('Asset updated via API', {
        assetId,
        changes: Object.keys(updates)
      });

      return c.json(asset);
    } catch (error) {
      this.logger.error('Failed to update asset', { error });
      return c.json({ error: 'Internal server error' }, 500);
    }
  }

  async deleteAsset(c: Context) {
    try {
      const assetId = c.req.param('id');

      if (!assetId) {
        return c.json({ error: 'Asset ID is required' }, 400);
      }

      await this.dataCatalogService.deleteAsset(assetId);

      this.logger.info('Asset deleted via API', { assetId });

      return c.json({ message: 'Asset deleted successfully' });
    } catch (error) {
      this.logger.error('Failed to delete asset', { error });

      if (error instanceof Error && error.message.includes('not found')) {
        return c.json({ error: 'Asset not found' }, 404);
      }

      return c.json({ error: 'Internal server error' }, 500);
    }
  }

  async listAssets(c: Context) {
    try {
      const query = c.req.query();
      const filters: Record<string, any> = {};

      // Parse query parameters
      if (query.type) filters.type = query.type;
      if (query.owner) filters.owner = query.owner;
      if (query.classification) filters.classification = query.classification;
      if (query.tags) {
        filters.tags = Array.isArray(query.tags) ? query.tags : [query.tags];
      }

      const assets = await this.dataCatalogService.listAssets(filters);

      return c.json({
        assets,
        total: assets.length,
        filters: filters
      });
    } catch (error) {
      this.logger.error('Failed to list assets', { error });
      return c.json({ error: 'Internal server error' }, 500);
    }
  }

  async searchAssets(c: Context) {
    try {
      const query = c.req.query('q');
      const queryParams = c.req.query();

      if (!query) {
        return c.json({ error: 'Search query is required' }, 400);
      }

      const filters: Record<string, any> = {};
      if (queryParams.type) filters.type = queryParams.type;
      if (queryParams.owner) filters.owner = queryParams.owner;
      if (queryParams.classification) filters.classification = queryParams.classification;

      const assets = await this.dataCatalogService.searchAssets(query, filters);

      this.logger.info('Asset search performed', {
        query,
        filters,
        resultCount: assets.length
      });

      return c.json({
        assets,
        total: assets.length,
        query,
        filters
      });
    } catch (error) {
      this.logger.error('Failed to search assets', { error });
      return c.json({ error: 'Internal server error' }, 500);
    }
  }

  async getAssetsByOwner(c: Context) {
    try {
      const owner = c.req.param('owner');

      if (!owner) {
        return c.json({ error: 'Owner is required' }, 400);
      }

      const assets = await this.dataCatalogService.getAssetsByOwner(owner);

      return c.json({
        assets,
        total: assets.length,
        owner
      });
    } catch (error) {
      this.logger.error('Failed to get assets by owner', { error });
      return c.json({ error: 'Internal server error' }, 500);
    }
  }

  async getAssetsByType(c: Context) {
    try {
      const type = c.req.param('type') as DataAssetType;

      if (!type) {
        return c.json({ error: 'Asset type is required' }, 400);
      }

      if (!Object.values(DataAssetType).includes(type)) {
        return c.json({ error: 'Invalid asset type' }, 400);
      }

      const assets = await this.dataCatalogService.getAssetsByType(type);

      return c.json({
        assets,
        total: assets.length,
        type
      });
    } catch (error) {
      this.logger.error('Failed to get assets by type', { error });
      return c.json({ error: 'Internal server error' }, 500);
    }
  }

  async getAssetsByClassification(c: Context) {
    try {
      const classification = c.req.param('classification') as DataClassification;

      if (!classification) {
        return c.json({ error: 'Classification is required' }, 400);
      }

      if (!Object.values(DataClassification).includes(classification)) {
        return c.json({ error: 'Invalid classification' }, 400);
      }

      const assets = await this.dataCatalogService.getAssetsByClassification(classification);

      return c.json({
        assets,
        total: assets.length,
        classification
      });
    } catch (error) {
      this.logger.error('Failed to get assets by classification', { error });
      return c.json({ error: 'Internal server error' }, 500);
    }
  }

  async getAssetsByTags(c: Context) {
    try {
      const tagsParam = c.req.query('tags');

      if (!tagsParam) {
        return c.json({ error: 'Tags parameter is required' }, 400);
      }

      const tags = Array.isArray(tagsParam) ? tagsParam : [tagsParam];
      const assets = await this.dataCatalogService.getAssetsByTags(tags);

      return c.json({
        assets,
        total: assets.length,
        tags
      });
    } catch (error) {
      this.logger.error('Failed to get assets by tags', { error });
      return c.json({ error: 'Internal server error' }, 500);
    }
  }

  async getAssetMetrics(c: Context) {
|
||||
try {
|
||||
const assetId = c.req.param('id');
|
||||
|
||||
if (!assetId) {
|
||||
return c.json({ error: 'Asset ID is required' }, 400);
|
||||
}
|
||||
|
||||
const asset = await this.dataCatalogService.getAsset(assetId);
|
||||
|
||||
if (!asset) {
|
||||
return c.json({ error: 'Asset not found' }, 404);
|
||||
}
|
||||
|
||||
const metrics = {
|
||||
id: asset.id,
|
||||
name: asset.name,
|
||||
type: asset.type,
|
||||
classification: asset.classification,
|
||||
usage: {
|
||||
accessCount: asset.usage.accessCount,
|
||||
uniqueUsers: asset.usage.uniqueUsers,
|
||||
lastAccessed: asset.usage.lastAccessed,
|
||||
usageTrend: asset.usage.usageTrend
|
||||
},
|
||||
quality: {
|
||||
overallScore: asset.quality.overallScore,
|
||||
lastAssessment: asset.quality.lastAssessment,
|
||||
issueCount: asset.quality.issues.filter(issue => !issue.resolved).length
|
||||
},
|
||||
governance: {
|
||||
policiesApplied: asset.governance.policies.length,
|
||||
complianceStatus: asset.governance.compliance.every(c => c.status === 'passed') ? 'compliant' : 'non-compliant',
|
||||
auditEntries: asset.governance.audit.length
|
||||
},
|
||||
lineage: {
|
||||
upstreamCount: asset.lineage.upstreamAssets.length,
|
||||
downstreamCount: asset.lineage.downstreamAssets.length
|
||||
}
|
||||
};
|
||||
|
||||
return c.json(metrics);
|
||||
} catch (error) {
|
||||
this.logger.error('Failed to get asset metrics', { error });
|
||||
return c.json({ error: 'Internal server error' }, 500);
|
||||
}
|
||||
}
|
||||
|
||||
async getCatalogStatistics(c: Context) {
|
||||
try {
|
||||
const allAssets = await this.dataCatalogService.listAssets();
|
||||
|
||||
const statistics = {
|
||||
totalAssets: allAssets.length,
|
||||
assetsByType: this.groupByProperty(allAssets, 'type'),
|
||||
assetsByClassification: this.groupByProperty(allAssets, 'classification'),
|
||||
assetsByOwner: this.groupByProperty(allAssets, 'owner'),
|
||||
recentAssets: allAssets
|
||||
.sort((a, b) => b.createdAt.getTime() - a.createdAt.getTime())
|
||||
.slice(0, 10)
|
||||
.map(asset => ({
|
||||
id: asset.id,
|
||||
name: asset.name,
|
||||
type: asset.type,
|
||||
owner: asset.owner,
|
||||
createdAt: asset.createdAt
|
||||
})),
|
||||
mostAccessed: allAssets
|
||||
.sort((a, b) => b.usage.accessCount - a.usage.accessCount)
|
||||
.slice(0, 10)
|
||||
.map(asset => ({
|
||||
id: asset.id,
|
||||
name: asset.name,
|
||||
type: asset.type,
|
||||
accessCount: asset.usage.accessCount,
|
||||
lastAccessed: asset.usage.lastAccessed
|
||||
}))
|
||||
};
|
||||
|
||||
return c.json(statistics);
|
||||
} catch (error) {
|
||||
this.logger.error('Failed to get catalog statistics', { error });
|
||||
return c.json({ error: 'Internal server error' }, 500);
|
||||
}
|
||||
}
|
||||
|
||||
// Helper method to group assets by property
|
||||
private groupByProperty(assets: any[], property: string): Record<string, number> {
|
||||
return assets.reduce((acc, asset) => {
|
||||
const value = asset[property];
|
||||
acc[value] = (acc[value] || 0) + 1;
|
||||
return acc;
|
||||
}, {});
|
||||
}
|
||||
}
|
||||
|
|
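The `groupByProperty` helper above folds an asset list into per-value counts for the catalog statistics. A standalone sketch, typed generically rather than with `any` (the generic signature is an assumption about the intended shapes, not the committed code):

```typescript
// Count how many items share each value of a given property.
// Generic take on the controller's private groupByProperty helper.
function groupByProperty<T extends Record<string, unknown>>(
  items: T[],
  property: keyof T
): Record<string, number> {
  return items.reduce<Record<string, number>>((acc, item) => {
    const value = String(item[property]);
    acc[value] = (acc[value] ?? 0) + 1;
    return acc;
  }, {});
}

// Example: counting catalog assets by type.
const sample = [
  { name: 'orders', type: 'table' },
  { name: 'clicks', type: 'stream' },
  { name: 'users', type: 'table' },
];
const byType = groupByProperty(sample, 'type'); // { table: 2, stream: 1 }
```

Because each reduction key is stringified, the same helper works for any property (`type`, `classification`, `owner`) without per-field code.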
@@ -0,0 +1,414 @@
import { Hono } from 'hono';
import { DataGovernanceService } from '../services/DataGovernanceService';
import {
  GovernancePolicy,
  ComplianceCheck,
  AccessRequest,
  DataSubjectRequest,
  AuditLog
} from '../types/DataCatalog';

export class GovernanceController {
  private app: Hono;
  private governanceService: DataGovernanceService;

  constructor() {
    this.app = new Hono();
    this.governanceService = new DataGovernanceService();
    this.setupRoutes();
  }

  private setupRoutes() {
    // Create governance policy
    this.app.post('/policies', async (c) => {
      try {
        const policy: Omit<GovernancePolicy, 'id' | 'createdAt' | 'updatedAt'> = await c.req.json();
        const createdPolicy = await this.governanceService.createPolicy(policy);

        return c.json({
          success: true,
          data: createdPolicy
        });
      } catch (error) {
        console.error('Error creating governance policy:', error);
        return c.json({
          success: false,
          error: error instanceof Error ? error.message : 'Unknown error'
        }, 500);
      }
    });

    // Get governance policies
    this.app.get('/policies', async (c) => {
      try {
        const type = c.req.query('type');
        const category = c.req.query('category');
        const activeParam = c.req.query('active');

        const filters: any = {};
        if (type) filters.type = type;
        if (category) filters.category = category;
        // Only filter on `active` when the param is present, so ?active=false
        // is honoured instead of being silently dropped.
        if (activeParam !== undefined) filters.active = activeParam === 'true';

        const policies = await this.governanceService.getPolicies(filters);

        return c.json({
          success: true,
          data: policies
        });
      } catch (error) {
        console.error('Error getting governance policies:', error);
        return c.json({
          success: false,
          error: error instanceof Error ? error.message : 'Unknown error'
        }, 500);
      }
    });

    // Update governance policy
    this.app.put('/policies/:policyId', async (c) => {
      try {
        const policyId = c.req.param('policyId');
        const updates: Partial<GovernancePolicy> = await c.req.json();

        const updatedPolicy = await this.governanceService.updatePolicy(policyId, updates);

        return c.json({
          success: true,
          data: updatedPolicy
        });
      } catch (error) {
        console.error('Error updating governance policy:', error);
        return c.json({
          success: false,
          error: error instanceof Error ? error.message : 'Unknown error'
        }, 500);
      }
    });

    // Delete governance policy
    this.app.delete('/policies/:policyId', async (c) => {
      try {
        const policyId = c.req.param('policyId');
        await this.governanceService.deletePolicy(policyId);

        return c.json({
          success: true,
          message: 'Governance policy deleted successfully'
        });
      } catch (error) {
        console.error('Error deleting governance policy:', error);
        return c.json({
          success: false,
          error: error instanceof Error ? error.message : 'Unknown error'
        }, 500);
      }
    });

    // Apply policy to asset
    this.app.post('/policies/:policyId/apply/:assetId', async (c) => {
      try {
        const policyId = c.req.param('policyId');
        const assetId = c.req.param('assetId');

        await this.governanceService.applyPolicy(policyId, assetId);

        return c.json({
          success: true,
          message: 'Policy applied successfully'
        });
      } catch (error) {
        console.error('Error applying policy:', error);
        return c.json({
          success: false,
          error: error instanceof Error ? error.message : 'Unknown error'
        }, 500);
      }
    });

    // Check compliance for asset
    this.app.post('/compliance/check', async (c) => {
      try {
        const request: { assetId: string; policyIds?: string[] } = await c.req.json();
        const complianceResult = await this.governanceService.checkCompliance(
          request.assetId,
          request.policyIds
        );

        return c.json({
          success: true,
          data: complianceResult
        });
      } catch (error) {
        console.error('Error checking compliance:', error);
        return c.json({
          success: false,
          error: error instanceof Error ? error.message : 'Unknown error'
        }, 500);
      }
    });

    // Get compliance violations
    this.app.get('/compliance/violations', async (c) => {
      try {
        const assetId = c.req.query('assetId');
        const severity = c.req.query('severity');
        const status = c.req.query('status');
        const limit = c.req.query('limit') ? parseInt(c.req.query('limit')!, 10) : 100;
        const offset = c.req.query('offset') ? parseInt(c.req.query('offset')!, 10) : 0;

        const filters: any = {};
        if (assetId) filters.assetId = assetId;
        if (severity) filters.severity = severity;
        if (status) filters.status = status;

        const violations = await this.governanceService.getComplianceViolations(
          filters,
          { limit, offset }
        );

        return c.json({
          success: true,
          data: violations
        });
      } catch (error) {
        console.error('Error getting compliance violations:', error);
        return c.json({
          success: false,
          error: error instanceof Error ? error.message : 'Unknown error'
        }, 500);
      }
    });

    // Request access to asset
    this.app.post('/access/request', async (c) => {
      try {
        const request: Omit<AccessRequest, 'id' | 'requestedAt' | 'status'> = await c.req.json();
        const accessRequest = await this.governanceService.requestAccess(request);

        return c.json({
          success: true,
          data: accessRequest
        });
      } catch (error) {
        console.error('Error requesting access:', error);
        return c.json({
          success: false,
          error: error instanceof Error ? error.message : 'Unknown error'
        }, 500);
      }
    });

    // Approve/deny access request
    this.app.patch('/access/:requestId', async (c) => {
      try {
        const requestId = c.req.param('requestId');
        const { action, reviewedBy, reviewComments } = await c.req.json();

        const updatedRequest = await this.governanceService.reviewAccessRequest(
          requestId,
          action,
          reviewedBy,
          reviewComments
        );

        return c.json({
          success: true,
          data: updatedRequest
        });
      } catch (error) {
        console.error('Error reviewing access request:', error);
        return c.json({
          success: false,
          error: error instanceof Error ? error.message : 'Unknown error'
        }, 500);
      }
    });

    // Check access authorization
    this.app.post('/access/check', async (c) => {
      try {
        const { userId, assetId, action } = await c.req.json();
        const authorized = await this.governanceService.checkAccess(userId, assetId, action);

        return c.json({
          success: true,
          data: {
            userId,
            assetId,
            action,
            authorized
          }
        });
      } catch (error) {
        console.error('Error checking access:', error);
        return c.json({
          success: false,
          error: error instanceof Error ? error.message : 'Unknown error'
        }, 500);
      }
    });

    // Handle data subject request (GDPR)
    this.app.post('/privacy/subject-request', async (c) => {
      try {
        const request: Omit<DataSubjectRequest, 'id' | 'submittedAt' | 'status'> = await c.req.json();
        const subjectRequest = await this.governanceService.handleDataSubjectRequest(request);

        return c.json({
          success: true,
          data: subjectRequest
        });
      } catch (error) {
        console.error('Error handling data subject request:', error);
        return c.json({
          success: false,
          error: error instanceof Error ? error.message : 'Unknown error'
        }, 500);
      }
    });

    // Anonymize asset data
    this.app.post('/privacy/anonymize/:assetId', async (c) => {
      try {
        const assetId = c.req.param('assetId');
        const { fields, method, requestedBy } = await c.req.json();

        const result = await this.governanceService.anonymizeData(
          assetId,
          fields,
          method,
          requestedBy
        );

        return c.json({
          success: true,
          data: result
        });
      } catch (error) {
        console.error('Error anonymizing data:', error);
        return c.json({
          success: false,
          error: error instanceof Error ? error.message : 'Unknown error'
        }, 500);
      }
    });

    // Get audit logs
    this.app.get('/audit/logs', async (c) => {
      try {
        const assetId = c.req.query('assetId');
        const userId = c.req.query('userId');
        const action = c.req.query('action');
        const startDate = c.req.query('startDate');
        const endDate = c.req.query('endDate');
        const limit = c.req.query('limit') ? parseInt(c.req.query('limit')!, 10) : 100;
        const offset = c.req.query('offset') ? parseInt(c.req.query('offset')!, 10) : 0;

        const filters: any = {};
        if (assetId) filters.assetId = assetId;
        if (userId) filters.userId = userId;
        if (action) filters.action = action;
        if (startDate) filters.startDate = new Date(startDate);
        if (endDate) filters.endDate = new Date(endDate);

        const logs = await this.governanceService.getAuditLogs(filters, { limit, offset });

        return c.json({
          success: true,
          data: logs
        });
      } catch (error) {
        console.error('Error getting audit logs:', error);
        return c.json({
          success: false,
          error: error instanceof Error ? error.message : 'Unknown error'
        }, 500);
      }
    });

    // Log access event
    this.app.post('/audit/log', async (c) => {
      try {
        const logEntry: Omit<AuditLog, 'id' | 'timestamp'> = await c.req.json();
        const logged = await this.governanceService.logAccess(logEntry);

        return c.json({
          success: true,
          data: logged
        });
      } catch (error) {
        console.error('Error logging access event:', error);
        return c.json({
          success: false,
          error: error instanceof Error ? error.message : 'Unknown error'
        }, 500);
      }
    });

    // Get retention policies
    this.app.get('/retention/policies', async (c) => {
      try {
        const assetType = c.req.query('assetType');
        const policies = await this.governanceService.getRetentionPolicies(assetType);

        return c.json({
          success: true,
          data: policies
        });
      } catch (error) {
        console.error('Error getting retention policies:', error);
        return c.json({
          success: false,
          error: error instanceof Error ? error.message : 'Unknown error'
        }, 500);
      }
    });

    // Apply retention policy
    this.app.post('/retention/apply', async (c) => {
      try {
        const { assetId, policyId, requestedBy } = await c.req.json();
        const result = await this.governanceService.applyRetentionPolicy(
          assetId,
          policyId,
          requestedBy
        );

        return c.json({
          success: true,
          data: result
        });
      } catch (error) {
        console.error('Error applying retention policy:', error);
        return c.json({
          success: false,
          error: error instanceof Error ? error.message : 'Unknown error'
        }, 500);
      }
    });

    // Get governance metrics
    this.app.get('/metrics', async (c) => {
      try {
        const timeRange = c.req.query('timeRange') || '30d';
        const metrics = await this.governanceService.getGovernanceMetrics(timeRange);

        return c.json({
          success: true,
          data: metrics
        });
      } catch (error) {
        console.error('Error getting governance metrics:', error);
        return c.json({
          success: false,
          error: error instanceof Error ? error.message : 'Unknown error'
        }, 500);
      }
    });
  }

  public getApp(): Hono {
    return this.app;
  }
}
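The policy-listing route has to distinguish "`active` was never supplied" from "`active=false` was supplied explicitly"; comparing against `'true'` up front collapses both into a boolean and loses that distinction. A minimal sketch of the parsing pattern (the helper name `parseOptionalBoolean` is hypothetical, not part of the committed code):

```typescript
// Hypothetical helper: keep "param absent" and "param false" distinct,
// so an explicit ?active=false filters inactive policies rather than
// being dropped from the filter set.
function parseOptionalBoolean(value: string | undefined): boolean | undefined {
  if (value === undefined) return undefined;
  return value === 'true';
}

// Simulating the route's filter construction for three request shapes.
function buildFilters(activeQuery: string | undefined): { active?: boolean } {
  const filters: { active?: boolean } = {};
  const active = parseOptionalBoolean(activeQuery);
  if (active !== undefined) filters.active = active;
  return filters;
}

const explicitFalse = buildFilters('false');  // { active: false }
const explicitTrue = buildFilters('true');    // { active: true }
const absent = buildFilters(undefined);       // {}
```

The same pattern applies to any optional boolean query parameter, such as the `active` filter on the quality-rules route.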
@@ -0,0 +1,172 @@
import { Hono } from 'hono';

export class HealthController {
  private app: Hono;

  constructor() {
    this.app = new Hono();
    this.setupRoutes();
  }

  private setupRoutes() {
    // Basic health check
    this.app.get('/', async (c) => {
      return c.json({
        service: 'data-catalog',
        status: 'healthy',
        timestamp: new Date().toISOString(),
        version: process.env.SERVICE_VERSION || '1.0.0'
      });
    });

    // Detailed health check
    this.app.get('/detailed', async (c) => {
      try {
        const healthStatus = {
          service: 'data-catalog',
          status: 'healthy',
          timestamp: new Date().toISOString(),
          version: process.env.SERVICE_VERSION || '1.0.0',
          uptime: process.uptime(),
          memory: process.memoryUsage(),
          dependencies: {
            database: await this.checkDatabase(),
            search: await this.checkSearchService(),
            eventBus: await this.checkEventBus()
          }
        };

        // Determine overall status based on dependencies
        const hasUnhealthyDependencies = Object.values(healthStatus.dependencies)
          .some(dep => dep.status !== 'healthy');

        if (hasUnhealthyDependencies) {
          healthStatus.status = 'degraded';
        }

        const statusCode = healthStatus.status === 'healthy' ? 200 : 503;
        return c.json(healthStatus, statusCode);
      } catch (error) {
        console.error('Health check error:', error);
        return c.json({
          service: 'data-catalog',
          status: 'unhealthy',
          timestamp: new Date().toISOString(),
          error: error instanceof Error ? error.message : 'Unknown error'
        }, 503);
      }
    });

    // Readiness check
    this.app.get('/ready', async (c) => {
      try {
        // Check if service is ready to accept requests
        const readyChecks = await Promise.all([
          this.checkDatabase(),
          this.checkSearchService()
        ]);

        const isReady = readyChecks.every(check => check.status === 'healthy');

        if (isReady) {
          return c.json({
            service: 'data-catalog',
            ready: true,
            timestamp: new Date().toISOString()
          });
        } else {
          return c.json({
            service: 'data-catalog',
            ready: false,
            timestamp: new Date().toISOString(),
            checks: readyChecks
          }, 503);
        }
      } catch (error) {
        console.error('Readiness check error:', error);
        return c.json({
          service: 'data-catalog',
          ready: false,
          timestamp: new Date().toISOString(),
          error: error instanceof Error ? error.message : 'Unknown error'
        }, 503);
      }
    });

    // Liveness check
    this.app.get('/live', async (c) => {
      return c.json({
        service: 'data-catalog',
        alive: true,
        timestamp: new Date().toISOString()
      });
    });
  }

  private async checkDatabase(): Promise<{ name: string; status: string; responseTime?: number }> {
    const start = Date.now();
    try {
      // Simulate database check
      // In real implementation, this would ping the actual database
      await new Promise(resolve => setTimeout(resolve, 10));

      return {
        name: 'database',
        status: 'healthy',
        responseTime: Date.now() - start
      };
    } catch (error) {
      return {
        name: 'database',
        status: 'unhealthy',
        responseTime: Date.now() - start
      };
    }
  }

  private async checkSearchService(): Promise<{ name: string; status: string; responseTime?: number }> {
    const start = Date.now();
    try {
      // Simulate search service check
      // In real implementation, this would check search index health
      await new Promise(resolve => setTimeout(resolve, 5));

      return {
        name: 'search',
        status: 'healthy',
        responseTime: Date.now() - start
      };
    } catch (error) {
      return {
        name: 'search',
        status: 'unhealthy',
        responseTime: Date.now() - start
      };
    }
  }

  private async checkEventBus(): Promise<{ name: string; status: string; responseTime?: number }> {
    const start = Date.now();
    try {
      // Simulate event bus check
      // In real implementation, this would check message broker connectivity
      await new Promise(resolve => setTimeout(resolve, 3));

      return {
        name: 'eventBus',
        status: 'healthy',
        responseTime: Date.now() - start
      };
    } catch (error) {
      return {
        name: 'eventBus',
        status: 'unhealthy',
        responseTime: Date.now() - start
      };
    }
  }

  public getApp(): Hono {
    return this.app;
  }
}
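The detailed health check above aggregates per-dependency results into one overall status: any unhealthy dependency downgrades the service to `degraded` and flips the HTTP status to 503. The aggregation step in isolation:

```typescript
// Mirror of the detailed health check's aggregation: the service is
// 'degraded' as soon as any single dependency check is not 'healthy'.
interface DependencyCheck {
  name: string;
  status: 'healthy' | 'unhealthy';
}

function overallStatus(deps: DependencyCheck[]): 'healthy' | 'degraded' {
  return deps.some(dep => dep.status !== 'healthy') ? 'degraded' : 'healthy';
}

const mixed: DependencyCheck[] = [
  { name: 'database', status: 'healthy' },
  { name: 'search', status: 'unhealthy' },
  { name: 'eventBus', status: 'healthy' },
];
const degraded = overallStatus(mixed);                                   // 'degraded'
const healthy = overallStatus([{ name: 'database', status: 'healthy' }]); // 'healthy'
```

Keeping the aggregation strict (any failure degrades) matches the 200-vs-503 split that orchestrators like Kubernetes expect from readiness endpoints.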
@@ -0,0 +1,211 @@
import { Hono } from 'hono';
import { DataLineageService } from '../services/DataLineageService';
import { CreateLineageRequest, LineageQuery, ImpactAnalysisQuery } from '../types/DataCatalog';

export class LineageController {
  private app: Hono;
  private lineageService: DataLineageService;

  constructor() {
    this.app = new Hono();
    this.lineageService = new DataLineageService();
    this.setupRoutes();
  }

  private setupRoutes() {
    // Create lineage relationship
    this.app.post('/', async (c) => {
      try {
        const request: CreateLineageRequest = await c.req.json();
        const lineage = await this.lineageService.createLineage(request);
        return c.json({
          success: true,
          data: lineage
        });
      } catch (error) {
        console.error('Error creating lineage:', error);
        return c.json({
          success: false,
          error: error instanceof Error ? error.message : 'Unknown error'
        }, 500);
      }
    });

    // Get lineage for asset
    this.app.get('/assets/:assetId', async (c) => {
      try {
        const assetId = c.req.param('assetId');
        const direction = c.req.query('direction') as 'upstream' | 'downstream' | 'both' | undefined;
        const depth = c.req.query('depth') ? parseInt(c.req.query('depth')!, 10) : undefined;

        const lineage = await this.lineageService.getAssetLineage(assetId, {
          direction: direction || 'both',
          depth: depth || 10
        });

        return c.json({
          success: true,
          data: lineage
        });
      } catch (error) {
        console.error('Error getting asset lineage:', error);
        return c.json({
          success: false,
          error: error instanceof Error ? error.message : 'Unknown error'
        }, 500);
      }
    });

    // Get upstream dependencies
    this.app.get('/assets/:assetId/upstream', async (c) => {
      try {
        const assetId = c.req.param('assetId');
        const depth = c.req.query('depth') ? parseInt(c.req.query('depth')!, 10) : 5;

        const upstream = await this.lineageService.getUpstreamDependencies(assetId, depth);

        return c.json({
          success: true,
          data: upstream
        });
      } catch (error) {
        console.error('Error getting upstream dependencies:', error);
        return c.json({
          success: false,
          error: error instanceof Error ? error.message : 'Unknown error'
        }, 500);
      }
    });

    // Get downstream dependencies
    this.app.get('/assets/:assetId/downstream', async (c) => {
      try {
        const assetId = c.req.param('assetId');
        const depth = c.req.query('depth') ? parseInt(c.req.query('depth')!, 10) : 5;

        const downstream = await this.lineageService.getDownstreamDependencies(assetId, depth);

        return c.json({
          success: true,
          data: downstream
        });
      } catch (error) {
        console.error('Error getting downstream dependencies:', error);
        return c.json({
          success: false,
          error: error instanceof Error ? error.message : 'Unknown error'
        }, 500);
      }
    });

    // Perform impact analysis
    this.app.post('/impact-analysis', async (c) => {
      try {
        const query: ImpactAnalysisQuery = await c.req.json();
        const analysis = await this.lineageService.performImpactAnalysis(query);

        return c.json({
          success: true,
          data: analysis
        });
      } catch (error) {
        console.error('Error performing impact analysis:', error);
        return c.json({
          success: false,
          error: error instanceof Error ? error.message : 'Unknown error'
        }, 500);
      }
    });

    // Get lineage graph
    this.app.get('/graph', async (c) => {
      try {
        const assetIds = c.req.query('assetIds')?.split(',') || [];
        const depth = c.req.query('depth') ? parseInt(c.req.query('depth')!, 10) : 3;

        if (assetIds.length === 0) {
          return c.json({
            success: false,
            error: 'Asset IDs are required'
          }, 400);
        }

        const graph = await this.lineageService.getLineageGraph(assetIds, depth);

        return c.json({
          success: true,
          data: graph
        });
      } catch (error) {
        console.error('Error getting lineage graph:', error);
        return c.json({
          success: false,
          error: error instanceof Error ? error.message : 'Unknown error'
        }, 500);
      }
    });

    // Check for circular dependencies
    this.app.get('/assets/:assetId/circular-check', async (c) => {
      try {
        const assetId = c.req.param('assetId');
        const hasCycles = await this.lineageService.hasCircularDependencies(assetId);

        return c.json({
          success: true,
          data: {
            assetId,
            hasCircularDependencies: hasCycles
          }
        });
      } catch (error) {
        console.error('Error checking circular dependencies:', error);
        return c.json({
          success: false,
          error: error instanceof Error ? error.message : 'Unknown error'
        }, 500);
      }
    });

    // Delete lineage relationship
    this.app.delete('/:lineageId', async (c) => {
      try {
        const lineageId = c.req.param('lineageId');
        await this.lineageService.deleteLineage(lineageId);

        return c.json({
          success: true,
          message: 'Lineage relationship deleted successfully'
        });
      } catch (error) {
        console.error('Error deleting lineage:', error);
        return c.json({
          success: false,
          error: error instanceof Error ? error.message : 'Unknown error'
        }, 500);
      }
    });

    // Get lineage statistics
    this.app.get('/stats', async (c) => {
      try {
        const stats = await this.lineageService.getLineageStatistics();

        return c.json({
          success: true,
          data: stats
        });
      } catch (error) {
        console.error('Error getting lineage statistics:', error);
        return c.json({
          success: false,
          error: error instanceof Error ? error.message : 'Unknown error'
        }, 500);
      }
    });
  }

  public getApp(): Hono {
    return this.app;
  }
}
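The `/assets/:assetId/circular-check` route delegates to `DataLineageService.hasCircularDependencies`, whose implementation is not part of this diff. One plausible way to back it is a depth-first search with a "currently visiting" set over a downstream adjacency map; the `Map`-based graph here is an assumption for illustration, not the service's actual storage:

```typescript
// Plausible cycle check for a lineage graph: DFS with a visiting set.
// A node reached while it is still on the current DFS path is a back
// edge, i.e. a circular dependency.
function hasCycleFrom(graph: Map<string, string[]>, start: string): boolean {
  const visiting = new Set<string>(); // nodes on the current DFS path
  const done = new Set<string>();     // nodes fully explored, cycle-free

  function dfs(node: string): boolean {
    if (visiting.has(node)) return true; // back edge => cycle
    if (done.has(node)) return false;
    visiting.add(node);
    for (const next of graph.get(node) ?? []) {
      if (dfs(next)) return true;
    }
    visiting.delete(node);
    done.add(node);
    return false;
  }

  return dfs(start);
}

// Hypothetical downstream lineage with a deliberate back edge.
const cyclicGraph = new Map<string, string[]>([
  ['raw_orders', ['orders_clean']],
  ['orders_clean', ['orders_daily']],
  ['orders_daily', ['raw_orders']], // closes the cycle
]);
const acyclicGraph = new Map<string, string[]>([
  ['raw_orders', ['orders_clean']],
  ['orders_clean', ['orders_daily']],
]);
const cyclic = hasCycleFrom(cyclicGraph, 'raw_orders');   // true
const acyclic = hasCycleFrom(acyclicGraph, 'raw_orders'); // false
```

The `done` set keeps the check linear in edges even when many paths converge on the same downstream asset.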
@@ -0,0 +1,321 @@
import { Hono } from 'hono';
import { DataQualityService } from '../services/DataQualityService';
import {
  QualityAssessmentRequest,
  QualityRule,
  QualityIssue,
  QualityReportRequest
} from '../types/DataCatalog';

export class QualityController {
  private app: Hono;
  private qualityService: DataQualityService;

  constructor() {
    this.app = new Hono();
    this.qualityService = new DataQualityService();
    this.setupRoutes();
  }

  private setupRoutes() {
    // Assess asset quality
    this.app.post('/assess', async (c) => {
      try {
        const request: QualityAssessmentRequest = await c.req.json();
        const assessment = await this.qualityService.assessQuality(request);

        return c.json({
          success: true,
          data: assessment
        });
      } catch (error) {
        console.error('Error assessing quality:', error);
        return c.json({
          success: false,
          error: error instanceof Error ? error.message : 'Unknown error'
        }, 500);
      }
    });

    // Get quality assessment for asset
    this.app.get('/assets/:assetId', async (c) => {
      try {
        const assetId = c.req.param('assetId');
        const assessment = await this.qualityService.getQualityAssessment(assetId);

        if (!assessment) {
          return c.json({
            success: false,
            error: 'Quality assessment not found'
          }, 404);
        }

        return c.json({
          success: true,
          data: assessment
        });
      } catch (error) {
        console.error('Error getting quality assessment:', error);
        return c.json({
          success: false,
          error: error instanceof Error ? error.message : 'Unknown error'
        }, 500);
      }
    });

    // Create quality rule
    this.app.post('/rules', async (c) => {
      try {
        const rule: Omit<QualityRule, 'id' | 'createdAt' | 'updatedAt'> = await c.req.json();
        const createdRule = await this.qualityService.createQualityRule(rule);

        return c.json({
          success: true,
          data: createdRule
        });
      } catch (error) {
        console.error('Error creating quality rule:', error);
        return c.json({
          success: false,
          error: error instanceof Error ? error.message : 'Unknown error'
        }, 500);
      }
    });

    // Get quality rules
    this.app.get('/rules', async (c) => {
      try {
        const assetType = c.req.query('assetType');
        const dimension = c.req.query('dimension');
        const activeParam = c.req.query('active');

        const filters: any = {};
        if (assetType) filters.assetType = assetType;
        if (dimension) filters.dimension = dimension;
        // Only filter on `active` when the param is present, so ?active=false
        // is honoured instead of being silently dropped.
        if (activeParam !== undefined) filters.active = activeParam === 'true';

        const rules = await this.qualityService.getQualityRules(filters);

        return c.json({
          success: true,
          data: rules
        });
      } catch (error) {
        console.error('Error getting quality rules:', error);
        return c.json({
          success: false,
          error: error instanceof Error ? error.message : 'Unknown error'
        }, 500);
      }
    });

    // Update quality rule
    this.app.put('/rules/:ruleId', async (c) => {
      try {
        const ruleId = c.req.param('ruleId');
        const updates: Partial<QualityRule> = await c.req.json();

        const updatedRule = await this.qualityService.updateQualityRule(ruleId, updates);

        return c.json({
          success: true,
          data: updatedRule
        });
      } catch (error) {
        console.error('Error updating quality rule:', error);
        return c.json({
          success: false,
          error: error instanceof Error ? error.message : 'Unknown error'
        }, 500);
      }
    });

    // Delete quality rule
    this.app.delete('/rules/:ruleId', async (c) => {
      try {
        const ruleId = c.req.param('ruleId');
        await this.qualityService.deleteQualityRule(ruleId);

        return c.json({
          success: true,
          message: 'Quality rule deleted successfully'
        });
      } catch (error) {
        console.error('Error deleting quality rule:', error);
        return c.json({
          success: false,
          error: error instanceof Error ? error.message : 'Unknown error'
        }, 500);
      }
    });

    // Validate quality rules for asset
    this.app.post('/validate/:assetId', async (c) => {
      try {
        const assetId = c.req.param('assetId');
|
||||
const data = await c.req.json();
|
||||
|
||||
const validationResults = await this.qualityService.validateQualityRules(assetId, data);
|
||||
|
||||
return c.json({
|
||||
success: true,
|
||||
data: validationResults
|
||||
});
|
||||
} catch (error) {
|
||||
console.error('Error validating quality rules:', error);
|
||||
return c.json({
|
||||
success: false,
|
||||
error: error instanceof Error ? error.message : 'Unknown error'
|
||||
}, 500);
|
||||
}
|
||||
});
|
||||
|
||||
// Report quality issue
|
||||
this.app.post('/issues', async (c) => {
|
||||
try {
|
||||
const issue: Omit<QualityIssue, 'id' | 'reportedAt' | 'updatedAt'> = await c.req.json();
|
||||
const reportedIssue = await this.qualityService.reportQualityIssue(issue);
|
||||
|
||||
return c.json({
|
||||
success: true,
|
||||
data: reportedIssue
|
||||
});
|
||||
} catch (error) {
|
||||
console.error('Error reporting quality issue:', error);
|
||||
return c.json({
|
||||
success: false,
|
||||
error: error instanceof Error ? error.message : 'Unknown error'
|
||||
}, 500);
|
||||
}
|
||||
});
|
||||
|
||||
// Get quality issues
|
||||
this.app.get('/issues', async (c) => {
|
||||
try {
|
||||
const assetId = c.req.query('assetId');
|
||||
const severity = c.req.query('severity');
|
||||
const status = c.req.query('status');
|
||||
const dimension = c.req.query('dimension');
|
||||
const limit = c.req.query('limit') ? parseInt(c.req.query('limit')!) : 100;
|
||||
const offset = c.req.query('offset') ? parseInt(c.req.query('offset')!) : 0;
|
||||
|
||||
const filters: any = {};
|
||||
if (assetId) filters.assetId = assetId;
|
||||
if (severity) filters.severity = severity;
|
||||
if (status) filters.status = status;
|
||||
if (dimension) filters.dimension = dimension;
|
||||
|
||||
const issues = await this.qualityService.getQualityIssues(filters, { limit, offset });
|
||||
|
||||
return c.json({
|
||||
success: true,
|
||||
data: issues
|
||||
});
|
||||
} catch (error) {
|
||||
console.error('Error getting quality issues:', error);
|
||||
return c.json({
|
||||
success: false,
|
||||
error: error instanceof Error ? error.message : 'Unknown error'
|
||||
}, 500);
|
||||
}
|
||||
});
|
||||
|
||||
// Resolve quality issue
|
||||
this.app.patch('/issues/:issueId/resolve', async (c) => {
|
||||
try {
|
||||
const issueId = c.req.param('issueId');
|
||||
const { resolution, resolvedBy } = await c.req.json();
|
||||
|
||||
const resolvedIssue = await this.qualityService.resolveQualityIssue(
|
||||
issueId,
|
||||
resolution,
|
||||
resolvedBy
|
||||
);
|
||||
|
||||
return c.json({
|
||||
success: true,
|
||||
data: resolvedIssue
|
||||
});
|
||||
} catch (error) {
|
||||
console.error('Error resolving quality issue:', error);
|
||||
return c.json({
|
||||
success: false,
|
||||
error: error instanceof Error ? error.message : 'Unknown error'
|
||||
}, 500);
|
||||
}
|
||||
});
|
||||
|
||||
// Get quality trends
|
||||
this.app.get('/trends', async (c) => {
|
||||
try {
|
||||
const assetId = c.req.query('assetId');
|
||||
const dimension = c.req.query('dimension');
|
||||
const timeRange = c.req.query('timeRange') || '30d';
|
||||
|
||||
const trends = await this.qualityService.getQualityTrends(
|
||||
assetId,
|
||||
dimension,
|
||||
timeRange
|
||||
);
|
||||
|
||||
return c.json({
|
||||
success: true,
|
||||
data: trends
|
||||
});
|
||||
} catch (error) {
|
||||
console.error('Error getting quality trends:', error);
|
||||
return c.json({
|
||||
success: false,
|
||||
error: error instanceof Error ? error.message : 'Unknown error'
|
||||
}, 500);
|
||||
}
|
||||
});
|
||||
|
||||
// Generate quality report
|
||||
this.app.post('/reports', async (c) => {
|
||||
try {
|
||||
const request: QualityReportRequest = await c.req.json();
|
||||
const report = await this.qualityService.generateQualityReport(request);
|
||||
|
||||
return c.json({
|
||||
success: true,
|
||||
data: report
|
||||
});
|
||||
} catch (error) {
|
||||
console.error('Error generating quality report:', error);
|
||||
return c.json({
|
||||
success: false,
|
||||
error: error instanceof Error ? error.message : 'Unknown error'
|
||||
}, 500);
|
||||
}
|
||||
});
|
||||
|
||||
// Get quality metrics summary
|
||||
this.app.get('/metrics/summary', async (c) => {
|
||||
try {
|
||||
const assetIds = c.req.query('assetIds')?.split(',');
|
||||
const timeRange = c.req.query('timeRange') || '7d';
|
||||
|
||||
const summary = await this.qualityService.getQualityMetricsSummary(
|
||||
assetIds,
|
||||
timeRange
|
||||
);
|
||||
|
||||
return c.json({
|
||||
success: true,
|
||||
data: summary
|
||||
});
|
||||
} catch (error) {
|
||||
console.error('Error getting quality metrics summary:', error);
|
||||
return c.json({
|
||||
success: false,
|
||||
error: error instanceof Error ? error.message : 'Unknown error'
|
||||
}, 500);
|
||||
}
|
||||
});
|
||||
}
|
||||
|
||||
public getApp(): Hono {
|
||||
return this.app;
|
||||
}
|
||||
}
|
||||
|
|
@@ -0,0 +1,334 @@
import { Context } from 'hono';
import { Logger } from '@stock-bot/utils';
import { SearchService } from '../services/SearchService';
import { SearchQuery, SearchFilters } from '../types/DataCatalog';

export class SearchController {
  constructor(
    private searchService: SearchService,
    private logger: Logger
  ) {}

  async search(c: Context) {
    try {
      const queryParams = c.req.query();

      const searchQuery: SearchQuery = {
        text: queryParams.q || '',
        offset: parseInt(queryParams.offset || '0'),
        limit: parseInt(queryParams.limit || '20'),
        sortBy: queryParams.sortBy,
        sortOrder: queryParams.sortOrder as 'asc' | 'desc',
        userId: queryParams.userId
      };

      // Parse filters
      const filters: SearchFilters = {};
      if (queryParams.types) {
        filters.types = Array.isArray(queryParams.types) ? queryParams.types : [queryParams.types];
      }
      if (queryParams.classifications) {
        filters.classifications = Array.isArray(queryParams.classifications) ? queryParams.classifications : [queryParams.classifications];
      }
      if (queryParams.owners) {
        filters.owners = Array.isArray(queryParams.owners) ? queryParams.owners : [queryParams.owners];
      }
      if (queryParams.tags) {
        filters.tags = Array.isArray(queryParams.tags) ? queryParams.tags : [queryParams.tags];
      }
      if (queryParams.createdAfter) {
        filters.createdAfter = new Date(queryParams.createdAfter);
      }
      if (queryParams.createdBefore) {
        filters.createdBefore = new Date(queryParams.createdBefore);
      }

      if (Object.keys(filters).length > 0) {
        searchQuery.filters = filters;
      }

      const result = await this.searchService.search(searchQuery);

      this.logger.info('Search API call completed', {
        query: searchQuery.text,
        resultCount: result.total,
        searchTime: result.searchTime
      });

      return c.json(result);
    } catch (error) {
      this.logger.error('Search API call failed', { error });
      return c.json({ error: 'Internal server error' }, 500);
    }
  }

  async suggest(c: Context) {
    try {
      const partial = c.req.query('q');

      if (!partial || partial.length < 2) {
        return c.json({ suggestions: [] });
      }

      const suggestions = await this.searchService.suggest(partial);

      return c.json({ suggestions });
    } catch (error) {
      this.logger.error('Suggestion API call failed', { error });
      return c.json({ error: 'Internal server error' }, 500);
    }
  }

  async searchByFacets(c: Context) {
    try {
      const facets = await c.req.json();

      if (!facets || typeof facets !== 'object') {
        return c.json({ error: 'Facets object is required' }, 400);
      }

      const assets = await this.searchService.searchByFacets(facets);

      return c.json({
        assets,
        total: assets.length,
        facets
      });
    } catch (error) {
      this.logger.error('Facet search API call failed', { error });
      return c.json({ error: 'Internal server error' }, 500);
    }
  }

  async searchSimilar(c: Context) {
    try {
      const assetId = c.req.param('id');
      const limit = parseInt(c.req.query('limit') || '10');

      if (!assetId) {
        return c.json({ error: 'Asset ID is required' }, 400);
      }

      const similarAssets = await this.searchService.searchSimilar(assetId, limit);

      return c.json({
        assetId,
        similarAssets,
        total: similarAssets.length
      });
    } catch (error) {
      this.logger.error('Similar search API call failed', { error });
      return c.json({ error: 'Internal server error' }, 500);
    }
  }

  async getPopularSearches(c: Context) {
    try {
      const limit = parseInt(c.req.query('limit') || '10');

      const popularSearches = await this.searchService.getPopularSearches(limit);

      return c.json({
        searches: popularSearches,
        total: popularSearches.length
      });
    } catch (error) {
      this.logger.error('Popular searches API call failed', { error });
      return c.json({ error: 'Internal server error' }, 500);
    }
  }

  async getRecentSearches(c: Context) {
    try {
      const userId = c.req.param('userId');
      const limit = parseInt(c.req.query('limit') || '10');

      if (!userId) {
        return c.json({ error: 'User ID is required' }, 400);
      }

      const recentSearches = await this.searchService.getRecentSearches(userId, limit);

      return c.json({
        userId,
        searches: recentSearches,
        total: recentSearches.length
      });
    } catch (error) {
      this.logger.error('Recent searches API call failed', { error });
      return c.json({ error: 'Internal server error' }, 500);
    }
  }

  async reindexAssets(c: Context) {
    try {
      await this.searchService.reindexAll();

      this.logger.info('Search index rebuilt via API');

      return c.json({ message: 'Search index rebuilt successfully' });
    } catch (error) {
      this.logger.error('Reindex API call failed', { error });
      return c.json({ error: 'Internal server error' }, 500);
    }
  }

  async getSearchAnalytics(c: Context) {
    try {
      const timeframe = c.req.query('timeframe') || 'week';

      const analytics = await this.searchService.getSearchAnalytics(timeframe);

      return c.json({
        timeframe,
        analytics
      });
    } catch (error) {
      this.logger.error('Search analytics API call failed', { error });
      return c.json({ error: 'Internal server error' }, 500);
    }
  }

  async advancedSearch(c: Context) {
    try {
      const searchRequest = await c.req.json();

      if (!searchRequest) {
        return c.json({ error: 'Search request is required' }, 400);
      }

      // Build advanced search query
      const searchQuery: SearchQuery = {
        text: searchRequest.query || '',
        offset: searchRequest.offset || 0,
        limit: searchRequest.limit || 20,
        sortBy: searchRequest.sortBy,
        sortOrder: searchRequest.sortOrder,
        userId: searchRequest.userId,
        filters: searchRequest.filters
      };

      const result = await this.searchService.search(searchQuery);

      // If no results and query is complex, try to suggest simpler alternatives
      if (result.total === 0 && searchQuery.text && searchQuery.text.split(' ').length > 2) {
        const simpleQuery = searchQuery.text.split(' ')[0];
        const simpleResult = await this.searchService.search({
          ...searchQuery,
          text: simpleQuery
        });

        if (simpleResult.total > 0) {
          result.suggestions = [`Try searching for "${simpleQuery}"`];
        }
      }

      this.logger.info('Advanced search API call completed', {
        query: searchQuery.text,
        resultCount: result.total,
        searchTime: result.searchTime
      });

      return c.json(result);
    } catch (error) {
      this.logger.error('Advanced search API call failed', { error });
      return c.json({ error: 'Internal server error' }, 500);
    }
  }

  async exportSearchResults(c: Context) {
    try {
      const queryParams = c.req.query();
      const format = queryParams.format || 'json';

      if (format !== 'json' && format !== 'csv') {
        return c.json({ error: 'Unsupported export format. Use json or csv' }, 400);
      }

      // Perform search with maximum results
      const searchQuery: SearchQuery = {
        text: queryParams.q || '',
        offset: 0,
        limit: 10000, // Large limit for export
        sortBy: queryParams.sortBy,
        sortOrder: queryParams.sortOrder as 'asc' | 'desc'
      };

      const result = await this.searchService.search(searchQuery);

      if (format === 'csv') {
        const csv = this.convertToCSV(result.assets);
        c.header('Content-Type', 'text/csv');
        c.header('Content-Disposition', 'attachment; filename="search-results.csv"');
        return c.text(csv);
      } else {
        c.header('Content-Type', 'application/json');
        c.header('Content-Disposition', 'attachment; filename="search-results.json"');
        return c.json(result);
      }
    } catch (error) {
      this.logger.error('Export search results API call failed', { error });
      return c.json({ error: 'Internal server error' }, 500);
    }
  }

  async getSearchStatistics(c: Context) {
    try {
      const timeframe = c.req.query('timeframe') || 'week';

      const analytics = await this.searchService.getSearchAnalytics(timeframe);

      const statistics = {
        searchVolume: analytics.totalSearches,
        uniqueQueries: analytics.uniqueQueries,
        averageResultsPerSearch: Math.round(analytics.averageResults),
        noResultQueriesPercent: analytics.totalSearches > 0
          ? Math.round((analytics.noResultQueries / analytics.totalSearches) * 100)
          : 0,
        topSearchTerms: analytics.topQueries.slice(0, 5),
        searchTrend: analytics.searchTrend.trend,
        facetUsage: analytics.facetUsage
      };

      return c.json({
        timeframe,
        statistics
      });
    } catch (error) {
      this.logger.error('Search statistics API call failed', { error });
      return c.json({ error: 'Internal server error' }, 500);
    }
  }

  // Helper method to convert assets to CSV format
  private convertToCSV(assets: any[]): string {
    if (assets.length === 0) {
      return 'No results found';
    }

    const headers = [
      'ID', 'Name', 'Type', 'Description', 'Owner', 'Classification',
      'Tags', 'Created At', 'Updated At', 'Last Accessed'
    ];

    const csvRows = [headers.join(',')];

    for (const asset of assets) {
      const row = [
        asset.id,
        `"${asset.name.replace(/"/g, '""')}"`,
        asset.type,
        `"${asset.description.replace(/"/g, '""')}"`,
        asset.owner,
        asset.classification,
        `"${asset.tags.join('; ')}"`,
        asset.createdAt.toISOString(),
        asset.updatedAt.toISOString(),
        asset.lastAccessed ? asset.lastAccessed.toISOString() : ''
      ];
      csvRows.push(row.join(','));
    }

    return csvRows.join('\n');
  }
}
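The `convertToCSV` helper quotes free-text fields and doubles any embedded double quotes, which is the RFC 4180 escaping convention. A standalone sketch of just that quoting step — `csvField` is an illustrative helper, not part of SearchController:

```typescript
// Wrap a free-text value for CSV: surround with double quotes and
// double any embedded quotes, mirroring `value.replace(/"/g, '""')` above.
function csvField(value: string): string {
  return `"${value.replace(/"/g, '""')}"`;
}

const row = [
  'asset-42',
  csvField('Order "fills" table'),
  csvField('Joined; on trade_id'),
].join(',');

console.log(row);
// asset-42,"Order ""fills"" table","Joined; on trade_id"
```

Note that the controller only applies this quoting to name, description, and tags; IDs and enum-like fields are emitted bare, which is safe as long as they can never contain commas or quotes.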
201
apps/data-services/data-catalog/src/index.ts
Normal file
@@ -0,0 +1,201 @@
import { Hono } from 'hono';
import { cors } from 'hono/cors';
import { logger } from 'hono/logger';
import { prettyJSON } from 'hono/pretty-json';
import { serve } from '@hono/node-server';

// Import controllers
import { DataCatalogController } from './controllers/DataCatalogController';
import { SearchController } from './controllers/SearchController';
import { LineageController } from './controllers/LineageController';
import { QualityController } from './controllers/QualityController';
import { GovernanceController } from './controllers/GovernanceController';
import { HealthController } from './controllers/HealthController';

// Create main application
const app = new Hono();

// Add middleware
app.use('*', cors({
  origin: ['http://localhost:3000', 'http://localhost:4000', 'http://localhost:5173'],
  allowMethods: ['GET', 'POST', 'PUT', 'DELETE', 'PATCH', 'OPTIONS'],
  allowHeaders: ['Content-Type', 'Authorization', 'X-Requested-With'],
  credentials: true
}));

app.use('*', logger());
app.use('*', prettyJSON());

// Initialize controllers
const dataCatalogController = new DataCatalogController();
const searchController = new SearchController();
const lineageController = new LineageController();
const qualityController = new QualityController();
const governanceController = new GovernanceController();
const healthController = new HealthController();

// Setup routes
app.route('/api/v1/assets', dataCatalogController.getApp());
app.route('/api/v1/search', searchController.getApp());
app.route('/api/v1/lineage', lineageController.getApp());
app.route('/api/v1/quality', qualityController.getApp());
app.route('/api/v1/governance', governanceController.getApp());
app.route('/health', healthController.getApp());

// Root endpoint
app.get('/', (c) => {
  return c.json({
    service: 'Data Catalog Service',
    version: '1.0.0',
    description: 'Comprehensive data catalog and governance service for stock-bot data platform',
    endpoints: {
      assets: '/api/v1/assets',
      search: '/api/v1/search',
      lineage: '/api/v1/lineage',
      quality: '/api/v1/quality',
      governance: '/api/v1/governance',
      health: '/health'
    },
    documentation: '/api/v1/docs'
  });
});

// API documentation endpoint
app.get('/api/v1/docs', (c) => {
  return c.json({
    title: 'Data Catalog Service API',
    version: '1.0.0',
    description: 'RESTful API for data catalog, lineage, quality, and governance operations',
    endpoints: {
      assets: {
        description: 'Data asset management',
        methods: {
          'GET /api/v1/assets': 'List assets with filtering and pagination',
          'POST /api/v1/assets': 'Create new data asset',
          'GET /api/v1/assets/:id': 'Get asset by ID',
          'PUT /api/v1/assets/:id': 'Update asset',
          'DELETE /api/v1/assets/:id': 'Delete asset',
          'GET /api/v1/assets/:id/schema': 'Get asset schema',
          'PUT /api/v1/assets/:id/schema': 'Update asset schema',
          'GET /api/v1/assets/:id/usage': 'Get asset usage analytics',
          'POST /api/v1/assets/:id/usage': 'Record asset usage'
        }
      },
      search: {
        description: 'Data discovery and search',
        methods: {
          'GET /api/v1/search': 'Search assets with full-text and faceted search',
          'GET /api/v1/search/suggest': 'Get search suggestions',
          'GET /api/v1/search/facets': 'Get available search facets',
          'GET /api/v1/search/similar/:id': 'Find similar assets',
          'GET /api/v1/search/trending': 'Get trending searches',
          'POST /api/v1/search/export': 'Export search results'
        }
      },
      lineage: {
        description: 'Data lineage and impact analysis',
        methods: {
          'POST /api/v1/lineage': 'Create lineage relationship',
          'GET /api/v1/lineage/assets/:assetId': 'Get asset lineage',
          'GET /api/v1/lineage/assets/:assetId/upstream': 'Get upstream dependencies',
          'GET /api/v1/lineage/assets/:assetId/downstream': 'Get downstream dependencies',
          'POST /api/v1/lineage/impact-analysis': 'Perform impact analysis',
          'GET /api/v1/lineage/graph': 'Get lineage graph visualization',
          'GET /api/v1/lineage/assets/:assetId/circular-check': 'Check for circular dependencies',
          'DELETE /api/v1/lineage/:lineageId': 'Delete lineage relationship',
          'GET /api/v1/lineage/stats': 'Get lineage statistics'
        }
      },
      quality: {
        description: 'Data quality assessment and monitoring',
        methods: {
          'POST /api/v1/quality/assess': 'Assess data quality',
          'GET /api/v1/quality/assets/:assetId': 'Get quality assessment',
          'POST /api/v1/quality/rules': 'Create quality rule',
          'GET /api/v1/quality/rules': 'Get quality rules',
          'PUT /api/v1/quality/rules/:ruleId': 'Update quality rule',
          'DELETE /api/v1/quality/rules/:ruleId': 'Delete quality rule',
          'POST /api/v1/quality/validate/:assetId': 'Validate quality rules',
          'POST /api/v1/quality/issues': 'Report quality issue',
          'GET /api/v1/quality/issues': 'Get quality issues',
          'PATCH /api/v1/quality/issues/:issueId/resolve': 'Resolve quality issue',
          'GET /api/v1/quality/trends': 'Get quality trends',
          'POST /api/v1/quality/reports': 'Generate quality report',
          'GET /api/v1/quality/metrics/summary': 'Get quality metrics summary'
        }
      },
      governance: {
        description: 'Data governance and compliance',
        methods: {
          'POST /api/v1/governance/policies': 'Create governance policy',
          'GET /api/v1/governance/policies': 'Get governance policies',
          'PUT /api/v1/governance/policies/:policyId': 'Update governance policy',
          'DELETE /api/v1/governance/policies/:policyId': 'Delete governance policy',
          'POST /api/v1/governance/policies/:policyId/apply/:assetId': 'Apply policy to asset',
          'POST /api/v1/governance/compliance/check': 'Check compliance',
          'GET /api/v1/governance/compliance/violations': 'Get compliance violations',
          'POST /api/v1/governance/access/request': 'Request data access',
          'PATCH /api/v1/governance/access/:requestId': 'Review access request',
          'POST /api/v1/governance/access/check': 'Check access authorization',
          'POST /api/v1/governance/privacy/subject-request': 'Handle data subject request',
          'POST /api/v1/governance/privacy/anonymize/:assetId': 'Anonymize asset data',
          'GET /api/v1/governance/audit/logs': 'Get audit logs',
          'POST /api/v1/governance/audit/log': 'Log access event',
          'GET /api/v1/governance/retention/policies': 'Get retention policies',
          'POST /api/v1/governance/retention/apply': 'Apply retention policy',
          'GET /api/v1/governance/metrics': 'Get governance metrics'
        }
      },
      health: {
        description: 'Service health monitoring',
        methods: {
          'GET /health': 'Basic health check',
          'GET /health/detailed': 'Detailed health check with dependencies',
          'GET /health/ready': 'Readiness check',
          'GET /health/live': 'Liveness check'
        }
      }
    }
  });
});

// 404 handler
app.notFound((c) => {
  return c.json({
    success: false,
    error: 'Endpoint not found',
    availableEndpoints: [
      '/api/v1/assets',
      '/api/v1/search',
      '/api/v1/lineage',
      '/api/v1/quality',
      '/api/v1/governance',
      '/health'
    ]
  }, 404);
});

// Error handler
app.onError((err, c) => {
  console.error('Application error:', err);

  return c.json({
    success: false,
    error: 'Internal server error',
    message: process.env.NODE_ENV === 'development' ? err.message : 'Something went wrong'
  }, 500);
});

// Start server
const port = parseInt(process.env.PORT || '3003');

console.log(`🚀 Data Catalog Service starting on port ${port}`);
console.log(`📚 API Documentation available at http://localhost:${port}/api/v1/docs`);
console.log(`❤️ Health endpoint available at http://localhost:${port}/health`);

serve({
  fetch: app.fetch,
  port: port
});

export default app;
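The service and its controllers lean on `parseInt(raw || 'default')` for port, limit, and offset values. `parseInt` returns `NaN` for non-numeric input, so for user-supplied parameters a guarded helper with a fallback and an upper clamp is a safer variant of the same pattern. This is a hedged sketch; `parseIntOr` and its bounds are illustrative, not part of the service:

```typescript
// Parse a numeric query/env value with a fallback and an upper clamp.
// NaN and negative inputs fall back to the default instead of leaking
// NaN into pagination math.
function parseIntOr(raw: string | undefined, fallback: number, max = 10000): number {
  const n = parseInt(raw ?? '', 10);
  if (Number.isNaN(n) || n < 0) return fallback;
  return Math.min(n, max);
}

console.log(parseIntOr(undefined, 20)); // 20 (missing parameter)
console.log(parseIntOr('50', 20));      // 50
console.log(parseIntOr('abc', 20));     // 20 (NaN guarded)
console.log(parseIntOr('999999', 20));  // 10000 (clamped)
```

With plain `parseInt(queryParams.limit || '20')`, a request like `?limit=abc` yields `NaN` and silently breaks pagination; the clamp also caps the export route's "large limit" style requests.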
@@ -0,0 +1,312 @@
import { EventBus } from '@stock-bot/event-bus';
import { Logger } from '@stock-bot/utils';
import {
  DataAsset,
  CreateDataAssetRequest,
  UpdateDataAssetRequest,
  DataAssetType,
  DataClassification
} from '../types/DataCatalog';

export interface DataCatalogService {
  createAsset(request: CreateDataAssetRequest): Promise<DataAsset>;
  getAsset(id: string): Promise<DataAsset | null>;
  updateAsset(id: string, request: UpdateDataAssetRequest): Promise<DataAsset | null>;
  deleteAsset(id: string): Promise<void>;
  listAssets(filters?: Record<string, any>): Promise<DataAsset[]>;
  searchAssets(query: string, filters?: Record<string, any>): Promise<DataAsset[]>;
  getAssetsByOwner(owner: string): Promise<DataAsset[]>;
  getAssetsByType(type: DataAssetType): Promise<DataAsset[]>;
  getAssetsByClassification(classification: DataClassification): Promise<DataAsset[]>;
  getAssetsByTags(tags: string[]): Promise<DataAsset[]>;
}

export class DataCatalogServiceImpl implements DataCatalogService {
  private assets: Map<string, DataAsset> = new Map();

  constructor(
    private eventBus: EventBus,
    private logger: Logger
  ) {}

  async createAsset(request: CreateDataAssetRequest): Promise<DataAsset> {
    try {
      const asset: DataAsset = {
        id: this.generateId(),
        name: request.name,
        type: request.type,
        description: request.description,
        owner: request.owner,
        steward: request.steward,
        tags: request.tags || [],
        classification: request.classification,
        schema: request.schema,
        location: request.location,
        metadata: {
          customProperties: {},
          ...request.metadata
        },
        lineage: {
          id: this.generateId(),
          assetId: '',
          upstreamAssets: [],
          downstreamAssets: [],
          transformations: [],
          impact: {
            downstreamAssets: [],
            affectedUsers: [],
            estimatedImpact: 'low',
            impactDescription: '',
            recommendations: []
          },
          createdAt: new Date(),
          updatedAt: new Date()
        },
        quality: {
          id: this.generateId(),
          assetId: '',
          overallScore: 100,
          dimensions: [],
          rules: [],
          issues: [],
          trend: {
            timeframe: 'week',
            dataPoints: [],
            trend: 'stable',
            changeRate: 0
          },
          lastAssessment: new Date()
        },
        usage: {
          id: this.generateId(),
          assetId: '',
          accessCount: 0,
          uniqueUsers: 0,
          lastAccessed: new Date(),
          topUsers: [],
          accessPatterns: [],
          popularQueries: [],
          usageTrend: {
            timeframe: 'week',
            dataPoints: [],
            trend: 'stable',
            changeRate: 0
          }
        },
        governance: request.governance || {
          id: this.generateId(),
          assetId: '',
          policies: [],
          compliance: [],
          retention: {
            retentionPeriod: 365,
            retentionReason: 'Business requirement',
            legalHold: false
          },
          access: {
            defaultAccess: 'none',
            roles: [],
            users: []
          },
          privacy: {
            containsPII: false,
            sensitiveFields: [],
            anonymizationRules: [],
            consentRequired: false,
            dataSubjectRights: []
          },
          audit: []
        },
        createdAt: new Date(),
        updatedAt: new Date()
      };

      // Set correct asset IDs in nested objects
      asset.lineage.assetId = asset.id;
      asset.quality.assetId = asset.id;
      asset.usage.assetId = asset.id;
      asset.governance.assetId = asset.id;

      this.assets.set(asset.id, asset);

      this.logger.info('Data asset created', { assetId: asset.id, name: asset.name });

      await this.eventBus.emit('data.asset.created', {
        assetId: asset.id,
        asset,
        timestamp: new Date()
      });

      return asset;
    } catch (error) {
      this.logger.error('Failed to create data asset', { request, error });
      throw error;
    }
  }

  async getAsset(id: string): Promise<DataAsset | null> {
    try {
      const asset = this.assets.get(id);

      if (asset) {
        // Update last accessed time
        asset.lastAccessed = new Date();
        asset.usage.lastAccessed = new Date();
        asset.usage.accessCount++;

        await this.eventBus.emit('data.asset.accessed', {
          assetId: id,
          timestamp: new Date()
        });
      }

      return asset || null;
    } catch (error) {
      this.logger.error('Failed to get data asset', { assetId: id, error });
      throw error;
    }
  }

  async updateAsset(id: string, request: UpdateDataAssetRequest): Promise<DataAsset | null> {
    try {
      const asset = this.assets.get(id);
      if (!asset) {
        return null;
      }

      // Update only provided fields
      if (request.name !== undefined) asset.name = request.name;
      if (request.description !== undefined) asset.description = request.description;
      if (request.owner !== undefined) asset.owner = request.owner;
      if (request.steward !== undefined) asset.steward = request.steward;
      if (request.tags !== undefined) asset.tags = request.tags;
      if (request.classification !== undefined) asset.classification = request.classification;
      if (request.schema !== undefined) asset.schema = request.schema;
      if (request.metadata !== undefined) {
        asset.metadata = { ...asset.metadata, ...request.metadata };
      }

      asset.updatedAt = new Date();

      this.assets.set(id, asset);

      this.logger.info('Data asset updated', { assetId: id, changes: request });

      await this.eventBus.emit('data.asset.updated', {
        assetId: id,
        asset,
        changes: request,
        timestamp: new Date()
      });

      return asset;
    } catch (error) {
      this.logger.error('Failed to update data asset', { assetId: id, request, error });
      throw error;
    }
  }

  async deleteAsset(id: string): Promise<void> {
    try {
      const asset = this.assets.get(id);
      if (!asset) {
        throw new Error(`Asset with id ${id} not found`);
      }

      this.assets.delete(id);

      this.logger.info('Data asset deleted', { assetId: id });

      await this.eventBus.emit('data.asset.deleted', {
        assetId: id,
        asset,
        timestamp: new Date()
      });
    } catch (error) {
      this.logger.error('Failed to delete data asset', { assetId: id, error });
      throw error;
    }
  }

  async listAssets(filters?: Record<string, any>): Promise<DataAsset[]> {
    try {
      let assets = Array.from(this.assets.values());

      if (filters) {
        assets = assets.filter(asset => {
          return Object.entries(filters).every(([key, value]) => {
            if (key === 'type') return asset.type === value;
            if (key === 'owner') return asset.owner === value;
            if (key === 'classification') return asset.classification === value;
|
||||
if (key === 'tags') return Array.isArray(value) ?
|
||||
value.some(tag => asset.tags.includes(tag)) :
|
||||
asset.tags.includes(value);
|
||||
return true;
|
||||
});
|
||||
});
|
||||
}
|
||||
|
||||
return assets;
|
||||
} catch (error) {
|
||||
this.logger.error('Failed to list data assets', { filters, error });
|
||||
throw error;
|
||||
}
|
||||
}
|
||||
|
||||
async searchAssets(query: string, filters?: Record<string, any>): Promise<DataAsset[]> {
|
||||
try {
|
||||
let assets = Array.from(this.assets.values());
|
||||
|
||||
// Simple text search in name, description, and tags
|
||||
const searchTerm = query.toLowerCase();
|
||||
assets = assets.filter(asset =>
|
||||
asset.name.toLowerCase().includes(searchTerm) ||
|
||||
asset.description.toLowerCase().includes(searchTerm) ||
|
||||
asset.tags.some(tag => tag.toLowerCase().includes(searchTerm))
|
||||
);
|
||||
|
||||
// Apply additional filters
|
||||
if (filters) {
|
||||
assets = assets.filter(asset => {
|
||||
return Object.entries(filters).every(([key, value]) => {
|
||||
if (key === 'type') return asset.type === value;
|
||||
if (key === 'owner') return asset.owner === value;
|
||||
if (key === 'classification') return asset.classification === value;
|
||||
return true;
|
||||
});
|
||||
});
|
||||
}
|
||||
|
||||
this.logger.info('Asset search completed', {
|
||||
query,
|
||||
filters,
|
||||
resultCount: assets.length
|
||||
});
|
||||
|
||||
return assets;
|
||||
} catch (error) {
|
||||
this.logger.error('Failed to search data assets', { query, filters, error });
|
||||
throw error;
|
||||
}
|
||||
}
|
||||
|
||||
async getAssetsByOwner(owner: string): Promise<DataAsset[]> {
|
||||
return this.listAssets({ owner });
|
||||
}
|
||||
|
||||
async getAssetsByType(type: DataAssetType): Promise<DataAsset[]> {
|
||||
return this.listAssets({ type });
|
||||
}
|
||||
|
||||
async getAssetsByClassification(classification: DataClassification): Promise<DataAsset[]> {
|
||||
return this.listAssets({ classification });
|
||||
}
|
||||
|
||||
async getAssetsByTags(tags: string[]): Promise<DataAsset[]> {
|
||||
return this.listAssets({ tags });
|
||||
}
|
||||
|
||||
private generateId(): string {
|
||||
return `asset_${Date.now()}_${Math.random().toString(36).substr(2, 9)}`;
|
||||
}
|
||||
}
|
||||
|
|
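The catalog's `listAssets` filtering treats every provided key as a required match, with `tags` accepting either a single tag or an array (array means "match any"). A minimal, self-contained sketch of that semantics, using a hypothetical trimmed-down `AssetLike` shape in place of the real `DataAsset` type:

```typescript
// Hypothetical minimal asset shape; the real DataAsset type lives in ../types/DataCatalog.
interface AssetLike {
  type: string;
  owner: string;
  tags: string[];
}

// Mirrors the listAssets filter semantics: every provided key must match,
// and a 'tags' filter may be a single tag or an array (array = match any).
function matchesFilters(asset: AssetLike, filters: Record<string, unknown>): boolean {
  return Object.entries(filters).every(([key, value]) => {
    if (key === 'type') return asset.type === value;
    if (key === 'owner') return asset.owner === value;
    if (key === 'tags') {
      return Array.isArray(value)
        ? value.some(tag => asset.tags.includes(tag as string))
        : asset.tags.includes(value as string);
    }
    return true; // unknown filter keys are ignored rather than rejected
  });
}

const asset: AssetLike = { type: 'table', owner: 'quant-team', tags: ['prices', 'daily'] };
console.log(matchesFilters(asset, { type: 'table', tags: ['daily', 'intraday'] })); // true
console.log(matchesFilters(asset, { owner: 'ops' })); // false
```

Note that unknown keys silently pass, so a typoed filter key widens rather than narrows the result set; that is worth keeping in mind when calling `listAssets`.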
@@ -0,0 +1,764 @@
import { EventBus } from '@stock-bot/event-bus';
import { Logger } from '@stock-bot/utils';
import {
  DataGovernance,
  GovernancePolicy,
  ComplianceCheck,
  RetentionPolicy,
  AccessControl,
  PrivacySettings,
  AuditEntry,
  DataAsset,
  GovernancePolicyType,
  ComplianceStatus,
  DataClassification
} from '../types/DataCatalog';

export interface DataGovernanceService {
  createPolicy(policy: Omit<GovernancePolicy, 'id' | 'createdAt' | 'updatedAt'>): Promise<GovernancePolicy>;
  updatePolicy(policyId: string, updates: Partial<GovernancePolicy>): Promise<GovernancePolicy | null>;
  deletePolicy(policyId: string): Promise<void>;
  getPolicy(policyId: string): Promise<GovernancePolicy | null>;
  listPolicies(filters?: Record<string, any>): Promise<GovernancePolicy[]>;
  applyPolicy(assetId: string, policyId: string): Promise<void>;
  removePolicy(assetId: string, policyId: string): Promise<void>;
  checkCompliance(assetId: string): Promise<ComplianceCheck[]>;
  updateRetentionPolicy(assetId: string, retention: RetentionPolicy): Promise<void>;
  updateAccessControl(assetId: string, access: AccessControl): Promise<void>;
  updatePrivacySettings(assetId: string, privacy: PrivacySettings): Promise<void>;
  auditAccess(assetId: string, userId: string, action: string, details?: any): Promise<void>;
  getAuditTrail(assetId: string, filters?: Record<string, any>): Promise<AuditEntry[]>;
  generateComplianceReport(assetIds: string[]): Promise<any>;
  validateDataAccess(assetId: string, userId: string, action: string): Promise<boolean>;
  anonymizeData(assetId: string, options?: any): Promise<void>;
  handleDataSubjectRequest(assetId: string, request: any): Promise<any>;
}

export class DataGovernanceServiceImpl implements DataGovernanceService {
  private policies: Map<string, GovernancePolicy> = new Map();
  private governance: Map<string, DataGovernance> = new Map();
  private assets: Map<string, DataAsset> = new Map();

  constructor(
    private eventBus: EventBus,
    private logger: Logger
  ) {
    this.initializeDefaultPolicies();
  }

  async createPolicy(policy: Omit<GovernancePolicy, 'id' | 'createdAt' | 'updatedAt'>): Promise<GovernancePolicy> {
    try {
      const fullPolicy: GovernancePolicy = {
        ...policy,
        id: this.generateId(),
        createdAt: new Date(),
        updatedAt: new Date()
      };

      this.policies.set(fullPolicy.id, fullPolicy);

      this.logger.info('Governance policy created', {
        policyId: fullPolicy.id,
        name: fullPolicy.name,
        type: fullPolicy.type
      });

      await this.eventBus.emit('data.governance.policy.created', {
        policy: fullPolicy,
        timestamp: new Date()
      });

      return fullPolicy;
    } catch (error) {
      this.logger.error('Failed to create governance policy', { policy, error });
      throw error;
    }
  }

  async updatePolicy(policyId: string, updates: Partial<GovernancePolicy>): Promise<GovernancePolicy | null> {
    try {
      const policy = this.policies.get(policyId);
      if (!policy) {
        return null;
      }

      const updatedPolicy: GovernancePolicy = {
        ...policy,
        ...updates,
        updatedAt: new Date()
      };

      this.policies.set(policyId, updatedPolicy);

      this.logger.info('Governance policy updated', { policyId, changes: updates });

      await this.eventBus.emit('data.governance.policy.updated', {
        policy: updatedPolicy,
        changes: updates,
        timestamp: new Date()
      });

      return updatedPolicy;
    } catch (error) {
      this.logger.error('Failed to update governance policy', { policyId, updates, error });
      throw error;
    }
  }

  async deletePolicy(policyId: string): Promise<void> {
    try {
      const policy = this.policies.get(policyId);
      if (!policy) {
        throw new Error(`Policy with id ${policyId} not found`);
      }

      this.policies.delete(policyId);

      // Remove policy from all assets
      for (const [assetId, governance] of this.governance) {
        governance.policies = governance.policies.filter(p => p.id !== policyId);
        this.governance.set(assetId, governance);
      }

      this.logger.info('Governance policy deleted', { policyId });

      await this.eventBus.emit('data.governance.policy.deleted', {
        policyId,
        policy,
        timestamp: new Date()
      });
    } catch (error) {
      this.logger.error('Failed to delete governance policy', { policyId, error });
      throw error;
    }
  }

  async getPolicy(policyId: string): Promise<GovernancePolicy | null> {
    try {
      return this.policies.get(policyId) || null;
    } catch (error) {
      this.logger.error('Failed to get governance policy', { policyId, error });
      throw error;
    }
  }

  async listPolicies(filters?: Record<string, any>): Promise<GovernancePolicy[]> {
    try {
      let policies = Array.from(this.policies.values());

      if (filters) {
        policies = policies.filter(policy => {
          return Object.entries(filters).every(([key, value]) => {
            if (key === 'type') return policy.type === value;
            if (key === 'active') return policy.active === value;
            if (key === 'classification') return policy.applicableClassifications?.includes(value);
            return true;
          });
        });
      }

      return policies;
    } catch (error) {
      this.logger.error('Failed to list governance policies', { filters, error });
      throw error;
    }
  }

  async applyPolicy(assetId: string, policyId: string): Promise<void> {
    try {
      const policy = this.policies.get(policyId);
      if (!policy) {
        throw new Error(`Policy with id ${policyId} not found`);
      }

      let governance = this.governance.get(assetId);
      if (!governance) {
        governance = this.createEmptyGovernance(assetId);
      }

      // Check if policy is already applied
      if (!governance.policies.find(p => p.id === policyId)) {
        governance.policies.push(policy);
        this.governance.set(assetId, governance);

        // Perform compliance check after applying policy
        await this.checkCompliance(assetId);

        this.logger.info('Policy applied to asset', { assetId, policyId });

        await this.eventBus.emit('data.governance.policy.applied', {
          assetId,
          policyId,
          timestamp: new Date()
        });
      }
    } catch (error) {
      this.logger.error('Failed to apply policy to asset', { assetId, policyId, error });
      throw error;
    }
  }

  async removePolicy(assetId: string, policyId: string): Promise<void> {
    try {
      const governance = this.governance.get(assetId);
      if (!governance) {
        throw new Error(`Governance not found for asset ${assetId}`);
      }

      governance.policies = governance.policies.filter(p => p.id !== policyId);
      this.governance.set(assetId, governance);

      this.logger.info('Policy removed from asset', { assetId, policyId });

      await this.eventBus.emit('data.governance.policy.removed', {
        assetId,
        policyId,
        timestamp: new Date()
      });
    } catch (error) {
      this.logger.error('Failed to remove policy from asset', { assetId, policyId, error });
      throw error;
    }
  }

  async checkCompliance(assetId: string): Promise<ComplianceCheck[]> {
    try {
      const governance = this.governance.get(assetId);
      const asset = this.assets.get(assetId);

      if (!governance || !asset) {
        return [];
      }

      const complianceChecks: ComplianceCheck[] = [];

      for (const policy of governance.policies) {
        if (!policy.active) continue;

        const check = await this.performComplianceCheck(asset, policy);
        complianceChecks.push(check);
      }

      // Update governance with compliance results
      governance.compliance = complianceChecks;
      this.governance.set(assetId, governance);

      // Log compliance issues
      const failedChecks = complianceChecks.filter(check => check.status === 'failed');
      if (failedChecks.length > 0) {
        this.logger.warn('Compliance violations detected', {
          assetId,
          violationCount: failedChecks.length
        });

        await this.eventBus.emit('data.governance.compliance.violation', {
          assetId,
          violations: failedChecks,
          timestamp: new Date()
        });
      }

      return complianceChecks;
    } catch (error) {
      this.logger.error('Failed to check compliance', { assetId, error });
      throw error;
    }
  }

  async updateRetentionPolicy(assetId: string, retention: RetentionPolicy): Promise<void> {
    try {
      let governance = this.governance.get(assetId);
      if (!governance) {
        governance = this.createEmptyGovernance(assetId);
      }

      governance.retention = retention;
      this.governance.set(assetId, governance);

      this.logger.info('Retention policy updated', { assetId, retentionPeriod: retention.retentionPeriod });

      await this.eventBus.emit('data.governance.retention.updated', {
        assetId,
        retention,
        timestamp: new Date()
      });
    } catch (error) {
      this.logger.error('Failed to update retention policy', { assetId, retention, error });
      throw error;
    }
  }

  async updateAccessControl(assetId: string, access: AccessControl): Promise<void> {
    try {
      let governance = this.governance.get(assetId);
      if (!governance) {
        governance = this.createEmptyGovernance(assetId);
      }

      governance.access = access;
      this.governance.set(assetId, governance);

      this.logger.info('Access control updated', { assetId, defaultAccess: access.defaultAccess });

      await this.eventBus.emit('data.governance.access.updated', {
        assetId,
        access,
        timestamp: new Date()
      });
    } catch (error) {
      this.logger.error('Failed to update access control', { assetId, access, error });
      throw error;
    }
  }

  async updatePrivacySettings(assetId: string, privacy: PrivacySettings): Promise<void> {
    try {
      let governance = this.governance.get(assetId);
      if (!governance) {
        governance = this.createEmptyGovernance(assetId);
      }

      governance.privacy = privacy;
      this.governance.set(assetId, governance);

      this.logger.info('Privacy settings updated', {
        assetId,
        containsPII: privacy.containsPII,
        consentRequired: privacy.consentRequired
      });

      await this.eventBus.emit('data.governance.privacy.updated', {
        assetId,
        privacy,
        timestamp: new Date()
      });
    } catch (error) {
      this.logger.error('Failed to update privacy settings', { assetId, privacy, error });
      throw error;
    }
  }

  async auditAccess(assetId: string, userId: string, action: string, details?: any): Promise<void> {
    try {
      let governance = this.governance.get(assetId);
      if (!governance) {
        governance = this.createEmptyGovernance(assetId);
      }

      const auditEntry: AuditEntry = {
        id: this.generateId(),
        userId,
        action,
        timestamp: new Date(),
        ipAddress: details?.ipAddress,
        userAgent: details?.userAgent,
        details
      };

      governance.audit.push(auditEntry);
      this.governance.set(assetId, governance);

      this.logger.info('Access audited', { assetId, userId, action });

      await this.eventBus.emit('data.governance.access.audited', {
        assetId,
        auditEntry,
        timestamp: new Date()
      });
    } catch (error) {
      this.logger.error('Failed to audit access', { assetId, userId, action, error });
      throw error;
    }
  }

  async getAuditTrail(assetId: string, filters?: Record<string, any>): Promise<AuditEntry[]> {
    try {
      const governance = this.governance.get(assetId);
      if (!governance) {
        return [];
      }

      let auditEntries = governance.audit;

      if (filters) {
        auditEntries = auditEntries.filter(entry => {
          return Object.entries(filters).every(([key, value]) => {
            if (key === 'userId') return entry.userId === value;
            if (key === 'action') return entry.action === value;
            if (key === 'fromDate') return entry.timestamp >= new Date(value);
            if (key === 'toDate') return entry.timestamp <= new Date(value);
            return true;
          });
        });
      }

      return auditEntries.sort((a, b) => b.timestamp.getTime() - a.timestamp.getTime());
    } catch (error) {
      this.logger.error('Failed to get audit trail', { assetId, filters, error });
      throw error;
    }
  }

  async generateComplianceReport(assetIds: string[]): Promise<any> {
    try {
      const reportData = {
        summary: {
          totalAssets: assetIds.length,
          compliantAssets: 0,
          nonCompliantAssets: 0,
          violationCount: 0,
          reportDate: new Date()
        },
        assetCompliance: [] as any[],
        policyViolations: [] as any[],
        recommendations: [] as string[]
      };

      let totalViolations = 0;

      for (const assetId of assetIds) {
        const governance = this.governance.get(assetId);
        const asset = this.assets.get(assetId);

        if (governance && asset) {
          const complianceChecks = await this.checkCompliance(assetId);
          const violations = complianceChecks.filter(check => check.status === 'failed');
          const isCompliant = violations.length === 0;

          if (isCompliant) {
            reportData.summary.compliantAssets++;
          } else {
            reportData.summary.nonCompliantAssets++;
          }

          totalViolations += violations.length;

          reportData.assetCompliance.push({
            assetId,
            assetName: asset.name,
            classification: asset.classification,
            compliant: isCompliant,
            violationCount: violations.length,
            policiesApplied: governance.policies.length,
            lastChecked: new Date()
          });

          // Add violations to report
          violations.forEach(violation => {
            reportData.policyViolations.push({
              assetId,
              assetName: asset.name,
              policyName: violation.policyName,
              violation: violation.details,
              severity: violation.severity || 'medium',
              checkedAt: violation.checkedAt
            });
          });
        }
      }

      reportData.summary.violationCount = totalViolations;

      // Generate recommendations
      reportData.recommendations = this.generateComplianceRecommendations(reportData);

      this.logger.info('Compliance report generated', {
        totalAssets: assetIds.length,
        compliantAssets: reportData.summary.compliantAssets,
        violationCount: totalViolations
      });

      return reportData;
    } catch (error) {
      this.logger.error('Failed to generate compliance report', { assetIds, error });
      throw error;
    }
  }

  async validateDataAccess(assetId: string, userId: string, action: string): Promise<boolean> {
    try {
      const governance = this.governance.get(assetId);
      const asset = this.assets.get(assetId);

      if (!governance || !asset) {
        return false;
      }

      // Check default access
      if (governance.access.defaultAccess === 'none') {
        // Must have explicit permission
        const hasUserAccess = governance.access.users.some(user =>
          user.userId === userId && user.permissions.includes(action)
        );

        const hasRoleAccess = governance.access.roles.some(role =>
          role.permissions.includes(action) // Simplified - would check user roles
        );

        return hasUserAccess || hasRoleAccess;
      }

      // Check if explicitly denied
      const isDenied = governance.access.users.some(user =>
        user.userId === userId && user.permissions.includes(`deny:${action}`)
      );

      if (isDenied) {
        return false;
      }

      // Check classification-based access
      if (asset.classification === 'restricted' || asset.classification === 'confidential') {
        // Require explicit permission for sensitive data
        const hasPermission = governance.access.users.some(user =>
          user.userId === userId && user.permissions.includes(action)
        );
        return hasPermission;
      }

      return true; // Default allow for non-sensitive data
    } catch (error) {
      this.logger.error('Failed to validate data access', { assetId, userId, action, error });
      return false;
    }
  }

  async anonymizeData(assetId: string, options?: any): Promise<void> {
    try {
      const governance = this.governance.get(assetId);
      if (!governance || !governance.privacy.containsPII) {
        return;
      }

      // Apply anonymization rules
      for (const rule of governance.privacy.anonymizationRules) {
        await this.applyAnonymizationRule(assetId, rule, options);
      }

      this.logger.info('Data anonymization completed', { assetId });

      await this.eventBus.emit('data.governance.anonymization.completed', {
        assetId,
        options,
        timestamp: new Date()
      });
    } catch (error) {
      this.logger.error('Failed to anonymize data', { assetId, options, error });
      throw error;
    }
  }

  async handleDataSubjectRequest(assetId: string, request: any): Promise<any> {
    try {
      const governance = this.governance.get(assetId);
      const asset = this.assets.get(assetId);

      if (!governance || !asset) {
        throw new Error(`Asset or governance not found for ${assetId}`);
      }

      let response: any = {};

      switch (request.type) {
        case 'access':
          response = await this.handleAccessRequest(assetId, request);
          break;
        case 'rectification':
          response = await this.handleRectificationRequest(assetId, request);
          break;
        case 'erasure':
          response = await this.handleErasureRequest(assetId, request);
          break;
        case 'portability':
          response = await this.handlePortabilityRequest(assetId, request);
          break;
        default:
          throw new Error(`Unsupported request type: ${request.type}`);
      }

      this.logger.info('Data subject request handled', { assetId, requestType: request.type });

      await this.eventBus.emit('data.governance.subject.request.handled', {
        assetId,
        request,
        response,
        timestamp: new Date()
      });

      return response;
    } catch (error) {
      this.logger.error('Failed to handle data subject request', { assetId, request, error });
      throw error;
    }
  }

  // Private helper methods
  private initializeDefaultPolicies(): void {
    const defaultPolicies: GovernancePolicy[] = [
      {
        id: 'policy_pii_protection',
        name: 'PII Protection Policy',
        description: 'Ensures proper handling of personally identifiable information',
        type: 'privacy',
        rules: [
          'PII data must be encrypted at rest',
          'PII access must be logged',
          'PII retention must not exceed 7 years'
        ],
        applicableClassifications: ['pii'],
        active: true,
        severity: 'high',
        createdAt: new Date(),
        updatedAt: new Date()
      },
      {
        id: 'policy_financial_compliance',
        name: 'Financial Data Compliance',
        description: 'Compliance with financial regulations',
        type: 'compliance',
        rules: [
          'Financial data must be retained for 7 years',
          'Access to financial data must be role-based',
          'All financial data access must be audited'
        ],
        applicableClassifications: ['financial'],
        active: true,
        severity: 'critical',
        createdAt: new Date(),
        updatedAt: new Date()
      }
    ];

    defaultPolicies.forEach(policy => {
      this.policies.set(policy.id, policy);
    });
  }

  private createEmptyGovernance(assetId: string): DataGovernance {
    return {
      id: this.generateId(),
      assetId,
      policies: [],
      compliance: [],
      retention: {
        retentionPeriod: 365,
        retentionReason: 'Business requirement',
        legalHold: false
      },
      access: {
        defaultAccess: 'none',
        roles: [],
        users: []
      },
      privacy: {
        containsPII: false,
        sensitiveFields: [],
        anonymizationRules: [],
        consentRequired: false,
        dataSubjectRights: []
      },
      audit: []
    };
  }

  private async performComplianceCheck(asset: DataAsset, policy: GovernancePolicy): Promise<ComplianceCheck> {
    // Mock compliance check implementation
    // In a real scenario, this would validate actual compliance

    const isCompliant = Math.random() > 0.1; // 90% compliance rate for demo

    const check: ComplianceCheck = {
      id: this.generateId(),
      policyId: policy.id,
      policyName: policy.name,
      status: isCompliant ? 'passed' : 'failed',
      checkedAt: new Date(),
      details: isCompliant ? 'All policy requirements met' : 'Policy violation detected',
      severity: policy.severity
    };

    if (!isCompliant) {
      check.recommendations = [
        'Review data handling procedures',
        'Update access controls',
        'Implement additional monitoring'
      ];
    }

    return check;
  }

  private async applyAnonymizationRule(assetId: string, rule: any, options?: any): Promise<void> {
    // Mock anonymization implementation
    this.logger.info('Applying anonymization rule', { assetId, rule: rule.type });
  }

  private async handleAccessRequest(assetId: string, request: any): Promise<any> {
    return {
      status: 'completed',
      data: 'Data access provided according to privacy policy',
      timestamp: new Date()
    };
  }

  private async handleRectificationRequest(assetId: string, request: any): Promise<any> {
    return {
      status: 'completed',
      changes: 'Data rectification completed',
      timestamp: new Date()
    };
  }

  private async handleErasureRequest(assetId: string, request: any): Promise<any> {
    return {
      status: 'completed',
      erasure: 'Data erasure completed',
      timestamp: new Date()
    };
  }

  private async handlePortabilityRequest(assetId: string, request: any): Promise<any> {
    return {
      status: 'completed',
      export: 'Data export provided',
      timestamp: new Date()
    };
  }

  private generateComplianceRecommendations(reportData: any): string[] {
    const recommendations: string[] = [];

    if (reportData.summary.nonCompliantAssets > 0) {
      recommendations.push(`${reportData.summary.nonCompliantAssets} assets require compliance remediation.`);
    }

    if (reportData.summary.violationCount > 10) {
      recommendations.push('High number of policy violations detected. Review governance policies and implementation.');
    }

    const criticalViolations = reportData.policyViolations.filter((v: any) => v.severity === 'critical');
    if (criticalViolations.length > 0) {
      recommendations.push(`${criticalViolations.length} critical violations require immediate attention.`);
    }

    if (recommendations.length === 0) {
      recommendations.push('All assets are compliant with governance policies. Continue monitoring.');
    }

    return recommendations;
  }

  private generateId(): string {
    return `governance_${Date.now()}_${Math.random().toString(36).substr(2, 9)}`;
  }

  // Method to inject assets (typically from DataCatalogService)
  setAssets(assets: Map<string, DataAsset>): void {
    this.assets = assets;
  }

  // Method to inject governance (typically from DataCatalogService)
  setGovernance(governance: Map<string, DataGovernance>): void {
    this.governance = governance;
  }
}
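The access-validation logic in `validateDataAccess` applies its rules in a fixed order: a `defaultAccess` of `'none'` requires an explicit grant, an explicit `deny:<action>` entry always wins, and `restricted`/`confidential` classifications require an explicit grant even when default access is open. A self-contained sketch of that decision order, using hypothetical trimmed-down shapes in place of the real `AccessControl` and `DataAsset` types:

```typescript
// Hypothetical minimal shapes standing in for AccessControl / DataAsset.
interface UserGrant { userId: string; permissions: string[] }
interface Access { defaultAccess: 'none' | 'read'; users: UserGrant[] }

// Mirrors the decision order in validateDataAccess:
// 1. defaultAccess 'none'  -> explicit grant required
// 2. explicit `deny:<action>` entries always win
// 3. restricted/confidential data requires an explicit grant
// 4. otherwise allow
function canAccess(access: Access, classification: string, userId: string, action: string): boolean {
  const granted = access.users.some(u => u.userId === userId && u.permissions.includes(action));
  if (access.defaultAccess === 'none') return granted;
  const denied = access.users.some(u => u.userId === userId && u.permissions.includes(`deny:${action}`));
  if (denied) return false;
  if (classification === 'restricted' || classification === 'confidential') return granted;
  return true; // default allow for non-sensitive data
}

const exampleAccess: Access = {
  defaultAccess: 'read',
  users: [{ userId: 'u1', permissions: ['deny:write'] }]
};
console.log(canAccess(exampleAccess, 'internal', 'u1', 'write')); // false: explicit deny wins
console.log(canAccess(exampleAccess, 'restricted', 'u2', 'read')); // false: restricted needs a grant
```

One consequence of this ordering is that a `deny:` entry only takes effect when `defaultAccess` is not `'none'`; under `'none'`, absence of a grant already denies.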
@@ -0,0 +1,607 @@
import { EventBus } from '@stock-bot/event-bus';
|
||||
import { Logger } from '@stock-bot/utils';
|
||||
import {
|
||||
DataLineage,
|
||||
DataAsset,
|
||||
LineageTransformation,
|
||||
ImpactAnalysis,
|
||||
LineageQuery,
|
||||
LineageDirection
|
||||
} from '../types/DataCatalog';
|
||||
|
||||
export interface DataLineageService {
|
||||
addLineage(lineage: DataLineage): Promise<void>;
|
||||
getLineage(assetId: string): Promise<DataLineage | null>;
|
||||
updateLineage(assetId: string, lineage: Partial<DataLineage>): Promise<DataLineage | null>;
|
||||
addUpstreamDependency(assetId: string, upstreamAssetId: string, transformation?: LineageTransformation): Promise<void>;
|
||||
addDownstreamDependency(assetId: string, downstreamAssetId: string, transformation?: LineageTransformation): Promise<void>;
|
||||
removeUpstreamDependency(assetId: string, upstreamAssetId: string): Promise<void>;
|
||||
removeDownstreamDependency(assetId: string, downstreamAssetId: string): Promise<void>;
|
||||
getUpstreamAssets(assetId: string, depth?: number): Promise<DataAsset[]>;
|
||||
getDownstreamAssets(assetId: string, depth?: number): Promise<DataAsset[]>;
|
||||
analyzeImpact(assetId: string): Promise<ImpactAnalysis>;
|
||||
queryLineage(query: LineageQuery): Promise<DataAsset[]>;
|
||||
getLineageGraph(assetId: string, direction: LineageDirection, depth?: number): Promise<any>;
|
||||
detectCircularDependencies(): Promise<string[][]>;
|
||||
}
|
||||
|
||||
export class DataLineageServiceImpl implements DataLineageService {
  private lineages: Map<string, DataLineage> = new Map();
  private assets: Map<string, DataAsset> = new Map();

  constructor(
    private eventBus: EventBus,
    private logger: Logger
  ) {}

  async addLineage(lineage: DataLineage): Promise<void> {
    try {
      this.lineages.set(lineage.assetId, lineage);

      this.logger.info('Data lineage added', {
        assetId: lineage.assetId,
        upstreamCount: lineage.upstreamAssets.length,
        downstreamCount: lineage.downstreamAssets.length
      });

      await this.eventBus.emit('data.lineage.added', {
        assetId: lineage.assetId,
        lineage,
        timestamp: new Date()
      });
    } catch (error) {
      this.logger.error('Failed to add data lineage', { lineage, error });
      throw error;
    }
  }

  async getLineage(assetId: string): Promise<DataLineage | null> {
    try {
      return this.lineages.get(assetId) || null;
    } catch (error) {
      this.logger.error('Failed to get data lineage', { assetId, error });
      throw error;
    }
  }

  async updateLineage(assetId: string, lineage: Partial<DataLineage>): Promise<DataLineage | null> {
    try {
      const existingLineage = this.lineages.get(assetId);
      if (!existingLineage) {
        return null;
      }

      const updatedLineage: DataLineage = {
        ...existingLineage,
        ...lineage,
        updatedAt: new Date()
      };

      this.lineages.set(assetId, updatedLineage);

      this.logger.info('Data lineage updated', { assetId, changes: lineage });

      await this.eventBus.emit('data.lineage.updated', {
        assetId,
        lineage: updatedLineage,
        changes: lineage,
        timestamp: new Date()
      });

      return updatedLineage;
    } catch (error) {
      this.logger.error('Failed to update data lineage', { assetId, lineage, error });
      throw error;
    }
  }

  async addUpstreamDependency(
    assetId: string,
    upstreamAssetId: string,
    transformation?: LineageTransformation
  ): Promise<void> {
    try {
      let lineage = this.lineages.get(assetId);
      if (!lineage) {
        lineage = this.createEmptyLineage(assetId);
      }

      // Check if dependency already exists
      if (!lineage.upstreamAssets.includes(upstreamAssetId)) {
        lineage.upstreamAssets.push(upstreamAssetId);

        if (transformation) {
          lineage.transformations.push(transformation);
        }

        lineage.updatedAt = new Date();
        this.lineages.set(assetId, lineage);

        // Update downstream lineage of the upstream asset
        await this.addDownstreamToUpstream(upstreamAssetId, assetId);

        this.logger.info('Upstream dependency added', { assetId, upstreamAssetId });

        await this.eventBus.emit('data.lineage.dependency.added', {
          assetId,
          upstreamAssetId,
          transformation,
          timestamp: new Date()
        });
      }
    } catch (error) {
      this.logger.error('Failed to add upstream dependency', { assetId, upstreamAssetId, error });
      throw error;
    }
  }

  async addDownstreamDependency(
    assetId: string,
    downstreamAssetId: string,
    transformation?: LineageTransformation
  ): Promise<void> {
    try {
      let lineage = this.lineages.get(assetId);
      if (!lineage) {
        lineage = this.createEmptyLineage(assetId);
      }

      // Check if dependency already exists
      if (!lineage.downstreamAssets.includes(downstreamAssetId)) {
        lineage.downstreamAssets.push(downstreamAssetId);
        lineage.updatedAt = new Date();
        this.lineages.set(assetId, lineage);

        // Update upstream lineage of the downstream asset
        await this.addUpstreamToDownstream(downstreamAssetId, assetId, transformation);

        this.logger.info('Downstream dependency added', { assetId, downstreamAssetId });

        await this.eventBus.emit('data.lineage.dependency.added', {
          assetId,
          downstreamAssetId,
          transformation,
          timestamp: new Date()
        });
      }
    } catch (error) {
      this.logger.error('Failed to add downstream dependency', { assetId, downstreamAssetId, error });
      throw error;
    }
  }

  async removeUpstreamDependency(assetId: string, upstreamAssetId: string): Promise<void> {
    try {
      const lineage = this.lineages.get(assetId);
      if (lineage) {
        lineage.upstreamAssets = lineage.upstreamAssets.filter(id => id !== upstreamAssetId);
        lineage.updatedAt = new Date();
        this.lineages.set(assetId, lineage);

        // Remove from downstream lineage of upstream asset
        await this.removeDownstreamFromUpstream(upstreamAssetId, assetId);

        this.logger.info('Upstream dependency removed', { assetId, upstreamAssetId });

        await this.eventBus.emit('data.lineage.dependency.removed', {
          assetId,
          upstreamAssetId,
          timestamp: new Date()
        });
      }
    } catch (error) {
      this.logger.error('Failed to remove upstream dependency', { assetId, upstreamAssetId, error });
      throw error;
    }
  }

  async removeDownstreamDependency(assetId: string, downstreamAssetId: string): Promise<void> {
    try {
      const lineage = this.lineages.get(assetId);
      if (lineage) {
        lineage.downstreamAssets = lineage.downstreamAssets.filter(id => id !== downstreamAssetId);
        lineage.updatedAt = new Date();
        this.lineages.set(assetId, lineage);

        // Remove from upstream lineage of downstream asset
        await this.removeUpstreamFromDownstream(downstreamAssetId, assetId);

        this.logger.info('Downstream dependency removed', { assetId, downstreamAssetId });

        await this.eventBus.emit('data.lineage.dependency.removed', {
          assetId,
          downstreamAssetId,
          timestamp: new Date()
        });
      }
    } catch (error) {
      this.logger.error('Failed to remove downstream dependency', { assetId, downstreamAssetId, error });
      throw error;
    }
  }

  async getUpstreamAssets(assetId: string, depth: number = 1): Promise<DataAsset[]> {
    try {
      const visited = new Set<string>();
      const result: DataAsset[] = [];

      await this.traverseUpstream(assetId, depth, visited, result);

      return result;
    } catch (error) {
      this.logger.error('Failed to get upstream assets', { assetId, depth, error });
      throw error;
    }
  }

  async getDownstreamAssets(assetId: string, depth: number = 1): Promise<DataAsset[]> {
    try {
      const visited = new Set<string>();
      const result: DataAsset[] = [];

      await this.traverseDownstream(assetId, depth, visited, result);

      return result;
    } catch (error) {
      this.logger.error('Failed to get downstream assets', { assetId, depth, error });
      throw error;
    }
  }

  async analyzeImpact(assetId: string): Promise<ImpactAnalysis> {
    try {
      const downstreamAssets = await this.getDownstreamAssets(assetId, 5); // Go deep for impact analysis
      const affectedUsers = new Set<string>();

      // Collect all users who might be affected
      for (const asset of downstreamAssets) {
        affectedUsers.add(asset.owner);
        if (asset.steward) {
          affectedUsers.add(asset.steward);
        }
        // Add users from usage analytics
        asset.usage.topUsers.forEach(user => affectedUsers.add(user.userId));
      }

      // Calculate impact level
      let estimatedImpact: 'low' | 'medium' | 'high' | 'critical' = 'low';
      if (downstreamAssets.length > 20) {
        estimatedImpact = 'critical';
      } else if (downstreamAssets.length > 10) {
        estimatedImpact = 'high';
      } else if (downstreamAssets.length > 5) {
        estimatedImpact = 'medium';
      }

      const impact: ImpactAnalysis = {
        downstreamAssets: downstreamAssets.map(asset => asset.id),
        affectedUsers: Array.from(affectedUsers),
        estimatedImpact,
        impactDescription: this.generateImpactDescription(downstreamAssets.length, affectedUsers.size),
        recommendations: this.generateRecommendations(estimatedImpact, downstreamAssets.length)
      };

      this.logger.info('Impact analysis completed', {
        assetId,
        impactLevel: estimatedImpact,
        affectedAssets: downstreamAssets.length,
        affectedUsers: affectedUsers.size
      });

      return impact;
    } catch (error) {
      this.logger.error('Failed to analyze impact', { assetId, error });
      throw error;
    }
  }

  async queryLineage(query: LineageQuery): Promise<DataAsset[]> {
    try {
      const results: DataAsset[] = [];

      if (query.assetIds) {
        for (const assetId of query.assetIds) {
          if (query.direction === 'upstream' || query.direction === 'both') {
            const upstream = await this.getUpstreamAssets(assetId, query.depth);
            results.push(...upstream);
          }
          if (query.direction === 'downstream' || query.direction === 'both') {
            const downstream = await this.getDownstreamAssets(assetId, query.depth);
            results.push(...downstream);
          }
        }
      }

      // Remove duplicates
      const uniqueResults = results.filter((asset, index, arr) =>
        arr.findIndex(a => a.id === asset.id) === index
      );

      return uniqueResults;
    } catch (error) {
      this.logger.error('Failed to query lineage', { query, error });
      throw error;
    }
  }

  async getLineageGraph(assetId: string, direction: LineageDirection, depth: number = 3): Promise<any> {
    try {
      const graph = {
        nodes: new Map<string, any>(),
        edges: [] as any[]
      };

      const visited = new Set<string>();
      await this.buildLineageGraph(assetId, direction, depth, visited, graph);

      return {
        nodes: Array.from(graph.nodes.values()),
        edges: graph.edges
      };
    } catch (error) {
      this.logger.error('Failed to get lineage graph', { assetId, direction, depth, error });
      throw error;
    }
  }

  async detectCircularDependencies(): Promise<string[][]> {
    try {
      const cycles: string[][] = [];
      const visited = new Set<string>();
      const recursionStack = new Set<string>();

      for (const assetId of this.lineages.keys()) {
        if (!visited.has(assetId)) {
          const path: string[] = [];
          await this.detectCycleDFS(assetId, visited, recursionStack, path, cycles);
        }
      }

      if (cycles.length > 0) {
        this.logger.warn('Circular dependencies detected', { cycleCount: cycles.length });
      }

      return cycles;
    } catch (error) {
      this.logger.error('Failed to detect circular dependencies', { error });
      throw error;
    }
  }

  // Private helper methods
  private createEmptyLineage(assetId: string): DataLineage {
    return {
      id: this.generateId(),
      assetId,
      upstreamAssets: [],
      downstreamAssets: [],
      transformations: [],
      impact: {
        downstreamAssets: [],
        affectedUsers: [],
        estimatedImpact: 'low',
        impactDescription: '',
        recommendations: []
      },
      createdAt: new Date(),
      updatedAt: new Date()
    };
  }

  private async addDownstreamToUpstream(upstreamAssetId: string, downstreamAssetId: string): Promise<void> {
    let upstreamLineage = this.lineages.get(upstreamAssetId);
    if (!upstreamLineage) {
      upstreamLineage = this.createEmptyLineage(upstreamAssetId);
    }

    if (!upstreamLineage.downstreamAssets.includes(downstreamAssetId)) {
      upstreamLineage.downstreamAssets.push(downstreamAssetId);
      upstreamLineage.updatedAt = new Date();
      this.lineages.set(upstreamAssetId, upstreamLineage);
    }
  }

  private async addUpstreamToDownstream(
    downstreamAssetId: string,
    upstreamAssetId: string,
    transformation?: LineageTransformation
  ): Promise<void> {
    let downstreamLineage = this.lineages.get(downstreamAssetId);
    if (!downstreamLineage) {
      downstreamLineage = this.createEmptyLineage(downstreamAssetId);
    }

    if (!downstreamLineage.upstreamAssets.includes(upstreamAssetId)) {
      downstreamLineage.upstreamAssets.push(upstreamAssetId);

      if (transformation) {
        downstreamLineage.transformations.push(transformation);
      }

      downstreamLineage.updatedAt = new Date();
      this.lineages.set(downstreamAssetId, downstreamLineage);
    }
  }

  private async removeDownstreamFromUpstream(upstreamAssetId: string, downstreamAssetId: string): Promise<void> {
    const upstreamLineage = this.lineages.get(upstreamAssetId);
    if (upstreamLineage) {
      upstreamLineage.downstreamAssets = upstreamLineage.downstreamAssets.filter(id => id !== downstreamAssetId);
      upstreamLineage.updatedAt = new Date();
      this.lineages.set(upstreamAssetId, upstreamLineage);
    }
  }

  private async removeUpstreamFromDownstream(downstreamAssetId: string, upstreamAssetId: string): Promise<void> {
    const downstreamLineage = this.lineages.get(downstreamAssetId);
    if (downstreamLineage) {
      downstreamLineage.upstreamAssets = downstreamLineage.upstreamAssets.filter(id => id !== upstreamAssetId);
      downstreamLineage.updatedAt = new Date();
      this.lineages.set(downstreamAssetId, downstreamLineage);
    }
  }

  private async traverseUpstream(
    assetId: string,
    remainingDepth: number,
    visited: Set<string>,
    result: DataAsset[]
  ): Promise<void> {
    if (remainingDepth === 0 || visited.has(assetId)) {
      return;
    }

    visited.add(assetId);
    const lineage = this.lineages.get(assetId);

    if (lineage) {
      for (const upstreamId of lineage.upstreamAssets) {
        const asset = this.assets.get(upstreamId);
        if (asset && !result.find(a => a.id === asset.id)) {
          result.push(asset);
        }
        await this.traverseUpstream(upstreamId, remainingDepth - 1, visited, result);
      }
    }
  }

  private async traverseDownstream(
    assetId: string,
    remainingDepth: number,
    visited: Set<string>,
    result: DataAsset[]
  ): Promise<void> {
    if (remainingDepth === 0 || visited.has(assetId)) {
      return;
    }

    visited.add(assetId);
    const lineage = this.lineages.get(assetId);

    if (lineage) {
      for (const downstreamId of lineage.downstreamAssets) {
        const asset = this.assets.get(downstreamId);
        if (asset && !result.find(a => a.id === asset.id)) {
          result.push(asset);
        }
        await this.traverseDownstream(downstreamId, remainingDepth - 1, visited, result);
      }
    }
  }

  private async buildLineageGraph(
    assetId: string,
    direction: LineageDirection,
    remainingDepth: number,
    visited: Set<string>,
    graph: any
  ): Promise<void> {
    if (remainingDepth === 0 || visited.has(assetId)) {
      return;
    }

    visited.add(assetId);
    const asset = this.assets.get(assetId);
    const lineage = this.lineages.get(assetId);

    if (asset) {
      graph.nodes.set(assetId, {
        id: assetId,
        name: asset.name,
        type: asset.type,
        classification: asset.classification
      });
    }

    if (lineage) {
      if (direction === 'upstream' || direction === 'both') {
        for (const upstreamId of lineage.upstreamAssets) {
          graph.edges.push({
            source: upstreamId,
            target: assetId,
            type: 'upstream'
          });
          await this.buildLineageGraph(upstreamId, direction, remainingDepth - 1, visited, graph);
        }
      }

      if (direction === 'downstream' || direction === 'both') {
        for (const downstreamId of lineage.downstreamAssets) {
          graph.edges.push({
            source: assetId,
            target: downstreamId,
            type: 'downstream'
          });
          await this.buildLineageGraph(downstreamId, direction, remainingDepth - 1, visited, graph);
        }
      }
    }
  }

  private async detectCycleDFS(
    assetId: string,
    visited: Set<string>,
    recursionStack: Set<string>,
    path: string[],
    cycles: string[][]
  ): Promise<void> {
    visited.add(assetId);
    recursionStack.add(assetId);
    path.push(assetId);

    const lineage = this.lineages.get(assetId);
    if (lineage) {
      for (const downstreamId of lineage.downstreamAssets) {
        if (!visited.has(downstreamId)) {
          await this.detectCycleDFS(downstreamId, visited, recursionStack, path, cycles);
        } else if (recursionStack.has(downstreamId)) {
          // Found a cycle
          const cycleStart = path.indexOf(downstreamId);
          cycles.push(path.slice(cycleStart));
        }
      }
    }

    path.pop();
    recursionStack.delete(assetId);
  }

  private generateImpactDescription(assetCount: number, userCount: number): string {
    if (assetCount === 0) {
      return 'No downstream dependencies identified.';
    }

    return `Changes to this asset may affect ${assetCount} downstream asset(s) and ${userCount} user(s).`;
  }

  private generateRecommendations(impact: string, assetCount: number): string[] {
    const recommendations: string[] = [];

    if (impact === 'critical') {
      recommendations.push('Schedule maintenance window');
      recommendations.push('Notify all stakeholders in advance');
      recommendations.push('Prepare rollback plan');
      recommendations.push('Consider phased rollout');
    } else if (impact === 'high') {
      recommendations.push('Notify affected users');
      recommendations.push('Test changes thoroughly');
      recommendations.push('Monitor downstream systems');
    } else if (impact === 'medium') {
      recommendations.push('Test with subset of data');
      recommendations.push('Monitor for issues');
    } else {
      recommendations.push('Standard testing procedures apply');
    }

    return recommendations;
  }

  private generateId(): string {
    // slice replaces the deprecated substr; same 9-character suffix
    return `lineage_${Date.now()}_${Math.random().toString(36).slice(2, 11)}`;
  }

  // Method to inject assets (typically from DataCatalogService)
  setAssets(assets: Map<string, DataAsset>): void {
    this.assets = assets;
  }
}
@ -0,0 +1,734 @@
import { EventBus } from '@stock-bot/event-bus';
import { Logger } from '@stock-bot/utils';
import {
  DataQuality,
  QualityDimension,
  QualityRule,
  QualityIssue,
  QualityTrend,
  DataAsset,
  QualityAssessmentRequest,
  QualityRuleType,
  QualitySeverity
} from '../types/DataCatalog';

export interface DataQualityService {
  assessQuality(assetId: string, request: QualityAssessmentRequest): Promise<DataQuality>;
  getQuality(assetId: string): Promise<DataQuality | null>;
  updateQuality(assetId: string, quality: Partial<DataQuality>): Promise<DataQuality | null>;
  addQualityRule(assetId: string, rule: QualityRule): Promise<void>;
  removeQualityRule(assetId: string, ruleId: string): Promise<void>;
  validateRule(assetId: string, ruleId: string): Promise<boolean>;
  reportIssue(assetId: string, issue: Omit<QualityIssue, 'id' | 'detectedAt'>): Promise<void>;
  resolveIssue(assetId: string, issueId: string): Promise<void>;
  getTrendAnalysis(assetId: string, timeframe: string): Promise<QualityTrend>;
  getQualityMetrics(filters?: Record<string, any>): Promise<any>;
  generateQualityReport(assetIds: string[]): Promise<any>;
}

export class DataQualityServiceImpl implements DataQualityService {
  private qualities: Map<string, DataQuality> = new Map();
  private assets: Map<string, DataAsset> = new Map();

  constructor(
    private eventBus: EventBus,
    private logger: Logger
  ) {}

  async assessQuality(assetId: string, request: QualityAssessmentRequest): Promise<DataQuality> {
    try {
      const asset = this.assets.get(assetId);
      if (!asset) {
        throw new Error(`Asset with id ${assetId} not found`);
      }

      let quality = this.qualities.get(assetId);
      if (!quality) {
        quality = this.createEmptyQuality(assetId);
      }

      // Perform quality assessment based on request
      const assessmentResults = await this.performQualityAssessment(asset, request);

      // Update quality metrics
      quality.dimensions = assessmentResults.dimensions;
      quality.overallScore = this.calculateOverallScore(assessmentResults.dimensions);
      quality.lastAssessment = new Date();

      // Update trend data
      this.updateQualityTrend(quality, quality.overallScore);

      this.qualities.set(assetId, quality);

      this.logger.info('Quality assessment completed', {
        assetId,
        overallScore: quality.overallScore,
        dimensionCount: quality.dimensions.length
      });

      await this.eventBus.emit('data.quality.assessed', {
        assetId,
        quality,
        request,
        timestamp: new Date()
      });

      return quality;
    } catch (error) {
      this.logger.error('Failed to assess quality', { assetId, request, error });
      throw error;
    }
  }

  async getQuality(assetId: string): Promise<DataQuality | null> {
    try {
      return this.qualities.get(assetId) || null;
    } catch (error) {
      this.logger.error('Failed to get quality', { assetId, error });
      throw error;
    }
  }

  async updateQuality(assetId: string, quality: Partial<DataQuality>): Promise<DataQuality | null> {
    try {
      const existingQuality = this.qualities.get(assetId);
      if (!existingQuality) {
        return null;
      }

      const updatedQuality: DataQuality = {
        ...existingQuality,
        ...quality,
        lastAssessment: new Date()
      };

      this.qualities.set(assetId, updatedQuality);

      this.logger.info('Quality updated', { assetId, changes: quality });

      await this.eventBus.emit('data.quality.updated', {
        assetId,
        quality: updatedQuality,
        changes: quality,
        timestamp: new Date()
      });

      return updatedQuality;
    } catch (error) {
      this.logger.error('Failed to update quality', { assetId, quality, error });
      throw error;
    }
  }

  async addQualityRule(assetId: string, rule: QualityRule): Promise<void> {
    try {
      let quality = this.qualities.get(assetId);
      if (!quality) {
        quality = this.createEmptyQuality(assetId);
      }

      // Ensure rule has an ID
      if (!rule.id) {
        rule.id = this.generateId();
      }

      quality.rules.push(rule);
      this.qualities.set(assetId, quality);

      this.logger.info('Quality rule added', { assetId, ruleId: rule.id, ruleType: rule.type });

      await this.eventBus.emit('data.quality.rule.added', {
        assetId,
        rule,
        timestamp: new Date()
      });
    } catch (error) {
      this.logger.error('Failed to add quality rule', { assetId, rule, error });
      throw error;
    }
  }

  async removeQualityRule(assetId: string, ruleId: string): Promise<void> {
    try {
      const quality = this.qualities.get(assetId);
      if (!quality) {
        throw new Error(`Quality not found for asset ${assetId}`);
      }

      quality.rules = quality.rules.filter(rule => rule.id !== ruleId);
      this.qualities.set(assetId, quality);

      this.logger.info('Quality rule removed', { assetId, ruleId });

      await this.eventBus.emit('data.quality.rule.removed', {
        assetId,
        ruleId,
        timestamp: new Date()
      });
    } catch (error) {
      this.logger.error('Failed to remove quality rule', { assetId, ruleId, error });
      throw error;
    }
  }

  async validateRule(assetId: string, ruleId: string): Promise<boolean> {
    try {
      const quality = this.qualities.get(assetId);
      const asset = this.assets.get(assetId);

      if (!quality || !asset) {
        return false;
      }

      const rule = quality.rules.find(r => r.id === ruleId);
      if (!rule) {
        return false;
      }

      const isValid = await this.executeQualityRule(asset, rule);

      if (!isValid) {
        // Create quality issue
        const issue: QualityIssue = {
          id: this.generateId(),
          ruleId: rule.id,
          type: rule.type,
          severity: rule.severity,
          message: `Quality rule validation failed: ${rule.description}`,
          detectedAt: new Date(),
          resolved: false
        };

        quality.issues.push(issue);
        this.qualities.set(assetId, quality);

        await this.eventBus.emit('data.quality.issue.detected', {
          assetId,
          issue,
          rule,
          timestamp: new Date()
        });
      }

      return isValid;
    } catch (error) {
      this.logger.error('Failed to validate quality rule', { assetId, ruleId, error });
      throw error;
    }
  }

  async reportIssue(assetId: string, issue: Omit<QualityIssue, 'id' | 'detectedAt'>): Promise<void> {
    try {
      let quality = this.qualities.get(assetId);
      if (!quality) {
        quality = this.createEmptyQuality(assetId);
      }

      const fullIssue: QualityIssue = {
        ...issue,
        id: this.generateId(),
        detectedAt: new Date()
      };

      quality.issues.push(fullIssue);
      this.qualities.set(assetId, quality);

      this.logger.info('Quality issue reported', {
        assetId,
        issueId: fullIssue.id,
        severity: fullIssue.severity
      });

      await this.eventBus.emit('data.quality.issue.reported', {
        assetId,
        issue: fullIssue,
        timestamp: new Date()
      });
    } catch (error) {
      this.logger.error('Failed to report quality issue', { assetId, issue, error });
      throw error;
    }
  }

  async resolveIssue(assetId: string, issueId: string): Promise<void> {
    try {
      const quality = this.qualities.get(assetId);
      if (!quality) {
        throw new Error(`Quality not found for asset ${assetId}`);
      }

      const issue = quality.issues.find(i => i.id === issueId);
      if (!issue) {
        throw new Error(`Issue ${issueId} not found for asset ${assetId}`);
      }

      issue.resolved = true;
      issue.resolvedAt = new Date();
      this.qualities.set(assetId, quality);

      this.logger.info('Quality issue resolved', { assetId, issueId });

      await this.eventBus.emit('data.quality.issue.resolved', {
        assetId,
        issue,
        timestamp: new Date()
      });
    } catch (error) {
      this.logger.error('Failed to resolve quality issue', { assetId, issueId, error });
      throw error;
    }
  }

  async getTrendAnalysis(assetId: string, timeframe: string): Promise<QualityTrend> {
    try {
      const quality = this.qualities.get(assetId);
      if (!quality) {
        throw new Error(`Quality not found for asset ${assetId}`);
      }

      // Filter trend data by timeframe
      const filteredTrend = this.filterTrendByTimeframe(quality.trend, timeframe);

      // Calculate trend direction and change rate
      const trendAnalysis = this.analyzeTrend(filteredTrend.dataPoints);

      return {
        ...filteredTrend,
        trend: trendAnalysis.direction,
        changeRate: trendAnalysis.changeRate
      };
    } catch (error) {
      this.logger.error('Failed to get trend analysis', { assetId, timeframe, error });
      throw error;
    }
  }

  async getQualityMetrics(filters?: Record<string, any>): Promise<any> {
    try {
      let qualities = Array.from(this.qualities.values());

      // Apply filters if provided
      if (filters) {
        const assets = Array.from(this.assets.values());
        const filteredAssets = assets.filter(asset => {
          return Object.entries(filters).every(([key, value]) => {
            if (key === 'type') return asset.type === value;
            if (key === 'owner') return asset.owner === value;
            if (key === 'classification') return asset.classification === value;
            return true;
          });
        });

        qualities = qualities.filter(quality =>
          filteredAssets.some(asset => asset.id === quality.assetId)
        );
      }

      // Calculate aggregate metrics
      const metrics = {
        totalAssets: qualities.length,
        averageQualityScore: this.calculateAverageScore(qualities),
        qualityDistribution: this.calculateQualityDistribution(qualities),
        topIssues: this.getTopQualityIssues(qualities),
        trendSummary: this.getTrendSummary(qualities),
        ruleCompliance: this.calculateRuleCompliance(qualities)
      };

      this.logger.info('Quality metrics calculated', {
        totalAssets: metrics.totalAssets,
        averageScore: metrics.averageQualityScore
      });

      return metrics;
    } catch (error) {
      this.logger.error('Failed to get quality metrics', { filters, error });
      throw error;
    }
  }

  async generateQualityReport(assetIds: string[]): Promise<any> {
    try {
      const reportData = {
        summary: {
          totalAssets: assetIds.length,
          assessmentDate: new Date(),
          averageScore: 0,
          criticalIssues: 0,
          highIssues: 0
        },
        assetDetails: [] as any[],
        recommendations: [] as string[]
      };

      let totalScore = 0;
      let criticalCount = 0;
      let highCount = 0;

      for (const assetId of assetIds) {
        const quality = this.qualities.get(assetId);
        const asset = this.assets.get(assetId);

        if (quality && asset) {
          totalScore += quality.overallScore;

          const criticalIssuesCount = quality.issues.filter(i =>
            i.severity === 'critical' && !i.resolved
          ).length;
          const highIssuesCount = quality.issues.filter(i =>
            i.severity === 'high' && !i.resolved
          ).length;

          criticalCount += criticalIssuesCount;
          highCount += highIssuesCount;

          reportData.assetDetails.push({
            assetId,
            assetName: asset.name,
            qualityScore: quality.overallScore,
            dimensions: quality.dimensions,
            openIssues: quality.issues.filter(i => !i.resolved).length,
            criticalIssues: criticalIssuesCount,
            highIssues: highIssuesCount,
            lastAssessment: quality.lastAssessment
          });
        }
      }

      // Guard against division by zero when no asset IDs are supplied
      reportData.summary.averageScore = assetIds.length > 0
        ? Math.round(totalScore / assetIds.length)
        : 0;
      reportData.summary.criticalIssues = criticalCount;
      reportData.summary.highIssues = highCount;

      // Generate recommendations
      reportData.recommendations = this.generateQualityRecommendations(reportData);

      this.logger.info('Quality report generated', {
        assetCount: assetIds.length,
        averageScore: reportData.summary.averageScore,
        criticalIssues: criticalCount
      });

      return reportData;
    } catch (error) {
      this.logger.error('Failed to generate quality report', { assetIds, error });
      throw error;
    }
  }

// Private helper methods
|
||||
private createEmptyQuality(assetId: string): DataQuality {
|
||||
return {
|
||||
id: this.generateId(),
|
||||
assetId,
|
||||
overallScore: 100,
|
||||
dimensions: [],
|
||||
rules: [],
|
||||
issues: [],
|
||||
trend: {
|
||||
timeframe: 'week',
|
||||
dataPoints: [],
|
||||
trend: 'stable',
|
||||
changeRate: 0
|
||||
},
|
||||
lastAssessment: new Date()
|
||||
};
|
||||
}
|
||||
|
||||

  private async performQualityAssessment(
    asset: DataAsset,
    request: QualityAssessmentRequest
  ): Promise<{ dimensions: QualityDimension[] }> {
    const dimensions: QualityDimension[] = [];

    // Completeness assessment
    if (request.checkCompleteness) {
      const completeness = await this.assessCompleteness(asset);
      dimensions.push(completeness);
    }

    // Accuracy assessment
    if (request.checkAccuracy) {
      const accuracy = await this.assessAccuracy(asset);
      dimensions.push(accuracy);
    }

    // Consistency assessment
    if (request.checkConsistency) {
      const consistency = await this.assessConsistency(asset);
      dimensions.push(consistency);
    }

    // Validity assessment
    if (request.checkValidity) {
      const validity = await this.assessValidity(asset);
      dimensions.push(validity);
    }

    // Timeliness assessment
    if (request.checkTimeliness) {
      const timeliness = await this.assessTimeliness(asset);
      dimensions.push(timeliness);
    }

    // Uniqueness assessment
    if (request.checkUniqueness) {
      const uniqueness = await this.assessUniqueness(asset);
      dimensions.push(uniqueness);
    }

    return { dimensions };
  }

  private async assessCompleteness(asset: DataAsset): Promise<QualityDimension> {
    // Mock implementation - in real scenario, this would analyze actual data
    const score = Math.floor(Math.random() * 20) + 80; // 80-100

    return {
      name: 'completeness',
      score,
      description: 'Measures the degree to which data is complete',
      rules: ['No null values in required fields'],
      threshold: 95,
      lastChecked: new Date()
    };
  }

  private async assessAccuracy(asset: DataAsset): Promise<QualityDimension> {
    const score = Math.floor(Math.random() * 15) + 85; // 85-100

    return {
      name: 'accuracy',
      score,
      description: 'Measures how well data represents real-world values',
      rules: ['Values within expected ranges', 'Format validation'],
      threshold: 90,
      lastChecked: new Date()
    };
  }

  private async assessConsistency(asset: DataAsset): Promise<QualityDimension> {
    const score = Math.floor(Math.random() * 25) + 75; // 75-100

    return {
      name: 'consistency',
      score,
      description: 'Measures uniformity of data across datasets',
      rules: ['Consistent data types', 'Standardized formats'],
      threshold: 85,
      lastChecked: new Date()
    };
  }

  private async assessValidity(asset: DataAsset): Promise<QualityDimension> {
    const score = Math.floor(Math.random() * 20) + 80; // 80-100

    return {
      name: 'validity',
      score,
      description: 'Measures conformity to defined business rules',
      rules: ['Business rule compliance', 'Schema validation'],
      threshold: 90,
      lastChecked: new Date()
    };
  }

  private async assessTimeliness(asset: DataAsset): Promise<QualityDimension> {
    const score = Math.floor(Math.random() * 30) + 70; // 70-100

    return {
      name: 'timeliness',
      score,
      description: 'Measures how up-to-date the data is',
      rules: ['Data refreshed within SLA', 'Timestamp validation'],
      threshold: 85,
      lastChecked: new Date()
    };
  }

  private async assessUniqueness(asset: DataAsset): Promise<QualityDimension> {
    const score = Math.floor(Math.random() * 25) + 75; // 75-100

    return {
      name: 'uniqueness',
      score,
      description: 'Measures absence of duplicate records',
      rules: ['No duplicate primary keys', 'Unique constraints enforced'],
      threshold: 95,
      lastChecked: new Date()
    };
  }

  private async executeQualityRule(asset: DataAsset, rule: QualityRule): Promise<boolean> {
    // Mock implementation - in real scenario, this would execute the actual rule
    // For demo purposes, randomly pass/fail rules
    const passRate = rule.severity === 'critical' ? 0.9 : 0.95;
    return Math.random() < passRate;
  }

  private calculateOverallScore(dimensions: QualityDimension[]): number {
    if (dimensions.length === 0) return 100;

    const totalScore = dimensions.reduce((sum, dim) => sum + dim.score, 0);
    return Math.round(totalScore / dimensions.length);
  }

  private updateQualityTrend(quality: DataQuality, newScore: number): void {
    quality.trend.dataPoints.push({
      timestamp: new Date(),
      value: newScore
    });

    // Keep only last 30 data points
    if (quality.trend.dataPoints.length > 30) {
      quality.trend.dataPoints = quality.trend.dataPoints.slice(-30);
    }

    // Update trend analysis
    const trendAnalysis = this.analyzeTrend(quality.trend.dataPoints);
    quality.trend.trend = trendAnalysis.direction;
    quality.trend.changeRate = trendAnalysis.changeRate;
  }

  private filterTrendByTimeframe(trend: QualityTrend, timeframe: string): QualityTrend {
    const now = new Date();
    let cutoffDate: Date;

    switch (timeframe) {
      case 'day':
        cutoffDate = new Date(now.getTime() - 24 * 60 * 60 * 1000);
        break;
      case 'week':
        cutoffDate = new Date(now.getTime() - 7 * 24 * 60 * 60 * 1000);
        break;
      case 'month':
        cutoffDate = new Date(now.getTime() - 30 * 24 * 60 * 60 * 1000);
        break;
      default:
        cutoffDate = new Date(0); // All time
    }

    const filteredDataPoints = trend.dataPoints.filter(dp => dp.timestamp >= cutoffDate);

    return {
      ...trend,
      timeframe,
      dataPoints: filteredDataPoints
    };
  }

  private analyzeTrend(dataPoints: { timestamp: Date; value: number }[]): { direction: 'improving' | 'declining' | 'stable'; changeRate: number } {
    if (dataPoints.length < 2) {
      return { direction: 'stable', changeRate: 0 };
    }

    const values = dataPoints.map(dp => dp.value);
    const firstValue = values[0];
    const lastValue = values[values.length - 1];
    const changeRate = ((lastValue - firstValue) / firstValue) * 100;

    let direction: 'improving' | 'declining' | 'stable';
    if (Math.abs(changeRate) < 2) {
      direction = 'stable';
    } else if (changeRate > 0) {
      direction = 'improving';
    } else {
      direction = 'declining';
    }

    return { direction, changeRate: Math.round(changeRate * 100) / 100 };
  }
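
  // Worked example (hypothetical data points, not from the source): scores
  // [80, 90] give changeRate = ((90 - 80) / 80) * 100 = 12.5 -> 'improving';
  // scores [90, 89] give about -1.11, inside the 2% band -> 'stable'.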

  private calculateAverageScore(qualities: DataQuality[]): number {
    if (qualities.length === 0) return 0;

    const totalScore = qualities.reduce((sum, quality) => sum + quality.overallScore, 0);
    return Math.round(totalScore / qualities.length);
  }

  private calculateQualityDistribution(qualities: DataQuality[]): Record<string, number> {
    const distribution = { excellent: 0, good: 0, fair: 0, poor: 0 };

    qualities.forEach(quality => {
      if (quality.overallScore >= 90) distribution.excellent++;
      else if (quality.overallScore >= 80) distribution.good++;
      else if (quality.overallScore >= 70) distribution.fair++;
      else distribution.poor++;
    });

    return distribution;
  }

  private getTopQualityIssues(qualities: DataQuality[]): Array<{ type: string; count: number }> {
    const issueTypes = new Map<string, number>();

    qualities.forEach(quality => {
      quality.issues.filter(issue => !issue.resolved).forEach(issue => {
        issueTypes.set(issue.type, (issueTypes.get(issue.type) || 0) + 1);
      });
    });

    return Array.from(issueTypes.entries())
      .map(([type, count]) => ({ type, count }))
      .sort((a, b) => b.count - a.count)
      .slice(0, 5);
  }

  private getTrendSummary(qualities: DataQuality[]): Record<string, number> {
    const trends = { improving: 0, declining: 0, stable: 0 };

    qualities.forEach(quality => {
      trends[quality.trend.trend]++;
    });

    return trends;
  }

  private calculateRuleCompliance(qualities: DataQuality[]): number {
    let totalRules = 0;
    let passedRules = 0;

    qualities.forEach(quality => {
      totalRules += quality.rules.length;
      // Mock compliance calculation
      passedRules += Math.floor(quality.rules.length * (quality.overallScore / 100));
    });

    return totalRules > 0 ? Math.round((passedRules / totalRules) * 100) : 100;
  }

  private generateQualityRecommendations(reportData: any): string[] {
    const recommendations: string[] = [];

    if (reportData.summary.averageScore < 80) {
      recommendations.push('Overall data quality is below acceptable threshold. Consider implementing comprehensive data quality monitoring.');
    }

    if (reportData.summary.criticalIssues > 0) {
      recommendations.push(`${reportData.summary.criticalIssues} critical quality issues require immediate attention.`);
    }

    if (reportData.summary.highIssues > 5) {
      recommendations.push('High number of quality issues detected. Review data validation processes.');
    }

    // Asset-specific recommendations
    const lowScoreAssets = reportData.assetDetails.filter((asset: any) => asset.qualityScore < 70);
    if (lowScoreAssets.length > 0) {
      recommendations.push(`${lowScoreAssets.length} assets have quality scores below 70% and need immediate remediation.`);
    }

    if (recommendations.length === 0) {
      recommendations.push('Data quality is within acceptable ranges. Continue monitoring and maintain current practices.');
    }

    return recommendations;
  }

  private generateId(): string {
    // slice() replaces the deprecated String.prototype.substr()
    return `quality_${Date.now()}_${Math.random().toString(36).slice(2, 11)}`;
  }

  // Method to inject assets (typically from DataCatalogService)
  setAssets(assets: Map<string, DataAsset>): void {
    this.assets = assets;
  }
}
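The score bucketing used by `calculateQualityDistribution` above can be exercised in isolation. A minimal sketch for illustration, with `DataQuality` reduced to a bare numeric score (the `distribute` name is an assumption, not from the source):

```typescript
// Bucket thresholds mirror the class above:
// >= 90 excellent, >= 80 good, >= 70 fair, otherwise poor.
type Distribution = { excellent: number; good: number; fair: number; poor: number };

function distribute(scores: number[]): Distribution {
  const d: Distribution = { excellent: 0, good: 0, fair: 0, poor: 0 };
  for (const s of scores) {
    if (s >= 90) d.excellent++;
    else if (s >= 80) d.good++;
    else if (s >= 70) d.fair++;
    else d.poor++;
  }
  return d;
}

console.log(distribute([95, 85, 72, 40]));
// { excellent: 1, good: 1, fair: 1, poor: 1 }
```

Note the boundaries are inclusive on the lower edge: a score of exactly 90 counts as excellent, 80 as good, 70 as fair.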
801 apps/data-services/data-catalog/src/services/SearchService.ts Normal file

@@ -0,0 +1,801 @@
import { EventBus } from '@stock-bot/event-bus';
import { Logger } from '@stock-bot/utils';
import {
  DataAsset,
  SearchQuery,
  SearchResult,
  SearchFilters,
  SearchSuggestion,
  DataAssetType,
  DataClassification
} from '../types/DataCatalog';

export interface SearchService {
  search(query: SearchQuery): Promise<SearchResult>;
  suggest(partial: string): Promise<SearchSuggestion[]>;
  searchByFacets(facets: Record<string, string[]>): Promise<DataAsset[]>;
  searchSimilar(assetId: string, limit?: number): Promise<DataAsset[]>;
  getPopularSearches(limit?: number): Promise<string[]>;
  getRecentSearches(userId: string, limit?: number): Promise<string[]>;
  indexAsset(asset: DataAsset): Promise<void>;
  removeFromIndex(assetId: string): Promise<void>;
  reindexAll(): Promise<void>;
  getSearchAnalytics(timeframe?: string): Promise<any>;
}

export class SearchServiceImpl implements SearchService {
  private searchIndex: Map<string, DataAsset> = new Map();
  private searchHistory: Array<{ query: string; userId?: string; timestamp: Date; resultCount: number }> = [];
  private assets: Map<string, DataAsset> = new Map();

  // In-memory inverted index for search
  private wordToAssets: Map<string, Set<string>> = new Map();
  private tagToAssets: Map<string, Set<string>> = new Map();
  private typeToAssets: Map<string, Set<string>> = new Map();
  private classificationToAssets: Map<string, Set<string>> = new Map();
  private ownerToAssets: Map<string, Set<string>> = new Map();

  constructor(
    private eventBus: EventBus,
    private logger: Logger
  ) {}

  async search(query: SearchQuery): Promise<SearchResult> {
    try {
      const startTime = Date.now();
      let results: DataAsset[] = [];

      if (query.text) {
        results = await this.performTextSearch(query.text);
      } else {
        results = Array.from(this.assets.values());
      }

      // Apply filters
      if (query.filters) {
        results = this.applyFilters(results, query.filters);
      }

      // Sort results
      results = this.sortResults(results, query.sortBy, query.sortOrder);

      // Apply pagination
      const total = results.length;
      const offset = query.offset || 0;
      const limit = query.limit || 20;
      const paginatedResults = results.slice(offset, offset + limit);

      // Calculate facets
      const facets = this.calculateFacets(results);

      const searchTime = Date.now() - startTime;

      const searchResult: SearchResult = {
        assets: paginatedResults,
        total,
        offset,
        limit,
        searchTime,
        facets,
        suggestions: await this.generateSearchSuggestions(query.text || '', results)
      };

      // Record search in history
      this.recordSearch(query.text || '', query.userId, total);

      this.logger.info('Search completed', {
        query: query.text,
        resultCount: total,
        searchTime
      });

      await this.eventBus.emit('data.catalog.search.performed', {
        query,
        resultCount: total,
        searchTime,
        timestamp: new Date()
      });

      return searchResult;
    } catch (error) {
      this.logger.error('Search failed', { query, error });
      throw error;
    }
  }

  async suggest(partial: string): Promise<SearchSuggestion[]> {
    try {
      const suggestions: SearchSuggestion[] = [];
      const normalizedPartial = partial.toLowerCase().trim();

      if (normalizedPartial.length < 2) {
        return suggestions;
      }

      // Asset name suggestions
      for (const asset of this.assets.values()) {
        if (asset.name.toLowerCase().includes(normalizedPartial)) {
          suggestions.push({
            text: asset.name,
            type: 'asset_name',
            count: 1,
            highlight: this.highlightMatch(asset.name, partial)
          });
        }
      }

      // Tag suggestions
      const tagCounts = new Map<string, number>();
      for (const asset of this.assets.values()) {
        for (const tag of asset.tags) {
          if (tag.toLowerCase().includes(normalizedPartial)) {
            tagCounts.set(tag, (tagCounts.get(tag) || 0) + 1);
          }
        }
      }

      for (const [tag, count] of tagCounts) {
        suggestions.push({
          text: tag,
          type: 'tag',
          count,
          highlight: this.highlightMatch(tag, partial)
        });
      }

      // Owner suggestions
      const ownerCounts = new Map<string, number>();
      for (const asset of this.assets.values()) {
        if (asset.owner.toLowerCase().includes(normalizedPartial)) {
          ownerCounts.set(asset.owner, (ownerCounts.get(asset.owner) || 0) + 1);
        }
      }

      for (const [owner, count] of ownerCounts) {
        suggestions.push({
          text: owner,
          type: 'owner',
          count,
          highlight: this.highlightMatch(owner, partial)
        });
      }

      // Popular search suggestions
      const popularSearches = this.getPopularSearchTerms().filter(term =>
        term.toLowerCase().includes(normalizedPartial)
      );

      for (const search of popularSearches.slice(0, 5)) {
        suggestions.push({
          text: search,
          type: 'popular_search',
          count: this.getSearchCount(search),
          highlight: this.highlightMatch(search, partial)
        });
      }

      // Sort by relevance and count
      return suggestions
        .sort((a, b) => {
          // Prefer exact matches
          const aExact = a.text.toLowerCase().startsWith(normalizedPartial) ? 1 : 0;
          const bExact = b.text.toLowerCase().startsWith(normalizedPartial) ? 1 : 0;

          if (aExact !== bExact) return bExact - aExact;

          // Then by count
          return b.count - a.count;
        })
        .slice(0, 10);
    } catch (error) {
      this.logger.error('Suggestion generation failed', { partial, error });
      throw error;
    }
  }

  async searchByFacets(facets: Record<string, string[]>): Promise<DataAsset[]> {
    try {
      let results: Set<string> = new Set();
      let isFirstFacet = true;

      for (const [facetType, values] of Object.entries(facets)) {
        const facetResults = new Set<string>();

        for (const value of values) {
          let assetIds: Set<string> | undefined;

          switch (facetType) {
            case 'type':
              assetIds = this.typeToAssets.get(value);
              break;
            case 'classification':
              assetIds = this.classificationToAssets.get(value);
              break;
            case 'owner':
              assetIds = this.ownerToAssets.get(value);
              break;
            case 'tags':
              assetIds = this.tagToAssets.get(value);
              break;
          }

          if (assetIds) {
            for (const assetId of assetIds) {
              facetResults.add(assetId);
            }
          }
        }

        if (isFirstFacet) {
          results = facetResults;
          isFirstFacet = false;
        } else {
          // Intersection of results
          results = new Set([...results].filter(id => facetResults.has(id)));
        }
      }

      const assets = Array.from(results)
        .map(id => this.assets.get(id))
        .filter((asset): asset is DataAsset => asset !== undefined);

      this.logger.info('Facet search completed', {
        facets,
        resultCount: assets.length
      });

      return assets;
    } catch (error) {
      this.logger.error('Facet search failed', { facets, error });
      throw error;
    }
  }

  async searchSimilar(assetId: string, limit: number = 10): Promise<DataAsset[]> {
    try {
      const targetAsset = this.assets.get(assetId);
      if (!targetAsset) {
        return [];
      }

      const similarities: Array<{ asset: DataAsset; score: number }> = [];

      for (const asset of this.assets.values()) {
        if (asset.id === assetId) continue;

        const score = this.calculateSimilarity(targetAsset, asset);
        if (score > 0.1) { // Minimum similarity threshold
          similarities.push({ asset, score });
        }
      }

      // Sort by similarity score and return top results
      const results = similarities
        .sort((a, b) => b.score - a.score)
        .slice(0, limit)
        .map(item => item.asset);

      this.logger.info('Similar assets found', {
        assetId,
        similarCount: results.length
      });

      return results;
    } catch (error) {
      this.logger.error('Similar asset search failed', { assetId, error });
      throw error;
    }
  }

  async getPopularSearches(limit: number = 10): Promise<string[]> {
    try {
      const searchCounts = new Map<string, number>();

      // Count search frequency
      for (const search of this.searchHistory) {
        if (search.query) {
          searchCounts.set(search.query, (searchCounts.get(search.query) || 0) + 1);
        }
      }

      // Sort by frequency and return top searches
      return Array.from(searchCounts.entries())
        .sort((a, b) => b[1] - a[1])
        .slice(0, limit)
        .map(([query]) => query);
    } catch (error) {
      this.logger.error('Failed to get popular searches', { error });
      throw error;
    }
  }

  async getRecentSearches(userId: string, limit: number = 10): Promise<string[]> {
    try {
      return this.searchHistory
        .filter(search => search.userId === userId && search.query)
        .sort((a, b) => b.timestamp.getTime() - a.timestamp.getTime())
        .slice(0, limit)
        .map(search => search.query);
    } catch (error) {
      this.logger.error('Failed to get recent searches', { userId, error });
      throw error;
    }
  }

  async indexAsset(asset: DataAsset): Promise<void> {
    try {
      // Add to main index
      this.searchIndex.set(asset.id, asset);
      this.assets.set(asset.id, asset);

      // Update inverted indices
      this.updateInvertedIndices(asset);

      this.logger.debug('Asset indexed', { assetId: asset.id, name: asset.name });

      await this.eventBus.emit('data.catalog.asset.indexed', {
        assetId: asset.id,
        timestamp: new Date()
      });
    } catch (error) {
      this.logger.error('Failed to index asset', { asset, error });
      throw error;
    }
  }

  async removeFromIndex(assetId: string): Promise<void> {
    try {
      const asset = this.searchIndex.get(assetId);
      if (!asset) {
        return;
      }

      // Remove from main index
      this.searchIndex.delete(assetId);
      this.assets.delete(assetId);

      // Remove from inverted indices
      this.removeFromInvertedIndices(asset);

      this.logger.debug('Asset removed from index', { assetId });

      await this.eventBus.emit('data.catalog.asset.unindexed', {
        assetId,
        timestamp: new Date()
      });
    } catch (error) {
      this.logger.error('Failed to remove asset from index', { assetId, error });
      throw error;
    }
  }

  async reindexAll(): Promise<void> {
    try {
      // Clear all indices
      this.searchIndex.clear();
      this.wordToAssets.clear();
      this.tagToAssets.clear();
      this.typeToAssets.clear();
      this.classificationToAssets.clear();
      this.ownerToAssets.clear();

      // Reindex all assets
      for (const asset of this.assets.values()) {
        await this.indexAsset(asset);
      }

      this.logger.info('Search index rebuilt', { assetCount: this.assets.size });

      await this.eventBus.emit('data.catalog.index.rebuilt', {
        assetCount: this.assets.size,
        timestamp: new Date()
      });
    } catch (error) {
      this.logger.error('Failed to rebuild search index', { error });
      throw error;
    }
  }

  async getSearchAnalytics(timeframe: string = 'week'): Promise<any> {
    try {
      const now = new Date();
      let cutoffDate: Date;

      switch (timeframe) {
        case 'day':
          cutoffDate = new Date(now.getTime() - 24 * 60 * 60 * 1000);
          break;
        case 'week':
          cutoffDate = new Date(now.getTime() - 7 * 24 * 60 * 60 * 1000);
          break;
        case 'month':
          cutoffDate = new Date(now.getTime() - 30 * 24 * 60 * 60 * 1000);
          break;
        default:
          cutoffDate = new Date(0);
      }

      const recentSearches = this.searchHistory.filter(search => search.timestamp >= cutoffDate);

      const analytics = {
        totalSearches: recentSearches.length,
        uniqueQueries: new Set(recentSearches.map(s => s.query)).size,
        averageResults: recentSearches.length > 0 ?
          recentSearches.reduce((sum, s) => sum + s.resultCount, 0) / recentSearches.length : 0,
        noResultQueries: recentSearches.filter(s => s.resultCount === 0).length,
        topQueries: this.getTopQueries(recentSearches, 10),
        searchTrend: this.calculateSearchTrend(recentSearches, timeframe),
        facetUsage: this.getFacetUsage(recentSearches)
      };

      return analytics;
    } catch (error) {
      this.logger.error('Failed to get search analytics', { timeframe, error });
      throw error;
    }
  }

  // Private helper methods
  private async performTextSearch(text: string): Promise<DataAsset[]> {
    const words = this.tokenize(text);
    const assetScores = new Map<string, number>();

    for (const word of words) {
      const assetIds = this.wordToAssets.get(word) || new Set();

      for (const assetId of assetIds) {
        assetScores.set(assetId, (assetScores.get(assetId) || 0) + 1);
      }
    }

    // Sort by relevance score
    const sortedAssetIds = Array.from(assetScores.entries())
      .sort((a, b) => b[1] - a[1])
      .map(([assetId]) => assetId);

    return sortedAssetIds
      .map(id => this.assets.get(id))
      .filter((asset): asset is DataAsset => asset !== undefined);
  }

  private applyFilters(assets: DataAsset[], filters: SearchFilters): DataAsset[] {
    return assets.filter(asset => {
      if (filters.types && filters.types.length > 0) {
        if (!filters.types.includes(asset.type)) return false;
      }

      if (filters.classifications && filters.classifications.length > 0) {
        if (!filters.classifications.includes(asset.classification)) return false;
      }

      if (filters.owners && filters.owners.length > 0) {
        if (!filters.owners.includes(asset.owner)) return false;
      }

      if (filters.tags && filters.tags.length > 0) {
        if (!filters.tags.some(tag => asset.tags.includes(tag))) return false;
      }

      if (filters.createdAfter) {
        if (asset.createdAt < filters.createdAfter) return false;
      }

      if (filters.createdBefore) {
        if (asset.createdAt > filters.createdBefore) return false;
      }

      return true;
    });
  }

  private sortResults(assets: DataAsset[], sortBy?: string, sortOrder?: 'asc' | 'desc'): DataAsset[] {
    if (!sortBy) {
      return assets; // Return as-is (relevance order)
    }

    const order = sortOrder === 'desc' ? -1 : 1;

    return assets.sort((a, b) => {
      let comparison = 0;

      switch (sortBy) {
        case 'name':
          comparison = a.name.localeCompare(b.name);
          break;
        case 'createdAt':
          comparison = a.createdAt.getTime() - b.createdAt.getTime();
          break;
        case 'updatedAt':
          comparison = a.updatedAt.getTime() - b.updatedAt.getTime();
          break;
        case 'lastAccessed': {
          const aAccessed = a.lastAccessed?.getTime() || 0;
          const bAccessed = b.lastAccessed?.getTime() || 0;
          comparison = aAccessed - bAccessed;
          break;
        }
        case 'usage':
          comparison = a.usage.accessCount - b.usage.accessCount;
          break;
        default:
          comparison = 0;
      }

      return comparison * order;
    });
  }

  private calculateFacets(assets: DataAsset[]): Record<string, Array<{ value: string; count: number }>> {
    const facets: Record<string, Map<string, number>> = {
      types: new Map(),
      classifications: new Map(),
      owners: new Map(),
      tags: new Map()
    };

    for (const asset of assets) {
      // Type facet
      facets.types.set(asset.type, (facets.types.get(asset.type) || 0) + 1);

      // Classification facet
      facets.classifications.set(asset.classification, (facets.classifications.get(asset.classification) || 0) + 1);

      // Owner facet
      facets.owners.set(asset.owner, (facets.owners.get(asset.owner) || 0) + 1);

      // Tags facet
      for (const tag of asset.tags) {
        facets.tags.set(tag, (facets.tags.get(tag) || 0) + 1);
      }
    }

    // Convert to required format
    const result: Record<string, Array<{ value: string; count: number }>> = {};

    for (const [facetName, facetMap] of Object.entries(facets)) {
      result[facetName] = Array.from(facetMap.entries())
        .map(([value, count]) => ({ value, count }))
        .sort((a, b) => b.count - a.count);
    }

    return result;
  }

  private async generateSearchSuggestions(query: string, results: DataAsset[]): Promise<string[]> {
    if (!query || results.length === 0) {
      return [];
    }

    const suggestions: string[] = [];

    // Extract common tags from results
    const tagCounts = new Map<string, number>();
    for (const asset of results.slice(0, 10)) { // Top 10 results
      for (const tag of asset.tags) {
        tagCounts.set(tag, (tagCounts.get(tag) || 0) + 1);
      }
    }

    // Add top tags as suggestions
    const topTags = Array.from(tagCounts.entries())
      .sort((a, b) => b[1] - a[1])
      .slice(0, 3)
      .map(([tag]) => `${query} ${tag}`);

    suggestions.push(...topTags);

    return suggestions;
  }

  private updateInvertedIndices(asset: DataAsset): void {
    // Index words from name and description
    const words = [
      ...this.tokenize(asset.name),
      ...this.tokenize(asset.description)
    ];

    for (const word of words) {
      if (!this.wordToAssets.has(word)) {
        this.wordToAssets.set(word, new Set());
      }
      this.wordToAssets.get(word)!.add(asset.id);
    }

    // Index tags
    for (const tag of asset.tags) {
      if (!this.tagToAssets.has(tag)) {
        this.tagToAssets.set(tag, new Set());
      }
      this.tagToAssets.get(tag)!.add(asset.id);
    }

    // Index type
    if (!this.typeToAssets.has(asset.type)) {
      this.typeToAssets.set(asset.type, new Set());
    }
    this.typeToAssets.get(asset.type)!.add(asset.id);

    // Index classification
    if (!this.classificationToAssets.has(asset.classification)) {
      this.classificationToAssets.set(asset.classification, new Set());
    }
    this.classificationToAssets.get(asset.classification)!.add(asset.id);

    // Index owner
    if (!this.ownerToAssets.has(asset.owner)) {
      this.ownerToAssets.set(asset.owner, new Set());
    }
    this.ownerToAssets.get(asset.owner)!.add(asset.id);
  }

  private removeFromInvertedIndices(asset: DataAsset): void {
    // Remove from word index
    const words = [
      ...this.tokenize(asset.name),
      ...this.tokenize(asset.description)
    ];

    for (const word of words) {
      const assetSet = this.wordToAssets.get(word);
      if (assetSet) {
        assetSet.delete(asset.id);
        if (assetSet.size === 0) {
          this.wordToAssets.delete(word);
        }
      }
    }

    // Remove from other indices. The helper is named removeFromFacetIndex
    // to avoid colliding with the public removeFromIndex(assetId) method.
    this.removeFromFacetIndex(this.tagToAssets, asset.tags, asset.id);
    this.removeFromFacetIndex(this.typeToAssets, [asset.type], asset.id);
    this.removeFromFacetIndex(this.classificationToAssets, [asset.classification], asset.id);
    this.removeFromFacetIndex(this.ownerToAssets, [asset.owner], asset.id);
  }

  private removeFromFacetIndex(index: Map<string, Set<string>>, values: string[], assetId: string): void {
    for (const value of values) {
      const assetSet = index.get(value);
      if (assetSet) {
        assetSet.delete(assetId);
        if (assetSet.size === 0) {
          index.delete(value);
        }
      }
    }
  }
private tokenize(text: string): string[] {
|
||||
return text
|
||||
.toLowerCase()
|
||||
.replace(/[^\w\s]/g, ' ')
|
||||
.split(/\s+/)
|
||||
.filter(word => word.length > 2);
|
||||
}
|
||||
|
||||
private calculateSimilarity(asset1: DataAsset, asset2: DataAsset): number {
|
||||
let score = 0;
|
||||
|
||||
// Type similarity
|
||||
if (asset1.type === asset2.type) score += 0.3;
|
||||
|
||||
// Classification similarity
|
||||
if (asset1.classification === asset2.classification) score += 0.2;
|
||||
|
||||
// Owner similarity
|
||||
if (asset1.owner === asset2.owner) score += 0.1;
|
||||
|
||||
// Tag similarity (Jaccard similarity)
|
||||
const tags1 = new Set(asset1.tags);
|
||||
const tags2 = new Set(asset2.tags);
|
||||
const intersection = new Set([...tags1].filter(tag => tags2.has(tag)));
|
||||
const union = new Set([...tags1, ...tags2]);
|
||||
|
||||
if (union.size > 0) {
|
||||
score += (intersection.size / union.size) * 0.4;
|
||||
}
|
||||
|
||||
return score;
|
||||
}
|
||||
|
||||
private highlightMatch(text: string, query: string): string {
|
||||
const regex = new RegExp(`(${query})`, 'gi');
|
||||
return text.replace(regex, '<mark>$1</mark>');
|
||||
}
|
||||
|
||||
private recordSearch(query: string, userId?: string, resultCount: number = 0): void {
|
||||
this.searchHistory.push({
|
||||
query,
|
||||
userId,
|
||||
timestamp: new Date(),
|
||||
resultCount
|
||||
});
|
||||
|
||||
// Keep only last 1000 searches
|
||||
if (this.searchHistory.length > 1000) {
|
||||
this.searchHistory = this.searchHistory.slice(-1000);
|
||||
}
|
||||
}
|
||||
|
||||
private getPopularSearchTerms(): string[] {
|
||||
const searchCounts = new Map<string, number>();
|
||||
|
||||
for (const search of this.searchHistory) {
|
||||
if (search.query) {
|
||||
searchCounts.set(search.query, (searchCounts.get(search.query) || 0) + 1);
|
||||
}
|
||||
}
|
||||
|
||||
return Array.from(searchCounts.entries())
|
||||
.sort((a, b) => b[1] - a[1])
|
||||
.map(([query]) => query);
|
||||
}
|
||||
|
||||
private getSearchCount(query: string): number {
|
||||
return this.searchHistory.filter(search => search.query === query).length;
|
||||
}
|
||||
|
||||
private getTopQueries(searches: any[], limit: number): Array<{ query: string; count: number }> {
|
||||
const queryCounts = new Map<string, number>();
|
||||
|
||||
for (const search of searches) {
|
||||
if (search.query) {
|
||||
queryCounts.set(search.query, (queryCounts.get(search.query) || 0) + 1);
|
||||
}
|
||||
}
|
||||
|
||||
return Array.from(queryCounts.entries())
|
||||
.map(([query, count]) => ({ query, count }))
|
||||
.sort((a, b) => b.count - a.count)
|
||||
.slice(0, limit);
|
||||
}
|
||||
|
||||
private calculateSearchTrend(searches: any[], timeframe: string): any {
|
||||
// Group searches by day
|
||||
const dailyCounts = new Map<string, number>();
|
||||
|
||||
for (const search of searches) {
|
||||
const day = search.timestamp.toISOString().split('T')[0];
|
||||
dailyCounts.set(day, (dailyCounts.get(day) || 0) + 1);
|
||||
}
|
||||
|
||||
const dataPoints = Array.from(dailyCounts.entries())
|
||||
.map(([date, count]) => ({ date, count }))
|
||||
.sort((a, b) => a.date.localeCompare(b.date));
|
||||
|
||||
return {
|
||||
dataPoints,
|
||||
trend: this.analyzeTrend(dataPoints.map(p => p.count))
|
||||
};
|
||||
}
|
||||
|
||||
private analyzeTrend(values: number[]): string {
|
||||
if (values.length < 2) return 'stable';
|
||||
|
||||
const firstHalf = values.slice(0, Math.floor(values.length / 2));
|
||||
const secondHalf = values.slice(Math.floor(values.length / 2));
|
||||
|
||||
const firstAvg = firstHalf.reduce((sum, val) => sum + val, 0) / firstHalf.length;
|
||||
const secondAvg = secondHalf.reduce((sum, val) => sum + val, 0) / secondHalf.length;
|
||||
|
||||
const changePercent = ((secondAvg - firstAvg) / firstAvg) * 100;
|
||||
|
||||
if (Math.abs(changePercent) < 10) return 'stable';
|
||||
return changePercent > 0 ? 'increasing' : 'decreasing';
|
||||
}
|
||||
|
||||
private getFacetUsage(searches: any[]): Record<string, number> {
|
||||
// Mock facet usage tracking
|
||||
return {
|
||||
types: Math.floor(searches.length * 0.3),
|
||||
classifications: Math.floor(searches.length * 0.2),
|
||||
owners: Math.floor(searches.length * 0.1),
|
||||
tags: Math.floor(searches.length * 0.4)
|
||||
};
|
||||
}
|
||||
|
||||
// Method to inject assets (typically from DataCatalogService)
|
||||
setAssets(assets: Map<string, DataAsset>): void {
|
||||
this.assets = assets;
|
||||
// Reindex all assets when assets are updated
|
||||
this.reindexAll();
|
||||
}
|
||||
}
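The tokenizer and Jaccard-based tag similarity above can be exercised in isolation. A minimal standalone sketch (the free functions `tokenize` and `jaccard` here are illustrative extractions, not part of the service's public API):

```typescript
// Standalone versions of the service's tokenizer and tag-similarity logic.
function tokenize(text: string): string[] {
  return text
    .toLowerCase()
    .replace(/[^\w\s]/g, ' ')   // punctuation becomes whitespace
    .split(/\s+/)
    .filter(word => word.length > 2); // drop very short tokens
}

// Jaccard similarity: |A ∩ B| / |A ∪ B|, always in [0, 1].
function jaccard(tagsA: string[], tagsB: string[]): number {
  const a = new Set(tagsA);
  const b = new Set(tagsB);
  const intersection = [...a].filter(t => b.has(t)).length;
  const union = new Set([...a, ...b]).size;
  return union > 0 ? intersection / union : 0;
}

console.log(tokenize('Real-time OHLCV bars, v2'));
// → ['real', 'time', 'ohlcv', 'bars']  ('v2' is filtered: length <= 2)
console.log(jaccard(['prices', 'equities'], ['prices', 'crypto']));
// → 0.333… (1 shared tag of 3 distinct)
```

In the service this Jaccard term is weighted at 0.4, so tag overlap dominates the blended score over the fixed type/classification/owner bonuses.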
524 apps/data-services/data-catalog/src/types/DataCatalog.ts Normal file
@ -0,0 +1,524 @@
// Data Asset Types
export interface DataAsset {
  id: string;
  name: string;
  type: DataAssetType;
  description: string;
  owner: string;
  steward?: string;
  tags: string[];
  classification: DataClassification;
  schema?: DataSchema;
  location: DataLocation;
  metadata: DataAssetMetadata;
  lineage: DataLineage;
  quality: DataQuality;
  usage: DataUsage;
  governance: DataGovernance;
  createdAt: Date;
  updatedAt: Date;
  lastAccessed?: Date;
}

export enum DataAssetType {
  TABLE = 'table',
  VIEW = 'view',
  DATASET = 'dataset',
  API = 'api',
  FILE = 'file',
  STREAM = 'stream',
  MODEL = 'model',
  FEATURE_GROUP = 'feature_group',
  PIPELINE = 'pipeline',
  REPORT = 'report'
}

export enum DataClassification {
  PUBLIC = 'public',
  INTERNAL = 'internal',
  CONFIDENTIAL = 'confidential',
  RESTRICTED = 'restricted',
  PII = 'pii',
  FINANCIAL = 'financial'
}

export interface DataSchema {
  version: string;
  fields: DataField[];
  primaryKeys?: string[];
  foreignKeys?: ForeignKey[];
  indexes?: Index[];
}

export interface DataField {
  name: string;
  type: string;
  nullable: boolean;
  description?: string;
  constraints?: FieldConstraint[];
  tags?: string[];
  classification?: DataClassification;
}

export interface ForeignKey {
  fields: string[];
  referencedAsset: string;
  referencedFields: string[];
}

export interface Index {
  name: string;
  fields: string[];
  unique: boolean;
  type: 'btree' | 'hash' | 'gin' | 'gist';
}

export interface FieldConstraint {
  type: 'not_null' | 'unique' | 'check' | 'range' | 'pattern';
  value?: any;
  description?: string;
}

export interface DataLocation {
  type: 'database' | 'file_system' | 'cloud_storage' | 'api' | 'stream';
  connection: string;
  path: string;
  format?: string;
  compression?: string;
  partitioning?: PartitionInfo;
}

export interface PartitionInfo {
  fields: string[];
  strategy: 'range' | 'hash' | 'list';
  count?: number;
}

export interface DataAssetMetadata {
  size?: number;
  rowCount?: number;
  columnCount?: number;
  fileFormat?: string;
  encoding?: string;
  delimiter?: string;
  compression?: string;
  checksums?: Record<string, string>;
  customProperties?: Record<string, any>;
}

// Data Lineage Types
export interface DataLineage {
  id: string;
  assetId: string;
  upstreamAssets: LineageEdge[];
  downstreamAssets: LineageEdge[];
  transformations: DataTransformation[];
  impact: ImpactAnalysis;
  createdAt: Date;
  updatedAt: Date;
}

export interface LineageEdge {
  sourceAssetId: string;
  targetAssetId: string;
  relationship: LineageRelationship;
  transformations: string[];
  confidence: number;
  metadata?: Record<string, any>;
}

export enum LineageRelationship {
  DERIVED_FROM = 'derived_from',
  AGGREGATED_FROM = 'aggregated_from',
  JOINED_WITH = 'joined_with',
  FILTERED_FROM = 'filtered_from',
  TRANSFORMED_FROM = 'transformed_from',
  COPIED_FROM = 'copied_from',
  ENRICHED_WITH = 'enriched_with'
}

export interface DataTransformation {
  id: string;
  name: string;
  type: TransformationType;
  description?: string;
  code?: string;
  inputFields: string[];
  outputFields: string[];
  logic: string;
  parameters?: Record<string, any>;
}

export enum TransformationType {
  FILTER = 'filter',
  AGGREGATE = 'aggregate',
  JOIN = 'join',
  UNION = 'union',
  PIVOT = 'pivot',
  UNPIVOT = 'unpivot',
  SORT = 'sort',
  DEDUPLICATE = 'deduplicate',
  CALCULATE = 'calculate',
  CAST = 'cast',
  RENAME = 'rename'
}

export interface ImpactAnalysis {
  downstreamAssets: string[];
  affectedUsers: string[];
  estimatedImpact: 'low' | 'medium' | 'high' | 'critical';
  impactDescription: string;
  recommendations: string[];
}

// Data Quality Types
export interface DataQuality {
  id: string;
  assetId: string;
  overallScore: number;
  dimensions: QualityDimension[];
  rules: QualityRule[];
  issues: QualityIssue[];
  trend: QualityTrend;
  lastAssessment: Date;
  nextAssessment?: Date;
}

export interface QualityDimension {
  name: QualityDimensionType;
  score: number;
  weight: number;
  description: string;
  metrics: QualityMetric[];
}

export enum QualityDimensionType {
  COMPLETENESS = 'completeness',
  ACCURACY = 'accuracy',
  CONSISTENCY = 'consistency',
  VALIDITY = 'validity',
  UNIQUENESS = 'uniqueness',
  TIMELINESS = 'timeliness',
  INTEGRITY = 'integrity'
}

export interface QualityRule {
  id: string;
  name: string;
  description: string;
  dimension: QualityDimensionType;
  type: QualityRuleType;
  field?: string;
  condition: string;
  threshold: number;
  severity: 'low' | 'medium' | 'high' | 'critical';
  enabled: boolean;
}

export enum QualityRuleType {
  NULL_CHECK = 'null_check',
  RANGE_CHECK = 'range_check',
  PATTERN_CHECK = 'pattern_check',
  REFERENCE_CHECK = 'reference_check',
  DUPLICATE_CHECK = 'duplicate_check',
  FRESHNESS_CHECK = 'freshness_check',
  CUSTOM = 'custom'
}

export interface QualityMetric {
  name: string;
  value: number;
  unit?: string;
  threshold?: number;
  status: 'pass' | 'warn' | 'fail';
}

export interface QualityIssue {
  id: string;
  ruleId: string;
  severity: 'low' | 'medium' | 'high' | 'critical';
  description: string;
  field?: string;
  affectedRows?: number;
  detectedAt: Date;
  status: 'open' | 'acknowledged' | 'resolved' | 'false_positive';
  assignee?: string;
  resolution?: string;
  resolvedAt?: Date;
}

export interface QualityTrend {
  timeframe: 'day' | 'week' | 'month';
  dataPoints: QualityDataPoint[];
  trend: 'improving' | 'stable' | 'degrading';
  changeRate: number;
}

export interface QualityDataPoint {
  timestamp: Date;
  score: number;
  dimensionScores: Record<QualityDimensionType, number>;
}

// Data Usage Types
export interface DataUsage {
  id: string;
  assetId: string;
  accessCount: number;
  uniqueUsers: number;
  lastAccessed: Date;
  topUsers: UserUsage[];
  accessPatterns: AccessPattern[];
  popularQueries: PopularQuery[];
  usageTrend: UsageTrend;
}

export interface UserUsage {
  userId: string;
  userName: string;
  accessCount: number;
  lastAccessed: Date;
  accessType: 'read' | 'write' | 'query' | 'download';
}

export interface AccessPattern {
  timeOfDay: number; // hour of day, 0-23
  dayOfWeek: number; // 0 (Sunday) - 6 (Saturday)
  frequency: number;
  accessType: 'read' | 'write' | 'query' | 'download';
}

export interface PopularQuery {
  query: string;
  count: number;
  avgExecutionTime: number;
  lastExecuted: Date;
  users: string[];
}

export interface UsageTrend {
  timeframe: 'day' | 'week' | 'month';
  dataPoints: UsageDataPoint[];
  trend: 'increasing' | 'stable' | 'decreasing';
  changeRate: number;
}

export interface UsageDataPoint {
  timestamp: Date;
  accessCount: number;
  uniqueUsers: number;
  avgResponseTime?: number;
}

// Data Governance Types
export interface DataGovernance {
  id: string;
  assetId: string;
  policies: GovernancePolicy[];
  compliance: ComplianceStatus[];
  retention: RetentionPolicy;
  access: AccessPolicy;
  privacy: PrivacySettings;
  audit: AuditTrail[];
}

export interface GovernancePolicy {
  id: string;
  name: string;
  type: PolicyType;
  description: string;
  rules: PolicyRule[];
  enforcement: 'advisory' | 'preventive' | 'detective';
  status: 'active' | 'inactive' | 'draft';
}

export enum PolicyType {
  ACCESS_CONTROL = 'access_control',
  DATA_RETENTION = 'data_retention',
  DATA_PRIVACY = 'data_privacy',
  DATA_QUALITY = 'data_quality',
  USAGE_MONITORING = 'usage_monitoring',
  COMPLIANCE = 'compliance'
}

export interface PolicyRule {
  id: string;
  condition: string;
  action: string;
  parameters?: Record<string, any>;
}

export interface ComplianceStatus {
  regulation: 'GDPR' | 'CCPA' | 'SOX' | 'HIPAA' | 'PCI_DSS' | 'CUSTOM';
  status: 'compliant' | 'non_compliant' | 'unknown';
  lastAssessment: Date;
  issues: ComplianceIssue[];
}

export interface ComplianceIssue {
  id: string;
  description: string;
  severity: 'low' | 'medium' | 'high' | 'critical';
  requirement: string;
  remediation: string;
  dueDate?: Date;
}

export interface RetentionPolicy {
  retentionPeriod: number; // in days
  archiveAfter?: number; // in days
  deleteAfter?: number; // in days
  retentionReason: string;
  legalHold: boolean;
}

export interface AccessPolicy {
  defaultAccess: 'none' | 'read' | 'write' | 'admin';
  roles: RolePermission[];
  users: UserPermission[];
  conditions?: AccessCondition[];
}

export interface RolePermission {
  role: string;
  permissions: Permission[];
  conditions?: AccessCondition[];
}

export interface UserPermission {
  userId: string;
  permissions: Permission[];
  conditions?: AccessCondition[];
  expiresAt?: Date;
}

export enum Permission {
  READ = 'read',
  WRITE = 'write',
  DELETE = 'delete',
  ADMIN = 'admin',
  QUERY = 'query',
  EXPORT = 'export'
}

export interface AccessCondition {
  type: 'time_based' | 'location_based' | 'purpose_based' | 'data_sensitivity';
  condition: string;
  value: any;
}

export interface PrivacySettings {
  containsPII: boolean;
  sensitiveFields: string[];
  anonymizationRules: AnonymizationRule[];
  consentRequired: boolean;
  dataSubjectRights: DataSubjectRight[];
}

export interface AnonymizationRule {
  field: string;
  method: 'mask' | 'hash' | 'encrypt' | 'tokenize' | 'generalize' | 'suppress';
  parameters?: Record<string, any>;
}

export interface DataSubjectRight {
  type: 'access' | 'rectification' | 'erasure' | 'portability' | 'restriction';
  enabled: boolean;
  automatedResponse: boolean;
}

export interface AuditTrail {
  id: string;
  timestamp: Date;
  userId: string;
  action: string;
  resource: string;
  details: Record<string, any>;
  outcome: 'success' | 'failure';
  ipAddress?: string;
  userAgent?: string;
}

// Search and Discovery Types
export interface SearchRequest {
  query: string;
  filters?: SearchFilter[];
  facets?: string[];
  sortBy?: string;
  sortOrder?: 'asc' | 'desc';
  limit?: number;
  offset?: number;
}

export interface SearchFilter {
  field: string;
  operator: 'eq' | 'ne' | 'gt' | 'gte' | 'lt' | 'lte' | 'in' | 'contains' | 'startswith' | 'endswith';
  value: any;
}

export interface SearchResponse {
  total: number;
  assets: DataAsset[];
  facets: SearchFacet[];
  suggestions: string[];
}

export interface SearchFacet {
  field: string;
  values: FacetValue[];
}

export interface FacetValue {
  value: string;
  count: number;
}

// API Request/Response Types
export interface CreateDataAssetRequest {
  name: string;
  type: DataAssetType;
  description: string;
  owner: string;
  steward?: string;
  tags?: string[];
  classification: DataClassification;
  schema?: DataSchema;
  location: DataLocation;
  metadata?: Partial<DataAssetMetadata>;
  governance?: Partial<DataGovernance>;
}

export interface UpdateDataAssetRequest {
  name?: string;
  description?: string;
  owner?: string;
  steward?: string;
  tags?: string[];
  classification?: DataClassification;
  schema?: DataSchema;
  metadata?: Partial<DataAssetMetadata>;
}

export interface LineageRequest {
  assetId: string;
  direction: 'upstream' | 'downstream' | 'both';
  depth?: number;
  includeTransformations?: boolean;
}

export interface QualityAssessmentRequest {
  assetId: string;
  rules?: string[];
  immediate?: boolean;
}

export interface CreateQualityRuleRequest {
  name: string;
  description: string;
  dimension: QualityDimensionType;
  type: QualityRuleType;
  field?: string;
  condition: string;
  threshold: number;
  severity: 'low' | 'medium' | 'high' | 'critical';
}
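The `SearchFilter` operators above map naturally onto predicate functions. A minimal evaluator sketch (not part of the catalog code; field access and operator coverage are deliberately simplified, and the `matches` helper is a hypothetical name):

```typescript
// Mirrors the SearchFilter operator union from DataCatalog.ts.
type Operator = 'eq' | 'ne' | 'gt' | 'gte' | 'lt' | 'lte' | 'in' | 'contains' | 'startswith' | 'endswith';

interface Filter { field: string; operator: Operator; value: any; }

// Evaluate one filter against a flat record.
function matches(record: Record<string, any>, f: Filter): boolean {
  const v = record[f.field];
  switch (f.operator) {
    case 'eq': return v === f.value;
    case 'ne': return v !== f.value;
    case 'gt': return v > f.value;
    case 'gte': return v >= f.value;
    case 'lt': return v < f.value;
    case 'lte': return v <= f.value;
    case 'in': return Array.isArray(f.value) && f.value.includes(v);
    case 'contains': return String(v).includes(String(f.value));
    case 'startswith': return String(v).startsWith(String(f.value));
    case 'endswith': return String(v).endsWith(String(f.value));
    default: return false;
  }
}

const asset = { name: 'trades_daily', owner: 'data-eng' };
console.log(matches(asset, { field: 'name', operator: 'startswith', value: 'trades' })); // → true
```

A request's `filters` array would then be folded with `filters.every(f => matches(record, f))`, i.e. AND semantics across filters.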
20 apps/data-services/data-catalog/tsconfig.json Normal file
@ -0,0 +1,20 @@
{
  "extends": "../../../tsconfig.json",
  "compilerOptions": {
    "outDir": "./dist",
    "rootDir": "./src",
    "baseUrl": "./src",
    "paths": {
      "@/*": ["*"]
    }
  },
  "include": [
    "src/**/*"
  ],
  "exclude": [
    "node_modules",
    "dist",
    "**/*.test.ts",
    "**/*.spec.ts"
  ]
}
35 apps/data-services/data-processor/package.json Normal file
@ -0,0 +1,35 @@
{
  "name": "data-processor",
  "version": "1.0.0",
  "description": "Data processing and pipeline orchestration service",
  "main": "src/index.ts",
  "scripts": {
    "dev": "bun run --watch src/index.ts",
    "start": "bun run src/index.ts",
    "build": "bun build src/index.ts --outdir=dist",
    "test": "bun test",
    "lint": "eslint src/**/*.ts",
    "type-check": "tsc --noEmit"
  },
  "dependencies": {
    "@stock-bot/shared-types": "*",
    "@stock-bot/event-bus": "*",
    "@stock-bot/utils": "*",
    "@stock-bot/api-client": "*",
    "hono": "^4.6.3",
    "ioredis": "^5.4.1",
    "cron": "^3.1.6",
    "bull": "^4.12.2",
    "axios": "^1.6.2",
    "node-fetch": "^3.3.2",
    "csv-parser": "^3.0.0",
    "joi": "^17.11.0"
  },
  "devDependencies": {
    "bun-types": "^1.2.15",
    "@types/node": "^20.10.5",
    "@types/bull": "^4.10.0",
    "typescript": "^5.3.3",
    "eslint": "^8.56.0"
  }
}
@ -0,0 +1,104 @@
import { Context } from 'hono';
import { logger } from '@stock-bot/utils';

export class HealthController {
  async getHealth(c: Context): Promise<Response> {
    try {
      const health = {
        status: 'healthy',
        timestamp: new Date().toISOString(),
        service: 'data-processor',
        version: process.env.npm_package_version || '1.0.0',
        uptime: process.uptime(),
        environment: process.env.NODE_ENV || 'development',
        dependencies: {
          redis: await this.checkRedisHealth(),
          eventBus: await this.checkEventBusHealth(),
        }
      };

      return c.json(health);
    } catch (error) {
      logger.error('Health check failed:', error);

      return c.json({
        status: 'unhealthy',
        timestamp: new Date().toISOString(),
        service: 'data-processor',
        error: error instanceof Error ? error.message : 'Unknown error'
      }, 503);
    }
  }

  async getDetailedHealth(c: Context): Promise<Response> {
    try {
      const health = {
        status: 'healthy',
        timestamp: new Date().toISOString(),
        service: 'data-processor',
        version: process.env.npm_package_version || '1.0.0',
        uptime: process.uptime(),
        environment: process.env.NODE_ENV || 'development',
        system: {
          platform: process.platform,
          architecture: process.arch,
          nodeVersion: process.version,
          memory: process.memoryUsage(),
          pid: process.pid
        },
        dependencies: {
          redis: await this.checkRedisHealth(),
          eventBus: await this.checkEventBusHealth(),
        },
        metrics: {
          activePipelines: 0, // will be populated by the orchestrator
          runningJobs: 0, // will be populated by the orchestrator
          totalProcessedRecords: 0 // will be populated by the orchestrator
        }
      };

      return c.json(health);
    } catch (error) {
      logger.error('Detailed health check failed:', error);

      return c.json({
        status: 'unhealthy',
        timestamp: new Date().toISOString(),
        service: 'data-processor',
        error: error instanceof Error ? error.message : 'Unknown error'
      }, 503);
    }
  }

  private async checkRedisHealth(): Promise<{ status: string; latency?: number; error?: string }> {
    try {
      const startTime = Date.now();
      // In a real implementation, ping Redis here
      const latency = Date.now() - startTime;

      return {
        status: 'healthy',
        latency
      };
    } catch (error) {
      return {
        status: 'unhealthy',
        error: error instanceof Error ? error.message : 'Redis connection failed'
      };
    }
  }

  private async checkEventBusHealth(): Promise<{ status: string; error?: string }> {
    try {
      // In a real implementation, check the event bus connection here
      return {
        status: 'healthy'
      };
    } catch (error) {
      return {
        status: 'unhealthy',
        error: error instanceof Error ? error.message : 'Event bus connection failed'
      };
    }
  }
}
@ -0,0 +1,297 @@
import { Context } from 'hono';
import { logger } from '@stock-bot/utils';
import { DataPipelineOrchestrator } from '../core/DataPipelineOrchestrator';
import { JobStatus } from '../types/DataPipeline';

export class JobController {
  constructor(private orchestrator: DataPipelineOrchestrator) {}

  async listJobs(c: Context): Promise<Response> {
    try {
      const pipelineId = c.req.query('pipelineId');
      const status = c.req.query('status') as JobStatus;
      const limit = parseInt(c.req.query('limit') || '50', 10);
      const offset = parseInt(c.req.query('offset') || '0', 10);

      let jobs = this.orchestrator.listJobs(pipelineId);

      // Filter by status if provided
      if (status) {
        jobs = jobs.filter(job => job.status === status);
      }

      // Sort by creation time (newest first)
      jobs.sort((a, b) => b.createdAt.getTime() - a.createdAt.getTime());

      // Apply pagination
      const totalJobs = jobs.length;
      const paginatedJobs = jobs.slice(offset, offset + limit);

      return c.json({
        success: true,
        data: paginatedJobs,
        pagination: {
          total: totalJobs,
          limit,
          offset,
          hasMore: offset + limit < totalJobs
        }
      });
    } catch (error) {
      logger.error('Failed to list jobs:', error);

      return c.json({
        success: false,
        error: error instanceof Error ? error.message : 'Failed to list jobs'
      }, 500);
    }
  }

  async getJob(c: Context): Promise<Response> {
    try {
      const jobId = c.req.param('id');
      const job = this.orchestrator.getJob(jobId);

      if (!job) {
        return c.json({
          success: false,
          error: 'Job not found'
        }, 404);
      }

      return c.json({
        success: true,
        data: job
      });
    } catch (error) {
      logger.error('Failed to get job:', error);

      return c.json({
        success: false,
        error: error instanceof Error ? error.message : 'Failed to get job'
      }, 500);
    }
  }

  async cancelJob(c: Context): Promise<Response> {
    try {
      const jobId = c.req.param('id');
      const job = this.orchestrator.getJob(jobId);

      if (!job) {
        return c.json({
          success: false,
          error: 'Job not found'
        }, 404);
      }

      if (job.status !== JobStatus.RUNNING && job.status !== JobStatus.PENDING) {
        return c.json({
          success: false,
          error: 'Job cannot be cancelled in its current status'
        }, 400);
      }

      // Update job status to cancelled
      job.status = JobStatus.CANCELLED;
      job.completedAt = new Date();
      job.error = 'Job cancelled by user';

      logger.info(`Cancelled job: ${jobId}`);

      return c.json({
        success: true,
        message: 'Job cancelled successfully',
        data: job
      });
    } catch (error) {
      logger.error('Failed to cancel job:', error);

      return c.json({
        success: false,
        error: error instanceof Error ? error.message : 'Failed to cancel job'
      }, 500);
    }
  }

  async retryJob(c: Context): Promise<Response> {
    try {
      const jobId = c.req.param('id');
      const job = this.orchestrator.getJob(jobId);

      if (!job) {
        return c.json({
          success: false,
          error: 'Job not found'
        }, 404);
      }

      if (job.status !== JobStatus.FAILED) {
        return c.json({
          success: false,
          error: 'Only failed jobs can be retried'
        }, 400);
      }

      // Create a new job with the same parameters
      const newJob = await this.orchestrator.runPipeline(job.pipelineId, job.parameters);

      logger.info(`Retried job: ${jobId} as new job: ${newJob.id}`);

      return c.json({
        success: true,
        message: 'Job retried successfully',
        data: {
          originalJob: job,
          newJob: newJob
        }
      });
    } catch (error) {
      logger.error('Failed to retry job:', error);

      return c.json({
        success: false,
        error: error instanceof Error ? error.message : 'Failed to retry job'
      }, 500);
    }
  }

  async getJobLogs(c: Context): Promise<Response> {
    try {
      const jobId = c.req.param('id');
      const job = this.orchestrator.getJob(jobId);

      if (!job) {
        return c.json({
          success: false,
          error: 'Job not found'
        }, 404);
      }

      // In a real implementation, fetch logs from a log store
      const logs = [
        {
          timestamp: job.createdAt,
          level: 'info',
          message: `Job ${jobId} created`
        },
        ...(job.startedAt ? [{
          timestamp: job.startedAt,
          level: 'info',
          message: `Job ${jobId} started`
        }] : []),
        ...(job.completedAt ? [{
          timestamp: job.completedAt,
          level: job.status === JobStatus.COMPLETED ? 'info' : 'error',
          message: job.status === JobStatus.COMPLETED ?
            `Job ${jobId} completed successfully` :
            `Job ${jobId} failed: ${job.error}`
        }] : [])
      ];

      return c.json({
        success: true,
        data: {
          jobId,
          logs,
          totalLogs: logs.length
        }
      });
    } catch (error) {
      logger.error('Failed to get job logs:', error);

      return c.json({
        success: false,
        error: error instanceof Error ? error.message : 'Failed to get job logs'
      }, 500);
    }
  }

  async getJobMetrics(c: Context): Promise<Response> {
    try {
      const jobId = c.req.param('id');
      const job = this.orchestrator.getJob(jobId);

      if (!job) {
        return c.json({
          success: false,
          error: 'Job not found'
        }, 404);
      }

      const metrics = {
        ...job.metrics,
        duration: job.completedAt && job.startedAt ?
          job.completedAt.getTime() - job.startedAt.getTime() : null,
        successRate: job.metrics.recordsProcessed > 0 ?
          (job.metrics.recordsSuccessful / job.metrics.recordsProcessed) * 100 : 0,
        errorRate: job.metrics.recordsProcessed > 0 ?
          (job.metrics.recordsFailed / job.metrics.recordsProcessed) * 100 : 0,
        status: job.status,
        startedAt: job.startedAt,
        completedAt: job.completedAt
      };

      return c.json({
        success: true,
        data: metrics
      });
    } catch (error) {
      logger.error('Failed to get job metrics:', error);

      return c.json({
        success: false,
        error: error instanceof Error ? error.message : 'Failed to get job metrics'
      }, 500);
    }
  }

  async getJobStats(c: Context): Promise<Response> {
    try {
      const jobs = this.orchestrator.listJobs();

      const stats = {
        total: jobs.length,
        byStatus: {
          pending: jobs.filter(j => j.status === JobStatus.PENDING).length,
          running: jobs.filter(j => j.status === JobStatus.RUNNING).length,
          completed: jobs.filter(j => j.status === JobStatus.COMPLETED).length,
          failed: jobs.filter(j => j.status === JobStatus.FAILED).length,
          cancelled: jobs.filter(j => j.status === JobStatus.CANCELLED).length,
        },
        metrics: {
          totalRecordsProcessed: jobs.reduce((sum, j) => sum + j.metrics.recordsProcessed, 0),
          totalRecordsSuccessful: jobs.reduce((sum, j) => sum + j.metrics.recordsSuccessful, 0),
          totalRecordsFailed: jobs.reduce((sum, j) => sum + j.metrics.recordsFailed, 0),
          averageProcessingTime: jobs.length > 0 ?
            jobs.reduce((sum, j) => sum + j.metrics.processingTimeMs, 0) / jobs.length : 0,
          successRate: jobs.length > 0 ?
            (jobs.filter(j => j.status === JobStatus.COMPLETED).length / jobs.length) * 100 : 0
        },
        recentJobs: jobs
          .sort((a, b) => b.createdAt.getTime() - a.createdAt.getTime())
          .slice(0, 10)
          .map(job => ({
            id: job.id,
            pipelineId: job.pipelineId,
            status: job.status,
            createdAt: job.createdAt,
            processingTime: job.metrics.processingTimeMs,
            recordsProcessed: job.metrics.recordsProcessed
          }))
      };

      return c.json({
        success: true,
        data: stats
      });
    } catch (error) {
      logger.error('Failed to get job stats:', error);

      return c.json({
        success: false,
        error: error instanceof Error ? error.message : 'Failed to get job stats'
      }, 500);
    }
  }
}
|
||||
|
|
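The success/error-rate arithmetic in `getJobMetrics` above can be isolated as a pure function. This is a sketch, not part of the service: the `JobMetrics` shape is reduced to the three counters the handler reads, and the divide-by-zero guard mirrors the ternaries in the handler.

```typescript
interface JobMetricCounters {
  recordsProcessed: number;
  recordsSuccessful: number;
  recordsFailed: number;
}

// Mirrors the ratio logic in getJobMetrics: guard against division by
// zero, then express success/failure counts as percentages of processed.
function computeRates(m: JobMetricCounters): { successRate: number; errorRate: number } {
  if (m.recordsProcessed === 0) {
    return { successRate: 0, errorRate: 0 };
  }
  return {
    successRate: (m.recordsSuccessful / m.recordsProcessed) * 100,
    errorRate: (m.recordsFailed / m.recordsProcessed) * 100,
  };
}
```

Keeping the guard on `recordsProcessed` matters: a job that has not processed anything yet reports 0% rather than `NaN`.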
@@ -0,0 +1,346 @@
import { Context } from 'hono';
import { logger } from '@stock-bot/utils';
import { DataPipelineOrchestrator } from '../core/DataPipelineOrchestrator';
import { DataPipeline, PipelineStatus } from '../types/DataPipeline';

export class PipelineController {
  constructor(private orchestrator: DataPipelineOrchestrator) {}

  async listPipelines(c: Context): Promise<Response> {
    try {
      const pipelines = this.orchestrator.listPipelines();

      return c.json({
        success: true,
        data: pipelines,
        total: pipelines.length
      });
    } catch (error) {
      logger.error('Failed to list pipelines:', error);

      return c.json({
        success: false,
        error: error instanceof Error ? error.message : 'Failed to list pipelines'
      }, 500);
    }
  }

  async createPipeline(c: Context): Promise<Response> {
    try {
      const pipelineData = await c.req.json();

      // Validate required fields
      if (!pipelineData.name) {
        return c.json({
          success: false,
          error: 'Pipeline name is required'
        }, 400);
      }

      const pipeline = await this.orchestrator.createPipeline(pipelineData);

      logger.info(`Created pipeline: ${pipeline.name} (${pipeline.id})`);

      return c.json({
        success: true,
        data: pipeline
      }, 201);
    } catch (error) {
      logger.error('Failed to create pipeline:', error);

      return c.json({
        success: false,
        error: error instanceof Error ? error.message : 'Failed to create pipeline'
      }, 500);
    }
  }

  async getPipeline(c: Context): Promise<Response> {
    try {
      const pipelineId = c.req.param('id');
      const pipeline = this.orchestrator.getPipeline(pipelineId);

      if (!pipeline) {
        return c.json({
          success: false,
          error: 'Pipeline not found'
        }, 404);
      }

      return c.json({
        success: true,
        data: pipeline
      });
    } catch (error) {
      logger.error('Failed to get pipeline:', error);

      return c.json({
        success: false,
        error: error instanceof Error ? error.message : 'Failed to get pipeline'
      }, 500);
    }
  }

  async updatePipeline(c: Context): Promise<Response> {
    try {
      const pipelineId = c.req.param('id');
      const updateData = await c.req.json();

      const existingPipeline = this.orchestrator.getPipeline(pipelineId);
      if (!existingPipeline) {
        return c.json({
          success: false,
          error: 'Pipeline not found'
        }, 404);
      }

      // Update pipeline (in a real implementation, this would use a proper update method)
      const updatedPipeline: DataPipeline = {
        ...existingPipeline,
        ...updateData,
        id: pipelineId, // Ensure ID doesn't change
        updatedAt: new Date()
      };

      // In a real implementation, save to persistent storage
      logger.info(`Updated pipeline: ${updatedPipeline.name} (${pipelineId})`);

      return c.json({
        success: true,
        data: updatedPipeline
      });
    } catch (error) {
      logger.error('Failed to update pipeline:', error);

      return c.json({
        success: false,
        error: error instanceof Error ? error.message : 'Failed to update pipeline'
      }, 500);
    }
  }

  async deletePipeline(c: Context): Promise<Response> {
    try {
      const pipelineId = c.req.param('id');

      const pipeline = this.orchestrator.getPipeline(pipelineId);
      if (!pipeline) {
        return c.json({
          success: false,
          error: 'Pipeline not found'
        }, 404);
      }

      // Check if pipeline is running
      const runningJobs = this.orchestrator.listJobs(pipelineId);
      if (runningJobs.length > 0) {
        return c.json({
          success: false,
          error: 'Cannot delete pipeline with running jobs'
        }, 400);
      }

      // In a real implementation, delete from persistent storage
      logger.info(`Deleted pipeline: ${pipeline.name} (${pipelineId})`);

      return c.json({
        success: true,
        message: 'Pipeline deleted successfully'
      });
    } catch (error) {
      logger.error('Failed to delete pipeline:', error);

      return c.json({
        success: false,
        error: error instanceof Error ? error.message : 'Failed to delete pipeline'
      }, 500);
    }
  }

  async runPipeline(c: Context): Promise<Response> {
    try {
      const pipelineId = c.req.param('id');
      const parameters = await c.req.json().catch(() => ({}));

      const pipeline = this.orchestrator.getPipeline(pipelineId);
      if (!pipeline) {
        return c.json({
          success: false,
          error: 'Pipeline not found'
        }, 404);
      }

      if (pipeline.status !== PipelineStatus.ACTIVE) {
        return c.json({
          success: false,
          error: 'Pipeline is not active'
        }, 400);
      }

      const job = await this.orchestrator.runPipeline(pipelineId, parameters);

      logger.info(`Started pipeline job: ${job.id} for pipeline: ${pipelineId}`);

      return c.json({
        success: true,
        data: job
      }, 202);
    } catch (error) {
      logger.error('Failed to run pipeline:', error);

      return c.json({
        success: false,
        error: error instanceof Error ? error.message : 'Failed to run pipeline'
      }, 500);
    }
  }

  async schedulePipeline(c: Context): Promise<Response> {
    try {
      const pipelineId = c.req.param('id');
      const { cronExpression } = await c.req.json();

      if (!cronExpression) {
        return c.json({
          success: false,
          error: 'Cron expression is required'
        }, 400);
      }

      const pipeline = this.orchestrator.getPipeline(pipelineId);
      if (!pipeline) {
        return c.json({
          success: false,
          error: 'Pipeline not found'
        }, 404);
      }

      await this.orchestrator.schedulePipeline(pipelineId, cronExpression);

      logger.info(`Scheduled pipeline: ${pipelineId} with cron: ${cronExpression}`);

      return c.json({
        success: true,
        message: 'Pipeline scheduled successfully',
        data: {
          pipelineId,
          cronExpression
        }
      });
    } catch (error) {
      logger.error('Failed to schedule pipeline:', error);

      return c.json({
        success: false,
        error: error instanceof Error ? error.message : 'Failed to schedule pipeline'
      }, 500);
    }
  }

  async pausePipeline(c: Context): Promise<Response> {
    try {
      const pipelineId = c.req.param('id');

      const pipeline = this.orchestrator.getPipeline(pipelineId);
      if (!pipeline) {
        return c.json({
          success: false,
          error: 'Pipeline not found'
        }, 404);
      }

      // Update pipeline status to paused
      pipeline.status = PipelineStatus.PAUSED;
      pipeline.updatedAt = new Date();

      logger.info(`Paused pipeline: ${pipelineId}`);

      return c.json({
        success: true,
        message: 'Pipeline paused successfully',
        data: pipeline
      });
    } catch (error) {
      logger.error('Failed to pause pipeline:', error);

      return c.json({
        success: false,
        error: error instanceof Error ? error.message : 'Failed to pause pipeline'
      }, 500);
    }
  }

  async resumePipeline(c: Context): Promise<Response> {
    try {
      const pipelineId = c.req.param('id');

      const pipeline = this.orchestrator.getPipeline(pipelineId);
      if (!pipeline) {
        return c.json({
          success: false,
          error: 'Pipeline not found'
        }, 404);
      }

      // Update pipeline status to active
      pipeline.status = PipelineStatus.ACTIVE;
      pipeline.updatedAt = new Date();

      logger.info(`Resumed pipeline: ${pipelineId}`);

      return c.json({
        success: true,
        message: 'Pipeline resumed successfully',
        data: pipeline
      });
    } catch (error) {
      logger.error('Failed to resume pipeline:', error);

      return c.json({
        success: false,
        error: error instanceof Error ? error.message : 'Failed to resume pipeline'
      }, 500);
    }
  }

  async getPipelineMetrics(c: Context): Promise<Response> {
    try {
      const pipelineId = c.req.param('id');

      const pipeline = this.orchestrator.getPipeline(pipelineId);
      if (!pipeline) {
        return c.json({
          success: false,
          error: 'Pipeline not found'
        }, 404);
      }

      const jobs = this.orchestrator.listJobs(pipelineId);

      const metrics = {
        totalJobs: jobs.length,
        completedJobs: jobs.filter(j => j.status === 'completed').length,
        failedJobs: jobs.filter(j => j.status === 'failed').length,
        runningJobs: jobs.filter(j => j.status === 'running').length,
        totalRecordsProcessed: jobs.reduce((sum, j) => sum + j.metrics.recordsProcessed, 0),
        totalProcessingTime: jobs.reduce((sum, j) => sum + j.metrics.processingTimeMs, 0),
        averageProcessingTime: jobs.length > 0 ?
          jobs.reduce((sum, j) => sum + j.metrics.processingTimeMs, 0) / jobs.length : 0,
        successRate: jobs.length > 0 ?
          (jobs.filter(j => j.status === 'completed').length / jobs.length) * 100 : 0
      };

      return c.json({
        success: true,
        data: metrics
      });
    } catch (error) {
      logger.error('Failed to get pipeline metrics:', error);

      return c.json({
        success: false,
        error: error instanceof Error ? error.message : 'Failed to get pipeline metrics'
      }, 500);
    }
  }
}
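The `recentJobs` selection used by the stats handlers is a sort-then-slice over creation time. A standalone sketch of that pattern (job shape reduced to what the comparison needs; not the service's actual type):

```typescript
interface JobLike {
  id: string;
  createdAt: Date;
}

// Newest-first ordering, then cap at `limit` entries — the same
// sort-then-slice pattern used for `recentJobs` in getJobStats.
function recentJobs<T extends JobLike>(jobs: T[], limit = 10): T[] {
  return [...jobs]
    .sort((a, b) => b.createdAt.getTime() - a.createdAt.getTime())
    .slice(0, limit);
}
```

One design note: `Array.prototype.sort` mutates its receiver, so sorting the array returned by `listJobs()` directly would reorder the orchestrator's view of its jobs if that array were shared; copying first, as here, avoids that.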
@@ -0,0 +1,293 @@
import { EventBus } from '@stock-bot/event-bus';
import { logger } from '@stock-bot/utils';
import { DataPipeline, PipelineStatus, PipelineJob, JobStatus } from '../types/DataPipeline';
import { DataIngestionService } from '../services/DataIngestionService';
import { DataTransformationService } from '../services/DataTransformationService';
import { DataValidationService } from '../services/DataValidationService';
import { DataQualityService } from '../services/DataQualityService';
import { PipelineScheduler } from './PipelineScheduler';
import { JobQueue } from './JobQueue';

export class DataPipelineOrchestrator {
  private eventBus: EventBus;
  private scheduler: PipelineScheduler;
  private jobQueue: JobQueue;
  private pipelines: Map<string, DataPipeline> = new Map();
  private runningJobs: Map<string, PipelineJob> = new Map();

  constructor(
    private ingestionService: DataIngestionService,
    private transformationService: DataTransformationService,
    private validationService: DataValidationService,
    private qualityService: DataQualityService
  ) {
    this.eventBus = new EventBus();
    this.scheduler = new PipelineScheduler(this);
    this.jobQueue = new JobQueue(this);
  }

  async initialize(): Promise<void> {
    logger.info('🔄 Initializing Data Pipeline Orchestrator...');

    await this.eventBus.initialize();
    await this.scheduler.initialize();
    await this.jobQueue.initialize();

    // Subscribe to pipeline events
    await this.eventBus.subscribe('data.pipeline.*', this.handlePipelineEvent.bind(this));
    await this.eventBus.subscribe('data.job.*', this.handleJobEvent.bind(this));

    // Load existing pipelines
    await this.loadPipelines();

    logger.info('✅ Data Pipeline Orchestrator initialized');
  }

  async createPipeline(pipeline: Omit<DataPipeline, 'id' | 'createdAt' | 'updatedAt'>): Promise<DataPipeline> {
    const pipelineWithId: DataPipeline = {
      ...pipeline,
      id: this.generatePipelineId(),
      status: PipelineStatus.DRAFT,
      createdAt: new Date(),
      updatedAt: new Date(),
    };

    this.pipelines.set(pipelineWithId.id, pipelineWithId);

    await this.eventBus.publish('data.pipeline.created', {
      pipelineId: pipelineWithId.id,
      pipeline: pipelineWithId,
    });

    logger.info(`📋 Created pipeline: ${pipelineWithId.name} (${pipelineWithId.id})`);
    return pipelineWithId;
  }

  async runPipeline(pipelineId: string, parameters?: Record<string, any>): Promise<PipelineJob> {
    const pipeline = this.pipelines.get(pipelineId);
    if (!pipeline) {
      throw new Error(`Pipeline not found: ${pipelineId}`);
    }

    if (pipeline.status !== PipelineStatus.ACTIVE) {
      throw new Error(`Pipeline is not active: ${pipeline.status}`);
    }

    const job: PipelineJob = {
      id: this.generateJobId(),
      pipelineId,
      status: JobStatus.PENDING,
      parameters: parameters || {},
      createdAt: new Date(),
      startedAt: null,
      completedAt: null,
      error: null,
      metrics: {
        recordsProcessed: 0,
        recordsSuccessful: 0,
        recordsFailed: 0,
        processingTimeMs: 0,
      },
    };

    this.runningJobs.set(job.id, job);

    // Queue the job for execution
    await this.jobQueue.enqueueJob(job);

    await this.eventBus.publish('data.job.queued', {
      jobId: job.id,
      pipelineId,
      job,
    });

    logger.info(`🚀 Queued pipeline job: ${job.id} for pipeline: ${pipeline.name}`);
    return job;
  }

  async executePipelineJob(job: PipelineJob): Promise<void> {
    const pipeline = this.pipelines.get(job.pipelineId);
    if (!pipeline) {
      throw new Error(`Pipeline not found: ${job.pipelineId}`);
    }

    const startTime = Date.now();
    job.status = JobStatus.RUNNING;
    job.startedAt = new Date();

    await this.eventBus.publish('data.job.started', {
      jobId: job.id,
      pipelineId: job.pipelineId,
      job,
    });

    try {
      logger.info(`⚙️ Executing pipeline job: ${job.id}`);

      // Execute pipeline steps
      await this.executeIngestionStep(pipeline, job);
      await this.executeTransformationStep(pipeline, job);
      await this.executeValidationStep(pipeline, job);
      await this.executeQualityChecks(pipeline, job);

      // Complete the job
      job.status = JobStatus.COMPLETED;
      job.completedAt = new Date();
      job.metrics.processingTimeMs = Date.now() - startTime;

      await this.eventBus.publish('data.job.completed', {
        jobId: job.id,
        pipelineId: job.pipelineId,
        job,
      });

      logger.info(`✅ Pipeline job completed: ${job.id} in ${job.metrics.processingTimeMs}ms`);

    } catch (error) {
      job.status = JobStatus.FAILED;
      job.completedAt = new Date();
      job.error = error instanceof Error ? error.message : 'Unknown error';
      job.metrics.processingTimeMs = Date.now() - startTime;

      await this.eventBus.publish('data.job.failed', {
        jobId: job.id,
        pipelineId: job.pipelineId,
        job,
        error: job.error,
      });

      logger.error(`❌ Pipeline job failed: ${job.id}`, error);
      throw error;
    }
  }

  private async executeIngestionStep(pipeline: DataPipeline, job: PipelineJob): Promise<void> {
    if (!pipeline.steps.ingestion) return;

    logger.info(`📥 Executing ingestion step for job: ${job.id}`);

    const result = await this.ingestionService.ingestData(
      pipeline.steps.ingestion,
      job.parameters
    );

    job.metrics.recordsProcessed += result.recordsProcessed;
    job.metrics.recordsSuccessful += result.recordsSuccessful;
    job.metrics.recordsFailed += result.recordsFailed;
  }

  private async executeTransformationStep(pipeline: DataPipeline, job: PipelineJob): Promise<void> {
    if (!pipeline.steps.transformation) return;

    logger.info(`🔄 Executing transformation step for job: ${job.id}`);

    const result = await this.transformationService.transformData(
      pipeline.steps.transformation,
      job.parameters
    );

    job.metrics.recordsProcessed += result.recordsProcessed;
    job.metrics.recordsSuccessful += result.recordsSuccessful;
    job.metrics.recordsFailed += result.recordsFailed;
  }

  private async executeValidationStep(pipeline: DataPipeline, job: PipelineJob): Promise<void> {
    if (!pipeline.steps.validation) return;

    logger.info(`✅ Executing validation step for job: ${job.id}`);

    const result = await this.validationService.validateData(
      pipeline.steps.validation,
      job.parameters
    );

    job.metrics.recordsProcessed += result.recordsProcessed;
    job.metrics.recordsSuccessful += result.recordsSuccessful;
    job.metrics.recordsFailed += result.recordsFailed;
  }

  private async executeQualityChecks(pipeline: DataPipeline, job: PipelineJob): Promise<void> {
    if (!pipeline.steps.qualityChecks) return;

    logger.info(`🔍 Executing quality checks for job: ${job.id}`);

    await this.qualityService.runQualityChecks(
      pipeline.steps.qualityChecks,
      job.parameters
    );
  }

  async schedulePipeline(pipelineId: string, cronExpression: string): Promise<void> {
    const pipeline = this.pipelines.get(pipelineId);
    if (!pipeline) {
      throw new Error(`Pipeline not found: ${pipelineId}`);
    }

    await this.scheduler.schedulePipeline(pipelineId, cronExpression);

    pipeline.schedule = {
      cronExpression,
      enabled: true,
      lastRun: null,
      nextRun: this.scheduler.getNextRunTime(cronExpression),
    };

    await this.eventBus.publish('data.pipeline.scheduled', {
      pipelineId,
      cronExpression,
    });

    logger.info(`📅 Scheduled pipeline: ${pipeline.name} with cron: ${cronExpression}`);
  }

  // Pipeline CRUD operations
  getPipeline(pipelineId: string): DataPipeline | undefined {
    return this.pipelines.get(pipelineId);
  }

  listPipelines(): DataPipeline[] {
    return Array.from(this.pipelines.values());
  }

  getJob(jobId: string): PipelineJob | undefined {
    return this.runningJobs.get(jobId);
  }

  listJobs(pipelineId?: string): PipelineJob[] {
    const jobs = Array.from(this.runningJobs.values());
    return pipelineId ? jobs.filter(job => job.pipelineId === pipelineId) : jobs;
  }

  private async handlePipelineEvent(event: any): Promise<void> {
    logger.debug('📨 Received pipeline event:', event);
    // Handle pipeline-level events
  }

  private async handleJobEvent(event: any): Promise<void> {
    logger.debug('📨 Received job event:', event);
    // Handle job-level events
  }

  private async loadPipelines(): Promise<void> {
    // In a real implementation, load pipelines from persistent storage
    logger.info('📂 Loading existing pipelines...');
  }

  private generatePipelineId(): string {
    return `pipeline_${Date.now()}_${Math.random().toString(36).substr(2, 9)}`;
  }

  private generateJobId(): string {
    return `job_${Date.now()}_${Math.random().toString(36).substr(2, 9)}`;
  }

  async shutdown(): Promise<void> {
    logger.info('🔄 Shutting down Data Pipeline Orchestrator...');

    await this.scheduler.shutdown();
    await this.jobQueue.shutdown();
    await this.eventBus.disconnect();

    logger.info('✅ Data Pipeline Orchestrator shutdown complete');
  }
}
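The orchestrator's `generatePipelineId`/`generateJobId` helpers build IDs as `prefix_epochMillis_randomBase36`. A self-contained sketch of that scheme; note that `String.prototype.substr` (used above) is deprecated, and `slice` is the drop-in replacement:

```typescript
// `${prefix}_${epoch-ms}_${up to 9 base-36 chars}` — the same shape the
// orchestrator emits, using slice() instead of the deprecated substr().
function generateId(prefix: string): string {
  return `${prefix}_${Date.now()}_${Math.random().toString(36).slice(2, 11)}`;
}
```

The random suffix can occasionally be shorter than nine characters (trailing zeros are dropped by `toString(36)`), so consumers should not assume a fixed ID length. For collision-resistant IDs a UUID would be the safer choice.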
77 apps/data-services/data-processor/src/core/JobQueue.ts Normal file
@@ -0,0 +1,77 @@
import Queue from 'bull';
import { logger } from '@stock-bot/utils';
import { PipelineJob } from '../types/DataPipeline';
import { DataPipelineOrchestrator } from './DataPipelineOrchestrator';

export class JobQueue {
  private queue: Queue.Queue;

  constructor(private orchestrator: DataPipelineOrchestrator) {
    this.queue = new Queue('data-pipeline-jobs', {
      redis: {
        host: process.env.REDIS_HOST || 'localhost',
        port: parseInt(process.env.REDIS_PORT || '6379'),
      },
    });
  }

  async initialize(): Promise<void> {
    logger.info('🔄 Initializing Job Queue...');

    // Process jobs with a maximum of 5 concurrent jobs
    this.queue.process('pipeline-job', 5, async (job) => {
      const pipelineJob: PipelineJob = job.data;
      await this.orchestrator.executePipelineJob(pipelineJob);
    });

    // Handle job events
    this.queue.on('completed', (job) => {
      logger.info(`✅ Job completed: ${job.id}`);
    });

    this.queue.on('failed', (job, error) => {
      logger.error(`❌ Job failed: ${job.id}`, error);
    });

    this.queue.on('stalled', (job) => {
      logger.warn(`⚠️ Job stalled: ${job.id}`);
    });

    logger.info('✅ Job Queue initialized');
  }

  async enqueueJob(job: PipelineJob): Promise<void> {
    await this.queue.add('pipeline-job', job, {
      jobId: job.id,
      removeOnComplete: 100, // Keep last 100 completed jobs
      removeOnFail: 50,      // Keep last 50 failed jobs
      attempts: 3,           // Retry failed jobs up to 3 times
      backoff: {
        type: 'exponential',
        delay: 2000,
      },
    });

    logger.info(`📤 Enqueued job: ${job.id}`);
  }

  async getJobStats(): Promise<any> {
    const waiting = await this.queue.getWaiting();
    const active = await this.queue.getActive();
    const completed = await this.queue.getCompleted();
    const failed = await this.queue.getFailed();

    return {
      waiting: waiting.length,
      active: active.length,
      completed: completed.length,
      failed: failed.length,
    };
  }

  async shutdown(): Promise<void> {
    logger.info('🔄 Shutting down Job Queue...');
    await this.queue.close();
    logger.info('✅ Job Queue shutdown complete');
  }
}
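The retry options in `enqueueJob` (`attempts: 3` with exponential backoff, 2000 ms base) mean one initial attempt plus two retries, with the wait doubling per retry. A sketch of the resulting schedule, assuming the commonly documented exponential formula `delay * 2^(retry - 1)`:

```typescript
// Delays before each retry under exponential backoff: with attempts=3
// there are 2 retries, waited out at baseMs, then 2*baseMs.
function backoffDelays(baseMs: number, attempts: number): number[] {
  const delays: number[] = [];
  for (let retry = 1; retry < attempts; retry++) {
    delays.push(baseMs * 2 ** (retry - 1));
  }
  return delays;
}
```

So a job that keeps failing is retried after roughly 2 s and then 4 s before Bull marks it failed for good (subject to Bull's exact rounding).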
@@ -0,0 +1,69 @@
import { CronJob } from 'cron';
import { logger } from '@stock-bot/utils';
import { DataPipelineOrchestrator } from './DataPipelineOrchestrator';

export class PipelineScheduler {
  private scheduledJobs: Map<string, CronJob> = new Map();

  constructor(private orchestrator: DataPipelineOrchestrator) {}

  async initialize(): Promise<void> {
    logger.info('🔄 Initializing Pipeline Scheduler...');
    logger.info('✅ Pipeline Scheduler initialized');
  }

  async schedulePipeline(pipelineId: string, cronExpression: string): Promise<void> {
    // Cancel existing schedule if it exists
    if (this.scheduledJobs.has(pipelineId)) {
      this.cancelSchedule(pipelineId);
    }

    const cronJob = new CronJob(
      cronExpression,
      async () => {
        try {
          logger.info(`⏰ Scheduled execution triggered for pipeline: ${pipelineId}`);
          await this.orchestrator.runPipeline(pipelineId);
        } catch (error) {
          logger.error(`❌ Scheduled pipeline execution failed: ${pipelineId}`, error);
        }
      },
      null,
      true, // Start immediately
      'UTC'
    );

    this.scheduledJobs.set(pipelineId, cronJob);
    logger.info(`📅 Scheduled pipeline ${pipelineId} with cron: ${cronExpression}`);
  }

  cancelSchedule(pipelineId: string): void {
    const job = this.scheduledJobs.get(pipelineId);
    if (job) {
      job.stop();
      this.scheduledJobs.delete(pipelineId);
      logger.info(`🚫 Cancelled schedule for pipeline: ${pipelineId}`);
    }
  }

  getNextRunTime(cronExpression: string): Date {
    const job = new CronJob(cronExpression);
    return job.nextDate().toDate();
  }

  getScheduledPipelines(): string[] {
    return Array.from(this.scheduledJobs.keys());
  }

  async shutdown(): Promise<void> {
    logger.info('🔄 Shutting down Pipeline Scheduler...');

    for (const [pipelineId, job] of this.scheduledJobs) {
      job.stop();
      logger.info(`🚫 Stopped scheduled job for pipeline: ${pipelineId}`);
    }

    this.scheduledJobs.clear();
    logger.info('✅ Pipeline Scheduler shutdown complete');
  }
}
107 apps/data-services/data-processor/src/index.ts Normal file
@@ -0,0 +1,107 @@
import { Hono } from 'hono';
import { serve } from 'bun';
import { logger } from '@stock-bot/utils';
import { DataPipelineOrchestrator } from './core/DataPipelineOrchestrator';
import { DataQualityService } from './services/DataQualityService';
import { DataIngestionService } from './services/DataIngestionService';
import { DataTransformationService } from './services/DataTransformationService';
import { DataValidationService } from './services/DataValidationService';
import { HealthController } from './controllers/HealthController';
import { PipelineController } from './controllers/PipelineController';
import { JobController } from './controllers/JobController';

const app = new Hono();

// Services
const dataQualityService = new DataQualityService();
const dataIngestionService = new DataIngestionService();
const dataTransformationService = new DataTransformationService();
const dataValidationService = new DataValidationService();

// Core orchestrator
const pipelineOrchestrator = new DataPipelineOrchestrator(
  dataIngestionService,
  dataTransformationService,
  dataValidationService,
  dataQualityService
);

// Controllers
const healthController = new HealthController();
const pipelineController = new PipelineController(pipelineOrchestrator);
const jobController = new JobController(pipelineOrchestrator);

// Health endpoints
app.get('/health', healthController.getHealth.bind(healthController));
app.get('/health/detailed', healthController.getDetailedHealth.bind(healthController));

// Pipeline management
app.get('/api/pipelines', pipelineController.listPipelines.bind(pipelineController));
app.post('/api/pipelines', pipelineController.createPipeline.bind(pipelineController));
app.get('/api/pipelines/:id', pipelineController.getPipeline.bind(pipelineController));
app.put('/api/pipelines/:id', pipelineController.updatePipeline.bind(pipelineController));
app.delete('/api/pipelines/:id', pipelineController.deletePipeline.bind(pipelineController));
app.post('/api/pipelines/:id/run', pipelineController.runPipeline.bind(pipelineController));
app.post('/api/pipelines/:id/schedule', pipelineController.schedulePipeline.bind(pipelineController));
app.post('/api/pipelines/:id/pause', pipelineController.pausePipeline.bind(pipelineController));
app.post('/api/pipelines/:id/resume', pipelineController.resumePipeline.bind(pipelineController));
app.get('/api/pipelines/:id/metrics', pipelineController.getPipelineMetrics.bind(pipelineController));

// Job management
app.get('/api/jobs', jobController.listJobs.bind(jobController));
app.get('/api/jobs/stats', jobController.getJobStats.bind(jobController));
app.get('/api/jobs/:id', jobController.getJob.bind(jobController));
app.get('/api/jobs/:id/logs', jobController.getJobLogs.bind(jobController));
app.get('/api/jobs/:id/metrics', jobController.getJobMetrics.bind(jobController));
app.post('/api/jobs/:id/cancel', jobController.cancelJob.bind(jobController));
app.post('/api/jobs/:id/retry', jobController.retryJob.bind(jobController));

// Data quality endpoints
app.get('/api/data-quality/metrics', async (c) => {
  const metrics = await dataQualityService.getQualityMetrics();
  return c.json({ success: true, data: metrics });
});

app.get('/api/data-quality/report/:dataset', async (c) => {
  const dataset = c.req.param('dataset');
  const report = await dataQualityService.generateReport(dataset);
  return c.json({ success: true, data: report });
});

const PORT = parseInt(process.env.DATA_PROCESSOR_PORT || '5001');

// Initialize services
async function initializeServices() {
  try {
    logger.info('🔄 Initializing Data Processor services...');

    await dataQualityService.initialize();
    await dataIngestionService.initialize();
    await dataTransformationService.initialize();
    await dataValidationService.initialize();
    await pipelineOrchestrator.initialize();

    logger.info('✅ Data Processor services initialized successfully');
  } catch (error) {
    logger.error('❌ Failed to initialize Data Processor services:', error);
    process.exit(1);
  }
}

// Graceful shutdown
process.on('SIGINT', async () => {
  logger.info('🔄 Gracefully shutting down Data Processor...');
  await pipelineOrchestrator.shutdown();
  process.exit(0);
});

initializeServices().then(() => {
  serve({
    port: PORT,
    fetch: app.fetch,
  });

  logger.info(`🚀 Data Processor running on port ${PORT}`);
  logger.info(`🔍 Health check: http://localhost:${PORT}/health`);
  logger.info(`📊 API documentation: http://localhost:${PORT}/api`);
});
@ -0,0 +1,200 @@
import { logger } from '@stock-bot/utils';
import { IngestionStep, ProcessingResult, DataSource } from '../types/DataPipeline';
import axios from 'axios';
// csv-parser exports a factory function as its default export, so it must be
// imported as a default import to be callable as csv()
import csv from 'csv-parser';
import * as fs from 'fs';

export class DataIngestionService {
  private activeConnections: Map<string, any> = new Map();

  async initialize(): Promise<void> {
    logger.info('🔄 Initializing Data Ingestion Service...');
    logger.info('✅ Data Ingestion Service initialized');
  }

  async ingestData(step: IngestionStep, parameters: Record<string, any>): Promise<ProcessingResult> {
    const startTime = Date.now();
    logger.info(`📥 Starting data ingestion from ${step.source.type}: ${step.source.connection.url || step.source.connection.host}`);

    try {
      switch (step.source.type) {
        case 'api':
          return await this.ingestFromApi(step.source, parameters);
        case 'file':
          return await this.ingestFromFile(step.source, parameters);
        case 'database':
          return await this.ingestFromDatabase(step.source, parameters);
        case 'stream':
          return await this.ingestFromStream(step.source, parameters);
        default:
          throw new Error(`Unsupported ingestion type: ${step.source.type}`);
      }
    } catch (error) {
      const processingTime = Date.now() - startTime;
      logger.error(`❌ Data ingestion failed after ${processingTime}ms:`, error);

      return {
        recordsProcessed: 0,
        recordsSuccessful: 0,
        recordsFailed: 0,
        errors: [{
          record: 0,
          message: error instanceof Error ? error.message : 'Unknown error',
          code: 'INGESTION_ERROR'
        }],
        metadata: { processingTimeMs: processingTime }
      };
    }
  }

  private async ingestFromApi(source: DataSource, parameters: Record<string, any>): Promise<ProcessingResult> {
    const config = {
      method: 'GET',
      url: source.connection.url,
      headers: source.connection.headers || {},
      params: { ...source.connection.params, ...parameters },
    };

    if (source.connection.apiKey) {
      config.headers['Authorization'] = `Bearer ${source.connection.apiKey}`;
    }

    const response = await axios(config);
    const data = response.data;

    // Normalize the response shape into an array of records
    let records: any[] = [];

    if (Array.isArray(data)) {
      records = data;
    } else if (data.data && Array.isArray(data.data)) {
      records = data.data;
    } else if (data.results && Array.isArray(data.results)) {
      records = data.results;
    } else {
      records = [data];
    }

    logger.info(`📊 Ingested ${records.length} records from API: ${source.connection.url}`);

    return {
      recordsProcessed: records.length,
      recordsSuccessful: records.length,
      recordsFailed: 0,
      errors: [],
      metadata: {
        source: 'api',
        url: source.connection.url,
        statusCode: response.status,
        responseSize: JSON.stringify(data).length
      }
    };
  }

  private async ingestFromFile(source: DataSource, parameters: Record<string, any>): Promise<ProcessingResult> {
    const filePath = source.connection.url || parameters.filePath;

    if (!filePath) {
      throw new Error('File path is required for file ingestion');
    }

    switch (source.format) {
      case 'csv':
        return await this.ingestCsvFile(filePath);
      case 'json':
        return await this.ingestJsonFile(filePath);
      default:
        throw new Error(`Unsupported file format: ${source.format}`);
    }
  }

  private async ingestCsvFile(filePath: string): Promise<ProcessingResult> {
    return new Promise((resolve, reject) => {
      const records: any[] = [];
      const errors: any[] = [];
      let recordCount = 0;

      fs.createReadStream(filePath)
        .pipe(csv())
        .on('data', (data) => {
          recordCount++;
          try {
            records.push(data);
          } catch (error) {
            errors.push({
              record: recordCount,
              message: error instanceof Error ? error.message : 'Parse error',
              code: 'CSV_PARSE_ERROR'
            });
          }
        })
        .on('end', () => {
          logger.info(`📊 Ingested ${records.length} records from CSV: ${filePath}`);
          resolve({
            recordsProcessed: recordCount,
            recordsSuccessful: records.length,
            recordsFailed: errors.length,
            errors,
            metadata: {
              source: 'file',
              format: 'csv',
              filePath
            }
          });
        })
        .on('error', reject);
    });
  }

  private async ingestJsonFile(filePath: string): Promise<ProcessingResult> {
    const fileContent = await fs.promises.readFile(filePath, 'utf8');
    const data = JSON.parse(fileContent);

    let records: any[] = [];

    if (Array.isArray(data)) {
      records = data;
    } else {
      records = [data];
    }

    logger.info(`📊 Ingested ${records.length} records from JSON: ${filePath}`);

    return {
      recordsProcessed: records.length,
      recordsSuccessful: records.length,
      recordsFailed: 0,
      errors: [],
      metadata: {
        source: 'file',
        format: 'json',
        filePath,
        fileSize: fileContent.length
      }
    };
  }

  private async ingestFromDatabase(source: DataSource, parameters: Record<string, any>): Promise<ProcessingResult> {
    // Placeholder for database ingestion
    // In a real implementation, this would connect to various databases
    // (PostgreSQL, MySQL, MongoDB, etc.) and execute queries
    throw new Error('Database ingestion not yet implemented');
  }

  private async ingestFromStream(source: DataSource, parameters: Record<string, any>): Promise<ProcessingResult> {
    // Placeholder for stream ingestion
    // In a real implementation, this would connect to streaming sources
    // (Kafka, Kinesis, WebSocket, etc.)
    throw new Error('Stream ingestion not yet implemented');
  }

  async getIngestionMetrics(): Promise<any> {
    return {
      activeConnections: this.activeConnections.size,
      supportedSources: ['api', 'file', 'database', 'stream'],
      supportedFormats: ['json', 'csv', 'xml', 'parquet', 'avro']
    };
  }
}
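The response handling in `ingestFromApi` accepts four common payload shapes: a bare array, an envelope under `data`, an envelope under `results`, or a single object. A standalone sketch of that unwrapping logic (the `extractRecords` name and sample payloads are illustrative, not part of the service's API):

```typescript
// Normalize the common API payload shapes into a flat array of records,
// mirroring the branching used in ingestFromApi.
function extractRecords(payload: unknown): any[] {
  if (Array.isArray(payload)) return payload;          // bare array
  const obj = payload as any;
  if (obj && Array.isArray(obj.data)) return obj.data;       // { data: [...] } envelope
  if (obj && Array.isArray(obj.results)) return obj.results; // { results: [...] } envelope
  return [payload];                                    // single object, wrapped
}

console.log(extractRecords([{ symbol: 'AAPL' }]).length);      // 1
console.log(extractRecords({ data: [1, 2, 3] }));              // [1, 2, 3]
console.log(extractRecords({ results: [{ symbol: 'MSFT' }] }).length); // 1
console.log(extractRecords({ symbol: 'TSLA' }).length);        // 1 (wrapped)
```

Because unrecognized shapes fall through to the single-object branch, the caller always receives an array and never has to special-case the provider's envelope.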
@ -0,0 +1,373 @@
import { logger } from '@stock-bot/utils';
import { QualityCheckStep, ProcessingResult, QualityCheck, QualityThresholds } from '../types/DataPipeline';

export class DataQualityService {
  private qualityMetrics: Map<string, any> = new Map();
  private qualityReports: Map<string, any> = new Map();

  async initialize(): Promise<void> {
    logger.info('🔄 Initializing Data Quality Service...');

    // Initialize quality metrics storage
    this.qualityMetrics.clear();
    this.qualityReports.clear();

    logger.info('✅ Data Quality Service initialized');
  }

  async runQualityChecks(step: QualityCheckStep, parameters: Record<string, any>): Promise<ProcessingResult> {
    const startTime = Date.now();
    logger.info(`🔍 Running ${step.checks.length} quality checks`);

    const inputData = parameters.inputData || [];
    const results: any[] = [];
    const errors: any[] = [];
    let totalScore = 0;

    try {
      for (const check of step.checks) {
        const checkResult = await this.executeQualityCheck(check, inputData);
        results.push(checkResult);
        totalScore += checkResult.score;

        // Check the quality score against the configured thresholds
        if (checkResult.score < step.thresholds.error) {
          errors.push({
            record: 0,
            field: check.field,
            message: `Quality check failed: ${check.name} scored ${checkResult.score}%, below error threshold ${step.thresholds.error}%`,
            code: 'QUALITY_CHECK_ERROR'
          });
        } else if (checkResult.score < step.thresholds.warning) {
          logger.warn(`⚠️ Quality warning: ${check.name} scored ${checkResult.score}%, below warning threshold ${step.thresholds.warning}%`);
        }
      }

      const averageScore = totalScore / step.checks.length;
      const processingTime = Date.now() - startTime;

      // Store quality metrics
      this.storeQualityMetrics({
        timestamp: new Date(),
        averageScore,
        checksRun: step.checks.length,
        results,
        processingTimeMs: processingTime
      });

      logger.info(`🔍 Quality checks completed: ${averageScore.toFixed(2)}% average score in ${processingTime}ms`);

      return {
        recordsProcessed: inputData.length,
        recordsSuccessful: errors.length === 0 ? inputData.length : 0,
        recordsFailed: errors.length > 0 ? inputData.length : 0,
        errors,
        metadata: {
          qualityScore: averageScore,
          checksRun: step.checks.length,
          results,
          processingTimeMs: processingTime
        }
      };

    } catch (error) {
      const processingTime = Date.now() - startTime;
      logger.error(`❌ Quality checks failed after ${processingTime}ms:`, error);

      return {
        recordsProcessed: inputData.length,
        recordsSuccessful: 0,
        recordsFailed: inputData.length,
        errors: [{
          record: 0,
          message: error instanceof Error ? error.message : 'Unknown quality check error',
          code: 'QUALITY_SERVICE_ERROR'
        }],
        metadata: { processingTimeMs: processingTime }
      };
    }
  }

  private async executeQualityCheck(check: QualityCheck, data: any[]): Promise<any> {
    switch (check.type) {
      case 'completeness':
        return this.checkCompleteness(check, data);
      case 'accuracy':
        return this.checkAccuracy(check, data);
      case 'consistency':
        return this.checkConsistency(check, data);
      case 'validity':
        return this.checkValidity(check, data);
      case 'uniqueness':
        return this.checkUniqueness(check, data);
      default:
        throw new Error(`Unsupported quality check type: ${check.type}`);
    }
  }

  private checkCompleteness(check: QualityCheck, data: any[]): any {
    if (!check.field) {
      throw new Error('Completeness check requires a field');
    }

    const totalRecords = data.length;
    const completeRecords = data.filter(record => {
      const value = this.getFieldValue(record, check.field!);
      return value !== null && value !== undefined && value !== '';
    }).length;

    const score = totalRecords > 0 ? (completeRecords / totalRecords) * 100 : 100;

    return {
      checkName: check.name,
      type: 'completeness',
      field: check.field,
      score,
      passed: score >= check.threshold,
      details: {
        totalRecords,
        completeRecords,
        missingRecords: totalRecords - completeRecords
      }
    };
  }

  private checkAccuracy(check: QualityCheck, data: any[]): any {
    // Placeholder for accuracy checks
    // In a real implementation, this would validate data against known references
    // or business rules specific to stock market data
    const score = 95; // Mock score

    return {
      checkName: check.name,
      type: 'accuracy',
      field: check.field,
      score,
      passed: score >= check.threshold,
      details: {
        validatedRecords: data.length,
        accurateRecords: Math.floor(data.length * 0.95)
      }
    };
  }

  private checkConsistency(check: QualityCheck, data: any[]): any {
    if (!check.field) {
      throw new Error('Consistency check requires a field');
    }

    // Check for consistent data types and formats
    const fieldValues = data.map(record => this.getFieldValue(record, check.field!));
    const types = [...new Set(fieldValues.map(val => typeof val))];

    // For stock symbols, check consistent format
    if (check.field === 'symbol') {
      const validSymbols = fieldValues.filter(symbol =>
        typeof symbol === 'string' && /^[A-Z]{1,5}$/.test(symbol)
      ).length;

      const score = fieldValues.length > 0 ? (validSymbols / fieldValues.length) * 100 : 100;

      return {
        checkName: check.name,
        type: 'consistency',
        field: check.field,
        score,
        passed: score >= check.threshold,
        details: {
          totalValues: fieldValues.length,
          consistentValues: validSymbols,
          inconsistentValues: fieldValues.length - validSymbols
        }
      };
    }

    // Generic consistency check: all values must share one data type
    const score = types.length === 1 ? 100 : 0;

    return {
      checkName: check.name,
      type: 'consistency',
      field: check.field,
      score,
      passed: score >= check.threshold,
      details: {
        dataTypes: types,
        isConsistent: types.length === 1
      }
    };
  }

  private checkValidity(check: QualityCheck, data: any[]): any {
    if (!check.field) {
      throw new Error('Validity check requires a field');
    }

    let validRecords = 0;
    const totalRecords = data.length;

    for (const record of data) {
      const value = this.getFieldValue(record, check.field);

      if (this.isValidValue(check.field, value)) {
        validRecords++;
      }
    }

    const score = totalRecords > 0 ? (validRecords / totalRecords) * 100 : 100;

    return {
      checkName: check.name,
      type: 'validity',
      field: check.field,
      score,
      passed: score >= check.threshold,
      details: {
        totalRecords,
        validRecords,
        invalidRecords: totalRecords - validRecords
      }
    };
  }

  private checkUniqueness(check: QualityCheck, data: any[]): any {
    if (!check.field) {
      throw new Error('Uniqueness check requires a field');
    }

    const fieldValues = data.map(record => this.getFieldValue(record, check.field!));
    const uniqueValues = new Set(fieldValues);

    const score = fieldValues.length > 0 ? (uniqueValues.size / fieldValues.length) * 100 : 100;

    return {
      checkName: check.name,
      type: 'uniqueness',
      field: check.field,
      score,
      passed: score >= check.threshold,
      details: {
        totalValues: fieldValues.length,
        uniqueValues: uniqueValues.size,
        duplicateValues: fieldValues.length - uniqueValues.size
      }
    };
  }

  private getFieldValue(record: any, fieldPath: string): any {
    return fieldPath.split('.').reduce((obj, field) => obj?.[field], record);
  }

  private isValidValue(field: string, value: any): boolean {
    switch (field) {
      case 'symbol':
        return typeof value === 'string' && /^[A-Z]{1,5}$/.test(value);
      case 'price':
        return typeof value === 'number' && value > 0 && value < 1000000;
      case 'volume':
        return typeof value === 'number' && value >= 0 && Number.isInteger(value);
      case 'timestamp':
        return value instanceof Date || !isNaN(new Date(value).getTime());
      default:
        return value !== null && value !== undefined;
    }
  }

  private storeQualityMetrics(metrics: any): void {
    const key = `metrics_${Date.now()}`;
    this.qualityMetrics.set(key, metrics);

    // Keep only the last 100 metrics entries
    if (this.qualityMetrics.size > 100) {
      const oldestKey = this.qualityMetrics.keys().next().value;
      if (oldestKey !== undefined) {
        this.qualityMetrics.delete(oldestKey);
      }
    }
  }

  async getQualityMetrics(dataset?: string): Promise<any> {
    const allMetrics = Array.from(this.qualityMetrics.values());

    if (allMetrics.length === 0) {
      return {
        totalChecks: 0,
        averageScore: 0,
        recentResults: []
      };
    }

    const totalChecks = allMetrics.reduce((sum, m) => sum + m.checksRun, 0);
    const averageScore = allMetrics.reduce((sum, m) => sum + m.averageScore, 0) / allMetrics.length;
    const recentResults = allMetrics.slice(-10);

    return {
      totalChecks,
      averageScore: Math.round(averageScore * 100) / 100,
      recentResults,
      summary: {
        totalRuns: allMetrics.length,
        averageProcessingTime: allMetrics.reduce((sum, m) => sum + m.processingTimeMs, 0) / allMetrics.length
      }
    };
  }

  async generateReport(dataset: string): Promise<any> {
    const metrics = await this.getQualityMetrics(dataset);

    const report = {
      dataset,
      generatedAt: new Date(),
      summary: metrics,
      recommendations: this.generateRecommendations(metrics),
      trends: this.analyzeTrends(metrics.recentResults)
    };

    this.qualityReports.set(dataset, report);

    return report;
  }

  private generateRecommendations(metrics: any): string[] {
    const recommendations: string[] = [];

    if (metrics.averageScore < 80) {
      recommendations.push('Overall data quality is below acceptable threshold. Review data ingestion processes.');
    }

    if (metrics.averageScore < 95 && metrics.averageScore >= 80) {
      recommendations.push('Data quality is acceptable but could be improved. Consider implementing additional validation rules.');
    }

    if (metrics.totalChecks === 0) {
      recommendations.push('No quality checks have been run. Implement quality monitoring for your data pipelines.');
    }

    return recommendations;
  }

  private analyzeTrends(recentResults: any[]): any {
    if (recentResults.length < 2) {
      return { trend: 'insufficient_data', message: 'Not enough data to analyze trends' };
    }

    const scores = recentResults.map(r => r.averageScore);
    const latestScore = scores[scores.length - 1];
    const previousScore = scores[scores.length - 2];

    if (latestScore > previousScore) {
      return { trend: 'improving', message: 'Data quality is improving' };
    } else if (latestScore < previousScore) {
      return { trend: 'declining', message: 'Data quality is declining' };
    } else {
      return { trend: 'stable', message: 'Data quality is stable' };
    }
  }

  async getAvailableReports(): Promise<string[]> {
    return Array.from(this.qualityReports.keys());
  }

  async getReport(dataset: string): Promise<any | null> {
    return this.qualityReports.get(dataset) || null;
  }
}
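Every quality check above reduces to the same arithmetic: the fraction of "good" values expressed as a 0–100 score, with an empty dataset scoring 100 by convention. A self-contained sketch of the completeness and uniqueness scoring, including the dot-path field lookup the service uses (helper names and sample quotes here are illustrative, not the service's API):

```typescript
// Dot-path lookup, as in the service's getFieldValue: 'a.b.c' walks nested objects.
const getField = (record: any, path: string): any =>
  path.split('.').reduce((obj, key) => obj?.[key], record);

// Completeness: share of records whose field is neither null, undefined, nor ''.
function completenessScore(data: any[], field: string): number {
  const complete = data.filter(r => {
    const v = getField(r, field);
    return v !== null && v !== undefined && v !== '';
  }).length;
  return data.length > 0 ? (complete / data.length) * 100 : 100;
}

// Uniqueness: share of distinct values among all values of the field.
function uniquenessScore(data: any[], field: string): number {
  const values = data.map(r => getField(r, field));
  return values.length > 0 ? (new Set(values).size / values.length) * 100 : 100;
}

const quotes = [
  { symbol: 'AAPL', price: 191.2 },
  { symbol: 'MSFT', price: null },
  { symbol: 'AAPL', price: 402.5 },
  { symbol: 'GOOG', price: 140.1 },
];
console.log(completenessScore(quotes, 'price'));  // 75 (3 of 4 prices present)
console.log(uniquenessScore(quotes, 'symbol'));   // 75 (3 distinct of 4 symbols)
```

The "empty input scores 100" convention matters: it keeps a pipeline stage with no records from tripping the error threshold and failing the whole run.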
@ -0,0 +1,290 @@
import { logger } from '@stock-bot/utils';
import { TransformationStep, ProcessingResult } from '../types/DataPipeline';

export class DataTransformationService {
  private transformationFunctions: Map<string, Function> = new Map();

  async initialize(): Promise<void> {
    logger.info('🔄 Initializing Data Transformation Service...');

    // Register built-in transformation functions
    this.registerBuiltInTransformations();

    logger.info('✅ Data Transformation Service initialized');
  }

  async transformData(step: TransformationStep, parameters: Record<string, any>): Promise<ProcessingResult> {
    const startTime = Date.now();
    logger.info(`🔄 Starting data transformation: ${step.type}`);

    try {
      switch (step.type) {
        case 'javascript':
          return await this.executeJavaScriptTransformation(step, parameters);
        case 'sql':
          return await this.executeSqlTransformation(step, parameters);
        case 'custom':
          return await this.executeCustomTransformation(step, parameters);
        default:
          throw new Error(`Unsupported transformation type: ${step.type}`);
      }
    } catch (error) {
      const processingTime = Date.now() - startTime;
      logger.error(`❌ Data transformation failed after ${processingTime}ms:`, error);

      return {
        recordsProcessed: 0,
        recordsSuccessful: 0,
        recordsFailed: 0,
        errors: [{
          record: 0,
          message: error instanceof Error ? error.message : 'Unknown error',
          code: 'TRANSFORMATION_ERROR'
        }],
        metadata: { processingTimeMs: processingTime }
      };
    }
  }

  private async executeJavaScriptTransformation(step: TransformationStep, parameters: Record<string, any>): Promise<ProcessingResult> {
    const { code, inputData } = step.configuration;

    if (!code || !inputData) {
      throw new Error('JavaScript transformation requires code and inputData configuration');
    }

    const transformedRecords: any[] = [];
    const errors: any[] = [];
    let recordCount = 0;

    // Execute transformation for each record
    for (const record of inputData) {
      recordCount++;

      try {
        // Build the execution context for the user-supplied code.
        // Note: new Function is NOT a sandbox; the code runs with full
        // process access, so only trusted transformation code should be used.
        const context = {
          record,
          parameters,
          utils: this.getTransformationUtils(),
        };

        // Execute the transformation code
        const transformFunction = new Function('context', `
          const { record, parameters, utils } = context;
          ${code}
        `);

        const result = transformFunction(context);

        if (result !== undefined) {
          transformedRecords.push(result);
        } else {
          transformedRecords.push(record); // Keep original if no transformation result
        }
      } catch (error) {
        errors.push({
          record: recordCount,
          message: error instanceof Error ? error.message : 'Transformation error',
          code: 'JS_TRANSFORM_ERROR'
        });
      }
    }

    logger.info(`🔄 Transformed ${transformedRecords.length} records using JavaScript`);

    return {
      recordsProcessed: recordCount,
      recordsSuccessful: transformedRecords.length,
      recordsFailed: errors.length,
      errors,
      metadata: {
        transformationType: 'javascript',
        outputData: transformedRecords
      }
    };
  }

  private async executeSqlTransformation(step: TransformationStep, parameters: Record<string, any>): Promise<ProcessingResult> {
    // Placeholder for SQL transformation
    // In a real implementation, this would execute SQL queries against a data warehouse
    // or in-memory SQL engine like DuckDB
    throw new Error('SQL transformation not yet implemented');
  }

  private async executeCustomTransformation(step: TransformationStep, parameters: Record<string, any>): Promise<ProcessingResult> {
    const { functionName, inputData } = step.configuration;

    if (!functionName) {
      throw new Error('Custom transformation requires functionName configuration');
    }

    const transformFunction = this.transformationFunctions.get(functionName);
    if (!transformFunction) {
      throw new Error(`Custom transformation function not found: ${functionName}`);
    }

    const result = await transformFunction(inputData, parameters);

    logger.info(`🔄 Executed custom transformation: ${functionName}`);

    return result;
  }

  private registerBuiltInTransformations(): void {
    // Market data normalization
    this.transformationFunctions.set('normalizeMarketData', (data: any[], parameters: any) => {
      const normalized = data.map(record => ({
        symbol: record.symbol?.toUpperCase(),
        price: parseFloat(record.price) || 0,
        volume: parseInt(record.volume, 10) || 0,
        timestamp: new Date(record.timestamp || Date.now()),
        source: parameters.source || 'unknown'
      }));

      return {
        recordsProcessed: data.length,
        recordsSuccessful: normalized.length,
        recordsFailed: 0,
        errors: [],
        metadata: { outputData: normalized }
      };
    });

    // Financial data aggregation
    this.transformationFunctions.set('aggregateFinancialData', (data: any[], parameters: any) => {
      const { groupBy = 'symbol', aggregations = ['avg', 'sum'] } = parameters;

      const grouped = data.reduce((acc, record) => {
        const key = record[groupBy];
        if (!acc[key]) {
          acc[key] = [];
        }
        acc[key].push(record);
        return acc;
      }, {});

      const aggregated = Object.entries(grouped).map(([key, records]: [string, any[]]) => {
        const result: any = { [groupBy]: key };

        if (aggregations.includes('avg')) {
          result.avgPrice = records.reduce((sum, r) => sum + (r.price || 0), 0) / records.length;
        }

        if (aggregations.includes('sum')) {
          result.totalVolume = records.reduce((sum, r) => sum + (r.volume || 0), 0);
        }

        if (aggregations.includes('count')) {
          result.count = records.length;
        }

        return result;
      });

      return {
        recordsProcessed: data.length,
        recordsSuccessful: aggregated.length,
        recordsFailed: 0,
        errors: [],
        metadata: { outputData: aggregated }
      };
    });

    // Data cleaning
    this.transformationFunctions.set('cleanData', (data: any[], parameters: any) => {
      const { removeNulls = true, trimStrings = true, validateNumbers = true } = parameters;
      const cleaned: any[] = [];
      const errors: any[] = [];

      data.forEach((record, index) => {
        try {
          let cleanRecord = { ...record };

          if (removeNulls) {
            Object.keys(cleanRecord).forEach(key => {
              if (cleanRecord[key] === null || cleanRecord[key] === undefined) {
                delete cleanRecord[key];
              }
            });
          }

          if (trimStrings) {
            Object.keys(cleanRecord).forEach(key => {
              if (typeof cleanRecord[key] === 'string') {
                cleanRecord[key] = cleanRecord[key].trim();
              }
            });
          }

          if (validateNumbers) {
            Object.keys(cleanRecord).forEach(key => {
              if (typeof cleanRecord[key] === 'string' && !isNaN(Number(cleanRecord[key]))) {
                cleanRecord[key] = Number(cleanRecord[key]);
              }
            });
          }

          cleaned.push(cleanRecord);
        } catch (error) {
          errors.push({
            record: index + 1,
            message: error instanceof Error ? error.message : 'Cleaning error',
            code: 'DATA_CLEANING_ERROR'
          });
        }
      });

      return {
        recordsProcessed: data.length,
        recordsSuccessful: cleaned.length,
        recordsFailed: errors.length,
        errors,
        metadata: { outputData: cleaned }
      };
    });
  }

  private getTransformationUtils() {
    return {
      // Date utilities
      formatDate: (date: Date | string, format: string = 'ISO') => {
        const d = new Date(date);
        switch (format) {
          case 'ISO':
            return d.toISOString();
          case 'YYYY-MM-DD':
            return d.toISOString().split('T')[0];
          default:
            return d.toString();
        }
      },

      // Number utilities
      round: (num: number, decimals: number = 2) => {
        return Math.round(num * Math.pow(10, decimals)) / Math.pow(10, decimals);
      },

      // String utilities
      slugify: (str: string) => {
        return str.toLowerCase().replace(/[^a-z0-9]/g, '-').replace(/-+/g, '-');
      },

      // Market data utilities
      calculatePercentageChange: (current: number, previous: number) => {
        if (previous === 0) return 0;
        return ((current - previous) / previous) * 100;
      }
    };
  }

  registerCustomTransformation(name: string, func: Function): void {
    this.transformationFunctions.set(name, func);
    logger.info(`✅ Registered custom transformation: ${name}`);
  }

  getAvailableTransformations(): string[] {
    return Array.from(this.transformationFunctions.keys());
  }
}
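The `aggregateFinancialData` built-in groups records by a key and computes per-group aggregates. A standalone sketch of that group-then-reduce shape, extracted from the service for illustration (the sample ticks are invented):

```typescript
// Group records by a key field, then compute avg price, total volume, and
// count per group, mirroring the aggregateFinancialData built-in.
function aggregateFinancialData(
  data: any[],
  groupBy = 'symbol',
  aggregations: string[] = ['avg', 'sum'],
): any[] {
  const grouped: Record<string, any[]> = {};
  for (const record of data) {
    (grouped[record[groupBy]] ??= []).push(record);
  }

  return Object.entries(grouped).map(([key, records]) => {
    const result: any = { [groupBy]: key };
    if (aggregations.includes('avg')) {
      result.avgPrice = records.reduce((s, r) => s + (r.price || 0), 0) / records.length;
    }
    if (aggregations.includes('sum')) {
      result.totalVolume = records.reduce((s, r) => s + (r.volume || 0), 0);
    }
    if (aggregations.includes('count')) {
      result.count = records.length;
    }
    return result;
  });
}

const ticks = [
  { symbol: 'AAPL', price: 190, volume: 100 },
  { symbol: 'AAPL', price: 192, volume: 300 },
  { symbol: 'MSFT', price: 400, volume: 50 },
];
console.log(aggregateFinancialData(ticks, 'symbol', ['avg', 'sum', 'count']));
// [{ symbol: 'AAPL', avgPrice: 191, totalVolume: 400, count: 2 },
//  { symbol: 'MSFT', avgPrice: 400, totalVolume: 50, count: 1 }]
```

Note the group key becomes a string when used as an object key, which is fine for symbols but would need care if grouping by a numeric or date field.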
@ -0,0 +1,303 @@
import { logger } from '@stock-bot/utils';
import { ValidationStep, ProcessingResult, ValidationRule } from '../types/DataPipeline';
import Joi from 'joi';

export class DataValidationService {
  private validators: Map<string, Function> = new Map();

  async initialize(): Promise<void> {
    logger.info('🔄 Initializing Data Validation Service...');

    // Register built-in validators
    this.registerBuiltInValidators();

    logger.info('✅ Data Validation Service initialized');
  }

  async validateData(step: ValidationStep, parameters: Record<string, any>): Promise<ProcessingResult> {
    const startTime = Date.now();
    logger.info(`✅ Starting data validation with ${step.rules.length} rules`);

    const inputData = parameters.inputData || [];
    const validRecords: any[] = [];
    const errors: any[] = [];
    let recordCount = 0;

    try {
      for (const record of inputData) {
        recordCount++;
        const recordErrors: any[] = [];

        // Apply all validation rules to this record
        for (const rule of step.rules) {
          try {
            const isValid = await this.applyValidationRule(record, rule);
            if (!isValid) {
              recordErrors.push({
                record: recordCount,
                field: rule.field,
                message: rule.message || `Validation failed for rule: ${rule.type}`,
                code: `VALIDATION_${rule.type.toUpperCase()}_FAILED`
              });
            }
          } catch (error) {
            recordErrors.push({
              record: recordCount,
              field: rule.field,
              message: error instanceof Error ? error.message : 'Validation error',
              code: 'VALIDATION_ERROR'
            });
          }
        }

        if (recordErrors.length === 0) {
          validRecords.push(record);
        } else {
          errors.push(...recordErrors);

          // Handle validation failure based on strategy
          if (step.onFailure === 'stop') {
            break;
          }
        }
      }

      const processingTime = Date.now() - startTime;
      logger.info(`✅ Validation completed: ${validRecords.length}/${recordCount} records valid in ${processingTime}ms`);

      return {
        recordsProcessed: recordCount,
        recordsSuccessful: validRecords.length,
        recordsFailed: recordCount - validRecords.length,
        errors,
        metadata: {
          validationRules: step.rules.length,
          onFailure: step.onFailure,
          processingTimeMs: processingTime,
          outputData: validRecords
        }
      };

    } catch (error) {
      const processingTime = Date.now() - startTime;
      logger.error(`❌ Data validation failed after ${processingTime}ms:`, error);

      return {
        recordsProcessed: recordCount,
        recordsSuccessful: 0,
        recordsFailed: recordCount,
        errors: [{
          record: 0,
          message: error instanceof Error ? error.message : 'Unknown validation error',
          code: 'VALIDATION_SERVICE_ERROR'
        }],
        metadata: { processingTimeMs: processingTime }
      };
    }
  }

  private async applyValidationRule(record: any, rule: ValidationRule): Promise<boolean> {
    const fieldValue = this.getFieldValue(record, rule.field);

    switch (rule.type) {
      case 'required':
        return this.validateRequired(fieldValue);

      case 'type':
        return this.validateType(fieldValue, rule.value);

      case 'range':
        return this.validateRange(fieldValue, rule.value);

      case 'pattern':
        return this.validatePattern(fieldValue, rule.value);

      case 'custom':
        return await this.validateCustom(record, rule);

      default:
        throw new Error(`Unknown validation rule type: ${rule.type}`);
    }
  }

  private getFieldValue(record: any, fieldPath: string): any {
    return fieldPath.split('.').reduce((obj, key) => obj?.[key], record);
  }

  private validateRequired(value: any): boolean {
    return value !== null && value !== undefined && value !== '';
  }

  private validateType(value: any, expectedType: string): boolean {
    if (value === null || value === undefined) {
      return false;
    }

    switch (expectedType) {
      case 'string':
        return typeof value === 'string';
      case 'number':
        return typeof value === 'number' && !isNaN(value);
      case 'boolean':
        return typeof value === 'boolean';
      case 'date':
        return value instanceof Date || !isNaN(Date.parse(value));
      case 'array':
        return Array.isArray(value);
      case 'object':
        return typeof value === 'object' && !Array.isArray(value);
      default:
        return false;
    }
  }

  private validateRange(value: any, range: { min?: number; max?: number }): boolean {
    if (typeof value !== 'number') {
      return false;
    }

    if (range.min !== undefined && value < range.min) {
      return false;
    }

    if (range.max !== undefined && value > range.max) {
      return false;
    }

    return true;
|
||||
}
|
||||
|
||||
private validatePattern(value: any, pattern: string): boolean {
|
||||
if (typeof value !== 'string') {
|
||||
return false;
|
||||
}
|
||||
|
||||
const regex = new RegExp(pattern);
|
||||
return regex.test(value);
|
||||
}
|
||||
|
||||
private async validateCustom(record: any, rule: ValidationRule): Promise<boolean> {
|
||||
const validatorName = rule.value as string;
|
||||
const validator = this.validators.get(validatorName);
|
||||
|
||||
if (!validator) {
|
||||
throw new Error(`Custom validator not found: ${validatorName}`);
|
||||
}
|
||||
|
||||
return await validator(record, rule.field);
|
||||
}
|
||||
|
||||
private registerBuiltInValidators(): void {
|
||||
// Stock symbol validator
|
||||
this.validators.set('stockSymbol', (record: any, field: string) => {
|
||||
const symbol = this.getFieldValue(record, field);
|
||||
if (typeof symbol !== 'string') return false;
|
||||
|
||||
// Basic stock symbol validation: 1-5 uppercase letters
|
||||
return /^[A-Z]{1,5}$/.test(symbol);
|
||||
});
|
||||
|
||||
// Price validator
|
||||
this.validators.set('stockPrice', (record: any, field: string) => {
|
||||
const price = this.getFieldValue(record, field);
|
||||
|
||||
// Must be a positive number
|
||||
return typeof price === 'number' && price > 0 && price < 1000000;
|
||||
});
|
||||
|
||||
// Volume validator
|
||||
this.validators.set('stockVolume', (record: any, field: string) => {
|
||||
const volume = this.getFieldValue(record, field);
|
||||
|
||||
// Must be a non-negative integer
|
||||
return Number.isInteger(volume) && volume >= 0;
|
||||
});
|
||||
|
||||
// Market data timestamp validator
|
||||
this.validators.set('marketTimestamp', (record: any, field: string) => {
|
||||
const timestamp = this.getFieldValue(record, field);
|
||||
|
||||
if (!timestamp) return false;
|
||||
|
||||
const date = new Date(timestamp);
|
||||
if (isNaN(date.getTime())) return false;
|
||||
|
||||
// Check if timestamp is within reasonable bounds (not too old or in future)
|
||||
const now = new Date();
|
||||
const oneYearAgo = new Date(now.getTime() - 365 * 24 * 60 * 60 * 1000);
|
||||
const oneHourInFuture = new Date(now.getTime() + 60 * 60 * 1000);
|
||||
|
||||
return date >= oneYearAgo && date <= oneHourInFuture;
|
||||
});
|
||||
|
||||
// Email validator
|
||||
this.validators.set('email', (record: any, field: string) => {
|
||||
const email = this.getFieldValue(record, field);
|
||||
if (typeof email !== 'string') return false;
|
||||
|
||||
const emailRegex = /^[^\s@]+@[^\s@]+\.[^\s@]+$/;
|
||||
return emailRegex.test(email);
|
||||
});
|
||||
|
||||
// JSON schema validator
|
||||
this.validators.set('jsonSchema', (record: any, field: string, schema?: any) => {
|
||||
if (!schema) return false;
|
||||
|
||||
try {
|
||||
const joiSchema = Joi.object(schema);
|
||||
const { error } = joiSchema.validate(record);
|
||||
return !error;
|
||||
} catch {
|
||||
return false;
|
||||
}
|
||||
});
|
||||
|
||||
// Data completeness validator
|
||||
this.validators.set('completeness', (record: any, field: string) => {
|
||||
const requiredFields = ['symbol', 'price', 'timestamp'];
|
||||
return requiredFields.every(f => {
|
||||
const value = this.getFieldValue(record, f);
|
||||
return value !== null && value !== undefined && value !== '';
|
||||
});
|
||||
});
|
||||
}
|
||||
|
||||
registerCustomValidator(name: string, validator: Function): void {
|
||||
this.validators.set(name, validator);
|
||||
logger.info(`✅ Registered custom validator: ${name}`);
|
||||
}
|
||||
|
||||
getAvailableValidators(): string[] {
|
||||
return Array.from(this.validators.keys());
|
||||
}
|
||||
|
||||
async validateSchema(data: any[], schema: any): Promise<ProcessingResult> {
|
||||
const joiSchema = Joi.array().items(Joi.object(schema));
|
||||
const { error, value } = joiSchema.validate(data);
|
||||
|
||||
if (error) {
|
||||
return {
|
||||
recordsProcessed: data.length,
|
||||
recordsSuccessful: 0,
|
||||
recordsFailed: data.length,
|
||||
errors: [{
|
||||
record: 0,
|
||||
message: error.message,
|
||||
code: 'SCHEMA_VALIDATION_FAILED'
|
||||
}],
|
||||
metadata: { schemaValidation: true }
|
||||
};
|
||||
}
|
||||
|
||||
return {
|
||||
recordsProcessed: data.length,
|
||||
recordsSuccessful: data.length,
|
||||
recordsFailed: 0,
|
||||
errors: [],
|
||||
metadata: {
|
||||
schemaValidation: true,
|
||||
outputData: value
|
||||
}
|
||||
};
|
||||
}
|
||||
}
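The validator registry above boils down to a `Map` of named predicate functions plus a dot-path field lookup. A minimal standalone sketch of that pattern (the names `validators`, `runValidator` and the example records are illustrative, not the service's real API):

```typescript
type Validator = (record: unknown, field: string) => boolean;

const validators = new Map<string, Validator>();

// Dot-path lookup, mirroring getFieldValue() in the service:
// 'quote.symbol' walks record.quote.symbol, tolerating missing links.
function getFieldValue(record: any, fieldPath: string): any {
  return fieldPath.split('.').reduce((obj, key) => obj?.[key], record);
}

// Same rule as the built-in stockSymbol validator: 1-5 uppercase letters.
validators.set('stockSymbol', (record, field) => {
  const symbol = getFieldValue(record, field);
  return typeof symbol === 'string' && /^[A-Z]{1,5}$/.test(symbol);
});

// Lookup-then-apply, as validateCustom() does (throwing on unknown names).
function runValidator(name: string, record: unknown, field: string): boolean {
  const v = validators.get(name);
  if (!v) throw new Error(`Custom validator not found: ${name}`);
  return v(record, field);
}

console.log(runValidator('stockSymbol', { quote: { symbol: 'AAPL' } }, 'quote.symbol'));  // true
console.log(runValidator('stockSymbol', { quote: { symbol: 'aapl1' } }, 'quote.symbol')); // false
```

Keeping validators behind string names is what lets a `ValidationRule` with `type: 'custom'` reference them from pipeline configuration rather than code.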
178 apps/data-services/data-processor/src/types/DataPipeline.ts (new file)
@@ -0,0 +1,178 @@
// Data Pipeline Types

export interface DataPipeline {
  id: string;
  name: string;
  description?: string;
  status: PipelineStatus;
  steps: PipelineSteps;
  schedule?: PipelineSchedule;
  metadata: Record<string, any>;
  createdAt: Date;
  updatedAt: Date;
}

export enum PipelineStatus {
  DRAFT = 'draft',
  ACTIVE = 'active',
  PAUSED = 'paused',
  DISABLED = 'disabled',
}

export interface PipelineSteps {
  ingestion?: IngestionStep;
  transformation?: TransformationStep;
  validation?: ValidationStep;
  qualityChecks?: QualityCheckStep;
}

export interface IngestionStep {
  type: 'api' | 'file' | 'database' | 'stream';
  source: DataSource;
  configuration: Record<string, any>;
  retryPolicy?: RetryPolicy;
}

export interface TransformationStep {
  type: 'sql' | 'javascript' | 'python' | 'custom';
  configuration: Record<string, any>;
  schema?: DataSchema;
}

export interface ValidationStep {
  rules: ValidationRule[];
  onFailure: 'stop' | 'continue' | 'alert';
}

export interface QualityCheckStep {
  checks: QualityCheck[];
  thresholds: QualityThresholds;
}

export interface PipelineSchedule {
  cronExpression: string;
  enabled: boolean;
  lastRun: Date | null;
  nextRun: Date | null;
}

// Job Types

export interface PipelineJob {
  id: string;
  pipelineId: string;
  status: JobStatus;
  parameters: Record<string, any>;
  createdAt: Date;
  startedAt: Date | null;
  completedAt: Date | null;
  error: string | null;
  metrics: JobMetrics;
}

export enum JobStatus {
  PENDING = 'pending',
  RUNNING = 'running',
  COMPLETED = 'completed',
  FAILED = 'failed',
  CANCELLED = 'cancelled',
}

export interface JobMetrics {
  recordsProcessed: number;
  recordsSuccessful: number;
  recordsFailed: number;
  processingTimeMs: number;
}

// Data Source Types

export interface DataSource {
  type: 'api' | 'file' | 'database' | 'stream';
  connection: ConnectionConfig;
  format?: 'json' | 'csv' | 'xml' | 'parquet' | 'avro';
}

export interface ConnectionConfig {
  url?: string;
  host?: string;
  port?: number;
  database?: string;
  username?: string;
  password?: string;
  apiKey?: string;
  headers?: Record<string, string>;
  params?: Record<string, any>;
}

// Schema Types

export interface DataSchema {
  fields: SchemaField[];
  constraints?: SchemaConstraint[];
}

export interface SchemaField {
  name: string;
  type: 'string' | 'number' | 'boolean' | 'date' | 'object' | 'array';
  required: boolean;
  nullable: boolean;
  format?: string;
  description?: string;
}

export interface SchemaConstraint {
  type: 'unique' | 'reference' | 'range' | 'pattern';
  field: string;
  value: any;
}

// Validation Types

export interface ValidationRule {
  field: string;
  type: 'required' | 'type' | 'range' | 'pattern' | 'custom';
  value: any;
  message?: string;
}

// Quality Check Types

export interface QualityCheck {
  name: string;
  type: 'completeness' | 'accuracy' | 'consistency' | 'validity' | 'uniqueness';
  field?: string;
  condition: string;
  threshold: number;
}

export interface QualityThresholds {
  error: number;   // 0-100 percentage
  warning: number; // 0-100 percentage
}

// Processing Result Types

export interface ProcessingResult {
  recordsProcessed: number;
  recordsSuccessful: number;
  recordsFailed: number;
  errors: ProcessingError[];
  metadata: Record<string, any>;
}

export interface ProcessingError {
  record: number;
  field?: string;
  message: string;
  code?: string;
}

// Retry Policy Types

export interface RetryPolicy {
  maxAttempts: number;
  backoffStrategy: 'fixed' | 'exponential' | 'linear';
  initialDelay: number;
  maxDelay: number;
}
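The types above compose into a single declarative pipeline object. A hypothetical pipeline definition matching those shapes (ids, URL, and connection details are made up for illustration):

```typescript
// An 'api' ingestion step with exponential-backoff retries, followed by a
// validation step that references the built-in 'stockSymbol' custom validator.
const pricePipeline = {
  id: 'pipe-001',
  name: 'daily-price-ingest',
  status: 'active',               // PipelineStatus.ACTIVE
  steps: {
    ingestion: {
      type: 'api',
      source: {
        type: 'api',
        connection: { url: 'https://example.com/prices' },
        format: 'json',
      },
      configuration: {},
      retryPolicy: {
        maxAttempts: 3,
        backoffStrategy: 'exponential',
        initialDelay: 1_000,
        maxDelay: 30_000,
      },
    },
    validation: {
      rules: [
        { field: 'symbol', type: 'custom', value: 'stockSymbol' },
        { field: 'price', type: 'range', value: { min: 0 } },
      ],
      onFailure: 'stop',          // first failing record aborts the step
    },
  },
  metadata: {},
  createdAt: new Date(),
  updatedAt: new Date(),
};

console.log(pricePipeline.steps.validation.rules.length); // 2
```

Because every step is plain data, a pipeline like this can be stored, versioned, and scheduled (via `PipelineSchedule.cronExpression`) without redeploying code.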
@@ -3,15 +3,20 @@
    "target": "ES2022",
    "module": "ESNext",
    "moduleResolution": "bundler",
    "declaration": true,
    "declarationMap": true,
    "sourceMap": true,
    "allowImportingTsExtensions": true,
    "allowSyntheticDefaultImports": true,
    "esModuleInterop": true,
    "forceConsistentCasingInFileNames": true,
    "strict": true,
    "noImplicitAny": true,
    "skipLibCheck": true,
    "outDir": "./dist",
    "rootDir": "./src",
    "resolveJsonModule": true,
    "types": ["bun-types"]
  },
  "include": ["src/**/*"],
  "exclude": ["node_modules", "dist"]
41 apps/data-services/feature-store/package.json (new file)
@@ -0,0 +1,41 @@
{
  "name": "feature-store",
  "version": "1.0.0",
  "description": "ML feature management and serving service",
  "main": "src/index.ts",
  "scripts": {
    "dev": "bun run --watch src/index.ts",
    "start": "bun run src/index.ts",
    "build": "bun build src/index.ts --outdir=dist",
    "test": "bun test",
    "lint": "eslint src/**/*.ts",
    "type-check": "tsc --noEmit"
  },
  "dependencies": {
    "@stock-bot/shared-types": "*",
    "@stock-bot/event-bus": "*",
    "@stock-bot/utils": "*",
    "@stock-bot/api-client": "*",
    "hono": "^4.6.3",
    "ioredis": "^5.4.1",
    "node-fetch": "^3.3.2",
    "date-fns": "^2.30.0",
    "lodash": "^4.17.21",
    "compression": "^1.7.4",
    "cors": "^2.8.5",
    "helmet": "^7.1.0"
  },
  "devDependencies": {
    "@types/bun": "latest",
    "@types/lodash": "^4.14.200",
    "@types/compression": "^1.7.5",
    "@types/cors": "^2.8.17",
    "typescript": "^5.3.0",
    "eslint": "^8.55.0",
    "@typescript-eslint/eslint-plugin": "^6.13.1",
    "@typescript-eslint/parser": "^6.13.1"
  },
  "peerDependencies": {
    "typescript": "^5.0.0"
  }
}
@@ -0,0 +1,220 @@
import { Context } from 'hono';
import { FeatureComputationService } from '../services/FeatureComputationService';
import { Logger } from '@stock-bot/utils';
import {
  ComputationJob,
  CreateComputationJobRequest,
  UpdateComputationJobRequest
} from '../types/FeatureStore';

export class ComputationController {
  constructor(
    private computationService: FeatureComputationService,
    private logger: Logger
  ) {}

  async createComputationJob(c: Context) {
    try {
      const request: CreateComputationJobRequest = await c.req.json();

      const job = await this.computationService.createComputationJob(request);

      this.logger.info('Computation job created', { jobId: job.id });

      return c.json({ success: true, data: job }, 201);
    } catch (error) {
      this.logger.error('Failed to create computation job', { error });
      return c.json({
        success: false,
        error: error instanceof Error ? error.message : 'Unknown error'
      }, 500);
    }
  }

  async getComputationJob(c: Context) {
    try {
      const jobId = c.req.param('id');

      const job = await this.computationService.getComputationJob(jobId);

      if (!job) {
        return c.json({ success: false, error: 'Computation job not found' }, 404);
      }

      return c.json({ success: true, data: job });
    } catch (error) {
      this.logger.error('Failed to get computation job', { error });
      return c.json({
        success: false,
        error: error instanceof Error ? error.message : 'Unknown error'
      }, 500);
    }
  }

  async updateComputationJob(c: Context) {
    try {
      const jobId = c.req.param('id');
      const request: UpdateComputationJobRequest = await c.req.json();

      const job = await this.computationService.updateComputationJob(jobId, request);

      if (!job) {
        return c.json({ success: false, error: 'Computation job not found' }, 404);
      }

      this.logger.info('Computation job updated', { jobId });

      return c.json({ success: true, data: job });
    } catch (error) {
      this.logger.error('Failed to update computation job', { error });
      return c.json({
        success: false,
        error: error instanceof Error ? error.message : 'Unknown error'
      }, 500);
    }
  }

  async deleteComputationJob(c: Context) {
    try {
      const jobId = c.req.param('id');

      await this.computationService.deleteComputationJob(jobId);

      this.logger.info('Computation job deleted', { jobId });

      return c.json({ success: true, message: 'Computation job deleted successfully' });
    } catch (error) {
      this.logger.error('Failed to delete computation job', { error });
      return c.json({
        success: false,
        error: error instanceof Error ? error.message : 'Unknown error'
      }, 500);
    }
  }

  async listComputationJobs(c: Context) {
    try {
      const featureGroupId = c.req.query('featureGroupId');
      const status = c.req.query('status');

      const jobs = await this.computationService.listComputationJobs({
        featureGroupId,
        status: status as any
      });

      return c.json({ success: true, data: jobs });
    } catch (error) {
      this.logger.error('Failed to list computation jobs', { error });
      return c.json({
        success: false,
        error: error instanceof Error ? error.message : 'Unknown error'
      }, 500);
    }
  }

  async executeComputationJob(c: Context) {
    try {
      const jobId = c.req.param('id');

      const result = await this.computationService.executeComputationJob(jobId);

      this.logger.info('Computation job executed', { jobId, result });

      return c.json({ success: true, data: result });
    } catch (error) {
      this.logger.error('Failed to execute computation job', { error });
      return c.json({
        success: false,
        error: error instanceof Error ? error.message : 'Unknown error'
      }, 500);
    }
  }

  async scheduleComputationJob(c: Context) {
    try {
      const jobId = c.req.param('id');
      const { schedule } = await c.req.json();

      await this.computationService.scheduleComputationJob(jobId, schedule);

      this.logger.info('Computation job scheduled', { jobId, schedule });

      return c.json({ success: true, message: 'Computation job scheduled successfully' });
    } catch (error) {
      this.logger.error('Failed to schedule computation job', { error });
      return c.json({
        success: false,
        error: error instanceof Error ? error.message : 'Unknown error'
      }, 500);
    }
  }

  async unscheduleComputationJob(c: Context) {
    try {
      const jobId = c.req.param('id');

      await this.computationService.unscheduleComputationJob(jobId);

      this.logger.info('Computation job unscheduled', { jobId });

      return c.json({ success: true, message: 'Computation job unscheduled successfully' });
    } catch (error) {
      this.logger.error('Failed to unschedule computation job', { error });
      return c.json({
        success: false,
        error: error instanceof Error ? error.message : 'Unknown error'
      }, 500);
    }
  }

  async getComputationJobHistory(c: Context) {
    try {
      const jobId = c.req.param('id');
      const limit = parseInt(c.req.query('limit') || '10', 10);
      const offset = parseInt(c.req.query('offset') || '0', 10);

      const history = await this.computationService.getComputationJobHistory(jobId, limit, offset);

      return c.json({ success: true, data: history });
    } catch (error) {
      this.logger.error('Failed to get computation job history', { error });
      return c.json({
        success: false,
        error: error instanceof Error ? error.message : 'Unknown error'
      }, 500);
    }
  }
}
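The history endpoint above parses `limit` and `offset` with `parseInt` and no bounds, so a caller can request an arbitrarily large page or a negative offset. A hardened pagination sketch (the helper name and the 1-100 clamp are my assumptions, not part of the source):

```typescript
// Parse and clamp pagination query parameters: default limit 10 (capped at
// 100, floored at 1), default offset 0 (never negative). Non-numeric input
// falls back to the defaults via the NaN || default pattern.
function parsePagination(query: Record<string, string | undefined>) {
  const limit = Math.min(Math.max(parseInt(query.limit ?? '10', 10) || 10, 1), 100);
  const offset = Math.max(parseInt(query.offset ?? '0', 10) || 0, 0);
  return { limit, offset };
}

console.log(parsePagination({ limit: '5000', offset: '-3' })); // { limit: 100, offset: 0 }
console.log(parsePagination({}));                              // { limit: 10, offset: 0 }
```

Clamping at the controller boundary keeps a single oversized query from turning one history request into a full table scan downstream.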
@@ -0,0 +1,226 @@
import { Context } from 'hono';
import { FeatureStoreService } from '../services/FeatureStoreService';
import { Logger } from '@stock-bot/utils';
import {
  FeatureGroup,
  CreateFeatureGroupRequest,
  UpdateFeatureGroupRequest,
  FeatureValue,
  GetFeaturesRequest
} from '../types/FeatureStore';

export class FeatureController {
  constructor(
    private featureStoreService: FeatureStoreService,
    private logger: Logger
  ) {}

  async createFeatureGroup(c: Context) {
    try {
      const request: CreateFeatureGroupRequest = await c.req.json();

      const featureGroup = await this.featureStoreService.createFeatureGroup(request);

      this.logger.info('Feature group created', { featureGroupId: featureGroup.id });

      return c.json({ success: true, data: featureGroup }, 201);
    } catch (error) {
      this.logger.error('Failed to create feature group', { error });
      return c.json({
        success: false,
        error: error instanceof Error ? error.message : 'Unknown error'
      }, 500);
    }
  }

  async getFeatureGroup(c: Context) {
    try {
      const featureGroupId = c.req.param('id');

      const featureGroup = await this.featureStoreService.getFeatureGroup(featureGroupId);

      if (!featureGroup) {
        return c.json({ success: false, error: 'Feature group not found' }, 404);
      }

      return c.json({ success: true, data: featureGroup });
    } catch (error) {
      this.logger.error('Failed to get feature group', { error });
      return c.json({
        success: false,
        error: error instanceof Error ? error.message : 'Unknown error'
      }, 500);
    }
  }

  async updateFeatureGroup(c: Context) {
    try {
      const featureGroupId = c.req.param('id');
      const request: UpdateFeatureGroupRequest = await c.req.json();

      const featureGroup = await this.featureStoreService.updateFeatureGroup(featureGroupId, request);

      if (!featureGroup) {
        return c.json({ success: false, error: 'Feature group not found' }, 404);
      }

      this.logger.info('Feature group updated', { featureGroupId });

      return c.json({ success: true, data: featureGroup });
    } catch (error) {
      this.logger.error('Failed to update feature group', { error });
      return c.json({
        success: false,
        error: error instanceof Error ? error.message : 'Unknown error'
      }, 500);
    }
  }

  async deleteFeatureGroup(c: Context) {
    try {
      const featureGroupId = c.req.param('id');

      await this.featureStoreService.deleteFeatureGroup(featureGroupId);

      this.logger.info('Feature group deleted', { featureGroupId });

      return c.json({ success: true, message: 'Feature group deleted successfully' });
    } catch (error) {
      this.logger.error('Failed to delete feature group', { error });
      return c.json({
        success: false,
        error: error instanceof Error ? error.message : 'Unknown error'
      }, 500);
    }
  }

  async listFeatureGroups(c: Context) {
    try {
      const featureGroups = await this.featureStoreService.listFeatureGroups();

      return c.json({ success: true, data: featureGroups });
    } catch (error) {
      this.logger.error('Failed to list feature groups', { error });
      return c.json({
        success: false,
        error: error instanceof Error ? error.message : 'Unknown error'
      }, 500);
    }
  }

  async getFeatures(c: Context) {
    try {
      const featureGroupId = c.req.param('id');
      const entityId = c.req.query('entityId');
      const timestamp = c.req.query('timestamp');

      if (!entityId) {
        return c.json({ success: false, error: 'entityId query parameter is required' }, 400);
      }

      const request: GetFeaturesRequest = {
        featureGroupId,
        entityId,
        timestamp: timestamp ? new Date(timestamp) : undefined
      };

      const features = await this.featureStoreService.getFeatures(request);

      return c.json({ success: true, data: features });
    } catch (error) {
      this.logger.error('Failed to get features', { error });
      return c.json({
        success: false,
        error: error instanceof Error ? error.message : 'Unknown error'
      }, 500);
    }
  }

  async storeFeatures(c: Context) {
    try {
      const featureGroupId = c.req.param('id');
      const features: FeatureValue[] = await c.req.json();

      await this.featureStoreService.storeFeatures(featureGroupId, features);

      this.logger.info('Features stored', {
        featureGroupId,
        featureCount: features.length
      });

      return c.json({ success: true, message: 'Features stored successfully' });
    } catch (error) {
      this.logger.error('Failed to store features', { error });
      return c.json({
        success: false,
        error: error instanceof Error ? error.message : 'Unknown error'
      }, 500);
    }
  }

  async getFeatureHistory(c: Context) {
    try {
      const featureGroupId = c.req.param('id');
      const featureName = c.req.param('featureName');
      const entityId = c.req.query('entityId');
      const startTime = c.req.query('startTime');
      const endTime = c.req.query('endTime');

      if (!entityId) {
        return c.json({ success: false, error: 'entityId query parameter is required' }, 400);
      }

      const history = await this.featureStoreService.getFeatureHistory(
        featureGroupId,
        featureName,
        entityId,
        startTime ? new Date(startTime) : undefined,
        endTime ? new Date(endTime) : undefined
      );

      return c.json({ success: true, data: history });
    } catch (error) {
      this.logger.error('Failed to get feature history', { error });
      return c.json({
        success: false,
        error: error instanceof Error ? error.message : 'Unknown error'
      }, 500);
    }
  }
}
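`getFeatures` accepts an optional `timestamp`, which implies a point-in-time lookup: return the latest value at or before the requested moment, so training and serving see the same feature state. A minimal sketch of that selection logic (the `TimedValue` shape and `pointInTime` helper are illustrative; the real `FeatureValue` type lives in `FeatureStore.ts`):

```typescript
interface TimedValue {
  timestamp: Date;
  value: number;
}

// Keep only values at or before `asOf`, then take the most recent one.
// Returns undefined when no value existed yet at that time.
function pointInTime(values: TimedValue[], asOf: Date): TimedValue | undefined {
  return values
    .filter(v => v.timestamp.getTime() <= asOf.getTime())
    .sort((a, b) => b.timestamp.getTime() - a.timestamp.getTime())[0];
}

const series: TimedValue[] = [
  { timestamp: new Date('2024-01-01'), value: 10 },
  { timestamp: new Date('2024-01-03'), value: 12 },
];

console.log(pointInTime(series, new Date('2024-01-02'))?.value); // 10
```

Never returning a value newer than `asOf` is what prevents label leakage when historical features are replayed for model training.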
@@ -0,0 +1,166 @@
import { Context } from 'hono';
import { Logger } from '@stock-bot/utils';

export class HealthController {
  constructor(private logger: Logger) {}

  async getHealth(c: Context) {
    try {
      const health = {
        status: 'healthy',
        timestamp: new Date().toISOString(),
        service: 'feature-store',
        version: '1.0.0',
        uptime: process.uptime(),
        memory: {
          used: Math.round((process.memoryUsage().heapUsed / 1024 / 1024) * 100) / 100,
          total: Math.round((process.memoryUsage().heapTotal / 1024 / 1024) * 100) / 100
        },
        dependencies: {
          redis: await this.checkRedisHealth(),
          database: await this.checkDatabaseHealth(),
          eventBus: await this.checkEventBusHealth()
        }
      };

      return c.json(health);
    } catch (error) {
      this.logger.error('Health check failed', { error });
      return c.json({
        status: 'unhealthy',
        timestamp: new Date().toISOString(),
        service: 'feature-store',
        error: error instanceof Error ? error.message : 'Unknown error'
      }, 500);
    }
  }

  async getReadiness(c: Context) {
    try {
      const readiness = {
        status: 'ready',
        timestamp: new Date().toISOString(),
        service: 'feature-store',
        checks: {
          onlineStore: await this.checkOnlineStoreReadiness(),
          offlineStore: await this.checkOfflineStoreReadiness(),
          metadataStore: await this.checkMetadataStoreReadiness(),
          computationEngine: await this.checkComputationEngineReadiness()
        }
      };

      const isReady = Object.values(readiness.checks).every(check => check.status === 'ready');

      return c.json(readiness, isReady ? 200 : 503);
    } catch (error) {
      this.logger.error('Readiness check failed', { error });
      return c.json({
        status: 'not_ready',
        timestamp: new Date().toISOString(),
        service: 'feature-store',
        error: error instanceof Error ? error.message : 'Unknown error'
      }, 503);
    }
  }

  async getLiveness(c: Context) {
    try {
      const liveness = {
        status: 'alive',
        timestamp: new Date().toISOString(),
        service: 'feature-store',
        pid: process.pid,
        uptime: process.uptime()
      };

      return c.json(liveness);
    } catch (error) {
      this.logger.error('Liveness check failed', { error });
      return c.json({
        status: 'dead',
        timestamp: new Date().toISOString(),
        service: 'feature-store',
        error: error instanceof Error ? error.message : 'Unknown error'
      }, 500);
    }
  }

  private async checkRedisHealth(): Promise<{ status: string; latency?: number }> {
    try {
      const start = Date.now();
      // TODO: Implement actual Redis health check (currently always healthy).
      const latency = Date.now() - start;
      return { status: 'healthy', latency };
    } catch (error) {
      return { status: 'unhealthy' };
    }
  }

  private async checkDatabaseHealth(): Promise<{ status: string; latency?: number }> {
    try {
      const start = Date.now();
      // TODO: Implement actual database health check (currently always healthy).
      const latency = Date.now() - start;
      return { status: 'healthy', latency };
    } catch (error) {
      return { status: 'unhealthy' };
    }
  }

  private async checkEventBusHealth(): Promise<{ status: string }> {
    try {
      // TODO: Implement actual event bus health check (currently always healthy).
      return { status: 'healthy' };
    } catch (error) {
      return { status: 'unhealthy' };
    }
  }

  private async checkOnlineStoreReadiness(): Promise<{ status: string; message?: string }> {
    try {
      // TODO: Implement actual online store readiness check.
      return { status: 'ready' };
    } catch (error) {
      return {
        status: 'not_ready',
        message: error instanceof Error ? error.message : 'Unknown error'
      };
    }
  }

  private async checkOfflineStoreReadiness(): Promise<{ status: string; message?: string }> {
    try {
      // TODO: Implement actual offline store readiness check.
      return { status: 'ready' };
    } catch (error) {
      return {
        status: 'not_ready',
        message: error instanceof Error ? error.message : 'Unknown error'
      };
    }
  }

  private async checkMetadataStoreReadiness(): Promise<{ status: string; message?: string }> {
    try {
      // TODO: Implement actual metadata store readiness check.
      return { status: 'ready' };
    } catch (error) {
      return {
        status: 'not_ready',
        message: error instanceof Error ? error.message : 'Unknown error'
      };
    }
  }

  private async checkComputationEngineReadiness(): Promise<{ status: string; message?: string }> {
    try {
      // TODO: Implement actual computation engine readiness check.
      return { status: 'ready' };
    } catch (error) {
      return {
        status: 'not_ready',
        message: error instanceof Error ? error.message : 'Unknown error'
      };
    }
  }
}
|
@@ -0,0 +1,123 @@
import { Context } from 'hono';
import { FeatureMonitoringService } from '../services/FeatureMonitoringService';
import { Logger } from '@stock-bot/utils';
import {
  FeatureMonitoringConfig,
  FeatureValue
} from '../types/FeatureStore';

export class MonitoringController {
  constructor(
    private monitoringService: FeatureMonitoringService,
    private logger: Logger
  ) {}

  async startMonitoring(c: Context) {
    try {
      const featureGroupId = c.req.param('id');
      const config: FeatureMonitoringConfig = await c.req.json();

      await this.monitoringService.startMonitoring(featureGroupId, config);

      this.logger.info('Monitoring started', { featureGroupId });

      return c.json({
        success: true,
        message: 'Monitoring started successfully'
      });
    } catch (error) {
      this.logger.error('Failed to start monitoring', { error });
      return c.json({
        success: false,
        error: error instanceof Error ? error.message : 'Unknown error'
      }, 500);
    }
  }

  async stopMonitoring(c: Context) {
    try {
      const featureGroupId = c.req.param('id');

      await this.monitoringService.stopMonitoring(featureGroupId);

      this.logger.info('Monitoring stopped', { featureGroupId });

      return c.json({
        success: true,
        message: 'Monitoring stopped successfully'
      });
    } catch (error) {
      this.logger.error('Failed to stop monitoring', { error });
      return c.json({
        success: false,
        error: error instanceof Error ? error.message : 'Unknown error'
      }, 500);
    }
  }

  async detectDrift(c: Context) {
    try {
      const featureGroupId = c.req.param('id');
      const recentData: FeatureValue[] = await c.req.json();

      const alerts = await this.monitoringService.detectDrift(featureGroupId, recentData);

      this.logger.info('Drift detection completed', {
        featureGroupId,
        alertsCount: alerts.length
      });

      return c.json({
        success: true,
        data: alerts
      });
    } catch (error) {
      this.logger.error('Failed to detect drift', { error });
      return c.json({
        success: false,
        error: error instanceof Error ? error.message : 'Unknown error'
      }, 500);
    }
  }

  async getMonitoringMetrics(c: Context) {
    try {
      const featureGroupId = c.req.param('id');

      const metrics = await this.monitoringService.getMonitoringMetrics(featureGroupId);

      return c.json({
        success: true,
        data: metrics
      });
    } catch (error) {
      this.logger.error('Failed to get monitoring metrics', { error });
      return c.json({
        success: false,
        error: error instanceof Error ? error.message : 'Unknown error'
      }, 500);
    }
  }

  async updateMonitoringConfig(c: Context) {
    try {
      const featureGroupId = c.req.param('id');
      const config: FeatureMonitoringConfig = await c.req.json();

      await this.monitoringService.updateMonitoringConfig(featureGroupId, config);

      this.logger.info('Monitoring config updated', { featureGroupId });

      return c.json({
        success: true,
        message: 'Monitoring configuration updated successfully'
      });
    } catch (error) {
      this.logger.error('Failed to update monitoring config', { error });
      return c.json({
        success: false,
        error: error instanceof Error ? error.message : 'Unknown error'
      }, 500);
    }
  }
}
41 apps/data-services/feature-store/src/index.ts Normal file
@@ -0,0 +1,41 @@
import { Hono } from 'hono';
import { cors } from 'hono/cors';
import { logger } from 'hono/logger';

// Controllers
import { HealthController } from './controllers/HealthController';

const app = new Hono();

// Middleware
app.use('*', cors());
app.use('*', logger());

// Initialize logger for services
const appLogger = { info: console.log, error: console.error, warn: console.warn, debug: console.log };

// Controllers
const healthController = new HealthController(appLogger);

// Health endpoints
app.get('/health', healthController.getHealth.bind(healthController));
app.get('/health/readiness', healthController.getReadiness.bind(healthController));
app.get('/health/liveness', healthController.getLiveness.bind(healthController));

// API endpoints will be implemented as services are completed
app.get('/api/v1/feature-groups', async (c) => {
  return c.json({ message: 'Feature groups endpoint - not implemented yet' });
});

app.post('/api/v1/feature-groups', async (c) => {
  return c.json({ message: 'Create feature group endpoint - not implemented yet' });
});

const port = process.env.PORT || 3003;

console.log(`Feature Store service running on port ${port}`);

export default {
  port,
  fetch: app.fetch,
};
@@ -0,0 +1,167 @@
import { logger } from '@stock-bot/utils';
import {
  FeatureComputation,
  ComputationStatus,
  ComputationError
} from '../types/FeatureStore';

export class FeatureComputationService {
  private computations: Map<string, FeatureComputation> = new Map();
  private runningComputations: Set<string> = new Set();

  async initialize(): Promise<void> {
    logger.info('🔄 Initializing Feature Computation Service...');

    this.computations.clear();
    this.runningComputations.clear();

    logger.info('✅ Feature Computation Service initialized');
  }

  async startComputation(
    featureGroupId: string,
    parameters: Record<string, any>
  ): Promise<FeatureComputation> {
    const computation: FeatureComputation = {
      id: this.generateComputationId(),
      featureGroupId,
      status: ComputationStatus.PENDING,
      startTime: new Date(),
      recordsProcessed: 0,
      recordsGenerated: 0,
      errors: [],
      metadata: parameters,
    };

    this.computations.set(computation.id, computation);

    // Start computation asynchronously
    this.executeComputation(computation);

    logger.info(`⚙️ Started feature computation: ${computation.id} for group: ${featureGroupId}`);
    return computation;
  }

  async getComputation(id: string): Promise<FeatureComputation | null> {
    return this.computations.get(id) || null;
  }

  async listComputations(featureGroupId?: string): Promise<FeatureComputation[]> {
    const computations = Array.from(this.computations.values());
    return featureGroupId ?
      computations.filter(c => c.featureGroupId === featureGroupId) :
      computations;
  }

  async cancelComputation(id: string): Promise<boolean> {
    const computation = this.computations.get(id);
    if (!computation) {
      return false;
    }

    if (computation.status === ComputationStatus.RUNNING) {
      computation.status = ComputationStatus.CANCELLED;
      computation.endTime = new Date();
      this.runningComputations.delete(id);

      logger.info(`❌ Cancelled computation: ${id}`);
      return true;
    }

    return false;
  }

  private async executeComputation(computation: FeatureComputation): Promise<void> {
    try {
      computation.status = ComputationStatus.RUNNING;
      this.runningComputations.add(computation.id);

      logger.info(`⚙️ Executing computation: ${computation.id}`);

      // Simulate computation work
      const totalRecords = 1000; // Mock value
      const batchSize = 100;

      for (let processed = 0; processed < totalRecords; processed += batchSize) {
        // Check if computation was cancelled
        if (computation.status === ComputationStatus.CANCELLED) {
          return;
        }

        // Simulate processing time
        await new Promise(resolve => setTimeout(resolve, 100));

        const currentBatch = Math.min(batchSize, totalRecords - processed);
        computation.recordsProcessed += currentBatch;
        computation.recordsGenerated += currentBatch; // Assume 1:1 for simplicity

        // Simulate some errors
        if (Math.random() < 0.05) { // 5% error rate
          const error: ComputationError = {
            entityId: `entity_${processed}`,
            error: 'Simulated processing error',
            timestamp: new Date(),
          };
          computation.errors.push(error);
        }
      }

      computation.status = ComputationStatus.COMPLETED;
      computation.endTime = new Date();
      this.runningComputations.delete(computation.id);

      logger.info(`✅ Completed computation: ${computation.id}`);

    } catch (error) {
      computation.status = ComputationStatus.FAILED;
      computation.endTime = new Date();
      this.runningComputations.delete(computation.id);

      const computationError: ComputationError = {
        entityId: 'unknown',
        error: error instanceof Error ? error.message : 'Unknown error',
        timestamp: new Date(),
      };
      computation.errors.push(computationError);

      logger.error(`❌ Computation failed: ${computation.id}`, error);
    }
  }

  private generateComputationId(): string {
    return `comp_${Date.now()}_${Math.random().toString(36).substr(2, 9)}`;
  }

  async getComputationStats(): Promise<any> {
    const computations = Array.from(this.computations.values());

    return {
      total: computations.length,
      running: this.runningComputations.size,
      byStatus: {
        pending: computations.filter(c => c.status === ComputationStatus.PENDING).length,
        running: computations.filter(c => c.status === ComputationStatus.RUNNING).length,
        completed: computations.filter(c => c.status === ComputationStatus.COMPLETED).length,
        failed: computations.filter(c => c.status === ComputationStatus.FAILED).length,
        cancelled: computations.filter(c => c.status === ComputationStatus.CANCELLED).length,
      },
      totalRecordsProcessed: computations.reduce((sum, c) => sum + c.recordsProcessed, 0),
      totalRecordsGenerated: computations.reduce((sum, c) => sum + c.recordsGenerated, 0),
      totalErrors: computations.reduce((sum, c) => sum + c.errors.length, 0),
    };
  }

  async shutdown(): Promise<void> {
    logger.info('🔄 Shutting down Feature Computation Service...');

    // Cancel all running computations
    for (const computationId of this.runningComputations) {
      await this.cancelComputation(computationId);
    }

    this.computations.clear();
    this.runningComputations.clear();

    logger.info('✅ Feature Computation Service shutdown complete');
  }
}
@@ -0,0 +1,246 @@
import { EventBus } from '@stock-bot/event-bus';
import { Logger } from '@stock-bot/utils';
import {
  FeatureGroup,
  FeatureDriftAlert,
  FeatureMonitoringConfig,
  FeatureMonitoringMetrics,
  FeatureValue,
  DriftDetectionMethod
} from '../types/FeatureStore';

export interface FeatureMonitoringService {
  startMonitoring(featureGroupId: string, config: FeatureMonitoringConfig): Promise<void>;
  stopMonitoring(featureGroupId: string): Promise<void>;
  detectDrift(featureGroupId: string, recentData: FeatureValue[]): Promise<FeatureDriftAlert[]>;
  getMonitoringMetrics(featureGroupId: string): Promise<FeatureMonitoringMetrics>;
  updateMonitoringConfig(featureGroupId: string, config: FeatureMonitoringConfig): Promise<void>;
}

export class FeatureMonitoringServiceImpl implements FeatureMonitoringService {
  private monitoringJobs: Map<string, NodeJS.Timeout> = new Map();
  private baselineStats: Map<string, any> = new Map();

  constructor(
    private eventBus: EventBus,
    private logger: Logger
  ) {}

  async startMonitoring(featureGroupId: string, config: FeatureMonitoringConfig): Promise<void> {
    try {
      // Stop existing monitoring if running
      await this.stopMonitoring(featureGroupId);

      // Start new monitoring job
      const interval = setInterval(async () => {
        await this.runMonitoringCheck(featureGroupId, config);
      }, config.checkInterval * 1000);

      this.monitoringJobs.set(featureGroupId, interval);

      this.logger.info(`Started monitoring for feature group: ${featureGroupId}`);

      await this.eventBus.emit('feature.monitoring.started', {
        featureGroupId,
        config,
        timestamp: new Date()
      });
    } catch (error) {
      this.logger.error('Failed to start feature monitoring', { featureGroupId, error });
      throw error;
    }
  }

  async stopMonitoring(featureGroupId: string): Promise<void> {
    try {
      const job = this.monitoringJobs.get(featureGroupId);
      if (job) {
        clearInterval(job);
        this.monitoringJobs.delete(featureGroupId);

        this.logger.info(`Stopped monitoring for feature group: ${featureGroupId}`);

        await this.eventBus.emit('feature.monitoring.stopped', {
          featureGroupId,
          timestamp: new Date()
        });
      }
    } catch (error) {
      this.logger.error('Failed to stop feature monitoring', { featureGroupId, error });
      throw error;
    }
  }

  async detectDrift(featureGroupId: string, recentData: FeatureValue[]): Promise<FeatureDriftAlert[]> {
    try {
      const alerts: FeatureDriftAlert[] = [];
      const baseline = this.baselineStats.get(featureGroupId);

      if (!baseline) {
        // No baseline available, collect current data as baseline
        await this.updateBaseline(featureGroupId, recentData);
        return alerts;
      }

      // Group data by feature name
      const featureData = this.groupByFeature(recentData);

      for (const [featureName, values] of featureData) {
        const currentStats = this.calculateStatistics(values);
        const baselineFeatureStats = baseline[featureName];

        if (!baselineFeatureStats) continue;

        // Detect drift using various methods
        const driftScore = await this.calculateDriftScore(
          baselineFeatureStats,
          currentStats,
          DriftDetectionMethod.KOLMOGOROV_SMIRNOV
        );

        if (driftScore > 0.1) { // Threshold for drift detection
          alerts.push({
            id: `drift_${featureGroupId}_${featureName}_${Date.now()}`,
            featureGroupId,
            featureName,
            driftScore,
            severity: driftScore > 0.3 ? 'high' : driftScore > 0.2 ? 'medium' : 'low',
            detectionMethod: DriftDetectionMethod.KOLMOGOROV_SMIRNOV,
            baselineStats: baselineFeatureStats,
            currentStats,
            detectedAt: new Date(),
            message: `Feature drift detected for ${featureName} with score ${driftScore.toFixed(3)}`
          });
        }
      }

      if (alerts.length > 0) {
        await this.eventBus.emit('feature.drift.detected', {
          featureGroupId,
          alerts,
          timestamp: new Date()
        });
      }

      return alerts;
    } catch (error) {
      this.logger.error('Failed to detect drift', { featureGroupId, error });
      throw error;
    }
  }

  async getMonitoringMetrics(featureGroupId: string): Promise<FeatureMonitoringMetrics> {
    try {
      const isActive = this.monitoringJobs.has(featureGroupId);
      const baseline = this.baselineStats.get(featureGroupId);

      return {
        featureGroupId,
        isActive,
        lastCheckTime: new Date(),
        totalChecks: 0, // Would be stored in persistent storage
        driftAlertsCount: 0, // Would be queried from alert storage
        averageDriftScore: 0,
        featuresMonitored: baseline ? Object.keys(baseline).length : 0,
        uptime: isActive ? Date.now() : 0 // Would calculate actual uptime
      };
    } catch (error) {
      this.logger.error('Failed to get monitoring metrics', { featureGroupId, error });
      throw error;
    }
  }

  async updateMonitoringConfig(featureGroupId: string, config: FeatureMonitoringConfig): Promise<void> {
    try {
      // Restart monitoring with new config
      if (this.monitoringJobs.has(featureGroupId)) {
        await this.stopMonitoring(featureGroupId);
        await this.startMonitoring(featureGroupId, config);
      }

      this.logger.info(`Updated monitoring config for feature group: ${featureGroupId}`);
    } catch (error) {
      this.logger.error('Failed to update monitoring config', { featureGroupId, error });
      throw error;
    }
  }

  private async runMonitoringCheck(featureGroupId: string, config: FeatureMonitoringConfig): Promise<void> {
    try {
      // In a real implementation, this would fetch recent data from the feature store
      const recentData: FeatureValue[] = []; // Placeholder

      await this.detectDrift(featureGroupId, recentData);
    } catch (error) {
      this.logger.error('Monitoring check failed', { featureGroupId, error });
    }
  }

  private async updateBaseline(featureGroupId: string, data: FeatureValue[]): Promise<void> {
    const featureData = this.groupByFeature(data);
    const baseline: Record<string, any> = {};

    for (const [featureName, values] of featureData) {
      baseline[featureName] = this.calculateStatistics(values);
    }

    this.baselineStats.set(featureGroupId, baseline);
  }

  private groupByFeature(data: FeatureValue[]): Map<string, number[]> {
    const grouped = new Map<string, number[]>();

    for (const item of data) {
      if (!grouped.has(item.featureName)) {
        grouped.set(item.featureName, []);
      }
      grouped.get(item.featureName)!.push(item.value as number);
    }

    return grouped;
  }

  private calculateStatistics(values: number[]): any {
    // Copy before sorting so the caller's array is not mutated in place
    const sorted = [...values].sort((a, b) => a - b);
    const n = values.length;
    const mean = values.reduce((sum, val) => sum + val, 0) / n;
    const variance = values.reduce((sum, val) => sum + Math.pow(val - mean, 2), 0) / n;
    const stdDev = Math.sqrt(variance);

    return {
      count: n,
      mean,
      stdDev,
      min: sorted[0],
      max: sorted[n - 1],
      median: n % 2 === 0 ? (sorted[n / 2 - 1] + sorted[n / 2]) / 2 : sorted[Math.floor(n / 2)],
      q25: sorted[Math.floor(n * 0.25)],
      q75: sorted[Math.floor(n * 0.75)]
    };
  }

  private async calculateDriftScore(
    baseline: any,
    current: any,
    method: DriftDetectionMethod
  ): Promise<number> {
    switch (method) {
      case DriftDetectionMethod.KOLMOGOROV_SMIRNOV:
        // Simplified KS test approximation
        return Math.abs(baseline.mean - current.mean) / (baseline.stdDev + current.stdDev + 1e-8);

      case DriftDetectionMethod.POPULATION_STABILITY_INDEX: {
        // Simplified PSI calculation
        const expectedRatio = baseline.mean / (baseline.mean + current.mean + 1e-8);
        const actualRatio = current.mean / (baseline.mean + current.mean + 1e-8);
        return Math.abs(expectedRatio - actualRatio);
      }

      case DriftDetectionMethod.JENSEN_SHANNON_DIVERGENCE:
        // Simplified JS divergence approximation
        return Math.min(1.0, Math.abs(baseline.mean - current.mean) / Math.max(baseline.stdDev, current.stdDev, 1e-8));

      default:
        return 0;
    }
  }
}
@@ -0,0 +1,195 @@
import { logger } from '@stock-bot/utils';
import { FeatureStatistics, HistogramBucket, ValueCount } from '../types/FeatureStore';

export class FeatureStatisticsService {
  private statistics: Map<string, FeatureStatistics> = new Map();

  async initialize(): Promise<void> {
    logger.info('🔄 Initializing Feature Statistics Service...');

    this.statistics.clear();

    logger.info('✅ Feature Statistics Service initialized');
  }

  async computeStatistics(
    featureGroupId: string,
    featureName: string,
    data: any[]
  ): Promise<FeatureStatistics> {
    const values = data.map(item => item[featureName]).filter(v => v !== null && v !== undefined);

    const statistics: FeatureStatistics = {
      featureGroupId,
      featureName,
      statistics: {
        count: data.length,
        nullCount: data.length - values.length,
        distinctCount: new Set(values).size,
      },
      computedAt: new Date(),
    };

    // Compute numerical statistics if applicable
    const numericalValues = values.filter(v => typeof v === 'number');
    if (numericalValues.length > 0) {
      const sorted = numericalValues.sort((a, b) => a - b);
      const sum = numericalValues.reduce((acc, val) => acc + val, 0);
      const mean = sum / numericalValues.length;

      statistics.statistics.min = sorted[0];
      statistics.statistics.max = sorted[sorted.length - 1];
      statistics.statistics.mean = mean;
      statistics.statistics.median = this.calculateMedian(sorted);
      statistics.statistics.stdDev = this.calculateStandardDeviation(numericalValues, mean);
      statistics.statistics.percentiles = this.calculatePercentiles(sorted);
      statistics.statistics.histogram = this.calculateHistogram(numericalValues);
    }

    // Compute top values for categorical data
    statistics.statistics.topValues = this.calculateTopValues(values);

    const key = `${featureGroupId}.${featureName}`;
    this.statistics.set(key, statistics);

    logger.info(`📊 Computed statistics for feature: ${featureGroupId}.${featureName}`);
    return statistics;
  }

  async getStatistics(featureGroupId: string, featureName: string): Promise<FeatureStatistics | null> {
    const key = `${featureGroupId}.${featureName}`;
    return this.statistics.get(key) || null;
  }

  async getFeatureGroupStatistics(featureGroupId: string): Promise<FeatureStatistics[]> {
    const groupStats: FeatureStatistics[] = [];

    for (const [, stats] of this.statistics.entries()) {
      if (stats.featureGroupId === featureGroupId) {
        groupStats.push(stats);
      }
    }

    return groupStats;
  }

  async getAllStatistics(): Promise<FeatureStatistics[]> {
    return Array.from(this.statistics.values());
  }

  async deleteStatistics(featureGroupId: string, featureName?: string): Promise<void> {
    if (featureName) {
      const key = `${featureGroupId}.${featureName}`;
      this.statistics.delete(key);
    } else {
      // Delete all statistics for the feature group
      const keysToDelete: string[] = [];
      for (const [key, stats] of this.statistics.entries()) {
        if (stats.featureGroupId === featureGroupId) {
          keysToDelete.push(key);
        }
      }

      for (const key of keysToDelete) {
        this.statistics.delete(key);
      }
    }
  }

  private calculateMedian(sortedValues: number[]): number {
    const length = sortedValues.length;
    if (length % 2 === 0) {
      return (sortedValues[length / 2 - 1] + sortedValues[length / 2]) / 2;
    } else {
      return sortedValues[Math.floor(length / 2)];
    }
  }

  private calculateStandardDeviation(values: number[], mean: number): number {
    const squaredDifferences = values.map(value => Math.pow(value - mean, 2));
    const avgSquaredDiff = squaredDifferences.reduce((acc, val) => acc + val, 0) / values.length;
    return Math.sqrt(avgSquaredDiff);
  }

  private calculatePercentiles(sortedValues: number[]): Record<string, number> {
    const percentiles = [5, 10, 25, 50, 75, 90, 95];
    const result: Record<string, number> = {};

    for (const p of percentiles) {
      const index = (p / 100) * (sortedValues.length - 1);
      if (Number.isInteger(index)) {
        result[`p${p}`] = sortedValues[index];
      } else {
        const lower = Math.floor(index);
        const upper = Math.ceil(index);
        const weight = index - lower;
        result[`p${p}`] = sortedValues[lower] * (1 - weight) + sortedValues[upper] * weight;
      }
    }

    return result;
  }

  private calculateHistogram(values: number[], buckets: number = 10): HistogramBucket[] {
    const min = Math.min(...values);
    const max = Math.max(...values);
    const bucketSize = (max - min) / buckets;

    const histogram: HistogramBucket[] = [];

    for (let i = 0; i < buckets; i++) {
      const bucketMin = min + i * bucketSize;
      const bucketMax = i === buckets - 1 ? max : min + (i + 1) * bucketSize;

      // The last bucket is closed on the right so the maximum value is counted
      const count = values.filter(
        v => v >= bucketMin && (i === buckets - 1 ? v <= bucketMax : v < bucketMax)
      ).length;

      histogram.push({
        min: bucketMin,
        max: bucketMax,
        count,
      });
    }

    return histogram;
  }

  private calculateTopValues(values: any[], limit: number = 10): ValueCount[] {
    const valueCounts = new Map<any, number>();

    for (const value of values) {
      valueCounts.set(value, (valueCounts.get(value) || 0) + 1);
    }

    const sortedCounts = Array.from(valueCounts.entries())
      .map(([value, count]) => ({
        value,
        count,
        percentage: (count / values.length) * 100,
      }))
      .sort((a, b) => b.count - a.count)
      .slice(0, limit);

    return sortedCounts;
  }

  async getStatisticsSummary(): Promise<any> {
    const allStats = Array.from(this.statistics.values());

    return {
      totalFeatures: allStats.length,
      totalRecords: allStats.reduce((sum, s) => sum + s.statistics.count, 0),
      totalNullValues: allStats.reduce((sum, s) => sum + s.statistics.nullCount, 0),
      featureGroups: new Set(allStats.map(s => s.featureGroupId)).size,
      lastComputed: allStats.length > 0 ?
        Math.max(...allStats.map(s => s.computedAt.getTime())) : null,
    };
  }

  async shutdown(): Promise<void> {
    logger.info('🔄 Shutting down Feature Statistics Service...');

    this.statistics.clear();

    logger.info('✅ Feature Statistics Service shutdown complete');
  }
}
@@ -0,0 +1,313 @@
import { EventBus } from '@stock-bot/event-bus';
import { logger } from '@stock-bot/utils';
import {
  FeatureGroup,
  FeatureGroupStatus,
  FeatureVector,
  FeatureRequest,
  FeatureResponse,
  FeatureStorageConfig,
  FeatureRegistry
} from '../types/FeatureStore';
import { OnlineStore } from './storage/OnlineStore';
import { OfflineStore } from './storage/OfflineStore';
import { MetadataStore } from './storage/MetadataStore';

export class FeatureStoreService {
  private eventBus: EventBus;
  private onlineStore: OnlineStore;
  private offlineStore: OfflineStore;
  private metadataStore: MetadataStore;
  private registry: FeatureRegistry;

  constructor() {
    this.eventBus = new EventBus();
    this.onlineStore = new OnlineStore();
    this.offlineStore = new OfflineStore();
    this.metadataStore = new MetadataStore();
    this.registry = {
      featureGroups: new Map(),
      features: new Map(),
      dependencies: new Map(),
      lineage: new Map()
    };
  }

  async initialize(): Promise<void> {
    logger.info('🔄 Initializing Feature Store Service...');

    await this.eventBus.initialize();
    await this.onlineStore.initialize();
    await this.offlineStore.initialize();
    await this.metadataStore.initialize();

    // Load existing feature groups from metadata store
    await this.loadFeatureGroups();

    // Subscribe to feature events
    await this.eventBus.subscribe('feature.*', this.handleFeatureEvent.bind(this));

    logger.info('✅ Feature Store Service initialized');
  }

  async createFeatureGroup(featureGroup: Omit<FeatureGroup, 'id' | 'createdAt' | 'updatedAt'>): Promise<FeatureGroup> {
    const featureGroupWithId: FeatureGroup = {
      ...featureGroup,
      id: this.generateFeatureGroupId(),
      status: FeatureGroupStatus.DRAFT,
      createdAt: new Date(),
      updatedAt: new Date(),
    };

    // Store in metadata store
    await this.metadataStore.saveFeatureGroup(featureGroupWithId);

    // Update registry
    this.registry.featureGroups.set(featureGroupWithId.id, featureGroupWithId);

    // Register individual features
    for (const feature of featureGroupWithId.features) {
      const featureKey = `${featureGroupWithId.id}.${feature.name}`;
      this.registry.features.set(featureKey, feature);
    }

    await this.eventBus.publish('feature.group.created', {
      featureGroupId: featureGroupWithId.id,
      featureGroup: featureGroupWithId,
    });

    logger.info(`📋 Created feature group: ${featureGroupWithId.name} (${featureGroupWithId.id})`);
    return featureGroupWithId;
  }

  async updateFeatureGroup(id: string, updates: Partial<FeatureGroup>): Promise<FeatureGroup> {
    const existingGroup = this.registry.featureGroups.get(id);
    if (!existingGroup) {
      throw new Error(`Feature group not found: ${id}`);
    }

    const updatedGroup: FeatureGroup = {
      ...existingGroup,
      ...updates,
      id, // Ensure ID doesn't change
      updatedAt: new Date(),
    };

    // Store in metadata store
    await this.metadataStore.saveFeatureGroup(updatedGroup);

    // Update registry
    this.registry.featureGroups.set(id, updatedGroup);

    await this.eventBus.publish('feature.group.updated', {
      featureGroupId: id,
      featureGroup: updatedGroup,
    });

    logger.info(`📝 Updated feature group: ${updatedGroup.name} (${id})`);
    return updatedGroup;
  }

  async deleteFeatureGroup(id: string): Promise<void> {
    const featureGroup = this.registry.featureGroups.get(id);
    if (!featureGroup) {
      throw new Error(`Feature group not found: ${id}`);
    }

    // Remove from stores
    await this.metadataStore.deleteFeatureGroup(id);
    await this.onlineStore.deleteFeatureGroup(id);
    await this.offlineStore.deleteFeatureGroup(id);

    // Update registry
    this.registry.featureGroups.delete(id);

    // Remove features from registry
    for (const feature of featureGroup.features) {
      const featureKey = `${id}.${feature.name}`;
      this.registry.features.delete(featureKey);
    }

    await this.eventBus.publish('feature.group.deleted', {
      featureGroupId: id,
      featureGroup,
    });

    logger.info(`🗑️ Deleted feature group: ${featureGroup.name} (${id})`);
  }

  async getFeatureGroup(id: string): Promise<FeatureGroup | null> {
    return this.registry.featureGroups.get(id) || null;
  }

  async listFeatureGroups(status?: FeatureGroupStatus): Promise<FeatureGroup[]> {
    const groups = Array.from(this.registry.featureGroups.values());
    return status ? groups.filter(group => group.status === status) : groups;
  }

  async getOnlineFeatures(request: FeatureRequest): Promise<FeatureResponse[]> {
    logger.info(`🔍 Getting online features for ${request.entityIds.length} entities`);

    const responses: FeatureResponse[] = [];

    for (const entityId of request.entityIds) {
      const features: Record<string, any> = {};
      const metadata: Record<string, any> = {};

      for (const featureGroupId of request.featureGroups) {
        const featureGroup = this.registry.featureGroups.get(featureGroupId);
        if (!featureGroup) {
          logger.warn(`Feature group not found: ${featureGroupId}`);
          continue;
        }

        const featureVector = await this.onlineStore.getFeatures(
          entityId,
          request.entityType,
          featureGroupId,
          request.asOfTime
        );

        if (featureVector) {
          Object.assign(features, featureVector.values);
          if (request.includeMetadata) {
            metadata[featureGroupId] = featureVector.metadata;
          }
        }
      }

      responses.push({
        entityId,
        entityType: request.entityType,
        features,
        metadata,
        timestamp: request.asOfTime || new Date(),
      });
    }

    return responses;
  }

  async getHistoricalFeatures(request: FeatureRequest): Promise<FeatureResponse[]> {
|
||||
logger.info(`📊 Getting historical features for ${request.entityIds.length} entities`);
|
||||
|
||||
return await this.offlineStore.getHistoricalFeatures(request);
|
||||
}
|
||||
|
||||
async getBatchFeatures(request: FeatureRequest): Promise<FeatureResponse[]> {
|
||||
logger.info(`📦 Getting batch features for ${request.entityIds.length} entities`);
|
||||
|
||||
// For batch requests, use offline store for efficiency
|
||||
return await this.offlineStore.getBatchFeatures(request);
|
||||
}
|
||||
|
||||
async ingestFeatures(featureVectors: FeatureVector[]): Promise<void> {
|
||||
logger.info(`📥 Ingesting ${featureVectors.length} feature vectors`);
|
||||
|
||||
// Store in both online and offline stores
|
||||
await Promise.all([
|
||||
this.onlineStore.writeFeatures(featureVectors),
|
||||
this.offlineStore.writeFeatures(featureVectors)
|
||||
]);
|
||||
|
||||
await this.eventBus.publish('feature.ingested', {
|
||||
vectorCount: featureVectors.length,
|
||||
timestamp: new Date(),
|
||||
});
|
||||
}
|
||||
|
||||
async searchFeatures(query: string, filters?: Record<string, any>): Promise<any[]> {
|
||||
const results: any[] = [];
|
||||
|
||||
for (const [groupId, group] of this.registry.featureGroups) {
|
||||
for (const feature of group.features) {
|
||||
const featureInfo = {
|
||||
featureGroupId: groupId,
|
||||
featureGroupName: group.name,
|
||||
featureName: feature.name,
|
||||
description: feature.description,
|
||||
type: feature.type,
|
||||
valueType: feature.valueType,
|
||||
tags: feature.tags,
|
||||
};
|
||||
|
||||
// Simple text search
|
||||
const searchText = `${group.name} ${feature.name} ${feature.description || ''} ${feature.tags.join(' ')}`.toLowerCase();
|
||||
if (searchText.includes(query.toLowerCase())) {
|
||||
// Apply filters if provided
|
||||
if (filters) {
|
||||
let matches = true;
|
||||
for (const [key, value] of Object.entries(filters)) {
|
||||
if (featureInfo[key as keyof typeof featureInfo] !== value) {
|
||||
matches = false;
|
||||
break;
|
||||
}
|
||||
}
|
||||
if (matches) {
|
||||
results.push(featureInfo);
|
||||
}
|
||||
} else {
|
||||
results.push(featureInfo);
|
||||
}
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
return results;
|
||||
}
|
||||
|
||||
async getFeatureLineage(featureGroupId: string, featureName: string): Promise<any> {
|
||||
const lineageKey = `${featureGroupId}.${featureName}`;
|
||||
return this.registry.lineage.get(lineageKey) || null;
|
||||
}
|
||||
|
||||
async getFeatureUsage(featureGroupId: string, featureName: string): Promise<any> {
|
||||
// In a real implementation, this would track feature usage across models and applications
|
||||
return {
|
||||
featureGroupId,
|
||||
featureName,
|
||||
usageCount: 0,
|
||||
lastUsed: null,
|
||||
consumers: [],
|
||||
models: []
|
||||
};
|
||||
}
|
||||
|
||||
private async loadFeatureGroups(): Promise<void> {
|
||||
logger.info('📂 Loading existing feature groups...');
|
||||
|
||||
const featureGroups = await this.metadataStore.getAllFeatureGroups();
|
||||
|
||||
for (const group of featureGroups) {
|
||||
this.registry.featureGroups.set(group.id, group);
|
||||
|
||||
// Register individual features
|
||||
for (const feature of group.features) {
|
||||
const featureKey = `${group.id}.${feature.name}`;
|
||||
this.registry.features.set(featureKey, feature);
|
||||
}
|
||||
}
|
||||
|
||||
logger.info(`📂 Loaded ${featureGroups.length} feature groups`);
|
||||
}
|
||||
|
||||
private async handleFeatureEvent(event: any): Promise<void> {
|
||||
logger.debug('📨 Received feature event:', event);
|
||||
// Handle feature-level events
|
||||
}
|
||||
|
||||
private generateFeatureGroupId(): string {
|
||||
return `fg_${Date.now()}_${Math.random().toString(36).substr(2, 9)}`;
|
||||
}
|
||||
|
||||
async shutdown(): Promise<void> {
|
||||
logger.info('🔄 Shutting down Feature Store Service...');
|
||||
|
||||
await this.onlineStore.shutdown();
|
||||
await this.offlineStore.shutdown();
|
||||
await this.metadataStore.shutdown();
|
||||
await this.eventBus.disconnect();
|
||||
|
||||
logger.info('✅ Feature Store Service shutdown complete');
|
||||
}
|
||||
}
|
||||
|
|
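The service above keys individual features in its registry as `<featureGroupId>.<featureName>` and matches searches with a case-insensitive substring test. A minimal standalone sketch of that scheme (the `Mini*` types are simplified stand-ins for this example, not the service's real imports):

```typescript
// Minimal local stand-ins for the registry types (illustration only).
interface MiniFeature { name: string; description?: string; tags: string[] }
interface MiniGroup { id: string; name: string; features: MiniFeature[] }

// Same key scheme the service uses: `${groupId}.${featureName}`.
function buildRegistry(groups: MiniGroup[]): Map<string, MiniFeature> {
  const registry = new Map<string, MiniFeature>();
  for (const group of groups) {
    for (const feature of group.features) {
      registry.set(`${group.id}.${feature.name}`, feature);
    }
  }
  return registry;
}

// Same case-insensitive substring search as searchFeatures.
function search(groups: MiniGroup[], query: string): string[] {
  const hits: string[] = [];
  for (const group of groups) {
    for (const f of group.features) {
      const text = `${group.name} ${f.name} ${f.description || ''} ${f.tags.join(' ')}`.toLowerCase();
      if (text.includes(query.toLowerCase())) hits.push(`${group.id}.${f.name}`);
    }
  }
  return hits;
}

const groups: MiniGroup[] = [
  { id: 'fg_1', name: 'price-features', features: [
    { name: 'sma_20', description: '20-bar moving average', tags: ['technical'] },
    { name: 'volume_spike', tags: ['volume'] },
  ]},
];

console.log(buildRegistry(groups).has('fg_1.sma_20')); // true
console.log(search(groups, 'moving')); // → ['fg_1.sma_20']
```

Note that because the group name is part of the searchable text, a query like `price` matches every feature in the group, which is usually the desired catalog behavior.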
@@ -0,0 +1,52 @@
import { logger } from '@stock-bot/utils';
import { FeatureGroup } from '../../types/FeatureStore';

export class MetadataStore {
  private featureGroups: Map<string, FeatureGroup> = new Map();

  async initialize(): Promise<void> {
    logger.info('🔄 Initializing Metadata Store...');

    // In a real implementation, connect to PostgreSQL or another metadata store
    this.featureGroups.clear();

    logger.info('✅ Metadata Store initialized');
  }

  async saveFeatureGroup(featureGroup: FeatureGroup): Promise<void> {
    this.featureGroups.set(featureGroup.id, { ...featureGroup });
    logger.debug(`💾 Saved feature group metadata: ${featureGroup.id}`);
  }

  async getFeatureGroup(id: string): Promise<FeatureGroup | null> {
    return this.featureGroups.get(id) || null;
  }

  async getAllFeatureGroups(): Promise<FeatureGroup[]> {
    return Array.from(this.featureGroups.values());
  }

  async deleteFeatureGroup(id: string): Promise<void> {
    this.featureGroups.delete(id);
    logger.debug(`🗑️ Deleted feature group metadata: ${id}`);
  }

  async findFeatureGroups(criteria: Partial<FeatureGroup>): Promise<FeatureGroup[]> {
    const groups = Array.from(this.featureGroups.values());

    return groups.filter(group => {
      for (const [key, value] of Object.entries(criteria)) {
        if (group[key as keyof FeatureGroup] !== value) {
          return false;
        }
      }
      return true;
    });
  }

  async shutdown(): Promise<void> {
    logger.info('🔄 Shutting down Metadata Store...');
    this.featureGroups.clear();
    logger.info('✅ Metadata Store shutdown complete');
  }
}
@@ -0,0 +1,121 @@
import { logger } from '@stock-bot/utils';
import { FeatureVector, FeatureRequest, FeatureResponse } from '../../types/FeatureStore';

export class OfflineStore {
  private store: Map<string, FeatureVector[]> = new Map();

  async initialize(): Promise<void> {
    logger.info('🔄 Initializing Offline Store...');

    // In a real implementation, connect to a data warehouse, S3, etc.
    this.store.clear();

    logger.info('✅ Offline Store initialized');
  }

  async writeFeatures(featureVectors: FeatureVector[]): Promise<void> {
    for (const vector of featureVectors) {
      const partitionKey = this.buildPartitionKey(vector.entityType, vector.featureGroupId);

      if (!this.store.has(partitionKey)) {
        this.store.set(partitionKey, []);
      }

      this.store.get(partitionKey)!.push(vector);
    }

    logger.debug(`💾 Stored ${featureVectors.length} feature vectors in offline store`);
  }

  async getHistoricalFeatures(request: FeatureRequest): Promise<FeatureResponse[]> {
    const responses: FeatureResponse[] = [];

    for (const entityId of request.entityIds) {
      const features: Record<string, any> = {};
      const metadata: Record<string, any> = {};

      for (const featureGroupId of request.featureGroups) {
        const partitionKey = this.buildPartitionKey(request.entityType, featureGroupId);
        const vectors = this.store.get(partitionKey) || [];

        // Find the most recent vector for this entity at or before asOfTime
        const relevantVectors = vectors
          .filter(v => v.entityId === entityId)
          .filter(v => !request.asOfTime || v.timestamp <= request.asOfTime)
          .sort((a, b) => b.timestamp.getTime() - a.timestamp.getTime());

        if (relevantVectors.length > 0) {
          const latestVector = relevantVectors[0];
          Object.assign(features, latestVector.values);

          if (request.includeMetadata) {
            metadata[featureGroupId] = latestVector.metadata;
          }
        }
      }

      responses.push({
        entityId,
        entityType: request.entityType,
        features,
        metadata,
        timestamp: request.asOfTime || new Date(),
      });
    }

    return responses;
  }

  async getBatchFeatures(request: FeatureRequest): Promise<FeatureResponse[]> {
    // For simplicity, use the same logic as historical features.
    // A real implementation would use optimized batch processing.
    return await this.getHistoricalFeatures(request);
  }

  async getFeatureData(
    featureGroupId: string,
    entityType: string,
    startTime?: Date,
    endTime?: Date
  ): Promise<FeatureVector[]> {
    const partitionKey = this.buildPartitionKey(entityType, featureGroupId);
    let vectors = this.store.get(partitionKey) || [];

    // Apply time filters
    if (startTime) {
      vectors = vectors.filter(v => v.timestamp >= startTime);
    }

    if (endTime) {
      vectors = vectors.filter(v => v.timestamp <= endTime);
    }

    return vectors;
  }

  async deleteFeatureGroup(featureGroupId: string): Promise<void> {
    const keysToDelete: string[] = [];

    // Match the trailing key segment exactly so a group ID that is a prefix
    // of another (e.g. fg_1 vs fg_12) is not deleted by accident
    for (const key of this.store.keys()) {
      if (key.endsWith(`:${featureGroupId}`)) {
        keysToDelete.push(key);
      }
    }

    for (const key of keysToDelete) {
      this.store.delete(key);
    }

    logger.debug(`🗑️ Deleted ${keysToDelete.length} partitions for feature group: ${featureGroupId}`);
  }

  private buildPartitionKey(entityType: string, featureGroupId: string): string {
    return `${entityType}:${featureGroupId}`;
  }

  async shutdown(): Promise<void> {
    logger.info('🔄 Shutting down Offline Store...');
    this.store.clear();
    logger.info('✅ Offline Store shutdown complete');
  }
}
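The offline store's point-in-time rule (select, per entity, the newest vector stamped at or before `asOfTime`) is the core of reproducible backtest features. It can be isolated as a small sketch; `MiniVector` is a local stand-in for `FeatureVector`, not the real type:

```typescript
// Local stand-in for FeatureVector (illustration only).
interface MiniVector { entityId: string; timestamp: Date; values: Record<string, number> }

// Same rule as the offline store: newest vector with timestamp <= asOf.
// With no asOf, the latest vector overall wins.
function pointInTime(vectors: MiniVector[], entityId: string, asOf?: Date): MiniVector | null {
  const candidates = vectors
    .filter(v => v.entityId === entityId)
    .filter(v => !asOf || v.timestamp <= asOf)
    .sort((a, b) => b.timestamp.getTime() - a.timestamp.getTime());
  return candidates[0] ?? null;
}

const vectors: MiniVector[] = [
  { entityId: 'AAPL', timestamp: new Date('2024-01-01'), values: { sma_20: 180 } },
  { entityId: 'AAPL', timestamp: new Date('2024-01-03'), values: { sma_20: 184 } },
  { entityId: 'AAPL', timestamp: new Date('2024-01-05'), values: { sma_20: 188 } },
];

// As-of Jan 4: the Jan 3 vector wins; the Jan 5 vector is in the "future",
// which is exactly what prevents look-ahead bias in backtests.
console.log(pointInTime(vectors, 'AAPL', new Date('2024-01-04'))?.values.sma_20); // 184
```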
@@ -0,0 +1,75 @@
import { logger } from '@stock-bot/utils';
import { FeatureVector } from '../../types/FeatureStore';

export class OnlineStore {
  private store: Map<string, any> = new Map();

  async initialize(): Promise<void> {
    logger.info('🔄 Initializing Online Store...');

    // In a real implementation, connect to Redis or another online store
    this.store.clear();

    logger.info('✅ Online Store initialized');
  }

  async writeFeatures(featureVectors: FeatureVector[]): Promise<void> {
    for (const vector of featureVectors) {
      const key = this.buildKey(vector.entityId, vector.entityType, vector.featureGroupId);

      this.store.set(key, {
        ...vector,
        timestamp: vector.timestamp,
      });
    }

    logger.debug(`💾 Stored ${featureVectors.length} feature vectors in online store`);
  }

  async getFeatures(
    entityId: string,
    entityType: string,
    featureGroupId: string,
    asOfTime?: Date
  ): Promise<FeatureVector | null> {
    const key = this.buildKey(entityId, entityType, featureGroupId);
    const storedVector = this.store.get(key);

    if (!storedVector) {
      return null;
    }

    // If asOfTime is specified, check that the stored vector is valid at that time
    if (asOfTime && storedVector.timestamp > asOfTime) {
      return null;
    }

    return storedVector;
  }

  async deleteFeatureGroup(featureGroupId: string): Promise<void> {
    const keysToDelete: string[] = [];

    // Match the trailing key segment exactly so similar group IDs don't collide
    for (const key of this.store.keys()) {
      if (key.endsWith(`:${featureGroupId}`)) {
        keysToDelete.push(key);
      }
    }

    for (const key of keysToDelete) {
      this.store.delete(key);
    }

    logger.debug(`🗑️ Deleted ${keysToDelete.length} records for feature group: ${featureGroupId}`);
  }

  private buildKey(entityId: string, entityType: string, featureGroupId: string): string {
    return `${entityType}:${entityId}:${featureGroupId}`;
  }

  async shutdown(): Promise<void> {
    logger.info('🔄 Shutting down Online Store...');
    this.store.clear();
    logger.info('✅ Online Store shutdown complete');
  }
}
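Both stores key records as flat strings ending in the feature group ID, and delete a group by scanning keys for that trailing segment. A small sketch of the scheme, using an exact suffix match so one group ID can never accidentally match a longer one:

```typescript
// Same key layout as the online store: entityType:entityId:featureGroupId.
function buildKey(entityType: string, entityId: string, featureGroupId: string): string {
  return `${entityType}:${entityId}:${featureGroupId}`;
}

// Group deletion scans every key for the trailing group segment.
function keysForGroup(keys: Iterable<string>, featureGroupId: string): string[] {
  const out: string[] = [];
  for (const key of keys) {
    if (key.endsWith(`:${featureGroupId}`)) out.push(key);
  }
  return out;
}

const store = new Map<string, number>([
  [buildKey('symbol', 'AAPL', 'fg_1'), 1],
  [buildKey('symbol', 'MSFT', 'fg_1'), 2],
  [buildKey('symbol', 'AAPL', 'fg_2'), 3],
]);

console.log(keysForGroup(store.keys(), 'fg_1')); // → ['symbol:AAPL:fg_1', 'symbol:MSFT:fg_1']
```

A full key scan is fine for an in-memory map; a Redis-backed implementation would more likely use `SCAN` with a pattern or maintain a per-group index set instead.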
243
apps/data-services/feature-store/src/types/FeatureStore.ts
Normal file

@@ -0,0 +1,243 @@
// Feature Store Types

export interface FeatureGroup {
  id: string;
  name: string;
  description?: string;
  version: string;
  features: Feature[];
  source: FeatureSource;
  schedule?: FeatureSchedule;
  metadata: Record<string, any>;
  createdAt: Date;
  updatedAt: Date;
  status: FeatureGroupStatus;
}

export enum FeatureGroupStatus {
  DRAFT = 'draft',
  ACTIVE = 'active',
  DEPRECATED = 'deprecated',
  ARCHIVED = 'archived',
}

export interface Feature {
  name: string;
  type: FeatureType;
  description?: string;
  valueType: 'number' | 'string' | 'boolean' | 'array' | 'object';
  nullable: boolean;
  defaultValue?: any;
  validation?: FeatureValidation;
  transformation?: FeatureTransformation;
  tags: string[];
}

export enum FeatureType {
  NUMERICAL = 'numerical',
  CATEGORICAL = 'categorical',
  BOOLEAN = 'boolean',
  TEXT = 'text',
  TIMESTAMP = 'timestamp',
  DERIVED = 'derived',
}

export interface FeatureSource {
  type: 'batch' | 'stream' | 'sql' | 'api' | 'file';
  connection: Record<string, any>;
  query?: string;
  transformation?: string;
  refreshInterval?: number;
}

export interface FeatureSchedule {
  cronExpression: string;
  enabled: boolean;
  lastRun: Date | null;
  nextRun: Date | null;
}

export interface FeatureValidation {
  required: boolean;
  minValue?: number;
  maxValue?: number;
  allowedValues?: any[];
  pattern?: string;
  customValidator?: string;
}

export interface FeatureTransformation {
  type: 'normalize' | 'standardize' | 'encode' | 'custom';
  parameters: Record<string, any>;
}

// Feature Value Types

export interface FeatureVector {
  entityId: string;
  entityType: string;
  featureGroupId: string;
  timestamp: Date;
  values: Record<string, any>;
  metadata?: Record<string, any>;
}

export interface FeatureRequest {
  entityIds: string[];
  entityType: string;
  featureGroups: string[];
  asOfTime?: Date;
  pointInTime?: boolean;
  includeMetadata?: boolean;
}

export interface FeatureResponse {
  entityId: string;
  entityType: string;
  features: Record<string, any>;
  metadata: Record<string, any>;
  timestamp: Date;
}

// Feature Store Operations

export interface FeatureComputation {
  id: string;
  featureGroupId: string;
  status: ComputationStatus;
  startTime: Date;
  endTime?: Date;
  recordsProcessed: number;
  recordsGenerated: number;
  errors: ComputationError[];
  metadata: Record<string, any>;
}

export enum ComputationStatus {
  PENDING = 'pending',
  RUNNING = 'running',
  COMPLETED = 'completed',
  FAILED = 'failed',
  CANCELLED = 'cancelled',
}

export interface ComputationError {
  entityId: string;
  error: string;
  timestamp: Date;
}

// Feature Statistics

export interface FeatureStatistics {
  featureGroupId: string;
  featureName: string;
  statistics: {
    count: number;
    nullCount: number;
    distinctCount: number;
    min?: number;
    max?: number;
    mean?: number;
    median?: number;
    stdDev?: number;
    percentiles?: Record<string, number>;
    histogram?: HistogramBucket[];
    topValues?: ValueCount[];
  };
  computedAt: Date;
}

export interface HistogramBucket {
  min: number;
  max: number;
  count: number;
}

export interface ValueCount {
  value: any;
  count: number;
  percentage: number;
}

// Feature Registry

export interface FeatureRegistry {
  featureGroups: Map<string, FeatureGroup>;
  features: Map<string, Feature>;
  dependencies: Map<string, string[]>;
  lineage: Map<string, FeatureLineage>;
}

export interface FeatureLineage {
  featureGroupId: string;
  featureName: string;
  upstream: FeatureDependency[];
  downstream: FeatureDependency[];
  transformations: string[];
}

export interface FeatureDependency {
  featureGroupId: string;
  featureName: string;
  dependencyType: 'direct' | 'derived' | 'aggregated';
}

// Storage Types

export interface FeatureStorageConfig {
  online: OnlineStoreConfig;
  offline: OfflineStoreConfig;
  metadata: MetadataStoreConfig;
}

export interface OnlineStoreConfig {
  type: 'redis' | 'dynamodb' | 'cassandra';
  connection: Record<string, any>;
  ttl?: number;
  keyFormat?: string;
}

export interface OfflineStoreConfig {
  type: 'parquet' | 'delta' | 'postgresql' | 's3';
  connection: Record<string, any>;
  partitioning?: PartitioningConfig;
}

export interface MetadataStoreConfig {
  type: 'postgresql' | 'mysql' | 'sqlite';
  connection: Record<string, any>;
}

export interface PartitioningConfig {
  columns: string[];
  strategy: 'time' | 'hash' | 'range';
  granularity?: 'hour' | 'day' | 'month';
}

// Monitoring and Alerting

export interface FeatureMonitoring {
  featureGroupId: string;
  featureName: string;
  monitors: FeatureMonitor[];
  alerts: FeatureAlert[];
}

export interface FeatureMonitor {
  name: string;
  type: 'drift' | 'freshness' | 'availability' | 'quality';
  threshold: number;
  enabled: boolean;
  configuration: Record<string, any>;
}

export interface FeatureAlert {
  id: string;
  monitorName: string;
  level: 'warning' | 'error' | 'critical';
  message: string;
  timestamp: Date;
  resolved: boolean;
  resolvedAt?: Date;
}
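The interfaces above compose into group definitions like the following hypothetical technical-indicator group. The types are repeated here in simplified local form so the snippet stands alone; all field values are illustrative:

```typescript
// Simplified local copies of the relevant shapes (illustration only).
type GroupStatus = 'draft' | 'active' | 'deprecated' | 'archived';
interface MiniFeatureDef { name: string; type: string; valueType: string; nullable: boolean; tags: string[] }
interface MiniFeatureGroup {
  id: string; name: string; version: string; status: GroupStatus;
  features: MiniFeatureDef[]; metadata: Record<string, unknown>;
  createdAt: Date; updatedAt: Date;
}

// A hypothetical group of price indicators (names and values are made up).
const priceFeatures: MiniFeatureGroup = {
  id: 'fg_price_v1',
  name: 'price-indicators',
  version: '1.0.0',
  status: 'active',
  features: [
    { name: 'sma_20', type: 'numerical', valueType: 'number', nullable: false, tags: ['technical'] },
    { name: 'rsi_14', type: 'numerical', valueType: 'number', nullable: true, tags: ['momentum'] },
  ],
  metadata: { owner: 'quant-team' },
  createdAt: new Date(),
  updatedAt: new Date(),
};

// The registry keys each feature as `<groupId>.<featureName>`.
console.log(priceFeatures.features.map(f => `${priceFeatures.id}.${f.name}`));
// → ['fg_price_v1.sma_20', 'fg_price_v1.rsi_14']
```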
30
apps/data-services/feature-store/tsconfig.json
Normal file

@@ -0,0 +1,30 @@
{
  "compilerOptions": {
    "target": "ES2022",
    "lib": ["ES2022"],
    "module": "ESNext",
    "moduleResolution": "bundler",
    "declaration": true,
    "declarationMap": true,
    "sourceMap": true,
    "outDir": "./dist",
    "rootDir": "./src",
    "strict": true,
    "esModuleInterop": true,
    "skipLibCheck": true,
    "forceConsistentCasingInFileNames": true,
    "resolveJsonModule": true,
    "allowImportingTsExtensions": true,
    "noEmit": true,
    "paths": {
      "@/*": ["./src/*"]
    }
  },
  "include": [
    "src/**/*"
  ],
  "exclude": [
    "node_modules",
    "dist"
  ]
}

0
apps/intelligence-services/backtest-engine/README.md
Normal file
24
apps/intelligence-services/backtest-engine/package.json
Normal file

@@ -0,0 +1,24 @@
{
  "name": "backtest-engine",
  "version": "1.0.0",
  "description": "Dedicated backtesting engine for trading strategies",
  "main": "src/index.ts",
  "scripts": {
    "dev": "bun run --watch src/index.ts",
    "start": "bun run src/index.ts",
    "test": "bun test --timeout 10000 src/tests/**/*.test.ts",
    "test:watch": "bun test --watch src/tests/**/*.test.ts"
  },
  "dependencies": {
    "hono": "^4.6.3",
    "@stock-bot/shared-types": "workspace:*",
    "@stock-bot/utils": "workspace:*",
    "@stock-bot/event-bus": "workspace:*",
    "@stock-bot/api-client": "workspace:*",
    "@stock-bot/config": "*",
    "ws": "^8.18.0"
  },
  "devDependencies": {
    "bun-types": "^1.2.15",
    "@types/ws": "^8.5.12"
  }
}

@@ -0,0 +1,650 @@
import { EventEmitter } from 'events';
import { OHLCV, Order, Position } from '@stock-bot/shared-types';
import { createLogger } from '@stock-bot/utils';
import { financialUtils } from '@stock-bot/utils';

const logger = createLogger('backtest-engine');

// Use OHLCV from shared-types as the BarData equivalent
export type BarData = OHLCV;

// Strategy interface to match the existing pattern
export interface StrategyInterface {
  id: string;
  start(): Promise<void>;
  onBar(bar: BarData): Promise<Order[]> | Order[];
  stop(): Promise<void>;
}

export interface BacktestConfig {
  initialCapital: number;
  startDate: Date;
  endDate: Date;
  commission: number;
  slippage: number;
  symbols: string[];
  dataResolution: string;
}

export interface BacktestResult {
  startDate: Date;
  endDate: Date;
  initialCapital: number;
  finalCapital: number;
  totalReturn: number;
  totalTrades: number;
  winningTrades: number;
  losingTrades: number;
  winRate: number;
  avgWin: number;
  avgLoss: number;
  profitFactor: number;
  sharpeRatio: number;
  maxDrawdown: number;
  trades: Array<{
    id: string;
    symbol: string;
    side: 'BUY' | 'SELL';
    quantity: number;
    entryTime: Date;
    exitTime: Date;
    entryPrice: number;
    exitPrice: number;
    pnl: number;
    pnlPercent: number;
  }>;
  dailyReturns: Array<{
    date: Date;
    portfolioValue: number;
    dayReturn: number;
  }>;
}

export interface BacktestProgress {
  progress: number; // 0-100
  currentDate: Date;
  portfolioValue: number;
  totalTrades: number;
  processingSpeed: number; // Bars per second
  estimatedTimeRemaining: number; // milliseconds
  currentCapital: number;
  currentReturn: number;
  currentDrawdown: number;
}

export interface DataFeed {
  getHistoricalData(symbol: string, resolution: string, start: Date, end: Date): Promise<BarData[]>;
  hasDataFor(symbol: string, resolution: string, start: Date, end: Date): Promise<boolean>;
}

// Extended Position interface with additional fields needed for backtesting
export interface BacktestPosition {
  symbol: string;
  quantity: number;
  averagePrice: number;
  marketValue: number;
  unrealizedPnL: number;
  timestamp: Date;
  // Additional fields for backtesting
  avgPrice: number; // Alias for averagePrice
  entryTime: Date;
}

// Extended Order interface with additional fields needed for backtesting
export interface BacktestOrder extends Order {
  fillPrice?: number;
  fillTime?: Date;
}

export class BacktestEngine extends EventEmitter {
  private config: BacktestConfig;
  private strategy: StrategyInterface;
  private dataFeed: DataFeed;
  private isRunning: boolean = false;
  private barBuffer: Map<string, BarData[]> = new Map();
  private pendingOrders: BacktestOrder[] = [];
  private filledOrders: BacktestOrder[] = [];
  private currentTime: Date;
  private startTime: number = 0; // For performance tracking
  private processedBars: number = 0;
  private marketData: Map<string, BarData[]> = new Map();

  // Results tracking
  private initialCapital: number;
  private currentCapital: number;
  private positions = new Map<string, BacktestPosition>();
  private trades: BacktestResult['trades'] = [];
  private dailyReturns: BacktestResult['dailyReturns'] = [];
  private previousPortfolioValue: number;
  private highWaterMark: number;
  private maxDrawdown: number = 0;
  private drawdownStartTime: Date | null = null;
  private maxDrawdownDuration: number = 0;
  private winningTrades: number = 0;
  private losingTrades: number = 0;
  private breakEvenTrades: number = 0;
  private totalProfits: number = 0;
  private totalLosses: number = 0;

  constructor(strategy: StrategyInterface, config: BacktestConfig, dataFeed: DataFeed) {
    super();
    this.strategy = strategy;
    this.config = config;
    this.dataFeed = dataFeed;
    this.currentTime = new Date(config.startDate);
    this.initialCapital = config.initialCapital;
    this.currentCapital = config.initialCapital;
    this.previousPortfolioValue = config.initialCapital;
    this.highWaterMark = config.initialCapital;
  }

  async run(): Promise<BacktestResult> {
    if (this.isRunning) {
      throw new Error('Backtest is already running');
    }

    this.isRunning = true;
    this.startTime = Date.now();
    this.emit('started', { strategyId: this.strategy.id, config: this.config });

    try {
      await this.runEventBased();
      const result = this.generateResults();
      this.emit('completed', { strategyId: this.strategy.id, result });
      this.isRunning = false;
      return result;
    } catch (error) {
      this.isRunning = false;
      this.emit('error', { strategyId: this.strategy.id, error });
      throw error;
    }
  }

  private async runEventBased(): Promise<void> {
    // Load market data for all symbols
    await this.loadMarketData();

    // Initialize the strategy
    await this.strategy.start();

    // Create a merged timeline of all bars across all symbols, sorted by timestamp
    const timeline = this.createMergedTimeline();

    // Process each event in chronological order
    let lastProgressUpdate = Date.now();
    let prevDate = new Date(0);

    for (let i = 0; i < timeline.length; i++) {
      const bar = timeline[i];
      this.currentTime = bar.timestamp;

      // Process any pending orders
      await this.processOrders(bar);

      // Update positions with current prices
      this.updatePositions(bar);

      // If we've crossed into a new day, calculate the daily return
      if (this.currentTime.toDateString() !== prevDate.toDateString()) {
        this.calculateDailyReturn();
        prevDate = this.currentTime;
      }

      // Send the new bar to the strategy
      const orders = await this.strategy.onBar(bar);

      // Add any new orders to the pending orders queue
      if (orders && orders.length > 0) {
        this.pendingOrders.push(...orders);
      }

      // Update progress periodically
      if (Date.now() - lastProgressUpdate > 1000) { // Update every second
        this.updateProgress(i / timeline.length);
        lastProgressUpdate = Date.now();
      }
    }

    // Process any remaining orders
    for (const order of this.pendingOrders) {
      await this.processOrder(order);
    }

    // Close any remaining positions at the last known price
    await this.closeAllPositions();

    // Clean up the strategy
    await this.strategy.stop();
  }

  private async runVectorized(): Promise<void> {
    // Load market data for all symbols
    await this.loadMarketData();

    // To implement a vectorized approach, we need to:
    // 1. Pre-compute technical indicators
    // 2. Generate buy/sell signals for the entire dataset
    // 3. Calculate portfolio values based on the signals

    // This is a simplified implementation, since specific vectorized strategies
    // will need to be implemented separately based on the strategy type

    const timeline = this.createMergedTimeline();
    const startTime = Date.now();

    // Initialize variables for tracking performance
    let currentPositions = new Map<string, number>();
    let currentCash = this.initialCapital;
    let prevPortfolioValue = this.initialCapital;
    let highWaterMark = this.initialCapital;
    let maxDrawdown = 0;
    let maxDrawdownStartDate = new Date();
    let maxDrawdownEndDate = new Date();
    let currentDrawdownStart = new Date();

    // Pre-process data (this would be implemented by the specific strategy)
    const allBars = new Map<string, BarData[]>();
    for (const symbol of this.config.symbols) {
      allBars.set(symbol, this.marketData.get(symbol) || []);
    }

    // Apply strategy logic (a vectorized implementation would go here)
    // For now, we just simulate the processing

    this.emit('completed', { message: 'Vectorized backtest completed in fast mode' });
  }

  private async loadMarketData(): Promise<void> {
    for (const symbol of this.config.symbols) {
      this.emit('loading', { symbol, resolution: this.config.dataResolution });

      // Check if data is available
      const hasData = await this.dataFeed.hasDataFor(
        symbol,
        this.config.dataResolution,
        this.config.startDate,
        this.config.endDate
      );

      if (!hasData) {
        throw new Error(`No data available for ${symbol} at resolution ${this.config.dataResolution}`);
      }

      // Load data
      const data = await this.dataFeed.getHistoricalData(
        symbol,
        this.config.dataResolution,
        this.config.startDate,
        this.config.endDate
      );

      this.marketData.set(symbol, data);
      this.emit('loaded', { symbol, count: data.length });
    }
  }

  private createMergedTimeline(): BarData[] {
    const allBars: BarData[] = [];

    for (const [symbol, bars] of this.marketData.entries()) {
      allBars.push(...bars);
    }

    // Sort by timestamp
    return allBars.sort((a, b) => a.timestamp.getTime() - b.timestamp.getTime());
  }

  private async processOrders(currentBar: BarData): Promise<void> {
    // Find orders for the current symbol
    const ordersToProcess = this.pendingOrders.filter(order => order.symbol === currentBar.symbol);
|
||||
|
||||
if (ordersToProcess.length === 0) return;
|
||||
|
||||
// Remove these orders from pendingOrders
|
||||
this.pendingOrders = this.pendingOrders.filter(order => order.symbol !== currentBar.symbol);
|
||||
|
||||
// Process each order
|
||||
for (const order of ordersToProcess) {
|
||||
await this.processOrder(order);
|
||||
}
|
||||
}
|
||||
|
||||
private async processOrder(order: Order): Promise<void> {
|
||||
// Get the latest price for the symbol
|
||||
const latestBars = this.marketData.get(order.symbol);
|
||||
if (!latestBars || latestBars.length === 0) {
|
||||
order.status = 'REJECTED';
|
||||
this.emit('orderRejected', { order, reason: 'No market data available' });
|
||||
return;
|
||||
}
|
||||
|
||||
// Find the bar closest to the order time
|
||||
const bar = latestBars.find(b =>
|
||||
b.timestamp.getTime() >= order.timestamp.getTime()
|
||||
) || latestBars[latestBars.length - 1];
|
||||
|
||||
// Calculate fill price with slippage
|
||||
let fillPrice: number;
|
||||
if (order.type === 'MARKET') {
|
||||
// Apply slippage model
|
||||
const slippageFactor = 1 + (order.side === 'BUY' ? this.config.slippage : -this.config.slippage);
|
||||
fillPrice = bar.close * slippageFactor;
|
||||
} else if (order.type === 'LIMIT' && order.price !== undefined) {
|
||||
// For limit orders, check if the price was reached
|
||||
if ((order.side === 'BUY' && bar.low <= order.price) ||
|
||||
(order.side === 'SELL' && bar.high >= order.price)) {
|
||||
fillPrice = order.price;
|
||||
} else {
|
||||
// Limit price not reached
|
||||
return;
|
||||
}
|
||||
} else {
|
||||
// Other order types not implemented
|
||||
order.status = 'REJECTED';
|
||||
this.emit('orderRejected', { order, reason: 'Order type not supported' });
|
||||
return;
|
||||
}
|
||||
|
||||
// Calculate commission
|
||||
const orderValue = order.quantity * fillPrice;
|
||||
const commission = orderValue * this.config.commission;
|
||||
|
||||
// Check if we have enough cash for BUY orders
|
||||
if (order.side === 'BUY') {
|
||||
const totalCost = orderValue + commission;
|
||||
if (totalCost > this.currentCapital) {
|
||||
// Not enough cash
|
||||
order.status = 'REJECTED';
|
||||
this.emit('orderRejected', { order, reason: 'Insufficient funds' });
|
||||
return;
|
||||
}
|
||||
|
||||
// Update cash
|
||||
this.currentCapital -= totalCost;
|
||||
|
||||
// Update or create position
|
||||
const existingPosition = this.positions.get(order.symbol);
|
||||
if (existingPosition) {
|
||||
// Update existing position (average down)
|
||||
const totalShares = existingPosition.quantity + order.quantity;
|
||||
const totalCost = (existingPosition.quantity * existingPosition.avgPrice) + (order.quantity * fillPrice);
|
||||
existingPosition.avgPrice = totalCost / totalShares;
|
||||
existingPosition.quantity = totalShares;
|
||||
} else {
|
||||
// Create new position
|
||||
this.positions.set(order.symbol, {
|
||||
symbol: order.symbol,
|
||||
quantity: order.quantity,
|
||||
avgPrice: fillPrice,
|
||||
side: 'LONG',
|
||||
entryTime: this.currentTime
|
||||
});
|
||||
}
|
||||
} else if (order.side === 'SELL') {
|
||||
const position = this.positions.get(order.symbol);
|
||||
|
||||
if (!position || position.quantity < order.quantity) {
|
||||
// Not enough shares to sell
|
||||
order.status = 'REJECTED';
|
||||
this.emit('orderRejected', { order, reason: 'Insufficient position' });
|
||||
return;
|
||||
}
|
||||
|
||||
// Calculate P&L
|
||||
const pnl = (fillPrice - position.avgPrice) * order.quantity;
|
||||
|
||||
// Update cash
|
||||
this.currentCapital += orderValue - commission;
|
||||
|
||||
// Update position
|
||||
position.quantity -= order.quantity;
|
||||
|
||||
if (position.quantity === 0) {
|
||||
// Position closed, record the trade
|
||||
this.positions.delete(order.symbol);
|
||||
|
||||
this.trades.push({
|
||||
symbol: order.symbol,
|
||||
entryTime: position.entryTime,
|
||||
entryPrice: position.avgPrice,
|
||||
exitTime: this.currentTime,
|
||||
exitPrice: fillPrice,
|
||||
quantity: order.quantity,
|
||||
pnl: pnl,
|
||||
pnlPercent: (pnl / (position.avgPrice * order.quantity)) * 100
|
||||
});
|
||||
|
||||
// Update statistics
|
||||
if (pnl > 0) {
|
||||
this.winningTrades++;
|
||||
this.totalProfits += pnl;
|
||||
} else if (pnl < 0) {
|
||||
this.losingTrades++;
|
||||
this.totalLosses -= pnl; // Make positive for easier calculations
|
||||
} else {
|
||||
this.breakEvenTrades++;
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
// Mark order as filled
|
||||
order.status = 'FILLED';
|
||||
order.fillPrice = fillPrice;
|
||||
order.fillTime = this.currentTime;
|
||||
this.filledOrders.push(order);
|
||||
|
||||
// Notify strategy
|
||||
await this.strategy.onOrderFilled(order);
|
||||
|
||||
this.emit('orderFilled', { order });
|
||||
}
|
||||
|
||||
private updatePositions(currentBar: BarData): void {
|
||||
// Update the unrealized P&L for positions in this symbol
|
||||
const position = this.positions.get(currentBar.symbol);
|
||||
if (position) {
|
||||
const currentPrice = currentBar.close;
|
||||
const unrealizedPnL = (currentPrice - position.avgPrice) * position.quantity;
|
||||
position.unrealizedPnL = unrealizedPnL;
|
||||
}
|
||||
|
||||
// Calculate total portfolio value
|
||||
const portfolioValue = this.calculatePortfolioValue();
|
||||
|
||||
// Check for new high water mark
|
||||
if (portfolioValue > this.highWaterMark) {
|
||||
this.highWaterMark = portfolioValue;
|
||||
this.drawdownStartTime = null;
|
||||
}
|
||||
|
||||
// Check for drawdown
|
||||
if (this.drawdownStartTime === null && portfolioValue < this.highWaterMark) {
|
||||
this.drawdownStartTime = this.currentTime;
|
||||
}
|
||||
|
||||
// Update max drawdown
|
||||
if (this.highWaterMark > 0) {
|
||||
const currentDrawdown = (this.highWaterMark - portfolioValue) / this.highWaterMark;
|
||||
if (currentDrawdown > this.maxDrawdown) {
|
||||
this.maxDrawdown = currentDrawdown;
|
||||
|
||||
// Calculate drawdown duration
|
||||
if (this.drawdownStartTime !== null) {
|
||||
const drawdownDuration = (this.currentTime.getTime() - this.drawdownStartTime.getTime()) / (1000 * 60 * 60 * 24); // In days
|
||||
if (drawdownDuration > this.maxDrawdownDuration) {
|
||||
this.maxDrawdownDuration = drawdownDuration;
|
||||
}
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
this.previousPortfolioValue = portfolioValue;
|
||||
}
|
||||
|
||||
private calculatePortfolioValue(): number {
|
||||
let totalValue = this.currentCapital;
|
||||
|
||||
// Add the current value of all positions
|
||||
for (const [symbol, position] of this.positions.entries()) {
|
||||
// Find the latest price for this symbol
|
||||
const bars = this.marketData.get(symbol);
|
||||
if (bars && bars.length > 0) {
|
||||
const latestBar = bars[bars.length - 1];
|
||||
totalValue += position.quantity * latestBar.close;
|
||||
} else {
|
||||
// If no price data, use the average price (not ideal but better than nothing)
|
||||
totalValue += position.quantity * position.avgPrice;
|
||||
}
|
||||
}
|
||||
|
||||
return totalValue;
|
||||
}
|
||||
|
||||
private calculateDailyReturn(): void {
|
||||
const portfolioValue = this.calculatePortfolioValue();
|
||||
const dailyReturn = (portfolioValue - this.previousPortfolioValue) / this.previousPortfolioValue;
|
||||
|
||||
this.dailyReturns.push({
|
||||
date: new Date(this.currentTime),
|
||||
return: dailyReturn
|
||||
});
|
||||
|
||||
this.previousPortfolioValue = portfolioValue;
|
||||
}
|
||||
|
||||
private async closeAllPositions(): Promise<void> {
|
||||
for (const [symbol, position] of this.positions.entries()) {
|
||||
// Find the latest price
|
||||
const bars = this.marketData.get(symbol);
|
||||
if (!bars || bars.length === 0) continue;
|
||||
|
||||
const lastBar = bars[bars.length - 1];
|
||||
const closePrice = lastBar.close;
|
||||
|
||||
// Calculate P&L
|
||||
const pnl = (closePrice - position.avgPrice) * position.quantity;
|
||||
|
||||
// Update cash
|
||||
this.currentCapital += position.quantity * closePrice;
|
||||
|
||||
// Record the trade
|
||||
this.trades.push({
|
||||
symbol,
|
||||
entryTime: position.entryTime,
|
||||
entryPrice: position.avgPrice,
|
||||
exitTime: this.currentTime,
|
||||
exitPrice: closePrice,
|
||||
quantity: position.quantity,
|
||||
pnl,
|
||||
pnlPercent: (pnl / (position.avgPrice * position.quantity)) * 100
|
||||
});
|
||||
|
||||
// Update statistics
|
||||
if (pnl > 0) {
|
||||
this.winningTrades++;
|
||||
this.totalProfits += pnl;
|
||||
} else if (pnl < 0) {
|
||||
this.losingTrades++;
|
||||
this.totalLosses -= pnl; // Make positive for easier calculations
|
||||
} else {
|
||||
this.breakEvenTrades++;
|
||||
}
|
||||
}
|
||||
|
||||
// Clear positions
|
||||
this.positions.clear();
|
||||
}
|
||||
|
||||
private updateProgress(progress: number): void {
|
||||
const currentPortfolioValue = this.calculatePortfolioValue();
|
||||
const currentDrawdown = this.highWaterMark > 0
|
||||
? (this.highWaterMark - currentPortfolioValue) / this.highWaterMark
|
||||
: 0;
|
||||
|
||||
const elapsedMs = Date.now() - this.startTime;
|
||||
const totalEstimatedMs = elapsedMs / progress;
|
||||
const remainingMs = totalEstimatedMs - elapsedMs;
|
||||
|
||||
this.emit('progress', {
|
||||
progress: progress * 100,
|
||||
currentDate: this.currentTime,
|
||||
processingSpeed: this.processedBars / (elapsedMs / 1000),
|
||||
estimatedTimeRemaining: remainingMs,
|
||||
currentCapital: this.currentCapital,
|
||||
currentReturn: (currentPortfolioValue - this.initialCapital) / this.initialCapital,
|
||||
currentDrawdown
|
||||
} as BacktestProgress);
|
||||
}
|
||||
|
||||
private generateResults(): BacktestResult {
|
||||
const currentPortfolioValue = this.calculatePortfolioValue();
|
||||
const totalReturn = (currentPortfolioValue - this.initialCapital) / this.initialCapital;
|
||||
|
||||
// Calculate annualized return
|
||||
const days = (this.config.endDate.getTime() - this.config.startDate.getTime()) / (1000 * 60 * 60 * 24);
|
||||
const annualizedReturn = Math.pow(1 + totalReturn, 365 / days) - 1;
|
||||
|
||||
// Calculate Sharpe Ratio
|
||||
let sharpeRatio = 0;
|
||||
if (this.dailyReturns.length > 1) {
|
||||
const dailyReturnValues = this.dailyReturns.map(dr => dr.return);
|
||||
const avgDailyReturn = dailyReturnValues.reduce((sum, ret) => sum + ret, 0) / dailyReturnValues.length;
|
||||
const stdDev = Math.sqrt(
|
||||
dailyReturnValues.reduce((sum, ret) => sum + Math.pow(ret - avgDailyReturn, 2), 0) / dailyReturnValues.length
|
||||
);
|
||||
|
||||
// Annualize
|
||||
sharpeRatio = stdDev > 0
|
||||
? (avgDailyReturn * 252) / (stdDev * Math.sqrt(252))
|
||||
: 0;
|
||||
}
|
||||
|
||||
// Calculate win rate and profit factor
|
||||
const totalTrades = this.winningTrades + this.losingTrades + this.breakEvenTrades;
|
||||
const winRate = totalTrades > 0 ? this.winningTrades / totalTrades : 0;
|
||||
const profitFactor = this.totalLosses > 0 ? this.totalProfits / this.totalLosses : (this.totalProfits > 0 ? Infinity : 0);
|
||||
|
||||
// Calculate average winning and losing trade
|
||||
const avgWinningTrade = this.winningTrades > 0 ? this.totalProfits / this.winningTrades : 0;
|
||||
const avgLosingTrade = this.losingTrades > 0 ? this.totalLosses / this.losingTrades : 0;
|
||||
|
||||
return {
|
||||
strategyId: this.strategy.id,
|
||||
startDate: this.config.startDate,
|
||||
endDate: this.config.endDate,
|
||||
duration: Date.now() - this.startTime,
|
||||
initialCapital: this.initialCapital,
|
||||
finalCapital: currentPortfolioValue,
|
||||
totalReturn,
|
||||
annualizedReturn,
|
||||
sharpeRatio,
|
||||
maxDrawdown: this.maxDrawdown,
|
||||
maxDrawdownDuration: this.maxDrawdownDuration,
|
||||
winRate,
|
||||
totalTrades,
|
||||
winningTrades: this.winningTrades,
|
||||
losingTrades: this.losingTrades,
|
||||
averageWinningTrade: avgWinningTrade,
|
||||
averageLosingTrade: avgLosingTrade,
|
||||
profitFactor,
|
||||
dailyReturns: this.dailyReturns,
|
||||
trades: this.trades
|
||||
};
|
||||
}
|
||||
}
|
||||
|
|
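A note on the Sharpe computation in `generateResults()` above: it annualizes over 252 trading days and implicitly assumes a zero risk-free rate. A minimal standalone sketch of the same formula (illustrative only, not part of the engine):

```typescript
// Standalone version of the Sharpe calculation used in generateResults().
// Assumes 252 trading days per year and a zero risk-free rate.
function annualizedSharpe(dailyReturns: number[]): number {
  if (dailyReturns.length < 2) return 0;
  const avg = dailyReturns.reduce((sum, r) => sum + r, 0) / dailyReturns.length;
  const variance =
    dailyReturns.reduce((sum, r) => sum + (r - avg) ** 2, 0) / dailyReturns.length;
  const stdDev = Math.sqrt(variance);
  // (avg * 252) / (stdDev * sqrt(252)) simplifies to (avg / stdDev) * sqrt(252)
  return stdDev > 0 ? (avg * 252) / (stdDev * Math.sqrt(252)) : 0;
}
```

Constant returns have zero deviation, so the function returns 0 rather than dividing by zero.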
@@ -0,0 +1,186 @@
import { BaseStrategy } from '../Strategy';
import { BacktestConfig, BacktestEngine, BacktestResult } from './BacktestEngine';
import { MarketDataFeed } from './MarketDataFeed';
import { StrategyRegistry, StrategyType } from '../strategies/StrategyRegistry';

export interface BacktestRequest {
  strategyType: StrategyType;
  strategyParams: Record<string, any>;
  symbols: string[];
  startDate: Date | string;
  endDate: Date | string;
  initialCapital: number;
  dataResolution: '1m' | '5m' | '15m' | '30m' | '1h' | '4h' | '1d';
  commission: number;
  slippage: number;
  mode: 'event' | 'vector';
}

/**
 * Backtesting Service
 *
 * A service that handles backtesting requests and manages backtesting sessions.
 */
export class BacktestService {
  private readonly strategyRegistry: StrategyRegistry;
  private readonly dataFeed: MarketDataFeed;
  private readonly activeBacktests: Map<string, BacktestEngine> = new Map();

  constructor(apiBaseUrl: string = 'http://localhost:3001/api') {
    this.strategyRegistry = StrategyRegistry.getInstance();
    this.dataFeed = new MarketDataFeed(apiBaseUrl);
  }

  /**
   * Run a backtest based on a request
   */
  async runBacktest(request: BacktestRequest): Promise<BacktestResult> {
    // Create a strategy instance
    const strategyId = `backtest_${Date.now()}`;
    const strategy = this.strategyRegistry.createStrategy(
      request.strategyType,
      strategyId,
      `Backtest ${request.strategyType}`,
      `Generated backtest for ${request.symbols.join(', ')}`,
      request.symbols,
      request.strategyParams
    );

    // Parse dates if they are strings
    const startDate = typeof request.startDate === 'string'
      ? new Date(request.startDate)
      : request.startDate;

    const endDate = typeof request.endDate === 'string'
      ? new Date(request.endDate)
      : request.endDate;

    // Create the backtest configuration
    const config: BacktestConfig = {
      startDate,
      endDate,
      symbols: request.symbols,
      initialCapital: request.initialCapital,
      commission: request.commission,
      slippage: request.slippage,
      dataResolution: request.dataResolution,
      mode: request.mode
    };

    // Create and run the backtest engine
    const engine = new BacktestEngine(strategy, config, this.dataFeed);
    this.activeBacktests.set(strategyId, engine);

    try {
      // Set up event forwarding
      const forwardEvents = (eventName: string) => {
        engine.on(eventName, (data) => {
          console.log(`[Backtest ${strategyId}] ${eventName}:`, data);
        });
      };

      forwardEvents('started');
      forwardEvents('loading');
      forwardEvents('loaded');
      forwardEvents('progress');
      forwardEvents('orderFilled');
      forwardEvents('orderRejected');
      forwardEvents('completed');
      forwardEvents('error');

      // Run the backtest
      const result = await engine.run();

      // Clean up
      this.activeBacktests.delete(strategyId);

      return result;
    } catch (error) {
      this.activeBacktests.delete(strategyId);
      throw error;
    }
  }

  /**
   * Optimize a strategy by running multiple backtests with different parameters
   */
  async optimizeStrategy(
    baseRequest: BacktestRequest,
    parameterGrid: Record<string, any[]>
  ): Promise<Array<BacktestResult & { parameters: Record<string, any> }>> {
    const results: Array<BacktestResult & { parameters: Record<string, any> }> = [];

    // Generate parameter combinations
    const paramKeys = Object.keys(parameterGrid);
    const combinations = this.generateParameterCombinations(parameterGrid, paramKeys);

    // Run a backtest for each combination
    for (const paramSet of combinations) {
      const request = {
        ...baseRequest,
        strategyParams: {
          ...baseRequest.strategyParams,
          ...paramSet
        }
      };

      try {
        const result = await this.runBacktest(request);
        results.push({
          ...result,
          parameters: paramSet
        });
      } catch (error) {
        console.error(`Optimization failed for parameters:`, paramSet, error);
      }
    }

    // Sort by performance metric (Sharpe ratio, best first)
    return results.sort((a, b) => b.sharpeRatio - a.sharpeRatio);
  }

  /**
   * Generate all combinations of parameters for grid search
   */
  private generateParameterCombinations(
    grid: Record<string, any[]>,
    keys: string[],
    current: Record<string, any> = {},
    index: number = 0,
    result: Record<string, any>[] = []
  ): Record<string, any>[] {
    if (index === keys.length) {
      result.push({ ...current });
      return result;
    }

    const key = keys[index];
    const values = grid[key];

    for (const value of values) {
      current[key] = value;
      this.generateParameterCombinations(grid, keys, current, index + 1, result);
    }

    return result;
  }

  /**
   * Get an active backtest engine by ID
   */
  getBacktestEngine(id: string): BacktestEngine | undefined {
    return this.activeBacktests.get(id);
  }

  /**
   * Cancel a running backtest
   */
  cancelBacktest(id: string): boolean {
    const engine = this.activeBacktests.get(id);
    if (!engine) return false;

    // The engine exposes no explicit cancel method, so just drop the reference
    this.activeBacktests.delete(id);
    return true;
  }
}
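The recursive grid expansion in `generateParameterCombinations` can also be written as a fold over the grid entries; a minimal standalone sketch producing the same set of combinations (the parameter names in the test are illustrative, not engine parameters):

```typescript
// Standalone sketch of the grid expansion used by optimizeStrategy():
// fold each key's value list into the running list of partial combinations.
function expandGrid(grid: Record<string, any[]>): Record<string, any>[] {
  return Object.entries(grid).reduce<Record<string, any>[]>(
    (combos, [key, values]) =>
      combos.flatMap(combo => values.map(value => ({ ...combo, [key]: value }))),
    [{}]
  );
}
```

For a grid with two keys of two values each, this yields four combinations, matching what the recursive version accumulates.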
@@ -0,0 +1,166 @@
import { BarData } from '../Strategy';
import { DataFeed } from './BacktestEngine';
import axios from 'axios';

export class MarketDataFeed implements DataFeed {
  private readonly apiBaseUrl: string;
  private cache: Map<string, BarData[]> = new Map();

  constructor(apiBaseUrl: string = 'http://localhost:3001/api') {
    this.apiBaseUrl = apiBaseUrl;
  }

  async getHistoricalData(symbol: string, resolution: string, start: Date, end: Date): Promise<BarData[]> {
    const cacheKey = this.getCacheKey(symbol, resolution, start, end);

    // Check the cache first
    if (this.cache.has(cacheKey)) {
      return this.cache.get(cacheKey)!;
    }

    try {
      // Format dates for the API request
      const startStr = start.toISOString();
      const endStr = end.toISOString();

      const response = await axios.get(`${this.apiBaseUrl}/market-data/history`, {
        params: {
          symbol,
          resolution,
          start: startStr,
          end: endStr
        }
      });

      if (!response.data.success || !response.data.data) {
        throw new Error(`Failed to fetch historical data for ${symbol}`);
      }

      // Transform the API response into BarData objects
      const bars: BarData[] = response.data.data.map((bar: any) => ({
        symbol,
        timestamp: new Date(bar.timestamp),
        open: bar.open,
        high: bar.high,
        low: bar.low,
        close: bar.close,
        volume: bar.volume
      }));

      // Cache the result
      this.cache.set(cacheKey, bars);

      return bars;
    } catch (error) {
      console.error(`Error fetching historical data for ${symbol}:`, error);
      // Return fallback test data if the API call fails
      return this.generateFallbackTestData(symbol, resolution, start, end);
    }
  }

  async hasDataFor(symbol: string, resolution: string, start: Date, end: Date): Promise<boolean> {
    try {
      const startStr = start.toISOString();
      const endStr = end.toISOString();

      const response = await axios.get(`${this.apiBaseUrl}/market-data/available`, {
        params: {
          symbol,
          resolution,
          start: startStr,
          end: endStr
        }
      });

      return response.data.success && response.data.data.available;
    } catch (error) {
      console.error(`Error checking data availability for ${symbol}:`, error);
      // Assume data is available for test purposes
      return true;
    }
  }

  clearCache(): void {
    this.cache.clear();
  }

  private getCacheKey(symbol: string, resolution: string, start: Date, end: Date): string {
    return `${symbol}_${resolution}_${start.getTime()}_${end.getTime()}`;
  }

  private generateFallbackTestData(symbol: string, resolution: string, start: Date, end: Date): BarData[] {
    console.warn(`Generating fallback test data for ${symbol} from ${start} to ${end}`);

    const bars: BarData[] = [];
    let current = new Date(start);
    let basePrice = this.getBasePrice(symbol);

    // Step through the range at the requested resolution
    const interval = this.getIntervalFromResolution(resolution);

    while (current.getTime() <= end.getTime()) {
      // Only generate bars for trading days (skip weekends)
      if (current.getDay() !== 0 && current.getDay() !== 6) {
        // Generate a random price movement (-1% to +1%)
        const dailyChange = (Math.random() * 2 - 1) / 100;

        // Add some randomness to the volatility
        const volatility = 0.005 + Math.random() * 0.01; // 0.5% to 1.5%

        const open = basePrice * (1 + (Math.random() * 0.002 - 0.001));
        const close = open * (1 + dailyChange);
        const high = Math.max(open, close) * (1 + Math.random() * volatility);
        const low = Math.min(open, close) * (1 - Math.random() * volatility);
        const volume = Math.floor(100000 + Math.random() * 900000);

        bars.push({
          symbol,
          timestamp: new Date(current),
          open,
          high,
          low,
          close,
          volume
        });

        // Update the base price for the next bar
        basePrice = close;
      }

      // Move to the next interval
      current = new Date(current.getTime() + interval);
    }

    return bars;
  }

  private getBasePrice(symbol: string): number {
    // Return a realistic base price for common symbols
    switch (symbol.toUpperCase()) {
      case 'AAPL': return 170 + Math.random() * 30;
      case 'MSFT': return 370 + Math.random() * 50;
      case 'AMZN': return 140 + Math.random() * 20;
      case 'GOOGL': return 130 + Math.random() * 20;
      case 'META': return 300 + Math.random() * 50;
      case 'TSLA': return 180 + Math.random() * 70;
      case 'NVDA': return 700 + Math.random() * 200;
      case 'SPY': return 450 + Math.random() * 30;
      case 'QQQ': return 370 + Math.random() * 40;
      default: return 100 + Math.random() * 50;
    }
  }

  private getIntervalFromResolution(resolution: string): number {
    // Return the bar interval in milliseconds for each resolution
    switch (resolution) {
      case '1m': return 60 * 1000;
      case '5m': return 5 * 60 * 1000;
      case '15m': return 15 * 60 * 1000;
      case '30m': return 30 * 60 * 1000;
      case '1h': return 60 * 60 * 1000;
      case '4h': return 4 * 60 * 60 * 1000;
      case '1d': return 24 * 60 * 60 * 1000;
      default: return 24 * 60 * 60 * 1000; // Default to daily
    }
  }
}
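The fallback generator keeps each synthetic bar internally consistent: `high` is never below `open` or `close`, and `low` is never above them. A minimal sketch of a single bar built the same way (illustrative only, not part of the feed):

```typescript
// One synthetic OHLC bar constructed as in generateFallbackTestData():
// high/low bracket open and close by a random volatility factor.
function syntheticBar(basePrice: number): { open: number; high: number; low: number; close: number } {
  const dailyChange = (Math.random() * 2 - 1) / 100;   // -1% to +1%
  const volatility = 0.005 + Math.random() * 0.01;     // 0.5% to 1.5%
  const open = basePrice * (1 + (Math.random() * 0.002 - 0.001));
  const close = open * (1 + dailyChange);
  const high = Math.max(open, close) * (1 + Math.random() * volatility);
  const low = Math.min(open, close) * (1 - Math.random() * volatility);
  return { open, high, low, close };
}
```

The ordering invariant `low <= min(open, close) <= max(open, close) <= high` holds by construction for every bar.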
@@ -0,0 +1,325 @@
import { BacktestResult } from './BacktestEngine';

/**
 * Performance Analysis Utilities
 *
 * Provides additional metrics and analysis tools for backtesting results.
 */
export class PerformanceAnalytics {
  /**
   * Calculate additional metrics from backtest results
   */
  static enhanceResults(result: BacktestResult): BacktestResult {
    // Merge the advanced metrics into the base result
    const enhancedResult = {
      ...result,
      ...this.calculateAdvancedMetrics(result)
    };

    return enhancedResult;
  }

  /**
   * Calculate advanced performance metrics
   */
  private static calculateAdvancedMetrics(result: BacktestResult): Partial<BacktestResult> {
    // Extract daily returns
    const dailyReturns = result.dailyReturns.map(dr => dr.return);

    // Calculate the Sortino ratio
    const sortinoRatio = this.calculateSortinoRatio(dailyReturns);

    // Calculate the Calmar ratio
    const calmarRatio = result.maxDrawdown > 0
      ? result.annualizedReturn / result.maxDrawdown
      : Infinity;

    // Calculate the Omega ratio
    const omegaRatio = this.calculateOmegaRatio(dailyReturns);

    // Calculate CAGR
    const startTimestamp = result.startDate.getTime();
    const endTimestamp = result.endDate.getTime();
    const yearsElapsed = (endTimestamp - startTimestamp) / (365 * 24 * 60 * 60 * 1000);
    const cagr = Math.pow(result.finalCapital / result.initialCapital, 1 / yearsElapsed) - 1;

    // Calculate additional volatility and return metrics
    const volatility = this.calculateVolatility(dailyReturns);
    const ulcerIndex = this.calculateUlcerIndex(result.dailyReturns);

    return {
      sortinoRatio,
      calmarRatio,
      omegaRatio,
      cagr,
      volatility,
      ulcerIndex
    };
  }

  /**
   * Calculate Sortino ratio (downside risk-adjusted return)
   */
  private static calculateSortinoRatio(dailyReturns: number[]): number {
    if (dailyReturns.length === 0) return 0;

    const avgReturn = dailyReturns.reduce((sum, ret) => sum + ret, 0) / dailyReturns.length;

    // Filter only negative returns (downside)
    const negativeReturns = dailyReturns.filter(ret => ret < 0);

    if (negativeReturns.length === 0) return Infinity;

    // Calculate the downside deviation
    const downsideDeviation = Math.sqrt(
      negativeReturns.reduce((sum, ret) => sum + Math.pow(ret, 2), 0) / negativeReturns.length
    );

    // Annualize (252 trading days)
    const annualizedReturn = avgReturn * 252;
    const annualizedDownsideDeviation = downsideDeviation * Math.sqrt(252);

    return annualizedDownsideDeviation > 0
      ? annualizedReturn / annualizedDownsideDeviation
      : 0;
  }

  /**
   * Calculate Omega ratio (probability-weighted ratio of gains versus losses)
   */
  private static calculateOmegaRatio(dailyReturns: number[], threshold = 0): number {
    if (dailyReturns.length === 0) return 0;

    let sumGains = 0;
    let sumLosses = 0;

    for (const ret of dailyReturns) {
      if (ret > threshold) {
        sumGains += (ret - threshold);
      } else {
        sumLosses += (threshold - ret);
      }
    }

    return sumLosses > 0 ? sumGains / sumLosses : Infinity;
  }

  /**
   * Calculate annualized volatility
   */
  private static calculateVolatility(returns: number[]): number {
    if (returns.length < 2) return 0;

    const mean = returns.reduce((sum, ret) => sum + ret, 0) / returns.length;
    const variance = returns.reduce((sum, ret) => sum + Math.pow(ret - mean, 2), 0) / returns.length;

    // Annualize (252 trading days)
    return Math.sqrt(variance * 252);
  }

  /**
   * Calculate Ulcer Index (measure of downside risk)
   */
  private static calculateUlcerIndex(dailyReturns: Array<{ date: Date; return: number }>): number {
    if (dailyReturns.length === 0) return 0;

    // Calculate the running equity curve
    let equity = 1;
    const equityCurve = dailyReturns.map(dr => {
      equity *= (1 + dr.return);
      return equity;
    });

    // Find the running maximum
    const runningMax: number[] = [];
    let currentMax = equityCurve[0];

    for (const value of equityCurve) {
      currentMax = Math.max(currentMax, value);
      runningMax.push(currentMax);
    }

    // Calculate percentage drawdowns
    const percentDrawdowns = equityCurve.map((value, i) =>
      (runningMax[i] - value) / runningMax[i]
    );

    // The Ulcer Index is the root mean square of the drawdowns
    const sumSquaredDrawdowns = percentDrawdowns.reduce(
      (sum, dd) => sum + dd * dd, 0
    );

    return Math.sqrt(sumSquaredDrawdowns / percentDrawdowns.length);
  }
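`calculateUlcerIndex` compounds the equity curve, tracks the running maximum, and takes the root mean square of the percentage drawdowns. A standalone single-pass sketch of the same computation (illustrative only):

```typescript
// Standalone version of calculateUlcerIndex(): RMS of percentage drawdowns
// along the compounded equity curve, computed in one pass.
function ulcerIndex(dailyReturns: number[]): number {
  if (dailyReturns.length === 0) return 0;
  let equity = 1;
  let runningMax = -Infinity;
  let sumSquared = 0;
  for (const r of dailyReturns) {
    equity *= 1 + r;
    runningMax = Math.max(runningMax, equity);
    const dd = (runningMax - equity) / runningMax; // percentage drawdown at this point
    sumSquared += dd * dd;
  }
  return Math.sqrt(sumSquared / dailyReturns.length);
}
```

A monotonically rising equity curve has zero drawdown at every point, so the index is 0; any dip below a prior peak makes it positive.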
  /**
   * Extract monthly returns from daily returns
   */
  static calculateMonthlyReturns(dailyReturns: Array<{ date: Date; return: number }>): Array<{
    year: number;
    month: number;
    return: number;
  }> {
    const monthlyReturns: Array<{ year: number; month: number; return: number }> = [];

    if (dailyReturns.length === 0) return monthlyReturns;

    // Group returns by year and month
    const groupedReturns: Record<string, number[]> = {};

    for (const dr of dailyReturns) {
      const year = dr.date.getFullYear();
      const month = dr.date.getMonth();
      const key = `${year}-${month}`;

      if (!groupedReturns[key]) {
        groupedReturns[key] = [];
      }

      groupedReturns[key].push(dr.return);
    }

    // Calculate the compound return for each month
    for (const key in groupedReturns) {
      const [yearStr, monthStr] = key.split('-');
      const year = parseInt(yearStr, 10);
      const month = parseInt(monthStr, 10);

      // Compound the daily returns for the month
      const monthReturn = groupedReturns[key].reduce(
        (product, ret) => product * (1 + ret), 1
      ) - 1;

      monthlyReturns.push({ year, month, return: monthReturn });
    }

    // Sort by date
    return monthlyReturns.sort((a, b) => {
      if (a.year !== b.year) return a.year - b.year;
      return a.month - b.month;
    });
  }

  /**
   * Create drawdown analysis from the equity curve
   */
  static analyzeDrawdowns(dailyReturns: Array<{ date: Date; return: number }>): Array<{
    startDate: Date;
    endDate: Date;
    recoveryDate: Date | null;
    drawdown: number;
    durationDays: number;
    recoveryDays: number | null;
  }> {
    if (dailyReturns.length === 0) return [];

    // Calculate the equity curve
    let equity = 1;
    const equityCurve = dailyReturns.map(dr => {
      equity *= (1 + dr.return);
      return { date: dr.date, equity };
    });

    // Analyze drawdowns
    const drawdowns: Array<{
      startDate: Date;
      endDate: Date;
      recoveryDate: Date | null;
      drawdown: number;
      durationDays: number;
      recoveryDays: number | null;
    }> = [];

    let peakEquity = equityCurve[0].equity;
    let peakDate = equityCurve[0].date;
    let inDrawdown = false;
    let currentDrawdown: {
      startDate: Date;
      endDate: Date;
      lowEquity: number;
      peakEquity: number;
    } | null = null;

    // Find drawdown periods
    for (let i = 1; i < equityCurve.length; i++) {
      const { date, equity } = equityCurve[i];

      // New peak
      if (equity > peakEquity) {
        peakEquity = equity;
        peakDate = date;

        // If recovering from a drawdown, record the recovery
        if (inDrawdown && currentDrawdown) {
          const recoveryDate = date;
          const drawdownPct = (currentDrawdown.peakEquity - currentDrawdown.lowEquity) /
|
||||
currentDrawdown.peakEquity;
|
||||
|
||||
const durationDays = Math.floor(
|
||||
(currentDrawdown.endDate.getTime() - currentDrawdown.startDate.getTime()) /
|
||||
(1000 * 60 * 60 * 24)
|
||||
);
|
||||
|
||||
const recoveryDays = Math.floor(
|
||||
(recoveryDate.getTime() - currentDrawdown.endDate.getTime()) /
|
||||
(1000 * 60 * 60 * 24)
|
||||
);
|
||||
|
||||
drawdowns.push({
|
||||
startDate: currentDrawdown.startDate,
|
||||
endDate: currentDrawdown.endDate,
|
||||
recoveryDate,
|
||||
drawdown: drawdownPct,
|
||||
durationDays,
|
||||
recoveryDays
|
||||
});
|
||||
|
||||
inDrawdown = false;
|
||||
currentDrawdown = null;
|
||||
}
|
||||
}
|
||||
// In drawdown
|
||||
else {
|
||||
const drawdownPct = (peakEquity - equity) / peakEquity;
|
||||
|
||||
if (!inDrawdown) {
|
||||
// Start of new drawdown
|
||||
inDrawdown = true;
|
||||
currentDrawdown = {
|
||||
startDate: peakDate,
|
||||
endDate: date,
|
||||
lowEquity: equity,
|
||||
peakEquity
|
||||
};
|
||||
} else if (currentDrawdown && equity < currentDrawdown.lowEquity) {
|
||||
// New low in current drawdown
|
||||
currentDrawdown.lowEquity = equity;
|
||||
currentDrawdown.endDate = date;
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
// Handle any ongoing drawdown at the end
|
||||
if (inDrawdown && currentDrawdown) {
|
||||
const drawdownPct = (currentDrawdown.peakEquity - currentDrawdown.lowEquity) /
|
||||
currentDrawdown.peakEquity;
|
||||
|
||||
const durationDays = Math.floor(
|
||||
(currentDrawdown.endDate.getTime() - currentDrawdown.startDate.getTime()) /
|
||||
(1000 * 60 * 60 * 24)
|
||||
);
|
||||
|
||||
drawdowns.push({
|
||||
startDate: currentDrawdown.startDate,
|
||||
endDate: currentDrawdown.endDate,
|
||||
recoveryDate: null,
|
||||
drawdown: drawdownPct,
|
||||
durationDays,
|
||||
recoveryDays: null
|
||||
});
|
||||
}
|
||||
|
||||
// Sort by drawdown magnitude
|
||||
return drawdowns.sort((a, b) => b.drawdown - a.drawdown);
|
||||
}
|
||||
}
|
||||
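The Ulcer Index computed above is the root-mean-square of percentage drawdowns along the equity curve. The same math as a self-contained sketch (the sample return series is illustrative only, not data from the source):

```typescript
// Standalone sketch of the Ulcer Index calculation: compound the daily returns
// into an equity curve, measure each point's drawdown from the running peak,
// and take the root-mean-square of those drawdowns.
function ulcerIndex(dailyReturns: number[]): number {
  if (dailyReturns.length === 0) return 0;
  let equity = 1;
  const curve = dailyReturns.map(r => (equity *= 1 + r));
  let peak = curve[0];
  const drawdowns = curve.map(v => {
    peak = Math.max(peak, v);
    return (peak - v) / peak;
  });
  const sumSq = drawdowns.reduce((s, d) => s + d * d, 0);
  return Math.sqrt(sumSq / drawdowns.length);
}

const ui = ulcerIndex([0.1, -0.2, 0.1]); // roughly 0.135 for this sample
```

A monotonically rising equity curve yields an Ulcer Index of 0; deeper or longer drawdowns raise it, which is why it penalizes sustained pain more than single-bar volatility does.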
0
apps/intelligence-services/backtest-engine/src/index.ts
Normal file
0
apps/intelligence-services/backtest-engine/tsconfig.json
Normal file
0
apps/intelligence-services/signal-engine/README.md
Normal file
0
apps/intelligence-services/signal-engine/package.json
Normal file
0
apps/intelligence-services/signal-engine/src/index.ts
Normal file
0
apps/intelligence-services/signal-engine/tsconfig.json
Normal file
@@ -2,11 +2,11 @@
   "name": "strategy-orchestrator",
   "version": "1.0.0",
   "description": "Trading strategy lifecycle management service",
   "main": "src/index.ts",
   "scripts": {
     "dev": "bun run --watch src/index.ts",
     "start": "bun run src/index.ts",
-    "test": "echo 'No tests yet'"
+    "test": "bun test --timeout 10000 src/tests/**/*.test.ts",
+    "test:watch": "bun test --watch src/tests/**/*.test.ts"
   },
   "dependencies": {
     "hono": "^4.6.3",
@@ -0,0 +1,267 @@
import { Request, Response } from 'express';
import { StrategyRegistry, StrategyType } from '../core/strategies/StrategyRegistry';
import { BacktestRequest, BacktestService } from '../core/backtesting/BacktestService';
import { BaseStrategy } from '../core/Strategy';
import { PerformanceAnalytics } from '../core/backtesting/PerformanceAnalytics';

/**
 * Strategy Controller
 *
 * Handles HTTP requests related to strategy management, backtesting, and execution.
 */
export class StrategyController {
  private readonly strategyRegistry: StrategyRegistry;
  private readonly backtestService: BacktestService;

  constructor(apiBaseUrl: string = 'http://localhost:3001/api') {
    this.strategyRegistry = StrategyRegistry.getInstance();
    this.backtestService = new BacktestService(apiBaseUrl);
  }

  /**
   * Get all available strategy types
   */
  public getStrategyTypes(req: Request, res: Response): void {
    const types = Object.values(StrategyType);
    res.json({
      success: true,
      data: types
    });
  }

  /**
   * Get all strategies
   */
  public getStrategies(req: Request, res: Response): void {
    const strategies = this.strategyRegistry.getAllStrategies();

    // Convert to array of plain objects for serialization
    const serializedStrategies = strategies.map(strategy => ({
      id: strategy.id,
      name: strategy.name,
      description: strategy.description,
      symbols: strategy.symbols,
      parameters: strategy.parameters,
      type: this.strategyRegistry.getStrategyType(strategy)
    }));

    res.json({
      success: true,
      data: serializedStrategies
    });
  }

  /**
   * Get a specific strategy by ID
   */
  public getStrategy(req: Request, res: Response): void {
    const { id } = req.params;
    const strategy = this.strategyRegistry.getStrategyById(id);

    if (!strategy) {
      res.status(404).json({
        success: false,
        error: `Strategy with ID ${id} not found`
      });
      return;
    }

    const type = this.strategyRegistry.getStrategyType(strategy);

    res.json({
      success: true,
      data: {
        id: strategy.id,
        name: strategy.name,
        description: strategy.description,
        symbols: strategy.symbols,
        parameters: strategy.parameters,
        type
      }
    });
  }

  /**
   * Create a new strategy
   */
  public createStrategy(req: Request, res: Response): void {
    try {
      const { name, description, symbols, parameters, type } = req.body;

      if (!type || !Object.values(StrategyType).includes(type)) {
        res.status(400).json({
          success: false,
          error: 'Invalid strategy type'
        });
        return;
      }

      const strategy = this.strategyRegistry.createStrategy(
        type as StrategyType,
        `strategy_${Date.now()}`, // Generate an ID
        name || `New ${type} Strategy`,
        description || `Generated ${type} strategy`,
        symbols || [],
        parameters || {}
      );

      res.status(201).json({
        success: true,
        data: {
          id: strategy.id,
          name: strategy.name,
          description: strategy.description,
          symbols: strategy.symbols,
          parameters: strategy.parameters,
          type
        }
      });
    } catch (error) {
      res.status(500).json({
        success: false,
        error: (error as Error).message
      });
    }
  }

  /**
   * Update an existing strategy
   */
  public updateStrategy(req: Request, res: Response): void {
    const { id } = req.params;
    const { name, description, symbols, parameters } = req.body;

    const strategy = this.strategyRegistry.getStrategyById(id);

    if (!strategy) {
      res.status(404).json({
        success: false,
        error: `Strategy with ID ${id} not found`
      });
      return;
    }

    // Update properties
    if (name !== undefined) strategy.name = name;
    if (description !== undefined) strategy.description = description;
    if (symbols !== undefined) (strategy as any).symbols = symbols; // Hack since symbols is readonly
    if (parameters !== undefined) strategy.parameters = parameters;

    res.json({
      success: true,
      data: {
        id: strategy.id,
        name: strategy.name,
        description: strategy.description,
        symbols: strategy.symbols,
        parameters: strategy.parameters,
        type: this.strategyRegistry.getStrategyType(strategy)
      }
    });
  }

  /**
   * Delete a strategy
   */
  public deleteStrategy(req: Request, res: Response): void {
    const { id } = req.params;
    const success = this.strategyRegistry.deleteStrategy(id);

    if (!success) {
      res.status(404).json({
        success: false,
        error: `Strategy with ID ${id} not found`
      });
      return;
    }

    res.json({
      success: true,
      data: { id }
    });
  }

  /**
   * Run a backtest
   */
  public async runBacktest(req: Request, res: Response): Promise<void> {
    try {
      const backtestRequest: BacktestRequest = req.body;

      // Validate request
      if (!backtestRequest.strategyType) {
        res.status(400).json({
          success: false,
          error: 'Strategy type is required'
        });
        return;
      }

      if (!backtestRequest.symbols || backtestRequest.symbols.length === 0) {
        res.status(400).json({
          success: false,
          error: 'At least one symbol is required'
        });
        return;
      }

      // Run the backtest
      const result = await this.backtestService.runBacktest(backtestRequest);

      // Enhance results with additional metrics
      const enhancedResult = PerformanceAnalytics.enhanceResults(result);

      // Calculate additional analytics
      const monthlyReturns = PerformanceAnalytics.calculateMonthlyReturns(result.dailyReturns);
      const drawdowns = PerformanceAnalytics.analyzeDrawdowns(result.dailyReturns);

      res.json({
        success: true,
        data: {
          ...enhancedResult,
          monthlyReturns,
          drawdowns
        }
      });
    } catch (error) {
      console.error('Backtest error:', error);
      res.status(500).json({
        success: false,
        error: (error as Error).message
      });
    }
  }

  /**
   * Optimize a strategy with grid search
   */
  public async optimizeStrategy(req: Request, res: Response): Promise<void> {
    try {
      const { baseRequest, parameterGrid } = req.body;

      // Validate request
      if (!baseRequest || !parameterGrid) {
        res.status(400).json({
          success: false,
          error: 'Base request and parameter grid are required'
        });
        return;
      }

      // Run optimization
      const results = await this.backtestService.optimizeStrategy(baseRequest, parameterGrid);

      res.json({
        success: true,
        data: results
      });
    } catch (error) {
      res.status(500).json({
        success: false,
        error: (error as Error).message
      });
    }
  }
}

export default StrategyController;
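Every controller method above returns the same `{ success, data }` / `{ success, error }` envelope. A minimal standalone sketch of that pattern as a discriminated union (the `ok`, `fail`, and `registry` names are hypothetical illustrations, not part of the service):

```typescript
// Discriminated union mirroring the controller's response envelope:
// the `success` flag tells TypeScript which branch holds `data` vs `error`.
type ApiResponse<T> =
  | { success: true; data: T }
  | { success: false; error: string };

function ok<T>(data: T): ApiResponse<T> {
  return { success: true, data };
}

function fail<T>(error: string): ApiResponse<T> {
  return { success: false, error };
}

// Mirrors getStrategy's lookup: a 404-style failure when the ID is unknown.
const registry = new Map<string, { name: string }>([['s1', { name: 'Momentum' }]]);

function lookupStrategy(id: string): ApiResponse<{ name: string }> {
  const strategy = registry.get(id);
  return strategy ? ok(strategy) : fail(`Strategy with ID ${id} not found`);
}
```

Keeping the envelope a tagged union means callers must check `success` before touching `data`, which the compiler enforces at every call site.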
@@ -0,0 +1,287 @@
import { EventEmitter } from 'events';

export interface BarData {
  symbol: string;
  timestamp: Date;
  open: number;
  high: number;
  low: number;
  close: number;
  volume: number;
}

export interface Position {
  symbol: string;
  quantity: number;
  avgPrice: number;
  side: 'LONG' | 'SHORT';
  entryTime: Date;
  unrealizedPnL?: number;
  realizedPnL?: number;
}

export interface Order {
  id: string;
  symbol: string;
  side: 'BUY' | 'SELL';
  quantity: number;
  price?: number; // Market order if undefined
  type: 'MARKET' | 'LIMIT' | 'STOP' | 'STOP_LIMIT';
  status: 'PENDING' | 'FILLED' | 'CANCELLED' | 'REJECTED';
  timestamp: Date;
  fillPrice?: number;
  fillTime?: Date;
}

export interface StrategyContext {
  currentTime: Date;
  portfolio: {
    cash: number;
    positions: Map<string, Position>;
    totalValue: number;
  };
  marketData: Map<string, BarData[]>; // Historical data for each symbol
  indicators: Map<string, any>; // Cached indicator values
}

export interface StrategyParameters {
  [key: string]: number | string | boolean | any[];
}

export interface StrategyMetrics {
  totalReturn: number;
  totalTrades: number;
  winningTrades: number;
  losingTrades: number;
  winRate: number;
  avgWin: number;
  avgLoss: number;
  profitFactor: number;
  sharpeRatio: number;
  maxDrawdown: number;
  maxDrawdownDuration: number;
  calmarRatio: number;
  sortinoRatio: number;
  beta: number;
  alpha: number;
  volatility: number;
}

export abstract class BaseStrategy extends EventEmitter {
  public readonly id: string;
  public readonly name: string;
  public readonly description: string;
  public readonly symbols: string[];
  public parameters: StrategyParameters;

  protected context: StrategyContext;
  protected isInitialized: boolean = false;

  constructor(
    id: string,
    name: string,
    description: string,
    symbols: string[],
    parameters: StrategyParameters = {}
  ) {
    super();
    this.id = id;
    this.name = name;
    this.description = description;
    this.symbols = symbols;
    this.parameters = parameters;

    this.context = {
      currentTime: new Date(),
      portfolio: {
        cash: 100000, // Default starting capital
        positions: new Map(),
        totalValue: 100000
      },
      marketData: new Map(),
      indicators: new Map()
    };
  }

  // Abstract methods that must be implemented by strategy subclasses
  abstract initialize(): Promise<void>;
  abstract onBar(bar: BarData): Promise<Order[]>;
  abstract onOrderFilled(order: Order): Promise<void>;
  abstract cleanup(): Promise<void>;

  // Lifecycle methods
  async start(): Promise<void> {
    if (!this.isInitialized) {
      await this.initialize();
      this.isInitialized = true;
    }
    this.emit('started', { strategyId: this.id });
  }

  async stop(): Promise<void> {
    await this.cleanup();
    this.emit('stopped', { strategyId: this.id });
  }

  // Market data management
  addBar(bar: BarData): void {
    this.context.currentTime = bar.timestamp;

    if (!this.context.marketData.has(bar.symbol)) {
      this.context.marketData.set(bar.symbol, []);
    }

    const bars = this.context.marketData.get(bar.symbol)!;
    bars.push(bar);

    // Keep only last 1000 bars to manage memory
    if (bars.length > 1000) {
      bars.shift();
    }
  }

  // Portfolio management helpers
  protected getCurrentPrice(symbol: string): number | null {
    const bars = this.context.marketData.get(symbol);
    return bars && bars.length > 0 ? bars[bars.length - 1].close : null;
  }

  protected getPosition(symbol: string): Position | null {
    return this.context.portfolio.positions.get(symbol) || null;
  }

  protected hasPosition(symbol: string): boolean {
    return this.context.portfolio.positions.has(symbol);
  }

  protected getAvailableCash(): number {
    return this.context.portfolio.cash;
  }

  protected calculatePositionValue(symbol: string): number {
    const position = this.getPosition(symbol);
    const currentPrice = this.getCurrentPrice(symbol);

    if (!position || !currentPrice) return 0;

    return position.quantity * currentPrice;
  }

  protected updatePortfolioValue(): void {
    let totalValue = this.context.portfolio.cash;

    for (const [symbol, position] of this.context.portfolio.positions) {
      const currentPrice = this.getCurrentPrice(symbol);
      if (currentPrice) {
        totalValue += position.quantity * currentPrice;
      }
    }

    this.context.portfolio.totalValue = totalValue;
  }

  // Order creation helpers
  protected createMarketOrder(symbol: string, side: 'BUY' | 'SELL', quantity: number): Order {
    return {
      id: this.generateOrderId(),
      symbol,
      side,
      quantity: Math.abs(quantity),
      type: 'MARKET',
      status: 'PENDING',
      timestamp: this.context.currentTime
    };
  }

  protected createLimitOrder(
    symbol: string,
    side: 'BUY' | 'SELL',
    quantity: number,
    price: number
  ): Order {
    return {
      id: this.generateOrderId(),
      symbol,
      side,
      quantity: Math.abs(quantity),
      price,
      type: 'LIMIT',
      status: 'PENDING',
      timestamp: this.context.currentTime
    };
  }

  protected createStopOrder(
    symbol: string,
    side: 'BUY' | 'SELL',
    quantity: number,
    stopPrice: number
  ): Order {
    return {
      id: this.generateOrderId(),
      symbol,
      side,
      quantity: Math.abs(quantity),
      price: stopPrice,
      type: 'STOP',
      status: 'PENDING',
      timestamp: this.context.currentTime
    };
  }

  private generateOrderId(): string {
    return `${this.id}_${Date.now()}_${Math.random().toString(36).substr(2, 9)}`;
  }

  // Utility methods for common strategy patterns
  protected getBarsSince(symbol: string, periods: number): BarData[] {
    const bars = this.context.marketData.get(symbol) || [];
    return bars.slice(-periods);
  }

  protected getReturns(symbol: string, periods: number): number[] {
    const bars = this.getBarsSince(symbol, periods + 1);
    const returns: number[] = [];

    for (let i = 1; i < bars.length; i++) {
      const returnPct = (bars[i].close - bars[i - 1].close) / bars[i - 1].close;
      returns.push(returnPct);
    }

    return returns;
  }

  protected getVolatility(symbol: string, periods: number): number {
    const returns = this.getReturns(symbol, periods);
    if (returns.length === 0) return 0;

    const mean = returns.reduce((sum, ret) => sum + ret, 0) / returns.length;
    const variance = returns.reduce((sum, ret) => sum + Math.pow(ret - mean, 2), 0) / returns.length;

    return Math.sqrt(variance * 252); // Annualized volatility
  }

  // Parameter validation
  protected validateParameters(): boolean {
    // Override in subclasses for parameter validation
    return true;
  }

  // Get strategy state for serialization
  getState() {
    return {
      id: this.id,
      name: this.name,
      description: this.description,
      symbols: this.symbols,
      parameters: this.parameters,
      isInitialized: this.isInitialized,
      currentTime: this.context.currentTime,
      portfolio: {
        cash: this.context.portfolio.cash,
        totalValue: this.context.portfolio.totalValue,
        positions: Array.from(this.context.portfolio.positions.entries())
      }
    };
  }
}
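`getVolatility` above annualizes the population variance of per-bar returns with a factor of 252 (the conventional count of US trading days per year). The same calculation in isolation, using hypothetical sample returns:

```typescript
// Standalone sketch of the annualized-volatility helper: population variance
// of daily returns, scaled by 252 trading days, then square-rooted.
function annualizedVolatility(returns: number[]): number {
  if (returns.length === 0) return 0;
  const mean = returns.reduce((s, r) => s + r, 0) / returns.length;
  const variance = returns.reduce((s, r) => s + (r - mean) ** 2, 0) / returns.length;
  return Math.sqrt(variance * 252);
}

// Two symmetric daily moves of ±1% give mean 0, variance 1e-4,
// so the annualized volatility is sqrt(0.0252), roughly 15.9%.
const vol = annualizedVolatility([0.01, -0.01]);
```

Note this uses the population divisor `n`, matching the source; a sample estimate would divide by `n - 1`, which matters for short windows.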
@ -0,0 +1,362 @@
|
|||
import { BarData } from '../Strategy';
|
||||
|
||||
export class TechnicalIndicators {
|
||||
/**
|
||||
* Calculate Simple Moving Average (SMA)
|
||||
* @param prices Array of price values
|
||||
* @param period Number of periods for calculation
|
||||
* @returns Array of SMA values
|
||||
*/
|
||||
static sma(prices: number[], period: number): number[] {
|
||||
if (period <= 0 || prices.length === 0) return [];
|
||||
|
||||
const result: number[] = [];
|
||||
|
||||
// Not enough data for calculation
|
||||
if (prices.length < period) {
|
||||
return Array(prices.length).fill(NaN);
|
||||
}
|
||||
|
||||
// Calculate first SMA
|
||||
let sum = 0;
|
||||
for (let i = 0; i < period; i++) {
|
||||
sum += prices[i];
|
||||
}
|
||||
|
||||
result.push(sum / period);
|
||||
|
||||
// Calculate subsequent SMAs using previous sum
|
||||
for (let i = period; i < prices.length; i++) {
|
||||
sum = sum - prices[i - period] + prices[i];
|
||||
result.push(sum / period);
|
||||
}
|
||||
|
||||
// Fill beginning with NaN
|
||||
const nanValues = Array(period - 1).fill(NaN);
|
||||
|
||||
return [...nanValues, ...result];
|
||||
}
|
||||
|
||||
/**
|
||||
* Calculate Exponential Moving Average (EMA)
|
||||
* @param prices Array of price values
|
||||
* @param period Number of periods for calculation
|
||||
* @returns Array of EMA values
|
||||
*/
|
||||
static ema(prices: number[], period: number): number[] {
|
||||
if (period <= 0 || prices.length === 0) return [];
|
||||
|
||||
const result: number[] = [];
|
||||
const multiplier = 2 / (period + 1);
|
||||
|
||||
// Not enough data for calculation
|
||||
if (prices.length < period) {
|
||||
return Array(prices.length).fill(NaN);
|
||||
}
|
||||
|
||||
// Calculate SMA for first EMA value
|
||||
let sum = 0;
|
||||
for (let i = 0; i < period; i++) {
|
||||
sum += prices[i];
|
||||
}
|
||||
|
||||
// First EMA is SMA
|
||||
let ema = sum / period;
|
||||
result.push(ema);
|
||||
|
||||
// Calculate subsequent EMAs
|
||||
for (let i = period; i < prices.length; i++) {
|
||||
ema = (prices[i] - ema) * multiplier + ema;
|
||||
result.push(ema);
|
||||
}
|
||||
|
||||
// Fill beginning with NaN
|
||||
const nanValues = Array(period - 1).fill(NaN);
|
||||
|
||||
return [...nanValues, ...result];
|
||||
}
|
||||
|
||||
/**
|
||||
* Calculate Relative Strength Index (RSI)
|
||||
* @param prices Array of price values
|
||||
* @param period Number of periods for calculation
|
||||
* @returns Array of RSI values
|
||||
*/
|
||||
static rsi(prices: number[], period: number): number[] {
|
||||
if (period <= 0 || prices.length < period + 1) {
|
||||
return Array(prices.length).fill(NaN);
|
||||
}
|
||||
|
||||
const result: number[] = [];
|
||||
const gains: number[] = [];
|
||||
const losses: number[] = [];
|
||||
|
||||
// Calculate price changes
|
||||
for (let i = 1; i < prices.length; i++) {
|
||||
const change = prices[i] - prices[i - 1];
|
||||
gains.push(change > 0 ? change : 0);
|
||||
losses.push(change < 0 ? Math.abs(change) : 0);
|
||||
}
|
||||
|
||||
// Not enough data
|
||||
if (gains.length < period) {
|
||||
return Array(prices.length).fill(NaN);
|
||||
}
|
||||
|
||||
// Calculate first average gain and loss
|
||||
let avgGain = 0;
|
||||
let avgLoss = 0;
|
||||
|
||||
for (let i = 0; i < period; i++) {
|
||||
avgGain += gains[i];
|
||||
avgLoss += losses[i];
|
||||
}
|
||||
|
||||
avgGain /= period;
|
||||
avgLoss /= period;
|
||||
|
||||
// Calculate first RSI
|
||||
let rs = avgGain / (avgLoss === 0 ? 0.001 : avgLoss); // Avoid division by zero
|
||||
let rsi = 100 - (100 / (1 + rs));
|
||||
result.push(rsi);
|
||||
|
||||
// Calculate subsequent RSIs
|
||||
for (let i = period; i < gains.length; i++) {
|
||||
// Smooth averages
|
||||
avgGain = ((avgGain * (period - 1)) + gains[i]) / period;
|
||||
avgLoss = ((avgLoss * (period - 1)) + losses[i]) / period;
|
||||
|
||||
// Calculate RS and RSI
|
||||
rs = avgGain / (avgLoss === 0 ? 0.001 : avgLoss);
|
||||
rsi = 100 - (100 / (1 + rs));
|
||||
result.push(rsi);
|
||||
}
|
||||
|
||||
// Fill beginning with NaN
|
||||
const nanValues = Array(period).fill(NaN);
|
||||
|
||||
return [...nanValues, ...result];
|
||||
}
|
||||
|
||||
/**
|
||||
* Calculate Moving Average Convergence Divergence (MACD)
|
||||
* @param prices Array of price values
|
||||
* @param fastPeriod Fast EMA period (default: 12)
|
||||
* @param slowPeriod Slow EMA period (default: 26)
|
||||
* @param signalPeriod Signal line period (default: 9)
|
||||
* @returns Object containing MACD line, signal line, and histogram
|
||||
*/
|
||||
static macd(
|
||||
prices: number[],
|
||||
fastPeriod: number = 12,
|
||||
slowPeriod: number = 26,
|
||||
signalPeriod: number = 9
|
||||
): { macdLine: number[], signalLine: number[], histogram: number[] } {
|
||||
// Calculate EMAs
|
||||
const fastEMA = this.ema(prices, fastPeriod);
|
||||
const slowEMA = this.ema(prices, slowPeriod);
|
||||
|
||||
// Calculate MACD line (fast EMA - slow EMA)
|
||||
const macdLine: number[] = [];
|
||||
for (let i = 0; i < prices.length; i++) {
|
||||
macdLine.push(isNaN(fastEMA[i]) || isNaN(slowEMA[i])
|
||||
? NaN
|
||||
: fastEMA[i] - slowEMA[i]);
|
||||
}
|
||||
|
||||
// Calculate signal line (EMA of MACD line)
|
||||
const signalLine = this.ema(macdLine.filter(val => !isNaN(val)), signalPeriod);
|
||||
|
||||
// Pad signal line with NaNs to match original length
|
||||
const paddedSignalLine = Array(prices.length - signalLine.length).fill(NaN).concat(signalLine);
|
||||
|
||||
// Calculate histogram (MACD line - signal line)
|
||||
const histogram: number[] = [];
|
||||
for (let i = 0; i < prices.length; i++) {
|
||||
histogram.push(isNaN(macdLine[i]) || isNaN(paddedSignalLine[i])
|
||||
? NaN
|
||||
: macdLine[i] - paddedSignalLine[i]);
|
||||
}
|
||||
|
||||
return {
|
||||
macdLine,
|
||||
signalLine: paddedSignalLine,
|
||||
histogram
|
||||
};
|
||||
}
|
||||
|
||||
/**
|
||||
* Calculate Bollinger Bands
|
||||
* @param prices Array of price values
|
||||
* @param period SMA period (default: 20)
|
||||
* @param stdDevMultiplier Standard deviation multiplier (default: 2)
|
||||
* @returns Object containing upper band, middle band, and lower band
|
||||
*/
|
||||
static bollingerBands(
|
||||
prices: number[],
|
||||
period: number = 20,
|
||||
stdDevMultiplier: number = 2
|
||||
): { upper: number[], middle: number[], lower: number[] } {
|
||||
// Calculate middle band (SMA)
|
||||
const middle = this.sma(prices, period);
|
||||
|
||||
// Calculate standard deviation for each point
|
||||
const upper: number[] = [];
|
||||
const lower: number[] = [];
|
||||
|
||||
for (let i = 0; i < prices.length; i++) {
|
||||
if (isNaN(middle[i])) {
|
||||
upper.push(NaN);
|
||||
lower.push(NaN);
|
||||
continue;
|
||||
}
|
||||
|
||||
// Calculate standard deviation using values in the period window
|
||||
let stdDev = 0;
|
||||
let count = 0;
|
||||
|
||||
// Start index for the window
|
||||
const startIdx = Math.max(0, i - period + 1);
|
||||
|
||||
for (let j = startIdx; j <= i; j++) {
|
||||
stdDev += Math.pow(prices[j] - middle[i], 2);
|
||||
count++;
|
||||
}
|
||||
|
||||
stdDev = Math.sqrt(stdDev / count);
|
||||
|
||||
// Calculate bands
|
||||
upper.push(middle[i] + (stdDevMultiplier * stdDev));
|
||||
lower.push(middle[i] - (stdDevMultiplier * stdDev));
|
||||
}
|
||||
|
||||
return { upper, middle, lower };
|
||||
}
|
||||
|
||||
/**
|
||||
* Calculate Average True Range (ATR)
|
||||
* @param bars Array of BarData objects
|
||||
* @param period Number of periods for calculation
|
||||
* @returns Array of ATR values
|
||||
*/
|
||||
static atr(bars: BarData[], period: number): number[] {
|
||||
if (period <= 0 || bars.length < 2) {
|
||||
return Array(bars.length).fill(NaN);
|
||||
}
|
||||
|
||||
// Calculate True Range for each bar
|
||||
const trueRanges: number[] = [];
|
||||
|
||||
// First TR is high - low
|
||||
trueRanges.push(bars[0].high - bars[0].low);
|
||||
|
||||
// Calculate remaining TRs
|
||||
for (let i = 1; i < bars.length; i++) {
|
||||
const currentHigh = bars[i].high;
|
||||
const currentLow = bars[i].low;
|
||||
const previousClose = bars[i - 1].close;
|
||||
|
||||
const tr1 = currentHigh - currentLow;
|
||||
const tr2 = Math.abs(currentHigh - previousClose);
|
||||
const tr3 = Math.abs(currentLow - previousClose);
|
||||
|
||||
const tr = Math.max(tr1, tr2, tr3);
|
||||
trueRanges.push(tr);
|
||||
}
|
||||
|
||||
// Calculate ATR (first value is simple average)
|
||||
const result: number[] = [];
|
||||
|
||||
// Not enough data
|
||||
if (trueRanges.length < period) {
|
||||
return Array(bars.length).fill(NaN);
|
||||
}
|
||||
|
||||
// First ATR is simple average of true ranges
|
||||
let atr = 0;
|
||||
for (let i = 0; i < period; i++) {
|
||||
atr += trueRanges[i];
|
||||
}
|
||||
atr /= period;
|
||||
result.push(atr);
|
||||
|
||||
// Calculate subsequent ATRs using smoothing
|
||||
for (let i = period; i < trueRanges.length; i++) {
|
||||
atr = ((atr * (period - 1)) + trueRanges[i]) / period;
|
||||
result.push(atr);
|
||||
}
|
||||
|
||||
// Fill beginning with NaN
|
||||
const nanValues = Array(period).fill(NaN);
|
||||
|
||||
return [...nanValues, ...result];
|
||||
}

  /**
   * Calculate Stochastic Oscillator
   * @param bars Array of BarData objects
   * @param period %K period (default: 14)
   * @param smoothK %K smoothing (default: 3)
   * @param smoothD %D period (default: 3)
   * @returns Object containing %K and %D values
   */
  static stochastic(
    bars: BarData[],
    period: number = 14,
    smoothK: number = 3,
    smoothD: number = 3
  ): { k: number[], d: number[] } {
    if (period <= 0 || bars.length < period) {
      return { k: Array(bars.length).fill(NaN), d: Array(bars.length).fill(NaN) };
    }

    const rawK: number[] = [];

    // Calculate raw %K values
    for (let i = period - 1; i < bars.length; i++) {
      let highest = -Infinity;
      let lowest = Infinity;

      // Find the highest high and lowest low in the period
      for (let j = i - (period - 1); j <= i; j++) {
        highest = Math.max(highest, bars[j].high);
        lowest = Math.min(lowest, bars[j].low);
      }

      // Calculate raw %K
      const currentClose = bars[i].close;
      const rawKValue = 100 * ((currentClose - lowest) / (highest - lowest));
      rawK.push(rawKValue);
    }

    // Fill the beginning with NaN
    const nanValues = Array(period - 1).fill(NaN);
    const fullRawK = [...nanValues, ...rawK];

    // Apply smoothing to %K (SMA of raw %K)
    const filteredK = fullRawK.filter(val => !isNaN(val));
    let k = this.sma(filteredK, smoothK);

    // Pad with NaNs
    k = [...Array(fullRawK.length - k.length).fill(NaN), ...k];

    // Calculate %D (SMA of %K)
    const filteredSmoothedK = k.filter(val => !isNaN(val));
    let d = this.sma(filteredSmoothedK, smoothD);

    // Pad with NaNs
    d = [...Array(k.length - d.length).fill(NaN), ...d];

    return { k, d };
  }
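The raw %K formula above places the close within the period's high/low range. A standalone check of just that formula (a hypothetical helper; note that the class method above yields NaN on a perfectly flat range, while this sketch chooses to return the midpoint):

```typescript
// Raw stochastic %K for one bar, given the period's highs/lows and the close.
function rawStochK(highs: number[], lows: number[], close: number): number {
  const highest = Math.max(...highs);
  const lowest = Math.min(...lows);
  if (highest === lowest) return 50; // flat range: midpoint by convention (illustrative choice)
  return 100 * ((close - lowest) / (highest - lowest));
}

console.log(rawStochK([10, 12], [8, 9], 11)); // 75
```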

  /**
   * Extract a specific price series from bars (e.g., close, open, high, low)
   * @param bars Array of BarData objects
   * @param field Price field to extract
   * @returns Array of extracted price values
   */
  static extractPrice(bars: BarData[], field: 'open' | 'high' | 'low' | 'close' = 'close'): number[] {
    return bars.map(bar => bar[field]);
  }
}

@@ -0,0 +1,604 @@
import { EventEmitter } from 'events';
import { BarData, BaseStrategy, Order, Position } from '../Strategy';

export interface BacktestConfig {
  startDate: Date;
  endDate: Date;
  symbols: string[];
  initialCapital: number;
  commission: number; // Per-trade commission as a fraction of trade value (e.g. 0.001 = 0.1%)
  slippage: number;   // Slippage model as a fraction of price
  dataResolution: '1m' | '5m' | '15m' | '30m' | '1h' | '4h' | '1d';
  mode: 'event' | 'vector';
}
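For reference, a configuration for this engine might look like the following. The values are illustrative, and the interface is repeated inline so the snippet stands alone:

```typescript
// Inline copy of the BacktestConfig shape so this snippet is self-contained.
interface BacktestConfig {
  startDate: Date;
  endDate: Date;
  symbols: string[];
  initialCapital: number;
  commission: number;
  slippage: number;
  dataResolution: '1m' | '5m' | '15m' | '30m' | '1h' | '4h' | '1d';
  mode: 'event' | 'vector';
}

// Illustrative values: $100k starting capital, 10 bps commission, 5 bps slippage.
const config: BacktestConfig = {
  startDate: new Date('2023-01-01'),
  endDate: new Date('2023-12-31'),
  symbols: ['AAPL', 'MSFT'],
  initialCapital: 100_000,
  commission: 0.001,
  slippage: 0.0005,
  dataResolution: '1d',
  mode: 'event'
};
```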

export interface BacktestResult {
  strategyId: string;
  startDate: Date;
  endDate: Date;
  duration: number;            // In milliseconds
  initialCapital: number;
  finalCapital: number;
  totalReturn: number;
  annualizedReturn: number;
  sharpeRatio: number;
  maxDrawdown: number;
  maxDrawdownDuration: number; // In days
  winRate: number;
  totalTrades: number;
  winningTrades: number;
  losingTrades: number;
  averageWinningTrade: number;
  averageLosingTrade: number;
  profitFactor: number;
  dailyReturns: Array<{ date: Date; return: number }>;
  trades: Array<{
    symbol: string;
    entryTime: Date;
    entryPrice: number;
    exitTime: Date;
    exitPrice: number;
    quantity: number;
    pnl: number;
    pnlPercent: number;
  }>;
}

export interface BacktestProgress {
  progress: number;               // 0-100
  currentDate: Date;
  processingSpeed: number;        // Bars per second
  estimatedTimeRemaining: number; // In milliseconds
  currentCapital: number;
  currentReturn: number;
  currentDrawdown: number;
}

export interface DataFeed {
  getHistoricalData(symbol: string, resolution: string, start: Date, end: Date): Promise<BarData[]>;
  hasDataFor(symbol: string, resolution: string, start: Date, end: Date): Promise<boolean>;
}

export class BacktestEngine extends EventEmitter {
  private config: BacktestConfig;
  private strategy: BaseStrategy;
  private dataFeed: DataFeed;
  private isRunning: boolean = false;
  private barBuffer: Map<string, BarData[]> = new Map();
  private pendingOrders: Order[] = [];
  private filledOrders: Order[] = [];
  private currentTime: Date;
  private startTime: number = 0; // For performance tracking
  private processedBars: number = 0;
  private marketData: Map<string, BarData[]> = new Map();

  // Results tracking
  private initialCapital: number;
  private currentCapital: number;
  private positions = new Map<string, Position>();
  private trades: BacktestResult['trades'] = [];
  private dailyReturns: BacktestResult['dailyReturns'] = [];
  private previousPortfolioValue: number;
  private highWaterMark: number;
  private maxDrawdown: number = 0;
  private drawdownStartTime: Date | null = null;
  private maxDrawdownDuration: number = 0;
  private winningTrades: number = 0;
  private losingTrades: number = 0;
  private breakEvenTrades: number = 0;
  private totalProfits: number = 0;
  private totalLosses: number = 0;

  constructor(strategy: BaseStrategy, config: BacktestConfig, dataFeed: DataFeed) {
    super();
    this.strategy = strategy;
    this.config = config;
    this.dataFeed = dataFeed;
    this.currentTime = new Date(config.startDate);
    this.initialCapital = config.initialCapital;
    this.currentCapital = config.initialCapital;
    this.previousPortfolioValue = config.initialCapital;
    this.highWaterMark = config.initialCapital;
  }

  async run(): Promise<BacktestResult> {
    if (this.isRunning) {
      throw new Error('Backtest is already running');
    }

    this.isRunning = true;
    this.startTime = Date.now();
    this.emit('started', { strategyId: this.strategy.id, config: this.config });

    try {
      // Run in the configured mode
      if (this.config.mode === 'event') {
        await this.runEventBased();
      } else {
        await this.runVectorized();
      }

      const result = this.generateResults();
      this.emit('completed', { strategyId: this.strategy.id, result });
      this.isRunning = false;
      return result;
    } catch (error) {
      this.isRunning = false;
      this.emit('error', { strategyId: this.strategy.id, error });
      throw error;
    }
  }

  private async runEventBased(): Promise<void> {
    // Load market data for all symbols
    await this.loadMarketData();

    // Initialize the strategy
    await this.strategy.start();

    // Create a merged timeline of all bars across all symbols, sorted by timestamp
    const timeline = this.createMergedTimeline();

    // Process each event in chronological order
    let lastProgressUpdate = Date.now();
    let prevDate = new Date(0);

    for (let i = 0; i < timeline.length; i++) {
      const bar = timeline[i];
      this.currentTime = bar.timestamp;
      this.processedBars++;

      // Process any pending orders
      await this.processOrders(bar);

      // Update positions with current prices
      this.updatePositions(bar);

      // If we've crossed into a new day, record the daily return
      if (this.currentTime.toDateString() !== prevDate.toDateString()) {
        this.calculateDailyReturn();
        prevDate = this.currentTime;
      }

      // Send the new bar to the strategy
      const orders = await this.strategy.onBar(bar);

      // Queue any new orders
      if (orders && orders.length > 0) {
        this.pendingOrders.push(...orders);
      }

      // Update progress at most once per second
      if (Date.now() - lastProgressUpdate > 1000) {
        this.updateProgress(i / timeline.length);
        lastProgressUpdate = Date.now();
      }
    }

    // Process any remaining orders
    for (const order of this.pendingOrders) {
      await this.processOrder(order);
    }

    // Close any remaining positions at the last known price
    await this.closeAllPositions();

    // Clean up the strategy
    await this.strategy.stop();
  }

  private async runVectorized(): Promise<void> {
    // Load market data for all symbols
    await this.loadMarketData();

    // A vectorized run would:
    //   1. Pre-compute technical indicators
    //   2. Generate buy/sell signals for the entire dataset
    //   3. Calculate portfolio values from the signal series
    // This is a placeholder: concrete vectorized strategies must supply
    // their own implementations of these steps.

    const timeline = this.createMergedTimeline();

    // Pre-process data (to be consumed by the specific strategy)
    const allBars = new Map<string, BarData[]>();
    for (const symbol of this.config.symbols) {
      allBars.set(symbol, this.marketData.get(symbol) || []);
    }

    // Strategy logic would be applied here; for now this is only a fast pass.
    // Note: run() emits 'completed' once generateResults() has run, so no
    // event is emitted here.
  }

  private async loadMarketData(): Promise<void> {
    for (const symbol of this.config.symbols) {
      this.emit('loading', { symbol, resolution: this.config.dataResolution });

      // Check that data is available
      const hasData = await this.dataFeed.hasDataFor(
        symbol,
        this.config.dataResolution,
        this.config.startDate,
        this.config.endDate
      );

      if (!hasData) {
        throw new Error(`No data available for ${symbol} at resolution ${this.config.dataResolution}`);
      }

      // Load the data
      const data = await this.dataFeed.getHistoricalData(
        symbol,
        this.config.dataResolution,
        this.config.startDate,
        this.config.endDate
      );

      this.marketData.set(symbol, data);
      this.emit('loaded', { symbol, count: data.length });
    }
  }

  private createMergedTimeline(): BarData[] {
    const allBars: BarData[] = [];

    for (const bars of this.marketData.values()) {
      allBars.push(...bars);
    }

    // Sort by timestamp
    return allBars.sort((a, b) => a.timestamp.getTime() - b.timestamp.getTime());
  }

  private async processOrders(currentBar: BarData): Promise<void> {
    // Find orders for the current symbol
    const ordersToProcess = this.pendingOrders.filter(order => order.symbol === currentBar.symbol);

    if (ordersToProcess.length === 0) return;

    // Remove these orders from the pending queue
    this.pendingOrders = this.pendingOrders.filter(order => order.symbol !== currentBar.symbol);

    // Process each order
    for (const order of ordersToProcess) {
      await this.processOrder(order);
    }
  }

  private async processOrder(order: Order): Promise<void> {
    // Get the price history for the symbol
    const latestBars = this.marketData.get(order.symbol);
    if (!latestBars || latestBars.length === 0) {
      order.status = 'REJECTED';
      this.emit('orderRejected', { order, reason: 'No market data available' });
      return;
    }

    // Find the first bar at or after the order time (fall back to the last bar)
    const bar = latestBars.find(b =>
      b.timestamp.getTime() >= order.timestamp.getTime()
    ) || latestBars[latestBars.length - 1];

    // Calculate the fill price, including slippage
    let fillPrice: number;
    if (order.type === 'MARKET') {
      // Apply the slippage model: buys fill slightly higher, sells slightly lower
      const slippageFactor = 1 + (order.side === 'BUY' ? this.config.slippage : -this.config.slippage);
      fillPrice = bar.close * slippageFactor;
    } else if (order.type === 'LIMIT' && order.price !== undefined) {
      // For limit orders, check whether the limit price was reached
      if ((order.side === 'BUY' && bar.low <= order.price) ||
          (order.side === 'SELL' && bar.high >= order.price)) {
        fillPrice = order.price;
      } else {
        // Limit price not reached; leave the order unfilled
        return;
      }
    } else {
      // Other order types are not implemented
      order.status = 'REJECTED';
      this.emit('orderRejected', { order, reason: 'Order type not supported' });
      return;
    }

    // Calculate commission
    const orderValue = order.quantity * fillPrice;
    const commission = orderValue * this.config.commission;

    if (order.side === 'BUY') {
      // Check that we have enough cash
      const totalCost = orderValue + commission;
      if (totalCost > this.currentCapital) {
        order.status = 'REJECTED';
        this.emit('orderRejected', { order, reason: 'Insufficient funds' });
        return;
      }

      // Update cash
      this.currentCapital -= totalCost;

      // Update or create the position
      const existingPosition = this.positions.get(order.symbol);
      if (existingPosition) {
        // Average into the existing position
        const totalShares = existingPosition.quantity + order.quantity;
        const combinedCost = (existingPosition.quantity * existingPosition.avgPrice) + (order.quantity * fillPrice);
        existingPosition.avgPrice = combinedCost / totalShares;
        existingPosition.quantity = totalShares;
      } else {
        // Create a new position
        this.positions.set(order.symbol, {
          symbol: order.symbol,
          quantity: order.quantity,
          avgPrice: fillPrice,
          side: 'LONG',
          entryTime: this.currentTime
        });
      }
    } else if (order.side === 'SELL') {
      const position = this.positions.get(order.symbol);

      if (!position || position.quantity < order.quantity) {
        // Not enough shares to sell
        order.status = 'REJECTED';
        this.emit('orderRejected', { order, reason: 'Insufficient position' });
        return;
      }

      // Calculate realized P&L
      const pnl = (fillPrice - position.avgPrice) * order.quantity;

      // Update cash
      this.currentCapital += orderValue - commission;

      // Update the position
      position.quantity -= order.quantity;

      if (position.quantity === 0) {
        // Position closed; record the trade
        this.positions.delete(order.symbol);

        this.trades.push({
          symbol: order.symbol,
          entryTime: position.entryTime,
          entryPrice: position.avgPrice,
          exitTime: this.currentTime,
          exitPrice: fillPrice,
          quantity: order.quantity,
          pnl: pnl,
          pnlPercent: (pnl / (position.avgPrice * order.quantity)) * 100
        });

        // Update statistics
        if (pnl > 0) {
          this.winningTrades++;
          this.totalProfits += pnl;
        } else if (pnl < 0) {
          this.losingTrades++;
          this.totalLosses -= pnl; // Stored as a positive number
        } else {
          this.breakEvenTrades++;
        }
      }
    }

    // Mark the order as filled
    order.status = 'FILLED';
    order.fillPrice = fillPrice;
    order.fillTime = this.currentTime;
    this.filledOrders.push(order);

    // Notify the strategy
    await this.strategy.onOrderFilled(order);

    this.emit('orderFilled', { order });
  }
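The limit-fill rule in `processOrder` (a buy fills when the bar's low trades at or below the limit; a sell when the high trades at or above it) can be isolated as a small predicate. A sketch under the same assumptions, with hypothetical names:

```typescript
type Side = 'BUY' | 'SELL';

// Does a limit order fill against a bar with the given high/low range?
function limitOrderFills(side: Side, limitPrice: number, barHigh: number, barLow: number): boolean {
  return side === 'BUY' ? barLow <= limitPrice : barHigh >= limitPrice;
}

console.log(limitOrderFills('BUY', 100, 105, 99));  // true: the low traded through the limit
console.log(limitOrderFills('SELL', 110, 105, 99)); // false: the high never reached the limit
```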

  private updatePositions(currentBar: BarData): void {
    // Update the unrealized P&L for the position in this symbol
    const position = this.positions.get(currentBar.symbol);
    if (position) {
      const currentPrice = currentBar.close;
      position.unrealizedPnL = (currentPrice - position.avgPrice) * position.quantity;
    }

    // Calculate total portfolio value
    const portfolioValue = this.calculatePortfolioValue();

    // Check for a new high-water mark
    if (portfolioValue > this.highWaterMark) {
      this.highWaterMark = portfolioValue;
      this.drawdownStartTime = null;
    }

    // Detect the start of a drawdown
    if (this.drawdownStartTime === null && portfolioValue < this.highWaterMark) {
      this.drawdownStartTime = this.currentTime;
    }

    // Update the maximum drawdown
    if (this.highWaterMark > 0) {
      const currentDrawdown = (this.highWaterMark - portfolioValue) / this.highWaterMark;
      if (currentDrawdown > this.maxDrawdown) {
        this.maxDrawdown = currentDrawdown;

        // Track the drawdown duration
        if (this.drawdownStartTime !== null) {
          const drawdownDuration = (this.currentTime.getTime() - this.drawdownStartTime.getTime()) / (1000 * 60 * 60 * 24); // In days
          if (drawdownDuration > this.maxDrawdownDuration) {
            this.maxDrawdownDuration = drawdownDuration;
          }
        }
      }
    }

    this.previousPortfolioValue = portfolioValue;
  }

  private calculatePortfolioValue(): number {
    let totalValue = this.currentCapital;

    // Add the current value of all open positions
    for (const [symbol, position] of this.positions.entries()) {
      // Find the latest price for this symbol
      const bars = this.marketData.get(symbol);
      if (bars && bars.length > 0) {
        const latestBar = bars[bars.length - 1];
        totalValue += position.quantity * latestBar.close;
      } else {
        // Without price data, fall back to the average entry price
        totalValue += position.quantity * position.avgPrice;
      }
    }

    return totalValue;
  }

  private calculateDailyReturn(): void {
    const portfolioValue = this.calculatePortfolioValue();
    const dailyReturn = (portfolioValue - this.previousPortfolioValue) / this.previousPortfolioValue;

    this.dailyReturns.push({
      date: new Date(this.currentTime),
      return: dailyReturn
    });

    this.previousPortfolioValue = portfolioValue;
  }

  private async closeAllPositions(): Promise<void> {
    for (const [symbol, position] of this.positions.entries()) {
      // Find the latest price
      const bars = this.marketData.get(symbol);
      if (!bars || bars.length === 0) continue;

      const lastBar = bars[bars.length - 1];
      const closePrice = lastBar.close;

      // Calculate realized P&L
      const pnl = (closePrice - position.avgPrice) * position.quantity;

      // Update cash
      this.currentCapital += position.quantity * closePrice;

      // Record the trade
      this.trades.push({
        symbol,
        entryTime: position.entryTime,
        entryPrice: position.avgPrice,
        exitTime: this.currentTime,
        exitPrice: closePrice,
        quantity: position.quantity,
        pnl,
        pnlPercent: (pnl / (position.avgPrice * position.quantity)) * 100
      });

      // Update statistics
      if (pnl > 0) {
        this.winningTrades++;
        this.totalProfits += pnl;
      } else if (pnl < 0) {
        this.losingTrades++;
        this.totalLosses -= pnl; // Stored as a positive number
      } else {
        this.breakEvenTrades++;
      }
    }

    // Clear all positions
    this.positions.clear();
  }

  private updateProgress(progress: number): void {
    const currentPortfolioValue = this.calculatePortfolioValue();
    const currentDrawdown = this.highWaterMark > 0
      ? (this.highWaterMark - currentPortfolioValue) / this.highWaterMark
      : 0;

    const elapsedMs = Date.now() - this.startTime;
    const totalEstimatedMs = elapsedMs / progress;
    const remainingMs = totalEstimatedMs - elapsedMs;

    this.emit('progress', {
      progress: progress * 100,
      currentDate: this.currentTime,
      processingSpeed: this.processedBars / (elapsedMs / 1000),
      estimatedTimeRemaining: remainingMs,
      currentCapital: this.currentCapital,
      currentReturn: (currentPortfolioValue - this.initialCapital) / this.initialCapital,
      currentDrawdown
    } as BacktestProgress);
  }

  private generateResults(): BacktestResult {
    const currentPortfolioValue = this.calculatePortfolioValue();
    const totalReturn = (currentPortfolioValue - this.initialCapital) / this.initialCapital;

    // Calculate the annualized return
    const days = (this.config.endDate.getTime() - this.config.startDate.getTime()) / (1000 * 60 * 60 * 24);
    const annualizedReturn = Math.pow(1 + totalReturn, 365 / days) - 1;

    // Calculate the Sharpe ratio from daily returns
    let sharpeRatio = 0;
    if (this.dailyReturns.length > 1) {
      const dailyReturnValues = this.dailyReturns.map(dr => dr.return);
      const avgDailyReturn = dailyReturnValues.reduce((sum, ret) => sum + ret, 0) / dailyReturnValues.length;
      const stdDev = Math.sqrt(
        dailyReturnValues.reduce((sum, ret) => sum + Math.pow(ret - avgDailyReturn, 2), 0) / dailyReturnValues.length
      );

      // Annualize assuming 252 trading days
      sharpeRatio = stdDev > 0
        ? (avgDailyReturn * 252) / (stdDev * Math.sqrt(252))
        : 0;
    }

    // Calculate win rate and profit factor
    const totalTrades = this.winningTrades + this.losingTrades + this.breakEvenTrades;
    const winRate = totalTrades > 0 ? this.winningTrades / totalTrades : 0;
    const profitFactor = this.totalLosses > 0 ? this.totalProfits / this.totalLosses : (this.totalProfits > 0 ? Infinity : 0);

    // Calculate average winning and losing trade sizes
    const avgWinningTrade = this.winningTrades > 0 ? this.totalProfits / this.winningTrades : 0;
    const avgLosingTrade = this.losingTrades > 0 ? this.totalLosses / this.losingTrades : 0;

    return {
      strategyId: this.strategy.id,
      startDate: this.config.startDate,
      endDate: this.config.endDate,
      duration: Date.now() - this.startTime,
      initialCapital: this.initialCapital,
      finalCapital: currentPortfolioValue,
      totalReturn,
      annualizedReturn,
      sharpeRatio,
      maxDrawdown: this.maxDrawdown,
      maxDrawdownDuration: this.maxDrawdownDuration,
      winRate,
      totalTrades,
      winningTrades: this.winningTrades,
      losingTrades: this.losingTrades,
      averageWinningTrade: avgWinningTrade,
      averageLosingTrade: avgLosingTrade,
      profitFactor,
      dailyReturns: this.dailyReturns,
      trades: this.trades
    };
  }
}
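The Sharpe calculation in `generateResults` annualizes a daily mean and standard deviation over a 252-trading-day year. The same arithmetic as a standalone function (a sketch: it uses the population standard deviation and assumes a zero risk-free rate, matching the engine above):

```typescript
function annualizedSharpe(dailyReturns: number[]): number {
  const n = dailyReturns.length;
  if (n < 2) return 0;
  const mean = dailyReturns.reduce((s, r) => s + r, 0) / n;
  const variance = dailyReturns.reduce((s, r) => s + (r - mean) ** 2, 0) / n;
  const std = Math.sqrt(variance);
  // (mean * 252) / (std * sqrt(252)) simplifies to (mean / std) * sqrt(252)
  return std > 0 ? (mean * 252) / (std * Math.sqrt(252)) : 0;
}
```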

@@ -0,0 +1,186 @@
import { BaseStrategy } from '../Strategy';
import { BacktestConfig, BacktestEngine, BacktestResult } from './BacktestEngine';
import { MarketDataFeed } from './MarketDataFeed';
import { StrategyRegistry, StrategyType } from '../strategies/StrategyRegistry';

export interface BacktestRequest {
  strategyType: StrategyType;
  strategyParams: Record<string, any>;
  symbols: string[];
  startDate: Date | string;
  endDate: Date | string;
  initialCapital: number;
  dataResolution: '1m' | '5m' | '15m' | '30m' | '1h' | '4h' | '1d';
  commission: number;
  slippage: number;
  mode: 'event' | 'vector';
}

/**
 * Backtesting Service
 *
 * Handles backtesting requests and manages backtesting sessions.
 */
export class BacktestService {
  private readonly strategyRegistry: StrategyRegistry;
  private readonly dataFeed: MarketDataFeed;
  private readonly activeBacktests: Map<string, BacktestEngine> = new Map();

  constructor(apiBaseUrl: string = 'http://localhost:3001/api') {
    this.strategyRegistry = StrategyRegistry.getInstance();
    this.dataFeed = new MarketDataFeed(apiBaseUrl);
  }

  /**
   * Run a backtest for a request
   */
  async runBacktest(request: BacktestRequest): Promise<BacktestResult> {
    // Create a strategy instance
    const strategyId = `backtest_${Date.now()}`;
    const strategy = this.strategyRegistry.createStrategy(
      request.strategyType,
      strategyId,
      `Backtest ${request.strategyType}`,
      `Generated backtest for ${request.symbols.join(', ')}`,
      request.symbols,
      request.strategyParams
    );

    // Parse dates if they are strings
    const startDate = typeof request.startDate === 'string'
      ? new Date(request.startDate)
      : request.startDate;

    const endDate = typeof request.endDate === 'string'
      ? new Date(request.endDate)
      : request.endDate;

    // Create the backtest configuration
    const config: BacktestConfig = {
      startDate,
      endDate,
      symbols: request.symbols,
      initialCapital: request.initialCapital,
      commission: request.commission,
      slippage: request.slippage,
      dataResolution: request.dataResolution,
      mode: request.mode
    };

    // Create and run the backtest engine
    const engine = new BacktestEngine(strategy, config, this.dataFeed);
    this.activeBacktests.set(strategyId, engine);

    try {
      // Forward engine events to the log
      const forwardEvents = (eventName: string) => {
        engine.on(eventName, (data) => {
          console.log(`[Backtest ${strategyId}] ${eventName}:`, data);
        });
      };

      forwardEvents('started');
      forwardEvents('loading');
      forwardEvents('loaded');
      forwardEvents('progress');
      forwardEvents('orderFilled');
      forwardEvents('orderRejected');
      forwardEvents('completed');
      forwardEvents('error');

      // Run the backtest
      const result = await engine.run();

      // Clean up
      this.activeBacktests.delete(strategyId);

      return result;
    } catch (error) {
      this.activeBacktests.delete(strategyId);
      throw error;
    }
  }

  /**
   * Optimize a strategy by running multiple backtests with different parameters
   */
  async optimizeStrategy(
    baseRequest: BacktestRequest,
    parameterGrid: Record<string, any[]>
  ): Promise<Array<BacktestResult & { parameters: Record<string, any> }>> {
    const results: Array<BacktestResult & { parameters: Record<string, any> }> = [];

    // Generate all parameter combinations
    const paramKeys = Object.keys(parameterGrid);
    const combinations = this.generateParameterCombinations(parameterGrid, paramKeys);

    // Run a backtest for each combination
    for (const paramSet of combinations) {
      const request = {
        ...baseRequest,
        strategyParams: {
          ...baseRequest.strategyParams,
          ...paramSet
        }
      };

      try {
        const result = await this.runBacktest(request);
        results.push({
          ...result,
          parameters: paramSet
        });
      } catch (error) {
        console.error(`Optimization failed for parameters:`, paramSet, error);
      }
    }

    // Sort by performance metric (Sharpe ratio, best first)
    return results.sort((a, b) => b.sharpeRatio - a.sharpeRatio);
  }

  /**
   * Generate all combinations of parameters for a grid search
   */
  private generateParameterCombinations(
    grid: Record<string, any[]>,
    keys: string[],
    current: Record<string, any> = {},
    index: number = 0,
    result: Record<string, any>[] = []
  ): Record<string, any>[] {
    if (index === keys.length) {
      result.push({ ...current });
      return result;
    }

    const key = keys[index];
    const values = grid[key];

    for (const value of values) {
      current[key] = value;
      this.generateParameterCombinations(grid, keys, current, index + 1, result);
    }

    return result;
  }
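`generateParameterCombinations` walks the grid recursively; the same cross-product can also be written iteratively with `reduce` and `flatMap`. An equivalent standalone sketch (hypothetical helper name):

```typescript
// Expand a parameter grid into every combination of its values.
function expandGrid(grid: Record<string, any[]>): Record<string, any>[] {
  return Object.entries(grid).reduce<Record<string, any>[]>(
    (combos, [key, values]) =>
      combos.flatMap(combo => values.map(value => ({ ...combo, [key]: value }))),
    [{}]
  );
}

console.log(expandGrid({ fast: [5, 10], slow: [50] }));
// [ { fast: 5, slow: 50 }, { fast: 10, slow: 50 } ]
```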

  /**
   * Get an active backtest engine by ID
   */
  getBacktestEngine(id: string): BacktestEngine | undefined {
    return this.activeBacktests.get(id);
  }

  /**
   * Cancel a running backtest
   */
  cancelBacktest(id: string): boolean {
    const engine = this.activeBacktests.get(id);
    if (!engine) return false;

    // The engine has no explicit cancel method, so just drop the reference
    this.activeBacktests.delete(id);
    return true;
  }
}

@@ -0,0 +1,166 @@
import { BarData } from '../Strategy';
import { DataFeed } from './BacktestEngine';
import axios from 'axios';

export class MarketDataFeed implements DataFeed {
  private readonly apiBaseUrl: string;
  private cache: Map<string, BarData[]> = new Map();

  constructor(apiBaseUrl: string = 'http://localhost:3001/api') {
    this.apiBaseUrl = apiBaseUrl;
  }

  async getHistoricalData(symbol: string, resolution: string, start: Date, end: Date): Promise<BarData[]> {
    const cacheKey = this.getCacheKey(symbol, resolution, start, end);

    // Check the cache first
    if (this.cache.has(cacheKey)) {
      return this.cache.get(cacheKey)!;
    }

    try {
      // Format dates for the API request
      const startStr = start.toISOString();
      const endStr = end.toISOString();

      const response = await axios.get(`${this.apiBaseUrl}/market-data/history`, {
        params: {
          symbol,
          resolution,
          start: startStr,
          end: endStr
        }
      });

      if (!response.data.success || !response.data.data) {
        throw new Error(`Failed to fetch historical data for ${symbol}`);
      }

      // Transform the API response into BarData objects
      const bars: BarData[] = response.data.data.map((bar: any) => ({
        symbol,
        timestamp: new Date(bar.timestamp),
        open: bar.open,
        high: bar.high,
        low: bar.low,
        close: bar.close,
        volume: bar.volume
      }));

      // Cache the result
      this.cache.set(cacheKey, bars);

      return bars;
    } catch (error) {
      console.error(`Error fetching historical data for ${symbol}:`, error);
      // Fall back to generated test data if the API call fails
      return this.generateFallbackTestData(symbol, resolution, start, end);
    }
  }

  async hasDataFor(symbol: string, resolution: string, start: Date, end: Date): Promise<boolean> {
    try {
      const startStr = start.toISOString();
      const endStr = end.toISOString();

      const response = await axios.get(`${this.apiBaseUrl}/market-data/available`, {
        params: {
          symbol,
          resolution,
          start: startStr,
          end: endStr
        }
      });

      return response.data.success && response.data.data.available;
    } catch (error) {
      console.error(`Error checking data availability for ${symbol}:`, error);
      // Assume data is available for test purposes
      return true;
    }
  }

  clearCache(): void {
    this.cache.clear();
  }

  private getCacheKey(symbol: string, resolution: string, start: Date, end: Date): string {
    return `${symbol}_${resolution}_${start.getTime()}_${end.getTime()}`;
  }

  private generateFallbackTestData(symbol: string, resolution: string, start: Date, end: Date): BarData[] {
    console.warn(`Generating fallback test data for ${symbol} from ${start} to ${end}`);

    const bars: BarData[] = [];
    let current = new Date(start);
    let basePrice = this.getBasePrice(symbol);

    // Step by the interval implied by the resolution
    const interval = this.getIntervalFromResolution(resolution);

    while (current.getTime() <= end.getTime()) {
      // Only generate bars for trading days (skip weekends)
      if (current.getDay() !== 0 && current.getDay() !== 6) {
        // Generate a random price movement (-1% to +1%)
        const dailyChange = (Math.random() * 2 - 1) / 100;

        // Add some randomness to the volatility
        const volatility = 0.005 + Math.random() * 0.01; // 0.5% to 1.5%

        const open = basePrice * (1 + (Math.random() * 0.002 - 0.001));
        const close = open * (1 + dailyChange);
        const high = Math.max(open, close) * (1 + Math.random() * volatility);
        const low = Math.min(open, close) * (1 - Math.random() * volatility);
        const volume = Math.floor(100000 + Math.random() * 900000);

        bars.push({
          symbol,
          timestamp: new Date(current),
          open,
          high,
          low,
          close,
          volume
        });

        // Use this close as the base price for the next bar
        basePrice = close;
      }

      // Move to the next interval
      current = new Date(current.getTime() + interval);
    }

    return bars;
  }
|
||||
|
||||
private getBasePrice(symbol: string): number {
|
||||
// Return a realistic base price for common symbols
|
||||
switch (symbol.toUpperCase()) {
|
||||
case 'AAPL': return 170 + Math.random() * 30;
|
||||
case 'MSFT': return 370 + Math.random() * 50;
|
||||
case 'AMZN': return 140 + Math.random() * 20;
|
||||
case 'GOOGL': return 130 + Math.random() * 20;
|
||||
case 'META': return 300 + Math.random() * 50;
|
||||
case 'TSLA': return 180 + Math.random() * 70;
|
||||
case 'NVDA': return 700 + Math.random() * 200;
|
||||
case 'SPY': return 450 + Math.random() * 30;
|
||||
case 'QQQ': return 370 + Math.random() * 40;
|
||||
default: return 100 + Math.random() * 50;
|
||||
}
|
||||
}
|
||||
|
||||
private getIntervalFromResolution(resolution: string): number {
|
||||
// Return milliseconds for each resolution
|
||||
switch (resolution) {
|
||||
case '1m': return 60 * 1000;
|
||||
case '5m': return 5 * 60 * 1000;
|
||||
case '15m': return 15 * 60 * 1000;
|
||||
case '30m': return 30 * 60 * 1000;
|
||||
case '1h': return 60 * 60 * 1000;
|
||||
case '4h': return 4 * 60 * 60 * 1000;
|
||||
case '1d': return 24 * 60 * 60 * 1000;
|
||||
default: return 24 * 60 * 60 * 1000; // Default to daily
|
||||
}
|
||||
}
|
||||
}
|
||||
|
|
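The fallback generator above is essentially a random walk over trading days: each bar opens near the previous close, moves by a random amount, and gets wicks scaled by a random volatility. A minimal standalone sketch of the same idea (the `SimBar` type and `generateBars` helper are illustrative stand-ins, not the repo's `BarData` or `MarketDataFeed` API):

```typescript
// Hypothetical stand-in for the repo's BarData type.
interface SimBar {
  timestamp: Date;
  open: number;
  high: number;
  low: number;
  close: number;
}

// Random-walk OHLC generator mirroring generateFallbackTestData's logic.
function generateBars(start: Date, end: Date, basePrice: number, intervalMs: number): SimBar[] {
  const bars: SimBar[] = [];
  let price = basePrice;
  for (let t = start.getTime(); t <= end.getTime(); t += intervalMs) {
    const d = new Date(t);
    if (d.getDay() === 0 || d.getDay() === 6) continue; // skip weekends
    const change = (Math.random() * 2 - 1) / 100;       // -1% to +1% body move
    const vol = 0.005 + Math.random() * 0.01;           // 0.5% to 1.5% wick size
    const open = price;
    const close = open * (1 + change);
    const high = Math.max(open, close) * (1 + Math.random() * vol);
    const low = Math.min(open, close) * (1 - Math.random() * vol);
    bars.push({ timestamp: d, open, high, low, close });
    price = close; // next bar continues from this close
  }
  return bars;
}

const bars = generateBars(new Date('2024-01-01'), new Date('2024-01-31'), 100, 24 * 60 * 60 * 1000);
```

Whatever the random draws, every generated bar satisfies the OHLC invariants (`high` at or above the body, `low` at or below it) and never lands on a weekend, which is what makes this safe as a test-data fallback.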
@@ -0,0 +1,325 @@
import { BacktestResult } from './BacktestEngine';

/**
 * Performance Analysis Utilities
 *
 * Provides additional metrics and analysis tools for backtesting results.
 */
export class PerformanceAnalytics {
  /**
   * Calculate additional metrics from backtest results
   */
  static enhanceResults(result: BacktestResult): BacktestResult {
    // Merge the advanced metrics into the original result
    const enhancedResult = {
      ...result,
      ...this.calculateAdvancedMetrics(result)
    };

    return enhancedResult;
  }

  /**
   * Calculate advanced performance metrics
   */
  private static calculateAdvancedMetrics(result: BacktestResult): Partial<BacktestResult> {
    // Extract daily returns
    const dailyReturns = result.dailyReturns.map(dr => dr.return);

    // Calculate Sortino ratio
    const sortinoRatio = this.calculateSortinoRatio(dailyReturns);

    // Calculate Calmar ratio
    const calmarRatio = result.maxDrawdown > 0
      ? result.annualizedReturn / result.maxDrawdown
      : Infinity;

    // Calculate Omega ratio
    const omegaRatio = this.calculateOmegaRatio(dailyReturns);

    // Calculate CAGR
    const startTimestamp = result.startDate.getTime();
    const endTimestamp = result.endDate.getTime();
    const yearsElapsed = (endTimestamp - startTimestamp) / (365 * 24 * 60 * 60 * 1000);
    const cagr = Math.pow(result.finalCapital / result.initialCapital, 1 / yearsElapsed) - 1;

    // Calculate additional volatility and return metrics
    const volatility = this.calculateVolatility(dailyReturns);
    const ulcerIndex = this.calculateUlcerIndex(result.dailyReturns);

    return {
      sortinoRatio,
      calmarRatio,
      omegaRatio,
      cagr,
      volatility,
      ulcerIndex
    };
  }

  /**
   * Calculate Sortino ratio (downside risk-adjusted return)
   */
  private static calculateSortinoRatio(dailyReturns: number[]): number {
    if (dailyReturns.length === 0) return 0;

    const avgReturn = dailyReturns.reduce((sum, ret) => sum + ret, 0) / dailyReturns.length;

    // Filter only negative returns (downside)
    const negativeReturns = dailyReturns.filter(ret => ret < 0);

    if (negativeReturns.length === 0) return Infinity;

    // Calculate downside deviation
    const downsideDeviation = Math.sqrt(
      negativeReturns.reduce((sum, ret) => sum + Math.pow(ret, 2), 0) / negativeReturns.length
    );

    // Annualize (252 trading days)
    const annualizedReturn = avgReturn * 252;
    const annualizedDownsideDeviation = downsideDeviation * Math.sqrt(252);

    return annualizedDownsideDeviation > 0
      ? annualizedReturn / annualizedDownsideDeviation
      : 0;
  }

  /**
   * Calculate Omega ratio (probability-weighted ratio of gains versus losses)
   */
  private static calculateOmegaRatio(dailyReturns: number[], threshold = 0): number {
    if (dailyReturns.length === 0) return 0;

    let sumGains = 0;
    let sumLosses = 0;

    for (const ret of dailyReturns) {
      if (ret > threshold) {
        sumGains += (ret - threshold);
      } else {
        sumLosses += (threshold - ret);
      }
    }

    return sumLosses > 0 ? sumGains / sumLosses : Infinity;
  }

  /**
   * Calculate annualized volatility
   */
  private static calculateVolatility(returns: number[]): number {
    if (returns.length < 2) return 0;

    const mean = returns.reduce((sum, ret) => sum + ret, 0) / returns.length;
    const variance = returns.reduce((sum, ret) => sum + Math.pow(ret - mean, 2), 0) / returns.length;

    // Annualize (252 trading days)
    return Math.sqrt(variance * 252);
  }

  /**
   * Calculate Ulcer Index (measure of downside risk)
   */
  private static calculateUlcerIndex(dailyReturns: Array<{ date: Date; return: number }>): number {
    if (dailyReturns.length === 0) return 0;

    // Calculate the running equity curve
    let equity = 1;
    const equityCurve = dailyReturns.map(dr => {
      equity *= (1 + dr.return);
      return equity;
    });

    // Find the running maximum
    const runningMax: number[] = [];
    let currentMax = equityCurve[0];

    for (const value of equityCurve) {
      currentMax = Math.max(currentMax, value);
      runningMax.push(currentMax);
    }

    // Calculate percentage drawdowns
    const percentDrawdowns = equityCurve.map((value, i) =>
      (runningMax[i] - value) / runningMax[i]
    );

    // Ulcer Index: root mean square of the percentage drawdowns
    const sumSquaredDrawdowns = percentDrawdowns.reduce(
      (sum, dd) => sum + dd * dd, 0
    );

    return Math.sqrt(sumSquaredDrawdowns / percentDrawdowns.length);
  }

  /**
   * Extract monthly returns from daily returns
   */
  static calculateMonthlyReturns(dailyReturns: Array<{ date: Date; return: number }>): Array<{
    year: number;
    month: number;
    return: number;
  }> {
    const monthlyReturns: Array<{ year: number; month: number; return: number }> = [];

    if (dailyReturns.length === 0) return monthlyReturns;

    // Group returns by year and month
    const groupedReturns: Record<string, number[]> = {};

    for (const dr of dailyReturns) {
      const year = dr.date.getFullYear();
      const month = dr.date.getMonth();
      const key = `${year}-${month}`;

      if (!groupedReturns[key]) {
        groupedReturns[key] = [];
      }

      groupedReturns[key].push(dr.return);
    }

    // Calculate the compound return for each month
    for (const key in groupedReturns) {
      const [yearStr, monthStr] = key.split('-');
      const year = parseInt(yearStr, 10);
      const month = parseInt(monthStr, 10);

      // Compound the daily returns for the month
      const monthReturn = groupedReturns[key].reduce(
        (product, ret) => product * (1 + ret), 1
      ) - 1;

      monthlyReturns.push({ year, month, return: monthReturn });
    }

    // Sort chronologically
    return monthlyReturns.sort((a, b) => {
      if (a.year !== b.year) return a.year - b.year;
      return a.month - b.month;
    });
  }

  /**
   * Create drawdown analysis from the equity curve
   */
  static analyzeDrawdowns(dailyReturns: Array<{ date: Date; return: number }>): Array<{
    startDate: Date;
    endDate: Date;
    recoveryDate: Date | null;
    drawdown: number;
    durationDays: number;
    recoveryDays: number | null;
  }> {
    if (dailyReturns.length === 0) return [];

    // Calculate the equity curve
    let equity = 1;
    const equityCurve = dailyReturns.map(dr => {
      equity *= (1 + dr.return);
      return { date: dr.date, equity };
    });

    // Analyze drawdowns
    const drawdowns: Array<{
      startDate: Date;
      endDate: Date;
      recoveryDate: Date | null;
      drawdown: number;
      durationDays: number;
      recoveryDays: number | null;
    }> = [];

    let peakEquity = equityCurve[0].equity;
    let peakDate = equityCurve[0].date;
    let inDrawdown = false;
    let currentDrawdown: {
      startDate: Date;
      endDate: Date;
      lowEquity: number;
      peakEquity: number;
    } | null = null;

    // Find drawdown periods
    for (let i = 1; i < equityCurve.length; i++) {
      const { date, equity } = equityCurve[i];

      // New peak
      if (equity > peakEquity) {
        peakEquity = equity;
        peakDate = date;

        // If recovering from a drawdown, record the recovery
        if (inDrawdown && currentDrawdown) {
          const recoveryDate = date;
          const drawdownPct = (currentDrawdown.peakEquity - currentDrawdown.lowEquity) /
            currentDrawdown.peakEquity;

          const durationDays = Math.floor(
            (currentDrawdown.endDate.getTime() - currentDrawdown.startDate.getTime()) /
            (1000 * 60 * 60 * 24)
          );

          const recoveryDays = Math.floor(
            (recoveryDate.getTime() - currentDrawdown.endDate.getTime()) /
            (1000 * 60 * 60 * 24)
          );

          drawdowns.push({
            startDate: currentDrawdown.startDate,
            endDate: currentDrawdown.endDate,
            recoveryDate,
            drawdown: drawdownPct,
            durationDays,
            recoveryDays
          });

          inDrawdown = false;
          currentDrawdown = null;
        }
      }
      // In drawdown
      else {
        if (!inDrawdown) {
          // Start of a new drawdown
          inDrawdown = true;
          currentDrawdown = {
            startDate: peakDate,
            endDate: date,
            lowEquity: equity,
            peakEquity
          };
        } else if (currentDrawdown && equity < currentDrawdown.lowEquity) {
          // New low in the current drawdown
          currentDrawdown.lowEquity = equity;
          currentDrawdown.endDate = date;
        }
      }
    }

    // Handle any ongoing drawdown at the end
    if (inDrawdown && currentDrawdown) {
      const drawdownPct = (currentDrawdown.peakEquity - currentDrawdown.lowEquity) /
        currentDrawdown.peakEquity;

      const durationDays = Math.floor(
        (currentDrawdown.endDate.getTime() - currentDrawdown.startDate.getTime()) /
        (1000 * 60 * 60 * 24)
      );

      drawdowns.push({
        startDate: currentDrawdown.startDate,
        endDate: currentDrawdown.endDate,
        recoveryDate: null,
        drawdown: drawdownPct,
        durationDays,
        recoveryDays: null
      });
    }

    // Sort by drawdown magnitude (largest first)
    return drawdowns.sort((a, b) => b.drawdown - a.drawdown);
  }
}
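The Sortino ratio above divides the annualized mean return by the annualized *downside* deviation, i.e. the standard deviation computed only over losing days. A standalone sketch of that computation (a free function mirroring `calculateSortinoRatio`, not the class itself):

```typescript
// Sortino ratio sketch: annualized mean return over annualized downside deviation.
// Mirrors the class method above; 252 is the usual trading-day count per year.
function sortinoRatio(dailyReturns: number[]): number {
  if (dailyReturns.length === 0) return 0;

  const avg = dailyReturns.reduce((s, r) => s + r, 0) / dailyReturns.length;

  // Downside deviation uses only the losing days
  const downside = dailyReturns.filter(r => r < 0);
  if (downside.length === 0) return Infinity; // no losses at all

  const downsideDev = Math.sqrt(
    downside.reduce((s, r) => s + r * r, 0) / downside.length
  );

  return (avg * 252) / (downsideDev * Math.sqrt(252));
}

const allPositive = sortinoRatio([0.01, 0.02, 0.005]); // no losing days
const mixed = sortinoRatio([0.01, -0.01, 0.02, -0.02]); // mean return is zero
```

Note the asymmetry versus the plain Sharpe ratio: two strategies with identical volatility can have very different Sortino ratios if one's variance comes mostly from up days.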
@@ -0,0 +1,512 @@
import { EventEmitter } from 'events';
import { BaseStrategy, BarData, Order, Position } from '../Strategy';
import { MarketDataFeed } from '../backtesting/MarketDataFeed';
import { StrategyRegistry } from '../strategies/StrategyRegistry';
import { WebSocket, WebSocketServer } from 'ws';

export interface ExecutionConfig {
  symbols: string[];
  dataResolution: '1m' | '5m' | '15m' | '30m' | '1h' | '4h' | '1d';
  realTrading: boolean;
  maxPositionValue: number;
  maxOrdersPerMinute: number;
  stopLossPercentage: number;
}

/**
 * Strategy Execution Service
 *
 * Handles the real-time execution of trading strategies.
 * Manages live data feeds, order execution, and position tracking.
 */
export class StrategyExecutionService extends EventEmitter {
  private strategyRegistry: StrategyRegistry;
  private marketDataFeed: MarketDataFeed;
  private activeStrategies: Map<string, {
    strategy: BaseStrategy;
    config: ExecutionConfig;
    positions: Map<string, Position>;
    lastBar: Map<string, BarData>;
  }> = new Map();

  private isRunning: boolean = false;
  private dataPollingIntervals: Map<string, NodeJS.Timeout> = new Map();
  private webSocketServer: WebSocketServer | null = null;
  private wsClients: Set<WebSocket> = new Set();
  private marketDataCache: Map<string, BarData[]> = new Map();

  constructor(apiBaseUrl: string = 'http://localhost:3001/api', wsPort: number = 8082) {
    super();
    this.strategyRegistry = StrategyRegistry.getInstance();
    this.marketDataFeed = new MarketDataFeed(apiBaseUrl);
    this.initializeWebSocketServer(wsPort);
  }

  /**
   * Initialize the WebSocket server for real-time updates
   */
  private initializeWebSocketServer(port: number): void {
    try {
      this.webSocketServer = new WebSocketServer({ port });

      this.webSocketServer.on('connection', (ws) => {
        console.log('New client connected to strategy execution service');
        this.wsClients.add(ws);

        ws.on('message', (message) => {
          try {
            const data = JSON.parse(message.toString());
            this.handleWebSocketMessage(ws, data);
          } catch (error) {
            console.error('Error handling WebSocket message:', error);
          }
        });

        ws.on('close', () => {
          console.log('Client disconnected from strategy execution service');
          this.wsClients.delete(ws);
        });

        // Send initial state
        this.sendAllStrategyStatus(ws);
      });

      console.log(`WebSocket server started on port ${port}`);
    } catch (error) {
      console.error('Failed to initialize WebSocket server:', error);
    }
  }

  /**
   * Handle incoming WebSocket messages
   */
  private handleWebSocketMessage(ws: WebSocket, message: any): void {
    switch (message.type) {
      case 'get_active_strategies':
        this.sendAllStrategyStatus(ws);
        break;
      case 'start_strategy':
        this.startStrategy(message.id, message.config);
        break;
      case 'stop_strategy':
        this.stopStrategy(message.id);
        break;
      case 'pause_strategy':
        this.pauseStrategy(message.id);
        break;
      default:
        console.warn(`Unknown WebSocket message type: ${message.type}`);
    }
  }

  /**
   * Send a message to all connected WebSocket clients
   */
  private broadcastMessage(message: any): void {
    const messageStr = JSON.stringify(message);
    for (const client of this.wsClients) {
      if (client.readyState === WebSocket.OPEN) {
        client.send(messageStr);
      }
    }
  }

  /**
   * Send the current status of all active strategies to a specific client
   */
  private sendAllStrategyStatus(ws: WebSocket): void {
    const statusList = Array.from(this.activeStrategies.entries()).map(([id, data]) => ({
      id,
      name: data.strategy.name,
      status: this.isRunning ? 'ACTIVE' : 'PAUSED',
      symbols: data.config.symbols,
      positions: Array.from(data.positions.entries()).map(([symbol, pos]) => ({
        symbol,
        quantity: pos.quantity,
        entryPrice: pos.entryPrice,
        currentValue: pos.currentValue
      }))
    }));

    ws.send(JSON.stringify({
      type: 'strategy_status_list',
      timestamp: new Date().toISOString(),
      data: statusList
    }));
  }

  /**
   * Start the execution service (global)
   */
  start(): void {
    if (this.isRunning) return;

    this.isRunning = true;
    console.log('Strategy execution service started');

    // Start data polling for all active strategies
    for (const [strategyId, data] of this.activeStrategies.entries()) {
      this.startDataPollingForStrategy(strategyId, data);
    }

    this.broadcastMessage({
      type: 'execution_service_status',
      timestamp: new Date().toISOString(),
      data: { status: 'RUNNING' }
    });
  }

  /**
   * Stop the execution service (global)
   */
  stop(): void {
    if (!this.isRunning) return;

    this.isRunning = false;
    console.log('Strategy execution service stopped');

    // Clear all data polling intervals
    for (const interval of this.dataPollingIntervals.values()) {
      clearInterval(interval);
    }
    this.dataPollingIntervals.clear();

    this.broadcastMessage({
      type: 'execution_service_status',
      timestamp: new Date().toISOString(),
      data: { status: 'STOPPED' }
    });
  }

  /**
   * Start a specific strategy
   */
  startStrategy(strategyId: string, config?: ExecutionConfig): boolean {
    const strategy = this.strategyRegistry.getStrategyById(strategyId);

    if (!strategy) {
      console.error(`Strategy with ID ${strategyId} not found`);
      return false;
    }

    // If the strategy is already active, do nothing
    if (this.activeStrategies.has(strategyId)) {
      console.warn(`Strategy ${strategyId} is already active`);
      return false;
    }

    // Use the provided config or fall back to defaults
    const executionConfig: ExecutionConfig = config || {
      symbols: strategy.symbols,
      dataResolution: '1m',
      realTrading: false,
      maxPositionValue: 10000,
      maxOrdersPerMinute: 5,
      stopLossPercentage: 0.02
    };

    // Initialize strategy data
    const strategyData = {
      strategy,
      config: executionConfig,
      positions: new Map<string, Position>(),
      lastBar: new Map<string, BarData>()
    };

    this.activeStrategies.set(strategyId, strategyData);

    // If the execution service is running, start data polling for this strategy
    if (this.isRunning) {
      this.startDataPollingForStrategy(strategyId, strategyData);
    }

    console.log(`Strategy ${strategyId} started with ${executionConfig.symbols.length} symbols`);

    this.broadcastMessage({
      type: 'strategy_started',
      timestamp: new Date().toISOString(),
      data: {
        strategyId,
        name: strategy.name,
        symbols: executionConfig.symbols
      }
    });

    return true;
  }

  /**
   * Stop a specific strategy
   */
  stopStrategy(strategyId: string): boolean {
    if (!this.activeStrategies.has(strategyId)) {
      console.warn(`Strategy ${strategyId} is not active`);
      return false;
    }

    // Clear the data polling interval for this strategy
    const intervalId = this.dataPollingIntervals.get(strategyId);
    if (intervalId) {
      clearInterval(intervalId);
      this.dataPollingIntervals.delete(strategyId);
    }

    // Get strategy data before removing it
    const strategyData = this.activeStrategies.get(strategyId)!;
    const { strategy } = strategyData;

    // Close any open positions (in a real implementation)
    // ...

    this.activeStrategies.delete(strategyId);

    console.log(`Strategy ${strategyId} stopped`);

    this.broadcastMessage({
      type: 'strategy_stopped',
      timestamp: new Date().toISOString(),
      data: {
        strategyId,
        name: strategy.name
      }
    });

    return true;
  }

  /**
   * Pause a specific strategy
   */
  pauseStrategy(strategyId: string): boolean {
    if (!this.activeStrategies.has(strategyId)) {
      console.warn(`Strategy ${strategyId} is not active`);
      return false;
    }

    // Clear the data polling interval for this strategy
    const intervalId = this.dataPollingIntervals.get(strategyId);
    if (intervalId) {
      clearInterval(intervalId);
      this.dataPollingIntervals.delete(strategyId);
    }

    const { strategy } = this.activeStrategies.get(strategyId)!;

    console.log(`Strategy ${strategyId} paused`);

    this.broadcastMessage({
      type: 'strategy_paused',
      timestamp: new Date().toISOString(),
      data: {
        strategyId,
        name: strategy.name
      }
    });

    return true;
  }

  /**
   * Start data polling for a specific strategy
   */
  private startDataPollingForStrategy(
    strategyId: string,
    data: {
      strategy: BaseStrategy;
      config: ExecutionConfig;
      positions: Map<string, Position>;
      lastBar: Map<string, BarData>;
    }
  ): void {
    const { strategy, config } = data;

    // Determine the polling interval from the data resolution
    const pollingIntervalMs = this.getPollingIntervalFromResolution(config.dataResolution);

    // Set up the polling interval
    const intervalId = setInterval(async () => {
      await this.pollMarketData(strategyId);
    }, pollingIntervalMs);

    this.dataPollingIntervals.set(strategyId, intervalId);

    console.log(`Started data polling for strategy ${strategyId} with interval ${pollingIntervalMs}ms`);
  }

  /**
   * Convert a data resolution to a polling interval
   */
  private getPollingIntervalFromResolution(resolution: string): number {
    switch (resolution) {
      case '1m': return 60 * 1000;            // 60 seconds
      case '5m': return 5 * 60 * 1000;        // 5 minutes
      case '15m': return 15 * 60 * 1000;      // 15 minutes
      case '30m': return 30 * 60 * 1000;      // 30 minutes
      case '1h': return 60 * 60 * 1000;       // 1 hour
      case '4h': return 4 * 60 * 60 * 1000;   // 4 hours
      case '1d': return 24 * 60 * 60 * 1000;  // 1 day
      default: return 60 * 1000;              // Default to 1 minute
    }
  }

  /**
   * Poll market data for a specific strategy
   */
  private async pollMarketData(strategyId: string): Promise<void> {
    const strategyData = this.activeStrategies.get(strategyId);

    if (!strategyData) {
      console.warn(`Strategy ${strategyId} not found in active strategies`);
      return;
    }

    const { strategy, config, lastBar } = strategyData;

    try {
      // Get the latest market data for all symbols
      for (const symbol of config.symbols) {
        const now = new Date();
        const startTime = new Date(now.getTime() - 24 * 60 * 60 * 1000); // 24 hours ago

        // Fetch the latest bars
        const bars = await this.marketDataFeed.getHistoricalData(
          symbol,
          config.dataResolution,
          startTime,
          now
        );

        if (bars.length > 0) {
          const latestBar = bars[bars.length - 1];

          // Only process if this is a new bar
          const currentLastBar = lastBar.get(symbol);
          if (!currentLastBar || latestBar.timestamp.getTime() > currentLastBar.timestamp.getTime()) {
            // Update the last seen bar
            lastBar.set(symbol, latestBar);

            // Process the bar with the strategy
            this.processBar(strategyId, symbol, latestBar);
          }
        }
      }
    } catch (error) {
      console.error(`Error polling market data for strategy ${strategyId}:`, error);
    }
  }

  /**
   * Process a new bar with the strategy
   */
  private processBar(strategyId: string, symbol: string, bar: BarData): void {
    const strategyData = this.activeStrategies.get(strategyId);

    if (!strategyData) {
      console.warn(`Strategy ${strategyId} not found in active strategies`);
      return;
    }

    const { strategy } = strategyData;

    // Call the strategy's onBar method to get trading signals
    const signal = strategy.onBar(bar);

    if (signal) {
      // Create a signal object with timestamp and ID
      const signalObject = {
        id: `sig_${Date.now()}_${Math.floor(Math.random() * 10000)}`,
        strategyId,
        name: strategy.name,
        symbol: bar.symbol,
        price: bar.close,
        action: signal.action,
        quantity: signal.quantity,
        metadata: signal.metadata,
        timestamp: new Date().toISOString(),
        confidence: signal.confidence || 0.8 // Default confidence if not provided
      };

      // Store the signal in Redis (in production the client would be injected
      // and shared rather than created per signal)
      try {
        const Redis = require('ioredis');
        const redis = new Redis({
          host: process.env.REDIS_HOST || 'localhost',
          port: parseInt(process.env.REDIS_PORT || '6379', 10)
        });

        // Store the signal with a TTL of 7 days
        redis.setex(
          `signal:${strategyId}:${signalObject.id}`,
          60 * 60 * 24 * 7,
          JSON.stringify(signalObject)
        );
      } catch (err) {
        console.error('Error storing signal in Redis:', err);
      }

      // Broadcast the signal
      this.broadcastMessage({
        type: 'strategy_signal',
        timestamp: new Date().toISOString(),
        data: signalObject
      });

      // Execute the signal if real trading is enabled
      if (strategyData.config.realTrading) {
        this.executeSignal(strategyId, signal);
      }
    }
  }

  /**
   * Execute a trading signal
   */
  private executeSignal(strategyId: string, signal: any): void {
    // In a real implementation, this would connect to the order execution service.
    // For now, just log the signal and broadcast a mock trade.
    console.log(`Executing signal for strategy ${strategyId}:`, signal);

    const strategyData = this.activeStrategies.get(strategyId);
    if (!strategyData) return;

    const { strategy } = strategyData;

    // Broadcast a mock trade execution
    this.broadcastMessage({
      type: 'strategy_trade',
      timestamp: new Date().toISOString(),
      data: {
        strategyId,
        name: strategy.name,
        symbol: signal.symbol,
        price: signal.price,
        action: signal.action,
        quantity: signal.quantity,
        executionTime: new Date().toISOString(),
        status: 'FILLED'
      }
    });
  }

  /**
   * Close all connections when the service is shut down
   */
  shutdown(): void {
    this.stop();

    if (this.webSocketServer) {
      for (const client of this.wsClients) {
        client.close();
      }
      this.wsClients.clear();

      this.webSocketServer.close(() => {
        console.log('WebSocket server closed');
      });
    }
  }
}
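The polling loop above deduplicates by bar timestamp: a freshly fetched bar is dispatched to the strategy only if its timestamp is strictly newer than the last one seen for that symbol, so polling faster than the bar resolution never double-processes a bar. A minimal standalone sketch of that guard (a free function, not the service's actual API):

```typescript
// Per-symbol new-bar guard, mirroring the dedup check in pollMarketData:
// track the last processed timestamp and accept only strictly newer bars.
const lastSeen = new Map<string, number>();

function isNewBar(symbol: string, timestamp: Date): boolean {
  const prev = lastSeen.get(symbol);
  if (prev !== undefined && timestamp.getTime() <= prev) {
    return false; // same or older bar: already processed
  }
  lastSeen.set(symbol, timestamp.getTime());
  return true;
}

const t1 = new Date('2024-01-02T14:30:00Z');
const t2 = new Date('2024-01-02T14:31:00Z');

const first = isNewBar('AAPL', t1);  // first bar for AAPL
const repeat = isNewBar('AAPL', t1); // same bar polled again
const next = isNewBar('AAPL', t2);   // strictly newer bar
```

The tracking is per symbol, so one symbol's stale data never blocks another's fresh bars.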
@ -0,0 +1,138 @@
|
|||
import { BarData, Order } from '../Strategy';
|
||||
import { TechnicalIndicators } from '../analysis/TechnicalIndicators';
|
||||
import { VectorizedStrategy, VectorizedStrategyParams } from './VectorizedStrategy';
|
||||
|
||||
export interface MeanReversionParams extends VectorizedStrategyParams {
|
||||
lookback: number; // Period for mean calculation
|
||||
entryDeviation: number; // Standard deviations for entry
|
||||
exitDeviation: number; // Standard deviations for exit
|
||||
rsiPeriod: number; // RSI period
|
||||
rsiOverbought: number; // RSI overbought level
|
||||
rsiOversold: number; // RSI oversold level
|
||||
useRsi: boolean; // Whether to use RSI filter
|
||||
}
|
||||
|
||||
/**
|
||||
* Mean Reversion Strategy
|
||||
*
|
||||
* A vectorized strategy that trades mean reversion by detecting
|
||||
* overbought/oversold conditions and entering when price deviates
|
||||
* significantly from its moving average.
|
||||
*/
|
||||
export class MeanReversionStrategy extends VectorizedStrategy {
|
||||
// Default parameters
|
||||
protected static readonly DEFAULT_PARAMS: MeanReversionParams = {
|
||||
...VectorizedStrategy.DEFAULT_PARAMS,
|
||||
lookback: 20,
|
||||
    entryDeviation: 1.5,
    exitDeviation: 0.5,
    lookbackPeriod: 100,
    rsiPeriod: 14,
    rsiOverbought: 70,
    rsiOversold: 30,
    useRsi: true
  };

  constructor(
    id: string,
    name: string,
    description: string,
    symbols: string[],
    params: Partial<MeanReversionParams> = {}
  ) {
    super(id, name, description, symbols, {
      ...MeanReversionStrategy.DEFAULT_PARAMS,
      ...params
    });
  }

  /**
   * Computes buy/sell signals based on mean reversion logic
   */
  protected computeSignals(
    symbol: string,
    prices: number[],
    volumes: number[]
  ): Array<1 | 0 | -1> {
    const params = this.parameters as MeanReversionParams;
    const result: Array<1 | 0 | -1> = Array(prices.length).fill(0);

    // Not enough data
    if (prices.length < params.lookback) {
      return result;
    }

    // Calculate moving average
    const sma = TechnicalIndicators.sma(prices, params.lookback);

    // Calculate standard deviation
    const stdDevs: number[] = [];
    for (let i = params.lookback - 1; i < prices.length; i++) {
      let sum = 0;
      for (let j = i - params.lookback + 1; j <= i; j++) {
        sum += Math.pow(prices[j] - sma[i], 2);
      }
      stdDevs.push(Math.sqrt(sum / params.lookback));
    }

    // Pad standard deviations with NaN
    const paddedStdDevs = [...Array(params.lookback - 1).fill(NaN), ...stdDevs];

    // Calculate upper and lower bands
    const upperBand: number[] = [];
    const lowerBand: number[] = [];

    for (let i = 0; i < prices.length; i++) {
      if (isNaN(sma[i]) || isNaN(paddedStdDevs[i])) {
        upperBand.push(NaN);
        lowerBand.push(NaN);
      } else {
        upperBand.push(sma[i] + paddedStdDevs[i] * params.entryDeviation);
        lowerBand.push(sma[i] - paddedStdDevs[i] * params.entryDeviation);
      }
    }

    // Calculate RSI if used
    let rsi: number[] = [];
    if (params.useRsi) {
      rsi = TechnicalIndicators.rsi(prices, params.rsiPeriod);
    }

    // Generate signals
    for (let i = params.lookback; i < prices.length; i++) {
      // Check if price is outside bands
      const outsideUpperBand = prices[i] > upperBand[i];
      const outsideLowerBand = prices[i] < lowerBand[i];

      // Check RSI conditions
      const rsiOverbought = params.useRsi && rsi[i] > params.rsiOverbought;
      const rsiOversold = params.useRsi && rsi[i] < params.rsiOversold;

      // Check if price is returning to mean
      const returningFromUpper =
        outsideUpperBand && prices[i] < prices[i - 1] && prices[i - 1] > prices[i - 2];

      const returningFromLower =
        outsideLowerBand && prices[i] > prices[i - 1] && prices[i - 1] < prices[i - 2];

      // Generate signals
      if (returningFromUpper && (!params.useRsi || rsiOverbought)) {
        result[i] = -1; // SELL signal
      } else if (returningFromLower && (!params.useRsi || rsiOversold)) {
        result[i] = 1; // BUY signal
      } else if (Math.abs(prices[i] - sma[i]) < paddedStdDevs[i] * params.exitDeviation) {
        // Price returned to mean - exit position
        if (i > 0 && result[i - 1] !== 0) {
          result[i] = result[i - 1] * -1; // Opposite of previous signal
        }
      }
    }

    return result;
  }

  async onBar(bar: BarData): Promise<Order[]> {
    // Use the vectorized implementation from the base class
    return super.onBar(bar);
  }
}
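The band construction behind `computeSignals` can be sketched standalone: a rolling mean plus/minus `entryDeviation` rolling standard deviations, NaN-padded during the warm-up window. This is an illustrative sketch, not the project's `TechnicalIndicators` API; the function name `rollingBands` is made up here.

```typescript
// Hedged sketch of the entry-band computation used by the mean reversion
// strategy: mean ± entryDeviation * population standard deviation over a
// rolling lookback window, NaN until the window is full.
function rollingBands(
  prices: number[],
  lookback: number,
  entryDeviation: number
): { upper: number[]; lower: number[] } {
  const upper: number[] = [];
  const lower: number[] = [];
  for (let i = 0; i < prices.length; i++) {
    if (i < lookback - 1) {
      // Warm-up: not enough history yet
      upper.push(NaN);
      lower.push(NaN);
      continue;
    }
    const window = prices.slice(i - lookback + 1, i + 1);
    const mean = window.reduce((a, b) => a + b, 0) / lookback;
    const variance = window.reduce((a, b) => a + (b - mean) ** 2, 0) / lookback;
    const sd = Math.sqrt(variance);
    upper.push(mean + sd * entryDeviation);
    lower.push(mean - sd * entryDeviation);
  }
  return { upper, lower };
}
```

A price closing above `upper` (respectively below `lower`) and then turning back toward the mean is what the strategy treats as a SELL (respectively BUY) setup.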
@ -0,0 +1,182 @@
import { BaseStrategy, BarData, Order, StrategyParameters } from '../Strategy';
import { TechnicalIndicators } from '../analysis/TechnicalIndicators';

export interface MovingAverageCrossoverParams extends StrategyParameters {
  fastPeriod: number;   // Fast moving average period
  slowPeriod: number;   // Slow moving average period
  positionSize: number; // Position size as percentage (0-1)
  stopLoss: number;     // Stop loss percentage (0-1)
  takeProfit: number;   // Take profit percentage (0-1)
}

/**
 * Moving Average Crossover Strategy
 *
 * A simple strategy that buys when the fast MA crosses above the slow MA,
 * and sells when the fast MA crosses below the slow MA.
 */
export class MovingAverageCrossover extends BaseStrategy {
  // Default parameters
  protected static readonly DEFAULT_PARAMS: MovingAverageCrossoverParams = {
    fastPeriod: 10,
    slowPeriod: 30,
    positionSize: 0.2, // 20% of portfolio
    stopLoss: 0.02,    // 2% stop loss
    takeProfit: 0.05   // 5% take profit
  };

  private fastMA: Map<string, number[]> = new Map();
  private slowMA: Map<string, number[]> = new Map();
  private lastSignal: Map<string, 'BUY' | 'SELL' | null> = new Map();
  private stopLossLevels: Map<string, number> = new Map();
  private takeProfitLevels: Map<string, number> = new Map();

  constructor(
    id: string,
    name: string,
    description: string,
    symbols: string[],
    params: Partial<MovingAverageCrossoverParams> = {}
  ) {
    super(id, name, description, symbols, {
      ...MovingAverageCrossover.DEFAULT_PARAMS,
      ...params
    });
  }

  async initialize(): Promise<void> {
    // Initialize state for each symbol
    for (const symbol of this.symbols) {
      this.fastMA.set(symbol, []);
      this.slowMA.set(symbol, []);
      this.lastSignal.set(symbol, null);
    }

    // Validate parameters
    const params = this.parameters as MovingAverageCrossoverParams;

    if (params.fastPeriod >= params.slowPeriod) {
      console.warn('Fast period should be smaller than slow period');
    }

    if (params.positionSize <= 0 || params.positionSize > 1) {
      console.warn('Position size should be between 0 and 1');
      params.positionSize = 0.2; // Reset to default
    }
  }

  async onBar(bar: BarData): Promise<Order[]> {
    const symbol = bar.symbol;
    const params = this.parameters as MovingAverageCrossoverParams;

    // Update market data
    this.addBar(bar);

    // Get historical data
    const bars = this.context.marketData.get(symbol) || [];

    // Wait until we have enough data
    if (bars.length < params.slowPeriod + 1) {
      return [];
    }

    // Extract close prices
    const prices = TechnicalIndicators.extractPrice(bars, 'close');

    // Calculate moving averages
    const fastMA = TechnicalIndicators.sma(prices, params.fastPeriod);
    const slowMA = TechnicalIndicators.sma(prices, params.slowPeriod);

    // Update MA values
    this.fastMA.set(symbol, fastMA);
    this.slowMA.set(symbol, slowMA);

    // Get current and previous values
    const currentFast = fastMA[fastMA.length - 1];
    const currentSlow = slowMA[slowMA.length - 1];
    const previousFast = fastMA[fastMA.length - 2];
    const previousSlow = slowMA[slowMA.length - 2];

    // Check for crossovers
    const orders: Order[] = [];

    // Check for stop loss and take profit first
    if (this.hasPosition(symbol)) {
      const position = this.getPosition(symbol)!;
      const currentPrice = bar.close;

      const stopLossLevel = this.stopLossLevels.get(symbol);
      const takeProfitLevel = this.takeProfitLevels.get(symbol);

      // Check stop loss
      if (stopLossLevel && currentPrice <= stopLossLevel) {
        orders.push(this.createMarketOrder(symbol, 'SELL', position.quantity));
        this.stopLossLevels.delete(symbol);
        this.takeProfitLevels.delete(symbol);
        console.log(`${symbol} Stop loss triggered at ${currentPrice}`);
        return orders;
      }

      // Check take profit
      if (takeProfitLevel && currentPrice >= takeProfitLevel) {
        orders.push(this.createMarketOrder(symbol, 'SELL', position.quantity));
        this.stopLossLevels.delete(symbol);
        this.takeProfitLevels.delete(symbol);
        console.log(`${symbol} Take profit triggered at ${currentPrice}`);
        return orders;
      }
    }

    // Check for moving average crossover signals
    const crossedAbove = previousFast <= previousSlow && currentFast > currentSlow;
    const crossedBelow = previousFast >= previousSlow && currentFast < currentSlow;

    if (crossedAbove && !this.hasPosition(symbol)) {
      // Buy signal - calculate position size
      const cash = this.getAvailableCash();
      const positionValue = cash * params.positionSize;
      const quantity = Math.floor(positionValue / bar.close);

      if (quantity > 0) {
        orders.push(this.createMarketOrder(symbol, 'BUY', quantity));
        this.lastSignal.set(symbol, 'BUY');

        // Set stop loss and take profit levels
        this.stopLossLevels.set(symbol, bar.close * (1 - params.stopLoss));
        this.takeProfitLevels.set(symbol, bar.close * (1 + params.takeProfit));
        console.log(`${symbol} Buy signal at ${bar.close}, quantity: ${quantity}`);
      }
    } else if (crossedBelow && this.hasPosition(symbol)) {
      // Sell signal
      const position = this.getPosition(symbol)!;
      orders.push(this.createMarketOrder(symbol, 'SELL', position.quantity));
      this.lastSignal.set(symbol, 'SELL');
      this.stopLossLevels.delete(symbol);
      this.takeProfitLevels.delete(symbol);
      console.log(`${symbol} Sell signal at ${bar.close}`);
    }

    return orders;
  }

  async onOrderFilled(order: Order): Promise<void> {
    console.log(`Order filled: ${order.symbol} ${order.side} ${order.quantity} @ ${order.fillPrice}`);

    // Additional post-order logic can be added here
    if (order.side === 'BUY') {
      // Update stop loss and take profit levels based on fill price
      const params = this.parameters as MovingAverageCrossoverParams;
      this.stopLossLevels.set(order.symbol, order.fillPrice! * (1 - params.stopLoss));
      this.takeProfitLevels.set(order.symbol, order.fillPrice! * (1 + params.takeProfit));
    }
  }

  async cleanup(): Promise<void> {
    // Clean up any resources or state
    this.fastMA.clear();
    this.slowMA.clear();
    this.lastSignal.clear();
    this.stopLossLevels.clear();
    this.takeProfitLevels.clear();
  }
}
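The crossover test in `onBar` reduces to comparing the last two values of the two SMA series. A standalone sketch (not the project's `BaseStrategy`/`TechnicalIndicators` API; `sma` and `crossoverSignal` are illustrative names):

```typescript
// Simple moving average with NaN warm-up, as the strategy assumes.
function sma(prices: number[], period: number): number[] {
  const out: number[] = [];
  for (let i = 0; i < prices.length; i++) {
    if (i < period - 1) { out.push(NaN); continue; }
    let sum = 0;
    for (let j = i - period + 1; j <= i; j++) sum += prices[j];
    out.push(sum / period);
  }
  return out;
}

type Signal = 'BUY' | 'SELL' | null;

// BUY when the fast SMA moves above the slow SMA after being at or below it
// on the previous bar; SELL for the mirror case. The warm-up guard mirrors
// the `bars.length < slowPeriod + 1` check in onBar.
function crossoverSignal(prices: number[], fastPeriod: number, slowPeriod: number): Signal {
  const fast = sma(prices, fastPeriod);
  const slow = sma(prices, slowPeriod);
  const n = prices.length;
  if (n < slowPeriod + 1) return null;
  const crossedAbove = fast[n - 2] <= slow[n - 2] && fast[n - 1] > slow[n - 1];
  const crossedBelow = fast[n - 2] >= slow[n - 2] && fast[n - 1] < slow[n - 1];
  return crossedAbove ? 'BUY' : crossedBelow ? 'SELL' : null;
}
```

Requiring a cross (rather than merely fast > slow) is what keeps the strategy from re-entering on every bar of an established trend.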
@ -0,0 +1,197 @@
import { BaseStrategy, StrategyParameters } from '../Strategy';
import { MovingAverageCrossover, MovingAverageCrossoverParams } from './MovingAverageCrossover';
import { MeanReversionStrategy, MeanReversionParams } from './MeanReversionStrategy';

// Define the strategy types
export type StrategyType = 'MOVING_AVERAGE_CROSSOVER' | 'MEAN_REVERSION' | 'CUSTOM';

// Strategy factory function type
export type StrategyFactory = (
  id: string,
  name: string,
  description: string,
  symbols: string[],
  parameters: StrategyParameters
) => BaseStrategy;

/**
 * Strategy Registry
 *
 * Manages and creates strategy instances based on strategy type.
 * Allows dynamic registration of new strategy types.
 */
export class StrategyRegistry {
  private static instance: StrategyRegistry;
  private factories: Map<StrategyType, StrategyFactory> = new Map();

  /**
   * Get the singleton instance of StrategyRegistry
   */
  public static getInstance(): StrategyRegistry {
    if (!StrategyRegistry.instance) {
      StrategyRegistry.instance = new StrategyRegistry();
    }
    return StrategyRegistry.instance;
  }

  /**
   * Private constructor to enforce singleton pattern
   */
  private constructor() {
    this.registerBuiltInStrategies();
  }

  /**
   * Register built-in strategies
   */
  private registerBuiltInStrategies(): void {
    // Register Moving Average Crossover
    this.registerStrategy('MOVING_AVERAGE_CROSSOVER', (id, name, description, symbols, parameters) => {
      return new MovingAverageCrossover(
        id,
        name,
        description,
        symbols,
        parameters as MovingAverageCrossoverParams
      );
    });

    // Register Mean Reversion
    this.registerStrategy('MEAN_REVERSION', (id, name, description, symbols, parameters) => {
      return new MeanReversionStrategy(
        id,
        name,
        description,
        symbols,
        parameters as MeanReversionParams
      );
    });
  }

  /**
   * Register a new strategy type with factory function
   */
  public registerStrategy(type: StrategyType, factory: StrategyFactory): void {
    this.factories.set(type, factory);
  }

  /**
   * Create a strategy instance based on type and parameters
   */
  public createStrategy(
    type: StrategyType,
    id: string,
    name: string,
    description: string,
    symbols: string[],
    parameters: StrategyParameters = {}
  ): BaseStrategy {
    const factory = this.factories.get(type);

    if (!factory) {
      throw new Error(`Strategy type '${type}' is not registered`);
    }

    const strategy = factory(id, name, description, symbols, parameters);

    // Store the strategy for management
    this.storeStrategy(strategy);

    return strategy;
  }

  /**
   * Get default parameters for a strategy type
   */
  public getDefaultParameters(type: StrategyType): StrategyParameters {
    switch (type) {
      case 'MOVING_AVERAGE_CROSSOVER':
        return {
          fastPeriod: 10,
          slowPeriod: 30,
          positionSize: 0.2,
          stopLoss: 0.02,
          takeProfit: 0.05
        };
      case 'MEAN_REVERSION':
        return {
          lookback: 20,
          entryDeviation: 1.5,
          exitDeviation: 0.5,
          lookbackPeriod: 100,
          positionSize: 0.2,
          stopLoss: 0.02,
          takeProfit: 0.05,
          useBollingerBands: true,
          bollingerPeriod: 20,
          bollingerDeviation: 2,
          rsiPeriod: 14,
          rsiOverbought: 70,
          rsiOversold: 30,
          useRsi: true
        };
      default:
        return {};
    }
  }

  /**
   * Get all registered strategy types
   */
  public getStrategyTypes(): StrategyType[] {
    return Array.from(this.factories.keys());
  }

  /**
   * Check if a strategy type is registered
   */
  public hasStrategyType(type: StrategyType): boolean {
    return this.factories.has(type);
  }

  // Store created strategies for management
  private strategies: Map<string, BaseStrategy> = new Map();

  /**
   * Store strategy instance for later retrieval
   * Used when creating strategies through registry
   */
  private storeStrategy(strategy: BaseStrategy): void {
    this.strategies.set(strategy.id, strategy);
  }

  /**
   * Get all registered strategies
   */
  public getAllStrategies(): BaseStrategy[] {
    return Array.from(this.strategies.values());
  }

  /**
   * Get a strategy by ID
   */
  public getStrategyById(id: string): BaseStrategy | undefined {
    return this.strategies.get(id);
  }

  /**
   * Delete a strategy by ID
   */
  public deleteStrategy(id: string): boolean {
    return this.strategies.delete(id);
  }

  /**
   * Get the type of a strategy instance
   */
  public getStrategyType(strategy: BaseStrategy): StrategyType {
    // Determine the type based on constructor
    if (strategy instanceof MovingAverageCrossover) {
      return 'MOVING_AVERAGE_CROSSOVER';
    } else if (strategy instanceof MeanReversionStrategy) {
      return 'MEAN_REVERSION';
    }

    // Default to CUSTOM if we can't determine
    return 'CUSTOM';
  }
}
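Stripped of the trading specifics, the pattern `StrategyRegistry` implements is a lazily-created singleton holding a `Map` from type key to factory function. A minimal generic sketch (names like `MiniRegistry` are illustrative, not part of the project):

```typescript
// Hypothetical item shape for the sketch.
type Item = { id: string; kind: string };
type Factory = (id: string) => Item;

class MiniRegistry {
  private static inst: MiniRegistry | undefined;
  private factories = new Map<string, Factory>();

  // Lazily create the single shared instance, as getInstance() does above.
  static instance(): MiniRegistry {
    if (!MiniRegistry.inst) MiniRegistry.inst = new MiniRegistry();
    return MiniRegistry.inst;
  }

  // Private constructor: callers must go through instance().
  private constructor() {}

  register(type: string, factory: Factory): void {
    this.factories.set(type, factory);
  }

  create(type: string, id: string): Item {
    const factory = this.factories.get(type);
    if (!factory) throw new Error(`type '${type}' is not registered`);
    return factory(id);
  }
}
```

The design choice of factories over a `switch` on type is what lets callers register new strategy types at runtime without touching the registry's source.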
@ -0,0 +1,198 @@
import { BaseStrategy, BarData, Order, StrategyParameters } from '../Strategy';
import { TechnicalIndicators } from '../analysis/TechnicalIndicators';

export interface VectorizedStrategyParams extends StrategyParameters {
  lookbackPeriod: number;     // Data history to use for calculations
  positionSize: number;       // Position size as percentage (0-1)
  stopLoss: number;           // Stop loss percentage (0-1)
  takeProfit: number;         // Take profit percentage (0-1)
  useBollingerBands: boolean; // Use Bollinger Bands for volatility
  bollingerPeriod: number;    // Bollinger Band period
  bollingerDeviation: number; // Number of standard deviations
}

/**
 * Vectorized Strategy Base Class
 *
 * A performant strategy implementation designed for fast backtesting.
 * Instead of processing bars one at a time, it pre-calculates signals
 * for large chunks of data and executes them in batches.
 */
export abstract class VectorizedStrategy extends BaseStrategy {
  // Default parameters
  protected static readonly DEFAULT_PARAMS: VectorizedStrategyParams = {
    lookbackPeriod: 100,
    positionSize: 0.2, // 20% of portfolio
    stopLoss: 0.02,    // 2% stop loss
    takeProfit: 0.05,  // 5% take profit
    useBollingerBands: true,
    bollingerPeriod: 20,
    bollingerDeviation: 2
  };

  protected indicators: Map<string, {
    prices: number[];
    signals: Array<1 | 0 | -1>; // 1 = BUY, 0 = NEUTRAL, -1 = SELL
    stopLevels: number[];
    profitLevels: number[];
    bollingerUpper?: number[];
    bollingerLower?: number[];
  }> = new Map();

  protected stopLossLevels: Map<string, number> = new Map();
  protected takeProfitLevels: Map<string, number> = new Map();

  constructor(
    id: string,
    name: string,
    description: string,
    symbols: string[],
    params: Partial<VectorizedStrategyParams> = {}
  ) {
    super(id, name, description, symbols, {
      ...VectorizedStrategy.DEFAULT_PARAMS,
      ...params
    });
  }

  async initialize(): Promise<void> {
    // Initialize state for each symbol
    for (const symbol of this.symbols) {
      this.indicators.set(symbol, {
        prices: [],
        signals: [],
        stopLevels: [],
        profitLevels: []
      });
    }

    // Validate parameters
    const params = this.parameters as VectorizedStrategyParams;

    if (params.positionSize <= 0 || params.positionSize > 1) {
      console.warn('Position size should be between 0 and 1');
      params.positionSize = 0.2; // Reset to default
    }
  }

  /**
   * The main method that must be implemented by vectorized strategy subclasses.
   * It should compute signals based on price data in a vectorized manner.
   */
  protected abstract computeSignals(
    symbol: string,
    prices: number[],
    volumes: number[]
  ): Array<1 | 0 | -1>;

  async onBar(bar: BarData): Promise<Order[]> {
    const symbol = bar.symbol;
    const params = this.parameters as VectorizedStrategyParams;

    // Update market data
    this.addBar(bar);

    // Get historical data
    const bars = this.context.marketData.get(symbol) || [];

    // If we don't have enough bars yet, wait
    if (bars.length < params.lookbackPeriod) {
      return [];
    }

    // Extract price and volume data
    const prices = TechnicalIndicators.extractPrice(bars, 'close');
    const volumes = bars.map(b => b.volume);

    // Update indicators cache
    const indicators = this.indicators.get(symbol)!;
    indicators.prices = prices;

    // Check for stop loss and take profit first
    const orders: Order[] = [];

    if (this.hasPosition(symbol)) {
      const position = this.getPosition(symbol)!;
      const currentPrice = bar.close;

      const stopLossLevel = this.stopLossLevels.get(symbol);
      const takeProfitLevel = this.takeProfitLevels.get(symbol);

      // Check stop loss
      if (stopLossLevel && currentPrice <= stopLossLevel) {
        orders.push(this.createMarketOrder(symbol, 'SELL', position.quantity));
        this.stopLossLevels.delete(symbol);
        this.takeProfitLevels.delete(symbol);
        return orders;
      }

      // Check take profit
      if (takeProfitLevel && currentPrice >= takeProfitLevel) {
        orders.push(this.createMarketOrder(symbol, 'SELL', position.quantity));
        this.stopLossLevels.delete(symbol);
        this.takeProfitLevels.delete(symbol);
        return orders;
      }
    }

    // Compute vectorized signals only when we have enough new data
    // (optimization to avoid recomputing on every bar)
    if (bars.length % 10 === 0 || !indicators.signals.length) {
      indicators.signals = this.computeSignals(symbol, prices, volumes);

      // Update Bollinger Bands if configured
      if (params.useBollingerBands) {
        const bands = TechnicalIndicators.bollingerBands(
          prices,
          params.bollingerPeriod,
          params.bollingerDeviation
        );
        indicators.bollingerUpper = bands.upper;
        indicators.bollingerLower = bands.lower;
      }
    }

    // Get the latest signal
    const latestSignal = indicators.signals[indicators.signals.length - 1];

    // Generate orders based on signals
    if (latestSignal === 1 && !this.hasPosition(symbol)) {
      // Buy signal - calculate position size
      const cash = this.getAvailableCash();
      const positionValue = cash * params.positionSize;
      const quantity = Math.floor(positionValue / bar.close);

      if (quantity > 0) {
        orders.push(this.createMarketOrder(symbol, 'BUY', quantity));

        // Set stop loss and take profit levels
        this.stopLossLevels.set(symbol, bar.close * (1 - params.stopLoss));
        this.takeProfitLevels.set(symbol, bar.close * (1 + params.takeProfit));
      }
    } else if (latestSignal === -1 && this.hasPosition(symbol)) {
      // Sell signal
      const position = this.getPosition(symbol)!;
      orders.push(this.createMarketOrder(symbol, 'SELL', position.quantity));
      this.stopLossLevels.delete(symbol);
      this.takeProfitLevels.delete(symbol);
    }

    return orders;
  }

  async onOrderFilled(order: Order): Promise<void> {
    // Update stop loss and take profit levels based on fill price
    if (order.side === 'BUY') {
      const params = this.parameters as VectorizedStrategyParams;
      this.stopLossLevels.set(order.symbol, order.fillPrice! * (1 - params.stopLoss));
      this.takeProfitLevels.set(order.symbol, order.fillPrice! * (1 + params.takeProfit));
    }
  }

  async cleanup(): Promise<void> {
    // Clean up any resources or state
    this.indicators.clear();
    this.stopLossLevels.clear();
    this.takeProfitLevels.clear();
  }
}
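The batching idea in `onBar` is worth isolating: the full signal array is recomputed only every 10th bar (or when the cache is empty), and intermediate bars just index into the cached array. A standalone sketch of that cadence, with the `compute` callback standing in for `computeSignals` (illustrative names, not the project's API):

```typescript
// Simulates feeding `prices` one bar at a time through the cached-recompute
// cadence used by VectorizedStrategy: recompute when the bar count is a
// multiple of 10 or when no signals are cached yet.
function simulateBatched(
  prices: number[],
  compute: (p: number[]) => number[]
): { recomputes: number; lastSignal: number } {
  let cached: number[] = [];
  let recomputes = 0;
  for (let n = 1; n <= prices.length; n++) {
    if (n % 10 === 0 || cached.length === 0) {
      cached = compute(prices.slice(0, n)); // full-history recompute
      recomputes++;
    }
    // On other bars the strategy reads cached[cached.length - 1] directly.
  }
  return { recomputes, lastSignal: cached[cached.length - 1] };
}
```

For a 25-bar feed this recomputes only three times (bars 1, 10, 20) instead of 25, which is where the backtesting speed-up comes from; the trade-off is that signals between recomputes can be up to 9 bars stale.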
@ -2,6 +2,12 @@ import { Hono } from 'hono';
import { WebSocketServer } from 'ws';
import Redis from 'ioredis';
import * as cron from 'node-cron';
import { BaseStrategy } from './core/Strategy';
import { StrategyRegistry } from './core/strategies/StrategyRegistry';
import { BacktestService, BacktestRequest } from './core/backtesting/BacktestService';
import { BacktestResult } from './core/backtesting/BacktestEngine';
import { PerformanceAnalytics } from './core/backtesting/PerformanceAnalytics';
import { StrategyController } from './controllers/StrategyController';

const app = new Hono();
const redis = new Redis({
@ -14,6 +20,10 @@ const redis = new Redis({
// WebSocket server for real-time strategy updates
const wss = new WebSocketServer({ port: 8082 });

// Initialize strategy registry and backtest service
const strategyRegistry = StrategyRegistry.getInstance();
const backtestService = new BacktestService();

// Strategy interfaces
interface TradingStrategy {
  id: string;
@ -48,6 +58,9 @@ interface StrategySignal {
// In-memory strategy registry (in production, this would be persisted)
const strategies = new Map<string, TradingStrategy>();

// Initialize strategy controller
const strategyController = new StrategyController();

// Health check endpoint
app.get('/health', (c) => {
  return c.json({
@ -56,41 +69,214 @@ app.get('/health', (c) => {
    timestamp: new Date(),
    version: '1.0.0',
    activeStrategies: Array.from(strategies.values()).filter(s => s.status === 'ACTIVE').length,
    registeredStrategies: strategyRegistry.getAllStrategies().length,
    connections: wss.clients.size
  });
});

// Get all strategies
// API Routes
// Strategy management endpoints
app.get('/api/strategy-types', async (c) => {
  try {
    const types = Object.values(strategyRegistry.getStrategyTypes());
    return c.json({ success: true, data: types });
  } catch (error) {
    console.error('Error getting strategy types:', error);
    return c.json({ success: false, error: 'Failed to get strategy types' }, 500);
  }
});

app.get('/api/strategies', async (c) => {
  try {
    const strategiesList = Array.from(strategies.values());
    return c.json({
      success: true,
      data: strategiesList
    });
    const strategies = strategyRegistry.getAllStrategies();
    const serializedStrategies = strategies.map(strategy => ({
      id: strategy.id,
      name: strategy.name,
      description: strategy.description,
      symbols: strategy.symbols,
      parameters: strategy.parameters,
      type: strategyRegistry.getStrategyType(strategy)
    }));

    return c.json({ success: true, data: serializedStrategies });
  } catch (error) {
    console.error('Error fetching strategies:', error);
    return c.json({ success: false, error: 'Failed to fetch strategies' }, 500);
  }
});

// Get specific strategy
app.get('/api/strategies/:id', async (c) => {
  try {
    const id = c.req.param('id');
    const strategy = strategies.get(id);
    const strategy = strategyRegistry.getStrategyById(id);

    if (!strategy) {
      return c.json({ success: false, error: 'Strategy not found' }, 404);
      return c.json({ success: false, error: `Strategy with ID ${id} not found` }, 404);
    }

    return c.json({ success: true, data: strategy });
    const type = strategyRegistry.getStrategyType(strategy);

    return c.json({
      success: true,
      data: {
        id: strategy.id,
        name: strategy.name,
        description: strategy.description,
        symbols: strategy.symbols,
        parameters: strategy.parameters,
        type
      }
    });
  } catch (error) {
    console.error('Error fetching strategy:', error);
    return c.json({ success: false, error: 'Failed to fetch strategy' }, 500);
  }
});

app.post('/api/strategies', async (c) => {
  try {
    const { name, description, symbols, parameters, type } = await c.req.json();

    if (!type) {
      return c.json({
        success: false,
        error: 'Invalid strategy type'
      }, 400);
    }

    const strategy = strategyRegistry.createStrategy(
      type,
      `strategy_${Date.now()}`, // Generate an ID
      name || `New ${type} Strategy`,
      description || `Generated ${type} strategy`,
      symbols || [],
      parameters || {}
    );

    return c.json({
      success: true,
      data: {
        id: strategy.id,
        name: strategy.name,
        description: strategy.description,
        symbols: strategy.symbols,
        parameters: strategy.parameters,
        type
      }
    }, 201);
  } catch (error) {
    console.error('Error creating strategy:', error);
    return c.json({ success: false, error: (error as Error).message }, 500);
  }
});

app.put('/api/strategies/:id', async (c) => {
  try {
    const id = c.req.param('id');
    const { name, description, symbols, parameters } = await c.req.json();

    const strategy = strategyRegistry.getStrategyById(id);

    if (!strategy) {
      return c.json({ success: false, error: `Strategy with ID ${id} not found` }, 404);
    }

    // Update properties
    if (name !== undefined) strategy.name = name;
    if (description !== undefined) strategy.description = description;
    if (symbols !== undefined) (strategy as any).symbols = symbols; // Hack since symbols is readonly
    if (parameters !== undefined) strategy.parameters = parameters;

    return c.json({
      success: true,
      data: {
        id: strategy.id,
        name: strategy.name,
        description: strategy.description,
        symbols: strategy.symbols,
        parameters: strategy.parameters,
        type: strategyRegistry.getStrategyType(strategy)
      }
    });
  } catch (error) {
    console.error('Error updating strategy:', error);
    return c.json({ success: false, error: (error as Error).message }, 500);
  }
});

app.delete('/api/strategies/:id', async (c) => {
  try {
    const id = c.req.param('id');
    const success = strategyRegistry.deleteStrategy(id);

    if (!success) {
      return c.json({ success: false, error: `Strategy with ID ${id} not found` }, 404);
    }

    return c.json({ success: true, data: { id } });
  } catch (error) {
    console.error('Error deleting strategy:', error);
    return c.json({ success: false, error: (error as Error).message }, 500);
  }
});

// Backtesting endpoints
app.post('/api/backtest', async (c) => {
  try {
    const backtestRequest = await c.req.json() as BacktestRequest;

    // Validate request
    if (!backtestRequest.strategyType) {
      return c.json({ success: false, error: 'Strategy type is required' }, 400);
    }

    if (!backtestRequest.symbols || backtestRequest.symbols.length === 0) {
      return c.json({ success: false, error: 'At least one symbol is required' }, 400);
    }

    // Run the backtest
    const result = await backtestService.runBacktest(backtestRequest);

    // Enhance results with additional metrics
    const enhancedResult = PerformanceAnalytics.enhanceResults(result);

    // Calculate additional analytics
    const monthlyReturns = PerformanceAnalytics.calculateMonthlyReturns(result.dailyReturns);
    const drawdowns = PerformanceAnalytics.analyzeDrawdowns(result.dailyReturns);

    return c.json({
      success: true,
      data: {
        ...enhancedResult,
        monthlyReturns,
        drawdowns
      }
    });
  } catch (error) {
    console.error('Backtest error:', error);
    return c.json({ success: false, error: (error as Error).message }, 500);
  }
});

app.post('/api/optimize', async (c) => {
  try {
    const { baseRequest, parameterGrid } = await c.req.json();

    // Validate request
    if (!baseRequest || !parameterGrid) {
      return c.json({ success: false, error: 'Base request and parameter grid are required' }, 400);
    }

    // Run optimization
    const results = await backtestService.optimizeStrategy(baseRequest, parameterGrid);

    return c.json({ success: true, data: results });
  } catch (error) {
    console.error('Strategy optimization error:', error);
    return c.json({ success: false, error: (error as Error).message }, 500);
  }
});

// Create new strategy
app.post('/api/strategies', async (c) => {
  try {
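The create-strategy handler above destructures `{ name, description, symbols, parameters, type }` from the request body. A hedged sketch of a matching client payload; the interface name, field values, and the commented base URL/port are assumptions for illustration, not part of the project:

```typescript
// Illustrative shape of the JSON body POST /api/strategies expects,
// inferred from the handler's destructuring; only `type` is required there.
interface CreateStrategyRequest {
  type: 'MOVING_AVERAGE_CROSSOVER' | 'MEAN_REVERSION' | 'CUSTOM';
  name?: string;
  description?: string;
  symbols?: string[];
  parameters?: Record<string, unknown>;
}

const body: CreateStrategyRequest = {
  type: 'MOVING_AVERAGE_CROSSOVER',
  name: 'Demo MA Cross',       // hypothetical value
  symbols: ['AAPL'],           // hypothetical value
  parameters: { fastPeriod: 10, slowPeriod: 30 }
};

// Hypothetical call (base URL and port are assumptions):
// await fetch('http://localhost:3000/api/strategies', {
//   method: 'POST',
//   headers: { 'Content-Type': 'application/json' },
//   body: JSON.stringify(body)
// });
```

Omitted optional fields fall back to the handler's defaults (generated name/description, empty `symbols`, empty `parameters`).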
@ -252,6 +438,32 @@ app.get('/api/strategies/:id/signals', async (c) => {
  }
});

// Get strategy trades
app.get('/api/strategies/:id/trades', async (c) => {
  try {
    const id = c.req.param('id');
    const limit = parseInt(c.req.query('limit') || '50');

    const tradeKeys = await redis.keys(`trade:${id}:*`);
    const trades: any[] = [];

    for (const key of tradeKeys.slice(0, limit)) {
      const data = await redis.get(key);
      if (data) {
        trades.push(JSON.parse(data));
      }
    }

    return c.json({
      success: true,
      data: trades.sort((a: any, b: any) => new Date(b.exitTime || b.timestamp).getTime() - new Date(a.exitTime || a.timestamp).getTime())
    });
  } catch (error) {
    console.error('Error fetching strategy trades:', error);
    return c.json({ success: false, error: 'Failed to fetch trades' }, 500);
  }
});

// Generate demo signal (for testing)
app.post('/api/strategies/:id/generate-signal', async (c) => {
  try {
@ -378,6 +590,91 @@ cron.schedule('*/5 * * * *', async () => {
|
|||
}
|
||||
});
|
||||
|
||||
// Backtesting API endpoints
|
||||
app.post('/api/backtest', async (c) => {
|
||||
try {
|
||||
const request = await c.req.json() as BacktestRequest;
|
||||
console.log('Received backtest request:', request);
|
||||
|
||||
const result = await backtestService.runBacktest(request);
|
||||
const enhancedResult = PerformanceAnalytics.enhanceResults(result);
|
||||
|
||||
// Store backtest result in Redis for persistence
|
||||
await redis.setex(
|
||||
`backtest:${result.strategyId}`,
|
||||
86400 * 7, // 7 days TTL
|
||||
JSON.stringify(enhancedResult)
|
||||
);
|
||||
|
||||
return c.json({ success: true, data: enhancedResult });
|
||||
} catch (error) {
|
||||
console.error('Backtest error:', error);
|
||||
return c.json({
|
||||
success: false,
|
||||
error: error instanceof Error ? error.message : 'Unknown error'
|
||||
}, 500);
|
||||
}
|
||||
});
|
||||
|
||||
app.post('/api/backtest/optimize', async (c) => {
|
||||
try {
|
||||
const { baseRequest, parameterGrid } = await c.req.json() as {
|
||||
baseRequest: BacktestRequest,
|
||||
parameterGrid: Record<string, any[]>
|
||||
};
|
||||
|
||||
console.log('Received optimization request:', baseRequest, parameterGrid);
|
||||
|
||||
const results = await backtestService.optimizeStrategy(baseRequest, parameterGrid);
|
||||
|
||||
return c.json({ success: true, data: results });
|
||||
} catch (error) {
|
||||
console.error('Optimization error:', error);
|
||||
return c.json({
|
||||
success: false,
|
||||
error: error instanceof Error ? error.message : 'Unknown error'
|
||||
}, 500);
|
||||
}
|
||||
});
|
||||
|
||||
app.get('/api/backtest/:id', async (c) => {
|
||||
try {
|
||||
const id = c.req.param('id');
|
||||
const data = await redis.get(`backtest:${id}`);
|
||||
|
||||
if (!data) {
|
||||
return c.json({ success: false, error: 'Backtest not found' }, 404);
|
||||
}
|
||||
|
||||
const result = JSON.parse(data) as BacktestResult;
|
||||
return c.json({ success: true, data: result });
|
||||
} catch (error) {
|
||||
console.error('Error fetching backtest:', error);
|
||||
return c.json({ success: false, error: 'Failed to fetch backtest' }, 500);
|
||||
}
|
||||
});
|
||||
|
||||
app.get('/api/strategy-types', (c) => {
|
||||
const types = strategyRegistry.getStrategyTypes();
|
||||
return c.json({ success: true, data: types });
|
||||
});
|
||||
|
||||
app.get('/api/strategy-parameters/:type', (c) => {
|
||||
try {
|
||||
const type = c.req.param('type') as any;
|
||||
|
||||
if (!strategyRegistry.hasStrategyType(type)) {
|
||||
return c.json({ success: false, error: 'Strategy type not found' }, 404);
|
||||
}
|
||||
|
||||
const params = strategyRegistry.getDefaultParameters(type);
|
||||
return c.json({ success: true, data: params });
|
||||
} catch (error) {
|
||||
console.error('Error fetching strategy parameters:', error);
|
||||
return c.json({ success: false, error: 'Failed to fetch parameters' }, 500);
|
||||
}
|
||||
});
|
||||
|
||||
// Load existing strategies from Redis on startup
|
||||
async function loadStrategiesFromRedis() {
|
||||
try {
|
||||
|
|
@@ -395,7 +692,7 @@ async function loadStrategiesFromRedis() {
  }
}

-const port = parseInt(process.env.PORT || '3003');
+const port = parseInt(process.env.PORT || '4001');

console.log(`🎯 Strategy Orchestrator starting on port ${port}`);
console.log(`📡 WebSocket server running on port 8082`);
@@ -0,0 +1,139 @@
import { describe, test, expect, beforeEach, jest } from 'bun:test';
import { BacktestService, BacktestRequest } from '../../core/backtesting/BacktestService';
import { StrategyRegistry, StrategyType } from '../../core/strategies/StrategyRegistry';
import { MarketDataFeed } from '../../core/backtesting/MarketDataFeed';

// Mock dependencies
jest.mock('../../core/backtesting/MarketDataFeed');

describe('BacktestService', () => {
  let backtestService: BacktestService;
  let mockRequest: BacktestRequest;

  beforeEach(() => {
    // Reset mocks and create a fresh service instance
    jest.clearAllMocks();
    backtestService = new BacktestService('http://test.api');

    // Create a standard backtest request for tests
    mockRequest = {
      strategyType: 'MEAN_REVERSION' as StrategyType,
      strategyParams: {
        lookback: 20,
        entryDeviation: 1.5,
        exitDeviation: 0.5,
        lookbackPeriod: 100
      },
      symbols: ['AAPL'],
      startDate: new Date('2023-01-01'),
      endDate: new Date('2023-02-01'),
      initialCapital: 100000,
      dataResolution: '1d',
      commission: 0.001,
      slippage: 0.001,
      mode: 'vector'
    };

    // Mock the MarketDataFeed implementation
    (MarketDataFeed.prototype.getHistoricalData as jest.Mock).mockResolvedValue([
      // Generate some sample data
      ...Array(30).fill(0).map((_, i) => ({
        symbol: 'AAPL',
        timestamp: new Date(`2023-01-${(i + 1).toString().padStart(2, '0')}`),
        open: 150 + Math.random() * 10,
        high: 155 + Math.random() * 10,
        low: 145 + Math.random() * 10,
        close: 150 + Math.random() * 10,
        volume: 1000000 + Math.random() * 500000
      }))
    ]);
  });

  test('should run a backtest successfully', async () => {
    // Act
    const result = await backtestService.runBacktest(mockRequest);

    // Assert
    expect(result).toBeDefined();
    expect(result.strategyId).toBeDefined();
    expect(result.initialCapital).toBe(100000);
    expect(result.trades).toBeDefined();
    expect(result.dailyReturns).toBeDefined();

    // Verify market data was requested
    expect(MarketDataFeed.prototype.getHistoricalData).toHaveBeenCalledTimes(mockRequest.symbols.length);
  });

  test('should optimize strategy parameters', async () => {
    // Arrange
    const parameterGrid = {
      lookback: [10, 20],
      entryDeviation: [1.0, 1.5, 2.0]
    };

    // We should get 2×3 = 6 combinations

    // Act
    const results = await backtestService.optimizeStrategy(mockRequest, parameterGrid);

    // Assert
    expect(results).toHaveLength(6);
    expect(results[0].parameters).toBeDefined();

    // Check that results are sorted by performance (Sharpe ratio)
    for (let i = 0; i < results.length - 1; i++) {
      expect(results[i].sharpeRatio).toBeGreaterThanOrEqual(results[i + 1].sharpeRatio);
    }
  });

  test('should handle errors during backtest', async () => {
    // Arrange
    (MarketDataFeed.prototype.getHistoricalData as jest.Mock).mockRejectedValue(
      new Error('Data source error')
    );

    // Act & Assert
    await expect(backtestService.runBacktest(mockRequest))
      .rejects
      .toThrow();
  });

  test('should generate correct parameter combinations', () => {
    // Arrange
    const grid = {
      param1: [1, 2],
      param2: ['a', 'b'],
      param3: [true, false]
    };

    // Act
    const combinations = (backtestService as any).generateParameterCombinations(grid, Object.keys(grid));

    // Assert - should get 2×2×2 = 8 combinations
    expect(combinations).toHaveLength(8);

    // Check that all combinations are generated
    expect(combinations).toContainEqual({ param1: 1, param2: 'a', param3: true });
    expect(combinations).toContainEqual({ param1: 1, param2: 'a', param3: false });
    expect(combinations).toContainEqual({ param1: 1, param2: 'b', param3: true });
    expect(combinations).toContainEqual({ param1: 1, param2: 'b', param3: false });
    expect(combinations).toContainEqual({ param1: 2, param2: 'a', param3: true });
    expect(combinations).toContainEqual({ param1: 2, param2: 'a', param3: false });
    expect(combinations).toContainEqual({ param1: 2, param2: 'b', param3: true });
    expect(combinations).toContainEqual({ param1: 2, param2: 'b', param3: false });
  });
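The 2×2×2 expectation in the test above comes from a plain cartesian product over the grid's value lists. A minimal standalone sketch of that expansion (an illustrative reimplementation, not the service's actual private method):

```typescript
// Expand a parameter grid into every combination of its values,
// recursing key by key: 2 × 2 × 2 values yields 8 combinations.
function generateParameterCombinations(
  grid: Record<string, any[]>,
  keys: string[]
): Record<string, any>[] {
  if (keys.length === 0) return [{}];
  const [first, ...rest] = keys;
  const tails = generateParameterCombinations(grid, rest);
  const out: Record<string, any>[] = [];
  for (const value of grid[first]) {
    for (const tail of tails) {
      out.push({ [first]: value, ...tail });
    }
  }
  return out;
}

const combos = generateParameterCombinations(
  { lookback: [10, 20], entryDeviation: [1.0, 1.5, 2.0] },
  ['lookback', 'entryDeviation']
);
// combos.length === 6 (2 × 3)
```

The combination count is the product of the list lengths, so optimization cost grows multiplicatively with each added parameter.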
|
||||
|
||||
test('should track active backtests', () => {
|
||||
// Arrange
|
||||
const activeBacktests = (backtestService as any).activeBacktests;
|
||||
|
||||
// Act
|
||||
let promise = backtestService.runBacktest(mockRequest);
|
||||
|
||||
// Assert
|
||||
expect(activeBacktests.size).toBe(1);
|
||||
|
||||
// Clean up
|
||||
return promise;
|
||||
});
|
||||
});
|
||||
|
|
@@ -0,0 +1,169 @@
import { describe, test, expect } from 'bun:test';
import { PerformanceAnalytics } from '../../core/backtesting/PerformanceAnalytics';
import { BacktestResult } from '../../core/backtesting/BacktestEngine';

describe('PerformanceAnalytics', () => {
  // Sample backtest result for testing
  const sampleResult: BacktestResult = {
    strategyId: 'test-strategy',
    startDate: new Date('2023-01-01'),
    endDate: new Date('2023-12-31'),
    duration: 31536000000, // 1 year in ms
    initialCapital: 100000,
    finalCapital: 125000,
    totalReturn: 0.25, // 25% return
    annualizedReturn: 0.25,
    sharpeRatio: 1.5,
    maxDrawdown: 0.10, // 10% drawdown
    maxDrawdownDuration: 30, // 30 days
    winRate: 0.6, // 60% win rate
    totalTrades: 50,
    winningTrades: 30,
    losingTrades: 20,
    averageWinningTrade: 0.05, // 5% average win
    averageLosingTrade: -0.03, // 3% average loss
    profitFactor: 2.5,
    dailyReturns: [
      // Generate 365 days of sample daily returns with some randomness
      ...Array(365).fill(0).map((_, i) => ({
        date: new Date(new Date('2023-01-01').getTime() + i * 24 * 3600 * 1000),
        return: 0.001 + (Math.random() - 0.45) * 0.01 // Mean positive return with noise
      }))
    ],
    trades: [
      // Generate sample trades (month/day arithmetic kept within valid calendar ranges)
      ...Array(50).fill(0).map((_, i) => ({
        symbol: 'AAPL',
        entryTime: new Date(`2023-${(Math.floor(i / 4) % 12) + 1}-${(i % 24) + 1}`),
        entryPrice: 150 + Math.random() * 10,
        exitTime: new Date(`2023-${(Math.floor(i / 4) % 12) + 1}-${(i % 24) + 5}`),
        exitPrice: 155 + Math.random() * 10,
        quantity: 10,
        pnl: 500 * (Math.random() - 0.3), // Some wins, some losses
        pnlPercent: 0.05 * (Math.random() - 0.3)
      }))
    ]
  };

  test('should calculate advanced metrics', () => {
    // Act
    const enhancedResult = PerformanceAnalytics.enhanceResults(sampleResult);

    // Assert
    expect(enhancedResult.sortinoRatio).toBeDefined();
    expect(enhancedResult.calmarRatio).toBeDefined();
    expect(enhancedResult.omegaRatio).toBeDefined();
    expect(enhancedResult.cagr).toBeDefined();
    expect(enhancedResult.volatility).toBeDefined();
    expect(enhancedResult.ulcerIndex).toBeDefined();

    // Check that the original result properties are preserved
    expect(enhancedResult.strategyId).toBe(sampleResult.strategyId);
    expect(enhancedResult.totalReturn).toBe(sampleResult.totalReturn);

    // Validate some calculations
    expect(enhancedResult.calmarRatio).toBeCloseTo(sampleResult.annualizedReturn / sampleResult.maxDrawdown);
    expect(typeof enhancedResult.sortinoRatio).toBe('number');
  });

  test('should calculate monthly returns', () => {
    // Act
    const monthlyReturns = PerformanceAnalytics.calculateMonthlyReturns(sampleResult.dailyReturns);

    // Assert
    expect(monthlyReturns).toBeDefined();
    expect(monthlyReturns.length).toBe(12); // 12 months in a year
    expect(monthlyReturns[0].year).toBe(2023);
    expect(monthlyReturns[0].month).toBe(0); // January is 0

    // Verify sorting
    let lastDate = { year: 0, month: 0 };
    for (const mr of monthlyReturns) {
      expect(mr.year >= lastDate.year).toBeTruthy();
      if (mr.year === lastDate.year) {
        expect(mr.month >= lastDate.month).toBeTruthy();
      }
      lastDate = { year: mr.year, month: mr.month };
    }
  });
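Monthly aggregation of daily returns is typically geometric compounding per calendar month. A minimal sketch under that assumption (this is an illustration of the usual convention, not necessarily the exact behavior of `calculateMonthlyReturns`):

```typescript
interface DailyReturn { date: Date; return: number }
interface MonthlyReturn { year: number; month: number; return: number }

// Compound daily returns into per-month returns, (1+r1)(1+r2)...-1,
// keyed by UTC year/month and sorted chronologically.
function compoundMonthly(daily: DailyReturn[]): MonthlyReturn[] {
  const buckets = new Map<string, MonthlyReturn>();
  for (const d of daily) {
    const year = d.date.getUTCFullYear();
    const month = d.date.getUTCMonth();
    const key = `${year}-${month}`;
    const bucket = buckets.get(key) ?? { year, month, return: 0 };
    bucket.return = (1 + bucket.return) * (1 + d.return) - 1;
    buckets.set(key, bucket);
  }
  return Array.from(buckets.values()).sort(
    (a, b) => a.year - b.year || a.month - b.month
  );
}

const monthly = compoundMonthly([
  { date: new Date(Date.UTC(2023, 0, 2)), return: 0.01 },
  { date: new Date(Date.UTC(2023, 0, 3)), return: 0.01 },
  { date: new Date(Date.UTC(2023, 1, 1)), return: -0.02 }
]);
// January compounds to (1.01 * 1.01) - 1 = 0.0201
```

Compounding rather than summing daily returns avoids understating gains and losses over longer months.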
|
||||
|
||||
test('should analyze drawdowns', () => {
|
||||
// Act
|
||||
const drawdowns = PerformanceAnalytics.analyzeDrawdowns(sampleResult.dailyReturns);
|
||||
|
||||
// Assert
|
||||
expect(drawdowns).toBeDefined();
|
||||
expect(drawdowns.length).toBeGreaterThan(0);
|
||||
|
||||
// Check drawdown properties
|
||||
for (const dd of drawdowns) {
|
||||
expect(dd.startDate).toBeInstanceOf(Date);
|
||||
expect(dd.endDate).toBeInstanceOf(Date);
|
||||
expect(dd.drawdown).toBeGreaterThan(0);
|
||||
expect(dd.durationDays).toBeGreaterThanOrEqual(0);
|
||||
|
||||
// Recovery date and days might be null for ongoing drawdowns
|
||||
if (dd.recoveryDate) {
|
||||
expect(dd.recoveryDate).toBeInstanceOf(Date);
|
||||
expect(dd.recoveryDays).toBeGreaterThanOrEqual(0);
|
||||
}
|
||||
}
|
||||
|
||||
// Check sorting by drawdown magnitude
|
||||
for (let i = 0; i < drawdowns.length - 1; i++) {
|
||||
expect(drawdowns[i].drawdown).toBeGreaterThanOrEqual(drawdowns[i + 1].drawdown);
|
||||
}
|
||||
});
|
||||
|
||||
test('should handle empty inputs', () => {
|
||||
// Act & Assert
|
||||
expect(() => PerformanceAnalytics.calculateMonthlyReturns([])).not.toThrow();
|
||||
expect(() => PerformanceAnalytics.analyzeDrawdowns([])).not.toThrow();
|
||||
|
||||
const emptyMonthlyReturns = PerformanceAnalytics.calculateMonthlyReturns([]);
|
||||
const emptyDrawdowns = PerformanceAnalytics.analyzeDrawdowns([]);
|
||||
|
||||
expect(emptyMonthlyReturns).toEqual([]);
|
||||
expect(emptyDrawdowns).toEqual([]);
|
||||
});
|
||||
|
||||
test('should calculate special cases correctly', () => {
|
||||
// Case with no negative returns
|
||||
const allPositiveReturns = {
|
||||
dailyReturns: Array(30).fill(0).map((_, i) => ({
|
||||
date: new Date(`2023-01-${i + 1}`),
|
||||
return: 0.01 // Always positive
|
||||
}))
|
||||
};
|
||||
|
||||
// Case with no recovery from drawdown
|
||||
const noRecoveryReturns = {
|
||||
dailyReturns: [
|
||||
...Array(30).fill(0).map((_, i) => ({
|
||||
date: new Date(`2023-01-${i + 1}`),
|
||||
return: 0.01 // Positive returns
|
||||
})),
|
||||
...Array(30).fill(0).map((_, i) => ({
|
||||
date: new Date(`2023-02-${i + 1}`),
|
||||
return: -0.005 // Negative returns with no recovery
|
||||
}))
|
||||
]
|
||||
};
|
||||
|
||||
// Act
|
||||
const positiveMetrics = PerformanceAnalytics.enhanceResults({
|
||||
...sampleResult,
|
||||
dailyReturns: allPositiveReturns.dailyReturns
|
||||
});
|
||||
|
||||
const noRecoveryDrawdowns = PerformanceAnalytics.analyzeDrawdowns(noRecoveryReturns.dailyReturns);
|
||||
|
||||
// Assert
|
||||
expect(positiveMetrics.sortinoRatio).toBe(Infinity); // No downside risk
|
||||
|
||||
// Last drawdown should have no recovery
|
||||
const lastDrawdown = noRecoveryDrawdowns[noRecoveryDrawdowns.length - 1];
|
||||
expect(lastDrawdown.recoveryDate).toBeNull();
|
||||
expect(lastDrawdown.recoveryDays).toBeNull();
|
||||
});
|
||||
});
|
||||
|
|
@@ -0,0 +1,237 @@
import { describe, it, expect, beforeEach, afterEach, jest } from 'bun:test';
import { EventEmitter } from 'events';
import { WebSocket } from 'ws';
import { StrategyExecutionService } from '../../core/execution/StrategyExecutionService';
import { StrategyRegistry } from '../../core/strategies/StrategyRegistry';
import { MarketDataFeed } from '../../core/backtesting/MarketDataFeed';
import { BaseStrategy, BarData, Order } from '../../core/Strategy';

// Mock WebSocket to avoid actual network connections during tests
jest.mock('ws', () => {
  const EventEmitter = require('events');

  class MockWebSocket extends EventEmitter {
    static OPEN = 1;
    readyState = 1;
    close = jest.fn();
    send = jest.fn();
  }

  class MockServer extends EventEmitter {
    clients = new Set();

    constructor() {
      super();
      // Add a mock client to the set
      const mockClient = new MockWebSocket();
      this.clients.add(mockClient);
    }

    close(callback: () => void) {
      callback();
    }
  }

  return {
    WebSocket: MockWebSocket,
    Server: MockServer
  };
});

// Mock MarketDataFeed to avoid actual API calls
jest.mock('../../core/backtesting/MarketDataFeed', () => {
  return {
    MarketDataFeed: class {
      async getHistoricalData(symbol: string, resolution: string, startDate: Date, endDate: Date) {
        // Return mock data
        return [
          {
            symbol,
            timestamp: new Date(),
            open: 100,
            high: 105,
            low: 95,
            close: 102,
            volume: 1000
          }
        ];
      }
    }
  };
});

// Mock strategy for testing
class MockStrategy extends BaseStrategy {
  name = 'MockStrategy';
  description = 'A mock strategy for testing';
  symbols = ['AAPL', 'MSFT'];
  parameters = { param1: 1, param2: 2 };

  constructor(id: string) {
    super(id);
  }

  async start(): Promise<void> {}

  async stop(): Promise<void> {}

  onBar(bar: BarData) {
    // Return a mock signal
    return {
      action: 'BUY',
      symbol: bar.symbol,
      price: bar.close,
      quantity: 10,
      metadata: { reason: 'Test signal' }
    };
  }

  async onOrderFilled(order: Order): Promise<void> {}
}

// Mock StrategyRegistry
jest.mock('../../core/strategies/StrategyRegistry', () => {
  const mockInstance = {
    getStrategyById: jest.fn(),
    getStrategyTypes: () => [{ id: 'mock-strategy', name: 'Mock Strategy' }],
    getAllStrategies: () => [new MockStrategy('mock-1')]
  };

  return {
    StrategyRegistry: {
      getInstance: () => mockInstance
    }
  };
});

describe('StrategyExecutionService', () => {
  let executionService: StrategyExecutionService;
  let strategyRegistry: ReturnType<typeof StrategyRegistry.getInstance>;

  beforeEach(() => {
    // Reset mocks
    jest.clearAllMocks();

    // Create a new execution service for each test
    executionService = new StrategyExecutionService('http://localhost:3001/api', 8082);
    strategyRegistry = StrategyRegistry.getInstance();

    // Setup mock strategy
    const mockStrategy = new MockStrategy('test-strategy');
    (strategyRegistry.getStrategyById as jest.Mock).mockReturnValue(mockStrategy);
  });

  afterEach(() => {
    executionService.shutdown();
  });

  it('should initialize correctly', () => {
    expect(executionService).toBeDefined();
  });

  it('should start a strategy correctly', () => {
    // Arrange & Act
    const result = executionService.startStrategy('test-strategy');

    // Assert
    expect(result).toBe(true);
    expect(strategyRegistry.getStrategyById).toHaveBeenCalledWith('test-strategy');

    // Check if WebSocket broadcast happened
    const ws = executionService['webSocketServer'].clients.values().next().value;
    expect(ws.send).toHaveBeenCalled();

    // Check the broadcast message contains the correct type
    const lastCall = ws.send.mock.calls[0][0];
    const message = JSON.parse(lastCall);
    expect(message.type).toBe('strategy_started');
    expect(message.data.strategyId).toBe('test-strategy');
  });

  it('should stop a strategy correctly', () => {
    // Arrange
    executionService.startStrategy('test-strategy');

    // Act
    const result = executionService.stopStrategy('test-strategy');

    // Assert
    expect(result).toBe(true);

    // Check if WebSocket broadcast happened
    const ws = executionService['webSocketServer'].clients.values().next().value;
    const lastCallIndex = ws.send.mock.calls.length - 1;
    const lastCall = ws.send.mock.calls[lastCallIndex][0];
    const message = JSON.parse(lastCall);

    expect(message.type).toBe('strategy_stopped');
    expect(message.data.strategyId).toBe('test-strategy');
  });

  it('should pause a strategy correctly', () => {
    // Arrange
    executionService.startStrategy('test-strategy');

    // Act
    const result = executionService.pauseStrategy('test-strategy');

    // Assert
    expect(result).toBe(true);

    // Check if WebSocket broadcast happened
    const ws = executionService['webSocketServer'].clients.values().next().value;
    const lastCallIndex = ws.send.mock.calls.length - 1;
    const lastCall = ws.send.mock.calls[lastCallIndex][0];
    const message = JSON.parse(lastCall);

    expect(message.type).toBe('strategy_paused');
    expect(message.data.strategyId).toBe('test-strategy');
  });

  it('should process market data and generate signals', async () => {
    // Arrange
    executionService.startStrategy('test-strategy');

    // Act - Trigger market data polling manually
    await executionService['pollMarketData']('test-strategy');

    // Assert - Check if signal was generated and broadcast
    const ws = executionService['webSocketServer'].clients.values().next().value;

    // Find the strategy_signal message
    const signalMessages = ws.send.mock.calls
      .map(call => JSON.parse(call[0]))
      .filter(msg => msg.type === 'strategy_signal');

    expect(signalMessages.length).toBeGreaterThan(0);
    expect(signalMessages[0].data.action).toBe('BUY');
    expect(signalMessages[0].data.strategyId).toBe('test-strategy');
  });

  it('should handle WebSocket client connections', () => {
    // Arrange (the mocked constructor ignores the address argument)
    const mockWs = new WebSocket('ws://localhost');
    const mockMessage = JSON.stringify({ type: 'get_active_strategies' });

    // Act - Simulate connection and message
    executionService['webSocketServer'].emit('connection', mockWs);
    mockWs.emit('message', mockMessage);

    // Assert
    expect(mockWs.send).toHaveBeenCalled();

    // Check that the response is a strategy_status_list message
    const lastCall = mockWs.send.mock.calls[0][0];
    const message = JSON.parse(lastCall);
    expect(message.type).toBe('strategy_status_list');
  });

  it('should shut down correctly', () => {
    // Act
    executionService.shutdown();

    // Assert - WebSocket server should be closed
    const ws = executionService['webSocketServer'].clients.values().next().value;
    expect(ws.close).toHaveBeenCalled();
  });
});
@@ -0,0 +1,130 @@
import { describe, test, expect, beforeEach, mock } from 'bun:test';
import { MeanReversionStrategy } from '../../core/strategies/MeanReversionStrategy';
import { BarData } from '../../core/Strategy';

describe('MeanReversionStrategy', () => {
  let strategy: MeanReversionStrategy;
  let mockData: BarData[];

  beforeEach(() => {
    // Create a strategy instance with test parameters
    strategy = new MeanReversionStrategy(
      'test_id',
      'Test Mean Reversion',
      'A test strategy',
      ['AAPL'],
      {
        lookback: 20,
        entryDeviation: 1.5,
        exitDeviation: 0.5,
        lookbackPeriod: 100,
        positionSize: 0.2,
        stopLoss: 0.02,
        takeProfit: 0.05,
        useBollingerBands: true,
        bollingerPeriod: 20,
        bollingerDeviation: 2,
        rsiPeriod: 14,
        rsiOverbought: 70,
        rsiOversold: 30,
        useRsi: true
      }
    );

    // Create mock price data
    const now = new Date();
    mockData = [];

    // Create 100 bars of data with a mean-reverting pattern
    let price = 100;
    for (let i = 0; i < 100; i++) {
      // Add some mean reversion pattern (oscillating around 100)
      price = price + Math.sin(i / 10) * 5 + (Math.random() - 0.5) * 2;

      mockData.push({
        symbol: 'AAPL',
        timestamp: new Date(now.getTime() - (100 - i) * 60000), // 1-minute bars
        open: price - 0.5,
        high: price + 1,
        low: price - 1,
        close: price,
        volume: 1000 + Math.random() * 1000
      });
    }
  });

  test('should initialize with correct parameters', () => {
    expect(strategy.id).toBe('test_id');
    expect(strategy.name).toBe('Test Mean Reversion');
    expect(strategy.description).toBe('A test strategy');
    expect(strategy.symbols).toEqual(['AAPL']);
    expect(strategy.parameters.lookback).toBe(20);
    expect(strategy.parameters.entryDeviation).toBe(1.5);
  });

  test('should generate signals with vectorized calculation', async () => {
    // Arrange a price series with fake mean reversion
    const results = await strategy.runVectorized({
      symbols: ['AAPL'],
      data: { 'AAPL': mockData },
      initialCapital: 10000,
      startIndex: 20, // Skip the first 20 bars for indicator warmup
      endIndex: mockData.length - 1
    });

    // Assert
    expect(results).toBeDefined();
    expect(results.positions).toBeDefined();
    // Should generate at least one trade in this artificial data
    expect(results.trades.length).toBeGreaterThan(0);
    expect(results.equityCurve.length).toBeGreaterThan(0);
  });

  test('should calculate correct entry and exit signals', () => {
    // Mock the indicator calculations to test logic directly.
    // We'll create a simple scenario where price is 2 standard deviations away.
    const mockBar: BarData = {
      symbol: 'AAPL',
      timestamp: new Date(),
      open: 100,
      high: 102,
      low: 98,
      close: 100,
      volume: 1000
    };

    // Mock the calculation context
    const context = {
      mean: 100,
      stdDev: 5,
      upperBand: 110,
      lowerBand: 90,
      rsi: 25, // Oversold
      shouldEnterLong: true,
      shouldExitLong: false,
      shouldEnterShort: false,
      shouldExitShort: false
    };

    // Call the internal signal generation logic via a protected method
    // (for testing purposes, we're accessing a protected method)
    const result = (strategy as any).calculateSignals('AAPL', mockBar, context);

    // Assert the signals based on our scenario
    expect(result).toBeDefined();
    expect(result.action).toBe('BUY'); // Should buy in oversold condition
  });
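The mocked context above (`mean`, `stdDev`, `upperBand`, `lowerBand`) corresponds to a standard Bollinger-band computation over a closing-price window. A minimal sketch under that assumption (the function name and shape are illustrative, not the strategy's actual internals):

```typescript
// Bollinger bands over the trailing window: mean ± deviation * population stddev.
function bollinger(closes: number[], period: number, deviation: number) {
  const window = closes.slice(-period);
  const mean = window.reduce((sum, v) => sum + v, 0) / window.length;
  const variance =
    window.reduce((sum, v) => sum + (v - mean) ** 2, 0) / window.length;
  const stdDev = Math.sqrt(variance);
  return {
    mean,
    stdDev,
    upperBand: mean + deviation * stdDev,
    lowerBand: mean - deviation * stdDev
  };
}

const bands = bollinger([98, 99, 100, 101, 102], 5, 2);
// mean = 100, stdDev = sqrt(2), upperBand = 100 + 2*sqrt(2)
```

A mean-reversion entry like the one tested here fires when the close crosses below `lowerBand` (optionally confirmed by an oversold RSI), and the exit when price returns toward `mean`.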

  test('should handle empty data correctly', async () => {
    // Act & Assert
    await expect(async () => {
      await strategy.runVectorized({
        symbols: ['AAPL'],
        data: { 'AAPL': [] },
        initialCapital: 10000,
        startIndex: 0,
        endIndex: 0
      });
    }).not.toThrow();
  });
});
@@ -0,0 +1,256 @@
import { describe, test, expect, beforeEach } from 'bun:test';
import { BaseStrategy } from '../../core/Strategy';
import { StrategyRegistry, StrategyType } from '../../core/strategies/StrategyRegistry';
import { MovingAverageCrossover } from '../../core/strategies/MovingAverageCrossover';
import { MeanReversionStrategy } from '../../core/strategies/MeanReversionStrategy';
import { VectorizedStrategy } from '../../core/strategies/VectorizedStrategy';

describe('Strategy Registry', () => {
  let registry: StrategyRegistry;

  beforeEach(() => {
    // Reset the singleton for testing
    (StrategyRegistry as any).instance = null;
    registry = StrategyRegistry.getInstance();
  });

  test('should create a MovingAverageCrossover strategy', () => {
    // Arrange
    const id = 'test_id';
    const name = 'Test Strategy';
    const description = 'A test strategy';
    const symbols = ['AAPL', 'MSFT'];
    const parameters = { fastPeriod: 10, slowPeriod: 30 };

    // Act
    const strategy = registry.createStrategy(
      'MOVING_AVERAGE_CROSSOVER',
      id,
      name,
      description,
      symbols,
      parameters
    );

    // Assert
    expect(strategy).toBeInstanceOf(MovingAverageCrossover);
    expect(strategy.id).toEqual(id);
    expect(strategy.name).toEqual(name);
    expect(strategy.description).toEqual(description);
    expect(strategy.symbols).toEqual(symbols);
    expect(strategy.parameters).toMatchObject(parameters);
  });

  test('should create a MeanReversion strategy', () => {
    // Arrange
    const id = 'test_id';
    const name = 'Test Strategy';
    const description = 'A test strategy';
    const symbols = ['AAPL', 'MSFT'];
    const parameters = { lookback: 20, entryDeviation: 1.5 };

    // Act
    const strategy = registry.createStrategy(
      'MEAN_REVERSION',
      id,
      name,
      description,
      symbols,
      parameters
    );

    // Assert
    expect(strategy).toBeInstanceOf(MeanReversionStrategy);
    expect(strategy.id).toEqual(id);
    expect(strategy.name).toEqual(name);
    expect(strategy.description).toEqual(description);
    expect(strategy.symbols).toEqual(symbols);
    expect(strategy.parameters).toMatchObject(parameters);
  });

  test('should throw error for invalid strategy type', () => {
    // Arrange
    const id = 'test_id';
    const name = 'Test Strategy';
    const description = 'A test strategy';
    const symbols = ['AAPL', 'MSFT'];
    const parameters = {};

    // Act & Assert
    expect(() => {
      registry.createStrategy(
        'INVALID_TYPE' as StrategyType,
        id,
        name,
        description,
        symbols,
        parameters
      );
    }).toThrow("Strategy type 'INVALID_TYPE' is not registered");
  });

  test('should register a custom strategy', () => {
    // Arrange
    const mockStrategyFactory = (
      id: string,
      name: string,
      description: string,
      symbols: string[],
      parameters: any
    ) => {
      return new MovingAverageCrossover(id, name, description, symbols, parameters);
    };

    // Act
    registry.registerStrategy('CUSTOM' as StrategyType, mockStrategyFactory);

    // Assert
    expect(registry.hasStrategyType('CUSTOM')).toBe(true);

    const strategy = registry.createStrategy(
      'CUSTOM',
      'custom_id',
      'Custom Strategy',
      'A custom strategy',
      ['BTC/USD'],
      {}
    );

    expect(strategy).toBeInstanceOf(MovingAverageCrossover);
  });

  test('should get default parameters for a strategy type', () => {
    // Act
    const macParams = registry.getDefaultParameters('MOVING_AVERAGE_CROSSOVER');
    const mrParams = registry.getDefaultParameters('MEAN_REVERSION');

    // Assert
    expect(macParams).toHaveProperty('fastPeriod');
    expect(macParams).toHaveProperty('slowPeriod');
    expect(mrParams).toHaveProperty('lookback');
    expect(mrParams).toHaveProperty('entryDeviation');
  });

  test('should return empty object for unknown strategy default parameters', () => {
    // Act
    const params = registry.getDefaultParameters('CUSTOM' as StrategyType);

    // Assert
    expect(params).toEqual({});
  });

  test('should get all registered strategy types', () => {
    // Act
    const types = registry.getStrategyTypes();

    // Assert
    expect(types).toContain('MOVING_AVERAGE_CROSSOVER');
    expect(types).toContain('MEAN_REVERSION');
  });

  test('should check if strategy type is registered', () => {
    // Act & Assert
    expect(registry.hasStrategyType('MOVING_AVERAGE_CROSSOVER')).toBe(true);
    expect(registry.hasStrategyType('INVALID_TYPE' as StrategyType)).toBe(false);
  });

  test('should get all registered strategies', () => {
    // Arrange
    registry.createStrategy(
      'MOVING_AVERAGE_CROSSOVER',
      'mac_id',
      'MAC Strategy',
      'MAC strategy',
      ['AAPL'],
      {}
    );

    registry.createStrategy(
      'MEAN_REVERSION',
      'mr_id',
      'MR Strategy',
      'MR strategy',
      ['MSFT'],
      {}
    );

    // Act
    const strategies = registry.getAllStrategies();

    // Assert
    expect(strategies).toHaveLength(2);
    expect(strategies[0].id).toEqual('mac_id');
    expect(strategies[1].id).toEqual('mr_id');
  });

  test('should get strategy by ID', () => {
    // Arrange
    registry.createStrategy(
      'MOVING_AVERAGE_CROSSOVER',
      'mac_id',
      'MAC Strategy',
      'MAC strategy',
      ['AAPL'],
      {}
    );

    // Act
    const strategy = registry.getStrategyById('mac_id');
    const nonExistent = registry.getStrategyById('non_existent');

    // Assert
    expect(strategy).not.toBeNull();
    expect(strategy?.id).toEqual('mac_id');
    expect(nonExistent).toBeUndefined();
  });

  test('should delete strategy by ID', () => {
    // Arrange
    registry.createStrategy(
      'MOVING_AVERAGE_CROSSOVER',
      'mac_id',
      'MAC Strategy',
|
||||
'MAC strategy',
|
||||
['AAPL'],
|
||||
{}
|
||||
);
|
||||
|
||||
// Act
|
||||
const result1 = registry.deleteStrategy('mac_id');
|
||||
const result2 = registry.deleteStrategy('non_existent');
|
||||
|
||||
// Assert
|
||||
expect(result1).toBe(true);
|
||||
expect(result2).toBe(false);
|
||||
expect(registry.getStrategyById('mac_id')).toBeUndefined();
|
||||
});
|
||||
|
||||
test('should identify strategy type from instance', () => {
|
||||
// Arrange
|
||||
const macStrategy = registry.createStrategy(
|
||||
'MOVING_AVERAGE_CROSSOVER',
|
||||
'mac_id',
|
||||
'MAC Strategy',
|
||||
'MAC strategy',
|
||||
['AAPL'],
|
||||
{}
|
||||
);
|
||||
|
||||
const mrStrategy = registry.createStrategy(
|
||||
'MEAN_REVERSION',
|
||||
'mr_id',
|
||||
'MR Strategy',
|
||||
'MR strategy',
|
||||
['MSFT'],
|
||||
{}
|
||||
);
|
||||
|
||||
// Act
|
||||
const macType = registry.getStrategyType(macStrategy);
|
||||
const mrType = registry.getStrategyType(mrStrategy);
|
||||
|
||||
// Assert
|
||||
expect(macType).toEqual('MOVING_AVERAGE_CROSSOVER');
|
||||
expect(mrType).toEqual('MEAN_REVERSION');
|
||||
});
|
||||
});
|
||||
apps/interface-services/trading-dashboard/src/App.tsx
Normal file
@ -0,0 +1,165 @@
import { AfterViewInit, Component, ElementRef, Input, OnChanges, SimpleChanges, ViewChild } from '@angular/core';
import { CommonModule } from '@angular/common';
import { BacktestResult } from '../../../services/strategy.service';
import { Chart, ChartOptions } from 'chart.js/auto';

@Component({
  selector: 'app-drawdown-chart',
  standalone: true,
  imports: [CommonModule],
  template: `
    <div class="drawdown-chart-container">
      <canvas #drawdownChart></canvas>
    </div>
  `,
  styles: `
    .drawdown-chart-container {
      width: 100%;
      height: 300px;
      margin-bottom: 20px;
    }
  `
})
export class DrawdownChartComponent implements OnChanges, AfterViewInit {
  @Input() backtestResult?: BacktestResult;

  // Query the template's own canvas instead of document.querySelector('canvas'),
  // which would grab the first canvas on the page when several chart
  // components are rendered together.
  @ViewChild('drawdownChart') private chartRef?: ElementRef<HTMLCanvasElement>;

  private chart?: Chart;

  ngOnChanges(changes: SimpleChanges): void {
    if (changes['backtestResult'] && this.backtestResult) {
      this.renderChart();
    }
  }

  ngAfterViewInit(): void {
    if (this.backtestResult) {
      this.renderChart();
    }
  }

  private renderChart(): void {
    const canvas = this.chartRef?.nativeElement;
    if (!canvas || !this.backtestResult) return;

    // Clean up previous chart if it exists
    if (this.chart) {
      this.chart.destroy();
    }

    // Calculate drawdown series from daily returns
    const drawdownData = this.calculateDrawdownSeries(this.backtestResult);

    // Create chart
    this.chart = new Chart(canvas, {
      type: 'line',
      data: {
        labels: drawdownData.dates.map(date => this.formatDate(date)),
        datasets: [
          {
            label: 'Drawdown',
            data: drawdownData.drawdowns,
            borderColor: 'rgba(255, 99, 132, 1)',
            backgroundColor: 'rgba(255, 99, 132, 0.2)',
            fill: true,
            tension: 0.3,
            borderWidth: 2
          }
        ]
      },
      options: {
        responsive: true,
        maintainAspectRatio: false,
        scales: {
          x: {
            ticks: {
              maxTicksLimit: 12,
              maxRotation: 0,
              minRotation: 0
            },
            grid: {
              display: false
            }
          },
          y: {
            ticks: {
              callback: function(value) {
                return (Number(value) * 100).toFixed(1) + '%';
              }
            },
            grid: {
              color: 'rgba(200, 200, 200, 0.2)'
            },
            min: -0.05, // Show at least 5% drawdown for context
            suggestedMax: 0.01
          }
        },
        plugins: {
          tooltip: {
            mode: 'index',
            intersect: false,
            callbacks: {
              label: function(context) {
                let label = context.dataset.label || '';
                if (label) {
                  label += ': ';
                }
                if (context.parsed.y !== null) {
                  label += (context.parsed.y * 100).toFixed(2) + '%';
                }
                return label;
              }
            }
          },
          legend: {
            position: 'top',
          }
        }
      } as ChartOptions
    });
  }

  private calculateDrawdownSeries(result: BacktestResult): {
    dates: Date[];
    drawdowns: number[];
  } {
    const dates: Date[] = [];
    const drawdowns: number[] = [];

    // Sort daily returns by date
    const sortedReturns = [...result.dailyReturns].sort(
      (a, b) => new Date(a.date).getTime() - new Date(b.date).getTime()
    );

    // Calculate equity curve
    let equity = 1;
    const equityCurve: number[] = [];

    for (const daily of sortedReturns) {
      equity *= (1 + daily.return);
      equityCurve.push(equity);
      dates.push(new Date(daily.date));
    }

    // Calculate running maximum (high water mark)
    let hwm = equityCurve[0];

    for (let i = 0; i < equityCurve.length; i++) {
      // Update high water mark
      hwm = Math.max(hwm, equityCurve[i]);
      // Calculate drawdown as percentage from high water mark
      const drawdown = (equityCurve[i] / hwm) - 1;
      drawdowns.push(drawdown);
    }

    return { dates, drawdowns };
  }

  private formatDate(date: Date): string {
    return new Date(date).toLocaleDateString('en-US', {
      month: 'short',
      day: 'numeric',
      year: 'numeric'
    });
  }
}
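As an aside, the drawdown math in `calculateDrawdownSeries` can be exercised in isolation; the following is a hypothetical standalone helper (not part of this commit) that compounds daily fractional returns and reports each day's drawdown from the running high-water mark:

```typescript
// Hypothetical sketch of the component's drawdown calculation.
// Input: daily fractional returns (0.01 = +1%); output: drawdown per day,
// expressed as a fraction at or below zero relative to the high-water mark.
function drawdownSeries(dailyReturns: number[]): number[] {
  let equity = 1;
  let hwm = 1; // high-water mark, starting at the initial equity
  const drawdowns: number[] = [];
  for (const r of dailyReturns) {
    equity *= 1 + r;                  // compound the equity curve
    hwm = Math.max(hwm, equity);      // running maximum
    drawdowns.push(equity / hwm - 1); // 0 at new highs, negative otherwise
  }
  return drawdowns;
}

// Example: +100% then -50% gives drawdowns [0, -0.5].
console.log(drawdownSeries([1, -0.5]));
```

One deliberate difference: this sketch seeds the high-water mark at the initial equity of 1, so a losing first day already counts as drawdown, whereas the component seeds it at `equityCurve[0]`.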

@ -0,0 +1,171 @@
import { AfterViewInit, Component, ElementRef, Input, OnChanges, SimpleChanges, ViewChild } from '@angular/core';
import { CommonModule } from '@angular/common';
import { BacktestResult } from '../../../services/strategy.service';
import { Chart, ChartOptions } from 'chart.js/auto';

@Component({
  selector: 'app-equity-chart',
  standalone: true,
  imports: [CommonModule],
  template: `
    <div class="equity-chart-container">
      <canvas #equityChart></canvas>
    </div>
  `,
  styles: `
    .equity-chart-container {
      width: 100%;
      height: 400px;
      margin-bottom: 20px;
    }
  `
})
export class EquityChartComponent implements OnChanges, AfterViewInit {
  @Input() backtestResult?: BacktestResult;

  // Query the template's own canvas instead of document.querySelector('canvas'),
  // which would grab the first canvas on the page when several chart
  // components are rendered together.
  @ViewChild('equityChart') private chartRef?: ElementRef<HTMLCanvasElement>;

  private chart?: Chart;

  ngOnChanges(changes: SimpleChanges): void {
    if (changes['backtestResult'] && this.backtestResult) {
      this.renderChart();
    }
  }

  ngAfterViewInit(): void {
    if (this.backtestResult) {
      this.renderChart();
    }
  }

  private renderChart(): void {
    const canvas = this.chartRef?.nativeElement;
    if (!canvas || !this.backtestResult) return;

    // Clean up previous chart if it exists
    if (this.chart) {
      this.chart.destroy();
    }

    // Prepare data
    const equityCurve = this.calculateEquityCurve(this.backtestResult);

    // Create chart
    this.chart = new Chart(canvas, {
      type: 'line',
      data: {
        labels: equityCurve.dates.map(date => this.formatDate(date)),
        datasets: [
          {
            label: 'Portfolio Value',
            data: equityCurve.values,
            borderColor: 'rgba(75, 192, 192, 1)',
            backgroundColor: 'rgba(75, 192, 192, 0.2)',
            tension: 0.3,
            borderWidth: 2,
            fill: true
          },
          {
            label: 'Benchmark',
            data: equityCurve.benchmark,
            borderColor: 'rgba(153, 102, 255, 0.5)',
            backgroundColor: 'rgba(153, 102, 255, 0.1)',
            borderDash: [5, 5],
            tension: 0.3,
            borderWidth: 1,
            fill: false
          }
        ]
      },
      options: {
        responsive: true,
        maintainAspectRatio: false,
        scales: {
          x: {
            ticks: {
              maxTicksLimit: 12,
              maxRotation: 0,
              minRotation: 0
            },
            grid: {
              display: false
            }
          },
          y: {
            ticks: {
              callback: function(value) {
                return '$' + Number(value).toLocaleString();
              }
            },
            grid: {
              color: 'rgba(200, 200, 200, 0.2)'
            }
          }
        },
        plugins: {
          tooltip: {
            mode: 'index',
            intersect: false,
            callbacks: {
              label: function(context) {
                let label = context.dataset.label || '';
                if (label) {
                  label += ': ';
                }
                if (context.parsed.y !== null) {
                  label += new Intl.NumberFormat('en-US', { style: 'currency', currency: 'USD' })
                    .format(context.parsed.y);
                }
                return label;
              }
            }
          },
          legend: {
            position: 'top',
          }
        }
      } as ChartOptions
    });
  }

  private calculateEquityCurve(result: BacktestResult): {
    dates: Date[];
    values: number[];
    benchmark: number[];
  } {
    const initialValue = result.initialCapital;
    const dates: Date[] = [];
    const values: number[] = [];
    const benchmark: number[] = [];

    // Sort daily returns by date
    const sortedReturns = [...result.dailyReturns].sort(
      (a, b) => new Date(a.date).getTime() - new Date(b.date).getTime()
    );

    // Calculate cumulative portfolio values
    let portfolioValue = initialValue;
    let benchmarkValue = initialValue;

    for (const daily of sortedReturns) {
      const date = new Date(daily.date);
      portfolioValue = portfolioValue * (1 + daily.return);
      // Simple benchmark (e.g., assuming 8% annualized return for a market index)
      benchmarkValue = benchmarkValue * (1 + 0.08 / 365);

      dates.push(date);
      values.push(portfolioValue);
      benchmark.push(benchmarkValue);
    }

    return { dates, values, benchmark };
  }

  private formatDate(date: Date): string {
    return new Date(date).toLocaleDateString('en-US', {
      month: 'short',
      day: 'numeric',
      year: 'numeric'
    });
  }
}
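The equity-curve loop in `calculateEquityCurve` can likewise be sketched as a standalone function; this hypothetical helper (not part of this commit) compounds strategy returns against the same flat-rate benchmark assumption:

```typescript
// Hypothetical sketch of the component's equity-curve calculation.
// The benchmark compounds at annualRate / 365 per data point, mirroring
// the component's simplifying calendar-day assumption.
function equityCurve(
  initialCapital: number,
  dailyReturns: number[],
  benchmarkAnnualRate = 0.08
): { values: number[]; benchmark: number[] } {
  const values: number[] = [];
  const benchmark: number[] = [];
  let portfolio = initialCapital;
  let bench = initialCapital;
  for (const r of dailyReturns) {
    portfolio *= 1 + r;                     // compound strategy returns
    bench *= 1 + benchmarkAnnualRate / 365; // flat-rate benchmark
    values.push(portfolio);
    benchmark.push(bench);
  }
  return { values, benchmark };
}
```

Note that dividing by 365 calendar days while `dailyReturns` typically holds roughly 252 trading days per year slightly understates the benchmark; switching the divisor to 252 would be a one-line change.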

@ -0,0 +1,258 @@
import { Component, Input } from '@angular/core';
import { CommonModule } from '@angular/common';
import { MatCardModule } from '@angular/material/card';
import { MatGridListModule } from '@angular/material/grid-list';
import { MatDividerModule } from '@angular/material/divider';
import { MatTooltipModule } from '@angular/material/tooltip';
import { BacktestResult } from '../../../services/strategy.service';

@Component({
  selector: 'app-performance-metrics',
  standalone: true,
  imports: [
    CommonModule,
    MatCardModule,
    MatGridListModule,
    MatDividerModule,
    MatTooltipModule
  ],
  template: `
    <mat-card class="metrics-card">
      <mat-card-header>
        <mat-card-title>Performance Metrics</mat-card-title>
      </mat-card-header>
      <mat-card-content>
        <div class="metrics-grid">
          <div class="metric-group">
            <h3>Returns</h3>
            <div class="metrics-row">
              <div class="metric">
                <div class="metric-name" matTooltip="Total return over the backtest period">Total Return</div>
                <div class="metric-value" [ngClass]="getReturnClass(backtestResult?.totalReturn || 0)">
                  {{formatPercent(backtestResult?.totalReturn || 0)}}
                </div>
              </div>
              <div class="metric">
                <div class="metric-name" matTooltip="Annualized return (adjusted for the backtest duration)">Annualized Return</div>
                <div class="metric-value" [ngClass]="getReturnClass(backtestResult?.annualizedReturn || 0)">
                  {{formatPercent(backtestResult?.annualizedReturn || 0)}}
                </div>
              </div>
              <div class="metric">
                <div class="metric-name" matTooltip="Compound Annual Growth Rate">CAGR</div>
                <div class="metric-value" [ngClass]="getReturnClass(backtestResult?.cagr || 0)">
                  {{formatPercent(backtestResult?.cagr || 0)}}
                </div>
              </div>
            </div>
          </div>

          <mat-divider></mat-divider>

          <div class="metric-group">
            <h3>Risk Metrics</h3>
            <div class="metrics-row">
              <div class="metric">
                <div class="metric-name" matTooltip="Maximum peak-to-valley drawdown">Max Drawdown</div>
                <div class="metric-value negative">
                  {{formatPercent(backtestResult?.maxDrawdown || 0)}}
                </div>
              </div>
              <div class="metric">
                <div class="metric-name" matTooltip="Number of days in the worst drawdown">Max DD Duration</div>
                <div class="metric-value">
                  {{formatDays(backtestResult?.maxDrawdownDuration || 0)}}
                </div>
              </div>
              <div class="metric">
                <div class="metric-name" matTooltip="Annualized standard deviation of returns">Volatility</div>
                <div class="metric-value">
                  {{formatPercent(backtestResult?.volatility || 0)}}
                </div>
              </div>
              <div class="metric">
                <div class="metric-name" matTooltip="Square root of the sum of the squares of drawdowns">Ulcer Index</div>
                <div class="metric-value">
                  {{(backtestResult?.ulcerIndex || 0).toFixed(4)}}
                </div>
              </div>
            </div>
          </div>

          <mat-divider></mat-divider>

          <div class="metric-group">
            <h3>Risk-Adjusted Returns</h3>
            <div class="metrics-row">
              <div class="metric">
                <div class="metric-name" matTooltip="Excess return per unit of risk">Sharpe Ratio</div>
                <div class="metric-value" [ngClass]="getRatioClass(backtestResult?.sharpeRatio || 0)">
                  {{(backtestResult?.sharpeRatio || 0).toFixed(2)}}
                </div>
              </div>
              <div class="metric">
                <div class="metric-name" matTooltip="Return per unit of downside risk">Sortino Ratio</div>
                <div class="metric-value" [ngClass]="getRatioClass(backtestResult?.sortinoRatio || 0)">
                  {{(backtestResult?.sortinoRatio || 0).toFixed(2)}}
                </div>
              </div>
              <div class="metric">
                <div class="metric-name" matTooltip="Return per unit of max drawdown">Calmar Ratio</div>
                <div class="metric-value" [ngClass]="getRatioClass(backtestResult?.calmarRatio || 0)">
                  {{(backtestResult?.calmarRatio || 0).toFixed(2)}}
                </div>
              </div>
              <div class="metric">
                <div class="metric-name" matTooltip="Probability-weighted ratio of gains vs. losses">Omega Ratio</div>
                <div class="metric-value" [ngClass]="getRatioClass(backtestResult?.omegaRatio || 0)">
                  {{(backtestResult?.omegaRatio || 0).toFixed(2)}}
                </div>
              </div>
            </div>
          </div>

          <mat-divider></mat-divider>

          <div class="metric-group">
            <h3>Trade Statistics</h3>
            <div class="metrics-row">
              <div class="metric">
                <div class="metric-name" matTooltip="Total number of trades">Total Trades</div>
                <div class="metric-value">
                  {{backtestResult?.totalTrades || 0}}
                </div>
              </div>
              <div class="metric">
                <div class="metric-name" matTooltip="Percentage of winning trades">Win Rate</div>
                <div class="metric-value" [ngClass]="getWinRateClass(backtestResult?.winRate || 0)">
                  {{formatPercent(backtestResult?.winRate || 0)}}
                </div>
              </div>
              <div class="metric">
                <div class="metric-name" matTooltip="Average profit of winning trades">Avg Win</div>
                <div class="metric-value positive">
                  {{formatPercent(backtestResult?.averageWinningTrade || 0)}}
                </div>
              </div>
              <div class="metric">
                <div class="metric-name" matTooltip="Average loss of losing trades">Avg Loss</div>
                <div class="metric-value negative">
                  {{formatPercent(backtestResult?.averageLosingTrade || 0)}}
                </div>
              </div>
              <div class="metric">
                <div class="metric-name" matTooltip="Ratio of total gains to total losses">Profit Factor</div>
                <div class="metric-value" [ngClass]="getProfitFactorClass(backtestResult?.profitFactor || 0)">
                  {{(backtestResult?.profitFactor || 0).toFixed(2)}}
                </div>
              </div>
            </div>
          </div>
        </div>
      </mat-card-content>
    </mat-card>
  `,
  styles: `
    .metrics-card {
      margin-bottom: 20px;
    }

    .metrics-grid {
      display: flex;
      flex-direction: column;
      gap: 16px;
    }

    .metric-group {
      padding: 10px 0;
    }

    .metric-group h3 {
      margin-top: 0;
      margin-bottom: 16px;
      font-size: 16px;
      font-weight: 500;
      color: #555;
    }

    .metrics-row {
      display: flex;
      flex-wrap: wrap;
      gap: 24px;
    }

    .metric {
      min-width: 120px;
      margin-bottom: 16px;
    }

    .metric-name {
      font-size: 12px;
      color: #666;
      margin-bottom: 4px;
    }

    .metric-value {
      font-size: 16px;
      font-weight: 500;
    }

    .positive {
      color: #4CAF50;
    }

    .negative {
      color: #F44336;
    }

    .neutral {
      color: #FFA000;
    }

    mat-divider {
      margin: 8px 0;
    }
  `
})
export class PerformanceMetricsComponent {
  @Input() backtestResult?: BacktestResult;

  // Formatting helpers
  formatPercent(value: number): string {
    return new Intl.NumberFormat('en-US', {
      style: 'percent',
      minimumFractionDigits: 2,
      maximumFractionDigits: 2
    }).format(value);
  }

  formatDays(days: number): string {
    return `${days} days`;
  }

  // Conditional classes
  getReturnClass(value: number): string {
    if (value > 0) return 'positive';
    if (value < 0) return 'negative';
    return '';
  }

  getRatioClass(value: number): string {
    if (value >= 1.5) return 'positive';
    if (value >= 1) return 'neutral';
    if (value < 0) return 'negative';
    return '';
  }

  getWinRateClass(value: number): string {
    if (value >= 0.55) return 'positive';
    if (value >= 0.45) return 'neutral';
    return 'negative';
  }

  getProfitFactorClass(value: number): string {
    if (value >= 1.5) return 'positive';
    if (value >= 1) return 'neutral';
    return 'negative';
  }
}

@ -0,0 +1,221 @@
import { Component, Input } from '@angular/core';
import { CommonModule } from '@angular/common';
import { MatTableModule } from '@angular/material/table';
import { MatSortModule, Sort } from '@angular/material/sort';
import { MatPaginatorModule, PageEvent } from '@angular/material/paginator';
import { MatCardModule } from '@angular/material/card';
import { MatIconModule } from '@angular/material/icon';
import { BacktestResult } from '../../../services/strategy.service';

@Component({
  selector: 'app-trades-table',
  standalone: true,
  imports: [
    CommonModule,
    MatTableModule,
    MatSortModule,
    MatPaginatorModule,
    MatCardModule,
    MatIconModule
  ],
  template: `
    <mat-card class="trades-card">
      <mat-card-header>
        <mat-card-title>Trades</mat-card-title>
      </mat-card-header>
      <mat-card-content>
        <table mat-table [dataSource]="displayedTrades" matSort (matSortChange)="sortData($event)" class="trades-table">

          <!-- Symbol Column -->
          <ng-container matColumnDef="symbol">
            <th mat-header-cell *matHeaderCellDef mat-sort-header> Symbol </th>
            <td mat-cell *matCellDef="let trade"> {{trade.symbol}} </td>
          </ng-container>

          <!-- Entry Date Column -->
          <ng-container matColumnDef="entryTime">
            <th mat-header-cell *matHeaderCellDef mat-sort-header> Entry Time </th>
            <td mat-cell *matCellDef="let trade"> {{formatDate(trade.entryTime)}} </td>
          </ng-container>

          <!-- Entry Price Column -->
          <ng-container matColumnDef="entryPrice">
            <th mat-header-cell *matHeaderCellDef mat-sort-header> Entry Price </th>
            <td mat-cell *matCellDef="let trade"> {{formatCurrency(trade.entryPrice)}} </td>
          </ng-container>

          <!-- Exit Date Column -->
          <ng-container matColumnDef="exitTime">
            <th mat-header-cell *matHeaderCellDef mat-sort-header> Exit Time </th>
            <td mat-cell *matCellDef="let trade"> {{formatDate(trade.exitTime)}} </td>
          </ng-container>

          <!-- Exit Price Column -->
          <ng-container matColumnDef="exitPrice">
            <th mat-header-cell *matHeaderCellDef mat-sort-header> Exit Price </th>
            <td mat-cell *matCellDef="let trade"> {{formatCurrency(trade.exitPrice)}} </td>
          </ng-container>

          <!-- Quantity Column -->
          <ng-container matColumnDef="quantity">
            <th mat-header-cell *matHeaderCellDef mat-sort-header> Quantity </th>
            <td mat-cell *matCellDef="let trade"> {{trade.quantity}} </td>
          </ng-container>

          <!-- P&L Column -->
          <ng-container matColumnDef="pnl">
            <th mat-header-cell *matHeaderCellDef mat-sort-header> P&L </th>
            <td mat-cell *matCellDef="let trade"
                [ngClass]="{'positive': trade.pnl > 0, 'negative': trade.pnl < 0}">
              {{formatCurrency(trade.pnl)}}
            </td>
          </ng-container>

          <!-- P&L Percent Column -->
          <ng-container matColumnDef="pnlPercent">
            <th mat-header-cell *matHeaderCellDef mat-sort-header> P&L % </th>
            <td mat-cell *matCellDef="let trade"
                [ngClass]="{'positive': trade.pnlPercent > 0, 'negative': trade.pnlPercent < 0}">
              {{formatPercent(trade.pnlPercent)}}
            </td>
          </ng-container>

          <tr mat-header-row *matHeaderRowDef="displayedColumns"></tr>
          <tr mat-row *matRowDef="let row; columns: displayedColumns;"></tr>
        </table>

        <mat-paginator
          [length]="totalTrades"
          [pageSize]="pageSize"
          [pageSizeOptions]="[5, 10, 25, 50]"
          (page)="pageChange($event)"
          aria-label="Select page">
        </mat-paginator>
      </mat-card-content>
    </mat-card>
  `,
  styles: `
    .trades-card {
      margin-bottom: 20px;
    }

    .trades-table {
      width: 100%;
      border-collapse: collapse;
    }

    .mat-column-pnl, .mat-column-pnlPercent {
      text-align: right;
      font-weight: 500;
    }

    .positive {
      color: #4CAF50;
    }

    .negative {
      color: #F44336;
    }

    .mat-mdc-row:hover {
      background-color: rgba(0, 0, 0, 0.04);
    }
  `
})
export class TradesTableComponent {
  @Input() set backtestResult(value: BacktestResult | undefined) {
    if (value) {
      this._backtestResult = value;
      this.updateDisplayedTrades();
    }
  }

  get backtestResult(): BacktestResult | undefined {
    return this._backtestResult;
  }

  private _backtestResult?: BacktestResult;

  // Table configuration
  displayedColumns: string[] = [
    'symbol', 'entryTime', 'entryPrice', 'exitTime',
    'exitPrice', 'quantity', 'pnl', 'pnlPercent'
  ];

  // Pagination
  pageSize = 10;
  currentPage = 0;
  displayedTrades: any[] = [];

  get totalTrades(): number {
    return this._backtestResult?.trades.length || 0;
  }

  // Sort the trades
  sortData(sort: Sort): void {
    if (!sort.active || sort.direction === '') {
      this.updateDisplayedTrades();
      return;
    }

    const data = this._backtestResult?.trades.slice() || [];

    this.displayedTrades = data.sort((a, b) => {
      const isAsc = sort.direction === 'asc';
      switch (sort.active) {
        case 'symbol': return this.compare(a.symbol, b.symbol, isAsc);
        case 'entryTime': return this.compare(new Date(a.entryTime).getTime(), new Date(b.entryTime).getTime(), isAsc);
        case 'entryPrice': return this.compare(a.entryPrice, b.entryPrice, isAsc);
        case 'exitTime': return this.compare(new Date(a.exitTime).getTime(), new Date(b.exitTime).getTime(), isAsc);
        case 'exitPrice': return this.compare(a.exitPrice, b.exitPrice, isAsc);
        case 'quantity': return this.compare(a.quantity, b.quantity, isAsc);
        case 'pnl': return this.compare(a.pnl, b.pnl, isAsc);
        case 'pnlPercent': return this.compare(a.pnlPercent, b.pnlPercent, isAsc);
        default: return 0;
      }
    }).slice(this.currentPage * this.pageSize, (this.currentPage + 1) * this.pageSize);
  }

  // Handle page changes
  pageChange(event: PageEvent): void {
    this.pageSize = event.pageSize;
    this.currentPage = event.pageIndex;
    this.updateDisplayedTrades();
  }

  // Update displayed trades based on current page and page size
  updateDisplayedTrades(): void {
    if (this._backtestResult) {
      this.displayedTrades = this._backtestResult.trades.slice(
        this.currentPage * this.pageSize,
        (this.currentPage + 1) * this.pageSize
      );
    } else {
      this.displayedTrades = [];
    }
  }

  // Helper methods for formatting
  formatDate(date: Date | string): string {
    return new Date(date).toLocaleString();
  }

  formatCurrency(value: number): string {
    return new Intl.NumberFormat('en-US', {
      style: 'currency',
      currency: 'USD',
    }).format(value);
  }

  formatPercent(value: number): string {
    return new Intl.NumberFormat('en-US', {
      style: 'percent',
      minimumFractionDigits: 2,
      maximumFractionDigits: 2
    }).format(value);
  }

  private compare(a: number | string, b: number | string, isAsc: boolean): number {
    if (a === b) return 0; // keep equal values in place instead of forcing an order
    return (a < b ? -1 : 1) * (isAsc ? 1 : -1);
  }
}

@ -0,0 +1,185 @@
import { Component, Inject, OnInit } from '@angular/core';
import { CommonModule } from '@angular/common';
import {
  FormBuilder,
  FormGroup,
  ReactiveFormsModule,
  Validators
} from '@angular/forms';
import { MatButtonModule } from '@angular/material/button';
import { MatDialogModule, MAT_DIALOG_DATA, MatDialogRef } from '@angular/material/dialog';
import { MatFormFieldModule } from '@angular/material/form-field';
import { MatInputModule } from '@angular/material/input';
import { MatSelectModule } from '@angular/material/select';
import { MatDatepickerModule } from '@angular/material/datepicker';
import { MatNativeDateModule } from '@angular/material/core';
import { MatProgressBarModule } from '@angular/material/progress-bar';
import { MatTabsModule } from '@angular/material/tabs';
import { MatChipsModule } from '@angular/material/chips';
import { MatIconModule } from '@angular/material/icon';
import { MatSlideToggleModule } from '@angular/material/slide-toggle';
import {
  BacktestRequest,
  BacktestResult,
  StrategyService,
  TradingStrategy
} from '../../../services/strategy.service';

@Component({
  selector: 'app-backtest-dialog',
  standalone: true,
  imports: [
    CommonModule,
    ReactiveFormsModule,
    MatButtonModule,
    MatDialogModule,
    MatFormFieldModule,
    MatInputModule,
    MatSelectModule,
    MatDatepickerModule,
    MatNativeDateModule,
    MatProgressBarModule,
    MatTabsModule,
    MatChipsModule,
    MatIconModule,
    MatSlideToggleModule
  ],
  templateUrl: './backtest-dialog.component.html',
  styleUrl: './backtest-dialog.component.css'
})
export class BacktestDialogComponent implements OnInit {
  backtestForm: FormGroup;
  strategyTypes: string[] = [];
  availableSymbols: string[] = ['AAPL', 'MSFT', 'GOOGL', 'AMZN', 'TSLA', 'META', 'NVDA', 'SPY', 'QQQ'];
  selectedSymbols: string[] = [];
  parameters: Record<string, any> = {};
  isRunning: boolean = false;
  backtestResult: BacktestResult | null = null;

  constructor(
    private fb: FormBuilder,
    private strategyService: StrategyService,
    @Inject(MAT_DIALOG_DATA) public data: TradingStrategy | null,
    private dialogRef: MatDialogRef<BacktestDialogComponent>
  ) {
    // Initialize form with defaults
    this.backtestForm = this.fb.group({
      strategyType: ['', [Validators.required]],
      startDate: [new Date(new Date().setFullYear(new Date().getFullYear() - 1)), [Validators.required]],
      endDate: [new Date(), [Validators.required]],
      initialCapital: [100000, [Validators.required, Validators.min(1000)]],
      dataResolution: ['1d', [Validators.required]],
      commission: [0.001, [Validators.required, Validators.min(0), Validators.max(0.1)]],
      slippage: [0.0005, [Validators.required, Validators.min(0), Validators.max(0.1)]],
      mode: ['event', [Validators.required]]
    });

    // If strategy is provided, pre-populate the form
    if (data) {
      this.selectedSymbols = [...data.symbols];
      this.backtestForm.patchValue({
        strategyType: data.type
      });
      this.parameters = {...data.parameters};
    }
  }

  ngOnInit(): void {
    this.loadStrategyTypes();
  }

  loadStrategyTypes(): void {
    this.strategyService.getStrategyTypes().subscribe({
      next: (response) => {
        if (response.success) {
          this.strategyTypes = response.data;

          // If strategy is provided, load its parameters
          if (this.data) {
            this.onStrategyTypeChange(this.data.type);
          }
        }
      },
      error: (error) => {
        console.error('Error loading strategy types:', error);
        this.strategyTypes = ['MOVING_AVERAGE_CROSSOVER', 'MEAN_REVERSION', 'CUSTOM'];
      }
    });
  }

  onStrategyTypeChange(type: string): void {
    // Get default parameters for this strategy type
    this.strategyService.getStrategyParameters(type).subscribe({
      next: (response) => {
        if (response.success) {
          // If strategy is provided, merge default with existing
          if (this.data) {
            this.parameters = {
              ...response.data,
              ...this.data.parameters
            };
          } else {
            this.parameters = response.data;
          }
        }
      },
      error: (error) => {
        console.error('Error loading parameters:', error);
        this.parameters = {};
      }
    });
  }

  addSymbol(symbol: string): void {
    if (!symbol || this.selectedSymbols.includes(symbol)) return;
    this.selectedSymbols.push(symbol);
  }

  removeSymbol(symbol: string): void {
    this.selectedSymbols = this.selectedSymbols.filter(s => s !== symbol);
  }

  updateParameter(key: string, value: any): void {
    this.parameters[key] = value;
  }

  onSubmit(): void {
    if (this.backtestForm.invalid || this.selectedSymbols.length === 0) {
      return;
    }

    const formValue = this.backtestForm.value;
|
||||
|
||||
const backtestRequest: BacktestRequest = {
|
||||
strategyType: formValue.strategyType,
|
||||
strategyParams: this.parameters,
|
||||
symbols: this.selectedSymbols,
|
||||
startDate: formValue.startDate,
|
||||
endDate: formValue.endDate,
|
||||
initialCapital: formValue.initialCapital,
|
||||
dataResolution: formValue.dataResolution,
|
||||
commission: formValue.commission,
|
||||
slippage: formValue.slippage,
|
||||
mode: formValue.mode
|
||||
};
|
||||
|
||||
this.isRunning = true;
|
||||
|
||||
this.strategyService.runBacktest(backtestRequest).subscribe({
|
||||
next: (response) => {
|
||||
this.isRunning = false;
|
||||
if (response.success) {
|
||||
this.backtestResult = response.data;
|
||||
}
|
||||
},
|
||||
error: (error) => {
|
||||
this.isRunning = false;
|
||||
console.error('Backtest error:', error);
|
||||
}
|
||||
});
|
||||
}
|
||||
|
||||
close(): void {
|
||||
this.dialogRef.close(this.backtestResult);
|
||||
}
|
||||
}
|
||||
|
|
@@ -0,0 +1,84 @@
<h2 mat-dialog-title>{{isEditMode ? 'Edit Strategy' : 'Create Strategy'}}</h2>

<form [formGroup]="strategyForm" (ngSubmit)="onSubmit()">
  <mat-dialog-content class="mat-typography">
    <div class="grid grid-cols-1 gap-4">
      <!-- Basic Strategy Information -->
      <mat-form-field appearance="outline" class="w-full">
        <mat-label>Strategy Name</mat-label>
        <input matInput formControlName="name" placeholder="e.g., My Moving Average Crossover">
        <mat-error *ngIf="strategyForm.get('name')?.invalid">Name is required</mat-error>
      </mat-form-field>

      <mat-form-field appearance="outline" class="w-full">
        <mat-label>Description</mat-label>
        <textarea matInput formControlName="description" rows="3"
                  placeholder="Describe what this strategy does..."></textarea>
      </mat-form-field>

      <mat-form-field appearance="outline" class="w-full">
        <mat-label>Strategy Type</mat-label>
        <mat-select formControlName="type" (selectionChange)="onStrategyTypeChange($event.value)">
          <mat-option *ngFor="let type of strategyTypes" [value]="type">
            {{type}}
          </mat-option>
        </mat-select>
        <mat-error *ngIf="strategyForm.get('type')?.invalid">Strategy type is required</mat-error>
      </mat-form-field>

      <!-- Symbol Selection -->
      <div class="w-full">
        <label class="text-sm">Trading Symbols</label>
        <div class="flex flex-wrap gap-2 mb-2">
          <mat-chip *ngFor="let symbol of selectedSymbols" [removable]="true"
                    (removed)="removeSymbol(symbol)">
            {{symbol}}
            <mat-icon matChipRemove>cancel</mat-icon>
          </mat-chip>
        </div>

        <div class="flex gap-2">
          <mat-form-field appearance="outline" class="flex-1">
            <mat-label>Add Symbol</mat-label>
            <input matInput #symbolInput placeholder="e.g., AAPL">
          </mat-form-field>
          <button type="button" mat-raised-button color="primary"
                  (click)="addSymbol(symbolInput.value); symbolInput.value = ''">
            Add
          </button>
        </div>

        <div class="mt-2">
          <p class="text-sm text-gray-500 mb-1">Suggested symbols:</p>
          <div class="flex flex-wrap gap-2">
            <button type="button" *ngFor="let symbol of availableSymbols"
                    mat-stroked-button (click)="addSymbol(symbol)"
                    [disabled]="selectedSymbols.includes(symbol)">
              {{symbol}}
            </button>
          </div>
        </div>
      </div>

      <!-- Dynamic Strategy Parameters -->
      <!-- Note: `Object` is not resolvable inside Angular template expressions,
           so the keyvalue pipe is used for the emptiness check instead. -->
      <div *ngIf="strategyForm.get('type')?.value && (parameters | keyvalue)?.length">
        <h3 class="text-lg font-semibold mb-2">Strategy Parameters</h3>
        <div class="grid grid-cols-1 md:grid-cols-2 gap-4">
          <mat-form-field *ngFor="let param of parameters | keyvalue" appearance="outline">
            <mat-label>{{param.key}}</mat-label>
            <input matInput [value]="param.value"
                   (input)="updateParameter(param.key, $any($event.target).value)">
          </mat-form-field>
        </div>
      </div>
    </div>
  </mat-dialog-content>

  <mat-dialog-actions align="end">
    <button mat-button mat-dialog-close>Cancel</button>
    <button mat-raised-button color="primary" type="submit"
            [disabled]="strategyForm.invalid || selectedSymbols.length === 0">
      {{isEditMode ? 'Update' : 'Create'}}
    </button>
  </mat-dialog-actions>
</form>

@@ -0,0 +1,178 @@
import { Component, Inject, OnInit } from '@angular/core';
import { CommonModule } from '@angular/common';
import {
  FormBuilder,
  FormGroup,
  ReactiveFormsModule,
  Validators
} from '@angular/forms';
import { MatButtonModule } from '@angular/material/button';
import { MatDialogModule, MAT_DIALOG_DATA, MatDialogRef } from '@angular/material/dialog';
import { MatFormFieldModule } from '@angular/material/form-field';
import { MatInputModule } from '@angular/material/input';
import { MatSelectModule } from '@angular/material/select';
import { MatChipsModule } from '@angular/material/chips';
import { MatIconModule } from '@angular/material/icon';
import { COMMA, ENTER } from '@angular/cdk/keycodes';
import { MatAutocompleteModule } from '@angular/material/autocomplete';
import {
  StrategyService,
  TradingStrategy
} from '../../../services/strategy.service';

@Component({
  selector: 'app-strategy-dialog',
  standalone: true,
  imports: [
    CommonModule,
    ReactiveFormsModule,
    MatButtonModule,
    MatDialogModule,
    MatFormFieldModule,
    MatInputModule,
    MatSelectModule,
    MatChipsModule,
    MatIconModule,
    MatAutocompleteModule
  ],
  templateUrl: './strategy-dialog.component.html',
  styleUrl: './strategy-dialog.component.css'
})
export class StrategyDialogComponent implements OnInit {
  strategyForm: FormGroup;
  isEditMode: boolean = false;
  strategyTypes: string[] = [];
  availableSymbols: string[] = ['AAPL', 'MSFT', 'GOOGL', 'AMZN', 'TSLA', 'META', 'NVDA', 'SPY', 'QQQ'];
  selectedSymbols: string[] = [];
  separatorKeysCodes: number[] = [ENTER, COMMA];
  parameters: Record<string, any> = {};

  constructor(
    private fb: FormBuilder,
    private strategyService: StrategyService,
    @Inject(MAT_DIALOG_DATA) public data: TradingStrategy | null,
    private dialogRef: MatDialogRef<StrategyDialogComponent>
  ) {
    this.isEditMode = !!data;

    this.strategyForm = this.fb.group({
      name: ['', [Validators.required]],
      description: [''],
      type: ['', [Validators.required]]
      // Dynamic parameters will be added based on strategy type
    });

    if (this.isEditMode && data) {
      this.selectedSymbols = [...data.symbols];
      this.strategyForm.patchValue({
        name: data.name,
        description: data.description,
        type: data.type
      });
      this.parameters = {...data.parameters};
    }
  }

  ngOnInit(): void {
    // In a real implementation, fetch available strategy types from the API
    this.loadStrategyTypes();
  }

  loadStrategyTypes(): void {
    // In a real implementation, this would call the API
    this.strategyService.getStrategyTypes().subscribe({
      next: (response) => {
        if (response.success) {
          this.strategyTypes = response.data;

          // If editing, load parameters
          if (this.isEditMode && this.data) {
            this.onStrategyTypeChange(this.data.type);
          }
        }
      },
      error: (error) => {
        console.error('Error loading strategy types:', error);
        // Fallback to hardcoded types
        this.strategyTypes = ['MOVING_AVERAGE_CROSSOVER', 'MEAN_REVERSION', 'CUSTOM'];
      }
    });
  }

  onStrategyTypeChange(type: string): void {
    // Get default parameters for this strategy type
    this.strategyService.getStrategyParameters(type).subscribe({
      next: (response) => {
        if (response.success) {
          // If editing, merge defaults with existing values
          if (this.isEditMode && this.data) {
            this.parameters = {
              ...response.data,
              ...this.data.parameters
            };
          } else {
            this.parameters = response.data;
          }
        }
      },
      error: (error) => {
        console.error('Error loading parameters:', error);
        // Fallback to empty parameters
        this.parameters = {};
      }
    });
  }

  addSymbol(symbol: string): void {
    if (!symbol || this.selectedSymbols.includes(symbol)) return;
    this.selectedSymbols.push(symbol);
  }

  removeSymbol(symbol: string): void {
    this.selectedSymbols = this.selectedSymbols.filter(s => s !== symbol);
  }

  onSubmit(): void {
    if (this.strategyForm.invalid || this.selectedSymbols.length === 0) {
      return;
    }

    const formValue = this.strategyForm.value;

    const strategy: Partial<TradingStrategy> = {
      name: formValue.name,
      description: formValue.description,
      type: formValue.type,
      symbols: this.selectedSymbols,
      parameters: this.parameters,
    };

    if (this.isEditMode && this.data) {
      this.strategyService.updateStrategy(this.data.id, strategy).subscribe({
        next: (response) => {
          if (response.success) {
            this.dialogRef.close(true);
          }
        },
        error: (error) => {
          console.error('Error updating strategy:', error);
        }
      });
    } else {
      this.strategyService.createStrategy(strategy).subscribe({
        next: (response) => {
          if (response.success) {
            this.dialogRef.close(true);
          }
        },
        error: (error) => {
          console.error('Error creating strategy:', error);
        }
      });
    }
  }

  updateParameter(key: string, value: any): void {
    this.parameters[key] = value;
  }
}

@@ -4,12 +4,139 @@
      <h1 class="text-2xl font-bold text-gray-900">Trading Strategies</h1>
      <p class="text-gray-600 mt-1">Configure and monitor your automated trading strategies</p>
    </div>
    <div class="flex gap-2">
      <button mat-raised-button color="primary" (click)="openStrategyDialog()">
        <mat-icon>add</mat-icon> New Strategy
      </button>
      <button mat-raised-button color="accent" (click)="openBacktestDialog()">
        <mat-icon>science</mat-icon> New Backtest
      </button>
    </div>
  </div>

  <mat-card class="p-6 h-96 flex items-center">
    <div class="text-center text-gray-500 w-full">
      <mat-icon style="font-size: 4rem; width: 4rem; height: 4rem;">psychology</mat-icon>
      <p class="mb-4">Strategy management and configuration will be implemented here</p>
    </div>
  <mat-card *ngIf="isLoading" class="p-4">
    <mat-progress-bar mode="indeterminate"></mat-progress-bar>
  </mat-card>

  <div *ngIf="!selectedStrategy; else strategyDetails">
    <mat-card *ngIf="strategies.length > 0; else noStrategies" class="p-4">
      <table mat-table [dataSource]="strategies" class="w-full">
        <!-- Name Column -->
        <ng-container matColumnDef="name">
          <th mat-header-cell *matHeaderCellDef>Strategy</th>
          <td mat-cell *matCellDef="let strategy">
            <div class="font-semibold">{{strategy.name}}</div>
            <div class="text-xs text-gray-500">{{strategy.description}}</div>
          </td>
        </ng-container>

        <!-- Type Column -->
        <ng-container matColumnDef="type">
          <th mat-header-cell *matHeaderCellDef>Type</th>
          <td mat-cell *matCellDef="let strategy">{{strategy.type}}</td>
        </ng-container>

        <!-- Symbols Column -->
        <ng-container matColumnDef="symbols">
          <th mat-header-cell *matHeaderCellDef>Symbols</th>
          <td mat-cell *matCellDef="let strategy">
            <div class="flex flex-wrap gap-1 max-w-xs">
              <mat-chip *ngFor="let symbol of strategy.symbols.slice(0, 3)">
                {{symbol}}
              </mat-chip>
              <span *ngIf="strategy.symbols.length > 3" class="text-gray-500">
                +{{strategy.symbols.length - 3}} more
              </span>
            </div>
          </td>
        </ng-container>

        <!-- Status Column -->
        <ng-container matColumnDef="status">
          <th mat-header-cell *matHeaderCellDef>Status</th>
          <td mat-cell *matCellDef="let strategy">
            <div class="flex items-center gap-1">
              <span class="w-2 h-2 rounded-full"
                    [style.background-color]="getStatusColor(strategy.status)"></span>
              {{strategy.status}}
            </div>
          </td>
        </ng-container>

        <!-- Performance Column -->
        <ng-container matColumnDef="performance">
          <th mat-header-cell *matHeaderCellDef>Performance</th>
          <td mat-cell *matCellDef="let strategy">
            <div class="flex flex-col">
              <div class="flex justify-between">
                <span class="text-xs text-gray-500">Return:</span>
                <span [ngClass]="{'text-green-600': strategy.performance.totalReturn > 0,
                                  'text-red-600': strategy.performance.totalReturn < 0}">
                  {{strategy.performance.totalReturn | percent:'1.2-2'}}
                </span>
              </div>
              <div class="flex justify-between">
                <span class="text-xs text-gray-500">Win Rate:</span>
                <span>{{strategy.performance.winRate | percent:'1.0-0'}}</span>
              </div>
            </div>
          </td>
        </ng-container>

        <!-- Actions Column -->
        <ng-container matColumnDef="actions">
          <th mat-header-cell *matHeaderCellDef>Actions</th>
          <td mat-cell *matCellDef="let strategy">
            <div class="flex gap-2">
              <button mat-icon-button color="primary" (click)="viewStrategyDetails(strategy)">
                <mat-icon>visibility</mat-icon>
              </button>
              <button mat-icon-button [color]="strategy.status === 'ACTIVE' ? 'warn' : 'primary'"
                      (click)="toggleStrategyStatus(strategy)">
                <mat-icon>{{strategy.status === 'ACTIVE' ? 'pause' : 'play_arrow'}}</mat-icon>
              </button>
              <button mat-icon-button [matMenuTriggerFor]="menu">
                <mat-icon>more_vert</mat-icon>
              </button>
              <mat-menu #menu="matMenu">
                <button mat-menu-item (click)="openStrategyDialog(strategy)">
                  <mat-icon>edit</mat-icon>
                  <span>Edit</span>
                </button>
                <button mat-menu-item (click)="openBacktestDialog(strategy)">
                  <mat-icon>science</mat-icon>
                  <span>Backtest</span>
                </button>
              </mat-menu>
            </div>
          </td>
        </ng-container>

        <tr mat-header-row *matHeaderRowDef="displayedColumns"></tr>
        <tr mat-row *matRowDef="let row; columns: displayedColumns;"></tr>
      </table>
    </mat-card>

    <ng-template #noStrategies>
      <mat-card class="p-6 flex flex-col items-center justify-center">
        <div class="text-center text-gray-500 w-full">
          <mat-icon style="font-size: 4rem; width: 4rem; height: 4rem; margin: 0 auto;">psychology</mat-icon>
          <h3 class="text-xl font-semibold mt-4">No Strategies Yet</h3>
          <p class="mb-4">Create your first trading strategy to get started</p>
          <button mat-raised-button color="primary" (click)="openStrategyDialog()">
            <mat-icon>add</mat-icon> Create Strategy
          </button>
        </div>
      </mat-card>
    </ng-template>
  </div>

  <ng-template #strategyDetails>
    <div class="flex justify-between items-center mb-4">
      <button mat-button (click)="selectedStrategy = null">
        <mat-icon>arrow_back</mat-icon> Back to Strategies
      </button>
    </div>
    <app-strategy-details [strategy]="selectedStrategy"></app-strategy-details>
  </ng-template>
</div>

@@ -1,13 +1,148 @@
-import { Component } from '@angular/core';
+import { Component, OnInit } from '@angular/core';
 import { CommonModule } from '@angular/common';
 import { MatCardModule } from '@angular/material/card';
 import { MatIconModule } from '@angular/material/icon';
+import { MatButtonModule } from '@angular/material/button';
+import { MatTabsModule } from '@angular/material/tabs';
+import { MatTableModule } from '@angular/material/table';
+import { MatSortModule } from '@angular/material/sort';
+import { MatPaginatorModule } from '@angular/material/paginator';
+import { MatDialogModule, MatDialog } from '@angular/material/dialog';
+import { MatMenuModule } from '@angular/material/menu';
+import { MatChipsModule } from '@angular/material/chips';
+import { MatProgressBarModule } from '@angular/material/progress-bar';
+import { FormsModule, ReactiveFormsModule } from '@angular/forms';
+import { StrategyService, TradingStrategy } from '../../services/strategy.service';
+import { WebSocketService } from '../../services/websocket.service';
+import { StrategyDialogComponent } from './dialogs/strategy-dialog.component';
+import { BacktestDialogComponent } from './dialogs/backtest-dialog.component';
+import { StrategyDetailsComponent } from './strategy-details/strategy-details.component';

 @Component({
   selector: 'app-strategies',
   standalone: true,
-  imports: [CommonModule, MatCardModule, MatIconModule],
+  imports: [
+    CommonModule,
+    MatCardModule,
+    MatIconModule,
+    MatButtonModule,
+    MatTabsModule,
+    MatTableModule,
+    MatSortModule,
+    MatPaginatorModule,
+    MatDialogModule,
+    MatMenuModule,
+    MatChipsModule,
+    MatProgressBarModule,
+    FormsModule,
+    ReactiveFormsModule,
+    StrategyDetailsComponent
+  ],
   templateUrl: './strategies.component.html',
   styleUrl: './strategies.component.css'
 })
-export class StrategiesComponent {}
+export class StrategiesComponent implements OnInit {
+  strategies: TradingStrategy[] = [];
+  displayedColumns: string[] = ['name', 'type', 'symbols', 'status', 'performance', 'actions'];
+  selectedStrategy: TradingStrategy | null = null;
+  isLoading = false;
+
+  constructor(
+    private strategyService: StrategyService,
+    private webSocketService: WebSocketService,
+    private dialog: MatDialog
+  ) {}
+
+  ngOnInit(): void {
+    this.loadStrategies();
+    this.listenForStrategyUpdates();
+  }
+
+  loadStrategies(): void {
+    this.isLoading = true;
+    this.strategyService.getStrategies().subscribe({
+      next: (response) => {
+        if (response.success) {
+          this.strategies = response.data;
+        }
+        this.isLoading = false;
+      },
+      error: (error) => {
+        console.error('Error loading strategies:', error);
+        this.isLoading = false;
+      }
+    });
+  }
+
+  listenForStrategyUpdates(): void {
+    this.webSocketService.messages.subscribe(message => {
+      if (message.type === 'STRATEGY_CREATED' ||
+          message.type === 'STRATEGY_UPDATED' ||
+          message.type === 'STRATEGY_STATUS_CHANGED') {
+        // Refresh the strategy list when changes occur
+        this.loadStrategies();
+      }
+    });
+  }
+
+  getStatusColor(status: string): string {
+    switch (status) {
+      case 'ACTIVE': return 'green';
+      case 'PAUSED': return 'orange';
+      case 'ERROR': return 'red';
+      default: return 'gray';
+    }
+  }
+
+  openStrategyDialog(strategy?: TradingStrategy): void {
+    const dialogRef = this.dialog.open(StrategyDialogComponent, {
+      width: '600px',
+      data: strategy || null
+    });
+
+    dialogRef.afterClosed().subscribe(result => {
+      if (result) {
+        this.loadStrategies();
+      }
+    });
+  }
+
+  openBacktestDialog(strategy?: TradingStrategy): void {
+    const dialogRef = this.dialog.open(BacktestDialogComponent, {
+      width: '800px',
+      data: strategy || null
+    });
+
+    dialogRef.afterClosed().subscribe(result => {
+      if (result) {
+        // Handle backtest result if needed
+      }
+    });
+  }
+
+  toggleStrategyStatus(strategy: TradingStrategy): void {
+    this.isLoading = true;
+
+    if (strategy.status === 'ACTIVE') {
+      this.strategyService.pauseStrategy(strategy.id).subscribe({
+        next: () => this.loadStrategies(),
+        error: (error) => {
+          console.error('Error pausing strategy:', error);
+          this.isLoading = false;
+        }
+      });
+    } else {
+      this.strategyService.startStrategy(strategy.id).subscribe({
+        next: () => this.loadStrategies(),
+        error: (error) => {
+          console.error('Error starting strategy:', error);
+          this.isLoading = false;
+        }
+      });
+    }
+  }
+
+  viewStrategyDetails(strategy: TradingStrategy): void {
+    this.selectedStrategy = strategy;
+  }
+}

@@ -0,0 +1,16 @@
/* Strategy details specific styles */
table {
  width: 100%;
  border-collapse: collapse;
}

th {
  font-weight: 600;
  color: #4b5563;
  font-size: 0.875rem;
  border-bottom: 1px solid #e5e7eb;
}

td {
  border-bottom: 1px solid #e5e7eb;
}

@@ -0,0 +1,214 @@
<div class="space-y-6" *ngIf="strategy">
  <div class="flex flex-col md:flex-row md:justify-between md:items-start gap-4">
    <!-- Strategy Overview Card -->
    <mat-card class="flex-1 p-4">
      <div class="flex items-start justify-between">
        <div>
          <h2 class="text-xl font-bold">{{strategy.name}}</h2>
          <p class="text-gray-600 text-sm">{{strategy.description}}</p>
        </div>
        <div class="flex items-center gap-2">
          <button mat-raised-button color="primary" class="mr-2" (click)="openBacktestDialog()">
            Run Backtest
          </button>
          <span class="px-3 py-1 rounded-full text-xs font-semibold"
                [style.background-color]="getStatusColor(strategy.status)"
                style="color: white;">
            {{strategy.status}}
          </span>
        </div>
      </div>

      <div class="mt-4 grid grid-cols-1 md:grid-cols-2 gap-4">
        <div>
          <h3 class="font-semibold text-sm text-gray-600">Type</h3>
          <p>{{strategy.type}}</p>
        </div>
        <div>
          <h3 class="font-semibold text-sm text-gray-600">Created</h3>
          <p>{{strategy.createdAt | date:'medium'}}</p>
        </div>
        <div>
          <h3 class="font-semibold text-sm text-gray-600">Last Updated</h3>
          <p>{{strategy.updatedAt | date:'medium'}}</p>
        </div>
        <div>
          <h3 class="font-semibold text-sm text-gray-600">Symbols</h3>
          <div class="flex flex-wrap gap-1 mt-1">
            <mat-chip *ngFor="let symbol of strategy.symbols">{{symbol}}</mat-chip>
          </div>
        </div>
      </div>
    </mat-card>

    <!-- Performance Summary Card -->
    <mat-card class="md:w-1/3 p-4">
      <h3 class="text-lg font-bold mb-3">Performance</h3>
      <div class="grid grid-cols-2 gap-3">
        <div>
          <p class="text-sm text-gray-600">Return</p>
          <p class="text-xl font-semibold"
             [ngClass]="{'text-green-600': performance.totalReturn >= 0, 'text-red-600': performance.totalReturn < 0}">
            {{performance.totalReturn | percent:'1.2-2'}}
          </p>
        </div>
        <div>
          <p class="text-sm text-gray-600">Win Rate</p>
          <p class="text-xl font-semibold">{{performance.winRate | percent:'1.0-0'}}</p>
        </div>
        <div>
          <p class="text-sm text-gray-600">Sharpe Ratio</p>
          <p class="text-xl font-semibold">{{performance.sharpeRatio | number:'1.2-2'}}</p>
        </div>
        <div>
          <p class="text-sm text-gray-600">Max Drawdown</p>
          <p class="text-xl font-semibold text-red-600">{{performance.maxDrawdown | percent:'1.2-2'}}</p>
        </div>
        <div>
          <p class="text-sm text-gray-600">Total Trades</p>
          <p class="text-xl font-semibold">{{performance.totalTrades}}</p>
        </div>
        <div>
          <p class="text-sm text-gray-600">Sortino Ratio</p>
          <p class="text-xl font-semibold">{{performance.sortinoRatio | number:'1.2-2'}}</p>
        </div>
      </div>

      <mat-divider class="my-4"></mat-divider>

      <div class="flex justify-between mt-2">
        <button mat-button color="primary" *ngIf="strategy.status !== 'ACTIVE'" (click)="activateStrategy()">
          <mat-icon>play_arrow</mat-icon> Start
        </button>
        <button mat-button color="accent" *ngIf="strategy.status === 'ACTIVE'" (click)="pauseStrategy()">
          <mat-icon>pause</mat-icon> Pause
        </button>
        <button mat-button color="warn" *ngIf="strategy.status === 'ACTIVE'" (click)="stopStrategy()">
          <mat-icon>stop</mat-icon> Stop
        </button>
        <button mat-button (click)="openEditDialog()">
          <mat-icon>edit</mat-icon> Edit
        </button>
      </div>
    </mat-card>
  </div>

  <!-- Parameters Card -->
  <mat-card class="p-4">
    <h3 class="text-lg font-bold mb-3">Strategy Parameters</h3>
    <div class="grid grid-cols-2 md:grid-cols-4 gap-4">
      <div *ngFor="let param of strategy.parameters | keyvalue">
        <p class="text-sm text-gray-600">{{param.key}}</p>
        <p class="font-semibold">{{param.value}}</p>
      </div>
    </div>
  </mat-card>

  <!-- Backtest Results Section (only shown when a backtest has been run) -->
  <div *ngIf="backtestResult" class="backtest-results space-y-6">
    <h2 class="text-xl font-bold">Backtest Results</h2>

    <!-- Performance Metrics Component -->
    <app-performance-metrics [backtestResult]="backtestResult"></app-performance-metrics>

    <!-- Equity Chart Component -->
    <app-equity-chart [backtestResult]="backtestResult"></app-equity-chart>

    <!-- Drawdown Chart Component -->
    <app-drawdown-chart [backtestResult]="backtestResult"></app-drawdown-chart>

    <!-- Trades Table Component -->
    <app-trades-table [backtestResult]="backtestResult"></app-trades-table>
  </div>

  <!-- Tabs for Signals/Trades -->
  <mat-card class="p-0">
    <mat-tab-group>
      <!-- Signals Tab -->
      <mat-tab label="Recent Signals">
        <div class="p-4">
          <ng-container *ngIf="!isLoadingSignals; else loadingSignals">
            <table class="min-w-full">
              <thead>
                <tr>
                  <th class="py-2 text-left">Time</th>
                  <th class="py-2 text-left">Symbol</th>
                  <th class="py-2 text-left">Action</th>
                  <th class="py-2 text-left">Price</th>
                  <th class="py-2 text-left">Quantity</th>
                  <th class="py-2 text-left">Confidence</th>
                </tr>
              </thead>
              <tbody>
                <tr *ngFor="let signal of signals">
                  <td class="py-2">{{signal.timestamp | date:'short'}}</td>
                  <td class="py-2">{{signal.symbol}}</td>
                  <td class="py-2">
                    <span class="px-2 py-1 rounded text-xs font-semibold"
                          [style.background-color]="getSignalColor(signal.action)"
                          style="color: white;">
                      {{signal.action}}
                    </span>
                  </td>
                  <td class="py-2">${{signal.price | number:'1.2-2'}}</td>
                  <td class="py-2">{{signal.quantity}}</td>
                  <td class="py-2">{{signal.confidence | percent:'1.0-0'}}</td>
                </tr>
              </tbody>
            </table>
          </ng-container>
          <ng-template #loadingSignals>
            <mat-progress-bar mode="indeterminate"></mat-progress-bar>
          </ng-template>
        </div>
      </mat-tab>

      <!-- Trades Tab -->
      <mat-tab label="Recent Trades">
        <div class="p-4">
          <ng-container *ngIf="!isLoadingTrades; else loadingTrades">
            <table class="min-w-full">
              <thead>
                <tr>
                  <th class="py-2 text-left">Symbol</th>
                  <th class="py-2 text-left">Entry</th>
                  <th class="py-2 text-left">Exit</th>
                  <th class="py-2 text-left">Quantity</th>
                  <th class="py-2 text-left">P&amp;L</th>
                  <th class="py-2 text-left">P&amp;L %</th>
                </tr>
              </thead>
              <tbody>
                <tr *ngFor="let trade of trades">
                  <td class="py-2">{{trade.symbol}}</td>
                  <td class="py-2">
                    ${{trade.entryPrice | number:'1.2-2'}} @ {{trade.entryTime | date:'short'}}
                  </td>
                  <td class="py-2">
                    ${{trade.exitPrice | number:'1.2-2'}} @ {{trade.exitTime | date:'short'}}
                  </td>
                  <td class="py-2">{{trade.quantity}}</td>
                  <td class="py-2" [ngClass]="{'text-green-600': trade.pnl >= 0, 'text-red-600': trade.pnl < 0}">
                    ${{trade.pnl | number:'1.2-2'}}
                  </td>
                  <td class="py-2" [ngClass]="{'text-green-600': trade.pnlPercent >= 0, 'text-red-600': trade.pnlPercent < 0}">
                    {{trade.pnlPercent | number:'1.2-2'}}%
                  </td>
                </tr>
              </tbody>
            </table>
          </ng-container>
          <ng-template #loadingTrades>
            <mat-progress-bar mode="indeterminate"></mat-progress-bar>
          </ng-template>
        </div>
      </mat-tab>
    </mat-tab-group>
  </mat-card>
</div>

<mat-card class="p-6 flex items-center" *ngIf="!strategy">
  <div class="text-center text-gray-500 w-full">
    <mat-icon style="font-size: 4rem; width: 4rem; height: 4rem;">psychology</mat-icon>
    <p class="mb-4">No strategy selected</p>
  </div>
</mat-card>

@ -0,0 +1,381 @@
|
|||
import { Component, Input, OnChanges, OnDestroy, SimpleChanges } from '@angular/core';
import { CommonModule } from '@angular/common';
import { MatCardModule } from '@angular/material/card';
import { MatTabsModule } from '@angular/material/tabs';
import { MatIconModule } from '@angular/material/icon';
import { MatButtonModule } from '@angular/material/button';
import { MatTableModule } from '@angular/material/table';
import { MatChipsModule } from '@angular/material/chips';
import { MatProgressBarModule } from '@angular/material/progress-bar';
import { MatDividerModule } from '@angular/material/divider';
import { MatDialog } from '@angular/material/dialog';
import { Subscription } from 'rxjs';
import { BacktestResult, TradingStrategy, StrategyService } from '../../../services/strategy.service';
import { WebSocketService } from '../../../services/websocket.service';
import { EquityChartComponent } from '../components/equity-chart.component';
import { DrawdownChartComponent } from '../components/drawdown-chart.component';
import { TradesTableComponent } from '../components/trades-table.component';
import { PerformanceMetricsComponent } from '../components/performance-metrics.component';
import { StrategyDialogComponent } from '../dialogs/strategy-dialog.component';
import { BacktestDialogComponent } from '../dialogs/backtest-dialog.component';

@Component({
  selector: 'app-strategy-details',
  standalone: true,
  imports: [
    CommonModule,
    MatCardModule,
    MatTabsModule,
    MatIconModule,
    MatButtonModule,
    MatTableModule,
    MatChipsModule,
    MatProgressBarModule,
    MatDividerModule,
    EquityChartComponent,
    DrawdownChartComponent,
    TradesTableComponent,
    PerformanceMetricsComponent
  ],
  templateUrl: './strategy-details.component.html',
  styleUrl: './strategy-details.component.css'
})
export class StrategyDetailsComponent implements OnChanges, OnDestroy {
  @Input() strategy: TradingStrategy | null = null;

  signals: any[] = [];
  trades: any[] = [];
  performance: any = {};
  isLoadingSignals = false;
  isLoadingTrades = false;
  backtestResult: BacktestResult | undefined;

  // Collects WebSocket subscriptions so they can be torn down when the
  // selected strategy changes or the component is destroyed.
  private subscriptions = new Subscription();

  constructor(
    private strategyService: StrategyService,
    private webSocketService: WebSocketService,
    private dialog: MatDialog
  ) {}

  ngOnChanges(changes: SimpleChanges): void {
    if (changes['strategy'] && this.strategy) {
      this.loadStrategyData();
      this.listenForUpdates();
    }
  }

  ngOnDestroy(): void {
    this.subscriptions.unsubscribe();
  }

  loadStrategyData(): void {
    if (!this.strategy) return;

    // In a real implementation, these would call API methods to fetch the data
    this.loadSignals();
    this.loadTrades();
    this.loadPerformance();
  }

  loadSignals(): void {
    if (!this.strategy) return;

    this.isLoadingSignals = true;

    // First check if we can get real signals from the API
    this.strategyService.getStrategySignals(this.strategy.id)
      .subscribe({
        next: (response) => {
          if (response.success && response.data && response.data.length > 0) {
            this.signals = response.data;
          } else {
            // Fallback to mock data if no real signals available
            this.signals = this.generateMockSignals();
          }
          this.isLoadingSignals = false;
        },
        error: (error) => {
          console.error('Error loading signals', error);
          // Fallback to mock data on error
          this.signals = this.generateMockSignals();
          this.isLoadingSignals = false;
        }
      });
  }

  loadTrades(): void {
    if (!this.strategy) return;

    this.isLoadingTrades = true;

    // First check if we can get real trades from the API
    this.strategyService.getStrategyTrades(this.strategy.id)
      .subscribe({
        next: (response) => {
          if (response.success && response.data && response.data.length > 0) {
            this.trades = response.data;
          } else {
            // Fallback to mock data if no real trades available
            this.trades = this.generateMockTrades();
          }
          this.isLoadingTrades = false;
        },
        error: (error) => {
          console.error('Error loading trades', error);
          // Fallback to mock data on error
          this.trades = this.generateMockTrades();
          this.isLoadingTrades = false;
        }
      });
  }

  loadPerformance(): void {
    // This would be an API call in a real implementation
    this.performance = {
      totalReturn: this.strategy?.performance.totalReturn || 0,
      winRate: this.strategy?.performance.winRate || 0,
      sharpeRatio: this.strategy?.performance.sharpeRatio || 0,
      maxDrawdown: this.strategy?.performance.maxDrawdown || 0,
      totalTrades: this.strategy?.performance.totalTrades || 0,
      // Additional metrics that would come from the API
      dailyReturn: 0.0012,
      volatility: 0.008,
      sortinoRatio: 1.2,
      calmarRatio: 0.7
    };
  }

  listenForUpdates(): void {
    if (!this.strategy) return;

    // Tear down listeners from a previously selected strategy before
    // re-subscribing, otherwise each strategy change stacks another set
    // of live subscriptions.
    this.subscriptions.unsubscribe();
    this.subscriptions = new Subscription();

    // Subscribe to strategy signals
    this.subscriptions.add(
      this.webSocketService.getStrategySignals(this.strategy.id)
        .subscribe((signal: any) => {
          // Add the new signal to the top of the list; keep only the latest 10
          this.signals = [signal, ...this.signals.slice(0, 9)];
        })
    );

    // Subscribe to strategy trades
    this.subscriptions.add(
      this.webSocketService.getStrategyTrades(this.strategy.id)
        .subscribe((trade: any) => {
          // Add the new trade to the top of the list; keep only the latest 10
          this.trades = [trade, ...this.trades.slice(0, 9)];

          // Update performance metrics
          this.updatePerformanceMetrics();
        })
    );

    // Subscribe to strategy status updates
    this.subscriptions.add(
      this.webSocketService.getStrategyUpdates()
        .subscribe((update: any) => {
          if (update.strategyId === this.strategy?.id) {
            // Update strategy status if changed
            if (update.status && this.strategy.status !== update.status) {
              this.strategy.status = update.status;
            }

            // Update other fields if present
            if (update.performance && this.strategy) {
              this.strategy.performance = {
                ...this.strategy.performance,
                ...update.performance
              };
              this.performance = {
                ...this.performance,
                ...update.performance
              };
            }
          }
        })
    );

    console.log('WebSocket listeners for strategy updates initialized');
  }

  /**
   * Update performance metrics when new trades come in
   */
  private updatePerformanceMetrics(): void {
    if (!this.strategy || this.trades.length === 0) return;

    // Calculate basic metrics
    const winningTrades = this.trades.filter(t => t.pnl > 0);
    const losingTrades = this.trades.filter(t => t.pnl < 0);

    const totalPnl = this.trades.reduce((sum, trade) => sum + trade.pnl, 0);
    const winRate = winningTrades.length / this.trades.length;

    // Update performance data
    const currentPerformance = this.performance || {};
    this.performance = {
      ...currentPerformance,
      totalTrades: this.trades.length,
      winRate: winRate,
      totalReturn: (currentPerformance.totalReturn || 0) + (totalPnl / 10000) // Approximate
    };

    // Update strategy performance as well
    if (this.strategy && this.strategy.performance) {
      this.strategy.performance = {
        ...this.strategy.performance,
        totalTrades: this.trades.length,
        winRate: winRate
      };
    }
  }

  getStatusColor(status: string): string {
    switch (status) {
      case 'ACTIVE': return 'green';
      case 'PAUSED': return 'orange';
      case 'ERROR': return 'red';
      default: return 'gray';
    }
  }

  getSignalColor(action: string): string {
    switch (action) {
      case 'BUY': return 'green';
      case 'SELL': return 'red';
      default: return 'gray';
    }
  }

  /**
   * Open the backtest dialog to run a backtest for this strategy
   */
  openBacktestDialog(): void {
    if (!this.strategy) return;

    const dialogRef = this.dialog.open(BacktestDialogComponent, {
      width: '800px',
      data: this.strategy
    });

    dialogRef.afterClosed().subscribe(result => {
      if (result) {
        // Store the backtest result for visualization
        this.backtestResult = result;
      }
    });
  }

  /**
   * Open the strategy edit dialog
   */
  openEditDialog(): void {
    if (!this.strategy) return;

    const dialogRef = this.dialog.open(StrategyDialogComponent, {
      width: '600px',
      data: this.strategy
    });

    dialogRef.afterClosed().subscribe(result => {
      if (result) {
        // Refresh strategy data after edit
        this.loadStrategyData();
      }
    });
  }

  /**
   * Start the strategy
   */
  activateStrategy(): void {
    if (!this.strategy) return;

    this.strategyService.startStrategy(this.strategy.id).subscribe({
      next: (response) => {
        if (response.success) {
          this.strategy!.status = 'ACTIVE';
        }
      },
      error: (error) => {
        console.error('Error starting strategy:', error);
      }
    });
  }

  /**
   * Pause the strategy
   */
  pauseStrategy(): void {
    if (!this.strategy) return;

    this.strategyService.pauseStrategy(this.strategy.id).subscribe({
      next: (response) => {
        if (response.success) {
          this.strategy!.status = 'PAUSED';
        }
      },
      error: (error) => {
        console.error('Error pausing strategy:', error);
      }
    });
  }

  /**
   * Stop the strategy
   */
  stopStrategy(): void {
    if (!this.strategy) return;

    this.strategyService.stopStrategy(this.strategy.id).subscribe({
      next: (response) => {
        if (response.success) {
          this.strategy!.status = 'INACTIVE';
        }
      },
      error: (error) => {
        console.error('Error stopping strategy:', error);
      }
    });
  }

  // Methods to generate mock data
  private generateMockSignals(): any[] {
    if (!this.strategy) return [];

    const signals = [];
    const actions = ['BUY', 'SELL', 'HOLD'];
    const now = new Date();

    for (let i = 0; i < 10; i++) {
      const symbol = this.strategy.symbols[Math.floor(Math.random() * this.strategy.symbols.length)];
      const action = actions[Math.floor(Math.random() * actions.length)];

      signals.push({
        id: `sig_${i}`,
        symbol,
        action,
        confidence: 0.7 + Math.random() * 0.3,
        price: 100 + Math.random() * 50,
        timestamp: new Date(now.getTime() - i * 1000 * 60 * 30), // 30 min intervals
        quantity: Math.floor(10 + Math.random() * 90)
      });
    }

    return signals;
  }

  private generateMockTrades(): any[] {
    if (!this.strategy) return [];

    const trades = [];
    const now = new Date();

    for (let i = 0; i < 10; i++) {
      const symbol = this.strategy.symbols[Math.floor(Math.random() * this.strategy.symbols.length)];
      const entryPrice = 100 + Math.random() * 50;
      const exitPrice = entryPrice * (1 + (Math.random() * 0.1 - 0.05)); // -5% to +5%
      const quantity = Math.floor(10 + Math.random() * 90);
      const pnl = (exitPrice - entryPrice) * quantity;

      trades.push({
        id: `trade_${i}`,
        symbol,
        entryPrice,
        entryTime: new Date(now.getTime() - (i + 5) * 1000 * 60 * 60), // Hourly intervals
        exitPrice,
        exitTime: new Date(now.getTime() - i * 1000 * 60 * 60),
        quantity,
        pnl,
        pnlPercent: ((exitPrice - entryPrice) / entryPrice) * 100
      });
    }

    return trades;
  }
}

@@ -0,0 +1,209 @@
import { Injectable } from '@angular/core';
import { HttpClient } from '@angular/common/http';
import { Observable } from 'rxjs';

export interface TradingStrategy {
  id: string;
  name: string;
  description: string;
  status: 'ACTIVE' | 'INACTIVE' | 'PAUSED' | 'ERROR';
  type: string;
  symbols: string[];
  parameters: Record<string, any>;
  performance: {
    totalTrades: number;
    winRate: number;
    totalReturn: number;
    sharpeRatio: number;
    maxDrawdown: number;
  };
  createdAt: Date;
  updatedAt: Date;
}

export interface BacktestRequest {
  strategyType: string;
  strategyParams: Record<string, any>;
  symbols: string[];
  startDate: Date | string;
  endDate: Date | string;
  initialCapital: number;
  dataResolution: '1m' | '5m' | '15m' | '30m' | '1h' | '4h' | '1d';
  commission: number;
  slippage: number;
  mode: 'event' | 'vector';
}

export interface BacktestResult {
  strategyId: string;
  startDate: Date;
  endDate: Date;
  duration: number;
  initialCapital: number;
  finalCapital: number;
  totalReturn: number;
  annualizedReturn: number;
  sharpeRatio: number;
  maxDrawdown: number;
  maxDrawdownDuration: number;
  winRate: number;
  totalTrades: number;
  winningTrades: number;
  losingTrades: number;
  averageWinningTrade: number;
  averageLosingTrade: number;
  profitFactor: number;
  dailyReturns: Array<{ date: Date; return: number }>;
  trades: Array<{
    symbol: string;
    entryTime: Date;
    entryPrice: number;
    exitTime: Date;
    exitPrice: number;
    quantity: number;
    pnl: number;
    pnlPercent: number;
  }>;
  // Advanced metrics
  sortinoRatio?: number;
  calmarRatio?: number;
  omegaRatio?: number;
  cagr?: number;
  volatility?: number;
  ulcerIndex?: number;
}

interface ApiResponse<T> {
  success: boolean;
  data: T;
  error?: string;
}

@Injectable({
  providedIn: 'root'
})
export class StrategyService {
  private apiBaseUrl = '/api'; // Will be proxied to the correct backend endpoint

  constructor(private http: HttpClient) { }

  // Strategy Management
  getStrategies(): Observable<ApiResponse<TradingStrategy[]>> {
    return this.http.get<ApiResponse<TradingStrategy[]>>(`${this.apiBaseUrl}/strategies`);
  }

  getStrategy(id: string): Observable<ApiResponse<TradingStrategy>> {
    return this.http.get<ApiResponse<TradingStrategy>>(`${this.apiBaseUrl}/strategies/${id}`);
  }

  createStrategy(strategy: Partial<TradingStrategy>): Observable<ApiResponse<TradingStrategy>> {
    return this.http.post<ApiResponse<TradingStrategy>>(`${this.apiBaseUrl}/strategies`, strategy);
  }

  updateStrategy(id: string, updates: Partial<TradingStrategy>): Observable<ApiResponse<TradingStrategy>> {
    return this.http.put<ApiResponse<TradingStrategy>>(`${this.apiBaseUrl}/strategies/${id}`, updates);
  }

  startStrategy(id: string): Observable<ApiResponse<TradingStrategy>> {
    return this.http.post<ApiResponse<TradingStrategy>>(`${this.apiBaseUrl}/strategies/${id}/start`, {});
  }

  stopStrategy(id: string): Observable<ApiResponse<TradingStrategy>> {
    return this.http.post<ApiResponse<TradingStrategy>>(`${this.apiBaseUrl}/strategies/${id}/stop`, {});
  }

  pauseStrategy(id: string): Observable<ApiResponse<TradingStrategy>> {
    return this.http.post<ApiResponse<TradingStrategy>>(`${this.apiBaseUrl}/strategies/${id}/pause`, {});
  }

  // Backtest Management
  getStrategyTypes(): Observable<ApiResponse<string[]>> {
    return this.http.get<ApiResponse<string[]>>(`${this.apiBaseUrl}/strategy-types`);
  }

  getStrategyParameters(type: string): Observable<ApiResponse<Record<string, any>>> {
    return this.http.get<ApiResponse<Record<string, any>>>(`${this.apiBaseUrl}/strategy-parameters/${type}`);
  }

  runBacktest(request: BacktestRequest): Observable<ApiResponse<BacktestResult>> {
    return this.http.post<ApiResponse<BacktestResult>>(`${this.apiBaseUrl}/backtest`, request);
  }

  getBacktestResult(id: string): Observable<ApiResponse<BacktestResult>> {
    return this.http.get<ApiResponse<BacktestResult>>(`${this.apiBaseUrl}/backtest/${id}`);
  }

  optimizeStrategy(
    baseRequest: BacktestRequest,
    parameterGrid: Record<string, any[]>
  ): Observable<ApiResponse<Array<BacktestResult & { parameters: Record<string, any> }>>> {
    return this.http.post<ApiResponse<Array<BacktestResult & { parameters: Record<string, any> }>>>(
      `${this.apiBaseUrl}/backtest/optimize`,
      { baseRequest, parameterGrid }
    );
  }

  // Strategy Signals and Trades
  getStrategySignals(strategyId: string): Observable<ApiResponse<Array<{
    id: string;
    strategyId: string;
    symbol: string;
    action: string;
    price: number;
    quantity: number;
    timestamp: Date;
    confidence: number;
    metadata?: any;
  }>>> {
    return this.http.get<ApiResponse<any[]>>(`${this.apiBaseUrl}/strategies/${strategyId}/signals`);
  }

  getStrategyTrades(strategyId: string): Observable<ApiResponse<Array<{
    id: string;
    strategyId: string;
    symbol: string;
    entryPrice: number;
    entryTime: Date;
    exitPrice: number;
    exitTime: Date;
    quantity: number;
    pnl: number;
    pnlPercent: number;
  }>>> {
    return this.http.get<ApiResponse<any[]>>(`${this.apiBaseUrl}/strategies/${strategyId}/trades`);
  }

  // Helper methods for common transformations
  formatBacktestRequest(formData: any): BacktestRequest {
    // Handle date formatting and parameter conversion
    return {
      ...formData,
      startDate: formData.startDate instanceof Date ? formData.startDate.toISOString() : formData.startDate,
      endDate: formData.endDate instanceof Date ? formData.endDate.toISOString() : formData.endDate,
      strategyParams: this.convertParameterTypes(formData.strategyType, formData.strategyParams)
    };
  }

  private convertParameterTypes(strategyType: string, params: Record<string, any>): Record<string, any> {
    // Convert string parameters to correct types based on strategy requirements
    const result: Record<string, any> = {};

    for (const [key, value] of Object.entries(params)) {
      if (typeof value === 'string') {
        // Try to convert to number if it looks like a number
        // (guard against '' and whitespace, which Number() coerces to 0)
        if (value.trim() !== '' && !isNaN(Number(value))) {
          result[key] = Number(value);
        } else if (value.toLowerCase() === 'true') {
          result[key] = true;
        } else if (value.toLowerCase() === 'false') {
          result[key] = false;
        } else {
          result[key] = value;
        }
      } else {
        result[key] = value;
      }
    }

    return result;
  }
}

@@ -136,7 +136,6 @@ export class WebSocketService {
      map(message => message.data as RiskAlert)
    );
  }

  // Strategy Updates
  getStrategyUpdates(): Observable<any> {
    const subject = this.messageSubjects.get('strategyOrchestrator');

@@ -150,6 +149,52 @@ export class WebSocketService {
    );
  }

  // Strategy Signals
  getStrategySignals(strategyId?: string): Observable<any> {
    const subject = this.messageSubjects.get('strategyOrchestrator');
    if (!subject) {
      throw new Error('Strategy Orchestrator WebSocket not initialized');
    }

    return subject.asObservable().pipe(
      filter(message =>
        message.type === 'strategy_signal' &&
        (!strategyId || message.data.strategyId === strategyId)
      ),
      map(message => message.data)
    );
  }

  // Strategy Trades
  getStrategyTrades(strategyId?: string): Observable<any> {
    const subject = this.messageSubjects.get('strategyOrchestrator');
    if (!subject) {
      throw new Error('Strategy Orchestrator WebSocket not initialized');
    }

    return subject.asObservable().pipe(
      filter(message =>
        message.type === 'strategy_trade' &&
        (!strategyId || message.data.strategyId === strategyId)
      ),
      map(message => message.data)
    );
  }

  // All strategy-related messages, useful for components that need all types
  getAllStrategyMessages(): Observable<WebSocketMessage> {
    const subject = this.messageSubjects.get('strategyOrchestrator');
    if (!subject) {
      throw new Error('Strategy Orchestrator WebSocket not initialized');
    }

    return subject.asObservable().pipe(
      filter(message =>
        message.type.startsWith('strategy_')
      )
    );
  }

  // Send messages
  sendMessage(serviceName: string, message: any) {
    const ws = this.connections.get(serviceName);

0 apps/interface-services/trading-dashboard/src/index.css Normal file

@@ -0,0 +1,244 @@
import { TestBed, ComponentFixture, fakeAsync, tick } from '@angular/core/testing';
import { WebSocketService } from '../../services/websocket.service';
import { StrategyService, TradingStrategy } from '../../services/strategy.service';
import { StrategyDetailsComponent } from '../../pages/strategies/strategy-details/strategy-details.component';
import { HttpClientTestingModule } from '@angular/common/http/testing';
import { MatDialogModule } from '@angular/material/dialog';
import { CommonModule } from '@angular/common';
import { BehaviorSubject, Subject, filter, map } from 'rxjs';
import { WebSocketMessage } from '../../services/websocket.service';

describe('StrategyDetails WebSocket Integration', () => {
  let component: StrategyDetailsComponent;
  let fixture: ComponentFixture<StrategyDetailsComponent>;
  let webSocketServiceSpy: jasmine.SpyObj<WebSocketService>;
  let strategyServiceSpy: jasmine.SpyObj<StrategyService>;

  // Mock data
  const mockStrategy: TradingStrategy = {
    id: 'test-strategy',
    name: 'Test Strategy',
    description: 'A test strategy',
    status: 'INACTIVE',
    type: 'MovingAverageCrossover',
    symbols: ['AAPL', 'MSFT', 'GOOGL'],
    parameters: {
      shortPeriod: 10,
      longPeriod: 30
    },
    performance: {
      totalTrades: 100,
      winRate: 0.6,
      totalReturn: 0.15,
      sharpeRatio: 1.2,
      maxDrawdown: 0.05
    },
    createdAt: new Date('2023-01-01'),
    updatedAt: new Date('2023-01-10')
  };

  // Create mock subjects for WebSocket messages
  const mockStrategySubject = new Subject<WebSocketMessage>();

  beforeEach(async () => {
    // Create spies for services
    webSocketServiceSpy = jasmine.createSpyObj('WebSocketService', [
      'getStrategyUpdates',
      'getStrategySignals',
      'getStrategyTrades',
      'getAllStrategyMessages',
      'sendMessage'
    ]);

    strategyServiceSpy = jasmine.createSpyObj('StrategyService', [
      'startStrategy',
      'stopStrategy',
      'pauseStrategy'
    ]);

    // Setup spy return values. The real service filters by message type and
    // unwraps the `data` envelope, so the spies must do the same for the
    // component's subscribers to see the payloads the tests emit.
    webSocketServiceSpy.getStrategyUpdates.and.returnValue(
      mockStrategySubject.asObservable().pipe(
        filter(m => m.type === 'strategy_update'),
        map(m => m.data)
      )
    );
    webSocketServiceSpy.getStrategySignals.and.returnValue(
      mockStrategySubject.asObservable().pipe(
        filter(m => m.type === 'strategy_signal'),
        map(m => m.data)
      )
    );
    webSocketServiceSpy.getStrategyTrades.and.returnValue(
      mockStrategySubject.asObservable().pipe(
        filter(m => m.type === 'strategy_trade'),
        map(m => m.data)
      )
    );
    webSocketServiceSpy.getAllStrategyMessages.and.returnValue(mockStrategySubject.asObservable());

    strategyServiceSpy.startStrategy.and.returnValue(
      new BehaviorSubject({ success: true, data: { ...mockStrategy, status: 'ACTIVE' } })
    );
    strategyServiceSpy.pauseStrategy.and.returnValue(
      new BehaviorSubject({ success: true, data: { ...mockStrategy, status: 'PAUSED' } })
    );
    strategyServiceSpy.stopStrategy.and.returnValue(
      new BehaviorSubject({ success: true, data: { ...mockStrategy, status: 'INACTIVE' } })
    );

    await TestBed.configureTestingModule({
      imports: [
        CommonModule,
        HttpClientTestingModule,
        MatDialogModule,
        // StrategyDetailsComponent is standalone, so it goes in `imports`,
        // not `declarations`
        StrategyDetailsComponent
      ],
      providers: [
        { provide: WebSocketService, useValue: webSocketServiceSpy },
        { provide: StrategyService, useValue: strategyServiceSpy }
      ]
    }).compileComponents();

    fixture = TestBed.createComponent(StrategyDetailsComponent);
    component = fixture.componentInstance;
    component.strategy = { ...mockStrategy };
    fixture.detectChanges();
  });

  it('should create', () => {
    expect(component).toBeTruthy();
  });

  it('should subscribe to WebSocket updates when strategy changes', () => {
    // Arrange & Act
    component.ngOnChanges({
      strategy: {
        currentValue: mockStrategy,
        previousValue: null,
        firstChange: true,
        isFirstChange: () => true
      }
    });

    // Assert
    expect(webSocketServiceSpy.getStrategySignals).toHaveBeenCalledWith(mockStrategy.id);
    expect(webSocketServiceSpy.getStrategyTrades).toHaveBeenCalledWith(mockStrategy.id);
  });

  it('should update signals when receiving new signal WebSocket message', fakeAsync(() => {
    // Arrange
    component.signals = [];
    component.ngOnChanges({
      strategy: {
        currentValue: mockStrategy,
        previousValue: null,
        firstChange: true,
        isFirstChange: () => true
      }
    });

    // Act: Simulate receiving a WebSocket signal message
    const mockSignal = {
      type: 'strategy_signal',
      timestamp: new Date().toISOString(),
      data: {
        strategyId: mockStrategy.id,
        symbol: 'AAPL',
        action: 'BUY',
        price: 150.5,
        quantity: 10,
        confidence: 0.85
      }
    };

    mockStrategySubject.next(mockSignal);
    tick();

    // Assert
    expect(component.signals.length).toBeGreaterThan(0);
    expect(component.signals[0].symbol).toBe('AAPL');
    expect(component.signals[0].action).toBe('BUY');
  }));

  it('should update trades when receiving new trade WebSocket message', fakeAsync(() => {
    // Arrange
    component.trades = [];
    component.ngOnChanges({
      strategy: {
        currentValue: mockStrategy,
        previousValue: null,
        firstChange: true,
        isFirstChange: () => true
      }
    });

    // Act: Simulate receiving a WebSocket trade message
    const mockTrade = {
      type: 'strategy_trade',
      timestamp: new Date().toISOString(),
      data: {
        strategyId: mockStrategy.id,
        symbol: 'MSFT',
        entryPrice: 290.50,
        entryTime: new Date().toISOString(),
        exitPrice: 295.25,
        exitTime: new Date().toISOString(),
        quantity: 5,
        pnl: 23.75,
        pnlPercent: 1.63
      }
    };

    mockStrategySubject.next(mockTrade);
    tick();

    // Assert
    expect(component.trades.length).toBeGreaterThan(0);
    expect(component.trades[0].symbol).toBe('MSFT');
    expect(component.trades[0].pnl).toBeCloseTo(23.75);
  }));

  it('should update strategy status when receiving status update message', fakeAsync(() => {
    // Arrange
    component.strategy = { ...mockStrategy, status: 'INACTIVE' };
    component.ngOnChanges({
      strategy: {
        currentValue: component.strategy,
        previousValue: null,
        firstChange: true,
        isFirstChange: () => true
      }
    });

    // Act: Simulate receiving a WebSocket status update message
    const mockStatusUpdate = {
      type: 'strategy_update',
      timestamp: new Date().toISOString(),
      data: {
        strategyId: mockStrategy.id,
        status: 'ACTIVE'
      }
    };

    mockStrategySubject.next(mockStatusUpdate);
    tick();

    // Assert
    expect(component.strategy!.status).toBe('ACTIVE');
  }));

  it('should call startStrategy service method when activateStrategy is called', () => {
    // Arrange & Act
    component.activateStrategy();

    // Assert
    expect(strategyServiceSpy.startStrategy).toHaveBeenCalledWith(mockStrategy.id);
    expect(component.strategy!.status).toBe('ACTIVE');
  });

  it('should call pauseStrategy service method when pauseStrategy is called', () => {
    // Arrange & Act
    component.pauseStrategy();

    // Assert
    expect(strategyServiceSpy.pauseStrategy).toHaveBeenCalledWith(mockStrategy.id);
    expect(component.strategy!.status).toBe('PAUSED');
  });

  it('should call stopStrategy service method when stopStrategy is called', () => {
    // Arrange & Act
    component.stopStrategy();

    // Assert
    expect(strategyServiceSpy.stopStrategy).toHaveBeenCalledWith(mockStrategy.id);
    expect(component.strategy!.status).toBe('INACTIVE');
  });
});

87 docs/enhanced-architecture.md Normal file

@@ -0,0 +1,87 @@
# Enhanced Separation of Concerns

This guide describes how the project architecture has been improved to better separate concerns through a modular libs structure.

## New Library Structure

We've reorganized the project's shared libraries for improved maintainability:

### 1. Shared Types (`@stock-bot/shared-types`)

Types are now organized by domain:

```
libs/shared-types/
├── src/
│   ├── market/      # Market data types (OHLCV, OrderBook)
│   ├── trading/     # Trading types (Orders, Positions)
│   ├── strategy/    # Strategy and signal types
│   ├── events/      # Event definitions
│   ├── api/         # API response/request types
│   └── config/      # Configuration types
```

### 2. Event Bus (`@stock-bot/event-bus`)

A consistent event publishing system:

```
libs/event-bus/
├── src/
│   ├── EventBus.ts  # Core event bus implementation
│   └── index.ts     # Public API
```
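The real implementation lives in `libs/event-bus/src/EventBus.ts`; as a rough sketch of the publish/subscribe pattern it provides (the `publish`/`subscribe` method names here are illustrative assumptions, not the library's actual API):

```typescript
// Minimal typed event-bus sketch; method names are assumptions for illustration.
type Handler<T> = (payload: T) => void;

class EventBus {
  private handlers = new Map<string, Set<Handler<any>>>();

  subscribe<T>(topic: string, handler: Handler<T>): () => void {
    if (!this.handlers.has(topic)) this.handlers.set(topic, new Set());
    this.handlers.get(topic)!.add(handler);
    // Return an unsubscribe function so callers can clean up.
    return () => { this.handlers.get(topic)?.delete(handler); };
  }

  publish<T>(topic: string, payload: T): void {
    this.handlers.get(topic)?.forEach(h => h(payload));
  }
}

// Usage: a strategy service publishes a signal, a consumer listens.
const bus = new EventBus();
const received: string[] = [];
const unsubscribe = bus.subscribe<{ symbol: string }>('strategy_signal',
  s => received.push(s.symbol));
bus.publish('strategy_signal', { symbol: 'AAPL' });
unsubscribe();
bus.publish('strategy_signal', { symbol: 'MSFT' }); // no longer delivered
console.log(received); // → ['AAPL']
```

Returning an unsubscribe closure (rather than a separate `off` method) keeps cleanup symmetric with subscription, which matters for long-lived services.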
### 3. Utils (`@stock-bot/utils`)

Shared utility functions:

```
libs/utils/
├── src/
│   ├── dateUtils.ts       # Date manipulation helpers
│   ├── financialUtils.ts  # Financial calculations
│   ├── logger.ts          # Standardized logging
│   └── index.ts           # Public API
```
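As an idea of what `financialUtils.ts` centralizes, here are sketches of two metrics the dashboard displays; these exact signatures are assumptions, not the library's actual exports:

```typescript
// Sharpe ratio over a series of per-period returns (assumed signature).
function sharpeRatio(returns: number[], riskFreeRate = 0): number {
  const mean = returns.reduce((s, r) => s + r, 0) / returns.length;
  const variance = returns.reduce((s, r) => s + (r - mean) ** 2, 0) / returns.length;
  const std = Math.sqrt(variance);
  return std === 0 ? 0 : (mean - riskFreeRate) / std;
}

// Maximum peak-to-trough drawdown over an equity curve, as a fraction of the peak.
function maxDrawdown(equityCurve: number[]): number {
  let peak = -Infinity;
  let maxDd = 0;
  for (const v of equityCurve) {
    peak = Math.max(peak, v);
    maxDd = Math.max(maxDd, (peak - v) / peak);
  }
  return maxDd;
}

console.log(maxDrawdown([100, 120, 90, 110])); // → 0.25 (peak 120, trough 90)
```

Centralizing these keeps the dashboard, backtest engine, and strategy services computing identical numbers from the same definitions.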
### 4. API Client (`@stock-bot/api-client`)

Type-safe service clients:

```
libs/api-client/
├── src/
│   ├── BaseApiClient.ts   # Common HTTP client logic
│   ├── BacktestClient.ts  # Backtest service client
│   ├── StrategyClient.ts  # Strategy service client
│   └── index.ts           # Public API
```
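The shape of this pattern can be sketched as a base client that service-specific clients extend; the class and method names below are illustrative assumptions (the real API is in `libs/api-client`), and the injected `transport` stands in for HTTP so the sketch runs without a server:

```typescript
// Envelope matching the ApiResponse shape used across the services.
interface ApiResponse<T> {
  success: boolean;
  data: T;
  error?: string;
}

// Hypothetical transport abstraction; injecting it keeps clients testable.
type Transport = (url: string) => Promise<ApiResponse<unknown>>;

class BaseApiClient {
  constructor(private baseUrl: string, private transport: Transport) {}

  get<T>(path: string): Promise<ApiResponse<T>> {
    return this.transport(`${this.baseUrl}${path}`) as Promise<ApiResponse<T>>;
  }
}

// A service-specific client adds typed endpoint methods on top.
class StrategyClient extends BaseApiClient {
  getStrategies(): Promise<ApiResponse<Array<{ id: string; name: string }>>> {
    return this.get('/strategies');
  }
}

// Usage with a fake transport standing in for HTTP.
const fakeTransport: Transport = async url => ({
  success: true,
  data: url.endsWith('/strategies') ? [{ id: 's1', name: 'MA Crossover' }] : []
});

const client = new StrategyClient('/api', fakeTransport);
client.getStrategies().then(r => console.log(r.data.length)); // → 1
```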
## Service Architecture Improvements

The intelligence services have been split into focused services:

1. `strategy-orchestrator`: Core strategy management
2. `backtest-engine`: Dedicated historical testing
3. `signal-engine`: Signal generation

This provides:
- Better scaling for resource-intensive operations
- Focused codebases for each concern
- Independent deployment cycles
- Clear service boundaries

## Usage Guidelines

- Use the shared types library for all data models
- Use the event bus for inter-service communication
- Use the API clients for direct service calls
- Use utility functions instead of duplicating common code

## Next Steps

1. Continue migrating services to use the new libraries
2. Add comprehensive tests for each library
3. Create a complete API gateway for external access
4. Document service boundaries with OpenAPI schemas
Some files were not shown because too many files have changed in this diff.