work on market-data-gateway

This commit is contained in:
Bojan Kucera 2025-06-03 09:57:11 -04:00
parent 405b818c86
commit b957fb99aa
87 changed files with 7979 additions and 99 deletions

# Core Services
Core services provide fundamental infrastructure and foundational capabilities for the stock trading platform.
## Services
### Market Data Gateway
- **Purpose**: Real-time market data processing and distribution
- **Key Functions**:
- WebSocket streaming for live market data
- Multi-source data aggregation (Alpaca, Yahoo Finance, etc.)
- Data caching and normalization
- Rate limiting and connection management
- Error handling and reconnection logic
### Risk Guardian
- **Purpose**: Real-time risk monitoring and controls
- **Key Functions**:
- Position monitoring and risk threshold enforcement
- Portfolio risk assessment and alerting
- Real-time risk metric calculations
- Automated risk controls and circuit breakers
- Risk reporting and compliance monitoring
## Architecture
Core services form the backbone of the trading platform, providing the data flow and risk management capabilities that every other service depends on. They handle the platform's most critical, time-sensitive operations and therefore demand the highest reliability and performance.

# Market Data Gateway
## Overview
The Market Data Gateway (MDG) service serves as the central hub for real-time market data processing and distribution within the stock-bot platform. It acts as the intermediary between external market data providers and internal platform services, ensuring consistent, normalized, and reliable market data delivery.
## Key Features
### Real-time Data Processing
- **WebSocket Streaming**: Provides low-latency data streams for market updates
- **Multi-source Aggregation**: Integrates data from multiple providers (Alpaca, Yahoo Finance, etc.)
- **Normalized Data Model**: Transforms varied provider formats into a unified platform data model
- **Subscription Management**: Allows services to subscribe to specific data streams
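The normalized model and subscription mechanism above can be sketched as follows. This is a minimal illustration, not the service's actual code: the `NormalizedTick` shape, the `SubscriptionManager` class, and the source names are assumptions for the sketch.

```typescript
// Hypothetical unified tick model: provider-specific payloads are
// converted to this shape on ingest so downstream consumers see one format.
interface NormalizedTick {
  symbol: string;
  price: number;
  size: number;
  timestamp: number; // epoch milliseconds, converted from provider timestamps
  source: "alpaca" | "yahoo";
}

type TickHandler = (tick: NormalizedTick) => void;

// Minimal subscription manager: services register per-symbol handlers
// and receive every normalized tick published for that symbol.
class SubscriptionManager {
  private subscribers = new Map<string, Set<TickHandler>>();

  subscribe(symbol: string, handler: TickHandler): () => void {
    if (!this.subscribers.has(symbol)) this.subscribers.set(symbol, new Set());
    this.subscribers.get(symbol)!.add(handler);
    // Return an unsubscribe function so consumers can clean up.
    return () => this.subscribers.get(symbol)?.delete(handler);
  }

  publish(tick: NormalizedTick): void {
    this.subscribers.get(tick.symbol)?.forEach((h) => h(tick));
  }
}
```

Returning an unsubscribe closure keeps the consumer API small and avoids handler identity bookkeeping on the caller's side.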
### Data Quality Management
- **Validation & Sanitization**: Ensures data integrity through validation rules
- **Anomaly Detection**: Identifies unusual price movements or data issues
- **Gap Filling**: Interpolation strategies for missing data points
- **Data Reconciliation**: Cross-validates data from multiple sources
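One simple form the anomaly detection above can take is a tick-over-tick jump check. The function name and the 10% default threshold are illustrative assumptions, not the service's configuration:

```typescript
// Illustrative anomaly check: flag a tick whose price moves more than a
// configurable fraction from the previous tick for the same symbol.
const lastPrice = new Map<string, number>();

function isAnomalous(symbol: string, price: number, maxJump = 0.1): boolean {
  const prev = lastPrice.get(symbol);
  lastPrice.set(symbol, price);
  if (prev === undefined || prev === 0) return false; // no baseline yet
  return Math.abs(price - prev) / prev > maxJump;
}
```

A production check would typically combine this with source cross-validation and volume-aware thresholds, but the relative-jump test is the common first filter.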
### Performance Optimization
- **Caching Layer**: In-memory cache for frequently accessed data
- **Rate Limiting**: Protects against API quota exhaustion
- **Connection Pooling**: Efficiently manages provider connections
- **Compression**: Minimizes data transfer size for bandwidth efficiency
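The rate limiting described above is commonly implemented as a token bucket. This is a minimal sketch with an injectable clock for testability; capacity and refill rates here are placeholders, not actual provider quotas:

```typescript
// Token-bucket limiter: requests spend tokens, tokens refill at a fixed
// rate, and requests are rejected when the bucket is empty.
class TokenBucket {
  private tokens: number;
  private lastRefill: number;

  constructor(
    private capacity: number,
    private refillPerSecond: number,
    now: number = Date.now(),
  ) {
    this.tokens = capacity;
    this.lastRefill = now;
  }

  tryAcquire(now: number = Date.now()): boolean {
    // Refill proportionally to elapsed time, capped at capacity.
    const elapsedSec = (now - this.lastRefill) / 1000;
    this.tokens = Math.min(this.capacity, this.tokens + elapsedSec * this.refillPerSecond);
    this.lastRefill = now;
    if (this.tokens >= 1) {
      this.tokens -= 1;
      return true;
    }
    return false;
  }
}
```

Callers that receive `false` can queue the request or back off, which is how API quota exhaustion is avoided without dropping data outright.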
### Operational Resilience
- **Automatic Reconnection**: Handles provider disconnections gracefully
- **Circuit Breaking**: Prevents cascade failures during outages
- **Failover Mechanisms**: Switches to alternative data sources when primary sources fail
- **Health Monitoring**: Self-reports service health metrics
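Automatic reconnection is usually scheduled with capped exponential backoff. A sketch of the delay schedule, with placeholder base and cap values rather than the service's real configuration:

```typescript
// Delay before reconnect attempt `attempt` (0-based): doubles each
// attempt, capped so repeated failures never wait unboundedly long.
function backoffDelayMs(attempt: number, baseMs = 500, capMs = 30_000): number {
  return Math.min(capMs, baseMs * 2 ** attempt);
}
```

In practice a small random jitter is often added to each delay so that many clients disconnected by the same outage do not all reconnect at the same instant.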
## Integration Points
### Upstream Connections
- Alpaca Markets API (primary data source)
- Yahoo Finance API (secondary data source)
- Potential future integrations with IEX, Polygon, etc.
### Downstream Consumers
- Strategy Orchestrator
- Risk Guardian
- Trading Dashboard
- Data Persistence Layer
## Technical Implementation
### Technology Stack
- **Runtime**: Node.js with TypeScript
- **Messaging**: WebSockets for real-time streaming
- **Caching**: Redis for shared cache
- **Metrics**: Prometheus metrics for monitoring
- **Configuration**: Environment-based with runtime updates
### Architecture Pattern
- Event-driven microservice with publisher-subscriber model
- Horizontally scalable to handle increased data volumes
- Stateless design with external state management
## Development Guidelines
### Error Handling
- Detailed error classification and handling strategy
- Graceful degradation during partial outages
- Comprehensive error logging with context
### Testing Strategy
- Unit tests for data transformation logic
- Integration tests with mock data providers
- Performance tests for throughput capacity
- Chaos testing for resilience verification
### Observability
- Detailed logs for troubleshooting
- Performance metrics for optimization
- Health checks for system monitoring
- Tracing for request flow analysis
## Future Enhancements
- Support for options and derivatives data
- Real-time news and sentiment integration
- Machine learning-based data quality improvements
- Enhanced historical data query capabilities

# Risk Guardian
## Overview
The Risk Guardian service provides real-time risk monitoring and control mechanisms for the stock-bot platform. It serves as the protective layer that ensures trading activities remain within defined risk parameters, safeguarding the platform and its users from excessive market exposure and potential losses.
## Key Features
### Real-time Risk Monitoring
- **Position Tracking**: Continuously monitors all open positions
- **Risk Metric Calculation**: Calculates key risk metrics (VaR, volatility, exposure)
- **Threshold Management**: Configurable risk thresholds with multiple severity levels
- **Aggregated Risk Views**: Risk assessment at portfolio, strategy, and position levels
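Two of the metrics named above, sample volatility and historical VaR, can be sketched as below. The formulas are standard textbook estimators; the actual service may use different models or parameters:

```typescript
// Sample standard deviation of a return series (Bessel-corrected).
function volatility(returns: number[]): number {
  const n = returns.length;
  const mean = returns.reduce((a, b) => a + b, 0) / n;
  const variance = returns.reduce((a, r) => a + (r - mean) ** 2, 0) / (n - 1);
  return Math.sqrt(variance);
}

// Historical VaR: the loss at the (1 - confidence) empirical quantile
// of the return distribution, reported as a positive number.
function historicalVaR(returns: number[], confidence = 0.95): number {
  const sorted = [...returns].sort((a, b) => a - b);
  const idx = Math.floor((1 - confidence) * sorted.length);
  return -sorted[idx];
}
```

Historical VaR needs a reasonably long return window to be meaningful at high confidence levels, which is one reason the service pairs these calculations with a time-series database.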
### Automated Risk Controls
- **Pre-trade Validation**: Validates orders against risk limits before execution
- **Circuit Breakers**: Automatically halts trading when thresholds are breached
- **Position Liquidation**: Controlled unwinding of positions when necessary
- **Trading Restrictions**: Enforces instrument, size, and frequency restrictions
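The pre-trade validation step might look like the sketch below: an order is rejected if it would push the symbol's position or the portfolio's gross exposure past configured limits. All names and limit fields here are hypothetical illustrations, not the service's real schema:

```typescript
interface RiskLimits {
  maxPositionValue: number; // per-symbol cap, in dollars
  maxGrossExposure: number; // whole-portfolio cap, in dollars
}

interface OrderCheck {
  symbol: string;
  notional: number; // order value = quantity * price
}

// Returns ok=true if the order fits within both limits, otherwise a
// machine-readable rejection reason for logging and alerting.
function validateOrder(
  order: OrderCheck,
  positionValue: number, // current value held in order.symbol
  grossExposure: number, // current total absolute exposure
  limits: RiskLimits,
): { ok: boolean; reason?: string } {
  if (positionValue + order.notional > limits.maxPositionValue) {
    return { ok: false, reason: "per-symbol position limit exceeded" };
  }
  if (grossExposure + order.notional > limits.maxGrossExposure) {
    return { ok: false, reason: "gross exposure limit exceeded" };
  }
  return { ok: true };
}
```

Returning a structured reason rather than throwing keeps rejections on the hot path cheap and makes the audit trail straightforward to populate.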
### Risk Alerting
- **Real-time Notifications**: Immediate alerts for threshold breaches
- **Escalation Paths**: Multi-level alerting based on severity
- **Alert History**: Maintains historical record of all risk events
- **Custom Alert Rules**: Configurable alerting conditions and criteria
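The configurable alert rules above can be modeled as threshold checks over a metrics snapshot. The rule shape and severity names below are assumptions for the sketch:

```typescript
type Severity = "info" | "warning" | "critical";

interface AlertRule {
  metric: string; // e.g. "portfolio_var" (illustrative metric name)
  threshold: number;
  severity: Severity;
}

// Returns every rule the current metric snapshot breaches; the caller
// routes each result to the appropriate escalation path by severity.
function evaluateRules(
  metrics: Record<string, number>,
  rules: AlertRule[],
): AlertRule[] {
  return rules.filter(
    (r) => metrics[r.metric] !== undefined && metrics[r.metric] > r.threshold,
  );
}
```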
### Compliance Management
- **Regulatory Reporting**: Generates the data and records needed for regulatory reports
- **Audit Trails**: Comprehensive logging of risk-related decisions
- **Rule-based Controls**: Implements compliance-driven trading restrictions
- **Documentation**: Maintains evidence of risk control effectiveness
## Integration Points
### Upstream Connections
- Market Data Gateway (for price data)
- Strategy Orchestrator (for active strategies)
- Order Management System (for position tracking)
### Downstream Consumers
- Trading Dashboard (for risk visualization)
- Strategy Orchestrator (for trading restrictions)
- Notification Service (for alerting)
## Technical Implementation
### Technology Stack
- **Runtime**: Node.js with TypeScript
- **Database**: Time-series database for risk metrics
- **Messaging**: Event-driven architecture with message bus
- **Math Libraries**: Specialized libraries for risk calculations
- **Caching**: In-memory risk state management
### Architecture Pattern
- Reactive microservice with event sourcing
- Command Query Responsibility Segregation (CQRS)
- Rule engine for risk evaluation
- Stateful service with persistence
## Development Guidelines
### Risk Calculation Approach
- Clear documentation of all risk formulas
- Validation against industry standard calculations
- Performance optimization for real-time processing
- Regular backtesting of risk models
### Testing Strategy
- Unit tests for risk calculation logic
- Scenario-based testing for specific market conditions
- Stress testing with extreme market movements
- Performance testing for high-frequency updates
### Calibration Process
- Documented process for risk model calibration
- Historical data validation
- Parameter sensitivity analysis
- Regular recalibration schedule
## Future Enhancements
- Machine learning for anomaly detection
- Scenario analysis and stress testing
- Custom risk models per strategy type
- Enhanced visualization of risk exposures
- Factor-based risk decomposition