removed some old stuff

This commit is contained in:
Boki 2025-06-20 13:40:44 -04:00
parent 05974cd602
commit c2420a34f1
9 changed files with 0 additions and 2197 deletions

# 📋 Stock Bot Development Roadmap
*Last Updated: June 2025*
## 🎯 Overview
This document outlines the development plan for the Stock Bot platform, focusing on building a robust data pipeline from market data providers through processing layers to trading execution. The plan emphasizes establishing solid foundational layers before adding advanced features.
## 🏗️ Architecture Philosophy
```
Raw Data → Clean Data → Insights → Strategies → Execution → Monitoring
```
Our approach prioritizes:
- **Data Quality First**: Clean, validated data is the foundation
- **Incremental Complexity**: Start simple, add sophistication gradually
- **Monitoring Everything**: Observability at each layer
- **Fault Tolerance**: Graceful handling of failures and data gaps
---
## 📊 Phase 1: Data Foundation Layer (Current Focus)
### 1.1 Data Service & Providers ✅ **In Progress**
**Current Status**: Basic structure in place, needs enhancement
**Core Components**:
- `apps/data-service` - Central data orchestration service
- Provider implementations:
- `providers/yahoo.provider.ts` ✅ Basic implementation
- `providers/quotemedia.provider.ts` ✅ Basic implementation
- `providers/proxy.provider.ts` ✅ Proxy/fallback logic
**Immediate Tasks**:
1. **Enhance Provider Reliability**
```typescript
// libs/data-providers (NEW LIBRARY NEEDED)
interface DataProvider {
  getName(): string;
  getQuote(symbol: string): Promise<Quote>;
  getHistorical(symbol: string, period: TimePeriod): Promise<OHLCV[]>;
  isHealthy(): Promise<boolean>;
  getRateLimit(): RateLimitInfo;
}
```
2. **Add Rate Limiting & Circuit Breakers**
- Implement in `libs/http` client
- Add provider-specific rate limits
- Circuit breaker pattern for failed providers
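A minimal sketch of the circuit-breaker pattern described above, assuming nothing about the eventual `libs/http` API — the class and method names here are illustrative only:

```typescript
type BreakerState = 'closed' | 'open' | 'half-open';

// Illustrative circuit breaker: after N consecutive failures, stop calling the
// provider for a cooldown period, then allow a single probe request.
class CircuitBreaker {
  private state: BreakerState = 'closed';
  private failures = 0;
  private openedAt = 0;

  constructor(
    private failureThreshold = 5,
    private resetTimeoutMs = 30_000,
  ) {}

  async call<T>(fn: () => Promise<T>): Promise<T> {
    if (this.state === 'open') {
      if (Date.now() - this.openedAt < this.resetTimeoutMs) {
        throw new Error('Circuit open: provider temporarily disabled');
      }
      this.state = 'half-open'; // allow one probe request through
    }
    try {
      const result = await fn();
      this.state = 'closed'; // success resets the breaker
      this.failures = 0;
      return result;
    } catch (err) {
      this.failures += 1;
      if (this.failures >= this.failureThreshold) {
        this.state = 'open';
        this.openedAt = Date.now();
      }
      throw err;
    }
  }
}
```

In practice the breaker would wrap each provider's HTTP client so a flaky provider is skipped by the registry's health-based routing.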
3. **Data Validation Layer**
```typescript
// libs/data-validation (NEW LIBRARY NEEDED)
- Price reasonableness checks
- Volume validation
- Timestamp validation
- Missing data detection
```
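To make the checks above concrete, here is one possible shape for a per-bar validator — the `Bar` type, function name, and the 20% spike threshold are assumptions, not the actual `libs/data-validation` design:

```typescript
interface Bar {
  timestamp: number; // epoch ms
  open: number;
  high: number;
  low: number;
  close: number;
  volume: number;
}

// Returns a list of human-readable issues; empty array means the bar passed.
function validateBar(bar: Bar, prevClose?: number): string[] {
  const issues: string[] = [];
  // Price reasonableness: all prices positive, high/low must bracket open/close
  if ([bar.open, bar.high, bar.low, bar.close].some(p => !(p > 0))) issues.push('non-positive price');
  if (bar.high < Math.max(bar.open, bar.close) || bar.low > Math.min(bar.open, bar.close)) {
    issues.push('high/low inconsistent with open/close');
  }
  // Volume validation
  if (!Number.isFinite(bar.volume) || bar.volume < 0) issues.push('invalid volume');
  // Timestamp validation: no bars from the future
  if (bar.timestamp > Date.now()) issues.push('future timestamp');
  // Spike detection vs. previous close (threshold is arbitrary here)
  if (prevClose && Math.abs(bar.close - prevClose) / prevClose > 0.2) issues.push('price spike >20%');
  return issues;
}
```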
4. **Provider Registry Enhancement**
- Dynamic provider switching
- Health-based routing
- Cost optimization (free → paid fallback)
### 1.2 Raw Data Storage
**Storage Strategy**:
- **QuestDB**: Real-time market data (OHLCV, quotes)
- **MongoDB**: Provider responses, metadata, configurations
- **PostgreSQL**: Processed/clean data, trading records
**Schema Design**:
```sql
-- QuestDB Time-Series Tables
raw_quotes (timestamp, symbol, provider, bid, ask, last, volume)
raw_ohlcv (timestamp, symbol, provider, open, high, low, close, volume)
provider_health (timestamp, provider, latency, success_rate, error_rate)
-- MongoDB Collections
provider_responses: { provider, symbol, timestamp, raw_response, status }
data_quality_metrics: { symbol, date, completeness, accuracy, issues[] }
```
**Immediate Implementation**:
1. Enhance `libs/questdb-client` with streaming inserts
2. Add data retention policies
3. Implement data compression strategies
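As a concrete illustration of streaming inserts: QuestDB accepts rows over the InfluxDB line protocol (`table,tags fields timestamp`). A hedged sketch of serializing a `raw_quotes` row from the schema above — the helper and row type are illustrative, not part of `libs/questdb-client`:

```typescript
interface QuoteRow {
  symbol: string;
  provider: string;
  bid: number;
  ask: number;
  last: number;
  volume: number;
  timestampNs: bigint; // QuestDB ILP timestamps are nanoseconds
}

// Builds one influx-line-protocol row: table,tag1=v1,tag2=v2 field1=v1,... timestamp
// Integer fields carry an `i` suffix in ILP.
function toIlpLine(r: QuoteRow): string {
  return `raw_quotes,symbol=${r.symbol},provider=${r.provider} ` +
    `bid=${r.bid},ask=${r.ask},last=${r.last},volume=${r.volume}i ${r.timestampNs}`;
}
```

A streaming client would batch many such lines and flush them over a single socket, which is where the batching and connection-pooling items below come in.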
---
## 🧹 Phase 2: Data Processing & Quality Layer
### 2.1 Data Cleaning Service ⚡ **Next Priority**
**New Service**: `apps/processing-service`
**Core Responsibilities**:
1. **Data Normalization**
- Standardize timestamps (UTC)
- Normalize price formats
- Handle split/dividend adjustments
2. **Quality Checks**
- Outlier detection (price spikes, volume anomalies)
- Gap filling strategies
- Cross-provider validation
3. **Data Enrichment**
- Calculate derived metrics (returns, volatility)
- Add technical indicators
- Market session classification
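One way to implement the outlier-detection responsibility above is a rolling z-score check — a sketch under assumed parameters (window of 20, threshold of 3σ), not necessarily what the processing service will ship:

```typescript
// Flags indices whose value deviates more than `threshold` standard deviations
// from the mean of the preceding `window` values.
function zScoreOutliers(values: number[], window = 20, threshold = 3): number[] {
  const flagged: number[] = [];
  for (let i = window; i < values.length; i++) {
    const slice = values.slice(i - window, i);
    const mean = slice.reduce((a, b) => a + b, 0) / window;
    const variance = slice.reduce((a, b) => a + (b - mean) ** 2, 0) / window;
    const std = Math.sqrt(variance);
    // Guard against flat windows (std === 0)
    if (std > 0 && Math.abs(values[i] - mean) / std > threshold) flagged.push(i);
  }
  return flagged;
}
```

Flagged points would then feed the gap-filling and cross-provider validation steps rather than being silently dropped.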
**Library Enhancements Needed**:
```typescript
// libs/data-frame (ENHANCE EXISTING)
class MarketDataFrame {
  // Add time-series specific operations
  fillGaps(strategy: GapFillStrategy): MarketDataFrame;
  detectOutliers(method: OutlierMethod): OutlierReport;
  normalize(): MarketDataFrame;
  calculateReturns(period: number): MarketDataFrame;
}

// libs/data-quality (NEW LIBRARY)
interface QualityMetrics {
  completeness: number;
  accuracy: number;
  timeliness: number;
  consistency: number;
  issues: QualityIssue[];
}
```
### 2.2 Technical Indicators Library
**Enhance**: `libs/strategy-engine` or create `libs/technical-indicators`
**Initial Indicators**:
- Moving averages (SMA, EMA, VWAP)
- Momentum (RSI, MACD, Stochastic)
- Volatility (Bollinger Bands, ATR)
- Volume (OBV, Volume Profile)
```typescript
// Implementation approach
interface TechnicalIndicator<T = number> {
  name: string;
  calculate(data: OHLCV[]): T[];
  getSignal(current: T, previous: T[]): Signal;
}
```
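A self-contained SMA shaped like the interface above; the local `OHLCV` and `Signal` types and the trivial crossing rule are illustrative assumptions:

```typescript
interface OHLCV { open: number; high: number; low: number; close: number; volume: number; }
type Signal = 'buy' | 'sell' | 'hold';

// Simple moving average over closes; emits NaN until the window is full.
class SMA {
  name: string;
  constructor(private period: number) { this.name = `SMA(${period})`; }

  calculate(data: OHLCV[]): number[] {
    const out: number[] = [];
    let sum = 0;
    for (let i = 0; i < data.length; i++) {
      sum += data[i].close;
      if (i >= this.period) sum -= data[i - this.period].close; // slide the window
      out.push(i >= this.period - 1 ? sum / this.period : NaN);
    }
    return out;
  }

  // Trivial illustrative rule: signal on price vs. its latest average.
  getSignal(current: number, previous: number[]): Signal {
    const last = previous[previous.length - 1];
    if (Number.isNaN(current) || Number.isNaN(last)) return 'hold';
    return current > last ? 'buy' : current < last ? 'sell' : 'hold';
  }
}
```

The running-sum form keeps `calculate` at O(n) regardless of window size, which matters once indicators run on every bar for hundreds of symbols.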
---
## 🧠 Phase 3: Analytics & Strategy Layer
### 3.1 Strategy Engine Enhancement
**Current**: Basic structure exists in `libs/strategy-engine`
**Enhancements Needed**:
1. **Strategy Framework**
```typescript
abstract class TradingStrategy {
  abstract analyze(data: MarketData): StrategySignal[];
  abstract getRiskParams(): RiskParameters;
  backtest(historicalData: MarketData[]): BacktestResults;
}
```
2. **Signal Generation**
- Entry/exit signals
- Position sizing recommendations
- Risk-adjusted scores
3. **Strategy Types to Implement**:
- Mean reversion
- Momentum/trend following
- Statistical arbitrage
- Volume-based strategies
### 3.2 Backtesting Engine
**Enhance**: `apps/strategy-service`
**Features**:
- Historical simulation
- Performance metrics calculation
- Risk analysis
- Strategy comparison
---
## ⚡ Phase 4: Execution Layer
### 4.1 Portfolio Management
**Enhance**: `apps/portfolio-service`
**Core Features**:
- Position tracking
- Risk monitoring
- P&L calculation
- Margin management
### 4.2 Order Management
**New Service**: `apps/order-service`
**Responsibilities**:
- Order validation
- Execution routing
- Fill reporting
- Trade reconciliation
### 4.3 Risk Management
**New Library**: `libs/risk-engine`
**Risk Controls**:
- Position limits
- Drawdown limits
- Correlation limits
- Volatility scaling
---
## 📚 Library Improvements Roadmap
### Immediate (Phase 1-2)
1. **`libs/http`** ✅ **Current Priority**
- [ ] Rate limiting middleware
- [ ] Circuit breaker pattern
- [ ] Request/response caching
- [ ] Retry strategies with exponential backoff
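A sketch of the retry-with-exponential-backoff item — `withRetry`, its parameters, and the jitter amount are assumptions for illustration, not the real `libs/http` API:

```typescript
// Retries an async operation with exponentially growing delays plus jitter.
async function withRetry<T>(
  fn: () => Promise<T>,
  maxAttempts = 4,
  baseDelayMs = 200,
): Promise<T> {
  let lastError: unknown;
  for (let attempt = 0; attempt < maxAttempts; attempt++) {
    try {
      return await fn();
    } catch (err) {
      lastError = err;
      if (attempt === maxAttempts - 1) break;
      // 200ms, 400ms, 800ms, ... plus up to 100ms of random jitter
      const delay = baseDelayMs * 2 ** attempt + Math.random() * 100;
      await new Promise(res => setTimeout(res, delay));
    }
  }
  throw lastError;
}
```

Jitter prevents many callers from retrying in lock-step after a provider outage, which would otherwise re-trigger the rate limiter.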
2. **`libs/questdb-client`**
- [ ] Streaming insert optimization
- [ ] Batch insert operations
- [ ] Connection pooling
- [ ] Query result caching
3. **`libs/logger`** ✅ **Recently Updated**
- [x] Migrated to `getLogger()` pattern
- [ ] Performance metrics logging
- [ ] Structured trading event logging
4. **`libs/data-frame`**
- [ ] Time-series operations
- [ ] Financial calculations
- [ ] Memory optimization for large datasets
### Medium Term (Phase 3)
5. **`libs/cache`**
- [ ] Market data caching strategies
- [ ] Cache warming for frequently accessed symbols
- [ ] Distributed caching support
6. **`libs/config`**
- [ ] Strategy-specific configurations
- [ ] Dynamic configuration updates
- [ ] Environment-specific overrides
### Long Term (Phase 4+)
7. **`libs/vector-engine`**
- [ ] Market similarity analysis
- [ ] Pattern recognition
- [ ] Correlation analysis
---
## 🎯 Immediate Next Steps (Next 2 Weeks)
### Week 1: Data Provider Hardening
1. **Enhance HTTP Client** (`libs/http`)
- Implement rate limiting
- Add circuit breaker pattern
- Add comprehensive error handling
2. **Provider Reliability** (`apps/data-service`)
- Add health checks for all providers
- Implement fallback logic
- Add provider performance monitoring
3. **Data Validation**
- Create `libs/data-validation`
- Implement basic price/volume validation
- Add data quality metrics
### Week 2: Processing Foundation
1. **Start Processing Service** (`apps/processing-service`)
- Basic data cleaning pipeline
- Outlier detection
- Gap filling strategies
2. **QuestDB Optimization** (`libs/questdb-client`)
- Implement streaming inserts
- Add batch operations
- Optimize for time-series data
3. **Technical Indicators**
- Start `libs/technical-indicators`
- Implement basic indicators (SMA, EMA, RSI)
---
## 📊 Success Metrics
### Phase 1 Completion Criteria
- [ ] 99.9% data provider uptime
- [ ] <500ms average data latency
- [ ] Zero data quality issues for major symbols
- [ ] All providers monitored and health-checked
### Phase 2 Completion Criteria
- [ ] Automated data quality scoring
- [ ] Gap-free historical data for 100+ symbols
- [ ] Real-time technical indicator calculation
- [ ] Processing latency <100ms
### Phase 3 Completion Criteria
- [ ] 5+ implemented trading strategies
- [ ] Comprehensive backtesting framework
- [ ] Performance analytics dashboard
---
## 🚨 Risk Mitigation
### Data Risks
- **Provider Failures**: Multi-provider fallback strategy
- **Data Quality**: Automated validation and alerting
- **Rate Limits**: Smart request distribution
### Technical Risks
- **Scalability**: Horizontal scaling design
- **Latency**: Optimize critical paths early
- **Data Loss**: Comprehensive backup strategies
### Operational Risks
- **Monitoring**: Full observability stack (Grafana, Loki, Prometheus)
- **Alerting**: Critical issue notifications
- **Documentation**: Keep architecture docs current
---
## 💡 Innovation Opportunities
### Machine Learning Integration
- Predictive models for data quality
- Anomaly detection in market data
- Strategy parameter optimization
### Real-time Processing
- Stream processing with Kafka/Pulsar
- Event-driven architecture
- WebSocket data feeds
### Advanced Analytics
- Market microstructure analysis
- Alternative data integration
- Cross-asset correlation analysis
---
*This roadmap is a living document that will evolve as we learn and adapt. Focus remains on building solid foundations before adding complexity.*
**Next Review**: End of June 2025

# 🚀 Trading Bot Docker Infrastructure Setup Complete!
Your Docker infrastructure has been successfully configured. Here's what you have:
## 📦 What's Included
### Core Services
- **🐉 Dragonfly**: Redis-compatible cache and event streaming (Port 6379)
- **🐘 PostgreSQL**: Operational database with complete trading schema (Port 5432)
- **📊 QuestDB**: Time-series database for market data (Ports 9000, 8812, 9009)
- **🍃 MongoDB**: Document storage for sentiment analysis and raw documents (Port 27017)
### Admin Tools
- **🔧 Redis Insight**: Dragonfly management GUI (Port 8001)
- **🛠️ PgAdmin**: PostgreSQL administration (Port 8080)
- **🍃 Mongo Express**: MongoDB document browser (Port 8081)
### Monitoring (Optional)
- **📈 Prometheus**: Metrics collection (Port 9090)
- **📊 Grafana**: Dashboards and alerting (Port 3000)
## 🏁 Getting Started
### Step 1: Start Docker Desktop
Make sure Docker Desktop is running on your Windows machine.
### Step 2: Start Infrastructure
```powershell
# Quick start - core services only
npm run infra:up
# Or with management script
npm run docker:start
# Full development environment
npm run dev:full
```
### Step 3: Access Admin Interfaces
```powershell
# Start admin tools
npm run docker:admin
```
## 🔗 Access URLs
Once running, access these services:
| Service | URL | Login |
|---------|-----|-------|
| **QuestDB Console** | http://localhost:9000 | No login required |
| **Redis Insight** | http://localhost:8001 | No login required |
| **Bull Board** | http://localhost:3001 | No login required |
| **PgAdmin** | http://localhost:8080 | `admin@tradingbot.local` / `admin123` |
| **Mongo Express** | http://localhost:8081 | `admin` / `admin123` |
| **Prometheus** | http://localhost:9090 | No login required |
| **Grafana** | http://localhost:3000 | `admin` / `admin123` |
## 📊 Database Connections
### From Your Trading Services
Update your `.env` file:
```env
# Dragonfly (Redis replacement)
DRAGONFLY_HOST=localhost
DRAGONFLY_PORT=6379
# PostgreSQL
POSTGRES_HOST=localhost
POSTGRES_PORT=5432
POSTGRES_DB=trading_bot
POSTGRES_USER=trading_user
POSTGRES_PASSWORD=trading_pass_dev
# QuestDB
QUESTDB_HOST=localhost
QUESTDB_PORT=8812
QUESTDB_DB=qdb
# MongoDB
MONGODB_HOST=localhost
MONGODB_PORT=27017
MONGODB_DB=trading_documents
MONGODB_USER=trading_admin
MONGODB_PASSWORD=trading_mongo_dev
```
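A service might load these variables with the defaults shown above; this sketch is illustrative only — the actual `libs/config` API may differ:

```typescript
interface PostgresConfig {
  host: string;
  port: number;
  database: string;
  user: string;
  password: string;
}

// Reads the POSTGRES_* variables, falling back to the dev defaults from .env above.
function loadPostgresConfig(env: Record<string, string | undefined> = process.env): PostgresConfig {
  return {
    host: env.POSTGRES_HOST ?? 'localhost',
    port: Number(env.POSTGRES_PORT ?? 5432),
    database: env.POSTGRES_DB ?? 'trading_bot',
    user: env.POSTGRES_USER ?? 'trading_user',
    password: env.POSTGRES_PASSWORD ?? 'trading_pass_dev',
  };
}
```

Centralizing the defaults here means a service started outside Docker Compose still connects to the local containers without extra setup.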
### Database Schema
PostgreSQL includes these pre-configured schemas:
- `trading.*` - Orders, positions, executions, accounts
- `strategy.*` - Strategies, signals, performance metrics
- `risk.*` - Risk limits, events, monitoring
- `audit.*` - System events, health checks, configuration
## 🛠️ Management Commands
```powershell
# Basic operations
npm run docker:start # Start core services
npm run docker:stop # Stop all services
npm run docker:status # Check service status
npm run docker:logs # View all logs
npm run docker:reset # Reset all data (destructive!)
# Additional services
npm run docker:admin # Start admin interfaces
npm run docker:monitoring # Start Prometheus & Grafana
# Development workflows
npm run dev:full # Infrastructure + admin + your services
npm run dev:clean # Reset + restart everything
# Direct PowerShell script access
./scripts/docker.ps1 start
./scripts/docker.ps1 logs -Service dragonfly
./scripts/docker.ps1 help
```
## ✅ Next Steps
1. **Start Docker Desktop** if not already running
2. **Run**: `npm run docker:start` to start core infrastructure
3. **Run**: `npm run docker:admin` to start admin tools
4. **Update** your environment variables to use the Docker services
5. **Test** Dragonfly connection in your EventPublisher service
6. **Verify** database schema in PgAdmin
7. **Start** your trading services with the new infrastructure
## 🎯 Ready for Integration
Your EventPublisher service is already configured to use Dragonfly. The infrastructure supports:
- ✅ **Event Streaming**: Dragonfly handles Redis Streams for real-time events
- ✅ **Caching**: High-performance caching with better memory efficiency
- ✅ **Operational Data**: PostgreSQL with complete trading schemas
- ✅ **Time-Series Data**: QuestDB for market data and analytics
- ✅ **Monitoring**: Full observability stack ready
- ✅ **Admin Tools**: Web-based management interfaces
The system is designed to scale from development to production with the same Docker configuration.
## 🔧 Troubleshooting
If you encounter issues:
```powershell
# Check Docker status
docker --version
docker-compose --version
# Verify services
npm run docker:status
# View specific service logs
./scripts/docker.ps1 logs -Service dragonfly
# Reset if needed
npm run docker:reset
```
**Happy Trading! 🚀📈**

# Stock Bot - System Architecture
> **Updated**: June 2025
## Overview
TypeScript microservices architecture for automated stock trading with real-time data processing and multi-database storage.
## Core Services
```
┌──────────────────┐     ┌──────────────────┐     ┌──────────────────┐
│   Data Service   │     │Processing Service│     │ Strategy Service │
│ • Market Data    │────▶│ • Indicators     │────▶│ • Strategies     │
│ • Providers      │     │ • Analytics      │     │ • Backtesting    │
│ • QuestDB        │     │ • Validation     │     │ • Signal Gen     │
└──────────────────┘     └──────────────────┘     └──────────────────┘
          │                        │                        │
          │              ┌──────────────────┐               │
          └─────────────▶│    Event Bus     │◀──────────────┘
                         │   (Dragonfly)    │
                         └──────────────────┘

┌──────────────────┐     ┌──────────────────┐     ┌──────────────────┐
│Execution Service │     │Portfolio Service │     │    Dashboard     │
│ • Order Mgmt     │     │ • Positions      │     │ • Angular UI     │
│ • Risk Control   │     │ • Risk Mgmt      │     │ • Real-time      │
│ • Execution      │     │ • Performance    │     │ • Analytics      │
└──────────────────┘     └──────────────────┘     └──────────────────┘
```
## Services Structure
```
stock-bot/
├── apps/
│ ├── data-service/ # Market data ingestion & storage
│ ├── execution-service/ # Order execution & broker integration
│ ├── portfolio-service/ # Position & risk management
│ ├── processing-service/ # Data processing & indicators
│ ├── strategy-service/ # Trading strategies & backtesting
│ └── dashboard/ # Angular UI (port 4200)
├── libs/ # Shared libraries
│ ├── logger/ # Centralized logging w/ Loki
│ ├── config/ # Configuration management
│ ├── event-bus/ # Event system
│ ├── mongodb-client/ # MongoDB operations
│ ├── postgres-client/ # PostgreSQL operations
│ ├── questdb-client/ # Time-series data
│ ├── http/ # HTTP client w/ proxy support
│ ├── cache/ # Caching layer
│ └── utils/ # Common utilities
└── database/ # Database configurations
├── mongodb/init/
└── postgres/init/
```
## Technology Stack
| Component | Technology | Purpose |
|-----------|------------|---------|
| **Runtime** | Bun | Fast JavaScript runtime |
| **Language** | TypeScript | Type-safe development |
| **Databases** | PostgreSQL, MongoDB, QuestDB | Multi-database architecture |
| **Caching** | Dragonfly (Redis) | Event bus & caching |
| **Frontend** | Angular 18 | Modern reactive UI |
| **Monitoring** | Prometheus, Grafana, Loki | Observability stack |
## Quick Start
```bash
# Install dependencies
bun install
# Start infrastructure
bun run infra:up
# Start services
bun run dev
# Access dashboard
# http://localhost:4200
```
## Key Features
- **Real-time Trading**: Live market data & order execution
- **Multi-Database**: PostgreSQL, MongoDB, QuestDB for different data types
- **Event-Driven**: Asynchronous communication via Dragonfly
- **Monitoring**: Full observability with metrics, logs, and tracing
- **Modular**: Shared libraries for common functionality
- **Type-Safe**: Full TypeScript coverage
## Detailed Services Structure
```
│ ├── processing-service/ # Combined processing & indicators
│ │ ├── src/
│ │ │ ├── indicators/ # Technical indicators (uses @stock-bot/utils)
│ │ │ ├── processors/ # Data processing pipeline
│ │ │ ├── vectorized/ # Vectorized calculations
│ │ │ ├── services/
│ │ │ └── index.ts
│ │ └── package.json
│ │
│ ├── strategy-service/ # Combined strategy & backtesting
│ │ ├── src/
│ │ │ ├── strategies/ # Strategy implementations
│ │ │ ├── backtesting/ # Multi-mode backtesting engine
│ │ │ │ ├── modes/ # Backtesting modes
│ │ │ │ │ ├── live-mode.ts # Live trading mode
│ │ │ │ │ ├── event-mode.ts # Event-driven backtest
│ │ │ │ │ └── vector-mode.ts # Vectorized backtest
│ │ │ │ ├── engines/ # Execution engines
│ │ │ │ │ ├── event-engine.ts # Event-based simulation
│ │ │ │ │ ├── vector-engine.ts # Vectorized calculations
│ │ │ │ │ └── hybrid-engine.ts # Combined validation
│ │ │ │ ├── simulator.ts # Market simulator
│ │ │ │ ├── runner.ts # Backtest orchestrator
│ │ │ │ └── metrics.ts # Performance analysis
│ │ │ ├── live/ # Live strategy execution
│ │ │ ├── framework/ # Strategy framework
│ │ │ │ ├── base-strategy.ts
│ │ │ │ ├── execution-mode.ts
│ │ │ │ └── mode-factory.ts
│ │ │ └── index.ts
│ │ └── package.json
│ │
│ ├── execution-service/ # Combined order execution & simulation
│ │ ├── src/
│ │ │ ├── brokers/ # Live broker adapters
│ │ │ ├── simulation/ # Simulated execution
│ │ │ ├── unified/ # Unified execution interface
│ │ │ │ ├── executor.ts # Abstract executor
│ │ │ │ ├── live-executor.ts
│ │ │ │ ├── sim-executor.ts
│ │ │ │ └── vector-executor.ts
│ │ │ └── index.ts
│ │ └── package.json
│ │
│ ├── portfolio-service/ # Combined portfolio & risk management
│ │ ├── src/
│ │ │ ├── portfolio/ # Portfolio tracking
│ │ │ ├── risk/ # Risk management (uses @stock-bot/utils)
│ │ │ ├── positions/ # Position management
│ │ │ ├── performance/ # Performance tracking
│ │ │ └── index.ts
│ │ └── package.json
│ │
│ └── dashboard/ # Combined API & reporting
│ ├── src/
│ │ ├── api/ # REST API
│ │ ├── web/ # Web interface (Angular)
│ │ ├── reports/ # Report generation
│ │ ├── websockets/ # Real-time updates
│ │ └── index.ts
│ └── package.json
├── libs/ # ✅ Your existing shared libraries
│ ├── config/ # ✅ Environment configuration
│ ├── http/ # ✅ HTTP utilities
│ ├── logger/ # ✅ Loki-integrated logging
│ ├── mongodb-client/ # ✅ MongoDB operations
│ ├── postgres-client/ # ✅ PostgreSQL operations
│ ├── questdb-client/ # ✅ Time-series data
│ ├── types/ # ✅ Shared TypeScript types
│ ├── utils/ # ✅ Calculations & utilities
│ ├── event-bus/ # 🆕 Dragonfly event system
│ ├── strategy-engine/ # 🆕 Strategy framework
│ ├── vector-engine/ # 🆕 Vectorized calculations
│ └── data-frame/ # 🆕 DataFrame operations
```
## Multi-Mode Backtesting Architecture
### 1. Execution Mode Framework
```typescript
export abstract class ExecutionMode {
  protected logger = createLogger(this.constructor.name);
  protected config = new ServiceConfig();

  abstract name: string;
  abstract executeOrder(order: Order): Promise<OrderResult>;
  abstract getCurrentTime(): Date;
  abstract getMarketData(symbol: string): Promise<MarketData>;
  abstract publishEvent(event: string, data: any): Promise<void>;
}

export enum BacktestMode {
  LIVE = 'live',
  EVENT_DRIVEN = 'event-driven',
  VECTORIZED = 'vectorized',
  HYBRID = 'hybrid'
}
```
### 2. Live Trading Mode
```typescript
export class LiveMode extends ExecutionMode {
  name = 'live';
  private broker = new BrokerClient(this.config.getBrokerConfig());
  private eventBus = new EventBus();

  async executeOrder(order: Order): Promise<OrderResult> {
    this.logger.info('Executing live order', { orderId: order.id });

    // Execute via real broker
    const result = await this.broker.placeOrder(order);

    // Publish to event bus
    await this.eventBus.publish('order.executed', result);
    return result;
  }

  getCurrentTime(): Date {
    return new Date(); // Real time
  }

  async getMarketData(symbol: string): Promise<MarketData> {
    // Get live market data
    return await this.marketDataService.getLiveData(symbol);
  }

  async publishEvent(event: string, data: any): Promise<void> {
    await this.eventBus.publish(event, data);
  }
}
```
### 3. Event-Driven Backtesting Mode
```typescript
export class EventBacktestMode extends ExecutionMode {
  name = 'event-driven';
  private simulator = new MarketSimulator();
  private eventBus = new InMemoryEventBus(); // In-memory for simulation
  private simulationTime: Date;
  private historicalData: Map<string, MarketData[]>;

  constructor(private config: BacktestConfig) {
    super();
    this.simulationTime = config.startDate;
  }

  async executeOrder(order: Order): Promise<OrderResult> {
    this.logger.debug('Simulating order execution', {
      orderId: order.id,
      simulationTime: this.simulationTime
    });

    // Realistic order simulation with slippage, fees
    const result = await this.simulator.executeOrder(order, {
      currentTime: this.simulationTime,
      marketData: await this.getMarketData(order.symbol),
      slippageModel: this.config.slippageModel,
      commissionModel: this.config.commissionModel
    });

    // Publish to simulation event bus
    await this.eventBus.publish('order.executed', result);
    return result;
  }

  getCurrentTime(): Date {
    return this.simulationTime;
  }

  async getMarketData(symbol: string): Promise<MarketData> {
    const data = this.historicalData.get(symbol) || [];
    return data.find(d => d.timestamp <= this.simulationTime) || null;
  }

  async publishEvent(event: string, data: any): Promise<void> {
    await this.eventBus.publish(event, data);
  }

  // Progress simulation time
  advanceTime(newTime: Date): void {
    this.simulationTime = newTime;
  }
}
```
### 4. Vectorized Backtesting Mode
```typescript
export class VectorBacktestMode extends ExecutionMode {
  name = 'vectorized';
  private dataFrame: DataFrame;
  private currentIndex: number = 0;

  constructor(private config: VectorBacktestConfig) {
    super();
    this.dataFrame = new DataFrame(config.historicalData);
  }

  // Vectorized execution - processes entire dataset at once
  async executeVectorizedBacktest(strategy: VectorizedStrategy): Promise<BacktestResult> {
    const startTime = Date.now();
    this.logger.info('Starting vectorized backtest', {
      strategy: strategy.name,
      dataPoints: this.dataFrame.length
    });

    // Generate all signals at once using your utils library
    const signals = this.generateVectorizedSignals(strategy);

    // Calculate performance metrics vectorized
    const performance = this.calculateVectorizedPerformance(signals);

    // Apply trading costs if specified
    if (this.config.tradingCosts) {
      this.applyTradingCosts(performance, signals);
    }

    const executionTime = Date.now() - startTime;
    this.logger.info('Vectorized backtest completed', {
      executionTime,
      totalReturn: performance.totalReturn,
      sharpeRatio: performance.sharpeRatio
    });

    return {
      mode: 'vectorized',
      strategy: strategy.name,
      performance,
      executionTime,
      signals
    };
  }

  private generateVectorizedSignals(strategy: VectorizedStrategy): DataFrame {
    const prices = this.dataFrame.get('close');

    // Use your existing technical indicators from @stock-bot/utils
    const indicators = {
      sma20: sma(prices, 20),
      sma50: sma(prices, 50),
      rsi: rsi(prices, 14),
      macd: macd(prices)
    };

    // Generate position signals vectorized
    const positions = strategy.generatePositions(this.dataFrame, indicators);

    return new DataFrame({
      ...this.dataFrame.toObject(),
      ...indicators,
      positions
    });
  }

  private calculateVectorizedPerformance(signals: DataFrame): PerformanceMetrics {
    const prices = signals.get('close');
    const positions = signals.get('positions');

    // Calculate returns vectorized
    const returns = prices.slice(1).map((price, i) =>
      (price - prices[i]) / prices[i]
    );

    // Strategy returns = position[t-1] * market_return[t]
    const strategyReturns = returns.map((ret, i) =>
      (positions[i] || 0) * ret
    );

    // Use your existing performance calculation utilities
    return {
      totalReturn: calculateTotalReturn(strategyReturns),
      sharpeRatio: calculateSharpeRatio(strategyReturns),
      maxDrawdown: calculateMaxDrawdown(strategyReturns),
      volatility: calculateVolatility(strategyReturns),
      winRate: calculateWinRate(strategyReturns)
    };
  }

  // Standard interface methods (not used in vectorized mode)
  async executeOrder(order: Order): Promise<OrderResult> {
    throw new Error('Use executeVectorizedBacktest for vectorized mode');
  }

  getCurrentTime(): Date {
    return this.dataFrame.getTimestamp(this.currentIndex);
  }

  async getMarketData(symbol: string): Promise<MarketData> {
    return this.dataFrame.getRow(this.currentIndex);
  }

  async publishEvent(event: string, data: any): Promise<void> {
    // No-op for vectorized mode
  }
}
```
### 5. Hybrid Validation Mode
```typescript
export class HybridBacktestMode extends ExecutionMode {
  name = 'hybrid';
  private eventMode: EventBacktestMode;
  private vectorMode: VectorBacktestMode;

  constructor(config: BacktestConfig) {
    super();
    this.eventMode = new EventBacktestMode(config);
    this.vectorMode = new VectorBacktestMode(config);
  }

  async validateStrategy(
    strategy: BaseStrategy,
    tolerance: number = 0.001
  ): Promise<ValidationResult> {
    this.logger.info('Starting hybrid validation', {
      strategy: strategy.name,
      tolerance
    });

    // Run vectorized backtest (fast)
    const vectorResult = await this.vectorMode.executeVectorizedBacktest(
      strategy as VectorizedStrategy
    );

    // Run event-driven backtest (realistic)
    const eventResult = await this.runEventBacktest(strategy);

    // Compare results
    const performanceDiff = Math.abs(
      vectorResult.performance.totalReturn -
      eventResult.performance.totalReturn
    );
    const isValid = performanceDiff < tolerance;

    this.logger.info('Hybrid validation completed', {
      isValid,
      performanceDifference: performanceDiff,
      recommendation: isValid ? 'vectorized' : 'event-driven'
    });

    return {
      isValid,
      performanceDifference: performanceDiff,
      vectorizedResult: vectorResult,
      eventResult,
      recommendation: isValid ?
        'Vectorized results are reliable for this strategy' :
        'Use event-driven backtesting for accurate results'
    };
  }

  // Standard interface methods delegate to event mode
  async executeOrder(order: Order): Promise<OrderResult> {
    return await this.eventMode.executeOrder(order);
  }

  getCurrentTime(): Date {
    return this.eventMode.getCurrentTime();
  }

  async getMarketData(symbol: string): Promise<MarketData> {
    return await this.eventMode.getMarketData(symbol);
  }

  async publishEvent(event: string, data: any): Promise<void> {
    await this.eventMode.publishEvent(event, data);
  }
}
```
## Unified Strategy Implementation
### Base Strategy Framework
```typescript
export abstract class BaseStrategy {
  protected mode: ExecutionMode;
  protected logger = createLogger(this.constructor.name);

  abstract name: string;
  abstract parameters: Record<string, any>;

  constructor(mode: ExecutionMode) {
    this.mode = mode;
  }

  // Works identically across all modes
  abstract onPriceUpdate(data: PriceData): Promise<void>;
  abstract onIndicatorUpdate(data: IndicatorData): Promise<void>;

  protected async emitSignal(signal: TradeSignal): Promise<void> {
    this.logger.debug('Emitting trade signal', { signal });

    // Mode handles whether this is live, simulated, or vectorized
    const order = this.createOrder(signal);
    const result = await this.mode.executeOrder(order);

    await this.mode.publishEvent('trade.executed', {
      signal,
      order,
      result,
      timestamp: this.mode.getCurrentTime()
    });
  }

  private createOrder(signal: TradeSignal): Order {
    return {
      id: generateId(),
      symbol: signal.symbol,
      side: signal.action,
      quantity: signal.quantity,
      type: 'market',
      timestamp: this.mode.getCurrentTime()
    };
  }
}

// Vectorized strategy interface
export interface VectorizedStrategy {
  name: string;
  parameters: Record<string, any>;
  generatePositions(data: DataFrame, indicators: any): number[];
}
```
### Example Strategy Implementation
```typescript
export class SMAStrategy extends BaseStrategy implements VectorizedStrategy {
  name = 'SMA-Crossover';
  parameters = { fastPeriod: 10, slowPeriod: 20 };
  private fastSMA: number[] = [];
  private slowSMA: number[] = [];

  async onPriceUpdate(data: PriceData): Promise<void> {
    // Same logic for live, event-driven, and hybrid modes
    this.fastSMA.push(data.close);
    this.slowSMA.push(data.close);
    if (this.fastSMA.length > this.parameters.fastPeriod) {
      this.fastSMA.shift();
    }
    if (this.slowSMA.length > this.parameters.slowPeriod) {
      this.slowSMA.shift();
    }

    if (this.fastSMA.length === this.parameters.fastPeriod &&
        this.slowSMA.length === this.parameters.slowPeriod) {
      const fastAvg = sma(this.fastSMA, this.parameters.fastPeriod)[0];
      const slowAvg = sma(this.slowSMA, this.parameters.slowPeriod)[0];

      if (fastAvg > slowAvg) {
        await this.emitSignal({
          symbol: data.symbol,
          action: 'BUY',
          quantity: 100,
          confidence: 0.8
        });
      } else if (fastAvg < slowAvg) {
        await this.emitSignal({
          symbol: data.symbol,
          action: 'SELL',
          quantity: 100,
          confidence: 0.8
        });
      }
    }
  }

  async onIndicatorUpdate(data: IndicatorData): Promise<void> {
    // Handle pre-calculated indicators
  }

  // Vectorized implementation for fast backtesting
  generatePositions(data: DataFrame, indicators: any): number[] {
    const { sma20: fastSMA, sma50: slowSMA } = indicators;
    return fastSMA.map((fast, i) => {
      const slow = slowSMA[i];
      if (isNaN(fast) || isNaN(slow)) return 0;
      // Long when fast > slow, short when fast < slow
      return fast > slow ? 1 : (fast < slow ? -1 : 0);
    });
  }
}
```
## Mode Factory and Service Integration
### Mode Factory
```typescript
export class ModeFactory {
  static create(mode: BacktestMode, config: any): ExecutionMode {
    switch (mode) {
      case BacktestMode.LIVE:
        return new LiveMode();
      case BacktestMode.EVENT_DRIVEN:
        return new EventBacktestMode(config);
      case BacktestMode.VECTORIZED:
        return new VectorBacktestMode(config);
      case BacktestMode.HYBRID:
        return new HybridBacktestMode(config);
      default:
        throw new Error(`Unknown mode: ${mode}`);
    }
  }
}
```
### Strategy Service Integration
```typescript
export class StrategyService {
private logger = createLogger('strategy-service');
async runStrategy(
strategyName: string,
mode: BacktestMode,
config: any
): Promise<any> {
const executionMode = ModeFactory.create(mode, config);
const strategy = await this.loadStrategy(strategyName, executionMode);
this.logger.info('Starting strategy execution', {
strategy: strategyName,
mode,
config
});
switch (mode) {
case BacktestMode.LIVE:
return await this.runLiveStrategy(strategy);
case BacktestMode.EVENT_DRIVEN:
return await this.runEventBacktest(strategy, config);
case BacktestMode.VECTORIZED:
return await (executionMode as VectorBacktestMode)
.executeVectorizedBacktest(strategy as VectorizedStrategy);
case BacktestMode.HYBRID:
return await (executionMode as HybridBacktestMode)
.validateStrategy(strategy, config.tolerance);
default:
throw new Error(`Unsupported mode: ${mode}`);
}
}
async optimizeStrategy(
strategyName: string,
parameterGrid: Record<string, any[]>,
config: BacktestConfig
): Promise<OptimizationResult[]> {
const results: OptimizationResult[] = [];
const combinations = this.generateParameterCombinations(parameterGrid);
this.logger.info('Starting parameter optimization', {
strategy: strategyName,
combinations: combinations.length
});
// Use vectorized mode for fast parameter optimization
const vectorMode = new VectorBacktestMode(config);
// Run all parameter combinations concurrently
await Promise.all(
combinations.map(async (params) => {
const strategy = await this.loadStrategy(strategyName, vectorMode, params);
const result = await vectorMode.executeVectorizedBacktest(
strategy as VectorizedStrategy
);
results.push({
parameters: params,
performance: result.performance,
executionTime: result.executionTime
});
})
);
// Sort by Sharpe ratio
return results.sort((a, b) =>
b.performance.sharpeRatio - a.performance.sharpeRatio
);
}
}
```
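`generateParameterCombinations` is called above but not shown; a plausible sketch is a Cartesian product over the grid (the function name and grid shape are assumptions taken from the call site):

```typescript
// Expand { fast: [10, 20], slow: [50, 100] } into every { fast, slow } pair.
function generateParameterCombinations(
  grid: Record<string, any[]>
): Record<string, any>[] {
  return Object.entries(grid).reduce<Record<string, any>[]>(
    (combos, [key, values]) =>
      combos.flatMap(combo => values.map(value => ({ ...combo, [key]: value }))),
    [{}]
  );
}

const combos = generateParameterCombinations({ fast: [10, 20], slow: [50, 100] });
console.log(combos.length); // 4
```

Each strategy run then receives one element of `combos` as its parameter set.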
## Service Configuration
### Environment-Based Mode Selection
```typescript
export class ServiceConfig {
getTradingConfig(): TradingConfig {
return {
mode: (process.env.TRADING_MODE as BacktestMode) || BacktestMode.LIVE,
brokerConfig: {
apiKey: process.env.BROKER_API_KEY,
sandbox: process.env.BROKER_SANDBOX === 'true'
},
backtestConfig: {
startDate: new Date(process.env.BACKTEST_START_DATE || '2023-01-01'),
endDate: new Date(process.env.BACKTEST_END_DATE || '2024-01-01'),
initialCapital: parseFloat(process.env.INITIAL_CAPITAL || '100000'),
slippageModel: process.env.SLIPPAGE_MODEL || 'linear',
commissionModel: process.env.COMMISSION_MODEL || 'fixed'
}
};
}
}
```
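One caveat in the sketch above: `process.env.TRADING_MODE as BacktestMode` accepts any string. A small validator is safer; the enum's string values here are an assumption, mirroring the CLI's `event|vector|hybrid` naming:

```typescript
// Assumed string values for BacktestMode, matching the CLI flags.
enum BacktestMode {
  LIVE = 'live',
  EVENT_DRIVEN = 'event',
  VECTORIZED = 'vector',
  HYBRID = 'hybrid',
}

// Validate the env value instead of casting blindly; fall back to LIVE.
function parseTradingMode(raw: string | undefined): BacktestMode {
  const valid = Object.values(BacktestMode) as string[];
  return raw !== undefined && valid.includes(raw)
    ? (raw as BacktestMode)
    : BacktestMode.LIVE;
}

console.log(parseTradingMode('vector')); // 'vector'
console.log(parseTradingMode('bogus'));  // 'live'
```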
### CLI Interface
```typescript
// CLI for running different modes
import { Command } from 'commander';
const program = new Command();
program
.name('stock-bot')
.description('Stock Trading Bot with Multi-Mode Backtesting');
program
.command('live')
.description('Run live trading')
.option('-s, --strategy <strategy>', 'Strategy to run')
.action(async (options) => {
const strategyService = new StrategyService();
await strategyService.runStrategy(
options.strategy,
BacktestMode.LIVE,
{}
);
});
program
.command('backtest')
.description('Run backtesting')
.option('-s, --strategy <strategy>', 'Strategy to test')
.option('-m, --mode <mode>', 'Backtest mode (event|vector|hybrid)', 'event')
.option('-f, --from <date>', 'Start date')
.option('-t, --to <date>', 'End date')
.action(async (options) => {
const strategyService = new StrategyService();
await strategyService.runStrategy(
options.strategy,
options.mode as BacktestMode,
{
startDate: new Date(options.from),
endDate: new Date(options.to)
}
);
});
program
.command('optimize')
.description('Optimize strategy parameters')
.option('-s, --strategy <strategy>', 'Strategy to optimize')
.option('-p, --params <params>', 'Parameter grid JSON')
.action(async (options) => {
const strategyService = new StrategyService();
const paramGrid = JSON.parse(options.params);
await strategyService.optimizeStrategy(
options.strategy,
paramGrid,
{}
);
});
program.parse();
```
## Performance Comparison
### Execution Speed by Mode
| Mode | Data Points/Second | Memory Usage | Use Case |
|------|-------------------|--------------|----------|
| **Live** | Real-time | Low | Production trading |
| **Event-Driven** | ~1,000 | Medium | Realistic validation |
| **Vectorized** | ~100,000+ | High | Parameter optimization |
| **Hybrid** | Combined | Medium | Strategy validation |
### When to Use Each Mode
- **Live Mode**: Production trading with real money
- **Event-Driven**: Final strategy validation, complex order logic
- **Vectorized**: Initial development, parameter optimization, quick testing
- **Hybrid**: Validating vectorized results against realistic simulation
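The guidance above can be condensed into a small helper; the stage names and the mapping are illustrative only, not part of the platform API:

```typescript
type Stage = 'prototype' | 'optimize' | 'validate' | 'final-check' | 'production';

// Illustrative mapping from development stage to backtest mode name.
function recommendedMode(stage: Stage): 'vector' | 'hybrid' | 'event' | 'live' {
  switch (stage) {
    case 'prototype':
    case 'optimize':
      return 'vector'; // fast iteration and parameter sweeps
    case 'validate':
      return 'hybrid'; // cross-check vectorized results against realistic fills
    case 'final-check':
      return 'event'; // realistic simulation before going live
    case 'production':
      return 'live'; // real money
  }
}

console.log(recommendedMode('optimize')); // 'vector'
```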
## Integration with Your Existing Libraries
This architecture leverages all your existing infrastructure:
- **@stock-bot/config**: Environment management
- **@stock-bot/logger**: Comprehensive logging with Loki
- **@stock-bot/utils**: All technical indicators and calculations
- **@stock-bot/questdb-client**: Time-series data storage
- **@stock-bot/postgres-client**: Transactional data
- **@stock-bot/mongodb-client**: Configuration storage
## Key Benefits
1. **Unified Codebase**: Same strategy logic across all modes
2. **Performance Flexibility**: Choose speed vs accuracy based on needs
3. **Validation Pipeline**: Hybrid mode ensures vectorized results are accurate
4. **Production Ready**: Live mode for actual trading
5. **Development Friendly**: Fast iteration with vectorized backtesting
This simplified architecture keeps the codebase small while providing comprehensive backtesting capabilities that scale from rapid prototyping to production trading.

# Cache Library Usage Guide
> **⚠️ DEPRECATED**: This documentation is outdated. The cache library has been simplified to only use Redis/Dragonfly.
>
> **Please see [simplified-cache-usage.md](./simplified-cache-usage.md) for current usage instructions.**
The `@stock-bot/cache` library now provides a simplified Redis-only caching solution designed specifically for trading bot applications.
## Migration from Old API
If you're migrating from the old cache API, here are the key changes:
### Old API (DEPRECATED)
```typescript
// These are no longer supported
const cache = createCache('auto');
const cache = createCache('memory');
const cache = createCache('hybrid');
const cache = createCache('redis');
// Direct imports no longer available
import { MemoryCache, HybridCache } from '@stock-bot/cache';
```
### New Simplified API
```typescript
// Use factory functions with options only
const cache = createCache({ keyPrefix: 'app:', ttl: 3600 });
const tradingCache = createTradingCache();
const marketCache = createMarketDataCache();
// Only Redis cache is available
import { RedisCache, RedisConnectionManager } from '@stock-bot/cache';
```
## Quick Migration Steps
1. **Remove cache type parameters**: Change `createCache('hybrid')` to `createCache()`
2. **Remove name parameter**: The `name` option is no longer needed
3. **Update imports**: Use named imports instead of default import
4. **Use specialized factories**: Consider using `createTradingCache()`, `createMarketDataCache()`, etc.
For complete usage examples and best practices, see [simplified-cache-usage.md](./simplified-cache-usage.md).

# Simplified Cache Library Usage
The cache library has been simplified to only use Redis/Dragonfly with a connection manager for better performance and easier management.
## Quick Start
```typescript
import { createCache, createTradingCache, createMarketDataCache, RedisConnectionManager } from '@stock-bot/cache';
// Create different cache instances
const generalCache = createCache({ keyPrefix: 'app:' });
const tradingCache = createTradingCache(); // Uses 'trading:' prefix
const marketCache = createMarketDataCache(); // Uses 'market:' prefix with 5min TTL
// All cache instances share connections by default for better performance
```
## Connection Management
The library now uses a connection manager that allows you to control whether services share connections or get unique ones:
```typescript
import { RedisConnectionManager } from '@stock-bot/cache';
const connectionManager = RedisConnectionManager.getInstance();
// For shared connections (recommended for most cases)
const sharedRedis = connectionManager.getConnection({
name: 'BATCH-PROCESSOR',
singleton: true // All instances share this connection
});
// For unique connections (when you need isolation)
const uniqueRedis = connectionManager.getConnection({
name: 'DATA-FETCHER',
singleton: false // Each instance gets its own connection
});
```
## Cache Usage Examples
### Basic Operations
```typescript
import { createCache } from '@stock-bot/cache';
const cache = createCache({ keyPrefix: 'myapp:' });
// Set data with default TTL (1 hour)
await cache.set('user:123', { name: 'John', email: 'john@example.com' });
// Set data with custom TTL (5 minutes)
await cache.set('session:abc', sessionData, 300);
// Get data
const user = await cache.get('user:123');
// Check if key exists
const exists = await cache.exists('user:123');
// Delete data
await cache.del('user:123');
// Clear all data with this prefix
await cache.clear();
```
### Trading-Specific Caches
```typescript
import { createTradingCache, createMarketDataCache, createIndicatorCache } from '@stock-bot/cache';
// Trading cache (1 hour TTL)
const tradingCache = createTradingCache();
await tradingCache.set('position:AAPL', { shares: 100, price: 150.00 });
// Market data cache (5 minute TTL)
const marketCache = createMarketDataCache();
await marketCache.set('quote:AAPL', { price: 151.25, volume: 1000000 });
// Indicator cache (30 minute TTL)
const indicatorCache = createIndicatorCache();
await indicatorCache.set('sma:AAPL:20', [150.1, 150.3, 150.8, 151.2]);
```
## Connection Names in Redis
When you create cache instances, they will appear in Redis with clean, identifiable names:
- `TRADING-SERVICE` - For trading cache
- `MARKET-SERVICE` - For market data cache
- `INDICATORS-SERVICE` - For indicator cache
- `CACHE-SERVICE` - For general cache
You can monitor all connections using:
```bash
# TypeScript version (more detailed)
bun run scripts/get-redis-connections.ts
# Bash version (quick check)
./scripts/get-redis-connections.sh
```
## Health Monitoring
```typescript
// Check if cache is ready
if (cache.isReady()) {
console.log('Cache is ready for operations');
}
// Wait for cache to be ready
await cache.waitForReady(5000); // Wait up to 5 seconds
// Health check
const isHealthy = await cache.health();
// Get performance statistics
const stats = cache.getStats();
console.log(`Hit rate: ${stats.hitRate}%, Total operations: ${stats.total}`);
```
## Batch Processor Example
Here's how to set up a batch processor with a shared connection:
```typescript
import { RedisConnectionManager } from '@stock-bot/cache';
import Redis from 'ioredis';
export class BatchProcessor {
private redis: Redis;
constructor() {
const connectionManager = RedisConnectionManager.getInstance();
// All batch processor instances share this connection
this.redis = connectionManager.getConnection({
name: 'BATCH-PROCESSOR',
singleton: true
});
}
async processItems(items: any[]): Promise<void> {
await this.redis.set('batch:status', 'processing');
for (const item of items) {
await this.redis.lpush('batch:queue', JSON.stringify(item));
}
await this.redis.set('batch:status', 'completed');
}
async getBatchStatus(): Promise<string> {
return await this.redis.get('batch:status') || 'idle';
}
}
```
## Key Benefits
- **Simplified**: Only Redis-based caching, no complex hybrid logic
- **Connection Management**: Shared connections for better performance
- **Clean Monitoring**: Easy to identify connections in Redis
- **Trading Optimized**: Pre-configured caches for different data types
- **Type Safe**: Full TypeScript support
- **Error Handling**: Graceful fallbacks and comprehensive logging
## Removed Features
- **Memory Cache**: Removed to avoid consistency issues
- **Hybrid Cache**: Removed unnecessary complexity
- **Auto-detection**: Always uses Redis/Dragonfly now
This simplified approach provides better performance, easier debugging, and more predictable behavior for your trading application.

# Testing with Bun in Stock Bot Platform
The Stock Bot platform uses [Bun Test](https://bun.sh/docs/cli/test) as the primary testing framework (Updated June 2025). Bun Test provides fast, modern testing with Jest-like API compatibility.
## Getting Started
Run tests using these commands:
```bash
# Run all tests (using Turbo)
bun test
# Run tests in watch mode
bun test:watch
# Run tests with coverage
bun test:coverage
# Run specific test types
bun test:unit
bun test:integration
bun test:e2e
```
## Library-specific Testing
Each library has its own testing configuration in a `bunfig.toml` file. This allows for library-specific test settings while maintaining consistent patterns across the codebase.
### Example bunfig.toml:
```toml
[test]
preload = ["./test/setup.ts"]
timeout = 5000
[test.env]
NODE_ENV = "test"
[bun]
paths = { "@/*" = ["./src/*"] }
```
## Migration from Jest
This project has been fully migrated from Jest to Bun Test. Some key differences:
1. **Import statements**: Use `import { describe, it, expect } from 'bun:test'` instead of Jest imports
2. **Mocking**: Use Bun's built-in mocking utilities (see global `spyOn` helper)
3. **Configuration**: Use `bunfig.toml` instead of Jest config files
4. **Test helpers**: Test helpers are available globally via `global.testHelpers`
## Best Practices
- Use `describe` and `it` for test organization
- Use relative imports (`../src/`) in test files
- Keep test setup clean with proper `beforeEach` and `afterEach` handlers
- For complex test scenarios, create dedicated setup files
## Test Environment
- All tests run with `NODE_ENV=test`
- Console output is silenced by default (restore with `testHelpers.restoreConsole()`)
- Default timeout is 30 seconds for integration tests, 5 seconds for unit tests

# TypeScript Configuration Structure
This document explains the TypeScript configuration structure used in the Stock Bot trading platform.
## Root Configuration
The root `tsconfig.json` at the project root establishes common settings for all projects in the monorepo:
```json
{
"$schema": "https://json.schemastore.org/tsconfig",
"compilerOptions": {
"target": "ES2022",
"module": "NodeNext",
"moduleResolution": "NodeNext",
"strict": true,
"noImplicitAny": true,
"strictNullChecks": true,
"noImplicitThis": true,
"alwaysStrict": true,
"esModuleInterop": true,
"allowSyntheticDefaultImports": true,
"skipLibCheck": true,
"forceConsistentCasingInFileNames": true,
"resolveJsonModule": true,
"sourceMap": false,
"declaration": true,
"baseUrl": ".",
"paths": {
"@stock-bot/*": ["libs/*/src"]
}
},
"exclude": [
"node_modules",
"dist"
]
}
```
## Template Configurations
We provide template configurations for both application and library projects. The application template, `tsconfig.app.json`, looks like this:
```json
{
"extends": "../../../tsconfig.json",
"compilerOptions": {
"outDir": "./dist",
"rootDir": "./src",
"module": "ESNext",
"moduleResolution": "bundler",
"types": ["bun-types"]
},
"include": ["src/**/*"],
"exclude": ["node_modules", "dist"]
}
```
## Project-Specific Configurations
Each project in the monorepo extends from the root configuration and adds its own specific settings:
### Library Projects
Library projects extend the root configuration with a relative path:
```json
{
"extends": "../../tsconfig.json",
"compilerOptions": {
"outDir": "./dist",
"rootDir": "./src",
"declaration": true
},
"include": ["src/**/*"],
"exclude": ["node_modules", "dist", "**/*.test.ts", "**/*.spec.ts"]
}
```
### Application Projects
Application projects also extend the root configuration with a relative path:
```json
{
"extends": "../../../tsconfig.json",
"compilerOptions": {
"outDir": "./dist",
"rootDir": "./src",
"module": "ESNext",
"moduleResolution": "bundler",
"types": ["bun-types"]
},
"include": ["src/**/*"],
"exclude": ["node_modules", "dist"]
}
```
## Special Configurations
Some projects have special needs:
1. **Trading Dashboard (Angular)**: Uses an extended configuration structure with separate files for app and testing.
2. **Projects that import files with explicit `.ts` extensions**: These projects set `"allowImportingTsExtensions": true` and `"noEmit": true`.
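A sketch of what such a project's configuration might look like, combining the application template above with these two flags (`allowImportingTsExtensions` requires `noEmit`, which is why they appear together):

```json
{
  "extends": "../../../tsconfig.json",
  "compilerOptions": {
    "module": "ESNext",
    "moduleResolution": "bundler",
    "allowImportingTsExtensions": true,
    "noEmit": true,
    "types": ["bun-types"]
  },
  "include": ["src/**/*"],
  "exclude": ["node_modules", "dist"]
}
```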

/**
* Example: Enhanced Data Service with Multi-Database Support
*
* This shows how to update your existing data service to leverage
* the new multi-database MongoDB client functionality.
*/
import { getLogger } from '@stock-bot/logger';
import { MongoDBClient, setDefaultDatabase } from '@stock-bot/mongodb-client';
const logger = getLogger('enhanced-data-service');
export class EnhancedDataService {
private mongoClient = MongoDBClient.getInstance();
async initialize() {
// Connect to MongoDB
await this.mongoClient.connect();
// Set stock as default database for market data operations
setDefaultDatabase('stock');
logger.info('Enhanced data service initialized with multi-database support');
}
/**
* Save Interactive Brokers data to stock database
*/
async saveIBMarketData(exchanges: any[], symbols: any[]) {
logger.info('Saving IB market data to stock database');
// These use the default 'stock' database
const exchangeResult = await this.mongoClient.batchUpsert(
'exchanges',
exchanges,
'exchange_id'
);
const symbolResult = await this.mongoClient.batchUpsert('symbols', symbols, 'symbol');
// Or use convenience method for cleaner code
// const exchangeResult = await this.mongoClient.batchUpsertStock('exchanges', exchanges, 'exchange_id');
// const symbolResult = await this.mongoClient.batchUpsertStock('symbols', symbols, 'symbol');
logger.info('IB market data saved', {
exchanges: exchangeResult,
symbols: symbolResult,
});
return { exchanges: exchangeResult, symbols: symbolResult };
}
/**
* Save real-time prices to stock database
*/
async saveRealTimePrices(priceData: any[]) {
logger.info(`Saving ${priceData.length} real-time prices to stock database`);
return await this.mongoClient.batchUpsertStock('real_time_prices', priceData, [
'symbol',
'timestamp',
]);
}
/**
* Save performance analytics to analytics database
*/
async savePerformanceAnalytics(performanceData: any[]) {
logger.info(`Saving ${performanceData.length} performance records to analytics database`);
return await this.mongoClient.batchUpsertAnalytics('performance_metrics', performanceData, [
'portfolio_id',
'date',
]);
}
/**
* Save trading logs to trading_documents database
*/
async saveTradingLogs(logs: any[]) {
logger.info(`Saving ${logs.length} trading logs to trading_documents database`);
return await this.mongoClient.batchUpsertTrading('trading_logs', logs, 'log_id');
}
/**
* Example: Cross-database analytics
*/
async generateCrossDatabaseReport() {
logger.info('Generating cross-database analytics report');
// Get data from multiple databases
const stockDb = this.mongoClient.getDatabase('stock');
const analyticsDb = this.mongoClient.getDatabase('analytics');
const tradingDb = this.mongoClient.getDatabase('trading_documents');
// Get counts from different databases
const symbolCount = await stockDb.collection('symbols').countDocuments();
const exchangeCount = await stockDb.collection('exchanges').countDocuments();
const performanceCount = await analyticsDb.collection('performance_metrics').countDocuments();
const logCount = await tradingDb.collection('trading_logs').countDocuments();
const report = {
stock_database: {
symbols: symbolCount,
exchanges: exchangeCount,
},
analytics_database: {
performance_metrics: performanceCount,
},
trading_database: {
logs: logCount,
},
generated_at: new Date(),
};
logger.info('Cross-database report generated', report);
return report;
}
/**
* Example: Database-specific operations
*/
async performDatabaseSpecificOperations() {
// Switch default database for a series of analytics operations
const originalDefault = this.mongoClient.getDefaultDatabase();
this.mongoClient.setDefaultDatabase('analytics');
try {
// Now all operations without explicit database go to 'analytics'
await this.mongoClient.batchUpsert('daily_reports', [], 'date');
await this.mongoClient.batchUpsert('portfolio_summaries', [], 'portfolio_id');
logger.info('Analytics operations completed on analytics database');
} finally {
// Restore original default database
this.mongoClient.setDefaultDatabase(originalDefault);
}
}
/**
* Example: Configuration-driven database routing
*/
async saveDataByType(dataType: string, collection: string, data: any[], uniqueKeys: string[]) {
const databaseMapping: Record<string, string> = {
market_data: 'stock',
analytics: 'analytics',
trading: 'trading_documents',
logs: 'trading_documents',
};
const targetDatabase = databaseMapping[dataType] || 'stock';
logger.info(`Saving ${data.length} records`, {
dataType,
targetDatabase,
collection,
});
return await this.mongoClient.batchUpsert(collection, data, uniqueKeys, {
database: targetDatabase,
});
}
async shutdown() {
logger.info('Shutting down enhanced data service');
await this.mongoClient.disconnect();
}
}
// Usage example for your existing data service
export async function integrateWithExistingDataService() {
const enhancedService = new EnhancedDataService();
try {
await enhancedService.initialize();
// Example market data
const exchanges = [
{ exchange_id: 'NYSE', name: 'New York Stock Exchange', mic: 'XNYS' },
{ exchange_id: 'NASDAQ', name: 'NASDAQ', mic: 'XNAS' },
];
const symbols = [
{ symbol: 'AAPL', exchange: 'NASDAQ', company_name: 'Apple Inc.' },
{ symbol: 'MSFT', exchange: 'NASDAQ', company_name: 'Microsoft Corporation' },
];
// Save to appropriate databases
await enhancedService.saveIBMarketData(exchanges, symbols);
// Save real-time prices
await enhancedService.saveRealTimePrices([
{ symbol: 'AAPL', price: 150.25, volume: 1000000, timestamp: new Date() },
]);
// Save analytics
await enhancedService.savePerformanceAnalytics([
{ portfolio_id: 'portfolio_1', date: new Date(), return: 0.15 },
]);
// Generate cross-database report
const report = await enhancedService.generateCrossDatabaseReport();
console.log('Cross-database report:', report);
// Example of configuration-driven routing
await enhancedService.saveDataByType('market_data', 'prices', [], ['symbol']);
await enhancedService.saveDataByType('analytics', 'metrics', [], ['metric_id']);
} catch (error) {
logger.error('Enhanced data service error:', error);
} finally {
await enhancedService.shutdown();
}
}
// Export for integration
export default EnhancedDataService;

/**
* Practical Usage Examples for Multi-Database MongoDB Client
*
* This file demonstrates real-world usage patterns for the enhanced MongoDB client
* with multiple database support.
*/
import { getCurrentDatabase, MongoDBClient, setDefaultDatabase } from '@stock-bot/mongodb-client';
// Example 1: Using different databases for different data types
export class DataServiceExample {
private mongoClient = MongoDBClient.getInstance();
async initialize() {
await this.mongoClient.connect();
// Set stock as default database for most operations
setDefaultDatabase('stock');
console.log(`Default database: ${getCurrentDatabase()}`);
}
// Stock market data goes to 'stock' database (default)
async saveStockData(symbols: any[], exchanges: any[]) {
// These use the default 'stock' database
await this.mongoClient.batchUpsert('symbols', symbols, 'symbol');
await this.mongoClient.batchUpsert('exchanges', exchanges, 'exchange_id');
// Or use convenience method (explicitly targets 'stock' database)
await this.mongoClient.batchUpsertStock('prices', symbols, 'symbol');
}
// Analytics and metrics go to 'analytics' database
async saveAnalyticsData(performanceData: any[], metrics: any[]) {
// Override database for specific operations
await this.mongoClient.batchUpsert('performance', performanceData, 'date', {
database: 'analytics',
});
// Or use convenience method
await this.mongoClient.batchUpsertAnalytics('metrics', metrics, 'metric_name');
}
// Trading documents and logs go to 'trading_documents' database
async saveTradingData(orders: any[], transactions: any[]) {
// Use convenience method for trading data
await this.mongoClient.batchUpsertTrading('orders', orders, 'order_id');
await this.mongoClient.batchUpsertTrading('transactions', transactions, 'transaction_id');
}
// Example of switching default database dynamically
async switchToAnalyticsMode() {
console.log(`Current default: ${getCurrentDatabase()}`);
// Switch to analytics database for a series of operations
setDefaultDatabase('analytics');
console.log(`New default: ${getCurrentDatabase()}`);
// Now all operations without explicit database parameter go to 'analytics'
await this.mongoClient.batchUpsert('daily_reports', [], 'date');
await this.mongoClient.batchUpsert('portfolio_performance', [], 'portfolio_id');
// Switch back to stock database
setDefaultDatabase('stock');
}
// Example of working with multiple databases simultaneously
async crossDatabaseAnalysis() {
// Get direct access to different databases
const stockDb = this.mongoClient.getDatabase('stock');
const analyticsDb = this.mongoClient.getDatabase('analytics');
const tradingDb = this.mongoClient.getDatabase('trading_documents');
// Perform operations on multiple databases
const stockSymbols = await stockDb.collection('symbols').find({}).toArray();
const performance = await analyticsDb.collection('performance').find({}).toArray();
const orders = await tradingDb.collection('orders').find({}).toArray();
console.log('Cross-database analysis:', {
symbolsCount: stockSymbols.length,
performanceRecords: performance.length,
ordersCount: orders.length,
});
}
}
// Example 2: Data Migration Between Databases
export class DataMigrationExample {
private mongoClient = MongoDBClient.getInstance();
async migrateHistoricalData() {
await this.mongoClient.connect();
// Get collections from different databases
const stockCollection = this.mongoClient.getCollection('historical_prices', 'stock');
const analyticsCollection = this.mongoClient.getCollection('price_analysis', 'analytics');
// Read from stock database
const historicalPrices = await stockCollection
.find({
date: { $gte: new Date('2024-01-01') },
})
.toArray();
console.log(`Found ${historicalPrices.length} historical price records`);
// Transform and save to analytics database
const analysisData = historicalPrices.map(price => ({
symbol: price.symbol,
date: price.date,
price_change: price.close - price.open,
volume_normalized: price.volume / 1000000,
created_at: new Date(),
updated_at: new Date(),
}));
// Save to analytics database
await this.mongoClient.batchUpsert('price_analysis', analysisData, ['symbol', 'date'], {
database: 'analytics',
});
console.log(`Migrated ${analysisData.length} records to analytics database`);
}
}
// Example 3: Service-Specific Database Usage
export class TradingServiceExample {
private mongoClient = MongoDBClient.getInstance();
async initialize() {
await this.mongoClient.connect();
// Trading service primarily works with trading_documents database
setDefaultDatabase('trading_documents');
}
async processTradeOrders(orders: any[]) {
// Default database is 'trading_documents', so no need to specify
await this.mongoClient.batchUpsert('orders', orders, 'order_id');
// Log to analytics database for monitoring
const orderMetrics = orders.map(order => ({
metric_name: `order_${order.type}`,
value: order.quantity,
timestamp: new Date(),
}));
await this.mongoClient.batchUpsert(
'trading_metrics',
orderMetrics,
['metric_name', 'timestamp'],
{ database: 'analytics' }
);
}
async getOrderHistory(symbolFilter?: string) {
// Get collection from default database (trading_documents)
const ordersCollection = this.mongoClient.getCollection('orders');
const filter = symbolFilter ? { symbol: symbolFilter } : {};
return await ordersCollection.find(filter).sort({ created_at: -1 }).limit(100).toArray();
}
}
// Example 4: Configuration-Based Database Routing
export class ConfigurableDatabaseRouter {
private mongoClient = MongoDBClient.getInstance();
// Configuration mapping data types to databases
private databaseConfig = {
market_data: 'stock',
user_data: 'trading_documents',
analytics: 'analytics',
logs: 'trading_documents',
cache: 'analytics',
};
async saveData(
dataType: keyof typeof this.databaseConfig,
collection: string,
data: any[],
uniqueKeys: string[]
) {
const targetDatabase = this.databaseConfig[dataType];
console.log(`Saving ${data.length} records to ${targetDatabase}.${collection}`);
return await this.mongoClient.batchUpsert(collection, data, uniqueKeys, {
database: targetDatabase,
});
}
async saveMarketData(data: any[]) {
return this.saveData('market_data', 'realtime_prices', data, ['symbol']);
}
async saveUserActivity(data: any[]) {
return this.saveData('user_data', 'user_actions', data, ['user_id']);
}
async saveAnalytics(data: any[]) {
return this.saveData('analytics', 'performance_metrics', data, ['metric_id']);
}
}
// Usage example for your data service
export async function exampleUsage() {
const dataService = new DataServiceExample();
await dataService.initialize();
// Save different types of data to appropriate databases
await dataService.saveStockData(
[{ symbol: 'AAPL', price: 150.25, volume: 1000000 }],
[{ exchange_id: 'NYSE', name: 'New York Stock Exchange' }]
);
await dataService.saveAnalyticsData(
[{ date: new Date(), portfolio_return: 0.15 }],
[{ metric_name: 'sharpe_ratio', value: 1.25 }]
);
await dataService.saveTradingData(
[{ order_id: 'ORD001', symbol: 'AAPL', quantity: 100 }],
[{ transaction_id: 'TXN001', amount: 15025 }]
);
// Perform cross-database analysis
await dataService.crossDatabaseAnalysis();
console.log('Multi-database operations completed successfully!');
}