This commit is contained in:
Boki 2025-06-22 17:55:51 -04:00
parent d858222af7
commit 7d9044ab29
202 changed files with 10755 additions and 10972 deletions

Binary file not shown.

@@ -0,0 +1,58 @@
# Code Style and Conventions
## TypeScript Configuration
- **Strict mode enabled**: All strict checks are on
- **Target**: ES2022
- **Module**: ESNext with bundler resolution
- **Path aliases**: `@stock-bot/*` maps to `libs/*/src`
- **Decorators**: Enabled for dependency injection
## Code Style Rules (ESLint)
- **No unused variables**: Error (except prefixed with `_`)
- **No explicit any**: Warning
- **No non-null assertion**: Warning
- **No console**: Warning (except in tests)
- **Prefer const**: Enforced
- **Strict equality**: Always use `===`
- **Curly braces**: Required for all blocks
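Code that passes these rules looks like the following (the function names are illustrative, not from the codebase):

```typescript
// Illustrative only: a snippet written to satisfy the rules above.
const RETRY_LIMIT = 3; // prefer-const: never reassigned

function isRetryable(status: number, _attempt?: number): boolean {
  // `_attempt` is unused, which is allowed because of the `_` prefix
  if (status === 429) {
    // strict equality and mandatory curly braces
    return true;
  }
  return status >= 500 && status < 600;
}

function shouldRetry(status: number, attempt: number): boolean {
  return attempt < RETRY_LIMIT && isRetryable(status);
}
```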
## Formatting (Prettier)
- **Semicolons**: Always
- **Single quotes**: Yes
- **Trailing comma**: ES5
- **Print width**: 100 characters
- **Tab width**: 2 spaces
- **Arrow parens**: Avoid when possible
- **End of line**: LF
## Import Order
1. Node built-ins
2. Third-party modules
3. `@stock-bot/*` imports
4. Relative imports (parent directories first)
5. Current directory imports
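Applied to a typical service file, the ordering looks like this (the commented groups are placeholders so the snippet stays self-contained; module names are illustrative):

```typescript
// 1. Node built-ins
import { join } from 'node:path';

// 2. Third-party modules
// import { Hono } from 'hono';

// 3. @stock-bot/* imports
// import { getLogger } from '@stock-bot/logger';

// 4. Relative imports, parent directories first
// import { baseConfig } from '../config';

// 5. Current directory imports
// import { helpers } from './helpers';

export const configPath = join(process.cwd(), 'config');
```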
## Naming Conventions
- **Files**: kebab-case (e.g., `database-setup.ts`)
- **Classes**: PascalCase
- **Functions/Variables**: camelCase
- **Constants**: UPPER_SNAKE_CASE
- **Interfaces/Types**: PascalCase; an `I` or `T` prefix is optional
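A hypothetical `order-service.ts` (kebab-case file name) illustrates all of these conventions at once:

```typescript
// Hypothetical contents of `order-service.ts`; names are for illustration.
const MAX_ORDER_SIZE = 10_000; // UPPER_SNAKE_CASE constant

interface OrderRequest {
  // PascalCase type; the `I` prefix (IOrderRequest) would also be acceptable
  symbol: string;
  quantity: number;
}

class OrderService {
  // PascalCase class with a camelCase method
  validateOrder(request: OrderRequest): boolean {
    return request.quantity > 0 && request.quantity <= MAX_ORDER_SIZE;
  }
}

const orderService = new OrderService(); // camelCase variable
export { OrderService, orderService };
```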
## Library Standards
- **Named exports only**: No default exports
- **Factory patterns**: For complex initialization
- **Singleton pattern**: For global services (config, logger)
- **Direct class exports**: For DI-managed services
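A minimal sketch of the singleton-plus-named-export pattern used for global services; the real `@stock-bot/logger` API may differ:

```typescript
// Sketch only: one cached instance per name, exposed via a named export.
class Logger {
  constructor(private readonly name: string) {}
  info(message: string): string {
    return `[${this.name}] ${message}`;
  }
}

const loggers = new Map<string, Logger>();

// Named export only (no default export), returning one instance per name.
export function getLogger(name: string): Logger {
  let logger = loggers.get(name);
  if (!logger) {
    logger = new Logger(name);
    loggers.set(name, logger);
  }
  return logger;
}
```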
## Testing
- **File naming**: `*.test.ts` or `*.spec.ts`
- **Test structure**: Bun's built-in test runner
- **Integration tests**: Use TestContainers for databases
- **Mocking**: Mock external dependencies
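Mocking an external dependency usually means injecting a fake implementation behind an interface; the names below are illustrative:

```typescript
// Sketch of mocking via constructor injection.
interface QuoteProvider {
  fetchPrice(symbol: string): number;
}

class PriceChecker {
  constructor(private readonly provider: QuoteProvider) {}
  isAbove(symbol: string, threshold: number): boolean {
    return this.provider.fetchPrice(symbol) > threshold;
  }
}

// In a test, the real provider is replaced with a deterministic fake:
const fakeProvider: QuoteProvider = { fetchPrice: () => 101 };
const checker = new PriceChecker(fakeProvider);
```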
## Documentation
- **JSDoc**: For all public APIs
- **README.md**: Required for each library
- **Usage examples**: Include in documentation
- **Error messages**: Descriptive with context
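A small helper (illustrative, not from the codebase) showing both the JSDoc convention and a descriptive, contextual error message:

```typescript
/**
 * Converts a percentage string such as "2.5%" to a fraction.
 *
 * @param input - A percentage value with a trailing `%` sign.
 * @returns The value as a fraction, e.g. `0.025`.
 * @throws If the input is not a valid percentage string.
 */
export function parsePercent(input: string): number {
  const match = /^(-?\d+(?:\.\d+)?)%$/.exec(input.trim());
  if (!match) {
    // Descriptive error with context, per the convention above
    throw new Error(`parsePercent: expected a value like "2.5%", received "${input}"`);
  }
  return Number(match[1]) / 100;
}
```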

@@ -0,0 +1,41 @@
# Current Refactoring Context
## Data Ingestion Service Refactor
The project is currently undergoing a major refactoring to move away from singleton patterns to a dependency injection approach using service containers.
### What's Been Done
- Created connection pool pattern with `ServiceContainer`
- Refactored data-ingestion service to use DI container
- Updated handlers to accept container parameter
- Added proper resource disposal with `ctx.dispose()`
### Migration Status
- QM handler: ✅ Fully migrated to container pattern
- IB handler: ⚠️ Partially migrated (using migration helper)
- Proxy handler: ✅ Updated to accept container
- WebShare handler: ✅ Updated to accept container
### Key Patterns
1. **Service Container**: Central DI container managing all connections
2. **Operation Context**: Provides scoped database access within operations
3. **Factory Pattern**: Connection factories for different databases
4. **Resource Disposal**: Always call `ctx.dispose()` after operations
### Example Pattern
```typescript
const ctx = OperationContext.create('handler', 'operation', { container });
try {
  // Use databases through context
  await ctx.mongodb.insertOne(data);
  await ctx.postgres.query('...');
  return { success: true };
} finally {
  await ctx.dispose(); // Always cleanup
}
```
### Next Steps
- Complete migration of remaining IB operations
- Remove migration helper once complete
- Apply same pattern to other services
- Add monitoring for connection pools

@@ -0,0 +1,55 @@
# Stock Bot Trading Platform
## Project Purpose
This is an advanced trading bot platform with a microservice architecture designed for automated stock trading. The system includes:
- Market data ingestion from multiple providers (Yahoo Finance, QuoteMedia, Interactive Brokers, WebShare)
- Data processing and technical indicator calculation
- Trading strategy development and backtesting
- Order execution and risk management
- Portfolio tracking and performance analytics
- Web dashboard for monitoring
## Architecture Overview
The project follows a **microservices architecture** with shared libraries:
### Core Services (apps/)
- **data-ingestion**: Ingests market data from multiple providers
- **data-pipeline**: Processes and transforms data
- **web-api**: REST API service
- **web-app**: React-based dashboard
### Shared Libraries (libs/)
**Core Libraries:**
- config: Environment configuration with Zod validation
- logger: Structured logging with Loki integration
- di: Dependency injection container
- types: Shared TypeScript types
- handlers: Common handler patterns
**Data Libraries:**
- postgres: PostgreSQL client for transactional data
- questdb: Time-series database for market data
- mongodb: Document storage for configurations
**Service Libraries:**
- queue: BullMQ-based job processing
- event-bus: Dragonfly/Redis event bus
- shutdown: Graceful shutdown management
**Utils:**
- Financial calculations and technical indicators
- Date utilities
- Position sizing calculations
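A position-sizing helper of the kind this library might contain; the fixed-fractional formula below is an assumption for illustration, not the project's actual implementation:

```typescript
// Sketch: fixed-fractional position sizing.
// Shares = (equity * risk fraction) / risk per share, rounded down.
export function positionSize(
  accountEquity: number,
  riskFraction: number, // e.g. 0.01 risks 1% of equity per trade
  entryPrice: number,
  stopPrice: number
): number {
  const riskPerShare = Math.abs(entryPrice - stopPrice);
  if (riskPerShare === 0) {
    throw new Error('positionSize: entry and stop prices must differ');
  }
  const riskBudget = accountEquity * riskFraction;
  return Math.floor(riskBudget / riskPerShare);
}
```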
## Database Strategy
- **PostgreSQL**: Transactional data (orders, positions, strategies)
- **QuestDB**: Time-series data (OHLCV, indicators, performance metrics)
- **MongoDB**: Document storage (configurations, raw API responses)
- **Dragonfly/Redis**: Event bus and caching layer
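The storage split above can be expressed as a small routing table (illustrative only; the real services reach each store through dedicated client libraries rather than a lookup):

```typescript
type Store = 'postgres' | 'questdb' | 'mongodb' | 'dragonfly';

// Illustrative mapping of data kinds to their backing store.
const storeFor: Record<string, Store> = {
  orders: 'postgres',
  positions: 'postgres',
  strategies: 'postgres',
  ohlcv: 'questdb',
  indicators: 'questdb',
  configurations: 'mongodb',
  rawResponses: 'mongodb',
  events: 'dragonfly',
  cache: 'dragonfly',
};
```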
## Current Development Phase
Phase 1: Data Foundation Layer (In Progress)
- Enhancing data provider reliability
- Implementing data validation
- Optimizing time-series storage
- Building robust HTTP client with circuit breakers
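A circuit breaker in its simplest form tracks consecutive failures and rejects calls for a cooldown period once a threshold is hit. The sketch below is a minimal illustration; the thresholds and API are assumptions, not the project's implementation:

```typescript
// Minimal circuit-breaker sketch with an injectable clock for testing.
class CircuitBreaker {
  private failures = 0;
  private openUntil = 0;

  constructor(
    private readonly maxFailures = 5,
    private readonly cooldownMs = 30_000,
    private readonly now: () => number = Date.now
  ) {}

  canRequest(): boolean {
    return this.now() >= this.openUntil;
  }

  recordSuccess(): void {
    this.failures = 0;
  }

  recordFailure(): void {
    this.failures += 1;
    if (this.failures >= this.maxFailures) {
      // Trip the breaker: reject calls until the cooldown elapses
      this.openUntil = this.now() + this.cooldownMs;
      this.failures = 0;
    }
  }
}
```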

@@ -0,0 +1,62 @@
# Project Structure
## Root Directory
```
stock-bot/
├── apps/                    # Microservice applications
│   ├── data-ingestion/      # Market data ingestion service
│   ├── data-pipeline/       # Data processing pipeline
│   ├── web-api/             # REST API service
│   └── web-app/             # React dashboard
├── libs/                    # Shared libraries
│   ├── core/                # Core functionality
│   │   ├── config/          # Configuration management
│   │   ├── logger/          # Logging infrastructure
│   │   ├── di/              # Dependency injection
│   │   ├── types/           # Shared TypeScript types
│   │   └── handlers/        # Common handler patterns
│   ├── data/                # Database clients
│   │   ├── postgres/        # PostgreSQL client
│   │   ├── questdb/         # QuestDB time-series client
│   │   └── mongodb/         # MongoDB document storage
│   ├── services/            # Service utilities
│   │   ├── queue/           # BullMQ job processing
│   │   ├── event-bus/       # Dragonfly event bus
│   │   └── shutdown/        # Graceful shutdown
│   └── utils/               # Utility functions
├── database/                # Database schemas and migrations
├── scripts/                 # Build and utility scripts
├── config/                  # Configuration files
├── monitoring/              # Monitoring configurations
├── docs/                    # Documentation
└── test/                    # Global test utilities
```
## Key Files
- `package.json` - Root package configuration
- `turbo.json` - Turbo monorepo configuration
- `tsconfig.json` - TypeScript configuration
- `eslint.config.js` - ESLint rules
- `.prettierrc` - Prettier formatting rules
- `docker-compose.yml` - Infrastructure setup
- `.env` - Environment variables
## Monorepo Structure
- Uses Bun workspaces with Turbo for orchestration
- Each app and library has its own package.json
- Shared dependencies at root level
- Libraries published as `@stock-bot/*` packages
## Service Architecture Pattern
Each service typically follows:
```
service/
├── src/
│   ├── index.ts         # Entry point
│   ├── routes/          # API routes (Hono)
│   ├── handlers/        # Business logic
│   ├── services/        # Service layer
│   └── types/           # Service-specific types
├── test/                # Tests
├── package.json
└── tsconfig.json
```

@@ -0,0 +1,73 @@
# Suggested Commands for Development
## Package Management (Bun)
- `bun install` - Install all dependencies
- `bun add <package>` - Add a new dependency
- `bun add -D <package>` - Add a dev dependency
- `bun update` - Update dependencies
## Development
- `bun run dev` - Start all services in development mode (uses Turbo)
- `bun run dev:full` - Start infrastructure + admin tools + dev mode
- `bun run dev:clean` - Reset infrastructure and start fresh
## Building
- `bun run build` - Build all services and libraries
- `bun run build:libs` - Build only shared libraries
- `bun run build:all:clean` - Clean build with cache removal
- `./scripts/build-all.sh` - Custom build script with options
## Testing
- `bun test` - Run all tests
- `bun test --watch` - Run tests in watch mode
- `bun run test:coverage` - Run tests with coverage report
- `bun run test:libs` - Test only shared libraries
- `bun run test:apps` - Test only applications
- `bun test <file>` - Run specific test file
## Code Quality (IMPORTANT - Run before committing!)
- `bun run lint` - Check for linting errors
- `bun run lint:fix` - Auto-fix linting issues
- `bun run format` - Format code with Prettier
- `./scripts/format.sh` - Alternative format script
## Infrastructure Management
- `bun run infra:up` - Start databases (PostgreSQL, QuestDB, MongoDB, Dragonfly)
- `bun run infra:down` - Stop infrastructure
- `bun run infra:reset` - Reset with clean volumes
- `bun run docker:admin` - Start admin GUIs (pgAdmin, Mongo Express, Redis Insight)
- `bun run docker:monitoring` - Start monitoring stack
## Database Operations
- `bun run db:setup-ib` - Setup Interactive Brokers database schema
- `bun run db:init` - Initialize all database schemas
## Utility Commands
- `bun run clean` - Clean build artifacts
- `bun run clean:all` - Deep clean including node_modules
- `turbo run <task>` - Run task across monorepo
## Git Commands (Linux)
- `git status` - Check current status
- `git add .` - Stage all changes
- `git commit -m "message"` - Commit changes
- `git push` - Push to remote
- `git pull` - Pull from remote
- `git checkout -b <branch>` - Create new branch
## System Commands (Linux)
- `ls -la` - List files with details
- `cd <directory>` - Change directory
- `grep -r "pattern" .` - Search for pattern
- `find . -name "*.ts"` - Find files by pattern
- `which <command>` - Find command location
## MCP Setup (for database access in IDE)
- `./scripts/setup-mcp.sh` - Setup Model Context Protocol servers
- Requires infrastructure to be running first
## Service URLs
- Dashboard: http://localhost:4200
- QuestDB Console: http://localhost:9000
- Grafana: http://localhost:3000
- pgAdmin: http://localhost:8080

@@ -0,0 +1,55 @@
# Task Completion Checklist
When you complete any coding task, ALWAYS run these commands in order:
## 1. Code Quality Checks (MANDATORY)
```bash
# Run linting to catch code issues
bun run lint
# If there are errors, fix them automatically
bun run lint:fix
# Format the code
bun run format
```
## 2. Testing (if applicable)
```bash
# Run tests if you modified existing functionality
bun test
# Run specific test file if you added/modified tests
bun test <path-to-test-file>
```
## 3. Build Verification (for significant changes)
```bash
# Build the affected libraries/apps
bun run build:libs # if you changed libraries
bun run build # for full build
```
## 4. Final Verification Steps
- Ensure no TypeScript errors in the IDE
- Check that imports are properly ordered (the lint/format tooling should handle this)
- Verify no console.log statements in production code
- Confirm all new code follows the established patterns
## 5. Git Commit Guidelines
- Stage changes: `git add .`
- Write descriptive commit messages
- Reference issue numbers if applicable
- Use conventional commit format when possible:
- `feat:` for new features
- `fix:` for bug fixes
- `refactor:` for code refactoring
- `docs:` for documentation
- `test:` for tests
- `chore:` for maintenance
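A commit header in this format can be validated mechanically; the pattern below is an illustrative sketch (scopes like `feat(api):` are also accepted, and body rules are ignored), not project tooling:

```typescript
// Illustrative conventional-commit header check.
const COMMIT_TYPES = ['feat', 'fix', 'refactor', 'docs', 'test', 'chore'] as const;

export function isConventionalCommit(header: string): boolean {
  // type, optional (scope), optional breaking-change "!", then ": <subject>"
  const pattern = new RegExp(`^(${COMMIT_TYPES.join('|')})(\\([\\w-]+\\))?!?: .+`);
  return pattern.test(header);
}
```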
## Important Notes
- NEVER skip the linting and formatting steps
- The project uses ESLint and Prettier - let them do their job
- If lint errors persist after auto-fix, they need manual attention
- Always test your changes, even if just running the service locally

@@ -0,0 +1,49 @@
# Technology Stack
## Runtime & Package Manager
- **Bun**: v1.1.0+ (primary runtime and package manager)
- **Node.js**: v18.0.0+ (compatibility)
- **TypeScript**: v5.8.3
## Core Technologies
- **Turbo**: Monorepo build system
- **ESBuild**: Fast bundling (integrated with Bun)
- **Hono**: Lightweight web framework for services
## Databases
- **PostgreSQL**: Primary transactional database
- **QuestDB**: Time-series database for market data
- **MongoDB**: Document storage
- **Dragonfly**: Redis-compatible cache and event bus
## Queue & Messaging
- **BullMQ**: Job queue processing
- **IORedis**: Redis client for Dragonfly
## Web Technologies
- **React**: Frontend framework (web-app)
- **Angular**: Also present (inferred from a `polyfills.ts` reference)
- **PrimeNG**: UI component library
- **TailwindCSS**: CSS framework
## Testing
- **Bun Test**: Built-in test runner
- **TestContainers**: Database integration testing
- **Supertest**: API testing
## Monitoring & Observability
- **Loki**: Log aggregation
- **Prometheus**: Metrics collection
- **Grafana**: Visualization dashboards
## Development Tools
- **ESLint**: Code linting
- **Prettier**: Code formatting
- **Docker Compose**: Local infrastructure
- **Model Context Protocol (MCP)**: Database access in IDE
## Key Dependencies
- **Awilix**: Dependency injection container
- **Zod**: Schema validation
- **pg**: PostgreSQL client
- **Playwright**: Browser automation for proxy testing

.serena/project.yml
@@ -0,0 +1,66 @@
# language of the project (csharp, python, rust, java, typescript, javascript, go, cpp, or ruby)
# Special requirements:
# * csharp: Requires the presence of a .sln file in the project folder.
language: typescript
# whether to use the project's gitignore file to ignore files
# Added on 2025-04-07
ignore_all_files_in_gitignore: true
# list of additional paths to ignore
# same syntax as gitignore, so you can use * and **
# Was previously called `ignored_dirs`, please update your config if you are using that.
# Added (renamed) on 2025-04-07
ignored_paths: []
# whether the project is in read-only mode
# If set to true, all editing tools will be disabled and attempts to use them will result in an error
# Added on 2025-04-18
read_only: false
# list of tool names to exclude. We recommend not excluding any tools, see the readme for more details.
# Below is the complete list of tools for convenience.
# To make sure you have the latest list of tools, and to view their descriptions,
# execute `uv run scripts/print_tool_overview.py`.
#
# * `activate_project`: Activates a project by name.
# * `check_onboarding_performed`: Checks whether project onboarding was already performed.
# * `create_text_file`: Creates/overwrites a file in the project directory.
# * `delete_lines`: Deletes a range of lines within a file.
# * `delete_memory`: Deletes a memory from Serena's project-specific memory store.
# * `execute_shell_command`: Executes a shell command.
# * `find_referencing_code_snippets`: Finds code snippets in which the symbol at the given location is referenced.
# * `find_referencing_symbols`: Finds symbols that reference the symbol at the given location (optionally filtered by type).
# * `find_symbol`: Performs a global (or local) search for symbols with/containing a given name/substring (optionally filtered by type).
# * `get_current_config`: Prints the current configuration of the agent, including the active and available projects, tools, contexts, and modes.
# * `get_symbols_overview`: Gets an overview of the top-level symbols defined in a given file or directory.
# * `initial_instructions`: Gets the initial instructions for the current project.
# Should only be used in settings where the system prompt cannot be set,
# e.g. in clients you have no control over, like Claude Desktop.
# * `insert_after_symbol`: Inserts content after the end of the definition of a given symbol.
# * `insert_at_line`: Inserts content at a given line in a file.
# * `insert_before_symbol`: Inserts content before the beginning of the definition of a given symbol.
# * `list_dir`: Lists files and directories in the given directory (optionally with recursion).
# * `list_memories`: Lists memories in Serena's project-specific memory store.
# * `onboarding`: Performs onboarding (identifying the project structure and essential tasks, e.g. for testing or building).
# * `prepare_for_new_conversation`: Provides instructions for preparing for a new conversation (in order to continue with the necessary context).
# * `read_file`: Reads a file within the project directory.
# * `read_memory`: Reads the memory with the given name from Serena's project-specific memory store.
# * `remove_project`: Removes a project from the Serena configuration.
# * `replace_lines`: Replaces a range of lines within a file with new content.
# * `replace_symbol_body`: Replaces the full definition of a symbol.
# * `restart_language_server`: Restarts the language server, may be necessary when edits not through Serena happen.
# * `search_for_pattern`: Performs a search for a pattern in the project.
# * `summarize_changes`: Provides instructions for summarizing the changes made to the codebase.
# * `switch_modes`: Activates modes by providing a list of their names
# * `think_about_collected_information`: Thinking tool for pondering the completeness of collected information.
# * `think_about_task_adherence`: Thinking tool for determining whether the agent is still on track with the current task.
# * `think_about_whether_you_are_done`: Thinking tool for determining whether the task is truly completed.
# * `write_memory`: Writes a named memory (for future reference) to Serena's project-specific memory store.
excluded_tools: []
# initial prompt for the project. It will always be given to the LLM upon activating the project
# (contrary to the memories, which are loaded on demand).
initial_prompt: ""
project_name: "stock-bot"

.vscode/mcp.json
@@ -1,21 +1,3 @@
 {
-  "mcpServers": {
-    "postgres": {
-      "command": "npx",
-      "args": [
-        "-y",
-        "@modelcontextprotocol/server-postgres",
-        "postgresql://trading_user:trading_pass_dev@localhost:5432/trading_bot"
-      ]
-    },
-    "mongodb": {
-      "command": "npx",
-      "args": [
-        "-y",
-        "mongodb-mcp-server",
-        "--connectionString",
-        "mongodb://trading_admin:trading_mongo_dev@localhost:27017/stock?authSource=admin"
-      ]
-    }
-  }
 }

CLAUDE.md
@@ -1,171 +0,0 @@
# CLAUDE.md
This file provides guidance to Claude Code (claude.ai/code) when working with code in this repository.
## Development Commands
**Package Manager**: Bun (v1.1.0+)
**Build & Development**:
- `bun install` - Install dependencies
- `bun run dev` - Start all services in development mode (uses Turbo)
- `bun run build` - Build all services and libraries
- `bun run build:libs` - Build only shared libraries
- `./scripts/build-all.sh` - Custom build script with options
**Testing**:
- `bun test` - Run all tests
- `bun run test:libs` - Test only shared libraries
- `bun run test:apps` - Test only applications
- `bun run test:coverage` - Run tests with coverage
**Code Quality**:
- `bun run lint` - Lint TypeScript files
- `bun run lint:fix` - Auto-fix linting issues
- `bun run format` - Format code using Prettier
- `./scripts/format.sh` - Format script
**Infrastructure**:
- `bun run infra:up` - Start database infrastructure (PostgreSQL, QuestDB, MongoDB, Dragonfly)
- `bun run infra:down` - Stop infrastructure
- `bun run infra:reset` - Reset infrastructure with clean volumes
- `bun run docker:admin` - Start admin GUIs (pgAdmin, Mongo Express, Redis Insight)
**Database Setup**:
- `bun run db:setup-ib` - Setup Interactive Brokers database schema
- `bun run db:init` - Initialize database schemas
## Architecture Overview
**Microservices Architecture** with shared libraries and multi-database storage:
### Core Services (`apps/`)
- **data-ingestion** - Market data ingestion from multiple providers (Yahoo, QuoteMedia, IB)
- **processing-service** - Data cleaning, validation, and technical indicators
- **strategy-service** - Trading strategies and backtesting (multi-mode: live, event-driven, vectorized, hybrid)
- **execution-service** - Order management and risk controls
- **portfolio-service** - Position tracking and performance analytics
- **web-app** - React dashboard with real-time updates
### Shared Libraries (`libs/`)
- **config** - Environment configuration with Zod validation
- **logger** - Loki-integrated structured logging (use `getLogger()` pattern)
- **http** - HTTP client with proxy support and rate limiting
- **cache** - Redis/Dragonfly caching layer
- **queue** - BullMQ-based job processing with batch support
- **postgres-client** - PostgreSQL operations with transactions
- **questdb-client** - Time-series data storage
- **mongodb-client** - Document storage operations
- **utils** - Financial calculations and technical indicators
### Database Strategy
- **PostgreSQL** - Transactional data (orders, positions, strategies)
- **QuestDB** - Time-series data (OHLCV, indicators, performance metrics)
- **MongoDB** - Document storage (configurations, raw responses)
- **Dragonfly** - Event bus and caching (Redis-compatible)
## Key Patterns & Conventions
**Library Usage**:
- Import from shared libraries: `import { getLogger } from '@stock-bot/logger'`
- Use configuration: `import { databaseConfig } from '@stock-bot/config'`
- Logger pattern: `const logger = getLogger('service-name')`
**Service Structure**:
- Each service has `src/index.ts` as entry point
- Routes in `src/routes/` using Hono framework
- Handlers/services in `src/handlers/` or `src/services/`
- Use dependency injection pattern
**Data Processing**:
- Raw data → QuestDB via handlers
- Processed data → PostgreSQL via processing service
- Event-driven communication via Dragonfly
- Queue-based batch processing for large datasets
**Multi-Mode Backtesting**:
- **Live Mode** - Real-time trading with brokers
- **Event-Driven** - Realistic simulation with market conditions
- **Vectorized** - Fast mathematical backtesting for optimization
- **Hybrid** - Validation by comparing vectorized vs event-driven results
## Development Workflow
1. **Start Infrastructure**: `bun run infra:up`
2. **Build Libraries**: `bun run build:libs`
3. **Start Development**: `bun run dev`
4. **Access UIs**:
- Dashboard: http://localhost:4200
- QuestDB Console: http://localhost:9000
- Grafana: http://localhost:3000
- pgAdmin: http://localhost:8080
## Important Files & Locations
**Configuration**:
- Environment variables in `.env` files
- Service configs in `libs/config/src/`
- Database init scripts in `database/postgres/init/`
**Key Scripts**:
- `scripts/build-all.sh` - Production build with cleanup
- `scripts/docker.sh` - Docker management
- `scripts/format.sh` - Code formatting
- `scripts/setup-mcp.sh` - Setup Model Context Protocol servers for database access
**Documentation**:
- `SIMPLIFIED-ARCHITECTURE.md` - Detailed architecture overview
- `DEVELOPMENT-ROADMAP.md` - Development phases and priorities
- Individual library READMEs in `libs/*/README.md`
## Current Development Phase
**Phase 1: Data Foundation Layer** (In Progress)
- Enhancing data provider reliability and rate limiting
- Implementing data validation and quality metrics
- Optimizing QuestDB storage for time-series data
- Building robust HTTP client with circuit breakers
Focus on data quality and provider fault tolerance before advancing to strategy implementation.
## Testing & Quality
- Use Bun's built-in test runner
- Integration tests with TestContainers for databases
- ESLint for code quality with TypeScript rules
- Prettier for code formatting
- All services should have health check endpoints
## Model Context Protocol (MCP) Setup
**MCP Database Servers** are configured in `.vscode/mcp.json` for direct database access:
- **PostgreSQL MCP Server**: Provides read-only access to PostgreSQL database
- Connection: `postgresql://trading_user:trading_pass_dev@localhost:5432/trading_bot`
- Package: `@modelcontextprotocol/server-postgres`
- **MongoDB MCP Server**: Official MongoDB team server for database and Atlas interaction
- Connection: `mongodb://trading_admin:trading_mongo_dev@localhost:27017/stock?authSource=admin`
- Package: `mongodb-mcp-server` (official MongoDB JavaScript team package)
**Setup Commands**:
- `./scripts/setup-mcp.sh` - Setup and test MCP servers
- `bun run infra:up` - Start database infrastructure (required for MCP)
**Usage**: Once configured, Claude Code can directly query and inspect database schemas and data through natural language commands.
## Environment Variables
Key environment variables (see `.env` example):
- `NODE_ENV` - Environment (development/production)
- `DATA_SERVICE_PORT` - Port for data service
- `DRAGONFLY_HOST/PORT` - Cache/event bus connection
- Database connection strings for PostgreSQL, QuestDB, MongoDB
## Monitoring & Observability
- **Logging**: Structured JSON logs to Loki
- **Metrics**: Prometheus metrics collection
- **Visualization**: Grafana dashboards
- **Queue Monitoring**: Bull Board for job queues
- **Health Checks**: All services expose `/health` endpoints

@@ -1,183 +0,0 @@
# Migration Guide: From Singleton to Connection Pool Pattern
## Overview
This guide explains how to migrate from the singleton anti-pattern to a proper connection pool pattern using the new `@stock-bot/connection-factory` library.
## Current State (Singleton Anti-Pattern)
```typescript
// ❌ Old pattern - global singleton
import { connectMongoDB, getMongoDBClient } from '@stock-bot/mongodb-client';
import { connectPostgreSQL, getPostgreSQLClient } from '@stock-bot/postgres-client';
// Initialize once at startup
await connectMongoDB(config);
await connectPostgreSQL(config);
// Use everywhere
const mongo = getMongoDBClient();
const postgres = getPostgreSQLClient();
```
### Problems with this approach:
- Global state makes testing difficult
- All operations share the same connection pool
- Can't optimize pool sizes for different use cases
- Memory leaks from persistent connections
- Hard to implement graceful shutdown
## New Pattern (Connection Factory + Service Container)
### Step 1: Set up Connection Factory
```typescript
// ✅ New pattern - connection factory
import { setupServiceContainer } from './setup/database-setup';

// Initialize service container at startup
const container = await setupServiceContainer();

// Register cleanup
shutdown.register(async () => {
  await container.dispose();
});
```
### Step 2: Update Handlers to Use Container
```typescript
// ✅ Use OperationContext with container
export class MyHandler {
  constructor(private readonly container: ServiceContainer) {}

  async handleOperation(data: any) {
    const context = OperationContext.create('my-handler', 'operation', {
      container: this.container,
    });
    try {
      // Connections are managed by the container
      await context.mongodb.insertOne(data);
      await context.postgres.query('...');
      await context.cache.set('key', 'value');
    } finally {
      // Clean up resources
      await context.dispose();
    }
  }
}
```
### Step 3: Update Route Handlers
```typescript
// Pass container to route handlers
export function createRoutes(container: ServiceContainer) {
  const router = new Hono();
  const handler = new MyHandler(container);

  router.get('/data', async (c) => {
    const result = await handler.handleOperation(c.req.query());
    return c.json(result);
  });

  return router;
}
```
## Migration Checklist
### For Each Service:
1. **Create database setup module**
```typescript
// apps/[service-name]/src/setup/database-setup.ts
export async function setupServiceContainer(): Promise<ServiceContainer> {
  // Configure connection pools based on service needs
}
```
2. **Update main index.ts**
- Remove direct `connectMongoDB()` and `connectPostgreSQL()` calls
- Replace with `setupServiceContainer()`
- Pass container to route handlers and job processors
3. **Update handlers**
- Accept `ServiceContainer` in constructor
- Create `OperationContext` with container
- Remove direct database client imports
- Add `context.dispose()` in finally blocks
4. **Update job handlers**
```typescript
// Before
export async function myJobHandler(job: Job) {
  const mongo = getMongoDBClient();
  // ...
}

// After
export function createMyJobHandler(container: ServiceContainer) {
  return async (job: Job) => {
    const context = OperationContext.create('job', job.name, {
      container,
    });
    try {
      // Use context.mongodb, context.postgres, etc.
    } finally {
      await context.dispose();
    }
  };
}
```
## Pool Size Recommendations
The `PoolSizeCalculator` provides optimal pool sizes based on service type:
| Service | Min | Max | Use Case |
|---------|-----|-----|----------|
| data-ingestion | 5 | 50 | High-volume batch imports |
| data-pipeline | 3 | 30 | Data processing pipelines |
| web-api | 2 | 10 | Low-latency API requests |
| processing-service | 2 | 20 | CPU-intensive operations |
| portfolio-service | 2 | 15 | Portfolio calculations |
| strategy-service | 3 | 25 | Strategy backtesting |
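A lookup in the spirit of `PoolSizeCalculator`, built directly from the table above (the real class may compute sizes rather than look them up; the default values are an assumption):

```typescript
// Sketch: per-service pool sizes from the table above.
interface PoolSize {
  min: number;
  max: number;
}

const POOL_SIZES: Record<string, PoolSize> = {
  'data-ingestion': { min: 5, max: 50 },
  'data-pipeline': { min: 3, max: 30 },
  'web-api': { min: 2, max: 10 },
  'processing-service': { min: 2, max: 20 },
  'portfolio-service': { min: 2, max: 15 },
  'strategy-service': { min: 3, max: 25 },
};

export function poolSizeFor(service: string): PoolSize {
  return POOL_SIZES[service] ?? { min: 2, max: 10 }; // conservative default (assumed)
}
```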
## Benefits After Migration
1. **Better Resource Management**
- Each service gets appropriately sized connection pools
- Automatic cleanup with dispose pattern
- No more connection leaks
2. **Improved Testing**
- Easy to mock containers for tests
- No global state to reset between tests
- Can test with different configurations
3. **Enhanced Performance**
- Optimized pool sizes per service
- Isolated pools for heavy operations
- Better connection reuse
4. **Operational Benefits**
- Connection pool metrics per service
- Graceful shutdown handling
- Better error isolation
## Backward Compatibility
The `OperationContext` maintains backward compatibility:
- If no container is provided, it falls back to singleton pattern
- This allows gradual migration service by service
- Warning logs indicate when fallback is used
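The fallback behavior described above can be sketched as follows; the real `OperationContext` API is much richer, and the names here are simplified for illustration:

```typescript
// Sketch of container-or-singleton-fallback creation with a warning log.
interface ServiceContainer {
  mongodb: unknown;
}

const warnings: string[] = [];

class OperationContext {
  private constructor(readonly source: 'container' | 'singleton-fallback') {}

  static create(
    handler: string,
    op: string,
    options?: { container?: ServiceContainer }
  ): OperationContext {
    if (options?.container) {
      return new OperationContext('container');
    }
    // Fall back to the legacy singleton path and record a warning
    warnings.push(`${handler}/${op}: no container provided, using singleton fallback`);
    return new OperationContext('singleton-fallback');
  }
}
```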
## Example: Complete Service Migration
See `/apps/data-ingestion/src/handlers/example-handler.ts` for a complete example of:
- Using the service container
- Creating operation contexts
- Handling batch operations with scoped containers
- Proper resource cleanup

@@ -94,4 +94,4 @@
      "burstSize": 20
    }
  }
}

@@ -1,3 +1,3 @@
export { updateCeoChannels } from './update-ceo-channels.action';
export { updateUniqueSymbols } from './update-unique-symbols.action';
export { processIndividualSymbol } from './process-individual-symbol.action';

View file

@@ -1,111 +1,117 @@
import { getRandomUserAgent } from '@stock-bot/utils';
import type { CeoHandler } from '../ceo.handler';

export async function processIndividualSymbol(
  this: CeoHandler,
  payload: any,
  _context: any
): Promise<unknown> {
  const { ceoId, symbol, timestamp } = payload;
  const proxy = this.proxy?.getProxy();
  if (!proxy) {
    this.logger.warn('No proxy available for processing individual CEO symbol');
    return;
  }

  this.logger.debug('Processing individual CEO symbol', {
    ceoId,
    timestamp,
  });

  try {
    // Fetch detailed information for the individual symbol
    const response = await this.http.get(
      `https://api.ceo.ca/api/get_spiels?channel=${ceoId}&load_more=top` +
        (timestamp ? `&until=${timestamp}` : ''),
      {
        proxy: proxy,
        headers: {
          'User-Agent': getRandomUserAgent(),
        },
      }
    );

    if (!response.ok) {
      throw new Error(`Failed to fetch details for ceoId ${ceoId}: ${response.statusText}`);
    }

    const data = await response.json();

    const spielCount = data.spiels.length;
    if (spielCount === 0) {
      this.logger.warn(`No spiels found for ceoId ${ceoId}`);
      return null; // No data to process
    }
    const latestSpielTime = data.spiels[0]?.timestamp;
    const posts = data.spiels.map((spiel: any) => ({
      ceoId,
      spiel: spiel.spiel,
      spielReplyToId: spiel.spiel_reply_to_id,
      spielReplyTo: spiel.spiel_reply_to,
      spielReplyToName: spiel.spiel_reply_to_name,
      spielReplyToEdited: spiel.spiel_reply_to_edited,
      userId: spiel.user_id,
      name: spiel.name,
      timestamp: spiel.timestamp,
      spielId: spiel.spiel_id,
      color: spiel.color,
      parentId: spiel.parent_id,
      publicId: spiel.public_id,
      parentChannel: spiel.parent_channel,
      parentTimestamp: spiel.parent_timestamp,
      votes: spiel.votes,
      editable: spiel.editable,
      edited: spiel.edited,
      featured: spiel.featured,
      verified: spiel.verified,
      fake: spiel.fake,
      bot: spiel.bot,
      voted: spiel.voted,
      flagged: spiel.flagged,
      ownSpiel: spiel.own_spiel,
      score: spiel.score,
      savedId: spiel.saved_id,
      savedTimestamp: spiel.saved_timestamp,
      poll: spiel.poll,
      votedInPoll: spiel.voted_in_poll,
    }));

    await this.mongodb.batchUpsert('ceoPosts', posts, ['spielId']);
    this.logger.info(`Fetched ${spielCount} spiels for ceoId ${ceoId}`);

    // Update Shorts
    const shortRes = await this.http.get(
      `https://api.ceo.ca/api/short_positions/one?symbol=${symbol}`,
      {
        proxy: proxy,
        headers: {
          'User-Agent': getRandomUserAgent(),
        },
      }
    );

    if (shortRes.ok) {
      const shortData = await shortRes.json();
      if (shortData && shortData.positions) {
        await this.mongodb.batchUpsert('ceoShorts', shortData.positions, ['id']);
      }
      await this.scheduleOperation('process-individual-symbol', {
        ceoId: ceoId,
        timestamp: latestSpielTime,
      });
    }

    this.logger.info(
      `Successfully processed channel ${ceoId} and added channel ${ceoId} at timestamp ${latestSpielTime}`
    );

    return { ceoId, spielCount, timestamp };
  } catch (error) {
    this.logger.error('Failed to process individual symbol', {
      error,
      ceoId,
      timestamp,
    });
    throw error;
  }
}

View file

@@ -1,67 +1,72 @@
import { getRandomUserAgent } from '@stock-bot/utils';
import type { CeoHandler } from '../ceo.handler';

export async function updateCeoChannels(
  this: CeoHandler,
  payload: number | undefined
): Promise<unknown> {
  const proxy = this.proxy?.getProxy();
  if (!proxy) {
    this.logger.warn('No proxy available for CEO channels update');
    return;
  }
  let page;
  if (payload === undefined) {
    page = 1;
  } else {
    page = payload;
  }

  this.logger.info(`Fetching CEO channels for page ${page} with proxy ${proxy}`);
  const response = await this.http.get(
    'https://api.ceo.ca/api/home?exchange=all&sort_by=symbol&sector=All&tab=companies&page=' + page,
    {
      proxy: proxy,
      headers: {
        'User-Agent': getRandomUserAgent(),
      },
    }
  );
  const results = await response.json();
  const channels = results.channel_categories[0].channels;
  const totalChannels = results.channel_categories[0].total_channels;
  const totalPages = Math.ceil(totalChannels / channels.length);
  const exchanges: { exchange: string; countryCode: string }[] = [];
  const symbols = channels.map((channel: any) => {
    // check if exchange is in the exchanges array object
    if (!exchanges.find((e: any) => e.exchange === channel.exchange)) {
      exchanges.push({
        exchange: channel.exchange,
        countryCode: 'CA',
      });
    }
    const details = channel.company_details || {};
    return {
      symbol: channel.symbol,
      exchange: channel.exchange,
      name: channel.title,
      type: channel.type,
      ceoId: channel.channel,
      marketCap: details.market_cap,
      volumeRatio: details.volume_ratio,
      avgVolume: details.avg_volume,
      stockType: details.stock_type,
      issueType: details.issue_type,
      sharesOutstanding: details.shares_outstanding,
      float: details.float,
    };
  });

  await this.mongodb.batchUpsert('ceoSymbols', symbols, ['symbol', 'exchange']);
  await this.mongodb.batchUpsert('ceoExchanges', exchanges, ['exchange']);

  if (page === 1) {
    for (let i = 2; i <= totalPages; i++) {
      this.logger.info(`Scheduling page ${i} of ${totalPages} for CEO channels`);
      await this.scheduleOperation('update-ceo-channels', i);
    }
  }
  this.logger.info(`Fetched CEO channels for page ${page}/${totalPages}`);
  return { page, totalPages };
}

View file

@@ -1,63 +1,71 @@
import type { CeoHandler } from '../ceo.handler';

export async function updateUniqueSymbols(
  this: CeoHandler,
  _payload: unknown,
  _context: any
): Promise<unknown> {
  this.logger.info('Starting update to get unique CEO symbols by ceoId');

  try {
    // Get unique ceoId values from the ceoSymbols collection
    const uniqueCeoIds = await this.mongodb.collection('ceoSymbols').distinct('ceoId');

    this.logger.info(`Found ${uniqueCeoIds.length} unique CEO IDs`);

    // Get detailed records for each unique ceoId (latest/first record)
    const uniqueSymbols = [];
    for (const ceoId of uniqueCeoIds) {
      const symbol = await this.mongodb
        .collection('ceoSymbols')
        .findOne({ ceoId }, { sort: { _id: -1 } }); // Get latest record

      if (symbol) {
        uniqueSymbols.push(symbol);
      }
    }

    this.logger.info(`Retrieved ${uniqueSymbols.length} unique symbol records`);

    // Schedule individual jobs for each unique symbol
    let scheduledJobs = 0;
    for (const symbol of uniqueSymbols) {
      // Schedule a job to process this individual symbol
      await this.scheduleOperation('process-individual-symbol', {
        ceoId: symbol.ceoId,
        symbol: symbol.symbol,
      });
      scheduledJobs++;

      // Add small delay to avoid overwhelming the queue
      if (scheduledJobs % 10 === 0) {
        this.logger.debug(`Scheduled ${scheduledJobs} jobs so far`);
      }
    }

    this.logger.info(`Successfully scheduled ${scheduledJobs} individual symbol update jobs`);

    // Cache the results for monitoring
    await this.cacheSet(
      'unique-symbols-last-run',
      {
        timestamp: new Date().toISOString(),
        totalUniqueIds: uniqueCeoIds.length,
        totalRecords: uniqueSymbols.length,
        scheduledJobs,
      },
      1800
    ); // Cache for 30 minutes

    return {
      success: true,
      uniqueCeoIds: uniqueCeoIds.length,
      uniqueRecords: uniqueSymbols.length,
      scheduledJobs,
      timestamp: new Date().toISOString(),
    };
  } catch (error) {
    this.logger.error('Failed to update unique CEO symbols', { error });
    throw error;
  }
}

View file

@@ -3,13 +3,9 @@
import {
  Handler,
  Operation,
  ScheduledOperation,
  type IServiceContainer,
} from '@stock-bot/handlers';
import { processIndividualSymbol, updateCeoChannels, updateUniqueSymbols } from './actions';

@Handler('ceo')
// @Disabled()
@@ -18,21 +14,21 @@ export class CeoHandler extends BaseHandler {
    super(services); // Handler name read from @Handler decorator
  }

  @ScheduledOperation('update-ceo-channels', '0 */15 * * *', {
    priority: 7,
    immediately: false,
    description: 'Get all CEO symbols and exchanges',
  })
  updateCeoChannels = updateCeoChannels;

  @Operation('update-unique-symbols')
  @ScheduledOperation('process-unique-symbols', '0 0 1 * *', {
    priority: 5,
    immediately: false,
    description: 'Process unique CEO symbols and schedule individual jobs',
  })
  updateUniqueSymbols = updateUniqueSymbols;

  @Operation('process-individual-symbol')
  processIndividualSymbol = processIndividualSymbol;
}

View file

@@ -1,114 +1,107 @@
import { OperationContext } from '@stock-bot/di';
import type { ServiceContainer } from '@stock-bot/di';

/**
 * Example handler showing how to use the new connection pooling pattern
 */
export class ExampleHandler {
  constructor(private readonly container: ServiceContainer) {}

  /**
   * Example operation using the enhanced OperationContext
   */
  async performOperation(data: any): Promise<void> {
    // Create operation context with container
    const context = new OperationContext('example-handler', 'perform-operation', this.container, {
      data,
    });

    try {
      // Log operation start
      context.logger.info('Starting operation', { data });

      // Use MongoDB through service resolution
      const mongodb = context.resolve<any>('mongodb');
      const result = await mongodb.collection('test').insertOne(data);
      context.logger.debug('MongoDB insert complete', { insertedId: result.insertedId });

      // Use PostgreSQL through service resolution
      const postgres = context.resolve<any>('postgres');
      await postgres.query('INSERT INTO operations (id, status) VALUES ($1, $2)', [
        result.insertedId,
        'completed',
      ]);

      // Use cache through service resolution
      const cache = context.resolve<any>('cache');
      await cache.set(`operation:${result.insertedId}`, {
        status: 'completed',
        timestamp: new Date(),
      });

      context.logger.info('Operation completed successfully');
    } catch (error) {
      context.logger.error('Operation failed', { error });
      throw error;
    }
  }

  /**
   * Example of batch operation with isolated connection pool
   */
  async performBatchOperation(items: any[]): Promise<void> {
    // Create a scoped container for this batch operation
    const scopedContainer = this.container.createScope();

    const context = new OperationContext('example-handler', 'batch-operation', scopedContainer, {
      itemCount: items.length,
    });

    try {
      context.logger.info('Starting batch operation', { itemCount: items.length });

      // Get services once for the batch
      const mongodb = context.resolve<any>('mongodb');
      const cache = context.resolve<any>('cache');

      // Process items in parallel
      const promises = items.map(async (item, index) => {
        const itemContext = new OperationContext(
          'example-handler',
          `batch-item-${index}`,
          scopedContainer,
          { item }
        );

        try {
          await mongodb.collection('batch').insertOne(item);
          await cache.set(`batch:${item.id}`, item);
        } catch (error) {
          itemContext.logger.error('Batch item failed', { error, itemIndex: index });
          throw error;
        }
      });

      await Promise.all(promises);
      context.logger.info('Batch operation completed');
    } finally {
      // Clean up scoped resources
      await scopedContainer.dispose();
    }
  }
}

/**
 * Example of how to use in a job handler
 */
export async function createExampleJobHandler(container: ServiceContainer) {
  return async (job: any) => {
    const handler = new ExampleHandler(container);

    if (job.data.type === 'batch') {
      await handler.performBatchOperation(job.data.items);
    } else {
      await handler.performOperation(job.data);
    }
  };
}

View file

@@ -1,94 +1,94 @@
/**
 * Example Handler - Demonstrates ergonomic handler patterns
 * Shows inline operations, service helpers, and scheduled operations
 */
import {
  BaseHandler,
  Handler,
  Operation,
  ScheduledOperation,
  type ExecutionContext,
  type IServiceContainer,
} from '@stock-bot/handlers';

@Handler('example')
export class ExampleHandler extends BaseHandler {
  constructor(services: IServiceContainer) {
    super(services);
  }

  /**
   * Simple inline operation - no separate action file needed
   */
  @Operation('get-stats')
  async getStats(): Promise<{ total: number; active: number; cached: boolean }> {
    // Use collection helper for cleaner MongoDB access
    const total = await this.collection('items').countDocuments();
    const active = await this.collection('items').countDocuments({ status: 'active' });

    // Use cache helpers with automatic prefixing
    const cached = await this.cacheGet<number>('last-total');
    await this.cacheSet('last-total', total, 300); // 5 minutes

    // Use log helper with automatic handler context
    this.log('info', 'Stats retrieved', { total, active });

    return { total, active, cached: cached !== null };
  }

  /**
   * Scheduled operation using combined decorator
   */
  @ScheduledOperation('cleanup-old-items', '0 2 * * *', {
    priority: 5,
    description: 'Clean up items older than 30 days',
  })
  async cleanupOldItems(): Promise<{ deleted: number }> {
    const thirtyDaysAgo = new Date(Date.now() - 30 * 24 * 60 * 60 * 1000);

    const result = await this.collection('items').deleteMany({
      createdAt: { $lt: thirtyDaysAgo },
    });

    this.log('info', 'Cleanup completed', { deleted: result.deletedCount });

    // Schedule a follow-up task
    await this.scheduleIn('generate-report', { type: 'cleanup' }, 60); // 1 minute

    return { deleted: result.deletedCount };
  }

  /**
   * Operation that uses proxy service
   */
  @Operation('fetch-external-data')
  async fetchExternalData(input: { url: string }): Promise<{ data: any }> {
    const proxyUrl = this.proxy.getProxy();
    if (!proxyUrl) {
      throw new Error('No proxy available');
    }

    // Use HTTP client with proxy
    const response = await this.http.get(input.url, {
      proxy: proxyUrl,
      timeout: 10000,
    });

    // Cache the result
    await this.cacheSet(`external:${input.url}`, response.data, 3600);

    return { data: response.data };
  }

  /**
   * Complex operation that still uses action file
   */
  @Operation('process-batch')
  async processBatch(input: any, context: ExecutionContext): Promise<unknown> {
    // For complex operations, still use action files
    const { processBatch } = await import('./actions/batch.action');
    return processBatch(this, input);
  }
}

View file

@@ -0,0 +1,38 @@
import type { IbHandler } from '../ib.handler';

export async function fetchExchangesAndSymbols(this: IbHandler): Promise<unknown> {
  this.logger.info('Starting IB exchanges and symbols fetch job');

  try {
    // Fetch session headers first
    const sessionHeaders = await this.fetchSession();
    if (!sessionHeaders) {
      this.logger.error('Failed to get session headers for IB job');
      return { success: false, error: 'No session headers' };
    }

    this.logger.info('Session headers obtained, fetching exchanges...');

    // Fetch exchanges
    const exchanges = await this.fetchExchanges();
    this.logger.info('Fetched exchanges from IB', { count: exchanges?.length || 0 });

    // Fetch symbols
    this.logger.info('Fetching symbols...');
    const symbols = await this.fetchSymbols();
    this.logger.info('Fetched symbols from IB', { count: symbols?.length || 0 });

    return {
      success: true,
      exchangesCount: exchanges?.length || 0,
      symbolsCount: symbols?.length || 0,
    };
  } catch (error) {
    this.logger.error('Failed to fetch IB exchanges and symbols', { error });
    return {
      success: false,
      error: error instanceof Error ? error.message : 'Unknown error',
    };
  }
}

View file

@@ -1,16 +1,15 @@
import type { IbHandler } from '../ib.handler';
import { IB_CONFIG } from '../shared/config';

export async function fetchExchanges(this: IbHandler): Promise<unknown[] | null> {
  try {
    // First get session headers
    const sessionHeaders = await this.fetchSession();
    if (!sessionHeaders) {
      throw new Error('Failed to get session headers');
    }

    this.logger.info('🔍 Fetching exchanges with session headers...');

    // The URL for the exchange data API
    const exchangeUrl = IB_CONFIG.BASE_URL + IB_CONFIG.EXCHANGE_API;
@@ -28,7 +27,7 @@ export async function fetchExchanges(sessionHeaders: Record<string, string>, con
      'X-Requested-With': 'XMLHttpRequest',
    };

    this.logger.info('📤 Making request to exchange API...', {
      url: exchangeUrl,
      headerCount: Object.keys(requestHeaders).length,
    });
@@ -41,7 +40,7 @@ export async function fetchExchanges(sessionHeaders: Record<string, string>, con
    });

    if (!response.ok) {
      this.logger.error('❌ Exchange API request failed', {
        status: response.status,
        statusText: response.statusText,
      });
@@ -50,19 +49,18 @@ export async function fetchExchanges(sessionHeaders: Record<string, string>, con
    const data = await response.json();
    const exchanges = data?.exchanges || [];

    this.logger.info('✅ Exchange data fetched successfully');
    this.logger.info('Saving IB exchanges to MongoDB...');
    await this.mongodb.batchUpsert('ibExchanges', exchanges, ['id', 'country_code']);
    this.logger.info('✅ Exchange IB data saved to MongoDB:', {
      count: exchanges.length,
    });
    return exchanges;
  } catch (error) {
    this.logger.error('❌ Failed to fetch exchanges', { error });
    return null;
  }
}

View file

@@ -0,0 +1,83 @@
import { Browser } from '@stock-bot/browser';
import type { IbHandler } from '../ib.handler';
import { IB_CONFIG } from '../shared/config';

export async function fetchSession(this: IbHandler): Promise<Record<string, string> | undefined> {
  try {
    await Browser.initialize({
      headless: true,
      timeout: IB_CONFIG.BROWSER_TIMEOUT,
      blockResources: false,
    });
    this.logger.info('✅ Browser initialized');

    const { page } = await Browser.createPageWithProxy(
      IB_CONFIG.BASE_URL + IB_CONFIG.PRODUCTS_PAGE,
      IB_CONFIG.DEFAULT_PROXY
    );
    this.logger.info('✅ Page created with proxy');

    const headersPromise = new Promise<Record<string, string> | undefined>(resolve => {
      let resolved = false;
      page.onNetworkEvent(event => {
        if (event.url.includes('/webrest/search/product-types/summary')) {
          if (event.type === 'request' && !resolved) {
            // Mark as resolved so the timeout fallback below does not fire after success
            resolved = true;
            try {
              resolve(event.headers);
            } catch (e) {
              resolve(undefined);
              this.logger.debug('Raw Summary Response error', { error: (e as Error).message });
            }
          }
        }
      });

      // Timeout fallback
      setTimeout(() => {
        if (!resolved) {
          resolved = true;
          this.logger.warn('Timeout waiting for headers');
          resolve(undefined);
        }
      }, IB_CONFIG.HEADERS_TIMEOUT);
    });

    this.logger.info('⏳ Waiting for page load...');
    await page.waitForLoadState('domcontentloaded', { timeout: IB_CONFIG.PAGE_LOAD_TIMEOUT });
    this.logger.info('✅ Page loaded');

    // Products tab
    this.logger.info('🔍 Looking for Products tab...');
    const productsTab = page.locator('#productSearchTab[role="tab"][href="#products"]');
    await productsTab.waitFor({ timeout: IB_CONFIG.ELEMENT_TIMEOUT });
    this.logger.info('✅ Found Products tab');

    this.logger.info('🖱️ Clicking Products tab...');
    await productsTab.click();
    this.logger.info('✅ Products tab clicked');

    // New Products checkbox
    this.logger.info('🔍 Looking for "New Products Only" radio button...');
    const radioButton = page.locator('span.checkbox-text:has-text("New Products Only")');
    await radioButton.waitFor({ timeout: IB_CONFIG.ELEMENT_TIMEOUT });
    this.logger.info(`🎯 Found "New Products Only" radio button`);
    await radioButton.first().click();
    this.logger.info('✅ "New Products Only" radio button clicked');

    // Wait for and return headers immediately when captured
    this.logger.info('⏳ Waiting for headers to be captured...');
    const headers = await headersPromise;
    await page.close();

    if (headers) {
      this.logger.info('✅ Headers captured successfully');
    } else {
      this.logger.warn('⚠️ No headers were captured');
    }
    return headers;
  } catch (error) {
    this.logger.error('Failed to fetch IB symbol summary', { error });
    return;
  }
}

View file

@ -1,18 +1,16 @@
-/**
- * IB Symbols Operations - Fetching symbol data from IB API
- */
-import { OperationContext } from '@stock-bot/di';
-import type { ServiceContainer } from '@stock-bot/di';
+import type { IbHandler } from '../ib.handler';
 import { IB_CONFIG } from '../shared/config';
-// Fetch symbols from IB using the session headers
-export async function fetchSymbols(sessionHeaders: Record<string, string>, container: ServiceContainer): Promise<unknown[] | null> {
-  const ctx = OperationContext.create('ib', 'symbols', { container });
+export async function fetchSymbols(this: IbHandler): Promise<unknown[] | null> {
   try {
-    ctx.logger.info('🔍 Fetching symbols with session headers...');
+    // First get session headers
+    const sessionHeaders = await this.fetchSession();
+    if (!sessionHeaders) {
+      throw new Error('Failed to get session headers');
+    }
+    this.logger.info('🔍 Fetching symbols with session headers...');
     // Prepare headers - include all session headers plus any additional ones
     const requestHeaders = {
       ...sessionHeaders,
@ -39,18 +37,15 @@ export async function fetchSymbols(sessionHeaders: Record<string, string>, conta
     };
     // Get Summary
-    const summaryResponse = await fetch(
-      IB_CONFIG.BASE_URL + IB_CONFIG.SUMMARY_API,
-      {
-        method: 'POST',
-        headers: requestHeaders,
-        proxy: IB_CONFIG.DEFAULT_PROXY,
-        body: JSON.stringify(requestBody),
-      }
-    );
+    const summaryResponse = await fetch(IB_CONFIG.BASE_URL + IB_CONFIG.SUMMARY_API, {
+      method: 'POST',
+      headers: requestHeaders,
+      proxy: IB_CONFIG.DEFAULT_PROXY,
+      body: JSON.stringify(requestBody),
+    });
     if (!summaryResponse.ok) {
-      ctx.logger.error('❌ Summary API request failed', {
+      this.logger.error('❌ Summary API request failed', {
         status: summaryResponse.status,
         statusText: summaryResponse.statusText,
       });
@ -58,36 +53,33 @@ export async function fetchSymbols(sessionHeaders: Record<string, string>, conta
     }
     const summaryData = await summaryResponse.json();
-    ctx.logger.info('✅ IB Summary data fetched successfully', {
+    this.logger.info('✅ IB Summary data fetched successfully', {
       totalCount: summaryData[0].totalCount,
     });
     const symbols = [];
     requestBody.pageSize = IB_CONFIG.PAGE_SIZE;
     const pageCount = Math.ceil(summaryData[0].totalCount / IB_CONFIG.PAGE_SIZE) || 0;
-    ctx.logger.info('Fetching Symbols for IB', { pageCount });
+    this.logger.info('Fetching Symbols for IB', { pageCount });
     const symbolPromises = [];
     for (let page = 1; page <= pageCount; page++) {
       requestBody.pageNumber = page;
       // Fetch symbols for the current page
-      const symbolsResponse = fetch(
-        IB_CONFIG.BASE_URL + IB_CONFIG.PRODUCTS_API,
-        {
-          method: 'POST',
-          headers: requestHeaders,
-          proxy: IB_CONFIG.DEFAULT_PROXY,
-          body: JSON.stringify(requestBody),
-        }
-      );
+      const symbolsResponse = fetch(IB_CONFIG.BASE_URL + IB_CONFIG.PRODUCTS_API, {
+        method: 'POST',
+        headers: requestHeaders,
+        proxy: IB_CONFIG.DEFAULT_PROXY,
+        body: JSON.stringify(requestBody),
+      });
       symbolPromises.push(symbolsResponse);
     }
     const responses = await Promise.all(symbolPromises);
     for (const response of responses) {
       if (!response.ok) {
-        ctx.logger.error('❌ Symbols API request failed', {
+        this.logger.error('❌ Symbols API request failed', {
           status: response.status,
           statusText: response.statusText,
         });
@ -98,29 +90,28 @@ export async function fetchSymbols(sessionHeaders: Record<string, string>, conta
       if (symJson && symJson.length > 0) {
         symbols.push(...symJson);
       } else {
-        ctx.logger.warn('⚠️ No symbols found in response');
+        this.logger.warn('⚠️ No symbols found in response');
         continue;
       }
     }
     if (symbols.length === 0) {
-      ctx.logger.warn('⚠️ No symbols fetched from IB');
+      this.logger.warn('⚠️ No symbols fetched from IB');
       return null;
     }
-    ctx.logger.info('✅ IB symbols fetched successfully, saving to DB...', {
+    this.logger.info('✅ IB symbols fetched successfully, saving to DB...', {
       totalSymbols: symbols.length,
     });
-    await ctx.mongodb.batchUpsert('ib_symbols', symbols, ['symbol', 'exchangeId']);
-    ctx.logger.info('Saved IB symbols to DB', {
+    await this.mongodb.batchUpsert('ib_symbols', symbols, ['symbol', 'exchangeId']);
+    this.logger.info('Saved IB symbols to DB', {
       totalSymbols: symbols.length,
     });
     return symbols;
   } catch (error) {
-    ctx.logger.error('❌ Failed to fetch symbols', { error });
+    this.logger.error('❌ Failed to fetch symbols', { error });
     return null;
-  } finally {
-    await ctx.dispose();
   }
 }
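The paging logic above derives the number of product pages from the summary's `totalCount`. Isolated as a small sketch (`pageCount` is a hypothetical helper; the `PAGE_SIZE` of 500 matches `IB_CONFIG`):

```typescript
// Number of pages needed to cover totalCount items at pageSize items per page.
// `|| 0` guards against NaN when totalCount is missing, as in fetchSymbols.
function pageCount(totalCount: number, pageSize: number): number {
  return Math.ceil(totalCount / pageSize) || 0;
}

console.log(pageCount(1234, 500)); // 3
console.log(pageCount(500, 500)); // 1
console.log(pageCount(0, 500)); // 0
```

Because all page requests are independent POSTs with only `pageNumber` varying, they can be issued concurrently and gathered with `Promise.all`, which is exactly what the `symbolPromises` loop does.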

View file

@ -0,0 +1,5 @@
export { fetchSession } from './fetch-session.action';
export { fetchExchanges } from './fetch-exchanges.action';
export { fetchSymbols } from './fetch-symbols.action';
export { fetchExchangesAndSymbols } from './fetch-exchanges-and-symbols.action';

View file

@ -1,90 +1,33 @@
-/**
- * Interactive Brokers Provider for new queue system
- */
-import { getLogger } from '@stock-bot/logger';
-import {
-  createJobHandler,
-  handlerRegistry,
-  type HandlerConfigWithSchedule,
-} from '@stock-bot/queue';
-import type { ServiceContainer } from '@stock-bot/di';
-
-const logger = getLogger('ib-provider');
-
-// Initialize and register the IB provider
-export function initializeIBProvider(container: ServiceContainer) {
-  logger.debug('Registering IB provider with scheduled jobs...');
-  const ibProviderConfig: HandlerConfigWithSchedule = {
-    name: 'ib',
-    operations: {
-      'fetch-session': createJobHandler(async () => {
-        // payload contains session configuration (not used in current implementation)
-        logger.debug('Processing session fetch request');
-        const { fetchSession } = await import('./operations/session.operations');
-        return fetchSession(container);
-      }),
-      'fetch-exchanges': createJobHandler(async () => {
-        // payload should contain session headers
-        logger.debug('Processing exchanges fetch request');
-        const { fetchSession } = await import('./operations/session.operations');
-        const { fetchExchanges } = await import('./operations/exchanges.operations');
-        const sessionHeaders = await fetchSession(container);
-        if (sessionHeaders) {
-          return fetchExchanges(sessionHeaders, container);
-        }
-        throw new Error('Failed to get session headers');
-      }),
-      'fetch-symbols': createJobHandler(async () => {
-        // payload should contain session headers
-        logger.debug('Processing symbols fetch request');
-        const { fetchSession } = await import('./operations/session.operations');
-        const { fetchSymbols } = await import('./operations/symbols.operations');
-        const sessionHeaders = await fetchSession(container);
-        if (sessionHeaders) {
-          return fetchSymbols(sessionHeaders, container);
-        }
-        throw new Error('Failed to get session headers');
-      }),
-      'ib-exchanges-and-symbols': createJobHandler(async () => {
-        // Legacy operation for scheduled jobs
-        logger.info('Fetching symbol summary from IB');
-        const { fetchSession } = await import('./operations/session.operations');
-        const { fetchExchanges } = await import('./operations/exchanges.operations');
-        const { fetchSymbols } = await import('./operations/symbols.operations');
-        const sessionHeaders = await fetchSession(container);
-        logger.info('Fetched symbol summary from IB');
-        if (sessionHeaders) {
-          logger.debug('Fetching exchanges from IB');
-          const exchanges = await fetchExchanges(sessionHeaders, container);
-          logger.info('Fetched exchanges from IB', { count: exchanges?.length });
-          logger.debug('Fetching symbols from IB');
-          const symbols = await fetchSymbols(sessionHeaders, container);
-          logger.info('Fetched symbols from IB', { symbols });
-          return { exchangesCount: exchanges?.length, symbolsCount: symbols?.length };
-        }
-        return null;
-      }),
-    },
-    scheduledJobs: [
-      {
-        type: 'ib-exchanges-and-symbols',
-        operation: 'ib-exchanges-and-symbols',
-        cronPattern: '0 0 * * 0', // Every Sunday at midnight
-        priority: 5,
-        description: 'Fetch and update IB exchanges and symbols data',
-        // immediately: true, // Don't run immediately during startup to avoid conflicts
-      },
-    ],
-  };
-  handlerRegistry.registerWithSchedule(ibProviderConfig);
-  logger.debug('IB provider registered successfully with scheduled jobs');
-}
+import {
+  BaseHandler,
+  Handler,
+  Operation,
+  ScheduledOperation,
+  type IServiceContainer,
+} from '@stock-bot/handlers';
+import { fetchExchanges, fetchExchangesAndSymbols, fetchSession, fetchSymbols } from './actions';
+
+@Handler('ib')
+export class IbHandler extends BaseHandler {
+  constructor(services: IServiceContainer) {
+    super(services);
+  }
+
+  @Operation('fetch-session')
+  fetchSession = fetchSession;
+
+  @Operation('fetch-exchanges')
+  fetchExchanges = fetchExchanges;
+
+  @Operation('fetch-symbols')
+  fetchSymbols = fetchSymbols;
+
+  @Operation('ib-exchanges-and-symbols')
+  @ScheduledOperation('ib-exchanges-and-symbols', '0 0 * * 0', {
+    priority: 5,
+    description: 'Fetch and update IB exchanges and symbols data',
+    immediately: false,
+  })
+  fetchExchangesAndSymbols = fetchExchangesAndSymbols;
+}
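The new handler relies on the `@Handler`/`@Operation` decorators from `@stock-bot/handlers` to map operation names onto class members. The decorator implementations are not shown in this commit, but the underlying idea — a registry that maps handler and operation names to callables and dispatches jobs through it — can be sketched without decorator syntax (all names below are illustrative, not the real `@stock-bot/handlers` API):

```typescript
// Minimal operation-registry sketch: handlers expose named operations,
// and a dispatcher looks them up by handler name + operation name.
type OperationFn = (input?: unknown) => Promise<unknown>;

class HandlerRegistry {
  private handlers = new Map<string, Map<string, OperationFn>>();

  // What @Handler + @Operation would accumulate via decorator metadata.
  register(handlerName: string, operations: Record<string, OperationFn>): void {
    this.handlers.set(handlerName, new Map(Object.entries(operations)));
  }

  // What the queue worker would call when a job arrives.
  async dispatch(handlerName: string, operation: string, input?: unknown): Promise<unknown> {
    const op = this.handlers.get(handlerName)?.get(operation);
    if (!op) throw new Error(`Unknown operation ${handlerName}/${operation}`);
    return op(input);
  }
}

const registry = new HandlerRegistry();
registry.register('ib', {
  'fetch-session': async () => ({ headers: { cookie: 'stub' } }),
});

registry.dispatch('ib', 'fetch-session').then(result => console.log(result));
```

The decorator form shown in the diff keeps this wiring declarative: each `@Operation` annotation stands in for one `register` entry, and `@ScheduledOperation` additionally attaches the cron metadata that the old `scheduledJobs` array carried explicitly.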

View file

@ -1,91 +0,0 @@
/**
* IB Session Operations - Browser automation for session headers
*/
import { Browser } from '@stock-bot/browser';
import { OperationContext } from '@stock-bot/di';
import type { ServiceContainer } from '@stock-bot/di';
import { IB_CONFIG } from '../shared/config';
export async function fetchSession(container: ServiceContainer): Promise<Record<string, string> | undefined> {
const ctx = OperationContext.create('ib', 'session', { container });
try {
await Browser.initialize({
headless: true,
timeout: IB_CONFIG.BROWSER_TIMEOUT,
blockResources: false
});
ctx.logger.info('✅ Browser initialized');
const { page } = await Browser.createPageWithProxy(
IB_CONFIG.BASE_URL + IB_CONFIG.PRODUCTS_PAGE,
IB_CONFIG.DEFAULT_PROXY
);
ctx.logger.info('✅ Page created with proxy');
const headersPromise = new Promise<Record<string, string> | undefined>(resolve => {
let resolved = false;
page.onNetworkEvent(event => {
if (event.url.includes('/webrest/search/product-types/summary')) {
if (event.type === 'request') {
try {
resolve(event.headers);
} catch (e) {
resolve(undefined);
ctx.logger.debug('Raw Summary Response error', { error: (e as Error).message });
}
}
}
});
// Timeout fallback
setTimeout(() => {
if (!resolved) {
resolved = true;
ctx.logger.warn('Timeout waiting for headers');
resolve(undefined);
}
}, IB_CONFIG.HEADERS_TIMEOUT);
});
ctx.logger.info('⏳ Waiting for page load...');
await page.waitForLoadState('domcontentloaded', { timeout: IB_CONFIG.PAGE_LOAD_TIMEOUT });
ctx.logger.info('✅ Page loaded');
//Products tabs
ctx.logger.info('🔍 Looking for Products tab...');
const productsTab = page.locator('#productSearchTab[role="tab"][href="#products"]');
await productsTab.waitFor({ timeout: IB_CONFIG.ELEMENT_TIMEOUT });
ctx.logger.info('✅ Found Products tab');
ctx.logger.info('🖱️ Clicking Products tab...');
await productsTab.click();
ctx.logger.info('✅ Products tab clicked');
// New Products Checkbox
ctx.logger.info('🔍 Looking for "New Products Only" radio button...');
const radioButton = page.locator('span.checkbox-text:has-text("New Products Only")');
await radioButton.waitFor({ timeout: IB_CONFIG.ELEMENT_TIMEOUT });
ctx.logger.info('🎯 Found "New Products Only" radio button');
await radioButton.first().click();
ctx.logger.info('✅ "New Products Only" radio button clicked');
// Wait for and return headers immediately when captured
ctx.logger.info('⏳ Waiting for headers to be captured...');
const headers = await headersPromise;
page.close();
if (headers) {
ctx.logger.info('✅ Headers captured successfully');
} else {
ctx.logger.warn('⚠️ No headers were captured');
}
return headers;
} catch (error) {
ctx.logger.error('Failed to fetch IB symbol summary', { error });
return;
} finally {
await ctx.dispose();
}
}

View file

@ -8,16 +8,17 @@ export const IB_CONFIG = {
  EXCHANGE_API: '/webrest/exchanges',
  SUMMARY_API: '/webrest/search/product-types/summary',
  PRODUCTS_API: '/webrest/search/products-by-filters',
  // Browser configuration
  BROWSER_TIMEOUT: 10000,
  PAGE_LOAD_TIMEOUT: 20000,
  ELEMENT_TIMEOUT: 5000,
  HEADERS_TIMEOUT: 30000,
  // API configuration
  DEFAULT_PROXY: 'http://doimvbnb-US-rotate:w5fpiwrb9895@p.webshare.io:80',
  PAGE_SIZE: 500,
  PRODUCT_COUNTRIES: ['CA', 'US'],
  PRODUCT_TYPES: ['STK'],
};

View file

@ -6,11 +6,12 @@
 import type { IServiceContainer } from '@stock-bot/handlers';
 import { autoRegisterHandlers } from '@stock-bot/handlers';
 import { getLogger } from '@stock-bot/logger';
 // Import handlers for bundling (ensures they're included in the build)
 import './qm/qm.handler';
 import './webshare/webshare.handler';
 import './ceo/ceo.handler';
+import './ib/ib.handler';
 // Add more handler imports as needed
 const logger = getLogger('handler-init');
@ -21,21 +22,17 @@ const logger = getLogger('handler-init');
 export async function initializeAllHandlers(serviceContainer: IServiceContainer): Promise<void> {
   try {
     // Auto-register all handlers in this directory
-    const result = await autoRegisterHandlers(
-      __dirname,
-      serviceContainer,
-      {
-        pattern: '.handler.',
-        exclude: ['test', 'spec'],
-        dryRun: false
-      }
-    );
+    const result = await autoRegisterHandlers(__dirname, serviceContainer, {
+      pattern: '.handler.',
+      exclude: ['test', 'spec'],
+      dryRun: false,
+    });
     logger.info('Handler auto-registration complete', {
       registered: result.registered,
-      failed: result.failed
+      failed: result.failed,
     });
     if (result.failed.length > 0) {
       logger.error('Some handlers failed to register', { failed: result.failed });
     }
@ -51,21 +48,20 @@ export async function initializeAllHandlers(serviceContainer: IServiceContainer)
  */
 async function manualHandlerRegistration(serviceContainer: any): Promise<void> {
   logger.warn('Falling back to manual handler registration');
   try {
     // // Import and register handlers manually
     // const { QMHandler } = await import('./qm/qm.handler');
     // const qmHandler = new QMHandler(serviceContainer);
     // qmHandler.register();
     // const { WebShareHandler } = await import('./webshare/webshare.handler');
     // const webShareHandler = new WebShareHandler(serviceContainer);
     // webShareHandler.register();
     logger.info('Manual handler registration complete');
   } catch (error) {
     logger.error('Manual handler registration failed', { error });
     throw error;
   }
 }

View file

@ -1,24 +1,23 @@
 /**
  * Proxy Check Operations - Checking proxy functionality
  */
-import type { ProxyInfo } from '@stock-bot/proxy';
 import { OperationContext } from '@stock-bot/di';
 import { getLogger } from '@stock-bot/logger';
+import type { ProxyInfo } from '@stock-bot/proxy';
 import { fetch } from '@stock-bot/utils';
 import { PROXY_CONFIG } from '../shared/config';
 /**
  * Check if a proxy is working
  */
 export async function checkProxy(proxy: ProxyInfo): Promise<ProxyInfo> {
   const ctx = {
     logger: getLogger('proxy-check'),
     resolve: <T>(_name: string) => {
       throw new Error(`Service container not available for proxy operations`);
-    }
+    },
   } as any;
   let success = false;
   ctx.logger.debug(`Checking Proxy:`, {
     protocol: proxy.protocol,
@ -28,16 +27,17 @@ export async function checkProxy(proxy: ProxyInfo): Promise<ProxyInfo> {
   try {
     // Test the proxy using fetch with proxy support
-    const proxyUrl = proxy.username && proxy.password
-      ? `${proxy.protocol}://${encodeURIComponent(proxy.username)}:${encodeURIComponent(proxy.password)}@${proxy.host}:${proxy.port}`
-      : `${proxy.protocol}://${proxy.host}:${proxy.port}`;
+    const proxyUrl =
+      proxy.username && proxy.password
+        ? `${proxy.protocol}://${encodeURIComponent(proxy.username)}:${encodeURIComponent(proxy.password)}@${proxy.host}:${proxy.port}`
+        : `${proxy.protocol}://${proxy.host}:${proxy.port}`;
     const response = await fetch(PROXY_CONFIG.CHECK_URL, {
       proxy: proxyUrl,
       signal: AbortSignal.timeout(PROXY_CONFIG.CHECK_TIMEOUT),
-      logger: ctx.logger
+      logger: ctx.logger,
     } as any);
     const data = await response.text();
     const isWorking = response.ok;
@ -94,7 +94,11 @@
 /**
  * Update proxy data in cache with working/total stats and average response time
  */
-async function updateProxyInCache(proxy: ProxyInfo, isWorking: boolean, ctx: OperationContext): Promise<void> {
+async function updateProxyInCache(
+  proxy: ProxyInfo,
+  isWorking: boolean,
+  ctx: OperationContext
+): Promise<void> {
   const _cacheKey = `${PROXY_CONFIG.CACHE_KEY}:${proxy.protocol}://${proxy.host}:${proxy.port}`;
   try {
@ -167,6 +171,6 @@
 function updateProxyStats(sourceId: string, success: boolean, ctx: OperationContext) {
   // Stats are now handled by the global ProxyManager
   ctx.logger.debug('Proxy check result', { sourceId, success });
   // TODO: Integrate with global ProxyManager stats if needed
 }
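The `proxyUrl` ternary above depends on `encodeURIComponent` to keep reserved characters in credentials (`@`, `:`, `/`) from corrupting the URL structure. A standalone sketch of that logic (the `ProxyTarget` type and sample values are hypothetical, not real credentials):

```typescript
interface ProxyTarget {
  protocol: string;
  host: string;
  port: number;
  username?: string;
  password?: string;
}

// Build a proxy URL, percent-encoding credentials so characters like '@' or ':'
// in the username/password cannot be mistaken for URL delimiters.
function buildProxyUrl(p: ProxyTarget): string {
  return p.username && p.password
    ? `${p.protocol}://${encodeURIComponent(p.username)}:${encodeURIComponent(p.password)}@${p.host}:${p.port}`
    : `${p.protocol}://${p.host}:${p.port}`;
}

console.log(buildProxyUrl({ protocol: 'http', host: 'proxy.example.com', port: 8080 }));
// http://proxy.example.com:8080
console.log(
  buildProxyUrl({
    protocol: 'http',
    host: 'proxy.example.com',
    port: 8080,
    username: 'user@corp',
    password: 'p:w',
  })
);
// http://user%40corp:p%3Aw@proxy.example.com:8080
```

Without the encoding, a username such as `user@corp` would make the URL parser treat `corp` as the host, which is why the check function encodes both fields unconditionally.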

View file

@ -1,9 +1,8 @@
 /**
  * Proxy Query Operations - Getting active proxies from cache
  */
-import type { ProxyInfo } from '@stock-bot/proxy';
 import { OperationContext } from '@stock-bot/di';
+import type { ProxyInfo } from '@stock-bot/proxy';
 import { PROXY_CONFIG } from '../shared/config';
 /**
@ -17,7 +16,7 @@ export async function getRandomActiveProxy(
   minSuccessRate: number = 50
 ): Promise<ProxyInfo | null> {
   const ctx = OperationContext.create('proxy', 'get-random');
   try {
     // Get all active proxy keys from cache
     const pattern = protocol
@ -56,7 +55,10 @@ export async function getRandomActiveProxy(
         return proxyData;
       }
     } catch (error) {
-      ctx.logger.debug('Error reading proxy from cache', { key, error: (error as Error).message });
+      ctx.logger.debug('Error reading proxy from cache', {
+        key,
+        error: (error as Error).message,
+      });
       continue;
     }
   }
@ -76,4 +78,4 @@ export async function getRandomActiveProxy(
     });
     return null;
   }
 }
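`getRandomActiveProxy` scans matching cache keys and returns one candidate; the selection step amounts to a uniform random pick over the surviving keys, sketched here (`pickRandom` is an illustrative helper, not from the codebase):

```typescript
// Pick a uniformly random element, or null for an empty list —
// the shape of the candidate selection in getRandomActiveProxy.
function pickRandom<T>(items: readonly T[]): T | null {
  if (items.length === 0) return null;
  return items[Math.floor(Math.random() * items.length)];
}

const keys = ['proxy:http://a:80', 'proxy:http://b:80', 'proxy:http://c:80'];
console.log(pickRandom(keys)); // one of the three keys
console.log(pickRandom([])); // null
```

Returning `null` for the empty case mirrors the function's `Promise<ProxyInfo | null>` contract, so callers handle "no working proxy" explicitly rather than via an exception.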

View file

@ -1,13 +1,13 @@
 /**
  * Proxy Queue Operations - Queueing proxy operations
  */
+import { OperationContext } from '@stock-bot/di';
 import type { ProxyInfo } from '@stock-bot/proxy';
 import { QueueManager } from '@stock-bot/queue';
-import { OperationContext } from '@stock-bot/di';
 export async function queueProxyFetch(): Promise<string> {
   const ctx = OperationContext.create('proxy', 'queue-fetch');
   const queueManager = QueueManager.getInstance();
   const queue = queueManager.getQueue('proxy');
   const job = await queue.add('proxy-fetch', {
@ -24,7 +24,7 @@ export async function queueProxyFetch(): Promise<string> {
 export async function queueProxyCheck(proxies: ProxyInfo[]): Promise<string> {
   const ctx = OperationContext.create('proxy', 'queue-check');
   const queueManager = QueueManager.getInstance();
   const queue = queueManager.getQueue('proxy');
   const job = await queue.add('proxy-check', {
@ -37,4 +37,4 @@ export async function queueProxyCheck(proxies: ProxyInfo[]): Promise<string> {
   const jobId = job.id || 'unknown';
   ctx.logger.info('Proxy check job queued', { jobId, count: proxies.length });
   return jobId;
 }

View file

@ -1,10 +1,14 @@
 /**
  * Proxy Provider for new queue system
  */
-import type { ProxyInfo } from '@stock-bot/proxy';
-import { getLogger } from '@stock-bot/logger';
-import { handlerRegistry, createJobHandler, type HandlerConfigWithSchedule } from '@stock-bot/queue';
 import type { ServiceContainer } from '@stock-bot/di';
+import { getLogger } from '@stock-bot/logger';
+import type { ProxyInfo } from '@stock-bot/proxy';
+import {
+  createJobHandler,
+  handlerRegistry,
+  type HandlerConfigWithSchedule,
+} from '@stock-bot/queue';
 const handlerLogger = getLogger('proxy-handler');

View file

@ -137,4 +137,4 @@ export const PROXY_CONFIG = {
      protocol: 'https',
    },
  ],
};

View file

@ -10,4 +10,4 @@ export interface ProxySource {
  total?: number; // Optional, used for stats
  percentWorking?: number; // Optional, used for stats
  lastChecked?: Date; // Optional, used for stats
}

View file

@ -6,16 +6,14 @@ import type { IServiceContainer } from '@stock-bot/handlers';
 export async function fetchExchanges(services: IServiceContainer): Promise<any[]> {
   // Get exchanges from MongoDB
-  const exchanges = await services.mongodb.collection('qm_exchanges')
-    .find({}).toArray();
+  const exchanges = await services.mongodb.collection('qm_exchanges').find({}).toArray();
   return exchanges;
 }
 export async function getExchangeByCode(services: IServiceContainer, code: string): Promise<any> {
   // Get specific exchange by code
-  const exchange = await services.mongodb.collection('qm_exchanges')
-    .findOne({ code });
+  const exchange = await services.mongodb.collection('qm_exchanges').findOne({ code });
   return exchange;
 }

View file

@ -9,10 +9,10 @@ import { QMSessionManager } from '../shared/session-manager';
 /**
  * Check existing sessions and queue creation jobs for needed sessions
  */
 export async function checkSessions(handler: BaseHandler): Promise<{
   cleaned: number;
   queued: number;
   message: string;
 }> {
   const sessionManager = QMSessionManager.getInstance();
   const cleanedCount = sessionManager.cleanupFailedSessions();
@ -24,17 +24,17 @@ export async function checkSessions(handler: BaseHandler): Promise<{
       const currentCount = sessionManager.getSessions(sessionId).length;
       const neededSessions = SESSION_CONFIG.MAX_SESSIONS - currentCount;
       for (let i = 0; i < neededSessions; i++) {
-        await handler.scheduleOperation('create-session', { sessionId , sessionType });
+        await handler.scheduleOperation('create-session', { sessionId, sessionType });
         handler.logger.info(`Queued job to create session for ${sessionType}`);
         queuedCount++;
       }
     }
   }
   return {
     cleaned: cleanedCount,
     queued: queuedCount,
-    message: `Session check completed: cleaned ${cleanedCount}, queued ${queuedCount}`
+    message: `Session check completed: cleaned ${cleanedCount}, queued ${queuedCount}`,
   };
 }
@ -42,16 +42,15 @@
  * Create a single session for a specific session ID
  */
 export async function createSingleSession(
   handler: BaseHandler,
   input: any
 ): Promise<{ sessionId: string; status: string; sessionType: string }> {
   const { sessionId, sessionType } = input || {};
   const sessionManager = QMSessionManager.getInstance();
   // Get proxy from proxy service
   const proxyString = handler.proxy.getProxy();
   // const session = {
   //   proxy: proxyString || 'http://proxy:8080',
   //   headers: sessionManager.getQmHeaders(),
@ -60,15 +59,14 @@ export async function createSingleSession(
   //   lastUsed: new Date()
   // };
-  handler.logger.info(`Creating session for ${sessionType}`)
+  handler.logger.info(`Creating session for ${sessionType}`);
   // Add session to manager
   // sessionManager.addSession(sessionType, session);
   return {
     sessionId: sessionType,
     status: 'created',
-    sessionType
+    sessionType,
   };
 }

View file

@ -9,16 +9,15 @@ export async function spiderSymbolSearch(
   services: IServiceContainer,
   config: SymbolSpiderJob
 ): Promise<{ foundSymbols: number; depth: number }> {
   // Simple spider implementation
   // TODO: Implement actual API calls to discover symbols
   // For now, just return mock results
   const foundSymbols = Math.floor(Math.random() * 10) + 1;
   return {
     foundSymbols,
-    depth: config.depth
+    depth: config.depth,
   };
 }
@ -31,4 +30,4 @@
     // TODO: Queue actual discovery jobs
     await services.cache.set(`discovery:${term}`, { queued: true }, 3600);
   }
 }

View file

@ -6,16 +6,14 @@ import type { IServiceContainer } from '@stock-bot/handlers';
 export async function searchSymbols(services: IServiceContainer): Promise<any[]> {
   // Get symbols from MongoDB
-  const symbols = await services.mongodb.collection('qm_symbols')
-    .find({}).limit(50).toArray();
+  const symbols = await services.mongodb.collection('qm_symbols').find({}).limit(50).toArray();
   return symbols;
 }
 export async function fetchSymbolData(services: IServiceContainer, symbol: string): Promise<any> {
   // Fetch data for a specific symbol
-  const symbolData = await services.mongodb.collection('qm_symbols')
-    .findOne({ symbol });
+  const symbolData = await services.mongodb.collection('qm_symbols').findOne({ symbol });
   return symbolData;
 }

View file

@@ -1,8 +1,4 @@
import { BaseHandler, Handler, type IServiceContainer } from '@stock-bot/handlers';

@Handler('qm')
export class QMHandler extends BaseHandler {
@@ -11,10 +7,10 @@ export class QMHandler extends BaseHandler {
  }

  // @Operation('check-sessions')
  // @QueueSchedule('0 */15 * * *', {
  //   priority: 7,
  //   immediately: true,
  //   description: 'Check and maintain QM sessions'
  // })
  // async checkSessions(input: unknown, context: ExecutionContext): Promise<unknown> {
  //   // Call the session maintenance action
@@ -36,13 +32,13 @@ export class QMHandler extends BaseHandler {
  //     // Check existing symbols in MongoDB
  //     const symbolsCollection = this.mongodb.collection('qm_symbols');
  //     const symbols = await symbolsCollection.find({}).limit(100).toArray();
  //     this.logger.info('QM symbol search completed', { count: symbols.length });

  //     if (symbols && symbols.length > 0) {
  //       // Cache result for performance
  //       await this.cache.set('qm-symbols-sample', symbols.slice(0, 10), 1800);

  //       return {
  //         success: true,
  //         message: 'QM symbol search completed successfully',
@@ -58,7 +54,7 @@ export class QMHandler extends BaseHandler {
  //         count: 0,
  //       };
  //     }
  //   } catch (error) {
  //     this.logger.error('Failed to search QM symbols', { error });
  //     throw error;
@@ -66,10 +62,10 @@ export class QMHandler extends BaseHandler {
  //   }
  // }

  // @Operation('spider-symbol-search')
  // @QueueSchedule('0 0 * * 0', {
  //   priority: 10,
  //   immediately: false,
  //   description: 'Comprehensive symbol search using QM API'
  // })
  // async spiderSymbolSearch(payload: SymbolSpiderJob | undefined, context: ExecutionContext): Promise<unknown> {
  //   // Set default payload for scheduled runs
@@ -79,9 +75,9 @@ export class QMHandler extends BaseHandler {
  //     source: 'qm',
  //     maxDepth: 4
  //   };

  //   this.logger.info('Starting QM spider symbol search', { payload: jobPayload });

  //   // Store spider job info in cache (temporary data)
  //   const spiderJobId = `spider:qm:${Date.now()}:${Math.random().toString(36).substr(2, 9)}`;
  //   const spiderResult = {
@@ -90,19 +86,18 @@ export class QMHandler extends BaseHandler {
  //     status: 'started',
  //     jobId: spiderJobId
  //   };

  //   // Store in cache with 1 hour TTL (temporary data)
  //   await this.cache.set(spiderJobId, spiderResult, 3600);
  //   this.logger.debug('Spider job stored in cache', { spiderJobId, ttl: 3600 });

  //   // Schedule follow-up processing if needed
  //   await this.scheduleOperation('search-symbols', { source: 'spider', spiderJobId }, 5000);

  //   return {
  //     success: true,
  //     message: 'QM spider search initiated',
  //     spiderJobId
  //   };
  // }
}
View file
@@ -2,11 +2,10 @@
 * Shared configuration for QM operations
 */

// QM Session IDs for different endpoints
export const QM_SESSION_IDS = {
  LOOKUP: 'dc8c9930437f65d30f6597768800957017bac203a0a50342932757c8dfa158d6', // lookup endpoint
  // '5ad521e05faf5778d567f6d0012ec34d6cdbaeb2462f41568f66558bc7b4ced9': [], //4488d072b
  // cc1cbdaf040f76db8f4c94f7d156b9b9b716e1a7509ec9c74a48a47f6b6b9f87: [], //97ff00cf3 // getQuotes
  // '74963ff42f1db2320d051762b5d3950ff9eab23f9d5c5b592551b4ca0441d086': [], //32ca24e394b // getSplitsBySymbol getBrokerRatingsBySymbol getDividendsBySymbol getEarningsSurprisesBySymbol getEarningsEventsBySymbol
  // '1e1d7cb1de1fd2fe52684abdea41a446919a5fe12776dfab88615ac1ce1ec2f6': [], //fb5721812d2c // getEnhancedQuotes getProfiles
@@ -36,4 +35,4 @@ export const SESSION_CONFIG = {
  MAX_FAILED_CALLS: 10,
  SESSION_TIMEOUT: 10000, // 10 seconds
  API_TIMEOUT: 15000, // 15 seconds
} as const;
View file
@@ -33,13 +33,15 @@ export class QMSessionManager {
    if (!sessions || sessions.length === 0) {
      return null;
    }

    // Filter out sessions with excessive failures
    const validSessions = sessions.filter(
      session => session.failedCalls <= SESSION_CONFIG.MAX_FAILED_CALLS
    );

    if (validSessions.length === 0) {
      return null;
    }

    return validSessions[Math.floor(Math.random() * validSessions.length)];
  }
@@ -72,7 +74,7 @@ export class QMSessionManager {
   */
  cleanupFailedSessions(): number {
    let removedCount = 0;

    Object.keys(this.sessionCache).forEach(sessionId => {
      const initialCount = this.sessionCache[sessionId].length;
      this.sessionCache[sessionId] = this.sessionCache[sessionId].filter(
@@ -80,7 +82,7 @@ export class QMSessionManager {
      );
      removedCount += initialCount - this.sessionCache[sessionId].length;
    });

    return removedCount;
  }
@@ -94,13 +96,15 @@ export class QMSessionManager {
      Referer: 'https://www.quotemedia.com/',
    };
  }

  /**
   * Check if more sessions are needed for a session ID
   */
  needsMoreSessions(sessionId: string): boolean {
    const sessions = this.sessionCache[sessionId] || [];
    const validSessions = sessions.filter(
      session => session.failedCalls <= SESSION_CONFIG.MAX_FAILED_CALLS
    );
    return validSessions.length < SESSION_CONFIG.MIN_SESSIONS;
  }
@@ -117,18 +121,22 @@ export class QMSessionManager {
   */
  getStats() {
    const stats: Record<string, { total: number; valid: number; failed: number }> = {};

    Object.entries(this.sessionCache).forEach(([sessionId, sessions]) => {
      const validSessions = sessions.filter(
        session => session.failedCalls <= SESSION_CONFIG.MAX_FAILED_CALLS
      );
      const failedSessions = sessions.filter(
        session => session.failedCalls > SESSION_CONFIG.MAX_FAILED_CALLS
      );

      stats[sessionId] = {
        total: sessions.length,
        valid: validSessions.length,
        failed: failedSessions.length,
      };
    });

    return stats;
  }
@@ -145,4 +153,4 @@ export class QMSessionManager {
  getInitialized(): boolean {
    return this.isInitialized;
  }
}
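The filter-then-pick selection used by the session manager can be exercised in isolation. A minimal sketch, assuming a simplified `Session` shape and the `MAX_FAILED_CALLS` threshold of 10 from `SESSION_CONFIG`:

```typescript
interface Session {
  id: string;
  failedCalls: number;
}

const MAX_FAILED_CALLS = 10;

// Mirror of the manager's selection logic: drop sessions past the
// failure threshold, then pick one of the survivors at random.
function pickValidSession(sessions: Session[]): Session | null {
  const valid = sessions.filter(s => s.failedCalls <= MAX_FAILED_CALLS);
  if (valid.length === 0) {
    return null;
  }
  return valid[Math.floor(Math.random() * valid.length)];
}

const pool: Session[] = [
  { id: 'a', failedCalls: 0 },
  { id: 'b', failedCalls: 11 }, // over the threshold, never selected
];

const picked = pickValidSession(pool);
```

Random selection across the valid subset spreads load evenly without tracking per-session usage counts; a session only leaves the pool once `cleanupFailedSessions` prunes it.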
View file
@@ -29,4 +29,4 @@ export interface SpiderResult {
  success: boolean;
  symbolsFound: number;
  jobsCreated: number;
}
View file
@@ -1,9 +1,8 @@
/**
 * WebShare Fetch Operations - API integration
 */
import { OperationContext } from '@stock-bot/di';
import type { ProxyInfo } from '@stock-bot/proxy';

import { WEBSHARE_CONFIG } from '../shared/config';

/**
@@ -11,7 +10,7 @@ import { WEBSHARE_CONFIG } from '../shared/config';
 */
export async function fetchWebShareProxies(): Promise<ProxyInfo[]> {
  const ctx = OperationContext.create('webshare', 'fetch-proxies');

  try {
    // Get configuration from config system
    const { getConfig } = await import('@stock-bot/config');
@@ -30,14 +29,17 @@ export async function fetchWebShareProxies(): Promise<ProxyInfo[]> {
    ctx.logger.info('Fetching proxies from WebShare API', { apiUrl });

    const response = await fetch(
      `${apiUrl}proxy/list/?mode=${WEBSHARE_CONFIG.DEFAULT_MODE}&page=${WEBSHARE_CONFIG.DEFAULT_PAGE}&page_size=${WEBSHARE_CONFIG.DEFAULT_PAGE_SIZE}`,
      {
        method: 'GET',
        headers: {
          Authorization: `Token ${apiKey}`,
          'Content-Type': 'application/json',
        },
        signal: AbortSignal.timeout(WEBSHARE_CONFIG.TIMEOUT),
      }
    );

    if (!response.ok) {
      ctx.logger.error('WebShare API request failed', {
@@ -55,22 +57,19 @@ export async function fetchWebShareProxies(): Promise<ProxyInfo[]> {
    }

    // Transform proxy data to ProxyInfo format
    const proxies: ProxyInfo[] = data.results.map(
      (proxy: { username: string; password: string; proxy_address: string; port: number }) => ({
        source: 'webshare',
        protocol: 'http' as const,
        host: proxy.proxy_address,
        port: proxy.port,
        username: proxy.username,
        password: proxy.password,
        isWorking: true, // WebShare provides working proxies
        firstSeen: new Date(),
        lastChecked: new Date(),
      })
    );

    ctx.logger.info('Successfully fetched proxies from WebShare', {
      count: proxies.length,
@@ -82,4 +81,4 @@ export async function fetchWebShareProxies(): Promise<ProxyInfo[]> {
    ctx.logger.error('Failed to fetch proxies from WebShare', { error });
    return [];
  }
}
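The request above bounds its runtime with `AbortSignal.timeout` instead of a manual `setTimeout`/`AbortController` pair. A standalone sketch of the same pattern, returning `null` on any failure the way the operation returns `[]` (the URL in the test is a placeholder, not a real endpoint):

```typescript
// Fetch with a hard deadline: AbortSignal.timeout(ms) aborts the
// request and rejects the promise with a TimeoutError once ms elapse.
async function fetchJsonWithTimeout(url: string, timeoutMs: number): Promise<unknown | null> {
  try {
    const response = await fetch(url, {
      method: 'GET',
      signal: AbortSignal.timeout(timeoutMs),
    });
    if (!response.ok) {
      return null;
    }
    return await response.json();
  } catch {
    // Covers network errors, non-JSON bodies, and the timeout abort alike.
    return null;
  }
}
```

Because the timeout signal rejects the `fetch` promise, one `catch` handles both slow and broken upstreams, which is why the operation can always fall back to an empty proxy list.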
View file
@@ -7,4 +7,4 @@ export const WEBSHARE_CONFIG = {
  DEFAULT_MODE: 'direct',
  DEFAULT_PAGE: 1,
  TIMEOUT: 10000,
};
View file
@@ -4,7 +4,7 @@ import {
  Operation,
  QueueSchedule,
  type ExecutionContext,
  type IServiceContainer,
} from '@stock-bot/handlers';

@Handler('webshare')
@@ -14,33 +14,45 @@ export class WebShareHandler extends BaseHandler {
  }

  @Operation('fetch-proxies')
  @QueueSchedule('0 */6 * * *', {
    priority: 3,
    immediately: true,
    description: 'Fetch fresh proxies from WebShare API',
  })
  async fetchProxies(_input: unknown, _context: ExecutionContext): Promise<unknown> {
    this.logger.info('Fetching proxies from WebShare API');

    try {
      const { fetchWebShareProxies } = await import('./operations/fetch.operations');
      const proxies = await fetchWebShareProxies();

      if (proxies.length > 0) {
        // Update the centralized proxy manager using the injected service
        if (!this.proxy) {
          this.logger.warn('Proxy manager is not initialized, cannot update proxies');
          return {
            success: false,
            proxiesUpdated: 0,
            error: 'Proxy manager not initialized',
          };
        }
        await this.proxy.updateProxies(proxies);
        this.logger.info('Updated proxy manager with WebShare proxies', {
          count: proxies.length,
          workingCount: proxies.filter(p => p.isWorking !== false).length,
        });

        // Cache proxy stats for monitoring
        await this.cache.set('webshare-proxy-count', proxies.length, 3600);
        await this.cache.set(
          'webshare-working-count',
          proxies.filter(p => p.isWorking !== false).length,
          3600
        );
        await this.cache.set('last-webshare-fetch', new Date().toISOString(), 1800);

        return {
          success: true,
          proxiesUpdated: proxies.length,
          workingProxies: proxies.filter(p => p.isWorking !== false).length,
@@ -59,4 +71,3 @@ export class WebShareHandler extends BaseHandler {
    }
  }
}
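The handler above fails soft when its proxy dependency was never wired up, returning a structured error result instead of throwing. That guard-and-return shape can be sketched on its own; the `ProxyManagerLike` type and result fields here mirror the handler but are illustrative, not the library's actual interfaces:

```typescript
interface ProxyManagerLike {
  updateProxies: (proxies: unknown[]) => Promise<void>;
}

interface FetchResult {
  success: boolean;
  proxiesUpdated: number;
  error?: string;
}

// Bail out with a structured failure instead of throwing when the
// injected proxy manager is missing, so scheduled runs log cleanly.
async function applyProxies(
  proxy: ProxyManagerLike | undefined,
  proxies: unknown[]
): Promise<FetchResult> {
  if (!proxy) {
    return { success: false, proxiesUpdated: 0, error: 'Proxy manager not initialized' };
  }
  await proxy.updateProxies(proxies);
  return { success: true, proxiesUpdated: proxies.length };
}
```

For a cron-driven operation this is usually preferable to throwing: the queue records a failed result with a reason rather than retrying an operation that cannot succeed until the container is fixed.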
View file
@@ -4,20 +4,18 @@
 */

// Framework imports
import { initializeServiceConfig } from '@stock-bot/config';
import { Hono } from 'hono';
import { cors } from 'hono/cors';

// Library imports
import {
  createServiceContainer,
  initializeServices as initializeAwilixServices,
  type ServiceContainer,
} from '@stock-bot/di';
import { getLogger, setLoggerConfig, shutdownLoggers } from '@stock-bot/logger';
import { Shutdown } from '@stock-bot/shutdown';
import { handlerRegistry } from '@stock-bot/types';

// Local imports
import { createRoutes } from './routes/create-routes';
import { initializeAllHandlers } from './handlers';
@@ -84,17 +82,17 @@ async function initializeServices() {
        ttl: 3600,
      },
    };

    container = createServiceContainer(awilixConfig);
    await initializeAwilixServices(container);
    logger.info('Awilix container created and initialized');

    // Get the service container for handlers
    const serviceContainer = container.resolve('serviceContainer');

    // Create app with routes
    app = new Hono();

    // Add CORS middleware
    app.use(
      '*',
@@ -105,17 +103,17 @@ async function initializeServices() {
        credentials: false,
      })
    );

    // Create and mount routes using the service container
    const routes = createRoutes(serviceContainer);
    app.route('/', routes);

    // Initialize handlers with service container from Awilix
    logger.debug('Initializing data handlers with Awilix DI pattern...');

    // Auto-register all handlers with the service container from Awilix
    await initializeAllHandlers(serviceContainer);
    logger.info('Data handlers initialized with new DI pattern');

    // Create scheduled jobs from registered handlers
@@ -175,10 +173,10 @@ async function initializeServices() {
    logger.info('All services initialized successfully');
  } catch (error) {
    console.error('DETAILED ERROR:', error);
    logger.error('Failed to initialize services', {
      error: error instanceof Error ? error.message : String(error),
      stack: error instanceof Error ? error.stack : undefined,
      details: JSON.stringify(error, null, 2),
    });
    throw error;
  }
@@ -236,14 +234,20 @@ shutdown.onShutdownMedium(async () => {
    if (container) {
      // Disconnect database clients
      const mongoClient = container.resolve('mongoClient');
      if (mongoClient?.disconnect) {
        await mongoClient.disconnect();
      }

      const postgresClient = container.resolve('postgresClient');
      if (postgresClient?.disconnect) {
        await postgresClient.disconnect();
      }

      const questdbClient = container.resolve('questdbClient');
      if (questdbClient?.disconnect) {
        await questdbClient.disconnect();
      }

      logger.info('All services disposed successfully');
    }
  } catch (error) {
@@ -268,4 +272,4 @@ startServer().catch(error => {
  process.exit(1);
});

logger.info('Data service startup initiated with improved DI pattern');
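The shutdown hook guards each `disconnect` with an optional check so a partially-built container still shuts down cleanly. The pattern in isolation, assuming only that clients may or may not expose a `disconnect` method (the `Disconnectable` type is illustrative):

```typescript
interface Disconnectable {
  disconnect?: () => Promise<void>;
}

// Call disconnect only when the resolved client actually provides it,
// so stub or disabled clients do not crash the shutdown path.
async function disconnectAll(clients: Array<Disconnectable | null>): Promise<number> {
  let closed = 0;
  for (const client of clients) {
    if (client?.disconnect) {
      await client.disconnect();
      closed++;
    }
  }
  return closed;
}
```

Awaiting each disconnect sequentially keeps shutdown deterministic; if teardown time mattered, the calls could instead be collected into `Promise.allSettled` so one hung client cannot block the rest.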
View file
@@ -1,69 +1,74 @@
/**
 * Routes creation with improved DI pattern
 */
import { Hono } from 'hono';
import type { IServiceContainer } from '@stock-bot/handlers';
import { exchangeRoutes } from './exchange.routes';
import { healthRoutes } from './health.routes';
import { queueRoutes } from './queue.routes';

/**
 * Creates all routes with access to type-safe services
 */
export function createRoutes(services: IServiceContainer): Hono {
  const app = new Hono();

  // Mount routes that don't need services
  app.route('/health', healthRoutes);

  // Mount routes that need services (will be updated to use services)
  app.route('/api/exchanges', exchangeRoutes);
  app.route('/api/queue', queueRoutes);

  // Store services in app context for handlers that need it
  app.use('*', async (c, next) => {
    c.set('services', services);
    await next();
  });

  // Add a new endpoint to test the improved DI
  app.get('/api/di-test', async c => {
    try {
      const services = c.get('services') as IServiceContainer;

      // Test MongoDB connection
      const mongoStats = services.mongodb?.getPoolMetrics?.() || {
        status: services.mongodb ? 'connected' : 'disabled',
      };

      // Test PostgreSQL connection
      const pgConnected = services.postgres?.connected || false;

      // Test cache
      const cacheReady = services.cache?.isReady() || false;

      // Test queue
      const queueStats = services.queue?.getGlobalStats() || { status: 'disabled' };

      return c.json({
        success: true,
        message: 'Improved DI pattern is working!',
        services: {
          mongodb: mongoStats,
          postgres: { connected: pgConnected },
          cache: { ready: cacheReady },
          queue: queueStats,
        },
        timestamp: new Date().toISOString(),
      });
    } catch (error) {
      const services = c.get('services') as IServiceContainer;
      services.logger.error('DI test endpoint failed', { error });
      return c.json(
        {
          success: false,
          error: error instanceof Error ? error.message : String(error),
        },
        500
      );
    }
  });

  return app;
}
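The `di-test` endpoint probes each service defensively with optional chaining, so a disabled service reports a status instead of throwing. The probe shape in isolation; the `MaybeMongo` type is a stand-in for the real client interface:

```typescript
interface MaybeMongo {
  getPoolMetrics?: () => { connections: number };
}

// `obj?.method?.()` short-circuits to undefined when either the service
// or the method is absent, letting `||` supply a status fallback.
function probeMongo(mongodb: MaybeMongo | undefined): Record<string, unknown> {
  return (
    mongodb?.getPoolMetrics?.() || { status: mongodb ? 'connected' : 'disabled' }
  );
}
```

The double optional chain distinguishes three cases with one expression: a client with metrics, a client without the metrics method, and no client at all.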
View file
@@ -11,7 +11,7 @@ exchange.get('/', async c => {
    return c.json({
      status: 'success',
      data: [],
      message: 'Exchange endpoints will be implemented with database integration',
    });
  } catch (error) {
    logger.error('Failed to get exchanges', { error });
@@ -19,4 +19,4 @@ exchange.get('/', async c => {
  }
});

export { exchange as exchangeRoutes };
View file
@@ -10,11 +10,11 @@ queue.get('/status', async c => {
  try {
    const queueManager = QueueManager.getInstance();
    const globalStats = await queueManager.getGlobalStats();

    return c.json({
      status: 'success',
      data: globalStats,
      message: 'Queue status retrieved successfully',
    });
  } catch (error) {
    logger.error('Failed to get queue status', { error });
@@ -22,4 +22,4 @@ queue.get('/status', async c => {
  }
});

export { queue as queueRoutes };
View file
@@ -37,4 +37,4 @@ export interface IBSymbol {
  name?: string;
  currency?: string;
  // Add other properties as needed
}
View file
@@ -90,4 +90,4 @@ export interface FetchWebShareProxiesResult extends CountableJobResult {

// No payload job types (for operations that don't need input)
export interface NoPayload {
  // Empty interface for operations that don't need payload
}
View file
@@ -1,5 +1,5 @@
import { sleep } from '@stock-bot/di';
import { getLogger } from '@stock-bot/logger';

const logger = getLogger('symbol-search-util');
View file
@ -1,101 +1,103 @@
#!/usr/bin/env bun #!/usr/bin/env bun
/** /**
* Test script for CEO handler operations * Test script for CEO handler operations
*/ */
import { initializeServiceConfig } from '@stock-bot/config';
import { initializeServiceConfig } from '@stock-bot/config'; import { createServiceContainer, initializeServices } from '@stock-bot/di';
import { createServiceContainer, initializeServices } from '@stock-bot/di'; import { getLogger } from '@stock-bot/logger';
import { getLogger } from '@stock-bot/logger';
const logger = getLogger('test-ceo-operations');
const logger = getLogger('test-ceo-operations');
async function testCeoOperations() {
async function testCeoOperations() { logger.info('Testing CEO handler operations...');
logger.info('Testing CEO handler operations...');
try {
try { // Initialize config
// Initialize config const config = initializeServiceConfig();
const config = initializeServiceConfig();
// Create Awilix container
// Create Awilix container const awilixConfig = {
const awilixConfig = { redis: {
redis: { host: config.database.dragonfly.host,
host: config.database.dragonfly.host, port: config.database.dragonfly.port,
port: config.database.dragonfly.port, db: config.database.dragonfly.db,
db: config.database.dragonfly.db, },
}, mongodb: {
mongodb: { uri: config.database.mongodb.uri,
uri: config.database.mongodb.uri, database: config.database.mongodb.database,
database: config.database.mongodb.database, },
}, postgres: {
postgres: { host: config.database.postgres.host,
host: config.database.postgres.host, port: config.database.postgres.port,
port: config.database.postgres.port, database: config.database.postgres.database,
database: config.database.postgres.database, user: config.database.postgres.user,
user: config.database.postgres.user, password: config.database.postgres.password,
password: config.database.postgres.password, },
}, questdb: {
questdb: { enabled: false,
enabled: false, host: config.database.questdb.host,
host: config.database.questdb.host, httpPort: config.database.questdb.httpPort,
httpPort: config.database.questdb.httpPort, pgPort: config.database.questdb.pgPort,
pgPort: config.database.questdb.pgPort, influxPort: config.database.questdb.ilpPort,
influxPort: config.database.questdb.ilpPort, database: config.database.questdb.database,
database: config.database.questdb.database, },
}, };
};
const container = createServiceContainer(awilixConfig);
const container = createServiceContainer(awilixConfig); await initializeServices(container);
await initializeServices(container);
const serviceContainer = container.resolve('serviceContainer');
const serviceContainer = container.resolve('serviceContainer');
// Import and create CEO handler
// Import and create CEO handler const { CeoHandler } = await import('./src/handlers/ceo/ceo.handler');
const { CeoHandler } = await import('./src/handlers/ceo/ceo.handler'); const ceoHandler = new CeoHandler(serviceContainer);
const ceoHandler = new CeoHandler(serviceContainer);
// Test 1: Check if there are any CEO symbols in the database
// Test 1: Check if there are any CEO symbols in the database logger.info('Checking for existing CEO symbols...');
logger.info('Checking for existing CEO symbols...'); const collection = serviceContainer.mongodb.collection('ceoSymbols');
const collection = serviceContainer.mongodb.collection('ceoSymbols'); const count = await collection.countDocuments();
const count = await collection.countDocuments(); logger.info(`Found ${count} CEO symbols in database`);
logger.info(`Found ${count} CEO symbols in database`);
if (count > 0) {
if (count > 0) { // Test 2: Run process-unique-symbols operation
// Test 2: Run process-unique-symbols operation logger.info('Testing process-unique-symbols operation...');
logger.info('Testing process-unique-symbols operation...'); const result = await ceoHandler.updateUniqueSymbols(undefined, {});
const result = await ceoHandler.updateUniqueSymbols(undefined, {}); logger.info('Process unique symbols result:', result);
logger.info('Process unique symbols result:', result);
// Test 3: Test individual symbol processing
// Test 3: Test individual symbol processing logger.info('Testing process-individual-symbol operation...');
logger.info('Testing process-individual-symbol operation...'); const sampleSymbol = await collection.findOne({});
const sampleSymbol = await collection.findOne({}); if (sampleSymbol) {
if (sampleSymbol) { const individualResult = await ceoHandler.processIndividualSymbol(
const individualResult = await ceoHandler.processIndividualSymbol({ {
ceoId: sampleSymbol.ceoId, ceoId: sampleSymbol.ceoId,
symbol: sampleSymbol.symbol, symbol: sampleSymbol.symbol,
exchange: sampleSymbol.exchange, exchange: sampleSymbol.exchange,
name: sampleSymbol.name, name: sampleSymbol.name,
}, {}); },
logger.info('Process individual symbol result:', individualResult); {}
} );
} else { logger.info('Process individual symbol result:', individualResult);
logger.warn('No CEO symbols found. Run the service to populate data first.'); }
} } else {
logger.warn('No CEO symbols found. Run the service to populate data first.');
// Clean up }
await serviceContainer.mongodb.disconnect();
await serviceContainer.postgres.disconnect(); // Clean up
if (serviceContainer.cache) { await serviceContainer.mongodb.disconnect();
await serviceContainer.cache.disconnect(); await serviceContainer.postgres.disconnect();
} if (serviceContainer.cache) {
await serviceContainer.cache.disconnect();
logger.info('Test completed successfully!'); }
process.exit(0);
} catch (error) { logger.info('Test completed successfully!');
logger.error('Test failed:', error); process.exit(0);
process.exit(1); } catch (error) {
} logger.error('Test failed:', error);
} process.exit(1);
}
// Run the test }
testCeoOperations();
// Run the test
testCeoOperations();
@@ -1,15 +1,15 @@
{
  "service": {
    "name": "data-pipeline",
    "port": 3005,
    "host": "0.0.0.0",
    "healthCheckPath": "/health",
    "metricsPath": "/metrics",
    "shutdownTimeout": 30000,
    "cors": {
      "enabled": true,
      "origin": "*",
      "credentials": false
    }
  }
}
@@ -1,27 +1,27 @@
import { MongoDBClient } from '@stock-bot/mongodb';
import { PostgreSQLClient } from '@stock-bot/postgres';

let postgresClient: PostgreSQLClient | null = null;
let mongodbClient: MongoDBClient | null = null;

export function setPostgreSQLClient(client: PostgreSQLClient): void {
  postgresClient = client;
}

export function getPostgreSQLClient(): PostgreSQLClient {
  if (!postgresClient) {
    throw new Error('PostgreSQL client not initialized. Call setPostgreSQLClient first.');
  }
  return postgresClient;
}

export function setMongoDBClient(client: MongoDBClient): void {
  mongodbClient = client;
}

export function getMongoDBClient(): MongoDBClient {
  if (!mongodbClient) {
    throw new Error('MongoDB client not initialized. Call setMongoDBClient first.');
  }
  return mongodbClient;
}
@@ -1,58 +1,58 @@
import { getLogger } from '@stock-bot/logger';
import { handlerRegistry, type HandlerConfig, type ScheduledJobConfig } from '@stock-bot/queue';
import { exchangeOperations } from './operations';

const logger = getLogger('exchanges-handler');

const HANDLER_NAME = 'exchanges';

const exchangesHandlerConfig: HandlerConfig = {
  concurrency: 1,
  maxAttempts: 3,
  scheduledJobs: [
    {
      operation: 'sync-all-exchanges',
      cronPattern: '0 0 * * 0', // Weekly on Sunday at midnight
      payload: { clearFirst: true },
      priority: 10,
      immediately: false,
    } as ScheduledJobConfig,
    {
      operation: 'sync-qm-exchanges',
      cronPattern: '0 1 * * *', // Daily at 1 AM
      payload: {},
      priority: 5,
      immediately: false,
    } as ScheduledJobConfig,
    {
      operation: 'sync-ib-exchanges',
      cronPattern: '0 3 * * *', // Daily at 3 AM
      payload: {},
      priority: 3,
      immediately: false,
    } as ScheduledJobConfig,
    {
      operation: 'sync-qm-provider-mappings',
      cronPattern: '0 3 * * *', // Daily at 3 AM
      payload: {},
      priority: 7,
      immediately: false,
    } as ScheduledJobConfig,
  ],
  operations: {
    'sync-all-exchanges': exchangeOperations.syncAllExchanges,
    'sync-qm-exchanges': exchangeOperations.syncQMExchanges,
    'sync-ib-exchanges': exchangeOperations.syncIBExchanges,
    'sync-qm-provider-mappings': exchangeOperations.syncQMProviderMappings,
    'clear-postgresql-data': exchangeOperations.clearPostgreSQLData,
    'get-exchange-stats': exchangeOperations.getExchangeStats,
    'get-provider-mapping-stats': exchangeOperations.getProviderMappingStats,
    'enhanced-sync-status': exchangeOperations['enhanced-sync-status'],
  },
};

export function initializeExchangesHandler(): void {
  logger.info('Registering exchanges handler...');
  handlerRegistry.registerHandler(HANDLER_NAME, exchangesHandlerConfig);
  logger.info('Exchanges handler registered successfully');
}
@@ -13,7 +13,7 @@ export async function clearPostgreSQLData(payload: JobPayload): Promise<{
  try {
    const postgresClient = getPostgreSQLClient();

    // Start transaction for atomic operations
    await postgresClient.query('BEGIN');
@@ -21,9 +21,7 @@ export async function clearPostgreSQLData(payload: JobPayload): Promise<{
    const exchangeCountResult = await postgresClient.query(
      'SELECT COUNT(*) as count FROM exchanges'
    );
    const symbolCountResult = await postgresClient.query('SELECT COUNT(*) as count FROM symbols');
    const mappingCountResult = await postgresClient.query(
      'SELECT COUNT(*) as count FROM provider_mappings'
    );
@@ -57,4 +55,4 @@ export async function clearPostgreSQLData(payload: JobPayload): Promise<{
    logger.error('Failed to clear PostgreSQL data', { error });
    throw error;
  }
}
@@ -16,11 +16,11 @@ export async function getSyncStatus(payload: JobPayload): Promise<SyncStatus[]>
      ORDER BY provider, data_type
    `;

    const result = await postgresClient.query(query);
    logger.info(`Retrieved sync status for ${result.rows.length} entries`);
    return result.rows;
  } catch (error) {
    logger.error('Failed to get sync status', { error });
    throw error;
  }
}
@@ -18,11 +18,11 @@ export async function getExchangeStats(payload: JobPayload): Promise<any> {
      FROM exchanges
    `;

    const result = await postgresClient.query(query);
    logger.info('Retrieved exchange statistics');
    return result.rows[0];
  } catch (error) {
    logger.error('Failed to get exchange statistics', { error });
    throw error;
  }
}
@@ -1,19 +1,19 @@
import { clearPostgreSQLData } from './clear-postgresql-data.operations';
import { getSyncStatus } from './enhanced-sync-status.operations';
import { getExchangeStats } from './exchange-stats.operations';
import { getProviderMappingStats } from './provider-mapping-stats.operations';
import { syncQMExchanges } from './qm-exchanges.operations';
import { syncAllExchanges } from './sync-all-exchanges.operations';
import { syncIBExchanges } from './sync-ib-exchanges.operations';
import { syncQMProviderMappings } from './sync-qm-provider-mappings.operations';

export const exchangeOperations = {
  syncAllExchanges,
  syncQMExchanges,
  syncIBExchanges,
  syncQMProviderMappings,
  clearPostgreSQLData,
  getExchangeStats,
  getProviderMappingStats,
  'enhanced-sync-status': getSyncStatus,
};
@@ -22,11 +22,11 @@ export async function getProviderMappingStats(payload: JobPayload): Promise<any>
      ORDER BY provider
    `;

    const result = await postgresClient.query(query);
    logger.info('Retrieved provider mapping statistics');
    return result.rows;
  } catch (error) {
    logger.error('Failed to get provider mapping statistics', { error });
    throw error;
  }
}
@@ -1,102 +1,113 @@
import { getLogger } from '@stock-bot/logger';
import { getMongoDBClient, getPostgreSQLClient } from '../../../clients';
import type { JobPayload } from '../../../types/job-payloads';

const logger = getLogger('sync-qm-exchanges');

export async function syncQMExchanges(
  payload: JobPayload
): Promise<{ processed: number; created: number; updated: number }> {
  logger.info('Starting QM exchanges sync...');

  try {
    const mongoClient = getMongoDBClient();
    const postgresClient = getPostgreSQLClient();

    // 1. Get all QM exchanges from MongoDB
    const qmExchanges = await mongoClient.find('qmExchanges', {});
    logger.info(`Found ${qmExchanges.length} QM exchanges to process`);

    let created = 0;
    let updated = 0;

    for (const exchange of qmExchanges) {
      try {
        // 2. Check if exchange exists
        const existingExchange = await findExchange(exchange.exchangeCode, postgresClient);

        if (existingExchange) {
          // Update existing
          await updateExchange(existingExchange.id, exchange, postgresClient);
          updated++;
        } else {
          // Create new
          await createExchange(exchange, postgresClient);
          created++;
        }
      } catch (error) {
        logger.error('Failed to process exchange', { error, exchange: exchange.exchangeCode });
      }
    }

    // 3. Update sync status
    await updateSyncStatus('qm', 'exchanges', qmExchanges.length, postgresClient);

    const result = { processed: qmExchanges.length, created, updated };
    logger.info('QM exchanges sync completed', result);
    return result;
  } catch (error) {
    logger.error('QM exchanges sync failed', { error });
    throw error;
  }
}

// Helper functions
async function findExchange(exchangeCode: string, postgresClient: any): Promise<any> {
  const query = 'SELECT * FROM exchanges WHERE code = $1';
  const result = await postgresClient.query(query, [exchangeCode]);
  return result.rows[0] || null;
}

async function createExchange(qmExchange: any, postgresClient: any): Promise<void> {
  const query = `
    INSERT INTO exchanges (code, name, country, currency, visible)
    VALUES ($1, $2, $3, $4, $5)
    ON CONFLICT (code) DO NOTHING
  `;

  await postgresClient.query(query, [
    qmExchange.exchangeCode || qmExchange.exchange,
    qmExchange.exchangeShortName || qmExchange.name,
    qmExchange.countryCode || 'US',
    'USD', // Default currency, can be improved
    true, // New exchanges are visible by default
  ]);
}

async function updateExchange(
  exchangeId: string,
  qmExchange: any,
  postgresClient: any
): Promise<void> {
  const query = `
    UPDATE exchanges
    SET name = COALESCE($2, name),
        country = COALESCE($3, country),
        updated_at = NOW()
    WHERE id = $1
  `;

  await postgresClient.query(query, [
    exchangeId,
    qmExchange.exchangeShortName || qmExchange.name,
    qmExchange.countryCode,
  ]);
}

async function updateSyncStatus(
  provider: string,
  dataType: string,
  count: number,
  postgresClient: any
): Promise<void> {
  const query = `
    UPDATE sync_status
    SET last_sync_at = NOW(),
        last_sync_count = $3,
        sync_errors = NULL,
        updated_at = NOW()
    WHERE provider = $1 AND data_type = $2
  `;

  await postgresClient.query(query, [provider, dataType, count]);
}
@@ -1,266 +1,275 @@
import { getLogger } from '@stock-bot/logger';
import { getMongoDBClient, getPostgreSQLClient } from '../../../clients';
import type { JobPayload, SyncResult } from '../../../types/job-payloads';

const logger = getLogger('enhanced-sync-all-exchanges');

export async function syncAllExchanges(payload: JobPayload): Promise<SyncResult> {
  // Use ?? so an explicit clearFirst: false is respected (|| would always yield true)
  const clearFirst = payload.clearFirst ?? true;
  logger.info('Starting comprehensive exchange sync...', { clearFirst });

  const result: SyncResult = {
    processed: 0,
    created: 0,
    updated: 0,
    skipped: 0,
    errors: 0,
  };

  try {
    const postgresClient = getPostgreSQLClient();

    // Clear existing data if requested
    if (clearFirst) {
      await clearPostgreSQLData(postgresClient);
    }

    // Start transaction for atomic operations
    await postgresClient.query('BEGIN');

    // 1. Sync from EOD exchanges (comprehensive global data)
    const eodResult = await syncEODExchanges();
    mergeResults(result, eodResult);

    // 2. Sync from IB exchanges (detailed asset information)
    const ibResult = await syncIBExchanges();
    mergeResults(result, ibResult);

    // 3. Update sync status
    await updateSyncStatus('all', 'exchanges', result.processed, postgresClient);

    await postgresClient.query('COMMIT');

    logger.info('Comprehensive exchange sync completed', result);
    return result;
  } catch (error) {
    const postgresClient = getPostgreSQLClient();
    await postgresClient.query('ROLLBACK');
    logger.error('Comprehensive exchange sync failed', { error });
    throw error;
  }
}

async function clearPostgreSQLData(postgresClient: any): Promise<void> {
  logger.info('Clearing existing PostgreSQL data...');

  // Clear data in correct order (respect foreign keys)
  await postgresClient.query('DELETE FROM provider_mappings');
  await postgresClient.query('DELETE FROM symbols');
  await postgresClient.query('DELETE FROM exchanges');

  // Reset sync status
  await postgresClient.query(
    'UPDATE sync_status SET last_sync_at = NULL, last_sync_count = 0, sync_errors = NULL'
  );

  logger.info('PostgreSQL data cleared successfully');
}

async function syncEODExchanges(): Promise<SyncResult> {
  const mongoClient = getMongoDBClient();
  const exchanges = await mongoClient.find('eodExchanges', { active: true });

  const result: SyncResult = { processed: 0, created: 0, updated: 0, skipped: 0, errors: 0 };

  for (const exchange of exchanges) {
    try {
      // Create provider exchange mapping for EOD
      await createProviderExchangeMapping(
        'eod', // provider
        exchange.Code,
        exchange.Name,
        exchange.CountryISO2,
        exchange.Currency,
        0.95 // very high confidence for EOD data
      );

      result.processed++;
      result.created++; // Count as created mapping
    } catch (error) {
      logger.error('Failed to process EOD exchange', { error, exchange });
      result.errors++;
    }
  }

  return result;
}

async function syncIBExchanges(): Promise<SyncResult> {
  const mongoClient = getMongoDBClient();
  const exchanges = await mongoClient.find('ibExchanges', {});

  const result: SyncResult = { processed: 0, created: 0, updated: 0, skipped: 0, errors: 0 };

  for (const exchange of exchanges) {
    try {
      // Create provider exchange mapping for IB
      await createProviderExchangeMapping(
        'ib', // provider
        exchange.exchange_id,
        exchange.name,
        exchange.country_code,
        'USD', // IB doesn't specify currency, default to USD
        0.85 // good confidence for IB data
      );

      result.processed++;
      result.created++; // Count as created mapping
    } catch (error) {
      logger.error('Failed to process IB exchange', { error, exchange });
      result.errors++;
    }
  }

  return result;
}

async function createProviderExchangeMapping(
  provider: string,
  providerExchangeCode: string,
  providerExchangeName: string,
  countryCode: string | null,
  currency: string | null,
  confidence: number
): Promise<void> {
  if (!providerExchangeCode) {
    return;
  }

  const postgresClient = getPostgreSQLClient();

  // Check if mapping already exists
  const existingMapping = await findProviderExchangeMapping(provider, providerExchangeCode);
  if (existingMapping) {
    // Don't override existing mappings to preserve manual work
    return;
  }

  // Find or create master exchange
  const masterExchange = await findOrCreateMasterExchange(
    providerExchangeCode,
    providerExchangeName,
    countryCode,
    currency
  );

  // Create the provider exchange mapping
  const query = `
    INSERT INTO provider_exchange_mappings
    (provider, provider_exchange_code, provider_exchange_name, master_exchange_id,
     country_code, currency, confidence, active, auto_mapped)
    VALUES ($1, $2, $3, $4, $5, $6, $7, false, true)
    ON CONFLICT (provider, provider_exchange_code) DO NOTHING
  `;

  await postgresClient.query(query, [
    provider,
    providerExchangeCode,
    providerExchangeName,
    masterExchange.id,
    countryCode,
    currency,
    confidence,
  ]);
}

async function findOrCreateMasterExchange(
  providerCode: string,
  providerName: string,
  countryCode: string | null,
  currency: string | null
): Promise<any> {
  const postgresClient = getPostgreSQLClient();

  // First, try to find exact match
  let masterExchange = await findExchangeByCode(providerCode);
  if (masterExchange) {
    return masterExchange;
  }

  // Try to find by similar codes (basic mapping)
  const basicMapping = getBasicExchangeMapping(providerCode);
  if (basicMapping) {
    masterExchange = await findExchangeByCode(basicMapping);
    if (masterExchange) {
      return masterExchange;
    }
  }

  // Create new master exchange (inactive by default)
  const query = `
    INSERT INTO exchanges (code, name, country, currency, active)
    VALUES ($1, $2, $3, $4, false)
    ON CONFLICT (code) DO UPDATE SET
      name = COALESCE(EXCLUDED.name, exchanges.name),
      country = COALESCE(EXCLUDED.country, exchanges.country),
      currency = COALESCE(EXCLUDED.currency, exchanges.currency)
    RETURNING id, code, name, country, currency
  `;

  const result = await postgresClient.query(query, [
    providerCode,
    providerName || providerCode,
    countryCode || 'US',
    currency || 'USD',
  ]);

  return result.rows[0];
}

function getBasicExchangeMapping(providerCode: string): string | null {
  const mappings: Record<string, string> = {
    NYE: 'NYSE',
    NAS: 'NASDAQ',
    TO: 'TSX',
    LN: 'LSE',
    LON: 'LSE',
  };

  return mappings[providerCode.toUpperCase()] || null;
}

async function findProviderExchangeMapping(
  provider: string,
  providerExchangeCode: string
): Promise<any> {
  const postgresClient = getPostgreSQLClient();
  const query =
    'SELECT * FROM provider_exchange_mappings WHERE provider = $1 AND provider_exchange_code = $2';
  const result = await postgresClient.query(query, [provider, providerExchangeCode]);
  return result.rows[0] || null;
}

async function findExchangeByCode(code: string): Promise<any> {
  const postgresClient = getPostgreSQLClient();
  const query = 'SELECT * FROM exchanges WHERE code = $1';
  const result = await postgresClient.query(query, [code]);
  return result.rows[0] || null;
}

async function updateSyncStatus(
  provider: string,
  dataType: string,
  count: number,
  postgresClient: any
): Promise<void> {
  const query = `
    INSERT INTO sync_status (provider, data_type, last_sync_at, last_sync_count, sync_errors)
    VALUES ($1, $2, NOW(), $3, NULL)
    ON CONFLICT (provider, data_type)
    DO UPDATE SET
      last_sync_at = NOW(),
      last_sync_count = EXCLUDED.last_sync_count,
      sync_errors = NULL,
      updated_at = NOW()
  `;

  await postgresClient.query(query, [provider, dataType, count]);
}

function mergeResults(target: SyncResult, source: SyncResult): void {
  target.processed += source.processed;
  target.created += source.created;
  target.updated += source.updated;
  target.skipped += source.skipped;
  target.errors += source.errors;
}
@ -1,206 +1,208 @@
import { getLogger } from '@stock-bot/logger';
import type { MasterExchange } from '@stock-bot/mongodb';
import { getMongoDBClient } from '../../../clients';
import type { JobPayload } from '../../../types/job-payloads';

const logger = getLogger('sync-ib-exchanges');

interface IBExchange {
  id?: string;
  _id?: any;
  name?: string;
  code?: string;
  country_code?: string;
  currency?: string;
}

export async function syncIBExchanges(
  payload: JobPayload
): Promise<{ syncedCount: number; totalExchanges: number }> {
  logger.info('Syncing IB exchanges from database...');

  try {
    const mongoClient = getMongoDBClient();
    const db = mongoClient.getDatabase();

    // Filter by country code US and CA
    const ibExchanges = await db
      .collection<IBExchange>('ibExchanges')
      .find({
        country_code: { $in: ['US', 'CA'] },
      })
      .toArray();

    logger.info('Found IB exchanges in database', { count: ibExchanges.length });

    let syncedCount = 0;

    for (const exchange of ibExchanges) {
      try {
        await createOrUpdateMasterExchange(exchange);
        syncedCount++;

        logger.debug('Synced IB exchange', {
          ibId: exchange.id,
          country: exchange.country_code,
        });
      } catch (error) {
        logger.error('Failed to sync IB exchange', { exchange: exchange.id, error });
      }
    }

    logger.info('IB exchange sync completed', {
      syncedCount,
      totalExchanges: ibExchanges.length,
    });

    return { syncedCount, totalExchanges: ibExchanges.length };
  } catch (error) {
    logger.error('Failed to fetch IB exchanges from database', { error });
    return { syncedCount: 0, totalExchanges: 0 };
  }
}

/**
 * Create or update master exchange record 1:1 from IB exchange
 */
async function createOrUpdateMasterExchange(ibExchange: IBExchange): Promise<void> {
  const mongoClient = getMongoDBClient();
  const db = mongoClient.getDatabase();
  const collection = db.collection<MasterExchange>('masterExchanges');

  const masterExchangeId = generateMasterExchangeId(ibExchange);
  const now = new Date();

  // Check if master exchange already exists
  const existing = await collection.findOne({ masterExchangeId });

  if (existing) {
    // Update existing record
    await collection.updateOne(
      { masterExchangeId },
      {
        $set: {
          officialName: ibExchange.name || `Exchange ${ibExchange.id}`,
          country: ibExchange.country_code || 'UNKNOWN',
          currency: ibExchange.currency || 'USD',
          timezone: inferTimezone(ibExchange),
          updated_at: now,
        },
      }
    );

    logger.debug('Updated existing master exchange', { masterExchangeId });
  } else {
    // Create new master exchange
    const masterExchange: MasterExchange = {
      masterExchangeId,
      shortName: masterExchangeId, // Set shortName to masterExchangeId on creation
      officialName: ibExchange.name || `Exchange ${ibExchange.id}`,
      country: ibExchange.country_code || 'UNKNOWN',
      currency: ibExchange.currency || 'USD',
      timezone: inferTimezone(ibExchange),
      active: false, // Set active to false only on creation

      sourceMappings: {
        ib: {
          id: ibExchange.id || ibExchange._id?.toString() || 'unknown',
          name: ibExchange.name || `Exchange ${ibExchange.id}`,
          code: ibExchange.code || ibExchange.id || '',
          aliases: generateAliases(ibExchange),
          lastUpdated: now,
        },
      },

      confidence: 1.0, // High confidence for direct IB mapping
      verified: true, // Mark as verified since it's direct from IB

      // DocumentBase fields
      source: 'ib-exchange-sync',
      created_at: now,
      updated_at: now,
    };

    await collection.insertOne(masterExchange);
    logger.debug('Created new master exchange', { masterExchangeId });
  }
}

/**
 * Generate master exchange ID from IB exchange
 */
function generateMasterExchangeId(ibExchange: IBExchange): string {
  // Use code if available, otherwise use ID, otherwise generate from name
  if (ibExchange.code) {
    return ibExchange.code.toUpperCase().replace(/[^A-Z0-9]/g, '');
  }

  if (ibExchange.id) {
    return ibExchange.id.toUpperCase().replace(/[^A-Z0-9]/g, '');
  }

  if (ibExchange.name) {
    return ibExchange.name
      .toUpperCase()
      .split(' ')
      .slice(0, 2)
      .join('_')
      .replace(/[^A-Z0-9_]/g, '');
  }

  return 'UNKNOWN_EXCHANGE';
}

/**
 * Generate aliases for the exchange
 */
function generateAliases(ibExchange: IBExchange): string[] {
  const aliases: string[] = [];

  if (ibExchange.name && ibExchange.name.includes(' ')) {
    // Add abbreviated version
    aliases.push(
      ibExchange.name
        .split(' ')
        .map(w => w[0])
        .join('')
        .toUpperCase()
    );
  }

  if (ibExchange.code) {
    aliases.push(ibExchange.code.toUpperCase());
  }

  return aliases;
}

/**
 * Infer timezone from exchange name/location
 */
function inferTimezone(ibExchange: IBExchange): string {
  if (!ibExchange.name) {
    return 'UTC';
  }

  const name = ibExchange.name.toUpperCase();

  if (name.includes('NEW YORK') || name.includes('NYSE') || name.includes('NASDAQ')) {
    return 'America/New_York';
  }
  if (name.includes('LONDON')) {
    return 'Europe/London';
  }
  if (name.includes('TOKYO')) {
    return 'Asia/Tokyo';
  }
  if (name.includes('SHANGHAI')) {
    return 'Asia/Shanghai';
  }
  if (name.includes('TORONTO')) {
    return 'America/Toronto';
  }
  if (name.includes('FRANKFURT')) {
    return 'Europe/Berlin';
  }

  return 'UTC'; // Default
}
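The fallback chain in generateMasterExchangeId (code, then id, then the first two words of the name) is easy to exercise in isolation. The sketch below is a hypothetical standalone copy of that normalization logic with illustrative inputs, not the module itself:

```typescript
// Hypothetical standalone sketch of the ID normalization used above:
// prefer code, then id, then the first two words of the name.
interface ExchangeLike {
  id?: string;
  code?: string;
  name?: string;
}

function generateId(ex: ExchangeLike): string {
  if (ex.code) {
    // Uppercase and strip everything that is not A-Z or 0-9
    return ex.code.toUpperCase().replace(/[^A-Z0-9]/g, '');
  }
  if (ex.id) {
    return ex.id.toUpperCase().replace(/[^A-Z0-9]/g, '');
  }
  if (ex.name) {
    // First two words of the name, joined with an underscore
    return ex.name
      .toUpperCase()
      .split(' ')
      .slice(0, 2)
      .join('_')
      .replace(/[^A-Z0-9_]/g, '');
  }
  return 'UNKNOWN_EXCHANGE';
}

console.log(generateId({ code: 'nyse-arca' })); // NYSEARCA
console.log(generateId({ name: 'Toronto Stock Exchange' })); // TORONTO_STOCK
console.log(generateId({})); // UNKNOWN_EXCHANGE
```

Because the code branch wins even when both code and name are present, two providers that disagree on display names but share a code still collapse to the same master ID.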


@@ -1,203 +1,207 @@
import { getLogger } from '@stock-bot/logger';
import { getMongoDBClient, getPostgreSQLClient } from '../../../clients';
import type { JobPayload, SyncResult } from '../../../types/job-payloads';

const logger = getLogger('enhanced-sync-qm-provider-mappings');

export async function syncQMProviderMappings(payload: JobPayload): Promise<SyncResult> {
  logger.info('Starting QM provider exchange mappings sync...');

  const result: SyncResult = {
    processed: 0,
    created: 0,
    updated: 0,
    skipped: 0,
    errors: 0,
  };

  try {
    const mongoClient = getMongoDBClient();
    const postgresClient = getPostgreSQLClient();

    // Start transaction
    await postgresClient.query('BEGIN');

    // Get unique exchange combinations from QM symbols
    const db = mongoClient.getDatabase();
    const pipeline = [
      {
        $group: {
          _id: {
            exchangeCode: '$exchangeCode',
            exchange: '$exchange',
            countryCode: '$countryCode',
          },
          count: { $sum: 1 },
          sampleExchange: { $first: '$exchange' },
        },
      },
      {
        $project: {
          exchangeCode: '$_id.exchangeCode',
          exchange: '$_id.exchange',
          countryCode: '$_id.countryCode',
          count: 1,
          sampleExchange: 1,
        },
      },
    ];

    const qmExchanges = await db.collection('qmSymbols').aggregate(pipeline).toArray();
    logger.info(`Found ${qmExchanges.length} unique QM exchange combinations`);

    for (const exchange of qmExchanges) {
      try {
        // Create provider exchange mapping for QM
        await createProviderExchangeMapping(
          'qm', // provider
          exchange.exchangeCode,
          exchange.sampleExchange || exchange.exchangeCode,
          exchange.countryCode,
          exchange.countryCode === 'CA' ? 'CAD' : 'USD', // Simple currency mapping
          0.8 // good confidence for QM data
        );

        result.processed++;
        result.created++;
      } catch (error) {
        logger.error('Failed to process QM exchange mapping', { error, exchange });
        result.errors++;
      }
    }

    await postgresClient.query('COMMIT');

    logger.info('QM provider exchange mappings sync completed', result);
    return result;
  } catch (error) {
    const postgresClient = getPostgreSQLClient();
    await postgresClient.query('ROLLBACK');
    logger.error('QM provider exchange mappings sync failed', { error });
    throw error;
  }
}

async function createProviderExchangeMapping(
  provider: string,
  providerExchangeCode: string,
  providerExchangeName: string,
  countryCode: string | null,
  currency: string | null,
  confidence: number
): Promise<void> {
  if (!providerExchangeCode) {
    return;
  }

  const postgresClient = getPostgreSQLClient();

  // Check if mapping already exists
  const existingMapping = await findProviderExchangeMapping(provider, providerExchangeCode);
  if (existingMapping) {
    // Don't override existing mappings to preserve manual work
    return;
  }

  // Find or create master exchange
  const masterExchange = await findOrCreateMasterExchange(
    providerExchangeCode,
    providerExchangeName,
    countryCode,
    currency
  );

  // Create the provider exchange mapping
  const query = `
    INSERT INTO provider_exchange_mappings
    (provider, provider_exchange_code, provider_exchange_name, master_exchange_id,
     country_code, currency, confidence, active, auto_mapped)
    VALUES ($1, $2, $3, $4, $5, $6, $7, false, true)
    ON CONFLICT (provider, provider_exchange_code) DO NOTHING
  `;

  await postgresClient.query(query, [
    provider,
    providerExchangeCode,
    providerExchangeName,
    masterExchange.id,
    countryCode,
    currency,
    confidence,
  ]);
}

async function findProviderExchangeMapping(
  provider: string,
  providerExchangeCode: string
): Promise<any> {
  const postgresClient = getPostgreSQLClient();
  const query =
    'SELECT * FROM provider_exchange_mappings WHERE provider = $1 AND provider_exchange_code = $2';
  const result = await postgresClient.query(query, [provider, providerExchangeCode]);
  return result.rows[0] || null;
}

async function findOrCreateMasterExchange(
  providerCode: string,
  providerName: string,
  countryCode: string | null,
  currency: string | null
): Promise<any> {
  const postgresClient = getPostgreSQLClient();

  // First, try to find exact match
  let masterExchange = await findExchangeByCode(providerCode);
  if (masterExchange) {
    return masterExchange;
  }

  // Try to find by similar codes (basic mapping)
  const basicMapping = getBasicExchangeMapping(providerCode);
  if (basicMapping) {
    masterExchange = await findExchangeByCode(basicMapping);
    if (masterExchange) {
      return masterExchange;
    }
  }

  // Create new master exchange (inactive by default)
  const query = `
    INSERT INTO exchanges (code, name, country, currency, active)
    VALUES ($1, $2, $3, $4, false)
    ON CONFLICT (code) DO UPDATE SET
      name = COALESCE(EXCLUDED.name, exchanges.name),
      country = COALESCE(EXCLUDED.country, exchanges.country),
      currency = COALESCE(EXCLUDED.currency, exchanges.currency)
    RETURNING id, code, name, country, currency
  `;

  const result = await postgresClient.query(query, [
    providerCode,
    providerName || providerCode,
    countryCode || 'US',
    currency || 'USD',
  ]);

  return result.rows[0];
}

function getBasicExchangeMapping(providerCode: string): string | null {
  const mappings: Record<string, string> = {
    NYE: 'NYSE',
    NAS: 'NASDAQ',
    TO: 'TSX',
    LN: 'LSE',
    LON: 'LSE',
  };

  return mappings[providerCode.toUpperCase()] || null;
}

async function findExchangeByCode(code: string): Promise<any> {
  const postgresClient = getPostgreSQLClient();
  const query = 'SELECT * FROM exchanges WHERE code = $1';
  const result = await postgresClient.query(query, [code]);
  return result.rows[0] || null;
}


@@ -1,9 +1,9 @@
import { syncQMSymbols } from './qm-symbols.operations';
import { getSyncStatus } from './sync-status.operations';
import { syncSymbolsFromProvider } from './sync-symbols-from-provider.operations';

export const symbolOperations = {
  syncQMSymbols,
  syncSymbolsFromProvider,
  getSyncStatus,
};


@@ -1,167 +1,183 @@
import { getLogger } from '@stock-bot/logger';
import { getMongoDBClient, getPostgreSQLClient } from '../../../clients';
import type { JobPayload } from '../../../types/job-payloads';

const logger = getLogger('sync-qm-symbols');

export async function syncQMSymbols(
  payload: JobPayload
): Promise<{ processed: number; created: number; updated: number }> {
  logger.info('Starting QM symbols sync...');

  try {
    const mongoClient = getMongoDBClient();
    const postgresClient = getPostgreSQLClient();

    // 1. Get all QM symbols from MongoDB
    const qmSymbols = await mongoClient.find('qmSymbols', {});
    logger.info(`Found ${qmSymbols.length} QM symbols to process`);

    let created = 0;
    let updated = 0;

    for (const symbol of qmSymbols) {
      try {
        // 2. Resolve exchange
        const exchangeId = await resolveExchange(
          symbol.exchangeCode || symbol.exchange,
          postgresClient
        );

        if (!exchangeId) {
          logger.warn('Unknown exchange, skipping symbol', {
            symbol: symbol.symbol,
            exchange: symbol.exchangeCode || symbol.exchange,
          });
          continue;
        }

        // 3. Check if symbol exists
        const existingSymbol = await findSymbol(symbol.symbol, exchangeId, postgresClient);

        if (existingSymbol) {
          // Update existing
          await updateSymbol(existingSymbol.id, symbol, postgresClient);
          await upsertProviderMapping(existingSymbol.id, 'qm', symbol, postgresClient);
          updated++;
        } else {
          // Create new
          const newSymbolId = await createSymbol(symbol, exchangeId, postgresClient);
          await upsertProviderMapping(newSymbolId, 'qm', symbol, postgresClient);
          created++;
        }
      } catch (error) {
        logger.error('Failed to process symbol', { error, symbol: symbol.symbol });
      }
    }

    // 4. Update sync status
    await updateSyncStatus('qm', 'symbols', qmSymbols.length, postgresClient);

    const result = { processed: qmSymbols.length, created, updated };
    logger.info('QM symbols sync completed', result);
    return result;
  } catch (error) {
    logger.error('QM symbols sync failed', { error });
    throw error;
  }
}

// Helper functions
async function resolveExchange(exchangeCode: string, postgresClient: any): Promise<string | null> {
  if (!exchangeCode) {
    return null;
  }

  // Simple mapping - expand this as needed
  const exchangeMap: Record<string, string> = {
    NASDAQ: 'NASDAQ',
    NYSE: 'NYSE',
    TSX: 'TSX',
    TSE: 'TSX', // TSE maps to TSX
    LSE: 'LSE',
    CME: 'CME',
  };

  const normalizedCode = exchangeMap[exchangeCode.toUpperCase()];
  if (!normalizedCode) {
    return null;
  }

  const query = 'SELECT id FROM exchanges WHERE code = $1';
  const result = await postgresClient.query(query, [normalizedCode]);
  return result.rows[0]?.id || null;
}

async function findSymbol(symbol: string, exchangeId: string, postgresClient: any): Promise<any> {
  const query = 'SELECT * FROM symbols WHERE symbol = $1 AND exchange_id = $2';
  const result = await postgresClient.query(query, [symbol, exchangeId]);
  return result.rows[0] || null;
}

async function createSymbol(
  qmSymbol: any,
  exchangeId: string,
  postgresClient: any
): Promise<string> {
  const query = `
    INSERT INTO symbols (symbol, exchange_id, company_name, country, currency)
    VALUES ($1, $2, $3, $4, $5)
    RETURNING id
  `;

  const result = await postgresClient.query(query, [
    qmSymbol.symbol,
    exchangeId,
    qmSymbol.companyName || qmSymbol.name,
    qmSymbol.countryCode || 'US',
    qmSymbol.currency || 'USD',
  ]);

  return result.rows[0].id;
}

async function updateSymbol(symbolId: string, qmSymbol: any, postgresClient: any): Promise<void> {
  const query = `
    UPDATE symbols
    SET company_name = COALESCE($2, company_name),
        country = COALESCE($3, country),
        currency = COALESCE($4, currency),
        updated_at = NOW()
    WHERE id = $1
  `;

  await postgresClient.query(query, [
    symbolId,
    qmSymbol.companyName || qmSymbol.name,
    qmSymbol.countryCode,
    qmSymbol.currency,
  ]);
}

async function upsertProviderMapping(
  symbolId: string,
  provider: string,
  qmSymbol: any,
  postgresClient: any
): Promise<void> {
  const query = `
    INSERT INTO provider_mappings
    (symbol_id, provider, provider_symbol, provider_exchange, last_seen)
    VALUES ($1, $2, $3, $4, NOW())
    ON CONFLICT (provider, provider_symbol)
    DO UPDATE SET
      symbol_id = EXCLUDED.symbol_id,
      provider_exchange = EXCLUDED.provider_exchange,
      last_seen = NOW()
  `;

  await postgresClient.query(query, [
    symbolId,
    provider,
    qmSymbol.qmSearchCode || qmSymbol.symbol,
    qmSymbol.exchangeCode || qmSymbol.exchange,
  ]);
}

async function updateSyncStatus(
  provider: string,
  dataType: string,
  count: number,
  postgresClient: any
): Promise<void> {
  const query = `
    UPDATE sync_status
    SET last_sync_at = NOW(),
        last_sync_count = $3,
        sync_errors = NULL,
        updated_at = NOW()
    WHERE provider = $1 AND data_type = $2
  `;

  await postgresClient.query(query, [provider, dataType, count]);
}
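updateSymbol relies on `COALESCE($n, column)` so that a NULL parameter preserves the stored value instead of blanking it. The snippet below is a hypothetical in-memory illustration of those merge semantics, not the SQL itself:

```typescript
// Hypothetical in-memory illustration of the COALESCE(new, old) update
// pattern: a null/undefined incoming field leaves the stored value intact.
interface SymbolRow {
  companyName: string;
  country: string;
  currency: string;
}

function coalesceUpdate(
  row: SymbolRow,
  patch: Partial<Record<keyof SymbolRow, string | null>>
): SymbolRow {
  // `??` mirrors COALESCE: take the incoming value unless it is null/undefined
  const pick = (incoming: string | null | undefined, current: string) => incoming ?? current;
  return {
    companyName: pick(patch.companyName, row.companyName),
    country: pick(patch.country, row.country),
    currency: pick(patch.currency, row.currency),
  };
}

const row: SymbolRow = { companyName: 'Acme Corp', country: 'US', currency: 'USD' };
console.log(coalesceUpdate(row, { companyName: 'Acme Inc', country: null }));
// { companyName: 'Acme Inc', country: 'US', currency: 'USD' }
```

This is why the caller can pass `qmSymbol.countryCode` and `qmSymbol.currency` through without defaulting them: missing provider fields simply leave the row as it was.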


@@ -1,21 +1,21 @@
import { getLogger } from '@stock-bot/logger';
import { getPostgreSQLClient } from '../../../clients';
import type { JobPayload } from '../../../types/job-payloads';

const logger = getLogger('sync-status');

export async function getSyncStatus(payload: JobPayload): Promise<Record<string, unknown>[]> {
  logger.info('Getting sync status...');

  try {
    const postgresClient = getPostgreSQLClient();
    const query = 'SELECT * FROM sync_status ORDER BY provider, data_type';
    const result = await postgresClient.query(query);

    logger.info(`Retrieved sync status for ${result.rows.length} entries`);
    return result.rows;
  } catch (error) {
    logger.error('Failed to get sync status', { error });
    throw error;
  }
}


@@ -1,215 +1,231 @@
import { getLogger } from '@stock-bot/logger';
import { getMongoDBClient, getPostgreSQLClient } from '../../../clients';
import type { JobPayload, SyncResult } from '../../../types/job-payloads';

const logger = getLogger('enhanced-sync-symbols-from-provider');

export async function syncSymbolsFromProvider(payload: JobPayload): Promise<SyncResult> {
  const provider = payload.provider;
  const clearFirst = payload.clearFirst || false;

  if (!provider) {
    throw new Error('Provider is required in payload');
  }

  logger.info(`Starting ${provider} symbols sync...`, { clearFirst });

  const result: SyncResult = {
    processed: 0,
    created: 0,
    updated: 0,
    skipped: 0,
    errors: 0,
  };

  try {
    const mongoClient = getMongoDBClient();
    const postgresClient = getPostgreSQLClient();

    // Clear existing data if requested (only symbols and mappings, keep exchanges)
    if (clearFirst) {
      await postgresClient.query('BEGIN');
      await postgresClient.query('DELETE FROM provider_mappings');
      await postgresClient.query('DELETE FROM symbols');
      await postgresClient.query('COMMIT');
      logger.info('Cleared existing symbols and mappings before sync');
    }

    // Start transaction
    await postgresClient.query('BEGIN');

    let symbols: Record<string, unknown>[] = [];

    // Get symbols based on provider
    const db = mongoClient.getDatabase();
    switch (provider.toLowerCase()) {
      case 'qm':
        symbols = await db.collection('qmSymbols').find({}).toArray();
        break;
      case 'eod':
        symbols = await db.collection('eodSymbols').find({}).toArray();
        break;
      case 'ib':
        symbols = await db.collection('ibSymbols').find({}).toArray();
        break;
      default:
        throw new Error(`Unsupported provider: ${provider}`);
    }

    logger.info(`Found ${symbols.length} ${provider} symbols to process`);
    result.processed = symbols.length;

    for (const symbol of symbols) {
      try {
        await processSingleSymbol(symbol, provider, result);
      } catch (error) {
        logger.error('Failed to process symbol', {
          error,
          symbol: symbol.symbol || symbol.code,
          provider,
        });
        result.errors++;
      }
    }

    // Update sync status
    await updateSyncStatus(provider, 'symbols', result.processed, postgresClient);

    await postgresClient.query('COMMIT');

    logger.info(`${provider} symbols sync completed`, result);
    return result;
  } catch (error) {
    const postgresClient = getPostgreSQLClient();
    await postgresClient.query('ROLLBACK');
    logger.error(`${provider} symbols sync failed`, { error });
    throw error;
  }
}
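`syncSymbolsFromProvider` above drives an explicit `BEGIN` / `COMMIT` / `ROLLBACK` sequence by hand. The same pattern as a reusable helper, sketched against an in-memory stand-in — `withTransaction`, `QueryClient`, and `RecordingClient` are illustrative names, not part of this repo:

```typescript
// Illustrative helper, not part of the service: wraps work in the same
// BEGIN / COMMIT / ROLLBACK sequence used by syncSymbolsFromProvider.
interface QueryClient {
  query(sql: string): Promise<unknown>;
}

async function withTransaction<T>(client: QueryClient, work: () => Promise<T>): Promise<T> {
  await client.query('BEGIN');
  try {
    const value = await work();
    await client.query('COMMIT');
    return value;
  } catch (error) {
    await client.query('ROLLBACK');
    throw error; // caller still sees the failure, as in the sync above
  }
}

// In-memory stand-in that records every statement it receives.
class RecordingClient implements QueryClient {
  statements: string[] = [];
  async query(sql: string): Promise<unknown> {
    this.statements.push(sql);
    return undefined;
  }
}
```

A successful callback leaves `['BEGIN', 'COMMIT']` in `statements`; a failing one leaves `['BEGIN', 'ROLLBACK']` and rethrows.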
async function processSingleSymbol(
  symbol: any,
  provider: string,
  result: SyncResult
): Promise<void> {
  const symbolCode = symbol.symbol || symbol.code;
  const exchangeCode = symbol.exchangeCode || symbol.exchange || symbol.exchange_id;

  if (!symbolCode || !exchangeCode) {
    result.skipped++;
    return;
  }

  // Find active provider exchange mapping
  const providerMapping = await findActiveProviderExchangeMapping(provider, exchangeCode);

  if (!providerMapping) {
    result.skipped++;
    return;
  }

  // Check if symbol exists
  const existingSymbol = await findSymbolByCodeAndExchange(
    symbolCode,
    providerMapping.master_exchange_id
  );

  if (existingSymbol) {
    await updateSymbol(existingSymbol.id, symbol);
    await upsertProviderMapping(existingSymbol.id, provider, symbol);
    result.updated++;
  } else {
    const newSymbolId = await createSymbol(symbol, providerMapping.master_exchange_id);
    await upsertProviderMapping(newSymbolId, provider, symbol);
    result.created++;
  }
}

async function findActiveProviderExchangeMapping(
  provider: string,
  providerExchangeCode: string
): Promise<any> {
  const postgresClient = getPostgreSQLClient();
  const query = `
    SELECT pem.*, e.code as master_exchange_code
    FROM provider_exchange_mappings pem
    JOIN exchanges e ON pem.master_exchange_id = e.id
    WHERE pem.provider = $1 AND pem.provider_exchange_code = $2 AND pem.active = true
  `;
  const result = await postgresClient.query(query, [provider, providerExchangeCode]);
  return result.rows[0] || null;
}

async function findSymbolByCodeAndExchange(symbol: string, exchangeId: string): Promise<any> {
  const postgresClient = getPostgreSQLClient();
  const query = 'SELECT * FROM symbols WHERE symbol = $1 AND exchange_id = $2';
  const result = await postgresClient.query(query, [symbol, exchangeId]);
  return result.rows[0] || null;
}

async function createSymbol(symbol: any, exchangeId: string): Promise<string> {
  const postgresClient = getPostgreSQLClient();
  const query = `
    INSERT INTO symbols (symbol, exchange_id, company_name, country, currency)
    VALUES ($1, $2, $3, $4, $5)
    RETURNING id
  `;

  const result = await postgresClient.query(query, [
    symbol.symbol || symbol.code,
    exchangeId,
    symbol.companyName || symbol.name || symbol.company_name,
    symbol.countryCode || symbol.country_code || 'US',
    symbol.currency || 'USD',
  ]);

  return result.rows[0].id;
}

async function updateSymbol(symbolId: string, symbol: any): Promise<void> {
  const postgresClient = getPostgreSQLClient();
  const query = `
    UPDATE symbols
    SET company_name = COALESCE($2, company_name),
        country = COALESCE($3, country),
        currency = COALESCE($4, currency),
        updated_at = NOW()
    WHERE id = $1
  `;

  await postgresClient.query(query, [
    symbolId,
    symbol.companyName || symbol.name || symbol.company_name,
    symbol.countryCode || symbol.country_code,
    symbol.currency,
  ]);
}

async function upsertProviderMapping(
  symbolId: string,
  provider: string,
  symbol: any
): Promise<void> {
  const postgresClient = getPostgreSQLClient();
  const query = `
    INSERT INTO provider_mappings
      (symbol_id, provider, provider_symbol, provider_exchange, last_seen)
    VALUES ($1, $2, $3, $4, NOW())
    ON CONFLICT (provider, provider_symbol)
    DO UPDATE SET
      symbol_id = EXCLUDED.symbol_id,
      provider_exchange = EXCLUDED.provider_exchange,
      last_seen = NOW()
  `;

  await postgresClient.query(query, [
    symbolId,
    provider,
    symbol.qmSearchCode || symbol.symbol || symbol.code,
    symbol.exchangeCode || symbol.exchange || symbol.exchange_id,
  ]);
}

async function updateSyncStatus(
  provider: string,
  dataType: string,
  count: number,
  postgresClient: any
): Promise<void> {
  const query = `
    INSERT INTO sync_status (provider, data_type, last_sync_at, last_sync_count, sync_errors)
    VALUES ($1, $2, NOW(), $3, NULL)
    ON CONFLICT (provider, data_type)
    DO UPDATE SET
      last_sync_at = NOW(),
      last_sync_count = EXCLUDED.last_sync_count,
      sync_errors = NULL,
      updated_at = NOW()
  `;

  await postgresClient.query(query, [provider, dataType, count]);
}
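The helpers above repeatedly coalesce provider-specific field names (`symbol.companyName || symbol.name || symbol.company_name`) and fall back to `'US'` / `'USD'` defaults, skipping documents with no usable identifier. A self-contained sketch of that normalization — `normalizeSymbolRow` is a hypothetical name, not part of the service:

```typescript
// Hypothetical normalizer mirroring the field-coalescing in the helpers above:
// providers name the same fields differently, so the first non-empty candidate wins.
interface ProviderSymbolDoc {
  symbol?: string;
  code?: string;
  companyName?: string;
  name?: string;
  company_name?: string;
  countryCode?: string;
  country_code?: string;
  currency?: string;
}

interface NormalizedSymbol {
  symbol: string;
  companyName?: string;
  country: string;
  currency: string;
}

function normalizeSymbolRow(doc: ProviderSymbolDoc): NormalizedSymbol | null {
  const symbol = doc.symbol || doc.code;
  if (!symbol) {
    return null; // mirrors the `result.skipped++` branch: no usable identifier
  }
  return {
    symbol,
    companyName: doc.companyName || doc.name || doc.company_name,
    country: doc.countryCode || doc.country_code || 'US',
    currency: doc.currency || 'USD',
  };
}
```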
@@ -1,41 +1,41 @@
import { getLogger } from '@stock-bot/logger';
import { handlerRegistry, type HandlerConfig, type ScheduledJobConfig } from '@stock-bot/queue';
import { symbolOperations } from './operations';

const logger = getLogger('symbols-handler');

const HANDLER_NAME = 'symbols';

const symbolsHandlerConfig: HandlerConfig = {
  concurrency: 1,
  maxAttempts: 3,
  scheduledJobs: [
    {
      operation: 'sync-qm-symbols',
      cronPattern: '0 2 * * *', // Daily at 2 AM
      payload: {},
      priority: 5,
      immediately: false,
    } as ScheduledJobConfig,
    {
      operation: 'sync-symbols-qm',
      cronPattern: '0 4 * * *', // Daily at 4 AM
      payload: { provider: 'qm', clearFirst: false },
      priority: 5,
      immediately: false,
    } as ScheduledJobConfig,
  ],
  operations: {
    'sync-qm-symbols': symbolOperations.syncQMSymbols,
    'sync-symbols-qm': symbolOperations.syncSymbolsFromProvider,
    'sync-symbols-eod': symbolOperations.syncSymbolsFromProvider,
    'sync-symbols-ib': symbolOperations.syncSymbolsFromProvider,
    'sync-status': symbolOperations.getSyncStatus,
  },
};

export function initializeSymbolsHandler(): void {
  logger.info('Registering symbols handler...');
  handlerRegistry.registerHandler(HANDLER_NAME, symbolsHandlerConfig);
  logger.info('Symbols handler registered successfully');
}
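The config above plugs operation names into `handlerRegistry` from `@stock-bot/queue`; that library's real API is not shown in this diff. A minimal sketch of the registry idea under those assumptions (all names here are illustrative):

```typescript
// Minimal sketch of a handler registry: a named handler maps operation
// strings to functions. The real handlerRegistry in @stock-bot/queue may differ.
type Operation = (payload: Record<string, unknown>) => Promise<unknown>;

interface MinimalHandlerConfig {
  concurrency: number;
  maxAttempts: number;
  operations: Record<string, Operation>;
}

class MinimalHandlerRegistry {
  private handlers = new Map<string, MinimalHandlerConfig>();

  registerHandler(name: string, config: MinimalHandlerConfig): void {
    if (this.handlers.has(name)) {
      throw new Error(`Handler already registered: ${name}`);
    }
    this.handlers.set(name, config);
  }

  // Resolve a (handler, operation) pair to its function, failing fast when unknown.
  resolve(handler: string, operation: string): Operation {
    const op = this.handlers.get(handler)?.operations[operation];
    if (!op) {
      throw new Error(`Unknown operation ${handler}/${operation}`);
    }
    return op;
  }
}
```

A worker would call `resolve(job.handler, job.operation)` and invoke the result with the job payload.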
@@ -1,16 +1,16 @@
// Framework imports
import { Hono } from 'hono';
import { cors } from 'hono/cors';
import { initializeServiceConfig } from '@stock-bot/config';

// Library imports
import { getLogger, setLoggerConfig, shutdownLoggers } from '@stock-bot/logger';
import { MongoDBClient } from '@stock-bot/mongodb';
import { PostgreSQLClient } from '@stock-bot/postgres';
import { QueueManager, type QueueManagerConfig } from '@stock-bot/queue';
import { Shutdown } from '@stock-bot/shutdown';

// Local imports
import { enhancedSyncRoutes, healthRoutes, statsRoutes, syncRoutes } from './routes';
import { setMongoDBClient, setPostgreSQLClient } from './clients';

const config = initializeServiceConfig();
console.log('Data Sync Service Configuration:', JSON.stringify(config, null, 2));
@@ -66,17 +66,20 @@ async function initializeServices() {
    // Initialize MongoDB client
    logger.debug('Connecting to MongoDB...');
    const mongoConfig = databaseConfig.mongodb;
    mongoClient = new MongoDBClient(
      {
        uri: mongoConfig.uri,
        database: mongoConfig.database,
        host: mongoConfig.host || 'localhost',
        port: mongoConfig.port || 27017,
        timeouts: {
          connectTimeout: 30000,
          socketTimeout: 30000,
          serverSelectionTimeout: 5000,
        },
      },
      logger
    );
    await mongoClient.connect();
    setMongoDBClient(mongoClient);
    logger.info('MongoDB connected');
@@ -84,18 +87,21 @@ async function initializeServices() {
    // Initialize PostgreSQL client
    logger.debug('Connecting to PostgreSQL...');
    const pgConfig = databaseConfig.postgres;
    postgresClient = new PostgreSQLClient(
      {
        host: pgConfig.host,
        port: pgConfig.port,
        database: pgConfig.database,
        username: pgConfig.user,
        password: pgConfig.password,
        poolSettings: {
          min: 2,
          max: pgConfig.poolSize || 10,
          idleTimeoutMillis: pgConfig.idleTimeout || 30000,
        },
      },
      logger
    );
    await postgresClient.connect();
    setPostgreSQLClient(postgresClient);
    logger.info('PostgreSQL connected');
@@ -124,7 +130,7 @@ async function initializeServices() {
      enableDLQ: true,
    },
    enableScheduledJobs: true,
    delayWorkerStart: true, // Prevent workers from starting until all singletons are ready
  };

  queueManager = QueueManager.getOrInitialize(queueManagerConfig);
@@ -134,10 +140,10 @@ async function initializeServices() {
  logger.debug('Initializing sync handlers...');
  const { initializeExchangesHandler } = await import('./handlers/exchanges/exchanges.handler');
  const { initializeSymbolsHandler } = await import('./handlers/symbols/symbols.handler');

  initializeExchangesHandler();
  initializeSymbolsHandler();

  logger.info('Sync handlers initialized');

  // Create scheduled jobs from registered handlers
@@ -271,4 +277,4 @@ startServer().catch(error => {
  process.exit(1);
});

logger.info('Data sync service startup initiated');

@@ -11,13 +11,13 @@ enhancedSync.post('/exchanges/all', async c => {
    const clearFirst = c.req.query('clear') === 'true';
    const queueManager = QueueManager.getInstance();
    const exchangesQueue = queueManager.getQueue('exchanges');

    const job = await exchangesQueue.addJob('sync-all-exchanges', {
      handler: 'exchanges',
      operation: 'sync-all-exchanges',
      payload: { clearFirst },
    });

    return c.json({ success: true, jobId: job.id, message: 'Enhanced exchange sync job queued' });
  } catch (error) {
    logger.error('Failed to queue enhanced exchange sync job', { error });
@@ -32,14 +32,18 @@ enhancedSync.post('/provider-mappings/qm', async c => {
  try {
    const queueManager = QueueManager.getInstance();
    const exchangesQueue = queueManager.getQueue('exchanges');

    const job = await exchangesQueue.addJob('sync-qm-provider-mappings', {
      handler: 'exchanges',
      operation: 'sync-qm-provider-mappings',
      payload: {},
    });

    return c.json({
      success: true,
      jobId: job.id,
      message: 'QM provider mappings sync job queued',
    });
  } catch (error) {
    logger.error('Failed to queue QM provider mappings sync job', { error });
    return c.json(
@@ -55,13 +59,13 @@ enhancedSync.post('/symbols/:provider', async c => {
    const clearFirst = c.req.query('clear') === 'true';
    const queueManager = QueueManager.getInstance();
    const symbolsQueue = queueManager.getQueue('symbols');

    const job = await symbolsQueue.addJob(`sync-symbols-${provider}`, {
      handler: 'symbols',
      operation: `sync-symbols-${provider}`,
      payload: { provider, clearFirst },
    });

    return c.json({ success: true, jobId: job.id, message: `${provider} symbols sync job queued` });
  } catch (error) {
    logger.error('Failed to queue enhanced symbol sync job', { error });
@@ -77,13 +81,13 @@ enhancedSync.get('/status/enhanced', async c => {
  try {
    const queueManager = QueueManager.getInstance();
    const exchangesQueue = queueManager.getQueue('exchanges');

    const job = await exchangesQueue.addJob('enhanced-sync-status', {
      handler: 'exchanges',
      operation: 'enhanced-sync-status',
      payload: {},
    });

    // Wait for job to complete and return result
    const result = await job.waitUntilFinished();
    return c.json(result);
@@ -93,4 +97,4 @@ enhancedSync.get('/status/enhanced', async c => {
  }
});

export { enhancedSync as enhancedSyncRoutes };

@@ -2,4 +2,4 @@
export { healthRoutes } from './health.routes';
export { syncRoutes } from './sync.routes';
export { enhancedSyncRoutes } from './enhanced-sync.routes';
export { statsRoutes } from './stats.routes';

@@ -10,13 +10,13 @@ stats.get('/exchanges', async c => {
  try {
    const queueManager = QueueManager.getInstance();
    const exchangesQueue = queueManager.getQueue('exchanges');

    const job = await exchangesQueue.addJob('get-exchange-stats', {
      handler: 'exchanges',
      operation: 'get-exchange-stats',
      payload: {},
    });

    // Wait for job to complete and return result
    const result = await job.waitUntilFinished();
    return c.json(result);
@@ -30,13 +30,13 @@ stats.get('/provider-mappings', async c => {
  try {
    const queueManager = QueueManager.getInstance();
    const exchangesQueue = queueManager.getQueue('exchanges');

    const job = await exchangesQueue.addJob('get-provider-mapping-stats', {
      handler: 'exchanges',
      operation: 'get-provider-mapping-stats',
      payload: {},
    });

    // Wait for job to complete and return result
    const result = await job.waitUntilFinished();
    return c.json(result);
@@ -46,4 +46,4 @@ stats.get('/provider-mappings', async c => {
  }
});

export { stats as statsRoutes };

@@ -10,13 +10,13 @@ sync.post('/symbols', async c => {
  try {
    const queueManager = QueueManager.getInstance();
    const symbolsQueue = queueManager.getQueue('symbols');

    const job = await symbolsQueue.addJob('sync-qm-symbols', {
      handler: 'symbols',
      operation: 'sync-qm-symbols',
      payload: {},
    });

    return c.json({ success: true, jobId: job.id, message: 'QM symbols sync job queued' });
  } catch (error) {
    logger.error('Failed to queue symbol sync job', { error });
@@ -31,13 +31,13 @@ sync.post('/exchanges', async c => {
  try {
    const queueManager = QueueManager.getInstance();
    const exchangesQueue = queueManager.getQueue('exchanges');

    const job = await exchangesQueue.addJob('sync-qm-exchanges', {
      handler: 'exchanges',
      operation: 'sync-qm-exchanges',
      payload: {},
    });

    return c.json({ success: true, jobId: job.id, message: 'QM exchanges sync job queued' });
  } catch (error) {
    logger.error('Failed to queue exchange sync job', { error });
@@ -53,13 +53,13 @@ sync.get('/status', async c => {
  try {
    const queueManager = QueueManager.getInstance();
    const symbolsQueue = queueManager.getQueue('symbols');

    const job = await symbolsQueue.addJob('sync-status', {
      handler: 'symbols',
      operation: 'sync-status',
      payload: {},
    });

    // Wait for job to complete and return result
    const result = await job.waitUntilFinished();
    return c.json(result);
@@ -74,13 +74,13 @@ sync.post('/clear', async c => {
  try {
    const queueManager = QueueManager.getInstance();
    const exchangesQueue = queueManager.getQueue('exchanges');

    const job = await exchangesQueue.addJob('clear-postgresql-data', {
      handler: 'exchanges',
      operation: 'clear-postgresql-data',
      payload: {},
    });

    // Wait for job to complete and return result
    const result = await job.waitUntilFinished();
    return c.json({ success: true, result });
@@ -93,4 +93,4 @@ sync.post('/clear', async c => {
  }
});

export { sync as syncRoutes };

@@ -1,27 +1,27 @@
export interface JobPayload {
  [key: string]: any;
}

export interface SyncResult {
  processed: number;
  created: number;
  updated: number;
  skipped: number;
  errors: number;
}

export interface SyncStatus {
  provider: string;
  dataType: string;
  lastSyncAt?: Date;
  lastSyncCount: number;
  syncErrors?: string;
}

export interface ExchangeMapping {
  id: string;
  code: string;
  name: string;
  country: string;
  currency: string;
}
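The `SyncResult` counters above are additive, so results from several sync batches can be combined field-wise. A small hypothetical helper (not part of the repo) illustrating that:

```typescript
// Hypothetical helper: combine two additive SyncResult counter sets.
interface SyncResultShape {
  processed: number;
  created: number;
  updated: number;
  skipped: number;
  errors: number;
}

function mergeSyncResults(a: SyncResultShape, b: SyncResultShape): SyncResultShape {
  return {
    processed: a.processed + b.processed,
    created: a.created + b.created,
    updated: a.updated + b.updated,
    skipped: a.skipped + b.skipped,
    errors: a.errors + b.errors,
  };
}
```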
@@ -1,15 +1,15 @@
{
  "service": {
    "name": "web-api",
    "port": 4000,
    "host": "0.0.0.0",
    "healthCheckPath": "/health",
    "metricsPath": "/metrics",
    "shutdownTimeout": 30000,
    "cors": {
      "enabled": true,
      "origin": ["http://localhost:4200", "http://localhost:3000", "http://localhost:3002"],
      "credentials": true
    }
  }
}

@@ -1,27 +1,27 @@
import { MongoDBClient } from '@stock-bot/mongodb';
import { PostgreSQLClient } from '@stock-bot/postgres';

let postgresClient: PostgreSQLClient | null = null;
let mongodbClient: MongoDBClient | null = null;

export function setPostgreSQLClient(client: PostgreSQLClient): void {
  postgresClient = client;
}

export function getPostgreSQLClient(): PostgreSQLClient {
  if (!postgresClient) {
    throw new Error('PostgreSQL client not initialized. Call setPostgreSQLClient first.');
  }
  return postgresClient;
}

export function setMongoDBClient(client: MongoDBClient): void {
  mongodbClient = client;
}

export function getMongoDBClient(): MongoDBClient {
  if (!mongodbClient) {
    throw new Error('MongoDB client not initialized. Call setMongoDBClient first.');
  }
  return mongodbClient;
}
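The module above repeats the same set/get pair per client. A generic version of that fail-fast accessor pattern — `createClientAccessor` is an illustrative name, not part of the repo:

```typescript
// Generic sketch of the set/get accessor pair above: the getter throws
// until the client has been injected, exactly like getPostgreSQLClient.
function createClientAccessor<T>(name: string): { set: (client: T) => void; get: () => T } {
  let instance: T | null = null;
  return {
    set(client: T): void {
      instance = client;
    },
    get(): T {
      if (instance === null) {
        throw new Error(`${name} client not initialized. Call set first.`);
      }
      return instance;
    },
  };
}
```

Each client module would then export one accessor instead of hand-writing the pair.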
@@ -77,17 +77,20 @@ async function initializeServices() {
    // Initialize MongoDB client
    logger.debug('Connecting to MongoDB...');
    const mongoConfig = databaseConfig.mongodb;
    mongoClient = new MongoDBClient(
      {
        uri: mongoConfig.uri,
        database: mongoConfig.database,
        host: mongoConfig.host,
        port: mongoConfig.port,
        timeouts: {
          connectTimeout: 30000,
          socketTimeout: 30000,
          serverSelectionTimeout: 5000,
        },
      },
      logger
    );
    await mongoClient.connect();
    setMongoDBClient(mongoClient);
    logger.info('MongoDB connected');
@@ -95,18 +98,21 @@ async function initializeServices() {
    // Initialize PostgreSQL client
    logger.debug('Connecting to PostgreSQL...');
    const pgConfig = databaseConfig.postgres;
    postgresClient = new PostgreSQLClient(
      {
        host: pgConfig.host,
        port: pgConfig.port,
        database: pgConfig.database,
        username: pgConfig.user,
        password: pgConfig.password,
        poolSettings: {
          min: 2,
          max: pgConfig.poolSize || 10,
          idleTimeoutMillis: pgConfig.idleTimeout || 30000,
        },
      },
      logger
    );
    await postgresClient.connect();
    setPostgreSQLClient(postgresClient);
    logger.info('PostgreSQL connected');


@@ -4,13 +4,13 @@
 */
import { Hono } from 'hono';
import { getLogger } from '@stock-bot/logger';
import { exchangeService } from '../services/exchange.service';
import { createSuccessResponse, handleError } from '../utils/error-handler';
import {
  validateCreateExchange,
  validateCreateProviderMapping,
  validateUpdateExchange,
  validateUpdateProviderMapping,
} from '../utils/validation';

const logger = getLogger('exchange-routes');

export const exchangeRoutes = new Hono();
@@ -32,19 +32,19 @@ exchangeRoutes.get('/', async c => {
exchangeRoutes.get('/:id', async c => {
  const exchangeId = c.req.param('id');
  logger.debug('Getting exchange by ID', { exchangeId });

  try {
    const result = await exchangeService.getExchangeById(exchangeId);

    if (!result) {
      logger.warn('Exchange not found', { exchangeId });
      return c.json(createSuccessResponse(null, 'Exchange not found'), 404);
    }

    logger.info('Successfully retrieved exchange details', {
      exchangeId,
      exchangeCode: result.exchange.code,
      mappingCount: result.provider_mappings.length,
    });

    return c.json(createSuccessResponse(result));
  } catch (error) {
@@ -56,25 +56,22 @@ exchangeRoutes.get('/:id', async c => {

// Create new exchange
exchangeRoutes.post('/', async c => {
  logger.debug('Creating new exchange');

  try {
    const body = await c.req.json();
    logger.debug('Received exchange creation request', { requestBody: body });

    const validatedData = validateCreateExchange(body);
    logger.debug('Exchange data validated successfully', { validatedData });

    const exchange = await exchangeService.createExchange(validatedData);

    logger.info('Exchange created successfully', {
      exchangeId: exchange.id,
      code: exchange.code,
      name: exchange.name,
    });

    return c.json(createSuccessResponse(exchange, 'Exchange created successfully'), 201);
  } catch (error) {
    logger.error('Failed to create exchange', { error });
    return handleError(c, error, 'to create exchange');
@@ -85,32 +82,32 @@ exchangeRoutes.post('/', async c => {
exchangeRoutes.patch('/:id', async c => {
  const exchangeId = c.req.param('id');
  logger.debug('Updating exchange', { exchangeId });

  try {
    const body = await c.req.json();
    logger.debug('Received exchange update request', { exchangeId, updates: body });

    const validatedUpdates = validateUpdateExchange(body);
    logger.debug('Exchange update data validated', { exchangeId, validatedUpdates });

    const exchange = await exchangeService.updateExchange(exchangeId, validatedUpdates);

    if (!exchange) {
      logger.warn('Exchange not found for update', { exchangeId });
      return c.json(createSuccessResponse(null, 'Exchange not found'), 404);
    }

    logger.info('Exchange updated successfully', {
      exchangeId,
      code: exchange.code,
      updates: validatedUpdates,
    });

    // Log special actions
    if (validatedUpdates.visible === false) {
      logger.warn('Exchange marked as hidden - provider mappings will be deleted', {
        exchangeId,
        code: exchange.code,
      });
    }
@@ -124,7 +121,7 @@ exchangeRoutes.patch('/:id', async c => {

// Get all provider mappings
exchangeRoutes.get('/provider-mappings/all', async c => {
  logger.debug('Getting all provider mappings');

  try {
    const mappings = await exchangeService.getAllProviderMappings();
    logger.info('Successfully retrieved all provider mappings', { count: mappings.length });
@@ -139,18 +136,12 @@ exchangeRoutes.get('/provider-mappings/all', async c => {
exchangeRoutes.get('/provider-mappings/:provider', async c => {
  const provider = c.req.param('provider');
  logger.debug('Getting provider mappings by provider', { provider });

  try {
    const mappings = await exchangeService.getProviderMappingsByProvider(provider);
    logger.info('Successfully retrieved provider mappings', { provider, count: mappings.length });

    return c.json(createSuccessResponse(mappings, undefined, mappings.length));
  } catch (error) {
    logger.error('Failed to get provider mappings', { error, provider });
    return handleError(c, error, 'to get provider mappings');
@@ -161,26 +152,26 @@ exchangeRoutes.get('/provider-mappings/:provider', async c => {
exchangeRoutes.patch('/provider-mappings/:id', async c => {
  const mappingId = c.req.param('id');
  logger.debug('Updating provider mapping', { mappingId });

  try {
    const body = await c.req.json();
    logger.debug('Received provider mapping update request', { mappingId, updates: body });

    const validatedUpdates = validateUpdateProviderMapping(body);
    logger.debug('Provider mapping update data validated', { mappingId, validatedUpdates });

    const mapping = await exchangeService.updateProviderMapping(mappingId, validatedUpdates);

    if (!mapping) {
      logger.warn('Provider mapping not found for update', { mappingId });
      return c.json(createSuccessResponse(null, 'Provider mapping not found'), 404);
    }

    logger.info('Provider mapping updated successfully', {
      mappingId,
      provider: mapping.provider,
      providerExchangeCode: mapping.provider_exchange_code,
      updates: validatedUpdates,
    });

    return c.json(createSuccessResponse(mapping, 'Provider mapping updated successfully'));
@@ -193,26 +184,23 @@ exchangeRoutes.patch('/provider-mappings/:id', async c => {

// Create new provider mapping
exchangeRoutes.post('/provider-mappings', async c => {
  logger.debug('Creating new provider mapping');

  try {
    const body = await c.req.json();
    logger.debug('Received provider mapping creation request', { requestBody: body });

    const validatedData = validateCreateProviderMapping(body);
    logger.debug('Provider mapping data validated successfully', { validatedData });

    const mapping = await exchangeService.createProviderMapping(validatedData);

    logger.info('Provider mapping created successfully', {
      mappingId: mapping.id,
      provider: mapping.provider,
      providerExchangeCode: mapping.provider_exchange_code,
      masterExchangeId: mapping.master_exchange_id,
    });

    return c.json(createSuccessResponse(mapping, 'Provider mapping created successfully'), 201);
  } catch (error) {
    logger.error('Failed to create provider mapping', { error });
    return handleError(c, error, 'to create provider mapping');
@@ -222,7 +210,7 @@ exchangeRoutes.post('/provider-mappings', async c => {

// Get all available providers
exchangeRoutes.get('/providers/list', async c => {
  logger.debug('Getting providers list');

  try {
    const providers = await exchangeService.getProviders();
    logger.info('Successfully retrieved providers list', { count: providers.length, providers });
@@ -237,21 +225,15 @@ exchangeRoutes.get('/providers/list', async c => {
exchangeRoutes.get('/provider-exchanges/unmapped/:provider', async c => {
  const provider = c.req.param('provider');
  logger.debug('Getting unmapped provider exchanges', { provider });

  try {
    const exchanges = await exchangeService.getUnmappedProviderExchanges(provider);
    logger.info('Successfully retrieved unmapped provider exchanges', {
      provider,
      count: exchanges.length,
    });

    return c.json(createSuccessResponse(exchanges, undefined, exchanges.length));
  } catch (error) {
    logger.error('Failed to get unmapped provider exchanges', { error, provider });
    return handleError(c, error, 'to get unmapped provider exchanges');
@@ -261,7 +243,7 @@ exchangeRoutes.get('/provider-exchanges/unmapped/:provider', async c => {

// Get exchange statistics
exchangeRoutes.get('/stats/summary', async c => {
  logger.debug('Getting exchange statistics');

  try {
    const stats = await exchangeService.getExchangeStats();
    logger.info('Successfully retrieved exchange statistics', { stats });
@@ -270,4 +252,4 @@ exchangeRoutes.get('/stats/summary', async c => {
    logger.error('Failed to get exchange statistics', { error });
    return handleError(c, error, 'to get exchange statistics');
  }
});


@@ -3,7 +3,7 @@
 */
import { Hono } from 'hono';
import { getLogger } from '@stock-bot/logger';
import { getMongoDBClient, getPostgreSQLClient } from '../clients';

const logger = getLogger('health-routes');

export const healthRoutes = new Hono();
@@ -11,13 +11,13 @@ export const healthRoutes = new Hono();

// Basic health check
healthRoutes.get('/', c => {
  logger.debug('Basic health check requested');

  const response = {
    status: 'healthy',
    service: 'web-api',
    timestamp: new Date().toISOString(),
  };

  logger.info('Basic health check successful', { status: response.status });
  return c.json(response);
});
@@ -25,7 +25,7 @@ healthRoutes.get('/', c => {

// Detailed health check with database connectivity
healthRoutes.get('/detailed', async c => {
  logger.debug('Detailed health check requested');

  const health = {
    status: 'healthy',
    service: 'web-api',
@@ -80,19 +80,19 @@ healthRoutes.get('/detailed', async c => {
  health.status = allHealthy ? 'healthy' : 'unhealthy';
  const statusCode = allHealthy ? 200 : 503;

  if (allHealthy) {
    logger.info('Detailed health check successful - all systems healthy', {
      mongodb: health.checks.mongodb.status,
      postgresql: health.checks.postgresql.status,
    });
  } else {
    logger.warn('Detailed health check failed - some systems unhealthy', {
      mongodb: health.checks.mongodb.status,
      postgresql: health.checks.postgresql.status,
      overallStatus: health.status,
    });
  }

  return c.json(health, statusCode);
});


@@ -1,15 +1,15 @@
import { getLogger } from '@stock-bot/logger';
import { getMongoDBClient, getPostgreSQLClient } from '../clients';
import {
  CreateExchangeRequest,
  CreateProviderMappingRequest,
  Exchange,
  ExchangeStats,
  ExchangeWithMappings,
  ProviderExchange,
  ProviderMapping,
  UpdateExchangeRequest,
  UpdateProviderMappingRequest,
} from '../types/exchange.types';

const logger = getLogger('exchange-service');
@@ -18,7 +18,7 @@ export class ExchangeService {
  private get postgresClient() {
    return getPostgreSQLClient();
  }

  private get mongoClient() {
    return getMongoDBClient();
  }
@@ -63,14 +63,17 @@ export class ExchangeService {
    const mappingsResult = await this.postgresClient.query(mappingsQuery);

    // Group mappings by exchange ID
    const mappingsByExchange = mappingsResult.rows.reduce(
      (acc, mapping) => {
        const exchangeId = mapping.master_exchange_id;
        if (!acc[exchangeId]) {
          acc[exchangeId] = [];
        }
        acc[exchangeId].push(mapping);
        return acc;
      },
      {} as Record<string, ProviderMapping[]>
    );

    // Attach mappings to exchanges
    return exchangesResult.rows.map(exchange => ({
@@ -79,7 +82,9 @@ export class ExchangeService {
    }));
  }

  async getExchangeById(
    id: string
  ): Promise<{ exchange: Exchange; provider_mappings: ProviderMapping[] } | null> {
    const exchangeQuery = 'SELECT * FROM exchanges WHERE id = $1 AND visible = true';
    const exchangeResult = await this.postgresClient.query(exchangeQuery, [id]);
@@ -230,7 +235,10 @@ export class ExchangeService {
    return result.rows[0];
  }

  async updateProviderMapping(
    id: string,
    updates: UpdateProviderMappingRequest
  ): Promise<ProviderMapping | null> {
    const updateFields = [];
    const values = [];
    let paramIndex = 1;
@@ -359,7 +367,6 @@ export class ExchangeService {
        break;
      }
      default:
        throw new Error(`Unknown provider: ${provider}`);
    }
@@ -369,4 +376,4 @@ export class ExchangeService {
}

// Export singleton instance
export const exchangeService = new ExchangeService();
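The `getAllExchanges` hunk above reformats a `reduce` that groups provider-mapping rows by `master_exchange_id`. The same grouping can be sketched standalone; the `Mapping` interface here is a hypothetical reduction of the real `ProviderMapping` type to the two fields the grouping touches.

```typescript
// Standalone sketch of the reduce-based grouping used in the service hunk
// above: collect rows into buckets keyed by master_exchange_id.
interface Mapping {
  master_exchange_id: string;
  provider: string;
}

function groupByExchange(rows: Mapping[]): Record<string, Mapping[]> {
  return rows.reduce(
    (acc, mapping) => {
      const exchangeId = mapping.master_exchange_id;
      // Create the bucket on first sight of this exchange ID.
      if (!acc[exchangeId]) {
        acc[exchangeId] = [];
      }
      acc[exchangeId].push(mapping);
      return acc;
    },
    {} as Record<string, Mapping[]>
  );
}
```

Splitting the initial value onto its own argument line, as Prettier does in the diff, is what forces the multi-line `reduce(callback, initialValue)` shape seen above.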


@@ -100,4 +100,4 @@ export interface ApiResponse<T = unknown> {
  error?: string;
  message?: string;
  total?: number;
}


@@ -1,7 +1,7 @@
import { Context } from 'hono';
import { getLogger } from '@stock-bot/logger';
import { ApiResponse } from '../types/exchange.types';
import { ValidationError } from './validation';

const logger = getLogger('error-handler');
@@ -61,4 +61,4 @@ export function createSuccessResponse<T>(
  }

  return response;
}


@@ -1,7 +1,10 @@
import { CreateExchangeRequest, CreateProviderMappingRequest } from '../types/exchange.types';

export class ValidationError extends Error {
  constructor(
    message: string,
    public field?: string
  ) {
    super(message);
    this.name = 'ValidationError';
  }
@@ -38,7 +41,10 @@ export function validateCreateExchange(data: unknown): CreateExchangeRequest {
  }

  if (currency.length !== 3) {
    throw new ValidationError(
      'Currency must be exactly 3 characters (e.g., USD, EUR, CAD)',
      'currency'
    );
  }

  return {
@@ -172,4 +178,4 @@ export function validateUpdateProviderMapping(data: unknown): Record<string, unk
  }

  return updates;
}
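The validation hunk above shows `ValidationError` carrying the offending field as a constructor parameter property, plus the 3-character currency rule from `validateCreateExchange`. A self-contained sketch of both; `checkCurrency` is a hypothetical extraction of just that one rule, not the full validator.

```typescript
// Error class mirroring the shape in the diff: `public field?` is a
// TypeScript parameter property, so the field name travels with the error.
class ValidationError extends Error {
  constructor(
    message: string,
    public field?: string
  ) {
    super(message);
    this.name = 'ValidationError';
  }
}

// Minimal check mirroring the currency rule from validateCreateExchange.
function checkCurrency(currency: string): void {
  if (currency.length !== 3) {
    throw new ValidationError(
      'Currency must be exactly 3 characters (e.g., USD, EUR, CAD)',
      'currency'
    );
  }
}
```

Carrying `field` on the error is what lets the shared `handleError` utility map a validation failure back to the specific request property that caused it.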


@@ -1,16 +1,16 @@
import { useCallback, useEffect, useState } from 'react';
import { exchangeApi } from '../services/exchangeApi';
import {
  CreateExchangeRequest,
  CreateProviderMappingRequest,
  Exchange,
  ExchangeDetails,
  ExchangeStats,
  ProviderExchange,
  ProviderMapping,
  UpdateExchangeRequest,
  UpdateProviderMappingRequest,
} from '../types';

export function useExchanges() {
  const [exchanges, setExchanges] = useState<Exchange[]>([]);
@@ -62,18 +62,15 @@ export function useExchanges() {
    [fetchExchanges]
  );

  const fetchExchangeDetails = useCallback(async (id: string): Promise<ExchangeDetails | null> => {
    try {
      return await exchangeApi.getExchangeById(id);
    } catch (err) {
      // Error fetching exchange details - error state will show in UI
      setError(err instanceof Error ? err.message : 'Failed to fetch exchange details');
      return null;
    }
  }, []);

  const fetchStats = useCallback(async (): Promise<ExchangeStats | null> => {
    try {

@@ -1,22 +1,22 @@
import { useCallback, useState } from 'react';
import { FormErrors } from '../types';

export function useFormValidation<T>(initialData: T, validateFn: (data: T) => FormErrors) {
  const [formData, setFormData] = useState<T>(initialData);
  const [errors, setErrors] = useState<FormErrors>({});
  const [isSubmitting, setIsSubmitting] = useState(false);

  const updateField = useCallback(
    (field: keyof T, value: T[keyof T]) => {
      setFormData(prev => ({ ...prev, [field]: value }));

      // Clear error when user starts typing
      if (errors[field as string]) {
        setErrors(prev => ({ ...prev, [field as string]: '' }));
      }
    },
    [errors]
  );

  const validate = useCallback((): boolean => {
    const newErrors = validateFn(formData);
@@ -30,24 +30,29 @@ export function useFormValidation<T>(
    setIsSubmitting(false);
  }, [initialData]);

  const handleSubmit = useCallback(
    async (
      onSubmit: (data: T) => Promise<void>,
      onSuccess?: () => void,
      onError?: (error: unknown) => void
    ) => {
      if (!validate()) {
        return;
      }

      setIsSubmitting(true);
      try {
        await onSubmit(formData);
        reset();
        onSuccess?.();
      } catch (error) {
        onError?.(error);
      } finally {
        setIsSubmitting(false);
      }
    },
    [formData, validate, reset]
  );

  return {
    formData,
@@ -59,4 +64,4 @@ export function useFormValidation<T>(
    handleSubmit,
    setIsSubmitting,
  };
}


@@ -1,25 +1,22 @@
import {
  ApiResponse,
  CreateExchangeRequest,
  CreateProviderMappingRequest,
  Exchange,
  ExchangeDetails,
  ExchangeStats,
  ProviderExchange,
  ProviderMapping,
  UpdateExchangeRequest,
  UpdateProviderMappingRequest,
} from '../types';

const API_BASE_URL = import.meta.env.VITE_API_BASE_URL || 'http://localhost:4000/api';

class ExchangeApiService {
  private async request<T>(endpoint: string, options?: RequestInit): Promise<ApiResponse<T>> {
    const url = `${API_BASE_URL}${endpoint}`;

    const response = await fetch(url, {
      headers: {
        'Content-Type': 'application/json',
@@ -33,7 +30,7 @@ class ExchangeApiService {
    }

    const data = await response.json();

    if (!data.success) {
      throw new Error(data.error || 'API request failed');
    }
@@ -76,10 +73,10 @@ class ExchangeApiService {

  // Provider Mappings
  async getProviderMappings(provider?: string): Promise<ProviderMapping[]> {
    const endpoint = provider
      ? `/exchanges/provider-mappings/${provider}`
      : '/exchanges/provider-mappings/all';

    const response = await this.request<ProviderMapping[]>(endpoint);
    return response.data || [];
  }
@@ -96,7 +93,7 @@ class ExchangeApiService {
  }

  async updateProviderMapping(
    id: string,
    data: UpdateProviderMappingRequest
  ): Promise<ProviderMapping> {
    const response = await this.request<ProviderMapping>(`/exchanges/provider-mappings/${id}`, {
@@ -132,4 +129,4 @@ class ExchangeApiService {
}

// Export singleton instance
export const exchangeApi = new ExchangeApiService();


@@ -66,4 +66,4 @@ export interface ExchangeStats {
  active_provider_mappings: string;
  verified_provider_mappings: string;
  providers: string;
}


@@ -32,7 +32,9 @@ export interface AddExchangeDialogProps extends BaseDialogProps {

export interface AddProviderMappingDialogProps extends BaseDialogProps {
  exchangeId: string;
  exchangeName: string;
  onCreateMapping: (
    request: import('./request.types').CreateProviderMappingRequest
  ) => Promise<unknown>;
}

export interface DeleteExchangeDialogProps extends BaseDialogProps {
@@ -40,4 +42,4 @@ export interface DeleteExchangeDialogProps extends BaseDialogProps {
  exchangeName: string;
  providerMappingCount: number;
  onConfirmDelete: (exchangeId: string) => Promise<boolean>;
}

@@ -32,4 +32,4 @@ export interface UpdateProviderMappingRequest {
  verified?: boolean;
  confidence?: number;
  master_exchange_id?: string;
}

@@ -21,7 +21,7 @@ export function sortProviderMappings(mappings: ProviderMapping[]): ProviderMapping[] {
    if (!a.active && b.active) {
      return 1;
    }
    // Then by provider name
    return a.provider.localeCompare(b.provider);
  });
@@ -32,4 +32,4 @@ export function truncateText(text: string, maxLength: number): string {
    return text;
  }
  return text.substring(0, maxLength) + '...';
}

@@ -35,4 +35,4 @@ export function validateExchangeForm(data: {
export function hasValidationErrors(errors: FormErrors): boolean {
  return Object.keys(errors).length > 0;
}

@@ -19,7 +19,11 @@ export function formatPercentage(value: number): string {
}
export function getValueColor(value: number): string {
-  if (value > 0) {return 'text-success';}
-  if (value < 0) {return 'text-danger';}
+  if (value > 0) {
+    return 'text-success';
+  }
+  if (value < 0) {
+    return 'text-danger';
+  }
  return 'text-text-secondary';
}

@@ -23,9 +23,15 @@ export function formatPercentage(value: number, decimals = 2): string {
 * Format large numbers with K, M, B suffixes
 */
export function formatNumber(num: number): string {
-  if (num >= 1e9) {return (num / 1e9).toFixed(1) + 'B';}
-  if (num >= 1e6) {return (num / 1e6).toFixed(1) + 'M';}
-  if (num >= 1e3) {return (num / 1e3).toFixed(1) + 'K';}
+  if (num >= 1e9) {
+    return (num / 1e9).toFixed(1) + 'B';
+  }
+  if (num >= 1e6) {
+    return (num / 1e6).toFixed(1) + 'M';
+  }
+  if (num >= 1e3) {
+    return (num / 1e3).toFixed(1) + 'K';
+  }
  return num.toString();
}
@@ -33,8 +39,12 @@ export function formatNumber(num: number): string {
 * Get color class based on numeric value (profit/loss)
 */
export function getValueColor(value: number): string {
-  if (value > 0) {return 'text-success';}
-  if (value < 0) {return 'text-danger';}
+  if (value > 0) {
+    return 'text-success';
+  }
+  if (value < 0) {
+    return 'text-danger';
+  }
  return 'text-text-secondary';
}
@@ -42,6 +52,8 @@ export function getValueColor(value: number): string {
 * Truncate text to specified length
 */
export function truncateText(text: string, length: number): string {
-  if (text.length <= length) {return text;}
+  if (text.length <= length) {
+    return text;
+  }
  return text.slice(0, length) + '...';
}

@@ -1,148 +0,0 @@
# Enhanced Cache Provider Usage
The Redis cache provider now supports advanced TTL handling and conditional operations.
## Basic Usage (Backward Compatible)
```typescript
import { RedisCache } from '@stock-bot/cache';
const cache = new RedisCache({
keyPrefix: 'trading:',
defaultTTL: 3600 // 1 hour
});
// Simple set with TTL (old way - still works)
await cache.set('user:123', userData, 1800); // 30 minutes
// Simple get
const user = await cache.get<UserData>('user:123');
```
## Enhanced Set Options
```typescript
// Preserve existing TTL when updating
await cache.set('user:123', updatedUserData, { preserveTTL: true });
// Only set if key exists (update operation)
const oldValue = await cache.set('user:123', newData, {
onlyIfExists: true,
getOldValue: true
});
// Only set if key doesn't exist (create operation)
await cache.set('user:456', newUser, {
onlyIfNotExists: true,
ttl: 7200 // 2 hours
});
// Get old value when setting new one
const previousData = await cache.set('session:abc', sessionData, {
getOldValue: true,
ttl: 1800
});
```
## Convenience Methods
```typescript
// Update value preserving TTL
await cache.update('user:123', updatedUserData);
// Set only if exists
const updated = await cache.setIfExists('user:123', newData, 3600);
// Set only if not exists (returns true if created)
const created = await cache.setIfNotExists('user:456', userData);
// Replace existing key with new TTL
const oldData = await cache.replace('user:123', newData, 7200);
// Atomic field updates
await cache.updateField('counter:views', (current) => (current || 0) + 1);
await cache.updateField('user:123', (user) => ({
...user,
lastSeen: new Date().toISOString(),
loginCount: (user?.loginCount || 0) + 1
}));
```
## Stock Bot Use Cases
### 1. Rate Limiting
```typescript
// Create the rate-limit entry only if it doesn't exist yet
// (setIfNotExists returns true when the key was created)
const created = await cache.setIfNotExists(
  `ratelimit:${userId}:${endpoint}`,
  { count: 1, resetTime: Date.now() + 60000 },
  60 // 1 minute
);
if (!created) {
// Increment existing counter
await cache.updateField(`ratelimit:${userId}:${endpoint}`, (data) => ({
...data,
count: data.count + 1
}));
}
```
### 2. Session Management
```typescript
// Update session data without changing expiration
await cache.update(`session:${sessionId}`, {
...sessionData,
lastActivity: Date.now()
});
```
### 3. Cache Warming
```typescript
// Only update existing cached data, don't create new entries
const warmed = await cache.setIfExists(`stock:${symbol}:price`, latestPrice);
if (warmed) {
console.log(`Warmed cache for ${symbol}`);
}
```
### 4. Atomic Counters
```typescript
// Thread-safe counter increments
await cache.updateField('metrics:api:calls', (count) => (count || 0) + 1);
await cache.updateField('metrics:errors:500', (count) => (count || 0) + 1);
```
### 5. TTL Preservation for Frequently Updated Data
```typescript
// Keep original expiration when updating frequently changing data
await cache.set(`portfolio:${userId}:positions`, positions, { preserveTTL: true });
```
## Error Handling
The cache provider includes robust error handling:
```typescript
try {
await cache.set('key', value);
} catch (error) {
// Errors are logged and fallback values returned
// The cache operations are non-blocking
}
// Check cache health
const isHealthy = await cache.health();
// Wait for cache to be ready
await cache.waitForReady(10000); // 10 second timeout
```
## Performance Benefits
1. **Atomic Operations**: `updateField` uses Lua scripts to prevent race conditions
2. **TTL Preservation**: Avoids unnecessary TTL resets on updates
3. **Conditional Operations**: Reduces network round trips
4. **Shared Connections**: Efficient connection pooling
5. **Error Recovery**: Graceful degradation when Redis is unavailable
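The atomic `updateField` behavior is worth unpacking. One common way to build such read-modify-write helpers is a compare-and-swap retry loop: read the current value, apply the transform, then write back only if the stored value is still unchanged. The sketch below illustrates the idea against an in-memory `Map` standing in for Redis; `casUpdate` and the surrounding names are illustrative only, not part of `@stock-bot/cache` (the real provider performs the check-and-set step server-side, e.g. in a Lua script):

```typescript
// Illustrative compare-and-swap retry loop. The in-memory `store`
// stands in for Redis; `casUpdate` is a hypothetical name.
type Store = Map<string, string>;

function casUpdate<T>(
  store: Store,
  key: string,
  transform: (current: T | undefined) => T,
  maxRetries = 5
): T {
  for (let i = 0; i < maxRetries; i++) {
    const raw = store.get(key);
    const current = raw === undefined ? undefined : (JSON.parse(raw) as T);
    const next = transform(current);
    // In Redis this check-and-set would run atomically on the server
    // (e.g. a Lua script comparing the old serialized value before writing).
    if (store.get(key) === raw) {
      store.set(key, JSON.stringify(next));
      return next;
    }
    // Another writer changed the value in between: retry with fresh data.
  }
  throw new Error(`casUpdate: gave up after ${maxRetries} retries for ${key}`);
}

const store: Store = new Map();
casUpdate<number>(store, 'counter:views', (c) => (c ?? 0) + 1);
const views = casUpdate<number>(store, 'counter:views', (c) => (c ?? 0) + 1);
console.log(views); // 2
```

With a real Redis backend, the read and the guarded write are separated by network latency, which is exactly why the guard must execute atomically on the server rather than in the client.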

@@ -1,169 +0,0 @@
# Loki Logging for Stock Bot
This document outlines how to use the Loki logging system integrated with the Stock Bot platform (Updated June 2025).
## Overview
Loki provides centralized logging for all Stock Bot services with:
1. **Centralized logging** for all microservices
2. **Log aggregation** and filtering by service, level, and custom labels
3. **Grafana integration** for visualization and dashboards
4. **Query capabilities** using LogQL for log analysis
5. **Alert capabilities** for critical issues
## Getting Started
### Starting the Logging Stack
```cmd
# Start the monitoring stack (includes Loki and Grafana)
scripts\docker.ps1 monitoring
```
Or start services individually:
```cmd
# Start Loki service only
docker-compose up -d loki
# Start Loki and Grafana
docker-compose up -d loki grafana
```
### Viewing Logs
Once started:
1. Access Grafana at http://localhost:3000 (login with admin/admin)
2. Navigate to the "Stock Bot Logs" dashboard
3. View and query your logs
## Using the Logger in Your Services
The Stock Bot logger automatically sends logs to Loki using the updated pattern:
```typescript
import { getLogger } from '@stock-bot/logger';
// Create a logger for your service
const logger = getLogger('your-service-name');
// Log at different levels
logger.debug('Detailed information for debugging');
logger.info('General information about operations');
logger.warn('Potential issues that don\'t affect operation');
logger.error('Critical errors that require attention');
// Log with structured data (searchable in Loki)
logger.info('Processing trade', {
symbol: 'MSFT',
price: 410.75,
quantity: 50
});
```
## Configuration Options
Logger configuration is managed through the `@stock-bot/config` package and can be set in your `.env` file:
```bash
# Logging configuration
LOG_LEVEL=debug # debug, info, warn, error
LOG_CONSOLE=true # Log to console in addition to Loki
LOKI_HOST=localhost # Loki server hostname
LOKI_PORT=3100 # Loki server port
LOKI_RETENTION_DAYS=30 # Days to retain logs
LOKI_LABELS=environment=development,service=stock-bot # Default labels
LOKI_BATCH_SIZE=100 # Number of logs to batch before sending
LOKI_BATCH_WAIT=5 # Max time to wait before sending logs
```
## Useful Loki Queries
Inside Grafana, you can use these LogQL queries to analyze your logs:
1. **All logs from a specific service**:
```
{service="market-data-gateway"}
```
2. **All error logs across all services**:
```
{level="error"}
```
3. **Logs containing specific text**:
```
{service="market-data-gateway"} |= "trade"
```
4. **Count of error logs by service over time**:
```
sum by(service) (count_over_time({level="error"}[5m]))
```
## Testing the Logging Integration
Test the logging integration using Bun:
```cmd
# Run from project root using Bun (current runtime)
bun run tools/test-loki-logging.ts
```
## Architecture
Our logging implementation follows this architecture:
```
┌─────────────────┐      ┌──────────────────┐
│ Trading Services│─────►│ @stock-bot/logger│
└─────────────────┘      │   getLogger()    │
                         └────────┬─────────┘
                                  │
                                  ▼
┌────────────────────────────────────────┐
│                  Loki                  │
└────────────────┬───────────────────────┘
                 │
                 ▼
┌────────────────────────────────────────┐
│                Grafana                 │
└────────────────────────────────────────┘
```
## Adding New Dashboards
To create new Grafana dashboards for log visualization:
1. Build your dashboard in the Grafana UI
2. Export it to JSON
3. Add it to `monitoring/grafana/provisioning/dashboards/json/`
4. Restart the monitoring stack
## Troubleshooting
If logs aren't appearing in Grafana:
1. Run the status check script to verify Loki and Grafana are working:
```cmd
tools\check-loki-status.bat
```
2. Check that Loki and Grafana containers are running:
```cmd
docker ps | findstr "loki grafana"
```
3. Verify .env configuration for Loki host and port:
```cmd
type .env | findstr "LOKI_"
```
4. Ensure your service has the latest @stock-bot/logger package
5. Check for errors in the Loki container logs:
```cmd
docker logs stock-bot-loki
```

@@ -1,212 +0,0 @@
# MongoDB Client Multi-Database Migration Guide
## Overview
Your MongoDB client has been enhanced to support multiple databases dynamically while maintaining full backward compatibility.
## Key Features Added
### 1. **Dynamic Database Switching**
```typescript
// Set default database (all operations will use this unless overridden)
client.setDefaultDatabase('analytics');
// Get current default database
const currentDb = client.getDefaultDatabase(); // Returns: 'analytics'
```
### 2. **Database Parameter in Methods**
All methods now accept an optional `database` parameter:
```typescript
// Old way (still works - uses default database)
await client.batchUpsert('symbols', data, 'symbol');
// New way (specify database explicitly)
await client.batchUpsert('symbols', data, 'symbol', { database: 'stock' });
```
### 3. **Convenience Methods**
Pre-configured methods for common databases:
```typescript
// Stock database operations
await client.batchUpsertStock('symbols', data, 'symbol');
// Analytics database operations
await client.batchUpsertAnalytics('metrics', data, 'metric_name');
// Trading documents database operations
await client.batchUpsertTrading('orders', data, 'order_id');
```
### 4. **Direct Database Access**
```typescript
// Get specific database instances
const stockDb = client.getDatabase('stock');
const analyticsDb = client.getDatabase('analytics');
// Get collections with database override
const collection = client.getCollection('symbols', 'stock');
```
## Migration Steps
### Step 1: No Changes Required (Backward Compatible)
Your existing code continues to work unchanged:
```typescript
// This still works exactly as before
const client = MongoDBClient.getInstance();
await client.connect();
await client.batchUpsert('exchanges', exchangeData, 'exchange_id');
```
### Step 2: Organize Data by Database (Recommended)
Update your data service to use appropriate databases:
```typescript
// In your data service initialization
export class DataService {
private mongoClient = MongoDBClient.getInstance();
async initialize() {
await this.mongoClient.connect();
// Set stock as default for most operations
this.mongoClient.setDefaultDatabase('stock');
}
async saveInteractiveBrokersData(exchanges: any[], symbols: any[]) {
// Stock market data goes to 'stock' database (default)
await this.mongoClient.batchUpsert('exchanges', exchanges, 'exchange_id');
await this.mongoClient.batchUpsert('symbols', symbols, 'symbol');
}
async saveAnalyticsData(performance: any[]) {
// Analytics data goes to 'analytics' database
await this.mongoClient.batchUpsert(
'performance',
performance,
'date',
{ database: 'analytics' }
);
}
}
```
### Step 3: Use Convenience Methods (Optional)
Replace explicit database parameters with convenience methods:
```typescript
// Instead of:
await client.batchUpsert('symbols', data, 'symbol', { database: 'stock' });
// Use:
await client.batchUpsertStock('symbols', data, 'symbol');
```
## Factory Functions
New factory functions are available for easier database management:
```typescript
import {
connectMongoDB,
setDefaultDatabase,
getCurrentDatabase,
getDatabase
} from '@stock-bot/mongodb-client';
// Set default database globally
setDefaultDatabase('analytics');
// Get current default
const current = getCurrentDatabase();
// Get specific database
const stockDb = getDatabase('stock');
```
## Database Recommendations
### Stock Database (`stock`)
- Market data (symbols, exchanges, prices)
- Financial instruments
- Market events
- Real-time data
### Analytics Database (`analytics`)
- Performance metrics
- Calculated indicators
- Reports and dashboards
- Aggregated data
### Trading Documents Database (`trading_documents`)
- Trade orders and executions
- User portfolios
- Transaction logs
- Audit trails
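These recommendations can be captured in a small routing helper so call sites don't hard-code database names. Everything below is an illustrative sketch — `routeCollection` and the collection list are not part of `@stock-bot/mongodb-client`:

```typescript
// Hypothetical helper mapping collections to their recommended database.
type DatabaseName = 'stock' | 'analytics' | 'trading_documents';

const COLLECTION_ROUTES: Record<string, DatabaseName> = {
  // Stock database: market data
  symbols: 'stock',
  exchanges: 'stock',
  prices: 'stock',
  // Analytics database: metrics and reports
  performance: 'analytics',
  metrics: 'analytics',
  // Trading documents database: orders and audit data
  orders: 'trading_documents',
  portfolios: 'trading_documents',
};

function routeCollection(
  collection: string,
  fallback: DatabaseName = 'stock'
): DatabaseName {
  return COLLECTION_ROUTES[collection] ?? fallback;
}
```

A call site could then write `client.batchUpsert('metrics', data, 'metric_name', { database: routeCollection('metrics') })` instead of repeating the database name.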
## Example: Updating Your Data Service
```typescript
// Before (still works)
export class DataService {
async saveExchanges(exchanges: any[]) {
const client = MongoDBClient.getInstance();
await client.batchUpsert('exchanges', exchanges, 'exchange_id');
}
}
// After (recommended)
export class DataService {
private mongoClient = MongoDBClient.getInstance();
async initialize() {
await this.mongoClient.connect();
this.mongoClient.setDefaultDatabase('stock'); // Set appropriate default
}
async saveExchanges(exchanges: any[]) {
// Uses default 'stock' database
await this.mongoClient.batchUpsert('exchanges', exchanges, 'exchange_id');
// Or use convenience method
await this.mongoClient.batchUpsertStock('exchanges', exchanges, 'exchange_id');
}
async savePerformanceMetrics(metrics: any[]) {
// Save to analytics database
await this.mongoClient.batchUpsertAnalytics('metrics', metrics, 'metric_name');
}
}
```
## Testing
Your existing tests continue to work. For new multi-database features:
```typescript
import { MongoDBClient } from '@stock-bot/mongodb-client';
const client = MongoDBClient.getInstance();
await client.connect();
// Test database switching
client.setDefaultDatabase('test_db');
expect(client.getDefaultDatabase()).toBe('test_db');
// Test explicit database parameter
await client.batchUpsert('test_collection', data, 'id', { database: 'other_db' });
```
## Benefits
1. **Organized Data**: Separate databases for different data types
2. **Better Performance**: Smaller, focused databases
3. **Easier Maintenance**: Clear data boundaries
4. **Scalability**: Can scale databases independently
5. **Backward Compatibility**: No breaking changes
## Next Steps
1. Update your data service to use appropriate default database
2. Gradually migrate to using specific databases for different data types
3. Consider using convenience methods for cleaner code
4. Update tests to cover multi-database scenarios

@@ -91,4 +91,4 @@
    "apiKey": "",
    "apiUrl": "https://proxy.webshare.io/api/v2/"
  }
}

Some files were not shown because too many files have changed in this diff.