This commit is contained in:
Boki 2025-06-22 17:55:51 -04:00
parent d858222af7
commit 7d9044ab29
202 changed files with 10755 additions and 10972 deletions

Binary file not shown.


@ -0,0 +1,58 @@
# Code Style and Conventions
## TypeScript Configuration
- **Strict mode enabled**: All strict checks are on
- **Target**: ES2022
- **Module**: ESNext with bundler resolution
- **Path aliases**: `@stock-bot/*` maps to `libs/*/src`
- **Decorators**: Enabled for dependency injection
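An illustrative `tsconfig.json` fragment matching the settings above (option names and the `baseUrl` are assumptions; the repository's actual config may differ):

```jsonc
{
  "compilerOptions": {
    "strict": true,
    "target": "ES2022",
    "module": "ESNext",
    "moduleResolution": "bundler",
    "experimentalDecorators": true,
    "emitDecoratorMetadata": true,
    "baseUrl": ".",
    "paths": {
      "@stock-bot/*": ["libs/*/src"]
    }
  }
}
```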
## Code Style Rules (ESLint)
- **No unused variables**: Error (except prefixed with `_`)
- **No explicit any**: Warning
- **No non-null assertion**: Warning
- **No console**: Warning (except in tests)
- **Prefer const**: Enforced
- **Strict equality**: Always use `===`
- **Curly braces**: Required for all blocks
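A hypothetical function written to these rules (underscore-prefixed unused parameter, strict equality, mandatory braces, `const`):

```typescript
// Illustrative example of the lint rules above in practice.
// `_meta` is intentionally unused, so it carries the `_` prefix.
function pickActive(symbols: string[], _meta?: Record<string, unknown>): string[] {
  const active: string[] = [];
  for (const symbol of symbols) {
    // Curly braces are required even for single-statement blocks,
    // and comparisons always use strict equality.
    if (symbol.length > 0 && symbol === symbol.toUpperCase()) {
      active.push(symbol);
    }
  }
  return active;
}
```

For example, `pickActive(['AAPL', 'msft'])` returns `['AAPL']`.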
## Formatting (Prettier)
- **Semicolons**: Always
- **Single quotes**: Yes
- **Trailing comma**: ES5
- **Print width**: 100 characters
- **Tab width**: 2 spaces
- **Arrow parens**: Avoid when possible
- **End of line**: LF
## Import Order
1. Node built-ins
2. Third-party modules
3. `@stock-bot/*` imports
4. Relative imports (parent directories first)
5. Current directory imports
## Naming Conventions
- **Files**: kebab-case (e.g., `database-setup.ts`)
- **Classes**: PascalCase
- **Functions/Variables**: camelCase
- **Constants**: UPPER_SNAKE_CASE
- **Interfaces/Types**: PascalCase (an `I` or `T` prefix is optional)
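An illustrative file (names are hypothetical) showing these conventions together, in a file that would be named `order-tracker.ts`:

```typescript
// order-tracker.ts — file names are kebab-case.

const MAX_OPEN_ORDERS = 100; // constants: UPPER_SNAKE_CASE

interface OrderSnapshot { // interfaces: PascalCase ('I' prefix optional)
  symbol: string;
  quantity: number;
}

class OrderTracker { // classes: PascalCase
  private readonly openOrders: OrderSnapshot[] = []; // variables: camelCase

  addOrder(order: OrderSnapshot): boolean { // functions/methods: camelCase
    if (this.openOrders.length >= MAX_OPEN_ORDERS) {
      return false;
    }
    this.openOrders.push(order);
    return true;
  }
}
```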
## Library Standards
- **Named exports only**: No default exports
- **Factory patterns**: For complex initialization
- **Singleton pattern**: For global services (config, logger)
- **Direct class exports**: For DI-managed services
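A minimal sketch of the factory and singleton patterns for a global service, using named exports throughout. The `Logger`, `createLogger`, and `getLogger` names are hypothetical stand-ins, not the actual `@stock-bot/logger` API:

```typescript
// Named exports only — no `export default` anywhere in libs/.

interface Logger {
  info(message: string): void;
}

// Factory pattern: complex initialization behind a creation function.
export function createLogger(service: string): Logger {
  return {
    info: (message: string) => console.log(`[${service}] ${message}`),
  };
}

// Singleton pattern: one shared instance per service name,
// lazily created on first access (as used for config/logger).
const loggers = new Map<string, Logger>();

export function getLogger(service: string): Logger {
  const existing = loggers.get(service);
  if (existing !== undefined) {
    return existing;
  }
  const created = createLogger(service);
  loggers.set(service, created);
  return created;
}
```

Repeated calls with the same service name return the same instance, so every module in a service shares one logger.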
## Testing
- **File naming**: `*.test.ts` or `*.spec.ts`
- **Test structure**: Bun's built-in test runner
- **Integration tests**: Use TestContainers for databases
- **Mocking**: Mock external dependencies
## Documentation
- **JSDoc**: For all public APIs
- **README.md**: Required for each library
- **Usage examples**: Include in documentation
- **Error messages**: Descriptive with context
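A sketch of what these documentation rules look like together — JSDoc on a public API plus an error message that carries context (the function itself is a hypothetical example, not project code):

```typescript
/**
 * Computes the simple moving average of the last `period` values.
 *
 * @param values - Price series, oldest first.
 * @param period - Window size; must be a positive integer no larger than `values.length`.
 * @returns The arithmetic mean of the final `period` values.
 * @throws {RangeError} If `period` is invalid — the message includes both
 *   the requested period and the series length, per the error-message rule above.
 */
export function simpleMovingAverage(values: number[], period: number): number {
  if (!Number.isInteger(period) || period <= 0 || period > values.length) {
    throw new RangeError(
      `Invalid period ${period} for series of length ${values.length}`
    );
  }
  const window = values.slice(-period);
  return window.reduce((sum, value) => sum + value, 0) / period;
}
```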


@ -0,0 +1,41 @@
# Current Refactoring Context
## Data Ingestion Service Refactor
The project is currently undergoing a major refactoring to move away from singleton patterns to a dependency injection approach using service containers.
### What's Been Done
- Created connection pool pattern with `ServiceContainer`
- Refactored data-ingestion service to use DI container
- Updated handlers to accept container parameter
- Added proper resource disposal with `ctx.dispose()`
### Migration Status
- QM handler: ✅ Fully migrated to container pattern
- IB handler: ⚠️ Partially migrated (using migration helper)
- Proxy handler: ✅ Updated to accept container
- WebShare handler: ✅ Updated to accept container
### Key Patterns
1. **Service Container**: Central DI container managing all connections
2. **Operation Context**: Provides scoped database access within operations
3. **Factory Pattern**: Connection factories for different databases
4. **Resource Disposal**: Always call `ctx.dispose()` after operations
### Example Pattern
```typescript
const ctx = OperationContext.create('handler', 'operation', { container });
try {
  // Use databases through context
  await ctx.mongodb.insertOne(data);
  await ctx.postgres.query('...');
  return { success: true };
} finally {
  await ctx.dispose(); // Always cleanup
}
```
### Next Steps
- Complete migration of remaining IB operations
- Remove migration helper once complete
- Apply same pattern to other services
- Add monitoring for connection pools


@ -0,0 +1,55 @@
# Stock Bot Trading Platform
## Project Purpose
This is an advanced trading bot platform with a microservices architecture designed for automated stock trading. The system includes:
- Market data ingestion from multiple providers (Yahoo Finance, QuoteMedia, Interactive Brokers, WebShare)
- Data processing and technical indicator calculation
- Trading strategy development and backtesting
- Order execution and risk management
- Portfolio tracking and performance analytics
- Web dashboard for monitoring
## Architecture Overview
The project follows a **microservices architecture** with shared libraries:
### Core Services (apps/)
- **data-ingestion**: Ingests market data from multiple providers
- **data-pipeline**: Processes and transforms data
- **web-api**: REST API service
- **web-app**: React-based dashboard
### Shared Libraries (libs/)
**Core Libraries:**
- config: Environment configuration with Zod validation
- logger: Structured logging with Loki integration
- di: Dependency injection container
- types: Shared TypeScript types
- handlers: Common handler patterns
**Data Libraries:**
- postgres: PostgreSQL client for transactional data
- questdb: Time-series database for market data
- mongodb: Document storage for configurations
**Service Libraries:**
- queue: BullMQ-based job processing
- event-bus: Dragonfly/Redis event bus
- shutdown: Graceful shutdown management
**Utils:**
- Financial calculations and technical indicators
- Date utilities
- Position sizing calculations
## Database Strategy
- **PostgreSQL**: Transactional data (orders, positions, strategies)
- **QuestDB**: Time-series data (OHLCV, indicators, performance metrics)
- **MongoDB**: Document storage (configurations, raw API responses)
- **Dragonfly/Redis**: Event bus and caching layer
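The storage split above can be expressed as a small routing helper (an illustration of the strategy, not actual project code; the data-kind names are assumptions):

```typescript
type DataKind =
  | 'order' | 'position' | 'strategy'   // transactional
  | 'ohlcv' | 'indicator'               // time-series
  | 'config' | 'raw-response'           // documents
  | 'event';                            // bus/cache

type Store = 'postgres' | 'questdb' | 'mongodb' | 'dragonfly';

// Illustrative mapping of data kinds to the database strategy above.
function storeFor(kind: DataKind): Store {
  switch (kind) {
    case 'order':
    case 'position':
    case 'strategy':
      return 'postgres'; // transactional data
    case 'ohlcv':
    case 'indicator':
      return 'questdb'; // time-series market data
    case 'config':
    case 'raw-response':
      return 'mongodb'; // document storage
    case 'event':
      return 'dragonfly'; // event bus and caching
  }
}
```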
## Current Development Phase
Phase 1: Data Foundation Layer (In Progress)
- Enhancing data provider reliability
- Implementing data validation
- Optimizing time-series storage
- Building robust HTTP client with circuit breakers
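The circuit-breaker item above can be sketched as follows. This is a minimal illustration under assumed semantics (consecutive-failure threshold, time-based cooldown); the project's actual HTTP client API is not shown here:

```typescript
// Minimal circuit breaker: after `threshold` consecutive failures the
// circuit opens and calls fail fast until `cooldownMs` has elapsed.
class CircuitBreaker {
  private failures = 0;
  private openedAt = 0;

  constructor(
    private readonly threshold: number,
    private readonly cooldownMs: number
  ) {}

  async call<T>(fn: () => Promise<T>): Promise<T> {
    if (this.openedAt !== 0 && Date.now() - this.openedAt < this.cooldownMs) {
      throw new Error('Circuit open: failing fast');
    }
    try {
      const result = await fn();
      this.failures = 0; // any success resets the breaker
      this.openedAt = 0;
      return result;
    } catch (error) {
      this.failures += 1;
      if (this.failures >= this.threshold) {
        this.openedAt = Date.now();
      }
      throw error;
    }
  }
}
```

Wrapping each provider fetch in a breaker keeps one flaky provider from stalling the whole ingestion pipeline.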


@ -0,0 +1,62 @@
# Project Structure
## Root Directory
```
stock-bot/
├── apps/                   # Microservice applications
│   ├── data-ingestion/     # Market data ingestion service
│   ├── data-pipeline/      # Data processing pipeline
│   ├── web-api/            # REST API service
│   └── web-app/            # React dashboard
├── libs/                   # Shared libraries
│   ├── core/               # Core functionality
│   │   ├── config/         # Configuration management
│   │   ├── logger/         # Logging infrastructure
│   │   ├── di/             # Dependency injection
│   │   ├── types/          # Shared TypeScript types
│   │   └── handlers/       # Common handler patterns
│   ├── data/               # Database clients
│   │   ├── postgres/       # PostgreSQL client
│   │   ├── questdb/        # QuestDB time-series client
│   │   └── mongodb/        # MongoDB document storage
│   ├── services/           # Service utilities
│   │   ├── queue/          # BullMQ job processing
│   │   ├── event-bus/      # Dragonfly event bus
│   │   └── shutdown/       # Graceful shutdown
│   └── utils/              # Utility functions
├── database/               # Database schemas and migrations
├── scripts/                # Build and utility scripts
├── config/                 # Configuration files
├── monitoring/             # Monitoring configurations
├── docs/                   # Documentation
└── test/                   # Global test utilities
```
## Key Files
- `package.json` - Root package configuration
- `turbo.json` - Turbo monorepo configuration
- `tsconfig.json` - TypeScript configuration
- `eslint.config.js` - ESLint rules
- `.prettierrc` - Prettier formatting rules
- `docker-compose.yml` - Infrastructure setup
- `.env` - Environment variables
## Monorepo Structure
- Uses Bun workspaces with Turbo for orchestration
- Each app and library has its own package.json
- Shared dependencies at root level
- Libraries published as `@stock-bot/*` packages
## Service Architecture Pattern
Each service typically follows:
```
service/
├── src/
│   ├── index.ts      # Entry point
│   ├── routes/       # API routes (Hono)
│   ├── handlers/     # Business logic
│   ├── services/     # Service layer
│   └── types/        # Service-specific types
├── test/             # Tests
├── package.json
└── tsconfig.json
```


@ -0,0 +1,73 @@
# Suggested Commands for Development
## Package Management (Bun)
- `bun install` - Install all dependencies
- `bun add <package>` - Add a new dependency
- `bun add -D <package>` - Add a dev dependency
- `bun update` - Update dependencies
## Development
- `bun run dev` - Start all services in development mode (uses Turbo)
- `bun run dev:full` - Start infrastructure + admin tools + dev mode
- `bun run dev:clean` - Reset infrastructure and start fresh
## Building
- `bun run build` - Build all services and libraries
- `bun run build:libs` - Build only shared libraries
- `bun run build:all:clean` - Clean build with cache removal
- `./scripts/build-all.sh` - Custom build script with options
## Testing
- `bun test` - Run all tests
- `bun test --watch` - Run tests in watch mode
- `bun run test:coverage` - Run tests with coverage report
- `bun run test:libs` - Test only shared libraries
- `bun run test:apps` - Test only applications
- `bun test <file>` - Run specific test file
## Code Quality (IMPORTANT - Run before committing!)
- `bun run lint` - Check for linting errors
- `bun run lint:fix` - Auto-fix linting issues
- `bun run format` - Format code with Prettier
- `./scripts/format.sh` - Alternative format script
## Infrastructure Management
- `bun run infra:up` - Start databases (PostgreSQL, QuestDB, MongoDB, Dragonfly)
- `bun run infra:down` - Stop infrastructure
- `bun run infra:reset` - Reset with clean volumes
- `bun run docker:admin` - Start admin GUIs (pgAdmin, Mongo Express, Redis Insight)
- `bun run docker:monitoring` - Start monitoring stack
## Database Operations
- `bun run db:setup-ib` - Set up the Interactive Brokers database schema
- `bun run db:init` - Initialize all database schemas
## Utility Commands
- `bun run clean` - Clean build artifacts
- `bun run clean:all` - Deep clean including node_modules
- `turbo run <task>` - Run task across monorepo
## Git Commands (Linux)
- `git status` - Check current status
- `git add .` - Stage all changes
- `git commit -m "message"` - Commit changes
- `git push` - Push to remote
- `git pull` - Pull from remote
- `git checkout -b <branch>` - Create new branch
## System Commands (Linux)
- `ls -la` - List files with details
- `cd <directory>` - Change directory
- `grep -r "pattern" .` - Search for pattern
- `find . -name "*.ts"` - Find files by pattern
- `which <command>` - Find command location
## MCP Setup (for database access in IDE)
- `./scripts/setup-mcp.sh` - Set up Model Context Protocol servers
- Requires infrastructure to be running first
## Service URLs
- Dashboard: http://localhost:4200
- QuestDB Console: http://localhost:9000
- Grafana: http://localhost:3000
- pgAdmin: http://localhost:8080


@ -0,0 +1,55 @@
# Task Completion Checklist
When you complete any coding task, ALWAYS run these commands in order:
## 1. Code Quality Checks (MANDATORY)
```bash
# Run linting to catch code issues
bun run lint
# If there are errors, fix them automatically
bun run lint:fix
# Format the code
bun run format
```
## 2. Testing (if applicable)
```bash
# Run tests if you modified existing functionality
bun test
# Run specific test file if you added/modified tests
bun test <path-to-test-file>
```
## 3. Build Verification (for significant changes)
```bash
# Build the affected libraries/apps
bun run build:libs # if you changed libraries
bun run build # for full build
```
## 4. Final Verification Steps
- Ensure no TypeScript errors in the IDE
- Check that imports are properly ordered (Prettier should handle this)
- Verify no console.log statements in production code
- Confirm all new code follows the established patterns
## 5. Git Commit Guidelines
- Stage changes: `git add .`
- Write descriptive commit messages
- Reference issue numbers if applicable
- Use conventional commit format when possible:
- `feat:` for new features
- `fix:` for bug fixes
- `refactor:` for code refactoring
- `docs:` for documentation
- `test:` for tests
- `chore:` for maintenance
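The prefix rule above can be checked mechanically. This is an illustrative validator, not a project script; the regex covers only the prefixes listed here:

```typescript
// Illustrative check for the conventional commit prefixes listed above,
// with an optional scope, e.g. "feat(queue): add batch support".
const COMMIT_PATTERN = /^(feat|fix|refactor|docs|test|chore)(\([\w-]+\))?: .+/;

function isConventionalCommit(message: string): boolean {
  return COMMIT_PATTERN.test(message);
}
```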
## Important Notes
- NEVER skip the linting and formatting steps
- The project uses ESLint and Prettier - let them do their job
- If lint errors persist after auto-fix, they need manual attention
- Always test your changes, even if just running the service locally


@ -0,0 +1,49 @@
# Technology Stack
## Runtime & Package Manager
- **Bun**: v1.1.0+ (primary runtime and package manager)
- **Node.js**: v18.0.0+ (compatibility)
- **TypeScript**: v5.8.3
## Core Technologies
- **Turbo**: Monorepo build system
- **ESBuild**: Fast bundling (integrated with Bun)
- **Hono**: Lightweight web framework for services
## Databases
- **PostgreSQL**: Primary transactional database
- **QuestDB**: Time-series database for market data
- **MongoDB**: Document storage
- **Dragonfly**: Redis-compatible cache and event bus
## Queue & Messaging
- **BullMQ**: Job queue processing
- **IORedis**: Redis client for Dragonfly
## Web Technologies
- **React**: Frontend framework (web-app)
- **Angular**: inferred from a `polyfills.ts` reference
- **PrimeNG**: UI component library
- **TailwindCSS**: CSS framework
## Testing
- **Bun Test**: Built-in test runner
- **TestContainers**: Database integration testing
- **Supertest**: API testing
## Monitoring & Observability
- **Loki**: Log aggregation
- **Prometheus**: Metrics collection
- **Grafana**: Visualization dashboards
## Development Tools
- **ESLint**: Code linting
- **Prettier**: Code formatting
- **Docker Compose**: Local infrastructure
- **Model Context Protocol (MCP)**: Database access in IDE
## Key Dependencies
- **Awilix**: Dependency injection container
- **Zod**: Schema validation
- **pg**: PostgreSQL client
- **Playwright**: Browser automation for proxy testing

.serena/project.yml Normal file

@ -0,0 +1,66 @@
# language of the project (csharp, python, rust, java, typescript, javascript, go, cpp, or ruby)
# Special requirements:
# * csharp: Requires the presence of a .sln file in the project folder.
language: typescript
# whether to use the project's gitignore file to ignore files
# Added on 2025-04-07
ignore_all_files_in_gitignore: true
# list of additional paths to ignore
# same syntax as gitignore, so you can use * and **
# Was previously called `ignored_dirs`, please update your config if you are using that.
# Added (renamed) on 2025-04-07
ignored_paths: []
# whether the project is in read-only mode
# If set to true, all editing tools will be disabled and attempts to use them will result in an error
# Added on 2025-04-18
read_only: false
# list of tool names to exclude. We recommend not excluding any tools, see the readme for more details.
# Below is the complete list of tools for convenience.
# To make sure you have the latest list of tools, and to view their descriptions,
# execute `uv run scripts/print_tool_overview.py`.
#
# * `activate_project`: Activates a project by name.
# * `check_onboarding_performed`: Checks whether project onboarding was already performed.
# * `create_text_file`: Creates/overwrites a file in the project directory.
# * `delete_lines`: Deletes a range of lines within a file.
# * `delete_memory`: Deletes a memory from Serena's project-specific memory store.
# * `execute_shell_command`: Executes a shell command.
# * `find_referencing_code_snippets`: Finds code snippets in which the symbol at the given location is referenced.
# * `find_referencing_symbols`: Finds symbols that reference the symbol at the given location (optionally filtered by type).
# * `find_symbol`: Performs a global (or local) search for symbols with/containing a given name/substring (optionally filtered by type).
# * `get_current_config`: Prints the current configuration of the agent, including the active and available projects, tools, contexts, and modes.
# * `get_symbols_overview`: Gets an overview of the top-level symbols defined in a given file or directory.
# * `initial_instructions`: Gets the initial instructions for the current project.
# Should only be used in settings where the system prompt cannot be set,
# e.g. in clients you have no control over, like Claude Desktop.
# * `insert_after_symbol`: Inserts content after the end of the definition of a given symbol.
# * `insert_at_line`: Inserts content at a given line in a file.
# * `insert_before_symbol`: Inserts content before the beginning of the definition of a given symbol.
# * `list_dir`: Lists files and directories in the given directory (optionally with recursion).
# * `list_memories`: Lists memories in Serena's project-specific memory store.
# * `onboarding`: Performs onboarding (identifying the project structure and essential tasks, e.g. for testing or building).
# * `prepare_for_new_conversation`: Provides instructions for preparing for a new conversation (in order to continue with the necessary context).
# * `read_file`: Reads a file within the project directory.
# * `read_memory`: Reads the memory with the given name from Serena's project-specific memory store.
# * `remove_project`: Removes a project from the Serena configuration.
# * `replace_lines`: Replaces a range of lines within a file with new content.
# * `replace_symbol_body`: Replaces the full definition of a symbol.
# * `restart_language_server`: Restarts the language server, may be necessary when edits not through Serena happen.
# * `search_for_pattern`: Performs a search for a pattern in the project.
# * `summarize_changes`: Provides instructions for summarizing the changes made to the codebase.
# * `switch_modes`: Activates modes by providing a list of their names
# * `think_about_collected_information`: Thinking tool for pondering the completeness of collected information.
# * `think_about_task_adherence`: Thinking tool for determining whether the agent is still on track with the current task.
# * `think_about_whether_you_are_done`: Thinking tool for determining whether the task is truly completed.
# * `write_memory`: Writes a named memory (for future reference) to Serena's project-specific memory store.
excluded_tools: []
# initial prompt for the project. It will always be given to the LLM upon activating the project
# (contrary to the memories, which are loaded on demand).
initial_prompt: ""
project_name: "stock-bot"

.vscode/mcp.json vendored

@ -1,21 +1,3 @@
{
  "mcpServers": {
    "postgres": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-postgres",
        "postgresql://trading_user:trading_pass_dev@localhost:5432/trading_bot"
      ]
    },
    "mongodb": {
      "command": "npx",
      "args": [
        "-y",
        "mongodb-mcp-server",
        "--connectionString",
        "mongodb://trading_admin:trading_mongo_dev@localhost:27017/stock?authSource=admin"
      ]
    }
  }
}

CLAUDE.md

@ -1,171 +0,0 @@
# CLAUDE.md
This file provides guidance to Claude Code (claude.ai/code) when working with code in this repository.
## Development Commands
**Package Manager**: Bun (v1.1.0+)
**Build & Development**:
- `bun install` - Install dependencies
- `bun run dev` - Start all services in development mode (uses Turbo)
- `bun run build` - Build all services and libraries
- `bun run build:libs` - Build only shared libraries
- `./scripts/build-all.sh` - Custom build script with options
**Testing**:
- `bun test` - Run all tests
- `bun run test:libs` - Test only shared libraries
- `bun run test:apps` - Test only applications
- `bun run test:coverage` - Run tests with coverage
**Code Quality**:
- `bun run lint` - Lint TypeScript files
- `bun run lint:fix` - Auto-fix linting issues
- `bun run format` - Format code using Prettier
- `./scripts/format.sh` - Format script
**Infrastructure**:
- `bun run infra:up` - Start database infrastructure (PostgreSQL, QuestDB, MongoDB, Dragonfly)
- `bun run infra:down` - Stop infrastructure
- `bun run infra:reset` - Reset infrastructure with clean volumes
- `bun run docker:admin` - Start admin GUIs (pgAdmin, Mongo Express, Redis Insight)
**Database Setup**:
- `bun run db:setup-ib` - Setup Interactive Brokers database schema
- `bun run db:init` - Initialize database schemas
## Architecture Overview
**Microservices Architecture** with shared libraries and multi-database storage:
### Core Services (`apps/`)
- **data-ingestion** - Market data ingestion from multiple providers (Yahoo, QuoteMedia, IB)
- **processing-service** - Data cleaning, validation, and technical indicators
- **strategy-service** - Trading strategies and backtesting (multi-mode: live, event-driven, vectorized, hybrid)
- **execution-service** - Order management and risk controls
- **portfolio-service** - Position tracking and performance analytics
- **web-app** - React dashboard with real-time updates
### Shared Libraries (`libs/`)
- **config** - Environment configuration with Zod validation
- **logger** - Loki-integrated structured logging (use `getLogger()` pattern)
- **http** - HTTP client with proxy support and rate limiting
- **cache** - Redis/Dragonfly caching layer
- **queue** - BullMQ-based job processing with batch support
- **postgres-client** - PostgreSQL operations with transactions
- **questdb-client** - Time-series data storage
- **mongodb-client** - Document storage operations
- **utils** - Financial calculations and technical indicators
### Database Strategy
- **PostgreSQL** - Transactional data (orders, positions, strategies)
- **QuestDB** - Time-series data (OHLCV, indicators, performance metrics)
- **MongoDB** - Document storage (configurations, raw responses)
- **Dragonfly** - Event bus and caching (Redis-compatible)
## Key Patterns & Conventions
**Library Usage**:
- Import from shared libraries: `import { getLogger } from '@stock-bot/logger'`
- Use configuration: `import { databaseConfig } from '@stock-bot/config'`
- Logger pattern: `const logger = getLogger('service-name')`
**Service Structure**:
- Each service has `src/index.ts` as entry point
- Routes in `src/routes/` using Hono framework
- Handlers/services in `src/handlers/` or `src/services/`
- Use dependency injection pattern
**Data Processing**:
- Raw data → QuestDB via handlers
- Processed data → PostgreSQL via processing service
- Event-driven communication via Dragonfly
- Queue-based batch processing for large datasets
**Multi-Mode Backtesting**:
- **Live Mode** - Real-time trading with brokers
- **Event-Driven** - Realistic simulation with market conditions
- **Vectorized** - Fast mathematical backtesting for optimization
- **Hybrid** - Validation by comparing vectorized vs event-driven results
## Development Workflow
1. **Start Infrastructure**: `bun run infra:up`
2. **Build Libraries**: `bun run build:libs`
3. **Start Development**: `bun run dev`
4. **Access UIs**:
- Dashboard: http://localhost:4200
- QuestDB Console: http://localhost:9000
- Grafana: http://localhost:3000
- pgAdmin: http://localhost:8080
## Important Files & Locations
**Configuration**:
- Environment variables in `.env` files
- Service configs in `libs/config/src/`
- Database init scripts in `database/postgres/init/`
**Key Scripts**:
- `scripts/build-all.sh` - Production build with cleanup
- `scripts/docker.sh` - Docker management
- `scripts/format.sh` - Code formatting
- `scripts/setup-mcp.sh` - Setup Model Context Protocol servers for database access
**Documentation**:
- `SIMPLIFIED-ARCHITECTURE.md` - Detailed architecture overview
- `DEVELOPMENT-ROADMAP.md` - Development phases and priorities
- Individual library READMEs in `libs/*/README.md`
## Current Development Phase
**Phase 1: Data Foundation Layer** (In Progress)
- Enhancing data provider reliability and rate limiting
- Implementing data validation and quality metrics
- Optimizing QuestDB storage for time-series data
- Building robust HTTP client with circuit breakers
Focus on data quality and provider fault tolerance before advancing to strategy implementation.
## Testing & Quality
- Use Bun's built-in test runner
- Integration tests with TestContainers for databases
- ESLint for code quality with TypeScript rules
- Prettier for code formatting
- All services should have health check endpoints
## Model Context Protocol (MCP) Setup
**MCP Database Servers** are configured in `.vscode/mcp.json` for direct database access:
- **PostgreSQL MCP Server**: Provides read-only access to PostgreSQL database
- Connection: `postgresql://trading_user:trading_pass_dev@localhost:5432/trading_bot`
- Package: `@modelcontextprotocol/server-postgres`
- **MongoDB MCP Server**: Official MongoDB team server for database and Atlas interaction
- Connection: `mongodb://trading_admin:trading_mongo_dev@localhost:27017/stock?authSource=admin`
- Package: `mongodb-mcp-server` (official MongoDB JavaScript team package)
**Setup Commands**:
- `./scripts/setup-mcp.sh` - Setup and test MCP servers
- `bun run infra:up` - Start database infrastructure (required for MCP)
**Usage**: Once configured, Claude Code can directly query and inspect database schemas and data through natural language commands.
## Environment Variables
Key environment variables (see `.env` example):
- `NODE_ENV` - Environment (development/production)
- `DATA_SERVICE_PORT` - Port for data service
- `DRAGONFLY_HOST/PORT` - Cache/event bus connection
- Database connection strings for PostgreSQL, QuestDB, MongoDB
## Monitoring & Observability
- **Logging**: Structured JSON logs to Loki
- **Metrics**: Prometheus metrics collection
- **Visualization**: Grafana dashboards
- **Queue Monitoring**: Bull Board for job queues
- **Health Checks**: All services expose `/health` endpoints


@ -1,183 +0,0 @@
# Migration Guide: From Singleton to Connection Pool Pattern
## Overview
This guide explains how to migrate from the singleton anti-pattern to a proper connection pool pattern using the new `@stock-bot/connection-factory` library.
## Current State (Singleton Anti-Pattern)
```typescript
// ❌ Old pattern - global singleton
import { connectMongoDB, getMongoDBClient } from '@stock-bot/mongodb-client';
import { connectPostgreSQL, getPostgreSQLClient } from '@stock-bot/postgres-client';
// Initialize once at startup
await connectMongoDB(config);
await connectPostgreSQL(config);
// Use everywhere
const mongo = getMongoDBClient();
const postgres = getPostgreSQLClient();
```
### Problems with this approach:
- Global state makes testing difficult
- All operations share the same connection pool
- Can't optimize pool sizes for different use cases
- Memory leaks from persistent connections
- Hard to implement graceful shutdown
## New Pattern (Connection Factory + Service Container)
### Step 1: Set up Connection Factory
```typescript
// ✅ New pattern - connection factory
import { setupServiceContainer } from './setup/database-setup';
// Initialize service container at startup
const container = await setupServiceContainer();
// Register cleanup
shutdown.register(async () => {
await container.dispose();
});
```
### Step 2: Update Handlers to Use Container
```typescript
// ✅ Use OperationContext with container
export class MyHandler {
constructor(private readonly container: ServiceContainer) {}
async handleOperation(data: any) {
const context = OperationContext.create('my-handler', 'operation', {
container: this.container
});
try {
// Connections are managed by the container
await context.mongodb.insertOne(data);
await context.postgres.query('...');
await context.cache.set('key', 'value');
} finally {
// Clean up resources
await context.dispose();
}
}
}
```
### Step 3: Update Route Handlers
```typescript
// Pass container to route handlers
export function createRoutes(container: ServiceContainer) {
const router = new Hono();
const handler = new MyHandler(container);
router.get('/data', async (c) => {
const result = await handler.handleOperation(c.req.query());
return c.json(result);
});
return router;
}
```
## Migration Checklist
### For Each Service:
1. **Create database setup module**
```typescript
// apps/[service-name]/src/setup/database-setup.ts
export async function setupServiceContainer(): Promise<ServiceContainer> {
// Configure connection pools based on service needs
}
```
2. **Update main index.ts**
- Remove direct `connectMongoDB()` and `connectPostgreSQL()` calls
- Replace with `setupServiceContainer()`
- Pass container to route handlers and job processors
3. **Update handlers**
- Accept `ServiceContainer` in constructor
- Create `OperationContext` with container
- Remove direct database client imports
- Add `context.dispose()` in finally blocks
4. **Update job handlers**
```typescript
// Before
export async function myJobHandler(job: Job) {
const mongo = getMongoDBClient();
// ...
}
// After
export function createMyJobHandler(container: ServiceContainer) {
return async (job: Job) => {
const context = OperationContext.create('job', job.name, {
container
});
try {
// Use context.mongodb, context.postgres, etc.
} finally {
await context.dispose();
}
};
}
```
## Pool Size Recommendations
The `PoolSizeCalculator` provides optimal pool sizes based on service type:
| Service | Min | Max | Use Case |
|---------|-----|-----|----------|
| data-ingestion | 5 | 50 | High-volume batch imports |
| data-pipeline | 3 | 30 | Data processing pipelines |
| web-api | 2 | 10 | Low-latency API requests |
| processing-service | 2 | 20 | CPU-intensive operations |
| portfolio-service | 2 | 15 | Portfolio calculations |
| strategy-service | 3 | 25 | Strategy backtesting |
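The table could be backed by a simple lookup. This is a sketch only — the actual `PoolSizeCalculator` API is not shown in this guide, and the fallback bounds are an assumption:

```typescript
interface PoolBounds {
  min: number;
  max: number;
}

// Sketch of a pool-size lookup matching the recommendations table above.
const POOL_SIZES: Record<string, PoolBounds> = {
  'data-ingestion': { min: 5, max: 50 },
  'data-pipeline': { min: 3, max: 30 },
  'web-api': { min: 2, max: 10 },
  'processing-service': { min: 2, max: 20 },
  'portfolio-service': { min: 2, max: 15 },
  'strategy-service': { min: 3, max: 25 },
};

function poolSizeFor(service: string): PoolBounds {
  // Fall back to conservative bounds for services not in the table.
  return POOL_SIZES[service] ?? { min: 2, max: 10 };
}
```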
## Benefits After Migration
1. **Better Resource Management**
- Each service gets appropriately sized connection pools
- Automatic cleanup with dispose pattern
- No more connection leaks
2. **Improved Testing**
- Easy to mock containers for tests
- No global state to reset between tests
- Can test with different configurations
3. **Enhanced Performance**
- Optimized pool sizes per service
- Isolated pools for heavy operations
- Better connection reuse
4. **Operational Benefits**
- Connection pool metrics per service
- Graceful shutdown handling
- Better error isolation
## Backward Compatibility
The `OperationContext` maintains backward compatibility:
- If no container is provided, it falls back to singleton pattern
- This allows gradual migration service by service
- Warning logs indicate when fallback is used
## Example: Complete Service Migration
See `/apps/data-ingestion/src/handlers/example-handler.ts` for a complete example of:
- Using the service container
- Creating operation contexts
- Handling batch operations with scoped containers
- Proper resource cleanup


@ -94,4 +94,4 @@
"burstSize": 20
}
}
}
}


@ -1,3 +1,3 @@
export { updateCeoChannels } from './update-ceo-channels.action';
export { updateUniqueSymbols } from './update-unique-symbols.action';
export { processIndividualSymbol } from './process-individual-symbol.action';


@ -1,111 +1,117 @@
import { getRandomUserAgent } from '@stock-bot/utils';
import type { CeoHandler } from '../ceo.handler';
export async function processIndividualSymbol(this: CeoHandler, payload: any, _context: any): Promise<unknown> {
const { ceoId, symbol, timestamp } = payload;
const proxy = this.proxy?.getProxy();
if(!proxy) {
this.logger.warn('No proxy available for processing individual CEO symbol');
return;
}
this.logger.debug('Processing individual CEO symbol', {
ceoId,
timestamp,
});
try {
// Fetch detailed information for the individual symbol
const response = await this.http.get(`https://api.ceo.ca/api/get_spiels?channel=${ceoId}&load_more=top`
+ (timestamp ? `&until=${timestamp}` : ''),
{
proxy: proxy,
headers: {
'User-Agent': getRandomUserAgent()
}
});
if (!response.ok) {
throw new Error(`Failed to fetch details for ceoId ${ceoId}: ${response.statusText}`);
}
const data = await response.json();
const spielCount = data.spiels.length;
if(spielCount === 0) {
this.logger.warn(`No spiels found for ceoId ${ceoId}`);
return null; // No data to process
}
const latestSpielTime = data.spiels[0]?.timestamp;
const posts = data.spiels.map((spiel: any) => ({
ceoId,
spiel: spiel.spiel,
spielReplyToId: spiel.spiel_reply_to_id,
spielReplyTo: spiel.spiel_reply_to,
spielReplyToName: spiel.spiel_reply_to_name,
spielReplyToEdited: spiel.spiel_reply_to_edited,
userId: spiel.user_id,
name: spiel.name,
timestamp: spiel.timestamp,
spielId: spiel.spiel_id,
color: spiel.color,
parentId: spiel.parent_id,
publicId: spiel.public_id,
parentChannel: spiel.parent_channel,
parentTimestamp: spiel.parent_timestamp,
votes: spiel.votes,
editable: spiel.editable,
edited: spiel.edited,
featured: spiel.featured,
verified: spiel.verified,
fake: spiel.fake,
bot: spiel.bot,
voted: spiel.voted,
flagged: spiel.flagged,
ownSpiel: spiel.own_spiel,
score: spiel.score,
savedId: spiel.saved_id,
savedTimestamp: spiel.saved_timestamp,
poll: spiel.poll,
votedInPoll: spiel.voted_in_poll
}));
await this.mongodb.batchUpsert('ceoPosts', posts, ['spielId']);
this.logger.info(`Fetched ${spielCount} spiels for ceoId ${ceoId}`);
// Update Shorts
const shortRes = await this.http.get(`https://api.ceo.ca/api/short_positions/one?symbol=${symbol}`,
{
proxy: proxy,
headers: {
'User-Agent': getRandomUserAgent()
}
});
if (shortRes.ok) {
const shortData = await shortRes.json();
if(shortData && shortData.positions) {
await this.mongodb.batchUpsert('ceoShorts', shortData.positions, ['id']);
}
await this.scheduleOperation('process-individual-symbol', {
ceoId: ceoId,
symbol,
timestamp: latestSpielTime
});
}
this.logger.info(`Successfully processed channel ${ceoId}; latest spiel timestamp ${latestSpielTime}`);
return { ceoId, spielCount, timestamp };
} catch (error) {
this.logger.error('Failed to process individual symbol', {
error,
ceoId,
timestamp
});
throw error;
}
}
import { getRandomUserAgent } from '@stock-bot/utils';
import type { CeoHandler } from '../ceo.handler';
export async function processIndividualSymbol(
this: CeoHandler,
payload: any,
_context: any
): Promise<unknown> {
const { ceoId, symbol, timestamp } = payload;
const proxy = this.proxy?.getProxy();
if (!proxy) {
this.logger.warn('No proxy available for processing individual CEO symbol');
return;
}
this.logger.debug('Processing individual CEO symbol', {
ceoId,
timestamp,
});
try {
// Fetch detailed information for the individual symbol
const response = await this.http.get(
`https://api.ceo.ca/api/get_spiels?channel=${ceoId}&load_more=top` +
(timestamp ? `&until=${timestamp}` : ''),
{
proxy: proxy,
headers: {
'User-Agent': getRandomUserAgent(),
},
}
);
if (!response.ok) {
throw new Error(`Failed to fetch details for ceoId ${ceoId}: ${response.statusText}`);
}
const data = await response.json();
const spielCount = data.spiels.length;
if (spielCount === 0) {
this.logger.warn(`No spiels found for ceoId ${ceoId}`);
return null; // No data to process
}
const latestSpielTime = data.spiels[0]?.timestamp;
const posts = data.spiels.map((spiel: any) => ({
ceoId,
spiel: spiel.spiel,
spielReplyToId: spiel.spiel_reply_to_id,
spielReplyTo: spiel.spiel_reply_to,
spielReplyToName: spiel.spiel_reply_to_name,
spielReplyToEdited: spiel.spiel_reply_to_edited,
userId: spiel.user_id,
name: spiel.name,
timestamp: spiel.timestamp,
spielId: spiel.spiel_id,
color: spiel.color,
parentId: spiel.parent_id,
publicId: spiel.public_id,
parentChannel: spiel.parent_channel,
parentTimestamp: spiel.parent_timestamp,
votes: spiel.votes,
editable: spiel.editable,
edited: spiel.edited,
featured: spiel.featured,
verified: spiel.verified,
fake: spiel.fake,
bot: spiel.bot,
voted: spiel.voted,
flagged: spiel.flagged,
ownSpiel: spiel.own_spiel,
score: spiel.score,
savedId: spiel.saved_id,
savedTimestamp: spiel.saved_timestamp,
poll: spiel.poll,
votedInPoll: spiel.voted_in_poll,
}));
await this.mongodb.batchUpsert('ceoPosts', posts, ['spielId']);
this.logger.info(`Fetched ${spielCount} spiels for ceoId ${ceoId}`);
// Update Shorts
const shortRes = await this.http.get(
`https://api.ceo.ca/api/short_positions/one?symbol=${symbol}`,
{
proxy: proxy,
headers: {
'User-Agent': getRandomUserAgent(),
},
}
);
if (shortRes.ok) {
const shortData = await shortRes.json();
if (shortData && shortData.positions) {
await this.mongodb.batchUpsert('ceoShorts', shortData.positions, ['id']);
}
      await this.scheduleOperation('process-individual-symbol', {
        ceoId: ceoId,
        symbol,
        timestamp: latestSpielTime,
      });
}
    this.logger.info(
      `Successfully processed channel ${ceoId}; latest spiel timestamp ${latestSpielTime}`
    );
return { ceoId, spielCount, timestamp };
} catch (error) {
this.logger.error('Failed to process individual symbol', {
error,
ceoId,
timestamp,
});
throw error;
}
}
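The action above renames each spiel's snake_case API fields to camelCase by hand. The same key transform could be expressed generically; this is a sketch with hypothetical data, not code from the repo:

```typescript
// Hypothetical helper: rename snake_case keys from an API payload to
// camelCase, the convention the handler's documents use.
function toCamel(obj: Record<string, unknown>): Record<string, unknown> {
  const out: Record<string, unknown> = {};
  for (const [key, value] of Object.entries(obj)) {
    // spiel_id -> spielId, own_spiel -> ownSpiel, etc.
    out[key.replace(/_([a-z])/g, (_, c: string) => c.toUpperCase())] = value;
  }
  return out;
}

console.log(toCamel({ spiel_id: 7, own_spiel: false }));
// → { spielId: 7, ownSpiel: false }
```

Note that the explicit field list in the action also acts as a whitelist of stored fields, which a blanket converter would not; the sketch only illustrates the key transform.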


@@ -1,67 +1,72 @@
import { getRandomUserAgent } from '@stock-bot/utils';
import type { CeoHandler } from '../ceo.handler';
export async function updateCeoChannels(this: CeoHandler, payload: number | undefined): Promise<unknown> {
const proxy = this.proxy?.getProxy();
if(!proxy) {
this.logger.warn('No proxy available for CEO channels update');
return;
}
let page;
if(payload === undefined) {
page = 1
}else{
page = payload;
}
this.logger.info(`Fetching CEO channels for page ${page} with proxy ${proxy}`);
const response = await this.http.get('https://api.ceo.ca/api/home?exchange=all&sort_by=symbol&sector=All&tab=companies&page='+page, {
proxy: proxy,
headers: {
'User-Agent': getRandomUserAgent()
}
})
const results = await response.json();
const channels = results.channel_categories[0].channels;
const totalChannels = results.channel_categories[0].total_channels;
const totalPages = Math.ceil(totalChannels / channels.length);
const exchanges: {exchange: string, countryCode: string}[] = []
const symbols = channels.map((channel: any) =>{
// check if exchange is in the exchanges array object
if(!exchanges.find((e: any) => e.exchange === channel.exchange)) {
exchanges.push({
exchange: channel.exchange,
countryCode: 'CA'
});
}
const details = channel.company_details || {};
return {
symbol: channel.symbol,
exchange: channel.exchange,
name: channel.title,
type: channel.type,
ceoId: channel.channel,
marketCap: details.market_cap,
volumeRatio: details.volume_ratio,
avgVolume: details.avg_volume,
stockType: details.stock_type,
issueType: details.issue_type,
sharesOutstanding: details.shares_outstanding,
float: details.float,
}
})
await this.mongodb.batchUpsert('ceoSymbols', symbols, ['symbol', 'exchange']);
await this.mongodb.batchUpsert('ceoExchanges', exchanges, ['exchange']);
if(page === 1) {
for( let i = 2; i <= totalPages; i++) {
this.logger.info(`Scheduling page ${i} of ${totalPages} for CEO channels`);
await this.scheduleOperation('update-ceo-channels', i)
}
}
this.logger.info(`Fetched CEO channels for page ${page}/${totalPages}`);
return { page, totalPages };
}
import { getRandomUserAgent } from '@stock-bot/utils';
import type { CeoHandler } from '../ceo.handler';
export async function updateCeoChannels(
this: CeoHandler,
payload: number | undefined
): Promise<unknown> {
const proxy = this.proxy?.getProxy();
if (!proxy) {
this.logger.warn('No proxy available for CEO channels update');
return;
}
let page;
if (payload === undefined) {
page = 1;
} else {
page = payload;
}
this.logger.info(`Fetching CEO channels for page ${page} with proxy ${proxy}`);
const response = await this.http.get(
'https://api.ceo.ca/api/home?exchange=all&sort_by=symbol&sector=All&tab=companies&page=' + page,
{
proxy: proxy,
headers: {
'User-Agent': getRandomUserAgent(),
},
}
);
const results = await response.json();
const channels = results.channel_categories[0].channels;
const totalChannels = results.channel_categories[0].total_channels;
const totalPages = Math.ceil(totalChannels / channels.length);
const exchanges: { exchange: string; countryCode: string }[] = [];
const symbols = channels.map((channel: any) => {
// check if exchange is in the exchanges array object
if (!exchanges.find((e: any) => e.exchange === channel.exchange)) {
exchanges.push({
exchange: channel.exchange,
countryCode: 'CA',
});
}
const details = channel.company_details || {};
return {
symbol: channel.symbol,
exchange: channel.exchange,
name: channel.title,
type: channel.type,
ceoId: channel.channel,
marketCap: details.market_cap,
volumeRatio: details.volume_ratio,
avgVolume: details.avg_volume,
stockType: details.stock_type,
issueType: details.issue_type,
sharesOutstanding: details.shares_outstanding,
float: details.float,
};
});
await this.mongodb.batchUpsert('ceoSymbols', symbols, ['symbol', 'exchange']);
await this.mongodb.batchUpsert('ceoExchanges', exchanges, ['exchange']);
if (page === 1) {
for (let i = 2; i <= totalPages; i++) {
this.logger.info(`Scheduling page ${i} of ${totalPages} for CEO channels`);
await this.scheduleOperation('update-ceo-channels', i);
}
}
this.logger.info(`Fetched CEO channels for page ${page}/${totalPages}`);
return { page, totalPages };
}
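The page-1 branch above derives `totalPages` from the first response and fans out one scheduled job per remaining page. That arithmetic can be sketched standalone; the zero/negative page-size guard is an added assumption, not in the original code:

```typescript
// Fan-out sketch: page 1 derives the total page count from the first
// response and schedules each remaining page exactly once.
function pagesToSchedule(totalChannels: number, pageSize: number): number[] {
  if (pageSize <= 0) return []; // guard added as an assumption
  const totalPages = Math.ceil(totalChannels / pageSize);
  const pages: number[] = [];
  for (let i = 2; i <= totalPages; i++) {
    pages.push(i);
  }
  return pages;
}

console.log(pagesToSchedule(95, 20)); // → [ 2, 3, 4, 5 ]
```

When the first page is also the last (`totalChannels <= pageSize`), the list is empty and no follow-up jobs are queued, matching the handler's behavior.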


@@ -1,63 +1,71 @@
import type { CeoHandler } from '../ceo.handler';
export async function updateUniqueSymbols(this: CeoHandler, _payload: unknown, _context: any): Promise<unknown> {
this.logger.info('Starting update to get unique CEO symbols by ceoId');
try {
// Get unique ceoId values from the ceoSymbols collection
const uniqueCeoIds = await this.mongodb.collection('ceoSymbols').distinct('ceoId');
this.logger.info(`Found ${uniqueCeoIds.length} unique CEO IDs`);
// Get detailed records for each unique ceoId (latest/first record)
const uniqueSymbols = [];
for (const ceoId of uniqueCeoIds) {
const symbol = await this.mongodb.collection('ceoSymbols')
.findOne({ ceoId }, { sort: { _id: -1 } }); // Get latest record
if (symbol) {
uniqueSymbols.push(symbol);
}
}
this.logger.info(`Retrieved ${uniqueSymbols.length} unique symbol records`);
// Schedule individual jobs for each unique symbol
let scheduledJobs = 0;
for (const symbol of uniqueSymbols) {
// Schedule a job to process this individual symbol
await this.scheduleOperation('process-individual-symbol', {
ceoId: symbol.ceoId,
symbol: symbol.symbol,
});
scheduledJobs++;
// Add small delay to avoid overwhelming the queue
if (scheduledJobs % 10 === 0) {
this.logger.debug(`Scheduled ${scheduledJobs} jobs so far`);
}
}
this.logger.info(`Successfully scheduled ${scheduledJobs} individual symbol update jobs`);
// Cache the results for monitoring
await this.cacheSet('unique-symbols-last-run', {
timestamp: new Date().toISOString(),
totalUniqueIds: uniqueCeoIds.length,
totalRecords: uniqueSymbols.length,
scheduledJobs
}, 1800); // Cache for 30 minutes
return {
success: true,
uniqueCeoIds: uniqueCeoIds.length,
uniqueRecords: uniqueSymbols.length,
scheduledJobs,
timestamp: new Date().toISOString()
};
} catch (error) {
this.logger.error('Failed to update unique CEO symbols', { error });
throw error;
}
}
import type { CeoHandler } from '../ceo.handler';
export async function updateUniqueSymbols(
this: CeoHandler,
_payload: unknown,
_context: any
): Promise<unknown> {
this.logger.info('Starting update to get unique CEO symbols by ceoId');
try {
// Get unique ceoId values from the ceoSymbols collection
const uniqueCeoIds = await this.mongodb.collection('ceoSymbols').distinct('ceoId');
this.logger.info(`Found ${uniqueCeoIds.length} unique CEO IDs`);
// Get detailed records for each unique ceoId (latest/first record)
const uniqueSymbols = [];
for (const ceoId of uniqueCeoIds) {
const symbol = await this.mongodb
.collection('ceoSymbols')
.findOne({ ceoId }, { sort: { _id: -1 } }); // Get latest record
if (symbol) {
uniqueSymbols.push(symbol);
}
}
this.logger.info(`Retrieved ${uniqueSymbols.length} unique symbol records`);
// Schedule individual jobs for each unique symbol
let scheduledJobs = 0;
for (const symbol of uniqueSymbols) {
// Schedule a job to process this individual symbol
await this.scheduleOperation('process-individual-symbol', {
ceoId: symbol.ceoId,
symbol: symbol.symbol,
});
scheduledJobs++;
      // Log progress every 10 scheduled jobs
if (scheduledJobs % 10 === 0) {
this.logger.debug(`Scheduled ${scheduledJobs} jobs so far`);
}
}
this.logger.info(`Successfully scheduled ${scheduledJobs} individual symbol update jobs`);
// Cache the results for monitoring
await this.cacheSet(
'unique-symbols-last-run',
{
timestamp: new Date().toISOString(),
totalUniqueIds: uniqueCeoIds.length,
totalRecords: uniqueSymbols.length,
scheduledJobs,
},
1800
); // Cache for 30 minutes
return {
success: true,
uniqueCeoIds: uniqueCeoIds.length,
uniqueRecords: uniqueSymbols.length,
scheduledJobs,
timestamp: new Date().toISOString(),
};
} catch (error) {
this.logger.error('Failed to update unique CEO symbols', { error });
throw error;
}
}


@@ -3,13 +3,9 @@ import {
Handler,
Operation,
ScheduledOperation,
type IServiceContainer
type IServiceContainer,
} from '@stock-bot/handlers';
import {
processIndividualSymbol,
updateCeoChannels,
updateUniqueSymbols
} from './actions';
import { processIndividualSymbol, updateCeoChannels, updateUniqueSymbols } from './actions';
@Handler('ceo')
// @Disabled()
@@ -18,21 +14,21 @@ export class CeoHandler extends BaseHandler {
super(services); // Handler name read from @Handler decorator
}
@ScheduledOperation('update-ceo-channels', '0 */15 * * *', {
priority: 7,
immediately: false,
description: 'Get all CEO symbols and exchanges'
@ScheduledOperation('update-ceo-channels', '0 */15 * * *', {
priority: 7,
immediately: false,
description: 'Get all CEO symbols and exchanges',
})
updateCeoChannels = updateCeoChannels;
@Operation('update-unique-symbols')
@ScheduledOperation('process-unique-symbols', '0 0 1 * *', {
priority: 5,
immediately: false,
description: 'Process unique CEO symbols and schedule individual jobs'
@ScheduledOperation('process-unique-symbols', '0 0 1 * *', {
priority: 5,
immediately: false,
description: 'Process unique CEO symbols and schedule individual jobs',
})
updateUniqueSymbols = updateUniqueSymbols;
@Operation('process-individual-symbol')
processIndividualSymbol = processIndividualSymbol;
}
}
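The field assignments above (e.g. `updateCeoChannels = updateCeoChannels;`) rely on TypeScript's `this` parameter typing: each action is a standalone function declaring `this: CeoHandler`, and assigning it to a class field makes it run with the handler instance as its receiver. A minimal self-contained sketch of the pattern, with hypothetical names:

```typescript
type Logger = { info: (msg: string) => void };

class BaseHandler {
  logger: Logger = { info: msg => console.log(msg) };
}

// Action defined as a standalone function, typed against the handler
// via a `this` parameter (as the action files above do).
function updateChannels(this: CeoHandler, page: number): string {
  this.logger.info(`fetching page ${page}`);
  return `page-${page}`;
}

class CeoHandler extends BaseHandler {
  // Assigning the function to a class field means calls through an
  // instance receive that instance as `this`.
  updateChannels = updateChannels;
}

const handler = new CeoHandler();
console.log(handler.updateChannels(1)); // logs "fetching page 1", then "page-1"
```

Because the `this` type is checked at the call site, invoking the action detached from a handler instance is a compile-time error, which keeps the action files honest about their dependencies.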


@@ -1,114 +1,107 @@
import { OperationContext } from '@stock-bot/di';
import type { ServiceContainer } from '@stock-bot/di';
/**
* Example handler showing how to use the new connection pooling pattern
*/
export class ExampleHandler {
constructor(private readonly container: ServiceContainer) {}
/**
* Example operation using the enhanced OperationContext
*/
async performOperation(data: any): Promise<void> {
// Create operation context with container
const context = new OperationContext(
'example-handler',
'perform-operation',
this.container,
{ data }
);
try {
// Log operation start
context.logger.info('Starting operation', { data });
// Use MongoDB through service resolution
const mongodb = context.resolve<any>('mongodb');
const result = await mongodb.collection('test').insertOne(data);
context.logger.debug('MongoDB insert complete', { insertedId: result.insertedId });
// Use PostgreSQL through service resolution
const postgres = context.resolve<any>('postgres');
await postgres.query(
'INSERT INTO operations (id, status) VALUES ($1, $2)',
[result.insertedId, 'completed']
);
// Use cache through service resolution
const cache = context.resolve<any>('cache');
await cache.set(`operation:${result.insertedId}`, {
status: 'completed',
timestamp: new Date()
});
context.logger.info('Operation completed successfully');
} catch (error) {
context.logger.error('Operation failed', { error });
throw error;
}
}
/**
* Example of batch operation with isolated connection pool
*/
async performBatchOperation(items: any[]): Promise<void> {
// Create a scoped container for this batch operation
const scopedContainer = this.container.createScope();
const context = new OperationContext(
'example-handler',
'batch-operation',
scopedContainer,
{ itemCount: items.length }
);
try {
context.logger.info('Starting batch operation', { itemCount: items.length });
// Get services once for the batch
const mongodb = context.resolve<any>('mongodb');
const cache = context.resolve<any>('cache');
// Process items in parallel
const promises = items.map(async (item, index) => {
const itemContext = new OperationContext(
'example-handler',
`batch-item-${index}`,
scopedContainer,
{ item }
);
try {
await mongodb.collection('batch').insertOne(item);
await cache.set(`batch:${item.id}`, item);
} catch (error) {
itemContext.logger.error('Batch item failed', { error, itemIndex: index });
throw error;
}
});
await Promise.all(promises);
context.logger.info('Batch operation completed');
} finally {
// Clean up scoped resources
await scopedContainer.dispose();
}
}
}
/**
* Example of how to use in a job handler
*/
export async function createExampleJobHandler(container: ServiceContainer) {
return async (job: any) => {
const handler = new ExampleHandler(container);
if (job.data.type === 'batch') {
await handler.performBatchOperation(job.data.items);
} else {
await handler.performOperation(job.data);
}
};
}
import { OperationContext } from '@stock-bot/di';
import type { ServiceContainer } from '@stock-bot/di';
/**
* Example handler showing how to use the new connection pooling pattern
*/
export class ExampleHandler {
constructor(private readonly container: ServiceContainer) {}
/**
* Example operation using the enhanced OperationContext
*/
async performOperation(data: any): Promise<void> {
// Create operation context with container
const context = new OperationContext('example-handler', 'perform-operation', this.container, {
data,
});
try {
// Log operation start
context.logger.info('Starting operation', { data });
// Use MongoDB through service resolution
const mongodb = context.resolve<any>('mongodb');
const result = await mongodb.collection('test').insertOne(data);
context.logger.debug('MongoDB insert complete', { insertedId: result.insertedId });
// Use PostgreSQL through service resolution
const postgres = context.resolve<any>('postgres');
await postgres.query('INSERT INTO operations (id, status) VALUES ($1, $2)', [
result.insertedId,
'completed',
]);
// Use cache through service resolution
const cache = context.resolve<any>('cache');
await cache.set(`operation:${result.insertedId}`, {
status: 'completed',
timestamp: new Date(),
});
context.logger.info('Operation completed successfully');
} catch (error) {
context.logger.error('Operation failed', { error });
throw error;
}
}
/**
* Example of batch operation with isolated connection pool
*/
async performBatchOperation(items: any[]): Promise<void> {
// Create a scoped container for this batch operation
const scopedContainer = this.container.createScope();
const context = new OperationContext('example-handler', 'batch-operation', scopedContainer, {
itemCount: items.length,
});
try {
context.logger.info('Starting batch operation', { itemCount: items.length });
// Get services once for the batch
const mongodb = context.resolve<any>('mongodb');
const cache = context.resolve<any>('cache');
// Process items in parallel
const promises = items.map(async (item, index) => {
const itemContext = new OperationContext(
'example-handler',
`batch-item-${index}`,
scopedContainer,
{ item }
);
try {
await mongodb.collection('batch').insertOne(item);
await cache.set(`batch:${item.id}`, item);
} catch (error) {
itemContext.logger.error('Batch item failed', { error, itemIndex: index });
throw error;
}
});
await Promise.all(promises);
context.logger.info('Batch operation completed');
} finally {
// Clean up scoped resources
await scopedContainer.dispose();
}
}
}
/**
* Example of how to use in a job handler
*/
export async function createExampleJobHandler(container: ServiceContainer) {
return async (job: any) => {
const handler = new ExampleHandler(container);
if (job.data.type === 'batch') {
await handler.performBatchOperation(job.data.items);
} else {
await handler.performOperation(job.data);
}
};
}


@@ -1,94 +1,94 @@
/**
* Example Handler - Demonstrates ergonomic handler patterns
* Shows inline operations, service helpers, and scheduled operations
*/
import {
BaseHandler,
Handler,
Operation,
ScheduledOperation,
type ExecutionContext,
type IServiceContainer
} from '@stock-bot/handlers';
@Handler('example')
export class ExampleHandler extends BaseHandler {
constructor(services: IServiceContainer) {
super(services);
}
/**
* Simple inline operation - no separate action file needed
*/
@Operation('get-stats')
async getStats(): Promise<{ total: number; active: number; cached: boolean }> {
// Use collection helper for cleaner MongoDB access
const total = await this.collection('items').countDocuments();
const active = await this.collection('items').countDocuments({ status: 'active' });
// Use cache helpers with automatic prefixing
const cached = await this.cacheGet<number>('last-total');
await this.cacheSet('last-total', total, 300); // 5 minutes
// Use log helper with automatic handler context
this.log('info', 'Stats retrieved', { total, active });
return { total, active, cached: cached !== null };
}
/**
* Scheduled operation using combined decorator
*/
@ScheduledOperation('cleanup-old-items', '0 2 * * *', {
priority: 5,
description: 'Clean up items older than 30 days'
})
async cleanupOldItems(): Promise<{ deleted: number }> {
const thirtyDaysAgo = new Date(Date.now() - 30 * 24 * 60 * 60 * 1000);
const result = await this.collection('items').deleteMany({
createdAt: { $lt: thirtyDaysAgo }
});
this.log('info', 'Cleanup completed', { deleted: result.deletedCount });
// Schedule a follow-up task
await this.scheduleIn('generate-report', { type: 'cleanup' }, 60); // 1 minute
return { deleted: result.deletedCount };
}
/**
* Operation that uses proxy service
*/
@Operation('fetch-external-data')
async fetchExternalData(input: { url: string }): Promise<{ data: any }> {
const proxyUrl = this.proxy.getProxy();
if (!proxyUrl) {
throw new Error('No proxy available');
}
// Use HTTP client with proxy
const response = await this.http.get(input.url, {
proxy: proxyUrl,
timeout: 10000
});
// Cache the result
await this.cacheSet(`external:${input.url}`, response.data, 3600);
return { data: response.data };
}
/**
* Complex operation that still uses action file
*/
@Operation('process-batch')
async processBatch(input: any, _context: ExecutionContext): Promise<unknown> {
// For complex operations, still use action files
const { processBatch } = await import('./actions/batch.action');
return processBatch(this, input);
}
}
/**
* Example Handler - Demonstrates ergonomic handler patterns
* Shows inline operations, service helpers, and scheduled operations
*/
import {
BaseHandler,
Handler,
Operation,
ScheduledOperation,
type ExecutionContext,
type IServiceContainer,
} from '@stock-bot/handlers';
@Handler('example')
export class ExampleHandler extends BaseHandler {
constructor(services: IServiceContainer) {
super(services);
}
/**
* Simple inline operation - no separate action file needed
*/
@Operation('get-stats')
async getStats(): Promise<{ total: number; active: number; cached: boolean }> {
// Use collection helper for cleaner MongoDB access
const total = await this.collection('items').countDocuments();
const active = await this.collection('items').countDocuments({ status: 'active' });
// Use cache helpers with automatic prefixing
const cached = await this.cacheGet<number>('last-total');
await this.cacheSet('last-total', total, 300); // 5 minutes
// Use log helper with automatic handler context
this.log('info', 'Stats retrieved', { total, active });
return { total, active, cached: cached !== null };
}
/**
* Scheduled operation using combined decorator
*/
@ScheduledOperation('cleanup-old-items', '0 2 * * *', {
priority: 5,
description: 'Clean up items older than 30 days',
})
async cleanupOldItems(): Promise<{ deleted: number }> {
const thirtyDaysAgo = new Date(Date.now() - 30 * 24 * 60 * 60 * 1000);
const result = await this.collection('items').deleteMany({
createdAt: { $lt: thirtyDaysAgo },
});
this.log('info', 'Cleanup completed', { deleted: result.deletedCount });
// Schedule a follow-up task
await this.scheduleIn('generate-report', { type: 'cleanup' }, 60); // 1 minute
return { deleted: result.deletedCount };
}
/**
* Operation that uses proxy service
*/
@Operation('fetch-external-data')
async fetchExternalData(input: { url: string }): Promise<{ data: any }> {
const proxyUrl = this.proxy.getProxy();
if (!proxyUrl) {
throw new Error('No proxy available');
}
// Use HTTP client with proxy
const response = await this.http.get(input.url, {
proxy: proxyUrl,
timeout: 10000,
});
// Cache the result
await this.cacheSet(`external:${input.url}`, response.data, 3600);
return { data: response.data };
}
/**
* Complex operation that still uses action file
*/
@Operation('process-batch')
  async processBatch(input: any, _context: ExecutionContext): Promise<unknown> {
// For complex operations, still use action files
const { processBatch } = await import('./actions/batch.action');
return processBatch(this, input);
}
}


@@ -0,0 +1,38 @@
import type { IbHandler } from '../ib.handler';
export async function fetchExchangesAndSymbols(this: IbHandler): Promise<unknown> {
this.logger.info('Starting IB exchanges and symbols fetch job');
try {
// Fetch session headers first
const sessionHeaders = await this.fetchSession();
if (!sessionHeaders) {
this.logger.error('Failed to get session headers for IB job');
return { success: false, error: 'No session headers' };
}
this.logger.info('Session headers obtained, fetching exchanges...');
// Fetch exchanges
const exchanges = await this.fetchExchanges();
this.logger.info('Fetched exchanges from IB', { count: exchanges?.length || 0 });
// Fetch symbols
this.logger.info('Fetching symbols...');
const symbols = await this.fetchSymbols();
this.logger.info('Fetched symbols from IB', { count: symbols?.length || 0 });
return {
success: true,
exchangesCount: exchanges?.length || 0,
symbolsCount: symbols?.length || 0,
};
} catch (error) {
this.logger.error('Failed to fetch IB exchanges and symbols', { error });
return {
success: false,
error: error instanceof Error ? error.message : 'Unknown error',
};
}
}


@@ -1,16 +1,15 @@
/**
* IB Exchanges Operations - Fetching exchange data from IB API
*/
import { OperationContext } from '@stock-bot/di';
import type { ServiceContainer } from '@stock-bot/di';
import type { IbHandler } from '../ib.handler';
import { IB_CONFIG } from '../shared/config';
export async function fetchExchanges(sessionHeaders: Record<string, string>, container: ServiceContainer): Promise<unknown[] | null> {
const ctx = OperationContext.create('ib', 'exchanges', { container });
export async function fetchExchanges(this: IbHandler): Promise<unknown[] | null> {
try {
ctx.logger.info('🔍 Fetching exchanges with session headers...');
// First get session headers
const sessionHeaders = await this.fetchSession();
if (!sessionHeaders) {
throw new Error('Failed to get session headers');
}
this.logger.info('🔍 Fetching exchanges with session headers...');
// The URL for the exchange data API
const exchangeUrl = IB_CONFIG.BASE_URL + IB_CONFIG.EXCHANGE_API;
@@ -28,7 +27,7 @@ export async function fetchExchanges(sessionHeaders: Record<string, string>, con
'X-Requested-With': 'XMLHttpRequest',
};
ctx.logger.info('📤 Making request to exchange API...', {
this.logger.info('📤 Making request to exchange API...', {
url: exchangeUrl,
headerCount: Object.keys(requestHeaders).length,
});
@@ -41,7 +40,7 @@ export async function fetchExchanges(sessionHeaders: Record<string, string>, con
});
if (!response.ok) {
ctx.logger.error('❌ Exchange API request failed', {
this.logger.error('❌ Exchange API request failed', {
status: response.status,
statusText: response.statusText,
});
@@ -50,19 +49,18 @@ export async function fetchExchanges(sessionHeaders: Record<string, string>, con
const data = await response.json();
const exchanges = data?.exchanges || [];
ctx.logger.info('✅ Exchange data fetched successfully');
this.logger.info('✅ Exchange data fetched successfully');
ctx.logger.info('Saving IB exchanges to MongoDB...');
await ctx.mongodb.batchUpsert('ibExchanges', exchanges, ['id', 'country_code']);
ctx.logger.info('✅ Exchange IB data saved to MongoDB:', {
this.logger.info('Saving IB exchanges to MongoDB...');
await this.mongodb.batchUpsert('ibExchanges', exchanges, ['id', 'country_code']);
this.logger.info('✅ Exchange IB data saved to MongoDB:', {
count: exchanges.length,
});
return exchanges;
} catch (error) {
ctx.logger.error('❌ Failed to fetch exchanges', { error });
this.logger.error('❌ Failed to fetch exchanges', { error });
return null;
} finally {
await ctx.dispose();
}
}
}


@@ -0,0 +1,83 @@
import { Browser } from '@stock-bot/browser';
import type { IbHandler } from '../ib.handler';
import { IB_CONFIG } from '../shared/config';
export async function fetchSession(this: IbHandler): Promise<Record<string, string> | undefined> {
try {
await Browser.initialize({
headless: true,
timeout: IB_CONFIG.BROWSER_TIMEOUT,
blockResources: false,
});
this.logger.info('✅ Browser initialized');
const { page } = await Browser.createPageWithProxy(
IB_CONFIG.BASE_URL + IB_CONFIG.PRODUCTS_PAGE,
IB_CONFIG.DEFAULT_PROXY
);
this.logger.info('✅ Page created with proxy');
    const headersPromise = new Promise<Record<string, string> | undefined>(resolve => {
      let resolved = false;
      page.onNetworkEvent(event => {
        if (
          !resolved &&
          event.type === 'request' &&
          event.url.includes('/webrest/search/product-types/summary')
        ) {
          resolved = true;
          resolve(event.headers);
        }
      });
      // Timeout fallback
      setTimeout(() => {
        if (!resolved) {
          resolved = true;
          this.logger.warn('Timeout waiting for headers');
          resolve(undefined);
        }
      }, IB_CONFIG.HEADERS_TIMEOUT);
    });
this.logger.info('⏳ Waiting for page load...');
await page.waitForLoadState('domcontentloaded', { timeout: IB_CONFIG.PAGE_LOAD_TIMEOUT });
this.logger.info('✅ Page loaded');
    // Products tab
this.logger.info('🔍 Looking for Products tab...');
const productsTab = page.locator('#productSearchTab[role="tab"][href="#products"]');
await productsTab.waitFor({ timeout: IB_CONFIG.ELEMENT_TIMEOUT });
this.logger.info('✅ Found Products tab');
this.logger.info('🖱️ Clicking Products tab...');
await productsTab.click();
this.logger.info('✅ Products tab clicked');
// New Products Checkbox
this.logger.info('🔍 Looking for "New Products Only" radio button...');
const radioButton = page.locator('span.checkbox-text:has-text("New Products Only")');
await radioButton.waitFor({ timeout: IB_CONFIG.ELEMENT_TIMEOUT });
this.logger.info(`🎯 Found "New Products Only" radio button`);
await radioButton.first().click();
this.logger.info('✅ "New Products Only" radio button clicked');
// Wait for and return headers immediately when captured
this.logger.info('⏳ Waiting for headers to be captured...');
const headers = await headersPromise;
    await page.close();
if (headers) {
this.logger.info('✅ Headers captured successfully');
} else {
this.logger.warn('⚠️ No headers were captured');
}
return headers;
} catch (error) {
this.logger.error('Failed to fetch IB symbol summary', { error });
return;
}
}
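`fetchSession` races the first matching network request against a deadline: whichever fires first settles the promise, and the `resolved` flag ensures it settles exactly once. The same capture-or-timeout pattern can be sketched generically; the helper name is hypothetical, not part of the codebase:

```typescript
// Resolve with the first value the subscriber delivers, or with undefined
// after `ms` milliseconds — never both, guarded by a settled flag.
function firstMatchOrTimeout<T>(
  subscribe: (cb: (value: T) => void) => void,
  ms: number
): Promise<T | undefined> {
  return new Promise(resolve => {
    let settled = false;
    subscribe(value => {
      if (!settled) {
        settled = true;
        resolve(value);
      }
    });
    setTimeout(() => {
      if (!settled) {
        settled = true;
        resolve(undefined);
      }
    }, ms);
  });
}

// Example: the subscriber fires first, so the timeout never wins.
firstMatchOrTimeout<string>(cb => cb('headers-captured'), 100).then(v => console.log(v));
// logs "headers-captured"
```

A `Promise` ignores calls to `resolve` after the first, so the flag is strictly needed only for skipping the warning log on the timeout path; keeping it also makes the intent explicit.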


@@ -1,18 +1,16 @@
/**
* IB Symbols Operations - Fetching symbol data from IB API
*/
import { OperationContext } from '@stock-bot/di';
import type { ServiceContainer } from '@stock-bot/di';
import type { IbHandler } from '../ib.handler';
import { IB_CONFIG } from '../shared/config';
// Fetch symbols from IB using the session headers
export async function fetchSymbols(sessionHeaders: Record<string, string>, container: ServiceContainer): Promise<unknown[] | null> {
const ctx = OperationContext.create('ib', 'symbols', { container });
export async function fetchSymbols(this: IbHandler): Promise<unknown[] | null> {
try {
ctx.logger.info('🔍 Fetching symbols with session headers...');
// First get session headers
const sessionHeaders = await this.fetchSession();
if (!sessionHeaders) {
throw new Error('Failed to get session headers');
}
this.logger.info('🔍 Fetching symbols with session headers...');
// Prepare headers - include all session headers plus any additional ones
const requestHeaders = {
...sessionHeaders,
@@ -39,18 +37,15 @@ export async function fetchSymbols(sessionHeaders: Record<string, string>, conta
};
// Get Summary
const summaryResponse = await fetch(
IB_CONFIG.BASE_URL + IB_CONFIG.SUMMARY_API,
{
method: 'POST',
headers: requestHeaders,
proxy: IB_CONFIG.DEFAULT_PROXY,
body: JSON.stringify(requestBody),
}
);
const summaryResponse = await fetch(IB_CONFIG.BASE_URL + IB_CONFIG.SUMMARY_API, {
method: 'POST',
headers: requestHeaders,
proxy: IB_CONFIG.DEFAULT_PROXY,
body: JSON.stringify(requestBody),
});
if (!summaryResponse.ok) {
ctx.logger.error('❌ Summary API request failed', {
this.logger.error('❌ Summary API request failed', {
status: summaryResponse.status,
statusText: summaryResponse.statusText,
});
@ -58,36 +53,33 @@ export async function fetchSymbols(sessionHeaders: Record<string, string>, conta
}
const summaryData = await summaryResponse.json();
ctx.logger.info('✅ IB Summary data fetched successfully', {
this.logger.info('✅ IB Summary data fetched successfully', {
totalCount: summaryData[0].totalCount,
});
const symbols = [];
requestBody.pageSize = IB_CONFIG.PAGE_SIZE;
const pageCount = Math.ceil(summaryData[0].totalCount / IB_CONFIG.PAGE_SIZE) || 0;
ctx.logger.info('Fetching Symbols for IB', { pageCount });
this.logger.info('Fetching Symbols for IB', { pageCount });
const symbolPromises = [];
for (let page = 1; page <= pageCount; page++) {
requestBody.pageNumber = page;
// Fetch symbols for the current page
const symbolsResponse = fetch(
IB_CONFIG.BASE_URL + IB_CONFIG.PRODUCTS_API,
{
method: 'POST',
headers: requestHeaders,
proxy: IB_CONFIG.DEFAULT_PROXY,
body: JSON.stringify(requestBody),
}
);
const symbolsResponse = fetch(IB_CONFIG.BASE_URL + IB_CONFIG.PRODUCTS_API, {
method: 'POST',
headers: requestHeaders,
proxy: IB_CONFIG.DEFAULT_PROXY,
body: JSON.stringify(requestBody),
});
symbolPromises.push(symbolsResponse);
}
const responses = await Promise.all(symbolPromises);
for (const response of responses) {
if (!response.ok) {
ctx.logger.error('❌ Symbols API request failed', {
this.logger.error('❌ Symbols API request failed', {
status: response.status,
statusText: response.statusText,
});
@ -98,29 +90,28 @@ export async function fetchSymbols(sessionHeaders: Record<string, string>, conta
if (symJson && symJson.length > 0) {
symbols.push(...symJson);
} else {
ctx.logger.warn('⚠️ No symbols found in response');
this.logger.warn('⚠️ No symbols found in response');
continue;
}
}
if (symbols.length === 0) {
ctx.logger.warn('⚠️ No symbols fetched from IB');
this.logger.warn('⚠️ No symbols fetched from IB');
return null;
}
ctx.logger.info('✅ IB symbols fetched successfully, saving to DB...', {
this.logger.info('✅ IB symbols fetched successfully, saving to DB...', {
totalSymbols: symbols.length,
});
await ctx.mongodb.batchUpsert('ib_symbols', symbols, ['symbol', 'exchangeId']);
ctx.logger.info('Saved IB symbols to DB', {
await this.mongodb.batchUpsert('ib_symbols', symbols, ['symbol', 'exchangeId']);
this.logger.info('Saved IB symbols to DB', {
totalSymbols: symbols.length,
});
return symbols;
} catch (error) {
ctx.logger.error('❌ Failed to fetch symbols', { error });
this.logger.error('❌ Failed to fetch symbols', { error });
return null;
} finally {
await ctx.dispose();
}
}
}
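The fan-out above (derive a page count from the summary's `totalCount`, then issue all page requests in parallel) reduces to a small generic pattern. `fetchAllPages` and `Page` are illustrative names, not part of the codebase:

```typescript
// Sketch of the paginated fan-out: compute how many pages cover
// totalCount items, fire every page request concurrently, then
// flatten the per-page results in page order.
interface Page<T> {
  items: T[];
}

async function fetchAllPages<T>(
  totalCount: number,
  pageSize: number,
  fetchPage: (pageNumber: number) => Promise<Page<T>>
): Promise<T[]> {
  const pageCount = Math.ceil(totalCount / pageSize) || 0;
  const pages = await Promise.all(
    Array.from({ length: pageCount }, (_, i) => fetchPage(i + 1))
  );
  return pages.flatMap(page => page.items);
}
```

`Promise.all` resolves in input order, so the flattened result stays in page order even when the individual requests complete out of order.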

View file

@ -0,0 +1,5 @@
export { fetchSession } from './fetch-session.action';
export { fetchExchanges } from './fetch-exchanges.action';
export { fetchSymbols } from './fetch-symbols.action';
export { fetchExchangesAndSymbols } from './fetch-exchanges-and-symbols.action';

View file

@ -1,90 +1,33 @@
/**
* Interactive Brokers Provider for new queue system
*/
import { getLogger } from '@stock-bot/logger';
import {
createJobHandler,
handlerRegistry,
type HandlerConfigWithSchedule,
} from '@stock-bot/queue';
import type { ServiceContainer } from '@stock-bot/di';
BaseHandler,
Handler,
Operation,
ScheduledOperation,
type IServiceContainer,
} from '@stock-bot/handlers';
import { fetchExchanges, fetchExchangesAndSymbols, fetchSession, fetchSymbols } from './actions';
const logger = getLogger('ib-provider');
@Handler('ib')
export class IbHandler extends BaseHandler {
constructor(services: IServiceContainer) {
super(services);
}
// Initialize and register the IB provider
export function initializeIBProvider(container: ServiceContainer) {
logger.debug('Registering IB provider with scheduled jobs...');
@Operation('fetch-session')
fetchSession = fetchSession;
const ibProviderConfig: HandlerConfigWithSchedule = {
name: 'ib',
operations: {
'fetch-session': createJobHandler(async () => {
// payload contains session configuration (not used in current implementation)
logger.debug('Processing session fetch request');
const { fetchSession } = await import('./operations/session.operations');
return fetchSession(container);
}),
@Operation('fetch-exchanges')
fetchExchanges = fetchExchanges;
'fetch-exchanges': createJobHandler(async () => {
// payload should contain session headers
logger.debug('Processing exchanges fetch request');
const { fetchSession } = await import('./operations/session.operations');
const { fetchExchanges } = await import('./operations/exchanges.operations');
const sessionHeaders = await fetchSession(container);
if (sessionHeaders) {
return fetchExchanges(sessionHeaders, container);
}
throw new Error('Failed to get session headers');
}),
@Operation('fetch-symbols')
fetchSymbols = fetchSymbols;
'fetch-symbols': createJobHandler(async () => {
// payload should contain session headers
logger.debug('Processing symbols fetch request');
const { fetchSession } = await import('./operations/session.operations');
const { fetchSymbols } = await import('./operations/symbols.operations');
const sessionHeaders = await fetchSession(container);
if (sessionHeaders) {
return fetchSymbols(sessionHeaders, container);
}
throw new Error('Failed to get session headers');
}),
'ib-exchanges-and-symbols': createJobHandler(async () => {
// Legacy operation for scheduled jobs
logger.info('Fetching symbol summary from IB');
const { fetchSession } = await import('./operations/session.operations');
const { fetchExchanges } = await import('./operations/exchanges.operations');
const { fetchSymbols } = await import('./operations/symbols.operations');
const sessionHeaders = await fetchSession(container);
logger.info('Fetched symbol summary from IB');
if (sessionHeaders) {
logger.debug('Fetching exchanges from IB');
const exchanges = await fetchExchanges(sessionHeaders, container);
logger.info('Fetched exchanges from IB', { count: exchanges?.length });
logger.debug('Fetching symbols from IB');
const symbols = await fetchSymbols(sessionHeaders, container);
logger.info('Fetched symbols from IB', { symbols });
return { exchangesCount: exchanges?.length, symbolsCount: symbols?.length };
}
return null;
}),
},
scheduledJobs: [
{
type: 'ib-exchanges-and-symbols',
operation: 'ib-exchanges-and-symbols',
cronPattern: '0 0 * * 0', // Every Sunday at midnight
priority: 5,
description: 'Fetch and update IB exchanges and symbols data',
// immediately: true, // Don't run immediately during startup to avoid conflicts
},
],
};
handlerRegistry.registerWithSchedule(ibProviderConfig);
logger.debug('IB provider registered successfully with scheduled jobs');
@Operation('ib-exchanges-and-symbols')
@ScheduledOperation('ib-exchanges-and-symbols', '0 0 * * 0', {
priority: 5,
description: 'Fetch and update IB exchanges and symbols data',
immediately: false,
})
fetchExchangesAndSymbols = fetchExchangesAndSymbols;
}
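The decorator-driven registration above replaces the old explicit `HandlerConfigWithSchedule` object. The `@stock-bot/handlers` internals are not part of this diff, so the following is only a guess at the kind of registry `@Operation` / `@ScheduledOperation` might populate; every name in it is illustrative:

```typescript
// Hypothetical sketch of decorator bookkeeping: operation names mapped
// to optional cron schedules. Not the real @stock-bot/handlers API.
interface ScheduleMeta {
  cronPattern: string;
  priority?: number;
  description?: string;
  immediately?: boolean;
}

interface OperationMeta {
  name: string;
  schedule?: ScheduleMeta;
}

class OperationRegistry {
  private ops = new Map<string, OperationMeta>();

  register(name: string, schedule?: ScheduleMeta): void {
    this.ops.set(name, { name, schedule });
  }

  scheduled(): OperationMeta[] {
    return Array.from(this.ops.values()).filter(op => op.schedule !== undefined);
  }
}

const registry = new OperationRegistry();
registry.register('fetch-session');
registry.register('ib-exchanges-and-symbols', {
  cronPattern: '0 0 * * 0', // every Sunday at midnight
  priority: 5,
});
```

A real implementation would additionally bind each operation name to its method and hand the scheduled entries to the queue's cron scheduler.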

View file

@ -1,91 +0,0 @@
/**
* IB Session Operations - Browser automation for session headers
*/
import { Browser } from '@stock-bot/browser';
import { OperationContext } from '@stock-bot/di';
import type { ServiceContainer } from '@stock-bot/di';
import { IB_CONFIG } from '../shared/config';
export async function fetchSession(container: ServiceContainer): Promise<Record<string, string> | undefined> {
const ctx = OperationContext.create('ib', 'session', { container });
try {
await Browser.initialize({
headless: true,
timeout: IB_CONFIG.BROWSER_TIMEOUT,
blockResources: false
});
ctx.logger.info('✅ Browser initialized');
const { page } = await Browser.createPageWithProxy(
IB_CONFIG.BASE_URL + IB_CONFIG.PRODUCTS_PAGE,
IB_CONFIG.DEFAULT_PROXY
);
ctx.logger.info('✅ Page created with proxy');
const headersPromise = new Promise<Record<string, string> | undefined>(resolve => {
let resolved = false;
page.onNetworkEvent(event => {
if (event.url.includes('/webrest/search/product-types/summary')) {
if (event.type === 'request') {
try {
if (!resolved) {
resolved = true;
resolve(event.headers);
}
} catch (e) {
resolved = true;
resolve(undefined);
ctx.logger.debug('Raw Summary Response error', { error: (e as Error).message });
}
}
}
});
// Timeout fallback
setTimeout(() => {
if (!resolved) {
resolved = true;
ctx.logger.warn('Timeout waiting for headers');
resolve(undefined);
}
}, IB_CONFIG.HEADERS_TIMEOUT);
});
ctx.logger.info('⏳ Waiting for page load...');
await page.waitForLoadState('domcontentloaded', { timeout: IB_CONFIG.PAGE_LOAD_TIMEOUT });
ctx.logger.info('✅ Page loaded');
//Products tabs
ctx.logger.info('🔍 Looking for Products tab...');
const productsTab = page.locator('#productSearchTab[role=\"tab\"][href=\"#products\"]');
await productsTab.waitFor({ timeout: IB_CONFIG.ELEMENT_TIMEOUT });
ctx.logger.info('✅ Found Products tab');
ctx.logger.info('🖱️ Clicking Products tab...');
await productsTab.click();
ctx.logger.info('✅ Products tab clicked');
// New Products Checkbox
ctx.logger.info('🔍 Looking for \"New Products Only\" radio button...');
const radioButton = page.locator('span.checkbox-text:has-text(\"New Products Only\")');
await radioButton.waitFor({ timeout: IB_CONFIG.ELEMENT_TIMEOUT });
ctx.logger.info(`🎯 Found \"New Products Only\" radio button`);
await radioButton.first().click();
ctx.logger.info('✅ \"New Products Only\" radio button clicked');
// Wait for and return headers immediately when captured
ctx.logger.info('⏳ Waiting for headers to be captured...');
const headers = await headersPromise;
page.close();
if (headers) {
ctx.logger.info('✅ Headers captured successfully');
} else {
ctx.logger.warn('⚠️ No headers were captured');
}
return headers;
} catch (error) {
ctx.logger.error('Failed to fetch IB symbol summary', { error });
return;
} finally {
await ctx.dispose();
}
}

View file

@ -8,16 +8,17 @@ export const IB_CONFIG = {
EXCHANGE_API: '/webrest/exchanges',
SUMMARY_API: '/webrest/search/product-types/summary',
PRODUCTS_API: '/webrest/search/products-by-filters',
// Browser configuration
BROWSER_TIMEOUT: 10000,
PAGE_LOAD_TIMEOUT: 20000,
ELEMENT_TIMEOUT: 5000,
HEADERS_TIMEOUT: 30000,
// API configuration
DEFAULT_PROXY: 'http://doimvbnb-US-rotate:w5fpiwrb9895@p.webshare.io:80',
PAGE_SIZE: 500,
PRODUCT_COUNTRIES: ['CA', 'US'],
PRODUCT_TYPES: ['STK'],
};
};

View file

@ -6,11 +6,12 @@
import type { IServiceContainer } from '@stock-bot/handlers';
import { autoRegisterHandlers } from '@stock-bot/handlers';
import { getLogger } from '@stock-bot/logger';
// Import handlers for bundling (ensures they're included in the build)
import './qm/qm.handler';
import './webshare/webshare.handler';
import './ceo/ceo.handler';
import './ib/ib.handler';
// Add more handler imports as needed
const logger = getLogger('handler-init');
@ -21,21 +22,17 @@ const logger = getLogger('handler-init');
export async function initializeAllHandlers(serviceContainer: IServiceContainer): Promise<void> {
try {
// Auto-register all handlers in this directory
const result = await autoRegisterHandlers(
__dirname,
serviceContainer,
{
pattern: '.handler.',
exclude: ['test', 'spec'],
dryRun: false
}
);
const result = await autoRegisterHandlers(__dirname, serviceContainer, {
pattern: '.handler.',
exclude: ['test', 'spec'],
dryRun: false,
});
logger.info('Handler auto-registration complete', {
registered: result.registered,
failed: result.failed
failed: result.failed,
});
if (result.failed.length > 0) {
logger.error('Some handlers failed to register', { failed: result.failed });
}
@ -51,21 +48,20 @@ export async function initializeAllHandlers(serviceContainer: IServiceContainer)
*/
async function manualHandlerRegistration(serviceContainer: any): Promise<void> {
logger.warn('Falling back to manual handler registration');
try {
// // Import and register handlers manually
// const { QMHandler } = await import('./qm/qm.handler');
// const qmHandler = new QMHandler(serviceContainer);
// qmHandler.register();
// const { WebShareHandler } = await import('./webshare/webshare.handler');
// const webShareHandler = new WebShareHandler(serviceContainer);
// webShareHandler.register();
logger.info('Manual handler registration complete');
} catch (error) {
logger.error('Manual handler registration failed', { error });
throw error;
}
}
}

View file

@ -1,24 +1,23 @@
/**
* Proxy Check Operations - Checking proxy functionality
*/
import type { ProxyInfo } from '@stock-bot/proxy';
import { OperationContext } from '@stock-bot/di';
import { getLogger } from '@stock-bot/logger';
import type { ProxyInfo } from '@stock-bot/proxy';
import { fetch } from '@stock-bot/utils';
import { PROXY_CONFIG } from '../shared/config';
/**
* Check if a proxy is working
*/
export async function checkProxy(proxy: ProxyInfo): Promise<ProxyInfo> {
const ctx = {
logger: getLogger('proxy-check'),
const ctx = {
logger: getLogger('proxy-check'),
resolve: <T>(_name: string) => {
throw new Error(`Service container not available for proxy operations`);
}
},
} as any;
let success = false;
ctx.logger.debug(`Checking Proxy:`, {
protocol: proxy.protocol,
@ -28,16 +27,17 @@ export async function checkProxy(proxy: ProxyInfo): Promise<ProxyInfo> {
try {
// Test the proxy using fetch with proxy support
const proxyUrl = proxy.username && proxy.password
? `${proxy.protocol}://${encodeURIComponent(proxy.username)}:${encodeURIComponent(proxy.password)}@${proxy.host}:${proxy.port}`
: `${proxy.protocol}://${proxy.host}:${proxy.port}`;
const proxyUrl =
proxy.username && proxy.password
? `${proxy.protocol}://${encodeURIComponent(proxy.username)}:${encodeURIComponent(proxy.password)}@${proxy.host}:${proxy.port}`
: `${proxy.protocol}://${proxy.host}:${proxy.port}`;
const response = await fetch(PROXY_CONFIG.CHECK_URL, {
proxy: proxyUrl,
signal: AbortSignal.timeout(PROXY_CONFIG.CHECK_TIMEOUT),
logger: ctx.logger
logger: ctx.logger,
} as any);
const data = await response.text();
const isWorking = response.ok;
@ -94,7 +94,11 @@ export async function checkProxy(proxy: ProxyInfo): Promise<ProxyInfo> {
/**
* Update proxy data in cache with working/total stats and average response time
*/
async function updateProxyInCache(proxy: ProxyInfo, isWorking: boolean, ctx: OperationContext): Promise<void> {
async function updateProxyInCache(
proxy: ProxyInfo,
isWorking: boolean,
ctx: OperationContext
): Promise<void> {
const _cacheKey = `${PROXY_CONFIG.CACHE_KEY}:${proxy.protocol}://${proxy.host}:${proxy.port}`;
try {
@ -167,6 +171,6 @@ async function updateProxyInCache(proxy: ProxyInfo, isWorking: boolean, ctx: Ope
function updateProxyStats(sourceId: string, success: boolean, ctx: OperationContext) {
// Stats are now handled by the global ProxyManager
ctx.logger.debug('Proxy check result', { sourceId, success });
// TODO: Integrate with global ProxyManager stats if needed
}
}
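`checkProxy`'s URL assembly above percent-encodes the credentials; isolated, the rule looks like this (`ProxyParts` is a local stand-in for the `ProxyInfo` fields involved):

```typescript
// Build a proxy URL, percent-encoding username/password so characters
// like '@', ':' and spaces survive inside the authority component.
interface ProxyParts {
  protocol: string;
  host: string;
  port: number;
  username?: string;
  password?: string;
}

function buildProxyUrl(p: ProxyParts): string {
  return p.username && p.password
    ? `${p.protocol}://${encodeURIComponent(p.username)}:${encodeURIComponent(p.password)}@${p.host}:${p.port}`
    : `${p.protocol}://${p.host}:${p.port}`;
}
```

Without the encoding, a password containing `@` would be parsed as part of the host.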

View file

@ -1,9 +1,8 @@
/**
* Proxy Query Operations - Getting active proxies from cache
*/
import type { ProxyInfo } from '@stock-bot/proxy';
import { OperationContext } from '@stock-bot/di';
import type { ProxyInfo } from '@stock-bot/proxy';
import { PROXY_CONFIG } from '../shared/config';
/**
@ -17,7 +16,7 @@ export async function getRandomActiveProxy(
minSuccessRate: number = 50
): Promise<ProxyInfo | null> {
const ctx = OperationContext.create('proxy', 'get-random');
try {
// Get all active proxy keys from cache
const pattern = protocol
@ -56,7 +55,10 @@ export async function getRandomActiveProxy(
return proxyData;
}
} catch (error) {
ctx.logger.debug('Error reading proxy from cache', { key, error: (error as Error).message });
ctx.logger.debug('Error reading proxy from cache', {
key,
error: (error as Error).message,
});
continue;
}
}
@ -76,4 +78,4 @@ export async function getRandomActiveProxy(
});
return null;
}
}
}

View file

@ -1,13 +1,13 @@
/**
* Proxy Queue Operations - Queueing proxy operations
*/
import { OperationContext } from '@stock-bot/di';
import type { ProxyInfo } from '@stock-bot/proxy';
import { QueueManager } from '@stock-bot/queue';
import { OperationContext } from '@stock-bot/di';
export async function queueProxyFetch(): Promise<string> {
const ctx = OperationContext.create('proxy', 'queue-fetch');
const queueManager = QueueManager.getInstance();
const queue = queueManager.getQueue('proxy');
const job = await queue.add('proxy-fetch', {
@ -24,7 +24,7 @@ export async function queueProxyFetch(): Promise<string> {
export async function queueProxyCheck(proxies: ProxyInfo[]): Promise<string> {
const ctx = OperationContext.create('proxy', 'queue-check');
const queueManager = QueueManager.getInstance();
const queue = queueManager.getQueue('proxy');
const job = await queue.add('proxy-check', {
@ -37,4 +37,4 @@ export async function queueProxyCheck(proxies: ProxyInfo[]): Promise<string> {
const jobId = job.id || 'unknown';
ctx.logger.info('Proxy check job queued', { jobId, count: proxies.length });
return jobId;
}
}

View file

@ -1,10 +1,14 @@
/**
* Proxy Provider for new queue system
*/
import type { ProxyInfo } from '@stock-bot/proxy';
import { getLogger } from '@stock-bot/logger';
import { handlerRegistry, createJobHandler, type HandlerConfigWithSchedule } from '@stock-bot/queue';
import type { ServiceContainer } from '@stock-bot/di';
import { getLogger } from '@stock-bot/logger';
import type { ProxyInfo } from '@stock-bot/proxy';
import {
createJobHandler,
handlerRegistry,
type HandlerConfigWithSchedule,
} from '@stock-bot/queue';
const handlerLogger = getLogger('proxy-handler');

View file

@ -137,4 +137,4 @@ export const PROXY_CONFIG = {
protocol: 'https',
},
],
};
};

View file

@ -10,4 +10,4 @@ export interface ProxySource {
total?: number; // Optional, used for stats
percentWorking?: number; // Optional, used for stats
lastChecked?: Date; // Optional, used for stats
}
}

View file

@ -6,16 +6,14 @@ import type { IServiceContainer } from '@stock-bot/handlers';
export async function fetchExchanges(services: IServiceContainer): Promise<any[]> {
// Get exchanges from MongoDB
const exchanges = await services.mongodb.collection('qm_exchanges')
.find({}).toArray();
const exchanges = await services.mongodb.collection('qm_exchanges').find({}).toArray();
return exchanges;
}
export async function getExchangeByCode(services: IServiceContainer, code: string): Promise<any> {
// Get specific exchange by code
const exchange = await services.mongodb.collection('qm_exchanges')
.findOne({ code });
const exchange = await services.mongodb.collection('qm_exchanges').findOne({ code });
return exchange;
}
}

View file

@ -9,10 +9,10 @@ import { QMSessionManager } from '../shared/session-manager';
/**
* Check existing sessions and queue creation jobs for needed sessions
*/
export async function checkSessions(handler: BaseHandler): Promise<{
cleaned: number;
queued: number;
message: string;
export async function checkSessions(handler: BaseHandler): Promise<{
cleaned: number;
queued: number;
message: string;
}> {
const sessionManager = QMSessionManager.getInstance();
const cleanedCount = sessionManager.cleanupFailedSessions();
@ -24,17 +24,17 @@ export async function checkSessions(handler: BaseHandler): Promise<{
const currentCount = sessionManager.getSessions(sessionId).length;
const neededSessions = SESSION_CONFIG.MAX_SESSIONS - currentCount;
for (let i = 0; i < neededSessions; i++) {
await handler.scheduleOperation('create-session', { sessionId , sessionType });
await handler.scheduleOperation('create-session', { sessionId, sessionType });
handler.logger.info(`Queued job to create session for ${sessionType}`);
queuedCount++;
}
}
}
return {
cleaned: cleanedCount,
queued: queuedCount,
message: `Session check completed: cleaned ${cleanedCount}, queued ${queuedCount}`
message: `Session check completed: cleaned ${cleanedCount}, queued ${queuedCount}`,
};
}
@ -42,16 +42,15 @@ export async function checkSessions(handler: BaseHandler): Promise<{
* Create a single session for a specific session ID
*/
export async function createSingleSession(
handler: BaseHandler,
handler: BaseHandler,
input: any
): Promise<{ sessionId: string; status: string; sessionType: string }> {
const { sessionId, sessionType } = input || {};
const sessionManager = QMSessionManager.getInstance();
// Get proxy from proxy service
const proxyString = handler.proxy.getProxy();
// const session = {
// proxy: proxyString || 'http://proxy:8080',
// headers: sessionManager.getQmHeaders(),
@ -60,15 +59,14 @@ export async function createSingleSession(
// lastUsed: new Date()
// };
handler.logger.info(`Creating session for ${sessionType}`)
handler.logger.info(`Creating session for ${sessionType}`);
// Add session to manager
// sessionManager.addSession(sessionType, session);
return {
sessionId: sessionType,
status: 'created',
sessionType
sessionType,
};
}

View file

@ -9,16 +9,15 @@ export async function spiderSymbolSearch(
services: IServiceContainer,
config: SymbolSpiderJob
): Promise<{ foundSymbols: number; depth: number }> {
// Simple spider implementation
// TODO: Implement actual API calls to discover symbols
// For now, just return mock results
const foundSymbols = Math.floor(Math.random() * 10) + 1;
return {
foundSymbols,
depth: config.depth
depth: config.depth,
};
}
@ -31,4 +30,4 @@ export async function queueSymbolDiscovery(
// TODO: Queue actual discovery jobs
await services.cache.set(`discovery:${term}`, { queued: true }, 3600);
}
}
}

View file

@ -6,16 +6,14 @@ import type { IServiceContainer } from '@stock-bot/handlers';
export async function searchSymbols(services: IServiceContainer): Promise<any[]> {
// Get symbols from MongoDB
const symbols = await services.mongodb.collection('qm_symbols')
.find({}).limit(50).toArray();
const symbols = await services.mongodb.collection('qm_symbols').find({}).limit(50).toArray();
return symbols;
}
export async function fetchSymbolData(services: IServiceContainer, symbol: string): Promise<any> {
// Fetch data for a specific symbol
const symbolData = await services.mongodb.collection('qm_symbols')
.findOne({ symbol });
const symbolData = await services.mongodb.collection('qm_symbols').findOne({ symbol });
return symbolData;
}
}

View file

@ -1,8 +1,4 @@
import {
BaseHandler,
Handler,
type IServiceContainer
} from '@stock-bot/handlers';
import { BaseHandler, Handler, type IServiceContainer } from '@stock-bot/handlers';
@Handler('qm')
export class QMHandler extends BaseHandler {
@ -11,10 +7,10 @@ export class QMHandler extends BaseHandler {
}
// @Operation('check-sessions')
// @QueueSchedule('0 */15 * * *', {
// priority: 7,
// immediately: true,
// description: 'Check and maintain QM sessions'
// @QueueSchedule('0 */15 * * *', {
// priority: 7,
// immediately: true,
// description: 'Check and maintain QM sessions'
// })
// async checkSessions(input: unknown, context: ExecutionContext): Promise<unknown> {
// // Call the session maintenance action
@ -36,13 +32,13 @@ export class QMHandler extends BaseHandler {
// // Check existing symbols in MongoDB
// const symbolsCollection = this.mongodb.collection('qm_symbols');
// const symbols = await symbolsCollection.find({}).limit(100).toArray();
// this.logger.info('QM symbol search completed', { count: symbols.length });
// if (symbols && symbols.length > 0) {
// // Cache result for performance
// await this.cache.set('qm-symbols-sample', symbols.slice(0, 10), 1800);
// return {
// success: true,
// message: 'QM symbol search completed successfully',
@ -58,7 +54,7 @@ export class QMHandler extends BaseHandler {
// count: 0,
// };
// }
// } catch (error) {
// this.logger.error('Failed to search QM symbols', { error });
// throw error;
@ -66,10 +62,10 @@ export class QMHandler extends BaseHandler {
// }
// @Operation('spider-symbol-search')
// @QueueSchedule('0 0 * * 0', {
// priority: 10,
// immediately: false,
// description: 'Comprehensive symbol search using QM API'
// @QueueSchedule('0 0 * * 0', {
// priority: 10,
// immediately: false,
// description: 'Comprehensive symbol search using QM API'
// })
// async spiderSymbolSearch(payload: SymbolSpiderJob | undefined, context: ExecutionContext): Promise<unknown> {
// // Set default payload for scheduled runs
@ -79,9 +75,9 @@ export class QMHandler extends BaseHandler {
// source: 'qm',
// maxDepth: 4
// };
// this.logger.info('Starting QM spider symbol search', { payload: jobPayload });
// // Store spider job info in cache (temporary data)
// const spiderJobId = `spider:qm:${Date.now()}:${Math.random().toString(36).substr(2, 9)}`;
// const spiderResult = {
@ -90,19 +86,18 @@ export class QMHandler extends BaseHandler {
// status: 'started',
// jobId: spiderJobId
// };
// // Store in cache with 1 hour TTL (temporary data)
// await this.cache.set(spiderJobId, spiderResult, 3600);
// this.logger.debug('Spider job stored in cache', { spiderJobId, ttl: 3600 });
// // Schedule follow-up processing if needed
// await this.scheduleOperation('search-symbols', { source: 'spider', spiderJobId }, 5000);
// return {
// success: true,
// return {
// success: true,
// message: 'QM spider search initiated',
// spiderJobId
// };
// }
}

View file

@ -2,11 +2,10 @@
* Shared configuration for QM operations
*/
// QM Session IDs for different endpoints
export const QM_SESSION_IDS = {
LOOKUP: 'dc8c9930437f65d30f6597768800957017bac203a0a50342932757c8dfa158d6', // lookup endpoint
// '5ad521e05faf5778d567f6d0012ec34d6cdbaeb2462f41568f66558bc7b4ced9': [], //4488d072b
// '5ad521e05faf5778d567f6d0012ec34d6cdbaeb2462f41568f66558bc7b4ced9': [], //4488d072b
// cc1cbdaf040f76db8f4c94f7d156b9b9b716e1a7509ec9c74a48a47f6b6b9f87: [], //97ff00cf3 // getQuotes
// '74963ff42f1db2320d051762b5d3950ff9eab23f9d5c5b592551b4ca0441d086': [], //32ca24e394b // getSplitsBySymbol getBrokerRatingsBySymbol getDividendsBySymbol getEarningsSurprisesBySymbol getEarningsEventsBySymbol
// '1e1d7cb1de1fd2fe52684abdea41a446919a5fe12776dfab88615ac1ce1ec2f6': [], //fb5721812d2c // getEnhancedQuotes getProfiles
@ -36,4 +35,4 @@ export const SESSION_CONFIG = {
MAX_FAILED_CALLS: 10,
SESSION_TIMEOUT: 10000, // 10 seconds
API_TIMEOUT: 15000, // 15 seconds
} as const;
} as const;

View file

@ -33,13 +33,15 @@ export class QMSessionManager {
if (!sessions || sessions.length === 0) {
return null;
}
// Filter out sessions with excessive failures
const validSessions = sessions.filter(session => session.failedCalls <= SESSION_CONFIG.MAX_FAILED_CALLS);
const validSessions = sessions.filter(
session => session.failedCalls <= SESSION_CONFIG.MAX_FAILED_CALLS
);
if (validSessions.length === 0) {
return null;
}
return validSessions[Math.floor(Math.random() * validSessions.length)];
}
@ -72,7 +74,7 @@ export class QMSessionManager {
*/
cleanupFailedSessions(): number {
let removedCount = 0;
Object.keys(this.sessionCache).forEach(sessionId => {
const initialCount = this.sessionCache[sessionId].length;
this.sessionCache[sessionId] = this.sessionCache[sessionId].filter(
@ -80,7 +82,7 @@ export class QMSessionManager {
);
removedCount += initialCount - this.sessionCache[sessionId].length;
});
return removedCount;
}
@ -94,13 +96,15 @@ export class QMSessionManager {
Referer: 'https://www.quotemedia.com/',
};
}
/**
* Check if more sessions are needed for a session ID
*/
needsMoreSessions(sessionId: string): boolean {
const sessions = this.sessionCache[sessionId] || [];
const validSessions = sessions.filter(session => session.failedCalls <= SESSION_CONFIG.MAX_FAILED_CALLS);
const validSessions = sessions.filter(
session => session.failedCalls <= SESSION_CONFIG.MAX_FAILED_CALLS
);
return validSessions.length < SESSION_CONFIG.MIN_SESSIONS;
}
@ -117,18 +121,22 @@ export class QMSessionManager {
*/
getStats() {
const stats: Record<string, { total: number; valid: number; failed: number }> = {};
Object.entries(this.sessionCache).forEach(([sessionId, sessions]) => {
const validSessions = sessions.filter(session => session.failedCalls <= SESSION_CONFIG.MAX_FAILED_CALLS);
const failedSessions = sessions.filter(session => session.failedCalls > SESSION_CONFIG.MAX_FAILED_CALLS);
const validSessions = sessions.filter(
session => session.failedCalls <= SESSION_CONFIG.MAX_FAILED_CALLS
);
const failedSessions = sessions.filter(
session => session.failedCalls > SESSION_CONFIG.MAX_FAILED_CALLS
);
stats[sessionId] = {
total: sessions.length,
valid: validSessions.length,
failed: failedSessions.length
failed: failedSessions.length,
};
});
return stats;
}
@ -145,4 +153,4 @@ export class QMSessionManager {
getInitialized(): boolean {
return this.isInitialized;
}
}
}
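The session filtering repeated throughout `QMSessionManager` (in `getStats`, `needsMoreSessions`, and the random-session picker) reduces to one threshold rule. A stand-alone sketch, with the threshold inlined from `SESSION_CONFIG.MAX_FAILED_CALLS`:

```typescript
// Sessions at or below the failure threshold count as valid; anything
// over it is treated as dead. Threshold value mirrors the config above.
const MAX_FAILED_CALLS = 10;

interface SessionHealth {
  failedCalls: number;
}

function partitionSessions(sessions: SessionHealth[]) {
  const valid = sessions.filter(s => s.failedCalls <= MAX_FAILED_CALLS);
  const failed = sessions.filter(s => s.failedCalls > MAX_FAILED_CALLS);
  return { total: sessions.length, valid: valid.length, failed: failed.length };
}
```

Note the boundary: a session with exactly `MAX_FAILED_CALLS` failures is still considered valid, matching the `<=` comparison in the class.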

View file

@ -29,4 +29,4 @@ export interface SpiderResult {
success: boolean;
symbolsFound: number;
jobsCreated: number;
}
}

View file

@ -1,9 +1,8 @@
/**
* WebShare Fetch Operations - API integration
*/
import type { ProxyInfo } from '@stock-bot/proxy';
import { OperationContext } from '@stock-bot/di';
import type { ProxyInfo } from '@stock-bot/proxy';
import { WEBSHARE_CONFIG } from '../shared/config';
/**
@ -11,7 +10,7 @@ import { WEBSHARE_CONFIG } from '../shared/config';
*/
export async function fetchWebShareProxies(): Promise<ProxyInfo[]> {
const ctx = OperationContext.create('webshare', 'fetch-proxies');
try {
// Get configuration from config system
const { getConfig } = await import('@stock-bot/config');
@ -30,14 +29,17 @@ export async function fetchWebShareProxies(): Promise<ProxyInfo[]> {
ctx.logger.info('Fetching proxies from WebShare API', { apiUrl });
const response = await fetch(`${apiUrl}proxy/list/?mode=${WEBSHARE_CONFIG.DEFAULT_MODE}&page=${WEBSHARE_CONFIG.DEFAULT_PAGE}&page_size=${WEBSHARE_CONFIG.DEFAULT_PAGE_SIZE}`, {
method: 'GET',
headers: {
Authorization: `Token ${apiKey}`,
'Content-Type': 'application/json',
},
signal: AbortSignal.timeout(WEBSHARE_CONFIG.TIMEOUT),
});
const response = await fetch(
`${apiUrl}proxy/list/?mode=${WEBSHARE_CONFIG.DEFAULT_MODE}&page=${WEBSHARE_CONFIG.DEFAULT_PAGE}&page_size=${WEBSHARE_CONFIG.DEFAULT_PAGE_SIZE}`,
{
method: 'GET',
headers: {
Authorization: `Token ${apiKey}`,
'Content-Type': 'application/json',
},
signal: AbortSignal.timeout(WEBSHARE_CONFIG.TIMEOUT),
}
);
if (!response.ok) {
ctx.logger.error('WebShare API request failed', {
@ -55,22 +57,19 @@ export async function fetchWebShareProxies(): Promise<ProxyInfo[]> {
}
// Transform proxy data to ProxyInfo format
const proxies: ProxyInfo[] = data.results.map((proxy: {
username: string;
password: string;
proxy_address: string;
port: number;
}) => ({
source: 'webshare',
protocol: 'http' as const,
host: proxy.proxy_address,
port: proxy.port,
username: proxy.username,
password: proxy.password,
isWorking: true, // WebShare provides working proxies
firstSeen: new Date(),
lastChecked: new Date(),
}));
const proxies: ProxyInfo[] = data.results.map(
(proxy: { username: string; password: string; proxy_address: string; port: number }) => ({
source: 'webshare',
protocol: 'http' as const,
host: proxy.proxy_address,
port: proxy.port,
username: proxy.username,
password: proxy.password,
isWorking: true, // WebShare provides working proxies
firstSeen: new Date(),
lastChecked: new Date(),
})
);
ctx.logger.info('Successfully fetched proxies from WebShare', {
count: proxies.length,
@ -82,4 +81,4 @@ export async function fetchWebShareProxies(): Promise<ProxyInfo[]> {
ctx.logger.error('Failed to fetch proxies from WebShare', { error });
return [];
}
}
}
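The map at the end of `fetchWebShareProxies` converts WebShare API rows into the internal proxy shape. Sketched in isolation, with `ProxyInfo` here as a local stand-in for the `@stock-bot/proxy` type:

```typescript
// Transform one WebShare API result row into the internal proxy record.
interface WebShareRow {
  username: string;
  password: string;
  proxy_address: string;
  port: number;
}

interface ProxyInfo {
  source: string;
  protocol: 'http';
  host: string;
  port: number;
  username: string;
  password: string;
  isWorking: boolean;
  firstSeen: Date;
  lastChecked: Date;
}

function toProxyInfo(row: WebShareRow): ProxyInfo {
  return {
    source: 'webshare',
    protocol: 'http',
    host: row.proxy_address,
    port: row.port,
    username: row.username,
    password: row.password,
    isWorking: true, // WebShare lists only working proxies
    firstSeen: new Date(),
    lastChecked: new Date(),
  };
}
```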

View file

@ -7,4 +7,4 @@ export const WEBSHARE_CONFIG = {
DEFAULT_MODE: 'direct',
DEFAULT_PAGE: 1,
TIMEOUT: 10000,
};
};

View file

@ -4,7 +4,7 @@ import {
Operation,
QueueSchedule,
type ExecutionContext,
type IServiceContainer
type IServiceContainer,
} from '@stock-bot/handlers';
@Handler('webshare')
@ -14,33 +14,45 @@ export class WebShareHandler extends BaseHandler {
}
@Operation('fetch-proxies')
@QueueSchedule('0 */6 * * *', {
priority: 3,
immediately: true,
description: 'Fetch fresh proxies from WebShare API'
@QueueSchedule('0 */6 * * *', {
priority: 3,
immediately: true,
description: 'Fetch fresh proxies from WebShare API',
})
async fetchProxies(_input: unknown, _context: ExecutionContext): Promise<unknown> {
this.logger.info('Fetching proxies from WebShare API');
try {
const { fetchWebShareProxies } = await import('./operations/fetch.operations');
const proxies = await fetchWebShareProxies();
if (proxies.length > 0) {
// Update the centralized proxy manager using the injected service
if (!this.proxy) {
this.logger.warn('Proxy manager is not initialized, cannot update proxies');
return {
success: false,
proxiesUpdated: 0,
error: 'Proxy manager not initialized',
};
}
await this.proxy.updateProxies(proxies);
this.logger.info('Updated proxy manager with WebShare proxies', {
count: proxies.length,
workingCount: proxies.filter(p => p.isWorking !== false).length,
});
// Cache proxy stats for monitoring
await this.cache.set('webshare-proxy-count', proxies.length, 3600);
await this.cache.set(
'webshare-working-count',
proxies.filter(p => p.isWorking !== false).length,
3600
);
await this.cache.set('last-webshare-fetch', new Date().toISOString(), 1800);
return {
success: true,
proxiesUpdated: proxies.length,
workingProxies: proxies.filter(p => p.isWorking !== false).length,
@@ -59,4 +71,3 @@ export class WebShareHandler extends BaseHandler {
}
}
}


@@ -4,20 +4,18 @@
*/
// Framework imports
import { initializeServiceConfig } from '@stock-bot/config';
import { Hono } from 'hono';
import { cors } from 'hono/cors';
// Library imports
import {
createServiceContainer,
initializeServices as initializeAwilixServices,
type ServiceContainer,
} from '@stock-bot/di';
import { getLogger, setLoggerConfig, shutdownLoggers } from '@stock-bot/logger';
import { Shutdown } from '@stock-bot/shutdown';
import { handlerRegistry } from '@stock-bot/types';
// Local imports
import { createRoutes } from './routes/create-routes';
import { initializeAllHandlers } from './handlers';
@@ -84,17 +82,17 @@ async function initializeServices() {
ttl: 3600,
},
};
container = createServiceContainer(awilixConfig);
await initializeAwilixServices(container);
logger.info('Awilix container created and initialized');
// Get the service container for handlers
const serviceContainer = container.resolve('serviceContainer');
// Create app with routes
app = new Hono();
// Add CORS middleware
app.use(
'*',
@@ -105,17 +103,17 @@ async function initializeServices() {
credentials: false,
})
);
// Create and mount routes using the service container
const routes = createRoutes(serviceContainer);
app.route('/', routes);
// Initialize handlers with service container from Awilix
logger.debug('Initializing data handlers with Awilix DI pattern...');
// Auto-register all handlers with the service container from Awilix
await initializeAllHandlers(serviceContainer);
logger.info('Data handlers initialized with new DI pattern');
// Create scheduled jobs from registered handlers
@@ -175,10 +173,10 @@ async function initializeServices() {
logger.info('All services initialized successfully');
} catch (error) {
console.error('DETAILED ERROR:', error);
logger.error('Failed to initialize services', {
error: error instanceof Error ? error.message : String(error),
stack: error instanceof Error ? error.stack : undefined,
details: JSON.stringify(error, null, 2),
});
throw error;
}
@@ -236,14 +234,20 @@ shutdown.onShutdownMedium(async () => {
if (container) {
// Disconnect database clients
const mongoClient = container.resolve('mongoClient');
if (mongoClient?.disconnect) {
await mongoClient.disconnect();
}
const postgresClient = container.resolve('postgresClient');
if (postgresClient?.disconnect) {
await postgresClient.disconnect();
}
const questdbClient = container.resolve('questdbClient');
if (questdbClient?.disconnect) {
await questdbClient.disconnect();
}
logger.info('All services disposed successfully');
}
} catch (error) {
@@ -268,4 +272,4 @@ startServer().catch(error => {
process.exit(1);
});
logger.info('Data service startup initiated with improved DI pattern');
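The shutdown hook above guards every `disconnect()` call because a client may be absent or may not expose the method. The pattern can be sketched as a standalone helper (the `MaybeDisconnectable` name is illustrative, not from the repo):

```typescript
// Hypothetical helper extracted from the shutdown hook: each resolved client
// may be missing entirely or may lack a disconnect() method, so every call
// is guarded with optional chaining before awaiting it.
interface MaybeDisconnectable {
  disconnect?: () => Promise<void>;
}

async function disconnectAll(
  clients: Array<MaybeDisconnectable | undefined>
): Promise<number> {
  let closed = 0;
  for (const client of clients) {
    // Skip clients that are undefined or have no disconnect().
    if (client?.disconnect) {
      await client.disconnect();
      closed++;
    }
  }
  return closed;
}
```

Disconnecting sequentially (rather than `Promise.all`) keeps shutdown ordering deterministic, matching the hook's one-client-at-a-time style.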


@@ -1,69 +1,74 @@
/**
* Routes creation with improved DI pattern
*/
import { Hono } from 'hono';
import type { IServiceContainer } from '@stock-bot/handlers';
import { exchangeRoutes } from './exchange.routes';
import { healthRoutes } from './health.routes';
import { queueRoutes } from './queue.routes';
/**
* Creates all routes with access to type-safe services
*/
export function createRoutes(services: IServiceContainer): Hono {
const app = new Hono();
// Mount routes that don't need services
app.route('/health', healthRoutes);
// Mount routes that need services (will be updated to use services)
app.route('/api/exchanges', exchangeRoutes);
app.route('/api/queue', queueRoutes);
// Store services in app context for handlers that need it
app.use('*', async (c, next) => {
c.set('services', services);
await next();
});
// Add a new endpoint to test the improved DI
app.get('/api/di-test', async c => {
try {
const services = c.get('services') as IServiceContainer;
// Test MongoDB connection
const mongoStats = services.mongodb?.getPoolMetrics?.() || {
status: services.mongodb ? 'connected' : 'disabled',
};
// Test PostgreSQL connection
const pgConnected = services.postgres?.connected || false;
// Test cache
const cacheReady = services.cache?.isReady() || false;
// Test queue
const queueStats = services.queue?.getGlobalStats() || { status: 'disabled' };
return c.json({
success: true,
message: 'Improved DI pattern is working!',
services: {
mongodb: mongoStats,
postgres: { connected: pgConnected },
cache: { ready: cacheReady },
queue: queueStats,
},
timestamp: new Date().toISOString(),
});
} catch (error) {
const services = c.get('services') as IServiceContainer;
services.logger.error('DI test endpoint failed', { error });
return c.json(
{
success: false,
error: error instanceof Error ? error.message : String(error),
},
500
);
}
});
return app;
}
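The `/api/di-test` endpoint above relies on optional chaining with `||` fallbacks so that a partially-initialized container still produces a useful summary. That logic can be pulled out as a pure function; a sketch under assumed, simplified service shapes (the real `IServiceContainer` is richer):

```typescript
// Shapes assumed for illustration only.
interface PoolMetrics {
  activeConnections: number;
}

interface ServicesLike {
  mongodb?: { getPoolMetrics?: () => PoolMetrics };
  postgres?: { connected: boolean };
  cache?: { isReady: () => boolean };
}

// Every access degrades gracefully when a service is absent or partial:
// missing mongodb reports 'disabled', missing metrics reports 'connected'.
function summarizeServices(services: ServicesLike) {
  return {
    mongodb: services.mongodb?.getPoolMetrics?.() || {
      status: services.mongodb ? 'connected' : 'disabled',
    },
    postgres: { connected: services.postgres?.connected || false },
    cache: { ready: services.cache?.isReady() || false },
  };
}
```

Because the fallbacks live in a pure function, the endpoint body reduces to `c.json({ services: summarizeServices(services), ... })` and the degradation rules are unit-testable without Hono.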


@@ -11,7 +11,7 @@ exchange.get('/', async c => {
return c.json({
status: 'success',
data: [],
message: 'Exchange endpoints will be implemented with database integration',
});
} catch (error) {
logger.error('Failed to get exchanges', { error });
@@ -19,4 +19,4 @@ exchange.get('/', async c => {
}
});
export { exchange as exchangeRoutes };


@@ -10,11 +10,11 @@ queue.get('/status', async c => {
try {
const queueManager = QueueManager.getInstance();
const globalStats = await queueManager.getGlobalStats();
return c.json({
status: 'success',
data: globalStats,
message: 'Queue status retrieved successfully',
});
} catch (error) {
logger.error('Failed to get queue status', { error });
@@ -22,4 +22,4 @@ queue.get('/status', async c => {
}
});
export { queue as queueRoutes };


@@ -37,4 +37,4 @@ export interface IBSymbol {
name?: string;
currency?: string;
// Add other properties as needed
}


@@ -90,4 +90,4 @@ export interface FetchWebShareProxiesResult extends CountableJobResult {
// No payload job types (for operations that don't need input)
export interface NoPayload {
// Empty interface for operations that don't need payload
}


@@ -1,5 +1,5 @@
import { sleep } from '@stock-bot/di';
import { getLogger } from '@stock-bot/logger';
const logger = getLogger('symbol-search-util');


@@ -1,101 +1,103 @@
#!/usr/bin/env bun
/**
* Test script for CEO handler operations
*/
import { initializeServiceConfig } from '@stock-bot/config';
import { createServiceContainer, initializeServices } from '@stock-bot/di';
import { getLogger } from '@stock-bot/logger';
const logger = getLogger('test-ceo-operations');
async function testCeoOperations() {
logger.info('Testing CEO handler operations...');
try {
// Initialize config
const config = initializeServiceConfig();
// Create Awilix container
const awilixConfig = {
redis: {
host: config.database.dragonfly.host,
port: config.database.dragonfly.port,
db: config.database.dragonfly.db,
},
mongodb: {
uri: config.database.mongodb.uri,
database: config.database.mongodb.database,
},
postgres: {
host: config.database.postgres.host,
port: config.database.postgres.port,
database: config.database.postgres.database,
user: config.database.postgres.user,
password: config.database.postgres.password,
},
questdb: {
enabled: false,
host: config.database.questdb.host,
httpPort: config.database.questdb.httpPort,
pgPort: config.database.questdb.pgPort,
influxPort: config.database.questdb.ilpPort,
database: config.database.questdb.database,
},
};
const container = createServiceContainer(awilixConfig);
await initializeServices(container);
const serviceContainer = container.resolve('serviceContainer');
// Import and create CEO handler
const { CeoHandler } = await import('./src/handlers/ceo/ceo.handler');
const ceoHandler = new CeoHandler(serviceContainer);
// Test 1: Check if there are any CEO symbols in the database
logger.info('Checking for existing CEO symbols...');
const collection = serviceContainer.mongodb.collection('ceoSymbols');
const count = await collection.countDocuments();
logger.info(`Found ${count} CEO symbols in database`);
if (count > 0) {
// Test 2: Run process-unique-symbols operation
logger.info('Testing process-unique-symbols operation...');
const result = await ceoHandler.updateUniqueSymbols(undefined, {});
logger.info('Process unique symbols result:', result);
// Test 3: Test individual symbol processing
logger.info('Testing process-individual-symbol operation...');
const sampleSymbol = await collection.findOne({});
if (sampleSymbol) {
const individualResult = await ceoHandler.processIndividualSymbol(
{
ceoId: sampleSymbol.ceoId,
symbol: sampleSymbol.symbol,
exchange: sampleSymbol.exchange,
name: sampleSymbol.name,
},
{}
);
logger.info('Process individual symbol result:', individualResult);
}
} else {
logger.warn('No CEO symbols found. Run the service to populate data first.');
}
// Clean up
await serviceContainer.mongodb.disconnect();
await serviceContainer.postgres.disconnect();
if (serviceContainer.cache) {
await serviceContainer.cache.disconnect();
}
logger.info('Test completed successfully!');
process.exit(0);
} catch (error) {
logger.error('Test failed:', error);
process.exit(1);
}
}
// Run the test
testCeoOperations();


@@ -1,15 +1,15 @@
{
"service": {
"name": "data-pipeline",
"port": 3005,
"host": "0.0.0.0",
"healthCheckPath": "/health",
"metricsPath": "/metrics",
"shutdownTimeout": 30000,
"cors": {
"enabled": true,
"origin": "*",
"credentials": false
}
}
}


@@ -1,27 +1,27 @@
import { MongoDBClient } from '@stock-bot/mongodb';
import { PostgreSQLClient } from '@stock-bot/postgres';
let postgresClient: PostgreSQLClient | null = null;
let mongodbClient: MongoDBClient | null = null;
export function setPostgreSQLClient(client: PostgreSQLClient): void {
postgresClient = client;
}
export function getPostgreSQLClient(): PostgreSQLClient {
if (!postgresClient) {
throw new Error('PostgreSQL client not initialized. Call setPostgreSQLClient first.');
}
return postgresClient;
}
export function setMongoDBClient(client: MongoDBClient): void {
mongodbClient = client;
}
export function getMongoDBClient(): MongoDBClient {
if (!mongodbClient) {
throw new Error('MongoDB client not initialized. Call setMongoDBClient first.');
}
return mongodbClient;
}


@@ -1,58 +1,58 @@
import { getLogger } from '@stock-bot/logger';
import { handlerRegistry, type HandlerConfig, type ScheduledJobConfig } from '@stock-bot/queue';
import { exchangeOperations } from './operations';
const logger = getLogger('exchanges-handler');
const HANDLER_NAME = 'exchanges';
const exchangesHandlerConfig: HandlerConfig = {
concurrency: 1,
maxAttempts: 3,
scheduledJobs: [
{
operation: 'sync-all-exchanges',
cronPattern: '0 0 * * 0', // Weekly on Sunday at midnight
payload: { clearFirst: true },
priority: 10,
immediately: false,
} as ScheduledJobConfig,
{
operation: 'sync-qm-exchanges',
cronPattern: '0 1 * * *', // Daily at 1 AM
payload: {},
priority: 5,
immediately: false,
} as ScheduledJobConfig,
{
operation: 'sync-ib-exchanges',
cronPattern: '0 3 * * *', // Daily at 3 AM
payload: {},
priority: 3,
immediately: false,
} as ScheduledJobConfig,
{
operation: 'sync-qm-provider-mappings',
cronPattern: '0 3 * * *', // Daily at 3 AM
payload: {},
priority: 7,
immediately: false,
} as ScheduledJobConfig,
],
operations: {
'sync-all-exchanges': exchangeOperations.syncAllExchanges,
'sync-qm-exchanges': exchangeOperations.syncQMExchanges,
'sync-ib-exchanges': exchangeOperations.syncIBExchanges,
'sync-qm-provider-mappings': exchangeOperations.syncQMProviderMappings,
'clear-postgresql-data': exchangeOperations.clearPostgreSQLData,
'get-exchange-stats': exchangeOperations.getExchangeStats,
'get-provider-mapping-stats': exchangeOperations.getProviderMappingStats,
'enhanced-sync-status': exchangeOperations['enhanced-sync-status'],
},
};
export function initializeExchangesHandler(): void {
logger.info('Registering exchanges handler...');
handlerRegistry.registerHandler(HANDLER_NAME, exchangesHandlerConfig);
logger.info('Exchanges handler registered successfully');
}


@@ -13,7 +13,7 @@ export async function clearPostgreSQLData(payload: JobPayload): Promise<{
try {
const postgresClient = getPostgreSQLClient();
// Start transaction for atomic operations
await postgresClient.query('BEGIN');
@@ -21,9 +21,7 @@
const exchangeCountResult = await postgresClient.query(
'SELECT COUNT(*) as count FROM exchanges'
);
const symbolCountResult = await postgresClient.query('SELECT COUNT(*) as count FROM symbols');
const mappingCountResult = await postgresClient.query(
'SELECT COUNT(*) as count FROM provider_mappings'
);
@@ -57,4 +55,4 @@ export async function clearPostgreSQLData(payload: JobPayload): Promise<{
logger.error('Failed to clear PostgreSQL data', { error });
throw error;
}
}


@@ -16,11 +16,11 @@ export async function getSyncStatus(payload: JobPayload): Promise<SyncStatus[]>
ORDER BY provider, data_type
`;
const result = await postgresClient.query(query);
logger.info(`Retrieved sync status for ${result.rows.length} entries`);
return result.rows;
} catch (error) {
logger.error('Failed to get sync status', { error });
throw error;
}
}


@@ -18,11 +18,11 @@ export async function getExchangeStats(payload: JobPayload): Promise<any> {
FROM exchanges
`;
const result = await postgresClient.query(query);
logger.info('Retrieved exchange statistics');
return result.rows[0];
} catch (error) {
logger.error('Failed to get exchange statistics', { error });
throw error;
}
}


@@ -1,19 +1,19 @@
import { clearPostgreSQLData } from './clear-postgresql-data.operations';
import { getSyncStatus } from './enhanced-sync-status.operations';
import { getExchangeStats } from './exchange-stats.operations';
import { getProviderMappingStats } from './provider-mapping-stats.operations';
import { syncQMExchanges } from './qm-exchanges.operations';
import { syncAllExchanges } from './sync-all-exchanges.operations';
import { syncIBExchanges } from './sync-ib-exchanges.operations';
import { syncQMProviderMappings } from './sync-qm-provider-mappings.operations';
export const exchangeOperations = {
syncAllExchanges,
syncQMExchanges,
syncIBExchanges,
syncQMProviderMappings,
clearPostgreSQLData,
getExchangeStats,
getProviderMappingStats,
'enhanced-sync-status': getSyncStatus,
};


@@ -22,11 +22,11 @@ export async function getProviderMappingStats(payload: JobPayload): Promise<any>
ORDER BY provider
`;
const result = await postgresClient.query(query);
logger.info('Retrieved provider mapping statistics');
return result.rows;
} catch (error) {
logger.error('Failed to get provider mapping statistics', { error });
throw error;
}
}


@@ -1,102 +1,113 @@
import { getLogger } from '@stock-bot/logger';
import { getMongoDBClient, getPostgreSQLClient } from '../../../clients';
import type { JobPayload } from '../../../types/job-payloads';
const logger = getLogger('sync-qm-exchanges');
export async function syncQMExchanges(
payload: JobPayload
): Promise<{ processed: number; created: number; updated: number }> {
logger.info('Starting QM exchanges sync...');
try {
const mongoClient = getMongoDBClient();
const postgresClient = getPostgreSQLClient();
// 1. Get all QM exchanges from MongoDB
const qmExchanges = await mongoClient.find('qmExchanges', {});
logger.info(`Found ${qmExchanges.length} QM exchanges to process`);
let created = 0;
let updated = 0;
for (const exchange of qmExchanges) {
try {
// 2. Check if exchange exists
const existingExchange = await findExchange(exchange.exchangeCode, postgresClient);
if (existingExchange) {
// Update existing
await updateExchange(existingExchange.id, exchange, postgresClient);
updated++;
} else {
// Create new
await createExchange(exchange, postgresClient);
created++;
}
} catch (error) {
logger.error('Failed to process exchange', { error, exchange: exchange.exchangeCode });
}
}
// 3. Update sync status
await updateSyncStatus('qm', 'exchanges', qmExchanges.length, postgresClient);
const result = { processed: qmExchanges.length, created, updated };
logger.info('QM exchanges sync completed', result);
return result;
} catch (error) {
logger.error('QM exchanges sync failed', { error });
throw error;
}
}
// Helper functions
async function findExchange(exchangeCode: string, postgresClient: any): Promise<any> {
const query = 'SELECT * FROM exchanges WHERE code = $1';
const result = await postgresClient.query(query, [exchangeCode]);
return result.rows[0] || null;
}
async function createExchange(qmExchange: any, postgresClient: any): Promise<void> {
const query = `
INSERT INTO exchanges (code, name, country, currency, visible)
VALUES ($1, $2, $3, $4, $5)
ON CONFLICT (code) DO NOTHING
`;
await postgresClient.query(query, [
qmExchange.exchangeCode || qmExchange.exchange,
qmExchange.exchangeShortName || qmExchange.name,
qmExchange.countryCode || 'US',
'USD', // Default currency, can be improved
true, // New exchanges are visible by default
]);
}
async function updateExchange(
exchangeId: string,
qmExchange: any,
postgresClient: any
): Promise<void> {
const query = `
UPDATE exchanges
SET name = COALESCE($2, name),
country = COALESCE($3, country),
updated_at = NOW()
WHERE id = $1
`;
await postgresClient.query(query, [
exchangeId,
qmExchange.exchangeShortName || qmExchange.name,
qmExchange.countryCode,
]);
}
async function updateSyncStatus(
provider: string,
dataType: string,
count: number,
postgresClient: any
): Promise<void> {
const query = `
UPDATE sync_status
SET last_sync_at = NOW(),
last_sync_count = $3,
sync_errors = NULL,
updated_at = NOW()
WHERE provider = $1 AND data_type = $2
`;
await postgresClient.query(query, [provider, dataType, count]);
}
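The create-or-update loop in `syncQMExchanges` above does three things per row: check existence, branch into create or update, and log-and-continue on failure so one bad row cannot abort the sync. A sketch of that accounting with the database calls abstracted behind callbacks (all names here are illustrative, not from the repo):

```typescript
// Per-item upsert accounting, matching the sync loop's shape.
interface SyncCounts {
  processed: number;
  created: number;
  updated: number;
}

async function upsertAll<T>(
  items: T[],
  exists: (item: T) => Promise<boolean>,
  create: (item: T) => Promise<void>,
  update: (item: T) => Promise<void>
): Promise<SyncCounts> {
  let created = 0;
  let updated = 0;
  for (const item of items) {
    try {
      if (await exists(item)) {
        await update(item);
        updated++;
      } else {
        await create(item);
        created++;
      }
    } catch {
      // Mirror the handler: swallow (log) per-item failures so one
      // bad row does not abort the whole sync run.
    }
  }
  return { processed: items.length, created, updated };
}
```

Note `processed` counts attempts, not successes, which is also how the original reports `qmExchanges.length`.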


@@ -1,266 +1,275 @@
import { getLogger } from '@stock-bot/logger';
import { getMongoDBClient, getPostgreSQLClient } from '../../../clients';
import type { JobPayload, SyncResult } from '../../../types/job-payloads';
const logger = getLogger('enhanced-sync-all-exchanges');
export async function syncAllExchanges(payload: JobPayload): Promise<SyncResult> {
  const clearFirst = payload.clearFirst ?? true; // ?? honors an explicit clearFirst: false; || would always yield true
logger.info('Starting comprehensive exchange sync...', { clearFirst });
const result: SyncResult = {
processed: 0,
created: 0,
updated: 0,
skipped: 0,
errors: 0,
};
try {
const postgresClient = getPostgreSQLClient();
// Clear existing data if requested
if (clearFirst) {
await clearPostgreSQLData(postgresClient);
}
// Start transaction for atomic operations
await postgresClient.query('BEGIN');
// 1. Sync from EOD exchanges (comprehensive global data)
const eodResult = await syncEODExchanges();
mergeResults(result, eodResult);
// 2. Sync from IB exchanges (detailed asset information)
const ibResult = await syncIBExchanges();
mergeResults(result, ibResult);
// 3. Update sync status
await updateSyncStatus('all', 'exchanges', result.processed, postgresClient);
await postgresClient.query('COMMIT');
logger.info('Comprehensive exchange sync completed', result);
return result;
} catch (error) {
const postgresClient = getPostgreSQLClient();
await postgresClient.query('ROLLBACK');
logger.error('Comprehensive exchange sync failed', { error });
throw error;
}
}
async function clearPostgreSQLData(postgresClient: any): Promise<void> {
logger.info('Clearing existing PostgreSQL data...');
// Clear data in correct order (respect foreign keys)
await postgresClient.query('DELETE FROM provider_mappings');
await postgresClient.query('DELETE FROM symbols');
await postgresClient.query('DELETE FROM exchanges');
// Reset sync status
await postgresClient.query(
'UPDATE sync_status SET last_sync_at = NULL, last_sync_count = 0, sync_errors = NULL'
);
logger.info('PostgreSQL data cleared successfully');
}
async function syncEODExchanges(): Promise<SyncResult> {
const mongoClient = getMongoDBClient();
const exchanges = await mongoClient.find('eodExchanges', { active: true });
const result: SyncResult = { processed: 0, created: 0, updated: 0, skipped: 0, errors: 0 };
for (const exchange of exchanges) {
try {
// Create provider exchange mapping for EOD
await createProviderExchangeMapping(
'eod', // provider
exchange.Code,
exchange.Name,
exchange.CountryISO2,
exchange.Currency,
0.95 // very high confidence for EOD data
);
result.processed++;
result.created++; // Count as created mapping
} catch (error) {
logger.error('Failed to process EOD exchange', { error, exchange });
result.errors++;
}
}
return result;
}
async function syncIBExchanges(): Promise<SyncResult> {
const mongoClient = getMongoDBClient();
const exchanges = await mongoClient.find('ibExchanges', {});
const result: SyncResult = { processed: 0, created: 0, updated: 0, skipped: 0, errors: 0 };
for (const exchange of exchanges) {
try {
// Create provider exchange mapping for IB
await createProviderExchangeMapping(
'ib', // provider
exchange.exchange_id,
exchange.name,
exchange.country_code,
'USD', // IB doesn't specify currency, default to USD
0.85 // good confidence for IB data
);
result.processed++;
result.created++; // Count as created mapping
} catch (error) {
logger.error('Failed to process IB exchange', { error, exchange });
result.errors++;
}
}
return result;
}
async function createProviderExchangeMapping(
provider: string,
providerExchangeCode: string,
providerExchangeName: string,
countryCode: string | null,
currency: string | null,
confidence: number
): Promise<void> {
if (!providerExchangeCode) {
return;
}
const postgresClient = getPostgreSQLClient();
// Check if mapping already exists
const existingMapping = await findProviderExchangeMapping(provider, providerExchangeCode);
if (existingMapping) {
// Don't override existing mappings to preserve manual work
return;
}
// Find or create master exchange
const masterExchange = await findOrCreateMasterExchange(
providerExchangeCode,
providerExchangeName,
countryCode,
currency
);
// Create the provider exchange mapping
const query = `
INSERT INTO provider_exchange_mappings
(provider, provider_exchange_code, provider_exchange_name, master_exchange_id,
country_code, currency, confidence, active, auto_mapped)
VALUES ($1, $2, $3, $4, $5, $6, $7, false, true)
ON CONFLICT (provider, provider_exchange_code) DO NOTHING
`;
await postgresClient.query(query, [
provider,
providerExchangeCode,
providerExchangeName,
masterExchange.id,
countryCode,
currency,
confidence,
]);
}
async function findOrCreateMasterExchange(
providerCode: string,
providerName: string,
countryCode: string | null,
currency: string | null
): Promise<any> {
const postgresClient = getPostgreSQLClient();
// First, try to find exact match
let masterExchange = await findExchangeByCode(providerCode);
if (masterExchange) {
return masterExchange;
}
// Try to find by similar codes (basic mapping)
const basicMapping = getBasicExchangeMapping(providerCode);
if (basicMapping) {
masterExchange = await findExchangeByCode(basicMapping);
if (masterExchange) {
return masterExchange;
}
}
// Create new master exchange (inactive by default)
const query = `
INSERT INTO exchanges (code, name, country, currency, active)
VALUES ($1, $2, $3, $4, false)
ON CONFLICT (code) DO UPDATE SET
name = COALESCE(EXCLUDED.name, exchanges.name),
country = COALESCE(EXCLUDED.country, exchanges.country),
currency = COALESCE(EXCLUDED.currency, exchanges.currency)
RETURNING id, code, name, country, currency
`;
const result = await postgresClient.query(query, [
providerCode,
providerName || providerCode,
countryCode || 'US',
currency || 'USD',
]);
return result.rows[0];
}
function getBasicExchangeMapping(providerCode: string): string | null {
const mappings: Record<string, string> = {
NYE: 'NYSE',
NAS: 'NASDAQ',
TO: 'TSX',
LN: 'LSE',
LON: 'LSE',
};
return mappings[providerCode.toUpperCase()] || null;
}
async function findProviderExchangeMapping(provider: string, providerExchangeCode: string): Promise<any> {
const postgresClient = getPostgreSQLClient();
const query = 'SELECT * FROM provider_exchange_mappings WHERE provider = $1 AND provider_exchange_code = $2';
const result = await postgresClient.query(query, [provider, providerExchangeCode]);
return result.rows[0] || null;
}
async function findExchangeByCode(code: string): Promise<any> {
const postgresClient = getPostgreSQLClient();
const query = 'SELECT * FROM exchanges WHERE code = $1';
const result = await postgresClient.query(query, [code]);
return result.rows[0] || null;
}
async function updateSyncStatus(provider: string, dataType: string, count: number, postgresClient: any): Promise<void> {
const query = `
INSERT INTO sync_status (provider, data_type, last_sync_at, last_sync_count, sync_errors)
VALUES ($1, $2, NOW(), $3, NULL)
ON CONFLICT (provider, data_type)
DO UPDATE SET
last_sync_at = NOW(),
last_sync_count = EXCLUDED.last_sync_count,
sync_errors = NULL,
updated_at = NOW()
`;
await postgresClient.query(query, [provider, dataType, count]);
}
function mergeResults(target: SyncResult, source: SyncResult): void {
target.processed += source.processed;
target.created += source.created;
target.updated += source.updated;
target.skipped += source.skipped;
target.errors += source.errors;
}
import { getLogger } from '@stock-bot/logger';
import { getMongoDBClient, getPostgreSQLClient } from '../../../clients';
import type { JobPayload, SyncResult } from '../../../types/job-payloads';
const logger = getLogger('enhanced-sync-all-exchanges');
export async function syncAllExchanges(payload: JobPayload): Promise<SyncResult> {
const clearFirst = payload.clearFirst ?? true; // ?? keeps an explicit false; || would always yield true
logger.info('Starting comprehensive exchange sync...', { clearFirst });
const result: SyncResult = {
processed: 0,
created: 0,
updated: 0,
skipped: 0,
errors: 0,
};
try {
const postgresClient = getPostgreSQLClient();
// Clear existing data if requested
if (clearFirst) {
await clearPostgreSQLData(postgresClient);
}
// Start transaction for atomic operations
await postgresClient.query('BEGIN');
// 1. Sync from EOD exchanges (comprehensive global data)
const eodResult = await syncEODExchanges();
mergeResults(result, eodResult);
// 2. Sync from IB exchanges (detailed asset information)
const ibResult = await syncIBExchanges();
mergeResults(result, ibResult);
// 3. Update sync status
await updateSyncStatus('all', 'exchanges', result.processed, postgresClient);
await postgresClient.query('COMMIT');
logger.info('Comprehensive exchange sync completed', result);
return result;
} catch (error) {
const postgresClient = getPostgreSQLClient();
await postgresClient.query('ROLLBACK');
logger.error('Comprehensive exchange sync failed', { error });
throw error;
}
}
async function clearPostgreSQLData(postgresClient: any): Promise<void> {
logger.info('Clearing existing PostgreSQL data...');
// Clear data in correct order (respect foreign keys)
await postgresClient.query('DELETE FROM provider_mappings');
await postgresClient.query('DELETE FROM symbols');
await postgresClient.query('DELETE FROM exchanges');
// Reset sync status
await postgresClient.query(
'UPDATE sync_status SET last_sync_at = NULL, last_sync_count = 0, sync_errors = NULL'
);
logger.info('PostgreSQL data cleared successfully');
}
async function syncEODExchanges(): Promise<SyncResult> {
const mongoClient = getMongoDBClient();
const exchanges = await mongoClient.find('eodExchanges', { active: true });
const result: SyncResult = { processed: 0, created: 0, updated: 0, skipped: 0, errors: 0 };
for (const exchange of exchanges) {
try {
// Create provider exchange mapping for EOD
await createProviderExchangeMapping(
'eod', // provider
exchange.Code,
exchange.Name,
exchange.CountryISO2,
exchange.Currency,
0.95 // very high confidence for EOD data
);
result.processed++;
result.created++; // Count as created mapping
} catch (error) {
logger.error('Failed to process EOD exchange', { error, exchange });
result.errors++;
}
}
return result;
}
async function syncIBExchanges(): Promise<SyncResult> {
const mongoClient = getMongoDBClient();
const exchanges = await mongoClient.find('ibExchanges', {});
const result: SyncResult = { processed: 0, created: 0, updated: 0, skipped: 0, errors: 0 };
for (const exchange of exchanges) {
try {
// Create provider exchange mapping for IB
await createProviderExchangeMapping(
'ib', // provider
exchange.exchange_id,
exchange.name,
exchange.country_code,
'USD', // IB doesn't specify currency, default to USD
0.85 // good confidence for IB data
);
result.processed++;
result.created++; // Count as created mapping
} catch (error) {
logger.error('Failed to process IB exchange', { error, exchange });
result.errors++;
}
}
return result;
}
async function createProviderExchangeMapping(
provider: string,
providerExchangeCode: string,
providerExchangeName: string,
countryCode: string | null,
currency: string | null,
confidence: number
): Promise<void> {
if (!providerExchangeCode) {
return;
}
const postgresClient = getPostgreSQLClient();
// Check if mapping already exists
const existingMapping = await findProviderExchangeMapping(provider, providerExchangeCode);
if (existingMapping) {
// Don't override existing mappings to preserve manual work
return;
}
// Find or create master exchange
const masterExchange = await findOrCreateMasterExchange(
providerExchangeCode,
providerExchangeName,
countryCode,
currency
);
// Create the provider exchange mapping
const query = `
INSERT INTO provider_exchange_mappings
(provider, provider_exchange_code, provider_exchange_name, master_exchange_id,
country_code, currency, confidence, active, auto_mapped)
VALUES ($1, $2, $3, $4, $5, $6, $7, false, true)
ON CONFLICT (provider, provider_exchange_code) DO NOTHING
`;
await postgresClient.query(query, [
provider,
providerExchangeCode,
providerExchangeName,
masterExchange.id,
countryCode,
currency,
confidence,
]);
}
async function findOrCreateMasterExchange(
providerCode: string,
providerName: string,
countryCode: string | null,
currency: string | null
): Promise<any> {
const postgresClient = getPostgreSQLClient();
// First, try to find exact match
let masterExchange = await findExchangeByCode(providerCode);
if (masterExchange) {
return masterExchange;
}
// Try to find by similar codes (basic mapping)
const basicMapping = getBasicExchangeMapping(providerCode);
if (basicMapping) {
masterExchange = await findExchangeByCode(basicMapping);
if (masterExchange) {
return masterExchange;
}
}
// Create new master exchange (inactive by default)
const query = `
INSERT INTO exchanges (code, name, country, currency, active)
VALUES ($1, $2, $3, $4, false)
ON CONFLICT (code) DO UPDATE SET
name = COALESCE(EXCLUDED.name, exchanges.name),
country = COALESCE(EXCLUDED.country, exchanges.country),
currency = COALESCE(EXCLUDED.currency, exchanges.currency)
RETURNING id, code, name, country, currency
`;
const result = await postgresClient.query(query, [
providerCode,
providerName || providerCode,
countryCode || 'US',
currency || 'USD',
]);
return result.rows[0];
}
function getBasicExchangeMapping(providerCode: string): string | null {
const mappings: Record<string, string> = {
NYE: 'NYSE',
NAS: 'NASDAQ',
TO: 'TSX',
LN: 'LSE',
LON: 'LSE',
};
return mappings[providerCode.toUpperCase()] || null;
}
async function findProviderExchangeMapping(
provider: string,
providerExchangeCode: string
): Promise<any> {
const postgresClient = getPostgreSQLClient();
const query =
'SELECT * FROM provider_exchange_mappings WHERE provider = $1 AND provider_exchange_code = $2';
const result = await postgresClient.query(query, [provider, providerExchangeCode]);
return result.rows[0] || null;
}
async function findExchangeByCode(code: string): Promise<any> {
const postgresClient = getPostgreSQLClient();
const query = 'SELECT * FROM exchanges WHERE code = $1';
const result = await postgresClient.query(query, [code]);
return result.rows[0] || null;
}
async function updateSyncStatus(
provider: string,
dataType: string,
count: number,
postgresClient: any
): Promise<void> {
const query = `
INSERT INTO sync_status (provider, data_type, last_sync_at, last_sync_count, sync_errors)
VALUES ($1, $2, NOW(), $3, NULL)
ON CONFLICT (provider, data_type)
DO UPDATE SET
last_sync_at = NOW(),
last_sync_count = EXCLUDED.last_sync_count,
sync_errors = NULL,
updated_at = NOW()
`;
await postgresClient.query(query, [provider, dataType, count]);
}
function mergeResults(target: SyncResult, source: SyncResult): void {
target.processed += source.processed;
target.created += source.created;
target.updated += source.updated;
target.skipped += source.skipped;
target.errors += source.errors;
}
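A note on the `clearFirst` default above: `payload.clearFirst || true` can never evaluate to `false`, so an explicit opt-out is silently ignored, while the nullish-coalescing operator `??` only falls back when the value is `null` or `undefined`. A minimal sketch of the difference (the `JobPayloadLike` shape here is illustrative, not the real `JobPayload`):

```typescript
// Illustrative shape; the real JobPayload lives in types/job-payloads.
interface JobPayloadLike {
  clearFirst?: boolean;
}

// `??` falls back only on null/undefined, so an explicit `false` survives.
function resolveClearFirst(payload: JobPayloadLike): boolean {
  return payload.clearFirst ?? true;
}
```

With `||` instead of `??`, the second assertion below would fail, and a caller could never skip the destructive `clearPostgreSQLData` step.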


@@ -1,206 +1,208 @@
import { getLogger } from '@stock-bot/logger';
import { getMongoDBClient } from '../../../clients';
import type { JobPayload } from '../../../types/job-payloads';
import type { MasterExchange } from '@stock-bot/mongodb';
const logger = getLogger('sync-ib-exchanges');
interface IBExchange {
id?: string;
_id?: any;
name?: string;
code?: string;
country_code?: string;
currency?: string;
}
export async function syncIBExchanges(payload: JobPayload): Promise<{ syncedCount: number; totalExchanges: number }> {
logger.info('Syncing IB exchanges from database...');
try {
const mongoClient = getMongoDBClient();
const db = mongoClient.getDatabase();
// Filter by country code US and CA
const ibExchanges = await db
.collection<IBExchange>('ibExchanges')
.find({
country_code: { $in: ['US', 'CA'] },
})
.toArray();
logger.info('Found IB exchanges in database', { count: ibExchanges.length });
let syncedCount = 0;
for (const exchange of ibExchanges) {
try {
await createOrUpdateMasterExchange(exchange);
syncedCount++;
logger.debug('Synced IB exchange', {
ibId: exchange.id,
country: exchange.country_code,
});
} catch (error) {
logger.error('Failed to sync IB exchange', { exchange: exchange.id, error });
}
}
logger.info('IB exchange sync completed', {
syncedCount,
totalExchanges: ibExchanges.length,
});
return { syncedCount, totalExchanges: ibExchanges.length };
} catch (error) {
logger.error('Failed to fetch IB exchanges from database', { error });
return { syncedCount: 0, totalExchanges: 0 };
}
}
/**
* Create or update master exchange record 1:1 from IB exchange
*/
async function createOrUpdateMasterExchange(ibExchange: IBExchange): Promise<void> {
const mongoClient = getMongoDBClient();
const db = mongoClient.getDatabase();
const collection = db.collection<MasterExchange>('masterExchanges');
const masterExchangeId = generateMasterExchangeId(ibExchange);
const now = new Date();
// Check if master exchange already exists
const existing = await collection.findOne({ masterExchangeId });
if (existing) {
// Update existing record
await collection.updateOne(
{ masterExchangeId },
{
$set: {
officialName: ibExchange.name || `Exchange ${ibExchange.id}`,
country: ibExchange.country_code || 'UNKNOWN',
currency: ibExchange.currency || 'USD',
timezone: inferTimezone(ibExchange),
updated_at: now,
},
}
);
logger.debug('Updated existing master exchange', { masterExchangeId });
} else {
// Create new master exchange
const masterExchange: MasterExchange = {
masterExchangeId,
shortName: masterExchangeId, // Set shortName to masterExchangeId on creation
officialName: ibExchange.name || `Exchange ${ibExchange.id}`,
country: ibExchange.country_code || 'UNKNOWN',
currency: ibExchange.currency || 'USD',
timezone: inferTimezone(ibExchange),
active: false, // Set active to false only on creation
sourceMappings: {
ib: {
id: ibExchange.id || ibExchange._id?.toString() || 'unknown',
name: ibExchange.name || `Exchange ${ibExchange.id}`,
code: ibExchange.code || ibExchange.id || '',
aliases: generateAliases(ibExchange),
lastUpdated: now,
},
},
confidence: 1.0, // High confidence for direct IB mapping
verified: true, // Mark as verified since it's direct from IB
// DocumentBase fields
source: 'ib-exchange-sync',
created_at: now,
updated_at: now,
};
await collection.insertOne(masterExchange);
logger.debug('Created new master exchange', { masterExchangeId });
}
}
/**
* Generate master exchange ID from IB exchange
*/
function generateMasterExchangeId(ibExchange: IBExchange): string {
// Use code if available, otherwise use ID, otherwise generate from name
if (ibExchange.code) {
return ibExchange.code.toUpperCase().replace(/[^A-Z0-9]/g, '');
}
if (ibExchange.id) {
return ibExchange.id.toUpperCase().replace(/[^A-Z0-9]/g, '');
}
if (ibExchange.name) {
return ibExchange.name
.toUpperCase()
.split(' ')
.slice(0, 2)
.join('_')
.replace(/[^A-Z0-9_]/g, '');
}
return 'UNKNOWN_EXCHANGE';
}
/**
* Generate aliases for the exchange
*/
function generateAliases(ibExchange: IBExchange): string[] {
const aliases: string[] = [];
if (ibExchange.name && ibExchange.name.includes(' ')) {
// Add abbreviated version
aliases.push(
ibExchange.name
.split(' ')
.map(w => w[0])
.join('')
.toUpperCase()
);
}
if (ibExchange.code) {
aliases.push(ibExchange.code.toUpperCase());
}
return aliases;
}
/**
* Infer timezone from exchange name/location
*/
function inferTimezone(ibExchange: IBExchange): string {
if (!ibExchange.name) {
return 'UTC';
}
const name = ibExchange.name.toUpperCase();
if (name.includes('NEW YORK') || name.includes('NYSE') || name.includes('NASDAQ')) {
return 'America/New_York';
}
if (name.includes('LONDON')) {
return 'Europe/London';
}
if (name.includes('TOKYO')) {
return 'Asia/Tokyo';
}
if (name.includes('SHANGHAI')) {
return 'Asia/Shanghai';
}
if (name.includes('TORONTO')) {
return 'America/Toronto';
}
if (name.includes('FRANKFURT')) {
return 'Europe/Berlin';
}
return 'UTC'; // Default
}
import { getLogger } from '@stock-bot/logger';
import type { MasterExchange } from '@stock-bot/mongodb';
import { getMongoDBClient } from '../../../clients';
import type { JobPayload } from '../../../types/job-payloads';
const logger = getLogger('sync-ib-exchanges');
interface IBExchange {
id?: string;
_id?: any;
name?: string;
code?: string;
country_code?: string;
currency?: string;
}
export async function syncIBExchanges(
_payload: JobPayload // unused; underscore-prefixed per the repo's ESLint rule
): Promise<{ syncedCount: number; totalExchanges: number }> {
logger.info('Syncing IB exchanges from database...');
try {
const mongoClient = getMongoDBClient();
const db = mongoClient.getDatabase();
// Filter by country code US and CA
const ibExchanges = await db
.collection<IBExchange>('ibExchanges')
.find({
country_code: { $in: ['US', 'CA'] },
})
.toArray();
logger.info('Found IB exchanges in database', { count: ibExchanges.length });
let syncedCount = 0;
for (const exchange of ibExchanges) {
try {
await createOrUpdateMasterExchange(exchange);
syncedCount++;
logger.debug('Synced IB exchange', {
ibId: exchange.id,
country: exchange.country_code,
});
} catch (error) {
logger.error('Failed to sync IB exchange', { exchange: exchange.id, error });
}
}
logger.info('IB exchange sync completed', {
syncedCount,
totalExchanges: ibExchanges.length,
});
return { syncedCount, totalExchanges: ibExchanges.length };
} catch (error) {
logger.error('Failed to fetch IB exchanges from database', { error });
return { syncedCount: 0, totalExchanges: 0 };
}
}
/**
* Create or update master exchange record 1:1 from IB exchange
*/
async function createOrUpdateMasterExchange(ibExchange: IBExchange): Promise<void> {
const mongoClient = getMongoDBClient();
const db = mongoClient.getDatabase();
const collection = db.collection<MasterExchange>('masterExchanges');
const masterExchangeId = generateMasterExchangeId(ibExchange);
const now = new Date();
// Check if master exchange already exists
const existing = await collection.findOne({ masterExchangeId });
if (existing) {
// Update existing record
await collection.updateOne(
{ masterExchangeId },
{
$set: {
officialName: ibExchange.name || `Exchange ${ibExchange.id}`,
country: ibExchange.country_code || 'UNKNOWN',
currency: ibExchange.currency || 'USD',
timezone: inferTimezone(ibExchange),
updated_at: now,
},
}
);
logger.debug('Updated existing master exchange', { masterExchangeId });
} else {
// Create new master exchange
const masterExchange: MasterExchange = {
masterExchangeId,
shortName: masterExchangeId, // Set shortName to masterExchangeId on creation
officialName: ibExchange.name || `Exchange ${ibExchange.id}`,
country: ibExchange.country_code || 'UNKNOWN',
currency: ibExchange.currency || 'USD',
timezone: inferTimezone(ibExchange),
active: false, // Set active to false only on creation
sourceMappings: {
ib: {
id: ibExchange.id || ibExchange._id?.toString() || 'unknown',
name: ibExchange.name || `Exchange ${ibExchange.id}`,
code: ibExchange.code || ibExchange.id || '',
aliases: generateAliases(ibExchange),
lastUpdated: now,
},
},
confidence: 1.0, // High confidence for direct IB mapping
verified: true, // Mark as verified since it's direct from IB
// DocumentBase fields
source: 'ib-exchange-sync',
created_at: now,
updated_at: now,
};
await collection.insertOne(masterExchange);
logger.debug('Created new master exchange', { masterExchangeId });
}
}
/**
* Generate master exchange ID from IB exchange
*/
function generateMasterExchangeId(ibExchange: IBExchange): string {
// Use code if available, otherwise use ID, otherwise generate from name
if (ibExchange.code) {
return ibExchange.code.toUpperCase().replace(/[^A-Z0-9]/g, '');
}
if (ibExchange.id) {
return ibExchange.id.toUpperCase().replace(/[^A-Z0-9]/g, '');
}
if (ibExchange.name) {
return ibExchange.name
.toUpperCase()
.split(' ')
.slice(0, 2)
.join('_')
.replace(/[^A-Z0-9_]/g, '');
}
return 'UNKNOWN_EXCHANGE';
}
/**
* Generate aliases for the exchange
*/
function generateAliases(ibExchange: IBExchange): string[] {
const aliases: string[] = [];
if (ibExchange.name && ibExchange.name.includes(' ')) {
// Add abbreviated version
aliases.push(
ibExchange.name
.split(' ')
.map(w => w[0])
.join('')
.toUpperCase()
);
}
if (ibExchange.code) {
aliases.push(ibExchange.code.toUpperCase());
}
return aliases;
}
/**
* Infer timezone from exchange name/location
*/
function inferTimezone(ibExchange: IBExchange): string {
if (!ibExchange.name) {
return 'UTC';
}
const name = ibExchange.name.toUpperCase();
if (name.includes('NEW YORK') || name.includes('NYSE') || name.includes('NASDAQ')) {
return 'America/New_York';
}
if (name.includes('LONDON')) {
return 'Europe/London';
}
if (name.includes('TOKYO')) {
return 'Asia/Tokyo';
}
if (name.includes('SHANGHAI')) {
return 'Asia/Shanghai';
}
if (name.includes('TORONTO')) {
return 'America/Toronto';
}
if (name.includes('FRANKFURT')) {
return 'Europe/Berlin';
}
return 'UTC'; // Default
}
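The ID generation in `generateMasterExchangeId` above boils down to two pure string transforms, sketched standalone here (the function names are illustrative, not part of the job module):

```typescript
// Code- or id-based path: uppercase, then strip everything outside A-Z0-9.
function normalizeExchangeCode(code: string): string {
  return code.toUpperCase().replace(/[^A-Z0-9]/g, '');
}

// Name-based fallback: first two words, underscore-joined, then sanitized
// (underscores are kept, unlike in the code-based path).
function idFromName(name: string): string {
  return name
    .toUpperCase()
    .split(' ')
    .slice(0, 2)
    .join('_')
    .replace(/[^A-Z0-9_]/g, '');
}
```

Keeping these pure makes the ID scheme trivially unit-testable without a MongoDB connection.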


@@ -1,203 +1,207 @@
import { getLogger } from '@stock-bot/logger';
import { getMongoDBClient, getPostgreSQLClient } from '../../../clients';
import type { JobPayload, SyncResult } from '../../../types/job-payloads';
const logger = getLogger('enhanced-sync-qm-provider-mappings');
export async function syncQMProviderMappings(payload: JobPayload): Promise<SyncResult> {
logger.info('Starting QM provider exchange mappings sync...');
const result: SyncResult = {
processed: 0,
created: 0,
updated: 0,
skipped: 0,
errors: 0,
};
try {
const mongoClient = getMongoDBClient();
const postgresClient = getPostgreSQLClient();
// Start transaction
await postgresClient.query('BEGIN');
// Get unique exchange combinations from QM symbols
const db = mongoClient.getDatabase();
const pipeline = [
{
$group: {
_id: {
exchangeCode: '$exchangeCode',
exchange: '$exchange',
countryCode: '$countryCode',
},
count: { $sum: 1 },
sampleExchange: { $first: '$exchange' },
},
},
{
$project: {
exchangeCode: '$_id.exchangeCode',
exchange: '$_id.exchange',
countryCode: '$_id.countryCode',
count: 1,
sampleExchange: 1,
},
},
];
const qmExchanges = await db.collection('qmSymbols').aggregate(pipeline).toArray();
logger.info(`Found ${qmExchanges.length} unique QM exchange combinations`);
for (const exchange of qmExchanges) {
try {
// Create provider exchange mapping for QM
await createProviderExchangeMapping(
'qm', // provider
exchange.exchangeCode,
exchange.sampleExchange || exchange.exchangeCode,
exchange.countryCode,
exchange.countryCode === 'CA' ? 'CAD' : 'USD', // Simple currency mapping
0.8 // good confidence for QM data
);
result.processed++;
result.created++;
} catch (error) {
logger.error('Failed to process QM exchange mapping', { error, exchange });
result.errors++;
}
}
await postgresClient.query('COMMIT');
logger.info('QM provider exchange mappings sync completed', result);
return result;
} catch (error) {
const postgresClient = getPostgreSQLClient();
await postgresClient.query('ROLLBACK');
logger.error('QM provider exchange mappings sync failed', { error });
throw error;
}
}
async function createProviderExchangeMapping(
provider: string,
providerExchangeCode: string,
providerExchangeName: string,
countryCode: string | null,
currency: string | null,
confidence: number
): Promise<void> {
if (!providerExchangeCode) {
return;
}
const postgresClient = getPostgreSQLClient();
// Check if mapping already exists
const existingMapping = await findProviderExchangeMapping(provider, providerExchangeCode);
if (existingMapping) {
// Don't override existing mappings to preserve manual work
return;
}
// Find or create master exchange
const masterExchange = await findOrCreateMasterExchange(
providerExchangeCode,
providerExchangeName,
countryCode,
currency
);
// Create the provider exchange mapping
const query = `
INSERT INTO provider_exchange_mappings
(provider, provider_exchange_code, provider_exchange_name, master_exchange_id,
country_code, currency, confidence, active, auto_mapped)
VALUES ($1, $2, $3, $4, $5, $6, $7, false, true)
ON CONFLICT (provider, provider_exchange_code) DO NOTHING
`;
await postgresClient.query(query, [
provider,
providerExchangeCode,
providerExchangeName,
masterExchange.id,
countryCode,
currency,
confidence,
]);
}
async function findProviderExchangeMapping(provider: string, providerExchangeCode: string): Promise<any> {
const postgresClient = getPostgreSQLClient();
const query = 'SELECT * FROM provider_exchange_mappings WHERE provider = $1 AND provider_exchange_code = $2';
const result = await postgresClient.query(query, [provider, providerExchangeCode]);
return result.rows[0] || null;
}
async function findOrCreateMasterExchange(
providerCode: string,
providerName: string,
countryCode: string | null,
currency: string | null
): Promise<any> {
const postgresClient = getPostgreSQLClient();
// First, try to find exact match
let masterExchange = await findExchangeByCode(providerCode);
if (masterExchange) {
return masterExchange;
}
// Try to find by similar codes (basic mapping)
const basicMapping = getBasicExchangeMapping(providerCode);
if (basicMapping) {
masterExchange = await findExchangeByCode(basicMapping);
if (masterExchange) {
return masterExchange;
}
}
// Create new master exchange (inactive by default)
const query = `
INSERT INTO exchanges (code, name, country, currency, active)
VALUES ($1, $2, $3, $4, false)
ON CONFLICT (code) DO UPDATE SET
name = COALESCE(EXCLUDED.name, exchanges.name),
country = COALESCE(EXCLUDED.country, exchanges.country),
currency = COALESCE(EXCLUDED.currency, exchanges.currency)
RETURNING id, code, name, country, currency
`;
const result = await postgresClient.query(query, [
providerCode,
providerName || providerCode,
countryCode || 'US',
currency || 'USD',
]);
return result.rows[0];
}
function getBasicExchangeMapping(providerCode: string): string | null {
const mappings: Record<string, string> = {
NYE: 'NYSE',
NAS: 'NASDAQ',
TO: 'TSX',
LN: 'LSE',
LON: 'LSE',
};
return mappings[providerCode.toUpperCase()] || null;
}
async function findExchangeByCode(code: string): Promise<any> {
const postgresClient = getPostgreSQLClient();
const query = 'SELECT * FROM exchanges WHERE code = $1';
const result = await postgresClient.query(query, [code]);
return result.rows[0] || null;
}
import { getLogger } from '@stock-bot/logger';
import { getMongoDBClient, getPostgreSQLClient } from '../../../clients';
import type { JobPayload, SyncResult } from '../../../types/job-payloads';
const logger = getLogger('enhanced-sync-qm-provider-mappings');
export async function syncQMProviderMappings(_payload: JobPayload): Promise<SyncResult> {
logger.info('Starting QM provider exchange mappings sync...');
const result: SyncResult = {
processed: 0,
created: 0,
updated: 0,
skipped: 0,
errors: 0,
};
try {
const mongoClient = getMongoDBClient();
const postgresClient = getPostgreSQLClient();
// Start transaction
await postgresClient.query('BEGIN');
// Get unique exchange combinations from QM symbols
const db = mongoClient.getDatabase();
const pipeline = [
{
$group: {
_id: {
exchangeCode: '$exchangeCode',
exchange: '$exchange',
countryCode: '$countryCode',
},
count: { $sum: 1 },
sampleExchange: { $first: '$exchange' },
},
},
{
$project: {
exchangeCode: '$_id.exchangeCode',
exchange: '$_id.exchange',
countryCode: '$_id.countryCode',
count: 1,
sampleExchange: 1,
},
},
];
const qmExchanges = await db.collection('qmSymbols').aggregate(pipeline).toArray();
logger.info(`Found ${qmExchanges.length} unique QM exchange combinations`);
for (const exchange of qmExchanges) {
try {
// Create provider exchange mapping for QM
await createProviderExchangeMapping(
'qm', // provider
exchange.exchangeCode,
exchange.sampleExchange || exchange.exchangeCode,
exchange.countryCode,
exchange.countryCode === 'CA' ? 'CAD' : 'USD', // Simple currency mapping
0.8 // good confidence for QM data
);
result.processed++;
result.created++;
} catch (error) {
logger.error('Failed to process QM exchange mapping', { error, exchange });
result.errors++;
}
}
await postgresClient.query('COMMIT');
logger.info('QM provider exchange mappings sync completed', result);
return result;
} catch (error) {
const postgresClient = getPostgreSQLClient();
await postgresClient.query('ROLLBACK');
logger.error('QM provider exchange mappings sync failed', { error });
throw error;
}
}
async function createProviderExchangeMapping(
provider: string,
providerExchangeCode: string,
providerExchangeName: string,
countryCode: string | null,
currency: string | null,
confidence: number
): Promise<void> {
if (!providerExchangeCode) {
return;
}
const postgresClient = getPostgreSQLClient();
// Check if mapping already exists
const existingMapping = await findProviderExchangeMapping(provider, providerExchangeCode);
if (existingMapping) {
// Don't override existing mappings to preserve manual work
return;
}
// Find or create master exchange
const masterExchange = await findOrCreateMasterExchange(
providerExchangeCode,
providerExchangeName,
countryCode,
currency
);
// Create the provider exchange mapping
const query = `
INSERT INTO provider_exchange_mappings
(provider, provider_exchange_code, provider_exchange_name, master_exchange_id,
country_code, currency, confidence, active, auto_mapped)
VALUES ($1, $2, $3, $4, $5, $6, $7, false, true)
ON CONFLICT (provider, provider_exchange_code) DO NOTHING
`;
await postgresClient.query(query, [
provider,
providerExchangeCode,
providerExchangeName,
masterExchange.id,
countryCode,
currency,
confidence,
]);
}
async function findProviderExchangeMapping(
provider: string,
providerExchangeCode: string
): Promise<any> {
const postgresClient = getPostgreSQLClient();
const query =
'SELECT * FROM provider_exchange_mappings WHERE provider = $1 AND provider_exchange_code = $2';
const result = await postgresClient.query(query, [provider, providerExchangeCode]);
return result.rows[0] || null;
}
async function findOrCreateMasterExchange(
providerCode: string,
providerName: string,
countryCode: string | null,
currency: string | null
): Promise<any> {
const postgresClient = getPostgreSQLClient();
// First, try to find exact match
let masterExchange = await findExchangeByCode(providerCode);
if (masterExchange) {
return masterExchange;
}
// Try to find by similar codes (basic mapping)
const basicMapping = getBasicExchangeMapping(providerCode);
if (basicMapping) {
masterExchange = await findExchangeByCode(basicMapping);
if (masterExchange) {
return masterExchange;
}
}
// Create new master exchange (inactive by default)
const query = `
INSERT INTO exchanges (code, name, country, currency, active)
VALUES ($1, $2, $3, $4, false)
ON CONFLICT (code) DO UPDATE SET
name = COALESCE(EXCLUDED.name, exchanges.name),
country = COALESCE(EXCLUDED.country, exchanges.country),
currency = COALESCE(EXCLUDED.currency, exchanges.currency)
RETURNING id, code, name, country, currency
`;
const result = await postgresClient.query(query, [
providerCode,
providerName || providerCode,
countryCode || 'US',
currency || 'USD',
]);
return result.rows[0];
}
function getBasicExchangeMapping(providerCode: string): string | null {
const mappings: Record<string, string> = {
NYE: 'NYSE',
NAS: 'NASDAQ',
TO: 'TSX',
LN: 'LSE',
LON: 'LSE',
};
return mappings[providerCode.toUpperCase()] || null;
}
async function findExchangeByCode(code: string): Promise<any> {
const postgresClient = getPostgreSQLClient();
const query = 'SELECT * FROM exchanges WHERE code = $1';
const result = await postgresClient.query(query, [code]);
return result.rows[0] || null;
}
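Taken on its own, the fallback table in `getBasicExchangeMapping` can be exercised without a database; this sketch copies the mapping from the function above (the table contents are the only data carried over, nothing else is assumed):

```typescript
// Last-resort mapping from provider-specific exchange codes to master codes,
// copied from getBasicExchangeMapping in the diff above.
const basicMappings: Record<string, string> = {
  NYE: 'NYSE',
  NAS: 'NASDAQ',
  TO: 'TSX',
  LN: 'LSE',
  LON: 'LSE',
};

function getBasicExchangeMapping(providerCode: string): string | null {
  // Case-insensitive lookup; unknown codes fall through to null,
  // which makes findOrCreateMasterExchange create a new inactive exchange.
  return basicMappings[providerCode.toUpperCase()] ?? null;
}
```

Unknown codes deliberately return `null` rather than throwing, so the caller can fall back to creating a new master exchange.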


@@ -1,9 +1,9 @@
import { syncQMSymbols } from './qm-symbols.operations';
import { syncSymbolsFromProvider } from './sync-symbols-from-provider.operations';
import { getSyncStatus } from './sync-status.operations';
export const symbolOperations = {
syncQMSymbols,
syncSymbolsFromProvider,
getSyncStatus,
};
import { syncQMSymbols } from './qm-symbols.operations';
import { getSyncStatus } from './sync-status.operations';
import { syncSymbolsFromProvider } from './sync-symbols-from-provider.operations';
export const symbolOperations = {
syncQMSymbols,
syncSymbolsFromProvider,
getSyncStatus,
};


@@ -1,167 +1,183 @@
import { getLogger } from '@stock-bot/logger';
import { getMongoDBClient, getPostgreSQLClient } from '../../../clients';
import type { JobPayload } from '../../../types/job-payloads';
const logger = getLogger('sync-qm-symbols');
export async function syncQMSymbols(payload: JobPayload): Promise<{ processed: number; created: number; updated: number }> {
logger.info('Starting QM symbols sync...');
try {
const mongoClient = getMongoDBClient();
const postgresClient = getPostgreSQLClient();
// 1. Get all QM symbols from MongoDB
const qmSymbols = await mongoClient.find('qmSymbols', {});
logger.info(`Found ${qmSymbols.length} QM symbols to process`);
let created = 0;
let updated = 0;
for (const symbol of qmSymbols) {
try {
// 2. Resolve exchange
const exchangeId = await resolveExchange(symbol.exchangeCode || symbol.exchange, postgresClient);
if (!exchangeId) {
logger.warn('Unknown exchange, skipping symbol', {
symbol: symbol.symbol,
exchange: symbol.exchangeCode || symbol.exchange,
});
continue;
}
// 3. Check if symbol exists
const existingSymbol = await findSymbol(symbol.symbol, exchangeId, postgresClient);
if (existingSymbol) {
// Update existing
await updateSymbol(existingSymbol.id, symbol, postgresClient);
await upsertProviderMapping(existingSymbol.id, 'qm', symbol, postgresClient);
updated++;
} else {
// Create new
const newSymbolId = await createSymbol(symbol, exchangeId, postgresClient);
await upsertProviderMapping(newSymbolId, 'qm', symbol, postgresClient);
created++;
}
} catch (error) {
logger.error('Failed to process symbol', { error, symbol: symbol.symbol });
}
}
// 4. Update sync status
await updateSyncStatus('qm', 'symbols', qmSymbols.length, postgresClient);
const result = { processed: qmSymbols.length, created, updated };
logger.info('QM symbols sync completed', result);
return result;
} catch (error) {
logger.error('QM symbols sync failed', { error });
throw error;
}
}
// Helper functions
async function resolveExchange(exchangeCode: string, postgresClient: any): Promise<string | null> {
if (!exchangeCode) return null;
// Simple mapping - expand this as needed
const exchangeMap: Record<string, string> = {
NASDAQ: 'NASDAQ',
NYSE: 'NYSE',
TSX: 'TSX',
TSE: 'TSX', // TSE maps to TSX
LSE: 'LSE',
CME: 'CME',
};
const normalizedCode = exchangeMap[exchangeCode.toUpperCase()];
if (!normalizedCode) {
return null;
}
const query = 'SELECT id FROM exchanges WHERE code = $1';
const result = await postgresClient.query(query, [normalizedCode]);
return result.rows[0]?.id || null;
}
async function findSymbol(symbol: string, exchangeId: string, postgresClient: any): Promise<any> {
const query = 'SELECT * FROM symbols WHERE symbol = $1 AND exchange_id = $2';
const result = await postgresClient.query(query, [symbol, exchangeId]);
return result.rows[0] || null;
}
async function createSymbol(qmSymbol: any, exchangeId: string, postgresClient: any): Promise<string> {
const query = `
INSERT INTO symbols (symbol, exchange_id, company_name, country, currency)
VALUES ($1, $2, $3, $4, $5)
RETURNING id
`;
const result = await postgresClient.query(query, [
qmSymbol.symbol,
exchangeId,
qmSymbol.companyName || qmSymbol.name,
qmSymbol.countryCode || 'US',
qmSymbol.currency || 'USD',
]);
return result.rows[0].id;
}
async function updateSymbol(symbolId: string, qmSymbol: any, postgresClient: any): Promise<void> {
const query = `
UPDATE symbols
SET company_name = COALESCE($2, company_name),
country = COALESCE($3, country),
currency = COALESCE($4, currency),
updated_at = NOW()
WHERE id = $1
`;
await postgresClient.query(query, [
symbolId,
qmSymbol.companyName || qmSymbol.name,
qmSymbol.countryCode,
qmSymbol.currency,
]);
}
async function upsertProviderMapping(
symbolId: string,
provider: string,
qmSymbol: any,
postgresClient: any
): Promise<void> {
const query = `
INSERT INTO provider_mappings
(symbol_id, provider, provider_symbol, provider_exchange, last_seen)
VALUES ($1, $2, $3, $4, NOW())
ON CONFLICT (provider, provider_symbol)
DO UPDATE SET
symbol_id = EXCLUDED.symbol_id,
provider_exchange = EXCLUDED.provider_exchange,
last_seen = NOW()
`;
await postgresClient.query(query, [
symbolId,
provider,
qmSymbol.qmSearchCode || qmSymbol.symbol,
qmSymbol.exchangeCode || qmSymbol.exchange,
]);
}
async function updateSyncStatus(provider: string, dataType: string, count: number, postgresClient: any): Promise<void> {
const query = `
UPDATE sync_status
SET last_sync_at = NOW(),
last_sync_count = $3,
sync_errors = NULL,
updated_at = NOW()
WHERE provider = $1 AND data_type = $2
`;
await postgresClient.query(query, [provider, dataType, count]);
}
import { getLogger } from '@stock-bot/logger';
import { getMongoDBClient, getPostgreSQLClient } from '../../../clients';
import type { JobPayload } from '../../../types/job-payloads';
const logger = getLogger('sync-qm-symbols');
export async function syncQMSymbols(
payload: JobPayload
): Promise<{ processed: number; created: number; updated: number }> {
logger.info('Starting QM symbols sync...');
try {
const mongoClient = getMongoDBClient();
const postgresClient = getPostgreSQLClient();
// 1. Get all QM symbols from MongoDB
const qmSymbols = await mongoClient.find('qmSymbols', {});
logger.info(`Found ${qmSymbols.length} QM symbols to process`);
let created = 0;
let updated = 0;
for (const symbol of qmSymbols) {
try {
// 2. Resolve exchange
const exchangeId = await resolveExchange(
symbol.exchangeCode || symbol.exchange,
postgresClient
);
if (!exchangeId) {
logger.warn('Unknown exchange, skipping symbol', {
symbol: symbol.symbol,
exchange: symbol.exchangeCode || symbol.exchange,
});
continue;
}
// 3. Check if symbol exists
const existingSymbol = await findSymbol(symbol.symbol, exchangeId, postgresClient);
if (existingSymbol) {
// Update existing
await updateSymbol(existingSymbol.id, symbol, postgresClient);
await upsertProviderMapping(existingSymbol.id, 'qm', symbol, postgresClient);
updated++;
} else {
// Create new
const newSymbolId = await createSymbol(symbol, exchangeId, postgresClient);
await upsertProviderMapping(newSymbolId, 'qm', symbol, postgresClient);
created++;
}
} catch (error) {
logger.error('Failed to process symbol', { error, symbol: symbol.symbol });
}
}
// 4. Update sync status
await updateSyncStatus('qm', 'symbols', qmSymbols.length, postgresClient);
const result = { processed: qmSymbols.length, created, updated };
logger.info('QM symbols sync completed', result);
return result;
} catch (error) {
logger.error('QM symbols sync failed', { error });
throw error;
}
}
// Helper functions
async function resolveExchange(exchangeCode: string, postgresClient: any): Promise<string | null> {
if (!exchangeCode) {
return null;
}
// Simple mapping - expand this as needed
const exchangeMap: Record<string, string> = {
NASDAQ: 'NASDAQ',
NYSE: 'NYSE',
TSX: 'TSX',
TSE: 'TSX', // TSE maps to TSX
LSE: 'LSE',
CME: 'CME',
};
const normalizedCode = exchangeMap[exchangeCode.toUpperCase()];
if (!normalizedCode) {
return null;
}
const query = 'SELECT id FROM exchanges WHERE code = $1';
const result = await postgresClient.query(query, [normalizedCode]);
return result.rows[0]?.id || null;
}
async function findSymbol(symbol: string, exchangeId: string, postgresClient: any): Promise<any> {
const query = 'SELECT * FROM symbols WHERE symbol = $1 AND exchange_id = $2';
const result = await postgresClient.query(query, [symbol, exchangeId]);
return result.rows[0] || null;
}
async function createSymbol(
qmSymbol: any,
exchangeId: string,
postgresClient: any
): Promise<string> {
const query = `
INSERT INTO symbols (symbol, exchange_id, company_name, country, currency)
VALUES ($1, $2, $3, $4, $5)
RETURNING id
`;
const result = await postgresClient.query(query, [
qmSymbol.symbol,
exchangeId,
qmSymbol.companyName || qmSymbol.name,
qmSymbol.countryCode || 'US',
qmSymbol.currency || 'USD',
]);
return result.rows[0].id;
}
async function updateSymbol(symbolId: string, qmSymbol: any, postgresClient: any): Promise<void> {
const query = `
UPDATE symbols
SET company_name = COALESCE($2, company_name),
country = COALESCE($3, country),
currency = COALESCE($4, currency),
updated_at = NOW()
WHERE id = $1
`;
await postgresClient.query(query, [
symbolId,
qmSymbol.companyName || qmSymbol.name,
qmSymbol.countryCode,
qmSymbol.currency,
]);
}
async function upsertProviderMapping(
symbolId: string,
provider: string,
qmSymbol: any,
postgresClient: any
): Promise<void> {
const query = `
INSERT INTO provider_mappings
(symbol_id, provider, provider_symbol, provider_exchange, last_seen)
VALUES ($1, $2, $3, $4, NOW())
ON CONFLICT (provider, provider_symbol)
DO UPDATE SET
symbol_id = EXCLUDED.symbol_id,
provider_exchange = EXCLUDED.provider_exchange,
last_seen = NOW()
`;
await postgresClient.query(query, [
symbolId,
provider,
qmSymbol.qmSearchCode || qmSymbol.symbol,
qmSymbol.exchangeCode || qmSymbol.exchange,
]);
}
async function updateSyncStatus(
provider: string,
dataType: string,
count: number,
postgresClient: any
): Promise<void> {
const query = `
UPDATE sync_status
SET last_sync_at = NOW(),
last_sync_count = $3,
sync_errors = NULL,
updated_at = NOW()
WHERE provider = $1 AND data_type = $2
`;
await postgresClient.query(query, [provider, dataType, count]);
}
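Before the `exchanges` table lookup, `resolveExchange` normalizes the provider code through a static map; that first step can be sketched standalone (the map is copied from the function above, and the subsequent `SELECT id FROM exchanges` query is intentionally omitted):

```typescript
// Mirrors the in-memory normalization step of resolveExchange above.
const exchangeMap: Record<string, string> = {
  NASDAQ: 'NASDAQ',
  NYSE: 'NYSE',
  TSX: 'TSX',
  TSE: 'TSX', // TSE maps to TSX
  LSE: 'LSE',
  CME: 'CME',
};

function normalizeExchangeCode(exchangeCode: string | undefined): string | null {
  if (!exchangeCode) {
    return null; // symbol rows without an exchange are skipped by the caller
  }
  return exchangeMap[exchangeCode.toUpperCase()] ?? null;
}
```

A `null` result is what triggers the "Unknown exchange, skipping symbol" warning in the sync loop.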


@@ -1,21 +1,21 @@
import { getLogger } from '@stock-bot/logger';
import { getPostgreSQLClient } from '../../../clients';
import type { JobPayload } from '../../../types/job-payloads';
const logger = getLogger('sync-status');
export async function getSyncStatus(payload: JobPayload): Promise<Record<string, unknown>[]> {
logger.info('Getting sync status...');
try {
const postgresClient = getPostgreSQLClient();
const query = 'SELECT * FROM sync_status ORDER BY provider, data_type';
const result = await postgresClient.query(query);
logger.info(`Retrieved sync status for ${result.rows.length} entries`);
return result.rows;
} catch (error) {
logger.error('Failed to get sync status', { error });
throw error;
}
}
import { getLogger } from '@stock-bot/logger';
import { getPostgreSQLClient } from '../../../clients';
import type { JobPayload } from '../../../types/job-payloads';
const logger = getLogger('sync-status');
export async function getSyncStatus(payload: JobPayload): Promise<Record<string, unknown>[]> {
logger.info('Getting sync status...');
try {
const postgresClient = getPostgreSQLClient();
const query = 'SELECT * FROM sync_status ORDER BY provider, data_type';
const result = await postgresClient.query(query);
logger.info(`Retrieved sync status for ${result.rows.length} entries`);
return result.rows;
} catch (error) {
logger.error('Failed to get sync status', { error });
throw error;
}
}


@@ -1,215 +1,231 @@
import { getLogger } from '@stock-bot/logger';
import { getMongoDBClient, getPostgreSQLClient } from '../../../clients';
import type { JobPayload, SyncResult } from '../../../types/job-payloads';
const logger = getLogger('enhanced-sync-symbols-from-provider');
export async function syncSymbolsFromProvider(payload: JobPayload): Promise<SyncResult> {
const provider = payload.provider;
const clearFirst = payload.clearFirst || false;
if (!provider) {
throw new Error('Provider is required in payload');
}
logger.info(`Starting ${provider} symbols sync...`, { clearFirst });
const result: SyncResult = {
processed: 0,
created: 0,
updated: 0,
skipped: 0,
errors: 0,
};
try {
const mongoClient = getMongoDBClient();
const postgresClient = getPostgreSQLClient();
// Clear existing data if requested (only symbols and mappings, keep exchanges)
if (clearFirst) {
await postgresClient.query('BEGIN');
await postgresClient.query('DELETE FROM provider_mappings');
await postgresClient.query('DELETE FROM symbols');
await postgresClient.query('COMMIT');
logger.info('Cleared existing symbols and mappings before sync');
}
// Start transaction
await postgresClient.query('BEGIN');
let symbols: Record<string, unknown>[] = [];
// Get symbols based on provider
const db = mongoClient.getDatabase();
switch (provider.toLowerCase()) {
case 'qm':
symbols = await db.collection('qmSymbols').find({}).toArray();
break;
case 'eod':
symbols = await db.collection('eodSymbols').find({}).toArray();
break;
case 'ib':
symbols = await db.collection('ibSymbols').find({}).toArray();
break;
default:
throw new Error(`Unsupported provider: ${provider}`);
}
logger.info(`Found ${symbols.length} ${provider} symbols to process`);
result.processed = symbols.length;
for (const symbol of symbols) {
try {
await processSingleSymbol(symbol, provider, result);
} catch (error) {
logger.error('Failed to process symbol', {
error,
symbol: symbol.symbol || symbol.code,
provider,
});
result.errors++;
}
}
// Update sync status
await updateSyncStatus(provider, 'symbols', result.processed, postgresClient);
await postgresClient.query('COMMIT');
logger.info(`${provider} symbols sync completed`, result);
return result;
} catch (error) {
const postgresClient = getPostgreSQLClient();
await postgresClient.query('ROLLBACK');
logger.error(`${provider} symbols sync failed`, { error });
throw error;
}
}
async function processSingleSymbol(symbol: any, provider: string, result: SyncResult): Promise<void> {
const symbolCode = symbol.symbol || symbol.code;
const exchangeCode = symbol.exchangeCode || symbol.exchange || symbol.exchange_id;
if (!symbolCode || !exchangeCode) {
result.skipped++;
return;
}
// Find active provider exchange mapping
const providerMapping = await findActiveProviderExchangeMapping(provider, exchangeCode);
if (!providerMapping) {
result.skipped++;
return;
}
// Check if symbol exists
const existingSymbol = await findSymbolByCodeAndExchange(
symbolCode,
providerMapping.master_exchange_id
);
if (existingSymbol) {
await updateSymbol(existingSymbol.id, symbol);
await upsertProviderMapping(existingSymbol.id, provider, symbol);
result.updated++;
} else {
const newSymbolId = await createSymbol(symbol, providerMapping.master_exchange_id);
await upsertProviderMapping(newSymbolId, provider, symbol);
result.created++;
}
}
async function findActiveProviderExchangeMapping(provider: string, providerExchangeCode: string): Promise<any> {
const postgresClient = getPostgreSQLClient();
const query = `
SELECT pem.*, e.code as master_exchange_code
FROM provider_exchange_mappings pem
JOIN exchanges e ON pem.master_exchange_id = e.id
WHERE pem.provider = $1 AND pem.provider_exchange_code = $2 AND pem.active = true
`;
const result = await postgresClient.query(query, [provider, providerExchangeCode]);
return result.rows[0] || null;
}
async function findSymbolByCodeAndExchange(symbol: string, exchangeId: string): Promise<any> {
const postgresClient = getPostgreSQLClient();
const query = 'SELECT * FROM symbols WHERE symbol = $1 AND exchange_id = $2';
const result = await postgresClient.query(query, [symbol, exchangeId]);
return result.rows[0] || null;
}
async function createSymbol(symbol: any, exchangeId: string): Promise<string> {
const postgresClient = getPostgreSQLClient();
const query = `
INSERT INTO symbols (symbol, exchange_id, company_name, country, currency)
VALUES ($1, $2, $3, $4, $5)
RETURNING id
`;
const result = await postgresClient.query(query, [
symbol.symbol || symbol.code,
exchangeId,
symbol.companyName || symbol.name || symbol.company_name,
symbol.countryCode || symbol.country_code || 'US',
symbol.currency || 'USD',
]);
return result.rows[0].id;
}
async function updateSymbol(symbolId: string, symbol: any): Promise<void> {
const postgresClient = getPostgreSQLClient();
const query = `
UPDATE symbols
SET company_name = COALESCE($2, company_name),
country = COALESCE($3, country),
currency = COALESCE($4, currency),
updated_at = NOW()
WHERE id = $1
`;
await postgresClient.query(query, [
symbolId,
symbol.companyName || symbol.name || symbol.company_name,
symbol.countryCode || symbol.country_code,
symbol.currency,
]);
}
async function upsertProviderMapping(symbolId: string, provider: string, symbol: any): Promise<void> {
const postgresClient = getPostgreSQLClient();
const query = `
INSERT INTO provider_mappings
(symbol_id, provider, provider_symbol, provider_exchange, last_seen)
VALUES ($1, $2, $3, $4, NOW())
ON CONFLICT (provider, provider_symbol)
DO UPDATE SET
symbol_id = EXCLUDED.symbol_id,
provider_exchange = EXCLUDED.provider_exchange,
last_seen = NOW()
`;
await postgresClient.query(query, [
symbolId,
provider,
symbol.qmSearchCode || symbol.symbol || symbol.code,
symbol.exchangeCode || symbol.exchange || symbol.exchange_id,
]);
}
async function updateSyncStatus(provider: string, dataType: string, count: number, postgresClient: any): Promise<void> {
const query = `
INSERT INTO sync_status (provider, data_type, last_sync_at, last_sync_count, sync_errors)
VALUES ($1, $2, NOW(), $3, NULL)
ON CONFLICT (provider, data_type)
DO UPDATE SET
last_sync_at = NOW(),
last_sync_count = EXCLUDED.last_sync_count,
sync_errors = NULL,
updated_at = NOW()
`;
await postgresClient.query(query, [provider, dataType, count]);
}
import { getLogger } from '@stock-bot/logger';
import { getMongoDBClient, getPostgreSQLClient } from '../../../clients';
import type { JobPayload, SyncResult } from '../../../types/job-payloads';
const logger = getLogger('enhanced-sync-symbols-from-provider');
export async function syncSymbolsFromProvider(payload: JobPayload): Promise<SyncResult> {
const provider = payload.provider;
const clearFirst = payload.clearFirst || false;
if (!provider) {
throw new Error('Provider is required in payload');
}
logger.info(`Starting ${provider} symbols sync...`, { clearFirst });
const result: SyncResult = {
processed: 0,
created: 0,
updated: 0,
skipped: 0,
errors: 0,
};
try {
const mongoClient = getMongoDBClient();
const postgresClient = getPostgreSQLClient();
// Clear existing data if requested (only symbols and mappings, keep exchanges)
if (clearFirst) {
await postgresClient.query('BEGIN');
await postgresClient.query('DELETE FROM provider_mappings');
await postgresClient.query('DELETE FROM symbols');
await postgresClient.query('COMMIT');
logger.info('Cleared existing symbols and mappings before sync');
}
// Start transaction
await postgresClient.query('BEGIN');
let symbols: Record<string, unknown>[] = [];
// Get symbols based on provider
const db = mongoClient.getDatabase();
switch (provider.toLowerCase()) {
case 'qm':
symbols = await db.collection('qmSymbols').find({}).toArray();
break;
case 'eod':
symbols = await db.collection('eodSymbols').find({}).toArray();
break;
case 'ib':
symbols = await db.collection('ibSymbols').find({}).toArray();
break;
default:
throw new Error(`Unsupported provider: ${provider}`);
}
logger.info(`Found ${symbols.length} ${provider} symbols to process`);
result.processed = symbols.length;
for (const symbol of symbols) {
try {
await processSingleSymbol(symbol, provider, result);
} catch (error) {
logger.error('Failed to process symbol', {
error,
symbol: symbol.symbol || symbol.code,
provider,
});
result.errors++;
}
}
// Update sync status
await updateSyncStatus(provider, 'symbols', result.processed, postgresClient);
await postgresClient.query('COMMIT');
logger.info(`${provider} symbols sync completed`, result);
return result;
} catch (error) {
const postgresClient = getPostgreSQLClient();
await postgresClient.query('ROLLBACK');
logger.error(`${provider} symbols sync failed`, { error });
throw error;
}
}
async function processSingleSymbol(
symbol: any,
provider: string,
result: SyncResult
): Promise<void> {
const symbolCode = symbol.symbol || symbol.code;
const exchangeCode = symbol.exchangeCode || symbol.exchange || symbol.exchange_id;
if (!symbolCode || !exchangeCode) {
result.skipped++;
return;
}
// Find active provider exchange mapping
const providerMapping = await findActiveProviderExchangeMapping(provider, exchangeCode);
if (!providerMapping) {
result.skipped++;
return;
}
// Check if symbol exists
const existingSymbol = await findSymbolByCodeAndExchange(
symbolCode,
providerMapping.master_exchange_id
);
if (existingSymbol) {
await updateSymbol(existingSymbol.id, symbol);
await upsertProviderMapping(existingSymbol.id, provider, symbol);
result.updated++;
} else {
const newSymbolId = await createSymbol(symbol, providerMapping.master_exchange_id);
await upsertProviderMapping(newSymbolId, provider, symbol);
result.created++;
}
}
async function findActiveProviderExchangeMapping(
provider: string,
providerExchangeCode: string
): Promise<any> {
const postgresClient = getPostgreSQLClient();
const query = `
SELECT pem.*, e.code as master_exchange_code
FROM provider_exchange_mappings pem
JOIN exchanges e ON pem.master_exchange_id = e.id
WHERE pem.provider = $1 AND pem.provider_exchange_code = $2 AND pem.active = true
`;
const result = await postgresClient.query(query, [provider, providerExchangeCode]);
return result.rows[0] || null;
}
async function findSymbolByCodeAndExchange(symbol: string, exchangeId: string): Promise<any> {
const postgresClient = getPostgreSQLClient();
const query = 'SELECT * FROM symbols WHERE symbol = $1 AND exchange_id = $2';
const result = await postgresClient.query(query, [symbol, exchangeId]);
return result.rows[0] || null;
}
async function createSymbol(symbol: any, exchangeId: string): Promise<string> {
const postgresClient = getPostgreSQLClient();
const query = `
INSERT INTO symbols (symbol, exchange_id, company_name, country, currency)
VALUES ($1, $2, $3, $4, $5)
RETURNING id
`;
const result = await postgresClient.query(query, [
symbol.symbol || symbol.code,
exchangeId,
symbol.companyName || symbol.name || symbol.company_name,
symbol.countryCode || symbol.country_code || 'US',
symbol.currency || 'USD',
]);
return result.rows[0].id;
}
async function updateSymbol(symbolId: string, symbol: any): Promise<void> {
const postgresClient = getPostgreSQLClient();
const query = `
UPDATE symbols
SET company_name = COALESCE($2, company_name),
country = COALESCE($3, country),
currency = COALESCE($4, currency),
updated_at = NOW()
WHERE id = $1
`;
await postgresClient.query(query, [
symbolId,
symbol.companyName || symbol.name || symbol.company_name,
symbol.countryCode || symbol.country_code,
symbol.currency,
]);
}
async function upsertProviderMapping(
symbolId: string,
provider: string,
symbol: any
): Promise<void> {
const postgresClient = getPostgreSQLClient();
const query = `
INSERT INTO provider_mappings
(symbol_id, provider, provider_symbol, provider_exchange, last_seen)
VALUES ($1, $2, $3, $4, NOW())
ON CONFLICT (provider, provider_symbol)
DO UPDATE SET
symbol_id = EXCLUDED.symbol_id,
provider_exchange = EXCLUDED.provider_exchange,
last_seen = NOW()
`;
await postgresClient.query(query, [
symbolId,
provider,
symbol.qmSearchCode || symbol.symbol || symbol.code,
symbol.exchangeCode || symbol.exchange || symbol.exchange_id,
]);
}
async function updateSyncStatus(
provider: string,
dataType: string,
count: number,
postgresClient: any
): Promise<void> {
const query = `
INSERT INTO sync_status (provider, data_type, last_sync_at, last_sync_count, sync_errors)
VALUES ($1, $2, NOW(), $3, NULL)
ON CONFLICT (provider, data_type)
DO UPDATE SET
last_sync_at = NOW(),
last_sync_count = EXCLUDED.last_sync_count,
sync_errors = NULL,
updated_at = NOW()
`;
await postgresClient.query(query, [provider, dataType, count]);
}
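The created/updated/skipped branching in `processSingleSymbol` can be illustrated with the database replaced by in-memory maps; the control flow mirrors the diff, while the sample data and the `Map`/`Set` stand-ins are hypothetical:

```typescript
interface SyncResult {
  processed: number;
  created: number;
  updated: number;
  skipped: number;
  errors: number;
}

// In-memory stand-ins for provider_exchange_mappings and symbols (hypothetical data).
const activeExchangeMappings = new Map<string, string>([['NAS', 'ex-nasdaq']]);
const existingSymbols = new Set<string>(['AAPL@ex-nasdaq']);

function processSingleSymbol(
  symbol: { symbol?: string; exchangeCode?: string },
  result: SyncResult
): void {
  const symbolCode = symbol.symbol;
  const exchangeCode = symbol.exchangeCode;
  if (!symbolCode || !exchangeCode) {
    result.skipped++; // missing identifiers: cannot map
    return;
  }
  const masterExchangeId = activeExchangeMappings.get(exchangeCode);
  if (!masterExchangeId) {
    result.skipped++; // no active provider exchange mapping
    return;
  }
  if (existingSymbols.has(`${symbolCode}@${masterExchangeId}`)) {
    result.updated++; // symbol already present: update path
  } else {
    result.created++; // new symbol: create path
  }
}

const result: SyncResult = { processed: 0, created: 0, updated: 0, skipped: 0, errors: 0 };
const batch = [
  { symbol: 'AAPL', exchangeCode: 'NAS' }, // exists -> updated
  { symbol: 'MSFT', exchangeCode: 'NAS' }, // new -> created
  { symbol: 'SHOP', exchangeCode: 'TO' }, // no active mapping -> skipped
  { symbol: 'NOEX' }, // missing exchange -> skipped
];
result.processed = batch.length;
for (const s of batch) {
  processSingleSymbol(s, result);
}
```

Note that symbols without an *active* provider exchange mapping are counted as skipped rather than errors, matching the real function's behavior.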


@@ -1,41 +1,41 @@
import { getLogger } from '@stock-bot/logger';
import { handlerRegistry, type HandlerConfig, type ScheduledJobConfig } from '@stock-bot/queue';
import { symbolOperations } from './operations';
const logger = getLogger('symbols-handler');
const HANDLER_NAME = 'symbols';
const symbolsHandlerConfig: HandlerConfig = {
concurrency: 1,
maxAttempts: 3,
scheduledJobs: [
{
operation: 'sync-qm-symbols',
cronPattern: '0 2 * * *', // Daily at 2 AM
payload: {},
priority: 5,
immediately: false,
} as ScheduledJobConfig,
{
operation: 'sync-symbols-qm',
cronPattern: '0 4 * * *', // Daily at 4 AM
payload: { provider: 'qm', clearFirst: false },
priority: 5,
immediately: false,
} as ScheduledJobConfig,
],
operations: {
'sync-qm-symbols': symbolOperations.syncQMSymbols,
'sync-symbols-qm': symbolOperations.syncSymbolsFromProvider,
'sync-symbols-eod': symbolOperations.syncSymbolsFromProvider,
'sync-symbols-ib': symbolOperations.syncSymbolsFromProvider,
'sync-status': symbolOperations.getSyncStatus,
},
};
export function initializeSymbolsHandler(): void {
logger.info('Registering symbols handler...');
handlerRegistry.registerHandler(HANDLER_NAME, symbolsHandlerConfig);
logger.info('Symbols handler registered successfully');
}
import { getLogger } from '@stock-bot/logger';
import { handlerRegistry, type HandlerConfig, type ScheduledJobConfig } from '@stock-bot/queue';
import { symbolOperations } from './operations';
const logger = getLogger('symbols-handler');
const HANDLER_NAME = 'symbols';
const symbolsHandlerConfig: HandlerConfig = {
concurrency: 1,
maxAttempts: 3,
scheduledJobs: [
{
operation: 'sync-qm-symbols',
cronPattern: '0 2 * * *', // Daily at 2 AM
payload: {},
priority: 5,
immediately: false,
} as ScheduledJobConfig,
{
operation: 'sync-symbols-qm',
cronPattern: '0 4 * * *', // Daily at 4 AM
payload: { provider: 'qm', clearFirst: false },
priority: 5,
immediately: false,
} as ScheduledJobConfig,
],
operations: {
'sync-qm-symbols': symbolOperations.syncQMSymbols,
'sync-symbols-qm': symbolOperations.syncSymbolsFromProvider,
'sync-symbols-eod': symbolOperations.syncSymbolsFromProvider,
'sync-symbols-ib': symbolOperations.syncSymbolsFromProvider,
'sync-status': symbolOperations.getSyncStatus,
},
};
export function initializeSymbolsHandler(): void {
logger.info('Registering symbols handler...');
handlerRegistry.registerHandler(HANDLER_NAME, symbolsHandlerConfig);
logger.info('Symbols handler registered successfully');
}


@@ -1,16 +1,16 @@
// Framework imports
import { initializeServiceConfig } from '@stock-bot/config';
import { Hono } from 'hono';
import { cors } from 'hono/cors';
import { initializeServiceConfig } from '@stock-bot/config';
// Library imports
import { getLogger, setLoggerConfig, shutdownLoggers } from '@stock-bot/logger';
import { MongoDBClient } from '@stock-bot/mongodb';
import { PostgreSQLClient } from '@stock-bot/postgres';
import { QueueManager, type QueueManagerConfig } from '@stock-bot/queue';
import { Shutdown } from '@stock-bot/shutdown';
import { setMongoDBClient, setPostgreSQLClient } from './clients';
// Local imports
import { enhancedSyncRoutes, healthRoutes, statsRoutes, syncRoutes } from './routes';
import { setMongoDBClient, setPostgreSQLClient } from './clients';
const config = initializeServiceConfig();
console.log('Data Sync Service Configuration:', JSON.stringify(config, null, 2));
@@ -66,17 +66,20 @@ async function initializeServices() {
// Initialize MongoDB client
logger.debug('Connecting to MongoDB...');
const mongoConfig = databaseConfig.mongodb;
mongoClient = new MongoDBClient({
uri: mongoConfig.uri,
database: mongoConfig.database,
host: mongoConfig.host || 'localhost',
port: mongoConfig.port || 27017,
timeouts: {
connectTimeout: 30000,
socketTimeout: 30000,
serverSelectionTimeout: 5000,
mongoClient = new MongoDBClient(
{
uri: mongoConfig.uri,
database: mongoConfig.database,
host: mongoConfig.host || 'localhost',
port: mongoConfig.port || 27017,
timeouts: {
connectTimeout: 30000,
socketTimeout: 30000,
serverSelectionTimeout: 5000,
},
},
}, logger);
logger
);
await mongoClient.connect();
setMongoDBClient(mongoClient);
logger.info('MongoDB connected');
@@ -84,18 +87,21 @@ async function initializeServices() {
// Initialize PostgreSQL client
logger.debug('Connecting to PostgreSQL...');
const pgConfig = databaseConfig.postgres;
postgresClient = new PostgreSQLClient({
host: pgConfig.host,
port: pgConfig.port,
database: pgConfig.database,
username: pgConfig.user,
password: pgConfig.password,
poolSettings: {
min: 2,
max: pgConfig.poolSize || 10,
idleTimeoutMillis: pgConfig.idleTimeout || 30000,
postgresClient = new PostgreSQLClient(
{
host: pgConfig.host,
port: pgConfig.port,
database: pgConfig.database,
username: pgConfig.user,
password: pgConfig.password,
poolSettings: {
min: 2,
max: pgConfig.poolSize || 10,
idleTimeoutMillis: pgConfig.idleTimeout || 30000,
},
},
}, logger);
logger
);
await postgresClient.connect();
setPostgreSQLClient(postgresClient);
logger.info('PostgreSQL connected');
@@ -124,7 +130,7 @@ async function initializeServices() {
enableDLQ: true,
},
enableScheduledJobs: true,
delayWorkerStart: true, // Prevent workers from starting until all singletons are ready
delayWorkerStart: true, // Prevent workers from starting until all singletons are ready
};
queueManager = QueueManager.getOrInitialize(queueManagerConfig);
@@ -134,10 +140,10 @@ async function initializeServices() {
logger.debug('Initializing sync handlers...');
const { initializeExchangesHandler } = await import('./handlers/exchanges/exchanges.handler');
const { initializeSymbolsHandler } = await import('./handlers/symbols/symbols.handler');
initializeExchangesHandler();
initializeSymbolsHandler();
logger.info('Sync handlers initialized');
// Create scheduled jobs from registered handlers
@@ -271,4 +277,4 @@ startServer().catch(error => {
process.exit(1);
});
logger.info('Data sync service startup initiated');
logger.info('Data sync service startup initiated');


@@ -11,13 +11,13 @@ enhancedSync.post('/exchanges/all', async c => {
const clearFirst = c.req.query('clear') === 'true';
const queueManager = QueueManager.getInstance();
const exchangesQueue = queueManager.getQueue('exchanges');
const job = await exchangesQueue.addJob('sync-all-exchanges', {
handler: 'exchanges',
operation: 'sync-all-exchanges',
payload: { clearFirst },
});
return c.json({ success: true, jobId: job.id, message: 'Enhanced exchange sync job queued' });
} catch (error) {
logger.error('Failed to queue enhanced exchange sync job', { error });
@@ -32,14 +32,18 @@ enhancedSync.post('/provider-mappings/qm', async c => {
try {
const queueManager = QueueManager.getInstance();
const exchangesQueue = queueManager.getQueue('exchanges');
const job = await exchangesQueue.addJob('sync-qm-provider-mappings', {
handler: 'exchanges',
operation: 'sync-qm-provider-mappings',
payload: {},
});
return c.json({ success: true, jobId: job.id, message: 'QM provider mappings sync job queued' });
return c.json({
success: true,
jobId: job.id,
message: 'QM provider mappings sync job queued',
});
} catch (error) {
logger.error('Failed to queue QM provider mappings sync job', { error });
return c.json(
@ -55,13 +59,13 @@ enhancedSync.post('/symbols/:provider', async c => {
const clearFirst = c.req.query('clear') === 'true';
const queueManager = QueueManager.getInstance();
const symbolsQueue = queueManager.getQueue('symbols');
const job = await symbolsQueue.addJob(`sync-symbols-${provider}`, {
handler: 'symbols',
operation: `sync-symbols-${provider}`,
payload: { provider, clearFirst },
});
return c.json({ success: true, jobId: job.id, message: `${provider} symbols sync job queued` });
} catch (error) {
logger.error('Failed to queue enhanced symbol sync job', { error });
@ -77,13 +81,13 @@ enhancedSync.get('/status/enhanced', async c => {
try {
const queueManager = QueueManager.getInstance();
const exchangesQueue = queueManager.getQueue('exchanges');
const job = await exchangesQueue.addJob('enhanced-sync-status', {
handler: 'exchanges',
operation: 'enhanced-sync-status',
payload: {},
});
// Wait for job to complete and return result
const result = await job.waitUntilFinished();
return c.json(result);
@ -93,4 +97,4 @@ enhancedSync.get('/status/enhanced', async c => {
}
});
export { enhancedSync as enhancedSyncRoutes };
export { enhancedSync as enhancedSyncRoutes };


@ -2,4 +2,4 @@
export { healthRoutes } from './health.routes';
export { syncRoutes } from './sync.routes';
export { enhancedSyncRoutes } from './enhanced-sync.routes';
export { statsRoutes } from './stats.routes';
export { statsRoutes } from './stats.routes';


@ -10,13 +10,13 @@ stats.get('/exchanges', async c => {
try {
const queueManager = QueueManager.getInstance();
const exchangesQueue = queueManager.getQueue('exchanges');
const job = await exchangesQueue.addJob('get-exchange-stats', {
handler: 'exchanges',
operation: 'get-exchange-stats',
payload: {},
});
// Wait for job to complete and return result
const result = await job.waitUntilFinished();
return c.json(result);
@ -30,13 +30,13 @@ stats.get('/provider-mappings', async c => {
try {
const queueManager = QueueManager.getInstance();
const exchangesQueue = queueManager.getQueue('exchanges');
const job = await exchangesQueue.addJob('get-provider-mapping-stats', {
handler: 'exchanges',
operation: 'get-provider-mapping-stats',
payload: {},
});
// Wait for job to complete and return result
const result = await job.waitUntilFinished();
return c.json(result);
@ -46,4 +46,4 @@ stats.get('/provider-mappings', async c => {
}
});
export { stats as statsRoutes };
export { stats as statsRoutes };


@ -10,13 +10,13 @@ sync.post('/symbols', async c => {
try {
const queueManager = QueueManager.getInstance();
const symbolsQueue = queueManager.getQueue('symbols');
const job = await symbolsQueue.addJob('sync-qm-symbols', {
handler: 'symbols',
operation: 'sync-qm-symbols',
payload: {},
});
return c.json({ success: true, jobId: job.id, message: 'QM symbols sync job queued' });
} catch (error) {
logger.error('Failed to queue symbol sync job', { error });
@ -31,13 +31,13 @@ sync.post('/exchanges', async c => {
try {
const queueManager = QueueManager.getInstance();
const exchangesQueue = queueManager.getQueue('exchanges');
const job = await exchangesQueue.addJob('sync-qm-exchanges', {
handler: 'exchanges',
operation: 'sync-qm-exchanges',
payload: {},
});
return c.json({ success: true, jobId: job.id, message: 'QM exchanges sync job queued' });
} catch (error) {
logger.error('Failed to queue exchange sync job', { error });
@ -53,13 +53,13 @@ sync.get('/status', async c => {
try {
const queueManager = QueueManager.getInstance();
const symbolsQueue = queueManager.getQueue('symbols');
const job = await symbolsQueue.addJob('sync-status', {
handler: 'symbols',
operation: 'sync-status',
payload: {},
});
// Wait for job to complete and return result
const result = await job.waitUntilFinished();
return c.json(result);
@ -74,13 +74,13 @@ sync.post('/clear', async c => {
try {
const queueManager = QueueManager.getInstance();
const exchangesQueue = queueManager.getQueue('exchanges');
const job = await exchangesQueue.addJob('clear-postgresql-data', {
handler: 'exchanges',
operation: 'clear-postgresql-data',
payload: {},
});
// Wait for job to complete and return result
const result = await job.waitUntilFinished();
return c.json({ success: true, result });
@ -93,4 +93,4 @@ sync.post('/clear', async c => {
}
});
export { sync as syncRoutes };
export { sync as syncRoutes };


@ -1,27 +1,27 @@
export interface JobPayload {
[key: string]: any;
}
export interface SyncResult {
processed: number;
created: number;
updated: number;
skipped: number;
errors: number;
}
export interface SyncStatus {
provider: string;
dataType: string;
lastSyncAt?: Date;
lastSyncCount: number;
syncErrors?: string;
}
export interface ExchangeMapping {
id: string;
code: string;
name: string;
country: string;
currency: string;
}
export interface JobPayload {
[key: string]: any;
}
export interface SyncResult {
processed: number;
created: number;
updated: number;
skipped: number;
errors: number;
}
export interface SyncStatus {
provider: string;
dataType: string;
lastSyncAt?: Date;
lastSyncCount: number;
syncErrors?: string;
}
export interface ExchangeMapping {
id: string;
code: string;
name: string;
country: string;
currency: string;
}


@ -1,15 +1,15 @@
{
"service": {
"name": "web-api",
"port": 4000,
"host": "0.0.0.0",
"healthCheckPath": "/health",
"metricsPath": "/metrics",
"shutdownTimeout": 30000,
"cors": {
"enabled": true,
"origin": ["http://localhost:4200", "http://localhost:3000", "http://localhost:3002"],
"credentials": true
}
}
}
{
"service": {
"name": "web-api",
"port": 4000,
"host": "0.0.0.0",
"healthCheckPath": "/health",
"metricsPath": "/metrics",
"shutdownTimeout": 30000,
"cors": {
"enabled": true,
"origin": ["http://localhost:4200", "http://localhost:3000", "http://localhost:3002"],
"credentials": true
}
}
}


@ -1,27 +1,27 @@
import { PostgreSQLClient } from '@stock-bot/postgres';
import { MongoDBClient } from '@stock-bot/mongodb';
let postgresClient: PostgreSQLClient | null = null;
let mongodbClient: MongoDBClient | null = null;
export function setPostgreSQLClient(client: PostgreSQLClient): void {
postgresClient = client;
}
export function getPostgreSQLClient(): PostgreSQLClient {
if (!postgresClient) {
throw new Error('PostgreSQL client not initialized. Call setPostgreSQLClient first.');
}
return postgresClient;
}
export function setMongoDBClient(client: MongoDBClient): void {
mongodbClient = client;
}
export function getMongoDBClient(): MongoDBClient {
if (!mongodbClient) {
throw new Error('MongoDB client not initialized. Call setMongoDBClient first.');
}
return mongodbClient;
}
import { MongoDBClient } from '@stock-bot/mongodb';
import { PostgreSQLClient } from '@stock-bot/postgres';
let postgresClient: PostgreSQLClient | null = null;
let mongodbClient: MongoDBClient | null = null;
export function setPostgreSQLClient(client: PostgreSQLClient): void {
postgresClient = client;
}
export function getPostgreSQLClient(): PostgreSQLClient {
if (!postgresClient) {
throw new Error('PostgreSQL client not initialized. Call setPostgreSQLClient first.');
}
return postgresClient;
}
export function setMongoDBClient(client: MongoDBClient): void {
mongodbClient = client;
}
export function getMongoDBClient(): MongoDBClient {
if (!mongodbClient) {
throw new Error('MongoDB client not initialized. Call setMongoDBClient first.');
}
return mongodbClient;
}
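The setter/getter pair above is a fail-fast module-level singleton: consumers can never silently receive an unconnected client. A minimal self-contained sketch of the same pattern, using a stand-in `FakeClient` rather than the real `PostgreSQLClient`:

```typescript
// Stand-in for the real client; the actual PostgreSQLClient lives in @stock-bot/postgres.
class FakeClient {
  constructor(public readonly name: string) {}
}

let client: FakeClient | null = null;

function setClient(c: FakeClient): void {
  client = c;
}

function getClient(): FakeClient {
  // Fail fast: accessing before initialization is a programming error.
  if (!client) {
    throw new Error('Client not initialized. Call setClient first.');
  }
  return client;
}

// Accessing before initialization throws...
let threw = false;
try {
  getClient();
} catch {
  threw = true;
}

// ...and after initialization the same instance is returned.
setClient(new FakeClient('pg'));
const resolved = getClient();
```

This is why the service delays worker startup (`delayWorkerStart: true`) until the setters have run.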


@ -77,17 +77,20 @@ async function initializeServices() {
// Initialize MongoDB client
logger.debug('Connecting to MongoDB...');
const mongoConfig = databaseConfig.mongodb;
mongoClient = new MongoDBClient({
uri: mongoConfig.uri,
database: mongoConfig.database,
host: mongoConfig.host,
port: mongoConfig.port,
timeouts: {
connectTimeout: 30000,
socketTimeout: 30000,
serverSelectionTimeout: 5000,
mongoClient = new MongoDBClient(
{
uri: mongoConfig.uri,
database: mongoConfig.database,
host: mongoConfig.host,
port: mongoConfig.port,
timeouts: {
connectTimeout: 30000,
socketTimeout: 30000,
serverSelectionTimeout: 5000,
},
},
}, logger);
logger
);
await mongoClient.connect();
setMongoDBClient(mongoClient);
logger.info('MongoDB connected');
@ -95,18 +98,21 @@ async function initializeServices() {
// Initialize PostgreSQL client
logger.debug('Connecting to PostgreSQL...');
const pgConfig = databaseConfig.postgres;
postgresClient = new PostgreSQLClient({
host: pgConfig.host,
port: pgConfig.port,
database: pgConfig.database,
username: pgConfig.user,
password: pgConfig.password,
poolSettings: {
min: 2,
max: pgConfig.poolSize || 10,
idleTimeoutMillis: pgConfig.idleTimeout || 30000,
postgresClient = new PostgreSQLClient(
{
host: pgConfig.host,
port: pgConfig.port,
database: pgConfig.database,
username: pgConfig.user,
password: pgConfig.password,
poolSettings: {
min: 2,
max: pgConfig.poolSize || 10,
idleTimeoutMillis: pgConfig.idleTimeout || 30000,
},
},
}, logger);
logger
);
await postgresClient.connect();
setPostgreSQLClient(postgresClient);
logger.info('PostgreSQL connected');


@ -4,13 +4,13 @@
import { Hono } from 'hono';
import { getLogger } from '@stock-bot/logger';
import { exchangeService } from '../services/exchange.service';
import { createSuccessResponse, handleError } from '../utils/error-handler';
import {
validateCreateExchange,
validateUpdateExchange,
validateCreateProviderMapping,
validateUpdateExchange,
validateUpdateProviderMapping,
} from '../utils/validation';
import { handleError, createSuccessResponse } from '../utils/error-handler';
const logger = getLogger('exchange-routes');
export const exchangeRoutes = new Hono();
@ -32,19 +32,19 @@ exchangeRoutes.get('/', async c => {
exchangeRoutes.get('/:id', async c => {
const exchangeId = c.req.param('id');
logger.debug('Getting exchange by ID', { exchangeId });
try {
const result = await exchangeService.getExchangeById(exchangeId);
if (!result) {
logger.warn('Exchange not found', { exchangeId });
return c.json(createSuccessResponse(null, 'Exchange not found'), 404);
}
logger.info('Successfully retrieved exchange details', {
exchangeId,
logger.info('Successfully retrieved exchange details', {
exchangeId,
exchangeCode: result.exchange.code,
mappingCount: result.provider_mappings.length
mappingCount: result.provider_mappings.length,
});
return c.json(createSuccessResponse(result));
} catch (error) {
@ -56,25 +56,22 @@ exchangeRoutes.get('/:id', async c => {
// Create new exchange
exchangeRoutes.post('/', async c => {
logger.debug('Creating new exchange');
try {
const body = await c.req.json();
logger.debug('Received exchange creation request', { requestBody: body });
const validatedData = validateCreateExchange(body);
logger.debug('Exchange data validated successfully', { validatedData });
const exchange = await exchangeService.createExchange(validatedData);
logger.info('Exchange created successfully', {
logger.info('Exchange created successfully', {
exchangeId: exchange.id,
code: exchange.code,
name: exchange.name
name: exchange.name,
});
return c.json(
createSuccessResponse(exchange, 'Exchange created successfully'),
201
);
return c.json(createSuccessResponse(exchange, 'Exchange created successfully'), 201);
} catch (error) {
logger.error('Failed to create exchange', { error });
return handleError(c, error, 'to create exchange');
@ -85,32 +82,32 @@ exchangeRoutes.post('/', async c => {
exchangeRoutes.patch('/:id', async c => {
const exchangeId = c.req.param('id');
logger.debug('Updating exchange', { exchangeId });
try {
const body = await c.req.json();
logger.debug('Received exchange update request', { exchangeId, updates: body });
const validatedUpdates = validateUpdateExchange(body);
logger.debug('Exchange update data validated', { exchangeId, validatedUpdates });
const exchange = await exchangeService.updateExchange(exchangeId, validatedUpdates);
if (!exchange) {
logger.warn('Exchange not found for update', { exchangeId });
return c.json(createSuccessResponse(null, 'Exchange not found'), 404);
}
logger.info('Exchange updated successfully', {
logger.info('Exchange updated successfully', {
exchangeId,
code: exchange.code,
updates: validatedUpdates
updates: validatedUpdates,
});
// Log special actions
if (validatedUpdates.visible === false) {
logger.warn('Exchange marked as hidden - provider mappings will be deleted', {
logger.warn('Exchange marked as hidden - provider mappings will be deleted', {
exchangeId,
code: exchange.code
code: exchange.code,
});
}
@ -124,7 +121,7 @@ exchangeRoutes.patch('/:id', async c => {
// Get all provider mappings
exchangeRoutes.get('/provider-mappings/all', async c => {
logger.debug('Getting all provider mappings');
try {
const mappings = await exchangeService.getAllProviderMappings();
logger.info('Successfully retrieved all provider mappings', { count: mappings.length });
@ -139,18 +136,12 @@ exchangeRoutes.get('/provider-mappings/all', async c => {
exchangeRoutes.get('/provider-mappings/:provider', async c => {
const provider = c.req.param('provider');
logger.debug('Getting provider mappings by provider', { provider });
try {
const mappings = await exchangeService.getProviderMappingsByProvider(provider);
logger.info('Successfully retrieved provider mappings', { provider, count: mappings.length });
return c.json(
createSuccessResponse(
mappings,
undefined,
mappings.length
)
);
return c.json(createSuccessResponse(mappings, undefined, mappings.length));
} catch (error) {
logger.error('Failed to get provider mappings', { error, provider });
return handleError(c, error, 'to get provider mappings');
@ -161,26 +152,26 @@ exchangeRoutes.get('/provider-mappings/:provider', async c => {
exchangeRoutes.patch('/provider-mappings/:id', async c => {
const mappingId = c.req.param('id');
logger.debug('Updating provider mapping', { mappingId });
try {
const body = await c.req.json();
logger.debug('Received provider mapping update request', { mappingId, updates: body });
const validatedUpdates = validateUpdateProviderMapping(body);
logger.debug('Provider mapping update data validated', { mappingId, validatedUpdates });
const mapping = await exchangeService.updateProviderMapping(mappingId, validatedUpdates);
if (!mapping) {
logger.warn('Provider mapping not found for update', { mappingId });
return c.json(createSuccessResponse(null, 'Provider mapping not found'), 404);
}
logger.info('Provider mapping updated successfully', {
logger.info('Provider mapping updated successfully', {
mappingId,
provider: mapping.provider,
providerExchangeCode: mapping.provider_exchange_code,
updates: validatedUpdates
updates: validatedUpdates,
});
return c.json(createSuccessResponse(mapping, 'Provider mapping updated successfully'));
@ -193,26 +184,23 @@ exchangeRoutes.patch('/provider-mappings/:id', async c => {
// Create new provider mapping
exchangeRoutes.post('/provider-mappings', async c => {
logger.debug('Creating new provider mapping');
try {
const body = await c.req.json();
logger.debug('Received provider mapping creation request', { requestBody: body });
const validatedData = validateCreateProviderMapping(body);
logger.debug('Provider mapping data validated successfully', { validatedData });
const mapping = await exchangeService.createProviderMapping(validatedData);
logger.info('Provider mapping created successfully', {
logger.info('Provider mapping created successfully', {
mappingId: mapping.id,
provider: mapping.provider,
providerExchangeCode: mapping.provider_exchange_code,
masterExchangeId: mapping.master_exchange_id
masterExchangeId: mapping.master_exchange_id,
});
return c.json(
createSuccessResponse(mapping, 'Provider mapping created successfully'),
201
);
return c.json(createSuccessResponse(mapping, 'Provider mapping created successfully'), 201);
} catch (error) {
logger.error('Failed to create provider mapping', { error });
return handleError(c, error, 'to create provider mapping');
@ -222,7 +210,7 @@ exchangeRoutes.post('/provider-mappings', async c => {
// Get all available providers
exchangeRoutes.get('/providers/list', async c => {
logger.debug('Getting providers list');
try {
const providers = await exchangeService.getProviders();
logger.info('Successfully retrieved providers list', { count: providers.length, providers });
@ -237,21 +225,15 @@ exchangeRoutes.get('/providers/list', async c => {
exchangeRoutes.get('/provider-exchanges/unmapped/:provider', async c => {
const provider = c.req.param('provider');
logger.debug('Getting unmapped provider exchanges', { provider });
try {
const exchanges = await exchangeService.getUnmappedProviderExchanges(provider);
logger.info('Successfully retrieved unmapped provider exchanges', {
provider,
count: exchanges.length
logger.info('Successfully retrieved unmapped provider exchanges', {
provider,
count: exchanges.length,
});
return c.json(
createSuccessResponse(
exchanges,
undefined,
exchanges.length
)
);
return c.json(createSuccessResponse(exchanges, undefined, exchanges.length));
} catch (error) {
logger.error('Failed to get unmapped provider exchanges', { error, provider });
return handleError(c, error, 'to get unmapped provider exchanges');
@ -261,7 +243,7 @@ exchangeRoutes.get('/provider-exchanges/unmapped/:provider', async c => {
// Get exchange statistics
exchangeRoutes.get('/stats/summary', async c => {
logger.debug('Getting exchange statistics');
try {
const stats = await exchangeService.getExchangeStats();
logger.info('Successfully retrieved exchange statistics', { stats });
@ -270,4 +252,4 @@ exchangeRoutes.get('/stats/summary', async c => {
logger.error('Failed to get exchange statistics', { error });
return handleError(c, error, 'to get exchange statistics');
}
});
});


@ -3,7 +3,7 @@
*/
import { Hono } from 'hono';
import { getLogger } from '@stock-bot/logger';
import { getPostgreSQLClient, getMongoDBClient } from '../clients';
import { getMongoDBClient, getPostgreSQLClient } from '../clients';
const logger = getLogger('health-routes');
export const healthRoutes = new Hono();
@ -11,13 +11,13 @@ export const healthRoutes = new Hono();
// Basic health check
healthRoutes.get('/', c => {
logger.debug('Basic health check requested');
const response = {
status: 'healthy',
service: 'web-api',
timestamp: new Date().toISOString(),
};
logger.info('Basic health check successful', { status: response.status });
return c.json(response);
});
@ -25,7 +25,7 @@ healthRoutes.get('/', c => {
// Detailed health check with database connectivity
healthRoutes.get('/detailed', async c => {
logger.debug('Detailed health check requested');
const health = {
status: 'healthy',
service: 'web-api',
@ -80,19 +80,19 @@ healthRoutes.get('/detailed', async c => {
health.status = allHealthy ? 'healthy' : 'unhealthy';
const statusCode = allHealthy ? 200 : 503;
if (allHealthy) {
logger.info('Detailed health check successful - all systems healthy', {
mongodb: health.checks.mongodb.status,
postgresql: health.checks.postgresql.status
postgresql: health.checks.postgresql.status,
});
} else {
logger.warn('Detailed health check failed - some systems unhealthy', {
mongodb: health.checks.mongodb.status,
postgresql: health.checks.postgresql.status,
overallStatus: health.status
overallStatus: health.status,
});
}
return c.json(health, statusCode);
});


@ -1,15 +1,15 @@
import { getLogger } from '@stock-bot/logger';
import { getPostgreSQLClient, getMongoDBClient } from '../clients';
import { getMongoDBClient, getPostgreSQLClient } from '../clients';
import {
Exchange,
ExchangeWithMappings,
ProviderMapping,
CreateExchangeRequest,
UpdateExchangeRequest,
CreateProviderMappingRequest,
UpdateProviderMappingRequest,
ProviderExchange,
Exchange,
ExchangeStats,
ExchangeWithMappings,
ProviderExchange,
ProviderMapping,
UpdateExchangeRequest,
UpdateProviderMappingRequest,
} from '../types/exchange.types';
const logger = getLogger('exchange-service');
@ -18,7 +18,7 @@ export class ExchangeService {
private get postgresClient() {
return getPostgreSQLClient();
}
private get mongoClient() {
return getMongoDBClient();
}
@ -63,14 +63,17 @@ export class ExchangeService {
const mappingsResult = await this.postgresClient.query(mappingsQuery);
// Group mappings by exchange ID
const mappingsByExchange = mappingsResult.rows.reduce((acc, mapping) => {
const exchangeId = mapping.master_exchange_id;
if (!acc[exchangeId]) {
acc[exchangeId] = [];
}
acc[exchangeId].push(mapping);
return acc;
}, {} as Record<string, ProviderMapping[]>);
const mappingsByExchange = mappingsResult.rows.reduce(
(acc, mapping) => {
const exchangeId = mapping.master_exchange_id;
if (!acc[exchangeId]) {
acc[exchangeId] = [];
}
acc[exchangeId].push(mapping);
return acc;
},
{} as Record<string, ProviderMapping[]>
);
// Attach mappings to exchanges
return exchangesResult.rows.map(exchange => ({
@ -79,7 +82,9 @@ export class ExchangeService {
}));
}
async getExchangeById(id: string): Promise<{ exchange: Exchange; provider_mappings: ProviderMapping[] } | null> {
async getExchangeById(
id: string
): Promise<{ exchange: Exchange; provider_mappings: ProviderMapping[] } | null> {
const exchangeQuery = 'SELECT * FROM exchanges WHERE id = $1 AND visible = true';
const exchangeResult = await this.postgresClient.query(exchangeQuery, [id]);
@ -230,7 +235,10 @@ export class ExchangeService {
return result.rows[0];
}
async updateProviderMapping(id: string, updates: UpdateProviderMappingRequest): Promise<ProviderMapping | null> {
async updateProviderMapping(
id: string,
updates: UpdateProviderMappingRequest
): Promise<ProviderMapping | null> {
const updateFields = [];
const values = [];
let paramIndex = 1;
@ -359,7 +367,6 @@ export class ExchangeService {
break;
}
default:
throw new Error(`Unknown provider: ${provider}`);
}
@ -369,4 +376,4 @@ export class ExchangeService {
}
// Export singleton instance
export const exchangeService = new ExchangeService();
export const exchangeService = new ExchangeService();


@ -100,4 +100,4 @@ export interface ApiResponse<T = unknown> {
error?: string;
message?: string;
total?: number;
}
}


@ -1,7 +1,7 @@
import { Context } from 'hono';
import { getLogger } from '@stock-bot/logger';
import { ValidationError } from './validation';
import { ApiResponse } from '../types/exchange.types';
import { ValidationError } from './validation';
const logger = getLogger('error-handler');
@ -61,4 +61,4 @@ export function createSuccessResponse<T>(
}
return response;
}
}


@ -1,7 +1,10 @@
import { CreateExchangeRequest, CreateProviderMappingRequest } from '../types/exchange.types';
export class ValidationError extends Error {
constructor(message: string, public field?: string) {
constructor(
message: string,
public field?: string
) {
super(message);
this.name = 'ValidationError';
}
@ -38,7 +41,10 @@ export function validateCreateExchange(data: unknown): CreateExchangeRequest {
}
if (currency.length !== 3) {
throw new ValidationError('Currency must be exactly 3 characters (e.g., USD, EUR, CAD)', 'currency');
throw new ValidationError(
'Currency must be exactly 3 characters (e.g., USD, EUR, CAD)',
'currency'
);
}
return {
@ -172,4 +178,4 @@ export function validateUpdateProviderMapping(data: unknown): Record<string, unk
}
return updates;
}
}
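The field-aware `ValidationError` above lets route handlers report which input failed. A quick sketch of how it is typically thrown and caught (the `validateCurrency` helper here is illustrative, not from the codebase):

```typescript
class ValidationError extends Error {
  constructor(
    message: string,
    public field?: string
  ) {
    super(message);
    this.name = 'ValidationError';
  }
}

// Illustrative validator in the same shape as validateCreateExchange.
function validateCurrency(currency: string): string {
  if (currency.length !== 3) {
    throw new ValidationError('Currency must be exactly 3 characters', 'currency');
  }
  return currency.toUpperCase();
}

let caught: ValidationError | null = null;
try {
  validateCurrency('DOLLARS');
} catch (err) {
  if (err instanceof ValidationError) {
    caught = err;
  }
}
const normalized = validateCurrency('usd');
```

The `field` property is what `handleError` can surface back to the client alongside the message.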


@ -1,16 +1,16 @@
import { useCallback, useEffect, useState } from 'react';
import { exchangeApi } from '../services/exchangeApi';
import {
CreateExchangeRequest,
CreateProviderMappingRequest,
Exchange,
ExchangeDetails,
ExchangeStats,
ProviderMapping,
ProviderExchange,
CreateExchangeRequest,
ProviderMapping,
UpdateExchangeRequest,
CreateProviderMappingRequest,
UpdateProviderMappingRequest,
} from '../types';
import { exchangeApi } from '../services/exchangeApi';
export function useExchanges() {
const [exchanges, setExchanges] = useState<Exchange[]>([]);
@ -62,18 +62,15 @@ export function useExchanges() {
[fetchExchanges]
);
const fetchExchangeDetails = useCallback(
async (id: string): Promise<ExchangeDetails | null> => {
try {
return await exchangeApi.getExchangeById(id);
} catch (err) {
// Error fetching exchange details - error state will show in UI
setError(err instanceof Error ? err.message : 'Failed to fetch exchange details');
return null;
}
},
[]
);
const fetchExchangeDetails = useCallback(async (id: string): Promise<ExchangeDetails | null> => {
try {
return await exchangeApi.getExchangeById(id);
} catch (err) {
// Error fetching exchange details - error state will show in UI
setError(err instanceof Error ? err.message : 'Failed to fetch exchange details');
return null;
}
}, []);
const fetchStats = useCallback(async (): Promise<ExchangeStats | null> => {
try {


@ -1,22 +1,22 @@
import { useState, useCallback } from 'react';
import { useCallback, useState } from 'react';
import { FormErrors } from '../types';
export function useFormValidation<T>(
initialData: T,
validateFn: (data: T) => FormErrors
) {
export function useFormValidation<T>(initialData: T, validateFn: (data: T) => FormErrors) {
const [formData, setFormData] = useState<T>(initialData);
const [errors, setErrors] = useState<FormErrors>({});
const [isSubmitting, setIsSubmitting] = useState(false);
const updateField = useCallback((field: keyof T, value: T[keyof T]) => {
setFormData(prev => ({ ...prev, [field]: value }));
// Clear error when user starts typing
if (errors[field as string]) {
setErrors(prev => ({ ...prev, [field as string]: '' }));
}
}, [errors]);
const updateField = useCallback(
(field: keyof T, value: T[keyof T]) => {
setFormData(prev => ({ ...prev, [field]: value }));
// Clear error when user starts typing
if (errors[field as string]) {
setErrors(prev => ({ ...prev, [field as string]: '' }));
}
},
[errors]
);
const validate = useCallback((): boolean => {
const newErrors = validateFn(formData);
@ -30,24 +30,29 @@ export function useFormValidation<T>(
setIsSubmitting(false);
}, [initialData]);
const handleSubmit = useCallback(async (
onSubmit: (data: T) => Promise<void>,
onSuccess?: () => void,
onError?: (error: unknown) => void
) => {
if (!validate()) {return;}
const handleSubmit = useCallback(
async (
onSubmit: (data: T) => Promise<void>,
onSuccess?: () => void,
onError?: (error: unknown) => void
) => {
if (!validate()) {
return;
}
setIsSubmitting(true);
try {
await onSubmit(formData);
reset();
onSuccess?.();
} catch (error) {
onError?.(error);
} finally {
setIsSubmitting(false);
}
}, [formData, validate, reset]);
setIsSubmitting(true);
try {
await onSubmit(formData);
reset();
onSuccess?.();
} catch (error) {
onError?.(error);
} finally {
setIsSubmitting(false);
}
},
[formData, validate, reset]
);
return {
formData,
@ -59,4 +64,4 @@ export function useFormValidation<T>(
handleSubmit,
setIsSubmitting,
};
}
}


@ -1,25 +1,22 @@
import {
ApiResponse,
CreateExchangeRequest,
CreateProviderMappingRequest,
Exchange,
ExchangeDetails,
ExchangeStats,
ProviderMapping,
ProviderExchange,
CreateExchangeRequest,
ProviderMapping,
UpdateExchangeRequest,
CreateProviderMappingRequest,
UpdateProviderMappingRequest,
} from '../types';
const API_BASE_URL = import.meta.env.VITE_API_BASE_URL || 'http://localhost:4000/api';
class ExchangeApiService {
private async request<T>(
endpoint: string,
options?: RequestInit
): Promise<ApiResponse<T>> {
private async request<T>(endpoint: string, options?: RequestInit): Promise<ApiResponse<T>> {
const url = `${API_BASE_URL}${endpoint}`;
const response = await fetch(url, {
headers: {
'Content-Type': 'application/json',
@ -33,7 +30,7 @@ class ExchangeApiService {
}
const data = await response.json();
if (!data.success) {
throw new Error(data.error || 'API request failed');
}
@ -76,10 +73,10 @@ class ExchangeApiService {
// Provider Mappings
async getProviderMappings(provider?: string): Promise<ProviderMapping[]> {
const endpoint = provider
const endpoint = provider
? `/exchanges/provider-mappings/${provider}`
: '/exchanges/provider-mappings/all';
const response = await this.request<ProviderMapping[]>(endpoint);
return response.data || [];
}
@ -96,7 +93,7 @@ class ExchangeApiService {
}
async updateProviderMapping(
id: string,
id: string,
data: UpdateProviderMappingRequest
): Promise<ProviderMapping> {
const response = await this.request<ProviderMapping>(`/exchanges/provider-mappings/${id}`, {
@ -132,4 +129,4 @@ class ExchangeApiService {
}
// Export singleton instance
export const exchangeApi = new ExchangeApiService();
export const exchangeApi = new ExchangeApiService();


@ -66,4 +66,4 @@ export interface ExchangeStats {
active_provider_mappings: string;
verified_provider_mappings: string;
providers: string;
}
}


@ -32,7 +32,9 @@ export interface AddExchangeDialogProps extends BaseDialogProps {
export interface AddProviderMappingDialogProps extends BaseDialogProps {
exchangeId: string;
exchangeName: string;
onCreateMapping: (request: import('./request.types').CreateProviderMappingRequest) => Promise<unknown>;
onCreateMapping: (
request: import('./request.types').CreateProviderMappingRequest
) => Promise<unknown>;
}
export interface DeleteExchangeDialogProps extends BaseDialogProps {
@ -40,4 +42,4 @@ export interface DeleteExchangeDialogProps extends BaseDialogProps {
exchangeName: string;
providerMappingCount: number;
onConfirmDelete: (exchangeId: string) => Promise<boolean>;
}
}


@ -32,4 +32,4 @@ export interface UpdateProviderMappingRequest {
verified?: boolean;
confidence?: number;
master_exchange_id?: string;
}
}


@ -21,7 +21,7 @@ export function sortProviderMappings(mappings: ProviderMapping[]): ProviderMappi
if (!a.active && b.active) {
return 1;
}
// Then by provider name
return a.provider.localeCompare(b.provider);
});
@ -32,4 +32,4 @@ export function truncateText(text: string, maxLength: number): string {
return text;
}
return text.substring(0, maxLength) + '...';
}
}


@ -35,4 +35,4 @@ export function validateExchangeForm(data: {
export function hasValidationErrors(errors: FormErrors): boolean {
return Object.keys(errors).length > 0;
}
}


@ -19,7 +19,11 @@ export function formatPercentage(value: number): string {
}
export function getValueColor(value: number): string {
if (value > 0) {return 'text-success';}
if (value < 0) {return 'text-danger';}
if (value > 0) {
return 'text-success';
}
if (value < 0) {
return 'text-danger';
}
return 'text-text-secondary';
}


@ -23,9 +23,15 @@ export function formatPercentage(value: number, decimals = 2): string {
* Format large numbers with K, M, B suffixes
*/
export function formatNumber(num: number): string {
if (num >= 1e9) {return (num / 1e9).toFixed(1) + 'B';}
if (num >= 1e6) {return (num / 1e6).toFixed(1) + 'M';}
if (num >= 1e3) {return (num / 1e3).toFixed(1) + 'K';}
if (num >= 1e9) {
return (num / 1e9).toFixed(1) + 'B';
}
if (num >= 1e6) {
return (num / 1e6).toFixed(1) + 'M';
}
if (num >= 1e3) {
return (num / 1e3).toFixed(1) + 'K';
}
return num.toString();
}
@ -33,8 +39,12 @@ export function formatNumber(num: number): string {
* Get color class based on numeric value (profit/loss)
*/
export function getValueColor(value: number): string {
if (value > 0) {return 'text-success';}
if (value < 0) {return 'text-danger';}
if (value > 0) {
return 'text-success';
}
if (value < 0) {
return 'text-danger';
}
return 'text-text-secondary';
}
@@ -42,6 +52,8 @@ export function getValueColor(value: number): string {
 * Truncate text to specified length
 */
export function truncateText(text: string, length: number): string {
  if (text.length <= length) {
    return text;
  }
  return text.slice(0, length) + '...';
}


@@ -1,148 +0,0 @@
# Enhanced Cache Provider Usage
The Redis cache provider now supports advanced TTL handling and conditional operations.
## Basic Usage (Backward Compatible)
```typescript
import { RedisCache } from '@stock-bot/cache';
const cache = new RedisCache({
keyPrefix: 'trading:',
defaultTTL: 3600 // 1 hour
});
// Simple set with TTL (old way - still works)
await cache.set('user:123', userData, 1800); // 30 minutes
// Simple get
const user = await cache.get<UserData>('user:123');
```
## Enhanced Set Options
```typescript
// Preserve existing TTL when updating
await cache.set('user:123', updatedUserData, { preserveTTL: true });
// Only set if key exists (update operation)
const oldValue = await cache.set('user:123', newData, {
onlyIfExists: true,
getOldValue: true
});
// Only set if key doesn't exist (create operation)
await cache.set('user:456', newUser, {
onlyIfNotExists: true,
ttl: 7200 // 2 hours
});
// Get old value when setting new one
const previousData = await cache.set('session:abc', sessionData, {
getOldValue: true,
ttl: 1800
});
```
## Convenience Methods
```typescript
// Update value preserving TTL
await cache.update('user:123', updatedUserData);
// Set only if exists
const updated = await cache.setIfExists('user:123', newData, 3600);
// Set only if not exists (returns true if created)
const created = await cache.setIfNotExists('user:456', userData);
// Replace existing key with new TTL
const oldData = await cache.replace('user:123', newData, 7200);
// Atomic field updates
await cache.updateField('counter:views', (current) => (current || 0) + 1);
await cache.updateField('user:123', (user) => ({
...user,
lastSeen: new Date().toISOString(),
loginCount: (user?.loginCount || 0) + 1
}));
```
## Stock Bot Use Cases
### 1. Rate Limiting
```typescript
// Create the rate-limit window only if it doesn't already exist
const created = await cache.setIfNotExists(
  `ratelimit:${userId}:${endpoint}`,
  { count: 1, resetTime: Date.now() + 60000 },
  60 // 1 minute window
);
if (!created) {
  // Window already exists; increment its counter
  await cache.updateField(`ratelimit:${userId}:${endpoint}`, (data) => ({
    ...data,
    count: data.count + 1
  }));
}
```
### 2. Session Management
```typescript
// Update session data without changing expiration
await cache.update(`session:${sessionId}`, {
...sessionData,
lastActivity: Date.now()
});
```
### 3. Cache Warming
```typescript
// Only update existing cached data, don't create new entries
const warmed = await cache.setIfExists(`stock:${symbol}:price`, latestPrice);
if (warmed) {
console.log(`Warmed cache for ${symbol}`);
}
```
### 4. Atomic Counters
```typescript
// Thread-safe counter increments
await cache.updateField('metrics:api:calls', (count) => (count || 0) + 1);
await cache.updateField('metrics:errors:500', (count) => (count || 0) + 1);
```
### 5. TTL Preservation for Frequently Updated Data
```typescript
// Keep original expiration when updating frequently changing data
await cache.set(`portfolio:${userId}:positions`, positions, { preserveTTL: true });
```
## Error Handling
The cache provider includes robust error handling:
```typescript
try {
await cache.set('key', value);
} catch (error) {
// Errors are logged and fallback values returned
// The cache operations are non-blocking
}
// Check cache health
const isHealthy = await cache.health();
// Wait for cache to be ready
await cache.waitForReady(10000); // 10 second timeout
```
## Performance Benefits
1. **Atomic Operations**: `updateField` uses Lua scripts to prevent race conditions
2. **TTL Preservation**: Avoids unnecessary TTL resets on updates
3. **Conditional Operations**: Reduces network round trips
4. **Shared Connections**: Efficient connection pooling
5. **Error Recovery**: Graceful degradation when Redis is unavailable
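To make point 1 concrete, here is a minimal in-memory sketch of the lost-update race that an atomic `updateField` (a server-side Lua script in the real provider) avoids. `NaiveCache` and its methods are hypothetical stand-ins for illustration, not the actual `@stock-bot/cache` API:

```typescript
// Illustrative in-memory cache; the real provider talks to Redis.
class NaiveCache {
  private store = new Map<string, number>();

  get(key: string): number | undefined {
    return this.store.get(key);
  }

  set(key: string, value: number): void {
    this.store.set(key, value);
  }

  // Atomic read-modify-write: the whole callback runs as one step,
  // analogous to executing a Lua script on the Redis server.
  updateField(key: string, fn: (current?: number) => number): void {
    this.store.set(key, fn(this.store.get(key)));
  }
}

const cache = new NaiveCache();
cache.set('counter', 0);

// Two clients each read the value, then each write back value + 1:
// the second write overwrites the first, losing one increment.
const a = cache.get('counter')!;
const b = cache.get('counter')!;
cache.set('counter', a + 1);
cache.set('counter', b + 1);
console.log(cache.get('counter')); // 1, not 2: a lost update

// With the atomic variant, each increment sees the latest value.
cache.set('counter', 0);
cache.updateField('counter', (c) => (c ?? 0) + 1);
cache.updateField('counter', (c) => (c ?? 0) + 1);
console.log(cache.get('counter')); // 2
```

The same interleaving happens with separate Redis GET and SET round trips, which is why the counter examples above use `updateField` rather than get-then-set.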


@@ -1,169 +0,0 @@
# Loki Logging for Stock Bot
This document outlines how to use the Loki logging system integrated with the Stock Bot platform (Updated June 2025).
## Overview
Loki provides centralized logging for all Stock Bot services with:
1. **Centralized logging** for all microservices
2. **Log aggregation** and filtering by service, level, and custom labels
3. **Grafana integration** for visualization and dashboards
4. **Query capabilities** using LogQL for log analysis
5. **Alert capabilities** for critical issues
## Getting Started
### Starting the Logging Stack
```cmd
# Start the monitoring stack (includes Loki and Grafana)
scripts\docker.ps1 monitoring
```
Or start services individually:
```cmd
# Start Loki service only
docker-compose up -d loki
# Start Loki and Grafana
docker-compose up -d loki grafana
```
### Viewing Logs
Once started:
1. Access Grafana at http://localhost:3000 (login with admin/admin)
2. Navigate to the "Stock Bot Logs" dashboard
3. View and query your logs
## Using the Logger in Your Services
The Stock Bot logger automatically sends logs to Loki using the updated pattern:
```typescript
import { getLogger } from '@stock-bot/logger';
// Create a logger for your service
const logger = getLogger('your-service-name');
// Log at different levels
logger.debug('Detailed information for debugging');
logger.info('General information about operations');
logger.warn('Potential issues that don\'t affect operation');
logger.error('Critical errors that require attention');
// Log with structured data (searchable in Loki)
logger.info('Processing trade', {
symbol: 'MSFT',
price: 410.75,
quantity: 50
});
```
## Configuration Options
Logger configuration is managed through the `@stock-bot/config` package and can be set in your `.env` file:
```bash
# Logging configuration
LOG_LEVEL=debug # debug, info, warn, error
LOG_CONSOLE=true # Log to console in addition to Loki
LOKI_HOST=localhost # Loki server hostname
LOKI_PORT=3100 # Loki server port
LOKI_RETENTION_DAYS=30 # Days to retain logs
LOKI_LABELS=environment=development,service=stock-bot # Default labels
LOKI_BATCH_SIZE=100 # Number of logs to batch before sending
LOKI_BATCH_WAIT=5 # Max seconds to wait before sending a batch
```
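As a rough sketch of how a service might consume these variables, the following maps them onto a typed config object. The `LokiLoggerConfig` shape, the `loadLokiConfig` helper, and the defaults are assumptions for illustration, not the actual `@stock-bot/config` API:

```typescript
// Hypothetical shape for the Loki-related settings above;
// the real @stock-bot/config package may differ.
interface LokiLoggerConfig {
  logLevel: 'debug' | 'info' | 'warn' | 'error';
  logConsole: boolean;
  lokiHost: string;
  lokiPort: number;
  batchSize: number;
  batchWaitSeconds: number;
}

function loadLokiConfig(env: Record<string, string | undefined>): LokiLoggerConfig {
  const level = env.LOG_LEVEL ?? 'info';
  if (!['debug', 'info', 'warn', 'error'].includes(level)) {
    throw new Error(`Invalid LOG_LEVEL: ${level}`);
  }
  return {
    logLevel: level as LokiLoggerConfig['logLevel'],
    logConsole: env.LOG_CONSOLE !== 'false', // defaults to true
    lokiHost: env.LOKI_HOST ?? 'localhost',
    lokiPort: Number(env.LOKI_PORT ?? 3100),
    batchSize: Number(env.LOKI_BATCH_SIZE ?? 100),
    batchWaitSeconds: Number(env.LOKI_BATCH_WAIT ?? 5),
  };
}

// Example: values from the .env snippet above
const config = loadLokiConfig({
  LOG_LEVEL: 'debug',
  LOKI_HOST: 'localhost',
  LOKI_PORT: '3100',
});
console.log(config.lokiPort); // 3100
```

Validating `LOG_LEVEL` up front means a typo fails fast at startup instead of silently dropping logs.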
## Useful Loki Queries
Inside Grafana, you can use these LogQL queries to analyze your logs:
1. **All logs from a specific service**:
```
{service="market-data-gateway"}
```
2. **All error logs across all services**:
```
{level="error"}
```
3. **Logs containing specific text**:
```
{service="market-data-gateway"} |= "trade"
```
4. **Count of error logs by service over time**:
```
sum by(service) (count_over_time({level="error"}[5m]))
```
## Testing the Logging Integration
Test the logging integration using Bun:
```cmd
# Run from project root using Bun (current runtime)
bun run tools/test-loki-logging.ts
```
## Architecture
Our logging implementation follows this architecture:
```
┌──────────────────┐      ┌───────────────────┐
│ Trading Services │─────►│ @stock-bot/logger │
└──────────────────┘      │    getLogger()    │
                          └─────────┬─────────┘
                                    │
                                    ▼
┌────────────────────────────────────────┐
│                  Loki                  │
└────────────────────┬───────────────────┘
                     │
                     ▼
┌────────────────────────────────────────┐
│                 Grafana                │
└────────────────────────────────────────┘
```
## Adding New Dashboards
To create new Grafana dashboards for log visualization:
1. Build your dashboard in the Grafana UI
2. Export it to JSON
3. Add it to `monitoring/grafana/provisioning/dashboards/json/`
4. Restart the monitoring stack
## Troubleshooting
If logs aren't appearing in Grafana:
1. Run the status check script to verify Loki and Grafana are working:
```cmd
tools\check-loki-status.bat
```
2. Check that Loki and Grafana containers are running:
```cmd
docker ps | findstr "loki grafana"
```
3. Verify .env configuration for Loki host and port:
```cmd
type .env | findstr "LOKI_"
```
4. Ensure your service has the latest @stock-bot/logger package
5. Check for errors in the Loki container logs:
```cmd
docker logs stock-bot-loki
```


@@ -1,212 +0,0 @@
# MongoDB Client Multi-Database Migration Guide
## Overview
Your MongoDB client has been enhanced to support multiple databases dynamically while maintaining full backward compatibility.
## Key Features Added
### 1. **Dynamic Database Switching**
```typescript
// Set default database (all operations will use this unless overridden)
client.setDefaultDatabase('analytics');
// Get current default database
const currentDb = client.getDefaultDatabase(); // Returns: 'analytics'
```
### 2. **Database Parameter in Methods**
All methods now accept an optional `database` parameter:
```typescript
// Old way (still works - uses default database)
await client.batchUpsert('symbols', data, 'symbol');
// New way (specify database explicitly)
await client.batchUpsert('symbols', data, 'symbol', { database: 'stock' });
```
### 3. **Convenience Methods**
Pre-configured methods for common databases:
```typescript
// Stock database operations
await client.batchUpsertStock('symbols', data, 'symbol');
// Analytics database operations
await client.batchUpsertAnalytics('metrics', data, 'metric_name');
// Trading documents database operations
await client.batchUpsertTrading('orders', data, 'order_id');
```
### 4. **Direct Database Access**
```typescript
// Get specific database instances
const stockDb = client.getDatabase('stock');
const analyticsDb = client.getDatabase('analytics');
// Get collections with database override
const collection = client.getCollection('symbols', 'stock');
```
## Migration Steps
### Step 1: No Changes Required (Backward Compatible)
Your existing code continues to work unchanged:
```typescript
// This still works exactly as before
const client = MongoDBClient.getInstance();
await client.connect();
await client.batchUpsert('exchanges', exchangeData, 'exchange_id');
```
### Step 2: Organize Data by Database (Recommended)
Update your data service to use appropriate databases:
```typescript
// In your data service initialization
export class DataService {
private mongoClient = MongoDBClient.getInstance();
async initialize() {
await this.mongoClient.connect();
// Set stock as default for most operations
this.mongoClient.setDefaultDatabase('stock');
}
async saveInteractiveBrokersData(exchanges: any[], symbols: any[]) {
// Stock market data goes to 'stock' database (default)
await this.mongoClient.batchUpsert('exchanges', exchanges, 'exchange_id');
await this.mongoClient.batchUpsert('symbols', symbols, 'symbol');
}
async saveAnalyticsData(performance: any[]) {
// Analytics data goes to 'analytics' database
await this.mongoClient.batchUpsert(
'performance',
performance,
'date',
{ database: 'analytics' }
);
}
}
```
### Step 3: Use Convenience Methods (Optional)
Replace explicit database parameters with convenience methods:
```typescript
// Instead of:
await client.batchUpsert('symbols', data, 'symbol', { database: 'stock' });
// Use:
await client.batchUpsertStock('symbols', data, 'symbol');
```
## Factory Functions
New factory functions are available for easier database management:
```typescript
import {
connectMongoDB,
setDefaultDatabase,
getCurrentDatabase,
getDatabase
} from '@stock-bot/mongodb-client';
// Set default database globally
setDefaultDatabase('analytics');
// Get current default
const current = getCurrentDatabase();
// Get specific database
const stockDb = getDatabase('stock');
```
## Database Recommendations
### Stock Database (`stock`)
- Market data (symbols, exchanges, prices)
- Financial instruments
- Market events
- Real-time data
### Analytics Database (`analytics`)
- Performance metrics
- Calculated indicators
- Reports and dashboards
- Aggregated data
### Trading Documents Database (`trading_documents`)
- Trade orders and executions
- User portfolios
- Transaction logs
- Audit trails
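One way to make these recommendations concrete is a small routing helper. The `DatabaseName` union, the mapping table, and `databaseFor` below are illustrative assumptions, not part of the client:

```typescript
// Map each collection to the recommended database, following the
// categories above (the specific collection names are assumptions).
type DatabaseName = 'stock' | 'analytics' | 'trading_documents';

const COLLECTION_DATABASE: Record<string, DatabaseName> = {
  symbols: 'stock',
  exchanges: 'stock',
  prices: 'stock',
  metrics: 'analytics',
  performance: 'analytics',
  orders: 'trading_documents',
  portfolios: 'trading_documents',
};

function databaseFor(collection: string): DatabaseName {
  // Unknown collections fall back to the 'stock' default.
  return COLLECTION_DATABASE[collection] ?? 'stock';
}

console.log(databaseFor('metrics')); // 'analytics'
console.log(databaseFor('unknown_collection')); // 'stock' (fallback)
```

A helper like this could then feed the `database` option of `batchUpsert`, keeping the routing decision in one place instead of scattered across call sites.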
## Example: Updating Your Data Service
```typescript
// Before (still works)
export class DataService {
async saveExchanges(exchanges: any[]) {
const client = MongoDBClient.getInstance();
await client.batchUpsert('exchanges', exchanges, 'exchange_id');
}
}
// After (recommended)
export class DataService {
private mongoClient = MongoDBClient.getInstance();
async initialize() {
await this.mongoClient.connect();
this.mongoClient.setDefaultDatabase('stock'); // Set appropriate default
}
async saveExchanges(exchanges: any[]) {
// Uses default 'stock' database
await this.mongoClient.batchUpsert('exchanges', exchanges, 'exchange_id');
// Or use convenience method
await this.mongoClient.batchUpsertStock('exchanges', exchanges, 'exchange_id');
}
async savePerformanceMetrics(metrics: any[]) {
// Save to analytics database
await this.mongoClient.batchUpsertAnalytics('metrics', metrics, 'metric_name');
}
}
```
## Testing
Your existing tests continue to work. For new multi-database features:
```typescript
import { MongoDBClient } from '@stock-bot/mongodb-client';
const client = MongoDBClient.getInstance();
await client.connect();
// Test database switching
client.setDefaultDatabase('test_db');
expect(client.getDefaultDatabase()).toBe('test_db');
// Test explicit database parameter
await client.batchUpsert('test_collection', data, 'id', { database: 'other_db' });
```
## Benefits
1. **Organized Data**: Separate databases for different data types
2. **Better Performance**: Smaller, focused databases
3. **Easier Maintenance**: Clear data boundaries
4. **Scalability**: Can scale databases independently
5. **Backward Compatibility**: No breaking changes
## Next Steps
1. Update your data service to use appropriate default database
2. Gradually migrate to using specific databases for different data types
3. Consider using convenience methods for cleaner code
4. Update tests to cover multi-database scenarios


@@ -91,4 +91,4 @@
"apiKey": "",
"apiUrl": "https://proxy.webshare.io/api/v2/"
}
}
}

Some files were not shown because too many files have changed in this diff.