diff --git a/.gitignore b/.gitignore index f46fc36..21fb259 100644 --- a/.gitignore +++ b/.gitignore @@ -1,3 +1,8 @@ +# AI review context (AGENTS.md local, CLAUDE.md + MEMORY.md symlinked from parent) +AGENTS.md +CLAUDE.md +MEMORY.md + # Byte-compiled / optimized / DLL files __pycache__/ *.py[cod] diff --git a/docs/advanced/README.md b/docs/advanced/README.md index 2071e65..c9643d0 100644 --- a/docs/advanced/README.md +++ b/docs/advanced/README.md @@ -26,3 +26,4 @@ Advanced topics and features of PyneCore - [Extra Fields](./extra-fields.md) - Custom CSV columns beyond OHLCV in scripts - [request.security() Internals](./request-security-internals.md) - Multiprocessing architecture, AST transformation, shared memory - [Bar Magnifier](./bar-magnifier.md) - Accurate intrabar order fills using lower-timeframe data +- [Live Mode](./live-mode.md) - Real-time streaming with intra-bar updates and paper trading diff --git a/docs/advanced/live-mode.md b/docs/advanced/live-mode.md new file mode 100644 index 0000000..0928c88 --- /dev/null +++ b/docs/advanced/live-mode.md @@ -0,0 +1,226 @@ + + +# Live Mode + +Live mode extends PyneCore beyond backtesting: after replaying historical data the script +seamlessly transitions to real-time streaming from a `LiveProviderPlugin`. Indicators update +on every tick; strategies run in paper-trading mode with tick-level order fill accuracy. + +## Quick Start + +```bash +# Stream BTC/USDT on 1-minute bars from Bybit via CCXT, prefetching 500 historical bars +pyne run my_strategy.py ccxt:BYBIT:BTC/USDT:USDT@1 --live -f -500 +``` + +The `--live` flag requires a **provider string** as the data source — it does not work with +local OHLCV files. + +## Historical Bar Count + +The `-f` / `--from` parameter controls how many historical bars the script processes before +going live. **Default: 500 bars** — enough for most scripts out of the box. 
+ +Indicators need a warm-up period: `ta.sma(close, 200)` requires 200 bars before producing its +first value, and the initial values are still distorted by limited lookback. A good rule of +thumb is **2× the largest `length` parameter** in your script: + +```bash +# Script uses ta.ema(close, 50) and ta.atr(14) → largest length is 50 → -f -100 is enough +pyne run my_strategy.py ccxt:BYBIT:BTC/USDT:USDT@1 --live -f -100 + +# Script uses ta.sma(close, 200) → use -f -400 +pyne run my_strategy.py ccxt:BYBIT:BTC/USDT:USDT@1 --live -f -400 + +# No -f specified → default 500 bars, sufficient for most scripts +pyne run my_strategy.py ccxt:BYBIT:BTC/USDT:USDT@1 --live +``` + +Too few bars → indicators produce `NaN` or unreliable values, leading to missed or false +signals. Too many → longer startup, but no harm beyond that. When in doubt, err on the side +of more. + +## How It Works + +A live session has two phases: + +| Phase | Data source | `barstate.ishistory` | `barstate.isrealtime` | Strategy | +|------------|--------------------|----------------------|-----------------------|----------| +| Historical | Provider download | `True` | `False` | Suppressed | +| Live | WebSocket streaming | `False` | `True` | Active | + +### Historical Phase + +The provider downloads OHLCV data (controlled by `-f` / `--from`). The script runs on each +bar exactly like a normal backtest — indicators build up their series, `ta.sma()` warms up, +etc. **Strategy functions are suppressed**: calls to `strategy.entry()`, `strategy.exit()`, +and friends are silently ignored. This prevents phantom trades on historical bars that the +script sees for the first time. + +### Transition + +The ScriptRunner detects the transition automatically when the iterator yields its first +`BarUpdate` object (instead of a plain `OHLCV`). 
At this point: + +- `barstate.islastconfirmedhistory` becomes `True` on the final historical bar +- Output writers flush to disk (plot CSV, trade CSV) +- Strategy suppression is lifted — orders are now active + +### Live Phase + +The provider streams `BarUpdate` objects via WebSocket. Each update carries an OHLCV snapshot +and an `is_closed` flag: + +``` +BarUpdate(ohlcv=OHLCV(...), is_closed=False) # intra-bar tick +BarUpdate(ohlcv=OHLCV(...), is_closed=True) # bar closed +``` + +The script executes on **every update** — both intra-bar ticks and bar closes. + +## Intra-Bar Updates + +On TradingView, a real-time script re-executes on every tick within a bar. PyneCore replicates +this behavior in live mode. + +### barstate Values + +| Event | `isconfirmed` | `isnew` | `islast` | `isrealtime` | +|--------------------|---------------|---------|----------|--------------| +| First tick of bar | `False` | `True` | `True` | `True` | +| Later intra-bar | `False` | `False` | `True` | `True` | +| Bar close | `True` | `False` | `True` | `True` | + +### var vs varip in Live Mode + +The distinction between `Persistent` (Pine `var`) and `IBPersistent` (Pine `varip`) becomes +meaningful during intra-bar re-executions — the same mechanism used by +[calc_on_order_fills](./bar-magnifier.md#calc_on_order_fills): + +- **`Persistent` (var)**: rolled back to the bar-open snapshot before each intra-bar tick. + Every tick starts from the same baseline. +- **`IBPersistent` (varip)**: **not** rolled back — accumulates across all ticks within the bar. + +```python +var_counter: Persistent[int] = 0 +varip_counter: IBPersistent[int] = 0 + +var_counter += 1 # always == bar_index + 1 (rolled back each tick) +varip_counter += 1 # bar_index + 1 + total intra-bar ticks across all bars +``` + +This uses the same `VarSnapshot` mechanism as the bar magnifier's COOF loop. 
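The rollback can be illustrated with a small self-contained sketch. This is plain Python with hypothetical names, not PyneCore's actual `VarSnapshot` internals: `var`-style state is restored from a bar-open snapshot before every intra-bar execution, while `varip`-style state is left untouched.

```python
# Illustrative sketch of var vs varip rollback semantics
# (hypothetical names, not PyneCore's real VarSnapshot implementation)

def run_bar(state: dict, ticks: int) -> dict:
    """Execute a 'script body' once per intra-bar tick within one bar."""
    snapshot = dict(state["var"])        # snapshot var values at bar open
    for _ in range(ticks):
        state["var"] = dict(snapshot)    # var: rolled back before each tick
        # --- script body ---
        state["var"]["counter"] += 1     # restarts from the snapshot each tick
        state["varip"]["counter"] += 1   # varip: accumulates across ticks
    return state

state = {"var": {"counter": 0}, "varip": {"counter": 0}}
run_bar(state, ticks=3)                  # one bar, three intra-bar executions
print(state["var"]["counter"])           # 1: only the final execution sticks
print(state["varip"]["counter"])         # 3: every tick counted
```

The final tick's `var` value is what gets committed, so across bars the counter behaves exactly like the `bar_index + 1` example above, while the `varip` counter keeps growing with every tick.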
+ +## Order Processing + +Strategies use **magnifier-style order processing** in live mode: intra-bar ticks are +accumulated as `sub_bars`. When the bar closes, `process_orders_magnified(sub_bars, final_bar)` +runs — checking limit, stop, and trailing stop orders against each tick's OHLCV in chronological +order. This gives tick-level fill accuracy even in paper trading. + +If `calc_on_order_fills=True`, the COOF re-execution loop runs on bar close as well — exactly +as it does in backtesting with the bar magnifier. + +### Strategy Suppression + +During the historical phase, all 7 strategy functions (`entry`, `exit`, `close`, `close_all`, +`cancel`, `cancel_all`, `order`) are no-ops. This is controlled by the internal +`lib._strategy_suppressed` flag — the same pattern as `lib._lib_semaphore`. + +## Output + +### Plot CSV + +Written only on **closed bars**. Intra-bar ticks do not produce plot output. This matches +TradingView behavior where plot values are committed only at bar close. + +### Strategy Stats CSV + +In live mode, the strategy statistics file is **rewritten after every closed bar** — not +appended. This means opening the file at any time shows the complete, up-to-date statistics +aggregated over the entire run (historical + live). + +### Trade CSV + +Trade entries and exits are recorded on the bar where the fill occurs, as in backtesting. 
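To make the chronological fill check described under Order Processing concrete, here is a minimal sketch of matching a buy-limit order against accumulated intra-bar ticks. It is illustrative only: `Tick` and `first_limit_fill` are made-up names, and the real `process_orders_magnified()` also handles stop orders, trailing stops, and COOF re-execution.

```python
# Illustrative sketch of tick-level limit-order matching against sub-bars
# (hypothetical helper, not PyneCore's actual order-processing code)
from dataclasses import dataclass

@dataclass
class Tick:
    open: float
    high: float
    low: float
    close: float

def first_limit_fill(sub_bars, limit_price, is_buy=True):
    """Index of the first intra-bar tick whose range touches the limit price."""
    for i, t in enumerate(sub_bars):
        touched = t.low <= limit_price if is_buy else t.high >= limit_price
        if touched:
            return i      # chronological scan gives tick-level fill accuracy
    return None           # order not filled within this bar

sub_bars = [
    Tick(100.0, 101.0, 99.5, 100.2),   # low 99.5: above a 99.0 buy limit
    Tick(100.2, 100.4, 98.9, 99.0),    # low 98.9: touches the limit
    Tick(99.0, 99.5, 98.8, 99.3),
]
print(first_limit_fill(sub_bars, 99.0))  # 1
```

Scanning the ticks in arrival order is what distinguishes this from plain bar-close fills: the engine knows *which* intra-bar move triggered the order, not just that the bar's range contained the price.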
+ +## Provider String Format + +``` +provider:EXCHANGE:SYMBOL:SETTLE@TIMEFRAME +``` + +| Part | Example | Description | +|--------------|----------------|------------------------------------| +| `provider` | `ccxt` | Plugin name (entry point) | +| `EXCHANGE` | `BYBIT` | Exchange identifier | +| `SYMBOL` | `BTC/USDT` | Trading pair | +| `SETTLE` | `USDT` | Settlement currency (optional) | +| `TIMEFRAME` | `1` | TradingView timeframe format | + +The `-f` / `--from` option accepts a negative integer for relative bar count: + +```bash +# Prefetch last 500 bars before going live +pyne run script.py ccxt:BYBIT:BTC/USDT:USDT@1 --live -f -500 +``` + +## CLI Options + +| Flag | Description | +|-----------------------|------------------------------------------------------| +| `--live`, `-l` | Enable live streaming after historical phase | +| `--shutdown-timeout` | Max seconds for graceful shutdown (default: 120) | + +Press `Ctrl+C` to stop live streaming. The provider goes through a graceful shutdown sequence: +`can_shutdown()` is polled every second, then `disconnect()` is called. + +## Architecture + +``` +┌─────────────┐ provider.download() ┌──────────────┐ +│ run.py │ ───────────────────────── │ OHLCV file │ +│ (CLI) │ └──────┬───────┘ +│ │ │ OHLCVReader +│ │ itertools.chain() │ +│ │ ◄────────────────────────────────┤ +│ │ │ +│ │ live_ohlcv_generator() ┌──────┴───────┐ +│ │ ◄──── Queue ◄──── async ──│ WebSocket │ +└──────┬──────┘ └──────────────┘ + │ Iterator[OHLCV | BarUpdate] + │ +┌──────▼───────┐ +│ ScriptRunner │ +│ │ isinstance() detects BarUpdate → live transition +│ historical │ OHLCV bars → normal backtest loop +│ live loop │ BarUpdate → intra-bar + bar close processing +└──────────────┘ +``` + +The `live_ohlcv_generator` bridges the async WebSocket world to synchronous iteration via a +background thread and `queue.Queue`. The ScriptRunner is completely data-source agnostic — it +only cares whether it receives `OHLCV` or `BarUpdate` objects. 
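The thread-and-queue bridge can be sketched in a few lines. This is a simplified stand-in (the real `live_ohlcv_generator` also deals with reconnects, shutdown, and typed `OHLCV`/`BarUpdate` objects): an async producer runs on a background thread and feeds a `queue.Queue` that a plain synchronous generator drains.

```python
# Minimal async-to-sync bridge sketch: a background thread runs the event
# loop and pushes updates into a queue.Queue a plain generator drains.
# Names (fake_websocket, live_generator) are illustrative, not PyneCore's.
import asyncio
import queue
import threading

_SENTINEL = object()

async def fake_websocket(q: queue.Queue):
    """Stand-in for a WebSocket consumer; pushes a few updates then stops."""
    for i in range(3):
        await asyncio.sleep(0)       # yield to the loop, as real I/O would
        q.put(f"BarUpdate-{i}")
    q.put(_SENTINEL)                 # signal end-of-stream

def live_generator():
    q: queue.Queue = queue.Queue()
    t = threading.Thread(target=lambda: asyncio.run(fake_websocket(q)),
                         daemon=True)
    t.start()
    while (item := q.get()) is not _SENTINEL:   # blocks until data arrives
        yield item

print(list(live_generator()))  # ['BarUpdate-0', 'BarUpdate-1', 'BarUpdate-2']
```

The consumer side never touches asyncio: `q.get()` blocks the iterating thread until the producer delivers the next update, which is exactly the property that lets the ScriptRunner stay a simple synchronous loop.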
+ +## Limitations + +- **Paper trading only** — no real order execution. Live order routing is provided by + dedicated per-exchange broker plugins (`pynecore-bybit`, `pynecore-binance`, etc.). +- **Single timeframe** — `request.security()` with live providers (multi-timeframe live) is not + yet supported. +- **Provider required** — `--live` only works with provider strings, not local data files. +- **No replay** — there is no mechanism to replay missed ticks if the connection drops mid-bar. + The provider reconnects and resumes from the next available update. \ No newline at end of file diff --git a/docs/cli/run.md b/docs/cli/run.md index ed0654d..f19141b 100644 --- a/docs/cli/run.md +++ b/docs/cli/run.md @@ -138,6 +138,40 @@ The `run` command has two required arguments: Note: you don't need to write the file extensions in the command. +## Provider Mode + +Instead of a local data file, you can pass a **provider string** as the `DATA` argument. +The provider plugin downloads historical data and (with `--live`) streams real-time updates: + +```bash +# Download historical data from CCXT/Bybit and run the script +pyne run my_strategy.py ccxt:BYBIT:BTC/USDT:USDT@1 + +# Same, but continue with live streaming after the historical phase +pyne run my_strategy.py ccxt:BYBIT:BTC/USDT:USDT@1 --live -f -500 +``` + +Provider string format: `provider:EXCHANGE:SYMBOL:SETTLE@TIMEFRAME` + +| Part | Example | Description | +|-------------|-------------|--------------------------------| +| `provider` | `ccxt` | Plugin name (entry point) | +| `EXCHANGE` | `BYBIT` | Exchange identifier | +| `SYMBOL` | `BTC/USDT` | Trading pair | +| `SETTLE` | `USDT` | Settlement currency (optional) | +| `TIMEFRAME` | `1` | TradingView timeframe format | + +The `-f` / `--from` option accepts a **negative integer** for relative bar count when using +a provider string: + +```bash +# Prefetch the last 500 bars +pyne run script.py ccxt:BYBIT:ETH/USDT:USDT@5 -f -500 +``` + +See [Live 
Mode](../advanced/live-mode.md) for details on real-time streaming, intra-bar +updates, strategy suppression, and paper trading. + ## Command Options The `run` command supports several options to customize the execution: @@ -148,7 +182,7 @@ The `run` command supports several options to customize the execution: ### Date Range Options -- `--from`, `-f`: Start date (UTC) in 'YYYY-MM-DD' or 'YYYY-MM-DD HH:MM:SS' format. If not specified, it will use the first date in the data +- `--from`, `-f`: Start date (UTC) in 'YYYY-MM-DD' or 'YYYY-MM-DD HH:MM:SS' format. If not specified, it will use the first date in the data. In provider mode, also accepts a negative integer for relative bar count (e.g. `-f -500`); defaults to `-500` bars if omitted. - `--to`, `-t`: End date (UTC) in 'YYYY-MM-DD' or 'YYYY-MM-DD HH:MM:SS' format. If not specified, it will use the last date in the data. Example: @@ -157,6 +191,17 @@ Example: pyne run my_strategy.py eurusd_data.ohlcv --from "2023-01-01" --to "2023-12-31" ``` +### Live Mode Options + +- `--live`, `-l`: Continue with real-time data streaming after the historical phase. Only available in provider mode (provider string as data source). See [Live Mode](../advanced/live-mode.md). +- `--shutdown-timeout`: Maximum seconds to wait for graceful provider shutdown when stopping (default: 120). + +Example: +```bash +# Stream live 1-minute BTC/USDT bars after 500 historical bars +pyne run my_strategy.py ccxt:BYBIT:BTC/USDT:USDT@1 --live -f -500 +``` + ### Output Path Options - `--plot`, `-pp`: Path to save the plot data (CSV format). If not specified, it will be saved as `.csv` in the `workdir/output/` directory. 
diff --git a/docs/development/README.md b/docs/development/README.md index 912cfc8..3002ef9 100644 --- a/docs/development/README.md +++ b/docs/development/README.md @@ -19,5 +19,6 @@ Documentation for PyneCore developers ## In this section +- [Plugin System](./plugin-system.md) - How to create plugins for PyneCore - [Testing System](./testing-system.md) - Overview of the comprehensive testing system - [Contributing](./contributing.md) - Guide for contributing to PyneCore diff --git a/docs/development/plugin-system.md b/docs/development/plugin-system.md new file mode 100644 index 0000000..88a4498 --- /dev/null +++ b/docs/development/plugin-system.md @@ -0,0 +1,514 @@ + + +# Plugin System + +PyneCore uses a plugin architecture based on Python entry points. Plugins can +provide data sources, add CLI commands, or extend existing commands with new +parameters — all discovered automatically at startup. + +## Architecture + +Every plugin registers under a single entry point group: `pyne.plugin`. The +class hierarchy determines what a plugin can do: + +``` +Plugin (base) +├── ProviderPlugin — Offline OHLCV data provider +│ └── LiveProviderPlugin — WebSocket/streaming data (extends ProviderPlugin) +│ └── BrokerPlugin — Order execution (extends LiveProviderPlugin) +├── CLIPlugin — CLI subcommands and parameter hooks +└── ExtensionPlugin — Hook-based script extension (planned) +``` + +`LiveProviderPlugin` inherits from `ProviderPlugin` — every live provider can also download +historical data. See [Live Mode](../advanced/live-mode.md) for data-side details. + +`BrokerPlugin` inherits from `LiveProviderPlugin` — an exchange that routes orders can +also deliver the live market data those orders trade against. Order execution is handled +by dedicated per-exchange broker plugins (`pynecore-bybit`, `pynecore-binance`, +`pynecore-capitalcom`, etc.) — not by standalone data providers. 
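Since capabilities are encoded in the class hierarchy, a host application can probe what an installed plugin supports with plain `isinstance` checks. A sketch with stand-in classes mirroring the hierarchy above (not the real `pynecore.core.plugin` classes):

```python
# Stand-in classes mirroring the plugin hierarchy (illustrative only)
class Plugin: ...
class ProviderPlugin(Plugin): ...
class LiveProviderPlugin(ProviderPlugin): ...
class BrokerPlugin(LiveProviderPlugin): ...
class CLIPlugin(Plugin): ...

class FooBroker(BrokerPlugin):
    """A broker plugin: routes orders, streams data, downloads history."""

p = FooBroker()
print(isinstance(p, ProviderPlugin))      # True: every broker is also a data provider
print(isinstance(p, LiveProviderPlugin))  # True: and can stream live updates
print(isinstance(p, CLIPlugin))           # False: no CLI capability declared
```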
+ +Multiple inheritance combines capabilities: + +```python +class MyPlugin(ProviderPlugin, CLIPlugin): + """A plugin that provides both data downloading and CLI commands.""" + ... +``` + +## Quick Start: Hello Plugin + +A minimal plugin that adds `pyne hello greet` to the CLI. + +### Project structure + +``` +pynecore-hello/ +├── pyproject.toml +└── src/ + └── pynecore_hello/ + └── __init__.py +``` + +### pyproject.toml + +```toml +[project] +name = "pynecore-hello" +version = "0.1.0" +description = "Hello World plugin for PyneCore" +dependencies = ["pynesys-pynecore[cli]>=6.5"] + +# This is how PyneCore discovers the plugin automatically: +# "hello" = the plugin name (used as `pyne hello ...`) +# "pynecore_hello:HelloPlugin" = module:class to load +[project.entry-points."pyne.plugin"] +hello = "pynecore_hello:HelloPlugin" + +[build-system] +requires = ["hatchling"] +build-backend = "hatchling.build" +``` + +The `[project.entry-points."pyne.plugin"]` section is the key: it tells Python +that when someone installs this package, a plugin named `hello` should be +registered, pointing to the `HelloPlugin` class. + +### __init__.py + +```python +import typer +from pynecore.core.plugin import CLIPlugin + + +class HelloPlugin(CLIPlugin): + """Hello World plugin.""" + + @staticmethod + def cli() -> typer.Typer: + app = typer.Typer(help="Hello World commands") + + @app.command() + def greet(name: str = typer.Argument("World", help="Who to greet")): + """Say hello.""" + typer.echo(f"Hello, {name}!") + + return app +``` + +### Install and run + +```bash +pip install -e pynecore-hello/ +pyne hello greet PyneCore +# Hello, PyneCore! +``` + +That's it. The plugin is discovered automatically — no registration code, no +config files, no imports. Just install and use. + +## Plugin Capabilities + +### CLIPlugin — Commands and Parameter Hooks + +`CLIPlugin` provides two independent mechanisms: + +#### 1. 
Subcommands via `cli()`
+
+Return a Typer app to add a command group under `pyne <plugin_name>`:
+
+```python
+class FooPlugin(CLIPlugin):
+    @staticmethod
+    def cli() -> typer.Typer:
+        app = typer.Typer(help="Foo commands")
+
+        @app.command()
+        def bar(name: str = typer.Argument("world")):
+            """Do something."""
+            typer.echo(f"Hello {name}")
+
+        return app
+```
+
+This creates `pyne foo bar`.
+
+#### 2. Parameter hooks via `cli_params()`
+
+Inject flags into existing commands (currently `run` is pluggable):
+
+```python
+import click
+
+class FooPlugin(CLIPlugin):
+    @staticmethod
+    def cli_params(command_name: str) -> list[click.Parameter]:
+        if command_name == "run":
+            return [
+                click.Option(
+                    ["--verbose", "-V"],
+                    is_flag=True,
+                    default=False,
+                    help="Enable verbose output",
+                ),
+            ]
+        return []
+```
+
+These parameters appear in `pyne run --help` and are parsed automatically.
+The values are available via `ctx.plugin_params` in the command callback:
+
+```python
+# Inside the run command implementation
+def run(ctx: typer.Context, script: Path = ..., data: Path = ...):
+    verbose = ctx.plugin_params.get("verbose", False)
+```
+
+> **Note:** Use standard `click.Option` / `click.Argument` — these are the same
+> objects you'd use in any Click application. Typer-specific features like
+> `rich_help_panel` are not available on injected parameters.
+
+#### Conflict Detection
+
+The plugin system prevents parameter collisions automatically:
+
+- If a plugin tries to register `--from` but the `run` command already uses it →
+  registration fails with a warning
+- If two plugins both register `--verbose` → the second one fails with a warning
+- If a plugin tries to use a built-in command name (`run`, `data`, `compile`,
+  etc.) as its subcommand name → skipped with a warning
+
+Both parameter names (`time_from`) and option strings (`--from`, `-f`) are
+checked to prevent ambiguity.
+
+### ProviderPlugin — Data Sources
+
+Provides offline OHLCV data download capability. Used by `pyne data download`.
+
+```python
+from dataclasses import dataclass
+from pynecore.core.plugin import ProviderPlugin, override
+
+
+@dataclass
+class FooConfig:
+    """Foo provider"""
+
+    api_key: str = ""
+    """API key for authentication"""
+
+    use_sandbox: bool = False
+    """Use sandbox environment"""
+
+
+class FooProvider(ProviderPlugin[FooConfig]):
+    Config = FooConfig
+
+    @override
+    def get_available_symbols(self) -> list[str]:
+        ...
+
+    @override
+    def download(self, days_back, on_progress=None, extra_field_names=None):
+        ...
+```
+
+The `Config` dataclass is automatically turned into a self-healing TOML file
+at `workdir/config/plugins/<plugin_name>.toml` — generated with all defaults
+commented out; users uncomment and edit what they need:
+
+```toml
+# Foo provider
+
+# API key for authentication
+#api_key = ""
+
+# Use sandbox environment
+#use_sandbox = false
+```
+
+The Generic type parameter (`ProviderPlugin[FooConfig]`) gives your IDE
+full type information on `self.config` — no more `object | None` warnings.
+
+### BrokerPlugin — Order Execution
+
+A `BrokerPlugin` is a `LiveProviderPlugin` that can also **route orders** to an
+exchange. It receives high-level intents from the engine (`execute_entry`,
+`execute_exit`, `execute_close`, `execute_cancel`) and translates them into
+exchange-specific calls. The engine handles idempotency, retry, and reconcile
+— the plugin focuses on the actual REST/WebSocket wiring.
+
+```python
+from dataclasses import dataclass
+from pynecore.core.plugin import BrokerPlugin, override
+from pynecore.core.broker.models import (
+    DispatchEnvelope, ExchangeCapabilities, ExchangeOrder, ExchangePosition,
+)
+
+
+@dataclass
+class FooBrokerConfig:
+    """Foo exchange credentials."""
+    api_key: str = ""
+    api_secret: str = ""
+    demo: bool = True
+
+
+class FooBroker(BrokerPlugin[FooBrokerConfig]):
+    Config = FooBrokerConfig
+
+    @override
+    async def connect(self) -> None:
+        # Authenticate and populate self._account_id.
+ # The account_id property later reads it back as a sync value. + await self._authenticate() + self._account_id = f"foo-{'demo' if self.config.demo else 'live'}-{self._login}" + + @override + def get_capabilities(self) -> ExchangeCapabilities: + return ExchangeCapabilities( + stop_order=True, + tp_sl_bracket=True, + reduce_only=True, + # ... see pynecore.core.broker.models for the full struct + ) + + @override + async def execute_entry(self, envelope: DispatchEnvelope) -> ExchangeOrder: + ... + + @override + async def get_position(self, symbol: str) -> ExchangePosition | None: + ... +``` + +#### Storage — `self.store_ctx` + +Every broker plugin gets a `RunContext` wired in by `ScriptRunner` at startup +(`self.store_ctx`). This is the single entry point for persistence — you do +**not** write your own JSONL, SQLite, or in-memory bookkeeping. The +`RunContext` is backed by a shared `BrokerStore` (SQLite, WAL mode) at +`workdir/output/logs/broker.sqlite`, and it gives you: + +- **Generic alias lookup.** Exchange IDs that arrive later in the lifecycle + (Capital.com `dealId`, IB `permId`, Bybit `orderLinkId`) are stored in the + `order_refs` table. Reverse lookup is a single indexed SELECT: + + ```python + # When the exchange returns a durable ID, stash it as an alias. + self.store_ctx.add_ref(client_order_id, 'exchange_order_id', exchange_id) + + # Later, when a fill event arrives with only the exchange ID, resolve it: + row = self.store_ctx.find_by_ref('exchange_order_id', exchange_id) + if row is not None: + client_order_id = row.client_order_id + ``` + +- **Audit log.** Plugin-specific events (rate-limit hits, degraded protection, + reconcile outcomes) go through `log_event`: + + ```python + self.store_ctx.log_event( + 'rate_limit_hit', + client_order_id=coid, + payload={'retry_after_s': 1.5}, + ) + ``` + +- **Order state writes.** The sync engine handles the canonical order + lifecycle automatically. 
Only touch `upsert_order` / `set_exchange_id` / + `set_risk` if your plugin needs to record extra state the engine doesn't + know about. + +**Authentication and `account_id`.** `BrokerPlugin.account_id` is a sync +property that returns `self._account_id`. Your `connect()` (or the first +authenticating call) must populate `self._account_id` as a +**plugin-qualified** string, e.g. `"foo-demo-1234567"`. The `ScriptRunner` +reads it once during startup to build the `run_id` — if the bot later +switches accounts on the broker UI, the stored `run_id` won't silently drift. + +**Restart recovery.** If the process is `SIGKILL`-ed or the host restarts, +the `runs` row is left with `ended_ts_ms IS NULL` but its heartbeat goes +stale. The next startup's `open_run()` automatically closes stale rows +(heartbeat > 5 min) and logs a `stale_run_cleaned` event. There is nothing +for the plugin to do here — recovery is built into the store. + +#### `BrokerStore` schema — what gets stored where + +A single SQLite file at `workdir/output/logs/broker.sqlite` is shared by +every bot process in the same workdir (WAL mode; one writer at a time, no +blocked readers). Two identity keys share the tables: + +- **`run_id`** — logical stream, the humanly recognizable identifier of + a bot: `"{strategy}@{account}:{symbol}:{tf}[#label]"`. Stable across + restarts. +- **`run_instance_id`** — physical autoincrement integer, unique per + process-level run. Historical isolation. + +| Table | Keyed by | What it holds | +|-------------------------|----------------------|--------------------------------------------------------| +| `runs` | `run_instance_id` | Per-run metadata, heartbeat, lifecycle timestamps. | +| `envelopes` | `run_id` | Sync engine envelope identity (cross-restart). | +| `pending_verifications` | `run_id` | Parked dispatches awaiting confirmation. | +| `orders` | `run_instance_id` | Live order snapshot (+ plugin-specific `extras` JSON). 
| +| `order_refs` | `run_instance_id` | Generic alias lookup (broker IDs → `client_order_id`). | +| `events` | `run_instance_id` | Audit log (dispatch, fill, reconcile, stale-cleanup). | + +The `envelopes` and `pending_verifications` tables key on the **logical** +`run_id`, so a restarted bot picks up the same idempotency anchors. +Everything else keys on the **physical** `run_instance_id`, so historical +runs stay isolated. + +## Combining Capabilities + +A plugin can combine multiple capabilities via multiple inheritance. The +`Config` dataclass is shared — it belongs to the plugin itself, not to any +specific capability. The `[Config]` type parameter goes on the **first** parent +class — it doesn't matter which one, since both `ProviderPlugin` and `CLIPlugin` +propagate it: + +```python +from dataclasses import dataclass +import click +import typer +from pynecore.core.plugin import ProviderPlugin, CLIPlugin, override + + +@dataclass +class FooConfig: + """Foo provider""" + + api_key: str = "" + """API key for authentication""" + + use_sandbox: bool = False + """Use sandbox environment""" + + +# [FooConfig] on the first parent — either order works +class FooPlugin(ProviderPlugin[FooConfig], CLIPlugin): + """Provider with CLI management commands.""" + + Config = FooConfig + + # --- ProviderPlugin: data downloading --- + + @override + def get_available_symbols(self) -> list[str]: + # self.config is typed as FooConfig (via Generic) + ... + + @override + def download(self, days_back, on_progress=None, extra_field_names=None): + ... 
+ + # --- CLIPlugin: subcommands --- + + @staticmethod + def cli() -> typer.Typer: + app = typer.Typer(help="Foo management commands") + + @app.command() + def status(): + """Show connection status.""" + typer.echo("Connected") + + return app + + # --- CLIPlugin: parameter hooks --- + + @staticmethod + def cli_params(command_name: str) -> list[click.Parameter]: + if command_name == "run": + return [click.Option(["--sandbox"], is_flag=True, default=False)] + return [] +``` + +This single plugin: +- Downloads data via `pyne data download foo` +- Adds `pyne foo status` subcommand +- Injects `--sandbox` into `pyne run` +- Gets a `workdir/config/plugins/foo.toml` with the `FooConfig` fields + +The config TOML file is auto-generated on first run with all defaults commented +out. Users uncomment and edit what they need: + +```toml +# Foo provider + +# API key for authentication +#api_key = "" + +# Use sandbox environment +#use_sandbox = false +``` + +## Plugin Configuration + +Any plugin type (`ProviderPlugin`, `CLIPlugin`, or a combination) can have a +`Config` dataclass. Just set the `Config` class attribute and PyneCore handles +the rest. + +The TOML file is: +- **Auto-generated** on first run with all defaults commented out +- **Self-healing** — new fields appear automatically, removed fields disappear +- **User-friendly** — docstrings become TOML comments, uncommented values survive regeneration +- **Cached** — `ensure_config()` returns the same instance on repeated calls + +For ProviderPlugin, the config is automatically loaded and passed via +`self.config`. 
For CLI-only plugins, load it manually: + +```python +from pynecore.core.config import ensure_config +from pynecore.cli.app import app_state + +config = ensure_config(FooConfig, app_state.config_dir / "plugins" / "foo.toml") +``` + +## Plugin Metadata + +Plugin metadata comes from `pyproject.toml` via `importlib.metadata` — not from +class attributes: + +```bash +pyne plugin list # List all installed plugins +pyne plugin info ccxt # Show details, config fields, capabilities +``` + +The `plugin_name` class attribute is optional and only used for display: + +```python +class FooPlugin(ProviderPlugin[FooConfig], CLIPlugin): + plugin_name = "Foo Service" # shown in `pyne plugin list` +``` + +## Package Naming Convention + +| Type | Package name | Example | +|-----------|---------------------------|------------------------| +| Official | `pynesys-pynecore-` | `pynesys-pynecore-foo` | +| 3rd party | `pynecore-` | `pynecore-bar` | + +## Dependencies + +For plugins with CLI capabilities, depend on the `cli` extra: + +```toml +dependencies = ["pynesys-pynecore[cli]>=6.5"] +``` + +This ensures Typer and Click are available. For provider-only plugins, +the base `pynesys-pynecore>=6.5` dependency is sufficient. diff --git a/docs/overview/compatibility.md b/docs/overview/compatibility.md index ba176b6..6b898cc 100644 --- a/docs/overview/compatibility.md +++ b/docs/overview/compatibility.md @@ -5,7 +5,7 @@ title: "Pine Script Compatibility" description: "Implementation status of Pine Script v6 features in PyneCore" icon: "checklist" date: "2026-03-28" -lastmod: "2026-03-28" +lastmod: "2026-04-13" draft: false toc: true categories: ["Overview", "Compatibility"] @@ -62,6 +62,8 @@ implementation status of all major Pine Script features. 
| `strategy.close_all()` | full | |
| `strategy.cancel_all()` | full | |
| Risk management | full | `strategy.risk.*` functions |
+| `calc_on_order_fills` | full | Re-execution after fills, var rollback / varip persist |
+| `calc_on_every_tick` | full | Live mode only — no effect on historical bars |

 ## Request Module

@@ -178,7 +180,7 @@ All Pine Script v6 enum constants are implemented:
 | `if`/`else`/`switch` | full | Via PyneComp compilation |
 | `for`/`while` loops | full | |
 | `var` (persistent) | full | `Persistent[T]` annotation |
-| `varip` (intrabar persist) | — | Not applicable in offline mode |
+| `varip` (intrabar persist) | full | Persists across re-executions (COOF and live mode) |
 | Methods on types | full | `.get()`, `.set()`, `.size()`, etc. |
 | User-defined types (UDT) | full | Via PyneComp compilation |
 | Enums | full | Via PyneComp compilation |
@@ -193,17 +195,13 @@ All Pine Script v6 enum constants are implemented:

 ## Not Applicable to PyneCore

-These Pine Script features exist only in TradingView's live charting environment and are not
-applicable to offline backtesting:
-
-| Feature | Reason |
-|----------------------|---------------------------------------------------|
-| `varip` | Intrabar persistence — offline bars are confirmed |
-| Realtime bar updates | All bars are historical in offline mode |
-| `alert()` triggers | No broker/notification integration |
-| Chart rendering | No visual chart — output is CSV |
-| `input()` UI widgets | Inputs are function parameters or TOML config |
-| Order execution | Strategy simulator, not live trading |
+These Pine Script features exist only in TradingView's live charting environment and have no
+equivalent in PyneCore:
+
+| Feature | Reason |
+|----------------------|--------------------------------------------------|
+| Chart rendering | No visual chart — output is CSV |
+| `input()` UI widgets | Inputs are function parameters or TOML config |

 ## Precision

diff --git a/pyproject.toml b/pyproject.toml
index b0b8e52..813807a 100644
--- a/pyproject.toml
+++ b/pyproject.toml
@@ -16,7 +16,7 @@ egg_info.egg_base = "build"

 [project]
 name = "pynesys-pynecore"
-version = "6.4.5"  # PineVersion.Major.Minor
+version = "6.5.0"  # PineVersion.Major.Minor
 description = "Python based Pine Script like runtime and API"
 authors = [{ name = "PYNESYS LLC", email = "hello@pynesys.com" }]
 readme = "README.md"
@@ -62,6 +62,9 @@ optional-dependencies.capitalcom = ["httpx", "pycryptodome"]
 scripts.pyne = "pynecore.cli:app"

+[project.entry-points."pyne.plugin"]
+ccxt = "pynecore.providers.ccxt:CCXTProvider"
+
 #
 # URLs

diff --git a/pytest.ini b/pytest.ini
index 4665b7e..7de1ebf 100644
--- a/pytest.ini
+++ b/pytest.ini
@@ -6,4 +6,7 @@
 log_cli_level = DEBUG
 log_cli_format = %(asctime)s %(levelname)6s %(module_func_line)30s - %(message)s
 log_cli_date_format = %Y-%m-%d %H:%M:%S
-addopts = --import-mode=importlib -rs -x --spec --ignore-glob="**/data/*modified.py"
+addopts = --import-mode=importlib -rs -x --spec --ignore-glob="**/data/*modified.py" -m "not live"
+
+markers =
+    live: tests that connect to live exchange websockets (use -m live to run)
diff --git a/src/pynecore/cli/commands/__init__.py b/src/pynecore/cli/commands/__init__.py
index bf4e4da..214b694 100644
--- a/src/pynecore/cli/commands/__init__.py
+++ b/src/pynecore/cli/commands/__init__.py
@@ -1,23 +1,18 @@
+import logging
 from pathlib import Path

+import click
 import typer

 from ..app import app, app_state
 from ..utils.error_hook import setup_global_error_logging
-from ...providers import available_providers
-
 # Import commands
-from . import run, data, compile, benchmark, debug
-
-__all__ = ['run', 'data', 'compile', 'benchmark', 'debug']
+from . import run, data, compile, benchmark, debug, plugin

-# Conditional import for private TradingView test command
-_tv_path = Path(__file__).parent / "tv.py"
-if _tv_path.exists() and _tv_path.is_symlink():
-    from . import tv
+__all__ = ['run', 'data', 'compile', 'benchmark', 'debug', 'plugin']

-    __all__.append('tv')
+logger = logging.getLogger(__name__)


 @app.callback()
@@ -228,30 +223,22 @@ def main(
     config_dir = Path(workdir) / 'config'
     config_dir.mkdir(exist_ok=True)

-    # Create providers.toml file for all supported providers (if not exists)
-    providers_file = config_dir / 'providers.toml'
-    if not providers_file.exists() or recreate_provider_config:
-        with providers_file.open('w') as f:
-            for provider in available_providers:
-                f.write(f"[{provider}]\n")
-                provider_module = __import__(f"pynecore.providers.{provider}", fromlist=[''])
-                provider_class = getattr(
-                    provider_module,
-                    [p for p in dir(provider_module) if p.endswith('Provider')][0]
-                )
-                for key, value in provider_class.config_keys.items():
-                    if key.startswith('#'):  # Comments
-                        f.write(f'{key}\n')
-                    else:
-                        if isinstance(value, str):
-                            f.write(f'{key} = "{value}"\n')
-                        elif isinstance(value, bool):
-                            f.write(f'{key} = {str(value).lower()}\n')
-                        elif isinstance(value, int) or isinstance(value, float):
-                            f.write(f'{key} = {value}\n')
-                        else:
-                            raise ValueError(f"Unsupported type for {key}: {type(value)}")
-                f.write("\n")
+    # Generate per-plugin config files for all installed plugins
+    from ...core.plugin import discover_plugins
+    from ...core.config import ensure_config
+
+    plugins_dir = config_dir / 'plugins'
+    plugins_dir.mkdir(exist_ok=True)
+
+    for name, ep in discover_plugins().items():
+        config_path = plugins_dir / f'{name}.toml'
+        if not config_path.exists() or recreate_provider_config:
+            try:
+                plugin_cls = ep.load()
+                if hasattr(plugin_cls, 'Config') and plugin_cls.Config is not None:
+                    ensure_config(plugin_cls.Config, config_path)
+            except Exception as e:
+                logger.warning("Failed to load plugin config '%s': %s", name, e)

     # Create api.toml file for PyneSys API (if not exists)
     api_file = config_dir / 'api.toml'
@@ -267,3 +254,61 @@ def main(
     # Setup global error logging
     setup_global_error_logging(workdir / "output" / "logs" / "error.log")
+
+
+# ---------------------------------------------------------------------------
+# CLIPlugin loading: subcommands and parameter hooks
+# ---------------------------------------------------------------------------
+_BUILTIN_COMMANDS = {'run', 'data', 'compile', 'benchmark', 'debug', 'plugin'}
+_PLUGGABLE_COMMANDS = {'run': run}
+
+
+def _register_cli_plugins():
+    """Load CLIPlugin subcommands and parameter hooks from installed plugins."""
+    from ...core.plugin import discover_plugins, CLIPlugin
+    from ..pluggable import PluggableCommand
+
+    for name, ep in discover_plugins().items():
+        try:
+            plugin_cls = ep.load()
+
+            if not (isinstance(plugin_cls, type) and issubclass(plugin_cls, CLIPlugin)):
+                continue
+
+            # 1. CLI subcommand registration
+            cli_app = plugin_cls.cli()
+            if cli_app is not None:
+                if name in _BUILTIN_COMMANDS:
+                    typer.secho(
+                        f"Warning: plugin '{name}' CLI name conflicts with "
+                        f"built-in command, skipping",
+                        fg="yellow", err=True,
+                    )
+                else:
+                    app.add_typer(cli_app, name=name)
+
+            # 2. Parameter hook registration
+            for cmd_name, cmd_func in _PLUGGABLE_COMMANDS.items():
+                params = plugin_cls.cli_params(cmd_name)
+                if not params:
+                    continue
+
+                group = typer.main.get_command(app)
+                assert isinstance(group, click.Group)
+                click_cmd = group.commands.get(cmd_name)
+                if not isinstance(click_cmd, PluggableCommand):
+                    continue
+
+                for param in params:
+                    if not click_cmd.register_plugin_param(param):
+                        typer.secho(
+                            f"Warning: plugin '{name}' param '{param.name}' "
+                            f"conflicts on '{cmd_name}'",
+                            fg="yellow", err=True,
+                        )
+
+        except Exception as e:
+            logger.warning("Failed to load CLI plugin '%s': %s", name, e)
+
+
+_register_cli_plugins()
diff --git a/src/pynecore/cli/commands/data.py b/src/pynecore/cli/commands/data.py
index 067328a..29bc3c0 100644
--- a/src/pynecore/cli/commands/data.py
+++ b/src/pynecore/cli/commands/data.py
@@ -1,4 +1,4 @@
-from typing import TYPE_CHECKING, TypeAlias
+from typing import TYPE_CHECKING, TypeAlias, cast
 from pathlib import Path
 from enum import Enum
 from datetime import datetime, timedelta, UTC
@@ -11,8 +11,8 @@
                            TimeElapsedColumn, TimeRemainingColumn)

 from ..app import app, app_state
-from ...providers import available_providers
-from ...providers.provider import Provider
+from ...core.plugin import discover_plugins, load_plugin
+from ...core.plugin import ProviderPlugin
 from ...lib.timeframe import in_seconds
 from ...core.data_converter import DataConverter, SupportedFormats as InputFormats
 from ...core.ohlcv_file import OHLCVReader
@@ -38,8 +38,18 @@ class AvailableProvidersEnum(Enum):
     # DateOrDays is either a datetime or a number of days
     DateOrDays = str

-    # Create an enum from available providers
-    AvailableProvidersEnum = Enum('Provider', {name.upper(): name.lower() for name in available_providers})
+    # Create an enum from plugins that are Provider subclasses
+    _provider_names = []
+    for _name, _ep in discover_plugins().items():
+        try:
+            _cls = _ep.load()
+            if isinstance(_cls, type) and issubclass(_cls, ProviderPlugin):
+                _provider_names.append(_name)
+        except Exception:  # noqa
+            pass
+    AvailableProvidersEnum = Enum('Provider', {
+        name.upper(): name.lower() for name in sorted(_provider_names)
+    })


 # Available output formats
@@ -126,19 +136,21 @@ def download(
     """
     Download historical OHLCV data
     """
-    # Import provider module from
-    provider_module = __import__(f"pynecore.providers.{provider.value}", fromlist=[''])
-    # Find the provider class (exclude base Provider class)
-    provider_class = getattr(provider_module, [
-        p for p in dir(provider_module) if p.endswith('Provider') and p != 'Provider'
-    ][0])
+    # Load provider class via plugin system
+    provider_class = cast(type[ProviderPlugin], load_plugin(provider.value))

     try:
         # If list_symbols is True, we show the available symbols then exit
         if list_symbols:
+            from ...core.config import ensure_config
+            config = None
+            config_cls: type | None = getattr(provider_class, 'Config', None)
+            if config_cls is not None:
+                config = ensure_config(config_cls,
+                                       app_state.config_dir / 'plugins' / f'{provider.value}.toml')
             with Progress(SpinnerColumn(), TextColumn("{task.description}"), transient=True) as progress:
                 progress.add_task(description="Fetching market data...", total=None)
-                provider_instance: Provider = provider_class(symbol=symbol, config_dir=app_state.config_dir)
+                provider_instance: ProviderPlugin = provider_class(symbol=symbol, config=config)
                 symbols = provider_instance.get_list_of_symbols()
             with (console := Console()).pager():
                 for s in symbols:
@@ -149,9 +161,15 @@ def download(
         secho("Error: Symbol is required!", err=True, fg=colors.RED)
         raise Exit(1)

-    # Create provider instance
-    provider_instance: Provider = provider_class(symbol=symbol, timeframe=timeframe,
-                                                 ohlv_dir=app_state.data_dir)
+    # Create provider instance with config
+    from ...core.config import ensure_config
+    config = None
+    config_cls: type | None = getattr(provider_class, 'Config', None)
+    if config_cls is not None:
+        config = ensure_config(config_cls,
+                               app_state.config_dir / 'plugins' / f'{provider.value}.toml')
+    provider_instance: ProviderPlugin = provider_class(symbol=symbol, timeframe=timeframe,
+                                                       ohlcv_dir=app_state.data_dir, config=config)

     # Download symbol info if not exists
     if force_save_info or not provider_instance.is_symbol_info_exists():
@@ -179,7 +197,8 @@ def download(
                 ohlcv_writer.truncate()

             # If the start date is "continue" (default), we resume from the last download
-            resolved_from: datetime | None = time_from
+            resolved_from: datetime = time_from
+            fetch_all = False
             if time_from == "continue":
                 end_ts = ohlcv_writer.end_timestamp
                 interval = ohlcv_writer.interval
@@ -187,27 +206,27 @@
                     resolved_from = datetime.fromtimestamp(end_ts, UTC)
                     # We need to add one interval to the start date to avoid downloading the same data
                     resolved_from += timedelta(seconds=interval)
-                elif provider.value == 'tv':  # TV provider: fetch all available data
-                    resolved_from = None
+                elif getattr(provider_class, 'fetch_all_by_default', False):
+                    resolved_from = datetime.fromtimestamp(0, UTC)
+                    fetch_all = True
                 else:
                     # No data, download one year as default
                     resolved_from = datetime.now(UTC) - timedelta(days=365)

             # We need to remove timezone info
-            if resolved_from is not None:
-                resolved_from = resolved_from.replace(tzinfo=None)
+            resolved_from = resolved_from.replace(tzinfo=None)
             time_to = time_to.replace(tzinfo=None)

             # We cannot download data from the future otherwise it would take very long
             if time_to > datetime.now(UTC).replace(tzinfo=None):
                 time_to = datetime.now(UTC).replace(tzinfo=None)

-            # Check time range (skip for TV provider when resolved_from is None)
-            if resolved_from is not None and time_to < resolved_from:
+            # Check time range (skip for fetch_all providers)
+            if not fetch_all and time_to < resolved_from:
                 secho("Error: End date (to) must be greater than start date (from)!", err=True, fg=colors.RED)
                 raise Exit(1)

             # If the start date is before the start of the existing file, we truncate the file
-            if ohlcv_writer.start_timestamp and resolved_from is not None:
+            if ohlcv_writer.start_timestamp and not fetch_all:
                 if resolved_from < ohlcv_writer.start_datetime.replace(tzinfo=None):
                     secho(f"The start date (from: {resolved_from}) is before the start of the "
                           f"existing file ({ohlcv_writer.start_datetime.replace(tzinfo=None)}).\n"
@@ -218,8 +237,8 @@ def download(
                     ohlcv_writer.seek(0)
                     ohlcv_writer.truncate()

-            # TV provider with no resolved_from: use spinner-only progress (no time-based progress bar)
-            if resolved_from is None:
+            # fetch_all provider: use spinner-only progress (no time-based progress bar)
+            if fetch_all:
                 with Progress(
                     SpinnerColumn(finished_text="[green]✓"),
                     TextColumn("{task.description}"),
                     transient=True,
                 ) as progress:
                     progress.add_task(
                         description="Downloading all available OHLCV data...",
                         total=None,
                     )
-                    # Start downloading (no progress callback - TV provider shows its own progress)
                     provider_instance.download_ohlcv(resolved_from, time_to, on_progress=None, limit=chunk_size)
             else:
-                start_from = resolved_from  # narrowed to datetime
-                total_seconds = int((time_to - start_from).total_seconds())
+                total_seconds = int((time_to - resolved_from).total_seconds())

                 # Get OHLCV data
                 with Progress(
                     SpinnerColumn(finished_text="[green]✓"),
                     TextColumn("{task.description}"),
-                    DateColumn(start_from),
+                    DateColumn(resolved_from),
                     BarColumn(),
                     TimeElapsedColumn(),
                     "/",
@@ -252,11 +269,11 @@ def download(
                     def cb_progress(current_time: datetime):
                         """ Callback to update progress """
-                        elapsed_seconds = int((current_time - start_from).total_seconds())
+                        elapsed_seconds = int((current_time - resolved_from).total_seconds())
                         progress.update(task, completed=elapsed_seconds)

                     # Start downloading
-                    provider_instance.download_ohlcv(start_from, time_to, on_progress=cb_progress, limit=chunk_size)
+                    provider_instance.download_ohlcv(resolved_from, time_to, on_progress=cb_progress, limit=chunk_size)

     except (ImportError, ValueError) as e:
         secho(str(e), err=True, fg=colors.RED)
diff --git a/src/pynecore/cli/commands/plugin.py b/src/pynecore/cli/commands/plugin.py
new file mode 100644
index 0000000..089a699
--- /dev/null
+++ b/src/pynecore/cli/commands/plugin.py
@@ -0,0 +1,191 @@
+from typer import Typer, Option, Argument, Exit, secho, colors
+
+from ..app import app
+
+__all__ = []
+
+app_plugin = Typer(help="Plugin management commands")
+app.add_typer(app_plugin, name="plugin")
+
+
+def _get_capabilities(cls: type) -> list[str]:
+    """Determine plugin capabilities from its class hierarchy."""
+    from ...core.plugin import ProviderPlugin, CLIPlugin
+    from ...core.plugin.broker import BrokerPlugin
+
+    caps = []
+    if isinstance(cls, type) and issubclass(cls, ProviderPlugin):
+        caps.append('provider')
+    if isinstance(cls, type) and issubclass(cls, BrokerPlugin):
+        caps.append('broker')
+    if isinstance(cls, type) and issubclass(cls, CLIPlugin):
+        caps.append('cli')
+    return caps
+
+
+@app_plugin.command("list")
+def list_plugins(
+        plugin_type: str = Option(
+            None, '--type', '-t',
+            help="Filter by capability (e.g. 'provider', 'cli')",
+        ),
+):
+    """
+    List all installed PyneCore plugins.
+ """ + from ...core.plugin import discover_plugins, get_plugin_metadata, get_plugin_summary + + plugins = discover_plugins() + if not plugins: + secho("No plugins installed.", fg=colors.YELLOW) + secho("") + return + + # Collect data first to calculate column widths + rows = [] + errors = [] + for name, ep in sorted(plugins.items()): + try: + cls = ep.load() + meta = get_plugin_metadata(ep) + caps = _get_capabilities(cls) + + if plugin_type and plugin_type not in caps: + continue + + display_name = getattr(cls, 'plugin_name', '') or name + version = f"v{meta['version']}" if meta['version'] else '' + caps_str = ', '.join(caps) if caps else 'library' + summary = get_plugin_summary(cls) or meta['description'] + rows.append((name, display_name, version, caps_str, summary)) + except Exception as e: + errors.append((name, str(e))) + + if not rows and not errors: + secho("No plugins found for the given filter.", fg=colors.YELLOW) + secho("") + return + + # Calculate column widths + w_name = max((len(r[0]) for r in rows), default=0) + w_disp = max((len(r[1]) for r in rows), default=0) + w_ver = max((len(r[2]) for r in rows), default=0) + + secho(f"\n Installed plugins:\n", fg=colors.BRIGHT_WHITE, bold=True) + + indent = 4 + w_name + 3 + for name, display_name, version, caps_str, summary in rows: + secho(f" {name:<{w_name}} {display_name:<{w_disp}} {version:<{w_ver}} [{caps_str}]") + if summary: + secho(f"{' ' * indent}└─ {summary}", dim=True) + + for name, error in errors: + secho(f" {name:<{w_name}} (failed to load: {error})", fg=colors.RED) + + secho("") + secho(" Use 'pyne plugin info ' for details.", dim=True) + secho("") + + +@app_plugin.command("info") +def plugin_info( + name: str = Argument(..., help="Plugin name (e.g. 'ccxt', 'capitalcom')"), +): + """ + Show detailed information about an installed plugin. 
+ """ + from ...core.plugin import discover_plugins, get_plugin_metadata, get_plugin_description + from rich.console import Console + from rich.markdown import Markdown + from rich.padding import Padding + import dataclasses + + plugins = discover_plugins() + if name not in plugins: + secho(f"Plugin '{name}' not found.", fg=colors.RED, err=True) + secho(f"Install it with: pip install pynesys-pynecore-{name} (official)", fg=colors.YELLOW, err=True) + secho(f" or: pip install pynecore-{name} (3rd party)", fg=colors.YELLOW, err=True) + raise Exit(1) + + ep = plugins[name] + try: + cls = ep.load() + except Exception as e: + secho(f"Failed to load plugin '{name}': {e}", fg=colors.RED, err=True) + raise Exit(1) + + meta = get_plugin_metadata(ep) + caps = _get_capabilities(cls) + + secho(f"\n Plugin: {name}", fg=colors.BRIGHT_WHITE, bold=True) + secho(f" Package: {meta['package']}") + secho(f" Version: {meta['version'] or 'unknown'}") + secho(f" Description: {meta['description'] or '-'}") + secho(f" Min PyneCore: {'>=' + meta['min_pynecore'] if meta['min_pynecore'] else 'any'}") + secho(f" Capabilities: {', '.join(caps) if caps else 'library'}") + secho(f" Entry point: {ep.value}") + + description = get_plugin_description(cls) + if description: + secho("\n Details:", fg=colors.BRIGHT_WHITE, bold=True) + Console().print(Padding(Markdown(description), (0, 0, 0, 2))) + + if 'broker' in caps: + from rich.table import Table + from rich import box + from ...core.broker.models import CapabilityLevel + + # ``get_capabilities`` is an instance method but well-behaved broker + # plugins return a static dataclass with no I/O. Try a default + # construction first; if the plugin requires its Config dataclass + # (e.g. Capital.com asserts on it), retry with a default-constructed + # Config. Fall back to a warning if even that fails — long-term fix + # is to make ``get_capabilities`` a classmethod. 
+        ex_caps = None
+        intro_error: Exception | None = None
+        try:
+            ex_caps = cls().get_capabilities()  # type: ignore[call-arg]
+        except Exception as e1:
+            intro_error = e1
+            cfg_cls = getattr(cls, 'Config', None)
+            if cfg_cls is not None and dataclasses.is_dataclass(cfg_cls):
+                try:
+                    ex_caps = cls(config=cfg_cls()).get_capabilities()  # type: ignore[call-arg]
+                    intro_error = None
+                except Exception as e2:
+                    intro_error = e2
+        if intro_error is not None:
+            secho(
+                f"\n Exchange Capabilities: (cannot introspect: {intro_error})",
+                fg=colors.YELLOW,
+            )
+        else:
+            assert ex_caps is not None
+            secho("\n Exchange Capabilities:", fg=colors.BRIGHT_WHITE, bold=True)
+            level_color = {
+                CapabilityLevel.NATIVE: "green",
+                CapabilityLevel.PARTIAL_NATIVE: "yellow",
+                CapabilityLevel.SOFTWARE: "cyan",
+                CapabilityLevel.UNSUPPORTED: "dim white",
+            }
+            table = Table(box=box.SIMPLE, show_header=False, padding=(0, 2, 0, 2))
+            table.add_column("Capability", style="white")
+            table.add_column("Level")
+            # noinspection PyDataclass
+            for f in dataclasses.fields(ex_caps):
+                value: CapabilityLevel = getattr(ex_caps, f.name)
+                colour = level_color.get(value, "white")
+                table.add_row(f.name, f"[{colour}]{value.value}[/{colour}]")
+            Console().print(Padding(table, (0, 0, 0, 2)))
+
+    config_cls: type | None = getattr(cls, 'Config', None)
+    if config_cls is not None and dataclasses.is_dataclass(config_cls):
+        # noinspection PyDataclass
+        fields = dataclasses.fields(config_cls)
+        if fields:
+            secho(f"\n Config fields (defaults):")
+            for f in fields:
+                default = f"= {f.default!r}" if f.default is not dataclasses.MISSING else "(required)"
+                secho(f" {f.name:20s} {default}")
+
+    secho("")
diff --git a/src/pynecore/cli/commands/run.py b/src/pynecore/cli/commands/run.py
index a860f04..a67ffa3 100644
--- a/src/pynecore/cli/commands/run.py
+++ b/src/pynecore/cli/commands/run.py
@@ -6,25 +6,32 @@
 import tomllib

 from pathlib import Path
-from datetime import datetime
+from datetime import datetime, timedelta, UTC
+from typing import Any

-from typer import Option, Argument, secho, Exit
+from typer import Option, Argument, secho, Exit, colors
 from rich.progress import (Progress, SpinnerColumn, TextColumn, BarColumn,
-                           ProgressColumn, Task)
+                           ProgressColumn, Task, TimeElapsedColumn, TimeRemainingColumn)
 from rich.text import Text
 from rich.console import Console

 from ..app import app, app_state
+from ..pluggable import PluggableCommand
 from ...utils.rich.date_column import DateColumn
 from pynecore.core.ohlcv_file import OHLCVReader
 from pynecore.core.data_converter import DataConverter, DataFormatError, ConversionError
 from pynecore.core.aggregator import validate_aggregation
+from pynecore.lib.log import logger as pyne_logger
 from pynecore.lib.timeframe import in_seconds
+from pynecore.core.broker.exceptions import BrokerManualInterventionError
+from pynecore.lib.log import broker_info, broker_warning
 from pynecore.core.syminfo import SymInfo
 from pynecore.core.script_runner import ScriptRunner
 from pynecore.pynesys.compiler import PyneComp
+from pynecore.core.provider_string import ProviderString, is_provider_string, parse_provider_string
+from pynecore.core.live_runner import live_ohlcv_generator
 from ...cli.utils.api_error_handler import APIErrorHandler

 __all__ = []
@@ -62,19 +69,233 @@
     def render(self, task: Task) -> Text:
         return Text(f"{minutes:02d}:{seconds:06.3f}", style="cyan")


-@app.command()
+def _parse_time_value(value: str | None, *, allow_bars: bool = False) -> datetime | int | None:
+    """
+    Parse a --from or --to parameter value.
+
+    :param value: The raw string value.
+    :param allow_bars: If True, allow negative numbers as bar counts.
+    :return: A datetime, a negative int (bar count), or None.
+ """ + if value is None: + return None + value: str = value.strip() + + # Negative number = bar count (only for --from in provider mode) + if allow_bars and value.startswith('-'): + try: + bars = int(value) + return bars + except ValueError: + pass + + # Positive number = days back + try: + days = int(value) + if days < 0: + secho("Error: Days cannot be negative (use negative numbers only with provider mode for bar count)", + err=True, fg=colors.RED) + raise Exit(1) + return (datetime.now(UTC) - timedelta(days=days)).replace(second=0, microsecond=0) + except ValueError: + pass + + # Date string + try: + return datetime.fromisoformat(value) + except ValueError: + secho(f"Error: Invalid date or number: '{value}'", err=True, fg=colors.RED) + raise Exit(1) + + +class _ProviderData: + """Result of provider data download, including the provider instance for live mode.""" + + def __init__(self, ohlcv_path: Path, syminfo: 'SymInfo', parsed_string: ProviderString, + provider_instance=None, time_from_ts: int | None = None): + self.ohlcv_path = ohlcv_path + self.syminfo = syminfo + self.provider_instance = provider_instance + self.parsed_string: ProviderString = parsed_string + # Exact start timestamp that yields exactly the requested bar + # count in ``-N bars`` mode — None when the caller should use + # the file's natural start (date/days mode, no bar target). + self.time_from_ts = time_from_ts + + +def _download_provider_data(provider_str: str, time_from_str: str | None) -> _ProviderData: + """ + Download historical data from a provider and return the result. + + :param provider_str: Provider string (e.g. "ccxt:BYBIT:BTC/USDT:USDT@1D"). + :param time_from_str: The --from parameter value (date, days, or -bars). + :return: _ProviderData with ohlcv_path, syminfo, and provider instance. 
+ """ + from pynecore.core.plugin import load_plugin, ProviderPlugin + from pynecore.core.config import ensure_config + from pynecore.lib.timeframe import in_seconds + + ps = parse_provider_string(provider_str, require_timeframe=True) + + # Load provider plugin + provider_class = load_plugin(ps.provider) + if not issubclass(provider_class, ProviderPlugin): + secho(f"Plugin '{ps.provider}' is not a data provider.", err=True, fg=colors.RED) + raise Exit(1) + + # Default to -500 bars if --from not specified in provider mode + if not time_from_str: + time_from_str = "-500" + + time_from_value = _parse_time_value(time_from_str, allow_bars=True) + time_to_dt = datetime.now(UTC).replace(second=0, microsecond=0) + + # Convert bar count to time range. ``bar_count`` being set signals + # the "-N bars" mode — we then guarantee at least N *real* bars + # (exchange-provided, non-gap-fill) after download, extending the + # from-timestamp on miss. Date/days ranges are left untouched. + tf_seconds = in_seconds(ps.timeframe) + bar_count: int | None = None + if isinstance(time_from_value, int) and time_from_value < 0: + bc = abs(time_from_value) + bar_count = bc + # Pad the request by one bar to absorb the still-forming current + # bar that closed-bars-only providers (e.g. Capital.com) filter + # out of history responses. Without this, every ``-N`` run + # against a now-aligned end-time would burn a wasted retry pass + # (``real_bars == N - 1`` on first attempt → retry → success). 
+        time_from_dt = time_to_dt - timedelta(seconds=tf_seconds * (bc + 1))
+    else:
+        assert isinstance(time_from_value, datetime)
+        time_from_dt = time_from_value
+
+    # Load config
+    config = None
+    config_cls: type | None = getattr(provider_class, 'Config', None)
+    if config_cls is not None:
+        config = ensure_config(config_cls,
+                               app_state.config_dir / 'plugins' / f'{ps.provider}.toml')
+
+    # Create provider instance
+    provider_instance: ProviderPlugin = provider_class(
+        symbol=ps.symbol, timeframe=ps.timeframe,
+        ohlcv_dir=app_state.data_dir, config=config
+    )
+
+    # Fetch symbol info
+    with Progress(SpinnerColumn(finished_text="[green]✓"), TextColumn("{task.description}")) as progress:
+        task = progress.add_task("Fetching symbol info...", total=1)
+        syminfo = provider_instance.get_symbol_info(force_update=not provider_instance.is_symbol_info_exists())
+        progress.update(task, completed=1)
+
+    # Download OHLCV data (always fresh in provider mode). In bar-count
+    # mode we may re-download with an extended ``from`` until we hit the
+    # target — some feeds omit minutes with no ticks (CFD quiet hours,
+    # illiquid futures), and ``--from -500`` must mean 500 real bars.
+    # The Progress wrapper lives outside the retry loop so a gap-driven
+    # second pass updates the same spinner instead of stamping a
+    # duplicate ``Downloading OHLCV data...`` line.
+    max_retries = 4
+    with Progress(
+        SpinnerColumn(finished_text="[green]✓"),
+        TextColumn("{task.description}"),
+        DateColumn(),
+        BarColumn(),
+        TimeElapsedColumn(),
+        "/",
+        TimeRemainingColumn(),
+    ) as progress:
+        task = progress.add_task(
+            "Downloading OHLCV data...", total=1, start_time=time_from_dt,
+        )
+        for attempt in range(max_retries + 1):
+            with provider_instance as ohlcv_writer:
+                ohlcv_writer.seek(0)
+                ohlcv_writer.truncate()
+
+                time_from_dl = time_from_dt.replace(tzinfo=None) if time_from_dt.tzinfo else time_from_dt
+                time_to_dl = time_to_dt.replace(tzinfo=None) if time_to_dt.tzinfo else time_to_dt
+
+                total_seconds = int((time_to_dl - time_from_dl).total_seconds())
+
+                progress.update(
+                    task, total=total_seconds, completed=0,
+                    start_time=time_from_dl,
+                )
+
+                def cb_progress(current_time: datetime):
+                    elapsed_seconds = int((current_time - time_from_dl).total_seconds())
+                    progress.update(task, completed=elapsed_seconds)
+
+                provider_instance.download_ohlcv(time_from_dl, time_to_dl, on_progress=cb_progress)
+
+            if bar_count is None:
+                break
+
+            # Count real bars (``volume >= 0``) in the requested range —
+            # gap-fill rows (``volume == -1``) emitted by the OHLCV writer
+            # don't count against the target.
+            with OHLCVReader(provider_instance.ohlcv_path) as r:  # type: ignore[arg-type]
+                real_bars = sum(1 for _ in r.read_from(
+                    int(time_from_dt.timestamp()),
+                    int(time_to_dt.timestamp()),
+                    skip_gaps=True,
+                ))

+            if real_bars >= bar_count:
+                break
+            if attempt == max_retries:
+                secho(
+                    f"Warning: requested {bar_count} bars, only got {real_bars} "
+                    f"real bars after {max_retries + 1} attempts (feed has "
+                    f"sparse coverage).",
+                    fg=colors.YELLOW,
+                )
+                break
+
+            # Extend the range by the shortfall plus a proportional buffer so
+            # we likely finish in one extra pass instead of retrying again.
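Extracted as a standalone function (hypothetical name, same formula as the retry loop above), the proportional buffer behaves like this:

```python
def extra_bars_needed(bar_count: int, real_bars: int) -> int:
    # Shortfall, scaled up by the observed gap ratio, plus a constant
    # floor of 10 — mirrors the extension step in the retry loop.
    missing = bar_count - real_bars
    gap_ratio = missing / real_bars if real_bars else 1.0
    return int(missing * (1.0 + gap_ratio)) + 10


assert extra_bars_needed(500, 450) == 65     # 50 missing, ~11% gaps -> 55 + 10
assert extra_bars_needed(500, 0) == 1010     # empty feed -> double the request + 10
```

The proportional term means a feed with 10% gaps gets roughly 10% extra headroom on top of the raw shortfall, so sparse feeds usually converge in one additional pass.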
+            missing = bar_count - real_bars
+            gap_ratio = missing / real_bars if real_bars else 1.0
+            extra_bars = int(missing * (1.0 + gap_ratio)) + 10
+            time_from_dt -= timedelta(seconds=tf_seconds * extra_bars)
+
+    assert provider_instance.ohlcv_path is not None
+
+    # For bar-count mode, pin the start timestamp to the N-th last real
+    # bar so the reader serves *exactly* N bars, not the over-fetched
+    # surplus we used to guarantee coverage through gaps.
+    exact_from_ts: int | None = None
+    if bar_count is not None:
+        with OHLCVReader(provider_instance.ohlcv_path) as r:  # type: ignore[arg-type]
+            real_ts = [b.timestamp for b in r.read_from(
+                0, int(time_to_dt.timestamp()), skip_gaps=True,
+            )]
+        if len(real_ts) >= bar_count:
+            exact_from_ts = real_ts[-bar_count]
+
+    return _ProviderData(
+        ohlcv_path=provider_instance.ohlcv_path,
+        syminfo=syminfo,
+        provider_instance=provider_instance,
+        parsed_string=ps,
+        time_from_ts=exact_from_ts,
+    )
+
+
+@app.command(cls=PluggableCommand)
 def run(
         script: Path = Argument(..., dir_okay=False, file_okay=True,
                                 help="Script to run (.py or .pine)"),
-        data: Path = Argument(..., dir_okay=False, file_okay=True,
-                              help="Data file to use (*.ohlcv)"),
-        time_from: datetime | None = Option(None, '--from', '-f',
-                                            formats=["%Y-%m-%d", "%Y-%m-%d %H:%M:%S"],
-                                            help="Start date (UTC), if not specified, will use the "
-                                                 "first date in the data"),
-        time_to: datetime | None = Option(None, '--to', '-t',
-                                          formats=["%Y-%m-%d", "%Y-%m-%d %H:%M:%S"],
-                                          help="End date (UTC), if not specified, will use the last "
-                                               "date in the data"),
+        data: str = Argument(...,
                             help="Data file (*.ohlcv, *.csv) or provider string "
                                  "(e.g. ccxt:BYBIT:BTC/USDT:USDT@1D)"),
+        time_from: str | None = Option(None, '--from', '-f',
                                       metavar="[DATE|DAYS|-BARS]",
                                       help="Start: date (2025-01-01), days back (30), "
                                            "or -N bars back (-500). Default: -500 bars in provider mode."),
+        time_to: str | None = Option(None, '--to', '-t',
                                     metavar="[DATE|DAYS]",
                                     help="End: date or days from start (default: end of data or now)"),
         plot_path: Path | None = Option(None, "--plot", "-pp",
                                         help="Path to save the plot data",
                                         rich_help_panel="Out Path Options"),
@@ -89,6 +310,26 @@
                                      help="PyneSys API key for compilation (overrides configuration file)",
                                      envvar="PYNESYS_API_KEY",
                                      rich_help_panel="Compilation Options"),
+        live: bool = Option(False, "--live", "-l",
                            help="Continue with live data after historical phase "
                                 "(provider mode only)"),
+        broker: bool = Option(False, "--broker",
                              help="Enable live broker trading — requires a provider plugin that "
                                   "subclasses BrokerPlugin. Implies --live.",
                              rich_help_panel="Live Options"),
+        run_label: str | None = Option(None, "--run-label",
                                       help="Optional label to distinguish parallel instances of the "
                                            "same strategy+account+symbol+timeframe. Stored in the "
                                            "broker run_id as ``...#