Client#
The tradingstrategy.client.Client Python class in the Trading Strategy framework.
- class Client[source]#
Bases:
BaseClient
An API client for querying the Trading Strategy datasets from a server.
The client will download datasets.
In-built disk cache is offered, so that large datasets are not redownloaded unnecessarily.
There is protection against network errors: dataset downloads are retried in the case of data corruption errors.
A download progress bar is displayed when possible.
You can use Client in
- Jupyter Notebook environments - see Code examples for an example
- Python application environments - see the example below
- Integration tests - see Client.create_test_client()
Python application usage:

    import os

    trading_strategy_api_key = os.environ["TRADING_STRATEGY_API_KEY"]

    client = Client.create_live_client(trading_strategy_api_key)
    exchange_universe = client.fetch_exchange_universe()
    print(f"Dataset contains {len(exchange_universe.exchanges)} exchanges")
- __init__(env, transport)[source]#
Do not call the constructor directly; use one of the create methods instead.
- Parameters:
env (Environment) –
transport (CachedHTTPTransport) –
Methods

- __init__(env, transport) – Do not call the constructor directly; use one of the create methods instead.
- clear_caches([filename]) – Remove any cached data.
- close() – Close the streams of the underlying transport.
- create_jupyter_client([cache_path, api_key, ...]) – Create a new API client.
- create_live_client([api_key, cache_path]) – Create a live trading instance of the client.
- create_pyodide_client_async([cache_path, ...]) – Create a new API client inside a Pyodide environment.
- create_test_client([cache_path]) – Create a new Capitalgram client to be used with automated test suites.
- fetch_all_candles(bucket) – Get a cached blob of candle data of a certain candle width.
- fetch_all_liquidity_samples(bucket) – Get a cached blob of liquidity events of a certain time window.
- fetch_candle_dataset(bucket) – Fetch candle data from the server.
- fetch_candles_by_pair_ids(pair_ids, bucket) – Fetch candles for particular trading pairs.
- fetch_chain_status(chain_id) – Get live information about how indexing and candle creation are doing for a certain blockchain.
- fetch_exchange_universe() – Fetch the list of all exchanges from the dataset server.
- fetch_lending_candles_by_reserve_id(...[, ...]) – Fetch lending candles for a particular reserve.
- fetch_lending_reserve_universe() – Get a cached blob of lending protocol reserve events and precomputed stats.
- fetch_lending_reserves_all_time() – Get a cached blob of lending protocol reserve events and precomputed stats.
- fetch_pair_universe() – Fetch the pair universe from the local cache or the candle server.
- fetch_trading_data_availability(pair_ids, bucket) – Check the trading data availability at the oracle's real-time market feed endpoint.
- … – Checks that everything is OK to run the notebook.
- setup_notebook() – Set up diagram rendering and such.
- __init__(env, transport)[source]#
Do not call the constructor directly; use one of the create methods instead.
- Parameters:
env (Environment) –
transport (CachedHTTPTransport) –
- clear_caches(filename=None)[source]#
Remove any cached data.
Cache is specific to the current transport.
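Example (a minimal sketch; assumes a client instance was already created with one of the create methods):

    # Wipe all cached datasets for this transport, then re-download fresh data
    client.clear_caches()
    pairs_table = client.fetch_pair_universe()  # forces a fresh download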
- fetch_pair_universe()[source]#
Fetch pair universe from local cache or the candle server.
The compressed file size is around 5 megabytes.
If the download seems to be corrupted, it will be attempted 3 times.
- Return type:
Table
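Example (a minimal sketch; converting the returned PyArrow Table with to_pandas() follows the example under fetch_trading_data_availability()):

    # Assumes `client` was created with one of the create methods
    pairs_table = client.fetch_pair_universe()   # ~5 MB compressed download, cached locally
    pairs_df = pairs_table.to_pandas()           # convert to a pandas DataFrame for filtering
    print(f"Pair universe contains {len(pairs_df)} trading pairs")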
- fetch_exchange_universe()[source]#
Fetch the list of all exchanges from the dataset server.
- Return type:
- fetch_all_candles(bucket)[source]#
Get cached blob of candle data of a certain candle width.
The returned data can be between several hundreds of megabytes to several gigabytes and is cached locally.
The returned data is saved in PyArrow Parquet format.
For more information see tradingstrategy.candle.Candle.
If the download seems to be corrupted, it will be attempted 3 times.
- Parameters:
bucket (TimeBucket) –
- Return type:
Table
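Example (a minimal sketch; the TimeBucket import path is an assumption):

    from tradingstrategy.timebucket import TimeBucket  # assumed import path

    # Assumes `client` was created with one of the create methods.
    # The first call downloads a large Parquet blob and caches it on disk;
    # subsequent calls read the local cache.
    candles_table = client.fetch_all_candles(TimeBucket.d1)
    candles_df = candles_table.to_pandas()
    print(f"Downloaded {len(candles_df)} daily candles")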
- fetch_candles_by_pair_ids(pair_ids, bucket, start_time=None, end_time=None, max_bytes=None, progress_bar_description=None)[source]#
Fetch candles for particular trading pairs.
This is the right API to use if you want data only for a single trading pair or a few trading pairs. If the number of trading pairs is small, this download is much more lightweight than the Parquet dataset download.
The fetch is performed using JSONL API endpoint. This endpoint always returns real-time information.
- Parameters:
pair_ids (Collection[int]) – Trading pair internal ids we query data for. Get internal ids from the pair dataset.
bucket (TimeBucket) – Candle time frame.
start_time (Optional[datetime]) – All candles after this. If not given, start from genesis.
max_bytes (Optional[int]) – Limit the streaming response size.
progress_bar_description (Optional[str]) – Display on the download progress bar.
- Returns:
Candles dataframe
- Raises:
tradingstrategy.transport.jsonl.JSONLMaxResponseSizeExceeded – If the max_bytes limit is breached
- Return type:
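Example (a minimal sketch; the pair ids are placeholders and the TimeBucket import path is an assumption):

    import datetime
    from tradingstrategy.timebucket import TimeBucket  # assumed import path

    # Hypothetical internal pair ids, normally looked up from the pair dataset
    pair_ids = [1, 2]

    # Assumes `client` was created with one of the create methods
    candles_df = client.fetch_candles_by_pair_ids(
        pair_ids,
        TimeBucket.h1,
        start_time=datetime.datetime(2023, 1, 1),
        end_time=datetime.datetime(2023, 2, 1),
        progress_bar_description="Downloading candles for 2 pairs",
    )
    print(candles_df.head())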
- fetch_trading_data_availability(pair_ids, bucket)[source]#
Check the trading data availability at oracle’s real time market feed endpoint.
Trading Strategy oracle uses sparse data format where candles with zero trades are not generated. This is better suited for illiquid DEX markets with few trades.
Because of the sparse data format, we do not know if the last candle is available - a candle may not be available yet, or there might not be any trades to generate a candle.
This endpoint allows you to check the trading data availability for multiple trading pairs.
Example:
    exchange_universe = client.fetch_exchange_universe()
    pairs_df = client.fetch_pair_universe().to_pandas()

    # Create filtered exchange and pair data
    exchange = exchange_universe.get_by_chain_and_slug(ChainId.bsc, "pancakeswap-v2")
    pair_universe = PandasPairUniverse.create_pair_universe(
        pairs_df,
        [(exchange.chain_id, exchange.exchange_slug, "WBNB", "BUSD")],
    )
    pair = pair_universe.get_single()

    # Get the latest candle availability for BNB-BUSD pair
    pairs_availability = client.fetch_trading_data_availability({pair.pair_id}, TimeBucket.m15)
- Parameters:
pair_ids (Collection[int]) – Trading pair internal ids we query data for. Get internal ids from the pair dataset.
bucket (TimeBucket) – Candle time frame.
- Returns:
Map of pairs -> their trading data availability
- Return type:
- fetch_candle_dataset(bucket)[source]#
Fetch candle data from the server.
Does not attempt to decode the Parquet file into memory; instead, the raw downloaded data is returned.
- Parameters:
bucket (TimeBucket) –
- Return type:
- fetch_lending_candles_by_reserve_id(reserve_id, bucket, candle_type=LendingCandleType.variable_borrow_apr, start_time=None, end_time=None)[source]#
Fetch lending candles for a particular reserve.
- Parameters:
reserve_id (int) – Lending reserve’s internal id we query data for. Get internal id from lending reserve universe dataset.
bucket (TimeBucket) – Candle time frame.
candle_type (LendingCandleType) – Lending candle type.
start_time (Optional[datetime]) – All candles after this. If not given, start from genesis.
- Returns:
Lending candles dataframe
- Return type:
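Example (a minimal sketch; the reserve id is a placeholder, and the TimeBucket and LendingCandleType import paths are assumptions):

    import datetime
    from tradingstrategy.timebucket import TimeBucket      # assumed import path
    from tradingstrategy.lending import LendingCandleType  # assumed import path

    # Hypothetical reserve id, normally looked up from the lending reserve universe
    reserve_id = 1

    # Assumes `client` was created with one of the create methods
    lending_df = client.fetch_lending_candles_by_reserve_id(
        reserve_id,
        TimeBucket.d1,
        candle_type=LendingCandleType.variable_borrow_apr,
        start_time=datetime.datetime(2023, 1, 1),
    )
    print(lending_df.head())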
- fetch_all_liquidity_samples(bucket)[source]#
Get cached blob of liquidity events of a certain time window.
The returned data can be between several hundreds of megabytes to several gigabytes and is cached locally.
The returned data is saved in PyArrow Parquet format.
For more information see
tradingstrategy.liquidity.XYLiquidity
.If the download seems to be corrupted, it will be attempted 3 times.
- Parameters:
bucket (TimeBucket) –
- Return type:
Table
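Example (a minimal sketch mirroring the candle download above; the TimeBucket import path is an assumption):

    from tradingstrategy.timebucket import TimeBucket  # assumed import path

    # Assumes `client` was created with one of the create methods;
    # the blob is large and cached locally after the first download
    liquidity_table = client.fetch_all_liquidity_samples(TimeBucket.d1)
    liquidity_df = liquidity_table.to_pandas()
    print(f"Downloaded {len(liquidity_df)} liquidity samples")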
- fetch_lending_reserve_universe()[source]#
Get a cached blob of lending protocol reserve events and precomputed stats.
The returned data can be between several hundreds of megabytes to several gigabytes in size, and is cached locally.
Note that at present the only available data is for the AAVE v3 lending protocol.
The returned data is saved in a PyArrow Parquet format.
If the download seems to be corrupted, it will be attempted 3 times.
- Return type:
Table
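Example (a minimal sketch; reading the returned PyArrow Table directly):

    # Assumes `client` was created with one of the create methods
    reserves_table = client.fetch_lending_reserve_universe()
    print(f"Lending reserve universe has {reserves_table.num_rows} rows")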
- fetch_lending_reserves_all_time()[source]#
Get a cached blob of lending protocol reserve events and precomputed stats.
The returned data can be between several hundreds of megabytes to several gigabytes in size, and is cached locally.
Note that at present the only available data is for the AAVE v3 lending protocol.
The returned data is saved in a PyArrow Parquet format.
If the download seems to be corrupted, it will be attempted 3 times.
- Return type:
Table
- fetch_chain_status(chain_id)[source]#
Get live information about how indexing and candle creation are doing for a certain blockchain.
- classmethod setup_notebook()[source]#
Set up diagram rendering and such.
Force high DPI output for all images.
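Example (a minimal sketch; typically called once at the top of a notebook before creating the client):

    from tradingstrategy.client import Client

    # Configure diagram rendering and high-DPI image output for the notebook
    Client.setup_notebook()
    client = Client.create_jupyter_client()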
- async classmethod create_pyodide_client_async(cache_path=None, api_key='secret-token:tradingstrategy-d15c94d954abf9d98847f88d54403720ce52e41f267f5aaf16e63fcd30256af0', remember_key=False)[source]#
Create a new API client inside a Pyodide environment.
More information about the Pyodide project and running Python in a browser.
- Parameters:
api_key – The API key used with the server downloads. A special hardcoded API key is used to identify the Pyodide client and its XmlHttpRequests. A referral check for these requests is performed.
remember_key – Store the API key in IndexedDB for future use.
- Returns:
pass
- Return type:
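Example (a minimal sketch of a Pyodide / JupyterLite notebook cell, where top-level await is available):

    from tradingstrategy.client import Client

    # Uses the hardcoded Pyodide API key; remember_key=True stores the key in IndexedDB
    client = await Client.create_pyodide_client_async(remember_key=True)
    exchange_universe = client.fetch_exchange_universe()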
- classmethod create_jupyter_client(cache_path=None, api_key=None, pyodide=None)[source]#
Create a new API client.
This function is intended to be used from Jupyter notebooks:
- Any local or server-side IPython session
- JupyterLite notebooks
- Parameters:
api_key (Optional[str]) – If not given, do an interactive API key set up in the Jupyter notebook while it is being run.
cache_path (Optional[str]) – Where downloaded datasets are stored. Defaults to ~/.cache.
pyodide – Detect the use of this library inside Pyodide / JupyterLite. If None then autodetect Pyodide presence, otherwise can be forced with True.
- Return type:
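Example (a minimal notebook sketch; if no api_key is given, the interactive key set-up described above is started):

    from tradingstrategy.client import Client

    # Prompts for an API key interactively on the first run,
    # then caches downloaded datasets under ~/.cache by default
    client = Client.create_jupyter_client()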
- classmethod create_test_client(cache_path=None)[source]#
Create a new Capitalgram client to be used with automated test suites.
Reads the API key from the environment variable TRADING_STRATEGY_API_KEY. A temporary folder is used as a cache path.
By default, the test client caches data under /tmp folder. Tests do not clear this folder between test runs, to make tests faster.
- Return type:
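Example (a minimal pytest-style sketch; the fixture name is arbitrary and TRADING_STRATEGY_API_KEY must be set in the environment):

    import pytest
    from tradingstrategy.client import Client

    @pytest.fixture(scope="session")
    def client() -> Client:
        # Reads TRADING_STRATEGY_API_KEY and caches datasets under /tmp
        return Client.create_test_client()

    def test_exchange_universe(client):
        exchange_universe = client.fetch_exchange_universe()
        assert len(exchange_universe.exchanges) > 0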