quantpylib.hft.feed

Feed

__init__(gateway, exchanges=None, archiver=False)

Initialize the feeder object.

Parameters:

gateway (Gateway): Initialized gateway object. Required.
exchanges (list): The list of exchanges to subscribe to. If None, use the clients in the gateway. Defaults to None.
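
A minimal construction sketch follows. The Feed arguments match the signature above; the Gateway import path, its config_keys argument, and the init_clients() setup call are assumptions about the rest of quantpylib and may need adjusting to your installation.

    import asyncio

    from quantpylib.hft.feed import Feed
    # Assumed gateway import path and setup; adjust to your installation.
    from quantpylib.gateway.master import Gateway

    async def main():
        # Hypothetical credentials mapping; public market-data feeds typically need no keys.
        gateway = Gateway(config_keys={"binance": {}})
        await gateway.init_clients()  # assumed async initialization step

        # archiver=True enables the archive buffers used by the flush/scheduler
        # methods documented below.
        feed = Feed(gateway=gateway, exchanges=["binance"], archiver=True)
        try:
            ...  # subscribe to feeds here (see add_l2_book_feed / add_trades_feed)
        finally:
            await feed.cleanup()

    if __name__ == "__main__":
        asyncio.run(main())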

add_handler_to_feed(feed_id, handler)

Add a handler to a feed.

Parameters:

feed_id (str): The feed id. Required.
handler (coroutine): The handler to add. Required.
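
For example, a second consumer can be attached to an already-subscribed feed. The handler payload is not specified here, so the sketch assumes the coroutine is awaited with the feed's data object, and it uses get_feed_id (documented below); cross-check the id against feed.get_feed_ids().

    from quantpylib.hft.feed import Feed, FeedCls, FeedType

    async def log_book(book):
        # Assumption: handlers are awaited with the feed's data object on each update.
        print(book)

    def attach_logger(feed, ticker="BTCUSDT"):
        # Compose the id of an existing level-2 book feed; the kwargs used here
        # (ticker=...) are an assumption mirroring the subscription call.
        feed_id = Feed.get_feed_id("binance", FeedCls.PERPETUAL, FeedType.L2BOOK, ticker=ticker)
        feed.add_handler_to_feed(feed_id, log_book)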

add_l2_book_feed(exc, ticker, handler=None, depth=20, buffer=100, apply_shadow_depth=False, **kwargs) async

Add a level 2 order book feed.

Parameters:

exc (str): The exchange. Required.
ticker (str): The ticker. Required.
handler (coroutine): The handler to add. Defaults to None.
depth (int): The depth of the order book. Defaults to 20.
buffer (int): The buffer size. Defaults to 100.
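
A subscription sketch, assuming feed is an initialized Feed (see the constructor example above) and that the handler coroutine is awaited with the maintained order book object:

    async def on_l2(lob):
        # Assumed handler payload: the order book kept by the feed at the
        # requested depth; print it once to inspect its exact shape.
        print(lob)

    async def subscribe_btc_book(feed):
        # Maintain a 20-level BTCUSDT book on binance, buffering the last 100 updates.
        await feed.add_l2_book_feed(
            exc="binance",
            ticker="BTCUSDT",
            handler=on_l2,
            depth=20,
            buffer=100,
        )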

add_trades_feed(exc, ticker, handler=None, buffer=100, **kwargs) async

Add a trades feed.

Parameters:

exc (str): The exchange. Required.
ticker (str): The ticker. Required.
handler (coroutine): The handler to add. Defaults to None.
buffer (int): The buffer size. Defaults to 100.
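
The trades subscription mirrors the book subscription; the handler payload shape is again an assumption:

    async def on_trades(trades):
        # Assumption: awaited with the latest trade records for the ticker.
        print(trades)

    async def subscribe_btc_trades(feed):
        await feed.add_trades_feed(exc="binance", ticker="BTCUSDT", handler=on_trades, buffer=100)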

cleanup() async

Clean up the feeder object.

flush_archives(path='./archives/', prefix='', postfix='', idx='')

Flush the archives to disk. Files are written to {path}/{exchange}/{feed_cls}/{feed_type}/{prefix}{ticker}{_metadata}{postfix}.parquet.

Parameters:

path (str): The path to the archives. Defaults to './archives/'.
prefix (str): The prefix for the archive filename. Defaults to ''.
postfix (str): The postfix for the archive filename. Defaults to ''.
idx (str): The partial archive index. Defaults to ''.
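
For example, a Feed created with archiver=True can be flushed manually (the archive scheduler below does this automatically); the postfix here is only illustrative:

    def flush_now(feed):
        # Writes one parquet file per archived feed under ./archives/, following
        # the filename template above; "_manual" is an illustrative postfix.
        feed.flush_archives(path="./archives/", postfix="_manual")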

get_feed(feed_id)

Get the feed object or buffer associated with feed_id.

Parameters:

feed_id (str): The feed id. Required.

get_feed_id(exc, feed_cls, feed_type, **kwargs) staticmethod

Get the feed id.

Parameters:

exc (str): The exchange. Required.
feed_cls (str): The asset class. Required.
feed_type (str): The feed type. Required.

get_feed_ids()

Get list of all feed ids.
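
These three methods work together: list the live ids, compose an id, and pull the feed object or buffer for it. The kwargs passed to get_feed_id below (ticker=...) are an assumption mirroring the subscription parameters; compare against the get_feed_ids() output.

    from quantpylib.hft.feed import Feed, FeedCls, FeedType

    def inspect_feeds(feed):
        print(feed.get_feed_ids())  # all ids currently registered on this Feed

        feed_id = Feed.get_feed_id(
            "binance", FeedCls.PERPETUAL, FeedType.L2BOOK, ticker="BTCUSDT"
        )
        return feed.get_feed(feed_id)  # feed object or buffer for that id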

load_lob_archive(exc, ticker, depth, feed_cls=FeedCls.PERPETUAL, path='./archives/', prefix='', postfix='', raw=False) staticmethod

Load a parquet LOB archive from disk. Loads the file from {path}/{exchange}/{feed_cls}/{feed_type}/{prefix}{ticker}{postfix}.parquet. An example would be ./archives/binance/perp/l2book/BTCUSDT_depth20.parquet.

Parameters:

exc (str): The exchange. Required.
ticker (str): The ticker. Required.
depth (int): The depth of the order book archive. Required.
feed_cls (str): The asset class. Defaults to FeedCls.PERPETUAL.
path (str): The path to the archives. Defaults to './archives/'.
prefix (str): The prefix for the archive. Defaults to ''.
postfix (str): The postfix for the archive. Defaults to ''.
raw (bool): Whether to return the raw loaded parquet dataframe or the processed archive [{ts,b,a}]. Defaults to False.
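
A loading sketch targeting the documented example file ./archives/binance/perp/l2book/BTCUSDT_depth20.parquet (that the depth suffix is derived from the depth argument is an inference from the example, not a documented guarantee):

    from quantpylib.hft.feed import Feed, FeedCls

    # Processed archive: a sequence of {ts, b, a} records, per the description above.
    book_records = Feed.load_lob_archive(
        exc="binance",
        ticker="BTCUSDT",
        depth=20,
        feed_cls=FeedCls.PERPETUAL,
        path="./archives/",
    )
    # Pass raw=True to get the underlying parquet dataframe instead.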

load_lob_archives(exc, ticker, depth, feed_cls=FeedCls.PERPETUAL, path='./archives/', start=datetime.now(pytz.utc) - timedelta(days=14), end=datetime.now(pytz.utc), only_paths=False) staticmethod

Load parquet lob archives that have been saved by the scheduler.

Parameters:

exc (str): The exchange. Required.
ticker (str): The ticker. Required.
depth (int): The depth of the order book archive. Required.
feed_cls (str): The asset class. Defaults to FeedCls.PERPETUAL.
path (str): The path to the archives. Defaults to './archives/'.
start (str or datetime): The start date. Accepts str format %Y-%m-%d:%H. Defaults to datetime.now(pytz.utc) - timedelta(days=14).
end (str or datetime): The end date. Accepts str format %Y-%m-%d:%H. Defaults to datetime.now(pytz.utc).
only_paths (bool): Whether to return only the paths to the archives. Defaults to False.
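
A range-query sketch over scheduler-written archives; the dates are illustrative and use the documented %Y-%m-%d:%H string format:

    from quantpylib.hft.feed import Feed, FeedCls

    # Return only the matching archive paths for an illustrative one-week window.
    paths = Feed.load_lob_archives(
        exc="binance",
        ticker="BTCUSDT",
        depth=20,
        feed_cls=FeedCls.PERPETUAL,
        path="./archives/",
        start="2024-01-01:00",
        end="2024-01-08:00",
        only_paths=True,
    )
    print(paths)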

load_trade_archive(exc, ticker, feed_cls=FeedCls.PERPETUAL, path='./archives/', prefix='', postfix='', raw=False) staticmethod

Load a parquet trade archive from disk. Loads the file from {path}/{exchange}/{feed_cls}/{feed_type}/{prefix}{ticker}{postfix}.parquet. An example would be ./archives/binance/perp/trades/BTCUSDT.parquet.

Parameters:

exc (str): The exchange. Required.
ticker (str): The ticker. Required.
feed_cls (str): The asset class. Defaults to FeedCls.PERPETUAL.
path (str): The path to the archives. Defaults to './archives/'.
prefix (str): The prefix for the archive. Defaults to ''.
postfix (str): The postfix for the archive. Defaults to ''.
raw (bool): Whether to return the raw loaded parquet or the processed archive [[ts,price,size,dir]]. Defaults to False.

load_trade_archives(exc, ticker, feed_cls=FeedCls.PERPETUAL, path='./archives/', start=datetime.now(pytz.utc) - timedelta(days=14), end=datetime.now(pytz.utc), only_paths=False) staticmethod

Load parquet trade archives that have been saved by the scheduler.

Parameters:

exc (str): The exchange. Required.
ticker (str): The ticker. Required.
feed_cls (str): The asset class. Defaults to FeedCls.PERPETUAL.
path (str): The path to the archives. Defaults to './archives/'.
start (str or datetime): The start date. Accepts str format %Y-%m-%d:%H. Defaults to datetime.now(pytz.utc) - timedelta(days=14).
end (str or datetime): The end date. Accepts str format %Y-%m-%d:%H. Defaults to datetime.now(pytz.utc).
only_paths (bool): Whether to return only the paths to the archives. Defaults to False.
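
The trade loaders mirror the LOB loaders; a brief sketch with illustrative dates:

    from quantpylib.hft.feed import Feed, FeedCls

    # Single file: ./archives/binance/perp/trades/BTCUSDT.parquet, per the example above.
    trades = Feed.load_trade_archive(exc="binance", ticker="BTCUSDT", feed_cls=FeedCls.PERPETUAL)

    # Scheduler-written archives over an illustrative window (%Y-%m-%d:%H format).
    recent = Feed.load_trade_archives(
        exc="binance",
        ticker="BTCUSDT",
        start="2024-01-01:00",
        end="2024-01-08:00",
    )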

run_archive_scheduler(splits=1, squash=True, rm_subfiles=True, max_workers=5) async

Run the archive scheduler, which saves data with the following rules:

1: Store data in an archive buffer.
2: At the end of every hour, flush the buffer to disk. If splits > 1, flush the buffer every 3600 / splits seconds to a partial archive to reduce memory load.
3: The files are stored in ./archives/{exchange}/{feed_cls}/{feed_type}/{yy}/{mm}/{ticker}{_metadata}_{ddhh}.parquet.

Parameters:

splits (int): The number of archival splits in an hour. Defaults to 1.
squash (bool): Whether to squash the partial archives automatically. Defaults to True.
rm_subfiles (bool): Whether to remove the subfiles after squashing. Defaults to True.
max_workers (int): The number of thread workers to use for squashing. Defaults to 5.
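
A scheduling sketch, assuming an initialized Feed with archiver=True and at least one subscribed feed; splits=4 flushes a partial archive every 900 seconds and squashes the parts into the hourly file:

    import asyncio

    async def archive_forever(feed):
        # Runs indefinitely; launch it as a background task alongside the feeds,
        # e.g. task = asyncio.create_task(archive_forever(feed)).
        await feed.run_archive_scheduler(splits=4, squash=True, rm_subfiles=True, max_workers=5)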

squash_archives(path='./archives/', rm_subfiles=False, max_workers=5) staticmethod

Squash the partial parquet archives into a single parquet archive. Partial archives are used to store data in chunks when the scheduler is run with splits > 1 to reduce the memory footprint.

Parameters:

path (str): The path to the archives. Defaults to './archives/'.
rm_subfiles (bool): Whether to remove the subfiles after squashing. Defaults to False.
max_workers (int): The number of thread workers to use for squashing. Defaults to 5.
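
A standalone squash sketch, useful when the scheduler was run with squash=False or was interrupted before squashing:

    from quantpylib.hft.feed import Feed

    # Merge partial parquet archives under ./archives/ into single archives,
    # keeping the partial subfiles (set rm_subfiles=True to delete them).
    Feed.squash_archives(path="./archives/", rm_subfiles=False, max_workers=5)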

FeedCls

Asset classes.

Attributes:

PERPETUAL (str): Perpetual futures.
FUTURES (str): Futures.
SPOT (str): Spot.
OPTIONS (str): Options.

FeedType

Types of feeds.

Attributes:

L1BOOK (str): Level 1 order book feed.
L2BOOK (str): Level 2 order book feed.
L2DELTA (str): Level 2 order book delta feed.
TRADES (str): Trades feed.
MIDS (str): Mid prices feed.