Databento

Financial Services

Salt Lake City, Utah · 13,542 followers

A simpler, faster way to get market data.

About us

Databento makes it simpler and faster to get market data. Our self-service model allows users to instantly pick up live exchange feeds and terabytes of historical data, and only pay for what they use. Our goal is to power the world's largest finance and fintech institutions and make data accessible to small startups. Since starting in 2019, we've raised over $27.8M in funding. Our team brings years of experience running high-frequency trading desks and includes alumni from firms like Two Sigma, Flow Traders, Virtu, and Stripe.

Industry
Financial Services
Company size
11-50 employees
Headquarters
Salt Lake City, Utah
Type
Privately Held
Founded
2019
Specialties
Fintech, Market data, and Algorithmic trading

Updates

  • We're excited to announce the launch of Databento US Equities, providing real-time and historical data from 40 trading venues, including NYSE, Nasdaq, MIAX Exchange Group, and more, all under one pricing plan for a complete view of the market. This includes:

    - NYSE Integrated feeds with auction imbalance data
    - Off-exchange trades on ATSs
    - Nasdaq BX and PSX TotalView-ITCH
    - The Databento US Equities Summary dataset, and more

    To bridge the gap between our retail and institutional clients, we've added an entry-level subscription plan starting at $199 per month. As we step into this new year, we remain committed to innovation and delivering an even better market data experience for our 9,000+ customers. Here's to a year of growth, collaboration, and success together! #NYSE #Nasdaq #marketdata

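    As a quick illustration of pulling data from a US equities dataset, here is a minimal sketch using the Databento Python client. The dataset ID, symbol, and time range are illustrative placeholders; check the dataset listing for the exact ID of the plan you're on.

    ```python
    import databento as db

    # A minimal sketch using the Databento Python client (pip install databento).
    client = db.Historical("YOUR_API_KEY")

    data = client.timeseries.get_range(
        dataset="DBEQ.BASIC",    # placeholder US equities dataset ID
        schema="trades",
        symbols=["AAPL"],
        start="2024-01-02T14:30:00",
        end="2024-01-02T14:35:00",
    )

    # Decode the DBN records into a pandas DataFrame for inspection.
    print(data.to_df().head())
    ```
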
  • Point-in-time instrument definitions are in high demand, but many data providers have yet to catch up. Below are some of the many advantages of point-in-time definitions:

    1. Capture intraday updates for complex instruments.
    2. Never miss an IPO at the open.
    3. Eliminate look-ahead bias in backtesting.

    At Databento, we've built our APIs to support point-in-time definitions natively. See the links in the comments to learn more.

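    For example, with the Databento Python client you can request the definition schema over a historical window and get definitions exactly as they were published at the time. A minimal sketch, with an illustrative dataset, symbol, and date range, is below.

    ```python
    import databento as db

    # A minimal sketch, assuming the Databento Python client; the dataset,
    # symbol, and dates below are illustrative.
    client = db.Historical("YOUR_API_KEY")

    defs = client.timeseries.get_range(
        dataset="GLBX.MDP3",     # CME Globex, which publishes definitions
        schema="definition",
        symbols=["ES.FUT"],      # all ES futures via parent symbology
        stype_in="parent",
        start="2023-06-01",
        end="2023-06-02",
    )

    # Each record is the definition as it was published at that point in
    # time, so a backtest replaying this date sees no future contract
    # metadata and no look-ahead bias.
    print(defs.to_df()[["raw_symbol", "expiration"]].head())
    ```
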
  • Market Microstructure 101: What are PCAPs? PCAP (packet capture) files record raw data directly from trading venues in their native wire protocol format. Databento provides immediate cloud access to PCAPs. Our packet capture infrastructure is colocated in the same data center as the matching engine or primary point of presence (PoP), enabling lossless capture at up to 100 Gbps line rate with nanosecond-resolution, PTP-synchronized, monotonic timestamps. Learn more on our PCAPs page: https://to.dbn.to/40hQNbH

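    PCAP here refers to the standard capture file format, so any pcap-aware tool can read these files. As a minimal illustration of the format itself (not Databento-specific tooling), the sketch below parses the classic pcap framing and shows where the nanosecond timestamps live; the file name is a placeholder.

    ```python
    import struct

    def read_pcap(path):
        """Yield (timestamp_seconds, raw_frame) from a classic pcap file."""
        with open(path, "rb") as f:
            magic = struct.unpack("<I", f.read(4))[0]
            f.read(20)  # skip the rest of the 24-byte global header
            if magic == 0xA1B2C3D4:
                frac = 1_000_000          # microsecond-resolution capture
            elif magic == 0xA1B23C4D:
                frac = 1_000_000_000      # nanosecond-resolution capture
            else:
                raise ValueError("unsupported magic (big-endian or pcapng?)")
            while True:
                hdr = f.read(16)          # per-packet record header
                if len(hdr) < 16:
                    break
                ts_sec, ts_frac, incl_len, _orig_len = struct.unpack("<IIII", hdr)
                payload = f.read(incl_len)  # raw frame in the venue's wire format
                yield ts_sec + ts_frac / frac, payload

    # "capture.pcap" is a placeholder file name.
    for ts, frame in read_pcap("capture.pcap"):
        print(f"{ts:.9f}  {len(frame)} bytes")
    ```
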
  • Check out our Quant StackExchange account for various tips on quant development, market microstructure, and market data handling. #marketdata #api #apis #algotrading #quant

    Lou Lindley, Head of Quantitative Analytics at Databento:

    The #Databento Quant StackExchange account is another amazing learning resource for quant researchers and developers. It has some wonderful gems and practical wisdom:

    1️⃣ On order book data structure design: "New prices are often inserted towards the outside of the book. [...] Price levels are more likely to be removed by cancels or executions towards the inside of the book. [This] promotes an unbalanced and tall BST, which has much worse amortized runtime. Self-balancing is one naive solution." - https://lnkd.in/gM62P5jh

    2️⃣ On market data storage: "If the only purpose is to backtest with the data, the primary access pattern is to seek to a start time and read all of the data serially through to an end time. Then, there is a strong argument for storing it in plain, flat files with binary encoding." - https://lnkd.in/g2AdMrgk

    3️⃣ On pre-trade risk checks: "More likely, what you want is simply to avoid an errant strategy from blowing up by sending duplicate orders in a tight loop. This can be achieved without strict idempotency, [such as limiting] position-increasing order actions." - https://lnkd.in/gPwZiSjc

    4️⃣ On subsampling, denoising, and signal extraction: "Options and OTC data can exhibit trade to order ratios in excess of 1:10,000, so taking trade space will be highly efficient in reducing storage requirements, but inappropriate for modeling. The principled way of determining optimal sampling frequency is a classical bias-variance tradeoff. The practical way of handling this is just to use a few that you have strong priors around and just cross-validate your model on out-of-sample data." - https://lnkd.in/g5vCApuR

    5️⃣ On queue position modeling in aggregate depth books: "A naive guess is that the cancels arrive uniformly through the queue, i.e. 𝑝(𝑥)=𝑥. You could just use this as your prior and then penalize it online as your orders get filled early or late. A better guess is that cancels are unconditionally more likely to come from behind than in front of you [...], since orders in front of you have more value and can scratch out. [You'll also] want to model the conditional distribution. During high dislocation risk, it could be more likely that orders are pulled from in front of you than behind." - https://lnkd.in/gTtvSwij

    #quant #trading #algotrading #equities #futures #options
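    To make the book-structure point in 1️⃣ concrete, here is a minimal sketch (ours, not from the linked answer) of a bid-side price-level book backed by a sorted array. A plain BST keyed by price degrades under the skewed insert/remove pattern the quote describes; production books typically use a self-balancing tree or a flat array indexed by ticks. All names and numbers below are illustrative.

    ```python
    import bisect

    class BidBook:
        """Toy bid side of a price-level book, kept as a sorted array."""

        def __init__(self):
            self.prices = []  # ascending, so the best bid is the last element
            self.sizes = {}   # price -> aggregate size resting at that level

        def add(self, price, size):
            # New levels skew towards the outside of the book (low prices
            # here), which is the insertion pattern the quote describes.
            if price not in self.sizes:
                bisect.insort(self.prices, price)
                self.sizes[price] = 0
            self.sizes[price] += size

        def remove(self, price, size):
            # Cancels and executions concentrate at the inside of the book.
            self.sizes[price] -= size
            if self.sizes[price] <= 0:
                self.prices.remove(price)
                del self.sizes[price]

        def best(self):
            return self.prices[-1] if self.prices else None

    book = BidBook()
    book.add(99.98, 500)
    book.add(99.99, 200)
    book.remove(99.99, 200)  # the inside level empties out
    print(book.best())       # 99.98
    ```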

Funding

Databento: 6 total rounds
Last round: Series A, US$10.0M
See more info on Crunchbase