CatBoost is an open-source gradient boosting library with categorical features support.

Features

1. Fast inference
Apply your trained model quickly and efficiently, even in latency-critical tasks, using CatBoost's model applier (see the first sketch after this list).

2. Categorical features support
Improve your training results with CatBoost's support for non-numeric factors, instead of having to pre-process your data or spend time and effort turning it into numbers (see the second sketch after this list).

3. Fast and scalable GPU version
Train your model on a fast implementation of the gradient boosting algorithm for GPU. Use a multi-card configuration for large datasets (see the third sketch after this list).

4. Improved accuracy
Reduce overfitting when constructing your models with a novel gradient boosting scheme.

5. User-friendly API
Launch CatBoost right from the command line, or enjoy a user-friendly API for Python or R, with tools for model analysis and training visualisation.
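
A minimal sketch of applying a trained model through the Python package; the file name "model.cbm" and the feature values below are placeholders for illustration, not part of any real project:

```python
from catboost import CatBoostClassifier

# Load a previously trained model from CatBoost's binary format
# ("model.cbm" is a placeholder file name) and apply it to new rows.
model = CatBoostClassifier()
model.load_model("model.cbm")

print(model.predict([[1.0, 0.5], [2.0, 1.5]]))
```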
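A minimal sketch of training on raw categorical values with the Python package; the toy city/label data is made up for this example:

```python
from catboost import CatBoostClassifier

# Toy dataset: column 0 is numeric, column 1 is a raw string category.
X = [[1.0, "London"], [2.0, "Paris"], [3.0, "London"], [4.0, "Berlin"]]
y = [0, 1, 0, 1]

# cat_features marks column 1 as categorical; CatBoost encodes it
# internally, so no manual one-hot or label encoding is needed.
model = CatBoostClassifier(iterations=50, verbose=False)
model.fit(X, y, cat_features=[1])

print(model.predict([[2.5, "Paris"]]))
```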
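A minimal sketch of selecting the GPU implementation, assuming a machine with two CUDA-capable cards; the iteration count is arbitrary:

```python
from catboost import CatBoostRegressor

# task_type="GPU" selects the GPU implementation of the algorithm;
# devices="0:1" runs training on cards 0 and 1 (a two-card setup).
model = CatBoostRegressor(iterations=1000, task_type="GPU", devices="0:1")
# model.fit(X, y) would then train on both cards.
```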

About

CatBoost is an algorithm for gradient boosting on decision trees. Developed by Yandex researchers and engineers, it is the successor of the MatrixNet algorithm that is widely used within the company for ranking tasks, forecasting and making recommendations. It is universal and can be applied across a wide range of areas and to a variety of problems.

News

New ways to explore your data
A superb new tool for exploring feature importance, a new algorithm for finding the most influential training samples, the ability to save your model as C++ or Python code, and more. Check out the CatBoost v0.8 details inside!
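
As a rough illustration of the model-export feature mentioned above, a short sketch using the Python package; the toy training data and output file names are made up for the example:

```python
from catboost import CatBoostClassifier

# Train a tiny throwaway model so there is something to export.
model = CatBoostClassifier(iterations=10, verbose=False)
model.fit([[0, 1], [1, 0], [2, 1], [3, 0]], [0, 1, 0, 1])

# Save the trained model as standalone source code; the file names
# are placeholders.
model.save_model("model.py", format="python")
model.save_model("model.cpp", format="cpp")
```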
CatBoost on GPU talk at GTC 2018
Come and listen to our talk about the fastest implementation of gradient boosting for GPU at GTC 2018 Silicon Valley! GTC will take place on March 26-29 and will provide an excellent opportunity to learn more about CatBoost's performance on GPU.
Best-in-class inference and a ton of speedups
The new version of CatBoost has the industry's fastest inference implementation: it is 35 times faster than open-source alternatives and completely production-ready. Furthermore, the 0.6 release contains a lot of speedups and improvements. Find out more inside.