Achieve better results by reducing overfitting: CatBoost is based on a proprietary algorithm for constructing models that differs from the standard gradient-boosting scheme.
Categorical features support
Improve your training results with CatBoost, which lets you use non-numeric factors directly instead of having to pre-process your data or spend time and effort turning it into numbers.
User-friendly API
Launch CatBoost right from the command line or enjoy a user-friendly API for Python or R, with tools for formula analysis and training visualisation.
CatBoost is an algorithm for gradient boosting on decision trees. Developed by Yandex researchers and engineers, it is the successor of the MatrixNet algorithm that is widely used within the company for ranking tasks, forecasting and making recommendations. It is universal and can be applied across a wide range of areas and to a variety of problems.
The CatBoost source code is now available on GitHub under Apache License 2.0. In addition to the actual CatBoost algorithm, you can enjoy Python and R packages, as well as a comparison tool for popular gradient-boosting libraries.
CatBoost was used to improve the state-of-the-art performance of the data-processing system at LHCb, one of the experiments at the Large Hadron Collider. The data collected by the experiment is processed by CatBoost for individual collisions, which happen at a rate of 40 million per second.
Come and meet us at the 2017 ICML conference in Sydney! The 34th International Conference on Machine Learning will take place on August 6-11 and will provide an excellent opportunity to get a demo of CatBoost in action.