Federated Learning (FL) is a privacy-aware machine learning paradigm. It was initially designed to fit parametric models, primarily Neural Networks (NNs), and has therefore excelled on image, audio, and text tasks. FL for tabular data, however, has received comparatively little attention. Tree-Based Models (TBMs) outperform NNs on tabular data in the centralized setting and are starting to see FL integrations. In this paper, we evaluate federated TBMs and NNs for horizontal FL, under varying data partitions, on 31 datasets. We propose treesXnets, a unified benchmarking tool for federated evaluation. treesXnets captures model performance (e.g., accuracy), communication cost, training duration, and device utilization. A cyclic implementation of federated XGBoost is the best-performing model, outperforming the best federated NNs by 5-10% in terms of accuracy and regression error. It is also faster and requires less communication and memory than other federated XGBoost variants.
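The abstract names a cyclic implementation of federated XGBoost as the best performer. Purely as an illustration of the general cyclic idea, not the paper's exact protocol, the sketch below shows clients training in turn, each continuing boosting from the model handed over by the previous client, using xgboost's public warm-start API (`xgb_model`). The synthetic data, client count, and round schedule are assumptions for demonstration only.

```python
# Minimal single-process sketch of cyclic federated boosting (illustrative,
# not the paper's implementation). Each client adds a few trees to the
# shared booster, then passes it on to the next client in the cycle.
import numpy as np
import xgboost as xgb

rng = np.random.default_rng(0)
clients = []  # simulate 3 clients holding horizontal partitions of the data
for _ in range(3):
    X = rng.normal(size=(200, 10))
    y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(int)  # synthetic binary labels
    clients.append(xgb.DMatrix(X, label=y))

params = {"objective": "binary:logistic", "max_depth": 3, "eta": 0.1}
booster = None
for round_ in range(5):            # federated rounds
    for dtrain in clients:         # visit clients in a fixed cyclic order
        # Warm-start from the booster received from the previous client;
        # xgb_model=None on the very first call trains from scratch.
        booster = xgb.train(params, dtrain, num_boost_round=10,
                            xgb_model=booster)
```

In a real deployment the booster would be serialized and transmitted between devices rather than passed in memory; cycling a single growing ensemble is what keeps the communication payload small relative to aggregation-based federated XGBoost schemes.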