Many real-world data come with naturally ordered labels, i.e., ordinal labels; examples can be found in a wide variety of fields. Ordinal regression is the problem of predicting ordinal labels for given patterns. Specially developed ordinal regression methods exist to tackle this type of problem, but they are usually centralized. In some scenarios, however, data are collected in a distributed manner by the nodes of a network, and for privacy protection or due to practical constraints, it is difficult or impossible to transmit the data to a fusion center for processing. Centralized ordinal regression methods are then inapplicable. In this paper, we formulate a distributed generalized ordered logit model for distributed ordinal regression. To estimate the model parameters, we establish a distributed constrained optimization formulation based on maximum likelihood estimation. We then propose a projected-gradient-based algorithm to solve the optimization problem, and we prove the consensus and convergence of the proposed distributed algorithm. We also conduct numerical simulations on synthetic and real-world datasets. The simulation results show that the proposed distributed algorithm is comparable to the corresponding centralized algorithm, and that it remains competitive even when the label distribution across nodes is unbalanced.
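The abstract does not give implementation details, so the following is only a minimal sketch of the kind of scheme it describes: each node first averages its neighbors' parameters with doubly stochastic consensus weights, then takes a projected gradient step on its local ordered-logit negative log-likelihood, with a projection enforcing ordered thresholds. All function names, the mixing matrix `W`, and the step size are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def local_nll_grad(w, theta, X, y, K):
    """Gradient of the local negative log-likelihood of an ordered
    logit model with P(y <= k | x) = sigmoid(theta_k - w.x)."""
    gw = np.zeros_like(w)
    gt = np.zeros_like(theta)
    for xi, yi in zip(X, y):
        s = w @ xi
        # Cumulative probabilities bracketing label yi (F_{-1}=0, F_{K-1}=1).
        Fu = sigmoid(theta[yi] - s) if yi < K - 1 else 1.0
        Fl = sigmoid(theta[yi - 1] - s) if yi > 0 else 0.0
        p = max(Fu - Fl, 1e-12)          # P(y = yi | x)
        du = Fu * (1.0 - Fu) if yi < K - 1 else 0.0
        dl = Fl * (1.0 - Fl) if yi > 0 else 0.0
        gw += (du - dl) / p * xi         # d(-log p)/dw, since dF/dw = -F(1-F)x
        if yi < K - 1:
            gt[yi] -= du / p             # d(-log p)/d theta_{yi}
        if yi > 0:
            gt[yi - 1] += dl / p         # d(-log p)/d theta_{yi-1}
    return gw, gt

def project_thresholds(theta, eps=1e-3):
    """Project onto the ordered set theta_0 < theta_1 < ... (forward sweep)."""
    theta = theta.copy()
    for k in range(1, len(theta)):
        theta[k] = max(theta[k], theta[k - 1] + eps)
    return theta

def distributed_pg(Xs, ys, W, K, steps=100, lr=0.05):
    """Consensus averaging + local projected gradient step at each node.
    W is an assumed doubly stochastic mixing matrix over the network."""
    n, d = len(Xs), Xs[0].shape[1]
    ws = [np.zeros(d) for _ in range(n)]
    ts = [np.linspace(-1.0, 1.0, K - 1) for _ in range(n)]
    for _ in range(steps):
        ws_avg = [sum(W[i, j] * ws[j] for j in range(n)) for i in range(n)]
        ts_avg = [sum(W[i, j] * ts[j] for j in range(n)) for i in range(n)]
        for i in range(n):
            gw, gt = local_nll_grad(ws_avg[i], ts_avg[i], Xs[i], ys[i], K)
            ws[i] = ws_avg[i] - lr * gw / len(Xs[i])
            ts[i] = project_thresholds(ts_avg[i] - lr * gt / len(Xs[i]))
    return ws, ts
```

The projection step keeps each node's thresholds strictly increasing, which the generalized ordered logit likelihood requires; the consensus averaging is what drives the per-node parameter estimates toward agreement.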
Ordinal regression methods are widely used to predict the ordered labels of data, among which support vector ordinal regression (SVOR) methods are popular because of their good generalization. In many realistic circumstances, data are collected by a distributed network, and to protect privacy or due to practical constraints, the data cannot be transmitted to a center for processing. However, to the best of our knowledge, existing SVOR methods are all centralized; in such situations they are inapplicable, and distributed methods are the more suitable choice. In this paper, we propose a distributed SVOR (dSVOR) algorithm. First, we formulate a constrained optimization problem for SVOR in distributed circumstances. Since the problem is difficult to solve with classical methods, we use a random approximation method and the hinge loss function to transform it into a convex constrained optimization problem. We then propose a subgradient-based algorithm, dSVOR, to solve it. To illustrate its effectiveness, we theoretically analyze the consensus and convergence of the proposed method and conduct experiments on both synthetic data and a real-world example. The experimental results show that the proposed dSVOR achieves performance close to that of the corresponding centralized method, which requires all the data to be collected together.
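The abstract mentions a hinge-loss reformulation that is solved by subgradient steps. As a rough illustration of what such a local step can look like, here is a sketch of a regularized hinge-loss SVOR objective in the implicit-constraint style (each sample should score above the thresholds below its label and below the thresholds at or above it) together with its subgradient. This is not the paper's exact formulation (the random approximation and the distributed consensus constraints are omitted), and all names and the constant `C` are assumptions.

```python
import numpy as np

def svor_loss(w, theta, X, y, C=1.0):
    """Regularized hinge-loss SVOR objective (implicit-constraint form)."""
    val = 0.5 * w @ w
    for xi, yi in zip(X, y):
        s = w @ xi
        for k in range(len(theta)):
            if k < yi:   # sample should lie above threshold theta_k
                val += C * max(0.0, 1.0 - (s - theta[k]))
            else:        # sample should lie below threshold theta_k
                val += C * max(0.0, 1.0 + (s - theta[k]))
    return val

def svor_subgradient(w, theta, X, y, C=1.0):
    """One subgradient of svor_loss with respect to (w, theta)."""
    gw = w.copy()                 # from the (1/2)||w||^2 regularizer
    gt = np.zeros_like(theta)
    for xi, yi in zip(X, y):
        s = w @ xi
        for k in range(len(theta)):
            if k < yi and 1.0 - (s - theta[k]) > 0.0:
                gw -= C * xi      # active hinge: push score up
                gt[k] += C        # and threshold theta_k down
            elif k >= yi and 1.0 + (s - theta[k]) > 0.0:
                gw += C * xi      # active hinge: push score down
                gt[k] -= C        # and threshold theta_k up
    return gw, gt
```

A distributed variant would interleave such local subgradient steps with consensus averaging of `(w, theta)` across neighboring nodes, which is the general pattern the abstract's consensus and convergence analysis refers to.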