In the past few years, several authors have presented methods for applying functional decomposition to machine learning. These authors explore the ideas of functional decomposition but leave the concepts of machine learning to the papers they reference; in general, they never fully explain why a logic synthesis method should be applied to machine learning. This paper presents the basic concepts of machine learning and shows how some of them match nicely with multi-valued logic synthesis, while others pose great difficulties. The main reason for using multi-valued synthesis is that many problems are naturally multi-valued (i.e., their values are taken from a discrete set), so mapping a problem directly to a multi-valued set of inputs and outputs is much more natural than encoding it into binary form. The paper also shows that any multi-valued logic synthesis method could be applied to the machine learning problem, but it focuses on multi-valued functional decomposition because of its generality in minimizing a given data set.
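As a small illustration of the point about direct multi-valued representation (a sketch with invented attribute names, not an example from the paper): a three-valued attribute fits one multi-valued variable exactly, while a binary encoding needs two bits, leaving one code unused and introducing a don't-care pattern that a synthesis tool must then handle.

```python
import math

# Hypothetical three-valued attribute; the names are illustrative only.
COLORS = ["red", "green", "blue"]

# Direct multi-valued encoding: one variable, domain size 3, no wasted codes.
mv_codes = {v: i for i, v in enumerate(COLORS)}

# Binary encoding: ceil(log2(3)) = 2 bits, so 4 codes exist but only 3 are used.
bits = math.ceil(math.log2(len(COLORS)))
bin_codes = {v: format(i, f"0{bits}b") for i, v in enumerate(COLORS)}
unused = {format(i, f"0{bits}b") for i in range(2 ** bits)} - set(bin_codes.values())

print(mv_codes)   # {'red': 0, 'green': 1, 'blue': 2}
print(bin_codes)  # {'red': '00', 'green': '01', 'blue': '10'}
print(unused)     # {'11'} -- a don't-care pattern created by the encoding
```

The unused code `11` is exactly the kind of artifact that disappears when the problem is kept in its natural multi-valued form.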
Decision trees are a widely used knowledge representation in machine learning. However, one of their main drawbacks is the inherent replication of isomorphic subtrees, as a result of which the produced classifiers may become too large to be comprehensible to the human experts who have to validate them. Alternatively, decision diagrams, a generalization of decision trees that takes the form of a rooted acyclic digraph instead of a tree, have occasionally been suggested as a potentially more compact representation. Their application in machine learning has nonetheless been criticized, because the theoretical size advantages of subgraph sharing did not always materialize in the relatively scarce reported experiments on real-world data. Therefore, in this paper, starting from a series of rule sets extracted from three real-life credit-scoring data sets, we empirically assess to what extent decision diagrams can provide a compact visual description. Furthermore, we investigate the practical impact of finding a good attribute ordering on the achieved size savings.
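The size saving from subgraph sharing can be sketched in a few lines (this is a generic reduction by structural hashing, not the authors' implementation; the node layout and the example tree are illustrative assumptions):

```python
# Sketch: turn a decision tree into a decision diagram by sharing
# isomorphic subtrees, caching each subtree by its structure.

def reduce_tree(node, cache=None):
    """node is ('leaf', label) or ('test', attribute, (child, ...))."""
    if cache is None:
        cache = {}
    if node[0] == "leaf":
        key = node
    else:
        _, attr, children = node
        key = ("test", attr, tuple(reduce_tree(c, cache) for c in children))
    # Reuse the stored node for any structurally identical subtree.
    return cache.setdefault(key, key)

def tree_size(node):
    """Node count when drawn as a tree (duplicates counted each time)."""
    return 1 if node[0] == "leaf" else 1 + sum(tree_size(c) for c in node[2])

def dag_size(node, seen=None):
    """Node count when shared subgraphs are counted once."""
    if seen is None:
        seen = set()
    if id(node) in seen:
        return 0
    seen.add(id(node))
    return 1 if node[0] == "leaf" else 1 + sum(dag_size(c, seen) for c in node[2])

# Both branches repeat the same test with identical outcomes -> shared.
branch = ("test", "income", (("leaf", "reject"), ("leaf", "accept")))
tree = ("test", "age", (branch, branch))
dag = reduce_tree(tree)
print(tree_size(tree), "->", dag_size(dag))  # 7 -> 4
```

Whether such savings materialize on real data depends on how much isomorphic structure the induced trees actually contain, which is precisely what the paper examines empirically.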
Abstract—This paper presents two new functional decomposition partitioning algorithms that use multi-valued decision diagrams (MDDs). MDDs are an exceptionally good representation for generalized decomposition because they are canonical and can represent very large functions. The algorithms developed in this paper handle Boolean and multi-valued inputs and outputs and completely and incompletely specified functions, with applications to logic synthesis, machine learning, data mining, and knowledge discovery in databases. We compare the run-times and decision diagram sizes of our algorithms to those of existing decomposition partitioning algorithms based on decision diagrams. The comparisons show that our algorithms are faster and do not produce exponentially sized diagrams when decomposing functions with small bound sets.

Index Terms—Algorithms, logic design, unsupervised learning.
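The canonicity the abstract relies on is usually obtained by building MDD nodes through a unique table: no two distinct stored nodes ever share the same variable and the same tuple of children, so equal functions map to the same node. A minimal sketch of that mechanism (illustrative structure, not the paper's algorithm):

```python
# Sketch of MDD canonicity via a unique table (hash-consing).
# Terminal values double as leaf node ids; all names are illustrative.

class MDD:
    def __init__(self):
        self.unique = {}      # (var, children) -> node id
        self.next_id = 2      # ids 0 and 1 are reserved for terminals

    def mk(self, var, children):
        """Return the canonical node for `var` with multi-valued `children`."""
        children = tuple(children)
        if len(set(children)) == 1:   # redundant test: all edges agree
            return children[0]
        key = (var, children)
        if key not in self.unique:    # one stored node per (var, children)
            self.unique[key] = self.next_id
            self.next_id += 1
        return self.unique[key]

m = MDD()
# A 3-valued variable x: f = 1 iff x == 2.
f = m.mk("x", [0, 0, 1])
g = m.mk("x", [0, 0, 1])   # same function -> same node id
h = m.mk("x", [1, 1, 1])   # redundant test collapses to terminal 1
print(f == g, h)  # True 1
```

Because equal functions share nodes, equality tests are constant-time pointer comparisons, which is what makes MDDs practical for the repeated partition comparisons that decomposition performs.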