Recent advances in emerging memory technologies enable novel and unconventional computing architectures for high-performance and low-power electronic systems capable of carrying out massively parallel operations at the edge. One emerging technology, ReRAM, which belongs to the family of memristors (memory resistors), is gathering attention due to its attractive features for logic and in-memory computing; benefits that follow from its technological attributes, such as nanoscale dimensions, low-power operation and multi-state programming. At the same time, CMOS design is quickly approaching its physical and functional limits, and further research into novel logic families, such as Threshold Logic Gates (TLGs), is being pursued. TLGs constitute a logic family known for high speed and low power consumption, yet they still rely on conventional transistor technology. Introducing memristors enables more affordable reconfigurability of TLGs. In this work, we introduce a physical implementation of a memristor-based current-mode TLG (MCMTLG) circuit and validate its design and operation through multiple experimental setups. We demonstrate 2-input and 3-input MCMTLG configurations and showcase their reconfiguration capability. This is achieved by arbitrarily varying the memristive weights to shape the classification decision boundary, thus showing promise as an alternative hardware-friendly implementation of Artificial Neural Networks (ANNs). By employing real memristor devices as the equivalent of synaptic weights in TLGs, we realize components that can be used towards an in-silico classifier.

Today's conventional computing paradigm is based on the MOSFET transistor and CMOS technology; two cornerstones which have underpinned the development of digital electronics over the last five decades. Although there is still optimism for future improvement of CMOS, accumulating scientific evidence indicates the need for advances both in new emerging technologies to replace MOSFETs and in new computer circuits and architectures 1,2. The former addresses the increasing difficulty of further downscaling (with its associated drop in reliability 2), whilst the latter seeks to address the von Neumann bottleneck, where increasingly large memories and powerful processors struggle to communicate over a limited interconnect whose data transfer capacity does not scale fast enough 2-4.

On the computation/architecture front, there has been a sustained effort to develop bio-inspired computation concepts, mostly in the guise of artificial neural network (ANN)-enabled systems. Research on ANNs has thus far spanned the entire range from the first simplified models of all-or-none hardware neurons 5 to the current state-of-the-art GPU-based ANNs 6-8. However, one often overlooked example of ANN-like computation can be found in the form of its quantized, digital counterpart: the so-called threshold logic (TL). TL is a model for performing a