Hypergraph neural networks (HyperGNNs) are a family of deep neural networks designed to perform inference on hypergraphs. HyperGNNs follow either a spectral or a spatial approach, in which a convolution or message-passing operation is conducted based on a hypergraph algebraic descriptor. While many HyperGNNs have been proposed and have achieved state-of-the-art performance on broad applications, there have been limited attempts at exploring high-dimensional hypergraph descriptors (tensors) and the joint node interactions carried by hyperedges. In this paper, we depart from hypergraph matrix representations and present a new tensor-HyperGNN framework (T-HyperGNN) with cross-node interactions. The T-HyperGNN framework consists of T-spectral convolution, T-spatial convolution, and T-message-passing HyperGNNs (T-MPHN). The T-spectral convolution HyperGNN is defined under the t-product algebra, which closely connects to the spectral space. To improve computational efficiency for large hypergraphs, we localize the T-spectral convolution approach to formulate the T-spatial convolution, and we further devise a novel tensor message-passing algorithm for practical implementation by studying a compressed adjacency tensor representation. Compared to state-of-the-art approaches, our T-HyperGNNs preserve intrinsic high-order network structures without any hypergraph reduction and model the joint effects of nodes through a cross-node interaction layer. These advantages of our T-HyperGNNs are demonstrated on a wide range of real-world hypergraph datasets.
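The t-product algebra underlying the T-spectral convolution can be made concrete with a small sketch. The snippet below is an illustration of the standard t-product of third-order tensors (not the paper's implementation; the function name and shapes are our own choices): both tensors are transformed with an FFT along the third mode, matching frontal slices are multiplied, and the result is transformed back. This slice-wise multiplication in the Fourier domain is what gives the t-product its close connection to the spectral space.

```python
import numpy as np

def t_product(A, B):
    """t-product of third-order tensors A (n1 x n2 x n3) and B (n2 x n4 x n3).

    Equivalent to multiplying the block-circulant expansion of A with the
    unfolded B, but computed efficiently in the Fourier domain:
    an FFT along the third mode diagonalizes the circular structure, so the
    product reduces to independent matrix products of frontal slices.
    """
    Ah = np.fft.fft(A, axis=2)                 # frontal slices in Fourier domain
    Bh = np.fft.fft(B, axis=2)
    Ch = np.einsum('ijk,jlk->ilk', Ah, Bh)     # slice-wise matrix products
    return np.fft.ifft(Ch, axis=2).real        # real tensors give a real result
```

Equivalently, the k-th frontal slice of the result is the circular convolution of slices, `C[:, :, k] = sum_t A[:, :, t] @ B[:, :, (k - t) mod n3]`; the FFT route above computes the same quantity in `O(n3 log n3)` transforms instead of `O(n3^2)` slice products.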