Network classification plays a crucial role in domains such as social network analysis and bioinformatics. While Graph Neural Networks (GNNs) have achieved significant success, they struggle with over-smoothing and with capturing global information. Additionally, GNNs require large amounts of training data, which hinders their performance on small datasets. To address these limitations, we propose a novel approach based on Deng’s entropy that captures network topology as well as node and edge information. This entropy is calculated at multiple scales, resulting in an entropy sequence that incorporates both local and global features. We embed each network by combining the entropy sequences of its edges and nodes into a matrix, which is then fed into a bidirectional long short-term memory network to perform classification. Our method outperforms GNNs in the bioinformatics, social, and molecule domains, achieving superior classification performance on nine of eleven benchmark datasets. Further experiments on both real-world and synthetic datasets highlight its strong performance, with an accuracy of 97.24% on real-world complex networks and 100% on synthetic complex networks. Additionally, our approach remains effective on datasets with a small number of networks and imbalanced classes, and it excels at distinguishing synthetic from real-world networks.
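For concreteness, the minimal sketch below illustrates the two building blocks named above under stated assumptions: the standard Deng entropy of a basic probability assignment, and a bidirectional LSTM that classifies a per-network matrix of entropy values (one row per scale). The names `deng_entropy` and `EntropySequenceClassifier`, the mass-function example, and all dimensions are illustrative assumptions; only the Deng entropy formula itself is standard, and the paper's exact multi-scale embedding is not reproduced here.

```python
import math
import torch
import torch.nn as nn

def deng_entropy(mass):
    """Deng entropy of a basic probability assignment.

    `mass` maps each focal element (a frozenset of hypotheses) to its mass;
    masses are assumed to sum to 1.  E_d = -sum m(A) * log2(m(A) / (2^|A| - 1)).
    """
    return -sum(
        m * math.log2(m / (2 ** len(A) - 1))
        for A, m in mass.items()
        if m > 0
    )

class EntropySequenceClassifier(nn.Module):
    """Bidirectional LSTM over an entropy matrix: one row per scale,
    one column per entropy feature (e.g., node- and edge-based entropies)."""

    def __init__(self, num_features, hidden_dim, num_classes):
        super().__init__()
        self.lstm = nn.LSTM(num_features, hidden_dim,
                            batch_first=True, bidirectional=True)
        self.fc = nn.Linear(2 * hidden_dim, num_classes)

    def forward(self, x):
        # x: (batch, num_scales, num_features)
        out, _ = self.lstm(x)          # (batch, num_scales, 2 * hidden_dim)
        return self.fc(out[:, -1, :])  # classify from the final sequence position

# Toy usage: a mass function with two singletons and one composite focal element.
m = {frozenset({"a"}): 0.4, frozenset({"b"}): 0.3, frozenset({"a", "b"}): 0.3}
print(deng_entropy(m))

model = EntropySequenceClassifier(num_features=2, hidden_dim=16, num_classes=3)
logits = model(torch.randn(8, 10, 2))  # 8 networks, 10 scales, 2 entropy features
```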