
  • Article (No Access)

    CONSTRUCTING A QUERY-ABLE RADIAL BASIS FUNCTION ARTIFICIAL NEURAL NETWORK

    Artificial neural networks will be more widely accepted as standard engineering tools if their reasoning process can be made less opaque. This paper describes NetQuery, an explanation mechanism that extracts meaningful explanations from trained Radial Basis Function (RBF) networks. RBF networks are well suited for explanation generation because they contain a set of locally tuned units. Standard RBF networks are modified to identify dependencies between the inputs, to be sparsely connected, and to have an easily interpretable output layer. Given these modifications, the network architecture can be used to extract "Why?" and "Why not?" explanations from the network in terms of excitatory and inhibitory inputs and their linear relationships, greatly simplified by a run-time pruning algorithm. These query results are validated by creating an expert system based on the explanations. NetQuery is also able to inform a user about a possible change in category for a given pattern by responding to a "How can I…?" query. This kind of query is extremely useful when analyzing the quality of a pattern set.
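The core idea behind a "Why?" query can be sketched in a few lines: because each RBF hidden unit is locally tuned, its signed contribution (output weight times Gaussian activation) can be ranked and labeled excitatory or inhibitory. This is an illustrative minimal sketch, not NetQuery's actual interface; the function and variable names are hypothetical.

```python
import math

def rbf_activation(x, center, width):
    """Gaussian activation of one locally tuned RBF unit."""
    d2 = sum((xi - ci) ** 2 for xi, ci in zip(x, center))
    return math.exp(-d2 / (2 * width ** 2))

def why_explanation(x, units, top_k=2):
    """Sketch of a 'Why?' query: rank hidden units by their signed
    contribution (weight * activation) and label each as excitatory
    or inhibitory."""
    contribs = [(i, w * rbf_activation(x, c, s))
                for i, (c, s, w) in enumerate(units)]
    contribs.sort(key=lambda t: -abs(t[1]))
    return [(i, v, "excitatory" if v > 0 else "inhibitory")
            for i, v in contribs[:top_k]]

# Toy network: (center, width, output weight) per hidden unit.
units = [([0.0, 0.0], 0.5,  1.0),
         ([1.0, 1.0], 0.5, -0.8),
         ([2.0, 0.0], 0.5,  0.6)]

explanation = why_explanation([0.1, 0.1], units)
```

For the pattern `[0.1, 0.1]`, the unit centered at the origin dominates as an excitatory contributor, while the negatively weighted unit at `[1, 1]` appears as the strongest inhibitory term.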

  • Article (No Access)

    PRUNING AND MODEL-SELECTING ALGORITHMS IN THE RBF FRAMEWORKS CONSTRUCTED BY SUPPORT VECTOR LEARNING

    This paper presents pruning and model-selecting algorithms for support vector learning in sample classification and function regression. When constructing an RBF network by support vector learning, we occasionally obtain redundant support vectors that do not significantly affect the final classification and function-approximation results. The pruning algorithms are primarily based on a sensitivity measure and a penalty term. The kernel function parameters and the position of each support vector are updated so as to minimize the increase in error, which makes the structure of the SVM network more flexible. We illustrate this approach on a synthetic-data simulation and a face-detection problem to demonstrate the effectiveness of the pruning.
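The redundant-support-vector idea can be illustrated with a simple sensitivity score: measure how much removing each support vector would change the decision value on a validation set, then drop the lowest-scoring vectors. This is a simplified sketch of sensitivity-based pruning, not the paper's exact measure; all names are illustrative.

```python
import math

def rbf_kernel(a, b, gamma=1.0):
    """Gaussian (RBF) kernel between two points."""
    return math.exp(-gamma * sum((x - y) ** 2 for x, y in zip(a, b)))

def prune_by_sensitivity(svs, alphas, val_points, keep_frac=0.5):
    """Score each support vector by the mean absolute change its removal
    would cause in the decision value over validation points, then keep
    only the highest-scoring fraction."""
    scores = []
    for i, (a, s) in enumerate(zip(alphas, svs)):
        change = sum(abs(a * rbf_kernel(x, s)) for x in val_points) / len(val_points)
        scores.append((change, i))
    scores.sort(reverse=True)
    n_keep = max(1, int(len(svs) * keep_frac))
    keep = sorted(i for _, i in scores[:n_keep])
    return [svs[i] for i in keep], [alphas[i] for i in keep]

# Toy example: the third support vector has a near-zero coefficient
# and lies far from the validation data, so it is redundant.
svs    = [[0.0, 0.0], [1.0, 0.0], [5.0, 5.0]]
alphas = [1.0, -1.0, 1e-4]
val    = [[0.2, 0.1], [0.9, 0.1]]
pruned_svs, pruned_alphas = prune_by_sensitivity(svs, alphas, val, keep_frac=0.67)
```

Here the two informative support vectors survive while the redundant one is removed, mirroring the observation that dropping such vectors barely changes the decision function.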

  • Article (No Access)

    Link community detection combined with network pruning and local community expansion

    Studying overlapping community structure can help people understand complex networks. In this paper, we propose a link community detection method combined with network pruning and local community expansion (NPLCE). First, we delete unattractive links and transform the pruned graph into its line graph. Second, we compute a score matrix on the line graph using the PageRank algorithm. Then we select seed nodes and expand local communities from them. Finally, we merge those communities and transform them back into node communities. Experimental results on several real-world networks demonstrate the accuracy of our algorithm.
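The first two steps of the pipeline, transforming a pruned graph into its line graph and scoring it with PageRank, can be sketched as follows. This is a generic illustration of those standard constructions, not the NPLCE implementation; the seed-expansion and merge steps are omitted.

```python
def line_graph(edges):
    """Build the line graph of a pruned graph: each original edge becomes
    a node, and two edge-nodes are adjacent if the edges share an endpoint."""
    adj = {e: set() for e in edges}
    for i, e in enumerate(edges):
        for f in edges[i + 1:]:
            if set(e) & set(f):
                adj[e].add(f)
                adj[f].add(e)
    return adj

def pagerank(adj, d=0.85, iters=50):
    """Plain power-iteration PageRank on the line graph; high-scoring
    edge-nodes are candidates for seeding link communities."""
    n = len(adj)
    rank = {v: 1.0 / n for v in adj}
    for _ in range(iters):
        rank = {v: (1 - d) / n + d * sum(rank[u] / len(adj[u]) for u in adj[v])
                for v in adj}
    return rank

# Toy pruned graph: a triangle (1,2,3) plus a pendant edge (3,4).
edges = [(1, 2), (2, 3), (1, 3), (3, 4)]
L = line_graph(edges)
scores = pagerank(L)
```

Edges incident to the hub node 3 end up with the highest scores, which is the behavior a seed-selection step would exploit.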

  • Article (No Access)

    CNN Pruning with Multi-Stage Feature Decorrelation

    This paper proposes a channel pruning method based on multi-stage feature decorrelation to obtain a more efficient convolutional neural network (CNN) model. Based on the correlation of hidden features at each level of the network, we refine the features of each convolutional layer by applying feature decorrelation constraints (MFD Loss) to each convolutional layer, and then prune channels according to the modulus of the feature maps output by each layer. After several rounds of pruning and fine-tuning, the result is a network with similar accuracy, a substantially smaller size, and more efficient operation than the original model. Our experiments on pruning various popular CNN models on standard datasets demonstrate the method's effectiveness. Specifically, for VGG-16 on CIFAR10, our approach eliminates 97.0% of the parameters and saves 66.9% of the floating-point operations (FLOPs) with a 0.4% accuracy gain, achieving state-of-the-art performance. For ResNet-50 on the ImageNet dataset, our method eliminates 30.0% of the parameters and saves 52% of the FLOPs with a 1.4% accuracy loss, further demonstrating the method's effectiveness. The code for the paper can be found at https://github.com/lovelyemperor/MFD.
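The channel-selection step, ranking channels by the modulus (here, L1 norm) of their output feature maps and marking the weakest for removal, can be sketched in isolation. This is a simplification that omits the MFD decorrelation training and the fine-tuning rounds; names and the toy data are illustrative.

```python
def channel_l1_norms(feature_maps):
    """L1 norm (modulus) of each channel's output feature map."""
    return [sum(abs(v) for row in fm for v in row) for fm in feature_maps]

def select_pruned_channels(feature_maps, prune_ratio=0.5):
    """Rank channels by feature-map L1 norm and return the indices of the
    weakest channels, i.e. those to be pruned."""
    norms = channel_l1_norms(feature_maps)
    order = sorted(range(len(norms)), key=lambda i: norms[i])
    n_prune = int(len(norms) * prune_ratio)
    return sorted(order[:n_prune])

# Toy layer output: 4 channels of 2x2 feature maps; channels 1 and 3
# are nearly dead and should be selected for pruning.
fmaps = [[[0.90, 0.80], [0.70, 1.00]],
         [[0.01, 0.00], [0.02, 0.00]],
         [[0.50, 0.40], [0.60, 0.50]],
         [[0.03, 0.01], [0.00, 0.02]]]
pruned = select_pruned_channels(fmaps, prune_ratio=0.5)
```

In practice each pruning round would be followed by fine-tuning, and the decorrelation loss makes the surviving channels carry less redundant information before this ranking is applied.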