This paper studies stochastic effects on the convergence dynamics of reaction–diffusion recurrent neural networks (RNNs) with constant transmission delays. Without assuming boundedness, monotonicity, or differentiability of the activation functions, or symmetry of the synaptic interconnection weights, we construct suitable Lyapunov functionals and employ the method of variation of parameters, M-matrix properties, inequality techniques, stochastic analysis, and the nonnegative semimartingale convergence theorem. We thereby obtain delay-independent and easily verifiable sufficient conditions that guarantee, respectively, the almost sure exponential stability, mean-value exponential stability, and mean-square exponential stability of an equilibrium solution associated with temporally uniform external inputs to the networks. These results are compared with earlier results in the literature for discretely delayed RNNs without diffusion or stochastic perturbation. Two examples are given to demonstrate the results.