BAM (Bidirectional Associative Memory) neural networks have broad application prospects in pattern classification and recognition owing to their bidirectional hetero-association, strong learning and adaptive capabilities, and good noise tolerance. Compared with real-valued neural networks, complex-valued neural networks are models built on complex-number operations; they can effectively represent multi-dimensional signals such as images and sounds, reducing signal approximation error and thereby improving model accuracy. Accordingly, this paper studies the global exponential stability of a class of complex-valued BAM neural networks with proportional delays. Using the Banach fixed point theorem, sufficient conditions for global exponential stability are derived. Finally, a numerical example is given to verify the effectiveness of the results. A representative form of the system class is sketched below.
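For context, complex-valued BAM networks with proportional delays are commonly written in the form below. This is a representative sketch of the standard formulation only, not the paper's exact system: the coefficients, activation functions, and the precise stability conditions are not given in the abstract.

```latex
% Assumed representative model: a complex-valued BAM network with
% proportional delay factors p, q (not the paper's exact system).
\begin{aligned}
\dot{z}_i(t) &= -a_i z_i(t) + \sum_{j=1}^{m} b_{ij}\, f_j\bigl(w_j(qt)\bigr) + I_i, \\
\dot{w}_j(t) &= -c_j w_j(t) + \sum_{i=1}^{n} d_{ji}\, g_i\bigl(z_i(pt)\bigr) + J_j,
\end{aligned}
\qquad z_i(t),\, w_j(t) \in \mathbb{C},\quad p, q \in (0,1).
% Global exponential stability of an equilibrium (z^*, w^*): there exist
% M \ge 1 and \lambda > 0 such that every solution satisfies
\bigl\|(z(t), w(t)) - (z^*, w^*)\bigr\| \le M\, e^{-\lambda t}\, \bigl\|\varphi - (z^*, w^*)\bigr\|, \quad t \ge 0.
```

Since $qt = t - (1-q)t$, the delays $(1-p)t$ and $(1-q)t$ are time-varying and unbounded, which is what distinguishes proportional-delay networks from constant- or bounded-delay BAM models and motivates the fixed-point approach.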
To address the mode collapse and vanishing gradient problems of generative adversarial networks (GANs) in deep learning, the Wasserstein distance and a gradient penalty are used to alleviate both issues. The main methods are: replacing the JS divergence with the Wasserstein distance to improve the loss function while preventing vanishing gradients; applying a gradient penalty to enforce a Lipschitz continuity constraint on the discriminator, stabilizing gradients and balancing the training speed of the generator and discriminator; and adopting a simple-to-complex training strategy for a smooth transition, which keeps the discriminator from becoming too strong and further guards against vanishing gradients. A comparison between the traditional GAN and the improved GAN shows that the loss curves of the improved generator and discriminator are more stable. A minimal sketch of the gradient-penalty loss follows.
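As a concrete illustration of the Wasserstein loss with gradient penalty described above, here is a minimal PyTorch sketch. The names (`critic`, `lambda_gp = 10.0`) and the assumption of 4-D image tensors are illustrative, since the abstract does not specify the architecture or hyperparameters.

```python
# Minimal WGAN-GP loss sketch (assumed setup: image batches of shape
# [B, C, H, W]; "critic" denotes the discriminator without a sigmoid).
import torch

def gradient_penalty(critic, real, fake, lambda_gp=10.0):
    """Penalize deviation of the critic's gradient norm from 1 on points
    interpolated between real and fake samples (Lipschitz constraint)."""
    batch_size = real.size(0)
    # One random interpolation coefficient per sample.
    eps = torch.rand(batch_size, 1, 1, 1, device=real.device)
    interp = (eps * real + (1 - eps) * fake).requires_grad_(True)
    scores = critic(interp)
    grads = torch.autograd.grad(
        outputs=scores, inputs=interp,
        grad_outputs=torch.ones_like(scores),
        create_graph=True, retain_graph=True)[0]
    grad_norm = grads.view(batch_size, -1).norm(2, dim=1)
    return lambda_gp * ((grad_norm - 1) ** 2).mean()

def critic_loss(critic, real, fake):
    # Wasserstein estimate: score real samples high, fake samples low;
    # the gradient penalty replaces weight clipping.
    return (critic(fake).mean() - critic(real).mean()
            + gradient_penalty(critic, real, fake))

def generator_loss(critic, fake):
    # The generator maximizes the critic's score on generated samples.
    return -critic(fake).mean()
```

Pushing the critic's gradient norm toward 1 on interpolated samples is the standard WGAN-GP way of enforcing the Lipschitz constraint, and it is what keeps the critic's gradients informative so the generator continues to receive a usable training signal.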