Eckhorn, R., Reitboeck, H. J., Arndt, M., and Dicke, P. (1989) A neural network for feature linking via synchronous activity: results from cat visual cortex and from simulations. In R. M. J. Cotterill (ed.), Models of Brain Function. Cambridge: Cambridge University Press, pp. 255-272.
Lindblad, T., and Kinser, J. M. (2005) Image processing using pulse-coupled neural networks, 2nd rev. ed. Berlin; New York: Springer. ISBN 3-540-24218-X.
These network models have since been used as a basis for PCNN (Pulse Coupled Neural Network), RYNN (Rybak Neural Network), and HRYNN (Heterogeneous Rybak Neural Network) models (for example, see Wang et al., 2010; Liao et al., 2024; Qi et al., 2024; Rafi and Rivas, 2024; Goyal et al., 2025):
Wang, Z., Ma, Y., Cheng, F., and Yang, L. (2010) Review of pulse-coupled neural networks. Image and Vision Computing 28: 5-13.
Subashini, M. M., and Sahoo, S. K. (2014) Pulse coupled neural networks and its applications. Review. Expert Systems with Applications 41 (8): 3965-3974.
Qi, Y., Yang, Z., Lian, Y. J., Guo, Y., Sun, W., Liu, J., and Wang, R. (2021) A new heterogeneous neural network model and its application in image enhancement. Neurocomputing 440: 336-350.
Qi, Y., Yang, Z., Sun, W., Lou, M., Lian, J., Zhao, W., Deng, X., and Ma, Y. (2022) Comprehensive overview of image enhancement techniques. Archives of Computational Methods in Engineering 29: 583-607.
Liao, J.-X., Hou, B.-J., Dong, H.-C., and Zhang, H. (2024) Quadratic neuron-empowered heterogeneous autoencoder for unsupervised anomaly detection. IEEE Transactions on Artificial Intelligence 5: 4723-4737.
Qi, Y., Yang, Z., Lu, H., Li, S., and Ma, Y. (2024) A multi-channel neural network model for multi-focus image fusion. Expert Systems with Applications 247: 123244.
Rafi, N., and Rivas, P. (2024) A review of Pulse-Coupled Neural Network applications in computer vision and image processing. arXiv:2406.00239v1.
Goyal, N., Goyal, N., Mendiratta, T., Kharbanda, H., Bansal, K., Mann, S. K., Panigrahy, C., and Aggarwal, A. (2025) Dual-channel Rybak neural network based medical image fusion. Optics and Laser Technology 181: 112118.
Through iterative computation, PCNN neurons produce temporal series of pulse outputs. These pulse series encode information about the input image and can be used for various image processing applications, such as image segmentation and feature generation. Compared with conventional image processing methods, PCNNs have several significant merits, including robustness against noise, independence from geometric variations in input patterns, and the ability to bridge minor intensity variations in input patterns.
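As a rough illustration of this iterative scheme, the sketch below implements one widely used simplified PCNN neuron model (feeding and linking compartments, modulatory coupling, and a decaying dynamic threshold) and records the epoch at which each pixel first pulses. The function name, coupling kernel, decay constants, and amplitude parameters are illustrative assumptions, not values prescribed by the references above.

```python
import numpy as np
from scipy.ndimage import convolve

def pcnn_firing_map(image, iterations=30, alpha_f=0.1, alpha_l=1.0, alpha_e=1.0,
                    v_f=0.5, v_l=0.2, v_e=20.0, beta=0.1):
    """Simplified PCNN iteration; returns the epoch of each pixel's first pulse (illustrative parameters)."""
    # 3x3 coupling kernel: weights from the eight nearest neighbours (assumed values)
    w = np.array([[0.5, 1.0, 0.5],
                  [1.0, 0.0, 1.0],
                  [0.5, 1.0, 0.5]])
    s = image.astype(float)
    s = s / (s.max() + 1e-12)                     # normalised external stimulus
    f = np.zeros_like(s)                          # feeding compartment
    l = np.zeros_like(s)                          # linking compartment
    e = np.ones_like(s)                           # dynamic threshold
    y = np.zeros_like(s)                          # binary pulse output
    first_fire = np.zeros_like(s)                 # epoch of first firing (0 = never fired)

    for t in range(1, iterations + 1):
        k = convolve(y, w, mode='constant')       # pulses received from neighbouring neurons
        f = np.exp(-alpha_f) * f + v_f * k + s    # feeding: leaky integration of stimulus and neighbour pulses
        l = np.exp(-alpha_l) * l + v_l * k        # linking: leaky integration of neighbour pulses only
        u = f * (1.0 + beta * l)                  # internal activity: linking modulates the feeding input
        y = (u > e).astype(float)                 # a neuron fires when its activity exceeds its threshold
        e = np.exp(-alpha_e) * e + v_e * y        # threshold decays, then jumps sharply after a pulse
        first_fire[(first_fire == 0) & (y == 1)] = t
    return first_fire
```

The resulting map of first-firing epochs can be read as a coarse segmentation: because of the linking term, neighbouring pixels with similar intensities tend to pulse in the same iteration.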
Rybak-type PCNNs differ from traditional ANNs in that they are biologically inspired models that process images through iterative, time-based pulse outputs. They are more robust to noise and geometric variations and do not require a training phase, as their behavior is adjusted by tuning parameters rather than by learning from data. Traditional ANNs, on the other hand, are applied to a broader range of tasks but require extensive training to adjust their weights.
From Rybak's Neural Preprocessor to Next-Gen Visual AI