Cybernetics And Systems Analysis
International Theoretical Science Journal

UDC 681.3:007.52
R. Vdovychenko1, V. Tulchinsky2


1 V.M. Glushkov Institute of Cybernetics,
National Academy of Sciences of Ukraine,
Kyiv, Ukraine

ruslan.vdovichenko1@gmail.com

2 V.M. Glushkov Institute of Cybernetics,
National Academy of Sciences of Ukraine,
Kyiv, Ukraine

dep145@gmail.com

INCREASING THE SEMANTIC STORAGE DENSITY OF SPARSE DISTRIBUTED MEMORY

Abstract. Integration of the Compressive Sensing (CS) method into an implementation of Sparse Distributed Memory (SDM) is proposed to increase the storage capacity for Binary Sparse Distributed Representations of semantics, particularly on Graphics Processing Units (GPUs).

Keywords: Sparse Distributed Memory, SDM, Compressive Sensing, CS, associative memory, Binary Sparse Distributed Representations, neural networks, GPU.
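The core idea of applying CS here is that a sparse binary codevector can be stored as a small set of linear measurements and later reconstructed by a sparse-recovery algorithm. The following is a minimal, generic sketch of that pipeline, not the authors' CS-SDM implementation: it uses a random Gaussian measurement matrix and a simple Orthogonal Matching Pursuit recovery loop (stand-ins for the measurement and reconstruction schemes of the actual system; all names and parameters below are illustrative).

```python
import numpy as np

def measure(x, Phi):
    """Compress a sparse vector x into m << n linear measurements y = Phi @ x."""
    return Phi @ x

def omp_recover(y, Phi, k):
    """Recover a k-sparse vector from measurements y via Orthogonal Matching Pursuit."""
    n = Phi.shape[1]
    residual = y.copy()
    support = []
    coef = np.zeros(0)
    for _ in range(k):
        # Greedily pick the column most correlated with the current residual.
        j = int(np.argmax(np.abs(Phi.T @ residual)))
        if j not in support:
            support.append(j)
        # Least-squares fit restricted to the chosen support, then update the residual.
        coef, *_ = np.linalg.lstsq(Phi[:, support], y, rcond=None)
        residual = y - Phi[:, support] @ coef
    x_hat = np.zeros(n)
    x_hat[support] = coef
    return x_hat

rng = np.random.default_rng(0)
n, m, k = 256, 64, 4                              # dimension, measurements, sparsity
Phi = rng.standard_normal((m, n)) / np.sqrt(m)    # random measurement matrix
x = np.zeros(n)
x[rng.choice(n, size=k, replace=False)] = 1.0     # k-sparse binary codevector
y = measure(x, Phi)                               # store 64 numbers instead of 256 bits
x_hat = omp_recover(y, Phi, k)
recovered = bool((np.round(np.clip(x_hat, 0, 1)) == x).all())
```

The density gain comes from storing only the m-dimensional measurement vector per item; standard CS theory (restricted isometry for random matrices) guarantees exact recovery with high probability when m is on the order of k·log(n/k). The production system cited in the paper uses GPU-side storage and recovery routines such as CoSaMP or linear programming rather than this toy OMP loop.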

