Brain-inspired ULtra-fast & ULtra-sharp machines for AI-assisted health-care
MAECI-MOST funded project, 2021 call (time window 2022-2023)
Israeli Partner: Prof. Ido Kanter (Head), Department of Physics, Bar-Ilan University
Italian Partner: Prof. Adriano Barra (Head), Dipartimento di Matematica & Fisica, Università del Salento
Aims and purposes of this joint project between Israeli and Italian scientists (topic: AI and its applications):
After the so-called AI “winter” that followed Minsky and Papert's criticism of Rosenblatt's perceptron, in the '70s and '80s the research focus in autonomous information-processing systems shifted from models of single neurons to models of networks of interacting neurons. The new wave of Artificial Intelligence (AI) arose in those decades as a discipline inspired by biological information processing in the brain, based on the neurophysiological knowledge available at the time. Later, the development of modern clouds (i.e., big-data repositories on which these machines can be trained), together with the paradigm shift from CPU computing to GPU computing, led to the so-called “computational spring”, with modern AI algorithms capable of impressive results.
Indeed, the technology is now ready to usher in an “AI revolution” which, much like earlier industrial revolutions, is sparking great economic activity, with the resulting increase in the Gross Domestic Product of the Western world assessed at around 18%. The ubiquity of related applications has already changed our everyday life; yet, at present, AI is far from optimized, especially where health-care is concerned, since machine training requires massive datasets. In fact, the large-scale analysis that underlies Personalized Medicine would imply a tremendous effort in terms of time and energy consumption. A new generation of learning machines is therefore imperative for large-scale health-care applications (e.g., hospitals worldwide), and this project aims to contribute in this direction by leveraging our recent understanding of information processing and storage in biological neural networks.
In this project we plan to extend the AI paradigm by including bio-inspired local improvements (at the single-neuron level), such as dendritic (i.e., beyond-synaptic) learning, which is expected to yield ultra-fast algorithms, as well as bio-inspired global improvements (at the whole-network level). To this aim, we will implement sleep-like mechanisms that allow neural networks, after proper rest, to substantially boost their storage capacity and their precision in signal detection and pattern recognition (a minimal illustrative sketch is given below). The natural outcome of this new generation of ultra-fast & ultra-sharp AIs is expected to be its “scalability”, de facto allowing its broad usage in health-care. We also plan to extensively investigate the properties of this new AI at work on cancer detection within our Laboratories in Italy (mainly pancreatic, breast and colon cancers, with and without chemotherapy, for comparison), and on neural derangements within our Laboratory in Israel (in particular, cross-linking degeneracy and anisotropy at the single-neuron level).
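As a concrete illustration of the “sleep” ingredient mentioned above, the short Python sketch below (our own toy example, not the project's codebase) builds a Hopfield-type network in which the standard Hebbian couplings are regularized by a “dreaming” (unlearning-style) kernel of the kind analysed in the papers listed further down (e.g. “Hebbian dreaming for small datasets”). All sizes and parameters are illustrative assumptions; at the chosen load, the dreaming-regularized network is expected to retrieve a corrupted pattern more sharply than the purely Hebbian one.

```python
# Minimal, self-contained sketch (NOT the project's codebase): a Hopfield-type
# associative memory whose Hebbian couplings are regularized by a "dreaming"
# (unlearning-style) kernel J(t) ~ xi^T (1+t)(1 + t C)^{-1} xi.
# Toy sizes N, P, the dreaming time and the corruption level are illustrative
# assumptions, not project parameters.
import numpy as np

rng = np.random.default_rng(0)

N = 200         # number of binary neurons
P = 40          # number of random patterns to store (load alpha = P/N = 0.2)
T_DREAM = 10.0  # "sleep" time: t = 0 recovers the purely Hebbian rule

# Random +/-1 patterns, one per row, and their correlation matrix C
xi = rng.choice([-1.0, 1.0], size=(P, N))
C = xi @ xi.T / N

def coupling(t):
    """Synaptic matrix J_ij(t); t = 0 is Hebbian, t > 0 adds dreaming regularization."""
    kernel = (1.0 + t) * np.linalg.inv(np.eye(P) + t * C)
    J = xi.T @ kernel @ xi / N
    np.fill_diagonal(J, 0.0)  # no self-couplings
    return J

def retrieve(J, state, sweeps=20):
    """Zero-temperature asynchronous dynamics: align each spin with its local field."""
    s = state.copy()
    for _ in range(sweeps):
        for i in rng.permutation(N):
            s[i] = 1.0 if J[i] @ s >= 0 else -1.0
    return s

# Corrupt the first stored pattern by flipping 25% of its entries, then retrieve it.
probe = xi[0].copy()
probe[rng.choice(N, size=N // 4, replace=False)] *= -1

for label, t in [("Hebbian (t=0)", 0.0), (f"dreaming (t={T_DREAM})", T_DREAM)]:
    m = retrieve(coupling(t), probe) @ xi[0] / N  # overlap with the stored pattern
    print(f"{label:>18}: retrieval overlap = {m:+.3f}")
```

At t = 0 the kernel reduces to the Hebbian rule, while for large t it approaches the projector (pseudo-inverse) rule, whose storage capacity is markedly higher; this is the sense in which “rest” sharpens the machine.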
On the Italian side, we acknowledge MAECI (Ministero degli Affari Esteri e della Cooperazione Internazionale) and, on the Israeli side, MOST (Ministry of Science and Technology) for funding the BULBUL project.
PUBLISHED PAPERS IN 2022 AND 2023
23. I. Kanter, et al.,
Learning on tree architectures outperforms a convolutional neural network,
Scientific Reports 13, 962 webpage (2023).
This research is crucial for the project and BulBul was one of the principal sponsors.
This paper belongs to Tasks B.2 & B.3
22. L. Albanese, A. Barra, P. Bianco, F. Durante, D. Pallara,
Hebbian Learning from first principles,
submitted to J. Math. Phys. arxiv (2023).
This paper belongs to Task B.1
21. E. Agliari, F. Alemanno, A. Barra, M. Castellana, D. Lotito
Inverse modelling of time-delayed interactions via the dynamic-entropy formalism,
submitted to Phys. Rev. Lett. arxiv (2023).
This paper belongs to Tasks C.1 & C.2
20. E. Agliari, A. Alessandrelli, A. Barra, F. Ricci-Tersenghi,
Parallel learning in multitasking neural networks,
Journal of Statistical Mechanics 23, 113402 arxiv (2023).
This research is not pivotal for the project, hence BulBul was a partner sponsor.
This paper belongs to Task B.3
19. M. Aquaro, F. Alemanno, I. Kanter, F. Durante, A. Barra, E. Agliari,
Hebbian dreaming for small datasets,
Neural Networks in press SSRN (2023).
This research is crucial for the project and BulBul was the principal sponsor.
This paper belongs to Task A.2
18. M. S. Centonze, I. Kanter, A. Barra,
Statistical mechanics of learning via reverberation in bidirectional associative memories,
Physica A in press arxiv (2023).
This research is crucial for the project and BulBul was the principal sponsor.
This paper belongs to Task A.1
17. L. Albanese, A. Alessandrelli, A. Barra, A. Annibale
About the De Almeida-Thouless line in neural networks
Physica A 663, 129372, arxiv webpage (2024).
This research is crucial for the project and BulBul was one of the principal sponsors.
This paper belongs to Tasks A & B, broadly speaking
16. E. Agliari, F. Alemanno, M. Aquaro, A. Barra
Ultrametric identities in glassy models of Natural Evolution
J. Phys. A: Math & Theor. 56, 385001 webpage arxiv (2023).
This research is not pivotal for the project (it is a source of datasets for it), hence BulBul was a partner sponsor.
This paper belongs to Tasks C.1 & C.2
15. A. Barra, G. Catania, A. Decelle, B. Seoane
Thermodynamics of bidirectional associative memories
J. Phys. A: Math & Theor. 56, 205005, webpage, arXiv (2023)
This research is crucial for the project and BulBul was one of the principal sponsors.
This paper belongs to Task A.1
14. I. Kanter, et al.,
Brain inspired neuronal silencing mechanisms to enable reliable sequence identification,
Scientific Reports 12, 16003 webpage (2023).
This research is crucial for the project and BulBul was one of the principal sponsors.
This paper belongs to Tasks B.2 & B.3
13. M. Aquaro, F. Alemanno, I. Kanter, F. Durante, E. Agliari, A. Barra,
Recurrent neural networks that generalize from examples and optimize by dreaming
arXiv preprint arXiv:2204.07954 (2023).
This research is crucial for the project and BulBul was the principal sponsor.
This paper belongs to Task A.2
12. A. Barra, et al.,
Quantifying heterogeneity to drug response in cancer-stroma kinetics,
Proc. Natl. Acad. Sci. USA 120(11) e2122352120 webpage (2023).
This research is crucial for the project and BulBul was one of the principal sponsors.
This paper belongs to Task C.3
11. A. Barra, et al.,
Dense neural networks: a replica symmetric picture of supervised learning,
Physica A 626, 129076 webpage arxiv (2022).
This research is crucial for the project and BulBul was the principal sponsor.
This paper belongs to Task B.1
10. A. Barra, et al.,
Dense neural networks: a replica symmetric picture of unsupervised learning,
Physica A 627, 129143 webpage arxiv (2022).
This research is crucial for the project and BulBul was the principal sponsor.
This paper belongs to Task B.1
09. E. Agliari, M. Aquaro, A. Barra, A. Fachechi, C. Marullo,
From Pavlov Conditioning to Hebb Learning,
Neural Computation a1, 28, webpage, arxiv (2023).
This research is crucial for the project and BulBul was the principal sponsor.
This paper belongs to Task B.2
08. L. Albanese, A. Alessandrelli,
On Gaussian spin-glass with P-wise interactions,
Journal of Mathematical Physics 63, 043302 webpage (2022).
This research is crucial for the project and BulBul was the principal sponsor.
This paper belongs to Task B.1
07. V. Onesto, et al.,
Probing single cell fermentation flux and intercellular exchange networks via inverse modeling,
ACS Nano 17:4, 3313 bioRxiv webpage (2022).
Note: a press release from the Italian CNR (National Research Council) is available for this work: link here or download the file.
This research is not pivotal for the project (it is a source of datasets for it), hence BulBul was a partner sponsor.
This paper belongs to Tasks D.1 & D.2
06. M. Aquaro, F. Alemanno, I. Kanter, F. Durante, E. Agliari, A. Barra,
Recurrent neural networks that generalize from examples and optimize by dreaming,
arXiv preprint arXiv:2204.07954 (2023).
This research is crucial for the project and BulBul was the principal sponsor.
This paper belongs to Task A.2
05. A. Fachechi, E. Agliari, F. Alemanno, A. Barra,
Outperforming RBM feature-extraction capabilities by "dreaming" mechanisms,
IEEE Trans. on Neural Networks and Learning Systems 1, 10 webpage (2022).
This research is crucial for the project and BulBul was the principal sponsor.
This paper belongs to Task A.2
04. A. Chandra, S. Pandija, F. Alemanno, R. Rizzo, R. Romano, G. Gigli, C. Bucci, A. Barra, L. Del Mercato,
A fully automatic computational approach for precisely measuring organelle acidification,
ACS Appl. Mat. & Interf. 14, 16, 18133 webpage (2022).
Note: a press release from the Italian CNR (National Research Council) is available for this work: link to the website or download the file
This research is not pivotal for the project (it is a source of datasets for it) hence BulBul was a partner sponsor.
This paper belongs to Tasks C.1 & C.2
03. F. Alemanno, M. Aquaro, I. Kanter, A. Barra, E. Agliari,
Supervised Hebbian learning: toward eXplainable AI,
Europhysics Letters (Perspective) 141, 11001, arxiv webpage (2022).
This research is crucial for the project and BulBul was the principal sponsor.
This paper belongs to Task A.1
02. L. Albanese, F. Alemanno, A. Alessandrelli, A. Barra,
Replica symmetry breaking in dense neural networks,
Journal of Statistical Physics 189, 24 arxiv webpage (2022).
This research is crucial for the project and BulBul was the principal sponsor.
This paper belongs to Tasks A.1 & A.2
01. E. Agliari, F. Alemanno, A. Barra, G. Di Marzio,
The emergence of a concept in shallow neural networks,
Neural Networks, 148, 232, arxiv webpage (2022).
This research is crucial for the project and BulBul was the principal sponsor.
This paper belongs to Task A.1
Group Members (Italian side)**
Adriano Barra, Ross Rinaldi, Elena Agliari (staff members)
Alberto Fachechi, Francesco Alemanno (researchers)
Martino Salomone Centonze (post-doc)
Linda Albanese, Miriam Aquaro, Daniele Lotito, Andrea Alessandrelli (PhD students).
**the group is spread across the Salento Unit and the Rome Unit; for more info see our ResearchGate webpage