#neuromorphic
There's this paper, which proposes an architecture that is both neuromorphic and, at least on paper, competitive with comparable transformer architectures.

arxiv.org/abs/2509.26507
The Dragon Hatchling: The Missing Link between the Transformer and Models of the Brain
The relationship between computing systems and the brain has served as motivation for pioneering theoreticians since John von Neumann and Alan Turing. Uniform, scale-free biological networks, such as ...
arxiv.org
November 9, 2025 at 2:54 PM
Hi #Bluesky! I’m a Senior Researcher at CTTC (rparada.netlify.app) exploring wireless, embedded systems & AI for sustainability & IoT. Recently diving into brain-inspired algorithms like SNNs. Sharing insights; if you’re into SNNs/neuromorphic ML, let’s connect! #Sustainability #AI #SNN #IoT
Raúl Parada Medina - Homepage
rparada.netlify.app
November 10, 2025 at 12:28 PM
Can neuromorphic computing help reduce AI’s high energy cost? | PNAS www.pnas.org/doi/10.1073/...
PNAS
Proceedings of the National Academy of Sciences (PNAS), a peer reviewed journal of the National Academy of Sciences (NAS) - an authoritative source of high-impact, original research that broadly spans...
www.pnas.org
November 7, 2025 at 12:23 PM
The next day (11/12), I will be giving a talk at 9:15 AM on "Neuromorphic Compression of Tactile Data with Spiking Neural Networks" as part of the session on "Neural Computing and the Engineering of Adaptive Neural Systems". This is work started during my PhD at @jhu.edu with Vivek Chari.
November 11, 2025 at 2:56 AM
#SNUFA day 2 starting in a few minutes. Lots of neuromorphic computing and SNNs. Check out snufa.net/2025 for agenda and links to join.
November 6, 2025 at 12:52 PM
"Our findings show that fungal computers can provide scalable, eco-friendly platforms for neuromorphic tasks, bridging bioelectronics and unconventional computing."
Sustainable memristors from shiitake mycelium for high-frequency bioelectronics
Neuromorphic computing, inspired by the structure of the brain, offers advantages in parallel processing, memory storage, and energy efficiency. However, current semiconductor-based neuromorphic chips...
doi.org
November 2, 2025 at 7:59 PM
How to cope with the massive amounts of energy needed for #artificialintelligence systems and large language models?
A different computing architecture, inspired by our brain, could help.
🧪 #Science #Neuroscience
Can neuromorphic computing help reduce AI’s high energy cost?
www.pnas.org/doi/10.1073/...
November 3, 2025 at 3:52 PM
Something to think about.

Will neuromorphic computing allow for AI chips that don’t need the power generated by a nuclear reactor to operate?

We're gonna need all the power we can get to charge our electric vehicles. I wouldn't have gone with an electric car if I didn't have a solar system to charge it.
November 5, 2025 at 2:01 PM
Well, to complement my son's efforts, here is some background on my own first foray into neuromorphic computing. The original article can be seen in PNAS! A big shout-out to Suraj Honnuraiah, who made it all happen! www.growkudos.com/publications...
November 4, 2025 at 2:31 PM
Calling *ME* a fucking 「 luddite 」 is god-damned laughable

I've been working on machine learning since bloody high school

My first job at Intel was designing & modelling neuromorphic function blocks for embedded microcontrollers

Claiming that you left because of earned shit FROM A YEAR AGO is BS
November 3, 2025 at 10:48 PM
First neuromorphic computer with 2 billion neurons

The system consists of 15 blade-type neuromorphic servers with 960 Darwin 3 chips; each chip supports more than 2.35 million spiking neurons and hundreds of millions of synapses.

#AI #neuromorphiccomputing

www.futura-sciences.com/tech/actuali...
Darwin Monkey, the world's first neuromorphic computer with 2 billion neurons
In China, a team of researchers has developed a neuromorphic computer of unprecedented power. This innovation will make it possible to design computing systems that are more...
www.futura-sciences.com
November 1, 2025 at 12:13 PM
This is so pathetic when you're literally falling for a deliberate marketing tactic. People aren't shitting on neuromorphic computing, because our economy is not artificially propped up on an ouroboros of funding for it, whose only real material impact is the acceleration of environmental decay.
October 30, 2025 at 3:31 PM
Can #NeuromorphicComputing help reduce AI’s high #energy cost? Researchers see big potential in #EnergyEfficient systems inspired by the #HumanBrain. A PNAS Core Concept explainer: https://ow.ly/45rk50XkYt5

#AI #ArtificialIntelligence #LLMs #ChatGPT #NeuralNetwork #DataCenter
October 31, 2025 at 5:01 PM
🔮 Future Computing Workshop
📅 Mar 16–17, 2026 | HLRS, Stuttgart

Join researchers, architects & vendors to discuss the next generation of HPC — from GPUs to neuromorphic systems.

Speakers: Elad Raz (NextSilicon), Ruti Ben-Shlomi (LightSolver)
Details & call: easychair.org/cfp/FCW26
CFP
FCW26 (Future Computing Workshop 2026) is a two-day event organized by HLRS in Stuttgart, bringing together researchers, architects, and developers from academia, industry, and national labs to…
easychair.org
October 24, 2025 at 8:29 AM
🔗https://gadgetrain.com/read/blog/neuromorphic-computing-explained

#NeuromorphicComputing #AIHardware #TechInnovation #FutureTech #AI #MachineLearning #Robotics #SustainableTech
October 22, 2025 at 10:08 AM
Inspiring perspective paper by Prof. Giacomo Indiveri in Neuron. In this interview, he talks about the term “neuromorphic”, the uniqueness of the Institute of Neuroinformatics and comparisons with artificial intelligence.

Read more: ee.ethz.ch/news-and-eve...

@giacomoi.bsky.social
October 21, 2025 at 11:40 AM
It's not fundamental to LLMs on *principle*. You can likely build a neurosymbolic LLM or an outright brainlike neuromorphic language model that lacks these flaws. But the architecture we do use, autoregressive attention-based transformers, yes, that IS absolutely a Potemkin village machine.
October 20, 2025 at 2:32 AM
#spiking #neural #networks (#SNN) running on continuous-time, noisy, and highly variable computing substrates can learn reliably with #ReinforcementLearning ... Not only in real brains, but also in mixed-signal #neuromorphic hardware! 😇

Neuromorphic dreaming […]

[Original post on fediscience.org]
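For readers new to the idea of a "continuous-time, noisy, and highly variable computing substrate", here is a minimal discretized sketch of a leaky integrate-and-fire neuron driven by a noisy input. Everything in it (function name, time constants, the Gaussian noise model) is illustrative and not taken from the paper:

```python
import random

def simulate_lif(inputs, v_th=1.0, tau=20.0, dt=1.0, noise_std=0.05, seed=0):
    """Leaky integrate-and-fire neuron on a noisy substrate.

    The membrane potential decays toward 0 with time constant tau,
    integrates the input current, receives Gaussian noise each step,
    and emits a spike (resetting to 0) when it crosses v_th.
    Returns the list of spike times (step indices).
    """
    rng = random.Random(seed)
    v, spikes = 0.0, []
    for t, i_in in enumerate(inputs):
        v += dt * (-v / tau + i_in) + rng.gauss(0.0, noise_std)
        if v >= v_th:
            spikes.append(t)
            v = 0.0  # reset after spike
    return spikes

# Constant drive above threshold steady state produces a regular-ish
# spike train; the noise term jitters the exact spike times.
spike_times = simulate_lif([0.15] * 200)
```

An RL loop on such a substrate would modulate synaptic weights based on the resulting spike trains and a reward signal, which is what makes learning under this noise nontrivial.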
October 12, 2025 at 9:28 PM
Neuromorphic dreaming as a pathway to efficient learning in artificial agents

https://www.doi.org/10.1088/2634-4386/ae0a77
October 12, 2025 at 9:32 PM
Our breakthrough: a CMOS pulse shaper enables stochastic STDP learning at the device level. From deterministic to stochastic plasticity: building adaptive neuromorphic systems that learn in real time.

#Neuromorphic #Robotics #AI
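For context on the rule being referenced, here is a toy sketch of pair-based STDP where each weight update is applied probabilistically rather than deterministically. All names and constants are illustrative assumptions, not the device-level CMOS implementation:

```python
import math
import random

def stochastic_stdp(w, t_pre, t_post, a_plus=0.05, a_minus=0.055,
                    tau=20.0, p_apply=0.5, rng=None):
    """Pair-based STDP with stochastic application.

    If the presynaptic spike precedes the postsynaptic spike
    (t_pre < t_post), the weight is potentiated; otherwise
    (including coincident spikes) it is depressed. The magnitude
    decays exponentially with the spike-time gap. Unlike classic
    deterministic STDP, the update is only applied with probability
    p_apply: the stochastic gate stands in for device-level noise.
    """
    rng = rng or random.Random(0)
    dt = t_post - t_pre
    if dt > 0:
        dw = a_plus * math.exp(-dt / tau)    # potentiation
    else:
        dw = -a_minus * math.exp(dt / tau)   # depression
    if rng.random() < p_apply:               # stochastic gate
        w += dw
    return min(max(w, 0.0), 1.0)             # clip weight to [0, 1]
```

Averaged over many spike pairs, the expected update equals the deterministic rule scaled by p_apply, which is why stochastic plasticity can still converge while being far cheaper to realize in hardware.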
October 10, 2025 at 2:31 PM
A thought-provoking perspective from the visionary @giacomoi.bsky.social, calling for neuromorphic computing to return to its roots in fundamental neuroscience; an inspiring vision for the future of NeuroAI 🤩
October 6, 2025 at 9:48 AM
Curious about the meaning of "neuromorphic"?
Here's my latest NeuroView on this topic:
www.cell.com/neuron/fullt...
Neuromorphic is dead. Long live neuromorphic
In this NeuroView, Giacomo Indiveri discusses the origins and evolution of neuromorphic engineering, reflects on recent shifts toward digital and AI-centric approaches, and advocates for a revival of ...
www.cell.com
October 3, 2025 at 3:54 PM
Implications:
🧠 Neuroscience — functional rationale for the evolutionary suppression of strong reciprocal motifs.
🤖 NeuroAI — reciprocity as a tunable design parameter in recurrent & neuromorphic networks.
September 27, 2025 at 9:44 PM