Liquid neural networks, spiking neural networks, neuromorphic chips. The next generation of AI will be very different. #ainews #ai #agi #singularity #neuralnetworks #machinelearning
Thanks to our sponsor, Bright Data:
Train your AI models with high-volume, high-quality web data through reliable pipelines, ready-to-use datasets, and scraping APIs.
Learn more at https://brdta.com/aisearch
Viewers who enjoyed this video also tend to like the following:
You Don't Understand AI Until You Watch THIS https://youtu.be/1aM1KYvl4Dw
These 5 AI Discoveries will Change the World Forever https://youtu.be/fyVja-57EIs
The Insane Race for AI Humanoid Robots https://youtu.be/90TMZ2fq9Gs
These new AI's can create & edit life https://youtu.be/3K_LAGonsPU
What is the difference between learning and interpolation?
GPUs are the big elephant in the room.
Do you think liquid networks have the potential to be self-aware?
The problem with Netflix is a lack of content, not AI.
Yes, but remember that flight is possible, and yet we still don't have the spaceships that past predictions said we'd have by now.
The future of driving looks like this (We are going to have flying cars)
This Liquid Neural Network thing feels a bit voodoo. What I mean is, from the explanation I get that if the interpreter is trained on a sufficient number of concepts, and I then put in something it was not explicitly trained on but that could hypothetically be built from some subset of those concepts, it should produce a logical, sensical output for that input just by virtue of interpreting the sauce, right? That would be bat s* crazy and super cool, though also f*ing creepy. Kinda wanna try training one.
Meaning that at a certain point it won't really have to learn anymore; it will just do voodoo magic and spit out profound stuff left and right.
Fluid Dynamics man, that stuff is different
Physically, though, what is a neural network? You're just showing a picture of the system. What exactly is a node, and how does it communicate with other nodes? Why are our brains so much more efficient despite these being modeled after us?
Can you please link the papers about liquid neural networks?
Hate this low quality bullshit
Is this computer code based on fluid dynamics, or is it actually liquid?
Hey, super nice high-level overview.
The human brain doesn't start with random weights. It's already pre-trained by millions of years of evolution.
I just want it on record: the real current problem with AI is that it doesn't exist. An AI would be capable of learning. What we have is Generated Intelligence… hmmm…
This whole video is one unsupported assertion after another.
The history of AI right back to the '50s proves that you just need to throw compute at the problem.
It’s like saying you can’t make an aeroplane fly unless the wings flap.
AI needs to be a computer that can reprogram itself.
Had to pause for a sec after you spoke about energy usage about a third of the way in. You can't compare training the network with using the brain. My brain is STILL being trained, so you really should compare against 46 years of training in my case. If you want to compare, then compare usage against usage, or training against training. Your conclusion is right (our brain is WAAAY more efficient), but I just had to correct you there.
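For a rough sense of scale, here is the comparison that comment is asking for, training against training, assuming the commonly cited ~20 W power draw of the human brain and an often-quoted ~1,300 MWh estimate for training GPT-3 (both figures are outside-the-video assumptions):

```python
# Back-of-the-envelope: 46 years of running ("training") a brain at ~20 W
# versus a commonly cited ~1,300 MWh estimate for training GPT-3.
# Both figures are rough assumptions, not numbers from the video.
BRAIN_WATTS = 20
YEARS = 46
hours = YEARS * 365.25 * 24

brain_mwh = BRAIN_WATTS * hours / 1e6   # watt-hours -> megawatt-hours
gpt3_mwh = 1300                         # frequently quoted training estimate

print(f"Brain over {YEARS} years: ~{brain_mwh:.1f} MWh")
print(f"GPT-3 training:          ~{gpt3_mwh} MWh (~{gpt3_mwh / brain_mwh:.0f}x more)")
```

Under those assumptions the brain's 46 years of "training" comes to roughly 8 MWh, which still supports the comment's conclusion that the brain is far more efficient.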
I love when people say "the human brain" as if it's the only brain that can learn.
The script should have included: "There are also many different architectures, such as the Transformer model, and the Transformer model, and the Transformer model."
It’s so obvious how dumb A.I. is right now. It’s SO frustrating.
My friends, the liquid neural network is actually called reinforcement learning, and it was invented in the 1960s.
LIQUID NEURAL NETWORKS AND ARTIFICIAL GENERAL INTELLIGENCE:
This video explains how neural networks work, and it's mind-blowing conceptually. The only things lacking are neuroplasticity and lower electricity use.
What does this mean? It simply means that once we scale up liquid neural networks the way the brain is scaled, their effectiveness will increase, and then maybe we will not need humans for the AGI to do self-learning.
A few more breakthroughs and maybe we will have a system that learns by itself, and we won't need humans for AI to upgrade itself.
Maybe at that point the TECHNOLOGICAL SINGULARITY will start. Back in the early internet days, around 2005, I thought it would come by 2047, and many of the pundits gave that date, but now it could happen at any moment.
After all, liquid neural networks work like the brain and need only about 20,000 parameters versus trillions. THAT'S EXPONENTIAL…
The future is extreme wealth hoarding if people can't learn to work at profit-sharing companies instead of greed-factory corporations.
Of course it can't learn infinitely. No physical system can store an infinite amount of information. Even just saying that is problematic.
The explanation 'converting an image into tokens' is unfortunate. This only holds for one highly specific approach, i.e., vision transformers. The discrete-token concept comes from linguistic, not visual, neural processing. The analog properties of the input are lost, and it is not at all clear whether such a quantized approach would even be useful for liquid or spiking networks.
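For reference, here is a minimal sketch of the ViT-style patch tokenization that comment is referring to, in plain NumPy; the patch size, embedding dimension, and random projection are illustrative assumptions, not the video's method:

```python
# Minimal sketch of ViT-style "image -> tokens": split the image into
# patches and project each flattened patch to an embedding vector
# (one "token"). Patch size, dim, and the random projection are
# illustrative assumptions, not the approach used in the video.
import numpy as np

def image_to_tokens(image, patch=16, dim=64, seed=0):
    """image: (H, W, C) array -> (num_patches, dim) matrix of tokens."""
    H, W, C = image.shape
    patches = [
        image[y:y + patch, x:x + patch].reshape(-1)
        for y in range(0, H - H % patch, patch)
        for x in range(0, W - W % patch, patch)
    ]
    patches = np.stack(patches)                      # (N, patch*patch*C)
    projection = np.random.default_rng(seed).normal(size=(patches.shape[1], dim))
    return patches @ projection                      # (N, dim) continuous "tokens"

tokens = image_to_tokens(np.zeros((224, 224, 3)))
print(tokens.shape)   # (196, 64): a 14x14 grid of patches, each now a 64-d token
```

Note that in a plain vision transformer these "tokens" are still continuous patch embeddings; a separate quantization step (e.g., a learned codebook) is what would throw away the analog detail the comment seems to be worried about.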
Excellent video.
For the average Joe, I think AI has collectively already superseded everything we could do. AGI isn't here because AI is quite fragmented at the moment… there needs to be a robust user interface that lets all of these masters of [something complex] feed the brain of a jack-of-all-trades, and boom, AGI.
A neural network doesn't make AI; validation by humans does… Neural networks are just the technology that makes this possible.
Not all AI is based on neural networks.
Future occupation: computer psychologist.
High-dimensional NN potentials for molecular dynamics.
But if the liquid layer can "learn" new things, can it also "unlearn"? That would be a big problem, wouldn't it?
When it comes to technology, if it's technically possible and there's economic demand for it, it will happen. Given the stupendous amount of energy needed to train current models, the incentives to perfect these more efficient models will lead to rapid progress. There's no way things stay this inefficient for too long.
I feel like the difference between a computer and a brain mirrors the differences between communism and capitalism. Like a computer, communism uses a centralized authority. While technology may advance under communism and development may happen (not efficiently), communist societies never really break out of the scientific level they begin in without input from non-communist societies. This isn't perfectly analogous since obviously there are valid applications of computers, but still. Now compare this with capitalism, or the brain, which is self-regulating, highly efficient, and very creative and innovative.
We already have brainlike computers. It's called your brain.
A nobody's here! You don't need that vehemence for achievement, only to do it right, or at least fundamentally differently than before (like me). Numbly mimicking one single aspect of intelligence, neurons, is like trying to read a manual without ever turning the page. I'll see…
Would it not be best to combine all of these? A standardized base neural network interface with fluid-like restructuring to generate patterns and spike detection to recognize its own patterns over time: something that would truly mimic a brain in every sense, removing the drawbacks by combining the approaches while keeping all the benefits.
I don’t think there’s any reason continuous training can’t happen with the current paradigm
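As a minimal illustration of that point, here is a sketch of continuous (online) training within the current gradient-descent paradigm; the tiny linear model and drifting synthetic data stream are illustrative assumptions:

```python
# Minimal sketch of continuous training in the current paradigm:
# keep taking one SGD step per example as data streams in, instead of
# freezing the model after an offline training phase.
# The linear model and drifting target are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)
w = np.zeros(3)        # weights of a tiny linear model
lr = 0.01              # learning rate

def stream():
    """Endless stream of (x, y) pairs from a slowly drifting target."""
    target = rng.normal(size=3)
    while True:
        target += 0.001 * rng.normal(size=3)   # the world keeps changing
        x = rng.normal(size=3)
        yield x, x @ target

for step, (x, y) in zip(range(5001), stream()):
    error = w @ x - y
    w -= lr * error * x                        # one online SGD step per example
    if step % 1000 == 0:
        print(f"step {step}: squared error {error**2:.4f}")
```

Whether this scales to large models is a separate question, but nothing in the basic update rule itself forbids learning continuously.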
we're toast – that is scary shit
Very insightful for understanding the current state of AI tech and its future, though.
Hey! I'm ADHD from before the term was coined 😂
Noticed the ground breaking?
You still have to train them and get output, and that is the big kicker. That's always the problem with training systems, and it remains a problem here.
Most people ignore the (amount of) energy, the (amount, quality, and variability of) data, and, in general, the technological limitations required to level up AI technology. They're under the influence of AI snake-oil propaganda.
I think it is important to highlight that current artificial neural networks are not based on how human brains work but are inspired by biological neural networks.
Human brains are really complex thanks to half a billion years of brain evolution.
There is a pretty good book that serves as a primer on neuroscience called “A Brief History of Intelligence”. If you enjoyed Sapiens you will love this book.
You never explained how the liquid neural network's reservoir changes over time, which I think would be pretty important to the explanation.
Very well explained
In response to your question at approximately 28 minutes: yes, I would like to see more on spiking neural networks!