Artificial baby mind learns to talk
In recent months I have been immersed in exciting projects, while also keeping up with how the story told in my book An Impossible Invention continues to evolve.
There’s been so much on the theme of The Biggest Shift Ever that I would have liked to share in blog posts, so much fascinating science and tech news flowing towards me every day in the newsroom, depicting a world in accelerating innovation and change. But I just haven’t had the time.
Meanwhile I try to share parts of this flow on Twitter, so please follow me there if you would like updates more often. Hopefully I will be able to be more active here in a few months.
Today I just wanted to share one of the most intriguing pieces of research I’ve come across lately — an artificial simulated toddler learning to talk while interacting with its ‘caregiver’. Just watch this amazing video:
The project, which involves computational models of the face and brain, combining bioengineering, computational and theoretical neuroscience, artificial intelligence and interactive computer graphics research, is being developed at the Laboratory for Animate Technologies at the University of Auckland in New Zealand.
I wouldn’t claim that AI has reached the level of a human baby’s mind just yet, but I think this is another clear sign that we’re on our way to getting there.