Thursday, November 26, 2015
A group of researchers from the University of Sassari (Italy) and the University of Plymouth (UK) has developed a cognitive model, made up of two million interconnected artificial neurons, that learns to communicate in human language starting from a "tabula rasa" state, solely through interaction with a human interlocutor. The model is called ANNABELL (Artificial Neural Network with Adaptive Behavior Exploited for Language Learning) and is described in an article published in the international scientific journal PLOS ONE. This research sheds light on the neural processes that underlie the development of language.
Just six months after coming online, Comet, the new petascale supercomputer at the San Diego Supercomputer Center (SDSC) at the University of California, San Diego, is already blazing new paths of discovery, thanks in part to its role as a primary resource for an assortment of science gateways that give scientists across many research domains easy access to its computing power. Simply described, science gateways provide web browser access to applications and data used by specific research communities. Gateways make it possible to run these applications on supercomputers such as Comet, so results come quickly, even with large data sets.