Scientists say social interaction is ‘the dark matter of AI’

By Warren B. Obrien
January 3, 2022

Two researchers from the University of Montreal today published a pre-print research paper on creating “smarter artificial agents” by mimicking the human brain. We’ve heard this one before, but this time it’s a little different.

The big idea here is to give artificial intelligence agents more agency.

According to the researchers:

Despite advances in social neuroscience and developmental psychology, it was not until the last decade that serious efforts began to focus on the neural mechanisms of social interaction, which were believed to be the “dark matter” of social neuroscience.

Basically, there is something other than algorithms and architecture that makes our brains tick. According to the researchers, this “dark matter” is made up of social interactions. They argue that AI must be capable of “subjective awareness” in order to develop the neurological connections necessary to display advanced cognition.

Per the paper:

The study of consciousness in artificial intelligence is not a mere pursuit of the metaphysical mystery; from an engineering perspective, without understanding subjective consciousness, it might not be possible to create artificial agents that intelligently control and deploy their limited processing resources.

Making an AI as smart as a human isn’t a simple matter of building bigger supercomputers that can run faster algorithms.

Current AI systems fall far short of the cognitive abilities of a human. In order to close this gap, the researchers say agents will need three things:

  • Biological plausibility
  • Temporal dynamics
  • Social embodiment

The “biological plausibility” aspect means creating an AI architecture that mimics that of the human brain: a subconscious layer that is separate from, but connected to, a dynamic consciousness layer.

Because our subconscious is intrinsically linked to the control of our body, the researchers seem to be proposing to build an AI with a similar brain-body connection.
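As a rough illustration of what such a layered design could look like, here is a minimal Python sketch of a fast “subconscious” controller coupled to a slower “conscious” layer. All class names and update rules below are hypothetical choices made for illustration, not the architecture proposed in the paper.

    # Minimal sketch, not the architecture from the paper: all names and
    # update rules are hypothetical, chosen only to illustrate a fast
    # "subconscious" loop coupled to a slower "conscious" loop.
    import numpy as np

    rng = np.random.default_rng(0)

    class SubconsciousLayer:
        """Fast loop: maps raw body state to low-level motor corrections."""
        def __init__(self, dim):
            self.w = rng.normal(scale=0.1, size=(dim, dim))

        def step(self, body_state):
            correction = np.tanh(self.w @ body_state)
            summary = body_state.mean()  # compressed signal passed upward
            return correction, summary

    class ConsciousLayer:
        """Slow loop: integrates summaries from below into a high-level goal."""
        def __init__(self):
            self.goal = 0.0

        def step(self, summary):
            self.goal = 0.9 * self.goal + 0.1 * summary  # slow integration
            return self.goal

    body_state = rng.normal(size=4)
    sub, conscious = SubconsciousLayer(4), ConsciousLayer()
    for t in range(5):
        correction, summary = sub.step(body_state)
        goal = conscious.step(summary)
        body_state = body_state + 0.1 * correction - 0.05 * goal  # brain-body coupling
        print(f"t={t} goal={goal:+.3f}")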

According to the researchers:

More precisely, the proposition is that the brain constructs not only a model of the physical body, but also a coherent, rich and descriptive model of attention.

The body schema contains layers of valuable information that help to monitor and predict the stable and dynamic properties of the body; likewise, the attention schema helps control and predict attention.

One cannot understand how the brain controls the body without understanding the body schema, and similarly one cannot understand how the brain controls its limited resources without understanding the attention schema.
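To make the attention-schema idea concrete, here is a toy Python sketch of an agent that spreads a limited attention budget over its inputs while maintaining an internal model of where that attention goes. The running-average “schema” below is a crude, hypothetical stand-in for the learned predictive model the quote describes, not code from the paper.

    # Toy sketch only; "attention_schema" is a hypothetical stand-in for a
    # learned model of the agent's own attention, not the paper's method.
    import numpy as np

    rng = np.random.default_rng(1)

    def softmax(x):
        e = np.exp(x - x.max())
        return e / e.sum()

    n_inputs = 5
    attention = np.full(n_inputs, 1.0 / n_inputs)  # limited resource: sums to 1
    attention_schema = attention.copy()            # internal model of that allocation

    for t in range(20):
        salience = rng.normal(size=n_inputs)       # incoming stimuli
        attention = softmax(2.0 * salience)        # reallocate the fixed budget
        # The schema tracks recent allocations with an exponential moving
        # average -- a crude proxy for predicting where attention will go.
        attention_schema += 0.2 * (attention - attention_schema)

    error = float(np.abs(attention - attention_schema).mean())
    print(f"mean schema tracking error: {error:.3f}")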

As for “temporal dynamics,” the researchers suggest that artificial agents must be able to exist in the world in much the same way humans do. This is similar to the way our mind works: we don’t just interpret information; we process it in relation to our surroundings.

As the researchers say:

In nature, complex systems are made up of simple components that self-organize over time, ultimately producing emergent behaviors that depend on the dynamic interactions between the components.

This makes understanding how time affects both an agent and its environment a necessary component of the proposed models.
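Here is a minimal sketch of that idea, with entirely hypothetical dynamics: an agent whose internal state unfolds over time together with its environment, so that its behavior depends on the history of the interaction rather than on any single snapshot.

    # Illustrative only: a hypothetical agent-environment loop, not the
    # paper's model. Both sides evolve in time and influence each other.
    import numpy as np

    rng = np.random.default_rng(2)

    env_state = 0.0
    agent_memory = 0.0  # simple recurrent state carried across time steps

    for t in range(10):
        observation = env_state + rng.normal(scale=0.1)        # noisy sensing
        agent_memory = 0.8 * agent_memory + 0.2 * observation  # temporal integration
        action = -agent_memory                                  # act on accumulated history
        env_state = 0.95 * env_state + action + 0.3             # environment also evolves
        print(f"t={t} env={env_state:+.2f} action={action:+.2f}")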

And that brings us to “social embodiment,” which is essentially the creation of a literal body for the agent. The researchers say AI should be capable of social interaction on an equal footing.

According to the paper:

For example, in human-robot interaction, a gripper is not limited to its role in the manipulation of objects. Rather, it opens up a wide range of movements that can improve the robot’s communicative skills and, therefore, the quality of its possible interactions.

Ultimately, there is no real roadmap to AI at the human level. Researchers are trying to bring the worlds of cognitive and computer science together with engineering and robotics in ways we’ve never seen before.

But, arguably, this is just another attempt to pull a miracle out of deep learning technology. Barring some new form of computation or a new class of algorithms, we could already be as close to human-level AI agents as traditional reinforcement learning can take us.
