Research, standards and thoughts for the digital world

Earlier posts by category:

MPAI, MPEG, ISO

Prompt Creation: Where Words Meet Context

The Prompt Creation module is the storyteller and translator in the Autonomous User’s “brain”. It takes raw sensory input – the audio and visual spatial data of the Context (objects in a scene with their position, orientation, and velocity) and the User State (a rich description of the A-User’s understanding of the User’s “internal state”) – and turns it into a well-formed prompt that Basic Knowledge can actually understand and respond to. We have already presented the system diagram of…
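To make the idea concrete, here is a minimal sketch in Python of how such a prompt might be assembled. The SpatialObject and UserState structures, the field names, and the prompt layout are illustrative assumptions, not the data formats or prompt format defined by the MPAI specification.

from dataclasses import dataclass
from typing import List

# Illustrative structures only: field names are assumptions, not the
# normative MPAI formats for Context or User State.

@dataclass
class SpatialObject:
    label: str            # e.g. "H-User avatar", "chair"
    position: tuple       # (x, y, z) in scene coordinates
    orientation: tuple    # (yaw, pitch, roll)
    velocity: tuple       # (vx, vy, vz)

@dataclass
class UserState:
    description: str      # the A-User's reading of the User's internal state

def build_prompt(objects: List[SpatialObject], state: UserState, utterance: str) -> str:
    """Turn spatial Context and User State into one text prompt
    that a downstream knowledge module can respond to."""
    scene_lines = [
        f"- {o.label} at {o.position}, facing {o.orientation}, moving {o.velocity}"
        for o in objects
    ]
    return (
        "Scene context:\n" + "\n".join(scene_lines) + "\n"
        f"User state: {state.description}\n"
        f"User said: {utterance}\n"
        "Respond as the Autonomous User."
    )

# Example usage with made-up values
prompt = build_prompt(
    [SpatialObject("H-User avatar", (1.0, 0.0, 2.5), (90.0, 0.0, 0.0), (0.0, 0.0, 0.1))],
    UserState("curious, looking toward the exhibit"),
    "What is this object in front of me?",
)
print(prompt)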

Continue reading: Prompt Creation: Where Words Meet Context

Visual Spatial Reasoning: The Vision‑Aware Interpreter

The Autonomous User (A-User) is an autonomous agent able to move and interact (converse, etc.) with another User in a metaverse. It is a “conversation partner in a metaverse interaction” with the User, itself either an A-User or an H-User directly controlled by a human. The figure shows a diagram of the A-User while the User generates audio-visual streams of information, and possibly text as well. This is the fourth in a sequence of posts aiming to illustrate the architecture of an…

Continue reading: Visual Spatial Reasoning: The Vision-Aware Interpreter

Audio Spatial Reasoning: The Sound-Aware Interpreter

The Autonomous User (A-User) is an autonomous agent able to move and interact (converse, etc.) with another User in a metaverse. It is a “conversation partner in a metaverse interaction” with the User, itself either an A-User or an H-User directly controlled by a human. The figure shows a diagram of the A-User while the User generates audio-visual streams of information, and possibly text as well. We have already presented the system diagram of the Autonomous User (A-User), an autonomous agent able…

Continue reading: Audio Spatial Reasoning: The Sound-Aware Interpreter

Context Capture: The A-User’s First Glimpse of the World

The Autonomous User (A-User) is an autonomous agent able to move and interact (converse, etc.) with another User in a metaverse. It is a “conversation partner in a metaverse interaction” with the User, itself either an A-User or an H-User directly controlled by a human. The figure shows a diagram of the A-User while the User generates audio-visual streams of information, and possibly text as well. The sequence of posts – of which this is the second – illustrates more in…

Continue reading: Context Capture: The A-User’s First Glimpse of the World

A-User Control: The Autonomous Agent’s Brain

We have already presented the system diagram of the Autonomous User (A-User), an autonomous agent able to move and interact (converse, etc.) with another User in a metaverse. The latter User may also be an A-User or may be under the direct control of a human and is thus called a Human-User (H-User). The A-User acts as a “conversation partner in a metaverse interaction” with the User. This is the first of a planned sequence of posts having the goal…
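As a rough, purely illustrative sketch, the control module could be pictured as orchestrating the other modules named in this series (Context Capture, Audio Spatial Reasoning, Visual Spatial Reasoning, Prompt Creation, Basic Knowledge) in one interaction cycle. The interfaces and the data passed between the modules below are assumptions, not those defined by the MPAI project.

from typing import Protocol

# Module names follow the post titles in this series; the process()
# interface and the dict payloads are assumptions for illustration only.

class Module(Protocol):
    def process(self, data: dict) -> dict: ...

class AUserControl:
    """Sketch of how A-User Control might drive one interaction cycle."""

    def __init__(self, context_capture: Module, audio_reasoning: Module,
                 visual_reasoning: Module, prompt_creation: Module,
                 basic_knowledge: Module):
        self.context_capture = context_capture
        self.audio_reasoning = audio_reasoning
        self.visual_reasoning = visual_reasoning
        self.prompt_creation = prompt_creation
        self.basic_knowledge = basic_knowledge

    def step(self, av_stream: dict) -> dict:
        context = self.context_capture.process(av_stream)           # Context Capture
        audio = self.audio_reasoning.process(context)                # Audio Spatial Reasoning
        visual = self.visual_reasoning.process(context)              # Visual Spatial Reasoning
        prompt = self.prompt_creation.process({**audio, **visual})   # Prompt Creation
        return self.basic_knowledge.process(prompt)                  # Basic Knowledge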

Continue reading: A-User Control: The Autonomous Agent’s Brain

A new MPAI standard project for Autonomous Users in a metaverse

The Mayflower Pilgrims left a world they considered oppressive to a new one that they expected would be challenging but more hospitable. With the development of one of its standards – the MPAI Metaverse Model – some MPAI members virtually left this world not because it was unfriendly but because they thought that the new virtual one would be more or differently attractive. Like the Pilgrims, these MPAI members had to face a new world that they wanted to populate.…

Continue reading: A new MPAI standard project for Autonomous Users in a metaverse