Research, standards and thoughts for the digital world

Earlier posts by category: MPAI, MPEG, ISO

Audio Spatial Reasoning: The Sound-Aware Interpreter

  • Post category: MPAI

Autonomous User (A-User) is an autonomous agent able to move and interact (converse, etc.) with another User in a metaverse. It is a “conversation partner in a metaverse interaction” with the User, which may itself be an A-User or an H-User directly controlled by a human. The figure shows a diagram of the A-User, while the User generates audio-visual streams of information and possibly text as well. We have already presented the system diagram of the Autonomous User (A-User), an autonomous agent able…

Continue Reading: Audio Spatial Reasoning: The Sound-Aware Interpreter

Context Capture: The A-User’s First Glimpse of the World

  • Post category: MPAI

Autonomous User (A-User) is an autonomous agent able to move and interact (converse, etc.) with another User in a metaverse. It is a “conversation partner in a metaverse interaction” with the User, which may itself be an A-User or an H-User directly controlled by a human. The figure shows a diagram of the A-User, while the User generates audio-visual streams of information and possibly text as well. The sequence of posts – of which this is the second – illustrates more in…

Continue Reading: Context Capture: The A-User’s First Glimpse of the World

A-User Control: The Autonomous Agent’s Brain

  • Post category: MPAI

We have already presented the system diagram of the Autonomous User (A-User), an autonomous agent able to move and interact (converse, etc.) with another User in a metaverse. The latter User may also be an A-User or may be under the direct control of a human and is thus called a Human-User (H-User). The A-User acts as a “conversation partner in a metaverse interaction” with the User. This is the first of a planned sequence of posts having the goal…

Continue Reading: A-User Control: The Autonomous Agent’s Brain

A new MPAI standard project for Autonomous Users in a metaverse

  • Post category: MPAI

The Mayflower Pilgrims left a world they considered oppressive to a new one that they expected would be challenging but more hospitable. With the development of one of its standards – the MPAI Metaverse Model – some MPAI members virtually left this world not because it was unfriendly but because they thought that the new virtual one would be more or differently attractive. Like the Pilgrims, these MPAI members had to face a new world that they wanted to populate…

Continue Reading: A new MPAI standard project for Autonomous Users in a metaverse

Exploring the innovations of the MMM-TEC V2.1 standard

  • Post category: MPAI

The MPAI Metaverse Model (MPAI-MMM) – Technologies (MMM-TEC) specification is based on an innovative approach. Just as in the real world (Universe) we have animate and inanimate things, in an MPAI Metaverse (M-Instance) we have Processes and Items. Processes can animate Items (things) in the metaverse but can also act as a bridge between metaverse and universe. For convenience, MMM-TEC defines four classes of Processes: Apps, Devices, Services, and Users. Probably the most interesting one is the User, defined as the…

Continue Reading: Exploring the innovations of the MMM-TEC V2.1 standard
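The Process/Item split sketched in the excerpt above can be illustrated in code. This is only an illustrative sketch: MMM-TEC is a specification, not a software library, and all class and attribute names below merely mirror its vocabulary rather than any real API.

```python
from dataclasses import dataclass, field

# Illustrative sketch only: names mirror MMM-TEC vocabulary, not a real API.

@dataclass
class Item:
    """An inanimate 'thing' in an M-Instance."""
    item_id: str

@dataclass
class Process:
    """An animate entity that can act on Items in an M-Instance."""
    process_id: str
    items: list = field(default_factory=list)

    def animate(self, item: Item) -> None:
        # A Process animates (acts upon) an Item in the metaverse.
        self.items.append(item)

# The four classes of Processes defined by MMM-TEC:
class App(Process): pass
class Device(Process): pass    # can bridge metaverse and universe
class Service(Process): pass
class User(Process): pass      # an A-User or an H-User

user = User("user-1")
user.animate(Item("chair-42"))
```

The sketch captures the one distinction the excerpt makes: Items are passive data, while Processes are the only entities that act on them.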

Exploring the Up-sampling Filter for Video applications (EVC-TEC) standard

  • Post category: MPAI

A method typically used in video coding is to down-sample the input video frame to half its resolution before encoding. This reduces the computational cost but requires an up-sampling filter to recover the original resolution in the decoded video while minimising the loss of visual quality. Currently used filters are bicubic and Lanczos.

Figure 1 – Up-sampling Filters for Video application (EVC-UFV)

In the last few years, Artificial Intelligence (AI), Machine Learning (ML), and especially Deep Learning…

Continue Reading: Exploring the Up-sampling Filter for Video applications (EVC-TEC) standard
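The Lanczos up-sampling mentioned in the excerpt above can be sketched in a few lines. This is a minimal 1D illustration of the general technique (2x up-sampling with a Lanczos-3 kernel), not the EVC-UFV reference implementation; the function names are made up for this sketch.

```python
import math

def lanczos(x: float, a: int = 3) -> float:
    """Lanczos kernel: sinc(x) * sinc(x/a) for |x| < a, else 0."""
    if x == 0.0:
        return 1.0
    if abs(x) >= a:
        return 0.0
    px = math.pi * x
    return a * math.sin(px) * math.sin(px / a) / (px * px)

def upsample_2x(signal: list, a: int = 3) -> list:
    """Up-sample a 1D signal by a factor of 2 using Lanczos interpolation."""
    n = len(signal)
    out = []
    for j in range(2 * n):
        pos = j / 2.0                     # output position in input coordinates
        lo = math.floor(pos) - a + 1
        hi = math.floor(pos) + a
        acc = wsum = 0.0
        for i in range(lo, hi + 1):
            w = lanczos(pos - i, a)
            acc += w * signal[min(max(i, 0), n - 1)]  # clamp at the borders
            wsum += w
        out.append(acc / wsum)            # normalise the kernel weights
    return out
```

Because the kernel is 1 at zero and 0 at all other integers, even-indexed output samples reproduce the input exactly; only the in-between samples are interpolated. A real video up-sampler applies the same separable kernel along rows and columns of each frame.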