High-level design
The robot's autonomous capabilities, and whether or not it is authorized to use them, are embedded in the design of the Autopilot Inference. The design defines a particular software configuration of high-level, functional components. In line with many other reference architectures, such as observe-orient-decide-act (OODA) and MAPE-K, the Autopilot Inference has five main functionalities that interact with each other, with the user, with the robot, and with the environment as follows:
- The user may interact with the Autopilot Inference either via our mission planner, called Cerebra Studio, or via the ROS2 network.
- The Autopilot Inference interacts with the physical robot through the so-called "Origin BringUp," a gateway for receiving sensor data, such as camera images and LiDAR point clouds, and for sending reference velocities (see the sketch after this list).
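As a rough illustration of what this ROS2 interface looks like from the outside, the sketch below subscribes to sensor streams and publishes reference velocities. The topic names (`/camera/image_raw`, `/lidar/points`, `/cmd_vel`) are assumptions for illustration; the actual topics exposed by the Origin BringUp may differ.

```python
# Minimal sketch of a ROS2 node exchanging data with the robot; topic names
# are hypothetical, not the documented Origin BringUp interface.
import rclpy
from rclpy.node import Node
from sensor_msgs.msg import Image, PointCloud2
from geometry_msgs.msg import Twist


class BringUpClient(Node):
    """Subscribes to sensor streams and publishes reference velocities."""

    def __init__(self):
        super().__init__('bringup_client')
        self.create_subscription(Image, '/camera/image_raw', self.on_image, 10)
        self.create_subscription(PointCloud2, '/lidar/points', self.on_cloud, 10)
        self.cmd_pub = self.create_publisher(Twist, '/cmd_vel', 10)

    def on_image(self, msg: Image):
        self.get_logger().info(f'received image {msg.width}x{msg.height}')

    def on_cloud(self, msg: PointCloud2):
        self.get_logger().info(f'received cloud with {msg.width * msg.height} points')

    def send_reference_velocity(self, vx: float, wz: float):
        cmd = Twist()
        cmd.linear.x = vx    # forward velocity [m/s]
        cmd.angular.z = wz   # yaw rate [rad/s]
        self.cmd_pub.publish(cmd)


def main():
    rclpy.init()
    rclpy.spin(BringUpClient())
    rclpy.shutdown()


if __name__ == '__main__':
    main()
```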
| High-level functionality | Theoretical meaning | Current examples |
|---|---|---|
| Cognition | The mental action or process of acquiring knowledge and understanding through thought and experience. | Managing data and information; decomposing jobs and tasks into behaviors. |
| Ordination | The mental action or process of putting something in order. | Behavior execution; software node management. |
| Perception | The ability to see, hear, or become aware of something through the senses. | Pre-processing; object detection; object tracking. |
| Navigation | The process or activity of accurately ascertaining one's position and planning and following a route. | Localization; path planning; path following. |
| Interaction | Communication with someone or manipulation of something. | Informing the user. |
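The decomposition in this table can be mirrored in code as a simple capability map, as sketched below. The enum and capability names are descriptive placeholders, not the actual Autopilot Inference API.

```python
# Illustrative mapping of the five high-level functionalities to the example
# capabilities listed in the table above; all names are hypothetical.
from enum import Enum, auto


class Functionality(Enum):
    COGNITION = auto()    # managing information, decomposing tasks into behaviors
    ORDINATION = auto()   # behavior execution, software node management
    PERCEPTION = auto()   # pre-processing, object detection, object tracking
    NAVIGATION = auto()   # localization, path planning, path following
    INTERACTION = auto()  # informing the user


CAPABILITIES = {
    Functionality.COGNITION: ["manage_information", "decompose_tasks"],
    Functionality.ORDINATION: ["execute_behavior", "manage_nodes"],
    Functionality.PERCEPTION: ["preprocess", "detect_objects", "track_objects"],
    Functionality.NAVIGATION: ["localize", "plan_path", "follow_path"],
    Functionality.INTERACTION: ["inform_user"],
}
```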
We distinguish two main phases in the workflow: deployment-time and run-time. Deployment-time is the phase in which a user interacts with the Autopilot Inference to prepare the robot for its operation; run-time is the phase in which the robot executes that operation.
At deployment-time, the user prepares the robot for its operation. This includes setting prior information on the position of Aruco markers, providing a list of tasks the robot needs to execute, and providing any policies or constraints the robot needs to take into account. For example, a user may interact with the cognition functionality of the Autopilot Inference to manage the list of tasks and behaviors.
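To make this concrete, a deployment-time configuration might resemble the sketch below. All field names are assumptions for illustration, not the actual Cerebra Studio or Autopilot Inference schema.

```python
# Hypothetical deployment-time configuration: marker priors, a task list, and
# policies/constraints. Field names and units are illustrative assumptions.
deployment_config = {
    # Prior positions of Aruco markers, expressed in the global origin frame.
    "aruco_markers": [
        {"id": 42, "x": 1.50, "y": 0.00, "z": 0.30, "yaw": 3.14},
    ],
    # Ordered list of tasks the robot needs to execute.
    "tasks": [
        {"type": "move_to_waypoint", "x": 10.0, "y": 5.0},
        {"type": "take_snapshot"},
    ],
    # Policies and constraints the robot must respect during execution.
    "policies": {
        "max_velocity": 1.0,   # [m/s]
        "keep_out_zones": [],  # polygons in the global frame
    },
}
```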
In the meantime, the robot is turned on and awaits the start of its operation in front of an Aruco marker. The perception functionality detects the marker, which allows the navigation functionality to estimate the robot's initial position. This position is defined with respect to a global origin, which is a point in space determined either by a known map or by a known latitude-longitude coordinate. To determine its initial position based on RTK-GNSS instead, the robot requires a short manual drive of approximately 5 meters while interpolating the RTK-GNSS measurements (or fixes), as sketched below.
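A plausible reason for the short drive is that a single RTK fix gives position but not heading: heading only becomes observable from the direction of travel between successive fixes. The sketch below illustrates this idea under assumed conventions (a local east-north frame in meters, yaw measured from east); it is not the actual Autopilot Inference implementation.

```python
# Sketch: recover an initial yaw estimate from RTK-GNSS fixes collected
# during a short (~5 m) manual drive. Frame conventions are assumptions.
import math


def initial_yaw_from_fixes(fixes: list[tuple[float, float]]) -> float:
    """Estimate yaw [rad] from a sequence of (east, north) RTK fixes."""
    if len(fixes) < 2:
        raise ValueError("need at least two fixes to observe heading")
    (e0, n0), (e1, n1) = fixes[0], fixes[-1]
    de, dn = e1 - e0, n1 - n0
    if math.hypot(de, dn) < 1.0:  # too little motion -> unreliable estimate
        raise ValueError("drive further before estimating heading")
    return math.atan2(dn, de)


# Example: a straight ~5 m drive roughly to the north-east.
fixes = [(0.0, 0.0), (1.8, 1.7), (3.6, 3.5)]
print(f"initial yaw: {math.degrees(initial_yaw_from_fixes(fixes)):.1f} deg")
```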
Once the user starts the execution of the tasks, the robot enters run-time. During this time the cognition functionality decomposes the list of tasks (what to execute) into a list of behaviors (how to execute). The ordination functionality polls the next behavior in this list, which is then executed by scheduling services and actions in the perception, navigation, or interaction functionality. The cognition functionality acquires and shares information with the other functionalities as necessary for the execution of the tasks (as illustrated by the solid lines in the above picture).
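The run-time loop can be pictured as a simple decompose-then-poll pattern, as in the minimal sketch below. The task and behavior names are illustrative assumptions, not the actual Autopilot Inference interfaces.

```python
# Minimal sketch of the run-time loop: cognition decomposes tasks ("what")
# into behaviors ("how"); ordination polls and executes them one by one.
from collections import deque


def decompose(task: dict) -> list[dict]:
    """Cognition: turn one task into an ordered list of behaviors."""
    if task["type"] == "inspect_area":
        return [
            {"behavior": "move_to_waypoint", "x": task["x"], "y": task["y"]},
            {"behavior": "take_snapshot"},
        ]
    return [dict(task, behavior=task["type"])]


tasks = [{"type": "inspect_area", "x": 10.0, "y": 5.0}]
queue = deque(b for t in tasks for b in decompose(t))  # cognition output

while queue:  # ordination: poll the next behavior and execute it
    behavior = queue.popleft()
    print(f"executing {behavior['behavior']} via the owning functionality")
```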
Note
A behavior is something the robot is able to do within a limited amount of time, either in the real world or with raw measurement data; for example, moving to a waypoint or taking a snapshot. The perception, navigation, and interaction functionalities each inform the ordination functionality of the behaviors they support, so that the ordination functionality can call upon them when needed (see the sketch below).
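One common way to realize such capability advertisement is a behavior registry, sketched below: each functionality registers the behaviors it supports, and ordination dispatches a behavior to whichever functionality owns it. Class and behavior names are hypothetical.

```python
# Sketch of behavior registration and dispatch; not the actual API.
from typing import Callable


class Ordination:
    def __init__(self):
        self._registry: dict[str, Callable] = {}

    def register(self, behavior: str, handler: Callable):
        """Called by perception/navigation/interaction at start-up."""
        self._registry[behavior] = handler

    def execute(self, behavior: str, **params):
        if behavior not in self._registry:
            raise KeyError(f"no functionality supports behavior {behavior!r}")
        return self._registry[behavior](**params)


ordination = Ordination()
# Navigation advertises that it can move to waypoints...
ordination.register("move_to_waypoint", lambda x, y: print(f"driving to ({x}, {y})"))
# ...and perception advertises that it can take snapshots.
ordination.register("take_snapshot", lambda: print("snapshot taken"))

ordination.execute("move_to_waypoint", x=10.0, y=5.0)
ordination.execute("take_snapshot")
```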
In what follows, we first present a more detailed account of the operational principles of the robot, after which we continue with an in-depth explanation of each high-level functionality.
Info
You may either continue to some of the operational principles of the Origin One with Autopilot Inference, or to a more detailed description of the five high-level functionalities, starting with the Cognition stack.