Embodiment and Intelligence
I guess it has become a truism that “real intelligence” requires embodiment (some random pointers: why AlphaGo is not AI; another good place to start is the AI section of the Wikipedia embodied cognition entry).
I’ve been a firm believer in this for decades, since a non-embodied system requires explicitly stating an overwhelming amount of context. However, right now I want to push back a bit on what embodiment means in the “world”.
Embodiment?
First: what is embodiment? A couple of definitions:
From the Cambridge Dictionary of Philosophy
Rather it pertains to the phenomenal body and to the role it plays in our object-directed experiences.
Or from the Stanford Encyclopedia of Philosophy
Embodiment Thesis: Many features of cognition are embodied in that they are deeply dependent upon characteristics of the physical body of an agent, such that the agent’s beyond-the-brain body plays a significant causal role, or a physically constitutive role, in that agent’s cognitive processing.
These all seem squishy and imprecise. I think in terms of four essential requirements (there’s a rough code sketch of them after the list below):
- The body must be contained within the world it is said to be embodied in
- The body must be able to be acted upon by other entities that are present in the world
- The body must be able to act upon other entities that are present in the world
- The body must be finite, so the results of actions will not be completely predictable to the body
Buttressed by an optional clarification:
- The body likely doesn’t need to be completely connected at all times. Although during periods of disconnectedness it may not function as a single body.
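To make the criteria a bit less squishy, here’s a minimal sketch of them as code. Everything in it is hypothetical (the `World`, `Body`, and `looks_embodied` names are mine, not from any existing library), and the last two criteria are really just self-reported flags, since finiteness and unpredictability aren’t properties you can check from the outside.

```python
class World:
    """A space, physical or virtual, that can contain entities."""
    def __init__(self):
        self.entities = []

class Body:
    """A candidate embodied entity; the flags are self-reported."""
    def __init__(self, can_act=True, can_be_acted_upon=True, is_finite=True):
        self.can_act = can_act
        self.can_be_acted_upon = can_be_acted_upon
        self.is_finite = is_finite

def looks_embodied(body, world):
    """Toy check of the four criteria; a sketch, not a definition."""
    return (
        body in world.entities        # 1. contained in the world
        and body.can_be_acted_upon    # 2. other entities can act on it
        and body.can_act              # 3. it can act on other entities
        and body.is_finite            # 4. finite, so the results of its
                                      #    actions aren't fully predictable to it
    )

# e.g., a game character "living in" a game world
game, character = World(), Body()
game.entities.append(character)
print(looks_embodied(character, game))   # True
```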
I raise the issue because I’ve been considering whether the body needs to be physical per se, or whether embodied knowledge can exist when the body is not strictly physical.
Given the work in robot cognition, and the importance of embodiment to that work, it’s relatively uncontroversial that the body need be neither alive nor organic. However, the answer to the question “need the body be physical” is less clear.
In the most literal sense, a non-physical body is an oxymoron, since we don’t know of anything that doesn’t require some physical support: even ideas must be thought by a physical brain to be available to us as ideas. However, in a more colloquial sense, the body does not need to be material; it could be virtual. That’s the idea I want to explore here.
As an example of what I’m considering, take Agre & Chapman’s Pengi, an early (1987) proof-of-concept system built around perception/action loops.
Pengi plays a commercial arcade video game called Pengo, which is played in a 2-D maze made of unit-sized ice blocks.
Pengi isn’t a robot, but a character simulation that “lives in” a Pengo implementation. Its knowledge representation is indexical (the sketch after these examples gives a rough sense of what that might mean in code):
the-block-I’m-pushing
the-corridor-I’m-running-along
the-bee-on-the-other-side-of-this-block-next-to-me
the-block-that-the-block-I-just-kicked-will-collide-with
the-bee-that-is-heading-along-the-wall-that-I’m-on-the-other-side-of
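Here’s a rough, purely illustrative sketch of what an indexical perception/action step might look like, in the spirit of Pengi but definitely not Agre & Chapman’s actual architecture; the `game_state` methods and the rules are all invented for the example.

```python
# Purely illustrative: a Pengi-flavored perception/action step in which the
# agent's "knowledge" is only the indexical entities it can currently pick
# out from its own point of view, never a global map of the maze.

def perceive(game_state):
    """Re-register the indexical entities each step; nothing is remembered."""
    return {
        "the-block-I-am-pushing": game_state.block_in_front_of_agent(),
        "the-bee-heading-at-me":  game_state.nearest_threatening_bee(),
        "the-corridor-I-am-in":   game_state.current_corridor(),
    }

def act(deictic):
    """Reflex rules phrased entirely in terms of the indexical entities."""
    if deictic["the-bee-heading-at-me"] and deictic["the-block-I-am-pushing"]:
        return "kick-the-block-I-am-pushing"
    if deictic["the-bee-heading-at-me"]:
        return "run-along-the-corridor-I-am-in"
    return "wander"

def step(game_state):
    # The perception/action loop: indexical registrations are recomputed
    # every step rather than maintained in a persistent world model.
    return act(perceive(game_state))

# A stub standing in for a real Pengo implementation, just to run the loop.
class FakeGameState:
    def block_in_front_of_agent(self): return "ice-block-7"
    def nearest_threatening_bee(self): return None
    def current_corridor(self): return "corridor-3"

print(step(FakeGameState()))   # "wander"
```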
There’s no sense in which the character is conscious, but there is a sense in which it is embodied in the game:
- It is contained,
- It acts & acted upon
Given the complexity of the game, the results of its actions weren’t completely predictable
Much of the work referencing Pengi involves embodied agents, which, oddly, don’t have much to do with embodiment in the sense I use here. Embodied agents, as generally understood, are interfaces to programs that look like bodies, and can be either virtual or real. These bodies don’t satisfy the embodiment criteria outlined above: although they are contained in the world, their ability to act on, be acted upon by, and sense the world is limited at best. They are just abstractions with the visual representation of a physical body.
This is (somewhat) reassuring; it means that these four+ criteria might be useful in distinguishing what constitutes a meaningful form of embodiment.
If this is the case, is there any way in which they must be changed to capture the embodiment of a non-physical agent? The one change that appears necessary would be to expand our definition of the world: a non-physical embodiment requires that the body reside in a non-physical world. This means that the body and the world must share the same space, be it virtual or physical, and the overlap between them must be such that the body is able to directly sense, act upon, and be acted upon by the world. Again, Pengi would meet these requirements.
Note that this is disjoint from intelligence or conscious activity, neither of which is required for embodiment. My first take is that this moves us in the direction of considering data-center controllers embodied systems; e.g., data centers with software-defined networks could potentially fit the definition of embodied entities. I haven’t been able to find sufficiently detailed descriptions of the operational software used in these systems, so I can’t tell whether they are set up that way, and I’m forced to defer the analysis.
However, rather than a current high-tech computational example, on the low-tech side (at least for 2017) I’d also posit that the governor for a steam engine is an embodied system. A reflexive, non-conscious system, but an embodied one nonetheless.
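As a toy illustration of that reflexive loop (a crude sketch with invented constants, not a physically faithful model of a Watt governor): the governor senses the engine’s speed through its own motion and acts back on the engine through the steam valve, closing the sense/act loop entirely within the world it shares with the engine.

```python
# A toy, non-physical model of a centrifugal governor's sense/act loop:
# as engine speed rises, the flyweights close the steam valve; as it falls,
# they open it. All constants are invented for the illustration.

SET_POINT = 100.0   # target engine speed (arbitrary units)
GAIN = 0.05         # how strongly the speed error moves the valve

def governor_step(speed, load):
    """One reflex: sense the speed, move the valve, return the new speed."""
    error = speed - SET_POINT
    valve = max(0.0, min(1.0, 0.5 - GAIN * error))   # "flyweights" throttling steam
    # crude plant model: speed follows steam admitted, minus the load
    return speed + 2.0 * valve - load

speed = 80.0
for _ in range(50):
    speed = governor_step(speed, load=1.0)
print(round(speed, 1))   # settles close to SET_POINT
```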