Inside -> out — understanding your environment
The term "inside-out" comes from this appleworld post which, although it refers to something a bit different (head-mounted mixed reality displays), captures a fundamental aspect of this newest set of technological affordances.
If we think of the skin as a boundary layer between us and the world (and of that layer as having infinitesimal thickness), we can consider three cases:
- Outside-in: where the world thinks you are in relation to its external referents
- Inside-in: your internal bodily states (heart rate, EKG — the whole quantified self thing)
- Inside-out: where/what the world is in relation to you
A paradigmatic example of inside-out is a sound level meter on your watch. It doesn't rely on models of sound sources, reflections, or dampers; it reports just what is there.
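(For the curious: the core of such a meter is tiny. You square the microphone samples, average them, and express the result in decibels relative to some reference. Here is a minimal Python sketch of that idea; the sample values and reference level are placeholders, and a real meter would also apply frequency weighting and calibration.)

```python
import math

def sound_level_db(samples, reference=1.0):
    """Approximate sound level of a block of microphone samples.

    samples   -- iterable of floats, e.g. normalized to [-1.0, 1.0]
    reference -- amplitude treated as 0 dB; a calibrated meter would tie
                 this to 20 micropascals to report true dB SPL
    """
    if not samples:
        return float("-inf")
    # Root-mean-square amplitude of the block
    rms = math.sqrt(sum(s * s for s in samples) / len(samples))
    if rms == 0:
        return float("-inf")
    # Decibels are 20 * log10 of the amplitude ratio
    return 20.0 * math.log10(rms / reference)

# Example: a quiet tone versus a louder one
quiet = [0.01 * math.sin(i / 10.0) for i in range(480)]
loud = [0.5 * math.sin(i / 10.0) for i in range(480)]
print(round(sound_level_db(quiet), 1), round(sound_level_db(loud), 1))
```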
Similar affordances are provided by UWB positioning and LiDAR "high resolution" distance measurements, which give direction and distance as perceived by RF and infrared signals respectively. This is similar to, but qualitatively different from, things like Tiles, which provide only proximity information; UWB can provide three-dimensional distance measurements (from SafetyNet)
Or this one, from decawave (Figure 4)
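To make the "three-dimensional" claim concrete: a UWB tag measures its range to several fixed anchors from the round-trip time of radio pulses, and those ranges pin down a position in space. Here is a rough Python sketch of that trilateration step, using made-up anchor coordinates and simulated ranges; real systems (like the decawave kits above) add a lot of filtering on top.

```python
import numpy as np

# Hypothetical anchor positions (metres) and the true tag position
# we pretend the UWB radios measured distances to.
anchors = np.array([
    [0.0, 0.0, 0.0],
    [5.0, 0.0, 0.0],
    [0.0, 4.0, 0.0],
    [0.0, 0.0, 3.0],
])
true_tag = np.array([2.0, 1.5, 1.0])

# Ranges a tag would report: straight-line distance to each anchor,
# plus a little noise to stand in for measurement error.
rng = np.random.default_rng(0)
distances = np.linalg.norm(anchors - true_tag, axis=1) + rng.normal(0, 0.02, 4)

# Trilateration: subtracting the first sphere equation from the others
# turns |x - a_i|^2 = d_i^2 into a linear system in x.
A = 2.0 * (anchors[1:] - anchors[0])
b = (np.sum(anchors[1:] ** 2, axis=1) - np.sum(anchors[0] ** 2)
     - distances[1:] ** 2 + distances[0] ** 2)
estimate, *_ = np.linalg.lstsq(A, b, rcond=None)
print(np.round(estimate, 3))  # close to [2.0, 1.5, 1.0]
```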
So, aside from perspective, what's different? Well, if, like me, you think that our existence consists of embedded, embodied immanence, then these capabilities add new modalities to that existence. Every new sensor I've incorporated over the years has changed the way I've seen the world. For example, focus stacking and high-ISO cameras each changed the way I see and allowed me to conceptualize pieces that would have been impossible before (although, admittedly, focus stacking is more something that allows portrayal of the world as seen, rather than seeing the world anew).
Since I’m describing nascent capabilities, it’s impossible to predict what their ultimate form will be, or even whether they will survive in the long term. After all, Virtual Reality was born in the 1950s and its viability is still a topic of discussion.
Currently they represent capabilities to watch and explore.
On the other hand, having a LiDAR scanner in my phone for a few days made me realize that it sees these weights differently (hopefully it’s obvious that the dumbbell bar doesn’t drip down like that in real life).
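That drip is a reconstruction artifact: the scanner only ever has a grid of depth readings, and the surface in between is inferred. The basic step of turning one of those depth pixels into a point in space is just the pinhole camera model run backwards; here is a toy Python sketch with made-up intrinsics, not the phone's actual pipeline.

```python
import numpy as np

def depth_to_points(depth, fx, fy, cx, cy):
    """Back-project a depth image (metres) to 3D points using the
    pinhole model: X = (u - cx) * Z / fx, Y = (v - cy) * Z / fy."""
    v, u = np.indices(depth.shape)
    z = depth
    x = (u - cx) * z / fx
    y = (v - cy) * z / fy
    return np.stack([x, y, z], axis=-1).reshape(-1, 3)

# Toy 4x4 depth map with assumed intrinsics; a phone LiDAR frame is the
# same idea at a few hundred samples on a side.
depth = np.full((4, 4), 1.2)        # everything 1.2 m away
points = depth_to_points(depth, fx=200.0, fy=200.0, cx=2.0, cy=2.0)
print(points.shape)                  # (16, 3)
```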