Proper Threat Assessment is Phenomenological
Riding a bike, it's quite clear why it won't work. Phenomenological threat assessment is something a computer cannot do. That's why self-driving cars are a tough one: a computer cannot feel, no matter how much data you give it. Computers compute data; people feel aspects of Reality. A computer cannot feel, it cannot interpret the logic of a situation; it can only work with numerically parsed streams of information.
The word "sense" can be misleading, don’t let it trick you. Computer sensing is mecano-, photo-, electro- numerical. Human sensing is aesthetic. There's no numerical representation of a felt moment, product of aesthesis. It transcends the realms both numerical and computational, as much as we insist in establishing parallels between experience and computation.
The realm of intentions is invisible, inaccessible to machine sensing. Intentions happen in a different, metaphysical world. There is no possible computation for how the other feels. Not even the finest computation of net spatial trajectory can predict the reactions, motivated by anger or by duty, of any person or person-operated artifact in the field.
Efficacious operation of vehicles in free, hypermediated space requires a felt comprehension of what is happening. A computer cannot properly assess what is happening, because it is the nature of computation to evaluate in dimensionally reduced ways. Human aestheto-semiotic assessment evaluates dynamics in contextually meaningful and generative ways. You cannot trust a computer to make the distinctions that matter. You need a human in the loop.
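
To make "dimensionally reduced" concrete, here is a minimal, hypothetical sketch in plain Python (the names TrackedObject, time_to_collision, and threat_score are invented for illustration and do not come from any real driving stack): by the time the machine can act, a scene participant has already been collapsed into a handful of numbers, and nothing in that representation carries anger, duty, or intention.

```python
from dataclasses import dataclass

@dataclass
class TrackedObject:
    """Hypothetical output of a perception pipeline: a scene participant
    reduced to a class label, a position, and a velocity vector.
    Nothing here represents what the person intends or feels."""
    kind: str   # e.g. "pedestrian", "cyclist", "car"
    x: float    # position relative to the ego vehicle (m)
    y: float
    vx: float   # estimated velocity (m/s)
    vy: float

def time_to_collision(obj: TrackedObject) -> float:
    """Crude closing-time estimate computed from the numbers alone."""
    distance = (obj.x ** 2 + obj.y ** 2) ** 0.5
    closing_speed = -(obj.x * obj.vx + obj.y * obj.vy) / max(distance, 1e-6)
    if closing_speed <= 0:
        return float("inf")  # numerically, "not a threat"
    return distance / closing_speed

def threat_score(obj: TrackedObject) -> float:
    """Threat collapsed to a single scalar: the dimensional reduction the
    text describes. A cyclist about to swerve out of anger or out of duty
    looks identical to one who will hold their line."""
    return 1.0 / (1.0 + time_to_collision(obj))

if __name__ == "__main__":
    cyclist = TrackedObject(kind="cyclist", x=12.0, y=-1.5, vx=-4.0, vy=0.0)
    print(f"threat score: {threat_score(cyclist):.3f}")
```

Whatever the controller decides downstream is a function of these few scalars; that is the dimensional reduction the argument points at.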
– Morningside Heights, NYC
✎ Connection to
bis / Epistemology of Computation