Sunday, June 12, 2005

Instapundit notes development of wearable video cameras with large data-storage capacity being used by the military to gather data for later analysis.

As wearable cameras become widespread and data storage continues to get cheaper and more plentiful, this is going to create huge amounts of footage, probably more than humans can sit down and watch frame by frame, yet too complex for the current abilities of machine interpretation. This is another task that will drive the development of weakly superhuman AI: the need to process huge amounts of data in a human-like way, quickly and in massively parallel fashion.

3 Comments:

Blogger Roger Fraley said...

Do you really think that artificial intelligence is possible? If so, how could we tell? I'm aware of the Turing test but wanted to know if you had a better idea. Finally, I'm reminded of the Borges story about the man who remembered everything but needed a whole day to recount the day before. When is anyone actually going to look at all the data being recorded?

5:30 PM  
Blogger TallDave said...

Well, in the loosest sense, AI is anything that can process logic, so in that sense it's been around since the first clunky mechanical computers. Humanlike AI seems to be becoming incrementally more possible with things like facial recognition software, chess programs that beat grandmasters, word recognition, learning heuristics, etc. DWIM (do what I mean) interfaces are probably next, as I hear MS is working on them now.

I think the Turing test is a bit anthropomorphic in that it assumes mimicking human behavior is important to the definition of intelligence, which seems like a rather speculative assumption today, though an understandable one for the era in which he devised it.

Given that free markets tend to drive development, it seems likely we'll see many more fragments of useful humanlike behavior similar to those above before we see a Turing-capable machine.

8:40 AM  
Blogger TallDave said...

Your Borges reference points to a fundamental aspect of human memory: weak, prioritized recall. Cache memory in computers is already a rough analogy for it.
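To make the analogy concrete, here's a minimal sketch in Python, assuming an LRU (least-recently-used) eviction policy as the stand-in for prioritized recall; the LRUMemory class name and the capacity figure are just illustrative, not any particular system's design:

```python
from collections import OrderedDict

class LRUMemory:
    """Toy model of weak, prioritized recall: a fixed-capacity store
    that keeps recently used items and forgets the stalest ones."""

    def __init__(self, capacity=4):
        self.capacity = capacity
        self.items = OrderedDict()

    def recall(self, key):
        # A successful recall reinforces the memory
        # by moving it to the most-recent position.
        if key not in self.items:
            return None  # forgotten, or never stored
        self.items.move_to_end(key)
        return self.items[key]

    def store(self, key, value):
        # Storing something new may push the stalest memory out.
        if key in self.items:
            self.items.move_to_end(key)
        self.items[key] = value
        if len(self.items) > self.capacity:
            self.items.popitem(last=False)  # evict least recently used

mem = LRUMemory(capacity=3)
for day in ["Mon", "Tue", "Wed"]:
    mem.store(day, "events of " + day)
mem.recall("Mon")                   # reinforces Monday
mem.store("Thu", "events of Thu")   # Tuesday, the stalest, is evicted
print(mem.recall("Tue"))            # None: that memory is gone
```

Memories you keep using survive; the ones you don't quietly disappear, which is exactly what keeps Borges's character from drowning in his own recordings.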

8:42 AM  
