- U.S. military funding research into AI that can predict how people will behave
- Software recognises activities and predicts what might happen next
- Intended for use in both military and civilian contexts
An artificial intelligence system that connects to surveillance cameras to predict when people are about to commit a crime is under development, funded by the U.S. military.
The software, dubbed Mind's Eye, recognises human activities seen on CCTV and uses algorithms to predict what the targets might do next - before notifying the authorities.
The technology has echoes of the Hollywood film Minority Report, in which people are punished for crimes they are predicted to commit, rather than for crimes they have already committed.
Scientists from Carnegie Mellon University in Pittsburgh, Pennsylvania, have presented a paper demonstrating how this so-called 'activity forecasting' would work.
Their study, funded by the U.S. Army Research Laboratory, focuses on the 'automatic detection of anomalous and threatening behaviour' by simulating the ways humans filter and generalise information from the senses.
The system works using a high-level artificial intelligence infrastructure the researchers call a 'cognitive engine', which learns to link relevant signals with background knowledge and tie them together.
The signals the AI can recognise - characterised by verbs including 'walk', 'run', 'carry', 'pick-up', 'haul', 'follow', and 'chase', among others - cover basic action types which are then set in context to see whether they constitute suspicious behaviour.
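The idea of judging basic action verbs against their setting can be sketched in a few lines of code. The rules and contexts below are invented for illustration only; they are not the researchers' actual model, which the article does not detail.

```python
# Illustrative sketch only: the researchers' real 'cognitive engine' is far
# more sophisticated. These toy context rules are assumptions made up for
# demonstration, not the actual Mind's Eye system.

# Hypothetical mapping from a setting to the action verbs that might
# be treated as suspicious there.
SUSPICIOUS_IN_CONTEXT = {
    "airport": {"haul", "chase", "follow"},
    "park": {"chase"},
}

def flag_suspicious(actions, context):
    """Return the recognised actions that look suspicious in this context."""
    watchlist = SUSPICIOUS_IN_CONTEXT.get(context, set())
    return [action for action in actions if action in watchlist]

# The same action sequence is judged differently depending on the setting.
print(flag_suspicious(["walk", "pick-up", "haul", "run"], "airport"))  # ['haul']
print(flag_suspicious(["walk", "pick-up", "haul", "run"], "park"))     # []
```

The point the sketch makes is that the verbs themselves are neutral; it is the combination of action and context that the system would use to raise an alert.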
The technology is expected to be used at airports and bus and train stations, as well as in military settings where telling suspicious from innocent behaviour matters, such as distinguishing civilians from militants in places like Afghanistan.
Tom Cruise in Minority Report: In the Hollywood film, Cruise's character must go on the run after authorities predict he is about to commit murder