"The main innovation is in the SDSoC design framework", explained Jiri Kadlec, head of the signal processing department at UTIA. "Through this, developers can now estimate the acceleration for their algorithms, explore different avenues as necessary and export the best solutions as a stand-alone C application or Linux user space application."
Partly funded by the Artemis EMC2 project, the demonstration used a dual-core 32-bit Arm Cortex-A9 processor of the Xilinx Zynq family to execute stand-alone C programmes that perform initialization and synchronization of the hardware-accelerated video-processing chain. The C programmes can be modified by the user and recompiled in Xilinx SDK 2015.4.
"While the framework has been presented on simple edge and motion detection, it is valid for fast I/O processing, industrial control - providing low latency - and connected items in the internet of things", stated Jiri Kadlec. "In real-time and image processing, it links the hardware accelerator to the rich database of C/C++ algorithms from OpenCV."
Flemming Christensen, managing director of HiPEAC member Sundance, added: "This represents a step forward for embedded applications where edge and/or motion detection is important, such as those relating to security, medical uses, unmanned vehicles and image processing in general."
Videos showing the edge/motion detection in action can be found at: http://bit.ly/UTIA-Sundance_video_processing