A team of researchers at the Indian Institute of Technology, Hyderabad has developed a new technique that promises to help deploy large artificial neural networks (ANNs) on mobile phones and portable devices.
Artificial neural networks are modelled loosely on the human nervous system: interconnected processing elements work together to solve specific problems. But they require a large number of computations and substantial storage space, and compressing these networks to fit the space available on a device may compromise the quality of their output.
The new technique may help overcome the problem. “Neural networks are based on numerical files and are stored in binary form in devices. Our technique works by grouping the binary files into fragments of fixed lengths. This has helped us identify repetitions of unique bit patterns of certain frequencies. We have achieved a compression of up to 64 percent and a minimum single-module decompression time of 0.33 seconds,” explained Amit Acharyya, member of the research team.
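The idea in the quote — splitting a stored network into fixed-length bit fragments and exploiting how often each unique pattern repeats — can be illustrated with a toy sketch. This is not the team's patented algorithm; the fragment length of 8 bits, the use of ordinary Huffman coding to give frequent fragments shorter codes, and all function names are illustrative assumptions.

```python
import heapq
from collections import Counter
from itertools import count

FRAGMENT_BITS = 8  # assumed fragment length; the team's choice is not stated


def to_fragments(data: bytes, k: int = FRAGMENT_BITS):
    """Split binary content into fixed-length k-bit fragments."""
    bits = "".join(f"{byte:08b}" for byte in data)
    return [bits[i:i + k] for i in range(0, len(bits), k)]


def build_codes(fragments):
    """Assign shorter codes to more frequent fragments (Huffman coding)."""
    freq = Counter(fragments)
    tie = count()  # tie-breaker so equal weights never compare lists
    heap = [[f, next(tie), [frag, ""]] for frag, f in freq.items()]
    heapq.heapify(heap)
    if len(heap) == 1:  # degenerate case: only one unique fragment
        return {heap[0][2][0]: "0"}
    while len(heap) > 1:
        lo = heapq.heappop(heap)
        hi = heapq.heappop(heap)
        for pair in lo[2:]:
            pair[1] = "0" + pair[1]
        for pair in hi[2:]:
            pair[1] = "1" + pair[1]
        heapq.heappush(heap, [lo[0] + hi[0], next(tie), *lo[2:], *hi[2:]])
    return {frag: code for frag, code in heap[0][2:]}


def compress(data: bytes):
    """Encode data as a bitstream plus the fragment-to-code table."""
    frags = to_fragments(data)
    codes = build_codes(frags)
    return "".join(codes[f] for f in frags), codes


def decompress(bitstream: str, codes):
    """Recover the original bytes; Huffman codes are prefix-free."""
    rev = {code: frag for frag, code in codes.items()}
    out, buf = [], ""
    for bit in bitstream:
        buf += bit
        if buf in rev:
            out.append(rev[buf])
            buf = ""
    bits = "".join(out)
    return bytes(int(bits[i:i + 8], 2) for i in range(0, len(bits), 8))
```

The more skewed the fragment frequencies (as the quote suggests is the case for stored network weights), the better such frequency-based coding compresses; decompression simply walks the bitstream fragment by fragment.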
“The technique can be implemented in individual cores of multi-core mobile platforms. This has been achieved without stopping the on-going processes of other memory core modules, which obviates the need to place the entire decompressed file on a single on-chip memory. The technique can also be used to increase memory storage capacity for running intelligent algorithms in standalone as well as distributed environments,” he added.
The design has been patented and is currently being used by industry.
The research team included Chandrajit Pal, Sunil Pankaj and Wasim Akram of IIT Hyderabad and Dwaipayan Biswas of IMEC, Belgium, besides Dr. Acharyya. The research findings were recently presented at the International Symposium on Circuits and Systems held in Florence, Italy. The work was partially funded by the Science and Engineering Research Board (SERB).