On the 1st of August 2022, FEMM Hub PhD student Ze Zhang from the University of Sheffield, together with co-authors Michael Farnsworth, Boyang Song, Divya Tiwari, and Ashutosh Tiwari, published a journal article titled “Deep Transfer Learning With Self-Attention for Industry Sensor Fusion Tasks” in the IEEE Sensors Journal.

The journal article is open access, in line with the UKRI open access policy, which aims to ensure that findings from research funded by the public through UKRI can be freely accessed, used, and built on. To access the paper, please click on the following link: Deep transfer learning with self-attention for industry sensor fusion tasks.

A summary of the paper was provided by Ze and can be found below:

Monitoring of complex industrial processes can be achieved by collecting process data through various sensing modalities. While the recent emergence of deep learning offers a new route for processing multi-sensor information, the large amount of training data required can be the main obstacle to applying deep learning in industrial settings.

[Figure: overview of the methodology used]

This research proposes a novel deep transfer learning method as a possible solution to this problem. The work shows how a Transformer with self-attention, trained on natural language, can be transferred to sensor fusion tasks, so that the features learnt from language generalise to industrial sensor fusion. Because language data is far more plentiful than industrial process data, it is comparatively easy to train a deep model in the language domain; this work could therefore allow industrial data processing to benefit from the powerful feature-learning capabilities of deep networks.

The proposed method is tested on three datasets: condition monitoring of a hydraulic system, a bearing dataset, and a gearbox dataset. The results show that a Transformer trained on natural language can effectively reduce the amount of data required to apply deep learning to industrial sensor fusion while achieving high prediction accuracy. The difficult and uncertain manual feature engineering, which requires a large workload, can also be eliminated, as deep networks extract features automatically. In addition, the self-attention mechanism of the Transformer aids the identification of critical sensors, improving the interpretability of deep learning in industrial sensor fusion.
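To give a flavour of the mechanism described above, the following is a minimal NumPy sketch of scaled dot-product self-attention applied across sensor channels. It is purely illustrative and not the authors' implementation: the sensor count, embedding size, and weight matrices are made-up assumptions, and in the paper the attention weights come from a Transformer pretrained on language rather than random matrices. It does show how averaging the attention weights can score each sensor's influence, which is the interpretability idea the summary mentions.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(X, Wq, Wk, Wv):
    """Scaled dot-product self-attention over sensor embeddings.

    X: (n_sensors, d) array, one embedded feature vector per sensor channel.
    Returns the fused representation and the attention weight matrix.
    """
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    scores = Q @ K.T / np.sqrt(K.shape[-1])  # pairwise sensor affinities
    A = softmax(scores, axis=-1)             # each row sums to 1
    return A @ V, A

# Illustrative setup: 6 hypothetical sensor channels embedded in 8 dimensions
rng = np.random.default_rng(0)
n_sensors, d = 6, 8
X = rng.normal(size=(n_sensors, d))
Wq, Wk, Wv = (rng.normal(size=(d, d)) for _ in range(3))

fused, A = self_attention(X, Wq, Wk, Wv)
# Averaging attention received over all queries gives a rough
# per-sensor importance score, hinting at which sensors are critical.
importance = A.mean(axis=0)
```

In a trained model, large values in `importance` would point to the sensor channels the network relies on most, which is one way self-attention can make the fusion process more interpretable.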

If you have any questions relating to this paper, please contact Ze directly at ze.zhang@sheffield.ac.uk.