INFORMATION TECHNOLOGY

Scientists make progress in multi-focus image fusion


Depth of field is the distance between the nearest and farthest planes that an optical system can image sharply. The greater the depth of field, the wider the range of distances over which the scene remains in focus. Because optical lenses have a limited depth of field, it is difficult to capture a fully focused image in a single shot: part of the image is usually blurred. One effective way to address this problem is multi-focus image fusion (MFIF). MFIF fuses multiple partially focused images, obtained by focusing on different objects in the same scene, into a single all-in-focus image in which every object is sharp. MFIF effectively extends the depth of field of the optical lens, allowing the imaging system to break through this limitation and obtain higher-quality images.
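To make the task concrete, the sketch below shows a simple classical fusion baseline, not the method described in this article: each pixel of the fused image is taken from whichever source image is locally sharper, judged by a Laplacian focus measure. The function name and window size are illustrative assumptions.

```python
# Illustrative multi-focus fusion baseline (not the paper's method):
# pick, per pixel, whichever source image is locally sharper.
import numpy as np
from scipy.ndimage import laplace, uniform_filter

def fuse_two_focus_images(img_a: np.ndarray, img_b: np.ndarray, window: int = 9) -> np.ndarray:
    """Fuse two grayscale images (float arrays in [0, 1]) focused at different depths."""
    # Local focus energy: smoothed absolute Laplacian response.
    focus_a = uniform_filter(np.abs(laplace(img_a)), size=window)
    focus_b = uniform_filter(np.abs(laplace(img_b)), size=window)
    # Keep the pixel from whichever image is sharper at that location.
    mask = focus_a >= focus_b
    return np.where(mask, img_a, img_b)
```

Such hand-crafted decision rules work well when focus boundaries are clean, but they tend to leave artifacts near object edges, which is one motivation for learning-based fusion.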

At present, deep learning methods clearly outperform traditional algorithms in MFIF. Deep learning-based MFIF algorithms have developed rapidly in recent years, but researchers often focus on designing ever more complex network structures, modules, and loss functions to improve fusion performance. This means a great deal of time must be spent devising intricate architectures and running extensive comparative experiments. However, this approach is not conducive to further performance gains, and current MFIF algorithms have reached a performance bottleneck.

To this end, Fu Weiwei's team at the Suzhou Institute of Biomedical Engineering and Technology, Chinese Academy of Sciences, revisited the image fusion task and modeled it as conditional generation. Building on diffusion models, which currently achieve the strongest results in image generation, the team proposed FusionDiff, an MFIF algorithm based on a diffusion model (its image fusion principle is shown in Figure 1). This is the first application of diffusion models to multi-focus image fusion, offering a new direction for research in this field.

Figure 1. Schematic diagram of the image fusion principle of the FusionDiff algorithm
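Conceptually, a diffusion-based fusion model generates the all-in-focus image by iteratively denoising random noise while conditioning on the source images. The sketch below illustrates conditional DDPM sampling under that framing; `eps_model`, the noise schedule, and the tensor shapes are illustrative assumptions rather than the published FusionDiff implementation.

```python
# Minimal sketch of conditional DDPM sampling for image fusion: start from
# Gaussian noise and denoise it step by step, with the partially focused
# source images supplied as conditions to the noise-prediction network.
import torch

@torch.no_grad()
def sample_fused_image(eps_model, sources, num_steps=1000, device="cpu"):
    """sources: tensor of shape (B, 2, H, W) holding the two partially focused images."""
    betas = torch.linspace(1e-4, 0.02, num_steps, device=device)
    alphas = 1.0 - betas
    alpha_bars = torch.cumprod(alphas, dim=0)

    # Start from pure noise with one output channel per fused image.
    x = torch.randn(sources.shape[0], 1, *sources.shape[-2:], device=device)
    for t in reversed(range(num_steps)):
        t_batch = torch.full((x.shape[0],), t, device=device, dtype=torch.long)
        # The network predicts the noise in x_t, conditioned on the source images.
        eps = eps_model(torch.cat([x, sources], dim=1), t_batch)
        # Standard DDPM posterior mean, then add noise except at the last step.
        x = (x - betas[t] / torch.sqrt(1.0 - alpha_bars[t]) * eps) / torch.sqrt(alphas[t])
        if t > 0:
            x = x + torch.sqrt(betas[t]) * torch.randn_like(x)
    return x
```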

Experiments show that FusionDiff outperforms other MFIF algorithms in both fusion quality and few-shot learning ability. Compared with 13 representative MFIF algorithms on eight evaluation metrics, FusionDiff achieved the best fusion results (Tables 1 and 2). At the same time, FusionDiff is a few-shot learning MFIF algorithm: it needs only 100 pairs of training images to achieve good fusion results. Table 3 lists the training set sizes of different MFIF algorithms; FusionDiff's training set is less than 2% of the size used by the other algorithms. This means the algorithm may be suitable for applications where samples are scarce, such as microscopic image fusion.

Table 1. The average score of all algorithms on the Lytro public test set

Table 2. The average score of all algorithms on the MFFW public test set

Table 3. The size of the training set for different MFIF algorithms

The research results were published in Expert Systems with Applications, titled FusionDiff: Multi-focus image fusion using denoising diffusion probabilistic models. The research work was supported by the Natural Science Foundation of Shandong Province and the Youth Innovation Promotion Association of the Chinese Academy of Sciences. (Source: Suzhou Institute of Biomedical Engineering and Technology, Chinese Academy of Sciences)

Related paper information: https://doi.org/10.1016/j.eswa.2023.121664



