Title | Cost-efficient coupled learning methods for recovering near-infrared information from RGB signals: Application in precision agriculture |
Publication Type | Journal Article |
Year of Publication | 2023 |
Authors | Gkillas, A, Kosmopoulos, D, Berberidis, K |
Journal | Computers and Electronics in Agriculture |
Volume | 209 |
Article Number | 107833
Abstract | Multispectral imaging and the derived vegetation analysis offer useful tools for revealing beneficial information for a variety of applications, e.g., precision agriculture, medical imaging and autonomous driving. Contrary to mainstream RGB cameras, which capture information from only three bands within the visible spectrum, multispectral cameras offer better spectral resolution by utilizing the underlying information in both the visible and the near-infrared spectrum. However, the cost of multispectral cameras is very high, and their mobility is limited by their weight and their need for special hardware equipment. Considering these limitations, we propose two low-cost and efficient methods to infer detailed spectral information outside the visible spectrum range by employing only an RGB camera. The proposed methods require significantly less training data, contain approximately 99.8% fewer parameters than competing deep learning approaches, and can be deployed on various edge devices with computational and power constraints, e.g., mobile phones or unmanned drones, for addressing problems in precision agriculture under real-field settings. Extensive numerical results demonstrate the efficacy of the proposed models in reconstructing images outside the visible spectrum. Additionally, the reconstructed images can be utilized to estimate the normalized difference vegetation index (NDVI), which can reveal valuable information concerning the health of the monitored plants without the need for a multispectral camera.
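Note: the abstract's closing claim rests on the standard NDVI formula, NDVI = (NIR − Red) / (NIR + Red). Below is a minimal sketch of that computation in Python, assuming a near-infrared band reconstructed from RGB by one of the paper's models; the function name, array names, and [0, 1] scaling are illustrative assumptions, not the authors' code.

```python
import numpy as np

def ndvi(nir: np.ndarray, red: np.ndarray, eps: float = 1e-8) -> np.ndarray:
    """Normalized difference vegetation index: (NIR - Red) / (NIR + Red).

    `nir` is assumed to be a near-infrared band reconstructed from an RGB
    frame (as the paper's methods produce); `red` is the R channel of the
    same frame. Both are assumed to be arrays scaled to [0, 1].
    """
    nir = nir.astype(np.float64)
    red = red.astype(np.float64)
    # eps guards against division by zero on dark pixels where NIR + Red ~ 0.
    return (nir - red) / (nir + red + eps)

# Hypothetical usage on an H x W x 3 RGB image and its reconstructed NIR band:
#   ndvi_map = ndvi(reconstructed_nir, rgb[..., 0])
# Values near +1 suggest dense healthy vegetation, values near 0 bare soil,
# and negative values water or shadow.
```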