
# Question: Spectrometer from a drone

by sylvainbonhommeau

### I'm a marine biologist and we use a (very) expensive hyperspectral sensor on-board an aircraft to get images of lagoons, etc. From the reflectance spectrum, we can identify the habitat (sand, algae, type of corals, etc.). I'd like to equip a drone with a spectrometer to collect images along predefined tracks. I was wondering if this would be feasible using this spectrometer. As it's for marine applications, we don't need wavelengths higher than the red (absorbed in the first mm). To make it short:

• Can we use the spectrometer for outdoors applications?
• If so, is it possible to select the wavelengths to scan?
• From the wiki, I understand that you can take 20-30 samples per second, so if I scan from 390 to 600 nm with a 3 nm step, would it take about 4 sec to get an image?
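For reference, a quick back-of-envelope check of that estimate, assuming a hypothetical instrument that scans one wavelength at a time (the Public Lab device actually captures all wavelengths in a single photo, as the replies explain):

```python
# Scan-time estimate for a one-wavelength-at-a-time scanner.
start_nm, stop_nm, step_nm = 390, 600, 3
n_samples = (stop_nm - start_nm) // step_nm + 1  # 71 wavelengths, endpoints included

for rate in (20, 30):  # the quoted samples-per-second range
    print(f"{rate} samples/s -> {n_samples / rate:.1f} s per scan")
```

So "about 4 sec" is the right order of magnitude (roughly 2.4-3.6 s depending on the rate), though again, the PLab spectrometer does not scan this way.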

Thanks!

Sylvain

Hi Sylvain,

I don't know the answer to this question because I don't quite understand hyperspectral scanning. But I think the answer is maybe. The type of spectrometer Public Lab uses does not scan through the wavelengths. It takes a photo of the diffraction pattern formed by an entrance slit and grating. The light intensity at each wavelength is recorded in a spectrogram (the photo) which can be graphed as brightness at each wavelength. If you point the spectrometer at different colored surfaces, the light reflected from those surfaces will form diffraction patterns which will differ from each other as the apparent color of the surfaces differ. So assuming that the source of illumination (the sun in this case) does not change between photos, differences among spectra could reveal color differences among surfaces.

Each spectrogram (photo of a diffraction pattern) can be analyzed later to determine how bright each wavelength was, so you can choose whichever wavelengths you want to study (mostly between 400 and 700 nm). The intensity at each wavelength is garbled by the color system in consumer digital cameras, so there is not a one-to-one relationship between the recorded intensity at each wavelength and the actual spectral brightness in the scene. But you might be able to work around that with some calibration targets.
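To make the "brightness at each wavelength" step concrete, here is a minimal Python sketch of a two-point wavelength calibration of the photo's pixel columns. The pixel columns (120, 410) are hypothetical values, not measurements from a real device; the two wavelengths are the well-known mercury peaks of a fluorescent lamp, which are commonly used as calibration references:

```python
import numpy as np

# Hypothetical calibration: suppose the 435.8 nm and 546.1 nm mercury
# peaks of a fluorescent lamp land on pixel columns 120 and 410 of the
# spectrum photo.  A linear fit then maps any column to a wavelength.
px = np.array([120, 410])
nm = np.array([435.8, 546.1])
slope, intercept = np.polyfit(px, nm, 1)

def column_to_nm(col):
    return slope * col + intercept

# A fake 640-pixel-wide grayscale row stands in for the real photo;
# averaging several rows of the actual image would reduce noise.
row = np.random.default_rng(0).random(640)
wavelengths = column_to_nm(np.arange(640))
# 'row[i]' is then the relative brightness at 'wavelengths[i]'.
```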

Chris

The new SpectralWorkbench.js library can (in Node.js javascript) convert a spectrum image file to a spectrum JSON or CSV file, so perhaps that's a way you could do this, either onboard a drone or in automated post-processing. I think the way you'd want to do it is to carefully align a spectrometer with a video camera, and ensure (using a high-contrast pattern on the ground) you know which pixels the spectrum actually corresponds to. Then you could record periodic (or continuous, w/ video) spectra along with a video of what you're flying over.
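SpectralWorkbench.js does this conversion in JavaScript; as an illustration of the same idea, here is a minimal Python sketch that collapses a grayscale spectrum image into a wavelength/intensity CSV. The function name and calibration constants are placeholders, not part of any real pipeline:

```python
import csv
import numpy as np

def spectrum_to_csv(img_gray, csv_path, nm_per_px=0.5, nm_at_px0=380.0):
    """Collapse a 2-D grayscale spectrum image into a wavelength,intensity CSV.

    nm_per_px and nm_at_px0 are placeholder calibration constants; a real
    pipeline would derive them from known emission peaks.  Returns the
    number of wavelength samples written.
    """
    profile = np.asarray(img_gray, dtype=float).mean(axis=0)  # average rows
    with open(csv_path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["wavelength_nm", "intensity"])
        for col, val in enumerate(profile):
            writer.writerow([nm_at_px0 + col * nm_per_px, round(val, 3)])
    return len(profile)
```

Loading the actual photo or video frame into the array could be done with any imaging library (e.g. Pillow or OpenCV).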

You'd probably want a more robust version, and smaller/more waterproof than the Desktop Spectrometry Kit), but this sounds plausible, though a good bit of work, with a compact Raspberry Pi-based spectrometer, like the one @cristoforetti has been working on as part of #webvalley.

Either the sequence of spectrum images or a frame-by-frame dump of a video of spectra could be processed using SpectralWorkbench.js. But keep in mind the spectrometer is not #intensity-calibrated (though there are published methods for getting close, by @stoft), so without that you'd need to compare to reference data you collect on the same instrument in similar lighting conditions.
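The "compare to reference data from the same instrument" idea amounts to dividing each field spectrum by a reference spectrum, which cancels much of the shared (unknown) instrument response and illumination. A minimal sketch, with a hypothetical function name:

```python
import numpy as np

def relative_reflectance(sample, reference, eps=1e-6):
    """Ratio of a field spectrum to a reference spectrum taken on the same
    instrument under similar lighting.  The shared instrument response and
    illumination divide out, leaving a relative reflectance; eps guards
    against division by zero in dark pixels."""
    sample = np.asarray(sample, dtype=float)
    reference = np.asarray(reference, dtype=float)
    return sample / np.maximum(reference, eps)
```

This only gives reflectance *relative to the reference target*, not absolute values, which is why collecting both on the same instrument in similar lighting matters.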

This is a big project, but if you tackle it we'd love to hear how you do!

However, two caveats: 1) the slit-based PLab design does not provide the optical field-of-view characteristics that a camera lens does (and I'm not aware of any analogous measure for the spectrometer), so with scattered reflected light the "detection aperture angle" is unknown; and 2) the dynamic range of a webcam-based spectrometer is quite small relative to the potential dynamics of outdoor light levels. A major risk is ending up with either very low, noisy signals or heavily clipped ones, both unusable. A first step might be a simple experiment: obtain a reflected-light outdoor spectrum from a few stories up, looking down, under "drone" lighting conditions. Outdoor lighting can cover many orders of magnitude, while the PLab device's range is (presently) limited to adding attenuation. But maybe a single "setup attenuation" for each survey would be sufficient? It would be an interesting experiment.
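One way to make that first experiment quantitative would be a quick exposure sanity check on each captured spectrum, flagging the two failure modes described above. The thresholds here are illustrative guesses, not measured values:

```python
import numpy as np

def exposure_ok(spectrum, bit_depth=8, clip_frac=0.01, floor_frac=0.05):
    """Sanity check for a webcam spectrum's limited dynamic range.

    Flags spectra that are heavily clipped (more than clip_frac of the
    pixels at the sensor maximum) or buried near the noise floor (median
    below floor_frac of full scale).  Thresholds are illustrative only.
    """
    s = np.asarray(spectrum, dtype=float)
    full_scale = 2 ** bit_depth - 1
    clipped = np.mean(s >= full_scale) > clip_frac
    too_dark = np.median(s) < floor_frac * full_scale
    return not clipped and not too_dark
```

Running something like this on every frame during a test flight would show how often a single "setup attenuation" keeps the signal in the usable band.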

It is a new application concept for a DIY spectrometer so could be worth investigating the basics to find the relative limits.


Hi, thanks for all your answers! Sorry for the incomplete description of the experiment. Basically, we need the light spectrum for each pixel. I saw this paper which describes how to do that with a "normal" camera. I guess an option using your device would be to fly relatively low, so that each acquisition from your spectrometer would be one pixel. However, that would increase the time needed to cover an area... Regarding the light intensity/reflectance issue, I think there could be ways to overcome it:

• Since we use a statistical model to link the spectrum shape of a pixel to what is observed (sand, corals...), we could recalibrate this model using the outputs of the spectrometer. It's not the best option, since we have already done a lot of field observation and hyperspectral acquisition with the (US\$20,000...) camera.
• We could have a camera pointing upwards towards the sky to measure the incoming light and then calculate the reflectance.

Everything would be post-processed, because I don't need real-time data. I also wanted to use a differential GPS for centimeter precision, basically using this DIY GPS. A friend has built one, and with an IMU he's able to georeference each image on-board, without long post-processing with manual orthorectification. Anyway, this project might not be straightforward, but I'm going to start some tests. Maybe I'll work with a master's student to help me, because I'm in marine biology, so this reaches the limits of my understanding... It could be a nice PhD topic too. I'll let you guys know about any new developments, but if you have other ideas, they are welcome!

Thanks a lot! Sylvain
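The statistical-model option you describe could start as simply as nearest-centroid matching of a pixel's normalized spectrum shape against habitat reference spectra. The reference spectra below are made up for illustration; a real model would be trained on your existing field observations:

```python
import numpy as np

# Toy habitat reference spectra (4 arbitrary wavelength bands); these
# values are invented for illustration, not measured reflectances.
references = {
    "sand":  np.array([0.9, 0.8, 0.7, 0.6]),
    "algae": np.array([0.2, 0.5, 0.3, 0.2]),
    "coral": np.array([0.4, 0.3, 0.5, 0.6]),
}

def normalize(s):
    s = np.asarray(s, dtype=float)
    return s / np.linalg.norm(s)  # spectrum *shape* matters, not brightness

def classify(spectrum):
    """Return the habitat whose normalized reference spectrum is closest."""
    s = normalize(spectrum)
    return min(references, key=lambda k: np.linalg.norm(s - normalize(references[k])))

print(classify([0.45, 0.4, 0.35, 0.3]))  # -> sand (0.5 x the sand reference)
```

Normalizing before matching is one simple way to sidestep the absolute-intensity problem discussed above, since only the relative shape of the spectrum is compared.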

> the slit-based PLab design does not provide the optical characteristics of field of view like the lens on a camera (and I'm not aware of any analogous measure for the spectrometer) so with scattered reflected light the "detection aperture angle" is unknown

It's true that the PL spec was not designed for many simultaneous spectra, but if you look at some of the posts on #hyperspectral imaging, a slit does actually return a series of spectra that can be resolved into images. I think the angle would be measurable, but am not sure how any scattering could be assessed.