automated seedling health with rpi + infracam + opencv?


maco...@gmail.com

unread,
Dec 9, 2015, 9:04:02 PM12/9/15
to plots-infrared
Hello from ManyLabs.org over on the west coast!

One of the interns here, Van, is working on an automated plant growth tracking system based on an rpi + picam. Currently it logs some basic measurements from soil sensors, provides scheduled illumination to the germinating plant, and posts images periodically to twitter of growth.

https://twitter.com/seeusgrow - "Live-ish feed of plants growing"

While the daily images sure are exciting, the textual content of each tweet is a bit repetitive... so we were thinking of writing a simple script to generate tweet text dynamically based on the plant's health. We hope to model this algorithmically, primarily by analyzing each image. (We can also collect some other channels of data relevant to the plant's state: soil moisture, pH, temperature, insolation, but these would function more like alarms than a model. Here's an example of those data streams, though be warned, this page loads a lot of data in the background: https://www.manylabs.org/data/653/view/).

Anyway, this seems like a job for... INFRACAMMMMM!

I searched around on the public labs wiki and the wider internet a bit but didn't find any published code for automated plant health monitoring (at least none that would run on an rpi).

I did see the page about the standalone python script for infragram analysis (https://publiclab.org/wiki/python-webcam-codes). So I am looking for some pointers:

1.) If we modded a second camera into an infracam and hooked it up to our pi, could we perform infracam analysis of the captured images directly onboard with the python library? Has anyone already done this?

2.) Any suggestions on how to pragmatically begin modeling plant health over time from the series of NDVI images? I am thinking we might paint the background of the visual field around the plant an unplanty color and subtract it from each image, then calculate the overall NDVI ratio for the plant. We could also compute a distance metric for each pixel relative to its value in the prior image, smooth it all with a moving average (3 day? week?), and finally create overlays for regions of significant change. (Just thinking off the top of my head; I can try to explain this more clearly if that sounded too complicated.)
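To make that concrete, here's roughly the sort of numpy sketch I have in mind (everything here is made up by me: the class name, the 0.2 change threshold, and the assumption that each frame arrives as an NDVI float array plus a boolean plant mask):

```python
import numpy as np
from collections import deque

class PlantTracker:
    """Track smoothed NDVI and highlight regions of significant change."""

    def __init__(self, window=3):
        # moving average over the last `window` per-frame NDVI means
        self.history = deque(maxlen=window)
        self.prev_frame = None

    def update(self, ndvi, plant_mask):
        # overall NDVI for the plant pixels only (background subtracted out)
        mean_ndvi = float(ndvi[plant_mask].mean())
        self.history.append(mean_ndvi)
        smoothed = sum(self.history) / len(self.history)
        # per-pixel distance relative to the prior image
        if self.prev_frame is None:
            change = np.zeros_like(ndvi)
        else:
            change = np.abs(ndvi - self.prev_frame) * plant_mask
        self.prev_frame = ndvi
        # overlay mask for regions of significant change (threshold is a guess)
        overlay = change > 0.2
        return smoothed, overlay
```

The window length maps to the smoothing question above: at one photo per day, window=3 gives a 3-day moving average.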

Thanks for the tips!

Mac


Mackenzie Cowell

unread,
Dec 9, 2015, 10:19:28 PM12/9/15
to plots-infrared
Just discovered nedhorning's research note about NDVI with imageJ/Fiji (https://publiclab.org/notes/nedhorning/06-24-2014/updated-photo-monitoring-plugin-to-compare-ndvi-with-dvi).

And this post from the Fiji-devel group with some tips on getting it running on rpi (https://groups.google.com/forum/#!topic/fiji-devel/qDKjrRNaEwQ). I'll look into that direction.

Don Blair

unread,
Dec 10, 2015, 12:14:28 AM12/10/15
to Mackenzie Cowell, Craig Versek, plots-infrared
Hello Mac!  And hallo, ManyLabs!

Wow, this sounds like a fantastic project. I'm one of the folks who put together the python scripts for capturing infragram imagery way back in the day ... haven't looked at that code in a long time, but I think folks have been using it recently & making sure that it still runs :)  But yes, as I recall, the latest version of the software was intended to combine imagery from two separate webcams and use OpenCV to compare and overlay the two images.  I also recall that it didn't work particularly well :)

You should also check out what the photosynq.org folks are up to.  Their hardware is more elaborate than a simple web cam, but their measurements might be far more useful for your use-case.  They're super-friendly folks, and I think they're looking for people to test their hardware right now ... 

In terms of modeling ... if you are already making several additional measurements like pH, soil moisture, ambient humidity (and if you'd also be interested in CO2, our friend Craig (cc'd) has found some potentially useful sensors), it does sound like it would be really fun to see if you could correlate some of these measurements with optical measurements like NDVI.  

Since you're using a lab-based setup and you're interested in automated monitoring, maybe you could even make some very localized (e.g. a specific leaf) measurements, using a VIS- and NIR-sensitive light detector like this $5 one here: https://www.adafruit.com/products/439

I haven't thought about this in a while, but it seems that if you were to point one of those detectors back at the light source illuminating a leaf, measuring the incident VIS and NIR, and point another at the leaf to measure the reflected proportions of VIS and NIR, you'd have a nice little experiment on your hands ...
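Back of the envelope, the arithmetic would just be reflected over incident per band, then the usual NDVI ratio ... something like this (all the counts below are invented for illustration):

```python
def leaf_ndvi(incident_vis, incident_nir, reflected_vis, reflected_nir):
    """Pseudo-NDVI from paired incident/reflected light sensor readings."""
    r_vis = reflected_vis / incident_vis   # fraction of visible light reflected
    r_nir = reflected_nir / incident_nir   # fraction of NIR reflected
    return (r_nir - r_vis) / (r_nir + r_vis)
```

With healthy-foliage-ish numbers (most NIR reflected, most visible absorbed), leaf_ndvi(1000, 1000, 80, 500) comes out around 0.72.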


 

--
Post to this group at plots-i...@googlegroups.com
 
Public Lab mailing lists (http://publiclab.org/lists) are great for discussion, but to get attribution, open source your work, and make it easy for others to find and cite your contributions, please publish your work at http://publiclab.org
---
You received this message because you are subscribed to the Google Groups "plots-infrared" group.
To unsubscribe from this group and stop receiving emails from it, send an email to plots-infrare...@googlegroups.com.

Chris Fastie

unread,
Dec 10, 2015, 1:08:30 PM12/10/15
to plots-infrared, maco...@gmail.com, cve...@gmail.com, donb...@pvos.org

This seems to be a timely topic as there have been a couple of other discussions about using a Raspberry Pi camera to make NDVI images (one here: https://publiclab.org/notes/LaPa/12-03-2015/how-do-i-set-a-costum-white-balance-of-the-noir-modulo-cam).


It’s pretty exciting that Fiji could run on a Pi. If so, then anything is possible. Ned Horning’s plugin for Fiji is probably not the right route to an automated system (it requires user input at multiple points), but the plugin code could be a starting point.


The Python code might be more likely to offer a lightweight route to processing photos into NDVI. My recollection is that the various iterations of this code were intended to process single photos from infrared sensitive cameras with red or blue filters, not photo pairs from dual camera systems, but maybe both versions exist. Versions of the Python code are here: https://github.com/publiclab/infrapix/blob/master/README.md and here: https://github.com/p-v-o-s/infragram-js.


The choice of single or dual camera systems requires some thought. The plants will be close enough to the cameras that parallax will be a big issue, so aligning photo pairs will be messy and maybe unsatisfying. This alignment is also processor intensive, so avoiding it might be wise. That makes this a good application for a single camera NDVI system. A Pi NoIR camera with a red filter (like a Wratten 25A) might be the best choice (I say that with no information about anybody ever trying that combination). This setup will provide a rather pure NIR image in the blue channel and a mixed red + NIR image in the red channel.
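In Python the raw, uncalibrated computation from such a photo might look like the following. I'm assuming the simplest possible channel model (blue channel = NIR, red channel = red + NIR, subtract to estimate red); real cameras leak NIR into every channel, so treat this as a starting point rather than a method:

```python
import numpy as np

def ndvi_from_red_filter(img_rgb):
    """NDVI from one red-filtered NoIR photo; img_rgb is HxWx3 uint8 (R, G, B)."""
    red_plus_nir = img_rgb[..., 0].astype(np.float64)  # red channel: red + NIR
    nir = img_rgb[..., 2].astype(np.float64)           # blue channel: ~pure NIR
    vis = np.clip(red_plus_nir - nir, 0, None)         # crude estimate of pure red
    denom = nir + vis
    denom[denom == 0] = 1.0  # avoid dividing by zero on black pixels
    return (nir - vis) / denom
```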


How much good plant health information can be derived from those two channels is a big question. Generally, such systems (with other cameras) can be effective at distinguishing foliage from non-foliage, but their capability to distinguish stressed from healthy foliage is unproven. In very controlled environments where the lighting is absolutely consistent and things like leaf angle and leaf surface moisture are constant, it might be possible to discern some plant health differences. However, your little parsley seedlings are probably going to be pampered and healthy. This will be a great test of whether DIY NDVI images can tell you anything other than that the plants are still alive.


Your idea of using a background with a “non-plant” signature is intriguing. Plants will generally reflect lots of NIR, so a background that absorbs NIR could make this easy (if you capture an NIR image or channel). Subtracting the background then has the potential to allow quantification of the size of the plant. This might be the most straightforward and interesting automated reporting to try. If the image used for this also had a visible light channel (or a mostly visible light channel) you could also compute average (pseudo-) NDVI for the plant part of the image. It remains to be seen whether that value will have much power to discern very subtle changes in the actual health of the plant.
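A sketch of that size metric, assuming you can pull out an NIR channel and that the threshold is something you would tune empirically against the painted background:

```python
import numpy as np

def plant_mask_and_size(nir_channel, threshold=100):
    """Boolean plant mask plus a pixel-count size metric.

    Assumes foliage reflects NIR strongly and the background absorbs it,
    so a simple intensity threshold separates the two.
    """
    mask = np.asarray(nir_channel) > threshold
    return mask, int(mask.sum())

def plant_mean_ndvi(ndvi, mask):
    """Average (pseudo-)NDVI over the plant pixels only."""
    return float(ndvi[mask].mean())
```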


One of the pitfalls in interpreting NDVI values in close-up images of plants is that the 3D surface of plants causes shadows and highlights and aspect variance with different color balances. Ideally, you would want an image of a single, flat leaf at a constant orientation to the camera and light source. This calls for a very broad-leafed and large-leafed plant, maybe something as different as possible from parsley ;^)


To produce the most meaningful NDVI information, it might be worthwhile using Ned Horning’s calibration procedure. This requires calibration targets of known reflectance in the visible and NIR spectral ranges of interest. If you have access to a spectrometer that can measure (even crudely) the percent reflectance of red light (600-700 nm) and NIR (700-900 nm) from some different colored surfaces, you could calibrate your system so the NDVI values are more robust as plant health, lighting, and exposure vary. Ideally, the targets would be in every photo (so they would have to be waterproof), but maybe just intermittent calibration photos would be sufficient.
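The simplest version of that calibration is a least-squares line per band, fit from the targets' known reflectances (Ned's actual procedure is more sophisticated, but this shows the idea):

```python
import numpy as np

def fit_calibration(raw_values, known_reflectances):
    """Fit reflectance = gain * raw + offset from calibration target readings."""
    gain, offset = np.polyfit(raw_values, known_reflectances, 1)
    return gain, offset

def apply_calibration(raw, gain, offset):
    """Map raw channel values to estimated reflectance."""
    return gain * np.asarray(raw, dtype=float) + offset
```

With gains and offsets for both bands in hand, you would convert each channel to reflectance before computing NDVI.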


I’m looking forward to closely following your tweeting parsley.


Chris


Sort of related -- early (failed) attempts at dual camera and single camera NDVI timelapse videos of plants.

 

 

Mathew Lippincott

unread,
Dec 10, 2015, 3:24:31 PM12/10/15
to Chris Fastie, plots-infrared, maco...@gmail.com, Craig Versek, Don Blair
Back before Ned & Chris had worked out a filter system for a single camera setup, I was playing around with taking two pictures with a single camera using a filter switcher, which does work with the R-Pi.

My goal was to use the Pi camera because you can shoot RAW, so calibration looked more tractable. A camera has to stay pretty still to get two aligned photos, but it's a very doable system, especially with Fiji running on a Pi.



--

ian collins

unread,
Dec 10, 2015, 4:29:27 PM12/10/15
to Chris Fastie, plots-infrared, maco...@gmail.com, cve...@gmail.com, donb...@pvos.org
All

I did some stuff with a Pi 2 streaming with NDVI calculation using the Infragram USB camera, video4linux, gstreamer and OpenCV.

I ran it all summer on plants on my balcony.

I must admit the NDVI results weren't useful, for reasons I now understand.

It was fun though, and I'm keen to look at options for version 2 next year.


Happy to share the Pi stuff if anyone is interested. Issues were primarily with power when running 5 GHz wifi and the camera - I used a 12 V car adapter USB charger socket.

On the camera side there was no direct access to RGB, which required converting frames from YUV420 to RGB and back. I was able to achieve about 10 fps.
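For reference, the per-pixel YUV to RGB math is below (BT.601 full range, which is an assumption on my part - the camera may use limited range). In practice OpenCV's cvtColor does this over whole YUV420 planes far faster:

```python
def yuv_to_rgb(y, u, v):
    """Convert one full-range BT.601 YUV sample to 8-bit RGB."""
    d, e = u - 128, v - 128  # center the chroma components
    clip = lambda x: max(0, min(255, int(round(x))))
    r = clip(y + 1.402 * e)
    g = clip(y - 0.344136 * d - 0.714136 * e)
    b = clip(y + 1.772 * d)
    return r, g, b
```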

Regards

Ian



--
Capture.jpg
Capture2.jpg

Nathan McCorkle

unread,
Dec 17, 2015, 6:43:47 PM12/17/15
to plots-infrared
On Thu, Dec 10, 2015 at 1:29 PM, ian collins <eanco...@gmail.com> wrote:
> All
>
> I did some stuff with a Pi 2 streaming with NDVI calculation using the
> Infragram USB camera, video4linux, gstreamer and OpenCV.
>
> I ran it all summer on plants on my balcony.
>
> I must admit the NDVI results weren't useful, for reasons I now understand.

You elaborated on the electronics/programming... but unless I missed it, you didn't speak to why the NDVI wasn't useful. What is it you now understand?