
OpenSourceAg #4 - "Talk is cheap, show me the code"

Working in the open on your own projects is a great way to build an easily shareable portfolio of work. Plus open source insect detection, precision spraying research and more!

There are many, many quotes that effectively mean ‘talk is cheap’. As kids, we’re taught that actions speak louder than words, not to brag but instead to let our achievements, kindness or good deeds tell their own story. And so Linus Torvalds (the creator of Linux), in response to a suggestion about the benefits of a particular method of threading, gave us this:

Talk is cheap, show me the code

If something is so much better, faster or more reliable than what already exists, then why not show me with the code? In a competitive field, the ability to point to your own projects, where you can show off the strengths you might otherwise only talk about in an interview or application, goes a long way in demonstrating those specific skill sets.

This is one major (and self-interested) reason for building out your own ideas in the open, but there are good reasons it could hold true for agtech businesses too.

Some code in progress for the OpenWeedLocator project.

Welcome

In the 4th edition of the OpenSourceAg newsletter, I want to explore some of the opportunities that building in the open offers individuals and companies—demonstrating skills, and letting clients see your work before they sign on.

It’s fantastic to have Amber Balfour-Cunningham as a guest writer this week, sharing some of her work on open-source insect monitoring and the world of insect traps. I dive into some of the research on targeted application in the Interesting Reads section, and share the One Smart Spray open-source software stack I stumbled upon while researching this edition.

So I hope you enjoy the mixed bag this week.


The self-serving benefits of building in the open

We often talk about open source in grandiose terms—industry-wide impact, collective progress, and the greater good for agtech innovation. I’ve certainly leaned into those arguments in OSA before. But in reality, there are plenty of personal, pragmatic reasons to build in the open, and they deserve just as much attention.

Sharing work publicly creates a visible track record of your expertise, interests and commitment to a particular subject. My first move into sharing code was building a basic script that could calculate the longest word you can make with a 7-segment display, inspired by a Tom Scott video on the same idea. You can still find the code on GitHub in all its glory; a rough sketch of the idea follows below.

Some code from my first shared repository on GitHub.
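For the curious, here’s a minimal sketch of the core idea (my own illustration, not the original script): filter a word list down to words whose letters can all be drawn on a 7-segment display, then keep the longest. The displayable-letter set and the words.txt path are assumptions.

```python
# Minimal sketch of the 7-segment longest-word idea (illustrative; not the
# original repository's code). Assumes a one-word-per-line list at words.txt
# and a commonly used set of letters renderable on a 7-segment display.
DISPLAYABLE = set("abcdefghijlnopqrstuy")  # k, m, v, w, x, z typically excluded

def longest_displayable(path="words.txt"):
    best = ""
    with open(path) as f:
        for line in f:
            word = line.strip().lower()
            if word.isalpha() and set(word) <= DISPLAYABLE and len(word) > len(best):
                best = word
    return best

if __name__ == "__main__":
    print(longest_displayable())
```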

And while those 56 lines of code probably didn’t change my future much, larger portfolios and projects make it easier to attract collaborators and contributors, and to start building your own profile in a particular niche. For businesses, these advantages extend to clients, who see not just the final product or the marketing but the thinking, problem-solving and array of other insights you and the business can offer. It gives them the code—not just you talking about it.

An impressive commit history on GitHub.

The strategy around building in the open is critical. As an individual, if you discover the next room-temperature superconductor, there probably needs to be some consideration of your go-to-market. But not everything you build is unique (nor should it be—I think others have written 7-segment word calculators before), game-changing or valuable if kept locked up. Going back to my favourite analogy: how many best-selling cookbooks discuss similar recipes yet still prosper? Some of the tools you build may have far more value to your own brand and future opportunities if released publicly and shared widely. I took this approach recently with my ImageReviewer project for checking annotated images of weeds. It was more valuable for me to share it publicly and write about it here and on LinkedIn than to keep it to myself.

The same holds true for businesses. Not every internal breakthrough or advantage needs to be public, but sharing small datasets, insights or specific tooling can position a company as a leader with minimal risk to its competitive advantage. A dataset of 20,000 images of wild radish in wheat isn’t going to sink the incumbent weed recognition companies, but releasing it would set a company apart from the rest of the closed-source pack while helping research and development more broadly.

This transparency can also build trust, with farmers (and everyone) more likely to engage with a company demonstrating real expertise rather than just claiming it—Linus once again ringing true. When done right, open-source work creates a dynamic where innovation isn’t just locked away—it becomes part of a broader conversation, organically drawing people toward your work rather than requiring you to constantly push it out.

For example, I find it interesting that the many companies offering services to standardise agtech data streams into various dashboards generally make no effort to publish their code, standards or data schemas. How do I build on their standards and platforms organically if I can’t see them? I see some clear benefits to building these tools in the open.

On this point of organic development: Torvalds, known for being particularly blunt, has made some interesting observations over the years about the benefits of open-source development. I wouldn’t use the same words or tone, but the point is quite clear, and it leans into the same ‘evolutionary arms race’ mentioned in OSA #3.

We humans have never been able to replicate something more complicated than what we ourselves are, yet natural selection did it without even thinking. Don't underestimate the power of survival of the fittest. And don't ever make the mistake that you can design something better than what you get from ruthless massively parallel trial-and-error with a feedback cycle.

Linus Torvalds, December 1st 2001, Coding style, a non-issue

As many of the software and design barriers to developing prototypes begin to fall, thanks to improvements in LLMs and various prototyping tools, the value of relevant ideas and of rapid feedback cycles grounded in a genuine understanding of pain points is increasing. I stumbled upon Amery Drage’s recent post on Twitter, where he had developed a rock-monitoring app for farmers that rewarded you with a ‘beer’ for picking the rock up. I don’t know if he used LLMs, but coding tools are helping ideas like that one become reality, with faster iteration and field testing even at the prototype level.

The flow of ideas from start to scalable product. LLMs are helping ideas become prototypes faster, but data access and scalable design remain barriers. Diagram designed with resources from Flaticon.com

And what’s the fastest way to keep the bucket full of ideas and massively speed up your feedback cycle? Well, open source of course.

Fun Open-Source Find: One Smart Spray

A fun find for the week has been the Open Source Software page of the One Smart Spray system (the Bosch, Xarvio, BASF joint venture). Here they list the licenses of all the open-source software used in the system. For example, they use TensorFlow (not PyTorch), LiteOn cameras (judging by the LiteOn camera drivers) and OpenCV.

Many permissive licenses, such as MIT and BSD, require some form of attribution and the inclusion of the original license documentation somewhere with the product sold (website, manual, user interface etc.). A total of 30 different pieces of software are used in the Field Camera Unit (FCU) and the flashing tool, some of which are listed below.

| Software | License | Purpose |
| --- | --- | --- |
| TensorFlow-Lite | Apache-2.0 | Machine learning framework for on-device inference (e.g., optimizing spray patterns) |
| LiteOn Camera Driver | GPL-2.0-only | Driver for LiteOn camera hardware integration (e.g., capturing field images) |
| OpenCV | BSD-3-Clause | Computer vision library (e.g., processing camera images to detect crops or weeds) |
| FreeRTOS with NXP adjustments | MIT, BSD-2-Clause, BSD-3-Clause, Apache-2.0, Motorola | Real-time operating system for embedded devices (e.g., controlling the FCU in real time) |
| Libeigen | Mozilla Public License 2.0 | Linear algebra library (e.g., matrix operations for image processing or ML models) |
| OpenCL API | MIT | API for parallel computing (e.g., GPU acceleration for image processing or ML inference) |
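As an aside, if you’re curious how a team might assemble an attribution page like this, much of the raw material can be pulled automatically from package metadata. Here’s a minimal Python sketch (my own illustration, standard library only; a real product would use dedicated license-scanning tooling across all its languages):

```python
# Minimal sketch: list the name and declared license of every installed
# Python distribution. Illustrative only; a production compliance process
# would scan all languages and bundle the full license texts.
from importlib.metadata import distributions

for dist in sorted(distributions(), key=lambda d: (d.metadata["Name"] or "").lower()):
    name = dist.metadata["Name"] or "unknown"
    license_id = dist.metadata["License"] or "unknown"
    print(f"{name}: {license_id}")
```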

Guest article: Net Gains or Pitfalls? Open-Source Insect Monitoring

Amber Balfour-Cunningham is a PhD student at the University of Western Australia investigating the natural enemies of insect pests in canola. As an insect expert, Amber kindly agreed to write about the field of insect detection and share a few open-source insect monitoring projects. You can follow Amber’s work on Twitter, and you can find the projects she mentions over on the OpenSourceAg repository or through the links below.

Automated insect monitoring is the buzz in ‘Next-Gen’ Integrated Pest Management (IPM), but it’s not all smooth flying. Identifying tiny, fast-moving critters that look suspiciously alike is no walk in the field (Luke et al., 2023). And while smart traps sound, well, smart, they sometimes catch more than they bargained for—a whole heap of bycatch that even seasoned entomologists would struggle to sort. Many of these insects may be unknown to science, unique to a particular region, season, or landscape, and influenced by the surrounding crops. Plus, just because a smart trap detects a pest insect doesn’t necessarily mean it’s time to panic.

Even when we do set up smart camera traps and capture a suspect, getting an ID can be difficult. Some insect species require a rather, uh, intimate examination—yes, dissecting moth genitalia in the case of armyworm moths (Manderfield, 2022). AI-powered traps may be the future, but they still face significant challenges in making precise identifications, especially for cryptic species.

Despite these challenges, there is a huge market for smart traps. These tools provide continuous, real-time data on both pest and beneficial insect activity, helping growers and advisers make faster, data-driven pest management decisions. StickyPi and Insect Detect are two cost-effective, DIY smart traps that operate autonomously. Their ability to track insect activity patterns over time offers new insights into insect behavior that traditional trapping methods might miss.

StickyPi: a high-frequency smart insect trap to study daily activity in the field

StickyPi is a low-cost (~$400 AUD) smart trap, tested in Canadian raspberry and blackberry fields. It continuously monitors insect activity throughout the day, providing valuable data on both pests and beneficial insects (Geissmann et al., 2022).

Sticky Pi, a high-frequency smart insect trap to study daily activity in the field (Geissmann, 2023).

Key Features of StickyPi:

  1. Uses a Raspberry Pi-based system with a camera and environmental sensors to capture high-frequency images of insects on sticky cards as well as temperature and humidity data.

  2. The Universal Insect Detector (UID), based on Mask R-CNN, automatically detects and segments insects from images, while the Siamese Insect Matcher (SIM) tracks individual insects across frames.

  3. Captured images are retrieved by a Data Harvester, incrementally uploaded to a centralized database, and stored on an S3 server, with real-time visualization available through an RShiny web application.

  4. Solar-powered and autonomous, with an optional olfactory bait system (e.g., apple cider vinegar) to attract specific insect species.
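To make the capture side concrete, here is a hypothetical sketch of the kind of timed, timestamped capture loop at the heart of a trap like this (my own illustration using OpenCV, not Sticky Pi’s actual code; the real implementation, including the UID and SIM models, lives in the project’s repository):

```python
# Hypothetical smart-trap capture loop (illustrative only; not Sticky Pi's
# actual code). Saves a timestamped frame at a fixed interval, ready for a
# data harvester to upload later.
import time
from datetime import datetime, timezone

import cv2  # pip install opencv-python

CAPTURE_INTERVAL_S = 600  # assumption: one image every 10 minutes

def capture_loop(device_index=0, out_dir="."):
    cam = cv2.VideoCapture(device_index)
    try:
        while True:
            ok, frame = cam.read()
            if ok:
                stamp = datetime.now(timezone.utc).strftime("%Y%m%d-%H%M%SZ")
                cv2.imwrite(f"{out_dir}/{stamp}.jpg", frame)
                # A real trap would also log temperature/humidity here and
                # queue the image for incremental upload to the database.
            time.sleep(CAPTURE_INTERVAL_S)
    finally:
        cam.release()

if __name__ == "__main__":
    capture_loop()
```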

Insect Detect: An open-source DIY camera trap for automated insect monitoring

Insect Detect is a non-lethal, AI-powered camera trap first tested in German fruit orchards to monitor hoverflies, a key pollinator group. It provides real-time detection and classification, enabling non-destructive insect monitoring (Sittinger et al., 2024).

Insect Detect DIY camera trap. The solar-powered DIY camera trap can be used for continuous automated monitoring of flower-visiting insects. Source: Sittinger et al. (2024)

Key Features of Insect Detect:

  1. Uses an OpenCV AI Kit (OAK-1) with a 12MP image sensor for AI inference, paired with a Raspberry Pi Zero 2 W for data processing and storage.

  2. Powered by two rechargeable batteries (~91Wh total) connected to a 9W solar panel, housed in a weatherproof enclosure mounted on a post.

  3. Uses YOLOv5n, YOLOv6n, YOLOv7-tiny, and YOLOv8n deep learning models for real-time insect detection and tracking.

  4. Captures high-resolution images (1080p/4K), tracks insects using a Kalman Filter-based object tracker, and saves metadata to an S3 server with real-time visualization via an RShiny web application.
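As a rough illustration of the detection step (not Insect Detect’s own code, which runs its models on the OAK-1 via DepthAI), here is what YOLO-style inference over a saved trap image can look like with the ultralytics package; the weights file, image path and confidence threshold are all assumptions:

```python
# Hypothetical illustration of YOLO-style detection on a saved trap image,
# run on a desktop with the ultralytics package. Insect Detect itself runs
# inference on the OAK-1 camera; see the project repository for real code.
from ultralytics import YOLO  # pip install ultralytics

model = YOLO("yolov8n.pt")  # pretrained nano weights; a trap would use custom insect weights
results = model("trap_image.jpg", conf=0.5)  # assumed image path and threshold

for result in results:
    for box in result.boxes:
        x1, y1, x2, y2 = box.xyxy[0].tolist()
        print(f"class={int(box.cls)} conf={float(box.conf):.2f} "
              f"bbox=({x1:.0f}, {y1:.0f}, {x2:.0f}, {y2:.0f})")
```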

The Problem of Taxonomic Knowledge Gaps for Insects

Accurate insect identification remains a global challenge in entomology. For instance, if an article about lions mistakenly featured a photo of a tiger, most readers would catch the mistake. But if an article about bees had a photo of a hoverfly? Most people wouldn’t bat an eye (Valan et al., 2019). The internet is full of beautifully written articles about bees… illustrated with the wrong insect. A famous book about native bees once had a fly on its cover. Oops.

This highlights the need for better insect taxonomic education and more comprehensive training datasets for both humans and AI models.

Some taxonomists argue that AI will never fully replace human expertise, as many species require microscopic or genetic analysis to distinguish them accurately (Luke et al., 2023). In fact, the demand for insect taxonomists is likely to increase with the need for accurate insect image databases to train AI models.

Example: Identifying Pest Heliothis moths

In Western Australia, multiple moth species appear visually similar, yet only a few are serious crop pests. The following six species are all found in Western Australia, but only three are considered agricultural pests. Even entomologists can struggle to differentiate these species visually without dissection or close inspection.

All of these moths are different species found in Western Australia. Only three are crop pests. Top left: Australothis rubescens; top right: Australothis exopisso; middle left: Helicoverpa armigera (Cotton bollworm); middle right: Helicoverpa punctigera (Native budworm) female; bottom left: Helicoverpa hardwicki; bottom right: Heliothis punctifera (Lesser budworm). Image sources: Coffs Harbour Butterfly house, BOLD Systems, Lepiforum

Highlighting the challenge further, the photos above do not show the differences between male and female moths of the same species, the colour and pattern variation within a species across different habitats, or the differences between moths of different ages. It’s a hard task!

The Future of AI-Powered Insect Monitoring: Cautious Optimism

Insect monitoring is evolving, and open-source tools like StickyPi and Insect Detect are leading the charge. But these tools are only as good as the data they’re trained on, and as the field validation linking trap captures to what is actually happening in the crop.

So, am I going to try building one of these open-source smart traps? Absolutely. But I’ll be checking its work carefully and creating custom annotated image libraries with IDs confirmed by taxonomists.

DIY AI: Do We Need a ‘For Dummies’ Guide?

I think so! I feel there need to be clear, step-by-step guides, infographics, training workshops and short videos to help the less tech-savvy, “anti-instruction manual” farm advisers and researchers set up and use these traps effectively. Graphic designers, videographers and tech experts, please collaborate!

I really appreciate Amber’s insights here, and her taking the time to share some open-source insect monitoring work.

Interesting Reads

TimberVision: Forestry dataset

  1. TimberVision: An open-access dataset for forestry (Steininger et al. 2025) | Paper | GitHub

From the same group in Austria that developed the CropAndWeed dataset, with over 100k instances of crops and weeds across 74 classes, comes the TimberVision dataset for the forestry industry. The dataset has over 2,000 annotated images with 51,000 annotated trunk components. Well worth a look if you’re interested in training models in this space. Either way, it’s an impressive dataset to follow on from such a large annotated crop/weed set.

If you’re interested in other datasets and open source projects, you can find them in the OpenSourceAgriculture repository.

Example images with annotations from the TimberVision dataset out of the Austrian Institute of Technology (Steininger et al. 2025)

Research: Targeted application field trials

  1. John Deere See & Spray: Comparing herbicide application methods with See & Spray™ technology in soybean (Avent et al., 2024) | Paper

This article from November 2024 compared targeted application technology (using the John Deere See & Spray system) with broadcast herbicide application for weed control in soybean in Arkansas and Mississippi. They found that targeted application (spot spraying) across all site-years was comparable to broadcast application, with less than a 1% decrease in efficacy, providing over 93% control of key weeds in the region, including Palmer amaranth, broadleaf signalgrass, morningglories and purslanes.

The upside is that the technology saved between 28.4% and 62.4% of total herbicide used. Perhaps somewhat surprisingly, there is limited research on the herbicide savings of commercial systems (some other articles are linked below), so it is great to see replicated, randomised work such as this.

Worth noting, though, that these results are specific to one piece of technology at one point in time—as models develop, setups evolve and hardware changes, these results become less relevant. But it is important to have that benchmark to begin with.

You can learn more about the test platform and some research from Virginia Tech with Michael Flessner and Wyatt Stutzman in the Get Rid of Weeds Network (GROW) video here.

The Agronomy Test Machine provided by John Deere to Virginia Tech researchers. These researchers were not involved in the study presented above. Image source: Claudio Rubione, GROW

  2. Ecorobotix ARA: The reduction of chemical inputs by ultra-precise smart spot sprayer technology maximizes crop potential by lowering phytotoxicity (Anne et al., 2024) | Paper

The Ecorobotix ARA precision sprayer is a Swiss development offering more targeted application than boom-mounted cameras. It uses both RGB and stereovision for depth perception, visible in the green boxes in the figure below. Summarising their 2023 season, the article suggests that, on average, 78.9% of herbicide was saved across 8,266 ha of onion fields with 53 machines. Results across 2,000 ha of sugar beet were similar, with 79.8% of the herbicide volume saved. This study is not as comprehensive or as replicated through time as the previous work, so it will be interesting to see whether the results persist across different seasons. However, the scale of the data does give weight to these figures.

An overview of the ARA system from Anne et al. (2024) showing the vision system and dense nozzles.

  3. One Smart Spray: Smart sprayer a technology for site-specific herbicide application | Paper

This third targeted application research article works with the Bosch x Xarvio x BASF joint venture development—”One Smart Spray”. This is a system for larger-scale agriculture and uses an RGB sensor with an R/NIR filter, helping to segment plants.

The authors report savings of between 10% and 55% across both plot and field trials. Weed control efficacy is summarised below—there was no reliably significant decline in weed control efficacy between the blanket and targeted treatments, though there are some indications that would warrant further investigation. The large error bars are indicative of the variability of these systems outside ideal lighting, species and crop conditions.

Final thoughts

This one has turned into quite the long newsletter, so I’ll keep the final thoughts brief. There are many personal and self-serving benefits of building in the open. It’s worth having a crack with your next project, or dipping your toes in as a business.

Thank you, as always, for reading the newsletter, and for the comments, emails and interactions.

Until the next edition!

Cheers,
Guy
