Greg Crutsinger – senseFly

Greg Crutsinger, PhD, came from an academic research background but has spent recent years serving in various roles in the development and sales of drones and drone-related technology for education, agriculture, and mapping at 3DR, senseFly, and Pix4D. He is currently the owner of Scholar Farms, a training and consulting business with popular programs on mapping protocols, workflows, and plant mapping using drones.

What are some of the interesting educational programs that are developing curriculum around drones?
I’ve worked with a wide range of academic partners. Although it is still early days for a lot of university drone programs, there are some clear leaders. Duke Marine Lab is pioneering drone use for the environmental sciences and coastal monitoring. They work everywhere from the east coast of the U.S. down to Antarctica and probably lead the world in number of eBee flights. Sinclair Community College in Dayton, Ohio is leading the way for two-year colleges. They have been developing their UAS program much longer than the vast majority of four-year schools. They are also developing great partnerships with local businesses and public agencies in the state. It’s really impressive for a small college. Those are just two examples, but more are popping up all the time.

How should new companies be thinking about working with educators as they bring new technology in the drone industry to market?
The future of education and training really lies at the intersection of academia and industry partnerships. The technology and analytics are moving so fast that it’s difficult for universities to keep pace. Most don’t have the budget or the incentives to continuously stay on the bleeding edge. There are too many committee meetings to attend, long grant funding cycles, and lots of distractions from basic research. I know because I left a tenure-track job at a top research university to join a drone startup.

At the same time, the industry needs the workforce that colleges and universities produce and they need them trained in the right ways. The key to this is a constant dialogue between the two, which is what I have tried to facilitate over the past few years in the drone industry. I get frustrated by the slow pace of academia, but at the same time, I still strive to help out those who want the assistance. I’m still an academic at heart I guess.

What would be the most valuable hardware or software technology a technology company could develop to assist with research using drones?
Both the hardware and the processing software have improved dramatically over the past few years. We are starting to see some maturing in the industry and a focus on particular verticals. It may seem mundane, but better batteries are going to be essential to the long-term success of the commercial and research industry. The limitations of flight times are a big issue, and I bet we will look back and laugh at how many batteries we used to lug around.

In terms of software for research, improvements could be directed toward the ability to merge data layers. That’s not just a need for drone data, either. It’s merging satellite imagery, weather data, ground sensors, and more. How do you compile all of this information, summarize it, and make decisions based upon it? It’s a challenge for sure…but it’s a fun one to have.
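As a minimal sketch of what merging layers can mean in practice, the snippet below joins hypothetical per-plot drone survey summaries with daily weather records by date using pandas. All the column names and values here are invented for illustration, not from any particular product.

```python
import pandas as pd

# Hypothetical per-plot summaries from drone flights (values made up).
drone = pd.DataFrame({
    "date": pd.to_datetime(["2019-06-01", "2019-06-08"]),
    "plot": ["A", "A"],
    "mean_ndvi": [0.62, 0.68],
})

# Daily records from a nearby weather station (also made up).
weather = pd.DataFrame({
    "date": pd.to_datetime(["2019-06-01", "2019-06-08"]),
    "rain_mm": [0.0, 12.5],
})

# Align the two layers on date so each flight carries its weather context.
merged = drone.merge(weather, on="date", how="left")
print(merged)
```

Real pipelines add spatial joins, resampling, and unit handling on top of this, but the core idea is the same: get every layer onto a shared key before summarizing.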

What are the industry segments in agriculture that have adopted and are benefitting most from using drones for precision agriculture?
It’s interesting: ag tech is having its own sort of revolution that has many parallels to the drone industry, including a lot of vaporware and hype.

While there has been much excitement surrounding drones for agriculture, the reality is that the return on investment is still to be determined for many Ag applications. In the U.S., the restriction of maintaining line of sight creates a limitation on the size of farms that can be flown in one go. Image processing and data management also take time.

Right now, drone technology has the highest benefit and return at smaller spatial scales with high value, which includes academic research, commercial test plots, and high-value crops. In California, where I live, this would be vineyards, berries, almonds, and most recently cannabis.

What do you think is the ideal hardware and software setup for gathering data for precision ag?
If you can get away with a multirotor, it makes takeoffs and landings a lot easier. Then it’s about the sensors. There isn’t a great combined RGB and multispectral sensor out yet. What would be ideal is a DJI Matrice 210 with dual cameras: a high-resolution color camera and a multispectral camera like the new MicaSense RedEdge-M flown in tandem. You could also get by with this dual setup on a Matrice 100, but it’s less of a finished product and more of a developer tool.

For fixed wing, though, I like the eBee Plus. They are not cheap, but they work and have been well tested in a whole range of applications. I would swap between the 20 MP SODA RGB camera and either a RedEdge-M or a Parrot Sequoia, depending on the resolution of multispectral data I needed.

That’s the ideal hardware to me, but obviously you could get by with lower cost options if needed.

For software, I would use Pix4D desktop software, since many agricultural folks are in rural areas without high-speed internet for cloud processing. Then I might supplement by pushing some of the finished maps to the cloud. MicaSense Atlas has some nice visualization tools and a range of vegetation indices. Using the cloud also makes it so much easier to share data with partners, clients, or collaborators.
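For readers unfamiliar with the vegetation indices mentioned above, the most common one, NDVI, is just simple band arithmetic on the multispectral imagery. Here is a minimal NumPy sketch with toy reflectance values (not real sensor data); production tools apply radiometric calibration first.

```python
import numpy as np

def ndvi(nir, red, eps=1e-9):
    """Normalized Difference Vegetation Index: (NIR - Red) / (NIR + Red)."""
    nir = np.asarray(nir, dtype=float)
    red = np.asarray(red, dtype=float)
    # eps guards against division by zero over water or shadow pixels.
    return (nir - red) / (nir + red + eps)

# Toy reflectance values: healthy vegetation reflects strongly in NIR.
nir_band = np.array([0.50, 0.45, 0.10])
red_band = np.array([0.08, 0.10, 0.09])
print(ndvi(nir_band, red_band))  # higher values suggest denser, healthier canopy
```

Other indices (NDRE, GNDVI, and so on) follow the same pattern with different band pairs, which is why a camera’s available bands matter so much.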

What are the barriers to mass adoption of drones within agriculture?
First, cost. Multispectral cameras aren’t cheap (though it’s amazing what you can do with just color imagery from the camera on a Phantom 4 Pro for ~$1,500).

Then, data interpretation. A lot of ground-truthing is still needed to understand the relationships between the imagery, the plants, all the other field information being collected and the decisions to be made. We know quite a bit about a few crops but there is much more work to be done. It’s a fantastic time to be looking for a senior project or a master’s thesis.

When do you think we’ll see a “tipping point” in adoption of drones for agriculture?
People want simple answers and, unfortunately, it’s still not simple.

We have different drones flown at different heights, speeds, and times of year with different cameras over different varieties of different crops in different locations processed with different software and visualized differently.

‘It depends’ is not a very satisfying answer, but it’s where we are at in the near term. It’s going to be a while before we have set standards most people can follow and that’s what mass adoption is going to take.

In your opinion, what are the sensors that aren’t in mass-production on drones now that would be most valuable for companies to develop and integrate with drones for commercial use?  Are these similar or different to sensors for environment or research purposes?
I’m all about data capture and sensors. I think lightweight RGB (color) cameras are improving all the time, with amazingly high resolution and global shutters for mapping. So I think we will just continue to improve on these.

For narrower uses in agriculture or the geospatial sciences, the benefits would be in higher-resolution multispectral cameras that provide radiometrically accurate data for mapping vegetation. There are a couple of options out there at the moment, but there is definitely room for improvement. Similarly, hyperspectral cameras are still limited to rather large research budgets and very specific applications.

To me, the thermal camera market seems prime for disruption. There are still relatively few options, and the cameras get expensive quickly. With so much interest in thermal data for industrial applications, I am hopeful there are innovative products on the horizon.

I get asked a lot about LiDAR, and there are still few fully integrated LiDAR options. Most require piecing together companion computers, data storage, and rather large drones. This can end up costing a fortune. I am hopeful costs will continue to come down with so much interest in LiDAR for autonomous navigation (e.g., drones, cars, robots) and that some of that technology will cross over to mapping.

You’ve worked in depth at Parrot and Pix4D on state-of-the-art tools for mapping. What are some features that you would love to see integrated into mapping software in the future?
Near term, it would be great to see a deeper connection between the imagery and quantitative metrics. As much effort as we put into drone technology and capturing data, the end result is often just a pretty map and not rigorous analytics. I’d like to see more numbers attached to those pretty maps so we can better understand the links to the plants, nutrients in the soil, water use, next year’s yields, and so on.
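One common way to attach numbers to a map is zonal statistics: summarizing an index raster per management zone. The sketch below shows the idea with a tiny invented NDVI grid and zone labels; real workflows would read georeferenced rasters instead of hard-coded arrays.

```python
import numpy as np

# Toy 3x3 NDVI raster and a matching zone-label raster (values invented).
ndvi_map = np.array([[0.2, 0.2, 0.8],
                     [0.2, 0.8, 0.8],
                     [0.2, 0.8, 0.8]])
zones = np.array([[1, 1, 2],
                  [1, 2, 2],
                  [1, 2, 2]])

# Zonal statistics: boil the pretty map down to one number per zone.
zone_means = {int(z): float(ndvi_map[zones == z].mean())
              for z in np.unique(zones)}
print(zone_means)  # e.g. a struggling zone stands out immediately
```

From there the per-zone numbers can feed into variable-rate prescriptions or get tracked over the season, which is exactly the imagery-to-metrics link being described.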

Farther out, my hope is that better onboard computers will allow for real-time processing. It would be great to reduce the number of tiny SD cards we are pulling out of cameras and losing in the grass, as well as the hours and days of post-processing. At the pace of innovation, this might not even be that far away.

© 2019 Guinn Consulting LLC
All Rights Reserved.
