Oz and the gift of stereoscopic sight and odometry
Our Oz weeding robot has recently been equipped with two small cameras and now has the gift of sight! Oz still uses its laser for guidance, but its stereo camera eyes now help it properly analyze its environment. It can now find low crops all by itself and decide whether to follow crop rows, follow crop covers or turn around… Joan and Robin from our R&D department explain in detail what this means.
Could you explain the implications of 3D vision for Oz?
First of all, it shows: Oz has two eyes! The weeding robot now understands and analyzes what it sees. It has depth perception, just like humans, but for robots, we call this stereoscopy. Stereo vision has some very practical applications for Oz: it helps in detecting markers, stakes or low crops, in following crop rows or crop covers and in deciding when it’s time to turn around and move to the next row.
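As a rough illustration of what stereoscopy means in practice (this is a textbook pinhole-stereo sketch, not Oz's actual software, and the focal length and baseline values are made up), depth can be triangulated from the disparity, i.e. how far a feature shifts between the left and right images:

```python
def depth_from_disparity(disparity_px, focal_px=700.0, baseline_m=0.10):
    """Triangulate depth from the pixel shift of a feature between the
    left and right camera images (pinhole stereo model).
    focal_px and baseline_m are hypothetical example values."""
    if disparity_px <= 0:
        return float("inf")  # no shift: feature is effectively at infinity
    return focal_px * baseline_m / disparity_px

# A feature shifted 35 px between the two images sits about 2 m away.
print(depth_from_disparity(35.0))  # → 2.0
```

The key point is the inverse relationship: nearby obstacles produce large disparities, distant ones small disparities, which is exactly what lets the robot tell a close stake from a far one.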
To help Oz turn around at the end of a row, we apply a red color filter that isolates this color in the images, allowing the robot to spot the red stakes placed at the end of each row. Oz can analyze color and, thanks to stereo vision, it can decide to turn around even when there are crops or obstacles behind the stakes.
For low crops, we use green filters. Even though concentrations may vary, all crops contain chlorophyll, so there's always some green involved! Oz detects this characteristic to see where crop sprouts are located.
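The color filtering described above can be sketched as a simple dominance test on RGB pixels. This is a minimal illustration, not Oz's real pipeline, and the threshold margins are invented for the example:

```python
def is_red_stake(pixel, margin=60):
    """True if the pixel is dominantly red (e.g. a red stake).
    The margin is an illustrative threshold, not a real Oz value."""
    r, g, b = pixel
    return r - max(g, b) > margin

def is_green_sprout(pixel, margin=40):
    """True if the pixel is dominantly green (chlorophyll)."""
    r, g, b = pixel
    return g - max(r, b) > margin

image = [(200, 30, 25), (40, 150, 50), (120, 115, 110)]  # red, green, grey
print([("stake" if is_red_stake(p) else
        "sprout" if is_green_sprout(p) else "other") for p in image])
# → ['stake', 'sprout', 'other']
```

In a real system this kind of test would run over whole camera frames and be combined with the stereo depth of each detection, so a red tractor in the next field doesn't trigger a turnaround.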
Another important feature we added is that Oz has to be able to follow crop covers. For this application, we don't use color filters, as crop covers change color depending on reflection, luminosity, rain or even the presence of dirt… To follow a crop cover, Oz looks straight ahead and widens its camera range to detect differences in the environment, in this case the crop cover. It then lines up a series of dots to define a trajectory, guided by the difference between what it sees in the center and on the sides. But that's not all: Oz can now also follow crop covers that are dirty, old, covered in soil or dotted with water puddles.
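The "series of dots" idea can be sketched as follows: for each image row, find the column where the contrast between the left-hand and right-hand neighborhood is strongest (the cover's edge), and express it as an offset from the image center. This is a hypothetical simplification of the approach described above, not the real algorithm:

```python
def cover_offset(row, band=3):
    """For one image row (a list of brightness values), locate the column
    where left/right contrast is strongest and return its offset from the
    image center. Illustrative only."""
    best_col, best_diff = len(row) // 2, 0
    for c in range(band, len(row) - band):
        # difference between what lies just left and just right of column c
        diff = abs(sum(row[c - band:c]) - sum(row[c:c + band]))
        if diff > best_diff:
            best_col, best_diff = c, diff
    return best_col - len(row) // 2  # negative: steer left; positive: right

def trajectory(rows):
    """Line up one detected dot per image row into a steering trajectory."""
    return [cover_offset(r) for r in rows]

# A bright cover edge exactly at the image center gives offset 0.
print(cover_offset([0] * 6 + [10] * 6))  # → 0
```

Because the test only cares about contrast, not a specific color, it keeps working when the cover is dirty or wet, which matches the behavior described above.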
Until now, Oz couldn’t see. It used a laser for guidance but couldn’t detect anything shorter than 10 cm in height. It therefore didn’t make the distinction between different types of low crops or detect asperities. Lasers actually design a sort of map that only contains sections located at 10 cm above the surface. However, lasers very precise when it comes to measuring distance. We hang on to our lasers because some of our customers use stakes and have our robots weed dense crops at night, which is when our lasers prove extremely useful. The cameras and lasers are very complementary and both have their use in helping our farmers!
Another new feature: before Oz had eyes, it couldn't follow crop covers laid out flat on the ground because it simply couldn't see them. Same problem with turnarounds: Oz sometimes had trouble detecting the end of a row because it couldn't tell the difference between tractor tracks and rows, or between weeds and crops. The laser would tell Oz to move forward as long as it detected elements at the side of the row. This is where the red stakes come in: they mark the end of rows and send a visual cue. Today, when Oz sees a red stake, it turns into the next row, whether the crops continue or not.
The red stakes can also be used to guide re-entry. When there are a lot of weeds, bumps, obstacles or when fields have a certain slope, the red stakes allow Oz to aim for the middle, between the stakes, to make sure it enters the row at the right spot.
Oz now also uses visual odometry. Could you explain?
Today, in fields that respect the general conditions of use and present no particular challenge, mechanical odometry, which measures distance by counting wheel rotations, already yields excellent results. However, we decided to opt for visual odometry in order to make it possible for the weeding robot to make reliable turns in extreme situations, i.e. situations that wouldn't normally comply with the general conditions of use. If Oz can handle very complex situations, simple ones are definitely not going to be an issue.
The cameras allow Oz to measure the distance it has travelled without being thrown off by slipping wheels. When Oz turns around on a sloping field or on slippery soil, mechanical odometers are likely to give false measurements on re-entry. Visual odometry lets Oz take successive pictures of its surroundings to check its actual progress through the field.
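The contrast between the two odometry methods can be sketched in a few lines. The wheel diameter and pixel-to-meter conversion below are hypothetical, and real visual odometry tracks many features across full images; this only shows why wheel slip fools one method and not the other:

```python
import math

WHEEL_DIAMETER_M = 0.30  # hypothetical wheel size

def mechanical_distance(wheel_revolutions):
    """Mechanical odometry: one wheel turn = one circumference travelled.
    Overestimates the true distance whenever the wheels slip."""
    return wheel_revolutions * math.pi * WHEEL_DIAMETER_M

def visual_distance(feature_shifts_px, meters_per_px=0.005):
    """Visual odometry sketch: sum how far ground features moved between
    successive images (the conversion factor is a made-up example)."""
    return sum(feature_shifts_px) * meters_per_px

# On slippery soil the wheels spin 12 times (~11.3 m on the counter),
# while the images show the ground actually moved only ~10 m.
print(round(mechanical_distance(12), 2))   # → 11.31
print(visual_distance([400] * 5))          # → 10.0
```

The images measure what the ground actually did, not what the wheels did, which is why the visual estimate stays reliable on slopes and wet soil.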
How do farmers benefit from all this?
Our farmers draw up a list of specifications before they set Oz to work: they enter specific data about their farm and plots into Oz before the robot starts to weed and hoe. In the past, when fields had complicated layouts or ran into weather problems, the farmer had to explain all that to Oz too, which could make things a bit complicated.
Now, once Oz finishes a row that is marked with red stakes, the weeding robot simply turns around by following user specifications. This makes Oz more reliable and easy to use, even under extreme circumstances. It also reduces the amount of data the farmer needs to enter into the Oz robot before it can function.
This being said, Oz can only do a perfect job if it has the right data. It remains essential for the farmer to set it up correctly before sending it out into the field; it would be a shame to miss a turnaround over a simple detail. However, 3D vision and visual odometry now enable the robot to overcome weather-related problems and ensure reliable turnarounds without taking up the farmer's time.
Oz’s stereo camera vision now also allows it to tackle plots it couldn’t handle before and to start work right at the beginning of a row, which opens up new possibilities. And we’re keeping up the good work!
What other R&D projects are you working on?
Presently, we’re working on a camera-guided robot for vineyards that detects vines, stalks and vine shapes. Vines are complex plants, and their specific position and shape allow automated mechanical treatment. We’re also thinking about using a multispectral camera to measure hydric stress levels in plants. Oz or its descendants could then also water thirsty plants by geo-tracking them.