Project by: Stephen Frame, Bauke Hendriks, Maurice Van Beijnen, Milad Habibi, Mitchel Mulder.
Introduction
Our project comes from Hazeu, a local company in Delft that grows and sells a variety of orchids (35 types in 6 colours) to retail establishments including supermarkets and florists. Hazeu's production rate is about 100,000 units per week.
In the current process, the plants are delivered as saplings to Hazeu, where they mature, are organized by customer order, packaged, and then sent off. The process is mostly done by manual labourers, so there is opportunity for automation.
Assignment
The goal of the project was to automatically recognise the type of orchid and to automate the process of putting the plants in a sleeve. Initially, the scope included distinguishing different types of flower of the same colour, but it was later reduced to recognising colour only, without other identifying characteristics, while continuing with the sleeving automation.
Solution
After some consideration, the decision was made to create three interacting systems:
- A conveyor belt with a light box and camera, to make colour identification quick, automatic and relatively consistent.
- A sleeving system, to put the plastic sleeves on the plants.
- A robot arm, used for moving the plant from the conveyor to the sleever and from the sleever to the “output” area, where the plants are placed in different locations depending on colour.
The whole system is controlled by a Programmable Logic Controller (PLC).
Vision Process
The plants travel through the light box on the conveyor, single file on a plant carrier. When a plant enters the light box it trips a sensor underneath the camera; the software tells the PLC that the plant is in position and the camera takes a picture. The plant then continues to the end of the conveyor and stops when it hits a second sensor, which tells the PLC the plant is ready to be picked up by the robot arm. Once the plant is picked up, the conveyor continues to run. The recognition is done as follows:
The orchid flowers in the sample were either white, yellow, pink or purple. Under the camera on the conveyor is a light barrier that signals when the camera needs to capture a frame.
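For illustration, below is a minimal sketch of this capture step, assuming the frame is grabbed with OpenCV; the light-barrier read is a hypothetical placeholder, since in the real set-up the signal goes through the PLC:

import cv2

def light_barrier_blocked() -> bool:
    # Hypothetical stand-in for the light-barrier input relayed by the PLC.
    # Here the operator confirms manually; the real system reads a digital input.
    return input("Plant under the camera? [y/N] ").strip().lower() == "y"

def capture_frame(camera_index: int = 0):
    # Grab a single frame once a plant interrupts the light barrier.
    cap = cv2.VideoCapture(camera_index)
    try:
        while not light_barrier_blocked():
            pass  # keep polling until a plant is in position
        ok, frame = cap.read()
        if not ok:
            raise RuntimeError("Camera returned no frame")
        return frame
    finally:
        cap.release()

if __name__ == "__main__":
    cv2.imwrite("plant.png", capture_frame())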
The image is then processed using Python and OpenCV: the colours are filtered so that the picture shows only one colour, in this case purple.
The Python script then makes a binary image of the filtered image: everything that is not pitch black becomes white.
The script counts the number of white pixels in the picture and saves the value.
This is repeated with a pink, yellow and white filter. Once the numbers of white, yellow, pink and purple pixels are known, the highest count determines the colour of the plant.
This is done using the HSV colour space.
In the RGB colour space the output colour is a mixture of the intensities of red, green and blue. In the HSV colour space the colour itself is determined by a single value, Hue; the other two values describe the colour intensity (Saturation) and the brightness of the colour (Value).
This also makes it easy to separate colours that look alike, such as yellow and green or purple and pink.
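As an illustration of this approach, the sketch below filters the image in HSV for each of the four colours, counts the white pixels in each binary mask, and picks the colour with the highest count. The HSV ranges and the file name plant.png are illustrative assumptions, not the thresholds used in the project:

import cv2
import numpy as np

# Illustrative HSV ranges for the four flower colours (OpenCV uses H in 0-179,
# S and V in 0-255); the real thresholds would be tuned on light-box images.
COLOUR_RANGES = {
    "white":  ((0,    0, 180), (179,  60, 255)),
    "yellow": ((20,  80,  80), (35,  255, 255)),
    "pink":   ((160, 60, 120), (179, 255, 255)),
    "purple": ((125, 60,  60), (155, 255, 255)),
}

def classify_flower_colour(bgr_image):
    # Convert to HSV, build one binary mask per colour, and count white pixels.
    hsv = cv2.cvtColor(bgr_image, cv2.COLOR_BGR2HSV)
    counts = {}
    for name, (lower, upper) in COLOUR_RANGES.items():
        mask = cv2.inRange(hsv, np.array(lower), np.array(upper))
        counts[name] = int(cv2.countNonZero(mask))
    # The filter with the most white pixels determines the plant's colour.
    return max(counts, key=counts.get)

if __name__ == "__main__":
    print(classify_flower_colour(cv2.imread("plant.png")))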
Sleeving Process
When the robot has picked up the plant from the conveyor, it places it into a feeder that sits above the sleeving frame, which consists of one moving plate with suction cups to grab a plastic sleeve and one static plate that holds the sleeves. When the plant is placed into the feeder, it lands on a drop plate above the sleeves; the plate then moves back and lets the plant fall into the opened sleeve, ready to be picked up by the robot and placed on the table, sorted by colour.
The trap door and the plate on which the suction cups are mounted are operated by pneumatic pistons: the trap door by a single piston and the plate by two pistons, which together create a more complex movement pattern that opens the sleeve properly and maximises the chance of a successful sleeving cycle. The pistons are driven by bi-stable air valves controlled by the PLC. Safety is ensured through a main pressure valve that can be shut off and pressure regulators that keep the overall pressure in the pistons low.
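To make the sequence concrete, the sketch below lists one possible order of valve commands for a sleeving cycle. It is only an illustration: the real controller is the PLC, the valve names are hypothetical, and the exact piston movement pattern is assumed rather than taken from the actual program:

import time

def set_valve(name, extended):
    # Hypothetical stand-in for a PLC output driving a bi-stable air valve.
    print(f"{name}: {'extend' if extended else 'retract'}")

def sleeve_cycle():
    # 1. Press the suction-cup plate against the sleeve stack and grip a sleeve.
    set_valve("suction_plate_piston_1", True)
    set_valve("suction_plate_piston_2", True)
    time.sleep(0.5)
    # 2. Pull the plate back so the gripped sleeve is pulled open.
    set_valve("suction_plate_piston_1", False)
    time.sleep(0.5)
    # 3. Retract the trap door so the plant drops into the opened sleeve.
    set_valve("trap_door_piston", False)
    time.sleep(1.0)
    # 4. Reset: close the trap door and release the sleeve.
    set_valve("trap_door_piston", True)
    set_valve("suction_plate_piston_2", False)

if __name__ == "__main__":
    sleeve_cycle()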
Major Decisions
Scope Reduction
Initially, Chiel Hazeu asked the group to recognise the type of orchid – which depends on the colour and pattern of the flower.
This would require an extensive machine learning process with a large data set. At first, the plan was to create this data set, but during the project it became clear that there were not enough resources (including time) available. The issue had been anticipated and a back-up plan was ready: the machine learning approach was dropped in favour of colour recognition alone.
Vision Set-up
The vision system was an area of the project where the group changed approach a few times. At first, when type detection was still in scope, we realised the plant would have to be rotated to guarantee a better view of the flower face, so we planned to build a roller system perpendicular to the conveyor to rotate the plant. We quickly realised the plant is too tall and light to spin, making it unsteady and unreliable, so this idea was abandoned.
Next, the group considered using multiple cameras at different angles throughout the light box so that more of the plant could be seen. This idea was also impractical: the lights in the box interfered with the pictures, and integrating all the cameras would be complex. For these reasons, the group settled on a single camera that detects the colour of the plant.
Conclusion
Running at optimal speed, the overall system takes 45 seconds to move one plant from its input on the conveyor to its final output, sleeved and sorted by colour. This means the estimated, conceptual production rate of our system is roughly 10,000 plants per week for a single conveyor and single robot arm, which is substantially lower than the current production rate at Hazeu. However, the proof of concept shows that the approach can be scaled up for use in agricultural businesses to match, and even exceed, the current production rate.
The recognition system detects the colour of the flower with 100% success, and the sleeving machine sleeves the plant with a good degree of success. Integration between the two sub-systems is handled by the UR10 robot, and the project can therefore be deemed successful, in accordance with the project agreement with our customer.