To sign up for this cluster, use Bison code L.27351.
This cluster consists of the following projects:

Automated Bin Picking with a Cobot

Bin picking is one of the most sought-after applications for cobots. Cobots, or collaborative robots, are robots intended to interact with humans in a shared space or to work safely in close proximity to them. They stand in contrast to traditional industrial robots, which work autonomously, with safety assured by isolation from human contact.

A cobot can easily be programmed to pick up an object from a fixed position. The task becomes more difficult when the objects the robot arm has to grab lie loosely in a container. By combining the robot arm with a 3D vision system, the cobot can perceive depth and recognise objects. It can then, just like an employee, take loose products from a container and add them to an assembly line or production process, handling products independently so that employees no longer have to perform these often physically demanding movements. The software needed to enable bin picking is based on machine learning: recognising objects by colour, shape and dimensions. The machine must be taught what a product looks like, from all angles.

The companies Voortman Steel Machinery and VIRO are interested in a system that can pick specific products from a bin and sort them in a different location. In this project, you will learn to program a cobot in the Robot Operating System (ROS), design a vision setup, apply artificial intelligence and build a mechanical interface to pick up the specific products. Your work will be carried out at the applied research group of Mechatronics, in close collaboration with engineers from Voortman and/or VIRO and the researchers of the group.
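To give a feel for the decision step in such a system, the sketch below picks the best candidate from a set of vision detections. It is a minimal illustration, not the project's actual software: the `Detection` fields, thresholds and the "pick the topmost object" heuristic are all assumptions.

```python
# Hypothetical sketch: choosing which detected object to pick from a bin.
# Assumes an upstream vision/ML stage has produced detections with a 3D
# position (metres, z = height above the bin floor) and a classifier
# confidence. Names and thresholds are illustrative only.

from dataclasses import dataclass

@dataclass
class Detection:
    label: str          # product type recognised by the ML model
    confidence: float   # classifier confidence, 0..1
    x: float
    y: float
    z: float            # height above bin floor; higher = less occluded

def select_pick(detections, wanted_label, min_confidence=0.8):
    """Pick the topmost confidently-recognised object of the wanted type."""
    candidates = [d for d in detections
                  if d.label == wanted_label and d.confidence >= min_confidence]
    if not candidates:
        return None  # nothing graspable: rescan the bin or stir the contents
    # Topmost objects are least likely to be blocked by their neighbours.
    return max(candidates, key=lambda d: d.z)
```

In a real setup this selection would feed a motion planner that moves the gripper to the chosen pose; returning `None` would trigger a new scan.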

Dude, where’s my robot?

Fire departments in the Netherlands are experimenting with robots to reach places where firefighters are at high risk of being injured or even killed. The usability of these robots is currently limited, because the operator can no longer control the robot once it is out of the firefighter's sight. Existing solutions to this problem for industrial robots do not work in environments filled with smoke and dirt. In this 3S project, you will create a prototype of a module that can be attached to any firefighting robot to localize the robot and show its position to the operator. Challenges include using and adapting software frameworks that already provide localization for other robots, sensor-fusion algorithms, navigation in rough environments, and building a demonstrator that convinces firefighters. Another aspect that can be tackled is how this module could be taken into production, and/or what it takes to make modules that are not specific to one robot.
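The sensor-fusion part can be illustrated with the simplest possible example: a complementary filter that tracks the robot's heading by blending fast-but-drifting gyro integration with a noisy but drift-free compass. This is only a sketch of the idea; the function name, blend factor and sensor choice are assumptions, not a project specification.

```python
# Illustrative sensor-fusion sketch: no single sensor can be trusted on its
# own, so a complementary filter combines two of them. The gyro gives smooth
# short-term motion but drifts; the compass is noisy but absolute.

def fuse_heading(prev_heading, gyro_rate, compass_heading, dt, alpha=0.98):
    """One filter step: mostly trust the integrated gyro, and slowly pull
    the estimate toward the absolute compass reading to cancel drift."""
    predicted = prev_heading + gyro_rate * dt   # dead-reckoning step
    return alpha * predicted + (1 - alpha) * compass_heading
```

A production module would fuse more sources (wheel odometry, radar, thermal vision) with a Kalman filter or an existing localization framework, but the principle of weighting complementary sensors is the same.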

Intelligent BEAST (Drone)

Drone flight operations carried out in the vicinity of populated areas, buildings and infrastructure require far more risk-mitigation measures than simply landing on the spot or returning home when something goes wrong (such as a flyaway, loss of the RC link, or a critical battery level). Recently, there have been initiatives to make drones intrinsically safe by design, which reduces the risk during a malfunction; see the figures below. These measures alone, however, are not adequate for autonomous beyond-visual-line-of-sight (BVLOS) flights, which pose a high risk to the surroundings (on the ground and in the air) and to the drone itself. With hardware and software that include vision-based artificial intelligence techniques, on the other hand, it is possible to perform an intelligent failsafe that minimises the risk to both the environment and the drone.

The main objective of this assignment is to equip the BEAST drone with an intelligent failsafe capability. This capability will be realised through the synergetic integration of hardware and software techniques, including the systematic integration of low-cost, low-weight and low-energy-consumption electromechanical components. The various hardware and software failsafe techniques will be combined for robust and reliable situational awareness (identifying people, property and possible landing spots) and decision making during a malfunction of the BEAST drone. This assignment is part of the BEAST project, in which 16 partners from public organisations, industry and knowledge institutes are involved. During the project, you will receive the necessary training in rapid prototyping techniques and drone flight.

Deliverables: at the end of the assignment, the following should be submitted to the client: a demonstration with the designed proof-of-concept setup, and design documentation.
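The core idea of an intelligent failsafe, choosing an action based on what the onboard vision reports rather than always returning home, can be sketched as a small decision function. All state names, thresholds and actions below are illustrative assumptions and not the BEAST project's actual interface.

```python
# Hedged sketch of intelligent failsafe decision making: instead of a fixed
# return-to-home reaction, the drone picks the least risky action given
# battery state, RC link status and what the vision system sees below it.
# Thresholds (5%, 20%) and action names are invented for illustration.

def failsafe_action(battery_pct, rc_link_ok, people_below, safe_spot_visible):
    if rc_link_ok and battery_pct > 20:
        return "continue"                  # no failure: normal operation
    if battery_pct <= 5:
        # Not enough energy left to travel: land now, but never onto people.
        return "emergency_land" if not people_below else "drift_and_land"
    if safe_spot_visible and not people_below:
        return "land_at_safe_spot"         # vision found a clear landing area
    return "return_to_home"                # fall back to the classic failsafe
```

The real capability would combine such logic with continuously updated detections of people, property and candidate landing spaces, and with the autopilot's own failsafe machinery.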

Robust Obstacle Avoidance for Drones

Drones have a number of features that make them appealing for many potential civil applications, such as inspection and maintenance of wind turbine blades, firefighting, transport of medical equipment, and selective plant treatment. One of the main challenges in adopting drones for these applications is their limited ability to build a robust and reliable awareness of their surroundings for safe navigation. Since most potential civil applications require flight at low altitude, drones need "sense and avoid" mechanisms that robustly detect and avoid obstacles under a range of environmental conditions.

The main objective of this assignment is therefore to develop a modular, multi-sensor obstacle detection and avoidance module for the BEAST drone. The module will be integrated (electromechanically and in software) into an existing drone that uses Pixhawk autopilot hardware with the PX4 software stack, which is widely used in both commercial and hobby drones. During the assignment, you will receive the basic technical training required for flying drones.

Deliverables: at the end of the assignment, the following should be submitted to the client: a demonstration with the designed prototype, and design documentation.
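The multi-sensor aspect can be illustrated with a toy version of the fusion-and-avoid step: combine per-direction range readings from several sensors pessimistically, then steer toward the clearest direction. The sensor names, directions and safety margin are assumptions for illustration; the real module would sit on top of the autopilot's navigation stack.

```python
# Minimal "sense and avoid" sketch: several range sensors (e.g. lidar, radar,
# stereo depth) each report distances per direction. Taking the minimum per
# direction means one blinded sensor cannot hide an obstacle another one sees.

def fuse_ranges(*sensor_readings):
    """Each reading maps direction -> distance in metres.
    Keep the most pessimistic (smallest) distance per direction."""
    fused = {}
    for reading in sensor_readings:
        for direction, dist in reading.items():
            fused[direction] = min(dist, fused.get(direction, float("inf")))
    return fused

def avoid(fused, safety_margin=2.0):
    """Return the clearest safe direction, or None (brake and hover)
    when no direction has enough clearance."""
    clear = {d: r for d, r in fused.items() if r > safety_margin}
    if not clear:
        return None
    return max(clear, key=clear.get)
```

Fusing before deciding is the design choice that makes the module robust: each individual sensor may fail in fog, glare or dust, but the combined worst-case view stays conservative.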

Sky Workers

Inspection and maintenance tasks on high-altitude structures such as solar panels and wind turbines are increasing. Aerial robots have the potential to replace humans in these risky tasks, and the Sky Worker project wants to use drones to bring this idea to life. This project focuses on inspecting wind turbine blades with an aerial manipulator, using non-destructive testing (NDT) to detect possible damage and prevent extra costs. Wind turbine blades, however, are complicated objects: they are multilayered, have a variable thickness, and have an arbitrarily curved surface (Fig. 1). One example of NDT is the use of ultrasonic sensors. A prototype of an aerial manipulator (drone + robot arm) was designed in a 3S project entitled "Flying hand"; the practical knowledge obtained in that project will be used to fill the gaps and develop ideas for contact-based inspection.

Goals: the main goal of this 3S assignment is to design and realise a modular, standardised end-effector for a manipulator attached to a drone. This 'aerial manipulator', or Sky Worker, must move an NDT sensor over the 3D curved surface of a wind turbine blade while keeping contact with the surface. During the project, you will receive the necessary training in rapid prototyping techniques and drone flight.

At the end of the assignment, the following deliverables should be submitted to the client: a demonstration of an NDT test with the designed prototype, and design documentation.
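Keeping an NDT probe in contact with a curved surface is, at its simplest, a force-control problem: push in when the contact force is too low, back off when it is too high. The sketch below shows one proportional control step under that assumption; the gain, target force and spring-like surface model are illustrative, not a design choice of the project.

```python
# Hedged sketch of contact-force control for an NDT end-effector: a
# proportional controller adjusts the probe position along the surface
# normal to hold a constant contact force on the curved blade surface.
# Gain and target force are invented values for illustration.

def contact_step(position, measured_force, target_force=5.0, kp=0.001):
    """One control step. position: commanded probe position along the
    surface normal (m); measured_force: contact force (N).
    Returns the new commanded position."""
    error = target_force - measured_force   # positive => press in further
    return position + kp * error            # small corrective move
```

A real end-effector would add compliance in hardware (a spring-loaded probe) so the controller only has to handle slow variations as the arm follows the blade's curvature.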

Check out how to sign up

Questions about this cluster?

Feel free to contact us if you have any questions about this cluster. You can call us at 088 - 019 53 11 or use the form below. We will get back to you within two business days.