Control Technology

Description of the research group:

The control technology group deals with the modeling, observation, and control of linear, uncertain dynamic systems. In practice, the uncertainty, or information deficit, mostly results from incomplete modeling, from indirect measurements (observations), from highly error-prone or incomplete measurements, or from the discretization required for digital computers.

The control loops developed are used in:

  • mechatronic (e.g. assistance systems in vehicle technology),
  • micro-optoelectronic (e.g. micro-optical analysis systems),
  • optomechatronic (e.g. image feedback control)
  • and acoustic (e.g. "active noise control")

systems.

When designing controllers, linear (e.g. H∞ methods), linear adaptive (e.g. predictor-corrector methods), or non-linear (e.g. Lyapunov methods) concepts are developed and tested experimentally. In connection with measurement uncertainties, fuzzy models as well as linear and non-linear observation models are examined. Experimental identification methods, such as frequency-response methods, play an important role where modeling deficits are large. In the case of image feedback control, problems from multi-dimensional signal processing, such as object and environment recognition, also arise.

  • Current Projects

    SIMULTANEOUS LOCALIZATION AND MAPPING SYSTEM USING AERIAL CAMERA

    Especially in the areas of robot navigation, localization, and mapping, as well as in augmented and virtual reality, the demand for real-time 3D reconstruction methods has increased significantly in recent years. A promising approach to implementing 3D reconstruction is the principle of Simultaneous Localization and Mapping, or SLAM for short. The aim of this method is, on the one hand, the 3D reconstruction of the environment and, on the other hand, the relative localization and tracking of moving objects.

    Due to many advantages, such as availability and low acquisition and maintenance costs, visual sensors such as color cameras are widely used in SLAM systems. Such systems are called Visual SLAM, or vSLAM for short. The use of real-time vSLAM systems in research has recently increased significantly. As a result, 3D reconstruction has become very important, especially in connection with the navigation of robots such as remote-controlled unmanned aerial vehicles (UAVs). Current vSLAM systems, such as ORB-SLAM, use conventional feature-based image processing algorithms for pose estimation. However, such algorithms are severely limited in images with many different and complex textures, so their accuracy suffers in complex and semi-complex environments.

    In contrast to such feature-based, indirect vSLAM solutions, a direct SLAM system is to be developed that reconstructs 3D models of semi-complex or even complex environments and delivers highly accurate pose estimates.

    As part of the project, the color data are used directly to register consecutive frames of the video stream, without performing a time-consuming feature extraction. The pose estimates, obtained with the help of the depth data, can be used to reconstruct 3D models of the environment. A subsequent sliding-window optimization allows the calculation of a local transformation matrix, whereby the estimation error can be minimized. A bundle adjustment approach is used for high-precision 3D reconstruction.
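The idea of direct (feature-free) registration can be illustrated with a minimal sketch: the shift between two consecutive grayscale frames is estimated by directly minimizing the photometric error over candidate shifts. The brute-force search, the function name, and the toy frames are purely illustrative assumptions; the actual system works on color and depth data and refines full 6-DoF poses in a sliding-window optimization.

```python
import numpy as np

def direct_register(prev, curr, max_shift=3):
    """Estimate the integer pixel shift between two grayscale frames by
    directly minimizing the photometric (sum-of-squared-differences)
    error, i.e. without any feature extraction."""
    best, best_err = (0, 0), np.inf
    m = max_shift
    for dy in range(-max_shift, max_shift + 1):
        for dx in range(-max_shift, max_shift + 1):
            # undo the candidate shift and compare pixel intensities directly
            shifted = np.roll(np.roll(curr, -dy, axis=0), -dx, axis=1)
            err = np.sum((prev[m:-m, m:-m] - shifted[m:-m, m:-m]) ** 2)
            if err < best_err:
                best, best_err = (dy, dx), err
    return best

# toy example: the second frame is the first one shifted by (1, 2) pixels
rng = np.random.default_rng(0)
frame0 = rng.random((32, 32))
frame1 = np.roll(np.roll(frame0, 1, axis=0), 2, axis=1)
print(direct_register(frame0, frame1))  # → (1, 2)
```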

    In order to evaluate the algorithm in semi-complex or complex environments, a micro-UAV is to be equipped with a mini PC and a color camera with an integrated depth sensor. By using depth data, the scale-drift problem known from conventional SLAM solutions can be easily avoided. In order to process the data in real time, the algorithm is split into two parts: mapping and pose estimation are calculated directly on the mini PC of the UAV, whereas the optimization algorithm runs on a stationary PC on the ground. The two computers communicate via radio transmission.


  • Completed Projects

    ROBOT-ASSISTED ASSEMBLY OF OPTICAL SYSTEMS USING A PREDICTOR-CORRECTOR METHOD

    Problem

    An individualized and automated production process for optical systems is not yet possible. On the one hand, this is due to the constant drive towards miniaturization of optical systems; on the other hand, individual manufacturing requirements lead to high reject rates of optical components during the assembly process. To solve the problem, expensive active or passive adjustment mechanisms are currently installed. Some of these still have to be fine-tuned by hand, which additionally increases personnel costs.

    Goal

    This research project deals with the function-oriented construction of optical systems and aims to relax the tight tolerance requirements on both the optical components and the positioning systems. Furthermore, a lower reject rate of optical components during the manufacturing process is to be ensured.

    Approach

    To implement this, a predictor-corrector method is used for the assembly process. It is based on the step-by-step construction of the optical system, in which a simulation model runs in parallel to the assembly process and is continuously reconciled with reality by means of an identification procedure. This enables predictions for the placement of subsequent optical components (prediction step). By checking the desired requirements (e.g. on the wavefront), a correction of the nominal position can be calculated at any time (correction step) in order to guarantee the functionality of the optical system.
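The predictor-corrector loop described above can be sketched in a simplified one-dimensional form. Here, `place`, `measure`, and the constant placement bias are hypothetical stand-ins for the real positioning system and the wavefront-based identification.

```python
def assemble(nominal, place, measure):
    """Hypothetical 1-D sketch of the predictor-corrector assembly loop.

    nominal : nominal component positions from the optical design
    place   : places a component at a commanded position (real hardware)
    measure : returns the identified cumulative position error of the
              partially built system (e.g. via the wavefront sensor)
    """
    accumulated_error = 0.0
    for x_nominal in nominal:
        # prediction step: command the nominal position, shifted to
        # compensate the error identified so far
        place(x_nominal - accumulated_error)
        # correction step: re-identify the system and update the estimate
        accumulated_error = measure()

# toy model: the placement stage has a constant 0.1 bias per component
nominal = [1.0, 2.0, 3.0, 4.0, 5.0]
actual = []
def place(x):
    actual.append(x + 0.1)          # hypothetical systematic placement bias
def measure():
    return sum(a - n for a, n in zip(actual, nominal))

assemble(nominal, place, measure)
# the corrector keeps the cumulative error at the single-step bias (0.1)
# instead of letting it grow to 0.5 over five components
```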

    Experimental Setup

    The experimental setup essentially consists of a positioning system and the optical system to be assembled. A macro-micro manipulator including a gripper serves as the positioning system. A Michelson interferometer, read out by means of a wavefront sensor, is used as a prototype optical system. From its output, conclusions about the position of individual optical components can be drawn by means of identification procedures.

    Contact Person: Dr.-Ing. Christian Pape

    IMAGE FEEDBACK CONTROL OF AN OPTOMECHANICAL DEROTATOR FOR MEASUREMENTS ON ROTATING COMPONENTS

    Rotating components are installed in a large number of machines. To ensure efficient and safe operation, it is essential to inspect them. The most reliable way to do this is through measurements, especially if they are carried out without contact and during actual operation. This ensures that the results are not distorted by the measuring system.

    Conventional measuring methods quickly reach their limits with moving components, especially when these undergo a rotational movement. In image recordings (with standard high-speed or thermographic cameras), the problem is motion blur: if objects rotate at high speed, or if a long exposure time must be chosen (due to poor lighting conditions or the way the camera works), the object's motion becomes visible in the measurement data. Measuring methods that emit a measuring beam (such as laser Doppler vibrometers) cannot stay focused on a single point of the rotating object and thus falsify the result. An optomechanical derotator offers one possible solution: a rotating mirror arrangement enables it to optically compensate for the movement of a rotating measurement object.

    For this, the derotator must rotate at exactly half the speed of the object to be measured. To determine that speed, an image-based approach is used at the Institute for Measurement and Control Engineering (IMR). Features on the measurement object, either specific structures on the object itself or externally attached markers, are first detected and then tracked in order to determine the angular position of the object. The position determined in this way is passed to a controller as a reference variable in order to regulate the derotator to the appropriate position and speed.
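The half-speed tracking can be illustrated with a minimal discrete-time loop: the image-based angle of the object defines a half-angle reference, which the derotator follows. The feed-forward-plus-proportional law and all gains are illustrative assumptions, not the controller actually used at the IMR.

```python
def simulate(omega_obj, kp=5.0, dt=1e-3, steps=2000):
    """Toy derotator tracking loop: the derotator angle must follow half
    the object angle. The commanded speed is a feed-forward term at half
    the object speed plus a proportional correction on the angle error
    (illustrative control law)."""
    theta_obj = 0.0
    theta_der = 0.3                      # initial misalignment [rad]
    for _ in range(steps):
        ref = 0.5 * theta_obj            # half-angle reference from image data
        omega_cmd = 0.5 * omega_obj + kp * (ref - theta_der)
        theta_obj += omega_obj * dt      # object keeps rotating
        theta_der += omega_cmd * dt      # derotator follows the command
    return 0.5 * theta_obj - theta_der   # remaining tracking error [rad]

residual = simulate(omega_obj=100.0)     # error decays towards zero
```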

    It is also necessary for the optical axis of the derotator to be aligned coaxially with the axis of rotation of the rotating object. For this purpose, the derotator is mounted on a 6-axis parallel kinematic mechanism, a so-called hexapod, which can adjust its position very precisely in all six degrees of freedom. Optical methods developed at the IMR eliminate translational and rotational deviations in a two-stage optimization process.

    If these requirements are met, the derotator can be used to support a wide range of measurement tasks. These measurements are prepared, carried out, evaluated, and interpreted at the IMR. Especially measurements where

    • motion blur occurs due to long exposure times and high speeds,
    • measurement radiation is emitted and focused on an object (e.g. laser Doppler vibrometry), or
    • high frame rates are necessary

    can be improved, or made possible in the first place, through the use of the derotator.

    MACRO-MICRO-KINEMATICS FOR MICRO-ASSEMBLY

    General Information

    The focus of the research project lies in developing methods for the design of a handling system that will form the actuating basis of a clinic-suitable, ultra-precise mechatronic assistance system. The aim is to achieve a resolution of 1 µm within a working volume of 10 mm³, whereby the working volume can be positioned flexibly in space.

    The technical implementation couples a piezo actuator to a 6-axis precision robot (µKRoS316). The robot handles the positioning of the tools across the entire work area (macro positioning). The micro-positioning unit takes on tasks such as compensating for positional inaccuracies of the robot, compensating for vibrations, and moving the tool with high precision.

    The work program can be divided into five areas according to the above-mentioned work priorities:

    • Design of a coupled control scheme
    • Setup of an external measuring system for real-time 6D position determination
    • Research into methods to improve the 6D positional accuracy of positioning units
    • Matching the coordinate systems of the positioning units and the measuring system and path planning
    • Development and construction of the tool set

    Coupled Control

    In this subproject, the coupling of the robot with the piezo actuator is investigated. To this end, the control of the 6-axis robot µKRoS and of the xyz piezo table from Cedrat is to be improved. Furthermore, the interactions between the piezo table and the robot, and their effects on the overall accuracy, are examined. The project includes modeling the robot, including its nine torque motors, and the piezo table. Finally, for validation, high-precision machining is to be carried out with the overall system and a micro-milling machine.

    Structure of a Measuring System

    Coupled control is not possible without a suitable measuring system. The measuring system must record the position and orientation of the robot end effector in real time so that the deviation of the tool can be corrected in good time by the xyz piezo table. Two high-speed cameras with telecentric lenses, whose optical axes are oriented perpendicular to one another, are used for this task. Such a measuring system enables the 6D position to be recorded with a subpixel accuracy of 1.5 µm, but has a limited sampling frequency.
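Because telecentric lenses produce an (approximately) orthographic projection, two perpendicular views together directly yield all three position coordinates. The following fusion sketch, including the camera-to-axis assignment and the averaging of the redundant axis, is an illustrative assumption, not the institute's actual 6D pose algorithm.

```python
import numpy as np

def fuse_positions(cam_a, cam_b):
    """Fuse the readings of two perpendicular telecentric cameras into a
    3-D position. Camera A looks along z and sees (x, y); camera B looks
    along x and sees (y, z). The redundant y reading is averaged
    (an illustrative choice)."""
    x = cam_a[0]
    y = 0.5 * (cam_a[1] + cam_b[0])
    z = cam_b[1]
    return np.array([x, y, z])

p = fuse_positions((1.0, 2.0), (2.0, 3.0))  # → array([1., 2., 3.])
```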

    Calibration of the Positioning Units

    In industrial applications, the absolute positioning accuracy of robots is improved by calibrating the geometry parameters. The accuracies achieved are in the range of <0.7 mm and therefore do not meet current medical technology requirements. In order to use the robot as a handling system, an absolute accuracy of <0.1 mm and <0.1° in a working volume of 10 mm³ is aimed for.

    Two approaches are being pursued:

    • Compensation of the purely geometric influencing factors such as deflection and zero position errors
    • Compensation of the 6D temperature drift (non-geometric error)

    Due to the limitation of the working volume to 10 mm³, it can be assumed for the sake of simplicity that the movements of the robot are linear and that the remaining deviations can be described by polynomial functions. Initial investigations have already shown that absolute accuracies of <0.087 mm and <0.09° can be achieved.
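The polynomial compensation idea can be sketched in one dimension: measure the deviation at a few poses inside the small working volume, fit a low-order polynomial, and subtract the predicted error from future position commands. The error model, the sample points, and the polynomial degree are all synthetic assumptions.

```python
import numpy as np

# calibration step: fit a polynomial to synthetic measured deviations
commanded = np.linspace(0.0, 10.0, 9)          # calibration poses [mm]
measured_error = 0.05 + 0.004 * commanded**2   # synthetic deviations [mm]
coeffs = np.polyfit(commanded, measured_error, deg=2)

def compensate(target):
    """Return the corrected command for a desired target position."""
    return target - np.polyval(coeffs, target)
```

Commanding `compensate(target)` instead of `target` cancels the systematic error to first order, shrinking the residual by more than an order of magnitude in this toy model.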

    Furthermore, the axes of the piezo table must be calibrated, because they are not exactly perpendicular to each other and their orientation relative to the camera measuring system and the robot end effector is initially unknown. The piezo axes themselves also deviate slightly from their ideal straight lines. A suitable calibration method is used to determine these relationships and eliminate the problems mentioned above.
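A minimal sketch of compensating non-orthogonal axes: once the actual axis directions have been identified (here as the columns of a hypothetical matrix `A`), the axis commands that produce a desired Cartesian motion follow from solving a linear system.

```python
import numpy as np

# Actual axis directions of the piezo table as identified by a
# calibration measurement (hypothetical numbers; each column is the
# Cartesian direction of one slightly non-orthogonal axis).
A = np.array([[1.0, 0.02, 0.0],
              [0.0, 1.0,  0.01],
              [0.0, 0.0,  1.0]])

def axis_commands(target_xyz):
    """Axis displacements that produce the desired Cartesian motion."""
    return np.linalg.solve(A, target_xyz)
```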

    Matching the Coordinate Systems and Path Planning

    The machining must take place in an absolute coordinate system. To do this, the coordinate systems of the robot, the measuring system, the piezo table, the workpiece, and the tool must be merged into a common coordinate system. The trajectory for the movement of the tool is also planned in this coordinate system. In order to correct the positioning errors of the robot using the piezo table, a method is being developed that keeps the tool as close as possible to the target path. This method is also intended to compensate for the communication delays between the robot and the piezo table.
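Merging the coordinate systems amounts to composing homogeneous transforms along the kinematic chain. The frames and offsets below are hypothetical and only illustrate the mechanism.

```python
import numpy as np

def transform(R, t):
    """4x4 homogeneous transform from a rotation matrix R and translation t."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

# hypothetical chain: world <- robot base <- end effector <- tool tip
T_world_base = transform(np.eye(3), [100.0, 0.0, 0.0])
T_base_ee    = transform(np.eye(3), [0.0, 50.0, 0.0])
T_ee_tool    = transform(np.eye(3), [0.0, 0.0, 5.0])

# composing the chain expresses the tool tip in the common world frame
T_world_tool = T_world_base @ T_base_ee @ T_ee_tool
tool_tip_world = (T_world_tool @ np.array([0.0, 0.0, 0.0, 1.0]))[:3]
```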

    Tool Kit

    In order to implement the calibration methods mentioned and the matching of the coordinate systems, a tool system has to be developed that, on the one hand, is adapted to the robot's construction and, on the other hand, allows the tool and workpiece parameters to be determined.

    Contact Person: Dr.-Ing. Christian Pape 

CONTACT PERSON

Dr.-Ing. Christian Pape
Group Leader
Automatic Control & Acoustics
Address
An der Universität 1
30823 Garbsen
Room 115