Team:ETHZ Basel/InformationProcessing
[Image: The two key players in our information processing approach: Microscope (upper image) and Joystick (lower image)]
Information Processing Overview
This section describes in detail the in-silico part of the overall network. For the in-vivo part, please refer to the Biology & Wet Laboratory section.
E. lemming cells are imaged using microscopy techniques. The resulting images are processed by fast cell detection and cell tracking algorithms, which determine the actual movement direction of the chosen bacterium. The desired reference direction is set by the user, and the controller algorithm automatically activates the light signals (red and far-red light). By changing the tumbling frequency over time, the cell is thus forced to swim in the desired direction in real time.
The current microscope image represents the in-vivo → in-silico interface, while the red and far-red light signals represent the in-silico → in-vivo one. By interconnecting both sub-networks, we can thus close the loop and obtain the overall combined network. The combined network has the central advantage of significantly increasing the control over the system, compared to traditional synthetic networks realized completely in-vivo.
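To make the structure of this closed loop concrete, a minimal MATLAB sketch is shown below. All function names (experimentRunning, acquireImage, trackCell, lightController, setLightDiodes) are hypothetical placeholders for illustration, not the actual Lemming Toolbox API.

 % Minimal sketch of the closed loop between the in-vivo part (microscope,
 % light diodes) and the in-silico part (tracking and control).
 % All function names are placeholders for illustration only.
 referenceDirection = pi/2;            % desired swimming direction [rad], set by the user
 redOn = false; farRedOn = false;
 while experimentRunning()
     img = acquireImage();             % bright-field frame, roughly every 0.3 s
     [position, direction] = trackCell(img);    % detect and track the selected cell
     [redOn, farRedOn] = lightController(direction, referenceDirection, redOn);
     setLightDiodes(redOn, farRedOn);  % switch red / far-red diodes: in-silico -> in-vivo
 end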
In addition, anyone can enjoy the friendly interface and the gaming experience offered by E. lemming, by simply downloading the MATLAB/Simulink Toolbox available for free on our website.
Imaging
The cells are placed in a 50 μm (?) high flow channel restricting their movement to the x/y-plane, thus preventing them from swimming out of focus. They are imaged in bright field by an automated microscope with 40x magnification approximately every 0.3 s. Each image is sent via a local network or the Internet to the controller workstation, which forwards it to MATLAB/Simulink.
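A minimal sketch of the acquisition timing, assuming a MATLAB timer on the microscope computer; acquireAndSendImage is a hypothetical placeholder for the camera and network interface.

 % Sketch: grab a bright-field frame roughly every 0.3 s and forward it to
 % the controller workstation. acquireAndSendImage is a placeholder.
 t = timer('ExecutionMode', 'fixedRate', ...
           'Period', 0.3, ...                          % frame interval [s]
           'TimerFcn', @(~, ~) acquireAndSendImage());
 start(t);
 % ... experiment runs ...
 stop(t); delete(t);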
Cell detection and Cell tracking
On the controller workstation, the images are pre-processed by the Lemming Toolbox and the cells are detected and tracked in real time by means of fast image processing algorithms developed by our team. From the change of position between consecutive microscope frames, the current direction of E. lemming is estimated.
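A minimal sketch of such a direction estimate, assuming simple thresholding and centroid tracking of a single cell; the actual toolbox algorithms are considerably more elaborate.

 % Sketch: estimate the swimming direction of the selected cell from two
 % consecutive bright-field frames via thresholding and centroids.
 function direction = estimateDirection(framePrev, frameCurr)
     posPrev = cellCentroid(framePrev);
     posCurr = cellCentroid(frameCurr);
     delta   = posCurr - posPrev;               % displacement in the x/y-plane [px]
     direction = atan2(delta(2), delta(1));     % movement direction [rad]
 end

 function pos = cellCentroid(frame)
     bw    = ~im2bw(frame, graythresh(frame));  % cells appear dark in bright field
     stats = regionprops(bw, 'Centroid', 'Area');
     [~, idx] = max([stats.Area]);              % take the largest object as the cell
     pos = stats(idx).Centroid;
 end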
Controller
For controlling E. lemming, our modeling group implemented five different control algorithms based on the same template. The actual direction of E. lemming, the desired direction set by the user and the time-point of the simulation form the inputs of the algorithms, while Boolean values for the red and far-red light represent the outputs.
Based on original combinations of error minimization, hysteresis, noise suppression and prediction, our algorithms decide which type of light is sent at every time-point. This decision is then sent back through the network to the microscope computer, which activates or deactivates the respective diodes, thus closing the loop between the in-silico and in-vivo parts of the network.
Furthermore, the controller detects when a cell is about to swim out of the field of view of the microscope and automatically adjusts the position of the x/y-stage.
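As an illustration of the simplest member of this family, an error-minimization controller with hysteresis could look as follows. This is only a sketch: the thresholds and the assumed mapping between red/far-red light and tumbling are illustrative, not the tuned values or logic of the five actual controllers.

 % Sketch: bang-bang controller with hysteresis. If the cell deviates too far
 % from the reference direction, tumbling is induced until it re-orients.
 % Assumed mapping: red light switches tumbling on, far-red light reverts it.
 function [redOn, farRedOn] = lightController(direction, referenceDirection, wasTumbling)
     errHigh = pi/3;                        % switch tumbling on above this error [rad]
     errLow  = pi/6;                        % switch tumbling off below this error [rad]
     err = abs(angleError(direction, referenceDirection));
     if err > errHigh
         tumbling = true;                   % far off course -> induce tumbling
     elseif err < errLow
         tumbling = false;                  % roughly on course -> let the cell run
     else
         tumbling = wasTumbling;            % hysteresis band: keep previous state
     end
     redOn    =  tumbling;
     farRedOn = ~tumbling;
 end

 function d = angleError(a, b)
     % smallest signed difference between two angles, in [-pi, pi)
     d = mod(a - b + pi, 2*pi) - pi;
 end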
User Experience and Visualization
The Toolbox is connected to either a joystick or a keyboard, with which the user can choose the cell he/she wants to control and interactively change the reference direction for E. lemming in real time.
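As an example of how the reference direction could be read from a joystick, a sketch assuming the vrjoystick interface of the Simulink 3D Animation toolbox is shown below; the axis convention, dead zone and the variables selIdx and cellPositions are assumptions, not the toolbox's actual input handling.

 % Sketch: map joystick deflection to the reference direction and use the
 % first button to cycle through the detected cells. Conventions are assumed.
 joy = vrjoystick(1);                       % first joystick attached to the workstation
 [ax, btn] = read(joy);                     % axis deflections in [-1, 1] and button states
 if norm(ax(1:2)) > 0.2                     % dead zone: ignore small deflections
     referenceDirection = atan2(-ax(2), ax(1));     % desired swimming direction [rad]
 end
 if btn(1)                                  % first button: switch to the next detected cell
     selIdx = mod(selIdx, size(cellPositions, 1)) + 1;
 end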
Finally, the microscope image is post-processed to show the positions of all cells, the selected cell and its current and reference directions, and is visualized on the computer screen or projected with a beamer.
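A minimal sketch of such an overlay, using standard MATLAB graphics; img, cellPositions, selIdx, direction and referenceDirection are placeholder variables taken from the tracking and controller steps.

 % Sketch: overlay detected cells, the selected cell and its current /
 % reference directions on the latest microscope frame.
 imshow(img); hold on;
 plot(cellPositions(:,1), cellPositions(:,2), 'yo');        % all detected cells
 sel = cellPositions(selIdx, :);
 plot(sel(1), sel(2), 'rs', 'MarkerSize', 12);              % the selected E. lemming
 len = 40;                                                  % arrow length [px]
 quiver(sel(1), sel(2), len*cos(direction), len*sin(direction), 0, 'r');                    % current direction
 quiver(sel(1), sel(2), len*cos(referenceDirection), len*sin(referenceDirection), 0, 'g');  % reference direction
 hold off;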