Team:ETHZ Basel/InformationProcessing

Revision as of 19:17, 22 October 2010

Information Processing Overview

Information processing principle of E. lemming. Tumbling / directed movement rates are monitored by image processing algorithms, which are linked to the light-pulse generator. This means that E. coli tumbling is induced or suppressed simply by pressing a light switch! This synthetic network enables control of single E. coli cells.
Although the synthetic network we implemented makes the tumbling frequency of E. coli cells dependent on red and far-red light, the biological part alone is not sufficient to control the swimming direction of the E. lemming. It is therefore complemented by an in-silico network that implements a controller which automatically sends the light signals and, by changing the tumbling frequency over time, forces the cell to swim in a desired direction. The interface between the two sub-networks, the in-vivo network and the in-silico network, is defined by the current microscope image (in-vivo -> in-silico) and the red and far-red light signals (in-silico -> in-vivo). By interconnecting both sub-networks we can close the loop and obtain the overall network, which increases the information processing capabilities significantly compared to traditional synthetic networks realized completely in-vivo.
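
To make this closed loop concrete, here is a minimal toy simulation in MATLAB, under strongly simplifying assumptions that are not the actual Lemming Toolbox implementation: the cell is modeled as a run-and-tumble walker whose tumbling can be switched on and off instantaneously, and the controller simply allows tumbling whenever the heading deviates too far from the reference direction. All numbers are arbitrary example values.

% closed_loop_sketch.m -- toy illustration of the closed-loop principle;
% NOT the Lemming Toolbox. All numbers are arbitrary example values.
rng(1);
dt     = 0.3;                 % controller sampling time [s]
nSteps = 400;                 % number of control steps to simulate
speed  = 10;                  % run speed [um/s], illustrative
refDir = pi/4;                % user-defined reference direction [rad]
tol    = pi/6;                % angular tolerance of the controller [rad]
pos    = zeros(nSteps, 2);    % simulated trajectory [um]
theta  = 2*pi*rand;           % initial heading [rad]

for k = 2:nSteps
    % Controller: let the cell tumble (random reorientation) while the
    % heading error is large, otherwise let it run straight.
    err = atan2(sin(refDir - theta), cos(refDir - theta));   % wrapped error
    if abs(err) > tol
        theta = theta + randn;                                % "tumble"
    end
    pos(k,:) = pos(k-1,:) + speed*dt*[cos(theta), sin(theta)];
end

plot(pos(:,1), pos(:,2), '-'); axis equal; grid on;
xlabel('x [\mum]'); ylabel('y [\mum]');
title('Toy closed-loop steering of a run-and-tumble walker');

Even with this crude model, suppressing reorientation whenever the heading is roughly correct biases the random walk towards the reference direction, which is the principle the real controller exploits.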

This section describes the in-silico part of the overall network in detail. For the in-vivo part, please refer to the Biology & Wet Laboratory section.


This transforms our iGEM project into one of the coolest games ever!


Imaging

The cells are placed in a 50 μm (?) high flow channel that restricts their movement to the x/y-plane, thus preventing them from swimming out of focus. They are imaged in bright field by an automated microscope at 40x magnification approximately every 0.3 s. The images are sent via a local network or the Internet to the controller workstation, which forwards them to Matlab/Simulink.
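
The wiki does not specify how the frames are transferred, so the following MATLAB sketch shows just one hedged possibility: the microscope computer writes each frame to a shared folder, which the controller workstation polls at roughly the acquisition rate. The folder name and file pattern are made up for illustration.

% acquire_frames_sketch.m -- one possible image transport, not necessarily
% the team's setup; folder path and file pattern are placeholders.
frameDir  = 'Z:\microscope_frames';    % hypothetical shared network folder
pattern   = 'frame_*.png';             % hypothetical file naming scheme
processed = containers.Map('KeyType', 'char', 'ValueType', 'logical');

while true                             % run until the experiment is stopped
    files = dir(fullfile(frameDir, pattern));
    for i = 1:numel(files)
        name = files(i).name;
        if ~isKey(processed, name)
            frame = imread(fullfile(frameDir, name));   % bright-field frame
            processed(name) = true;
            % ... hand the frame to cell detection and tracking here ...
        end
    end
    pause(0.3);                        % match the ~0.3 s frame interval
end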

Cell detection and cell tracking

On the controller workstation, the images are pre-processed by the Lemming Toolbox, and the cells are detected and tracked in real time. From the change of position between consecutive microscope frames, the current direction of the E. lemming is estimated.
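
A minimal sketch of such a detection and direction estimate is given below (MATLAB, Image Processing Toolbox). It thresholds dark cells in a bright-field frame, matches each centroid to its nearest neighbour in the previous frame, and takes the displacement angle as the current direction. This is a simplified illustration, not the Lemming Toolbox code; function and variable names are ours.

function [centroids, dirs] = track_cells_sketch(frame, prevCentroids)
% TRACK_CELLS_SKETCH  Illustrative cell detection and direction estimation.
% Assumes dark cells on a bright background; not the Lemming Toolbox API.
    if ndims(frame) == 3
        frame = rgb2gray(frame);             % reduce color frames to gray
    end
    gray = mat2gray(frame);                  % scale intensities to [0, 1]
    bw   = imbinarize(imcomplement(gray));   % cells become foreground
    bw   = bwareaopen(bw, 20);               % drop small noise blobs
    stats = regionprops(bw, 'Centroid');
    centroids = reshape([stats.Centroid], 2, []).';   % one row per cell [x y]

    dirs = nan(size(centroids, 1), 1);
    if isempty(prevCentroids) || isempty(centroids)
        return
    end
    for i = 1:size(centroids, 1)
        % Nearest-neighbour match to the previous frame's centroids.
        d = vecnorm(prevCentroids - centroids(i,:), 2, 2);
        [~, j] = min(d);
        delta = centroids(i,:) - prevCentroids(j,:);
        dirs(i) = atan2(delta(2), delta(1));  % direction of motion [rad]
    end
end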

User experience

The Toolbox is connected to either a joystick or a keyboard, with which the user can choose the cell to control and interactively define the reference direction for the E. lemming in real time.
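
As a hedged illustration of the joystick side (not the actual Toolbox code), the reference direction could be read from a gamepad in MATLAB as follows; vrjoystick requires the Simulink 3D Animation product, and the dead-zone value is an arbitrary choice.

% reference_direction_sketch.m -- illustrative joystick handling only.
% Requires Simulink 3D Animation for vrjoystick; values are examples.
joy      = vrjoystick(1);              % first joystick attached to the PC
deadZone = 0.2;                        % ignore small stick deflections
refDir   = 0;                          % last valid reference direction [rad]

[sticks, buttons] = read(joy);         % axis values in [-1, 1], button states
if norm(sticks(1:2)) > deadZone
    % Screen y grows downwards, so negate it to get a mathematical angle.
    refDir = atan2(-sticks(2), sticks(1));
end
% The buttons could, for example, be used to cycle through the detected
% cells when choosing which one to control.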

Controller

Together with the actual direction of the cell, the reference direction forms the input of the control algorithm. Based on the actual and previous directions of the cell and their difference to the reference direction, the algorithm decides when to send red or far-red light. The decision is then sent back through the network to the microscope computer, which activates or deactivates the respective diodes, thus closing the loop between the in-silico and in-vivo parts of the network.
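
A minimal bang-bang version of such a decision rule is sketched below. It assumes, purely for illustration, that one light signal induces tumbling and the other suppresses it, and it ignores the filtering over previous directions that a practical controller needs; the angular error is wrapped with atan2 so the comparison stays meaningful across the ±π discontinuity.

function light = decide_light_sketch(currentDir, refDir, tol)
% DECIDE_LIGHT_SKETCH  Illustrative bang-bang light decision.
% Assumption of this sketch: 'red' induces tumbling and 'farred' suppresses
% it; swap the two labels if the biological response is the other way round.
    err = atan2(sin(refDir - currentDir), cos(refDir - currentDir));
    if abs(err) > tol
        light = 'red';      % heading is wrong: make the cell tumble
    else
        light = 'farred';   % heading is roughly right: let the cell run
    end
end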

Furthermore, the controller detects if a cell is about to swim out of the microscope's field of view and automatically adjusts the position of the x/y-stage.
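
A hedged sketch of such a guard is shown below: it compares the tracked centroid against a border margin and returns the stage offset needed to re-center the cell. The margin, the pixel size and the sign convention are placeholders, since the wiki does not describe the stage interface.

function offsetUm = stage_offset_sketch(centroidPx, imageSize, marginPx, umPerPx)
% STAGE_OFFSET_SKETCH  Illustrative x/y-stage re-centering rule.
% centroidPx: [x y] of the tracked cell in pixels; imageSize: [height width].
% Returns the stage move in micrometers that puts the cell back at the image
% center, or [0 0] if the cell is still safely inside the margin. The sign of
% the move depends on the stage and camera orientation.
    center   = [imageSize(2), imageSize(1)] / 2;   % image center as [x y]
    nearEdge = any(centroidPx < marginPx) || ...
               any(centroidPx > [imageSize(2), imageSize(1)] - marginPx);
    if nearEdge
        offsetUm = (centroidPx - center) * umPerPx;
    else
        offsetUm = [0, 0];
    end
end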


Visualization

Finally, the microscope image is post-processed to show the positions of all cells, the selected cell, and its current and reference directions, and the result is displayed on the computer screen or with a projector.
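
A bare-bones MATLAB overlay in this spirit could look as follows; the dummy inputs exist only so the sketch runs stand-alone, and in the real pipeline they would come from the imaging, tracking and user-input steps described above. Note that the image y-axis grows downwards, so on-screen arrows appear mirrored relative to mathematical angles.

% overlay_sketch.m -- illustrative visualization of the tracking results.
% Dummy inputs so the sketch runs stand-alone (placeholders, not real data):
frame      = uint8(200*ones(480, 640));    % blank, bright-field-like image
centroids  = [320 240; 100 100; 500 400];  % [x y] per detected cell, pixels
sel        = 1;                            % index of the user-selected cell
currentDir = pi/3;  refDir = pi/4;         % estimated and reference direction

imshow(frame); hold on;
plot(centroids(:,1), centroids(:,2), 'yo');                        % all cells
plot(centroids(sel,1), centroids(sel,2), 'rs', 'MarkerSize', 12);  % selected
len = 40;                                  % arrow length in pixels
quiver(centroids(sel,1), centroids(sel,2), ...
       len*cos(currentDir), len*sin(currentDir), 0, 'r');  % current direction
quiver(centroids(sel,1), centroids(sel,2), ...
       len*cos(refDir), len*sin(refDir), 0, 'g');          % reference direction
legend({'cells', 'selected cell', 'current direction', 'reference direction'});
hold off;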