Science

Synthetic Biology and Supercomputers - Innovation Approaches and Methods

Salvatore Santoli

INT - International Nanobiological Testbed Ltd

c/o LBIC - London BioScience Innovation Centre

2 Royal College Street, London NW1 0NH, UK

The subject of this paper is a set of topics in the areas of Synthetic Biology and supercomputing that are closely connected with each other methodologically.

Nothing in life is to be feared.

It is only to be understood.

Marie Curie

As a nanobiologist with a strong interest in computational nanoscience and a bent for looking at Biology through the glasses of theoretical physics, I have identified the basic problem involved in formulating the research plans of Synthetic Biology and of its branch Synthetic Life - very adventurous paths now being concretely undertaken by the human mind - as the problem of revisiting and possibly redefining the concept of computation. An old adage warns that "computing is not number; it is insight", but why should this adage be recalled, and what and where would the source of proper insight be? Indeed, an overview of the programs, of their historical roots, and some comments on this kind of research - considered extreme engineering and, by some, a scary research plan aiming even at crossing the line where Man starts playing at God - will make clear the whys of my standpoint and the roots of my proposition.

Synthetic Biology is a new area of biological research that combines science and engineering. The term encompasses a variety of different approaches, methodologies and disciplines, and many different definitions exist. What they all have in common, however, is that they see "synthetic biology" as the design and construction of new biological functions and systems not found in nature. The conception of this idea already has a history. The program has indeed been developing rapidly in recent years, but the concept dates back to 1974, when the Polish geneticist Waclaw Szybalski introduced the term "synthetic biology", writing:

Let me now comment on the question "what next". Up to now we are working on the descriptive phase of molecular biology. ... But the real challenge will start when we enter the synthetic biology phase of research in our field. We will then devise new control elements and add these new modules to the existing genomes or build up wholly new genomes. This would be a field with the unlimited expansion potential and hardly any limitations to building "new better control circuits" and ..... finally other "synthetic" organisms, like a "new better mouse". ... I am not concerned that we will run out of exciting and novel ideas, ... in the synthetic biology, in general.

The synthesis of genes has already reached a high level of success. In 2010, J. Craig Venter's group*) announced that they had been able to assemble a complete genome of millions of base pairs, to insert it into a cell, and to cause that cell to start replicating. This raised some basic concerns about what Venter and others in the new field of synthetic biology were doing. First, the objection was raised that one of these synthetic organisms might escape from the lab and run amok. Second, some feared that this kind of work would cross a line where humans would start playing at God**. As to questions concerning biosecurity and biosafety, stakeholders and intellectual property experts have also been involved, and the IASB - International Association for Synthetic Biology*** - launched an initiative for self-regulation, suggesting specific measures to be implemented by Synthetic Biology industries. Symposia were also organized to discuss the societal issues and possible policies of such a research and development effort.

There is now a wide literature concerning the synthesis of quite complex genes with the aim of causing existing cells to behave differently from their natural behavior as to reproduction and basic properties, but there is also an approach based on building organs and organ systems that replace the natural ones in cells so as to obtain different products in higher-rank living systems. An example of this engineering-minded approach is the work of Chris Voigt, who succeeded in redesigning the Type III secretion system used by Salmonella typhimurium so that it secretes spider silk proteins, i.e. a strong and elastic material, instead of its own natural infectious proteins.

While all such approaches are based on the use of natural members of living matter, the special approach to Synthetic Biology that plans to build living systems through full synthesis from chemicals is called Synthetic Life. The most recent success along this pattern of thinking is an attempt at creating a new organism by replacing the genome of an existing natural cell with a different genome created by gene synthesis; this was achieved with the creation of Synthia in 2010. What all such approaches necessarily share, however, is the need for mathematics and models for designing and developing the envisaged realizations, i.e. the fundamentals for creating a proper technology as the basis for the engineering design and manufacturing of biosynthetic products. Stated otherwise, the key concepts toward founding a branch of Biology that could be dubbed "biological engineering" would be the standardization of biological parts and of a hierarchical abstraction process allowing these parts to be used to build, in a controlled way, increasingly complex synthetic systems, as sketched in the example below.
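As a purely illustrative sketch of what such standardization and hierarchical abstraction could look like in software, the following Python fragment composes "parts" into "devices" and "systems". All class, part and device names here are invented for the example and do not refer to any real parts registry or design tool.

    # Illustrative sketch only: toy "standardized biological parts" composed
    # hierarchically into devices and systems. All names are hypothetical.

    from dataclasses import dataclass, field
    from typing import List

    @dataclass
    class Part:
        """An atomic, standardized part (promoter, RBS, coding sequence, terminator)."""
        name: str
        kind: str           # e.g. "promoter", "rbs", "cds", "terminator"
        sequence: str = ""  # DNA sequence, left empty in this toy example

    @dataclass
    class Device:
        """A functional unit assembled from parts in a fixed order."""
        name: str
        parts: List[Part] = field(default_factory=list)

        def sequence(self) -> str:
            # At this level of abstraction, composition is simple concatenation.
            return "".join(p.sequence for p in self.parts)

    @dataclass
    class System:
        """A higher-rank construct assembled from devices."""
        name: str
        devices: List[Device] = field(default_factory=list)

        def sequence(self) -> str:
            return "".join(d.sequence() for d in self.devices)

    # Usage: assemble a hypothetical reporter device from standard parts.
    reporter = Device("reporter", [
        Part("pExample", "promoter"),
        Part("rbsExample", "rbs"),
        Part("gfpExample", "cds"),
        Part("termExample", "terminator"),
    ])
    circuit = System("demo_circuit", [reporter])
    print([p.name for p in reporter.parts], len(circuit.sequence()))

The only point of the sketch is that each level of the hierarchy hides the internal degrees of freedom of the level below, which is exactly the abstraction process invoked above.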

Many kinds of mathematics - stochastic differential equations, partial differential equations, ordinary differential equations, integro-differential equations, and even graph theory - have been exploited for such tasks. Multiscale models of gene regulatory networks have been developed for Synthetic Biology applications. Simulations have been used that model all the biomolecular interactions involved in functions such as transcription, translation, regulation and induction of gene regulatory networks, guiding the design of synthetic systems (a minimal illustration is sketched below). And provisions have been made for accurate measurements. But all such attempts have been carried out on the basis of a computing concept supported by gate-logic based computers, mainly of very high computational power in terms of speed and massive parallelism of data processing. Now, it has been shown that such purely syntactic - i.e. not semantic - computing systems are hopeless at supplying a definite model even for the relatively simple problem of predicting protein folding upon insertion of further amino acids: the increasing computing power of such tools just produces an increasing and disorienting multitude of models, a thick fog on the way to solving the problem. Moreover, biological evolution, from the very few hundreds of bits on the primeval Earth up to the complexity of the human brain, has been shown to have occurred in quite an extralogical, heuristic way: Nature appears to work as a tinker, not as an engineer, and by so doing she computes well beyond Bremermann's limit, i.e. the limit on the information processing rate of any computing system made up of atoms and electrons, be it natural or artificial. This "transcomputing" ability would concern both classical and quantum level computation.
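As a concrete, minimal illustration of ODE-based modelling of a gene regulatory network, the following Python/SciPy sketch integrates the classic dimensionless repressilator equations (three genes, each protein repressing the next gene in a ring). The parameter values are illustrative textbook-style choices, not fitted data, and the model is of course far simpler than the multiscale simulations referred to above.

    # Minimal ODE sketch of a gene regulatory network: the three-gene
    # "repressilator", in which each protein represses the next gene in a ring.
    # Parameter values are illustrative choices, not fitted data.

    import numpy as np
    from scipy.integrate import solve_ivp

    alpha, alpha0, beta, n = 200.0, 0.2, 5.0, 2.0   # synthesis rate, leak, decay ratio, Hill coefficient

    def repressilator(t, y):
        m1, m2, m3, p1, p2, p3 = y                  # mRNA and protein levels
        dm1 = -m1 + alpha / (1.0 + p3**n) + alpha0  # gene 1 repressed by protein 3
        dm2 = -m2 + alpha / (1.0 + p1**n) + alpha0  # gene 2 repressed by protein 1
        dm3 = -m3 + alpha / (1.0 + p2**n) + alpha0  # gene 3 repressed by protein 2
        dp1 = -beta * (p1 - m1)                     # translation and protein decay
        dp2 = -beta * (p2 - m2)
        dp3 = -beta * (p3 - m3)
        return [dm1, dm2, dm3, dp1, dp2, dp3]

    y0 = [0.0, 0.0, 0.0, 1.0, 2.0, 3.0]             # asymmetric start to break symmetry
    sol = solve_ivp(repressilator, (0.0, 100.0), y0, dense_output=True)
    t = np.linspace(0.0, 100.0, 500)
    p1 = sol.sol(t)[3]
    print("protein 1 ranges between", round(float(p1.min()), 2), "and", round(float(p1.max()), 2))

Replacing the deterministic integrator with a stochastic simulation (e.g. Gillespie's algorithm) would give the stochastic counterpart mentioned in the text; the structure of the model stays the same.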

Two kinds of lessons are to be learned from Nature in order to proceed rapidly and hopefully with the Synthetic Biology and Synthetic Life conceptions: 1) what "computing" means for a system that is mainly information-driven, not energy-driven, and whose evolutionary character necessarily involves simulation of the environment, i.e. its decoding, as the basic ability for survival - the first condition for evolution, i.e. the gaining of stability - and for growth - the second condition, i.e. the dynamics for moving on from any stable state against entropic degradation; and 2) what the dynamics are of a system self-developing into more informational structures, i.e. structures at levels of increasing abstraction, understood as a reduction of the number of degrees of freedom needed to describe the environment and the lower-rank levels. Indeed, computing will mean adding a second dimension, namely the semantics or meaning added to what would otherwise be just blind syntax, capable of describing in the logical space of an automaton what in phase space would be just an isentropic, i.e. not creative, flow of information: recursive functions only (for the related notions of convergent and divergent thinking, see A.G. Grappone's paper). Any meaning successfully added is just the correct pragmatics - i.e. actuation - embodying the relationships with the environment that ensure survival and growth. This can be depicted by the closed chain of holistic, undividable structure-function unity shown below:

SENSING → INFORMATION PROCESSING → ACTUATING → back to SENSING (closed chain)

This is a self-referential and yet not paradoxical system precisely because it is dissipative; a paradox of infinite regression would appear only in a fully logical - i.e. isentropic - system. Dissipation dispels paradox. Thus, extralogical self-referential computing embodies what is called semantic computing, which does not occur in the standard phase space conceived for describing energy-driven systems, but in a space of coded interactions, i.e. of interactions depending on frequencies and/or on spatial arrangements. The concepts of meaning and of codes, and of complex codes-of-codes or hierarchies of codes, introduce the thought category of quality, and more specifically of quantifiable qualities, a notion that implicitly underlies all modern science - even though, by a common misconception, modern science is thought of as the success of the mere category of quantity.
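A toy software rendering of this closed chain, offered only as an illustration of the loop in the diagram above and not as a biological model, might look as follows; every quantity, constant and name in it is hypothetical.

    # Toy sketch of the closed sensing -> information processing -> actuating chain.
    # The agent keeps a one-number internal model of its environment (an extreme
    # reduction of degrees of freedom) and acts to stay near a viable set point.
    # Everything here is a hypothetical illustration, not a biological model.

    import random

    SETPOINT = 25.0   # the condition the "organism" must hold on to in order to survive
    env = 40.0        # hidden environmental variable (e.g. temperature)
    model = 0.0       # the agent's internal image of the environment

    for step in range(50):
        reading = env + random.gauss(0.0, 0.5)   # SENSING: a noisy measurement
        model += 0.3 * (reading - model)         # INFORMATION PROCESSING: update the internal model
        action = 0.2 * (SETPOINT - model)        # ACTUATING: push conditions toward the set point
        env += action + random.gauss(0.0, 0.2)   # the environment also drifts noisily

    print(f"after 50 cycles: environment ~ {env:.1f}, internal model ~ {model:.1f}")

Note that the agent never sees the environment directly: it only ever acts on its own noisy, lossy internal image of it, a crude analogue of the dissipative character just described.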

A proper source of insight, as the capability of semantic computing, would be embodied by Geometry, mainly its non-commutative branch, and by Topology, both of which consist of quality as well as quantity, and by their connections with the general concept of a field as a physical quantity associated with each point in spacetime: a field that occupies space, contains energy, and whose presence eliminates a true vacuum, but that also carries information through its wave dynamics and the ubiquitous phenomenon of synchronization, which ranges from macroreality down to the Heisenberg molecular field. Geometrization, as a fully energy-free approach to the analysis and synthesis of evolutionary systems, would add quality to our equations and to any computing tool; it would overcome transcomputational and non-computability problems, be they of classical or quantum nature, and would unveil the actual nature and meaning of "measurement" for evolutionary systems. Indeed, differently from measurements on so-called inanimate systems, in the case of evolutionary self-reproducing systems like living matter, measuring is a real dialogue between the observing and the observed system, in which both systems try to decode each other and build an "image" of one another ("Living matter is matter that chooses", as the famous biologist L. Margulis puts it). The entropy produced when an improper language - like that of syntactic computers - is used to decode a semantic system would consist precisely in the huge number of imperfect models obtainable by means of the most powerful supercomputers, a number that would overwhelm the processing capabilities of the human observer. The effort to reach the ability of measuring and computing along these lines would lead to descriptions of biosystems in the realm of spacetime and quantum field theories, deep down to their envisageable aspects as macroscopic quantum objects linked to the underlying physics of the Universe, for their well-controlled, fully from-scratch synthesis according to the ultimate plan formulated by Synthetic Biology and Synthetic Life: nothing to be feared, but just to be understood.

Notes

*) Scientists Reach Milestone On Way To Artificial Life, National Public Radio, interview of 20 May 2010. The J. Craig Venter Institute was formed in October 2006 through the merger of several affiliated and legacy organizations - The Institute for Genomic Research (TIGR), The Center for the Advancement of Genomics (TCAG), The J. Craig Venter Science Foundation, The Joint Technology Center, and the Institute for Biological Energy Alternatives (IBEA). Today all these organizations have become one large multidisciplinary genomics-focused organization. With more than 400 scientists and staff, more than 25,000 square metres of laboratory space, and locations in Rockville, Maryland and San Diego, California, the new JCVI is a world leader in genomic research.

 

**) This fear, of ancestral origin, has been affecting mankind for centuries and resurfaces from time to time, mainly since scientific thinking was formulated and obtained results that bit deeply into the beliefs, and nibbled at the principles or uncanny convictions, within which the general historical conscience was framed and societies were established. A vivid characterization of this can be found in a scene of Goethe's Faust, where Mephistopheles, disguised as a university professor, is respectfully asked by one of his students to write a dedication in the student's workbook; he takes the workbook, writes something and gives it back to the student, who reads aloud what Mephistopheles has written. It is, in Latin, the biblical devilish sentence of temptation to Man in the earthly paradise: "Eritis sicut Deus, scientes bonum et malum - You will be like God, knowing good and evil". The student bows and leaves, and while he is leaving Mephistopheles says to himself: "Follow the old saying of my aunt the Serpent as far as you please; the day will surely come when you will be afraid of being similar to God". Knowledge is thus felt as a devilish, dangerous trap, and ignorance as an imposed status to which Man is doomed for the sake of a happy and quiet life.

 

***) IASB, c/o febit synbio GmbH, Im Neuenheimer Feld 519, 69120 Heidelberg, Germany

 



