Ethical perspective defending the patenting of DNA - some interesting thoughts, but I'm a bit concerned that the section on public libraries doesn't mention that the Registry of Standard Biological Parts contains patented material and so can often only be used under a research exemption.
July 2009 Archives
The Multiplex Automated Genome Engineering (MAGE) technique developed in George Church's lab at Harvard Medical School provides a promising alternative to whole-genome synthesis. It's certainly going to work out a lot cheaper than writing a million base-pair sequence from scratch, and it can cope with situations where the changes needed to an organism such as E coli involve edits scattered throughout the chromosome.
In their paper for Nature that was published online over the weekend, Harris Wang, Farren Isaacs and their co-authors describe how they used MAGE to perform what is essentially rapid prototyping on a genome and increase the amount of the chemical lycopene that E coli could produce. The work takes advantage of an aspect of MAGE that lets you tune how much variation the edits introduce into a genome.
Talking about MAGE last year at the Royal Society, Isaacs explained how it was being used to perform hundreds of edits on the E coli genome in a bid to rework the genetic code itself and so build a wider range of proteins than natural genetic systems can today. The first step was to remove one of the stop codons used by E coli to terminate translation: all of the 300 or so instances of the sequence TAG would be replaced by the more common TAA variant of the code. This frees up the three-base TAG codon to be redeployed as the code for a non-natural amino acid. Isaacs said this first stage of the project is nearing completion.
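The recoding step can be sketched as a few lines of code: scan each coding sequence and swap a terminal amber (TAG) stop for the ochre (TAA) variant. The gene sequences below are invented for illustration, not taken from the paper.

```python
# Toy sketch of the TAG -> TAA recoding step described above: swap the
# amber stop codon at the end of each coding sequence for the ochre
# variant, leaving everything else untouched. Sequences are made up.

def recode_stop(cds: str) -> str:
    """Return cds with a terminal TAG stop replaced by TAA."""
    if len(cds) % 3 != 0:
        raise ValueError("coding sequence length must be a multiple of 3")
    return cds[:-3] + "TAA" if cds.endswith("TAG") else cds

genes = ["ATGGCTTAG", "ATGAAATAA", "ATGCCGTAG"]
recoded = [recode_stop(g) for g in genes]
```

After recoding, every gene ends in TAA and the TAG codon is free for reassignment.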
Full-scale codon replacement is the kind of thing you can do if you push MAGE through many cycles. The process starts off with a single, isogenic genome. As MAGE proceeds, the replacement pieces of DNA do not supplant the original sections all at once. So, if you have a lot of simultaneous replacements to make, there will be a lot of variation between the genomes of individual cells. But, as you push MAGE further, ultimately all the sections get replaced and you wind up with a new isogenic genome. To get many different mutants, you simply stop the MAGE process in the middle and analyse what comes out. In the latest experiment, the process came up with more than 4 billion genomic variants each day. They then isolated variants that showed a significant increase in lycopene production - finding one that had a five-fold increase over the wild type in just three days.
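A toy model makes the combinatorial point above concrete: if each of n edit sites has either been converted or not in a given cell when you stop the run, the pool can hold up to 2^n distinct genotypes. The conversion probability and site count below are invented, not Wang and Isaacs' numbers.

```python
import random

# Toy model of why stopping MAGE mid-run yields a diverse pool: with n
# independent edit sites, each cell carries some subset of the edits,
# so the pool can hold up to 2**n genotypes. Parameters are invented.

def sample_cell(n_sites: int, p_convert: float, rng: random.Random) -> tuple:
    """One cell's genotype: True where the oligo has replaced the original."""
    return tuple(rng.random() < p_convert for _ in range(n_sites))

rng = random.Random(0)
pool = {sample_cell(24, 0.3, rng) for _ in range(10_000)}
max_genotypes = 2 ** 24  # upper bound on distinct genotypes for 24 sites
```

With even a modest number of sites, a mid-run sample contains thousands of distinct genomes, which is what the selection step then sifts through.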
The core of the MAGE technique is a protein from the Red recombination system of the λ phage, which the virus uses - with a little help here from genetic engineering - to introduce its own changes into a bacterial genome. The β protein binds to oligos that are intended to replace sections of DNA in the actual genome. The protein helps the sections displace the Okazaki fragments that the cell's own machinery uses to build complementary DNA strands on the lagging strand during DNA replication. Each end of the section provides a match to the original DNA to let it stick, with the new 30 base pairs or so of DNA lying somewhere in the middle. Normally, the cell's mismatch repair proteins would spot this alien DNA - because it does not marry up with the original complementary sequence - but one of the key genes for the repair mechanism has already been knocked out.
When the DNA replicates again, the new DNA is copied and becomes fully part of the genome. Some of the replacement fragments don't make it or, in the case of this experiment, are replaced by other near matches, which gives rise to the huge genetic variation at the edit points.
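The oligo layout described above can be sketched as follows: two homology arms copied from the target genome flank the new bases that will sit in the middle. The toy genome, coordinates and arm length are all invented for illustration.

```python
import random

# Hedged sketch of a MAGE-style oligo layout: homology arms copied from
# the target genome flank the ~30 new bases in the middle. The genome,
# coordinates and 15-base arm length below are invented examples.

def design_oligo(genome: str, start: int, end: int,
                 new_seq: str, arm: int = 15) -> str:
    """Oligo replacing genome[start:end] with new_seq, plus flanking arms."""
    return genome[start - arm:start] + new_seq + genome[end:end + arm]

rng = random.Random(1)
genome = "".join(rng.choice("ACGT") for _ in range(200))
new_seq = "".join(rng.choice("ACGT") for _ in range(30))
oligo = design_oligo(genome, 100, 110, new_seq)
```

The arms are what let the oligo anneal to the lagging strand; the mismatched middle only survives because mismatch repair has been disabled.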
The edit points are not picked at random. Isaacs explained that some of them are simple knock-outs, putting stop codons in the middle of genes that might divert feedstock chemicals away from the pathway that produces lycopene. The other target was the gene that codes for the protein that makes lycopene itself. One of the directions of Isaacs' research is on the efficiency of translation as mRNA is used to produce the final proteins.
The Shine-Dalgarno sequence (TAAGGAGGT) in the section of messenger RNA where the ribosome first attaches generally boosts translation efficiency, but it does not work in all cases; the structure of the messenger RNA seems to have an effect. Rather than try to design the most efficient sequence a priori, Wang and Isaacs decided to let evolution have a go. They developed a variety of oligos that contained subtly different variants of the Shine-Dalgarno sequence and added them to the MAGE pool, then pulled out the one that worked best at the selection stage. The first selection of variants was made by looking for colonies that produced an intense red pigmentation on Luria-Bertani agar plates, with the best performers screened from the ten thousand or so that the process identified.
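One way to picture the variant pool is as the expansion of a degenerate template: wildcard positions around the Shine-Dalgarno core are expanded into every concrete sequence. The degenerate template below is an invented example, not the one used in the paper.

```python
import itertools

# Sketch of building a degenerate oligo pool around the Shine-Dalgarno
# core: expand IUPAC-style wildcard positions into all concrete variants.
# The template "TANGGAGGN" is an invented example, not the paper's design.

IUPAC = {"A": "A", "C": "C", "G": "G", "T": "T",
         "R": "AG", "Y": "CT", "N": "ACGT"}

def expand(template: str):
    """Yield every concrete sequence matched by a degenerate template."""
    for combo in itertools.product(*(IUPAC[base] for base in template)):
        yield "".join(combo)

pool = list(expand("TANGGAGGN"))  # 4 x 4 = 16 concrete variants
```

MAGE then incorporates the whole pool at once, and the plate-based screen does the picking.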
Isaacs thinks the technique can go much further and use evolution to pick out winners from more extensive reworkings of the genome, acting on promoters as well as ribosome binding sites. It will, in a sense, press the fast-forward button on evolution by letting biology explore genetic states that might be inaccessible because single changes on the way to them would kill off the candidate cells. Experiments such as the one performed by Mark Isalan and Luis Serrano and the team at CRG in Barcelona have shown how swapping bits of gene promoters can alter the fitness of bacteria. And these techniques could show how regulatory networks of genes affect cell behaviour.
"We want to think of general strategies that we can use with this method to introduce diversity with a universal oligo pool, and allow the cells to explore new genetic landscapes that will confer new properties. And then come up with screens and selections that let us pull out cells with specific behaviours," said Isaacs.
The aim is to target all 4000 genes of E coli with on the order of 15 000 oligos. "We could target every known coding and regulatory locus. You are seeing just the beginning of what we are trying to do," said Isaacs.
The core MAGE technique could work in other cell types, such as yeast, Isaacs claimed: "The mechanism that we use is something that is conserved across biology."
The New York Times reports that Exxon and J Craig Venter's Synthetic Genomics are to work on biofuel-producing algae - presumably using some of the genetic components that Venter's researchers have been capturing from a wide variety of organisms on their bulk genome-sampling missions.
Lloyd's has taken a look at synthetic biology and decided insurers need to wise up over "systemic risks". According to Insurance Daily, the report says: "The enormity of some adverse scenarios suggests the inclusion of various forms of sub-limit in the future."
The Lloyd's report itself is available alongside the news release.
Nature blogs on concerns over the public acceptance of synthetic biology at the recent Washington DC conference organised by the OECD. And whether the technology needs a new name, like "shiny, happy biology". The blog post doesn't mention it but Drew Endy was referring to a joke made at the SB 2.0 conference in 2006 over whether "synthetic biology" was the right name.
Local news for Bostonians on an MIT competition for clean energy. Cambridge, MA-based InAct Labs, which is working on microbial fuel cells, is one of the semi-finalists.
Nature, 19 March 2009: Design and engineering of an O2 transport protein by Ronald Koder, JL Ross Anderson, Lee Solomon, Konda Reddy, Christopher Moser & P Leslie Dutton
Most protein design consists of tweaking an existing, natural structure, often using directed evolution with the occasional bit of deliberate peptide insertion so that the protein will bind to a specific molecule. Take the work of Angela Belcher's team at MIT, for example, where the coat protein of the M13 virus was altered to include an amino acid known to bind to carbon nanotubes or to iron phosphate.
The idea of deliberate protein design suffered a serious blow when Professor Homme Hellinga and researchers in his lab had to withdraw a couple of papers. Collaborators were unable to repeat Hellinga's results - instead of creating a structure from scratch that could act as a ribose binding protein, some of the wild-type protein wound up in the experiment and was responsible for all the desired activity.
One big problem is that proteins are complex structures. But much of that complexity comes from the way evolution works, with bits of genes getting copied, inserted, knocked out and mutated.
"This complexity frustrates biochemists seeking to understand structure and function and presents an extraordinary challenge to protein engineers who aim to reproduce or create new functions in proteins," argue the authors of the Nature paper.
If you design a protein from scratch, is it worth following the example of the natural world? "However common it may be in nature, we maintain that complexity is not an essential feature of protein as a material, nor is it an essential feature of catalysis, as shown by synthetic chemical systems," the authors reckon.
Eric Drexler, of nanomachine fame, agrees: "...close adherence to natural models is often intellectually and technologically crippling."
The idea behind this paper is to use protein structures that are much simpler than those employed by nature, and more akin to regular catalysts.
Lingchong You and colleagues at Duke University claim that, although there are many variants of quorum sensing in bacteria, one thing seems constant: the ratio of the total volume of bacteria to the volume of their environment is the key to quorum sensing, no matter what kind of microbe is involved.
"If there are only a few cells in an area, nothing will happen," said Anand Pai, who has worked with You on the project. "If there are a lot of cells, the secreted chemicals are high in concentration, causing the cells to perform a specific action. We wanted to find out how these cells know when they have reached a quorum."
The researchers write about the project in the July 2009 issue of Molecular Systems Biology.
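The density claim can be sketched as a toy steady-state model: the signal concentration scales with total cell volume over environment volume, so scaling both together leaves the quorum decision unchanged. All rates and thresholds below are invented for illustration, not taken from the paper.

```python
# Toy model of the density claim above: at steady state the autoinducer
# concentration scales with total cell volume over environment volume,
# so scaling both together leaves the quorum decision unchanged.
# The production rate and threshold values are invented.

def signal_conc(n_cells: int, cell_vol: float, env_vol: float,
                rate: float = 1.0) -> float:
    """Steady-state signal concentration in the shared environment."""
    return rate * n_cells * cell_vol / env_vol

def quorum_reached(n_cells: int, cell_vol: float, env_vol: float,
                   threshold: float = 0.1) -> bool:
    return signal_conc(n_cells, cell_vol, env_vol) >= threshold

sparse = quorum_reached(1_000, 1e-6, 1.0)    # low density: no response
dense = quorum_reached(200_000, 1e-6, 1.0)   # high density: quorum
```

Doubling both the cell count and the environment volume gives the same answer as the original culture, which is the density-ratio invariance the Duke group describes.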
Forget furry encounters and virtual people glued to unreal fruit machines: members of the iGEM team based at Calgary in Canada have decided to make real use of the virtual-reality space Second Life. And it looks as though the work by Patrick King and colleagues will make use of the scripting and visualisation features of Second Life as well as the meeting-space environment that most people (at least those that have stayed) use:
"The most often touted feature is that SL can offer a classroom-like environment for people at any distance from one another. While the iGEM Calgary island will make an excellent hangout for idle igemmers the world over, our focus is less on creating a classroom, and more on presenting concepts directly. We want to make it easier for new students to grasp the basics of synthetic biology by making it accessible and interactive. This is where SL's object creation and scripting facilities come into play: we can create anything we want, from molecules to cells to lab equipment, and then make it behave like the real thing."
The environment will not just be for the Calgary team, King writes:
"My number one goal for this project is for it to be useful to others, especially early university or high school students just beginning with iGEM, but also biology students in general, and the public. For it to be useful, it must be used; feedback on the accuracy of our work is essential! I hope that Lindsay Island will be open to the public near the end of the summer, but the real test will not come until iGEM 2010, when we will meet our first batch of fresh students."
One of the papers in the Royal Society Interface special issue on synthetic biology takes a look at the potential for a standard graphical notation for engineers working on system design. As one of the authors is Hiroaki Kitano, the proposal, naturally, is for the Systems Biology Graphical Notation (SBGN) coupled with the Systems Biology Markup Language (SBML).
The authors concede that SBGN may need extensions to handle the constructs that synthetic biology engineers want to use but argue:
"Sharing of symbols representing identical biological elements would further help in developing a common graphical lingua franca for biological engineering, on the same lines as in electrical circuit diagrams and other advanced engineering disciplines.
"We strongly believe that careful collaboration on the visual as well as model representation aspects between the two communities would foster the development of a standard graphical notation schema and accelerate the application of computational techniques."
However, one lesson from electrical engineering is that graphical languages do not last long. Circuit diagrams cope well with simple analogue and digital circuits in electronics but as soon as things get complex, text tends to win out. Just look at the way in which textual languages such as Verilog and VHDL pushed graphical schematics to the margins. The textual representations also deal with the idea of parallel operation better. And in synthetic biology, it all happens in parallel.
Writing for a special issue of Royal Society Interface on synthetic biology, Steven Yearley, a member of the Genomics Policy and Research Forum sponsored by the UK's Economic and Social Research Council, claims the regulatory and ethical concerns around the technology go hand-in-hand with the hype. And, to a certain extent, he agrees with the idea that Big Promise technologies, by having bold claims made for them, wind up the concern to the point that extra regulation becomes inevitable:
"...once these assertions about far-reaching novelty or widespread applicability are made the regulatory implications are hard to avoid. The more strongly the claims are put forward, the more powerful the apparent regulatory logic.
"Proponents of synthetic biology need to make claims about its startling novelty and wide-ranging implications if they are to win support, yet they cannot make these claims without simultaneously raising questions about suitable safety and regulatory standards."
I'm not entirely convinced by this. Although you can see the effect proposed by Yearley reflected in the concern over nanotechnology, I think synthetic biology poses greater ethical and regulatory concerns to people because of the issue of dealing with life. You also have the shadow of genetically modified organisms hanging over it, which has encouraged government-funded organisations to focus very much on ethics and regulation in the hope of heading off another GMO crisis.
However, it is interesting to consider how things might have gone if J Craig Venter had not beaten the drum so hard on his lab's work.
Yearley's main point is that any ethical review of synthetic biology has to dispense with the kinds of framing used for bioethics so far. In other words, bioethics as currently practised is not up to the job of determining the ethics of synthetic biology. This is not as bizarre as it sounds: Yearley's argument is that bioethics narrowed its focus because that was what the main players wanted. He does not make the argument directly, but points to an issue raised by the 1999 paper by Cho and Caplan, the first foray into this area: because synthetic biology could challenge the popular view of what gives the living life, any ethical debate has to take that into account. This is not something that traditional bioethics has had to deal with much, beyond deciding at which point an organism has a distinct identity.