Paul Dabrowa does not know if it is illegal to genetically modify beer at home in a way that makes it glow. The process involves transferring a fluorescence gene from jellyfish into yeast cells, then using traditional fermenting methods to turn it into alcohol. But he is worried that it could be against the law given that it involves manipulating genetic material.
“This stuff can be dangerous in the wrong hands, so I did that in an accredited lab,” he says, adding that he himself has only got as far as making yeast cells glow in a Petri dish.
For the most part Dabrowa, a 41-year-old Melbourne-based Australian who styles himself as a bit of an expert on most things, prefers to conduct his biohacking experiments in his kitchen, mostly to find cures for his own health issues and other times just for fun.
In recent years the community of hobbyists and amateurs Dabrowa considers his kin has been energised by the falling cost and growing accessibility of gene-editing tools such as Crispr. This has led to an explosion of unchecked experimentation in self-constructed labs or community facilities focused on biological self-improvement.
Despite a lack of formal microbiological training, Dabrowa has successfully used faecal transplants and machine learning to genetically modify his own gut bacteria to lose weight without having to change his daily regime. The positive results he’s seen on himself have encouraged him to try to commercialise the process with the help of an angel investor. He hopes one day to collect as many as 3,000 faecal samples from donors and share the findings publicly.
Much of his knowledge — including the complex bits related to gene-editing — was gleaned straight from the internet or through sheer strength of will by directly lobbying those who have the answers he seeks. “Whenever I was bored, I went on YouTube and watched physics and biology lectures from MIT [Massachusetts Institute of Technology],” he explains. “I tried the experiments at home, then realised I needed help and reached out to professors at MIT and Harvard. They were more than happy to help.”
At the more radical end of the community are experimentalists such as Josiah Zayner, a former Nasa bioscientist, who became infamous online after performing gene therapy on himself in front of a live audience. Zayner’s start-up, The Odin — to which Crispr pioneer and professor of genetics at Harvard Medical School George Church is an adviser — has stubbornly resisted attempts to regulate its capacity to sell gene-editing kits online in the idealistic belief that everyone should be able to manage their own DNA.
These garage scientists might seem like a quirky new subculture but their rogue mindset is starting to generate consternation among those who specialise in managing biological threats in governments and international bodies.
In 2018 the states that are signatories to the 1972 Biological Weapons Convention (BWC) identified gene editing, gene synthesis, gene drives and metabolic pathway engineering as research that qualifies as “dual use”, meaning it can be deployed for harmful purposes as readily as for beneficial ones.
Many of the parties are now worried that increased accessibility to such technologies could heighten accidental or deliberate misuse, including the development of biological weapons by rogue actors for mass or targeted attacks.
It’s a regulatory oversight that worries Dabrowa more than most. He’s spent years trying to warn officials and journalists about the growing capabilities of amateurs like himself. “I would go and meet ministers with a vial of cowpox and explain the threat,” he says, referencing the relatively benign pathogen that has been used since the days of Edward Jenner to help inoculate people against smallpox.
Among these threats are DNA-synthesis techniques accessible to him as a hobbyist that could easily be used to home-brew lethal pathogens like smallpox out of naturally occurring cowpox or other vaccine-based derivatives.
“If bioterrorists wanted to do it undetected, they could buy a second-hand DNA synthesiser for $2,000. The whole process would cost $10,000 and could be done in a kitchen,” he says.
Similar concerns have long been echoed by Microsoft founder and vaccine philanthropist Bill Gates. Even so, most officials did not take Dabrowa or his warnings seriously, perhaps because of his lack of microbiological credentials.
The global disruption caused by the Covid pandemic, he says, has changed things. It has brought to light how easy it could be for non-state actors to set off a deadly biological chain reaction to purposefully cause worldwide devastation.
Others newly attuned to the risk include former UK prime minister Tony Blair, who warned in a speech this month that “bioterror possibilities may seem like the realm of science fiction. But we would be wise now to prepare for their potential use by non-state actors.”
Engaging with biohackers
Under the terms of the BWC, states are officially committed to taking all the measures they can to prohibit and prevent biological weapon development with such dual-use capacities. In theory that means states have a responsibility to monitor and control activities such as Dabrowa’s.
In practice, the BWC was never designed to address the challenges posed by bioterrorism or rogue participants empowered by increasingly democratised access to gene-editing tools. Many of the prevailing codes of conduct remain fuzzy on what is and isn’t permitted across different jurisdictions, especially within the hobbyist arena.
Dabrowa’s lack of understanding of niche legalities is far from unusual.
According to Piers Millett, vice-president for safety and security at the iGem Foundation, which runs a synthetic biology competition to encourage best practice, there is a systemic lack of understanding about biosecurity issues across learning centres worldwide. Millett told participants at this month’s meeting of BWC experts and states parties in Geneva that up to 71 per cent of surveyed practitioners did not know the definition of “dual-use research of concern”, while 61 per cent did not know the definition of “dual use”.
“I think it really does highlight the importance of the international community better engaging with the DIY bio community or biohacker students,” James Revill, a biosecurity governance expert affiliated with the UN Institute for Disarmament Research and fellow participant at the convention, tells the Financial Times. “The challenge is not stifling localised innovation for peaceful purposes or people wanting to learn about biology.”
Dabrowa’s own campaign to enlighten officials has been motivated by a desire to prevent the actions of a few bad actors, whether by accident or through ill intent, from giving the whole hobbyist community a bad name that eventually shuts everyone down. This, he feels, is important because the greatest scientific breakthroughs have often hailed from out-of-the-box thinking in unsupervised areas.
“Biohackers are what we used to call scientists,” Dabrowa says, noting that Louis Pasteur might today have been considered an equally dangerous operator. “There is no real science or Da Vinci stuff happening in academia these days. The real scientists are tinkering in their garages.”
This view is echoed by some other top scientists who have grown frustrated with the bureaucracy involved in securing funding for the projects they hold dear.
“If you really want to make important discoveries, you have to know how to work the system,” says Richard Muller, an American physicist and emeritus professor of physics at the University of California, Berkeley. In a 1980 letter to Science magazine, Muller revealed that he had secretly redirected funding from approved projects to riskier ones that had been rejected.
Muller told the Financial Times that while this confession landed him in hot water he felt an obligation to come clean about the unorthodox practices he had used to secure funding after his discoveries won awards. “So yes, I think there’s a long history of this. I’m like a garage scientist myself,” he notes.
To Dabrowa’s mind, pressures like this have made the biohacker scene all the more analogous to the one that spawned the personal computing revolution from the garages of college dropouts in Silicon Valley in the 1970s and 1980s. The difference this time is that instead of hacking computer mainframes, today’s biological equivalents are tinkering with genetics.
A similar compulsion to bypass the centralising power of big corporate and academic institutions nonetheless pervades the culture. “Instead of the internet, their discoveries will cure disease and increase everyone’s lifespan,” says Dabrowa.
Some Silicon Valley investors have embraced the off-grid approach to microbiological innovation, seeing it as an important part in democratising access to bespoke gene-based therapeutics. They agree that bureaucratic wrangling over funding and grants may be holding innovation back or directing it into the wrong sorts of risks.
“Whether biology or fusion, the most interesting work happens off-grid,” says venture capitalist Ajay Royan of Mithril Capital. “It has been thus forever . . . it is no accident that Ada Lovelace, Freeman Dyson, Hedy Lamarr, Srinivasa Ramanujan and Leonardo da Vinci all embodied a deeply independent, often subversive, genius.”
‘Gain of function’
Striking the right balance between experimentation that encourages innovation and experimentation that cultivates risk has never been easy in the microbiological field. However, RP Eddy, whose consulting group Ergo has been providing pandemic-related intelligence to the Biden administration and other government agencies, says that while genetic tools may have tipped the risk balance, it’s important not to get overly consumed by the biohacking field.
Eddy points instead to some of the riskier research with dual-use potential that has been happening in formal academic institutions for years. Much of this is not reliant on modern genetic advances and occurs in far less controlled environments than many assume. Such facilities, he says, sometimes require little more than an automatic self-sealing door, gloves and an autoclave machine or air pump.
“Right now there really isn’t an agreed to and followed set of standards for how [biosafety level] BSL3 and BSL4 labs should secure themselves,” he says.
Among the riskiest processes is a research method known as gain of function. This involves purposefully tinkering with viruses to make them more infectious so that vaccines and therapeutics can be pre-emptively researched and developed.
The process first drew international scrutiny when Ron Fouchier, a virologist at the Erasmus Medical Centre in the Netherlands, successfully used the method in November 2011 to make the H5N1 flu more infectious and transmissible to humans.
To create the highly lethal pathogen, Fouchier had taken flu samples and used them to infect ferrets many times over, cherry-picking specimens from the sickest ferrets to infect the next ones in line. It was the simplicity and cheapness of the process that concerned many.
For Simon Wain-Hobson, a retired virologist known for his work in sequencing HIV at the Pasteur Institute, the gains involved never seemed worth the risks being taken. For this reason he and a handful of other peers, among them Richard Ebright, a molecular biologist and biodefence expert at Rutgers University in New Jersey, successfully advocated at the highest political levels for a moratorium on public funding, which began in October 2014.
In recent months gain-of-function activities that occurred after the moratorium at the Wuhan Institute of Virology (WIV) — and their alleged links to the origins of the Sars-Cov-2 outbreak — have come under growing scrutiny. A US intelligence probe ordered by President Joe Biden into whether the virus emerged naturally or could have leaked from the lab failed to reach a definitive conclusion in August and Beijing has rejected any suggestion of a leak from the laboratory. Nonetheless, the scrutiny has revived old controversies about the safety and usefulness of gain-of-function methods as well as whether they should be publicly funded.
The ambiguity has led Wain-Hobson to conclude that the same forces that compel amateurs to tinker away on potentially lethal pathogens in their garages exist everywhere in science, often finding innovative ways or excuses to get around restrictions.
People will always find ‘dirty solutions’ to even the strictest controls, he laments.
He likens the scientific compulsion to tinker to fantasy novelist Terry Pratchett’s observation that “if you put a large switch in some cave somewhere with a sign on it saying ‘end of the world switch. Please do not touch’, the paint wouldn’t have time to dry.”
But defenders of gain of function often make similar arguments. It is better in their opinion to fund the research in official settings where it can be supervised and influenced than to ban it and have it seeping into unsupervised pastures or those open to more ad hoc practices.
Those charged with monitoring biological threats and enforcing the BWC have tended to support bottom-up solutions, such as developing codes of conduct and raising awareness, for this reason. But the problem with such an approach is that it is voluntary, says Filippa Lentzos, a social scientist at King’s College London who is researching biological agent threats.
“It’s really up to the individual institutions, countries, areas and professional associations to look at these guidelines and try to implement their own,” she says, noting the BWC is still without formal enforcement or verification power.
For Dabrowa the measures do not go far enough. He would rather the international bioweapons-control community took a leaf out of the nuclear non-proliferation book and identified supply-chain chokepoints that can be monitored more robustly.
“We need a part of the process that is expensive, difficult and necessary,” he says, advocating for control of a key input material for DNA synthesis called nucleoside phosphoramidite. “It’s the equivalent of asking why someone in Afghanistan just put an order online for weapons-grade uranium.”