Researchers have developed a new approach to computing that uses biological processes to solve complex computing problems. Led by Dr. Dan Nicolau, Chair of the Department of Bioengineering at McGill University in Montreal, Quebec, the research team has built a prototype biocomputer that, like a supercomputer, works by parallel processing – but it operates on a much smaller scale: it’s about the size of a book and uses just a fraction of the energy.
Published this week in the Proceedings of the National Academy of Sciences, the results are the product of a collaborative effort begun seven years ago with colleagues in Germany, Sweden and the Netherlands.
“This started as a back of an envelope idea, after too much rum I think, with drawings of what looked like small worms exploring mazes,” says Dr. Nicolau.
In brief, the biocomputer encodes, or imprints, a given computing problem directly into the physical geometry of a 1.5 cm² microchip (thereby creating the “maze”), and then uses biological agents (the “worms”) such as actin filaments and microtubules – both of which function within biological cells – to independently travel through the imprinted substrate and “explore” the encoded problem. Unlike traditional computing, in which electricity drives electrons along the circuits of a microchip, the biological agents doing the work within the new biocomputer are powered by adenosine triphosphate (ATP), the chemical energy currency of all living cells.
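To make the idea of encoding a problem into network geometry concrete, here is a toy simulation. The article does not name the specific problem the chip solves, so this sketch assumes a small, hypothetical subset-sum instance: each split junction in the network corresponds to one value, an agent either picks the value up or passes it by, and the exit an agent reaches corresponds to the sum of the values collected along its path. In the physical device, many agents explore these paths in parallel; here we simply enumerate them.

```python
from itertools import product

# Hypothetical values "imprinted" into the chip geometry (illustrative only;
# the actual problem instance is not given in the article).
values = [2, 5, 9]

def explore_all_paths(values):
    """Enumerate every path an agent could take through the network.

    Each path is a sequence of junction decisions: 1 = turn and add the
    junction's value, 0 = go straight and skip it. The exit reached is
    the sum of the collected values.
    """
    exits = {}
    for choices in product([0, 1], repeat=len(values)):
        total = sum(v for v, c in zip(values, choices) if c)
        exits.setdefault(total, []).append(choices)
    return exits

exits = explore_all_paths(values)
# Exits reached by at least one agent are the achievable subset sums.
print(sorted(exits))  # [0, 2, 5, 7, 9, 11, 14, 16]
```

The point of the physical device is that it does not enumerate paths sequentially as this script does: millions of ATP-driven filaments traverse the network simultaneously, so all exits are explored in parallel.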
Supercomputers are currently used for a wide range of computing tasks, from complex combinatorial problems in mathematics and quantum mechanics to predicting the weather and aiding resource exploration. But traditional supercomputers generate a lot of heat and correspondingly consume a great deal of energy cooling their processors. A typical supercomputer can require several megawatts of power to operate – roughly, one megawatt is enough to power about 500 homes.
Until recently, only governments and research institutions housed supercomputers due to their size and cost, but private companies are now increasingly buying supercomputers to address their data-crunching needs. According to the technology news site V3, Cray, one of the leading supercomputer manufacturers, derived 15 per cent of its 2015 revenue of $724.7 million (USD) from sales to private companies, and it expects that share to rise to one-third of its business in the future.
One of Cray’s clients is Petroleum Geo-Services, a company producing seismic and electromagnetic data for finding offshore oil and gas reserves, which recently installed the fastest supercomputer in the commercial sector, ranked 14th on the Top500 list of supercomputers.
Other unconventional approaches to computing are in development, including DNA, quantum and microfluidics-based computation, but Dr. Nicolau and his team contend that these approaches have not yet proven practical in an operational sense, whereas their biocomputer opens up a new range of possibilities in computing technology. “Now that this model exists as a way of successfully dealing with a single problem, there are going to be many others who will follow up and try to push it further, using different biological agents, for example,” Dr. Nicolau says.