The M13 bacteriophage, a virus that infects the bacterium Escherichia coli, has been found to offer a key route to faster computers by reducing millisecond time delays in memory.
Scientists have successfully used a virus to engineer a better type of computer memory, which could boost machines' speed and efficiency.
The research, published in the journal Applied Nano Materials, found that a key way to develop faster computers is to reduce these millisecond time delays using the M13 bacteriophage, a virus that infects the bacterium Escherichia coli.
These delays usually come from the transfer and storage of information between a traditional random access memory (RAM) chip and a hard drive. A RAM chip is fast but expensive and volatile, meaning it needs a power supply to retain information, said researchers from the Singapore University of Technology and Design (SUTD).
Phase-change memory can be as fast as a RAM chip and can offer even more storage capacity than a hard drive. The new memory technology uses a material that can reversibly switch between amorphous and crystalline states. A binary-type material such as gallium antimonide could be used to make a better version of phase-change memory.
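The amorphous/crystalline switching described above can be sketched as a toy model. In real phase-change memory, a bit is read out as the electrical resistance of the cell: the amorphous phase is high-resistance, the crystalline phase low-resistance. All class names, resistance values, and thresholds below are invented for illustration, not taken from the research:

```python
# Toy model of a phase-change memory (PCM) cell, purely illustrative.
# Amorphous phase = high resistance (read as 0); crystalline phase =
# low resistance (read as 1). Values and names are assumptions.

class PhaseChangeCell:
    AMORPHOUS_OHMS = 1_000_000   # high resistance (illustrative value)
    CRYSTALLINE_OHMS = 1_000     # low resistance (illustrative value)

    def __init__(self):
        # Cells start in the amorphous (reset) state in this sketch.
        self.resistance = self.AMORPHOUS_OHMS

    def set(self):
        """SET pulse: moderate, longer heating crystallizes the material."""
        self.resistance = self.CRYSTALLINE_OHMS

    def reset(self):
        """RESET pulse: short, intense heating melt-quenches it amorphous."""
        self.resistance = self.AMORPHOUS_OHMS

    def read(self):
        """Reading compares the cell's resistance against a midpoint threshold."""
        return 1 if self.resistance < 100_000 else 0


def write_byte(cells, value):
    # Store one byte across eight cells, most significant bit first.
    for i, cell in enumerate(cells):
        if (value >> (7 - i)) & 1:
            cell.set()
        else:
            cell.reset()


def read_byte(cells):
    byte = 0
    for cell in cells:
        byte = (byte << 1) | cell.read()
    return byte


cells = [PhaseChangeCell() for _ in range(8)]
write_byte(cells, 0b10110010)
stored = read_byte(cells)  # state persists with no power modeled, i.e. non-volatile
```

The point of the sketch is the non-volatility: once a cell is switched, its state (here, its resistance) persists without any refresh, unlike DRAM.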
However, the use of this material can increase power consumption, and it can undergo material separation at around 347 degrees Celsius, researchers said. This makes a binary-type material difficult to incorporate into current integrated circuits, where typical manufacturing temperatures of about 397 degrees Celsius are enough to cause that separation.
“Our research team has found a way to overcome this major roadblock using tiny wire technology,” said Assistant Professor Desmond Loke from SUTD.
The traditional process of making tiny wires can reach a temperature of around 447 degrees Celsius, a heat that causes a binary-type material to separate. For the first time, the researchers showed that, by using the M13 bacteriophage, tiny germanium-tin-oxide wires and memory can be built at low temperature.
“This possibility leads the way to the elimination of the millisecond storage and transfer delays needed to progress modern computing,” according to Loke.