
Here’s a technology that could make laptops and smartphones last longer

(Image credit: Shutterstock / McLittle Stock)

When you think about the parts of our electronic devices that consume the most power, the screen and processor usually spring to mind. However, data transfer, whether within the device or over the air (e.g. to cloud storage providers), is consuming more and more power.

Scientists at the National University of Singapore (NUS) have come up with an innovative technique that promises to cut the energy consumed during memory-intensive processes by up to 80% when bits travel across silicon. In other words, a fivefold improvement in efficiency over current solutions.

They came up with a new type of network-on-chip that trades a small amount of signal quality for a large reduction in power consumption. It does this by dynamically adjusting the amplitude of the transmitted signal: conventional amplitudes are used for mission-critical tasks, where maximum accuracy is required, while lower amplitudes deliver greater power savings elsewhere.
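To see why lowering the amplitude saves so much power, recall the CMOS rule of thumb that dynamic switching power scales with the square of the voltage swing. The sketch below uses that rule with purely illustrative numbers (the capacitance, frequency and swing values are our assumptions, not figures from the NUS chip) to show how a modest reduction in swing yields a large power saving:

```python
def dynamic_power(voltage_swing, capacitance, frequency):
    """Dynamic switching power: P ≈ C * V^2 * f (CMOS rule of thumb)."""
    return capacitance * voltage_swing ** 2 * frequency

# Illustrative values only -- not measurements from the NUS design.
C = 1e-12   # 1 pF effective link capacitance
f = 1e9     # 1 GHz toggle rate

full = dynamic_power(1.0, C, f)      # conventional full-swing signalling
reduced = dynamic_power(0.45, C, f)  # lowered amplitude for non-critical data

savings = 1 - reduced / full
print(f"power savings: {savings:.0%}")  # roughly 80% at ~0.45x swing
```

Because power falls with the square of the swing, cutting the amplitude by a little more than half is already enough to land in the ballpark of the 80% figure quoted above.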

Smarter than usual

The example the team provided was video playback: quality can be degraded imperceptibly whenever full quality is unnecessary, for example when the user looks away from the screen, when ambient light is low, or when battery life is running short.
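A controller built around those cues might look something like the following. This is a hypothetical sketch: the function name, inputs and thresholds are ours for illustration and do not come from the NUS design.

```python
def choose_signal_mode(user_looking, ambient_lux, battery_pct):
    """Pick full-swing signalling only when degradation would be noticeable.

    Hypothetical policy -- thresholds are illustrative, not from the NUS work.
    """
    if not user_looking:
        return "low_power"      # degradation cannot be seen at all
    if ambient_lux < 50:
        return "low_power"      # a dim room masks small artefacts
    if battery_pct < 20:
        return "low_power"      # trade quality for remaining battery life
    return "full_quality"       # visible, mission-critical content

print(choose_signal_mode(user_looking=True, ambient_lux=300, battery_pct=80))
# -> full_quality
```

The appeal of a policy like this is that each check is cheap (a sensor read and a comparison), so the decision itself costs far less energy than the transfers it governs.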

Similar scenarios apply to more powerful (and more power-hungry) platforms such as desktop PCs, NAS boxes, laptops and even servers, but the key opportunity is enabling a full computer vision system, one that can replicate the human visual system while remaining viable from a power perspective.

The stated goal of the research is to build “a new breed of low-power smart cameras that could operate almost perpetually under the tight power budget extracted from the environment such as via a centimeter-sized solar cell”.

It's unclear when the technology will be rolled out for more practical use cases, but given that TSMC (which manufactures chips for AMD, Nvidia and Qualcomm) is supporting the project with chip fabrication, we wouldn't be surprised if it were sooner rather than later.