When you think about the parts of our electronic devices that consume the most power, the screen and processor usually spring to mind. However, data transfer, whether within the device or over the air (e.g. to cloud storage providers), is consuming more and more power.
Scientists at the National University of Singapore (NUS) have come up with an innovative technique that promises to reduce the amount of energy consumed during memory-intensive processes by up to 80%. In other words, a fivefold improvement in efficiency over current solutions for moving bits across silicon.
They devised a new type of network-on-chip that trades a small amount of signal quality for a significant reduction in power consumption. This is achieved by dynamically adjusting the amplitude of the transmitted signal: conventional amplitudes are used for mission-critical tasks to ensure maximum accuracy, while lower amplitudes deliver greater power savings.
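To make the trade-off concrete, here is a minimal, hypothetical sketch of the principle: link power grows roughly with the square of the transmit amplitude, so lowering the amplitude on non-critical transfers saves energy at the cost of a higher error rate. All names and numbers below are illustrative assumptions, not details of the NUS design.

```python
def link_power(amplitude: float, full_amplitude: float = 1.0) -> float:
    """Relative power drawn by the link, normalized so full amplitude = 1.0.

    Signal power scales roughly with the square of the voltage swing.
    """
    return (amplitude / full_amplitude) ** 2

def choose_amplitude(mission_critical: bool) -> float:
    """Use the full swing for critical traffic, a reduced swing otherwise.

    The 0.45 figure is purely illustrative: 0.45 ** 2 ~= 0.20, i.e. roughly
    the 80% power reduction the article mentions.
    """
    return 1.0 if mission_critical else 0.45

for critical in (True, False):
    amp = choose_amplitude(critical)
    print(f"critical={critical}: amplitude={amp}, relative power={link_power(amp):.2f}")
```

The quadratic relationship is why even a modest amplitude reduction yields a large saving, which is the core of the scheme described above.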
- Check out our list of the best mobile workstations on the market
- We've built a list of the best business smartphones around
- Here's our choice of the best business laptops available
Smarter than usual
The example provided by the team was that of imperceptible video quality degradation when full quality is unnecessary, for example when the user looks away from the screen, when ambient light is low, or when the battery is running low.
Similar scenarios are also applicable to more powerful (and power-hungry) platforms such as desktop PCs, NAS boxes, laptops or even servers, but the key opportunity is to enable a full computer vision system, one that can replicate the human vision system while being viable from a power perspective.
The stated goal of the research is to build “a new breed of low-power smart cameras that could operate almost perpetually under the tight power budget extracted from the environment such as via a centimeter-sized solar cell”.
It's unclear when the technology will be rolled out for more practical use cases, but given that TSMC, which manufactures chips for AMD, Nvidia and Qualcomm, supports the project for chip fabrication, we wouldn't be surprised if it was sooner rather than later.
- Here's our list of the best cloud backup services out there