Thursday, April 25, 2024

BusinessDay

How software will handle future data explosion – Schneider Electric

An important principle in the development of IT over the decades has been Moore’s Law, which holds that transistor density in processors doubles roughly every two years.
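To illustrate the cadence the law describes, here is a minimal sketch in Python of how density compounds under a fixed two-year doubling period; the starting density, years and units are illustrative assumptions, not industry figures.

```python
# Idealised Moore's Law projection: transistor density doubling every two years.
# The starting density, years and units here are illustrative assumptions,
# not actual industry figures.
def projected_density(base_density, base_year, target_year, doubling_period_years=2):
    """Project transistor density assuming a fixed doubling period."""
    doublings = (target_year - base_year) / doubling_period_years
    return base_density * 2 ** doublings

# A density of 1.0 (arbitrary units) in 2010 would grow roughly 128-fold by 2024.
print(projected_density(1.0, 2010, 2024))  # 2 ** 7 = 128.0
```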

In a thought leadership article, Natalya Makarochkina, Senior Vice President, Secure Power Division, International Operations at Schneider Electric, says that despite many predictions of its demise, the law has more or less remained a guiding principle. What is perhaps less well known, she notes, is a similarly persistent trend in the data centre space.

As demand for data continues to increase, Makarochkina identifies modular data centres, micro data centres and better storage management as key elements for handling future data explosions and achieving sustainability.

Global data production grew from an estimated 2 zettabytes in 2010 to 41 zettabytes in 2019, and International Data Corporation (IDC) estimates that the global data load will rise to a staggering 175 zettabytes by 2025.
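To put those figures in perspective, a minimal sketch of the growth rate they imply, assuming smooth compound annual growth between the estimate years quoted above:

```python
# Rough growth-rate check based on the zettabyte estimates quoted above
# (assumption: smooth compound annual growth between the estimate years).
def cagr(start_value, end_value, years):
    """Compound annual growth rate over the given number of years."""
    return (end_value / start_value) ** (1 / years) - 1

# 2010 -> 2019: roughly 40% a year; 2019 -> 2025: roughly 27% a year.
print(f"2010-2019: {cagr(2, 41, 2019 - 2010):.0%}")
print(f"2019-2025: {cagr(41, 175, 2025 - 2019):.0%}")
```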

The development of data centre infrastructure management (DCIM) systems has continued apace, allowing the integration of AI to take advantage of hardware and infrastructure developments. What were once experiments are now standard features, allowing unprecedented visibility and control. For those designing new developments, software such as ETAP allows power efficiency to be built into the design from the outset, while also accommodating microgrid architectures.

The data explosion is expected to continue, driven by developments such as industrial IoT, 5G, increasing general automation and autonomous vehicles. According to her, the data that will be generated, far from centralised data infrastructure, must be handled, processed and turned into intelligence quickly, where it is needed.

Makarochkina notes that new data architectures are expected to improve the efficiency with which all of that data is handled, adding that edge computing is seen as an important approach to managing the growing volume of data generated at the edge.

“Schneider Electric has been committed to sustainable business for decades. That has meant a renewed focus on efficiency in all aspects of design and operation. Gains have been made in the efficiency of power and cooling, with UPS systems and modular power supplies showing significant gains with each generation, culminating in the likes of the current Galaxy VL line. This line’s use of lithium-ion batteries has not only increased efficiency, but also extended operational life, reduced environmental impact by reducing raw material use, and facilitated ‘energized swapping’, where power modules can be added or replaced with zero downtime, while increasing protection for operators and service personnel,” she says.

In addition to modular data centres, micro data centres and better storage management, Makarochkina says that other key elements for handling the future data explosion include better instrumentation, data gathering and analysis, which allow for better control and orchestration.

She, however, explains that efficiency must extend not just through the supply chain but throughout entire lifecycles. Vendors, suppliers and partners must all be engaged to ensure that no part of the ecosystem lags in applying the tools that ensure efficiency.

“This applies as much in the design of new equipment and applications as it does through working life and decommissioning,” Makarochkina says, adding that understanding how an entire business ecosystem impacts the environment will be vital to truly achieving net zero goals.

Makarochkina also highlights that agreed standards, transparency and measurability are all vital factors in ensuring results.