
IoT, Edge Computing, Big Data – Creating the Next Wave in Data Center Innovation

The next wave of data center improvement will be driven by the need for connectivity and by technology that enables data center managers to effectively manage inherently complex environments.


By Ashok Rao, Director Strategy, Alliances & Planning, IT Business - Schneider Electric 


The next wave of data center improvement will be driven by the need for connectivity and by technology that enables data center managers to effectively manage inherently complex environments. While in the past the focus was on improving the management of one facility, the emerging ecosystem of hybrid cloud computing has shifted infrastructure management into an exercise of managing multiple facilities with virtualized environments.

Over the past few years, the Internet of Things (IoT) has been a hot topic. It has many definitions. Gartner describes the IoT as the network of physical objects that contain embedded technology to communicate and sense or interact with their internal states or the external environment. IDC defines the IoT as a network of networks of uniquely identifiable endpoints (or "things") that communicate without human interaction using IP connectivity.

Whatever the definition, all agree that the IoT is gaining momentum and that the number of devices connected by 2020 will be very large (estimates range from 20 billion to 50 billion). These devices could be as varied as a "wearable" capturing physical parameters from a person exercising, a sensor that controls and manages the lighting inside a smart building, sensor-enabled pipes detecting water leaks in a "smart water" setup, or machines in a manufacturing plant monitoring their own performance and health. There will be many more such use cases, but in effect, these devices or things, when connected, will generate data that will need to be accessed, stored, analyzed and secured.


Usually, discussions around IoT data focus on protocol compatibility at the application layer and the transport layer. Equally important, however, is the infrastructure that holds this data – the datacenter. The huge amount of data that must be managed implies that the way datacenters are built and managed may have to evolve. This new paradigm will require datacenter owners and managers to be aware of the key trends in the datacenter space.

Flexible, agile datacenters – delivered fast

Companies will need to be able to build new or expanded data centers quickly in order to keep up with business demand while ensuring they are at once reliable and flexible. This will lead to the continued use of modular datacenter solutions and could include the use of prefabricated datacenters as well. The modularity will be not just at the component level, such as the use of modular UPS systems, but at the design level as well, including the use of "pods". Such modular systems shave months, if not years, off the time it takes to build a data center. The resulting datacenter is also more reliable because many components are assembled and tested in the factory under highly controlled conditions – a far more reliable approach than assembling components in the field.


Micro datacenters

A lot of the process information will be uploaded to the Cloud – whether a private Cloud, a public Cloud or centralized datacenters. However, certain applications will need to run close to the load to increase productivity and manage costs. This need for "fast" data will drive the discussion between bandwidth and latency considerations: a tradeoff between locating physical infrastructure closer to the edge versus the bandwidth required when the data is stored centrally.
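The tradeoff above can be sketched with some simple arithmetic. The sketch below is illustrative only – the functions, the link figures and the processing times are all hypothetical assumptions, not vendor tooling – but it shows why a large payload on a modest uplink can make local (edge) processing the faster option.

```python
# Illustrative sketch: when does processing at the edge beat the round trip
# to a central site? All numbers below are made-up assumptions.

def transfer_seconds(payload_mb: float, bandwidth_mbps: float, rtt_ms: float) -> float:
    """Time to move a payload to a central datacenter: link latency + serialization."""
    return rtt_ms / 1000.0 + (payload_mb * 8) / bandwidth_mbps

def cheaper_at_edge(payload_mb: float, edge_process_s: float, central_process_s: float,
                    bandwidth_mbps: float, rtt_ms: float) -> bool:
    """True if handling the data locally beats shipping it to the cloud and back."""
    central_total = transfer_seconds(payload_mb, bandwidth_mbps, rtt_ms) + central_process_s
    return edge_process_s < central_total

# 50 MB of process data over a 100 Mbps uplink with a 40 ms round trip:
# moving the data alone takes ~4 s, so a 2 s local job wins.
print(cheaper_at_edge(50, edge_process_s=2.0, central_process_s=0.5,
                      bandwidth_mbps=100, rtt_ms=40))  # → True
```

In practice the decision also weighs data gravity, security and cost per transferred gigabyte, but the latency-versus-bandwidth arithmetic is the core of the edge argument.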

This need for edge computing will drive the deployment of standardized micro datacenters. The essence of this approach is to design once and deploy anywhere. They will consist of standardized hardware – both IT and physical infrastructure – and be rolled out like an "appliance". These "appliances" cannot be scaled-down versions of large datacenters; they need to be designed for purpose in order to be most effective.


Instead of having to integrate solutions from different vendors, these will be rolled out as a consolidated appliance, resulting in lower costs of testing, deployment, training and support.

Datacenter infrastructure management

As datacenters become more "cloudy, diverse and connected" (as defined by the 451 Group), datacenter infrastructure management (DCIM) tools such as Schneider Electric's StruxureWare for Datacenters become crucial to ensuring data center reliability and efficiency.


DCIM enables customers to collect critical intelligence from the connected things in their datacenter facility. This enhances the visibility of assets and surfaces stranded capacity, thereby helping to improve utilization. In addition, these tools can help improve availability, alerting managers to hot spots and identifying possible disruptions to operations. DCIM gives managers a better understanding of their data center and helps lower their total cost of ownership (TCO) by improving energy efficiency and maximizing resource utilization. Ultimately, DCIM enables IT groups to more effectively and efficiently support business operations.
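To make the hot-spot alerting concrete, here is a minimal sketch of the kind of check a DCIM platform performs internally. It is not StruxureWare code; the rack names and the 27 °C inlet-temperature threshold are assumptions chosen for illustration.

```python
# Hypothetical sketch of a DCIM-style hot-spot check: flag any rack whose
# inlet air temperature exceeds an alerting threshold.

def find_hot_spots(inlet_temps_c: dict, threshold_c: float = 27.0) -> list:
    """Return (sorted) names of racks whose inlet temperature exceeds threshold_c."""
    return sorted(rack for rack, temp in inlet_temps_c.items() if temp > threshold_c)

# Example readings from four racks (values in degrees Celsius, made up):
readings = {"rack-a1": 24.5, "rack-a2": 29.1, "rack-b1": 26.8, "rack-b2": 31.0}
print(find_hot_spots(readings))  # → ['rack-a2', 'rack-b2']
```

A real deployment would pull these readings continuously from networked sensors and correlate them with asset and power data, but the underlying logic is this simple threshold comparison.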

However, for DCIM to be most effective, it is imperative that the platform be standards-based and multi-vendor compatible. Further, it is critical for it to integrate with other management systems that reside in and around the datacenter, such as building management systems and IT network management tools.

Datacenter lifecycle services


Handling all the data that IoT environments create – and other forms of Big Data – creates complexity and stress on data centers that can lead to higher energy use and increased costs. There is a constant need to improve stability and resilience, as the cost of any downtime in an "always connected" world can be quite high. These challenges are further exacerbated by the shortage of skilled personnel to manage these dynamic environments.

Customer needs vary from a vendor providing a break-fix solution to a partner working with them on strategy. Even though datacenters are built with highly resilient systems, they do break, and the basic need of a customer starts with looking for a vendor to support the system when it fails. This leads to a datacenter lifecycle approach to the design, management and optimization of datacenters throughout their lifetime. This approach helps companies anticipate needs and respond in an efficient and proactive manner.

Digital services – remotely monitoring and managing assets for performance, or operating the complete facility on the customer's behalf – will also gain traction in the coming years. These datacenter lifecycle services help companies better adapt to change without sacrificing efficiency or reliability, a capability that will only grow in importance.

Availability and energy consumption will continue to be key elements in the data center strategy of organizations. Periodic datacenter assessments to evaluate the health of datacenters as they constantly change will be key to ensuring that business needs continue to be met. Energy management services such as efficiency assessments or energy monitoring and reporting help customers maintain the optimum operational efficiency of their data centers.
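One metric such efficiency assessments commonly report – though the article does not name it – is Power Usage Effectiveness (PUE), the ratio of total facility power to IT equipment power. The sketch below shows the calculation; the kilowatt figures are invented for the example.

```python
# Power Usage Effectiveness: a widely used datacenter efficiency metric.
# PUE = total facility power / IT equipment power; 1.0 is the (unreachable) ideal,
# and the gap above 1.0 is power spent on cooling, power distribution, lighting, etc.

def pue(total_facility_kw: float, it_load_kw: float) -> float:
    """Compute PUE from total facility power and IT load, both in kilowatts."""
    if it_load_kw <= 0:
        raise ValueError("IT load must be positive")
    return total_facility_kw / it_load_kw

# A facility drawing 1500 kW to deliver 1000 kW of IT load:
print(round(pue(total_facility_kw=1500, it_load_kw=1000), 2))  # → 1.5
```

Tracking this ratio over time is one way a periodic assessment turns raw energy monitoring data into an actionable efficiency trend.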

In conclusion, the Internet of Things is not about networks or sensors; it is about data, and about continuous, safe and efficient access to that data so that organizations can derive insights to improve their business and operations.
