Data center infrastructure management

Data center infrastructure management (DCIM) is a category of solutions originally created to extend the traditional data center management function to include all of the physical assets and resources found in the facilities and IT domains. DCIM deployments were intended to integrate information technology (IT) and facility management disciplines in order to centralize monitoring, management and intelligent capacity planning of a data center's critical systems. Because DCIM is a broadly used term covering a wide range of data center management capabilities, each deployment typically includes only the subset of the full DCIM value needed and expected over time.[1]

Full DCIM deployments may involve specialized software, hardware and sensors, although most deployments do not include all of these components.[2] With more than 75 vendors in 2014 self-identifying their offerings as part of the DCIM market segment, the rapid evolution of the category helped lead to the creation of several associated data center performance management and measurement metrics. These include industry-standard metrics such as PUE (power usage effectiveness), CUE (carbon usage effectiveness) and DCeP (data center energy productivity), as well as vendor-driven metrics such as PAR4 (server power usage) and DCPM (data center predictive modeling), all intended to provide increasingly cost-effective planning and operations support for the data center and the devices it contains.
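
The industry-standard metrics named above have simple, widely published definitions. The following minimal sketch (in Python) shows how PUE, its reciprocal DCiE, and CUE are commonly computed; the annual readings are hypothetical and do not come from the cited sources.

    # Illustrative definitions of common data center efficiency metrics.
    # The readings below are hypothetical.

    def pue(total_facility_kwh, it_equipment_kwh):
        """Power usage effectiveness: total facility energy / IT equipment energy."""
        return total_facility_kwh / it_equipment_kwh

    def cue(total_co2_kg, it_equipment_kwh):
        """Carbon usage effectiveness: total CO2 emissions / IT equipment energy."""
        return total_co2_kg / it_equipment_kwh

    total_kwh, it_kwh = 1_800_000.0, 1_000_000.0     # hypothetical annual readings
    print(f"PUE  = {pue(total_kwh, it_kwh):.2f}")    # 1.80
    print(f"DCiE = {100 * it_kwh / total_kwh:.1f}%") # reciprocal of PUE: 55.6%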

Since its identification as a missing component of optimized data center management, the broad DCIM category has been flooded with a wide range of point solutions and hardware-vendor offerings intended to address this void. The analyst firm Gartner has adopted the term "DCIM suite" to focus on vendors with a more comprehensive set of capabilities. DCIM suite vendors numbered fewer than two dozen in 2014, and their software offerings are wide-ranging and integrated in nature. The existing suites touch upon both IT and facilities and, depending upon the vendor's heritage, may have a bias towards either 1) IT asset lifecycle management or 2) facilities monitoring and access. It is likely that for an extended period the existing DCIM suites will continue to have their core strength in one discipline or the other, rather than addressing both equally. There are also dozens of other vendors whose technologies directly support or enhance the DCIM suites; in general, these specialists' offerings can also be used as viable stand-alone solutions for a specific set of data center management needs. In the fourth quarter of 2014, Gartner[3] released its Magic Quadrant[4] and Critical Capabilities[5] reports, which represented the first tangible attempt at a quantitative comparison of the value each vendor has to offer. The Magic Quadrant covered 17 vendors, while the Critical Capabilities report examined just 7 that Gartner considered broad enough to compare.

The large framework providers are re-tooling their own wares and creating DCIM alliances and partnerships with other DCIM vendors to complete their management picture. The inefficiencies previously caused by limited visibility and control at the physical layer of the data center are simply too costly for end users and vendors alike in an energy-conscious world. These multibillion-dollar framework providers include Hewlett-Packard, BMC, CA and IBM/Tivoli, and each has promised that DCIM will be part of its overall management structure. Today, each is defining its approach through a combination of organic development and partnerships.

While the physical layer of the data center has historically been viewed as a hardware exercise, a number of DCIM suite and DCIM specialist software vendors offer varied DCIM capabilities, including one or more of the following: capacity planning, 3D visualization, real-time monitoring, cable/connectivity management, environmental/energy sensors, business analytics (including financial modeling), process/change management, and integration with various types of external management systems and data sources.

In 2011 some predicted that data center management domains would converge across the logical and physical layers.[6] Such a converged management environment would allow enterprises to use fewer resources, eliminate stranded capacity, and manage the coordinated operations of these otherwise independent components.[7]

Driving factors

In December 2013, an IT analyst at Gartner predicted that "By 2017, DCIM tools will be significantly deployed in over 60% of larger data centers (> 3,000 sq ft) in North America." DCIM can therefore be viewed as a high-growth category, since less than 10 percent of the same market had adopted anything in this category by 2012. Several trends are driving the adoption of DCIM.[8]

Features

At a high level, DCIM can be used for many purposes. It can support data center availability and reliability requirements; identify and eliminate sources of risk to increase the availability of critical IT systems; identify interdependencies between facility and IT infrastructures to alert the facility manager to gaps in system redundancy; and assist in modeling, over long periods of time, the cost structures of building and maintaining the large accumulation of assets that form the data center.
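
As a concrete illustration of the last point, a long-term cost model can be as simple as amortizing an asset's purchase price and adding recurring energy and maintenance costs. The sketch below uses entirely hypothetical figures and a straight-line amortization; it is not drawn from any particular DCIM product.

    # Minimal long-term cost model for a data center asset (illustrative only).
    def annual_asset_cost(capex, lifespan_years, avg_power_kw, kwh_price,
                          annual_maintenance):
        """Rough annual cost: amortized purchase price + energy + maintenance."""
        amortized_capex = capex / lifespan_years
        energy_cost = avg_power_kw * 24 * 365 * kwh_price
        return amortized_capex + energy_cost + annual_maintenance

    # Hypothetical rack of servers: $120,000 purchase, 5-year life, 8 kW average
    # draw at $0.10/kWh, $3,000/year maintenance.
    print(annual_asset_cost(120_000, 5, 8.0, 0.10, 3_000))  # 34008.0 per year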

Some segmentation began to occur in 2013. In many public forums the roster of DCIM suppliers is being grouped into at least two buckets, or segments, in an attempt to reduce customer confusion when researching DCIM solutions. The first bucket consists of the integrated software suites, in which a comprehensive set of lifecycle asset management features are brought together and share a common view of the data center. Integrated repositories, reporting and connectivity are all expected to exist within these suites. Suites share a common look and feel and leverage the underlying asset knowledge where appropriate, and a single source of truth exists across the entire suite for any given attribute.

The second group of DCIM suppliers includes the remaining 100+ vendors. These vendors enhance the DCIM suites and can also exist as stand-alone solutions; they are sometimes referred to as 'specialists' or 'DCIM-ready' components.[10] They include sensor systems, power management solutions, analytics packages, and monitoring tools. One or more of these enhancement solutions will likely be deployed alongside or coupled with a single selected DCIM suite. Additional segmentation is expected as vendors align their offerings with customer needs.

One popular initiative that certain DCIM solutions can address is the reduction of energy usage and the improvement of energy efficiency. In these cases, DCIM solutions enable data center managers to measure energy use, allowing safe operation at higher densities. According to Gartner Research, DCIM can lead to energy savings that reduce a data center's total operating expenses by up to 20 percent. In addition to measuring energy use, other DCIM components such as connectivity management can document network service, optimum routing and failure scenarios, and interact with incident and problem management processes. Computational fluid dynamics (CFD) can be used to optimize airflow and eliminate stranded resources such as space, which further drives down infrastructure costs.

DCIM software is used to benchmark current space, network and power consumption along with equipment temperature,[11] often using real-time feeds and equipment ratings, and then to model the effects of "green" initiatives on the data center's power usage effectiveness (PUE) and data center infrastructure efficiency (DCiE) before committing resources to an implementation.
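
A simple "what-if" model of this kind might look like the sketch below, in which a hypothetical containment project is assumed to reduce cooling power by 30 percent; all power figures are illustrative and do not come from the cited sources.

    # Hypothetical before/after model of a "green" initiative's effect on PUE/DCiE.
    it_load_kw = 500.0    # measured IT equipment load (assumed)
    cooling_kw = 350.0    # measured cooling load (assumed)
    other_kw   = 50.0     # lighting, UPS and distribution losses (assumed)

    baseline_total = it_load_kw + cooling_kw + other_kw
    print(f"baseline PUE   = {baseline_total / it_load_kw:.2f}")          # 1.80

    # Modeled initiative: aisle containment assumed to cut cooling power by 30%.
    projected_total = it_load_kw + cooling_kw * 0.70 + other_kw
    print(f"projected PUE  = {projected_total / it_load_kw:.2f}")         # 1.59
    print(f"projected DCiE = {100 * it_load_kw / projected_total:.1f}%")  # 62.9%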

On the IT side of DCIM, certain vendor implementations of DCIM suites allow optimal server placement with regard to power, network, cooling and space requirements,[12] and a US patent (7,765,286) discusses this type of intelligent placement based upon one or more existing data center conditions.[13]
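
The general idea of placement can be sketched as a constrained best-fit search across racks. The example below is only an illustration of that idea; the rack attributes, the best-fit rule and the figures are assumptions, and it does not reproduce the method of the cited patent or of any vendor's product.

    # Generic sketch of placement: choose a rack with enough power, cooling,
    # network ports and rack units, preferring the tightest power fit so that
    # capacity is not stranded in larger racks. All data here is hypothetical.
    from dataclasses import dataclass

    @dataclass
    class Rack:
        name: str
        spare_power_kw: float
        spare_cooling_kw: float
        free_ports: int
        free_units: int

    def fits(r, power_kw, ports, units):
        return (r.spare_power_kw >= power_kw and r.spare_cooling_kw >= power_kw
                and r.free_ports >= ports and r.free_units >= units)

    def best_rack(racks, power_kw, ports, units):
        candidates = [r for r in racks if fits(r, power_kw, ports, units)]
        return min(candidates, key=lambda r: r.spare_power_kw - power_kw,
                   default=None)

    racks = [Rack("A01", 3.0, 4.0, 6, 10), Rack("B07", 1.2, 2.0, 2, 4)]
    print(best_rack(racks, power_kw=1.0, ports=2, units=2).name)  # B07, tightest fit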

Evolution of tools

Traditional approaches to resource provisioning and service requests have proven ill-suited to virtualization and cloud computing. The manual handoffs between technology teams were also highly inefficient and poorly documented. This initially led to poor utilization of system resources and to IT staff spending considerable time on activities that provided little business value. In order to efficiently manage data centers and cloud computing environments, IT teams need to standardize and automate virtual and physical resource provisioning activities and develop better insight into real-time resource performance and consumption.[14]

Keeping pace with these needs, efficient software now exists to address the specific requirements of a data center. In addition to replacing manual effort, these management systems provide services such as auditing, data compilation and record keeping, and generate consistent reports that help optimize operations.[15]

Data center monitoring systems were initially developed to track equipment availability and to manage alarms. While these systems evolved to provide insight into equipment performance by capturing real-time data and organizing it in a proprietary user interface, they lacked the functionality necessary to effectively monitor and adjust interdependent systems across the physical infrastructure in response to changing business and technology needs.
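
In their simplest form, such systems compare each incoming reading against fixed thresholds and raise an alarm when a limit is crossed, roughly as in the sketch below; the metrics and limits shown are assumptions for illustration only.

    # Simplified threshold-alarm logic of early monitoring systems (illustrative).
    THRESHOLDS = {                      # hypothetical acceptable ranges
        "inlet_temp_c": (18.0, 27.0),
        "humidity_pct": (20.0, 80.0),
    }

    def check_reading(metric, value):
        low, high = THRESHOLDS[metric]
        if value < low:
            return f"ALARM: {metric} = {value} below {low}"
        if value > high:
            return f"ALARM: {metric} = {value} above {high}"
        return None                     # within range, no alarm

    print(check_reading("inlet_temp_c", 29.5))  # high-temperature alarm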

More comprehensive tools were later developed to connect this equipment and provide a complete view of the facility's data center infrastructure. In addition to enabling comprehensive real-time monitoring, these tools were equipped with modeling and management functionality to facilitate long-term capacity planning; dynamic optimization of critical systems' performance and efficiency; and efficient asset utilization.[17]

In response to the rapid growth of business-critical IT applications, server virtualization became a popular method for increasing a data center's IT application capacity without making additional investments in physical infrastructure. Server virtualization also enabled rapid provisioning cycles, as multiple applications could be supported by a single provisioned server.

Modern data centers are challenged with disconnects between the facility and IT infrastructure architectures and processes. These challenges have become more critical as virtualization creates a dynamic environment within a static environment, where rapid changes in compute load translate to increased power consumption and heat dispersal.[18] If unanticipated, rapid increases in heat densities can place additional stress on the data center's physical infrastructure, resulting in a lack of efficiency, as well as an increased risk for overloading and outages.[19] In addition to increasing risks to availability, inefficient allocation of virtualized applications can increase power consumption and concentrate heat densities, causing unanticipated "hot spots" in server racks and areas. These intrinsic risks, as well as the aforementioned drivers, have resulted in an increase in market demand for integrated monitoring and management solutions capable of "bridging the gap between IT and facilities" systems.[20]

In 2010, analyst firm Gartner, Inc. issued a report on the state of DCIM implementations and speculated on future evolutions of the DCIM approach. According to the report, widespread adoption of DCIM over time will lead to the development of "intelligent capacity planning" solutions that support synchronized monitoring and management of both physical and virtual infrastructures.[2]

Intelligent capacity planning is intended to enable the aggregation and correlation of real-time data from heterogeneous infrastructures, providing data center managers with a common repository of performance and resource utilization information.[21] It also promises to let data center managers automate the management of IT applications based on server capacity, as well as on conditions within the data center's physical infrastructure, optimizing the performance, reliability and efficiency of the entire data center infrastructure.
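
A minimal sketch of such aggregation, assuming hypothetical IT-side and facility-side data sources keyed by rack, might look like the following; the field names and figures are illustrative, not drawn from any particular product.

    # Correlate IT and facility readings by rack into one repository (illustrative).
    it_metrics = {        # e.g. from a virtualization manager (hypothetical)
        "rack-A01": {"cpu_util_pct": 62, "vm_count": 48},
        "rack-B07": {"cpu_util_pct": 35, "vm_count": 21},
    }
    facility_metrics = {  # e.g. from power and temperature sensors (hypothetical)
        "rack-A01": {"power_kw": 6.8, "inlet_temp_c": 24.1},
        "rack-B07": {"power_kw": 3.1, "inlet_temp_c": 21.4},
    }

    repository = {
        rack: {**it_metrics.get(rack, {}), **facility_metrics.get(rack, {})}
        for rack in set(it_metrics) | set(facility_metrics)
    }

    # Both domains can now be queried together, e.g. to spot racks that draw a
    # lot of power while running few virtual machines.
    for rack, readings in sorted(repository.items()):
        print(rack, readings)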

References

  1. Harris, Mark J. (2013-09-18). "Data Center Infrastructure Management Tools Monitor Everything and Ease Capacity Planning and Operational Support". CIO.com.
  2. Cappuccio, David J. (2010-03-29). "DCIM: Going Beyond IT". Gartner, Inc.
  3. Pultz, Jay (2014-12-30). "Data Center Infrastructure Management". Gartner.
  4. Pultz, Jay (2014-10-30). "Magic Quadrant for DCIM". Gartner.
  5. Pultz, Jay (2014-12-30). "Critical Capabilities of DCIM". Gartner.
  6. Huff, Lisa (2011-08-18). "The Battle for the Converged Data Center Network". Data Center Knowledge.
  7. Oestreich, Ken (2011-11-15). "Converged Infrastructure". The CTO Forum.
  8. "Put DCIM into Your Automation Plans". Forrester Research. December 2009. Archived from the original on January 9, 2010.
  9. "Infrastructure Monitoring and Management Tops List of Data Center User Issues". Information Management. 2010-06-03.
  10. http://www.rfcode.com/solutions/data-center-infrastructure-management-rfid-tracking-dcim
  11. http://graphicalnetworks.com/netterrain-dcim-2/
  12. "Problems DCIM Solves".
  13. US Patent 7,765,286
  14. Improving Datacenter Operational Efficiency Using Self-Service Provisioning and Advanced Performance Analytics
  15. "What are the Benefits of DCIM Software? | Sunbird DCIM". www.sunbirddcim.com. Retrieved 2016-05-23.
  16. "Energy Efficiency with CFD Integration | Datacenter Clarity LC". datacenter.mayahtt.com. Retrieved 2016-11-04.
  17. "Data Center Management and Efficiency Software", 451 Group
  18. Preimesberger, Chris (2010-10-19). "Emerson Power Bringing Its Perspective to Data Center Management". eWeek.
  19. Marko, Kurt (2010-07-02). "A Look At Data Center Infrastructure Management Software & Its Impact". Processor Magazine. 32 (14): 18.
  20. Harris, Mark (2010-06-08). "Bridging the Gap between IT and Facilities". Data Center Knowledge.
  21. Cole, Dave (June 2010). "The Infrastructure Management Elephant" (PDF). PTSDCS.