Energy Saving in Data Centers

A special issue of Electronics (ISSN 2079-9292). This special issue belongs to the section "Computer Science & Engineering".

Deadline for manuscript submissions: closed (31 July 2016) | Viewed by 28730

Special Issue Editor


Prof. Dr. Wolfgang Bein
Guest Editor
Department of Computer Science, University of Nevada, Las Vegas, NV, USA
Interests: online competitive algorithms, combinatorial optimization, heuristics, scheduling, green computing, sensor networks, cloud computing, open source projects

Special Issue Information

Dear Colleagues,

The chief executive officer of Google remarked, “What matters most to the computer designers at Google is not speed but power, low power, because data centers can consume as much energy as a city.” Cloud computing now forms a backbone of IT services, and the recent increase in high-definition multimedia delivery has placed new burdens on energy resources. These energy demands have motivated research on energy efficiency strategies and on the integration of renewable energy sources to reduce operational costs and environmental impact. The ever-increasing scale of data centers presents unprecedented challenges for hardware and algorithmic development.

This Special Issue aims to present a wide range of papers with contributions in the area of green cloud computing. Contributions that bring together hardware challenges and algorithmic approaches are especially encouraged, and we are particularly interested in work at the intersection of Electrical Engineering, Modeling and Simulation, and Computer Science. However, we invite researchers from all relevant fields to contribute case studies, reviews, surveys, and research papers.

Prof. Dr. Wolfgang Bein
Guest Editor

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once you are registered, go to the submission form. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the special issue website. Research articles, review articles, as well as short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Electronics is an international peer-reviewed open access semimonthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 2400 CHF (Swiss Francs). Submitted papers should be well formatted and written in good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Keywords

  • data center sustainability, dependability and cost
  • energy reduction in server systems
  • energy-aware resource allocation
  • energy-optimized multimedia clouds
  • thermal states and cooling system operation
  • power-down management
  • burdened energy management
  • dynamic voltage and frequency scaling
  • dynamic component deactivation
  • load balancing power for cluster-based systems
  • power management in virtual cloud environments
  • geographic cloud server location design
  • case studies and simulations for data center energy problems
  • approximation algorithms and heuristics for data center energy problems
  • integration of renewable energy into cloud computing

Benefits of Publishing in a Special Issue

  • Ease of navigation: Grouping papers by topic helps scholars navigate broad scope journals more efficiently.
  • Greater discoverability: Special Issues support the reach and impact of scientific research. Articles in Special Issues are more discoverable and cited more frequently.
  • Expansion of research network: Special Issues facilitate connections among authors, fostering scientific collaborations.
  • External promotion: Articles in Special Issues are often promoted through the journal's social media, increasing their visibility.
  • e-Book format: Special Issues with more than 10 articles can be published as dedicated e-books, ensuring wide and rapid dissemination.

Further information on MDPI's Special Issue policies can be found here.

Published Papers (5 papers)


Editorial

3 pages, 136 KiB  
Editorial
Energy Saving in Data Centers
by Wolfgang Bein
Electronics 2018, 7(1), 5; https://doi.org/10.3390/electronics7010005 - 9 Jan 2018
Cited by 4 | Viewed by 5098
Abstract
Globally, CO2 emissions attributable to Information Technology are on par with those resulting from aviation. Recent growth in cloud service demand has elevated the energy efficiency of data centers to a critical area within green computing. Cloud computing represents a backbone of IT services, and the recent increase in high-definition multimedia delivery has placed new burdens on energy resources. Hardware innovations together with energy-efficient techniques and algorithms are key to controlling power usage in an ever-expanding IT landscape. This Special Issue contains a number of contributions showing that data center energy efficiency should be addressed from diverse vantage points.

Research

1437 KiB  
Article
Characterizing Energy per Job in Cloud Applications
by Thi Thao Nguyen Ho, Marco Gribaudo and Barbara Pernici
Electronics 2016, 5(4), 90; https://doi.org/10.3390/electronics5040090 - 12 Dec 2016
Cited by 10 | Viewed by 5683
Abstract
Energy efficiency is a major research focus in sustainable development and is becoming even more critical in information technology (IT) with the introduction of new technologies, such as cloud computing and big data, that attract more business users and generate more data to be processed. While many proposals have been presented to optimize power consumption at the system level, the increasing heterogeneity of current workloads requires a finer analysis at the application level to enable adaptive behaviors and to reduce global energy usage. In this work, we focus on batch applications running on virtual machines in the context of data centers. We analyze the application characteristics, model their energy consumption and quantify the energy per job. The analysis focuses on evaluating the efficiency of applications in terms of performance and energy consumed per job, in particular when shared resources are used and the hosts on which the virtual machines run are heterogeneous in their energy profiles, with the aim of identifying the best combinations in the use of resources.
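
As an illustration of the energy-per-job metric discussed in this abstract, the minimal Python sketch below integrates a sampled power trace into energy and divides by the number of completed jobs. It is not the authors' model; the function name, sampling interval, and power readings are hypothetical.

    # Minimal sketch: estimating energy per job for a batch application on a VM.
    # Illustrative only; assumes a power trace sampled at fixed intervals and a
    # known count of completed jobs.
    def energy_per_job(power_samples_w, sample_interval_s, jobs_completed):
        """Integrate a power trace (watts) into energy (joules), divided by jobs."""
        if jobs_completed <= 0:
            raise ValueError("jobs_completed must be positive")
        total_energy_j = sum(p * sample_interval_s for p in power_samples_w)
        return total_energy_j / jobs_completed

    # Example: hypothetical host power readings taken every 10 s while 50 jobs ran.
    trace = [120.0, 135.5, 150.2, 148.7, 110.3]
    print(energy_per_job(trace, 10.0, 50))  # joules per job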

519 KiB  
Article
Scheduling Energy Efficient Data Centers Using Renewable Energy
by Santiago Iturriaga and Sergio Nesmachnow
Electronics 2016, 5(4), 71; https://doi.org/10.3390/electronics5040071 - 19 Oct 2016
Cited by 11 | Viewed by 5960
Abstract
This work presents a multi-objective approach for scheduling energy consumption in data centers considering traditional and green energy sources. The problem is addressed as a whole by simultaneously scheduling the state of the servers and the cooling devices, and by scheduling the workload of the data center, which comprises a set of independent tasks with due dates. The goal is to simultaneously minimize the energy consumption budget of the data center, the deviation of energy consumption from a reference profile, and the number of tasks whose due dates are violated. Two multi-objective evolutionary algorithms hybridized with a greedy heuristic are proposed and are enhanced by applying simulated annealing for post hoc optimization. Experimental results show that these methods are able to reduce the energy consumption budget by about 60% while adequately following a power consumption profile and providing a high quality of service. These results confirm the effectiveness of the proposed algorithmic approach and the usefulness of green energy sources for data center infrastructures.
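
For readers unfamiliar with greedy placement heuristics of the kind hybridized in this work, the toy Python sketch below packs independent task-hours into the time slots with the highest renewable-energy score and then counts due-date violations. It is an illustrative assumption, not the authors' multi-objective evolutionary algorithm; the Task fields, scores, and capacities are hypothetical.

    # Toy greedy placement: favor the "greenest" time slots, then count how many
    # tasks miss their due dates. A real scheduler would balance energy budget,
    # profile deviation, and deadline violations jointly, as the paper does.
    from dataclasses import dataclass

    @dataclass
    class Task:
        hours: int  # processing time in whole slots
        due: int    # latest slot index by which the task should finish

    def greedy_green_schedule(tasks, green_score, slot_capacity):
        free = [slot_capacity] * len(green_score)  # task-hours left per slot
        greenest_first = sorted(range(len(green_score)), key=lambda s: -green_score[s])
        violations = 0
        for task in sorted(tasks, key=lambda t: t.due):  # earliest due date first
            placed, finish = 0, -1
            for slot in greenest_first:
                if placed == task.hours:
                    break
                if free[slot] > 0:
                    free[slot] -= 1
                    finish = max(finish, slot)
                    placed += 1
            if placed < task.hours or finish > task.due:
                violations += 1
        return violations

    tasks = [Task(hours=2, due=3), Task(hours=1, due=1), Task(hours=3, due=5)]
    print(greedy_green_schedule(tasks, green_score=[5, 2, 8, 1, 6, 3], slot_capacity=2))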

4646 KiB  
Article
Energy Aware Pricing in a Three-Tiered Cloud Service Market
by Debdeep Paul, Wen-De Zhong and Sanjay Kumar Bose
Electronics 2016, 5(4), 65; https://doi.org/10.3390/electronics5040065 - 28 Sep 2016
Cited by 4 | Viewed by 4855
Abstract
We consider a three-tiered cloud service market and propose an energy-efficient pricing strategy for this market. Here, the end customers are served by Software-as-a-Service (SaaS) providers, who implement customized services for their customers. To host these services, the SaaS providers, in turn, lease infrastructure-related resources from Infrastructure-as-a-Service (IaaS) or Platform-as-a-Service (PaaS) providers. In this paper, we propose and evaluate a mechanism for pricing between SaaS providers and IaaS/PaaS providers and between SaaS providers and the end customers. The pricing scheme is designed so that the integration of renewable energy is promoted, which is a crucial aspect of energy efficiency. Thereafter, we propose a technique to strategically provide an improved Quality of Service (QoS) by deploying more resources than computed by the optimization procedure. This technique is based on the square root staffing law in queueing theory. We carry out numerical evaluations with real data traces on electricity price, renewable energy generation, workload, etc., in order to emulate the real dynamics of the cloud service market. We demonstrate that, under practical assumptions, the proposed technique can generate more profit for the service providers operating in the cloud service market.
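
The square root staffing law mentioned in this abstract can be stated compactly: for an offered load R = λ/μ (arrival rate over per-server service rate), provision roughly R + c·√R servers, where c sets the quality-of-service margin. The short Python sketch below only illustrates that staffing step with assumed rates; it does not reproduce the paper's pricing mechanism.

    import math

    def square_root_staffing(arrival_rate, service_rate, c=1.0):
        """Servers to provision for offered load R = arrival_rate / service_rate."""
        offered_load = arrival_rate / service_rate
        return math.ceil(offered_load + c * math.sqrt(offered_load))

    # Example: 900 requests/s, each server handles 10 requests/s, QoS margin c = 2.
    print(square_root_staffing(arrival_rate=900, service_rate=10, c=2.0))  # -> 109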

1013 KiB  
Article
Virtual Machine Replication on Achieving Energy-Efficiency in a Cloud
by Subrota K. Mondal, Jogesh K. Muppala and Fumio Machida
Electronics 2016, 5(3), 37; https://doi.org/10.3390/electronics5030037 - 1 Jul 2016
Cited by 5 | Viewed by 5210
Abstract
The rapid growth in cloud service demand has led to the establishment of large-scale virtualized data centers in which virtual machines (VMs) are used to handle user requests for service. A user’s request cannot be completed if the VM fails. Replication mechanisms can be used to mitigate the impact of failures. Further, data centers consume a large amount of energy, resulting in high operating costs and contributing to significant greenhouse gas (GHG) emissions. In this paper, we focus on Infrastructure-as-a-Service (IaaS) clouds where user job requests are processed by VMs, and we analyze the effectiveness of VM replication in terms of job completion time as well as energy consumption. Three different schemes are considered: cold, warm, and hot replication. The trade-offs between job completion time and energy consumption under the different replication schemes are characterized through comprehensive analytical models that capture VM state transitions and the associated power consumption patterns. The effectiveness of the replication schemes is demonstrated through experimental results. To verify the validity of the proposed analytical models, we extend the widely used cloud simulator CloudSim and compare the simulation results with the analytical solutions.
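
As a rough illustration of the trade-off analyzed in this paper, the toy Python model below compares cold, warm, and hot replication by charging each scheme a replica standby power and a recovery delay on failure. The power figures, delays, and failure probability are invented for the example; the paper's analytical state-transition models and CloudSim experiments are far more detailed.

    # Toy cold/warm/hot comparison: hot replication recovers instantly but burns the
    # most standby power; cold saves energy but pays the longest recovery delay.
    SCHEMES = {
        "cold": {"standby_w": 0.0,   "recovery_s": 120.0},  # replica booted on failure
        "warm": {"standby_w": 60.0,  "recovery_s": 20.0},   # replica provisioned, idle
        "hot":  {"standby_w": 150.0, "recovery_s": 0.0},    # replica running in lockstep
    }

    def expected_cost(job_time_s, p_fail, active_w=150.0, scheme="warm"):
        s = SCHEMES[scheme]
        exp_time = job_time_s + p_fail * s["recovery_s"]
        # Primary draws active power; the replica draws its standby power throughout.
        exp_energy_j = (active_w + s["standby_w"]) * exp_time
        return exp_time, exp_energy_j

    for name in SCHEMES:
        t, e = expected_cost(job_time_s=600, p_fail=0.05, scheme=name)
        print(f"{name:4s}  completion {t:6.1f} s   energy {e / 1000:7.1f} kJ")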
