Please use this identifier to cite or link to this item:
Appears in Collections: University of Stirling Research Data
Title: Generated data and plots for the paper "Exploring the Accuracy -- Energy Trade-off in Machine Learning"
Creator(s): Brownlee, Alexander E I
Contact Email:
Date Available: 4-Mar-2021
Citation: Brownlee, AEI (2021): Generated data and plots for the paper "Exploring the Accuracy -- Energy Trade-off in Machine Learning". University of Stirling. Faculty of Natural Sciences. Dataset.
Publisher: University of Stirling. Faculty of Natural Sciences
Dataset Description (Abstract): Machine learning accounts for considerable global electricity demand and a corresponding environmental impact: training a single large deep-learning model can produce around 284,000 kg of the greenhouse gas carbon dioxide. In recent years, search-based approaches have begun to explore improving software so that it consumes less energy. Machine learning is a particularly strong candidate for this because it is possible to trade off functionality (accuracy) against energy consumption, whereas for many programs functionality is simply a pass-or-fail constraint. We use a grid search and an NSGA-II optimisation run to explore hyperparameter configurations for a multilayer perceptron (from scikit-learn) on five classification data sets, considering trade-offs of classification accuracy against training or inference energy (measured using the pyRAPL library) and run times. The dataset includes the generated energy, time, and accuracy metrics for each data set, and the full set of corresponding plots. Details for each file are in the enclosed readme.
Dataset Description (TOC):
- [dataset]_MLP_metrics[_more]_objective1_objective2.pdf - plots showing the full set of results from the grid search. objective1 is either cross-fold validation accuracy or test-data accuracy; objective2 is energy or time on training or testing. ("_more" indicates that the larger set of hyperparameter values was used)
- [dataset]_MLP_metrics_more_hls_testcpuenergy.pdf, [dataset]_MLP_metrics_more_hls_traingcpuenergy.pdf - relationship between hidden layer size and CPU energy, for testing and training respectively
- [dataset]_MLP_metrics_more_[training|testing]_cputime_vs_cpuenergy.pdf - relationship between CPU time and energy, for training and testing respectively
- [dataset]_MLP_metrics_more_training_cpuenergy_vs_testingcpuenergy.pdf - relationship between training energy and testing energy
- pf_aggr_[dataset]_MLP_metrics[_more]_[objective1]_[objective2].csv - the Pareto fronts from the grid search, aggregated using median values for both objectives
- pf_[dataset]_MLP_metrics[_more]_[objective1]_[objective2].csv - the Pareto fronts from the grid search, no aggregation
- pf_[dataset]_MLP_metrics[_more]_[objective1]_[objective2].pdf - plot of the Pareto front from the grid search
- iris-mlp-4-hist-cpu-energy.pdf - histogram of CPU energy measurements for iris
- full set of raw data from the grid searches; includes the failed runs where energy is negative
- PFs_MLP_Testing.ods, PFs_MLP_Training.ods - the spreadsheets used to show the relationships between hyperparameters and objectives for the Pareto fronts from the grid search
- results from the NSGA-II run on diabetes, with objectives of test accuracy and training energy:
  - fun_*.txt - the objective values from the Pareto front for each run
  - allfronts.csv - the above, concatenated into one file
  - var_*.txt - the hyperparameter values from the Pareto front for each run
  - diabetes*.dat - the 1st, 15th, and 30th attainment surfaces
- attsurface_diabetes_MLP_metrics_more_testacc_trainenergy.pdf - the figure showing the attainment surface for the NSGA-II data above
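The pf_*.csv files above hold the non-dominated (Pareto) subsets of the grid-search results, trading accuracy against energy. As an illustration only (the function name and sample values below are hypothetical, not taken from the dataset), a minimal sketch of that non-dominated filtering:

```python
# Hypothetical sketch of the non-dominated filtering behind the pf_*.csv
# files: accuracy is maximised, energy (J) is minimised. A point is kept
# unless some other point is at least as good on both objectives and
# strictly better on at least one.

def pareto_front(points):
    """Return the non-dominated subset of (accuracy, energy) pairs."""
    front = []
    for acc, energy in points:
        dominated = any(
            (a >= acc and e <= energy) and (a > acc or e < energy)
            for a, e in points
        )
        if not dominated:
            front.append((acc, energy))
    return front

# Illustrative (made-up) measurements: (accuracy, training energy in J)
runs = [(0.95, 12.0), (0.93, 8.5), (0.95, 9.0), (0.90, 9.5)]
print(sorted(pareto_front(runs)))  # → [(0.93, 8.5), (0.95, 9.0)]
```

The aggregated pf_aggr_*.csv variants would apply the same filter after taking the median of each objective over repeated runs of a configuration.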
Type: dataset
Funder(s): University of Stirling
Rights: Covered by the standard CC-BY 4.0 licence.
Affiliation(s) of Dataset Creator(s): University of Stirling (Biological and Environmental Sciences)

Files in This Item:
File: GI_ICSE_2021-EnergyML.zip (11.49 MB, ZIP)

This item is protected by original copyright

Items in DataSTORRE are protected by copyright, with all rights reserved, unless otherwise indicated.