Please use this identifier to cite or link to this item: http://hdl.handle.net/11667/173
Full metadata record
DC Field: Value (Language)
dc.contributor: Brownlee, Alexander E I
dc.contributor.other: University of Stirling (en_GB)
dc.creator: Brownlee, Alexander E I
dc.date.accessioned: 2021-03-04T14:56:21Z
dc.date.available: 2021-03-04T14:56:21Z
dc.date.created: 2021-01
dc.identifier.uri: http://hdl.handle.net/11667/173
dc.description.abstract: Machine learning accounts for considerable global electricity demand and the resulting environmental impact: training a single large deep-learning model can produce 284,000 kg of the greenhouse gas carbon dioxide. In recent years, search-based approaches have begun to explore improving software to consume less energy. Machine learning is a particularly strong candidate for this because it is possible to trade off functionality (accuracy) against energy consumption, whereas for many programs functionality is simply a pass-or-fail constraint. We use a grid search and an NSGA-II optimisation run to explore hyperparameter configurations for a multilayer perceptron (from scikit-learn) on five classification data sets, considering trade-offs of classification accuracy against training or inference energy (measured using pyRAPL) and run times. This dataset includes the generated energy, time, and accuracy metrics for each data set, together with the full set of corresponding plots. Details for each file are in the enclosed readme. (en_GB)
dc.description.tableofcontents: (en_GB)
- [dataset]_MLP_metrics[_more]_objective1_objective2.pdf: plots showing the full set of results from the grid search. objective1 is either cross-fold validation accuracy or test-data accuracy; objective2 is energy or time on training or testing ("more" indicates that the larger set of hyperparameter values was used)
- [dataset]_MLP_metrics_more_hls_testcpuenergy.pdf, [dataset]_MLP_metrics_more_hls_traingcpuenergy.pdf: relationship between hidden layer size and CPU energy, for testing and training respectively
- [dataset]_MLP_metrics_more_[training|testing]_cputime_vs_cpuenergy.pdf: relationship between CPU time and energy, for training and testing respectively
- [dataset]_MLP_metrics_more_training_cpuenergy_vs_testingcpuenergy.pdf: relationship between training energy and testing energy
- pf_aggr_[dataset]_MLP_metrics[_more]_[objective1]_[objective2].csv: the Pareto fronts from the grid search, aggregated using median values for both objectives
- pf_[dataset]_MLP_metrics[_more]_[objective1]_[objective2].csv: the Pareto fronts from the grid search, no aggregation
- pf_[dataset]_MLP_metrics[_more]_[objective1]_[objective2].pdf: plot of the Pareto front from the grid search
- iris-mlp-4-hist-cpu-energy.pdf: histogram of CPU energy measurements for iris
- grid_search_results.zip: full set of raw data from the grid searches; includes the failed runs where energy is negative
- PFs_MLP_Testing.ods, PFs_MLP_Training.ods: the spreadsheets used to show the relationships between hyperparameters and objectives for the Pareto fronts from the grid search
- mop.zip: results from the NSGA-II run on diabetes, with objectives test accuracy and training energy:
  - fun_*.txt: the objective values from the Pareto front for each run
  - allfronts.csv: the above, concatenated into one file
  - var_*.txt: the hyperparameter values from the Pareto front for each run
  - diabetes*.dat: the 1st, 15th, and 30th attainment surfaces
- attsurface_diabetes_MLP_metrics_more_testacc_trainenergy.pdf: the figure showing the attainment surface for the mop data above
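The pf_*.csv files above pair two objectives, typically accuracy (to be maximised) against energy in joules (to be minimised). As a rough illustration of how such a front relates to the raw grid-search results, the following is a minimal sketch of non-dominated filtering in plain Python; the function name and the sample (accuracy, energy) points are illustrative, not taken from the dataset:

```python
def pareto_front(points):
    """Return the non-dominated subset of (accuracy, energy) pairs.

    A point dominates another if its accuracy is no lower AND its
    energy is no higher, with at least one strict inequality.
    """
    front = []
    for acc, eng in points:
        dominated = any(
            a >= acc and e <= eng and (a > acc or e < eng)
            for a, e in points
        )
        if not dominated:
            front.append((acc, eng))
    return front

# Illustrative (accuracy, energy-in-joules) hyperparameter configurations
configs = [(0.90, 5.0), (0.85, 2.0), (0.95, 9.0), (0.85, 3.0), (0.80, 2.5)]
print(sorted(pareto_front(configs)))
# → [(0.85, 2.0), (0.9, 5.0), (0.95, 9.0)]
```

The dominated configurations (0.85, 3.0) and (0.80, 2.5) are discarded: each is matched or beaten on both objectives by (0.85, 2.0).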
dc.language.iso: eng (en_GB)
dc.publisher: University of Stirling. Faculty of Natural Sciences (en_GB)
dc.relation: Brownlee, AEI (2021): Generated data and plots for the paper "Exploring the Accuracy -- Energy Trade-off in Machine Learning". University of Stirling. Faculty of Natural Sciences. Dataset. http://hdl.handle.net/11667/173 (en_GB)
dc.relation.isreferencedby: Brownlee A, Adair J, Haraldsson S & Jabbo J (2021) Exploring the Accuracy - Energy Trade-off in Machine Learning. In: ICSEW'21: Proceedings of the IEEE/ACM 43rd International Conference on Software Engineering Workshops. Genetic Improvement Workshop at the 43rd International Conference on Software Engineering, Virtual, 30.05.2021. New York: ACM. DOI: https://doi.org/10.1109/GI52543.2021.00011. Available from: http://hdl.handle.net/1893/32312 (en_GB)
dc.rights: Rights covered by the standard CC-BY 4.0 licence: https://creativecommons.org/licenses/by/4.0/ (en_GB)
dc.subject.classification: ::Mathematical sciences::Numerical Analysis (en_GB)
dc.subject.classification: ::Complexity science::Complexity Science::Human Factors in Complexity (en_GB)
dc.subject.classification: ::Medical and health interface::Medical science and disease::Diabetes (en_GB)
dc.title: Generated data and plots for the paper "Exploring the Accuracy -- Energy Trade-off in Machine Learning" (en_GB)
dc.type: dataset (en_GB)
dc.contributor.email: sbr@cs.stir.ac.uk (en_GB)
dc.contributor.affiliation: University of Stirling (Biological and Environmental Sciences) (en_GB)
dc.date.publicationyear: 2021 (en_GB)
Appears in Collections:University of Stirling Research Data

Files in This Item:
File: GI_ICSE_2021-EnergyML.zip (11.49 MB, ZIP)


This item is protected by original copyright


