Please use this identifier to cite or link to this item:
http://hdl.handle.net/11667/137
Full metadata record
DC Field | Value | Language |
---|---|---|
dc.contributor | Ladouce, Simon | - |
dc.coverage.temporal | 2016-2017 | en_GB |
dc.creator | Ladouce, Simon | - |
dc.date.accessioned | 2019-09-25T10:05:49Z | - |
dc.date.available | 2019-09-25T10:05:49Z | - |
dc.identifier.uri | http://hdl.handle.net/11667/137 | - |
dc.description.abstract | The distribution of attention between competing processing demands can have dramatic real-world consequences; however, little is known about how limited attentional resources are distributed during real-world behaviour. Here we employ mobile EEG to characterise the allocation of attention across multiple sensory-cognitive processing demands during naturalistic movement. We used a neural marker of attention, the Event-Related Potential (ERP) P300 effect, to show that attention to targets is reduced when human participants walk compared to when they stand still. In a second experiment, we show that this reduction in attention is not caused by the act of walking per se. A third experiment identified the independent processing demands driving reduced attention to target stimuli during motion. ERP data reveal that the reduction in attention seen during walking reflects the linear and additive sum of the processing demands produced by visual and inertial stimulation. The mobile cognition approach used here shows how limited resources are precisely re-allocated according to the sensory processing demands that occur during real-world behaviour. | en_GB |
dc.description.tableofcontents | Numerous EEG recordings of a series of experiments investigating factors underlying the capture of attentional resources during real-world behaviour. The datasets have been processed with the open-source EEGLAB toolbox for MATLAB (pairs of files with .set and .fdt extensions can be accessed directly through the toolbox). Each pair of files (.set and .fdt extensions) contains epoched and preprocessed EEG data. The file names reflect the experiment to which the dataset belongs (e.g., XP1), followed by the anonymised subject number preceded by the experimenter's initials (for this study Simon Ladouce, e.g., SL01), then the recording condition (e.g., 'standing', 'walking', ...) and finally the stimulus type (either rare or frequent). In practice, the formatting used can be read as follows: 'XP1_SL1_standing_rare.fdt/.set'. Further details can be found in the abstract and/or the related manuscript (forthcoming). | en_GB |
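The naming convention above can be parsed programmatically when iterating over the dataset. A minimal sketch in Python, assuming the four-part underscore-separated scheme described in the record (the function name and field labels are my own, not part of the dataset documentation):

```python
import os

def parse_dataset_filename(filename):
    """Split a name like 'XP1_SL1_standing_rare.set' into its components.

    Assumes the scheme described in the record: experiment, subject
    (experimenter initials + number), condition, stimulus type.
    """
    stem, ext = os.path.splitext(os.path.basename(filename))
    experiment, subject, condition, stimulus = stem.split("_")
    return {
        "experiment": experiment,  # e.g. 'XP1'
        "subject": subject,        # e.g. 'SL1'
        "condition": condition,    # e.g. 'standing', 'walking'
        "stimulus": stimulus,      # 'rare' or 'frequent'
        "extension": ext,          # '.set' (header) or '.fdt' (data)
    }

info = parse_dataset_filename("XP1_SL1_standing_rare.set")
```

The .set file carries the EEGLAB dataset structure and the paired .fdt file holds the raw data matrix, so loading only the .set file through EEGLAB retrieves both.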
dc.publisher | University of Stirling. Faculty of Natural Sciences | en_GB |
dc.relation | Ladouce, S (2019): Mobile EEG identifies the re-allocation of attention during real-world activity. University of Stirling. Faculty of Natural Sciences. Dataset. http://hdl.handle.net/11667/137 | en_GB |
dc.relation.isreferencedby | Ladouce, S., Donaldson, D.I., Dudchenko, P.A. et al. (2019) Mobile EEG identifies the re-allocation of attention during real-world activity, Scientific Reports, 9, 15851. DOI: https://doi.org/10.1038/s41598-019-51996-y. Available from: http://hdl.handle.net/1893/30437 | en_GB |
dc.rights | Rights covered by the standard CC-BY 4.0 licence: https://creativecommons.org/licenses/by/4.0/ | en_GB |
dc.subject.classification | ::Psychology::Psychology::Attention | en_GB |
dc.subject.classification | ::Psychology::Psychology::Cognition | en_GB |
dc.subject.classification | ::Psychology::Psychology::Experimental psychology | en_GB |
dc.title | Mobile EEG identifies the re-allocation of attention during real-world activity | en_GB |
dc.type | dataset | en_GB |
dc.contributor.email | simon.ladouce@stir.ac.uk | en_GB |
dc.contributor.affiliation | University of Stirling (Biological and Environmental Sciences) | en_GB |
dc.date.publicationyear | 2019 | en_GB |
Appears in Collections: University of Stirling Research Data
Files in This Item:
This item is protected by original copyright.