Heading Direction Tracks Internally Directed Selective
Attention in Visual Working Memory
Jude L. Thom1
, Anna C. Nobre1, Freek van Ede1,2* , and Dejan Draschkow1*
Abstract
■ We shift our gaze even when we orient attention internally
to visual representations in working memory. Here, we show
that the bodily orienting response associated with internal selective
attention is widespread, as it also includes the head. In three
virtual reality experiments, participants remembered two visual
items. After a working memory delay, a central color cue indicated
which item needed to be reproduced from memory. After
the cue, head movements became biased in the direction of the
memorized location of the cued memory item, despite there
being no items to orient toward in the external environment.
The heading-direction bias had a distinct temporal profile
from the gaze bias. Our findings reveal that directing attention
within the spatial layout of visual working memory bears a
strong relation to the overt head-orienting response we engage
when directing attention to sensory information in the external
environment. The heading-direction bias further demonstrates
that common neural circuitry is engaged during external and internal
orienting of attention. ■
INTRODUCTION
We often move our head when orienting attention overtly
to sensory information in our environment. We can also
orient attention covertly to items in the external world,
in the absence of large head movements. For example,
you may be watching a film while directing your attention
toward your phone when you are expecting a phone call.
Covertly orienting attention to items in the environment is
accompanied by subtle overt manifestations of orienting
behavior, including directional biases in eye movements
( Yuval-Greenberg, Merriam, & Heeger, 2014; Hafed,
Lovejoy, & Krauzlis, 2011; Engbert & Kliegl, 2003; Hafed
& Clark, 2002).
We can also orient attention internally to items main-
tained in the spatial layout of visual working memory
(van Ede & Nobre, 2021; Manohar, Zokaei, Fallon, Vogels,
& Husain, 2019; Souza & Oberauer, 2016; Murray, Nobre,
Clark, Cravo, & Stokes, 2013; Olivers, Peters, Houtkamp, &
Roelfsema, 2011; Griffin & Nobre, 2003). Similar to atten-
tional selection in the external world, internal selective
attention within visual working memory is associated with
small directional eye-movement biases toward the memo-
rized locations of attended items (Draschkow, Nobre, &
van Ede, 2022; van Ede, Deden, & Nobre, 2021; van Ede,
Board, & Nobre, 2020; van Ede, Chekroud, & Nobre, 2019;
see also: Ferreira, Apel, & Henderson, 2008; Spivey &
Geng, 2001). This overt manifestation of internal selective
attention occurs despite the external absence of the
attended memory items and even when memorized item
location is not required for task performance.

1University of Oxford, United Kingdom; 2Vrije Universiteit
Amsterdam, The Netherlands
*These authors contributed equally.
Head movements are also affected by covert attentional
selection. Covert attention activates neck muscles (Corneil
& Munoz, 2014a; Corneil, Munoz, Chapman, Admans, &
Cushing, 2007), and the lag between head and eye movements
is affected by the congruency of covert attentional
cues (Khan, Blohm, McPeek, & Lefèvre, 2009), suggesting
that the head and eyes may each be modulated or involved
when directing covert attention toward items in the
external environment. The potential involvement of head
and eye movements may be separable, given the differences
in the neurophysiological pathways controlling head and
eye movements (Gandhi & Sparks, 2007). Therefore, it is
important to examine both the head and eyes when asking
questions about bodily orienting behavior, because the
head and eyes may contribute in distinct ways as part of a
broader bodily orienting response (Corneil & Munoz, 2014b).
If the overt ocular traces of covert selective attention in
memory (Draschkow et al., 2022; van Ede et al., 2020,
2021; van Ede, Chekroud, & Nobre, 2019) are part of a more
widespread bodily orienting response, then directing inter-
nal selective attention to items in working memory should
also be accompanied by head movements. Therefore, it is
conceivable that internally directed selective attention in
working memory is associated with subtle orienting behavior
not only of the eyes but also of the head.
To test whether such an embodied orienting response
of eyes and head occurs during internally directed spatial
attention, we analyzed head- and eye-tracking data from a
virtual reality (VR) study investigating selective attention
in visual working memory (Draschkow et al., 2022). The
head-tracking data, which were not interrogated previously,
allowed us to address whether head movements are
similarly biased toward the memorized locations of selectively
attended items in visual working memory.

© 2023 Massachusetts Institute of Technology. Published under a
Creative Commons Attribution 4.0 International (CC BY 4.0) license.
Journal of Cognitive Neuroscience 35:5, pp. 856–868
https://doi.org/10.1162/jocn_a_01976
METHODS
The data were collected as part of a study that used VR to
examine different spatial frames of working memory in
immersive environments (Draschkow et al., 2022). To
answer the current research question, we focused on
head-movement data, which were not analyzed in the pre-
vious study (Draschkow et al., 2022). In this section, we
describe the experimental materials and methods relevant
to the focus of our research question. Information on
additional manipulations that were not the focus of the
current study can be found in Draschkow et al. (2022).
Participants
We analyzed data from three experiments (1–3). Each
experiment had a sample size of 24 human volunteers.
Sample size was based on our prior study, which contained
four experiments using a similar outcome measure
(van Ede, Chekroud, & Nobre, 2019) and revealed robust
results with 20–25 participants. To address our new
research question and further increase power and sensitivity,
we combined the samples from the individual
experiments to create a larger data set with 48 partici-
pants and 72 experimental runs. The participants in
Experiments 1–2 were the same and were recruited sep-
arately from the participants in Experiment 3 (Experi-
ments 1–2: mean age 25.8 years, age range 18–40 years,
all right-handed, 20 women; Experiment 3: mean age
25.5 years, age range 19–37 years, 1 left-handed, 13 women).
All participants had normal or corrected-to-normal vision.
Participants provided written consent prior to the exper-
iments and were compensated £10 per hour. Protocols
were approved by the local ethics committee (Central
University Research Ethics Committee #R64089/RE001
and #R70562/RE001).
Materials and Apparatus
Participants wore an HTC Vive Tobii Pro VR headset.
Participants held the controller in their dominant hand,
using their index finger and thumb to press response but-
tons. The positions of the headset and hand controller
were recorded by two Lighthouse base stations, using 60
infrared pulses per second. These pulses interacted with
37 sensors on the headset and 24 sensors on the control-
ler, providing submillimeter tracking accuracy. The head-
set contained a gyroscope and accelerometer, allowing
for the precise recording of head rotational positions
(accuracy < 0.001°). The headset contained a binocular
eye tracker (approximately 0.5° visual angle accuracy,
sampling rate 90 Hz). Two organic light-emitting diode
screens displayed the environment in the headset (refresh
rate 90 Hz, 1080 × 1200 pixels, field of view 100° horizontal
× 110° vertical). We used Vizard (Version 6) to render and
run the VR experimental environment on a Windows desk-
top computer.
In the VR environment, participants stood in the center
of a virtual room (4.2 m long, 4.2 m wide, 2.5 m tall) with a
gray concrete texture applied to the four walls (Figure 1A).
The working memory items were two colored bars (length
0.5 m/14.25° of visual angle, diameter 0.05 m/1.425° of visual
angle), which appeared 2 m in front of the participant.
One item appeared 1 m to the left (28.7° of visual angle)
and the other 1 m to the right, both on the front wall;
the centers of the items were 2 m apart.
Procedure and Tasks
Participants were given time to get used to the headset,
controller, laboratory room, and virtual environment
before the experiments began. This included 24 practice
trials in which participants learned how to make responses
and became familiar with the trial sequence.
In all experiments, each trial consisted of the same main
steps (Figure 1A). At the beginning of each trial, partici-
pants stood upright in the center of the room and were
instructed to fixate on a fixation cross with their eyes (size
12 cm × 12 cm, ∼3.4° visual angle). During the task, par-
ticipants were free to hold their heads as they liked. After
500 msec of fixation, two items appeared (as described in
the Materials and Apparatus section). Both items were
slanted at independently drawn random orientations
(ranging 0–180°). One item was red, and the other was
blue. The color of each item was allocated randomly on
each trial. Participants were instructed to remember the
orientations of the items during a delay.
All three experiments included conditions in which par-
ticipants turned 90° to the left or right during the delay
between the presentation of the items and the cue (“turn-
ing trials”). These turning trials were part of a separate
study addressing a distinct question regarding how selec-
tion dynamics in visual working memory are influenced by
self-movement (Draschkow et al., 2022) and were not
included in our analyses.
Because of differences in the turning trials between
experiments, the timings of the tasks differed between
experiments. In Experiment 1, the items disappeared after
500 msec, compared with Experiments 2–3 where the
items remained present for 1600 msec. After the items
disappeared, the participant needed to remember the ori-
entations of the items during a delay. The delays lasted
1935 msec (Experiment 1) and 835 msec (Experiments
2–3) after the items disappeared.
Following the delay, the fixation cross changed to a blue
or red color, matching the color of the left or right item in
working memory. The color cue indicated the item for
which the orientation response needed to be reproduced
(target item) and signaled that participants could initiate
the response when ready. The target item was randomly
selected in each trial irrespective of orientation, location,
and color. Participants had unlimited time to recall the
orientation of the target item and activate a response.
Once a response was initiated, participants had 2000 msec
to dial in the orientation of the target item, using the controller.
The response activation generated a dial made of
two handles (diameter 0.06 m) on a circular torus (diameter
0.5 m, tube diameter 0.02 m), which was centered at
the fixation cross. This dial was only present during the
response stage. The handles moved along the torus
according to the controller's orientation, allowing participants
to reproduce the orientation of the target item.
Participants confirmed their response by pressing the
trigger button of the controller. Immediately after confirming
their response, the dial disappeared, and participants
received feedback on their performance. Feedback
was presented on a 0–100 scale, with 100 being perfect
reproduction of the target item's orientation. This number
was presented above the fixation cross for 500 msec.
Feedback was followed by a 700-msec delay. After this
delay, there was an intertrial interval randomly selected
between 1500 and 2000 msec.

Figure 1. Heading direction tracks attentional selection in visual working memory. (A) Participants remembered the orientations of two colored
items in a VR environment. After a delay, the fixation cross retrospectively cued the color of the target item. Participants then reported the orientation
of the target item using the controller. (B) We recorded the shift in the projected location (in cm) of the "heading direction" onto the virtual wall in
front of the participant. (C) Average heading direction for left (L) and right (R) item trials as a function of time after cue. Shading indicates ±1 SEM.
(D) Towardness of heading direction as a function of time after cue. Horizontal line indicates a significant difference from zero, using cluster-based
permutation testing. Shading indicates ±1 SEM. (E) Density map showing the difference in heading-direction density between right minus left item
trials (500–2000 msec after cue). Circles indicate the locations of the items during encoding. Centers of items are at 100 cm (28.7° of visual angle).
(C–E) Data aggregated from Experiments 1–3. See Figure A1 for separate plots of heading direction and heading-direction towardness as functions of
time after cue for Experiments 1–3.
There were 100 stationary trials in each experiment (50
left target item, 50 right target item). Trials were presented
in five blocks with 20 trials each. The headset recalibrated
the gaze tracking at the beginning of each block. Partici-
pants completing Experiments 1 and 2 performed both
tasks in the same session, in counterbalanced order. Each
experiment lasted approximately 1 hr, and the full session
lasted approximately 2 hr.
Data Analysis
Tracking and behavioral recordings were stored in a
comma-separated values (CSV) file for each participant. We
used RStudio (Version 1.3.1093, 2020) to analyze the data.
The data files and analysis scripts are available online
at https://doi.org/10.17605/OSF.IO/24U9M.
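As a concrete illustration of working with such per-participant recordings, the sketch below loads one file and splits it into trials. This is a hypothetical Python/pandas loader, not the authors' pipeline (their analysis was in R); the column names ("trial", "time_ms", "head_x") are illustrative assumptions, not the actual schema of the OSF files.

```python
import pandas as pd

# Hypothetical loader for one participant's tracking CSV. Column names
# ("trial", "time_ms", "head_x") are illustrative assumptions only; the
# real files and R analysis scripts are at the OSF repository cited above.
def load_participant(path):
    df = pd.read_csv(path)
    # One row per tracking sample; split the continuous recording into
    # per-trial data frames keyed by trial number.
    return {trial: g.reset_index(drop=True) for trial, g in df.groupby("trial")}
```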
The “heading direction” variable refers to the projected
location (in cm) of the heading direction onto the virtual
wall in front of the participant. The “gaze direction” vari-
able was the horizontal distance between the fixation cross
and the gaze-fixation point on the virtual wall (averaged
between both eyes). For an illustration of the heading
direction variable, see Figure 1B.
We also recorded yaw, roll, and translation of the head-
set (Figure 2) to look at the contributions of these
individual components of the heading-direction vector.
Head yaw is the rotational position around the head's vertical
axis. Head roll is the rotational position around the
head's longitudinal axis. For example, rotating your head
while reading a large sign left to right would be reflected in
changing yaw values, and tilting your head to read a slanted
sign would change roll values. Translation refers to the
horizontal movement of the entire headset (e.g., if the participant
moved their entire head to the left while looking
straight ahead). Together, yaw, roll, and translation are the
components that can influence the horizontal heading
direction.

Figure 2. Biased movement in yaw, roll, and translation. (A) Left: Yaw as a component of heading direction. Center: Average yaw for left (L) and right
(R) item trials as a function of time after cue. Shading indicates ±1 SEM. Right: Towardness of yaw as a function of time after cue. Horizontal line
indicates a significant difference from zero, using cluster-based permutation testing. Shading indicates ±1 SEM. (B) Same as A, using roll instead of
yaw. (C) Same as A, using translation instead of yaw. (B–C) Lack of horizontal lines in towardness plots on the right indicates that no significant
difference from zero was found, using cluster-based permutation testing with a threshold of p < .05.
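The geometry linking these components to the wall-projected heading direction can be sketched as follows. This is a simplified illustration (a small-angle, yaw-plus-lateral-translation model that ignores roll and pitch), not the headset's actual projection math; the 2 m wall distance is taken from the Materials and Apparatus section.

```python
import math

WALL_DISTANCE_M = 2.0  # items appeared on a wall 2 m in front of the participant


def heading_on_wall(yaw_deg, lateral_translation_m):
    """Project the horizontal heading direction onto the front wall (in cm).

    Simplified sketch: a yaw rotation shifts the intersection point on the
    wall by d * tan(yaw), and lateral head translation shifts it one-to-one.
    Roll and pitch are ignored here; in the study they contributed little
    to the horizontal bias.
    """
    x_m = WALL_DISTANCE_M * math.tan(math.radians(yaw_deg)) + lateral_translation_m
    return 100.0 * x_m  # convert metres to cm


# A 0.5 deg yaw rotation alone moves the projected point by roughly 1.75 cm,
# which illustrates how subtle rotations produce the centimetre-scale biases
# reported in the paper.
```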
We epoched the data from 500 msec before cue to
2000 msec after cue. We smoothed all time-course data
over four samples (44-msec smoothing window aver-
age). In each trial, the mean value between 200 and
0 msec before the cue was used as a baseline and sub-
tracted from all values in the trial. We excluded trials in
which heading direction or gaze direction exceeded
0.5 m (half the distance to the locations of the memo-
randa) in either direction of the fixation cross during the
time window (−500 msec to 2000 msec) to remove the
effect of large outliers. This cutoff was set a priori in
accordance with our previous work (Draschkow et al.,
2022). We also excluded trials with a yaw or roll of over
20° in either direction (average percentage of excluded
trials per participant: M = 5.96%, SE = 0.01; total percentage
of excluded trials: 16.58%). Importantly, however, not
applying any cutoff did not change the findings presented
in the Results section.
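The epoching, smoothing, baselining, and exclusion steps described above can be sketched as a single per-trial function. This is a minimal NumPy illustration of the stated logic (the paper's actual preprocessing was done in R), assuming a horizontal position trace in metres sampled at 90 Hz with time stamps relative to cue onset.

```python
import numpy as np


def preprocess_trial(position, time_ms, cutoff_m=0.5):
    """Sketch of the per-trial preprocessing described above.

    Assumes `position` is the horizontal heading/gaze position in metres
    (90 Hz samples) and `time_ms` gives each sample's time relative to cue
    onset. Returns the baseline-corrected trace, or None if the trial would
    be excluded by the 0.5 m cutoff (half the distance to the memoranda).
    """
    position = np.asarray(position, dtype=float)
    time_ms = np.asarray(time_ms)

    # Epoch from 500 msec before cue to 2000 msec after cue.
    keep = (time_ms >= -500) & (time_ms <= 2000)
    pos, t = position[keep], time_ms[keep]

    # Smooth over four samples (~44 msec moving-average window at 90 Hz).
    kernel = np.ones(4) / 4.0
    pos = np.convolve(pos, kernel, mode="same")

    # Baseline: mean position from 200 to 0 msec before the cue.
    baseline = pos[(t >= -200) & (t < 0)].mean()
    pos = pos - baseline

    # Exclude trials exceeding the cutoff in either direction.
    if np.abs(pos).max() > cutoff_m:
        return None
    return pos
```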
We compared behavior between right- and left-item
trials in the three experiments separately to check if
the side of the target item affected performance. We
used within-subject ANOVAs to check for effects of target
side on error and RT. To follow up on findings (including
null findings), we conducted Bayesian t tests (Rouder,
Speckman, Sun, Morey, & Iverson, 2009) with the default
settings of the BayesFactor package (Morey et al.,
2021). Bayes-factor values either indicated evidence in
favor of the alternative hypothesis (B01 > 3), evidence in
favor of the null hypothesis (B01 < 0.33), or inconclusive
evidence (0.33 < B01 < 3; Kass & Raftery, 1995).

Next, we plotted the change in the time-course data
(heading direction, yaw, roll, translation, gaze direction)
from baseline (−200 to 0 msec before cue), separately
for left- and right-item trials. To increase sensitivity and
interpretability, we constructed a single measure of
"towardness." Towardness aggregated horizontal movement
toward the target item on each trial, combining
leftward movement in left-item trials and rightward
movement in right-item trials. A positive towardness indicated
a horizontal position in the direction of the target
item. Towardness at each time step was given by the
trial-average horizontal position in right-item trials minus
the trial-average horizontal position in left-item trials
(where position values left of fixation were negative),
divided by two. The same procedure for calculating
towardness was used for all time-course head and gaze
data. We used this towardness variable to determine
the significance of the biased movements (compared
with zero), using "cluster-depth" (Frossard & Renaud,
2022) cluster-based permutation tests (Sassenhagen &
Draschkow, 2019; Maris & Oostenveld, 2007). We ran
the cluster-based permutation testing in R with the "permuco"
package (Frossard & Renaud, 2021, 2022).

To gain a better understanding of the scale and variance
of the heading direction, we plotted a density
map of all of the heading-direction values between
500 msec and 2000 msec postcue over all trials and all
participants (including excluded trials). We used color
to code the side of the target item in each trial and to
highlight differences in the directionality of heading direction
between item sides.

RESULTS

Participants performed a visual working memory task in a
VR environment while we tracked their head and gaze. In
the task, participants remembered the orientations of two
colored bars, one on the left and one on the right, for a
short delay (Figure 1A). After the working memory delay,
a color cue indicated the bar for which participants needed
to reproduce the orientation on a dial.

Heading Direction Tracks Internal Selective
Attention in Visual Working Memory

After the color change in the fixation cross (cue onset),
horizontal heading direction became biased in the direction
of the memorized external location of the cued memory
item (Figures 1B–1E). This heading-direction bias
occurred although there was no information present or
expected at the external location corresponding to the
memorized item after the color cue.

The bias in horizontal heading movement was leftward
in trials in which the color cue corresponded with the
memory item that had been encoded on the left ("left
item"), and rightward in trials in which the color cue corresponded
with the memory item that had been encoded
on the right ("right item"). Figure 1B illustrates the nature
of the heading-direction bias in left- and right-item trials.
The average heading direction after the color cue for trials
with cued memory items on the left and right is plotted
separately in Figure 1C. To quantify this heading-direction
bias and express it as a single measure, we combined the
heading-direction bias from left- and right-item trials into a
measure of towardness (van Ede, Chekroud, & Nobre,
2019). The towardness of the heading direction became
evident starting at approximately 500 msec after the onset
of the cue (Figure 1D; cluster p < .05; largest cluster ranging
between 1167 and 1367 msec).

To explore the scale of the heading-direction bias, we
calculated density maps of single-trial heading-direction
values and subtracted density maps between left- and
right-item trials. To focus on the window of interest, we
considered all heading-direction values when the heading-direction
bias was most pronounced (500–2000 msec;
Figure 1E). This revealed the subtle nature of the heading-direction
bias. Participants did not move their heading
direction all the way to the memorized locations of the
items (circles in Figure 1E). Instead, participants subtly
moved their heading direction toward the memorized
item locations (<0.5° of rotation), with heading-direction
biases remaining close to fixation, akin to the type of
directional biases we have recently observed in gaze
(Draschkow et al., 2022; van Ede et al., 2020, 2021; van
Ede, Chekroud, & Nobre, 2019). The properties of the
heading-direction bias were similar across three slightly
different versions of the task (Experiments 1–3) and are
plotted separately in Figure A1. There were no significant
effects of target side (left vs. right) on behavioral performance
(error and RT) in any of the experiments (see
Figure A2).
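The towardness measure defined in the Data Analysis section reduces to a one-line computation. The sketch below is a minimal NumPy illustration of that definition (the paper's actual analysis was in R); trial data are assumed to be arrays of shape (n_trials, n_timepoints) of baseline-corrected horizontal positions, with leftward positions negative.

```python
import numpy as np


def towardness(left_trials, right_trials):
    """Towardness as defined in the Data Analysis section.

    At each time step: the trial-average horizontal position in right-item
    trials minus the trial-average position in left-item trials (positions
    left of fixation are negative), divided by two. Positive values indicate
    movement toward the cued item's memorized side.
    """
    left = np.asarray(left_trials, dtype=float)
    right = np.asarray(right_trials, dtype=float)
    return (right.mean(axis=0) - left.mean(axis=0)) / 2.0
```

The same function applies unchanged to heading direction, yaw, roll, translation, or gaze, since all are horizontal time courses with the same sign convention.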
The Heading-Direction Bias Is Driven by Movement
along the Head’s Yaw Axis
To determine which heading-movement components
contributed to the heading-direction bias, we separately
analyzed yaw, roll, and translation. Like the heading-
direction vector, yaw followed the movement pattern of
heading direction in the left- and right-item trials, which
was also confirmed by a significantly positive towardness
cluster (Figure 2A; p < .05, cluster-corrected). Roll showed
a nonsignificant towardness trend (Figure 2B; p > .999 for
all clusters of the full time window), and translation did not
move toward the memorized locations of the cued memory
items (Figure 2C; p > .257 for all clusters of the full
time window). We also investigated all components making
up the heading-direction measure (x-, y-, and z-translation,
yaw, pitch, and roll) during the critical 500- to 1500-msec
postcue period (Figure A3). Figure 3 shows how head
rotation around the yaw axis closely tracks heading
direction. Thus, the leftward–rightward rotation along
the head's yaw axis was the primary factor contributing
to the directional heading-direction bias when selectively
attending items in our visual working memory task.

Figure 3. The gaze bias and the heading-direction bias. (A) Average gaze direction for left (L) and right (R) item trials as a function of time after cue.
Shading indicates ±1 SEM. (B) Towardness of gaze direction (gaze) and heading direction (heading) as a function of time after cue. Horizontal line
indicates a significant difference from zero, using cluster-based permutation testing. Shading indicates ±1 SEM.
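The cluster-corrected p values reported throughout come from cluster-based permutation testing of the towardness time courses against zero. The sketch below illustrates the general logic with a classic sign-flip cluster-mass test (Maris & Oostenveld, 2007); note this is a simplified stand-in, not the "cluster-depth" correction from the R permuco package that the paper actually used, and the cluster-forming threshold of t = 2.0 is an arbitrary choice for illustration.

```python
import numpy as np


def cluster_sign_flip_test(towardness, n_perm=1000, t_thresh=2.0, seed=0):
    """Sign-flip cluster-mass permutation test of towardness against zero.

    `towardness` has shape (n_participants, n_timepoints). Participant sign
    flips build a null distribution of the maximum cluster mass (sum of
    one-sample t values in a run of points exceeding `t_thresh`). Returns
    the p value of the largest observed cluster. Assumes nonzero variance
    at every time point.
    """
    rng = np.random.default_rng(seed)
    n = towardness.shape[0]

    def max_cluster_mass(data):
        # One-sample t statistic at every time point.
        t = data.mean(0) / (data.std(0, ddof=1) / np.sqrt(n))
        best = run = 0.0
        for ti in t:
            run = run + ti if ti > t_thresh else 0.0
            best = max(best, run)
        return best

    observed = max_cluster_mass(towardness)
    null = [max_cluster_mass(towardness * rng.choice([-1.0, 1.0], (n, 1)))
            for _ in range(n_perm)]
    # p value: fraction of permutations with at least as large a cluster mass.
    return (1 + sum(m >= observed for m in null)) / (n_perm + 1)
```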
The Heading-Direction Bias Is Accompanied by a
Gaze Bias in the Same Direction
Like the heading direction, gaze direction moved toward
the location of the cued item during internal selective
attention, as we have previously reported in this data set
(Draschkow et al., 2022) as well as in prior data sets (van
Ede & Nobre, 2021; van Ede et al., 2020, 2021; van Ede,
Chekroud, & Nobre, 2019). Figure 3A shows the leftward
and rightward movement of gaze direction in left- and
right-item trials. The gaze towardness was significantly different
from zero after the cue (p < .05 between 400 and
1244 msec, cluster-corrected; Figure 3B). We focused our
statistical analyses on the data aggregated across the individual
experiments to improve sensitivity, noting that the
gaze direction for left and right trials and towardness over
time were similar between all three experiments (Experiments
1–3; Figure A1).
In Figure 3B, we also overlay the heading-direction bias
for a descriptive comparison. Whereas the heading-direction
bias and gaze bias both acted toward the memorized
location of the cued item, Figure 3B shows that the
heading-direction bias lags behind the gaze bias. The largest
significant cluster (Frossard & Renaud, 2021, 2022) for the
gaze bias began at ∼400 msec, whereas the significant time
window for the heading-direction bias started more than a
full second after the cue (p < .05; heading: 1167–1367 msec,
gaze: 400–1244 msec).
DISCUSSION
Our results reveal that, like the eyes, the heading direction
tracks internally directed selective attention inside visual
working memory. This manifests in directionally biased
head movements toward the memorized locations of
attended memory items. Although the heading-direction
bias is small (Figures 1 and A1), we were able to capture
it by calculating the relative change in heading direction
triggered by the cue and by aggregating the data from multiple
experiments. The heading-direction bias in our task
was predominantly driven by the head's rotation around
its yaw axis and was accompanied by a gaze bias in the same
direction. The observed heading-direction bias suggests that
there is a general bodily orienting response during internal
selective attention, implying that brain structures involved
in orienting the eyes and head are also engaged when
orienting within the internal space of working memory.
The heading-direction and gaze biases may reflect bodily
signatures that are part of a widespread orienting response
activating brain areas that are involved in both overt and
covert attentional selection. Indeed, there is good evidence
that the brain’s oculomotor system is also involved in covert
orienting of spatial attention (Yuval-Greenberg et al., 2014;
Hafed et al., 2011; Moore & Fallah, 2004; Engbert & Kliegl,
2003; Moore & Armstrong, 2003; Hafed & Clark, 2002;
Nobre et al., 1997; Deubel & Schneider, 1996). Moreover,
from an evolutionary perspective, it is conceivable that our
ability to orient internally evolved gradually from external
orienting behaviors of the head and eyes—maybe relying
on overlapping neural circuitry (Cisek, 2019). From this
perspective, the observed subtle bias in head and eye
movements may reflect an inevitable "spill over" from
activating neural circuitry that has evolved to orient both
internally and externally (Strauss et al., 2020).
It is perhaps surprising to find this heading-direction
bias even when attention is directed internally and without
any items in the environment toward which to orient.
However, in natural settings, there may be a behavioral
benefit of orienting the head and eyes toward the loca-
tions of selected memory items. In our task, no subse-
quent behavioral goal benefited from orienting toward
the memorized location of the attended memory item.
However, in daily life, items rarely disappear from their
location in the external environment as they do in our
task. Thus, orienting the eyes and head toward the mem-
orized locations of selected items may serve to guide
future behavior, such as resampling items. In fact, people
often resample items in a naturalistic working memory
task, when it is easy to do so (Draschkow, Kallmayer, &
Nobre, 2021; Ballard, Hayhoe, & Pelz, 1995). For example,
imagine you are with a friend in a café, and they comment
on the barista’s hat. You may attend the barista in mem-
ory, attempting to recall what their hat looked like. At the
same time, your head and eyes may be preparing for you
to shift your gaze and look at the barista’s hat again. In this
way, the small heading-direction and gaze biases toward
selected items in working memory may reflect a natural
tendency to engage in action in relation to selected mem-
oranda (Boettcher, Gresch, Nobre, & van Ede, 2021;
Heuer, Ohl, & Rolfs, 2020; Olivers & Roelfsema, 2020;
van Ede, 2020; van Ede, Chekroud, Stokes & Nobre,
2019), even if there was no incentive for this in our task.
In natural behavior, head and eye movements are intrin-
sically functionally linked (Solman, Foulsham, & Kingstone,
2016; Foulsham, Walker, & Kingstone, 2011; Land, 2009)
and head movements can even compensate for eye
movements when people cannot make saccades (Ceylan,
Henriques, Tweed, & Crawford, 2000; Gilchrist, Brown, &
Findlay, 1997). This coordinated relationship between
head and eye movements motivated us to look at both
the head and eyes when exploring bodily orienting
responses. The heading-direction bias revealed here implies
that the neural circuitry that controls head movements,
at least along the yaw axis, is recruited by, and potentially
overlaps with, circuitry that directs internal selective attention.
In fact, previous research has found overlap between
brain areas thought to process spatial attention and eye
and head movements. For example, the FEFs play a role
in directing attention and controlling eye movements
(Taylor, Nobre, & Rushworth, 2007; Moore & Fallah,
2004; Grosbras & Paus, 2002; Bruce & Goldberg, 1984;
Robinson & Fuchs, 1969). Alongside attentional selection
and eye movements, the FEF also contributes to head
movements. The hemodynamic activity of the FEF
responds to head movement (Petit & Beauchamp, 2003),
and microstimulation to the FEF in primates results in head
movement (Elsley, Nagy, Cushing, & Corneil, 2007; Chen &
Walton, 2005). In addition, modulation of activity in the
superior colliculus—an area shown to process not only
eye (Wurtz & Albano, 1980; Schiller & Stryker, 1972; Wurtz
& Goldberg, 1971) but also head movements (Corneil,
Olivier, & Munoz, 2002; Bizzi, Kalil, & Morasso, 1972)—also
affects the deployment of covert attention (Krauzlis,
Lovejoy, & Zénon, 2013; Lovejoy & Krauzlis, 2009; Müller,
Philiastides, & Newsome, 2005). Our results complement
these findings, with the heading-direction and gaze biases
suggesting overlap between neural circuitry and activity
governing attentional selection inside working memory,
eye movements, and head movements.
However, control of the head and eyes is not entirely
linked, as shown by differences in the neurophysiological
pathways controlling eye and head movements (Oommen
& Stahl, 2005; Bizzi et al., 1972; Horn et al., 2012). This is
demonstrated in the distinct temporal profiles of the
heading-direction and gaze biases presented here, which
highlight the value of looking at multiple components of
what might be a widespread bodily orienting response
involving the head and eyes. It is important to note that
comparisons between the temporal profiles of the head
and gaze biases should be made with caution because of
differences in mass and musculature of the head and eyes
and the signal-to-noise ratio of the two measures.
It is worth noting the apparent asymmetry in the magni-
tude and time course of the heading-direction bias in left
versus right trials and across experiments (as seen in
Figure 1 and Figure A1). On the basis of our previous work
on gaze biases (Draschkow et al., 2022; van Ede et al., 2020,
2021; van Ede, Chekroud, & Nobre, 2019), we decided
a priori to focus on a single measure of “towardness”,
which represents horizontal movement toward the target
item on each trial. This aggregated measure not only
benefits from increased sensitivity but also removes any
drifts in the measure that are unrelated to selective
attention (and that could potentially contribute to the
apparent asymmetry we observed here). In future studies,
it would be interesting to further investigate these potential
asymmetries and how they relate to behavioral perfor-
mance, for example, by increasing trial numbers and intro-
ducing a neutral condition in which no item is cued.
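The towardness measure described above can be sketched in a few lines. The published analysis scripts (available at https://osf.io/24u9m/, and built on R packages such as permuco and BayesFactor) are the authoritative implementation; the Python sketch below, including the function name `towardness` and the synthetic data, is our own illustrative assumption of how the sign-flipping works:

```python
import numpy as np

def towardness(horizontal_trace, item_side):
    """Sign-flip horizontal movement so positive values mean
    'toward the memorized location of the cued item'.

    horizontal_trace : (n_trials, n_timepoints) horizontal head (or gaze)
                       position relative to cue onset; positive = rightward
    item_side        : (n_trials,) +1 for right-item trials, -1 for left
    """
    return horizontal_trace * item_side[:, None]

# Hypothetical data: 4 trials, 5 time points of horizontal head position
rng = np.random.default_rng(0)
trace = rng.normal(0.0, 0.1, size=(4, 5))
side = np.array([1, -1, 1, -1])

tw = towardness(trace, side)
time_course = tw.mean(axis=0)  # towardness time course across trials
```

Because the measure is signed relative to the cued side, averaging it over trials cancels direction-unspecific drifts that affect left- and right-item trials equally, which is the property the paragraph above highlights.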
Finally, by using VR, we were able to measure the
heading-direction bias alongside the gaze bias while partici-
pants’ head, eyes, and body were unconstrained. To date,
the benefits of VR have been appreciated most promi-
nently by researchers studying naturalistic human naviga-
tion, ethology, and long-term memory (Mobbs et al., 2021;
Helbing, Draschkow, & Võ, 2020; Stangl et al., 2020;
Topalovic et al., 2020; Draschkow & Võ, 2017; Li, Aivar,
Kit, Tong, & Hayhoe, 2016). Our present findings further
highlight the benefits of using VR (combined with eye-
and head-tracking) to study bodily orienting behavior
(Draschkow et al., 2021, 2022) related to internal cognitive
processes, as showcased here for internal attentional
focusing in working memory.
862
Journal of Cognitive Neuroscience
Volume 35, Number 5
APPENDIX
Figure A1. The heading-direction and gaze biases for Experiments 1–3. (A) Left: Average heading direction for left (L) and right (R) item trials as a
function of time after cue, using data from Experiment 1. Middle: Towardness of heading direction as a function of time after cue, using data from
Experiment 1. Right: Distribution of the mean towardness between 500 and 1500 msec across participants. (B) Same as (A), using gaze direction
instead of heading direction. (C) Same as (A), using data from Experiment 2. (D) Same as (B), using data from Experiment 2. (E) Same as (A), using
data from Experiment 3. (F) Same as (B), using data from Experiment 3. (A–F) Shading indicates ±1 SEM.
Thom et al.
863
Figure A2. Similar performance
in left- and right-item trials.
(A) Left: Plot comparing the
mean RT between left item
(Item L) and right item (Item R)
trials, for each participant in
Experiment 1. Connected pairs
of points are the means of the
same participant. Error bars
represent a 95% confidence
interval. Right: Same as Left, for
error instead of RT. (B) Same as
(A), using data from Experiment 2.
(C) Same as (A), using data from
Experiment 3. There was no
significant effect of target side
on mean error in any of
the experiments, Experiment 1:
F(1, 23) = 0.01, p = .934;
Experiment 2: F(1, 23) = 0.02,
p = .881; Experiment 3:
F(1, 23) = 2.04, p = .166. For
Experiments 1–2, the follow-up
Bayes t test supported the null
hypothesis, suggesting the
errors are similar between
left- and right-item trials,
Experiment 1: (B01 = 0.22),
Experiment 2: (B01 = 0.22).
Similarly, there was no
significant effect of target side
on mean RT in any of the
experiments, Experiment 1: F(1, 23) =
0.19, p = .671; Experiment 2:
F(1, 23) = 0.23, p = .633;
Experiment 3: F(1, 23) = 0.07,
p = .793. For Experiments 1–3,
the follow-up Bayes t tests
supported the null hypothesis,
suggesting the RTs are similar
between left- and right-item
trials, Experiment 1: (B01 = 0.23),
Experiment 2: (B01 = 0.24),
Experiment 3: (B01 = 0.22).
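The side-effect statistics in Figure A2 come from repeated-measures ANOVAs and follow-up Bayes t tests (the reference list points to the R packages permuco and BayesFactor). As a rough illustration only: with a single within-participant factor of two levels, the repeated-measures F statistic equals the squared paired t statistic, which can be checked on simulated data. The values below are synthetic, not the study's:

```python
import numpy as np
from scipy import stats

# Synthetic per-participant mean RTs (sec) for left- vs. right-item trials,
# for 24 participants (matching the df of F(1, 23) in Figure A2)
rng = np.random.default_rng(1)
rt_left = rng.normal(1.0, 0.10, size=24)
rt_right = rt_left + rng.normal(0.0, 0.05, size=24)  # no true side effect

# With one within-subject factor of two levels, the repeated-measures
# F statistic is the squared paired t statistic: F(1, n - 1) = t**2
t_stat, p_val = stats.ttest_rel(rt_left, rt_right)
f_stat = t_stat ** 2
```

Bayes factors such as the reported B01 values require the dedicated tooling used in the paper and are not reproduced in this sketch.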
Figure A3. Distributions of
measures making up heading
direction. The measures that
make up heading direction
(x-, y-, z-translation, yaw,
pitch, and roll) were z-score
normalized before calculating
their mean between 500 and
1500 msec. These mean values
were averaged across blocks
and trials for each participant
and split by item side.
Boxplots indicate median
and interquartile range. The
figure shows how head rotation
around the yaw axis closely
tracks heading direction. We
separately analyzed yaw, roll,
and x-translation in the main
text and figure.
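The normalization described in the Figure A3 caption (z-scoring each head-movement component across time, then averaging between 500 and 1500 msec after the cue) can be sketched as follows; the function name `window_mean_z` and the placeholder trace are illustrative assumptions, not the authors' code:

```python
import numpy as np

def window_mean_z(signal, times, t0=0.5, t1=1.5):
    """Z-score one head-movement component across the trial, then
    average it in the 500-1500 msec post-cue window (as in Figure A3).

    signal : (n_timepoints,) one component (e.g., yaw) for one trial
    times  : (n_timepoints,) time relative to cue onset, in seconds
    """
    z = (signal - signal.mean()) / signal.std()  # z-score normalization
    window = (times >= t0) & (times <= t1)       # 500-1500 msec window
    return z[window].mean()

# Hypothetical single-trial yaw trace sampled at 100 Hz for 2 sec
times = np.linspace(0.0, 2.0, 201)
yaw = np.sin(times)  # placeholder signal standing in for recorded yaw
trial_value = window_mean_z(yaw, times)
```

Z-scoring each component first puts translations (in meters) and rotations (in degrees) on a common scale, so their windowed means can be compared within one figure.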
Reprint requests should be sent to Jude L. Thom, Department
of Experimental Psychology, University of Oxford, Oxford,
United Kingdom, or via e-mail: jude.thom@linacre.ox.ac.uk;
or Dejan Draschkow, Department of Experimental Psychology,
University of Oxford, Oxford, United Kingdom, Oxford Centre
for Human Brain Activity, Wellcome Centre for Integrative Neu-
roimaging, Department of Psychiatry, University of Oxford,
Oxford, United Kingdom, or via e-mail: dejan.draschkow@psy
.ox.ac.uk.
Data Availability Statement
The data files and analysis scripts are available on-line here:
https://osf.io/24u9m/.
Author Contribution
Jude L. Thom: Formal analysis; Investigation; Visualization;
Writing—Original draft; Writing—Review & editing. Anna
C. Nobre: Funding acquisition; Project administration;
Resources; Supervision; Writing—Original draft; Writing—
Review & editing. Freek van Ede: Funding acquisition;
Investigation; Methodology; Project administration;
Resources; Supervision; Writing—Original draft; Writing—
Review & editing. Dejan Draschkow: Data curation; Formal
analysis; Investigation; Methodology; Project administration;
Resources; Supervision; Writing—Original draft;
Writing—Review & editing.
Funding Information
This research was funded by a Wellcome Trust Senior
Investigator Award (https://dx.doi.org/10.13039/100010269),
grant number: 104571/Z/14/Z, and a James S. McDonnell
Foundation Understanding Human Cognition Collaborative
Award, grant number: 220020448 to A. C. N., an ERC Starting
Grant from the European Research Council
(https://dx.doi.org/10.13039/100010663), grant number:
850636 to F. v. E., and by the NIHR Oxford Health
Biomedical Research Centre. The Wellcome Centre for
Integrative Neuroimaging is supported by core funding
from the Wellcome Trust (https://dx.doi.org/10.13039/100010269),
grant number: 203139/Z/16/Z. The funders
had no role in study design, data collection and analysis,
decision to publish, or preparation of the manuscript.
For the purpose of open access, the author has applied a
CC BY public copyright license to any Author Accepted
Manuscript version arising from this submission.
Diversity in Citation Practices
Retrospective analysis of the citations in every article pub-
lished in this journal from 2010 to 2021 reveals a persistent
pattern of gender imbalance: Although the proportions of
authorship teams (categorized by estimated gender iden-
tification of first author/last author) publishing in the Jour-
nal of Cognitive Neuroscience ( JoCN ) during this period
were M(an)/M = .407, W(oman)/M = .32, M/ W = .115,
and W/ W = .159, the comparable proportions for the arti-
cles that these authorship teams cited were M/M = .549,
W/M = .257, M/ W = .109, and W/ W = .085 (Postle and
Fulvio, JoCN, 34:1, pp. 1–3). Consequently, JoCN encour-
ages all authors to consider gender balance explicitly when
selecting which articles to cite and gives them the oppor-
tunity to report their article’s gender citation balance.
REFERENCES
Ballard, D. H., Hayhoe, M. M., & Pelz, J. B. (1995). Memory
representations in natural tasks. Journal of Cognitive
Neuroscience, 7, 66–80. https://doi.org/10.1162/jocn.1995.7
.1.66, PubMed: 23961754
Bizzi, E., Kalil, R. E., & Morasso, P. (1972). Two modes of active
eye-head coordination in monkeys. Brain Research, 40,
45–48. https://doi.org/10.1016/0006-8993(72)90104-7,
PubMed: 4624490
Boettcher, S. E. P., Gresch, D., Nobre, A. C., & van Ede, F.
(2021). Output planning at the input stage in visual working
memory. Science Advances, 7, eabe8212. https://doi.org/10
.1126/sciadv.abe8212, PubMed: 33762341
Bruce, C. J., & Goldberg, M. E. (1984). Physiology of the frontal
eye fields. Trends in Neurosciences, 7, 436–441. https://doi
.org/10.1016/S0166-2236(84)80149-6
Ceylan, M., Henriques, D. Y. P., Tweed, D. B., & Crawford, J. D.
(2000). Task-dependent constraints in motor control: Pinhole
goggles make the head move like an eye. Journal of
Neuroscience, 20, 2719–2730. https://doi.org/10.1523
/jneurosci.20-07-02719.2000, PubMed: 10729353
Chen, L. L., & Walton, M. M. G. (2005). Head movement evoked
by electrical stimulation in the supplementary eye field of the
rhesus monkey. Journal of Neurophysiology, 94, 4502–4519.
https://doi.org/10.1152/jn.00510.2005, PubMed: 16148273
Cisek, P. (2019). Resynthesizing behavior through phylogenetic
refinement. Attention, Perception, & Psychophysics, 81,
2265–2287. https://doi.org/10.3758/s13414-019-01760-1,
PubMed: 31161495
Corneil, B. D., & Munoz, D. P. (2014). Overt responses during
covert orienting. Neuron, 82, 1230–1243. https://doi.org/10
.1016/j.neuron.2014.05.040, PubMed: 24945769
Corneil, B. D., Munoz, D. P., Chapman, B. B., Admans, T., &
Cushing, S. L. (2007). Neuromuscular consequences of
reflexive covert orienting. Nature Neuroscience, 11, 13–15.
https://doi.org/10.1038/nn2023, PubMed: 18059264
Corneil, B. D., Olivier, E., & Munoz, D. P. (2002). Neck muscle
responses to stimulation of monkey superior colliculus. I.
Topography and manipulation of stimulation parameters.
Journal of Neurophysiology, 88, 1980–1999. https://doi.org
/10.1152/jn.2002.88.4.1980, PubMed: 12364523
Deubel, H., & Schneider, W. X. (1996). Saccade target selection
and object recognition: Evidence for a common attentional
mechanism. Vision Research, 36, 1827–1837. https://doi.org
/10.1016/0042-6989(95)00294-4, PubMed: 8759451
Draschkow, D., Kallmayer, M., & Nobre, A. C. (2021). When
natural behavior engages working memory. Current
Biology, 31, 869–874. https://doi.org/10.1016/j.cub.2020.11
.013
Draschkow, D., Nobre, A. C., & van Ede, F. (2022). Multiple
spatial frames for immersive working memory. Nature
Human Behaviour, 6, 536–544. https://doi.org/10.1038
/s41562-021-01245-y, PubMed: 35058640
Draschkow, D., & Võ, M. L.-H. (2017). Scene grammar shapes
the way we interact with objects, strengthens memories, and
speeds search. Scientific Reports, 7, 16471. https://doi.org/10
.1038/s41598-017-16739-x, PubMed: 29184115
Elsley, J. K., Nagy, B., Cushing, S. L., & Corneil, B. D. (2007).
Widespread presaccadic recruitment of neck muscles by
stimulation of the primate frontal eye fields. Journal of
Neurophysiology, 98, 1333–1354. https://doi.org/10.1152/jn
.00386.2007, PubMed: 17625064
Engbert, R., & Kliegl, R. (2003). Microsaccades uncover the
orientation of covert attention. Vision Research, 43,
1035–1045. https://doi.org/10.1016/s0042-6989(03)00084-1,
PubMed: 12676246
Ferreira, F., Apel, J., & Henderson, J. M. (2008). Taking a new
look at looking at nothing. Trends in Cognitive Sciences,
12, 405–410. https://doi.org/10.1016/j.tics.2008.07.007,
PubMed: 18805041
Foulsham, T., Walker, E., & Kingstone, A. (2011). The where,
what and when of gaze allocation in the lab and the natural
environment. Vision Research, 51, 1920–1931. https://doi.org
/10.1016/j.visres.2011.07.002, PubMed: 21784095
Frossard, J., & Renaud, O. (2021). Package ‘permuco’.
Frossard, J., & Renaud, O. (2022). The cluster depth tests:
Toward point-wise strong control of the family-wise error
rate in massively univariate tests with application to
M/EEG. Neuroimage, 247, 118824. https://doi.org/10.1016/j
.neuroimage.2021.118824, PubMed: 34921993
Gandhi, N. J., & Sparks, D. L. (2007). Dissociation of eye and
head components of gaze shifts by stimulation of the
omnipause neuron region. Journal of Neurophysiology, 98,
360–373. https://doi.org/10.1152/jn.00252.2007, PubMed:
17493925
Gilchrist, I. D., Brown, V., & Findlay, J. M. (1997). Saccades
without eye movements. Nature, 390, 130–131. https://doi
.org/10.1038/36478, PubMed: 9367150
Griffin, I. C., & Nobre, A. C. (2003). Orienting attention to
locations in internal representations. Journal of Cognitive
Neuroscience, 15, 1176–1194. https://doi.org/10.1162
/089892903322598139, PubMed: 14709235
Grosbras, M. H., & Paus, T. (2002). Transcranial magnetic
stimulation of the human frontal eye field: Effects on visual
perception and attention. Journal of Cognitive Neuroscience,
14, 1109–1120. https://doi.org/10.1162/089892902320474553,
PubMed: 12419133
Hafed, Z. M., & Clark, J. J. (2002). Microsaccades as an overt
measure of covert attention shifts. Vision Research, 42,
2533–2545. https://doi.org/10.1016/s0042-6989(02)00263-8,
PubMed: 12445847
Hafed, Z. M., Lovejoy, L. P., & Krauzlis, R. J. (2011). Modulation
of microsaccades in monkey during a covert visual attention
task. Journal of Neuroscience, 31, 15219–15230. https://doi
.org/10.1523/jneurosci.3106-11.2011, PubMed: 22031868
Helbing, J., Draschkow, D., & Võ, M. L. H. (2020). Search
superiority: Goal-directed attentional allocation creates more
reliable incidental identity and location memory than explicit
encoding in naturalistic virtual environments. Cognition, 196,
104147. https://doi.org/10.1016/j.cognition.2019.104147,
PubMed: 32004760
Heuer, A., Ohl, S., & Rolfs, M. (2020). Memory for action: A
functional view of selection in visual working memory. Visual
Cognition, 28, 388–400. https://doi.org/10.1080/13506285
.2020.1764156
van Horn, M. R., Gandhi, N. J., Klier, E. M., Angelaki, D. E.,
Gilchrist, I., Johnston, K., et al. (2012). Eye-head gaze shifts.
In The Oxford handbook of eye movements (Vol. 1,
pp. 304–321). Oxford University Press. https://doi.org/10
.1093/oxfordhb/9780199539789.013.0016
Kass, R. E., & Raftery, A. E. (1995). Bayes factors. Journal of the
American Statistical Association, 90, 773–795. https://doi.org
/10.1080/01621459.1995.10476572
Khan, A. Z., Blohm, G., McPeek, R. M., & Lefèvre, P. (2009).
Differential influence of attention on gaze and head
movements. Journal of Neurophysiology, 101, 198–206.
https://doi.org/10.1152/jn.90815.2008, PubMed: 18987122
Krauzlis, R. J., Lovejoy, L. P., & Zénon, A. (2013). Superior
colliculus and visual spatial attention. Annual Review of
Neuroscience, 36, 165–182. https://doi.org/10.1146/annurev
-neuro-062012-170249, PubMed: 23682659
Land, M. F. (2009). Vision, eye movements, and natural
behavior. Visual Neuroscience, 26, 51–62. https://doi.org/10
.1017/s0952523808080899, PubMed: 19203425
Li, C.-L., Aivar, M. P., Kit, D. M., Tong, M. H., & Hayhoe, M. M.
(2016). Memory and visual search in naturalistic 2D and 3D
environments. Journal of Vision, 16, 9. https://doi.org/10
.1167/16.8.9, PubMed: 27299769
Lovejoy, L. P., & Krauzlis, R. J. (2009). Inactivation of
primate superior colliculus impairs covert selection of
signals for perceptual judgments. Nature Neuroscience,
13, 261–266. https://doi.org/10.1038/nn.2470, PubMed:
20023651
Manohar, S. G., Zokaei, N., Fallon, S. J., Vogels, T. P., & Husain,
M. (2019). Neural mechanisms of attending to items in
working memory. Neuroscience & Biobehavioral Reviews,
101, 1–12. https://doi.org/10.1016/j.neubiorev.2019.03.017,
PubMed: 30922977
Maris, E., & Oostenveld, R. (2007). Nonparametric statistical
testing of EEG- and MEG-data. Journal of Neuroscience
Methods, 164, 177–190. https://doi.org/10.1016/j.jneumeth
.2007.03.024, PubMed: 17517438
Mobbs, D., Wise, T., Suthana, N., Guzmán, N., Kriegeskorte, N.,
& Leibo, J. Z. (2021). Promises and challenges of human
computational ethology. Neuron, 109, 2224–2238. https://
doi.org/10.1016/j.neuron.2021.05.021, PubMed: 34143951
Moore, T., & Armstrong, K. M. (2003). Selective gating of visual
signals by microstimulation of frontal cortex. Nature, 421,
370–373. https://doi.org/10.1038/nature01341, PubMed:
12540901
Moore, T., & Fallah, M. (2004). Microstimulation of the frontal
eye field and its effects on covert spatial attention. Journal of
Neurophysiology, 91, 152–162. https://doi.org/10.1152/jn
.00741.2002, PubMed: 13679398
Morey, R. D., Rouder, J. N., Jamil, T., Urbanek, S., Forner, K.,
& Ly, A. (2021). Package ‘BayesFactor’ [internet].
https://CRAN.R-Project.Org/Package=BayesFactor
Müller, J., Philiastides, M., & Newsome, W. (2005).
Microstimulation of the superior colliculus focuses
attention without moving the eyes. Proceedings of the
National Academy of Sciences, U.S.A., 102, 524–529.
https://doi.org/10.1073/pnas.0408311101, PubMed:
15601760
Murray, A. M., Nobre, A. C., Clark, I. A., Cravo, A. M., & Stokes,
M. G. (2013). Attention restores discrete items to visual
short-term memory. Psychological Science, 24, 550–556.
https://doi.org/10.1177/0956797612457782, PubMed:
23436786
Nobre, A., Sebestyen, G., Gitelman, D., Mesulam, M.,
Frackowiak, R., & Frith, C. (1997). Functional localization
of the system for visuospatial attention using positron
emission tomography. Brain, 120, 515–533. https://doi.org
/10.1093/brain/120.3.515, PubMed: 9126062
Olivers, C. N. L., Peters, J., Houtkamp, R., & Roelfsema, P. R.
(2011). Different states in visual working memory: When it
guides attention and when it does not. Trends in Cognitive
Sciences, 15, 327–334. https://doi.org/10.1016/j.tics.2011.05
.004, PubMed: 21665518
Olivers, C. N. L., & Roelfsema, P. R. (2020). Attention for
action in visual working memory. Cortex, 131, 179–194.
https://doi.org/10.1016/j.cortex.2020.07.011, PubMed:
32892152
Oommen, B. S., & Stahl, J. S. (2005). Amplitudes of head
movements during putative eye-only saccades. Brain
Research, 1065, 68–78. https://doi.org/10.1016/j.brainres
.2005.10.029, PubMed: 16300748
Petit, L., & Beauchamp, M. S. (2003). Neural basis of visually
guided head movements studied with fMRI. Journal of
Neurophysiology, 89, 2516–2527. https://doi.org/10.1152/jn
.00988.2002, PubMed: 12611944
Robinson, D. A., & Fuchs, A. F. (1969). Eye movements
evoked by stimulation of frontal eye fields. Journal of
Neurophysiology, 32, 637–648. https://doi.org/10.1152/jn
.1969.32.5.637, PubMed: 4980022
Rouder, J. N., Speckman, P. L., Sun, D., Morey, R. D., & Iverson, G.
(2009). Bayesian t tests for accepting and rejecting the null
hypothesis. Psychonomic Bulletin & Review, 16, 225–237.
https://doi.org/10.3758/pbr.16.2.225, PubMed: 19293088
Sassenhagen, J., & Draschkow, D. (2019). Cluster-based
permutation tests of MEG/EEG data do not establish
significance of effect latency or location. Psychophysiology,
56, e13335. https://doi.org/10.1111/psyp.13335, PubMed:
30657176
Schiller, P. H., & Stryker, M. (1972). Single-unit recording and
stimulation in superior colliculus of the alert rhesus monkey.
Journal of Neurophysiology, 35, 915–924. https://doi.org/10
.1152/jn.1972.35.6.915, PubMed: 4631839
Solman, G. J. F., Foulsham, T., & Kingstone, A. (2016). Eye and
head movements are complementary in visual selection.
Royal Society Open Science, 4, 160569. https://doi.org/10
.1098/rsos.160569, PubMed: 28280554
Souza, A. S., & Oberauer, K. (2016). In search of the focus of
attention in working memory: 13 years of the retro-cue
effect. Attention, Perception, & Psychophysics, 78,
1839–1860. https://doi.org/10.3758/s13414-016-1108-5,
PubMed: 27098647
Spivey, M. J., & Geng, J. J. (2001). Oculomotor mechanisms
activated by imagery and memory: Eye movements to absent
objects. Psychological Research, 65, 235–241. https://doi.org
/10.1007/s004260100059, PubMed: 11789427
Stangl, M., Topalovic, U., Inman, C. S., Hiller, S., Villaroman, D.,
Aghajan, Z. M., et al. (2020). Boundary-anchored neural
mechanisms of location-encoding for self and others. Nature,
589, 420–425. https://doi.org/10.1038/s41586-020-03073-y,
PubMed: 33361808
Strauss, D. J., Corona-Strauss, F. I., Schroeer, A., Flotho, P.,
Hannemann, R., & Hackley, S. A. (2020). Vestigial
auriculomotor activity indicates the direction of auditory
attention in humans. eLife, 9, e54536. https://doi.org/10.7554
/elife.54536, PubMed: 32618268
Taylor, P. C., Nobre, A. C., & Rushworth, M. F. (2007).
FEF TMS affects visual cortical activity. Cerebral Cortex,
17, 391–399. https://doi.org/10.1093/cercor/bhj156, PubMed:
16525126
Topalovic, U., Aghajan, Z. M., Villaroman, D., Hiller, S.,
Christov-Moore, L., Wishard, T. J., et al. (2020). Wireless
programmable recording and stimulation of deep brain
activity in freely moving humans. Neuron, 108, 322–334.e9.
https://doi.org/10.1016/j.neuron.2020.08.021, PubMed:
32946744
van Ede, F. (2020). Visual working memory and action:
Functional links and bi-directional influences. Visual
Cognition, 28, 401–413. https://doi.org/10.1080/13506285
.2020.1759744, PubMed: 33223921
van Ede, F., Board, A. G., & Nobre, A. C. (2020). Goal-directed
and stimulus-driven selection of internal representations.
Proceedings of the National Academy of Sciences, U.S.A.,
117, 24590–24598. https://doi.org/10.1073/pnas.2013432117,
PubMed: 32929036
van Ede, F., Chekroud, S. R., & Nobre, A. C. (2019). Human
gaze tracks attentional focusing in memorized visual space.
Nature Human Behaviour, 3, 462–470. https://doi.org/10
.1038/s41562-019-0549-y, PubMed: 31089296
van Ede, F., Chekroud, S. R., Stokes, M. G., & Nobre, A. C.
(2019). Concurrent visual and motor selection during visual
working memory guided action. Nature Neuroscience, 22,
477–483. https://doi.org/10.1038/s41593-018-0335-6, PubMed:
30718904
van Ede, F., Deden, J., & Nobre, A. C. (2021). Looking ahead in
working memory to guide sequential behaviour. Current
Biology, 31, R779–R780. https://doi.org/10.1016/j.cub.2021.04
.063, PubMed: 34157258
van Ede, F., & Nobre, A. C. (2021). Toward a neurobiology
of internal selective attention. Trends in Neurosciences,
44, 513–515. https://doi.org/10.1016/j.tins.2021.04.010,
PubMed: 33992457
Wurtz, R., & Albano, J. (1980). Visual-motor function of the
primate superior colliculus. Annual Review of Neuroscience,
3, 189–226. https://doi.org/10.1146/annurev.ne.03.030180
.001201, PubMed: 6774653
Wurtz, R. H., & Goldberg, M. E. (1971). Superior colliculus cell
responses related to eye movements in awake monkeys.
Science, 171, 82–84. https://doi.org/10.1126/science.171.3966
.82, PubMed: 4992313
Yuval-Greenberg, S., Merriam, E. P., & Heeger, D. J. (2014).
Spontaneous microsaccades reflect shifts in covert attention.
Journal of Neuroscience, 34, 13693–13700. https://doi.org/10
.1523/jneurosci.0582-14.2014, PubMed: 25297096