Archive of all online content
Volume 20 Issue 3, pp. 158-217 (19 August 2024)
Volume 20 Issue 2, pp. 80-157 (24 June 2024)
Volume 20 Issue 1, pp. 1-19 (1 March 2024)
Volume 19 Issue 4, pp. 1-105 (27 December 2023)
Volume 19 Issue 3, pp. 211-333 (25 July 2023)
Volume 19 Issue 2, pp. 111-200 (30 June 2023)
Volume 19 Issue 1, pp. 1-110 (31 March 2023)
Volume 18 Issue 4, pp. 243-303 (31 December 2022)
Volume 18 Issue 3, pp. 165-202 (30 September 2022)
Volume 18 Issue 2, pp. 85-164 (30 June 2022)
Volume 18 Issue 1, pp. 1-84 (31 March 2022)
Volume 17 Issue 4, pp. 250-291 (31 December 2021)
Volume 17 Issue 3, pp. 193-249 (30 September 2021)
Volume 17 Issue 2, pp. 99-192 (30 June 2021)
Volume 17 Issue 1, pp. 1-98 (31 March 2021)
Volume 16 Issue 4, pp. 291-369 (31 December 2020)
Volume 16 Issue 3, pp. 176-290 (30 September 2020)
Volume 16 Issue 2, pp. 85-175 (30 June 2020)
Volume 16 Issue 1, pp. 1-84 (31 March 2020)
Volume 15 Issue 4, pp. 236-317 (31 December 2019)
Volume 15 Issue 3, pp. 169-235 (30 September 2019)
Volume 15 Issue 2, pp. 75-168 (30 June 2019)
Volume 15 Issue 1, pp. 1-74 (31 March 2019)
Volume 14 Issue 4, pp. 150-208 (31 December 2018)
Volume 14 Issue 3, pp. 62-150 (30 September 2018)
Volume 14 Issue 2, pp. 38-61 (30 June 2018)
Volume 14 Issue 1, pp. 1-37 (31 March 2018)
Volume 13 Issue 4, pp. 267-322 (31 December 2017)
Volume 13 Issue 3, pp. 190-266 (30 September 2017)
Volume 13 Issue 2, pp. 121-189 (30 June 2017)
Volume 13 Issue 1, pp. 1-120 (31 March 2017)
Volume 12 Issue 4 (special issue), pp. 150-235 (31 December 2016)
Volume 12 Issue 3, pp. 130-149 (30 September 2016)
Volume 12 Issue 2, pp. 67-129 (30 June 2016)
Volume 12 Issue 1, pp. 1-66 (31 March 2016)
Volume 11 Issue 4, pp. 118-135 (31 December 2015)
Volume 11 Issue 3, pp. 64-117 (30 September 2015)
Volume 11 Issue 2, pp. 31-63 (30 June 2015)
Volume 11 Issue 1, pp. 1-30 (31 March 2015)
Volume 10 Issue 4, pp. 119-155 (31 December 2014)
Volume 10 Issue 3, pp. 81-118 (30 September 2014)
Volume 10 Issue 2, pp. 32-80 (30 June 2014)
Volume 10 Issue 1, pp. 1-31 (27 February 2014)
Volume 9 Issue 4, pp. 156-223 (31 December 2013)
Volume 9 Issue 3, pp. 112-155 (24 October 2013)
Volume 9 Issue 2, pp. 53-111 (30 June 2013)
Volume 9 Issue 1, pp. 1-52 (31 March 2013)
Volume 8 Issue 4, pp. 267-295 (31 December 2012)
Volume 8 Issue 3, pp. 210-266 (27 September 2012)
Volume 8 Issue 2, pp. 70-209 (28 June 2012)
Volume 8 Issue 1, pp. 1-69 (29 March 2012)
Volume 7 Issue 2, pp. 55-156 (31 December 2011)
Volume 7 Issue 1, pp. 1-54 (31 March 2011)
Volume 6 Issue 6, pp. 1-141 (31 December 2010)
Volume 5 Issue 5, pp. 1-134 (31 December 2009)
Volume 4 Issue 1, pp. 1-14 (31 March 2008)
Volume 3 Issue 4, pp. 419-465 (31 December 2007)
Volume 3 Issue 3, pp. 363-417 (30 September 2007)
Volume 3 Issue 1, pp. 1-361 (31 March 2007)
Volume 2 Issue 4, pp. 239-276 (31 December 2006)
Volume 2 Issue 2, pp. 99-237 (30 June 2006)
Volume 2 Issue 1, pp. 1-97 (31 March 2006)
Volume 1 Issue 1, pp. 1-16 ()
Volume 8 Issue 3 (2012)
The direction of masked auditory category priming correlates with participants' prime discrimination ability
Christina Bermeitinger, Dirk Wentura, Christopher Koppermann, Micha Hauser, Benjamin Grass, Christian Frings
Christina Bermeitinger, University of Hildesheim, Department of Psychology, Marienburger Platz 22, D - 31141 Hildesheim, Germany.
E-mail: bermeitinger@uni-hildesheim.de
Semantic priming refers to the phenomenon that participants typically respond faster to targets following semantically related primes than to targets following semantically unrelated primes. In contrast, Wentura and Frings (2005) found a negatively signed priming effect (i.e., faster responses to targets following semantically unrelated primes than to targets following related primes) when they used (a) a special masking technique for the primes and (b) categorically related prime-target pairs (e.g., fruit-apple). The negatively signed priming effect was most pronounced for participants with random prime discrimination performance, whereas participants with high prime discrimination performance showed a positive effect. In the present study we analyzed the after-effects of masked category primes in audition. A pattern of results emerged that was comparable to that in the visual modality: the poorer the individual prime discrimination, the more negative the semantic priming effect. This result is interpreted as evidence for a common mechanism causing the semantic priming effect in both vision and audition, rather than a perceptual mechanism operating only in the visual domain.
Keywords: semantic priming, masked priming, auditory priming, semantic memory, negative semantic priming effect, category priming, auditory primes and targets
The processing of inter-item relations as a moderating factor of retrieval-induced forgetting
Tobias Tempel, Werner Wippich
Tobias Tempel, Department of Psychology, University of Trier, 54286 Trier, Germany.
E-mail: tempel@uni-trier.de
We investigated the influences of item generation and emotional valence on retrieval-induced forgetting. Drawing on postulates of the three-factor theory of generation effects, we applied generation tasks that differentially affect the processing of inter-item relations. Retrieval-induced forgetting of freely generated items was moderated by emotional valence, as was retrieval-induced forgetting of read items, albeit in the reverse direction (Experiment 1); fragment completion, by contrast, eliminated the moderation of retrieval-induced forgetting by emotional valence (Experiment 2). The results corroborate the assumption that the processing of inter-item relations is crucial for immunization against retrieval-induced forgetting. Moreover, differential processing of inter-item relations may clarify the mixed results that have been reported on moderating factors of retrieval-induced forgetting.
Keywords: retrieval-induced forgetting, generation effect, episodic memory, recall, inhibition
Memory for facial expression is influenced by the background music playing during study
Michael R. Woloszyn, Laura Ewert
Michael Woloszyn, Department of Psychology, Thompson Rivers University, 900 McGill Road, Kamloops, V2C 0C8, BC, Canada.
E-mail: mwoloszyn@tru.ca
The effect of the emotional quality of study-phase background music on subsequent recall for happy and sad facial expressions was investigated. Undergraduates (N = 48) viewed a series of line drawings depicting a happy or sad child in a variety of environments that were each accompanied by happy or sad music. Although memory for faces was very accurate, emotionally incongruent background music biased subsequent memory for facial expressions, increasing the likelihood that happy faces were recalled as sad when sad music was previously heard, and that sad faces were recalled as happy when happy music was previously heard. Overall, the results indicated that when recalling a scene, the emotional tone is set by an integration of stimulus features from several modalities.
Keywords: music, emotion, memory, facial expression
The very same thing: Extending the object token concept to incorporate causal constraints on individual identity
Chris Fields
Chris Fields, 814 E. Palace Ave. #14, Santa Fe, NM, 87501 USA.
E-mail: fieldsres@gmail.com
The contributions of feature recognition, object categorization, and recollection of episodic memories to the re-identification of a perceived object as the very same thing encountered in a previous perceptual episode are well understood in terms of both cognitive-behavioral phenomenology and neurofunctional implementation. Human beings do not, however, rely solely on features and context to re-identify individuals; in the presence of featural change and similarly-featured distractors, people routinely employ causal constraints to establish object identities. Based on available cognitive and neurofunctional data, the standard object-token-based model of individual re-identification is extended to incorporate the construction of unobserved and hence fictive causal histories (FCHs) of observed objects by the pre-motor action planning system. It is suggested that functional deficits in the construction of FCHs are associated with clinical outcomes in both autism spectrum disorders and later-stage Alzheimer's disease.
Keywords: emotional memory enhancement, explicit/implicit retrieval, intentional/incidental encoding
Can you eat it? A link between categorization difficulty and food likability
Yuki Yamada, Takahiro Kawabe, Keiko Ihaya
Yuki Yamada, The Research Institute for Time Studies, Yamaguchi University, 1677-1 Yoshida, Yamaguchi, 753-8512, Japan.
E-mail: yamadayuk@gmail.com
In the present study we examined whether categorization difficulty regarding a food is related to its likability. For this purpose, we produced stimulus images by morphing photographs of a tomato and a strawberry. Subjects categorized these images as either a tomato or a strawberry and, in separate sessions, evaluated each food's eatability or their willingness to eat it (Experiments 1 and 2) and the likeliness of its existence (Experiment 2). The lowest score for categorization confidence coincided with the lowest scores for eatability, willingness to eat, and likeliness of existence. In Experiment 3, we found that food neophobia, a trait involving the avoidance of ingesting novel foods, modulated food likability but not categorization confidence. These findings suggest that high categorization difficulty generally co-occurs with a decrease in food likability and that food neophobia modulates likability. This avoidance of difficult-to-categorize foods seems ecologically valid because before eating we have little information regarding whether a food is potentially harmful.
Keywords: categorization, food neophobia, appetite, emotion
Emotional enhancement of immediate memory: Positive pictorial stimuli are better recognized than neutral or negative pictorial stimuli
Hanna Chainay, George A. Michael, Mélissa Vert-pré, Lionel Landré, Amandine Plasson
Hanna Chainay, Laboratoire d'Étude des Mécanismes Cognitifs, Université Lumière Lyon 2, 5 avenue Pierre Mendes France, 69676 Bron Cedex, France.
E-mail: hanna.chainay@univ-lyon2.fr
We examined emotional memory enhancement (EEM) for negative and positive pictures while manipulating encoding and retrieval conditions. Two groups of 40 participants took part in this study. Both groups performed immediate implicit (categorization task) and explicit (recognition task) retrieval, but for one group the tasks were preceded by incidental encoding and for the other group by intentional encoding. As indicated by the sensitivity index (d'), after incidental encoding positive stimuli were easier to recognize than negative and neutral stimuli. Participants' response criterion was more liberal for negative stimuli than for both positive and neutral ones, independent of encoding condition. In the implicit retrieval task, participants were slower in categorizing positive than negative and neutral stimuli. However, the priming effect was larger for emotional than for neutral stimuli. These results are discussed in the context of the idea that the effect of emotion on immediate memory enhancement may depend on the intentionality to encode and retrieve information.
Keywords: emotional memory enhancement, explicit/implicit retrieval, intentional/incidental encoding