object files
Recently Published Documents


TOTAL DOCUMENTS: 50 (five years: 6)
H-INDEX: 12 (five years: 1)

Author(s): Sneha Sudhakaran, Aisha Ali-Gombe, Augustine Orgah, Andrew Case, Golden G. Richard

2020, Vol 4 (OOPSLA), pp. 1-28
Author(s): Yuting Wang, Xiangzhe Xu, Pierre Wilke, Zhong Shao

Author(s): Michael Murez, Joulia Smortchkova, Brent Strickland

The chapter outlines and evaluates the most ambitious version of the mental files theory of singular thought, according to which mental files are a wide-ranging psychological natural kind, including psychologists’ object-files as a representative subspecies, and underlying all and only singular thinking. It argues that such a theory is unsupported by the available psychological data, and that its defenders may have overestimated the similarities between different notions of “file” used in philosophy and cognitive science. Nevertheless, critical examination of the theory from a psychological perspective opens up promising avenues for research, especially concerning the relationship between our perceptual capacity to individuate and track basic individuals and our higher-level capacities for singular thought.


Analysis, 2019, Vol 80 (2), pp. 293-301
Author(s): Ian Phillips

Abstract: A wealth of cases – most notably blindsight and priming under inattention or suppression – have convinced philosophers and scientists alike that perception occurs outside awareness. In recent work (Phillips 2016a, 2018; Phillips and Block 2017; Peters et al. 2017), I dispute this consensus, arguing that any putative case of unconscious perception faces a dilemma. The dilemma divides over how the absence of awareness is established. If subjective reports are used, we face the problem of the criterion: the concern that such reports underestimate conscious experience (Eriksen 1960; Holender 1986; Peters and Lau 2015). If objective measures are used, we face the problem of attribution: the concern that the case does not involve genuine individual-level perception. Quilty-Dunn (2019) presents an apparently compelling example of unconscious perception due to Mitroff et al. (2005) which, he contends, evades this dilemma. The case is fascinating. However, as I argue here, it does not escape the dilemma’s clutches.


2018, Vol 85 (2), pp. 177-200
Author(s): E. J. Green

Author(s): E. J. Green, Jake Quilty-Dunn

Abstract: The notion of an object file figures prominently in recent work in philosophy and cognitive science. Object files play a role in theories of singular reference, object individuation, perceptual memory, and the development of cognitive capacities. However, the philosophical literature lacks a detailed, empirically informed theory of object files. In this article, we articulate and defend the multiple-slots view, which specifies both the format and architecture of object files. We argue that object files represent in a non-iconic, propositional format that incorporates discrete symbols for separate features. Moreover, we argue that features of separate categories (such as colour, shape, and orientation) are stored in separate memory slots within an object file. We supplement this view with a computational framework that characterizes how information about objects is stored and retrieved.

Outline:
1 Introduction
2 Empirical Support for Object Files
  2.1 Object reviewing and multiple-object tracking
  2.2 Visual short-term memory
3 The Format of Object Files
  3.1 Iconic format
  3.2 Object files and iconic format
  3.3 Object files and propositional format
4 The Architecture of Object Files: A Multiple-Slots Model
  4.1 Independent memory stores
  4.2 Within-category versus across-category conjunctions in visual short-term memory
5 Multiple Slots and Indirect Addressing
6 Conclusion
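The multiple-slots architecture described in this abstract can be pictured as a simple data structure. The sketch below is an illustrative toy only, not drawn from Green and Quilty-Dunn's paper: the class name ObjectFile, its fields, and the three feature categories are assumptions made for illustration. It encodes the two claims summarized above: each feature category gets its own symbol-valued slot, and retrieval proceeds indirectly via an object identifier and then a category slot.

# Illustrative toy sketch of the multiple-slots picture (assumed names, not the authors' code).
from dataclasses import dataclass, field
from typing import Dict, Optional

@dataclass
class ObjectFile:
    """A discrete record for one tracked object, with one slot per feature category."""
    object_id: int
    # Each slot holds a discrete symbol (a string here), not an iconic/pictorial value.
    slots: Dict[str, Optional[str]] = field(
        default_factory=lambda: {"colour": None, "shape": None, "orientation": None}
    )

    def store(self, category: str, symbol: str) -> None:
        """Write a feature symbol into its category-specific slot."""
        if category not in self.slots:
            raise KeyError(f"unknown feature category: {category}")
        self.slots[category] = symbol

    def retrieve(self, category: str) -> Optional[str]:
        """Indirect addressing: locate the object file, then read the category slot."""
        return self.slots[category]

# Minimal usage: two tracked objects, features stored category by category.
files = {oid: ObjectFile(oid) for oid in (1, 2)}
files[1].store("colour", "red")
files[1].store("shape", "triangle")
files[2].store("colour", "green")
print(files[1].retrieve("colour"))  # -> red
print(files[2].retrieve("shape"))   # -> None (slot not yet filled)

On this toy picture, a conjunction across categories (red and triangle) lives in one file but in two independent slots, which mirrors the within-category versus across-category distinction mentioned in the outline above.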


2017, Vol 29 (6), pp. 105-116
Author(s): A.A. Mikhailov, A.E. Hmelnov

2016, Vol 79 (1), pp. 138-153
Author(s): Martijn J. Schut, Jasper H. Fabius, Nathan Van der Stoep, Stefan Van der Stigchel
