The Causal Role of Macroscopic Objects

Author(s):  
Alyssa Ney

This chapter considers and critiques some strategies for solving the macro-object problem for wave function realism. This is the problem of how a wave function understood as a field on a high-dimensional space may come to make up or constitute the low-dimensional, macroscopic objects of our experience. It is first noted that simply invoking correspondences between particle configurations and states of the wave function will not suffice to solve the macro-object problem, following issues noted previously by Maudlin and Monton. More sophisticated strategies are considered that appeal to functionalism. It is argued that these functionalist strategies for recovering low-dimensional macroscopic objects from the wave function also do not succeed.

Author(s):  
Alyssa Ney

This chapter proposes a solution to the macro-object problem for wave function realism. This is the problem of how a wave function in a high-dimensional space may come to constitute the low-dimensional, macroscopic objects of our experience. The solution proceeds in several stages. First, it is argued that the wave function’s invariance under certain transformations may give us reason to treat the three-dimensional configurations corresponding to those symmetries with ontological seriousness. Second, it is shown how the wave function may decompose into low-dimensional microscopic parts. Interestingly, this reveals mereological relationships in which parts and wholes inhabit distinct spatial frameworks. Third, it is shown how these parts may come to compose macroscopic objects.


Author(s):  
Alyssa Ney

This chapter considers and responds to the objection that a wave function in a high-dimensional space cannot ultimately constitute the low-dimensional macroscopic objects of experience. It discusses two forms this objection takes: one based on the putative fact that our evidence for quantum theories consists of low-dimensional objects, and another based on the putative fact that quantum theories are about low-dimensional objects, that they have primitive ontologies of local beables. Even admitting that there may be something straightforward and comprehensible about the fundamental ontologies for quantum theories proposed by the wave function realist, the philosophers who raise these objections see a problem with these ontologies in that they cannot serve as the constitutive foundation for the world as we experience it. And this undermines the promise of wave function realism to serve as a framework for the interpretation of quantum theories.


2020 ◽  
pp. 286-300
Author(s):  
Zeyu Sun ◽  
Xiaohui Ji

The processing of high-dimensional data is an active research area in data mining. Because high-dimensional data are sparse, there are significant differences between high-dimensional and low-dimensional spaces, especially in how the data must be processed. Many sophisticated algorithms designed for low-dimensional spaces fail to achieve the expected results in high-dimensional space, or cannot be applied there at all. This paper therefore proposes a High-dimensional Data Aggregation Control Algorithm for Big Data (HDAC). The algorithm uses an information measure to eliminate dimensions that do not meet the specified requirements, then applies principal component analysis to the remaining dimensions, so that the computational cost of dimensionality reduction is kept as low as possible. During data aggregation, a self-adaptive aggregation mechanism is used to reduce network delay. Finally, simulations show that the algorithm improves node energy consumption, the data return rate, and data delay.
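A minimal sketch of the two dimension-handling steps described above, filtering out dimensions that fail a specified requirement and then applying principal component analysis to the rest, is given below. The variance threshold used as the filtering requirement, the function names, and the synthetic data are illustrative assumptions, not the published HDAC implementation; the aggregation and networking components are omitted.

```python
# Illustrative sketch only: the filtering criterion (a variance threshold)
# and all names here are assumptions, not the published HDAC algorithm.
import numpy as np

def filter_dimensions(X, min_variance=1e-3):
    """Drop dimensions whose variance falls below the specified requirement."""
    variances = X.var(axis=0)
    keep = variances >= min_variance
    return X[:, keep], keep

def reduce_with_pca(X, n_components=2):
    """Apply principal component analysis to the remaining dimensions."""
    X_centered = X - X.mean(axis=0)
    # Eigen-decomposition of the covariance matrix of the centered data.
    cov = np.cov(X_centered, rowvar=False)
    eigvals, eigvecs = np.linalg.eigh(cov)
    # Keep the components with the largest eigenvalues.
    order = np.argsort(eigvals)[::-1][:n_components]
    return X_centered @ eigvecs[:, order]

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 50))   # synthetic high-dimensional sample
    X[:, 10:] *= 1e-4                # most dimensions carry little signal
    X_filtered, kept = filter_dimensions(X)
    X_low = reduce_with_pca(X_filtered, n_components=2)
    print(X.shape, "->", X_filtered.shape, "->", X_low.shape)
```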


Author(s):  
Alyssa Ney

What are the ontological implications of quantum theories, that is, what do they tell us about the fundamental objects that make up our world? How should quantum theories make us reevaluate our classical conceptions of the basic constitution of material objects and ourselves? Is there fundamental quantum nonlocality? This book articulates several rival approaches to answering these questions, ultimately defending the wave function realist approach. Wave function realism is a way of interpreting quantum theories so that the central object they describe is the quantum wave function, interpreted as a field in an extremely high-dimensional space. According to this approach, the nonseparability and nonlocality we seem to find in quantum mechanics are ultimately manifestations of a more intuitive, separable, and local picture in higher dimensions.


2021 ◽  
pp. 1-12
Author(s):  
Jian Zheng ◽  
Jianfeng Wang ◽  
Yanping Chen ◽  
Shuping Chen ◽  
Jingjin Chen ◽  
...  

Neural networks can approximate data because they contain many compact nonlinear layers. In high-dimensional space, the curse of dimensionality makes the data distribution sparse, so the data cannot provide sufficient information, and approximating data there with neural networks becomes even harder. To address this issue, two deviations are derived from the Lipschitz condition: the deviation of neural networks trained using high-dimensional functions, and the deviation of high-dimensional functions approximating the data. The purpose is to improve the ability of neural networks to approximate data in high-dimensional space. Experimental results show that neural networks trained using high-dimensional functions outperform those trained directly on data when approximating data in high-dimensional space. We find that networks trained using high-dimensional functions are better suited to high-dimensional space than those trained on data, so there is no need to retain large amounts of data for training. Our findings suggest that in high-dimensional space, tuning the hidden layers of a neural network has little positive effect on the precision with which it approximates data.
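The following sketch illustrates the kind of comparison the abstract describes: one network trained on points sampled from a known (Lipschitz-continuous) high-dimensional function and one trained on a small, noisy data sample, both evaluated on fresh test points. The target function, network sizes, and sample sizes are assumptions chosen for illustration, not the paper's experimental setup.

```python
# Illustrative sketch only: the target function, network size, and comparison
# protocol are assumptions for demonstration, not the paper's experiments.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
dim = 100  # high-dimensional input space

def target_function(X):
    """A smooth (Lipschitz) high-dimensional function used as the target."""
    return np.sin(X.sum(axis=1) / np.sqrt(X.shape[1]))

# (a) Train on abundant points sampled from the function itself.
X_fn = rng.uniform(-1, 1, size=(5000, dim))
net_fn = MLPRegressor(hidden_layer_sizes=(64, 64), max_iter=500, random_state=0)
net_fn.fit(X_fn, target_function(X_fn))

# (b) Train on a small, noisy data sample, as is typical in sparse high dimensions.
X_data = rng.uniform(-1, 1, size=(200, dim))
y_data = target_function(X_data) + rng.normal(scale=0.1, size=200)
net_data = MLPRegressor(hidden_layer_sizes=(64, 64), max_iter=500, random_state=0)
net_data.fit(X_data, y_data)

# Compare approximation error on fresh test points.
X_test = rng.uniform(-1, 1, size=(1000, dim))
y_test = target_function(X_test)
for name, net in [("function-trained", net_fn), ("data-trained", net_data)]:
    err = np.mean((net.predict(X_test) - y_test) ** 2)
    print(f"{name}: test MSE = {err:.4f}")
```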


2001 ◽  
Vol 24 (3) ◽  
pp. 305-320 ◽  
Author(s):  
Benoit Lemaire ◽  
Philippe Dessus

This paper presents Apex, a system that can automatically assess a student essay based on its content. It relies on Latent Semantic Analysis, a tool used to represent the meaning of words as vectors in a high-dimensional space. By comparing an essay and the text of a given course on a semantic basis, our system can measure how well the essay matches the text. Various assessments are presented to the student regarding the topic, the outline, and the coherence of the essay. Our experiments yield promising results.
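A minimal sketch of the LSA-based comparison described above, projecting a course text and an essay into a low-dimensional latent semantic space and scoring them by cosine similarity, is shown below. The toy corpus, number of components, and scoring rule are assumptions; they are not Apex's actual pipeline or parameters.

```python
# Illustrative sketch only: corpus, component count, and scoring rule are
# assumptions; Apex's actual pipeline and parameters are not reproduced here.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.decomposition import TruncatedSVD
from sklearn.metrics.pairwise import cosine_similarity

course_sections = [
    "Latent semantic analysis represents word meaning as vectors.",
    "Dimensionality reduction projects documents into a semantic space.",
    "Cosine similarity measures how close two documents are in that space.",
]
student_essay = "The essay discusses how documents are compared using semantic vectors."

# Build a term-document matrix over the course text plus the essay.
vectorizer = TfidfVectorizer()
tfidf = vectorizer.fit_transform(course_sections + [student_essay])

# Project into a low-dimensional latent semantic space (LSA = truncated SVD).
svd = TruncatedSVD(n_components=2, random_state=0)
vectors = svd.fit_transform(tfidf)

# Score the essay against each course section by cosine similarity.
essay_vec = vectors[-1:]
scores = cosine_similarity(essay_vec, vectors[:-1])[0]
for section, score in zip(course_sections, scores):
    print(f"{score:.2f}  {section}")
```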

