On the missed opportunities of classical physics

2021 · Vol 34 (4) · pp. 475-479
Author(s):  
Toir Makhsudovich Radzhabov

This study considers one way of realizing Dirac’s ideas regarding the limited number of Faraday force lines and the allowance for the finite size of microparticles in physical theory. It is shown that, within the classical approach, taking the limited number of Faraday force lines into account opens additional possibilities for describing and characterizing the physical field and associated phenomena. In particular, it becomes possible to obtain, in a straightforward manner, an expression for the discrete radiation of an atom that agrees well with the empirical Balmer relation. An assumption is made about the possible material existence of Faraday force lines as structural elements of the physical field, and it is suggested that the natural fields of physical bodies can be regarded as a set of materially existing lines of force, i.e., as a luminiferous ether.
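For reference, the empirical Balmer relation for the visible hydrogen series, with which the derived expression is said to agree, reads

\[ \frac{1}{\lambda} = R_H\left(\frac{1}{2^2} - \frac{1}{n^2}\right), \qquad n = 3, 4, 5, \ldots, \]

where \(\lambda\) is the wavelength of the emitted line and \(R_H \approx 1.097 \times 10^{7}\,\mathrm{m^{-1}}\) is the Rydberg constant.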


Author(s):  
Olivier Darrigol

This article examines the gradual development of James Clerk Maxwell’s electromagnetic theory, arguing that he aimed at general structures through his models, illustrations, formal analogies, and scientific metaphors. It also considers a few texts in which Maxwell expounds his conception of physical theories and their relation to mathematics. Following a discussion of Maxwell’s extension of an analogy invented by William Thomson in 1842, the article analyzes Maxwell’s geometrical expression of Michael Faraday’s notion of lines of force. It then revisits the honeycomb model that Maxwell used to obtain his system of equations and the concomitant unification of electricity, magnetism, and optics. It also explores Maxwell’s view of the Lagrangian form of the fundamental equations of a physical theory. It shows that Maxwell was guided by general structural requirements inspired by partial and temporary models; these requirements were systematically detailed in his 1873 Treatise on Electricity and Magnetism.
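For reference, the unification mentioned above can be stated in the modern vacuum form of the field equations (Heaviside’s vector notation, not Maxwell’s original presentation):

\[ \nabla\cdot\mathbf{E} = \frac{\rho}{\varepsilon_0}, \qquad \nabla\cdot\mathbf{B} = 0, \qquad \nabla\times\mathbf{E} = -\frac{\partial\mathbf{B}}{\partial t}, \qquad \nabla\times\mathbf{B} = \mu_0\mathbf{J} + \mu_0\varepsilon_0\frac{\partial\mathbf{E}}{\partial t}, \]

whose source-free solutions are waves travelling at \(c = 1/\sqrt{\mu_0\varepsilon_0}\), the observed speed of light; this is the formal core of the unification of electricity, magnetism, and optics.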


Author(s):  
Jeremy Butterfield

Over the centuries, the doctrine of determinism has been understood, and assessed, in different ways. Since the seventeenth century, it has been commonly understood as the doctrine that every event has a cause; or as the predictability, in principle, of the entire future. To assess the truth of determinism, so understood, philosophers have often looked to physical science; they have assumed that their current best physical theory is their best guide to the truth of determinism. Most have believed that classical physics, especially Newton’s physics, is deterministic. And in this century, most have believed that quantum theory is indeterministic. Since quantum theory has superseded classical physics, philosophers have typically come to the tentative conclusion that determinism is false. In fact, these impressions are badly misleading, on three counts.

First of all, formulations of determinism in terms of causation or predictability are unsatisfactory, since ‘event’, ‘causation’ and ‘prediction’ are vague and controversial notions, and are not used (at least not univocally) in most physical theories. So if we propose to assess determinism by considering physical theories, our formulation of determinism should be more closely tied to such theories. To do this, the key idea is that determinism is a property of a theory. Imagine a theory that ascribes properties to objects of a certain kind, and claims that the sequence through time of any such object’s properties satisfies certain regularities. Then we say that the theory is deterministic if and only if for any two such objects: if their properties match exactly at a given time, then according to the theory, they will match exactly at all future times.

Second, this improved formulation reveals that there is a large gap between the determinism of a given physical theory, and the bolder, vague idea that motivated the traditional formulations: the idea that the world as a whole, independent of any single theory, is deterministic. Admittedly, one can make sense of this idea by adopting a sufficiently bold metaphysics: namely, a metaphysics that accepts the idea of a theory of the world as a whole, so that its objects are possible worlds, and determinism becomes the requirement that any two possible worlds described by the theory that match exactly at a given time also match exactly at all future times. But this idea cannot be made sense of using the more cautious strategy of considering determinism as a feature of a given physical theory.

Third, according to this more cautious strategy, the traditional consensus is again misleading. Which theories are deterministic turns out to be a subtle and complicated matter, with many questions still open. But broadly speaking, it turns out that much of classical physics, even much of Newton’s physics, is indeterministic. Furthermore, the alleged indeterminism of quantum theory is very controversial: it enters, if at all, only in quantum theory’s account of measurement processes, an account which remains the most controversial part of the theory. These subtleties and controversies mean that physics does not pass to philosophers any simple verdict about determinism. But more positively, they also mean that determinism remains an exciting topic in the philosophy of science.
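The definition above admits a compact formal paraphrase (schematic notation assumed here, not the author’s own). Writing \(s_o(t)\) for the complete set of properties the theory ascribes to object \(o\) at time \(t\), the theory is deterministic if and only if

\[ \forall o, o' \;\; \forall t_0: \quad s_o(t_0) = s_{o'}(t_0) \;\Longrightarrow\; s_o(t) = s_{o'}(t) \;\text{ for all } t > t_0. \]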


1923 · Vol 42 · pp. 225-246
Author(s):  
William Gordon Brown

The method of describing a field of force by means of lines or tubes of induction, which originated with Faraday, was given a quantitative form by Sir J. J. Thomson, and further discussed by N. Campbell in his book Modern Electrical Theory. Since Maxwell himself looked on his work as a mathematical theory of Faraday's lines of force, one is tempted to examine the original physical theory for hints as to the modification of the Maxwellian theory to suit certain modern requirements.
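A sketch of the quantitative form in question, as it appears in standard treatments (assumed here; the paper itself is not quoted): if each Faraday tube is taken to carry one unit of electric flux, Gauss’s law,

\[ \oint_S \mathbf{E}\cdot d\mathbf{A} = \frac{q}{\varepsilon_0}, \]

fixes the number of tubes that begin or end on a charge in proportion to that charge, so that counting the tubes through a surface becomes a measure of the field strength there.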


Author(s):  
Miguel Navascués, Harald Wunderlich

One of the most important problems in physics is to reconcile quantum mechanics with general relativity, and some authors have suggested that this may be realized at the expense of having to drop the quantum formalism in favour of a more general theory. Here, we propose a mechanism to make general claims about the microscopic structure of the Universe by postulating that any post-quantum theory should recover classical physics in the macroscopic limit. We use this mechanism to bound the strength of correlations between distant observers in any physical theory. Although several quantum limits are recovered, such as the set of two-point quantum correlators, our results suggest that there exist plausible microscopic theories of Nature that predict correlations impossible to reproduce in any quantum mechanical system.
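A concrete instance of such a correlation bound (a standard example, not the paper’s own derivation): in the CHSH scenario with two dichotomic observables per observer, locally deterministic classical theories obey

\[ \left|\langle A_0 B_0\rangle + \langle A_0 B_1\rangle + \langle A_1 B_0\rangle - \langle A_1 B_1\rangle\right| \le 2, \]

whereas quantum mechanics reaches at most Tsirelson’s bound \(2\sqrt{2}\); the macroscopic-limit postulate constrains exactly this kind of two-point correlator set in candidate post-quantum theories.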


2021 · Vol 54 (6) · pp. 211-225
Author(s):  
Nina V. Kochergina, Alexander A. Mashinyan, Elena V. Lomakina, ...

A structural-logical scheme is a visual representation of the logical connections among the main elements of knowledge within a course, section, or topic. When physics is studied as an applied discipline at a technical university, its professional orientation, and the applied knowledge serving that function, come to the fore. But applied knowledge, as a consequence of physical theories, is not sufficient for developing a modern quantum-relativistic worldview. The idea of our research is to precede the systematic study of general physics with systematic ideas about the place and meaning of each physical theory, namely: before studying classical physics, to show its connection with quantum and relativistic physics. To do this, preliminary and final generalizations must be applied at different stages of the study of physics, with the help of appropriate structural-logical schemes. In implementing this idea, the following methods were used: structural-logical analysis of the general physics course with identification of its knowledge elements, systematization based on clarifying the connections between physical theories, and generalization leading to the construction of new generalized schemes for the course. In the proposed schemes, "Connection of mechanical theories" and "Scales of the Universe-Velocities", we identify structural elements that reveal the specifics of a theory's methodological representations in accordance with its place in the Universe and the velocities of its objects. The proposed methodology is based on two types of generalization: preliminary and final. The preliminary generalization shows the place of a physical theory within the system of physical knowledge in the general physics course; the final generalization makes students aware of the full range of methodological concepts used in that theory. The methodology is aimed at forming students' systematic knowledge of general physics and at developing their modern quantum-relativistic worldview.


Author(s):  
Roger Penrose, Martin Gardner

What need we know of the workings of Nature in order to appreciate how consciousness may be part of it? Does it really matter what are the laws that govern the constituent elements of bodies and brains? If our conscious perceptions are merely the enacting of algorithms, as many AI supporters would have us believe, then it would not be of much relevance what these laws actually are. Any device which is capable of acting out an algorithm would be as good as any other. Perhaps, on the other hand, there is more to our feelings of awareness than mere algorithms. Perhaps the detailed way in which we are constituted is indeed of relevance, as are the precise physical laws that actually govern the substance of which we are composed. Perhaps we shall need to understand whatever profound quality it is that underlies the very nature of matter, and decrees the way in which all matter must behave. Physics is not yet at such a point. There are many mysteries to be unravelled and many deep insights yet to be gained. Yet, most physicists and physiologists would judge that we already know enough about those physical laws that are relevant to the workings of such an ordinary-sized object as a human brain. While it is undoubtedly the case that the brain is exceptionally complicated as a physical system, and a vast amount about its detailed structure and relevant operation is not yet known, few would claim that it is in the physical principles underlying its behaviour that there is any significant lack of understanding. I shall later argue an unconventional case that, on the contrary, we do not yet understand physics sufficiently well that the functioning of our brains can be adequately described in terms of it, even in principle. To make this case, it will be necessary for me first to provide some overview of the status of present physical theory. This chapter is concerned with what is called ‘classical physics’, which includes both Newton’s mechanics and Einstein’s relativity.


2019 · Vol 97 · pp. 03036
Author(s):  
Viktor Orlov

The article deals with a mathematical model of cantilever structural elements. The dynamic load is treated as quasi-static. The differential equation of bending of the object is nonlinear and has movable singular points, at which the solution is discontinuous; from a physical point of view, the object breaks (collapses) at such a point. Applying the majorant method to this problem makes it possible, in contrast to the classical approach, to establish the boundaries of the solution domain and to construct an approximate analytical solution to a given accuracy. As a result, the displacement at any point of the cantilever structure can be calculated and the stress-strain state of the object estimated.
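A textbook illustration of a movable singular point (chosen for clarity; it is not the bending equation studied in the article): the initial value problem

\[ y' = y^2, \qquad y(0) = y_0 > 0, \]

has the exact solution \(y(t) = y_0/(1 - y_0 t)\), which blows up at \(t = 1/y_0\). The location of the singularity depends on the initial data, which is why methods such as the majorant approach must first bound the domain in which the solution exists before constructing an approximation of guaranteed accuracy.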


2016 · Vol 3 (4) · pp. 160064
Author(s):  
Yu-Zhong Chen, Le-Zhi Wang, Wen-Xu Wang, Ying-Cheng Lai

Recent works have revealed that the energy required to control a complex network depends on the number of driving signals and that the energy distribution follows an algebraic scaling law. If one implements control using a small number of drivers, e.g. as determined by structural controllability theory, there is a high probability that the energy will diverge. We develop a physical theory that explains the scaling behaviour through identification of the fundamental structural elements, the longest control chains (LCCs), which dominate the control energy. Based on the LCCs, we articulate a strategy to drastically reduce the control energy, for example in a large number of real-world networks. Owing to their structural nature, the LCCs may shed light on energy issues associated with the control of nonlinear dynamical networks.
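Why long chains are costly can be sketched numerically (a minimal illustration under assumed dynamics, not the paper's model: a directed chain with self-damping, a single driver at its head, and an arbitrary all-ones target state). For a linear system \(\dot{x} = Ax + Bu\), the minimum energy to steer from the origin to \(x_f\) in time \(T\) is \(x_f^\top W(T)^{-1} x_f\), with \(W\) the finite-time controllability Gramian:

```python
import numpy as np
from scipy.linalg import expm

def min_control_energy(n, T=1.0, steps=2000):
    """Minimum energy to steer a directed chain of n nodes, driven only
    at its head, from the origin to the all-ones state in time T."""
    A = np.diag(np.ones(n - 1), -1) - np.eye(n)   # chain coupling + self-damping (assumed)
    B = np.zeros((n, 1)); B[0, 0] = 1.0           # single driver at node 0
    # Finite-time controllability Gramian W(T) = \int_0^T e^{At} B B^T e^{A^T t} dt,
    # approximated by the midpoint rule.
    dt = T / steps
    W = np.zeros((n, n))
    for k in range(steps):
        v = expm(A * (k + 0.5) * dt) @ B
        W += (v @ v.T) * dt
    x_f = np.ones(n)                              # illustrative target state
    return float(x_f @ np.linalg.solve(W, x_f))   # E_min = x_f^T W^{-1} x_f

for n in (2, 4, 6, 8):
    print(n, f"{min_control_energy(n):.3e}")      # energy grows rapidly with chain length
```

The rapid growth of the printed energies with chain length mirrors the role of the LCCs: the driver's influence must propagate through every intermediate node, so the Gramian becomes nearly singular as the chain lengthens.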


Author(s):  
Jun Jiao

HREM studies of the carbonaceous material deposited on the cathode of a Huffman-Krätschmer arc reactor have shown a rich variety of multiple-walled nano-clusters of different shapes and forms. The preparation of the samples, as well as the variety of cluster shapes, including triangular, rhombohedral and pentagonal projections, is described elsewhere. The close registry imposed on the nanotubes focuses attention on the cluster growth mechanism. The strict parallelism in the graphitic separation of the tube walls is maintained through changes of form and size, often leading to 180° turns, and accommodating neighboring clusters and defects. Iijima et al. have proposed a growth scheme in terms of pentagonal and heptagonal defects and their combinations in a hexagonal graphitic matrix, the first bending the surface inward and the second outward. We report here HREM observations that support Iijima's suggestions and add some new features that refine the interpretation of the growth mechanism. The structural elements of our observations are briefly summarized in the following four micrographs, taken in a Hitachi H-8100 TEM operating at an accelerating voltage of 200 kV with a point-to-point resolution of 0.20 nm.
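The geometric role of these defects can be made precise with Euler's polyhedron formula (a standard argument, added here for context): for a closed cage built from hexagons, \(p\) pentagons and \(s\) heptagons,

\[ V - E + F = 2 \quad\Longrightarrow\quad p - s = 12, \]

so a closed graphitic cluster must contain exactly twelve excess pentagons, and every heptagon (bending the surface outward) must be compensated by an additional pentagon (bending it inward).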

