Cognitive Prosthesis - ebook

Publisher:
Release date: 1 May 2024
Ebook format: EPUB, MOBI (2-in-1)
The book The Cognitive Prosthesis is an innovative and original work. It discusses in detail the theories of esoPhysics and, on that basis, shows what mental Quantum Tools are and how to use them. One such tool is the Cognitive Prosthesis, based entirely on an algorithm: the Philosopher's Stone. Once you get to know and use a mental quantum tool, you are guaranteed never to be able to do without it again. You will preserve your health and good condition, and you will solve your life's problems.

Category: Physics
Protection: Watermark
ISBN: 978-83-8384-047-5
File size: 1.3 MB

BOOK EXCERPT

Introduction

At the very beginning of this publication, I would like to clarify that this will be a book mainly about preserving and improving the condition of a person's mind and life using Quantum Tools, primarily the Philosopher's Stone Algorithm, of which I am the author. Of course, you can also use other Tools for this purpose, but because I am accustomed to the Algorithm, and personally consider it the best Tool ever invented, I use it more or less automatically. Technical Tools such as Quantec or Healy (and others will probably be created soon) are expensive, however, and you need a large enough budget for them. I know, I know, sick people sell everything they own to buy themselves such modern devices, but for now I still think that "my" Philosopher's Stone Algorithm is the best. It only requires commitment and time. Someone who does not want to delve into the theory of how these Quantum Tools work, and who has the cash, will probably choose to buy a technical quantum device of the Quantec or Healy type, or something else which, I predict, will be created in the near future, and will thus have the problem solved: they will be able simply to enjoy the benefits of Quantum Tools and the whole theory passively, and go about their own business.

However, in order to focus on the substance of this topic, I must refer here to the theory of esoPhysics and the Two-Level Interpretation of Quantum Mechanics, which I have been promoting in my books. From now on, however, I will call it the Theistic Interpretation of Quantum Mechanics. To learn about the subject more deeply, I would recommend reading my previous publications. These books, written over several years, are: "esoPhysics", "The Uber Mage", "The Quantum Condition of the Mind", "The Four Pillars" and "Mind and esoPhysics". You don't have to buy them right away, as most are available through the Legimi and Empik Go subscription services.

Here I must refer to this theory to justify my view of the nature of the human psyche and consciousness, in the context of the possibility of intervening in it positively and "repairing" it; that is, of properly correcting the psyche, human behavior, and everything that affects us humans in life. Normally, after all, we live carelessly, that is, without deep reflection or attention to how the turns of life sculpt us, or rather our brains. And yet, as happens in life, some are more fortunate, others less so (at least it seems so to us from the perspective of the Measurable Level, but in reality…?!). Thus, our psyches, our consciousnesses, and even our souls can be corrected with the help of Quantum Tools. Someone will say: but this, after all, is the content of our Path of Spiritual Development; it is inscribed in the meaning of our existence here on Earth. Yes and no. Indeed, until now there were no effective methods of intervening in this, but today, in the Quantum Age, there are finally the right Quantum Tools to face this subject. And there is also a relevant theory, which, let me say here, I have mostly developed myself, or possibly unearthed: people discovered it long ago, only the deluge of Atheism and Materialism that has taken over Science and the official narrative pushed it out of the public domain. Now I will state at the outset what I am going to try to demonstrate before I move on to the proper forms of the Cognitive Prosthesis, which makes it possible to influence the quality of our lives. First of all, I will demonstrate that the world is based on two levels: the Measurable Level, that of Hard Matter, and the other, the Non-Measurable Level, the level of Transcendence, the Spiritual Level.
Then I will argue that on the Non-Measurable Level there must manifest the Causal and Purposeful Cause, which I call God, and which is already sufficient for all the Effects it causes; these Effects are then manifested on the Measurable Level. I will show that God has given humans the attribute of influencing this Non-Measurable Level, which is the proper base of Causal and Purposeful Causes whose Effects manifest later on the Measurable Level, the one that can be subjected to measurement. At this level (the Non-Measurable Level), as I will also show, there is no randomness. I will need this to show that man, as an entity that functions on the Measurable Level and has a material body and a material brain, must, like any entity, also be linked to the Non-Measurable Level, because, after all, everything must have a link to it. And linked through what? Through the Soul, which is the manifestation of a person's Spirit from the Non-Measurable Level onto the Measurable Level. There is sufficient empirical evidence to claim that man's brain, through superposition, contains a material part related to the Measurable Level, this Animality, and has a Soul, an expression of his Spirit, which is perceived by the brain, probably, as the Occultists claim, through the pineal gland; but it is not even important through what organ the brain perceives the Soul, because the Soul is certainly not a product of the work of the brain's neurons. What is important is that the end result of this superposition of the Spiritual and the Animal is human consciousness. Human consciousness cannot be explained by neural structure alone, by the structure of the brain and nervous system; although attempts to study and explain it go back more than a hundred and fifty years, the phenomenon of human consciousness could not and still cannot be explained in materialistic and atheistic categories.
So Descartes was right that there is in man a dualism of the Spiritual and the Material (the Animal). Next, I will show how it is possible, using Quantum Tools and according to this whole theory, to influence the brain and the quality of human life by means of what I have called, perhaps erroneously and imprecisely, the Cognitive Prosthesis.

1. Historical outline

a. Physics to the 20th century

The science of our modern civilization is short, very short; it is just the last few minutes in what might conventionally be called the daily history of our species. Therefore, although we already boast of quantum computers, it is not surprising that we still know so little. We are lost in conjecture, and knowledge, even that of the Universities, is based mainly on usus: what the authorities deem to be true is taught universally. And although the history of science (I am thinking here mainly of physics and metaphysics, although the latter is often regarded, not without reason, as a collection of confabulations) is so short, it has abounded in theories which over time have been rejected by the authorities. Take, for example, Ptolemy's geocentric cosmology, the caloric theory of heat, the concept of the ether as a carrier of electromagnetic waves, or even Kelvin's vortex concept of the structure of atoms: all of these, it is believed, were falsified and rejected. Falsification was widely regarded as the primary logical tool in establishing the prevailing Paradigm of Science, at least until the end of the 20th century. Today, it is believed that there are theories that do not lend themselves to falsification, although they are considered by most authorities to be at least promising; I am referring to M-Theory, String Theory, and Supersymmetry. This is because science has reached the limit of human perception, that is, it has descended below the Planck time and Planck length, which can no longer be subjected to falsification, carried out in Science mainly empirically, that is, by experiment. Although empirical confirmation of these theories is being frantically sought at CERN, this has not yet yielded the desired result. Physics as a Science originates in the deliberations of the early Greek philosophers, who passionately considered the ontological nature of entities.
And this, along with philosophy, is also a favorite lifelong theme of physics. Antiquity, it is believed, abounded with a host of philosophers who are still highly regarded today, but first-order scientists among them could be counted on the fingers of one hand. Certainly among these can be counted Pythagoras, Archimedes and Aristotle, who, after all, "knew everything" and, unfortunately, introduced a large number of confusions and inaccuracies into common Science, which persisted almost until the time of Galileo. And although he (Aristotle) based his theories on verbal formulations (the algebraic form of representing the laws of nature was not popularized until the 17th century and later), it is known that one of the postulates he believed to be correct was the famous formulation F = mv. According to him, force was proportional to mass and velocity. Hence, a common conclusion from this formula was that when there is no force on a body of mass m, i.e. F = 0, the body does not move, because then v = 0. This seems total nonsense to us today, but it was still believed until the advent of Galileo Galilei. Galileo was probably one of the first full-fledged physicists who operated mainly with knowledge derived from experience. He was the first to correct Aristotle: with a series of experiments he showed that if there is no force acting on a body, F = 0, then the body moves uniformly along a rectilinear path or remains at rest. From this postulate it followed that F ≠ mv; the question then became: what is F equal to? Galileo partially found the answer, but did not formulate and disseminate it. A complete answer was given only by Newton, the creator of integral and differential calculus. It is true that Galileo stated that for F = 0, v = constant, but it was Newton who knew that a = dv/dt, hence d(constant)/dt = 0, because the derivative of a constant is zero; i.e. his correct conclusion was that for F = 0 we have a = 0, so "F" must be directly proportional to "a", from which it was a small step to formulating Newton's universal law of Dynamics: F = ma. In this Newtonian formulation, "m" (mass) acts as a constant coefficient of proportionality. This, then, was an effective refutation of Aristotle's findings. But before this happened, the history of physics saw the emergence of such names as Tycho Brahe and his successor Kepler. The former compiled, with the naked eye, very accurate astronomical tables, from which, after deep analysis, Kepler derived his laws, establishing among other things that the Planets move not in circular orbits, as hitherto assumed, but in ellipses, with the Sun always located at one of the foci of the ellipse. Kepler's cosmology was cemented by Newton, who finally explained the cosmological laws with his formula for the gravitational force, which is directly proportional to the product of the mass of the Sun and the Planet, and inversely proportional to the square of the distance between them. It turns out that all of Kepler's Laws follow directly from this simple relationship. Newton was also the first scientist to formulate the so-called Scientific Method, the way of working of a physicist: a scientist who objectively investigates and analyzes the conditions of a physical problem, without recourse to extraordinary and mystical circumstances. Such a scheme can be found in all his available works, especially in _Principia…_, where Newton explicitly shuns formulations that cannot be proved scientifically. This can be seen in relation to the problem of the action of the Gravitational Force through "empty" space, which was difficult for him to accept (a force acting without direct contact). He preferred not to draw any metaphysical conclusions on this subject.
This problem was solved only by Einstein in the General Theory of Relativity, who determined that there is no Gravitational Force: there is only a disturbance of space-time, and cosmic bodies move along geodesic lines determined by the masses disturbing that space-time. For Newton it was difficult to assume that Gravitational Forces act without intermediaries through the empty space of the Cosmos, and this is what his Cosmology boiled down to. In modern physics it is likewise accepted that Forces act through force-exchange particles, and that they cannot act without intermediaries unless they act through a Force Field, as accepted in Maxwell's Classical Theory; but the concept of a Field is a separate, deep physical concept. Today we know that there must be force-exchange particles in any case. And although Newton is considered the founder of modern Science, free of reference to extraordinary phenomena, it is worth knowing that privately he was an avid Alchemist and Magician, who later devoted almost his entire life to the study of the Bible Code, from which, he hoped, he would gain deep knowledge of the world, reality and Spirituality. It is something of a paradox that the Founding Father of rational Science was de facto a Magus, i.e. someone who partook of the extra-sensory world. Even so, his achievements in Science are undeniable and indisputable.
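The claim above, that Kepler's ellipses follow directly from Newton's inverse-square force, can be checked numerically in a few lines. The sketch below is my own illustration, not from the book: it uses natural units with GM = 1, starts a body at distance 1 with less than circular speed, and integrates F = ma with a leapfrog scheme. The radial distance then oscillates between a fixed aphelion and perihelion, the signature of a closed ellipse with the Sun at one focus.

```python
# Sketch: Newton's inverse-square gravity integrated numerically.
# Natural units, GM = 1; all names here are illustrative assumptions.
import math

GM = 1.0  # gravitational parameter G * M_sun

def accel(x, y):
    """Acceleration from F = -GM m r / |r|^3 (per unit mass)."""
    r3 = (x * x + y * y) ** 1.5
    return -GM * x / r3, -GM * y / r3

def integrate(x, y, vx, vy, dt, steps):
    """Leapfrog (velocity Verlet) integration of the orbit."""
    ax, ay = accel(x, y)
    traj = []
    for _ in range(steps):
        vx += 0.5 * dt * ax; vy += 0.5 * dt * ay
        x += dt * vx;        y += dt * vy
        ax, ay = accel(x, y)
        vx += 0.5 * dt * ax; vy += 0.5 * dt * ay
        traj.append((x, y))
    return traj

# Start at r = 1 with tangential speed 0.9 (below circular speed 1),
# so the starting point is the aphelion of an ellipse.
traj = integrate(1.0, 0.0, 0.0, 0.9, dt=1e-3, steps=20000)
rs = [math.hypot(x, y) for x, y in traj]
print(min(rs), max(rs))  # r oscillates between perihelion and aphelion
```

The vis-viva equation predicts a semi-major axis a = 0.840 and a perihelion near 0.681 for these starting values, which the integration reproduces; the orbit repeatedly returns to the same two extremes rather than spiraling, as Kepler's first law requires.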

The Enlightenment and the period after it saw the emergence of the classical formalisms, mainly formalisms based on purposive causality. These purposive formalisms included the principle of least action, Lagrange's formalism, and later Hamilton's formalism: formalisms somewhat different from the Newtonian one, which was based on causal causality, where the main role was played by the Forces that caused the action. In the formalisms based on purposive causality, it was not the Forces that were at the core, but the purposive (Hamiltonian) conditions that the bodies subject to such a formalism had to meet. But for both Causal Causation and Purposeful Causation under the Determinism then in force, Causes (both Causal and Purposeful) were already necessary and sufficient for Effects to take place in a certain way. This distinguishes the formalism of classical physics (mechanics) from quantum physics (mechanics), as I will write about later. It will turn out that for the Quantum at our Measurable Level, Indeterminism applies, i.e. Causes (Causal or Purposeful) are only necessary, but not sufficient, for Effects to take place. The distinction between two types of causes, Causal Causes and Purposeful Causes, dates back to antiquity, and in fact originates with the already mentioned Aristotle himself. He introduced, in general, the concept of four types of Causes: formal, material, causal and purposive (or final), but only the Causal and Purposive Causes have survived to our time. The former two pertained to his metaphysical conception of the ontology of being (the structure of matter), but since this idea turned out to be inaccurate, or even wrong, these Causes (material and formal) are today no longer included in the description of physics.
And indeed, the Newtonian formalism based on the concept of Force is consistent with Causal Causes, while Lagrange's and Hamilton's formalisms are based on, and strictly related to, Purposeful Causes: in Hamilton's formalism the elements of a system (a body) must satisfy the purposeful condition that the Hamiltonian of the system defines. But Causality, or the Law of Cause and Effect, must take into account both types of Causes and applies to both. In my earlier books, being more attached to the Newtonian formalism (which was, after all, the first), I defined the Law of Causality in the context of Causal Causes, but all conclusions based on Causality, i.e. on Determinism or Indeterminism, can just as well be related in the same way to Purposeful Causes, so from now on I will speak generally of Causes, in both contexts, Causal and Purposeful, without specifying. It may be noted here that the Hamiltonian and Lagrangian formalisms are more readily used in modern physics because of a number of advantages, including computational ones, over the Newtonian formalism. These include the ease of transforming from one reference system to another (generalized coordinates and momenta) and the resulting simplicity of describing the physics of systems in these formalisms based on Purposeful Causes. However, the concept of Force is very persistent, as most descriptions and laws of physics use it. Four types of Elementary Forces are distinguished: the Gravitational Force, the Electromagnetic Force, and the Strong and Weak Nuclear Forces. As will become clear later, in the Quantum part, my Two-Level Interpretation of the Quantum takes into account two more types of Forces, but more about that in due time. It may be noted that these (four main) Elementary Forces are equivalent to Causal Causes, because they cause source action, or causation.

So it is worth referring here to the concept of Causality (the Law of Cause and Effect). This is, and actually has been so far, a concept treated rather nonchalantly in Science, in Physics. Everyone agrees that it is a very important concept, but it has actually been given little consideration in Science so far. It is common knowledge that in a physical process the (Causal?) Cause must precede in time the Effect it causes. Purposeful Causes condition physical processes, which, however, always end in some kind of Effect. In this context, they too are either sufficient (Determinism) or merely necessary (Indeterminism) for these Effects.

Until now, it has also always been assumed that Causes are necessary (the opinion of Aristotle, who first proposed this) and sufficient for Effects, and this was thought to be so until the early 20th century. But in fact this case refers to Determinism. It is worth realizing that Causality in general divides into two branches: Determinism and Indeterminism. So this case, the sufficiency of the Cause for the Effect it causes, applies only to Determinism. But what about Indeterminism? How is it to be understood? And when do we have to deal with it? This is my concept, which I promote in my publications. It should be understood this way: Indeterminism occurs when Causes are only necessary, but insufficient, for the Effects they cause. In other words, when these Causes can, but do not have to, produce Effects. It will be explained that such a case occurs in the Quantum, when (at the Measurable Level) Effects take place in physical processes, but with some probability, or to put it a little differently, with some uncertainty. I will write more about this as part of my description of Quantum Mechanics. At this point, however, I must mention it. Why? Because all of Classical Physics (almost all) at the Measurable Level is subject to de facto Determinism: all its formulas are clear, transparent, unambiguous and Deterministic. And the whole of the Quantum at the Measurable Level is subject to Indeterminism, to uncertainty. And here, for now, I must stop; I will return to this later in the book.
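The distinction drawn above, a Cause that is sufficient versus one that is merely necessary, can be made concrete with a toy simulation. This is my own illustration, not from the book: the "deterministic" process always produces its Effect once the Cause is present, while the "indeterministic" one produces it only with some probability p < 1, so the Cause is necessary but not sufficient.

```python
# Toy contrast between Determinism and Indeterminism (illustrative only).
import random

def deterministic_effect(cause: bool) -> bool:
    return cause  # Cause present => Effect always follows (sufficient)

def indeterministic_effect(cause: bool, p: float, rng: random.Random) -> bool:
    # Without the Cause there is never an Effect (necessary),
    # but with the Cause the Effect occurs only with probability p.
    return cause and rng.random() < p

rng = random.Random(0)  # fixed seed for reproducibility
trials = 10_000
det = sum(deterministic_effect(True) for _ in range(trials))
ind = sum(indeterministic_effect(True, 0.3, rng) for _ in range(trials))
print(det / trials, ind / trials)  # 1.0 versus roughly 0.3
```

In the deterministic case the Effect rate is exactly 1; in the indeterministic case it hovers around p, which is the sense in which Effects at the Measurable Level occur "with some probability".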

In the 18th century, the era of Electromagnetism begins in Science. It is with this phenomenon that the concept of the Force Field, introduced by Faraday, is associated. From the very beginning it seemed that these were forces similar to gravitational forces, that is, forces acting at a distance, something that could not be explained Scientifically, and it was partly to solve this that the concept of the Force Field was introduced. Maxwell's Classical Theory is based on the concept of the Electromagnetic Force Field and EM (electromagnetic) waves. Maxwell, who gave the formulas describing all of Classical Electromagnetism, also proved that light is a kind of electromagnetic wave. Maxwell's opinion and theses were instrumental in the fact that from then until the advent of the quantum in the 20th century, light was considered a wave. The trouble is that waves need a carrier, just as the carrier of sea waves is water, and of acoustic waves, air. But although for some three hundred years a carrier of EM (Electromagnetic) waves has been sought (the so-called Ether; in fact there are still some who are looking for it today), no one has succeeded. I will describe this issue in detail when I write about the Quantum. Then I will also try to prove that there is no Ether, because light and EM waves are particles, as, by the way, Newton himself maintained: Field Quanta (particles) that propagate through space at the absolute speed "c". The fact is that these particles behave as if they were waves, because their physics manifests the hallmarks of waviness, for example diffraction and interference, so for the people of those years the natural conclusion was that light is a wave. Even today, in the Copenhagen Interpretation of Quantum Mechanics, there is a wave-particle dualism, which assumes that matter, including light and elementary particles, can be considered waves at one time and particles at another.
I will return to this when I present the Quantum part of the book. For now I will only mention that such views (dualism) stem from a certain misunderstanding, which can be easily explained: the explanation lies almost as if on a platter in the mathematical formalism of the Quantum.

Discoveries within the theory of electromagnetism brought people significant and lasting technological advances and improved everyday living conditions. With the advent of electrical engineering, of numerous electromagnetic devices and motors in homes, came widespread lighting, power grids and electrical appliances improving the comfort of life. In fact, progress in this field continues to this day. It is true that electronic and quantum engineering and computer science have since joined it, but these have only enriched a progress whose source goes back to the 18th century.

The development of physics, theoretical mathematics and mathematical physics since the time of Newton, since his famous Principia, was so great that practically all the main mathematical formalisms were established then, and they are still widely used today, even in new physical and scientific theories. The knowledge that man has accumulated since those days is today practically beyond the grasp of even brilliant individuals. This has forced scientists to develop very selective branches of Science, including physics and mathematics. In the Universities, physics, mathematics, chemistry, biology and virtually all other branches of knowledge had to be broken down into specialized faculties. Universities around the world have grown immeasurably, and with them a vast number of scientists, people professionally engaged in Science and teaching.

It is worth noting that practically until the emergence of Quantum Mechanics, physical formulas and theories were Deterministic. Even Thermodynamics, which was established in this first period, was based on deterministic assumptions, and only generalization to the vast number of molecules found in a single mole of gas or matter led to the probabilistic rules and laws of thermodynamics. These theories were governed by Determinism as one form of Causality. Definition of Determinism:

∀Cause (Cause → Effect).

Here, the Causes could be either Causal or Purposeful. That is, in these formulas, the Causes were already necessary and sufficient for the Effects carried out in the physical processes.

The main reason for this was that only Real numbers entered these formulas. But as soon became apparent, the Real numbers, the Field of Real Numbers, were no longer a sufficient set of numbers to describe all physical phenomena. With the arrival of the 20th century, the Field of Complex Numbers joined the game of describing the physics of phenomena. And this was a significant, qualitative change, one that changed the whole understanding of physics and of the structure of the World.
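Why moving from real to complex numbers is a qualitative change can be shown with a toy example of my own (not from the book): real-valued probabilities only ever add, while complex amplitudes carry a phase and can reinforce or cancel each other, which is exactly the interference behaviour mentioned earlier for light.

```python
# Real probabilities add; complex amplitudes interfere.
import cmath

def combined_intensity(a1: complex, a2: complex) -> float:
    """Squared modulus of the summed amplitudes (Born-rule style)."""
    return abs(a1 + a2) ** 2

a = 1 / (2 ** 0.5)  # each path's amplitude, so |a|^2 = 0.5

in_phase = combined_intensity(a, a * cmath.exp(0j))               # constructive
opposed  = combined_intensity(a, a * cmath.exp(1j * cmath.pi))    # destructive
classical = 0.5 + 0.5  # real probabilities of the two paths simply add

print(in_phase, opposed, classical)
```

With equal phases the combined intensity doubles the classical sum; with opposite phases it vanishes entirely, an outcome no sum of non-negative real probabilities can produce. (The constructive value 2 here is unnormalized; a proper two-path setup rescales it back into [0, 1].)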

b. Twentieth century physics

As late as the end of the 19th century, Lord Kelvin was firmly convinced that the theory of physics was already practically finished, and that the rest would be taken care of by the Academies, where engineers would be trained to make practical use of the findings of the great theorists of an already completed physics. In his opinion, that is, nothing interesting was going to happen in theoretical physics anymore. He was undoubtedly a great physicist who left behind a considerable body of work, but in this respect he was cardinally wrong. For, in retrospect, we know that the fun in physics was in fact just about to begin. In the 20th century, at its very beginning, two gigantic branches of Science were created: the Theory of Relativity and Quantum Mechanics. Actually, the Theory of Relativity is two theories, the Special and the General Theory of Relativity. The former is in essence a generalization of Newtonian Kinematics and Dynamics to the case of bodies moving with significant velocity, that is, with a velocity v ≥ c/3, above a third of the speed of light. In this theory Einstein, its founder, assumed that all inertial systems are equivalent for physical processes. From this it follows that there is a maximum velocity of bodies (finite, but enormous), the so-called absolute velocity "c", which is the same for all inertial systems. In the thought experiments of his still very young years, for which Einstein later became widely known, the creator of the theory came to the conclusion that light moves at exactly this speed. And indeed, experiment confirmed this conclusion. Today we know that all massless particles (massless entities) move at the speed "c". Importantly, it follows that light has the same absolute velocity "c" relative to any system, no matter how that system moves. A great many seemingly paradoxical conclusions follow from the Special Theory of Relativity.
First, time loses its rank as an absolute parameter, that is, a parameter that does not change when moving from system to system. Second, space and time are inextricably linked, forming space-time. When you go from system to system, the entire space-time is transformed, and this is expressed in the formulas for the Lorenz Transformation. In detail, it turns out, according to these formulas, that time undergoes dilation and distance undergoes Lorentzian shortening, that is, in a moving system, relative to one at rest, time slows down, and length in a moving system (relative to a system at rest, i.e., a laboratory system) in the direction consistent with the motion of that system shortens. And here, right at the start, Einstein had to deal with the first paradox (turns out to be a pseudo-paradox). The point is that, after all, motion is relative. After all, if system A is moving with relative velocity „v” with respect to system B, then it can be equivalently said, according to this principle, that it is system B that is moving with velocity "-v” with respect to A. The sign "-" changes nothing here, it is only a matter of changing the return of motion. So, what do you mean, then in which system does time slow down? This was the main weapon of all opponents of STW (Special Theory of Relativity). Because it was a conclusion that supposedly led to an outright absurdity, i.e., proof No Straight that STW is fundamentally false. And if we now caused one of these moving systems (A or B) to turn around, and these systems came into contact again, as they did at the beginning of the movement, in which one would the passage of time be less? This is the content of the so-called Paradox of Twins (Because in these systems, one can assume, each in a different one, there are twins). In turn, the solution is trivially simple. Refer to the dynamics. 
Note that one of the systems must change its direction of motion at some point, and from then on we are no longer dealing with an inertial system but with an accelerated one, subject to inertial forces, so the findings about the relativity of inertial motion do not apply to it. It is in this system, the one in which these forces occur (so that it can turn around), that time dilation occurs. This follows directly from the formulas of the Lorentz Transformation through somewhat tedious mathematical operations, which I have omitted here for convenience, but anyone can carry them out themselves on a piece of paper.
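The numerical size of these effects is easy to sketch. A minimal Python illustration (the function names are mine, chosen for clarity) computes the Lorentz factor and applies it to time dilation and length contraction at the v ≥ c/3 threshold mentioned above:

```python
import math

C = 299_792_458.0  # speed of light in m/s (exact, by SI definition)

def lorentz_gamma(v: float) -> float:
    """Lorentz factor for a system moving at speed v relative to the lab."""
    if abs(v) >= C:
        raise ValueError("massive bodies must move slower than c")
    return 1.0 / math.sqrt(1.0 - (v / C) ** 2)

def dilated_time(proper_time: float, v: float) -> float:
    """Time elapsed in the lab frame for a clock moving at speed v."""
    return proper_time * lorentz_gamma(v)

def contracted_length(rest_length: float, v: float) -> float:
    """Length of a moving rod measured in the lab, along the direction of motion."""
    return rest_length / lorentz_gamma(v)

# At v = c/3 the effects are already a few percent:
v = C / 3
print(round(lorentz_gamma(v), 4))           # gamma ≈ 1.0607
print(round(dilated_time(1.0, v), 4))       # 1 s of proper time reads ≈ 1.0607 s in the lab
print(round(contracted_length(1.0, v), 4))  # a 1 m rod measures ≈ 0.9428 m
```

Below c/3 the factor stays within about half a percent of 1, which is why relativistic corrections are negligible for everyday speeds.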

Something else is important when considering the transformation from system to system, something that physicists have so far overlooked or treated only in passing. Every transformation from system to system must satisfy the condition that Causality between events is preserved, that is, that the Law of Cause and Effect holds. In other words, if in some reference system the Cause (A) precedes the Effect (B), then under any transformation of systems this order must be preserved. This is guaranteed for timelike-separated events, whose temporal order is the same in every inertial system; only for spacelike-separated events can the temporal order change from system to system, but such events cannot be causally connected at all. It is also worth noting that it is difficult to obtain temporal simultaneity for two different points in space. Hence various strange conclusions arise from events in STW. It is all very unintuitive, but there are no paradoxes in it; so far STW has not been falsified, and experiment confirms all of its unexpected conclusions.
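The classification that underlies this causality condition rests on the invariant spacetime interval, which every Lorentz Transformation leaves unchanged. A minimal Python sketch (one spatial dimension, names my own) classifies a pair of events by the sign of that invariant:

```python
C = 299_792_458.0  # speed of light, m/s

def interval_type(dt: float, dx: float) -> str:
    """Classify the separation of two events: dt in seconds, dx in metres.

    The invariant s^2 = (c*dt)^2 - dx^2 has the same value in every
    inertial system, so this classification is frame-independent.
    """
    s2 = (C * dt) ** 2 - dx ** 2
    if s2 > 0:
        return "timelike"   # a causal link is possible; temporal order is fixed
    if s2 < 0:
        return "spacelike"  # no causal link; temporal order is frame-dependent
    return "lightlike"      # connectable only by a signal moving at c

print(interval_type(1.0, 1.0e8))  # timelike: light covers ~3e8 m in 1 s
print(interval_type(1.0, 1.0e9))  # spacelike: too far apart for any signal
print(interval_type(1.0, C))      # lightlike
```

Because the sign of s² is the same in every inertial system, no transformation can ever turn a causally connectable pair of events into a causally disconnected one, or vice versa.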

The first half of the 20th century undoubtedly belonged to Albert Einstein, who first formulated STW (1905) and then OTW, the General Theory of Relativity (1915). His contribution to theoretical physics was not limited to these two powerful ideas, for he was also a co-creator of Quantum Mechanics (1900-1925), which was nascent at the time. And although, once the theory of MK (Quantum Mechanics) had somewhat solidified, he was unable to accept and come to terms with the blatant Indeterminism (in short, randomness) that flowed from it, famously expressed in his phrase „God does not play dice with the world”, he nevertheless followed all the findings of MK closely until the end of his life, as it came to dominate Science as the most important physical theory of the 20th century. Before I briefly discuss OTW and move on to MK, I would like to note at this point that Einstein was both wrong and very much right, which I will justify in detail a little later. For now I will note, enigmatically, that Quantum Mechanics (MK) seen from the Measurable Level is indeed Indeterministic, that is, largely probabilistic (random), but seen from the second level, which I will allude to later, the Non-Measurable Level, it is not. In essence, then, it only appears to us at the Measurable Level that MK is Indeterministic; at the deeper, Non-Measurable Level there is no randomness, and God in fact does not play dice with the world. Einstein's tremendous intuition is thus bowed to and confirmed here. What I am writing now may seem puzzling and enigmatic, but I will address it in more detail later in the book and try to explain it. For now I will only state that almost all theories written from the Measurable Level, i.e., written up to the 20th century, are Deterministic, written precisely in this spirit. This still applies to STW and OTW as well.
They too are as Deterministic as possible. Only MK and the theories sharing the common denominator of Quantum are written from the Non-Measurable Level; from that level they are not Indeterministic, while from the point of view of the Measurable Level they actually are, and this is already a qualitatively different form of description of the physics of phenomena. It is interesting that only after a century of MK is someone (i.e., me) making such obvious statements. Before I start describing this division of the structure of the world into two levels (Measurable and Non-Measurable) and describe Quantum Mechanics, and Quantum in general, let us go back to the General Theory of Relativity (OTW), Einstein's second theory.

The General Theory of Relativity assumes the equivalence of all types of reference systems, both inertial ones and those in which forces act, in the description of physics. It applies mainly to Cosmology, because it replaces Newton's Cosmology, which was thereby falsified, so to speak. It assumes the equivalence of gravitational mass with inertial mass, thus postulating that the acceleration resulting from the action of the gravitational „force” should be considered equivalent to the acceleration resulting from the action of an inertial force. The whole theory (OTW) constitutes a kind of nonlinear geometry with a specific metric, which is shaped by the masses of bodies (planets, stars, space objects); these curve space-time locally, so to speak, causing the masses (the space bodies) to move along geodesic lines determined by that curvature. The most important equation of the General Theory of Relativity is Einstein's field equation. It describes the relationship between the curvature of space-time and the distribution of masses and energies within it. Its main elements are the Energy-momentum tensor, the Riemann curvature tensor, the metric tensor, and constants of nature such as the Gravitational constant. Actually, OTW is curvilinear geometry with the constraint that Einstein's Field Equation imposes on it; bodies, space objects, must satisfy these conditions. That is, there are Purposeful Causes (Einstein's Field Equation), while forces do not exist, so there are no Causal Causes (as far as the force of Gravity is concerned). Purposeful Causes are therefore entirely sufficient; in a word, pure Determinism. Although this description is seemingly simple, the equation has infinitely many solutions. Whether the world given by such an equation will shrink, expand, or be static is determined by the average density of the world's matter.
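In the standard tensor notation, the field equation referred to above reads:

```latex
% Einstein's field equation: space-time curvature (left-hand side)
% sourced by the distribution of mass-energy (right-hand side)
R_{\mu\nu} - \tfrac{1}{2} R\, g_{\mu\nu} + \Lambda\, g_{\mu\nu}
  \;=\; \frac{8\pi G}{c^{4}}\, T_{\mu\nu}
```

Here $g_{\mu\nu}$ is the metric tensor, $R_{\mu\nu}$ the Ricci tensor (a contraction of the Riemann curvature tensor), $R$ the scalar curvature, $T_{\mu\nu}$ the energy-momentum tensor, $G$ the Gravitational constant, and $\Lambda$ the cosmological constant, whose value (together with the average matter density) decides between the shrinking, expanding and static solutions mentioned above.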
According to empirical findings, our World is expanding and will continue to expand until the moment of „heat death”, when, after a tumultuous history, after the phase of formation of stars, planets and then black holes, the masses (black holes) will evaporate into the form of photon radiation. A full Eon of the world's existence will then come to an end, which will be the origin of a new world (a new Eon); that is, according to this theory, the world is eternal and passes through successive Eons. One such full Eon is estimated to last about 10^13 years. This is the version of Cosmology behind Sir Roger Penrose's theory, which, it is true, not everyone agrees with, but which as of today seems to be the correct one. In that case, however, there must be some remnants in the Cosmos from past Eons, and this already yields to falsification. For now, a strenuous search is underway for these remnants, which will confirm or refute Roger Penrose's theory. For the sake of sobriety, it is worth noting that the present Universe (the current Eon) is about 13.8 billion years old, on the order of 10^10 years (officially), although recent findings suggest that this age may be as much as twice that. Either way, there is still a radically long time until the end of the Eon. More than that, the Solar System in which we find ourselves, and the Earth, have not yet reached even an estimated half of their potential existence.

Albert Einstein is considered, if not the most outstanding, then certainly one of the most outstanding physicists not only of modern times but in general. Certainly no one else (perhaps only Isaac Newton can match him in this) has created such coherent and great systems in physics. His greatness would have been perfect and unquestionable were it not for one small detail. Although he was one of the pioneers of Quantum Mechanics, he did not leave as deep a mark of genius on it as he could have, as he did in his flagship Theories of Relativity. This, by the way, aroused frustration and bitterness in him almost to the end of his life, reflected in the fact that he could not come to terms with this system (MK). In fact, he could not come to terms with the interpretations of Quantum Mechanics popular and valid at the time, mainly the Copenhagen Interpretation, and even more strictly with the Indeterminism that overtly followed from MK. Toward the end of his life (until his death), he worked to unify his Theory of Relativity and Quantum Mechanics into a single Deterministic system. Unfortunately, these efforts proved futile. I am not some outstanding theoretician or physicist, but in my opinion this failure was due to the fact that, both then and in modern Science, Quantum Mechanics (MK) and all its valid interpretations are one-level. And this is, among other things, the reason that while Classical Physics, up to the beginning of the 20th century, was written mainly in the spirit of Determinism, because it was written from the Measurable Level, Quantum Mechanics is already written mainly from the Non-Measurable Level. I will explain how I understand this in a moment. Now let us return to the history of physics.

Quantum Mechanics, or rather its framework, was created at the beginning of the century, around 1900. The first was Planck. In order to explain the spectrum of blackbody radiation, he proposed the novel solution that the energy emitted by a Black Body in the form of radiation is quantized into portions; today we would say into quanta of energy. In his opinion at the time, this was a purely mathematical operation, not a formal change to or undermining of the existing Paradigm of Science. It was supposed to save the then-current formulas for the radiation spectrum of the Perfect Black Body, because these struggled with the so-called ultraviolet catastrophe: the predicted emitted energies diverged to infinity as the radiation frequency increased, which experiment plainly did not confirm. Only such a maneuver (quantizing the energy, i.e., dividing it into separate portions) saved the whole thing. Interestingly, as the anecdote goes, earlier, when he was about to get serious about Science, Planck was advised against studying physics on the claim that nothing new or interesting could be discovered in it anymore, and that he should rather take up some other, more promising branch of Science. In fact, as we know, he became the absolute pioneer of a new branch of physics, probably the most important physical system of all (Quantum Mechanics and Quantum in general), which changed not only the Paradigms of Science but the entire modern world. The fact that today we use computers and electronics, and that AI was born and is becoming ever more common, has its roots in this „mathematical” trick of Planck, in the quantization of energy radiated in the form of light. Very soon after, in 1905, Einstein reasoned that light, and EM radiation in general, should be treated as corpuscles, particles that move at the absolute speed „c” (very large, but nonetheless finite).
That is, Einstein de facto clarified that this „mathematical” trick of Planck had as physical a basis as possible, because energy emitted in the form of light (EM radiation, i.e., waves) always comes in the form of particles, quanta of the Field (of Energy), which have henceforth been called photons. The dilemma of light, waves or particles?, is known in the prevailing modern interpretations of Quantum as corpuscular-wave dualism. I will return to it later in the book.

It turns out that the radiation quantum, the quantum of radiation energy, equals Ek = hf, where „h” is Planck's constant and „f” is the frequency of the EM wave. Quantization of energy does not mean that energy (E) is discontinuous or has any holes. It only says that for a given frequency of an EM wave, energy is emitted or absorbed in portions, or quanta, and for that frequency such a quantum of energy equals Ek = hf, while the frequency „f” of the EM wave itself is as continuous as possible. At the same time, it should be kept in mind that such a quantum of energy is the smallest portion of energy for a given frequency „f”, that is, it is the energy of one photon of that frequency. However, the frequency of photons is not arbitrary; it is determined by the physical processes under consideration. In this sense it is discrete, and the energy, the quantum of energy, for such a process is determined and likewise discrete; the total energy in this case is a multiple of a specific quantum of energy. But formally, across different physical processes and different elements, the frequency of a photon has no mathematical limits, so the quantum of energy may correspond, for one process, to f_a = 2.42·10^15 Hz, and for another to f_b = 6.4·10^16 Hz.
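Plugging the two example frequencies into Ek = hf is a one-line calculation; a minimal Python sketch:

```python
H = 6.62607015e-34  # Planck's constant, J*s (exact, by SI definition)

def photon_energy(f: float) -> float:
    """Energy quantum E_k = h*f of one photon of frequency f (in Hz)."""
    return H * f

# The two example frequencies from the text:
fa = 2.42e15  # Hz
fb = 6.4e16   # Hz
print(f"E(f_a) = {photon_energy(fa):.3e} J")  # ≈ 1.604e-18 J
print(f"E(f_b) = {photon_energy(fb):.3e} J")  # ≈ 4.241e-17 J
```

Both results are minute in everyday units, which is why the granularity of energy went unnoticed until the measurements that Planck set out to explain.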

Physicists were also very quickly able to explain on this basis the atomic structure of hydrogen (H) and the conditions that an electron in a hydrogen atom must satisfy, including the values of energy absorbed or emitted when an electron jumps from one orbit to another, because Quantum turned out to describe the world at the atomic level perfectly. The first co-creators and discoverers of Quantum were Planck, Einstein, Bohr and others. This was that first, one might say very naive, period of Quantum Mechanics. It was followed by a period of forming and developing the mathematical formalisms of Quantum Mechanics, which lasted almost two decades. Each year brought astonishing discoveries about Quantum. The final stage was the blossoming of Quantum Mechanics into a definitive mathematical formalism, which we use practically to this day.
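The orbit-jump energies mentioned above follow from the well-known Bohr-model formula E_n = -13.6 eV / n²; a minimal Python sketch (function names my own):

```python
RYDBERG_EV = 13.605693  # hydrogen ground-state binding energy, eV (Bohr model)

def level_energy(n: int) -> float:
    """Energy of the n-th Bohr orbit of hydrogen, in eV."""
    return -RYDBERG_EV / n ** 2

def transition_energy(n_from: int, n_to: int) -> float:
    """Photon energy released (positive) or required (negative) in a jump."""
    return level_energy(n_from) - level_energy(n_to)

# The first Balmer line (n = 3 -> n = 2), emitted as visible red light:
print(round(transition_energy(3, 2), 2))  # ≈ 1.89 eV
```

Each allowed jump yields one photon of a definite frequency f = E/h, which is exactly why hydrogen emits a discrete line spectrum rather than a continuum.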

The final formalism of Quantum Mechanics was thus defined between 1925 and 1926. The first was the matrix formalism presented in 1925 by Heisenberg, now called the Heisenberg picture; the next in order was the Schrödinger formalism of 1926, called the Schrödinger picture, which is the more popular. There are other pictures as well, including the Dirac picture. These are all equivalent formalisms: with the right unitary transformation, it is easy to switch from one to another. The Heisenberg formalism differs from the others in that in it the matrices of the Observables (operators) change over time, and a slightly different equation, called the Heisenberg equation, corresponds to this, while in the Schrödinger formalism the State Function of the system changes over time, and the Schrödinger wave equation, familiar to most physicists, corresponds to this. This certainly has some metaphysical overtones, which I will not write about now, but let us remember that Classical Mechanics can likewise use equivalent formalisms, such as Hamilton's or Newton's.
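The two evolution equations just contrasted can be written side by side in standard notation:

```latex
% Schrödinger picture: the state evolves, the observables are fixed
i\hbar \,\frac{\partial}{\partial t}\,\lvert \psi(t) \rangle
  \;=\; \hat{H}\,\lvert \psi(t) \rangle

% Heisenberg picture: the observables evolve, the state is fixed
\frac{d\hat{A}_H}{dt}
  \;=\; \frac{i}{\hbar}\,[\hat{H}, \hat{A}_H]
  \;+\; \left(\frac{\partial \hat{A}}{\partial t}\right)_{\!H}
```

Here $\hat{H}$ is the Hamiltonian, $\lvert \psi \rangle$ the State Function, $\hat{A}_H$ an Observable in the Heisenberg picture, and $[\cdot,\cdot]$ the commutator; the unitary transformation $\hat{A}_H = \hat{U}^{\dagger} \hat{A} \hat{U}$ with $\hat{U} = e^{-i\hat{H}t/\hbar}$ (for a time-independent Hamiltonian) carries one picture into the other, so all measurable predictions coincide.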

In the following years of the 20th century, a number of other theories were developed, but all were based on the idea of Quantum. They arose directly from the ground of Quantum Mechanics: Quantum Field Theory (QFT), and from it Quantum Electrodynamics (QED), Quantum Chromodynamics (QCD) and others. Prominent contributors to these theories include Feynman, Dirac, Pauli, Wigner, and others. This book does not claim to be a textbook of physics, which is why I describe this history of 20th-century physics so briefly. However, it is worth recalling that back in the early years of the 20th century, the German mathematician Emmy Noether managed to create and develop the foundations of an important branch of mathematical physics, without which it is difficult to imagine modern physics today, and which can be encapsulated in the words Symmetries and Conservation Principles. The formalisms of Quantum plus the applied laws of symmetry form the so-called Standard Model (MS), which is the current Paradigm of modern physics.

In the final years of the 20th century, still other physical theories were developed, intended to Unify all known forces, in particular gravity with the forces already unified by the MS (Standard Model). String Theory was created, and later its generalization, M-Theory. However, all of them are based on the concept of Quantum, and all of them differ fundamentally from Classical physics; they all represent a new quality. Very importantly, practically none of them is subject to falsification. They practically cannot be verified empirically, because the energies that physicists would have to have at their disposal to do so are gigantic, beyond the reach not only of all research centers but of humanity in general. Thus it has come to pass that so-called Physicalism, the current in physics whereby Scientists recognize as scientific facts only those concepts, objects and theories that can be empirically experienced, has come under serious strain in the late 20th century.