Sunday, May 1, 2016

Planckons and elsymons. Part 2


Contents    
1. The potential energy of a system of two planckons. Mathematical model. Potential energy niches - conclusions concerning planckons’ capacity to create structures.
2. How to build elementary particles? The basic repetitive elements of the structure of particles.
3. What is dark matter? Panelsymon – the Universe at the start. Urela and the phase transition. Dark matter and the density parameter. Why are the masses of galaxies basically similar to each other? How the islands of future galaxies were formed.
Reflections.

1. The potential energy of a system of two planckons
     One can ask: What constitutes an "encouraging" factor for planckons to form aggregates - particles? Answer: there should be a niche of potential energy. So let us consider how the potential energy of a system of two planckons changes with distance. Of course, we have to take into account the system's mass defect, which occurs when the distance between the planckons diminishes. Here it should be noted that such a niche cannot exist if gravity means only attraction. According to the currently prevailing (even obligatory) view, gravity means attraction - repulsion is out of the question. In such a situation it is difficult to find motivation for undertaking research on the structure of elementary particles, as if such studies were simply not possible. Nowadays this is a generally held opinion. The argument behind it is based on the uncertainty principle, which implies fluctuations that undermine the possibility of assessing the linear sizes of particles. For this reason it is difficult to talk about their structure (particularly in the case of leptons), so it is best to assume that they are point objects. But then how did all this structuring of matter come about - the chemical elements, atoms, subatomic particles...? Despite the expected reaction of many readers, the problem exists. It is in fact the basic problem. We have here the paradigm of observability conditioning the possibility of acquiring knowledge.
   Is such a niche really possible? Judging by the qualitative considerations in the previous article* – yes. Will the quantitative considerations confirm our suppositions? We'll see.
   On the basis of previous findings (formulas (8), (9) and (10) in the preceding article) we can write the formula for the potential energy of the system, taking into account the mass defect: 
As you can see, the resultant gravitational mass of the system has been divided by two, so as to obtain the mass per planckon. 
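One plausible explicit form - an assumption of this sketch, not the original notation, chosen so that it matches the numbers quoted further on (zero at half the Planck length, a minimum of -8/27 Mc² at 3/2 of the Planck length) - treats the mass per planckon as reduced by the defect and keeps the Newtonian form:

$$ m(r) = M\left(1 - \frac{L}{2r}\right), \qquad E(r) = -\,\frac{G\,m^{2}(r)}{r}, $$

where M is the planckon mass and L the Planck length. Note that in this form E(r) is never positive, which is precisely the shortcoming discussed below.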

   The matter, however, does not end here. It should be expected that for distances smaller than half the Planck length the potential energy is positive (being negative for larger distances) - judging by the discussion of a system of two material points (the more general case) - since repulsion is involved there. At the minimum distance**, the (positive) potential energy should have its maximum value. In the case of an isolated system of two planckons we would basically be dealing with cyclical movement, with vibrations. This brings to mind the oscillating Universe. The potential energy is equal to zero when, of course, the mutual distance is equal to half the Planck length, and in that case the gravitational mass of the system is equal to zero. This can be expressed symbolically as follows:

From formula (11) it transpires, however, that the potential energy should always be negative. This would be correct if gravity meant only attraction. But it turns out that this is not the case. In the formula for the energy we should therefore take this into account by adding a factor, defined as follows (just as we did in the first article***):
And here we come to the final form of the expression for the potential energy:
For the sake of convenience let's assume that: 

Let's also note that, on the basis of formula (10):
Thus we get the following formula for the potential energy:
 The function E(p) can be explored. I would suggest this to high school students (although today it's a lost cause). Below we see a sketch of the graph of the potential energy, expressed, of course, in Mc² units. Note the consistency of this result with the previous ones, in particular the zeroing of the system's mass when the distance between the planckons equals half the Planck length. 
Judging by the graph, we see (surprisingly?) a point of minimum potential energy, corresponding to a distance equal to 3/2 of the Planck length (-8/27 Mc²). This corresponds to an energy of 3.67·10^18 GeV. By the way, that is an energy two orders of magnitude higher than the energies considered by GUT (Grand Unified Theory). This minimum of potential energy facilitates the process of connecting planckons into systems, which include, one can assume, all particles (except photons and neutrinos). Yes, neutrinos, due to their unique characteristics, which differentiate them from other particles; I will explore them in the third part of the book. Also "at the crossroads", at the 1/2 point, a niche is formed (beyond it there is attraction, before it repulsion), and it is there that there is a place for planckon systems, with the proviso that the potential energy there is equal to zero. So this also corresponds to zero mass of the system. This is the place where photons are formed. By the way, let's note the perfect convergence of this graph with the graph illustrating the changes of the potential energy of two material points (article five). This was to be expected (taking into account that R = 2L).
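A minimal numerical sketch, assuming the explicit form suggested earlier and assuming that the sign-correcting factor simply flips the sign of the energy for separations below half the Planck length, reproduces these numbers; p denotes the separation in Planck lengths and the energy is expressed in Mc² units. It merely illustrates the values quoted above; it is not the original computation.

```python
# Minimal numerical sketch of E(p), assuming the reconstruction suggested above:
# E(p)/Mc^2 = -sign(1 - 1/(2p)) * (1 - 1/(2p))^2 / p,
# where p is the planckon separation in Planck lengths.
import numpy as np

def e(p):
    """Dimensionless potential energy E/(Mc^2) at separation p (Planck lengths)."""
    defect = 1.0 - 1.0 / (2.0 * p)           # relative mass-defect factor
    return -np.sign(defect) * defect**2 / p  # positive (repulsive) for p < 1/2

p = np.linspace(0.3, 10.0, 100_000)
E = e(p)
i = E.argmin()

print(f"value at the photon niche: E(1/2) = {float(e(0.5)):.6f}")
print(f"minimum: p = {p[i]:.3f},  E = {E[i]:.6f}   (-8/27 = {-8/27:.6f})")

planck_energy_gev = 1.22e19  # Planck energy, ~1.22*10^19 GeV (assumed conversion)
print(f"depth of the niche: {abs(E[i]) * planck_energy_gev:.2e} GeV")
```

Running it gives the minimum at p = 1.500 with E = -0.296 Mc² and a niche depth of about 3.6·10^18 GeV, of the order quoted above.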
     As it transpires, we have not found room here for neutrinos. It is a sign that, despite everything, our insight is only partial. But I think we have moved a little forward, because there are more options whose exclusion will bring us closer to the solution. By the way, it could mean that neutrinos are simply complex systems (let's remember that our "theoretical" deliberations apply only to systems made of two planckons). One cannot explore neutrinos without paying attention to their very special properties. This could (already) imply that they did not have to come into being in some potential energy hole, but in some other way. As I have already noted, this is described elsewhere. Besides, according to one of the hypotheses, they separated even before the phase transformation occurred.
   So let's continue our interpretation. The value of the potential energy which we obtained at the minimum, -8/27, makes one wonder, as it happens to be the third power of -2/3. Let's also note that this value occurs at the abscissa 3/2. This also makes one wonder, if only because the product of these two numbers gives unity (the minus sign expresses attraction), which, among other things, means certainty. This actually corresponds to the principle of "completeness", according to which reality in its objective form is an ideal towards which all its approximate descriptions strive - descriptions which, under the name of theories, aspire to describe Nature accurately. Among other things, this is exactly why scientists look for aesthetics in their equations - it's a kind of unwritten criterion in the search for truth. They seek solutions in the form of whole numbers or simple fractions, or else special numbers such as the number π. Particular attention should also be paid to the number φ = 1.61803..., called the golden ratio, which is the ratio of the lengths of the two parts of the golden section, as well as the limit to which the ratio of consecutive numbers in the Fibonacci sequence tends. We will return to it in a moment. This is by no means about some kabbalah fun. Although who knows, maybe kabbalism is based, among other things, on qualities of Nature (as yet) unfathomable to us, described symbolically in the Torah (?).
   Let's note that the number 2 represents a fundamental alternative; it expresses the existence of two mutually exclusive options (up-down, right-left, true-false, attraction-repulsion, etc.). In this context the number 3 defines the number of elements of a basic optional set: plus, minus, zero; in addition, it indicates the symmetry of the world - the zero symbol separates "equally" two opposites, the combination of which also makes zero. 

This is a common thing in nature. Take, for example, the absolute quantitative equality of positive and negative charges. It could mean that the (electrically) charged beings arose from the dissociation of some primary creation. This in itself indirectly indicates the existence of structuring (and therefore atomisticity) somewhere deep, below the threshold of observability. A place providing an opportunity for planckons (and future doctoral students) to show off their capabilities.
In this context, the third power of the number -2/3 is perhaps (perhaps? rather quite naturally) an expression of the three-dimensionality of space. I already noted this in the fifth article. 

If we were some two-dimensional creatures, instead of -8/27 we would get 4/9... And if we were four-dimensional? Perhaps in the formula we would then obtain, the index would be 4 and the resulting fraction would amount to 16/81. That is a product of imagination, which may well be erroneous, especially since my head sometimes brims with jokes. By the way, the energy would then be positive, which would mean a maximum, not a niche. The particles would not exist, at least not at this point. Even if it's a joke.
After all, volume is determined by the third power of length. Three-dimensionality. It is worthy of consideration, although it does not converge with the findings of superstring theory. Apparently, to reach micro-reality from above (being bound, moreover, by the paradigm of observability), we have to equip ourselves with some additional dimensions. However, from the bottom, at the source, everything looks simpler. Perhaps at the source the space created by the multitude of planckons is three-dimensional, while the assumed multi-dimensionality is associated with the complexity of the systems forming reality, and constitutes a condition enabling the description of this reality by the means available to our perception.

In our perception there are three types of interactions (the weak interaction, concerning neutrinos, lies outside the limits of our perception - to understand why, it may be worth reading the essay devoted to neutrinos). If we ascribe to each of these three interactions the three dimensions necessary for a full understanding of dynamic systems, we get nine dimensions altogether. Remember superstrings? Well, such are my loose associations.
Moreover, if all the interactions ultimately boil down to gravity (as a result of the complexity of structures), it is no wonder that getting to the elementary structure while unaware of this (fact?) requires additional procedures that extend the data space by additional "dimensions" (and greatly complicate mathematical modelling). So one might think; but the internal features of the planckon itself still remain a puzzle, and they suggest that to describe them we would have to broaden our spatial "imagination" even further.****
   To complete this issue: in formula (13), by replacing the mass with the expression defining it (formula (2) in the preceding article), we obtain yet another formula for the potential energy:
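On the assumption (mine, not stated here) that formula (2) of the preceding article defines the planckon mass as the Planck mass, the substitution is immediate, since then GM² = ħc:

$$ M=\sqrt{\frac{\hbar c}{G}} \;\Rightarrow\; G M^{2}=\hbar c, \qquad E(r)=-\,\frac{\hbar c}{r}\left(1-\frac{L}{2r}\right)^{2} $$

(up to the sign factor discussed above). This is only one reading of the step; the original notation may differ.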
There are (at least in my mind) various elsymons; the number of possibilities is enormous, almost infinite. However, not every system has the features of durability; not every one of them can be a particle with a bearable lifetime. It is not my purpose (and there is no time for it) to seek formal rules of selection and exclusion enabling the construction (on paper) of creations with the characteristics of specific particles, although it is not an impossible task. And if someone embarks on it, it will turn out that the currently binding Standard Model is a special case, or actually a "local" regularity - after taking into account dual gravity and, of course, everything that is brought by the concept of the absolutely elementary being. It is even known which direction to follow. We'll talk about it further on. I hope that I will be able to leave the continuation of this exploration (a fascinating research topic) to the young and willing. [Does this sound like the dream of someone affected by Asperger syndrome?]

   In this context, the existence of "instability" (particle decays) is in itself an interesting subject, and the question "What is its cause?" is not at all trivial. After all, "gravity is only and exclusively a bonding factor." The durability of a system is associated with the concept of equilibrium. A given system is in stable equilibrium when its state is characterized by a minimum of potential energy. For example, a body located at the bottom of a pit is in a state of stable equilibrium. If gravity were only attraction, there would be no problem... and we wouldn't exist. The pit would actually be a one-sided, endlessly deep chasm. Everything would fall into it and nothing would be left behind (another matter whence it would have come, in order to fall afterwards). It's hard to speak here of balance and stability. Somehow nobody talks about it. The problem ceases to exist if we do not talk about it. And if someone speaks out, well, that's his problem...

Fascinating: how could Everything have arisen in such a situation and, moreover, expand? Scientists have found a way. There is a way for everything. Thus, in spite of all, we exist, with all due respect for the graciously ruling singular black-hole bureaucracy armed with vacuum power.

     
So let's ask: What is it that nevertheless makes the existence of stable systems possible? This mysterious factor that on the one hand does not allow unlimited collapse, and on the other hand causes systems to decay. It is certainly a kind of repulsion. What could be its source? Certainly not electrostatics. As we know, particle decays always involve the participation of neutrinos, which do not interact electromagnetically. We have one more reason to ponder.
     Thus the existence of repulsion changes the situation. Not only that. On the one hand we realized that there were certain problems and even internal contradictions arising from the traditional view of gravity, and on the other hand these problems have been (ideologically) resolved. But that's not all. Let's go back to our planckons, which permeate one another. As previously noted, there are two niches of potential energy enabling the creation of systems-particles. In particular, where the distance between the centres of the planckons is equal to half the Planck length, there is the niche of zero mass, the photon niche. If the distance is even smaller, the mass of the system becomes negative, which manifests itself as a force of repulsion. As the mutual distance between the planckons becomes smaller, the numerical value of the (positive) potential energy rapidly increases. Thus the system of planckons resembles a spring that is certainly not at rest - a vibrating spring, in fact, as we will find out further on. This also brings to mind the Big Bang, and along with that, whatever preceded it. Gravitational repulsion, simultaneously over the entire volume.
2. How to build elementary particles?
     We already have quite a solid "ideological" base for thought, maybe even for preliminary determinations concerning the criteria for the construction of particles-elsymons. Let me gather these thoughts, scattered all over the text (this one and the preceding ones), into one whole.
      While considering the potential energy of a system of two planckons, we came to a conclusion pointing to the existence of two minima, two places best suited for planckons to connect with each other. It's quite an important criterion, but the graph of potential energy which we obtained relates to a system of two planckons, while actual particles are probably quite complex combinations of such systems. And what kind of arrangements are they? They should be durable, that is, when left alone they should not disintegrate. They should therefore be of a cyclical character. I explained the matter in the three-starred reference in the previous article. That is, a particle is a cyclic arrangement. It is a complex system in which continuous (and ceaseless) changes take place, but these changes are cyclical. It's about vibrations. The planckons forming a particle, and also, independently, their systems (sub-systems of the whole), vibrate. These sub-systems are coupled together. It can be assumed that the distribution of vibrations of the whole entity is of a Fourier character. The vibrations are "tailored" to each other. The system should be stable. Instability causes rapid rupture of the system. Of course, in the case of a system of planckons, there is no electromagnetic radiation during these changes. Not only because the changes are cyclical, but also because the photon itself is just such a system.
     But that's not all. An additional criterion concerning the structure of particles is based on the (admittedly quite surprising) hypothesis of the existence of saturation of the gravitational field (I wrote about it in the preceding article). The concept assuming the existence of saturation of the gravitational field has so far not been taken into consideration, because it is not consistent with mainstream research and, obviously, with the current set of beliefs. The impetus for thought in this new direction (apart from very old, student-era speculations) is the fact that not everything which constitutes current knowledge fits like a glove, makes a monolith. After all, we know that the present state of knowledge no longer anticipates anything new, while astronomical observations generally surprise, and scientists have to adapt to them by the (not necessarily justified) creation of new entities. A distinct example of this approach is the so-called dark energy (even awarded a Nobel Prize). Elsewhere I presented a (quantitative) model explaining the supernovae effect which gave rise to the invention of dark energy - a model, consistent with observations, anticipating the magnitude of dimming depending on distance. This creation of new beings left and right by today's science is quite symptomatic (along with the strong opposition that these words arouse amongst many members of the community of physicists; and I, though also a physicist, will soon be removed from this fraternity - to my joy, because it will be a sign that they have read what I've written). But this (creation of new entities) is a fact, even if today's geniuses make fun of phlogiston.
     The new concept should, of course, be checked, if only because it shatters the current order of things. Check so as to reject. Check because... it constitutes a considerable contribution to possible heuristics. It directly opens a cornucopia of research topics, among them the traditional research problems, including those still regarded as insoluble, as well as research topics far beyond the horizon of expectations of today's science and cognitive awareness. The implications may be numerous, significant to the extent that excluding, without serious study and research, the considerations based on the model presented in my works would not be too wise. Therefore, all this should be rejected in advance and, of course, wrapped in silence... And that is what is actually happening.
     In the preceding article, so as to illustrate things, I used a (kind of infantile) model of hands, whose limited number means the existence of saturation. I stated there that the matter is worth describing primarily in relation to the truly elementary planckon systems. It can be said that nature is minimalist. A planckon therefore has the least number of hands by which it can, first of all, connect with other planckons and thus create the simplest elementary spatial arrangement, and secondly, this simplest system is still a source of the gravitational field, enabling it to create more complex systems. Thus, a planckon has four hands. Thanks to them, an elementary four-walled system comes into being (a regular tetrahedron), also having four hands. Why exactly four? Four points (four planckons forming a regular tetrahedron) determine a uniform three-dimensional space (three-dimensional space is determined by four points - the vertices of a tetrahedron - just as three points make up a plane, and two a straight line). Naturally, the system is not static. Planckons vibrate. One can initially ask: How? Do all four approach each other and then move apart in concurrent phases, causing cyclical changes in the size of the tetrahedron? Or maybe they oscillate pair by pair in opposite phases? Here the description would be more complicated. I encourage you to think and to search for the possibilities by which the system remains stable. This is of course an elementary system, which is linked with other identical systems, also vibrating. It is therefore about the stability of a system which can be made of many of these interconnected tetrahedra. In the case of a single tetrahedron, in connection with its elementary character, there is no question of higher harmonics.
   Planckons can also create a dodecahedron*****. Its walls are regular pentagons. It has 20 vertices, which means that it is composed of twenty planckons. Does it, therefore, have twenty outreaching hands? That might follow, but not immediately. "The sides of the regular pentagons may be of the same length as the sides of a tetrahedron, but here there are longer diagonals (smaller mass deficit - greater mass). Only the tetrahedron is devoid of diagonals." However, the quoted sentence does not take into account the existence of saturation of the gravitational field - planckons on the two ends of a diagonal do not feel each other's existence. In our model we would need longer hands, but they do not exist, since all hands have the same length. All free hands reach outwards. It is worth a thought, even if it is a very childish model, since it serves as a model of quantum gravity. And if we remember Gauss's law, we will immediately conclude that there, inside, there is no gravitational field. Inside this "ball" gravity does not exist.
   Apart from this, the dodecahedral system should be given priority, if only because of the relationship of the regular pentagon with the golden ratio, which manifests itself as a natural feature of a huge number of physical systems. Particularly noteworthy here are living organisms, in which the proportions of body construction are based on the golden ratio, and the question "Why do we (not just us) have five fingers?" is in this context not at all trivial (unless we are the heroes of a cartoon film). Judging by all this, it can be expected that the gravitational mass of a dodecahedral form is five times greater than the mass of a tetrahedral form (20 vs. 4). Is that a far-reaching simplification? You can check it out if you are not as lazy as I am.
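As a side illustration of the two claims just made - that among these solids only the tetrahedron has no diagonals, while the dodecahedron does, and that the dodecahedron's geometry is tied to the golden ratio - here is a small geometric check using the standard textbook coordinates of both solids. It is only a toy calculation; nothing in it is specific to planckons.

```python
# Toy check: the tetrahedron has a single vertex-to-vertex distance (no diagonals),
# while the dodecahedron has vertex pairs farther apart than its edge length.
# Standard textbook coordinates; the golden ratio phi fixes the dodecahedron.
from itertools import combinations, product
from math import dist, sqrt

phi = (1 + sqrt(5)) / 2  # the golden ratio

tetrahedron = [(1, 1, 1), (1, -1, -1), (-1, 1, -1), (-1, -1, 1)]

dodecahedron = (
    [(x, y, z) for x, y, z in product((-1, 1), repeat=3)]
    + [(0, s1 / phi, s2 * phi) for s1 in (-1, 1) for s2 in (-1, 1)]
    + [(s1 / phi, s2 * phi, 0) for s1 in (-1, 1) for s2 in (-1, 1)]
    + [(s1 * phi, 0, s2 / phi) for s1 in (-1, 1) for s2 in (-1, 1)]
)

def distinct_distances(vertices):
    """Sorted set of distinct vertex-to-vertex distances (shortest = edge)."""
    return sorted({round(dist(a, b), 6) for a, b in combinations(vertices, 2)})

for name, verts in [("tetrahedron", tetrahedron), ("dodecahedron", dodecahedron)]:
    d = distinct_distances(verts)
    print(f"{name}: {len(verts)} vertices, edge = {d[0]}, "
          f"{len(d) - 1} longer (diagonal) distances")

print("vertex-count ratio (dodecahedron/tetrahedron):", 20 / 4)
```

The tetrahedron reports a single distance (each of its four planckons is "within reach" of every other), the dodecahedron reports several longer ones, and the 20:4 ratio is the factor of five mentioned above.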
     And maybe these two forms separately make up two different types of particles: leptons and hadrons? The proton would be, for example, a system made only of dodecahedral cells (so the quarks as well?), while the electron would be a system made solely of tetrahedra. Such clean connections would be very stable and impossible to break by the means at our disposal. Let's recall the magnitude of the minimum potential energy in the system of two planckons: it amounts to 3.67·10^18 GeV. And in a tetrahedron the binding energy is not much smaller. Indeed, we have only two absolutely stable particles. And what about the neutrino? If the neutrino does not decay either, then its construction is probably different. By the way, I think that the cause of all particle decays is the background neutrinos. If so, are they also dangerous to themselves? Somehow, until now, no one has observed neutrino decays, although the phenomenon of their oscillation has been discovered. But that is not decay into something else. Moreover, the neutrino itself should also be built from such tetrahedra. A tetrahedron will not break a tetrahedron. In the third part of the book I will devote a special series of articles to neutrinos.
     On the other hand, mixed systems are easier to break, especially by separating sub-systems in which dodecahedral and tetrahedral forms are connected to each other. A neutrino would do the job, judging by the previous suggestions. The binding energy between these two forms is weaker than between the planckons making up a given form, or between identical forms. It is rather a reasonable assumption. "Homogeneous" systems would be, in principle, impervious to neutrinos ["in principle", because the μ and τ particles (leptons) also disintegrate]. No neutrino can break them. There are only two inviolable systems: the electron and the proton. This is also an important tip. 
     And what about the neutrino? What does it do at the moment of breaking a particle? It is probably enough that by its intrusion, by its own field forcing resonance (what a fantasy), it disturbs the order of the particle's autonomous vibrations, and that causes its disintegration. In addition, some evidence suggests that the neutrino is gravitationally (not just apparently) repulsive. But this is not an explanation; it is at most a qualitative premise.
     If at the same time the neutrino does not fall apart (spontaneously on its own or by force), this would provide an argument that it is indeed responsible for the decay of other particles. Experiment seems to confirm this (the presence of neutrinos during the process of particle decay). It is also a sign that it has a unique feature. Is it repulsion? But how otherwise? How else would it perform its breaking role? Otherwise neutrinos would attract each other, unable to cause decay in their own environment. Perhaps the phenomenon of oscillation testifies to this. I have already drawn attention to this possibility in the article on dual gravity (the first article of this series, and the fifth overall). There I pointed out that two systems of negative gravitational mass attract each other. In the light of those comments and conclusions, the hypothesis that the mass of the neutrino is negative is not as crazy as one might think at first glance. In addition, it could shed some light on the conditions in which this particle came into being. Rather earlier than the other particles.
     Separate attention should be given to the systems forming the elementary charge, occurring both in leptons and in hadrons. These are also absolutely stable systems, permanently attached to one or the other form. It is interesting how this being is built. Is there a third form, or a one-of-a-kind (realized in exactly two ways) combination of the known forms, an absolutely permanent combination? Or maybe, (yet another) crazy idea, some structural polarity, something that can be associated with a pair of sex chromosomes, XX and XY, this time forming two poles. Continuing this exciting association, we note that there is an asymmetry - the feminine element on top. We will return to this modelling.
    Quarks are also noteworthy - what differentiates them? And what about their fractional charge? What structural conditions determine membership in one particular generation of the three (two quarks + two leptons)? And what about the decaying leptons (μ and τ)? What, in structural terms, is the excitation of an electron or of other particles? No, I do not have answers to these questions, nor to many others. Will mechanistic insight be able to deal with them? And how is it in fact? Well, untamed imagination. But in spite of all we have moved a little forward, which does not mean in the right direction. And if so, does it mean that I have to take care of everything?
     Returning to the vibrations occurring in planckon systems, especially in the context of the above considerations, we note that because of the existence of symmetry in both considered forms, the vibrations in both of them should be coordinated. It may be noted that these elementary geometrical beings are not barred from their neighbours. They can even, to some extent, penetrate each other and create more complex systems. The condition of their stability, and in fact of their existence, is that the vibrations, despite the complexity, must also be mutually coordinated. So non-cyclical systems cannot exist - they are absolutely unstable. This, with the increasing complexity of systems, reduces the number of possibilities. We are aware that the number of particles is not unlimited. The possibility of constructing the Standard Model of particles can be taken here as evidence. A lack of restrictions would have made it impossible.
     It is also significant that the resultant mass of these systems is much smaller than the mass of a single planckon. The masses of all the particles known to us do not differ very much from each other. I pointed this out in the previous article - they are comparable due to the existence of the stable, unbreakable forms (two of them). These masses are relatively small due to the significant gravitational saturation. Such are the particles known to us.
     Among them, photons stand out as completely saturated. It is also known that the number of positive and negative charges is exactly equal in the entire Universe. Why? Apparently they resulted from the dissociation of... what? Does the (today unknown to us) connection of the elementary structural forms of positive and negative charges create a photon? One cannot be quite sure, due to the diversity of photons in terms of energy. It would rather be a kind of addition to a repetitive form of varying complexity - like identical chains (maybe chain rings?) with different numbers of links, having defined, characteristic, repeatable tips, for example XX - see above. There will be more on this topic in the third part of the book, in an article entitled "Wave-corpuscular duality in the deterministic version". Today, what we know about photons is that they are bosons transferring electromagnetic interactions. It is worth some pondering, some wheeling and dealing. Dual gravity gives its blessing.
   Ahead of us, of course, there are still a lot of questions, a lot of research problems, maybe even more than before I went nuts. That's for sure. For example: What form, what planckon system, has the characteristics of what we perceive as an electric charge (something unique, something repeatable, something that occurs in the majority of particles)? What is the (structural, planckonic) nature of the magnetic field? After all, the magnetic field is the result of charge movement. How does the process of particle-antiparticle annihilation proceed? How do these entities actually differ in terms of structure? What system creates a strongly interacting form? What is the structure of quarks? Etc. A heap of questions. Is that bad? It's great. Too bad I'm too old to take care of all this. But there are still many young people, moreover more talented than I am in terms of workshop efficiency.
   The description presented above is of a qualitative character. Only suggestions. You will most likely, dear reader, call them fantasies. I do not recommend, however, rejecting them dismissively in advance, for those who endure reading further may be rewarded. In any case you have to admit that, so far, no one has tried to deal with the problem of the structure of particles. It is simply not considered. Does this mean that it should not be considered? That it is too early? Until now there was no foothold. And now? Rejecting the orthodox understanding of the theory of relativity and quantum mechanics...
     To sum up, we have affirmed:
a) the possible existence of the duality of gravity, based on a definition of the gravitational mass different from the currently accepted one;
b) as a fact, the existence of an absolutely elementary being;
c) the existence of structure, graininess, discreteness in the fabric of matter - we accepted it as a truth of nature. In this light, even if there are no planckons, which model here this absolutely elementary being, the attempt to describe the structure of particles is totally legitimate, despite the fact that so far no one has tackled the problem. It is simply a natural consequence of exploration, and one does not need to be a genius to say so. An appropriate awareness of how much remains to be learned, plus a bit of fantasy. This was only a qualitative description. I think it may indicate the direction for further, this time quantitative, inquiry. So we have a source of rather uncommon heuristics. That is what I think. I would add that quite a lot of things can be explained on the basis of the planckon model, without negating those characteristics of matter that have been confirmed by observations and experiments within our knowledge. At the same time, one can quite easily "materialize" the matter waves and, without beating about the bush, provide a model of corpuscular-wave duality in a simply mechanistic way. Matter remains the same matter. And what about dark matter? 
 3. What is dark matter?
   We noted above the existence of two niches of potential energy in which numerous and varied elsymons can be "hatched": separately photons, separately "normal" particles. This particular feature of the two-planckon system gives (with total certainty) an inducement to considerations concerning both particles (perceptible) on the micro scale and the Universe as a whole, especially at the very beginning of its expansion. Until now this was impossible. This "impossibility" is justified, moreover, by the tradition of quantum uncertainty: the beginning of everything is (in accordance with that tradition) necessarily hazy and impossible to clarify. Another thing is that in our thought experiment we brought (two) planckons closer to each other, whereas in the Universe, at the very beginning, at time zero, all planckons already formed one integrated, general system of elsymons - the panelsymon.
     This Universe "at the start" was, in structural terms, similar to a mono-crystal. I called it the Panelsymon. Its very rapid expansion, as a result of the gravitational repulsion of the constituent planckons, led to a gradual loosening of the bonds between the components and consequently, at some point, to their breaking and, as a result, to the dispersion of various elsymons. At that moment the phase transition took place. Chaos ensued. As the final result of the changes taking place, the material environment was created, which included photons (quantitatively dominant) and other massive particles, both known to us and unknown.
     There was also "debris": remnants of "retarded" elsymons and, especially, free planckons, which till then had served as connectors between the structures and which after the separation became autonomous particles. As we already know, planckon dimensions are smaller than their own gravitational horizon. Searching for them (visually) is therefore useless, although their total gravitational mass should actually be very large, given the huge mass of a single planckon in comparison with the masses of the particles we know.
     Their separation would be related to the very rapid expansion, exceeding the speed of light (Urela)******, and to the not fully coordinated scattering of matter after its termination, resulting from the aforementioned phase transformation. The conditions of chaos (including fractality) did the rest. In addition to the matter that evolved to make Us an observable fact (I should be put under special observation), there was also a lot of debris, a kind of waste tip at a building site.
     In the chaos caused by the phase transformation there appeared clusters of planckon matter. There, thanks to the extremely strong gravity, matter began to accumulate. In this matter stars started to form, and later in these places galaxies began to form. Now we know what dark matter is. It is simply the clusters of planckons. No wonder that dark matter does not radiate, that it is only a source of gravity. This is confirmed, willy-nilly, by studies conducted in recent years - the gravitational lensing of light coming from very distant galaxies. As you can see, this model of dark matter is the most consistent and logical one. And it does not require bringing to life new, occasional beings. Ockham grows a beard.
     Galaxies were formed in fractals, condensations of the planckons forming dark matter, while the planckons scattered in intergalactic space form a fairly homogeneous network filling the Universe. In these vast spaces the forces acting on a single body (say, a test body) compensate each other almost completely. So we have vacuum and, of course, vacuum energy. Is all this impossible? Is it impossible because it is too simple and does not require equations? Or impossible because no one of importance came to this idea, and it occurred to me? Why has it not occurred to them? Because, a trifle, one needed to define the gravitational mass a little differently and to accept as possible the existence of an absolutely elementary being (no matter whether the planckon, or something else).
     All these planckons, taken together, constitute a major contribution to the total mass of the Universe. They form, I repeat, the dark matter sought by scholars, whose visual detection, for by now obvious reasons, is not possible. Now, if we take into account the existence of a planckon network embracing the whole Universe, we have reason to believe that we can live happily without dark energy to ensure that the density parameter******* of the Universe is equal to unity. We'll come back to it. As for dark energy, in an essay under the telling title "Horizontal Catastrophe" I will lay it to rest and send it into eternal oblivion. Indeed, as I noted above, despite the enormous mass of the planckon network, the forces acting on each object in intergalactic space compensate each other almost entirely. Thus it is difficult to directly detect the potential contributions to the unit value of the density parameter.
     We'll come back to this topic¹ when we get to the cosmological issues.
    The process of scattering elsymons, in all the diversity of their types, gained, as mentioned above, the characteristics of chaos and fractality. This (taking into account the existence of extremely massive "dark" matter) would explain the observed heterogeneity in the distribution of visually detectable matter. In summary, we can say that the above-mentioned "common elsymon" (panelsymon), with the structural characteristics of a mono-crystal, burst at some point during the rapid expansion and "shattered" (like a wine glass hitting the floor). Chaos ensued, and by the same token Temperature came into being. Only then (!). And there also appeared the non-gravitational interactions, strong and electromagnetic, plus, of course, radiation (photons) and the massive particles, and gravity turned into a force of attraction. And there also came the density fluctuations responsible for the large-scale structure observable today. And how were the galaxies formed?
    I have often wondered about it: Why are the masses of galaxies of the same order of magnitude? Of course, not counting the small satellite galaxies. How did this happen? Around 200 million years after the Big Bang the stars began to appear. Matter was everywhere still sufficiently dense, so stars should have filled the whole space of the Universe. If this set had been absolutely homogeneous, matter would never have congregated in numerous centres. According to the cosmological principle, the Universe does not have a clearly specified centre.
      Fortunately, there were (thanks to chaos, i.e. the phase transformation) density fluctuations. However, ordinary fluctuations could cause only the concentration of matter into stars of any size, without any guarantee that they would form galaxies. Fluctuations caused rather the formation of the large-scale structures - galaxy clusters, "walls". And galaxies, as I pointed out above, have comparable masses. This cannot be explained by fluctuations. So here we have a legitimate question: What would limit the size (and mass) of the "islands" out of which galaxies were formed?
     And here the meritorious dark matter plays its part (together with the concept of mass defect based on the new definition of the gravitational mass). Fluctuations in the concentration of matter apply also to planckons (dark matter). The places of their greater density began to attract, pulling in nearby stars. Yes, but why are the masses of galaxies similar to each other? Well, the planckonic matter in those clusters kept gathering and densifying, so it created a clear centre. However, as the matter densified, the mass deficit of such a system increased. Finally, regardless of the number of planckons, the mass of the nucleus of such a grouping, at its deepest level, remained close to zero. As a result, the resultant gravitational masses, regardless of the number of planckons in a given "island", were almost equal.
     If there is no other explanation, we have here, by the way, an indirect confirmation of the whole concept of dual gravity. Why such a mass, and not another? It is one of the secrets of the Big Bang and, of course, of the features of the planckon itself. Knowing these features, it will be possible in the future to answer this question as well.
     Only planckons could densify in this manner. The nucleons could not possibly undergo such a process; atomic nuclei are incompressible. After all, the concrete objects that we identify as particles are systems of unbreakable forms (tetrahedral and dodecahedral) with infinitesimal masses (and with high relative speeds - temperature). It is hard to imagine them concentrating in the chaos, in the first moments of the Universe. In the beginning they were a relatively homogeneous element. The densification of massive stars and galactic nuclei into gravitationally closed creations is something that would happen only in a very distant future, at least several hundred million years later.
   We can already now try to describe the process of the formation of galaxies. After about 200 million years the temperature was low enough that the process of the formation of the first stars could begin. This process intensified with time. Matter was then still dense enough. Also the formation of the clusters of dark matter, constituting gravitational centres drawing in the matter of gas and stars, had to take some time. This had to last no less than a billion years, which is understandable after taking into account the dimensional scale of the systems (in comparison with the dimensions of the stars themselves). So at this stage we have swarms of stars of the first generation, attracted by the islands of dark matter. It is not surprising that the masses of the created groups of stars were of the same order of magnitude, even if they contained completely different numbers of planckons. And what about the stars themselves? The stars pulled toward the centre pressed on each other from all sides. This had to result in a thermonuclear hyper-explosion involving billions of stars. So we have quasars; we have the primary (and most abundant) source of metals. Not some supernovae - capricious and extremely rare. Let's note that the matter from which the solar system formed had already existed for at least six billion years. I described the creation of galaxies elsewhere, in an essay entitled "How were galaxies formed?" Everything is there. If someone wants to discuss the topics raised here, let him or her first get acquainted with the content of that article.

   Meanwhile, the number of hypotheses about the nature of the alleged dark matter keeps growing. An example of these are the hypothetical WIMPs (Weakly Interacting Massive Particles), which for obvious reasons do not participate in electromagnetic interactions. One more unnecessary being. Astronomers are searching for their tracks. What would they say about planckons and elsymons? They would say nothing...

Reflections
   ¹ Perhaps these planckons also form an invisible homogeneous network, which to some extent can be compared to the famous aether. Would that be its physical meaning? Speaking of this, one could give some thought to the essence of the speed of light. Is it only an "internal" matter of electromagnetic interactions? Most likely not. In the model of the Universe which I have introduced, in particular in the articles of Part 2 and, obviously, in my books published in 2010********, the speed of light is primarily the speed of expansion of the Universe, which is the upper limit of the set of relative speeds of objects, and its invariance results directly from the cosmological principle - there will still be a lot about this in later articles. According to the view that I express on other occasions, the existence of space is a direct result of this movement (and that is the reason why the space of the Universe is flat). I will explain this briefly in the next article.
   Therefore, because the Universe is isotropic on the global scale (according to the cosmological principle), and since the speed of light is the speed of its linear expansion, the speed c should be the same everywhere and in all directions, that is, not dependent on the choice of the reference system - or, to put it otherwise, absolute. The magnitude c is a parameter of the Universe, and not only the speed of propagation of electric and magnetic fields. No wonder that aether could not be detected in the Michelson-Morley experiment. Since c primarily determines the rate of expansion of the Universe, it cannot depend on the reference system, which is local by definition. Today everything, especially in the context of the opinions expressed above, appears simple. However, over a century ago, when the Universe was considered infinite and static, Einstein's postulate of the invariance of the parameter c made a revolution and paved the way for the physics of the twentieth century. The invariance of the speed of light is the basis of the special theory of relativity. In those former (?) times the invariance of the speed of light was not at all associated with the cosmological principle. And today? I guess it still is not. In those old days there were, in this regard, only Maxwell's equations and some mismatches in relation to classical mechanics based on Galilean relativity. And that, putting it simply, was actually the spur for the research (conducted by many scientists) which ultimately led to the special theory of relativity.
   And now, going back to aether (needless to say, rejected by Einstein), hereby (tentatively) reactivated (the name, that is), it can be expected that the propagation of light occurs as a wave of (longitudinal) densifications and rarefactions in the global planckon network. Fine, but how do we reconcile this with the fact that the electromagnetic wave is a transverse wave? Does this have anything to do with the speed of propagation? And what about wave-corpuscular duality (photons are strictly material creations)? One hypothesis chases another, and along the way the old questions remain valid, even if we have a few less of them. A lot of things still need some thought. So let us add something else. In view of the fact that the Universe expands, the distances between the individual planckons of our network get larger. It follows that the speed at which light propagates should gradually decrease. It is interesting that the same conclusion, or actually supposition, can be reached in another way, on the basis of cosmological considerations (more about this in subsequent articles). Astronomical observations (the study of quasar spectra) seem to indicate that the fine structure constant α = e²/ħc is increasing. As we can see, the observationally attested variability of this parameter may mean a variation of the universal constants. One of the options, in view of the increase in the fine structure constant, is that the speed of light diminishes. If the speed of light gradually decreases, the expansion of the Universe will end at a minimum (if not zero) value of the parameter c. Then there will be an inversion, and the Universe will begin to collapse. Judging by some calculations that I will quote later, this is to happen when the age of the Universe is of the order of 10^20 years. On this basis (and not on the density parameter Ω) rests the model of the oscillating Universe described in this book and in the first of the books mentioned below.
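For the record, the arithmetic behind that last inference (assuming e and ħ stay fixed) is simply

$$ \alpha=\frac{e^{2}}{\hbar c} \;\Rightarrow\; \frac{\delta\alpha}{\alpha}\approx-\,\frac{\delta c}{c}, $$

so a measured relative increase of the fine structure constant would, on this reading, correspond to an equal relative decrease of the speed of light.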

   Is this FN (Fantasy), or maybe rather FN (Fiction)? What about the aforementioned Higgs particles? They had better keep quiet. Maybe...

*) Part 1. Bereshit... (In the beginning…)
**) As it will turn out (in the next article), there is a minimum distance to which two planckons can approach each other. No closer (the Pauli exclusion principle?). So there is a maximum positive potential energy of their interaction. The graph does not reach higher (so it is not an asymptote). At this point the function reaches its absolute maximum. It is a notable thing, also in the philosophical context. Facts of nature, ontological truths, are not mathematical constructs, in spite of the judgements of those whose reflectiveness cannot go beyond a mathematical upper limit. I have already pointed this out in different contexts.
***) "The dual character of gravity"
****) Here is another example of such an enlargement. In 1919 Albert Einstein received a letter whose sender was a mathematician from Königsberg, Theodor Kaluza. In the work described in the letter, Kaluza presented the possibility of joining "in one equation" two fundamental interactions: gravity (the general theory of relativity) and electromagnetism (Maxwell's theory). The condition for this, as it turned out, was to introduce an additional, fourth spatial dimension. The letter surprised Einstein. Kaluza's article had to wait two years before it appeared, under the title "The problem of unity in physics". This delay did not matter, because Kaluza's idea found fertile soil only after fifty years of continuous development of science. Today we talk about a hyperspace having eleven dimensions (including time), in which the complete unification of all known interactions takes place. Superstring theory is the expression of this new approach; its development is M-theory. It is an encouraging option for the future. And I dare to turn back the course of history? Personally, I have no such intention, although I do not deny that history repeats itself.
*****) The other Platonic solids (the cube, the octahedron and the icosahedron) are not suitable for our purposes. It is easy to find out why.
******) Urela - ultra-relativistic acceleration - is a process of my invention: accelerated expansion at the very beginning of the Big Bang, a (much more consistent) alternative to "inflation", which in principle is accepted by the scientific community on a "better than nothing" basis, while awaiting something better. It is based on the mutual repulsion of the planckons forming, at the beginning, the squeezed-to-the-limit network of a "mono-crystal" - the panelsymon. Urela ended with the phase transformation at the moment when the gravitational mass of the system came to zero (and the connections between the nodes of the "crystal" broke). It is all in the text (this one and the next).
*******) Density parameter: the ratio of the average density of the Universe to its critical density. It is used primarily in cosmology based on the general theory of relativity (the Friedmann equation) as a criterion defining the development trend of the Universe. It is equal to unity if the geometry of the Universe is flat, that is, Euclidean - which is what is ascertained observationally. In truth, this is "flatness on a knife edge", or, if you will, balancing on a tightrope, yet inviolable during the whole history of the Universe (starting from the phase transformation). This is, in short, the so-called flatness problem, which characterized Friedmann's cosmology. There will be quite a lot on this subject. It will also turn out that, as a result of my inquiry, this problem disappears. The cosmology of the twentieth century took this problem away with the hypothesis of inflation, which I will also dispose of.
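In symbols (standard Friedmann cosmology, merely making the footnote's definition explicit):

$$ \Omega=\frac{\rho}{\rho_{\mathrm{crit}}}, \qquad \rho_{\mathrm{crit}}=\frac{3H^{2}}{8\pi G}, $$

where H is the Hubble parameter; Ω = 1 corresponds to flat, Euclidean spatial geometry.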
********) 1. Józef Gelbard, "Let's fantasize about the Universe I. Oscillating? That's not that simple." ISBN 978-83-62740-06-2
2. Józef Gelbard, "Let's fantasize about the Universe II. Into the depths of matter: gravity in sub-dimensions." ISBN 978-83-62740-13-0


Next article: Planckons and elsymons. Part 3
Some implications of the foregoing findings. The problem of force and speed; what is the absolutely minimal distance between planckons? The Pauli exclusion principle differently.
Multidimensionality. Planckons versus strings.
