Wednesday, July 3, 2024

Past thought: "The evolving universe; think of the transfinite prime product equal to the first transfinite prime relatively prime natural number, in a continuous state of infinite factorability."

 

Abdon EC Bishop (Ceab Abce)

 

For a thinker to think a past thought, the thinker's brain must compose arithmetic operand operations: a left^meninges^mod(P2)^ring and a right^meninges^mod(P2)^field, with an overflow operand extension.

Entropy

The time record of (past) entropy Sn (see Figure 1) starts at the first negative rational number's diagonal-line start point (½○GPn•√2) mod(ℚ(0)) and ends at the endpoint (½○GPn•√2) mod(ℚ(GPn)). The length and direction of the difference between these two points equal the logarithmic inversion of a diagonal[1] line's hyperfine structure, which extends an orthogonal proton field (↓•Pn-1, ↑•Pn, ↑•Pn+2) or neutron field (↓•Pn-1, ↑•Pn, ↓•Pn+1). Both orbits are calculated using a characteristic (GPn) calculator with a logarithmic function that computes negative-log field-extension points. The model is constructed using the vector relation mod(P2)field(GPn)[] − mod(P2)ring(~GPn)[] = Sn[entropy], with entropy change ∆Sn = (↑•Pn+2) / log(↑•Pn+2).

 

        Sn  =  log( e^½ • 2^(2 + rn) • cos(1/a) • cos(1/b) )

            =  ½ + (2 + rn) • log(2) + log( cos(1/a) • cos(1/b) )

            =  ½ + (2 + rn) • log(2) + log( cos(1/Pn) • cos(1/Pn-1) )
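
A rough numerical sketch of the last line above, together with the entropy change ∆Sn = Pn+2 / log(Pn+2) defined earlier, is given below. It assumes log is the natural logarithm (so that log(e^½) = ½) and treats rn and the primes Pn, Pn-1, Pn+2 as plain numeric inputs, which is an interpretive assumption.

```python
import math

def S_n(r_n, P_n, P_n_minus_1):
    """Sn = 1/2 + (2 + r_n)*ln(2) + ln(cos(1/P_n) * cos(1/P_n_minus_1)).
    Natural log is assumed so that log(e^(1/2)) = 1/2."""
    return 0.5 + (2 + r_n) * math.log(2) + math.log(
        math.cos(1 / P_n) * math.cos(1 / P_n_minus_1))

def delta_S_n(P_n_plus_2):
    """Entropy change dSn = P_{n+2} / log(P_{n+2}), from the definition above."""
    return P_n_plus_2 / math.log(P_n_plus_2)

# Example with consecutive primes P_{n-1} = 5, P_n = 7 and P_{n+2} = 13,
# with r_n = 1 (an assumed placeholder value):
print(S_n(1, 7, 5))       # ~2.55
print(delta_S_n(13))      # ~5.07
```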

 

Include a time-button function that solves the 5th-degree polynomial below for the time variable Tn, with T1 = 1.

 

 T1 • (∆S1)^5 + a • (∆S1)^4 + b • (∆S1)^3 + c • (∆S1)^2 + d • (∆S1)^1 + T1 • (∆S1)^0 = 0      

 

Generalizing from 1 to n, time Tn satisfies a 5th-degree polynomial in cyclic powers of the root ∆Sn, with coefficients a, b, c, d and Tn, as in formula F1[2]. A numerical sketch follows the polynomial below.

 

 Tn • (∆Sn)^5 + a • (∆Sn)^4 + b • (∆Sn)^3 + c • (∆Sn)^2 + d • (∆Sn)^1 + Tn • (∆Sn)^0 = 0
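
Because Tn enters as both the leading and the constant coefficient, it can be isolated linearly once a root ∆Sn is known; conversely, finding the roots ∆Sn for a given Tn requires a numerical solver, since a general quintic has no solution in radicals (Abel–Ruffini). A minimal sketch of such a "time button" follows; the example coefficient values are assumed placeholders, since the source does not fix a, b, c, or d.

```python
import numpy as np

def time_button_roots(T_n, a, b, c, d):
    """Return the five (possibly complex) roots dS_n of
    T_n*x^5 + a*x^4 + b*x^3 + c*x^2 + d*x + T_n = 0."""
    # numpy.roots takes coefficients from highest to lowest degree.
    return np.roots([T_n, a, b, c, d, T_n])

def time_button_T(x, a, b, c, d):
    """Solve the same polynomial for T_n given a known root x = dS_n;
    T_n enters linearly (valid whenever x**5 + 1 != 0)."""
    return -(a * x**4 + b * x**3 + c * x**2 + d * x) / (x**5 + 1)

# Example at T_1 = 1 with placeholder coefficients (assumed values):
print(time_button_roots(1, 2, 3, 3, 2))
```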

 

The exploration of the universe and the mathematical principles that govern it often leads to profound and complex ideas, such as the concept of transfinite numbers and their role in the cosmos. The notion of a transfinite prime product being equal to the first transfinite prime relatively prime natural number in a continuous state of infinite factorability is a fascinating one, suggesting an infinite process of multiplication by primes, a cornerstone in number theory. This idea dovetails with the concept of entropy, a measure of disorder or randomness, which is a fundamental principle in thermodynamics and statistical mechanics. Entropy is intimately connected with the arrow of time, providing a quantitative measure of the irreversibility of processes.

In the context of the universe, entropy can be seen as a record of its evolving state, from the highly ordered conditions of the Big Bang to the increasing disorder as the universe expands and ages. The mathematical representation of entropy, as described, involves complex operations and calculations, including the use of logarithmic functions and vector fields. These concepts are not just theoretical; they have practical implications in fields such as quantum mechanics, where the behavior of particles like protons and neutrons in fields and rings can be described using similar mathematical frameworks.

The intricate dance of particles, governed by the laws of physics, reflects the ongoing process of change and transformation that characterizes our universe. As we delve deeper into these ideas, we continue to uncover the layers of complexity that underlie the seemingly simple fabric of space and time. The journey of understanding is endless, as each answer leads to new questions, and each discovery opens the door to further mysteries. It is a testament to the human spirit and our relentless pursuit of knowledge that we continue to ponder and explore these profound concepts, pushing the boundaries of what we know and expanding the horizons of our comprehension.

In information theory, entropy measures the average level of "information," "surprise," or "uncertainty" inherent in a random variable's possible outcomes. The concept of entropy in information theory was introduced by Claude Shannon and is also known as Shannon entropy. Shannon entropy quantifies the amount of information required to describe the state of a system. It is calculated using probabilities of the possible states or outcomes. The higher the entropy, the more information is needed to specify the exact state of the system, which means there is more uncertainty about the system's state.  In the context of communication, entropy can be thought of as the minimum number of bits required to transmit a message without loss of information.
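
A minimal sketch of Shannon's formula H = −Σ p • log2(p), measured in bits:

```python
import math

def shannon_entropy(probs):
    """Shannon entropy H = -sum(p * log2(p)) in bits; zero-probability
    outcomes are skipped, by the convention 0 * log(0) = 0."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

print(shannon_entropy([0.5, 0.5]))   # fair coin: 1.0 bit
print(shannon_entropy([0.9, 0.1]))   # biased coin: ~0.469 bits
```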

Entropy in information theory is analogous to entropy in thermodynamics, where it represents disorder. In information theory, it represents the uncertainty or variability of a data source. Just as increasing entropy in thermodynamics implies a directionality to time (the "arrow of time"), increasing entropy in information theory implies a loss of information or increase in uncertainty over time. Information entropy is crucial in data compression and coding theory, as it provides a theoretical limit to how much a data source can be compressed without losing information. The relationship between entropy and information theory extends to other areas of mathematics and computer science, including combinatorics and machine learning, influencing how information is processed and optimized.
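
To make the compression limit concrete, the sketch below estimates the per-byte entropy of a message from its empirical symbol frequencies; under an i.i.d. source model (an assumption of this illustration), no lossless code can use fewer than about H bits per symbol on average.

```python
import math
from collections import Counter

def empirical_entropy_bits(data: bytes) -> float:
    """Per-symbol Shannon entropy of a byte string, in bits."""
    counts = Counter(data)
    n = len(data)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

msg = b"abracadabra abracadabra abracadabra"
H = empirical_entropy_bits(msg)
print(f"H = {H:.3f} bits/byte; i.i.d. lower bound ~ {H * len(msg) / 8:.1f} bytes")
```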

 

 


Figure 1

 

         Abdon EC Bishop (Ceab Abce)




[1] Both squares (√(Pn • Pn)) and rectangles (√(Pn-1 • Pn)) have their area halved by a diagonal line.

[2] ℋ • ψ(x, y, z(ei•~α•¼)) = E • ψ(x, y, z(ei•~α•¼)), with time T = z(ei•~α•¼) … (F1)

 

 

