PKGF-nature-analysis-en

PKGF: A Unified Geometric Framework for Deterministic and Non-deterministic Information Memory

Parallel Key Geometric Flow as a Universal Structural Memory for Intelligence Emergence

Author: Fumio Miyata
Date: April 8, 2026
DOI: 10.5281/zenodo.19477743
Repository: https://github.com/aikenkyu001/PKGF_nature_analysis
(All data, source code, and analysis resources used in this study are publicly available in the above repository.)


Abstract

Natural observational data typically comprise a spectrum ranging from purely deterministic dynamical systems (e.g., Lorenz systems, prime number sequences) to non-deterministic phenomena with incomplete physical models (e.g., sunspots, seismic activity, financial markets, heart rate variability). Traditional information processing models struggle to establish a “universal memory format” capable of unifying these disparate data types.

In this study, we reformulate Parallel Key Geometric Flow (PKGF) as a structural memory theory that remains agnostic to the deterministic or non-deterministic nature of the source data. By mapping raw signals into a high-dimensional geometric structural feature vector (ΦΦ), we embed all data into a shared geometric manifold. We present the axiomatic foundation, implementation definitions, and dynamical theorems of PKGF. Through extensive experiments involving 22 diverse natural and synthetic datasets, we demonstrate the efficacy of structural memory as a robust foundation for next-generation intelligence models.


1. Introduction

Data in the natural world are broadly categorized into deterministic systems, governed by explicit equations, and non-deterministic systems, where generative models are either absent or stochastically complex. While deterministic data are reproducible, non-deterministic data—characterized by a blend of noise and latent structure—pose significant challenges for conventional AI. Recent advances in Information Geometry (Amari, 2016; Nielsen & Barbaresco, 2023) and Geometric Deep Learning (Bronstein et al., 2017) have accelerated attempts to capture the “intrinsic shape” of data. Furthermore, multiscale analysis of scattered data (Avesani et al., 2024) and geometric approaches based on information measures on statistical manifolds (Nielsen, 2013) offer new perspectives for analyzing complex natural phenomena.

The primary objective of this research is to establish the PKGF theory as a universal structural memory format that preserves and flows the underlying “structure” of information, independent of its generative origin. By defining the geometric stage for PKGF and constructing a dynamical system centered on structural memory, we propose a novel foundation for artificial intelligence.


2. The Structural Memory Principle

2.1 Structure Mapping

PKGF maps raw data not into equations, but into a multidimensional structural feature vector ΦΦ:

graph TD
    subgraph "Input Data Space"
        A1["Deterministic Data<br/>Lorenz, Primes, etc."] 
        A2["Non-deterministic Data<br/>Sunspots, Bitcoin, etc."]
    end

    A1 & A2 --> B{"Structure Mapping Φ"}

    subgraph "Structural Memory Space: R^30"
        B --> C1["Fractal Geometry<br/>Hurst Exp, Fractal Dim"]
        B --> C2["Information Theory<br/>Entropy, Fisher Info"]
        B --> C3["Topology<br/>TDA, Betti Numbers"]
        B --> C4["Recurrence Structure<br/>RQA Metrics"]
        B --> C5["Global Geometry<br/>PCA, KNN Density"]
    end

    C1 & C2 & C3 & C4 & C5 --> D["Structural Feature Vector Φ"]
    D --> E["Structural Memory = Φ(Data)"]


Φ:DataRdΦ:Data→Rd

Structural Memory Space: R^30

Input Data Space

Figure 1: The mapping process to structural memory. Regardless of origin, data is embedded into a 30-dimensional geometric feature space, forming a memory as a “shape” independent of the information’s source.

Our implementation defines “structure” using 30 metrics, integrating insights from fractal geometry (Mandelbrot, 1982), long-term persistence analysis (Hurst, 1951), numerical stability in multiscale data (Avesani et al., 2024), and the Information Bottleneck theory (Tishby et al., 1999).

Table 1: Dimensions of Structural Memory ΦΦ (30-D)

CategoryMetricsDimGeometric/Informational Significance
FractalHurst, Fractal Dim, MF-width/sing/asym5Long-range dependence (Hurst, 1951), Multifractality (Kantelhardt, 2002)
InformationEntropy, Fisher Info, Variance3Complexity, density, and system energy
RecurrenceRQA (RR, DET)2Recurrence and degree of deterministic periodicity
Global ShapePCA (EV1-3, Global Dim 90%)4Global manifold geometry and intrinsic dimensionality
TopologyTDA (Betti 0-1, Life mean/max)4Topological invariants (Ghrist, 2008), Persistent Homology
Local StructureLocal Dim, KNN-dist (k=5, 10, 20)12Local neighborhood density and local dimensionality
Total30

2.2 Definition of Structural Memory

In PKGF, memory is defined as Memory=Φ(Data)Memory=Φ(Data), which is decoupled from the generative process. Consequently, prime number sequences and financial market fluctuations are treated equivalently as geometric “shapes” within the same manifold.

2.3 Structure Flow

The internal automorphism field KK of PKGF evolves based on an external connection ΩΩ derived from the structural memory:
ddtK=[Ω,K]dtdK=[Ω,K]
This flow equation extends Hamiltonian dynamics on Riemannian manifolds (Girolami & Calderhead, 2011) and the concept of geometric flows (Hamilton, 1982) into the information space. In this process, determinism holds no privileged status; the “geometry of structure” alone dictates the evolution.


3. The PKGF Axiomatic System

PKGF is built upon an axiomatic foundation (Amari, 2016) that governs the geometric reception and flow of information.

  • P1. Decomposition of the Tangent Bundle: The tangent bundle TMTM of the manifold MM is decomposed into four sub-bundles (sectors) corresponding to specific roles in information processing:
    TM=EinEmemEflowEoutTM=Ein​⊕Emem​⊕Eflow​⊕Eout
    Each sector handles input, memory retention, dynamic flow, and output, possessing independent geometric degrees of freedom.
  • P2. Internal Automorphism Field: As the subject of structural memory, there exists an automorphism field KΓ(End(TM))K∈Γ(End(TM)) on the tangent bundle. KK is referred to as the “Parallel Key,” representing the internal weighting and structural consistency of information.
  • P3. Gauge Group: There exists a local linear transformation group GΓ(GL(TM))G⊂Γ(GL(TM)) on the tangent bundle. The theory is locally invariant under GG (Cohen & Welling, 2016), ensuring universal processing independent of the observer’s coordinate system.
  • P4. External Connection & Curvature: A connection ∇ exists on the manifold to represent external information input, with associated curvature F=dω+ωωF=dω+ωω, where ωω is a gauge 1-form based on external data.
  • P5. Coupling Equation: The interaction between the internal field KK and the external connection ∇ is governed by the commutator relation:
    K=[Ω,K]K=[Ω,K]
    where ΩΩ is an adjoint tensor derived from the connection.
  • P6. Full Gauge Covariance: Under a local gauge transformation gGg∈G, physical quantities (,K∇,K) transform covariantly (Cohen & Welling, 2016), preserving the form of the coupling equation. This guarantees a universal logic structure independent of internal representation.
  • P7. Information Coupling Axiom: Observational data ΦΦ is transformed into the gauge 1-form ΩΩ via a mapping ψψ, driving the geometric flow:
    Ω=ψ(Φ(x),x)Ω=ψ(Φ(x),x)
    The determinism of ΦΦ does not affect the form of this geometric mapping.

4. PKGF Geometric Flow: Implementation Definitions

4.1 The Geometric Stage: 4-Sector Decomposition

The decomposition of the tangent bundle TMTM corresponds geometrically to the functional layers of the intelligence model.

graph LR
    subgraph "Tangent Bundle TM"
        direction TB
        Ein["Ein (Input Sector)<br/>Reception"]
        Emem["Emem (Memory Sector)<br/>Persistence"]
        Eflow["Eflow (Flow Sector)<br/>Dynamic Inference"]
        Eout["Eout (Output Sector)<br/>Observation"]
    end

    Ein -.->|"Info Coupling ψ"| Emem
    Emem ===|"Parallel Transport Invariance"| Eflow
    Eflow -->|"Projection"| Eout
    
    style Emem fill:#f9f,stroke:#333,stroke-width:2px
    style Eflow fill:#bbf,stroke:#333,stroke-width:2px

Figure 2: Direct sum decomposition of the tangent bundle based on Axiom P1. Each sector defines a functional layer—Input, Memory, Inference, and Output—as a geometric subspace.

Table 2: Sector Decomposition and Functional Correspondence

SectorSymbolFunctional RoleGeometric Interpretation
InputEinEinReception of external infoConnection/Projection with external fiber bundles
MemoryEmemEmemPersistence of structureMaintenance of parallel-transport invariant subspaces
FlowEflowEflowDynamic processing/InferenceEvolution along the Adjoint Orbit
OutputEoutEoutObservation and ActionProjection from the tangent space to behavioral space

4.2 Adjoint Holonomy Update

The time evolution of the Parallel Key KK is defined as an action on a Lie group using the exponential map. The update rule for a infinitesimal time step dtdt is:
K(t+dt)=H(dt)K(t)H(dt)1,H(dt)=exp(Ωdt)K(t+dt)=H(dt)K(t)H(dt)−1,H(dt)=exp(Ωdt)
This signifies movement along the Adjoint Orbit of KK within the Lie algebra gl(TM)gl(TM). From the perspective of Lie Group Thermodynamics (Barbaresco, 2019), this achieves information flow while maintaining logical consistency.”Dynamic Metric (g)””Parallel Key (K)””External Connection (Ω)””Structural Memory (Φ)””Dynamic Metric (g)””Parallel Key (K)””External Connection (Ω)””Structural Memory (Φ)””Theorem 5: Universality of Structural Memory””Theorem 4: Geometric Resonance [K, F] → 0″”Information Coupling via Map ψ””Adjoint Holonomy Update (HKH⁻¹)””Contextual Warping””Context-Dependent Adaptive Observation”

sequenceDiagram
    participant D as "Structural Memory (Φ)"
    participant O as "External Connection (Ω)"
    participant K as "Parallel Key (K)"
    participant G as "Dynamic Metric (g)"

    Note over D,G: "Theorem 5: Universality of Structural Memory"
    D->>O: "Information Coupling via Map ψ"
    O->>K: "Adjoint Holonomy Update (HKH⁻¹)"
    K->>G: "Contextual Warping"
    G->>D: "Context-Dependent Adaptive Observation"
    Note over K,G: "Theorem 4: Geometric Resonance [K, F] → 0"

Figure 3: PKGF dynamic update cycle. Interaction between the Parallel Key KK and metric gg realizes contextual geometric modulation while maintaining logical consistency.

4.3 Dynamic Metric and Contextual Warping

The manifold’s metric gg is dynamically modulated according to the context. A typical metric component giigii​ is expressed using a tanhtanh activation:
gii(x)=1.0+αtanh(γxcontext)gii​(x)=1.0+α⋅tanh(γxcontext​)
This induces a geometric “stretching” in regions of high informational significance, enabling adaptive processing based on information density.


5. Dynamical Theorems of PKGF

The following theorems characterize the properties of the PKGF dynamical system:

  • Theorem 1: Invariance of Logic
    ddtdet(K)=0dtd​det(K)=0
    Proof Sketch: From the update rule K=HKH1K′=HKH−1, we have det(K)=det(H)det(K)det(H)1=det(K)det(K′)=det(H)det(K)det(H)−1=det(K). This generalizes Perelman’s entropy formula (2002) for Ricci flow, implying that the “total logical volume” of the system is preserved during the flow.
  • Theorem 2: Spontaneous Symmetry Breaking
    When the internal tension Ω∥Ω∥ exceeds a critical threshold λcλc​, the equilibrium solution K0K0​ destabilizes, bifurcating into new attractors.
    Ω>λc    Bifurcation of K∥Ω∥>λc​⟹Bifurcation of K
    This provides the physical basis for the emergence of “concepts” or “intuition” from simple memory.
  • Theorem 3: Dimensional Resolution
    The convergence of the system is determined by the ratio between the embedding dimension DD of the structure and the manifold dimension nn:
    • D<n    D<n⟹ Non-stationary/Chaotic dynamics
    • Dn    Dn⟹ Stable convergence/Knowledge consolidation
  • Theorem 4: Geometric Resonance
    After sufficient flow (learning), the internal structure KK and the external curvature FF become commutative, minimizing energy dissipation:
    limt[K(t),F]=0t→∞lim​[K(t),F]=0
    This geometrically defines the state where the observer has completely “understood” the observed data.
  • Theorem 5: Universality of Structural Memory
    If two feature vectors Φ(D1)Φ(D1​) and Φ(D2)Φ(D2​) are identical, the behavior of the PKGF system will be identical, regardless of whether the original data D1,D2D1​,D2​ were generated by deterministic equations or true random noise.
    D1,D2:Φ(D1)=Φ(D2)    Flow(D1)=Flow(D2)D1​,D2​:Φ(D1​)=Φ(D2​)⟹Flow(D1​)=Flow(D2​)

5.4 Numerical Implementation and Practical Approximations

To ensure the real-time update of the Parallel Key Geometric Flow (dK/dt=[Ω,K]dK/dt=[Ω,K]), the structural mapping ΦΦ employs several numerical optimizations. These are intentional design choices to balance structural resolution with computational latency:

  1. Fractal Dimension Estimation: The box-counting dimension is estimated within a localized scale-invariant regime (s[2,32]s∈[2,32]). This fixed range focuses on high-frequency self-similarity relevant to the immediate tangent bundle geometry, rather than macroscopic global scaling.
  2. Signal Synthesis (fBm): The generation of logic primitives uses a Fast Fourier Transform (FFT) based approximation of Fractional Brownian Motion. While this method introduces minor boundary effects compared to exact Cholesky-based synthesis, it provides the O(NlogN)O(NlogN) performance required for iterative parameter optimization.
  3. MF-DFA Sub-sampling: Multi-fractal analysis (MF-DFA) utilizes 10 log-spaced temporal scales. This sub-sampling is optimized to capture the singularity spectrum’s width (ΔαΔα) and asymmetry without the overhead of exhaustive scale-space searches.

6. Experiments and Results

We conducted structural analysis and reconstruction on 22 diverse time-series datasets using the PKGF pipeline.

graph
    Raw["22 Raw Datasets<br/>*_raw.csv"] --> Prof["Profiling Engine"]
    Prof --> Json["Morphic Profiles<br/>*_morphic_profile.json"]
    
    Json --> Opt{"Nelder-Mead<br/>Parameter Optimization"}
    
    Opt --> Model["Logic Primitive Model"]
    Model --> Synth["Synthetic Data Generation<br/>synthetic_*_raw.csv"]
    
    Synth --> Val["Structural Matching Validation"]
    Val --> Map["Similarity Map / Character Map"]

Figure 4: Unified analysis and validation pipeline for the 22 datasets. From raw profiling to model optimization and synthetic verification.

6.1 Dataset Classification

  1. Astrophysics: Sunspots, Solar Flares, Cosmic Rays, Geomagnetic.
  2. Earth Sciences: Nile water levels, CO2 concentration, Sea Level, Ice Core, Treering.
  3. Life & Social Sciences: Heart Rate Variability (HRV), Bitcoin, Network traffic.
  4. Mathematics & Dynamics: Prime numbers, Prime Gaps, Lorenz system (Chaos), Seismic activity.

6.2 Primitive Matching Accuracy

Using the Fortran-based core, we optimized parameters via the Nelder-Mead method to reconstruct the structural profiles. Table 3 shows the complete results for all 22 datasets.

Table 3: Matching Accuracy of Geometric Indicators across 22 Datasets

CategoryDatasetOriginal HHSynthetic HHOriginal DDSynthetic DDσσ (Energy)
AstrophysicsSunspot1.0000.8430.9520.9911.06
Flare0.9990.8441.0001.0001.04
Geomagnetic1.0000.8761.0000.9811.05
Cosmic0.9780.8350.8731.0001.01
Star0.9970.8971.0001.0001.03
Earth Sci.Nile0.6180.6240.8980.9611.01
CO21.0000.8471.0000.9911.01
Sea Level0.9960.8151.0000.9720.99
Ice Core0.9990.9191.0000.9911.03
Treering0.9990.8961.0001.0001.00
Atmospheric Noise0.9980.9161.0001.0000.98
Geyser0.9970.8891.0000.9321.01
Hydrothermal0.9980.8591.0001.0001.00
Life/SocialBitcoin0.9770.8410.9621.0001.02
HRV0.9980.8071.0001.0001.01
Network0.9990.8991.0001.0001.02
Math/Dyn.Prime1.0000.7811.0000.9521.00
Prime Gaps0.9960.8181.0001.0001.04
Lorenz0.9990.9381.0001.0001.00
Seismic0.6250.6540.8730.9910.99
Xylem0.9980.8581.0000.9911.03
(Synthetic Ref.)

The results confirm that PKGF maintains high structural extraction accuracy across a vast range of natural phenomena, independent of their generative origins. For datasets with strong long-range memory (e.g., the Nile levels/Joseph effect) and deterministic chaos (e.g., Lorenz system), PKGF demonstrated exceptional fidelity in capturing structural essence.

6.3 Visualization of Geometric Similarity

We analyzed the structural profiles in a high-dimensional space to produce two key maps:

Geometric Similarity Map

Projecting the 30-D feature space (rank-normalized) onto 2D using PCA revealed striking structural similarities across physical boundaries.

  • Discovery of Proximal ClustersPrime Gaps and HRV, as well as the Lorenz system and Geomagnetic fluctuations, plot in close proximity. This indicates that deterministic chaos and non-deterministic natural fluctuations share common “logical patterns” when viewed through multifractal and topological lenses.
  • Rank-Normalization Effect: By eliminating scale differences between features, we successfully measured essential structural distances independent of outliers.

Model Character Map

Mapping data along the axes of “Memory Persistence” (Hurst HH) and “System Energy” (Variance σσ) identifies the functional status of each phenomenon within the PKGF system.

  • Memory Persistence: Data in the H>0.5H>0.5 region (Nile, Bitcoin, etc.) possess “persistence,” where past history strongly influences future states.
  • Energy Distribution: High σσ data (Flares, Seismic) represent high-energy states prone to sudden systemic transformations.
  • Identification of the Optimal Region: Most stable natural systems aggregate around H≈0.7∼0.8H≈0.7∼0.8, suggesting that this intermediate state between total randomness (H=0.5H=0.5) and rigid determinism (H=1.0H=1.0) is the “steady state of information” necessary for maintaining intellectual fluidity.

7. Conclusion

This study redefines PKGF as a universal structural memory theory that transcends the deterministic/non-deterministic dichotomy. Our results suggest that the essence of intelligence lies not in value prediction, but in the geometric flow of structure. The current phase has successfully validated the numerical integrity of “Structure Mapping” and “Structural Memory,” providing a robust foundation for the next phase: the implementation of “Dynamic Inference” (Structure Flow).


8. References

  1. Amari, S. (2016). Information Geometry and Its Applications. Springer.
  2. Avesani, S., et al. (2024). Beyond Signal and Noise: Multiscale Scattered Data Analysis.
  3. Barbaresco, F. (2019). Geometric Theory of Information and Lie Group Thermodynamics. MDPI.
  4. Bronstein, M. M., et al. (2017). Geometric Deep Learning: Going beyond Euclidean data. IEEE.
  5. Ghrist, R. (2008). Barcodes: The persistent topology of data. AMS.
  6. Hamilton, R. S. (1982). Three-manifolds with positive Ricci curvature. J. Diff. Geom.
  7. Hurst, H. E. (1951). Long-term storage capacity of reservoirs. ASCE.
  8. Kantelhardt, J. W., et al. (2002). Multifractal detrended fluctuation analysis of nonstationary time series. Physica A.
  9. Mandelbrot, B. B. (1982). The Fractal Geometry of Nature. W. H. Freeman.
  10. Nielsen, F. (2013). An Elementary Introduction to Information Geometry. arXiv:1311.1911.
  11. Nielsen, F., & Barbaresco, F. (Eds.) (2023). Geometric Science of Information. Springer.
  12. Perelman, G. (2002). The entropy formula for the Ricci flow. arXiv:math/0211159.
  13. Tishby, N., et al. (1999). The information bottleneck method. arXiv.
  14. Cohen, T. S., & Welling, M. (2016). Group Equivariant Convolutional Networks.
  15. Girolami, M., & Calderhead, B. (2011). Riemann manifold Hamiltonian Monte Carlo.

Leave a Comment