This video is adapted from the article DOI 10.3390/ai6020023.
This video presents how the combinatorial fusion cascade provides a surprisingly simple and complete explanation for the origin of the genetic code based on competing protocodes. Although its molecular basis is only beginning to be uncovered, the cascade represents a natural pattern of information generation from initial signals and has potential applications in designing more efficient neural networks.

Using the properties of the combinatorial fusion cascade, the video demonstrates its embedding into deep neural networks with sequential fully connected layers via the dynamic matrix method and compares the resulting modified networks. It is observed that the Fiedler Laplacian eigenvector of a combinatorial cascade neural network does not reflect the cascade architecture. Instead, the eigenvectors associated with the cascade structure exhibit higher Laplacian eigenvalues and are distributed widely across the network.

The video then analyzes a text classification model consisting of two sequential transformer layers with an embedded cascade architecture. The cascade has a significant influence on the classifier's performance, particularly when the model is trained on a reduced dataset (approximately 3% of the original). Finally, the properties of the combinatorial fusion cascade are examined for their application in training neural networks without relying on traditional error backpropagation.
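To make the spectral analysis concrete, here is a minimal sketch of the standard computation behind the Fiedler-vector observation above: build the graph Laplacian L = D − A of a network's connectivity graph, take its eigendecomposition, and read off the Fiedler vector as the eigenvector of the second-smallest eigenvalue. The toy path graph below is purely illustrative, not the cascade network from the video; the function name `laplacian_spectrum` is my own.

```python
import numpy as np

def laplacian_spectrum(adj):
    """Eigenvalues and eigenvectors of the graph Laplacian L = D - A,
    sorted by ascending eigenvalue (eigh guarantees this for symmetric L)."""
    adj = np.asarray(adj, dtype=float)
    degree = np.diag(adj.sum(axis=1))
    lap = degree - adj
    vals, vecs = np.linalg.eigh(lap)
    return vals, vecs

# Toy undirected path graph 0-1-2-3 as a stand-in for a small network.
A = np.array([
    [0, 1, 0, 0],
    [1, 0, 1, 0],
    [0, 1, 0, 1],
    [0, 0, 1, 0],
])

vals, vecs = laplacian_spectrum(A)
# vals[0] is ~0 for a connected graph; the Fiedler vector is the
# eigenvector for the second-smallest eigenvalue, vals[1].
fiedler = vecs[:, 1]
```

For a real cascade network one would inspect not only `fiedler` but also the eigenvectors at the high end of `vals`, since, as noted above, the cascade structure shows up in eigenvectors with higher Laplacian eigenvalues.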