### Infinite Horizon

Welcome to Entropy Ventures (EV), the Research & Development division of Alkemy. As our corporate nucleus, EV has been tasked with thought leadership, developing intellectual property, incubating future products and divisions, and upholding, long into the future, the entrepreneurial bedrock on which Alkemy was founded. In a sentence: this is our 21st-century rendition of the original “Bell Labs”.

For the past 3 years, through the work of over 200 people, our team has been exploring many of the foundations of technology, information, and modern knowledge. We have continuously channeled our learning, integrating this research to sharpen our vision of the future. By leveraging these purer scientific and academic underpinnings, Alkemy is actively building revolutionary infrastructure for the burgeoning Internet of Everything (IoE).

An epic quest starts with inspiration, and a great company needs an ethos. As (roughly) said by Isaac Newton, “If we have seen further, it is by standing on the shoulders of giants”. Therefore, as we’ve been navigating the landscape of 21st-century innovation, we remind ourselves of the state of the world upon which we build.

The twentieth century saw two beautiful and comprehensive theories of description.

In 1915, Einstein’s theory of General Relativity (GR) explained how the three classical dimensions of Euclidean space and the one dimension of classical time could be thought of as a single geometric structure: spacetime (first formalized as “Minkowski space” in the flat, special-relativistic case). In GR, energy - or mass, remembering E = mc^{2} - curves this dynamical fabric, and that curvature manifests as what we experience as the force of gravity. Einstein’s foundational ideas (GR, SR, EPR, ER), regardless of any sub-optimal interpretations or conclusions now considered false, helped revolutionize our understanding that we’re in a non-fixed playing field, and that the prior ideas about space and time (or events) - and about our physical world’s geometry - were incorrect.

During this same period, Max Planck’s foundational idea (1900) of a smallest possible interval of anything ‘X’ - an absolute minimum (a quantum) of energy or of process - began a quantum realization and started the expedition toward quantum mechanics (QM). While it took many of the best minds of the 20th century, sincerely remarkable Truths about our world were uncovered through a collective journey, including, but not limited to: (1) rethinking false dichotomies (“dude, is it a wave or a particle?”), (2) new mathematical formulations (Schrödinger’s wave function), (3) grappling with un-intuitive concepts (antimatter, entanglement, the Heisenberg uncertainty principle), (4) “quantizing” old concepts (von Neumann entropy), (5) unifying understanding (electroweak theory, and duality generally), (6) creating new fields of study (information, computation, and complexity), (7) developing new techniques (Feynman diagrams), (8) discovering new things (black holes, the Higgs field, dark matter), (9) building new things (digital computers), and (10) making new models (the Standard Model, inflation, string theory).

All things considered, after arriving ~13.8 BN years “late to the party”, we’ve made real headway into actually figuring out what is going on and what can be done! But while GR and QM each have significant accuracy and predictive power, the two are not quite compatible. In today’s quest for unification toward the “holy grail” - a quantum mechanical theory of gravity (spacetime geometry) which also constrains what kinds of matter can exist - an important thread has been progressively uncovered. That thread is a ‘yin & yang’ pair more contentful than perhaps any other: entropy and information.

In short, entropy measures the lack of certainty in an observation (of some system ‘X’) due to the statistical - and ultimately quantum - nature of the motion of particles at the microscopic level (i.e. an *inherent* spectrum of possible microstates comprises/defines each ‘X’). This uncertainty can be thought of as “one’s ignorance”. And by a wonderfully useful tautology: you can only have as much ignorance as there is information to be ignorant of! As the complement to entropy, ‘full information’ equates to what is needed to *physically* describe what is going on in a system.

You can think of this info like the memory on your computer: bits and bytes. In 1948, Claude Shannon provided a mathematical definition of information, giving us a way to measure the number of “bits” in a very general sense (e.g. the number of bits per letter of English text). Because there are 26 letters in the alphabet, you might expect this to be between 4 and 5 bits (log2 26 ≈ 4.7). However, because of the language’s regularities - like the fact that ‘e’ is very common and ‘x’ is not - fewer bits are needed thanks to this structural predictability (Shannon estimated 0.6-1.3 bits per character once longer-range context is taken into account).
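As a concrete (and hedged) illustration, here is a minimal Python sketch of Shannon’s entropy formula, H = -Σ p_i · log2(p_i), applied to single-letter frequencies. Note that this only captures unigram statistics; Shannon’s 0.6-1.3 bits/character estimate for English comes from conditioning on much longer context (digrams, words, whole phrases), which this sketch does not attempt:

```python
from collections import Counter
from math import log2

def shannon_entropy(text: str) -> float:
    """Bits per character of the single-letter (unigram) distribution."""
    letters = [c for c in text.lower() if c.isalpha()]
    counts = Counter(letters)
    total = len(letters)
    # H = -sum over symbols of p * log2(p)
    return -sum((n / total) * log2(n / total) for n in counts.values())

# A uniform spread over all 26 letters gives log2(26) ~ 4.70 bits/letter;
# real English text scores lower because letter frequencies are skewed.
print(shannon_entropy("the quick brown fox jumps over the lazy dog"))
```

A text of all one letter scores 0 bits (no ignorance to resolve), while a perfectly balanced alphabet scores the full log2(26): exactly the “4 to 5 bits” intuition the paragraph above starts from.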

In part because the foundations of QM necessitate that information is never lost, a black hole’s horizon is a special laboratory. In 1972, Jacob Bekenstein argued that the *surface area* of the horizon is proportional to the black hole’s entropy. The construct of “entropy bounds” turns this measurement of ignorance on its head, toward a deep truth. Whether you throw a box of books or a Confederate general into a black hole, the information content of those additions sets a lower bound (minimum) on how much the horizon’s area must grow - observable as the delta of the horizon, before and after.

In a further and still-active line of work, Raphael Bousso has helped generalize this concept, showing a universal relationship between information and spacetime surfaces (geometry). We’ve learned that a bounded physical system can store, at maximum, ~10^{69} qubits per square meter (or ‘bits’: by Holevo’s theorem, n qubits can carry at most n classical bits, as pointed out by S. Aaronson).
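To see where the ~10^{69} figure comes from, here is a hedged back-of-the-envelope Python check. The holographic bound allots roughly one nat of entropy per four Planck areas of bounding surface, so converting to bits gives S_max = A / (4 · l_p² · ln 2); the constants below are standard CODATA values, and the one-line formula is the Bekenstein-Hawking entropy rewritten in bits:

```python
import math

# Physical constants (SI units, CODATA values)
hbar = 1.054571817e-34  # reduced Planck constant, J*s
G = 6.67430e-11         # gravitational constant, m^3 kg^-1 s^-2
c = 2.99792458e8        # speed of light, m/s

# Planck length squared: l_p^2 = hbar * G / c^3 (~2.6e-70 m^2)
l_p2 = hbar * G / c**3

# Holographic bound in bits per square meter of bounding surface:
# one nat per 4 Planck areas, divided by ln(2) to convert nats -> bits
bits_per_m2 = 1 / (4 * l_p2 * math.log(2))
print(f"{bits_per_m2:.2e} bits per square meter")
```

The result lands at ~1.4 × 10^{69} bits/m², matching the order of magnitude quoted above.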

So what applies to black holes expresses a much deeper, ubiquitous relationship (constraint) of the world. With the amount of information that a system can register limited by its maximum entropy, and assuming that the cosmological constant (vacuum energy) driving space’s outward expansion is indeed constant (as is thought), then, consistent with the best results we have in the search for quantum gravity, there is a maximum of ~10^{122} qubits (or bits) in our observable universe (i.e. within our horizon of reach). And by the Margolus-Levitin theorem, the fundamental limit of quantum (or classical) computation is ~6 × 10^{33} operations per second per joule of energy, creating an upper bound for the total number of elementary logical operations (ops) performed since our observable universe began (i.e. related to its age of ~13.8 BN years).
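The Margolus-Levitin rate quoted above can be checked in one line: a system with average energy E can perform at most 2E / (π · ħ) orthogonal state transitions per second, so dividing by E gives a rate per joule. The sketch below then multiplies by an *assumed, purely illustrative* mass-energy for the observable universe (~10^{70} J) and its age (~4.4 × 10^{17} s) to show how an order-of-magnitude cap on total ops falls out; both of those inputs are rough assumptions, not settled figures:

```python
import math

hbar = 1.054571817e-34  # reduced Planck constant, J*s

# Margolus-Levitin theorem: max ops/second for energy E is 2E / (pi * hbar).
# Per joule, that is 2 / (pi * hbar) ~ 6e33 ops/s/J.
ops_per_second_per_joule = 2 / (math.pi * hbar)
print(f"{ops_per_second_per_joule:.2e} ops per second per joule")

# Hedged illustration only: assumed universe mass-energy (~1e70 J) and
# age (~13.8 BN years ~ 4.4e17 s) yield an order-of-magnitude total.
total_ops_cap = ops_per_second_per_joule * 1e70 * 4.4e17
print(f"~{total_ops_cap:.1e} elementary ops since the beginning")
```

The first print confirms the ~6 × 10^{33} figure in the text; the second lands near 10^{121}, the same ballpark as well-known estimates of the universe’s total computational history.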

What does this mean? Nothing, *and* everything. This does not speak to semantic meaning, to many other complex or human phenomena, or to the limits of knowledge. But, given this ‘groundstone’ to reason from, we find it remarkable that, even while spatially, temporally, and computationally bounded by the horizon, *the cumulative lattice of discovery, experience, and adventure is seemingly infinite.*

It is our deep belief at Entropy Ventures that, as this story continues, a better understanding of the world, its properties, and the way it evolves through space and time will be profound for human beings in the 21st century and beyond.