Entropy Ventures

The R&D Division of Alkemy


Infinite Horizon

Welcome to Entropy Ventures (EV), the Research & Development division of Alkemy. As our corporate nucleus, EV is tasked with thought leadership, developing intellectual property, incubating future products and divisions, and upholding, long into the future, the entrepreneurial bedrock on which Alkemy was founded. In a sentence: this is our 21st century rendition of the original “Bell Labs”.

For the past 3 years, through the work of over 200 people, our team has been exploring many of the foundations of technology, information, and modern knowledge. We have continuously channeled that learning, integrating the research into an ever-sharper vision of the future. By leveraging these purer scientific and academic underpinnings, Alkemy is actively building a revolutionary infrastructure for the burgeoning Internet of Everything (IoE).

An epic quest starts with inspiration, and a great company needs an ethos. As (roughly) said by Isaac Newton, “If we have seen further, it is by standing on the shoulders of giants”. Therefore, as we navigate the landscape of 21st century innovation, we remind ourselves of the state of the world upon which we build.

The twentieth century saw two beautiful and comprehensive theories of description.

In 1905, Einstein’s Special Relativity (SR) showed that the three classical dimensions of Euclidean space and the one dimension of classical time are best described within a single geometric structure (coined Minkowski space, aka spacetime). With General Relativity (GR, 1915), energy - or mass, remembering E = mc² - curves this dynamical fabric, and that curvature manifests as the force of gravity. Einstein’s foundational ideas (SR, GR, EPR, ER), regardless of any sub-optimal interpretations or conclusions now considered false, helped revolutionize our understanding that we are not on a fixed playing field, and that the prior ideas about space and time (or events) - and about our physical world’s geometry - were incorrect.

Team members at table

*A: "Caption?: B: "Something witty, something topical."

During this same period, Max Planck’s foundational idea (1900) of a smallest interval of anything ‘X’ - an absolute minimum (quantum) of energy or of process - began the quantum realization and started the expedition toward quantum mechanics (QM). While it took many of the best minds of the 20th century, genuinely remarkable truths about our world were uncovered through a collective journey, including, but not limited to: (1) rethinking false dichotomies (“dude, is it a wave or a particle?”), (2) new mathematical formulations (Schrödinger’s wave function), (3) grappling with unintuitive concepts (antimatter, entanglement, the Heisenberg uncertainty principle), (4) “quantizing” old concepts (von Neumann entropy), (5) unifying understanding (electroweak theory, and duality generally), (6) creating new fields of study (of information, computation and complexity), (7) developing new techniques (Feynman diagrams), (8) discovering new things (black holes, the Higgs field, dark matter), (9) building new things (digital computers), and (10) making new models (the Standard Model, inflation, string theory).

All things considered, after being ~13.8 BN years “late to the party”, we’ve made real headway into actually figuring out what is going on and what can be done! But while GR and QM each have significant accuracy and predictive power, the two are not quite compatible. In today's quest for unification towards the "holy grail" - a quantum mechanical theory of gravity (spacetime geometry) which also limits what kinds of matter can exist - an important thread has been progressively uncovered. That thread is a 'yin & yang' pair richer in content than perhaps any other: entropy and information.

In short, entropy measures the lack of certainty in an observation (of ‘X’) due to the statistical (and ultimately quantum) nature of the motion of particles at the microscopic level (i.e. an inherent statespace spectrum comprises/defines each 'X'). This uncertainty can be thought of as "one's ignorance". And by a wonderfully useful tautology: you can only have as much ignorance as there is information to be ignorant of! As the complement to entropy, 'full information' equates to what is needed to physically describe what is going on in a system.

You can think of this info like the memory on your computer: bits and bytes. In 1948, Claude Shannon provided a mathematical definition of information, giving a way to measure the number of "bits" in a very general sense (e.g. the number of bits per letter in an English text). Because there are 26 letters in the alphabet, you might think this would be between 4 and 5 bits. However, because of the language's regularities - like the fact that 'e' is very common and 'x' is not - fewer bits are needed thanks to this structural predictability (roughly 0.6-1.3 bits per character).
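
As a concrete and purely illustrative sketch - not anything from an EV codebase - the Python snippet below estimates bits per letter from the single-letter frequencies of whatever text you feed it. A uniform 26-letter alphabet scores log2(26) ≈ 4.7 bits; real English scores lower (roughly 4.1 bits at the single-letter level), and accounting for longer-range structure (digrams, words) pushes the estimate toward Shannon's 0.6-1.3 bits per character:

    from collections import Counter
    from math import log2

    def bits_per_letter(text: str) -> float:
        """Estimate Shannon entropy (bits per letter), treating each
        letter's empirical frequency as its probability."""
        letters = [c for c in text.lower() if c.isalpha()]
        total = len(letters)
        counts = Counter(letters)
        # H = -sum over letters of p * log2(p)
        return -sum((n / total) * log2(n / total) for n in counts.values())

    print(log2(26))  # ~4.70 bits: the uniform, structure-free baseline
    # Longer, real-world text gives a more faithful single-letter estimate;
    # this short sentence is only a toy input.
    print(bits_per_letter("the quick brown fox jumps over the lazy dog"))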

In part because the foundations of QM necessitate that information is never lost, a black hole's horizon is a special laboratory. In 1972, Jacob Bekenstein showed that the surface area of the horizon is proportional to the black hole's entropy. The construct of "entropy bounds" turns this measurement of ignorance on its head towards a deep truth. Whether you throw a box of books or a confederate general into a black hole, observing the delta of the horizon (before and after) provides an upper bound (maximum) on the amount of information needed to describe these additions.

The train called 'time'

*The train called "time".

In a further and active iteration, Raphael Bousso has helped to generalize this concept: showing a universal relationship between information and spacetime surfaces (geometry). We’ve learned that a bounded physical system can store, at maximum, ~10^69 qubits per square meter (or 'bits', as by Holevo's theorem; pointed out by S. Aaronson).

So what applies to black holes expresses a much deeper, ubiquitous relationship (constraint) of the world. With the amount of information that a system can register limited by its maximum entropy, and assuming that the cosmological constant (vacuum energy) driving space's outward expansion is indeed constant (as is thought), then - consistent with the best results we have in the search for quantum gravity - there is a maximum of ~10^122 qubits (or bits) in our observable universe (i.e. within our horizon of reach). And by the Margolus-Levitin theorem, the fundamental limit of quantum (or classical) computation is ~6 × 10^33 operations per second per joule of energy - creating an upper bound for the total number of elementary logical operations (ops) performed since our observable universe began (i.e. related to its age of ~13.8 BN years).
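
Both figures are back-of-the-envelope computable from a handful of physical constants. A minimal sketch follows (conventions and prefactors vary slightly across the literature, so treat the outputs as order-of-magnitude):

    import math

    hbar = 1.054571817e-34   # reduced Planck constant, J*s
    G    = 6.67430e-11       # Newton's gravitational constant, m^3 kg^-1 s^-2
    c    = 2.99792458e8      # speed of light, m/s

    # Planck length, and the areal (holographic) bound: the
    # Bekenstein-Hawking entropy is A / (4 * l_p^2) in nats,
    # i.e. roughly 10^69 bits per square meter of horizon.
    l_p = math.sqrt(hbar * G / c**3)
    bits_per_m2 = 1 / (4 * l_p**2 * math.log(2))
    print(f"{bits_per_m2:.1e} bits per square meter")            # ~1.4e+69

    # Margolus-Levitin: a system with average energy E performs at most
    # 2E / (pi * hbar) elementary operations per second - the per-joule
    # rate quoted above.
    ops_per_joule_second = 2 / (math.pi * hbar)
    print(f"{ops_per_joule_second:.1e} ops per second per joule")  # ~6.0e+33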

What does this mean? Nothing, and everything. This does not relate to semantic meaning, to many other complex or human phenomena, or to the limits of knowledge. But, given this 'groundstone' to reason from, we find it remarkable that, even while spatially, temporally and computationally bounded by the horizon, the cumulative lattice of discovery, experience and adventure is seemingly infinite.

It is our deep belief at Entropy Ventures that, as this story continues, a better understanding of the world, its properties, and the way it progresses through space and time will be profound for human beings in the 21st century and beyond.


Fractal Geometry

Self-replication is a capacity common to every species of living thing. In exchanging energy and matter with its surroundings, a living thing can make a “copy” of itself. And while the statistical physics of living systems operates (very) far from thermodynamic equilibrium, the self-replication process must invariably be fueled by the production of entropy.

Ellis Island

*Ellis Island: Diversity from Common Origin.

Recently, MIT professor Jeremy England’s research team derived a lower bound (a “minimum cost”) on the amount of heat that is produced during this process of self-replication. By demonstrating that reproductive fitness is linked to efficient metabolism, the team arrives at a most prodigious conjecture: that the original process of self-replication - when one cell became two, and when n cells became you - was driven by the improved dissipation of energy at the cellular level. This is because a self-replicator which dissipated more heat (energy) could grow faster, increasing its chances of survival. This economic strategy was accentuated in increasingly developed organizational structures by lowering the metabolic energy exerted in marginal self-replication (i.e. the cost of growth).

The upshot is that England’s team has developed a new physical explanation of how non-intelligent biological systems 'X' grow and scale, and of the tendencies of the configurations they naturally adapt towards (for 'X'); something quite novel.

Cross-Section of EV model (Alkemy’s DNA).

*Cross-Section of EV model (Alkemy's DNA).

We drew a parallel to these recent results and derived a model that would maximize Alkemy's own rate of self-replication (growth). Just as there is a "winning strategy" in biology, so too is there a "winning framework" for a bona fide technology company.

What we've constructed is something of an ever-evolving fractal: a model which evolves to minimize our marginal entropy & transaction costs (to "lower metabolic energy"), while maximizing our resources through an ecosystem-like configuration (towards "dissipating more heat"). As our corporate centerpiece, EV serves as the scalable architecture carrying Alkemy's DNA that most efficiently captures value creation and accelerates its manufacturing process (i.e. R&D). To effectively convert EV's ideation (potential energy) into popular application (kinetic energy), we have embedded Hartford Lab as our in-house BD incubator. This outward, self-similar "radiation" from our core will permit organic provenance between Alkemy's intellectual property and our products and operating divisions - of today and of tomorrow.

As in biology, and with information generally, time facilitates the growth of Entropy Ventures' global complexity. In future versions of ourselves, and as an expression of the aforementioned continuous-integration maxim, EV will progressively bloom into a constellation of partnerships, internal assets (e.g. an IP prosecution boutique), and further resources (e.g. a CVC vehicle).


Truth Source

We find it important to remind ourselves of 'what' "technology" is, and 'how' it manifests itself. As we have evaluated the nature of innovation - from ancient history to the present day - we see an undeniable pattern; a critical chronology, if you will. First, predominantly "useless" mathematicians and thinkers discover and invent truths & the tools to interact with them. Physicists and computer scientists then construct explanatory models, often by adopting these truths and using these tools (or sometimes by arriving at them independently). The world then collectively tests these models through repeated experiment under varied conditions. This root process of mapping the world allows us to uncover the fundamental principles, forces, rules, structures, and processes of our universe (collectively, "intelligence").

Industrialization

*Connected Industrialization via nature.

From these a priori foundations, a new t0 (time-zero) baseline of knowledge is born. Inventors find and construct the materials and components ("hardware") necessary to leverage this intelligence. Then, entrepreneurial leaders creatively iterate on these inventions - uniquely packaging the newly available infrastructure, products, applications or services - and build teams to focus resources (e.g. labor) towards successfully growing their vision. Through markets composed of individual actors (like people and companies), we accelerate the time it takes to optimize this process (i.e. maximizing end-application value potential, usability, quantity, quality). In forging stability (of these markets), legal systems & local rules normalize behaviors through incentives and restrictions.

In the modern version of the story, software, devices, and other media interfaces stack on top of (mobile) hardware, while programmed machines enhance computational capacities and connected media (like the internet) better enable demand to meet supply by lowering transaction costs (i.e. distribution, exchange, search, verification, and enforcement costs).

The moral of this allegory is that new, creative ideas are the necessary conditions for real value, as they sit at the bottom of this golden sequence. As beautifully said by Steven Weinberg, "it is as if when Neil Armstrong first walked on the moon (the scientist), he found the footsteps of Jules Verne in the lunar dust (the mathematician)." The metaphor is profound: ideas lead, science follows, and industrialization harvests. Coupled with the empirical imperative to emphasize dynamic efficiency (as opposed to static efficiency), this is why Alkemy decided to spend our first 4 years in a period of research & development.

Hallway

*Architect path progression recursively.

We have distilled the 10-year R&D plan @ EV into five (5) focus areas:

I. Techniques: Solving problems should be the easy part. What is most difficult is framing them properly, connecting seemingly disparate variables, interpreting data, setting aside one's biases, and developing tools - whether conceptual or computational - to arrive at a manageable position. We therefore study and develop frameworks (e.g. the golden sequence), intuitive methods (e.g. using recursion to solve multi-step problems), unique approaches (e.g. Grothendieck was famous for using increasing abstraction to sculpt towards truth), and mathematical discoveries (e.g. tensor networks) to enhance our brainstorming.

II. Computing: All problems and tasks still need to be solved feasibly. Building on foundations laid by Turing and Church, "complexity theory" provides the domains - or the “clarifying sword”, to use S. Aaronson’s words - of what can theoretically be done and how the resources to solve a problem will scale (i.e. what can reasonably be done). While often out of sight, out of mind, all computations involve the communication (i.e. transmission and receipt) and interpretation (i.e. processing) of information through a channel (i.e. a medium). To optimize this loop and these sub-processes, we study everything from software architectures (e.g. microservices, serverless, decentralized systems) to foundational works in logic (e.g. Gödel’s incompleteness theorems, automata theory) to machine intelligence (e.g. ML variants) to security measures (e.g. various encryption techniques) to the future of 21st century computing (e.g. candidate superconducting materials like graphene and metallic hydrogen, QC hardware proposals).
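
As a toy illustration of "how the resources to solve a problem will scale" - a hypothetical sketch, not a description of EV's stack - compare an exponential-time brute force against a pseudo-polynomial dynamic program for the same problem (subset sum):

    from itertools import combinations

    def subset_sum_bruteforce(nums: list[int], target: int) -> bool:
        """Check every subset: ~2^n candidates, so the work doubles with
        each extra element - infeasible long before n gets 'large'."""
        return any(sum(combo) == target
                   for r in range(len(nums) + 1)
                   for combo in combinations(nums, r))

    def subset_sum_dp(nums: list[int], target: int) -> bool:
        """Track the set of reachable sums instead: O(n * target) work,
        a pseudo-polynomial algorithm for the same question."""
        reachable = {0}
        for x in nums:
            reachable |= {s + x for s in reachable if s + x <= target}
        return target in reachable

    nums = [3, 34, 4, 12, 5, 2]
    print(subset_sum_bruteforce(nums, 9))  # True (e.g. 4 + 5)
    print(subset_sum_dp(nums, 9))          # True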

III. Taxonomy: We use this term broadly, rather than "ontology", but genuinely investigate a dynamic fusion of the two subjects. From definitional language to organizational hierarchy to relational mapping to (symmetry) groups: the logical differentiation, connective properties, and rules of interaction describing the 'various systems of a system' (collectively, a "framework") are native aspects of examining and designing the mechanics of process. Further, subsequent 'higher order' data, semantic meaning, and a system's logical evolution (i.e. consistent and stable growth) also require such a framework. Because such a framework is the backbone of any (information) system, we pay careful attention to taxonomy and these related fields of study.

Building at MIT

*Economies of Learning

IV. Quantum: Quantum information is special and behaves in some qualitatively different ways than classical information. But despite popular belief, quantum computers will not solve different problems than classical computers. That is, there is not a new class of problems which are theoretically enabled by a QC (vs. a classical one), but rather a class of known problems (BQP) which would now become feasible thanks to the exponential decrease in the steps needed to obtain a solution (i.e. problems currently not feasible with our best classical algorithms). While not as glamorous sounding as '1-click answers to anything', this new technical feasibility, and the means of leveraging properties of our quantum universe, may create the opening for many applications (statically) and empower an exponential paradigm of future evolution (dynamically). For instance, EPR (entanglement) is nature's answer for unbreakable encryption through the statistics of maximally entangled qubits (where even snooping, a type of "lateral privacy", can be monitored). The notion of secure communication - aided by the no-cloning theorem and 'auto-interference detection' - could open the door to a new era of connectivity and commerce. In a sentence: we think the 21st century will be quantum; the time is ripe to harvest these powers into technologies that will cover the world and enhance our civilization.
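
To see what "the statistics of maximally entangled qubits" buys you, here is a minimal, illustrative simulation (assumed names, and not a full QKD protocol - eavesdropping detection additionally requires measuring in multiple bases): both parties' computational-basis outcomes agree perfectly, yet each bit on its own is a fair coin flip - correlated randomness that neither party chose in advance.

    import numpy as np

    # Bell pair (|00> + |11>) / sqrt(2), in the basis {00, 01, 10, 11}
    bell = np.array([1, 0, 0, 1], dtype=complex) / np.sqrt(2)

    # Outcome probabilities when both parties measure in the computational basis
    probs = np.abs(bell) ** 2                      # [0.5, 0, 0, 0.5]
    rng = np.random.default_rng(0)
    outcomes = rng.choice(["00", "01", "10", "11"], size=10_000, p=probs)

    agreement = np.mean([o[0] == o[1] for o in outcomes])
    alice_bit_rate = np.mean([o[0] == "1" for o in outcomes])
    print(f"bits agree:      {agreement:.3f}")       # ~1.000
    print(f"Alice's bit = 1: {alice_bit_rate:.3f}")  # ~0.5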

V. Systems: Systems have parts, they have players, they have rules. In seeing systems through the hierarchy of process architecture, we often think about suprastructure (abstract constraints and tendencies), infrastructure (physical objects and processes), and superstructure (configurations and sub-systems) when discussing them. At the 'elementary/DNA level' (parts), systems are best described mathematically and logically. In the pure form, we are interested in how structure gives rise to emergent phenomena and properties, and we are particularly interested in geometry, topology, and pure/abstract mathematics (e.g. Hilbert space, Mandelbrot’s work, primes, Nima Arkani-Hamed's amplituhedron). The players have incentives (e.g. microeconomics) that operate dynamically under constraints - nicely illustrated in practice through the lens of law and economics. These players (and their systems) also evolve, so we supplement our classical economics with the impact of things like 'the adjacent possible', principles of organization, and other economic phenomena which manifest under increasing complexity and time evolution. Because rules (like laws) are both enabling and constraining, and their spectrum operates the way forces do, we find it smart to work at the modern intersections of math, computer science, economics, biology and physics to uncover the deepest systematic insights.



Not Pictured: Many other personnel and advisors. We take great pride in bringing business, industry, and technical experts into the Alkemy family.