Intelligence is a complex and multifaceted concept that has been the subject of much research and discussion over the years. It is a concept that is often used to describe an individual’s ability to think, learn, and reason effectively. While there has been much research into intelligence, it is still not well understood, and the concept of intelligence remains difficult to define in a concise and practical way.
The Suárez Formula of Intelligence
Recently, a relatively new proposition, here referred to as the Suárez formula (see “The Most Practical Intelligence Equation”, Suárez 2023), has aimed to provide a highly practical definition of intelligence for everyday use. This equation was developed by Dr. Suárez early in his scientific career, and it became a very handy tool in his own personal development and success. The simple formula states that intelligence is highest when results are obtained with the least effort, and is expressed as:
E x I = R
Where E represents effort, I represents intelligence, and R represents results. Solving this equation for the variable I gives I = R/E. This formula provides a practical definition of intelligence that can be easily understood and applied in everyday life.
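As a toy illustration, the Suárez formula fits in a couple of lines of Python (the function name and the example numbers are arbitrary choices; any consistent measure of effort and results will do):

```python
def suarez_intelligence(results, effort):
    """I = R / E: intelligence as results obtained per unit of effort."""
    if effort <= 0:
        raise ValueError("effort must be positive")
    return results / effort

# Two people reach the same result, but the second with half the effort,
# so the formula rates the second as twice as intelligent:
print(suarez_intelligence(results=10, effort=5))    # 2.0
print(suarez_intelligence(results=10, effort=2.5))  # 4.0
```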
Wissner’s “Entropic Force Equation”
On the other hand, the Wissner equation (see “Causal Entropic Forces” – Wissner-Gross and Freer, 2013) is a well-established equation in the field of intelligence research. The equation states that a kind of intelligence—understood as an “entropic force”—is proportional to entropy (a term often treated as synonymous with disorder and chaos, though technically it is a measure of the number of different ways that a set of objects can be arranged) and is expressed as:
F = T ∇ Sτ
In this equation, F represents the force of intelligence, T represents temperature, Sτ represents “causal entropy,” and ∇ represents differentiation with respect to space-time coordinates. To clarify this complex equation a bit: “causal entropy” (Sτ) is in essence a measure of how much information can be gained from observing an event, and it represents the amount of uncertainty associated with predicting future states based on past observations. More importantly, it can be interpreted as the diversity of possible accessible futures, or the degree of future freedom of action. It increases as more paths are explored through space-time regions. Temperature (T) parametrizes the system’s bias towards macrostates that maximize causal entropy. Finally, the symbol ∇ indicates how much the entropy changes across space and time. The entropic force drives systems toward higher-entropy states over time, i.e., toward states that keep the largest number of future paths open, thus providing a physical relationship between intelligence and maximum instantaneous entropy production. As Dr. Wissner-Gross puts it all in perspective: “intelligence doesn’t like to get trapped.” Phew! Complex, yet beautiful! Glad you made it through!
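For readers who think better in code, here is a deliberately simplified numeric sketch of the entropic force. It stands in for the real path-integral definition of Sτ by simply counting accessible futures and taking a one-dimensional finite difference; the function names and numbers are illustrative assumptions, not part of the original formulation:

```python
import math

def causal_entropy(n_futures):
    # Toy stand-in for S_tau: log of the number of accessible futures.
    # (In the paper, S_tau is defined over distributions of paths;
    # counting discrete futures is a deliberate simplification.)
    return math.log(n_futures)

def entropic_force(T, n_futures_here, n_futures_there, dx=1.0):
    # F = T * dS/dx, approximated with a finite difference between
    # two neighbouring positions of the system.
    dS = causal_entropy(n_futures_there) - causal_entropy(n_futures_here)
    return T * dS / dx

# The force is positive, i.e. it points toward the position with more
# accessible futures ("intelligence doesn't like to get trapped"):
print(entropic_force(T=1.0, n_futures_here=4, n_futures_there=16))  # ~1.386
```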
Dr. Suárez saw the potential to merge these two equations in order to gain a more complete understanding of intelligence. He believed that merging them would yield a new equation providing novel insights into intelligence. To achieve this, he made some assumptions about the variables in each equation: that the variable F in Wissner’s formula is equal to the variable I for intelligence in Suárez’s formula, and that T in Wissner’s formula is equivalent to E in Suárez’s formula, based on the general idea that both variables denote levels of intensity or magnitude.
The Suárez-Wissner equation
The resulting Suárez-Wissner equation is a mathematical formula that represents the relationship between intelligence, entropy, and results. The equation is written as follows:
I = √(∇Sτ × R)
In this equation, I stands for intelligence, Sτ stands for causal entropy (or the diversity of possible accessible futures, or the degree of future freedom of action), ∇ stands for differentiation, and R stands for results. The equation suggests that the higher the diversity of possible futures (Sτ), the higher the intelligence (I). It also suggests that the higher the results (R) achieved, the higher the intelligence (I).
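A quick numeric sanity check of the merged equation, as a Python sketch with arbitrarily chosen example values (here grad_S stands for ∇Sτ treated as a single number): starting from Suárez’s E × I = R and the assumption I = E × ∇Sτ, the value I = √(∇Sτ × R) satisfies both original equations:

```python
import math

R, grad_S = 12.0, 3.0              # hypothetical results and entropy gradient

I = math.sqrt(grad_S * R)          # merged equation: I = sqrt(grad_S * R)
E = R / I                          # Suárez: E = R / I

assert abs(E * I - R) < 1e-9       # Suárez's  E x I = R  holds
assert abs(E * grad_S - I) < 1e-9  # Wissner's I = E * grad_S holds
print(I)  # sqrt(36) = 6.0
```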
The Suárez-Wissner equation lets us see that the highest intelligence (I) is achieved when there is a wide range of possible futures (Sτ), allowing more opportunities for creativity and problem solving. Furthermore, it suggests that a high degree of future freedom of action (Sτ) allows for more flexibility and adaptability, which is in tune with higher intelligence (I).
On the other hand, the equation also shows that the more results (R) that are achieved, the higher the intelligence (I). This may mean that as an individual or organization accumulates more experiences, knowledge, and skills, their intelligence also increases.
The Suárez-Wissner equation states that intelligence (I) is proportional to the square root of the product of causal entropy (Sτ) and accumulated results (R). In this sense, as causal entropy decreases, intelligence decreases as well, and as results increase, intelligence increases.
Here are some examples where increased entropy can indicate higher intelligence, and where the gradual accumulation of results can lead to lower entropy over time when the same level of intelligence is applied:
1. Creative Problem-Solving: A person who is able to approach a problem from multiple angles, consider various solutions, and generate novel ideas would likely have a high level of causal entropy and thus, according to the Suárez-Wissner equation, higher intelligence. On the other hand, a person who repeatedly applies the same solution to similar problems may have a lower level of causal entropy and thus lower intelligence.
2. Innovation: A company or organization that fosters a culture of experimentation, encourages taking risks, and allows for failure would likely have higher levels of causal entropy and thus higher collective intelligence. On the other hand, a company or organization that relies heavily on established processes and discourages experimentation would have lower levels of causal entropy and thus lower collective intelligence.
3. Learning: A student who is curious and constantly seeking new knowledge and experiences would have higher levels of causal entropy and thus higher intelligence. On the other hand, a student who consistently relies on memorization and rote learning would have lower levels of causal entropy and thus lower intelligence.
These examples highlight the importance of maintaining a high level of causal entropy in order to continually improve intelligence, and demonstrate that the gradual accumulation of results does not necessarily equate to higher intelligence, but rather a decrease in causal entropy.
Interestingly, the Suárez-Wissner equation indicates that as results accumulate, the individual’s intelligence increases and the level of causal entropy decreases. To illustrate this, consider the following examples:
1. In a company, a highly intelligent employee who consistently produces high-quality work may experience lower levels of causal entropy as they become more familiar with their tasks and develop more efficient methods. Over time, their results will continue to improve and their intelligence will increase as well.
2. A student who is taking a challenging course in college may experience high levels of causal entropy at first as they grapple with new material and unfamiliar concepts. However, as they study and work to understand the material, their results will gradually improve and their intelligence in that subject will increase.
3. A software developer who is working on a complex project may experience high levels of causal entropy as they navigate unfamiliar code and design new solutions. As they progress and their results improve, their intelligence in that area will also increase.
These examples reflect the relationship between intelligence, results, and entropy in the Suárez-Wissner equation. Fundamentally, having a wide range of possible futures and achieving results can lead to higher intelligence. Not surprisingly, it is a natural tendency of intelligent beings to want to be free.
The 2nd Suárez-Wissner equation
The 2nd Suárez-Wissner equation is a mathematical formula that relates results (R), effort (E), and causal entropy (Sτ). It is represented as follows:
R = E^2 ∇ Sτ
In this equation, effort (E) represents the amount of energy and resources put into a task or activity in order to achieve a desired outcome or result. Results (R) represent the outcome or end product of a task or activity. Once more, causal entropy (Sτ) refers to the diversity of possible accessible futures or the degree of future freedom of action. This means that Sτ represents the amount of uncertainty or unpredictability in a given situation, and it is calculated by considering the range of future possibilities.
The 2nd Suárez-Wissner equation states that the results of a task or activity are directly proportional to the square of the effort invested and to the causal entropy of the situation. This means that if the amount of effort invested increases, the results will also increase. Likewise, if the causal entropy increases (meaning there is more future freedom of action), the results will also increase.
In this equation, the relationship between effort, causal entropy, and results can be interpreted as follows: an increase in effort results in an increase in results, but this increase is dependent on the degree of causal entropy. If there is a high degree of causal entropy, meaning a lot of potential for different outcomes, then the increase in results will be greater with a given level of effort. Conversely, if there is a low degree of causal entropy, meaning fewer potential outcomes, then the increase in results will be smaller.
For example, imagine that a person is trying to lift a heavy object. The effort required to lift the object is represented by the variable E. The potential outcomes, or diversity of possible futures, can be represented by the causal entropy (Sτ). If the object is extremely heavy and there is no chance of lifting it, then the degree of causal entropy is low, and the results (R) will be small. However, if the object is lighter and there are different ways of lifting it, such as using a lever or rope, then the degree of causal entropy is high, and the results (R) will be greater for a given level of effort (E).
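The quadratic dependence on effort can be sketched in a couple of lines of Python (treating ∇Sτ as a single number, grad_S, with invented values): doubling the effort quadruples the results at any fixed level of causal entropy:

```python
def results(effort, grad_S):
    # 2nd Suarez-Wissner equation: R = E^2 * grad_S
    return effort ** 2 * grad_S

print(results(effort=1, grad_S=2.0))  # 2.0
print(results(effort=2, grad_S=2.0))  # 8.0 -> doubling E quadruples R
```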
Many more examples reveal the direct proportionality of Sτ and R in the 2nd Suárez-Wissner equation R = E^2 ∇ Sτ. For example:
1. In a company, the R&D department has a set budget to develop new products. The more diverse the future options available for product development (Sτ), the greater the potential for the department to achieve results (R).
2. In sports, a basketball coach wants to improve the skills of his players. The more diverse the training exercises available (Sτ), the greater the potential for the players to achieve better results (R) on the court.
3. In education, a teacher wants to engage his students in a lesson. The more diverse the teaching methods available (Sτ), the greater the potential for the students to achieve better results (R) in their understanding of the material.
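The direct proportionality illustrated in the three examples above can be checked the same way (again a sketch with invented values): holding effort fixed, results scale linearly with the diversity of futures:

```python
def results(effort, grad_S):
    # 2nd Suarez-Wissner equation: R = E^2 * grad_S
    return effort ** 2 * grad_S

effort = 3  # fixed budget / training time / lesson time
for grad_S in (1.0, 2.0, 4.0):
    print(grad_S, results(effort, grad_S))
# R doubles every time the diversity of futures doubles: 9.0, 18.0, 36.0
```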
The Suárez-Wissner equations derived here provide a unique perspective on the relationship between effort, causal entropy, and results. They highlight the importance of effort in achieving results and how the level of entropy in the system can affect this relationship. These equations have a broad range of potential applications in fields such as psychology, economics, and even sports, as they provide a mathematical framework for understanding the relationship between effort and achievement. They may also offer a promising new tool for understanding and optimizing intelligence in individuals and organizations, thus enhancing the efficiency of personal development strategies and increasing organizational intelligence. While these equations certainly have limitations, they provide a useful starting point for further exploration and research into the relationship between effort, entropy, and results.