Did you know that mathematical equations affect our day-to-day lives? In fact, mathematics has been called the language of the universe, and it has shaped our understanding of the world in numerous ways. There are some equations that changed the world!

The brightest minds in history have used mathematics to lay the foundation for how we measure and understand our universe.

While there are many mathematical equations that have molded mathematics and human history, **let’s have a look at 10 of them:**

### Isaac Newton’s Law Of Universal Gravitation:

Isaac Newton’s publication of the Principia in July 1687 changed the way we look at the universe.

Newton’s law is a remarkable piece of scientific history – it explains, almost perfectly, why the planets move in the way they do. Also remarkable is its universal nature – this is not just how gravity works on Earth, or in our solar system, but anywhere in the universe.

Newton not only concluded that planets are held in their orbits by gravity, but he also gave the exact formula, F = Gm1m2/r^2, for calculating the force between two large objects given their masses and the distance between them.
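As a quick illustration (not from the original article), here is a minimal Python sketch of the formula, using approximate textbook values for the Earth and Moon:

```python
# Newton's law of universal gravitation: F = G * m1 * m2 / r^2
G = 6.674e-11  # gravitational constant, N*m^2/kg^2 (approximate)

def gravitational_force(m1, m2, r):
    """Force in newtons between masses m1, m2 (kg) separated by r metres."""
    return G * m1 * m2 / r**2

# Approximate values for the Earth-Moon system
earth_mass = 5.972e24  # kg
moon_mass = 7.348e22   # kg
distance = 3.844e8     # m, mean Earth-Moon distance

print(f"{gravitational_force(earth_mass, moon_mass, distance):.3e} N")  # ≈ 1.98e+20 N
```

Plugging in two bowling balls instead of planets shows why we never notice gravity between everyday objects: the force is vanishingly small.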

### The Pythagorean Theorem:

One of the fundamental principles of Euclidean geometry, this theorem describes the relationship between the sides of a right triangle on a flat plane. The theorem states that: “*The sum of the squares of the lengths of the legs of a right triangle is equal to the square of the length of the hypotenuse*”. Currently, there are over 130 different proofs of the Pythagorean Theorem, ranging from geometric arrangements to differential calculus.
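For instance, the theorem is easy to check numerically; here is a short Python sketch using the classic 3-4-5 triangle:

```python
import math

def hypotenuse(a, b):
    """Length of the hypotenuse of a right triangle with legs a and b."""
    return math.sqrt(a**2 + b**2)

# The classic 3-4-5 right triangle: 3^2 + 4^2 = 9 + 16 = 25 = 5^2
print(hypotenuse(3, 4))  # 5.0
```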

### Logarithms:

Until the development of the digital computer, this was the most common way to quickly multiply together large numbers, greatly speeding up calculations in physics, astronomy, and engineering.

Logarithms are the inverses, or opposites, of exponential functions. For example, the base-10 logarithm of 1 is log(1) = 0, since 1 = 10^0; log(10) = 1, since 10 = 10^1; and log(100) = 2, since 100 = 10^2.

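That multiplication-by-addition trick is easy to demonstrate in a few lines of Python (the sample numbers here are arbitrary):

```python
import math

a, b = 1234.0, 5678.0

# log(a * b) = log(a) + log(b): to multiply two numbers, add their
# logarithms and take the antilogarithm, exactly as log tables and
# slide rules once did.
log_sum = math.log10(a) + math.log10(b)
product_via_logs = 10 ** log_sum

print(product_via_logs)  # ≈ 7006652 (equals a * b up to floating-point rounding)
print(a * b)             # 7006652.0
```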

### Calculus:

The definition of the derivative in calculus is f'(x) = lim (h → 0) of [f(x + h) − f(x)] / h. The derivative measures the rate at which a quantity is changing. For example, we can think of velocity, or speed, as being the derivative of position – if you are walking at 3 miles per hour, then every hour, you have changed your position by 3 miles.

Naturally, much of science is interested in understanding how things change, and the derivative and the integral – the other foundation of calculus – sit at the heart of how mathematicians and scientists understand change.
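As a small illustration, the limit definition of the derivative can be approximated numerically; this Python sketch applies it to the walking example above (the step size h is an arbitrary small number):

```python
def derivative(f, x, h=1e-6):
    """Finite-difference approximation of f'(x) from the limit definition:
    f'(x) = lim as h -> 0 of (f(x + h) - f(x)) / h."""
    return (f(x + h) - f(x)) / h

def position(t):
    """Position of a walker moving at a constant 3 miles per hour: p(t) = 3t."""
    return 3 * t

print(derivative(position, 2.0))  # ≈ 3.0, the walker's speed in miles per hour
```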

### Maxwell’s Equations:

James Clerk Maxwell’s equations are a set of four fundamental equations that describe the behavior of, and the relationship between, the electric field (E) and the magnetic field (H).

First published between 1861 and 1862, they combine the electric and magnetic fields into a set of four equations that define the key mathematics behind radio waves of all kinds, known to scientists and engineers as electromagnetic radiation.

However, modern physics relies on a quantum-mechanical explanation of electromagnetism, and it is now clear that these elegant equations are an approximation that works well on human scales.

### Albert Einstein’s Theory Of Relativity:

Einstein’s theory of relativity usually covers two interrelated theories: special relativity and general relativity. Special relativity introduced ideas like the speed of light being a universal speed limit and the passage of time differing for observers moving at different speeds.

General relativity describes gravity as a curving and folding of space and time themselves and was the first major change to our understanding of gravity since Newton’s law. General relativity is essential to our understanding of the origins, structure, and ultimate fate of the universe.

The classic equation E = mc^2 states that mass and energy are equivalent to each other.
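As a quick back-of-the-envelope illustration in Python (the rounded value of c here is an approximation):

```python
c = 2.998e8  # speed of light in m/s (approximate)

def rest_energy(mass_kg):
    """Energy equivalent in joules of a given mass, via E = mc^2."""
    return mass_kg * c**2

# One kilogram of mass is equivalent to roughly 9 x 10^16 joules
print(f"{rest_energy(1.0):.3e} J")  # ≈ 8.988e+16 J
```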

### The Second Law Of Thermodynamics:

Rudolf Clausius’ second law of thermodynamics states that “The total entropy can never decrease over time for an isolated system, that is, a system in which neither energy nor matter can enter or leave.” The total entropy can remain constant in ideal cases where the system is in a steady state (equilibrium) or is undergoing a reversible process. In all other real cases, the total entropy always increases and the process is irreversible.

The second law of thermodynamics is one of the few cases in physics where the direction of time matters: entropy distinguishes the past from the future.

### Chaos Theory:

Chaos theory is a branch of mathematics focused on the behavior of dynamical systems that are highly sensitive to initial conditions.

In other words, it shows how small changes can lead to consequences of much greater scale. It is used to gain greater mathematical insight into weather predictions, and into unstable systems such as turbulence in fluid flows, instability in finance and economic systems, and so on.
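This sensitivity to initial conditions can be demonstrated with the logistic map, a standard textbook example from chaos theory (the map and the starting values here are illustrative assumptions, not from the article):

```python
# Logistic map x -> r * x * (1 - x) in its chaotic regime (r = 4).
# Two starting points differing by one part in a billion quickly diverge.
r = 4.0

def iterate(x, steps):
    """Apply the logistic map to x the given number of times."""
    for _ in range(steps):
        x = r * x * (1 - x)
    return x

a = iterate(0.2, 50)
b = iterate(0.2 + 1e-9, 50)

# After 50 steps the microscopic initial difference has been amplified enormously
print(abs(a - b))
```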

### Schrödinger Equation:

Developed by Austrian physicist Erwin Schrödinger in 1926, the equation was a significant landmark in the development of quantum mechanics. It is a type of differential equation known as a wave equation, which serves as a mathematical model of the movement of waves.

It governs the behavior of atoms and subatomic particles in quantum mechanics. Today, all of our semiconductors (transistors, integrated circuits, Intel CPU chips, etc.) depend on the science of quantum mechanics, which would be all but impossible to understand without Schrödinger’s equation.

### Information Theory:

The equation given here is for Shannon information entropy. As with the thermodynamic entropy given above, this is a measure of disorder.

In this case, it measures the information content of a message: a book, a JPEG picture sent over the internet, or anything else that can be represented symbolically. The Shannon entropy of a message represents a lower bound on how much that message can be compressed without losing some of its content.

Shannon’s entropy measure launched the mathematical study of information. His results are central to how we communicate over networks today.
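As a rough sketch, the Shannon entropy of a string of symbols can be computed straight from the definition H = -Σ p(x) log2 p(x) (the sample strings are arbitrary):

```python
import math
from collections import Counter

def shannon_entropy(message):
    """Shannon entropy of a message, in bits per symbol."""
    counts = Counter(message)
    n = len(message)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

print(shannon_entropy("abab"))  # 1.0 -- two equally likely symbols: one bit each
print(shannon_entropy("abcd"))  # 2.0 -- four equally likely symbols: two bits each
```

A string drawn from a larger or less predictable alphabet has higher entropy, which is exactly why an already-compressed JPEG barely shrinks when you zip it again.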