Wednesday, 29 May 2013

Hadron


In particle physics, a hadron is a composite particle made of quarks held together by the strong force (in the same way as atoms and molecules are held together by the electromagnetic force).
Hadrons are categorized into two families:
  • baryons, such as protons and neutrons, made of three quarks (the proton, for example, is $uud$ and the neutron $udd$)
  • mesons, such as pions, made of one quark and one antiquark (the $\pi^+$, for example, is $u\bar{d}$).
Other types of hadron may exist, such as tetraquarks (or, more generally, exotic mesons) and pentaquarks (exotic baryons), but no current evidence conclusively suggests their existence.
Of the hadrons, protons and neutrons bound within atomic nuclei are stable, while others are unstable under ordinary conditions; free neutrons decay with a mean lifetime of about 15 minutes. Experimentally, hadron physics is studied by colliding protons or nuclei of heavy elements such as lead, and detecting the debris in the produced particle showers.

Quantum chromodynamics


In theoretical physics, quantum chromodynamics (QCD) is a theory of the strong interaction (color force), a fundamental force describing the interactions between quarks and gluons which make up hadrons (such as the proton, neutron or pion). It is the study of the SU(3) (special unitary group) Yang–Mills theory of color-charged fermions (the quarks). QCD is a quantum field theory of a special kind called a non-abelian gauge theory, consisting of a 'color field' mediated by a set of exchange particles (the gluons). The theory is an important part of the Standard Model of particle physics. A huge body of experimental evidence for QCD has been gathered over the years.
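In compact form, the QCD Lagrangian can be written as follows (the standard textbook expression, quoted here for reference):

\begin{displaymath}\mathcal{L}_{\mathrm{QCD}} = \bar{\psi}_i \left( i \gamma^\mu (D_\mu)_{ij} - m\,\delta_{ij} \right) \psi_j - \frac{1}{4} G^a_{\mu\nu} G_a^{\mu\nu},\end{displaymath}

where $\psi_i$ are the quark fields (with color index $i$), $D_\mu$ is the gauge-covariant derivative containing the gluon fields, and $G^a_{\mu\nu}$ is the gluon field-strength tensor.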
QCD enjoys two peculiar properties:
  • Confinement, which means that the force between quarks does not diminish as they are separated. Because of this, it would take an infinite amount of energy to separate two quarks; they are forever bound into hadrons such as the proton and the neutron. Although analytically unproven, confinement is widely believed to be true because it explains the consistent failure of free quark searches, and it is easy to demonstrate in lattice QCD.
  • Asymptotic freedom, which means that in very high-energy reactions, quarks and gluons interact very weakly. This prediction of QCD was first discovered in the early 1970s by David Politzer and by Frank Wilczek and David Gross.
There is no known phase-transition line separating these two regimes: confinement dominates at low energy scales but, as the energy increases, asymptotic freedom takes over.
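The crossover between the two regimes is captured by the running of the strong coupling; at one loop (a standard result, quoted here for orientation) it reads

\begin{displaymath}\alpha_s(Q^2) = \frac{12\pi}{(33 - 2 n_f)\,\ln\!\left(Q^2/\Lambda^2\right)},\end{displaymath}

where $n_f$ is the number of active quark flavours and $\Lambda$ is the QCD scale parameter: $\alpha_s$ falls off logarithmically as the momentum transfer $Q$ grows (asymptotic freedom), and grows large as $Q$ approaches $\Lambda$, signalling the onset of confinement.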

Classical String Theory

Before we consider free strings, let us first take a look at the description of a free massless relativistic point particle:
\begin{displaymath}S=\int d\tau \frac{dx^\mu}{d\tau}\frac{dx^\nu}{d\tau}\eta_{\mu\nu}, \end{displaymath}(3.1)

where $\tau$ is the parameter along the worldline of the particle, and $x^{\mu}$ maps this worldline into target space, in this case Minkowski space. Variation of this action gives us the equation of motion of a free massless particle:

\begin{displaymath}\frac{d^2 x^\mu}{d\tau^2} =0 \end{displaymath}
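To see this, vary $x^\mu \to x^\mu + \delta x^\mu$ in (3.1) and integrate by parts, with $\delta x^\mu$ vanishing at the endpoints:

\begin{displaymath}\delta S = 2\int d\tau\, \eta_{\mu\nu}\,\frac{dx^\mu}{d\tau}\frac{d(\delta x^\nu)}{d\tau} = -2\int d\tau\, \eta_{\mu\nu}\,\frac{d^2 x^\mu}{d\tau^2}\,\delta x^\nu .\end{displaymath}

Since $\delta x^\nu$ is arbitrary, its coefficient must vanish, which is exactly the equation of motion above.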


Strings are the one-dimensional generalisation of particles, so they sweep out a two-dimensional worldsheet in space-time. The dynamics of the string is described by its coordinate $X^\mu$, which is parametrized by $\{\tau,\sigma\}$, where $\tau$ is the time-evolution parameter and $\sigma$ the parameter along the length of the string.
The action describing a free bosonic string moving in flat space then becomes:
\begin{displaymath}
S=\frac{1}{2\alpha'} \int d\tau d\sigma \sqrt{-h} h^{ab} \partial _a X^\mu \partial _b X^\nu \eta_{\mu\nu},
\end{displaymath}(3.2)

where $h^{ab}$ is the metric on the worldsheet. The dimensionful parameter $\alpha'$ is the square of the characteristic length of the string (equivalently, the inverse string tension), and it makes the integrand dimensionless.
This string action has the following symmetries: it is invariant under reparametrizations of the worldsheet (as it must be):

\begin{displaymath}\tau , \sigma \to \xi^\tau(\tau,\sigma),\xi^\sigma(\tau,\sigma) \end{displaymath}


and it is invariant under conformal rescaling of the internal metric:

\begin{displaymath}h^{ab} \to \lambda (\tau,\sigma) h^{ab}, \end{displaymath}


this is called Weyl invariance.
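Using these two symmetries one can (locally) gauge-fix the worldsheet metric to the flat form $h_{ab} = \eta_{ab}$; this conformal-gauge reduction is standard, and we only sketch it here. The action (3.2) then becomes

\begin{displaymath}S = \frac{1}{2\alpha'} \int d\tau\, d\sigma\, \eta^{ab}\,\partial_a X^\mu\, \partial_b X^\nu\, \eta_{\mu\nu},\end{displaymath}

and the equation of motion for $X^\mu$ is the free two-dimensional wave equation, $(\partial_\tau^2 - \partial_\sigma^2) X^\mu = 0$, supplemented by the constraints obtained from varying $h^{ab}$ before gauge fixing.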
We can generalise this to a string moving in curved space-time. The action for a bosonic string propagating in a gravitational background then becomes:
\begin{displaymath}
S = \frac{1}{2\alpha'} \int d\tau d\sigma \sqrt{-h}\, h^{ab}\, \partial _a X^\mu\, \partial _b X^\nu\, g_{\mu\nu}(X).
\end{displaymath}(3.3)

This bosonic string is the simplest example of string theory, but it is not really a realistic theory for the description of physical phenomena. For instance, its spectrum of particles includes a tachyon and no fermions. But there are more complicated string theories that have more realistic features. The supersymmetric version of string theory, superstrings, for instance solves the tachyon problem. There are now many different consistent string theories known, all with different worldsheet symmetries or different worldsheet topologies. In addition, we can change the background in which the string propagates.
For instance, the action of a bosonic string moving in a general background is described by the generalised sigma model:
\begin{displaymath}
S = \frac{1}{2\alpha'} \int d\tau d\sigma \sqrt{-h} \left\{ \left( h^{ab} g_{\mu\nu}(X) + \epsilon^{ab} B_{\mu\nu}(X) \right) \partial _a X^{\mu} \partial _b X^{\nu} + \alpha' R^{(2)} \Phi(X) \right\},
\end{displaymath}(3.4)

where the background fields $g_{\mu\nu}$, $B_{\mu\nu}$ and $\Phi$ are the metric, the anti-symmetric tensor field and the dilaton field respectively, $\epsilon^{ab}$ is the antisymmetric tensor on the worldsheet, and $R^{(2)}$ is the curvature scalar of the two-dimensional worldsheet.

Tuesday, 28 May 2013

Mathematical formulations of quantum mechanics

In the formalism of quantum mechanics, the state of a system at a given time is described by a complex wave function, also referred to as a state vector in a complex vector space. This abstract mathematical object allows for the calculation of probabilities of outcomes of concrete experiments. For example, it allows one to compute the probability of finding an electron in a particular region around the nucleus at a particular time. Contrary to classical mechanics, one can never make simultaneous predictions of conjugate variables, such as position and momentum, to arbitrary accuracy. For instance, electrons may be considered (to a certain probability) to be located somewhere within a given region of space, but with their exact positions unknown. Contours of constant probability, often referred to as "clouds", may be drawn around the nucleus of an atom to conceptualize where the electron might be located with the most probability. Heisenberg's uncertainty principle quantifies the inability to precisely locate the particle given its conjugate momentum.
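Quantitatively, the position-momentum uncertainty relation takes the standard form

\begin{displaymath}\Delta x \, \Delta p \geq \frac{\hbar}{2},\end{displaymath}

where $\Delta x$ and $\Delta p$ are the standard deviations of position and momentum in a given state, and $\hbar$ is the reduced Planck constant.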
According to one interpretation, as the result of a measurement the wave function containing the probability information for a system collapses from a given initial state to a particular eigenstate. The possible results of a measurement are the eigenvalues of the operator representing the observable — which explains the choice of Hermitian operators, for which all the eigenvalues are real. The probability distribution of an observable in a given state can be found by computing the spectral decomposition of the corresponding operator. Heisenberg's uncertainty principle is represented by the statement that the operators corresponding to certain observables do not commute.
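In symbols (a standard statement of the Born rule, included for illustration): if an observable $A$ has eigenstates $|a_i\rangle$ with eigenvalues $a_i$, the probability that a measurement on the state $|\psi\rangle$ yields $a_i$ is

\begin{displaymath}P(a_i) = \left| \langle a_i | \psi \rangle \right|^2 .\end{displaymath}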
The probabilistic nature of quantum mechanics thus stems from the act of measurement. This is one of the most difficult aspects of quantum systems to understand. It was the central topic in the famous Bohr-Einstein debates, in which the two scientists attempted to clarify these fundamental principles by way of thought experiments. In the decades after the formulation of quantum mechanics, the question of what constitutes a "measurement" has been extensively studied. Newer interpretations of quantum mechanics have been formulated that do away with the concept of "wavefunction collapse" (see, for example, the relative state interpretation). The basic idea is that when a quantum system interacts with a measuring apparatus, their respective wavefunctions become entangled, so that the original quantum system ceases to exist as an independent entity. For details, see the article on measurement in quantum mechanics.
Generally, quantum mechanics does not assign definite values. Instead, it makes a prediction using a probability distribution; that is, it describes the probability of obtaining the possible outcomes from measuring an observable. For the electron in an atom, the resulting "probability cloud" picture is approximate but more accurate than the Bohr model: the electron's location is described by a probability distribution derived from the wave function, the probability of finding the electron at a given point being the squared modulus of the complex amplitude there. Naturally, these probabilities will depend on the quantum state at the "instant" of the measurement. Hence, uncertainty is involved in the value. There are, however, certain states that are associated with a definite value of a particular observable.
In the everyday world, it is natural and intuitive to think of everything (every observable) as being in an eigenstate. Everything appears to have a definite position, a definite momentum, a definite energy, and a definite time of occurrence. However, quantum mechanics does not pinpoint the exact values of a particle's position and momentum (since they are conjugate pairs) or its energy and time (since they too are conjugate pairs); rather, it provides only a range of probabilities for where that particle might be found and what momentum it might have. Therefore, it is helpful to use different words to describe states having uncertain values and states having definite values (eigenstates). Usually, a system will not be in an eigenstate of the observable we are interested in. However, if one measures the observable, the wavefunction will instantaneously be an eigenstate (or "generalized" eigenstate) of that observable. This process is known as wavefunction collapse, a controversial and much-debated process that involves expanding the system under study to include the measurement device. If one knows the wave function at the instant before the measurement, one can compute the probability of the wavefunction collapsing into each of the possible eigenstates. For example, a free particle will usually have a wavefunction that is a wave packet centered around some mean position x0 (neither an eigenstate of position nor of momentum). When one measures the position of the particle, it is impossible to predict with certainty the result. It is probable, but not certain, that it will be near x0, where the amplitude of the wave function is large. After the measurement is performed, having obtained some result x, the wave function collapses into a position eigenstate centered at x.
The time evolution of a quantum state is described by the Schrödinger equation, in which the Hamiltonian (the operator corresponding to the total energy of the system) generates the time evolution. The time evolution of wave functions is deterministic in the sense that, given a wavefunction at an initial time, it makes a definite prediction of what the wavefunction will be at any later time.
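Explicitly, the time-dependent Schrödinger equation (in its standard form) reads

\begin{displaymath}i\hbar \frac{\partial}{\partial t} \Psi(t) = \hat{H}\, \Psi(t),\end{displaymath}

where $\hat{H}$ is the Hamiltonian operator; because the equation is first order in time, the wavefunction at one instant determines it at all later times.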
During a measurement, on the other hand, the change of the initial wavefunction into another, later wavefunction is not deterministic; it is unpredictable (i.e., random).
Wave functions change as time progresses. The Schrödinger equation describes how wavefunctions change in time, playing a role similar to Newton's second law in classical mechanics. The Schrödinger equation, applied to the aforementioned example of the free particle, predicts that the center of a wave packet will move through space at a constant velocity (like a classical particle with no forces acting on it). However, the wave packet will also spread out as time progresses, which means that the position becomes more uncertain with time. This also has the effect of turning a position eigenstate (which can be thought of as an infinitely sharp wave packet) into a broadened wave packet that no longer represents a (definite, certain) position eigenstate.       
Quantum mechanics

Quantum mechanics (QM – also known as quantum physics, or quantum theory) is a branch of physics which deals with physical phenomena at microscopic scales, where the action is on the order of the Planck constant. Quantum mechanics departs from classical mechanics primarily in the quantum realm of atomic and subatomic length scales. Quantum mechanics provides a mathematical description of much of the dual particle-like and wave-like behavior and interactions of energy and matter.
In advanced topics of quantum mechanics, some of these behaviors are macroscopic and emerge only at extreme (i.e., very low or very high) energies or temperatures. The name quantum mechanics derives from the observation that some physical quantities can change only in discrete amounts (Latin quanta), and not in a continuous (cf. analog) way. For example, the angular momentum of an electron bound to an atom or molecule is quantized. In the context of quantum mechanics, the wave–particle duality of energy and matter and the uncertainty principle provide a unified view of the behavior of photons, electrons, and other atomic-scale objects.
The mathematical formulations of quantum mechanics are abstract. A mathematical function known as the wavefunction provides information about the probability amplitude of position, momentum, and other physical properties of a particle. Mathematical manipulations of the wavefunction usually involve the bra-ket notation, which requires an understanding of complex numbers and linear functionals. In many simple systems the object is modelled as a quantum harmonic oscillator, and the mathematics is akin to that describing acoustic resonance. Many of the results of quantum mechanics are not easily visualized in terms of classical mechanics; for instance, the ground state in a quantum mechanical model is a non-zero energy state that is the lowest permitted energy state of a system, as opposed to a more "traditional" system that is thought of as simply being at rest, with zero kinetic energy. Instead of a traditional static, unchanging zero state, quantum mechanics allows for far more dynamic, chaotic possibilities, according to John Wheeler.
The earliest versions of quantum mechanics were formulated in the first decade of the 20th century. At around the same time, the atomic theory and the corpuscular theory of light (as updated by Einstein) first came to be widely accepted as scientific fact; these latter theories can be viewed as quantum theories of matter and electromagnetic radiation, respectively. Early quantum theory was significantly reformulated in the mid-1920s by Werner Heisenberg, Max Born and Pascual Jordan, who created matrix mechanics; Louis de Broglie and Erwin Schrödinger (wave mechanics); and Wolfgang Pauli and Satyendra Nath Bose (statistics of subatomic particles). And the Copenhagen interpretation of Niels Bohr became widely accepted. By 1930, quantum mechanics had been further unified and formalized by the work of David Hilbert, Paul Dirac and John von Neumann, with a greater emphasis placed on measurement in quantum mechanics, the statistical nature of our knowledge of reality, and philosophical speculation about the role of the observer. Quantum mechanics has since branched out into almost every aspect of 20th century physics and other disciplines, such as quantum chemistry, quantum electronics, quantum optics, and quantum information science. Much 19th century physics has been re-evaluated as the "classical limit" of quantum mechanics, and its more advanced developments have been recast in terms of quantum field theory, string theory, and speculative quantum gravity theories.

Scientific inquiry into the wave nature of light goes back to the 17th and 18th centuries, when scientists such as Robert Hooke, Christiaan Huygens and Leonhard Euler proposed a wave theory of light based on experimental observations. In 1803, Thomas Young, an English polymath, performed the famous double-slit experiment that he later described in a paper entitled "On the nature of light and colours". This experiment played a major role in the general acceptance of the wave theory of light.
The discovery of cathode rays by Michael Faraday in 1838 was followed by the 1859 statement of the black-body radiation problem by Gustav Kirchhoff, the 1877 suggestion by Ludwig Boltzmann that the energy states of a physical system can be discrete, and the 1900 quantum hypothesis of Max Planck. Planck's hypothesis that energy is radiated and absorbed in discrete "quanta" (or "energy elements") precisely matched the observed patterns of black-body radiation.
In 1896, Wilhelm Wien empirically determined a distribution law of black-body radiation, known as Wien's law in his honor. Ludwig Boltzmann independently arrived at this result by considerations of Maxwell's equations. However, it was valid only at high frequencies, and underestimated the radiance at low frequencies. Later, Max Planck corrected this model using Boltzmann's statistical interpretation of thermodynamics and proposed what is now called Planck's law, which led to the development of quantum mechanics.
Among the first to study quantum phenomena in nature were Arthur Compton, C. V. Raman, and Pieter Zeeman, each of whom has a quantum effect named after him. Robert A. Millikan studied the photoelectric effect experimentally, and Albert Einstein developed a theory for it. At the same time, Niels Bohr developed his theory of the atomic structure, which was later confirmed by the experiments of Henry Moseley. In 1913, Peter Debye extended Niels Bohr's theory of atomic structure, introducing elliptical orbits, a concept also introduced by Arnold Sommerfeld. This phase is known as the old quantum theory.
According to Planck, each energy element E is proportional to its frequency $\nu$:
\begin{displaymath}E = h \nu ,\end{displaymath}

where h is Planck's constant. Planck (cautiously) insisted that this was simply an aspect of the processes of absorption and emission of radiation and had nothing to do with the physical reality of the radiation itself. In fact, he considered his quantum hypothesis a mathematical trick to get the right answer rather than a sizeable discovery. However, in 1905 Albert Einstein interpreted Planck's quantum hypothesis realistically and used it to explain the photoelectric effect, in which shining light on certain materials can eject electrons from the material.
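As a quick numerical illustration (values rounded; the frequency is a representative figure for green light): a photon with $\nu \approx 5.5 \times 10^{14}\,\mathrm{Hz}$ carries an energy of

\begin{displaymath}E = h\nu \approx (6.63 \times 10^{-34}\,\mathrm{J\,s}) \times (5.5 \times 10^{14}\,\mathrm{Hz}) \approx 3.6 \times 10^{-19}\,\mathrm{J} \approx 2.3\,\mathrm{eV}.\end{displaymath}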
The foundations of quantum mechanics were established during the first half of the 20th century by Max Planck, Niels Bohr, Werner Heisenberg, Louis de Broglie, Arthur Compton, Albert Einstein, Erwin Schrödinger, Max Born, John von Neumann, Paul Dirac, Enrico Fermi, Wolfgang Pauli, Max von Laue, Freeman Dyson, David Hilbert, Wilhelm Wien, Satyendra Nath Bose, Arnold Sommerfeld and others. In the mid-1920s, developments in quantum mechanics led to its becoming the standard formulation for atomic physics. In the summer of 1925, Bohr and Heisenberg published results that closed the "old quantum theory". Out of deference to their particle-like behavior in certain processes and measurements, light quanta came to be called photons (1926). From Einstein's simple postulation was born a flurry of debating, theorizing, and testing. Thus the entire field of quantum physics emerged, leading to its wider acceptance at the Fifth Solvay Conference in 1927.
The other exemplar that led to quantum mechanics was the study of electromagnetic waves, such as visible and ultraviolet light. When it was found in 1900 by Max Planck that the energy of waves could be described as consisting of small packets or "quanta", Albert Einstein further developed this idea to show that an electromagnetic wave such as light could also be described as a particle (later called the photon) with a discrete quantum of energy that was dependent on its frequency. Indeed, Einstein was able to use the photon theory of light to explain the photoelectric effect, for which he won the Nobel Prize in 1921. This led to a theory of unity between subatomic particles and electromagnetic waves, called wave–particle duality, in which particles and waves were neither one nor the other, but had certain properties of both.
While quantum mechanics traditionally described the world of the very small, it is also needed to explain certain recently investigated macroscopic systems such as superconductors, superfluids, and larger organic molecules.
The discovery that particles are discrete packets of energy with wave-like properties led to the branch of physics dealing with atomic and sub-atomic systems which is today called quantum mechanics. It is the underlying mathematical framework of many fields of physics and chemistry, including condensed matter physics, solid-state physics, atomic physics, molecular physics, computational physics, computational chemistry, quantum chemistry, particle physics, nuclear chemistry, and nuclear physics. Some fundamental aspects of the theory are still actively studied.
Quantum mechanics is essential to understanding the behavior of systems at atomic length scales and smaller. If classical mechanics truly governed the workings of an atom, electrons would really 'orbit' the nucleus. Since bodies in circular motion accelerate, the electrons would continuously emit electromagnetic radiation, lose energy, and spiral into the nucleus. This clearly contradicts the existence of stable atoms. However, in the natural world electrons normally remain in an uncertain, non-deterministic, "smeared", probabilistic wave–particle wavefunction orbital path around (or through) the nucleus, defying the traditional assumptions of classical mechanics and electromagnetism.
Quantum mechanics was initially developed to provide a better explanation and description of the atom, especially the differences in the spectra of light emitted by different isotopes of the same element, as well as subatomic particles. In short, the quantum-mechanical atomic model has succeeded spectacularly in the realm where classical mechanics and electromagnetism falter.
Broadly speaking, quantum mechanics incorporates four classes of phenomena for which classical physics cannot account:
  • The quantization of certain physical properties
  • Wave–particle duality
  • The uncertainty principle
  • Quantum entanglement.


Metaphysics
Metaphysics is a traditional branch of philosophy concerned with explaining the fundamental nature of being and the world, although the term is not easily defined. Traditionally, metaphysics attempts to answer two basic questions in the broadest possible terms:
1.    What is there?
2.    What is it like?
A person who studies metaphysics is called a metaphysicist or a metaphysician. The metaphysician attempts to clarify the fundamental notions by which people understand the world, e.g., existence, objects and their properties, space and time, cause and effect, and possibility. A central branch of metaphysics is ontology, the investigation into the basic categories of being and how they relate to each other. Another central branch of metaphysics is cosmology, the study of the totality of all phenomena within the universe.

Copernican principle

In physical cosmology, the Copernican principle, named after Nicolaus Copernicus, states that the Earth is not in a central, specially favored position. More recently, the principle has been generalized to the relativistic concept that humans are not privileged observers of the universe. In this sense, it is equivalent to the mediocrity principle, with important implications for the philosophy of science.
Since the 1990s the term has been used (interchangeably with "the Copernicus method") for J. Richard Gott's Bayesian-inference-based prediction of duration of ongoing events, a generalized version of the Doomsday argument.

Origin and its implications

Michael Rowan-Robinson emphasizes the importance of the Copernican principle: "It is evident that in the post-Copernican era of human history, no well-informed and rational person can imagine that the Earth occupies a unique position in the universe."
Hermann Bondi named the principle after Copernicus in the mid-20th century, although the principle itself dates back to the 16th-17th century paradigm shift away from the Ptolemaic system, which placed Earth at the center of the Universe. Copernicus demonstrated that the motion of the planets can be explained without the assumption that Earth is centrally located and stationary. He argued that the apparent retrograde motion of the planets is an illusion caused by Earth's movement around the Sun, which the Copernican model placed at the centre of the Universe. Copernicus himself was mainly motivated by technical dissatisfaction with the earlier system and not by support for any mediocrity principle. In fact, although the Copernican heliocentric model is often described as "demoting" Earth from the central role it had in the Ptolemaic geocentric model, neither Copernicus nor other 15th- and 16th-century scientists and philosophers viewed it as such.
In cosmology, if one assumes the Copernican principle and observes that the universe appears isotropic, or the same in all directions, from our vantage point on Earth, then one can infer that the Universe is generally homogeneous, or the same everywhere (at any given time), and is also isotropic about any given point. These two conditions make up the cosmological principle. In practice, astronomers observe that the Universe has heterogeneous or non-uniform structures up to the scale of galactic superclusters, filaments and great voids. It becomes more and more homogeneous and isotropic when observed on larger and larger scales, with little detectable structure on scales of more than about 200 million parsecs. However, on scales comparable to the radius of the observable universe, we see systematic changes with distance from the Earth. For instance, distant galaxies contain more young stars and are less clustered, and quasars appear more numerous. While this might suggest that the Earth is at the center of the Universe, the Copernican principle requires us to interpret it as evidence for the evolution of the Universe with time: this distant light has taken most of the age of the Universe to reach us, and shows us the Universe when it was young. The most distant light of all, the cosmic microwave background radiation, is isotropic to at least one part in a thousand.
Modern mathematical cosmology is based on the assumption that the Cosmological principle is almost, but not exactly, true on the largest scales. The Copernican principle represents the irreducible philosophical assumption needed to justify this, when combined with the observations.
Bondi and Thomas Gold used the Copernican principle to argue for the perfect cosmological principle which maintains that the universe is also homogeneous in time, and is the basis for the steady-state cosmology. However, this strongly conflicts with the evidence for cosmological evolution mentioned earlier: the Universe has progressed from extremely different conditions at the Big Bang, and will continue to progress toward extremely different conditions, particularly under the rising influence of dark energy, apparently toward the Big Freeze or Big Rip.

Cosmology


Cosmology is the study of the origins and eventual fate of the universe. Physical cosmology is the scholarly and scientific study of the origin, evolution, structure, dynamics, and ultimate fate of the universe, as well as the natural laws that keep it in order.
A branch of astronomy, it is the study of the largest-scale structures and dynamics of the Universe and is concerned with fundamental questions about its formation and evolution. For most of human history, it was a branch of metaphysics and religion. Cosmology as a science originated with the Copernican principle, which implies that celestial bodies obey identical physical laws to those on Earth, and Newtonian mechanics, which first allowed us to understand those laws.
Physical cosmology, as it is now understood, began with the 20th-century development of Albert Einstein's general theory of relativity, and better astronomical observations of extremely distant objects. These advances made it possible to speculate about the origin of the Universe, and allowed scientists to establish the Big Bang theory as the leading cosmological model. Some researchers still advocate a handful of alternative cosmologies; however, cosmologists generally agree that the Big Bang theory best explains observations.
Cosmology draws heavily on the work of many disparate areas of research in theoretical and applied physics. Areas relevant to cosmology include particle physics experiments and theory, astrophysics, general relativity, quantum mechanics, and plasma physics. Thus, cosmology unites the physics of the largest structures in the Universe with the physics of the smallest structures in the Universe.

Theory of Everything and philosophy


The philosophical implications of a physical ToE are frequently debated. For example, if philosophical physicalism is true, a physical ToE will coincide with a philosophical theory of everything.
The "system building" style of metaphysics attempts to answer all the important questions in a coherent way, providing a complete picture of the world. Plato and Aristotle could be said to have created early examples of comprehensive systems. In the early modern period (17th and 18th centuries), the system-building scope of philosophy is often linked to the rationalist method of philosophy, which is the technique of deducing the nature of the world by pure a priori reason. Examples from the early modern period include the Leibniz's Monadology Descarte's  Dualism, and Spinoza's Monism. Hegel's Absolute idealism and Whitehead's Process philosophywere later systems.

Other philosophers do not believe their techniques can aim so high. Some scientists think a more mathematical approach than philosophy is needed for a ToE; for instance, Stephen Hawking wrote in A Brief History of Time that even if we had a ToE, it would necessarily be a set of equations. He wrote, "What is it that breathes fire into the equations and makes a universe for them to describe?"

Spacetime


In physics, spacetime (also space-time or space-time continuum) is any mathematical model that combines space and time into a single continuum. Spacetime is usually interpreted with space as existing in three dimensions and time playing the role of a fourth dimension that is of a different sort from the spatial dimensions. From a Euclidean space perspective, the universe has three dimensions of space and one of time. By combining space and time into a single manifold, physicists have significantly simplified a large number of physical theories, as well as described in a more uniform way the workings of the universe at both the supergalactic and subatomic levels.
In non-relativistic classical mechanics, the use of Euclidean space instead of spacetime is appropriate, as time is treated as universal and constant, being independent of the state of motion of an observer. In relativistic contexts, time cannot be separated from the three dimensions of space, because the observed rate at which time passes for an object depends on the object's velocity relative to the observer and also on the strength of gravitational fields, which can slow the passage of time.
In cosmology, the concept of spacetime combines space and time to a single abstract universe. Mathematically it is a manifold consisting of "events" which are described by some type of coordinate system. Typically three spatial dimensions (length, width, height), and one temporal dimension (time) are required. Dimensions are independent components of a coordinate grid needed to locate a point in a certain defined "space". For example, on the globe the latitude and longitude are two independent coordinates which together uniquely determine a location. In spacetime, a coordinate grid that spans the 3+1 dimensions locates events (rather than just points in space), i.e. time is added as another dimension to the coordinate grid. This way the coordinates specify where and when events occur. However, the unified nature of spacetime and the freedom of coordinate choice it allows imply that to express the temporal coordinate in one coordinate system requires both temporal and spatial coordinates in another coordinate system. Unlike in normal spatial coordinates, there are still restrictions for how measurements can be made spatially and temporally (see Spacetime intervals). These restrictions correspond roughly to a particular mathematical model which differs from Euclidean space in its manifest symmetry.
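The restriction mentioned above is usually expressed through the invariant spacetime interval of special relativity (in one common sign convention):

\begin{displaymath}\Delta s^2 = -c^2 \Delta t^2 + \Delta x^2 + \Delta y^2 + \Delta z^2 ,\end{displaymath}

which all inertial observers agree on, even though they may disagree about the time separation $\Delta t$ and the spatial separations individually.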
Until the beginning of the 20th century, time was believed to be independent of motion, progressing at a fixed rate in all reference frames; however, later experiments revealed that time slows in a reference frame moving at high speed relative to another frame. Such slowing, called time dilation, is explained in special relativity theory. Many experiments have confirmed time dilation, such as the relativistic decay of muons from cosmic ray showers and the slowing of atomic clocks aboard a Space Shuttle relative to synchronized Earth-bound inertial clocks. The duration of time can therefore vary according to events and reference frames.
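Quantitatively, for a clock moving at speed $v$ relative to an observer, special relativity gives the standard result

\begin{displaymath}\Delta t = \frac{\Delta t_0}{\sqrt{1 - v^2/c^2}},\end{displaymath}

where $\Delta t_0$ is the proper time elapsed on the moving clock and $\Delta t$ the longer time measured by the observer. At $v = 0.99c$, for example, the dilation factor is about 7, which is roughly why cosmic-ray muons survive long enough to reach the ground.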
When dimensions are understood as mere components of the grid system, rather than physical attributes of space, it is easier to understand the alternate dimensional views as being simply the result of coordinate transformations.
The term spacetime has taken on a generalized meaning beyond treating spacetime events with the normal 3+1 dimensions. Other proposed spacetime theories include additional dimensions, normally spatial, but there exist some speculative theories that include additional temporal dimensions and even some that include dimensions that are neither temporal nor spatial (e.g. superspace). How many dimensions are needed to describe the universe is still an open question. Speculative theories such as string theory predict 10 or 26 dimensions (with M-theory predicting 11 dimensions: 10 spatial and 1 temporal), but the existence of more than four dimensions would only appear to make a difference at the subatomic level.

Loop quantum gravity 

Current research on loop quantum gravity may eventually play a fundamental role in a theory of everything (ToE), but that is not its primary aim. Loop quantum gravity also introduces a lower bound on the possible length scales.
There have been recent claims that loop quantum gravity may be able to reproduce features resembling the Standard Model. So far only the first generation of fermions (leptons and quarks), with correct parity properties, has been modelled by Sundance Bilson-Thompson, using preons constituted of braids of spacetime as the building blocks. However, there is no derivation of the Lagrangian that would describe the interactions of such particles, nor is it possible to show that such particles are fermions, nor that the gauge groups or interactions of the Standard Model are realized. Utilization of quantum computing concepts made it possible to demonstrate that the particles are able to survive quantum fluctuations.
This model leads to an interpretation of electric and color charge as topological quantities (electric as number and chirality of twists carried on the individual ribbons and color as variants of such twisting for fixed electric charge).

Bilson-Thompson's original paper suggested that the higher-generation fermions could be represented by more complicated braidings, although explicit constructions of these structures were not given. The electric charge, colour, and parity properties of such fermions would arise in the same way as for the first generation. The model was expressly generalized for an infinite number of generations and for the weak force bosons (but not for photons or gluons) in a 2008 paper by Bilson-Thompson, Hackett, Kauffman and Smolin.

String theory and M-theory

Since the 1990s, many physicists have believed that 11-dimensional M-theory, which is described in some limits by one of the five perturbative superstring theories, and in another by the maximally supersymmetric 11-dimensional supergravity, is the theory of everything. However, there is no widespread consensus on this issue.
A surprising property of string/M-theory is that extra dimensions are required for the theory's consistency. In this regard, string theory can be seen as building on the insights of the Kaluza-Klein theory, in which it was realized that applying general relativity to a five-dimensional universe (with one dimension small and curled up) looks from the four-dimensional perspective like the usual general relativity together with Maxwell's electrodynamics. This lent credence to the idea of unifying gauge and gravity interactions, and to extra dimensions, but did not address the detailed experimental requirements. Another important property of string theory is its supersymmetry, which together with extra dimensions are the two main proposals for resolving the hierarchy problem of the Standard Model, which is (roughly) the question of why gravity is so much weaker than any other force. The extra-dimensional solution involves allowing gravity to propagate into the other dimensions while keeping other forces confined to a four-dimensional spacetime, an idea that has been realized with explicit stringy mechanisms.
Research into string theory has been encouraged by a variety of theoretical and experimental factors. On the experimental side, the particle content of the Standard Model supplemented with neutrino masses fits into a spinor representation of SO(10), a subgroup of E8 that routinely emerges in string theory, such as in heterotic string theory or (sometimes equivalently) in F-theory. String theory has mechanisms that may explain why fermions come in three hierarchical generations, and explain the mixing rates between quark generations. On the theoretical side, it has begun to address some of the key questions in quantum gravity, such as resolving the black hole information paradox, counting the correct entropy of black holes and allowing for topology-changing processes. It has also led to many insights in pure mathematics and in ordinary, strongly-coupled gauge theory due to the gauge/string duality.
In the late 1990s, it was noted that one major hurdle in this endeavor is that the number of possible four-dimensional universes is incredibly large. The small, "curled up" extra dimensions can be compactified in an enormous number of different ways (one estimate is $10^{500}$), each of which leads to different properties for the low-energy particles and forces. This array of models is known as the string theory landscape.

One proposed solution is that many or all of these possibilities are realised in one or another of a huge number of universes, but that only a small number of them are habitable, and hence the fundamental constants of the universe are ultimately the result of the anthropic principle rather than dictated by theory. This has led to criticism of string theory, with some arguing that it cannot make useful (i.e., original, falsifiable, and verifiable) predictions and regarding it as a pseudoscience. Others disagree, and string theory remains an extremely active topic of investigation in theoretical physics.