How are earthquakes measured?



--------------------

** Richter magnitude scale **


The *Richter magnitude scale* (also *Richter scale*) assigns a magnitude
number to quantify the size of an earthquake. The Richter scale, developed
in the 1930s, is a base-10 logarithmic scale: it defines magnitude as the
logarithm of the ratio of the maximum amplitude of the seismic waves to a
fixed, arbitrarily small reference amplitude.
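
In symbols, a minimal sketch of the local-magnitude definition, where A is
the maximum trace amplitude recorded by the seismograph and A_0 is the
reference amplitude for the same epicentral distance:

    M_L = \log_{10} A - \log_{10} A_0 = \log_{10}(A / A_0)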

As measured with a seismometer, an earthquake that registers 5.0 on the
Richter scale has a shaking amplitude 10 times that of an earthquake that
registers 4.0, and thus corresponds to a release of energy about 31.6 times
that released by the smaller earthquake.^[1]
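
The factor of 31.6 follows from the empirical observation that radiated
seismic energy grows roughly as 10^{1.5 M}; a sketch of the arithmetic for
a one-unit difference in magnitude:

    E_2 / E_1 = 10^{1.5 (M_2 - M_1)} = 10^{1.5 \times 1.0} \approx 31.6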

The Richter scale was succeeded in the 1970s by the moment magnitude scale
(MMS), which is now the scale used by the United States Geological Survey
to estimate magnitudes for all modern large earthquakes. However,
earthquake magnitudes are still sometimes incorrectly reported by the press
as "an earthquake of XX on the Richter scale", when the correct terminology
using the MMS is "a magnitude XX earthquake".^[2]
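
For reference (not defined in this excerpt), the moment magnitude M_w is
commonly computed from the seismic moment M_0; a sketch assuming the
Hanks-Kanamori (1979) form, with M_0 in dyne-centimetres:

    M_w = (2/3) \log_{10} M_0 - 10.7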

*Contents*

· 1 Development
· 2 Details
· 3 Richter magnitudes

· 3.1 Energy release equivalents

· 4 Magnitude empirical formulae
· 5 See also
· 6 References
· 7 External links

*Development*

Charles Francis Richter, c. 1970

In 1935, the seismologists Charles Francis Richter and Beno Gutenberg, of
the California Institute of Technology, developed the (future) Richter
magnitude scale, specifically for measuring earthquakes in a given area of
study in California, as recorded and measured with the Wood-Anderson
torsion seismograph. Originally, Richter reported values to the nearest
quarter of a unit, but the values later were reported with one decimal
place; the local magnitude scale compared the magnitudes of different
earthquakes.


Source: en.wikipedia.org/wiki/Richter_magnitude_scale
