I wrote a small Perl program to test things from this quake database, and I came across some interesting results. This is how I computed them. I was interested in what I deemed surface quakes, i.e. those occurring between 0 and 100 km of depth; in quakes of magnitude 6.5 to 10; and in data from the whole globe. I also wanted the statistics from 1970 up to the 16th of July 2001 (today). I grouped the quake statistics yearly, taking advantage of these properties:
date of quake (year),
magnitude of quake (mag)
(1) count of quakes per year, which I called cnt
(2) 10 ** mag, which I called exp
(3) mag ** 2, which I called pwr
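The grouping above can be sketched roughly as follows. This is a Python re-sketch of the computation (my actual program is in Perl), and the sample records are made up purely for illustration:

```python
from collections import defaultdict

# (year, depth_km, magnitude) -- hypothetical sample records,
# not real catalog data
quakes = [
    (1970, 33, 7.1),
    (1970, 90, 6.6),
    (1971, 15, 6.8),
    (2001, 10, 7.9),
    (2001, 60, 6.5),
]

stats = defaultdict(lambda: {"cnt": 0, "exp": 0.0, "pwr": 0.0})
for year, depth, mag in quakes:
    # surface quakes only: 0-100 km depth, magnitude 6.5-10
    if 0 <= depth <= 100 and 6.5 <= mag <= 10:
        s = stats[year]
        s["cnt"] += 1           # (1) count of quakes per year
        s["exp"] += 10 ** mag   # (2) sum of 10 ** mag
        s["pwr"] += mag ** 2    # (3) sum of mag ** 2
```

Each year ends up with its count and the two accumulated sums, which is all the later per-year statistics need.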
The purpose of 10 ** mag was to produce a sum and average of the quake magnitudes on a non-logarithmic scale, which I deemed more appropriate for following the trends of the quakes, both recent and past. The average magnitude for each year is then computed as log10(exp / cnt). This should be a more proper way of averaging a logarithmic property. This is the data I received from my Perl program (source code at the bottom). Notice that the year 2001 statistics are shooting through the roof. I do not know what this means in general terms, but I thought you might be interested in this.
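To illustrate why the log-scale average differs from a plain arithmetic mean of the magnitudes, here is a minimal example (again a Python sketch with made-up magnitudes, not my Perl code):

```python
import math

mags = [6.5, 7.0, 8.0]  # hypothetical magnitudes for one year
cnt = len(mags)
exp = sum(10 ** m for m in mags)

# plain arithmetic mean of the magnitudes themselves
arith_avg = sum(mags) / cnt

# log-scale mean: average the non-logarithmic quantities,
# then convert back, as log10(exp / cnt)
log_avg = math.log10(exp / cnt)
```

The log-scale average comes out higher than the arithmetic one, because the largest quake dominates the non-logarithmic sum; that is exactly why a strong year stands out so sharply in the exp column.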
It must be noted that periods of higher earthquake activity are not at all uncommon; the most recent similar peak was in the 1970s, as is shown. So it could be erroneous to judge that this period is particularly more severe than any other recorded peak. It could also be interesting to check the data from 1900 onwards, but I didn't do this. I must admit that the numbers for 2001 do look large, but more, somewhat milder quakes later in the year could do a lot to lessen the apparent strength of those numbers. Or they might increase them.
Offered by Antti.