GravityGuy
12-05-2008, 06:11 PM
Dear Authors,
I have been using the Delaunay triangulation method for close to 15 years now to generate a typical network of several hundred thousand triangles over a digital elevation file, data now available for many parts of the country thanks to satellite radar technology. For the purpose of calculating outer terrain corrections for gravity surveys, each station requires a unique triangulation net, and I typically run hundreds of stations in a single processing run. It used to take hours on older computers, but the new processors can do the same job in minutes, generating a net of this size in only a few seconds; in fact, it is faster than the rest of the gravity calculation on a per-station basis.
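For anyone who has not met the algorithm, the decisive test inside every Delaunay code is the incircle predicate: is a point inside the circumcircle of a triangle? What follows is only a bare-bones Fortran 77 illustration of that one test, not my production code and not the NR routine, and the names are my own invention; a naive double-precision evaluation like this is exactly where the robustness questions I raise further down come in.

      DOUBLE PRECISION FUNCTION INCIRC(AX,AY,BX,BY,CX,CY,DX,DY)
C     Sign convention: INCIRC > 0 when D lies strictly inside the
C     circumcircle of the counterclockwise triangle A,B,C.
      DOUBLE PRECISION AX,AY,BX,BY,CX,CY,DX,DY
      DOUBLE PRECISION P,Q,R,S,T,U
C     Translate so D sits at the origin, then expand the standard
C     3x3 "lifted" determinant.
      P = AX - DX
      Q = AY - DY
      R = BX - DX
      S = BY - DY
      T = CX - DX
      U = CY - DY
      INCIRC = (P*P + Q*Q)*(R*U - T*S)
     &       + (R*R + S*S)*(T*Q - P*U)
     &       + (T*T + U*U)*(P*S - R*Q)
      RETURN
      END

      PROGRAM TSTINC
C     Quick check: the origin lies inside the unit circle through
C     (1,0), (0,1), (-1,0), so the result should be positive (2.0).
      DOUBLE PRECISION INCIRC
      WRITE (*,*) INCIRC(1.0D0,0.0D0, 0.0D0,1.0D0,
     &                   -1.0D0,0.0D0, 0.0D0,0.0D0)
      END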
I am curious that in Chapter 21 on Computational Geometry no mention was made in the references to Donald Knuth's monograph, "Axioms and Hulls" (Springer-Verlag, 1992), some 100 pages long. Nor did I see a reference to Knuth's book "The Stanford GraphBase" (Addison-Wesley, 1993). I have not done a full literature search on the subject, but considering the depth to which Knuth has studied this topic, and the 72 references he includes in the bibliography of Axioms and Hulls, I would have expected to see one or both of these. However, his reference [36] is the same Guibas and Knuth paper as your [1]. I did not read that paper to see whether the entire algorithm was described.
Back in 1995 I managed to implement Knuth's linked-list data structure in Fortran 77! While the static array sizes are a bit of a nuisance, it still runs with amazing performance. Of course, Knuth is not one to take short-cuts anywhere, so, as you pointed out in your chapter, if you need to worry about points falling on boundaries or being exactly collinear, and need a way of generating exactly the same triangulation from the same set of points each time, then wading into the Knuth deep end may be a worthy alternative. As with anything else, advances in technology tend to change the direction of attack when it comes to choosing between integer and floating-point arithmetic: the performance limitations may no longer be the same, nor the amount of memory you have to work with, not that it is ever a good thing to waste.
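For the curious, the Fortran 77 trick is the old one of simulating pointers with parallel integer arrays. The fragment below illustrates only that idiom with a toy singly linked list; it is not Knuth's actual structure from Axioms and Hulls, and the fixed size MAXN is an arbitrary choice, the very static-array nuisance I mentioned.

      PROGRAM LNKLST
C     The pointer-free idiom: node I keeps its payload in VAL(I)
C     and the index of the next node in NXT(I); 0 marks the end of
C     a list.
      INTEGER MAXN
      PARAMETER (MAXN=1000)
      INTEGER NXT(MAXN), HEAD, FREE, I, P
      DOUBLE PRECISION VAL(MAXN)
C     Chain every slot into a free list.
      DO 10 I = 1, MAXN-1
         NXT(I) = I + 1
   10 CONTINUE
      NXT(MAXN) = 0
      FREE = 1
      HEAD = 0
C     Grab three nodes from the free list and push each onto the
C     front of the live list.
      DO 20 I = 1, 3
         P = FREE
         FREE = NXT(P)
         VAL(P) = 1.5D0*DBLE(I)
         NXT(P) = HEAD
         HEAD = P
   20 CONTINUE
C     Walk the list: prints 4.5, 3.0, 1.5.
      P = HEAD
   30 IF (P .NE. 0) THEN
         WRITE (*,*) VAL(P)
         P = NXT(P)
         GOTO 30
      END IF
      END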
For anyone generally interested in other computational geometry algorithms and code that is "literate", I do not hesitate to recommend the aforementioned The Stanford GraphBase: A Platform for Combinatorial Computing. And I believe the source code is still available by ftp from several mirrored sites; just search for it with the appropriate keywords.
I have not yet tried a head-to-head comparison of the NR implementation vs mine in ftn77. It would be interesting.