Resource Geology

A Resource Geologist is a geologist whose role focuses on the modelling, estimation, and reporting of Mineral Resources. The resource geologist produces 2-D or 3-D models of geology, structure, geometallurgy, or other properties important for mine planning. After modelling, the resource geologist uses geostatistics to perform numeric estimation or simulation of tonnages, grades, and other properties to determine the quantity and quality of material in a deposit. Lastly, a resource geologist is typically responsible for calculating and reporting the Mineral Resources of a deposit for internal or external reporting.

Resource geology requires detailed knowledge of exploration, mining, statistics & geostatistics, data management, economic geology, software, and various reporting codes such as JORC, CIM/NI 43-101, SAMREC, and SEC S-K1300. Often, a resource geologist oversees drilling and resource development programs to advance a site’s confidence in Mineral Resources. There is a great deal of overlap between resource, mine, and exploration geology.


Domaining and Stationarity

Simply put, get it right! If I had to pick the single item that Resource Geologists should spend the most time on, and that will make the greatest difference in a model, it's data domaining. This comes in a variety of forms: a robust geological model, a strong understanding of the data statistics (EDA), the art of lumping & splitting, and a sound geologic understanding of your deposit or area. I'm digging out an old article for this post that makes a quick read but hopefully stresses the point: if you don't get your domains right, no amount of fancy, mind-numbing mathematics can make your model better.
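To make the EDA side of this concrete, here is a minimal sketch of the kind of per-domain summary worth running before trusting any domain boundary. The file and column names ("assays.csv", "domain", "au_gpt") are hypothetical, not from any particular project or software.

```python
# Minimal sketch of per-domain EDA; file and column names are illustrative.
import pandas as pd

assays = pd.read_csv("assays.csv")

# Summary statistics by domain. Large jumps in mean, CV, or maximum between
# adjacent domains (or drifting statistics within a single domain) are a
# flag that the domaining, not the estimation method, needs more work.
stats = assays.groupby("domain")["au_gpt"].agg(
    n="count", mean="mean", median="median", std="std", max="max"
)
stats["cv"] = stats["std"] / stats["mean"]
print(stats.round(3))
```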


Top Cutting or Capping

“To cap, or not to cap, that is the question: Whether 'tis nobler in the mind to suffer the slings and arrows of outrageous fortune, or to take arms against a sea of troubles”. High-grade samples are of course what we want in mining, but when they're outliers they become problematic. Capping or top cutting of extreme outlier data is therefore commonly used in resource estimation to generate a realistic estimate: it avoids overstating the “true” underlying mean grade, variance, and CV, or simply blowing unsupported high-grade into all blocks in the zip code and thus overstating a resource. Here are a few points to consider when capping:

1) Check whether the data is real or an error.
2) Check statistics before and after capping to assess the impact on the mean.
3) View the outliers spatially; are they clustered in one area or scattered about the ore body?
4) Calculate the extreme outlier upper threshold as Q3 + 3 × (interquartile range).
5) Consider running two estimates, with capping and without, to truly understand their impact on the final estimate and grade-tonne curve.

There are many ways of dealing with extreme outliers. When you do cap, perform the capping on composited samples to ensure a consistent sample volume (support).

Top cutting evaluation of raw gold grades
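As a rough illustration of points 2) and 4) above, here is a minimal sketch that computes the Q3 + 3 × IQR threshold and compares statistics before and after capping. The input file ("au_composites.txt") and grades are assumed for the example, not from any particular project.

```python
# Hedged sketch of the Q3 + 3*IQR extreme-outlier threshold and a
# before/after comparison; "au_composites.txt" is a hypothetical file of
# composited gold grades.
import numpy as np

grades = np.loadtxt("au_composites.txt")

q1, q3 = np.percentile(grades, [25, 75])
cap = q3 + 3.0 * (q3 - q1)            # extreme-outlier upper threshold
capped = np.minimum(grades, cap)      # top-cut everything above the cap

for label, g in (("raw", grades), ("capped", capped)):
    mean, cv = g.mean(), g.std(ddof=1) / g.mean()
    print(f"{label:>6}: n={g.size}  mean={mean:.3f}  CV={cv:.2f}  max={g.max():.3f}")
print(f"samples above the cap ({cap:.2f}): {(grades > cap).sum()}")
```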


Kriging Neighbourhood Analysis

On the subject of Quantitative Kriging Neighbourhood Analysis (QKNA) and conditional bias: at a high level, this is essentially a method of testing several search neighbourhood parameters to find an acceptable compromise in the proper selection of data. Multiple neighbourhoods are reviewed against estimation outputs such as slope of regression, sum of negative weights, weight of the mean, and kriging efficiency to "optimize" the search and reduce conditional bias. A detailed description is best presented by Vann, Jackson, and Bertoli (2003). On the flip side, Ed Isaaks and others have shown how KNA can increase estimation error in predicted grade-tonnes, and Boyle (2010) demonstrated that KNA overstates the benefits of one search over another and that domaining and sampling accuracy are far more important.
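For reference, two of the statistics named above can be computed directly from kriging outputs using their standard formulas. This is only a sketch: the block variance, kriging variance, and Lagrange multiplier would come from your estimation software for each candidate neighbourhood, and the numbers below are placeholders.

```python
# Hedged sketch of two KNA screening statistics using their standard
# formulas; input values are placeholders, not real kriging output.
def kriging_efficiency(block_var: float, kriging_var: float) -> float:
    """KE = (BV - KV) / BV; close to 1 is good, near or below 0 is poor."""
    return (block_var - kriging_var) / block_var

def slope_of_regression(block_var: float, kriging_var: float, lagrange: float) -> float:
    """Slope of the regression of true block grade on the kriged estimate:
    (BV - KV + |mu|) / (BV - KV + 2|mu|), where mu is the Lagrange multiplier."""
    a = block_var - kriging_var + abs(lagrange)
    return a / (a + abs(lagrange))

# Compare two candidate search neighbourhoods for the same block (dummy values)
for name, bv, kv, mu in [("tight search", 1.2, 0.9, -0.15),
                         ("wide search",  1.2, 0.6, -0.05)]:
    print(f"{name}: KE={kriging_efficiency(bv, kv):.2f}, "
          f"slope={slope_of_regression(bv, kv, mu):.2f}")
```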



Nugget

The nugget is a measure of the intrinsic variability of a grade variable plus sampling error. Determining the proportion of nugget effect is critical when you model spatial continuity. In mining, most grade variables have at least some nugget value. The relative nugget effect (proportion of nugget to total sill) is the same in all directions and is usually best determined from downhole variograms, simply because the samples are so close to each other. The nugget has a significant impact on kriging estimates: the kriged estimates determine the grade distribution or histogram, and the grade distribution determines total grade-tonnes at cut-offs. So yeah, it's important. The image below shows the same data set estimated with a high relative nugget effect and with a low nugget. Both have the same mean, but the shaded area under the curve represents over- or under-estimated tonnages due to the change in nugget. A low nugget will result in a smoother distribution. Long story short: pay attention to nugget values and understand how the final nugget value can impact variograms, grade distributions, and final grade-tonne curves.
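As a small numerical illustration of relative nugget effect, here is a sketch of the same single-structure spherical variogram model evaluated with a low and a high nugget. The sill, range, and lag values are made up for the example.

```python
# Hedged sketch: one spherical variogram model evaluated with a 10% and a
# 50% relative nugget effect. All parameter values are illustrative only.
import numpy as np

def spherical(h, nugget, partial_sill, rng):
    """Spherical model: total sill = nugget + partial_sill, range = rng."""
    h = np.asarray(h, dtype=float)
    hr = np.minimum(h / rng, 1.0)
    gamma = nugget + partial_sill * (1.5 * hr - 0.5 * hr ** 3)
    return np.where(h == 0.0, 0.0, gamma)   # gamma(0) = 0 by definition

lags = np.linspace(0.0, 120.0, 7)
low  = spherical(lags, nugget=0.1, partial_sill=0.9, rng=80.0)  # 10% relative nugget
high = spherical(lags, nugget=0.5, partial_sill=0.5, rng=80.0)  # 50% relative nugget
for h, g_lo, g_hi in zip(lags, low, high):
    print(f"h={h:5.0f}   low nugget={g_lo:.2f}   high nugget={g_hi:.2f}")
```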



 

Resource Geology, Estimation, and Geostatistical References (Affiliates):


Mining Geology HQ recommends the following references as premier guides for resource geology and geostatistics. Each link is an affiliate link meaning that Mining Geology HQ will earn a small percentage of the sale by providing this recommendation. We aim to provide recommendations only on products we have personally used and feel would be helpful to industry geologists:


Isaaks, E.H. and Srivastava, R.M., An Introduction to Applied Geostatistics - The classic "Geostats by Ed and Mo" should be on the shelf of any resource geologist. This textbook translates a sometimes complicated subject into simpler terms using many mining examples.



Armstrong, M., Basic Linear Geostatistics - A great reference which is more on the applied side than many geostatistical textbooks. Armstrong does a good job of explaining linear estimation concepts in an easy-to-understand manner.

Resource Geology, Estimation, and Geostatistical References (free):


Practical Geostatistics by Isobel Clark (1979). Dr. Clark has been kind enough to share the original Practical Geostatistics book. There is a newer edition, which I encourage everyone to check out by contacting her at kriging.com.

USGS Practical Primer to Geostatistics by Ricardo Olea (2009). This free reference provides a good high-level introduction to some of the important aspects of geostatistics.

The kriging oxymoron: Conditionally unbiased and accurate prediction (2nd Edition)
(2004) Isaaks, E.H., Proceedings of the 2004 Geostat Congress, Banff, Alberta. Provided courtesy of Ed Isaaks at isaaks.com.

Beyond Ordinary Kriging - An Overview of Non-linear Estimation
(1998) Vann, J., and Guibal, D., Proceedings from Beyond Ordinary Kriging Symposium, Perth, Western Australia. Provided courtesy of the Geostatistical Association of Australasia (GAA).

A Practitioners Implementation of Indicator Kriging
(1998) Glacken, I. and Blackney, P., Proceedings from Beyond Ordinary Kriging Symposium, Perth, Western Australia. Provided courtesy of the Geostatistical Association of Australasia (GAA). 

 

Resource Geology Software (free):


The following software packages are not endorsed by Mining Geology HQ but are free, open-source software.

GSLIB (64 bit) - The classic open-source software package used globally. If you're after a more visually appealing version, try the paid version of WinGsLib at gslib.com. This program accompanies the training package provided by Statios.

SGeMS - The Stanford Geostatistical Modeling Software (SGeMS) is an open-source computer package for geostatistics. It provides geostatistics practitioners with a user-friendly interface, an interactive 3-D visualization, and a wide selection of algorithms.

Geoscience ANALYST - Developed by Mira Geoscience, this free 3D visualization and communication software is for integrated, multi-disciplinary earth models and data. Check out the video here.

PairQA (32 and 64 bit) - Developed by Xstract Mining Consultants, PairQA calculates the Average Coefficient of Variation (CVAVR (%)) and the Reduced Major Axis Regression (RMA). The purpose is to define an unbiased measurement of error for duplicate samples. The software is based on published work on the Average Coefficient of Variation by Stanley and Lawie (2007) and Abzalov (2008), and for the Reduced Major Axis Regression by Sinclair and Blackwell (2002).
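For readers who want to reproduce those two statistics outside PairQA, here is a minimal sketch following the published formulas (Abzalov 2008 for the average CV; Sinclair and Blackwell 2002 for RMA). The array names are illustrative and this is not PairQA's own code.

```python
# Hedged sketch of the average coefficient of variation and the reduced
# major axis regression for original/duplicate assay pairs.
import numpy as np

def cv_avr(original, duplicate):
    """Average CV (%) across duplicate pairs, per Abzalov (2008):
    100 * sqrt( (2/N) * sum( (a-b)^2 / (a+b)^2 ) )."""
    a, b = np.asarray(original, float), np.asarray(duplicate, float)
    return 100.0 * np.sqrt(2.0 * np.mean((a - b) ** 2 / (a + b) ** 2))

def rma(original, duplicate):
    """Reduced major axis regression; returns (slope, intercept)."""
    a, b = np.asarray(original, float), np.asarray(duplicate, float)
    r = np.corrcoef(a, b)[0, 1]
    slope = np.sign(r) * b.std(ddof=1) / a.std(ddof=1)
    return slope, b.mean() - slope * a.mean()
```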

Google Refine 2.5 - A great little tool for cleaning up those messy drilling databases from the past 50 years.

PyGSLIB - An exciting open-source project combining Python with GSLIB. Check out the documentation and how to install here.

GSLIB Rotation View Tool - This handy little tool helps you work out the various directional schemes between Vulcan, Isatis, GSLIB, and others.