Compiling Geological Data
In mineral exploration, there are increasingly few places in Canada that have not had a boot set on the ground to collect samples (a.k.a. greenfields) and, as extraction technology improves, more companies are revisiting old mines (a.k.a. brownfields). Having the knowledge and organizational skills to compile historical data and merge it with current data is essential in mineral exploration. As with all data, your final interpretation is only as good as your data.
There are five fundamental characteristics of a good data set: consistency, relevance, readability, quality and adaptability.
As more information accumulates over the years, interpretations change, affecting the deposit model, rock types and geophysical interpretations. It is very important that older data be re-interpreted under the new schemes, including updating rock types to reflect current petrological findings. In addition, any changes should be noted within the data set to ensure that any aberrations can be easily traced back to the original data.
Of special note are geochemical (and biogeochemical) data, where variations between crews, assay methods, detection limits, quality and labs over the years can create artifacts within the data, so that some years may have lower gold values, for example, than others. This can cause an area to be overlooked, or focused upon unnecessarily, resulting in lost exploration time, money and investor confidence.
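One common source of such artifacts is a change in detection limit between campaigns. A minimal sketch of one way to harmonize censored (below-detection) results, substituting half the reporting limit so multi-year values are at least comparable; the years and limits shown here are hypothetical, not from any particular project:

```python
# Hypothetical Au detection limits (ppb) by sampling year.
DETECTION_LIMITS_PPB = {1995: 5.0, 2015: 0.5}

def level_assay(value_ppb, year):
    """Return a comparable Au value for the given year.

    Censored results (reported as below detection, here passed as None)
    and values under the year's limit are replaced with half the limit,
    a common substitution for censored geochemical data.
    """
    limit = DETECTION_LIMITS_PPB[year]
    if value_ppb is None or value_ppb < limit:
        return limit / 2.0
    return value_ppb

# A 1995 below-detection result becomes 2.5 ppb; a measured 2015
# value of 1.2 ppb is kept as-is.
old = level_assay(None, 1995)
new = level_assay(1.2, 2015)
```

Half-limit substitution is only the simplest approach; for serious statistical work, dedicated censored-data methods are preferable, but even this crude leveling makes year-to-year artifacts visible rather than hidden.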
Another aspect of data compilation is ensuring that the data you are using have significant bearing on, and are suitable for, your interpretations. Incomplete data, such as missing element values, or extraneous data, such as information collected in one year but not in any others, can significantly affect your interpretations if not dealt with appropriately, cause unnecessary complexity and make interpretation difficult.
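A quick screen for such gaps can be automated. A minimal sketch, assuming a compiled table held as a list of dicts (the sample IDs and element fields are hypothetical), that flags columns containing missing values so they can be handled deliberately rather than silently skewing the interpretation:

```python
# Hypothetical compiled rows; As was not analyzed in the second campaign.
rows = [
    {"id": "S-001", "Au_ppb": 12.0, "As_ppm": 4.1},
    {"id": "S-002", "Au_ppb": 3.5,  "As_ppm": None},
]

def incomplete_columns(rows):
    """Return sorted names of columns with any missing values."""
    cols = {key for row in rows for key in row}
    return sorted(c for c in cols if any(r.get(c) is None for r in rows))

# Flags "As_ppm" for a decision: drop it, or keep it with an explicit note.
gaps = incomplete_columns(rows)
```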
Geologists often use codes to describe rock types, and typically each geologist will use a different set of codes. Each code should be described in full within the compiled data set so that all users can read the data without having to refer to a second table for code descriptions. In addition, column headings should be well organized, succinct and represent the data accurately.
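Expanding one geologist's codes into full descriptions is easy to script. A minimal sketch with a hypothetical code table, flagging unknown codes for follow-up instead of passing them through silently:

```python
# Hypothetical rock-type code table for one historical data set.
CODE_DESCRIPTIONS = {
    "BAS": "basalt",
    "GRD": "granodiorite",
    "QV": "quartz vein",
}

def expand_code(code):
    """Return the full rock-type description for a code.

    Codes are normalized (trimmed, upper-cased) before lookup; anything
    not in the table is flagged so it gets reviewed, not lost.
    """
    return CODE_DESCRIPTIONS.get(code.strip().upper(), f"UNKNOWN ({code})")

# Add a readable description column alongside the original code.
samples = [{"id": "S-001", "rock": "bas"}, {"id": "S-002", "rock": "XX"}]
for s in samples:
    s["rock_description"] = expand_code(s["rock"])
```

Keeping both the original code and the expanded description preserves the audit trail back to the source data, in line with noting all changes within the data set.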
Historical quality assurance/quality control (QA/QC) is often missing or can be daunting to reconstruct. Ensuring the quality of the data is very important and worth the effort of slogging through pages of certificates to run statistics on precision and accuracy. Although the samples may not be available to assay again, it is possible to correct for variations so that the data remain usable in interpretations.
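Once duplicate pairs have been pulled off the certificates, precision can be summarized with a standard QA/QC statistic such as relative percent difference (RPD). A minimal sketch; the duplicate values and the 20% tolerance are hypothetical choices for illustration:

```python
def rpd(a, b):
    """Relative percent difference between a duplicate pair:
    |a - b| / mean(a, b) * 100, with a zero-mean guard."""
    mean = (a + b) / 2.0
    if mean == 0:
        return 0.0
    return abs(a - b) / mean * 100.0

# Hypothetical duplicate assay pairs transcribed from old certificates.
duplicate_pairs = [(12.0, 11.5), (3.2, 3.9), (0.8, 0.7)]

rpds = [rpd(a, b) for a, b in duplicate_pairs]
mean_rpd = sum(rpds) / len(rpds)

# Pairs exceeding a chosen tolerance (e.g. 20% RPD) get flagged for review.
flagged = [pair for pair, r in zip(duplicate_pairs, rpds) if r > 20.0]
```

The mean RPD gives a single precision figure for a historical campaign, which makes it possible to compare campaigns and decide how much weight each deserves in the final interpretation.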
Be prepared to change and adjust the structure of your data sets as new information becomes available and as the needs of your project change. Each compilation should be built fresh so as not to carry forward biases and errors into subsequent data sets.
Data compilations are often very time intensive and require exceptional organizational skills as well as knowledge of the area of interest and the goal of the compilation. When done appropriately, historical compilations are undeniably worth the effort and can add a lot of value to your project.
About the Author:
Diana has over 20 years of experience working in the mineral exploration industry searching for diamonds and metals in a range of roles: from heavy minerals lab technician to till sampler, rig geologist, project manager and business owner. Following a Master of Science degree in diamond indicator mineral geochemistry, Diana has conducted field work in BC, NWT, YT, ON (Canada) and in Greenland. She has also been involved, remotely through a BC-based office, on mineral exploration projects located in South America, Africa, Eurasia, and the Middle East. Diana finished a Ph.D. at UNBC in 2017 researching geochemical multivariate statistical analysis and interpretation. Currently, Diana is the owner of Takom Exploration Ltd., a small geological and environmental consulting firm focused on metal exploration in BC and the Yukon.