OBSIDIAN SOURCING STUDIES IN THE GREAT BASIN: PROBLEMS AND PROSPECTS

Richard E. Hughes

Introduction

During the past few years there has been an ever increasing demand for obsidian sourcing studies in archaeological research. The reasons for this demand are obvious: accurate and replicable matches between parent geological sources and obsidian artifacts are prerequisite to establishing source-specific obsidian hydration rates and to determining prehistoric trade and exchange relations. Archaeologists have eagerly submitted artifacts for analysis, and obsidian sourcing studies have generated tremendous quantities of data. Up to this point, however, there has not been a corresponding enthusiasm for critical evaluation of the accuracy and replicability of these source assignments, and I think it would be fair to say that the results of most sourcing studies are accepted on faith. Uncritical acceptance of the results of sourcing studies occurs, for the most part, because archaeologists typically are unable to evaluate the methods and techniques applied by specialists to assign artifacts to sources. Because of this, incorrect source assignments rarely are detected. This paper has three purposes: first, to consider some of the methods commonly used by specialists to identify sources and to assign artifacts to them; second, to offer a critical appraisal of applications of these methods; and, third, to offer suggestions about some of the ways incorrect source assignments can be detected.

The common denominator underlying all sourcing studies is that there exist unique combinations of constituent elements in obsidian which, when considered together or in various subsets, allow distinctions to be made between obsidian sources. Once these distinctions have been made, obsidian artifacts are assigned membership in one or another of the sources on the basis of how closely they resemble the source profiles.

Ternary Diagrams

These source profiles were depicted early on in California obsidian sourcing studies by the use of ternary, or triangular, diagrams. Plots for sources and artifacts were determined in the same way: counts on three elements (usually Rb, Sr and Zr) were summed, and each element's value was then divided by that sum to determine the relative percentages for plotting on the intersecting axes of the diagram. This method is simple, straightforward and easy to compute, and it served admirably in early studies (e.g. Stross et al. 1968; Jack and Heizer 1968) where there existed relatively pronounced differences in trace element composition between sources.

However, there are two basic difficulties with the ternary diagram method of presentation which limit its effectiveness. First, it is quite difficult to distinguish between sources with overlapping plots. As depicted in Figure 1, the cluster of dots represents artifacts from the Humboldt Lakebed site, Nevada (Ch-15) analyzed by Robert Jack (unpublished data). The dots of immediate concern here fall for the most part within the Rb, Sr and Zr variation depicted for the Casa Diablo source (Jack 1976: 211), located in the Mono Basin. Based solely on this ternary diagram plot, many of these artifacts probably would have been assigned to the Casa Diablo source. However, subsequent to Jack's Humboldt Lakebed artifact analysis, a source of artifact-quality obsidian has been located near Majuba Mountain in northwestern Nevada, and the ternary diagram plot for this source overlaps with Casa Diablo.
Recent analyses of artifacts from Lovelock Cave (Ch-18), located only a few hundred meters from the Humboldt Lakebed site, indicate that many of the specimens that fall within the Casa Diablo range of variation on the ternary diagram in fact were manufactured from parent obsidian of the Majuba Mountain geochemical type (Hughes unpublished data; see Figure 1). This separation could not have been made solely on the basis of Rb, Sr and Zr plots.

The second difficulty arises from the fact that specimens from the same source will not plot in the same place on the ternary diagram if different elemental measurement units are employed. Diagram plots based on parts per million (ppm) trace element concentrations will not necessarily overlap with those based on peak intensity counts. Figure 2 shows that ternary diagram plots derived from peak intensities of Casa Diablo source specimens (Jackson 1974: Figure 15), although generally similar, do not correspond with plots for the same trace elements generated from ppm estimates (Jack, unpublished data). In addition, ppm plots for Queen and Mt. Hicks both fall outside the range of rapid scan peak intensity plots for these same sources. In fact, Queen ppm plots fall within the range of peak intensity plots for Mono Craters, Mono Glass Mountain, Coso Hot Springs and Fish Springs. An equally dramatic contrast can be drawn by considering specimens from the Cougar Butte source in the Medicine Lake Highland of northeast California (Figure 3). The filled circles in Figure 3 depict specimens analyzed in 1978 using the rapid scan technique (Hughes 1978: 62), while the open circles represent specimens from the same source computed from ppm concentrations (Hughes 1983a). Even though six of ten specimens are common to both studies, comparison of these two plots likely would lead one to the conclusion that two different sources were represented. As a final example, Green (1982) has argued on the basis of ternary diagram correspondences that the Hawkins-Malad-Oneida locality in southeastern Idaho was the primary source employed in the manufacture of obsidian artifacts at Danger and Hogup Caves in Utah. However, comparisons of the Rb, Sr and Zr ppm values for Danger Cave and Hogup Cave artifacts published by Condie and Blaxland (1970: 280) with these same values for the assigned source (see Table 3 herein) indicate that this attribution likely is in error.

In historical perspective, difficulties with ternary diagrams were held to a minimum because nearly all the x-ray fluorescence analyses of archaeological collections between about 1967 and 1974 were done at the same laboratory, with the same machine, under the same set of analytical conditions. Thus, the work was internally consistent. Within the last five years, however, more laboratories have become involved in x-ray analysis of western North American archaeological specimens, and the issue of inter-laboratory comparison has become significant. One way around the problem of inter-laboratory comparison using ternary diagrams is to convert peak intensities into ppm estimates. Properly done, this conversion overcomes machine-specific differences in count rates that result in non-comparable plots; thus, when expressed in ppm, ternary diagrams can be compared directly between laboratories.
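For concreteness, the arithmetic behind a ternary plot is simple and can be made explicit. The following minimal sketch (in Python, with hypothetical element values; it is not the procedure of any of the studies cited) normalizes Rb, Sr and Zr measurements to the relative percentages plotted on the three axes, and shows why plots computed from ppm concentrations need not coincide with plots computed from peak intensity counts for the same specimen.

```python
def ternary_coordinates(rb, sr, zr):
    """Normalize Rb, Sr, Zr values (ppm or peak intensities) to relative
    percentages summing to 100, as plotted on a ternary diagram."""
    total = rb + sr + zr
    return (100.0 * rb / total, 100.0 * sr / total, 100.0 * zr / total)

# Hypothetical specimen measured in two unit systems: the element-to-element
# proportions differ, so the two plots fall in different parts of the diagram.
print(ternary_coordinates(rb=150.0, sr=100.0, zr=90.0))       # ppm estimates
print(ternary_coordinates(rb=5200.0, sr=2100.0, zr=3900.0))   # peak intensity counts
```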
Discriminant Analysis

Recognition of the limitations of the ternary diagram method, coupled with the newfound capabilities of semi-automated energy dispersive x-ray fluorescence machines to conduct non-destructive analyses on many elements simultaneously, led some researchers to turn away from presentation of results using ternary diagrams and to explore multivariate statistical techniques. Discriminant analysis was among the first of the multivariate techniques to be applied to western North American obsidians (Ericson 1981: 9), and later applications to obsidians in this region (e.g. Nelson and Holmes 1979) yielded positive results. Indeed, discriminant analysis appeared ideally suited to assist in obsidian characterization, since "the basic problem of (the technique) is to assign an observation, or case, of unknown origin to one of two (or more) distinct groups on the basis of the value of the observation" (Lachenbruch 1975: 1).

Two different, but clearly related, aims of discriminant analysis can be pursued. The first, which could be called descriptive, simply derives allocation rules to characterize the differences between obsidian sources on the basis of major, minor, trace or rare earth elements. Once this step has been accomplished, discriminant analysis can be employed to classify cases of unknown origin (these are usually obsidian artifacts) on the basis of allocation rules derived from the analysis of known sources. This latter aim is concerned with predicting source membership for ungrouped cases (cf. Habbema and Hermans 1977).

However appealing discriminant analysis might seem, there are several statistical requirements that must be satisfied before the results of the analysis can be considered reliable. Among the most important are: 1) equality of group covariance matrices, which can be assessed using Box's M statistic, and 2) multivariate normality. Even though multivariate normality is sometimes difficult to assess in the aggregate, on a univariate basis it can be monitored by inspection of ranges, means and standard deviations of each element contemplated for use as a discriminating variable. If measurement of a particular trace element, for example, yields a large standard deviation and coefficient of variation and this element does not vary significantly between sources, it should not be used in the analysis. Inclusion of poorly measured, weak, or redundant variables in discriminant analysis can actually increase the number of misclassifications (cf. Klecka 1980). While it is commonly believed that the inclusion of larger numbers of variables in discriminant analysis results in a "better" classification, in fact this is not necessarily the case. The goal of a stepwise analysis is to find an optimal set of discriminating variables (in this case, trace and rare earth elements) which, in combination, work as well as or better than the full set. Some researchers (Dunn and Varady 1966) have found that for small sample sizes, the use of many variables results in an increase in the misclassification rate when the program is asked to assign unknown cases to their known groups.

With this background in mind, let's consider the output options available with the SPSS discriminant analysis program (Klecka 1975) which have been used by some analysts (e.g. Sappington 1981a, 1981b) to measure classification validity and reliability.
These are, first, F-statistics; second, classification results (or "percent correctly classified") tables; and third, posterior probabilities for group membership.

The F-statistic is the test for statistical significance of the amount of separation between groups, under the assumption of a multivariate normal distribution. The problem with using this statistic as a "true" measure of the difference between groups is that it considers absolute group sample size. In other words, comparisons between larger groups with more cases will be given more weight in the computation than comparisons between smaller groups. The net result is that these statistics tend to exaggerate differences between source pairs that contain large numbers of cases. In short, "the F-statistic is not directly related to classification performance, but is only in a vague sense connected with measuring discrimination" (Habbema and Hermans 1977: 491). It is definitely not the case that "The larger the F values, the more probable the autonomy of the pairs of groups" (Sappington 1981a: 136). Because of sample size limitations, F-statistics should be interpreted with caution.

The classification results table is essentially a vehicle for presenting information on how well the program classified cases with known group membership and, again, the results in these tables often are taken as a true indication of the reliability of the classification procedure (cf. Sappington 1981a: 137). As it turns out, however, the "percent correctly classified" procedure nearly always overestimates the apparent accuracy because "the validation is based on the same cases used to derive the classification functions" (Klecka 1980: 51). Consequently, it should come as no surprise that a remarkably high percentage of specimens of known group membership will be correctly classified, because the allocation rules are applied post hoc to classification of the very same cases from which they were generated!

So far, this discussion has been limited to consideration of problems associated with discriminant analysis classification of specimens with known group membership; that is, obsidian source standards. However, when attention shifts to classification of cases of unknown origin, that is, obsidian artifacts, one should become much more concerned about the possibility of misclassification. When classifying cases with known group membership, misclassifications could be easily assessed because the true probability of a case being from a particular obsidian source was known. In considering assigning artifacts to sources, the question now becomes, "How likely is it that this artifact was manufactured from obsidian from a particular source?" The answer involves discussion of two related issues: how probability estimates are derived in discriminant analysis, and how to interpret and evaluate them.

Two probability estimates are provided with SPSS discriminant analysis output. P(G/X) is the probability that the object in question is a member of the assigned obsidian source group, while P(X/G) is the probability that a specimen from the assigned source group would be as far from the group centroid as the particular artifact being classified. The first probability, P(G/X), sometimes termed the posterior probability, has been most widely employed in western North American obsidian studies because it appears to be the most straightforward measure of how confident one can be in the artifact-to-source assignment (cf.
Nelson and Holmes 1979; Sappington 1981a, 1981b; Nelson, this volume). If one gets a posterior probability of 1.0, the usual interpretation is that this signals a "perfect" fit between the artifact and that particular obsidian source. Unfortunately, closer inspection of the statistical assumptions underlying the computation of this posterior probability reveals that this estimate is, in fact, quite poorly suited for use in obsidian studies. The reason for this takes us back to the assumptions underlying classical discriminant analysis -- that an ungrouped case is in fact a member of one of the groups in the sampling universe (cf. Kendall 1957; Tatsuoka 1971; Klecka 1980). In applications of discriminant analysis to Great Basin obsidian studies, one cannot safely assume that all potential sources of obsidian within a particular region have been included in the sampling universe (cf. Ward 1977; Hughes 1982). Because the posterior probability is computed on the basis of the assumption that complete information exists about the sampling universe, and further that the ungrouped artifact must be a member of one of these groups, this probability estimate can be completely misleading. The allocation rules in discriminant analysis dictate that a match must be made to one of the sources in the sampling universe regardless of how poorly the elemental measurements of the artifact match the measurements for the source to which it is assigned. Simply stated, the posterior probability, when considered in isolation, is of little utility in assigning artifacts to sources, and it is extremely ill-equipped to detect errors in source assignment because of the assumption that the artifact must be a member of one of the known groups.

To illustrate this point, consider the data in Table 1. This table presents trace element concentrations for the Horse Mountain obsidian source in southcentral Oregon, along with trace element ppm values for a projectile point excavated from the King's Dog site (CA-Mod-204) in Surprise Valley, northeast California. When these data were analyzed using the SPSS discriminant analysis program (see Hughes 1983a), the King's Dog artifact was assigned a posterior probability (P G/X) of 1.0, indicating what appeared to be a convincing match to the Horse Mountain source. Unfortunately, the assignment was dead wrong. As can be seen when the actual ppm values for Horse Mountain and the King's Dog artifact are compared, the correspondence is not all that great. However, since it is the best match within the universe of sources included in the study, the program was completely consistent in assigning this artifact such a high posterior probability of being a member of the Horse Mountain source group.

Detecting Misclassifications

Given these limitations, how can one detect incorrect source assignments in discriminant analysis and identify artifacts that may not belong to any of the sources in the sampling universe? A useful way to do this is to determine the dispersion of Mahalanobis D2 values around each obsidian source group mean, and then to establish tolerance limits for these D2 distances on a source-by-source basis. These D2 distances can be understood as the multivariate distances of an individual artifact (or case) to each of the obsidian source group centroids. Cases are classified into a particular group on the basis of the smallest D2 distance.
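A minimal sketch of that computation (in Python, with hypothetical sources and element values; this is not the SPSS routine used in the analyses discussed here) may help make the procedure concrete: D2 is computed from each source's centroid and a pooled within-group covariance matrix, and the case is allocated to the source yielding the smallest D2.

```python
import numpy as np

def mahalanobis_d2(case, centroid, pooled_cov):
    """Squared Mahalanobis distance from a single case to one source centroid."""
    diff = case - centroid
    return float(diff @ np.linalg.inv(pooled_cov) @ diff)

# Hypothetical source standards: rows are specimens, columns are elements (e.g. Rb, Zr, Nb).
sources = {
    "Source A": np.array([[140., 700., 30.], [145., 712., 28.], [138., 695., 31.],
                          [142., 705., 29.], [147., 709., 33.]]),
    "Source B": np.array([[205., 260., 12.], [198., 255., 14.], [210., 266., 11.],
                          [202., 258., 13.], [207., 262., 15.]]),
}
centroids = {name: data.mean(axis=0) for name, data in sources.items()}
# Pooled within-group covariance; the groups here are equal in size, so a
# simple average of the group covariance matrices suffices for illustration.
pooled_cov = np.mean([np.cov(data, rowvar=False) for data in sources.values()], axis=0)

artifact = np.array([204., 705., 81.])   # hypothetical ungrouped case
d2 = {name: mahalanobis_d2(artifact, c, pooled_cov) for name, c in centroids.items()}
print(d2)                    # D2 distance to each source centroid
print(min(d2, key=d2.get))   # allocated to the source with the smallest D2
```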
High D2 values, compared to the observed dispersion of D2 values for source standards to which the artifact is assigned, can profitably be used to detect an artifact that may not be a member of any source included in the sampling universe (cf. Luedtke 1979). In some packaged discriminant analysis programs (e.g. SPSS) D2 values are not one of the available output options. In this instance, P(X/G), the second probability mentioned above, can be used as a rough approximation of D2 distances. A low P(X/G) probability value likely signals that the artifact in question lies far from the assigned obsidian source group centroid, and therefore that this artifact would be associated with a high D2 value.

To provide an illustration of how D2 distances can be used to assess artifact-to-source assignments, let's return to consideration of the data in Table 1. The lower portion of this table presents the range, mean and standard deviation of the D2 values determined for source specimens from Horse Mountain. Above these data is the D2 value determined for the King's Dog projectile point. As is apparent, the observed D2 value for this artifact lies about 80 standard deviations from the mean of the Horse Mountain source, even though it was assigned to this source with a posterior probability (P G/X) of 1.0! The large D2 value for this artifact, relative to the dispersion of D2 values for the source group, indicates that in fact the artifact lies far from the Horse Mountain group centroid. Additional examples could be cited, but this one illustrates how the use of D2 values on a source-by-source basis can make it possible to identify artifacts which do not belong to any of the sources in the sampling universe. It should be emphasized that this misclassification could not have been identified solely on the basis of posterior probability (P G/X) interpretation.

Detecting Measurement Errors

Recall that it was stated earlier that one of the requirements of discriminant analysis is that the discriminating variables (in this case, trace and rare earth elements) should conform to a multivariate normal distribution. Although multivariate normality is not readily assessed, it can be monitored in a general way by inspection of the means and standard deviations of the individual elements contemplated for use as discriminating variables. If elemental measurements are highly variable, this usually signals either that the element is not measured well by the particular analytical system, or that there is a great deal of trace element variability present in the parent geological source material.

To see how one might assess whether particular trace and rare earth elements should be used in discriminant analysis classification, refer to the data in Table 2. This table presents the means and standard deviations for ten elements measured by R.L. Sappington at the Idaho Bureau of Mines and Geology, University of Idaho (Green 1982: Tables 2-7) on obsidian source specimens from Idaho. The measured intensities for these ten elements were said to be "ideally quantifiable for the application of statistical procedures" (Sappington 1981a: 135, 139) and all ten were employed as discriminating variables in SPSS discriminant analysis. Because no U.S.G.S. or other international standard rocks apparently were analyzed, it is not possible at this stage to assess the accuracy of these measurements.
Consequently, the data in Table 2 must be evaluated in terms of precision; that is, how uniform the measurements are for particular elements within and across sources. Elements that are not well measured should be excluded from consideration in geochemical characterization studies (see Bowman, Asaro and Perlman 1973: 312; Deutchman 1980: 124; Hughes 1982: 176-178; 1983b: 402).

Because of the marked differences between mean values apparent in Table 2, it is not possible to compare standard deviations directly to determine how well particular elements were measured. To do this, coefficients of variation (CV%) were computed for each element from each source to facilitate comparisons between groups with very different means (cf. Blalock 1972: 88). Put another way, the coefficient of variation provides a good indication of the homogeneity of source-specific elemental variation.1 Using the guidelines proposed by Thomas (1976: 84): "...most CV (measured upon biological variables, at least) should lie between 4 and 10 percent, with 5 and 6 percent being good average values. Observed values much below this range often indicate that the group selection was inadequate to represent the overall variability of the variable. Groups showing values greater than 10 percent or so probably are impure, possibly because the underlying distribution is bimodal."

1 Also referred to as the "coefficient of variability" (Friedman 1972: 104-105) and the "coefficient of relative variation" (Ott, Mendenhall and Larson 1978: 126).

The CV percentage values for obsidian source standards measured at the University of Idaho indicate a considerable amount of intra-element variability. Assessed on the basis of CV percentages, Ba, Zr and Rb appear to be the elements best measured by this system, while the other seven (Fe, Sr, Y, Nb, Sn, La and Ce) are much more variable and probably would not be good candidates for inclusion as discriminating variables in discriminant analysis. Although most of these inter- and intra-element measurements appear too mutable to be used in discriminant analysis, one cannot decide on the basis of these data whether there may be high measurement error in the x-ray fluorescence measurement procedure, or whether these elements are, in fact, as inherently variable in these sources as indicated in Table 2. In order to resolve this issue, a random sample of obsidian source standards from the Hawkins-Malad-Oneida obsidian source (cf. Sappington 1981a: "Oneida"; Nelson, this volume: "Malad") was analyzed, and these data were compared to a sample from the same source analyzed at Brigham Young University (see Table 3; also Nelson, this volume, source #31). Analytical conditions associated with the U.C. Berkeley measurements appear in Hughes (1983a, 1983b); those employed at Brigham Young University appear in Nelson (this volume) and Nelson and Holmes (1979). It is immediately apparent from inspection of Table 3 that the analyses conducted at U.C. Berkeley and at Brigham Young University are in exceptionally close agreement (see Hughes, Hampel and Nelson n.d.), and it is correspondingly clear from comparison of CV percentages in Tables 2 and 3 that the University of Idaho system evidences significantly greater measurement fluctuations on specimens from the same source. In most cases, University of Idaho measurements are at least twice as variable as those conducted at the other two laboratories.
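The screening computation itself is straightforward. The following minimal sketch (in Python, with hypothetical repeat measurements; it is not the computation performed in any of the laboratories discussed) shows the coefficient of variation used to compare elements in Tables 2 and 3:

```python
import numpy as np

def coefficient_of_variation(values):
    """CV% = 100 * (sample standard deviation / mean) for repeated
    measurements of one element on specimens from a single source."""
    values = np.asarray(values, dtype=float)
    return 100.0 * values.std(ddof=1) / values.mean()

# Hypothetical repeat measurements (ppm) of two elements on one source's standards.
measurements = {
    "Zr": [706.7, 710.2, 695.1, 701.4, 712.0],   # low CV%: precisely measured
    "Sr": [9.2, 2.1, 15.8, 1.0, 22.3],           # high CV%: poor discriminating variable
}
for element, values in measurements.items():
    print(element, round(coefficient_of_variation(values), 1))
```

Under the guideline quoted above, elements whose CV% runs much beyond about 10 percent would be set aside before discriminant analysis is attempted.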
Table 3 suggests that while the University of Idaho system is comparatively stable for Ba and Zr measurement, Rb, Sr, Y, Nb and Ce values are not measured with acceptable precision. To the degree that the magnitude of measurement variability apparent in Table 2 is represented in other obsidian sources in Sappington's (1981a, 1981b) sampling universe,2 one would be compelled to suggest that these poorly measured elements should not have been included as variables in discriminant analysis.

2 This is not possible to determine from the published reports (Sappington 1981a, 1981b) because the raw data on obsidian source standard measurements are not presented.

Without thorough consideration of some of the pitfalls of discriminant analysis applications in obsidian sourcing studies, it is difficult to determine, for any particular analysis, whether one of four outcomes has resulted:

First, one could get the wrong result because of improper use of the method. In this case, an incorrect artifact-to-source assignment may have been accepted by the analyst due to unperceived violation of assumptions, poor measurement of discriminating variables (trace elements), or the possibility that too much "noise" (redundant or unnecessary variables) was included in the analysis.

Second, one could get the wrong result even though the method was used properly. In this instance, an incorrect assignment of artifact-to-source was made and no statistical assumptions were violated. This may happen because the analyst erroneously accepts a high posterior probability (P G/X) value as indicating a "correct" assignment. As discussed above, the use of D2 values can help detect this kind of error.

Third, the correct result could be obtained even though the method was used improperly. In this case, a correct classification of an artifact-to-source would be made in spite of violation of statistical assumptions, poor element measurement, or inclusion of redundant variables in the analysis. This usually means that the elements measured on both the artifact and the assigned source are in such close, unique correspondence that the correct assignment will be made no matter what the analyst does.

Fourth, one can obtain the right result using the method properly. This happens, of course, when an artifact is correctly assigned to source and all statistical requirements and measurement criteria have been met. I should caution that a "correct" assignment does not mean that the assignment is correct in any ultimate sense (see Hughes 1983a); it is correct only insofar as program requirements and optimization procedures have been satisfied within the framework of probability estimates fundamental to the operation, and interpretive limitations, of the technique.

As is probably clear by now, there are numerous ways to unknowingly get the wrong answer using discriminant analysis. Discriminant analysis classifications often are difficult to troubleshoot even by specialists in multivariate statistical analysis because of the complex interaction among discriminating variables, the multiplicity of statistical assumptions that must be addressed, and the translation of elemental values into slightly alien classification functions and discriminant scores.
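Finally, the contrast between the posterior probability and the D2-based check can be made concrete. The sketch below (in Python, with hypothetical distances patterned loosely after Table 1; it assumes equal prior probabilities and a common covariance matrix, and it is not the SPSS computation itself) shows how a closed sampling universe forces P(G/X) toward 1.0 for the nearest source even when the artifact lies far from every source, and how comparing the artifact's D2 with the dispersion of D2 values for the assigned source's own standards exposes the problem.

```python
import numpy as np

def posterior_probabilities(d2_to_sources):
    """Closed-universe posterior P(G/X) computed from squared Mahalanobis
    distances, assuming equal priors and a common covariance matrix."""
    d2 = np.asarray(d2_to_sources, dtype=float)
    weights = np.exp(-(d2 - d2.min()) / 2.0)   # shift by the minimum for numerical stability
    return weights / weights.sum()

# Hypothetical D2 distances from one artifact to the only three sources in the universe.
d2 = np.array([355.1, 2100.4, 3480.9])
print(posterior_probabilities(d2))   # ~[1.0, 0.0, 0.0]: an apparently "perfect" match

# Dispersion of D2 values for the assigned source's own standards (values after Table 1).
standards_mean, standards_sd = 5.05, 4.06
print(round((d2[0] - standards_mean) / standards_sd, 1))   # ~86 s.d. from the source mean
```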
Conclusions

At the beginning of this study, I said that this paper had three purposes: to consider some of the methods used to identify obsidian sources and to assign artifacts to them; to offer a critical appraisal of the applications of some of these methods; and to offer some suggestions about some of the ways incorrect source assignments can be detected. Up to this point, I have devoted most attention to the first two, but the third is perhaps the most important to the archaeological community, because archaeologists rely on the results of obsidian sourcing studies to provide the substantive data base required to test and refine extant interpretations and reconstructions of prehistoric exchange relationships in the Great Basin. Because of this it is necessary to take a close and critical look at the methods and techniques employed to identify the sources of obsidian artifacts. This is not to say that, to do this, archaeologists must re-tool and become multivariate statisticians. I do suggest, however, that archaeologists expend the effort to obtain basic information from any geochemical characterization analysis, so that even if they do not feel competent to assess the work, enough information will be available for others to do so (cf. Ives 1975).

First, it is important to know how the data were generated; this includes machine specifications, whether or not any U.S.G.S. or other international rocks were analyzed as standards, as well as specification of data reduction procedures and overlapping peak stripping routines (see Hughes 1983a: 25-30; Nelson and Holmes 1979: 67-68; Hampel, this volume). It is also important to know whether the resulting data are reported in quantitative units (ppm values) or in semi-quantitative fashion (elemental intensities, peak height amplitudes, or ratios of these), for reasons discussed earlier. Probably most important, it is imperative that the elemental values (whether quantitative or semi-quantitative) be presented for each artifact analyzed, along with values measured in comparable units for each of the obsidian sources to which the artifacts are assigned. Given the variable interpretations possible when quantitative and semi-quantitative modes are mixed, the importance of this requirement is by now probably obvious. There are other technical details pertinent to discriminant analysis classifications that should be requested (see Hughes 1983a: 53-78), but these will be discussed elsewhere.

Finally, it should be stressed that obsidian studies in the Great Basin are in their infancy, and this places sharp emphasis on taking first things first. No matter how sophisticated the data manipulation, the results will only be as good as the fundamental measurements they are based on. Garbage in, garbage out. Before interpretive studies can effectively be completed, the strengths and weaknesses of the techniques used to identify sources and to assign artifacts to them should be scrutinized. Unless accurate and replicable geochemical characterizations of obsidian sources and artifacts can be made, it will be impossible to develop a broad comparative data base from which convincing arguments about the social consequences of prehistoric obsidian distributions can be advanced.

Acknowledgements

I thank Michael Miller, Statistical Laboratory, University of California, Davis, for discussions and constructive comments on this paper; Jeanette L. Blomberg for suggestions that improved content; Joachim Hampel for technical support and assistance; James P.
Green for providing obsidian source specimens from Malad, Idaho; and Robert N. Jack for generously allowing me to present the results of some of his unpublished analyses.

Figure 1. Ternary diagram plot (Rb, Sr, Zr) for Humboldt Lakebed (NV-Ch-15) artifacts, with the composite plot for Casa Diablo obsidian source specimens shown as a dashed line. Plots for Casa Diablo source specimens and Ch-15 artifacts determined on the basis of rapid scan wavelength dispersive x-ray fluorescence analysis (see Jack and Heizer 1968); Ch-15 artifact plots courtesy of R.N. Jack.

Figure 2. Ternary diagram plots (Rb, Sr, Zr) for some obsidian sources in the Mono Basin and western Great Basin. Dashed lines delimit composite plots for source specimens analyzed by rapid scan wavelength dispersive x-ray fluorescence (from Jackson 1974 and Jack 1976). Dots represent plots for source specimens derived from quantitative analysis (in parts per million) using wavelength dispersive x-ray fluorescence (from R.N. Jack, unpublished data).

Figure 3. Ternary diagram plot (Rb, Sr, Zr) for obsidian source specimens from Cougar Butte, Medicine Lake Highland, northeast California. Dots depict specimens analyzed using the rapid scan wavelength dispersive technique, while open circles represent plots for specimens computed from parts per million concentrations using energy dispersive x-ray fluorescence.

                                Trace Element Concentrations (ppm)
                                       Rb      Sr       Y      Zr      Nb
Horse Mtn. (n=20)              X     141.8     0.1   115.4   706.7    29.7
                               S.D.    6.1     0.6     5.3    23.4     3.4
King's Dog projectile point
  (1-228122, Mod-204)                204.0     0.0   120.5   705.2    80.7

Mod-204 projectile point assigned to Horse Mtn.:  P(G/X) = 1.0000;  D2 = 355.13
Horse Mtn. D2 statistics:  range = 0.30 - 14.22;  X = 5.05;  S.D. = 4.06

Table 1. Trace element concentration values for obsidian from Horse Mountain, Oregon, and an obsidian projectile point from the King's Dog site, in relation to Mahalanobis D2 values.

Table 2. Elemental intensity means (X), standard deviations (S.D.) and coefficients of variation (CV%) for ten elements (Fe, Rb, Sr, Y, Zr, Nb, Sn, Ba, La and Ce) from six Idaho obsidian sources. Measurements made at the Idaho Bureau of Mines and Geology; computed from data in Green (1982: Tables 2-7).
Table 3. Trace element concentrations (in parts per million) for obsidian source specimens from Malad, Idaho, analyzed at U.C. Berkeley (R.E. Hughes, analyst; n=20) and at Brigham Young University (F.W. Nelson, analyst; n=7). Mean (X) and standard deviation (S.D.) values in parts per million, with coefficients of variation (CV%), for Rb, Sr, Y, Zr, Nb, Ba, La and Ce; n.m. = not measured. Source specimens for the U.C. Berkeley analysis were selected at random from those in the Hawkins-Malad-Oneida column in Table 2.

References

Blalock, H.M., Jr.
1972  Social Statistics. Second edition. McGraw-Hill Book Company.

Bowman, H.R., F. Asaro and I. Perlman
1973  On the uniformity of composition in obsidians and evidence for magmatic mixing. Journal of Geology 81: 312-327.

Condie, K.C. and A.B. Blaxland
1970  Sources of obsidian in Hogup and Danger Caves. In: Hogup Cave, by C.M. Aikens. University of Utah Anthropological Papers 93: 275-281.

Deutchman, H.L.
1980  Chemical evidence of ceramic exchange on Black Mesa. In: Models and Methods in Regional Exchange, edited by R.E. Fry. Society for American Archaeology Papers 1: 119-133.

Dunn, O.J. and P.D. Varady
1966  Probabilities of correct classification in discriminant analysis. Biometrics 22: 908-924.

Ericson, J.E.
1981  Exchange and production systems in Californian prehistory: the results of hydration dating and chemical characterization of obsidian sources. British Archaeological Reports, International Series 110. Oxford.

Friedman, H.
1972  Introduction to Statistics. Random House, New York.

Green, J.P.
1982  XRF trace element analysis and hydration measurement of archaeological and source obsidians from the northeastern Great Basin. Paper presented at the 18th Great Basin Anthropological Conference, Reno, Nevada.

Habbema, J.D.F. and J. Hermans
1977  Selection of variables in discriminant analysis by F-statistic and error rate. Technometrics 19(4): 487-493.

Hughes, R.E.
1978  Aspects of prehistoric Wiyot exchange and social ranking. Journal of California Anthropology 5(1): 53-66.
1982  Age and exploitation of obsidian from the Medicine Lake Highland, California. Journal of Archaeological Science 9(2): 173-185.
1983a  Exploring diachronic variability in obsidian procurement patterns in northeast California and southcentral Oregon: geochemical characterization of obsidian sources and projectile points by energy dispersive x-ray fluorescence. Ph.D. dissertation, Department of Anthropology, University of California, Davis.
1983b  X-ray fluorescence characterization of obsidian. In: The Archaeology of Monitor Valley: 2. The Archaeology of Gatecliff Shelter, by D.H. Thomas. Anthropological Papers of the American Museum of Natural History 59: 401-408.

Hughes, R.E., J.H. Hampel and F.W. Nelson, Jr.
n.d.  Interlaboratory calibration of x-ray fluorescence analyses of obsidian. Manuscript in preparation.

Ives, D.J.
1975  Trace element analyses of archaeological material. American Antiquity 40(4): 235-236.

Jack, R.N.
1976  Prehistoric obsidian in California I: geochemical aspects. In: Advances in Obsidian Glass Studies: Archaeological and Geochemical Perspectives, edited by R.E. Taylor. Pp. 183-217. Noyes Press, Park Ridge, New Jersey.

Jack, R.N. and R.F. Heizer
1968  "Finger-printing" of some Mesoamerican obsidian artifacts. Contributions of the University of California Archaeological Research Facility 5: 81-100. Berkeley.
Jackson, T.L.
1974  The economics of obsidian in central California prehistory: applications of x-ray fluorescence spectrography in archaeology. M.A. thesis, Department of Anthropology, San Francisco State University.

Kendall, M.G.
1957  A Course in Multivariate Analysis. Hafner Press, New York.

Klecka, W.R.
1975  Discriminant analysis. In: Statistical Package for the Social Sciences, second edition, edited by N.H. Nie, C.H. Hull, J.G. Jenkins, K. Steinbrenner and D.H. Bent. Pp. 434-476. McGraw-Hill, New York.
1980  Discriminant Analysis. Sage University Paper, Quantitative Applications in the Social Sciences, Series No. 07-019. Sage Publications, Beverly Hills.

Lachenbruch, P.A.
1975  Discriminant Analysis. Hafner Press, New York.

Luedtke, B.E.
1979  The identification of sources of chert artifacts. American Antiquity 44(4): 744-757.