
References

Agrawal, R., and Srikant, R. (1994). “Fast Algorithms for Mining Association Rules.” In Proceedings of the 20th VLDB Conference. Santiago, Chile: IBM Almaden Research Center. Accessed July 5, 2016. https://rakesh.agrawal-family.com/papers/vldb94apriori.pdf.

Baglama, J., and Reichel, L. (2005). “Augmented Implicitly Restarted Lanczos Bidiagonalization Methods.” SIAM Journal on Scientific Computing 27:19–42.

Bates, D. M., and Watts, D. G. (1988). Nonlinear Regression Analysis and Its Applications. New York: John Wiley & Sons.

Benford, F. (1938). “The Law of Anomalous Numbers.” Proceedings of the American Philosophical Society 78:551–572.

Benjamini, Y., and Hochberg, Y. (1995). “Controlling the False Discovery Rate: A Practical and Powerful Approach to Multiple Testing.” Journal of the Royal Statistical Society, Series B 57:289–300.

Box, G. E. P., Jenkins, G. M., and Reinsel, G. C. (1994). Time Series Analysis: Forecasting and Control. 3rd ed. Englewood Cliffs, NJ: Prentice-Hall.

Cleveland, W. S. (1994). Visualizing Data. Summit, NJ: Hobart Press.

Conover, W. J. (1999). Practical Nonparametric Statistics. 3rd ed. New York: John Wiley & Sons.

Cureton, E. E. (1967). “The Normal Approximation to the Signed-Rank Sampling Distribution when Zero Differences are Present.” Journal of the American Statistical Association 62:1068–1069.

Hahsler, M. (2015). “A Probabilistic Comparison of Commonly Used Interest Measures for Association Rules.” Accessed September 17, 2018. http://michael.hahsler.net/research/association_rules/measures.html.

Hand, D. J., Mannila, H., and Smyth, P. (2001). Principles of Data Mining. Cambridge, MA: MIT Press.

Hastie, T. J., Tibshirani, R. J., and Friedman, J. H. (2009). The Elements of Statistical Learning: Data Mining, Inference, and Prediction. 2nd ed. New York: Springer-Verlag.

Hawkins, D. M., and Kass, G. V. (1982). “Automatic Interaction Detection.” In Topics in Applied Multivariate Analysis, edited by D. M. Hawkins, 267–300. Cambridge: Cambridge University Press.

Huber, P. J., and Ronchetti, E. M. (2009). Robust Statistics. 2nd ed. New York: John Wiley & Sons.

Hyndman, R. J., Koehler, A. B., Ord, J. K., and Snyder, R. D. (2008). Forecasting with Exponential Smoothing: The State Space Approach. Berlin: Springer-Verlag.

Jolliffe, I. T. (2002). Principal Component Analysis. New York: Springer-Verlag.

Kass, G. V. (1980). “An Exploratory Technique for Investigating Large Quantities of Categorical Data.” Journal of the Royal Statistical Society, Series C 29:119–127.

Lehmann, E. L. (2006). Nonparametrics: Statistical Methods Based on Ranks. 2nd ed. New York: Springer.

Mason, R. L., and Young, J. C. (2002). Multivariate Statistical Process Control with Industrial Applications. Philadelphia: SIAM.

McCullagh, P., and Nelder, J. A. (1989). Generalized Linear Models. 2nd ed. London: Chapman & Hall.

Nagelkerke, N. J. D. (1991). “A Note on a General Definition of the Coefficient of Determination.” Biometrika 78:691–692.

Nelder, J. A., and Wedderburn, R. W. M. (1972). “Generalized Linear Models.” Journal of the Royal Statistical Society, Series A 135:370–384.

Parker, R. J. (2015). Efficient Computational Methods for Large Spatial Data. Ph.D. diss., Department of Statistics, North Carolina State University. Accessed June 30, 2016. https://repository.lib.ncsu.edu/ir/bitstream/1840.16/10572/1/etd.pdf.

Platt, J. (1998). Sequential Minimal Optimization: A Fast Algorithm for Training Support Vector Machines. Technical Report MSR-TR-98-14, Microsoft Research. https://www.microsoft.com/en-us/research/wp-content/uploads/2016/02/tr-98-14.pdf.

Qian, P. Z. G., Wu, H., and Wu, C. F. J. (2008). “Gaussian Process Models for Computer Experiments with Qualitative and Quantitative Factors.” Technometrics 50:383–396.

Ramsay, J. O., and Silverman, B. W. (2005). Functional Data Analysis. 2nd ed. New York: Springer.

Ratkowsky, D. A. (1990). Handbook of Nonlinear Regression Models. New York: Marcel Dekker.

Sall, J. (2002). “Monte Carlo Calibration of Distributions of Partition Statistics.” SAS Institute Inc., Cary, NC. Accessed July 29, 2015. https://www.jmp.com/content/dam/jmp/documents/en/white-papers/montecarlocal.pdf.

Santner, T. J., Williams, B. J., and Notz, W. I. (2003). The Design and Analysis of Computer Experiments. New York: Springer-Verlag.

SAS Institute Inc. (2017). SAS/ETS User’s Guide, Version 14.3. Cary, NC: SAS Institute Inc. https://support.sas.com/documentation/onlinedoc/ets/143/etsug.pdf.

Schäfer, J., and Strimmer, K. (2005). “A Shrinkage Approach to Large-Scale Covariance Matrix Estimation and Implications for Functional Genomics.” Statistical Applications in Genetics and Molecular Biology 4, Article 32.

Schuirmann, D. J. (1987). “A Comparison of the Two One-sided Tests Procedure and the Power Approach for Assessing the Equivalence of Average Bioavailability.” Journal of Pharmacokinetics and Biopharmaceutics 15:657–680.

Shiskin, J., Young, A. H., and Musgrave, J. C. (1967). The X-11 Variant of the Census Method II Seasonal Adjustment Program. Technical Report 15, US Department of Commerce, Bureau of the Census.

Shmueli, G., Patel, N. R., and Bruce, P. C. (2010). Data Mining for Business Intelligence: Concepts, Techniques, and Applications in Microsoft Office Excel with XLMiner. 2nd ed. Hoboken, NJ: John Wiley & Sons.

Shmueli, G., Bruce, P. C., Stephens, M. L., and Patel, N. R. (2017). Data Mining for Business Analytics: Concepts, Techniques, and Applications with JMP Pro. Hoboken, NJ: John Wiley & Sons.

Westfall, P. H., Tobias, R. D., and Wolfinger, R. D. (2011). Multiple Comparisons and Multiple Tests Using SAS. 2nd ed. Cary, NC: SAS Institute Inc.
