
References

This is a consolidated bibliography of all references cited throughout the Numerics library documentation.


General Numerical Methods

[1] Press, W. H., Teukolsky, S. A., Vetterling, W. T., & Flannery, B. P. (2007). Numerical Recipes: The Art of Scientific Computing (3rd ed.). Cambridge University Press.

[2] Burden, R. L., & Faires, J. D. (2010). Numerical Analysis (9th ed.). Brooks/Cole.

[3] Stoer, J., & Bulirsch, R. (2002). Introduction to Numerical Analysis (3rd ed.). Springer.


Probability and Statistics

[4] Johnson, N. L., Kotz, S., & Balakrishnan, N. (1994-1995). Continuous Univariate Distributions (2nd ed., Vols. 1-2). Wiley.

[5] Hosking, J. R. M. (1990). L-moments: Analysis and estimation of distributions using linear combinations of order statistics. Journal of the Royal Statistical Society: Series B, 52(1), 105-124.

[6] Hosking, J. R. M., & Wallis, J. R. (1997). Regional Frequency Analysis: An Approach Based on L-Moments. Cambridge University Press.

[7] Coles, S. (2001). An Introduction to Statistical Modeling of Extreme Values. Springer.

[8] Pawitan, Y. (2001). In All Likelihood: Statistical Modelling and Inference Using Likelihood. Oxford University Press.

[9] Wilks, D. S. (2011). Statistical Methods in the Atmospheric Sciences (3rd ed.). Academic Press.

[10] Fisher, R. A. (1930). The moments of the distribution for normal samples of measures of departure from normality. Proceedings of the Royal Society of London. Series A, 130(812), 16-28.

[11] Welford, B. P. (1962). Note on a method for calculating corrected sums of squares and products. Technometrics, 4(3), 419-420.

[12] Kendall, M. G. (1938). A new measure of rank correlation. Biometrika, 30(1/2), 81-93.

[13] Mood, A. M., Graybill, F. A., & Boes, D. C. (1974). Introduction to the Theory of Statistics (3rd ed.). McGraw-Hill.

[14] Casella, G., & Berger, R. L. (2002). Statistical Inference (2nd ed.). Duxbury/Thomson.


Hydrology and Water Resources

[15] England, J. F., Jr., Cohn, T. A., Faber, B. A., Stedinger, J. R., Thomas, W. O., Jr., Veilleux, A. G., Kiang, J. E., & Mason, R. R., Jr. (2018). Guidelines for Determining Flood Flow Frequency—Bulletin 17C. U.S. Geological Survey Techniques and Methods, Book 4, Chapter B5.

[16] Stedinger, J. R., Vogel, R. M., & Foufoula-Georgiou, E. (1993). Frequency analysis of extreme events. In D. R. Maidment (Ed.), Handbook of Hydrology (Chapter 18). McGraw-Hill.

[17] Helsel, D. R., Hirsch, R. M., Ryberg, K. R., Archfield, S. A., & Gilroy, E. J. (2020). Statistical Methods in Water Resources. U.S. Geological Survey Techniques and Methods, Book 4, Chapter A3.

[18] Cunnane, C. (1978). Unbiased plotting positions—A review. Journal of Hydrology, 37(3-4), 205-222.

[19] Cohn, T. A., England, J. F., Berenbrock, C. E., Mason, R. R., Stedinger, J. R., & Lamontagne, J. R. (2013). A generalized Grubbs-Beck test statistic for detecting multiple potentially influential low outliers in flood series. Water Resources Research, 49(8), 5047-5058.

[20] Eckhardt, K. (2005). How to construct recursive digital filters for baseflow separation. Hydrological Processes, 19(2), 507-515.

[21] Rao, A. R., & Hamed, K. H. (2000). Flood Frequency Analysis. CRC Press.


Model Evaluation

[22] Moriasi, D. N., Arnold, J. G., Van Liew, M. W., Bingner, R. L., Harmel, R. D., & Veith, T. L. (2007). Model evaluation guidelines for systematic quantification of accuracy in watershed simulations. Transactions of the ASABE, 50(3), 885-900.

[23] Moriasi, D. N., Gitau, M. W., Pai, N., & Daggupati, P. (2015). Hydrologic and water quality models: Performance measures and evaluation criteria. Transactions of the ASABE, 58(6), 1763-1785.

[24] Nash, J. E., & Sutcliffe, J. V. (1970). River flow forecasting through conceptual models part I—A discussion of principles. Journal of Hydrology, 10(3), 282-290.

[25] Gupta, H. V., Kling, H., Yilmaz, K. K., & Martinez, G. F. (2009). Decomposition of the mean squared error and NSE performance criteria: Implications for improving hydrological modelling. Journal of Hydrology, 377(1-2), 80-91.

[26] Legates, D. R., & McCabe, G. J. (1999). Evaluating the use of "goodness-of-fit" measures in hydrologic and hydroclimatic model validation. Water Resources Research, 35(1), 233-241.

[27] Burnham, K. P., & Anderson, D. R. (2002). Model Selection and Multimodel Inference: A Practical Information-Theoretic Approach (2nd ed.). Springer.

[28] Akaike, H. (1974). A new look at the statistical model identification. IEEE Transactions on Automatic Control, 19(6), 716-723.

[29] Schwarz, G. (1978). Estimating the dimension of a model. Annals of Statistics, 6(2), 461-464.

[30] Murphy, A. H. (1988). Skill scores based on the mean square error and their relationships to the correlation coefficient. Monthly Weather Review, 116(12), 2417-2424.

[31] Kling, H., Fuchs, M., & Paulin, M. (2012). Runoff conditions in the upper Danube basin under an ensemble of climate change scenarios. Journal of Hydrology, 424-425, 264-277.


Copulas and Multivariate Analysis

[32] Nelsen, R. B. (2006). An Introduction to Copulas (2nd ed.). Springer.

[33] Joe, H. (1997). Multivariate Models and Dependence Concepts. Chapman & Hall.

[34] Genest, C., & Favre, A.-C. (2007). Everything you always wanted to know about copula modeling but were afraid to ask. Journal of Hydrologic Engineering, 12(4), 347-368.

[35] Salvadori, G., De Michele, C., Kottegoda, N. T., & Rosso, R. (2007). Extremes in Nature: An Approach Using Copulas. Springer.

[36] Salvadori, G., & De Michele, C. (2004). Frequency analysis via copulas: Theoretical aspects and applications to hydrological events. Water Resources Research, 40(12).

[37] Anderson, T. W. (2003). An Introduction to Multivariate Statistical Analysis (3rd ed.). Wiley.

[38] Kotz, S., Balakrishnan, N., & Johnson, N. L. (2000). Continuous Multivariate Distributions, Volume 1: Models and Applications (2nd ed.). Wiley.

[39] Johnson, N. L., Kotz, S., & Balakrishnan, N. (1997). Discrete Multivariate Distributions. Wiley.

[40] Tong, Y. L. (1990). The Multivariate Normal Distribution. Springer.

[41] Kotz, S., & Nadarajah, S. (2004). Multivariate t Distributions and Their Applications. Cambridge University Press.


Numerical Integration

[42] Piessens, R., de Doncker-Kapenga, E., Überhuber, C. W., & Kahaner, D. K. (1983). QUADPACK: A Subroutine Package for Automatic Integration. Springer.

[43] Press, W. H., & Farrar, G. R. (1990). Recursive stratified sampling for multidimensional Monte Carlo integration. Computers in Physics, 4(2), 190-195.

[44] Lepage, G. P. (1978). A new algorithm for adaptive multidimensional integration. Journal of Computational Physics, 27(2), 192-203.


Numerical Differentiation

[45] Ridders, C. J. F. (1982). Accurate computation of F'(x) and F'(x)F''(x). Advances in Engineering Software, 4(2), 75-76.


Optimization

[46] Nocedal, J., & Wright, S. J. (2006). Numerical Optimization (2nd ed.). Springer.

[47] Nelder, J. A., & Mead, R. (1965). A simplex method for function minimization. The Computer Journal, 7(4), 308-313.

[48] Storn, R., & Price, K. (1997). Differential evolution—A simple and efficient heuristic for global optimization over continuous spaces. Journal of Global Optimization, 11(4), 341-359.

[49] Duan, Q., Sorooshian, S., & Gupta, V. K. (1994). Optimal use of the SCE-UA global optimization method for calibrating watershed models. Journal of Hydrology, 158(3-4), 265-284.

[50] Kennedy, J., & Eberhart, R. (1995). Particle swarm optimization. Proceedings of the IEEE International Conference on Neural Networks (ICNN'95), 4, 1942-1948.


Linear Algebra

[51] Golub, G. H., & Van Loan, C. F. (2013). Matrix Computations (4th ed.). Johns Hopkins University Press.

[52] Trefethen, L. N., & Bau, D. (1997). Numerical Linear Algebra. SIAM.


Root Finding

[53] Brent, R. P. (1973). Algorithms for Minimization without Derivatives. Prentice-Hall.

[54] Sprott, J. C. (1991). Numerical Recipes, Routines and Examples in Basic. Cambridge University Press.

[55] Süli, E., & Mayers, D. (2003). An Introduction to Numerical Analysis. Cambridge University Press.


Interpolation

[56] Akima, H. (1970). A new method of interpolation and smooth curve fitting based on local procedures. Journal of the ACM, 17(4), 589-602.

[57] de Boor, C. (2001). A Practical Guide to Splines (Rev. ed.). Springer.


Random Number Generation

[58] Matsumoto, M., & Nishimura, T. (1998). Mersenne Twister: A 623-dimensionally equidistributed uniform pseudo-random number generator. ACM Transactions on Modeling and Computer Simulation, 8(1), 3-30.

[59] Niederreiter, H. (1992). Random Number Generation and Quasi-Monte Carlo Methods. SIAM.

[60] McKay, M. D., Beckman, R. J., & Conover, W. J. (1979). A comparison of three methods for selecting values of input variables in the analysis of output from a computer code. Technometrics, 21(2), 239-245.

[61] Owen, A. B. (2003). Quasi-Monte Carlo sampling. In Monte Carlo Ray Tracing: Siggraph 2003 Course 44, 69-88.


MCMC and Bayesian Methods

[62] Gelman, A., Carlin, J. B., Stern, H. S., Dunson, D. B., Vehtari, A., & Rubin, D. B. (2013). Bayesian Data Analysis (3rd ed.). CRC Press.

[63] Robert, C. P., & Casella, G. (2004). Monte Carlo Statistical Methods (2nd ed.). Springer.

[64] Haario, H., Saksman, E., & Tamminen, J. (2001). An adaptive Metropolis algorithm. Bernoulli, 7(2), 223-242.

[65] ter Braak, C. J. F., & Vrugt, J. A. (2008). Differential Evolution Markov Chain with snooker updater and fewer chains. Statistics and Computing, 18(4), 435-446.

[66] Neal, R. M. (2011). MCMC using Hamiltonian dynamics. In Handbook of Markov Chain Monte Carlo (pp. 113-162). CRC Press.

[67] Vehtari, A., Gelman, A., Simpson, D., Carpenter, B., & Bürkner, P.-C. (2021). Rank-normalization, folding, and localization: An improved R-hat for assessing convergence of MCMC. Bayesian Analysis, 16(2), 667-718.

[68] Vrugt, J. A. (2016). Markov chain Monte Carlo simulation using the DREAM software package: Theory, concepts, and MATLAB implementation. Environmental Modelling & Software, 75, 273-316.

[69] Hoffman, M. D., & Gelman, A. (2014). The No-U-Turn Sampler: Adaptively setting path lengths in Hamiltonian Monte Carlo. Journal of Machine Learning Research, 15(47), 1593-1623.

[70] Metropolis, N., Rosenbluth, A. W., Rosenbluth, M. N., Teller, A. H., & Teller, E. (1953). Equation of state calculations by fast computing machines. The Journal of Chemical Physics, 21(6), 1087-1092.

[71] Gelman, A., & Rubin, D. B. (1992). Inference from iterative simulation using multiple sequences. Statistical Science, 7(4), 457-472.

[72] Sobol, I. M. (1967). On the distribution of points in a cube and the approximate evaluation of integrals. USSR Computational Mathematics and Mathematical Physics, 7(4), 86-112.

[73] Roberts, G. O., Gelman, A., & Gilks, W. R. (1997). Weak convergence and optimal scaling of random walk Metropolis algorithms. Annals of Applied Probability, 7(1), 110-120.

[74] Roberts, G. O., & Rosenthal, J. S. (2001). Optimal scaling for various Metropolis-Hastings algorithms. Statistical Science, 16(4), 351-367.

[75] Betancourt, M. (2017). A conceptual introduction to Hamiltonian Monte Carlo. arXiv preprint arXiv:1701.02434.

[76] Geyer, C. J. (1992). Practical Markov chain Monte Carlo. Statistical Science, 7(4), 473-483.

[77] Flegal, J. M., Haran, M., & Jones, G. L. (2008). Markov chain Monte Carlo: Can we trust the third significant figure? Statistical Science, 23(2), 250-260.


Uncertainty Analysis

[78] Efron, B., & Tibshirani, R. J. (1993). An Introduction to the Bootstrap. Chapman & Hall.

[79] Stedinger, J. R. (1983). Confidence intervals for design events. Journal of Hydraulic Engineering, 109(1), 13-27.

[80] Hirsch, R. M., & Stedinger, J. R. (1987). Plotting positions for historical floods and their precision. Water Resources Research, 23(4), 715-727.

[81] Efron, B. (1987). Better bootstrap confidence intervals. Journal of the American Statistical Association, 82(397), 171-185.

[82] Davison, A. C., & Hinkley, D. V. (1997). Bootstrap Methods and Their Application. Cambridge University Press.


Special Distributions

[83] Weibull, W. (1951). A statistical distribution function of wide applicability. Journal of Applied Mechanics, 18(3), 293-297.

[84] Vose, D. (2008). Risk Analysis: A Quantitative Guide (3rd ed.). Wiley.

[85] McLachlan, G., & Peel, D. (2000). Finite Mixture Models. Wiley.

[86] Silverman, B. W. (1986). Density Estimation for Statistics and Data Analysis. Chapman & Hall.

[87] Mardia, K. V., & Jupp, P. E. (2000). Directional Statistics. Wiley.


Goodness-of-Fit Tests

[88] D'Agostino, R. B., & Stephens, M. A. (1986). Goodness-of-Fit Techniques. Marcel Dekker.

[89] Anderson, T. W., & Darling, D. A. (1954). A test of goodness of fit. Journal of the American Statistical Association, 49(268), 765-769.


Time Series

[90] Box, G. E. P., Jenkins, G. M., Reinsel, G. C., & Ljung, G. M. (2015). Time Series Analysis: Forecasting and Control (5th ed.). Wiley.

[91] Box, G. E. P., & Cox, D. R. (1964). An analysis of transformations. Journal of the Royal Statistical Society: Series B, 26(2), 211-252.


Data Sources

[92] U.S. Geological Survey. USGS Water Services. https://waterservices.usgs.gov/

[93] Environment and Climate Change Canada. Historical Hydrometric Data. https://wateroffice.ec.gc.ca/

[94] Australian Bureau of Meteorology. Water Data Online. https://www.bom.gov.au/waterdata/


Machine Learning

[95] Nelder, J. A., & Wedderburn, R. W. M. (1972). Generalized linear models. Journal of the Royal Statistical Society: Series A, 135(3), 370-384.

[96] Breiman, L., Friedman, J. H., Olshen, R. A., & Stone, C. J. (1984). Classification and Regression Trees. Wadsworth.

[97] Breiman, L. (2001). Random forests. Machine Learning, 45(1), 5-32.

[98] Cover, T., & Hart, P. (1967). Nearest neighbor pattern classification. IEEE Transactions on Information Theory, 13(1), 21-27.

[99] Zhang, H. (2004). The optimality of naive Bayes. Proceedings of the Seventeenth International FLAIRS Conference, 562-567.

[100] MacQueen, J. (1967). Some methods for classification and analysis of multivariate observations. Proceedings of the Fifth Berkeley Symposium on Mathematical Statistics and Probability, 1, 281-297.

[101] Bishop, C. M. (2006). Pattern Recognition and Machine Learning. Springer.

[102] Jenks, G. F. (1967). The data model concept in statistical mapping. International Yearbook of Cartography, 7, 186-190.

[103] Hastie, T., Tibshirani, R., & Friedman, J. (2009). The Elements of Statistical Learning (2nd ed.). Springer.

[104] Dempster, A. P., Laird, N. M., & Rubin, D. B. (1977). Maximum likelihood from incomplete data via the EM algorithm. Journal of the Royal Statistical Society: Series B, 39(1), 1-38.

[105] Arthur, D., & Vassilvitskii, S. (2007). k-means++: The advantages of careful seeding. Proceedings of the 18th Annual ACM-SIAM Symposium on Discrete Algorithms, 1027-1035.