Self-study: CMU Intermediate Statistics 36-705 Fall 2016 materials.

This page is intended as a resource for those wishing to self-study the graduate-level course "Intermediate Statistics", taught by Larry Wasserman to MSc and PhD students in machine learning at Carnegie Mellon University in the fall of 2016. It contains all publicly available course materials from the official course page; work I completed as part of my own self-study; and my own comments and materials for those who wish to benefit similarly.

These course materials used to be publicly available at Larry Wasserman’s course page, but are no longer accessible in their original location, having been superseded by a later iteration of the course.

I have archived these previously available course materials, and also the original course page, in my GitHub repo below:

https://github.com/cyber-rhythms/cmu-36-705-intermediate-statistics-fall-2016

The above repo can be cloned in its entirety if one wishes to download all the materials linked from this page at once.

For those wishing to use the materials from this course for self-study, the advantage of this particular iteration over later versions is that it is linked to recorded lectures that are publicly available on YouTube.

In order to facilitate the use of the course materials for those self-studying, I have based this page on the original course page and course syllabus as faithfully as possible. I have endeavoured to keep my own comments to a minimum, at least in the first part of this page.

In the second part of this page, under “Self-study support”, I include additional comments specific to my own experiences with self-studying this course. I also include additional materials such as my scanned notes, lecture summaries, and homework assignment and practice test write-ups; and supplementary reference texts I personally found useful.

Disclosure: I am an affiliate of Bookshop.org. This page contains affiliate links, so if you purchase a reference textbook through one of these links, you will not pay anything in addition to the listed price, but I will receive a small commission. If you wish to find out more about Bookshop, please visit their webpage.

Why should I study this course?

As someone undertaking a programme of self-study in machine learning using CMU’s publicly available materials, I found this course an invaluable foundation in a prerequisite of modern machine learning methods: statistics. That is something I only came to appreciate after having studied the course.

On the relevance of this course to machine learning, here is an extract from Larry Wasserman’s book, All of Statistics:

Statistics, data mining, and machine learning are all concerned with collecting and analyzing data. For some time, statistics research was conducted in statistics departments, while data mining and machine learning research was conducted in computer science departments. Statisticians thought that computer scientists were reinventing the wheel. Computer scientists thought that statistical theory didn’t apply to their problems.

Things are changing. Statisticians now recognize that computer scientists are making novel contributions while computer scientists now recognize the generality of statistical theory and methodology. Clever data mining algorithms are more scalable than statisticians ever thought possible. Formal statistical theory is more pervasive than computer scientists had realized.

Students who analyze data, or who aspire to develop new methods for analyzing data, should be well grounded in basic probability and mathematical statistics. Using fancy tools like neural nets, boosting, and support vector machines without understanding basic statistics is like doing brain surgery before knowing how to use a band-aid.

Course description.

This course will cover the fundamentals of theoretical statistics. Topics include: VC theory, convergence, point and interval estimation, maximum likelihood, hypothesis testing, data reduction, Bayesian inference, nonparametric statistics, and bootstrap resampling. We will cover Chapters 1-12 of the text, plus some supplementary material.

This course is excellent preparation for advanced work in statistics and machine learning.

The main textbook to accompany this course is the following:

Wasserman, L. (2004). All of Statistics: A Concise Course In Statistical Inference. Springer.

Course prerequisites.

Familiarity with basic probability and statistics is assumed. You should already know the following concepts: probability, distribution functions, density functions, moments, transformations of variables, and moment generating functions.
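
As a quick self-check on these prerequisites, one should, for example, be comfortable with a standard derivation such as computing the moment generating function of a standard normal variable and recovering its moments by differentiation:

\[
M_X(t) = \mathbb{E}\big[e^{tX}\big] = \int_{-\infty}^{\infty} e^{tx} \, \frac{1}{\sqrt{2\pi}} e^{-x^2/2} \, dx = e^{t^2/2}, \qquad \mathbb{E}[X] = M_X'(0) = 0, \quad \mathbb{E}[X^2] = M_X''(0) = 1.
\]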

Familiarity with the material in Chapters 1-3 of the book (basic probability) is assumed.

These prerequisites are oriented towards the average CMU graduate student. For those self-studying, the above applies equally. See also “Additional comments on course prerequisites” in the second part of this page.

Course outline, syllabus, and lecture notes.

The original copy of the PDF course syllabus is located here.

The content of the course covers 18 topics spread across 18 sets of PDF lecture notes.

  1. Brief review of basic probability.
  2. Probability inequalities.
  3. Vapnik-Chervonenkis theory and uniform bounds.
  4. Convergence.
  5. Statistical models and sufficiency.
  6. Likelihood.
  7. Parametric point estimation.
  8. Minimax theory.
  9. Asymptotic theory.
  10. Hypothesis testing.
  11. Confidence intervals.
  12. Nonparametric inference.
  13. The bootstrap.
  14. Bayesian inference.
  15. Prediction.
  16. Model selection.
  17. Causal inference.
  18. Multiple testing and confidence intervals.

All the lecture notes can be downloaded in their entirety in this subdirectory of my GitHub repo.

Recorded lectures and course schedule.

The content of the above lecture notes was covered across a series of 36 scheduled lectures in the fall of 2016.

The course runs for 15 weeks, including a one-week break. The lecture schedule, extracted from the original course page, was as follows:

Week of      | Monday             | Wednesday            | Thursday | Friday
August 29    | Review             | Review, inequalities | HW1      | Inequalities
September 5  | No class           | Inequalities         |          | Test I
September 12 | VC theory          | Convergence          | HW2      | Convergence
September 19 | Convergence        | Convergence          |          | No class
September 26 | Likelihood         | Point estimation     | HW3      | Minimax theory
October 3    | Minimax theory     | Asymptotics          | HW4      | Asymptotics
October 10   | Asymptotics        | Review               |          | Test II
October 17   | Testing            | Testing              | HW5      | Mid-semester break
October 24   | Testing            | Confidence intervals | HW6      | Confidence intervals
October 31   | Nonparametrics     | Nonparametrics       | HW7      | Bootstrap
November 7   | Bootstrap          | Review               |          | Test III
November 14  | Bayesian inference | Bayesian inference   | HW8      | Prediction
November 21  | No class           | No class             |          |
November 28  | Prediction         | Causation            | HW9      | Causation
December 5   | Model selection    | Multiple testing     | HW10     | Review

Around 25 of the 36 scheduled lectures were recorded and uploaded to YouTube. I have taken the liberty of adding individual links to each recorded lecture.

These recordings contain instructor-led exposition of the notes, delivered on the blackboard, with intuitive details, further examples, and illustrative diagrams that are not in the official course lecture notes.

For this reason I have included my own additional materials, developed whilst watching these lectures. These can be found further down in the second part of this page under “Lecture summaries, session notes, course lecture notes reviews”.

Homework assignments.

The course involves 10 homework assignments.

Homework assignment 1.

Homework assignment 2.

Homework assignment 3.

Homework assignment 4.

Homework assignment 5.

Homework assignment 6.

Homework assignment 7.

Homework assignment 8.

Homework assignment 9.

Homework assignment 10.

All the homework assignments can be downloaded in their entirety in this subdirectory of my GitHub repo.

For those self-studying, please see “Additional comments on homework assignments” in the second part of this page.

Homework assignment solutions.

In self-study, there is no instructor or teaching assistant marking one’s work, and no opportunity for feedback. However, there are official solutions available against which one can check one’s own attempts.

To assist in the responsible use of these solutions during self-study, I have stored the solutions to each homework assignment in a password-protected WinRAR archive. To access an archive after downloading it, the required password is:

hw#-ive-attempted-this-to-the-best-of-my-ability.

where # is a number from 1 to 10 for each solution set.

For example, if you want to access the solutions for homework assignment 1, you will need to enter:

hw1-ive-attempted-this-to-the-best-of-my-ability.
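
If one prefers to script the extraction rather than use a GUI archive tool, the following is a minimal sketch using the third-party Python package rarfile, which requires an unrar backend installed on the system. The archive filename shown is hypothetical; substitute the actual name of the downloaded file.

```python
# Minimal sketch: extract a password-protected homework solutions archive.
# Assumes `pip install rarfile` and an unrar backend available on the path.
import rarfile

# Hypothetical filename - substitute the actual downloaded archive name.
archive = rarfile.RarFile("hw1-solutions.rar")

# If the archive rejects this password, try it without the trailing period.
archive.extractall(pwd="hw1-ive-attempted-this-to-the-best-of-my-ability.")
```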

Homework assignment 1 solutions.

Homework assignment 2 solutions.

Homework assignment 3 solutions.

Homework assignment 4 solutions.

Homework assignment 5 solutions.

Homework assignment 6 solutions.

Homework assignment 7 solutions.

Homework assignment 8 solutions.

Homework assignment 9 solutions.

Homework assignment 10 solutions.

All homework solutions can be downloaded in their entirety in this subdirectory of my GitHub repo.

As part of self-study I attempted all the homework assignments. After completing them to the best of my ability, I transcribed my work as PDF write-ups and used the above solutions to correct them; corrections are shown in italics. These are linked further down the page under “Homework assignments: my write-ups”.

Practice tests.

There are three practice tests for the course:

Test II practice.

Test III practice.

Final exam practice.

There are no official course solutions for the practice tests, although a selection of questions is covered in the review lectures. As part of self-study I wrote up my own solutions; because these may be useful to others, they are linked further down the page under “Practice tests: my write-ups”.

Recommended texts.

The following are additional recommended texts from the course syllabus:

Casella, G., Berger, R. L. (2002). Statistical Inference (2nd ed.). Duxbury Press.

Bickel, P. J., Doksum, K. A. (2015). Mathematical Statistics: Basic Ideas and Selected Topics, Volume I (2nd ed.). Chapman and Hall/CRC.

Rice, J. A. (2006). Mathematical Statistics and Data Analysis. (3rd ed.). Duxbury Press.

van der Vaart, A. (2000). Asymptotic Statistics. Cambridge University Press.

I have updated the edition number of each recommended reference text to be the latest edition at the time of writing.

Further extensions.

This course continues to form one of the four compulsory core courses for the CMU Primary Master’s in Machine Learning.

This course is itself a prerequisite for many more advanced courses at CMU. Of particular interest are 36-702: Statistical Machine Learning and, more recently, 10-716: New Statistical Machine Learning, because the material covered in those courses naturally follows on from 36-705.

At some point in the future, I will archive and use these for self-study purposes.

Subsequent iterations of this course.

There is a later iteration of this course from fall 2019 at Siva Balakrishnan’s course page, and another from spring 2020 at Larry Wasserman’s course page. Both links were active at the time of writing.

These subsequent iterations do not have accompanying publicly available recorded lecture videos.

In light of this, and having audited these later iterations myself, I find that the overlap in content between the fall 2016 iteration and the subsequent ones is significant enough that the fall 2016 iteration remains highly useful.

Please see “Additional comments on subsequent iterations of the course” further down the page for more specific comments on the similarities and differences in content; and for comments on how materials from these subsequent iterations might be used to supplement this fall 2016 iteration.

Self-study support.

In what follows, I include some additional comments from my own experiences self-studying the course. These may be helpful to those wishing to make use of this course in a similar fashion.

I also include additional materials that I wrote when self-studying this course. These include my handwritten lecture notes and high-level summaries for each recorded video lecture; write-ups of my own attempts at the homework assignments, corrected against the solutions; write-ups of my attempts at the practice tests, for which comprehensive solutions are not available; and supplementary reference textbooks that I found useful.

The usual caveats with these additional materials apply: use them at your own risk.

Additional comments on course prerequisites.

The official course prerequisites have been stated above; the following represents my own views, from a self-study perspective, on some of the content that is necessary to master the material covered in this course.

This course is not suitable as a first course in probability and statistics. If the contents of Chapters 1-3 of All of Statistics are completely unfamiliar, it is advisable to take an easier course first.

The course additionally requires a degree of literacy in the formal arguments common to real analysis. For example, one will see frequent use of \(\epsilon\)-\(\delta\) arguments, limits, continuity, pointwise and uniform convergence, and greatest lower and least upper bounds. These will undoubtedly be tricky if you have never seen such arguments before, as they represent the more rigorous side of mathematics.
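
For illustration, the distinction between pointwise and uniform convergence of a sequence of functions \(f_n\) to \(f\), written in the quantifier style one will encounter, is:

\[
f_n \to f \ \text{pointwise} \iff \forall x \, \forall \epsilon > 0 \, \exists N \, \forall n \geq N : \; |f_n(x) - f(x)| < \epsilon,
\]
\[
f_n \to f \ \text{uniformly} \iff \forall \epsilon > 0 \, \exists N \, \forall n \geq N \, \forall x : \; |f_n(x) - f(x)| < \epsilon.
\]

The only change is the position of the quantifier over \(x\); being comfortable reading and manipulating statements of this kind is roughly the level of fluency required.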

However, the analysis arguments made are not so involved that taking a separate course in analysis is required, and it is possible to get by with an intuitive, or even basic Wikipedia-level, understanding of the concepts above. That said, one would benefit from reading a few chapters of a book on analysis.

For this reason, I found reading a few chapters, and doing some exercises, from Understanding Analysis by Stephen Abbott to be helpful. I have listed this text under Supplementary texts.

If one wants a terser refresher of analysis concepts specifically in the context of the asymptotics section of the course, a remarkably clear reference is the first few chapters of Erich Lehmann’s “Elements of Large Sample Theory”. This is also listed under Supplementary texts.

A little bit about my own background. I came to this course with secondary-school knowledge of pure mathematics (calculus and algebra) and statistics. I had also completed a few introductory statistics courses, as well as a number of econometrics courses covering regression analysis and time series, for my economics BSc undergraduate degree. The statistical content of the latter was geared towards economics, and did not go into the construction of statistical tools in the kind of detail covered in this course.

I also do not have formal mathematical training, a distinguishing feature of which is a real analysis course; I had only passing familiarity with such material from further mathematics in secondary school. It is for this reason that I recommend the chapters in Abbott to those in a similar situation.

Additional comments on course content.

This is a course in theoretical statistics with a view to how it connects with machine learning, rather than, say, a data analysis course. One can therefore view its aim as supplying the necessary background knowledge and introductory formalisms with research-oriented aims in mind: the machine learning MSc at CMU is geared towards reading and writing research papers in machine learning. For this reason, the course is not intended to supply information on the practicalities of conducting data and statistical analysis, such as working in R or Python.

Whilst much of the course covers theoretical statistics, there is a distinctive thread emphasising aspects of more modern techniques, known as statistical machine learning: the statistical study of modern machine learning methods. In particular, the sections on concentration inequalities, minimax theory, nonparametric inference, bootstrapping, prediction, causal inference, and multiple testing do not tend to be encountered in introductory theoretical statistics courses.
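
To give a flavour of the concentration-of-measure thread, a representative result covered early in the course is Hoeffding’s inequality: for independent random variables \(X_1, \dots, X_n\) with \(a_i \leq X_i \leq b_i\) and sample mean \(\bar{X}_n\),

\[
\mathbb{P}\big( |\bar{X}_n - \mathbb{E}[\bar{X}_n]| \geq t \big) \leq 2 \exp\!\left( -\frac{2 n^2 t^2}{\sum_{i=1}^{n} (b_i - a_i)^2} \right).
\]

For variables bounded in \([0, 1]\) this reduces to \(2 e^{-2 n t^2}\), and bounds of this exponential-in-\(n\) form are the building blocks of the uniform (VC-theory) bounds covered in the early lectures.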

Additional comments on video lectures.

The exposition in the video lectures is extremely clear. The utility of watching these lectures over and above working through the notes and his book by oneself cannot be overstated.

Much of this comes down to communication: I am fairly certain that I would not have been able to develop the same level of understanding had I just worked through the notes, the book, and the problem sets by myself.

The position I have just espoused may just be an artefact of my own self-study preferences. However, the reason I hold it is that there are many points of emphasis that are difficult to communicate to oneself in the monotone voice in which one reads a book or a set of notes, and which are better communicated by an experienced instructor such as Larry Wasserman. Furthermore, the diagrams drawn on the blackboard help immeasurably in communicating the intuition of what is going on.

The only minor issues are of a technical nature, and these teething problems can be overcome. On one or two occasions, the blackboard may not be in proper focus, or the audio may not be as discernible as one would like.

For the very occasional issues with blackboard focus, I used the section of the notes being covered to assist in discerning what was being written. For audio issues, turn up the volume and carefully follow the section of the notes being covered.

Perhaps the most significant issue, though by no means insurmountable, is that there are not enough of these excellent recorded lectures: 11 of the scheduled lectures are missing, especially from the later parts of the course schedule. My recommendation in this case is to be more diligent in cross-referencing the arguments with the reference textbooks, or to do some Googling for other reputable sources; to take more time with the arguments made in the corresponding course lecture notes; and to use the more detailed and explicit lecture notes from the subsequent fall 2019 iteration of this course as a supplement. Please see “Additional comments on subsequent iterations of the course” for more information on this last point.

Additional comments on homework assignments.

The following are some pointers and areas of support that I found helpful when completing the assignments, geared towards those completing them by themselves.

Do not be discouraged if you get stuck on a question or if you cannot complete a question in its entirety.

A helpful counterpoint to bear in mind: if one could complete all the homework assignments without getting stuck, or finish every question without breaking a sweat, then the course is likely not sufficiently challenging, and there is probably not much one could learn from it in the first place.

Do not be discouraged when a ‘trick’ or ‘observation’ that is not immediately obvious to you impedes your ability to complete a question.

Whilst a decent proportion of the homework questions will fairly naturally follow on from the worked examples covered in the lectures, some of the questions may involve a particular ‘trick’ or ‘observation’ that can only come with experience. This is inherent to the nature of mathematical problem-solving, and it would be a lie to state that this can be easily skirted around.

The good news is that there is a relative paucity of these kinds of questions. If one cannot complete a question purely through an inability to ‘see the trick’, one can take consolation in the fact that the primary aim of the homework assignments is not mathematical problem-solving as such, but rather to assist in the assimilation of the theoretical statistics concepts taught in the course.

That being said, knowledge of these tricks, as one comes across them, is in general useful for mathematical problem-solving, so one should not discount them altogether.

Manage your expectations concerning the time it will take to complete a whole-hearted attempt at each homework assignment.

It may be wise to manage one’s expectations concerning the degree to which the one-week homework deadline can be adhered to, given that one’s access to experienced instructors and teaching assistants is likely to be limited.

If you are really stuck, and do not have the benefit of support, consider posting your question on Q&A sites such as Cross Validated or Mathematics Stack Exchange.

The Cross Validated Q&A site is suitable for all questions in statistics, machine learning, and data science.

If the course throws up questions of a conceptual nature that you cannot get a handle on after consulting multiple references on the same topic, then consider posting them there.

If, after a whole-hearted struggle with a homework problem, you cannot make any further headway, consider posting it there with a self-study tag, and members of the community will try to assist you with hints.

In the event that your question is on a purely mathematical aspect of the course, the Mathematics Stack Exchange is more appropriate, although it also accepts statistics, machine learning, and data science questions.

Whilst there is no guarantee that your question will be answered, there is a very helpful community there willing to lend their expertise. It might be a good idea to see the guidelines on how to ask a good question to enhance the chances of receiving a response. If you have enough reputation, you can also add a bounty to your question to draw more attention to it.

Additional comments on reference textbooks.

I consulted All of Statistics by Wasserman and Statistical Inference by Casella and Berger most frequently, because a fair proportion of the course is based on parts of these texts.

I only occasionally consulted Mathematical Statistics: Basic Ideas and Selected Topics, Volume I by Bickel and Doksum, and Mathematical Statistics and Data Analysis by Rice. However, they were useful when I was confused by a particular topic and needed to read about it from a different source.

I consulted Asymptotic Statistics by van der Vaart the least frequently, and then only out of curiosity about more advanced topics outside the scope of the course, e.g. Wald’s consistency proof for maximum likelihood estimators.

As for hard copies, I purchased All of Statistics, Statistical Inference, and Asymptotic Statistics.

At first I self-studied this course using only electronic copies of the textbooks. However, I would highly recommend purchasing, at a minimum, copies of All of Statistics and Statistical Inference.

Working with electronic copies of textbooks has its merits, such as being able to search for a particular term efficiently. However, quickly referring to and comparing the contents of multiple pages, which is easy with a hard copy, is not easy in an electronic PDF reader.

Additional comments on subsequent iterations of the course.

The core content of the subsequent fall 2019 iteration and spring 2020 iteration of this course and the fall 2016 version archived on this page are mostly the same.

The differences are that the lecture notes in the subsequent iterations are more explicit and detailed, in particular the fall 2019 notes, and that a number of additional points of interest are covered within each topic.

For example, for probability inequalities, a few additional concentration inequalities are supplied and Rademacher complexity is introduced. Some proofs and examples that are covered on the blackboard in the fall 2016 iteration are documented in more detail in the fall 2019 lecture notes. Supplementary points with broader significance in statistics and machine learning are also given more exposition there; for example, one briefly sees how probability inequalities relate to the celebrated Johnson-Lindenstrauss lemma. The later notes also document informal intuition, of the kind normally communicated orally in lectures, to assist understanding.
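
For reference, the Johnson-Lindenstrauss lemma mentioned above states, in one standard form, that for any \(0 < \epsilon < 1\) and any \(n\) points in \(\mathbb{R}^d\), there exists a linear map \(f : \mathbb{R}^d \to \mathbb{R}^k\) with \(k = O(\epsilon^{-2} \log n)\) such that, for every pair of points \(u, v\),

\[
(1 - \epsilon) \, \|u - v\|^2 \;\leq\; \|f(u) - f(v)\|^2 \;\leq\; (1 + \epsilon) \, \|u - v\|^2.
\]

Its standard proof is a direct application of concentration inequalities of the kind covered in the course, which is presumably why it appears in the fall 2019 notes.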

Hence it may be a good idea to supplement the fall 2016 lecture notes on this page with those of the fall 2019 version. The homework assignments for the later versions also differ from those of the fall 2016 iteration, providing a further source of practice.

In particular, reviewing the more detailed fall 2019 lecture course notes would be helpful for dealing with the ~11 missing lectures in this fall 2016 iteration, especially on the later topics such as prediction, causation, model selection and multiple testing.

At some point in the future, I plan to create a document listing the small amount of additional material covered in subsequent iterations that is not covered in this fall 2016 iteration.

Lecture summaries, session notes, course lecture notes reviews.

As part of self-study and to assist my understanding, I kept a bullet-pointed, high-level summary of the content of each recorded lecture in a single PDF document. It is organised by the original date each lecture took place (see the course schedule above), with additional information on which course lecture notes each recorded lecture covers. This document is available below:

Lecture summaries.

The recorded video lectures contain further intuitive details, examples, and illustrative diagrams not covered in the 18 sets of PDF course notes. I therefore made handwritten notes of what was written on the blackboard and of important points of emphasis communicated orally by Larry Wasserman. Scanned versions of these notes, organised by the date of the original lectures, are below:

31/08/16 lecture session notes.

02/09/16 lecture session notes.

07/09/16 lecture session notes.

12/09/16 lecture session notes.

14/09/16 lecture session notes.

16/09/16 lecture session notes.

19/09/16 lecture session notes.

21/09/16 lecture session notes.

26/09/16 lecture session notes.

28/09/16 lecture session notes.

30/09/16 lecture session notes.

05/10/16 lecture session notes.

10/10/16 lecture session notes.

12/10/16 lecture session notes.

17/10/16 lecture session notes.

19/10/16 lecture session notes.

In a few cases, the recorded lectures deliberately do not cover certain parts of the course lecture notes; these parts are instead meant to be reviewed in one’s own time. I also found that certain concepts or points which I did not grasp during a recorded lecture required a “sweep-up” during review.

Below are further scanned notes which I made corresponding to each of the course lecture notes that I reviewed:

All of Statistics Chapter 1 review notes.

All of Statistics Chapter 2 review notes.

All of Statistics Chapter 3 review notes.

Lecture Notes 2: Inequalities supplementary.

Lecture Notes 3: Uniform Bounds supplementary.

Lecture Notes 4: Convergence supplementary.

Lecture Notes 5: Statistical models supplementary.

Lecture Notes 6: Likelihood scanned supplementary.

Lecture Notes 7: Point estimation supplementary.

Lecture Notes 8: Minimax theory supplementary.

Lecture Notes 9: Asymptotics supplementary.

Homework assignments: my write-ups.

My attempts at the homework assignments are linked below. After completing my work on paper and transcribing it to LaTeX, I checked it against the official solutions; corrections are in italics.

Homework assignment 1 - my write-up.

Homework assignment 2 - my write-up, pending LaTeX transcription.

Homework assignment 3 - my write-up.

Homework assignment 4 - my write-up.

Homework assignment 5 - my write-up.

Homework assignment 6 - my write-up, pending LaTeX transcription.

Homework assignment 7 - my write-up, pending LaTeX transcription.

Homework assignment 8 - my write-up, pending LaTeX transcription.

Homework assignment 9 - my write-up, pending LaTeX transcription.

Homework assignment 10 - my write-up, pending LaTeX transcription.

My homework assignment write-ups are verbose rather than concise by formal mathematical standards. This was deliberate, to assist in better understanding the definitions and proof strategies covered in the course.

Practice tests: my write-ups.

As the practice tests have no official solutions, beyond a selection of questions covered on the blackboard during the review lectures, below are links to the solutions I wrote up when attempting them.

Test II practice - my write-up.

Test III practice - my write-up.

Final exam practice - my write-up.

Similar comments on the verbosity of my solutions apply in this case also.

Supplementary texts.

As previously mentioned under Additional comments on course prerequisites, the following is an excellent introductory text for developing a greater degree of fluency in analysis-style arguments used in the course:

Abbott, S. (2015). Understanding Analysis (2nd ed.). Undergraduate Texts in Mathematics. Springer.

If one wants a somewhat terser introduction to analysis concepts solely in the context of asymptotics, I found that dipping into a few chapters of the text below helped. Although its coverage of analysis concepts is terser, its exposition is on the whole remarkably clear. I have also found it to be a good general reference on large sample theory.

Lehmann, E. L. (1999). Elements of Large Sample Theory. Springer.

Some of the concentration inequalities in the course are not covered in either All of Statistics by Wasserman or Statistical Inference by Casella and Berger, and it is often helpful to read about them from another reputable source. The concentration inequalities in the book below go beyond the scope of the course, but it is useful as a reference to dip into:

Boucheron, S., Lugosi, G., Massart, P. (2016). Concentration Inequalities: A Nonasymptotic Theory of Independence. Oxford University Press.

Electronic copies of reference textbooks.

Whilst in an ideal world one would have hard copies of all the books here, I recognise that many are financially constrained and unable to purchase the books for themselves.

For this reason, here is a selection of links to freely available electronic copies of the reference textbooks that I was able to find:

Wasserman, L. (2004). All of Statistics: A Concise Course In Statistical Inference. Springer.

Casella, G., Berger, R. L. (2002). Statistical Inference (2nd ed.). Duxbury Press.

Rice, J. A. (2006). Mathematical Statistics and Data Analysis. (2nd ed.). Duxbury Press.

Abbott, S. (2015). Understanding Analysis (2nd ed.). Undergraduate Texts in Mathematics. Springer.

Lehmann, E. L. (1999). Elements of Large Sample Theory. Springer.

Boucheron, S., Lugosi, G., Massart, P. (draft ed.). Concentration Inequalities: A Nonasymptotic Theory of Independence. Oxford University Press.

At the time of writing, these links were all working. I will not actively maintain these links. Furthermore, some of the links above are not the most up-to-date editions, which may or may not be appropriate for one’s purposes.

In the event that a link goes dead, or for reference textbooks not included in the list above, if one really needs access to an electronic copy, the following is a possible solution:

Visit the Genesis Library, a link to which is contained in the following Wikipedia article.