Commit e0508ee ("all code examples should be up now")
1 parent: af6d146
19 files changed: 1715 additions, 1 deletion

README.md (1 addition, 1 deletion)

@@ -44,7 +44,7 @@ Excerpts from the [Foreword](./docs/foreword_ro.pdf) and [Preface](./docs/prefac
 10. Predicting Continuous Target Variables with Regression Analysis [[dir](./code/ch10)] [[ipynb](./code/ch10/ch10.ipynb)] [[nbviewer](http://nbviewer.ipython.org/github/rasbt/python-machine-learning-book/blob/master/code/ch10/ch10.ipynb)]
 11. Working with Unlabeled Data – Clustering Analysis [[dir](./code/ch11)] [[ipynb](./code/ch11/ch11.ipynb)] [[nbviewer](http://nbviewer.ipython.org/github/rasbt/python-machine-learning-book/blob/master/code/ch11/ch11.ipynb)]
 12. Training Artificial Neural Networks for Image Recognition [[dir](./code/ch12)] [[ipynb](./code/ch12/ch12.ipynb)] [[nbviewer](http://nbviewer.ipython.org/github/rasbt/python-machine-learning-book/blob/master/code/ch12/ch12.ipynb)]
-13. Parallelizing Neural Network Training via Theano
+13. Parallelizing Neural Network Training via Theano [[dir](./code/ch13)] [[ipynb](./code/ch13/ch13.ipynb)] [[nbviewer](http://nbviewer.ipython.org/github/rasbt/python-machine-learning-book/blob/master/code/ch13/ch13.ipynb)]

 ### [Literature References & Further Reading Resources](./docs/references.md)
Lines changed: 15 additions & 0 deletions

Dealing with missing data
Eliminating samples or features with missing values
Imputing missing values
Understanding the scikit-learn estimator API
Handling categorical data
Mapping ordinal features
Encoding class labels
Performing one-hot encoding on nominal features
Partitioning a dataset in training and test sets
Bringing features onto the same scale
Selecting meaningful features
Sparse solutions with L1 regularization
Sequential feature selection algorithms
Assessing feature importance with random forests
Summary
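The headings above (chapter 4) revolve around scikit-learn preprocessing; a minimal sketch of the two strategies for missing data, dropping vs. imputing, on a toy frame. The data here is illustrative, and `SimpleImputer` is the modern replacement for the `Imputer` class used in the book's edition:

```python
import numpy as np
import pandas as pd
from sklearn.impute import SimpleImputer

# Toy frame with missing values in columns C and D
df = pd.DataFrame([[1.0, 2.0, 3.0, 4.0],
                   [5.0, 6.0, np.nan, 8.0],
                   [10.0, 11.0, 12.0, np.nan]],
                  columns=['A', 'B', 'C', 'D'])

# Option 1: eliminate samples (rows) that contain missing values
dropped = df.dropna(axis=0)

# Option 2: impute each missing value with its column mean
imr = SimpleImputer(strategy='mean')
imputed = imr.fit_transform(df.values)
```

Dropping keeps only the first row here; imputation keeps all three rows and fills the NaN in column C with (3 + 12) / 2 = 7.5.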
Lines changed: 17 additions & 0 deletions

Unsupervised dimensionality reduction via principal component analysis
Total and explained variance
Feature transformation
Principal component analysis in scikit-learn
Supervised data compression via linear discriminant analysis
Computing the scatter matrices
Selecting linear discriminants for the new feature subspace
Projecting samples onto the new feature space
LDA via scikit-learn
Using kernel principal component analysis for nonlinear mappings
Kernel functions and the kernel trick
Implementing a kernel principal component analysis in Python
Example 1 – separating half-moon shapes
Example 2 – separating concentric circles
Projecting new data points
Kernel principal component analysis in scikit-learn
Summary
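For the PCA headings above, a hedged sketch of "total and explained variance" using scikit-learn's `PCA` on synthetic correlated data (the dataset and its shape are illustrative, not from the book):

```python
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.RandomState(1)
# Synthetic 2-D data with strong correlation between the two features
X = rng.randn(200, 2) @ np.array([[3.0, 1.0], [1.0, 0.5]])

pca = PCA(n_components=2)
X_pca = pca.fit_transform(X)  # samples projected onto the principal components

# Per-component share of the total variance, sorted in decreasing order;
# with all components kept, the ratios sum to 1
ratios = pca.explained_variance_ratio_
```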

code/_convenience_scripts/blank_tocs/ch06.toc

Lines changed: 4 additions & 0 deletions

Streamlining workflows with pipelines
Loading the Breast Cancer Wisconsin dataset
Combining transformers and estimators in a pipeline
Using k-fold cross-validation to assess model performance
The holdout method
K-fold cross-validation
Debugging algorithms with learning and validation curves
Diagnosing bias and variance problems with learning curves
Addressing overfitting and underfitting with validation curves
Fine-tuning machine learning models via grid search
Tuning hyperparameters via grid search
Algorithm selection with nested cross-validation
Looking at different performance evaluation metrics
Reading a confusion matrix
Optimizing the precision and recall of a classification model
Plotting a receiver operating characteristic
The scoring metrics for multiclass classification
Summary
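The ch06 headings combine pipelines with k-fold cross-validation; a minimal sketch using the standard scikit-learn API (the choice of estimator is illustrative):

```python
from sklearn.datasets import load_breast_cancer
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# Breast Cancer Wisconsin dataset, as named in the TOC above
X, y = load_breast_cancer(return_X_y=True)

# Chaining scaler and classifier ensures each CV fold is standardized
# using statistics computed from its own training split only
pipe = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))
scores = cross_val_score(pipe, X, y, cv=10)  # 10-fold cross-validation
```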

code/_convenience_scripts/blank_tocs/ch07.toc

Lines changed: 6 additions & 0 deletions

Learning with ensembles
Implementing a simple majority vote classifier
Combining different algorithms for classification with majority vote
Evaluating and tuning the ensemble classifier
Bagging – building an ensemble of classifiers from bootstrap samples
Leveraging weak learners via adaptive boosting
Summary
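Chapter 7 implements a majority-vote classifier from scratch; scikit-learn's `VotingClassifier` is the library counterpart, sketched here on Iris (the three base estimators are illustrative choices, not the book's exact ones):

```python
from sklearn.datasets import load_iris
from sklearn.ensemble import VotingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier
from sklearn.tree import DecisionTreeClassifier

X, y = load_iris(return_X_y=True)

# Hard voting: each base classifier casts one vote per sample,
# and the majority class label wins
ensemble = VotingClassifier(
    estimators=[('lr', LogisticRegression(max_iter=1000)),
                ('dt', DecisionTreeClassifier(max_depth=2, random_state=0)),
                ('knn', KNeighborsClassifier())],
    voting='hard')
scores = cross_val_score(ensemble, X, y, cv=5)
```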

code/_convenience_scripts/blank_tocs/ch08.toc

Lines changed: 7 additions & 0 deletions

Obtaining the IMDb movie review dataset
Introducing the bag-of-words model
Transforming words into feature vectors
Assessing word relevancy via term frequency-inverse document frequency
Cleaning text data
Processing documents into tokens
Training a logistic regression model for document classification
Working with bigger data – online algorithms and out-of-core learning
Summary
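The bag-of-words and tf-idf headings above can be sketched with scikit-learn's text feature extraction; the three toy documents are in the style of the chapter's example:

```python
import numpy as np
from sklearn.feature_extraction.text import CountVectorizer, TfidfTransformer

docs = ['The sun is shining',
        'The weather is sweet',
        'The sun is shining, the weather is sweet']

# Raw term counts: the bag-of-words model (6-word vocabulary here)
count = CountVectorizer()
bag = count.fit_transform(docs)

# Down-weight terms that appear in many documents, l2-normalized per document
tfidf = TfidfTransformer(use_idf=True, norm='l2', smooth_idf=True)
weights = tfidf.fit_transform(bag).toarray()
```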

code/_convenience_scripts/blank_tocs/ch09.toc

Lines changed: 6 additions & 0 deletions

Serializing fitted scikit-learn estimators
Setting up a SQLite database for data storage
Developing a web application with Flask
Our first Flask web application
Form validation and rendering
Turning the movie classifier into a web application
Deploying the web application to a public server
Updating the movie review classifier
Summary
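For "Serializing fitted scikit-learn estimators", a minimal pickle round-trip; the estimator, dataset, and file path are illustrative stand-ins for the book's movie-review classifier, which is pickled the same way:

```python
import os
import pickle
import tempfile

from sklearn.datasets import load_iris
from sklearn.linear_model import SGDClassifier

X, y = load_iris(return_X_y=True)
clf = SGDClassifier(random_state=1).fit(X, y)

# Dump the fitted estimator to disk ...
path = os.path.join(tempfile.mkdtemp(), 'classifier.pkl')
with open(path, 'wb') as f:
    pickle.dump(clf, f, protocol=pickle.HIGHEST_PROTOCOL)

# ... and restore it later, e.g. when the web application starts up,
# so the model does not have to be retrained on every request
with open(path, 'rb') as f:
    restored = pickle.load(f)
```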

code/_convenience_scripts/blank_tocs/ch10.toc

Lines changed: 12 additions & 0 deletions

Introducing a simple linear regression model
Exploring the Housing Dataset
Visualizing the important characteristics of a dataset
Implementing an ordinary least squares linear regression model
Solving regression for regression parameters with gradient descent
Estimating the coefficient of a regression model via scikit-learn
Fitting a robust regression model using RANSAC
Evaluating the performance of linear regression models
Using regularized methods for regression
Turning a linear regression model into a curve – polynomial regression
Modeling nonlinear relationships in the Housing Dataset
Dealing with nonlinear relationships using random forests
Decision tree regression
Random forest regression
Summary
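At their core, the regression headings above come down to estimating a slope and an intercept; a sketch with scikit-learn's `LinearRegression` on synthetic data (the true coefficients 2 and 1 are chosen for the example):

```python
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.RandomState(0)
# Synthetic data: y = 2x + 1 plus a little Gaussian noise
X = rng.rand(100, 1) * 10
y = 2.0 * X.ravel() + 1.0 + rng.randn(100) * 0.1

# Ordinary least squares recovers the slope and intercept
lr = LinearRegression().fit(X, y)
slope, intercept = lr.coef_[0], lr.intercept_
```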
Lines changed: 11 additions & 0 deletions

Grouping objects by similarity using k-means
K-means++
Hard versus soft clustering
Using the elbow method to find the optimal number of clusters
Quantifying the quality of clustering via silhouette plots
Organizing clusters as a hierarchical tree
Performing hierarchical clustering on a distance matrix
Attaching dendrograms to a heat map
Applying agglomerative clustering via scikit-learn
Locating regions of high density via DBSCAN
Summary
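A minimal k-means sketch matching the first headings above, with `k-means++` seeding on synthetic blobs (the dataset parameters are illustrative):

```python
from sklearn.cluster import KMeans
from sklearn.datasets import make_blobs

# Three well-separated Gaussian blobs
X, _ = make_blobs(n_samples=150, centers=3, cluster_std=0.5, random_state=0)

# k-means++ spreads the initial centroids far apart, which typically
# gives faster convergence and lower within-cluster SSE than random init
km = KMeans(n_clusters=3, init='k-means++', n_init=10, random_state=0)
labels = km.fit_predict(X)
```

`km.inertia_` (the within-cluster sum of squared distances) is the quantity the elbow method plots against the number of clusters.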

code/_convenience_scripts/blank_tocs/ch12.toc

Lines changed: 9 additions & 0 deletions

Modeling complex functions with artificial neural networks
Single-layer neural network recap
Introducing the multi-layer neural network architecture
Activating a neural network via forward propagation
Classifying handwritten digits
Obtaining the MNIST dataset
Implementing a multi-layer perceptron
Training an artificial neural network
Computing the logistic cost function
Training neural networks via backpropagation
Developing your intuition for backpropagation
Debugging neural networks with gradient checking
Convergence in neural networks
Other neural network architectures
Convolutional Neural Networks
Recurrent Neural Networks
A few last words about neural network implementation
Summary
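The heading "Activating a neural network via forward propagation" can be sketched in plain NumPy; the layer sizes and weight scale below are arbitrary illustration values:

```python
import numpy as np

def sigmoid(z):
    # Logistic activation: maps any net input into (0, 1)
    return 1.0 / (1.0 + np.exp(-z))

rng = np.random.RandomState(1)

# Tiny MLP: 4 input units -> 5 hidden units -> 3 output units
W1, b1 = rng.randn(5, 4) * 0.1, np.zeros(5)
W2, b2 = rng.randn(3, 5) * 0.1, np.zeros(3)

def forward(x):
    a_hidden = sigmoid(W1 @ x + b1)      # hidden-layer activation
    return sigmoid(W2 @ a_hidden + b2)   # output-layer activation

out = forward(rng.randn(4))
```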
