Week 03
To compile an OpenCV 2.4.4 C++11 program in main.cpp with g++ version 4.7.2, use the following command on linprog4.cs.fsu.edu:

g++47 -o main.exe main.cpp -std=c++11 -O3 -Wall -Wextra -Werror -I. -I/usr/local/include/ -I/usr/local/include/boost_1.53.0/ -L/usr/local/lib64/ -lopencv_calib3d -lopencv_contrib -lopencv_core -lopencv_features2d -lopencv_flann -lopencv_gpu -lopencv_highgui -lopencv_imgproc -lopencv_legacy -lopencv_ml -lopencv_nonfree -lopencv_objdetect -lopencv_photo -lopencv_stitching -lopencv_ts -lopencv_video -lopencv_videostab -Wl,-rpath,/usr/local/lib64/ && ./main.exe
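For concreteness, here is a minimal sketch of a main.cpp that the command above would build and run; the matrix contents are arbitrary, chosen only for illustration:

```cpp
// Minimal OpenCV 2.4 / C++11 sketch; builds with the command above.
#include <opencv2/opencv.hpp>
#include <iostream>

int main()
{
    // Comma initializer: a 2x2 double matrix.
    cv::Mat_<double> A = (cv::Mat_<double>(2, 2) << 1.0, 2.0,
                                                    3.0, 4.0);
    std::cout << "A =" << std::endl << A << std::endl;
    return 0;
}
```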
Mat_ ... the template class you will usually use. With cv::Mat_<double>, you don't have to specify the element type via the CV_64F OpenCV macro.
Mat ... the class from which cv::Mat_<T> derives.
Mat::Mat: Various Mat constructors.
Mat::zeros: Returns a zero array of the specified size and type.
Mat::ones: Returns an array of all 1’s of the specified size and type.
Mat::eye: Returns an identity matrix of the specified size and type.
Mat::diag: Extracts a diagonal from a matrix, or creates a diagonal matrix.
Mat::create: Allocates new array data if needed.
Mat::dot: Computes a dot-product of two vectors.
Mat::t: Transposes a matrix.
Mat::inv: Inverts a matrix.
Mat::inv( DECOMP_SVD ): If the matrix is singular or even non-square, the pseudo-inverse is computed.
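A short sketch exercising the Mat routines above (the values are arbitrary, not from the course materials):

```cpp
#include <opencv2/opencv.hpp>
#include <iostream>

int main()
{
    // With cv::Mat_<double> the element type is implied; no CV_64F needed.
    cv::Mat_<double> A = (cv::Mat_<double>(2, 3) << 1, 2, 3,
                                                    4, 5, 6);

    cv::Mat Z = cv::Mat::zeros(2, 3, CV_64F);  // 2x3 zero matrix
    cv::Mat I = cv::Mat::eye(3, 3, CV_64F);    // 3x3 identity
    cv::Mat D = cv::Mat::diag(A.row(0).t());   // diagonal matrix from a vector

    cv::Mat At = A.t();                        // 3x2 transpose

    // A is non-square, so inv(DECOMP_SVD) computes the pseudo-inverse.
    cv::Mat Apinv = A.inv(cv::DECOMP_SVD);     // 3x2

    // Dot product of two row vectors: 1*3 + 0*1 + 2*1 = 5.
    cv::Mat_<double> u = (cv::Mat_<double>(1, 3) << 1, 0, 2);
    cv::Mat_<double> v = (cv::Mat_<double>(1, 3) << 3, 1, 1);
    std::cout << u.dot(v) << std::endl << Apinv << std::endl;
    return 0;
}
```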
determinant: Returns the determinant of a square floating-point matrix.
trace: Returns the trace of a matrix.
norm: Calculates an absolute array norm, an absolute difference norm, or a relative difference norm.
normalize: Normalizes the norm or value range of an array.
mean: Calculates an average (mean) of array elements.
calcCovarMatrix: Calculates the covariance matrix of a set of vectors.
SVD: Class for computing Singular Value Decomposition of a floating-point matrix. The Singular Value Decomposition is used to solve least-square problems, under-determined linear systems, invert matrices, compute condition numbers, and so on.
SVD::compute: Performs SVD of a matrix.
sqrt: Calculates a square root of array elements.
min: Calculates per-element minimum of two arrays or an array and a scalar.
max: Calculates per-element maximum of two arrays or an array and a scalar.
sum: Calculates the sum of array elements.
reduce: Reduces a matrix to a vector.
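And a sketch of the statistics and reduction routines above, applied to a small set of 2-D samples stored as matrix rows (sample values invented for illustration):

```cpp
#include <opencv2/opencv.hpp>
#include <iostream>

int main()
{
    // Four 2-D samples, one per row.
    cv::Mat_<double> X = (cv::Mat_<double>(4, 2) << 1, 2,
                                                    2, 3,
                                                    3, 3,
                                                    4, 5);

    // Column means: collapse the rows (dim = 0) with CV_REDUCE_AVG.
    cv::Mat mu;
    cv::reduce(X, mu, 0, CV_REDUCE_AVG);

    // Covariance of the row samples, scaled by the number of samples.
    cv::Mat covar, mean;
    cv::calcCovarMatrix(X, covar, mean,
                        CV_COVAR_NORMAL | CV_COVAR_ROWS | CV_COVAR_SCALE);

    // SVD: X = U * diag(w) * Vt; w holds the singular values.
    cv::Mat w, u, vt;
    cv::SVD::compute(X, w, u, vt);

    std::cout << "mean: "            << mu                    << std::endl;
    std::cout << "covar: "           << covar                 << std::endl;
    std::cout << "singular values: " << w.t()                 << std::endl;
    std::cout << "trace(covar): "    << cv::trace(covar)[0]   << std::endl;
    std::cout << "L2 norm of X: "    << cv::norm(X)           << std::endl;
    std::cout << "sum of X: "        << cv::sum(X)[0]         << std::endl;
    return 0;
}
```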
CvMLData: Class for loading data from a .csv file.
CvMLData::read_csv: Reads the data set from a .csv-like file and stores all read values in a matrix.
CvMLData::set_response_idx: Specifies index of response column in the data matrix.
CvMLData::change_var_idx: Enables or disables a particular variable in the loaded data.
CvMLData::get_class_labels_map: Returns a map that converts strings to labels.
CvTrainTestSplit: Structure setting the split of a data set read by CvMLData.
CvMLData::set_train_test_split: Divides the read data set into two disjoint training and test subsets.
CvMLData::get_train_sample_idx: Returns the matrix of sample indices for a training subset.
CvMLData::get_test_sample_idx: Returns the matrix of sample indices for a testing subset.
CvMLData::get_var_idx: Returns the indices of the active variables in the data matrix.
CvMLData::get_values: Returns a pointer to the matrix of predictors and response values.
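A sketch of the CvMLData workflow described above, assuming a hypothetical data.csv whose column 0 holds the response:

```cpp
#include <opencv2/opencv.hpp>
#include <iostream>

int main()
{
    CvMLData data;
    if (data.read_csv("data.csv") != 0) {  // "data.csv" is hypothetical
        std::cerr << "could not read data.csv" << std::endl;
        return 1;
    }
    data.set_response_idx(0);              // column 0 is the response

    // Shuffle, then use 2/3 of the samples for training.
    CvTrainTestSplit split(0.67f, true);
    data.set_train_test_split(&split);

    const CvMat* values    = data.get_values();            // predictors + response
    const CvMat* train_idx = data.get_train_sample_idx();  // training row indices
    const CvMat* test_idx  = data.get_test_sample_idx();   // test row indices

    std::cout << values->rows << " samples, " << values->cols << " columns; "
              << train_idx->cols << " train / " << test_idx->cols << " test"
              << std::endl;
    return 0;
}
```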
CvKNearest: Class implementing the K-Nearest Neighbors model.
CvKNearest::CvKNearest: Default and training constructors.
CvKNearest::train: Trains the model.
CvKNearest::find_nearest: Finds the neighbors and predicts responses for input vectors.
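Finally, a CvKNearest sketch on a toy two-class problem (the training data are invented for illustration):

```cpp
#include <opencv2/opencv.hpp>
#include <iostream>

int main()
{
    // Four 2-D training samples with binary class labels.
    cv::Mat_<float> train  = (cv::Mat_<float>(4, 2) << 1, 1,
                                                       1, 2,
                                                       8, 8,
                                                       9, 8);
    cv::Mat_<float> labels = (cv::Mat_<float>(4, 1) << 0, 0, 1, 1);

    // Training constructor; isRegression = false gives classification.
    CvKNearest knn(train, labels, cv::Mat(), false, /* max_k = */ 3);

    // Classify one query point using its k = 3 nearest neighbors.
    cv::Mat_<float> query = (cv::Mat_<float>(1, 2) << 8, 9);
    float response = knn.find_nearest(query, 3);

    std::cout << "predicted class: " << response << std::endl;  // expect 1
    return 0;
}
```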
| Order | Topic | Book | Section | Pages | Length | Presentations |
| 1. | Multivariate Linear Regression | AIMA | Section 18.6.2 | 720–723 | 4 pages | Learning from Observations and Machine Learning |
| 2. | Linear Classification with Logistic Regression | AIMA | Section 18.6.4 | 725–727 | 3 pages | Learning from Observations and Machine Learning |
| 3. | Artificial Neural Networks | AIMA | Section 18.7 | 727–737 | 11 pages | Neural Networks and Machine Learning |
| 4. | Support Vector Machines | AIMA | Section 18.9 | 744–748 | 5 pages | Learning from Observations and Machine Learning |
| 5. | Intro to Learning Probabilistic Models | AIMA | Chapter 20 Intro | 802 | 1 page | Statistical Learning and Machine Learning |
| 6. | Statistical Learning | AIMA | Section 20.1 | 802–805 | 4 pages | Statistical Learning and Machine Learning |
| 7. | Intro to Learning with Complete Data | AIMA | Section 20.2 Intro | 806 | 1 page | Statistical Learning and Machine Learning |
| 8. | Maximum-Likelihood Parameter Learning: Discrete Models | AIMA | Section 20.2.1 | 806–808 | 3 pages | Statistical Learning and Machine Learning |
| 9. | Naive Bayes Models | AIMA | Section 20.2.2 | 808–809 | 2 pages | Statistical Learning and Machine Learning |
| 10. | Maximum-Likelihood Parameter Learning: Continuous Models | AIMA | Section 20.2.3 | 809–810 | 2 pages | Statistical Learning and Machine Learning |
| 11. | Bayesian Parameter Learning | AIMA | Section 20.2.4 | 810–813 | 4 pages | Statistical Learning and Machine Learning |
| 12. | Learning Bayes Net Structures | AIMA | Section 20.2.5 | 813–814 | 2 pages | Statistical Learning and Machine Learning |
| 13. | Intro to Nonparametric Models | AIMA | Section 18.8 Intro | 737 | 1 page | Learning from Observations and Machine Learning |
| 14. | Nearest Neighbor Models | AIMA | Section 18.8.1 | 738–739 | 2 pages | Learning from Observations and Machine Learning |
| 15. | Finding Nearest Neighbors with k-d Trees | AIMA | Section 18.8.2 | 739–740 | 2 pages | Learning from Observations and Machine Learning |
| 16. | Density Estimation with Nonparametric Models | AIMA | Section 20.2.6 | 814–816 | 3 pages | Statistical Learning and Machine Learning |