From 94b4c48cb69da55fba63e650fe4c4ba076e7d785 Mon Sep 17 00:00:00 2001
From: marc <78002988+novak-99@users.noreply.github.com>
Date: Thu, 2 Dec 2021 22:47:44 -0800
Subject: [PATCH 1/4] unnecessary info

---
 README.md | 10 ----------
 1 file changed, 10 deletions(-)

diff --git a/README.md b/README.md
index 817e826..9ca2f65 100644
--- a/README.md
+++ b/README.md
@@ -146,16 +146,6 @@ The result will be the model's predictions for the entire dataset.
         - Positive Definiteness Checker
     3. QR Decomposition
 13. ***Numerical Analysis***
-    1. Numerical Diffrentiation
-        - Univariate Functions
-        - Multivariate Functions
-    2. Jacobian Vector Calculator
-    3. Hessian Matrix Calculator
-    4. Function approximator
-        - Constant Approximation
-        - Linear Approximation
-        - Quadratic Approximation
-    5. Newton-Raphson Method
 14. ***Linear Algebra Module***
 15. ***Statistics Module***
 16. ***Data Processing Module***

From 579973842ad771033b3ac9e63744a859e24189b3 Mon Sep 17 00:00:00 2001
From: marc <78002988+novak-99@users.noreply.github.com>
Date: Thu, 2 Dec 2021 22:55:31 -0800
Subject: [PATCH 2/4] Update README.md

---
 README.md | 4 ++--
 1 file changed, 2 insertions(+), 2 deletions(-)

diff --git a/README.md b/README.md
index 9ca2f65..6679b71 100644
--- a/README.md
+++ b/README.md
@@ -1,6 +1,6 @@
 # ML++
 
-Machine learning is a vast and exiciting discipline, garnering attention from specialists of many fields. Unfortunately, for C++ programmers and enthusiasts, there appears to be a lack of support in the field of machine learning. To fill that void and give C++ a true foothold in the ML sphere, this library was written. My intent with this library is for it to act as a crossroad between low-level developers and machine learning engineers.
+Machine learning is a vast and exiciting discipline, garnering attention from specialists of many fields. Unfortunately, for C++ programmers and enthusiasts, there appears to be a lack of support in the field of machine learning. To fill that void and give C++ a true foothold in the ML sphere, this library was written. The intent with this library is for it to act as a crossroad between low-level developers and machine learning engineers.
Date: Thu, 2 Dec 2021 22:55:53 -0800
Subject: [PATCH 3/4] Update README.md

---
 README.md | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/README.md b/README.md
index 6679b71..5bd190d 100644
--- a/README.md
+++ b/README.md
@@ -175,4 +175,4 @@ ML++, like most frameworks, is dynamic, and constantly changing. This is especia
 
 ## Citations
 
-Various different materials helped me along the way of creating ML++, and I would like to give credit to them here. [This](https://www.tutorialspoint.com/cplusplus-program-to-compute-determinant-of-a-matrix) article by TutorialsPoint was a big help when trying to implement the determinant of a matrix, and [this](https://www.geeksforgeeks.org/adjoint-inverse-matrix/) article by GeeksForGeeks was very helpful when trying to take the adjoint and inverse of a matrix. Lastly, I would like to thank [this](https://towardsdatascience.com/svm-implementation-from-scratch-python-2db2fc52e5c2) article by Towards Data Science which helped illustrate a practical definition of the Hinge Loss activation function and its gradient when optimizing with SGD.
+Various different materials helped me along the way of creating ML++, and I would like to give credit to them here. [This](https://www.tutorialspoint.com/cplusplus-program-to-compute-determinant-of-a-matrix) article by TutorialsPoint was a big help when trying to implement the determinant of a matrix, and [this](https://www.geeksforgeeks.org/adjoint-inverse-matrix/) article by GeeksForGeeks was very helpful when trying to take the adjoint and inverse of a matrix. Lastly, I would like to thank [this](https://towardsdatascience.com/svm-implementation-from-scratch-python-2db2fc52e5c2) article by Towards Data Science which helped illustrate a practical definition of the Hinge Loss function and its gradient when optimizing with SGD.

From 71f75773aa32ccc76f2d1f8e920e8fdaf1f1c229 Mon Sep 17 00:00:00 2001
From: marc <78002988+novak-99@users.noreply.github.com>
Date: Fri, 3 Dec 2021 14:55:58 -0800
Subject: [PATCH 4/4] Update README.md

---
 README.md | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/README.md b/README.md
index 5bd190d..cb9684a 100644
--- a/README.md
+++ b/README.md
@@ -175,4 +175,4 @@ ML++, like most frameworks, is dynamic, and constantly changing. This is especia
 
 ## Citations
 
-Various different materials helped me along the way of creating ML++, and I would like to give credit to them here. [This](https://www.tutorialspoint.com/cplusplus-program-to-compute-determinant-of-a-matrix) article by TutorialsPoint was a big help when trying to implement the determinant of a matrix, and [this](https://www.geeksforgeeks.org/adjoint-inverse-matrix/) article by GeeksForGeeks was very helpful when trying to take the adjoint and inverse of a matrix. Lastly, I would like to thank [this](https://towardsdatascience.com/svm-implementation-from-scratch-python-2db2fc52e5c2) article by Towards Data Science which helped illustrate a practical definition of the Hinge Loss function and its gradient when optimizing with SGD.
+Various different materials helped me along the way of creating ML++, and I would like to give credit to several of them here. [This](https://www.tutorialspoint.com/cplusplus-program-to-compute-determinant-of-a-matrix) article by TutorialsPoint was a big help when trying to implement the determinant of a matrix, and [this](https://www.geeksforgeeks.org/adjoint-inverse-matrix/) article by GeeksForGeeks was very helpful when trying to take the adjoint and inverse of a matrix. Lastly, I would like to thank [this](https://towardsdatascience.com/svm-implementation-from-scratch-python-2db2fc52e5c2) article by Towards Data Science which helped illustrate a practical definition of the Hinge Loss function and its gradient when optimizing with SGD.
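As an aside on the Citations hunks above: the Towards Data Science article referenced there works with the standard soft-margin hinge loss, max(0, 1 - y(w·x - b)) + λ‖w‖², and its SGD subgradient. Below is a minimal C++ sketch of that textbook formulation; the names `hingeLoss`, `sgdStep`, `lambda`, and `lr` are assumptions made for illustration and are not identifiers taken from the ML++ source.

```cpp
#include <algorithm>
#include <cstddef>
#include <numeric>
#include <vector>

// Sketch of the soft-margin hinge loss for one sample (x, y), y in {-1, +1}.
// lambda is an assumed L2 regularization weight, not an ML++ parameter name.
double hingeLoss(const std::vector<double>& w, double b, double lambda,
                 const std::vector<double>& x, double y) {
    double wx  = std::inner_product(w.begin(), w.end(), x.begin(), 0.0);
    double reg = lambda * std::inner_product(w.begin(), w.end(), w.begin(), 0.0);
    return std::max(0.0, 1.0 - y * (wx - b)) + reg;
}

// One SGD step using the subgradient: if the margin y * (w.x - b) >= 1, only
// the regularizer term 2 * lambda * w contributes; otherwise the subgradient
// also pulls w toward y * x and nudges the bias.
void sgdStep(std::vector<double>& w, double& b, double lambda,
             const std::vector<double>& x, double y, double lr) {
    double wx = std::inner_product(w.begin(), w.end(), x.begin(), 0.0);
    bool satisfied = y * (wx - b) >= 1.0;
    for (std::size_t i = 0; i < w.size(); ++i) {
        double grad = 2.0 * lambda * w[i] - (satisfied ? 0.0 : y * x[i]);
        w[i] -= lr * grad;
    }
    b -= lr * (satisfied ? 0.0 : y);
}
```

In this formulation the update collapses to the regularizer term whenever the margin constraint is already satisfied, which matches the practical treatment of the gradient described in the cited article.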