Merge branch 'main' of https://github.com/novak-99/MLPP into main

novak_99 2021-12-15 12:59:59 -08:00
commit 4b470fa962


@@ -1,6 +1,6 @@
# ML++
Machine learning is a vast and exciting discipline, garnering attention from specialists in many fields. Unfortunately, for C++ programmers and enthusiasts, there appears to be a lack of support in the field of machine learning. To fill that void and give C++ a true foothold in the ML sphere, this library was written. My intent with this library is for it to act as a crossroads between low-level developers and machine learning engineers.
Machine learning is a vast and exciting discipline, garnering attention from specialists in many fields. Unfortunately, for C++ programmers and enthusiasts, there appears to be a lack of support in the field of machine learning. To fill that void and give C++ a true foothold in the ML sphere, this library was written. The intent with this library is for it to act as a crossroads between low-level developers and machine learning engineers.
<p align="center">
<img src="https://user-images.githubusercontent.com/78002988/119920911-f3338d00-bf21-11eb-89b3-c84bf7c9f4ac.gif"
@@ -116,7 +116,7 @@ The result will be the model's predictions for the entire dataset.
2. Autoencoder
3. Softmax Network
4. ***Natural Language Processing***
1. Word2Vec (Continuous Bag of Words, Skip-N Gram)
1. Word2Vec (Continuous Bag of Words, Skip-Gram)
2. Stemming
3. Bag of Words
4. TFIDF
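To make the TFIDF entry above concrete: a term's tf-idf weight is its frequency within a document, discounted by how common the term is across the corpus. The following is a minimal, self-contained C++ sketch of that computation; it is not ML++'s actual API, and the helper names are illustrative.

```cpp
#include <cmath>
#include <iostream>
#include <string>
#include <vector>

// Illustrative helper (not ML++'s API): fraction of words in `doc` equal to `term`.
double termFrequency(const std::string& term, const std::vector<std::string>& doc) {
    double count = 0;
    for (const auto& word : doc) {
        if (word == term) { count++; }
    }
    return count / doc.size();
}

// Inverse document frequency: log(N / number of documents containing `term`).
double inverseDocumentFrequency(const std::string& term,
                                const std::vector<std::vector<std::string>>& corpus) {
    double containing = 0;
    for (const auto& doc : corpus) {
        for (const auto& word : doc) {
            if (word == term) { containing++; break; }
        }
    }
    // Smooth the denominator by 1 to avoid division by zero for unseen terms.
    return std::log(corpus.size() / (1.0 + containing));
}

int main() {
    std::vector<std::vector<std::string>> corpus = {
        {"the", "cat", "sat"},
        {"the", "dog", "ran"},
        {"cats", "and", "dogs"}
    };
    double tfidf = termFrequency("cat", corpus[0]) *
                   inverseDocumentFrequency("cat", corpus);
    std::cout << "TF-IDF of 'cat' in doc 0: " << tfidf << std::endl;
    return 0;
}
```

Note that the add-one smoothing in the IDF denominator is just one common convention; libraries differ on the exact variant they implement.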
@@ -146,16 +146,6 @@ The result will be the model's predictions for the entire dataset.
- Positive Definiteness Checker
3. QR Decomposition
13. ***Numerical Analysis***
1. Numerical Differentiation
- Univariate Functions
- Multivariate Functions
2. Jacobian Vector Calculator
3. Hessian Matrix Calculator
4. Function Approximator
- Constant Approximation
- Linear Approximation
- Quadratic Approximation
5. Newton-Raphson Method
14. ***Linear Algebra Module***
15. ***Statistics Module***
16. ***Data Processing Module***
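Two of the Numerical Analysis features listed in the hunk above, numerical differentiation and the Newton-Raphson method, combine naturally: a central-difference derivative can stand in for an analytic f'(x) inside the Newton iteration x ← x − f(x)/f'(x). Below is a minimal standalone C++ sketch under that assumption; the function names are illustrative and do not reflect ML++'s actual interface.

```cpp
#include <cmath>
#include <functional>
#include <iostream>

// Central-difference approximation of f'(x); h trades truncation error
// against floating-point rounding error.
double numDiff(const std::function<double(double)>& f, double x, double h = 1e-6) {
    return (f(x + h) - f(x - h)) / (2.0 * h);
}

// Newton-Raphson root finding: iterate x <- x - f(x)/f'(x) until |f(x)| is small.
double newtonRaphson(const std::function<double(double)>& f, double x0,
                     double tol = 1e-10, int maxIter = 100) {
    double x = x0;
    for (int i = 0; i < maxIter; i++) {
        double fx = f(x);
        if (std::fabs(fx) < tol) { break; }
        x -= fx / numDiff(f, x);
    }
    return x;
}

int main() {
    // The positive root of f(x) = x^2 - 2 is sqrt(2) ~ 1.41421.
    auto f = [](double x) { return x * x - 2.0; };
    std::cout << newtonRaphson(f, 1.0) << std::endl;
    return 0;
}
```

Central differences are second-order accurate in h, so a step near 1e-6 is a reasonable default for double precision.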
@@ -185,4 +175,4 @@ ML++, like most frameworks, is dynamic, and constantly changing. This is especia
</p>
## Citations
Various materials helped me along the way of creating ML++, and I would like to give credit to them here. [This](https://www.tutorialspoint.com/cplusplus-program-to-compute-determinant-of-a-matrix) article by TutorialsPoint was a big help when trying to implement the determinant of a matrix, and [this](https://www.geeksforgeeks.org/adjoint-inverse-matrix/) article by GeeksForGeeks was very helpful when trying to take the adjoint and inverse of a matrix. Lastly, I would like to thank [this](https://towardsdatascience.com/svm-implementation-from-scratch-python-2db2fc52e5c2) article by Towards Data Science which helped illustrate a practical definition of the Hinge Loss activation function and its gradient when optimizing with SGD.
Various materials helped me along the way of creating ML++, and I would like to give credit to several of them here. [This](https://www.tutorialspoint.com/cplusplus-program-to-compute-determinant-of-a-matrix) article by TutorialsPoint was a big help when trying to implement the determinant of a matrix, and [this](https://www.geeksforgeeks.org/adjoint-inverse-matrix/) article by GeeksForGeeks was very helpful when trying to take the adjoint and inverse of a matrix. Lastly, I would like to thank [this](https://towardsdatascience.com/svm-implementation-from-scratch-python-2db2fc52e5c2) article by Towards Data Science which helped illustrate a practical definition of the Hinge Loss function and its gradient when optimizing with SGD.
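For reference, the hinge loss discussed in that last citation is, for a sample (x_i, y_i) with labels y_i ∈ {−1, +1} and a linear score w^T x_i + b (this is the standard soft-margin SVM form, with the regularization term omitted):

```latex
% Hinge loss for one sample; it is zero once the sample clears the margin.
L_i(w, b) = \max\left(0,\ 1 - y_i \left(w^\top x_i + b\right)\right)

% Subgradient with respect to w, as used in an SGD update step:
\frac{\partial L_i}{\partial w} =
\begin{cases}
-y_i x_i & \text{if } y_i \left(w^\top x_i + b\right) < 1 \\
0 & \text{otherwise}
\end{cases}
```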