Update README.md

marc 2021-09-24 17:02:38 -07:00 committed by GitHub
parent 84532086fc
commit cafec0fb22


@@ -155,9 +155,15 @@ The result will be the model's predictions for the entire dataset.
## What's in the Works?
ML++, like most frameworks, is dynamic and constantly changing! This is especially important in the world of ML, as new algorithms and techniques are being developed day by day. Here are a couple of things currently being developed for ML++:
1. Convolutional Neural Networks
2. Kernels for SVMs
3. Support Vector Regression
## Citations
Various materials helped me along the way of creating ML++, and I would like to give credit to them here. [This](https://www.tutorialspoint.com/cplusplus-program-to-compute-determinant-of-a-matrix) article by TutorialsPoint was a big help when implementing the determinant of a matrix, and [this](https://www.geeksforgeeks.org/adjoint-inverse-matrix/) website by GeeksForGeeks was very helpful when taking the adjoint and inverse of a matrix. Lastly, I would like to thank [this](https://towardsdatascience.com/svm-implementation-from-scratch-python-2db2fc52e5c2) article from Towards Data Science, which helped illustrate a practical definition of the hinge loss function and its gradient.