Mirror of https://github.com/Relintai/MLPP.git, synced 2025-02-10 16:10:06 +01:00
Merge branch 'main' of https://github.com/novak-99/MLPP into main
This commit is contained in commit 9489ff258b.

README.md (73 changed lines)
Machine learning is a vast and exciting discipline that attracts specialists from many fields. Unfortunately, C++ programmers and enthusiasts have had little machine learning support available for this magnificent language. This library was created to fill that void, give C++ a true foothold in the ML sphere, and act as a crossroads between low-level developers and machine learning engineers.
<p align="center">
  <img src="https://user-images.githubusercontent.com/78002988/119920911-f3338d00-bf21-11eb-89b3-c84bf7c9f4ac.gif" width="600" height="400">
</p>
## Contents of the Library
1. ***Regression*** (gradient-descent sketch after this list)
    1. Linear Regression
    2. Logistic Regression
    3. Softmax Regression
    4. Exponential Regression
    5. Probit Regression
    6. CLogLog Regression
2. ***Deep, Dynamically Sized Neural Networks***
    1. Possible Activation Functions (formula sketch after this list)
        - Linear
        - Sigmoid
        - Swish
        - Softplus
        - CLogLog
        - Gaussian CDF
        - GELU
        - Unit Step
        - Sinh
        - Cosh
        - Tanh
        - Csch
        - Sech
        - Coth
        - Arsinh
        - Arcosh
        - Artanh
        - Arcsch
        - Arsech
        - Arcoth
    2. Possible Loss Functions (formula sketch after this list)
        - MSE
        - RMSE
        - MAE
        - MBE
        - Log Loss
        - Cross Entropy
        - Hinge Loss
    3. Possible Regularization Methods
        - Lasso
        - Ridge
        - ElasticNet
    4. Possible Weight Initialization Methods (sketch after this list)
        - Uniform
        - Xavier Normal
        - Xavier Uniform
        - He Normal
        - He Uniform
3. ***Prebuilt Neural Networks***
    1. Multilayer Perceptron
    2. Autoencoder
    3. Softmax Network
4. ***Natural Language Processing***
    1. Word2Vec (Continuous Bag of Words, Skip-N Gram)
    2. Stemming
    3. Bag of Words
    4. TFIDF (sketch after this list)
    5. Tokenization
    6. Auxiliary Text Processing Functions
5. ***Computer Vision***
    1. The Convolution Operation (Sobel sketch after this list)
    2. Max, Min, Average Pooling
    3. Global Max, Min, Average Pooling
    4. Prebuilt Feature Detectors
        - Horizontal/Vertical Prewitt Filter
        - Horizontal/Vertical Sobel Filter
        - Horizontal/Vertical Scharr Filter
        - Horizontal/Vertical Roberts Filter
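
The regression models listed above are trained with standard iterative optimizers. As a rough, library-independent illustration, and not MLPP's actual API, the sketch below fits a one-variable linear regression y ≈ wx + b by minimizing MSE with batch gradient descent; the toy dataset, learning rate, and epoch count are all made up for the example.

```cpp
#include <cstddef>
#include <iostream>
#include <vector>

// Standalone sketch: fit y ≈ w*x + b with batch gradient descent on MSE.
// Illustrative only; it does not use MLPP's classes, and the data are made up.
int main() {
    std::vector<double> x = {1, 2, 3, 4, 5};
    std::vector<double> y = {2.1, 4.0, 6.2, 7.9, 10.1};  // roughly y = 2x
    double w = 0.0, b = 0.0;
    const double lr = 0.01;   // learning rate (arbitrary choice for the sketch)
    const int epochs = 1000;
    const double n = static_cast<double>(x.size());

    for (int epoch = 0; epoch < epochs; ++epoch) {
        double dw = 0.0, db = 0.0;
        for (std::size_t i = 0; i < x.size(); ++i) {
            double err = (w * x[i] + b) - y[i];  // prediction error
            dw += 2.0 * err * x[i] / n;          // d(MSE)/dw
            db += 2.0 * err / n;                 // d(MSE)/db
        }
        w -= lr * dw;
        b -= lr * db;
    }
    std::cout << "w = " << w << ", b = " << b << "\n";  // expect w near 2, b near 0
}
```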
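
A few of the activation functions named above, written as plain standalone functions so the underlying formulas are visible. These are generic textbook definitions rather than MLPP's implementations, and the GELU uses the common tanh approximation.

```cpp
#include <cmath>

// Generic definitions of a handful of activations from the list above (not MLPP's code).
const double kPi = 3.14159265358979323846;

double sigmoid(double z)  { return 1.0 / (1.0 + std::exp(-z)); }
double swish(double z)    { return z * sigmoid(z); }               // x * sigmoid(x)
double softplus(double z) { return std::log(1.0 + std::exp(z)); }
double cloglog(double z)  { return 1.0 - std::exp(-std::exp(z)); } // complementary log-log
double gaussianCDF(double z) { return 0.5 * (1.0 + std::erf(z / std::sqrt(2.0))); }
double gelu(double z) {   // tanh approximation of GELU
    return 0.5 * z * (1.0 + std::tanh(std::sqrt(2.0 / kPi) * (z + 0.044715 * z * z * z)));
}
double unitStep(double z) { return z >= 0.0 ? 1.0 : 0.0; }
```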
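
The loss functions above likewise reduce to short formulas. The sketch below gives generic mean-reduced definitions of four of them, again as an illustration rather than the library's API; the label conventions ({0, 1} for log loss, {-1, +1} for hinge loss) are noted in the comments.

```cpp
#include <algorithm>
#include <cmath>
#include <cstddef>
#include <vector>

// Generic per-dataset loss formulas matching the names above (illustrative, not MLPP's API).
double mse(const std::vector<double>& yHat, const std::vector<double>& y) {
    double sum = 0.0;
    for (std::size_t i = 0; i < y.size(); ++i) sum += (yHat[i] - y[i]) * (yHat[i] - y[i]);
    return sum / y.size();
}
double mae(const std::vector<double>& yHat, const std::vector<double>& y) {
    double sum = 0.0;
    for (std::size_t i = 0; i < y.size(); ++i) sum += std::fabs(yHat[i] - y[i]);
    return sum / y.size();
}
// Log loss / binary cross-entropy; y in {0, 1}, yHat in (0, 1).
double logLoss(const std::vector<double>& yHat, const std::vector<double>& y) {
    double sum = 0.0;
    for (std::size_t i = 0; i < y.size(); ++i)
        sum += -(y[i] * std::log(yHat[i]) + (1.0 - y[i]) * std::log(1.0 - yHat[i]));
    return sum / y.size();
}
// Hinge loss; y in {-1, +1}, yHat is the raw score.
double hingeLoss(const std::vector<double>& yHat, const std::vector<double>& y) {
    double sum = 0.0;
    for (std::size_t i = 0; i < y.size(); ++i) sum += std::max(0.0, 1.0 - y[i] * yHat[i]);
    return sum / y.size();
}
```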
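
The weight initialization methods above are simple draws from scaled distributions. The sketch below shows the usual Xavier (Glorot) normal and He uniform formulas; the function names and the choice of std::mt19937 are this sketch's, not MLPP's.

```cpp
#include <cmath>
#include <cstddef>
#include <random>
#include <vector>

// Generic Xavier (Glorot) and He initialization formulas (illustrative only).
std::vector<double> xavierNormal(int fanIn, int fanOut, std::mt19937& rng) {
    std::normal_distribution<double> dist(0.0, std::sqrt(2.0 / (fanIn + fanOut)));
    std::vector<double> w(static_cast<std::size_t>(fanIn) * fanOut);
    for (double& v : w) v = dist(rng);
    return w;
}

std::vector<double> heUniform(int fanIn, int fanOut, std::mt19937& rng) {
    double limit = std::sqrt(6.0 / fanIn);   // He uniform bound depends only on fan-in
    std::uniform_real_distribution<double> dist(-limit, limit);
    std::vector<double> w(static_cast<std::size_t>(fanIn) * fanOut);
    for (double& v : w) v = dist(rng);
    return w;
}

int main() {
    std::mt19937 rng(42);
    auto layer1 = xavierNormal(64, 32, rng);  // 64 inputs, 32 outputs
    auto layer2 = heUniform(32, 10, rng);     // 32 inputs, 10 outputs
}
```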
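
For the NLP items, bag of words and TF-IDF can be computed with ordinary containers. The sketch below tokenizes two made-up sentences on whitespace, builds per-document term frequencies, and weights them by inverse document frequency; it is only a schematic of the idea, not MLPP's text-processing code.

```cpp
#include <cmath>
#include <cstddef>
#include <iostream>
#include <map>
#include <set>
#include <sstream>
#include <string>
#include <vector>

// Illustrative bag-of-words / TF-IDF over whitespace-tokenized documents (not MLPP's API).
int main() {
    std::vector<std::string> docs = {"the cat sat on the mat", "the dog sat"};

    // Tokenize and count term frequencies per document.
    std::vector<std::map<std::string, double>> tf(docs.size());
    std::set<std::string> vocab;
    for (std::size_t d = 0; d < docs.size(); ++d) {
        std::istringstream stream(docs[d]);
        std::string word;
        double total = 0.0;
        while (stream >> word) { tf[d][word] += 1.0; vocab.insert(word); total += 1.0; }
        for (auto& kv : tf[d]) kv.second /= total;  // normalize to term frequency
    }

    // Inverse document frequency and the resulting TF-IDF weights.
    for (const std::string& term : vocab) {
        double docCount = 0.0;
        for (const auto& doc : tf) docCount += doc.count(term) ? 1.0 : 0.0;
        double idf = std::log(static_cast<double>(docs.size()) / docCount);
        for (std::size_t d = 0; d < docs.size(); ++d) {
            double weight = (tf[d].count(term) ? tf[d].at(term) : 0.0) * idf;
            std::cout << "doc " << d << " \"" << term << "\" " << weight << "\n";
        }
    }
}
```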
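
Finally, the convolution operation at the heart of the computer-vision list is a nested sum. The sketch below implements a "valid" 2D convolution (strictly, cross-correlation, as is usual in ML code) and applies one common form of the 3x3 horizontal Sobel kernel from the feature-detector list to a toy image; it illustrates the operation itself, not MLPP's implementation.

```cpp
#include <cstddef>
#include <iostream>
#include <vector>

// "Valid" 2D convolution (cross-correlation); the Matrix alias and function
// name are this sketch's choices, not MLPP's.
using Matrix = std::vector<std::vector<double>>;

Matrix convolve2D(const Matrix& input, const Matrix& kernel) {
    std::size_t kH = kernel.size(), kW = kernel[0].size();
    std::size_t outH = input.size() - kH + 1;
    std::size_t outW = input[0].size() - kW + 1;
    Matrix out(outH, std::vector<double>(outW, 0.0));
    for (std::size_t i = 0; i < outH; ++i)
        for (std::size_t j = 0; j < outW; ++j)
            for (std::size_t a = 0; a < kH; ++a)
                for (std::size_t b = 0; b < kW; ++b)
                    out[i][j] += input[i + a][j + b] * kernel[a][b];
    return out;
}

int main() {
    // One common convention for the 3x3 horizontal Sobel kernel.
    Matrix sobelHorizontal = {{-1, -2, -1},
                              { 0,  0,  0},
                              { 1,  2,  1}};
    Matrix image(6, std::vector<double>(6, 0.0));
    for (std::size_t i = 3; i < 6; ++i)          // bright bottom half: a horizontal edge
        for (std::size_t j = 0; j < 6; ++j) image[i][j] = 1.0;

    Matrix edges = convolve2D(image, sobelHorizontal);
    std::cout << edges[1][0] << "\n";  // large response where the intensity changes
}
```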