# ML++
Machine learning is a vast and exciting discipline, garnering attention from specialists of many fields. Unfortunately, for C++ programmers and enthusiasts, there appears to be a lack of support for this magnificent language in the field of machine learning. As a consequence, this library was created to fill that void and give C++ a true foothold in the ML sphere, acting as a crossroads between low-level developers and machine learning engineers.
<p align="center">
    <img src="https://user-images.githubusercontent.com/78002988/119920911-f3338d00-bf21-11eb-89b3-c84bf7c9f4ac.gif" width="600" height="400">
</p>
## Usage
Please note that ML++ uses the `std::vector<double>` data type for emulating vectors, and the `std::vector<std::vector<double>>` data type for emulating matrices.
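For illustration, a small dataset could be declared as follows; the values are placeholders, and the layout shown (one inner vector per feature) is an assumption rather than a documented requirement:

```cpp
#include <vector>

// Placeholder data for illustration only; verify the expected
// input layout against the model you are using.
std::vector<std::vector<double>> inputSet = {{1, 2, 3, 4, 5}};
std::vector<double> outputSet = {2, 4, 6, 8, 10};
```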
Begin by including the header file for the model of your choice.
```cpp
#include "MLPP/LinReg/LinReg.hpp"
```
Next, instantiate an object of the class. Don't forget to pass the input set and output set as parameters.
```cpp
LinReg model(inputSet, outputSet);
```
Now call the optimizer that you would like to use. For iterative optimizers such as gradient descent, pass in the learning rate, the number of epochs, and whether or not to display the UI panel.
```cpp
model.gradientDescent(0.001, 1000, 0); // learning rate, epochs, UI panel off
```
Great, you are now ready to test! To test a single instance, use the following function:
```cpp
model.modelTest(testSetInstance);
```
This will return the model's prediction for that single example.
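For instance, assuming the prediction is returned as a plain `double` (an assumption consistent with the data type conventions above), it can be captured directly:

```cpp
// Hypothetical single test instance with the same feature count as the training data.
std::vector<double> testSetInstance = {6};
double prediction = model.modelTest(testSetInstance);
```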
To test an entire dataset of instances, use the following function:
```cpp
model.modelSetTest(testSet);
```
The result will be the model's predictions for the entire dataset.
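Putting the steps together, a minimal end-to-end sketch might look like the following. The data values are placeholders, the one-inner-vector-per-feature layout is an assumption, and the return type of `modelSetTest` is inferred from the description above:

```cpp
#include "MLPP/LinReg/LinReg.hpp"

#include <iostream>
#include <vector>

int main() {
    // Placeholder training data for illustration only.
    std::vector<std::vector<double>> inputSet = {{1, 2, 3, 4, 5}};
    std::vector<double> outputSet = {2, 4, 6, 8, 10};

    // Train a linear regression model via gradient descent.
    LinReg model(inputSet, outputSet);
    model.gradientDescent(0.001, 1000, 0); // learning rate, epochs, UI panel off

    // Predict over the training inputs as a quick sanity check;
    // modelSetTest is assumed to return one prediction per instance.
    std::vector<double> predictions = model.modelSetTest(inputSet);
    for (double yHat : predictions) {
        std::cout << yHat << '\n';
    }
    return 0;
}
```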
## Contents of the Library
1. ***Regression***
    1. Linear Regression
    2. Logistic Regression
    3. Softmax Regression
    4. Exponential Regression
    5. Probit Regression
    6. CLogLog Regression
    7. Tanh Regression
2. ***Deep, Dynamically Sized Neural Networks***
    1. Possible Activation Functions
        - Linear
        - Sigmoid
        - Swish
        - Softplus
        - CLogLog
        - Gaussian CDF
        - GELU
        - Unit Step
        - Sinh
        - Cosh
        - Tanh
        - Csch
        - Sech
        - Coth
        - Arsinh
        - Arcosh
        - Artanh
        - Arcsch
        - Arsech
        - Arcoth
    2. Possible Loss Functions
        - MSE
        - RMSE
        - MAE
        - MBE
        - Log Loss
        - Cross Entropy
        - Hinge Loss
    3. Possible Regularization Methods
        - Lasso
        - Ridge
        - ElasticNet
    4. Possible Weight Initialization Methods
        - Uniform
        - Xavier Normal
        - Xavier Uniform
        - He Normal
        - He Uniform
3. ***Prebuilt Neural Networks***
    1. Multilayer Perceptron
    2. Autoencoder
    3. Softmax Network
4. ***Natural Language Processing***
    1. Word2Vec (Continuous Bag of Words, Skip-N Gram)
    2. Stemming
    3. Bag of Words
    4. TFIDF
    5. Tokenization
    6. Auxiliary Text Processing Functions
5. ***Computer Vision***
    1. The Convolution Operation
    2. Max, Min, Average Pooling
    3. Global Max, Min, Average Pooling
    4. Prebuilt Feature Detectors
        - Horizontal/Vertical Prewitt Filter
        - Horizontal/Vertical Sobel Filter
        - Horizontal/Vertical Scharr Filter
        - Horizontal/Vertical Roberts Filter
6. ***Principal Component Analysis***
7. ***Naive Bayes Classifiers***
    1. Multinomial Naive Bayes
    2. Bernoulli Naive Bayes
    3. Gaussian Naive Bayes
8. ***K-Means***
9. ***k-Nearest Neighbors***
10. ***Outlier Finder (Using z-scores)***
11. ***Linear Algebra Module***
12. ***Statistics Module***
13. ***Data Processing Module***
    1. Setting and Printing Datasets
    2. Feature Scaling
    3. Mean Normalization
    4. One Hot Representation
    5. Reverse One Hot Representation