mayankjoin3

prompt for regression

Nov 8th, 2024
I want to develop a generic multi-regression program. Each ML regression method must be implemented in its own function. Use the logging library; the log file name should be log_timestamp.log. Wrap all code in try/except blocks so that when multiple ML regression methods run, an error in one does not stop the others from executing. The code should define a variable k_fold: if k_fold is 10, perform 10-fold cross-validation; if it is 7, perform 7-fold cross-validation. The code should also define a variable dataset_percent: if dataset_percent is 10, use 10 percent of the data, and so on. Store a CSV for each algorithm in the output folder, named after the algorithm. Each of these CSVs should have two columns: the first with the header "actual" and the second with the header "predicted". Export the results of all folds to the CSV; since this is k-fold code, append the output of all k folds for a given algorithm into that algorithm's single file. The input file is input.csv, stored in the input folder. Apply a min-max scaler to all columns, and inverse-transform it when storing the output with actual and predicted values. Also display the training time, testing time, and total execution time of each ML algorithm; these timing results should be appended on each run to a CSV file called time.csv, opened in append mode. Assume the last column is the target to be predicted.
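A minimal sketch of how such a harness could fit together is shown below, assuming scikit-learn and pandas. The folder layout, k_fold, dataset_percent, the actual/predicted headers, time.csv, and the per-algorithm try/except follow the prompt above; the random seeds, the time.csv column names, the log timestamp format, and the use of two separate MinMaxScaler instances (features vs. target) to cover "all columns" are assumptions, and only three placeholder algorithms are registered here.

import os
import time
import logging
from datetime import datetime

import pandas as pd
from sklearn.model_selection import KFold
from sklearn.preprocessing import MinMaxScaler
from sklearn.linear_model import LinearRegression, Ridge, Lasso

k_fold = 10            # 10 -> 10-fold cross-validation, 7 -> 7-fold, etc.
dataset_percent = 100  # percentage of input.csv rows to use

logging.basicConfig(
    filename=f"log_{datetime.now():%Y%m%d_%H%M%S}.log",
    level=logging.INFO,
    format="%(asctime)s %(levelname)s %(message)s",
)

os.makedirs("output", exist_ok=True)
df = pd.read_csv("input/input.csv").sample(frac=dataset_percent / 100, random_state=42)

X_raw = df.iloc[:, :-1].values
y_raw = df.iloc[:, -1].values.reshape(-1, 1)   # last column is the target
x_scaler, y_scaler = MinMaxScaler(), MinMaxScaler()
X = x_scaler.fit_transform(X_raw)
y = y_scaler.fit_transform(y_raw).ravel()

# Placeholder registry: each entry is a zero-argument factory (class or lambda).
models = {
    "linear_regression": LinearRegression,
    "ridge_regression": Ridge,
    "lasso_regression": Lasso,
}

kf = KFold(n_splits=k_fold, shuffle=True, random_state=42)

for name, factory in models.items():
    try:
        rows, train_t, test_t = [], 0.0, 0.0
        for train_idx, test_idx in kf.split(X):
            model = factory()
            t0 = time.time()
            model.fit(X[train_idx], y[train_idx])
            t1 = time.time()
            pred = model.predict(X[test_idx])
            t2 = time.time()
            train_t += t1 - t0
            test_t += t2 - t1
            # Undo the min-max scaling so the CSV stores original-scale values.
            actual = y_scaler.inverse_transform(y[test_idx].reshape(-1, 1)).ravel()
            predicted = y_scaler.inverse_transform(pred.reshape(-1, 1)).ravel()
            rows.append(pd.DataFrame({"actual": actual, "predicted": predicted}))
        # All k folds appended into one CSV per algorithm, named after the algorithm.
        pd.concat(rows).to_csv(f"output/{name}.csv", index=False)
        timing = pd.DataFrame([{"algorithm": name, "train_time": train_t,
                                "test_time": test_t, "total_time": train_t + test_t}])
        timing.to_csv("time.csv", mode="a", index=False,
                      header=not os.path.exists("time.csv"))
        logging.info("%s done: train=%.3fs test=%.3fs", name, train_t, test_t)
    except Exception:
        # One failing algorithm must not stop the rest.
        logging.exception("%s failed; continuing with the next algorithm", name)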

The algos are (see the registry sketch after this list):

1. Linear Regression
2. Polynomial Regression
3. Ridge Regression
4. Lasso Regression
5. Elastic Net Regression
6. Bayesian Ridge Regression
7. Ordinary Least Squares Regression (OLS)
8. Huber Regression
9. Theil-Sen Estimator
10. Quantile Regression
11. Decision Tree Regression
12. Random Forest Regression
13. Gradient Boosting Regression
14. XGBoost Regression
15. LightGBM Regression
16. CatBoost Regression
17. Support Vector Regression (SVR)
18. K-Nearest Neighbors Regression (KNNR)
19. Principal Component Regression (PCR)
20. Partial Least Squares Regression (PLSR)
21. Artificial Neural Networks (ANN) Regression
22. Multi-layer Perceptron (MLP) Regression
23. Stochastic Gradient Descent (SGD) Regression
24. Bayesian Regression
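The full list could plug into the models registry from the sketch above. This is a hypothetical mapping that assumes scikit-learn, xgboost, lightgbm, and catboost are installed; note that OLS is the same estimator as plain linear regression, "Bayesian Regression" is mapped to BayesianRidge, and polynomial regression, PCR, and PLSR are expressed as pipelines whose degree/component settings are arbitrary choices, not part of the original prompt.

from sklearn.linear_model import (
    LinearRegression, Ridge, Lasso, ElasticNet, BayesianRidge,
    HuberRegressor, TheilSenRegressor, QuantileRegressor, SGDRegressor,
)
from sklearn.tree import DecisionTreeRegressor
from sklearn.ensemble import RandomForestRegressor, GradientBoostingRegressor
from sklearn.svm import SVR
from sklearn.neighbors import KNeighborsRegressor
from sklearn.neural_network import MLPRegressor
from sklearn.decomposition import PCA
from sklearn.cross_decomposition import PLSRegression
from sklearn.preprocessing import PolynomialFeatures
from sklearn.pipeline import make_pipeline
from xgboost import XGBRegressor        # pip install xgboost
from lightgbm import LGBMRegressor      # pip install lightgbm
from catboost import CatBoostRegressor  # pip install catboost

models = {
    "linear_regression": LinearRegression,
    # Polynomial regression as a pipeline: expand features, then fit least squares.
    "polynomial_regression": lambda: make_pipeline(PolynomialFeatures(degree=2),
                                                   LinearRegression()),
    "ridge_regression": Ridge,
    "lasso_regression": Lasso,
    "elastic_net_regression": ElasticNet,
    "bayesian_ridge_regression": BayesianRidge,
    "ols_regression": LinearRegression,          # OLS is plain least squares
    "huber_regression": HuberRegressor,
    "theil_sen_regression": TheilSenRegressor,
    "quantile_regression": QuantileRegressor,
    "decision_tree_regression": DecisionTreeRegressor,
    "random_forest_regression": RandomForestRegressor,
    "gradient_boosting_regression": GradientBoostingRegressor,
    "xgboost_regression": XGBRegressor,
    "lightgbm_regression": LGBMRegressor,
    "catboost_regression": lambda: CatBoostRegressor(verbose=0),
    "svr": SVR,
    "knn_regression": KNeighborsRegressor,
    # PCR: PCA for dimensionality reduction followed by linear regression.
    "pcr": lambda: make_pipeline(PCA(n_components=0.95), LinearRegression()),
    "plsr": lambda: PLSRegression(n_components=2),
    "ann_regression": lambda: MLPRegressor(hidden_layer_sizes=(64, 64), max_iter=1000),
    "mlp_regression": lambda: MLPRegressor(max_iter=1000),
    "sgd_regression": SGDRegressor,
    "bayesian_regression": BayesianRidge,        # mapped to Bayesian ridge here
}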