A1

Category: Other
Development tool: Jupyter Notebook
File size: 0KB
Downloads: 0
Upload date: 2023-11-18 09:38:57
Uploader: sh-1993
Description: Back Propagation

File list:
A1-KamillaTen.pdf (636560, 2023-11-19)
A1-synthetic.txt (110546, 2023-11-19)
A1-turbine.txt (16315, 2023-11-19)
MyNeuralNetwork.py (878, 2023-11-19)
dailyActivity_merged.csv (111288, 2023-11-19)
rslt/ (0, 2023-11-19)
rslt/bp_dataset_result.txt (7080, 2023-11-19)
rslt/bp_synthetic_result.txt (10138, 2023-11-19)
rslt/bp_turbine_result.txt (3430, 2023-11-19)
rslt/mlr_dataset_result.txt (4449, 2023-11-19)
rslt/mlr_synthetic_result.txt (6095, 2023-11-19)
rslt/mlr_turbine_result.txt (1879, 2023-11-19)
slide_231.jpg (67084, 2023-11-19)
src/ (0, 2023-11-19)
src/bp_synthetic.ipynb (208276, 2023-11-19)
src/bp_turbine.ipynb (125752, 2023-11-19)

# A1 - Back Propagation

## Overview

This project focuses on predicting turbine performance using machine learning models, specifically a Backpropagation Neural Network (BP) and a benchmark Linear Regression model (MLR). The implementation is in Python, leveraging libraries such as pandas, numpy, scikit-learn, and matplotlib.

![Slide 231](https://github.com/kamillok505/A1/assets/151350933/fcb4b80b-9928-45bd-9c0c-f6f7053999f2)

## Table of Contents

- [Description of Implementation](#description-of-implementation)
- [Execution Instructions](#execution-instructions)
- [Selected Dataset](#selected-dataset)
- [Implementation Decisions](#implementation-decisions)
- [Discussion and Results](#discussion-and-results)
- [Conclusions](#conclusions)

## Description of Implementation

The project employs two models, BP and MLR, for turbine performance prediction. Python is used, and data preprocessing includes handling missing values, categorical value representation, outlier detection, and data normalization for consistent scales.

## Execution Instructions

Ensure the required libraries (pandas, numpy, scikit-learn, matplotlib) are installed, then run the code in a Python environment.

## Selected Dataset

The Fitbit Fitness Tracker Dataset from Kaggle is used for this project. Two additional datasets, A1-synthetic and A1-turbine, are also employed. Data normalization techniques, such as Min-Max scaling, are applied for effective model training.

## Implementation Decisions

### Neural Network (BP)

- **Architecture:** Input layer, hidden layer (4 neurons), output layer.
- **Activation Function:** Sigmoid in the hidden layer.
- **Loss Function:** Mean Absolute Percentage Error (MAPE).
- **Training:** 1000 epochs, learning rate of 0.01.

### Linear Regression (MLR)

- **Model:** Utilizes scikit-learn's `LinearRegression`.
- **Evaluation Metric:** Mean Absolute Error (MAE).

Illustrative sketches of both models are given after this README.

## Discussion and Results

Both models are evaluated on the synthetic and turbine datasets. MAPE and MAE values, along with scatter plots, provide insight into their performance. MLR outperforms BP on the synthetic dataset, while both models face challenges in predicting on the turbine dataset.

## Conclusions

The project highlights the importance of aligning model architecture with dataset characteristics. Further exploration of neural network architecture, parameter tuning, and dataset characteristics is recommended. Future work should focus on advanced models, ensemble methods, dataset enhancement, and systematic hyperparameter tuning. Special thanks to Kaggle for providing the Fitbit Fitness Tracker Dataset.
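The repository's own network lives in `MyNeuralNetwork.py` and the `src/bp_*.ipynb` notebooks, which are not reproduced on this page. As orientation only, here is a minimal numpy sketch consistent with the decisions listed above (one hidden layer of 4 sigmoid neurons, MAPE as the loss, 1000 epochs, learning rate 0.01). The class name `TinyBPNet`, the weight initialization, and the linear output layer are assumptions, not taken from the repository.

```python
# Hypothetical sketch of the BP model described in the README;
# the actual MyNeuralNetwork.py may differ in its details.
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

class TinyBPNet:
    """One hidden layer (4 sigmoid neurons), linear output, MAPE loss."""

    def __init__(self, n_inputs, n_hidden=4, lr=0.01, seed=0):
        rng = np.random.default_rng(seed)
        self.lr = lr
        self.W1 = rng.normal(0.0, 0.5, (n_inputs, n_hidden))
        self.b1 = np.zeros(n_hidden)
        self.W2 = rng.normal(0.0, 0.5, (n_hidden, 1))
        self.b2 = np.zeros(1)

    def forward(self, X):
        self.h = sigmoid(X @ self.W1 + self.b1)  # hidden activations
        return self.h @ self.W2 + self.b2        # linear output

    def fit(self, X, y, epochs=1000):
        y = y.reshape(-1, 1)
        for _ in range(epochs):
            y_hat = self.forward(X)
            # Gradient of MAPE = (100/n) * sum(|y - y_hat| / |y|) w.r.t. y_hat
            # (assumes the targets are nonzero, as for turbine power output).
            d_out = 100.0 * np.sign(y_hat - y) / (np.abs(y) * len(y))
            d_W2 = self.h.T @ d_out
            d_b2 = d_out.sum(axis=0)
            d_hidden = (d_out @ self.W2.T) * self.h * (1.0 - self.h)
            d_W1 = X.T @ d_hidden
            d_b1 = d_hidden.sum(axis=0)
            # Plain gradient-descent update with the fixed learning rate
            self.W2 -= self.lr * d_W2
            self.b2 -= self.lr * d_b2
            self.W1 -= self.lr * d_W1
            self.b1 -= self.lr * d_b1
        return self

    def mape(self, X, y):
        y_hat = self.forward(X).ravel()
        return 100.0 * np.mean(np.abs(y - y_hat) / np.abs(y))

# Example usage (assumes X_train, y_train, X_test, y_test already exist):
# net = TinyBPNet(n_inputs=X_train.shape[1]).fit(X_train, y_train, epochs=1000)
# print("Test MAPE:", net.mape(X_test, y_test))
```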

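Likewise, a minimal scikit-learn sketch of the MLR benchmark with Min-Max scaling and MAE evaluation could look like the following. The file name, delimiter, choice of target column, and train/test split are assumptions for illustration, since the actual notebooks are not reproduced here.

```python
# Hypothetical sketch of the MLR benchmark described in the README.
import pandas as pd
from sklearn.linear_model import LinearRegression
from sklearn.metrics import mean_absolute_error
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import MinMaxScaler

# Assumes a whitespace-delimited file with the target in the last column;
# the real A1-turbine.txt / A1-synthetic.txt format may differ.
data = pd.read_csv("A1-turbine.txt", sep=r"\s+")
X, y = data.iloc[:, :-1], data.iloc[:, -1]

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42
)

# Min-Max scaling fitted on the training split, then applied to both splits
scaler = MinMaxScaler()
X_train = scaler.fit_transform(X_train)
X_test = scaler.transform(X_test)

mlr = LinearRegression().fit(X_train, y_train)
print("Test MAE:", mean_absolute_error(y_test, mlr.predict(X_test)))
```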