This repository contains the datasets used and the machine learning models implemented in Python 3 for long-term degradation prognostics of proton exchange membrane fuel cells (PEMFCs) in the paper "Accurate long-term prognostics of proton exchange membrane fuel cells using recurrent and convolutional neural networks", published in the International Journal of Hydrogen Energy.
If you use the code in your research, please cite the paper as:
Sahajpal K, Rana KPS, Kumar V. Accurate long-term prognostics of proton exchange membrane fuel cells using recurrent and convolutional neural networks. Int J Hydrogen Energy 2023. https://doi.org/10.1016/j.ijhydene.2023.04.143.
Running the code requires only an elementary knowledge of Python, but to experiment with the model implementations and hyperparameter optimization you may need to refer to the documentation of the associated libraries (Keras, scikit-learn, NumPy, Optuna, etc.).
The IEEE 2014 PHM Challenge dataset files for the PEMFC stack operated with and without current ripples are provided with this repo in the folders FC1_Without_Ripples and Full_FC2_With_Ripples, respectively, within the IEEE2014DataChallengeData folder. To test/implement the deep learning models in Google Colab:
- Open the PEMFC_Prognostics_DL.ipynb file in GitHub and click Open in Colab.
- Connect to a Colab runtime, preferably a GPU runtime, as model training and optimization are compute-intensive.
- Add the dataset files to the runtime. There are two ways to do this:
  - Add the dataset files to your Google Drive. The code assumes that the dataset files are in the topmost Google Drive directory; if the datasets are at a different path, modify the path passed to `pd.read_csv()` in the IEEE PHM 2014 Data Challenge code cells (see the path sketch after this list). OR
  - Create a shortcut to this Drive link in Google Drive. To do so, click the down arrow near the folder name following Shared with me and click Add Shortcut to Drive.
- Mount Google Drive by running the Mount Google Drive cell, and allow Google Colab access to Google Drive when prompted (the mount call is sketched after this list).
- Run the Requirements cells.
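
For context, the Mount Google Drive cell boils down to Colab's standard mount call:

```python
# Standard Google Colab Drive mount; the first run opens an
# authorization prompt that must be accepted.
from google.colab import drive

drive.mount('/content/drive')
```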
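
If your dataset files are not in the topmost Drive directory, the path change is a one-liner. The `datasets` subfolder and `FC1.csv` filename below are placeholders; substitute the actual path and filename from your Drive:

```python
import pandas as pd

# The code assumes the dataset folders sit in the Drive root,
# i.e. under '/content/drive/MyDrive/'. If you placed them in a
# subfolder (here a hypothetical 'datasets' folder), adjust the path.
df = pd.read_csv('/content/drive/MyDrive/datasets/FC1_Without_Ripples/FC1.csv')
```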
- You can experiment with neural network model structures under Model Implementation (a structural sketch follows this list).
- To experiment with varying training/validation dataset sizes, use the `train_test_split()` function under Dataset Preprocessor (see the example after this list).
- To evaluate model performance with the optimum hyperparameters reported in the paper (to 4 significant digits), run the Preprocess the Dataset and Model Evaluation code cells without tweaking anything in Step 4.
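
As a starting point for such experiments, here is a minimal Keras sketch in the spirit of the paper's convolutional/recurrent models. The layer sizes, `window_len`, and `n_features` are placeholders, not the architectures or optima reported in the paper:

```python
from tensorflow import keras
from tensorflow.keras import layers

window_len, n_features = 32, 5  # placeholders: input window length and feature count

# A small CNN + LSTM stack ending in a single regression output
# (e.g. the predicted stack voltage).
model = keras.Sequential([
    layers.Conv1D(64, kernel_size=3, activation='relu',
                  input_shape=(window_len, n_features)),
    layers.LSTM(64),
    layers.Dropout(0.2),
    layers.Dense(32, activation='relu'),
    layers.Dense(1),
])
model.compile(optimizer='adam', loss='mse')
```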
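
And a hedged example of varying the split sizes with `train_test_split()`; the random arrays stand in for the features and targets produced by the Dataset Preprocessor cells:

```python
import numpy as np
from sklearn.model_selection import train_test_split

X = np.random.rand(1000, 5)  # placeholder features: replace with the preprocessed data
y = np.random.rand(1000)     # placeholder targets, e.g. stack voltage

# 70% training / 30% validation; change test_size to vary the split.
# shuffle=False keeps the chronological order, which matters for a
# degradation time series.
X_train, X_val, y_train, y_val = train_test_split(
    X, y, test_size=0.3, shuffle=False
)
```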
- To optimize the model hyperparameters, you can configure the functions in the `optuna_search()` class under Hyperparameter Optimization. Please refer to the Optuna documentation for more information on the functions involved.
- You can change the network optimizer hyperparameters for the RMSprop/Adam/SGD optimizers in the `create_optimizer` function definition (a sketch follows this list).
- You can select the model to optimize and define the hyperparameter search spaces for the number of convolutional filters, dropout, number of neurons in each hidden layer, and activation function in the `objective_function` function definition.
- You can choose how many optimization trials to run by changing `n_trials` in the `optimize_study` function definition. You can also select hyperparameter optimizers other than the tree-structured Parzen estimator (TPE) and pruners other than Hyperband within `optimize_study` (see the second sketch after this list).
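
For orientation, here is a minimal sketch of a `create_optimizer`-style helper; the notebook's actual function may differ, and the learning-rate bounds are illustrative. Optuna suggests the optimizer family and learning rate, and Keras instantiates it:

```python
import optuna
from tensorflow import keras

def create_optimizer(trial):
    # Search over the three optimizer families used in the notebook.
    name = trial.suggest_categorical('optimizer', ['RMSprop', 'Adam', 'SGD'])
    # Log-uniform learning-rate range; the bounds here are illustrative.
    lr = trial.suggest_float('learning_rate', 1e-4, 1e-2, log=True)
    if name == 'RMSprop':
        return keras.optimizers.RMSprop(learning_rate=lr)
    if name == 'Adam':
        return keras.optimizers.Adam(learning_rate=lr)
    return keras.optimizers.SGD(learning_rate=lr)
```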
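
And a hedged sketch of how `objective_function` and `optimize_study` fit together. `build_model` is a hypothetical stand-in for the notebook's model construction from trial suggestions, and the `sampler`/`pruner` arguments mark where to swap in alternatives to TPE and Hyperband:

```python
import optuna

def objective_function(trial):
    model = build_model(trial)  # hypothetical: builds the CNN/RNN from trial suggestions
    model.compile(optimizer=create_optimizer(trial), loss='mse')
    history = model.fit(X_train, y_train,
                        validation_data=(X_val, y_val),
                        epochs=50, verbose=0)
    return min(history.history['val_loss'])  # the value Optuna minimizes

def optimize_study(n_trials=100):
    study = optuna.create_study(
        direction='minimize',
        sampler=optuna.samplers.TPESampler(),     # replace to try other samplers
        pruner=optuna.pruners.HyperbandPruner(),  # replace to try other pruners
    )
    study.optimize(objective_function, n_trials=n_trials)
    return study
```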
For any questions, you can reach out to me at kartik.sharma1004@gmail.com or kartik.sahajpal.ug20@nsut.ac.in.