
PV-Power Forecasting using Machine Learning Techniques

Kazi Abdullah Al Arafat1, Kode Creer2, Anjan Debnath3, Temitayo O. Olowu4, and Imtiaz Parvez5
2024 IEEE International Conference on Electro Information Technology (eIT) | 979-8-3503-3064-9/24/$31.00 ©2024 IEEE | DOI: 10.1109/eIT60633.2024.10609848

1 Dept. of Computer Sci and Eng, Atish Dipankar University of Science and Technology, Dhaka, Bangladesh
3 Dept. of Electrical Engineering and Computer Science, 1500 SW Jefferson Way, Corvallis, OR 97331
4 Idaho National Laboratory, 1955 N. Fremont Ave., Idaho Falls, ID 83415
2,5 Dept. of Computer Science, Utah Valley University, 800 W University Pkwy, Orem, UT 84058
Email: cse2010004@adust.edu.bd, debnatha@oregonstate.edu, temitayo.olowu@inl.gov, {kode.creer, imtiaz.parvez}@uvu.edu

Abstract—Solar energy forecasting plays a pivotal role in the efficient utilization of renewable energy resources for sustainable power generation. This study delves into the domain of solar-power forecasting, employing a comprehensive analysis of machine learning models. The primary objective is to evaluate and compare the performance of Gated Recurrent Unit (GRU), Recurrent Neural Network (RNN), Multi-Layer Perceptron (MLP), and Linear Regression (LR) models in predicting solar energy production. Through a comprehensive evaluation of individual model performance, the study provides nuanced insights into the strengths and limitations of each forecasting approach. Results indicate that the Multi-Layer Perceptron (MLP) model excels in accuracy, exhibiting low root mean square error (RMSE) and high correlation among the parameters. The Gated Recurrent Unit (GRU) model demonstrates competitive performance, while the Recurrent Neural Network model showcases strengths in multiple metrics. Additionally, MLP and GRU models display superior predictive accuracy, emphasizing their efficacy in solar energy forecasting.

Index Terms—Solar, Forecasting, Gated Recurrent Unit, Recurrent Neural Network, Multi-Layer Perceptron, Linear Regression.

I. INTRODUCTION

The deployment of power generation from renewable energy resources has been on the rise over the last two decades [1]. In 2023, annual renewable capacity increased by approximately 50% globally, amounting to almost 510 gigawatts (GW) of capacity additions worldwide, the most capacity added in the last two decades. With the steadily declining cost of power generation from solar PV over the years [2], [3], three-quarters of new renewable capacity additions globally can be attributed to solar PV systems [1].

This massive deployment of solar PV systems comes with several technical challenges due to the variability in their power output. Their power generation depends on several environmental factors, which include solar irradiance, ambient temperature, module temperature, cloud coverage, etc. Some of the technical challenges with increased solar PV generation include voltage and frequency instability, reverse power flow, feeder losses, harmonics, resiliency, reliability, protection coordination, and big data processing, amongst others [4]. To address some of these issues, accurate forecasting of expected solar power generation becomes necessary.

An accurate forecast of solar PV power generation is highly important for distribution planning in order to determine the optimal sizing and location of solar PV systems, as well as the PV hosting capacity of the feeders to which they will be connected. After installation of the solar PV arrays, an accurate forecast of their power generation is also needed for day-to-day operations of the solar PV plant. This allows system operators and utility companies to manage power grid assets efficiently and economically. It is also necessary to effectively mitigate the challenges associated with the intermittency of solar power generation by coordinating it with other dispatchable generation sources in the grid. To address issues with forecasting solar PV generation, several artificial intelligence (AI)-based methods (which include machine learning and deep learning approaches) have been proposed in the literature [5]. ML-based methods that have been used include artificial neural networks (ANN), adaptive neuro-fuzzy inference systems (ANFIS), support vector machines (SVM), support vector regression (SVR), K-nearest neighbors (K-NN), regression trees, and random forests, amongst others. Deep learning methods that have been proposed for solar PV forecasting include long short-term memory (LSTM), recurrent neural networks (RNN), convolutional neural networks (CNN), generative adversarial networks (GAN), and gated recurrent units (GRU), amongst others. Reference [6] used a feedforward ANN to carry out a day-ahead forecast of panel and total array output using historical weather data similar to that of the day to be forecasted. An LSTM is used to validate the proposed ANN-based model, which achieves mean absolute percentage errors (MAPE) of 16.9% and 9.4% for the three-phase and single-phase models, respectively. A neural network approach used to make 1-6 h ahead forecasts for a 75 MW PV power plant is proposed in [7]. For various forecast horizons and weather types, the proposed NN-based forecast with inverter-level clustering achieves a MAPE between 1.42% and 8.13%. The authors of [8] proposed a transfer learning technique that addresses cases where there is an inadequate amount of data to train ML-based forecasting models. This work used three transfer learning algorithms with a Long Short-Term Memory (LSTM) model to provide accurate solar PV production forecasts.

A multiple Gated Recurrent Unit (m-GRU) network is used in [9] to predict power generation from a solar PV plant. The proposed m-GRU is compared to the classical GRU, ANN, SVR, and K-Nearest Neighbors (KNN); the m-GRU model achieves a lower training time and normalized RMSE. To overcome the challenge of overfitting and to minimize the variance of a single MLP model, an ensemble MLP is proposed in [10]. The proposed method is used for day-ahead solar prediction. The ensemble MLP model is fed with the nearest day's power generation and feature vectors. The results showed that the overfitting problem with MLP can

Authorized licensed use limited to: Vignan's Foundation for Science Technology & Research (Deemed to be University). Downloaded on August 03, 2024 at 09:08:09 UTC from IEEE Xplore. Restrictions apply.

be addressed using an ensemble MLP, with better forecasting accuracy especially on sunny days. The input of the ensemble model for day-ahead photovoltaic forecasting comprises feature vectors and the 24-hour power generation of the nearest day.

In this study, we used state-of-the-art machine learning techniques to conduct a thorough analysis of solar-power forecasting. By employing four different algorithms, we were able to predict solar energy output with impressive accuracy. We demonstrate the effectiveness of these algorithms in real-world scenarios and provide insight into their practical applicability in the field of solar energy forecasting. Our study paves the way for more dependable and effective solar energy prediction models by exposing the strengths and limitations of each strategy and offering stakeholders in the renewable energy sector actionable insights.

The rest of the paper is structured as follows: Section II describes the methodology used in this work; the simulation results are presented in Section III; and the paper is concluded in Section IV.

II. FORECASTING ALGORITHMS

In this study, we explored four machine learning algorithms for forecasting. The algorithms are as follows:

Linear Regression (LR): Linear regression is a basic statistical technique employed to describe the correlation between a dependent variable and one or more independent variables [11], [12]. At its most basic level, this method presupposes a straight-line relationship between the variables: the dependent variable is represented as a combination of the independent variables, each multiplied by a coefficient. The objective of linear regression is to determine the coefficients that provide the most accurate fit to the given data by reducing the disparity between the predicted values and the actual values. Typically, the method of least squares is employed to minimize the sum of the squared discrepancies between the observed and predicted values. Linear regression, although simple, is nonetheless a potent and adaptable tool for modeling and evaluating data, and it forms the basis for more intricate regression methods.

Multi-Layer Perceptron (MLP): A Multi-Layer Perceptron (MLP) is a fundamental form of artificial neural network that has many layers of linked nodes, or neurons, arranged in a feedforward fashion [13], [14]. Every neuron in an MLP is linked to every neuron in the following layer, creating a dense network configuration. MLPs are extensively utilized for supervised learning tasks, namely classification and regression problems, and are exceptionally proficient at capturing intricate non-linear correlations between input features and desired outcomes. The hidden layers of an MLP allow it to acquire hierarchical representations of the input data, progressively extracting higher-level features as information propagates through the network. MLPs use backpropagation to adjust the weights of neuron connections during training, aiming to decrease the discrepancy between expected and actual outputs and enhance the network's performance.

Recurrent Neural Networks (RNNs): A Recurrent Neural Network (RNN) is an artificial neural network specifically intended to efficiently handle sequential data by preserving knowledge about previous inputs through recurrent connections inside the network's hidden layers [15], [16]. RNNs differ from standard feedforward neural networks in that they incorporate loops within their design, enabling them to exhibit dynamic temporal behavior. Due to their intrinsic memory capabilities, RNNs are well suited to applications that involve sequences, such as speech recognition, language modeling, time series prediction, and handwriting recognition. An important characteristic of RNNs is their capacity to process inputs of varying lengths, enabling them to be applied to a diverse array of practical situations. Nevertheless, RNNs have constraints such as vanishing or exploding gradients during training, which can impede their efficacy in capturing long-range relationships. These constraints have prompted the creation of more sophisticated variants, such as Long Short-Term Memory (LSTM) networks and Gated Recurrent Units (GRUs), to tackle these challenges and enhance performance on sequential tasks.

Gated Recurrent Unit (GRU): A Gated Recurrent Unit (GRU) is a specific type of recurrent neural network that aims to overcome the drawbacks of typical RNNs, including vanishing gradients and the challenge of capturing long-term dependencies [17]. GRUs, which are a simplification of LSTM networks, feature a more streamlined design with fewer parameters, which results in improved computational efficiency without sacrificing competitive performance. GRUs employ gating mechanisms to selectively update and reset information inside their hidden states, allowing them to effectively regulate the transmission of information across time steps. This gating mechanism enables GRUs to preserve crucial information throughout extended sequences and to address vanishing gradients by improving gradient flow during training. Furthermore, GRUs are known for their ease of implementation and training in comparison to LSTMs, which contributes to their widespread usage in sequential applications such as natural language processing, time series prediction, and audio recognition.

III. SIMULATION AND RESULTS

In order to assess the effectiveness of our proposed model, we utilize one year of photovoltaic (PV) generation data obtained from the National Renewable Energy Laboratory (NREL) database. Table I displays a portion of the dataset. The input variables include temperature, irradiance, relative humidity, sun zenith angle, and wind speed. The desired output consists of the generated voltage and current. Out of the data collected over a period of one year, we allocate 60% for training, 20% for cross-validation, and 20% for testing. The performance of the four models is as follows:

a) Gated Recurrent Unit (GRU): The performance of the GRU is shown in Table II. The GRU model predicts solar watt values with a remarkable accuracy of 99.99%. Robust model performance is indicated by the absence of significant errors, both maximum and minimum, and by a near-perfect correlation. The model successfully represents the underlying dependencies and patterns in the solar dataset.

The real data and its forecasted values through the GRU are shown in Fig. 1, where the x-axis is 20 randomly chosen instants and the y-axis is the power in watts. It is evident that the forecast follows the real values very closely.

b) Recurrent Neural Network (RNN): The performance of the RNN is shown in Table III. The RNN model exhibits exceptional precision and accuracy of 99.99%. The model's dependability is demonstrated by the lack of significant errors


TABLE I
REAL WORLD PV DATASET
(Columns: Year, Month, Day, Hour, Minute, Irradiance, Temperature, Solar Zenith Angle, Wind Speed, Relative Humidity, Vmp, Imp, Watt)
2016 1 1 4 0 0 25 131.23 2.1 86.91 0 0 0
2016 1 1 4 30 0 25 124.58 2.1 86.91 0 0 0
2016 1 1 5 0 0 25 117.99 2.2 85.96 0 0 0
2016 1 1 5 30 0 25 111.47 2.2 85.97 0 0 0
2016 1 1 6 0 0 25 105.02 2.3 85.26 0 0 0
2016 1 1 6 30 0 25 98.68 2.3 85.28 0 0 0
2016 1 1 7 0 0 25 92.47 2.3 80.85 0 0 0
2016 1 1 7 30 40 25 86.4 2.5 80.86 26.905 0.170923404 4.598694176
2016 1 1 8 0 128 26 80.53 2.9 77.68 29.805 0.552353727 16.46290282
2016 1 1 8 30 226 26 74.9 3.2 77.69 31.245 0.978856761 30.5843795
2016 1 1 9 0 323 26 69.56 3.5 77.02 32.1 1.401788201 44.99740127
2016 1 1 9 30 144 26 64.59 3.6 77.01 30.11 0.621887171 18.72502272
2016 1 1 10 0 304 26 60.09 3.7 77.33 31.96 1.318842914 42.15021954
2016 1 1 10 30 331 26 56.17 3.7 77.3 32.155 1.436763807 46.19914022
2016 1 1 11 0 383 26 52.95 3.7 73.62 32.49 1.6635243 54.04790451
2016 1 1 11 30 321 26 50.58 3.5 73.58 32.085 1.393088052 44.69723016
2016 1 1 12 0 427 26 49.18 3.4 74.38 32.73 1.855565072 60.7326448
2016 1 1 12 30 479 26 48.82 3.3 74.34 32.98 2.082298597 68.67420773
2016 1 1 13 0 521 27 49.55 3.2 74.78 33.005 2.265464939 74.77167031
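The 60/20/20 chronological split described above can be sketched as follows. The placeholder records and the assumption that the split is made in time order (rather than shuffled) are ours; the paper does not specify its splitting procedure.

```python
# Sketch of the 60/20/20 train / cross-validation / test split of Section III.
# Assumption: the split preserves time order, as is common for forecasting data.

def chronological_split(rows, train_frac=0.6, val_frac=0.2):
    """Split time-ordered samples into train, cross-validation, and test sets."""
    n = len(rows)
    n_train = int(n * train_frac)
    n_val = int(n * val_frac)
    train = rows[:n_train]
    val = rows[n_train:n_train + n_val]
    test = rows[n_train + n_val:]   # remaining ~20%
    return train, val, test

# One year of half-hourly records: 365 * 48 = 17520 samples
samples = [(i, 0.0) for i in range(17520)]   # placeholder (index, watt) pairs
train, val, test = chronological_split(samples)
print(len(train), len(val), len(test))       # 10512 3504 3504
```

Keeping the split chronological avoids leaking future weather conditions into the training set.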

TABLE II
GATED RECURRENT UNIT (GRU)

Performance Metrics    Watt
RMSE                   0.4830
NRMSE                  0.0167
MAE                    0.3205
NMAE                   0.0111
Min Abs Error          0.0004
Max Abs Error          5.2009
r                      1.0000
R2                     0.9999

TABLE III
RECURRENT NEURAL NETWORK (RNN)

Performance Metrics    Watt
RMSE                   0.7800
NRMSE                  0.0270
MAE                    0.5422
NMAE                   0.0188
Min Abs Error          0.0001
Max Abs Error          3.0464
r                      1.0000
R2                     0.9997

Fig. 1. Testing Data vs Forecasting data for Gated Recurrent Unit
Fig. 2. Testing Data vs Forecasting data for Recurrent Neural Network
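The update/reset gating that Section II credits for the GRU's behavior can be sketched with a single-cell forward pass. The dimensions, random weights, and gate convention below are illustrative assumptions; this is not the paper's trained forecasting model.

```python
import numpy as np

# Forward pass of one GRU cell: update gate z, reset gate r, candidate state.
# All sizes and weights here are illustrative assumptions.
rng = np.random.default_rng(1)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def gru_cell(x, h_prev, params):
    Wz, Uz, Wr, Ur, Wh, Uh = params
    z = sigmoid(x @ Wz + h_prev @ Uz)               # update gate
    r = sigmoid(x @ Wr + h_prev @ Ur)               # reset gate
    h_tilde = np.tanh(x @ Wh + (r * h_prev) @ Uh)   # candidate hidden state
    return (1 - z) * h_prev + z * h_tilde           # gated blend of old and new

n_in, n_hid = 5, 8   # e.g. 5 weather inputs per half-hour step (assumed)
params = [rng.normal(scale=0.3, size=s)
          for s in [(n_in, n_hid), (n_hid, n_hid)] * 3]

h = np.zeros(n_hid)
for t in range(10):                  # run over a short input sequence
    x_t = rng.uniform(size=n_in)
    h = gru_cell(x_t, h, params)
print(h.shape)
```

Because the new state is a gated blend of the previous state and a bounded candidate, the hidden values stay well-behaved over long sequences, which is the property the text attributes to GRUs.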

and the flawless correlation. The temporal dependencies and patterns in the production of solar energy are well captured by the model.

The real data and its forecasted values by the RNN are shown in Fig. 2, where the x-axis is 20 randomly chosen instants and the y-axis is the power in watts. It is evident that the forecast follows the real values very closely.

c) Multi-Layer Perceptron (MLP): The performance of the MLP is shown in Table IV. In comparison to GRU and RNN, the MLP model exhibits the lowest errors, alongside its good overall performance. Although the model predicts quite well, it might still be improved: the temporal dependencies found in the solar dataset are not explicitly modeled by the MLP's feedforward architecture.

The real data and its forecasted values by the MLP are shown in Fig. 3, where the x-axis is 20 randomly chosen instants and the y-axis is the power in watts. It is evident that the forecast follows the real values very closely.

d) Linear Regression (LR): The performance of the LR is shown in Table V. Compared to GRU and MLP, the LR model shows larger errors despite demonstrating a high correlation. Although the model works well, it might not be as good at capturing the complexities found in the solar energy dataset. Non-linear patterns and temporal dependencies might not be


TABLE IV
MULTI-LAYER PERCEPTRON (MLP)

Performance Metrics    Watt
RMSE                   0.0725
NRMSE                  0.0025
MAE                    0.0426
NMAE                   0.0015
Min Abs Error          0.0000
Max Abs Error          0.8775
r                      1.0000
R2                     1.0000
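A minimal MLP regressor of the kind evaluated in Table IV can be sketched as follows. The single hidden layer, learning rate, and synthetic irradiance-to-watt data are illustrative assumptions, not the paper's actual configuration.

```python
import numpy as np

# One-hidden-layer MLP regressor trained with backpropagation, sketching the
# MLP of Section II. Architecture and data are illustrative assumptions.
rng = np.random.default_rng(0)

# Toy data: normalized irradiance in [0, 1] mapped to a noisy power curve.
X = rng.uniform(0.0, 1.0, size=(256, 1))
y = 0.15 * X - 0.02 * X**2 + 0.001 * rng.normal(size=(256, 1))

W1 = rng.normal(scale=0.5, size=(1, 16)); b1 = np.zeros((1, 16))
W2 = rng.normal(scale=0.5, size=(16, 1)); b2 = np.zeros((1, 1))
lr = 0.1

def forward(X):
    h = np.tanh(X @ W1 + b1)            # hidden layer with tanh activation
    return h, h @ W2 + b2               # linear output for regression

losses = []
for _ in range(500):
    h, y_hat = forward(X)
    err = y_hat - y
    losses.append(float(np.mean(err**2)))
    # Backpropagation: gradients of the mean-squared error
    g_out = 2 * err / len(X)
    gW2 = h.T @ g_out;  gb2 = g_out.sum(axis=0, keepdims=True)
    g_h = (g_out @ W2.T) * (1 - h**2)   # tanh derivative
    gW1 = X.T @ g_h;    gb1 = g_h.sum(axis=0, keepdims=True)
    W2 -= lr * gW2; b2 -= lr * gb2
    W1 -= lr * gW1; b1 -= lr * gb1

print(f"MSE: {losses[0]:.4f} -> {losses[-1]:.6f}")
```

The training loop shows the weight-update mechanism described in Section II; a production model would add more inputs, validation-based early stopping, and hyperparameter tuning as the paper suggests.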

Fig. 4. Testing Data vs Forecasting data for Linear Regression

TABLE VI
COMPARISON OF FOUR MODELS

Performance Metrics    GRU      RNN      MLP      LR
RMSE                   0.4830   0.7800   0.0725   0.5946
NRMSE                  0.0167   0.0270   0.0025   0.0206
MAE                    0.3205   0.5422   0.0426   0.3950
NMAE                   0.0111   0.0188   0.0015   0.0137
Min Abs Error          0.0004   0.0001   0.0000   0.0008
Max Abs Error          5.2009   3.0464   0.8775   5.0322
r                      1.0000   1.0000   1.0000   0.9999
R2                     0.9999   0.9997   1.0000   0.9998
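The metrics compared in Table VI can be reproduced from a vector of observations and forecasts as sketched below. The choice to normalize RMSE and MAE by the mean of the observed values is our assumption, since the paper does not state its normalization convention.

```python
import math

def forecast_metrics(y_true, y_pred):
    """RMSE, NRMSE, MAE, NMAE, min/max absolute error, r, and R^2.
    Normalizing by the mean of the observations is an assumption; the paper
    does not state its convention. Assumes a non-constant observed series."""
    n = len(y_true)
    errs = [yt - yp for yt, yp in zip(y_true, y_pred)]
    abs_errs = [abs(e) for e in errs]
    rmse = math.sqrt(sum(e * e for e in errs) / n)
    mae = sum(abs_errs) / n
    mean_t = sum(y_true) / n
    mean_p = sum(y_pred) / n
    # Pearson correlation coefficient r
    cov = sum((yt - mean_t) * (yp - mean_p) for yt, yp in zip(y_true, y_pred))
    var_t = sum((yt - mean_t) ** 2 for yt in y_true)
    var_p = sum((yp - mean_p) ** 2 for yp in y_pred)
    r = cov / math.sqrt(var_t * var_p)
    # Coefficient of determination R^2
    r2 = 1.0 - sum(e * e for e in errs) / var_t
    return {"RMSE": rmse, "NRMSE": rmse / mean_t,
            "MAE": mae, "NMAE": mae / mean_t,
            "MinAbsErr": min(abs_errs), "MaxAbsErr": max(abs_errs),
            "r": r, "R2": r2}

m = forecast_metrics([1.0, 2.0, 3.0, 4.0], [1.1, 1.9, 3.2, 3.8])
print(round(m["RMSE"], 4), round(m["R2"], 4))
```

Evaluating all four models with one function of this shape keeps the comparison in Table VI consistent across metrics.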
Fig. 3. Testing Data vs Forecasting data for Multi-Layer Perceptron

completely captured by linear regression.

TABLE V
LINEAR REGRESSION (LR)

Performance Metrics    Watt
RMSE                   0.5946
NRMSE                  0.0206
MAE                    0.3950
NMAE                   0.0137
Min Abs Error          0.0008
Max Abs Error          5.0322
r                      0.9999
R2                     0.9998

The real data and its forecasted values by LR are shown in Fig. 4, where the x-axis is 20 randomly chosen instants and the y-axis is the power in watts. It is evident that the forecast follows the real values very closely.

A. Comparative Analysis

The comparison of the four algorithms is tabulated in Table VI.

a) Gated Recurrent Unit (GRU): Strengths:
Demonstrates excellent predictive accuracy, achieving the second-lowest RMSE (0.4830) and NRMSE (0.0167). An excellent fit to the data is indicated by the correlation (r = 1.0000) and R-squared (R² = 0.9999), with a small minimum absolute error, though with occasional large deviations (maximum absolute error 5.2009).
Considerations:
Needs a sufficient amount of data and processing power to train. Performance may be further optimized with hyperparameter tuning.

b) Recurrent Neural Network (RNN): Strengths:
Predictions are accurate, with a modest RMSE (0.7800) and NRMSE (0.0270). An excellent model fit is highlighted by strong correlation (r = 1.0000) and R² (0.9997). Effectively captures temporal dependencies.
Considerations:
Computational power and data volume have an impact on training, just as with the GRU. Performance can be improved with additional optimization.

c) Multi-Layer Perceptron (MLP): Strengths:
Excellent overall performance, with high R² (1.0000) and correlation (r = 1.0000), and the lowest RMSE (0.0725) and NRMSE (0.0025) of the four models. Well suited even for less complex forecasting jobs.
Considerations:
Hyperparameter tuning and architecture modifications might be beneficial, and its feedforward structure does not explicitly model temporal dependencies.

d) Linear Regression (LR): Strengths:
A strong linear fit is indicated by a high correlation (r = 0.9999) and R² (0.9998). Ease of interpretation and simplicity.
Considerations:
Elevated RMSE (0.5946) and NRMSE (0.0206) indicate potential constraints in identifying intricate patterns. Not as good at modeling non-linear relationships.

e) Overall Comparison: Accuracy: As shown in Fig. 5, the highest accuracy is achieved by MLP and GRU, which have the lowest RMSE and NRMSE. RNN works well but has somewhat larger errors. LR exhibits larger error rates, especially for intricate forecasting tasks.
Precision: MLP and GRU show precision with low absolute errors. RNN demonstrates respectable accuracy, albeit with higher errors than GRU and MLP.


Fig. 5. Comparison of performance metrics of GRU, MLP, RNN and LR

Computational Requirements: In terms of computation, MLP and LR require less work; more resources might be needed for GRU and RNN training.
Interpretability: Because linearity is inherent in LR, its interpretation is simple. The interpretation of MLP, GRU, and RNN models is more difficult due to their complexity.
Recommendations: MLP or GRU is advised for highly precise and accurate solar energy forecasting; MLP is also attractive when processing power is at a premium. LR's simplicity can be taken into consideration, but in complex scenarios its usefulness might be limited.

Every model possesses its own advantages and nuances. The choice will depend on the particulars of the forecasting job, the resources at hand, and the intended balance between interpretability and accuracy. More refined models can be produced through additional testing and tuning.

IV. CONCLUSION

In this study, we explored four machine-learning algorithms for PV forecasting. Our study paves the way for more dependable and effective solar energy prediction models by exposing the strengths and limitations of each model and offering stakeholders in the renewable energy sector actionable insights. With few errors and near-perfect correlation, MLP and GRU show the best predictive performance; RNN also performs well, although its error rate is slightly higher. Although LR displays a high degree of correlation, its errors are greater than those of MLP and GRU. These results highlight how well the MLP and the recurrent GRU architecture work for forecasting solar energy generation. The accuracy of these models may be enhanced through additional optimization and fine-tuning.

REFERENCES

[1] "Renewables 2023," IEA, Paris, https://www.iea.org/reports/renewables-2023, License: CC BY 4.0 [Accessed 15-01-2024].
[2] V. Ramasamy, J. Zuboy, M. Woodhouse, E. O'Shaughnessy, D. Feldman, J. Desai, A. Walker, R. Margolis, and P. Basore, "US solar photovoltaic system and energy storage cost benchmarks, with minimum sustainable price analysis: Q1 2023," National Renewable Energy Laboratory (NREL), Golden, CO (United States), Tech. Rep., 2023.
[3] "Solar Futures Study," https://www.energy.gov/eere/solar/solar-futures-study [Accessed 15-01-2024].
[4] T. O. Olowu, A. Sundararajan, M. Moghaddami, and A. I. Sarwat, "Future challenges and mitigation methods for high photovoltaic penetration: A survey," Energies, vol. 11, no. 7, 2018. [Online]. Available: https://www.mdpi.com/1996-1073/11/7/1782
[5] T. Rajasundrapandiyanleebanon, K. Kumaresan, S. Murugan, M. Subathra, and M. Sivakumar, "Solar energy forecasting using machine learning and deep learning techniques," Archives of Computational Methods in Engineering, pp. 1–21, 2023.
[6] M. Jasinski, O. Homaee, D. Opałkowski, A. Najafi, and Z. Leonowicz, "On the forecastability of solar energy generation by rooftop panels pointed in different directions," IEEE Transactions on Sustainable Energy, vol. 15, no. 1, pp. 699–702, 2024.
[7] A. du Plessis, J. Strauss, and A. Rix, "Short-term solar power forecasting: Investigating the ability of deep learning models to capture low-level utility-scale photovoltaic system behaviour," Applied Energy, vol. 285, p. 116395, 2021. [Online]. Available: https://www.sciencedirect.com/science/article/pii/S0306261920317657
[8] E. Sarmas, N. Dimitropoulos, V. Marinakis, Z. Mylona, and H. Doukas, "Transfer learning strategies for solar power forecasting under data scarcity," Scientific Reports, vol. 12, no. 1, p. 14643, 2022.
[9] N. Sodsong, K. M. Yu, and W. Ouyang, "Short-term solar PV forecasting using gated recurrent unit with a cascade model," in 2019 International Conference on Artificial Intelligence in Information and Communication (ICAIIC), 2019, pp. 292–297.
[10] M. Wang and P. Wang, "Ensemble multilayer perceptron model for day-ahead photovoltaic forecasting," in Proceedings of the 4th International Conference on Control and Computer Vision, 2021, pp. 189–194.
[11] S. Weisberg, Applied Linear Regression. John Wiley & Sons, 2005, vol. 528.
[12] M. Trigo-González, F. Batlles, J. Alonso-Montesinos, P. Ferrada, J. Del Sagrado, M. Martínez-Durbán, M. Cortés, C. Portillo, and A. Marzo, "Hourly PV production estimation by means of an exportable multiple linear regression model," Renewable Energy, vol. 135, pp. 303–312, 2019.
[13] D. W. Ruck, S. K. Rogers, and M. Kabrisky, "Feature selection using a multilayer perceptron," Journal of Neural Network Computing, vol. 2, no. 2, pp. 40–48, 1990.
[14] M. G. S. Sriyananda, I. Parvez, I. Güvenç, M. Bennis, and A. I. Sarwat, "Multi-armed bandit for LTE-U and WiFi coexistence in unlicensed bands," in 2016 IEEE Wireless Communications and Networking Conference, 2016, pp. 1–6.
[15] L. Medsker and L. C. Jain, Recurrent Neural Networks: Design and Applications. CRC Press, 1999.
[16] V. Veerasamy, N. I. A. Wahab, M. L. Othman, S. Padmanaban, K. Sekar, R. Ramachandran, H. Hizam, A. Vinayagam, and M. Z. Islam, "LSTM recurrent neural network classifier for high impedance fault detection in solar PV integrated power system," IEEE Access, vol. 9, pp. 32672–32687, 2021.
[17] R. Dey and F. M. Salem, "Gate-variants of gated recurrent unit (GRU) neural networks," in 2017 IEEE 60th International Midwest Symposium on Circuits and Systems (MWSCAS). IEEE, 2017, pp. 1597–1600.
