
SUN_Pharma_Stock_Prediction_RNN

September 22, 2024

1 Recurrent Neural Network


1.1 Part 1 - Data Preprocessing
1.1.1 Importing the libraries

[1]: import numpy as np
import matplotlib.pyplot as plt
import pandas as pd

1.1.2 Importing the training set

[2]: dataset_train = pd.read_csv('SUN_train_data.csv')
dataset_train['Price'] = dataset_train['Price'].str.replace(',', '').astype(float)
training_set = dataset_train.iloc[:, 1:2].values
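Note the 1:2 column slice: unlike iloc[:, 1], it returns a two-dimensional array, which is the shape the scaler in the next step expects. A quick sanity check (a minimal sketch, assuming the CSV loaded as above):

print(training_set.shape)  # expected: (n_rows, 1), a single price column kept 2-D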

1.1.3 Feature Scaling

[3]: from sklearn.preprocessing import MinMaxScaler
sc = MinMaxScaler(feature_range = (0, 1))
training_set_scaled = sc.fit_transform(training_set)
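Scaling the prices into [0, 1] keeps the inputs in the range where the LSTM's sigmoid and tanh gates respond best, and the fitted scaler is reused later to map predictions back to price units. A quick check that the transform behaved as expected (sketch):

# After fit_transform the training series should span exactly [0, 1]
print(training_set_scaled.min(), training_set_scaled.max())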

1.1.4 Creating a data structure with 80 timesteps and 1 output

[5]: X_train = []
y_train = []
for i in range(80, 2471):
    X_train.append(training_set_scaled[i-80:i, 0])
    y_train.append(training_set_scaled[i, 0])
X_train, y_train = np.array(X_train), np.array(y_train)

1.1.5 Reshaping

[6]: X_train = np.reshape(X_train, (X_train.shape[0], X_train.shape[1], 1))
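Keras recurrent layers expect input shaped (samples, timesteps, features); here each sample is an 80-step window over a single feature, and the loop over range(80, 2471) yields 2471 - 80 = 2391 windows. The shapes can be verified directly (a minimal sketch):

print(X_train.shape)  # expected: (2391, 80, 1)
print(y_train.shape)  # expected: (2391,)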

1.2 Part 2 - Building and Training the RNN
1.2.1 Importing the Keras libraries and packages

[7]: from keras.models import Sequential
from keras.layers import Dense
from keras.layers import LSTM
from keras.layers import Dropout

1.2.2 Initialising the RNN

[8]: regressor = Sequential()

1.2.3 Adding the first LSTM layer and some Dropout regularisation

[9]: regressor.add(LSTM(units = 80, return_sequences = True, input_shape = (X_train.shape[1], 1)))
regressor.add(Dropout(0.2))

C:\Users\Shekhar Singh\onedrive\desktop\ml\function\env\Lib\site-packages\keras\src\layers\rnn\rnn.py:204: UserWarning: Do not pass an `input_shape`/`input_dim` argument to a layer. When using Sequential models, prefer using an `Input(shape)` object as the first layer in the model instead.
  super().__init__(**kwargs)
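The warning is harmless here: passing input_shape to the first LSTM still works, but, as the message says, recent Keras versions prefer an explicit Input layer at the front of a Sequential model. A warning-free equivalent would start the model like this (sketch, assuming the same X_train):

from keras.layers import Input

regressor = Sequential()
regressor.add(Input(shape=(X_train.shape[1], 1)))  # 80 timesteps, 1 feature
regressor.add(LSTM(units = 80, return_sequences = True))
regressor.add(Dropout(0.2))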

1.2.4 Adding a second LSTM layer and some Dropout regularisation

[10]: regressor.add(LSTM(units = 80, return_sequences = True))
regressor.add(Dropout(0.2))

1.2.5 Adding a third LSTM layer and some Dropout regularisation

[11]: regressor.add(LSTM(units = 80, return_sequences = True))
regressor.add(Dropout(0.2))

1.2.6 Adding a fourth LSTM layer and some Dropout regularisation

[12]: regressor.add(LSTM(units = 80, return_sequences = True))
regressor.add(Dropout(0.2))

1.2.7 Adding a fifth LSTM layer and some Dropout regularisation

[13]: regressor.add(LSTM(units = 80, return_sequences = True))
regressor.add(Dropout(0.2))

1.2.8 Adding a sixth LSTM layer and some Dropout regularisation

[14]: regressor.add(LSTM(units = 80))
regressor.add(Dropout(0.2))

1.2.9 Adding the output layer

[15]: regressor.add(Dense(units = 1))
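Note that the sixth LSTM omits return_sequences = True: it returns only its final hidden state, a single 80-dimensional vector per sample, which this one-unit Dense layer maps to a single predicted price. The layer shapes can be confirmed with a summary (sketch):

regressor.summary()  # each stacked LSTM outputs (None, 80, 80); the last outputs (None, 80)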

1.2.10 Compiling the RNN

[16]: regressor.compile(optimizer = 'adam', loss = 'mean_squared_error')

1.2.11 Fitting the RNN to the Training set

[17]: regressor.fit(X_train, y_train, epochs = 50, batch_size = 32)

Epoch 1/50
75/75 ━━━━━━━━━━━━━━━━━━━━ 25s 167ms/step - loss: 0.0399
Epoch 2/50
75/75 ━━━━━━━━━━━━━━━━━━━━ 13s 167ms/step - loss: 0.0055
Epoch 3/50
75/75 ━━━━━━━━━━━━━━━━━━━━ 12s 164ms/step - loss: 0.0057
Epoch 4/50
75/75 ━━━━━━━━━━━━━━━━━━━━ 12s 165ms/step - loss: 0.0040
Epoch 5/50
75/75 ━━━━━━━━━━━━━━━━━━━━ 13s 171ms/step - loss: 0.0038
Epoch 6/50
75/75 ━━━━━━━━━━━━━━━━━━━━ 12s 165ms/step - loss: 0.0033
Epoch 7/50
75/75 ━━━━━━━━━━━━━━━━━━━━ 12s 166ms/step - loss: 0.0032
Epoch 8/50
75/75 ━━━━━━━━━━━━━━━━━━━━ 12s 166ms/step - loss: 0.0037
Epoch 9/50
75/75 ━━━━━━━━━━━━━━━━━━━━ 12s 163ms/step - loss: 0.0032
Epoch 10/50
75/75 ━━━━━━━━━━━━━━━━━━━━ 13s 167ms/step - loss: 0.0025
Epoch 11/50
75/75 ━━━━━━━━━━━━━━━━━━━━ 12s 160ms/step - loss: 0.0027
Epoch 12/50
75/75 ━━━━━━━━━━━━━━━━━━━━ 12s 164ms/step - loss: 0.0023
Epoch 13/50
75/75 ━━━━━━━━━━━━━━━━━━━━ 12s 163ms/step - loss: 0.0021
Epoch 14/50
75/75 ━━━━━━━━━━━━━━━━━━━━ 12s 162ms/step - loss: 0.0023
Epoch 15/50
75/75 ━━━━━━━━━━━━━━━━━━━━ 12s 161ms/step - loss: 0.0027
Epoch 16/50
75/75 ━━━━━━━━━━━━━━━━━━━━ 12s 158ms/step - loss: 0.0019
Epoch 17/50
75/75 ━━━━━━━━━━━━━━━━━━━━ 12s 160ms/step - loss: 0.0021
Epoch 18/50
75/75 ━━━━━━━━━━━━━━━━━━━━ 12s 163ms/step - loss: 0.0023
Epoch 19/50
75/75 ━━━━━━━━━━━━━━━━━━━━ 12s 161ms/step - loss: 0.0020
Epoch 20/50
75/75 ━━━━━━━━━━━━━━━━━━━━ 12s 158ms/step - loss: 0.0017
Epoch 21/50
75/75 ━━━━━━━━━━━━━━━━━━━━ 12s 161ms/step - loss: 0.0018
Epoch 22/50
75/75 ━━━━━━━━━━━━━━━━━━━━ 12s 161ms/step - loss: 0.0017
Epoch 23/50
75/75 ━━━━━━━━━━━━━━━━━━━━ 12s 160ms/step - loss: 0.0016
Epoch 24/50
75/75 ━━━━━━━━━━━━━━━━━━━━ 12s 156ms/step - loss: 0.0016
Epoch 25/50
75/75 ━━━━━━━━━━━━━━━━━━━━ 12s 156ms/step - loss: 0.0015
Epoch 26/50
75/75 ━━━━━━━━━━━━━━━━━━━━ 12s 156ms/step - loss: 0.0015
Epoch 27/50
75/75 ━━━━━━━━━━━━━━━━━━━━ 12s 156ms/step - loss: 0.0014
Epoch 28/50
75/75 ━━━━━━━━━━━━━━━━━━━━ 12s 158ms/step - loss: 0.0017
Epoch 29/50
75/75 ━━━━━━━━━━━━━━━━━━━━ 12s 158ms/step - loss: 0.0014
Epoch 30/50
75/75 ━━━━━━━━━━━━━━━━━━━━ 12s 156ms/step - loss: 0.0013
Epoch 31/50
75/75 ━━━━━━━━━━━━━━━━━━━━ 12s 155ms/step - loss: 0.0013
Epoch 32/50
75/75 ━━━━━━━━━━━━━━━━━━━━ 12s 156ms/step - loss: 0.0013
Epoch 33/50
75/75 ━━━━━━━━━━━━━━━━━━━━ 11s 152ms/step - loss: 0.0013
Epoch 34/50
75/75 ━━━━━━━━━━━━━━━━━━━━ 11s 152ms/step - loss: 0.0013
Epoch 35/50
75/75 ━━━━━━━━━━━━━━━━━━━━ 12s 156ms/step - loss: 0.0011
Epoch 36/50
75/75 ━━━━━━━━━━━━━━━━━━━━ 12s 158ms/step - loss: 0.0014
Epoch 37/50
75/75 ━━━━━━━━━━━━━━━━━━━━ 12s 158ms/step - loss: 0.0012
Epoch 38/50
75/75 ━━━━━━━━━━━━━━━━━━━━ 12s 157ms/step - loss: 0.0011
Epoch 39/50
75/75 ━━━━━━━━━━━━━━━━━━━━ 12s 157ms/step - loss: 0.0011
Epoch 40/50
75/75 ━━━━━━━━━━━━━━━━━━━━ 12s 154ms/step - loss: 0.0013
Epoch 41/50
75/75 ━━━━━━━━━━━━━━━━━━━━ 11s 153ms/step - loss: 0.0012
Epoch 42/50
75/75 ━━━━━━━━━━━━━━━━━━━━ 11s 150ms/step - loss: 0.0011
Epoch 43/50
75/75 ━━━━━━━━━━━━━━━━━━━━ 12s 154ms/step - loss: 0.0010
Epoch 44/50
75/75 ━━━━━━━━━━━━━━━━━━━━ 12s 153ms/step - loss: 0.0011
Epoch 45/50
75/75 ━━━━━━━━━━━━━━━━━━━━ 12s 157ms/step - loss: 0.0011
Epoch 46/50
75/75 ━━━━━━━━━━━━━━━━━━━━ 12s 156ms/step - loss: 0.0010
Epoch 47/50
75/75 ━━━━━━━━━━━━━━━━━━━━ 11s 153ms/step - loss: 8.6836e-04
Epoch 48/50
75/75 ━━━━━━━━━━━━━━━━━━━━ 12s 156ms/step - loss: 9.2004e-04
Epoch 49/50
75/75 ━━━━━━━━━━━━━━━━━━━━ 11s 151ms/step - loss: 8.9111e-04
Epoch 50/50
75/75 ━━━━━━━━━━━━━━━━━━━━ 11s 151ms/step - loss: 9.3594e-04

[17]: <keras.src.callbacks.history.History at 0x26891d6c470>
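At roughly 12 seconds per epoch, the 50 epochs take about ten minutes, so it can be worth saving the fitted model rather than retraining; a minimal sketch (the filename is illustrative):

regressor.save('sun_pharma_rnn.keras')  # hypothetical path; reload with keras.models.load_model(...)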

1.3 Part 3 - Making the predictions and visualising the results


1.3.1 Getting the real stock price of 2024

[18]: dataset_test = pd.read_csv('SUN_test_data.csv')
dataset_test['Price'] = dataset_test['Price'].str.replace(',', '').astype(float)
real_stock_price = dataset_test.iloc[:, 1:2].values

1.3.2 Getting the predicted stock price of 2024

[19]: dataset_total = pd.concat((dataset_train['Price'], dataset_test['Price']), axis = 0)
inputs = dataset_total[len(dataset_total) - len(dataset_test) - 80:].values
inputs = inputs.reshape(-1,1)
inputs = sc.transform(inputs)
X_test = []
for i in range(80, 142):
    X_test.append(inputs[i-80:i, 0])
X_test = np.array(X_test)
X_test = np.reshape(X_test, (X_test.shape[0], X_test.shape[1], 1))
predicted_stock_price = regressor.predict(X_test)
predicted_stock_price = sc.inverse_transform(predicted_stock_price)

2/2 ━━━━━━━━━━━━━━━━━━━━ 3s 2s/step
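Two details in this cell matter. First, the last 80 training prices are prepended to the test series so that the first test day still has a full 80-step window behind it. Second, the inputs are scaled with sc.transform rather than fit_transform: reusing the scaler fitted on the training data keeps test statistics out of the scaling. A quick shape check (a minimal sketch):

# range(80, 142) yields one 80-step window per test day
print(X_test.shape)                 # expected: (62, 80, 1)
print(predicted_stock_price.shape)  # expected: (62, 1), back on the original price scale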

1.3.3 Visualising the results

[20]: plt.plot(real_stock_price, color = 'red', label = 'Real Stock Price')
plt.plot(predicted_stock_price, color = 'blue', label = 'Predicted Stock Price')
plt.title('Stock Price Prediction')
plt.xlabel('Time')
plt.ylabel('Stock Price')
plt.legend()
plt.show()

[21]: import math
from sklearn.metrics import mean_squared_error
rmse = math.sqrt(mean_squared_error(real_stock_price, predicted_stock_price))

[22]: rmse

[22]: 35.315576787821435
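An RMSE of about 35 is easier to judge relative to the level of the series; one simple way to contextualize it (a sketch, assuming real_stock_price from above):

relative_rmse = rmse / real_stock_price.mean() * 100
print(f'RMSE is {relative_rmse:.2f}% of the mean test price')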
