Enhancing ML knowledge

Some recommended books include "Pattern Recognition and Machine Learning" by Christopher Bishop, "Deep Learning" by Ian Goodfellow et al., and "Hands-On Machine Learning with Scikit-Learn, Keras, and TensorFlow" by Aurélien Géron.

Random Forest Regression

Importing the libraries:
import numpy as np
import pandas as pd
import matplotlib.pyplot as plt

Importing the dataset:
dataset = pd.read_csv('Position_Salaries.csv')
X = dataset.iloc[:, 1:-1].values
y = dataset.iloc[:, -1].values

Training the Random Forest Regression model on the whole dataset:
from sklearn.ensemble import RandomForestRegressor
regressor = RandomForestRegressor(n_estimators = 10, random_state = 0)
regressor.fit(X, y)

Predicting a new result:
regressor.predict([[6.5]])

Visualising the output (a sketch of this step follows below).
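The original note ends at the visualisation step without code; the sketch below fills it in under the assumption that it should mirror the higher-resolution grid plots used in the other sections, reusing regressor, X and y from above:

X_grid = np.arange(X.min(), X.max(), 0.01)   # fine grid over position levels (X.min()/X.max() give scalar bounds)
X_grid = X_grid.reshape((len(X_grid), 1))
plt.scatter(X, y, color = 'red')             # actual data points
plt.plot(X_grid, regressor.predict(X_grid), color = 'blue')   # step-like random forest predictions
plt.title('Truth or Bluff (Random Forest Regression)')
plt.xlabel('Position level')
plt.ylabel('Salary')
plt.show()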

Decision Tree Regression Model

To implement the Decision Tree Regression model, take the following steps (note: no feature scaling is required for decision tree regression):
Importing the libraries
Importing the dataset
Training the Decision Tree Regression model on the whole dataset
Predicting a new result
Visualising the Decision Tree Regression results (higher resolution)

Now we implement it in Google Colab on the uploaded csv file 'Position_Salaries.csv'.

Importing the libraries:
import numpy as np
import pandas as pd
import matplotlib.pyplot as plt

Importing the dataset:
dataset = pd.read_csv('Position_Salaries.csv')
X = dataset.iloc[:, 1:-1].values
y = dataset.iloc[:, -1].values

Training the Decision Tree Regression model on the whole dataset:
from sklearn.tree import DecisionTreeRegressor
clf = DecisionTreeRegressor(random_state = 0)
clf.fit(X, y)

Predicting a new result:
predictions = clf.predict([[6.5]])

Visualising the Decision Tree Regression results (higher resolution):
X_grid = np.arange(min(X), max(X), 0.1)
X_grid = X_grid.reshape((len(X_grid), 1))
plt.scatter(X, y, color = 'red')
plt.plot(X_grid, clf.predict(X_grid), color = 'blue')
plt.title('Truth or Bluff (Decision Tree Regression)')
plt.xlabel('Position level')
plt.ylabel('Salary')
plt.show()
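Since the note points out that decision tree regression needs no feature scaling, one way to see why is to inspect the raw thresholds the fitted tree splits on. A minimal sketch (not part of the original notes), reusing clf from above; the feature name string is just an illustrative label:

from sklearn.tree import export_text
# Prints the learned splits; the thresholds are raw position levels,
# which is why scaling the feature would not change the tree's structure.
print(export_text(clf, feature_names=['Position level']))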

ML Resources and Notes: SVR (Support Vector Regression)

Reference: http://core.ac.uk/download/pdf/81523322.pdf

Steps to implement Support Vector Regression (SVR) in Python:
Importing the libraries
Importing the dataset
Feature Scaling
Training the SVR model on the whole dataset
Predicting a new result
Visualising the SVR results
Visualising the SVR results (for higher resolution and smoother curve)

Importing the libraries:
import numpy as np
import matplotlib.pyplot as plt
import pandas as pd

Importing the dataset:
dataset = pd.read_csv('path-Position_Salaries.csv')
X = dataset.iloc[:, 1:-1].values
y = dataset.iloc[:, -1].values
y = y.reshape(len(y), 1)   # reshape y into a 2D column array, since StandardScaler expects 2D input

Feature Scaling:
from sklearn.preprocessing import StandardScaler
sc_X = StandardScaler()
X = sc_X.fit_transform(X)
sc_y = StandardScaler()
y = sc_y.fit_transform(y)

Training the SVR model on the whole dataset:
from sklearn.svm import SVR
regressor = SVR(kernel = 'rbf')
regressor.fit(X, y)

Predicting a single result (salary for level 6.5):
sc_y.inverse_transform(regressor.predict(sc_X.transform([[6.5]])).reshape(-1, 1))

Visualising the SVR results:
plt.scatter(sc_X.inverse_transform(X), sc_y.inverse_transform(y), color = 'red')
plt.plot(sc_X.inverse_transform(X), sc_y.inverse_transform(regressor.predict(X).reshape(-1, 1)), color = 'blue')
plt.title('Salary vs Experience')
plt.xlabel('Experience')
plt.ylabel('Salary')
plt.show()

Visualising the SVR results (for higher resolution and smoother curve):
X_grid = np.arange(min(sc_X.inverse_transform(X)), max(sc_X.inverse_transform(X)), 0.1)
X_grid = X_grid.reshape((len(X_grid), 1))
plt.scatter(sc_X.inverse_transform(X), sc_y.inverse_transform(y), color = 'red')
plt.plot(X_grid, sc_y.inverse_transform(regressor.predict(sc_X.transform(X_grid)).reshape(-1, 1)), color = 'blue')
plt.title('Truth or Bluff (SVR)')
plt.xlabel('Position level')
plt.ylabel('Salary')
plt.show()
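The manual fit_transform / inverse_transform bookkeeping above is easy to get wrong. As an alternative sketch (not part of the original notes), scikit-learn's make_pipeline and TransformedTargetRegressor can handle the scaling of both X and y automatically; X_raw and y_raw are assumed names for the unscaled arrays loaded from the CSV, before the Feature Scaling step:

from sklearn.compose import TransformedTargetRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVR

# Scale X inside the pipeline and y inside TransformedTargetRegressor,
# so predictions come back in the original salary units without manual inverse_transform calls.
svr_model = TransformedTargetRegressor(
    regressor = make_pipeline(StandardScaler(), SVR(kernel = 'rbf')),
    transformer = StandardScaler()
)
svr_model.fit(X_raw, y_raw)          # X_raw, y_raw: unscaled values from the dataset (assumed names)
print(svr_model.predict([[6.5]]))    # prediction is already in salary units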

ML Resources and Notes: Polynomial Regression

Steps for Polynomial Regression model training:
Importing the libraries
Importing the dataset
Training the Linear Regression model on the whole dataset
Training the Polynomial Regression model on the whole dataset
Visualising the Linear Regression results
Visualising the Polynomial Regression results
Visualising the Polynomial Regression results (for higher resolution and smoother curve)
Predicting a new result with Linear Regression
Predicting a new result with Polynomial Regression

Importing the libraries:
import numpy as np
import pandas as pd
import matplotlib.pyplot as plt

Importing the dataset:
dataset = pd.read_csv('path-Position_Salaries.csv')
X = dataset.iloc[:, 1:-1].values
y = dataset.iloc[:, -1].values
print(X)

Training the Linear Regression model on the whole dataset:
from sklearn.linear_model import LinearRegression
lin_reg = LinearRegression()
lin_reg.fit(X, y)

Training the Polynomial Regression model on the whole dataset:
from sklearn.preprocessing import PolynomialFeatures
poly_reg = PolynomialFeatures(degree = 4)
X_poly = poly_reg.fit_transform(X)
lin_reg_2 = LinearRegression()
lin_reg_2.fit(X_poly, y)

Visualising the Linear Regression results:
plt.scatter(X, y, color = 'red')
plt.plot(X, lin_reg.predict(X), color = 'blue')
plt.title('Truth or Bluff (Linear Regression)')
plt.xlabel('Position level')
plt.ylabel('Salary')
plt.show()

Visualising the Polynomial Regression results:
plt.scatter(X, y, color = 'red')
plt.plot(X, lin_reg_2.predict(poly_reg.fit_transform(X)), color = 'blue')
plt.title('Truth or Bluff (Polynomial Regression)')
plt.xlabel('Position level')
plt.ylabel('Salary')
plt.show()

Visualising the Polynomial Regression results (for higher resolution and smoother curve):
X_grid = np.arange(min(X), max(X), 0.1)
X_grid = X_grid.reshape((len(X_grid), 1))
plt.scatter(X, y, color = 'red')
plt.plot(X_grid, lin_reg_2.predict(poly_reg.fit_transform(X_grid)), color = 'blue')
plt.title('Truth or Bluff (Polynomial Regression)')
plt.xlabel('Position level')
plt.ylabel('Salary')
plt.show()

Predicting a new result with Linear Regression:
lin_reg.predict([[6.5]])

Predicting a new result with Polynomial Regression:
lin_reg_2.predict(poly_reg.fit_transform([[6.5]]))
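To make the 'Truth or Bluff' comparison concrete beyond the plots, a small optional sketch (not in the original notes) that scores both fits on the training data with r2_score, reusing lin_reg, lin_reg_2, poly_reg, X and y from above:

from sklearn.metrics import r2_score
# Training-set R^2 for each model; on this tiny dataset this is only a rough
# indication of fit quality, not a proper validation.
print(r2_score(y, lin_reg.predict(X)))                              # linear fit
print(r2_score(y, lin_reg_2.predict(poly_reg.fit_transform(X))))    # degree-4 polynomial fit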
