
Prerequisites

Participants should have attended the Data Science and Machine Learning Using
Python training

Course Curriculum
Day-1: Data Handling and Basic Models
1. Deep Learning Course Introduction
a. Introduction
b. Applications
c. Major Topics
d. Course structure

2. Data Handling in Python
a. Data Types and Operations
b. Python packages
c. Important packages used in Machine Learning
d. Data importing
e. Working with datasets
f. Descriptive statistics
g. Central Tendency
h. Variance
i. Datasets validation tips and tricks
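
A minimal sketch of the data handling and descriptive statistics topics above, assuming pandas is available; the file sales.csv and its revenue column are hypothetical placeholders:

import pandas as pd

# sales.csv and the revenue column are placeholders; any numeric CSV works the same way
df = pd.read_csv("sales.csv")

print(df.head())               # first rows of the dataset
print(df.describe())           # descriptive statistics for numeric columns
print(df["revenue"].mean())    # central tendency: mean
print(df["revenue"].median())  # central tendency: median
print(df["revenue"].var())     # variance
print(df.isna().sum())         # quick validation: missing values per column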

3. Linear Models
a. Regression Model
b. Linear Model
c. The concept of error function
d. Building a linear model
e. Validation of the model
f. Issues in multiple linear regression
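
A minimal sketch of building and validating a linear model, assuming scikit-learn (from the prerequisite training) and synthetic data:

import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_squared_error, r2_score

# Synthetic data: y depends linearly on two features plus noise
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 2))
y = 3.0 * X[:, 0] - 2.0 * X[:, 1] + rng.normal(scale=0.5, size=200)

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)

model = LinearRegression().fit(X_train, y_train)
pred = model.predict(X_test)

# Validation: error function (MSE) and R-squared on held-out data
print("coefficients:", model.coef_)
print("MSE:", mean_squared_error(y_test, pred))
print("R2:", r2_score(y_test, pred))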

Day-2: Classification and Model Validation


4. Classification Models
a. Regression vs Classification
b. The need for logistic regression
c. Building a logistic regression model
d. Validating the model
e. Issues in logistic regression
f. Feature selection
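
A minimal sketch of building a logistic regression model, assuming scikit-learn and its bundled breast cancer dataset:

from sklearn.datasets import load_breast_cancer
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)

# Scaling the features helps the solver converge
clf = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))
clf.fit(X_train, y_train)
print("test accuracy:", clf.score(X_test, y_test))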

5. Model Validation Metrics
a. Accuracy, Sensitivity, Specificity
b. Overfitting
c. Underfitting
d. ROC, AUC
e. Bias-variance trade-off
f. Holdout cross-validation
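
A minimal sketch of the validation metrics above, continuing from the fitted classifier clf and the held-out split in the previous sketch:

from sklearn.metrics import accuracy_score, confusion_matrix, roc_auc_score

pred = clf.predict(X_test)
proba = clf.predict_proba(X_test)[:, 1]   # scores for the positive class

tn, fp, fn, tp = confusion_matrix(y_test, pred).ravel()
print("accuracy:   ", accuracy_score(y_test, pred))
print("sensitivity:", tp / (tp + fn))     # true positive rate
print("specificity:", tn / (tn + fp))     # true negative rate
print("ROC AUC:    ", roc_auc_score(y_test, proba))

A large gap between training and holdout scores signals overfitting; poor scores on both signal underfitting.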

Day-3: TensorFlow, Keras and ANN


6. Deep Learning Tools – TensorFlow and Keras (a wrapper on TensorFlow)
a. The Deep Learning tool TensorFlow
b. Comparison with other Python libraries
c. Introduction to TensorFlow
d. Architecture
e. Programming paradigm of TensorFlow
f. TensorFlow made easy with Keras
g. Setting up Keras
h. Keras on TensorFlow
i. Keras Basic Commands
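
A minimal sketch of the TensorFlow and Keras basics above, assuming TensorFlow 2.x with its bundled Keras API:

import tensorflow as tf
from tensorflow import keras

# Low-level TensorFlow: tensors and operations
a = tf.constant([[1.0, 2.0], [3.0, 4.0]])
b = tf.ones((2, 2))
print(tf.matmul(a, b))

# Keras on TensorFlow: a model is declared layer by layer
model = keras.Sequential([
    keras.Input(shape=(4,)),
    keras.layers.Dense(8, activation="relu"),
    keras.layers.Dense(1, activation="sigmoid"),
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
model.summary()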

7. ANN
a. From logistic regression to neural networks
b. Concept of hidden layers
c. Feed forward networks
d. Error function
e. Back propagation algorithm
f. Building ANN model on Python
g. Choosing the optimal model
h. Building ANN model in TensorFlow
i. Building ANN model on Keras
j. Major issues while building ANN models
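
A minimal sketch of building a feed-forward ANN on Keras, assuming the scikit-learn breast cancer data as a stand-in binary classification problem:

from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler
from tensorflow import keras

X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)
scaler = StandardScaler().fit(X_train)
X_train, X_test = scaler.transform(X_train), scaler.transform(X_test)

# Feed-forward network: input -> hidden layers -> sigmoid output
model = keras.Sequential([
    keras.Input(shape=(X_train.shape[1],)),
    keras.layers.Dense(16, activation="relu"),
    keras.layers.Dense(8, activation="relu"),
    keras.layers.Dense(1, activation="sigmoid"),
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])

# Backpropagation runs inside fit(); validation data tracks generalisation
model.fit(X_train, y_train, epochs=20, batch_size=32,
          validation_data=(X_test, y_test), verbose=0)
print(model.evaluate(X_test, y_test, verbose=0))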

Day-4: ANN Hyperparameters and CNN


8. ANN Hyperparameters on Keras
a. Activation function
b. Number of hidden layers
c. Regularization
d. L1 & L2 Regularization
e. Decay parameter fine-tuning
f. Dropout
g. Learning rate fine-tuning
h. Momentum
i. Optimization functions
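
A minimal sketch showing where the hyperparameters above appear in Keras code; all values are illustrative placeholders, not recommendations:

from tensorflow import keras
from tensorflow.keras import layers, regularizers

model = keras.Sequential([
    keras.Input(shape=(30,)),
    # Activation function, number of hidden layers/units, L2 regularization
    layers.Dense(32, activation="relu",
                 kernel_regularizer=regularizers.l2(1e-4)),
    layers.Dropout(0.3),                                    # dropout
    layers.Dense(16, activation="relu",
                 kernel_regularizer=regularizers.l1(1e-5)),  # L1 regularization
    layers.Dense(1, activation="sigmoid"),
])

# Optimization function with learning rate and momentum;
# a schedule plays the role of learning-rate decay
lr_schedule = keras.optimizers.schedules.ExponentialDecay(
    initial_learning_rate=0.01, decay_steps=1000, decay_rate=0.9)
optimizer = keras.optimizers.SGD(learning_rate=lr_schedule, momentum=0.9)

model.compile(optimizer=optimizer, loss="binary_crossentropy", metrics=["accuracy"])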

9. CNN
a. CNN Introduction
b. Issues with Standard ANN
c. Kernel filter
d. Convolution layer
e. Pooling layer
f. Fully connected dense layer
g. Weights and number of parameters
h. Back propagation
i. CNN Model building
j. CNN Hyperparameters
k. CNN tips and tricks
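
A minimal sketch of a CNN in Keras, assuming 28x28 grayscale inputs such as MNIST:

from tensorflow import keras
from tensorflow.keras import layers

model = keras.Sequential([
    keras.Input(shape=(28, 28, 1)),
    layers.Conv2D(32, kernel_size=3, activation="relu"),  # convolution layer with 3x3 kernel filters
    layers.MaxPooling2D(pool_size=2),                     # pooling layer
    layers.Conv2D(64, kernel_size=3, activation="relu"),
    layers.MaxPooling2D(pool_size=2),
    layers.Flatten(),
    layers.Dense(128, activation="relu"),                 # fully connected dense layer
    layers.Dense(10, activation="softmax"),               # 10 output classes
])
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy", metrics=["accuracy"])
model.summary()  # shows the weights and number of parameters per layer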

Day-5: RNN and NLP


10. RNN
a. Introduction to sequential models
b. RNN Introduction
c. Word Predictor model
d. RNN theory
e. The number of parameters
f. Back Propagation through time
g. RNN case study
h. The problem of vanishing gradients
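
A minimal sketch of a next-word-predictor style RNN in Keras; the vocabulary size and sequence length are placeholders:

from tensorflow import keras
from tensorflow.keras import layers

vocab_size = 5000   # placeholder vocabulary size
seq_len = 20        # placeholder input sequence length

model = keras.Sequential([
    keras.Input(shape=(seq_len,)),
    layers.Embedding(vocab_size, 64),                # integer word ids -> dense vectors
    layers.SimpleRNN(128),                           # recurrent layer; weights shared across time steps
    layers.Dense(vocab_size, activation="softmax"),  # predict the next word
])
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")
model.summary()  # the parameter count does not depend on the sequence length

Swapping layers.SimpleRNN for layers.LSTM or layers.GRU is the usual remedy for the vanishing-gradient problem listed above.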

11. Text Mining and NLP
a. What is text mining
b. The NLTK package
c. Preparing text for analysis
d. Step by step guide to prepare text data
e. Text summarisation
f. Sentiment analysis
g. Naïve Bayes technique for sentiment analysis
h. Movie review sentiment analysis
i. Word2Vec model
j. Word Embeddings
k. Word2Vec models using Gensim
l. Google Word2Vec Model Transfer learning
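
A minimal sketch of text preparation with NLTK and word embeddings with Gensim, assuming both packages are installed; the two example sentences are placeholders:

import nltk
from nltk.corpus import stopwords
from nltk.tokenize import word_tokenize
from gensim.models import Word2Vec

nltk.download("punkt")        # tokenizer models (resource names vary slightly by NLTK version)
nltk.download("stopwords")    # stop word lists

docs = ["The movie was absolutely wonderful",
        "The plot was dull and the acting was worse"]

# Preparing text: lower-case, tokenize, drop stop words and non-alphabetic tokens
stop = set(stopwords.words("english"))
tokens = [[w for w in word_tokenize(d.lower()) if w.isalpha() and w not in stop]
          for d in docs]

# Word2Vec: learn word embeddings from the tokenized corpus
w2v = Word2Vec(sentences=tokens, vector_size=50, window=3, min_count=1, epochs=50)
print(w2v.wv["movie"])               # embedding vector for one word
print(w2v.wv.most_similar("movie"))  # nearest words in embedding space

Transfer learning with Google's pre-trained vectors uses the same interface, loaded via gensim.models.KeyedVectors.load_word2vec_format.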

12. Course Conclusion
a. Course conclusion
b. Reference books, videos and blogs
c. Next steps
d. Final Q&A
e. Final assessment (optional)

