🧠 Machine Learning
1. What is Machine Learning?
Machine Learning is a subset of AI that enables systems to learn from data and improve their performance over time without being explicitly programmed.
2. Differentiate between supervised and unsupervised learning.
Supervised learning uses labeled data to train models, while unsupervised learning finds patterns in unlabeled data.
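For illustration, a minimal sketch assuming scikit-learn and toy data: the classifier is trained on feature-label pairs, while the clustering model sees only the features.

```python
from sklearn.datasets import make_blobs
from sklearn.linear_model import LogisticRegression
from sklearn.cluster import KMeans

# Toy data: X are features, y are labels
X, y = make_blobs(n_samples=100, centers=2, random_state=0)

# Supervised: the model is trained on (X, y) pairs
clf = LogisticRegression().fit(X, y)
print(clf.predict(X[:5]))

# Unsupervised: the model only sees X and discovers structure on its own
km = KMeans(n_clusters=2, n_init=10, random_state=0).fit(X)
print(km.labels_[:5])
```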
3. What is overfitting in Machine Learning?
Overfitting occurs when a model learns the training data too well, including its noise, leading to poor generalization to new data.
4. How can overfitting be prevented?
Techniques include cross-validation, pruning, regularization (L1, L2), and using simpler models.
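A brief sketch of two of these techniques, assuming scikit-learn and synthetic data: Ridge adds an L2 penalty, and cross-validation checks whether the penalty actually helps on held-out data.

```python
import numpy as np
from sklearn.linear_model import LinearRegression, Ridge
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
X = rng.normal(size=(50, 20))            # few samples, many features: easy to overfit
y = X[:, 0] + 0.1 * rng.normal(size=50)  # only the first feature matters

# Unregularized vs L2-regularized fit, scored by 5-fold cross-validation (R^2)
plain = cross_val_score(LinearRegression(), X, y, cv=5).mean()
ridge = cross_val_score(Ridge(alpha=1.0), X, y, cv=5).mean()
print(f"plain R^2: {plain:.3f}, ridge R^2: {ridge:.3f}")
```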
5. What is the bias-variance tradeoff?
It's the tradeoff between error from overly simple assumptions (bias) and error from sensitivity to fluctuations in the training data (variance). High bias can cause underfitting, and high variance can cause overfitting.
6. Explain the difference between classification and regression.
Classification predicts discrete labels, while regression predicts continuous values.
7. What is a confusion matrix?
A table used to evaluate the performance of a classification model by comparing actual and predicted classifications.
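A minimal example, assuming scikit-learn and a made-up set of binary labels:

```python
from sklearn.metrics import confusion_matrix

y_true = [1, 0, 1, 1, 0, 1, 0, 0]
y_pred = [1, 0, 0, 1, 0, 1, 1, 0]

# Rows are actual classes, columns are predicted classes:
# [[TN FP]
#  [FN TP]]
print(confusion_matrix(y_true, y_pred))
```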
8. What are precision and recall?
Precision is the ratio of true positives to all predicted positives, while recall is the ratio of true positives to all actual positives.
9. What is the F1 score?
The harmonic mean of precision and recall, providing a balance between the two.
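A short sketch computing precision, recall, and F1 on the same toy labels, assuming scikit-learn:

```python
from sklearn.metrics import precision_score, recall_score, f1_score

y_true = [1, 0, 1, 1, 0, 1, 0, 0]
y_pred = [1, 0, 0, 1, 0, 1, 1, 0]

# precision = TP / (TP + FP), recall = TP / (TP + FN)
print("precision:", precision_score(y_true, y_pred))  # 3 / (3 + 1) = 0.75
print("recall:   ", recall_score(y_true, y_pred))     # 3 / (3 + 1) = 0.75
# F1 is the harmonic mean of precision and recall
print("f1:       ", f1_score(y_true, y_pred))         # 0.75
```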
10. What is cross-validation?
A technique to assess how a model generalizes to an independent dataset by partitioning the data into subsets.
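A minimal sketch of 5-fold cross-validation, assuming scikit-learn and its built-in iris dataset:

```python
from sklearn.datasets import load_iris
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier

X, y = load_iris(return_X_y=True)

# 5-fold CV: the data is split into 5 parts; each part serves once as the
# held-out test set while the model is trained on the remaining four
scores = cross_val_score(DecisionTreeClassifier(random_state=0), X, y, cv=5)
print(scores, scores.mean())
```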
11. What is the difference between bagging and boosting?
Bagging reduces variance by training multiple models in parallel, while boosting reduces bias by training models sequentially.
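For illustration, a hedged sketch comparing the two with scikit-learn's BaggingClassifier and AdaBoostClassifier (one of several boosting variants) on synthetic data:

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import BaggingClassifier, AdaBoostClassifier
from sklearn.model_selection import cross_val_score

X, y = make_classification(n_samples=300, random_state=0)

# Bagging: independent models on bootstrap samples, predictions averaged
bag = BaggingClassifier(n_estimators=50, random_state=0)
# Boosting: models trained sequentially, each focusing on the previous errors
boost = AdaBoostClassifier(n_estimators=50, random_state=0)

print("bagging :", cross_val_score(bag, X, y, cv=5).mean())
print("boosting:", cross_val_score(boost, X, y, cv=5).mean())
```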
12. What is a decision tree?
A flowchart-like structure used for classification and regression, where each internal node tests a feature, each branch corresponds to an outcome of that test, and each leaf holds a prediction.
13. What is a random forest?
An ensemble of decision trees that improves predictive accuracy and controls overfitting.
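A small sketch, assuming scikit-learn and the iris data: each tree in the forest sees a bootstrap sample and random feature subsets, and the ensemble averages their votes.

```python
from sklearn.datasets import load_iris
from sklearn.ensemble import RandomForestClassifier

X, y = load_iris(return_X_y=True)

# Averaging many decorrelated trees lowers variance compared with a single tree
forest = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, y)
print(forest.predict(X[:3]))
print(forest.feature_importances_)   # importance averaged across the trees
```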
14. What is gradient descent?
An optimization algorithm used to minimize the cost function by iteratively moving in the direction of steepest descent.
15. What is the learning rate in gradient descent?
A hyperparameter that determines the step size at each iteration while moving toward a minimum of the cost function.
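A minimal NumPy sketch of gradient descent on a one-variable cost function, with the learning rate acting as the step-size hyperparameter from question 15:

```python
import numpy as np

# Minimize f(w) = (w - 3)^2 by gradient descent; f'(w) = 2 * (w - 3)
learning_rate = 0.1   # step size: too large diverges, too small converges slowly
w = 0.0
for step in range(50):
    grad = 2 * (w - 3)
    w -= learning_rate * grad   # move against the gradient
print(w)   # converges toward the minimum at w = 3
```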
16. What is feature selection?
The process of selecting a subset of relevant features for model construction to improve performance and reduce overfitting.
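One possible illustration, assuming scikit-learn: univariate feature selection with SelectKBest keeps only the columns most strongly related to the target.

```python
from sklearn.datasets import load_iris
from sklearn.feature_selection import SelectKBest, f_classif

X, y = load_iris(return_X_y=True)

# Keep the 2 features with the strongest univariate relationship to y
selector = SelectKBest(score_func=f_classif, k=2)
X_selected = selector.fit_transform(X, y)
print(X.shape, "->", X_selected.shape)   # (150, 4) -> (150, 2)
print(selector.get_support())            # mask of the kept features
```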
17. What is dimensionality reduction?
Techniques like PCA reduce the number of input variables in a dataset, simplifying models and reducing computation.
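A minimal PCA sketch, assuming scikit-learn and the iris data:

```python
from sklearn.datasets import load_iris
from sklearn.decomposition import PCA

X, _ = load_iris(return_X_y=True)

# Project the 4-dimensional data onto its 2 main directions of variance
pca = PCA(n_components=2)
X_2d = pca.fit_transform(X)
print(X.shape, "->", X_2d.shape)         # (150, 4) -> (150, 2)
print(pca.explained_variance_ratio_)     # variance retained per component
```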
18. What is the curse of dimensionality?
As the number of features increases, the data becomes sparse, making it difficult for models to find patterns.
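A rough NumPy demonstration with toy uniform data: as the dimension grows, the nearest and farthest neighbours of a point become almost equally far apart, one symptom of this sparsity.

```python
import numpy as np

rng = np.random.default_rng(0)
for d in (2, 100, 10_000):
    X = rng.random((200, d))                      # 200 random points in d dimensions
    dists = np.linalg.norm(X[0] - X[1:], axis=1)  # distances from the first point
    # Ratio of nearest to farthest distance creeps toward 1 in high dimensions
    print(d, round(dists.min() / dists.max(), 3))
```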
19. What is a support vector machine (SVM)?
A supervised learning model that finds the optimal hyperplane to separate classes in feature space.
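A minimal sketch, assuming scikit-learn and synthetic 2-D data: a linear-kernel SVM exposes the learned hyperplane and the support vectors that define the margin.

```python
from sklearn.datasets import make_classification
from sklearn.svm import SVC

X, y = make_classification(n_samples=200, n_features=2, n_redundant=0,
                           random_state=0)

# A linear-kernel SVM finds the maximum-margin separating hyperplane
svm = SVC(kernel="linear").fit(X, y)
print(svm.coef_, svm.intercept_)    # hyperplane: coef . x + intercept = 0
print(svm.support_vectors_.shape)   # the points that define the margin
```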
20. What is the difference between parametric and non-parametric models?
Parametric models assume a fixed number of parameters, while non-parametric models grow in complexity with the data.
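For illustration, a hedged sketch assuming scikit-learn: a linear model's parameter count is fixed by the feature count, while a k-nearest-neighbours model's behaviour is determined by the stored training points themselves.

```python
from sklearn.datasets import load_diabetes
from sklearn.linear_model import LinearRegression
from sklearn.neighbors import KNeighborsRegressor

X, y = load_diabetes(return_X_y=True)

# Parametric: a fixed set of coefficients, no matter how much data we add
lin = LinearRegression().fit(X, y)
print(lin.coef_.shape)   # (10,) parameters, fixed by the number of features

# Non-parametric: predictions come directly from the stored training points
knn = KNeighborsRegressor(n_neighbors=5).fit(X, y)
print(knn.predict(X[:3]))
```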