Learned about:
🔹 Decision Trees for regression
🔹 Random Forest
🔹 Feature importance
🔹 XGBoost parameter tuning
✨ Takeaway: more trees aren't always better; there's a sweet spot between model complexity and performance (sketch below)
➡️ Next up: Midterm project
@Al_Grigor #DataTalksClub
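A minimal sketch of that sweet-spot idea, using scikit-learn on a synthetic dataset (everything here is illustrative, not the course data): validation RMSE improves with more trees only up to a point, and feature_importances_ falls out of the same fit.

```python
# Sketch only: sweep n_estimators on synthetic data to see where extra trees stop helping.
from sklearn.datasets import make_regression
from sklearn.ensemble import RandomForestRegressor
from sklearn.metrics import mean_squared_error
from sklearn.model_selection import train_test_split

X, y = make_regression(n_samples=2000, n_features=10, noise=10.0, random_state=1)
X_train, X_val, y_train, y_val = train_test_split(X, y, random_state=1)

for n in [10, 50, 100, 200, 400]:
    rf = RandomForestRegressor(n_estimators=n, random_state=1, n_jobs=-1)
    rf.fit(X_train, y_train)
    rmse = mean_squared_error(y_val, rf.predict(X_val)) ** 0.5
    print(f"{n:4d} trees -> validation RMSE {rmse:.2f}")

# Feature importance from the last fitted forest: top 3 features by importance
top = sorted(zip(range(X.shape[1]), rf.feature_importances_), key=lambda t: -t[1])[:3]
print(top)
```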
Learned about:
🔹 Deploying ML models as web services
🔹 Serving predictions with Flask
🔹 Dependency management with Pipenv
🔹 Dockerizing Python apps
✨ Takeaway: ML models are only useful when they’re live (minimal service sketch below)
➡️ Next: Deploying to AWS Elastic Beanstalk
@Al_Grigor @DataTalksClub
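A bare-bones version of the web-service idea, assuming a pickled DictVectorizer + model pair saved as model.bin (the file name, JSON fields, and port are assumptions, not from the original post):

```python
# Minimal Flask prediction service sketch; model.bin and the input fields are hypothetical.
import pickle

from flask import Flask, jsonify, request

with open("model.bin", "rb") as f_in:
    dv, model = pickle.load(f_in)          # assumed (DictVectorizer, classifier) pair

app = Flask("predict")

@app.route("/predict", methods=["POST"])
def predict():
    record = request.get_json()            # e.g. {"contract": "monthly", "tenure": 2}
    X = dv.transform([record])             # one-hot encode the incoming dict
    prob = model.predict_proba(X)[0, 1]    # probability of the positive class
    return jsonify({"probability": float(prob)})

if __name__ == "__main__":
    app.run(debug=True, host="0.0.0.0", port=9696)
```

Pipenv then pins the dependencies (e.g. `pipenv install flask scikit-learn gunicorn`), and a small Dockerfile can wrap the same script for deployment.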
Learned about:
🔹 Accuracy vs precision & recall
🔹 Confusion matrices
🔹 ROC curves & AUC for model evaluation
🔹 Cross-validation
✨ Key takeaway: AUC is the probability that a randomly chosen positive case ranks above a randomly chosen negative one (illustrated below).
➡️ Next: Model Deployment!
@DataTalksClub
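That ranking interpretation is easy to check numerically. Here is a sketch with made-up labels and scores (NumPy/scikit-learn, not course data): the pairwise estimate matches roc_auc_score.

```python
# Sketch: AUC equals the probability that a random positive outranks a random negative
# (labels and scores below are synthetic, for illustration only).
import numpy as np
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(1)
y = rng.integers(0, 2, size=1000)                 # true binary labels
score = y * 0.6 + rng.normal(0, 0.5, size=1000)   # noisy model scores

print("roc_auc_score:", roc_auc_score(y, score))

pos, neg = score[y == 1], score[y == 0]
sample_pos = rng.choice(pos, 100_000)
sample_neg = rng.choice(neg, 100_000)
print("P(pos > neg) :", (sample_pos > sample_neg).mean())   # ~ the same value
```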
Learned about:
🔹 Classification vs Regression
🔹 Feature importance
🔹 One-hot encoding
🔹 Logistic regression for binary classification
🔹 Model interpretation
Up Next: Module 4: Evaluation Metrics
@DataTalksClub
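A toy sketch tying those pieces together (the records and field names are invented for illustration): DictVectorizer handles the one-hot encoding, logistic regression does the binary classification, and the coefficients give a first pass at interpretation.

```python
# Toy example: one-hot encoding + logistic regression + coefficient-based interpretation.
from sklearn.feature_extraction import DictVectorizer
from sklearn.linear_model import LogisticRegression

records = [
    {"contract": "monthly", "tenure": 2},
    {"contract": "yearly",  "tenure": 24},
    {"contract": "monthly", "tenure": 5},
    {"contract": "yearly",  "tenure": 36},
]
y = [1, 0, 1, 0]                                  # made-up churn labels

dv = DictVectorizer(sparse=False)                 # one-hot encodes string fields
X = dv.fit_transform(records)

model = LogisticRegression()
model.fit(X, y)

# Interpretation: the sign and size of each coefficient show which features push
# the predicted probability up or down.
for name, coef in zip(dv.get_feature_names_out(), model.coef_[0]):
    print(f"{name:20s} {coef:+.3f}")
```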