When My First ML Model Memorized Instead of Learning (And How I Fixed It)

Source: DEV Community
When I started working on my first machine learning projects, I thought I was doing everything right. My model showed almost perfect accuracy during training, and I felt confident about the results. But as soon as I tested it on new data, everything broke. That's when I learned one of the most important lessons in machine learning: high accuracy doesn't always mean your model is actually learning.

🧠 The Problem: Overfitting

The issue I faced was overfitting. Because my dataset was relatively small, the model memorized the training data instead of learning general patterns. It captured noise, small variations, and specific details that didn't carry over to new data. So while performance looked great during training, the model failed completely in real-world scenarios.

🛠️ How I Fixed It (From My Projects)

While working on projects like E-commerce Churn Prediction and Diabetes Prediction, I tackled this problem step by step.

1. Handling Imbalanced Data with SMOTE

Instead of duplicating existing minority-class samples, SMOTE generates new synthetic ones by interpolating between a real sample and its nearest minority-class neighbors, giving the model more balanced data to learn from.
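In practice I used SMOTE from the imbalanced-learn library (imblearn.over_sampling.SMOTE, via fit_resample). As a rough illustration of the idea behind it, here is a toy NumPy sketch (function and variable names are mine, not imblearn's) that creates one synthetic minority-class sample by interpolating between a real sample and one of its nearest minority-class neighbors:

```python
import numpy as np

def smote_sample(X_minority, k=2, rng=None):
    """Toy SMOTE sketch: create one synthetic minority-class sample."""
    if rng is None:
        rng = np.random.default_rng(42)
    # Pick a random minority-class sample.
    i = rng.integers(len(X_minority))
    x = X_minority[i]
    # Find its k nearest minority-class neighbors (index 0 is itself).
    dists = np.linalg.norm(X_minority - x, axis=1)
    neighbors = np.argsort(dists)[1:k + 1]
    # Interpolate between x and one randomly chosen neighbor.
    neighbor = X_minority[rng.choice(neighbors)]
    gap = rng.random()  # random point along the segment
    return x + gap * (neighbor - x)

# Toy minority class: four points in 2D.
X_min = np.array([[1.0, 1.0], [1.2, 0.9], [0.9, 1.1], [2.0, 2.0]])
synthetic = smote_sample(X_min)
print(synthetic)  # a new point on a segment between two existing minority points
```

This is only a sketch of the interpolation step; the real library also handles choosing how many samples to generate per class and fitting neighbors efficiently, which is why I relied on imblearn rather than rolling my own.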