A reflection on the journey from Data Science to AI engineering

TL;DR: A lot has changed, but no change has been too troublesome. In fact, I sometimes miss the quiet nuances of data science.

These days, I spend most of my time building AI systems. Not just models, but agents, automations, and workflows that talk to vector databases, orchestrate multiple LLMs, route decisions, and invoke tools, real tools, to take action. I am tuning base models, stacking retrieval pipelines, experimenting with synthetic data, and wrestling with the latency, memory, and quality triangle that comes with building anything remotely intelligent.

But it was not always this way.

For most of my career, I lived in the world of data science. It was a quieter kind of work. A Jupyter-notebook-shaped world where progress looked like cleaner metrics, better coverage, tighter confidence intervals, and slowly building trust with stakeholders. It was not about agents reasoning or hallucinating. It was about making sense of the mess, methodically and often manually. And oddly enough, I miss that sometimes.

In data science, nuance was the whole game. You did not just care about whether the model worked. You cared about whether it was fair, stable, interpretable, and whether the metric matched what the business actually needed. You obsessed over leakage, feature drift, and baseline comparisons. You wrote SQL slowly, like poetry. You explained why precision mattered more than recall, or why a dashboard should lag by a day to avoid false alarms. There was something beautiful about that kind of care.

Then LLMs exploded. And like many others, I moved.

What started with basic embeddings and zero-shot prompts quickly turned into full-fledged systems: retrieval-augmented generation, function calling, evaluation frameworks, autonomous agents, memory stores, and even fine-tuning workflows. What fascinates me now is not just what these models can say, but what they can do. They are not just answering questions anymore, they are rewriting the workflows themselves.

And yet, despite the excitement, I have realized something important. The further you go into AI engineering, the easier it becomes to lose the rigor that data science forced you to have. It is easier to chase demos over decisions. Easier to fall in love with the illusion of intelligence instead of questioning what is underneath. The best of both worlds, AI that works and makes sense, still requires the discipline I learned back in the data science trenches.

What has been most fulfilling, though, is seeing how the two worlds meet. How all the thinking around metrics, inference patterns, experimentation, and stakeholder communication still shows up, just with different tools and different failure modes. I find myself explaining vector search precision the way I used to explain ROC curves. I still write post-mortems. I still obsess over edge cases. And I still believe that trust is the real deliverable.

So what changed?
A lot.
But nothing that made me feel out of place.

If anything, this new chapter has deepened my appreciation for the last one. Data science gave me the instincts. AI engineering is giving me the playground. And between the two, I feel like I am finally building systems that both work and matter.

Onward.

Or gradient descent?