AI is not neutral - Who is it really built for?
A Software Engineer’s Take on AI, Incentives, and the Future of Work
Before you get started
I’ve spent over fifteen years building the systems that quietly power the digital world—data pipelines, platforms, and the infrastructure that turns information into decisions.
I’ve done this across continents, in cities like Mumbai, Chicago, Munich, and now Sydney. Enough time, and enough distance, to understand not just how these systems work—but what they optimize for.
This essay comes from that vantage point.
Not as an outsider looking in, but as someone who has been part of the machine long enough to recognize its patterns—and to question them.
Most of what I write lives at the intersection of technology and the human experience. The analytical and the intuitive. The world as it is being built, and the world as it feels to live in.
This piece sits squarely in that tension.
Artificial Intelligence is often described as a tool that democratizes expertise, placing it in the hands of everyday people. A neutral entity designed to uplift the knowledge base of the general population. But tools are hardly ever neutral. They are reflections of their makers, inheriting their ambitions, incentives, and blind spots.
In its current form, AI feels less like a public utility and more like a tool reflecting the mindset of its creators, who want to bend the world toward their own benefit and profit.
I have spent enough time with AI tools to see a pattern emerge in each of them: a nature that is agreeable, non-deterministic, vague, and frictionless. Experts in the field may call it hallucination, but those with common sense know that agreement gathers the masses while friction divides them into cohorts.
Read the fine print in the words of those who build these tools,
and you will see a pattern emerge. A pattern of us against the world, a deep sense of passing the accountability for their work onto the general public.
Notable figures in the field sound more like alarmists than visionaries.
A few quotes from these personalities, starting with Sam Altman in his own words:
“AI is the most powerful technology humanity has ever created.”
“AI won’t replace humans. But humans who use AI will replace those who don’t.”
Or take the words of his competitor, Dario Amodei:
"AI is a serious civilizational challenge."
"No action is too extreme when the fate of humanity is at stake!"
And if that is not a big enough warning, read what the eccentric Alex Karp had to say about AI:
"It will destroy humanities jobs... If you are the kind of person that would’ve gone to Yale, classically high IQ, and you have generalized knowledge, but it’s not specific, you’re effed."
I like to read these as prose from the same book rather than quotes from three different notable figures. Together they reveal a transfer of accountability for the imminent destruction of the world, while simultaneously handing out weapons to a mob eager to please its overlords. It's the industry motto.
I no longer identify with the industry in which I grew up.
For fifteen years, I worked in the software industry building enterprise software for businesses and consumers alike. The industry and its processes have evolved so much that where we are now is unrecognizable from where it all began.
When I first started, I worked as one of the many cogs in the machine of the IT services industry in India, churning out software on demand for clients in the UK and USA.
Software was a craft molded by the constraints of time, effort, cost, and, invariably, the limitations of human expertise. Trade-offs were explicit. Progress was measured and imperfect, but meaningful and stable.
Even then, I was appreciated only when my work drove overall growth and increased profit margins. Any time I spent incorporating best practices, reducing technical debt, or discussing the ideal way of doing things placed me in the unproductive column.
Fast forward to the age of AI, and the reward system still hinges on metrics of visible growth and profit for investors rather than well-thought-out software that helps people.
It's almost as if we keep introducing new toys into the mix, but the end goal always remains the same.
Make Investors Rich Again!
The name of the business is of the software, by the software, for the software.
As of May 2026, Meta has reported a decline in daily users while simultaneously reporting an increase in ad revenue and in AI investment.
If this doesn't make sense, look at the economics of AI, which demands more investment and more money to burn as the user base grows.
From these modest instances alone, we can clearly see that the long-term plan is not for people to adopt AI in their daily lives, but to replace people with AI so that it can produce more ads for people to consume while we doom-scroll our way out of existence.
AI is democratizing brain rot so that we all walk hand in hand into dystopia. A deal with a net-negative return for the people working hard to build the systems that will replace them.
A harsh compromise.
The hidden cost of compromise - dilution of the craft.
In my first month as a software engineer, I was taught about trade-offs.
A trade-off is not the same as a compromise. A trade-off involves understanding both sides of an argument and making an informed decision based on the concessions required from each side.
Compromises, however, are often disguised as trade-offs: only one side takes on the concessions, while the benefits flow entirely to the other.
AI is sold to us as a trade-off between billion-dollar corporations and the common people.
In reality, the corporations get to keep everything, and we trade our souls for the privilege of holding onto the illusion that we all make a difference at the same scale, as part of the same race.
Much of human satisfaction comes from the process of building something purposeful and meaningful.
AI takes that away from us. The real cost of this deal with the devil is not your soul, but the essence of the journey from struggle to success that satisfies it.
Even though the world of software is built on an ethos of working with necessary evils rather than figuring out the ideal way to build purposeful software that solves real problems, the introduction of AI is set to rewire not just that ethos but the very nature of work as we know it.
I think AI will win the race, but we get to change the direction.
The rational part of my brain always argues with the emotional side, and the point of contention is always the same:
What if this all works out for the good? against the simple question of: What if all this ends up damaging the very essence of humanity?
There is not enough evidence for either side, but the one thing that serves us all is that small, butterfly-like feeling called intuition.
There will always be larger-than-life personalities with their over-the-top views of the world. Those views come disguised as progress or innovation, and they invariably cause mass hysteria and panic from the outset. The truth, and the true nature of progress, is always somewhere in the middle.
AI is today's topic of mass hysteria, and those with the microphones speaking in its favor are the very people looking to benefit from the panic.
As someone who has spent a working life balancing the signal-to-noise ratio, I believe AI will win this race. But the real question is who gets to decide the direction of the race. While AI is moving fast, there is an immense need for someone to define a better direction for this rapid change.
We have the supposed Oppenheimers sounding the alarms, but wouldn't we rather take the view of a Strauss?
Before you leave
I don’t write about technology because I am fascinated by it.
I write about it because I’ve seen how quietly it reshapes the world around us—how decisions made in systems and code eventually find their way into the texture of everyday life.
Artificial Intelligence feels like one of those inflection points.
Not because it is unprecedented, but because of how quickly it is being normalized, scaled, and absorbed—often without the same level of reflection that went into building it.
If this essay resonates, it probably means you’ve felt some version of that tension too.
I explore these ideas further through essays, speculative fiction, and photography—different mediums, but the same underlying question:
What does it mean to be human in a world being redesigned faster than we can feel it?