The Shift in Expectations in the AI Era
Adapting to AI is not just about mastering tools; it is about raising your own output standards. This article examines the “expectations” that are silently shifting and how they affect different roles in the tech industry.
Hello everyone,
In previous discussions, we looked at how AI affects many fields, and the technology sector in particular: the sector that adopts AI most proactively is also the first to feel its full impact.
In this post, we offer another perspective: which specific factors are shifting, and how those shifts play out for different roles in the tech world. When AI becomes a tool that amplifies an engineer’s capabilities, trade-offs are inevitable.
Specifically, a recent AppKnox report, the AI and Developer Burnout Report 2025, notes that AI is reshaping both the nature of work and the definition of burnout, and that a new form of stress is emerging: cognitive and emotional burnout.
Between acquiring newer, better, faster tools and facing these new forms of stress, have engineers paused to ask what is silently shifting underneath, like an undercurrent, instead of simply racing to master the next tool?
From my perspective, the main driver is expectations: the expectations between stakeholders and engineers, between bosses and teams, and among teammates. They are the root cause reshaping the software industry and forcing tech professionals to adapt and grow.

The Author’s Perspective
Across the articles and statistics, one pattern stands out: adapting in the coming years means adapting not only to new technology and tools but also to new expectations. Adaptation means understanding that the “rules of the game” have changed. Your boss, your customers, and even your team are quietly revising their standards for the “definition of done”.
Combining personal observation with my own experience, here is how those rules have been and are shifting for three roles:
1. Interns/Juniors
Previously, Juniors were allowed to make syntax errors, to research slowly, and to take time getting used to tools. This was the explicit cost organizations accepted as the price of your learning.
But now AI does those things better, faster, and cheaper than any Junior. It can implement a feature, or even scaffold an entire framework, in 30 minutes. But do those lines of code hide logic holes? Are they easy to maintain? Or do they become tech debt from the moment they are committed? (The sketch at the end of this section shows the kind of hole a review is meant to catch.)
Seniors’ expectations for you have changed:
- Previously: “Have you implemented this feature yet? Did you run into any difficulties?”
- Now: “Do you understand why the AI chose this library instead of that one?”, “Are you confident it didn’t hallucinate this function?”, or “Is there a better way?”
The key point is that Juniors are often still not expected to be faster. Previously, a 4-hour task might mean 3 hours of coding and 1 hour of testing. Now it is still 4 hours: 30 minutes of coding, 30 minutes of testing. So how do you spend the remaining 3 hours? Sitting idle, or thinking critically?
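To make this concrete, here is a hypothetical sketch of the kind of code an AI assistant might hand a Junior. Everything in it (the function, names, and numbers) is invented for illustration: it runs, the happy path works, and the review questions above are exactly what should catch its holes.

```python
# Hypothetical example of plausible AI-generated code; invented for
# illustration, not taken from any real assistant's output.

def apply_discount(price: float, percent: float) -> float:
    """Return the price after applying a percentage discount."""
    return price - price * percent / 100


if __name__ == "__main__":
    # The happy path looks "done"...
    print(apply_discount(100.0, 20.0))   # 80.0

    # ...but review should ask the questions the code never raises:
    # - percent > 100 silently produces a negative price:
    print(apply_discount(100.0, 150.0))  # -50.0
    # - floats accumulate rounding error; should money use Decimal instead?
    # - does this duplicate an existing pricing module (instant tech debt)?
```

Nothing here fails to compile or crash; the problems only surface when someone asks “is this actually correct?”, which is precisely the 3 hours of critical thinking the new expectation buys.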
2. Senior Engineers
When the coding-speed gap between Senior and Junior is narrowing, why should companies still pay Senior salaries?
Seniors are no longer paid primarily to demonstrate algorithmic skill or to implement complex systems. They are expected to say “NO”.
- Say NO when asked for a series of changes in a short time, because they need to know which consequences they can and cannot control.
- Say NO when a change needs historical factors or context confirmed first. AI is powerful at gathering explicit information, but historical context (implicit knowledge) is what Seniors grasp best.
- Or, as in discussions about Architects, say NO when the intuition of an industry veteran “speaks up”.
Note: Saying no skillfully is not simple.
3. Managers
Perhaps this is the hardest part to assess. What is observed, heard, or read is often only the surface; the submerged part of the iceberg is hard to evaluate, so this section may sound somewhat general.
The role is under pressure from all sides: dizzying technological change, the team’s delivery speed, and external expectations around quality and deadlines. The shift in expectations pulls attention away from progress alone and toward risk, in both technology and people.
The baseline expectation for the team’s product quality has been raised. Code is committed and deployed continuously, and AI can even make architectural decisions. So how do we establish “guardrails” to ensure the team doesn’t end up as workers cleaning up garbage left behind by AI?
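What a guardrail looks like varies by team, and it does not have to be heavyweight. Below is a minimal sketch under stated assumptions: a repository laid out as src/ and tests/, a base branch named origin/main, and a CI step that runs this script (all assumptions for illustration, not a standard). It refuses to pass when source files change with no accompanying test changes.

```python
# Minimal guardrail sketch: fail CI when src/ changes arrive with no test
# changes at all. The layout (src/, tests/) and the base branch name are
# assumptions for illustration.

import subprocess
import sys


def changed_files(base: str = "origin/main") -> list[str]:
    """Return the files changed relative to the base branch."""
    result = subprocess.run(
        ["git", "diff", "--name-only", base],
        capture_output=True, text=True, check=True,
    )
    return result.stdout.split()


def main() -> int:
    changed = changed_files()
    touched_src = [f for f in changed if f.startswith("src/")]
    touched_tests = [f for f in changed if f.startswith("tests/")]
    if touched_src and not touched_tests:
        print("Guardrail: source changed with no test changes:")
        for f in touched_src:
            print("  ", f)
        return 1  # non-zero exit fails the CI job
    return 0


if __name__ == "__main__":
    sys.exit(main())
```

A check like this does not judge code quality by itself; it simply turns “no tests” from a silent default into a conversation the team has to have.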
And when expectations for pace also rise thanks to AI support, is “keeping the schedule” still a Manager’s main focus? Or has the expectation shifted toward being a pacemaker: making sure the human links in the chain operate durably, accurately, and without exhaustion?
Conclusion
It seems that as machines steadily improve at what machines do, we need to get better at what humans must do. Condensed from the examples above: skepticism (Junior), context understanding (Senior), and risk management (Manager).
A long post to close with a direct look at expectations that have changed: something we may have unintentionally never discussed frankly with each other in daily work.
“What is essential is invisible to the eye.” (Antoine de Saint-Exupéry, The Little Prince)