I work in tech. I have many thoughts about AI. I wanted a place to put them.
-
"It Should Be Quick Since You're Using AI"
TL;DR: Generative AI amplifies expertise; it doesn't substitute for it.
AI does not make you an expert
As AI becomes more widely available, I want to address the subtle but crucial misconception that AI allows anyone to do anything. Reality (as usual) is more nuanced.
AI can significantly lower friction, speed up workflows, and expand what people are capable of producing, but it doesn't replace domain knowledge or professional judgment. Expecting it to do so creates unrealistic expectations for individuals, teams, and organizations. AI is most effective when it acts as an assistant.
Having access to AI does not suddenly confer professional skill. Examples:
- You're not suddenly an expert data scientist simply because you've asked AI to generate statistical summaries.
- You don't have the depth of knowledge required to write usable technical manuals just because an LLM can draft documentation.
- You're not a professional video editor because you can generate a 2-minute video from a prompt.
In each of these fields, the visible output is only a small portion of the work. Behind it are decisions informed by experience:
- Knowing the audience
- Understanding constraints, risks, and context
- Applying standards, whether field-wide or institutional
AI can help you execute parts of the work, but it does not understand why those decisions matter unless guided by someone who does.
When you use AI to work outside your existing skill set, producing something faster does not remove the need for learning; it only shifts where the effort occurs. You still need to:
- Research unfamiliar concepts and terminology
- Learn what "good" looks like in that discipline
- Evaluate whether the output is accurate, appropriate, or misleading
Output is not the same as understanding.
Humans require expertise for AI to be used well
The second misconception I want to address is that using AI effectively requires very little skill. The opposite is true in practice. Meaningful use of AI requires:
- Knowing what to ask for, what not to ask for, and how to frame it
- Evaluating whether an output is incomplete or wrong
- Making informed decisions on what to keep, revise, or discard
- Adapting outputs to specific systems or audiences
AI responses can appear confident while being incomplete or wrong. Without subject-matter knowledge, this is risky in technical, regulated, or user-facing contexts where inaccuracies can cause real harm. AI does not determine what should be done, why it matters, or whether it's appropriate for a given situation.
Unless it's part of the training data, AI does not understand your organization's policies, user needs, technical stack, or operational constraints. As the "human in the loop," it's your responsibility to understand and evaluate responses. Using AI does not fill in those blanks for you unless you know how to prompt for those results.
When used appropriately, AI can be a powerful multiplier of existing expertise and free up time for higher-level thinking and decision-making. Examples:
- Accelerate drafting, analysis, and synthesis
- Reduce time spent on repetitive tasks
- Support exploration and comparison of alternatives
- Act as a sounding board during ideation
- Help translate knowledge into different formats or levels of complexity
Shift your expectations
Organizations sometimes frame AI as a way to replace skills and people rather than support them. This leads to expectations that one person can now do the work of many roles simply because "AI is available." This framing is flawed and detrimental to organizational workflows and products.
The people who will benefit most from AI are those who already understand their field well enough to guide, evaluate, and refine the output. Organizations are asking: "What can AI do instead of people?" when they should be asking: "How can AI help skilled people do their work more effectively?"
It's a capability extender, not a career switch. It helps writers write more in less time. It helps analysts process large amounts of data faster. It can help video editors get started with structure and branding, but it doesn't speed up the editing process itself just because AI generated the footage. It does not replace the learning, experience, and responsibility that come with professional practice.
When we treat AI as an assistant to support existing skills, we produce better outcomes and reduce the risk of overconfidence driven by automation.
AI can help you do more of what you already do well; it cannot make you someone you are not.
Originally composed January 22, 2026, but I didn't have this blog then.