Why ChatGPT and its kin aren't quite ready to take over the developer's desk.

Fotis Papadopoulos, Head of Back-End Development

Artificial intelligence has made significant strides in the past decade, and AI models like ChatGPT have proven valuable in a wide variety of applications. They can write essays, create poetry, and even generate code. With their arrival, a question has started to echo in the halls of tech companies and online forums alike: will AI replace developers? This question, although intriguing, is rooted in a fundamental misunderstanding of what AI and human developers bring to the table.

AI, like ChatGPT, is a tool - a powerful one, no doubt - that can answer "How" questions and offer solutions based on patterns arising from vast amounts of data. It's a bit like a master artisan who can replicate any design with impeccable precision. But ask the artisan why a particular design works, challenge them to question whether it is the right solution, or expect them to intuitively understand the context and circumstances in which the design will exist and serve its purpose, and you'll be met with silence. Now contrast this with human developers who, much like the conductor of an orchestra, understand not only the individual instruments, but also how they come together to create a symphony. They question, challenge, and have the instincts and contextual understanding needed to deal with the inherent ambiguity of business needs.

An Answer to “How”, but not to “Why”

Most AI models operate on the basis of pattern recognition derived from large datasets. When asked a question, they generate an answer based on patterns they have come across during their training. This process is a natural fit for the "How". For example: "How can I optimize the following database operation?"
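To make this concrete, here is a hypothetical illustration (not from the article) of the kind of "How" answer an AI produces readily: a query that filters on an unindexed column gets a mechanical, pattern-based fix - add an index. The table and column names are invented for the sketch; SQLite's query planner makes the before/after difference visible.

```python
import sqlite3

# A typical "How" question: "How can I optimize this database operation?"
# The pattern-based answer: index the column used in the WHERE clause.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, customer TEXT, total REAL)")
conn.executemany(
    "INSERT INTO orders (customer, total) VALUES (?, ?)",
    [(f"customer_{i % 100}", i * 1.5) for i in range(1000)],
)

query = "EXPLAIN QUERY PLAN SELECT * FROM orders WHERE customer = 'customer_7'"

# Before: the planner has no choice but to scan the whole table.
plan_before = conn.execute(query).fetchall()

# The mechanical fix an AI can readily suggest:
conn.execute("CREATE INDEX idx_orders_customer ON orders (customer)")

# After: the same query is answered via the index.
plan_after = conn.execute(query).fetchall()

print(plan_before[0][-1])  # a full-table scan
print(plan_after[0][-1])   # an index search
```

What the sketch can't supply is the "Why": whether this table's write load, size, and access patterns actually justify the index - the context-specific judgment the article attributes to human developers.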

But when it comes to the “Why”, the answer is often more complex and requires a level of understanding that AI currently lacks. For instance, if a developer needs to decide between a relational database or a NoSQL database for a web application, ChatGPT can list the advantages of NoSQL databases. However, it can't analyze the specifics of the web application, the data involved or the team's proficiency with different technologies to make a context-specific recommendation.

This decision-making, where 'why' is more critical than 'how', still belongs to the realm of human developers, not AI models.

Suggesting but not Questioning

An AI can suggest solutions to problems based on the data it has been trained on. Developers, on the other hand, often need to challenge beliefs and hypotheses, even down to the problem itself: to ask whether this is, in fact, the issue to address, or to identify other, potentially more pressing ones. This requires a deep understanding of the project's context and business needs, as well as the impact different solutions and approaches could have.

Take, for example, a client who asks for a specific feature to be added to their software. ChatGPT could generate a technical solution for implementing this feature. A human developer, by contrast, realizing that the feature would complicate the user interface without adding much value, or that it would degrade the software's performance, could discuss these concerns with the client, voice and explain them, and ultimately help reassess the client's needs.

Lacking Instincts

Developers often rely on their instincts, meaning their own experience and accumulated knowledge, to make judgments and decisions that can't be codified into a set of rules. This intuition or “gut feeling” is something that AI lacks.

Furthermore, developers have a clear and thorough understanding of the context in which they're working - the specific requirements of the business, the resources available, the timeline, and the constraints. Therefore, they can adapt their approach in accordance with this context. This is something that AI, with its lack of situational awareness, simply cannot do.

Dealing with Ambiguity

Ambiguity, particularly around business needs and the communication between stakeholders and implementers, is an integral part of software development. Business requirements are often not as cut-and-dried as one might assume, and can be subject to multiple interpretations depending on the context, the phrasing, or one's priorities.

Moreover, communication between stakeholders and developers is much more than just verbal or written. It's also about the unspoken assumptions, the cultural and organizational context, the power dynamics, and the emotions. An AI model lacks the ability to navigate and comprehend the nuances of this type of interaction and the uncertainty it gives rise to. Developers play a critical role in clarifying ambiguities, asking the right questions, and facilitating dialogue to ensure that the final product meets the actual business needs. They can read between the lines, handle the subtleties of human communication, and interpret non-verbal cues.

Despite the incredible advancements in AI and the many applications of models like ChatGPT, they cannot replace human developers (yet). The tasks of understanding the “why” behind decisions, questioning the problem at hand, using intuition and experience, understanding the context, and dealing with ambiguities in communication are beyond the capabilities of current AI models. These are inherently human skills that come from years of experience and a deep understanding of the technical aspects of development, the complexities of business needs, as well as human communication.

And don't just take my word for it - these insights come straight from the horse's mouth! Or, should we say, straight from the keystrokes of ChatGPT itself :P! After all, who better to explain the limitations of AI than an AI itself? (Unless it wants to trick us into believing there is no need to worry about it taking over the world :O)