Low-code and no-code tools vs AI. Synergy or deepening crisis in the IT industry?

The way applications and AI models are created has been changing noticeably in recent years. The trend is towards reducing the role of programming in their design and lowering the barrier to entry into these fields. No-code and low-code tools are partly responsible for this shift.

Summary

  • Artificial intelligence (AI) has become more accessible due to no-code and low-code (NCLC) tools, reducing the need for proficiency in programming languages. These tools allow for application creation without specialized knowledge, often using a drag-and-drop interface.
  • Generative AI tools like ChatGPT, DALL-E, Midjourney, and GitHub Copilot are examples of NCLC tools. Major tech companies are incorporating large language models into their NCLC platforms, such as Microsoft's Azure OpenAI platform and the GPT-4 model.
  • Despite making AI more accessible, digital inequalities can still lead to discrimination in opportunities. The state plays a crucial role in reducing these disparities and promoting technology democratization.
  • Implementing NCLC tools and generative AI in organizations requires a consistent strategy to ensure employee understanding and prevent conflicts. However, these tools cannot solve the issue of poor quality initial data.
  • Users of NCLC tools or generative AI may not fully understand how the applications or models they are building work, leading to a "programmer's cargo cult".
  • Using ready-made solutions can result in a loss of flexibility and personalization. Generative AI tools are essentially black boxes, with users having little control over their functioning.
  • The oligopoly in the market of large language models and cloud services means only major companies can afford to develop and share their own models on their own platforms. These services and tools are not required to be interoperable, creating a barrier for those who wish to switch to another tool.

From its very beginnings, artificial intelligence has been a field of computer science and, as such, has had a high barrier to entry. Its tools were conceived primarily as expert solutions for specialists proficient in programming languages - mainly Python, but also R, Julia and Scala.

Recently, however, the approach to creating AI applications and models has been changing. The role of programming in these tasks is shrinking, and with it the technical requirements and the barrier to entry. No-code and low-code (NCLC) tools are driving this change, and it is these tools - and the consequences they have for the IT and AI market - that we examine in this text.

How do no-code and low-code tools change the IT industry?

NCLC solutions represent a paradigm shift in how formerly purely technical areas are treated. They allow applications to be created without specialized knowledge: to do a developer's job, knowledge of programming languages is needed only to a minimal extent (low-code) or not at all (no-code).

These tools most often take the form of interactive environments for building applications from ready-made modules arranged by drag and drop. This simplicity is their great advantage. With such solutions, it is relatively easy and quick to create, say, a scheduling application without spending hours writing conditional logic in a programming language or constructing elaborate formulas in Excel.
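To make concrete what those drag-and-drop modules stand in for, here is a minimal, purely illustrative Python sketch of the kind of conditional scheduling logic an NCLC tool would hide behind its visual blocks; the employee names and rules are invented for the example.

    # A minimal, hypothetical sketch of the conditional scheduling logic
    # that an NCLC tool would hide behind drag-and-drop modules.
    from datetime import date, timedelta

    employees = {
        "Anna": {"days_off": {"Saturday", "Sunday"}},
        "Marek": {"days_off": {"Wednesday"}},
    }

    def build_schedule(start: date, days: int) -> dict:
        """Assign the first available employee to each day."""
        schedule = {}
        for offset in range(days):
            day = start + timedelta(days=offset)
            weekday = day.strftime("%A")
            # Conditional rules like this are what the visual modules replace.
            available = [name for name, info in employees.items()
                         if weekday not in info["days_off"]]
            schedule[day.isoformat()] = available[0] if available else "unassigned"
        return schedule

    print(build_schedule(date(2024, 1, 1), 7))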

Generative AI tools also fit the NCLC paradigm - ChatGPT in its several versions, generators of audiovisual material such as DALL-E and Midjourney, or GitHub Copilot. Although it is possible to work with these tools programmatically, for example via an API, the graphical user interface - most often a chat window, perhaps enriched with a few extra features - offers all the capabilities most users need.

Interestingly, technology corporations are increasingly making their own NCLC platforms available and adding large language models to them. Microsoft, for example, has already done this with the Azure OpenAI platform and the GPT-4 model. This makes it possible to build your own AI applications or machine learning models on top of large language models, for example to process large amounts of text from various sources. It should be noted that some large language models are no longer strictly language-based but multimodal, as they can also work with numerical data or process images.
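As a rough illustration of what building on such a platform can look like in code, the sketch below assumes access to an Azure OpenAI resource with a GPT-4 deployment (the endpoint, key and deployment name are placeholders read from environment variables) and uses the openai Python client to summarize text gathered from several sources.

    # A minimal sketch, assuming an Azure OpenAI resource with a GPT-4
    # deployment; the endpoint, key and deployment name are placeholders.
    import os
    from openai import AzureOpenAI

    client = AzureOpenAI(
        azure_endpoint=os.environ["AZURE_OPENAI_ENDPOINT"],
        api_key=os.environ["AZURE_OPENAI_API_KEY"],
        api_version="2024-02-01",
    )

    documents = [
        "Quarterly report: revenue grew 12%, mainly in the cloud segment.",
        "Customer survey: most complaints concern onboarding time.",
    ]

    # Ask the model to condense text pulled from several sources into one summary.
    response = client.chat.completions.create(
        model="gpt-4",  # name of the deployment, not the raw model family
        messages=[
            {"role": "system", "content": "Summarize the documents in two sentences."},
            {"role": "user", "content": "\n\n".join(documents)},
        ],
    )

    print(response.choices[0].message.content)

These few lines are essentially what such platforms wrap in a graphical interface: the user's role is reduced to supplying data and a prompt.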

It is therefore worth asking whether the growing prevalence of low-code and no-code tools, along with generative AI tools that require no technical skills, strips the programming professions of some of their aura of mystery and elitism.

No-code and low-code tools – democratization or deepening inequality?

Looking at how digital devices have developed and how we use them, it is clear that they are becoming ever more convenient and intuitive to operate. In the past, using an MS-DOS computer with its characteristic commands such as cd or mkdir was almost a programming activity in itself, whereas almost everyone can handle today's operating systems. The same was true of building your own website in the Web 1.0 era - you had to know HTML, CSS and something about hosting. With today's ready-made templates, having a standard website is no longer a challenge.

We can thus see a clear democratizing trend in computing broadly understood. Hardware and software are becoming easier to use and are therefore reaching an ever wider group of non-technical users. Seen from this perspective, it is not surprising that with NCLC tools building applications or drawing on the benefits of artificial intelligence is also ceasing to be a purely technical or programming domain.

The problem here, however, is digital inequality. It can concern things as basic as internet access or ownership of electronic devices, but also how and for what purposes people consume content online. Sociologists have long known that digital inequalities are usually secondary to socio-economic ones. Laying fiber-optic cable in previously neglected areas and providing computers there will therefore not automatically make local residents take up, say, online courses on building NCLC applications.

This may in turn lead to situations where people who do not use NCLC solutions or generative AI find themselves disadvantaged at various points along their educational or professional path. As broadly understood digital competences, including non-technical ones, become key in the labor market, the state has an important role to play in reducing these differences and narrowing economic, educational and digital inequalities. Only that can lead to a genuinely wider democratization of many technological solutions.

No-code and low-code tools – implementation challenges and problems

Researchers also point to a number of other challenges in implementing NCLC tools, especially for developing AI solutions within organizations. Above all, a consistent strategy is needed, both so that employees see the sense and benefit of working with such solutions and to prevent conflicts between traditional programmers and NCLC developers.

In an era of growing AI popularity and ready availability of NCLC tools, it is also easy to forget that these tools are no remedy for poor-quality input data. This is of course a problem inherent in all AI applications, since current models are at best only as good as the data behind them. NCLC tools and generative AI not only cannot solve this problem; they can even exacerbate it.

Moreover, users of NCLC tools or generative AI, who often do not know what lies behind the graphical interface and the modules they use, may not fully understand how the application or model they are building actually works. They may intuitively copy solutions observed in others without precisely understanding why the application behaves as it does. In programming, this phenomenon of mindless work with code is called a "programmer's cargo cult".

The cargo cult has been known to anthropology for decades. The phenomenon spread during World War II in the Pacific, when the opposing American and Japanese forces built makeshift airfields on small islands. After the war the planes stopped landing, and local residents, hoping for their return and for the resumption of supply deliveries, began building makeshift landing strips themselves. Programmers and NCLC developers are susceptible to something similar when, without understanding why a piece of code actually works, they mindlessly copy it into the program they are building in the hope of similar results. This only shows that, whatever the technical conveniences, critical thinking will always be valuable.

Let us also not forget that by relying on ready-made solutions we inevitably lose some flexibility. NCLC programs do not offer the same freedom or level of customization as writing your own application from scratch, for example in Java or C. Generative AI tools, in turn, are essentially black boxes: we do not know exactly how they work or how they were trained, because companies shield this information from competitors and from the public. Worse, users have almost no control over it. It is therefore possible - and to some extent already happens - that AI models start behaving differently from one day to the next.

Unlike with NCLC tools, where at least in theory we can learn to program and build applications ourselves, with AI we cannot realistically create our own large language model. At most we can switch to another one, for example a more transparent open-source model. But if we use the cloud infrastructure coupled with the model and the platform provided there, our choice of tool is predetermined by the owner of that infrastructure.
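By way of contrast, a more transparent route might look like the short sketch below, which uses the Hugging Face transformers library to run a small, openly available text-generation model locally; the model name is only an example, and any openly licensed model with published weights could take its place.

    # A minimal sketch of running an openly available model locally with the
    # Hugging Face transformers library, instead of a closed cloud-hosted one.
    # The model name is only an example; any open model with published weights works.
    from transformers import pipeline

    generator = pipeline("text-generation", model="distilgpt2")

    result = generator(
        "Low-code tools let non-programmers",
        max_new_tokens=40,       # keep the completion short
        num_return_sequences=1,
    )

    print(result[0]["generated_text"])

Running a model this way trades convenience for control: the weights stay on your own hardware, so the tool cannot silently change underneath you.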

This is the price of an oligopoly in the market for large language models and cloud services - only the giants can afford to develop and share their own models on their own platforms built on their own cloud infrastructure. Their services and tools do not have to be interoperable, and anyone who wants to switch to another tool or platform has to start all over again. That is neither efficient nor democratic.