Programming paradigms have become indispensable lenses for software engineers in today's big data age. Each paradigm rests on a mathematical theory and a set of principles, and supports a particular set of concepts. In practice, programming is often multi-paradigm rather than single-paradigm: solving a programming problem means choosing the right concepts for each of its parts, and there is usually more than one problem to solve. An ideal programming language should therefore support several concepts, which in turn requires its users to be fluent in several paradigms. This may seem to be a drawback; it can be alleviated by choosing a paradigm with just the right concepts. With too few concepts, programs become complicated; with too many, reasoning about them becomes complicated. Striking a balance between concepts and paradigms is therefore crucial in programming.
This balance cannot be achieved by simply combining related concepts. One of the main principles for organizing concepts is to recognize the sign that a new concept needs to be created: when, in a particular paradigm, programs become complicated for technical reasons that bear no relationship to the problem at hand, it is time to discover a new concept. Typically, this shows up as a need to make nonlocal or pervasive changes to a program in order to achieve a single goal.
According to the computer scientist Van Roy (2016), two main properties characterize a programming paradigm:
- Does it have observable non-determinism?
- How strongly does it support state?
Van Roy (2016) also asserts that state should be understood in terms of how strongly a paradigm supports recording a sequence of values in time. There are various possible combinations of state properties, and not all of them are equally useful.
When it comes to understanding the programming paradigms, it is essential to have a grasp of the following programming concepts (Van Roy, 2016):
- Records: These refer to groups of data items with indexed access to each item.
- Lexically scoped closures: These refer to combinations of a procedure with its external references, i.e., the environment it was defined in.
- Independence: The idea of independent evolution of activities or their concurrent execution
- Named state: The idea of naming a piece of state.
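The first two of these concepts can be illustrated with a short Python sketch (the function and variable names here are illustrative, not taken from Van Roy):

```python
# A lexically scoped closure pairs a procedure with its external
# references. Here make_adder returns a closure that keeps a reference
# to the variable n from the enclosing scope, even after make_adder
# has returned.
def make_adder(n):
    def add(x):
        return x + n   # n is captured from the enclosing lexical scope
    return add

add5 = make_adder(5)
result = add5(10)
print(result)  # 15

# A record groups data items with indexed access to each item;
# a Python dict indexed by field name plays that role here.
point = {"x": 3, "y": 4}
print(point["x"])  # 3
```

The closure is what lets `add5` behave as a self-contained value: the binding of `n` travels with the procedure wherever it is passed.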
One rule of thumb is that named state should never be invisible: there should always be some way to access it from the outside, as this is crucial for a system's modularity. Van Roy (2016) also makes the interesting observation that organisms grow by learning and display time-based behavior, whereas a program without state is timeless: it always reacts to the same stimulus in the same way. In real life, an organism's reaction to the same stimulus can change over time, and this cannot be modeled by a stateless program. To capture it, an entity with a distinct name must be modeled whose behavior changes throughout the execution of the program. For this purpose, an abstract notion of time is added to the program: a named sequence of values in time. This sequence is referred to as a named state.
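A minimal Python sketch of this idea (the class and its behavior are illustrative, not an example from Van Roy): the named state is the sequence of values taken by `exposures`, and because of it the same stimulus produces different reactions over time.

```python
# Named state models a named sequence of values in time. The same call
# (stimulus) yields different results as the internal state evolves,
# modeling behavior that changes during the program's execution.
class Organism:
    def __init__(self):
        self.exposures = 0  # the named state

    def react(self, stimulus):
        self.exposures += 1  # the state advances to its next value
        # The reaction to a repeated stimulus differs from the first one.
        if self.exposures == 1:
            return f"{stimulus} noticed"
        return f"{stimulus} ignored"

o = Organism()
first = o.react("light")
second = o.react("light")
print(first)   # light noticed
print(second)  # light ignored
```

A stateless function could not produce this behavior: given the same argument, it must always return the same result.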
To grasp the spirit of a paradigm, one should also understand some further essential concepts (Van Roy, 2016):
A data abstraction offers a way of organizing data structures according to precise rules that ensure their proper use. A data abstraction has three parts: the inside, the outside, and the interface between the two.
There are two questions which determine how to organize data abstractions:
- Does the abstraction use a named state?
- Are the operations combined as a single entity with the data?
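The second question can be illustrated with a hypothetical stack abstraction in Python (the names are illustrative). The object style answers "yes" by bundling the operations with the data into a single entity; the ADT-style version keeps operations and data separate, and here also answers "no" to the named-state question.

```python
# Object style: operations and data form a single entity; the inside
# (the list) is hidden behind the interface of push/pop, and the
# abstraction uses named state.
class Stack:
    def __init__(self):
        self._items = []          # internal representation

    def push(self, x):
        self._items.append(x)

    def pop(self):
        return self._items.pop()

# ADT style: the data (a plain tuple) and the operations are separate
# functions; each operation returns a new value and leaves the old one
# unchanged, so no named state is involved.
def adt_push(stack, x):
    return stack + (x,)

def adt_pop(stack):
    return stack[:-1], stack[-1]

s = Stack()
s.push(1)
s.push(2)
print(s.pop())                    # 2

rest, top = adt_pop(adt_push((), 7))
print(top)                        # 7
```

Both organize the same inside/outside split; they differ only in how the answers to the two questions are chosen.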
The main issue with concurrency is non-determinism, which becomes hard to manage when the user of the program can observe it. Observable non-determinism is also referred to as a race condition.
Without non-determinism, the ability to build programs out of independent parts would be limited. Still, the observability of non-deterministic behavior can be limited in one of two ways (Van Roy, 2016):
- Defining a language so that non-determinism can no longer be observed.
- Keeping the scope of observable non-determinism only for those program parts which are really in need of it.
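The problem and the second option can be sketched in Python with threads (the functions and counts are illustrative). The first function is race-prone; the second confines the non-determinism behind a lock, so the observable result no longer depends on thread interleaving.

```python
import threading

# Observable non-determinism (a race condition): the read-modify-write
# below is not atomic, so with unsynchronized threads the final value
# of counter would depend on the interleaving.
counter = 0
def unsafe_increment(n):
    global counter
    for _ in range(n):
        tmp = counter      # another thread may run between read and write
        counter = tmp + 1

# Limiting observability: guard the shared state with a lock. Threads
# still interleave non-deterministically, but the observable result is
# always the same.
lock = threading.Lock()
safe_counter = 0
def safe_increment(n):
    global safe_counter
    for _ in range(n):
        with lock:
            safe_counter += 1

threads = [threading.Thread(target=safe_increment, args=(10000,))
           for _ in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()
print(safe_counter)  # always 40000
```

The non-determinism has not been removed, only made unobservable: the interleaving still varies from run to run, but no observable result depends on it.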
Some programming paradigms that are concurrent yet have no observable non-determinism are (Van Roy, 2016):
- Declarative concurrency: This is also referred to as monotonic dataflow, in which deterministic inputs are used to compute deterministic results.
- Functional reactive programming (FRP): This is also referred to as continuous synchronous programming, in which functional programs with revisable arguments can be developed. Time is continuous rather than discrete.
- Discrete synchronous programming: This is also referred to as reactive systems, which emit output events after some internal calculations. Time is discrete, in contrast to FRP.
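A rough Python analogue of declarative concurrency (a sketch only; single-use queues stand in for the dataflow variables of Van Roy's model): a reader blocks until a variable is bound, so the result is deterministic no matter how the threads are scheduled.

```python
import threading
import queue

# Each Queue below is used as a single-assignment dataflow variable:
# bound once by a producer, and any reader blocks until the value
# arrives. Deterministic inputs therefore yield a deterministic
# result regardless of thread scheduling.
x = queue.Queue()
y = queue.Queue()

threading.Thread(target=lambda: x.put(6 * 7)).start()        # binds x
threading.Thread(target=lambda: y.put(x.get() + 1)).start()  # waits for x, binds y

result = y.get()
print(result)  # always 43, whatever order the threads actually run in
```

No lock is needed: because no variable is ever rebound, there is nothing a race could observe, which is exactly the monotonic-dataflow property.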
Constraint programming defines a problem as a constraint satisfaction problem, which makes it one of the most declarative of all practical paradigms. The problem is modeled as a set of variables with constraints over them, together with propagators that use the constraints to narrow down the possible values of the variables.
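A toy illustration in Python (the problem and the solver are hypothetical; a real constraint system uses propagators to prune the domains rather than naive enumeration): the program only *states* the variables, their domains, and the constraints, and a generic search finds the values that satisfy them.

```python
from itertools import product

# Declare the problem: variables with finite domains, plus constraints.
domains = {"x": range(10), "y": range(10)}
constraints = [
    lambda v: v["x"] + v["y"] == 9,   # the constraints define the problem;
    lambda v: v["x"] < v["y"],        # no solution procedure is specified
    lambda v: v["y"] == 2 * v["x"],
]

# A generic (naive) solver: enumerate the domains and keep assignments
# that satisfy every constraint.
solutions = [
    assignment
    for values in product(*domains.values())
    for assignment in [dict(zip(domains, values))]
    if all(c(assignment) for c in constraints)
]
print(solutions)  # [{'x': 3, 'y': 6}]
```

Changing the problem means changing the constraints, not the solver, which is what makes the style declarative.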
According to Van Roy (2016), language design also calls for some guidelines. For programming in the small, a dual-paradigm language may be feasible; for programming in the large, abstraction and modularity become more relevant. Van Roy (2016) asserts that the ideal design should possess the following main layers:
- a strict functional core
- declarative concurrency
- asynchronous message passing
- global named state.
These layers would support all the programming paradigms mentioned above.
In summary, based on Van Roy's (2016) observations, we can say that the declarative paradigm is not only one of the main programming approaches today, but will presumably remain so in the near future because it lends itself to secure and fault-tolerant programs. In addition, deterministic concurrency offers great potential for making the best use of multi-core processors. For systems of larger scale, a self-sufficient design approach can be adopted so that systems become self-configuring: components communicate by passing messages, while named state supports system maintenance and configuration, making the system itself a set of interlocking feedback loops.
Last but not least, the spirit of a paradigm cannot be understood in depth without actually programming in it.