In May, a research team from MIT announced a new programming language. Finch, its creators say, is “a simple bytecode interpreted, purely object-oriented, prototype-based, dynamically-typed programming language.”
Not to be confused with the Swedish fintech of the same name, the language draws inspiration from Smalltalk, Self, and JavaScript.
In particular, Finch simplifies the creation of concurrent and parallel programs, emphasising lightweight processes and message passing, similar to languages like Erlang. It also aims to be more accessible and easier to learn.
Structured data poses a particular challenge here, and Finch is designed to address the limitations of existing implementations in handling it.
According to sources, “One of Finch’s key innovations lies in its support for a rich structured array programming language. By offering familiar constructs like for-loops, if-conditions, and early breaks over structured data, Finch elevates the productivity level to that of dense arrays. This allows programmers to work with complex data structures without sacrificing expressive power or efficiency.”
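The announcement doesn’t include Finch code samples, so as a rough illustration of what loop-level control over structured data means, here is a short Go sketch (not Finch code): a sparse vector stored as index/value pairs, scanned with an ordinary for-loop, an if-condition, and an early exit, which is the kind of index-bookkeeping loop a programmer would otherwise write by hand.

```go
package main

import "fmt"

// A sparse vector in coordinate form: only the nonzero entries
// are stored, as parallel slices of positions and values.
type SparseVec struct {
	Idx []int
	Val []float64
}

// firstAbove walks only the stored entries and returns the position
// of the first value above the threshold, exiting early on a match.
func firstAbove(v SparseVec, threshold float64) int {
	for k, pos := range v.Idx {
		if v.Val[k] > threshold {
			return pos // early break: the rest is never scanned
		}
	}
	return -1
}

func main() {
	v := SparseVec{
		Idx: []int{2, 7, 40, 1000},
		Val: []float64{0.1, 0.5, 3.2, 0.9},
	}
	fmt.Println(firstAbove(v, 1.0)) // prints 40
}
```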
Finch is some way off from being widely understood: it is specialised and, as yet, not as broadly adopted or recognised as the mainstream languages and frameworks typically used for parallel processing.
And unlike widely adopted languages such as Python and Java, it isn’t as well supported yet. For now, its ecosystem is limited, with a smaller community and fewer libraries, tools, and frameworks available.
Given all that, why should developers get to know the language? Because Finch provides native support for creating and managing concurrent processes, and aims to make parallel programming easier and safer by abstracting away the complexities of thread management and synchronisation.
It also supports lightweight processes, which are cheaper to create and manage compared to traditional operating system threads. This allows for high levels of concurrency with minimal overhead.
And instead of shared memory, the language uses message passing for communication between processes. This helps to avoid many common issues related to concurrency, such as race conditions and deadlocks.
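Published Finch examples are still thin on the ground, but the model described here (many cheap processes that exchange messages instead of sharing memory) is one that Go’s goroutines and channels also provide, so here is a minimal Go sketch, not Finch code, of that style: workers receive jobs and return results purely as messages, with no locks anywhere.

```go
package main

import "fmt"

// worker squares the numbers it receives on jobs and sends each
// result back on out. It keeps no shared state, so no locks are needed.
func worker(jobs <-chan int, out chan<- int) {
	for n := range jobs {
		out <- n * n
	}
}

func main() {
	jobs := make(chan int)
	out := make(chan int)

	// Spawning a goroutine is cheap, so launching many lightweight
	// "processes" costs very little.
	for i := 0; i < 8; i++ {
		go worker(jobs, out)
	}

	// Hand out work as messages...
	go func() {
		for n := 1; n <= 20; n++ {
			jobs <- n
		}
		close(jobs)
	}()

	// ...and collect the replies the same way.
	sum := 0
	for i := 0; i < 20; i++ {
		sum += <-out
	}
	fmt.Println("sum of squares:", sum)
}
```

The point of the pattern is that correctness doesn’t depend on locking: each worker owns its own state, and the only coordination is the messages themselves.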
It is also safe and simple: its syntax and semantics are designed to be straightforward, reducing the cognitive load on the programmer. Perhaps thanks to its origins in academia, that simplicity and focus make it a good vehicle for teaching concepts related to concurrency and parallelism.
Finch use cases
When it comes to use cases, Finch has a wide range of potential applications. It can be used to create a variety of programs that benefit from concurrent and parallel programming, especially those that require efficient handling of multiple tasks simultaneously.
It’s well-suited for exploring concurrency problems such as race conditions, deadlocks, and synchronisation, as well as for developing small-scale parallel computation projects, such as matrix multiplication or sorting algorithms. Networked applications like chat and HTTP servers are also in its ballpark, as are game engines and concurrent simulations.
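As a concrete example of that kind of small-scale parallel computation, here is a Go sketch (again, an analogue rather than Finch code) of a parallel matrix multiplication in which each row of the result is computed by its own lightweight worker and handed back as a message.

```go
package main

import "fmt"

// rowResult carries one finished row of the product back to the
// coordinating goroutine as a message.
type rowResult struct {
	row int
	val []int
}

// multiply computes C = A * B, giving each row of C to its own
// goroutine and collecting the rows over a channel.
func multiply(a, b [][]int) [][]int {
	n, m, p := len(a), len(b), len(b[0])
	results := make(chan rowResult, n)

	for i := 0; i < n; i++ {
		go func(i int) {
			row := make([]int, p)
			for j := 0; j < p; j++ {
				for k := 0; k < m; k++ {
					row[j] += a[i][k] * b[k][j]
				}
			}
			results <- rowResult{i, row}
		}(i)
	}

	c := make([][]int, n)
	for i := 0; i < n; i++ {
		r := <-results
		c[r.row] = r.val
	}
	return c
}

func main() {
	a := [][]int{{1, 2}, {3, 4}}
	b := [][]int{{5, 6}, {7, 8}}
	fmt.Println(multiply(a, b)) // [[19 22] [43 50]]
}
```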
Data processing, too, is a possibility. Finch can be used to implement a log processing system that reads, processes, and analyses logs in parallel to improve throughput and efficiency, or to build data pipelines.
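To make the log processing idea concrete, here is one last Go sketch (an analogue, not Finch code) of the fan-out/fan-in pattern such a system would use: a pool of workers receives log lines over a channel, each counts the error entries it sees, and the partial counts are merged at the end.

```go
package main

import (
	"fmt"
	"strings"
)

// countErrors fans log lines out to a pool of workers, each of
// which scans its lines for "ERROR" and reports a partial count.
func countErrors(lines []string, workers int) int {
	in := make(chan string)
	partial := make(chan int)

	for w := 0; w < workers; w++ {
		go func() {
			count := 0
			for line := range in {
				if strings.Contains(line, "ERROR") {
					count++
				}
			}
			partial <- count
		}()
	}

	// Feed the lines in as messages, then signal that the stream is done.
	go func() {
		for _, line := range lines {
			in <- line
		}
		close(in)
	}()

	// Merge the partial counts from every worker.
	total := 0
	for w := 0; w < workers; w++ {
		total += <-partial
	}
	return total
}

func main() {
	logs := []string{
		"INFO  service started",
		"ERROR could not reach database",
		"WARN  retrying request",
		"ERROR request timed out",
	}
	fmt.Println("errors:", countErrors(logs, 4)) // errors: 2
}
```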
There are plenty of other ways developers can use it, including to build real-time systems or microservices.
While Finch may not be the best choice for large-scale production systems, given its niche status and smaller ecosystem, it is worth considering for smaller projects where understanding and applying concurrent programming principles is the key objective.