Next-Paradigm Programming Languages: What Will They Look Like and What Changes Will They Bring?
Yannis Smaragdakis
Department of Informatics and Telecommunications
University of Athens
smaragd@di.uoa.gr

The dream of programming language design is to bring about orders-of-magnitude productivity improvements in software development tasks. Designers can endlessly debate on how this dream can be realized and on how close we are to its realization. Instead, I would like to focus on a question with an answer that can be, surprisingly, clearer: what will be the features of next-paradigm languages, and what changes will they bring to program development? Based on my decade-plus experience of heavy-duty development in declarative languages, I speculate that certain tenets of high-productivity languages are inevitable. These include, for instance, enormous variations in performance (including automatic transformations that change the asymptotic complexity of algorithms); a radical change in a programmer's workflow, elevating testing from a near-menial task to an act of deep understanding; and a change in the need for formal proofs.

CCS Concepts: Software and its engineering → General programming languages; Social and professional topics → History of programming languages.

Keywords
programming paradigms, next-paradigm programming languages

ACM Reference Format:
Yannis Smaragdakis. 2019. Next-Paradigm Programming Languages: What Will They Look Like and What Changes Will They Bring? In Proceedings of the 2019 ACM SIGPLAN International Symposium on New Ideas, New Paradigms, and Reflections on Programming and Software (Onward! '19), October 23-24, 2019, Athens, Greece. ACM, New York, NY, USA, 11 pages. https://doi.org/10.1145/3359591.3359739

Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than the author(s) must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from permissions@acm.org.

Onward! '19, October 23-24, 2019, Athens, Greece
© 2019 Copyright held by the owner/author(s). Publication rights licensed to ACM. ACM ISBN 978-1-4503-6995-4/19/10...$15.00
https://doi.org/10.1145/3359591.3359739

[W]e've passed the point of diminishing returns. No future language will give us the factor of 10 advantage that assembler gave us over binary. No future language will give us 50%, or 20%, or even 10% reduction in workload.
Robert C. Martin [18]

1 Introduction
Since the 1950s, high-level programming languages have resulted in orders-of-magnitude productivity improvements compared to machine-level coding. This feat has been a great enabler of the computing revolution, during a time when computer memories and conceptual program complexity have steadily grown at exponential rates. The history of computing reflects these accomplishments: of the 53 Turing awards to present (from 1966 to 2018), a full 16 have been awarded for contributions
to programming languages or compilers.¹

¹ The count is based on occurrences of "programming language(s)" or "compiler(s)" in the brief citation text of the award, also including Richard Hamming, who is cited for "automatic coding systems" (i.e., the L2 precursor of Fortran). Notably, the number does not include John McCarthy or Dana Scott, who are well known for languages contributions, yet the terms do not appear in the citation.

At this time, however, the next steps in programming language evolution are hard to discern. Large productivity improvements will require a Kuhnian paradigm shift in languages. A change of paradigm, in the Kuhn sense, is a drastic reset of our understanding and nomenclature. It is no surprise that we are largely ineffective at predicting its onset, its nature, or its principles.

Despite this conceptual difficulty, the present paper is an attempt to peer behind the veil of next-paradigm programming languages. I happen to believe (on even days of the month) that a change of paradigm is imminent and all its technical components are already here. But you do not have to agree with me on either point; after all, the month also has its odd days.

Reasonable people may also differ on the possible catalysts of such a paradigm shift. Will it be machine learning and statistical techniques [23], trained over vast data sets of code instances? Will it be program synthesis techniques [9], employing symbolic reasoning and complex constraint solving? Will it be mere higher-level language design combined with technology trends, such as vast computing power and
enormous memories?

Regardless of one's views, I hope to convince the reader that there is reasonable clarity on some features that next-paradigm programming languages will have, if they ever dominate. Similarly, there is reasonable clarity on what changes next-paradigm programming languages will induce in the tasks of everyday software development. For a sampling of the principles I will postulate and their corollaries, consider the following conjectures:

Next-paradigm programming languages will not display on the surface the computational complexity of their calculations. Large changes in asymptotic complexity (e.g., from O(n^4) to O(n^2)) will be effected by the language implementation. Programmers will routinely ignore asymptotic complexity factors, favoring solution simplicity and countering performance problems by limiting the inputs of algorithms (e.g., applying an expensive computation only locally) or accepting approximate results.

Next-paradigm programming languages will need a firm mental grounding, in order to keep program development manageable. This grounding can include: a well-understood cost model; a simple and clear model of how new code can or cannot affect the results of earlier code; a natural adoption of parallelism without need for concurrency reasoning.

Development with next-paradigm programming languages will be significantly different from current software development. Small code changes will have tremendous impact on the output and its computation cost. Incremental development will be easier. Testing and debugging will be as conceptually involved as coding. Formal reasoning will be easier, but less necessary.

In addition to postulating such principles, the goal of the paper is to illustrate them. I will use examples from real, deployed code, written in Datalog. My experience in declarative programming is a key inspiration for most of the observations of the paper. It is also what makes the conjectures of the paper "real": all of the elements I describe, even the most surprising, are instances I have encountered in programming practice. I begin with this personal background before venturing to further speculation.

I've seen things, that's why I'm seeing things. - me

2 Where I Come From
For more than a decade, I have been writing declarative, logic-based code of uncommon volume and variety, under stringent performance requirements. This experience underlies my speculation on the properties of next-paradigm programming languages.

Declarative code: lots of it.
Most of my research (and that of my group) has been on declarative program analysis [24]. My group and a growing number of external collaborators have implemented large, full-featured static analysis frameworks for Java bytecode [4], LLVM bitcode [2], Python [14], and Ethereum VM bytecode [6, 7]. The frameworks have been the ecosystem for a large number of new static analysis algorithms, leading to much new research in the area.

These analysis frameworks are written in the Datalog language. Datalog is a bottom-up variant of Prolog, with similar syntax. "Bottom-up" means that no search is performed to find solutions to logical implications; instead, all valid solutions are computed, in parallel. This makes the language much more declarative than Prolog: reordering rules, or clauses in the body of a rule, does not affect the output. Accordingly, computing all possible answers simultaneously means that the language has to be limited to avoid possibly infinite computations. Construction of new objects (as opposed to new combinations of values) is, therefore, outside the core language and, in practice, needs to be strictly controlled by the programmer. These features will come into play in later observations and conjectures.

The Datalog language had been employed in static program analysis long before our work [10, 15, 22, 29]. However, our frameworks are distinguished by being almost entirely written in Datalog: not just quick prototypes or "convenient" core computations are expressed as declarative rules, but the complete, final, and well-optimized version of the deployed code, as well as much of the scaffolding of the analysis. As a result, our analysis frameworks are possibly the largest Datalog programs ever written, and among the largest pieces of declarative code overall. For instance, the Doop codebase [4] comprises several thousands of Datalog rules, or tens of thousands of lines of code. This may seem like a small amount of code, but, for logical rules in complex mutual recursion, it represents a daunting amount of complexity.
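The bottom-up evaluation described above can be made concrete with a small sketch. The following Python fragment is a toy stand-in for a Datalog engine (the function name and representation are my own, not any engine's API): it computes all facts derivable from the rules path(x, y) :- edge(x, y) and path(x, z) :- edge(x, y), path(y, z) by iterating to a fixpoint. Applying the rules in a different order yields the same set of facts, mirroring the order-independence of bottom-up semantics.

```python
def transitive_closure(edges):
    """Bottom-up evaluation of:
         path(x, y) :- edge(x, y).
         path(x, z) :- edge(x, y), path(y, z).
    All derivable facts are computed; no search is performed."""
    path = set(edges)              # first rule: every edge is a path
    changed = True
    while changed:                 # iterate until a fixpoint is reached
        changed = False
        for (x, y) in edges:       # second rule: join edge with path
            for (a, z) in list(path):
                if y == a and (x, z) not in path:
                    path.add((x, z))
                    changed = True
    return path

print(sorted(transitive_closure({(1, 2), (2, 3), (3, 4)})))
# → [(1, 2), (1, 3), (1, 4), (2, 3), (2, 4), (3, 4)]
```

Note that the result is the full set of valid solutions, computed without backtracking; a real engine would use semi-naive evaluation rather than this naive loop, but the semantics is the same.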
This complexity captures core static analysis logic, modeling of language semantics (covering virtually the entire complexity of Java), logic for common frameworks and dynamic behaviors, and more.

Emphasis on performance, including parallelism.
The frameworks achieve performance at least equal to that of manually-optimized imperative implementations, written with a clear cost model in mind. The author of a declarative rule knows how it will be evaluated: in how many nested loops and in what order, with what indexing structures, with which incrementality policy for faster convergence when recursion is employed. Optimization directives are applied to achieve maximum performance. Shared-memory parallelism is implicit, but the programmer is well aware of which parts of the evaluation parallelize well and which are inherently sequential. In short, although the code is very high-level, its structure is anything but random, and its performance is not left to chance. Maximum effort is expended to encode highest-performing solutions purely declaratively.
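As a toy illustration of how the evaluation strategy, rather than the code's surface form, determines cost, consider the same logical join computed two ways in Python (the helper names are hypothetical and do not reflect any actual Datalog engine's machinery). A nested-loop scan costs O(|R| * |S|); a hash index on the join column brings the expected cost to O(|R| + |S|), while the logical meaning is unchanged.

```python
from collections import defaultdict

def join_nested(r, s):
    # Nested loops: every pair of tuples is examined, O(|r| * |s|).
    return {(x, z) for (x, y) in r for (y2, z) in s if y == y2}

def join_indexed(r, s):
    # Hash index on the join column: expected O(|r| + |s|).
    index = defaultdict(list)
    for (y, z) in s:
        index[y].append(z)
    return {(x, z) for (x, y) in r for z in index[y]}

r = {(1, 'a'), (2, 'b')}
s = {('a', 10), ('b', 20), ('c', 30)}
assert join_nested(r, s) == join_indexed(r, s)  # same result, different cost
```

This is the kind of asymptotic gap that, in the conjectures above, the language implementation is expected to close on the programmer's behalf.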