SLE 2018
Sun 4 - Fri 9 November 2018, Boston, Massachusetts, United States
co-located with SPLASH 2018

The ACM SIGPLAN International Conference on Software Language Engineering (SLE) is devoted to the principles of software languages: their design, their implementation, and their evolution.

With the ubiquity of computers, software has become the dominating intellectual asset of our time. In turn, this software depends on software languages, namely the languages it is written in, the languages used to describe its environment, and the languages driving its development process. Given that everything depends on software and that software depends on software languages, it seems fair to say that for many years to come, everything will depend on software languages.

Software language engineering (SLE) is the discipline of engineering the languages and tools required for the creation of software. It abstracts from the differences between programming languages, modelling languages, and other software languages, and emphasizes the engineering facet of the creation of such languages, that is, the establishment of the scientific methods and practices that enable the best results. While SLE is certainly driven by its metacircular character (software languages are engineered using software languages), it is not an end in itself: its scope extends to the engineering of languages for everyone and everything.

Like its predecessors, the 11th edition of the SLE conference, SLE 2018, will bring together researchers from different areas united by their common interest in the creation, capture, and tooling of software languages. It overlaps with traditional conferences on the design and implementation of programming languages, model-driven engineering, and compiler construction, and emphasizes the fusion of their communities. To foster the latter, SLE traditionally fills a two-day program with a single track, with the only temporal overlap occurring between co-located events.

SLE 2018 is to be held on November 5th and 6th 2018, co-located with SPLASH and GPCE in Boston, USA.

Proceedings are available online.

Program

Mon 5 Nov

Displayed time zone: Guadalajara, Mexico City, Monterrey

10:30 - 12:00
Parsing (SLE 2018) at Studio 1
Chair(s): Tijs van der Storm CWI & University of Groningen
10:30
30m
Talk
Declarative Specification of Indentation Rules: A Tooling Perspective on Parsing and Pretty-Printing Layout-Sensitive Languages
SLE 2018
Luis Eduardo de Souza Amorim Delft University of Technology, Netherlands, Michael J. Steindorfer Delft University of Technology, Sebastian Erdweg TU Delft, Eelco Visser Delft University of Technology
Link to publication DOI
11:00
30m
Talk
GLL Parsing with Flexible Combinators
SLE 2018
L. Thomas van Binsbergen Royal Holloway University of London, Elizabeth Scott Royal Holloway University of London, Adrian Johnstone
File Attached
11:30
30m
Talk
Morbig: A Static Parser for POSIX Shell
SLE 2018
Yann Régis-Gianas IRIF, University Paris Diderot and CNRS, France / INRIA PI.R2, Nicolas Jeannerod IRIF, Université de Paris, Ralf Treinen IRIF
13:30 - 15:00
Parsing / Composition (SLE 2018) at Studio 1
Chair(s): Eelco Visser Delft University of Technology
13:30
20m
Talk
Input-Driven Regular Expressions (Vision Paper)
SLE 2018
13:50
30m
Talk
Modular Language Composition for the Masses
SLE 2018
Manuel Leduc Univ Rennes, Inria, CNRS, IRISA, Thomas Degueule Centrum Wiskunde & Informatica, Benoit Combemale University of Rennes 1
DOI Pre-print
14:20
20m
Talk
Storm: A Language Platform for Interacting and Extensible Languages (Tool Demo)
SLE 2018
Filip Strömbäck Linköping University
DOI
14:40
20m
Talk
Languages as First-Class Citizens (Vision Paper)
SLE 2018
Matteo Cimini University of Massachusetts Lowell
15:30 - 17:30
Validation & Verification (SLE 2018) at Studio 1
Chair(s): Marsha Chechik University of Toronto
15:30
30m
Talk
Continuous Model Validation using Reference Attribute Grammars
SLE 2018
Johannes Mey Technische Universität Dresden, René Schöne Technische Universität Dresden, Görel Hedin, Emma Söderberg Lund University, Thomas Kühn Technische Universität Dresden, Niklas Fors Lund University, Jesper Öqvist Lund University, Uwe Aßmann TU Dresden, Germany
Link to publication DOI Pre-print Media Attached
16:00
30m
Talk
Migrating Business Logic to an Incremental Computing DSL: A Case Study
SLE 2018
Daco Harkes Delft University of Technology, Elmer van Chastelet Delft University of Technology, Eelco Visser Delft University of Technology
Link to publication DOI Pre-print
16:30
20m
Talk
An Industrial Case Study in Compiler Testing (Tool Demo)
SLE 2018
Vadim Zaytsev Raincode Labs
16:50
20m
Talk
Messir, a Text-first DSL-based Approach for UML Requirements Engineering (Tool Demo)
SLE 2018
Benoît Ries University of Luxembourg, Alfredo Capozucca University of Luxembourg, Nicolas Guelfi University of Luxembourg
DOI

Tue 6 Nov

Displayed time zone: Guadalajara, Mexico City, Monterrey

08:30 - 10:00
Keynote: Rinard (SLE 2018) at Studio 1
08:30
15m
Day opening
Awards
SLE 2018
Friedrich Steimann Fernuniversität, Tanja Mayerhofer TU Wien, Matthew Roberts Macquarie University, Romina Eramo University of L'Aquila
08:45
75m
Talk
A New Approach for Software Correctness and Reliability (Keynote)
SLE 2018
Martin C. Rinard Massachusetts Institute of Technology
10:30 - 12:00
Types & Constraints (SLE 2018) at Studio 1
Chair(s): Ralf Laemmel Facebook London
10:30
30m
Talk
Constraint-based Run-time State Migration for Live Modeling
SLE 2018
Ulyana Tikhonova CWI, Jouke Stoel CWI, Tijs van der Storm CWI & University of Groningen, Thomas Degueule Centrum Wiskunde & Informatica
11:00
30m
Talk
The Next 700 Unit Checkers
SLE 2018
Oscar Bennich-Björkman Uppsala University, Steve McKeever Uppsala University
11:30
30m
Talk
A Practical Type System for Safe Aliasing
SLE 2018
Dimi Racordon University of Geneva, Centre Universitaire d'Informatique, Geneva, Switzerland, Didier Buchs University of Geneva, Centre Universitaire d'Informatique, Geneva, Switzerland
13:30 - 15:00
Grammars & Metamodelling (SLE 2018) at Studio 1
Chair(s): Thomas Degueule Centrum Wiskunde & Informatica
13:30
30m
Talk
Facet-Oriented Modelling: Open Objects for Model-Driven Engineering
SLE 2018
Juan de Lara Universidad Autónoma de Madrid, Esther Guerra Universidad Autónoma de Madrid, Jörg Kienzle McGill University, Canada, Yanis Hattab McGill University
14:00
30m
Talk
Analysing Meta-Model Product Lines
SLE 2018
Esther Guerra Universidad Autónoma de Madrid, Juan de Lara Universidad Autónoma de Madrid, Marsha Chechik University of Toronto, Rick Salay University of Toronto
14:30
30m
Talk
Translating Grammars to Accurate Metamodels
SLE 2018
Arvid Butting Software Engineering RWTH Aachen University, Nico Jansen Software Engineering, RWTH Aachen University, Bernhard Rumpe RWTH Aachen University, Andreas Wortmann RWTH Aachen University
15:30 - 17:30
Grammars & Metamodelling / Workbenches (SLE 2018) at Studio 1
Chair(s): Juan de Lara Universidad Autónoma de Madrid
15:30
30m
Talk
Deriving Fluent Internal Domain-Specific Languages from Grammars
SLE 2018
Arvid Butting Software Engineering RWTH Aachen University, Manuela Dalibor Software Engineering, RWTH Aachen University, Gerrit Leonhardt Software Engineering, RWTH Aachen University, Bernhard Rumpe RWTH Aachen University, Andreas Wortmann RWTH Aachen University
16:00
20m
Talk
Fostering Metamodels and Grammars Within a Dedicated Environment for HPC: The NabLab Environment (Tool Demo)
SLE 2018
Benoît Lelandais CEA/DAM/DIF, France, Marie-Pierre Oudot CEA/DAM/DIF, France, Benoit Combemale University of Rennes 1
16:20
20m
Talk
Migrating Custom DSL Implementations to a Language Workbench (Tool Demo)
SLE 2018
Jasper Denkers TU Delft, Louis van Gool Océ Technologies B.V., Eelco Visser Delft University of Technology
Link to publication DOI
16:40
20m
Talk
Bacatá: A Language Parametric Notebook Generator (Tool Demo)
SLE 2018
Mauricio Verano Merino Technische Universiteit Eindhoven, Jurgen Vinju Centrum Wiskunde & Informatica / Technische Universiteit Eindhoven / SWAT.engineering BV, Tijs van der Storm CWI & University of Groningen
17:00
20m
Talk
Shape-Diverse DSLs: Languages without Borders (Vision Paper)
SLE 2018
Fabien Coulon University of Toulouse / Obeo, Thomas Degueule Centrum Wiskunde & Informatica, Tijs van der Storm CWI & University of Groningen, Benoit Combemale University of Rennes 1
Pre-print

Unscheduled Events

Not scheduled
Talk
GPCE Keynote: How to Make Sparse Fast (Keynote)
SLE 2018
Not scheduled
Day opening
Opening
SLE 2018
David J. Pearce Victoria University of Wellington, Friedrich Steimann Fernuniversität, Tanja Mayerhofer TU Wien

Accepted Papers

  • Analysing Meta-Model Product Lines
  • An Industrial Case Study in Compiler Testing (Tool Demo)
  • A Practical Type System for Safe Aliasing
  • Bacatá: A Language Parametric Notebook Generator (Tool Demo)
  • Constraint-based Run-time State Migration for Live Modeling
  • Continuous Model Validation using Reference Attribute Grammars
  • Declarative Specification of Indentation Rules: A Tooling Perspective on Parsing and Pretty-Printing Layout-Sensitive Languages
  • Deriving Fluent Internal Domain-Specific Languages from Grammars
  • Facet-Oriented Modelling: Open Objects for Model-Driven Engineering
  • Fostering Metamodels and Grammars Within a Dedicated Environment for HPC: The NabLab Environment (Tool Demo)
  • GLL Parsing with Flexible Combinators
  • Input-Driven Regular Expressions (Vision Paper)
  • Languages as First-Class Citizens (Vision Paper)
  • Messir, a Text-first DSL-based Approach for UML Requirements Engineering (Tool Demo)
  • Migrating Business Logic to an Incremental Computing DSL: A Case Study
  • Migrating Custom DSL Implementations to a Language Workbench (Tool Demo)
  • Modular Language Composition for the Masses
  • Morbig: A Static Parser for POSIX Shell
  • Shape-Diverse DSLs: Languages without Borders (Vision Paper)
  • Storm: A Language Platform for Interacting and Extensible Languages (Tool Demo)
  • The Next 700 Unit Checkers
  • Translating Grammars to Accurate Metamodels

Call for Papers

Topics of Interest

SLE 2018 solicits high-quality contributions ranging from theoretical and conceptual work to tools, techniques, and frameworks in the domain of software language engineering. Topics relevant to SLE cover generic aspects of software language development rather than the engineering of a specific language. In particular, SLE is interested in contributions from the following areas:

  • Software Language Design and Implementation
    • Approaches to and methods for language design
    • Static semantics (e.g., design rules, well-formedness constraints)
    • Techniques for specifying behavioral / executable semantics
    • Generative approaches (incl. code synthesis, compilation)
    • Meta-languages, meta-tools, language workbenches
  • Software Language Validation
    • Verification and formal methods for languages
    • Testing techniques for languages
    • Simulation techniques for languages
  • Software Language Integration and Composition
    • Coordination of heterogeneous languages and tools
    • Mappings between languages (incl. transformation languages)
    • Traceability between languages
    • Deployment of languages to different platforms
  • Software Language Maintenance
    • Software language reuse
    • Language evolution
    • Language families and variability
  • Domain-specific approaches to any aspect of SLE (design, implementation, validation, maintenance)
  • Empirical evaluation and experience reports of language engineering tools
    • User studies evaluating usability
    • Performance benchmarks
    • Industrial applications

Types of Submissions

  • Research papers: These should report a substantial research contribution to SLE or successful application of SLE techniques or both. Full paper submissions must not exceed 12 pages excluding bibliography.

  • Tool papers: Because of SLE’s interest in tools, we seek papers that present software tools related to the field of SLE. Selection criteria include originality of the tool, its innovative aspects, and relevance to SLE. Any of the SLE topics of interest are appropriate areas for tool demonstrations. Submissions must provide a tool description of 4 pages excluding bibliography, and a demonstration outline, including screenshots, of up to 6 pages. Tool demonstrations must have the keywords “Tool Demo” or “Tool Demonstration” in the title. The 4-page tool description will, if the demonstration is accepted, be published in the proceedings. The 6-page demonstration outline will be used by the program committee only for evaluating the submission.

  • New ideas / vision papers: New ideas papers should describe new, non-conventional SLE research approaches that depart from standard practice. They are intended to describe well-defined research ideas that are at an early stage of investigation. Vision papers are intended to present new unifying theories about existing SLE research that can lead to the development of new technologies or approaches. New ideas / vision papers must not exceed 4 pages excluding bibliography.

Workshops: Workshops will be organized by SPLASH. Please inform us and contact the SPLASH organizers if you would like to organize a workshop of interest to the SLE audience. Information on how to submit workshops can be found at the SPLASH 2018 Website.

Artifact Evaluation

For the third year, SLE will use an evaluation process for assessing the quality of the artifacts on which papers are based, in order to foster a culture of experimental reproducibility. Authors of accepted papers are invited to submit artifacts. For more information, please have a look at the Artifact Evaluation page.

Submission

Format

Submissions have to use the ACM SIGPLAN Conference Format “acmart”; please make sure that you always use the latest ACM SIGPLAN acmart LaTeX template, and that the document class definition is \documentclass[sigplan,screen]{acmart}. Do not make any changes to this format!
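
As an illustration only (this is not part of the official instructions), a skeleton that satisfies this requirement might look as follows; the title, author, affiliation, and e-mail are placeholders, and acmart may still warn about missing conference metadata:

    \documentclass[sigplan,screen]{acmart}  % required document class and options

    \begin{document}

    % Placeholder metadata -- replace with the actual submission details.
    \title{A Placeholder SLE 2018 Submission}
    \author{Jane Doe}
    \affiliation{\institution{Example University}}
    \email{jane.doe@example.org}

    \maketitle

    % Paper body goes here.

    \end{document}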

Using the Word template is strongly discouraged.

Ensure that your submission is legible when printed on a black and white printer. In particular, please check that colors remain distinct and font sizes in figures and tables are legible.

SLE follows a single-blind review process. Thus, you do not have to blind your submission.

All submissions must be in PDF format.

Concurrent Submissions

Papers must describe unpublished work that is not currently submitted for publication elsewhere as described by SIGPLAN’s Republication Policy. Submitters should also be aware of ACM’s Policy and Procedures on Plagiarism. Submissions that violate these policies will be desk-rejected.

Submission Site

Submissions will be accepted at https://sle18.hotcrp.com/.

Reviewing Process

All submitted papers will be reviewed by at least three members of the program committee. Research papers and tool papers will be evaluated concerning novelty, correctness, significance, readability, and alignment with the conference call. New ideas / vision papers will be evaluated primarily concerning novelty, significance, readability, and alignment with the conference call.

For fairness reasons, all submitted papers must conform to the above instructions. Submissions that violate these instructions may be rejected without review, at the discretion of the PC chairs.

Awards

  • Distinguished paper: Award for most notable paper, as determined by the PC chairs based on the recommendations of the programme committee.
  • Distinguished reviewer: Award for distinguished reviewer, as determined by the PC chairs.
  • Distinguished artifact: Award for the artifact most significantly exceeding expectations, as determined by the AEC chairs based on the recommendations of the artifact evaluation committee.

Publication

All accepted papers will be published in the ACM Digital Library.

AUTHORS TAKE NOTE: The official publication date is the date the proceedings are made available in the ACM Digital Library. This date may be up to two weeks prior to the first day of the conference. The official publication date affects the deadline for any patent filings related to published work.

Contact

For additional information, clarification, or answers to questions, please contact the organizers by email: sle2018@googlegroups.com.

Artifact Evaluation

SLE will, for the third year, use an evaluation process for assessing the quality of artifacts on which papers are based. The aim of this evaluation process is to foster a culture of experimental reproducibility and to provide a peer review process for artifacts as well as papers.

Authors of papers accepted for SLE 2018 will be invited to submit artifacts. Any kind of artifact that is presented in the paper, supplements the paper with further details, or underlies the paper can be submitted. This includes, for instance: tools, grammars, metamodels, models, programs, algorithms, scripts, proofs, datasets, statistical tests, checklists, surveys, interview scripts, visualizations, annotated bibliographies, and tutorials.

The submitted artifacts will be reviewed by a dedicated Artifact Evaluation Committee (AEC). Artifacts that live up to the expectations created by the paper will receive a badge of approval from the AEC. The approved artifacts will be invited for inclusion in the electronic conference proceedings published in the ACM Digital Library. This will ensure the permanent and durable storage of the artifacts alongside the published papers fostering the repeatability of experiments, enabling precise comparison with alternative approaches, and helping the dissemination of the author’s ideas in detail.

The AEC will award the artifact that most significantly exceeds the expectations with a Distinguished Artifact Award.

Participating in the artifact evaluation and publishing approved artifacts in the ACM Digital Library is voluntary. However, we strongly encourage authors to consider this possibility as the availability of artifacts will greatly benefit readers of papers and increase the impact of the work. Note that the artifact evaluation cannot affect the acceptance of the paper, because it only happens after the decision about acceptance has been made.

The artifact evaluation process of SLE borrows heavily from the processes described at artifact-eval.org and from previous experience at SLE, ECOOP, and ICSME. The process is detailed below.

Submission

If and when your paper has been accepted for SLE 2018, you will be invited by the AEC chairs to submit the artifacts related to your work. This invitation will contain detailed instructions on how to submit your artifacts.

An artifact submission comprises the following components:

  • Paper: Preliminary PDF version of the accepted SLE 2018 paper. The paper will be used to evaluate the consistency of the accepted paper and the submitted artifact, as well as to assess whether the artifact lives up to the expectations created by the paper.
  • Authors of the artifact: This list may include people who are not authors of the accepted paper, but contributed to creating the artifact.
  • Abstract: A short description of the artifact to be used for assignments of artifacts to AEC members.
  • Artifact: An archive file (gz, xz, or zip) containing everything needed to support a full evaluation of the artifact (a purely illustrative README sketch follows this list). The archive file has to include at least the artifact itself and a text file README.txt that contains the following information:
    • An overview of the archive file documenting the content of the archive.
    • A setup / installation guide giving detailed instructions on how to set up or install the submitted artifact.
    • Detailed step-by-step instructions on how to reproduce any experiments or other activities that support the conclusions given in the paper.
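
To make these expectations concrete, the following is a purely illustrative sketch of a README.txt for a hypothetical tool artifact; all file, directory, and script names are placeholders:

    Artifact for "A Placeholder SLE 2018 Paper"

    1. Overview of the archive
       tool/          source code and binaries of the tool
       tutorial.pdf   tutorial with detailed usage instructions
       data/          raw data collected in the user study

    2. Setup / installation
       Import sle2018-artifact.ova (placeholder image name) into VirtualBox,
       or run install.sh on a recent Linux distribution.

    3. Step-by-step instructions
       For each experiment reported in the paper: the exact commands to run,
       the expected output, and the claim in the paper that it supports.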

If multiple artifacts relate to an accepted SLE paper, all artifacts should be collected in one archive and submitted together in one single submission. For instance, if a tool has been developed, a tutorial has been authored with detailed instructions on how to use the tool, and user studies have been performed for evaluating the tool’s properties, the tool, the tutorial, and the raw data collected in the user study should be packed in one archive file and submitted together in one single submission to the SLE 2018 artifact evaluation.

When preparing your artifact, consider that your artifact should be as accessible to the AEC as possible. In particular, it should be possible for the AEC to quickly make progress in the investigation of your artifact. Please provide some simple scenarios describing concretely how the artifact is intended to be used. For a tool, this would include specific inputs to provide or actions to take, and expected output or behavior in response to this input.

For artifacts that are tools, it is recommended to provide the tool pre-installed and ready to use in a virtual machine image (e.g., for VirtualBox, VMware, or SHARE), as a Docker image, or on a similar widely available platform.

Please use widely supported open formats for documents (e.g., PDF, HTML) and data (e.g., CSV, JSON).

Evaluation Process

Submitted artifacts will be evaluated by the AEC concerning the following criteria. Artifacts should be:

  • consistent with the paper,
  • as complete as possible,
  • well documented, and
  • easy to (re)use, facilitating further research.

Each submitted artifact will be evaluated by at least two members of the AEC. Artifacts will be treated as confidentially as the submitted papers.

Artifacts that pass the evaluation will receive an “Artifact Evaluated - Functional” badge and be invited for inclusion in the electronic conference proceedings published in the ACM Digital Library. Artifacts that will be included in the ACM Digital Library or that will be made permanently available in another publicly accessible archival repository will also receive the “Artifact Available” badge. Detailed definitions of these badges and the respective evaluation criteria may be found at the ACM Artifact Review Badging site.

The evaluation consists of two steps:

  1. Kicking-the-tires: Reviewers will check the artifact’s integrity and look for any possible setup problems that may prevent it from being properly evaluated (e.g., corrupted or missing files, VM won’t start, immediate crashes on the simplest example, etc.). In case of any problems, authors will be given a 48-hour period (September 14-15) to read and respond to the kick-the-tires reports of their artifacts and solve any issues preventing the artifact evaluation.
  2. Artifact assessment: Reviewers evaluate the artifacts and decide on the approval of the artifact.

Notification about the outcome of the artifact evaluation and reviews including suggestions for improving the artifacts will be distributed about one week before the deadline for the final version of the research paper, such that the outcome can be mentioned in the paper and the final artifact can be uploaded for inclusion in the ACM Digital Library.

Important Dates

  • August 31, 2018: Artifact submission
  • September 14, 2018: Kick-the-tires author response
  • October 10, 2018: Artifact notification

Artifact Evaluation Chairs: Matthew Roberts (Macquarie University) and Romina Eramo (University of L'Aquila)

Artifact Evaluation Committee

to be announced

Further Information

For further information on the artifact evaluation of SLE 2018, feel free to contact the artifact evaluation chairs by e-mail at matthew.roberts@mq.edu.au or romina.eramo@univaq.it.

SLEBoK Workshop

The Software Language Engineering Body of Knowledge (SLEBoK) is a community-wide effort to provide a unique and comprehensive description of the concepts, tools, and methods developed by the SLE community.

The SLEBoK workshop is a continuation of efforts from the SLEBoK Dagstuhl Seminar of 2017, as well as earlier brainstorming events like the SL(E)BOK workshop at SLE 2012. For more information about the workshop, visit the workshop’s web page.