Inria Lille - Nord Europe




Pharo is a pure object-oriented, reflective, and dynamic language inspired by Smalltalk. In addition, Pharo comes with a full, advanced programming environment developed under the MIT License. It provides a platform for innovative development both in industry and research. By providing a stable and small core system, excellent developer tools, and maintained releases, Pharo's goal is to be a platform to build and deploy mission-critical applications, while at the same time continuing to evolve. Web site: http://www.pharo.org - http://www.github.com/pharo-project/pharo
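Pharo's reflectivity means that classes, methods, and the metaclass hierarchy are themselves ordinary objects that can be queried and modified at runtime. A minimal sketch, evaluated in a Pharo Playground (the `double` method is a hypothetical example added on the fly; expected results are shown as comments):

```smalltalk
"Every value is an object, including classes themselves."
3 class.                    "SmallInteger"
3 class class.              "SmallInteger class (the metaclass)"
SmallInteger superclass.    "Integer"

"The running system can be queried and extended at runtime."
(3 respondsTo: #factorial).             "true"
Integer compile: 'double ^ self * 2'.   "adds a new method to Integer"
4 double.                               "8"
```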

Pharo 9.0 has 790 packages for 10,658 classes and 143,420 methods. It has around 250 forks on GitHub (https://github.com/pharo-project/pharo/), and Pharo is composed of several GitHub projects such as http://github.com/pharo-vcs, http://github.com/pharo-graphics, or http://github.com/pharo-spec. Pharo has around 18 regular contributors and up to 100 occasional ones. In addition, Pharo has many users (we roughly estimate around 10,000); it is taught in around 30 to 40 universities worldwide and used by around 30 research teams. Pharo runs on 11 OS/architecture combinations, including Linux, macOS, Windows, and Raspbian, on 32- and 64-bit Intel and ARM. Pharo development is driven by an industrial consortium, the Pharo consortium (http://consortium.pharo.org).


Moose (http://www.moosetechnology.org and https://modularmoose.org) is a data and software analysis platform. It is composed of multiple generic and scriptable engines (visualization, tool building, modular parsing, charting, etc.). Moose is used by several research groups, among which the Software Composition Group (Bern) and the group of A. Bergel (University of S.), and by a couple of companies. Moose is a meta tool builder in the sense that it allows one to define new dedicated tools. It was originally composed of Glamour, an IDE-building tool (developed by T. Girba); Roassal, a visualization DSL (developed by A. Bergel); PetitParser (developed by L. Renggli); and many libraries.

Moose is the supporting platform of more than 350 research publications (including Master's and PhD theses).

Based on the experience of Synectique (a startup based on Moose), our team started to rewrite Moose: during a first period, the meta-modeling layer was rewritten to support families of languages rather than a single universal language metamodel. Currently we are redesigning the complete tool suite.