It's been a bit over a week since we all got back from ICFP in Baltimore. I thought I'd write up a little report with my perspective.
As usual it was great fun. I basically count it as a holiday, though not a very restful one, with a packed programme of talks and every other waking moment spent jabbering to friends and colleagues.
ICFP and its associated workshops
This year the conferences seemed to be arranged in a progression from the most highly academic and mathematical towards the more practical and commercial. I don't know if it was deliberate, but it seemed to work fairly well. People could arrive or leave at whatever point suited their interests. It was quite interesting to see how the mix of people changed through the week. I missed the metatheory, mathematically structured FP and generic programming workshops, but arrived in time for the main 3-day ICFP conference and stayed through to the end.
Well-Typed was pretty well represented at the conferences this year. Andres was on the ICFP programme committee and had a paper accepted for the Haskell Symposium. I was on the Haskell Symposium programme committee (but didn't review Andres et al.'s paper, of course!) and, along with Simon Marlow, I co-organised the Haskell Implementors' Workshop.
In theory I co-authored a presentation with Don Stewart on Hackage, Cabal and the Haskell Platform, though in practice he did it all and I just reviewed the slides and made a few suggestions. I had a slight feeling beforehand that there was not really that much to talk about, partly because I'm feeling a little frustrated that I have not been able to spend more time on Cabal. On reflection, however, there was plenty to say: we have made quite a bit of progress during the year, especially in establishing the platform as the way most people get their Haskelly goodness.
Colin Runciman, Don and I spent a good couple of hours plotting for a paper, perhaps for ICFP or the Haskell Symposium next year. I'm looking forward to working with Colin and Don on that. It's a nice bit of classic lazy functional programming, I think.
Simon Marlow and I declared that we would hand over the organisation of the Haskell Implementors' Workshop to a new team, and we've already got a couple of volunteers from this year's programme committee. So I had thought that I would not be organising anything for next year's ICFP in Japan. That was until Michael Sperber asked if I would like to help him organise the CUFP tutorials next year. If you were at ICFP in the last couple of years you may remember DEFUN, the functional programming developer tracks. This year they were rebranded as being part of CUFP. The idea is to appeal more to programmers using (or wanting to use) FP at work, and to help persuade managers that FP training is worthwhile.
A couple of papers from the Haskell Symposium that I particularly liked, or thought significant:
STG in Coq, or to give it its proper paper title, A Systematic Derivation of the STG Machine Verified in Coq, by Maciej Piróg and Dariusz Biernacki from the University of Wrocław in Poland. They presented a fragment of a bigger project to build a verified Haskell compiler, perhaps similar to Xavier Leroy's work on a verified C compiler. To verify that a compiler faithfully translates a program in a high-level language to a program with the same meaning in a low-level language, what you need is a proper formal connection between the high- and low-level languages. And of course it is not just one high-level and one low-level language, but a whole series of intermediate languages: Xavier's CompCert uses about a dozen, and production compilers are similar. GHC goes from Haskell, to Core (System FC), to STG, to C--, and finally into either C, LLVM or assembly. This paper focuses on STG, which is the language on the boundary between the functional world and the imperative world. From the functional side it is just a stylised subset of Core, but the same language also has an imperative semantics, given by an abstract machine, that explains how to execute it efficiently. The paper makes the formal connection between the functional and imperative semantics of the language. I'm looking forward to more work from this team.
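To give a flavour of what "a stylised subset of Core" means, here is a toy sketch of an STG-like syntax in Haskell. The constructor names are my own invention for this sketch, not the paper's (the paper's actual development is in Coq); the point is just that the imperative reading falls out of the syntax:

```haskell
-- Illustrative only: a toy fragment of an STG-like syntax.
-- Constructor names here are invented for this sketch.
data Atom = AVar String | ALit Int

data Expr
  = App String [Atom]       -- applications take only atomic arguments
  | Let String Lambda Expr  -- 'let' is the only construct that allocates
  | Case Expr [Alt]         -- 'case' is the only construct that evaluates
  | EAtom Atom

data Lambda = Lambda [String] Expr      -- argument binders and body
data Alt    = Alt String [String] Expr  -- constructor name, binders, rhs

-- Count 'let' bindings, i.e. heap allocations in the imperative reading.
allocations :: Expr -> Int
allocations (App _ _)  = 0
allocations (EAtom _)  = 0
allocations (Let _ (Lambda _ body) rest) =
  1 + allocations body + allocations rest
allocations (Case scrut alts) =
  allocations scrut + sum [ allocations rhs | Alt _ _ rhs <- alts ]
```

In this reading, let means "allocate a closure on the heap" and case means "force evaluation", which is exactly the boundary between the functional and imperative views that the paper formalises.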
J. Garrett Morris presented an experience report on Using Hackage to Inform Language Design. I thought this was great, but not just because he cites one of my old blog posts as inspiration! The basic idea is really simple: take advantage of the fact that we have a large amount of publicly available code in a standardised form to get empirical data to inform questions about language design. Getting lots of real data has not generally been the tradition in the programming language community, partly because it is so hard to get (but also because we have ideas about what programmers ought to do). He gave an example to do with the design of the type class system, and did a survey to see how overlapping instances are used in practice. The tools at this stage are a bit hacky but with a little work it could be improved and automated much more. I hope we will see more people taking this approach in future, especially to help the Haskell prime process.
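As a rough illustration of the kind of survey involved (this is a hypothetical sketch, not Garrett's actual tooling): given the source text of each module in an unpacked Hackage snapshot, one could check which modules enable a given extension:

```haskell
import Data.List (isInfixOf)

-- Hypothetical sketch, not the paper's actual tooling: does a module's
-- source text enable a given extension via a LANGUAGE pragma?
-- (A real survey would parse the pragmas properly, not substring-match.)
usesExtension :: String -> String -> Bool
usesExtension ext src = any pragmaLine (lines src)
  where
    pragmaLine l = "{-# LANGUAGE" `isInfixOf` l && ext `isInfixOf` l
```

Mapping something like this over every module on Hackage and counting distinct packages gives the raw numbers; the interesting (and harder) part of the survey is classifying how the overlapping instances are actually used.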
"The Future of Haskell" discussion
Traditionally, the Haskell Symposium ends with a long discussion entitled "The Future of Haskell". For the past few years this session has become less and less about the future direction of the language or about uncomfortable home truths and more about incremental changes and infrastructure (like Hackage and the Haskell Platform). Recognising this, the programme committee this year decided to scrap the future of Haskell discussion and just have short reports on the progress of the language standard i.e. Haskell 2010 and the future 2011/2012 revision.
For the implementors' workshop we decided to pick up the baton from the symposium and run a "Beyond Haskell" discussion. The idea was to be a bit less self-congratulatory, more forward looking and to pose uncomfortable questions. To kick things off we had Ben Lippmeier give a short intro. I didn't know beforehand what direction he would take, but we expected he'd do something interesting and we were not disappointed.
Ben talked about the problem of performance: we often know what ugly, fast, low-level program we want to write, but we have difficulty expressing it in a high-level way that we can reliably translate into the ugly fast version. So it's not that we cannot write fast programs; it's that we want to have our cake and eat it: to write nice programs and reliably generate fast ones from them. People who work on this often end up writing Haskell while constantly studying the generated Core to see why the transformation they wanted didn't quite work out. Reliability can be crucial: if it is a transformation that makes a 10x or 100x difference then you need to be sure that it is going to work. Perhaps not everyone worries about performance like this, but it struck a chord with me because it was more or less exactly the issue I had in mind when I started my PhD. I was working on partial evaluation, with the notion that the programmer would be able to control the compile-time transformations that generate the fast program from the nice program. I still think it's an approach worth investigating.
All the HIW slides and videos are on the HIW wiki page. Thanks to all the presenters for making their slides available and to Malcolm Wallace for videoing the whole event.
The Haskell "BoF" session
One of the things that CUFP started doing this year is "birds of a feather" (BoF) sessions. The idea is to get groups together for a couple of hours to discuss some topic of interest in the community. Bryan and Johan organised a session on "Haskell in the real world". It was a pretty interesting and useful discussion, I thought, particularly in relation to how we make improvements in infrastructure and attract volunteers to do that. We also talked quite a bit about what needs doing to keep up Haskell adoption, like coherence of the web presence, IDEs etc. Don did a good job as secretary and posted his notes afterwards.
Google Summer of Code
I was very pleased this summer to be involved with two GSoC projects and two excellent students. I was not technically the mentor in either case, but since they were both related to Cabal/Hackage, I could hardly not be involved!
What I was especially pleased about is that both of them came to the Haskell Implementors' Workshop to give presentations about their GSoC projects. The HIW programme committee were very supportive of their talk proposals. The talks were on topic (being about infrastructure), they were useful for disseminating news to the community, and having GSoC students attend is great for integrating them into the community.
The new hackage
Matt Gruen has been working on the new hackage server implementation, which will give us a decent extensible platform for adding the new hackage features that everyone has been clamouring for.
Matt has the new hackage server code running on sparky and has recently been working on the process of how we will transition from the old to the new server. If anyone wants to help him with that, I'm sure he would appreciate it. There are quite a few things to do. He's got a plan up on the wiki. You can find him by email or in the #hackage IRC channel on freenode.
Thomas Tuegel was working on "cabal test", a new Cabal feature that lets packages define test suites and have other tools run them and collect the results.
As anyone following the cabal-devel mailing list will have noticed from the deluge of patches, I finally finished reviewing and applying all of Thomas's cabal test patches. The plan is that this will be in Cabal-1.10.x which will come with GHC 7. If you watch Thomas's presentation you'll understand that one of the important features of the design is that we can have different protocols that test suites can support. So far we have two protocols, a basic one and a more detailed one. For the Cabal-1.10 release however we will enable just the basic "exitcode-stdio" test interface. We will continue to work on the more detailed interface in the development version of Cabal. In particular we are working with Max Bolingbroke, author of the popular test-framework package, to refine the interface for describing sets of tests.
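To give an idea of how minimal the basic interface is, here is a hypothetical sketch of a test suite using the "exitcode-stdio" protocol: the suite is just an executable whose exit code reports overall success or failure. The stanza shown in the comment reflects the design as it stands in the development version, so details may change before the 1.10 release:

```haskell
-- A minimal test suite for the basic "exitcode-stdio" interface: the
-- suite is just a program whose exit code reports success or failure.
-- A hypothetical stanza for it in the package's .cabal file might read:
--
--   test-suite unit-tests
--     type:          exitcode-stdio-1.0
--     main-is:       Tests.hs
--     build-depends: base
--
module Main (main) where

import System.Exit (exitFailure, exitSuccess)

-- Stand-in assertions; a real suite would exercise the package's API.
checks :: [Bool]
checks =
  [ reverse [1,2,3] == [3,2,1 :: Int]
  , sum [1..10]     == (55    :: Int)
  ]

main :: IO ()
main = if and checks then exitSuccess else exitFailure
```

The appeal of this protocol is that any existing test executable qualifies with no changes; the more detailed protocol is what will let tools enumerate individual tests and collect per-test results.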
Taken together, these two projects are an important step in our long term plan to make it easier to work out which are the high quality packages on hackage and to improve package quality overall.