Hi all, the OCaml devs are looking for example Coq files that are memory intensive and relatively self-contained, in order to test Coq under Sandmark for performance monitoring. I've suggested @Jason Gross's performance suite [rewrite and tests] for now, but it'd be good to have more examples of particular files [as opposed to complete developments].
I guess that they could install a few dependencies and then test that particular file too; some library authors who may face memory pressure are @Ralf Jung @Jason Gross @Cyril Cohen (math-comp) @Yishuai Li @Yishuai Li (Gitter import) @Karl Palmskog [add your own]
I can try to extract a very small example of an arbitrarily memory-intensive file: it should be enough to use typeclasses to reify a term with hundreds or thousands of binders to PHOAS, which should be doable in at most a couple dozen lines of Coq code.
(I don't usually feel memory pressure, since I build on a machine with 64 GB RAM.)
Thanks @Jason Gross , I've let the OCaml devs know; by the way, I'm curious if you are taking OCaml itself into account in your extensive performance measurement? That could be an interesting axis to explore, let me know if you would be interested.
@Emilio Jesús Gallego Arias I've just pushed an example of 50 lines of code to https://github.com/coq-community/coq-performance-tests/blob/master/src/typeclass_reification_common.v ; running the files in https://github.com/coq-community/coq-performance-tests/tree/master/src/typeclass_reification shows that running time is cubic, peak memory usage is quadratic, and vo size is linear:
[attached plots: vo-size-bytes-vs.-n, real-time-vs.-n, mem-vs.-n (SVG and PNG versions)]
If you want an example taking 4 GB RAM peak memory usage, that should be around n = 950, where the running time (on my machine) will be about 18 minutes
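Given the stated fits (cubic time, quadratic memory), you can extrapolate from that single data point to pick an n for a given resource budget. A rough sketch in shell/awk, with the constants taken from the numbers above (purely illustrative, not a measured model):

```shell
# At n = 950: ~4 GB peak memory (quadratic in n), ~18 min wall time (cubic in n).
# Scale both by the stated asymptotics to estimate other values of n.
n=475   # half of 950: memory should drop 4x, time 8x
n0=950
awk -v n="$n" -v n0="$n0" 'BEGIN {
  printf "%.2f GB, %.2f min\n", 4.0 * (n / n0)^2, 18.0 * (n / n0)^3
}'
# -> 1.00 GB, 2.25 min
```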
Emilio Jesús Gallego Arias said:
by the way, I'm curious if you are taking OCaml itself into account in your extensive performance measurement
@Emilio Jesús Gallego Arias What do you mean by this? Do you mean the time of native_compute, or something else? Or the time taken by Coq depending on which version of OCaml compiled it?
@Jason Gross thanks! I'll forward this to the multicore devs; they use Slack, which is a bit hard to join due to invites etc. Maybe I can move the discussion here.
w.r.t. OCaml, I do mean the OCaml version [there are some variations there], but in particular flambda + optimization flags. There are quite a few.
@Emilio Jesús Gallego Arias I don't have any plans to try out different optimization flags of OCaml at present (most of my benchmarks are more about the asymptotic complexity than the constant factors anyway), but if you want to play around, the coq-performance-tests repo (which is not all of my experiments, but some of them) has autogenerated plots at https://github.com/coq-community/coq-performance-tests#performanceexperiments-plots ; the CI is set up to regenerate the plots on every commit with multiple versions of Coq, and you could, e.g., set up a fork or make a PR to add plots for other versions of OCaml, etc
Hey :wave: @Jason Gross
I'm a Multicore OCaml dev working on benchmarking Coq on multicore; I was trying to run the coq-performance-tests repo as a benchmark in Sandmark (https://github.com/ocaml-bench/sandmark).
I was following your note that the CI regenerates the plots on every commit for multiple Coq versions.
So I tried adding the one that works with multicore: https://github.com/ejgallego/coq/releases/tag/multicore-t01
I'm not sure where the graphs would be produced when I add a specific Coq version to the gh-pages repo?
Any info on this would be extremely helpful, thanks!!
@kumar The GitHub Actions CI (https://github.com/coq-community/coq-performance-tests/blob/3f0024a9e8543390ccb2b89f9c17ffef91eb8990/.github/workflows/coq.yml#L81-L88) is configured for the master branch to run make && make perf && make pdf && make doc, and then it runs make copy-pdf copy-doc copy-perf OUTPUT="doc-build/${COQ_VERSION}", which copies the graphs to the given output folder from the plots/ folder where they are generated by make pdf and make doc. The CI then pushes the doc-build folder to the gh-pages branch. Does this answer your question?
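The publish step is just a copy into a per-version folder; here is a toy shell sketch of that pattern (the version string and plot filename are made up, and this is not the repo's actual Makefile logic):

```shell
# Mimic the copy-perf step: plots generated under plots/ get copied
# into doc-build/${COQ_VERSION}, which the CI then pushes to gh-pages.
COQ_VERSION="v8.12"                     # illustrative value
mkdir -p plots "doc-build/${COQ_VERSION}"
echo '<svg/>' > plots/mem-vs-n.svg      # stand-in for a generated plot
cp -r plots "doc-build/${COQ_VERSION}/"
ls "doc-build/${COQ_VERSION}/plots"     # -> mem-vs-n.svg
```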
If you want to see all of the files we've generated, you can look at the various folders in the gh-pages branch at https://github.com/coq-community/coq-performance-tests/tree/gh-pages
@Jason Gross Does the Coq version have to be one of the enumerated ones?
Like, what if it's not 8.*?
Something like this: https://github.com/ejgallego/coq/releases/tag/multicore-t01
There are two requirements:
Requirement 1 is just a matter of adapting the CI infrastructure, while requirement 2 is about what version is reported by Coq itself. But we should be somewhat flexible there; at worst, I think, you'll just be missing a couple of tests that need a new version of Coq.
Jason Gross said:
kumar The GitHub Actions CI (https://github.com/coq-community/coq-performance-tests/blob/3f0024a9e8543390ccb2b89f9c17ffef91eb8990/.github/workflows/coq.yml#L81-L88) is configured for the master branch to run make && make perf && make pdf && make doc, and then it runs make copy-pdf copy-doc copy-perf OUTPUT="doc-build/${COQ_VERSION}", which copies the graphs to the given output folder from the plots/ folder where they are generated by make pdf and make doc. The CI then pushes the doc-build folder to the gh-pages branch. Does this answer your question?
@Jason Gross what I wanted to do was follow the above steps and generate files for a particular dev version of Coq. It was just to see what happens when I run with that Coq version.
Also, what would you suggest as a good file to use as a benchmark?
Also, while I was running make, I got this error:
Makefile:31: warning: overriding recipe for target 'perf'
Makefile:26: warning: ignoring old recipe for target 'perf'
Makefile:40: warning: overriding recipe for target 'install-perf'
Makefile:36: warning: ignoring old recipe for target 'install-perf'
make -C src
make[1]: Entering directory '/home/sk/coq-performance-tests/src'
coq_makefile -f _CoqProject -o Makefile.coq
make[1]: execvp: coq_makefile: Not a directory
make[1]: *** [Makefile:58: Makefile.coq] Error 127
make[1]: Leaving directory '/home/sk/coq-performance-tests/src'
make: *** [Makefile:7: src] Error 2
I don't have a good sense of what's different in this version of Coq, so not sure what would be a good benchmark
And that seems like a very strange error
I've fixed the warnings about overridden targets (oops, that was my bad)
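For reference, those warnings are what GNU make emits when the same target's recipe is defined twice; a minimal reproduction (assumes GNU make, demo Makefile written on the fly):

```shell
# Two recipe definitions for the same target trigger
# "overriding recipe" / "ignoring old recipe" warnings;
# make keeps the later one.
printf 'perf:\n\techo first\nperf:\n\techo second\n' > Makefile.demo
make -f Makefile.demo perf 2>&1
# -> warning: overriding recipe for target 'perf'
# -> warning: ignoring old recipe for target 'perf'
# -> second
```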
I'm still baffled by Not a directory; it seems to indicate something is broken with your install of Coq?
I've asked a question about this at https://stackoverflow.com/questions/63143267/what-does-not-a-directory-mean-from-execvp-when-the-filename-has-no-slashes
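For anyone following along: one common way ENOTDIR ("Not a directory") arises is when a component of an executed path is a regular file rather than a directory. A tiny shell demonstration of that errno mechanism (this shows how the error can happen in general, not necessarily what broke in the setup above, since the original call was an execvp with no slashes at all):

```shell
# Create a plain file, then try to execute a path that treats
# that file as a directory component; the kernel returns ENOTDIR.
touch plainfile.tmp
./plainfile.tmp/coq_makefile 2>&1 || true
# -> ...plainfile.tmp/coq_makefile: Not a directory
```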
@Emilio Jesús Gallego Arias are there any other Coq benchmarks that would be interesting for Multicore OCaml?
@Jason Gross I'm trying to reinstall Coq in a new switch to see if things can be resolved; maybe it's breaking because I was using the multicore-compatible version of Coq. I'm not sure, but I'll try to resolve this issue :)
@kumar that error is bizarre; did you properly install Coq so the binaries are on your PATH? If not, you may need to add the _build/install/default/bin dir to your PATH.
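A sketch of that PATH fix, using a stand-in script in place of the real coq_makefile binary (the _build/install/default/bin layout is what a dune-built Coq checkout produces; everything else here is fabricated for the demo):

```shell
# Simulate a locally built binary living in _build/install/default/bin,
# then put that directory on PATH so the shell can find it by name.
mkdir -p _build/install/default/bin
printf '#!/bin/sh\necho coq_makefile-found\n' > _build/install/default/bin/coq_makefile
chmod +x _build/install/default/bin/coq_makefile

export PATH="$PWD/_build/install/default/bin:$PATH"
coq_makefile   # -> coq_makefile-found
```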
I'm away for a couple of weeks but I can have a look after.
I think some interesting developments to check are the fourcolor / odd-order ones, Iris, CompCert, and fiat-crypto, though Jason's tests may cover some of the patterns of the latter.
Sure, I'm currently doing a fresh install of Coq; I'll try that again.
And also check out the projects you mentioned.
Thanks for the suggestions and help!!
@Emilio Jesús Gallego Arias I was recently coming back to Coq benchmarks and I tried to pin the multicore tag (https://github.com/ejgallego/coq/tree/multicore-t01),
but it errors out with the following; any idea why?
[coq.dev] downloaded from https://github.com/ejgallego/coq/releases/tag/multicore-t01
[ERROR] Error getting source from https://github.com/ejgallego/coq/releases/tag/multicore-t01:
- Unknown archive type: /tmp/opam-19447-1872ec/multicore-t01
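For what it's worth, that release-tag URL serves an HTML page, not a source archive, which would explain opam's unknown-archive complaint. Two alternatives opam does understand, sketched below (the URL patterns are standard GitHub conventions and the package name is assumed; not verified against this particular repo):

```shell
# Pin to the git tag directly (opam fetches via git, no archive involved):
opam pin add coq.dev "git+https://github.com/ejgallego/coq.git#multicore-t01"

# Or pin to GitHub's auto-generated source tarball for that tag:
opam pin add coq.dev "https://github.com/ejgallego/coq/archive/refs/tags/multicore-t01.tar.gz"
```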
Umm, that's bizarre; I'm still on holidays, maybe you could ask on the opam channel?
Last updated: Nov 29 2023 at 19:01 UTC