[isabelle-dev] Using PolyML's memory consumption profiling on Isabelle
Makarius
makarius at sketis.net
Fri Oct 14 11:56:05 CEST 2011
On Fri, 14 Oct 2011, Andreas Schropp wrote:
> On 10/13/2011 01:24 PM, Thomas Sewell wrote:
>> There are surprisingly many dummy tasks.
>> [...]
>> 918632 Task_Queue.dummy_task(1)
>>
Such numbers always need to be put in relation. The original list looked
like this:
> 918632 Task_Queue.dummy_task(1)
> ...
> 13085440 Term.map_atyps(2)
This means the dummy_task entry is 2 orders of magnitude below the top
entry. Even if such an entry is reduced significantly, the overall impact
is very low on average. Addressing the lower entries is normally not worth
the effort.
> val dummy_task = Task(NONE, ~1)
>
> Values are not shared?! What the hell?
This looks like an older version. Thomas was referring to this one in
Isabelle/73dde8006820:
fun dummy_task () =
Task {group = new_group NONE, name = "", id = 0, pri = NONE, timing = new_timing ()};
Since the timing is a mutable variable here, it has to be created afresh
for each use -- in Future.value construction. Normally 1 million extra
allocations are not a big deal, but an experiment from yesterday shows
that there is in fact a measurable impact. See now Isabelle/2afb928c71ca
and the corresponding charts at
http://www4.in.tum.de/~wenzelm/test/stats/at-poly.html
I can only guess that allocation of mutable stuff costs extra.
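To illustrate the difference, here is a minimal SML-style sketch (the names and record fields are simplified for illustration and do not match the actual Isabelle sources): an immutable dummy task can be a single shared value, while a mutable timing field forces a fresh heap cell per call.

```sml
(* Immutable variant: allocated once, shared by all uses. *)
datatype task = Task of {name: string, id: int};
val dummy_task = Task {name = "", id = 0};

(* Mutable variant: the ref cell must be fresh for each task,
   so every call allocates a new record plus a new ref. *)
datatype task' = Task' of {name: string, id: int, timing: Time.time ref};
fun dummy_task' () =
  Task' {name = "", id = 0, timing = ref Time.zeroTime};
```

With a million Future.value constructions, the first variant touches the heap once, while the second performs a million record-plus-ref allocations, which is where the measurable impact comes from.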
Anyway, that is just a peephole optimization. The real improvements
usually come from looking at the big picture. The very introduction of
the dummy tasks for pre-evaluated future values was one such optimization.
Another was the introduction of the timing field for tasks, to improve the
overall scheduling and throughput of the worker thread farm that crunches
on these tasks.
Makarius