Hi,
We're noticing that Prince requires large amounts of memory to process large (but, in our opinion, not excessively large) HTML files.
We have an example of a 43.3 MB HTML file that produces a 359-page PDF.
Prince 14.2 needs 4.3 GB to process this file.
We determined the amount of memory needed by limiting the available memory with ulimit: processing succeeds with ulimit -Sv 4300000 set, and fails with "Mercury runtime: Could not allocate 32 bytes, exiting." when ulimit -Sv 4200000 is set first.
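For reference, this is roughly how we measure it (a sketch; the prince invocation and file names shown in the comment are placeholders for our actual command line):

```shell
run_with_vmem_limit() {
  # Run "$@" in a subshell with a soft virtual-memory cap of $1 KiB.
  # The subshell confines the ulimit change to this one invocation,
  # so the parent shell's limits are untouched.
  local kb="$1"; shift
  ( ulimit -Sv "$kb"; "$@" )
}

# With the figures from this report (placeholder invocation):
#   run_with_vmem_limit 4300000 prince input.html -o output.pdf  # succeeds
#   run_with_vmem_limit 4200000 prince input.html -o output.pdf  # Mercury runtime error
run_with_vmem_limit 4300000 echo "limit applied"
```

Lowering the limit in steps like this gives the approximate minimum memory the conversion needs.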
After experimenting with the input file, we suspect that the memory use is linked to the size of the DOM. We are wondering whether this can be improved, allowing us to generate the result with less memory available.
I have anonymized the example, so I can share it if you want.
With kind regards,
Nick Hofstede