Hi all,
At my company, we have software that generates an impressive number of PDFs for a small company: almost 40k/month.
One of the most used features lets users bulk-export documents for download. For instance: download 1,000 documents as PDFs bundled in a ZIP.
When I saw Prince I thought: "Wow, we could open 30 Prince processes at ~2 seconds per PDF and get a throughput of ~30 PDFs every 2 seconds. F**K yeah!"
But my tests showed a different reality: the best setup was 10 Prince processes running at the same time. Maybe we got hit by too much context switching.
Is there any way to create a Prince process pool and reuse Prince instances?
What is the best approach to maximize my throughput?
I'm using Node.js to spawn a new Prince process every time a PDF is requested.
Thanks
BTW: Prince is really impressive!