The following options are never tested: -dPDFA and -dPDFX. The ps2write device is not tested either; I am adding it here because ps2write is a clone of pdfwrite and its testing is similar.
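As a sketch of what exercising these untested modes might look like, the command lines below build the three combinations named above. The `gs_cmd` helper, the `tiger.eps` input, and the output names are illustrative placeholders, not taken from the bug report.

```python
# Sketch: build Ghostscript command lines for the untested pdfwrite
# modes and for the ps2write clone. All file names are hypothetical.
GS = "gs"
INPUT = "tiger.eps"  # placeholder sample file

def gs_cmd(device, extra=(), out="out.bin"):
    """Build one Ghostscript command line for a device/mode combination."""
    return [GS, "-dBATCH", "-dNOPAUSE", f"-sDEVICE={device}",
            *extra, f"-sOutputFile={out}", INPUT]

untested = [
    gs_cmd("pdfwrite", ["-dPDFA"], "out-pdfa.pdf"),
    gs_cmd("pdfwrite", ["-dPDFX"], "out-pdfx.pdf"),
    gs_cmd("ps2write", [], "out.ps"),
]
```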
Setting P2 since it impedes development.
Setting priority to P4, since developers can (and do) use private scripts to run local regressions that test specific modes. Also, this bug report is incomplete: it does not document which source files are useful for PDF/A and which for PDF/X (all comparefiles, or a specific subset of PS or PDF files), nor which modes of PDF/A and PDF/X should be exercised. I am assigning this to myself and will also collect suggestions for specific 'spot check' devices and/or modes.

Other common devices not tested are ps2write, bbox, and tiff (with and without compression and strip-size limits). Also for consideration is running at least a few files through at the highest resolution used by our customers.

I've also changed the status to 'enhancement', since it is clearly more of an extension than a bug in the process that we run regularly.
It would be great to test all devices on a few sample files.
Regarding comment #2:

1. All comparefiles are worth testing with -dPDFA, -dPDFX, and ps2write.
2. -dPDFA and -dPDFX have no specific modes useful for testing. However, it would be useful to test pdfwrite with various distiller parameters (that's another story).
3. I think that besides the nightly regression test, we would like to see a weekly regression test with many devices and device parameters. If caspers overflows, we should buy more computers.
4. From last year's practice, P4 means "never". Please choose a better priority.
5. I guess "P4 enhancement" is set because Support can't handle it soon. I believe the main reason is the overcomplicated regression test script. Alex has a Linux script that is ten times simpler; it should likely be used here.
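The weekly device-and-parameter matrix suggested in points 2 and 3 could be enumerated along these lines. The device list and the -dPDFSETTINGS values are examples I chose, not a set prescribed anywhere in this report.

```python
from itertools import product

# Sketch: cross a few devices with a few distiller-parameter settings.
# Devices and parameter values here are illustrative examples only.
devices = ["pdfwrite", "ps2write", "bbox", "tiffg4"]
pdfsettings = ["/screen", "/printer", "/prepress"]

# Distiller parameters only make sense for the high-level devices;
# the raster/bbox devices get a single plain run each.
matrix = [
    (dev, f"-dPDFSETTINGS={s}")
    for dev, s in product(devices, pdfsettings)
    if dev in ("pdfwrite", "ps2write")
] + [(dev, None) for dev in ("bbox", "tiffg4")]
```

Each entry of `matrix` would then be turned into one gs invocation in the weekly run.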
Passing to Marcos for consideration and possible discussion or closing.
This has been partially done: we now test all devices with all the files in examples on a nightly basis. Unfortunately we don't have any way of comparing the output, so we only look for errors, segfaults, and lockups. ps2write testing has also been added to the nightly regression testing.

I've recently brought up another cluster node that I can use to run more weekly regressions, so if anyone has suggestions for other modes to test, please let me know.

I agree that testing PDF/A and PDF/X output would be desirable, since we know customers are using those features. However, it's hard to implement: as far as I know there aren't any command-line validators available for Linux (in theory it would be possible to set something up running Acrobat on a Mac OS X machine via AppleScript, but it would be kludgy).