I’m experiencing Heroku H18 errors when uploading large files and passing them to an S3 bucket. It happens intermittently. The Heroku docs say the following:
Usually, an H18 indicates that a response has multiple stages – for
instance, streaming chunks of a large response – and that one of those
stages has thrown an error.
The docs further suggest searching the logs for the response’s request_id to determine which stage/chunk of the response threw the error and fixing that problem. So far I have been unsuccessful in doing so.
As for configuring the timeout to accommodate large files, Heroku says it’s not possible:
The timeout value is not configurable. If your server requires longer
than 30 seconds to complete a given request, we recommend moving that
work to a background task or worker to periodically ping your server
to see if the processing request has been finished. This pattern frees
your web processes up to do more work, and decreases overall
application response times.
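The background-worker pattern Heroku describes can be sketched with the standard library alone: the web request enqueues the slow work, returns a job id immediately, and the client polls for completion. Everything here is illustrative — the in-memory `jobs` dict, `start_upload_job`, and `poll` are hypothetical names, and a real app would use a queue (Sidekiq, Celery, etc.) and a shared store like Redis rather than a thread and a dict.

```python
import threading
import time
import uuid

# Hypothetical in-memory job store; in production this would be
# Redis or a database shared between web and worker processes.
jobs = {}

def start_upload_job(data):
    """Enqueue the slow transfer and return a job id right away,
    so the web request finishes well under Heroku's 30-second limit."""
    job_id = str(uuid.uuid4())
    jobs[job_id] = {"status": "processing"}

    def work():
        time.sleep(0.1)  # stand-in for the slow S3 transfer
        jobs[job_id]["status"] = "done"

    threading.Thread(target=work).start()
    return job_id

def poll(job_id):
    """What a /status/<job_id> endpoint would return to the client."""
    return jobs[job_id]["status"]

job = start_upload_job(b"...file bytes...")
while poll(job) != "done":  # the browser would poll on an interval instead
    time.sleep(0.05)
```

The point of the pattern is that the dyno answers the HTTP request immediately; only the worker spends more than 30 seconds on the transfer.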
They also suggest using a “direct” file upload method, passing the file from the browser straight to S3.
Sounds like poppycock to me. But I haven’t found a workaround yet.