BigQuery streaming: 'Failed to insert XX rows due to timeout'
In recent days, our streaming inserts have started failing with:
"failed insert xx rows. first error: {"errors":[{"reason":"timeout"}],"index":yy}"
Over the past half month of continuous streaming with an unchanged data source and program scripts, no such failure had occurred.
Project ID: red-road-574
BigQuery team member here.
It looks like our documentation is a bit incorrect: an insert can have a partial commit of rows. We'll reject the whole request if there are invalid rows (structure mismatch), but individual rows may still fail to be buffered.
In this case, the indicated rows failed to commit. If you have an insert ID, you can retry just the failed rows, or retry the full request if desired (though each retried row counts against the table quota).
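The selective-retry approach above can be sketched as follows. This is a minimal illustration, not official client code: `insert_fn` stands in for whatever performs the streaming insert (e.g. a wrapper around the `tabledata.insertAll` API), and it is assumed to return the `insertErrors`-style list of `{"index": ..., "errors": [...]}` dicts shown in the question, or an empty list on full success.

```python
def failed_rows(rows, insert_errors):
    """Return only the rows whose indexes appear in the insertAll error list."""
    failed_indexes = {err["index"] for err in insert_errors}
    return [row for i, row in enumerate(rows) if i in failed_indexes]

def insert_with_retry(insert_fn, rows, max_attempts=3):
    """Insert rows, retrying only the rows reported as failed.

    insert_fn(rows) -> list of {"index": int, "errors": [...]} dicts
    (empty list means every row committed). Returns the rows that
    still failed after max_attempts.
    """
    pending = list(rows)
    for _ in range(max_attempts):
        if not pending:
            break
        errors = insert_fn(pending)
        # Note: indexes in the response refer to this request's row order,
        # so we re-derive the pending set from the latest request.
        pending = failed_rows(pending, errors)
    return pending
```

Supplying a stable insert ID per row (rather than generating a new one on each attempt) is what makes this retry safe, since BigQuery uses it for best-effort deduplication.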
The increased occurrence of these row-level errors is due to a change in how we handle batches of insertions. Previously, the entire request would have encountered the timeout.
Hope this helps. Sean