BigQuery streaming: 'Failed to insert XX rows due to timeout'


In recent days, our streaming inserts have started failing with:

"failed insert xx rows. first error: {"errors":[{"reason":"timeout"}],"index":yy}"   

During the past half month of continuous streaming with an unchanged data source and program scripts, no such failure has occurred before.

project id: red-road-574

BigQuery team member here.

It looks like our documentation is a bit incorrect: you can have a partial commit of rows. We'll reject the whole request if it contains invalid rows (structure mismatch), but otherwise individual rows may fail while the rest are buffered.

In this case, the indicated rows failed to commit. If you have insert IDs you can retry just the failed rows, or retry the full request if desired (though each retried row counts against the table's quota).
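For illustration, here is a minimal sketch of that retry pattern using the google-cloud-bigquery Python client (the table name, dataset, and row fields are hypothetical, and the retriable-reason set is an assumption). The insert IDs let BigQuery deduplicate rows that are retried after a timeout.

```python
import uuid
from google.cloud import bigquery

client = bigquery.Client()
table_id = "red-road-574.my_dataset.my_table"  # hypothetical dataset/table

rows = [
    {"event": "click", "ts": "2024-01-01T00:00:00Z"},
    {"event": "view",  "ts": "2024-01-01T00:00:01Z"},
]
# One stable insert ID per row lets BigQuery deduplicate retried rows.
row_ids = [str(uuid.uuid4()) for _ in rows]

errors = client.insert_rows_json(table_id, rows, row_ids=row_ids)

# Each entry in `errors` has an "index" and a list of "errors" with a "reason",
# matching the error shown in the question. Retry only rows whose failure
# reason looks transient (assumed set of retriable reasons below).
retriable = {"timeout", "backendError", "internalError"}
failed_indexes = [
    e["index"]
    for e in errors
    if any(err.get("reason") in retriable for err in e["errors"])
]

if failed_indexes:
    retry_rows = [rows[i] for i in failed_indexes]
    retry_ids = [row_ids[i] for i in failed_indexes]
    retry_errors = client.insert_rows_json(table_id, retry_rows, row_ids=retry_ids)
```

Reusing the same insert ID on the retry is what keeps a retried row from being double-counted if the original insert actually succeeded server-side despite the timeout.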

The increased occurrence of these row-level errors is due to a change in how we handle batches of insertions. Previously, the entire request would have encountered the timeout.

Hope that helps. Sean

