python - HappyBase and Atomic Batch Inserts for HBase


With the HappyBase API for HBase in Python, a batch insert can be performed as follows:

import happybase

connection = happybase.Connection()
table = connection.table('table-name')

batch = table.batch()
# put several rows into the batch via batch.put()
batch.send()

What happens in the event that the batch fails halfway through? Do the rows that had already been saved remain saved, while the rest simply aren't saved?

I noted on the HappyBase GitHub that the table.batch() method takes transaction and wal parameters. Can these be configured in such a way as to roll back the saved rows in the event that the batch fails halfway through?

Will HappyBase throw an exception here, permitting me to take note of the row keys and perform a batch delete?

Did you follow the tutorial on batch mutations in the HappyBase docs? It looks like you're mixing up a few things here. https://happybase.readthedocs.org/en/latest/user.html#performing-batch-mutations

Batches are purely a performance optimization: they avoid round-tripping to the Thrift server for each row stored or deleted, which may result in a significant speedup.
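
To illustrate that point, here is a minimal sketch of using a batch as a client-side buffer; the table name, row keys, and column family are placeholders, and a Thrift server on localhost is assumed. The batch_size parameter makes HappyBase flush automatically every N mutations, so 10,000 puts turn into a handful of round trips instead of one per row.

import happybase

connection = happybase.Connection()      # assumes a Thrift server on localhost
table = connection.table('table-name')   # placeholder table name

batch = table.batch(batch_size=1000)     # auto-flush every 1000 mutations
for i in range(10000):
    batch.put(b'row-%d' % i, {b'cf:col': b'value-%d' % i})
batch.send()                              # flush whatever is still pending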

The context manager behaviour (the with block), explained with numerous examples in the user guide linked above, is purely a client-side convenience API that makes application code easier to write and reason about. If the with block completes, the mutations are sent to the server in one go.
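
As a minimal sketch of that context-manager form (row keys and column names are placeholders):

with table.batch() as batch:
    batch.put(b'row-1', {b'cf:col1': b'value1'})
    batch.put(b'row-2', {b'cf:col1': b'value2'})
# the batch is sent automatically when the with block exits normally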

However... that's the happy path. What about the case where a Python exception is raised somewhere inside the with block? That's where the transaction flag comes into play: if True, no data is sent to the server at all; if False, the pending data is flushed anyway. Which behaviour is preferred depends on your use case.
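
As a sketch of the transaction=True case (placeholder rows; the exception is raised deliberately to simulate a failure halfway through):

try:
    with table.batch(transaction=True) as batch:
        batch.put(b'row-1', {b'cf:col1': b'value1'})
        raise RuntimeError('simulated failure halfway through')
        batch.put(b'row-2', {b'cf:col1': b'value2'})  # never reached
except RuntimeError:
    pass
# with transaction=True the pending mutations are discarded, so neither row
# reaches HBase; with the default transaction=False they would be flushed anyway

Note that this is still only client-side buffering: mutations that have already reached the server (for example after an explicit send() or a batch_size auto-flush) are not rolled back.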

