Processing a high volume of records with Cassandra: getting BusyPoolException


I need to process millions of records.

What is an efficient way to do this using phantom?

This is how the table looks:

CREATE TABLE my_table (
  key text,
  timestamp bigint,
  value double,
  PRIMARY KEY (key, timestamp)
);

I am executing one query per unique key (and the number of keys can reach millions):

SELECT * FROM my_table WHERE key = 'xyz' AND timestamp >= 1504660890 AND timestamp < 1504670890;

However, at some point I start getting a BusyPoolException.
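BusyPoolException is thrown by the underlying DataStax Java driver when more requests are in flight than the connection pool can queue, which is what happens when millions of async queries are fired without backpressure. The usual fix is to bound the number of concurrent queries (optionally alongside raising the driver's `PoolingOptions` limits). Below is a minimal sketch of the throttling idea in plain Java: a `Semaphore` caps in-flight work, and `fetchRange` is a dummy stand-in for the real per-key range query (the permit count of 32, the method names, and the dummy result are all illustrative, not from the original post):

```java
import java.util.List;
import java.util.concurrent.CompletableFuture;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Semaphore;
import java.util.stream.Collectors;
import java.util.stream.IntStream;

public class ThrottledQueries {
    // Cap on concurrent in-flight queries; tune so the driver's
    // connection pool is never saturated (32 is illustrative).
    private static final Semaphore inFlight = new Semaphore(32);

    // Daemon threads so the JVM can exit without an explicit shutdown.
    private static final ExecutorService pool =
            Executors.newFixedThreadPool(8, r -> {
                Thread t = new Thread(r);
                t.setDaemon(true);
                return t;
            });

    // Hypothetical stand-in for the real per-key range query;
    // replace with the actual phantom/driver call.
    static CompletableFuture<Integer> fetchRange(String key) {
        return CompletableFuture.supplyAsync(() -> {
            try {
                Thread.sleep(1); // simulate I/O latency
            } catch (InterruptedException e) {
                Thread.currentThread().interrupt();
            }
            return key.length(); // dummy result
        }, pool);
    }

    // Block submission when the cap is reached; release when done.
    static CompletableFuture<Integer> throttled(String key) {
        inFlight.acquireUninterruptibly();
        return fetchRange(key).whenComplete((r, t) -> inFlight.release());
    }

    public static List<Integer> runAll(List<String> keys) {
        List<CompletableFuture<Integer>> futures = keys.stream()
                .map(ThrottledQueries::throttled)
                .collect(Collectors.toList());
        return futures.stream()
                .map(CompletableFuture::join)
                .collect(Collectors.toList());
    }

    public static void main(String[] args) {
        List<String> keys = IntStream.rangeClosed(1, 1000)
                .mapToObj(i -> "key-" + i)
                .collect(Collectors.toList());
        System.out.println(runAll(keys).size());
    }
}
```

The same pattern carries over to phantom's Scala futures: acquire a permit before issuing each query and release it in `onComplete`. An alternative with the same effect is to split the keys into fixed-size batches and only start the next batch when the current one completes.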

