Hi All,
I have a question regarding an issue I'm dealing with right now:
Are there any best practices for inserting huge amounts of data into the cluster?
My test project uses a single-table model that is expected to receive thousands to millions of rows every X minutes. An external application will perform the data insertion, and Java is a compulsory choice.
How would you approach this kind of workload? I want to take advantage of the existing APIs, perhaps ClusterJ / ClusterJPA, but are there any recommendations for the insertion itself? Can a batch-like approach be used (and if so, how)?
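For what it's worth, here is a minimal sketch of how a batched insert loop could look: split the incoming rows into fixed-size chunks and persist one chunk per transaction. The ClusterJ-specific calls are shown in comments (they need a running cluster and the clusterj jar); `MyRow` and its setters are hypothetical names, and the chunking helper itself is plain Java.

```java
import java.util.ArrayList;
import java.util.List;

// Sketch of a batched-insert loop for bulk loading: chunk the rows,
// then persist one chunk per transaction to bound transaction size
// and memory use. ClusterJ calls are comments because they require
// a live cluster; the chunking logic is self-contained plain Java.
public class BatchInsertSketch {

    // Split a list into consecutive chunks of at most batchSize elements.
    static <T> List<List<T>> partition(List<T> rows, int batchSize) {
        List<List<T>> chunks = new ArrayList<>();
        for (int i = 0; i < rows.size(); i += batchSize) {
            chunks.add(rows.subList(i, Math.min(i + batchSize, rows.size())));
        }
        return chunks;
    }

    public static void main(String[] args) {
        List<Integer> rows = new ArrayList<>();
        for (int i = 0; i < 10_000; i++) rows.add(i);

        for (List<Integer> chunk : partition(rows, 1_000)) {
            // With ClusterJ (MyRow is a hypothetical @PersistenceCapable class):
            //   session.currentTransaction().begin();
            //   List<MyRow> batch = new ArrayList<>();
            //   for (Integer v : chunk) {
            //       MyRow r = session.newInstance(MyRow.class);
            //       r.setId(v);
            //       batch.add(r);
            //   }
            //   session.makePersistentAll(batch); // flushes the batch together
            //   session.currentTransaction().commit();
            System.out.println("would insert " + chunk.size() + " rows");
        }
    }
}
```

Keeping each transaction to a bounded chunk (a few hundred to a few thousand rows, to be tuned) is the usual way to avoid oversized transactions while still amortizing round-trip cost.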
Thank you all for your feedback!