How can we upload huge data sets, say 1 lakh records, to the Kinvey database?
1) Using the client library and adding each record one by one to the collection?
2) Is there a way other than importing a CSV file through the Kinvey console?
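For context, option 1 over Kinvey's REST interface amounts to one HTTP request per record. A minimal sketch below; the app key, secret, and collection name are hypothetical placeholders, and the exact authentication scheme should be checked against Kinvey's REST API documentation:

```python
# Sketch of option 1: one POST per record via Kinvey's REST-style appdata
# endpoint. APP_KEY, APP_SECRET, and the collection name are hypothetical
# placeholders, not real credentials.
import base64
import json

APP_KEY = "<appKey>"        # placeholder
APP_SECRET = "<appSecret>"  # placeholder
URL = f"https://baas.kinvey.com/appdata/{APP_KEY}/myCollection"

def auth_header(user: str, password: str) -> dict:
    """Build a basic-auth header of the kind REST APIs like Kinvey's accept."""
    token = base64.b64encode(f"{user}:{password}".encode()).decode()
    return {"Authorization": f"Basic {token}",
            "Content-Type": "application/json"}

def save_record(record: dict, post) -> None:
    """POST a single record; `post` is injected (e.g. requests.post)
    so the network layer can be stubbed out."""
    post(URL, data=json.dumps(record),
         headers=auth_header(APP_KEY, APP_SECRET))

# 1 lakh records means 1 lakh round trips with this approach.
# Tiny demo with a stubbed `post` instead of a real network call:
sent = []
for i in range(3):
    save_record({"_id": str(i)}, post=lambda url, **kw: sent.append(url))
print(len(sent))  # 3
```

The drawback is visible in the loop: the number of HTTP round trips grows linearly with the record count, which is why the CSV import (option 2) or some form of batching is usually preferred for bulk loads.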
Any update on this?
Thank you for choosing Kinvey as your mBaaS, and sorry about the delay in responding. I probably missed it because it was filed under the Java category.
To answer your questions:
I have some additional questions. When you say "lakh", do you mean 100,000 bytes, records, individual data sets, or files?
Also, if the answer is individual data sets, what is the size of those data sets? If the size is not fixed, do you have an average?
What is the format of your data sets?
Once loaded, will you be adding to the data over time, or will this be a fixed amount of data that your application retrieves? Or is it both?
Thanks Wani and Billy for your reply.
We are in DEV Week now and the application is a new one. We are loading data records into Kinvey by pre-fetching third-party APIs and CSV data files, and we will save application-specific data into a Kinvey collection. To pre-fetch the data into Kinvey, we planned a scheduled process that hits the third-party API and saves the results to Kinvey (using the update collection API). So if there are 1 lakh records, the process will make 1 lakh service calls to Kinvey. Is this a good way to do it, or is there a better approach available?