Hi Kinvey team
The data browser view in the console has become very slow recently, and I have realized that even though the view shows only the first few characters of each column, it actually loads ALL the data for every row and column. This is a huge burden while loading the view, and the browser is also reaching the limit of what it can display: megabytes of data are injected into the DOM and probably reflowed on every scroll. If this is your approach in the console, it will obviously not scale, and I am very concerned about how I am going to keep using Kinvey in the future.
Can you tell me how many rows there are in your collections on average? And roughly how much data does that amount to?
Also, did the browser start lagging only after the number of rows increased, or is that a general observation regardless of row count?
So far, one of my collections has about 40 rows totaling roughly 6 MB of data. All of it is loaded into the data browser upfront and written into the DOM. The browser already lags when I scroll that collection, and searching the page with Ctrl+F is almost impossible.
Hi, any update on this please?
I discussed this with the Engineering team and have asked them to look into ways to optimize the loading and viewing of data in the Kinvey console.
Could you suggest what improvements you would like to see in the Kinvey console so that I can follow up with the Engineering team?
This leaves me somewhat perplexed: does it mean that nobody is really using Kinvey as their production backend? Because if anybody has more than a couple of MB of data, and the console data browser loads ALL of it at once and even injects it into the DOM, then the data browser is clearly unusable (and crashes the browser) by design.
I will be more than happy to suggest an improvement, but I am sure (I hope) Kinvey engineers don't really need my advice on such a straightforward design problem. Just don't load all the rows at once, and limit the content of every field you load unless the user asks for the detail of that field.
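To illustrate the suggestion above, here is a minimal client-side sketch of the two fixes: paginate the rows, and truncate each cell to a short preview before it ever touches the DOM, fetching the full value only when the user opens that cell. The names `pageOf` and `previewCell` and the constants are illustrative assumptions, not Kinvey APIs.

```typescript
// Hypothetical sketch: page through rows and show only short cell
// previews. Full field values would be requested separately on demand.

const PAGE_SIZE = 25;      // rows rendered per page (assumed value)
const PREVIEW_CHARS = 80;  // max characters shown per cell (assumed value)

interface Row { [column: string]: unknown; }

// Reduce a cell value to a short, DOM-friendly preview string.
function previewCell(value: unknown, maxChars: number = PREVIEW_CHARS): string {
  const text = typeof value === "string" ? value : JSON.stringify(value);
  return text.length <= maxChars ? text : text.slice(0, maxChars) + "…";
}

// Return one page of rows with every cell truncated to a preview.
function pageOf(rows: Row[], page: number, pageSize: number = PAGE_SIZE): Row[] {
  const start = page * pageSize;
  return rows.slice(start, start + pageSize).map(row =>
    Object.fromEntries(
      Object.entries(row).map(([col, val]) => [col, previewCell(val)])
    )
  );
}
```

With this shape, a 40-row collection holding 6 MB would put only a few kilobytes of preview text into the DOM per page; the same `limit`/`skip`-style pagination could equally be pushed down to the backend query instead of being done client-side.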