
Custom endpoint within a custom endpoint

I see that because of recent pricing changes there is now a 2-second time limit on execution of an endpoint. We have a custom endpoint that runs a map-reduce to generate 5/15/60/240-minute time series data from the 1-minute data stored in our collections. The problem is that, because of the 2-second limit, the map-reduce can only process about 3,000-4,000 records, which is roughly 2-3 days of 1-minute data. That means the endpoint needs to be called many, many times, two days at a time (imagine generating the time series for 2 months).

One option would be to write a wrapper custom endpoint that in turn calls the actual custom endpoint, two days at a time. So my question really is: when the wrapper endpoint calls the map-reduce endpoint multiple times, can those calls be made asynchronously so that the wrapper endpoint doesn't get killed by the 2-second limit? In effect, I am trying to recreate a kind of batch process using two endpoints, with the wrapper endpoint triggering the inner endpoint multiple times and then returning without waiting for the inner endpoint instances to report their status. I can of course write the wrapper script outside Kinvey BL, but doing it within Kinvey seems like a more elegant solution until Kinvey provides some kind of batch processing.
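For illustration, the windowing the wrapper would have to do can be sketched in plain JavaScript. The `chunkRange` helper and the 2-day window size are assumptions for illustration, not a Kinvey API:

```javascript
// Split [startDate, endDate) into consecutive fixed-size windows so that
// each inner-endpoint call processes few enough records to finish inside
// the 2-second limit. `chunkRange` is a hypothetical helper, not Kinvey API.
function chunkRange(startDate, endDate, windowMs) {
  var chunks = [];
  var start = startDate.getTime();
  var end = endDate.getTime();
  for (var t = start; t < end; t += windowMs) {
    chunks.push({
      start_date: new Date(t).toISOString(),
      end_date: new Date(Math.min(t + windowMs, end)).toISOString()
    });
  }
  return chunks;
}

// Two months of 1-minute data split into 2-day windows -> ~30 inner calls.
var TWO_DAYS = 2 * 24 * 60 * 60 * 1000;
var windows = chunkRange(new Date('2013-01-01T00:00:00Z'),
                         new Date('2013-03-01T00:00:00Z'), TWO_DAYS);
```

Each element of `windows` would become the request body of one call to the inner map-reduce endpoint.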

Hey Pankaj,

The pricing update actually hasn't affected the BL timeout; it's always been 2 seconds. The only change there was adding longer timeout limits at the higher pricing tiers. That aside, let's see if we can resolve your issue.
Hi Caroline,

Thanks for the response. The thing is, I wasn't getting script-terminated errors before the new pricing policy came in. The script was running fine (with the occasional timeout), but since the new pricing, time series data generation has virtually come to a stop. On closer examination I can see that, given the collection has close to half a million entities, the query to filter out entities for map-reduce processing by itself consumes most of the time and triggers the 2-second limit. I would have done all this work in a batch/long-running operation, but since Kinvey doesn't yet support batch processing I am forced to do it inside the endpoint. I don't know whether batch processing is coming, but it would be great if the time limits were relaxed until that functionality arrives. Without that, it's virtually impossible to do any data mining/analysis.
Hey Pankaj,

Have you looked into pre-aggregating the data before inserting it into the Kinvey platform? Amazon EMR might be a solution for batch-style processing.
Hi Caroline,

As I mentioned in my previous email, the issue is that even a simple query like the one below (without any map-reduce processing) gets terminated many times if the collection being queried has half a million rows.

function onRequest(request, response, modules) {
    var logger = modules.logger;

    var queryStr = {"date_time": {"$gte": request.body.start_date, "$lte": request.body.end_date}};
    if (request.body.zone_name != null)
        queryStr._zone = request.body.zone_name;

    modules.collectionAccess.collection(request.body.collection).find(queryStr, function (err, docs) {
        if (err) {
            logger.error('Query failed: ' + err);
            return response.error(400);
        } else {
            return response.complete(200);
        }
    });
}
So while pre-aggregation and EMR might be worth exploring, if the bottleneck is the query and the collection size itself, then whatever I try I will eventually hit the termination limit as long as I use a custom endpoint. On the subject of queries: I assume there is no timeout limit if I make a direct REST API query call instead of routing it via a custom endpoint. Is that correct?
Yes, that is correct. There is no 2-second timeout on requests made via the REST API.
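For reference, the same date-range filter can be expressed as a REST API request URL instead of BL code. This is a sketch only: the base URL, app key, and collection name are placeholders, under the assumption that the filter is passed as URL-encoded JSON in a `query` parameter:

```javascript
// Build a data-store query URL carrying the same filter the endpoint used,
// bypassing business logic entirely. Base URL, app key, and collection
// name are placeholders for illustration.
function buildQueryUrl(baseUrl, appKey, collection, startDate, endDate, zone) {
  var filter = { date_time: { $gte: startDate, $lte: endDate } };
  if (zone != null) filter._zone = zone;
  return baseUrl + '/appdata/' + appKey + '/' + collection +
         '?query=' + encodeURIComponent(JSON.stringify(filter));
}

var url = buildQueryUrl('https://baas.kinvey.com', 'myAppKey', 'minute_data',
                        '2013-01-01T00:00:00Z', '2013-01-03T00:00:00Z', null);
```

The request would still need the usual authentication headers; only the filter construction is shown here.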