Titanium SDK version: 1.6.1
iPhone SDK version: 4.2
I am trying to build a solution to cache JSON calls. I have made a first attempt that does the job, but is there a better solution? I am using text files to save the JSON output; is this OK performance-wise?
http://pastie.org/1734763
Thankful for any feedback!
I think that'd be OK. As long as the files aren't massive in number or size, it should perform quite well.
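For reference, the text-file variant looks roughly like this in Titanium (the file name and the expires field are assumptions about your payload; your pastie has the authoritative version):
// Rough sketch of a file-based cache, assuming the server response carries
// an `expires` timestamp in milliseconds.
var cacheFile = Titanium.Filesystem.getFile(Titanium.Filesystem.applicationDataDirectory, 'feed.json');

function readCachedJson() {
    if (!cacheFile.exists()) { return null; }
    var cached = JSON.parse(cacheFile.read().text);
    // Only return the cached copy while it is still fresh.
    return (cached.expires > new Date().getTime()) ? cached : null;
}

function writeCachedJson(responseText) {
    cacheFile.write(responseText);
}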
The other approach you could try, if you decide you're not happy with performance or want to maintain less code, is to use Titanium's app Properties storage, which persists data beyond app sessions.
var data = JSON.parse(this.responseText);
Titanium.App.Properties.setString('jsonResponse', this.responseText);
Titanium.App.Properties.setInt('expires', data.expires);
Then before you make your request you can check if the cache is indeed stale:
var expires = Titanium.App.Properties.getInt('expires');
var current_time = new Date().getTime(); // current time in milliseconds
if (expires > current_time) {
    // Cache is still valid
    var response = Titanium.App.Properties.getString('jsonResponse');
    var obj = JSON.parse(response);
} else {
    // Cache is stale - query for new data
}
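Putting the two halves together with Titanium's HTTP client, a minimal cached-fetch helper might look roughly like this (the loadJson name, the onData callback and the expires field in the server response are assumptions, not part of your code):
// Minimal sketch: serve from app Properties while fresh, otherwise re-fetch.
function loadJson(url, onData) {
    var expires = Titanium.App.Properties.getInt('expires') || 0;
    if (expires > new Date().getTime()) {
        onData(JSON.parse(Titanium.App.Properties.getString('jsonResponse')));
        return;
    }
    var client = Titanium.Network.createHTTPClient();
    client.onload = function() {
        var data = JSON.parse(this.responseText);
        Titanium.App.Properties.setString('jsonResponse', this.responseText);
        Titanium.App.Properties.setInt('expires', data.expires);
        onData(data);
    };
    client.open('GET', url);
    client.send();
}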
Related
I am currently working on performance improvements for a React-based SPA. Most of the basic work is already done, so I started looking into more advanced things such as service workers.
The app makes quite a lot of requests on each page (most of the calls are not to REST endpoints but to an endpoint that basically runs different SQL queries against the database, hence the number of calls). The data in the DB is not updated too often, so we have a local cache for the responses, but it is obviously lost when a user refreshes the page. This is where I wanted to use a service worker: to keep the responses either in the cache store or in IndexedDB (I went with the second option). And, of course, the cache-first approach does not fit well here, as there is still a chance that the data may become stale. So I tried to implement the stale-while-revalidate strategy: fetch the data once, then, if the response for a given request is already in the cache, return it, but still make a real request and update the cache just in case.
I tried the approach from Jake Archibald's offline cookbook, but it seems like the app still waits for the real requests to resolve even when there is a cache entry to return (I can see those responses in the Network tab).
Basically the sequence seems to be the following: request > cache entry found! > need to update the cache > only then show the data. Doing the update immediately is unnecessary in my case, so I was wondering if there is any way to delay it? Or, alternatively, not to wait for the "real" response to resolve?
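For reference, the GET-and-Cache-API shape of the pattern from the cookbook looks roughly like this (the cache name is just an example); my case differs because of the POST endpoints and IndexedDB:
// Canonical stale-while-revalidate for GET requests with the Cache API.
self.addEventListener('fetch', (event) => {
  event.respondWith(
    caches.open('api-cache').then((cache) =>
      cache.match(event.request).then((cached) => {
        // Kick off the network request regardless, to refresh the cache.
        const network = fetch(event.request).then((response) => {
          cache.put(event.request, response.clone());
          return response;
        });
        // Serve the cached copy immediately if there is one.
        return cached || network;
      })
    )
  );
});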
Here's the code that I currently have (serializeRequest, cachePut and cacheMatch are helper functions I use to communicate with IndexedDB):
self.addEventListener('fetch', (event) => {
  // some checks to get out of the event handler if certain conditions don't match...
  const request = event.request;
  event.respondWith(
    serializeRequest(request).then((serializedRequest) => {
      return cacheMatch(serializedRequest, db.post_cache).then((response) => {
        // Always kick off the real request so the cache gets refreshed.
        const fetchPromise = fetch(request).then((networkResponse) => {
          // Cache the fresh network response (not the cached one).
          cachePut(serializedRequest, networkResponse.clone(), db.post_cache);
          return networkResponse;
        });
        // Answer from the cache immediately when possible, otherwise wait for the network.
        return response || fetchPromise;
      });
    })
  );
});
Thanks in advance!
EDIT: Could this be due to the fact that I put the responses into IndexedDB instead of the cache? I am more or less forced to use IndexedDB instead of the Cache Storage because those "magic endpoints" take POST instead of GET requests (they require a body), and POST responses cannot be put into the Cache Storage...
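(For illustration only: by "serializing" I mean building an IndexedDB key out of the request's method, URL and body, roughly like the sketch below; the names here are made up, not my actual helper.)
// Hypothetical shape of a serializeRequest-style helper: POST bodies have to be
// read asynchronously, so the key is produced inside a promise.
function makeCacheKey(request) {
  return request.clone().text().then((body) => ({
    url: request.url,
    method: request.method,
    body: body, // part of the key, since different bodies mean different queries
  }));
}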
I am trying to measure how long it takes for images to load in a React Native app on my users' devices in different countries.
In debug mode there is performance.now(), which creates a timestamp that I then send to Amplitude as a property of the event.
But performance.now() is a JS method and is not available in release builds for users. There is an undocumented global.nativePerformanceNow method, so I tried that:
const loadStartAmplitudeEvent = () => {
  if (R.not(__DEV__)) {
    const timeStamp = global.nativePerformanceNow();
    amplitude.logEvent('Photo On Load Start', {
      uri, timeStamp,
    });
  }
};
That's how I create an event with a timestamp to send to Amplitude, but I get an error. What am I doing wrong? Should I use some other method? Is the global.nativePerformanceNow → g.nativePerformanceNow transformation messing it up? Thanks a lot!
2019-08-06 03:10:45.134 [error][tid:com.facebook.react.JavaScript]
g.nativePerformanceNow is not a function.
(In 'g.nativePerformanceNow()', 'g.nativePerformanceNow' is undefined)
Seems like the feature was removed from RN in mid-October 2018...
This crashes on a device (it works in the Chrome dev tools, which is a bit meh):
https://github.com/facebook/react-native/issues/27274#issuecomment-557586801
But it also seems it has been brought back from the dead recently.
So maybe the next React Native release will include a global performance object with performance.now():
https://github.com/facebook/react-native/commit/232517a5740f5b82cfe8779b3832e9a7a47a8d3d
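Until then, a defensive fallback seems reasonable: prefer performance.now() when it exists, then the undocumented native hook, and finally Date.now() (millisecond resolution only). A rough sketch:
// Rough sketch of a safe "now" helper for release builds; nativePerformanceNow
// is undocumented and may be absent, so every branch is guarded.
const now = () => {
  if (global.performance && typeof global.performance.now === 'function') {
    return global.performance.now();
  }
  if (typeof global.nativePerformanceNow === 'function') {
    return global.nativePerformanceNow();
  }
  return Date.now();
};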
My Google Apps Script web app has recently been hitting QPS limits. What would be a better way to improve performance?
I have about 50 active users. I use a 15,000-row Google Spreadsheet as a database, and my app serves JSON data requested by users from that spreadsheet. I use long polling to keep the connection alive for 5 minutes and close it if no update to the spreadsheet happens; then the client reconnects. The web app is published to be executed as me.
My polling works like this:
function doGet(e) {
  var cache = CacheService.getScriptCache();
  var userVersion = parseInt(e.parameter.userVersion, 10);
  var runningTime = 0;
  while (runningTime < 300001) {
    var currentServerVersion = parseInt(cache.get("currentVersion"), 10);
    if (userVersion < currentServerVersion) {
      var returnData = [];
      for (var i = userVersion + 1; i <= currentServerVersion; i++) {
        var newData = cache.get(i);
        if (newData != null) { returnData.push(JSON.parse(newData)); }
      }
      return ContentService.createTextOutput(JSON.stringify({ currentServerVersion: currentServerVersion, data: returnData }))
        .setMimeType(ContentService.MimeType.JSON);
    } else {
      Utilities.sleep(20000);
    }
    runningTime = calculateRunningTime(); // helper that returns elapsed time in ms
  }
}
What I have tried so far:
1) I optimized requests with CacheService to reduce calls to the spreadsheet. It helped for a few months, but now I'm getting QPS errors more and more often.
2) Asking the Google team about quotas. They explained to me that there are no published quotas/limits for simultaneous executions and that they are subject to change without notice. They advised continued use of CacheService and better error handling.
I am thinking of switching from long polling to short polling, but it feels like a step backwards. Should I try to further optimize performance or move to another service?
Would trying to use "execute the app as the user accessing the app" help? (All users should use the same database.)
Is the Google Apps Script API Executable different from a Web App? It looks like it might fit, but I'm not sure if they share the same QPS quotas.
I'm also considering the GAE service, but I'd like to stay within the free quota.
Any advice will be much appreciated!
I think that the following part can be improved. When data is retrieved from the cache service, getAll() is more effective than get(). I have measured the difference before: getAll() was about 890 times faster than repeated calls to get(). If the amount of data being retrieved from the cache service is large, I think improving this part is important for performance.
Your script:
var returnData = [];
for (var i = userVersion + 1; i <= currentServerVersion; i++) {
  var newData = cache.get(i);
  if (newData != null) { returnData.push(JSON.parse(newData)); }
}
Improved script:
var ar = [];
for (var i = userVersion + 1; i <= currentServerVersion; i++) {
  ar.push(String(i)); // CacheService keys are strings, so convert the version numbers
}
var values = cache.getAll(ar); // one call instead of one get() per key
var returnData = [];
for (var j = 0; j < ar.length; j++) {
  if (values[ar[j]] != null) { returnData.push(JSON.parse(values[ar[j]])); }
}
Since I cannot see your data, I cannot confirm this by running it, so if errors occur, please tell me.
If I have misunderstood your question, I'm sorry.
I am using an API, and the request-response time of the API is too long. Because of this, I want to store the response on the server, not in the browser, so that I can display the result from the cache for similar searches. I know there are limitations on the accuracy of cached data, but that's a separate issue I don't want to get into here.
I am using the Laravel framework and, for the moment, am using this code as shown in the Laravel documentation.
$expiresAt = Carbon::now()->addMinutes(10);
Cache::put('key', 'value', $expiresAt);
The problem with this code is that it stores the cache in the browser only, but I want to store it on the server. I have heard about Memcached but could not implement it. I have also heard of apc_store(), but I think that stores data locally. So, how can I store the cache on the server?
Similar to Cache::put(), you can use Cache::get() to check for data saved (on the server). (Cache::pull() also works, but it removes the entry after reading it, so the cache would only serve every other request.)
// Check the cache for data
$cachedData = Cache::get($key);

// Fetch new data if there is no cache entry
if (! $cachedData) {
    $cachedData = 'New Data';
    $expiresAt = Carbon::now()->addMinutes(10);

    // Save the new data to the cache (stored server-side, per the configured cache driver)
    Cache::put($key, $cachedData, $expiresAt);
}

echo $cachedData;
Today I had to restart my browser due to some issue with an extension. What I found when I restarted it was that my browser (Chromium) had automatically updated to a new version that doesn't allow synchronous AJAX requests anymore. Quote:
Synchronous XMLHttpRequest on the main thread is deprecated because of
its detrimental effects to the end user's experience. For more help,
check http://xhr.spec.whatwg.org/.
I need synchronous AJAX requests for my node.js applications to work, though, as they store and load data from disk through a server utilizing fopen. I found this to be a very simple and effective way of doing things, very handy for little hobby projects and editors... Is there a way to re-enable synchronous XMLHttpRequests in Chrome/Chromium?
Short answer:
They don't want sync on the main thread.
The solution is simple for new browsers that support threads/web workers:
var foo = new Worker("scriptWithSyncRequests.js")
Neither the DOM nor global variables are going to be visible within a worker, but encapsulating multiple synchronous requests there is going to be really easy.
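A minimal sketch of that setup (the file name and the message shape are just examples):
// scriptWithSyncRequests.js - runs off the main thread, so the synchronous
// XMLHttpRequest below does not block the page.
self.onmessage = function (e) {
    var xhr = new XMLHttpRequest();
    xhr.open('GET', e.data.url, false); // third argument false = synchronous
    xhr.send();
    self.postMessage(xhr.responseText);
};

// On the main page:
var worker = new Worker('scriptWithSyncRequests.js');
worker.onmessage = function (e) { console.log('loaded:', e.data); };
worker.postMessage({ url: '/data.json' });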
An alternative solution is to switch to async but use the browser's localStorage along with JSON.stringify as a medium. You might be able to mock localStorage if you are allowed to do some IO.
http://caniuse.com/#search=localstorage
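Roughly what using localStorage as the medium looks like (the key name is an example):
// Save: serialize the document state and store it under a key.
localStorage.setItem('editorState', JSON.stringify({ text: 'my hobby project', saved: Date.now() }));

// Load: parse it back later, falling back to an empty object if nothing is stored.
var state = JSON.parse(localStorage.getItem('editorState') || '{}');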
Just for fun, there are alternative hacks if we want to restrict ourselves to using only sync:
It is tempting to use setTimeout, because one might think it is a good way to group synchronous requests together. Sadly, there is a gotcha. Async in JavaScript doesn't mean the code gets to run in its own thread; async mostly means the call is postponed until others have finished. Luckily for us there is light at the end of the tunnel, because you can likely use xhttp.timeout along with xhttp.ontimeout to recover. See Timeout XMLHttpRequest.
This means we can implement a tiny scheduler that handles failed requests and allocates time to try again or report an error.
// The basic idea.
function runScheduler(s)
{
    setTimeout(function() {
        if (s.ptr < s.callQueue.length) {
            // Handles rescheduling if needed by pushing onto the queue.
            // Remember to set a time for xhttp.timeout.
            // Use xhttp.ontimeout to set a default return value on failure.
            // The pushed function might do something like this (in pseudo code):
            // if !d1
            //     d1 = get(http...?query);
            // if !d2
            //     d2 = get(http...?query);
            // if (!d1) {pushQueue tryAgainLater}
            // if (!d2) {pushQueue tryAgainLater}
            // if (d1 && d2) {pushQueue handleData}
            s = s.callQueue[s.ptr++](s);
        } else {
            // Clear the queue when there is nothing more to do.
            s.ptr = 0;
            s.callQueue = [];
            // You could implement an idle counter and increase this value to free
            // CPU time.
            s.t = 200;
        }
        runScheduler(s);
    }, s.t);
}
Doesn't "deprecated" mean that it's available, but won't be forever. (I read elsewhere that it won't be going away for a number of years.) If so, and this is for hobby projects, then perhaps you could use async: false for now as a quick way to get the job done?