I am developing an application and I have the following problem: I have to upload images to a server, but it has to be a background upload. With iOS (UIApplication.SharedApplication.BeginBackgroundTask (() => {});) I have had no problems and have managed to send up to 300 images (they are appraisal images, so I am talking about at least 100 images), but on Android it always gives problems. I have a ForegroundService to run the upload process, but as I say, on Android it either returns a socket error or freezes and doesn't continue the process.
I have tried:
sending the images sequentially (error);
sending them 5 by 5 with a recursive method, checking which ones fail and resending them (up to 3 attempts);
collecting the uploads in a var tasks = new List<Task>(); and then Task t = Task.WhenAll(tasks); - no luck either;
Parallel, but again the socket problem;
and now I am trying to send them with MultipartFormDataContent, grouping them 5 by 5 (content.Add(new StreamContent(item.imageStream), "...", "...");).
As always, iOS sends them without problems, but on Android there is no way.
Do you know how I can get them to send? Is there a default limitation in Android that prevents uploading so many images?
I send the images in base64, except in the last attempt with MultipartFormDataContent, where I passed the image stream.
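Roughly, what I am trying now looks like the sketch below (simplified; ImageItem, uploadUrl and the field names are placeholders). One shared HttpClient plus a SemaphoreSlim capping concurrency at five, which as far as I know is the usual way to avoid socket exhaustion:

using System.Linq;
using System.Net.Http;
using System.Threading;
using System.Threading.Tasks;

static readonly HttpClient client = new HttpClient();      // one client for all requests
static readonly SemaphoreSlim gate = new SemaphoreSlim(5); // at most 5 uploads in flight

async Task UploadAsync(ImageItem item, string uploadUrl)
{
    await gate.WaitAsync();
    try
    {
        var content = new MultipartFormDataContent();
        // "image" and item.FileName are placeholder form-field names.
        content.Add(new StreamContent(item.ImageStream), "image", item.FileName);
        var response = await client.PostAsync(uploadUrl, content);
        response.EnsureSuccessStatusCode(); // throws so the caller can retry this item
    }
    finally
    {
        gate.Release();
    }
}

// Fire everything and let the semaphore do the batching:
// await Task.WhenAll(items.Select(i => UploadAsync(i, uploadUrl)));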
We are running 2 servers. Server 1 hosts a React application. Server 2 hosts a web component exposed as a single JavaScript bundle, along with some assets such as images. We are dynamically loading the web component JavaScript hosted on Server 2 in our React app hosted on Server 1. The fact that it is a web component might or might not affect the issue.
What's happening is that the web component makes use of assets such as images that are located on Server 2. But when the React app loads the web component, the images are not found, as it's looking for them locally on Server 1.
We can fix this in many ways, and I am looking for the simplest one. Since the Server 1 and Server 2 apps are being developed by different teams, both should be able to develop in the most natural way possible, without making allowances for their app being potentially loaded by other apps.
The fixes I could think of were:
Making use of absolute URLs to load assets - need to know the deployed location in advance.
Adding a reverse proxy to Server 1 that redirects to Server 2 whenever a particular path is hit - need to configure this, and since the React app could load hundreds of web components, we would need to add a lot of reverse proxies.
Embedding all assets into the single JavaScript bundle on Server 2, e.g. inlining SVGs into the JavaScript - too limiting; huge SVGs would make the bundle size much bigger.
I was hoping to implement something like this:
Since the React app knows where to hit Server 2, can't we write some clever JavaScript that makes the browser go to Server 2 whenever assets are requested by JavaScript loaded from Server 2?
If you load your Web Component via a classic script (<script> with the default type="text/javascript"), you can retrieve the URL of the loaded file using document.currentScript.src.
If you load the file as a module script (<script> with type="module"), you can retrieve the URL using import.meta.url.
Then parse that property to extract the base path to the Web Component.
Example of a Web Component JavaScript file:
( function ( path ) {
    // Strip the file name to get the base URL of this script.
    var base = path.slice( 0, path.lastIndexOf( '/' ) )
    customElements.define( 'my-comp', class extends HTMLElement {
        constructor() {
            super()
            // Resolve the image against the script's own base path,
            // not against the page that loaded it.
            this.attachShadow( { mode: 'open' } )
                .innerHTML = `<img src="${base}/image.png">`
        }
    } )
// document.currentScript is set for classic scripts; modules use import.meta.url.
} ) ( document.currentScript ? document.currentScript.src : import.meta.url )
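For completeness, the consuming app on Server 1 would include the file in one of these two ways (the URL is illustrative):

<!-- classic script: document.currentScript.src supplies the base path -->
<script src="https://server2.example.com/my-comp.js"></script>

<!-- module script: import.meta.url supplies the base path -->
<script type="module" src="https://server2.example.com/my-comp.js"></script>

<my-comp></my-comp>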
How about uploading all required assets to a third location, such as an AWS S3 bucket, Google Drive, or Dropbox? That way those assets always have a unique, known URL, and both teams can access them independently. As long as the names remain the same, so will the URLs.
I am developing a web application with Mojolicious. The morbo development server is a wonderful thing that works great, but once I start returning complicated hashes on the stack and then rendering a webpage, the morbo server will start to act funny. In my browser, if I navigate to one of those webpages that use a complicated hash, the browser will tell me that the connection has been reset. I have to refresh about 10-12 times before the page will load.
For example:
The code below shows one of my app controllers. It simply gets a json object from an AJAX request, and then returns a different json object. It works fine, except that the browser demands to be refreshed a thousand times before it will load.
package MyApp::Controller::Library;
use Mojo::Base 'Mojolicious::Controller';
use Mojo::Asset::File;
use MyApp::Model::Generate;
use MyApp::Model::Database;
use MyApp::Model::IpDatabase;
use Mojo::JSON qw(decode_json);

# Receives a json object from an AJAX request and
# sends the necessary information back to be
# displayed in a table.
sub list_ajax_catch {
    my $self  = shift;
    my $json  = $self->param('data');
    my $input = decode_json $json;
    $self->render(
        json => {
            "Object A" => {
                "name"        => "Object A's Name",
                "description" => "A Description for Object A",
                "height"      => "10",
                "width"       => "5",
            }
        }
    );
}

1;
The problem is not limited to this instance. It seems that any time there is a lot of processing on the server, the browser gets a reset connection. It doesn't matter which browser; I've tried Chrome, IE, Firefox, and others (on multiple computers). It doesn't matter if I'm not even sending or receiving data back and forth between the HTML and the app. All that seems to trigger it is any amount of processing in my web app beyond just rendering templates, BUT if I am running Hypnotoad, everything is fine.
This example is not one that requires a lot of processing, but it does cause the browser to reset, and as you can see, it shouldn't take long to run or freeze anything up. I thought the problem was a timeout issue, but by default the timeout doesn't happen until after 15 seconds, so it can't be that.
I have figured out the problem! This has been an issue for me for over a month now and I am so glad that it is working again. My problem was that when I started the morbo development server, I used the following command:
morbo -w ~/web_dev/my_app script/my_app
The -w allows me to watch a directory for changes so that I don't have to restart the app each time I changed some of my JavaScript files. My problem was that the directory I watched also contained my log files. So each time I went to my webpage, the logs would change and the server would restart.
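For anyone else hitting this: point -w at the directories that actually change during development, so the log directory is excluded. morbo accepts -w more than once; the paths below assume my layout:

morbo -w ~/web_dev/my_app/lib -w ~/web_dev/my_app/public script/my_app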
I am trying all means to improve the loading time of a link in an Android WebView, in vain. The link has images and text fetched from an external server by some JS functions, and those images and text may change at any moment. In this scenario the code I use is below. I am not interested in HTML5 caching or server caching techniques, since the same link that loads quickly in the browser fails to do so in the Android WebView. There is not much JS I can load from resources to improve performance, since most of the script just pulls data from the external server and images from an Amazon server.
WebSettings settings = getSettings();
settings.setJavaScriptEnabled(true);
settings.setPluginState(PluginState.ON);
settings.setRenderPriority(WebSettings.RenderPriority.HIGH);
settings.setDomStorageEnabled(false);
settings.setLoadsImagesAutomatically(true);
settings.setCacheMode(WebSettings.LOAD_CACHE_ELSE_NETWORK);
settings.setAppCacheEnabled(false);
settings.setGeolocationEnabled(false);
1. I guess turning on DOM storage and AppCache would have an impact on loading time, so I have turned them off. -> Is this true?
2. In onDestroy I call clearCache on the WebView to clear the application cache. I do this because I am afraid my app size may grow as the WebView database grows if I fail to clear it. -> Is this a must, or does Android handle it gracefully?
3. A few suggestions I have heard are to set the cache mode to LOAD_NO_CACHE and to stop clearing the WebView cache in onDestroy. -> Does calling webview.clearCache() have any impact, given that I have already set the cache mode not to load from the cache?
4. The link does not give me the time it was previously refreshed; in that case, which cache mode would anyone recommend?
5. Is there any concrete testing method to find out the loading time on devices that takes network connection and server lag into account (see the sketch after this list)? The same page that loads fast at one moment may load slowly at another.
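For point 5, a rough on-device measurement can be taken from the WebView callbacks themselves. This is only a sketch (it assumes a webView field; redirects can fire the callbacks more than once, and network variance means you should average several runs):

import android.graphics.Bitmap;
import android.os.SystemClock;
import android.util.Log;
import android.webkit.WebView;
import android.webkit.WebViewClient;

webView.setWebViewClient(new WebViewClient() {
    private long startMs;

    @Override
    public void onPageStarted(WebView view, String url, Bitmap favicon) {
        // Record when the main frame starts loading.
        startMs = SystemClock.elapsedRealtime();
    }

    @Override
    public void onPageFinished(WebView view, String url) {
        // Wall-clock load time, including network and server lag.
        Log.d("LoadTime", url + " took "
                + (SystemClock.elapsedRealtime() - startMs) + " ms");
    }
});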
Give "FastClick" a try.
Mobile browsers will wait approximately 300ms from the time that you tap the button to fire the click event. The reason for this is that the browser is waiting to see if you are actually performing a double tap.
Download the script, add
<script type='application/javascript' src='fastclick.js'></script>
in your header and this code in your script.
window.addEventListener('load', function() {
    FastClick.attach(document.body);
}, false);
More information: https://developers.google.com/mobile/articles/fast_buttons
I have a GWT app that needs to display images hosted on another server. I used Image(url) to create those icons, but it's unbearably slow (I need to display up to 50 images on one page). Is there any way I can speed this up? I looked a bit at image bundles, but it seems they only work for images hosted on my own server.
here is my code:
for (int i = 0; i < 50; i++) {
    item = items.get(i);
    icon = new Image(ROOT_URL + item.getIconURI());
    ....
}
1) If there is no security concern (just images, right?), ensure that you are not requesting them over HTTPS.
2) Use the Chrome DevTools Network profiler to monitor the page load and HTTP requests, and tune your application based on what it shows.
3) Try precaching the images, i.e. fetch them in the background before the user navigates to the page (see the sketch below).
4) You can also ask the image host to send compressed images if they are not compressed already.
The above suggestions have very little to do with GWT.
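For point 3, GWT's Image class has a static prefetch method that hints the browser to fetch and cache a URL before any widget displays it. A minimal sketch, reusing the ROOT_URL, items and getIconURI names from the question (the Item type is assumed):

import com.google.gwt.user.client.ui.Image;

// Run this before the user navigates to the icon page, e.g. while they
// are still on the previous view, so the browser cache is already warm.
for (Item item : items) {
    Image.prefetch(ROOT_URL + item.getIconURI());
}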
I am using BackgroundTransferService to download a file from the internet.
Pseudo-code goes something like this:
BackgroundTransferRequest transferRequest = new BackgroundTransferRequest(transferUri);
transferRequest.Method = "GET";
transferRequest.Tag = "myTag";
transferRequest.TransferPreferences = TransferPreferences.AllowCellularAndBattery;
BackgroundTransferService.Add(transferRequest);
After this, I add an event handler to handle the transfer when it is completed.
I am only using the TransferStatusChanged event handler, not TransferProgressChanged.
transferRequests = BackgroundTransferService.Requests;
transferRequests.Last().TransferStatusChanged += new EventHandler<BackgroundTransferEventArgs>(transfer_TransferStatusChanged);
In transfer_TransferStatusChanged() I do whatever I want with the downloaded file, or handle the failure cases (404 etc.).
The problem is that my downloads go on indefinitely if there is no 404 response from the server (for example when there is no such server, e.g. www.googlea.com/myfilename). I want to implement a timeout for such a scenario. How can I do that?
There is no built-in support for such a scenario. You'll have to build the timeout support yourself; a sketch of one way follows below.
Be careful with transferring large files though, as the transfer could be done in parts over a very long period of time, depending on connectivity and battery charge level.
Of course, you may want to add a check that the file exists before making the transfer request, and if you have any control over the server, you should make sure that the correct responses are being sent too.
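A minimal sketch of a self-built timeout, assuming the app stays in the foreground (a DispatcherTimer does not run while the app is in the background; the timeout value and names are illustrative). After the delay, the request is removed from the service if it has not completed, which cancels the transfer:

using System;
using System.Windows.Threading;
using Microsoft.Phone.BackgroundTransfer;

void AddWithTimeout(BackgroundTransferRequest request, TimeSpan timeout)
{
    BackgroundTransferService.Add(request);

    var timer = new DispatcherTimer { Interval = timeout };
    timer.Tick += (s, e) =>
    {
        timer.Stop();
        if (request.TransferStatus != TransferStatus.Completed)
        {
            // Still not done: cancel by removing the request, then treat
            // it as a failed download in your own status handling.
            BackgroundTransferService.Remove(request);
        }
    };
    timer.Start();
}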