Uploading APK and Expansion Files to Google Play

I am using Telerik AppBuilder (formerly Icenium) to build an APK for the Play Store. However, my app is bigger than 50 MB, so I have to use expansion files.
Since the files are in the cloud, I pushed them to my GitHub account, then pulled them locally and created the expansion files successfully.
I have found a way to upload an expansion file to the Google Play Store using a "placeholder" (dummy) APK.
The problem is that this doesn't help me much, because I still have to somehow replace this "placeholder" APK with my actual APK, which is bigger than 50 MB.
Do I need to make my APK smaller than 50 MB? Can I just exclude files from my project?

Everything you need to know about using expansion files for an app that exceeds 50 MB is detailed in the Google Play documentation here: http://developer.android.com/google/play/expansion-files.html. There is no need for "placeholder" APKs or for replacing APKs; the expansion files serve this very purpose.
If your project exceeds the 50 MB limit by just a little, you might consider hosting some of its resources on a server and fetching them via AJAX requests on demand, whenever a user navigates to a view that needs those resources; a short sketch of this idea follows below.
If you have high-resolution images or videos, compressing them will reduce the size as well.
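A minimal sketch of the on-demand approach, assuming a hypothetical https://example.com/assets/ endpoint and a loadHeavyAsset helper (neither is part of AppBuilder or Google Play):

// Fetch a large resource only when the view that needs it is opened.
// The endpoint URL and the in-memory cache are assumptions for illustration.
var assetCache = {};

function loadHeavyAsset(name, onLoaded) {
  if (assetCache[name]) {            // already downloaded in this session
    onLoaded(assetCache[name]);
    return;
  }
  var xhr = new XMLHttpRequest();
  xhr.open('GET', 'https://example.com/assets/' + encodeURIComponent(name));
  xhr.responseType = 'blob';         // e.g. a video or a large image
  xhr.onload = function () {
    assetCache[name] = xhr.response;
    onLoaded(xhr.response);
  };
  xhr.onerror = function () {
    console.error('Failed to download asset: ' + name);
  };
  xhr.send();
}

// Usage: call this when the user navigates to the view that needs the asset.
loadHeavyAsset('intro-video.mp4', function (blob) {
  document.getElementById('intro').src = URL.createObjectURL(blob);
});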

Related

Project size increased to 716 MB after adding Realm DB

The project is just a simple notepad app, but its size increased to 716 MB after adding Realm DB. What could be the problem? There is a warning in Xcode as well: "/:1:1: Umbrella header for module 'Realm' does not include header '/core/realm.h'"
If you refer to the GitHub releases page for Realm, you can see the RealmSwift SDK comes in at 871 MB zipped (2 GB unzipped!).
If the examples and so forth are removed, it's still about 525 MB, so what you're seeing in your question is about right.
EDIT
There are a lot of additional files and 'stuff' that may exist within a project development folder but not be part of the final product, because the SDK may need to support multiple devices.
For example, in the Realm/core/realm-monorepo.xcframework folder there are supporting files for macOS, tvOS, iOS, watchOS, etc., so a lot of extraneous material that has nothing to do with the project on your device. If you are not using those parts of the SDK, they can be removed to make the project folder smaller.
Additionally, the 10.7.2 SDK that's downloaded from GitHub has a bunch of example projects and code that can also be removed.
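As a rough sketch, a small Node script like the one below can report how much each platform slice inside the xcframework contributes, so you can decide what to remove; only the Realm/core/realm-monorepo.xcframework path comes from the answer above, the rest is illustrative:

// Print the size of each platform slice in the xcframework so you can see
// what is worth removing. Run from the project root; adjust the path as needed.
const fs = require('fs');
const path = require('path');

function folderSize(dir) {
  return fs.readdirSync(dir, { withFileTypes: true }).reduce((total, entry) => {
    const full = path.join(dir, entry.name);
    return total + (entry.isDirectory() ? folderSize(full) : fs.statSync(full).size);
  }, 0);
}

const frameworkDir = 'Realm/core/realm-monorepo.xcframework';

for (const entry of fs.readdirSync(frameworkDir, { withFileTypes: true })) {
  if (!entry.isDirectory()) continue;   // skip Info.plist and other loose files
  const bytes = folderSize(path.join(frameworkDir, entry.name));
  console.log(entry.name + ': ' + (bytes / 1024 / 1024).toFixed(1) + ' MB');
}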

Speeding up deploys and development builds for image heavy Gatsby.js website

I'm using Gatsby.js and gatsby-image to build a website that currently has about 300 images on it. I'm encountering 2 problems:
gatsby develop and gatsby build take many minutes to run because gatsby-image generates multiple resolutions and svg placeholder images for every image on the site. This makes for a great user experience once that pre-optimization is done, but a very slow development experience if I ever need to re-build.
My current workaround is to remove all but a few images during development.
Deploying to GitHub Pages takes too long with so many images (300 base images * 3 resolutions + 1 SVG representation each); trying to deploy the website to GitHub Pages causes a timeout. I'm going to try deploying to Netlify instead, but I anticipate the same problem. I also don't want to have to re-upload the images every time I make a change on the website.
I don't feel like my <1000 images should qualify as "image heavy", but given poor to mediocre upload speeds, I need a way to upload them incrementally and not re-upload images that have not changed.
Is there a way to upload images separately from the rest of a build for a Gatsby website?
I think maybe I could get something working with AWS S3, manually choosing which files from my build folder I upload when creating a new deploy.
Anyone else dealt with handling an image-heavy Gatsby site? Any recommendations for speeding up my build and deploy process?
Disclaimer: I work for Netlify
Our general recommendation is to do the image optimization locally and check those files into GitHub, since doing all that work can take longer than our CI allows (15 minutes) and it is repetitive.
There is also an npm module, https://www.npmjs.com/package/cache-me-outside, that lets you cache the things you've generated alongside your dependencies; it may do that for you without abusing GitHub (instead abusing Netlify's cache :)).
See another answer from Netlify: smaller source images (as mentioned by #fool) or offloading to a service like Cloudinary or Imgix.
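A minimal sketch of the "optimize locally" recommendation, assuming the sharp package and the src-images/ and static/images/ folder names (the folders and widths are illustrative, and this is not wired into gatsby-image); it pre-generates resized copies once so they can be committed rather than rebuilt on every deploy:

// Pre-generate resized copies of every image locally so the heavy work is done
// once and the results can be committed. Folder names and widths are assumptions.
const fs = require('fs');
const path = require('path');
const sharp = require('sharp');   // npm install sharp

const sourceDir = 'src-images';
const outputDir = 'static/images';
const widths = [320, 800, 1600];

fs.mkdirSync(outputDir, { recursive: true });

for (const file of fs.readdirSync(sourceDir)) {
  if (!/\.(jpe?g|png)$/i.test(file)) continue;
  for (const width of widths) {
    const outPath = path.join(outputDir, path.parse(file).name + '-' + width + 'w.jpg');
    if (fs.existsSync(outPath)) continue;   // skip sizes that were already generated
    sharp(path.join(sourceDir, file))
      .resize({ width: width })
      .jpeg({ quality: 80 })
      .toFile(outPath)
      .then(() => console.log('wrote ' + outPath));
  }
}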

Why does XCTest save a ton of data to my /private/var/folders/rx/ folder?

I'm using XCTest with Xcode 6.3 to test an app of mine. I've noticed that it seems to be saving a ton of data to my /private/var/folders/rx/ folder. If I delete my app's Derived Data these XCTest folders don't get deleted, and if I restart my computer they're still there too. Is this data safe to delete, or do I need it?
I used DaisyDisk (awesome program, by the way!) to take a look at my disk usage, and this folder showed up as one of the largest on my machine.
So I thought I'd take a look at what was in each of these folders. When I looked, it seemed like XCTest was the culprit.
When running my tests I include a large bundle of jpgs that is around 1.7 GB. I'm working on streamlining my app testing so that I don't need the large bundle, but I'm still curious if I can safely delete some of these older folders.
Yes, they are safe to delete. This is where Xcode stores built copies of your applications when built in "Unit Test" mode. Just as Xcode has a Derived Data folder for building OS X apps, it has another location for storing XCTest data (storing this data in the Derived Data folder wouldn't be appropriate, since that is used to store built copies of the application intended to be run by the user).
You should note, though, that Xcode will probably put the data back there again the next time you try to run your app's unit tests, and it may take longer to build/run since some of the cached data is gone. It's up to you whether the trade-off is worth it.
And by the way, Xcode builds your entire project to run XCTest, because your tests might depend on certain resources being there in your app bundle. So that's why the folder is so large.

Why does my worklight project build/deploy much slower than coworkers?

I just recently installed Worklight in Eclipse in order to work on developing an iPad app, but I noticed it takes me significantly longer to build and deploy compared to the other developers. The others take roughly 5-7 minutes per build, while mine takes about 25-30 minutes. I am not sure what the reason could be and was hoping for some suggestions on what it might be.
I was told that during the build process Worklight copies the contents of your project to another directory on your machine, and I think the location of that directory might be the issue, but I am not sure how to check where this is happening.
Edit: To give more details as requested:
Both my machine and my coworkers' machines are running Windows 7 Enterprise, with an Intel dual core CPU and 8 GB of RAM.
The workspace containing the project is located locally at the base of the C: drive, but user profile files/folders such as My Documents are stored on a shared network drive. The project itself is 143 MB.
To the best of my knowledge there are a few factors that influence build time:
Size of the project (e.g. 100 MB)
Number of files in the project (e.g. 1200 files)
Your environment got into a strange state.
Someone reported performance issues when adding new Java code.
Hardware
You can try:
Lower the size of your project by removing unnecessary files, compressing images using lossy compression, etc.
Concatenate resources like JS and CSS files (see the sketch after this list).
Try to use resources hosted on other servers, at least for development, for example:
<script data-dojo-config="async: 1"
        src="http://ajax.googleapis.com/ajax/libs/dojo/1.8.1/dojo/dojo.js"></script>
<script src="http://code.jquery.com/jquery-1.9.1.min.js"></script>
Try creating a new workspace and importing your project, or removing (back up first!) the project's metadata directories and files (Workspace/WorklightServerHome, bin/). You may have some success removing and re-creating the native environment folders. There's also a -clean flag you can pass to Eclipse.
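A rough sketch of the concatenation suggestion as a small Node script; the file names and the bundle path are assumptions, not part of Worklight:

// Concatenate the app's JS files into a single bundle so there are fewer files
// for the build to copy. File list and output name are hypothetical.
const fs = require('fs');

const inputs = ['js/app.js', 'js/views.js', 'js/utils.js'];
const output = 'js/bundle.js';

const bundle = inputs
  .map(file => '/* ' + file + ' */\n' + fs.readFileSync(file, 'utf8'))
  .join('\n;\n');   // the ';' guards against files that omit a trailing semicolon

fs.writeFileSync(output, bundle);
console.log('Wrote ' + output + ' from ' + inputs.length + ' files');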
I was able to fix my own problem: Worklight was using a .wlapp file which was stored on my shared network drive. By changing the TEMP and TMP environment variables to a folder that is guaranteed to be local, such as C:\TEMP, Worklight then accesses only local files, greatly speeding up the build process.

How to speed up the eclipse project 'refresh'

I have a fairly large PHP codebase (10k files) that I work with using Eclipse 3.4/PDT 2 on a Windows machine, while the files are hosted on a Debian file server. I connect via a mapped drive on Windows.
Despite having a 1 Gbit Ethernet connection, an Eclipse project refresh is quite slow, up to 5 minutes, and I am blocked from working while it happens.
This normally wouldn't be such a problem, since Eclipse theoretically shouldn't have to do a full refresh very often. However, I also use the Subclipse plugin, which triggers a full refresh each time it completes a switch/update.
My hunch is that the slowest part of the process is Eclipse checking the 10k files one by one for changes over Samba.
There is a large number of files in the codebase that I would never need to access from eclipse, so I don't need it to check them at all. However I can't figure out how to prevent it from doing so. I have tried marking them 'derived'. This prevents them from being included in the build process etc. But it doesn't seem to speed up the refresh process at all. It seems that Eclipse still checks their changed status.
I've also removed the unneeded folders from PDT's 'build path'. This does speed up the 'building workspace' process but again it doesn't speed up the actual refresh that precedes building (and which is what takes the most time).
Thanks all for your suggestions. Basically, JW was on the right track. Work locally.
To that end, I discovered a plugin called FileSync:
http://andrei.gmxhome.de/filesync/
This automatically copies the changed files to the network share. Works fantastically. I can now do a complete update/switch/refresh from within Eclipse in a couple of seconds.
Do you have to store the files on a share? Maybe you can set up some sort of automatic mirroring, so you work with the files locally, and they get automatically copied to the share. I'm in a similar situation, and I'd hate to give up the speed of editing files on my own machine.
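A rough sketch of such a mirroring setup using Node's built-in fs.watch; the local working copy and the mapped share paths below are purely illustrative:

// Watch the local working copy and copy changed files to the network share.
// Paths are assumptions; recursive fs.watch is supported on Windows.
const fs = require('fs');
const path = require('path');

const localRoot = 'C:\\work\\project';
const shareRoot = 'Z:\\project';

fs.watch(localRoot, { recursive: true }, (eventType, filename) => {
  if (!filename) return;
  const source = path.join(localRoot, filename);
  const target = path.join(shareRoot, filename);
  // Mirror only regular files that still exist (deletes are ignored for simplicity).
  if (fs.existsSync(source) && fs.statSync(source).isFile()) {
    fs.mkdirSync(path.dirname(target), { recursive: true });
    fs.copyFileSync(source, target);
    console.log('mirrored ' + filename);
  }
});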
Given it's under Subversion, why not have the files locally and use a post-commit hook to update to the latest version on the dev server after every commit? (Or put a specific string in the commit log, e.g. '##DEPLOY##', when you want to update dev, and only run the update when the post-commit hook sees this string; a sketch of such a hook follows below.)
Apart from refresh speed-ups, the advantage of this technique is that you can have broken files that you are working on in eclipse, and the dev server is still ok (albeit with an older version of the code).
The disadvantage is that you have to do a commit to push your saved files onto the dev server.
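A rough sketch of that post-commit hook written as a Node script, assuming the repository and the dev checkout live on the same machine; the checkout path is hypothetical, while the ##DEPLOY## marker comes from the suggestion above:

#!/usr/bin/env node
// hooks/post-commit: Subversion calls this as "post-commit REPOS REV".
// It updates the dev server's working copy only when the commit message
// contains ##DEPLOY##. The checkout path below is an assumption.
const { execFileSync } = require('child_process');

const [repos, rev] = process.argv.slice(2);
const devCheckout = '/var/www/dev-site';   // hypothetical dev-server working copy

const message = execFileSync('svnlook', ['log', repos, '-r', rev], { encoding: 'utf8' });

if (message.includes('##DEPLOY##')) {
  execFileSync('svn', ['update', devCheckout], { stdio: 'inherit' });
}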
I solved this problem by changing the "File Transfer Buffer Size" at:
Window -> Preferences -> Remote Systems -> Files
and raising the "File transfer buffer size" Download (KB) and Upload (KB) values. I set them to 1000 KB; by default they are 40 KB.
Use the offline folders feature in Windows by right-clicking the mapped folder and selecting "Make available offline".
It can save a lot of time and round-trip delay in the file sharing protocol.
Using svn externals with the revision flag for the non-changing stuff might prevent Subclipse from refreshing those files on update. Then again, it might not. Since you'd have to make some changes to the structure of your Subversion repository to get this working, I would suggest doing some simple testing before doing it for real.
