Xcode indicates that NSURLSessionDownloadTask consumes massive amounts of memory

I am using NSURLSession and NSURLSessionDownloadTask to download files from a server to my app.
When I run my Swift app from Xcode and watch the memory gauge, the memory consumed increases steadily as the files are downloaded, corresponding roughly to the amount of data fetched.
So when downloading 1 GB of files, Xcode indicates that my app is using over 1 GB of memory.
This makes no sense, since the downloaded data is saved to disk, not kept in memory.
Also, the app runs without crashing, which seems impossible if the reported memory usage were correct.
Has anyone else encountered this issue?
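For reference, here is a minimal sketch of the delegate-based setup described above (the class and file names are hypothetical). A download task streams the response body to a temporary file as it arrives, so the full payload should never need to be resident in memory:

import Foundation

final class Downloader: NSObject, URLSessionDownloadDelegate {
    // A session that reports download events to this object.
    lazy var session = URLSession(configuration: .default,
                                  delegate: self,
                                  delegateQueue: nil)

    func download(_ url: URL) {
        session.downloadTask(with: url).resume()
    }

    func urlSession(_ session: URLSession,
                    downloadTask: URLSessionDownloadTask,
                    didFinishDownloadingTo location: URL) {
        // Move the file out of the temporary location before returning;
        // the system deletes the temporary file once this method exits.
        let dest = FileManager.default.temporaryDirectory
            .appendingPathComponent(downloadTask.originalRequest?.url?
                .lastPathComponent ?? "download.bin")
        try? FileManager.default.moveItem(at: location, to: dest)
    }
}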

Related

How Do I Reduce Size of UWP App (Can I Compress HD Images?)

My UWP app for Windows 10, which is currently in development and contains about 400-500 HD images, takes up a whopping 1.7 GB of hard drive space. File Explorer claims that the images account for about 1.68 GB, while the code accounts for the other 0.02 GB...
When all is said and done, the app needs to contain a couple thousand HD images. Clearly this is unsustainable, as the size of the app will approach 10 GB, or possibly more.
Is there a way to compress these images within the app?
This is unrelated to this Stack Overflow question; I am not using Xamarin. I also tested downloading a release build from the Store to confirm, and it does in fact say it is 1.7 GB in size.
The images used in the application are application resources.
When the application is packaged, these resources are not specially processed. You can only compress the images before packaging, or consider extracting the image resources and letting the user download and import them after installing the application.
This is just like some online games: after the game client is downloaded, additional data packages are fetched on the first run. Those resources are not hosted by the store but are provided by the developer.
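As a rough illustration of that first-run pattern (sketched in Swift for brevity; a UWP app would do the same in C# with the Windows networking APIs, and the URL and file names here are hypothetical):

import Foundation

// On first launch, fetch the developer-hosted image pack and persist it
// in the app's support directory; later launches reuse the cached copy.
let packURL = URL(string: "https://example.com/assets/images-pack-1.zip")!
let support = FileManager.default.urls(for: .applicationSupportDirectory,
                                       in: .userDomainMask)[0]
let cached = support.appendingPathComponent("images-pack-1.zip")

if !FileManager.default.fileExists(atPath: cached.path) {
    let task = URLSession.shared.downloadTask(with: packURL) { tmp, _, error in
        guard let tmp = tmp, error == nil else { return }
        try? FileManager.default.createDirectory(at: support,
                                                 withIntermediateDirectories: true)
        try? FileManager.default.moveItem(at: tmp, to: cached)
    }
    task.resume()
}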

Reduce the time to launch Play mode and update scripts

Every working day, all Unity developers have to launch Play mode in Unity more than once. And if you think about it, waiting for this mode to launch can add up to a lot of time.
My question is: how can I speed up launching Play mode and the script update step?
I have already taken the following actions:
Created a RAM disk and installed Unity on it.
Moved all project-related files to an SSD. (I am afraid to move the project to the RAM disk because of the possibility of losing everything in case of a failure.)
EDIT:
It is running on:
i7 3770, 16 GB DDR3 RAM (12 GB for the system + 4 GB for the RAM disk), Slim S55 240 GB SSD, GTX 670

Google Chrome takes 1.1 GB of memory to download and load a large image (24000x12000) of size 17.2 MB

How does the Google Chrome browser work internally while downloading and processing images?
When one tries to open this image, the Chrome task manager shows a 1.1 GB memory footprint (make sure the cache is disabled when reproducing this).
After the image has been downloaded and loaded, the memory is released and the footprint drops to 77 MB.
I couldn't figure out any reason for such high memory consumption, nor what Chrome does internally that consumes this much memory.
I'm looking for any relevant answer or blog post that can help me understand the internal architecture or design that leads Chrome to behave this way.
JPEG is a compressed image storage format. To display the image, an application has to decompress it in memory. A reasonable expectation is 4 bytes per pixel (one byte each for the red, green, blue, and alpha channels), so your image takes 24000 × 12000 × 4 bytes = 1,152,000,000 bytes ≈ 1.07 GiB.
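A quick back-of-the-envelope check of that figure (a minimal sketch in Swift; the 4 bytes/pixel RGBA assumption is taken from the answer above):

import Foundation

// Decoded bitmap size for a 24000x12000 image at 4 bytes per pixel.
let width = 24_000, height = 12_000, bytesPerPixel = 4
let bytes = width * height * bytesPerPixel   // 1_152_000_000 bytes
let gib = Double(bytes) / 1_073_741_824      // bytes per GiB (2^30)
print("\(bytes) bytes is about \(String(format: "%.2f", gib)) GiB")  // ≈ 1.07 GiB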

Mac OS X occupied memory increases quickly

I noticed that when I run Xcode, and especially when I start Interface Builder, the memory occupied by Mac OS X increases quickly.
It is not only Xcode; some other apps also occupy too much memory after running for a while.
Even though my Mac has 4 GB of memory, I sometimes have to use a tool to free memory.
What is the reason, and how can I avoid this happening while developing my Mac app?
Any comments welcome.
I just experienced something similar (but probably not the same) in my Qt application.
I was reading and checksumming lots of files, and the free memory kept dropping, though my application's "real memory" stayed at a steady 50-ish MB. However, the amount of "inactive memory" kept climbing.
What was happening was that every file I read was being added to the disk cache. The memory consumed by the disk cache is apparently marked as "inactive", which should be just as available as "free" memory according to Apple ( http://support.apple.com/kb/HT1342 ), but that didn't stop OS X from starting to swap when "free" memory dropped below 50-ish MB.
In C:
#include <fcntl.h>

/* Bypass the disk cache for every descriptor open on this file. */
fcntl(f.handle(), F_GLOBAL_NOCACHE, 1);
This seemed to fix it by bypassing disk caching for that file descriptor.
Freeing up inactive memory (if that is indeed your problem) can also be done from the command line using the "purge" command.

Link failure with either abnormal memory consumption or LNK1106 in Visual Studio 2005

I am trying to build a solution for Windows XP in Visual Studio 2005. This solution contains 81 projects (static libs, exes, DLLs) and is being used successfully by our partners. I copied the solution bundle from their repository and tried setting it up on 3 similar machines belonging to people in our group. I was successful on two machines, but the solution failed to build on my machine.
The build on my machine encountered two problems:
1. During a simple build, creation of the biggest static library (about 522 MB in debug mode) fails with the message "13>libd\ui1d.lib : fatal error LNK1106: invalid file or disk full: cannot seek to 0x20101879"
2. A full solution rebuild creates this library; however, when it comes to linking the library into the main .exe file, devenv.exe spawns a link.exe which consumes about 80 MB of physical memory and 250 MB of virtual memory and then spawns another link.exe, which does the same. This goes on until the system runs out of memory. On my colleagues' PCs, where the build succeeds, there is only one link.exe process, which uses all the memory required for linking (about 500 MB physical).
There is plenty of hard drive space on my machine, and the file system is NTFS.
All three of our systems are similar: Core2Quad processors, 4 GB of RAM, Windows XP SP3. We are using Visual Studio installed from the same source.
I tried using different RAM and a different CPU, using a dedicated graphics adapter to eliminate the possibility of shared video memory influencing the build, putting the solution files in a different location, using different editions of VS 2005 (Professional, Standard, and Team Suite), changing the amount of available virtual memory, running memtest86, and building the project from scratch (i.e., from a clean bundle).
I have read what MSDN says about LNK1106; none of the cases apply to me, except maybe "out of heap space", but I am not sure how to fight that.
The only idea I have left is reinstalling the OS, but I am not sure that it would help, or that the situation wouldn't repeat itself on a different machine.
Would anyone have any advice for me?
Thanks
Yes, 522 megabytes is about as large a contiguous chunk of virtual memory as can be allocated on the 32-bit version of Windows. That's one honking large library; you'll have to split it up.
You might be able to postpone the inevitable by building on a 64-bit version of Windows, where 32-bit programs marked large-address-aware get a much larger virtual memory space, close to 4 GB.
