I have a Drupal 7 website that I am trying to optimize according to this website and I have A's for everything except "compress text" where I have a "C". Here is what it reports:
GZIP encode all appropriate text assets: 79/100
119.1 KB total in compressible text, target size = 94.4 KB - potential savings = 24.7 KB
FAILED - (7.7 KB, compressed = 1.9 KB - savings of 5.8 KB) - /sites/all/libraries/galleria/themes/azur/galleria.azur.css
FAILED - (4.5 KB, compressed = 2.0 KB - savings of 2.5 KB) - /sites/all/libraries/galleria/themes/azur/galleria.azur.min.js
How would I configure galleria to be gzipped in a drupal environment? I am using the boost module and some custom .htaccess rules.
Thank you!
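A minimal sketch of what the custom .htaccess rules could look like, assuming Apache with mod_deflate available (the galleria CSS/JS are static files served directly by Apache, so Boost's page cache does not cover them; the MIME types listed are an assumption based on the failing .css and .min.js files):

```apache
# Hedged example: compress text assets on the fly with mod_deflate.
# Place in the site root .htaccess; requires mod_deflate to be enabled.
<IfModule mod_deflate.c>
  AddOutputFilterByType DEFLATE text/html text/css text/javascript application/javascript application/x-javascript
</IfModule>
```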
. Variables and constants in RAM (global, static), used 33060 / 80192 bytes (41%)
║ SEGMENT BYTES DESCRIPTION
╠══ DATA 1500 initialized variables
╠══ RODATA 5128 constants
╚══ BSS 26432 zeroed variables
. Instruction RAM (IRAM_ATTR, ICACHE_RAM_ATTR), used 60753 / 65536 bytes (92%)
║ SEGMENT BYTES DESCRIPTION
╠══ ICACHE 32768 reserved space for flash instruction cache
╚══ IRAM 27985 code in IRAM
. Code in flash (default, ICACHE_FLASH_ATTR), used 383656 / 1048576 bytes (36%)
║ SEGMENT BYTES DESCRIPTION
╚══ IROM 383656 code in flash
Failed uploading: no upload port provided
Code uploaded Successfully
I am currently deploying a zipped package to AWS Lambda. I am able to deploy a zip of size 80 MB (Uncompressed: 315 MB).
However, as soon as I attach a custom layer with a compressed size of 4 MB (Uncompressed: 7 MB), I get the above-mentioned error.
After this I tried reducing the deployment package size and got it down to 75 MB (Uncompressed: 300 MB). Even then, attaching the same custom layer produces the same error.
Full Lambda size Earlier: 80 MB (Uncompressed: 315 MB) - This gets deployed successfully.
Full Lambda size with old package & layer: 80 MB (Uncompressed: 315 MB) + 4 MB (Uncompressed: 7 MB) = 84 MB (Uncompressed: 322 MB) - This gives the error above.
Full Lambda size with reduced package & layer: 75 MB (Uncompressed: 300 MB) + 4 MB (Uncompressed: 7 MB) = 79 MB (Uncompressed: 307 MB) - This gives the error above.
I am not sure what is actually happening when I add a custom layer. Can anyone help with this?
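As a quick sanity check before uploading, the uncompressed sizes can be inspected locally. Lambda validates the combined unzipped size of the function plus all attached layers, so both artifacts matter; the file names function.zip and layer.zip below are placeholders, not names from the question:

```shell
# Print the uncompressed size of each artifact, read from the totals
# line of `unzip -l` (first field of the last line is total bytes).
# function.zip / layer.zip are placeholder names -- use your own files.
for f in function.zip layer.zip; do
  unzip -l "$f" | tail -n 1 | awk -v name="$f" \
    '{printf "%s: %.0f MB uncompressed\n", name, $1 / 1048576}'
done
```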
How can I find:
Disk usage or size of my entire Subversion repository
Disk usage or size only for a particular branch on my repo. Eg ( https://mysvn/svn/myrepo/myfolder)
OS: Windows 2008 server
I have RDP access to the server.
You should be able to find out the size of the repositories by checking their size on disk on the server. For example, if your repositories are stored under C:\Repositories, check the overall size of that directory, or the size of an individual repository such as C:\Repositories\MyRepository.
Disk usage or size of my entire Subversion repository
Just check the size of the repositories on disk.
In case you use VisualSVN Server, try the Measure-SvnRepository cmdlet. It will produce the following output:
Name Revisions Size SizeOnDisk
---- --------- ---- ----------
MyRepo 498 3,340 KB 4,529 KB
MyRepo2 479 21,313 KB 22,571 KB
MyRepo3 201 1,032 KB 2,226 KB
MyRepo5 2 71 KB 90 KB
You can also check the SVN repository size in the VisualSVN Server Manager console.
Disk usage or size only for a particular branch on my repo. Eg (https://mysvn/svn/myrepo/myfolder)
Normally, a new branch in the repository takes a minimal amount of space (several kilobytes). A branch or tag in SVN is a cheap copy: when you create one, Subversion doesn't actually duplicate any data. Therefore, the repository should not grow in size (unless you commit new large content to it).
Instead of counting the "branch size", check the size of revisions on disk, e.g. under C:\Repositories\MyRepository. You can also view and examine the repository storage statistics with the svnfsfs stats tool. Here is an example:
svnfsfs stats C:\Repositories\MyRepository
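If you do want a figure for just one branch, one approach (a sketch, assuming command-line svn access to the server) is to sum the size fields that `svn list --xml` reports for every file under the branch URL. Note this estimates the logical size of the files at HEAD, not the deduplicated storage cost on disk:

```shell
# Sum the <size> elements from `svn list --xml` to estimate how much
# file content a single branch holds at HEAD. This is the logical size
# of the files, not the (much smaller, shared) storage on disk.
svn list -R --xml https://mysvn/svn/myrepo/myfolder \
  | grep -o '<size>[0-9]*</size>' \
  | grep -o '[0-9]*' \
  | awk '{total += $1} END {printf "%.1f MB\n", total / 1048576}'
```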
I am debugging firmware and need to use the output .bin file for programming the hardware. In the debug configuration, the binary file is 158 KB; in the release configuration it goes down to 120 KB after applying the optimization settings in IAR Embedded Workbench.
I know the file size can go below 50 KB, as there are some old .bin files that the previous developer was able to produce from the same software. But I can't find a way to reduce the file size further.
Does anyone have any idea how the binary file size could be reduced in the release configuration in IAR Embedded Workbench?
Here's the ending lines of my map file:
38 674 bytes of readonly code memory
4 721 bytes of readonly data memory
17 351 bytes of readwrite data memory
Errors: none
Warnings: none
In Vista Task Manager, I understand the available page file is listed like this:
Page File inUse M / available M
In XP it's listed as the Commit Charge Limit.
I had thought that:
Available Virtual Memory = Physical Memory Total + Sum of Page Files
But on my machine I've got Physical Memory = 2038M, Page Files = 4096M, Page File Available = 6051M. There's 83M unaccounted for here. What's that used for? I thought it might be something to do with the kernel memory, but the numbers don't seem to match up.
Info I've found so far:
See http://msdn.microsoft.com/en-us/library/aa965225(VS.85).aspx for more info.
Page file size can be found here: Computer Properties, advanced, performance settings, advanced.
I think your guess is correct that it has something to do with the kernel: kernel memory needs some physical backing as well.
However, I have to admit that when I tried to verify this, the numbers still did not match well, and a significant amount of memory remained unaccounted for.
I have:
Available Virtual Memory = 4 033 552 KB
Physical Memory Total = 2 096 148 KB
Sum of Page Files = 2048 MB
Kernel Non-Paged Memory = 28 264 KB
Kernel Paged Memory = 63 668 KB
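As a rough check of the identity in question, the gap can be computed from the figures above (all in KB; 2048 MB of page file = 2,097,152 KB) -- the result is larger than the two kernel pools combined, which is why the numbers do not match up:

```shell
# Physical Memory Total + Sum of Page Files - Available Virtual Memory,
# using the KB figures quoted above (2048 MB page file = 2097152 KB).
echo $(( 2096148 + 2097152 - 4033552 ))   # prints 159748 (KB, ~156 MB)
```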