Maximum IO Transfer Size in Linux Kernel 4.13.10

I am unable to issue an IO with a transfer size of more than 131 KB on Linux kernel 4.13.10. The failure is observed from the block layer API blk_rq_map_user, which returns error code EINVAL (22). Is there a limit on the maximum IO size?
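For context: the block layer caps each request at the queue's limits, and blk_rq_map_user() fails with -EINVAL when the mapped length exceeds them; 131 KB is suspiciously close to 131072 bytes (128 KiB), a common max_hw_sectors_kb value on controllers with small scatter-gather tables. A minimal userspace sketch that reads the relevant sysfs limits (the device name sda is an assumption; adjust for your disk):

```c
/* Minimal sketch: print the block queue limits that cap a single request.
 * Assumes a disk at /dev/sda; values are in KiB, so 128 here corresponds
 * to a 131072-byte ceiling. */
#include <stdio.h>

static long read_limit(const char *attr)
{
    char path[128];
    long val = -1;
    FILE *f;

    snprintf(path, sizeof(path), "/sys/block/sda/queue/%s", attr);
    f = fopen(path, "r");
    if (!f)
        return -1;
    if (fscanf(f, "%ld", &val) != 1)
        val = -1;
    fclose(f);
    return val;
}

int main(void)
{
    /* max_sectors_kb: current soft cap, writable by root;
     * max_hw_sectors_kb: hard cap imposed by the driver/controller. */
    printf("max_sectors_kb    = %ld\n", read_limit("max_sectors_kb"));
    printf("max_hw_sectors_kb = %ld\n", read_limit("max_hw_sectors_kb"));
    return 0;
}
```

If max_hw_sectors_kb itself reads 128, the ceiling comes from the driver/HBA and cannot be raised from userspace; splitting the transfer into multiple requests is the usual workaround.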

Related

Limit Linux DMA allocation to specific range

I am working on an SoC whose RAM starts at 0x200M.
For some reason I need to limit DMA allocations to below 0x220M, so I called dma_set_mask_and_coherent(dev, 0x21FFFFFFF).
I noticed that dev->dma_mask was not set, while dev->coherent_dma_mask was set.
Still, calling dma_alloc_coherent returns buffers above the requested limit.
How can I limit the buffer address?
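A likely culprit, assuming standard DMA API semantics: a DMA mask must be a contiguous run of low-order bits (2^n - 1), because it describes how many address bits the device can drive, not an arbitrary upper bound, so dma_set_mask_and_coherent() may reject a value like 0x21FFFFFFF for the streaming mask. A minimal kernel-side sketch of the conventional pattern (example_probe and the 64 KB size are illustrative, not a drop-in fix):

```c
#include <linux/dma-mapping.h>
#include <linux/device.h>
#include <linux/sizes.h>

/* Illustrative probe excerpt. A DMA mask only says "the device can
 * address this many bits"; it cannot express an upper bound such as
 * 0x220M that is not of the form 2^n - 1. */
static int example_probe(struct device *dev)
{
	dma_addr_t dma_handle;
	void *cpu_addr;
	int ret;

	/* DMA_BIT_MASK(n) builds the 2^n - 1 mask the API expects. */
	ret = dma_set_mask_and_coherent(dev, DMA_BIT_MASK(32));
	if (ret)
		return ret;

	cpu_addr = dma_alloc_coherent(dev, SZ_64K, &dma_handle, GFP_KERNEL);
	if (!cpu_addr)
		return -ENOMEM;

	/* ... use the buffer ... */
	dma_free_coherent(dev, SZ_64K, cpu_addr, dma_handle);
	return 0;
}
```

To confine allocations to a true physical window, a per-device reserved-memory/CMA region (for example a reserved-memory node bound with of_reserved_mem_device_init()) is the usual route rather than the DMA mask.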

Max transfer size of SCSI Read(10) on a Physical Drive

As described here, IOCTL_STORAGE_QUERY_PROPERTY with StorageAdapterProperty can be used to get the max transfer size per SCSI Read(10) command.
In this code, 16 sectors are read from the starting LBA. I tried modifying that number: in my Win7 environment, the maximum is 256 sectors through SATA and 128 sectors through a SATA-USB bridge to an SSD, which matches the result from IOCTL_STORAGE_QUERY_PROPERTY with StorageAdapterProperty.
As far as I know, when installing an OS (Win7, Win10, macOS), a device can receive a SCSI Read(10) command of up to 2048 sectors. I wonder which layer limits the transfer size (operating system, device driver, ...) and whether there is any way to bypass that layer to send a SCSI Read(10) command longer than the limit in one shot.
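For the query side of this, a minimal sketch (PhysicalDrive0 and the 512-byte sector size are assumptions; run elevated) that asks the adapter for its per-command ceiling via IOCTL_STORAGE_QUERY_PROPERTY:

```c
/* Minimal sketch: query the adapter's maximum transfer size per command. */
#include <windows.h>
#include <winioctl.h>
#include <stdio.h>

int main(void)
{
    HANDLE h = CreateFileA("\\\\.\\PhysicalDrive0", 0,
                           FILE_SHARE_READ | FILE_SHARE_WRITE,
                           NULL, OPEN_EXISTING, 0, NULL);
    if (h == INVALID_HANDLE_VALUE) {
        fprintf(stderr, "open failed: %lu\n", GetLastError());
        return 1;
    }

    STORAGE_PROPERTY_QUERY query = { StorageAdapterProperty,
                                     PropertyStandardQuery };
    STORAGE_ADAPTER_DESCRIPTOR desc = { 0 };
    DWORD got = 0;

    if (DeviceIoControl(h, IOCTL_STORAGE_QUERY_PROPERTY,
                        &query, sizeof(query),
                        &desc, sizeof(desc), &got, NULL)) {
        /* MaximumTransferLength is in bytes; assumes 512-byte sectors. */
        printf("MaximumTransferLength: %lu bytes (%lu sectors)\n",
               desc.MaximumTransferLength,
               desc.MaximumTransferLength / 512);
        printf("MaximumPhysicalPages:  %lu\n", desc.MaximumPhysicalPages);
    } else {
        fprintf(stderr, "ioctl failed: %lu\n", GetLastError());
    }

    CloseHandle(h);
    return 0;
}
```

The returned MaximumTransferLength is enforced below the class driver by the port/miniport driver, so even a pass-through request (IOCTL_SCSI_PASS_THROUGH) is clamped to it; sending a larger Read(10) generally requires a different controller or driver stack, not a flag.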

Should the stats reported by Go's runtime.ReadMemStats approximately equal the resident memory set reported by ps aux?

In Go, should the Sys stat, or any other stat or combination reported by runtime.ReadMemStats, approximately equal the resident set size reported by ps aux?
Alternatively, assuming some memory may be swapped out, should the Sys stat be approximately greater than or equal to the RSS?
We have a long-running web service that handles a high frequency of requests, and we are finding that the RSS quickly climbs to consume virtually all of the 64GB of memory on our servers. When it hits ~85% we begin to see considerable degradation in our response times and in the number of concurrent requests we can handle. The run listed below is after about 20 hours of execution and is already at 51% memory usage.
I'm trying to determine whether the likely cause is a memory leak (we make some calls to CGO). The data seems to indicate that it is, but before I go down that rabbit hole I want to rule out a fundamental misunderstanding of the statistics I'm using to make that call.
This is an amd64 build targeting linux and executing on CentOS.
Reported by runtime.ReadMemStats:
Alloc: 1294777080 bytes (1234.80MB) // bytes allocated and not yet freed
Sys: 3686471104 bytes (3515.69MB) // bytes obtained from system (sum of XxxSys below)
HeapAlloc: 1294777080 bytes (1234.80MB) // bytes allocated and not yet freed (same as Alloc above)
HeapSys: 3104931840 bytes (2961.09MB) // bytes obtained from system
HeapIdle: 1672339456 bytes (1594.87MB) // bytes in idle spans
HeapInuse: 1432592384 bytes (1366.23MB) // bytes in non-idle span
Reported by ps aux:
%CPU %MEM VSZ RSS
1362 51.3 306936436 33742120
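As a point of comparison, ps derives RSS from the kernel's per-process counters under /proc. A minimal C sketch (Linux-only) that prints a process's own VSZ and RSS from /proc/self/statm, the same accounting ps reports:

```c
/* Minimal sketch: print this process's VSZ and RSS as the kernel
 * accounts them. /proc/self/statm field 1 is total program size and
 * field 2 is resident pages, both in units of the system page size. */
#include <stdio.h>
#include <unistd.h>

int main(void)
{
    long size_pages, resident_pages;
    long page = sysconf(_SC_PAGESIZE);
    FILE *f = fopen("/proc/self/statm", "r");

    if (!f) {
        perror("fopen");
        return 1;
    }
    if (fscanf(f, "%ld %ld", &size_pages, &resident_pages) != 2) {
        fclose(f);
        return 1;
    }
    fclose(f);

    printf("VSZ: %ld bytes\n", size_pages * page);
    printf("RSS: %ld bytes\n", resident_pages * page);
    return 0;
}
```

Also note that Sys counts address space reserved from the OS, part of which the runtime may already have returned (tracked in MemStats.HeapReleased and absent from RSS), so Sys minus HeapReleased is usually the closer comparison to RSS.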

How to Resolve this Out of Memory Issue for a Small Variable in Matlab?

I am running a 32-bit version of Matlab R2013a on my computer (4GB RAM, and 32-bit Windows 7).
I have a dataset (~60 MB) that I want to read using
ds = dataset('File', myFile, 'Delimiter', ',');
Each time, I get an Out of Memory error. In theory I should be able to use 2GB of RAM, so there should be no problem reading such a small file.
Here is what I got when I typed memory:
Maximum possible array: 36 MB (3.775e+07 bytes) *
Memory available for all arrays: 421 MB (4.414e+08 bytes) **
Memory used by MATLAB: 474 MB (4.969e+08 bytes)
Physical Memory (RAM): 3317 MB (3.478e+09 bytes)
* Limited by contiguous virtual address space available.
** Limited by virtual address space available.
I followed every instruction I found (this is not a new issue), but my case seems rather odd, because I cannot even run a simple program now.
System: Windows 7 32 bit
Matlab: R2013a
RAM: 4 GB
Clearly your issue is right here:
Maximum possible array: 36 MB (3.775e+07 bytes) *
You are either using a lot of memory elsewhere on your system, or you have very little swap space.

How do I increase memory limit (contiguous as well as overall) in Matlab r2012b?

I am using Matlab r2012b on win7 32-bit with 4GB RAM.
However, the memory limit on the Matlab process is pretty low. Running the memory command gives the following output:
Maximum possible array: 385 MB (4.038e+08 bytes) *
Memory available for all arrays: 1281 MB (1.343e+09 bytes) **
Memory used by MATLAB: 421 MB (4.413e+08 bytes)
Physical Memory (RAM): 3496 MB (3.666e+09 bytes)
* Limited by contiguous virtual address space available.
** Limited by virtual address space available.
I need to increase the limit to as much as possible.
System: Windows 7 32 bit
RAM: 4 GB
Matlab: r2012b
For general guidance on memory management in MATLAB, see this MathWorks article. Some specific suggestions follow.
Set the 3GB switch to enlarge the user address space available to MATLAB. On Windows XP this is the /3GB flag in boot.ini; on Windows Vista/7 the equivalent is bcdedit /set IncreaseUserVa 3072 from an elevated prompt. This is mentioned in this section of the above MathWorks page.
Also use pack to increase the Maximum possible array value by compacting memory. 32-bit MATLAB needs blocks of contiguous free address space, which is where this first value comes from. The pack command saves all variables to disk, clears the workspace, and reloads them so that they are laid out contiguously in memory.
For the overall memory limit, try disabling virtual machines, closing other programs, and stopping unnecessary Windows services. There is no easy answer for this part.
