How can 8086 processors access hard drives larger than 1 MB? - memory-management

How can 8086 processors (or real mode on later processors) access hard drives larger than 1 MB, when they can only address 1 MB of RAM (without expanded memory)?

Access is not linear (byte by byte) but by sector; the sector size is typically 512 bytes. The drive is addressed by sector number (cylinder/head/sector through the BIOS), not by memory address, so the 1 MB limit applies only to where the data lands in RAM. The computer reads sectors into memory as needed.
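To make the arithmetic concrete, here is a minimal MATLAB sketch (the 32 MB drive size is purely an illustrative assumption):
sector_bytes = 512;                         % typical sector size
drive_bytes  = 32 * 2^20;                   % a hypothetical 32 MB drive
num_sectors  = drive_bytes / sector_bytes   % = 65536 addressable sectors
% Only a sector-sized (512-byte) buffer has to fit inside the 1 MB address
% space for any single transfer, no matter how many sectors the drive has.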

Related

How do you calculate slack space?

A file of size 12,500 bytes is to be stored on a hard disk drive where the sector size is 512 bytes, and a cluster consists of 8 sectors. How much slack space is there once the file has been saved?
A few things to know:
Slack space is the leftover sectors and bytes from a cluster allocation.
Hard drives understand sectors, which are nearly universally 512 bytes as far as logical block addressing goes.
Operating systems (OSes) understand clusters, which are groups of sectors. The number of sectors per cluster can vary between OSes and file systems.
The OS DOES NOT understand sectors. The hard drive DOES NOT understand clusters. But both terms are needed for calculating slack space.
Example 1
Given:
Sector size: 512 bytes (by far the most common logical sector size)
Cluster size: 8 sectors
File size: 2560 bytes
In this scenario, when disk space is allocated to store the file, the smallest amount that the OS can read/write MUST be 8 sectors or 4096 bytes.
To find slack space:
First, find your cluster size in bytes. Cluster == 8 sectors × 512 bytes.
∴ the allotment is 4096 bytes.
Then check whether the file size is larger or smaller than that allotment.
2560 bytes < 4096.
∴ only one cluster is needed to save this file.
Subtract the file size from the cluster size and you have the slack space.
4096 - 2560 == 1536 bytes (or 3 sectors) of slack space.
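As a one-line MATLAB check of Example 1:
8*512 - 2560      % = 1536 bytes of slack (3 sectors)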
Example 2
Given:
Sector size: 512 bytes
Cluster size: 16 sectors
File size: 61440 bytes
In this scenario, when disk space is allocated to store the file, the smallest amount that the OS can read/write MUST be 16 sectors or 8192 bytes.
Let's work through the same process:
First, find your cluster size in bytes. Cluster == 16 sectors × 512 bytes.
∴ the allotment is 8192 bytes.
Then check whether the file size is larger or smaller than that allotment.
61440 bytes > 8192.
∴ several clusters are needed to save this file.
Since this file is larger, divide it by the cluster size in bytes.
61440 / 8192 == 7.5 clusters required to save this file.
This isn't a nice round number, so we have to round up. Recall that the OS cannot write LESS than a whole cluster, and if we allot fewer whole clusters than necessary, we can't save the file.
∴ we require 8 clusters.
Find the size in bytes of your allotment.
8 clusters * 8192 == 65536.
Subtract the file size from the cluster allotment and you have the slack space.
65536 - 61440 == 4096 bytes (or 8 sectors) of slack space.
Try it.
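To check your answer to the question above, here is a minimal MATLAB sketch of the procedure (the variable names are just illustrative):
sector_bytes        = 512;
sectors_per_cluster = 8;        % Example 1; use 16 for Example 2
file_bytes          = 2560;     % Example 1; use 61440 for Example 2
cluster_bytes       = sector_bytes * sectors_per_cluster;
clusters_needed     = ceil(file_bytes / cluster_bytes);
slack_bytes         = clusters_needed * cluster_bytes - file_bytes
slack_sectors       = slack_bytes / sector_bytes
% Example 1 gives 1536 bytes (3 sectors); Example 2 gives 4096 bytes (8 sectors).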

Virtual Memory - Calculating the Virtual Address Space

I am studying for an exam tomorrow and I came across this question:
You are given a memory system with 2MB of virtual memory, 8KB page size,
512 MB of physical memory, TLB contains 16 entries, 2-way set associative.
How many bits are needed to represent the virtual address space?
I was thinking it would be 20 bits, since 2^10 is 1024, so I simply multiply 2^10*2^10 and get 2^20. However, the answer ends up being 21 and I have no idea why.
The virtual address space required is 2 MB.
As you have calculated, 20 bits can accommodate 1 MB of VM space. You need 21 bits to accommodate 2 MB.
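A quick MATLAB check of the arithmetic:
vm_bytes = 2 * 2^20;          % 2 MB of virtual memory
va_bits  = log2(vm_bytes)     % = 21 bits
% Cross-check: an 8 KB page gives a 13-bit page offset, and 2 MB / 8 KB = 256
% pages needs an 8-bit virtual page number, so 13 + 8 = 21 bits.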

Would sub-sector erase increase flash memory endurance more than whole-sector erase?

I'm planning to use Spansion (now Cypress) S25FL032P flash memory (datasheet: http://www.cypress.com/file/196861/download).
The datasheet says:
• Memory architecture
– Uniform 64 KB sectors
– Top or bottom parameter block (Two 64-KB sectors (top or bottom) broken down into sixteen 4-KB sub-sectors each)
– 256-byte page size
...
• Erase
– Bulk erase function
– Sector erase (SE) command (D8h) for 64 KB sectors
– Sub-sector erase (P4E) command (20h) for 4 KB sectors
– Sub-sector erase (P8E) command (40h) for 8 KB sectors
So it can perform an erase operation at the (1) chip, (2) 64 KB sector, (3) 8 KB sub-sector, or (4) 4 KB sub-sector level.
My question is: do both of the following approaches consume a single P/E cycle from the perspective of the 64 KB sector?
A single SE command for the whole 64 KB sector.
Sixteen P4E (4 KB sub-sector erase) commands covering the same 64 KB.
Or does the second method consume 16x the P/E cycles?
I'm asking because I want to erase regions as small as possible while keeping cell endurance as high as possible.
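For reference, here is a hedged MATLAB sketch of the two command sequences being compared, assuming the common SPI NOR format of a one-byte opcode followed by a three-byte address; the base address is a hypothetical parameter-sector base, and the write-enable (WREN) required before each erase is omitted, so check the datasheet for the exact sequences:
SE   = hex2dec('D8');                  % 64 KB sector erase opcode
P4E  = hex2dec('20');                  % 4 KB sub-sector erase opcode
base = hex2dec('000000');              % hypothetical base of a parameter sector
addr = @(a) [bitshift(a,-16), bitand(bitshift(a,-8),255), bitand(a,255)];
se_frame   = uint8([SE, addr(base)]);                           % 1 command erases the 64 KB sector
p4e_frames = arrayfun(@(k) uint8([P4E, addr(base + k*4096)]), 0:15, ...
                      'UniformOutput', false);                  % 16 commands cover the same 64 KB
% Each cell in the 64 KB region is erased exactly once either way; the open
% question is whether the device's endurance rating counts the 16-command
% path against the sector differently.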

How to Resolve this Out of Memory Issue for a Small Variable in Matlab?

I am running a 32-bit version of Matlab R2013a on my computer (4GB RAM, and 32-bit Windows 7).
I have a dataset (~60 MB) that I want to read using
ds = dataset('File', myFile, 'Delimiter', ',');
Each time, I get an Out of Memory error. Theoretically, I should be able to use 2 GB of RAM, so there should be no problem reading such a small file.
Here is what I got when I typed memory:
Maximum possible array: 36 MB (3.775e+07 bytes) *
Memory available for all arrays: 421 MB (4.414e+08 bytes) **
Memory used by MATLAB: 474 MB (4.969e+08 bytes)
Physical Memory (RAM): 3317 MB (3.478e+09 bytes)
* Limited by contiguous virtual address space available.
** Limited by virtual address space available.
I followed every instruction I found (this is not a new issue), but my case seems rather odd because I cannot even run a simple program now.
System: Windows 7 32 bit
Matlab: R2013a
RAM: 4 GB
Clearly your issue is right here:
Maximum possible array: 36 MB (3.775e+07 bytes) *
Only 36 MB of contiguous virtual address space is left for any single array. Either your MATLAB session and other software are already using a lot of memory, or the 32-bit process's address space has become fragmented; a very low swap space can also hold down the overall figure.
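Before retrying, it can help to see what is already holding the address space (a minimal sketch; largeUnusedVar is a placeholder for whatever whos shows you no longer need):
memory                     % re-check the limits reported above
whos                       % list workspace variables and their sizes
clear largeUnusedVar       % placeholder name: drop anything you no longer need
ds = dataset('File', myFile, 'Delimiter', ',');   % then retry the read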

How do I increase memory limit (contiguous as well as overall) in Matlab r2012b?

I am using MATLAB R2012b on Windows 7 32-bit with 4 GB RAM.
However, the memory limit on the MATLAB process is pretty low. Running the memory command, I get the following output:
Maximum possible array: 385 MB (4.038e+08 bytes) *
Memory available for all arrays: 1281 MB (1.343e+09 bytes) **
Memory used by MATLAB: 421 MB (4.413e+08 bytes)
Physical Memory (RAM): 3496 MB (3.666e+09 bytes)
* Limited by contiguous virtual address space available.
** Limited by virtual address space available.
I need to increase the limit as much as possible.
System: Windows 7 32 bit
RAM: 4 GB
Matlab: r2012b
For general guidance with memory management in MATLAB, see this MathWorks article. Some specific suggestions follow.
Set the 3 GB switch to increase the user-mode virtual address space available to MATLAB. On Windows XP this is the /3GB flag in boot.ini (it can also be set through a properties dialog if you are averse to text editors); on Windows 7, as used here, the equivalent is running bcdedit /set IncreaseUserVa 3072 from an elevated command prompt. This is mentioned in this section of the above MathWorks page.
Also use pack to increase the Maximum possible array by compacting the memory, as sketched below. 32-bit MATLAB needs contiguous blocks of free virtual address space for each array, which is where this first value comes from. The pack command saves all the variables to disk, clears the workspace, and reloads them so that they sit contiguously in memory.
For the overall memory figure, try shutting down any virtual machines, closing other programs, and stopping unnecessary Windows services. There is no easy answer for this part.
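A minimal sketch of the pack workflow described above (run it from the command prompt, not inside a function):
memory     % note the "Maximum possible array" figure before compacting
pack       % saves variables to a temp file, clears the workspace, reloads them contiguously
memory     % the contiguous figure should improve if the workspace was fragmented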

Resources