What is the reason of memory gap in hard drives? [closed] - windows

Closed. This question is off-topic. It is not currently accepting answers.
Closed 10 years ago.
I want to know the exact reason behind the memory difference on our hard drives and pen drives.
For example, when we buy a pen drive of 4 GB, the actual usable space is about 3.7 GB. What happens to the rest of the memory? Are the manufacturers stealing this space from us, or is there a technical reason behind it?
Thanks,
Nitesh Kumar

Manufacturers use decimal prefixes (1 GB = 10^9 bytes), while your operating system reports sizes using binary prefixes (1 GiB = 2^30 bytes). This gives a discrepancy of approximately 2.4% per prefix magnitude.
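A quick sketch of the arithmetic behind this answer, showing where the "missing" 0.27 GB goes:

```python
# Why a "4 GB" drive shows up as roughly 3.7 GB:
# the manufacturer counts decimal gigabytes (10**9 bytes),
# while the OS reports binary gibibytes (2**30 bytes).
advertised_bytes = 4 * 10**9              # 4 GB as sold
reported_gib = advertised_bytes / 2**30   # what the OS displays
print(f"{reported_gib:.2f} GiB")          # → 3.73 GiB

per_prefix = 1 - 1000 / 1024              # discrepancy per prefix step
print(f"{per_prefix:.1%} per prefix")     # → 2.3% per prefix
```

No bytes are lost; the same number of bytes is simply divided by a larger unit.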

Related

Searching through a list [closed]

Closed. This question needs details or clarity. It is not currently accepting answers.
Closed 6 years ago.
I'm reading about AI and in the notes it is mentioned
A lookup table in chess would have roughly 35^100 entries.
But what does this mean? Is there any way to find out how long it would take the computer to search through the table and find its entry? Would we assume there is some order, or that there is no order?
The number of atoms in the known universe is estimated to be around 10^80 which is much less than 35^100. With current technology, at least a few thousand atoms are required to store a single bit. I assume that each entry of your table would have multiple bits. You would need some really advanced technology to implement the memory of your computer.
So the answer is: With current technology it is not a matter of time, it is simply impossible.
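A quick check of the magnitudes in the answer above, comparing 35^100 against the roughly 10^80 atoms in the known universe:

```python
# Back-of-the-envelope comparison: table entries vs. atoms.
from math import log10

entries = 35 ** 100
atoms = 10 ** 80                      # rough estimate from the answer

print(f"35^100 ~ 10^{log10(entries):.0f}")   # → 35^100 ~ 10^154
print(entries > atoms)                        # → True
# Even at one entry per atom, you would need ~10^74 universes.
print(f"~10^{log10(entries // atoms):.0f} universes")
```

So before any question of search time arises, the table cannot even be stored.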

In Windows what is a "runtime image"? [closed]

Closed. This question is off-topic. It is not currently accepting answers.
Closed 9 years ago.
I am reading the book "MySQL 5.0 Certification Study Guide".
On page 362 it states:
• mysql-debug contains support for debugging. Normally, you don't choose this server for production use because it has a larger runtime image and uses more memory.
What is an "image"? I have searched extensively to try to find the answer.
The image is the size of the executable code in memory.
In general, "X uses more memory than Y" could refer to both the runtime image size and the amount of space allocated for non-executable data. This quotation is clarifying that both are worse in the debug version.

How much data can I allocate on heap in codechef problems? [closed]

Closed. This question does not meet Stack Overflow guidelines. It is not currently accepting answers.
This question does not appear to be about programming within the scope defined in the help center.
Closed 8 years ago.
What is the maximum amount of data I can allocate in a solution on CodeChef?
CodeChef uses SPOJ servers. The new problems run on the Cube cluster, so, as mentioned at http://www.spoj.com/clusters/, the memory limit is 1536 MB. That means you have plenty of heap memory available (a large portion of the total) and need not worry for any reasonable solution.
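To put the 1536 MB limit in perspective, here is a rough capacity estimate (assuming 4-byte integers and ignoring allocator and runtime overhead):

```python
# How many 4-byte ints fit under a 1536 MB memory limit?
limit_bytes = 1536 * 2**20        # 1536 MB in bytes
ints_that_fit = limit_bytes // 4  # assuming 4 bytes per int
print(ints_that_fit)              # → 402653184, i.e. ~4 * 10**8 ints
```

That is far more than any reasonable contest solution needs.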

File not being sent to Pen drive [closed]

Closed. This question is off-topic. It is not currently accepting answers.
Closed 10 years ago.
I have a 32 GB pen drive.
When I try to copy a file larger than 4 GB to it, a message says the file is too large for the destination. Why does it display this message?
How can I solve this problem?
Reformat it as NTFS instead of FAT32: FAT32 cannot store a single file of 4 GiB or larger, regardless of how much free space the drive has.
And check out the other Stack Exchange sites, where such non-programming questions are appropriate.
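A short sketch of where the limit comes from: FAT32 records a file's size in a 32-bit field, so the largest representable file is 2^32 − 1 bytes.

```python
# FAT32 stores file sizes in a 32-bit field.
fat32_max = 2**32 - 1             # largest representable file size
print(fat32_max)                  # → 4294967295 bytes
print(fat32_max / 2**30)          # just under 4 GiB

big_file = 5 * 2**30              # a 5 GiB file
print(big_file > fat32_max)       # → True: "file too large"
```

NTFS (and exFAT) use much wider size fields, which is why reformatting fixes the problem.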

How to check what process is writing to hard drive in shell [closed]

Closed. This question is off-topic. It is not currently accepting answers.
Closed 11 years ago.
The free space on my hard disk is shrinking for an unknown reason; it keeps decreasing until no space is left. I don't know the cause, so I want to find out which process is the culprit and terminate it.
A command like find / -size +5M will help you find files bigger than a particular size (5 MB in the example). These are probably log files, so you might want to set up logrotate properly. Another possibility is core files, which would mean some auto-started program is crashing. Also have a look at lsof.
