SSD got only 500 MB reserved for the system and won't let me delete this partition or allocate the 450 GB of free space that is left unallocated [migrated] - disk

This question was migrated from Stack Overflow because it can be answered on Super User.
Migrated yesterday.
I bought a new SSD to use in my old computer, and once I plugged it in and started the allocation process, it only allocated 500 MB, which is said to be reserved for the system, leaving the remaining 450 GB unallocated. Also, the disk won't let me delete this partition or allocate the empty space, even though I can click the options to do so.
Disk 1 is the new one
So, after that, I tried to delete this partition in order to retry the process.
Delete partition action
And strangely it did not delete anything, since the partition is still there, and once I plugged the disk in again, even the name returned.
Because of this, I then tried to at least allocate the rest of the space, as shown in the next image. Notice that in this image you can also see the deleted partition that doesn't go away.
Allocate partition action
And so I waited about 15 minutes, only for it to show a screen saying:
Failed to complete the operation because the Disk Management console display is not up to date. Update the display using the update task. If the problem persists, close the Disk Management console and restart Disk Management or restart your computer.
And so I did, only to return to the same situation.
This is the image of the message that appeared.
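When Disk Management refuses to delete a system-reserved partition, the usual workaround is diskpart's clean command, which wipes the whole partition table on the selected disk. A hedged sketch of such a script, run from an elevated Command Prompt with diskpart /s wipe.txt; the disk number 1 is taken from the screenshots above and must be double-checked with list disk first, because clean destroys everything on the selected disk:

```shell
rem wipe.txt -- run as: diskpart /s wipe.txt (from an elevated prompt)
rem WARNING: "clean" destroys ALL data on the selected disk.
rem Verify the disk number with "list disk" before running this.
select disk 1
clean
convert gpt
create partition primary
format fs=ntfs quick
assign
```

On an older BIOS-only machine, convert mbr would replace convert gpt. After this, the full 450 GB should appear as a single formatted volume with a drive letter.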

Related

SCCM "Create Task Sequence Media" not enough space

I am getting the following error in SCCM when trying to create USB media:
In the CreateTSMedia log for this task I get:
Which indicates that there is not enough space in the destination or staging area. The staging area is more or less hard-set to c:\users\user, and I have the destination pointing to D:
You can see in the initial image that the total size of all the content in the sequence is just over 4 GB, and the drive with the least free space has over 55 GB, and yet I still get "not enough free space" errors.
Everything online just says to clear some space. But I already have plenty; something else is going on here.

Can't install sample dataset on Kibana - Forbidden Error

I installed Kibana and Elasticsearch on macOS Catalina via Brew (not sudo), but I'm not able to install the sample data sets. Does anyone have any idea why I'm getting this Forbidden error and how to resolve it? The error message is at the bottom right of the picture.
Go to the Elasticsearch config, uncomment the path.logs line, and fill it in with the right path.
Check whether you have enough disk space available (usage must stay below the 90% high watermark).
A good way to look for the reason for any error is the logs, if available :)
I was trying to load the sample data (eCommerce orders, flights, web logs) into my Kibana and was getting an error. The logs are shown below.
elasticsearch.log
[o.e.c.r.a.DiskThresholdMonitor] high disk watermark [90%] exceeded on [/Users/xyz/Installs/ElasticSearch/elasticsearch-7.9.3/data/nodes/0] free: 15.1gb[6.4%], shards will be relocated away from this node; currently relocating away shards totalling [0] bytes; the node is expected to continue to exceed the high disk watermark when these relocations are complete
My Mac has 250 GB of space in total; I freed up an extra 20 GB and then it worked. Please check that you have enough free disk space (usage must stay below the 90% watermark).
Go to Management -> Index Patterns -> Create index pattern.
It looks like for some reason I hit the threshold where the indices are locked because there is no more disk space, so I had to unlock the indices manually:
https://discuss.elastic.co/t/unable-to-create-index-pattern-from-kibana/167184
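Unlocking the indices manually is typically done through Elasticsearch's settings API: once usage crosses the flood-stage watermark, indices are marked read-only and stay that way even after space is freed. A sketch, assuming a default local cluster on localhost:9200:

```shell
# After freeing disk space, clear the read-only block on all indices.
# Assumes Elasticsearch is listening on the default localhost:9200.
curl -X PUT "localhost:9200/_all/_settings" \
     -H 'Content-Type: application/json' \
     -d '{"index.blocks.read_only_allow_delete": null}'
```

If the cluster runs elsewhere or with security enabled, the URL and credentials would need adjusting accordingly.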

Solr ate all Memory and throws -bash: cannot create temp file for here-document: No space left on device on Server

I had Solr running for a long time, approximately two weeks, and then I saw that Solr had eaten around 22 GB of my server's 28 GB of RAM.
While checking the status of Solr using bin/solr -i, it throws -bash: cannot create temp file for here-document: No space left on device.
I stopped Solr and restarted it; it is working fine now.
What is actually the problem? And what is the solution for it?
I never want Solr to stop or halt while running.
First, you should check the space on your file system, for example using df -h. Post the output here.
Is there any mount point without free space?
Second: find out the reason why there is no space left. Your question covers two different things: no space left on the file system, and high RAM usage.
Solr stores two different kinds of data: the search index and the data itself.
Storing the data is only needed if you want to output the documents after finding them in the index, for example if you want to use highlighting. So take a look at your schema.xml and decide for every single field whether it must be stored or whether indexing the field is enough for your needs; control that with the field's stored attribute.
Next: if you rebuild the index, keep in mind that you need double the space on disk during the rebuild.
You could also think about moving your index/data files to another disk.
Once you have solved your free-space problem on disk, you probably won't have a RAM issue any more.
If there is still a RAM problem, please post your Java start parameters; there you can define how much RAM is available to Solr. Solr needs a lot of virtual memory, but only a moderate amount of physical RAM.
Also, you could post the output of your log file.
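The disk checks suggested above can be done in two commands; the Solr home path below is a placeholder and should be adjusted to the actual install location:

```shell
# Show free space on every mounted file system; a mount at 100% use here
# explains the "No space left on device" error.
df -h

# Find the ten largest directories under Solr's home
# (/var/solr is an assumed path -- adjust to your install).
du -xh /var/solr 2>/dev/null | sort -h | tail -n 10
```

Oversized log or index directories showing up at the bottom of the du listing are the usual culprits.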

Creating an image of a drive cloned with ddrescue.

We have an old server with disk failures that we've tried to clone into VMSphere. This resulted in an error, and we couldn't pinpoint where the error came from.
With ddrescue we cloned the machine to a 2 TB external hard drive that we can use to experiment with, without causing any downtime.
Then we used plain dd to try to create an image that we could later convert or import into the virtual environment.
The problem is that we don't have any workstations that can handle a 2 TB file. Is there any way we can create an image of the drive with the partitions, data, and MBR? Basically, everything except the unallocated space.
You could try using dd. If you have some idea how big the OS and data partitions were, just chop that much plus a bit extra off the front of the image and save it in a new file. Say you guess it was 4 GB of OS and 8 GB of data; do something like this:
dd if=yourimage of=newsmallerimage bs=1024k count=14000
which will get you around 14 GB. Any virtual machine will likely ignore the blank space at the end anyway.
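The "chop off the front" operation can be rehearsed safely on a scratch file before touching the real 2 TB image; file names and sizes here are placeholders:

```shell
# Create a 16 MiB stand-in for the big image, then keep only its first
# 4 MiB -- the same front-truncation as the real dd command above.
dd if=/dev/zero of=scratch.img bs=1M count=16 status=none
dd if=scratch.img of=smaller.img bs=1M count=4 status=none
ls -l smaller.img
```

Verifying that smaller.img comes out at exactly bs x count bytes gives some confidence in the arithmetic before running the same command against the real clone.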

Windows paging file size

I am trying to understand how to set the paging file size appropriately on Vista. For example, under System Properties > Advanced > Performance Options, it shows under "Total paging file size for all drives" a recommended size of about 8 GB and a currently allocated size of about 4 GB. I've been trying everything possible (including unchecking "Automatically manage paging file size for all drives") to raise the value to the recommended size so I can run some larger problems with my code.
But the new value only shows briefly (when I use a custom size setting on one of my other hard drives) after I hit Set and OK; when I restart, it goes back to the default settings. What am I doing wrong? I'd appreciate it if somebody could point me to some place for help with this or share their experience.
You can alternatively make the change in the registry.
Key: HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\Session Manager\Memory management
Value: PagingFiles
This value will have an entry for each drive with its associated pagefile location and its minimum and maximum sizes.
It might look something like this:
C:\pagefile.sys 250 500
where 250 is the minimum and 500 is the maximum, both in MB. Try changing it here and see what happens.
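The same registry edit can be made from an elevated Command Prompt with reg add instead of regedit; the 4096/8192 MB sizes below are examples only, and a reboot is needed before the new pagefile takes effect:

```shell
rem Set a 4-8 GB pagefile on C: (sizes are in MB and are examples only).
rem Run from an elevated Command Prompt; reboot afterwards.
reg add "HKLM\SYSTEM\CurrentControlSet\Control\Session Manager\Memory Management" ^
    /v PagingFiles /t REG_MULTI_SZ /d "C:\pagefile.sys 4096 8192" /f
```

Setting the minimum and maximum to the same value avoids pagefile fragmentation, at the cost of reserving that space permanently.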
