Read-only file system when attempting mkdir /root/.local/share on Mac with Putty - macos

I am trying to connect to a Bitnami AWS instance with PuTTY on macOS, but when I enable Auth in SSH, on both Catalina and Big Sur I get this error:
(putty: 3637): Gtk-WARNING **: Attempting to set the permissions of `/Users/daniele/.local/share/recently-used.xbel ', but failed: No such file or directory
I tried to create the folder:
sudo mkdir -p /root/.local/share
I get this error:
mkdir: /root/.local/share: Read-only file system

As per the error message, we should create the folder at the following path:
/Users/daniele/.local/share/
And not:
/root/.local/share
Therefore, the correct command is:
mkdir -p /Users/daniele/.local/share

First check the output of this command: csrutil status
If the result is enabled, you need to restart the machine while holding Command+R, open Terminal in Recovery mode, and run csrutil disable.
Restart, and check the status again: csrutil status.
Here are two methods.
First, as root, remount the root filesystem read-write:
sudo mount -uw /
Then you can mkdir or touch new files.
If you still can't read or write anything, you can try this:
cd ~ # cd to your home directory
sudo vim /etc/synthetic.conf # create this file if it doesn't exist
In the conf file, add a new line:
data	/Users/xx/data # Note: the separator between the two fields is a tab
Restart, and you will find a link named /data in the root.
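For reference, here is a sketch of what /etc/synthetic.conf might look like (the name data and the target path are examples taken from the answer above; the two fields must be separated by a literal tab, not spaces):

```
# /etc/synthetic.conf -- each line maps a name at the root of / to a target path
# The separator between the two fields must be a TAB character
data	/Users/xx/data
```

After a restart, /data should appear at the root as a synthetic link to the target directory.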

GCP SDK changing cloud directories from local terminal

I'm trying to follow this tutorial. It seems that if I open a "cloud shell" in the browser, I can execute commands such as cd <cloud path> without any issues. However, when I try to do so from my laptop terminal I run into issues:
(base) user ~ % gsutil ls
gs://my_project/
(base) user ~ % cd gs://my_project/
cd: no such file or directory: gs://my_project/
(base) user ~ % gsutil cd gs://my_project/
CommandException: Invalid command "cd".
(base) user ~ % gcloud cd gs://my_project/
ERROR: (gcloud) Invalid choice: 'cd'.
How should one change cloud directories from a local terminal, using gcloud SDK?
The program gsutil does not support the concept of a current working directory. Objects are specified by their full path: <protocol>://<bucket>/<object-name>, for example gs://my_project/some-object.
If you install the package gcsfuse, you can mount a Google Cloud Storage bucket as a directory. Then you can use the Linux shell command cd to move around the bucket as a file system.
Cloud Storage FUSE
This article shows how to set up gcsfuse on Cloud Shell:
Unlimited persistent disk in google cloud shell
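As a rough sketch of what that looks like in practice (the bucket name my_project comes from the question, the mount point is an arbitrary choice, and gcsfuse must already be installed and authenticated):

```
# Mount the bucket at an empty local directory
mkdir -p ~/my_project_mount
gcsfuse my_project ~/my_project_mount

# Now ordinary shell navigation works on the bucket's contents
cd ~/my_project_mount
ls

# Unmount when done (fusermount -u on Linux; umount on macOS)
fusermount -u ~/my_project_mount
```

Note that FUSE-mounted buckets behave like a file system for navigation, but performance and POSIX semantics differ from a local disk.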

mkdir -p ~ /.ssh returning Permission Denied

This is the very first time I am deploying a website, using Linode. I am following a tutorial, and after creating a user on the server I try to create a folder with mkdir -p ~ /.ssh, but I keep receiving mkdir: cannot create directory ‘/.ssh’: Permission denied
I am using Ubuntu on Linode, connecting with PuTTY. So my question is: why am I receiving this error, and how do I fix it?
You have an extra space in your command:
mkdir -p ~ /.ssh
^ here
That space splits the path you wanted (~/.ssh) into two separate paths: ~ and /.ssh. Note that the second one is in the root (/) directory, which requires additional access rights to write to. You most probably wanted to create .ssh in your home directory (which is where ~ points), so the proper command is:
mkdir -p ~/.ssh
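You can see the same word-splitting behaviour with any two arguments; a quick demonstration in a scratch directory (the directory names here are arbitrary):

```shell
cd "$(mktemp -d)"          # scratch area so nothing real is touched
mkdir -p home .ssh-demo    # two arguments -> two directories,
                           # exactly how the shell parsed `~ /.ssh`
ls -d home .ssh-demo       # both were created
```

The shell splits the command line on unquoted whitespace before mkdir ever sees it, so mkdir receives two paths and dutifully creates (or tries to create) both.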

tmuxinator generated bash script ruined by ruby error message that should be on stderr?

So I'm trying to bring up a tmuxinator (version 1.1) window and panes... and when I do, nothing happens... so I run this instead:
leeand00@me-host:~$ tmuxinator debug pwrsh_n_bash
/var/lib/gems/2.4.0/gems/tmuxinator-1.1.1/lib/tmuxinator/project.rb:352: warning: Insecure world writable dir /mnt/c in PATH, mode 040777
#!/bin/bash
# Clear rbenv variables before starting tmux
unset RBENV_VERSION
unset RBENV_DIR
...
and I notice that some Ruby library deep, deep down is complaining about a directory being writable, right at the top of the generated script, and that's why it won't run... I copied just the bash script into another terminal and it runs just fine.
I went to the file in question:
def extract_tmux_config
  options_hash = {}

  options_string = `#{show_tmux_options}` # <- THIS BEING LINE 352 from whence the warning came...
  options_string.encode!("UTF-8", invalid: :replace)
  options_string.split("\n").map do |entry|
    key, value = entry.split("\s")
    options_hash[key] = value
    options_hash
  end

  options_hash
end
So I figure maybe it doesn't like the permissions on my ~/.tmux.conf, they are a little goofy (thanks MS, every file in my OneDrive is -rw-rw-rw- using this WSL thing).
I try changing it:
$ chmod o-w ~/.tmux.conf
$ chmod g-w ~/.tmux
Now its permissions are -rw-r--r--
I run tmuxinator debug pwrsh_n_bash again, and I still get the same error message at the top that prevents it from running...
Maybe it's the ~/.tmuxinator folder I think...
chmod -R o-w ./.tmuxinator/
chmod -R g-w ./.tmuxinator/
I re-ran $ tmuxinator debug pwrsh_n_bash, and I still get the error at the top.
I'm aware that I have some of these writable folders on my mounted c drive but I don't understand what that has to do with this warning showing up and preventing me from terminal bliss.
What library causes this issue, and how can I silence its warning? And shouldn't that stuff be piped to stderr instead of showing up in my stdout, or in the file that this gem is trying to execute after writing it?
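One possible workaround (untested here, and hedged: this assumes the message comes from Ruby's own warning mechanism, which the "Insecure world writable dir ... in PATH" text suggests, since that warning is emitted by the Ruby interpreter itself when spawning subcommands):

```
# Silence Ruby's warning mechanism for this invocation
RUBYOPT=-W0 tmuxinator debug pwrsh_n_bash

# Or, if the message is in fact being written to stderr, just drop that stream
tmuxinator debug pwrsh_n_bash 2>/dev/null
```

RUBYOPT is the standard environment variable for passing flags to the Ruby interpreter, and -W0 disables its warnings. Another angle would be to remove the world-writable /mnt/c entry from PATH before running tmuxinator, which addresses the warning's actual cause.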

Vagrant Error: File upload source file must exist

I'm trying to use a vagrant file I received to set up a VM in Ubuntu with virtualbox.
After using the vagrant up command I get the following error:
File provisioner:
* File upload source file /home/c-server/tools/appDeploy.sh must exist
appDeploy.sh does exist in the correct location and looks like this:
#!/bin/bash
#
# Update the app server
#
/usr/local/bin/aws s3 cp s3://dev-build-ci-server/deploy.zip /tmp/.
cd /tmp
unzip -o deploy.zip vagrant/tools/deploy.sh
cp -f vagrant/tools/deploy.sh /tmp/.
rm -rf vagrant
chmod +x /tmp/deploy.sh
dos2unix /tmp/deploy.sh
./deploy.sh
rm -rf ./deploy.sh ./deploy.zip
#
sudo /etc/init.d/supervisor stop
sudo /etc/init.d/supervisor start
#
Since the script exists in the correct location, I'm assuming it's looking for something else (maybe something that should exist on my local computer). What that is, I am not sure.
I did some research into what the file provisioner is and what it does but I cannot find an answer to get me past this error.
It may very well be important that this vagrant file will work correctly on Windows 10, but I need to get it working on Ubuntu.
In your Vagrantfile, check that the filenames are capitalized correctly. Windows isn't case-sensitive, but Ubuntu is, so a path that resolved fine on Windows can fail to exist on Linux if the case doesn't match exactly.
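A quick local illustration of why this matters on Ubuntu (the file name is arbitrary; on a case-insensitive filesystem like Windows' default, the second lookup would succeed):

```shell
cd "$(mktemp -d)"                          # scratch directory
touch AppDeploy.sh                         # the name as it exists on disk
[ -e AppDeploy.sh ] && echo "exact case: found"
[ -e appdeploy.sh ] || echo "lower case: not found on a case-sensitive filesystem"
```

So if the Vagrantfile says appDeploy.sh but the file on disk is AppDeploy.sh (or vice versa), the file provisioner will report that the source file does not exist.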

Receiving a no such file or directory error: gcloud compute copy-files onto instance from local machine

gcloud compute copy-files /Users/myusername/Pictures/IMG_0382.JPG myusername@my-instance:/var/www --zone asia-east1-c
/Users/myusername/Pictures/IMG_0382.JPG: No such file or directory
ERROR: (gcloud.compute.copy-files) [/usr/bin/scp] exited with return code [1].
I am copy-pasting the file path into my local terminal. What is the deal? Do I have to modify the .bash file or something?
So, it turns out I was not in the right terminal. Instead of ssh-ing into your instance and then issuing the command, you have to open up a fresh terminal window, cd to the appropriate directory (the folder with the desired upload file), and then issue the gcloud compute copy-files LOCAL... yada yada. If you are already connected to your instance, that's fine; just open up a new terminal window.
Does this command succeed or fail:
ls -ld /Users/myusername /Users/myusername/Pictures/IMG_0382.JPG
The copy and paste most likely did not do what you expected.
The GCP documentation can be found here.
To make it work I had to:
include sudo for permissions
delete the ~ specified in the documentation
add the zone of my VM
Code:
sudo gcloud compute scp [LOCAL_FILE_PATH] [INSTANCE_NAME]:/ --zone [INSTANCE ZONE]
Example
sudo gcloud compute scp /Users/jb/Downloads/cudnn-8.0-linux-x64-v7.tar instance-1:/home/jb --zone us-west
