At work I have to connect to our server every day. After getting annoyed with using the GUI Connect to Server dialog each time, I wrote a quick script (using mount) that does the same thing.
When I use Connect to Server, however, a link to the mounted server appears in the side panel of the File Manager, which I use all the time. How do I add this link from a terminal/shell script?
(Or even better, where can I find the code for the Connect to Server program?)
Thanks in advance.
You want to use gvfs-mount rather than mount.
See the discussion here: http://www.g-loaded.eu/2008/12/08/access-gvfs-mounts-from-the-command-line/
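For example (smb:// is just a guess at your protocol; sftp:// or ftp:// work the same way), a script could do something like:
gvfs-mount smb://yourserver/yourshare     # mounts the share; it appears in the file manager's side panel, just like Connect to Server
gvfs-mount -l                             # list the current gvfs mounts
gvfs-mount -u smb://yourserver/yourshare  # unmount it again when you're done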
We have a standard config (tunnel.conf) for WireGuard that we want to push to clients (via JAMF Pro).
We do not want the end user to have to open the WireGuard UI to import the config; we want to do this via scripting.
Given that I can place the tunnel.conf file anywhere on the end user's system, where do I have to place it, and what command do I need to run to import it?
And conversely, how can I delete a tunnel config from WireGuard via scripting?
So, as it turns out, WireGuard has a unique key pair per tunnel, which means each user has their own keys.
Managing that via JAMF sounds like a nightmare, and it will be easier to point users at their accounts in the VPN to pull down their own config than to manage it for them. Documentation and handholding time!
But it does seem to be possible to manage applying a profile via automation. The kind support people at my VPN provider pointed me to this article on the JAMF community board:
https://community.jamf.com/t5/jamf-pro/wireguard-configuration-file-distribution/m-p/264747
There's a related page on the wireguard-apple repository:
https://github.com/WireGuard/wireguard-apple/blob/master/MOBILECONFIG.md
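For reference, if the command-line wireguard-tools happen to be on the Macs (say via Homebrew; that is an assumption on my part, since the App Store app does not expose a CLI), a pushed config could in principle be activated and removed from a script roughly like this:
sudo wg-quick up /usr/local/etc/wireguard/tunnel.conf    # bring the tunnel up from the pushed config file (path is a placeholder)
sudo wg-quick down /usr/local/etc/wireguard/tunnel.conf  # tear the tunnel down again
sudo rm /usr/local/etc/wireguard/tunnel.conf             # delete the config to "remove" it
For the App Store app itself, the MOBILECONFIG.md page above looks like the supported route.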
If we do end up trying to manage the users' configs, I'll update here.
I have a Linux server where I start a few Ruby programs during the day. The server is directly connected to the internet (no firewall) at a hosting provider, and I wonder if there is a way to start the MySQL server just before I update the database and stop it again afterwards. The goal is to have the MySQL server running only when it is needed. So I thought there might be a way to activate the port or the service directly from Ruby.
Thank you for answering,
Werner
You'd probably have to change the permissions on the database through Ruby, then do whatever you want to do, and change the permissions back.
You could do that using the mysql gem, connecting to the database and running the commands.
Then restart the process and do the same thing in reverse.
Honestly, I don't know why you would do that, and I wouldn't recommend it to anyone. But that would be my approach.
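That said, if the goal is literally to have mysqld running only while the update runs, a simpler sketch is to wrap the Ruby program in a small shell script instead of doing it from inside Ruby (this assumes a systemd-based server, sudo rights for the script, and a service unit called mysql; all names are placeholders):
sudo systemctl start mysql   # the unit may be called mysqld or mariadb on your distribution
ruby update_db.rb            # your update program (placeholder name)
sudo systemctl stop mysql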
I'm experimenting with a reverse TCP shell. I managed to establish a connection, but my question is: how do I maintain the connection even after I close the multi/handler? And when I'm in the target's command prompt, how do I send files to the target's computer from there?
Pedro,
The short answer is you can't.
In order to maintain a connection you need to install persistence on the victim machine. You will still have to reuse the multi/handler in order to receive a new connection.
In order to transfer files you need to use the meterpreter payload, which can upload and download files.
However, if you have PowerShell on your target machine you can run a PowerShell download that will fetch internet-hosted resources for you.
Hope this helped.
After following this simple tutorial http://www.louisaslett.com/RStudio_AMI/ and video guide http://www.louisaslett.com/RStudio_AMI/video_guide.html, I have set up an RStudio environment on EC2.
The only problem is, I can't upload large files (> 1GB).
I can upload small files just fine.
When I try to upload a file via RStudio, it gives me the following error:
Unexpected empty response from server
Does anyone know how I can upload these large files for use in RStudio? This is the whole reason I am using EC2 in the first place (to work with big data).
OK, so I had the same problem myself and it was incredibly frustrating, but eventually I realised what was going on. The default home directory size for AWS is less than 8-10 GB regardless of the size of your instance. Since RStudio was trying to upload into the home directory, there was not enough room. An experienced Linux user would not have fallen into this trap, but hopefully other Windows users new to this who come across the problem will see this.
If you upload to a different volume on the instance, the problem goes away. Because the Louis Aslett RStudio AMI lives in this 8-10 GB space, you will have to set your working directory outside it, i.e. outside the home directory, which is not intuitively apparent from the RStudio Server interface. While this is an advanced forum and this is a rookie error, I am hoping no one deletes this question, as I spent months on this and I think someone else will too. I hope this makes sense.
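As a concrete illustration of the workaround (the /data mount point is an assumption for a larger volume you have attached to the instance; adjust to your setup), check where the free space actually is and give yourself a writable folder outside the small home volume:
df -h                                 # shows that the root/home volume is often only 8-10 GB
sudo mkdir -p /data/rstudio-work      # /data is a placeholder for your larger volume
sudo chown "$USER" /data/rstudio-work
Setting RStudio's working directory to that folder (e.g. setwd("/data/rstudio-work")) then keeps uploads and data off the home volume.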
Don't you have shell access to your Amazon server? Don't rely on RStudio's upload (which may reasonably have a 2 GB limit) and use proper Unix tools:
rsync -avz myHugeFile.dat amazonusername@my.amazon.host.ip:
Run on your local PC's command line (install Cygwin or another Unix-compatibility layer if you are on Windows), this will transfer your huge file to your Amazon server, compressing the data in transit, and if the transfer is interrupted it can simply be restarted.
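For a file this size it can also be worth adding a couple of standard rsync options so that an interrupted run genuinely picks up where it left off:
rsync -avz --partial --progress myHugeFile.dat amazonusername@my.amazon.host.ip:
# --partial keeps a partially transferred file so the next run reuses it rather than starting over
# --progress shows how far along the (long) transfer is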
For a Windows GUI for something like this, WinSCP is what we used in the bad old days before Linux.
This could have something to do with your web server. Are you using nginx or Apache as your web server? If you are running nginx in front of the application, you can raise its upload size limit; I would recommend the following fix in your nginx.conf file:
http {
    ...
    client_max_body_size 100M;
}
https://www.tecmint.com/limit-file-upload-size-in-nginx/
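Whichever limit you settle on, the change only takes effect once nginx re-reads its configuration; assuming a systemd-managed nginx and sudo rights, that is roughly:
sudo nginx -t                  # check the edited configuration for syntax errors first
sudo systemctl reload nginx    # reload the configuration without dropping open connections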
I had a similar problem with a 5 GB file. What worked for me was to use SQLite to create a database from the csv file that I needed: use SQLite to create the database, then use a function in RStudio to communicate with the local database. That way I was able to bring in the csv file. I can track down the R code that I used if you like.
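Roughly, the SQLite side can be done from the command line like this (the database, table, and file names are placeholders), after which a package such as RSQLite lets RStudio query the table without reading the whole csv into memory:
# bulk-load the csv into a table; if the table does not exist yet, the first row becomes the column names
sqlite3 bigdata.db <<'SQL'
.mode csv
.import myHugeFile.csv mytable
SQL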
I'm trying to keep a file updated in real time with the server. It's more like real-time syncing with a very small delay. Is there any application that lets me do this? Or would you suggest using a local host as a server?
I don't know how you are connected to your server, but I assume it is something like SCP/SFTP/FTP, and I don't know your OS. WinSCP will do exactly what you need: you can set it to watch your filesystem (a specified folder) and it will update the files on the server as soon as a file on your drive changes.
It also supports command-line features, so you can use it from within your own applications.
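As a rough sketch of that command-line side (the host, credentials, and folder paths are placeholders, and you should check WinSCP's scripting documentation for the exact options, such as supplying the host key), its keepuptodate scripting command does the watch-and-upload part:
winscp.com /command "open sftp://user@example.com/" "keepuptodate c:\work\project /home/user/project" "exit"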