Output domain COMPUTERNAME and logged in USERNAME spreadsheet - windows

So the boss asked for a spreadsheet that would show him all the computers in our enterprise and the users that are currently logged in.
I stopped asking "why?" and started with NBTSTAT, but the <03> results were inconsistent. Then I tried NET CONFIG WORKSTATION, and finally PSLOGGEDON.EXE (Sysinternals). These were good, but I'd have to find a way to feed them the results of NET VIEW and format the output nicely into a CSV.
But then I thought there must be a better way. 90% of our PCs are WinXP, so I could use WMIC or maybe DSQuery. I'd rather restrict the command execution to the workstations in our AD Computers container and not touch our servers.
Does anyone have any recommendations?

You could do an LDAP search to Active Directory (LDIFDE)
ldifde -m -f OUTPUT.LDF -b USERNAME DOMAINNAME * -s SERVERNAME -d "cn=users,DC=DOMAINNAME,DC=Microsoft,DC=Com" -r "(objectClass=user)"
Search for users first, then computers, by changing the objectClass filter.
You can also specify just the attributes you want at the end.
eg: ldifde -f e:\Exportuser.ldf -s DomainController -d "ou=user,ou=sitename,dc=uk" -p subtree -r "(objectClass=User)" -l givenname,sn,samaccountname,mail,dn
Or use a .NET script.

PsExec.exe + PsLoggedOn.exe + batch file.
Step 1. Download PsExec.exe and PsLoggedOn.exe.
Step 2. Put PsExec.exe and PsLoggedOn.exe in a folder together.
Step 3. Pick a network share that every machine can write to.
Step 4. Create a text file called pc_list.txt with the results of NET VIEW pasted in it (minus the leading "\\", so one computer name per line). Basically it should be a text file like the following:
computer1
computer2
computer3
computer4
10.1.1.1
10.2.1.3
10.3.1.4
etc.
You can use IP addresses or computer names.
Step 5. Create a Batch file called PsExec.bat with the following code:
"location_of_PsExec.exe"\PsExec.exe @"location_of_text_file"\pc_list.txt -u domain\Admin_username -p password -n 60 -c -f "location_of_runPsLoggedOn.bat"\runPsLoggedOn.bat
Step 6. Create another batch file called runPsLoggedOn.bat with the following code:
@echo off
echo.%computername%>>"location_of_network_share"\userlist.txt
PsLoggedOn.exe -x>>"location_of_network_share"\userlist.txt
echo.>>"location_of_network_share"\userlist.txt
Step 7. Run the PsExec.bat
This is really probably not going to be the EXACT answer you're looking for, but this should at least get you pointed in the right direction...
I didn't test this out, but I use something similar to get other things done using PsExec.exe.
You would want to adjust the code so that only the domain\username gets written to the file on your network share (from the PsLoggedOn.exe -x>>"location_of_network_share"\userlist.txt line), because the output looks like this:
Users logged on locally:
domain\user
No one is logged on via resource shares.
As far as formatting goes, you might also want to find a way to capture that username in a variable so that you could do the following:
echo.%computername% %variable_username_from_PsLoggedOn%>>"location_of_network_share"\userlist.txt
Anyhow, this should get you started on your way to getting this solved...

How about using NetScan from SoftPerfect?

This seems to give me what I want:
FOR /F "skip=3 delims=\ " %%A IN ('net view') DO WMIC /Node:"%%A" ComputerSystem Get UserName /Format:csv 2>nul | MORE /E +2 >> computer_user_list.csv
Note that there is a tab character followed by a space after delims=\
Maybe someone else could use it.

Related

Multiple site FTP downloads, multiple variables in Bash script - function, loop or other?

I have tried searching but can't find exactly what I'm after and maybe I don't even know exactly what to search for...
I need to FTP a variety of csv files from multiple sites each with different credentials.
I am able to do this one by one with the following; however, I need to do this for 30 sites and do not want to copy-paste all of this.
What would be the best way to write this? If you can show me how, or point me to an answer, that would be great.
And for bonus points (I might have to ask a separate question), mget is not working Linux to Linux, only Linux to Windows. I have also tried curl, but no luck there either.
Thanks a lot.
p.s. not sure if it makes a difference, but I will be running this as a cron job every 15 minutes. I'm ok with that part ;)
#!/bin/bash
# chmod +x ftp.sh  (run once from the shell; it does nothing useful inside the script itself)
#Windows site global variables
ROOT='/data'
PASSWD='passwd'
# Site 1
SITE='site1'
HOST='10.10.10.10'
USER='sitename1'
ftp -in $HOST <<EOF
user $USER $PASSWD
binary
cd "${ROOT}/${SITE}/"
lcd "/home/Downloads"
mget "${SITE}}.csv1" "${SITE}}.csv2" #needs second "}" as part of file name
quit
EOF
echo "Site 1 FTP complete"
# Site 2
SITE='site2'
HOST='20.20.20.20'
USER='sitename2'
ftp -in $HOST <<EOF
user $USER $PASSWD
binary
cd "${ROOT}/${SITE}/"
lcd "/home/instrum/Downloads"
mget "${SITE}}.csv1" "${SITE}}.csv2" #needs second "}" as part of file name
quit
EOF
echo "Site 2 FTP complete"
#Linux site Global variables
ROOT='/home/path'
USER='user'
PASSWD='passwd2'
#Site 3
SITE='site_3'
HOST='30.30.30.30'
ftp -in $HOST << EOF
user $USER $PASSWD
binary
cd "${ROOT}/${SITE}/"
lcd "/home/Downloads"
get "${SITE}file1.csv" #mget not working for linux to linux FTP, don't know why.
get "${SITE}file2.csv"
quit
EOF
echo "Site 3 FTP complete"
#Site 4
SITE='site_4'
HOST='40.40.40.40'
ftp -in $HOST << EOF
user $USER $PASSWD
binary
cd "${ROOT}/${SITE}/"
lcd "/home/Downloads"
get "${SITE}file1.csv" #mget not working for linux to linux FTP, don't know why.
get "${SITE}file2.csv"
quit
EOF
echo "Site 4 FTP complete"
For the credentials, put them into a separate file, with the site-1 variables named site1, host1, user1, and so on, plus comments, so that a different user running this script can understand it quickly, and so there is less chance of someone mangling the passwords in the main file and creating an error. When your main script starts, it can source the file with the passwords before running the rest.
In your main script, if the functionality is similar on all sites and you are always going to run the same code for all 30 sites, you can use a loop running from 1 to 30. In the loop body, build the variable names site, host and user with the number appended, so the code executes with the right values each time.
If these servers are on your network, there are also dedicated copying tools worth a look, for example rsync, which is efficient as well.
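A minimal sketch of that numbered-variable loop, with the hosts, users and passwords above used as placeholders. The actual ftp session is left commented out, since it needs live servers; the echo line stands in for it:

```shell
#!/bin/bash
# sites.conf (sourced in real use with:  . /path/to/sites.conf)
# would contain one numbered set of variables per site, like these:
HOST1='10.10.10.10'; USER1='sitename1'; PASSWD1='passwd'; SITE1='site1'
HOST2='20.20.20.20'; USER2='sitename2'; PASSWD2='passwd'; SITE2='site2'

for i in 1 2; do                     # 1..30 in the real script
    # eval builds HOST1, HOST2, ... from the loop counter
    eval "host=\$HOST$i; user=\$USER$i; passwd=\$PASSWD$i; site=\$SITE$i"
    echo "Site $i: would fetch ${site}.csv1 and ${site}.csv2 from $host as $user"
    # Real transfer (untested sketch, paths as in the question):
    # ftp -in "$host" <<EOF
    # user $user $passwd
    # binary
    # cd "/data/$site/"
    # lcd "/home/Downloads"
    # mget "${site}}.csv1" "${site}}.csv2"
    # quit
    # EOF
done
```

This keeps the per-site boilerplate in one place; adding site 31 is one extra line in sites.conf.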

Telling my backup script in windows to "mount everything that's supposed to be mounted."

I have a backup script that works. I won't put in the entire script, because I don't think it's necessary, but it looks like this:
set backup_dir=R:\__MYBACKUPS
REM *** <<< Right here, I want to have a command that says "Mount everything you know how to mount"
if not exist %backup_dir% (
echo The backup directory does not exist.
echo Making the backup directory %backup_dir%
mkdir %backup_dir%
)
call :get_datetime_stamp
set BACKUPNAME=%backup_dir%\bak%DATEANDTIME%.zip
set default_to_backup="c:\Users\%username%\documents\*.doc*" "c:\Users\%username%\documents\*.csv" "c:\Users\%username%\documents\*.xls*" "c:\Users\%username%\documents\*.pdf" ^
"c:\Users\%username%\documents\*.ppt*" "c:\Users\%username%\documents\*.txt"
set what_to_backup=%default_to_backup% "c:\Users\%username%\etc"
set zip_program="c:\Program Files\winzip\winzip64.exe"
set zip_parameters=-min -a -r
echo EXECUTING THE FOLLOWING BACKUP COMMAND:
echo %zip_program% %zip_parameters% %BACKUPNAME% %what_to_backup%
%zip_program% %zip_parameters% %BACKUPNAME% %what_to_backup%
It works fine if I mount the R drive. The way I normally do this: when I log in, I go to an Explorer window and just click on the R drive, and that action magically "connects" the R shared network drive. HOWEVER, what I would prefer is to have the script mount the drive. I don't want to go in and click all the network drives to connect them. I don't want to specify all the connection garbage. I just want to tell Windows, "Mount everything that you already know how to mount."
This is all information that our computer services people have already figured out and configured on all our machines.

Run a Bash Script automatically upon login

I wrote a script that writes the date and the username of the person who logs in to a log file, to keep a record of who has logged in. How can I set this script to execute automatically when a user logs in, rather than having to run it manually in the terminal? NOTE: USERNAME is the current user that is logged in.
my code:
#!/bin/bash
printf "$(date) $HOSTNAME booted!\n" >> /home/USERNAME/boot.log
A more elegant way to solve this problem is to read from log files that are already being written and cannot be changed by the user. No one could say it better than Bjørne Malmanger in his answer:
I wouldn't trust the user to GIVE you the information. As root you
TAKE it ;-)
A nice way to do this is the last command, which is great because it neatly displays all logins: Graphical, console and SSH.
last
A less elegant but still secure way is to do a grep on /var/log/auth.log. On my Gnome/Ubuntu system I can use this to track graphical logins:
grep "session opened for user USERNAME" /var/log/auth.log
The right pattern for your machine needs to be found for each login type: graphical, console and SSH. This is cumbersome, but you might need to do it if you need information that goes further back than last reaches.
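To illustrate what such a pattern extraction might look like, here is a self-contained sketch. A temporary sample log stands in for /var/log/auth.log, and the line format is a Debian/Ubuntu-style assumption; you would need to adjust the pattern for your own distro and login types:

```shell
#!/bin/sh
# Extract usernames from "session opened" lines in an auth-style log.
# mktemp + a pasted sample keep the example self-contained.
log=$(mktemp)
cat > "$log" <<'EOF'
Jan  1 09:00:01 host gdm[123]: pam_unix(gdm-password:session): session opened for user alice by (uid=0)
Jan  1 10:30:12 host sshd[456]: pam_unix(sshd:session): session opened for user bob by (uid=0)
EOF
# -o prints only the matching part; awk keeps just the username field
users=$(grep -o 'session opened for user [^ ]*' "$log" | awk '{print $5}')
echo "$users"
rm -f "$log"
```

On a real system you would point the grep at /var/log/auth.log (or your distro's equivalent) instead of the sample file.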
To directly answer your question:
You can modify the script like this to get the username
#!/bin/bash
printf "$(date) $HOSTNAME booted!\n" >> /home/$(whoami)/boot.log
And add this line to /etc/profile
. /path/to/script.sh
This is not secure, though, because the user will be able to edit his own log.
Why don't you use the last command?
I wouldn't trust the user to GIVE you the information. As root you TAKE it ;-)
Put it in ~/.bash_profile. It will be run each time they log in.
More information is available at the women's rights page (i.e. man bash).
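For a self-contained sketch of what that profile one-liner boils down to, here mktemp stands in for the real log path (a real deployment would use a fixed, ideally root-owned, location):

```shell
#!/bin/sh
# One login-record line, as a .bash_profile / /etc/profile snippet would write it.
logfile=$(mktemp)   # stand-in for the real log path
printf '%s %s: login by %s\n' "$(date)" "$(hostname)" "$(whoami)" >> "$logfile"
line=$(tail -n 1 "$logfile")
echo "$line"
rm -f "$logfile"    # cleanup; a real log would of course be kept
```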

batch file doesn't work on Windows 7

I am trying to execute a simple batch file with the following contents:
@ECHO OFF
::CMD will no longer show us what command it’s executing(cleaner)
ECHO As a network admin, I’m getting tired of having to type these commands in!
:: Print some text
IPCONFIG /ALL
:: Outputs tons of network information into the command prompt
PAUSE
:: Lets the user read the important network information
PING www.google.com
:: Ping google to figure out if we’ve got internet!
ECHO All done pinging Google.
::Print some text
PAUSE
But nothing happens except a flash of the command prompt. PAUSE does not seem to have any effect. Please help.
I believe you need to have administrator privileges to run ipconfig.

FTP inside a shell script not working

My host upgraded my version of FreeBSD and now one of my scripts is broken. The script simply uploads a data feed to google for their merchant service.
The script (that was working prior to the upgrade):
ftp ftp://myusername:mypassword@uploads.google.com/<<END_SCRIPT
ascii
put /usr/www/users/myname/feeds/mymerchantfile.txt mymerchantfile.txt
exit
END_SCRIPT
Now the script says "unknown host". The same script works on OSX.
I've tried removing the "ftp://". - No effect
I can log in from the command line if I enter the username and password manually.
I've search around for other solutions and have also tried the following:
HOST='uploads.google.com'
USER='myusername'
PASSWD='mypassword'
ftp -dni <<END_SCRIPT
open $HOST
quote USER $USER
quote PASS $PASSWD
ascii
put /usr/www/users/myname/feeds/mymerchantfile.txt mymerchantfile.txt
END_SCRIPT
And
HOST='uploads.google.com'
USER='myusername'
PASSWD='mypassword'
ftp -dni <<END_SCRIPT
open $HOST
user $USER $PASSWD
ascii
put /usr/www/users/myname/feeds/mymerchantfile.txt mymerchantfile.txt
END_SCRIPT
Nothing I can find online seems to be doing the trick. Does anyone have any other ideas? I don't want to use a .netrc file since it is executed by cron under a different user.
ftp(1) shows that there is a simple -u command line switch to upload a file; and since ascii is the default (shudder), maybe you can replace your whole script with one command line:
ftp -u ftp://username:password@uploads.google.com/mymerchantfile.txt \
/usr/www/users/myname/feeds/mymerchantfile.txt
(Long line wrapped with a backslash; feel free to remove it and put everything on one line.)
ftp $HOSTNAME <<EOFEOF
$USER
$PASS
ascii
put $LOCALFILE $REMOTETEMPFILE
rename $REMOTETEMPFILE $REMOTEFINALFILE
EOFEOF
Please note that the above code can be easily broken by, for example, using spaces in the variables in question. Also, this method gives you virtually no way to detect and handle failure reliably.
Look into the expect tool if you haven't already. You may find that it solves problems you didn't know you had.
Some ideas:
Just a thought: since this is executed in a subshell, which should inherit correctly from the parent, does env show any difference when run from within the script compared to from the shell?
Do you use a correct "shebang"?
Any proxy that requires authentication?
Can you ping the host?
In BSD, you can create a netrc file that ftp can use for logging in. You can specify the netrc file in your ftp command with the -N parameter; otherwise, the default netrc is used (which is $HOME/.netrc).
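For illustration, a minimal netrc file for the host in this thread might look like the following (the file path in the usage line is an assumption; check ftp(1) on your system for the exact -N semantics):

```
machine uploads.google.com
login myusername
password mypassword
```

The cron job could then run something like ftp -N /home/cronuser/google.netrc uploads.google.com, which keeps the credentials out of the script and works even when the job runs as a user whose $HOME/.netrc you don't control.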
Can you check if there's a difference in the environment between your shell login and the cron job? From your login, run env, and look out for ftp_proxy and http_proxy.
Next, include a line in the cron-job that will dump the environment, e.g. env >/tmp/your.env.
Maybe there's some difference... Also, did you double-check your correct usage of the -n switch?