How to export user lists and passwords on a Synology NAS

I would like to know whether there is a method to export user lists and passwords from a Synology NAS device.

Local
See user Greenstream's answer in the Synology Forum:
Download config backup file from the Synology
Change file extension from .cfg to .gzip
Unzip the file using 7-Zip or another utility that can extract from gzip archives
Download and install DB Browser for SQLite from http://sqlitebrowser.org/
Open the extracted ‘_Syno_ConfBkp.db’ file in DB Browser for SQLite
From the top menu bar select File, then Export, then Export as CSV
In the export dialog select the table confbkp_user_tb
In the options:
a. select ‘Column names in first line’; field separator character: ,
b. quote character: "
c. new-line characters: ‘Windows: CR+LF (\r\n)’
Save the file to your desktop and open in Excel
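If you would rather script that export than click through DB Browser, here is a minimal sketch using Python's built-in sqlite3 and csv modules, assuming the extracted _Syno_ConfBkp.db and the confbkp_user_tb table are as described above (the output file name is my own choice):
#!/usr/bin/python
import csv
import sqlite3

# Open the database extracted from the config backup
con = sqlite3.connect('_Syno_ConfBkp.db')
cur = con.execute('SELECT * FROM confbkp_user_tb')

with open('synology_users.csv', 'w', newline='') as f:
    writer = csv.writer(f, quotechar='"')
    # Column names in first line, comma separator, " as quote character
    writer.writerow([col[0] for col in cur.description])
    writer.writerows(cur)

con.close()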
LDAP
Based on ldap2csv.py and "How to retrieve all the attributes of an LDAP database" (to determine which attributes are available), using python-ldap:
#!/usr/bin/python
import ldap
host = 'ldap://[ip]:389' # [ip]: The ip/name of the NAS, using the default port
dn = 'uid=[uid],cn=[cn],dc=[dc]' # LDAP Server Settings: Authentication Information / Bind DN
pw = '[password]' # LDAP Server Settings: Password
base_dn = 'dc=[dc]' # LDAP Server Settings: Authentication Information / Base DN
search_filter = '(uid=*)' # Get all users (renamed so it does not shadow the built-in filter)
attrs = ['cn', 'uid', 'uidNumber', 'gidNumber', 'homeDirectory', 'userPassword', 'loginShell', 'gecos', 'description']
con = ldap.initialize(host)
con.simple_bind_s(dn, pw)
res = con.search_s(base_dn, ldap.SCOPE_SUBTREE, search_filter, attrs)
con.unbind()
print(res)
The ports Synology services use are documented in Synology's list of network ports.
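To turn that result into a CSV like the one from the config-backup route, a short continuation of the script above (note that python-ldap returns each attribute value as a list of byte strings):
import csv

with open('ldap_users.csv', 'w', newline='') as f:
    writer = csv.writer(f)
    writer.writerow(attrs)  # header row: the attributes requested above
    for entry_dn, entry in res:
        # take the first value of each attribute, or leave the cell empty
        writer.writerow([entry.get(a, [b''])[0].decode('utf-8', 'replace') for a in attrs])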

Related

How to list Azure Databricks workspaces along with properties like workspaceId?

My objective is to create a CSV file that lists all Azure Databricks workspaces and, in particular, includes the workspace ID.
I have been able to retrieve all details as json using the CLI:
az rest -m get --header "Accept=application/json" -u 'https://management.azure.com/subscriptions/{subscriptionId}/providers/Microsoft.Databricks/workspaces?api-version=2018-04-01' > workspaces.json
How can I retrieve the same information using Azure Resource Graph?
If you prefer to work with the workspace list API, which returns JSON, here is one approach for post-processing the data (in my case I ran this from a Jupyter notebook):
import json
import pandas as pd
pd.set_option('display.max_columns', None)
pd.set_option('display.max_colwidth', None)
# json from https://learn.microsoft.com/en-us/rest/api/databricks/workspaces/list-by-subscription?tabs=HTTP&tryIt=true&source=docs#code-try-0
# E.g.
# az rest -m get --header "Accept=application/json" -u 'https://management.azure.com/subscriptions/{subscriptionId}/providers/Microsoft.Databricks/workspaces?api-version=2018-04-01' > workspaces.json
pdf = pd.read_json('./workspaces.json')
# flatten the nested json
pdf_flat = pd.json_normalize(json.loads(pdf.to_json(orient="records")))
# drop columns with name '*.type'
pdf_flat.drop(pdf_flat.columns[pdf_flat.columns.str.endswith('.type')], axis=1, inplace=True)
# drop rows without a workspaceId
pdf_flat = pdf_flat[ ~pdf_flat['value.properties.workspaceId'].isna() ]
# drop unwanted columns
pdf_flat.drop(columns=[
'value.properties.parameters.enableFedRampCertification.value',
'value.properties.parameters.enableNoPublicIp.value',
'value.properties.parameters.natGatewayName.value',
'value.properties.parameters.prepareEncryption.value',
'value.properties.parameters.publicIpName.value',
'value.properties.parameters.relayNamespaceName.value',
'value.properties.parameters.requireInfrastructureEncryption.value',
'value.properties.parameters.resourceTags.value.databricks-environment',
'value.properties.parameters.storageAccountName.value',
'value.properties.parameters.storageAccountSkuName.value',
'value.properties.parameters.vnetAddressPrefix.value',
], inplace=True)
pdf_flat
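Since the objective is a CSV file, the flattened frame can then be written out directly:
pdf_flat.to_csv('workspaces.csv', index=False)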
I was able to retrieve the information I needed by searching for Databricks resources in the Azure portal. From there I could click Open Query to use the Azure Resource Graph Explorer and write a query to extract the information I needed. I ended up using the following query:
// Run query to see results.
resources
| where type == "microsoft.databricks/workspaces"
| project id, properties.workspaceId, name, tenantId, type, resourceGroup, location, subscriptionId, kind, tags
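If you want to run the same query outside the portal, one option (my own sketch, not part of the original answer) is the azure-mgmt-resourcegraph Python package together with azure-identity:
from azure.identity import DefaultAzureCredential
from azure.mgmt.resourcegraph import ResourceGraphClient
from azure.mgmt.resourcegraph.models import QueryRequest

client = ResourceGraphClient(DefaultAzureCredential())
request = QueryRequest(
    subscriptions=['{subscriptionId}'],  # same placeholder as the az rest call above
    query='resources | where type == "microsoft.databricks/workspaces" '
          '| project id, workspaceId = properties.workspaceId, name, location')
result = client.resources(request)
for row in result.data:  # with the objectArray result format, data is a list of dicts
    print(row)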

Transferring a Google bucket file to the end user without saving the file locally

Right now, when a client downloads a file from my site, I'm:
Downloading the file from a Google Cloud bucket to my server (GCP download file, GCP streaming download)
Saving the downloaded file to a Ruby Tempfile
Sending the Tempfile to the end user using Rails 5 send_file
I would like to skip step 2 and somehow transfer/stream the file from Google Cloud to the end user without it being saved on my server. Is that possible?
Note: the Google bucket is private.
Code I'm currently using:
# 1 getting file from gcp:
storage = Google::Cloud::Storage.new
bucket = storage.bucket bucket_name, skip_lookup: true
gcp_file = bucket.file file_name
# 2a creates tempfile
temp_file = Tempfile.new('name')
temp_file_path = temp_file.path
# 2b populate tempfile with gcp file content:
gcp_file.download temp_file_path
# 3 sending tempfile to user
send_file(temp_file, type: file_mime_type, filename: 'filename.png')
What I would like:
# 1 getting file from gcp:
storage = Google::Cloud::Storage.new
bucket = storage.bucket bucket_name, skip_lookup: true
gcp_file = bucket.file file_name
# 3 sending/streaming file from google cloud to client:
send_file(gcp_file.download, type: file_mime_type, filename: 'filename.png')
Since making your objects or your bucket publicly readable is not an option for your project, the best option I can suggest is signed URLs. That way you keep control over your objects and bucket while still giving users sufficient permission to perform specific actions, such as downloading objects from your GCS bucket.
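For illustration, here is a minimal sketch of the signed-URL idea using the Python client library, google-cloud-storage (the Ruby gem you are already using exposes a corresponding signed_url method on file objects); the bucket and object names are placeholders:
from datetime import timedelta
from google.cloud import storage

client = storage.Client()
blob = client.bucket('bucket_name').blob('file_name')

# A time-limited URL the end user can download from directly, so the
# file's bytes never pass through your server
url = blob.generate_signed_url(version='v4', expiration=timedelta(minutes=15), method='GET')
print(url)
In a Rails controller you would then redirect_to that URL instead of calling send_file.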

How to use a SAS URL at directory level in ADLS Gen2 to get the contents of a folder using Python

I have a SAS URL at the directory level and want to use it to read the contents of the directory instead of using a connection string.
Follow the syntax below:
Create the mount
dbutils.fs.mount(
  source = "wasbs://<container_name>@<storage_account_name>.blob.core.windows.net/",
  mount_point = "/mnt/t123",
  extra_configs = {"fs.azure.sas.<container_name>.<storage_account_name>.blob.core.windows.net": "Your_SAS_token"})
Read a CSV file
file_location = "wasbs://<container_name>@<storage_account_name>.blob.core.windows.net/filename.csv"
df = spark.read.format("csv").option("inferSchema", "true").option("header", "true").option("delimiter", ",").load(file_location)
display(df)
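If the mount point already exists or you want to clean up afterwards, dbutils can also list and remove mounts (standard dbutils.fs calls; the mount point matches the example above):
display(dbutils.fs.mounts())  # list existing mount points
dbutils.fs.unmount("/mnt/t123")  # detach the mount when finished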
Reference:
Reading and Writing data in Azure Data Lake Storage Gen 2 with Azure Databricks by Ryan Kennedy
Mount ADLS Gen2 storage account provided by Microsoft

Sending an email from R using the sendmailR package

I am trying to send an email from R, using the sendmailR package. The code below works fine when I run it on my PC, and I receive the email. However, when I run it on my MacBook Pro, it fails with the following error:
library(sendmailR)
from <- sprintf("<sendmailR@%s>", Sys.info()[4])
to <- "<myemail@gmail.com>"
subject <- "TEST"
body <- "TEST"  # message body, defined here so the snippet is self-contained
sendmail(from, to, subject, body,
         control = list(smtpServer = "ASPMX.L.GOOGLE.COM"))
Error in socketConnection(host = server, port = port, blocking = TRUE) :
cannot open the connection
In addition: Warning message:
In socketConnection(host = server, port = port, blocking = TRUE) :
ASPMX.L.GOOGLE.COM:25 cannot be opened
Any ideas as to why this would work on a PC, but not a Mac? I turned the firewall off on both machines.
Are you able to send email via the command-line?
So, first of all, fire up a Terminal and then
$ echo "Test 123" | mail -s "Test" user@domain.com
Look into /var/log/mail.log, or better use
$ tail -f /var/log/mail.log
in a different window while you send your email. If you see something like
... setting up TLS connection to smtp.gmail.com[xxx.xx.xxx.xxx]:587
... Trusted TLS connection established to smtp.gmail.com[xxx.xx.xxx.xxx]:587:\
TLSv1 with cipher RC4-MD5 (128/128 bits)
then you succeeded. Otherwise, it means you have to configure your mailing system. I have been using postfix with Gmail for two years now, and I have never had a problem with it. Basically, you need to grab the Equifax certificate, Equifax_Secure_CA.pem, from here: http://www.geotrust.com/resources/root-certificates/. (They were using Thawte certificates before, but they changed last year.) Then, assuming you use Gmail:
Create relay_password in /etc/postfix and put a single line like this (with your correct login and password):
smtp.gmail.com login@gmail.com:password
then in a Terminal,
$ sudo postmap /etc/postfix/relay_password
to update Postfix lookup table.
Add the certificates in /etc/postfix/certs, or any folder you like, then
$ sudo c_rehash /etc/postfix/certs/
(i.e., rehash the certificates with Openssl).
Edit /etc/postfix/main.cf so that it includes the following lines (adjust the paths if needed):
relayhost = smtp.gmail.com:587
smtp_sasl_auth_enable = yes
smtp_sasl_password_maps = hash:/etc/postfix/relay_password
smtp_sasl_security_options = noanonymous
smtp_tls_security_level = may
smtp_tls_CApath = /etc/postfix/certs
smtp_tls_session_cache_database = btree:/etc/postfix/smtp_scache
smtp_tls_session_cache_timeout = 3600s
smtp_tls_loglevel = 1
tls_random_source = dev:/dev/urandom
Finally, just reload the Postfix process, with e.g.
$ sudo postfix reload
(a combination of start/stop works too).
You can choose a different port for the SMTP, e.g. 465.
It’s still possible to use SASL without TLS (the above steps are basically the same), but in both cases the main problem is that your login information sits in a plain-text file... Also, should you want to use your MobileMe account, just replace the Gmail SMTP server with smtp.me.com.
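As a side note (my own suggestion, not part of the original postfix setup): if you only need to confirm that the Mac can reach Gmail's submission port and authenticate, you can bypass the local MTA entirely with a minimal Python smtplib sketch; the credentials are placeholders:
import smtplib
from email.message import EmailMessage

msg = EmailMessage()
msg['From'] = 'login@gmail.com'
msg['To'] = 'myemail@gmail.com'
msg['Subject'] = 'TEST'
msg.set_content('Test 123')

with smtplib.SMTP('smtp.gmail.com', 587, timeout=30) as smtp:
    smtp.starttls()  # the same TLS upgrade postfix negotiates above
    smtp.login('login@gmail.com', 'password')
    smtp.send_message(msg)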

How to fetch a binary file from a remote embedded system using telnet?

I have a remote embedded system and it is telnet-able. How can I fetch a binary file from it using Ruby? If it were a text file, I could have used:
con = Net::Telnet::new("Host"=>ip,"Timeout"=>200) #Host not host
File.open("fetched_file","w+") do |f|
con.cmd("cat /ect/file") {|data| f.write(data)}
end
But this won't work for a binary file: you won't get usable data by cat-ing it.
Establish your telnet connection, then
send the command:
uuencode filename -
to the remote host, replacing filename with the name of the file you want, and
take the data you are sent and pass it to uudecode on your system (a decoding sketch follows below).
If the device has uuencode installed, you could use that to 'wrap' the binary into printable characters. Another possibility is to run dd if=/etc/file 2>/dev/null to dump the data (however, I am not completely certain this will work any better...).
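To illustrate the decoding half of the uuencode approach, here is a minimal sketch (in Python, using only the standard binascii module) that assumes you have already captured the remote uuencode output, with prompts stripped, into a text file:
import binascii

def uudecode_capture(text_path, out_path):
    """Decode a captured 'uuencode filename -' dump back into binary."""
    with open(text_path) as src, open(out_path, 'wb') as dst:
        in_body = False
        for line in src:
            if line.startswith('begin '):  # header: 'begin <mode> <name>'
                in_body = True
                continue
            if line.strip() == 'end':      # trailer marks the end of the data
                break
            if in_body and line.strip():
                dst.write(binascii.a2b_uu(line))

uudecode_capture('capture.txt', 'fetched_file')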
