I am able to get all the EC2 instance details in CSV format using boto3, but I am not able to write all the EC2 instance details to an xlsx file.
Can anyone suggest how to export all the EC2 details in the account?
The question is unclear. What script did you use to produce the CSV output?
If you used create_data_set, you could follow the boto3 documentation here.
If your script already works for CSV files, or you already have the CSV files, then you can just use xlsxwriter or openpyxl as described in this solution.
And honestly, once you have the CSV files, you can simply open them in spreadsheet software such as Microsoft Excel and use the 'Save As' operation to produce an xlsx file.
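If you would rather do the conversion in Python, a minimal sketch with openpyxl could look like the following; the file names are assumptions, so point them at the CSV your boto3 script already writes:

import csv
from openpyxl import Workbook  # pip install openpyxl

wb = Workbook()
ws = wb.active
ws.title = "EC2 instances"

# Assumption: ec2_instances.csv is the file your boto3 script produces.
with open("ec2_instances.csv", newline="") as f:
    for row in csv.reader(f):
        ws.append(row)  # each CSV row becomes a worksheet row

wb.save("ec2_instances.xlsx")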
I have my CSV data stored on a server that can be accessed via the internet. There is no authorization needed to access that file. At the moment, I use scp and some cron jobs to copy that file to my Memgraph server. Then I import the data using LOAD CSV.
Is there a way to do something like LOAD CSV FROM "https://my-server.com/files/state.csv" NO HEADER AS row?
At the moment, it is not possible to import remote files into Memgraph using LOAD CSV. There is an open issue on the GitHub repo, so the feature should be available at some point.
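Until that lands, the workaround is to fetch the file onto the Memgraph host first and point LOAD CSV at the local copy. A minimal sketch in Python, which a cron job could run instead of the scp step (the URL is from the question; the destination path is an assumption):

import urllib.request

REMOTE_CSV = "https://my-server.com/files/state.csv"
LOCAL_PATH = "/var/lib/memgraph/state.csv"  # assumption: any path Memgraph can read

# Download (and overwrite) the local copy that LOAD CSV will read.
urllib.request.urlretrieve(REMOTE_CSV, LOCAL_PATH)
# Then, inside Memgraph:
#   LOAD CSV FROM "/var/lib/memgraph/state.csv" NO HEADER AS row ...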
First, I apologize if this is not a good place to ask this question.
So I will specify the problem:
Firm A is sending me an HTTP link to an Excel document with price changes from the competition, which I have to save to a specific folder so our software can import that document into our database. Basically, what we get is our current price and the prices of 5-8 of our competitors, which are stored in the database and can be seen for each product.
My question: is it possible to automate saving a file from an HTTP link to a specific folder a few times a day, so our software can load it that many times? Also, since the name of the file is always the same, it needs to be overwritten each time the document is saved.
The software is able to import Excel files and update the database fields properly. Using an API is not possible in any way.
Everything needs to be done on Windows 10; as for what tools to use, I'm open to suggestions.
I thought about FTP access, but since it is outside our local network, I'm not qualified to be absolutely sure that the connection would be safe.
Thanks in advance.
So what I did is use PowerShell:
Invoke-WebRequest -Uri "HTTP LINK" -OutFile "WhereToPlaceit\nameofthe.file"
And run it from Task Scheduler; more info here: https://community.spiceworks.com/how_to/17736-run-powershell-scripts-from-task-scheduler
Place it on the server, and the command runs every hour, overwriting the existing file.
Do I really need a Google Cloud box to run DLP conversion on a local file before uploading it to a Google Cloud BigQuery box?
I just want to convert a CSV file into a format where the sensitive data is protected.
Currently the DLP product is an API in the cloud.
You can use gcloud similar to the example below, though this will only do simple redaction based on infoType findings, not record transforms (applying a transform to a whole column). For the full set of features, you would stream the data in as a table to Content.deidentify.
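A minimal sketch of that table-based approach with the google-cloud-dlp Python client is below. The project ID, the column to protect, and the character-mask transform are assumptions; check the Content.deidentify reference for the config your data actually needs.

import csv
from google.cloud import dlp_v2  # pip install google-cloud-dlp

PROJECT = "my-project-id"      # assumption
SENSITIVE_COLUMN = "email"     # assumption: the column to de-identify

client = dlp_v2.DlpServiceClient()

# Read the local CSV into the table shape the DLP API expects.
with open("input.csv", newline="") as f:
    reader = csv.reader(f)
    headers = next(reader)
    rows = list(reader)

table = {
    "headers": [{"name": h} for h in headers],
    "rows": [{"values": [{"string_value": v} for v in row]} for row in rows],
}

# Record transform: mask every character of the chosen column.
deidentify_config = {
    "record_transformations": {
        "field_transformations": [
            {
                "fields": [{"name": SENSITIVE_COLUMN}],
                "primitive_transformation": {
                    "character_mask_config": {"masking_character": "#"}
                },
            }
        ]
    }
}

response = client.deidentify_content(
    request={
        "parent": f"projects/{PROJECT}",
        "deidentify_config": deidentify_config,
        "item": {"table": table},
    }
)
print(response.item.table)  # de-identified table, ready to load into BigQuery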
I would like to upload a huge file (a 50 GB .csv) to an RStudio set-up on Amazon EC2 in order to do some statistical calculations.
I only have a little experience with Unix/Linux. Is there a way to upload the file directly from within RStudio?
Thanks!
First, make your file in S3 public: select the file, then choose Make public from the Actions drop-down. Right-click the file for its properties and copy the link (http://s3-us-west....), then read the CSV file as below:
a <- read.csv("link", header=TRUE)
I have some n number of files in a server directory. Is there a way to write a bash script that will check whether the data in a JSON file is getting updated when I access it through the portal, and, if I make any changes manually, make sure those changes also end up in the JSON file? Is there a way to do this?
Yes, you can manipulate JSON with jq and a bash script around it.
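If Python is easier for you than jq, a minimal sketch of the "has this JSON file changed since the last check" part could look like this; the file paths are assumptions:

import hashlib
import json
from pathlib import Path

DATA_FILE = Path("/srv/portal/data.json")   # assumption: the JSON file the portal updates
STATE_FILE = Path("/tmp/data.json.sha256")  # where the last-seen hash is stored

# Compare the file's current hash with the one recorded on the previous run.
current = hashlib.sha256(DATA_FILE.read_bytes()).hexdigest()
previous = STATE_FILE.read_text().strip() if STATE_FILE.exists() else ""

if current != previous:
    print("JSON file was updated since the last check")
    data = json.loads(DATA_FILE.read_text())  # inspect or modify fields here
    STATE_FILE.write_text(current)
else:
    print("No change detected")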