How to create a .audit file to upload a custom Amazon AWS audit for Audit Cloud Infrastructure - Nessus

I am new to Nessus Audit Cloud Infrastructure. I have an infrastructure on the AWS cloud with Unix-based machines.
Audit Cloud Infrastructure requires an audit file; without one, the following error appears:
Error (400): At least one audit must be added to this policy in the
'Compliance' section.
I searched for a sample .audit file that I could import into my policy, but I haven't found anything.
According to the slides below, there is a tool called i2a, but I haven't been able to get it, and I am not able to reach the support portal (maybe because I am using the trial version for now):
https://www.slideshare.net/jderienzo/create-a-nessus-audit-file-30230893
If anyone has a source of .audit files, please share it, especially a Unix one. Or at least provide a link/video so I can create my own .audit file.
I tried the link below, but it didn't work for me:
https://www.tenable.com/blog/version-2-of-windows-compliance-checks-available-for-testing
Any help is appreciated in advance!
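For anyone else stuck here: a Unix .audit file is a plain-text file in Tenable's compliance-check syntax. The sketch below is only an illustrative minimal example, not an official policy; the checked file, permissions, and command are assumptions chosen for demonstration:

```
<check_type:"Unix">

<custom_item>
 type        : FILE_CHECK
 description : "Verify ownership and permissions of /etc/passwd"
 file        : "/etc/passwd"
 owner       : "root"
 group       : "root"
 mode        : "644"
</custom_item>

<custom_item>
 type        : CMD_EXEC
 description : "Verify the machine reports a 64-bit kernel"
 cmd         : "/bin/uname -m"
 expect      : "x86_64"
</custom_item>

</check_type>
```

Save this with a .audit extension and upload it in the policy's Compliance section; Tenable's compliance-check reference documents the full list of check types.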

Related

How do I create a snapshot of a VM in Oracle Cloud (OCI)?

I recently set up an Ubuntu Linux instance in Oracle Cloud. I'm used to taking snapshots of VMs so I can roll back if something goes wrong later. In DigitalOcean (droplets) this is as simple as pressing a button, but in Oracle Cloud I can't seem to find this functionality. I've read the documentation and Googled, but to no avail. It also seems Oracle has several different cloud offerings with similar names, which makes it hard to find relevant information.
Go to the boot volume and create a full backup. This will let you restore your server to the last known-good configuration.
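If you prefer the CLI over the console, the same full backup can be taken with the OCI CLI. This is only a sketch assuming the CLI is already configured; the OCIDs below are placeholders:

```shell
# Create a FULL backup of the instance's boot volume
# (the boot-volume OCID is a placeholder)
oci bv boot-volume-backup create \
  --boot-volume-id ocid1.bootvolume.oc1..exampleuniqueID \
  --display-name "pre-change-backup" \
  --type FULL

# List backups in the compartment to confirm it was created
oci bv boot-volume-backup list \
  --compartment-id ocid1.compartment.oc1..exampleuniqueID
```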

The Google Cloud SDK is installed on my EC2 instance and I can access gcloud, but bq is not available even though I see it in the list of components

I installed the google-cloud-sdk on my Matillion instance hosted on EC2. I am able to run the gcloud command in the SSH session and also by using a Bash component in Matillion.
However, I am not able to run bq commands, even though I can see that bq was installed as part of the Cloud SDK. I was able to configure my account and everything, but it doesn't work.
Can someone help me with this?
As per the documentation, it is necessary to activate the BigQuery API in order to use the bq command-line tool.
These are the steps you need to follow:
In the Cloud Console, on the project selector page, select or create a Cloud project.
Install and initialize the Cloud SDK.
BigQuery is automatically enabled in new projects. To activate BigQuery in a preexisting project, go to Enable the BigQuery API.
I was getting the same error as you, and activating the API was the solution.
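The "Enable the BigQuery API" step can also be done from the same SSH session where gcloud already works. A sketch, with a placeholder project ID:

```shell
# Enable the BigQuery API for the project (project ID is a placeholder)
gcloud services enable bigquery.googleapis.com --project=my-project-id

# Verify: this should now list datasets instead of failing
bq ls --project_id=my-project-id
```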

Do I need to ask Google Cloud support to enable the exchange of custom routes in VPC peering?

I have created two Google Cloud projects, one for Cloud SQL and one for a Kubernetes cluster. To access SQL from project one, I have enabled the import/export of custom routes. Is this enough, or do I need confirmation from Google Cloud support? I have read somewhere that after these steps you have to ask support to enable the exchange of custom routes for the speckle-umbrella VPC network associated with your instance, which is automatically created when the Cloud SQL instance is created.
As far as I know, this step is not included in the public documentation and is not necessary if you are connecting from within the same project or from any service project (if configured with Shared VPC); in those cases you don't need to export those routes. This is generally required only if you are connecting from on-prem.
If you are having any issues, please let us know.
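For completeness, enabling custom-route exchange on a peering in your own project is done with gcloud and needs no support ticket (the support-ticket advice refers to the Google-managed speckle-umbrella side). Peering and network names below are placeholders:

```shell
# Enable custom-route exchange on an existing VPC peering
# (peering and network names are placeholders)
gcloud compute networks peerings update my-peering \
  --network=my-vpc \
  --export-custom-routes \
  --import-custom-routes
```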

Service account configuration in a template file in OSB 11g

I need to configure service accounts for connecting to some of the services, and for that we are required to configure the details in a template file.
So basically, I want to configure the service account at run time.
We are using Oracle Service Bus 11g.
Since I've never worked with service accounts before, any suggestions would be helpful.
I found that credentials can be looked up at run time with the fn-bea:lookupBasicCredentials XQuery function, but this is not what we want; we want them generated dynamically through the template files.
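For reference, the run-time lookup mentioned above looks roughly like the sketch below in an OSB XQuery expression. The service-account resource path is a placeholder, and the exact shape of the returned element should be verified against the OSB 11g documentation:

```
(: Look up the username/password stored in a static service account resource.
   The resource path is a placeholder. :)
let $creds := fn-bea:lookupBasicCredentials('MyProject/ServiceAccounts/BackendSA')
return $creds
```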

Pros/Cons of Parse Dashboard local installation vs deployment

I wasn't able to find solid information on this and I wanted to ask developers who use Parse Dashboard:
What are the pros/cons of Parse Dashboard local installation vs deployment?
I currently run the Parse Dashboard locally, but I know that deploying to Heroku is also an option (my app is deployed on Heroku). I wanted to gather some information before deciding whether to deploy.
Thank you!
I also have it running locally, and I think for security reasons it's best to do so. If you set up the dashboard on the same server on which Parse is running, then you will have to take security measures to protect access to the dashboard and the config file, which includes your master key and all that. This definitely outweighs the argument for deploying it, which in my opinion is only that it makes the dashboard easier to access.
If you really want to set up a dashboard on a server, at least do it on a separate server.
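Whichever option you choose, access control lives in parse-dashboard-config.json. The sketch below (app name, keys, URL, and credentials are all placeholders) shows the "users" section that matters most in the deployed case, since it puts basic auth in front of the dashboard:

```json
{
  "apps": [
    {
      "serverURL": "https://my-parse-server.example.com/parse",
      "appId": "myAppId",
      "masterKey": "myMasterKey",
      "appName": "MyApp"
    }
  ],
  "users": [
    { "user": "admin", "pass": "a-strong-password" }
  ]
}
```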
