How to fix the "Create role error" on Cloud9 when running "ask deploy" (Alexa Skill Kit)? - cloud9-ide

I want to use an Amazon Cloud9 workspace to develop an Alexa skill using the Alexa Skills Kit (ASK).
I followed this guide https://developer.amazon.com/docs/smapi/set-up-credentials-for-an-amazon-web-services-account.html to create an IAM user.
The user's credentials were added to the AWS CLI with "aws configure".
Now, while trying to deploy my skill:
ask deploy
I do receive the following error:
Skill Id: amzn1.ask.skill.xxx
Skill deployment finished.
Model deployment finished.
Create role error.
InvalidClientTokenId: The security token included in the request is invalid

In AWS IAM, under Users, you can see which permissions have been granted to a particular user. For Cloud9, I have the AWSCloud9Administrator policy attached, and it seems to be working fine for me.
https://console.aws.amazon.com/iam/home?region=us-east-1#/users/
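The "InvalidClientTokenId" failure usually means the ASK CLI picked up missing or stale keys from the shared credentials file. A minimal sketch for sanity-checking that file (the profile name and inline file contents below are placeholder assumptions, not real credentials):

```python
# Sketch: verify the profile the ASK CLI will use has both key entries.
# The string below is an inline stand-in for ~/.aws/credentials.
import configparser

example = """
[default]
aws_access_key_id = AKIAEXAMPLEKEY
aws_secret_access_key = exampleSecretKey123
"""

config = configparser.ConfigParser()
config.read_string(example)

profile = "default"  # assumption: ask-cli resolves the default profile
assert profile in config, "profile missing - rerun `aws configure`"
assert config[profile].get("aws_access_key_id"), "access key id missing"
assert config[profile].get("aws_secret_access_key"), "secret key missing"
print("credentials file looks complete for profile:", profile)
```

If both entries are present but the error persists, the key pair itself is likely deactivated or mistyped; running `aws sts get-caller-identity` reproduces the same InvalidClientTokenId failure outside of `ask deploy`.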

Related

Azure SAML2 login system issue

Short description:
I'm using a Laravel application that already has a system for logging in with a Microsoft account. That system works, but this is the first time I'm working on it, and I cannot get users to sign in with their Microsoft account locally. Since the login system in the application works and I get an error when logging in, the issue must be in my configuration in the Azure portal.
My configuration is as follows:
I have created a tenant and registered an app in it. My SAML config is:
Entity ID: https://login.microsoftonline.com/tenant-id/saml2
Reply URL (Assertion Consumer Service URL): https://sts.windows.net/tenant-id/
In my .env i have set following values:
AZURE_AD_CALLBACK_URL=/login/microsoft/callback
AZURE_AD_CLIENT_ID=id-of-the-application-in-tenant
AZURE_AD_CLIENT_SECRET=tenant-secret-key
SAML2_AZURE_SAML_ENABLED=true
SAML2_AZURE_IDP_SSO_URL="https://login.microsoftonline.com/tenant-id/saml2"
SAML2_AZURE_IDP_ENTITYID="https://sts.windows.net/tenant-id/"
SAML2_AZURE_IDP_x509="tenant-id"
SAML2_AZURE_SP_ENTITYID="https://some-app.com/"
I get the following error after entering my credentials:
AADSTS700016: Application with identifier 'https://someapp/' was not found in the directory 'tenant-id'. This can happen if the application has not been installed by the administrator of the tenant or consented to by any user in the tenant. You may have sent your authentication request to the wrong tenant.
I have added the user I use to test the login to the application, so this error is totally confusing to me.
I don't know if I've provided all the necessary info; if anything is missing I will provide it.
I hope someone knows what is wrong with the configuration.
The tenant ID is a GUID. Have you used the GUID, or are you using the literal "tenant-id" string?
Also, the ACS is an endpoint in your application, not an Azure URL.
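Following that answer, a corrected setup might look like the sketch below (the tenant GUID, app URL, and ACS path are placeholders; the exact ACS route depends on the Laravel SAML2 package in use). Note that AADSTS700016 is an identifier mismatch: the SP entity ID the app sends must match the Identifier registered in Azure byte for byte, trailing slash included.

```
# Azure portal (app registration, Single sign-on settings):
#   Identifier (Entity ID): https://some-app.com/                 <- your SP entity ID, not an Azure URL
#   Reply URL (ACS):        https://some-app.com/saml2/azure/acs  <- an endpoint in YOUR application

# .env
SAML2_AZURE_SAML_ENABLED=true
SAML2_AZURE_IDP_SSO_URL="https://login.microsoftonline.com/<tenant-guid>/saml2"
SAML2_AZURE_IDP_ENTITYID="https://sts.windows.net/<tenant-guid>/"
SAML2_AZURE_IDP_x509="<base64 signing certificate from Azure, not the tenant id>"
SAML2_AZURE_SP_ENTITYID="https://some-app.com/"
```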

streaming data from DynamoDB to elasticsearch is failing with "no permissions for [indices:data/write/bulk]"

I'm trying to stream data from DynamoDB to Elasticsearch. I've checked the AWS documentation plus some other sources online, but I'm stuck on a security issue. I'm using a Lambda function; retrieving data from DynamoDB works fine, but when I try to write back to Elasticsearch I get this error:
"no permissions for [indices:data/write/bulk] and User [name=arn:aws:iam::account number:role/dynamodb_to_es, backend_roles=[arn:aws:iam::account number:role/dynamodb_to_es], requestedTenant=null]"
I realized it would always fail on the "_bulk" endpoint. Thank you!!
My Lambda function:
https://github.com/YassineRjl/Lambda-Func---DynamoDB-to-ElasticSearch/blob/master/lambda_func.py
My IAM role:
You don't have to disable "fine-grained access control". Instead, you can edit the role mapping.
For detailed information and steps, see
https://aws.amazon.com/tr/premiumsupport/knowledge-center/es-troubleshoot-cloudwatch-logs/
under the heading "I'm unable to stream my CloudWatch log group to an Amazon ES domain when fine-grained access control is enabled."
If you're finding this from Google like me, this worked for my use case:
Open your Kibana dashboard (https://your-domain-somerandomstring.us-east-1.es.amazonaws.com/_plugin/kibana/app/opendistro_security#/roles/view/all_access/mapuser)
Navigate to Security in the left nav
Select Roles
Select the role you'd like to attach your user to (in the example above, it's "all_access")
Add your username from the error log, or use the ARN for your user (mine was related to the instance profile associated with the instance I was connecting from)
I found the solution. During creation of the ES instance, make sure to unselect "fine-grained access control" and avoid a VPC for the sake of HTTPS access; then, for the roles, create a role in IAM and copy-paste its ARN into the ES dashboard during the instance setup.
You should map your user to the kibana_user role, which defines the basic permissions for accessing an index.
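To make the role-mapping advice concrete: with fine-grained access control enabled, the Lambda's execution role ARN has to be listed as a backend role of an ES role that may write to the index. A sketch of the request body for the Open Distro security API (the domain endpoint, role name, and account ID are placeholders; the request must be sent as the domain's master user):

```python
import json

# Execution role ARN taken from the error message (placeholder account id)
lambda_role_arn = "arn:aws:iam::123456789012:role/dynamodb_to_es"

# Body for:
#   PUT https://<your-domain>.us-east-1.es.amazonaws.com/_opendistro/_security/api/rolesmapping/all_access
role_mapping = {
    "backend_roles": [lambda_role_arn],  # IAM roles map as backend roles
    "hosts": [],
    "users": [],
}

print(json.dumps(role_mapping, indent=2))
```

Mapping to a role narrower than all_access (one limited to the target index) is preferable once the write path works.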

Google API import natural language dataset import not working

The node command suggested by the tutorial is not working and throws a 403:
node automlNaturalLanguageDataset.js import-data
I had some issues passing parameters, so I hard-coded the project ID, compute region, etc. I was able to run create-dataset and list-datasets successfully, just not import-data. The error I get is:
Error: 3 INVALID_ARGUMENT: Error encountered when accessing gs://<my project id>/csv/happiness.csv, error code 403, error details custom-vision@appspot.gserviceaccount.com does not have storage.objects.get access to <my project id>/csv/happiness.csv.
I invoked gcloud projects add-iam-policy-binding as in the document with my own service account name, but it looks like it is pulling the example service account name used in the tutorial. I checked the JS code as well as my environment and could not find this account name. Any idea what I am missing?
Tutorial I am following is at https://cloud.google.com/natural-language/automl/docs/tutorial.
Aside from your own service account, you also have to allow the AutoML Natural Language service account to access your Google Cloud project resources. As instructed in item #9 of the tutorial, you can run the following command:
gcloud projects add-iam-policy-binding project-id --member="serviceAccount:custom-vision@appspot.gserviceaccount.com" --role="roles/storage.admin"

Google app engine remote python console credentials/login?

Hi, I am trying to start a remote GAE shell with
python $GAE_SDK_ROOT/remote_api_shell.py -s your_app_id.appspot.com
"You don't need any additional authentication," says the GAE Remote API page,
yet my command fails with HTTP Error 401: Unauthorized Too many auth attempts.
I think I was able to start it (with varying degrees of success for different apps) at some point in the past, either with Gmail credentials or some auth key from Google Cloud.
Please share your hints or, ideally, drop a link to an easy-to-follow step-by-step guide.
Also, I cannot access Datastore Admin for that project in the online console; if I click it, it invites me to sign in, which fails. I recently got the owner role, but the project was created by a person with a different email domain.

Add user to Google API

In console.developers.google.com, I am trying to grant permissions to a new user with the email xxxx@gmail.com, while my current console.developers.google.com account is a Google Apps account; let's call it yyyy@gapp.net.
This is the error I got while granting xxxx@gmail.com admin/can edit/can read:
The non-domain account: xxxx@gmail.com can not be added to a domain project.
What should I do? Is there any setting I can tweak to solve this?
I am doing this because I want to transfer my app from yyyy@gapp.net to xxxx@gmail.com.
It's not possible to share an app outside the domain in this way. You'll need to create a new project in the gmail.com account and switch to using it.
