AWS Simple Email Service - Need code to use boto3 with IAM Profile - Windows

I want to access Amazon SES (Simple Email Service) using the boto3 library and a user profile. The code I want to execute is listed below, but it does not work: it fails with the error "botocore.exceptions.NoCredentialsError: Unable to locate credentials". I am currently able to access S3 with no issue, so I know my profile is set up correctly, but I cannot find code that accesses the SES service using a profile. I need some pointers on how to get the code to work.
SES code that does NOT work:
import boto3
session = boto3.session.Session(profile_name='myprofile')
client = boto3.client('ses','us-east-1')
response = client.list_verified_email_addresses()
S3 code that currently works with a profile:
import boto3
session = boto3.session.Session(profile_name='myprofile')
s3 = session.resource('s3')
Reference: http://boto3.readthedocs.org/en/latest/reference/services/ses.html

In your SES code you are not using the session that you created with your profile, whereas in the S3 code you do. When you call boto3.client() directly, it knows nothing about your profile. Create the client from the session instead:
session = boto3.Session(profile_name='myprofile')
client = session.client('ses', region_name='us-east-1')
response = client.list_verified_email_addresses()

Related

errors when trying to tweet using tweepy

I am working on an academic research project and I am trying to send out tweets using the Twitter API. The error I am receiving repeatedly is:
Forbidden: 403 Forbidden
Your client app is not configured with the appropriate oauth1 app permissions for this endpoint.
import tweepy
#from tweepy import OAuthHandler
ACCESS_KEY = 'xxx'
ACCESS_SECRET = 'xxx'
CONSUMER_KEY = 'xxx'
CONSUMER_SECRET = 'xxx'
api = tweepy.Client(bearer_token='xxx',
                    access_token=ACCESS_KEY,
                    access_token_secret=ACCESS_SECRET,
                    consumer_key=CONSUMER_KEY,
                    consumer_secret=CONSUMER_SECRET)
api.create_tweet(text='I want to Tweet')
Here is my code. The authentication raises no errors; only the attempt at tweeting does.
You can fix the problem by activating Read / Write in the OAuth section of your application settings, and then regenerating the "Access Token and Secret".
You can check that they were recreated properly when you see:
Created with Read and Write permissions
EDIT as of 10 February 2023: you are now required to ask for Elevated access if you want read + write permission; as of today you only have read access from the v2 API endpoints.
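For reference, a minimal sketch of the same call once the app has Read / Write permission and the keys have been regenerated; the credential values are placeholders, and the bearer token is not needed for posting in the user context:
import tweepy

# Placeholder values; substitute the keys regenerated after enabling Read / Write.
CONSUMER_KEY = 'xxx'
CONSUMER_SECRET = 'xxx'
ACCESS_KEY = 'xxx'
ACCESS_SECRET = 'xxx'

client = tweepy.Client(consumer_key=CONSUMER_KEY,
                       consumer_secret=CONSUMER_SECRET,
                       access_token=ACCESS_KEY,
                       access_token_secret=ACCESS_SECRET)
client.create_tweet(text='I want to Tweet')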

Can I limit the available scopes on a Google API service account?

I have done the following:
Created a project in Google API console
Enabled the Google Drive API in the project
Created a service account
Shared a Google Drive folder with the service account
Connected successfully to Google Drive and retrieved the list of folders and files shared with the service account.
When you create an OAuth client ID, you can limit it to predefined scopes. As far as I can tell, the service account has access to any Google Drive scope. I wanted to tighten that down to the following scope: https://www.googleapis.com/auth/drive.readonly, just as reassurance that there's no way the Google Drive app I'm making unintentionally adds/edits/deletes any files.
I know I can add the account to different roles. However, I looked through the list multiple times and none of them are related to Google Drive. I attempted to make my own role, but the available permissions on that screen do not reference Google Drive either. It's possible I missed something or there's another place I could look. Any suggestions?
To limit the scope of a service account, you have to specify the scopes yourself when you request its credentials.
Service accounts are special Google accounts that can be used by applications to access Google APIs programmatically via OAuth 2.0. A service account uses an OAuth 2.0 flow that does not require human authorization. Instead, it uses a key file that only your application can access.
For example, in Python you can specify the scope of a service account by creating a list of scopes and passing it as a parameter when getting the credentials.
Files and folders - searching for all files with the image/jpeg MIME type:
import os

from apiclient import discovery
from google.oauth2 import service_account

scopes = ["https://www.googleapis.com/auth/drive.readonly"]
secret_file = os.path.join(os.getcwd(), 'client_secret.json')

credentials = service_account.Credentials.from_service_account_file(secret_file, scopes=scopes)
service = discovery.build('drive', 'v3', credentials=credentials)

page_token = None
while True:
    response = service.files().list(q="mimeType='image/jpeg'",
                                    spaces='drive',
                                    fields='nextPageToken, files(id, name)',
                                    pageToken=page_token).execute()
    for file in response.get('files', []):
        # Process change
        print('Found file: %s' % (file.get('name')))
    page_token = response.get('nextPageToken', None)
    if page_token is None:
        break
Output:
Found file: cute-puppy.jpg
Creating a folder with the read-only scope:
import os

from apiclient import discovery
from google.oauth2 import service_account

scopes = ["https://www.googleapis.com/auth/drive.readonly"]
secret_file = os.path.join(os.getcwd(), 'client_secret.json')

credentials = service_account.Credentials.from_service_account_file(secret_file, scopes=scopes)
service = discovery.build('drive', 'v3', credentials=credentials)

file_metadata = {
    'name': 'Invoices',
    'mimeType': 'application/vnd.google-apps.folder'
}
file = service.files().create(body=file_metadata,
                              fields='id').execute()
Error message:
<HttpError 403 when requesting https://www.googleapis.com/drive/v3/files?fields=id&alt=json returned "Insufficient Permission: Request had insufficient authentication scopes.". Details: "Insufficient Permission: Request had insufficient authentication scopes.">
References:
Google Auth Python
OAuth Scopes

Unable to get data from firestore when deployed spring boot app on app engine

I have deployed a Spring Boot service on App Engine. It shows that the connection to Firebase is successful, but it is unable to fetch any results. Everything works fine when I run it locally. This is how the app makes the connection to Firestore; I am setting serviceAccounts.json in the GOOGLE_APPLICATION_CREDENTIALS environment variable:
FirestoreOptions firestoreOptions =
    FirestoreOptions.getDefaultInstance().toBuilder()
        .setProjectId(PROJECT_ID)
        .setCredentials(GoogleCredentials.getApplicationDefault())
        .build();
In the service bean, the Firestore instance is initialized as:
public static Firestore dbFireStore = FirestoreOptions.getDefaultInstance().getService();
I am fetching data like this:
CollectionReference collectionReference = dbFireStore.collection("Collection_Name")
        .document("Document_Name")
        .collection("SubCollection_Name");
ApiFuture<QuerySnapshot> querySnapshot = collectionReference.whereEqualTo("Search_Field", "Search_Value").get();
List<QueryDocumentSnapshot> queryDocumentSnapshot = querySnapshot.get().getDocuments();
The list returned here is empty, but the expected size was 1, and it works fine locally.
From your code, it looks like you're initializing Firestore with default credentials. To clarify: App Engine automatically provides service account credentials for calling Google Cloud APIs.
However, if the GOOGLE_APPLICATION_CREDENTIALS environment variable is set, ADC (Application Default Credentials) uses the service account file that the variable points to, and permission problems will arise if App Engine can't find that service account credential file. You should check your App Engine logs in the Google Cloud Console.
What I suggest for your App Engine build is that you remove that environment variable and try again.

How can I authenticate against ADXProxy using app key authentication?

I am trying to access an Azure Application Insights resource via Redash, using the (preview) ADXProxy feature.
I've created an App Registration in Azure, and I've got some proof-of-concept Python code which can successfully access my Application Insights resource and execute a Kusto query (traces | take 1) using an application token:
import azure.kusto
import azure.kusto.data.request
import msal

cluster = 'https://ade.applicationinsights.io/subscriptions/<MY_SUBSCRIPTION>/resourcegroups/<MY_RESOURCE_GROUP>/providers/microsoft.insights/components/<MY_APP_INSIGHTS_RESOURCE>'
app_id = '<MY_APP_ID>'
app_key = '<MY_SECRET>'
authority_id = '<MY_AAD_SUBSCRIPTION_ID>'

def run():
    app = msal.ConfidentialClientApplication(
        client_id=app_id,
        client_credential=app_key,
        authority='https://login.microsoftonline.com/<MY_AAD_SUBSCRIPTION_ID>')
    token = app.acquire_token_for_client(['https://help.kusto.windows.net/.default'])
    kcsb = azure.kusto.data.request.KustoConnectionStringBuilder.with_aad_application_token_authentication(
        connection_string=cluster,
        application_token=token['access_token']
    )
    client = azure.kusto.data.request.KustoClient(kcsb)
    result = client.execute('<MY_APP_INSIGHTS_RESOURCE>', 'traces | take 1')
    for res in result.primary_results:
        print(res)
    return 1

if __name__ == "__main__":
    run()
However, Redash doesn't support application token authentication: it uses application key authentication, making a call like:
kcsb = azure.kusto.data.request.KustoConnectionStringBuilder.with_aad_application_key_authentication(
    connection_string=cluster,
    aad_app_id=app_id,
    app_key=app_key,
    authority_id='<MY_AAD_SUBSCRIPTION_ID>'
)
I can't successfully connect to my App Insights resource using this type of flow. If I substitute this KustoConnectionStringBuilder into my program above, I get an exception telling me:
The resource principal named https://ade.applicationinsights.io was not found in the tenant named <MY_AAD_SUBSCRIPTION_ID>. This can happen if the application has not been installed by the administrator of the tenant or consented to by any user in the tenant. You might have sent your authentication request to the wrong tenant.
Is there something I can do in code or Azure Portal configuration to connect my 'tenant' to the ade.applicationinsights.io resource principal and get this connection working?
ADXProxy supports only tokens minted by Azure Active Directory (AAD). The token must be created for an Azure Data Explorer (ADX) cluster that you own. If you don't have your own ADX cluster, and for whatever reason you want to access your Application Insights resources via ADXProxy, you can always authenticate against 'https://help.kusto.windows.net' and use that token.
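As a rough sketch, assuming you do own an ADX cluster, the key-authentication flow Redash uses could then look like this (the cluster URI, database and table names below are hypothetical placeholders; the builder call mirrors the one shown above):
import azure.kusto.data.request

cluster = 'https://<MY_CLUSTER>.<MY_REGION>.kusto.windows.net'  # hypothetical ADX cluster that you own
app_id = '<MY_APP_ID>'
app_key = '<MY_SECRET>'

# With key authentication, the SDK acquires the AAD token for the cluster itself.
kcsb = azure.kusto.data.request.KustoConnectionStringBuilder.with_aad_application_key_authentication(
    connection_string=cluster,
    aad_app_id=app_id,
    app_key=app_key,
    authority_id='<MY_AAD_TENANT_ID>'
)
client = azure.kusto.data.request.KustoClient(kcsb)
result = client.execute('<MY_DATABASE>', '<MY_TABLE> | take 1')
for res in result.primary_results:
    print(res)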

Using the S3 client from the AWS SDK for Go without credentials

I am using the Go AWS SDK to access an object in an S3 bucket. I instantiate s3.S3, then call
s3client.GetObject(...)
The object I am accessing is publicly accessible, so I do not wish to provide any credentials. However, if I do not provide any credentials then I get the following error:
NoCredentialProviders: no valid providers in chain. Deprecated.\n\tFor verbose messaging see aws.Config.CredentialsChainVerboseErrors
Use anonymous credentials when creating the session:
sess := session.Must(session.NewSession(&aws.Config{
    Credentials: credentials.AnonymousCredentials,
    // ... other configuration
}))
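An S3 client created from this session (for example with s3.New(sess)) should then be able to GetObject a publicly readable object without any credentials configured.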
