SSL on AWS S3, Deployed on Heroku

I have a Django app deployed on Heroku with AWS S3 handling images. The S3 bucket name contains a "-" and no ".".
After using SSL for my domain (and having done the necessary configuration on Heroku), I run into issues uploading images to AWS S3 via the SSL domain.
Chrome's error: ERR_CONNECTION_RESET
Firefox error: The page you are trying to view cannot be shown because the authenticity of the received data could not be verified.
What is the problem? Thanks much.
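There isn't enough detail here to pin down the cause, but for reference, a typical django-storages configuration for serving S3 media over HTTPS looks like the sketch below. This is a sketch only, assuming the django-storages S3 backend; the bucket name and domain are placeholders, not the asker's values.

```python
# settings.py -- sketch assuming the django-storages S3 backend.
# Bucket name and keys below are placeholders, not real values.
DEFAULT_FILE_STORAGE = 'storages.backends.s3boto.S3BotoStorage'

AWS_ACCESS_KEY_ID = '...'        # keep real keys out of source control
AWS_SECRET_ACCESS_KEY = '...'
AWS_STORAGE_BUCKET_NAME = 'my-bucket'  # a dash is fine; a dot in the name
                                       # breaks the *.s3.amazonaws.com
                                       # wildcard certificate

# Emit media URLs over HTTPS so an SSL page never mixes content.
AWS_S3_SECURE_URLS = True
MEDIA_URL = 'https://my-bucket.s3.amazonaws.com/'
```

Since the bucket name here has no dot, the wildcard-certificate problem shouldn't apply, so the next thing to check is whether the upload URLs themselves are generated with the https scheme.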

Related

aws elastic beanstalk / S3 The page was loaded over HTTPS, but requested an insecure XMLHttpRequest endpoint

I have a Spring Boot application deployed with AWS Elastic Beanstalk, and I'm using an S3 bucket for my Angular app.
I generated a certificate using AWS Certificate Manager and created a CloudFront distribution so my Angular app is loaded over HTTPS.
The problem is that I am calling an HTTP REST API from the HTTPS-deployed application.
I keep getting this error:
Mixed Content: The page at "https://mywebsite.com" was loaded over HTTPS, but requested an insecure XMLHttpRequest 'http://myendpoint'. This request has been blocked; the content must be served over HTTPS.
I tried generating my own certificate in my Spring Boot application; it worked locally, but once deployed on Elastic Beanstalk the web services don't respond.
Any tips on how to use HTTPS with Beanstalk?
The error message sums up the problem clearly: it would be a huge security issue to allow unencrypted data transfer on a seemingly securely encrypted web page.
Moreover, you don't really want to do SSL termination on your instances: it costs performance, you would have to manage keys manually, and so forth.
In your situation, I would advise setting up a CloudFront distribution in front of your ALB (which I assume you have). That will solve your problem immediately, as CloudFront will automatically set up a domain for you and expose your endpoints via HTTPS. Afterwards, if you decide to, you can easily set up a custom domain and certificates.
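Until the backend is actually served over HTTPS, a common stopgap on the caller's side is to make sure every endpoint URL is emitted with the https scheme (this only helps once the endpoint really answers on 443). A minimal Python sketch of the idea, with a hypothetical helper name:

```python
from urllib.parse import urlsplit, urlunsplit

def force_https(url: str) -> str:
    """Rewrite an http:// URL to https://; leave other URLs untouched.

    Hypothetical helper for illustration -- it only helps once the
    endpoint actually serves TLS; otherwise the request fails anyway.
    """
    parts = urlsplit(url)
    if parts.scheme == "http":
        parts = parts._replace(scheme="https")
    return urlunsplit(parts)

print(force_https("http://myendpoint/api/items"))
# -> https://myendpoint/api/items
```

The same normalization can of course be done in the Angular client; the point is simply that the page and the API must agree on HTTPS.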
Finally, I recommend reading this article to make sure you avoid common pitfalls when configuring ALB and CloudFront.
Best, Stefan

How to serve a Node.js API through AWS CloudFront?

I am trying to serve my Node.js API (deployed on AWS EC2 behind an Application Load Balancer) through a CloudFront URL. Is that possible?
Here are the steps I followed so far -
Created an S3 bucket for static website hosting
Created a CloudFront distribution and linked the S3 bucket to it. I can access the S3 bucket's contents with the default URL generated by CloudFront
Created a custom origin for the Node.js instance
Created a behavior "api/*" to access the Node.js API through CloudFront.
But when I am trying to access API with following url -
http://d3m30a4naen9t2.cloudfront.net/api/getItems
it returns "not found". This is not a CloudFront 404; the response comes from the EC2 server, even though the specified route exists.
Can anyone help please?
I am using ELB. I deployed my Node.js code and everything is working fine. I faced a lot of problems on ELB, but we finally reached a stable state. If you want to serve your APIs, first use an SSL policy with fewer protocols (in other words, a less restrictive SSL policy); otherwise your API will not be reachable from other sources. Then simply deploy your code through Git (or directly via FileZilla) and, on both servers (primary and secondary), run pm2 start index.js/server.js, or whatever your main Express file is.
Suggestion: be careful when selecting security certificates. On ELB, if you do not follow the correct implementation, you will run into "API not accessible" or "Remote server is unable to connect" errors.
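One likely cause of the "not found" response above: CloudFront does not rewrite the URI when a behavior's path pattern matches, so the origin receives the full path, including the /api prefix. If the Express routes are registered without that prefix, the origin itself returns the 404-like "not found". A small Python sketch of the mismatch (the route table is illustrative, not the asker's actual app):

```python
# Illustrative route table -- a stand-in for the Node.js app's routes,
# which were presumably registered without the /api prefix.
routes = {"/getItems"}

def origin_lookup(path: str) -> str:
    """Mimic the origin server's route matching."""
    return "items payload" if path in routes else "not found"

# CloudFront matches the "api/*" behavior but forwards the path unchanged:
print(origin_lookup("/api/getItems"))   # -> not found
print(origin_lookup("/getItems"))       # -> items payload
```

The fix is to register the routes under /api on the origin, or to strip the prefix before forwarding (for example with Lambda@Edge).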

Unable to upload image to S3 via Beanstalk but working in local server

I deployed my Spring Boot application on AWS Elastic Beanstalk. The application has a module to upload a profile picture, which is saved to AWS S3. It works fine on my local server, but not on Elastic Beanstalk: I get a Permission Denied error. Can anyone please help me solve this issue?
Check if your instance profile has permissions to upload to S3. I suppose that locally you are using secret and access keys which have access to S3; however, the EC2 instance started by Elastic Beanstalk probably has no S3 permissions.
Check out https://docs.aws.amazon.com/elasticbeanstalk/latest/dg/iam-instanceprofile.html for more details on how to configure the role for your instance.
Hope this helps!
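The instance-profile fix above usually comes down to attaching a policy along these lines to the instance role. This is a minimal sketch; the bucket name is a placeholder, and you may want to scope the actions more tightly:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": ["s3:PutObject", "s3:GetObject"],
      "Resource": "arn:aws:s3:::my-profile-pics-bucket/*"
    }
  ]
}
```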

File permission in Linux, Allowing user upload in Amazon ec2

I am making a web app hosted on Amazon EC2. One of the pages allows users to upload photos to the site.
My script works on an ordinary site, but not on EC2, so I suspect the problem is about connectivity or authorization.
Maybe the ordinary web user does not have authorization to upload to the /var/www/html/ directory? But how do I enable that?
(Specifically, I think the failing call is move_uploaded_file.)
I am using an Amazon EC2 micro instance running the Linux AMI. I am uploading via a form in a PHP file.
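Assuming Apache on Amazon Linux, where the web server runs as the apache user (an assumption, since the question doesn't say which stack is installed), the usual fix is to give that user write access to a dedicated upload directory rather than opening up all of /var/www/html:

```shell
# Sketch, assuming Apache runs as user "apache" (Amazon Linux default).
# Create a dedicated upload directory instead of opening all of html/.
sudo mkdir -p /var/www/html/uploads
sudo chown apache:apache /var/www/html/uploads
sudo chmod 755 /var/www/html/uploads
```

With the upload target owned by the web server's user, move_uploaded_file can write there without making the whole document root world-writable.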

Amazon S3 or CloudFront - HTTPS custom domain

I'm looking for a CDN where I can set up a certificate for HTTPS.
I have subdomain.site.com and I need all files from https://subdomain.site.com to be hosted on Amazon servers.
A quick look showed me that I cannot set up my own HTTPS certificate for a custom-domain bucket on S3. Is this correct?
Can Amazon CloudFront do this?
Yes, you can serve a CloudFront distribution over HTTPS with a custom domain now.
Amazon recently rolled out a new feature that supports custom SSL certificates at no charge using SNI (Server Name Indication): http://aws.amazon.com/cloudfront/custom-ssl-domains/
Well, it appears that is not possible.
Might be relevant: as of today, AWS offers the ability to upload a custom SSL certificate on a dedicated IP for $600/month. More details on their pricing page: http://aws.amazon.com/cloudfront/pricing/
