Read files from remote EC2 instance with JavaScript? - ajax

I have files stored on my remote EC2 server. When users pull up my website, I need to get the text from those files in real time and pull it into the website, like GitHub does.
How do I go about doing this?
If this doesn't make sense: I want to do what GitHub does. You can navigate between files on a website that are stored on a server, and the site just reads from the server and renders that data to the user. I am using Firebase for hosting, but switching to S3 is possible if absolutely necessary. Thank you.

S3 has versioning, but it's not a full-on source control management tool, so it doesn't natively support branching or merging. Usually for a database-backed file system I'd recommend using a Virtual Path Provider technology, where you read documents out of a database, e.g. HTML files used in a CMS.
GitHub is essentially a web interface for Git; the whole thing runs on a plethora of Git repositories.
Hence what you want is AWS CodeCommit, a fully-managed source control service that hosts secure Git-based repositories.
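As a rough illustration of the CodeCommit route (not part of the original answer), creating a repository per user programmatically with the AWS SDK for JavaScript v3 could look something like this in TypeScript; the region and naming scheme are assumptions:
// Rough sketch: create a per-user CodeCommit repository with the AWS SDK v3.
// The region and the "user-<name>" naming convention are placeholders.
import { CodeCommitClient, CreateRepositoryCommand } from "@aws-sdk/client-codecommit";

const codecommit = new CodeCommitClient({ region: "us-east-1" });

async function createUserRepo(username: string): Promise<string | undefined> {
  const out = await codecommit.send(
    new CreateRepositoryCommand({
      repositoryName: `user-${username}`,              // hypothetical naming scheme
      repositoryDescription: `Repository for ${username}`,
    })
  );
  // The clone URL can be handed back to the client for git operations
  return out.repositoryMetadata?.cloneUrlHttp;
}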
Edit:
I need all my users to be able to push to their own branches.
If you don't need merging, you could simply use an S3 bucket and create a folder (programmatically) for each user. With a policy you can restrict each user to their own folder:
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Principal": {
        "AWS": [
          "arn:aws:iam::222222222222:role/ROLENAME",
          "arn:aws:iam::222222222222:user/${aws:username}"
        ]
      },
      "Action": "s3:ListBucket",
      "Resource": "arn:aws:s3:::MyExampleBucket"
    }
  ]
}
Ref: https://aws.amazon.com/blogs/security/how-to-restrict-amazon-s3-bucket-access-to-a-specific-iam-role/
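On the read side, a minimal sketch of pulling one of those per-user files into the page with the AWS SDK for JavaScript v3 (written here in TypeScript) might look like the following; the bucket name, prefix layout, region and target element ID are placeholders, not something from the question:
// Sketch: read a text file from a per-user prefix in S3 and render it.
// Bucket, prefix layout, region and element ID are assumptions.
import { S3Client, GetObjectCommand } from "@aws-sdk/client-s3";

const s3 = new S3Client({ region: "us-east-1" });

async function readUserFile(username: string, fileName: string): Promise<string> {
  const result = await s3.send(
    new GetObjectCommand({
      Bucket: "MyExampleBucket",        // placeholder bucket from the policy above
      Key: `${username}/${fileName}`,   // one "folder" (key prefix) per user
    })
  );
  // In SDK v3 the returned Body exposes transformToString()
  return result.Body!.transformToString();
}

// Render the text into the page, GitHub-file-viewer style
readUserFile("alice", "notes.txt").then((text) => {
  document.getElementById("viewer")!.textContent = text;
});
In practice the browser would need scoped credentials for this (for example via Cognito, or presigned URLs generated server-side) rather than embedded access keys.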

Related

Scripted file upload to AWS CloudShell environment

We maintain dozens of developer accounts on AWS, and for maintenance purposes it would be great to have a set of scripts available in all CloudShell environments.
It is possible to upload files to the CloudShell environment manually using the Actions -> Upload File feature in the web console, but that is not feasible for managing dozens of environments.
Is there an Ansible module or another way to upload files to CloudShell? Probably via an S3 bucket, but we're missing the last mile into the CloudShell environment.
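A hedged sketch of the S3 leg of that idea, in TypeScript with the AWS SDK v3 (bucket name, key layout and region are invented for illustration); the last mile inside CloudShell is only hinted at in the closing comment, since that is the part the question says is missing:
// Sketch: publish maintenance scripts to a shared S3 bucket so each
// CloudShell environment can pull them. Bucket/region are placeholders.
import { readFileSync } from "node:fs";
import { basename } from "node:path";
import { S3Client, PutObjectCommand } from "@aws-sdk/client-s3";

const s3 = new S3Client({ region: "eu-central-1" });

async function publishScript(localPath: string): Promise<void> {
  await s3.send(
    new PutObjectCommand({
      Bucket: "shared-cloudshell-scripts",      // hypothetical shared bucket
      Key: `scripts/${basename(localPath)}`,
      Body: readFileSync(localPath),
    })
  );
}

// Inside each CloudShell session the preinstalled AWS CLI could then fetch them,
// e.g. aws s3 sync s3://shared-cloudshell-scripts/scripts ~/scripts -- automating
// that per-environment step is the part the question is really asking about.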

Indexing of Mega shared folder link in Heroku

I want to index a Mega shared folder on Heroku
[it is the same idea as indexing a Google Drive folder in a Heroku app]
Is there any way to do that?
Tips:
I think rclone can do that. Here is the GitHub link:
https://github.com/developeranaz/Mega-index-heroku
That project requires Mega cloud login credentials and indexes a single account on Heroku.
What I want is either multiple-account indexing on Heroku,
or multiple shared-folder indexing on Heroku
[either solution would be fine]

What is webhookId and deployKeyId when mirroring a Bitbucket repository in Google Cloud

We have a lot of Bitbucket repositories which I am trying to link to Google Cloud so I can use automatic build triggers to build container images.
If you set up a build trigger from a Bitbucket source through the admin panel interface, Google seems to create a mirrored source repository, adds an SSH key on Bitbucket (so it can pull any changes) and also adds a Bitbucket service (old-style webhook) on the repository, so that pushing to Bitbucket triggers a pull into Google (i.e. mirroring). This all seems to work well, but I would like to be able to set this up programmatically via an API.
I can set up the Bitbucket side fine using their API, but I am not sure how to go about creating the Google repository this way.
Using the API to fetch an existing mirrored repository gives some clues about how it works. The mirrorConfig property is the key piece.
{
  "name": "projects/my_project_id/repos/bitbucket-organisation-myrepo",
  "size": "7670706",
  "url": "https://source.developers.google.com/p/my_project_id/r/bitbucket-organisation-myrepo",
  "mirrorConfig": {
    "url": "ssh://git@bitbucket.org/organisation/myrepo.git",
    "webhookId": "12299619",
    "deployKeyId": "6759258"
  }
}
The POST service configured in Bitbucket is:
https://source.developers.google.com/webhook/bitbucket?id=M12A8PQNCVD&project=385688625156
Notice how the IDs in Google don't correspond with the Bitbucket webhook URL. Adding another repository gives completely different IDs and creates a new SSH keypair on my Bitbucket user every time.
webhookId looks like it links to some sort of "Google Cloud webhook", which must be how the Bitbucket webhook is hooked up to the Google side, but I can't see where you find or create them.
deployKeyId sounds like it links to some sort of credentials store, which must be where the private key is kept.
My question is: what are these two IDs really, and which APIs can I use to set up webhooks and deploy keys in Google Cloud?
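For the read side, the resource shown above comes from the Cloud Source Repositories API (sourcerepo.googleapis.com); a TypeScript sketch of fetching it might look like the following. The project/repo name reuses the example above, the token handling is a placeholder, and this only retrieves mirrorConfig, it does not show how to create one:
// Sketch: fetch an existing mirrored repo via the Cloud Source Repositories
// REST API. The access token is assumed to come from gcloud or a service account.
async function getMirrorConfig(accessToken: string) {
  const name = "projects/my_project_id/repos/bitbucket-organisation-myrepo";
  const res = await fetch(`https://sourcerepo.googleapis.com/v1/${name}`, {
    headers: { Authorization: `Bearer ${accessToken}` },
  });
  if (!res.ok) throw new Error(`sourcerepo API returned ${res.status}`);
  const repo = await res.json();
  // mirrorConfig carries url, webhookId and deployKeyId; it appears to be a
  // read-only field on this resource, which would match not being able to set
  // these IDs directly through the public API.
  return repo.mirrorConfig;
}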

Deploying to Heroku with LoopBack buildpack overwrites my changes to LoopBack code

I'm deploying my LoopBack project in Heroku using the buildpack in:
https://github.com/strongloop/strongloop-buildpacks.git
However, I have a few changes I made to the LoopBack User model (specifically, I changed the ACLs to deny User creation by $everyone), and when I deploy it on Heroku those changes are overwritten with the default values (i.e. the ACL allows $everyone to POST to /Users).
My guess is that when deploying on Heroku, my changes go in first and then the buildpack is installed, so any changes to the LoopBack source code are overwritten.
Is there any way I can make changes to the LoopBack source code and deploy to Heroku?
Do I have to create my own buildpack with my changes? Any recommended resources on how to create a buildpack?
Thanks!
After some research it seems that my assumptions about why this was failing were right. It turns out that the reason the changes are being overwritten is indeed that the buildpack installs everything on top of whatever project structure you commit to your Heroku app.
In my case, since my changes involved editing StrongLoop's files, whenever the SL buildpack was installed those changes were lost.
Solution:
The way I solved this was by forking StrongLoop's buildpack and then adding a few lines to the bin/compile file to use sed to delete the ACL entries that allow anyone ("$everyone" role) to POST a new User instance:
status "Removing CREATE permissions for User model"
sed '42,47d' $build_dir/node_modules/loopback/common/models/user.json > $build_dir/node_modules/loopback/common/models/user.tmp
mv $build_dir/node_modules/loopback/common/models/user.tmp $build_dir/node_modules/loopback/common/models/user.json
(link to the position of the lines is here)
In the version of SL that I'm using this deletes the following lines:
},
{
"principalType": "ROLE",
"principalId": "$everyone",
"permission": "ALLOW",
"property": "login"
(link to GitHub lines here)
I then used this new buildpack to create a new Heroku app, which now denies the "$everyone" role the ability to create new Users.
Caveats
This is of course a very crude way of accomplishing this. I would think the correct way would be to fork the StrongLoop repo itself, make the changes there, and then use a buildpack that installs the forked repo. However, in my case that would have meant keeping track of fixes committed to the original StrongLoop repo and merging them back, which for the small change I needed seemed unnecessary.

Using Amazon S3 in place of an SFTP Server

I need to set up a repository where multiple people can drop off Excel and CSV files. I need a secure environment with access control, so customers logging on to drop off their own data can't see another customer's data. For example, if person A logs on to drop off a Word document, they can't see person B's Excel sheet.
I have an AWS account and would prefer to use S3 for this. I originally planned to set up an SFTP server on an EC2 instance; however, after doing some research I feel that using S3 would be more scalable and safer. I've never used S3 before, though, nor have I seen it in a production environment. So my question really comes down to this: does S3 provide a user interface that allows multiple people to drop off files, similar to an FTP server? And can I create access control so people can't see other people's data?
Here are the developer resources for S3
https://aws.amazon.com/developertools/Amazon-S3
Here are some pre-built widgets
http://codecanyon.net/search?utf8=%E2%9C%93&term=s3+bucket
Let us know your angle, as we can provide other ideas knowing more about your requirements.
Yes, it does. You can control access to your resources using IAM users and roles.
http://aws.amazon.com/iam/
You can grant privileges to parts of an S3 bucket depending on the user or role. For example:
mybucket/user1
mybucket/user2
mybucket/development
could all have different permissions.
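If customers should upload through a browser rather than an S3 client, one common pattern is handing each logged-in user a short-lived presigned PUT URL scoped to their own prefix; a rough TypeScript sketch with the AWS SDK v3 follows (bucket, key layout and expiry are placeholders, not from the original answer):
// Sketch: generate a presigned upload URL limited to one customer's prefix,
// e.g. mybucket/user1/report.xlsx -- other customers' prefixes stay invisible.
import { S3Client, PutObjectCommand } from "@aws-sdk/client-s3";
import { getSignedUrl } from "@aws-sdk/s3-request-presigner";

const s3 = new S3Client({ region: "us-east-1" });

async function uploadUrlFor(customer: string, fileName: string): Promise<string> {
  const command = new PutObjectCommand({
    Bucket: "mybucket",                 // placeholder bucket from the example above
    Key: `${customer}/${fileName}`,
  });
  return getSignedUrl(s3, command, { expiresIn: 900 }); // valid for 15 minutes
}
The customer's browser can then PUT the file directly to the returned URL, while IAM policies keep each prefix private.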
Hope this helps.
