Can I look at files my code makes on Heroku? [closed]

I have a Discord bot that saves JSON files in the directory it runs in, so it can work on more than one server without the data colliding.
I finished my code and uploaded it to Heroku for hosting. The thing is, when I ran the code locally I could see the files being created for each server while testing, but now I don't know how to reach them.
Is there a way to check all the files I have on Heroku without downloading everything?

You can install the Heroku CLI and then access your files using:
heroku login
heroku run bash -a APPNAME
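Once the bash session is open you can poke around with ordinary shell commands, for example (the file name is just a placeholder for whatever your bot writes, and note that heroku run starts a one-off dyno, so you see that dyno's own copy of the filesystem rather than the bot dyno's):
# list the JSON files present in the dyno's working directory
ls -l *.json
# print one of them to inspect its contents
cat some-guild.json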
But be aware that Heroku uses an ephemeral filesystem. This means that your local filesystem is only accessible to a single dyno, and once the dyno is stopped, restarted, or moved, all files on the local filesystem are destroyed.
You could use a service like Amazon S3 to store your files in a more permanent way.

Related

Bash scripting to copy [closed]

Please share a bash script to run S3 copy commands. I have already tried a separate script that calls the rclone command for each agency to back up Windows Server data from an EC2 instance to S3.
The command below syncs the current directory to an S3 bucket using a named profile.
aws s3 sync . $S3_BUCKET_URL --profile $YOUR_PROFILE
The next command syncs the S3 bucket to the current directory using a named profile.
aws s3 sync $S3_BUCKET_URL . --profile $YOUR_PROFILE
The next command copies a file (file.txt) from your machine to S3 using the default profile of your machine.
aws s3 cp file.txt s3://my-bucket/
Of course, you need the AWS CLI and an AWS credential pair (access key ID & secret access key) to make this work from your machine or an on-premises network. If you want to copy from EC2 to S3, you can attach an IAM role that has permission to read/write objects in the bucket to that EC2 instance and you should be good to go.
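If you want an actual script around this, a minimal sketch that syncs one sub-directory per agency to its own prefix in a bucket could look like the following (the bucket name, profile name, and directory layout are assumptions, not something given in the question):
#!/usr/bin/env bash
# Back up each agency directory to its own prefix in the bucket.
set -euo pipefail
S3_BUCKET_URL="s3://my-backup-bucket"   # hypothetical bucket
PROFILE="backup-profile"                # hypothetical named profile
for dir in /data/agencies/*/; do
    agency=$(basename "$dir")
    echo "Backing up $agency ..."
    aws s3 sync "$dir" "$S3_BUCKET_URL/$agency/" --profile "$PROFILE"
done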

Is it possible to access S3 objects from on-prem servers with automatically updating AWS credentials? [closed]

I have a list of zip files stored in an S3 bucket that I would like to access from my web app, which runs on an IIS server outside AWS (on-prem). What would be the best way to handle the AWS credentials? (I have a total of 4 environments, i.e. 4 IIS servers I need to maintain.)
I know that I can manually set up an AWS credential profile (using an IAM user) with the AWS CLI on each environment, but I was wondering if there is a better way to handle the AWS credentials stored on the servers. For example, our organization has an IAM user policy that expires the credentials every 90 days. That means there would be a bit of maintenance overhead, requiring me to update the IAM user credentials on all the machines every 90 days.
Is there a way to automate the above process so that the IAM user credentials are updated automatically?
You would need to create such a process yourself. There is nothing within AWS that can update credentials stored on your own computers.
An IAM user can have two active access keys, so the process would be:
Generate a second set of credentials on the IAM user
Update the credentials stored in your applications
Disable/delete the first set of credentials on the IAM user
This gives some time for applications to update to the new credentials while the old credentials are still valid. These steps can be automated via API calls to AWS, but your own application would need to initiate the steps and update the credentials on your servers.
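For illustration, a sketch of that rotation driven by the AWS CLI (the user name and key ID below are placeholders; pushing the new key into your application's config is the part you would still have to script yourself):
# 1. Create a second access key for the IAM user
aws iam create-access-key --user-name my-app-user
# 2. Update the credentials your application uses with the new key
#    (e.g. rewrite the profile in ~/.aws/credentials on each server)
# 3. Deactivate, then delete, the old key once everything has switched over
aws iam update-access-key --user-name my-app-user --access-key-id AKIAOLDKEYEXAMPLE --status Inactive
aws iam delete-access-key --user-name my-app-user --access-key-id AKIAOLDKEYEXAMPLE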

Unexpected HTTP 500 error when user logs in on localhost, but login is successful on live site [closed]

The code I have pulled from the production server to my localhost server is identical. On the live site, the user can log in as normal; however, on localhost I cannot. I am using Laravel 5.1 for this project along with a XAMPP server.
The issue cannot be with the source code, as it is the same on the live site as on my machine, and it works as it should on the live site. Therefore I'm guessing the issue lies in my XAMPP configuration settings, but I am unsure where to start.
If a page returns an HTTP 500 error, it means there is a syntax error somewhere or the script can't run correctly. Maybe your settings are set for the non-local server?
The issue was with the XAMPP configuration file and its index.php file. I had a conversation with the original developer, who guided me through how to configure this, and all is resolved now. I needed to change some permissions (RWX etc.) on the files I had downloaded, as well as on the root folder of my project.
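For anyone debugging the same kind of 500 on a local Laravel 5.1 / XAMPP copy, a rough sketch of the checks that helped here (set APP_DEBUG=true in your local .env first so the error page shows the real exception):
# watch the application log while reproducing the failing login
tail -f storage/logs/laravel.log
# make the writable directories accessible to the web server user;
# missing permissions here is a common cause of 500 errors after copying a project
chmod -R 775 storage bootstrap/cache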

Access files uploaded to Parse.com after Heroku migration [closed]

I am attempting to migrate my server from Parse.com over to Heroku with their one-click migration. Their documentation says that Parse Server supports the "file" type, but I can't find any documentation on transferring these files so Heroku can access them.
This isn't an answer but I've been having the same issue/dilemma and have partial information that might be helpful in eventually finding an answer. I did a migration and took a look at some of the stuff going on.
For example, photoObj.get('file').url():
On Parse hosting, files point to the following:
http://files.parsetfss.com/parseFileKey/fileName.ext
This is stored on Amazon S3; basically it points to:
https://s3.amazonaws.com/files.parsetfss.com/parseFileKey/fileName.ext
After migrating to Heroku/MongoLab, photoObj.get('file').url() points to the following:
http://files.parsetfss.com/newHostFileKey/fileName.ext
newHostFileKey is something we designate in the parse-server setup and seems to be automatically generated via this setting.
I don't see any evidence so far that the migration tool moves files from Parse Hosting to the new host/db.
File uploading to the new host works fine. On the new host, if one generates a new file it ends up pointing to something like this:
http://newHostURL/parse/files/appID/fileName.ext
parse is whatever you designate at the startup of your parse-server, e.g. app.use('/parse', api);
appID is whatever you designate at the startup of your parse-server, e.g.:
var api = new ParseServer({
appId: 'appID',
fileKey: 'newHostFileKey'
});
Changing the URL of a Parse-hosted file to fit the new host pattern doesn't yield anything (file not found, etc.).
I have no idea how new files are being stored or where the URL routes to.
With new files that are uploaded via the new host, I notice that some new tables/collections are created in the MongoLab DB. These are fs.chunks and fs.files.
fs.chunks is where the data of the file is being stored (I think). So under the new Heroku/MongoLab setup, files seem to reside "in" the DB.
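If you want to confirm that, the GridFS collections can be inspected directly from the mongo shell (a quick sketch; MONGODB_URI stands in for whatever connection string MongoLab gives you):
# list a few stored file entries and count the data chunks (GridFS)
mongo "$MONGODB_URI" --eval 'db.fs.files.find().limit(5).forEach(printjson)'
mongo "$MONGODB_URI" --eval 'print(db.fs.chunks.count())'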
As for the best way to migrate images from Parse hosting to the new hosting, I have no idea, but I'm not sure there is a straightforward answer publicly available at this point.

Forgot Amazon EC2 instance [Windows 2008 Server] password [closed]

I cannot retrieve the password for my EC2 instance running Windows Server 2008. I want to keep this particular instance, and the Get Password option does not work.
Is there any workaround for retrieving the password?
Your server was most probably launched from an AMI; that is the reason you cannot retrieve your password. It has happened to me in the past. In short, what you have to do is create an image of the existing instance and then launch a new instance using that image:
Create a snapshot with ec2-create-snapshot.
Attach the snapshot to a new instance.
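With the current AWS CLI (the old ec2-* command-line tools have since been replaced by it), the image-based route described above could look roughly like this; the instance ID, AMI ID, instance type, and key pair name are placeholders:
# create an AMI from the existing instance
aws ec2 create-image --instance-id i-0123456789abcdef0 --name "password-recovery-image"
# launch a replacement instance from that AMI with a key pair you still control,
# so the Windows administrator password can be retrieved again
aws ec2 run-instances --image-id ami-0123456789abcdef0 --instance-type t2.medium --key-name my-new-key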
