How to allow fs and multer to write files on AWS EC2 - mean-stack

I am new to programming and recently finished a full MEAN stack project which is deployed on AWS EC2.
Project codes:
https://github.com/Cryfoo/13
Deployed at: http://35.166.172.216/
On the server side, it uses fs.writeFile to save game logs after each game.
// Code from server/config/game.js, lines 1361~1364
var filename = 10000 + roomNum;
filename = filename + "_" + endTime;
fs.writeFile("server/logs/" + filename + ".txt", variables[roomNum].logs, function(err) {});
On the client side, it sends an HTTP request to the server and uses multer to upload a user's profile picture.
// Code from server/controllers/user.js, lines 3~12
var storage = multer.diskStorage({
  destination: function (req, file, callback) {
    callback(null, "./client/static/profile");
  },
  filename: function (req, file, callback) {
    callback(null, file.originalname);
  }
});
var upload = multer({storage: storage, limits: {fileSize: 5000000}}).single("profile");
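Note that when destination is given as a function, multer does not create the folder for you; a minimal sketch (the relative path is an assumption about the project layout) of making sure it exists before handling uploads:
var fs = require("fs");
var path = require("path");

// Create the upload folder up front so disk storage doesn't fail with ENOENT.
var uploadDir = path.join(process.cwd(), "client/static/profile"); // assumed layout
if (!fs.existsSync(uploadDir)) {
  fs.mkdirSync(uploadDir, { recursive: true }); // recursive option needs Node 10.12+
}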
It works well locally on my laptop, but these two features do not work on EC2. I am assuming the problem has to do with permission to write files. How do I allow fs and multer to write files on EC2?
I have searched a lot for the other problems I ran into during this project and found solutions on Stack Overflow and Google, but I can't figure this one out. I apologize if I am not being specific enough (first time posting a question here). Thanks in advance for the help.

If it's an issue with permissions on the folder, then you just need to change them. Note that creating files inside a directory requires write (and execute) permission on it:
sudo chmod -R a+wx server/logs
Edit: you probably need to run the command with sudo
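If the folders exist but were created by another user (for example by root during deployment), changing ownership to the user that runs the Node process is the usual fix; a minimal sketch of a startup check (paths taken from the question):
var fs = require("fs");

// Verify at startup that the process can actually write to both folders.
["server/logs", "client/static/profile"].forEach(function (dir) {
  fs.access(dir, fs.constants.W_OK, function (err) {
    if (err) {
      // Fix on the EC2 instance with, for example:
      //   sudo chown -R $(whoami) server/logs client/static/profile
      console.error("Cannot write to " + dir + ":", err.code);
    }
  });
});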

Related

Transferring google bucket file to end user without saving file locally

Right now, when a client downloads a file from my site, I am:
1. Downloading the file from a Google Cloud bucket to my server (GCP download file, GCP streaming download)
2. Saving the downloaded file to a Ruby Tempfile
3. Sending the Tempfile to the end user using Rails 5 send_file
I would like to skip step 2 and somehow transfer/stream the file from Google Cloud to the end user without it being saved on my server - is that possible?
Note: the Google bucket is private.
Code I'm currently using:
# 1 getting file from gcp:
storage = Google::Cloud::Storage.new
bucket = storage.bucket bucket_name, skip_lookup: true
gcp_file = bucket.file file_name
# 2a creates tempfile
temp_file = Tempfile.new('name')
temp_file_path = temp_file.path
# 2b populate tempfile with gcp file content:
gcp_file.download temp_file_path
# 3 sending tempfile to user
send_file(temp_file, type: file_mime_type, filename: 'filename.png')
What I would like:
# 1 getting file from gcp:
storage = Google::Cloud::Storage.new
bucket = storage.bucket bucket_name, skip_lookup: true
gcp_file = bucket.file file_name
# 3 sending/streaming file from google cloud to client:
send_file(gcp_file.download, type: file_mime_type, filename: 'filename.png')
Since making your bucket or your objects publicly readable is not an option for your project, the best option I can suggest is signed URLs: you keep control over your bucket and objects while still giving users enough permission to perform a specific action, such as downloading an object from your GCS bucket, for a limited time.
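For illustration, a minimal sketch of that approach using the google-cloud-storage gem's File#signed_url (the 5-minute expiry is just an example): instead of proxying the bytes, the controller redirects the client to a short-lived URL for the private object.
# 1 getting file from gcp (same as above):
storage  = Google::Cloud::Storage.new
bucket   = storage.bucket bucket_name, skip_lookup: true
gcp_file = bucket.file file_name
# 2 generating a time-limited signed URL for the private object
signed_url = gcp_file.signed_url method: "GET", expires: 300 # seconds
# 3 redirecting the client so it downloads directly from Google Cloud Storage
redirect_to signed_url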

How to get access from a local machine to a server-side terminal using node

I want to get access to a server-side terminal from Node.js code running on my local machine. I know that if you want to execute terminal commands from Node.js code you can use child_process, but how do I make a connection to a server-side terminal and send commands there? Hope my question is clear.
If SSH is enabled on your server, you can use the node-ssh NPM package:
var path, node_ssh, ssh, fs
fs = require('fs')
path = require('path')
node_ssh = require('node-ssh')
ssh = new node_ssh()

ssh.connect({
  host: 'localhost',
  username: 'steel',
  privateKey: '/home/steel/.ssh/id_rsa'
}).then(function () {
  /* your ssh interactions with the server go here */
  /* use the node-ssh API to execute commands, upload files... */
})
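For example, inside that .then callback you could run a command on the remote machine with node-ssh's execCommand and read its output; a minimal sketch (command and paths are placeholders):
ssh.execCommand('ls -la /var/log', { cwd: '/home/steel' }).then(function (result) {
  // result.stdout / result.stderr hold the remote command's output.
  console.log('STDOUT: ' + result.stdout);
  console.log('STDERR: ' + result.stderr);
});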

Laravel 5: How to copy (stream) a file from Amazon S3 to FTP?

I have to move large content, which I don't want to load into memory, from AWS S3 to FTP with Laravel's filesystem.
I know how to stream local content to S3, but haven't found a solution yet for going from S3 to FTP.
The closest I found was this, but I'm stuck adapting it to my case.
Here is what's missing in my code (??):
$inputStream = Storage::disk('s3')->getDriver()->??
$destination = Storage::disk('ftp')->getDriver()->??
Storage::disk('ftp')->getDriver()->putStream($destination, $inputStream);
I think I found a solution:
// Get the underlying Flysystem drivers for both disks
$input = Storage::disk('s3')->getDriver();
$output = Storage::disk('ftp')->getDriver();
// Stream the file from S3 to FTP without loading it into memory
$output->writeStream($ftp_file_path, $input->readStream($s3_file_path));
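As a side note, assuming a Laravel version where readStream/writeStream are exposed directly on the disk (so getDriver() isn't needed), a minimal sketch:
// Stream straight from S3 to FTP without loading the file into memory.
$stream = Storage::disk('s3')->readStream($s3_file_path);
Storage::disk('ftp')->writeStream($ftp_file_path, $stream);

// Close the handle if it is still open after writing.
if (is_resource($stream)) {
    fclose($stream);
}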

node.js: Run external command in new window

I'm writing a node.js script that permanently runs in the background and initiates file downloads from time to time. I want to use wget for downloading the files, since it seems more robust until I really know what I'm doing with node.js.
The problem is that I want to see the progress of the downloads in question, but I can't find a way to start wget from node.js so that a new shell window is opened to display wget's output.
Node's exec and spawn functions run external commands in the background and would let me access the output stream. But since my script runs in the background (hidden), there's not much I could do with the output stream.
I've tried opening a new shell window by running "cmd /c wget..." but that didn't make any difference.
Is there a way to run external command-line processes in a new window through node.js?
You can use node-progress and Node's http client (or request) for this, no need for wget:
https://github.com/visionmedia/node-progress
Example:
var ProgressBar = require('progress'); // npm install progress
var https = require('https');

var req = https.request({
  host: 'download.github.com',
  port: 443,
  path: '/visionmedia-node-jscoverage-0d4608a.zip'
});

req.on('response', function (res) {
  var len = parseInt(res.headers['content-length'], 10);

  console.log();
  var bar = new ProgressBar('  downloading [:bar] :percent :etas', {
    complete: '=',
    incomplete: ' ',
    width: 20,
    total: len
  });

  res.on('data', function (chunk) {
    bar.tick(chunk.length);
  });

  res.on('end', function () {
    console.log('\n');
  });
});

req.end();
UPDATE:
Since you want to do this in a background process and listen for the download progress (in a separate process or what have you), you can achieve that with pub-sub functionality, either:
use a message queue like Redis, RabbitMQ or ZeroMQ
start a TCP server on a known port / UNIX domain socket and listen to it (see the sketch after the resources below)
Resources:
http://nodejs.org/api/net.html
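For illustration, a minimal sketch of the TCP-server option using Node's net module (the port and message format are arbitrary): the background downloader publishes progress lines, and any other process can connect to read them.
var net = require('net');

// In the background downloader: keep track of connected listeners.
var listeners = [];
var server = net.createServer(function (socket) {
  listeners.push(socket);
  socket.on('close', function () {
    listeners.splice(listeners.indexOf(socket), 1);
  });
});
server.listen(7000); // arbitrary port

// Call this from the download's 'data' handler, e.g. with the bytes received so far.
function publishProgress(percent) {
  listeners.forEach(function (socket) {
    socket.write(JSON.stringify({ percent: percent }) + '\n');
  });
}

// In a separate process (e.g. a small CLI viewer):
//   var client = net.connect(7000, function () {});
//   client.on('data', function (chunk) { process.stdout.write(chunk); });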

Dynamics AX 2009: How to FTP from a batch job on an AOS

After quite a few searches for ways to FTP files in AX, I was happy to discover the WinInet class, which is more or less just a wrapper for the .DLL of the same name. I thought my problems were solved! I was not aware, however, that the class had a major Achilles heel -- it doesn't run in batch (on a server).
Can anybody point me in the right direction? Specifically, I want to upload (FTP put) a file to another server in a server-run batch job (running as a service user with admin rights to the file in question). Anybody?
There is another example of using .NET classes for FTP on Axaptapedia. It is different enough from 10p's example code to be worth a look...
In my own experience I ended up writing and then calling a bat file from the command line to pass in ftp commands, as we needed to use a special FTP client! Here are two examples of using shell scripting - Net Time && Run a Process.
Use .NET classes in AX, e.g. the following code logs in to the FTP server and renames a file there:
str ftpHostName = 'ftp.microsoft.com'; // host name only, without "ftp://"
str username = 'myloginname';
str password = 'mypassword';
str oldname = 'oldfilename';
str newname = 'newfilename';
System.Net.Sockets.Socket socket;
System.Net.Dns dns;
System.Net.IPHostEntry hostEntry;
System.Net.IPAddress[] addresses;
System.Net.IPAddress address;
System.Net.IPEndPoint endPoint;

void sendCommand(str _command)
{
    System.Text.Encoding ascii;
    System.Byte[] bytes;
    ;
    ascii = System.Text.Encoding::get_ASCII();
    bytes = ascii.GetBytes(_command + '\r\n');
    socket.Send(bytes, bytes.get_Length(), System.Net.Sockets.SocketFlags::None);
}
;
socket = new System.Net.Sockets.Socket(System.Net.Sockets.AddressFamily::InterNetwork, System.Net.Sockets.SocketType::Stream, System.Net.Sockets.ProtocolType::Tcp);
hostEntry = System.Net.Dns::GetHostEntry(ftpHostName);
addresses = hostEntry.get_AddressList();
address = addresses.GetValue(0);
info(address.ToString());
endPoint = new System.Net.IPEndPoint(address, 21);
socket.Connect(endPoint);
sendCommand(strfmt("USER %1", username));
sendCommand(strfmt("PASS %1", password));
sendCommand(strfmt("RNFR %1", oldname));
sendCommand(strfmt("RNTO %1", newname));
This is just an example, but feel free to issue any standard FTP command by slightly modifying this code. Let me know if the concept is unclear.
