Parse Server migrate current files from GridFS to S3 - parse-platform

In my Parse Server project, we have been saving all files locally in the database with GridFS. Over time the database has been growing fast and needs a lot of space. I analyzed this and saw that the files use a lot of that space.
I would like to migrate all files currently stored in GridFS to AWS S3 via the S3 file adapter.
Is there a recommended way or plan for doing that? Is there perhaps some sample code, or a set of steps I can follow, to do this without creating data inconsistency?
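One approach (not an official migration path) is to copy the raw file bytes out of GridFS into the S3 bucket under the same file names, so the S3 file adapter can find them once you switch adapters. A rough sketch in Java, assuming Parse Server's GridFS adapter uses the default "fs" bucket and the S3 adapter looks objects up by the Parse file name; the connection string, database name and bucket name below are placeholders:

import com.mongodb.client.MongoClients;
import com.mongodb.client.MongoDatabase;
import com.mongodb.client.gridfs.GridFSBucket;
import com.mongodb.client.gridfs.GridFSBuckets;
import com.mongodb.client.gridfs.model.GridFSFile;
import software.amazon.awssdk.core.sync.RequestBody;
import software.amazon.awssdk.services.s3.S3Client;
import software.amazon.awssdk.services.s3.model.PutObjectRequest;
import java.io.ByteArrayOutputStream;

public class GridFsToS3Migration {
    public static void main(String[] args) {
        MongoDatabase db = MongoClients.create("mongodb://localhost:27017").getDatabase("parse");
        GridFSBucket gridFs = GridFSBuckets.create(db);  // default "fs" bucket
        S3Client s3 = S3Client.create();
        String targetBucket = "my-parse-files";          // placeholder bucket name

        for (GridFSFile file : gridFs.find()) {
            // Read the file bytes out of GridFS ...
            ByteArrayOutputStream bytes = new ByteArrayOutputStream();
            gridFs.downloadToStream(file.getFilename(), bytes);
            // ... and write them to S3 under the same name the adapter will request
            s3.putObject(PutObjectRequest.builder()
                            .bucket(targetBucket)
                            .key(file.getFilename())
                            .build(),
                    RequestBody.fromBytes(bytes.toByteArray()));
            System.out.println("Migrated " + file.getFilename());
        }
    }
}

To avoid inconsistency, you could run the copy while the server is still on GridFS, switch the server's file adapter configuration over to the S3 adapter afterwards, verify that existing files are still served correctly, and only then drop the GridFS collections.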

Related

SpringBoot: How to store image/video files when using on-prem

I have a Spring Boot application that receives data from users; the data contains images and videos.
I am a little skeptical about storing big files in the database (MongoDB). Since I cannot use Amazon S3, what should I do to save images/videos?
Should I save them in a folder inside a Docker container on a VM, or
should I convert the image/video files to base64 and save them in MongoDB?
If option 1 is the better choice, what should my implementation approach be? I am new to this,
and I appreciate your comments.
I tried saving image/video files to the database, but it is extremely time-consuming.
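For what it's worth, option 1 usually amounts to streaming each upload to a directory (for example a mounted Docker volume) and storing only the path and metadata in MongoDB, not the bytes. A minimal Spring Boot sketch, with illustrative names and paths:

import org.springframework.web.bind.annotation.PostMapping;
import org.springframework.web.bind.annotation.RequestParam;
import org.springframework.web.bind.annotation.RestController;
import org.springframework.web.multipart.MultipartFile;
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;

@RestController
public class MediaUploadController {

    // Hypothetical directory backed by a mounted Docker volume on the VM
    private final Path storageDir = Paths.get("/data/media");

    @PostMapping("/media")
    public String upload(@RequestParam("file") MultipartFile file) throws IOException {
        Files.createDirectories(storageDir);
        Path target = storageDir.resolve(Paths.get(file.getOriginalFilename()).getFileName());
        file.transferTo(target);   // streams to disk instead of buffering the whole file in memory
        return target.toString();  // store this path (plus metadata) in MongoDB, not the bytes
    }
}

For large videos you would also need to raise the multipart limits (spring.servlet.multipart.max-file-size and max-request-size) accordingly.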

Continuous file load directly from FTP to Snowflake using Snowpipe

I'd like to copy files incrementally from an FTP server into Snowflake.
Currently, I am using an ADF pipeline which runs every 15 minutes and copies them into Azure Blob Storage. Then I have Snowpipes which ingest them into the target tables in Snowflake. However, I am looking for a more optimized solution.
Is there a way to load data directly from FTP to Snowflake instead of using ADF to copy files to Azure blob storage and read them from there?
Thanks
You have two alternative options:
Use Logic Apps instead of ADF. In Logic Apps, add the FTP connector with the "When a file is added or modified" trigger.
Use a stream and task instead of Snowpipe, together with ADF or Logic Apps.

Parse-Server: Transferring files hosted on Heroku to an AWS S3 bucket

I have parse-server running on Heroku. When I first created this app, I didn't specify a files adapter in index.js, so all uploaded files have been getting stored on Heroku.
So I have now run out of room, and I have set up an AWS S3 bucket to store my files in. This is working fine except for the fact that any files which were originally stored on Heroku can no longer be accessed through the application.
At the moment I am thinking about looping through all objects which have a relation to a file stored on Heroku, then uploading that file to the S3 bucket. I'm just hoping that there may be some tool out there, or that someone has an easier process for doing this.
thanks
There are migration guides for migrating Parse Server itself, but unfortunately I don't see anything in the documentation about migrating hosted files.
I did find one migration tool, but it appears to still read files through the previous file adapter (on your Heroku instance) and store anything new on the new adapter (S3 storage):
parse-server-migrating-adapter
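If you end up scripting the loop described in the question yourself, a rough sketch might look like this, assuming you can collect the old Heroku-hosted file URLs by querying the classes that reference them; the bucket name and key derivation are illustrative:

import software.amazon.awssdk.core.sync.RequestBody;
import software.amazon.awssdk.services.s3.S3Client;
import software.amazon.awssdk.services.s3.model.PutObjectRequest;
import java.io.InputStream;
import java.net.URL;
import java.util.List;

public class HerokuFilesToS3 {
    public static void main(String[] args) throws Exception {
        S3Client s3 = S3Client.create();
        String bucket = "my-parse-files";                        // placeholder bucket name
        List<String> fileUrls = List.of(/* URLs gathered from your Parse classes */);

        for (String url : fileUrls) {
            String key = url.substring(url.lastIndexOf('/') + 1); // the Parse file name
            try (InputStream in = new URL(url).openStream()) {
                // Download the old file and re-upload it under the same name in S3
                s3.putObject(PutObjectRequest.builder().bucket(bucket).key(key).build(),
                        RequestBody.fromBytes(in.readAllBytes()));
            }
            System.out.println("Copied " + key);
        }
    }
}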

Heroku PlayFramework - create thumbnail

I already have a PlayFramework app running, but I am in the process of migrating it to Heroku. Because on Heroku I cannot use the local filesystem like I did in my app, I am forced to use Amazon S3, but I do not know how to rewrite the creation of thumbnails. For that I am using:
https://github.com/coobird/thumbnailator
Thumbnails.of(picture.getFile()).size(300,300).toFile(new File("public/images/data", thumb));
The problem is that I cannot do this on Heroku, because the file won't be persisted.
How do I create thumbnails then, if I do not want to use another service which will generate thumbnails for me and save them to S3 somehow...
Honestly, if I had known how many different services I would need for a simple page in Java, I would have stayed with PHP forever...
On Heroku (as in many PaaS environments) there's no persistent filesystem. However, you do have access to a temp directory.
So you could save it into a temp file
File temp = File.createTempFile("thumb", ".jpg"); // the temp directory is writable on Heroku
Thumbnails.of(picture.getFile()).size(300, 300).toFile(temp);
Then take that file and upload it to S3.
If you strictly insist on not using S3 for storing binary files, then you could base64 the file content and save it into the DB (see some pros/cons of such an approach here).
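If you did go the base64 route for small files, the encoding itself is short in Java; the real cost is the bloat in the DB, as noted above. A sketch:

import java.io.File;
import java.nio.file.Files;
import java.util.Base64;

public class Base64FileEncoding {
    // Encode a file's bytes as a base64 string so it can be stored in a DB text field
    static String encode(File file) throws Exception {
        return Base64.getEncoder().encodeToString(Files.readAllBytes(file.toPath()));
    }
}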

Storing images in file system, amazon-s3 datastores

There have been numerous discussions related to storing images (or binary data) in the database versus the file system (refer to: Storing Images in DB - Yea or Nay?).
In the short term we have decided to use the file system for storing images, with the relevant image-specific metadata in the database itself, and to migrate to an Amazon S3 based data store in the future. Note: the data store will be used to store user pictures, photos from group meetings ...
Are there any off-the-shelf, Java-based open source frameworks which provide an abstraction to handle storage and retrieval via HTTP for the above data stores? We wouldn't want to write any code related to admin tasks like backups, purging, and maintenance.
Jets3t - http://jets3t.s3.amazonaws.com/index.html
We've used this and it works like a charm for S3.
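For reference, a minimal Jets3t upload looks roughly like this (credentials and bucket name are placeholders):

import java.io.File;
import org.jets3t.service.impl.rest.httpclient.RestS3Service;
import org.jets3t.service.model.S3Object;
import org.jets3t.service.security.AWSCredentials;

public class Jets3tUploadExample {
    public static void main(String[] args) throws Exception {
        // Placeholder credentials and bucket name
        RestS3Service s3 = new RestS3Service(new AWSCredentials("ACCESS_KEY", "SECRET_KEY"));
        S3Object object = new S3Object(new File("photo.jpg")); // key defaults to the file name
        s3.putObject("my-bucket", object);
    }
}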
I'm not sure I understand if you are looking for a framework that will work for both file-system storage and S3, but as unique as S3 is, I'm not sure that such a thing would exist. Obviously with S3, backups and maintenance are handled for you.
