Continuous file load directly from FTP to Snowflake using Snowpipe

I'd like to copy files incrementally from an FTP server into Snowflake.
Currently, I am using an ADF pipeline which runs every 15 minutes and copies them into Azure Blob Storage. Then I have Snowpipes which ingest them into the target tables in Snowflake. However, I am looking for a more optimized solution.
Is there a way to load data directly from FTP to Snowflake instead of using ADF to copy files to Azure Blob Storage and read them from there?
Thanks

You have two alternative options:
Use Logic Apps instead of ADF. In Logic Apps, add the FTP connector with the "When a file is added or modified" trigger.
Use a Snowflake stream and task instead of Snowpipe, in combination with ADF or Logic Apps.
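For the second option, the stream-and-task pattern replaces Snowpipe's auto-ingest with a scheduled load inside Snowflake. A minimal sketch of the DDL involved, assembled here as Java strings you could run over the Snowflake JDBC driver (the table, stream, task, and warehouse names are assumptions, not from the question):

```java
public class SnowflakeStreamTaskSketch {
    // Hypothetical object names: raw_files (landing table), target (final table).
    static final String CREATE_STREAM =
            "CREATE OR REPLACE STREAM raw_files_stream ON TABLE raw_files";

    // The task wakes on a schedule but only runs when the stream has new rows.
    static final String CREATE_TASK =
            "CREATE OR REPLACE TASK load_raw_files\n"
            + "  WAREHOUSE = load_wh\n"
            + "  SCHEDULE = '15 MINUTE'\n"
            + "WHEN SYSTEM$STREAM_HAS_DATA('RAW_FILES_STREAM')\n"
            + "AS INSERT INTO target SELECT * FROM raw_files_stream";

    public static void main(String[] args) {
        // In a real setup you would execute these via snowflake-jdbc:
        // statement.execute(CREATE_STREAM); statement.execute(CREATE_TASK);
        System.out.println(CREATE_STREAM);
        System.out.println(CREATE_TASK);
    }
}
```

Note that a task is created in a suspended state; you have to resume it (ALTER TASK load_raw_files RESUME) before it runs.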

Related

SpringBoot: How to store image/video files when using on-prem

I have a Spring Boot application that receives data from users; the data contains images and videos.
I am a little skeptical about storing big files in the database (MongoDB). Since I cannot use Amazon S3, what should I do to save the images/videos?
Should I save them in a folder in a Docker container on a VM, or
should I convert the image/video file to base64 and save it in MongoDB?
If 1) is a good option, what should my implementation approach be? I am new to this, and
I appreciate your comments.
I tried saving image/video files to the database, but it is extremely time-consuming.

Parse Server migrate current files from GridFS to S3

In my parse server project, we saved all files locally with GridFS in the database. Over time the database is growing fast and needs a lot of space. I analyzed this and saw that files use a lot of space.
I would like to migrate all current files which are saved currently in GridFS to AWS S3 via S3 file adapter.
Is there any way or plan for how to do that? Is there maybe some sample code, or steps for how I can do that without creating data inconsistency?

Heroku PlayFramework - create thumbnail

I already have a PlayFramework app running, but I am in the process of migrating it to Heroku. Because I cannot use the local filesystem on Heroku like I did in my app, I am forced to use Amazon S3, but I do not know how to rewrite the creation of thumbnails. For that I am using:
https://github.com/coobird/thumbnailator
Thumbnails.of(picture.getFile()).size(300,300).toFile(new File("public/images/data", thumb));
The problem is that I cannot do this on Heroku, because the file won't be persisted.
How do I create thumbnails then? If I do not want to use another service which will generate thumbnails for me and save them to S3 somehow...
Honestly, if I had known how many different services I would need for a simple page with Java, I would have stayed with PHP forever...
On Heroku (as in many PaaS) there's no persistent filesystem. However, you do have access to a temp directory.
So you could save it into a temp file:
File temp = File.createTempFile("prefix", "suffix");
Thumbnails.of(picture.getFile()).size(300, 300).toFile(temp);
Then take that file and upload it to S3.
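Putting those steps together: a stdlib-only sketch of the write-to-temp-then-upload pattern. The thumbnail step is stood in for by a plain byte write, and the S3 upload is only indicated in a comment; swap in Thumbnailator and your actual S3 client (both assumed, not shown):

```java
import java.io.File;
import java.io.IOException;
import java.nio.file.Files;

public class TempFileUpload {
    /** Writes bytes to a file in the ephemeral temp directory and returns it. */
    public static File writeToTemp(byte[] thumbnailBytes) throws IOException {
        File temp = File.createTempFile("thumb-", ".png");
        temp.deleteOnExit(); // the dyno filesystem is ephemeral anyway
        // Stand-in for: Thumbnails.of(picture.getFile()).size(300, 300).toFile(temp);
        Files.write(temp.toPath(), thumbnailBytes);
        // Next step (not shown): upload `temp` to S3, e.g. s3.putObject(bucket, key, temp)
        return temp;
    }

    public static void main(String[] args) throws IOException {
        File f = writeToTemp(new byte[] {1, 2, 3});
        System.out.println(f.length() + " bytes at " + f.getPath());
    }
}
```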
If you strictly insist on not using S3 for storing binary files, then you could base64-encode the file content and save it into the DB (see some pros/cons for such an approach here).
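That base64-in-the-database fallback can be sketched with the JDK's own encoder; where you store the resulting string (a text column, a Mongo document field) depends on your schema:

```java
import java.nio.charset.StandardCharsets;
import java.util.Base64;

public class Base64FileStorage {
    /** Encodes raw file bytes into a base64 string suitable for a text column. */
    public static String encode(byte[] fileBytes) {
        return Base64.getEncoder().encodeToString(fileBytes);
    }

    /** Decodes a stored base64 string back into the original bytes. */
    public static byte[] decode(String stored) {
        return Base64.getDecoder().decode(stored);
    }

    public static void main(String[] args) {
        String stored = encode("image-bytes".getBytes(StandardCharsets.UTF_8));
        System.out.println(stored); // aW1hZ2UtYnl0ZXM=
        System.out.println(new String(decode(stored), StandardCharsets.UTF_8)); // image-bytes
    }
}
```

The trade-off hinted at above: base64 inflates storage by roughly a third, so this really only makes sense for small files.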

How to upload images on heroku server using attachment_fu plugin

I have an app on Heroku and I need simple file storage for uploaded images; for this I
used send_data with the attachment_fu plugin.
After that I wrote the files to the tmp/ directory and want to display them in the browser, but these files are not displayed.
How can I display these images on browser?
What is the alternate solution to store and retrieve images?
Thanks!
You cannot store uploaded files on Heroku.
You must use an alternative strategy. A couple alternative strategies:
Store uploaded files in your database on Heroku. Because database systems are not specifically optimized for storing files, you should test out how well your application performs under load before you use this strategy.
Store uploaded files in an external storage service. Amazon S3 is a popular cloud storage service that anyone can use, and it has pay-as-you-go pricing just like Heroku. (In fact, Heroku itself runs on top of another of Amazon's cloud services, EC2.)

Storing images in file system, amazon-s3 datastores

There have been numerous discussions about storing images (or binary data) in the database versus the file system (refer to: Storing Images in DB - Yea or Nay?)
We have decided to use the file system for storing images, with the relevant image-specific metadata in the database, in the short term, and to migrate to an Amazon S3 based data store in the future. Note: the data store will be used to store user pictures, photos from group meetings ...
Are there any off-the-shelf Java-based open source frameworks which provide an abstraction to handle storage and retrieval via HTTP for the above data stores? We wouldn't want to write any code for admin tasks like backups, purging, and maintenance.
Jets3t - http://jets3t.s3.amazonaws.com/index.html
We've used this and it works like a charm for S3.
I'm not sure whether you are looking for a framework that covers both file-system storage and S3, but S3 is unique enough that I doubt such a thing exists. With S3, of course, backups and maintenance are handled for you.
