Saving third-party images on a third-party server - Heroku

I am writing a service in which a user chooses an image from a URL (not on my domain), and later they and others can view that image.
I need to save this image to a third party server (S3).
After a lot of wasted time, I found I cannot do it from the client side due to browser security restrictions: I can't fetch the third-party image data and send it from the client side without alerting the client, which is just bad.
I also do not want to do the uploading on my own server, because I run Rails on Heroku and the workers are expensive.
So I thought of two options:
use something like transloadit.com,
or write a service on EC2 that will run over my database, find the rows where the images have not been uploaded, and upload them.
I decided to go with EC2 and S3, because the solution I am writing is meant for enterprise and it seems it will sound better as part of the architecture when presented to customers.
My question is: what setup do I need so that I can access the Heroku database from an external service?
Any better ideas on how to solve this?

So you want to effectively write a worker, but instead of running it on Heroku you want to run it on EC2? That feels like more work.
As for the database, did you see the Heroku documentation? It shows how to get the database URL.
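For illustration, a minimal sketch of such an EC2 worker, assuming a Heroku Postgres database reachable via the URL from heroku config, psycopg2, boto3, and a hypothetical images table with source_url and uploaded columns. The table, column, and bucket names are placeholders, not from the question:

    import os
    import urllib.request

    import boto3
    import psycopg2

    # Obtained once via: heroku config:get DATABASE_URL
    # Heroku Postgres requires SSL from outside Heroku, so the URL may
    # need "?sslmode=require" appended.
    DATABASE_URL = os.environ["DATABASE_URL"]
    BUCKET = "my-image-bucket"  # placeholder bucket name

    s3 = boto3.client("s3")
    conn = psycopg2.connect(DATABASE_URL)

    with conn, conn.cursor() as cur:
        # Hypothetical schema: images(id, source_url, uploaded)
        cur.execute("SELECT id, source_url FROM images WHERE uploaded = FALSE")
        for image_id, source_url in cur.fetchall():
            # Fetch the third-party image server-side and stream it to S3
            with urllib.request.urlopen(source_url) as resp:
                s3.upload_fileobj(resp, BUCKET, "images/%s" % image_id)
            cur.execute("UPDATE images SET uploaded = TRUE WHERE id = %s",
                        (image_id,))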

Related

How to set up a local database for the Google Safe Browsing Update API (v4)?

I am building a service that checks URLs for phishing or malware for one of my applications. This service will run on Google App Engine. I want to use Google Safe Browsing's Update API (v4) to keep a local database of URL hashes, but I am having a hard time understanding the setup process for the local database described here:
https://developers.google.com/safe-browsing/v4/local-databases
They do provide Go source code that does something like this, but it's not descriptive enough for me to base my own implementation on.
I want to set up the database on Google Cloud itself. Can anyone point me to good documentation, or to approaches that worked if you have tried this before?
You can use:
"https://safebrowsing.googleapis.com/v4/threatListUpdates:fetch?key=" + apiKey;
to download the hash-prefix database. You may need to send a POST request to get the database, as described at https://developers.google.com/safe-browsing/v4/reference/rest/v4/threatListUpdates/fetch
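To make that concrete, a rough Python sketch of the fetch call using the requests library. The client id, version, and list choices below are placeholders, and the request/response shape follows the reference page linked above:

    import requests

    API_KEY = "your-api-key"  # placeholder
    url = ("https://safebrowsing.googleapis.com/v4/"
           "threatListUpdates:fetch?key=" + API_KEY)

    payload = {
        "client": {"clientId": "yourcompany", "clientVersion": "1.0"},
        "listUpdateRequests": [{
            "threatType": "MALWARE",
            "platformType": "ANY_PLATFORM",
            "threatEntryType": "URL",
            # An empty state asks for the full list; store the returned
            # newClientState and send it back next time to get diffs.
            "state": "",
            "constraints": {"supportedCompressions": ["RAW"]},
        }],
    }

    resp = requests.post(url, json=payload)
    resp.raise_for_status()
    update = resp.json()["listUpdateResponses"][0]
    # update["additions"] holds the hash prefixes to store locally;
    # update["newClientState"] goes into the next request's "state".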

Migrate existing Squarespace site to AWS

I have a Squarespace website I made for myself a while back. The main purpose at the time was to have something to link to from my iOS app, and I opted for something expedient rather than thinking long term, just to get the app released. Fast forward to now, and I have an AWS EC2 instance where I could do more with a personal site in the future. Ultimately it would be nice to get it off Squarespace and not pay for another full-year billing cycle, but the renewal date is a pretty tight deadline at this point.
Nothing on this domain requires much more than frontend web code, really, but a completely different page UI could take more time than I have. I'm wondering if there might be a way to temporarily run the Squarespace page source as-is on EC2, so I can think about a possible non-CMS design once I'm not worried about getting billed for another whole year by Squarespace.
I'm not sure if this is possible, but if not, it seems like I should just port the content to minimal, unstyled HTML files to avoid the billing, or get billed for a shorter period. Billing is the limiting factor here. I would also need to add my new credit card to get billed for more time, which I have yet to do.
Basically, has anyone else dealt with this situation personally? What would you recommend I do? Does Squarespace even allow me to port to EC2 somehow, or is that more in the realm of WordPress? Thanks.
Note: Tomcat is what I'm using on the EC2 instance currently. I will also need a multiple-sites-per-instance setup, but I believe that's the most relevant config info here unless I'm forgetting something.
Not sure why you've already chosen Tomcat, as I don't see anything that would let you easily convert your Squarespace site to a Java webapp. It looks like Squarespace sites can be exported to WordPress, which you could host on an EC2 server.
Alternatively, you could use wget to create a static copy of your website, which you could then host easily on your EC2 server with Nginx, or skip EC2 entirely and host the static website on S3.
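For the wget route, an invocation along these lines produces a self-contained static mirror (standard wget flags; the domain is a placeholder):

    wget --mirror --convert-links --adjust-extension \
         --page-requisites --no-parent https://www.example.com/

The resulting directory can be served by Nginx as-is, or copied into an S3 bucket with static website hosting enabled.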

Using your own FTP server to receive uploaded files on my website

I'm looking for an out-of-the-box solution for adding an upload form so that my users can upload large files through my website onto my own FTP server.
Has anyone found a good service to accomplish this? To be clear, I want to use my own server in my office, and I also need a form attached to each uploaded file.
I run a graphics printing company and need to be able to receive large files that my designers send to me.
I want the user experience to be as painless and uncomplicated as possible, so I would prefer that they not have to download an FTP client like FileZilla or Transmit.
I just want them to:
fill out the form,
upload their files,
click send,
and then I receive the files on my server.
If there is an off-the-shelf solution for this, that would be amazing.
Thank you!
Simple2FTP is an "out of the box" web app that does this. It allows you to brand the app to look like your own web site by modifying a couple of files; all the functionality is built in. It can be found at www.Simple2ftp.com
Maintaining an FTP server is not trivial. There are various Dropbox-type services on the web that are very easy to use.
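If an off-the-shelf product doesn't pan out, the roll-your-own core is fairly small. A rough sketch (not from either answer above) of a form handler that relays an upload to an FTP server, assuming Flask and Python's standard ftplib, with placeholder host, credentials, and field names:

    from flask import Flask, request
    from ftplib import FTP

    app = Flask(__name__)

    @app.route("/upload", methods=["POST"])
    def upload():
        # "file" and "sender" are hypothetical form field names
        uploaded = request.files["file"]
        sender = request.form.get("sender", "unknown")

        # Relay the upload straight to the office FTP server (placeholders)
        ftp = FTP("ftp.example.com")
        ftp.login("user", "password")
        ftp.storbinary("STOR %s-%s" % (sender, uploaded.filename),
                       uploaded.stream)
        ftp.quit()
        return "Thanks, your file was received."

For very large design files you would also need to raise the web server's request-size limits.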

Heroku architecture for running different applications but on the same domain

I have a unique setup, and I am trying to determine if Heroku can accommodate it. There is so much marketing around polyglot applications, but I can only find one actual example!
My application consists of:
A website written in Django
A separate Java application, which takes files uploaded by users, parses them, and stores the data in a database
A shared database accessible by both applications
Because these user-uploaded files can be enormous, I want the uploaded file to go directly to the Java application. My preferred architecture is:
The Django-generated webpage displays the upload form.
The form does an AJAX submit to the Java application
The browser starts polling to see if the Java application has inserted the data (see the sketch after this list)
Meanwhile, the Java application does its thing with the user-uploaded file and updates the database when it's done
The Django webpage AJAX-refreshes a div with the results of the user upload once the polling mechanism sees that the upload is complete
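(The polling step referenced above is simple on the Django side; a minimal sketch, assuming a hypothetical Upload model with a processed flag the Java application sets:)

    from django.http import JsonResponse

    from .models import Upload  # hypothetical model; Java sets `processed`

    def upload_status(request, upload_id):
        # Polled via AJAX until the Java application has inserted the data
        done = Upload.objects.filter(pk=upload_id, processed=True).exists()
        return JsonResponse({"done": done})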
The big issue I can't figure out is whether I can get both the Django and Java apps running either on the same set of dynos, or on different dynos under the same domain, to avoid AJAX cross-domain issues. Does Heroku support URL-level routing? For example:
Django application available at http://www.myawesomewebsite.com
Java application available at http://www.myawesomewebsite.com/javaurl/
If this is not possible, does anyone have ideas for workarounds? I know I could have the user upload the file to Django and have Django send the request to Java from the server side instead of the client side, but that's an awful lot of passing around of enormous files.
Thanks so much!
Heroku does not support the ability to route via the URL. Polyglot components should exist as their own subdomains and operate in a cross-domain fashion.
As a side note: have you considered uploading directly to S3, instead of uploading to your app on Heroku, which will then (presumably) upload to S3? If you're dealing with cross-domain file uploads, this is worth considering for its high level of scalability.
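A minimal sketch of that approach, assuming boto3 on the Django side: the server generates a presigned POST, hands the URL and fields to the browser, and the browser uploads the file straight to S3, so it never passes through a dyno (the bucket name is a placeholder):

    import boto3

    s3 = boto3.client("s3")

    def presigned_upload(key):
        # Returns {"url": ..., "fields": ...}; the page builds a multipart
        # POST to S3 from these, so the file bypasses the Heroku app.
        return s3.generate_presigned_post(
            Bucket="my-upload-bucket",  # placeholder
            Key=key,
            ExpiresIn=3600,  # form is valid for one hour
        )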

FTP access on Windows Azure

Quick question. I'm currently moving an ASP.NET MVC web application to the Windows Azure platform. Everything is working out okay apart from one thing.
In the application at the moment, we make use of per-user FTP accounts to import large quantities of files into our database.
I understand FTP on Azure is not as straightforward.
I've googled and found this article: Ftp on Azure
This seems to be what I need, except that obviously we'll need to be able to add new users, each with their own separate FTP account. Does anyone know of an easy workaround for this?
Thanks in advance
Have you considered running an FTP service that's not IIS-based, so you could add users programmatically? Also, how are you going to solve data sync issues when the role recycles or when you upgrade it? Make sure to back up to blob storage on a somewhat regular basis!
Personally, I'd mount a VHD drive (Azure Drive), which is actually hosted on blob storage, and have my FTP server point to that drive. However, make sure you only have one instance of the server (problem #1); unless you need higher than 99.9% reliability, you can get away with running a single instance. Step two would be to implement user management on top of that server.
It's not straightforward, and I'd advise against it. But I understand that sometimes you have to do this, and I would solve it as described above.
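For the backup-to-blob advice, a minimal sketch using the azure-storage-blob Python package; the container name and connection string are placeholders:

    import os

    from azure.storage.blob import BlobServiceClient

    # Placeholder: connection string from the Azure portal
    service = BlobServiceClient.from_connection_string(
        os.environ["AZURE_STORAGE_CONNECTION_STRING"])
    container = service.get_container_client("ftp-backups")  # placeholder

    def backup_file(path):
        # Copy one file from the FTP data directory into blob storage,
        # overwriting any previous backup with the same name.
        with open(path, "rb") as fh:
            container.upload_blob(os.path.basename(path), fh, overwrite=True)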
