Protection of a downloadable file

Here's my situation:
I have a paint-like tool.
I want users of the tool to be able to download their creation.
I don't want the user to be able to read or open the file.
The user needs to upload the same image file to another tool.
My problem is that I can't make the first tool and the second one work in unison, so users definitely need to first click download and then click upload on the other, third-party tool.
Does this seem possible to any of you? Is there a way to protect the downloadable file so that the only option is to upload it to the second tool?
I hope this was clear enough.
Thanks for your help in these dark times, cheers :)

There is no "secure" way. Because we're talking about protection, we will not discuss semi-secure ways.
I see only these options:
Encrypt the file. The other service needs the encryption key. It either has to be known in advance, or service A tells it to service B in the background (without involving the client).
Send the file directly from service A to service B.
Have service B pull the data from service A. The transfer URL could be transmitted via the client if service A checks the source IP (so only service B is able to download the file from service A).
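The first option can be sketched as follows. To keep the example dependency-free, this uses an illustrative SHA-256-based stream cipher; it is not production cryptography, and a real deployment should use a vetted library (for example the `cryptography` package's Fernet). The key is shared out-of-band between service A and service B and is never given to the client, so the downloaded blob is opaque to the user.

```python
import hashlib
import secrets

def keystream(key: bytes, nonce: bytes, length: int) -> bytes:
    """Derive a pseudo-random byte stream from key+nonce (illustrative only)."""
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + nonce + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:length]

def encrypt(key: bytes, plaintext: bytes) -> bytes:
    """Service A: produce nonce || ciphertext for the user to download."""
    nonce = secrets.token_bytes(16)
    stream = keystream(key, nonce, len(plaintext))
    return nonce + bytes(a ^ b for a, b in zip(plaintext, stream))

def decrypt(key: bytes, blob: bytes) -> bytes:
    """Service B: recover the image; the user never sees the key."""
    nonce, ct = blob[:16], blob[16:]
    return bytes(a ^ b for a, b in zip(ct, keystream(key, nonce, len(ct))))

# Demo round trip with a key known only to the two services.
key = secrets.token_bytes(32)
image = b"\x89PNG...fake image bytes"
assert decrypt(key, encrypt(key, image)) == image
```

The user can still open the blob in a hex editor, of course; "cannot read or open" here only means the original image is not recoverable without the key.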


How to implement collaboration in WebDAV Server?

I had a requirement to edit MS Office documents uploaded to AWS S3 from the browser, so we created our own WebDAV server on Apache using the httpd extension. The workflow is explained in the diagram and description below.
Editing is now working just fine. My UI client hits a Spring Boot API on the WebDAV server, which copies the abcd.docx S3 object to the WebDAV folder, say var/www/html/webdav. The path to the file then becomes var/www/html/webdav/abcd.docx. I send the file path to UI clients like https://www.mywebdavserverxyz.com/webdav/abcd.docx, and open the document in the local Microsoft Office application like:
<a href='ms-word:ofe|u|https://www.mywebdavserverxyz.com/webdav/abcd.docx'>Edit</a>
Now my question is: if I want to enable collaborative editing, which options do I have and which one is best? Currently a single user can edit the file at a time; for other users the file opens in read-only mode.
Co-authoring over WebDAV is not possible.
However, if you modify your WebDAV server to implement shared locks, then when a second user goes to edit the document they will receive a message that it is currently being edited, and they can decide whether to continue editing. Once the initial user UNLOCKs the document, their changes will be merged into the second user's changes (if any).
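For reference, the difference on the wire is the lock scope in the LOCK request body (RFC 4918): `<D:shared/>` instead of `<D:exclusive/>`. A minimal sketch of building such a body; the owner address is a placeholder:

```python
import xml.etree.ElementTree as ET

DAV = "DAV:"
ET.register_namespace("D", DAV)

def shared_lock_body(owner_href: str) -> bytes:
    """Build the XML body for a LOCK request asking for a shared write lock."""
    lockinfo = ET.Element(f"{{{DAV}}}lockinfo")
    scope = ET.SubElement(lockinfo, f"{{{DAV}}}lockscope")
    ET.SubElement(scope, f"{{{DAV}}}shared")      # <D:shared/> rather than <D:exclusive/>
    locktype = ET.SubElement(lockinfo, f"{{{DAV}}}locktype")
    ET.SubElement(locktype, f"{{{DAV}}}write")
    owner = ET.SubElement(lockinfo, f"{{{DAV}}}owner")
    href = ET.SubElement(owner, f"{{{DAV}}}href")
    href.text = owner_href
    return ET.tostring(lockinfo, xml_declaration=True, encoding="utf-8")

body = shared_lock_body("mailto:user1@example.com")
print(body.decode())
```

Your server then has to grant further shared locks on an already share-locked resource instead of returning 423 Locked; merging the edits afterwards is up to the client application.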

Saving third-party images on third-party server

I am writing a service in which a user chooses an image by URL (not on my domain) and later he and others can view that image.
I need to save this image to a third-party server (S3).
After a lot of wasted time I found I cannot do it from the client side due to security restrictions (I can't fetch the third-party image data and send it from the client side without alerting the client, which is just bad).
I also do not want to do the uploading on my own server, because I run Rails on Heroku and the workers are expensive.
So I thought of two options:
use something like transloadit.com,
or write a service on EC2 that will run over my DB, find the rows where the images are not uploaded, and upload them.
I decided to go for EC2 and S3, because the solution I am writing is meant for enterprise and it seems it will sound better as part of the architecture when presented to customers.
My question is: what setup do I need so I can access the Heroku DB from an external service?
Any better ideas on how to solve this?
So you want to effectively write a worker, but instead of running it on Heroku you want to run it on EC2? That feels like more work.
As for the database, did you see the Heroku documentation? It shows how to get the connection URL (the DATABASE_URL config var), which you can use from an external service.
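Whichever host the worker runs on, its loop is simple. A minimal sketch, with sqlite3 standing in for the Heroku Postgres database; the `images` table, its columns, and the `upload` stub (which in real code would fetch the source URL and put the bytes to S3 via boto3) are all assumptions:

```python
import sqlite3

def process_pending(conn, upload):
    """Find rows whose image hasn't been uploaded yet and upload them."""
    rows = conn.execute(
        "SELECT id, source_url FROM images WHERE uploaded = 0"
    ).fetchall()
    for row_id, url in rows:
        upload(row_id, url)                       # real code: fetch url, put to S3
        conn.execute("UPDATE images SET uploaded = 1 WHERE id = ?", (row_id,))
    conn.commit()
    return len(rows)

# Demo with an in-memory database and a stub uploader.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE images (id INTEGER PRIMARY KEY, source_url TEXT, uploaded INTEGER)")
conn.executemany("INSERT INTO images VALUES (?, ?, ?)",
                 [(1, "http://example.com/a.png", 0),
                  (2, "http://example.com/b.png", 1)])
done = process_pending(conn, upload=lambda i, u: None)
print(done)  # prints 1: only the not-yet-uploaded row is processed
```

Against the real database you would connect with the credentials from DATABASE_URL instead of sqlite3, but the select-upload-mark loop is the same.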

using your own ftp server to receive uploaded files on my website

I'm looking for an out-of-the-box solution to add an upload form so that my users can upload large files from my website onto my own FTP server.
Has anyone found a good service to accomplish this? Again, I want to be able to use my own server in my office, and I also need a form attached to the uploaded file.
I run a graphics printing company and need to be able to receive large files that my designers send to me.
I want the user experience to be as painless and uncomplicated as possible, so I would prefer that they did not have to download an FTP client like FileZilla or Transmit.
I just want them to:
fill out the form,
upload their files,
click send,
and then I receive the files on my server.
If there is any off-the-shelf solution for this, that would be amazing.
Thank you!
I guess this is an "out of the box" web app: it allows you to brand the app to look like your own web site by modifying a couple of files, and all the functionality is built in. It is called Simple2FTP and can be found at www.Simple2ftp.com.
Maintaining an FTP server is not trivial. There are various Dropbox-type services on the web that are very easy to use.
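If you do end up writing the glue yourself, the server side of the form can be small: whatever web framework receives the POST just pushes the bytes on to your office FTP server. A sketch using only Python's stdlib `ftplib`; the host, credentials, and the naming convention in `remote_name` are all assumptions:

```python
import ftplib
import io
import re
import time

def remote_name(customer: str, filename: str) -> str:
    """Build a safe, unique name for the file on the office FTP server."""
    safe = re.sub(r"[^A-Za-z0-9._-]", "_", f"{customer}_{filename}")
    return f"{int(time.time())}_{safe}"

def push_to_ftp(host: str, user: str, password: str, name: str, data: bytes) -> None:
    """Upload bytes received from the web form to your own FTP server."""
    with ftplib.FTP(host) as ftp:      # host/credentials are placeholders
        ftp.login(user, password)
        ftp.storbinary(f"STOR {name}", io.BytesIO(data))

print(remote_name("Acme Prints", "final artwork v2.pdf"))
```

The form fields (customer name, job notes) travel with the upload, which covers the "form attached to the file" requirement; you could also write them to a sidecar text file with the same `remote_name` prefix.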

FTP access on Windows Azure

Quick question. I'm currently moving an ASP.NET MVC web application to the Windows Azure platform. Everything is working out okay apart from one thing.
In the application at the moment, we use FTP accounts for each user to import large quantities of files into our database.
I understand FTP on Azure is not as straightforward.
I've googled and found this article: FTP on Azure.
This seems to be what I need, except that we'll obviously need to be able to add new users with their own separate FTP accounts. Does anyone know of an easy workaround for this?
Thanks in advance.
Did you consider running an FTP service that's not IIS-based, so you could add users programmatically? Also, how are you going to solve data-sync issues when the role recycles or when you upgrade it? Make sure to back up to blob storage on a somewhat regular basis!
Personally, I'd mount a VHD (an Azure Drive), which is actually hosted on blob storage, and point my FTP server at that drive. However, make sure you only have one instance of the server (problem #1); unless you need higher than 99.9% reliability, you can solve this by running a single instance. Step 2 would be implementing user management on top of that.
It's not straightforward, and I'd advise against it. But I understand that sometimes you have to do this, and I would solve it as described above.

Transfer files between users of the same app with "cloud" storage

I am building a WPF app that needs to exchange some very small XML files with other users. I'm currently looking into peer-to-peer networking, but I need the sender of the files to be able to send without the receiver also being online. I do not want to host a service myself, and I want each user to store the other users they interact with locally on their machine, for example just a name together with a GUID or email address to identify them.
Do you guys have any suggestions on how to solve this? My wishful thinking would be a free or cheap service where users could connect via my program to a public API and upload their files. When the receiving user logs on, the program would check the service, authenticate somehow, and download the XML files so they could be imported.
I have made a solution with an IMAP library where the XML files are attached to an email and sent to the receiver's email account. The program on the receiving side checks the email and reads the attachment. This works okay, but it is not very slick and also fills up the users' inboxes and sent items with garbage.
Any suggestions are greatly appreciated.
Best regards,
Ola
This is one idea:
Normally, block storage volumes are seen as extra 'drives' that you attach to a virtual machine. You could use one as if it were a metaphorical 'USB flash key' that you share with the other person:
Create the storage volume.
Attach it to your VM and copy your data-to-be-shared onto it.
Detach the volume from your VM.
Your buddy attaches it to his VM.
He then copies the files, and voila!
All this could be done through a web interface, and you wouldn't have to do any networking steps.
All you need now is a capable cloud geek, a pot of coffee and some convincing.
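Whatever store ends up in the middle (a shared volume, a blob bucket, or the public API from the question), the exchange format itself can stay tiny. A sketch of a possible envelope the sender uploads and the receiver polls for; the GUID-based addressing follows the question, and all names here are illustrative:

```python
import base64
import json
import uuid

def make_envelope(sender_guid, recipient_guid, xml_payload):
    """Wrap a small XML file in a JSON envelope for any generic store."""
    return json.dumps({
        "id": str(uuid.uuid4()),            # lets the receiver skip duplicates
        "from": sender_guid,
        "to": recipient_guid,
        "xml": base64.b64encode(xml_payload).decode("ascii"),
    })

def open_envelope(envelope, my_guid):
    """Receiver side: return the XML payload if addressed to this user, else None."""
    msg = json.loads(envelope)
    if msg["to"] != my_guid:
        return None
    return base64.b64decode(msg["xml"])

env = make_envelope("alice-guid", "bob-guid",
                    b"<settings><theme>dark</theme></settings>")
print(open_envelope(env, "bob-guid"))
```

Because the receiver only needs the GUID it already stores locally for each contact, this fits the no-central-user-directory constraint; the store in the middle never has to understand the payload.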
