FTP access on Windows Azure

Quick question. I'm currently moving an ASP.NET MVC web application to the Windows Azure platform. Everything is working out okay apart from one thing.
In the application at the moment, we make use of FTP accounts for each user to import large quantities of files into our database.
I understand FTP on Azure is not as straightforward as it is on a normal Windows server.
I've googled and found this article: Ftp on Azure
This seems to be what I need except obviously we'll need to be able to add new users with their own separate FTP account. Does anyone know of an easy workaround for this?
Thanks in advance

Did you consider running an FTP service that's not IIS-based, so you could add users programmatically? Also, how are you going to solve data-sync issues when the role recycles or when you upgrade it? Make sure to back up to blob storage on a somewhat regular basis!
Personally, I'd mount a VHD (an Azure Drive), which is actually hosted in blob storage, and point my FTP server at that drive. However, an Azure Drive can only be mounted for writing by one instance at a time, so make sure you run only one instance of the FTP server (problem #1); if you don't need more than 99.9% reliability, running a single instance is an acceptable way to solve this. Step 2 would be to implement user management around whatever FTP server software you choose.
It's not straightforward, and I'd generally advise against it, but I understand that sometimes you have to do this. If I had to, I would solve it as described above.
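As a sketch of the "add users programmatically" part: if the non-IIS FTP server you pick can authenticate against a database (pure-ftpd with a MySQL backend is one example), creating a per-user account is just an INSERT from your application. None of this is Azure-specific, and the table and column names below are hypothetical; they would have to match whatever auth queries you configure in the FTP server.

```php
<?php
// Hypothetical sketch: provisioning a per-user FTP account in a SQL-backed
// FTP server (e.g. pure-ftpd with a MySQL auth backend). Table and column
// names are placeholders; align them with your server's configured auth queries.
$pdo = new PDO('mysql:host=localhost;dbname=ftp', 'ftp_admin', 'secret');

function createFtpUser(PDO $pdo, string $login, string $password, string $homeDir): void
{
    $stmt = $pdo->prepare(
        'INSERT INTO ftp_users (login, password_hash, home_dir) VALUES (:login, :hash, :dir)'
    );
    $stmt->execute([
        ':login' => $login,
        ':hash'  => password_hash($password, PASSWORD_DEFAULT), // the FTP server must be configured to verify this hash format
        ':dir'   => $homeDir,
    ]);
}

// Example: provision an import area for a new application user.
createFtpUser($pdo, 'customer42', 'S0me-Str0ng-Pass', '/data/ftp/customer42');
```

The home directory here would live on the mounted Azure Drive, so the import files end up on storage that survives a role recycle (with the blob backups mentioned above as a safety net).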

Related

Where to begin with WebDAV, Cloud storage, and Laravel

Is it possible to give users the ability to remotely edit documents that are stored in the cloud storage of my web application?
I know that with WebDAV you can remotely open and edit MS Office documents on the local machine and save the files back to cloud storage. I want to add this feature to my Laravel web application, and I can't find a solution.
I have heard about the IT Hit WebDAV Library, and I need something like it.
Do I need my own webdav server for this?
Are there any free solutions or libraries for this?
I use MinIO as cloud storage; can this kind of library help me?
I need at least some guidance for this. Thank you very much in advance.
To answer your questions:
Yes, you need your own WebDAV server; cloud storage providers typically don't have this built in.
The most popular library for PHP is sabre/dav (see the sketch below).
It does not matter much which cloud storage provider you use; WebDAV will work with most (if not all) of them, and it will definitely work with MinIO. As for my recommended package, see above.
The IT Hit WebDAV Library is a commercial library; in my opinion you should stick with open source.
How you integrate it into Laravel is largely a matter of preference and ultimately depends on what you need to achieve from a user-experience and functionality perspective.
There is nothing simple about the road you're travelling; expect a fair amount of complexity.
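To make the sabre/dav suggestion concrete, here is a minimal server sketch along the lines of the sabre/dav getting-started example. It exposes a local public/ directory over WebDAV; the paths and base URI are assumptions you would adapt to your own routing.

```php
<?php
// server.php — minimal sabre/dav server exposing a local directory over WebDAV.
// Install with: composer require sabre/dav
require 'vendor/autoload.php';

use Sabre\DAV;

// The directory to expose. Swap this for a node backed by your storage layer
// (e.g. files you stage from MinIO) as needed.
$rootDirectory = new DAV\FS\Directory(__DIR__ . '/public');

$server = new DAV\Server($rootDirectory);

// The URL this script is reachable at (adjust to your routing).
$server->setBaseUri('/server.php');

// Optional: HTML browser plugin so you can inspect the share in a web browser.
$server->addPlugin(new DAV\Browser\Plugin());

$server->exec();
```

Office clients can then open documents from that URL and save them back. The main design decision is how you bridge the exposed directory to MinIO: by syncing files to local disk, or by writing a custom sabre/dav collection class over your storage API.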

Setting up web farm for DNN 6.2.6 CE with multiple file servers

We are planning to convert our website, which currently runs on a single server, to a web farm with two servers on Windows Server 2008 R2. I'm afraid I haven't found much documentation on how to achieve this. Can anyone please point me to the proper documentation? The one document I found is:
http://www.datasprings.com/resources/articles-information/creating-a-webfarm-for-your-dotnetnuke-site
This one explains using a single UNC share as the file server, but we are looking into using every server in the web farm as a file server (i.e. keeping the DotNetNuke folder on each server's local drive), since a UNC share becomes a single point of failure. So my questions are:
Can we do DNN web farm with multiple file servers, if so, how?
Also, how should module updates be done? Do they need to be done on each server separately, does DNN have any built-in mechanism for this, or do we need to use DFS replication between the servers?
Also, we use heavy caching. Since we have to use file caching with DNN CE web farms, how does caching work with multiple file servers?
Finally, please let me know of any points or gotchas I need to be aware of. Any help is greatly appreciated.
The recommended way of doing a web farm for DNN is to use a single UNC share. Even with the paid editions of DNN, that is the recommended approach.
Is it possible to do it another way? Yes, but there is nothing built into DNN to help you do so.
If you want to use multiple file servers, you start running into issues with file-based caching, module installations, etc.
Using a UNC share is the best and easiest way to run a DNN portal on a web farm. If all servers use a single UNC share, you are unlikely to run into cache issues.
I set up a DNN portal web farm running on four web servers, with a fifth server used as the file server (UNC share) and the DNN database server, and it worked quite well.
One more thing you should consider: sessions.
DNN itself does not make use of Session or session variables, but if you are using your own modules or third-party modules that do use session state, it is a good idea to set up the ASP.NET Session State Server.

How to use Windows Azure in Indonesia?

I plan to deploy my website (ASP.NET & SQL Server 2008) on Windows Azure, but I am having difficulty because Windows Azure has not been released yet in my location (Indonesia).
If someone who has faced the same problem would like to share a solution, it would be appreciated.
The question was asked on MSDN and the answer is that it is not possible. The only option is to wait until Windows Azure becomes available in your country.
MSDN Forum
Just run your apps in the HK or Singapore Windows Azure public data centers; these are the APAC data centers for your region.
For testing purposes, I wanted to create an Azure account and faced the same issue here in Egypt.
I managed it by remotely logging into one of our US-based servers and registering from there. :) If you can't do that, need the account badly, and don't have such a server, try using Tor.
Update: Tor is a proxy-like solution for your internet connection; it redirects all requests/responses through nodes on the Tor network, which consists of volunteers like you and me.
So my solution is simple: use Tor to make it appear that you are inside one of the permitted countries, and register your account with ease.
What you need to do is install Tor and configure your browser to use it, but my personal recommendation is to install the Tor Browser Bundle: it's Tor plus a browser that is pre-configured to use it.
You'll find a nice video on the Tor Browser Bundle page that gives an overview of it.
Give it a try and tell me how it goes.

Where can host some server side logic without having a web site?

I'd like to host some PHP or Perl/CGI scripts without having a full-blown web site. Does anybody know of someone offering this kind of service, ideally for free?
Thanks,
David
You can sign up for a developer account with Amazon Web Services and get a server instance of your choice free for one year: http://aws.amazon.com/
You could also run your own Linux or Windows web server; both are completely capable of hosting as simple or as complex a site as you want. Unless you want to make this script available for others to use as a service, there's no need to find an "outside" provider.
Hmm, Free File Hosting. Or, if you don't need to actually access the files from anywhere, and you just want them hosted somewhere, gist might work well for you.

How can a web application synch a folder of text files on the client's PC?

I want to be able to synchronize several text files on a user's PC in real time from my web application. Basically, I want a few data files on the local PC to mirror the state of the user's data in my web application, so if the web application or the user's internet connection is lost, he can use those data files to get at some critical info (possibly using HTML/JavaScript code stored with those files that would run in offline mode against those data files).
I know that Google Gears has a lot of interesting tools for working with offline state, but I'd prefer an even simpler approach in HTML/JavaScript that isn't as reliant on Google Gears. I'd rather use Google Gears just to create those files and slowly keep them in sync with the web application's version of the data throughout the day.
Update on answers:
PersistJS is a good suggestion that I will look into, but I was hoping people would direct me towards really good Google Gears tutorials and resources.
You can save data on the browser using PersistJS, which uses the best client-side persistent storage mechanism it can find, supporting:
Flash
Google Gears
HTML 5 storage specs
browser-specific extensions
cookies
When your app reconnects, you can resync. Creating and reading text files is something the browser will generally block your web site from doing.
At the risk of stating the obvious: if you want to store user state locally, aren't cookies the standard way?
Maybe more than one cookie will be needed, but that sounds like the simplest approach.
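As a hedged sketch of the cookie idea: the server can drop a small snapshot of the critical data into a cookie on each response, so it remains readable client-side (via document.cookie) if the connection drops. The cookie name and payload shape below are made up for illustration, and cookies are limited to roughly 4 KB and travel with every request, so keep the snapshot tiny.

```php
<?php
// Hypothetical sketch: mirror a small piece of critical user state into a cookie
// so client-side code can still read it (via document.cookie) while offline.
// Keep the payload tiny: cookies are capped at ~4 KB and are sent with every request.
function mirrorCriticalState(array $state): void
{
    $payload = base64_encode(json_encode($state));
    setcookie('critical_state', $payload, [
        'expires'  => time() + 60 * 60 * 24 * 30, // keep for 30 days
        'path'     => '/',
        'secure'   => true,
        'httponly' => false, // must stay readable from JavaScript for offline use
        'samesite' => 'Lax',
    ]);
}

// Example snapshot written on every page load while the user is online.
mirrorCriticalState([
    'lastSync' => date('c'),
    'balance'  => 1250.75,
    'contact'  => 'support@example.com',
]);
```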
You're going to need to make an ActiveX control and a Firefox plugin to get these permissions. Short of that, I agree with orip: try using PersistJS.
You can ask the user to download a Subversion client that is preconfigured to talk only to your Subversion server, and then write your web application to interface with the Subversion service from your side only.
There is a good deal of potential security harm in granting access to a user's file system, so you will want to lock down every possible point of exploitation. Ensure that the user cannot access the Subversion server except through the client you ask them to install, and make the connection between the application server and the Subversion server extremely secure, so the transmission path cannot be compromised and malicious logic loaded onto the application server cannot reach the Subversion server. I would encrypt the path between those two servers and put the Subversion server behind the firewall separating your network DMZ. I would also suggest a challenge/response mechanism between the application server and the Subversion server, to prevent malicious code from appearing to be legitimate decisions made on the application server. Finally, ensure that data flows from the application server to the Subversion server in one direction only, because if malicious logic is planted on your application server, any data that comes back from the Subversion server is compromised without that server even being accessed.
You could use the FileSystemObject (FSO) through JavaScript; however, it is dependent on Microsoft, as it is an ActiveX control, and it would also require permissions in the browser, or perhaps an HTA (HTML Application).
http://www.webreference.com/js/column71/
It's a real security issue, so most avenues are inherently closed down.
Inherently, the web model was designed not to let the server push changes down to the client. Things are slowly changing now; maybe you could do this with WebSockets?
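For what it's worth, here is a rough sketch of that WebSocket idea using the Ratchet library for PHP (an assumption on my part; any WebSocket server would do). The server fans out state updates to every connected browser; client-side script would listen with `new WebSocket(...)` and persist whatever it receives into local storage.

```php
<?php
// Rough sketch of pushing state changes to connected browsers over WebSockets,
// using the Ratchet library (composer require cboden/ratchet).
require 'vendor/autoload.php';

use Ratchet\ConnectionInterface;
use Ratchet\MessageComponentInterface;

class StatePusher implements MessageComponentInterface
{
    protected $clients;

    public function __construct()
    {
        $this->clients = new \SplObjectStorage();
    }

    public function onOpen(ConnectionInterface $conn)
    {
        $this->clients->attach($conn);
    }

    public function onMessage(ConnectionInterface $from, $msg)
    {
        // Treat any incoming message as a state update and fan it out to everyone.
        foreach ($this->clients as $client) {
            $client->send($msg);
        }
    }

    public function onClose(ConnectionInterface $conn)
    {
        $this->clients->detach($conn);
    }

    public function onError(ConnectionInterface $conn, \Exception $e)
    {
        $conn->close();
    }
}

// Listen for WebSocket connections on port 8080 (port is an arbitrary choice).
$server = \Ratchet\Server\IoServer::factory(
    new \Ratchet\Http\HttpServer(
        new \Ratchet\WebSocket\WsServer(new StatePusher())
    ),
    8080
);
$server->run();
```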
