I was reading a great article here that covers the ins and outs of using ASP MVC for uploading/downloading files. I'll try to keep this short; here goes.
My MVC3 app will be running in one place (Houston) and its mission will be to allow our employees all over the world (30 domain controllers) to download various software packages via our Intranet. Today, each of these 30 sites has a replicated "Programs" folder mapped to a shared drive, so someone in China isn't downloading an app to install from here in Houston; they'll get it off their logon server in China. I am able to interrogate the "logonserver" environment variable to determine the server they are on, but short of dynamically building a hyperlink with the UNC embedded (this works, by the way), I can't really use any of the streaming methods specific to MVC, because the file will end up getting pulled onto the box where my web app is running (Houston) and streamed back across some pond. Is my thinking right on this? I would prefer to stream, because that way I can lock down the Programs folder and only grant the ASP process the authority to access the replicated folder.
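For what it's worth, a minimal sketch of the link-versus-stream choice in an MVC controller, assuming you can resolve the user's logon server (the "logonserver" variable mentioned above) and that every site replicates the share under the same name; the helper and all names here are made up:

    using System;
    using System.Web.Mvc;

    public class DownloadsController : Controller
    {
        public ActionResult Package(string name)
        {
            string logonServer = GetLogonServerForUser(User.Identity.Name);

            // Hand the client a UNC link into its site's replicated share;
            // the file is pulled from the local server, so no bytes flow
            // through the Houston web server.
            ViewBag.PackageLink = string.Format(@"\\{0}\Programs\{1}", logonServer, name);
            return View();

            // By contrast, return File(uncPath, "application/octet-stream")
            // would stream the file through this box and back across the pond.
        }

        private string GetLogonServerForUser(string userName)
        {
            // Hypothetical helper: resolve the user's site/logon server here.
            throw new NotImplementedException();
        }
    }

Note the trade-off you already identified: a UNC link means the end users themselves need read access to the share, while streaming is what lets you restrict the folder to the ASP process alone.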
I am new to web development, but I have some experience in VB.Net desktop app programming. The software I am currently maintaining is only accessible within the company office network. Now that management has demanded the software be accessible online, I have no choice but to convert it to a web-based system.
I had already started developing the software using the Laravel framework, and so far so good. I have successfully created the login form and a few pages with CRUD implemented.
However, I have come to a confusing part of the development stage where I want to prevent some types of users from editing/adding/deleting records. Basically, there are two types of users: Administrator and Normal User.
I have already made many sections and pages, and I'm thinking of creating a duplicate version of these pages and folders for normal users only. This way I can remove the edit/delete/etc. controls from the pages where the user doesn't need them. But I'm in a dilemma, because if I create duplicates of these pages, it will be tiresome to make changes when maintaining them in future.
So, what do you guys think? Should I make a modified copy of the pages, or just use the same pages for all users and disable some features based on user type?
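As a rough illustration of the second option (one set of pages, with features gated by role): the idea is language-neutral, and Laravel's Blade templates can express it with an @if around the edit/delete controls. A minimal sketch in plain code, with all names made up:

    using System;

    enum Role { NormalUser, Administrator }

    class RecordPage
    {
        // One page for everyone: the admin-only controls are rendered
        // conditionally instead of being maintained as duplicate pages.
        static void Render(Role role)
        {
            Console.WriteLine("Record list");
            if (role == Role.Administrator)
                Console.WriteLine("[Add] [Edit] [Delete]");
        }

        static void Main()
        {
            Render(Role.NormalUser);    // read-only view
            Render(Role.Administrator); // same page, extra controls
        }
    }

Whichever way the rendering goes, repeat the role check server-side in whatever route or controller performs the change, because hiding a button does not stop a hand-crafted request.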
We have an application which is not per-user and can be used by multiple users simultaneously, with the data also shared by all users. So the path we use for data folders is ProgramData\OurAppName\Data (post-Vista), and we give full control to all users, so that our application, run by any of them, can make changes to files under the Data folder.
Now the issue: with this, any other application (malware/a virus) can also modify the files, i.e. an attack can be made on our application's data files. Our application is a Win32 desktop application.
Is there any way we can restrict access to the Data folder to our applications only?
The Windows security model is per-user, not per-application. So there is no built-in way to restrict access to files based on which application is making the request.
The proper solution is for a server program (either running on an actual server, or as a system service on the local machine) to have exclusive access to the files (which works because the server program will be running as a different user) and for the client application (the application the end users run) to make all requests via the server. The server can then vet the requests to make sure they are not destructive before carrying them out.
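A minimal sketch of that broker idea, assuming a named pipe as the transport and a trivial one-line text protocol; the pipe name, folder and protocol are all made up for illustration:

    using System;
    using System.IO;
    using System.IO.Pipes;

    // Runs as a service under its own account, which is the only account
    // granted NTFS rights on the Data folder; clients never touch the files.
    class DataBroker
    {
        const string DataRoot = @"C:\ProgramData\OurAppName\Data";

        static void Main()
        {
            while (true)
            {
                using (var pipe = new NamedPipeServerStream("OurAppData", PipeDirection.InOut))
                {
                    pipe.WaitForConnection();
                    var reader = new StreamReader(pipe);
                    var writer = new StreamWriter(pipe) { AutoFlush = true };

                    // Assumed protocol: "SAVE <relativePath>" then one line of payload.
                    string command = reader.ReadLine();
                    if (command != null && command.StartsWith("SAVE "))
                    {
                        string full = Path.GetFullPath(Path.Combine(DataRoot, command.Substring(5)));

                        // Vet the request: refuse anything that escapes the data root.
                        if (full.StartsWith(DataRoot, StringComparison.OrdinalIgnoreCase))
                        {
                            File.WriteAllText(full, reader.ReadLine());
                            writer.WriteLine("OK");
                        }
                        else
                        {
                            writer.WriteLine("DENIED");
                        }
                    }
                }
            }
        }
    }

The client side connects with NamedPipeClientStream and sends the same two lines; the point is that every write is vetted in one place that the end user's account cannot bypass at the file-system level.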
Possible ad-hoc solutions would include a system service that hands out access to the files to your application (via handle duplication) or a file system filter driver. These approaches could be bypassed easily enough, but might be adequate against common-variety viruses that are not targeting your application specifically.
I have implemented Continuous Integration using TFS Version Control and TFS Build 2010. The compiled website project gets dropped in a shared folder with a version number.
Now I have a very basic, maybe even stupid, question. When we normally deploy a website project from VS 2010 to a web server, it uploads an App_Offline.htm file to the website folder so no requests are served to users. After the publish is complete, the App_Offline.htm file is removed. During that period users see an outage.
If we use CI on a live website, how can we eliminate that outage? I believe the whole point of CI is that users get to see newer features and the site is never down.
How is this accomplished? If we deploy the website project to the root folder then existing users will be affected, and that is certainly not advisable.
I would like to know the recommended practice with VS 2010, TFS 2010 Build & Version Control.
There's no real foolproof method for this; service uptime is never 100%, which is why people usually measure it in 'nines' (e.g. 'five nines' is 99.999% uptime, about five minutes of downtime a year).
But if you had multiple web servers (backup, fail-over, mirror, etc.), you could roll out the update across them, so that as you update some servers, the others will still be online (albeit with the old version) to serve users.
In general, only some of the largest websites have to worry so meticulously about being down for a few short minutes, so make sure you're focusing your energy in the right place ; )
Regarding taking the site down for the shortest time possible, the only way I've seen this done successfully is with multiple sites: either load balancing, or two sites on the same machine plus swapping host headers after the release/warm-up. But in most cases it's not worth the effort; a release shouldn't take the site down for more than a few seconds, in which time there should be relatively few requests. You're better off trying a few things that help your users live through a site release.
Move session out of proc.
If the user's session lives in the app pool it will be lost when a new version is released; change the config to move it into a session server or the database.
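For example, moving to the ASP.NET state server is just a config change; a minimal sketch, with the default connection details as placeholders:

    <!-- web.config: session survives an app-pool recycle because it lives
         in the state service; SQLServer mode also survives a reboot. -->
    <system.web>
      <sessionState mode="StateServer"
                    stateConnectionString="tcpip=loopback:42424"
                    timeout="20" />
    </system.web>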
Specify a machine key for the website
Viewstate (and cookies?) is encrypted using a key that is generated when a site starts; if a site restarts due to a release, any users filling out a form will receive an invalid viewstate exception on postback. (Note: this may have other security implications.)
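Fixing the key is also just configuration; a minimal sketch with placeholder values (generate your own keys, for example with the Machine Key feature in IIS Manager):

    <!-- web.config: with explicit keys, viewstate issued before a release
         still validates after the restart. Values below are placeholders. -->
    <system.web>
      <machineKey validationKey="...generated value..."
                  decryptionKey="...generated value..."
                  validation="SHA1"
                  decryption="AES" />
    </system.web>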
I have a desktop application which uses flat files (some XML and small pictures) as data. I want this data to be available on other PCs which have the desktop application installed, and usable by a smartphone client (WP7 at the moment) as well.
It should be very easy for the user to synchronize this data. He should be able to use accounts he already possesses (Live login, Googlemail, Facebook, ...).
I thought about using Azure Blob Storage to save the data in Azure, the Sync Framework to perform the actual synchronization and the Access Control Service to handle authentication.
I have not used any of these technologies before, so any advice would be great, but I'm searching foremost for errors or shortcomings in this strategy that I don't see yet. Is this approach viable at all?
Windows Azure is basically a virtualized datacentre. It is elaborate and complicated and is pitched at corporations who don't want to own their server infrastructure or hardware.
If I understand correctly, what you want is a cloud fileserver, not a whole LAN. Windows SkyDrive fulfils this requirement nicely and offers 25GB of storage per member with no charge for membership.
About Hotmail and Windows Live: people often confuse Hotmail and Windows Live, because when you set up a Hotmail account it uses Windows Live for authentication, and you therefore end up with a Windows Live account and all the associated facilities, including SkyDrive. However, it is entirely possible to set up a Windows Live account using any email address as the username.
If you do this, it is important to be aware that the Windows Live password associated with a given email address is completely independent of the password required by the mail server that hosts mail for the account. This can cause a great deal of user confusion. Only for Hotmail (or any other mail server that uses Windows Live for authentication) are they guaranteed to be the same password.
There is no official Microsoft framework support for SkyDrive. There is an open source project called SkyDriveApiClient, but it only works with the full .NET framework. I tried porting it, but the author was a bit of an architecture astronaut, and it is absolutely riddled with [Serializable], which is not available on WP7.
The WP7 guys have said that the WP7 framework will probably include support for SkyDrive but not in Mango (WP7.1) and given that Microsoft's typical release cycle is 18 months and Mango has yet to hit the streets, I'd say it will be two years before you can count on intrinsic cloud file services for WP7.
Roll-your-own wouldn't be hard; WCF services are dead easy to use from WP7. But that's not really cloud, since you have to provide and maintain the server infrastructure yourself. For this reason, and given the MS timetable, I have put a great deal of effort into producing my own SkyDrive client for WP7. Core functionality is complete and I am now refactoring, improving robustness and adding performance enhancements like local caching of tokens (cookies, essentially). I don't intend to release it; I have a number of apps planned that depend on this functionality, and it suits me fine that there is a substantial barrier to competition.
I didn't tell you that to tease you. My point is that I'm so sure SkyDrive is the right answer that I put a lot of work into making it happen.
Cloud file storage is a perfect fit for mobile devices.
Azure is not a good answer for the sort of phone apps individuals want, because the data store isn't shared in a way that requires indexing or supports high levels of concurrency.
I can certainly think of corporate phone apps that would benefit from using SQL Server as storage.
Azure can do file services, but it represents an ongoing expense. Nobody's going to put up with that when Google and Microsoft both give away web-based cloud storage.
I can personally attest that if you're determined, it is possible to use SkyDrive from WP7.
Cloud storage is the only way you're going to get programmatically accessible storage that's shared by your user's mobile device and his computer. One of the things I intend to do that depends on shared storage is write a Silverlight app that lets you prepare map routes with multiple waypoints on a desktop computer and a companion app that uses them on WP7.
The Windows Live team has released what they call support for WP7. They supply a sample project showing how to instantiate a browser object, load their login pages, manipulate them to log in, and then use their JavaScript API to manipulate SkyDrive.
This has one big advantage: browser cookies and cached credentials. The disadvantages are obvious; technical shortcomings notwithstanding, the Windows Live team seems to think the only thing people want to do with a phone is tag their photos and fiddle with social media.
I have finished my own libraries. They do not support most of the social media twaddle. I have treated SkyDrive as no more or less than a cloud file system, providing
Authenticate(username, password)
CreateFolder(folderpath[, blocking=false])
Delete(fileOrFolderPath[, blocking=false])
SaveString(filepath, value[, blocking=false])
LoadString(filepath)
I could handle binaries, but Convert.ToBase64String makes this unnecessary, and strings are convenient for XML. CreateFolder, Delete and SaveString are optionally blocking. LoadString is always blocking because it is a function that returns the loaded string. CreateFolder is recursive, so you can create an entire path in one call (e.g. /folder1/folder2/folder3). Calling CreateFolder on a pre-existing path has no effect, and SaveString uses CreateFolder to ensure the path is valid, making it unnecessary to create a filepath in advance. Authenticate loads the file system (except file content) into memory, eliminating server chatter. This is asynchronous, and a FileSystemReady event announces when the file system is completely loaded. The model is maintained as you add and remove files and folders.
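Going by those signatures, usage would look something like the sketch below; the class name and exact event signature are my assumptions, while the methods and the FileSystemReady event come from the description above:

    // Hypothetical usage of the library described above.
    var drive = new SkyDriveClient(); // class name assumed

    drive.FileSystemReady += (s, e) =>
    {
        // CreateFolder is recursive, so one call creates the whole path...
        drive.CreateFolder("/apps/routes", blocking: true);

        // ...though SaveString would have ensured the path exists anyway.
        drive.SaveString("/apps/routes/houston.xml", "<route>...</route>");

        string xml = drive.LoadString("/apps/routes/houston.xml"); // always blocks
    };

    // Asynchronous; raises FileSystemReady once the file model is loaded.
    drive.Authenticate("someone@example.com", "password");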
This was a lot of work, and no one responded to my attempt to make it an open source project, so I'm not inclined to give the fruits of my labour away; but provided your plans don't compete with mine, I could be persuaded to come to an arrangement.
I want to be able to synchronize several text files on a user's PC in real time from my web application. Basically, I want a few data files on the local PC to mirror the state of the user's data in my web application, so that if the web application or the user's internet connection is lost, he can use those data files to get some critical info (possibly using HTML/JavaScript code stored with those files that would run in offline mode against those data files).
I know that Google Gears has a lot of interesting tools for working with offline state, but I'd prefer an even simpler application in HTML/JavaScript that wouldn't be as reliant on Google Gears. I'd rather use Google Gears just to create those files and slowly keep them in sync with the web application's version of the data throughout the day.
Update on answers:
PersistJS is a good suggestion I will look into, but I was hoping people would direct me towards really good Google Gears tutorials and resources.
You can save data on the browser using PersistJS, which uses the best client-side persistent storage mechanism it can find, supporting:
Flash
Google Gears
HTML 5 storage specs
browser-specific extensions
cookies
When your app reconnects, you can resync. Creating and reading text files is something the browser will generally block your web site from doing.
At the risk of stating the obvious: if you want to store user state locally, aren't cookies the standard way?
Maybe more than one cookie will be needed, but that sounds like the simplest of ways.
You're going to need to make an ActiveX control and a Firefox plugin to get these permissions. Short of that, I agree with orip: try using PersistJS.
You can ask the user to download a Subversion client that is predefined to interface with your Subversion server only, then write your web application to interface with the Subversion service from your side only.
There is a good deal of security harm associated with granting access to a user's file system, so you will want to lock down all possible points of exploitation. Ensure that the user cannot access the Subversion server except through the client you ask them to install. Ensure the connection between the application server and the Subversion server is extremely secure, so that the transmission path cannot be compromised and so that malicious logic loaded onto the application server cannot reach the Subversion server: encrypt the transmission path between the two servers, and put the Subversion server behind the firewall separating your network DMZ. I would also suggest a challenge/response mechanism between the application server and the Subversion server, to prevent malicious code from appearing to be legitimate decisions made on the application server. Finally, ensure that data flows from the application server to the Subversion server in a unidirectional fashion only, because if there is malicious logic planted on your application server, then any data that comes back from the Subversion server is compromised without that server even being accessed.
You could use the FileSystemObject (FSO) through JavaScript; however, it is dependent on Microsoft, as it is an ActiveX control, and it would also require permissions in the browser, or perhaps an HTA (HTML Application).
http://www.webreference.com/js/column71/
It's a real security issue, so most avenues are inherently closed down.
Inherently, the web model was designed not to authorize pushes from server to client. Now things are changing slowly; maybe you could do this with WebSockets?