With this code I am not able to upload my file to the UserImages folder or get the path into the database. This started after I hosted my site on localhost on IIS; before that it worked.
When I press the submit button, the error that occurs is:
Access to the path
'C:\Users\PEERBITS\Desktop\ClientProj\ClientProj\UserImages\Lighthouse.jpg'
is denied.
Here is the code:
if (uploadFile != null && uploadFile.ContentLength > 0)
{
    // Save the posted file into ~/UserImages and keep the relative path for the database
    var filename = Path.GetFileName(uploadFile.FileName);
    var path = Server.MapPath("~/UserImages/") + filename;
    uploadFile.SaveAs(path);
    ins.image_path = "~/UserImages/" + filename;
}
Please check the 'UserImages' folder's properties and remove the read-only attribute from that folder.
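If you prefer to do this from code rather than through Windows Explorer, a minimal sketch (assuming the same ~/UserImages folder) could look like this; note that clearing the attribute alone may not be enough, since the account the IIS application pool runs under also needs NTFS write permission on the folder:

using System.IO;

// Clear the read-only attribute on the upload folder before saving files into it.
// If "Access to the path ... is denied" persists, grant the IIS application pool
// identity write permission on this folder as well.
var uploadDir = new DirectoryInfo(Server.MapPath("~/UserImages/"));
if ((uploadDir.Attributes & FileAttributes.ReadOnly) != 0)
{
    uploadDir.Attributes &= ~FileAttributes.ReadOnly;
}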
I need my extension to generate and save a text file inside the Downloads folder. Please just give me an example of code showing how to do it.
The downloads API is what you are probably looking for:
https://developer.mozilla.org/en-US/docs/Mozilla/Add-ons/WebExtensions/API/downloads
The downloads.download() function lets you download a file from a URL to your Downloads folder. Here is an example based on the downloads.download() documentation page.
function onStartedDownload(id) {
  console.log('Started downloading: ' + id);
}

function onFailed(error) {
  console.log('Download failed: ' + error);
}

var downloadUrl = "https://www.mozilla.org/media/img/home/2018/cards/irl-season-3.821df676279d.png";

var downloading = browser.downloads.download({
  url: downloadUrl,
  filename: 'mozilla-home.png',
  conflictAction: 'uniquify'
});

downloading.then(onStartedDownload, onFailed);
If you need to download data created in Javascript, then you'll first have to create a URL for that data using URL.createObjectURL()
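For example, a rough sketch of generating a text file in script and handing it to the downloads API (the file name and contents here are just placeholders) might look like this:

// Build a Blob from in-memory text and turn it into an object URL
var blob = new Blob(['Hello from my extension'], { type: 'text/plain' });
var objectUrl = URL.createObjectURL(blob);

// Pass the object URL to downloads.download(), then revoke it once the download starts
browser.downloads.download({
  url: objectUrl,
  filename: 'example.txt',
  conflictAction: 'uniquify'
}).then(function (id) {
  console.log('Started downloading: ' + id);
  URL.revokeObjectURL(objectUrl);
}, function (error) {
  console.log('Download failed: ' + error);
});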
I am running my website on Azure. Every folder is present in the site directory on Azure, but UploadImages, my subfolder of Content, is missing from wwwroot, and images are not uploading either.
I am using
var path = Path.Combine(Server.MapPath("~/Content/UploadImages/") + filename);
and the same happens with document uploads.
Based on your description, I have tested this on my side; please follow the steps below to see whether they help.
As you said, you get the target file path with this code:
var path = Path.Combine(Server.MapPath("~/Content/UploadImages/") + filename);
Before uploading files, please make sure that the directory "~/Content/UploadImages/" exists on your web server.
Here is my test code:
MVC controller method
[HttpPost]
public JsonResult UploadFiles()
{
    try
    {
        foreach (string file in Request.Files)
        {
            var fileContent = Request.Files[file];
            if (fileContent != null && fileContent.ContentLength > 0)
            {
                var stream = fileContent.InputStream;
                var fileName = Path.GetFileName(fileContent.FileName);

                // Create the upload directory if it does not exist yet
                string baseDir = Server.MapPath("~/Content/UploadImages/");
                if (!Directory.Exists(baseDir))
                    Directory.CreateDirectory(baseDir);

                // Copy the posted stream into the target file
                var path = Path.Combine(baseDir, fileName);
                using (var fileStream = System.IO.File.Create(path))
                {
                    stream.CopyTo(fileStream);
                }
            }
        }
    }
    catch (Exception e)
    {
        return Json(new
        {
            Flag = false,
            Message = string.Format("File upload failed with exception: {0}", e.Message)
        });
    }

    return Json(new
    {
        Flag = true,
        Message = "File uploaded successfully!"
    });
}
Additionally, for the longer term you could store your files in Azure Blob Storage, which brings benefits such as:
1. Serving images or documents directly to a browser
2. Storing files for distributed access
3. When you scale your site out to multiple web server instances, they can all access the same files and documents simultaneously
For more details, please refer to this link: https://azure.microsoft.com/en-us/documentation/articles/storage-dotnet-how-to-use-blobs/
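As an illustration only, a minimal sketch of uploading the posted file to Blob Storage with the WindowsAzure.Storage client library could look like the following (the connection string name and container name are assumptions, not taken from your project):

using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Blob;

// Assumes a storage connection string kept in web.config appSettings
var account = CloudStorageAccount.Parse(
    System.Configuration.ConfigurationManager.AppSettings["StorageConnectionString"]);
var client = account.CreateCloudBlobClient();

// Create the container on first use, then upload the posted file into it
var container = client.GetContainerReference("uploadimages");
container.CreateIfNotExists();

var blob = container.GetBlockBlobReference(fileName);
blob.UploadFromStream(fileContent.InputStream);

// Store the blob URI instead of a local file path
string imageUrl = blob.Uri.ToString();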
After a long time lurking, it's time to ask my first question...
I'm having trouble with an AJAX query that used to work prior to moving to AWS.
Previously, my web site was hosted on a WAMP server for testing and development, and the following relevant code worked fine.
//Read XML file from disc and send file content to other functions\\
function Get_XML_File() {
  var XML_File_Path = File_Path + Client_Code + '/' + ID + '/' + ID + '_Analysed_Wave.web';
  var xhttps = new XMLHttpRequest();
  xhttps.onreadystatechange = function() {
    if (xhttps.readyState == 4 && xhttps.status == 200) {
      Read_Meta_Data(xhttps);
      Read_Time_Status(xhttps);
      Read_Wave_Height(xhttps);
    }
  };
  xhttps.open("GET", XML_File_Path, true);
  xhttps.send();
}
//Extract Header Data from XML file\\
function Read_Meta_Data(xml) {
  var xmlDoc = xml.responseXML;
  // Client //
  var Client_ID = xmlDoc.getElementsByTagName('Client_ID')[0].childNodes[0]
  var Client_Name = xmlDoc.getElementsByTagName('Client_Name')[0].childNodes[0]
Recently, this site was moved to an Elastic Beanstalk environment on AWS.
'www.atmocean.com.au' has been provisioned with an SSL certificate using the AWS Certificate Manager.
'assets.atmocean.com.au' is also covered by an SSL certificate and is mapped to a CloudFront distribution of an S3 bucket.
Within the S3 bucket are XML-formatted files with a .web suffix (these are generated by proprietary software).
When the relevant web page is viewed, the Chrome console shows the following error: "Uncaught TypeError: Cannot read property 'getElementsByTagName' of null"
This error refers to this line:
var Client_ID = xmlDoc.getElementsByTagName('Client_ID')[0].childNodes[0]
What I can't understand is that when the 'Network' tab of the developer console is viewed, the resource is shown as correctly loaded with a status code of 200.
Further, the file content can be viewed in the 'response' tab.
Does this mean that the file has been correctly downloaded from the server to the client?
If so, why does code that formerly worked without error now fail to get the file content?
Does something other than a standard website configuration need to be provisioned through Elastic Beanstalk (or other means)?
Thanks in anticipation.
You receive an HTTP 200, meaning the server understood the request and can fulfil it, but it then delivers the content; by the time you execute Read_Meta_Data the full content has not necessarily been delivered.
You could add a console.log(xml) and console.log(xmlDoc) to see the current content at that point.
What I would suggest is adjusting your code to add a listener for the completion of the transfer:
var xhttps = new XMLHttpRequest();
xhttps.overrideMimeType('text/xml');
xhttps.addEventListener("load", transferComplete, false);

function transferComplete(evt) {
  // from here your transfer has been completed
}
Note: there is also a loadend event which fires when the load has completed, whether or not it was successful (I've never used it).
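For completeness, a small sketch of registering such a listener on the same request object might look like this:

// Runs once the transfer has finished, whether it succeeded, failed, or was aborted
xhttps.addEventListener("loadend", function (evt) {
  console.log('Transfer finished with status ' + xhttps.status);
}, false);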
Frederic,
Thanks for your response.
The clue was in the following line:
xhttp.overrideMimeType('text/xml');
Because the XML files use a custom file extension, the response was being returned as a plain text string rather than parsed as XML.
I've changed the function to now read as follows:
//Read XML file from disc and send file content to other functions\\
function Get_XML_File() {
  var XML_File_Path = File_Path + Client_Code + '/' + ID + '/' + ID + '_Analysed_Wave.web';
  var xhttp = new XMLHttpRequest();
  // Force the response to be parsed as XML despite the custom .web extension
  xhttp.overrideMimeType('text/xml');
  xhttp.onreadystatechange = function() {
    if (xhttp.readyState == 4 && xhttp.status == 200) {
      Read_Meta_Data(xhttp);
      Read_Time_Status(xhttp);
      Read_Wave_Height(xhttp);
    }
  };
  xhttp.open("GET", XML_File_Path, true);
  xhttp.send();
}
The key addition is the xhttp.overrideMimeType('text/xml') line. And with that one change, all is well with the world (well, with my function at least).
I am trying to save files in isolated storage (IsoStore). In the WP8 emulator the files are saved successfully, but when I run my program in other emulators or on my phone (with WP7.8) I get an error: "path must be a valid file name".
I do this:
var path = @"\Shared\Media\mapp\";
var imageName = guid from the server;
if (!_fileStorage.DirectoryExists(path))
    _fileStorage.CreateDirectory(path);

// here I get the error
using (IsolatedStorageFileStream fileStream =
    _fileStorage.OpenFile(path + imageName, FileMode.OpenOrCreate))
{
    // do anything
}
I tried setting path = @"iso:\Shared\Media\mapp\" or @"isostore:\Shared\Media\mapp\" or @"files:\Shared\Media\mapp\" or @"file:\Shared\Media\mapp\", and it doesn't work.
If I set @"\Shared\Media\", everything is fine on all devices. Can anyone tell me why I can't create the directory?
On Windows Phone 7 you can't create a directory whose name ends with a trailing separator such as "\"; that causes the "path must be a valid file name" error.
To solve your problem, just change your code a bit:
var path = @"\Shared\Media\mapp";   // no trailing backslash
var imageName = guid from the server;

using (IsolatedStorageFile store = IsolatedStorageFile.GetUserStoreForApplication())
{
    // Create the directory if it does not exist yet
    if (!store.DirectoryExists(path))
    {
        store.CreateDirectory(path);
    }

    // Open (or create) the file inside that directory
    using (var fileStream = store.OpenFile(path + "\\" + imageName, FileMode.OpenOrCreate))
    {
        // do anything
    }
}
Hope that helps.
I am having some trouble with an ASP.NET MVC3 web application that I am developing. I need an upload page that allows the user to upload Excel files and dump them to the file system; I got this to work fine. The next part is the part I am having trouble with: after I upload the Excel files, I need to programmatically kick off an SSIS package, which I have already created, to import the Excel files.
Here is what I have so far in code:
//
// POST: /Home/Update/
[HttpPost]
public ActionResult Update(HttpPostedFileBase file)
{
    // Verify that the user selected a file
    if (file != null && file.ContentLength > 0)
    {
        var fileName = Path.GetFileName(file.FileName);
        // store the file inside ~/App_Data/uploads folder
        var path = Path.Combine(Server.MapPath("~/App_Data/uploads"), fileName);
        ViewBag.Message = "File Uploaded Successfully";
        file.SaveAs(path);
    }

    // Start the SSIS package here
    try
    {
        Application app = new Application();
        Package package = app.LoadPackage(@"C:\Users\Chris\Documents\Visual Studio 2008\Projects\Integration Services Project1\Integration Services Project1\bin\Package.dtsx", null);

        // Execute Package
        DTSExecResult results = package.Execute();
        if (results == DTSExecResult.Failure)
        {
            foreach (DtsError local_DtsError in package.Errors)
            {
                ViewBag.Message = string.Format("Package execution results: {0}",
                    local_DtsError.Description);
            }
        }
    }
    catch (DtsException ex)
    {
        //ViewBag.Message = string.Format("{0} Exception caught.", ex);
    }

    // redirect back to the index action to show the form once again
    return RedirectToAction("Update");
}
When I run the code and upload an Excel file, a DtsException is caught, which says:
Failed to open package file "C:\Users\Chris\Documents\Visual Studio 2008\Projects\Integration Services Project1\Integration Services Project1\bin\Package.dtsx" due to error 0x80070003 "The system cannot find the path specified.". This happens when loading a package and the file cannot be opened or loaded correctly into the XML document. This can be the result of either providing an incorrect file name was specified when calling LoadPackage or the XML file was specified and has an incorrect format.
I don't understand why it is giving me this, because I checked the file path and it is exactly correct. I need some help fixing this issue and would greatly appreciate any help you can give.
Permissions, I should think. Put the file somewhere the account running IIS can see it. Wherever you were planning on deploying it would be a good place.
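For instance, a minimal sketch (the App_Data location and file name here are assumptions, not taken from the question) that deploys the package with the site and resolves its path at runtime could look like this:

using Microsoft.SqlServer.Dts.Runtime;

// Keep Package.dtsx with the web site (e.g. under App_Data) so the account running
// the IIS application pool can read it, and resolve the path at runtime.
string packagePath = Server.MapPath("~/App_Data/Package.dtsx");

Application app = new Application();
Package package = app.LoadPackage(packagePath, null);

DTSExecResult result = package.Execute();
if (result == DTSExecResult.Failure)
{
    foreach (DtsError dtsError in package.Errors)
    {
        // Surface the SSIS error descriptions back to the view
        ViewBag.Message = string.Format("Package execution failed: {0}", dtsError.Description);
    }
}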