Downloading files from remote server - download

I am using a C# console app, and I am using this script to download files from a remote server. There are a couple of things I want to change. First, when it writes to a file, it doesn't preserve the original line breaks: it seems to write a certain number of bytes and then start a new line. I would like the output to keep the same format as the file it is reading from. Second, there are multiple .jpg files on the server that I need to download. How can I use this script to download multiple .jpg files?
public static int DownLoadFiles(String remoteUrl, String localFile)
{
    int bytesProcessed = 0;

    // Assign values to these objects here so that they can
    // be referenced in the finally block
    StreamReader remoteStream = null;
    StreamWriter localStream = null;
    WebResponse response = null;

    // Use a try/catch/finally block as both the WebRequest and Stream
    // classes throw exceptions upon error
    try
    {
        // Create a request for the specified remote file name
        WebRequest request = WebRequest.Create(remoteUrl);
        request.PreAuthenticate = true;
        NetworkCredential credentials = new NetworkCredential("id", "pass");
        request.Credentials = credentials;
        if (request != null)
        {
            // Send the request to the server and retrieve the
            // WebResponse object
            response = request.GetResponse();
            if (response != null)
            {
                // Once the WebResponse object has been retrieved,
                // get the stream object associated with the response's data
                remoteStream = new StreamReader(response.GetResponseStream());

                // Create the local file
                localStream = new StreamWriter(File.Create(localFile));

                // Allocate a 1k buffer
                char[] buffer = new char[1024];
                int bytesRead;

                // Simple do/while loop to read from stream until
                // no bytes are returned
                do
                {
                    // Read data (up to 1k) from the stream
                    bytesRead = remoteStream.Read(buffer, 0, buffer.Length);

                    // Write the data to the local file
                    localStream.WriteLine(buffer, 0, bytesRead);

                    // Increment total bytes processed
                    bytesProcessed += bytesRead;
                } while (bytesRead > 0);
            }
        }
    }
    catch (Exception e)
    {
        Console.WriteLine(e.Message);
    }
    finally
    {
        // Close the response and streams objects here
        // to make sure they're closed even if an exception
        // is thrown at some point
        if (response != null) response.Close();
        if (remoteStream != null) remoteStream.Close();
        if (localStream != null) localStream.Close();
    }

    // Return total bytes processed to caller.
    return bytesProcessed;
}

Why don't you use WebClient.DownloadData or WebClient.DownloadFile instead?
WebClient client = new WebClient();
client.Credentials = new NetworkCredential("id", "pass");
client.DownloadFile(remoteUrl, localFile);
By the way, the correct way to copy one stream to another is not what you did. You shouldn't read into a char[] at all, as you might run into encoding and end-of-line issues since you are downloading a binary file. The WriteLine call is problematic too. The right way to copy the contents of one stream to another is:
const int BUFFER_SIZE = 8 * 1024; // any reasonable chunk size; the original snippet left this undefined

void CopyStream(Stream destination, Stream source)
{
    int count;
    byte[] buffer = new byte[BUFFER_SIZE];
    while ((count = source.Read(buffer, 0, buffer.Length)) > 0)
        destination.Write(buffer, 0, count);
}
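Applied to the original download, the binary copy could look something like this (a sketch, reusing CopyStream above and the hard-coded "id"/"pass" credentials from the question):

public static void DownloadBinaryFile(string remoteUrl, string localFile)
{
    WebRequest request = WebRequest.Create(remoteUrl);
    request.PreAuthenticate = true;
    request.Credentials = new NetworkCredential("id", "pass");

    using (WebResponse response = request.GetResponse())
    using (Stream remote = response.GetResponseStream())
    using (Stream local = File.Create(localFile))
    {
        // Copy raw bytes; with no StreamReader/StreamWriter involved,
        // no encoding or newline translation is applied.
        CopyStream(local, remote);
    }
}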
The WebClient class is much easier to use and I suggest using that instead.

The reason you're getting spurious newlines in the result file is that StreamWriter.WriteLine() puts them there. Try using StreamWriter.Write() instead.
Regarding downloading multiple files, can't you just run the function several times, passing it the URLs of the different files you need?
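A minimal sketch of that idea. The URL list and local folder here are made up for illustration, and it assumes the download method has been fixed to copy raw bytes as described above:

string[] jpgUrls =
{
    "http://example.com/images/photo1.jpg",
    "http://example.com/images/photo2.jpg",
    "http://example.com/images/photo3.jpg"
};

foreach (string url in jpgUrls)
{
    // Save each file locally under its original name.
    string localFile = Path.Combine(@"C:\Downloads", Path.GetFileName(url));
    DownLoadFiles(url, localFile);
}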

Related

Upload large files to Azure Media Services in Xamarin

I am trying to upload a .mp4 file, selected from the user's iOS or Android device, to my Azure Media Services account.
This code works for small files (less than ~95 MB):
public static async Task<string> UploadBlob(string blobContainerSasUri, string blobName, byte[] blobContent, string path)
{
    string responseString = null; // initialized so the method still compiles when the request fails
    int contentLength = blobContent.Length;
    string queryString = (new Uri(blobContainerSasUri)).Query;
    string blobContainerUri = blobContainerSasUri.Split('?')[0];
    string requestUri = string.Format(System.Globalization.CultureInfo.InvariantCulture, "{0}/{1}{2}", blobContainerUri, blobName, queryString);

    HttpWebRequest request = (HttpWebRequest)WebRequest.Create(requestUri);
    request.Method = "PUT";
    request.AllowWriteStreamBuffering = false;
    request.Headers.Add("x-ms-blob-type", "BlockBlob");
    request.ContentLength = contentLength;
    request.Timeout = Int32.MaxValue;
    request.KeepAlive = true;

    int bufferLength = 1048576; // upload 1MB at a time, useful for a simple progress bar.
    Stream requestStream = request.GetRequestStream();
    requestStream.WriteTimeout = Int32.MaxValue;

    ProgressViewModel progressViewModel = App.Locator.GetProgressBar(App.Locator.MainViewModel.currentModuleItemId);
    MyVideosPage myVideosPage = App.Locator.GetVideosPage(App.Locator.MainViewModel.currentModuleItemId);

    FileStream fileStream = new FileStream(path, FileMode.Open, FileAccess.Read);
    int nRead = 0;
    int currentPos = 0;
    while ((nRead = fileStream.Read(blobContent, currentPos, bufferLength)) > 0)
    {
        await requestStream.WriteAsync(blobContent, currentPos, nRead);
        currentPos += nRead;
    }

    fileStream.Close();
    requestStream.Close();

    HttpWebResponse objHttpWebResponse = null;
    try
    {
        // this is where it fails for large files
        objHttpWebResponse = (HttpWebResponse)request.GetResponse();
        Stream responseStream = objHttpWebResponse.GetResponseStream();
        StreamReader stream = new StreamReader(responseStream);
        responseString = stream.ReadToEnd();
    }
    catch (Exception ex)
    {
        Console.WriteLine(ex.Message);
    }
    finally
    {
        if (objHttpWebResponse != null)
            objHttpWebResponse.Close();
    }
    return responseString;
}
An exception is thrown after this line is called:
(HttpWebResponse)request.GetResponse();
The exception message is "The request body is too large and exceeds the maximum permissible limit."
The exception StatusCode is "RequestEntityTooLarge".
How can I upload large files? Is this a problem with HttpWebRequest, or Azure Media Services?
Azure Storage supports one-shot upload (the Put Blob API) of up to 256 MB if you are using the newer REST API versions. But since you are not specifying a REST API version, you're defaulting to a very old version where the maximum supported size for a one-shot upload is 100 MB.
Use the x-ms-version: 2018-03-28 header to be able to upload up to 256 MB in one HTTP request, for example:
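With the question's HttpWebRequest, that is just one extra header next to the existing x-ms-blob-type header (a sketch; the rest of the request stays as in the question):

request.Headers.Add("x-ms-version", "2018-03-28");
request.Headers.Add("x-ms-blob-type", "BlockBlob");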
If you have to deal with larger files, you will need to use block & commit upload: use the Put Block API to stage blocks from the source file (blocks can be up to 100 MB each), then commit all the blocks using the Put Block List API. If you don't want to deal with this logic yourself, simply use the Azure Storage SDK for .NET (it supports Xamarin) and its UploadFromFile method. It is simple and resilient.
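If you go the SDK route, a minimal sketch could look like the following, assuming the WindowsAzure.Storage NuGet package and that blobContainerSasUri is a container SAS with write permission (the method name is made up):

using Microsoft.WindowsAzure.Storage.Blob;

public static async Task UploadLargeBlobAsync(string blobContainerSasUri, string blobName, string path)
{
    // The container SAS URI already carries the credentials.
    var container = new CloudBlobContainer(new Uri(blobContainerSasUri));
    CloudBlockBlob blob = container.GetBlockBlobReference(blobName);

    // The SDK splits the file into blocks and commits them (Put Block + Put Block List),
    // so there is no single-request size limit to hit.
    blob.StreamWriteSizeInBytes = 4 * 1024 * 1024; // 4 MB blocks; adjust as needed
    await blob.UploadFromFileAsync(path);
}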

Saving base64String as image on FTP server, saves corrupted file

Saving a base64 string as an image on an FTP server saves a corrupted file.
I am doing the following things:
1. Convert the base64 string into a byte[].
2. Initialize a MemoryStream with the bytes converted in the above step.
3. Open a stream from FTP.
4. Write the stream to FTP.
Below is the code
public bool WriteFromBase64ToFile(string base64, string path, string fileName)
{
    bool result = false;
    using (FtpClient ftp = new FtpClient())
    {
        // setting ftp properties with required values.
        ftp.ReadTimeout = 999999999;
        ftp.Host = host;
        ftp.Credentials = new System.Net.NetworkCredential(username, password);
        ftp.Port = Convert.ToInt32(port);
        ftp.DataConnectionType = FtpDataConnectionType.AutoPassive;
        ftp.Connect();
        ftp.ConnectTimeout = 1000000;

        // converting base64String into byte array.
        byte[] file = Convert.FromBase64String(base64);

        if (ftp.IsConnected)
        {
            int BUFFER_SIZE = file.Length; // 64KB buffer
            byte[] buffer = new byte[file.Length];

            // Initializing MemoryStream with byte converted from base64String.
            MemoryStream ms = new MemoryStream(buffer);
            using (Stream readStream = ms)
            {
                fileName = fileName.ReplacingSpecialCharacterswithEntities();

                // Getting stream from ftp and then writing it on FTP server.
                using (Stream writeStream = ftp.OpenWrite(path + "/" + fileName + ".jpg", FtpDataType.Binary))
                {
                    while (readStream.Position < readStream.Length)
                    {
                        buffer.Initialize();
                        // Reading stream
                        int bytesRead = readStream.Read(buffer, 0, BUFFER_SIZE);
                        // Writing stream
                        writeStream.Write(buffer, 0, bytesRead);
                    }
                    // flushing stream.
                    writeStream.Flush();
                }
            }
        }
    }
    result = true;
    return result;
}

Windows 8.1 store xaml save InkManager in a string

I'm trying to save what I have drawn with the pencil as a string. I do this by calling the SaveAsync() method to put it into an IOutputStream, then converting that IOutputStream into a Stream using the AsStreamForWrite() method. From this point things should go fine, but I get a lot of problems after this part. If I use, for example, this code block:
using (var stream = new MemoryStream())
{
    byte[] buffer = new byte[2048]; // read in chunks of 2KB
    int bytesRead = (int)size;
    while (bytesRead < 0)
    {
        stream.Write(buffer, 0, bytesRead);
    }
    byte[] result = stream.ToArray();
    // TODO: do something with the result
}
i get this exception
"Offset and length were out of bounds for the array or count is greater than the number of elements from index to the end of the source collection."
Or if I try to convert the stream into an image using InMemoryRandomAccessStream like this:
InMemoryRandomAccessStream ras = new InMemoryRandomAccessStream();
await s.CopyToAsync(ras.AsStreamForWrite());
My InMemoryRandomAccessStream variable is always zero in size.
I also tried
StreamReader.ReadToEnd();
but it returns an empty string.
I found the answer here:
http://social.msdn.microsoft.com/Forums/windowsapps/en-US/2359f360-832e-4ce5-8315-7f351f2edf6e/stream-inkmanager-strokes-to-string
private async void ReadInk(string base64)
{
    if (!string.IsNullOrEmpty(base64))
    {
        var bytes = Convert.FromBase64String(base64);
        using (var inMemoryRAS = new InMemoryRandomAccessStream())
        {
            await inMemoryRAS.WriteAsync(bytes.AsBuffer());
            await inMemoryRAS.FlushAsync();
            inMemoryRAS.Seek(0);

            await m_InkManager.LoadAsync(inMemoryRAS);
            if (m_InkManager.GetStrokes().Count > 0)
            {
                // You would do whatever you want with the strokes
                // RenderStrokes();
            }
        }
    }
}
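The same approach works in the other direction, which is what the question asks for. A sketch of the save side, assuming the same m_InkManager field (the method name is made up):

private async Task<string> SaveInkToBase64()
{
    using (var inMemoryRAS = new InMemoryRandomAccessStream())
    {
        // Serialize the current strokes into the in-memory stream.
        await m_InkManager.SaveAsync(inMemoryRAS);

        // Read the stream back into a byte array and base64-encode it.
        inMemoryRAS.Seek(0);
        using (var reader = new DataReader(inMemoryRAS.GetInputStreamAt(0)))
        {
            var bytes = new byte[inMemoryRAS.Size];
            await reader.LoadAsync((uint)inMemoryRAS.Size);
            reader.ReadBytes(bytes);
            return Convert.ToBase64String(bytes);
        }
    }
}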

Code Analysis showing Do not dispose objects multiple times

My encryption code is
{
    AesManaged aes = null;
    MemoryStream memoryStream = null;
    CryptoStream cryptoStream = null;
    try
    {
        Rfc2898DeriveBytes rfc2898 = new Rfc2898DeriveBytes(password, Encoding.UTF8.GetBytes(salt), 10000);

        aes = new AesManaged();
        aes.Key = rfc2898.GetBytes(32);
        aes.IV = rfc2898.GetBytes(16);

        memoryStream = new MemoryStream();
        cryptoStream = new CryptoStream(memoryStream, aes.CreateEncryptor(), CryptoStreamMode.Write);

        byte[] data = Encoding.UTF8.GetBytes(dataToEncrypt);
        cryptoStream.Write(data, 0, data.Length);
        cryptoStream.FlushFinalBlock();

        return Convert.ToBase64String(memoryStream.ToArray());
    }
    finally
    {
        if (cryptoStream != null)
            cryptoStream.Close();
        if (memoryStream != null)
            memoryStream.Close();
        if (aes != null)
            aes.Clear();
    }
}
I just tried Code Analysis and it's giving me:
CA2202
Do not dispose objects multiple times
Object 'memoryStream' can be disposed more than once in method 'EncryptDecrypt.Encrypt(string, string, string)'.
To avoid generating a System.ObjectDisposedException you should not call Dispose more than one time on an object.
But my code is working fine. I created memoryStream and cryptoStream and closed them afterwards, but I am not able to understand why it is telling me I dispose objects multiple times.
The warning fires here because closing the CryptoStream also closes the MemoryStream it wraps, so your finally block then closes memoryStream a second time.
The guidelines for IDisposable state that disposing the same object twice should have no effect the second time.
However, not all implementations follow this guideline, so Code Analysis tells you not to rely on it.
Your particular objects are safe in this regard, so you don't have an actual problem.
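If you would rather restructure the code so the warning never fires, the usual pattern is to hand ownership of the inner stream to the wrapper and null out your own reference, so only one code path ever disposes it. A sketch, assuming the method signature implied by the warning (the parameter order is a guess):

public string Encrypt(string dataToEncrypt, string password, string salt)
{
    using (var rfc2898 = new Rfc2898DeriveBytes(password, Encoding.UTF8.GetBytes(salt), 10000))
    using (var aes = new AesManaged())
    {
        aes.Key = rfc2898.GetBytes(32);
        aes.IV = rfc2898.GetBytes(16);

        MemoryStream memoryStream = null;
        try
        {
            memoryStream = new MemoryStream();
            using (var cryptoStream = new CryptoStream(memoryStream, aes.CreateEncryptor(), CryptoStreamMode.Write))
            {
                MemoryStream inner = memoryStream;
                memoryStream = null; // cryptoStream owns it now, so the finally block won't dispose it a second time

                byte[] data = Encoding.UTF8.GetBytes(dataToEncrypt);
                cryptoStream.Write(data, 0, data.Length);
                cryptoStream.FlushFinalBlock();

                return Convert.ToBase64String(inner.ToArray());
            }
        }
        finally
        {
            if (memoryStream != null)
                memoryStream.Dispose();
        }
    }
}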

How to speed up the unzip action on windows phone 7?

When I used SharpZipLib to unzip a zip file containing 5000 files on Windows Phone 7, it took more than 5 minutes to finish.
Here is the code:
using (StreamReader httpwebStreamReader = new StreamReader(ea.Result))
{
    //open isolated storage to save files
    using (IsolatedStorageFile isoStore = IsolatedStorageFile.GetUserStoreForApplication())
    {
        using (ZipInputStream s = new ZipInputStream(httpwebStreamReader.BaseStream))
        {
            //s.Password = "123456";//if archive is encrypted
            ZipEntry theEntry;
            while ((theEntry = s.GetNextEntry()) != null)
            {
                string directoryName = Path.GetDirectoryName(theEntry.Name);
                string fileName = Path.GetFileName(theEntry.Name);

                // create directory
                if (directoryName.Length > 0)
                {
                    isoStore.CreateDirectory(directoryName);
                }

                if (fileName != String.Empty)
                {
                    //save file to isolated storage
                    using (BinaryWriter streamWriter =
                        new BinaryWriter(new IsolatedStorageFileStream(theEntry.Name,
                            FileMode.OpenOrCreate, FileAccess.Write, FileShare.Write, isoStore)))
                    {
                        int size = 2048;
                        byte[] data = new byte[2048];
                        while (true)
                        {
                            size = s.Read(data, 0, data.Length);
                            if (size > 0)
                            {
                                streamWriter.Write(data, 0, size);
                            }
                            else
                            {
                                break;
                            }
                        }
                    }
                }
            }
        }
    }
}
Why is it so slow?
How can I speed up the unzip action?
Does anyone know?
I think you need to increase your buffer size. Change these lines:
int size = 2048;
byte[] data = new byte[2048];
And change the 2048 to something like 32768 (32*1024).
A 2 KB block size results in a lot of individual writes to the flash storage. In my experience that's somewhat slow and can vary from device to device. A 32 KB block size should do 16 times fewer writes, but I don't know if that will result in a direct 16x speedup. I'm interested to hear back.
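For reference, here is the inner copy loop from the question with the suggested 32 KB buffer; nothing else changes:

int size = 32 * 1024;
byte[] data = new byte[32 * 1024];
while (true)
{
    size = s.Read(data, 0, data.Length);
    if (size > 0)
    {
        streamWriter.Write(data, 0, size);
    }
    else
    {
        break;
    }
}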
